[
  {
    "path": ".github/ISSUE_TEMPLATE/----.md",
    "content": "---\nname: Feature Request\nabout: Suggest a new feature\ntitle: ''\nlabels: ''\nassignees: ''\n\n---\n\nWe welcome your suggestions for PaddleHub, and thank you for contributing!\nWhen leaving a suggestion, please also provide the following information:\n- What new feature would you like to add?\n- In what scenarios is this feature needed?\n- Without this feature, can PaddleHub currently meet the need in some indirect way?\n- Which parts of PaddleHub might need to change to support this feature?\n- If possible, briefly describe your proposed solution\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/bug--.md",
    "content": "---\nname: Bug Report\nabout: Report a PaddleHub bug\ntitle: ''\nlabels: ''\nassignees: ''\n\n---\n\nWe welcome your reports of problems encountered while using PaddleHub, and thank you for contributing!\nWhen leaving your report, please also provide the following information:\n- Version and environment information\n1) PaddleHub and PaddlePaddle versions: please provide your PaddleHub and PaddlePaddle version numbers, e.g. PaddleHub 1.4.1, PaddlePaddle 1.6.2\n2) System environment: please describe the OS type (Linux/Windows/MacOS) and Python version\n- Reproduction information: for errors, please provide the environment and the steps to reproduce\n"
  },
  {
    "path": ".gitignore",
    "content": "# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n\n# C extensions\n*.so\n\n# Distribution / packaging\n.Python\nbuild/\ndevelop-eggs/\ndist/\ndownloads/\neggs/\n.eggs/\nlib/\nlib64/\nparts/\nsdist/\nvar/\nwheels/\nshare/python-wheels/\n*.egg-info/\n.installed.cfg\n*.egg\nMANIFEST\n\n# PyInstaller\n#  Usually these files are written by a python script from a template\n#  before PyInstaller builds the exe, so as to inject date/other infos into it.\n*.manifest\n*.spec\n\n# Installer logs\npip-log.txt\npip-delete-this-directory.txt\n\n# Unit test / coverage reports\nhtmlcov/\n.tox/\n.nox/\n.coverage\n.coverage.*\n.cache\nnosetests.xml\ncoverage.xml\n*.cover\n.hypothesis/\n.pytest_cache/\n\n# Translations\n*.mo\n*.pot\n\n# Django stuff:\n*.log\nlocal_settings.py\ndb.sqlite3\n\n# Flask stuff:\ninstance/\n.webassets-cache\n\n# Scrapy stuff:\n.scrapy\n\n# Sphinx documentation\ndocs/_build/\n\n# PyBuilder\ntarget/\n\n# Jupyter Notebook\n.ipynb_checkpoints\n\n# IPython\nprofile_default/\nipython_config.py\n\n# pyenv\n.python-version\n\n# celery beat schedule file\ncelerybeat-schedule\n\n# SageMath parsed files\n*.sage.py\n\n# Environments\n.env\n.venv\nenv/\nvenv/\nENV/\nenv.bak/\nvenv.bak/\n\n# Spyder project settings\n.spyderproject\n.spyproject\n\n# Rope project settings\n.ropeproject\n\n# mkdocs documentation\n/site\n\n# mypy\n.mypy_cache/\n.dmypy.json\ndmypy.json\n\n# Pyre type checker\n.pyre/\n\n# pycharm\n.DS_Store\n.idea/\nFETCH_HEAD"
  },
  {
    "path": ".pre-commit-config.yaml",
    "content": "-   repo: local\n    hooks:\n    -   id: yapf\n        name: yapf\n        entry: yapf\n        language: system\n        args: [-i, --style, .style.yapf]\n        files: \\.py$\n\n-   repo: https://github.com/pre-commit/pre-commit-hooks\n    rev: a11d9314b22d8f8c7556443875b731ef05965464\n    hooks:\n    -   id: check-merge-conflict\n    -   id: check-symlinks\n    -   id: end-of-file-fixer\n    -   id: trailing-whitespace\n    -   id: detect-private-key\n    -   id: check-added-large-files\n\n-   repo: local\n    hooks:\n    -   id: flake8\n        name: flake8\n        entry: flake8\n        language: system\n        args:\n        -   --count\n        -   --select=E9,F63,F7,F82\n        -   --show-source\n        -   --statistics\n        files: \\.py$\n\n-   repo: https://github.com/asottile/reorder_python_imports\n    rev: v2.4.0\n    hooks:\n      - id: reorder-python-imports\n        exclude: (?=third_party).*(\\.py)$\n"
  },
  {
    "path": ".style.yapf",
    "content": "[style]\nbased_on_style = pep8\ncolumn_limit = 120\n"
  },
  {
    "path": ".travis.yml",
    "content": "language: python\n\njobs:\n  include:\n    - name: \"CI on Windows/Python3.6\"\n      os: windows\n      language: shell\n      before_install:\n        - choco install python --version 3.6.2\n        - python -m pip install --upgrade pip\n      env: PATH=/c/Python36:/c/Python36/Scripts:$PATH\n    - name: \"CI on MacOS/Python3.6\"\n      os: osx\n      language: shell\n    - name: \"CI on Linux/Python3.6\"\n      os: linux\n      python: 3.6\n      script: /bin/bash ./scripts/check_code_style.sh\n\nenv:\n  - PYTHONPATH=${PWD}\n\ninstall:\n  - if [[ $TRAVIS_OS_NAME == osx ]]; then\n      pip3 install --upgrade paddlepaddle;\n      pip3 install -r requirements.txt;\n    else\n      pip install --upgrade paddlepaddle;\n      pip install -r requirements.txt;\n      pip install yapf==0.26.0;\n    fi\n\nnotifications:\n  email:\n    on_success: change\n    on_failure: always\n"
  },
  {
    "path": "AUTHORS.md",
    "content": "| Github account | name |\n|---|---|\n| ZeyuChen | Zeyu Chen |\n| nepeplwu | Zewu Wu |\n| sjtubinlong | Bin Long |\n| Steffy-zxf | Xuefei Zhang |\n| kinghuin | Jinxuan Qiu |\n| ShenYuhan | Yuhan Shen |\n| haoyuying | Yuying Hao |\n| KPatr1ck | Xiaojie Chen |\n"
  },
  {
    "path": "LICENSE",
    "content": "Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved\n\n                                 Apache License\n                           Version 2.0, January 2004\n                        http://www.apache.org/licenses/\n\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n   1. Definitions.\n\n      \"License\" shall mean the terms and conditions for use, reproduction,\n      and distribution as defined by Sections 1 through 9 of this document.\n\n      \"Licensor\" shall mean the copyright owner or entity authorized by\n      the copyright owner that is granting the License.\n\n      \"Legal Entity\" shall mean the union of the acting entity and all\n      other entities that control, are controlled by, or are under common\n      control with that entity. For the purposes of this definition,\n      \"control\" means (i) the power, direct or indirect, to cause the\n      direction or management of such entity, whether by contract or\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\n      outstanding shares, or (iii) beneficial ownership of such entity.\n\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\n      exercising permissions granted by this License.\n\n      \"Source\" form shall mean the preferred form for making modifications,\n      including but not limited to software source code, documentation\n      source, and configuration files.\n\n      \"Object\" form shall mean any form resulting from mechanical\n      transformation or translation of a Source form, including but\n      not limited to compiled object code, generated documentation,\n      and conversions to other media types.\n\n      \"Work\" shall mean the work of authorship, whether in Source or\n      Object form, made available under the License, as indicated by a\n      copyright notice that is included in or attached to the work\n      (an example is provided in the Appendix below).\n\n      \"Derivative 
Works\" shall mean any work, whether in Source or Object\n      form, that is based on (or derived from) the Work and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship. For the purposes\n      of this License, Derivative Works shall not include works that remain\n      separable from, or merely link (or bind by name) to the interfaces of,\n      the Work and Derivative Works thereof.\n\n      \"Contribution\" shall mean any work of authorship, including\n      the original version of the Work and any modifications or additions\n      to that Work or Derivative Works thereof, that is intentionally\n      submitted to Licensor for inclusion in the Work by the copyright owner\n      or by an individual or Legal Entity authorized to submit on behalf of\n      the copyright owner. For the purposes of this definition, \"submitted\"\n      means any form of electronic, verbal, or written communication sent\n      to the Licensor or its representatives, including but not limited to\n      communication on electronic mailing lists, source code control systems,\n      and issue tracking systems that are managed by, or on behalf of, the\n      Licensor for the purpose of discussing and improving the Work, but\n      excluding communication that is conspicuously marked or otherwise\n      designated in writing by the copyright owner as \"Not a Contribution.\"\n\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\n      on behalf of whom a Contribution has been received by Licensor and\n      subsequently incorporated within the Work.\n\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      copyright license to reproduce, prepare Derivative Works of,\n      publicly display, publicly perform, sublicense, and distribute the\n      Work and such Derivative Works in Source or Object form.\n\n   3. Grant of Patent License. Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      (except as stated in this section) patent license to make, have made,\n      use, offer to sell, sell, import, and otherwise transfer the Work,\n      where such license applies only to those patent claims licensable\n      by such Contributor that are necessarily infringed by their\n      Contribution(s) alone or by combination of their Contribution(s)\n      with the Work to which such Contribution(s) was submitted. If You\n      institute patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\n      or a Contribution incorporated within the Work constitutes direct\n      or contributory patent infringement, then any patent licenses\n      granted to You under this License for that Work shall terminate\n      as of the date such litigation is filed.\n\n   4. Redistribution. 
You may reproduce and distribute copies of the\n      Work or Derivative Works thereof in any medium, with or without\n      modifications, and in Source or Object form, provided that You\n      meet the following conditions:\n\n      (a) You must give any other recipients of the Work or\n          Derivative Works a copy of this License; and\n\n      (b) You must cause any modified files to carry prominent notices\n          stating that You changed the files; and\n\n      (c) You must retain, in the Source form of any Derivative Works\n          that You distribute, all copyright, patent, trademark, and\n          attribution notices from the Source form of the Work,\n          excluding those notices that do not pertain to any part of\n          the Derivative Works; and\n\n      (d) If the Work includes a \"NOTICE\" text file as part of its\n          distribution, then any Derivative Works that You distribute must\n          include a readable copy of the attribution notices contained\n          within such NOTICE file, excluding those notices that do not\n          pertain to any part of the Derivative Works, in at least one\n          of the following places: within a NOTICE text file distributed\n          as part of the Derivative Works; within the Source form or\n          documentation, if provided along with the Derivative Works; or,\n          within a display generated by the Derivative Works, if and\n          wherever such third-party notices normally appear. The contents\n          of the NOTICE file are for informational purposes only and\n          do not modify the License. 
You may add Your own attribution\n          notices within Derivative Works that You distribute, alongside\n          or as an addendum to the NOTICE text from the Work, provided\n          that such additional attribution notices cannot be construed\n          as modifying the License.\n\n      You may add Your own copyright statement to Your modifications and\n      may provide additional or different license terms and conditions\n      for use, reproduction, or distribution of Your modifications, or\n      for any such Derivative Works as a whole, provided Your use,\n      reproduction, and distribution of the Work otherwise complies with\n      the conditions stated in this License.\n\n   5. Submission of Contributions. Unless You explicitly state otherwise,\n      any Contribution intentionally submitted for inclusion in the Work\n      by You to the Licensor shall be under the terms and conditions of\n      this License, without any additional terms or conditions.\n      Notwithstanding the above, nothing herein shall supersede or modify\n      the terms of any separate license agreement you may have executed\n      with Licensor regarding such Contributions.\n\n   6. Trademarks. This License does not grant permission to use the trade\n      names, trademarks, service marks, or product names of the Licensor,\n      except as required for reasonable and customary use in describing the\n      origin of the Work and reproducing the content of the NOTICE file.\n\n   7. Disclaimer of Warranty. Unless required by applicable law or\n      agreed to in writing, Licensor provides the Work (and each\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n      implied, including, without limitation, any warranties or conditions\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\n      appropriateness of using or redistributing the Work and assume any\n      risks associated with Your exercise of permissions under this License.\n\n   8. Limitation of Liability. In no event and under no legal theory,\n      whether in tort (including negligence), contract, or otherwise,\n      unless required by applicable law (such as deliberate and grossly\n      negligent acts) or agreed to in writing, shall any Contributor be\n      liable to You for damages, including any direct, indirect, special,\n      incidental, or consequential damages of any character arising as a\n      result of this License or out of the use or inability to use the\n      Work (including but not limited to damages for loss of goodwill,\n      work stoppage, computer failure or malfunction, or any and all\n      other commercial damages or losses), even if such Contributor\n      has been advised of the possibility of such damages.\n\n   9. Accepting Warranty or Additional Liability. While redistributing\n      the Work or Derivative Works thereof, You may choose to offer,\n      and charge a fee for, acceptance of support, warranty, indemnity,\n      or other liability obligations and/or rights consistent with this\n      License. However, in accepting such obligations, You may act only\n      on Your own behalf and on Your sole responsibility, not on behalf\n      of any other Contributor, and only if You agree to indemnify,\n      defend, and hold each Contributor harmless for any liability\n      incurred by, or claims asserted against, such Contributor by reason\n      of your accepting any such warranty or additional liability.\n\n   END OF TERMS AND CONDITIONS\n\n   APPENDIX: How to apply the Apache License to your work.\n\n      To apply the Apache License to your work, attach the following\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\n      replaced with your own identifying information. 
(Don't include\n      the brackets!)  The text should be enclosed in the appropriate\n      comment syntax for the file format. We also recommend that a\n      file or class name and description of purpose be included on the\n      same \"printed page\" as the copyright notice for easier\n      identification within third-party archives.\n\n   Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n\n   Licensed under the Apache License, Version 2.0 (the \"License\");\n   you may not use this file except in compliance with the License.\n   You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n   Unless required by applicable law or agreed to in writing, software\n   distributed under the License is distributed on an \"AS IS\" BASIS,\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n   See the License for the specific language governing permissions and\n   limitations under the License.\n"
  },
  {
    "path": "README.md",
    "content": "English | [简体中文](README_ch.md)\n\n<p align=\"center\">\n <img src=\"./docs/imgs/paddlehub_logo.jpg\" align=\"middle\" width=\"400\" />\n<p align=\"center\">\n<div align=\"center\">  \n  <h3> <a href=#QuickStart> Quick Start </a> | <a href=\"./modules\"> Model List </a> | <a href=#demos> Demos </a> </h3>\n</div>\n\n------------------------------------------------------------------------------------------\n\n<p align=\"center\">\n    <a href=\"./LICENSE\"><img src=\"https://img.shields.io/badge/license-Apache%202-dfd.svg\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/badge/python-3.6.2+-aff.svg\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/badge/os-linux%2C%20win%2C%20mac-pink.svg\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/pypi/format/paddlehub?color=c77\"></a>\n    <a href=\"https://pypi.org/project/paddlehub/\"><img src=\"https://img.shields.io/pypi/dm/paddlehub?color=9cf\"></a>\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\"><img src=\"https://img.shields.io/github/stars/PaddlePaddle/PaddleHub?color=ccf\"></a>\n    <a href=\"https://huggingface.co/PaddlePaddle\"><img src=\"https://img.shields.io/badge/%F0%9F%A4%97-Hugging%20Face-blue\"></a>\n</p>\n\n\n## ⭐Features\n- **📦400+ AI Models**: Rich, high-quality AI models, including CV, NLP, Speech, Video and Cross-Modal. 
\n- **🧒Easy to Use**: 3 lines of code to predict 400+ AI models.\n- **💁Model As Service**: Serve a model with just one line of command.\n- **💻Cross-platform**: Supports Linux, Windows, and macOS.\n\n### 💥Recent Updates\n- **🔥2022.08.19:** v2.3.0 is released 🎉\n  -  Supports [**ERNIE-ViLG**](./modules/image/text_to_image/ernie_vilg)([HuggingFace Space Demo](https://huggingface.co/spaces/PaddlePaddle/ERNIE-ViLG))\n  -  Supports [**Disco Diffusion (DD)**](./modules/image/text_to_image/disco_diffusion_clip_vitb32) and [**Stable Diffusion (SD)**](./modules/image/text_to_image/stable_diffusion)\n\n- **2022.02.18:** Released models to the HuggingFace [PaddlePaddle Space](https://huggingface.co/PaddlePaddle)\n\n- For earlier releases, please refer to the [**PaddleHub Release Note**](./docs/docs_en/release.md)\n\n\n<a name=\"demos\"></a>\n## 🌈Visualization Demo\n\n\n\n\n\n#### 🏜️ [Text-to-Image Models](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image)\n<div align=\"center\">\n<table>\n    <tr>\n        <td><img src=\"https://user-images.githubusercontent.com/59186797/200235049-fefa7642-6c4c-4f93-bd84-3b36a8a80595.gif\"  width = \"100%\"></td>\n        <td><img src=\"https://user-images.githubusercontent.com/59186797/200244625-77310db8-c9b2-4293-8fe9-c9aae27ee462.gif\" width = \"80%\"></td>\n        <td><img src=\"https://user-images.githubusercontent.com/59186797/200245387-daaf576d-8224-4937-82b8-27e31ee2df16.gif\" width = \"100%\"></td>\n    </tr>\n    <tr>\n        <td align=\"center\"><a href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/ernie_vilg\">Wenxin Big Models</a></td>\n        <td align=\"center\"><a href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion\">Stable_Diffusion series</a></td>\n        <td align=\"center\"><a 
href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/disco_diffusion_ernievil_base\">Disco Diffusion series</a></td>\n    </tr>\n    <tr>\n        <td align=\"center\">Includes ERNIE-ViLG, ERNIE-ViL, and ERNIE 3.0 Zeus; supports applications such as text-to-image, essay writing, summarization, couplets, question answering, novel writing, and text completion.</td>\n        <td align=\"center\">Supports functions such as text_to_image, image_to_image, inpainting, ACGN external service, etc.</td>\n        <td align=\"center\">Supports Chinese and English input</td>\n    </tr>\n</table>\n</div>\n\n\n\n\n#### 👓 [Computer Vision Models](./modules#Image)\n<div align=\"center\">\n<img src=\"./docs/imgs/Readme_Related/Image_all.gif\"  width = \"530\" height = \"400\" />\n</div>\n\n\n- Many thanks to Copyright@[PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR), [PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection), [PaddleGAN](https://github.com/PaddlePaddle/PaddleGAN), [AnimeGAN](https://github.com/TachibanaYoshino/AnimeGANv2), [openpose](https://github.com/CMU-Perceptual-Computing-Lab/openpose), [PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg), [Zhengxia Zou](https://github.com/jiupinjia/SkyAR), [PaddleClas](https://github.com/PaddlePaddle/PaddleClas) for the pre-trained models; you can try to train your own models with them.\n\n\n#### 🎤 [Natural Language Processing Models](./modules#Text)\n<div align=\"center\">\n<img src=\"./docs/imgs/Readme_Related/Text_all.gif\"  width = \"640\" height = \"240\" />\n</div>\n\n- Many thanks to Copyright@[ERNIE](https://github.com/PaddlePaddle/ERNIE), [LAC](https://github.com/baidu/LAC), [DDParser](https://github.com/baidu/DDParser) for the pre-trained models; you can try to train your own models with them.\n\n\n\n#### 🎧 [Speech Models](./modules#Audio)\n<div align=\"center\">\n<table>\n    <thead>\n        <tr>\n            <th width=250> Input Audio  </th>\n            <th width=550> 
Recognition Result  </th>\n        </tr>\n    </thead>\n    <tbody>\n        <tr>\n            <td align = \"center\">\n            <a href=\"https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav\" rel=\"nofollow\">\n                    <img align=\"center\" src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 ></a><br>\n            </td>\n            <td >I knocked at the door on the ancient side of the building.</td>\n            </tr>\n            <tr>\n            <td align = \"center\">\n            <a href=\"https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav\" rel=\"nofollow\">\n                    <img align=\"center\" src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250></a><br>\n            </td>\n            <td>我认为跑步最重要的就是给我带来了身体健康。</td>\n        </tr>\n    </tbody>\n</table>\n</div>\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>Input Text </th>\n            <th>Output Audio </th>\n        </tr>\n        <tr>\n            <th>Life was like a box of chocolates, you never know what you're gonna get.</th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/fastspeech_ljspeech-0.wav\">\n            <img src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n- Many thanks to Copyright@[PaddleSpeech](https://github.com/PaddlePaddle/PaddleSpeech) for the pre-trained models; you can try to train your own models with PaddleSpeech.\n\n\n### ⭐ Thanks for Your Star\n- All the pre-trained models above are **open source and free**, and the number of models is continuously growing. 
**Star** the repo to stay up to date.\n<div align=\"center\">\n<a href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\">\n    <img src=\"./docs/imgs/Readme_Related/star_en.png\"  width = \"411\" height = \"100\" /></a>  \n</div>\n\n<a name=\"Welcome_joinus\"></a>\n\n## 🍻Welcome to Join the PaddleHub Technical Group\n\n- If you have any questions while using the models, you can join the official WeChat group for more efficient Q&A and to communicate with developers from all walks of life. We look forward to your joining.\n<div align=\"center\">\n<img src=\"./docs/imgs/joinus.PNG\"  width = \"200\" height = \"200\" />\n</div>\n\n- Please add the WeChat account above and send \"Hub\" to the bot; it will automatically invite you to join the group.\n\n<a name=\"QuickStart\"></a>\n## ✈️QuickStart\n\n#### 🚁Install the required components.\n```python\n# install paddlepaddle with gpu\n# !pip install --upgrade paddlepaddle-gpu\n\n# or install paddlepaddle with cpu\n!pip install --upgrade paddlepaddle\n\n# install paddlehub\n!pip install --upgrade paddlehub\n```\n\n#### 🛫The simplest case: Chinese word segmentation.\n\n```python\nimport paddlehub as hub\n\nlac = hub.Module(name=\"lac\")\ntest_text = [\"今天是个好天气。\"]\n\nresults = lac.cut(text=test_text, use_gpu=False, batch_size=1, return_tag=True)\nprint(results)\n#{'word': ['今天', '是', '个', '好天气', '。'], 'tag': ['TIME', 'v', 'q', 'n', 'w']}\n```\n#### 🛰️The simplest command to deploy the lac service.\n\n```python\n!hub serving start -m lac\n```\n\n- 📣For more model descriptions, please refer to the [Models List](./modules)\n\n<a name=\"License\"></a>\n## 📚License\nThis project is released under the <a href=\"./LICENSE\">Apache 2.0 license</a>.\n\n<a name=\"Contribution\"></a>\n## 👨‍👨‍👧‍👦Contribution\n\n<p align=\"center\">\n    <a href=\"https://github.com/nepeplwu\"><img src=\"https://avatars.githubusercontent.com/u/45024560?v=4\" width=75 height=75></a>\n    <a 
href=\"https://github.com/Steffy-zxf\"><img src=\"https://avatars.githubusercontent.com/u/48793257?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ZeyuChen\"><img src=\"https://avatars.githubusercontent.com/u/1371212?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ShenYuhan\"><img src=\"https://avatars.githubusercontent.com/u/28444161?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/kinghuin\"><img src=\"https://avatars.githubusercontent.com/u/11913168?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/grasswolfs\"><img src=\"https://avatars.githubusercontent.com/u/23690325?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/haoyuying\"><img src=\"https://avatars.githubusercontent.com/u/35907364?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/sjtubinlong\"><img src=\"https://avatars.githubusercontent.com/u/2063170?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/KPatr1ck\"><img src=\"https://avatars.githubusercontent.com/u/22954146?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jm12138\"><img src=\"https://avatars.githubusercontent.com/u/15712990?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/DesmonDay\"><img src=\"https://avatars.githubusercontent.com/u/20554008?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/chunzhang-hub\"><img src=\"https://avatars.githubusercontent.com/u/63036966?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/rainyfly\"><img src=\"https://avatars.githubusercontent.com/u/22424850?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/adaxiadaxi\"><img src=\"https://avatars.githubusercontent.com/u/58928121?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/linjieccc\"><img src=\"https://avatars.githubusercontent.com/u/40840292?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/linshuliang\"><img 
src=\"https://avatars.githubusercontent.com/u/15993091?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/eepgxxy\"><img src=\"https://avatars.githubusercontent.com/u/15946195?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/paopjian\"><img src=\"https://avatars.githubusercontent.com/u/20377352?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/zbp-xxxp\"><img src=\"https://avatars.githubusercontent.com/u/58476312?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/houj04\"><img src=\"https://avatars.githubusercontent.com/u/35131887?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Wgm-Inspur\"><img src=\"https://avatars.githubusercontent.com/u/89008682?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/AK391\"><img src=\"https://avatars.githubusercontent.com/u/81195143?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/apps/dependabot\"><img src=\"https://avatars.githubusercontent.com/in/29110?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/dxxxp\"><img src=\"https://avatars.githubusercontent.com/u/15886898?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jianganbai\"><img src=\"https://avatars.githubusercontent.com/u/50263321?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/1084667371\"><img src=\"https://avatars.githubusercontent.com/u/50902619?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Channingss\"><img src=\"https://avatars.githubusercontent.com/u/12471701?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Austendeng\"><img src=\"https://avatars.githubusercontent.com/u/16330293?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/BurrowsWang\"><img src=\"https://avatars.githubusercontent.com/u/478717?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cqvu\"><img src=\"https://avatars.githubusercontent.com/u/37096589?v=4\" width=75 height=75></a>\n    <a 
href=\"https://github.com/DeepGeGe\"><img src=\"https://avatars.githubusercontent.com/u/51083814?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Haijunlv\"><img src=\"https://avatars.githubusercontent.com/u/28926237?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/holyseven\"><img src=\"https://avatars.githubusercontent.com/u/13829174?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/MRXLT\"><img src=\"https://avatars.githubusercontent.com/u/16594411?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cclauss\"><img src=\"https://avatars.githubusercontent.com/u/3709715?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/hu-qi\"><img src=\"https://avatars.githubusercontent.com/u/17986122?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/itegel\"><img src=\"https://avatars.githubusercontent.com/u/8164474?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jayhenry\"><img src=\"https://avatars.githubusercontent.com/u/4285375?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/hlmu\"><img src=\"https://avatars.githubusercontent.com/u/30133236?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/shinichiye\"><img src=\"https://avatars.githubusercontent.com/u/76040149?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/will-jl944\"><img src=\"https://avatars.githubusercontent.com/u/68210528?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/yma-admin\"><img src=\"https://avatars.githubusercontent.com/u/40477813?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/zl1271\"><img src=\"https://avatars.githubusercontent.com/u/22902089?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/brooklet\"><img src=\"https://avatars.githubusercontent.com/u/1585799?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/wj-Mcat\"><img src=\"https://avatars.githubusercontent.com/u/10242208?v=4\" width=75 
height=75></a>\n</p>\n\nWe welcome you to contribute code to PaddleHub, and thank you for your feedback.\n\n* Many thanks to [肖培楷](https://github.com/jm12138) for contributing the street scene cartoonization, portrait cartoonization, gesture key point recognition, sky replacement, depth estimation, portrait segmentation, and other modules\n* Many thanks to [Austendeng](https://github.com/Austendeng) for fixing the SequenceLabelReader\n* Many thanks to [cclauss](https://github.com/cclauss) for optimizing the Travis CI checks\n* Many thanks to [奇想天外](http://www.cheerthink.com/) for contributing a demo of mask detection\n* Many thanks to [mhlwsk](https://github.com/mhlwsk) for fixing the sequence annotation prediction demo\n* Many thanks to [zbp-xxxp](https://github.com/zbp-xxxp) for contributing modules that write poems from pictures\n* Many thanks to [zbp-xxxp](https://github.com/zbp-xxxp) and [七年期限](https://github.com/1084667371) for jointly contributing the Mid-Autumn Festival special edition module\n* Many thanks to [livingbody](https://github.com/livingbody) for contributing style transfer models based on PaddleHub's capabilities and the Mid-Autumn Festival WeChat mini program\n* Many thanks to [BurrowsWang](https://github.com/BurrowsWang) for fixing a Markdown table display problem\n* Many thanks to [huqi](https://github.com/hu-qi) for fixing a readme typo\n* Many thanks to [parano](https://github.com/parano), [cqvu](https://github.com/cqvu), and [deehrlic](https://github.com/deehrlic) for their feature contributions to PaddleHub\n* Many thanks to [paopjian](https://github.com/paopjian) for correcting a wrong website address [#1424](https://github.com/PaddlePaddle/PaddleHub/issues/1424)\n* Many thanks to [Wgm-Inspur](https://github.com/Wgm-Inspur) for correcting demo errors in the readme and updating the RNN illustration in the text classification and sequence labeling demos\n* Many thanks to [zl1271](https://github.com/zl1271) for fixing a serving docs typo\n* Many thanks to 
[AK391](https://github.com/AK391) for adding the webdemo of UGATIT and deoldify models in Hugging Face spaces\n* Many thanks to [itegel](https://github.com/itegel) for fixing quick start docs typo\n* Many thanks to [AK391](https://github.com/AK391) for adding the webdemo of Photo2Cartoon model in Hugging Face spaces\n"
  },
  {
    "path": "README_ch.md",
    "content": "简体中文 | [English](README.md)\n\n<p align=\"center\">\n <img src=\"./docs/imgs/paddlehub_logo.jpg\" align=\"middle\">\n<p align=\"center\">\n<div align=\"center\">  \n  <h3> <a href=#QuickStart> 快速开始 </a> | <a href=\"https://paddlehub.readthedocs.io/zh_CN/release-v2.1//\"> 教程文档 </a> | <a href=\"./modules/README_ch.md\"> 模型库 </a> | <a href=\"https://www.paddlepaddle.org.cn/hub\"> 演示Demo </a>\n  </h3>\n</div>\n\n------------------------------------------------------------------------------------------\n\n<p align=\"center\">\n    <a href=\"./LICENSE\"><img src=\"https://img.shields.io/badge/license-Apache%202-dfd.svg\"></a>\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/releases\"><img src=\"https://img.shields.io/github/v/release/PaddlePaddle/PaddleHub?color=ffa\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/badge/python-3.6.2+-aff.svg\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/badge/os-linux%2C%20win%2C%20mac-pink.svg\"></a>\n    <a href=\"\"><img src=\"https://img.shields.io/pypi/format/paddlehub?color=c77\"></a>\n</p>\n<p align=\"center\">\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/graphs/contributors\"><img src=\"https://img.shields.io/github/contributors/PaddlePaddle/PaddleHub?color=9ea\"></a>\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/commits\"><img src=\"https://img.shields.io/github/commit-activity/m/PaddlePaddle/PaddleHub?color=3af\"></a>\n    <a href=\"https://pypi.org/project/paddlehub/\"><img src=\"https://img.shields.io/pypi/dm/paddlehub?color=9cf\"></a>\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/issues\"><img src=\"https://img.shields.io/github/issues/PaddlePaddle/PaddleHub?color=9cc\"></a>\n    <a href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\"><img src=\"https://img.shields.io/github/stars/PaddlePaddle/PaddleHub?color=ccf\"></a>\n</p>\n\n\n\n\n## 简介与特性\n- PaddleHub旨在为开发者提供丰富的、高质量的、直接可用的预训练模型\n- **【模型种类丰富】**: 涵盖大模型、CV、NLP、Audio、Video、工业应用主流六大品类的 
**400+** 预训练模型，全部开源下载，离线可运行\n- **【超低使用门槛】**：无需深度学习背景、无需数据与训练过程，可快速使用AI模型\n- **【一键模型快速预测】**：通过一行命令行或者极简的Python API实现模型调用，可快速体验模型效果\n- **【一键模型转服务化】**：一行命令，搭建深度学习模型API服务化部署能力\n- **【跨平台兼容性】**：可运行于Linux、Windows、MacOS等多种操作系统\n\n## 近期更新\n- **🔥2022.10.20:** 发布v2.3.1版本新增Stable_Diffusion系列模型和超分模型\n    - 支持[文生图](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion)、[图生图](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion_img2img)、[图修复](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion_inpainting)、[二次元专属waifu](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion_waifu)等4个模型。\n    - 基于 [SwinIR-L](https://www.paddlepaddle.org.cn/hubdetail?name=swinir_l_real_sr_x4&en_category=ImageEditing) 的 4 倍现实图像超分辨率模型\n- **🔥2022.08.19:** 发布v2.3.0版本新增[文心大模型](https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/ernie_vilg)和[disco diffusion(dd)](https://www.paddlepaddle.org.cn/hubdetail?name=disco_diffusion_ernievil_base&en_category=TextToImage)系列文图生成模型。\n   - 支持对[文心大模型API](https://wenxin.baidu.com/moduleApi)的调用, 包括 文图生成模型ERNIE-ViLG, 以及支持写作文、写文案、写摘要、对对联、自由问答、写小说、补全文本等多个应用的语言模型ERNIE 3.0 Zeus\n   - 新增基于disco diffusion技术的文图生成dd系列模型([免费GPU体验Demo](https://aistudio.baidu.com/aistudio/projectdetail/4462918))。\n- **2022.02.18:** 加入Huggingface，创建了PaddlePaddle的可视化空间并上传了模型: [PaddlePaddle Huggingface](https://huggingface.co/PaddlePaddle)。\n\n- **🔥2021.12.22**，发布v2.2.0版本新增[预训练模型库官网](https://www.paddlepaddle.org.cn/hublist)，新增100+高质量模型，涵盖对话、语音处理、语义分割、文字识别、文本处理、图像生成等多个领域，预训练模型总量达到【360+】；\n  \n\n\n- [More](./docs/docs_ch/release.md)\n\n\n\n## **精品模型效果展示[【更多】](./docs/docs_ch/visualization.md)[【模型库】](./modules/README_ch.md)**\n\n### **[大模型（10个）](./modules/README_ch.md#图像)**\n<div align=\"center\">\n<table>\n    <tr>\n        <td><img 
src=\"https://user-images.githubusercontent.com/59186797/200235049-fefa7642-6c4c-4f93-bd84-3b36a8a80595.gif\"  width = \"100%\"></td>\n        <td><img src=\"https://user-images.githubusercontent.com/59186797/200244625-77310db8-c9b2-4293-8fe9-c9aae27ee462.gif\" width = \"90%\"></td>\n        <td><img src=\"https://user-images.githubusercontent.com/59186797/200245387-daaf576d-8224-4937-82b8-27e31ee2df16.gif\" width = \"100%\"></td>\n    <tr>\n    <tr>\n        <td align=\"center\"><a href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/ernie_vilg\">文心大模型</a></td>\n        <td align=\"center\"><a href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/stable_diffusion\">Stable_Diffusion系列模型</a></td>\n        <td align=\"center\"><a href=\"https://github.com/PaddlePaddle/PaddleHub/tree/develop/modules/image/text_to_image/disco_diffusion_ernievil_base\">Disco Diffusion系列模型</a></td>\n        \n<tr>\n\n<tr>\n        <td align=\"center\">支持文图生成、写作文、写文案、写摘要、对对联、自由问答、写小说、补全文本等多个应用。</td>\n        <td align=\"center\">支持文生图、图生图、图修复、二次元专属waifu等功能</td>\n        <td align=\"center\">支持中英输入</td>\n        \n<tr>\n\n</table>\n</div>\n\n\n\n\n\n### **[图像类（212个）](./modules/README_ch.md#图像)**\n- 包括图像分类、人脸检测、口罩检测、车辆检测、人脸/人体/手部关键点检测、人像分割、80+语言文本识别、图像超分/上色/动漫化等\n<div align=\"center\">\n<img src=\"./docs/imgs/Readme_Related/Image_all.gif\"  width = \"530\" height = \"400\" />\n</div>\n\n- 感谢CopyRight@[PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)、[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection)、[PaddleGAN](https://github.com/PaddlePaddle/PaddleGAN)、[AnimeGAN](https://github.com/TachibanaYoshino/AnimeGANv2)、[openpose](https://github.com/CMU-Perceptual-Computing-Lab/openpose)、[PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg)、[Zhengxia Zou](https://github.com/jiupinjia/SkyAR)、[PaddleClas](https://github.com/PaddlePaddle/PaddleClas) 提供相关预训练模型，训练能力开放，欢迎体验。\n\n\n### 
**[文本类（130个）](./modules/README_ch.md#文本)**\n- 包括中文分词、词性标注与命名实体识别、句法分析、AI写诗/对联/情话/藏头诗、中文的评论情感分析、中文色情文本审核等\n<div align=\"center\">\n<img src=\"./docs/imgs/Readme_Related/Text_all.gif\"  width = \"640\" height = \"240\" />\n</div>\n\n- 感谢CopyRight@[ERNIE](https://github.com/PaddlePaddle/ERNIE)、[LAC](https://github.com/baidu/LAC)、[DDParser](https://github.com/baidu/DDParser)提供相关预训练模型，训练能力开放，欢迎体验。\n\n\n### **[语音类（15个）](./modules/README_ch.md#语音)**\n- ASR语音识别算法，多种算法可选\n- 语音识别效果如下:\n<div align=\"center\">\n<table>\n    <thead>\n        <tr>\n            <th width=250> Input Audio  </th>\n            <th width=550> Recognition Result  </th>\n        </tr>\n    </thead>\n    <tbody>\n        <tr>\n            <td align = \"center\">\n            <a href=\"https://paddlespeech.bj.bcebos.com/PaddleAudio/en.wav\" rel=\"nofollow\">\n                    <img align=\"center\" src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 ></a><br>\n            </td>\n            <td >I knocked at the door on the ancient side of the building.</td>\n            </tr>\n            <tr>\n            <td align = \"center\">\n            <a href=\"https://paddlespeech.bj.bcebos.com/PaddleAudio/zh.wav\" rel=\"nofollow\">\n                    <img align=\"center\" src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250></a><br>\n            </td>\n            <td>我认为跑步最重要的就是给我带来了身体健康。</td>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n- TTS语音合成算法，多种算法可选\n- 输入：`Life was like a box of chocolates, you never know what you're gonna get.`\n- 合成效果如下:\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>deepvoice3 </th>\n            <th>fastspeech </th>\n            <th>transformer</th>\n        </tr>\n        <tr>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/deepvoice3_ljspeech-0.wav\">\n            <img src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n         
   <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/fastspeech_ljspeech-0.wav\">\n            <img src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/transformer_tts_ljspeech-0.wav\">\n            <img src=\"./docs/imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n- 感谢CopyRight@[PaddleSpeech](https://github.com/PaddlePaddle/PaddleSpeech)提供预训练模型，训练能力开放，欢迎体验。\n\n### **[视频类（8个）](./modules/README_ch.md#视频)**\n- 包含短视频分类，支持3000+标签种类，可输出TOP-K标签，多种算法可选。\n- 感谢CopyRight@[PaddleVideo](https://github.com/PaddlePaddle/PaddleVideo)提供预训练模型，训练能力开放，欢迎体验。\n- `举例：输入一段游泳的短视频，算法可以输出\"游泳\"结果`\n<div align=\"center\">\n<img src=\"./docs/imgs/Readme_Related/Text_Video.gif\"  width = \"400\" height = \"400\" />\n</div>\n\n\n\n\n##  ===划重点===\n- 以上所有预训练模型全部开源，模型数量持续更新，欢迎**⭐Star⭐**关注。\n<div align=\"center\">\n<a href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\">\n            <img src=\"./docs/imgs/Readme_Related/star.png\"  width = \"411\" height = \"100\" /></a>  \n</div>\n\n<a name=\"欢迎加入PaddleHub技术交流群\"></a>\n## 欢迎加入PaddleHub技术交流群\n- 在使用模型过程中有任何问题，可以加入官方微信群，获得更高效的问题答疑，与各行各业开发者充分交流，期待您的加入。\n<div align=\"center\">\n<img src=\"./docs/imgs/joinus.PNG\"  width = \"200\" height = \"200\" />\n</div>  \n扫码备注\"Hub\"加好友之后，再发送“Hub”，会自动邀请您入群。  \n\n<div id=\"QuickStart\">\n\n\n\n\n## 快速开始\n\n[【零基础windows安装并实现图像风格迁移】](./docs/docs_ch/get_start/windows_quickstart.md)\n\n[【零基础mac安装并实现图像风格迁移】](./docs/docs_ch/get_start/mac_quickstart.md)\n\n[【零基础linux安装并实现图像风格迁移】](./docs/docs_ch/get_start/linux_quickstart.md)\n\n### 快速安装相关组件\n</div>\n\n```python\n!pip install --upgrade paddlepaddle -i https://mirror.baidu.com/pypi/simple\n!pip install --upgrade paddlehub -i https://mirror.baidu.com/pypi/simple\n```\n\n### 极简中文分词案例  \n</div>\n\n```python\nimport paddlehub as hub\n\nlac = 
hub.Module(name=\"lac\")\ntest_text = [\"今天是个好天气。\"]\n\nresults = lac.cut(text=test_text, use_gpu=False, batch_size=1, return_tag=True)\nprint(results)\n#{'word': ['今天', '是', '个', '好天气', '。'], 'tag': ['TIME', 'v', 'q', 'n', 'w']}\n```\n\n### 一行代码部署lac（词法分析）模型\n</div>\n\n```python\n!hub serving start -m lac\n```\n\n 欢迎用户通过[模型搜索](https://www.paddlepaddle.org.cn/hublist)发现更多实用的预训练模型！\n\n 更多迁移学习能力可以参考[教程文档](https://paddlehub.readthedocs.io/zh_CN/release-v2.1/transfer_learning_index.html)\n\n\n\n\n<a name=\"许可证书\"></a>\n## 许可证书\n本项目的发布受<a href=\"./LICENSE\">Apache 2.0 license</a>许可认证。\n\n<a name=\"致谢\"></a>\n## 致谢开发者\n\n<p align=\"center\">\n    <a href=\"https://github.com/nepeplwu\"><img src=\"https://avatars.githubusercontent.com/u/45024560?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Steffy-zxf\"><img src=\"https://avatars.githubusercontent.com/u/48793257?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ZeyuChen\"><img src=\"https://avatars.githubusercontent.com/u/1371212?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ShenYuhan\"><img src=\"https://avatars.githubusercontent.com/u/28444161?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/kinghuin\"><img src=\"https://avatars.githubusercontent.com/u/11913168?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/grasswolfs\"><img src=\"https://avatars.githubusercontent.com/u/23690325?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/haoyuying\"><img src=\"https://avatars.githubusercontent.com/u/35907364?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/sjtubinlong\"><img src=\"https://avatars.githubusercontent.com/u/2063170?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/KPatr1ck\"><img src=\"https://avatars.githubusercontent.com/u/22954146?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jm12138\"><img src=\"https://avatars.githubusercontent.com/u/15712990?v=4\" width=75 
height=75></a>\n    <a href=\"https://github.com/DesmonDay\"><img src=\"https://avatars.githubusercontent.com/u/20554008?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/chunzhang-hub\"><img src=\"https://avatars.githubusercontent.com/u/63036966?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/rainyfly\"><img src=\"https://avatars.githubusercontent.com/u/22424850?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/adaxiadaxi\"><img src=\"https://avatars.githubusercontent.com/u/58928121?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/linjieccc\"><img src=\"https://avatars.githubusercontent.com/u/40840292?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/linshuliang\"><img src=\"https://avatars.githubusercontent.com/u/15993091?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/eepgxxy\"><img src=\"https://avatars.githubusercontent.com/u/15946195?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/paopjian\"><img src=\"https://avatars.githubusercontent.com/u/20377352?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/zbp-xxxp\"><img src=\"https://avatars.githubusercontent.com/u/58476312?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/houj04\"><img src=\"https://avatars.githubusercontent.com/u/35131887?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Wgm-Inspur\"><img src=\"https://avatars.githubusercontent.com/u/89008682?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/AK391\"><img src=\"https://avatars.githubusercontent.com/u/81195143?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/apps/dependabot\"><img src=\"https://avatars.githubusercontent.com/in/29110?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/dxxxp\"><img src=\"https://avatars.githubusercontent.com/u/15886898?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jianganbai\"><img 
src=\"https://avatars.githubusercontent.com/u/50263321?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/1084667371\"><img src=\"https://avatars.githubusercontent.com/u/50902619?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Channingss\"><img src=\"https://avatars.githubusercontent.com/u/12471701?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Austendeng\"><img src=\"https://avatars.githubusercontent.com/u/16330293?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/BurrowsWang\"><img src=\"https://avatars.githubusercontent.com/u/478717?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cqvu\"><img src=\"https://avatars.githubusercontent.com/u/37096589?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/DeepGeGe\"><img src=\"https://avatars.githubusercontent.com/u/51083814?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Haijunlv\"><img src=\"https://avatars.githubusercontent.com/u/28926237?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/holyseven\"><img src=\"https://avatars.githubusercontent.com/u/13829174?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/MRXLT\"><img src=\"https://avatars.githubusercontent.com/u/16594411?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cclauss\"><img src=\"https://avatars.githubusercontent.com/u/3709715?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/hu-qi\"><img src=\"https://avatars.githubusercontent.com/u/17986122?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/itegel\"><img src=\"https://avatars.githubusercontent.com/u/8164474?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jayhenry\"><img src=\"https://avatars.githubusercontent.com/u/4285375?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/hlmu\"><img src=\"https://avatars.githubusercontent.com/u/30133236?v=4\" width=75 height=75></a>\n    <a 
href=\"https://github.com/shinichiye\"><img src=\"https://avatars.githubusercontent.com/u/76040149?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/will-jl944\"><img src=\"https://avatars.githubusercontent.com/u/68210528?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/yma-admin\"><img src=\"https://avatars.githubusercontent.com/u/40477813?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/zl1271\"><img src=\"https://avatars.githubusercontent.com/u/22902089?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/brooklet\"><img src=\"https://avatars.githubusercontent.com/u/1585799?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/wj-Mcat\"><img src=\"https://avatars.githubusercontent.com/u/10242208?v=4\" width=75 height=75></a>\n</p>\n\n我们非常欢迎您为PaddleHub贡献代码，也十分感谢您的反馈。\n\n* 非常感谢[肖培楷](https://github.com/jm12138)贡献了街景动漫化、人像动漫化、手势关键点识别、天空置换、深度估计、人像分割等module\n* 非常感谢[Austendeng](https://github.com/Austendeng)贡献了修复SequenceLabelReader的pr\n* 非常感谢[cclauss](https://github.com/cclauss)贡献了优化travis-ci检查的pr\n* 非常感谢[奇想天外](http://www.cheerthink.com/)贡献了口罩检测的demo\n* 非常感谢[mhlwsk](https://github.com/mhlwsk)贡献了修复序列标注预测demo的pr\n* 非常感谢[zbp-xxxp](https://github.com/zbp-xxxp)和[七年期限](https://github.com/1084667371)联合贡献了看图写诗中秋特别版module、谣言预测、请假条生成等module\n* 非常感谢[livingbody](https://github.com/livingbody)贡献了基于PaddleHub能力的风格迁移和中秋看图写诗微信小程序\n* 非常感谢[BurrowsWang](https://github.com/BurrowsWang)修复Markdown表格显示问题\n* 非常感谢[huqi](https://github.com/hu-qi)修复了readme中的错别字\n* 非常感谢[parano](https://github.com/parano)、[cqvu](https://github.com/cqvu)、[deehrlic](https://github.com/deehrlic)三位的贡献与支持\n* 非常感谢[paopjian](https://github.com/paopjian)修改了中文readme模型搜索指向的网站地址错误[#1424](https://github.com/PaddlePaddle/PaddleHub/issues/1424)\n* 非常感谢[Wgm-Inspur](https://github.com/Wgm-Inspur)修复了readme中的代码示例问题，并优化了文本分类、序列标注demo中的RNN示例图\n* 非常感谢[zl1271](https://github.com/zl1271)修复了serving文档中的错别字\n* 非常感谢[AK391](https://github.com/AK391)在Hugging Face Spaces中添加了UGATIT和DeOldify模型的web demo\n* 非常感谢[itegel](https://github.com/itegel)修复了快速开始文档中的错别字\n* 非常感谢[AK391](https://github.com/AK391)在Hugging Face Spaces中添加了Photo2Cartoon模型的web demo\n"
  },
  {
    "path": "demo/README.md",
    "content": "### PaddleHub Official Website: https://www.paddlepaddle.org.cn/hub\r\n### PaddleHub Module Search: https://www.paddlepaddle.org.cn/hublist\r\n"
  },
  {
    "path": "demo/audio_classification/README.md",
    "content": "# PaddleHub 声音分类\n\n本示例展示如何使用PaddleHub Fine-tune API以及CNN14等预训练模型完成声音分类和Tagging的任务。\n\nCNN14等预训练模型的详情，请参考论文[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)和代码[audioset_tagging_cnn](https://github.com/qiuqiangkong/audioset_tagging_cnn)。\n\n\n## 如何开始Fine-tune\n\n我们以环境声音分类公开数据集[ESC50](https://github.com/karolpiczak/ESC-50)为示例数据集。通过如下命令，即可在训练集（train.npz）上进行模型训练，并在开发集（dev.npz）上验证：\n\n```shell\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n```python\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task='sound-cls', num_class=ESC50.num_class)\n```\n\n其中，参数：\n- `name`: 模型名称，可以选择`panns_cnn14`、`panns_cnn10`和`panns_cnn6`，具体的模型参数信息可见下表。\n- `version`: module版本号\n- `task`：模型的执行任务。`sound-cls`表示声音分类任务；`None`表示Audio Tagging任务。\n- `num_class`：表示当前声音分类任务的类别数，根据具体使用的数据集确定。\n\n目前可选用的预训练模型：\n模型名      | PaddleHub Module\n-----------| :------:\nCNN14      | `hub.Module(name='panns_cnn14')`\nCNN10      | `hub.Module(name='panns_cnn10')`\nCNN6       | `hub.Module(name='panns_cnn6')`\n\n### Step2: 加载数据集\n\n```python\ntrain_dataset = ESC50(mode='train')\ndev_dataset = ESC50(mode='dev')\n```\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=50,\n    batch_size=16,\n    eval_dataset=dev_dataset,\n    save_interval=10,\n)\ntrainer.evaluate(dev_dataset, batch_size=16)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: 数据读取worker的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size。\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用训练好的模型对它进行分类，输出结果。\n\n```python\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ncheckpoint = './best_model/model.pdparams'  # 模型checkpoint\n\nlabel_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\nmodel = hub.Module(name='panns_cnn14',\n                   version='1.0.0',\n                   task='sound-cls',\n                   num_class=ESC50.num_class,\n                   label_map=label_map,\n                   load_checkpoint=checkpoint)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\nprint(result[0])  # result[0]包含音频文件属于各类别的概率值\n```\n\n\n## Audio Tagging\n\n当前使用的模型是基于[Audioset数据集](https://research.google.com/audioset/)的预训练模型，除了以上的针对特定声音分类数据集的finetune任务，模型还支持基于Audioset 527个标签的Tagging功能。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用预训练模型对它进行打分，输出top 10的标签和对应的得分。\n\n```python\nimport os\n\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.env import MODULE_HOME\n\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ntopk = 10  # 展示音频得分前10的标签和分数\n\n# 读取audioset数据集的label文件\nlabel_file = os.path.join(MODULE_HOME, 'panns_cnn14', 'audioset_labels.txt')\nlabel_map = {}\nwith open(label_file, 'r') as f:\n    for i, l in enumerate(f.readlines()):\n        label_map[i] = l.strip()\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task=None, label_map=label_map)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\n# 打印topk的类别和对应得分\nmsg = ''\nfor label, score in list(result[0].items())[:topk]:\n    msg += f'{label}: {score}\\n'\nprint(msg)\n```\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.1.0\n"
  },
  {
    "path": "demo/audio_classification/audioset_predict.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport os\n\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.env import MODULE_HOME\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--wav\", type=str, required=True, help=\"Audio file to infer.\")\nparser.add_argument(\"--sr\", type=int, default=32000, help=\"Sample rate of inference audio.\")\nparser.add_argument(\"--model_type\", type=str, default='panns_cnn14', help=\"Select the model to inference.\")\nparser.add_argument(\"--topk\", type=int, default=10, help=\"Show top k results of audioset labels.\")\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    label_file = os.path.join(MODULE_HOME, args.model_type, 'audioset_labels.txt')\n    label_map = {}\n    with open(label_file, 'r') as f:\n        for i, l in enumerate(f.readlines()):\n            label_map[i] = l.strip()\n\n    model = hub.Module(name=args.model_type, version='1.0.0', task=None, label_map=label_map)\n\n    data = [librosa.load(args.wav, sr=args.sr)[0]]  # mono waveform of shape (t,)\n    result = model.predict(data, sample_rate=args.sr, batch_size=1, feat_type='mel', use_gpu=True)\n\n    msg = f'[{args.wav}]\\n'\n    for label, score in list(result[0].items())[:args.topk]:\n        msg += f'{label}: {score}\\n'\n    print(msg)\n"
  },
  {
    "path": "demo/audio_classification/predict.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\n\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--wav\", type=str, required=True, help=\"Audio file to infer.\")\nparser.add_argument(\"--sr\", type=int, default=44100, help=\"Sample rate of inference audio.\")\nparser.add_argument(\"--model_type\", type=str, default='panns_cnn14', help=\"Select the model to inference.\")\nparser.add_argument(\"--topk\", type=int, default=1, help=\"Show top k results of prediction labels.\")\nparser.add_argument(\n    \"--checkpoint\", type=str, default='./checkpoint/best_model/model.pdparams', help=\"Checkpoint of model.\")\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    label_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\n    model = hub.Module(\n        name=args.model_type,\n        version='1.0.0',\n        task='sound-cls',\n        num_class=ESC50.num_class,\n        label_map=label_map,\n        load_checkpoint=args.checkpoint)\n\n    data = [librosa.load(args.wav, sr=args.sr)[0]]\n    result = model.predict(data, sample_rate=args.sr, batch_size=1, feat_type='mel', use_gpu=True)\n\n    msg = f'[{args.wav}]\\n'\n    for label, score in list(result[0].items())[:args.topk]:\n        msg += f'{label}: {score}\\n'\n    print(msg)\n"
  },
  {
    "path": "demo/audio_classification/train.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport ast\n\nimport paddle\n\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--num_epoch\", type=int, default=50, help=\"Number of epochs for fine-tuning.\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether to use GPU for fine-tuning; input should be True or False\")\nparser.add_argument(\"--learning_rate\", type=float, default=5e-5, help=\"Learning rate used to train with warmup.\")\nparser.add_argument(\"--batch_size\", type=int, default=16, help=\"Number of examples in a training batch.\")\nparser.add_argument(\"--checkpoint_dir\", type=str, default='./checkpoint', help=\"Directory to save model checkpoints.\")\nparser.add_argument(\"--save_interval\", type=int, default=10, help=\"Save a checkpoint every n epochs.\")\nargs = parser.parse_args()\n\nif __name__ == \"__main__\":\n    model = hub.Module(name='panns_cnn14', task='sound-cls', num_class=ESC50.num_class)\n\n    train_dataset = ESC50(mode='train')\n    dev_dataset = ESC50(mode='dev')\n\n    optimizer = paddle.optimizer.AdamW(learning_rate=args.learning_rate, parameters=model.parameters())\n\n    trainer = hub.Trainer(model, optimizer, checkpoint_dir=args.checkpoint_dir, use_gpu=args.use_gpu)\n    trainer.train(\n        train_dataset,\n        epochs=args.num_epoch,\n        batch_size=args.batch_size,\n        eval_dataset=dev_dataset,\n        save_interval=args.save_interval,\n    )\n"
  },
  {
    "path": "demo/autoaug/README.md",
    "content": "# PaddleHub 自动数据增强\n\n本示例将展示如何使用PaddleHub搜索最适合数据的数据增强策略，并将其应用到模型训练中。\n\n## 依赖\n\n请预先通过pip安装auto-augment软件包\n\n```\npip install auto-augment\n```\n\n\n\n## auto-augment简述\n\nauto-augment软件包目前支持Paddle的图像分类任务和物体检测任务。\n\n应用时分成搜索(search)和训练(train)两个阶段\n\n**搜索阶段在预置模型上对不同算子的组合进行策略搜索，输出最优数据增强调度策略组合**\n\n**训练阶段在特定模型上应用最优数据增强调度策略组合**\n\n详细关于auto-augment的使用及benchmark可参考auto_augment/doc里的readme\n\n\n\n## 支持任务\n\n目前auto-augment支持PaddleHub的图像分类任务。\n\n后续会扩充到其他任务\n\n\n\n## 图像分类任务\n\n### 参数配置\n\n参数配置支持yaml格式描述及json格式描述，项目中仅提供yaml格式配置模板，模板统一放在configs/路径下。\n\n用户可配置参数分为task_config(任务配置)、data_config(数据配置)、resource_config(资源配置)、algo_config(算法配置)、search_space(搜索空间配置)。\n\n#### task_config(任务配置)\n\n\t任务配置细节，包括任务类型及模型细节\n\n\t具体字段如下:\n\n\trun_mode: [\"ray\", \"automl_service\"],  #表示后端采用服务，目前支持单机ray框架\n\n\twork_space: 用户工作空间\n\n\ttask_type： [\"classifier\"] #任务类型，目前PaddleHub支持图像分类单标签任务，如需物体检测任务的数据增强请参考auto_augment/doc\n\n\tclassifier: 具体任务类型的配置细节\n\n##### classifier任务配置细节\n\n- model_name: paddlehub模型名称\n- epochs: int, 任务搜索轮数，**必填**，该参数需要特殊指定\n- input_size: 模型输入尺寸\n- scale_size: 数据预处理尺寸\n- no_cache_image: 不缓存数据，默认False\n- use_class_map: 使用label_list映射\n\n\n\n#### data_config(数据配置)\n\n数据配置支持多种格式输入，包括图像分类txt标注格式、物体检测voc标注格式、物体检测coco标注格式。\n\n- train_img_prefix：str, 训练集数据路径前缀\n\n- train_ann_file：str, 训练集数据描述文件\n\n- val_img_prefix：str, 验证集数据路径前缀\n\n- val_ann_file：str, 验证集数据描述文件\n\n- label_list：str, 标签文件\n\n- delimiter： \",\"  数据描述文件采用的分隔符\n\n\n\n#### resource_config(资源配置)\n\n- gpu：float, 表示每个搜索进程的gpu分配资源，run_mode==\"ray\"模式下支持小数分配\n\n- cpu:  float, 表示每个搜索进程的cpu分配资源，run_mode==\"ray\"模式下支持小数分配\n\n\n\n#### algo_config(算法配置)\n\n算法配置目前仅支持PBA，后续会进一步拓展。\n\n##### PBA配置\n\n- algo_name: str, [\"PBA\"], 搜索算法\n- algo_param:\n  - perturbation_interval: 搜索扰动周期\n  - num_samples：搜索进程数\n\n#### search_space(搜索空间配置)\n\n搜索空间定义，策略搜索阶段必填，策略应用训练会忽略。\n\n- operators_repeat：int，默认1，表示搜索算子的重复次数。\n\n- operator_space：搜索的算子空间\n\n  1. 自定义算子模式：\n\n     htype: str, [\"choice\"] 超参类型，目前支持choice枚举\n\n     value: list, [0,0.5,1] 枚举数据\n\n     ![image-20200707162627074](./doc/operators.png)\n\n  2. 缩略版算子模式:\n\n     用户只需要指定需要搜索的算子，prob、magnitude搜索空间为系统默认配置，取值在0-1之间。\n\n     ![image-20200707162709253](./doc/short_operators.png)\n\n  支持1、2模式混合定义\n\n\n\n##### 图像分类算子\n\n[\"Sharpness\", \"Rotate\", \"Invert\", \"Brightness\", \"Cutout\", \"Equalize\",\"TranslateY\", \"AutoContrast\", \"Color\",\"TranslateX\", \"Solarize\", \"ShearX\",\"Contrast\", \"Posterize\", \"ShearY\", \"FlipLR\"]\n\n\n\n### 搜索阶段\n\n用于数据增强策略的搜索\n\n### 训练阶段\n\n在训练中应用搜索出来的数据增强策略\n\n\n\n### 示例demo\n\n#### Flower数据组织\n\n\n```\ncd PaddleHub/demo/autoaug/\nmkdir -p ./dataset\ncd dataset\nwget https://bj.bcebos.com/paddlehub-dataset/flower_photos.tar.gz\ntar -xvf flower_photos.tar.gz\n```\n\n#### 搜索流程\n\n```\ncd PaddleHub/demo/autoaug/\nbash search.sh\n# 结果会以json形式dump到workspace中，用户可利用这个json文件进行训练\n```\n\n#### 训练阶段\n\n```\ncd PaddleHub/demo/autoaug/\nbash train.sh\n```\n"
  },
  {
    "path": "demo/autoaug/hub_fitter.py",
    "content": "# -*- coding: utf-8 -*-\n#*******************************************************************************\n#\n# Copyright (c) 2020 Baidu.com, Inc. All Rights Reserved\n#\n#*******************************************************************************\n\"\"\"\n\nAuthors: lvhaijun01@baidu.com\nDate:     2020-11-24 20:43\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport time\nimport six\nimport os\nfrom typing import Dict, List, Optional, Union, Tuple\nfrom auto_augment.autoaug.utils import log\nimport logging\nlogger = log.get_logger(level=logging.INFO)\nimport auto_augment\nauto_augment_path = auto_augment.__file__\n\n\nclass HubFitterClassifer(object):\n    \"\"\"Trains an instance of the Model class.\"\"\"\n\n    def __init__(self, hparams: dict) -> None:\n        \"\"\"\n        定义分类任务的数据、模型\n\n        Args:\n            hparams:\n        \"\"\"\n\n        def set_paddle_flags(**kwargs):\n            for key, value in kwargs.items():\n                if os.environ.get(key, None) is None:\n                    os.environ[key] = str(value)\n\n        # NOTE(paddle-dev): All of these flags should be set before\n        # `import paddle`. 
Otherwise, it would not take any effect.\n        set_paddle_flags(\n            # enable GC to save memory\n            FLAGS_fraction_of_gpu_memory_to_use=hparams.resource_config.gpu, )\n        import paddle\n        import paddlehub as hub\n        from paddlehub_utils.trainer import CustomTrainer\n        from paddlehub_utils.reader import _init_loader\n\n        # todo now does not support fleet distribute training\n        # from paddle.fluid.incubate.fleet.base import role_maker\n        # from paddle.fluid.incubate.fleet.collective import fleet\n        # role = role_maker.PaddleCloudRoleMaker(is_collective=True)\n        # fleet.init(role)\n\n        logger.info(\"classficiation data augment search begin\")\n        self.hparams = hparams\n        # param compatible\n        self._fit_param(show=True)\n        paddle.disable_static(paddle.CUDAPlace(paddle.distributed.get_rank()))\n\n        train_dataset, eval_dataset = _init_loader(self.hparams)\n        model = hub.Module(\n            name=hparams[\"task_config\"][\"classifier\"][\"model_name\"],\n            label_list=self.class_to_id_dict.keys(),\n            load_checkpoint=None)\n\n        optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n        trainer = CustomTrainer(model=model, optimizer=optimizer, checkpoint_dir='img_classification_ckpt')\n        self.model = model\n        self.optimizer = optimizer\n\n        trainer.init_train_and_eval(\n            train_dataset, epochs=100, batch_size=32, eval_dataset=eval_dataset, save_interval=1)\n        self.trainer = trainer\n\n    def _fit_param(self, show: bool = False) -> None:\n        \"\"\"\n        param fit\n        Args:\n            hparams:\n\n        Returns:\n\n        \"\"\"\n        hparams = self.hparams\n        self._get_label_info(hparams)\n\n    def _get_label_info(self, hparams: dict) -> None:\n        \"\"\"\n\n        Args:\n            hparams:\n\n        Returns:\n\n        \"\"\"\n     
   from paddlehub_utils.reader import _read_classes\n        data_config = hparams.data_config\n        label_list = data_config.label_list\n        if os.path.isfile(label_list):\n            class_to_id_dict = _read_classes(label_list)\n        else:\n            assert 0, \"label_list:{} not exist\".format(label_list)\n        self.num_classes = len(class_to_id_dict)\n        self.class_to_id_dict = class_to_id_dict\n\n    def reset_config(self, new_hparams: dict) -> None:\n        \"\"\"\n        reset config, used by search stage\n        Args:\n            new_hparams:\n\n        Returns:\n\n        \"\"\"\n        self.hparams = new_hparams\n        self.trainer.train_loader.dataset.reset_policy(new_hparams.search_space)\n        return None\n\n    def save_model(self, checkpoint_dir: str, step: Optional[str] = None) -> str:\n        \"\"\"Dumps model into the backup_dir.\n\n        Args:\n          step: If provided, creates a checkpoint with the given step\n            number, instead of overwriting the existing checkpoints.\n        \"\"\"\n        checkpoint_path = os.path.join(checkpoint_dir, 'epoch') + '-' + str(step)\n        logger.info('Saving model checkpoint to {}'.format(checkpoint_path))\n        self.trainer.save_model(os.path.join(checkpoint_path, \"checkpoint\"))\n\n        return checkpoint_path\n\n    def extract_model_spec(self, checkpoint_path: str) -> None:\n        \"\"\"Loads a checkpoint with the architecture structure stored in the name.\"\"\"\n        ckpt_path = os.path.join(checkpoint_path, \"checkpoint\")\n        self.trainer.load_model(ckpt_path)\n        logger.info('Loaded child model checkpoint from {}'.format(checkpoint_path))\n\n    def eval_child_model(self, mode: str, pass_id: int = 0) -> dict:\n        \"\"\"Evaluate the child model.\n\n        Args:\n          model: image model that will be evaluated.\n          data_loader: dataset object to extract eval data from.\n          mode: will the model be evalled on train, 
val or test.\n\n        Returns:\n          Accuracy of the model on the specified dataset.\n        \"\"\"\n        eval_loader = self.trainer.eval_loader\n        res = self.trainer.evaluate_process(eval_loader)\n        top1_acc = res[\"metrics\"][\"acc\"]\n\n        if mode == \"val\":\n            return {\"val_acc\": top1_acc}\n        elif mode == \"test\":\n            return {\"test_acc\": top1_acc}\n        else:\n            raise NotImplementedError\n\n    def train_one_epoch(self, pass_id: int) -> dict:\n        \"\"\"\n\n        Args:\n            model:\n            train_loader:\n            optimizer:\n\n        Returns:\n\n        \"\"\"\n        from paddlehub.utils.utils import Timer\n\n        batch_sampler = self.trainer.batch_sampler\n        train_loader = self.trainer.train_loader\n        steps_per_epoch = len(batch_sampler)\n        task_config = self.hparams.task_config\n        task_type = task_config.task_type\n        epochs = task_config.classifier.epochs\n        timer = Timer(steps_per_epoch * epochs)\n        timer.start()\n        self.trainer.train_one_epoch(\n            loader=train_loader,\n            timer=timer,\n            current_epoch=pass_id,\n            epochs=epochs,\n            log_interval=10,\n            steps_per_epoch=steps_per_epoch)\n        return {\"train_acc\": 0}\n\n    def _run_training_loop(self, curr_epoch: int) -> dict:\n        \"\"\"Trains the model `m` for one epoch.\"\"\"\n        start_time = time.time()\n        train_acc = self.train_one_epoch(curr_epoch)\n        logger.info('Epoch:{} time(min): {}'.format(curr_epoch, (time.time() - start_time) / 60.0))\n        return train_acc\n\n    def _compute_final_accuracies(self, iteration: int) -> dict:\n        \"\"\"Run once training is finished to compute final test accuracy.\"\"\"\n        task_config = self.hparams.task_config\n        task_type = task_config.task_type\n\n        if (iteration >= task_config[task_type].epochs - 1):\n           
 test_acc = self.eval_child_model('test', iteration)\n            pass\n        else:\n            test_acc = {\"test_acc\": 0}\n        logger.info('Test acc: {}'.format(test_acc))\n        return test_acc\n\n    def run_model(self, epoch: int) -> dict:\n        \"\"\"Trains and evalutes the image model.\"\"\"\n        self._fit_param()\n        train_acc = self._run_training_loop(epoch)\n        valid_acc = self.eval_child_model(mode=\"val\", pass_id=epoch)\n        logger.info('valid acc: {}'.format(valid_acc))\n        all_metric = {}\n        all_metric.update(train_acc)\n        all_metric.update(valid_acc)\n        return all_metric\n"
  },
  {
    "path": "demo/autoaug/paddlehub_utils/__init__.py",
    "content": "# -*- coding: utf-8 -*-\n#*******************************************************************************\n#\n# Copyright (c) 2019 Baidu.com, Inc. All Rights Reserved\n#\n#*******************************************************************************\n\"\"\"\n\nAuthors: lvhaijun01@baidu.com\nDate:     2019-09-17 14:15\n\"\"\"\n"
  },
  {
    "path": "demo/autoaug/paddlehub_utils/reader.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# -*- coding: utf-8 -*-\n# *******************************************************************************\n#\n# Copyright (c) 2020 Baidu.com, Inc. All Rights Reserved\n#\n# *******************************************************************************\n\"\"\"\n\nAuthors: lvhaijun01@baidu.com\nDate:     2019-06-30 00:10\n\"\"\"\nimport re\nimport numpy as np\nfrom typing import Dict, List, Optional, Union, Tuple\nimport six\nimport cv2\nimport os\nimport paddle\nimport paddlehub.vision.transforms as transforms\nfrom PIL import ImageFile\nfrom auto_augment.autoaug.transform.autoaug_transform import AutoAugTransform\nImageFile.LOAD_TRUNCATED_IMAGES = True\n__imagenet_stats = {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}\n\n\nclass PbaAugment(object):\n    \"\"\"\n    pytorch 分类 PbaAugment transform\n    \"\"\"\n\n    def __init__(self,\n                 input_size: int = 224,\n                 scale_size: int = 256,\n                 normalize: Optional[list] = None,\n                 pre_transform: bool = True,\n                 stage: str = \"search\",\n                 **kwargs) -> None:\n        \"\"\"\n\n        Args:\n            input_size:\n            scale_size:\n            normalize:\n            pre_transform:\n            **kwargs:\n        \"\"\"\n\n        if normalize is None:\n         
   normalize = {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}\n\n        policy = kwargs[\"policy\"]\n        assert stage in [\"search\", \"train\"]\n        train_epochs = kwargs[\"hp_policy_epochs\"]\n        self.auto_aug_transform = AutoAugTransform.create(policy, stage=stage, train_epochs=train_epochs)\n        #self.auto_aug_transform = PbtAutoAugmentClassiferTransform(conf)\n        if pre_transform:\n            self.pre_transform = transforms.Resize(input_size)\n\n        self.post_transform = transforms.Compose(\n            transforms=[transforms.Permute(),\n                        transforms.Normalize(**normalize, channel_first=True)],\n            channel_first=False)\n        self.cur_epoch = 0\n\n    def set_epoch(self, indx: int) -> None:\n        \"\"\"\n\n        Args:\n            indx:\n\n        Returns:\n\n        \"\"\"\n        self.auto_aug_transform.set_epoch(indx)\n\n    def reset_policy(self, new_hparams: dict) -> None:\n        \"\"\"\n\n        Args:\n            new_hparams:\n\n        Returns:\n\n        \"\"\"\n        self.auto_aug_transform.reset_policy(new_hparams)\n\n    def __call__(self, img: np.ndarray):\n        \"\"\"\n\n        Args:\n            img: PIL image\n        Returns:\n\n        \"\"\"\n        # tensform resize\n        if self.pre_transform:\n            img = self.pre_transform(img)\n\n        img = self.auto_aug_transform.apply(img)\n        img = img.astype(np.uint8)\n        img = self.post_transform(img)\n        return img\n\n\nclass PicRecord(object):\n    \"\"\"\n    PicRecord\n    \"\"\"\n\n    def __init__(self, row: list) -> None:\n        \"\"\"\n\n        Args:\n            row:\n        \"\"\"\n        self._data = row\n\n    @property\n    def sub_path(self) -> str:\n        \"\"\"\n\n        Returns:\n\n        \"\"\"\n        return self._data[0]\n\n    @property\n    def label(self) -> str:\n        \"\"\"\n\n        Returns:\n\n        \"\"\"\n        return 
self._data[1]\n\n\nclass PicReader(paddle.io.Dataset):\n    \"\"\"\n    PicReader\n    \"\"\"\n\n    def __init__(self,\n                 root_path: str,\n                 list_file: str,\n                 meta: bool = False,\n                 transform: Optional[callable] = None,\n                 class_to_id_dict: Optional[dict] = None,\n                 cache_img: bool = False,\n                 **kwargs) -> None:\n        \"\"\"\n\n        Args:\n            root_path: path prefix of the images.\n            list_file: annotation file.\n            meta: whether to return the relative path along with each sample.\n            transform: transform applied to each image.\n            class_to_id_dict: mapping from class name to class id.\n            cache_img: whether to cache decoded images in memory.\n            **kwargs:\n        \"\"\"\n\n        self.root_path = root_path\n        self.list_file = list_file\n        self.transform = transform\n        self.meta = meta\n        self.class_to_id_dict = class_to_id_dict\n        self.train_type = kwargs[\"conf\"].get(\"train_type\", \"single_label\")\n        self.class_num = kwargs[\"conf\"].get(\"class_num\", 0)\n\n        self._parse_list(**kwargs)\n        self.cache_img = cache_img\n        self.cache_img_buff = dict()\n        if self.cache_img:\n            self._get_all_img(**kwargs)\n\n    def _get_all_img(self, **kwargs) -> None:\n        \"\"\"\n        Cache images with a pre-resize to reduce memory usage.\n        \"\"\"\n\n        scale_size = kwargs.get(\"scale_size\", 256)\n\n        for idx in range(len(self)):\n            record = self.pic_list[idx]\n            relative_path = record.sub_path\n            if self.root_path is not None:\n                image_path = os.path.join(self.root_path, relative_path)\n            else:\n                image_path = relative_path\n            try:\n                img = cv2.imread(image_path, cv2.IMREAD_COLOR)\n                img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n                img = cv2.resize(img, (scale_size, scale_size))\n                self.cache_img_buff[image_path] = img\n            except BaseException:\n                print(\"img_path:{} can not be read by cv2\".format(image_path))\n\n    def _load_image(self, directory: str) -> np.ndarray:\n        \"\"\"\n        Load an image, from the cache if available.\n\n        Args:\n            directory: path of the image file.\n        \"\"\"\n\n        if not self.cache_img:\n            img = cv2.imread(directory, cv2.IMREAD_COLOR).astype('float32')\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n        else:\n            if directory in self.cache_img_buff:\n                img = self.cache_img_buff[directory]\n            else:\n                img = cv2.imread(directory, cv2.IMREAD_COLOR).astype('float32')\n                img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n        return img\n\n    def _parse_list(self, **kwargs) -> None:\n        \"\"\"\n        Parse the annotation file into a list of PicRecord objects.\n        \"\"\"\n        delimiter = kwargs.get(\"delimiter\", \" \")\n        self.pic_list = []\n\n        with open(self.list_file) as f:\n            lines = f.read().splitlines()\n            print(\"PicReader:: found {} pictures in `{}'\".format(len(lines), self.list_file))\n            for i, line in enumerate(lines):\n                record = re.split(delimiter, line)\n                assert len(record) == 2, \"length of record is not 2!\"\n\n                if not os.path.splitext(record[0])[1]:\n                    # handle online classification data whose file names lack an extension\n                    record[0] = record[0] + \".jpg\"\n\n                # online single-label data may carry multi-label annotations; keep the first label (to be removed later)\n                record[1] = re.split(\",\", record[1])[0]\n\n                self.pic_list.append(PicRecord(record))\n\n    def __getitem__(self, index: int):\n        \"\"\"\n\n        Args:\n            index: sample index.\n        \"\"\"\n        record = self.pic_list[index]\n\n        return self.get(record)\n\n    def get(self, record: PicRecord) -> tuple:\n        \"\"\"\n\n        Args:\n            record: the sample record to load.\n        \"\"\"\n        relative_path = record.sub_path\n        if self.root_path is not None:\n            image_path = os.path.join(self.root_path, relative_path)\n        else:\n            image_path = relative_path\n\n        img = self._load_image(image_path)\n\n        process_data = self.transform(img)\n\n        if self.train_type == \"single_label\":\n            if self.class_to_id_dict:\n                label = self.class_to_id_dict[record.label]\n            else:\n                label = int(record.label)\n        elif self.train_type == \"multi_labels\":\n            label_tensor = np.zeros((1, self.class_num))\n            for label in record.label.split(\",\"):\n                label_tensor[0, int(label)] = 1\n            label_tensor = np.squeeze(label_tensor)\n            label = label_tensor\n\n        if self.meta:\n            return process_data, label, relative_path\n        else:\n            return process_data, label\n\n    def __len__(self) -> int:\n        return len(self.pic_list)\n\n    def set_meta(self, meta: bool) -> None:\n        \"\"\"\n\n        Args:\n            meta: whether to return the relative path along with each sample.\n        \"\"\"\n        self.meta = meta\n\n    def set_epoch(self, epoch: int) -> None:\n        \"\"\"\n\n        Args:\n            epoch: current epoch index.\n        \"\"\"\n        if self.transform is not None:\n            self.transform.set_epoch(epoch)\n\n    # only used in search\n    def reset_policy(self, new_hparams: dict) -> None:\n        \"\"\"\n\n        Args:\n            new_hparams: the new search space configuration.\n        \"\"\"\n        if self.transform is not None:\n            self.transform.reset_policy(new_hparams)\n\n\ndef _parse(value: str, function: callable, fmt: str):\n    \"\"\"\n    Parse a string into a value, and format a nice ValueError if it fails.\n\n    Returns `function(value)`.\n    Any `ValueError` raised is caught and a new `ValueError` is raised\n    with message `fmt.format(e)`, where `e` is the caught `ValueError`.\n    \"\"\"\n    try:\n        return function(value)\n    except ValueError as e:\n        six.raise_from(ValueError(fmt.format(e)), None)\n\n\ndef _read_classes(csv_file: str) -> dict:\n    \"\"\" Parse the classes file.\n    \"\"\"\n    result = {}\n    with open(csv_file) as csv_reader:\n        for line, row in enumerate(csv_reader):\n            try:\n                class_name = row.strip()\n            except ValueError:\n                six.raise_from(ValueError('line {}: format should be \\'class_name\\''.format(line)), None)\n\n            class_id = _parse(line, int, 'line {}: malformed class ID: {{}}'.format(line))\n\n            if class_name in result:\n                raise ValueError('line {}: duplicate class name: \\'{}\\''.format(line, class_name))\n            result[class_name] = class_id\n    return result\n\n\ndef _init_loader(hparams: dict, TrainTransform=None) -> tuple:\n    \"\"\"\n    Build the train and validation datasets from hparams.\n\n    Args:\n        hparams: hyperparameter configuration.\n    \"\"\"\n    train_data_root = hparams.data_config.train_img_prefix\n    val_data_root = hparams.data_config.val_img_prefix\n    train_list = hparams.data_config.train_ann_file\n    val_list = hparams.data_config.val_ann_file\n    input_size = hparams.task_config.classifier.input_size\n    scale_size = hparams.task_config.classifier.scale_size\n    search_space = hparams.search_space\n    search_space[\"task_type\"] = hparams.task_config.task_type\n    epochs = hparams.task_config.classifier.epochs\n    no_cache_img = hparams.task_config.classifier.get(\"no_cache_img\", False)\n\n    normalize = {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}\n\n    if TrainTransform is None:\n        TrainTransform = PbaAugment(\n            input_size=input_size,\n            scale_size=scale_size,\n            normalize=normalize,\n            policy=search_space,\n            hp_policy_epochs=epochs,\n        )\n    delimiter = hparams.data_config.delimiter\n    kwargs = dict(conf=hparams, delimiter=delimiter)\n\n    if hparams.task_config.classifier.use_class_map:\n        class_to_id_dict = _read_classes(hparams.data_config.label_list)\n    else:\n        class_to_id_dict = None\n    train_data = PicReader(\n        root_path=train_data_root,\n        list_file=train_list,\n        transform=TrainTransform,\n        class_to_id_dict=class_to_id_dict,\n        cache_img=not no_cache_img,\n        **kwargs)\n\n    val_data = PicReader(\n        root_path=val_data_root,\n        list_file=val_list,\n        transform=transforms.Compose(\n            transforms=[\n                transforms.Resize((224, 224)),\n                transforms.Permute(),\n                transforms.Normalize(**normalize, channel_first=True)\n            ],\n            channel_first=False),\n        class_to_id_dict=class_to_id_dict,\n        cache_img=not no_cache_img,\n        **kwargs)\n\n    return train_data, val_data\n"
  },
  {
    "path": "demo/autoaug/paddlehub_utils/trainer.py",
    "content": "# -*- coding: utf-8 -*-\n#*******************************************************************************\n#\n# Copyright (c) 2019 Baidu.com, Inc. All Rights Reserved\n#\n#*******************************************************************************\n\"\"\"\n\nAuthors: lvhaijun01@baidu.com\nDate:     2020-11-24 20:46\n\"\"\"\nimport os\nfrom collections import defaultdict\n\nimport paddle\nfrom paddle.distributed import ParallelEnv\n\nfrom paddlehub.finetune.trainer import Trainer\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import Timer\n\n\nclass CustomTrainer(Trainer):\n\n    def __init__(self, **kwargs) -> None:\n        super(CustomTrainer, self).__init__(**kwargs)\n\n    def init_train_and_eval(self,\n                            train_dataset: paddle.io.Dataset,\n                            epochs: int = 1,\n                            batch_size: int = 1,\n                            num_workers: int = 0,\n                            eval_dataset: paddle.io.Dataset = None,\n                            log_interval: int = 10,\n                            save_interval: int = 10) -> None:\n        self.batch_sampler, self.train_loader = self.init_train(train_dataset, batch_size, num_workers)\n        self.eval_loader = self.init_evaluate(eval_dataset, batch_size, num_workers)\n\n    def init_train(self, train_dataset: paddle.io.Dataset, batch_size: int = 1, num_workers: int = 0) -> tuple:\n        use_gpu = True\n        place = paddle.CUDAPlace(ParallelEnv().dev_id) if use_gpu else paddle.CPUPlace()\n        paddle.disable_static(place)\n\n        batch_sampler = paddle.io.DistributedBatchSampler(train_dataset,\n                                                          batch_size=batch_size,\n                                                          shuffle=True,\n                                                          drop_last=False)\n        loader = paddle.io.DataLoader(train_dataset,\n                            
          batch_sampler=batch_sampler,\n                                      places=place,\n                                      num_workers=num_workers,\n                                      return_list=True)\n        return batch_sampler, loader\n\n    def train_one_epoch(self, loader: paddle.io.DataLoader, timer: Timer, current_epoch: int, epochs: int,\n                        log_interval: int, steps_per_epoch: int) -> None:\n        avg_loss = 0\n        avg_metrics = defaultdict(int)\n        self.model.train()\n\n        for batch_idx, batch in enumerate(loader):\n            loss, metrics = self.training_step(batch, batch_idx)\n            self.optimizer_step(current_epoch, batch_idx, self.optimizer, loss)\n            self.optimizer_zero_grad(current_epoch, batch_idx, self.optimizer)\n\n            # calculate metrics and loss\n            avg_loss += float(loss)\n            for metric, value in metrics.items():\n                avg_metrics[metric] += float(value)\n\n            timer.count()\n\n            if (batch_idx + 1) % log_interval == 0 and self.local_rank == 0:\n                lr = self.optimizer.get_lr()\n                avg_loss /= log_interval\n                if self.use_vdl:\n                    self.log_writer.add_scalar(tag='TRAIN/loss', step=timer.current_step, value=avg_loss)\n\n                print_msg = 'Epoch={}/{}, Step={}/{}'.format(current_epoch, epochs, batch_idx + 1, steps_per_epoch)\n                print_msg += ' loss={:.4f}'.format(avg_loss)\n\n                for metric, value in avg_metrics.items():\n                    value /= log_interval\n                    if self.use_vdl:\n                        self.log_writer.add_scalar(tag='TRAIN/{}'.format(metric), step=timer.current_step, value=value)\n                    print_msg += ' {}={:.4f}'.format(metric, value)\n\n                print_msg += ' lr={:.6f} step/sec={:.2f} | ETA {}'.format(lr, timer.timing, timer.eta)\n\n                logger.train(print_msg)\n\n     
           avg_loss = 0\n                avg_metrics = defaultdict(int)\n\n    def train(self,\n              train_dataset: paddle.io.Dataset,\n              epochs: int = 1,\n              batch_size: int = 1,\n              num_workers: int = 0,\n              eval_dataset: paddle.io.Dataset = None,\n              log_interval: int = 10,\n              save_interval: int = 10):\n        '''\n        Train a model with specific config.\n\n        Args:\n            train_dataset(paddle.io.Dataset) : Dataset to train the model\n            epochs(int) : Number of training loops, default is 1.\n            batch_size(int) : Batch size per step, default is 1.\n            num_workers(int) : Number of subprocesses to load data, default is 0.\n            eval_dataset(paddle.io.Dataset) : The validation dataset, default is None. If set, the Trainer will\n                execute the evaluate function every `save_interval` epochs.\n            log_interval(int) : Log the training information every `log_interval` steps.\n            save_interval(int) : Save the checkpoint every `save_interval` epochs.\n        '''\n        batch_sampler, loader = self.init_train(train_dataset, batch_size, num_workers)\n        steps_per_epoch = len(batch_sampler)\n        timer = Timer(steps_per_epoch * epochs)\n        timer.start()\n\n        for i in range(epochs):\n            loader.dataset.set_epoch(i)\n            self.current_epoch += 1\n            self.train_one_epoch(loader, timer, self.current_epoch, epochs, log_interval, steps_per_epoch)\n\n            # TODO: why does PaddleHub interleave save and eval with training here?\n            if self.current_epoch % save_interval == 0 and self.local_rank == 0:\n                if eval_dataset:\n                    result = self.evaluate(eval_dataset, batch_size, num_workers)\n                    eval_loss = result.get('loss', None)\n                    eval_metrics = result.get('metrics', {})\n                    if self.use_vdl:\n                        if eval_loss:\n                            self.log_writer.add_scalar(tag='EVAL/loss', step=timer.current_step, value=eval_loss)\n\n                        for metric, value in eval_metrics.items():\n                            self.log_writer.add_scalar(tag='EVAL/{}'.format(metric),\n                                                       step=timer.current_step,\n                                                       value=value)\n\n                    if not self.best_metrics or self.compare_metrics(self.best_metrics, eval_metrics):\n                        self.best_metrics = eval_metrics\n                        best_model_path = os.path.join(self.checkpoint_dir, 'best_model')\n                        self.save_model(best_model_path)\n                        self._save_metrics()\n\n                        metric_msg = ['{}={:.4f}'.format(metric, value) for metric, value in self.best_metrics.items()]\n                        metric_msg = ' '.join(metric_msg)\n                        logger.eval('Saving best model to {} [best {}]'.format(best_model_path, metric_msg))\n\n                self._save_checkpoint()\n\n    def init_evaluate(self, eval_dataset: paddle.io.Dataset, batch_size: int, num_workers: int) -> paddle.io.DataLoader:\n        use_gpu = True\n        place = paddle.CUDAPlace(ParallelEnv().dev_id) if use_gpu else paddle.CPUPlace()\n        paddle.disable_static(place)\n\n        batch_sampler = paddle.io.DistributedBatchSampler(eval_dataset,\n                                                          batch_size=batch_size,\n                                                          shuffle=False,\n                                                          drop_last=False)\n\n        loader = paddle.io.DataLoader(eval_dataset,\n                                      batch_sampler=batch_sampler,\n                                      places=place,\n                                      num_workers=num_workers,\n                                      return_list=True)\n        return loader\n\n    def evaluate_process(self, loader: paddle.io.DataLoader) -> dict:\n        self.model.eval()\n        avg_loss = num_samples = 0\n        sum_metrics = defaultdict(int)\n        avg_metrics = defaultdict(int)\n        has_loss = False\n\n        for batch_idx, batch in enumerate(loader):\n            result = self.validation_step(batch, batch_idx)\n            loss = result.get('loss', None)\n            metrics = result.get('metrics', {})\n            bs = batch[0].shape[0]\n            num_samples += bs\n\n            if loss:\n                has_loss = True\n                avg_loss += float(loss) * bs\n\n            for metric, value in metrics.items():\n                sum_metrics[metric] += float(value) * bs\n\n        # print average metrics and loss\n        print_msg = '[Evaluation result]'\n        if has_loss:\n            avg_loss /= num_samples\n            print_msg += ' avg_loss={:.4f}'.format(avg_loss)\n\n        for metric, value in sum_metrics.items():\n            avg_metrics[metric] = value / num_samples\n            print_msg += ' avg_{}={:.4f}'.format(metric, avg_metrics[metric])\n\n        logger.eval(print_msg)\n\n        if has_loss:\n            return {'loss': avg_loss, 'metrics': avg_metrics}\n        return {'metrics': avg_metrics}\n\n    def evaluate(self, eval_dataset: paddle.io.Dataset, batch_size: int = 1, num_workers: int = 0) -> dict:\n        '''\n        Run evaluation and return metrics.\n\n        Args:\n            eval_dataset(paddle.io.Dataset) : The validation dataset\n            batch_size(int) : Batch size per step, default is 1.\n            num_workers(int) : Number of subprocesses to load data, default is 0.\n        '''\n\n        loader = self.init_evaluate(eval_dataset, batch_size, num_workers)\n        res = self.evaluate_process(loader)\n        return res\n"
  },
  {
    "path": "demo/autoaug/pba_classifier_example.yaml",
    "content": "task_config:\n    run_mode: \"ray\"\n    workspace: \"./work_dirs/pbt_hub_classifer/test_autoaug\"\n    task_type: \"classifier\"\n    classifier:\n        model_name: \"resnet50_vd_imagenet_ssld\"\n        epochs: 100\n        input_size: 224\n        scale_size: 256\n        no_cache_img: false\n        use_class_map: false\n\ndata_config:\n    train_img_prefix: \"./dataset/flower_photos\"\n    train_ann_file: \"./dataset/flower_photos/train_list.txt\"\n    val_img_prefix: \"./dataset/flower_photos\"\n    val_ann_file: \"./dataset/flower_photos/validate_list.txt\"\n    label_list: \"./dataset/flower_photos/label_list.txt\"\n    delimiter: \" \"\n\nresource_config:\n    gpu: 0.4\n    cpu: 1\n\nalgo_config:\n    algo_name: \"PBA\"\n    algo_param:\n        perturbation_interval: 3\n        num_samples: 8\n\nsearch_space:\n    operator_space:\n        -\n            name: Sharpness\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n        -\n            name: Rotate\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Invert\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Brightness\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                
value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Cutout\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n        -\n            name: Equalize\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: TranslateY\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: AutoContrast\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Color\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n        -\n            name: TranslateX\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Solarize\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 
0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: ShearX\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Contrast\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: Posterize\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: ShearY\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n\n        -\n            name: FlipLR\n            prob:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n            magtitude:\n                htype: choice\n                value: [0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]\n"
  },
  {
    "path": "demo/autoaug/search.py",
    "content": "from auto_augment.autoaug.experiment.experiment import AutoAugExperiment\nfrom auto_augment.autoaug.utils.yaml_config import get_config\nfrom hub_fitter import HubFitterClassifer\nimport os\nimport argparse\nimport logging\nlogging.basicConfig(level=logging.INFO)\nlogger = logging.getLogger(__name__)\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\n    \"--config\",\n    help=\"config file\",\n)\nparser.add_argument(\n    \"--workspace\",\n    default=None,\n    help=\"work_space\",\n)\n\n\ndef main():\n    search_test()\n\n\ndef search_test():\n    args = parser.parse_args()\n    config = args.config\n    config = get_config(config, show=True)\n    task_config = config.task_config\n    data_config = config.data_config\n    resource_config = config.resource_config\n    algo_config = config.algo_config\n    search_space = config.get(\"search_space\", None)\n\n    if args.workspace is not None:\n        task_config[\"workspace\"] = args.workspace\n    workspace = task_config[\"workspace\"]\n\n    # 算法，任务，资源，数据，搜索空间(optional)配置导入，\n    exper = AutoAugExperiment.create(\n        algo_config=algo_config,\n        task_config=task_config,\n        resource_config=resource_config,\n        data_config=data_config,\n        search_space=search_space,\n        fitter=HubFitterClassifer)\n    result = exper.search()  # 开始搜索任务\n    policy = result.get_best_policy()  # 最佳策略获取， policy格式见 搜索结果应用格式\n    print(\"policy is:{}\".format(policy))\n    dump_path = os.path.join(workspace, \"auto_aug_config.json\")\n    result.dump_best_policy(path=dump_path)\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "demo/autoaug/search.sh",
    "content": "#!/usr/bin/env bash\n\nexport FLAGS_fast_eager_deletion_mode=1\nexport FLAGS_eager_delete_tensor_gb=0.0\nconfig=\"./pba_classifier_example.yaml\"\nworkspace=\"./work_dirs/autoaug_flower_mobilenetv2\"\n# workspace工作空间需要初始化\nrm -rf ${workspace}\nmkdir -p ${workspace}\nCUDA_VISIBLE_DEVICES=0,1 python -u search.py \\\n    --config=${config} \\\n    --workspace=${workspace} 2>&1 | tee -a ${workspace}/log.txt\n"
  },
  {
    "path": "demo/autoaug/train.py",
    "content": "# -*- coding: utf-8 -*-\n#*******************************************************************************\n#\n# Copyright (c) 2020 Baidu.com, Inc. All Rights Reserved\n#\n#*******************************************************************************\n\"\"\"\n\nAuthors: lvhaijun01@baidu.com\nDate:     2020-11-26 20:57\n\"\"\"\nfrom auto_augment.autoaug.utils.yaml_config import get_config\nfrom hub_fitter import HubFitterClassifer\nimport os\nimport argparse\nimport logging\nimport paddlehub as hub\nimport paddle\nimport paddlehub.vision.transforms as transforms\nfrom paddlehub_utils.reader import _init_loader, PbaAugment\nfrom paddlehub_utils.reader import _read_classes\nfrom paddlehub_utils.trainer import CustomTrainer\nlogging.basicConfig(level=logging.INFO)\nlogger = logging.getLogger(__name__)\n\nparser = argparse.ArgumentParser()\nparser.add_argument(\n    \"--config\",\n    help=\"config file\",\n)\nparser.add_argument(\n    \"--workspace\",\n    default=None,\n    help=\"work_space\",\n)\nparser.add_argument(\n    \"--policy\",\n    default=None,\n    help=\"data aug policy\",\n)\n\nif __name__ == '__main__':\n    args = parser.parse_args()\n    config = args.config\n    config = get_config(config, show=True)\n    task_config = config.task_config\n    data_config = config.data_config\n    resource_config = config.resource_config\n    algo_config = config.algo_config\n\n    input_size = task_config.classifier.input_size\n    scale_size = task_config.classifier.scale_size\n    normalize = {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]}\n    epochs = task_config.classifier.epochs\n\n    policy = args.policy\n    if policy is None:\n        print(\"use normal train transform\")\n        TrainTransform = transforms.Compose(\n            transforms=[\n                transforms.Resize((input_size, input_size)),\n                transforms.Permute(),\n                transforms.Normalize(**normalize, channel_first=True)\n            
],\n            channel_first=False)\n    else:\n        TrainTransform = PbaAugment(\n            input_size=input_size,\n            scale_size=scale_size,\n            normalize=normalize,\n            policy=policy,\n            hp_policy_epochs=epochs,\n            stage=\"train\")\n    train_dataset, eval_dataset = _init_loader(config, TrainTransform=TrainTransform)\n    class_to_id_dict = _read_classes(config.data_config.label_list)\n    model = hub.Module(\n        name=config.task_config.classifier.model_name, label_list=class_to_id_dict.keys(), load_checkpoint=None)\n\n    optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n    trainer = CustomTrainer(model=model, optimizer=optimizer, checkpoint_dir='img_classification_ckpt')\n    trainer.train(train_dataset, epochs=epochs, batch_size=32, eval_dataset=eval_dataset, save_interval=10)\n"
  },
  {
    "path": "demo/autoaug/train.sh",
    "content": "#!/usr/bin/env bash\nexport FLAGS_fast_eager_deletion_mode=1\nexport FLAGS_eager_delete_tensor_gb=0.0\nconfig=\"./pba_classifier_example.yaml\"\nworkspace=\"./work_dirs/autoaug_flower_mobilenetv2\"\n# workspace工作空间需要初始化\nmkdir -p ${workspace}\npolicy=./work_dirs/autoaug_flower_mobilenetv2/auto_aug_config.json\nCUDA_VISIBLE_DEVICES=0,1 python train.py \\\n    --config=${config} \\\n    --policy=${policy} \\\n    --workspace=${workspace} 2>&1 | tee -a ${workspace}/log.txt\n"
  },
  {
    "path": "demo/colorization/README.md",
    "content": "# PaddleHub 图像着色\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run user_guided_colorization --input_path \"/PATH/TO/IMAGE\"\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用user_guided_colorization模型对[Canvas](../../docs/reference/datasets.md#class-hubdatasetsCanvas)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='NEAREST'),\n                       T.RandomPaddingCrop(crop_size=176),\n                       T.RGB2LAB()], to_rgb=True)\n```\n\n`transforms`数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n**NOTE:** 要将`T.Compose`中`to_rgb`设定为True.\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Canvas\n\ncolor_set = Canvas(transform=transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test` 默认为`train`。\n\n数据集的准备代码可以参考 [canvas.py](../../paddlehub/datasets/canvas.py)。`hub.datasets.Canvas()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='user_guided_colorization', load_checkpoint=None)\nmodel.set_config(classification=True, prob=1)\n```\n* `name`:加载模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n* `classification`: 着色模型分两部分训练，开始阶段`classification`设置为True, 用于浅层网络训练。训练后期将`classification`设置为False, 用于训练网络的输出层。\n* `prob`: 每张输入图不加一个先验彩色块的概率，默认为1，即不加入先验彩色块。例如，当`prob`设定为0.9时，一张图上有两个先验彩色块的概率为(1-0.9)*(1-0.9)*0.9=0.009.\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\ntrainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, 
`Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-4；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证过程所用的数据集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n    model.set_config(prob=0.1)\n    result = model.predict(images=['house.png'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。若想获取油画风着色效果，请下载参数文件[油画着色](https://paddlehub.bj.bcebos.com/dygraph/models/canvas_rc.pdparams)\n\n**Args**\n* `images`:原始图像路径或者BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认为'result'。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线着色任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m user_guided_colorization\n```\n\n这样就完成了一个着色任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef 
cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/user_guided_colorization\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0]['fake_reg'])\ncv2.imwrite('color.png', data)\n```\n\n### 查看代码\n\nhttps://github.com/richzhang/colorization-pytorch\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "demo/colorization/predict.py",
    "content": "import paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n    model.set_config(prob=0.1)\n    result = model.predict(images=['house.png'])\n"
  },
  {
    "path": "demo/colorization/train.py",
    "content": "import paddle\nimport paddlehub as hub\nimport paddlehub.vision.transforms as T\nfrom paddlehub.finetune.trainer import Trainer\nfrom paddlehub.datasets import Canvas\n\nif __name__ == '__main__':\n\n    transform = T.Compose(\n        [T.Resize((256, 256), interpolation='NEAREST'),\n         T.RandomPaddingCrop(crop_size=176),\n         T.RGB2LAB()], to_rgb=True)\n\n    color_set = Canvas(transform=transform, mode='train')\n    model = hub.Module(name='user_guided_colorization', load_checkpoint=None)\n\n    model.set_config(classification=True, prob=1)\n    optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n    trainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\n    trainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n\n    model.set_config(classification=False, prob=0.125)\n    optimizer = paddle.optimizer.Adam(learning_rate=0.00001, parameters=model.parameters())\n    trainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_reg_1')\n    trainer.train(color_set, epochs=101, batch_size=25, log_interval=10, save_interval=10)\n"
  },
  {
    "path": "demo/image_classification/README.md",
    "content": "# PaddleHub 图像分类\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```shell\n$ hub run resnet50_vd_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n```\n\n## 脚本预测\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n\n    model = hub.Module(name='resnet50_vd_imagenet_ssld')\n    result = model.predict(['/PATH/TO/IMAGE'])\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用resnet50_vd_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransforms = T.Compose([T.Resize((256, 256)),\n                        T.CenterCrop(224),\n                        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                        to_rgb=True)\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Flowers\n\nflowers = Flowers(transforms)\n\nflowers_validate = Flowers(transforms, mode='val')\n```\n\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name=\"resnet50_vd_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n```\n* `name`: 选择预训练模型的名字。\n* `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\nPaddleHub提供许多图像分类预训练模型，如xception、mobilenet、efficientnet等，详细信息参见[图像分类模型](https://www.paddlepaddle.org.cn/hub?filter=en_category&value=ImageClassification)。\n\n如果想尝试efficientnet模型，只需要更换Module中的`name`参数即可。\n```python\n# 更换name参数即可无缝切换efficientnet模型, 代码示例如下\nmodel = 
hub.Module(name=\"efficientnetb7_imagenet\")\n```\n**NOTE:**目前部分模型还没有完全升级到2.0版本，敬请期待。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n\ntrainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n\n    model = hub.Module(name='resnet50_vd_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n    result = model.predict(['flower.jpg'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线分类任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m resnet50_vd_imagenet_ssld\n```\n\n这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 
如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\n\ndata = {'images':[cv2_to_base64(org_im)], 'top_k':2}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/resnet50_vd_imagenet_ssld\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = r.json()[\"results\"]['data']\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "demo/image_classification/predict.py",
    "content": "import paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(\n        name='resnet50_vd_imagenet_ssld',\n        label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"],\n        load_checkpoint='/PATH/TO/CHECKPOINT')\n    result = model.predict(['flower.jpg'])\n"
  },
  {
    "path": "demo/image_classification/train.py",
    "content": "import paddle\nimport paddlehub as hub\nimport paddlehub.vision.transforms as T\nfrom paddlehub.finetune.trainer import Trainer\nfrom paddlehub.datasets import Flowers\n\nif __name__ == '__main__':\n    transforms = T.Compose(\n        [T.Resize((256, 256)),\n         T.CenterCrop(224),\n         T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n        to_rgb=True)\n\n    flowers = Flowers(transforms)\n    flowers_validate = Flowers(transforms, mode='val')\n    model = hub.Module(\n        name='resnet50_vd_imagenet_ssld',\n        label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"],\n        load_checkpoint=None)\n    optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n    trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt', use_gpu=True)\n    trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=10)\n"
  },
  {
    "path": "demo/semantic_segmentation/README.md",
    "content": "# PaddleHub 图像分割\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ocrnet_hrnetw18_voc模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* 
`log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n\n**Args**\n* `images`:原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m ocrnet_hrnetw18_voc\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ocrnet_hrnetw18_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "demo/semantic_segmentation/predict.py",
    "content": "import paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2, pretrained='/PATH/TO/CHECKPOINT')\n    model.predict(images=[\"N0007.jpg\"], visualization=True)\n"
  },
  {
    "path": "demo/semantic_segmentation/train.py",
    "content": "import paddle\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.finetune.trainer import Trainer\nfrom paddlehub.datasets import OpticDiscSeg\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\nfrom paddlehub.vision.utils import ConfusionMatrix\n\nif __name__ == \"__main__\":\n    train_transforms = Compose([Resize(target_size=(512, 512)), Normalize()])\n    eval_transforms = Compose([Normalize()])\n    train_reader = OpticDiscSeg(train_transforms)\n    eval_reader = OpticDiscSeg(eval_transforms, mode='val')\n\n    model = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2)\n    scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n    optimizer = paddle.optimizer.Momentum(learning_rate=scheduler, parameters=model.parameters())\n    trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n    trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n\n    cfm = ConfusionMatrix(eval_reader.num_classes, streaming=True)\n    model.eval()\n    for imgs, labels in eval_reader:\n        imgs = imgs[np.newaxis, :, :, :]\n        preds = model(paddle.to_tensor(imgs))[0]\n        preds = paddle.argmax(preds, axis=1, keepdim=True).numpy()\n        labels = labels[np.newaxis, :, :, :]\n        ignores = labels != eval_reader.ignore_index\n        cfm.calculate(preds, labels, ignores)\n    _, miou = cfm.mean_iou()\n    print('miou: {:.4f}'.format(miou))\n"
  },
  {
    "path": "demo/sequence_labeling/README.md",
    "content": "# PaddleHub Transformer模型fine-tune序列标注（动态图）\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](../../docs/imgs/RNN_Sample.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM）能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 如何开始Fine-tune\n\n\n我们以微软亚洲研究院发布的中文实体识别数据集MSRA-NER为示例数据集，可以运行下面的命令，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）验证。通过如下命令，即可启动训练。\n\n```shell\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n在命名实体识别的任务中，因不同的数据集标识实体的标签不同，评测的方式也有所差异。因此，在初始化模型之前，需要先确定实际标签的形式，下方的`label_list`则是MSRA-NER数据集中使用的标签类别。  \n如果用户使用的实体识别的数据集的标签方式与MSRA-NER不同，则需要自行根据数据集确定。\n```python\nlabel_list = hub.datasets.MSRA_NER.label_list\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n```\n\n接下来创建任务所使用的`model`\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.1', task='token-cls', label_map=label_map)\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`token-cls`，表示序列标注任务。\n* `label_map`：数据集中的标签信息，实体识别任务中需要根据不同标签种类对模型性能进行评价。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持序列标注任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese       
         | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(name='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(name='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English             | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上一行代码，`model`初始化为一个适用于序列标注任务的模型，为ERNIE Tiny的预训练模型后拼接上一个输出token共享的全连接网络（Fully Connected）。  \n![](https://ss1.bdstatic.com/70cFuXSh_Q1YnxGkpoWK1HF6hhy/it/u=224484727,3049769188&fm=15&gp=0.jpg)\n\n以上图片来自于：https://arxiv.org/pdf/1810.04805.pdf\n\n### Step2: 下载并加载数据集\n\n```python\ntrain_dataset = hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = 
hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `test`, `dev`， 默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。 PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png)\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png)\n\n### Step3:  选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='test_ernie_token_cls', use_gpu=True)\n\ntrainer.train(train_dataset, epochs=3, batch_size=32, eval_dataset=dev_dataset)\n\n# 在测试集上评估当前训练模型\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`, `AdamW`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`AdamW`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用GPU训练，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 
模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们以如下数据作为待预测数据，使用该模型来进行预测\n\n```text\n去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？\n新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。\n根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。\n华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。\n```\n\n```python\nimport paddlehub as hub\n\nsplit_char = \"\\002\"\nlabel_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\ntext_a = [\n    '去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？',\n    '新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。',\n    '根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。',\n    '华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。',\n]\ndata = [[split_char.join(text)] for text in text_a]\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.1',\n    task='token-cls',\n    load_checkpoint='./token_cls_save_dir/best_model/model.pdparams',\n    label_map=label_map,\n)\n\nresults = model.predict(data, max_seq_len=128, batch_size=1, use_gpu=True)\nfor idx, text in enumerate(text_a):\n    print(f'Data: {text} \\t Label: {\", \".join(results[idx][1:len(text)+1])}')\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "demo/sequence_labeling/predict.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    split_char = \"\\002\"\n    label_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\n    text_a = [\n        '去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？',\n        '新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。',\n        '根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。',\n        '华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。',\n    ]\n    data = [[split_char.join(text)] for text in text_a]\n    label_map = {idx: label for idx, label in enumerate(label_list)}\n\n    model = hub.Module(\n        name='ernie_tiny',\n        version='2.0.1',\n        task='token-cls',\n        load_checkpoint='./token_cls_save_dir/best_model/model.pdparams',\n        label_map=label_map,\n    )\n\n    results = model.predict(data=data, max_seq_len=128, batch_size=1, use_gpu=True)\n    for idx, text in enumerate(text_a):\n        print(f'Text:\\n{text} \\nLabel: \\n{\", \".join(results[idx][1:len(text)+1])} \\n')\n"
  },
  {
    "path": "demo/sequence_labeling/train.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import MSRA_NER\n\nimport ast\nimport argparse\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--num_epoch\", type=int, default=3, help=\"Number of epochs for fine-tuning.\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether to use GPU for fine-tuning; input should be True or False\")\nparser.add_argument(\"--learning_rate\", type=float, default=5e-5, help=\"Learning rate used to train with warmup.\")\nparser.add_argument(\"--max_seq_len\", type=int, default=128, help=\"Number of words of the longest sequence.\")\nparser.add_argument(\"--batch_size\", type=int, default=32, help=\"Total examples' number in batch for training.\")\nparser.add_argument(\"--checkpoint_dir\", type=str, default='./checkpoint', help=\"Directory to save model checkpoints\")\nparser.add_argument(\"--save_interval\", type=int, default=1, help=\"Save checkpoint every n epochs.\")\n\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    label_list = MSRA_NER.label_list\n    label_map = {idx: label for idx, label in enumerate(label_list)}\n\n    model = hub.Module(\n        name='ernie_tiny',\n        version='2.0.1',\n        task='token-cls',\n        label_map=label_map,  # Required for token classification task\n    
)\n\n    tokenizer = model.get_tokenizer()\n    train_dataset = MSRA_NER(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='train')\n    dev_dataset = MSRA_NER(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='dev')\n    test_dataset = MSRA_NER(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='test')\n\n    optimizer = paddle.optimizer.AdamW(learning_rate=args.learning_rate, parameters=model.parameters())\n    trainer = hub.Trainer(model, optimizer, checkpoint_dir=args.checkpoint_dir, use_gpu=args.use_gpu)\n    trainer.train(\n        train_dataset,\n        epochs=args.num_epoch,\n        batch_size=args.batch_size,\n        eval_dataset=dev_dataset,\n        save_interval=args.save_interval,\n    )\n    trainer.evaluate(test_dataset, batch_size=args.batch_size)\n"
  },
  {
    "path": "demo/serving/bentoml/cloud-native-model-serving-with-bentoml.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"erfOlc-T8kY3\"\n   },\n   \"source\": [\n    \"# **BentoML Example: Image Segmentation with PaddleHub**\\n\",\n    \"**BentoML makes moving trained ML models to production easy:**\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"*   Package models trained with any ML framework and reproduce them for model serving in production\\n\",\n    \"* **Deploy anywhere** for online API serving or offline batch serving\\n\",\n    \"* High-Performance API model server with adaptive micro-batching support\\n\",\n    \"* Central hub for managing models and deployment process via Web UI and APIs\\n\",\n    \"* Modular and flexible design making it adaptable to your infrastructure\\n\",\n    \"\\n\",\n    \"BentoML is a framework for serving, managing, and deploying machine learning models. It aims to bridge the gap between Data Science and DevOps, and enable teams to deliver prediction services in a fast, repeatable, and scalable way.\\n\",\n    \"\\n\",\n    \"Before reading this example project, be sure to check out the [Getting started guide](https://github.com/bentoml/BentoML/blob/master/guides/quick-start/bentoml-quick-start-guide.ipynb) to learn about the basic concepts in BentoML.\\n\",\n    \"\\n\",\n    \"This notebook demonstrates how to use BentoML to turn a PaddleHub module into a docker image containing a REST API server serving this model, how to use your ML service built with BentoML as a CLI tool, and how to distribute it as a PyPI package.\\n\",\n    \"\\n\",\n    \"This example notebook is based on the [Python quick guide from PaddleHub](https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.0/docs/docs_en/quick_experience/python_use_hub_en.md).\\n\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"54jFhiru8NWO\"\n   },\n   
\"outputs\": [],\n   \"source\": [\n    \"%reload_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"%matplotlib inline\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"XHOPuMGm-Nl2\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!pip3 install -q bentoml paddlepaddle paddlehub\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"KXz5IFU94P9D\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!hub install deeplabv3p_xception65_humanseg\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"bWx5VF_LLTef\"\n   },\n   \"source\": [\n    \"## Prepare Input Data\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"yayroXhE-sos\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!wget https://paddlehub.bj.bcebos.com/resources/test_image.jpg\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"zcrHdbJxAHh0\"\n   },\n   \"source\": [\n    \"## Create BentoService with PaddleHub Module Instantiation\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"s_T8YQRjALqg\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"%%writefile paddlehub_service.py\\n\",\n    \"import paddlehub as hub\\n\",\n    \"import bentoml\\n\",\n    \"from bentoml import env, artifacts, api, BentoService\\n\",\n    \"import imageio\\n\",\n    \"from bentoml.adapters import ImageInput\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"@env(infer_pip_packages=True)\\n\",\n    \"class PaddleHubService(bentoml.BentoService):\\n\",\n    \"    def __init__(self):\\n\",\n    \"      super(PaddleHubService, self).__init__()\\n\",\n    \"      self.module = 
hub.Module(name=\\\"deeplabv3p_xception65_humanseg\\\")\\n\",\n    \"\\n\",\n    \"    @api(input=ImageInput(), batch=True)\\n\",\n    \"    def predict(self, images):\\n\",\n    \"        results = self.module.segmentation(images=images, visualization=True)\\n\",\n    \"        return [result['data'] for result in results]\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"ESc4D_muCWNx\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# Import the custom BentoService defined above\\n\",\n    \"from paddlehub_service import PaddleHubService\\n\",\n    \"import numpy as np\\n\",\n    \"import cv2\\n\",\n    \"\\n\",\n    \"# Pack it with required artifacts\\n\",\n    \"bento_svc = PaddleHubService()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"ondQXpNCy_TV\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# Predict with the initialized module\\n\",\n    \"image = cv2.imread(\\\"test_image.jpg\\\")\\n\",\n    \"images = [image]\\n\",\n    \"segmentation_results = bento_svc.predict(images)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"f-61QUPd6w9h\"\n   },\n   \"source\": [\n    \"### Visualizing the result\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"jNnyhPQt59ey\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# View the segmentation mask layer\\n\",\n    \"from matplotlib import pyplot as plt\\n\",\n    \"\\n\",\n    \"for result in segmentation_results:\\n\",\n    \"    plt.imshow(cv2.cvtColor(result, cv2.COLOR_BGR2RGB))\\n\",\n    \"    plt.axis('off')\\n\",\n    \"    plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"kmJkYFPNRnmA\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    
\"# Get the segmented image of the original image\\n\",\n    \"for result, original in zip(segmentation_results, images):\\n\",\n    \"    result = cv2.cvtColor(result, cv2.COLOR_GRAY2RGB)\\n\",\n    \"    original_mod = cv2.cvtColor(original, cv2.COLOR_RGB2RGBA)\\n\",\n    \"    mask = result / 255\\n\",\n    \"    *_, alpha = cv2.split(mask)\\n\",\n    \"    mask = cv2.merge((mask, alpha))\\n\",\n    \"    segmented_image = (original_mod * mask).clip(0, 255).astype(np.uint8)\\n\",\n    \"    \\n\",\n    \"    plt.imshow(cv2.cvtColor(segmented_image, cv2.COLOR_BGRA2RGBA))\\n\",\n    \"    plt.axis('off')\\n\",\n    \"    plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"07YXA0ne7ZBc\"\n   },\n   \"source\": [\n    \"### Start dev server for testing\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"pUM64JEKaRWt\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# Start a dev model server\\n\",\n    \"bento_svc.start_dev_server()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"3valpr2oa_OM\"\n   },\n   \"source\": [\n    \"!curl -i \\\\\\n\",\n    \"  -F image=@test_image.jpg \\\\\\n\",\n    \"  localhost:5000/predict\"\n   ],\n   \"outputs\": []\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"oCW5xuPebByD\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# Stop the dev model server\\n\",\n    \"bento_svc.stop_dev_server()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"qwSpmZ1u7gez\"\n   },\n   \"source\": [\n    \"### Save the BentoService for deployment\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"kCHUw-_Hy6tH\"\n  
 },\n   \"outputs\": [],\n   \"source\": [\n    \"saved_path = bento_svc.save()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"IvUU0k0JCxYk\"\n   },\n   \"source\": [\n    \"## REST API Model Serving\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"CeJEIDyj_xGK\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!bentoml serve PaddleHubService:latest\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"FPoKbR6cCq8_\"\n   },\n   \"source\": [\n    \"If you are running this notebook from Google Colab, you can start the dev server with the --run-with-ngrok option to gain access to the API via a public endpoint managed by ngrok:\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"RodE8ooiCqRw\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!bentoml serve PaddleHubService:latest --run-with-ngrok\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"FMCrkYb5DDHB\"\n   },\n   \"source\": [\n    \"## Make request to the REST server\\n\",\n    \"\\n\",\n    \"*After navigating to the location of this notebook, copy and paste the following code to your terminal and run it to make a request*\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"fMyLXOIUDXSn\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"curl -i \\\\\\n\",\n    \"  --header \\\"Content-Type: image/jpeg\\\" \\\\\\n\",\n    \"  --request POST \\\\\\n\",\n    \"  --data-binary @test_image.jpg \\\\\\n\",\n    \"  localhost:5000/predict\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"DlGTKeMnEEyE\"\n   },\n   
\"source\": [\n    \"## Launch inference job from CLI\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"CBqvdN9-iyQu\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!bentoml run PaddleHubService:latest predict --input-file test_image.jpg\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"6RA0JpPjDMt8\"\n   },\n   \"source\": [\n    \"## Containerize model server with Docker\\n\",\n    \"\\n\",\n    \"One common way of distributing this model API server for production deployment is via Docker containers, and BentoML provides a convenient way to do that.\\n\",\n    \"\\n\",\n    \"Note that docker is **not available in Google Colab**. You will need to download and run this notebook locally to try out this docker containerization feature.\\n\",\n    \"\\n\",\n    \"If you already have docker configured, simply run the following command to produce a docker container serving the PaddleHub prediction service created above:\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"JKUGBMNWDJnr\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!bentoml containerize PaddleHubService:latest\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"id\": \"0nyRChqMDwv4\"\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"!docker run --rm -p 5000:5000 PaddleHubService:latest\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {\n    \"id\": \"Jb-srm9RENeh\"\n   },\n   \"source\": [\n    \"# **Deployment Options**\\n\",\n    \"\\n\",\n    \"If you are on a small team with limited engineering or DevOps resources, try out automated deployment with BentoML CLI, currently supporting AWS Lambda, AWS SageMaker, and Azure Functions:\\n\",\n    
\"\\n\",\n    \"* [AWS Lambda Deployment Guide](https://docs.bentoml.org/en/latest/deployment/aws_lambda.html)\\n\",\n    \"* [AWS SageMaker Deployment Guide](https://docs.bentoml.org/en/latest/deployment/aws_sagemaker.html)\\n\",\n    \"* [Azure Functions Deployment Guide](https://docs.bentoml.org/en/latest/deployment/azure_functions.html)\\n\",\n    \"\\n\",\n    \"If the cloud platform you are working with is not on the list above, try out these step-by-step guides on manually deploying a BentoML packaged model to cloud platforms:\\n\",\n    \"\\n\",\n    \"* [AWS ECS Deployment](https://docs.bentoml.org/en/latest/deployment/aws_ecs.html)\\n\",\n    \"* [Google Cloud Run Deployment](https://docs.bentoml.org/en/latest/deployment/google_cloud_run.html)\\n\",\n    \"* [Azure container instance Deployment](https://docs.bentoml.org/en/latest/deployment/azure_container_instance.html)\\n\",\n    \"* [Heroku Deployment](https://docs.bentoml.org/en/latest/deployment/heroku.html)\\n\",\n    \"\\n\",\n    \"Lastly, if you have a DevOps or ML Engineering team that operates a Kubernetes or OpenShift cluster, use the following guides as references for implementing your deployment strategy:\\n\",\n    \"\\n\",\n    \"* [Kubernetes Deployment](https://docs.bentoml.org/en/latest/deployment/kubernetes.html)\\n\",\n    \"* [Knative Deployment](https://docs.bentoml.org/en/latest/deployment/knative.html)\\n\",\n    \"* [Kubeflow Deployment](https://docs.bentoml.org/en/latest/deployment/kubeflow.html)\\n\",\n    \"* [KFServing Deployment](https://docs.bentoml.org/en/latest/deployment/kfserving.html)\\n\",\n    \"* [Clipper.ai Deployment Guide](https://docs.bentoml.org/en/latest/deployment/clipper.html)\"\n   ],\n   \"outputs\": [],\n   \"execution_count\": null\n  }\n ],\n \"metadata\": {\n  \"colab\": {\n   \"collapsed_sections\": [],\n   \"name\": \"PaddleHub_deeplabv3p_xception65_humanseg.ipynb\",\n   \"provenance\": [],\n   \"toc_visible\": true\n  },\n  \"kernelspec\": {\n   
\"display_name\": \"Python 3\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.5\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 1\n}"
  },
  {
    "path": "demo/serving/lexical_analysis_lac/templates/lac_gpu_serving_config.json",
    "content": "{\n  \"modules_info\": {\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"2.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1\n      }\n    }\n  },\n  \"use_gpu\": true,\n  \"port\": 8866,\n  \"gpu\": \"0,1,2\"\n}\n"
  },
  {
    "path": "demo/serving/lexical_analysis_lac/templates/lac_serving_config.json",
    "content": "{\n  \"modules_info\": {\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"2.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    }\n  },\n  \"port\": 8866,\n  \"use_multiprocess\": true,\n  \"workers\": 2,\n  \"timeout\": 30\n}\n"
  },
  {
    "path": "demo/serving/module_serving/lexical_analysis_lac/README.md",
    "content": "# 部署词法分析服务-以lac为例\n## 简介\n`Lexical Analysis of Chinese`，简称`LAC`，是一个联合的词法分析模型，能整体性地完成中文分词、词性标注、专名识别任务。关于`LAC`的具体信息请参见[LAC](https://paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis)。\n\n使用PaddleHub Serving可以部署一个在线词法分析服务，可以将此接口用于词法分析、在线分词等在线web应用。\n\n这里就带领大家使用PaddleHub Serving，通过简单几步部署一个词法分析在线服务。\n\n## Step1：启动PaddleHub Serving\n启动命令如下\n```shell\n$ hub serving start -m lac\n```\n启动时会显示加载模型过程，启动成功后显示\n```shell\nLoading lac successful.\n```\n这样就完成了一个词法分析服务化API的部署，默认端口号为8866。\n\n## Step2：测试词法分析在线API\n### 不使用自定义词典\n在服务部署好之后，我们可以进行测试，用来测试的文本为`今天是个好日子`和`天气预报说今天要下雨`。\n首先指定编码格式及引入需要的包：\n```python\n>>> # coding: utf8\n>>> import requests\n>>> import json\n```\n准备的数据格式为：\n```python\n{\"text\": [text_1, text_2, ...]}\n```\n**NOTE:** 字典的key为\"text\"。\n\n根据文本和数据格式，代码如下：\n```python\n>>> # 指定用于预测的文本并生成字典{\"text\": [text_1, text_2, ... ]}\n>>> text_list = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n>>> text = {\"text\": text_list}\n```\n\n## Step3：获取并验证结果\n接下来发送请求到词法分析API，并得到结果，代码如下：\n```python\n# 指定预测方法为lac并发送post请求\n>>> url = \"http://127.0.0.1:8866/predict/text/lac\"\n>>> r = requests.post(url=url, data=text)\n```\n`LAC`模型返回的结果为每个文本分词后的结果，我们尝试打印接口返回结果：\n```python\n# 打印预测结果\n>>> print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n{\n    \"msg\": \"\",\n    \"results\": [\n        {\n            \"tag\": [\n                \"TIME\", \"v\", \"q\", \"n\"\n            ],\n            \"word\": [\n                \"今天\", \"是\", \"个\", \"好日子\"\n            ]\n        },\n        {\n            \"tag\": [\n                \"n\", \"v\", \"TIME\", \"v\", \"v\"\n            ],\n            \"word\": [\n                \"天气预报\", \"说\", \"今天\", \"要\", \"下雨\"\n            ]\n        }\n    ],\n    \"status\": \"0\"\n}\n```\n这样我们就完成了对词法分析的预测服务化部署和测试。\n\n完整的测试代码见[lac_serving_demo.py](./lac_serving_demo.py)。\n\n### 使用自定义词典\n`LAC`模型在预测时还可以使用自定义词典干预默认分词结果，这种情况只需要将自定义词典以文件的形式附加到request请求即可，数据格式如下：\n```python\n{\"user_dict\": 
user_dict.txt}\n```\n根据数据格式，具体代码如下：\n```python\n>>> # 指定自定义词典{\"user_dict\": dict.txt}\n>>> file = {\"user_dict\": open(\"dict.txt\", \"rb\")}\n>>> # 请求接口时以文件的形式附加自定义词典，其余和不使用自定义词典的请求方式相同，此处不再赘述\n>>> url = \"http://127.0.0.1:8866/predict/text/lac\"\n>>> r = requests.post(url=url, files=file, data=text)\n```\n\n完整的测试代码见[lac_with_dict_serving_demo.py](./lac_with_dict_serving_demo.py)。\n\n### 客户端请求新版模型的方式\n对某些新版模型，客户端请求方式有所变化，更接近本地预测的请求方式，以降低学习成本。\n以lac(2.1.0)为例，使用上述方法进行请求将提示：\n```python\n{\n    \"Warnning\": \"This usage is out of date, please use 'application/json' as content-type to post to /predict/lac. See 'https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/serving.md' for more details.\"\n}\n```\n对于lac(2.1.0)，请求的方式如下：\n```python\n# coding: utf8\nimport requests\nimport json\n\nif __name__ == \"__main__\":\n    # 指定用于预测的文本并生成字典[text_1, text_2, ... ]\n    text = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"texts\"\n    # 对应本地部署，则为lac.analysis_lexical(text=[text1, text2])\n    data = {\"texts\": text, \"batch_size\": 1}\n    # 指定预测方法为lac并发送post请求\n    url = \"http://127.0.0.1:8866/predict/lac\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n```\n对结果的解析等与前一种方式一致，显示如下：\n```python\n{\n    \"results\": [\n        {\n            \"tag\": [\n                \"TIME\", \"v\", \"q\", \"n\"\n            ],\n            \"word\": [\n                \"今天\", \"是\", \"个\", \"好日子\"\n            ]\n        },\n        {\n            \"tag\": [\n                \"n\", \"v\", \"TIME\", \"v\", \"v\"\n            ],\n            \"word\": [\n                \"天气预报\", \"说\", \"今天\", \"要\", \"下雨\"\n            ]\n        }\n    ]\n}\n```\n此Demo的具体信息和代码请参见[LAC Serving_2.1.0](./lac_2.1.0_serving_demo.py)。\n"
  },
  {
    "path": "demo/serving/module_serving/lexical_analysis_lac/lac_serving_demo.py",
    "content": "# coding: utf8\nimport requests\nimport json\n\nif __name__ == \"__main__\":\n    # 指定用于预测的文本并生成字典{\"text\": [text_1, text_2, ... ]}\n    text = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为lac.analysis_lexical(data=text, batch_size=1)\n    # 若使用lac版本低于2.2.0，需要将`text`参数改为`texts`\n    data = {\"text\": text, \"batch_size\": 1}\n    # 指定预测方法为lac并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/lac\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n"
  },
  {
    "path": "demo/serving/module_serving/object_detection_pyramidbox_lite_server_mask/pyramidbox_lite_server_mask_serving_demo.py",
    "content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # 获取图片的base64编码格式\n    img1 = cv2_to_base64(cv2.imread(\"../../../../docs/imgs/family_mask.jpg\"))\n    img2 = cv2_to_base64(cv2.imread(\"../../../../docs/imgs/woman_mask.jpg\"))\n    data = {'images': [img1, img2]}\n    # 指定content-type\n    headers = {\"Content-type\": \"application/json\"}\n    # 发送HTTP请求\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_server_mask\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json())\n"
  },
  {
    "path": "demo/style_transfer/README.md",
    "content": "# PaddleHub 图像风格迁移\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run msgnet --input_path \"/PATH/TO/ORIGIN/IMAGE\" --style_path \"/PATH/TO/STYLE/IMAGE\"\n```\n\n## 脚本预测\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet')\n    result = model.predict(origin=[\"venice-boat.jpg\"], style=\"candy.jpg\", visualization=True, save_path ='style_tranfer')\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用msgnet模型对[MiniCOCO](../../docs/reference/datasets.md#class-hubdatasetsMiniCOCO)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets.minicoco import MiniCOCO\n\nstyledata = MiniCOCO(transform=transform, mode='train')\n\n```\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`， 默认为`train`。\n\n数据集的准备代码可以参考 [minicoco.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.MiniCOCO()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='msgnet', load_checkpoint=None)\n```\n* `name`: 选择预训练模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\ntrainer.train(styledata, epochs=101, batch_size=4, eval_dataset=styledata, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 
全局学习率。默认为1e-4；\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n    result = model.predict(origin=[\"venice-boat.jpg\"], style=\"candy.jpg\", visualization=True, save_path='style_tranfer')\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**Args**\n* `origin`: 原始图像路径或BGR格式图片；\n* `style`: 风格图像路径；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'style_tranfer'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线风格迁移服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m msgnet\n```\n\n这样就完成了一个风格迁移服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = 
np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\nstyle_im = cv2.imread('/PATH/TO/STYLE/IMAGE')\ndata = {'images':[[cv2_to_base64(org_im)], cv2_to_base64(style_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/msgnet\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0])\ncv2.imwrite('style.png', data)\n```\n\n### 查看代码\n\nhttps://github.com/zhanghang1989/PyTorch-Multi-Style-Transfer\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0"
  },
  {
    "path": "demo/style_transfer/predict.py",
    "content": "import paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n    result = model.predict(origin=[\"venice-boat.jpg\"], style=\"candy.jpg\", visualization=True, save_path='style_tranfer')\n"
  },
  {
    "path": "demo/style_transfer/train.py",
    "content": "import paddle\nimport paddlehub as hub\n\nfrom paddlehub.finetune.trainer import Trainer\nfrom paddlehub.datasets.minicoco import MiniCOCO\nimport paddlehub.vision.transforms as T\n\nif __name__ == \"__main__\":\n    model = hub.Module(name='msgnet')\n    transform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n    styledata = MiniCOCO(transform)\n    optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n    trainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\n    trainer.train(styledata, epochs=101, batch_size=4, log_interval=10, save_interval=10)\n"
  },
  {
    "path": "demo/text_classification/README.md",
    "content": "# PaddleHub Transformer模型fine-tune文本分类（动态图）\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](../../docs/imgs/RNN_Sample.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 如何开始Fine-tune\n\n\n我们以中文情感分类公开数据集ChnSentiCorp为示例数据集，可以运行下面的命令，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）验证。通过如下命令，即可启动训练。\n\n```shell\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.1', task='seq-cls', num_classes=2)\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`seq-cls`，表示文本分类任务。\n* `num_classes`：表示当前文本分类任务的类别数，根据具体使用的数据集确定，默认为2。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持文本分类任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese                | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | 
`hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(name='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(name='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English             | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上一行代码，`model`被初始化为一个适用于文本分类任务的模型，即在ERNIE Tiny预训练模型之后拼接一个全连接网络（Fully Connected）。\n\n![](../../docs/imgs/Single_Sentence_Classsification.jpg)\n\n以上图片来自于：https://arxiv.org/pdf/1810.04805.pdf\n\n### Step2: 下载并加载数据集\n\n```python\ntrain_dataset = hub.datasets.ChnSentiCorp(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = hub.datasets.ChnSentiCorp(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = hub.datasets.ChnSentiCorp(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `test`, `dev`，默认为`train`。\n* 
`max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png)\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png)\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='test_ernie_text_cls', use_gpu=True)\n\ntrainer.train(train_dataset, epochs=3, batch_size=32, eval_dataset=dev_dataset)\n\n# 在测试集上评估当前训练模型\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们以下列数据作为待预测数据，使用该模型进行预测：\n\n```text\n这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般\n怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片\n作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。\n```\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 
'positive'}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.1',\n    task='seq-cls',\n    load_checkpoint='./test_ernie_text_cls/best_model/model.pdparams',\n    label_map=label_map)\nresults, probs = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False, return_prob=True)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {} \\t Prob: {}'.format(text[0], results[idx], probs[idx]))\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型的具体用法可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n"
  },
  {
    "path": "demo/text_classification/embedding/model.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport paddlenlp as nlp\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlenlp.data import JiebaTokenizer\n\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import pad_sequence, trunc_sequence\n\n\nclass BoWModel(nn.Layer):\n    \"\"\"\n    This class implements the Bag of Words Classification Network model to classify texts.\n    At a high level, the model starts by embedding the tokens and running them through\n    a word embedding. 
Then, we encode these representations with a `BoWEncoder`.\n    Lastly, we take the output of the encoder to create a final representation,\n    which is passed through some feed-forward layers to output logits (`output_layer`).\n    Args:\n        num_classes (obj:`int`, optional, defaults to 2): The number of classes.\n        embedder (obj:`TokenEmbedding`): The embedding layer used to embed the input token ids.\n        tokenizer (obj:`JiebaTokenizer`): The tokenizer used to segment the input texts.\n        hidden_size (obj:`int`, optional, defaults to 128): The first fully-connected layer hidden size.\n        fc_hidden_size (obj:`int`, optional, defaults to 96): The second fully-connected layer hidden size.\n        load_checkpoint (obj:`str`, optional): The path of the checkpoint to load.\n        label_map (obj:`dict`, optional): The mapping from label ids to label names.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 2,\n                 embedder: TokenEmbedding = None,\n                 tokenizer: JiebaTokenizer = None,\n                 hidden_size: int = 128,\n                 fc_hidden_size: int = 96,\n                 load_checkpoint: str = None,\n                 label_map: dict = None):\n        super().__init__()\n        self.embedder = embedder\n        self.tokenizer = tokenizer\n        self.label_map = label_map\n\n        emb_dim = self.embedder.embedding_dim\n        self.bow_encoder = nlp.seq2vec.BoWEncoder(emb_dim)\n        self.fc1 = nn.Linear(self.bow_encoder.get_output_dim(), hidden_size)\n        self.fc2 = nn.Linear(hidden_size, fc_hidden_size)\n        self.dropout = nn.Dropout(p=0.3, axis=1)\n        self.output_layer = nn.Linear(fc_hidden_size, num_classes)\n        self.criterion = nn.loss.CrossEntropyLoss()\n        self.metric = paddle.metric.Accuracy()\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def training_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        \"\"\"\n        One step for 
training, which should be called as forward computation.\n        Args:\n            batch(:obj:List[paddle.Tensor]): The one batch data, which contains the model needed,\n                such as input_ids, sent_ids, pos_ids, input_mask and labels.\n            batch_idx(int): The index of batch.\n        Returns:\n            results(:obj: Dict) : The model outputs, such as loss and metrics.\n        \"\"\"\n        _, avg_loss, metric = self(ids=batch[0], labels=batch[1])\n        self.metric.reset()\n        return {'loss': avg_loss, 'metrics': metric}\n\n    def validation_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        \"\"\"\n        One step for validation, which should be called as forward computation.\n        Args:\n            batch(:obj:List[paddle.Tensor]): The one batch data, which contains the model needed,\n                such as input_ids, sent_ids, pos_ids, input_mask and labels.\n            batch_idx(int): The index of batch.\n        Returns:\n            results(:obj: Dict) : The model outputs, such as metrics.\n        \"\"\"\n        _, _, metric = self(ids=batch[0], labels=batch[1])\n        self.metric.reset()\n        return {'metrics': metric}\n\n    def forward(self, ids: paddle.Tensor, labels: paddle.Tensor = None):\n\n        # Shape: (batch_size, num_tokens, embedding_dim)\n        embedded_text = self.embedder(ids)\n\n        # Shape: (batch_size, embedding_dim)\n        summed = self.bow_encoder(embedded_text)\n        summed = self.dropout(summed)\n        encoded_text = paddle.tanh(summed)\n\n        # Shape: (batch_size, hidden_size)\n        fc1_out = paddle.tanh(self.fc1(encoded_text))\n        # Shape: (batch_size, fc_hidden_size)\n        fc2_out = paddle.tanh(self.fc2(fc1_out))\n        # Shape: (batch_size, num_classes)\n        logits = self.output_layer(fc2_out)\n\n        probs = F.softmax(logits, axis=1)\n        if labels is not None:\n            loss = self.criterion(logits, labels)\n            
correct = self.metric.compute(probs, labels)\n            acc = self.metric.update(correct)\n            return probs, loss, {'acc': acc}\n        else:\n            return probs\n\n    def _batchify(self, data: List[List[str]], max_seq_len: int, batch_size: int):\n        examples = []\n        for item in data:\n            ids = self.tokenizer.encode(sentence=item[0])\n\n            if len(ids) > max_seq_len:\n                ids = trunc_sequence(ids, max_seq_len)\n            else:\n                pad_token = self.tokenizer.vocab.pad_token\n                pad_token_id = self.tokenizer.vocab.to_indices(pad_token)\n                ids = pad_sequence(ids, max_seq_len, pad_token_id)\n            examples.append(ids)\n\n        # Separates data into some batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield one_batch\n                one_batch = []\n        if one_batch:\n            # The last batch whose size is less than the config batch_size setting.\n            yield one_batch\n\n    def predict(\n            self,\n            data: List[List[str]],\n            max_seq_len: int = 128,\n            batch_size: int = 1,\n            use_gpu: bool = False,\n            return_result: bool = True,\n    ):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, max_seq_len, batch_size)\n        results = []\n        self.eval()\n        for batch in batches:\n            ids = paddle.to_tensor(batch)\n            probs = self(ids)\n            idx = paddle.argmax(probs, axis=1).numpy()\n\n            if return_result:\n                idx = idx.tolist()\n                labels = [self.label_map[i] for i in idx]\n                results.extend(labels)\n            else:\n                results.extend(probs.numpy())\n\n        return results\n"
  },
  {
    "path": "demo/text_classification/embedding/predict.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddlehub as hub\nfrom paddlenlp.data import JiebaTokenizer\nfrom model import BoWModel\n\nimport ast\nimport argparse\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--hub_embedding_name\", type=str, default='w2v_baidu_encyclopedia_target_word-word_dim300', help=\"\")\nparser.add_argument(\"--max_seq_len\", type=int, default=128, help=\"Number of words of the longest seqence.\")\nparser.add_argument(\"--batch_size\", type=int, default=64, help=\"Total examples' number in batch for training.\")\nparser.add_argument(\"--checkpoint\", type=str, default='./checkpoint/best_model/model.pdparams', help=\"Model checkpoint\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether use GPU for fine-tuning, input should be True or False\")\n\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    # Data to be prdicted\n    data = [\n        [\"这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般\"],\n        [\"交通方便；环境很好；服务态度很好 房间较小\"],\n        [\"还稍微重了点，可能是硬盘大的原故，还要再轻半斤就好了。其他要进一步验证。贴的几种膜气泡较多，用不了多久就要更换了，屏幕膜稍好点，但比没有要强多了。建议配赠几张膜让用用户自己贴。\"],\n        [\"前台接待太差，酒店有A B楼之分，本人check－in后，前台未告诉B楼在何处，并且B楼无明显指示；房间太小，根本不像4星级设施，下次不会再选择入住此店啦\"],\n        [\"19天硬盘就罢工了~~~算上运来的一周都没用上15天~~~可就是不能换了~~~唉~~~~你说这算什么事呀~~~\"],\n    ]\n\n    label_map = {0: 'negative', 1: 'positive'}\n\n    embedder = 
hub.Module(name=args.hub_embedding_name)\n    tokenizer = embedder.get_tokenizer()\n    model = BoWModel(embedder=embedder, tokenizer=tokenizer, load_checkpoint=args.checkpoint, label_map=label_map)\n\n    results = model.predict(\n        data, max_seq_len=args.max_seq_len, batch_size=args.batch_size, use_gpu=args.use_gpu, return_result=True)\n    for idx, text in enumerate(data):\n        print('Data: {} \\t Label: {}'.format(text[0], results[idx]))\n"
  },
  {
    "path": "demo/text_classification/embedding/train.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import ChnSentiCorp\nfrom paddlenlp.data import JiebaTokenizer\nfrom model import BoWModel\n\nimport ast\nimport argparse\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--hub_embedding_name\", type=str, default='w2v_baidu_encyclopedia_target_word-word_dim300', help=\"\")\nparser.add_argument(\"--num_epoch\", type=int, default=10, help=\"Number of epoches for fine-tuning.\")\nparser.add_argument(\"--learning_rate\", type=float, default=5e-4, help=\"Learning rate used to train with warmup.\")\nparser.add_argument(\"--max_seq_len\", type=int, default=128, help=\"Number of words of the longest seqence.\")\nparser.add_argument(\"--batch_size\", type=int, default=64, help=\"Total examples' number in batch for training.\")\nparser.add_argument(\"--checkpoint_dir\", type=str, default='./checkpoint', help=\"Directory to model checkpoint\")\nparser.add_argument(\"--save_interval\", type=int, default=5, help=\"Save checkpoint every n epoch.\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether use GPU for fine-tuning, input should be True or False\")\n\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    embedder = hub.Module(name=args.hub_embedding_name)\n    tokenizer = 
embedder.get_tokenizer()\n\n    train_dataset = ChnSentiCorp(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='train')\n    dev_dataset = ChnSentiCorp(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='dev')\n    test_dataset = ChnSentiCorp(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='test')\n\n    model = BoWModel(embedder=embedder)\n    optimizer = paddle.optimizer.AdamW(learning_rate=args.learning_rate, parameters=model.parameters())\n    trainer = hub.Trainer(model, optimizer, checkpoint_dir=args.checkpoint_dir, use_gpu=args.use_gpu)\n    trainer.train(\n        train_dataset,\n        epochs=args.num_epoch,\n        batch_size=args.batch_size,\n        eval_dataset=dev_dataset,\n        save_interval=args.save_interval,\n    )\n    trainer.evaluate(test_dataset, batch_size=args.batch_size)\n"
  },
  {
    "path": "demo/text_classification/predict.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddlehub as hub\n\nif __name__ == '__main__':\n\n    data = [\n        ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n        ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n        ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n    ]\n    label_map = {0: 'negative', 1: 'positive'}\n\n    model = hub.Module(\n        name='ernie_tiny',\n        version='2.0.1',\n        task='seq-cls',\n        load_checkpoint='./test_ernie_text_cls/best_model/model.pdparams',\n        label_map=label_map)\n    results, probs = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False, return_prob=True)\n    for idx, text in enumerate(data):\n        print('Data: {} \\t Lable: {} \\t Prob: {}'.format(text[0], results[idx], probs[idx]))\n"
  },
  {
    "path": "demo/text_classification/train.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import ChnSentiCorp\n\nimport ast\nimport argparse\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--num_epoch\", type=int, default=3, help=\"Number of epoches for fine-tuning.\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether use GPU for fine-tuning, input should be True or False\")\nparser.add_argument(\"--learning_rate\", type=float, default=5e-5, help=\"Learning rate used to train with warmup.\")\nparser.add_argument(\"--max_seq_len\", type=int, default=128, help=\"Number of words of the longest seqence.\")\nparser.add_argument(\"--batch_size\", type=int, default=32, help=\"Total examples' number in batch for training.\")\nparser.add_argument(\"--checkpoint_dir\", type=str, default='./checkpoint', help=\"Directory to model checkpoint\")\nparser.add_argument(\"--save_interval\", type=int, default=1, help=\"Save checkpoint every n epoch.\")\n\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    model = hub.Module(name='ernie_tiny', version='2.0.1', task='seq-cls')\n\n    train_dataset = ChnSentiCorp(tokenizer=model.get_tokenizer(), max_seq_len=args.max_seq_len, mode='train')\n    dev_dataset = ChnSentiCorp(tokenizer=model.get_tokenizer(), max_seq_len=args.max_seq_len, 
mode='dev')\n    test_dataset = ChnSentiCorp(tokenizer=model.get_tokenizer(), max_seq_len=args.max_seq_len, mode='test')\n\n    optimizer = paddle.optimizer.AdamW(learning_rate=args.learning_rate, parameters=model.parameters())\n    trainer = hub.Trainer(model, optimizer, checkpoint_dir=args.checkpoint_dir, use_gpu=args.use_gpu)\n    trainer.train(\n        train_dataset,\n        epochs=args.num_epoch,\n        batch_size=args.batch_size,\n        eval_dataset=dev_dataset,\n        save_interval=args.save_interval,\n    )\n    trainer.evaluate(test_dataset, batch_size=args.batch_size)\n"
  },
  {
    "path": "demo/text_matching/README.md",
    "content": "# PaddleHub Transformer模型fine-tune文本匹配（动态图）\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 文本匹配\n\n使用预训练模型ERNIE完成文本匹配任务，大家可能会想到将query和title文本拼接，之后输入ERNIE中，取`CLS`特征（pooled_output），之后输出全连接层，进行二分类。如下图ERNIE用于句对分类任务的用法：\n\n![](https://camo.githubusercontent.com/5e1867ee2b6fc3a0f94c7b2c87a4d987fed4c440d4d9c80726e5798900880027/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f34353434303032396330373234306164383964363635633562313736653633323937653935383465316461323465303262373964643534666239393066373461)\n\n然而，以上用法的问题在于，ERNIE的模型参数非常庞大，导致计算量非常大，预测的速度也不够理想。从而达不到线上业务的要求。针对该问题，使用Sentence Transformer网络可以优化计算量。\n\nSentence Transformer采用了双塔（Siamese）的网络结构。Query和Title分别输入Transformer网络，共享网络参数，得到各自的token embedding特征。之后对token embedding进行pooling（此处教程使用mean pooling操作），之后输出分别记作u，v。之后将三个表征（u,v,|u-v|)拼接起来，进行二分类。网络结构如下图所示。\n\n![](https://camo.githubusercontent.com/80e65553f0c82886a27897a0a151ee9745e6e2def310d6649c8a68e2672c06c2/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f31303339393837303365313334613731383438383335313161353338363230653136666564303435653236313464636338616661636563343436363030343338)\n\n更多关于Sentence Transformer的信息可以参考论文：https://arxiv.org/abs/1908.10084\n\n## 
如何开始Fine-tune\n\n\n我们以中文文本匹配数据集LCQMC为示例数据集，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）上验证、在测试集（test.tsv）上测试。\n\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.2', task='text-matching')\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号。\n* `task`：fine-tune任务。此处为`text-matching`，表示文本匹配任务。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持文本匹配任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese                | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(name='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(name='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English         
    | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上一行代码，`model`被初始化为一个适用于文本匹配任务的双塔（Siamese）结构模型。\n\n\n### Step2: 下载并加载数据集\n\n```python\nfrom paddlehub.datasets import LCQMC\n\ntrain_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `dev`, `test`，默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=10,\n    batch_size=32,\n    eval_dataset=dev_dataset,\n    
save_interval=2,\n)\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: workers的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size。\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将使用最优模型来进行预测（LCQMC数据集中标签1表示语义相似，0表示语义不相似）：\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个表情叫什么', '这个猫的表情叫什么'],\n    ['什么是智能手环', '智能手环有什么用'],\n    ['介绍几本好看的都市异能小说，要完结的！', '求一本好看点的都市异能小说，要完结的'],\n    ['一只蜜蜂落在日历上（打一成语）', '一只蜜蜂停在日历上（猜一成语）'],\n    ['一盒香烟不拆开能存放多久？', '一条没拆封的香烟能存放多久。'],\n]\nlabel_map = {0: 'dissimilar', 1: 'similar'}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.2',\n    task='text-matching',\n    load_checkpoint='./checkpoint/best_model/model.pdparams',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=128, batch_size=1, use_gpu=True)\nfor idx, texts in enumerate(data):\n    print('TextA: {}\\tTextB: {}\\t Label: {}'.format(texts[0], texts[1], results[idx]))\n```\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "demo/text_matching/predict.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    data = [\n        ['这个表情叫什么', '这个猫的表情叫什么'],\n        ['什么是智能手环', '智能手环有什么用'],\n        ['介绍几本好看的都市异能小说，要完结的！', '求一本好看点的都市异能小说，要完结的'],\n        ['一只蜜蜂落在日历上（打一成语）', '一只蜜蜂停在日历上（猜一成语）'],\n        ['一盒香烟不拆开能存放多久？', '一条没拆封的香烟能存放多久。'],\n    ]\n    label_map = {0: 'similar', 1: 'dissimilar'}\n\n    model = hub.Module(\n        name='ernie_tiny',\n        version='2.0.2',\n        task='text-matching',\n        load_checkpoint='./checkpoint/best_model/model.pdparams',\n        label_map=label_map)\n    results = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=True)\n    for idx, texts in enumerate(data):\n        print('TextA: {}\\tTextB: {}\\t Label: {}'.format(texts[0], texts[1], results[idx]))\n"
  },
  {
    "path": "demo/text_matching/train.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import LCQMC\n\nimport ast\nimport argparse\n\nparser = argparse.ArgumentParser(__doc__)\nparser.add_argument(\"--num_epoch\", type=int, default=10, help=\"Number of epochs for fine-tuning.\")\nparser.add_argument(\n    \"--use_gpu\",\n    type=ast.literal_eval,\n    default=True,\n    help=\"Whether to use GPU for fine-tuning, input should be True or False\")\nparser.add_argument(\"--learning_rate\", type=float, default=5e-5, help=\"Learning rate used to train with warmup.\")\nparser.add_argument(\"--max_seq_len\", type=int, default=64, help=\"Number of words of the longest sequence.\")\nparser.add_argument(\"--batch_size\", type=int, default=128, help=\"Total examples' number in batch for training.\")\nparser.add_argument(\"--checkpoint_dir\", type=str, default='./checkpoint', help=\"Directory to save model checkpoints.\")\nparser.add_argument(\"--save_interval\", type=int, default=2, help=\"Save checkpoint every n epochs.\")\n\nargs = parser.parse_args()\n\nif __name__ == '__main__':\n    model = hub.Module(name='ernie_tiny', version='2.0.2', task='text-matching')\n    tokenizer = model.get_tokenizer()\n\n    train_dataset = LCQMC(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='train')\n    dev_dataset = LCQMC(tokenizer=tokenizer, max_seq_len=args.max_seq_len, 
mode='dev')\n    test_dataset = LCQMC(tokenizer=tokenizer, max_seq_len=args.max_seq_len, mode='test')\n\n    optimizer = paddle.optimizer.AdamW(learning_rate=args.learning_rate, parameters=model.parameters())\n    trainer = hub.Trainer(model, optimizer, checkpoint_dir=args.checkpoint_dir, use_gpu=args.use_gpu)\n    trainer.train(\n        train_dataset,\n        epochs=args.num_epoch,\n        batch_size=args.batch_size,\n        eval_dataset=dev_dataset,\n        save_interval=args.save_interval,\n    )\n    trainer.evaluate(test_dataset, batch_size=args.batch_size)\n"
  },
  {
    "path": "docker/Dockerfile",
    "content": "FROM ubuntu:16.04\n\nRUN echo \"deb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial main restricted \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-updates main restricted \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial universe \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-updates universe \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial multiverse \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-updates multiverse \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-backports main restricted universe multiverse \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-security main restricted \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-security universe \\n\\\ndeb [trusted=true] http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ xenial-security multiverse\" > /etc/apt/sources.list\n\nRUN apt-get update && apt-get install -y inetutils-ping wget vim curl cmake git sox libsndfile1 libpng12-dev \\\n    libpng-dev swig libzip-dev openssl bc libflac* libgdk-pixbuf2.0-dev libpango1.0-dev libcairo2-dev \\\n    libgtk2.0-dev pkg-config zip unzip zlib1g-dev libreadline-dev libbz2-dev liblapack-dev libjpeg-turbo8-dev \\\n    sudo lrzsz libsqlite3-dev libx11-dev libsm6 apt-utils libopencv-dev libavcodec-dev libavformat-dev \\\n    libswscale-dev locales liblzma-dev python-lzma m4 libxext-dev strace libibverbs-dev libpcre3 libpcre3-dev \\\n    build-essential libncurses5-dev libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev xz-utils \\\n    libfreetype6-dev libxslt1-dev libxml2-dev libgeos-3.5.0 libgeos-dev && apt-get install -y --allow-downgrades \\\n    --allow-change-held-packages && DEBIAN_FRONTEND=noninteractive apt-get install -y tzdata \\\n    && /bin/cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && 
dpkg-reconfigure -f noninteractive tzdata\n\nRUN echo \"set meta-flag on\" >> /etc/inputrc && echo \"set convert-meta off\" >> /etc/inputrc && \\\n    locale-gen en_US.UTF-8 && /sbin/ldconfig -v && groupadd -g 10001 paddlehub && \\\n    useradd -m -s /bin/bash -N -u 10001 paddlehub -g paddlehub && chmod g+w /etc/passwd && \\\n    echo \"paddlehub ALL=(ALL) NOPASSWD: ALL\" >> /etc/sudoers\n\nENV LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LANGUAGE=en_US.UTF-8 TZ=Asia/Shanghai\n\n# official download site: https://www.python.org/ftp/python/3.7.13/Python-3.7.13.tgz\nRUN wget https://cdn.npmmirror.com/binaries/python/3.7.13/Python-3.7.13.tgz && tar xvf Python-3.7.13.tgz && \\\n    cd Python-3.7.13 && ./configure --prefix=/home/paddlehub/python3.7 && make -j8 && make install && \\\n    rm -rf ../Python-3.7.13 ../Python-3.7.13.tgz && chown -R paddlehub:paddlehub /home/paddlehub/python3.7\n\nRUN cd /tmp && wget https://mirrors.sjtug.sjtu.edu.cn/gnu/gmp/gmp-6.1.0.tar.bz2 && tar xvf gmp-6.1.0.tar.bz2 && \\\n    cd gmp-6.1.0 && ./configure --prefix=/usr/local && make -j8 && make install && \\\n    rm -rf ../gmp-6.1.0.tar.bz2 ../gmp-6.1.0 && cd /tmp && \\\n    wget https://www.mpfr.org/mpfr-3.1.4/mpfr-3.1.4.tar.bz2 && tar xvf mpfr-3.1.4.tar.bz2 && cd mpfr-3.1.4 && \\\n    ./configure --prefix=/usr/local && make -j8 && make install && rm -rf ../mpfr-3.1.4.tar.bz2 ../mpfr-3.1.4 && \\\n    cd /tmp && wget https://mirrors.sjtug.sjtu.edu.cn/gnu/mpc/mpc-1.0.3.tar.gz && tar xvf mpc-1.0.3.tar.gz && \\\n    cd mpc-1.0.3 && ./configure --prefix=/usr/local && make -j8 && make install && \\\n    rm -rf ../mpc-1.0.3.tar.gz ../mpc-1.0.3 && cd /tmp && \\\n    wget http://www.mirrorservice.org/sites/sourceware.org/pub/gcc/infrastructure/isl-0.18.tar.bz2 && \\\n    tar xvf isl-0.18.tar.bz2 && cd isl-0.18 && ./configure --prefix=/usr/local && make -j8 && make install \\\n    && rm -rf ../isl-0.18.tar.bz2 ../isl-0.18 && cd /tmp && \\\n    wget 
http://mirrors.ustc.edu.cn/gnu/gcc/gcc-8.2.0/gcc-8.2.0.tar.gz --no-check-certificate && \\\n    tar xvf gcc-8.2.0.tar.gz && cd gcc-8.2.0 && unset LIBRARY_PATH && ./configure --prefix=/home/paddlehub/gcc82 \\\n    --enable-threads=posix --disable-checking --disable-multilib --enable-languages=c,c++ --with-gmp=/usr/local \\\n    --with-mpfr=/usr/local --with-mpc=/usr/local --with-isl=/usr/local && make -j8 && make install && \\\n    rm -rf ../gcc-8.2.0.tar.gz ../gcc-8.2.0 && chown -R paddlehub:paddlehub /home/paddlehub/gcc82\n\nWORKDIR /home/paddlehub\nUSER paddlehub\nENV PATH=/home/paddlehub/python3.7/bin:/home/paddlehub/gcc82/bin:${PATH} \\\n    LD_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu:/usr/local/cuda-11.2/targets/x86_64-linux/lib:${LD_LIBRARY_PATH}\n\nRUN mkdir -p ~/.pip && echo \"[global]\" > ~/.pip/pip.conf && \\\n    echo \"index-url=https://mirror.baidu.com/pypi/simple\" >> ~/.pip/pip.conf && \\\n    echo \"trusted-host=mirror.baidu.com\" >> ~/.pip/pip.conf && \\\n    pip3 install --upgrade pip && pip3 install paddlepaddle paddlehub shapely pyclipper && \\\n    sudo cp -f /home/paddlehub/gcc82/lib64/libstdc++.so.6.0.25 /usr/lib/x86_64-linux-gnu/libstdc++.so.6 && \\\n    rm -rf ~/.cache/pip\n\n# RUN hub install <model_name>\nCMD [\"bash\"]\n"
  },
  {
    "path": "docs/Makefile",
    "content": "# Minimal makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line, and also\n# from the environment for the first two.\nSPHINXOPTS    ?=\nSPHINXBUILD   ?= sphinx-build\nSOURCEDIR     = .\nBUILDDIR      = _build\n\n# Put it first so that \"make\" without argument is like \"make help\".\nhelp:\n\t@$(SPHINXBUILD) -M help \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n\n.PHONY: help Makefile\n\n# Catch-all target: route all unknown targets to Sphinx using the new\n# \"make mode\" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).\n%: Makefile\n\t@$(SPHINXBUILD) -M $@ \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n"
  },
  {
    "path": "docs/conf.py",
    "content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# http://www.sphinx-doc.org/en/master/config\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nfrom recommonmark.transform import AutoStructify\nfrom recommonmark.parser import CommonMarkParser\nimport os\nimport sys\nsys.path.insert(0, os.path.abspath('../paddlehub'))\n\n# -- Project information -----------------------------------------------------\n\nproject = 'PaddleHub'\ncopyright = '2020, PaddlePaddle'\nauthor = 'PaddlePaddle'\n\n# The full version, including alpha/beta/rc tags\nrelease = \"v1.5\"\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\n\nextensions = [\n    'sphinx.ext.autodoc',\n    'sphinx.ext.napoleon',\n    'sphinx.ext.coverage',\n    'sphinx.ext.viewcode',\n    'sphinx.ext.mathjax',\n    'sphinx.ext.githubpages',\n    'recommonmark',\n    'sphinx_markdown_tables',\n]\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\nsource_parsers = {'.md': CommonMarkParser}\n\nsource_suffix = ['.rst', '.md']\n\n# The master toctree document.\nmaster_doc = 'index'\n\n# The language for content autogenerated by Sphinx. 
Refer to documentation\n# for a list of supported languages.\n#\n# This is also used if you do content translation via gettext catalogs.\n# Usually you set \"language\" from the command line for these cases.\nlanguage = \"zh_CN\"\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages.  See the documentation for\n# a list of builtin themes.\n#\nhtml_theme = 'sphinx_rtd_theme'\n\n\n# -- Extension configuration -------------------------------------------------\ndef setup(app):\n    app.add_config_value('recommonmark_config', {\n        'enable_eval_rst': True,\n        'enable_auto_toc_tree': False,\n    }, True)\n    app.add_transform(AutoStructify)\n"
  },
  {
    "path": "docs/docs_ch/Makefile",
    "content": "# Minimal makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line, and also\n# from the environment for the first two.\nSPHINXOPTS    ?=\nSPHINXBUILD   ?= sphinx-build\nSOURCEDIR     = .\nBUILDDIR      = _build\n\n# Put it first so that \"make\" without argument is like \"make help\".\nhelp:\n\t@$(SPHINXBUILD) -M help \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n\n.PHONY: help Makefile\n\n# Catch-all target: route all unknown targets to Sphinx using the new\n# \"make mode\" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).\n%: Makefile\n\t@$(SPHINXBUILD) -M $@ \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n"
  },
  {
    "path": "docs/docs_ch/api/datasets/canvas.rst",
    "content": "==============\nCanvas\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.Canvas(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Dataset for colorization. It contains 1193 pictures in the Monet painting style and 400 in the Van Gogh style. The data was collected from https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/.\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method used to preprocess the images.\n    \n    * mode(str)\n        The mode for preparing the dataset (train or test). Defaults to 'train'."
  },
  {
    "path": "docs/docs_ch/api/datasets/chnsenticorp.rst",
    "content": "==============\nChnSentiCorp\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.ChnSentiCorp(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    ChnSentiCorp is a dataset for Chinese sentiment classification, which was published by Tan Songbo at the ICT of the Chinese Academy of Sciences.\n\n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_ch/api/datasets/esc50.rst",
    "content": "==============\nESC50\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.ESC50(mode: str = 'train', feat_type: str = 'mel'):\n\n-----------------\n\n    The ESC-50 dataset is a labeled collection of 2000 environmental audio recordings suitable for benchmarking methods of environmental sound classification.\n   \n-----------------\n\n* Args:\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev).\n    \n    * feat_type(:obj:`str`, `optional`, defaults to `mel`):\n        It identifies the input feature type (mel or raw)."
  },
  {
    "path": "docs/docs_ch/api/datasets/flowers.rst",
    "content": "==============\nFlowers\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.Flowers(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Flower classification dataset.\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method used to preprocess the images.\n    \n    * mode(str)\n        The mode for preparing the dataset (train, test or val). Defaults to 'train'."
  },
  {
    "path": "docs/docs_ch/api/datasets/lcqmc.rst",
    "content": "==============\nLCQMC\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.LCQMC(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    A Large-scale Chinese Question Matching Corpus.\n    \n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_ch/api/datasets/minicoco.rst",
    "content": "==============\nMiniCOCO\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.MiniCOCO(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Dataset for style transfer. It contains 2001 training images and 200 testing images, derived from COCO2014. It also contains 21 different style pictures in the folder \"21styles\".\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method used to preprocess the images.\n    \n    * mode(str)\n        The mode for preparing the dataset (train or test). Defaults to 'train'."
  },
  {
    "path": "docs/docs_ch/api/datasets/msra_ner.rst",
    "content": "==============\nMSRA_NER\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.MSRA_NER(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    A set of manually annotated Chinese word-segmentation data and specifications for training and testing a Chinese word-segmentation system for research purposes. For more information please refer to https://www.microsoft.com/en-us/download/details.aspx?id=52531\n\n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_ch/api/datasets/opticdisc.rst",
    "content": "==============\nOpticDiscSeg\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.OpticDiscSeg(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   The OpticDiscSeg dataset is extracted from iChallenge-AMD (https://ai.baidu.com/broad/subordinate?dataset=amd).\n   \n-----------------\n\n* Args:\n    * transform(Callable)\n        The method used to preprocess the images.\n    \n    * mode(str)\n        The mode for preparing the dataset (train, test or val). Defaults to 'train'."
  },
  {
    "path": "docs/docs_ch/api/datasets_index.rst",
    "content": "==============\nDatasets\n==============\n\n\nCV\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Canvas<datasets/canvas.rst>\n   Flowers<datasets/flowers.rst>\n   OpticDiscSeg<datasets/opticdisc.rst>\n   MiniCOCO<datasets/minicoco.rst>\n\nNLP\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   ChnSentiCorp<datasets/chnsenticorp.rst>\n   LCQMC<datasets/lcqmc.rst>\n   MSRA_NER<datasets/msra_ner.rst>\n\nAudio\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   ESC50<datasets/esc50.rst>"
  },
  {
    "path": "docs/docs_ch/api/env.rst",
    "content": "================\nHub Environment\n================\n\n.. code-block:: console\n\n    HUB_HOME\n    ├── MODULE_HOME\n    ├── CACHE_HOME\n    ├── DATA_HOME\n    ├── CONF_HOME\n    ├── THIRD_PARTY_HOME\n    ├── TMP_HOME\n    └── LOG_HOME\n\n\npaddlehub.env.HUB_HOME\n=========================\n\n    The root directory for storing PaddleHub related data. Default to ~/.paddlehub. Users can change the default value through the HUB_HOME environment variable.\n\npaddlehub.env.MODULE_HOME\n=========================\n\n    Directory for storing the installed PaddleHub Module.\n\npaddlehub.env.CACHE_HOME\n=========================\n\n    Directory for storing the cached data.\n\npaddlehub.env.DATA_HOME\n=========================\n\n    Directory for storing the automatically downloaded datasets.\n\npaddlehub.env.CONF_HOME\n=========================\n\n    Directory for storing the default configuration files.\n\npaddlehub.env.THIRD_PARTY_HOME\n================================\n\n    Directory for storing third-party libraries.\n\npaddlehub.env.TMP_HOME\n=========================\n\n    Directory for storing the temporary files generated during running, such as intermediate products of installing modules, files in this directory will generally be automatically cleared.\n\npaddlehub.env.LOG_HOME\n=========================\n\n    Directory for storing the log files generated during operation, including some non-fatal errors. The log will be stored daily.\n"
  },
  {
    "path": "docs/docs_ch/api/module.rst",
    "content": "==============\nModule\n==============\n\n.. code-block:: python\n\n    class paddlehub.Module(\n        name: str = None,\n        directory: str = None,\n        version: str = None,\n        ignore_env_mismatch: bool = False,\n        **kwargs)\n\n-----------------\n\n   In PaddleHub, a Module represents an executable module, which is usually a pre-trained model that can be used for end-to-end prediction, such as a face detection model or a lexical analysis model, or a pre-trained model that requires finetuning, such as BERT/ERNIE. When loading a Module with a specified name, if the Module does not exist locally, PaddleHub will automatically request the server or the specified Git source to download the resource.\n\n-----------------\n\n* Args:\n    * name(str | optional)\n        Module name.\n\n    * directory(str | optional)\n        Directory of the module to be loaded, only takes effect when the `name` is not specified.\n\n    * version(str | optional)\n        The version limit of the module, only takes effect when the `name` is specified. When the local Module does not meet the specified version conditions, PaddleHub will re-request the server to download the appropriate Module. Defaults to None, which means the local Module will be used; if the Module does not exist, PaddleHub will download the latest version available from the server according to the usage environment.\n    \n    * ignore_env_mismatch(bool | optional)\n        Whether to ignore the environment mismatch when installing the Module. Defaults to False.\n\n**member functions**\n=====================\n\nexport_onnx_model\n------------------\n\n    .. 
code-block:: python\n\n        def export_onnx_model(\n            dirname: str,\n            input_spec: List[paddle.static.InputSpec] = None,\n            include_sub_modules: bool = True,\n            **kwargs):\n\n    Export the model to ONNX format.\n\n    * Args:\n        * dirname(str)\n            The directory to save the onnx model.\n        \n        * input_spec(list)\n            Describes the input of the saved model's forward method, which can be described by InputSpec or example Tensor. If None, all input variables of the original Layer's forward method would be the inputs of the saved model. Default None.\n            \n        * include_sub_modules(bool)\n            Whether to export sub modules. Defaults to True.\n            \n        * \\*\\*kwargs(dict|optional)\n            Other export configuration options for compatibility, some may be removed in the future. Don't use them if not necessary. Refer to https://github.com/PaddlePaddle/paddle2onnx for more information.\n\nsave_inference_model\n----------------------\n\n    .. code-block:: python\n\n        def save_inference_model(\n            dirname: str,\n            model_filename: str = None,\n            params_filename: str = None,\n            input_spec: List[paddle.static.InputSpec] = None,\n            include_sub_modules: bool = True,\n            combined: bool = True):\n\n    Export the model to Paddle Inference format.\n\n    * Args:\n        * dirname(str)\n            The directory to save the inference model.\n\n        * model_filename(str)\n            The name of the saved model file. Defaults to `__model__`.\n\n        * params_filename(str)\n            The name of the saved parameters file, only takes effect when `combined` is True. Defaults to `__params__`.\n\n        * input_spec(list)\n            Describes the input of the saved model's forward method, which can be described by InputSpec or example Tensor. 
If None, all input variables of the original Layer's forward method would be the inputs of the saved model. Default None.\n\n        * include_sub_modules(bool)\n            Whether to export sub modules. Defaults to True.\n        \n        * combined(bool)\n            Whether to save all parameters in a combined file. Defaults to True.\n\nsub_modules\n----------------------\n\n    .. code-block:: python\n\n        def sub_modules(recursive: bool = True):\n\n    Get all sub modules.\n\n    * Args:\n        * recursive(bool): \n            Whether to get sub modules recursively. Defaults to True.\n\n**classmethod**\n=================\n\nget_py_requirements\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def get_py_requirements(cls) -> List[str]:\n\n    Get the Module's python package dependency list.\n\nload\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def load(cls, directory: str) -> Generic:\n\n    Load the Module object defined in the specified directory.\n\n    * Args:\n        * directory(str): \n            Module directory.\n\nload_module_info\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def load_module_info(cls, directory: str) -> EasyDict:\n\n    Load the Module info defined in the specified directory.\n\n    * Args:\n        * directory(str): \n            Module directory.\n\n**property**\n=================\n\nis_runnable\n-----------------\n\n    .. code-block:: python\n\n        is_runnable\n\n    Whether the Module is runnable; in other words, whether we can execute the Module through the `hub run` command.\n\nname\n-----------------\n\n    .. code-block:: python\n\n        name\n\n    Module name.\n\ndirectory\n-----------------\n\n    .. code-block:: python\n\n        directory\n\n    Directory of the Module.\n\nversion\n-----------------\n\n    .. 
code-block:: python\n\n        version\n\n    Module version.\n\ntype\n-----------------\n\n    .. code-block:: python\n\n        type\n\n    Module type.\n\nsummary\n-----------------\n\n    .. code-block:: python\n\n        summary\n\n    Module summary.\n\nauthor\n-----------------\n\n    .. code-block:: python\n\n        author\n\n    The author of the Module.\n\nauthor_email\n-----------------\n\n    .. code-block:: python\n\n        author_email\n\n    The email of the Module author.\n\n.. note::\n    Module is a factory class that is used to automatically download and load user-defined model classes. In addition to the above methods or properties, each Module has other custom methods or properties. The relevant definitions need to be viewed in the corresponding documentation.\n"
  },
  {
    "path": "docs/docs_ch/api/module_decorator.rst",
    "content": "=================\nModule Decorator\n=================\n\nmoduleinfo\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.moduleinfo(\n        name: str,\n        version: str,\n        author: str = None,\n        author_email: str = None,\n        summary: str = None,\n        type: str = None,\n        meta=None) -> Callable:\n\n-----------------\n\n   Mark Module information for a Python class, and the class will automatically be extended to inherit HubModule. In other words, Python classes marked with moduleinfo can be loaded through hub.Module.\n\n-----------------\n\n* Args:\n    * name(str)\n        Module name.\n    \n    * version(str)\n        Module version.\n\n    * author(str)\n        The author of the Module.\n\n    * author_email(str)\n        The email of the Module author.\n\n    * summary(str)\n        Module summary.\n\n    * type(str)\n        Module type.\n\n\nrunnable\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.runnable(func: Callable) -> Callable:\n\n-----------------\n\n   Mark a Module method as runnable; when the command `hub run` is used, the method will be called.\n\n-----------------\n\n* Args:\n    * func(Callable)\n        A member function of the Module.\n\nserving\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.serving(func: Callable) -> Callable:\n\n-----------------\n\n   Mark a Module method as a serving method; when the command `hub serving` is used, the method will be called.\n\n-----------------\n\n* Args:\n    * func(Callable)\n        A member function of the Module."
  },
  {
    "path": "docs/docs_ch/api/module_manager.rst",
    "content": "=======================\nLocalModuleManager\n=======================\n\n.. code-block:: python\n\n    class paddlehub.module.manager.LocalModuleManager(home: str = MODULE_HOME):\n\n-----------------\n\n   LocalModuleManager is used to manage PaddleHub's local Modules, supporting the installation, uninstallation, and search of HubModules. LocalModuleManager is a singleton object related to the path; in other words, when the LocalModuleManager object of the same home directory is generated multiple times, the same object is returned.\n\n-----------------\n\n* Args:\n    * home(str)\n       The directory where PaddleHub modules are stored; the default is ~/.paddlehub/modules.\n\n**member functions**\n=====================\n\ninstall\n------------------\n\n   .. code-block:: python\n\n      def install(\n         name: str = None,\n         directory: str = None,\n         archive: str = None,\n         url: str = None,\n         version: str = None,\n         ignore_env_mismatch: bool = False) -> HubModule:\n\n   Install a HubModule from a name, directory, archive file or URL. When installing with the name parameter, if a module that meets the conditions (both name and version) is already installed, the installation step will be skipped. When installing with the other parameters, the locally installed module will be uninstalled first.\n\n   * Args:\n      * name(str | optional)\n         Module name to install.\n\n      * directory(str | optional)\n         Directory containing the module code.\n\n      * archive(str | optional)\n         Archive file containing the module code.\n\n      * url(str | optional)\n         URL pointing to an archive file containing the module code.\n\n      * version(str | optional)\n         Module version, used with the name parameter.\n            \n      * ignore_env_mismatch(bool | optional)\n         Whether to ignore the environment mismatch when installing the Module.\n\nuninstall\n------------------\n\n   .. 
code-block:: python\n\n      def uninstall(name: str) -> bool:\n\n   Uninstall a HubModule by name.\n\n   * Args:\n      * name(str)\n         Module name to uninstall.\n\n   * Return:\n      True if uninstalled successfully, else False.\n\nlist\n------------------\n\n   .. code-block:: python\n\n      def list() -> List[HubModule]:\n\n   List all installed HubModules.\n\n   * Return:\n      List of installed HubModules.\n\nsearch\n------------------\n\n   .. code-block:: python\n\n      def search(name: str) -> HubModule:\n\n   Search for a HubModule with the specified name.\n\n\n   * Args:\n      * name(str)\n         Module name to search for.\n\n   * Return:\n      The HubModule with the specified name if found, otherwise None."
  },
  {
    "path": "docs/docs_ch/api/trainer.rst",
    "content": "==============\nTrainer\n==============\n\n.. code-block:: python\n\n    class paddlehub.Trainer(\n        model: paddle.nn.Layer,\n        optimizer: paddle.optimizer.Optimizer,\n        use_gpu: bool = False,\n        use_vdl: bool = True,\n        checkpoint_dir: str = None,\n        compare_metrics: Callable = None):\n\n-----------------\n\n   Model trainer.\n\n-----------------\n\n* Args:\n    * model(paddle.nn.Layer)\n        Model to train or evaluate.\n\n    * optimizer(paddle.optimizer.Optimizer)\n        Optimizer for loss.\n        \n    * use_gpu(bool)\n        Whether to use gpu to run.\n\n    * use_vdl(bool)\n        Whether to use visualdl to record training data.\n\n    * checkpoint_dir(str)\n        Directory where the checkpoint is saved, and the trainer will restore the state and model parameters from the checkpoint.\n\n    * compare_metrics(Callable)\n        The method of comparing the model metrics. If not specified, the main metric return by `validation_step` will be used for comparison by default, the larger the value, the better the effect. This method will affect the saving of the best model. If the default behavior does not meet your requirements, please pass in a custom method.\n\n**member functions**\n=====================\n\ntrain\n------------------\n\n    .. 
code-block:: python\n\n        def train(\n            train_dataset: paddle.io.Dataset,\n            epochs: int = 1,\n            batch_size: int = 1,\n            num_workers: int = 0,\n            eval_dataset: paddle.io.Dataset = None,\n            log_interval: int = 10,\n            save_interval: int = 10,\n            collate_fn: Callable = None):\n\n    Train a model with a specific config.\n\n    * Args:\n        * train_dataset(paddle.io.Dataset)\n            Dataset to train the model.\n\n        * epochs(int)\n            Number of training loops, default is 1.\n\n        * batch_size(int)\n            Batch size per step, default is 1.\n\n        * num_workers(int)\n            Number of subprocesses used to load data, default is 0.\n\n        * eval_dataset(paddle.io.Dataset)\n            The validation dataset, default is None. If set, the Trainer will execute the evaluate function every `save_interval` epochs.\n\n        * log_interval(int)\n            Log the training information every `log_interval` steps.\n\n        * save_interval(int)\n            Save the checkpoint every `save_interval` epochs.\n\n        * collate_fn(Callable)\n            Function to generate mini-batch data by merging the sample list. None means each field of the samples is simply stacked in axis 0 (same as `np.stack(..., axis=0)`). Default None.\n\nevaluate\n----------------------\n\n    .. 
code-block:: python\n\n        def evaluate(\n            eval_dataset: paddle.io.Dataset,\n            batch_size: int = 1,\n            num_workers: int = 0,\n            collate_fn: Callable = None):\n\n    Run evaluation and return metrics.\n\n    * Args:\n        * eval_dataset(paddle.io.Dataset)\n            The validation dataset.\n\n        * batch_size(int)\n            Batch size per step, default is 1.\n\n        * num_workers(int)\n            Number of subprocesses used to load data, default is 0.\n\n        * collate_fn(Callable)\n            Function to generate mini-batch data by merging the sample list. None means each field of the samples is simply stacked in axis 0 (same as `np.stack(..., axis=0)`). Default None."
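As a rough illustration of the default collate behavior described above (each field of the samples in a mini-batch stacked along axis 0, as in `np.stack(..., axis=0)`), the following is a hypothetical stand-in, not PaddleHub's actual implementation. A custom `collate_fn` with this signature could be passed to `train()`/`evaluate()` instead:

```python
import numpy as np

# Sketch of the default collate behavior: transpose the list of samples
# into per-field tuples, then stack each field along axis 0.
def default_collate(sample_list):
    # sample_list: [(image, label), (image, label), ...]
    fields = list(zip(*sample_list))
    return tuple(np.stack(field, axis=0) for field in fields)

samples = [(np.zeros((3, 4, 4)), np.array(0)),
           (np.ones((3, 4, 4)), np.array(1))]
images, labels = default_collate(samples)
print(images.shape)  # (2, 3, 4, 4)
print(labels.shape)  # (2,)
```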
  },
  {
    "path": "docs/docs_ch/api_index.rst",
    "content": "==============\nAPI Reference\n==============\n\n\nModule\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Module<api/module.rst>\n   LocalModuleManager<api/module_manager.rst>\n   Module Decorator<api/module_decorator.rst>\n\nTransfer Learning\n==================\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Trainer<api/trainer.rst>\n   datasets<api/datasets_index.rst>\n\nEnvironment\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   env<api/env.rst>"
  },
  {
    "path": "docs/docs_ch/community/contribute_code.md",
    "content": "# 贡献代码\n\nPaddleHub非常欢迎贡献者。\n\n首先，如果有什么不确定的事情，可随时提交问题或拉取请求。 不会有人因此而抱怨。我们会感激任何形式的贡献，不想用一堆规则来阻止这些贡献。\n\n本文档包括了所有在贡献中需要注意的要点，会加快合并代码、解决问题的速度。\n\n查看[概览](../overview.md)来初步了解。\n\n下面是一些简单的贡献指南。\n\n## 提交问题\n\n当你使用PaddleHub遇到问题时，可以通过提交[issue](https://github.com/PaddlePaddle/PaddleHub/issues)来反馈。\n\n在提出问题时，请说明以下事项：\n\n* 按照问题模板的内容来填写问题细节，以便评审者查找问题原因。\n* 出现问题的场景 (尽量详细，以便重现问题)。\n* 错误和日志消息。\n* 其它可能有用的细节信息。\n\n## 提交新功能建议/BUG修复\n\n* 在适配使用场景时，总会需要一些新的功能。 可以加入新功能的讨论，也可以直接提交新功能的Pull-Request请求。\n\n* 在自己的 github 账户下 fork PaddleHub(https://github.com/PaddlePaddle/PaddleHub)。 在 fork 后， 利用git工具（add, commit, pull, push）提交PR。 然后就可以提交拉取请求了。\n\n如何提PR，参考下列步骤：\n\n### 第一步：将自己目录下PaddleHub远程仓库clone到本地：\n\n```\nhttps://github.com/USERNAME/PaddleHub\n```\n\n### 第二步：切换到远程分支develop\n\n```\ngit checkout develop\n```\n\n### 第三步：基于远程分支develop新建本地分支new-feature\n\n```\ngit checkout -b new-feature\n```\n\n### 第四步：使用pre-commit钩子\n\nPaddleHub开发人员使用pre-commit工具来管理Git预提交钩子。它可以帮助我们格式化源代码Python，在提交（commit）前自动检查一些基本事宜（如每个文件只有一个 EOL，Git 中不要添加大文件等）。\n\npre-commit测试是 Travis-CI 中单元测试的一部分，不满足钩子的PR不能被提交到Paddle，首先安装并在当前目录运行它：\n\n```shell\n➜  pip install pre-commit\n➜  pre-commit install\n```\n\n\n### 第五步：在new-feature分支上开发你的需求，提交你的更改\n\n```\ngit commit -m \"add new feature\"\n```\n\n### 第六步：在准备发起Pull Request之前，需要同步原仓库（https://github.com/PaddlePaddle/PaddleHub ）最新的代码。\n\n通过 git remote 查看当前远程仓库的名字。\n\n```shell\n➜  git remote\norigin\n➜  git remote -v\norigin\thttps://github.com/USERNAME/PaddleHub (fetch)\norigin\thttps://github.com/USERNAME/PaddleHub (push)\n```\n\n这里 origin 是自己用户名下的PaddleHub，接下来创建一个原始PaddleHub仓库的远程主机，命名为 upstream。\n```shell\n➜  git remote add upstream https://github.com/PaddlePaddle/PaddleHub\n➜  git remote\norigin\nupstream\n```\n\n获取 upstream 的最新代码并更新当前分支。\n```shell\n➜  git fetch upstream\n➜  git pull upstream develop\n```\n\n### 第七步：推送本地分支new-feature到自己的PaddleHub库\n\n```\n➜  git push origin new-feature\n```\n\n这样你的PaddleHub库的new-feature分支包含了你的最新更改，点击上面的“pull 
request”就可以发起拉取请求了。\n\n如果评审人员给出了反馈需要继续修正代码，可以从第五步重新开始，这样所有的提交都会显示在同一个pull request中。\n\n## 代码风格和命名约定\n\n* PaddleHub 遵循 [PEP8](https://www.python.org/dev/peps/pep-0008/) 的 Python 代码命名约定。在提交拉取请求时，请尽量遵循此规范。 可通过`flake8`或`pylint`等提示工具来帮助遵循规范。\n\n## 文档\n\n文档使用了 [sphinx](http://sphinx-doc.org/) 来生成，支持 [Markdown](https://guides.github.com/features/mastering-markdown/) 和 [reStructuredText](http://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) 格式。 所有文档都在 [docs/](../../) 目录下。\n\n* 在提交文档改动前，请先**在本地生成文档**：`cd docs/ && make clean && make html`，然后，可以在 `docs/_build/html` 目录下找到所有生成的网页。 请认真分析生成日志中的**每个 WARNING**，这很有可能是**空链接**或其它问题。\n\n* 需要链接时，尽量使用**相对路径**。\n"
  },
  {
    "path": "docs/docs_ch/community/more_demos.md",
    "content": "# 第三方趣味案例\n\n以下为前期PaddleHub课程或活动中，开发者们基于PaddleHub创作的趣味实践作品，均收录在AI Studio中，可在线运行，欢迎访问，希望对您有所启发。\n1. [布剪刀石头【人脸识别切换本地窗口】](https://aistudio.baidu.com/aistudio/projectdetail/507630)\n1. [秋水中的鱼【yesok舞蹈背景抠图转换并动漫风格迁移】](http://aistudio.baidu.com/aistudio/projectdetail/517066)\n1. [Ninetailskim【在人脸上玩复古windows弹球】](https://aistudio.baidu.com/aistudio/projectdetail/518861)\n1. [乌拉__【监控口罩，语音提醒，后台记录】](https://aistudio.baidu.com/aistudio/projectdetail/506931)\n1. [九品炼丹师【影流之绿蛙蔡徐坤，人脸识别加头饰+ 人像分割变分身】](https://aistudio.baidu.com/aistudio/projectdetail/505168)\n1. [七年期限【风格迁移以及本地部署】](https://aistudio.baidu.com/aistudio/projectdetail/520453)\n1. [Fanas无敌【口红试色项目】](https://aistudio.baidu.com/aistudio/projectdetail/516520)\n1. [skywalk163【用paddlehub统计飞桨源代码词频以及词云与人像展示】](https://aistudio.baidu.com/aistudio/projectdetail/519841)\n1. [AIStudio261428【人脸识别+漫画表情包】](https://aistudio.baidu.com/aistudio/projectdetail/519616)\n1. [土豆芽【六一儿童节邀请卡通人物来做客】](https://aistudio.baidu.com/aistudio/projectdetail/520925)\n1. [大熊猫的感觉【变化的口罩】](https://aistudio.baidu.com/aistudio/projectdetail/520996)\n1. [kly1997【一键旅游+戴墨镜】](https://aistudio.baidu.com/aistudio/projectdetail/518117)\n1. [寞寞_默默【穿越到油画中】](https://aistudio.baidu.com/aistudio/projectdetail/516332)\n1. [isse7【创意项目：风格“鬼脸”变换】](https://aistudio.baidu.com/aistudio/projectdetail/515307)\n1. [Pda【人脸趣味变】](https://aistudio.baidu.com/aistudio/projectdetail/516306)\n1. [Kgkzhiwen【我的新衣】](https://aistudio.baidu.com/aistudio/projectdetail/516663)\n1. [哎呀呀好好学习【脸型自动调整】](https://aistudio.baidu.com/aistudio/projectdetail/513640)\n1. [Tfboy【证件照换底】](https://aistudio.baidu.com/aistudio/projectdetail/509443)\n1. [Leigangblog【我是明星脸】](https://aistudio.baidu.com/aistudio/projectdetail/505537)\n1. [wpb3dm【时装模特换装】](https://aistudio.baidu.com/aistudio/projectdetail/519349)\n1. [lsvine_bai【女友秒变神秘金发女神】](https://aistudio.baidu.com/aistudio/projectdetail/521784)\n1. [Lemonadeqk【简单追星】](https://aistudio.baidu.com/aistudio/projectdetail/520488)\n1. 
[XM1436gr【利用PaddleHub关键点检测实现AI换卡通脸】](https://aistudio.baidu.com/aistudio/projectdetail/514547)\n1. [旺仔【人人都是圆眼萌仔】](https://aistudio.baidu.com/aistudio/projectdetail/519222)\n1. [Arrowarcher【AI一键换发】](https://aistudio.baidu.com/aistudio/projectdetail/508270)\n1. [WHY197598【移物换景基础】](https://aistudio.baidu.com/aistudio/projectdetail/517961)\n1. [署名景逸【基于paddlehub人脸关键点检测的疲劳检测】](https://aistudio.baidu.com/aistudio/projectdetail/506024)\n1. [thunder95【PaddleHub目光表情投票】](https://aistudio.baidu.com/aistudio/projectdetail/514205)\n1. [上弦月C 【坟头蹦迪毕业照】](https://aistudio.baidu.com/aistudio/projectdetail/511253)\n1. [如意_鸡蛋【左看像周润发，右看像刘德华】](https://aistudio.baidu.com/aistudio/projectdetail/507231)\n"
  },
  {
    "path": "docs/docs_ch/community_index.rst",
    "content": "===================\n活跃社区\n===================\n..  toctree::\n    :maxdepth: 2\n    :titlesonly:\n\n    community/more_demos.md\n    \n    community/contribute_code.md\n\n------------\n\n..  toctree::\n    :maxdepth: 2\n    :titlesonly:\n\n    第一期AI创意赛 <https://aistudio.baidu.com/aistudio/competition/detail/34>\n\n    第二期AI创意赛 <https://aistudio.baidu.com/aistudio/competition/detail/35>\n\n    第一期AI创造营 <https://aistudio.baidu.com/aistudio/competition/detail/72>\n\n    第一期AI ChatBot创意赛 <https://aistudio.baidu.com/aistudio/competition/detail/79>"
  },
  {
    "path": "docs/docs_ch/conf.py",
    "content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nfrom recommonmark.transform import AutoStructify\nfrom recommonmark.parser import CommonMarkParser\n\n# sys.path.insert(0, os.path.abspath('../../paddlehub'))\n# import paddlehub as hub\n\n# -- Project information -----------------------------------------------------\n\nproject = 'PaddleHub'\ncopyright = '2021, PaddlePaddle'\nauthor = 'PaddlePaddle'\n\n# The full version, including alpha/beta/rc tags\n# release = 'v{}'.format(hub.__version__)\nrelease = 'v2.1.0'\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. 
They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n    'sphinx.ext.autodoc', 'sphinx.ext.napoleon', 'sphinx.ext.coverage', 'sphinx.ext.viewcode', 'sphinx.ext.mathjax',\n    'sphinx.ext.githubpages', 'recommonmark', 'sphinx_markdown_tables'\n]\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = ['_build', '.DS_Store']\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages.  See the documentation for\n# a list of builtin themes.\n#\nhtml_title = project\nhtml_theme = \"sphinx_materialdesign_theme\"\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\nsource_parsers = {'.md': CommonMarkParser}\n\nsource_suffix = ['.rst', '.md']\n"
  },
  {
    "path": "docs/docs_ch/faq.md",
    "content": "# 常见问题\n\n## 使用pip install paddlehub时提示\n`Could not find a version that satisfies the requirement paddlehub (from versions: )`\n\n这可能是因为pip指向了一个pypi的镜像源，该镜像源没有及时同步paddlehub版本导致。\n\n使用如下命令来安装：\n\n```shell\n$ pip install -i https://pypi.org/simple/ paddlehub\n```\n\n## 使用paddlehub时，提示\n`ModuleNotFoundError: No module named 'paddle'`\n\n这是因为PaddleHub依赖于PaddlePaddle，用户需要自行安装合适的PaddlePaddle版本。\n如果机器不支持GPU，那么使用如下命令来安装PaddlePaddle的CPU版本：\n```shell\n$ pip install paddlepaddle\n```\n\n如果机器支持GPU，则使用如下命令来安装PaddlePaddle的GPU版本：\n```shell\n$ pip install paddlepaddle-gpu\n```\n\n## 使用paddlehub时，无法下载预置数据集、module的等现象\n\n下载数据集、module等，PaddleHub要求机器可以访问外网。可以使用server_check()可以检查本地与远端PaddleHub-Server的连接状态，使用方法如下：\n\n```python\nimport paddlehub\npaddlehub.server_check()\n# 如果可以连接远端PaddleHub-Server，则显示Request Hub-Server successfully.\n# 如果无法连接远端PaddleHub-Server，则显示Request Hub-Server unsuccessfully.\n```\n\n## PaddleHub Module是否支持多线程加速预测？\n\n由于PaddlePaddle自身的限制，PaddleHub无法通过多线程来加速模型预测。\n\n## 如何修改PaddleHub的修改预训练模型存放路径？\n\n通过设置系统环境变量HUB_HOME，修改预训练模型存放路径\n"
  },
  {
    "path": "docs/docs_ch/figures.md",
    "content": "## 特性详解\n<a name=\"丰富的预训练模型\"></a>\n\n### 1、丰富的预训练模型\n\n#### 1.1、图像\n\n|            | **精品模型举例**                                             |\n| ---------- | :----------------------------------------------------------- |\n| 图像分类 | [菜品识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_dishes&en_category=ImageClassification)、[动物识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification)、[动物识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification)、[-->More](../modules/image/classification/README.md) |\n| 目标检测   | [通用检测](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_coco2017&en_category=ObjectDetection)、[行人检测](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_pedestrian&en_category=ObjectDetection)、[车辆检测](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_vehicles&en_category=ObjectDetection)、[-->More](../modules/image/object_detection/README.md) |\n| 人脸检测 | [人脸检测](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server&en_category=FaceDetection)、[口罩检测](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server_mask&en_category=FaceDetection)、[-->More](../modules/image/face_detection/README.md) |\n| 图像分割   | [人像分割](https://www.paddlepaddle.org.cn/hubdetail?name=deeplabv3p_xception65_humanseg&en_category=ImageSegmentation)、[人体解析](https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation)、[肺炎CT影像分析](https://www.paddlepaddle.org.cn/hubdetail?name=Pneumonia_CT_LKM_PP&en_category=ImageSegmentation)、[-->More](../modules/image/semantic_segmentation/README.md) |\n| 关键点检测 | 
[人体关键点](https://www.paddlepaddle.org.cn/hubdetail?name=human_pose_estimation_resnet50_mpii&en_category=KeyPointDetection)、[人脸关键点](https://www.paddlepaddle.org.cn/hubdetail?name=face_landmark_localization&en_category=KeyPointDetection)、[手部关键点](https://www.paddlepaddle.org.cn/hubdetail?name=hand_pose_localization&en_category=KeyPointDetection)、[-->More](../modules/image/keypoint_detection/README.md) |\n| 文本识别 | [超轻量中英文OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition)、[-->More](../modules/image/text_recognition/README.md) |\n| 图像生成    | [风格迁移](https://www.paddlepaddle.org.cn/hubdetail?name=stylepro_artistic&en_category=GANs)、[街景动漫画](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_hayao_99&en_category=GANs)、[-->More](../modules/image/Image_gan/README.md) |\n| 图像编辑 | [超分辨率](https://www.paddlepaddle.org.cn/hubdetail?name=realsr&en_category=ImageEditing)、[黑白上色](https://www.paddlepaddle.org.cn/hubdetail?name=deoldify&en_category=ImageEditing)、[-->More](../modules/image/Image_editing/README.md) |\n\n#### 1.2、文本\n|            | **精品模型举例**                                           |\n| ---------- | :----------------------------------------------------------- |\n| 词句分析 | [词法分析](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis)、[句法分析](https://www.paddlepaddle.org.cn/hubdetail?name=ddparser&en_category=SyntacticAnalysis)、[-->More](../modules/text/lexical_analysis/README.md) |\n| 情感分析   | [情感判断](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis)、[情绪分析](https://www.paddlepaddle.org.cn/hubdetail?name=emotion_detection_textcnn&en_category=SentimentAnalysis)、[-->More](../modules/text/sentiment_analysis/README.md) |\n| 文本审核 | [色情审核](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_gru&en_category=TextCensorship)、[-->More](../modules/text/text_review/README.md) |\n| 文本生成 | 
[对联生成](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_tiny_couplet&en_category=TextGeneration)、[情话生成](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_poetry&en_category=TextGeneration)、[藏头诗生成](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_acrostic_poetry&en_category=TextGeneration)、[土味情话](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_lover_words&en_category=TextGeneration)、[-->More](../modules/text/text_generation/README.md) |\n| 语义模型   | [ERNIE](https://www.paddlepaddle.org.cn/hubdetail?name=ERNIE&en_category=SemanticModel)、[文本相似度](https://www.paddlepaddle.org.cn/hubdetail?name=simnet_bow&en_category=SemanticModel)、[-->More](../modules/text/language_model/README.md) |\n\n#### 1.3、语音\n|            | **精品模型举例**                                           |\n| ---------- | :----------------------------------------------------------- |\n| 语音合成   | [语音合成](https://www.paddlepaddle.org.cn/hubdetail?name=deepvoice3_ljspeech&en_category=TextToSpeech)、[-->More](../modules/audio/README.md) |\n\n#### 1.4、视频\n|            | **精品模型举例**                                       |\n| ---------- | :----------------------------------------------------------- |\n| 视频分类 | [视频分类](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=VideoClassification)、[-->More](../modules/video/README.md) |\n\n<a name=\"一键模型预测\"></a>\n\n### 2、一键模型预测\n\n\n* 例如，使用文字识别轻量级中文OCR模型chinese_ocr_db_crnn_mobile，即可一键快速识别图片中的文字。\n```shell\n$ pip install paddlehub\n$ wget https://paddlehub.bj.bcebos.com/model/image/ocr/test_ocr.jpg\n$ hub run chinese_ocr_db_crnn_mobile --input_path test_ocr.jpg --visualization=True\n```\n\n* 预测结果图片保存在当前运行路径下ocr_result文件夹中，如下图所示。\n\n<p align=\"center\">\n <img src=\"../imgs/ocr_res.jpg\" width='70%' align=\"middle\" >\n</p>\n\n* 使用词法分析模型LAC进行分词\n```shell\n$ hub run lac --input_text 
\"现在，慕尼黑再保险公司不仅是此类行动的倡议者，更是将其大量气候数据整合进保险产品中，并与公众共享大量天气信息，参与到新能源领域的保障中。\"\n[{\n    'word': ['现在', '，', '慕尼黑再保险公司', '不仅', '是', '此类', '行动', '的', '倡议者', '，', '更是', '将', '其', '大量', '气候', '数据', '整合', '进', '保险', '产品', '中', '，', '并', '与', '公众', '共享', '大量', '天气', '信息', '，', '参与', '到', '新能源', '领域', '的', '保障', '中', '。'],\n    'tag':  ['TIME', 'w', 'ORG', 'c', 'v', 'r', 'n', 'u', 'n', 'w', 'd', 'p', 'r', 'a', 'n', 'n', 'v', 'v', 'n', 'n', 'f', 'w', 'c', 'p', 'n', 'v', 'a', 'n', 'n', 'w', 'v', 'v', 'n', 'n', 'u', 'vn', 'f', 'w']\n}]\n```\n\n除了一行代码预测之外，PaddleHub也支持使用API调用模型的方式，可以参考每个模型的详细文档。\n\n<a name=\"一键模型转服务\"></a>\n\n### 3、一键模型转服务\n\nPaddleHub提供便捷的模型转服务的能力，只需简单一行命令即可完成模型的HTTP服务部署。通过以下命令即可快速启动LAC词法分析服务：\n\n```shell\n$ hub serving start -m chinese_ocr_db_crnn_mobile\n```\n\n更多关于模型服务化使用说明参见[PaddleHub模型一键服务化部署](./tutorial/serving.md)。\n\n\n\n<a name=\"十行代码迁移学习\"></a>\n\n### 4、十行代码迁移学习\n\n通过Fine-tune API，只需要少量代码即可完成深度学习模型在计算机视觉场景下的迁移学习。\n\n* [Demo示例](../demo)提供丰富的Fine-tune API的使用代码，包括[图像分类](../demo/image_classification)、[图像着色](../demo/colorization)、[风格迁移](../demo/style_transfer)、等场景的模型迁移示例。\n\n<p align=\"center\">\n <img src=\"../imgs/paddlehub_finetune.gif\" align=\"middle\"  \n</p>\n\n<p align='center'>\n 十行代码完成工业级文本分类\n</p>\n\n* 如需在线快速体验，请点击[PaddleHub教程合集](https://aistudio.baidu.com/aistudio/projectdetail/231146)，可使用AI Studio平台提供的GPU算力进行快速尝试。\n"
  },
  {
    "path": "docs/docs_ch/finetune/audio_classification.md",
    "content": "# 声音分类\n\n本示例展示如何使用PaddleHub Fine-tune API以及CNN14等预训练模型完成声音分类和Tagging的任务。\n\nCNN14等预训练模型的详情，请参考论文[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)和代码[audioset_tagging_cnn](https://github.com/qiuqiangkong/audioset_tagging_cnn)。\n\n\n## 如何开始Fine-tune\n\n我们以环境声音分类公开数据集[ESC50](https://github.com/karolpiczak/ESC-50)为示例数据集，可以运行下面的命令，在训练集（train.npz）上进行模型训练，并在开发集（dev.npz）验证。通过如下命令，即可启动训练。\n\n```python\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n```python\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task='sound-cls', num_class=ESC50.num_class)\n```\n\n其中，参数：\n- `name`: 模型名称，可以选择`panns_cnn14`、`panns_cnn10` 和`panns_cnn6`，具体的模型参数信息可见下表。\n- `version`: module版本号\n- `task`：模型的执行任务。`sound-cls`表示声音分类任务；`None`表示Audio Tagging任务。\n- `num_classes`：表示当前声音分类任务的类别数，根据具体使用的数据集确定。\n\n目前可选用的预训练模型：\n模型名      | PaddleHub Module\n-----------| :------:\nCNN14      | `hub.Module(name='panns_cnn14')`\nCNN10      | `hub.Module(name='panns_cnn10')`\nCNN6       | `hub.Module(name='panns_cnn6')`\n\n### Step2: 加载数据集\n\n```python\ntrain_dataset = ESC50(mode='train')\ndev_dataset = ESC50(mode='dev')\n```\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 
主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=50,\n    batch_size=16,\n    eval_dataset=dev_dataset,\n    save_interval=10,\n)\ntrainer.evaluate(dev_dataset, batch_size=16)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: workers的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用训练好的模型对它进行分类，输出结果。\n\n```python\nimport os\n\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ncheckpoint = './best_model/model.pdparams'  # 模型checkpoint\n\nlabel_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\nmodel = hub.Module(name='panns_cnn14',\n                    version='1.0.0',\n                    task='sound-cls',\n                    num_class=ESC50.num_class,\n                    label_map=label_map,\n                    load_checkpoint=checkpoint)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\nprint(result[0])  # result[0]包含音频文件属于各类别的概率值\n```\n\n\n## Audio Tagging\n\n当前使用的模型是基于[Audioset数据集](https://research.google.com/audioset/)的预训练模型，除了以上的针对特定声音分类数据集的finetune任务，模型还支持基于Audioset 527个标签的Tagging功能。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用预训练模型对它进行打分，输出top 
10的标签和对应的得分。\n\n```python\nimport os\n\nimport librosa\nimport numpy as np\n\nimport paddlehub as hub\nfrom paddlehub.env import MODULE_HOME\n\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ntopk = 10  # 展示音频得分前10的标签和分数\n\n# 读取audioset数据集的label文件\nlabel_file = os.path.join(MODULE_HOME, 'panns_cnn14', 'audioset_labels.txt')\nlabel_map = {}\nwith open(label_file, 'r') as f:\n    for i, l in enumerate(f.readlines()):\n        label_map[i] = l.strip()\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task=None, label_map=label_map)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\n# 打印topk的类别和对应得分\nmsg = ''\nfor label, score in list(result[0].items())[:topk]:\n    msg += f'{label}: {score}\\n'\nprint(msg)\n```\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.1.0\n"
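上面的Tagging示例直接取 `result[0]` 的前 topk 项，这依赖于返回结果已按得分降序排列。若不确定顺序，可以参考下面的示意代码显式排序（其中 `scores` 为假设的示例数据）：

```python
# 示意代码：从“标签 -> 得分”的字典中取出得分最高的 topk 个标签。
# scores 为假设的示例数据，实际使用时替换为 result[0]。
scores = {'Cat': 0.83, 'Speech': 0.05, 'Meow': 0.77, 'Purr': 0.41}
topk = 2

# 按得分降序排序后截取前 topk 项
top_labels = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:topk]
for label, score in top_labels:
    print(f'{label}: {score}')
# Cat: 0.83
# Meow: 0.77
```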
  },
  {
    "path": "docs/docs_ch/finetune/customized_dataset.md",
    "content": "# 自定义数据\n\n训练一个新任务时，如果从零开始训练时，这将是一个耗时的过程，并且效果可能达不到理想的效果，此时您可以利用PaddleHub提供的预训练模型进行具体任务的Fine-tune。您只需要对自定义数据进行相应的预处理，随后输入预训练模型中，即可得到相应的结果。请参考如下内容设置数据集的结构。\n\n## 一、图像分类数据集\n\n利用PaddleHub迁移分类任务使用自定义数据时，需要切分数据集，将数据集切分为训练集、验证集和测试集。\n\n### 数据准备\n\n需要三个文本文件来记录对应的图片路径和标签，此外还需要一个标签文件用于记录标签的名称。\n```\n├─data: 数据目录\n  ├─train_list.txt：训练集数据列表\n  ├─test_list.txt：测试集数据列表\n  ├─validate_list.txt：验证集数据列表\n  ├─label_list.txt：标签列表\n  └─...\n```\n训练/验证/测试集的数据列表文件的格式如下\n```\n图片1路径 图片1标签\n图片2路径 图片2标签\n...\n```\nlabel_list.txt的格式如下\n```\n分类1名称\n分类2名称\n...\n```\n\n示例：\n以[Flower数据集](../reference/datasets.md)为示例，train_list.txt/test_list.txt/validate_list.txt内容如下示例\n```\nroses/8050213579_48e1e7109f.jpg 0\nsunflowers/45045003_30bbd0a142_m.jpg 3\ndaisy/3415180846_d7b5cced14_m.jpg 2\n```\n\nlabel_list.txt内容如下：\n```\nroses\ntulips\ndaisy\nsunflowers\ndandelion\n```\n\n### 数据集加载\n\n数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。具体使用如下：\n\n```python\nfrom paddlehub.datasets import Flowers\n\nflowers = Flowers(transforms)\nflowers_validate = Flowers(transforms, mode='val')\n```\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n## 二、图像着色数据集\n\n利用PaddleHub迁移着色任务使用自定义数据时，需要切分数据集，将数据集切分为训练集和测试集。\n\n### 数据准备\n\n需要将准备用于着色训练和测试的彩色图像分成训练集数据和测试集数据。\n```\n├─data: 数据目录\n  ├─train：训练集数据\n      |-图片文件夹1\n      |-图片文件夹2\n      |-……\n      |-图片1\n      |-图片2\n      |-……\n\n  ├─test：测试集数据\n    |-图片文件夹1\n    |-图片文件夹2\n    |-……\n    |-图片1\n    |-图片2\n    |-……\n  └─……\n```\n\n示例：\nPaddleHub为用户提供了用于着色的数据集`Canvas数据集`， 它由1193张莫奈风格和400张梵高风格的图像组成，以[Canvas数据集](../reference/datasets.md)为示例，train文件夹内容如下:\n\n```\n├─train：训练集数据\n      |-monet\n          |-图片1\n          |-图片2\n          |-……  \n      |-vango\n          |-图片1\n          |-图片2\n          |-……\n```\n\n### 数据集加载\n\n数据集的准备代码可以参考 [canvas.py](../../paddlehub/datasets/canvas.py)。`hub.datasets.Canvas()` 
会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。具体使用如下：\n\n```python\nfrom paddlehub.datasets import Canvas\n\ncolor_set = Canvas(transforms, mode='train')\n```\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, 默认为`train`。\n\n## 三、风格迁移数据集\n\n利用PaddleHub进行风格迁移任务使用自定义数据时，需要切分数据集，将数据集切分为训练集和测试集。\n\n### 数据准备\n\n需要将准备用于风格迁移的彩色图像分成训练集和测试集数据。\n\n```\n├─data: 数据目录\n  ├─train：训练集数据\n      |-图片文件夹1\n      |-图片文件夹2\n      |-...\n      |-图片1\n      |-图片2\n      |-...\n\n  ├─test：测试集数据\n    |-图片文件夹1\n    |-图片文件夹2\n    |-...\n    |-图片1\n    |-图片2\n    |-...\n  |- 21styles\n    |-图片1\n    |-图片2\n  └─...\n```\n\n示例：\nPaddleHub为用户提供了用于风格迁移的数据集`MiniCOCO数据集`, 训练集数据和测试集数据来源于COCO2014， 其中训练集有2001张图片，测试集有200张图片。 `21styles`文件夹下存放着21张不同风格的图片，用户可以根据自己的需求更换不同风格的图片。以[MiniCOCO数据集](../reference/datasets.md)为示例，train文件夹内容如下:\n\n```\n├─train：训练集数据\n      |-train\n          |-图片1\n          |-图片2\n          |-……  \n      |-test\n          |-图片1\n          |-图片2\n          |-……\n      |-21styles\n          |-图片1\n          |-图片2\n          |-……\n```\n\n### 数据集加载\n\n数据集的准备代码可以参考 [minicoco.py](../../paddlehub/datasets/minicoco.py)。`hub.datasets.MiniCOCO()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。具体使用如下：\n\n```python\nfrom paddlehub.datasets import MiniCOCO\n\ncolor_set = MiniCOCO(transforms, mode='train')\n```\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, 默认为`train`。\n\n## 四、文本分类数据集\n\n利用PaddleHub进行文本分类任务使用自定义数据时，需要切分数据集，将数据集切分为训练集和测试集。\n\n### 数据准备\n\n#### 1. 设置数据集目录\n\n用户需要将数据集目录设定为如下格式：\n```shell\n├──data: 数据目录\n   ├── train.txt: 训练集数据\n   ├── dev.txt: 验证集数据\n   └── test.txt: 测试集数据\n```\n\n#### 2. 
设置文件格式和内容\n\n训练/验证/测试集的数据文件的编码格式建议为utf8格式。内容的第一列是文本类别标签，第二列为文本内容，列与列之间以Tab键分隔。建议在数据集文件第一行填写列说明\"label\"和\"text_a\"，中间以Tab键分隔，示例如下：\n```shell\nlabel    text_a\n房产    昌平京基鹭府10月29日推别墅1200万套起享97折\n教育    贵州2011高考录取分数线发布理科一本448分\n社会    众多白领因集体户口面临结婚难题\n...\n```\n\n### 数据集加载\n\n加载文本分类的自定义数据集，用户仅需要继承基类TextClassificationDataset，修改数据集存放地址以及类别即可，具体可以参考如下代码：\n\n```python\nfrom paddlehub.datasets.base_nlp_dataset import TextClassificationDataset\n\nclass MyDataset(TextClassificationDataset):\n    # 数据集存放目录\n    base_path = '/path/to/dataset'\n    # 数据集的标签列表\n    label_list=['体育', '科技', '社会', '娱乐', '股票', '房产', '教育', '时政', '财经', '星座', '游戏', '家居', '彩票', '时尚']\n\n    def __init__(self, tokenizer, max_seq_len: int = 128, mode: str = 'train'):\n        if mode == 'train':\n            data_file = 'train.txt'\n        elif mode == 'test':\n            data_file = 'test.txt'\n        else:\n            data_file = 'dev.txt'\n        super().__init__(\n            base_path=self.base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_list=self.label_list,\n            is_file_with_header=True)\n\n\n# 选择所需要的模型，获取对应的tokenizer\nimport paddlehub as hub\nmodel = hub.Module(name='ernie_tiny', task='seq-cls', num_classes=len(MyDataset.label_list))\ntokenizer = model.get_tokenizer()\n\n# 实例化训练集\ntrain_dataset = MyDataset(tokenizer)\n```\n\n至此用户可以通过MyDataset实例化获取对应的数据集，可以通过hub.Trainer对预训练模型`model`完成文本分类任务，详情可参考[PaddleHub文本分类demo](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)。\n\n## 五、序列标注数据集\n\n利用PaddleHub进行序列标注任务使用自定义数据时，需要切分数据集，将数据集切分为训练集和测试集。\n\n### 数据准备\n\n#### 1. 设置数据集目录\n\n用户需要将数据集目录设定为如下格式：\n```shell\n├──data: 数据目录\n   ├── train.txt: 训练集数据\n   ├── dev.txt: 验证集数据\n   └── test.txt: 测试集数据\n```\n\n#### 2. 
设置文件格式和内容\n\n训练/验证/测试集的数据文件的编码格式建议为utf8格式。内容的第一列是文本内容, 第二列为文本中每个token对应的标签。需要注意的是，在文本和标签中，都需使用分隔符(该例子中使用的是斜杠`/`)隔开不同的token。  \n列与列之间以Tab键分隔。建议在数据集文件第一行填写列说明\"text_a\"和\"label\"，中间以Tab键分隔，示例如下：\n```shell\ntext_a    label\n5/月/1/2/日/，/北/京/市/怀/柔/县/民/政/局/、/畜/牧/局/领/导/来/到/驻/守/在/偏/远/山/区/的/武/警/北/京/一/总/队/十/支/队/十/四/中/队/。    O/O/O/O/O/O/B-LOC/I-LOC/I-LOC/B-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/O/B-ORG/I-ORG/I-ORG/O/O/O/O/O/O/O/O/O/O/O/O/B-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/O\n他/每/年/还/为/河/北/农/业/大/学/扶/助/多/名/贫/困/学/生/。    O/O/O/O/O/B-ORG/I-ORG/I-ORG/I-ORG/I-ORG/I-ORG/O/O/O/O/O/O/O/O/O\n...\n```\n\n### 数据集加载\n\n加载序列标注的自定义数据集，用户仅需要继承基类SeqLabelingDataset，修改数据集存放地址、类别信息和分隔符即可，具体可以参考如下代码：\n\n```python\nfrom paddlehub.datasets.base_nlp_dataset import SeqLabelingDataset\n\nclass MyDataset(SeqLabelingDataset):\n    # 数据集存放目录\n    base_path = '/path/to/dataset'\n    # 数据集的标签列表\n    label_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\n    label_map = {idx: label for idx, label in enumerate(label_list)}\n    # 数据文件使用的分隔符\n    split_char = '/'\n\n    def __init__(self, tokenizer, max_seq_len: int = 128, mode: str = 'train'):\n        if mode == 'train':\n            data_file = 'train.txt'\n        elif mode == 'test':\n            data_file = 'test.txt'\n        else:\n            data_file = 'dev.txt'\n        super().__init__(\n                    base_path=self.base_path,\n                    tokenizer=tokenizer,\n                    max_seq_len=max_seq_len,\n                    mode=mode,\n                    data_file=data_file,\n                    label_file=None,\n                    label_list=self.label_list,\n                    split_char=self.split_char,\n                    is_file_with_header=True)\n\n# 选择所需要的模型，获取对应的tokenizer\nimport paddlehub as hub\nmodel = hub.Module(name='ernie_tiny', task='token-cls', label_map=MyDataset.label_map)\ntokenizer = model.get_tokenizer()\n\n# 实例化训练集\ntrain_dataset = 
MyDataset(tokenizer)\n```\n\n至此，用户即可通过MyDataset实例化获取对应的数据集，并通过hub.Trainer对预训练模型`model`完成序列标注任务，详情可参考[PaddleHub序列标注demo](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)。\n"
  },
  {
    "path": "docs/docs_ch/finetune/image_classification.md",
    "content": "# 图像分类\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```shell\n$ hub run resnet50_vd_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用resnet50_vd_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransforms = T.Compose([T.Resize((256, 256)),\n                        T.CenterCrop(224),\n                        T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                        to_rgb=True)\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Flowers\n\nflowers = Flowers(transforms)\n\nflowers_validate = Flowers(transforms, mode='val')\n```\n\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name=\"resnet50_vd_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n```\n* `name`: 选择预训练模型的名字。\n* `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\nPaddleHub提供许多图像分类预训练模型，如xception、mobilenet、efficientnet等，详细信息参见[图像分类模型](https://www.paddlepaddle.org.cn/hub?filter=en_category&value=ImageClassification)。\n\n如果想尝试efficientnet模型，只需要更换Module中的`name`参数即可.\n```python\n# 更换name参数即可无缝切换efficientnet模型, 代码示例如下\nmodel = hub.Module(name=\"efficientnetb7_imagenet\")\n```\n**NOTE:**目前部分模型还没有完全升级到2.0版本，敬请期待。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, 
checkpoint_dir='img_classification_ckpt')\n\ntrainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等, 其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n\n    model = hub.Module(name='resnet50_vd_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n    result = model.predict(['flower.jpg'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线分类任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m resnet50_vd_imagenet_ssld\n```\n\n这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\n\ndata = {'images':[cv2_to_base64(org_im)], 'top_k':2}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/resnet50_vd_imagenet_ssld\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = r.json()[\"results\"]['data']\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/finetune/image_colorization.md",
    "content": "# 图像着色\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run user_guided_colorization --input_path \"/PATH/TO/IMAGE\"\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用user_guided_colorization模型对[Canvas](../../docs/reference/datasets.md#class-hubdatasetsCanvas)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='NEAREST'),\n                       T.RandomPaddingCrop(crop_size=176),\n                       T.RGB2LAB()], to_rgb=True)\n```\n\n`transforms`数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n**NOTE:** 要将`T.Compose`中`to_rgb`设定为True.\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Canvas\n\ncolor_set = Canvas(transform=transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test` 默认为`train`。\n\n数据集的准备代码可以参考 [canvas.py](../../paddlehub/datasets/canvas.py)。`hub.datasets.Canvas()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='user_guided_colorization', load_checkpoint=None)\nmodel.set_config(classification=True, prob=1)\n```\n* `name`:加载模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n* `classification`: 着色模型分两部分训练，开始阶段`classification`设置为True, 用于浅层网络训练。训练后期将`classification`设置为False, 用于训练网络的输出层。\n* `prob`: 每张输入图不加一个先验彩色块的概率，默认为1，即不加入先验彩色块。例如，当`prob`设定为0.9时，一张图上有两个先验彩色块的概率为(1-0.9)*(1-0.9)*0.9=0.009.\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\ntrainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, 
`Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-4；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证过程所用的数据集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n    model.set_config(prob=0.1)\n    result = model.predict(images=['house.png'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一样。若想获取油画风着色效果，请下载参数文件[油画着色](https://paddlehub.bj.bcebos.com/dygraph/models/canvas_rc.pdparams)。\n\n**Args**\n* `images`: 原始图像路径或者BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认为'result'。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线着色任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m user_guided_colorization\n```\n\n这样就完成了一个着色任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/user_guided_colorization\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0]['fake_reg'])\ncv2.imwrite('color.png', data)\n```\n\n### 查看代码\n\nhttps://github.com/richzhang/colorization-pytorch\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/finetune/semantic_segmentation.md",
    "content": "# 图像分割\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ocrnet_hrnetw18_voc模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n\n**Args**\n* `images`: 原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m ocrnet_hrnetw18_voc\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ocrnet_hrnetw18_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/finetune/sequence_labeling.md",
    "content": "# 序列标注\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 如何开始Fine-tune\n\n\n我们以微软亚洲研究院发布的中文实体识别数据集MSRA-NER为示例数据集，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）上验证。通过如下命令，即可启动训练。\n\n```shell\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n在命名实体识别的任务中，因不同的数据集标识实体的标签不同，评测的方式也有所差异。因此，在初始化模型之前，需要先确定实际标签的形式，下方的`label_list`则是MSRA-NER数据集中使用的标签类别。  \n如果用户使用的实体识别的数据集的标签方式与MSRA-NER不同，则需要自行根据数据集确定。\n```python\nlabel_list = hub.datasets.MSRA_NER.label_list\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n```\n\n接下来创建任务所使用的`model`\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.1', task='token-cls', label_map=label_map)\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`token-cls`，表示序列标注任务。\n* `label_map`：数据集中的标签信息，实体识别任务中需要根据不同标签种类对模型性能进行评价。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持序列标注任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese      
           | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(name='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(name='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English             | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上的一行代码，`model`初始化为一个适用于序列标注任务的模型，为ERNIE Tiny的预训练模型后拼接上一个输出token共享的全连接网络（Fully Connected）。  \n![](https://ss1.bdstatic.com/70cFuXSh_Q1YnxGkpoWK1HF6hhy/it/u=224484727,3049769188&fm=15&gp=0.jpg)\n\n以上图片来自于：https://arxiv.org/pdf/1810.04805.pdf\n\n### Step2: 下载并加载数据集\n\n```python\ntrain_dataset = hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = 
hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `dev`, `test`，默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png)\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png)\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='test_ernie_token_cls', use_gpu=False)\n\ntrainer.train(train_dataset, epochs=3, batch_size=32, eval_dataset=dev_dataset)\n\n# 在测试集上评估当前训练模型\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`, `AdamW`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`AdamW`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用GPU训练，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们以下列数据为待预测数据，使用该模型来进行预测：\n\n```text\n去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？\n新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。\n根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。\n华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。\n```\n\n```python\nimport paddlehub as hub\n\nsplit_char = \"\\002\"\nlabel_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\ntext_a = [\n    '去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？',\n    '新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。',\n    '根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。',\n    '华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。',\n]\ndata = [[split_char.join(text)] for text in text_a]\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.1',\n    task='token-cls',\n    load_checkpoint='./token_cls_save_dir/best_model/model.pdparams',\n    label_map=label_map,\n)\n\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(text_a):\n    print(f'Data: {text} \\t Label: {\", \".join(results[idx][1:len(text)+1])}')\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/finetune/style_transfer.md",
    "content": "# 风格迁移\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run msgnet --input_path \"/PATH/TO/ORIGIN/IMAGE\" --style_path \"/PATH/TO/STYLE/IMAGE\"\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用msgnet模型对[MiniCOCO](../../docs/reference/datasets.md#class-hubdatasetsMiniCOCO)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets.minicoco import MiniCOCO\n\nstyledata = MiniCOCO(transform=transform, mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`， 默认为`train`。\n\n数据集的准备代码可以参考 [minicoco.py](../../paddlehub/datasets/minicoco.py)。`hub.datasets.MiniCOCO()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='msgnet', load_checkpoint=None)\n```\n* `name`: 选择预训练模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\ntrainer.train(styledata, epochs=101, batch_size=4, eval_dataset=styledata, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-4；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n    result = model.predict(origin=[\"venice-boat.jpg\"], style=\"candy.jpg\", visualization=True, save_path='style_tranfer')\n```\n\n参数配置正确后，请执行脚本`python predict.py`，加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**Args**\n* `origin`: 原始图像路径或BGR格式图片；\n* `style`: 风格图像路径；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'style_tranfer'。\n\n**NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线风格迁移服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m msgnet\n```\n\n这样就完成了一个风格迁移服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\nstyle_im = cv2.imread('/PATH/TO/STYLE/IMAGE')\ndata = {'images':[[cv2_to_base64(org_im)], cv2_to_base64(style_im)]}\nheaders = 
{\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/msgnet\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0])\ncv2.imwrite('style.png', data)\n```\n\n### 查看代码\n\nhttps://github.com/zhanghang1989/PyTorch-Multi-Style-Transfer\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/finetune/text_matching.md",
    "content": "# 文本匹配\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 文本匹配\n\n使用预训练模型ERNIE完成文本匹配任务，大家可能会想到将query和title文本拼接，之后输入ERNIE中，取`CLS`特征（pooled_output），之后输出全连接层，进行二分类。如下图ERNIE用于句对分类任务的用法：\n\n![](https://camo.githubusercontent.com/5e1867ee2b6fc3a0f94c7b2c87a4d987fed4c440d4d9c80726e5798900880027/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f34353434303032396330373234306164383964363635633562313736653633323937653935383465316461323465303262373964643534666239393066373461)\n\n然而，以上用法的问题在于，ERNIE的模型参数非常庞大，导致计算量非常大，预测的速度也不够理想。从而达不到线上业务的要求。针对该问题，使用Sentence Transformer网络可以优化计算量。\n\nSentence Transformer采用了双塔（Siamese）的网络结构。Query和Title分别输入Transformer网络，共享网络参数，得到各自的token embedding特征。之后对token embedding进行pooling（此处教程使用mean pooling操作），之后输出分别记作u，v。之后将三个表征（u,v,|u-v|)拼接起来，进行二分类。网络结构如下图所示。\n\n![](https://camo.githubusercontent.com/80e65553f0c82886a27897a0a151ee9745e6e2def310d6649c8a68e2672c06c2/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f31303339393837303365313334613731383438383335313161353338363230653136666564303435653236313464636338616661636563343436363030343338)\n\n更多关于Sentence Transformer的信息可以参考论文：https://arxiv.org/abs/1908.10084\n\n## 
如何开始Fine-tune\n\n\n我们以中文文本匹配数据集LCQMC为示例数据集，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）上验证、在测试集（test.tsv）上测试。\n\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.2', task='text-matching')\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`text-matching`，表示文本匹配任务。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持文本匹配任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese                | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(name='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(name='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English         
    | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上的一行代码，`model`初始化为一个适用于文本匹配任务的双塔（Siamese）结构模型。\n\n\n### Step2: 下载并加载数据集\n\n```python\nfrom paddlehub.datasets import LCQMC\n\ntrain_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `dev`, `test`，默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=10,\n    batch_size=32,\n    eval_dataset=dev_dataset,\n    
save_interval=2,\n)\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: workers的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size。\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将使用最优模型来进行预测：\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个表情叫什么', '这个猫的表情叫什么'],\n    ['什么是智能手环', '智能手环有什么用'],\n    ['介绍几本好看的都市异能小说，要完结的！', '求一本好看点的都市异能小说，要完结的'],\n    ['一只蜜蜂落在日历上（打一成语）', '一只蜜蜂停在日历上（猜一成语）'],\n    ['一盒香烟不拆开能存放多久？', '一条没拆封的香烟能存放多久。'],\n]\nlabel_map = {0: 'similar', 1: 'dissimilar'}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.2',\n    task='text-matching',\n    load_checkpoint='./checkpoint/best_model/model.pdparams',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=128, batch_size=1, use_gpu=True)\nfor idx, texts in enumerate(data):\n    print('TextA: {}\\tTextB: {}\\t Label: {}'.format(texts[0], texts[1], results[idx]))\n```\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_ch/get_start/installation.rst",
    "content": "============\n安装\n============\n\n\n环境依赖\n========================\n\n* 操作系统：Windows/Mac/Linux\n* Python >= 3.6.2\n* PaddlePaddle >= 2.0.0\n\n.. code-block:: shell\n\n    # 安装gpu版本的PaddlePaddle\n    pip install paddlepaddle-gpu -U\n\n    # 或者安装cpu版本的paddlepaddle\n    # pip install paddlepaddle -U\n\n安装命令\n========================\n\n在安装PaddleHub之前，请先安装PaddlePaddle深度学习框架，更多安装说明请查阅 `飞桨快速安装 <https://www.paddlepaddle.org.cn/install/quick>`_ 。\n\n.. code-block:: shell\n\n    pip install paddlehub==2.1.0\n\n除上述依赖外，PaddleHub的预训练模型和预置数据集需要连接服务端进行下载，请确保机器可以正常访问网络。若本地已存在相关的数据集和预训练模型，则可以离线运行PaddleHub。\n\n.. note::\n\n    使用PaddleHub下载数据集、预训练模型等，要求机器可以访问外网。可以使用 ``server_check()`` 检查本地与远端PaddleHub-Server的连接状态，使用方法如下：\n\n.. code-block:: python\n\n    import paddlehub\n    paddlehub.server_check()\n    # 如果可以连接远端PaddleHub-Server，则显示Request Hub-Server successfully。\n    # 如果无法连接远端PaddleHub-Server，则显示Request Hub-Server unsuccessfully。\n"
  },
  {
    "path": "docs/docs_ch/get_start/linux_quickstart.md",
    "content": "# 零基础Linux安装并实现图像风格迁移\n\n## 第1步：安装Anaconda\n\n- 说明：使用paddlepaddle需要先安装python环境，这里我们选择python集成环境Anaconda工具包\n  - Anaconda是1个常用的python包管理程序\n  - 安装完Anaconda后，可以安装python环境，以及numpy等所需的工具包环境\n  \n- **下载Anaconda**：\n  \n  - 下载地址：https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n    - <img src=\"../../imgs/Install_Related/linux/anaconda_download.png\" alt=\"anaconda download\" width=\"800\" align=\"center\"/>\n    - 选择适合您操作系统的版本\n      - 可在终端输入`uname -m`查询系统所用的指令集\n    \n  - 下载法1：本地下载，再将安装包传到linux服务器上\n  \n  - 下载法2：直接使用linux命令行下载\n  \n    - ```shell\n      # 首先安装wget\n      sudo apt-get install wget  # Ubuntu\n      sudo yum install wget  # CentOS\n      ```\n  \n    - ```shell\n      # 然后使用wget从清华源上下载\n      # 如要下载Anaconda3-2021.05-Linux-x86_64.sh，则下载命令如下：\n      wget https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/Anaconda3-2021.05-Linux-x86_64.sh\n      \n      # 若您要下载其他版本，需要将最后1个/后的文件名改成您希望下载的版本\n      ```\n  \n- 安装Anaconda：\n\n  - 在命令行输入`sh Anaconda3-2021.05-Linux-x86_64.sh`\n    - 若您下载的是其它版本，则将该命令的文件名替换为您下载的文件名\n  - 按照安装提示安装即可\n    - 查看许可时可输入q来退出\n  \n- **将conda加入环境变量**\n\n  - 加入环境变量是为了让系统能识别conda命令，若您在安装时已将conda加入环境变量path，则可跳过本步\n\n  - 在终端中打开`~/.bashrc`：\n\n    - ```shell\n      # 在终端中输入以下命令：\n      vim ~/.bashrc\n      ```\n\n  - 在`~/.bashrc`中将conda添加为环境变量：\n\n    - ```shell\n      # 先按i进入编辑模式\n      # 在第一行输入：\n      export PATH=\"~/anaconda3/bin:$PATH\"\n      # 若安装时自定义了安装位置，则将~/anaconda3/bin改为自定义的安装目录下的bin文件夹\n      ```\n\n    - ```shell\n      # 修改后的~/.bashrc文件应如下（其中xxx为用户名）：\n      export PATH=\"~/anaconda3/bin:$PATH\"\n      # >>> conda initialize >>>\n      # !! Contents within this block are managed by 'conda init' !!\n      __conda_setup=\"$('/home/xxx/anaconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)\"\n      if [ $? -eq 0 ]; then\n          eval \"$__conda_setup\"\n      else\n          if [ -f \"/home/xxx/anaconda3/etc/profile.d/conda.sh\" ]; then\n              . \"/home/xxx/anaconda3/etc/profile.d/conda.sh\"\n          else\n              export PATH=\"/home/xxx/anaconda3/bin:$PATH\"\n          fi\n      fi\n      unset __conda_setup\n      # <<< conda initialize <<<\n      ```\n\n    - 修改完成后，先按`esc`键退出编辑模式，再输入`:wq!`并回车，以保存退出\n\n  - 验证是否能识别conda命令：\n\n    - 在终端中输入`source ~/.bashrc`以更新环境变量\n    - 再在终端输入`conda info --envs`，若能显示当前有base环境，则conda已加入环境变量\n\n## 第2步：创建conda环境\n\n- 创建新的conda环境\n\n  - ```shell\n    # 在命令行输入以下命令，创建名为paddle_env的环境\n    # 此处为加速下载，使用清华源\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/\n    ```\n\n  - 该命令会创建1个名为paddle_env、python版本为3.8的可执行环境，根据网络状态，需要花费一段时间\n\n  - 之后命令行中会输出提示信息，输入y并回车继续安装\n\n    - <img src=\"../../imgs/Install_Related/linux/conda_create.png\" alt=\"conda_create\" width=\"500\" align=\"center\"/>\n\n- 激活刚创建的conda环境，在命令行中输入以下命令：\n\n  - ```shell\n    # 激活paddle_env环境\n    conda activate paddle_env\n    ```\n    \n  - 以上anaconda环境和python环境安装完毕\n\n## 第3步：安装程序所需的库\n\n- 使用pip命令在刚激活的环境中安装paddle：\n\n  - ```shell\n    # 在命令行中输入以下命令：\n    # 默认安装CPU版本，安装paddle时建议使用百度源\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n  \n- 安装完paddle后，继续在paddle_env环境中安装paddlehub：\n\n  - ```shell\n    # 在命令行中输入以下命令：\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - paddlehub的介绍文档：https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/README_ch.md\n  \n  - 安装paddlehub时会自动安装其它依赖库，可能需要花费一段时间\n\n## 第4步：安装paddlehub并下载模型\n\n- 安装完paddlehub后，下载风格迁移模型：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - 模型的说明文档：[https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/linux/hub_model_intro.png\" alt=\"hub model intro\" width=\"800\" align=\"center\"/>\n\n## 第5步：准备风格迁移数据和代码\n\n### 准备风格迁移数据\n\n- 在用户目录~下，创建工作目录`style_transfer`\n\n  - ```shell\n    # 在终端中输入以下命令:\n    cd ~  # 进入用户目录\n    mkdir style_transfer  # 创建style_transfer文件夹\n    cd style_transfer  # 进入style_transfer目录\n    ```\n\n- 分别放置待转换图片和风格图片：\n\n  - 将待转换图片放置到`~/style_transfer/pic.jpg`\n    - <img src=\"../../imgs/Install_Related/linux/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - 将风格图片放置到`~/style_transfer/fangao.jpg`\n    - <img src=\"../../imgs/Install_Related/linux/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### 代码\n\n- 创建代码文件：\n\n  - ```shell\n    # 以下命令均在命令行执行\n    $ pwd # 查看当前目录是否为style_transfer，若不是则输入：cd ~/style_transfer\n    $ touch style_transfer.py  # 创建空文件\n    $ vim style_transfer.py  # 使用vim编辑器打开代码文件\n    # 先输入i进入编辑模式\n    # 将代码拷贝进vim编辑器中\n    # 按esc键退出编辑模式，再输入\":wq\"并回车，以保存并退出\n    ```\n    \n  - ```python\n    # 代码\n    import paddlehub as hub\n    import cv2\n    \n    # 待转换图片的相对地址\n    picture = './pic.jpg'\n    # 风格图片的相对地址\n    style_image = './fangao.jpg'\n    \n    # 创建风格转移网络并加载参数\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    \n    # 读入图片并开始风格转换\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- 运行代码：\n\n  - 在命令行中，输入`python style_transfer.py`\n  - 程序执行时，会创建新文件夹`transfer_result`，并将转换后的文件保存到该目录下\n  - 输出的图片如下：\n    - <img src=\"../../imgs/Install_Related/linux/output_img.png\" alt=\"output image\" width=\"600\" align=\"center\"/>\n\n## 第6步：飞桨预训练模型探索之旅\n- 恭喜你，到这里PaddleHub在linux环境下的安装和入门案例就全部完成了，快快开启你更多的深度学习模型探索之旅吧。[【更多模型探索，跳转飞桨官网】](https://www.paddlepaddle.org.cn/hublist)\n\n"
  },
  {
    "path": "docs/docs_ch/get_start/mac_quickstart.md",
    "content": "# 零基础mac安装并实现图像风格迁移\n\n## 第1步：安装Anaconda\n\n- 说明：使用paddlepaddle需要先安装python环境，这里我们选择python集成环境Anaconda工具包\n  - Anaconda是1个常用的python包管理程序\n  - 安装完Anaconda后，可以安装python环境，以及numpy等所需的工具包环境\n- Anaconda下载：\n  - 地址：https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n  - <img src=\"../../imgs/Install_Related/mac/anaconda_start.png\" alt=\"anaconda download\" width=\"800\" align=\"center\"/>\n  - 选择最下方的`Anaconda3-2021.05-MacOSX-x86_64.pkg`下载\n- 下载完成后，双击.pkg文件进入图形界面\n  - 按默认设置即可，安装需要花费一段时间\n- 建议安装vscode或pycharm等代码编辑器\n\n## 第2步：打开终端并创建conda环境\n\n- 打开终端\n\n  - 同时按下command键和空格键，在聚焦搜索中输入\"终端\"，双击进入终端\n\n- **将conda加入环境变量**\n\n  - 加入环境变量是为了让系统能识别conda命令\n\n  - 输入以下命令，在终端中打开`~/.bash_profile`：\n\n    - ```shell\n      vim ~/.bash_profile\n      ```\n\n  - 在`~/.bash_profile`中将conda添加为环境变量：\n\n    - ```shell\n      # 先按i进入编辑模式\n      # 在第一行输入：\n      export PATH=\"~/opt/anaconda3/bin:$PATH\"\n      # 若安装时自定义了安装位置，则将~/opt/anaconda3/bin改为自定义的安装目录下的bin文件夹\n      ```\n\n    - ```shell\n      # 修改后的~/.bash_profile文件应如下（其中xxx为用户名）：\n      export PATH=\"~/opt/anaconda3/bin:$PATH\"\n      # >>> conda initialize >>>\n      # !! Contents within this block are managed by 'conda init' !!\n      __conda_setup=\"$('/Users/xxx/opt/anaconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)\"\n      if [ $? -eq 0 ]; then\n          eval \"$__conda_setup\"\n      else\n          if [ -f \"/Users/xxx/opt/anaconda3/etc/profile.d/conda.sh\" ]; then\n              . 
\"/Users/xxx/opt/anaconda3/etc/profile.d/conda.sh\"\n          else\n              export PATH=\"/Users/xxx/opt/anaconda3/bin:$PATH\"\n          fi\n      fi\n      unset __conda_setup\n      # <<< conda initialize <<<\n      ```\n\n    - 修改完成后，先按`esc`键退出编辑模式，再输入`:wq!`并回车，以保存退出\n\n  - 验证是否能识别conda命令：\n\n    - 在终端中输入`source ~/.bash_profile`以更新环境变量\n    - 再在终端输入`conda info --envs`，若能显示当前有base环境，则conda已加入环境变量\n\n- 创建新的conda环境\n\n  - ```shell\n    # 在命令行输入以下命令，创建名为paddle_env的环境\n    # 此处为加速下载，使用清华源\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/\n    ```\n\n  - 该命令会创建1个名为paddle_env、python版本为3.8的可执行环境，根据网络状态，需要花费一段时间\n\n  - 之后命令行中会输出提示信息，输入y并回车继续安装\n\n    - <img src=\"../../imgs/Install_Related/mac/conda_create.png\" alt=\"conda_create\" width=\"600\" align=\"center\"/>\n\n- 激活刚创建的conda环境，在命令行中输入以下命令：\n\n  - ```shell\n    # 激活paddle_env环境\n    conda activate paddle_env\n    # 查看当前python的位置\n    where python\n    ```\n\n  - <img src=\"../../imgs/Install_Related/mac/conda_activate.png\" alt=\"conda_actviate\" width=\"600\" align=\"center\"/>\n\n  - 以上anaconda环境和python环境安装完毕\n\n## 第3步：安装程序所需要库\n\n- 使用pip命令在刚激活的环境中安装paddle：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    # 确认当前所用的pip是否是paddle_env环境下的pip\n    where pip\n    # 默认安装CPU版本，安装paddle时建议使用百度源\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n\n- 安装完paddle后，继续在paddle_env环境中安装paddlehub：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - paddlehub的介绍文档：https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/README_ch.md\n  \n  - 安装paddlehub时会自动安装其它依赖库，可能需要花费一段时间\n\n## 第4步：安装paddlehub并下载模型\n\n- 安装完paddlehub后，下载风格迁移模型：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - 
模型的说明文档：[https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/mac/hub_model_intro.png\" alt=\"hub model intro\" width=\"800\" align=\"center\"/>\n\n## 第5步：准备风格迁移数据和代码\n\n### 准备风格迁移数据\n\n- 在桌面创建工作目录`style_transfer`\n\n  - ```shell\n    # 在终端中输入以下命令:\n    cd ~/Desktop  # 进入桌面\n    mkdir style_transfer  # 创建style_transfer文件夹\n    cd style_transfer  # 进入style_transfer目录\n    ```\n\n- 分别放置待转换图片和风格图片：\n\n  - 将待转换图片放置到桌面的`style_transfer/pic.jpg`\n    - <img src=\"../../imgs/Install_Related/mac/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - 将风格图片放置到桌面的`style_transfer/fangao.jpg`\n    - <img src=\"../../imgs/Install_Related/mac/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### 代码\n\n- 在`style_transfer`目录下创建代码文件`style_transfer.py`\n\n- 在`style_transfer.py`中复制进如下代码：\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n    \n    # 待转换图片的相对地址\n    picture = './pic.jpg'\n    # 风格图片的相对地址\n    style_image = './fangao.jpg'\n    \n    # 创建风格转移网络并加载参数\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    \n    # 读入图片并开始风格转换\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- 若没有vscode等代码编辑器，则可通过命令行方法：\n\n  - ```shell\n    pwd # 查看当前目录是否为style_transfer，若不是则输入：cd ~/Desktop/style_transfer\n    touch style_transfer.py  # 创建空文件\n    vim style_transfer.py  # 使用vim编辑器打开代码文件\n    # 先输入i进入编辑模式\n    # 将上面的代码拷贝进vim编辑器中\n    # 按esc键退出编辑模式，再输入\":wq\"并回车，以保存并退出\n    ```\n\n- 运行代码：\n\n  - 在命令行中，输入`python style_transfer.py`\n  - 程序执行时，会创建新文件夹`transfer_result`，并将转换后的文件保存到该目录下\n  - 输出的图片如下：\n    - <img 
src=\"../../imgs/Install_Related/mac/output_img.png\" alt=\"output image\" width=\"600\" align=\"center\"/>\n\n## 第6步：飞桨预训练模型探索之旅\n- 恭喜你，到这里PaddleHub在mac环境下的安装和入门案例就全部完成了，快快开启你更多的深度学习模型探索之旅吧。[【更多模型探索，跳转飞桨官网】](https://www.paddlepaddle.org.cn/hublist)\n\n\n\n\n\n"
  },
  {
    "path": "docs/docs_ch/get_start/python_use_hub.rst",
    "content": "=================\n快速体验\n=================\n\n在PaddleHub中，Module代表一个可执行模块，一般来讲就是一个可以端到端预测的预训练模型（例如目标检测模型、中文词法分析模型），又或者是一个需要根据下游任务进一步微调（迁移学习）的模型，例如BERT/ERNIE。\n\nPaddleHub采用模型即软件的设计理念，所有的预训练模型与Python软件包类似，具备版本的概念。\n\n本教程会带您快速体验如何使用PaddleHub提供的预训练模型来完成一些常见的AI任务。\n\n下载测试图片\n=============================================\n\n首先，使用`wget`下载一张测试图片\n\n.. code-block:: shell\n\n    # 下载图片\n    wget https://paddlehub.bj.bcebos.com/resources/test_image.jpg\n\n.. image:: ../../imgs/humanseg_test.png\n    :width: 300px\n\n人像分割\n=============================================\n\n人像分割任务旨在将输入图片中的人像和背景区分开来。该任务有很多的应用场景，例如背景虚化、背景替换、影视后期处理等等。我们使用 `humanseg_lite <https://www.paddlepaddle.org.cn/hubdetail?name=humanseg_lite&en_category=ImageSegmentation>`_ 来展示这个功能。\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"humanseg_lite\", version=\"1.1.1\")\n    module = hub.Module(name=\"humanseg_lite\")\n\n    res = module.segment(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='humanseg_output')\n\n.. image:: ../../imgs/output_8_3.png\n    :width: 300px\n\n人体解析\n=============================================\n\n人体解析是人像分割的细粒度任务。该任务旨在提取输入图片中人体的不同部件。相关模型经常和新兴的GAN模型一起使用，应用场景包括美颜、换衣服等等。我们使用 `ace2p <https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation>`_  来展示这个功能。\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"ace2p\", version=\"1.1.0\")\n    module = hub.Module(name=\"ace2p\")\n\n    res = module.segment(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='ace2p_output')\n\n.. 
image:: ../../imgs/output_12_3.png\n    :width: 300px\n\n人脸检测 \n=============================================\n\n人脸检测任务旨在检测出输入图片中的每一张人脸的位置。应用的场景包括视频监控、人流量估计等等场景。我们使用 `ultra_light_fast_generic_face_detector_1mb_640 <https://www.paddlepaddle.org.cn/hubdetail?name=ultra_light_fast_generic_face_detector_1mb_640&en_category=FaceDetection>`_ 来展示这个功能。\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\", version=\"1.1.2\")\n    module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n\n    res = module.face_detection(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='face_detection_output')\n\n.. image:: ../../imgs/output_15_3.png\n    :width: 300px\n\n关键点检测\n=============================================\n\n关键点检测任务旨在识别输入图片中每一个人体的不同关键点信息，例如头部、肩膀、关节等等。依赖于模型能力的不同，能够检测到的关键点数量也不同。该任务一般用于人体美型、人体姿态估计等等，我们使用 `openpose_body_estimation <https://www.paddlepaddle.org.cn/hubdetail?name=openpose_body_estimation&en_category=KeyPointDetection>`_ 来展示这个功能。\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"openpose_body_estimation\", version=\"1.0.0\")\n    module = hub.Module(name=\"openpose_body_estimation\")\n\n    res = module.predict(\n        img=\"./test_image.jpg\", \n        visualization=True, \n        save_path='keypoint_output')\n\n.. image:: ../../imgs/output_18_2.png\n    :width: 300px\n\n中文词法分析\n=============================================\n\n中文词法分析旨在对输入的语句进行分词、词性分析、命名实体识别，我们使用 `lac <https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis>`_ 来展示这个功能。\n\n.. 
code-block:: python\n\n    import paddlehub as hub\n    # lac = hub.Module(name=\"lac\", version=\"2.2.0\")\n    lac = hub.Module(name=\"lac\")\n\n    test_text = [\"1996年，曾经是微软员工的加布·纽维尔和麦克·哈灵顿一同创建了Valve软件公司。他们在1996年下半年从id software取得了雷神之锤引擎的使用许可，用来开发半条命系列。\"]\n    print(lac.lexical_analysis(texts = test_text))\n    \n----------------\n\n    [{'word': ['1996年', '，', '曾经', '是', '微软', '员工', '的', '加布·纽维尔', '和', '麦克·哈灵顿', '一同', '创建', '了', 'Valve软件公司', '。', '他们', '在', '1996年下半年', '从', 'id', ' ', 'software', '取得', '了', '雷神之锤', '引擎', '的', '使用', '许可', '，', '用来', '开发', '半条命', '系列', '。'], 'tag': ['TIME', 'w', 'd', 'v', 'ORG', 'n', 'u', 'PER', 'c', 'PER', 'd', 'v', 'u', 'ORG', 'w', 'r', 'p', 'TIME', 'p', 'nz', 'w', 'n', 'v', 'u', 'n', 'n', 'u', 'vn', 'vn', 'w', 'v', 'v', 'n', 'n', 'w']}]\n\n中文情感分析\n=============================================\n\n中文情感分析旨在分析输入语句的情感倾向，我们使用 `senta_bilstm <https://www.paddlepaddle.org.cn/hubdetail?name=senta_bilstm&en_category=SentimentAnalysis>`_ 来展示这个功能。\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # senta = hub.Module(name=\"senta_bilstm\", version=\"1.2.0\")\n    senta = hub.Module(name=\"senta_bilstm\")\n\n    test_text = [\"味道不错，确实不算太辣，适合不能吃辣的人。就在长江边上，抬头就能看到长江的风景。鸭肠、黄鳝都比较新鲜。\"]\n    print(senta.sentiment_classify(texts = test_text))\n\n----------------\n\n    [{'text': '味道不错，确实不算太辣，适合不能吃辣的人。就在长江边上，抬头就能看到长江的风景。鸭肠、黄鳝都比较新鲜。', 'sentiment_label': 1, 'sentiment_key': 'positive', 'positive_probs': 0.9771, 'negative_probs': 0.0229}]\n"
  },
  {
    "path": "docs/docs_ch/get_start/windows_quickstart.md",
    "content": "# 零基础windows安装并实现图像风格迁移\n\n## 第1步：安装Anaconda\n\n- 说明：使用paddlepaddle需要先安装python环境，这里我们选择python集成环境Anaconda工具包\n  - Anaconda是1个常用的python包管理程序\n  - 安装完Anaconda后，可以安装python环境，以及numpy等所需的工具包环境。\n- Anaconda下载：\n  - 地址：https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n  - 大部分win10电脑均为64位操作系统，选择x86_64版本；若电脑为32位操作系统，则选择x86.exe\n  - <img src=\"../../imgs/Install_Related/windows/Anaconda_download.png\" alt=\"anaconda download\" width=\"800\" align=\"center\"/>\n  - 下载完成后，双击安装程序进入图形界面\n  - 默认安装位置为C盘，建议将安装位置更改到D盘：\n    - <img src=\"../../imgs/Install_Related/windows/anaconda_install_folder.png\" alt=\"install config\" width=\"500\" align=\"center\"/>\n  - 勾选conda加入环境变量，忽略警告：\n    - <img src=\"../../imgs/Install_Related/windows/anaconda_install_env.png\" alt=\"add conda to path\" width=\"500\" align=\"center\"/>\n\n## 第2步：打开终端并创建conda环境\n\n- 打开Anaconda Prompt终端\n  - 左下角Windows Start Menu -> Anaconda3 -> Anaconda Prompt启动控制台\n  - <img src=\"../../imgs/Install_Related/windows/anaconda_prompt.png\" alt=\"anaconda download\" width=\"300\" align=\"center\"/>\n\n\n- 创建新的conda环境\n\n  - ```shell\n    # 在命令行输入以下命令，创建名为paddle_env的环境\n    # 此处为加速下载，使用清华源\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/  # 这是一行命令\n    ```\n\n  - 该命令会创建1个名为paddle_env、python版本为3.8的可执行环境，根据网络状态，需要花费一段时间\n\n  - 之后命令行中会输出提示信息，输入y并回车继续安装\n\n  - <img src=\"../../imgs/Install_Related/windows/conda_new_env.png\" alt=\"conda create\" width=\"700\" align=\"center\"/>\n\n- 激活刚创建的conda环境，在命令行中输入以下命令：\n\n  - ```shell\n    # 激活paddle_env环境\n    conda activate paddle_env\n    # 查看当前python的位置\n    where python\n    ```\n\n  - <img src=\"../../imgs/Install_Related/windows/conda_list_env.png\" alt=\"create environment\" width=\"600\" align=\"center\"/>\n\n  - 以上anaconda环境和python环境安装完毕\n\n## 第3步：安装程序运行所需库\n\n- 使用pip命令在刚激活的环境中安装paddle，\n\n  - ```shell\n    # 在命令行中输入以下命令\n    # 确认当前所用的pip是否是paddle_env环境下的pip\n    where pip\n    # 
默认安装CPU版本，安装paddle时建议使用百度源\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - 若需要安装GPU版本，则请打开[paddle官网](https://www.paddlepaddle.org.cn/)选择适合的版本\n\n    - paddle官网：https://www.paddlepaddle.org.cn/\n    - 由于安装GPU版本需要先配置好CUDA和cudnn，建议有一定基础后再安装GPU版本\n\n- 安装完paddle后，继续在paddle_env环境中安装paddlehub：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - paddlehub的介绍文档：https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/README_ch.md\n\n## 第4步：安装paddlehub并下载模型\n\n- 安装完paddlehub后，下载风格迁移模型：\n\n  - ```shell\n    # 在命令行中输入以下命令\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - 模型的说明文档：[https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/windows/paddlehub_modulelist.png\" alt=\"model introduction\" width=\"700\" align=\"center\"/>\n\n## 第5步：准备风格迁移数据和代码\n\n### 准备风格迁移数据\n\n- 切换工作目录到`D:\\style_transfer`，在命令行中输入以下命令\n\n  - ```shell\n    # 在命令行中输入以下命令\n    #把当前工作目录切换到D盘根目录\n    D:\n    #创建style_transfer目录\n    mkdir style_transfer\n    #切换当前目录到style_transfer目录\n    cd style_transfer\n    ```\n\n- 分别放置待转换图片和风格图片\n  - 将待转换图片放置到`D:\\style_transfer\\pic.jpg`\n    - <img src=\"../../imgs/Install_Related/windows/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - 将风格图片放置到`D:\\style_transfer\\fangao.jpg`\n    - <img src=\"../../imgs/Install_Related/windows/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### 代码\n\n- 在`D:\\style_transfer`目录下创建代码文件`style_transfer.py`\n\n  - 若没有vscode等编辑器，可使用记事本先创建1个txt文件，再将文件名改成`style_transfer.py`\n\n- 在`style_transfer.py`中复制进如下代码：\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n  \n    # 待转换图片的绝对地址\n    picture = 'D:\\\\style_transfer\\\\pic.jpg'  # 注意代码中此处为双反斜杠\n    # 风格图片的绝对地址\n    style_image = 
'D:\\\\style_transfer\\\\fangao.jpg'\n  \n    # 创建风格转移网络并加载参数\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n  \n    # 读入图片并开始风格转换\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- 运行代码：\n\n  - 在命令行中，输入`python style_transfer.py`\n  - 程序执行时，会创建新文件夹`transfer_result`，并将转换后的文件保存到该目录下\n  - 输出图片如下：\n    - <img src=\"../../imgs/Install_Related/windows/after_transfer.png\" alt=\"transferred image\" width=\"600\" align=\"center\"/>\n\n## 第6步：飞桨预训练模型探索之旅\n- 恭喜你，到这里PaddleHub在windows环境下的安装和入门案例就全部完成了，快快开启你更多的深度学习模型探索之旅吧。[【更多模型探索，跳转飞桨官网】](https://www.paddlepaddle.org.cn/hublist)\n"
  },
  {
    "path": "docs/docs_ch/get_start_index.rst",
    "content": "===============================\n快速入门PaddleHub\n===============================\n\n.. toctree::\n   :maxdepth: 2\n\n   get_start/installation\n   get_start/python_use_hub"
  },
  {
    "path": "docs/docs_ch/index.rst",
    "content": "===========================\n关于PaddleHub\n===========================\n\n欢迎使用PaddleHub！这是一个基于飞桨框架的预训练模型应用工具，旨在降低AI模型的使用门槛并推动AI社区的发展。无论您是AI领域的资深开发者，还是对该领域不甚了解却非常感兴趣的用户，PaddleHub都可以对您产生帮助。\n\n* PaddleHub旨在为开发者提供丰富的、高质量的、直接可用的预训练模型。\n* **【无需深度学习背景、无需数据与训练过程】**，可快速使用AI模型，享受人工智能时代红利。\n* 涵盖CV、NLP、Audio、Video主流四大品类，支持 **一键预测**、**一键服务化部署** 和 **快速迁移学习**。\n* 全部模型开源下载，**离线可运行**。\n\n------------\n\n===========================\n概览\n===========================\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   快速入门PaddleHub<get_start_index>\n   教程<tutorial_index>\n   API<api_index>\n   常见问题<faq>\n   社区<community_index>\n   发版信息<release>\n\n   "
  },
  {
    "path": "docs/docs_ch/make.bat",
    "content": "@ECHO OFF\r\n\r\npushd %~dp0\r\n\r\nREM Command file for Sphinx documentation\r\n\r\nif \"%SPHINXBUILD%\" == \"\" (\r\n\tset SPHINXBUILD=sphinx-build\r\n)\r\nset SOURCEDIR=.\r\nset BUILDDIR=_build\r\n\r\nif \"%1\" == \"\" goto help\r\n\r\n%SPHINXBUILD% >NUL 2>NUL\r\nif errorlevel 9009 (\r\n\techo.\r\n\techo.The 'sphinx-build' command was not found. Make sure you have Sphinx\r\n\techo.installed, then set the SPHINXBUILD environment variable to point\r\n\techo.to the full path of the 'sphinx-build' executable. Alternatively you\r\n\techo.may add the Sphinx directory to PATH.\r\n\techo.\r\n\techo.If you don't have Sphinx installed, grab it from\r\n\techo.http://sphinx-doc.org/\r\n\texit /b 1\r\n)\r\n\r\n%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\ngoto end\r\n\r\n:help\r\n%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\n\r\n:end\r\npopd\r\n"
  },
  {
    "path": "docs/docs_ch/release.md",
    "content": "# 更新历史\n\n## `v2.3.0`\n\n### 【1、支持文图生成新场景】\n  - 新增基于disco diffusion技术的文图生成dd系列模型5个，其中英文模型3个，中文模型2个，其中中文文图生成模型[disco_diffusion_ernievil_base](https://aistudio.baidu.com/aistudio/projectdetail/4444998)基于百度自研多模态模型**ERNIE-ViL**开发，欢迎体验。\n\n### 【2、支持文心大模型API调用】\n  - 新增对文心大模型[**ERNIE-ViLG**](https://aistudio.baidu.com/aistudio/projectdetail/4445016)的API调用，支持文图生成任务。\n  - 新增对文心大模型[**ERNIE 3.0 Zeus**](https://aistudio.baidu.com/aistudio/projectdetail/4445054)的API调用，支持写作文、写文案、写摘要、对对联、自由问答、写小说、补全文本等多个应用。\n\n\n## `v2.1.0`\n\n### 【1、版本迭代】\n\n - 模型支持：新增基于VOC数据集的高精度语义分割模型2个，语音分类模型3个。\n - 迁移学习能力升级：新增图像语义分割、文本语义匹配、语音分类等相关任务的Fine-Tune能力以及相关任务数据集。\n\n### 【2、部署能力重要升级】\n\n - 完善部署能力：新增ONNX和PaddleInference等模型格式的导出功能。\n - **重要开源生态合作**：新增[BentoML](https://github.com/bentoml/BentoML) 云原生服务化部署能力，可以支持统一的多框架模型管理和模型部署的工作流，[详细教程](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.1/demo/serving/BentoML)，更多内容可以参考BentoML 最新 v0.12.1 [Releasenote](https://github.com/bentoml/BentoML/releases/tag/v0.12.1)\n   （感谢@ [parano](https://github.com/parano) @[cqvu](https://github.com/cqvu) @[deehrlic](https://github.com/deehrlic)）的贡献与支持\n\n### 【3、Bug fixes】\n\n - [#7da1230](https://github.com/PaddlePaddle/PaddleHub/commit/7da12302dd77e3d739da72821d41715ad8a7c79c) 修复了模型未记录评估指标时无法恢复训练的问题。\n - [#b0b3144](https://github.com/PaddlePaddle/PaddleHub/commit/b0b3144eff34e47cac8fc450c8b7cb6c557f9b84) 修复了评估过程出现异常时线程没有正常退出的问题。\n - [#30aace4](https://github.com/PaddlePaddle/PaddleHub/commit/30aace46414bbeef02beb75b7128f48fada82150) 优化模型安装流程，提升易用性。\n\n## `v2.0.0`\n\n* 发布 2.0版本，全面迁移动态图编程模式，模型开发调试更加方便，finetune接口更加灵活易用。\n* 视觉类任务迁移学习能力全面升级，支持图像分类、图像着色、风格迁移等多种任务。\n* BERT、ERNIE、RoBERTa等Transformer类模型升级至动态图，支持文本分类、序列标注的Fine-Tune能力。\n* 新增词向量模型61个，其中包含中文模型51个，英文模型10个。\n* 优化服务化部署Serving能力，支持多卡预测、自动负载均衡，性能大幅度提升。\n* 新增自动数据增强能力Auto Augment，能高效地搜索适合数据集的数据增强策略组合。\n\n## `v2.0.0-beta1`\n\n* BERT、ERNIE、RoBERTa等Transformer类模型升级至动态图，增加[文本分类](../../demo/text_classification)的Fine-Tune能力\n* 修复部分已知问题\n\n## 
`v2.0.0-beta0`\n\n* 全面迁移动态图编程模式，模型开发调试更加方便，finetune接口更加灵活易用。\n* 优化服务化部署Serving能力，支持多卡预测、自动负载均衡，性能大幅度提升。\n* 视觉类迁移学习能力全面升级，支持[图像分类](../../demo/image_classification)、[图像着色](../../demo/colorization)、[风格迁移](../../demo/style_transfer)等多种视觉任务。\n\n## `v1.8.1`\n\n* 『[图像分割](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=ImageSegmentation)』新增轻量级人像分割模型Humanseg，支持移动端实时分割\n* 增强文本匹配任务性能，使用[EMNLP2019-Sentence-BERT](https://arxiv.org/abs/1908.10084)作为文本匹配任务网络，可同时大幅提升准确率和预测速度。配套教程：[pointwise文本语义匹配](https://aistudio.baidu.com/aistudio/projectdetail/705526)、[pairwise文本语义匹配](https://aistudio.baidu.com/aistudio/projectdetail/709472)\n* 修复文本分类选择F1作为评价指标，运行错误\n\n## `v1.8.0`\n\n* 预训练模型丰富，一键完成更多\n  * 『[文本生成](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=TextGeneration)』新增基于ERNIE-tiny和ERNIE-gen的对联和写诗生成模型，支持一键自动写诗和对对联。\n  *  『[词法分析](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=LexicalAnalysis)』新增jieba的paddle模式切词模型，可一键完成中文分词、关键词抽取等功能。\n  * 『[语义表示](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=SemanticModel)』新增基于网页、小说、新闻三类大规模文本数据的LDA主题模型及其语义相似度计算接口。\n* Fine-tune API升级，提升灵活性并支持更多任务\n   * 新增Tokenizer API，支持更加灵活的切词、切字模式和自定义切词工具拓展。\n   * 新增[文本生成](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/text_generation)任务，支持Seq2Seq任务的Fine-tuning。\n  * 新增文本匹配任务，支持[Pointwise](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/pointwise_text_matching)、[Pairwise](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/pairwise_text_matching)两种文本匹配训练模式，更便捷完成语义匹配任务。\n\n## `v1.7.0`\n\n* 丰富预训练模型，提升应用性\n  * 新增VENUS系列视觉预训练模型[yolov3_darknet53_venus](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_venus&en_category=ObjectDetection)，[faster_rcnn_resnet50_fpn_venus](https://www.paddlepaddle.org.cn/hubdetail?name=faster_rcnn_resnet50_fpn_venus&en_category=ObjectDetection)，可大幅度提升图像分类和目标检测任务的Fine-tune效果\n  * 
新增工业级短视频分类模型[videotag_tsn_lstm](https://paddlepaddle.org.cn/hubdetail?name=videotag_tsn_lstm&en_category=VideoClassification)，支持3000类中文标签识别\n  * 新增轻量级中文OCR模型[chinese_ocr_db_rcnn](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_rcnn&en_category=TextRecognition)、[chinese_text_detection_db](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_text_detection_db&en_category=TextRecognition)，支持一键快速OCR识别\n  * 新增行人检测、车辆检测、动物识别、Object等工业级模型\n\n* Fine-tune API升级\n  * 文本分类任务新增6个预置网络，包括CNN, BOW, LSTM, BiLSTM, DPCNN等\n  * 使用VisualDL可视化训练评估性能数据\n\n## `v1.6.2`\n\n* 修复图像分类在windows下运行错误\n\n## `v1.6.1`\n\n* 修复windows下安装PaddleHub缺失config.json文件\n\n## `v1.6.0`\n\n* NLP Module全面升级，提升应用性和灵活性\n  * lac、senta系列(bow、cnn、bilstm、gru、lstm)、simnet_bow、porn_detection系列(cnn、gru、lstm)升级高性能预测，性能提升高达50%\n  * ERNIE、BERT、RoBERTa等Transformer类语义模型新增获取预训练embedding接口get_embedding，方便接入下游任务，提升应用性\n  * 新增RoBERTa通过模型结构压缩得到的3层Transformer模型[rbt3](https://www.paddlepaddle.org.cn/hubdetail?name=rbt3&en_category=SemanticModel)、[rbtl3](https://www.paddlepaddle.org.cn/hubdetail?name=rbtl3&en_category=SemanticModel)\n\n* Task predict接口增加高性能预测模式accelerate_mode，性能提升高达90%\n\n* PaddleHub Module创建流程开放，支持Fine-tune模型转化，全面提升应用性和灵活性\n  * [预训练模型转化为PaddleHub Module教程](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/contribution/contri_pretrained_model.md)\n  * [Fine-tune模型转化为PaddleHub Module教程](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/finetuned_model_to_module.md)\n\n* [PaddleHub Serving](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/serving.md)优化启动方式，支持更加灵活的参数配置\n\n## `v1.5.2`\n\n* 优化pyramidbox_lite_server_mask、pyramidbox_lite_mobile_mask模型的服务化部署性能\n\n## `v1.5.1`\n\n* 修复加载module缺少cache目录的问题\n\n## `v1.5.0`\n\n* 升级PaddleHub Serving，提升性能和易用性\n   * 新增文本Embedding服务[Bert Service](./tutorial/bert_service.md), 轻松获取文本embedding；\n      * 代码精短，易于使用。服务端/客户端一行命令即可获取文本embedding；  \n      * 更高性能，更高效率。通过Paddle AnalysisPredictor API优化计算图，提升速度减小显存占用\n  
    * 随\"机\"应变，灵活扩展。根据机器资源和实际需求可灵活增加服务端数量，支持多显卡多模型计算任务\n   * 优化并发方式，多核环境中使用多线程并发提高整体QPS\n\n* 优化PaddleHub迁移学习组网Task功能，提升易用性\n   * 增加Hook机制，支持[修改Task内置方法](./tutorial/hook.md)\n   * 增加colorlog，支持日志彩色显示\n   * 改用save_inference_model接口保存模型，方便模型部署\n   * 优化predict接口，增加return_result参数，方便用户直接获取预测结果\n\n* 优化PaddleHub Dataset基类，加载[自定义数据](./tutorial/how_to_load_data.md)代码更少、更简单\n\n\n## `v1.4.1`\n\n* 修复利用Transformer类模型完成序列标注任务适配paddle1.6版本的问题\n* Windows下兼容性提升为python >= 3.6\n\n## `v1.4.0`\n\n* 新增预训练模型ERNIE tiny\n* 新增数据集：INEWS、BQ、DRCD、CMRC2018、THUCNEWS，支持ChineseGLUE（CLUE）V0 所有任务\n* 修复module与PaddlePaddle版本兼容性问题\n* 优化Hub Serving启动过程和模型加载流程，提高服务响应速度\n\n\n## `v1.3.0`\n\n* 新增PaddleHub Serving服务部署\n  * 新增[hub serving](https://github.com/PaddlePaddle/PaddleHub/wiki/PaddleHub-Serving%E4%B8%80%E9%94%AE%E6%9C%8D%E5%8A%A1%E9%83%A8%E7%BD%B2)命令，支持一键启动Module预测服务部署\n* 新增预训练模型：\n  * roberta_wwm_ext_chinese_L-24_H-1024_A-16\n  * roberta_wwm_ext_chinese_L-12_H-768_A-12\n  * bert_wwm_ext_chinese_L-12_H-768_A-12\n  * bert_wwm_chinese_L-12_H-768_A-12\n* AutoDL Finetuner优化使用体验\n  * 支持通过接口方式回传模型性能\n  * 可视化效果优化，支持多trail效果显示\n\n## `v1.2.1`\n\n* 新增**超参优化Auto Fine-tune**，实现给定超参搜索空间，PaddleHub自动给出较佳的超参组合\n  * 支持两种超参优化算法：HAZero和PSHE2\n  * 支持两种评估方式：FullTrail和PopulationBased\n* 新增Fine-tune**优化策略ULMFiT**，包括以下三种设置\n  * Slanted triangular learning rates：学习率先线性增加后缓慢降低\n  * Discriminative fine-tuning：将计算图划分为n段，不同的段设置不同学习率\n  * Gradual unfreezing：根据计算图的拓扑结构逐层unfreezing\n* 新增支持用户自定义PaddleHub配置，包括\n  * 预训练模型管理服务器地址\n  * 日志记录级别\n* Fine-tune API升级，灵活性与易用性提升\n  * 新增**阅读理解Fine-tune任务**和**回归Fine-tune任务**\n  * 新增多指标评测\n  * 优化predict接口\n  * 可视化工具支持使用tensorboard\n\n\n## `v1.1.2`\n\n* PaddleHub支持修改预训练模型存放路径${HUB_HOME}\n\n\n## `v1.1.1`\n\n* PaddleHub支持离线运行\n* 修复python2安装PaddleHub失败问题\n\n\n## `v1.1.0`\n\n* PaddleHub **新增预训练模型ERNIE 2.0**\n  * 升级Reader， 支持自动传送数据给Ernie 1.0/2.0\n  * 新增数据集GLUE(MRPC、QQP、SST-2、CoLA、QNLI、RTE、MNLI)\n\n\n## `v1.0.1`\n\n* 安装模型时自动选择与paddlepaddle版本适配的模型\n\n\n## `v1.0.0`\n\n* 全新发布PaddleHub官网，易用性全面提升\n  * 新增网站  
https://www.paddlepaddle.org.cn/hub  包含PaddlePaddle生态的预训练模型使用介绍\n  * 迁移学习Demo接入AI Studio与AI Book，无需安装即可快速体验\n\n* 新增29个预训练模型，覆盖文本、图像、视频三大领域；目前官方提供40个预训练模型\n  * CV预训练模型：\n    * 新增图像分类预训练模型11个：SE_ResNeXt, GoogleNet, ShuffleNet等\n    * 新增目标检测模型Faster-RCNN和YOLOv3\n    * 新增图像生成模型CycleGAN\n    * 新增人脸检测模型Pyramidbox\n    * 新增视频分类模型4个: TSN, TSM, StNet, Non-Local\n  * NLP预训练模型\n    * 新增语义模型ELMo\n    * 新增情感分析模型5个: Senta-BOW, Senta-CNN, Senta-GRNN, Senta-LSTM, EmoTect\n    * 新增中文语义相似度分析模型SimNet\n    * 升级LAC词法分析模型，新增词典干预功能，支持用户自定义分词\n* Fine-tune API升级，灵活性与性能全面提升\n  * 支持多卡并行、PyReader多线程IO，Fine-tune速度提升60%\n  * 简化finetune、evaluate、predict等使用逻辑，提升易用性\n  * 增加事件回调功能，方便用户快速实现自定义迁移学习任务\n  * 新增多标签分类Fine-tune任务\n\n\n## `v0.5.0`\n\n正式发布PaddleHub预训练模型管理工具，旨在帮助用户更高效地管理模型并开展迁移学习的工作。\n\n**预训练模型管理**: 通过hub命令行可完成PaddlePaddle生态的预训练模型下载、搜索、版本管理等功能。\n\n**命令行一键使用**: 无需代码，通过命令行即可直接使用预训练模型进行预测，快速调研预训练模型效果。目前版本支持以下模型：词法分析LAC；情感分析Senta；目标检测SSD；图像分类ResNet, MobileNet, NASNet等。\n\n**迁移学习**: 提供了基于预训练模型的Fine-tune API，用户通过少量代码即可完成迁移学习，包括BERT/ERNIE文本分类、序列标注、图像分类迁移等。\n"
  },
  {
    "path": "docs/docs_ch/transfer_learning_index.rst",
    "content": "==================\n迁移学习\n==================\n\n迁移学习 (Transfer Learning) 是深度学习的一个子研究领域，其目标在于利用数据、任务或模型之间的相似性，将在旧领域学习过的知识迁移应用于新领域中。通俗地讲，迁移学习就是运用已有的知识来学习新的知识，例如学会了骑自行车的人也能较快地学会骑电动车。较为常用的一种迁移学习方式是利用预训练模型进行微调：用户基于当前任务的场景，从PaddleHub中选择一个已训练成功、且训练数据与新场景数据情况相近的模型，仅需在训练过程中使用新场景的数据对模型参数进行微调（**Fine-tune**），即可完成训练任务。迁移学习吸引了很多研究者投身其中，因为它能够很好地解决深度学习中的以下几个问题：  \n\n* 一些研究领域只有少量标注数据，且数据标注成本较高，不足以训练一个足够鲁棒的神经网络。\n* 大规模神经网络的训练依赖于大量的计算资源，这对于一般用户而言难以实现。\n* 面向普适化需求的模型，在特定应用上表现不尽如人意。  \n\n为了让开发者更便捷地应用迁移学习，飞桨开源了预训练模型管理工具 PaddleHub。开发者仅需十余行代码，就能完成迁移学习。本文将为读者全面介绍使用PaddleHub完成迁移学习的方法。\n\n.. image:: ../imgs/paddlehub_finetune.gif \n   :width: 900px\n   :align: center\n\n.. toctree::\n   :maxdepth: 2\n\n   finetune/sequence_labeling.md\n   finetune/text_matching.md\n   finetune/image_classification.md\n   finetune/image_colorization.md\n   finetune/style_transfer.md\n   finetune/semantic_segmentation.md\n   finetune/audio_classification.md\n   finetune/customized_dataset.md"
  },
  {
    "path": "docs/docs_ch/tutorial/cmd_usage.rst",
    "content": "===========================\nPaddleHub命令行工具\n===========================\n\nPaddleHub为预训练模型的管理和使用提供了命令行工具。\n\n我们一共提供了12个命令，涵盖了模型安装、卸载、预测等各个方面。\n\nhub install\n==================\n\n用于将Module安装到本地，默认安装在 ``${HUB_HOME}/.paddlehub/modules`` 目录下，当一个Module安装到本地后，用户可以通过其他命令操作该Module（例如，使用该Module进行预测），也可以使用PaddleHub提供的python API，将Module应用到自己的任务中，实现迁移学习\n\n.. tip::\n\n    如果设置了环境变量 *${HUB_HOME}* ，则预训练模型和相关的配置文件都会保存到指定的 *${HUB_HOME}* 路径下。\n    如果未设置环境变量 *${HUB_HOME}* ，则预训练模型和相关的配置文件都会保存到用户的主目录 *$HOME* 下。\n\nhub uninstall\n==================\n\n用于卸载本地Module\n\nhub show\n==================\n\n用于查看本地已安装Module的属性或者指定目录下Module的属性，包括其名字、版本、描述、作者等信息\n\nhub download\n==================\n\n用于下载PaddleHub提供的Module\n\nhub search\n==================\n\n通过关键字在服务端检索匹配的Module，当想要查找某个特定模型的Module时，使用search命令可以快速得到结果，例如 ``hub search ssd`` 命令，会查找所有包含了ssd字样的Module，命令支持正则表达式，例如 ``hub search ^s.*`` 搜索所有以s开头的资源。\n\n.. tip::\n    \n    如果想要搜索全部的Module，使用 ``hub search \\*`` 并不生效，这是因为shell会自行进行通配符展开，将\\*替换为当前目录下的文件名。为了进行全局搜索，用户可以直接键入 ``hub search`` 。\n\n\nhub list\n==================\n\n列出本地已经安装的Module\n\nhub run\n==================\n\n用于执行Module的预测，需要注意的是，并不是所有的模型都支持预测（同样，也不是所有的模型都支持迁移学习）。使用示例可以参考 `hub run快速体验 <../quick_experience/cmd_quick_run.md>`_ 。\n\nPaddleHub尽量简化了用户在使用命令行预测时的理解成本，一般来讲，我们将预测分为NLP和CV两大类\n\nNLP类的任务\n---------------\n输入数据通过 ``--input_text`` 指定。以百度LAC模型（中文词法分析）为例，可以通过以下命令实现文本分析。\n\n\n.. code-block:: console\n\n    $ hub run lac --input_text \"今天是个好日子\"\n\n\nCV类的任务\n---------------\n\n输入数据通过 ``--input_path`` 指定。以resnet_v2_50_imagenet模型（图像分类）为例，可以通过以下命令实现预测\n\n.. code-block:: console\n\n    $ hub run resnet_v2_50_imagenet --input_path test.jpg\n\nhub help\n==================\n\n显示帮助信息\n\nhub version\n==================\n\n显示PaddleHub版本信息\n\nhub clear\n==================\n\nPaddleHub在使用过程中会产生一些缓存数据，这部分数据默认存放在${HUB_HOME}/.paddlehub/cache目录下，用户可以通过clear命令来清空缓存\n\nhub config\n==================\n\n用于查看和设置paddlehub相关设置，包括对server地址、日志级别的设置：\n\n.. 
code-block:: console\n\n    $ # 显示当前paddlehub的设置\n    $ hub config \n    \n    $ # 恢复当前paddlehub的设置为默认设置\n    $ hub config reset \n    \n    $ # 设置当前paddlehub-server地址为${HOST}，paddlehub客户端从此地址获取模型信息\n    $ hub config server==${HOST} \n    \n    $ # 设置当前日志级别为${LEVEL}， 可选值为CRITICAL, ERROR, WARNING, EVAL, TRAIN, INFO, DEBUG, 从左到右优先级从高到低\n    $ hub config log.level==${LEVEL} \n    \n    $ # 设置当前日志是否可用\n    $ hub config log.enable==True|False \n\nhub serving\n==================\n\n用于一键部署Module预测服务，详细用法见`PaddleHub Serving一键服务部署 <serving>`_"
  },
  {
    "path": "docs/docs_ch/tutorial/custom_module.rst",
    "content": "======================\n如何创建自己的Module\n======================\n\n\n一、 准备工作\n=======================\n\n模型基本信息\n------------------------\n\n我们准备编写一个PaddleHub Module，Module的基本信息如下：\n\n.. code-block:: yaml\n\n    name: senta_test\n    version: 1.0.0\n    summary: This is a PaddleHub Module. Just for test.\n    author: anonymous\n    author_email:\n    type: nlp/sentiment_analysis\n\nModule存在一个接口sentiment_classify，用于接收传入文本，并给出文本的情感倾向（正向/负向），支持python接口调用和命令行调用。\n\n.. code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(name=\"senta_test\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\n.. code-block:: shell\n\n    hub run senta_test --input_text 这部电影太差劲了\n\n策略\n------------------------\n\n为了示例代码简单起见，我们使用一个非常简单的情感判断策略，当输入文本中带有词表中指定单词时，则判断文本倾向为负向，否则为正向\n\n二、创建Module\n=======================\n\n步骤1. 创建必要的目录与文件\n----------------------------------------------------\n\n创建一个senta_test的目录，并在senta_test目录下分别创建module.py、processor.py、vocab.list，其中\n\n.. code-block:: shell\n\n    $ tree senta_test\n    senta_test/\n    ├── vocab.list \n    ├── module.py \n    └── processor.py \n\n============    =========================================================================\nFile Name       Purpose\n============    =========================================================================\nmodule.py       主模块，提供Module的实现代码\nprocessor.py    辅助模块，提供词表加载的方法\nvocab.list      存放词表 \n============    =========================================================================\n\n\n步骤2. 实现辅助模块processor\n------------------------------------------------\n\n在processor.py中实现一个load_vocab接口用于读取词表\n\n.. code-block:: python\n\n    def load_vocab(vocab_path):\n        with open(vocab_path) as file:\n            return file.read().split()\n\n步骤3. 
编写Module处理代码\n------------------------------------------------\n\nmodule.py文件为Module的入口代码所在，我们需要在其中实现预测逻辑。\n\n步骤 3_1. 引入必要的头文件\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    import argparse\n    import os\n\n    import paddlehub as hub\n    from paddlehub.module.module import runnable, serving, moduleinfo\n\n    from senta_test.processor import load_vocab\n\n.. note::\n\n    当引用Module中的模块时，需要输入全路径，如senta_test.processor\n\n步骤 3_2. 定义SentaTest类\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\nmodule.py中需要有一个继承了hub.Module的类存在，该类负责实现预测逻辑，并使用moduleinfo填写基本信息。当使用hub.Module(name=\"senta_test\")加载Module时，PaddleHub会自动创建SentaTest的对象并返回。\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n步骤 3_3. 执行必要的初始化\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n\n        def __init__(self):\n            # add arg parser\n            self.parser = argparse.ArgumentParser(\n                description=\"Run the senta_test module.\",\n                prog='hub run senta_test',\n                usage='%(prog)s',\n                add_help=True)\n            self.parser.add_argument(\n                '--input_text', type=str, default=None, help=\"text to predict\")\n\n            # load word dict\n            vocab_path = os.path.join(self.directory, \"vocab.list\")\n            self.vocab = load_vocab(vocab_path)\n\n        ...\n\n.. 
note::\n\n    执行类对象默认内置了directory属性，可以直接获取到Module所在路径\n\n步骤 3_4: 完善预测逻辑\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        def sentiment_classify(self, texts):\n            results = []\n            for text in texts:\n                sentiment = \"positive\"\n                for word in self.vocab:\n                    if word in text:\n                        sentiment = \"negative\"\n                        break\n                results.append({\"text\":text, \"sentiment\":sentiment})\n\n            return results\n        \n        ...\n\n步骤 3_5. 支持命令行调用\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n如果希望Module可以支持命令行调用，则需要提供一个经过runnable修饰的接口，接口负责解析传入数据并进行预测，将结果返回。\n\n如果不需要提供命令行预测功能，则可以不实现该接口，PaddleHub在通过命令行执行时，会自动发现该Module不支持命令行方式，并给出提示。\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        @runnable\n        def run_cmd(self, argvs):\n            args = self.parser.parse_args(argvs)\n            texts = [args.input_text]\n            return self.sentiment_classify(texts)\n\n        ...\n\n步骤 3_6. 支持serving调用\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n如果希望Module可以支持PaddleHub Serving部署预测服务，则需要提供一个经过serving修饰的接口，接口负责解析传入数据并进行预测，将结果返回。\n\n如果不需要提供PaddleHub Serving部署预测服务，则可以不加serving修饰。\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. 
Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        @serving\n        def sentiment_classify(self, texts):\n            results = []\n            for text in texts:\n                sentiment = \"positive\"\n                for word in self.vocab:\n                    if word in text:\n                        sentiment = \"negative\"\n                        break\n                results.append({\"text\":text, \"sentiment\":sentiment})\n\n            return results\n\n完整代码\n------------------------------------------------\n\n* `module.py <https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/modules/demo/senta_test/module.py>`_\n\n* `processor.py <https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/modules/demo/senta_test/processor.py>`_\n\n三、安装并测试Module\n===================================\n\n完成Module编写后，我们可以通过以下方式测试该Module\n\n调用方法1\n------------------------------------------------\n\n将Module安装到本机中，再通过hub.Module(name=...)加载\n\n.. code-block:: console\n\n    $ hub install senta_test\n\n\n.. code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(name=\"senta_test\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\n调用方法2\n------------------------------------------------\n\n直接通过hub.Module(directory=...)加载\n\n.. code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(directory=\"senta_test/\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\n调用方法3\n------------------------------------------------\n\n将Module安装到本机中，再通过hub run运行\n\n.. code-block:: console\n\n    $ hub install senta_test\n    $ hub run senta_test --input_text \"这部电影太差劲了\""
  },
  {
    "path": "docs/docs_ch/tutorial/serving.md",
    "content": "# PaddleHub Serving模型一键服务部署\n## 简介\n### 为什么使用一键服务部署\n使用PaddleHub能够快速进行模型预测，但开发者常面临本地预测过程迁移线上的需求。无论是对外开放服务端口，还是在局域网中搭建预测服务，都需要PaddleHub具有快速部署模型预测服务的能力。在这个背景下，模型一键服务部署工具——PaddleHub Serving应运而生。开发者通过一行命令即可快速启动一个模型预测在线服务，而无需关注网络框架选择和实现。\n### 什么是一键服务部署\nPaddleHub Serving是基于PaddleHub的一键模型服务部署工具，能够通过简单的Hub命令行工具轻松启动一个模型预测在线服务，前端通过Flask和Gunicorn完成网络请求的处理，后端直接调用PaddleHub预测接口，同时支持使用多进程方式利用多核提高并发能力，保证预测服务的性能。\n\n### 支持模型\n目前PaddleHub Serving支持对PaddleHub所有可直接预测的模型进行服务部署，包括`lac`、`senta_bilstm`等NLP类模型，以及`yolov3_darknet53_coco2017`、`vgg16_imagenet`等CV类模型，更多模型请参见[PaddleHub支持模型列表](https://paddlepaddle.org.cn/hublist)。未来还将支持开发者使用PaddleHub Fine-tune API得到的模型用于快捷服务部署。\n\n## 使用\n### Step1：启动服务端部署\nPaddleHub Serving有两种启动方式，分别是使用命令行启动，以及使用配置文件启动。\n\n#### 命令行命令启动\n启动命令\n```shell\n$ hub serving start --modules Module1==Version1 Module2==Version2 ... \\\n                    --port XXXX \\\n                    --use_gpu \\\n                    --use_multiprocess \\\n                    --workers XXXX \\\n                    --gpu XXXX\n```\n\n**参数**：\n\n|参数|用途|  \n|-|-|  \n|--modules/-m|PaddleHub Serving预安装模型，以多个Module==Version键值对的形式列出<br>*`当不指定Version时，默认选择最新版本`*|  \n|--port/-p|服务端口，默认为8866|  \n|--use_gpu|使用GPU进行预测，必须安装paddlepaddle-gpu|  \n|--use_multiprocess|是否启用并发方式，默认为单进程方式，推荐多核CPU机器使用此方式<br>*`Windows操作系统只支持单进程方式`*|\n|--workers|在并发方式下指定的并发任务数，默认为`2*cpu_count-1`，其中`cpu_count`为CPU核数|\n|--gpu|指定使用gpu的卡号，如`1,2`代表使用1号显卡和2号显卡，默认仅使用0号显卡|\n\n**NOTE:** --use_gpu不可与--use_multiprocess共用。\n\n#### 配置文件启动\n启动命令\n```shell\n$ hub serving start --config config.json\n```\n`config.json`格式如下：  \n\n```json\n{\n  \"modules_info\": {\n    \"yolov3_darknet53_coco2017\": {\n      \"init_args\": {\n        \"version\": \"1.0.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    },\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"1.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      
}\n    }\n  },\n  \"port\": 8866,\n  \"use_multiprocess\": false,\n  \"workers\": 2,\n  \"gpu\": \"0,1,2\"\n}\n\n```\n\n**参数**：\n\n|参数|用途|  \n|-|-|  \n|modules_info|PaddleHub Serving预安装模型，以字典列表形式列出，key为模型名称。其中:<br>`init_args`为模型加载时输入的参数，等同于`paddlehub.Module(**init_args)`<br>`predict_args`为模型预测时输入的参数，以`lac`为例，等同于`lac.analysis_lexical(**predict_args)`|\n|port|服务端口，默认为8866|  \n|use_gpu|使用GPU进行预测，必须安装paddlepaddle-gpu|  \n|use_multiprocess|是否启用并发方式，默认为单进程方式，推荐多核CPU机器使用此方式<br>*`Windows操作系统只支持单进程方式`*|\n|workers|启动的并发任务数，在并发模式下才生效，默认为`2*cpu_count-1`，其中`cpu_count`代表CPU的核数|\n|gpu|指定使用gpu的卡号，如`1,2`代表使用1号显卡和2号显卡，默认仅使用0号显卡|\n\n**NOTE:** --use_gpu不可与--use_multiprocess共用。\n\n### Step2：访问服务端\n\n在使用PaddleHub Serving部署服务端的模型预测服务后，就可以在客户端访问预测接口以获取结果了，接口url格式为：\n\n`http://127.0.0.1:8866/predict/<MODULE>`\n\n其中，`<MODULE>`为模型名。\n\n通过发送一个POST请求，即可获取预测结果，下面我们将展示一个具体的demo，以说明使用PaddleHub Serving部署和使用流程。\n\n\n### Step3：利用PaddleHub Serving进行个性化开发\n使用PaddleHub Serving进行模型服务部署后，可以利用得到的接口进行开发，如对外提供web服务，或接入到应用程序中，以降低客户端预测压力，提高性能，下面展示了一个web页面demo:\n\n![](../../imgs/web_demo.png)\n\n\n### Step4：关闭serving\n使用关闭命令即可关闭启动的serving。\n```shell\n$ hub serving stop --port XXXX\n```\n**参数**：\n\n|参数|用途|  \n|-|-|  \n|--port/-p|指定要关闭的服务端口，默认为8866|  \n\n## Demo——部署一个在线lac分词服务\n\n### Step1：部署lac在线服务\n现在，我们要部署一个lac在线服务，以通过接口获取文本的分词结果。\n\n首先，任意选择一种启动方式，两种方式分别为:\n```shell\n$ hub serving start -m lac\n```\n或\n```shell\n$ hub serving start -c serving_config.json\n```\n其中`serving_config.json`的内容如下：\n```json\n{\n  \"modules_info\": {\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"1.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    }\n  },\n  \"port\": 8866,\n  \"use_multiprocess\": false,\n  \"workers\": 2\n}\n```\n启动成功界面如图：\n\n![](../../imgs/start_serving_lac.png)\n\n这样我们就在8866端口成功部署了lac的在线分词服务。\n*此处warning为Flask提示，不影响使用*\n\n### Step2：访问lac预测接口\n\n在服务部署好之后，我们可以进行测试，用来测试的文本为`今天是个好日子`和`天气预报说今天要下雨`。\n\n客户端代码如下\n```python\n# coding: utf8\nimport 
requests\nimport json\n\nif __name__ == \"__main__\":\n    # 指定用于预测的文本并生成字典{\"text\": [text_1, text_2, ... ]}\n    text = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为lac.analysis_lexical(data=text, batch_size=1)\n    data = {\"text\": text, \"batch_size\": 1}\n    # 指定预测方法为lac并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/lac\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n```\n运行后得到结果\n\n\n```python\n{\n    \"msg\": \"\",\n    \"results\": [\n        {\n            \"tag\": [\n                \"TIME\", \"v\", \"q\", \"n\"\n            ],\n            \"word\": [\n                \"今天\", \"是\", \"个\", \"好日子\"\n            ]\n        },\n        {\n            \"tag\": [\n                \"n\", \"v\", \"TIME\", \"v\", \"v\"\n            ],\n            \"word\": [\n                \"天气预报\", \"说\", \"今天\", \"要\", \"下雨\"\n            ]\n        }\n    ],\n    \"status\": \"0\"\n}\n\n```\n\n### Step3：停止serving服务\n\n由于启动时我们使用了默认的服务端口8866，则对应的关闭命令为：\n```shell\n$ hub serving stop --port 8866\n```\n或不指定关闭端口，则默认为8866。\n```shell\n$ hub serving stop\n```\n等待serving清理服务后，提示：\n```shell\nPaddleHub Serving will stop.\n```\n则serving服务已经停止。\n\n此Demo的具体信息和代码请参见[LAC Serving](../../demo/serving/module_serving/lexical_analysis_lac)。另外，下面展示了一些其他的一键服务部署Demo。\n\n## Demo——其他模型的一键部署服务\n\n* [中文词法分析-基于lac](../../../demo/serving/module_serving/lexical_analysis_lac)\n\n&emsp;&emsp;该示例展示了利用lac完成中文文本分词服务化部署和在线预测，获取文本的分词结果，并可通过用户自定义词典干预分词结果。\n\n* [人脸检测-基于pyramidbox_lite_server_mask](../../../demo/serving/module_serving/object_detection_pyramidbox_lite_server_mask)\n\n&emsp;&emsp;该示例展示了利用pyramidbox_lite_server_mask完成人脸口罩检测，检测人脸位置以及戴口罩的置信度。\n"
  },
  {
    "path": "docs/docs_ch/tutorial_index.rst",
    "content": "========\n教程\n========\n\n\n\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   命令行使用方法<tutorial/cmd_usage>\n   一键服务化部署<tutorial/serving>\n   如何创建自己的Module<tutorial/custom_module>\n   迁移学习<transfer_learning_index>"
  },
  {
    "path": "docs/docs_ch/visualization.md",
    "content": "## 精品模型效果展示\n\n### 文本识别\n- 包含超轻量中英文OCR模型，以及高精度中英文、多语种（德语、法语、日语、韩语）OCR识别。\n- 感谢CopyRight@[PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_Ocr.gif\"  width = \"800\" height = \"400\" />\n</div>\n\n### 人脸检测\n- 包含人脸检测，口罩人脸检测，多种算法可选。\n- 感谢CopyRight@[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_ObjectDetection_Face_Mask.gif\"  width = \"588\" height = \"400\" />\n</div>\n\n### 图像编辑\n- 4倍超分效果，多种超分算法可选。\n- 黑白图片上色，可用于老旧照片修复。\n- 感谢CopyRight@[PaddleGan](https://github.com/PaddlePaddle/PaddleGan)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>图像超分辨率 </th>\n            <th>黑白图片上色 </th>\n        </tr>\n        <tr>\n            <th>\n            <a>\n            <img src=\"../imgs/Readme_Related/ImageEdit_SuperResolution.gif\"  width = \"266\" height = \"400\" /></a><br>\n            </th>\n            <th>\n            <a>\n            <img src=\"../imgs/Readme_Related/ImageEdit_Restoration.gif\"  width = \"300\" height = \"400\" /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n\n### 图像生成\n- 包含人像动漫化、街景动漫化、风格迁移。\n- 感谢CopyRight@[PaddleGAN](https://github.com/PaddlePaddle/PaddleGAN)、CopyRight@[AnimeGAN](https://github.com/TachibanaYoshino/AnimeGANv2)提供预训练模型。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageGAN.gif\"  width = \"640\" height = \"600\" />\n</div>\n\n### 目标检测\n- 包含行人检测、车辆检测，更有工业级超大规模预训练模型可选。\n- 感谢CopyRight@[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_ObjectDetection_Pedestrian_Vehicle.gif\"  width = \"642\" height = \"400\" />\n</div>\n\n### 关键点检测\n- 包含单人、多人身体关键点检测、面部关键点检测、手部关键点检测。\n- 
感谢CopyRight@[openpose](https://github.com/CMU-Perceptual-Computing-Lab/openpose)开源预训练模型。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_keypoint.gif\"  width = \"642\" height = \"550\" />\n</div>\n\n### 图像分割\n- 包含效果卓越的人像抠图模型、ACE2P人体解析世界冠军模型、动态天空置换算法。\n- 感谢CopyRight@[PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg)、感谢CopyRight@[Zhengxia Zou](https://github.com/jiupinjia/SkyAR)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageSeg_Human.gif\"  width = \"642\" height = \"400\" />\n</div>\n\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/9dis.gif\"  width = \"642\" height = \"200\" />\n</div>\n\n<div align=\"center\">\n\n（第二张动图来自于CopyRight@[jiupinjia/SkyAR](https://github.com/jiupinjia/SkyAR#district-9-ship-video-source)）\n</div>\n\n### 图像分类\n- 包含动物分类、菜品分类、野生动物制品分类，多种算法可选。\n- 感谢CopyRight@[PaddleClas](https://github.com/PaddlePaddle/PaddleClas)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageClas_animal_dish_wild.gif\"  width = \"530\" height = \"400\" />\n</div>\n\n### 文本生成\n- 包含AI写诗、AI对联、AI情话、AI藏头诗，多种算法可选。\n- 感谢CopyRight@[ERNIE](https://github.com/PaddlePaddle/ERNIE)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Textgen_poetry.gif\"  width = \"850\" height = \"400\" />\n</div>\n\n### 词法分析\n- 效果优秀的中文分词、词性标注与命名实体识别的模型。\n- 感谢CopyRight@[LAC](https://github.com/baidu/LAC)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Lexical Analysis.png\"  width = \"640\" height = \"233\" />\n</div>\n\n### 句法分析\n- 效果领先的中文句法分析模型。\n- 感谢CopyRight@[DDParser](https://github.com/baidu/DDParser)提供预训练模型，训练能力开放，欢迎体验。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_SyntacticAnalysis.png\"  width = \"640\" height = \"301\" />\n</div>\n\n### 情感分析\n- 支持中文的评论情感分析。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_SentimentAnalysis.png\"  width = \"640\" height = \"228\" />\n</div>\n\n### 文本审核\n- 
包含中文色情文本的审核，多种算法可选。\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Textreview.png\"  width = \"640\" height = \"140\" />\n</div>\n\n### 语音合成\n- 包含TTS语音合成，多种算法可选。\n- 感谢CopyRight@[Parakeet](https://github.com/PaddlePaddle/Parakeet)提供预训练模型，训练能力开放，欢迎体验。\n- 输入：`Life was like a box of chocolates, you never know what you're gonna get.`\n- 合成效果如下:\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>deepvoice3 </th>\n            <th>fastspeech </th>\n            <th>transformer</th>\n        </tr>\n        <tr>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/deepvoice3_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/fastspeech_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/transformer_tts_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n### 视频分类\n- 包含短视频分类，支持3000+标签种类，可输出TOP-K标签，多种算法可选。\n- 感谢CopyRight@[PaddleVideo](https://github.com/PaddlePaddle/PaddleVideo)提供预训练模型，训练能力开放，欢迎体验。\n- `举例：输入一段游泳的短视频，算法可以输出\"游泳\"结果`\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Video.gif\"  width = \"400\" height = \"400\" />\n</div>\n"
  },
  {
    "path": "docs/docs_en/Makefile",
    "content": "# Minimal makefile for Sphinx documentation\n#\n\n# You can set these variables from the command line, and also\n# from the environment for the first two.\nSPHINXOPTS    ?=\nSPHINXBUILD   ?= sphinx-build\nSOURCEDIR     = .\nBUILDDIR      = _build\n\n# Put it first so that \"make\" without argument is like \"make help\".\nhelp:\n\t@$(SPHINXBUILD) -M help \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n\n.PHONY: help Makefile\n\n# Catch-all target: route all unknown targets to Sphinx using the new\n# \"make mode\" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).\n%: Makefile\n\t@$(SPHINXBUILD) -M $@ \"$(SOURCEDIR)\" \"$(BUILDDIR)\" $(SPHINXOPTS) $(O)\n"
  },
  {
    "path": "docs/docs_en/api/datasets/canvas.rst",
    "content": "==============\nCanvas\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.Canvas(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Dataset for colorization. It contains 1193 pictures in Monet style and 400 pictures in van Gogh style. We collected data from https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/.\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method of preprocessing images.\n    \n    * mode(str)\n        The mode for preparing dataset(train or test). Default to 'train'."
  },
  {
    "path": "docs/docs_en/api/datasets/chnsenticorp.rst",
    "content": "==============\nChnSentiCorp\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.ChnSentiCorp(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    ChnSentiCorp is a dataset for Chinese sentiment classification, which was published by Tan Songbo at ICT of Chinese Academy of Sciences.\n\n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_en/api/datasets/esc50.rst",
    "content": "==============\nESC50\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.ESC50(mode: str = 'train', feat_type: str = 'mel'):\n\n-----------------\n\n    The ESC-50 dataset is a labeled collection of 2000 environmental audio recordings suitable for benchmarking methods of environmental sound classification.\n   \n-----------------\n\n* Args:\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev).\n    \n    * feat_type(:obj:`str`, `optional`, defaults to `mel`):\n        It identifies the input feature type (mel, or raw)."
  },
  {
    "path": "docs/docs_en/api/datasets/flowers.rst",
    "content": "==============\nFlowers\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.Flowers(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Flower classification dataset.\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method of preprocessing images.\n    \n    * mode(str)\n        The mode for preparing dataset(train, test or val). Default to 'train'."
  },
  {
    "path": "docs/docs_en/api/datasets/lcqmc.rst",
    "content": "==============\nLCQMC\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.LCQMC(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    A Large-scale Chinese Question Matching Corpus.\n    \n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_en/api/datasets/minicoco.rst",
    "content": "==============\nMiniCOCO\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.MiniCOCO(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   Dataset for style transfer. The dataset contains 2001 images for training set and 200 images for testing set. They are derived from COCO2014. Meanwhile, it contains 21 different style pictures in file \"21styles\".\n\n-----------------\n\n* Args:\n    * transform(Callable)\n        The method of preprocessing images.\n    \n    * mode(str)\n        The mode for preparing dataset(train or test). Default to 'train'."
  },
  {
    "path": "docs/docs_en/api/datasets/msra_ner.rst",
    "content": "==============\nMSRA_NER\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.MSRA_NER(tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n\n-----------------\n\n    A set of manually annotated Chinese word-segmentation data and specifications for training and testing a Chinese word-segmentation system for research purposes.  For more information please refer to https://www.microsoft.com/en-us/download/details.aspx?id=52531\n\n-----------------\n\n* Args:\n    * tokenizer(:obj:`BertTokenizer` or `CustomTokenizer`)\n        It tokenizes the text and encodes the data as the model needs.\n\n    * max_seq_len(:obj:`int`, `optional`, defaults to 128)\n        The maximum length (in number of tokens) for the inputs to the selected module, such as ernie, bert and so on.\n\n    * mode(:obj:`str`, `optional`, defaults to `train`):\n        It identifies the dataset mode (train, test or dev)."
  },
  {
    "path": "docs/docs_en/api/datasets/opticdisc.rst",
    "content": "==============\nOpticDiscSeg\n==============\n\n.. code-block:: python\n\n    class paddlehub.datasets.OpticDiscSeg(transform: Callable, mode: str = 'train'):\n\n-----------------\n\n   OpticDiscSeg dataset is extracted from iChallenge-AMD(https://ai.baidu.com/broad/subordinate?dataset=amd).\n   \n-----------------\n\n* Args:\n    * transform(Callable)\n        The method of preprocessing images.\n    \n    * mode(str)\n        The mode for preparing dataset(train, test or val). Default to 'train'."
  },
  {
    "path": "docs/docs_en/api/datasets_index.rst",
    "content": "==============\nDatasets\n==============\n\n\nCV\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Canvas<datasets/canvas.rst>\n   Flowers<datasets/flowers.rst>\n   OpticDiscSeg<datasets/opticdisc.rst>\n   MiniCOCO<datasets/minicoco.rst>\n\nNLP\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   ChnSentiCorp<datasets/chnsenticorp.rst>\n   LCQMC<datasets/lcqmc.rst>\n   MSRA_NER<datasets/msra_ner.rst>\n\nAudio\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   ESC50<datasets/esc50.rst>"
  },
  {
    "path": "docs/docs_en/api/env.rst",
    "content": "================\nHub Environment\n================\n\n.. code-block:: console\n\n    HUB_HOME\n    ├── MODULE_HOME\n    ├── CACHE_HOME\n    ├── DATA_HOME\n    ├── CONF_HOME\n    ├── THIRD_PARTY_HOME\n    ├── TMP_HOME\n    └── LOG_HOME\n\n\npaddlehub.env.HUB_HOME\n=========================\n\n    The root directory for storing PaddleHub related data. Default to ~/.paddlehub. Users can change the default value through the HUB_HOME environment variable.\n\npaddlehub.env.MODULE_HOME\n=========================\n\n    Directory for storing the installed PaddleHub Module.\n\npaddlehub.env.CACHE_HOME\n=========================\n\n    Directory for storing the cached data.\n\npaddlehub.env.DATA_HOME\n=========================\n\n    Directory for storing the automatically downloaded datasets.\n\npaddlehub.env.CONF_HOME\n=========================\n\n    Directory for storing the default configuration files.\n\npaddlehub.env.THIRD_PARTY_HOME\n================================\n\n    Directory for storing third-party libraries.\n\npaddlehub.env.TMP_HOME\n=========================\n\n    Directory for storing the temporary files generated during running, such as intermediate products of installing modules, files in this directory will generally be automatically cleared.\n\npaddlehub.env.LOG_HOME\n=========================\n\n    Directory for storing the log files generated during operation, including some non-fatal errors. The log will be stored daily.\n"
  },
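The layout above can be sketched in plain Python. This is an illustrative sketch only — the subdirectory names (`modules`, `cache`, `dataset`, `conf`, `thirdparty`, `tmp`, `log`) are assumptions for illustration, not PaddleHub's actual constants — but it shows how `HUB_HOME` is read from the environment and how every other directory nests beneath it.

```python
import os

# Illustrative sketch: HUB_HOME is read from the environment, with
# ~/.paddlehub as the fallback; all other directories nest beneath it.
# The subdirectory names below are assumptions, not PaddleHub constants.
HUB_HOME = os.path.expanduser(os.environ.get('HUB_HOME', '~/.paddlehub'))

MODULE_HOME = os.path.join(HUB_HOME, 'modules')
CACHE_HOME = os.path.join(HUB_HOME, 'cache')
DATA_HOME = os.path.join(HUB_HOME, 'dataset')
CONF_HOME = os.path.join(HUB_HOME, 'conf')
THIRD_PARTY_HOME = os.path.join(HUB_HOME, 'thirdparty')
TMP_HOME = os.path.join(HUB_HOME, 'tmp')
LOG_HOME = os.path.join(HUB_HOME, 'log')

print(MODULE_HOME)
```

Setting `HUB_HOME` before importing PaddleHub is how the default storage path is redirected.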
  {
    "path": "docs/docs_en/api/module.rst",
    "content": "==============\nModule\n==============\n\n.. code-block:: python\n\n    class paddlehub.Module(\n        name: str = None,\n        directory: str = None,\n        version: str = None,\n        ignore_env_mismatch: bool = False,\n        **kwargs)\n\n-----------------\n\n   In PaddleHub, Module represents an executable module, which usually a pre-trained model that can be used for end-to-end prediction, such as a face detection model or a lexical analysis model, or a pre-trained model that requires finetuning, such as BERT/ERNIE. When loading a Module with a specified name, if the Module does not exist locally, PaddleHub will automatically request the server or the specified Git source to download the resource.\n\n-----------------\n\n* Args:\n    * name(str | optional)\n        Module name.\n\n    * directory(str | optional)\n        Directory of the module to be loaded, only takes effect when the `name` is not specified.\n\n    * version(str | optional)\n        The version limit of the module, only takes effect when the `name` is specified. When the local Module does not meet the specified version conditions, PaddleHub will re-request the server to download the appropriate Module. Default to None, This means that the local Module will be used. If the Module does not exist, PaddleHub will download the latest version available from the server according to the usage environment.\n    \n    * ignore_env_mismatch(bool | optional)\n        Whether to ignore the environment mismatch when installing the Module. Default to False.\n\n**member functions**\n=====================\n\nexport_onnx_model\n------------------\n\n    .. 
code-block:: python\n\n        def export_onnx_model(\n            dirname: str,\n            input_spec: List[paddle.static.InputSpec] = None,\n            include_sub_modules: bool = True,\n            **kwargs):\n\n    Export the model to ONNX format.\n\n    * Args:\n        * dirname(str)\n            The directory to save the onnx model.\n        \n        * input_spec(list)\n            Describes the input of the saved model's forward method, which can be described by InputSpec or example Tensor. If None, all input variables of the original Layer's forward method would be the inputs of the saved model. Defaults to None.\n            \n        * include_sub_modules(bool)\n            Whether to export sub modules. Defaults to True.\n            \n        * \\*\\*kwargs(dict|optional)\n            Other export configuration options for compatibility, some may be removed in the future. Do not use them unless necessary. Refer to https://github.com/PaddlePaddle/paddle2onnx for more information.\n\nsave_inference_model\n----------------------\n\n    .. code-block:: python\n\n        def save_inference_model(\n            dirname: str,\n            model_filename: str = None,\n            params_filename: str = None,\n            input_spec: List[paddle.static.InputSpec] = None,\n            include_sub_modules: bool = True,\n            combined: bool = True):\n\n    Export the model to Paddle Inference format.\n\n    * Args:\n        * dirname(str)\n            The directory to save the inference model.\n\n        * model_filename(str)\n            The name of the saved model file. Defaults to `__model__`.\n\n        * params_filename(str)\n            The name of the saved parameters file, only takes effect when `combined` is True. Defaults to `__params__`.\n\n        * input_spec(list)\n            Describes the input of the saved model's forward method, which can be described by InputSpec or example Tensor. 
If None, all input variables of the original Layer's forward method would be the inputs of the saved model. Defaults to None.\n\n        * include_sub_modules(bool)\n            Whether to export sub modules. Defaults to True.\n        \n        * combined(bool)\n            Whether to save all parameters in a combined file. Defaults to True.\n\nsub_modules\n----------------------\n\n    .. code-block:: python\n\n        def sub_modules(recursive: bool = True):\n\n    Get all sub modules.\n\n    * Args:\n        * recursive(bool): \n            Whether to get sub modules recursively. Defaults to True.\n\n**classmethod**\n=================\n\nget_py_requirements\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def get_py_requirements(cls) -> List[str]:\n\n    Get the Module's Python package dependency list.\n\nload\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def load(cls, directory: str) -> Generic:\n\n    Load the Module object defined in the specified directory.\n\n    * Args:\n        * directory(str): \n            Module directory.\n\nload_module_info\n----------------------\n\n    .. code-block:: python\n\n        @classmethod\n        def load_module_info(cls, directory: str) -> EasyDict:\n\n    Load the module info of the Module defined in the specified directory.\n\n    * Args:\n        * directory(str): \n            Module directory.\n\n**property**\n=================\n\nis_runnable\n-----------------\n\n    .. code-block:: python\n\n        is_runnable\n\n    Whether the Module is runnable; in other words, whether the Module can be executed through the `hub run` command.\n\nname\n-----------------\n\n    .. code-block:: python\n\n        name\n\n    Module name.\n\ndirectory\n-----------------\n\n    .. code-block:: python\n\n        directory\n\n    Directory of Module.\n\nversion\n-----------------\n\n    .. 
code-block:: python\n\n        version\n\n    Module version.\n\ntype\n-----------------\n\n    .. code-block:: python\n\n        type\n\n    Module type.\n\nsummary\n-----------------\n\n    .. code-block:: python\n\n        summary\n\n    Module summary.\n\nauthor\n-----------------\n\n    .. code-block:: python\n\n        author\n\n    The author of the Module.\n\nauthor_email\n-----------------\n\n    .. code-block:: python\n\n        author_email\n\n    The email of the Module author.\n\n.. note::\n    Module is a factory class that is used to automatically download and load user-defined model classes. In addition to the methods and properties above, each Module has its own custom methods or properties. See the corresponding documentation for the relevant definitions.\n"
  },
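The local-first, download-as-fallback behavior described above can be sketched with a toy factory. This is purely illustrative — `FakeModule`, `LOCAL_MODULES`, and `load_module` are hypothetical names for this sketch, not PaddleHub's API:

```python
# Toy sketch of the Module factory behavior: prefer a locally installed
# module that satisfies the version constraint, otherwise fall back to a
# download step (represented here by an exception instead of a request).
class FakeModule:
    def __init__(self, name, version):
        self.name = name
        self.version = version

# Pretend this dict is the index of the local install directory.
LOCAL_MODULES = {('lac', '2.2.0'): FakeModule('lac', '2.2.0')}

def load_module(name, version=None):
    for (n, v), module in LOCAL_MODULES.items():
        if n == name and (version is None or v == version):
            return module
    # A real implementation would request the server / Git source here.
    raise LookupError(f'{name} is not installed locally; it would be downloaded')

print(load_module('lac').version)  # → 2.2.0
```

With `version=None` the locally installed copy wins; a version constraint the local copy cannot satisfy triggers the download path.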
  {
    "path": "docs/docs_en/api/module_decorator.rst",
    "content": "=================\nModule Decorator\n=================\n\nmoduleinfo\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.moduleinfo(\n        name: str,\n        version: str,\n        author: str = None,\n        author_email: str = None,\n        summary: str = None,\n        type: str = None,\n        meta=None) -> Callable:\n\n-----------------\n\n   Mark Module information for a python class, and the class will automatically be extended to inherit HubModule. In other words, python classes marked with moduleinfo can be loaded through hub.Module.\n\n-----------------\n\n* Args:\n    * name(str)\n        Module name.\n    \n    * version(str)\n        Module name.\n\n    * author(str)\n        The author of Module.\n\n    * author_email(str)\n        The email of Module author.\n\n    * summary(str)\n        Module summary.\n\n    * type(str)\n        Module type.\n\n\nmoduleinfo\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.runnable(func: Callable) -> Callable:\n\n-----------------\n\n   Mark a Module method as runnable, when the command `hub run` is used, the method will be called.\n\n-----------------\n\n* Args:\n    * func(Callable)\n        member function of Module.\n\nserving\n============\n\n.. code-block:: python\n\n    def paddlehub.module.module.serving(func: Callable) -> Callable:\n\n-----------------\n\n   Mark a Module method as serving method, when the command `hub serving` is used, the method will be called.\n\n-----------------\n\n* Args:\n    * func(Callable)\n        member function of Module."
  },
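A minimal sketch of how decorators in this style can work — not PaddleHub's actual source: one decorator attaches metadata to the class (the real `moduleinfo` additionally rewrites the class to inherit HubModule), the other tags a method so the command-line entry point can find it later. `DemoModule`, `_module_info`, and `_is_runnable` are names invented for this sketch.

```python
# Sketch of metadata-attaching decorators in the moduleinfo/runnable style.
def moduleinfo(name, version, **meta):
    def wrapper(cls):
        # Attach the metadata to the class; the real decorator also
        # extends the class to inherit HubModule.
        cls._module_info = dict(name=name, version=version, **meta)
        return cls
    return wrapper

def runnable(func):
    func._is_runnable = True  # tag later checked by the `hub run` entry point
    return func

@moduleinfo(name='demo_module', version='1.0.0', summary='toy example')
class DemoModule:
    @runnable
    def run_cmd(self, argv):
        return argv

print(DemoModule._module_info['name'])  # → demo_module
```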
  {
    "path": "docs/docs_en/api/module_manager.rst",
    "content": "=======================\nLocalModuleManager\n=======================\n\n.. code-block:: python\n\n    class paddlehub.module.manager.LocalModuleManager(home: str = MODULE_HOME):\n\n-----------------\n\n   LocalModuleManager is used to manage PaddleHub's local Module, which supports the installation, uninstallation, and search of HubModule. LocalModuleManager is a singleton object related to the path, in other words, when the LocalModuleManager object of the same home directory is generated multiple times, the same object is returned.\n\n-----------------\n\n* Args:\n    * home(str)\n       The directory where PaddleHub modules are stored, the default is ~/.paddlehub/modules\n\n**member functions**\n=====================\n\ninstall\n------------------\n\n   .. code-block:: python\n\n      def install(\n         name: str = None,\n         directory: str = None,\n         archive: str = None,\n         url: str = None,\n         version: str = None,\n         ignore_env_mismatch: bool = False) -> HubModule:\n\n   Install a HubModule from name or directory or archive file or url. When installing with the name parameter, if a module that meets the conditions (both name and version) already installed, the installation step will be skipped. When installing with other parameter, The locally installed modules will be uninstalled.\n\n   * Args:\n      * name(str | optional)\n         module name to install\n\n      * directory(str | optional)\n         directory containing  module code\n\n      * archive(str | optional) \n         archive file containing  module code\n\n      * url(str|optional) \n         url points to a archive file containing module code\n\n      * version(str | optional)\n         module version, use with name parameter\n            \n      * ignore_env_mismatch(str | optional)\n         Whether to ignore the environment mismatch when installing the Module.\n\nuninstall\n------------------\n\n   .. 
code-block:: python\n\n      def uninstall(name: str) -> bool:\n\n   Uninstall a HubModule by name.\n\n   * Args:\n      * name(str)\n         Module name to uninstall.\n\n   * Return:\n      True if uninstalled successfully, else False.\n\nlist\n------------------\n\n   .. code-block:: python\n\n      def list() -> List[HubModule]:\n\n   List all installed HubModules.\n\n   * Return:\n      List of installed HubModules.\n\nsearch\n------------------\n\n   .. code-block:: python\n\n      def search(name: str) -> HubModule:\n\n   Search for a HubModule with the specified name.\n\n   * Args:\n      * name(str)\n         Module name to search for.\n\n   * Return:\n      None if no HubModule with the specified name is found, else the matching HubModule."
  },
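The path-keyed singleton behavior described above can be sketched in a few lines. This is an illustrative sketch, not PaddleHub's actual code; `LocalManagerSketch` is a hypothetical name. Constructing the manager twice with the same home directory returns the very same object.

```python
import os

# Sketch of a path-keyed singleton: one cached instance per home directory.
class LocalManagerSketch:
    _instances = {}

    def __new__(cls, home):
        key = os.path.abspath(os.path.expanduser(home))
        if key not in cls._instances:
            instance = super().__new__(cls)
            instance.home = key
            instance._modules = {}  # would index the installed modules
            cls._instances[key] = instance
        return cls._instances[key]

a = LocalManagerSketch('~/.paddlehub/modules')
b = LocalManagerSketch('~/.paddlehub/modules')
print(a is b)  # → True
```

Keying the cache on the normalized path means `~/.paddlehub/modules` and its expanded absolute form resolve to the same manager.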
  {
    "path": "docs/docs_en/api/trainer.rst",
    "content": "==============\nTrainer\n==============\n\n.. code-block:: python\n\n    class paddlehub.Trainer(\n        model: paddle.nn.Layer,\n        optimizer: paddle.optimizer.Optimizer,\n        use_gpu: bool = False,\n        use_vdl: bool = True,\n        checkpoint_dir: str = None,\n        compare_metrics: Callable = None):\n\n-----------------\n\n   Model trainer.\n\n-----------------\n\n* Args:\n    * model(paddle.nn.Layer)\n        Model to train or evaluate.\n\n    * optimizer(paddle.optimizer.Optimizer)\n        Optimizer for loss.\n        \n    * use_gpu(bool)\n        Whether to use gpu to run.\n\n    * use_vdl(bool)\n        Whether to use visualdl to record training data.\n\n    * checkpoint_dir(str)\n        Directory where the checkpoint is saved, and the trainer will restore the state and model parameters from the checkpoint.\n\n    * compare_metrics(Callable)\n        The method of comparing the model metrics. If not specified, the main metric return by `validation_step` will be used for comparison by default, the larger the value, the better the effect. This method will affect the saving of the best model. If the default behavior does not meet your requirements, please pass in a custom method.\n\n**member functions**\n=====================\n\ntrain\n------------------\n\n    .. 
code-block:: python\n\n        def train(\n            train_dataset: paddle.io.Dataset,\n            epochs: int = 1,\n            batch_size: int = 1,\n            num_workers: int = 0,\n            eval_dataset: paddle.io.Dataset = None,\n            log_interval: int = 10,\n            save_interval: int = 10,\n            collate_fn: Callable = None):\n\n    Train a model with a specific config.\n\n    * Args:\n        * train_dataset(paddle.io.Dataset)\n            Dataset to train the model.\n\n        * epochs(int)\n            Number of training loops, default is 1.\n\n        * batch_size(int)\n            Batch size per step, default is 1.\n\n        * num_workers(int)\n            Number of subprocesses used to load data, default is 0.\n\n        * eval_dataset(paddle.io.Dataset)\n            The validation dataset, default is None. If set, the Trainer will execute the evaluate function every `save_interval` epochs.\n        \n        * log_interval(int)\n            Log the training information every `log_interval` steps.\n\n        * save_interval(int)\n            Save the checkpoint every `save_interval` epochs.\n\n        * collate_fn(Callable)\n            Function to generate mini-batch data by merging the sample list. None to only stack each field of the samples along axis 0 (same as ``np.stack(..., axis=0)``). Defaults to None.\n\nevaluate\n----------------------\n\n    .. 
code-block:: python\n\n        def evaluate(\n            eval_dataset: paddle.io.Dataset,\n            batch_size: int = 1,\n            num_workers: int = 0,\n            collate_fn: Callable = None):\n\n    Run evaluation and return metrics.\n\n    * Args:\n        * eval_dataset(paddle.io.Dataset)\n            The validation dataset.\n        \n        * batch_size(int)\n            Batch size per step, default is 1.\n\n        * num_workers(int)\n            Number of subprocesses used to load data, default is 0.\n        \n        * collate_fn(Callable)\n            Function to generate mini-batch data by merging the sample list. None to only stack each field of the samples along axis 0 (same as ``np.stack(..., axis=0)``). Defaults to None.\n"
  },
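A custom `compare_metrics` callable can be sketched as follows. The exact signature is an assumption for illustration (two metric dicts in, bool out); it mirrors the default behavior described above, where the main metric is compared and larger is better, so the trainer knows when to save a new best model.

```python
# Illustrative sketch of a compare_metrics-style callable (assumed
# signature, not PaddleHub's built-in): return True when the new
# metrics beat the old ones on the main (first) metric.
def compare_metrics(old_metrics: dict, new_metrics: dict) -> bool:
    # Use the first metric as the main metric; larger is better,
    # matching the default behavior described above.
    main_metric = next(iter(old_metrics))
    return new_metrics[main_metric] > old_metrics[main_metric]

print(compare_metrics({'acc': 0.90}, {'acc': 0.92}))  # → True
```

For a loss-like metric where smaller is better, the comparison would simply be inverted before passing the callable to the Trainer.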
  {
    "path": "docs/docs_en/api_index.rst",
    "content": "==============\nAPI Reference\n==============\n\n\nModule\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Module<api/module.rst>\n   LocalModuleManager<api/module_manager.rst>\n   Module Decorator<api/module_decorator.rst>\n\nTransfer Learning\n==================\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Trainer<api/trainer.rst>\n   datasets<api/datasets_index.rst>\n\nEnvironment\n==============\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   env<api/env.rst>"
  },
  {
    "path": "docs/docs_en/community/contribute_code.md",
    "content": "# How to contribution code\n\nPaddleHub welcomes contributors.\n\nFirst of all, feel free to submit a question or pull request if there is something you are unsure about. No one will complain about it. We appreciate contributions of any kind, and don't want to block them with a bunch of rules.\n\nThis document includes all the key points to keep in mind when making contributions. This will speed up the process of merging code and solving problems.\n\nClick Overview for an initial overview.\n\nHere are some simple guidelines for making contributions.\n\n## Submit a Question\n\nWhen you encounter a problem with PaddleHub, you can provide feedback by submitting [issue](https://github.com/PaddlePaddle/PaddleHub/issues).\n\nWhen asking your question, specify the following:\n\n* Fill in the details of the problem according to the problem template so that the reviewer can find the cause of the problem.\n* Problem scenarios (as detailed as possible to reproduce the problem):\n* Error and log messages.\n* Other details that may be useful.\n\n## Submit new feature suggestions/bug fixing\n\n* When adapting a usage scenario, there is always a need for new features. You can either join the discussion of new features, or submit a Pull-Request for new features directly.\n\n* Fork PaddleHub (https://github.com/PaddlePaddle/PaddleHub) under your own github account. After fork, use the git tools (add, commit, pull, push) to submit the PR. 
Then you can submit the pull-request.\n\nTo submit the PR, follow these steps:\n\n### Step 1: Clone your fork of the PaddleHub repository to your local machine.\n\n```\ngit clone https://github.com/USERNAME/PaddleHub\n```\n\n### Step 2: Switch to the remote branch develop.\n\n```\ngit checkout develop\n```\n\n### Step 3: Create a local branch new-feature based on remote branch develop.\n\n```\ngit checkout -b new-feature\n```\n\n### Step 4: Use the pre-commit hook.\n\nPaddleHub developers use the pre-commit tool to manage Git pre-commit hooks. It helps us format the Python source code and automatically check some basic things before committing (for example, that each file ends with a single EOL, and that no large files are added to Git).\n\nThe pre-commit check is part of the unit tests in Travis-CI. PRs whose hooks fail cannot be merged into PaddleHub. First, install and run it in the current directory.\n\n```shell\n➜  pip install pre-commit\n➜  pre-commit install\n```\n\n### Step 5: Develop your requirements on the new-feature branch and commit your changes.\n\n```\ngit commit -m \"add new feature\"\n```\n\n### Step 6: Before you are ready to launch a Pull Request, you need to synchronize the latest codes from the original repository (https://github.com/PaddlePaddle/PaddleHub).\n\nCheck the name of the current remote repository via git remote.\n\n```shell\n➜  git remote\norigin\n➜  git remote -v\norigin\thttps://github.com/USERNAME/PaddleHub (fetch)\norigin\thttps://github.com/USERNAME/PaddleHub (push)\n```\n\nHere, origin is PaddleHub under your own username. 
Next, add the original PaddleHub repository as a remote and name it upstream.\n\n```shell\n➜  git remote add upstream https://github.com/PaddlePaddle/PaddleHub\n➜  git remote\norigin\nupstream\n```\n\nGet the latest code from upstream and update the current branch.\n\n```shell\n➜  git fetch upstream\n➜  git pull upstream develop\n```\n\n### Step 7: Push the local branch new-feature to your own PaddleHub repository.\n\n```\n➜  git push origin new-feature\n```\n\nNow the new-feature branch of your PaddleHub repository contains your latest changes; click \"pull request\" on GitHub to create the pull request.\n\nIf the reviewer gives you feedback that you need to continue fixing the code, you can repeat from step 5 so that all commits are displayed in the same pull request.\n\n## Code Style and Naming Conventions\n\n* PaddleHub follows the Python code naming convention of [PEP8](https://www.python.org/dev/peps/pep-0008/). Try to follow this specification when submitting pull requests. You can use the `flake8` or `pylint` linting tools to help you follow this specification.\n\n## Documentation\n\nDocuments are generated using [sphinx](http://sphinx-doc.org/). Files support the [Markdown](https://guides.github.com/features/mastering-markdown/) and [reStructuredText](http://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) formats. All documents are in the [docs/](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.1/docs) directory.\n\n* Before submitting document changes, please generate the documents locally: `cd docs/ && make clean && make html`. All generated pages can then be found in the `docs/_build/html` directory. 
Please carefully check each WARNING in the generated logs; these very likely indicate broken links or other problems.\n\n* Try to use relative paths when you need links.\n\n## Thanks for Contributing\n\n<p align=\"center\">\n    <a href=\"https://github.com/nepeplwu\"><img src=\"https://avatars.githubusercontent.com/u/45024560?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Steffy-zxf\"><img src=\"https://avatars.githubusercontent.com/u/48793257?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ZeyuChen\"><img src=\"https://avatars.githubusercontent.com/u/1371212?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/ShenYuhan\"><img src=\"https://avatars.githubusercontent.com/u/28444161?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/kinghuin\"><img src=\"https://avatars.githubusercontent.com/u/11913168?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/haoyuying\"><img src=\"https://avatars.githubusercontent.com/u/35907364?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/grasswolfs\"><img src=\"https://avatars.githubusercontent.com/u/23690325?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/sjtubinlong\"><img src=\"https://avatars.githubusercontent.com/u/2063170?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/KPatr1ck\"><img src=\"https://avatars.githubusercontent.com/u/22954146?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jm12138\"><img src=\"https://avatars.githubusercontent.com/u/15712990?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/DesmonDay\"><img src=\"https://avatars.githubusercontent.com/u/20554008?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/adaxiadaxi\"><img src=\"https://avatars.githubusercontent.com/u/58928121?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/chunzhang-hub\"><img src=\"https://avatars.githubusercontent.com/u/63036966?v=4\" width=75 height=75></a>\n    <a 
href=\"https://github.com/linshuliang\"><img src=\"https://avatars.githubusercontent.com/u/15993091?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/eepgxxy\"><img src=\"https://avatars.githubusercontent.com/u/15946195?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/houj04\"><img src=\"https://avatars.githubusercontent.com/u/35131887?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/paopjian\"><img src=\"https://avatars.githubusercontent.com/u/20377352?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/zbp-xxxp\"><img src=\"https://avatars.githubusercontent.com/u/58476312?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/dxxxp\"><img src=\"https://avatars.githubusercontent.com/u/15886898?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/1084667371\"><img src=\"https://avatars.githubusercontent.com/u/50902619?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Channingss\"><img src=\"https://avatars.githubusercontent.com/u/12471701?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Austendeng\"><img src=\"https://avatars.githubusercontent.com/u/16330293?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/BurrowsWang\"><img src=\"https://avatars.githubusercontent.com/u/478717?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cqvu\"><img src=\"https://avatars.githubusercontent.com/u/37096589?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/Haijunlv\"><img src=\"https://avatars.githubusercontent.com/u/28926237?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/holyseven\"><img src=\"https://avatars.githubusercontent.com/u/13829174?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/MRXLT\"><img src=\"https://avatars.githubusercontent.com/u/16594411?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/cclauss\"><img src=\"https://avatars.githubusercontent.com/u/3709715?v=4\" width=75 
height=75></a>\n    <a href=\"https://github.com/hu-qi\"><img src=\"https://avatars.githubusercontent.com/u/17986122?v=4\" width=75 height=75></a>\n    <a href=\"https://github.com/jayhenry\"><img src=\"https://avatars.githubusercontent.com/u/4285375?v=4\" width=75 height=75></a>\n</p>\n\n* Many thanks to [肖培楷](https://github.com/jm12138) for contributing the street scene cartoonization, portrait cartoonization, gesture key point recognition, sky replacement, depth estimation, portrait segmentation and other modules\n* Many thanks to [Austendeng](https://github.com/Austendeng) for fixing the SequenceLabelReader\n* Many thanks to [cclauss](https://github.com/cclauss) for optimizing the Travis-CI check\n* Many thanks to [奇想天外](http://www.cheerthink.com/) for contributing a mask detection demo\n* Many thanks to [mhlwsk](https://github.com/mhlwsk) for fixing the sequence annotation prediction demo\n* Many thanks to [zbp-xxxp](https://github.com/zbp-xxxp) for contributing the modules for viewing pictures and writing poems\n* Many thanks to [zbp-xxxp](https://github.com/zbp-xxxp) and [七年期限](https://github.com/1084667371) for jointly contributing the Mid-Autumn Festival Special Edition Module\n* Many thanks to [livingbody](https://github.com/livingbody) for contributing the style transfer models and Mid-Autumn Festival WeChat Mini Program based on PaddleHub's capabilities\n* Many thanks to [BurrowsWang](https://github.com/BurrowsWang) for fixing the Markdown table display problem\n* Many thanks to [huqi](https://github.com/hu-qi) for fixing a readme typo\n"
  },
  {
    "path": "docs/docs_en/community/more_demos.md",
    "content": "# Interesting cases of third parties\n\nThe following are some of the interesting and practical works created by developers based on PaddleHub in previous PaddleHub courses or events, which are all included in AI Studio and can run online.\n\n1. [Cloth Scissors Rock \\[Face Recognition to Switch Local Window\\] ](https://aistudio.baidu.com/aistudio/projectdetail/507630)\n2. [Fish in the Autumn Water \\[yesok dance background cutout conversion \\& anime style migration\\] ](http://aistudio.baidu.com/aistudio/projectdetail/517066)\n3. [Ninetailskim \\[play retro windows pinball on people's faces\\] ](https://aistudio.baidu.com/aistudio/projectdetail/518861)\n4. [Ura\\_\\_ \\[Monitor Mask, Voice Alerts, Background Recording\\] ](https://aistudio.baidu.com/aistudio/projectdetail/506931)\n5. [Nine Alchemists \\[Shadow stream's Green Frog Cai Xukun, Face Recognition + Headgear + Portrait Segmentation\\] ](https://aistudio.baidu.com/aistudio/projectdetail/505168)\n6. [Seven-year term \\[style migration and local deployment\\] ](https://aistudio.baidu.com/aistudio/projectdetail/520453)\n7. [Fanas Invincible \\[Lipstick Color Test Project\\] ](https://aistudio.baidu.com/aistudio/projectdetail/516520)\n8. [skywalk163 \\[statistics of source code word frequency and word cloud \\& portrait display with paddlehub\\] ](https://aistudio.baidu.com/aistudio/projectdetail/519841)\n9. [AIStudio261428 \\[Face Recognition + Cartoon Emoticons\\]](https://aistudio.baidu.com/aistudio/projectdetail/519616)\n10. [Tudou Sprouts \\[Invite Cartoon Characters to Visit on Children's Day\\]](https://aistudio.baidu.com/aistudio/projectdetail/520925)\n11. [Panda Feeling \\[changing masks\\] ](https://aistudio.baidu.com/aistudio/projectdetail/520996)\n12. [Kly1997 \\[One-key Travel + Sunglasses\\]](https://aistudio.baidu.com/aistudio/projectdetail/518117)\n13. [Loneliness\\_Memo \\[Crossing over to the painting\\] ](https://aistudio.baidu.com/aistudio/projectdetail/516332)\n14. 
[isse7 \\[Creative Projects: Style \"Ghostface\" Change\\]](https://aistudio.baidu.com/aistudio/projectdetail/515307)\n15. [Pda \\[Face Fun Change\\] ](https://aistudio.baidu.com/aistudio/projectdetail/516306)\n16. [Kgkzhiwen \\[My New Clothes\\]](https://aistudio.baidu.com/aistudio/projectdetail/516663)\n17. [Ah Ah Ah Good Good Study \\[Automatic face shape adjustment\\]](https://aistudio.baidu.com/aistudio/projectdetail/513640)\n18. [Tfboy \\[ID Photo Replacement\\]](https://aistudio.baidu.com/aistudio/projectdetail/509443)\n19. [Leigangblog \\[I am a star face\\]](https://aistudio.baidu.com/aistudio/projectdetail/505537)\n20. [wpb3dm \\[Fashion Model Dressup\\]](https://aistudio.baidu.com/aistudio/projectdetail/519349)\n21. [lsvine\\_bai \\[Girlfriend turns into mysterious blonde goddess in seconds\\]](https://aistudio.baidu.com/aistudio/projectdetail/521784)\n22. [Lemonadeqk \\[Easy Stargazing\\]](https://aistudio.baidu.com/aistudio/projectdetail/520488)\n23. [XM1436gr \\[AI for Cartoon Face using PaddleHub Keypoint Detection\\] ](https://aistudio.baidu.com/aistudio/projectdetail/514547)\n24. [Wantzai \\[Everyone is a round-eyed cute boy\\] ](https://aistudio.baidu.com/aistudio/projectdetail/519222)\n25. [Arrowarcher \\[AI One Key Hair Change\\] ](https://aistudio.baidu.com/aistudio/projectdetail/508270)\n26. [WHY197598 \\[Fundamentals of Shifting Objects\\] ](https://aistudio.baidu.com/aistudio/projectdetail/517961)\n27. [SIGNED BY JINGYI \\[Fatigue detection based on paddlehub face key point detection\\]](https://aistudio.baidu.com/aistudio/projectdetail/506024)\n28. [thunder95 \\[PaddleHub Eyes Emotion Poll\\] ](https://aistudio.baidu.com/aistudio/projectdetail/514205)\n29. [Windy Moon C \\[Grave Bouncing Graduation Photo\\]](https://aistudio.baidu.com/aistudio/projectdetail/511253)\n30. [Ruyi\\_Egg \\[left looks like Chow Yun-Fat, right looks like Andy Lau\\] ](https://aistudio.baidu.com/aistudio/projectdetail/507231)\n"
  },
  {
    "path": "docs/docs_en/community_index.rst",
    "content": "===================\nActive community\n===================\n..  toctree::\n    :maxdepth: 2\n    :titlesonly:\n\n    community/more_demos.md\n    \n    community/contribute_code.md\n\n------------\n\n..  toctree::\n    :maxdepth: 2\n    :titlesonly:\n\n    AI Creative Competition I <https://aistudio.baidu.com/aistudio/competition/detail/34>\n\n    AI Creative Competition II <https://aistudio.baidu.com/aistudio/competition/detail/35>\n\n    AI Creative Competition III <https://aistudio.baidu.com/aistudio/competition/detail/72>\n\n    AI Creative Competition IV <https://aistudio.baidu.com/aistudio/competition/detail/79>"
  },
  {
    "path": "docs/docs_en/conf.py",
    "content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n#\nimport os\nimport sys\nfrom recommonmark.transform import AutoStructify\nfrom recommonmark.parser import CommonMarkParser\n\n# sys.path.insert(0, os.path.abspath('../../paddlehub'))\n# import paddlehub as hub\n\n# -- Project information -----------------------------------------------------\n\nproject = 'PaddleHub'\ncopyright = '2021, PaddlePaddle'\nauthor = 'PaddlePaddle'\n\n# The full version, including alpha/beta/rc tags\n# release = 'v{}'.format(hub.__version__)\nrelease = 'v2.1.0'\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. 
They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n    'sphinx.ext.autodoc', 'sphinx.ext.napoleon', 'sphinx.ext.coverage', 'sphinx.ext.viewcode', 'sphinx.ext.mathjax',\n    'sphinx.ext.githubpages', 'recommonmark', 'sphinx_markdown_tables'\n]\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = ['_templates']\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = ['_build', '.DS_Store']\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages.  See the documentation for\n# a list of builtin themes.\n#\nhtml_title = project\nhtml_theme = \"sphinx_materialdesign_theme\"\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = ['_static']\n\n# The suffix(es) of source filenames.\n# You can specify multiple suffix as a list of string:\n#\nsource_parsers = {'.md': CommonMarkParser}\n\nsource_suffix = ['.rst', '.md']\n"
  },
  {
    "path": "docs/docs_en/faq.md",
    "content": "# FAQ\n\n## Failed to install paddlehub via pip\n`Could not find a version that satisfies the requirement paddlehub (from versions: )`\n\nThis may be because pip points to a pypi mirror source, which is not synchronized with the paddlehub version in time.\n\nTry the following command:\n\n```shell\n$ pip install -i https://pypi.org/simple/ paddlehub\n```\n\n## When using paddlehub, raise an Exception like\n`ModuleNotFoundError: No module named 'paddle'`\n\nThis is because PaddleHub depends on PaddlePaddle, and users need to install the appropriate PaddlePaddle version by themselves.\n\nIf the machine does not support GPU, use the following command to install the CPU version of PaddlePaddle:\n\n```shell\n$ pip install paddlepaddle\n```\n\nOr, use the following command to install the GPU version of PaddlePaddle:\n\n```shell\n$ pip install paddlepaddle-gpu\n```\n\n## Cannot download preset datasets and Modules.\n\nYou can use server_check() to check the connection status between the local and remote PaddleHub-Server\n\n```python\nimport paddlehub\npaddlehub.server_check()\n# If the remote PaddleHub-Server can be connected, it will display Request Hub-Server successfully.\n# Otherwise, it will display Request Hub-Server unsuccessfully.\n```\n\n## Does PaddleHub Module support multi-threading to speed up prediction?\n\nDue to the limitations of PaddlePaddle itself, PaddleHub cannot speed up model prediction through multi-threaded concurrent execution.\n\n## How to modify the default storage path of PaddleHub's Module?\n\nSet environment variable ${HUB_HOME}.\n"
  },
  {
    "path": "docs/docs_en/figures.md",
    "content": "## Detailed Features\n\n<a name=\"Various Pre-training Models\"></a>\n\n### 1\\. Various Pre-training Models\n\n#### 1.1. Image\n\n|                      | **Examples of Boutique Models**                              |\n| -------------------- | :----------------------------------------------------------- |\n| Image Classification | [Dish Identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_dishes&en_category=ImageClassification), [Animal Identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification), [Animal Identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification), [-->More](../modules/image/classification/README.md) |\n| Object Detection     | [Universal Detection](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_coco2017&en_category=ObjectDetection), [Pedestrian Detection](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_pedestrian&en_category=ObjectDetection), [Vehicle Detection](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_vehicles&en_category=ObjectDetection), [-->More](../modules/image/object_detection/README.md) |\n| Face Detection       | [Face Detection](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server&en_category=FaceDetection), [Mask Detection](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server_mask&en_category=FaceDetection), [-->More](../modules/image/face_detection/README.md) |\n| Image Segmentation   | [Portrait Segmentation](https://www.paddlepaddle.org.cn/hubdetail?name=deeplabv3p_xception65_humanseg&en_category=ImageSegmentation), [Body Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation), [Pneumonia CT Imaging Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=Pneumonia_CT_LKM_PP&en_category=ImageSegmentation), 
[-->More](../modules/image/semantic_segmentation/README.md) |\n| Key Point Detection  | [Body Key Points](https://www.paddlepaddle.org.cn/hubdetail?name=human_pose_estimation_resnet50_mpii&en_category=KeyPointDetection), [Face Key Points](https://www.paddlepaddle.org.cn/hubdetail?name=face_landmark_localization&en_category=KeyPointDetection), [Hands Key Points](https://www.paddlepaddle.org.cn/hubdetail?name=hand_pose_localization&en_category=KeyPointDetection), [-->More](../modules/image/keypoint_detection/README.md) |\n| Text Recognition     | [Ultra Lightweight Chinese \\& English OCR Text Recognition](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition), [-->More](../modules/image/text_recognition/README.md) |\n| Image Generation     | [Style Migration](https://www.paddlepaddle.org.cn/hubdetail?name=stylepro_artistic&en_category=GANs), [Street View Cartoon](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_hayao_99&en_category=GANs), [-->More](../modules/image/Image_gan/README.md) |\n| Image Editing        | [Super Resolution](https://www.paddlepaddle.org.cn/hubdetail?name=realsr&en_category=ImageEditing), [B\\&W Color](https://www.paddlepaddle.org.cn/hubdetail?name=deoldify&en_category=ImageEditing), [-->More](../modules/image/Image_editing/README.md) |\n\n#### 1.2. Text\n\n|                    | **Examples of Boutique Models**                              |\n| ------------------ | :----------------------------------------------------------- |\n| Word Analysis      | [Linguistic Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis), [Syntactic Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=ddparser&en_category=SyntacticAnalysis), [-->More](../modules/text/lexical_analysis/README.md) |\n| Sentiment Analysis | [Emotion Judgment](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis), [Emotion 
Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=emotion_detection_textcnn&en_category=SentimentAnalysis), [-->More](../modules/text/sentiment_analysis/README.md) |\n| Text Review        | [Porn Review](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_gru&en_category=TextCensorship), [-->More](../modules/text/text_review/README.md) |\n| Text Generation    | [Poetic Couplet Generation](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_tiny_couplet&en_category=TextGeneration), [Poetry Generation](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_poetry&en_category=TextGeneration), [Popular Love Letters](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_lover_words&en_category=TextGeneration), [-->More](../modules/text/text_generation/README.md) |\n| Semantic Models    | [ERNIE](https://www.paddlepaddle.org.cn/hubdetail?name=ERNIE&en_category=SemanticModel), [Text Similarity](https://www.paddlepaddle.org.cn/hubdetail?name=simnet_bow&en_category=SemanticModel), [-->More](../modules/text/language_model/README.md) |\n\n#### 1.3. Speech\n\n|                | **Examples of Boutique Models**                           |\n| -------------- | :-------------------------------------------------------- |\n| Text-to-speech | [Text-to-speech](https://www.paddlepaddle.org.cn/hubdetail?name=deepvoice3_ljspeech&en_category=TextToSpeech), [-->More](../modules/audio/README.md) |\n\n#### 1.4. Video\n\n|                      | **Examples of Boutique Models**                              |\n| -------------------- | :----------------------------------------------------------- |\n| Video Classification | [Video Classification](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=VideoClassification), [-->More](../modules/video/README.md) |\n\n<a name=\"One-key Model Prediction\"></a>\n\n### 2\\. 
One-key Model Prediction\n\n* For example, if you use the lightweight Chinese OCR model chinese\\_ocr\\_db\\_crnn\\_mobile for text recognition, you can quickly recognize the text in an image with a single command.\n\n```shell\n$ pip install paddlehub\n$ wget https://paddlehub.bj.bcebos.com/model/image/ocr/test_ocr.jpg\n$ hub run chinese_ocr_db_crnn_mobile --input_path test_ocr.jpg --visualization=True\n```\n\n* The prediction result images are stored in the ocr\\_result folder under the current path, as shown in the following figure.\n\n<p align=\"center\">\n <img src=\"./imgs/ocr_res.jpg\" width='70%' align=\"middle\" >\n</p>\n\n* Use the lexical analysis model LAC for word segmentation.\n\n```shell\n$ hub run lac --input_text \"现在，慕尼黑再保险公司不仅是此类行动的倡议者，更是将其大量气候数据整合进保险产品中，并与公众共享大量天气信息，参与到新能源领域的保障中。\"\n[{\n    'word': ['现在', '，', '慕尼黑再保险公司', '不仅', '是', '此类', '行动', '的', '倡议者', '，', '更是', '将', '其', '大量', '气候', '数据', '整合', '进', '保险', '产品', '中', '，', '并', '与', '公众', '共享', '大量', '天气', '信息', '，', '参与', '到', '新能源', '领域', '的', '保障', '中', '。'],\n    'tag':  ['TIME', 'w', 'ORG', 'c', 'v', 'r', 'n', 'u', 'n', 'w', 'd', 'p', 'r', 'a', 'n', 'n', 'v', 'v', 'n', 'n', 'f', 'w', 'c', 'p', 'n', 'v', 'a', 'n', 'n', 'w', 'v', 'v', 'n', 'n', 'u', 'vn', 'f', 'w']\n}]\n```\n\nIn addition to one-line-of-code prediction, PaddleHub also supports invoking models through its API. For details, refer to the detailed documentation of each model.\n\n<a name=\"One-Key Model to Service\"></a>\n\n### 3\\. One-Key to deploy Models as Services\n\nPaddleHub provides a convenient model-to-service capability to deploy HTTP services for models with one simple command. The LAC lexical analysis service can be started quickly with the following command:\n\n```shell\n$ hub serving start -m lac\n```\n\nFor more instructions on using Model Serving, see PaddleHub One-Key Model Serving Deployment.\n\n<a name=\"Transfer Learning within ten lines of Codes\"></a>\n\n### 4\\. 
Transfer Learning within Ten Lines of Code\n\nWith the Fine-tune API, deep learning models can be transferred to computer vision scenarios with just a few lines of code.\n\n* The [Demo Examples](../demo) provide rich example code for the Fine-tune API, including [Image Classification](../demo/image_classification), [Image Coloring](../demo/colorization), [Style Migration](../demo/style_transfer), and other scenario model migration examples.\n\n<p align=\"center\">\n <img src=\"../../imgs/paddlehub_finetune.gif\" align=\"middle\" >\n</p>\n<p align='center'>\n Transfer Learning within Ten Lines of Code\n</p>\n\n* For a quick online experience, click [PaddleHub Tutorial Collection](https://aistudio.baidu.com/aistudio/projectdetail/231146) to use the GPU computing power provided by the AI Studio platform for a quick attempt.\n"
  },
  {
    "path": "docs/docs_en/finetune/audio_classification.md",
    "content": "# 声音分类\n\n本示例展示如何使用PaddleHub Fine-tune API以及CNN14等预训练模型完成声音分类和Tagging的任务。\n\nCNN14等预训练模型的详情，请参考论文[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)和代码[audioset_tagging_cnn](https://github.com/qiuqiangkong/audioset_tagging_cnn)。\n\n\n## 如何开始Fine-tune\n\n我们以环境声音分类公开数据集[ESC50](https://github.com/karolpiczak/ESC-50)为示例数据集，可以运行下面的命令，在训练集（train.npz）上进行模型训练，并在开发集（dev.npz）验证。通过如下命令，即可启动训练。\n\n```python\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n```python\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task='sound-cls', num_class=ESC50.num_class)\n```\n\n其中，参数：\n- `name`: 模型名称，可以选择`panns_cnn14`、`panns_cnn10` 和`panns_cnn6`，具体的模型参数信息可见下表。\n- `version`: module版本号\n- `task`：模型的执行任务。`sound-cls`表示声音分类任务；`None`表示Audio Tagging任务。\n- `num_classes`：表示当前声音分类任务的类别数，根据具体使用的数据集确定。\n\n目前可选用的预训练模型：\n模型名      | PaddleHub Module\n-----------| :------:\nCNN14      | `hub.Module(name='panns_cnn14')`\nCNN10      | `hub.Module(name='panns_cnn10')`\nCNN6       | `hub.Module(name='panns_cnn6')`\n\n### Step2: 加载数据集\n\n```python\ntrain_dataset = ESC50(mode='train')\ndev_dataset = ESC50(mode='dev')\n```\n\n### Step3: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 
主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=50,\n    batch_size=16,\n    eval_dataset=dev_dataset,\n    save_interval=10,\n)\ntrainer.evaluate(dev_dataset, batch_size=16)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: works的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用训好的模型对它进行分类，输出结果。\n\n```python\nimport os\n\nimport librosa\n\nimport paddlehub as hub\nfrom paddlehub.datasets import ESC50\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ncheckpoint = './best_model/model.pdparams'  # 模型checkpoint\n\nlabel_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\nmodel = hub.Module(name='panns_cnn14',\n                    version='1.0.0',\n                    task='sound-cls',\n                    num_class=ESC50.num_class,\n                    label_map=label_map,\n                    load_checkpoint=checkpoint)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\nprint(result[0])  # result[0]包含音频文件属于各类别的概率值\n```\n\n\n## Audio Tagging\n\n当前使用的模型是基于[Audioset数据集](https://research.google.com/audioset/)的预训练模型，除了以上的针对特定声音分类数据集的finetune任务，模型还支持基于Audioset 527个标签的Tagging功能。\n\n以下代码将本地的音频文件`./cat.wav`作为预测数据，使用预训练模型对它进行打分，输出top 
10的标签和对应的得分。\n\n```python\nimport os\n\nimport librosa\nimport numpy as np\n\nimport paddlehub as hub\nfrom paddlehub.env import MODULE_HOME\n\n\nwav = './cat.wav'  # 存储在本地的需要预测的wav文件\nsr = 44100  # 音频文件的采样率\ntopk = 10  # 展示音频得分前10的标签和分数\n\n# 读取audioset数据集的label文件\nlabel_file = os.path.join(MODULE_HOME, 'panns_cnn14', 'audioset_labels.txt')\nlabel_map = {}\nwith open(label_file, 'r') as f:\n    for i, l in enumerate(f.readlines()):\n        label_map[i] = l.strip()\n\nmodel = hub.Module(name='panns_cnn14', version='1.0.0', task=None, label_map=label_map)\n\ndata = [librosa.load(wav, sr=sr)[0]]\nresult = model.predict(data, sample_rate=sr, batch_size=1, feat_type='mel', use_gpu=True)\n\n# 打印topk的类别和对应得分\nmsg = ''\nfor label, score in list(result[0].items())[:topk]:\n    msg += f'{label}: {score}\\n'\nprint(msg)\n```\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.1.0\n"
  },
  {
    "path": "docs/docs_en/finetune/customized_dataset.md",
    "content": "# Customized Data\n\nIn the training of a new task, it is a time-consuming process starting from zero, without producing the desired results probably. You can use the pre-training model provided by PaddleHub for fine-tune of a specific task. You just need to perform pre-processing of the customized data accordingly, and then input the pre-training model to get the corresponding results. Refer to the following for setting up the structure of the dataset.\n\n## I. Image Classification Dataset\n\nWhen migrating a classification task using custom data with PaddleHub, you need to slice the dataset into training set, validation set, and test set.\n\n### Data Preparation\n\nThree text files are needed to record the corresponding image paths and labels, plus a label file to record the name of the label.\n\n```\n├─data:\n  ├─train_list.txt：\n  ├─test_list.txt：\n  ├─validate_list.txt：\n  ├─label_list.txt：\n  └─...\n```\n\nThe format of the data list file for the training/validation/test set is as follows:\n\n```\nPath-1  label-1\nPath-2  label-2\n...\n```\n\nThe format of label\\_list.txt is:\n\n```\nClassification 1\nClassification 2\n...\n```\n\nExample: Take [Flower Dataset](../reference/dataset.md) as an example, train\\_list.txt/test\\_list.txt/validate\\_list.txt:\n\n```\nroses/8050213579_48e1e7109f.jpg 0\nsunflowers/45045003_30bbd0a142_m.jpg 3\ndaisy/3415180846_d7b5cced14_m.jpg 2\n```\n\nlabel\\_list.txt reads as follows.\n\n```\nroses\ntulips\ndaisy\nsunflowers\ndandelion\n```\n\n### Dataset Loading\n\nFor the preparation code of dataset, see [flowers.py](../../paddlehub/datasets/flowers.py). `hub.datasets.Flowers()` It automatically downloads the dataset from the network and unzip it into the `$HOME/.paddlehub/dataset` directory. 
Specific usage:\n\n```python\nfrom paddlehub.datasets import Flowers\n\nflowers = Flowers(transforms)\nflowers_validate = Flowers(transforms, mode='val')\n```\n\n* `transforms`: Data pre-processing mode.\n* `mode`: Select data mode. Options are `train`, `test`, `val`. Default is `train`.\n\n## II. Image Coloring Dataset\n\nWhen migrating a coloring task using custom data with PaddleHub, you need to split the dataset into training set and test set.\n\n### Data Preparation\n\nYou need to divide the color images for coloring training and testing into training set data and test set data.\n\n```\n├─data:\n  ├─train\n      |-folder1\n      |-folder2\n      |-...\n      |-pic1\n      |-pic2\n      |-...\n\n  ├─test\n    |-folder1\n    |-folder2\n    |-...\n    |-pic1\n    |-pic2\n    |-...\n  └─...\n```\n\nExample: PaddleHub provides users with a coloring dataset, the `Canvas` dataset. It consists of 1193 images in Monet style and 400 images in Van Gogh style. Taking the [Canvas Dataset](../reference/datasets.md) as an example, the contents of the train folder are as follows:\n\n```\n├─train\n      |-monet\n          |-pic1\n          |-pic2\n          |-...\n      |-vango\n          |-pic1\n          |-pic2\n          |-...\n```\n\n### Dataset Loading\n\nFor the dataset preparation code, refer to [canvas.py](../../paddlehub/datasets/canvas.py). `hub.datasets.Canvas()` automatically downloads the dataset from the network and unzips it into the `$HOME/.paddlehub/dataset` directory. Specific usage:\n\n```python\nfrom paddlehub.datasets import Canvas\n\ncolor_set = Canvas(transforms, mode='train')\n```\n\n* `transforms`: Data pre-processing mode.\n* `mode`: Select data mode. Options are `train`, `test`. The default is `train`.\n\n## III. 
Style Migration Dataset\n\nWhen using custom data for style migration tasks with PaddleHub, you need to split the dataset into training set and test set.\n\n### Data Preparation\n\nYou need to split the color images for style migration into training set and test set data.\n\n```\n├─data:\n  ├─train\n      |-folder1\n      |-folder2\n      |-...\n      |-pic1\n      |-pic2\n      |-...\n\n  ├─test\n    |-folder1\n    |-folder2\n    |-...\n    |-pic1\n    |-pic2\n    |-...\n  |-21styles\n    |-pic1\n    |-pic2\n  └─...\n```\n\nExample: PaddleHub provides users with a style migration dataset, the `MiniCOCO` dataset. The training set data and test set data are from COCO2014; there are 2001 images in the training set and 200 images in the test set. There are 21 images with different styles in the `21styles` folder. Users can change the images of different styles as needed. Taking the [MiniCOCO Dataset](../reference/datasets.md) as an example, the content of the train folder is as follows:\n\n```\n├─train\n      |-train\n          |-pic1\n          |-pic2\n          |-...\n      |-test\n          |-pic1\n          |-pic2\n          |-...\n      |-21styles\n          |-pic1\n          |-pic2\n          |-...\n```\n\n### Dataset Loading\n\nFor the dataset preparation code, refer to [minicoco.py](../../paddlehub/datasets/minicoco.py). `hub.datasets.MiniCOCO()` automatically downloads the dataset from the network and unzips it into the `$HOME/.paddlehub/dataset` directory. Specific usage:\n\n```python\nfrom paddlehub.datasets import MiniCOCO\n\ncolor_set = MiniCOCO(transforms, mode='train')\n```\n\n* `transforms`: Data pre-processing mode.\n* `mode`: Select data mode. Options are `train`, `test`. The default is `train`.\n"
  },
  {
    "path": "docs/docs_en/finetune/image_classification.md",
    "content": "# 图像分类\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```shell\n$ hub run resnet50_vd_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用resnet50_vd_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransforms = T.Compose([T.Resize((256, 256)),\n                        T.CenterCrop(224),\n                        T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                        to_rgb=True)\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Flowers\n\nflowers = Flowers(transforms)\n\nflowers_validate = Flowers(transforms, mode='val')\n```\n\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name=\"resnet50_vd_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n```\n* `name`: 选择预训练模型的名字。\n* `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\nPaddleHub提供许多图像分类预训练模型，如xception、mobilenet、efficientnet等，详细信息参见[图像分类模型](https://www.paddlepaddle.org.cn/hub?filter=en_category&value=ImageClassification)。\n\n如果想尝试efficientnet模型，只需要更换Module中的`name`参数即可.\n```python\n# 更换name参数即可无缝切换efficientnet模型, 代码示例如下\nmodel = hub.Module(name=\"efficientnetb7_imagenet\")\n```\n**NOTE:**目前部分模型还没有完全升级到2.0版本，敬请期待。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, 
checkpoint_dir='img_classification_ckpt')\n\ntrainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等, 其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n\n    model = hub.Module(name='resnet50_vd_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n    result = model.predict(['flower.jpg'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线分类任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m resnet50_vd_imagenet_ssld\n```\n\n这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return 
base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\n\ndata = {'images':[cv2_to_base64(org_im)], 'top_k':2}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/resnet50_vd_imagenet_ssld\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata =r.json()[\"results\"]['data']\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleCV/image_classification\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/finetune/image_colorization.md",
    "content": "# 图像着色\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run user_guided_colorization --input_path \"/PATH/TO/IMAGE\"\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用user_guided_colorization模型对[Canvas](../../docs/reference/datasets.md#class-hubdatasetsCanvas)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='NEAREST'),\n                       T.RandomPaddingCrop(crop_size=176),\n                       T.RGB2LAB()], to_rgb=True)\n```\n\n`transforms`数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n**NOTE:** 要将`T.Compose`中`to_rgb`设定为True.\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import Canvas\n\ncolor_set = Canvas(transform=transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test` 默认为`train`。\n\n数据集的准备代码可以参考 [canvas.py](../../paddlehub/datasets/canvas.py)。`hub.datasets.Canvas()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='user_guided_colorization', load_checkpoint=None)\nmodel.set_config(classification=True, prob=1)\n```\n* `name`:加载模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n* `classification`: 着色模型分两部分训练，开始阶段`classification`设置为True, 用于浅层网络训练。训练后期将`classification`设置为False, 用于训练网络的输出层。\n* `prob`: 每张输入图不加一个先验彩色块的概率，默认为1，即不加入先验彩色块。例如，当`prob`设定为0.9时，一张图上有两个先验彩色块的概率为(1-0.9)*(1-0.9)*0.9=0.009.\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\ntrainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, 
`Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-4；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证过程所用的数据集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n    model.set_config(prob=0.1)\n    result = model.predict(images=['house.png'])\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。若想获取油画风着色效果，请下载参数文件[油画着色](https://paddlehub.bj.bcebos.com/dygraph/models/canvas_rc.pdparams)\n\n**Args**\n* `images`:原始图像路径或者BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认为'result'。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线着色任务服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m user_guided_colorization\n```\n\n这样就完成了一个着色任务服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef 
cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/user_guided_colorization\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0]['fake_reg'])\ncv2.imwrite('color.png', data)\n\n```\n\n### 查看代码\n\nhttps://github.com/richzhang/colorization-pytorch\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/finetune/semantic_segmentation.md",
    "content": "# 图像分割\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ocrnet_hrnetw18_voc模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform， mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* 
`log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n**Args**\n* `images`:原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m ocrnet_hrnetw18_voc\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ocrnet_hrnetw18_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/finetune/sequence_labeling.md",
    "content": "# 序列标注\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 如何开始Fine-tune\n\n\n我们以微软亚洲研究院发布的中文实体识别数据集MSRA-NER为示例数据集，可以运行下面的命令，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）验证。通过如下命令，即可启动训练。\n\n```shell\n# 设置使用的GPU卡号\nexport CUDA_VISIBLE_DEVICES=0\npython train.py\n```\n\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n\n在命名实体识别的任务中，因不同的数据集标识实体的标签不同，评测的方式也有所差异。因此，在初始化模型的之前，需要先确定实际标签的形式，下方的`label_list`则是MSRA-NER数据集中使用的标签类别。  \n如果用户使用的实体识别的数据集的标签方式与MSRA-NER不同，则需要自行根据数据集确定。\n```python\nlabel_list = hub.datasets.MSRA_NER.label_list\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n```\n\n接下来创建任务所使用的`model`\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.1', task='token-cls', label_map=label_map)\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`token-cls`，表示序列标注任务。\n* `label_map`：数据集中的标签信息，实体识别任务中需要根据不同标签种类对模型性能进行评价。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持序列标注任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese      
          | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(nane='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(nane='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English             | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上的一行代码，`model`初始化为一个适用于序列标注任务的模型，为ERNIE Tiny的预训练模型后拼接上一个输出token共享的全连接网络（Full Connected）。  \n![](https://ss1.bdstatic.com/70cFuXSh_Q1YnxGkpoWK1HF6hhy/it/u=224484727,3049769188&fm=15&gp=0.jpg)\n\n以上图片来自于：https://arxiv.org/pdf/1810.04805.pdf\n\n### Step2: 下载并加载数据集\n\n```python\ntrain_dataset = hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = 
hub.datasets.MSRA_NER(\n    tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。 PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png)\n![](https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png)\n\n### Step3:  选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='test_ernie_token_cls', use_gpu=False)\n\ntrainer.train(train_dataset, epochs=3, batch_size=32, eval_dataset=dev_dataset)\n\n# 在测试集上评估当前训练模型\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n#### 优化策略\n\nPaddle2.0-rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`, `AdamW`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`AdamW`:\n\n* `learning_rate`: 全局学习率。默认为1e-3；\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用GPU训练，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 
模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们以以下数据为待预测数据，使用该模型来进行预测\n\n```text\n去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？\n新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。\n根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。\n华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。\n```\n\n```python\nimport paddlehub as hub\n\nsplit_char = \"\\002\"\nlabel_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\ntext_a = [\n    '去年十二月二十四日，市委书记张敬涛召集县市主要负责同志研究信访工作时，提出三问：『假如上访群众是我们的父母姐妹，你会用什么样的感情对待他们？',\n    '新华社北京5月7日电国务院副总理李岚清今天在中南海会见了美国前商务部长芭芭拉·弗兰克林。',\n    '根据测算，海卫1表面温度已经从“旅行者”号探测器1989年造访时的零下236摄氏度上升到零下234摄氏度。',\n    '华裔作家韩素音女士曾三次到大足，称“大足石窟是一座未被开发的金矿”。',\n]\ndata = [[split_char.join(text)] for text in text_a]\nlabel_map = {\n    idx: label for idx, label in enumerate(label_list)\n}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.1',\n    task='token-cls',\n    load_checkpoint='./token_cls_save_dir/best_model/model.pdparams',\n    label_map=label_map,\n)\n\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(text_a):\n    print(f'Data: {text} \\t Lable: {\", \".join(results[idx][1:len(text)+1])}')\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/finetune/style_transfer.md",
    "content": "# 风格迁移\n\n本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n\n## 命令行预测\n\n```\n$ hub run msgnet --input_path \"/PATH/TO/ORIGIN/IMAGE\" --style_path \"/PATH/TO/STYLE/IMAGE\"\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用msgnet模型对[MiniCOCO](../../docs/reference/datasets.md#class-hubdatasetsMiniCOCO)等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nimport paddlehub.vision.transforms as T\n\ntransform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n```\n\n`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets.minicoco import MiniCOCO\n\nstyledata = MiniCOCO(transform=transform, mode='train')\n\n```\n* `transforms`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`， 默认为`train`。\n\n数据集的准备代码可以参考 [minicoco.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.MiniCOCO()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='msgnet', load_checkpoint=None)\n```\n* `name`: 选择预训练模型的名字。\n* `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\ntrainer.train(styledata, epochs=101, batch_size=4, eval_dataset=styledata, log_interval=10, save_interval=10)\n```\n\n#### 优化策略\n\nPaddle2.0rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。默认为1e-4；\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* 
`train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n    result = model.predict(origin=[\"venice-boat.jpg\"], style=\"candy.jpg\", visualization=True, save_path ='style_tranfer')\n```\n\n参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n**Args**\n* `origin`:原始图像路径或BGR格式图片；\n* `style`: 风格图像路径；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'style_tranfer'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线风格迁移服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m msgnet\n```\n\n这样就完成了一个风格迁移服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\nstyle_im = cv2.imread('/PATH/TO/STYLE/IMAGE')\ndata = {'images':[[cv2_to_base64(org_im)], cv2_to_base64(style_im)]}\nheaders = 
{\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/msgnet\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\ndata = base64_to_cv2(r.json()[\"results\"]['data'][0])\ncv2.imwrite('style.png', data)\n```\n\n### 查看代码\n\nhttps://github.com/zhanghang1989/PyTorch-Multi-Style-Transfer\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/finetune/text_matching.md",
    "content": "# 文本匹配\n\n在2017年之前，工业界和学术界对NLP文本处理依赖于序列模型[Recurrent Neural Network (RNN)](https://baike.baidu.com/item/%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C/23199490?fromtitle=RNN&fromid=5707183&fr=aladdin).\n\n![](http://colah.github.io/posts/2015-09-NN-Types-FP/img/RNN-general.png)\n\n近年来随着深度学习的发展，模型参数数量飞速增长，为了训练这些参数，需要更大的数据集来避免过拟合。然而，对于大部分NLP任务来说，构建大规模的标注数据集成本过高，非常困难，特别是对于句法和语义相关的任务。相比之下，大规模的未标注语料库的构建则相对容易。最近的研究表明，基于大规模未标注语料库的预训练模型（Pretrained Models, PTM) 能够习得通用的语言表示，将预训练模型Fine-tune到下游任务，能够获得出色的表现。另外，预训练模型能够避免从零开始训练模型。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/327f44ff3ed24493adca5ddc4dc24bf61eebe67c84a6492f872406f464fde91e)\n\n\n本示例将展示如何使用PaddleHub Transformer模型（如 ERNIE、BERT、RoBERTa等模型）Module 以动态图方式fine-tune并完成预测任务。\n\n## 文本匹配\n\n使用预训练模型ERNIE完成文本匹配任务，大家可能会想到将query和title文本拼接，之后输入ERNIE中，取`CLS`特征（pooled_output），之后输出全连接层，进行二分类。如下图ERNIE用于句对分类任务的用法：\n\n![](https://camo.githubusercontent.com/5e1867ee2b6fc3a0f94c7b2c87a4d987fed4c440d4d9c80726e5798900880027/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f34353434303032396330373234306164383964363635633562313736653633323937653935383465316461323465303262373964643534666239393066373461)\n\n然而，以上用法的问题在于，ERNIE的模型参数非常庞大，导致计算量非常大，预测的速度也不够理想。从而达不到线上业务的要求。针对该问题，使用Sentence Transformer网络可以优化计算量。\n\nSentence Transformer采用了双塔（Siamese）的网络结构。Query和Title分别输入Transformer网络，共享网络参数，得到各自的token embedding特征。之后对token embedding进行pooling（此处教程使用mean pooling操作），之后输出分别记作u，v。之后将三个表征（u,v,|u-v|)拼接起来，进行二分类。网络结构如下图所示。\n\n![](https://camo.githubusercontent.com/80e65553f0c82886a27897a0a151ee9745e6e2def310d6649c8a68e2672c06c2/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f31303339393837303365313334613731383438383335313161353338363230653136666564303435653236313464636338616661636563343436363030343338)\n\n更多关于Sentence Transformer的信息可以参考论文：https://arxiv.org/abs/1908.10084\n\n## 
如何开始Fine-tune\n\n\n我们以中文文本匹配数据集LCQMC为示例数据集，可以运行下面的命令，在训练集（train.tsv）上进行模型训练，并在开发集（dev.tsv）验证和测试集测试（test.tsv）。通过如下命令，即可启动训练。\n\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 选择模型\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='ernie_tiny', version='2.0.2', task='text-matching')\n```\n\n其中，参数：\n\n* `name`：模型名称，可以选择`ernie`，`ernie_tiny`，`bert-base-cased`， `bert-base-chinese`, `roberta-wwm-ext`，`roberta-wwm-ext-large`等。\n* `version`：module版本号\n* `task`：fine-tune任务。此处为`text-matching`，表示文本匹配任务。\n\nPaddleHub还提供BERT等模型可供选择, 当前支持文本分类任务的模型对应的加载示例如下：\n\n模型名                           | PaddleHub Module\n---------------------------------- | :------:\nERNIE, Chinese                     | `hub.Module(name='ernie')`\nERNIE tiny, Chinese                | `hub.Module(name='ernie_tiny')`\nERNIE 2.0 Base, English            | `hub.Module(name='ernie_v2_eng_base')`\nERNIE 2.0 Large, English           | `hub.Module(name='ernie_v2_eng_large')`\nBERT-Base, English Cased           | `hub.Module(name='bert-base-cased')`\nBERT-Base, English Uncased         | `hub.Module(name='bert-base-uncased')`\nBERT-Large, English Cased          | `hub.Module(name='bert-large-cased')`\nBERT-Large, English Uncased        | `hub.Module(name='bert-large-uncased')`\nBERT-Base, Multilingual Cased      | `hub.Module(nane='bert-base-multilingual-cased')`\nBERT-Base, Multilingual Uncased    | `hub.Module(nane='bert-base-multilingual-uncased')`\nBERT-Base, Chinese                 | `hub.Module(name='bert-base-chinese')`\nBERT-wwm, Chinese                  | `hub.Module(name='chinese-bert-wwm')`\nBERT-wwm-ext, Chinese              | `hub.Module(name='chinese-bert-wwm-ext')`\nRoBERTa-wwm-ext, Chinese           | `hub.Module(name='roberta-wwm-ext')`\nRoBERTa-wwm-ext-large, Chinese     | `hub.Module(name='roberta-wwm-ext-large')`\nRBT3, Chinese                      | `hub.Module(name='rbt3')`\nRBTL3, Chinese                     | `hub.Module(name='rbtl3')`\nELECTRA-Small, English         
    | `hub.Module(name='electra-small')`\nELECTRA-Base, English              | `hub.Module(name='electra-base')`\nELECTRA-Large, English             | `hub.Module(name='electra-large')`\nELECTRA-Base, Chinese              | `hub.Module(name='chinese-electra-base')`\nELECTRA-Small, Chinese             | `hub.Module(name='chinese-electra-small')`\n\n通过以上的一行代码，`model`初始化为一个适用于文本匹配任务的双塔（Siamese）结构模型，。\n\n\n### Step2: 下载并加载数据集\n\n```python\ntrain_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='train')\ndev_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='dev')\ntest_dataset = LCQMC(tokenizer=model.get_tokenizer(), max_seq_len=128, mode='test')\n```\n\n* `tokenizer`：表示该module所需用到的tokenizer，其将对输入文本完成切词，并转化成module运行所需模型输入格式。\n* `mode`：选择数据模式，可选项有 `train`, `dev`, `test`，默认为`train`。\n* `max_seq_len`：ERNIE/BERT模型使用的最大序列长度，若出现显存不足，请适当调低这一参数。\n\n预训练模型ERNIE对中文数据的处理是以字为单位，tokenizer作用为将原始输入文本转化成模型model可以接受的输入数据形式。 PaddleHub 2.0中的各种预训练模型已经内置了相应的tokenizer，可以通过`model.get_tokenizer`方法获取。\n\n\n### Step3:  选择优化策略和运行配置\n\n```python\noptimizer = paddle.optimizer.AdamW(learning_rate=5e-5, parameters=model.parameters())\ntrainer = hub.Trainer(model, optimizer, checkpoint_dir='./', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `AdamW`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/Overview_cn.html)。\n\n其中`AdamW`:\n\n- `learning_rate`: 全局学习率。默认为1e-3；\n- `parameters`: 待优化模型参数。\n\n其余可配置参数请参考[AdamW](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/adamw/AdamW_cn.html#cn-api-paddle-optimizer-adamw)。\n\n#### 运行配置\n\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n- `model`: 被优化模型；\n- `optimizer`: 优化器选择；\n- `use_vdl`: 是否使用vdl可视化训练过程；\n- `checkpoint_dir`: 保存模型参数的地址；\n- `compare_metrics`: 保存最优模型的衡量指标；\n\n\n### Step4: 执行训练和模型评估\n\n```python\ntrainer.train(\n    train_dataset,\n    epochs=10,\n    batch_size=32,\n    eval_dataset=dev_dataset,\n    
save_interval=2,\n)\ntrainer.evaluate(test_dataset, batch_size=32)\n```\n\n`trainer.train`执行模型的训练，其参数可以控制具体的训练过程，主要的参数包含：\n\n- `train_dataset`: 训练时所用的数据集；\n- `epochs`: 训练轮数；\n- `batch_size`: 训练时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size；\n- `num_workers`: works的数量，默认为0；\n- `eval_dataset`: 验证集；\n- `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n- `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n`trainer.evaluate`执行模型的评估，主要的参数包含：\n\n- `eval_dataset`: 模型评估时所用的数据集；\n- `batch_size`: 模型评估时每一步用到的样本数目，如果使用GPU，请根据实际情况调整batch_size\n\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n以下代码将使用最优模型来进行预测：\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个表情叫什么', '这个猫的表情叫什么'],\n    ['什么是智能手环', '智能手环有什么用'],\n    ['介绍几本好看的都市异能小说，要完结的！', '求一本好看点的都市异能小说，要完结的'],\n    ['一只蜜蜂落在日历上（打一成语）', '一只蜜蜂停在日历上（猜一成语）'],\n    ['一盒香烟不拆开能存放多久？', '一条没拆封的香烟能存放多久。'],\n]\nlabel_map = {0: 'similar', 1: 'dissimilar'}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.2',\n    task='text-matching',\n    load_checkpoint='./checkpoint/best_model/model.pdparams',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=128, batch_size=1, use_gpu=True)\nfor idx, texts in enumerate(data):\n    print('TextA: {}\\tTextB: {}\\t Label: {}'.format(texts[0], texts[1], results[idx]))\n```\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "docs/docs_en/get_start/installation.rst",
    "content": "============\nInstallation\n============\n\n\nEnvironment Dependency\n========================\n\n* Operating System: Windows/Mac/Linux\n* Python >= 3.6.2\n* PaddlePaddle >= 2.0.0\n\n.. code-block:: shell\n\n    # Install gpu version of paddlepaddle\n    pip install paddlepaddle-gpu -U\n\n    # Or install cpu version of paddlepaddle\n    # pip install paddlepaddle -U\n\nInstallation Command\n========================\n\nBefore installing the PaddleHub, install the PaddlePaddle deep learning framework first. For more installation instructions, refer to `PaddleQuickInstall <https://www.paddlepaddle.org.cn/install/quick>`_.\n\n.. code-block:: shell\n\n    pip install paddlehub==2.1.0\n\nIn addition to the above dependences, PaddleHub's pre-training models and pre-set datasets need to be downloaded through connecting to the server. Make sure that the computer can access the network. You can run PaddleHub offline if the relevant datasets and pre-set models are already available locally.\n\n.. note::\n    Make sure that the computer can access the external network when the PaddleHub is used to download datasets and pre-training models. You can use `server_check()` to check the connection status between the local and remote PaddleHub-Server in the following methods:\n\n.. code-block:: Python\n\n    import paddlehub\n    paddlehub.server_check()\n    # If OK, reply \"Request Hub-Server successfully\".\n    # If not OK, reply \"Hub-Server unsuccessfully\"."
  },
  {
    "path": "docs/docs_en/get_start/linux_quickstart.md",
    "content": "# Zero base Linux installation and image style transfer\n\n## Step 1: Install Anaconda\n\n- Note: To use paddlepaddle, you need to install the Python environment first. Here we choose the Python integrated environment Anaconda toolkit.\n  - Anaconda is a commonly used python package management program.\n  - After installing Anaconda, you can install the python environment and the toolkit environment required by numpy.\n  \n- **Download Anaconda**：\n  \n  - Download address: https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n    - <img src=\"../../imgs/Install_Related/linux/anaconda_download.png\" akt=\"anaconda download\" width=\"800\" align=\"center\"/>\n    - Select the version appropriate for your operating system\n      - You can enter `uname -m` at the terminal to query the instruction set used by the system\n    \n  - Download method 1: Download locally, and then transfer the installation package to the Linux server\n  \n  - Download method 2: directly use the Linux command line to download\n  \n    - ```shell\n      # Install wget first\n      sudo apt-get install wget  # Ubuntu\n      sudo yum install wget  # CentOS\n      ```\n  \n    - ```shell\n      # Then use wget to download from Tsinghua Source\n      # To download Anaconda3-2021.05-Linux-x86_64.sh, the download command is as follows:\n      wget https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/Anaconda3-2021.05-Linux-x86_64.sh\n      \n      # If you want to download other versions, you need to change the last/last file name to the version you want to download\n      ```\n  \n- To install Anaconda:\n\n  - At the command line, enter `sh Anaconda3-2021.05-Linux-x86_64.sh`\n    - If you download another version, replace the file name of the command with the file name you downloaded\n  - Just follow the installation prompts\n    - When viewing the license, you can enter q to exit\n  \n- **Add conda to the environment variable**\n\n  - The environment variable is added 
to enable the system to recognize the conda command. If you already added conda to the PATH during installation, you can skip this step\n\n  - Open `~/.bashrc` in the terminal:\n\n    - ```shell\n      # Enter the following command in the terminal:\n      vim ~/.bashrc\n      ```\n\n  - Add conda as an environment variable in `~/.bashrc`:\n\n    - ```shell\n      # Press i first to enter editing mode\n      # On the first line, enter:\n      export PATH=\"~/anaconda3/bin:$PATH\"\n      # If the installation location was customized during installation, change ~/anaconda3/bin to the bin folder under the customized installation directory\n      ```\n\n    - ```shell\n      # The modified ~/.bashrc file should look as follows (where xxx is the user name):\n      export PATH=\"~/anaconda3/bin:$PATH\"\n      # >>> conda initialize >>>\n      # !! Contents within this block are managed by 'conda init' !!\n      __conda_setup=\"$('/home/xxx/anaconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)\"\n      if [ $? -eq 0 ]; then\n          eval \"$__conda_setup\"\n      else\n          if [ -f \"/home/xxx/anaconda3/etc/profile.d/conda.sh\" ]; then\n              . \"/home/xxx/anaconda3/etc/profile.d/conda.sh\"\n          else\n              export PATH=\"/home/xxx/anaconda3/bin:$PATH\"\n          fi\n      fi\n      unset __conda_setup\n      # <<< conda initialize <<<\n      ```\n\n    - After modification, press the `esc` key to exit editing mode, then type `:wq!` and press Enter to save and exit\n\n  - Verify that the conda command is recognized:\n\n    - Enter `source ~/.bashrc` in the terminal to update the environment variables\n    - Then enter `conda info --envs` on the terminal. 
If the current base environment is displayed, conda has been added to the environment variables\n\n## Step 2: Create a conda environment\n\n- Create a new conda environment\n\n  - ```shell\n    # On the command line, enter the following command to create an environment named paddle_env\n    # The Tsinghua mirror is used here to speed up the download\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/\n    ```\n\n  - This command creates an environment named paddle_env with Python 3.8. It will take a while depending on the network status\n\n  - The command line will then print a prompt; enter y and press Enter to continue the installation\n\n    - <img src=\"../../imgs/Install_Related/linux/conda_create.png\" alt=\"conda_create\" width=\"500\" align=\"center\"/>\n\n- Activate the newly created conda environment by entering the following command on the command line:\n\n  - ```shell\n    # Activate the paddle_env environment\n    conda activate paddle_env\n    ```\n    \n  - The Anaconda and Python environments are now installed\n\n## Step 3: Install the libraries required by the program\n\n- Use the pip command to install PaddlePaddle in the newly activated environment:\n\n  - ```shell\n    # On the command line, enter the following command:\n    # The CPU version is installed by default. 
The Baidu mirror is recommended when installing PaddlePaddle\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n  \n- After installing PaddlePaddle, continue to install PaddleHub in the paddle_env environment:\n\n  - ```shell\n    # On the command line, enter the following command:\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - Introduction document of PaddleHub: https://github.com/PaddlePaddle/PaddleHub/blob/develop/README.md\n  \n  - When installing PaddleHub, other dependent libraries are installed automatically, which may take a while\n\n## Step 4: Download the model\n\n- After installing PaddleHub, download the style transfer model:\n\n  - ```shell\n    # Enter the following command on the command line\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - Description document of the model: [https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/linux/hub_model_intro.png\" alt=\"hub model intro\" width=\"800\" align=\"center\"/>\n\n## Step 5: Prepare the data and code for style transfer\n\n### Prepare the style transfer data\n\n- Create the working directory `style_transfer` under the user directory `~`\n\n  - ```shell\n    # Enter the following command in the terminal:\n    cd ~  # Enter the user directory\n    mkdir style_transfer  # Create the style_transfer folder\n    cd style_transfer  # Enter the style_transfer directory\n    ```\n\n- Place the picture to be converted and the style picture:\n\n  - Place the picture to be converted at `~/style_transfer/pic.jpg`\n    - <img src=\"../../imgs/Install_Related/linux/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - Place the style picture at `~/style_transfer/fangao.jpg`\n    - <img 
src=\"../../imgs/Install_Related/linux/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### Code\n\n- Create code file:\n\n  - ```shell\n    # The following commands are executed on the command line\n    $ pwd # Check whether the current directory is style_transfer， if not, enter: cd ~/style_transfer\n    $ touch style_transfer.py  # Create an empty file\n    $ vim style_transfer.py  # Open code file with vim editor\n    # Enter i first to enter editing mode\n    # Copy the code into the vim editor\n    # Press esc key to exit editing mode, then enter \": wq\" and enter Enter to save and exit\n    ```\n    \n  - ```python\n    # Code\n    import paddlehub as hub\n    import cv2\n    \n    # Relative address of the picture to be converted\n    picture = './pic.jpg'\n    # Relative address of style picture\n    style_image = './fangao.jpg'\n    \n    # Create a style transfer network and load parameters\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    \n    # Read in pictures and start style conversion\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- Running code:\n\n  - On the command line, enter `python style_transfer.py`\n  - When the program executes, a new folder `transfer_result` will be created, and save the converted file to this directory\n  - The output pictures are as follows:\n    - <img src=\"../../imgs/Install_Related/linux/output_img.png\" alt=\"output image\" width=\"600\" align=\"center\"/>\n\n## Step 6: Explore the pre training model of flying oars\n- Congratulations, the installation and introduction cases of PaddleHub in the Linux environment will be completed here. 
Start your more in-depth learning model exploration journey quickly.[【More model exploration, jump to the official website of PaddlePaddle】](https://www.paddlepaddle.org.cn/hublist)\n\n"
  },
  {
    "path": "docs/docs_en/get_start/mac_quickstart.md",
    "content": "# Zero base mac installation and image style transfer\n\n## Step 1: Install Anaconda\n\n- Note: To use paddlepaddle, you need to install the Python environment first. Here we choose the Python integrated environment Anaconda toolkit\n  - Anaconda is a commonly used python package management program\n  - After installing Anaconda, you can install the python environment and the toolkit environment required by numpy\n- Anaconda Download:\n  - Link: https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n  - <img src=\"../../imgs/Install_Related/mac/anaconda_start.png\" alt=\"anaconda download\" width=\"800\" align=\"center\"/>\n  - Select the lowest `Anaconda3-2021.05-MacOSX-x86_64.pkg` download\n- After downloading, double click the. pkg file to enter the graphical interface\n  - By default, the installation will take a while\n- It is recommended to install a code editor such as vscode or pycharm\n\n## Step 2: Open the terminal and create a conda environment\n\n- Open terminal\n\n  - Press the command key and the space bar at the same time, enter \"terminal\" in the focus search, and double-click to enter the terminal\n\n- **Add conda to the environment variable**\n\n  - The environment variable is added to enable the system to recognize the conda command\n\n  - Enter the following command to open `~/.bash_profile`：\n\n    - ```shell\n      vim ~/.bash_profile\n      ```\n\n  - In `~/.bash_profile` add conda as an environment variable:\n\n    - ```shell\n      # Press i first to enter editing mode\n      # On the first line, enter:\n      export PATH=\"~/opt/anaconda3/bin:$PATH\"\n      # If the installation location is customized during installation, change ~/opt/anaconda3/bin to the bin folder under the customized installation directory\n      ```\n\n    - ```shell\n      # Modified ~/.bash_profile file should be as follows (where xxx is the user name):\n      export PATH=\"~/opt/anaconda3/bin:$PATH\"\n      # >>> conda initialize >>>\n      
# !! Contents within this block are managed by 'conda init' !!\n      __conda_setup=\"$('/Users/xxx/opt/anaconda3/bin/conda' 'shell.bash' 'hook' 2> /dev/null)\"\n      if [ $? -eq 0 ]; then\n          eval \"$__conda_setup\"\n      else\n          if [ -f \"/Users/xxx/opt/anaconda3/etc/profile.d/conda.sh\" ]; then\n              . \"/Users/xxx/opt/anaconda3/etc/profile.d/conda.sh\"\n          else\n              export PATH=\"/Users/xxx/opt/anaconda3/bin:$PATH\"\n          fi\n      fi\n      unset __conda_setup\n      # <<< conda initialize <<<\n      ```\n\n    - After modifying, press the `esc` key to exit editing mode, then type `:wq!` and press Enter to save and exit\n\n  - Verify that the conda command is recognized:\n\n    - Enter `source ~/.bash_profile` to update the environment variables\n    - Then enter `conda info --envs` in the terminal. If the current base environment is displayed, conda has been added to the environment variables\n\n- Create a new conda environment\n\n  - ```shell\n    # On the command line, enter the following command to create an environment named paddle_env\n    # The Tsinghua mirror is used here to speed up the download\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/\n    ```\n\n  - This command will create an executable environment named paddle_env with Python version 3.8. 
It will take a while, depending on network conditions\n\n  - Then the command line will output a prompt message; enter y and press Enter to continue the installation\n\n    - <img src=\"../../imgs/Install_Related/mac/conda_create.png\" alt=\"conda_create\" width=\"600\" align=\"center\"/>\n\n- Activate the newly created conda environment by entering the following command on the command line:\n\n  - ```shell\n    # Activate paddle_env environment\n    conda activate paddle_env\n    # View the current python location\n    which python\n    ```\n\n  - <img src=\"../../imgs/Install_Related/mac/conda_activate.png\" alt=\"conda_activate\" width=\"600\" align=\"center\"/>\n\n  - At this point, the Anaconda and Python environments are installed\n\n## Step 3: Install the libraries required by the program\n\n- Use the pip command to install PaddlePaddle in the newly activated environment:\n\n  - ```shell\n    # Enter the following command on the command line\n    # Confirm whether the currently used pip is the pip in the paddle_env environment\n    which pip\n    # The CPU version is installed by default. 
Baidu Source is recommended when installing PaddlePaddle\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n\n- After installing PaddlePaddle, continue to install PaddleHub in the paddle_env environment:\n\n  - ```shell\n    # Enter the following command on the command line\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - PaddleHub documentation: https://github.com/PaddlePaddle/PaddleHub/blob/develop/README.md\n  \n  - When installing PaddleHub, its dependencies are installed automatically, which may take a while\n\n## Step 4: Download the style transfer model\n\n- After installing PaddleHub, download the style transfer model:\n\n  - ```shell\n    # Enter the following command on the command line\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - Description document of the model: [https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/mac/hub_model_intro.png\" alt=\"hub model intro\" width=\"800\" align=\"center\"/>\n\n## Step 5: Prepare the data and code for style transfer\n\n### Prepare the style transfer data\n\n- Create a working directory `style_transfer` on the desktop\n\n  - ```shell\n    # Enter the following commands in the terminal:\n    cd ~/Desktop  # Enter the desktop\n    mkdir style_transfer  # Create the style_transfer folder\n    cd style_transfer  # Enter the style_transfer directory\n    ```\n\n- Place the picture to be converted and the style picture:\n\n  - Place the picture to be converted at `style_transfer/pic.jpg` on the desktop\n    - <img src=\"../../imgs/Install_Related/mac/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - Place the style picture at `style_transfer/fangao.jpg` on the desktop\n    - <img 
src=\"../../imgs/Install_Related/mac/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### Code\n\n- In the `style_transfer` directory, create the code file `style_transfer.py`\n\n- Copy the following code into `style_transfer.py`\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n    \n    # Relative address of the picture to be converted\n    picture = './pic.jpg'\n    # Relative address of the style picture\n    style_image = './fangao.jpg'\n    \n    # Create a style transfer network and load parameters\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    \n    # Read in the pictures and start the style transfer\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- If you do not have a code editor such as VSCode, you can use the command line instead:\n\n  - ```shell\n    pwd  # Check whether the current directory is style_transfer; if not, enter: cd ~/Desktop/style_transfer\n    touch style_transfer.py  # Create an empty file\n    vim style_transfer.py  # Open the code file with the vim editor\n    # Press i first to enter editing mode\n    # Copy the above code into the vim editor\n    # Press the esc key to exit editing mode, then type \":wq\" and press Enter to save and exit\n    ```\n\n- Run the code:\n\n  - On the command line, enter `python style_transfer.py`\n  - When the program finishes, a new folder `transfer_result` will be created and the converted picture will be saved there\n  - The output pictures are as follows:\n    - <img src=\"../../imgs/Install_Related/mac/output_img.png\" alt=\"output image\" width=\"600\" align=\"center\"/>\n\n## Step 6: Explore PaddlePaddle's pre-trained models\n- Congratulations, you have completed the installation and introductory example of PaddleHub in the macOS environment. 
Now quickly start your journey of exploring more deep learning models. [More models to explore on the PaddlePaddle official website](https://www.paddlepaddle.org.cn/hublist)\n\n\n\n\n"
  },
  {
    "path": "docs/docs_en/get_start/python_use_hub.rst",
    "content": "=================\nQuick experience\n=================\n\nIn PaddleHub, the concept `Module` represents an executable module, which is usually a pre-trained model that can be used for end-to-end prediction, such as a face detection model or a lexical analysis model, or a pre-trained model that requires finetuning, such as BERT/ERNIE. \n\nPaddleHub adopts a model-as-software design concept: all pre-trained models are managed like Python packages, each with its own version.\n\nThis tutorial gives you a quick experience of using the pre-trained models provided by PaddleHub to complete several common AI tasks.\n\nDownload test image\n=============================================\n\nFirst of all, let's download a test image through `wget`.\n\n.. code-block:: shell\n\n    # Download a picture\n    wget https://paddlehub.bj.bcebos.com/resources/test_image.jpg\n\n.. image:: ../../imgs/humanseg_test.png\n    :width: 300px\n\nHuman segmentation\n=============================================\n\nHuman segmentation aims to distinguish the person from the background in the input image. This task has many application scenarios, such as background blur, background replacement, film and television post-processing, etc. We use `humanseg_lite <https://www.paddlepaddle.org.cn/hubdetail?name=humanseg_lite&en_category=ImageSegmentation>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"humanseg_lite\", version=\"1.1.1\")\n    module = hub.Module(name=\"humanseg_lite\")\n\n    res = module.segmentation(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='humanseg_output')\n\n.. image:: ../../imgs/output_8_3.png\n    :width: 300px\n\nHuman parsing\n=============================================\n\nHuman parsing is a finer-grained task than human segmentation. It aims to extract and distinguish different parts of the human body in the input photo. 
It is often used with many emerging GAN models. Application scenarios include beauty, changing clothes, etc. We use `ace2p <https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"ace2p\", version=\"1.1.0\")\n    module = hub.Module(name=\"ace2p\")\n\n    res = module.segment(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='ace2p_output')\n\n.. image:: ../../imgs/output_12_3.png\n    :width: 300px\n\nFace Detection \n=============================================\n\nThe task of face detection aims to detect all the faces in the input picture and give the specific coordinates and size of the faces (generally called the bounding box). Application scenarios include video surveillance, traffic analysis, etc. We use `ultra_light_fast_generic_face_detector_1mb_640 <https://www.paddlepaddle.org.cn/hubdetail?name=ultra_light_fast_generic_face_detector_1mb_640&en_category=FaceDetection>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\", version=\"1.1.2\")\n    module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n\n    res = module.face_detection(\n        paths = [\"./test_image.jpg\"], \n        visualization=True, \n        output_dir='face_detection_output')\n\n.. image:: ../../imgs/output_15_3.png\n    :width: 300px\n\nKey Point Detection\n=============================================\n\nThe key point detection task aims to identify different key points of each human body in the input image, such as the head, shoulders, joints, etc. The number of key points that can be identified is also different depending on the model's ability. This task can be used for human body beauty, human body gesture recognition and other tasks. 
We use `openpose_body_estimation <https://www.paddlepaddle.org.cn/hubdetail?name=openpose_body_estimation&en_category=KeyPointDetection>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # module = hub.Module(name=\"openpose_body_estimation\", version=\"1.0.0\")\n    module = hub.Module(name=\"openpose_body_estimation\")\n\n    res = module.predict(\n        img=\"./test_image.jpg\", \n        visualization=True, \n        save_path='keypoint_output')\n\n.. image:: ../../imgs/output_18_2.png\n    :width: 300px\n\nLexical analysis of Chinese\n=============================================\n\nChinese lexical analysis aims to perform tasks such as word segmentation, part-of-speech analysis, and named entity recognition on the input Chinese sentences. We use `lac <https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # lac = hub.Module(name=\"lac\", version=\"2.2.0\")\n    lac = hub.Module(name=\"lac\")\n\n    test_text = [\"1996年，曾经是微软员工的加布·纽维尔和麦克·哈灵顿一同创建了Valve软件公司。他们在1996年下半年从id software取得了雷神之锤引擎的使用许可，用来开发半条命系列。\"]\n    print(lac.lexical_analysis(texts = test_text))\n    \n----------------\n\n    [{'word': ['1996年', '，', '曾经', '是', '微软', '员工', '的', '加布·纽维尔', '和', '麦克·哈灵顿', '一同', '创建', '了', 'Valve软件公司', '。', '他们', '在', '1996年下半年', '从', 'id', ' ', 'software', '取得', '了', '雷神之锤', '引擎', '的', '使用', '许可', '，', '用来', '开发', '半条命', '系列', '。'], 'tag': ['TIME', 'w', 'd', 'v', 'ORG', 'n', 'u', 'PER', 'c', 'PER', 'd', 'v', 'u', 'ORG', 'w', 'r', 'p', 'TIME', 'p', 'nz', 'w', 'n', 'v', 'u', 'n', 'n', 'u', 'vn', 'vn', 'w', 'v', 'v', 'n', 'n', 'w']}]\n\nSentiment analysis of Chinese\n=============================================\n\nChinese sentiment analysis aims to analyze the sentiment tendency of input Chinese sentences. 
We use `senta_bilstm <https://www.paddlepaddle.org.cn/hubdetail?name=senta_bilstm&en_category=SentimentAnalysis>`_ to show this feature.\n\n.. code-block:: python\n\n    import paddlehub as hub\n    # senta = hub.Module(name=\"senta_bilstm\", version=\"1.2.0\")\n    senta = hub.Module(name=\"senta_bilstm\")\n\n    test_text = [\"味道不错，确实不算太辣，适合不能吃辣的人。就在长江边上，抬头就能看到长江的风景。鸭肠、黄鳝都比较新鲜。\"]\n    print(senta.sentiment_classify(texts = test_text))\n\n----------------\n\n    [{'text': '味道不错，确实不算太辣，适合不能吃辣的人。就在长江边上，抬头就能看到长江的风景。鸭肠、黄鳝都比较新鲜。', 'sentiment_label': 1, 'sentiment_key': 'positive', 'positive_probs': 0.9771, 'negative_probs': 0.0229}]"
  },
  {
    "path": "docs/docs_en/get_start/windows_quickstart.md",
    "content": "# Zero-base Windows installation and image style transfer\n\n## Step 1: Install Anaconda\n\n- Note: To use PaddlePaddle, you need to install a Python environment first. Here we choose Anaconda, an integrated Python environment and toolkit\n  - Anaconda is a commonly used Python package manager\n  - After installing Anaconda, you can conveniently install Python and the toolkits it requires, such as numpy.\n- Anaconda Download:\n  - Link: https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/?C=M&O=D\n  - Most Win10 computers run a 64-bit operating system; choose the x86_64 version. If the computer runs a 32-bit operating system, select the x86 version\n  - <img src=\"../../imgs/Install_Related/windows/Anaconda_download.png\" alt=\"anaconda download\" width=\"800\" align=\"center\"/>\n  - After downloading, double-click the installer to enter the graphical interface\n  - The default installation location is disk C; it is recommended to change the installation location to disk D:\n    - <img src=\"../../imgs/Install_Related/windows/anaconda_install_folder.png\" alt=\"install config\" width=\"500\" align=\"center\"/>\n  - Check the option to add conda to the environment variables, and ignore the warning:\n    - <img src=\"../../imgs/Install_Related/windows/anaconda_install_env.png\" alt=\"add conda to path\" width=\"500\" align=\"center\"/>\n\n## Step 2: Open the terminal and create a conda environment\n\n- Open the Anaconda Prompt terminal\n  - Windows Start Menu -> Anaconda3 -> Anaconda Prompt\n  - <img src=\"../../imgs/Install_Related/windows/anaconda_prompt.png\" alt=\"anaconda download\" width=\"300\" align=\"center\"/>\n\n\n- Create a new conda environment\n\n  - ```shell\n    # On the command line, enter the following command to create an environment named paddle_env\n    # The Tsinghua mirror is used here to speed up the download\n    conda create --name paddle_env python=3.8 --channel https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/  # a shell 
command\n    ```\n\n  - This command will create an executable environment named paddle_env with Python version 3.8. It will take a while, depending on network conditions\n\n  - Then the command line will output a prompt message; enter y and press Enter to continue the installation\n\n  - <img src=\"../../imgs/Install_Related/windows/conda_new_env.png\" alt=\"conda create\" width=\"700\" align=\"center\"/>\n\n- Activate the newly created conda environment by entering the following command on the command line:\n\n  - ```shell\n    # Activate paddle_env environment\n    conda activate paddle_env\n    # View the current python location\n    where python\n    ```\n\n  - <img src=\"../../imgs/Install_Related/windows/conda_list_env.png\" alt=\"create environment\" width=\"600\" align=\"center\"/>\n\n  - At this point, the Anaconda and Python environments are installed\n\n## Step 3: Install the libraries required by the program\n\n- Use the pip command to install PaddlePaddle in the environment you just activated\n\n  - ```shell\n    # Enter the following command on the command line\n    # Confirm whether the currently used pip is the pip in the paddle_env environment\n    where pip\n    # The CPU version is installed by default. 
Baidu Source is recommended when installing PaddlePaddle\n    pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - If you need to install the GPU version, please open the [Paddle official website](https://www.paddlepaddle.org.cn/) and select the appropriate version.\n\n    - Paddle official website: https://www.paddlepaddle.org.cn/\n    - Since CUDA and cuDNN need to be configured before installing the GPU version, it is recommended to install the GPU version only after you are familiar with the basics\n\n- After installing PaddlePaddle, continue to install PaddleHub in the paddle_env environment:\n\n  - ```shell\n    # Enter the following command on the command line\n    pip install paddlehub -i https://mirror.baidu.com/pypi/simple\n    ```\n\n  - PaddleHub documentation: https://github.com/PaddlePaddle/PaddleHub/blob/develop/README.md\n\n## Step 4: Download the style transfer model\n\n- After installing PaddleHub, download the style transfer model:\n\n  - ```shell\n    # Enter the following command on the command line\n    hub install stylepro_artistic==1.0.1\n    ```\n\n  - Description document of the model: [https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value=%7B%22scenes%22%3A%5B%22GANs%22%5D%7D](https://www.paddlepaddle.org.cn/hubsearch?filter=en_category&value={\"scenes\"%3A[\"GANs\"]})\n\n  - <img src=\"../../imgs/Install_Related/windows/paddlehub_modulelist.png\" alt=\"model introduction\" width=\"700\" align=\"center\"/>\n\n## Step 5: Prepare the data and code for style transfer\n\n### Prepare the style transfer data\n\n- Switch the working directory to `D:\\style_transfer` by entering the following commands on the command line\n\n  - ```shell\n    # Enter the following commands on the command line\n    # Switch the current working directory to the root directory of disk D\n    D:\n    # Create the style_transfer directory\n    mkdir style_transfer\n    # Switch the current directory to the style_transfer directory\n    cd 
style_transfer\n    ```\n\n- Place the picture to be converted and the style picture\n  - Place the picture to be converted at `D:\\style_transfer\\pic.jpg`\n    - <img src=\"../../imgs/Install_Related/windows/pic.jpg\" alt=\"pic.jpg\" width=\"400\" align=\"center\"/>\n  - Place the style picture at `D:\\style_transfer\\fangao.jpg`\n    - <img src=\"../../imgs/Install_Related/windows/fangao.jpg\" alt=\"fangao.jpg\" width=\"350\" align=\"center\"/>\n\n### Code\n\n- In `D:\\style_transfer`, create the code file `style_transfer.py`\n\n  - If you do not have an editor such as VSCode, you can use Notepad to create a txt file first, and then change the file name to `style_transfer.py`\n\n- Copy the following code into `style_transfer.py`\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n  \n    # The absolute address of the picture to be converted\n    picture = 'D:\\\\style_transfer\\\\pic.jpg'  # Note that double backslashes are used in the code\n  \n    # Absolute address of the style picture\n    style_image = 'D:\\\\style_transfer\\\\fangao.jpg'\n  \n    # Create a style transfer network and load parameters\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n  \n    # Read in the pictures and start the style transfer\n    result = stylepro_artistic.style_transfer(\n                        images=[{'content': cv2.imread(picture),\n                                 'styles': [cv2.imread(style_image)]}],\n                        visualization=True\n    )\n    ```\n\n- Run the code:\n\n  - On the command line, enter `python style_transfer.py`\n  - When the program finishes, a new folder `transfer_result` will be created and the converted picture will be saved there.\n  - The output picture is as follows:\n    - <img src=\"../../imgs/Install_Related/windows/after_transfer.png\" alt=\"transferred image\" width=\"600\" align=\"center\"/>\n\n## Step 6: Explore PaddlePaddle's pre-trained models\n- Congratulations, the installation and introduction cases of 
PaddleHub in the Windows environment are now complete. Now quickly start your journey of exploring more deep learning models. [More models to explore on the PaddlePaddle official website](https://www.paddlepaddle.org.cn/hublist)\n\n"
  },
  {
    "path": "docs/docs_en/get_start_index.rst",
    "content": "===============================\nGet started with PaddleHub\n===============================\n\n.. toctree::\n   :maxdepth: 2\n\n   get_start/installation.md\n   get_start/python_use_hub.rst"
  },
  {
    "path": "docs/docs_en/index.rst",
    "content": "===========================\nIntroduction to PaddleHub\n===========================\n\nWelcome to PaddleHub! This is an awesome toolkit of pre-trained models based on PaddlePaddle, aimed at reducing the cost of using AI models and promoting the development of AI communities. Whether you are a senior developer in the AI industry or a curious newcomer who knows nothing about the AI field, you can benefit from PaddleHub.\n\n* PaddleHub aims to provide developers with rich, high-quality, and directly usable pre-trained models.\n* **No need for a deep learning background**, you can use AI models quickly and enjoy the dividends of the artificial intelligence era.\n* Covers 4 major categories of Image, Text, Audio, and Video, and supports **one-click prediction**, **easy service deployment** and **transfer learning**.\n* All models are **OPEN SOURCE** and **FREE** to download and use in offline scenarios.\n\n.. PaddleHub documentation master file, created by\n   sphinx-quickstart on Thu Apr 22 17:42:49 2021.\n   You can adapt this file completely to your liking, but it should at least\n   contain the root `toctree` directive.\n\n\n------------\n\n===========================\nOverview\n===========================\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Get started with PaddleHub<get_start_index>\n   Tutorial<tutorial_index>\n   API reference<api_index>\n   FAQ<faq>\n   Community<community_index>\n   Release note<release>\n\n   "
  },
  {
    "path": "docs/docs_en/make.bat",
    "content": "@ECHO OFF\r\n\r\npushd %~dp0\r\n\r\nREM Command file for Sphinx documentation\r\n\r\nif \"%SPHINXBUILD%\" == \"\" (\r\n\tset SPHINXBUILD=sphinx-build\r\n)\r\nset SOURCEDIR=.\r\nset BUILDDIR=_build\r\n\r\nif \"%1\" == \"\" goto help\r\n\r\n%SPHINXBUILD% >NUL 2>NUL\r\nif errorlevel 9009 (\r\n\techo.\r\n\techo.The 'sphinx-build' command was not found. Make sure you have Sphinx\r\n\techo.installed, then set the SPHINXBUILD environment variable to point\r\n\techo.to the full path of the 'sphinx-build' executable. Alternatively you\r\n\techo.may add the Sphinx directory to PATH.\r\n\techo.\r\n\techo.If you don't have Sphinx installed, grab it from\r\n\techo.http://sphinx-doc.org/\r\n\texit /b 1\r\n)\r\n\r\n%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\ngoto end\r\n\r\n:help\r\n%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\n\r\n:end\r\npopd\r\n"
  },
  {
    "path": "docs/docs_en/release.md",
    "content": "# Release Note\n\n## `v2.3.0`\n\n### 1. Support for text-to-image models\n  - Add five text-to-image models based on disco diffusion, three for English and two for Chinese. In particular, the Chinese text-to-image model [disco_diffusion_ernievil_base](https://aistudio.baidu.com/aistudio/projectdetail/4444998) is based on Baidu **ERNIE-ViL**; welcome to try it out.\n\n### 2. Support for Wenxin large model APIs\n  - Add an API call for the [**ERNIE-ViLG**](https://aistudio.baidu.com/aistudio/projectdetail/4445016) model, which supports the text-to-image task.\n  - Add an API call for the [**ERNIE 3.0 Zeus**](https://aistudio.baidu.com/aistudio/projectdetail/4445054) model, which supports applications such as writing essays, summarization, couplets, question answering, writing novels and completing text.\n\n## `v2.1.0`\n\n### 1. Improvements\n\n- Add support for five new models, including two high-precision semantic segmentation models based on the VOC dataset and three voice classification models.\n- Enhance the transfer learning capabilities for image semantic segmentation, text semantic matching and voice classification on related datasets.\n\n### 2. Upgrades of deployment capabilities\n\n- Add export APIs for two kinds of model formats, i.e., ONNX and PaddleInference.\n- **Important Open-Source Ecological Cooperation**: add support for [BentoML](https://github.com/bentoml/BentoML/), which is a cloud-native framework for serving deployment. Users can easily serve pre-trained models from [PaddleHub](https://github.com/PaddlePaddle/PaddleHub) by following the [tutorial notebooks](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.1/demo/serving/BentoML). Also, see the announcement and [release note](https://github.com/bentoml/BentoML/releases/tag/v0.12.1) from BentoML. 
(Many thanks to @[parano](https://github.com/parano) @[cqvu](https://github.com/cqvu) @[deehrlic](https://github.com/deehrlic) for contributing this feature in PaddleHub)\n\n### 3. Bug fixes\n\n - [#7da1230](https://github.com/PaddlePaddle/PaddleHub/commit/7da12302dd77e3d739da72821d41715ad8a7c79c) Fixed the problem that the model cannot resume training if metrics are not recorded.\n - [#b0b3144](https://github.com/PaddlePaddle/PaddleHub/commit/b0b3144eff34e47cac8fc450c8b7cb6c557f9b84) Fixed the problem that the thread did not exit normally when the evaluation process was abnormal.\n - [#30aace4](https://github.com/PaddlePaddle/PaddleHub/commit/30aace46414bbeef02beb75b7128f48fada82150) Improve the model installation process.\n\n## `v2.0.0`\n\n* Release version 2.0, fully migrating to the dynamic graph programming mode; model development and debugging are more convenient, and the finetune interface is more flexible and easier to use.\n* Comprehensively upgrade the transfer learning capabilities for vision tasks, supporting image classification, image colorization, style transfer and many other tasks.\n* Upgrade Transformer models such as BERT, ERNIE and RoBERTa to dynamic graph, supporting fine-tuning for text classification and sequence labeling.\n* Add 61 word embedding models, including 51 Chinese models and 10 English models.\n* Optimize the Serving deployment capability, supporting multi-card prediction and automatic load balancing, with greatly improved performance.\n* Add the automatic data augmentation capability Auto Augment, which efficiently searches for data augmentation policy combinations suitable for a dataset.\n\n## `v2.0.0-beta1`\n\n* Upgrade Transformer models such as BERT, ERNIE and RoBERTa to dynamic graph, adding fine-tuning for [text classification](../../demo/text_classification)\n* Fix some known issues\n\n## `v2.0.0-beta0`\n\n* Fully migrate to the dynamic graph programming mode; model development and debugging are more convenient, and the finetune interface is more flexible and easier to use.\n* Optimize the Serving deployment capability, supporting multi-card prediction and automatic load balancing, with greatly improved performance.\n* Comprehensively upgrade the transfer learning capabilities for vision tasks, supporting [image classification](../../demo/image_classification), [image colorization](../../demo/colorization), [style transfer](../../demo/style_transfer) and many other vision tasks.\n\n## `v1.8.1`\n\n* [Image segmentation](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=ImageSegmentation): add the lightweight portrait segmentation model Humanseg, supporting real-time segmentation on mobile devices\n* Improve text matching performance by using [EMNLP2019-Sentence-BERT](https://arxiv.org/abs/1908.10084) as the text matching network, which greatly improves both accuracy and prediction speed. Accompanying tutorials: [pointwise semantic text matching](https://aistudio.baidu.com/aistudio/projectdetail/705526), [pairwise semantic text matching](https://aistudio.baidu.com/aistudio/projectdetail/709472)\n* Fix a runtime error when F1 is selected as the evaluation metric for text classification\n\n## `v1.8.0`\n\n* Richer pre-trained models, more one-click capabilities\n  * 
[Text generation](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=TextGeneration): add couplet and poetry generation models based on ERNIE-tiny and ERNIE-gen, supporting one-click poem and couplet writing.\n  * [Lexical analysis](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=LexicalAnalysis): add jieba's paddle-mode word segmentation model, supporting one-click Chinese word segmentation, keyword extraction and more.\n  * [Semantic representation](https://www.paddlepaddle.org.cn/hublist?filter=en_category&value=SemanticModel): add LDA topic models based on three kinds of large-scale text data (webpages, novels, news) and their semantic similarity computation interfaces.\n* Upgrade the Fine-tune API for more flexibility and more tasks\n   * Add the Tokenizer API, supporting more flexible word and character tokenization modes and custom tokenizer extensions.\n   * Add the [text generation](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/text_generation) task, supporting fine-tuning of Seq2Seq tasks.\n  * Add the text matching task, supporting both [Pointwise](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/pointwise_text_matching) and [Pairwise](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/pairwise_text_matching) training modes for easier semantic matching.\n\n## `v1.7.0`\n\n* Richer pre-trained models with better usability\n  * Add the VENUS series of vision pre-trained models [yolov3_darknet53_venus](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_venus&en_category=ObjectDetection) and [faster_rcnn_resnet50_fpn_venus](https://www.paddlepaddle.org.cn/hubdetail?name=faster_rcnn_resnet50_fpn_venus&en_category=ObjectDetection), which can greatly improve fine-tuning results for image classification and object detection tasks\n  * Add the industrial-grade short video classification model [videotag_tsn_lstm](https://paddlepaddle.org.cn/hubdetail?name=videotag_tsn_lstm&en_category=VideoClassification), supporting 3000 Chinese label categories\n  * Add the lightweight Chinese OCR models [chinese_ocr_db_rcnn](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_rcnn&en_category=TextRecognition) and [chinese_text_detection_db](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_text_detection_db&en_category=TextRecognition), supporting one-click OCR\n  * Add industrial-grade models for pedestrian detection, vehicle detection, animal recognition, object detection and more\n\n* Fine-tune API upgrades\n  * Add 6 preset networks for text classification, including CNN, BOW, LSTM, BiLSTM, DPCNN, etc.\n  * Visualize training and evaluation metrics with VisualDL\n\n## `v1.6.2`\n\n* Fix a runtime error of image classification on Windows\n\n## `v1.6.1`\n\n* Fix the missing config.json file when installing PaddleHub on Windows\n\n## `v1.6.0`\n\n* Comprehensive upgrade of NLP Modules for better usability and flexibility\n  * 
lac, the senta series (bow, cnn, bilstm, gru, lstm), simnet_bow and the porn_detection series (cnn, gru, lstm) are upgraded to high-performance prediction, with performance gains of up to 50%\n  * Transformer semantic models such as ERNIE, BERT and RoBERTa add the get_embedding interface for obtaining pre-trained embeddings, making it easier to plug into downstream tasks\n  * Add the 3-layer Transformer models [rbt3](https://www.paddlepaddle.org.cn/hubdetail?name=rbt3&en_category=SemanticModel) and [rbtl3](https://www.paddlepaddle.org.cn/hubdetail?name=rbtl3&en_category=SemanticModel), obtained by compressing the RoBERTa model structure\n\n* The Task predict interface adds a high-performance prediction mode, accelerate_mode, with performance gains of up to 90%\n\n* Open up the PaddleHub Module creation workflow, supporting conversion of fine-tuned models and comprehensively improving usability and flexibility\n  * [Tutorial: convert a pre-trained model into a PaddleHub Module](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/contribution/contri_pretrained_model.md)\n  * [Tutorial: convert a fine-tuned model into a PaddleHub Module](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/finetuned_model_to_module.md)\n\n* [PaddleHub Serving](https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/serving.md) optimizes the startup process and supports more flexible parameter configuration\n\n## `v1.5.2`\n\n* Optimize the serving deployment performance of the pyramidbox_lite_server_mask and pyramidbox_lite_mobile_mask models\n\n## `v1.5.1`\n\n* Fix the missing cache directory issue when loading a module\n\n## `v1.5.0`\n\n* Upgrade PaddleHub Serving for better performance and usability\n   * Add the text embedding service [Bert Service](./tutorial/bert_service.md) for easily obtaining text embeddings:\n      * Short code, easy to use: one command on the server/client obtains text embeddings;\n      * Higher performance and efficiency: the computation graph is optimized through the Paddle AnalysisPredictor API, increasing speed and reducing GPU memory usage\n      * Flexible scaling: the number of servers can be adjusted according to machine resources and actual needs, supporting multi-GPU, multi-model computation tasks\n   * Optimize concurrency, using multi-threading in multi-core environments to improve overall QPS\n\n* Optimize the PaddleHub transfer learning Task features for better usability\n   * Add a hook mechanism, supporting [modification of Task built-in methods](./tutorial/hook.md)\n   * Add colorlog for colored log output\n   * Switch to the save_inference_model interface for saving models, making deployment easier\n   * Optimize the predict interface and add the return_result parameter so users can directly obtain prediction results\n\n* Optimize the PaddleHub Dataset base class; loading [custom data](./tutorial/how_to_load_data.md) requires less and simpler code\n\n\n## `v1.4.1`\n\n* Fix a compatibility issue with paddle 1.6 when using Transformer models for sequence labeling tasks\n* Improve Windows compatibility to python >= 3.6\n\n## `v1.4.0`\n\n* Add the pre-trained model ERNIE tiny\n* Add datasets INEWS, BQ, DRCD, CMRC2018 and THUCNEWS, supporting all ChineseGLUE (CLUE) V0 tasks\n* Fix compatibility issues between modules and PaddlePaddle versions\n* Optimize the Hub Serving startup and model loading process to improve service response speed\n\n\n## `v1.3.0`\n\n* Add PaddleHub Serving deployment\n  * Add the [hub 
serving](https://github.com/PaddlePaddle/PaddleHub/wiki/PaddleHub-Serving%E4%B8%80%E9%94%AE%E6%9C%8D%E5%8A%A1%E9%83%A8%E7%BD%B2) command, supporting one-click deployment of Module prediction services\n* Add pre-trained models:\n  * roberta_wwm_ext_chinese_L-24_H-1024_A-16\n  * roberta_wwm_ext_chinese_L-12_H-768_A-12\n  * bert_wwm_ext_chinese_L-12_H-768_A-12\n  * bert_wwm_chinese_L-12_H-768_A-12\n* AutoDL Finetuner usability improvements\n  * Support reporting model performance through an interface\n  * Improve visualization, supporting display of multiple trials\n\n## `v1.2.1`\n\n* Add **hyperparameter optimization Auto Fine-tune**: given a hyperparameter search space, PaddleHub automatically finds a good hyperparameter combination\n  * Support two hyperparameter optimization algorithms: HAZero and PSHE2\n  * Support two evaluation modes: FullTrail and PopulationBased\n* Add the fine-tuning **optimization strategy ULMFiT**, including the following three settings\n  * Slanted triangular learning rates: the learning rate first increases linearly and then decays slowly\n  * Discriminative fine-tuning: the computation graph is divided into n segments, each with its own learning rate\n  * Gradual unfreezing: layers are unfrozen one by one according to the topology of the computation graph\n* Add support for user-defined PaddleHub configuration, including\n  * the pre-trained model management server address\n  * the logging level\n* Fine-tune API upgrades for more flexibility and usability\n  * Add the **reading comprehension fine-tuning task** and the **regression fine-tuning task**\n  * Add multi-metric evaluation\n  * Optimize the predict interface\n  * Visualization tools support tensorboard\n\n\n## `v1.1.2`\n\n* PaddleHub supports customizing the pre-trained model storage path ${HUB_HOME}\n\n\n## `v1.1.1`\n\n* PaddleHub supports offline execution\n* Fix PaddleHub installation failure on python2\n\n\n## `v1.1.0`\n\n* PaddleHub **adds the pre-trained model ERNIE 2.0**\n  * Upgrade the Reader to automatically feed data to Ernie 1.0/2.0\n  * Add GLUE datasets (MRPC, QQP, SST-2, CoLA, QNLI, RTE, MNLI)\n\n\n## `v1.0.1`\n\n* Automatically select a model compatible with the installed paddlepaddle version when installing a model\n\n\n## `v1.0.0`\n\n* Launch the new PaddleHub official website with comprehensive usability improvements\n  * Add the website https://www.paddlepaddle.org.cn/hub, which introduces the use of pre-trained models in the PaddlePaddle ecosystem\n  * Transfer learning demos are available on AI Studio and AI Book, so they can be tried quickly without installation\n\n* Add 29 pre-trained models covering text, image and video; 40 official pre-trained models are now available\n  * CV pre-trained models:\n    * Add 11 image classification models: SE_ResNeXt, GoogleNet, ShuffleNet, etc.\n    * Add the object detection models Faster-RCNN and YOLOv3\n    * Add the image generation model CycleGAN\n    * Add the face detection model Pyramidbox\n    * Add 4 video classification models: TSN, TSM, StNet, Non-Local\n  * NLP pre-trained models\n    * Add the semantic model ELMo\n    * Add 5 sentiment analysis models: Senta-BOW, Senta-CNN, Senta-GRNN, Senta-LSTM, EmoTect\n    * Add the Chinese semantic similarity model SimNet\n    * Upgrade the LAC lexical analysis model with a dictionary intervention feature, supporting user-defined word segmentation\n* Fine-tune API upgrades with comprehensive improvements in flexibility and performance\n  * Support multi-card parallelism and PyReader multi-threaded IO, speeding up fine-tuning by 60%\n  * Simplify the usage of finetune, evaluate, predict, etc., for better usability\n  * Add event callbacks so users can quickly implement custom transfer learning tasks\n  * 
新增多标签分类Fine-tune任务\n\n\n## `v0.5.0`\n\n正式发布PaddleHub预训练模型管理工具，旨在帮助用户更高效的管理模型并开展迁移学习的工作。\n\n**预训练模型管理**: 通过hub命令行可完成PaddlePaddle生态的预训练模型下载、搜索、版本管理等功能。\n\n**命令行一键使用**: 无需代码，通过命令行即可直接使用预训练模型进行预测，快速调研训练模型效果。目前版本支持以下模型：词法分析LAC；情感分析Senta；目标检测SSD；图像分类ResNet, MobileNet, NASNet等。\n\n**迁移学习**: 提供了基于预训练模型的Fine-tune API，用户通过少量代码即可完成迁移学习，包括BERT/ERNIE文本分类、序列标注、图像分类迁移等。\n"
  },
  {
    "path": "docs/docs_en/transfer_learning_index.rst",
"content": "==================\nTransfer Learning\n==================\n\nTransfer Learning is a subfield of deep learning that aims to use similarities in data, tasks, or models to transfer knowledge learned in old domains to new ones. In other words, transfer learning means using existing knowledge to learn new knowledge. For example, people who have learned to ride a bicycle can learn to ride an electric bike faster. A common transfer learning method is to fine-tune a pre-trained model: based on the current task scenario, the user selects from PaddleHub a successfully trained model whose training dataset is similar to the dataset of the new scenario, and then only fine-tunes the parameters of that model (**Fine-tune**) while training on the data of the new scenario. Transfer learning has attracted many researchers because it is a good solution to the following problems in deep learning:\n\n* In some research areas there is only a small amount of annotated data, and the cost of data annotation is high, which is not enough to train a sufficiently robust neural network.\n* The training of large-scale neural networks relies on large computational resources, which is difficult for an ordinary user to obtain.\n* Models that address generalized needs do not perform as well as expected in specific applications.\n\nIn order to make it easier for developers to apply transfer learning, Paddle has open-sourced PaddleHub, a pre-trained model management tool. With just ten lines of code, developers can complete the transfer learning process. This section gives a comprehensive description of transfer learning using PaddleHub.\n\n.. image:: ../imgs/paddlehub_finetune.gif \n   :width: 900px\n   :align: center\n\n.. 
toctree::\n   :maxdepth: 2\n\n   finetune/sequence_labeling.md\n   finetune/text_matching.md\n   finetune/image_classification.md\n   finetune/image_colorization.md\n   finetune/style_transfer.md\n   finetune/semantic_segmentation.md\n   finetune/audio_classification.md\n   finetune/customized_dataset.md"
  },
  {
    "path": "docs/docs_en/tutorial/cmd_usage.rst",
"content": "===========================\nPaddleHub Command Line Tool\n===========================\n\nPaddleHub provides a command line tool for managing and using pre-trained models. \n\nThere are 11 commands in total, covering model installation, uninstallation, prediction, and other aspects.\n\n\nhub install\n==================\n\nInstalls a Module locally. By default, it is installed in the `${HUB_HOME}/.paddlehub/modules` directory. Once a Module is installed locally, users can operate on it through other commands (e.g., use the Module for prediction), or use the Python API provided by PaddleHub to apply the Module to their own tasks for transfer learning.\n\n.. tip::\n\n    If the environment variable *${HUB_HOME}* is set, the pre-trained models and configuration files are stored in the path specified by *${HUB_HOME}*.\n    If the environment variable *${HUB_HOME}* is not set, these files are stored in the path specified by *$HOME*.\n\nhub uninstall\n==================\n\nUninstalls local Modules.\n\nhub show\n==================\n\nViews the properties of locally installed Modules or of Modules in a specified directory, including name, version, description, author, and other information.\n\nhub download\n==================\n\nDownloads a Module provided by PaddleHub.\n\nhub search\n==================\n\nSearches for matching Modules on the server by keyword. When you want to find a Module for a specific model, you can run the search command to get the result quickly. For example, `hub search ssd` searches for all Modules containing the word ssd. The command supports regular expressions; for example, `hub search ^s.*` searches for all resources beginning with s.\n\n.. tip::\n    \n    If you want to list all Modules, `hub search *` does not work, because the shell expands the wildcard itself, replacing \* with the filenames in the current directory. 
For a global search, type `hub search` directly.\n\nhub list\n==================\n\nLists the locally installed Modules.\n\nhub run\n==================\n\nExecutes the prediction of a Module. Note that not all models support prediction (and likewise, not all models support transfer learning). For more details, please refer to the `Quick experience of hub run <../quick_experience/cmd_quick_run_en.md>`_.\n\nPaddleHub tries to minimize the cost of understanding command-line prediction. Generally, predictions are classified into two categories: NLP and CV.\n\nNLP Class Tasks\n---------------\n\nInput data is specified by *--input_text*. Take the Baidu LAC model (Chinese lexical analysis) as an example. You can use the following command to analyze texts:\n\n\n.. code-block:: console\n\n    $ hub run lac --input_text \"今天是个好日子\"\n\nCV Class Tasks\n---------------\n\nInput data is specified by *--input\_path*. Take the SSD model (single-stage object detection) as an example. Predictions can be performed by running the following command:\n\n.. code-block:: console\n\n    $ hub run resnet_v2_50_imagenet --input_path test.jpg\n\n.. note::\n\n    In PaddleHub, a Module represents an executable pre-trained model. A Module can support direct command-line prediction, or it can implement transfer learning with a small amount of code through the PaddleHub Finetune API. Not all Modules support command-line prediction (for example, BERT/ERNIE Transformer models are generally fine-tuned together with a task). 
Not all Modules can be used for fine-tuning (for example, for the LAC lexical analysis model, we do not recommend fine-tuning).\n\nhub help\n==================\n\nDisplays help information.\n\nhub version\n==================\n\nDisplays the PaddleHub version information.\n\nhub clear\n==================\n\nPaddleHub generates some cached data during operation, which is stored in the ${HUB\_HOME}/.paddlehub/cache directory by default. Users can clear the cache by running the clear command.\n\nhub config\n==================\n\nViews and configures paddlehub-related information, including the server address and log level.\n\n.. code-block:: console\n\n    $ # Displays the current paddlehub settings.\n    $ hub config \n    \n    $ # Restores current paddlehub settings to the default settings.\n    $ hub config reset \n    \n    $ # Sets the current paddlehub-server address to ${HOST}; the paddlehub client gets model information from this address.\n    $ hub config server==${HOST} \n    \n    $ # Sets the current log level to ${LEVEL}. Options are CRITICAL, ERROR, WARNING, EVAL, TRAIN, INFO, DEBUG, from high priority to low.\n    $ hub config log.level==${LEVEL} \n    \n    $ # Sets whether logging is enabled.\n    $ hub config log.enable==True|False \n\nhub serving\n==================\n\nDeploys a Module prediction service with one command. For details, see `PaddleHub Serving Deployment <serving>`_."
  },
  {
    "path": "docs/docs_en/tutorial/custom_module.rst",
"content": "=============================\nHow to Create a Custom Module\n=============================\n\n\nI. Preparation\n=======================\n\nBasic Model Information\n------------------------\n\nWe are going to write a PaddleHub Module with the following basic information:\n\n.. code-block:: yaml\n\n    name: senta_test\n    version: 1.0.0\n    summary: This is a PaddleHub Module. Just for test.\n    author: anonymous\n    author_email:\n    type: nlp/sentiment_analysis\n\nThe Module has an interface sentiment_classify, which receives incoming text and assigns it a sentiment polarity (positive/negative). It supports both Python API calls and command-line calls.\n\n.. code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(name=\"senta_test\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\n.. code-block:: shell\n\n    hub run senta_test --input_text 这部电影太差劲了\n\nStrategy\n------------------------\n\nFor the sake of simplicity of the sample code, we use a very simple sentiment strategy: if the input text contains any word in the vocabulary list, the text is judged to be negative; otherwise it is positive.\n\nII. Create Module\n=======================\n\nStep 1: Create the necessary directories and files.\n----------------------------------------------------\n\nCreate a senta_test directory. Then, create module.py, processor.py, and vocab.list in the senta_test directory, respectively.\n\n.. 
code-block:: shell\n\n    $ tree senta_test\n    senta_test/\n    ├── vocab.list \n    ├── module.py \n    └── processor.py \n\n============    =========================================================================\nFile Name       Purpose                                                      \n============    =========================================================================\nmodule.py       The main module, which provides the implementation code of the Module. \nprocessor.py    A helper module, which provides a way to load the vocabulary list.\nvocab.list      Stores the vocabulary. \n============    =========================================================================\n\n\nStep 2: Implement the helper module processor.\n------------------------------------------------\n\nImplement a load_vocab interface in processor.py to read the vocabulary list.\n\n.. code-block:: python\n\n    def load_vocab(vocab_path):\n        with open(vocab_path) as file:\n            return file.read().split()\n\nStep 3: Write Module processing codes.\n------------------------------------------------\n\nThe module.py file is where the Module entry code is located. We need to implement the prediction logic in it.\n\nStep 3_1. Import the necessary modules.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    import argparse\n    import os\n\n    import paddlehub as hub\n    from paddlehub.module.module import runnable, serving, moduleinfo\n\n    from senta_test.processor import load_vocab\n\n.. note::\n\n    When referencing a module inside a Module, you need to use the full path, for example, senta_test.processor.\n\nStep 3_2. Define the SentaTest class.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\nmodule.py needs to have a class that inherits hub.Module, and this class is responsible for implementing the prediction logic and filling in the basic information using moduleinfo. When hub.Module(name=\"senta\_test\") is used to load the Module, PaddleHub automatically creates an object of SentaTest and returns it.\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\nStep 3_3. Perform necessary initialization.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n\n        def __init__(self):\n            # add arg parser\n            self.parser = argparse.ArgumentParser(\n                description=\"Run the senta_test module.\",\n                prog='hub run senta_test',\n                usage='%(prog)s',\n                add_help=True)\n            self.parser.add_argument(\n                '--input_text', type=str, default=None, help=\"text to predict\")\n\n            # load word dict\n            vocab_path = os.path.join(self.directory, \"vocab.list\")\n            self.vocab = load_vocab(vocab_path)\n\n        ...\n\n.. note::\n\n    The execution class object has a built-in directory attribute by default, so you can directly get the path of the Module.\n\nStep 3_4: Refine the prediction logic.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. 
Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        def sentiment_classify(self, texts):\n            results = []\n            for text in texts:\n                sentiment = \"positive\"\n                for word in self.vocab:\n                    if word in text:\n                        sentiment = \"negative\"\n                        break\n                results.append({\"text\":text, \"sentiment\":sentiment})\n\n            return results\n        \n        ...\n\nStep 3_5. Support command-line invocation.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\nIf you want the module to support command-line invocation, you need to provide an interface decorated with runnable that parses the incoming data, makes predictions, and returns the results.\n\nIf you don't want to provide command-line prediction, you can leave the interface out; PaddleHub automatically detects that the module does not support command-line invocation and gives a hint when it is executed from the command line.\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        @runnable\n        def run_cmd(self, argvs):\n            args = self.parser.parse_args(argvs)\n            texts = [args.input_text]\n            return self.sentiment_classify(texts)\n\n        ...\n\nStep 3_6. 
Support the serving invocation.\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\nIf you want the module to support prediction services deployed with PaddleHub Serving, you need to provide an interface decorated with serving that parses the incoming data, makes predictions, and returns the results.\n\nIf you do not want to provide a PaddleHub Serving prediction service, you do not need to add the serving decorator.\n\n.. code-block:: python\n\n    @moduleinfo(\n        name=\"senta_test\",\n        version=\"1.0.0\",\n        summary=\"This is a PaddleHub Module. Just for test.\",\n        author=\"anonymous\",\n        author_email=\"\",\n        type=\"nlp/sentiment_analysis\",\n    )\n    class SentaTest:\n        ...\n\n        @serving\n        def sentiment_classify(self, texts):\n            results = []\n            for text in texts:\n                sentiment = \"positive\"\n                for word in self.vocab:\n                    if word in text:\n                        sentiment = \"negative\"\n                        break\n                results.append({\"text\":text, \"sentiment\":sentiment})\n\n            return results\n\nComplete Code\n------------------------------------------------\n\n* `module.py <https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/modules/demo/senta_test/module.py>`_\n\n* `processor.py <https://github.com/PaddlePaddle/PaddleHub/blob/release/v2.1/modules/demo/senta_test/processor.py>`_\n\nIII. Install and test the Module.\n===================================\n\nAfter writing a module, we can test it in the following ways:\n\nCall Method 1\n------------------------------------------------\n\nInstall the Module on the local machine, and then load it through hub.Module(name=...)\n\n.. code-block:: console\n\n    $ hub install senta_test\n\n\n.. 
code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(name=\"senta_test\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\nCall Method 2\n------------------------------------------------\n\nLoad the Module directly through hub.Module(directory=...)\n\n.. code-block:: python\n\n    import paddlehub as hub\n\n    senta_test = hub.Module(directory=\"senta_test/\")\n    senta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n\nCall Method 3\n------------------------------------------------\n\nInstall the Module on the local machine and run it through hub run.\n\n.. code-block:: console\n\n    $ hub install senta_test\n    $ hub run senta_test --input_text \"这部电影太差劲了\""
  },
  {
    "path": "docs/docs_en/tutorial/serving.md",
"content": "# PaddleHub Serving: One-Command Deployment of Models as Services\n\n## Introduction\n\n### Background\n\nPaddleHub enables rapid model prediction. Developers often need to migrate their local prediction workflow online, whether to open public service ports or to build prediction services on a LAN, and this requires the ability to deploy model prediction services rapidly. Against this background, PaddleHub Serving, a one-command model service deployment tool, was created. With a single command, developers can quickly launch an online model prediction service, without having to worry about selecting or implementing a network framework.\n\n### One-Command Service Deployment\n\nPaddleHub Serving is a one-command model service deployment tool based on PaddleHub. It can easily start an online model prediction service through the simple hub command line tool. The front end handles network requests through Flask and Gunicorn; the back end directly invokes the PaddleHub prediction interface and supports multi-process mode to improve concurrency on multi-core machines, ensuring the performance of the prediction service.\n\n### Supported Models\n\nCurrently, PaddleHub Serving supports service deployment for all PaddleHub models that support direct prediction, including NLP models such as `lac` and `senta_bilstm`, as well as CV models such as `yolov3_darknet53_coco2017` and `vgg16_imagenet`. For more models, refer to the [PaddleHub Supported Models List](https://paddlepaddle.org.cn/hublist). In the future, developers will also be able to rapidly deploy models produced by the PaddleHub Fine-tune API.\n\n## Usage\n\n### Step 1: Start the server deployment.\n\nPaddleHub Serving starts in two ways: from the command line, or by using a configuration file.\n\n#### Start from the command line\n\nCommand\n\n```shell\n$ hub serving start --modules [Module1==Version1, Module2==Version2, ...] 
\\\n                    --port XXXX \\\n                    --use_gpu \\\n                    --use_multiprocess \\\n                    --workers \\\n                    --gpu \\\n```\n\nParameters:\n\n| Parameter          | Purpose                                                      |\n| ------------------ | ------------------------------------------------------------ |\n| --modules/-m       | Models pre-installed for PaddleHub Serving, listed as multiple Module==Version key-value pairs <br/> |\n| --port/-p          | Service port. By default, it is 8866.                        |\n| --use_gpu          | Use the GPU for prediction; paddlepaddle-gpu must be installed. |\n| --use_multiprocess | Whether to enable concurrent mode. By default, a single process is used. Concurrent mode is recommended for multi-core CPUs <br/>*`Only single process is supported on Windows`* |\n| --workers          | The number of concurrent workers in concurrent mode. By default, it is `2*cpu_count-1`, where `cpu_count` is the number of CPU cores. |\n| --gpu              | Specify the GPU card numbers to be used. For example, `1,2` means using GPU card 1 and GPU card 2. By default, only GPU card 0 is used. 
|\n\n**NOTE:** --use\_gpu should not be used together with --use\_multiprocess.\n\n#### Start by using a configuration file\n\nCommand\n\n```shell\n$ hub serving start --config config.json\n```\n\nThe format of `config.json` is as follows:\n\n```json\n{\n  \"modules_info\": {\n    \"yolov3_darknet53_coco2017\": {\n      \"init_args\": {\n        \"version\": \"1.0.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    },\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"1.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    }\n  },\n  \"port\": 8866,\n  \"use_multiprocess\": false,\n  \"workers\": 2,\n  \"gpu\": \"0,1,2\"\n}\n\n```\n\nParameters:\n\n| Parameter        | Purpose                                                      |\n| ---------------- | ------------------------------------------------------------ |\n| modules_info     | The models to pre-install, given as key-value pairs of model name to `init_args` (arguments used when loading the model) and `predict_args` (arguments used at prediction time). |\n| port             | Service port. By default, it is 8866.                        |\n| use_multiprocess | Whether to enable concurrent mode. By default, a single process is used. |\n| workers          | The number of concurrent workers in concurrent mode.         |\n| gpu              | The GPU card numbers to be used.                             |\n\n### Step 2: Access the server\n\nAfter deploying the server-side model prediction service with PaddleHub Serving, you can access the prediction interface from the client to get the results. The interface URL format is:\n\n`http://127.0.0.1:8866/predict/<MODULE>`\n\nwhere `<MODULE>` is the model name.\n\nThe prediction results can be obtained by sending a POST request. Below, a concrete demo illustrates the deployment and usage process of PaddleHub Serving.\n\n### Step 3: Use PaddleHub Serving for personalized development.\n\nAfter deploying the model service with PaddleHub Serving, you can use the obtained interface for development, such as providing external web services, or plugging it into applications to reduce the prediction load on clients and improve performance. 
A web page demo is displayed as follows:\n\n![](../../imgs/web_demo.png)\n\n### Step 4: Shut down the service.\n\nRun the shutdown command to shut down the service.\n\n```shell\n$ hub serving stop --port XXXX\n```\n\nParameter:\n\n| Parameter | Purpose                                                      |\n| --------- | ------------------------------------------------------------ |\n| --port/-p | Specify the service port to be closed. By default, it is 8866. |\n\n## Demo – Deploy an Online LAC Word Segmentation Service\n\n### Step 1: Deploy the lac online service.\n\nNow, we want to deploy a lac online service to get the word segmentation results of texts through an interface.\n\nFirst, choose either of the two startup methods:\n\n```shell\n$ hub serving start -m lac\n```\n\nOR\n\n```shell\n$ hub serving start -c serving_config.json\n```\n\nwhere `serving_config.json` is as follows.\n\n```json\n{\n  \"modules_info\": {\n    \"lac\": {\n      \"init_args\": {\n        \"version\": \"1.1.0\"\n      },\n      \"predict_args\": {\n        \"batch_size\": 1,\n        \"use_gpu\": false\n      }\n    }\n  },\n  \"port\": 8866,\n  \"use_multiprocess\": false,\n  \"workers\": 2\n}\n```\n\nA successful startup looks like the following figure:\n\n![](../../imgs/start_serving_lac.png)\n\nWe have successfully deployed lac's online word segmentation service on port 8866. 
*The warning here is a Flask prompt and does not affect usage.*\n\n### Step 2: Access the lac prediction interface.\n\nAfter the service is deployed, we can test it with the texts `今天是个好日子` and `天气预报说今天要下雨`.\n\nThe client code is as follows:\n\n```python\n# coding: utf8\nimport requests\nimport json\n\nif __name__ == \"__main__\":\n\n    text = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n    data = {\"texts\": text, \"batch_size\": 1}\n    url = \"http://127.0.0.1:8866/predict/lac\"\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print results\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n```\n\nRun it to get the results:\n\n```python\n{\n    \"msg\": \"\",\n    \"results\": [\n        {\n            \"tag\": [\n                \"TIME\", \"v\", \"q\", \"n\"\n            ],\n            \"word\": [\n                \"今天\", \"是\", \"个\", \"好日子\"\n            ]\n        },\n        {\n            \"tag\": [\n                \"n\", \"v\", \"TIME\", \"v\", \"v\"\n            ],\n            \"word\": [\n                \"天气预报\", \"说\", \"今天\", \"要\", \"下雨\"\n            ]\n        }\n    ],\n    \"status\": \"0\"\n}\n\n```\n\n### Step 3: Stop the service\n\nSince we used the default service port 8866 at startup, the corresponding shutdown command is:\n\n```shell\n$ hub serving stop --port 8866\n```\n\nOr, if you do not specify a port, the default port 8866 is used:\n\n```shell\n$ hub serving stop\n```\n\nWait for serving to clear the service. The system prompts:\n\n```shell\n$ PaddleHub Serving will stop.\n```\n\nThe serving service has now been stopped.\n\nFor the specific information and code of this demo, see [LAC Serving](../../demo/serving/module_serving/lexical_analysis_lac). 
In addition, some other one-command service deployment demos are shown below.\n\n## More Demos\n\n* [Chinese lexical analysis - based on lac](../../demo/serving/module_serving/lexical_analysis_lac)\n\n  This example demonstrates the deployment and online prediction of a Chinese text segmentation service using lac, to get the word segmentation results of texts and intervene in them with a user-defined dictionary.\n\n* [Face detection - based on pyramidbox\_lite\_server\_mask](../../demo/serving/module_serving/object_detection_pyramidbox_lite_server_mask)\n\n  This example shows face mask detection using pyramidbox\_lite\_server\_mask, detecting the location of faces and the confidence of mask wearing.\n"
  },
  {
    "path": "docs/docs_en/tutorial_index.rst",
"content": "========\nTutorial\n========\n\n\n\n.. toctree::\n   :maxdepth: 2\n   :titlesonly:\n\n   Command Usage<tutorial/cmd_usage>\n   Hub Serving<tutorial/serving>\n   How to Create a Custom Module<tutorial/custom_module>\n   Transfer learning<transfer_learning_index>"
  },
  {
    "path": "docs/docs_en/visualization.md",
"content": "### Text Recognition\n- Contains ultra-lightweight Chinese and English OCR models, supporting high-precision Chinese and English recognition as well as multilingual recognition for German, French, Japanese, and Korean.\n- Many thanks to CopyRight@[PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR) for the pre-trained models; you can try to train your own models with PaddleOCR.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_Ocr.gif\"  width = \"800\" height = \"400\" />\n</div>\n\n### Face Detection\n- Includes face detection and masked-face detection, with multiple algorithms to choose from.\n- Many thanks to CopyRight@[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection) for the pre-trained models; you can try to train your own models with PaddleDetection.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_ObjectDetection_Face_Mask.gif\"  width = \"588\" height = \"400\" />\n</div>\n\n### Image Editing\n- 4x super-resolution effects, with multiple super-resolution models to choose from.\n- Colorization models can be used to restore old grayscale photos.\n- Many thanks to CopyRight@[PaddleGAN](https://github.com/PaddlePaddle/PaddleGAN) for the pre-trained models; you can try to train your own models with PaddleGAN.\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>SuperResolution </th>\n            <th>Restoration </th>\n        </tr>\n        <tr>\n            <th>\n            <a>\n            <img src=\"../imgs/Readme_Related/ImageEdit_SuperResolution.gif\"  width = \"266\" height = \"400\" /></a><br>\n            </th>\n            <th>\n            <a>\n            <img src=\"../imgs/Readme_Related/ImageEdit_Restoration.gif\"  width = \"300\" height = \"400\" /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n**Deoldify Huggingface Web Demo**: Integrated into [Huggingface Spaces](https://huggingface.co/spaces) with [Gradio](https://github.com/gradio-app/gradio). 
See demo: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/akhaliq/deoldify)\n\n### Image Generation\n- Includes portrait cartoonization, street scene cartoonization, and style transfer.\n- Many thanks to CopyRight@[PaddleGAN](https://github.com/PaddlePaddle/PaddleGAN) and CopyRight@[AnimeGAN](https://github.com/TachibanaYoshino/AnimeGANv2) for the pre-trained models.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageGAN.gif\"  width = \"640\" height = \"600\" />\n</div>\n\n**UGATIT Selfie2anime Huggingface Web Demo**: Integrated into [Huggingface Spaces](https://huggingface.co/spaces) with [Gradio](https://github.com/gradio-app/gradio). See demo: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/akhaliq/U-GAT-IT-selfie2anime)\n\n**Photo2Cartoon Huggingface Web Demo**: Integrated into [Huggingface Spaces](https://huggingface.co/spaces) with [Gradio](https://github.com/gradio-app/gradio). 
See demo: [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/akhaliq/photo2cartoon)\n\n\n### Object Detection\n- Pedestrian detection, vehicle detection, and more industrial-grade, ultra-large-scale pre-trained models are provided.\n- Many thanks to CopyRight@[PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection) for the pre-trained models; you can try to train your own models with PaddleDetection.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_ObjectDetection_Pedestrian_Vehicle.gif\"  width = \"642\" height = \"400\" />\n</div>\n\n### Key Point Detection\n- Supports body, face, and hand key point detection for single or multiple persons.\n- Many thanks to CopyRight@[openpose](https://github.com/CMU-Perceptual-Computing-Lab/openpose) for the pre-trained models.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Image_keypoint.gif\"  width = \"642\" height = \"550\" />\n</div>\n\n### Image Segmentation\n- Provides a high-quality pixel-level portrait cutout model, the ACE2P human body parsing world-champion model, and dynamic sky replacement and harmonization.\n- Many thanks to CopyRight@[PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg) and CopyRight@[Zhengxia Zou](https://github.com/jiupinjia/SkyAR) for the pre-trained models; you can try to retrain your own models with PaddleSeg or the sky matting model.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageSeg_Human.gif\"  width = \"642\" height = \"400\" />\n</div>\n\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/9dis.gif\"  width = \"642\" height = \"200\" />\n</div>\n\n<div align=\"center\">\n\n(The second gif comes from CopyRight@[jiupinjia/SkyAR](https://github.com/jiupinjia/SkyAR#district-9-ship-video-source))\n</div>\n\n\n### Image Classification\n- Various models are available, such as animal classification, dish classification, and wild animal product classification.\n- Many thanks to 
CopyRight@[PaddleClas](https://github.com/PaddlePaddle/PaddleClas) for the pre-trained models; you can try to train your own models with PaddleClas.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/ImageClas_animal_dish_wild.gif\"  width = \"530\" height = \"400\" />\n</div>\n\n### Text Generation\n- AI poem writing, AI couplets, and AI love-words generation models are available.\n- Many thanks to CopyRight@[ERNIE](https://github.com/PaddlePaddle/ERNIE) for the pre-trained models; you can try to train your own models with ERNIE.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Textgen_poetry.gif\"  width = \"850\" height = \"400\" />\n</div>\n\n### Lexical Analysis\n- Excellent Chinese text segmentation, part-of-speech tagging, and named entity recognition models are provided by [LAC](https://github.com/baidu/LAC)@Baidu NLP.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Lexical Analysis.png\"  width = \"640\" height = \"233\" />\n</div>\n\n### Syntactic Analysis\n- A leading Chinese syntactic analysis model is provided by [DDParser](https://github.com/baidu/DDParser)@Baidu NLP.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_SyntacticAnalysis.png\"  width = \"640\" height = \"301\" />\n</div>\n\n### Sentiment Analysis\n- The SOTA Chinese sentiment analysis models released by Baidu NLP can all be used with just one line of code.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_SentimentAnalysis.png\"  width = \"640\" height = \"228\" />\n</div>\n\n### Text Review\n- Text review models for Chinese pornographic text are available.\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Textreview.png\"  width = \"640\" height = \"140\" />\n</div>\n\n### Speech Synthesis\n- TTS speech synthesis, with multiple algorithms to choose from.\n- Many thanks to CopyRight@[Parakeet](https://github.com/PaddlePaddle/Parakeet) for the pre-trained models; you can try to train your own models with Parakeet.\n- Input: `Life was like a box 
of chocolates, you never know what you're gonna get.`\n- The synthesis effect is as follows:\n<div align=\"center\">\n<table>\n    <thead>\n    </thead>\n    <tbody>\n        <tr>\n            <th>deepvoice3 </th>\n            <th>fastspeech </th>\n            <th>transformer</th>\n        </tr>\n        <tr>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/deepvoice3_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/fastspeech_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n            <th>\n            <a href=\"https://paddlehub.bj.bcebos.com/resources/transformer_tts_ljspeech-0.wav\">\n            <img src=\"../imgs/Readme_Related/audio_icon.png\" width=250 /></a><br>\n            </th>\n        </tr>\n    </tbody>\n</table>\n</div>\n\n### Video Classification\n- Short video classification models trained on large-scale video datasets support prediction of 3000+ tag types for short-form videos.\n- Many thanks to CopyRight@[PaddleVideo](https://github.com/PaddlePaddle/PaddleVideo) for the pre-trained model; you can try to train your models with PaddleVideo.\n- `Example: Input a short video of swimming, and the algorithm outputs \"swimming\"`\n<div align=\"center\">\n<img src=\"../imgs/Readme_Related/Text_Video.gif\"  width = \"400\" height = \"400\" />\n</div>\n"
  },
  {
    "path": "docs/make.bat",
    "content": "@ECHO OFF\r\n\r\npushd %~dp0\r\n\r\nREM Command file for Sphinx documentation\r\n\r\nif \"%SPHINXBUILD%\" == \"\" (\r\n\tset SPHINXBUILD=sphinx-build\r\n)\r\nset SOURCEDIR=.\r\nset BUILDDIR=_build\r\n\r\nif \"%1\" == \"\" goto help\r\n\r\n%SPHINXBUILD% >NUL 2>NUL\r\nif errorlevel 9009 (\r\n\techo.\r\n\techo.The 'sphinx-build' command was not found. Make sure you have Sphinx\r\n\techo.installed, then set the SPHINXBUILD environment variable to point\r\n\techo.to the full path of the 'sphinx-build' executable. Alternatively you\r\n\techo.may add the Sphinx directory to PATH.\r\n\techo.\r\n\techo.If you don't have Sphinx installed, grab it from\r\n\techo.http://sphinx-doc.org/\r\n\texit /b 1\r\n)\r\n\r\n%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\ngoto end\r\n\r\n:help\r\n%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%\r\n\r\n:end\r\npopd\r\n"
  },
  {
    "path": "docs/requirements.txt",
    "content": "sphinx==3.1.2\nsphinx-markdown-tables==0.0.15\nsphinx_materialdesign_theme==0.1.11\nrecommonmark==0.6.0\nsphinx-serve==1.0.1\nsphinxcontrib-applehelp==1.0.2\nsphinxcontrib-devhelp==1.0.2\nsphinxcontrib-htmlhelp==1.0.3\nsphinxcontrib-jsmath==1.0.1\nsphinxcontrib-qthelp==1.0.3\nsphinxcontrib-serializinghtml==1.1.4\n"
  },
  {
    "path": "modules/README.md",
    "content": "English | [简体中文](README_ch.md)\n\n# CONTENTS\n|[Image](#Image) (212)|[Text](#Text) (130)|[Audio](#Audio) (15)|[Video](#Video) (8)|[Industrial Application](#Industrial-Application) (1)|\n|--|--|--|--|--|\n|[Image Classification](#Image-Classification) (108)|[Text Generation](#Text-Generation) (17)| [Voice Cloning](#Voice-Cloning) (2)|[Video Classification](#Video-Classification) (5)| [Meter Detection](#Meter-Detection) (1)|\n|[Image Generation](#Image-Generation) (26)|[Word Embedding](#Word-Embedding) (62)|[Text to Speech](#Text-to-Speech) (5)|[Video Editing](#Video-Editing) (1)|-|\n|[Keypoint Detection](#Keypoint-Detection) (5)|[Machine Translation](#Machine-Translation) (2)|[Automatic Speech Recognition](#Automatic-Speech-Recognition) (5)|[Multiple Object tracking](#Multiple-Object-tracking) (2)|-|\n|[Semantic Segmentation](#Semantic-Segmentation) (25)|[Language Model](#Language-Model) (30)|[Audio Classification](#Audio-Classification) (3)| -|-|\n|[Face Detection](#Face-Detection) (7)|[Sentiment Analysis](#Sentiment-Analysis) (7)|-|-|-|\n|[Text Recognition](#Text-Recognition) (17)|[Syntactic Analysis](#Syntactic-Analysis) (1)|-|-|-|\n|[Image Editing](#Image-Editing) (8)|[Simultaneous Translation](#Simultaneous-Translation) (5)|-|-|-|\n|[Instance Segmentation](#Instance-Segmentation) (1)|[Lexical Analysis](#Lexical-Analysis) (2)|-|-|-|\n|[Object Detection](#Object-Detection) (13)|[Punctuation Restoration](#Punctuation-Restoration) (1)|-|-|-|\n|[Depth Estimation](#Depth-Estimation) (2)|[Text Review](#Text-Review) (3)|-|-|-|\n\n## Image\n  - ### Image 
Classification\n\n<details><summary>expand</summary><div>\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[DriverStatusRecognition](image/classification/DriverStatusRecognition)|MobileNetV3_small_ssld|Drivers||\n|[mobilenet_v2_animals](image/classification/mobilenet_v2_animals)|MobileNet_v2|Animals||\n|[repvgg_a1_imagenet](image/classification/repvgg_a1_imagenet)|RepVGG|ImageNet-2012||\n|[repvgg_a0_imagenet](image/classification/repvgg_a0_imagenet)|RepVGG|ImageNet-2012||\n|[resnext152_32x4d_imagenet](image/classification/resnext152_32x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[resnet_v2_152_imagenet](image/classification/resnet_v2_152_imagenet)|ResNet V2|ImageNet-2012||\n|[resnet50_vd_animals](image/classification/resnet50_vd_animals)|ResNet50_vd|Animals||\n|[food_classification](image/classification/food_classification)|ResNet50_vd_ssld|dishes||\n|[mobilenet_v3_large_imagenet_ssld](image/classification/mobilenet_v3_large_imagenet_ssld)|Mobilenet_v3_large|ImageNet-2012||\n|[resnext152_vd_32x4d_imagenet](image/classification/resnext152_vd_32x4d_imagenet)||||\n|[ghostnet_x1_3_imagenet_ssld](image/classification/ghostnet_x1_3_imagenet_ssld)|GhostNet|ImageNet-2012||\n|[rexnet_1_5_imagenet](image/classification/rexnet_1_5_imagenet)|ReXNet|ImageNet-2012||\n|[resnext50_64x4d_imagenet](image/classification/resnext50_64x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[resnext101_64x4d_imagenet](image/classification/resnext101_64x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[efficientnetb0_imagenet](image/classification/efficientnetb0_imagenet)|EfficientNet|ImageNet-2012||\n|[efficientnetb1_imagenet](image/classification/efficientnetb1_imagenet)|EfficientNet|ImageNet-2012||\n|[mobilenet_v2_imagenet_ssld](image/classification/mobilenet_v2_imagenet_ssld)|Mobilenet_v2|ImageNet-2012||\n|[resnet50_vd_dishes](image/classification/resnet50_vd_dishes)|ResNet50_vd|dishes||\n|[pnasnet_imagenet](image/classification/pnasnet_imagenet)|PNASNet|ImageNet-2012||\n|[rexnet_2_0_imagenet](image/classif
ication/rexnet_2_0_imagenet)|ReXNet|ImageNet-2012||\n|[SnakeIdentification](image/classification/SnakeIdentification)|ResNet50_vd_ssld|snakes||\n|[hrnet40_imagenet](image/classification/hrnet40_imagenet)|HRNet|ImageNet-2012||\n|[resnet_v2_34_imagenet](image/classification/resnet_v2_34_imagenet)|ResNet V2|ImageNet-2012||\n|[mobilenet_v2_dishes](image/classification/mobilenet_v2_dishes)|MobileNet_v2|dishes||\n|[resnext101_vd_32x4d_imagenet](image/classification/resnext101_vd_32x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[repvgg_b2g4_imagenet](image/classification/repvgg_b2g4_imagenet)|RepVGG|ImageNet-2012||\n|[fix_resnext101_32x48d_wsl_imagenet](image/classification/fix_resnext101_32x48d_wsl_imagenet)|ResNeXt|ImageNet-2012||\n|[vgg13_imagenet](image/classification/vgg13_imagenet)|VGG|ImageNet-2012||\n|[se_resnext101_32x4d_imagenet](image/classification/se_resnext101_32x4d_imagenet)|SE_ResNeXt|ImageNet-2012||\n|[hrnet30_imagenet](image/classification/hrnet30_imagenet)|HRNet|ImageNet-2012||\n|[ghostnet_x1_3_imagenet](image/classification/ghostnet_x1_3_imagenet)|GhostNet|ImageNet-2012||\n|[dpn107_imagenet](image/classification/dpn107_imagenet)|DPN|ImageNet-2012||\n|[densenet161_imagenet](image/classification/densenet161_imagenet)|DenseNet|ImageNet-2012||\n|[vgg19_imagenet](image/classification/vgg19_imagenet)|vgg19_imagenet|ImageNet-2012||\n|[mobilenet_v2_imagenet](image/classification/mobilenet_v2_imagenet)|Mobilenet_v2|ImageNet-2012||\n|[resnet50_vd_10w](image/classification/resnet50_vd_10w)|ResNet_vd|private||\n|[resnet_v2_101_imagenet](image/classification/resnet_v2_101_imagenet)|ResNet V2 
101|ImageNet-2012||\n|[darknet53_imagenet](image/classification/darknet53_imagenet)|DarkNet|ImageNet-2012||\n|[se_resnext50_32x4d_imagenet](image/classification/se_resnext50_32x4d_imagenet)|SE_ResNeXt|ImageNet-2012||\n|[se_hrnet64_imagenet_ssld](image/classification/se_hrnet64_imagenet_ssld)|HRNet|ImageNet-2012||\n|[resnext101_32x16d_wsl](image/classification/resnext101_32x16d_wsl)|ResNeXt_wsl|ImageNet-2012||\n|[hrnet18_imagenet](image/classification/hrnet18_imagenet)|HRNet|ImageNet-2012||\n|[spinalnet_res101_gemstone](image/classification/spinalnet_res101_gemstone)|resnet101|gemstone||\n|[densenet264_imagenet](image/classification/densenet264_imagenet)|DenseNet|ImageNet-2012||\n|[resnext50_vd_32x4d_imagenet](image/classification/resnext50_vd_32x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\n|[SpinalNet_Gemstones](image/classification/SpinalNet_Gemstones)||||\n|[spinalnet_vgg16_gemstone](image/classification/spinalnet_vgg16_gemstone)|vgg16|gemstone||\n|[xception71_imagenet](image/classification/xception71_imagenet)|Xception|ImageNet-2012||\n|[repvgg_b2_imagenet](image/classification/repvgg_b2_imagenet)|RepVGG|ImageNet-2012||\n|[dpn68_imagenet](image/classification/dpn68_imagenet)|DPN|ImageNet-2012||\n|[alexnet_imagenet](image/classification/alexnet_imagenet)|AlexNet|ImageNet-2012||\n|[rexnet_1_3_imagenet](image/classification/rexnet_1_3_imagenet)|ReXNet|ImageNet-2012||\n|[hrnet64_imagenet](image/classification/hrnet64_imagenet)|HRNet|ImageNet-2012||\n|[efficientnetb7_imagenet](image/classification/efficientnetb7_imagenet)|EfficientNet|ImageNet-2012||\n|[efficientnetb0_small_imagenet](image/classification/efficientnetb0_small_imagenet)|EfficientNet|ImageNet-2012||\n|[efficientnetb6_imagenet](image/classification/efficientnetb6_imagenet)|EfficientNet|ImageNet-2012||\n|[hrnet48_imagenet](image/classification/hrnet48_imagenet)|HRNet|ImageNet-2012||\n|[rexnet_3_0_imagenet](image/classification/rexnet_3_0_imagenet)|ReXNet|ImageNet-2012||\n|[shufflenet_v2_imagenet](image/classif
ication/shufflenet_v2_imagenet)|ShuffleNet V2|ImageNet-2012||\n|[ghostnet_x0_5_imagenet](image/classification/ghostnet_x0_5_imagenet)|GhostNet|ImageNet-2012||\n|[inception_v4_imagenet](image/classification/inception_v4_imagenet)|Inception_V4|ImageNet-2012||\n|[resnext101_vd_64x4d_imagenet](image/classification/resnext101_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\n|[densenet201_imagenet](image/classification/densenet201_imagenet)|DenseNet|ImageNet-2012||\n|[vgg16_imagenet](image/classification/vgg16_imagenet)|VGG|ImageNet-2012||\n|[mobilenet_v3_small_imagenet_ssld](image/classification/mobilenet_v3_small_imagenet_ssld)|Mobilenet_v3_Small|ImageNet-2012||\n|[hrnet18_imagenet_ssld](image/classification/hrnet18_imagenet_ssld)|HRNet|ImageNet-2012||\n|[resnext152_64x4d_imagenet](image/classification/resnext152_64x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[efficientnetb3_imagenet](image/classification/efficientnetb3_imagenet)|EfficientNet|ImageNet-2012||\n|[efficientnetb2_imagenet](image/classification/efficientnetb2_imagenet)|EfficientNet|ImageNet-2012||\n|[repvgg_b1g4_imagenet](image/classification/repvgg_b1g4_imagenet)|RepVGG|ImageNet-2012||\n|[resnext101_32x4d_imagenet](image/classification/resnext101_32x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[resnext50_32x4d_imagenet](image/classification/resnext50_32x4d_imagenet)|ResNeXt|ImageNet-2012||\n|[repvgg_a2_imagenet](image/classification/repvgg_a2_imagenet)|RepVGG|ImageNet-2012||\n|[resnext152_vd_64x4d_imagenet](image/classification/resnext152_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\n|[xception41_imagenet](image/classification/xception41_imagenet)|Xception|ImageNet-2012||\n|[googlenet_imagenet](image/classification/googlenet_imagenet)|GoogleNet|ImageNet-2012||\n|[resnet50_vd_imagenet_ssld](image/classification/resnet50_vd_imagenet_ssld)|ResNet_vd|ImageNet-2012||\n|[repvgg_b1_imagenet](image/classification/repvgg_b1_imagenet)|RepVGG|ImageNet-2012||\n|[repvgg_b0_imagenet](image/classification/repvgg_b0_imagenet)|RepVGG|I
mageNet-2012||\n|[resnet_v2_50_imagenet](image/classification/resnet_v2_50_imagenet)|ResNet V2|ImageNet-2012||\n|[rexnet_1_0_imagenet](image/classification/rexnet_1_0_imagenet)|ReXNet|ImageNet-2012||\n|[resnet_v2_18_imagenet](image/classification/resnet_v2_18_imagenet)|ResNet V2|ImageNet-2012||\n|[resnext101_32x8d_wsl](image/classification/resnext101_32x8d_wsl)|ResNeXt_wsl|ImageNet-2012||\n|[efficientnetb4_imagenet](image/classification/efficientnetb4_imagenet)|EfficientNet|ImageNet-2012||\n|[efficientnetb5_imagenet](image/classification/efficientnetb5_imagenet)|EfficientNet|ImageNet-2012||\n|[repvgg_b1g2_imagenet](image/classification/repvgg_b1g2_imagenet)|RepVGG|ImageNet-2012||\n|[resnext101_32x48d_wsl](image/classification/resnext101_32x48d_wsl)|ResNeXt_wsl|ImageNet-2012||\n|[resnet50_vd_wildanimals](image/classification/resnet50_vd_wildanimals)|ResNet_vd|IFAW wild animals||\n|[nasnet_imagenet](image/classification/nasnet_imagenet)|NASNet|ImageNet-2012||\n|[se_resnet18_vd_imagenet](image/classification/se_resnet18_vd_imagenet)||||\n|[spinalnet_res50_gemstone](image/classification/spinalnet_res50_gemstone)|resnet50|gemstone||\n|[resnext50_vd_64x4d_imagenet](image/classification/resnext50_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\n|[resnext101_32x32d_wsl](image/classification/resnext101_32x32d_wsl)|ResNeXt_wsl|ImageNet-2012||\n|[dpn131_imagenet](image/classification/dpn131_imagenet)|DPN|ImageNet-2012||\n|[xception65_imagenet](image/classification/xception65_imagenet)|Xception|ImageNet-2012||\n|[repvgg_b3g4_imagenet](image/classification/repvgg_b3g4_imagenet)|RepVGG|ImageNet-2012||\n|[marine_biometrics](image/classification/marine_biometrics)|ResNet50_vd_ssld|Fish4Knowledge||\n|[res2net101_vd_26w_4s_imagenet](image/classification/res2net101_vd_26w_4s_imagenet)|Res2Net|ImageNet-2012||\n|[dpn98_imagenet](image/classification/dpn98_imagenet)|DPN|ImageNet-2012||\n|[resnet18_vd_imagenet](image/classification/resnet18_vd_imagenet)|ResNet_vd|ImageNet-2012||\n|[densenet
121_imagenet](image/classification/densenet121_imagenet)|DenseNet|ImageNet-2012||\n|[vgg11_imagenet](image/classification/vgg11_imagenet)|VGG|ImageNet-2012||\n|[hrnet44_imagenet](image/classification/hrnet44_imagenet)|HRNet|ImageNet-2012||\n|[densenet169_imagenet](image/classification/densenet169_imagenet)|DenseNet|ImageNet-2012||\n|[hrnet32_imagenet](image/classification/hrnet32_imagenet)|HRNet|ImageNet-2012||\n|[dpn92_imagenet](image/classification/dpn92_imagenet)|DPN|ImageNet-2012||\n|[ghostnet_x1_0_imagenet](image/classification/ghostnet_x1_0_imagenet)|GhostNet|ImageNet-2012||\n|[hrnet48_imagenet_ssld](image/classification/hrnet48_imagenet_ssld)|HRNet|ImageNet-2012||\n\n</div></details>\n\n\n  - ### Image Generation\n\n|module|Network|Dataset|Introduction| Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[pixel2style2pixel](image/Image_gan/gan/pixel2style2pixel/)|Pixel2Style2Pixel|-|human face|\n|[stgan_bald](image/Image_gan/gan/stgan_bald/)|STGAN|CelebA|stgan_bald| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/stgan_bald) |\n|[styleganv2_editing](image/Image_gan/gan/styleganv2_editing)|StyleGAN V2|-|human face editing|\n|[wav2lip](image/Image_gan/gan/wav2lip)|wav2lip|LRS2|wav2lip| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/wav2lip) |\n|[attgan_celeba](image/Image_gan/attgan_celeba/)|AttGAN|Celeba|human face editing|\n|[cyclegan_cityscapes](image/Image_gan/cyclegan_cityscapes)|CycleGAN|Cityscapes|cyclegan_cityscapes|\n|[stargan_celeba](image/Image_gan/stargan_celeba)|StarGAN|Celeba|human face editing|\n|[stgan_celeba](image/Image_gan/stgan_celeba/)|STGAN|Celeba|human face editing|\n|[ID_Photo_GEN](image/Image_gan/style_transfer/ID_Photo_GEN)|HRNet_W18|-|ID_Photo_GEN|\n|[Photo2Cartoon](image/Image_gan/style_transfer/Photo2Cartoon)|U-GAT-IT|cartoon_data|cartoon|[![Hugging Face 
Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/photo2cartoon) |\n|[U2Net_Portrait](image/Image_gan/style_transfer/U2Net_Portrait)|U^2Net|-|Portrait|\n|[UGATIT_100w](image/Image_gan/style_transfer/UGATIT_100w)|U-GAT-IT|selfie2anime|selfie2anime|\n|[UGATIT_83w](image/Image_gan/style_transfer/UGATIT_83w)|U-GAT-IT|selfie2anime|selfie2anime|\n|[UGATIT_92w](image/Image_gan/style_transfer/UGATIT_92w)| U-GAT-IT|selfie2anime|selfie2anime|\n|[animegan_v1_hayao_60](image/Image_gan/style_transfer/animegan_v1_hayao_60)|AnimeGAN|The Wind Rises|animegan_v1_hayao| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v1_hayao_60) |\n|[animegan_v2_hayao_64](image/Image_gan/style_transfer/animegan_v2_hayao_64)|AnimeGAN|The Wind Rises|animegan_v1_hayao| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_hayao_64) |\n|[animegan_v2_hayao_99](image/Image_gan/style_transfer/animegan_v2_hayao_99)|AnimeGAN|The Wind Rises|animegan_v1_hayao| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_hayao_99) |\n|[animegan_v2_paprika_54](image/Image_gan/style_transfer/animegan_v2_paprika_54)|AnimeGAN|Paprika|animegan_v2_paprika| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_paprika_54) |\n|[animegan_v2_paprika_74](image/Image_gan/style_transfer/animegan_v2_paprika_74)|AnimeGAN|Paprika|animegan_v2_paprika| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_paprika_74) 
|\n|[animegan_v2_paprika_97](image/Image_gan/style_transfer/animegan_v2_paprika_97)|AnimeGAN|Paprika|animegan_v2_paprika| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_paprika_97) |\n|[animegan_v2_paprika_98](image/Image_gan/style_transfer/animegan_v2_paprika_98)|AnimeGAN|Paprika|animegan_v2_paprika| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_paprika_98) |\n|[animegan_v2_shinkai_33](image/Image_gan/style_transfer/animegan_v2_shinkai_33)|AnimeGAN|Your Name, Weathering with you|animegan_v2_shinkai| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_shinkai_33) |\n|[animegan_v2_shinkai_53](image/Image_gan/style_transfer/animegan_v2_shinkai_53)|AnimeGAN|Your Name, Weathering with you|animegan_v2_shinkai| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/animegan_v2_shinkai_53) |\n|[msgnet](image/Image_gan/style_transfer/msgnet)|msgnet|COCO2014| |[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/msgnet) |\n|[stylepro_artistic](image/Image_gan/style_transfer/stylepro_artistic)|StyleProNet|MS-COCO + WikiArt|stylepro_artistic| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/stylepro_artistic) |\n|stylegan_ffhq|StyleGAN|FFHQ|stylepro_artistic|\n\n  - ### Keypoint 
Detection\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[face_landmark_localization](image/keypoint_detection/face_landmark_localization)|Face_Landmark|AFW/AFLW|Face_Landmark|\n|[hand_pose_localization](image/keypoint_detection/hand_pose_localization)|-|MPII, NZSL|hand_pose_localization|\n|[openpose_body_estimation](image/keypoint_detection/openpose_body_estimation)|two-branch multi-stage CNN|MPII, COCO 2016|openpose_body_estimation|\n|[human_pose_estimation_resnet50_mpii](image/keypoint_detection/human_pose_estimation_resnet50_mpii)|Pose_Resnet50|MPII|human_pose_estimation|\n|[openpose_hands_estimation](image/keypoint_detection/openpose_hands_estimation)|-|MPII, NZSL|openpose_hands_estimation|\n\n  - ### Semantic Segmentation\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[deeplabv3p_xception65_humanseg](image/semantic_segmentation/deeplabv3p_xception65_humanseg)|deeplabv3p|-|humanseg|\n|[humanseg_server](image/semantic_segmentation/humanseg_server)|deeplabv3p|-|humanseg|\n|[humanseg_mobile](image/semantic_segmentation/humanseg_mobile)|hrnet|-|humanseg|\n|[humanseg_lite](image/semantic_segmentation/humanseg_lite)|shufflenet|-|humanseg|\n|[ExtremeC3_Portrait_Segmentation](image/semantic_segmentation/ExtremeC3_Portrait_Segmentation)|ExtremeC3|EG1800, Baidu fashion dataset|humanseg|\n|[SINet_Portrait_Segmentation](image/semantic_segmentation/SINet_Portrait_Segmentation)|SINet|EG1800, Baidu fashion dataset|humanseg|\n|[FCN_HRNet_W18_Face_Seg](image/semantic_segmentation/FCN_HRNet_W18_Face_Seg)|FCN_HRNet_W18|-|humanseg|\n|[ace2p](image/semantic_segmentation/ace2p)|ACE2P|LIP|ACE2P|\n|[Pneumonia_CT_LKM_PP](image/semantic_segmentation/Pneumonia_CT_LKM_PP)|U-NET+|-|Pneumonia_CT|\n|[Pneumonia_CT_LKM_PP_lung](image/semantic_segmentation/Pneumonia_CT_LKM_PP_lung)|U-NET+|-|Pneumonia_CT|\n|[ocrnet_hrnetw18_voc](image/semantic_segmentation/ocrnet_hrnetw18_voc)|ocrnet, 
hrnet|PascalVoc2012|\n|[U2Net](image/semantic_segmentation/U2Net)|U^2Net|-|U2Net|\n|[U2Netp](image/semantic_segmentation/U2Netp)|U^2Net|-|U2Net|\n|[Extract_Line_Draft](image/semantic_segmentation/Extract_Line_Draft)|UNet|Pixiv|Extract_Line_Draft|\n|[unet_cityscapes](image/semantic_segmentation/unet_cityscapes)|UNet|cityscapes|\n|[ocrnet_hrnetw18_cityscapes](image/semantic_segmentation/ocrnet_hrnetw18_cityscapes)|ocrnet_hrnetw18|cityscapes|\n|[hardnet_cityscapes](image/semantic_segmentation/hardnet_cityscapes)|hardnet|cityscapes|\n|[fcn_hrnetw48_voc](image/semantic_segmentation/fcn_hrnetw48_voc)|fcn_hrnetw48|PascalVoc2012|\n|[fcn_hrnetw48_cityscapes](image/semantic_segmentation/fcn_hrnetw48_cityscapes)|fcn_hrnetw48|cityscapes|\n|[fcn_hrnetw18_voc](image/semantic_segmentation/fcn_hrnetw18_voc)|fcn_hrnetw18|PascalVoc2012|\n|[fcn_hrnetw18_cityscapes](image/semantic_segmentation/fcn_hrnetw18_cityscapes)|fcn_hrnetw18|cityscapes|\n|[fastscnn_cityscapes](image/semantic_segmentation/fastscnn_cityscapes)|fastscnn|cityscapes|\n|[deeplabv3p_resnet50_voc](image/semantic_segmentation/deeplabv3p_resnet50_voc)|deeplabv3p, resnet50|PascalVoc2012|\n|[deeplabv3p_resnet50_cityscapes](image/semantic_segmentation/deeplabv3p_resnet50_cityscapes)|deeplabv3p, resnet50|cityscapes|\n|[bisenetv2_cityscapes](image/semantic_segmentation/bisenetv2_cityscapes)|bisenetv2|cityscapes|\n\n\n\n  - ### Face Detection\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[pyramidbox_lite_mobile](image/face_detection/pyramidbox_lite_mobile)|PyramidBox|WIDER FACE|face_detection|\n|[pyramidbox_lite_mobile_mask](image/face_detection/pyramidbox_lite_mobile_mask)|PyramidBox|WIDER FACE|face_detection|\n|[pyramidbox_lite_server_mask](image/face_detection/pyramidbox_lite_server_mask)|PyramidBox|WIDER FACE|face_detection|\n|[ultra_light_fast_generic_face_detector_1mb_640](image/face_detection/ultra_light_fast_generic_face_detector_1mb_640)|Ultra-Light-Fast-Generic-Face-Detector-1MB|WIDER 
FACE|face_detection|\n|[ultra_light_fast_generic_face_detector_1mb_320](image/face_detection/ultra_light_fast_generic_face_detector_1mb_320)|Ultra-Light-Fast-Generic-Face-Detector-1MB|WIDER FACE|face_detection|\n|[pyramidbox_lite_server](image/face_detection/pyramidbox_lite_server)|PyramidBox|WIDER FACE|face_detection|\n|[pyramidbox_face_detection](image/face_detection/pyramidbox_face_detection)|PyramidBox|WIDER FACE|face_detection|\n\n  - ### Text Recognition\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[chinese_ocr_db_crnn_mobile](image/text_recognition/chinese_ocr_db_crnn_mobile)|Differentiable Binarization+RCNN|icdar2015|Chinese text recognition|[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/chinese_ocr_db_crnn_mobile) |\n|[chinese_text_detection_db_mobile](image/text_recognition/chinese_text_detection_db_mobile)|Differentiable Binarization|icdar2015|Chinese text detection|\n|[chinese_text_detection_db_server](image/text_recognition/chinese_text_detection_db_server)|Differentiable Binarization|icdar2015|Chinese text detection|\n|[chinese_ocr_db_crnn_server](image/text_recognition/chinese_ocr_db_crnn_server)|Differentiable Binarization+RCNN|icdar2015|Chinese text recognition|[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/chinese_ocr_db_crnn_server) |\n|[Vehicle_License_Plate_Recognition](image/text_recognition/Vehicle_License_Plate_Recognition)|-|CCPD|Vehicle license plate recognition|\n|[chinese_cht_ocr_db_crnn_mobile](image/text_recognition/chinese_cht_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Traditional Chinese text recognition|\n|[japan_ocr_db_crnn_mobile](image/text_recognition/japan_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Japanese text 
recognition|\n|[korean_ocr_db_crnn_mobile](image/text_recognition/korean_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Korean text recognition|\n|[german_ocr_db_crnn_mobile](image/text_recognition/german_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|German text recognition|\n|[french_ocr_db_crnn_mobile](image/text_recognition/french_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|French text recognition|\n|[latin_ocr_db_crnn_mobile](image/text_recognition/latin_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Latin text recognition|\n|[cyrillic_ocr_db_crnn_mobile](image/text_recognition/cyrillic_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Cyrillic text recognition|\n|[multi_languages_ocr_db_crnn](image/text_recognition/multi_languages_ocr_db_crnn)|Differentiable Binarization+RCNN|icdar2015|Multi-language text recognition|\n|[kannada_ocr_db_crnn_mobile](image/text_recognition/kannada_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Kannada text recognition|\n|[arabic_ocr_db_crnn_mobile](image/text_recognition/arabic_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Arabic text recognition|\n|[telugu_ocr_db_crnn_mobile](image/text_recognition/telugu_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Telugu text recognition|\n|[devanagari_ocr_db_crnn_mobile](image/text_recognition/devanagari_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Devanagari text recognition|\n|[tamil_ocr_db_crnn_mobile](image/text_recognition/tamil_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015|Tamil text recognition|\n\n\n  - ### Image Editing\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[realsr](image/Image_editing/super_resolution/realsr)|LP-KPN|RealSR dataset|Image / Video super-resolution|\n|[deoldify](image/Image_editing/colorization/deoldify)|GAN|ILSVRC 2012|Black-and-white image / video 
colorization|[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/deoldify) |\n|[photo_restoration](image/Image_editing/colorization/photo_restoration)|deoldify + realsr|-|Old photo restoration|\n|[user_guided_colorization](image/Image_editing/colorization/user_guided_colorization)|siggraph|ILSVRC 2012|User guided colorization|\n|[falsr_c](image/Image_editing/super_resolution/falsr_c)|falsr_c| DIV2k|Lightweight super resolution - 2x|\n|[dcscn](image/Image_editing/super_resolution/dcscn)|dcscn| DIV2k|Lightweight super resolution - 2x|\n|[falsr_a](image/Image_editing/super_resolution/falsr_a)|falsr_a| DIV2k|Lightweight super resolution - 2x|\n|[falsr_b](image/Image_editing/super_resolution/falsr_b)|falsr_b|DIV2k|Lightweight super resolution - 2x|\n\n  - ### Instance Segmentation\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[solov2](image/instance_segmentation/solov2)|-|COCO2014|Instance segmentation|\n\n  - ### Object Detection\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[faster_rcnn_resnet50_coco2017](image/object_detection/faster_rcnn_resnet50_coco2017)|faster_rcnn|COCO2017||\n|[ssd_vgg16_512_coco2017](image/object_detection/ssd_vgg16_512_coco2017)|SSD|COCO2017||\n|[faster_rcnn_resnet50_fpn_venus](image/object_detection/faster_rcnn_resnet50_fpn_venus)|faster_rcnn|Baidu self built dataset|Large-scale general detection|\n|[ssd_vgg16_300_coco2017](image/object_detection/ssd_vgg16_300_coco2017)||||\n|[yolov3_resnet34_coco2017](image/object_detection/yolov3_resnet34_coco2017)|YOLOv3|COCO2017||\n|[yolov3_darknet53_pedestrian](image/object_detection/yolov3_darknet53_pedestrian)|YOLOv3|Baidu Self built large-scale pedestrian dataset|Pedestrian Detection|\n|[yolov3_mobilenet_v1_coco2017](image/object_detection/yolov3_mobilenet_v1_coco2017)|YOLOv3|COCO2017||\n|[ssd_mobilenet_v1_pascal](image/object_detection/ssd_mobilenet_v1_pascal)|SSD|PASCAL 
VOC||\n|[faster_rcnn_resnet50_fpn_coco2017](image/object_detection/faster_rcnn_resnet50_fpn_coco2017)|faster_rcnn|COCO2017||\n|[yolov3_darknet53_coco2017](image/object_detection/yolov3_darknet53_coco2017)|YOLOv3|COCO2017||\n|[yolov3_darknet53_vehicles](image/object_detection/yolov3_darknet53_vehicles)|YOLOv3|Baidu Self built large-scale vehicles dataset|Vehicle Detection|\n|[yolov3_darknet53_venus](image/object_detection/yolov3_darknet53_venus)|YOLOv3|Baidu self built dataset|Large-scale general detection|\n|[yolov3_resnet50_vd_coco2017](image/object_detection/yolov3_resnet50_vd_coco2017)|YOLOv3|COCO2017||\n\n  - ### Depth Estimation\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[MiDaS_Large](image/depth_estimation/MiDaS_Large)|-|3D Movies, WSVD, ReDWeb, MegaDepth|| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/MiDaS_Large) |\n|[MiDaS_Small](image/depth_estimation/MiDaS_Small)|-|3D Movies, WSVD, ReDWeb, MegaDepth, etc.|| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/MiDaS_Small) |\n\n  - ### Text_to_Image\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[disco_diffusion_clip_rn101](image/text_to_image/disco_diffusion_clip_rn101)|-|Open domain multi round dataset|text_to_image|\n|[ernie_vilg](image/text_to_image/ernie_vilg)|-|Open domain multi round dataset|text_to_image|[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/ERNIE-ViLG) |\n|[stable_diffusion_img2img](image/text_to_image/stable_diffusion_img2img)|-|Open domain multi round dataset|img2img|\n\n## Text\n  - ### Text Generation\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[ernie_gen](text/text_generation/ernie_gen)|ERNIE-GEN|-|Pre-training 
finetuning framework for generating tasks|\n|[ernie_gen_poetry](text/text_generation/ernie_gen_poetry)|ERNIE-GEN|Open source poetry dataset|Poetry generation|\n|[ernie_gen_couplet](text/text_generation/ernie_gen_couplet)|ERNIE-GEN|Open source couplet dataset|Couplet generation|\n|[ernie_gen_lover_words](text/text_generation/ernie_gen_lover_words)|ERNIE-GEN|Online love poems and love talk data|Love word generation|\n|[ernie_tiny_couplet](text/text_generation/ernie_tiny_couplet)|ernie_tiny|Open source couplet dataset|Couplet generation|\n|[ernie_gen_acrostic_poetry](text/text_generation/ernie_gen_acrostic_poetry)|ERNIE-GEN|Open source poetry dataset|Acrostic poetry generation|\n|[Rumor_prediction](text/text_generation/Rumor_prediction)|-|Sina Weibo Chinese rumor data|Rumor prediction|\n|[plato-mini](text/text_generation/plato-mini)|Unified Transformer|Billion level Chinese conversation data|Chinese dialogue|\n|[plato2_en_large](text/text_generation/plato2_en_large)|plato2|Open domain multi round dataset|Super large scale generative dialogue|\n|[plato2_en_base](text/text_generation/plato2_en_base)|plato2|Open domain multi round dataset|Super large scale generative dialogue|\n|[CPM_LM](text/text_generation/CPM_LM)|GPT-2|Self built dataset|Chinese text generation|\n|[unified_transformer-12L-cn](text/text_generation/unified_transformer-12L-cn)|Unified Transformer|Ten million level Chinese conversation data|Man machine multi round dialogue|\n|[unified_transformer-12L-cn-luge](text/text_generation/unified_transformer-12L-cn-luge)|Unified Transformer|dialogue dataset|Man machine multi round dialogue|\n|[reading_pictures_writing_poems](text/text_generation/reading_pictures_writing_poems)|Multi network cascade|-|Look at pictures and write poems|\n|[GPT2_CPM_LM](text/text_generation/GPT2_CPM_LM)|||Q&A text generation|\n|[GPT2_Base_CN](text/text_generation/GPT2_Base_CN)|||Q&A text generation|\n\n  - ### Word 
Embedding\n\n<details><summary>expand</summary><div>\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[w2v_weibo_target_word-bigram_dim300](text/embedding/w2v_weibo_target_word-bigram_dim300)|w2v|weibo||\n|[w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300)|w2v|baidu_encyclopedia||\n|[w2v_literature_target_word-word_dim300](text/embedding/w2v_literature_target_word-word_dim300)|w2v|literature||\n|[word2vec_skipgram](text/embedding/word2vec_skipgram)|skip-gram|Baidu self built dataset||\n|[w2v_sogou_target_word-char_dim300](text/embedding/w2v_sogou_target_word-char_dim300)|w2v|sogou||\n|[w2v_weibo_target_bigram-char_dim300](text/embedding/w2v_weibo_target_bigram-char_dim300)|w2v|weibo||\n|[w2v_zhihu_target_word-bigram_dim300](text/embedding/w2v_zhihu_target_word-bigram_dim300)|w2v|zhihu||\n|[w2v_financial_target_word-word_dim300](text/embedding/w2v_financial_target_word-word_dim300)|w2v|financial||\n|[w2v_wiki_target_word-word_dim300](text/embedding/w2v_wiki_target_word-word_dim300)|w2v|wiki||\n|[w2v_baidu_encyclopedia_context_word-word_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-word_dim300)|w2v|baidu_encyclopedia||\n|[w2v_weibo_target_word-word_dim300](text/embedding/w2v_weibo_target_word-word_dim300)|w2v|weibo||\n|[w2v_zhihu_target_bigram-char_dim300](text/embedding/w2v_zhihu_target_bigram-char_dim300)|w2v|zhihu||\n|[w2v_zhihu_target_word-word_dim300](text/embedding/w2v_zhihu_target_word-word_dim300)|w2v|zhihu||\n|[w2v_people_daily_target_word-char_dim300](text/embedding/w2v_people_daily_target_word-char_dim300)|w2v|people_daily||\n|[w2v_sikuquanshu_target_word-word_dim300](text/embedding/w2v_sikuquanshu_target_word-word_dim300)|w2v|sikuquanshu||\n|[glove_twitter_target_word-word_dim200_en](text/embedding/glove_twitter_target_word-word_dim200_en)|glove|twitter||\n|[fasttext_crawl_target_word-word_dim300_en](text/embedding/fasttext_crawl_target_word-word_dim300_en)|fast
text|crawl||\n|[w2v_wiki_target_word-bigram_dim300](text/embedding/w2v_wiki_target_word-bigram_dim300)|w2v|wiki||\n|[w2v_baidu_encyclopedia_context_word-character_char1-1_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300)|w2v|baidu_encyclopedia||\n|[glove_wiki2014-gigaword_target_word-word_dim300_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim300_en)|glove|wiki2014-gigaword||\n|[glove_wiki2014-gigaword_target_word-word_dim50_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim50_en)|glove|wiki2014-gigaword||\n|[w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300)|w2v|baidu_encyclopedia||\n|[w2v_wiki_target_bigram-char_dim300](text/embedding/w2v_wiki_target_bigram-char_dim300)|w2v|wiki||\n|[w2v_baidu_encyclopedia_target_word-character_char1-1_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300)|w2v|baidu_encyclopedia||\n|[w2v_financial_target_bigram-char_dim300](text/embedding/w2v_financial_target_bigram-char_dim300)|w2v|financial||\n|[glove_wiki2014-gigaword_target_word-word_dim200_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim200_en)|glove|wiki2014-gigaword||\n|[w2v_financial_target_word-bigram_dim300](text/embedding/w2v_financial_target_word-bigram_dim300)|w2v|financial||\n|[w2v_mixed-large_target_word-char_dim300](text/embedding/w2v_mixed-large_target_word-char_dim300)|w2v|mixed||\n|[w2v_baidu_encyclopedia_target_word-wordPosition_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-wordPosition_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_target_word-wordLR_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-wordLR_dim300)|w2v|baidu_encyclopedia||\n|[w2v_sogou_target_bigram-char_dim300](text/embedding/w2v_sogou_ta
rget_bigram-char_dim300)|w2v|sogou||\n|[w2v_weibo_target_word-char_dim300](text/embedding/w2v_weibo_target_word-char_dim300)|w2v|weibo||\n|[w2v_people_daily_target_word-word_dim300](text/embedding/w2v_people_daily_target_word-word_dim300)|w2v|people_daily||\n|[w2v_zhihu_target_word-char_dim300](text/embedding/w2v_zhihu_target_word-char_dim300)|w2v|zhihu||\n|[w2v_wiki_target_word-char_dim300](text/embedding/w2v_wiki_target_word-char_dim300)|w2v|wiki||\n|[w2v_sogou_target_word-bigram_dim300](text/embedding/w2v_sogou_target_word-bigram_dim300)|w2v|sogou||\n|[w2v_financial_target_word-char_dim300](text/embedding/w2v_financial_target_word-char_dim300)|w2v|financial||\n|[w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300)|w2v|baidu_encyclopedia||\n|[glove_wiki2014-gigaword_target_word-word_dim100_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim100_en)|glove|wiki2014-gigaword||\n|[w2v_baidu_encyclopedia_target_word-character_char1-4_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300)|w2v|baidu_encyclopedia||\n|[w2v_sogou_target_word-word_dim300](text/embedding/w2v_sogou_target_word-word_dim300)|w2v|sogou||\n|[w2v_literature_target_word-char_dim300](text/embedding/w2v_literature_target_word-char_dim300)|w2v|literature||\n|[w2v_baidu_encyclopedia_target_bigram-char_dim300](text/embedding/w2v_baidu_encyclopedia_target_bigram-char_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_target_word-word_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-word_dim300)|w2v|baidu_encyclopedia||\n|[glove_twitter_target_word-word_dim100_en](text/embedding/glove_twitter_target_word-word_dim100_en)|glove|twitter||\n|[w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_context_word-character_char1-4_dim300](text/embedding/w2v_baidu_encyclopedia_co
ntext_word-character_char1-4_dim300)|w2v|baidu_encyclopedia||\n|[w2v_literature_target_bigram-char_dim300](text/embedding/w2v_literature_target_bigram-char_dim300)|w2v|literature||\n|[fasttext_wiki-news_target_word-word_dim300_en](text/embedding/fasttext_wiki-news_target_word-word_dim300_en)|fasttext|wiki-news||\n|[w2v_people_daily_target_word-bigram_dim300](text/embedding/w2v_people_daily_target_word-bigram_dim300)|w2v|people_daily||\n|[w2v_mixed-large_target_word-word_dim300](text/embedding/w2v_mixed-large_target_word-word_dim300)|w2v|mixed||\n|[w2v_people_daily_target_bigram-char_dim300](text/embedding/w2v_people_daily_target_bigram-char_dim300)|w2v|people_daily||\n|[w2v_literature_target_word-bigram_dim300](text/embedding/w2v_literature_target_word-bigram_dim300)|w2v|literature||\n|[glove_twitter_target_word-word_dim25_en](text/embedding/glove_twitter_target_word-word_dim25_en)|glove|twitter||\n|[w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300)|w2v|baidu_encyclopedia||\n|[w2v_sikuquanshu_target_word-bigram_dim300](text/embedding/w2v_sikuquanshu_target_word-bigram_dim300)|w2v|sikuquanshu||\n|[w2v_baidu_encyclopedia_context_word-character_char1-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300)|w2v|baidu_encyclopedia||\n|[glove_twitter_target_word-word_dim50_en](text/embedding/glove_twitter_target_word-word_dim50_en)|glove|twitter||\n|[w2v_baidu_encyclopedia_context_word-wordLR_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-wordLR_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_target_word-character_char1-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300)|w2v|baidu_encyclopedia||\n|[w2v_baidu_encyclopedia_context_word-wordPosition_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-wordPosition_dim300)|w2v|baidu_encyclopedia||\n\n</div></details>\n\n  - ### Machine 
Translation\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[transformer_zh-en](text/machine_translation/transformer/zh-en)|Transformer|CWMT2021|Chinese-English translation|\n|[transformer_en-de](text/machine_translation/transformer/en-de)|Transformer|WMT14 EN-DE|English-German translation|\n\n  - ### Language Model\n\n<details><summary>expand</summary><div>\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[chinese_electra_small](text/language_model/chinese_electra_small)||||\n|[chinese_electra_base](text/language_model/chinese_electra_base)||||\n|[roberta-wwm-ext-large](text/language_model/roberta-wwm-ext-large)|roberta-wwm-ext-large|Baidu self built dataset||\n|[chinese-bert-wwm-ext](text/language_model/chinese_bert_wwm_ext)|chinese-bert-wwm-ext|Baidu self built dataset||\n|[lda_webpage](text/language_model/lda_webpage)|LDA|Baidu self built web page domain dataset||\n|[lda_novel](text/language_model/lda_novel)||||\n|[bert-base-multilingual-uncased](text/language_model/bert-base-multilingual-uncased)||||\n|[rbt3](text/language_model/rbt3)||||\n|[ernie_v2_eng_base](text/language_model/ernie_v2_eng_base)|ernie_v2_eng_base|Baidu self built dataset||\n|[bert-base-multilingual-cased](text/language_model/bert-base-multilingual-cased)||||\n|[rbtl3](text/language_model/rbtl3)||||\n|[chinese-bert-wwm](text/language_model/chinese_bert_wwm)|chinese-bert-wwm|Baidu self built dataset||\n|[bert-large-uncased](text/language_model/bert-large-uncased)||||\n|[slda_novel](text/language_model/slda_novel)||||\n|[slda_news](text/language_model/slda_news)||||\n|[electra_small](text/language_model/electra_small)||||\n|[slda_webpage](text/language_model/slda_webpage)||||\n|[bert-base-cased](text/language_model/bert-base-cased)||||\n|[slda_weibo](text/language_model/slda_weibo)||||\n|[roberta-wwm-ext](text/language_model/roberta-wwm-ext)|roberta-wwm-ext|Baidu self built 
dataset||\n|[bert-base-uncased](text/language_model/bert-base-uncased)||||\n|[electra_large](text/language_model/electra_large)||||\n|[ernie](text/language_model/ernie)|ernie-1.0|Baidu self built dataset||\n|[simnet_bow](text/language_model/simnet_bow)|BOW|Baidu self built dataset||\n|[ernie_tiny](text/language_model/ernie_tiny)|ernie_tiny|Baidu self built dataset||\n|[bert-base-chinese](text/language_model/bert-base-chinese)|bert-base-chinese|Baidu self built dataset||\n|[lda_news](text/language_model/lda_news)|LDA|Baidu self built news field dataset||\n|[electra_base](text/language_model/electra_base)||||\n|[ernie_v2_eng_large](text/language_model/ernie_v2_eng_large)|ernie_v2_eng_large|Baidu self built dataset||\n|[bert-large-cased](text/language_model/bert-large-cased)||||\n\n</div></details>\n\n\n  - ### Sentiment Analysis\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[ernie_skep_sentiment_analysis](text/sentiment_analysis/ernie_skep_sentiment_analysis)|SKEP|Baidu self built dataset|Sentence-level sentiment analysis|\n|[emotion_detection_textcnn](text/sentiment_analysis/emotion_detection_textcnn)|TextCNN|Baidu self built dataset|Dialogue emotion detection|\n|[senta_bilstm](text/sentiment_analysis/senta_bilstm)|BiLSTM|Baidu self built dataset|Chinese sentiment analysis| [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/senta_bilstm) \n|[senta_bow](text/sentiment_analysis/senta_bow)|BOW|Baidu self built dataset|Chinese sentiment analysis|\n|[senta_gru](text/sentiment_analysis/senta_gru)|GRU|Baidu self built dataset|Chinese sentiment analysis|\n|[senta_lstm](text/sentiment_analysis/senta_lstm)|LSTM|Baidu self built dataset|Chinese sentiment analysis|\n|[senta_cnn](text/sentiment_analysis/senta_cnn)|CNN|Baidu self built dataset|Chinese sentiment analysis|\n\n  - ### Syntactic 
Analysis\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[DDParser](text/syntactic_analysis/DDParser)|Deep Biaffine Attention|Search query, web text, voice input and other data|Syntactic analysis|\n\n  - ### Simultaneous Translation\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[transformer_nist_wait_1](text/simultaneous_translation/stacl/transformer_nist_wait_1)|transformer|NIST 2008|Chinese to English - wait-1|\n|[transformer_nist_wait_3](text/simultaneous_translation/stacl/transformer_nist_wait_3)|transformer|NIST 2008|Chinese to English - wait-3|\n|[transformer_nist_wait_5](text/simultaneous_translation/stacl/transformer_nist_wait_5)|transformer|NIST 2008|Chinese to English - wait-5|\n|[transformer_nist_wait_7](text/simultaneous_translation/stacl/transformer_nist_wait_7)|transformer|NIST 2008|Chinese to English - wait-7|\n|[transformer_nist_wait_all](text/simultaneous_translation/stacl/transformer_nist_wait_all)|transformer|NIST 2008|Chinese to English - wait-k=-1|\n\n\n  - ### Lexical Analysis\n\n|module|Network|Dataset|Introduction|Huggingface Spaces Demo|\n|--|--|--|--|--|\n|[jieba_paddle](text/lexical_analysis/jieba_paddle)|BiGRU+CRF|Baidu self built dataset|Jieba uses Paddle to build its word segmentation network (bidirectional GRU), and also supports jieba's traditional segmentation modes, such as precise mode, full mode, and search engine mode.|\n|[lac](text/lexical_analysis/lac)|BiGRU+CRF|Baidu self built dataset|A joint lexical analysis model developed by Baidu that performs Chinese word segmentation, part-of-speech tagging, and named entity recognition in a single model. 
Evaluated on a Baidu self built dataset, LAC achieves Precision=88.0%, Recall=88.7%, and F1-Score=88.4%.|[![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/PaddlePaddle/lac) \n\n  - ### Punctuation Restoration\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[auto_punc](text/punctuation_restoration/auto_punc)|Ernie-1.0|WuDaoCorpora 2.0|Automatically adds 7 kinds of punctuation marks|\n\n  - ### Text Review\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[porn_detection_cnn](text/text_review/porn_detection_cnn)|CNN|Baidu self built dataset|Pornography detection: automatically identifies whether text is pornographic, gives a confidence score, and detects pornographic descriptions, vulgar friend-making content, and obscene text|\n|[porn_detection_gru](text/text_review/porn_detection_gru)|GRU|Baidu self built dataset|Pornography detection: automatically identifies whether text is pornographic, gives a confidence score, and detects pornographic descriptions, vulgar friend-making content, and obscene text|\n|[porn_detection_lstm](text/text_review/porn_detection_lstm)|LSTM|Baidu self built dataset|Pornography detection: automatically identifies whether text is pornographic, gives a confidence score, and detects pornographic descriptions, vulgar friend-making content, and obscene text|\n\n## Audio\n\n  - ### Voice Cloning\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[ge2e_fastspeech2_pwgan](audio/voice_cloning/ge2e_fastspeech2_pwgan)|FastSpeech2|AISHELL-3|Chinese speech cloning|\n|[lstm_tacotron2](audio/voice_cloning/lstm_tacotron2)|LSTM, Tacotron2, WaveFlow|AISHELL-3|Chinese speech cloning|\n\n  - ### Text to Speech\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[transformer_tts_ljspeech](audio/tts/transformer_tts_ljspeech)|Transformer|LJSpeech-1.1|English speech 
synthesis|\n|[fastspeech_ljspeech](audio/tts/fastspeech_ljspeech)|FastSpeech|LJSpeech-1.1|English speech synthesis|\n|[fastspeech2_baker](audio/tts/fastspeech2_baker)|FastSpeech2|Chinese Standard Mandarin Speech Corpus|Chinese speech synthesis|\n|[fastspeech2_ljspeech](audio/tts/fastspeech2_ljspeech)|FastSpeech2|LJSpeech-1.1|English speech synthesis|\n|[deepvoice3_ljspeech](audio/tts/deepvoice3_ljspeech)|DeepVoice3|LJSpeech-1.1|English speech synthesis|\n\n  - ### Automatic Speech Recognition\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[deepspeech2_aishell](audio/asr/deepspeech2_aishell)|DeepSpeech2|AISHELL-1|Chinese Speech Recognition|\n|[deepspeech2_librispeech](audio/asr/deepspeech2_librispeech)|DeepSpeech2|LibriSpeech|English Speech Recognition|\n|[u2_conformer_aishell](audio/asr/u2_conformer_aishell)|Conformer|AISHELL-1|Chinese Speech Recognition|\n|[u2_conformer_wenetspeech](audio/asr/u2_conformer_wenetspeech)|Conformer|WenetSpeech|Chinese Speech Recognition|\n|[u2_conformer_librispeech](audio/asr/u2_conformer_librispeech)|Conformer|LibriSpeech|English Speech Recognition|\n\n\n  - ### Audio Classification\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[panns_cnn6](audio/audio_classification/PANNs/cnn6)|PANNs|Google Audioset|It mainly includes 4 convolution layers and 2 fully connected layers, and the model has 4.5M parameters. After pre-training, it can be used to extract the embedding of audio. The dimension is 512|\n|[panns_cnn14](audio/audio_classification/PANNs/cnn14)|PANNs|Google Audioset|It mainly includes 4 convolution layers and 2 fully connected layers, and the model has 4.5M parameters. After pre-training, it can be used to extract the embedding of audio. The dimension is 2048|\n|[panns_cnn10](audio/audio_classification/PANNs/cnn10)|PANNs|Google Audioset|It mainly includes 4 convolution layers and 2 fully connected layers, and the model has 4.5M parameters. After pre-training, it can be used to extract the embedding of audio. 
The dimension is 512|\n\n## Video\n  - ### Video Classification\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[videotag_tsn_lstm](video/classification/videotag_tsn_lstm)|TSN + AttentionLSTM|Baidu self built dataset|Short-video classification|\n|[tsn_kinetics400](video/classification/tsn_kinetics400)|TSN|Kinetics-400|Video classification|\n|[tsm_kinetics400](video/classification/tsm_kinetics400)|TSM|Kinetics-400|Video classification|\n|[stnet_kinetics400](video/classification/stnet_kinetics400)|StNet|Kinetics-400|Video classification|\n|[nonlocal_kinetics400](video/classification/nonlocal_kinetics400)|Non-local|Kinetics-400|Video classification|\n\n\n  - ### Video Editing\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[SkyAR](video/Video_editing/SkyAR)|UNet|UNet|Video sky replacement|\n\n  - ### Multiple Object Tracking\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[fairmot_dla34](video/multiple_object_tracking/fairmot_dla34)|CenterNet|Caltech Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|Real-time multiple object tracking|\n|[jde_darknet53](video/multiple_object_tracking/jde_darknet53)|YOLOv3|Caltech Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|Object tracking with both accuracy and speed|\n\n## Industrial Application\n\n  - ### Meter Detection\n\n|module|Network|Dataset|Introduction|\n|--|--|--|--|\n|[WatermeterSegmentation](image/semantic_segmentation/WatermeterSegmentation)|DeepLabV3|Water meter dataset|Water meter segmentation|\n"
  },
  {
    "path": "modules/README_ch.md",
    "content": "简体中文 | [English](README.md)\r\n\r\n# 目录\r\n|[图像](#图像) （222个）|[文本](#文本) （130个）|[语音](#语音) （15个）|[视频](#视频) （8个）|[工业应用](#工业应用) （1个）|\r\n|--|--|--|--|--|\r\n|[图像分类](#图像分类) (108)|[文本生成](#文本生成) (17)| [声音克隆](#声音克隆) (2)|[视频分类](#视频分类) (5)| [表针识别](#表针识别) (1)|\r\n|[图像生成](#图像生成) (26)|[词向量](#词向量) (62)|[语音合成](#语音合成) (5)|[视频修复](#视频修复) (1)|-|\r\n|[关键点检测](#关键点检测) (5)|[机器翻译](#机器翻译) (2)|[语音识别](#语音识别) (5)|[多目标追踪](#多目标追踪) (2)|-|\r\n|[图像分割](#图像分割) (25)|[语义模型](#语义模型) (30)|[声音分类](#声音分类) (3)| -|-|\r\n|[人脸检测](#人脸检测) (7)|[情感分析](#情感分析) (7)|-|-|-|\r\n|[文字识别](#文字识别) (17)|[句法分析](#句法分析) (1)|-|-|-|\r\n|[图像编辑](#图像编辑) (8)|[同声传译](#同声传译) (5)|-|-|-|\r\n|[实例分割](#实例分割) (1)|[词法分析](#词法分析) (2)|-|-|-|\r\n|[目标检测](#目标检测) (13)|[标点恢复](#标点恢复) (1)|-|-|-|\r\n|[深度估计](#深度估计) (2)|[文本审核](#文本审核) (3)|-|-|-|\r\n|[文生图](#文生图) (10)|-|-|-|-|\r\n\r\n## 图像\r\n  - ### 图像分类\r\n\r\n<details><summary>expand</summary><div>\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[DriverStatusRecognition](image/classification/DriverStatusRecognition)|MobileNetV3_small_ssld|分心司机检测数据集||\r\n|[mobilenet_v2_animals](image/classification/mobilenet_v2_animals)|MobileNet_v2|百度自建动物数据集||\r\n|[repvgg_a1_imagenet](image/classification/repvgg_a1_imagenet)|RepVGG|ImageNet-2012||\r\n|[repvgg_a0_imagenet](image/classification/repvgg_a0_imagenet)|RepVGG|ImageNet-2012||\r\n|[resnext152_32x4d_imagenet](image/classification/resnext152_32x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[resnet_v2_152_imagenet](image/classification/resnet_v2_152_imagenet)|ResNet 
V2|ImageNet-2012||\r\n|[resnet50_vd_animals](image/classification/resnet50_vd_animals)|ResNet50_vd|百度自建动物数据集||\r\n|[food_classification](image/classification/food_classification)|ResNet50_vd_ssld|美食数据集||\r\n|[mobilenet_v3_large_imagenet_ssld](image/classification/mobilenet_v3_large_imagenet_ssld)|Mobilenet_v3_large|ImageNet-2012||\r\n|[resnext152_vd_32x4d_imagenet](image/classification/resnext152_vd_32x4d_imagenet)||||\r\n|[ghostnet_x1_3_imagenet_ssld](image/classification/ghostnet_x1_3_imagenet_ssld)|GhostNet|ImageNet-2012||\r\n|[rexnet_1_5_imagenet](image/classification/rexnet_1_5_imagenet)|ReXNet|ImageNet-2012||\r\n|[resnext50_64x4d_imagenet](image/classification/resnext50_64x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[resnext101_64x4d_imagenet](image/classification/resnext101_64x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[efficientnetb0_imagenet](image/classification/efficientnetb0_imagenet)|EfficientNet|ImageNet-2012||\r\n|[efficientnetb1_imagenet](image/classification/efficientnetb1_imagenet)|EfficientNet|ImageNet-2012||\r\n|[mobilenet_v2_imagenet_ssld](image/classification/mobilenet_v2_imagenet_ssld)|Mobilenet_v2|ImageNet-2012||\r\n|[resnet50_vd_dishes](image/classification/resnet50_vd_dishes)|ResNet50_vd|百度自建菜品数据集||\r\n|[pnasnet_imagenet](image/classification/pnasnet_imagenet)|PNASNet|ImageNet-2012||\r\n|[rexnet_2_0_imagenet](image/classification/rexnet_2_0_imagenet)|ReXNet|ImageNet-2012||\r\n|[SnakeIdentification](image/classification/SnakeIdentification)|ResNet50_vd_ssld|蛇种数据集||\r\n|[hrnet40_imagenet](image/classification/hrnet40_imagenet)|HRNet|ImageNet-2012||\r\n|[resnet_v2_34_imagenet](image/classification/resnet_v2_34_imagenet)|ResNet 
V2|ImageNet-2012||\r\n|[mobilenet_v2_dishes](image/classification/mobilenet_v2_dishes)|MobileNet_v2|百度自建菜品数据集||\r\n|[resnext101_vd_32x4d_imagenet](image/classification/resnext101_vd_32x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[repvgg_b2g4_imagenet](image/classification/repvgg_b2g4_imagenet)|RepVGG|ImageNet-2012||\r\n|[fix_resnext101_32x48d_wsl_imagenet](image/classification/fix_resnext101_32x48d_wsl_imagenet)|ResNeXt|ImageNet-2012||\r\n|[vgg13_imagenet](image/classification/vgg13_imagenet)|VGG|ImageNet-2012||\r\n|[se_resnext101_32x4d_imagenet](image/classification/se_resnext101_32x4d_imagenet)|SE_ResNeXt|ImageNet-2012||\r\n|[hrnet30_imagenet](image/classification/hrnet30_imagenet)|HRNet|ImageNet-2012||\r\n|[ghostnet_x1_3_imagenet](image/classification/ghostnet_x1_3_imagenet)|GhostNet|ImageNet-2012||\r\n|[dpn107_imagenet](image/classification/dpn107_imagenet)|DPN|ImageNet-2012||\r\n|[densenet161_imagenet](image/classification/densenet161_imagenet)|DenseNet|ImageNet-2012||\r\n|[vgg19_imagenet](image/classification/vgg19_imagenet)|vgg19_imagenet|ImageNet-2012||\r\n|[mobilenet_v2_imagenet](image/classification/mobilenet_v2_imagenet)|Mobilenet_v2|ImageNet-2012||\r\n|[resnet50_vd_10w](image/classification/resnet50_vd_10w)|ResNet_vd|百度自建数据集||\r\n|[resnet_v2_101_imagenet](image/classification/resnet_v2_101_imagenet)|ResNet V2 
101|ImageNet-2012||\r\n|[darknet53_imagenet](image/classification/darknet53_imagenet)|DarkNet|ImageNet-2012||\r\n|[se_resnext50_32x4d_imagenet](image/classification/se_resnext50_32x4d_imagenet)|SE_ResNeXt|ImageNet-2012||\r\n|[se_hrnet64_imagenet_ssld](image/classification/se_hrnet64_imagenet_ssld)|HRNet|ImageNet-2012||\r\n|[resnext101_32x16d_wsl](image/classification/resnext101_32x16d_wsl)|ResNeXt_wsl|ImageNet-2012||\r\n|[hrnet18_imagenet](image/classification/hrnet18_imagenet)|HRNet|ImageNet-2012||\r\n|[spinalnet_res101_gemstone](image/classification/spinalnet_res101_gemstone)|resnet101|gemstone||\r\n|[densenet264_imagenet](image/classification/densenet264_imagenet)|DenseNet|ImageNet-2012||\r\n|[resnext50_vd_32x4d_imagenet](image/classification/resnext50_vd_32x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\r\n|[SpinalNet_Gemstones](image/classification/SpinalNet_Gemstones)||||\r\n|[spinalnet_vgg16_gemstone](image/classification/spinalnet_vgg16_gemstone)|vgg16|gemstone||\r\n|[xception71_imagenet](image/classification/xception71_imagenet)|Xception|ImageNet-2012||\r\n|[repvgg_b2_imagenet](image/classification/repvgg_b2_imagenet)|RepVGG|ImageNet-2012||\r\n|[dpn68_imagenet](image/classification/dpn68_imagenet)|DPN|ImageNet-2012||\r\n|[alexnet_imagenet](image/classification/alexnet_imagenet)|AlexNet|ImageNet-2012||\r\n|[rexnet_1_3_imagenet](image/classification/rexnet_1_3_imagenet)|ReXNet|ImageNet-2012||\r\n|[hrnet64_imagenet](image/classification/hrnet64_imagenet)|HRNet|ImageNet-2012||\r\n|[efficientnetb7_imagenet](image/classification/efficientnetb7_imagenet)|EfficientNet|ImageNet-2012||\r\n|[efficientnetb0_small_imagenet](image/classification/efficientnetb0_small_imagenet)|EfficientNet|ImageNet-2012||\r\n|[efficientnetb6_imagenet](image/classification/efficientnetb6_imagenet)|EfficientNet|ImageNet-2012||\r\n|[hrnet48_imagenet](image/classification/hrnet48_imagenet)|HRNet|ImageNet-2012||\r\n|[rexnet_3_0_imagenet](image/classification/rexnet_3_0_imagenet)|ReXNet|ImageNet-2012|
|\r\n|[shufflenet_v2_imagenet](image/classification/shufflenet_v2_imagenet)|ShuffleNet V2|ImageNet-2012||\r\n|[ghostnet_x0_5_imagenet](image/classification/ghostnet_x0_5_imagenet)|GhostNet|ImageNet-2012||\r\n|[inception_v4_imagenet](image/classification/inception_v4_imagenet)|Inception_V4|ImageNet-2012||\r\n|[resnext101_vd_64x4d_imagenet](image/classification/resnext101_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\r\n|[densenet201_imagenet](image/classification/densenet201_imagenet)|DenseNet|ImageNet-2012||\r\n|[vgg16_imagenet](image/classification/vgg16_imagenet)|VGG|ImageNet-2012||\r\n|[mobilenet_v3_small_imagenet_ssld](image/classification/mobilenet_v3_small_imagenet_ssld)|Mobilenet_v3_Small|ImageNet-2012||\r\n|[hrnet18_imagenet_ssld](image/classification/hrnet18_imagenet_ssld)|HRNet|ImageNet-2012||\r\n|[resnext152_64x4d_imagenet](image/classification/resnext152_64x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[efficientnetb3_imagenet](image/classification/efficientnetb3_imagenet)|EfficientNet|ImageNet-2012||\r\n|[efficientnetb2_imagenet](image/classification/efficientnetb2_imagenet)|EfficientNet|ImageNet-2012||\r\n|[repvgg_b1g4_imagenet](image/classification/repvgg_b1g4_imagenet)|RepVGG|ImageNet-2012||\r\n|[resnext101_32x4d_imagenet](image/classification/resnext101_32x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[resnext50_32x4d_imagenet](image/classification/resnext50_32x4d_imagenet)|ResNeXt|ImageNet-2012||\r\n|[repvgg_a2_imagenet](image/classification/repvgg_a2_imagenet)|RepVGG|ImageNet-2012||\r\n|[resnext152_vd_64x4d_imagenet](image/classification/resnext152_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\r\n|[xception41_imagenet](image/classification/xception41_imagenet)|Xception|ImageNet-2012||\r\n|[googlenet_imagenet](image/classification/googlenet_imagenet)|GoogleNet|ImageNet-2012||\r\n|[resnet50_vd_imagenet_ssld](image/classification/resnet50_vd_imagenet_ssld)|ResNet_vd|ImageNet-2012||\r\n|[repvgg_b1_imagenet](image/classification/repvgg_b1_imagenet)|RepVGG|ImageN
et-2012||\r\n|[repvgg_b0_imagenet](image/classification/repvgg_b0_imagenet)|RepVGG|ImageNet-2012||\r\n|[resnet_v2_50_imagenet](image/classification/resnet_v2_50_imagenet)|ResNet V2|ImageNet-2012||\r\n|[rexnet_1_0_imagenet](image/classification/rexnet_1_0_imagenet)|ReXNet|ImageNet-2012||\r\n|[resnet_v2_18_imagenet](image/classification/resnet_v2_18_imagenet)|ResNet V2|ImageNet-2012||\r\n|[resnext101_32x8d_wsl](image/classification/resnext101_32x8d_wsl)|ResNeXt_wsl|ImageNet-2012||\r\n|[efficientnetb4_imagenet](image/classification/efficientnetb4_imagenet)|EfficientNet|ImageNet-2012||\r\n|[efficientnetb5_imagenet](image/classification/efficientnetb5_imagenet)|EfficientNet|ImageNet-2012||\r\n|[repvgg_b1g2_imagenet](image/classification/repvgg_b1g2_imagenet)|RepVGG|ImageNet-2012||\r\n|[resnext101_32x48d_wsl](image/classification/resnext101_32x48d_wsl)|ResNeXt_wsl|ImageNet-2012||\r\n|[resnet50_vd_wildanimals](image/classification/resnet50_vd_wildanimals)|ResNet_vd|IFAW 自建野生动物数据集||\r\n|[nasnet_imagenet](image/classification/nasnet_imagenet)|NASNet|ImageNet-2012||\r\n|[se_resnet18_vd_imagenet](image/classification/se_resnet18_vd_imagenet)||||\r\n|[spinalnet_res50_gemstone](image/classification/spinalnet_res50_gemstone)|resnet50|gemstone||\r\n|[resnext50_vd_64x4d_imagenet](image/classification/resnext50_vd_64x4d_imagenet)|ResNeXt_vd|ImageNet-2012||\r\n|[resnext101_32x32d_wsl](image/classification/resnext101_32x32d_wsl)|ResNeXt_wsl|ImageNet-2012||\r\n|[dpn131_imagenet](image/classification/dpn131_imagenet)|DPN|ImageNet-2012||\r\n|[xception65_imagenet](image/classification/xception65_imagenet)|Xception|ImageNet-2012||\r\n|[repvgg_b3g4_imagenet](image/classification/repvgg_b3g4_imagenet)|RepVGG|ImageNet-2012||\r\n|[marine_biometrics](image/classification/marine_biometrics)|ResNet50_vd_ssld|Fish4Knowledge||\r\n|[res2net101_vd_26w_4s_imagenet](image/classification/res2net101_vd_26w_4s_imagenet)|Res2Net|ImageNet-2012||\r\n|[dpn98_imagenet](image/classification/dpn98_imagenet)|DPN|
ImageNet-2012||\r\n|[resnet18_vd_imagenet](image/classification/resnet18_vd_imagenet)|ResNet_vd|ImageNet-2012||\r\n|[densenet121_imagenet](image/classification/densenet121_imagenet)|DenseNet|ImageNet-2012||\r\n|[vgg11_imagenet](image/classification/vgg11_imagenet)|VGG|ImageNet-2012||\r\n|[hrnet44_imagenet](image/classification/hrnet44_imagenet)|HRNet|ImageNet-2012||\r\n|[densenet169_imagenet](image/classification/densenet169_imagenet)|DenseNet|ImageNet-2012||\r\n|[hrnet32_imagenet](image/classification/hrnet32_imagenet)|HRNet|ImageNet-2012||\r\n|[dpn92_imagenet](image/classification/dpn92_imagenet)|DPN|ImageNet-2012||\r\n|[ghostnet_x1_0_imagenet](image/classification/ghostnet_x1_0_imagenet)|GhostNet|ImageNet-2012||\r\n|[hrnet48_imagenet_ssld](image/classification/hrnet48_imagenet_ssld)|HRNet|ImageNet-2012||\r\n\r\n</div></details>\r\n\r\n\r\n  - ### 图像生成\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[pixel2style2pixel](image/Image_gan/gan/pixel2style2pixel/)|Pixel2Style2Pixel|-|人脸转正|\r\n|[stgan_bald](image/Image_gan/gan/stgan_bald/)|STGAN|CelebA|秃头生成器|\r\n|[styleganv2_editing](image/Image_gan/gan/styleganv2_editing)|StyleGAN 
V2|-|人脸编辑|\r\n|[wav2lip](image/Image_gan/gan/wav2lip)|wav2lip|LRS2|唇形生成|\r\n|[attgan_celeba](image/Image_gan/attgan_celeba/)|AttGAN|Celeba|人脸编辑|\r\n|[cyclegan_cityscapes](image/Image_gan/cyclegan_cityscapes)|CycleGAN|Cityscapes|实景图和语义分割结果互相转换|\r\n|[stargan_celeba](image/Image_gan/stargan_celeba)|StarGAN|Celeba|人脸编辑|\r\n|[stgan_celeba](image/Image_gan/stgan_celeba/)|STGAN|Celeba|人脸编辑|\r\n|[ID_Photo_GEN](image/Image_gan/style_transfer/ID_Photo_GEN)|HRNet_W18|-|证件照生成|\r\n|[Photo2Cartoon](image/Image_gan/style_transfer/Photo2Cartoon)|U-GAT-IT|cartoon_data|人脸卡通化|\r\n|[U2Net_Portrait](image/Image_gan/style_transfer/U2Net_Portrait)|U^2Net|-|人脸素描化|\r\n|[UGATIT_100w](image/Image_gan/style_transfer/UGATIT_100w)|U-GAT-IT|selfie2anime|人脸动漫化|\r\n|[UGATIT_83w](image/Image_gan/style_transfer/UGATIT_83w)|U-GAT-IT|selfie2anime|人脸动漫化|\r\n|[UGATIT_92w](image/Image_gan/style_transfer/UGATIT_92w)| U-GAT-IT|selfie2anime|人脸动漫化|\r\n|[animegan_v1_hayao_60](image/Image_gan/style_transfer/animegan_v1_hayao_60)|AnimeGAN|The Wind Rises|图像风格迁移-宫崎骏|\r\n|[animegan_v2_hayao_64](image/Image_gan/style_transfer/animegan_v2_hayao_64)|AnimeGAN|The Wind Rises|图像风格迁移-宫崎骏|\r\n|[animegan_v2_hayao_99](image/Image_gan/style_transfer/animegan_v2_hayao_99)|AnimeGAN|The Wind Rises|图像风格迁移-宫崎骏|\r\n|[animegan_v2_paprika_54](image/Image_gan/style_transfer/animegan_v2_paprika_54)|AnimeGAN|Paprika|图像风格迁移-今敏|\r\n|[animegan_v2_paprika_74](image/Image_gan/style_transfer/animegan_v2_paprika_74)|AnimeGAN|Paprika|图像风格迁移-今敏|\r\n|[animegan_v2_paprika_97](image/Image_gan/style_transfer/animegan_v2_paprika_97)|AnimeGAN|Paprika|图像风格迁移-今敏|\r\n|[animegan_v2_paprika_98](image/Image_gan/style_transfer/animegan_v2_paprika_98)|AnimeGAN|Paprika|图像风格迁移-今敏|\r\n|[animegan_v2_shinkai_33](image/Image_gan/style_transfer/animegan_v2_shinkai_33)|AnimeGAN|Your Name, Weathering with you|图像风格迁移-新海诚|\r\n|[animegan_v2_shinkai_53](image/Image_gan/style_transfer/animegan_v2_shinkai_53)|AnimeGAN|Your Name, Weathering with 
you|图像风格迁移-新海诚|\r\n|[msgnet](image/Image_gan/style_transfer/msgnet)|msgnet|COCO2014|图像风格迁移|\r\n|[stylepro_artistic](image/Image_gan/style_transfer/stylepro_artistic)|StyleProNet|MS-COCO + WikiArt|艺术风格迁移|\r\n|stylegan_ffhq|StyleGAN|FFHQ|图像风格迁移|\r\n\r\n  - ### 关键点检测\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[face_landmark_localization](image/keypoint_detection/face_landmark_localization)|Face_Landmark|AFW/AFLW|人脸关键点检测|\r\n|[hand_pose_localization](image/keypoint_detection/hand_pose_localization)|-|MPII, NZSL|手部关键点检测|\r\n|[openpose_body_estimation](image/keypoint_detection/openpose_body_estimation)|two-branch multi-stage CNN|MPII, COCO 2016|肢体关键点检测|\r\n|[human_pose_estimation_resnet50_mpii](image/keypoint_detection/human_pose_estimation_resnet50_mpii)|Pose_Resnet50|MPII|人体骨骼关键点检测|\r\n|[openpose_hands_estimation](image/keypoint_detection/openpose_hands_estimation)|-|MPII, NZSL|手部关键点检测|\r\n\r\n  - ### 图像分割\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[deeplabv3p_xception65_humanseg](image/semantic_segmentation/deeplabv3p_xception65_humanseg)|deeplabv3p|百度自建数据集|人像分割|\r\n|[humanseg_server](image/semantic_segmentation/humanseg_server)|deeplabv3p|百度自建数据集|人像分割|\r\n|[humanseg_mobile](image/semantic_segmentation/humanseg_mobile)|hrnet|百度自建数据集|人像分割-移动端前置摄像头|\r\n|[humanseg_lite](image/semantic_segmentation/humanseg_lite)|shufflenet|百度自建数据集|轻量级人像分割-移动端实时|\r\n|[ExtremeC3_Portrait_Segmentation](image/semantic_segmentation/ExtremeC3_Portrait_Segmentation)|ExtremeC3|EG1800, Baidu fashion dataset|轻量化人像分割|\r\n|[SINet_Portrait_Segmentation](image/semantic_segmentation/SINet_Portrait_Segmentation)|SINet|EG1800, Baidu fashion 
dataset|轻量化人像分割|\r\n|[FCN_HRNet_W18_Face_Seg](image/semantic_segmentation/FCN_HRNet_W18_Face_Seg)|FCN_HRNet_W18|-|人像分割|\r\n|[ace2p](image/semantic_segmentation/ace2p)|ACE2P|LIP|人体解析|\r\n|[Pneumonia_CT_LKM_PP](image/semantic_segmentation/Pneumonia_CT_LKM_PP)|U-NET+|连心医疗授权脱敏数据集|肺炎CT影像分析|\r\n|[Pneumonia_CT_LKM_PP_lung](image/semantic_segmentation/Pneumonia_CT_LKM_PP_lung)|U-NET+|连心医疗授权脱敏数据集|肺炎CT影像分析|\r\n|[ocrnet_hrnetw18_voc](image/semantic_segmentation/ocrnet_hrnetw18_voc)|ocrnet, hrnet|PascalVoc2012|\r\n|[U2Net](image/semantic_segmentation/U2Net)|U^2Net|-|图像前景背景分割|\r\n|[U2Netp](image/semantic_segmentation/U2Netp)|U^2Net|-|图像前景背景分割|\r\n|[Extract_Line_Draft](image/semantic_segmentation/Extract_Line_Draft)|UNet|Pixiv|线稿提取|\r\n|[unet_cityscapes](image/semantic_segmentation/unet_cityscapes)|UNet|cityscapes|\r\n|[ocrnet_hrnetw18_cityscapes](image/semantic_segmentation/ocrnet_hrnetw18_cityscapes)|ocrnet_hrnetw18|cityscapes|\r\n|[hardnet_cityscapes](image/semantic_segmentation/hardnet_cityscapes)|hardnet|cityscapes|\r\n|[fcn_hrnetw48_voc](image/semantic_segmentation/fcn_hrnetw48_voc)|fcn_hrnetw48|PascalVoc2012|\r\n|[fcn_hrnetw48_cityscapes](image/semantic_segmentation/fcn_hrnetw48_cityscapes)|fcn_hrnetw48|cityscapes|\r\n|[fcn_hrnetw18_voc](image/semantic_segmentation/fcn_hrnetw18_voc)|fcn_hrnetw18|PascalVoc2012|\r\n|[fcn_hrnetw18_cityscapes](image/semantic_segmentation/fcn_hrnetw18_cityscapes)|fcn_hrnetw18|cityscapes|\r\n|[fastscnn_cityscapes](image/semantic_segmentation/fastscnn_cityscapes)|fastscnn|cityscapes|\r\n|[deeplabv3p_resnet50_voc](image/semantic_segmentation/deeplabv3p_resnet50_voc)|deeplabv3p, resnet50|PascalVoc2012|\r\n|[deeplabv3p_resnet50_cityscapes](image/semantic_segmentation/deeplabv3p_resnet50_cityscapes)|deeplabv3p, resnet50|cityscapes|\r\n|[bisenetv2_cityscapes](image/semantic_segmentation/bisenetv2_cityscapes)|bisenetv2|cityscapes|\r\n\r\n\r\n\r\n  - ### 
人脸检测\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[pyramidbox_lite_mobile](image/face_detection/pyramidbox_lite_mobile)|PyramidBox|WIDER FACE数据集 + 百度自采人脸数据集|轻量级人脸检测-移动端|\r\n|[pyramidbox_lite_mobile_mask](image/face_detection/pyramidbox_lite_mobile_mask)|PyramidBox|WIDER FACE数据集 + 百度自采人脸数据集|轻量级人脸口罩检测-移动端|\r\n|[pyramidbox_lite_server_mask](image/face_detection/pyramidbox_lite_server_mask)|PyramidBox|WIDER FACE数据集 + 百度自采人脸数据集|轻量级人脸口罩检测|\r\n|[ultra_light_fast_generic_face_detector_1mb_640](image/face_detection/ultra_light_fast_generic_face_detector_1mb_640)|Ultra-Light-Fast-Generic-Face-Detector-1MB|WIDER FACE数据集|轻量级通用人脸检测-低算力设备|\r\n|[ultra_light_fast_generic_face_detector_1mb_320](image/face_detection/ultra_light_fast_generic_face_detector_1mb_320)|Ultra-Light-Fast-Generic-Face-Detector-1MB|WIDER FACE数据集|轻量级通用人脸检测-低算力设备|\r\n|[pyramidbox_lite_server](image/face_detection/pyramidbox_lite_server)|PyramidBox|WIDER FACE数据集 + 百度自采人脸数据集|轻量级人脸检测|\r\n|[pyramidbox_face_detection](image/face_detection/pyramidbox_face_detection)|PyramidBox|WIDER FACE数据集|人脸检测|\r\n\r\n  - ### 文字识别\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[chinese_ocr_db_crnn_mobile](image/text_recognition/chinese_ocr_db_crnn_mobile)|Differentiable Binarization+RCNN|icdar2015数据集|中文文字识别|\r\n|[chinese_text_detection_db_mobile](image/text_recognition/chinese_text_detection_db_mobile)|Differentiable Binarization|icdar2015数据集|中文文本检测|\r\n|[chinese_text_detection_db_server](image/text_recognition/chinese_text_detection_db_server)|Differentiable Binarization|icdar2015数据集|中文文本检测|\r\n|[chinese_ocr_db_crnn_server](image/text_recognition/chinese_ocr_db_crnn_server)|Differentiable Binarization+RCNN|icdar2015数据集|中文文字识别|\r\n|[Vehicle_License_Plate_Recognition](image/text_recognition/Vehicle_License_Plate_Recognition)|-|CCPD|车牌识别|\r\n|[chinese_cht_ocr_db_crnn_mobile](image/text_recognition/chinese_cht_ocr_db_crnn_mobile)|Differentiable 
Binarization+CRNN|icdar2015数据集|繁体中文文字识别|\r\n|[japan_ocr_db_crnn_mobile](image/text_recognition/japan_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|日文文字识别|\r\n|[korean_ocr_db_crnn_mobile](image/text_recognition/korean_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|韩文文字识别|\r\n|[german_ocr_db_crnn_mobile](image/text_recognition/german_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|德文文字识别|\r\n|[french_ocr_db_crnn_mobile](image/text_recognition/french_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|法文文字识别|\r\n|[latin_ocr_db_crnn_mobile](image/text_recognition/latin_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|拉丁文文字识别|\r\n|[cyrillic_ocr_db_crnn_mobile](image/text_recognition/cyrillic_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|斯拉夫文文字识别|\r\n|[multi_languages_ocr_db_crnn](image/text_recognition/multi_languages_ocr_db_crnn)|Differentiable Binarization+RCNN|icdar2015数据集|多语言文字识别|\r\n|[kannada_ocr_db_crnn_mobile](image/text_recognition/kannada_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|卡纳达文文字识别|\r\n|[arabic_ocr_db_crnn_mobile](image/text_recognition/arabic_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|阿拉伯文文字识别|\r\n|[telugu_ocr_db_crnn_mobile](image/text_recognition/telugu_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|泰卢固文文字识别|\r\n|[devanagari_ocr_db_crnn_mobile](image/text_recognition/devanagari_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|梵文文字识别|\r\n|[tamil_ocr_db_crnn_mobile](image/text_recognition/tamil_ocr_db_crnn_mobile)|Differentiable Binarization+CRNN|icdar2015数据集|泰米尔文文字识别|\r\n\r\n\r\n  - ### 图像编辑\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[realsr](image/Image_editing/super_resolution/realsr)|LP-KPN|RealSR dataset|图像/视频超分-4倍|\r\n|[deoldify](image/Image_editing/colorization/deoldify)|GAN|ILSVRC 
2012|黑白照片/视频着色|\r\n|[photo_restoration](image/Image_editing/colorization/photo_restoration)|基于deoldify和realsr模型|-|老照片修复|\r\n|[user_guided_colorization](image/Image_editing/colorization/user_guided_colorization)|siggraph|ILSVRC 2012|图像着色|\r\n|[falsr_c](image/Image_editing/super_resolution/falsr_c)|falsr_c| DIV2k|轻量化超分-2倍|\r\n|[dcscn](image/Image_editing/super_resolution/dcscn)|dcscn| DIV2k|轻量化超分-2倍|\r\n|[falsr_a](image/Image_editing/super_resolution/falsr_a)|falsr_a| DIV2k|轻量化超分-2倍|\r\n|[falsr_b](image/Image_editing/super_resolution/falsr_b)|falsr_b|DIV2k|轻量化超分-2倍|\r\n\r\n  - ### 实例分割\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[solov2](image/instance_segmentation/solov2)|-|COCO2014|实例分割|\r\n\r\n  - ### 目标检测\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[faster_rcnn_resnet50_coco2017](image/object_detection/faster_rcnn_resnet50_coco2017)|faster_rcnn|COCO2017||\r\n|[ssd_vgg16_512_coco2017](image/object_detection/ssd_vgg16_512_coco2017)|SSD|COCO2017||\r\n|[faster_rcnn_resnet50_fpn_venus](image/object_detection/faster_rcnn_resnet50_fpn_venus)|faster_rcnn|百度自建数据集|大规模通用目标检测|\r\n|[ssd_vgg16_300_coco2017](image/object_detection/ssd_vgg16_300_coco2017)||||\r\n|[yolov3_resnet34_coco2017](image/object_detection/yolov3_resnet34_coco2017)|YOLOv3|COCO2017||\r\n|[yolov3_darknet53_pedestrian](image/object_detection/yolov3_darknet53_pedestrian)|YOLOv3|百度自建大规模行人数据集|行人检测|\r\n|[yolov3_mobilenet_v1_coco2017](image/object_detection/yolov3_mobilenet_v1_coco2017)|YOLOv3|COCO2017||\r\n|[ssd_mobilenet_v1_pascal](image/object_detection/ssd_mobilenet_v1_pascal)|SSD|PASCAL 
VOC||\r\n|[faster_rcnn_resnet50_fpn_coco2017](image/object_detection/faster_rcnn_resnet50_fpn_coco2017)|faster_rcnn|COCO2017||\r\n|[yolov3_darknet53_coco2017](image/object_detection/yolov3_darknet53_coco2017)|YOLOv3|COCO2017||\r\n|[yolov3_darknet53_vehicles](image/object_detection/yolov3_darknet53_vehicles)|YOLOv3|百度自建大规模车辆数据集|车辆检测|\r\n|[yolov3_darknet53_venus](image/object_detection/yolov3_darknet53_venus)|YOLOv3|百度自建数据集|大规模通用检测|\r\n|[yolov3_resnet50_vd_coco2017](image/object_detection/yolov3_resnet50_vd_coco2017)|YOLOv3|COCO2017||\r\n\r\n  - ### 深度估计\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[MiDaS_Large](image/depth_estimation/MiDaS_Large)|-|3D Movies, WSVD, ReDWeb, MegaDepth||\r\n|[MiDaS_Small](image/depth_estimation/MiDaS_Small)|-|3D Movies, WSVD, ReDWeb, MegaDepth, etc.||\r\n\r\n  - ### 文生图\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[disco_diffusion_clip_rn101](image/text_to_image/disco_diffusion_clip_rn101)|-|开放域多轮数据集|文生图|\r\n|[ernie_vilg](image/text_to_image/ernie_vilg)|-|开放域多轮数据集|文生图|\r\n|[stable_diffusion_img2img](image/text_to_image/stable_diffusion_img2img)|-|开放域多轮数据集|图生图|\r\n\r\n\r\n\r\n\r\n\r\n## 文本\r\n  - ### 文本生成\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[ernie_gen](text/text_generation/ernie_gen)|ERNIE-GEN|-|面向生成任务的预训练-微调框架|\r\n|[ernie_gen_poetry](text/text_generation/ernie_gen_poetry)|ERNIE-GEN|开源诗歌数据集|诗歌生成|\r\n|[ernie_gen_couplet](text/text_generation/ernie_gen_couplet)|ERNIE-GEN|开源对联数据集|对联生成|\r\n|[ernie_gen_lover_words](text/text_generation/ernie_gen_lover_words)|ERNIE-GEN|网络情诗、情话数据|情话生成|\r\n|[ernie_tiny_couplet](text/text_generation/ernie_tiny_couplet)|ernie_tiny|开源对联数据集|对联生成|\r\n|[ernie_gen_acrostic_poetry](text/text_generation/ernie_gen_acrostic_poetry)|ERNIE-GEN|开源诗歌数据集|藏头诗生成|\r\n|[Rumor_prediction](text/text_generation/Rumor_prediction)|-|新浪微博中文谣言数据|谣言预测|\r\n|[plato-mini](text/text_generation/plato-mini)|Unified 
Transformer|十亿级别的中文对话数据|中文对话|\r\n|[plato2_en_large](text/text_generation/plato2_en_large)|plato2|开放域多轮数据集|超大规模生成式对话|\r\n|[plato2_en_base](text/text_generation/plato2_en_base)|plato2|开放域多轮数据集|超大规模生成式对话|\r\n|[CPM_LM](text/text_generation/CPM_LM)|GPT-2|自建数据集|中文文本生成|\r\n|[unified_transformer-12L-cn](text/text_generation/unified_transformer-12L-cn)|Unified Transformer|千万级别中文会话数据|人机多轮对话|\r\n|[unified_transformer-12L-cn-luge](text/text_generation/unified_transformer-12L-cn-luge)|Unified Transformer|千言对话数据集|人机多轮对话|\r\n|[reading_pictures_writing_poems](text/text_generation/reading_pictures_writing_poems)|多网络级联|-|看图写诗|\r\n|[GPT2_CPM_LM](text/text_generation/GPT2_CPM_LM)|||问答类文本生成|\r\n|[GPT2_Base_CN](text/text_generation/GPT2_Base_CN)|||问答类文本生成|\r\n\r\n  - ### 词向量\r\n\r\n<details><summary>expand</summary><div>\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[w2v_weibo_target_word-bigram_dim300](text/embedding/w2v_weibo_target_word-bigram_dim300)|w2v|weibo||\r\n|[w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_literature_target_word-word_dim300](text/embedding/w2v_literature_target_word-word_dim300)|w2v|literature||\r\n|[word2vec_skipgram](text/embedding/word2vec_skipgram)|skip-gram|百度自建数据集||\r\n|[w2v_sogou_target_word-char_dim300](text/embedding/w2v_sogou_target_word-char_dim300)|w2v|sogou||\r\n|[w2v_weibo_target_bigram-char_dim300](text/embedding/w2v_weibo_target_bigram-char_dim300)|w2v|weibo||\r\n|[w2v_zhihu_target_word-bigram_dim300](text/embedding/w2v_zhihu_target_word-bigram_dim300)|w2v|zhihu||\r\n|[w2v_financial_target_word-word_dim300](text/embedding/w2v_financial_target_word-word_dim300)|w2v|financial||\r\n|[w2v_wiki_target_word-word_dim300](text/embedding/w2v_wiki_target_word-word_dim300)|w2v|wiki||\r\n|[w2v_baidu_encyclopedia_context_word-word_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-word_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_weibo_target_wo
rd-word_dim300](text/embedding/w2v_weibo_target_word-word_dim300)|w2v|weibo||\r\n|[w2v_zhihu_target_bigram-char_dim300](text/embedding/w2v_zhihu_target_bigram-char_dim300)|w2v|zhihu||\r\n|[w2v_zhihu_target_word-word_dim300](text/embedding/w2v_zhihu_target_word-word_dim300)|w2v|zhihu||\r\n|[w2v_people_daily_target_word-char_dim300](text/embedding/w2v_people_daily_target_word-char_dim300)|w2v|people_daily||\r\n|[w2v_sikuquanshu_target_word-word_dim300](text/embedding/w2v_sikuquanshu_target_word-word_dim300)|w2v|sikuquanshu||\r\n|[glove_twitter_target_word-word_dim200_en](text/embedding/glove_twitter_target_word-word_dim200_en)|glove|twitter||\r\n|[fasttext_crawl_target_word-word_dim300_en](text/embedding/fasttext_crawl_target_word-word_dim300_en)|fasttext|crawl||\r\n|[w2v_wiki_target_word-bigram_dim300](text/embedding/w2v_wiki_target_word-bigram_dim300)|w2v|wiki||\r\n|[w2v_baidu_encyclopedia_context_word-character_char1-1_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300)|w2v|baidu_encyclopedia||\r\n|[glove_wiki2014-gigaword_target_word-word_dim300_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim300_en)|glove|wiki2014-gigaword||\r\n|[glove_wiki2014-gigaword_target_word-word_dim50_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim50_en)|glove|wiki2014-gigaword||\r\n|[w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_wiki_target_bigram-char_dim300](text/embedding/w2v_wiki_target_bigram-char_dim300)|w2v|wiki||\r\n|[w2v_baidu_encyclopedia_target_word-character_char1-1_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_financial_target_bigram-char_dim300](text/embedding/w2v_financial_target_bigram-char_dim300)|w2v|financial||\r\n|[glove_wiki2014-gigaword_target_word-word_dim200_en](text/embedding/glove_wiki2014-gigaword_target_word-word_d
im200_en)|glove|wiki2014-gigaword||\r\n|[w2v_financial_target_word-bigram_dim300](text/embedding/w2v_financial_target_word-bigram_dim300)|w2v|financial||\r\n|[w2v_mixed-large_target_word-char_dim300](text/embedding/w2v_mixed-large_target_word-char_dim300)|w2v|mixed||\r\n|[w2v_baidu_encyclopedia_target_word-wordPosition_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-wordPosition_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_target_word-wordLR_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-wordLR_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_sogou_target_bigram-char_dim300](text/embedding/w2v_sogou_target_bigram-char_dim300)|w2v|sogou||\r\n|[w2v_weibo_target_word-char_dim300](text/embedding/w2v_weibo_target_word-char_dim300)|w2v|weibo||\r\n|[w2v_people_daily_target_word-word_dim300](text/embedding/w2v_people_daily_target_word-word_dim300)|w2v|people_daily||\r\n|[w2v_zhihu_target_word-char_dim300](text/embedding/w2v_zhihu_target_word-char_dim300)|w2v|zhihu||\r\n|[w2v_wiki_target_word-char_dim300](text/embedding/w2v_wiki_target_word-char_dim300)|w2v|wiki||\r\n|[w2v_sogou_target_word-bigram_dim300](text/embedding/w2v_sogou_target_word-bigram_dim300)|w2v|sogou||\r\n|[w2v_financial_target_word-char_dim300](text/embedding/w2v_financial_target_word-char_dim300)|w2v|financial||\r\n|[w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300)|w2v|baidu_encyclopedia||\r\n|[glove_wiki2014-gigaword_target_word-word_dim100_en](text/embedding/glove_wiki2014-gigaword_target_word-word_dim100_en)|glove|wiki2014-gigaword||\r\n|[w2v_baidu_encyclopedia_target_word-character_char1-4_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_sogou_target_word-word_dim300](text/embedding/w2
v_sogou_target_word-word_dim300)|w2v|sogou||\r\n|[w2v_literature_target_word-char_dim300](text/embedding/w2v_literature_target_word-char_dim300)|w2v|literature||\r\n|[w2v_baidu_encyclopedia_target_bigram-char_dim300](text/embedding/w2v_baidu_encyclopedia_target_bigram-char_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_target_word-word_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-word_dim300)|w2v|baidu_encyclopedia||\r\n|[glove_twitter_target_word-word_dim100_en](text/embedding/glove_twitter_target_word-word_dim100_en)|glove|twitter||\r\n|[w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_context_word-character_char1-4_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-4_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_literature_target_bigram-char_dim300](text/embedding/w2v_literature_target_bigram-char_dim300)|w2v|literature||\r\n|[fasttext_wiki-news_target_word-word_dim300_en](text/embedding/fasttext_wiki-news_target_word-word_dim300_en)|fasttext|wiki-news||\r\n|[w2v_people_daily_target_word-bigram_dim300](text/embedding/w2v_people_daily_target_word-bigram_dim300)|w2v|people_daily||\r\n|[w2v_mixed-large_target_word-word_dim300](text/embedding/w2v_mixed-large_target_word-word_dim300)|w2v|mixed||\r\n|[w2v_people_daily_target_bigram-char_dim300](text/embedding/w2v_people_daily_target_bigram-char_dim300)|w2v|people_daily||\r\n|[w2v_literature_target_word-bigram_dim300](text/embedding/w2v_literature_target_word-bigram_dim300)|w2v|literature||\r\n|[glove_twitter_target_word-word_dim25_en](text/embedding/glove_twitter_target_word-word_dim25_en)|glove|twitter||\r\n|[w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_sikuquanshu_target_word-bigram_
dim300)|w2v|sikuquanshu||\r\n|[w2v_baidu_encyclopedia_context_word-character_char1-2_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300)|w2v|baidu_encyclopedia||\r\n|[glove_twitter_target_word-word_dim50_en](text/embedding/glove_twitter_target_word-word_dim50_en)|glove|twitter||\r\n|[w2v_baidu_encyclopedia_context_word-wordLR_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-wordLR_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_target_word-character_char1-2_dim300](text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300)|w2v|baidu_encyclopedia||\r\n|[w2v_baidu_encyclopedia_context_word-wordPosition_dim300](text/embedding/w2v_baidu_encyclopedia_context_word-wordPosition_dim300)|w2v|baidu_encyclopedia||\r\n\r\n</div></details>\r\n\r\n  - ### 机器翻译\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[transformer_zh-en](text/machine_translation/transformer/zh-en)|Transformer|CWMT2021|中文译英文|\r\n|[transformer_en-de](text/machine_translation/transformer/en-de)|Transformer|WMT14 EN-DE|英文译德文|\r\n\r\n  - ### 
语义模型\r\n\r\n<details><summary>expand</summary><div>\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[chinese_electra_small](text/language_model/chinese_electra_small)||||\r\n|[chinese_electra_base](text/language_model/chinese_electra_base)||||\r\n|[roberta-wwm-ext-large](text/language_model/roberta-wwm-ext-large)|roberta-wwm-ext-large|百度自建数据集||\r\n|[chinese-bert-wwm-ext](text/language_model/chinese_bert_wwm_ext)|chinese-bert-wwm-ext|百度自建数据集||\r\n|[lda_webpage](text/language_model/lda_webpage)|LDA|百度自建网页领域数据集||\r\n|[lda_novel](text/language_model/lda_novel)||||\r\n|[bert-base-multilingual-uncased](text/language_model/bert-base-multilingual-uncased)||||\r\n|[rbt3](text/language_model/rbt3)||||\r\n|[ernie_v2_eng_base](text/language_model/ernie_v2_eng_base)|ernie_v2_eng_base|百度自建数据集||\r\n|[bert-base-multilingual-cased](text/language_model/bert-base-multilingual-cased)||||\r\n|[rbtl3](text/language_model/rbtl3)||||\r\n|[chinese-bert-wwm](text/language_model/chinese_bert_wwm)|chinese-bert-wwm|百度自建数据集||\r\n|[bert-large-uncased](text/language_model/bert-large-uncased)||||\r\n|[slda_novel](text/language_model/slda_novel)||||\r\n|[slda_news](text/language_model/slda_news)||||\r\n|[electra_small](text/language_model/electra_small)||||\r\n|[slda_webpage](text/language_model/slda_webpage)||||\r\n|[bert-base-cased](text/language_model/bert-base-cased)||||\r\n|[slda_weibo](text/language_model/slda_weibo)||||\r\n|[roberta-wwm-ext](text/language_model/roberta-wwm-ext)|roberta-wwm-ext|百度自建数据集||\r\n|[bert-base-uncased](text/language_model/bert-base-uncased)||||\r\n|[electra_large](text/language_model/electra_large)||||\r\n|[ernie](text/language_model/ernie)|ernie-1.0|百度自建数据集||\r\n|[simnet_bow](text/language_model/simnet_bow)|BOW|百度自建数据集||\r\n|[ernie_tiny](text/language_model/ernie_tiny)|ernie_tiny|百度自建数据集||\r\n|[bert-base-chinese](text/language_model/bert-base-chinese)|bert-base-chinese|百度自建数据集||\r\n|[lda_news](text/language_model/lda_news)|LDA|百度自建新闻领域数据集||\r\n|[electra_base](text/lan
guage_model/electra_base)||||\r\n|[ernie_v2_eng_large](text/language_model/ernie_v2_eng_large)|ernie_v2_eng_large|百度自建数据集||\r\n|[bert-large-cased](text/language_model/bert-large-cased)||||\r\n\r\n</div></details>\r\n\r\n\r\n  - ### 情感分析\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[ernie_skep_sentiment_analysis](text/sentiment_analysis/ernie_skep_sentiment_analysis)|SKEP|百度自建数据集|句子级情感分析|\r\n|[emotion_detection_textcnn](text/sentiment_analysis/emotion_detection_textcnn)|TextCNN|百度自建数据集|对话情绪识别|\r\n|[senta_bilstm](text/sentiment_analysis/senta_bilstm)|BiLSTM|百度自建数据集|中文情感倾向分析|\r\n|[senta_bow](text/sentiment_analysis/senta_bow)|BOW|百度自建数据集|中文情感倾向分析|\r\n|[senta_gru](text/sentiment_analysis/senta_gru)|GRU|百度自建数据集|中文情感倾向分析|\r\n|[senta_lstm](text/sentiment_analysis/senta_lstm)|LSTM|百度自建数据集|中文情感倾向分析|\r\n|[senta_cnn](text/sentiment_analysis/senta_cnn)|CNN|百度自建数据集|中文情感倾向分析|\r\n\r\n  - ### 句法分析\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[DDParser](text/syntactic_analysis/DDParser)|Deep Biaffine Attention|搜索query、网页文本、语音输入等数据|句法分析|\r\n\r\n  - ### 同声传译\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[transformer_nist_wait_1](text/simultaneous_translation/stacl/transformer_nist_wait_1)|transformer|NIST 2008-中英翻译数据集|中译英-wait-1策略|\r\n|[transformer_nist_wait_3](text/simultaneous_translation/stacl/transformer_nist_wait_3)|transformer|NIST 2008-中英翻译数据集|中译英-wait-3策略|\r\n|[transformer_nist_wait_5](text/simultaneous_translation/stacl/transformer_nist_wait_5)|transformer|NIST 2008-中英翻译数据集|中译英-wait-5策略|\r\n|[transformer_nist_wait_7](text/simultaneous_translation/stacl/transformer_nist_wait_7)|transformer|NIST 2008-中英翻译数据集|中译英-wait-7策略|\r\n|[transformer_nist_wait_all](text/simultaneous_translation/stacl/transformer_nist_wait_all)|transformer|NIST 2008-中英翻译数据集|中译英-waitk=-1策略|\r\n\r\n\r\n  - ### 
词法分析\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[jieba_paddle](text/lexical_analysis/jieba_paddle)|BiGRU+CRF|百度自建数据集|jieba使用Paddle搭建的切词网络（双向GRU）。同时支持jieba的传统切词方法，如精确模式、全模式、搜索引擎模式等切词模式。|\r\n|[lac](text/lexical_analysis/lac)|BiGRU+CRF|百度自建数据集|百度自研联合的词法分析模型，能整体性地完成中文分词、词性标注、专名识别任务。在百度自建数据集上评测，LAC效果：Precision=88.0%，Recall=88.7%，F1-Score=88.4%。|\r\n\r\n  - ### 标点恢复\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[auto_punc](text/punctuation_restoration/auto_punc)|Ernie-1.0|WuDaoCorpora 2.0|自动添加7种标点符号|\r\n\r\n  - ### 文本审核\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[porn_detection_cnn](text/text_review/porn_detection_cnn)|CNN|百度自建数据集|色情检测，自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别|\r\n|[porn_detection_gru](text/text_review/porn_detection_gru)|GRU|百度自建数据集|色情检测，自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别|\r\n|[porn_detection_lstm](text/text_review/porn_detection_lstm)|LSTM|百度自建数据集|色情检测，自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别|\r\n\r\n## 语音\r\n  - ### 声音克隆\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[ge2e_fastspeech2_pwgan](audio/voice_cloning/ge2e_fastspeech2_pwgan)|FastSpeech2|AISHELL-3|中文语音克隆|\r\n|[lstm_tacotron2](audio/voice_cloning/lstm_tacotron2)|LSTM、Tacotron2、WaveFlow|AISHELL-3|中文语音克隆|\r\n\r\n  - ### 语音合成\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[transformer_tts_ljspeech](audio/tts/transformer_tts_ljspeech)|Transformer|LJSpeech-1.1|英文语音合成|\r\n|[fastspeech_ljspeech](audio/tts/fastspeech_ljspeech)|FastSpeech|LJSpeech-1.1|英文语音合成|\r\n|[fastspeech2_baker](audio/tts/fastspeech2_baker)|FastSpeech2|Chinese Standard Mandarin Speech Corpus|中文语音合成|\r\n|[fastspeech2_ljspeech](audio/tts/fastspeech2_ljspeech)|FastSpeech2|LJSpeech-1.1|英文语音合成|\r\n|[deepvoice3_ljspeech](audio/tts/deepvoice3_ljspeech)|DeepVoice3|LJSpeech-1.1|英文语音合成|\r\n\r\n  - ### 
语音识别\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[deepspeech2_aishell](audio/asr/deepspeech2_aishell)|DeepSpeech2|AISHELL-1|中文语音识别|\r\n|[deepspeech2_librispeech](audio/asr/deepspeech2_librispeech)|DeepSpeech2|LibriSpeech|英文语音识别|\r\n|[u2_conformer_aishell](audio/asr/u2_conformer_aishell)|Conformer|AISHELL-1|中文语音识别|\r\n|[u2_conformer_wenetspeech](audio/asr/u2_conformer_wenetspeech)|Conformer|WenetSpeech|中文语音识别|\r\n|[u2_conformer_librispeech](audio/asr/u2_conformer_librispeech)|Conformer|LibriSpeech|英文语音识别|\r\n\r\n\r\n  - ### 声音分类\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[panns_cnn6](audio/audio_classification/PANNs/cnn6)|PANNs|Google Audioset|主要包含4个卷积层和2个全连接层，模型参数为4.5M。经过预训练后，可以用于提取音频的embedding，维度是512|\r\n|[panns_cnn14](audio/audio_classification/PANNs/cnn14)|PANNs|Google Audioset|主要包含12个卷积层和2个全连接层，模型参数为79.6M。经过预训练后，可以用于提取音频的embedding，维度是2048|\r\n|[panns_cnn10](audio/audio_classification/PANNs/cnn10)|PANNs|Google Audioset|主要包含8个卷积层和2个全连接层，模型参数为4.9M。经过预训练后，可以用于提取音频的embedding，维度是512|\r\n\r\n## 视频\r\n  - ### 视频分类\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[videotag_tsn_lstm](video/classification/videotag_tsn_lstm)|TSN + AttentionLSTM|百度自建数据集|大规模短视频分类打标签|\r\n|[tsn_kinetics400](video/classification/tsn_kinetics400)|TSN|Kinetics-400|视频分类|\r\n|[tsm_kinetics400](video/classification/tsm_kinetics400)|TSM|Kinetics-400|视频分类|\r\n|[stnet_kinetics400](video/classification/stnet_kinetics400)|StNet|Kinetics-400|视频分类|\r\n|[nonlocal_kinetics400](video/classification/nonlocal_kinetics400)|Non-local|Kinetics-400|视频分类|\r\n\r\n\r\n  - ### 视频修复\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[SkyAR](video/Video_editing/SkyAR)|UNet|UNet|视频换天|\r\n\r\n  - ### 多目标追踪\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[fairmot_dla34](video/multiple_object_tracking/fairmot_dla34)|CenterNet|Caltech Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|实时多目标跟踪|\r\n|[jde_darknet53](video/multiple_object_tracking/jde_darknet53)|YOLOv3|Caltech 
Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|多目标跟踪-兼顾精度和速度|\r\n\r\n## 工业应用\r\n\r\n  - ### 表针识别\r\n\r\n|module|网络|数据集|简介|\r\n|--|--|--|--|\r\n|[WatermeterSegmentation](image/semantic_segmentation/WatermeterSegmentation)|DeepLabV3|水表的数字表盘分割数据集|水表的数字表盘分割|\r\n"
  },
  {
    "path": "modules/audio/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【语音合成】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 语音合成\n语音合成（TTS）任务可以实现将文字转化为语音，已经广泛应用于各种语音交互设备中。\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [语音合成transformer_tts_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=transformer_tts_ljspeech&en_category=TextToSpeech) | TransformerTTS 对 Transformer 和 Tacotron2 进行了融合，取得了令人满意的效果，英文TTS模型，仅支持预测。 |\n| [语音合成fastspeech_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=fastspeech_ljspeech&en_category=TextToSpeech) | FastSpeech从基于encoder-decoder结构的teacher model中提取attention对角线来做发音持续时间预测，英文TTS模型，仅支持预测。 |\n| [语音合成deepvoice3_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=deepvoice3_ljspeech&en_category=TextToSpeech) | Deep Voice 3是百度研究院2017年发布的端到端的TTS模型（论文录用于ICLR 2018）。它是一个基于卷积神经网络和注意力机制的seq2seq模型，英文TTS模型，仅支持预测。|\n"
  },
  {
    "path": "modules/audio/README_en.md",
    "content": "## **For better user experience, refer to the official documentation on WEB -> [Text-to-speech](https://www.paddlepaddle.org.cn/hublist)**\n\n### Text-to-speech\n\nThe text-to-speech (TTS) task converts text into speech and is widely used in a variety of voice-interactive devices.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Text-to-speech transformer\\_tts\\_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=transformer_tts_ljspeech&en_category=TextToSpeech) | TransformerTTS fuses Transformer and Tacotron2 and achieves satisfactory results. It is an English TTS model and supports prediction only. |\n| [Text-to-speech fastspeech\\_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=fastspeech_ljspeech&en_category=TextToSpeech) | FastSpeech predicts pronunciation duration from the attention diagonal extracted from an encoder-decoder teacher model. It is an English TTS model and supports prediction only. |\n| [Text-to-speech deepvoice3\\_ljspeech](https://www.paddlepaddle.org.cn/hubdetail?name=deepvoice3_ljspeech&en_category=TextToSpeech) | Deep Voice 3 is an end-to-end TTS model released by Baidu Research in 2017 (paper accepted at ICLR 2018). It is a seq2seq model based on a convolutional neural network and attention mechanism. It is an English TTS model and supports prediction only. |\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/README.md",
    "content": "# deepspeech2_aishell\n\n|模型名称|deepspeech2_aishell|\n| :--- | :---: |\n|类别|语音-语音识别|\n|网络|DeepSpeech2|\n|数据集|AISHELL-1|\n|是否支持Fine-tuning|否|\n|模型大小|306MB|\n|最新更新日期|2021-10-20|\n|数据指标|中文CER 0.065|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nDeepSpeech2是百度于2015年提出的适用于英文和中文的end-to-end语音识别模型。deepspeech2_aishell使用了DeepSpeech2离线模型的结构，模型主要由2层卷积网络和3层GRU组成，并在中文普通话开源语音数据集[AISHELL-1](http://www.aishelltech.com/kysjcp)上进行了预训练，该模型在其测试集上的CER指标是0.065。\n\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/Hub/docs/images/ds2offlineModel.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[Deep Speech 2: End-to-End Speech Recognition in English and Mandarin](https://arxiv.org/abs/1512.02595)\n\n## 二、安装\n\n- ### 1、系统依赖\n\n  - libsndfile, swig >= 3.0\n    - Linux\n      ```shell\n      $ sudo apt-get install libsndfile swig\n      # or\n      $ sudo yum install libsndfile swig\n      ```\n    - macOS\n      ```\n      $ brew install libsndfile swig\n      ```\n\n- ### 2、环境依赖\n  - swig_decoder:\n    ```\n    git clone https://github.com/PaddlePaddle/DeepSpeech.git && cd DeepSpeech && git reset --hard b53171694e7b87abe7ea96870b2f4d8e0e2b1485 && cd deepspeech/decoders/ctcdecoder/swig && sh setup.sh\n    ```\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install deepspeech2_aishell\n    ```\n  - 如您安装时遇到问题，可参考：[零基础Windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 采样率为16k，格式为wav的中文语音音频\n    wav_file = '/PATH/TO/AUDIO'\n\n    model = hub.Module(\n        name='deepspeech2_aishell',\n        version='1.0.0')\n    text = model.speech_recognize(wav_file)\n\n    print(text)\n    ```\n\n- ### 2、API\n  - ```python\n    def check_audio(audio_file)\n    ```\n    - 检查输入音频的格式和采样率是否满足16000Hz的要求\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n\n  - ```python\n    def speech_recognize(\n        audio_file,\n        device='cpu',\n    )\n    ```\n    - 将输入的音频识别成文字\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `text`：str类型，返回输入音频的识别文字结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m deepspeech2_aishell\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如使用CPU预测则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要识别的音频的存放路径，确保部署服务的机器可访问\n    file = '/path/to/input.wav'\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"audio_file\"\n    data = {\"audio_file\": file}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/deepspeech2_aishell\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install deepspeech2_aishell\n  ```\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/assets/conf/augmentation.json",
    "content": "{}\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/assets/conf/deepspeech2.yaml",
    "content": "# https://yaml.org/type/float.html\ndata:\n  train_manifest: data/manifest.train\n  dev_manifest: data/manifest.dev\n  test_manifest: data/manifest.test\n  min_input_len: 0.0\n  max_input_len: 27.0 # second\n  min_output_len: 0.0\n  max_output_len: .inf\n  min_output_input_ratio: 0.00\n  max_output_input_ratio: .inf\n\ncollator:\n  batch_size: 64 # one gpu\n  mean_std_filepath: data/mean_std.json\n  unit_type: char\n  vocab_filepath: data/vocab.txt\n  augmentation_config: conf/augmentation.json\n  random_seed: 0\n  spm_model_prefix:\n  spectrum_type: linear\n  feat_dim:\n  delta_delta: False\n  stride_ms: 10.0\n  window_ms: 20.0\n  n_fft: None\n  max_freq: None\n  target_sample_rate: 16000\n  use_dB_normalization: True\n  target_dB: -20\n  dither: 1.0\n  keep_transcription_text: False\n  sortagrad: True\n  shuffle_method: batch_shuffle\n  num_workers: 2\n\nmodel:\n  num_conv_layers: 2\n  num_rnn_layers: 3\n  rnn_layer_size: 1024\n  use_gru: True\n  share_rnn_weights: False\n  blank_id: 0\n  ctc_grad_norm_type: instance\n\ntraining:\n  n_epoch: 80\n  accum_grad: 1\n  lr: 2e-3\n  lr_decay: 0.83\n  weight_decay: 1e-06\n  global_grad_clip: 3.0\n  log_interval: 100\n  checkpoint:\n    kbest_n: 50\n    latest_n: 5\n\ndecoding:\n  batch_size: 128\n  error_rate_type: cer\n  decoding_method: ctc_beam_search\n  lang_model_path: data/lm/zh_giga.no_cna_cmn.prune01244.klm\n  alpha: 1.9\n  beta: 5.0\n  beam_size: 300\n  cutoff_prob: 0.99\n  cutoff_top_n: 40\n  num_proc_bsearch: 10\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/assets/data/mean_std.json",
    "content": "{\"mean_stat\": [-13505966.65209869, -12778154.889588555, -13487728.30750011, -12897344.94123812, -12472281.490772562, -12631566.475106332, -13391790.349327326, -14045382.570026815, -14159320.465516506, -14273422.438486755, -14639805.161347123, -15145380.07768254, -15612893.133258691, -15938542.05012206, -16115293.502621327, -16188225.698757892, -16317206.280373082, -16500598.476283036, -16671564.297937019, -16804599.860397574, -16916423.142814968, -17011785.59439087, -17075067.62262626, -17154580.16740178, -17257812.961825978, -17355683.228599995, -17441455.258318607, -17473199.925130684, -17488835.5763828, -17491232.15414511, -17485000.29006962, -17499471.646940477, -17551398.97122984, -17641732.10682403, -17757209.077974595, -17843801.500521667, -17935647.58641936, -18020362.347413756, -18117633.806080323, -18232427.58935143, -18316024.35215119, -18378789.145393644, -18421147.25807373, -18445805.18294822, -18460946.27810118, -18467914.04034822, -18469404.319909714, -18469606.974339806, -18470754.294192698, -18458320.91921723, -18441354.111811973, -18428332.216321833, -18422281.413955193, -18433421.585668042, -18460521.025954794, -18494800.856363494, -18539532.288011573, -18583823.79899225, -18614474.56256926, -18646872.180154275, -18661137.85367877, -18673590.719379324, -18702967.62040798, -18736434.748098046, -18777912.13098326, -18794675.486509323, -18837225.856196072, -18874872.796128694, -18927340.44407057, -18994929.076545004, -19060701.164406348, -19118006.18996682, -19175792.05766062, -19230755.996405277, -19270174.594219487, -19334788.35904946, -19401456.988906194, -19484580.095938426, -19582040.4715673, -19696598.86662636, -19810401.513227757, -19931755.37941177, -20021867.47620737, -20082298.984455004, -20114708.336475413, -20143802.72793865, -20146821.988139726, -20165613.317683898, -20189938.602584295, -20220059.08673595, -20242848.528134122, -20250859.979931064, -20267382.93048284, -20267964.544716164, -20261372.89563879, 
-20252878.74023849, -20247550.771284755, -20231778.31093504, -20231376.103159923, -20236926.52293088, -20248068.41488535, -20255076.901920393, -20262924.167151034, -20263926.583205637, -20263790.273742784, -20268560.080967404, -20268997.150654405, -20269810.816284582, -20267771.864327505, -20256472.703380838, -20241790.559690386, -20241865.794732895, -20244924.716114976, -20249736.631184842, -20257257.816903576, -20268027.212145977, -20277399.95533857, -20281840.8112546, -20270512.52002465, -20255938.63066214, -20242421.685443826, -20241986.654626504, -20237836.034444932, -20231458.31132546, -20218092.819713395, -20204994.19634715, -20198880.142133974, -20197376.49014031, -20198117.60450857, -20197443.473929476, -20191142.03632657, -20174428.452719454, -20159204.32090646, -20137981.294740904, -20124944.79897834, -20112774.604521394, -20109389.248600915, -20115248.61302806, -20117743.853294585, -20123076.93515528, -20132224.95454374, -20147099.26793121, -20169581.367630124, -20190957.518733896, -20215197.057997894, -20242033.589256056, -20282032.217160087, -20316778.653784916, -20360354.215504933, -20425089.908502825, -20534553.0465662, -20737928.349233944, -21091705.14104186, -21646013.197923105, -22403182.076235127, -23313516.63322832, -24244679.879594248, -25027534.00417361, -25502455.708560493, -25665136.744125813, -26602318.88405537], \"var_stat\": [209924783.1093623, 185218712.4577822, 209991180.89829063, 196198511.40798286, 186098265.7827955, 191905798.58923203, 214281935.29191792, 235042114.51049897, 240179456.24597096, 244657890.3963041, 256099586.32657292, 271849135.9872555, 287174069.13527167, 298171137.28863454, 304112589.91933817, 306553976.2206335, 310813670.30674237, 316958840.3099824, 322651440.3639528, 327213725.196089, 331252123.26114285, 334856188.3081607, 337217897.6545214, 340385427.82557064, 344400488.5633641, 348086880.08086526, 351349070.53148264, 352648076.18415344, 353409462.33704513, 353598061.4967693, 353405322.74993587, 
353917215.6834277, 355784796.898883, 359222461.3224974, 363671441.7428676, 366908651.69908494, 370304677.0615045, 373477194.79721, 377174088.9808273, 381531608.6574547, 384703574.426059, 387104126.9474883, 388723211.11308575, 389687817.27351815, 390351031.4418706, 390659006.3690262, 390704649.89417714, 390702370.1919126, 390731862.59274197, 390216004.4126628, 389516083.054853, 389017745.636457, 388788872.1127645, 389269311.2239042, 390401819.5968815, 391842612.97859454, 393708801.05223197, 395569598.4694, 396868892.67152405, 398210915.02133286, 398743299.4753882, 399330344.88417244, 400565940.1325846, 401901693.4656316, 403513855.43933284, 404103248.96526104, 405986814.274556, 407507145.4104169, 409598353.6517908, 412453848.0248063, 415138273.0558441, 417479272.96907294, 419785633.3276395, 422003065.1681787, 423610264.8868346, 426260552.96545905, 428973536.3620236, 432368654.40899384, 436359561.5468266, 441119512.777527, 445884989.25794005, 451037422.65838546, 454872292.24179226, 457497136.8780015, 458904066.0675219, 460155836.4432799, 460272943.80738074, 461087498.6828549, 462144907.7850926, 463483598.81228757, 464530694.44478536, 464971538.85301507, 465771535.6019992, 465936698.93801653, 465741012.7287712, 465448625.0011534, 465296363.8603534, 464718299.2207512, 464720391.25778216, 465016640.5248736, 465564374.0248998, 465982788.8695927, 466425068.01245564, 466595649.90489674, 466707658.8296169, 467015570.78026086, 467099213.08769494, 467201640.15951264, 467163862.3709329, 466727597.56313753, 466174871.71213347, 466255498.45248336, 466439062.65458614, 466693130.99620277, 467068587.1422199, 467536070.1402474, 467955819.1549621, 468187227.1069643, 467742976.2778335, 467159585.250493, 466592359.52916145, 466583195.8099961, 466424348.9572719, 466155323.6074322, 465569620.1801811, 465021642.5158305, 464757658.6383867, 464713882.60103834, 464724239.2941314, 464679163.728191, 464407007.8705965, 463660736.0136739, 463001339.2385198, 462077058.47595775, 
461505071.67199403, 460946277.95973784, 460816158.9197017, 461123589.268546, 461232998.1572812, 461445601.0442877, 461803238.28569543, 462436966.22005004, 463391404.7434971, 464299608.85523456, 465319405.3931429, 466432961.70208246, 468168080.3331244, 469640808.6809098, 471501539.22440934, 474301795.1694898, 479155711.93441755, 488314271.10405815, 504537056.23994666, 530509400.5201074, 566892036.4437443, 611792826.0442055, 658913502.9004005, 699716882.9169292, 725237302.8248898, 734259159.9571886, 789267050.8287783], \"frame_num\": 899422}\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/assets/data/vocab.txt",
    "content": "<blank>\n<unk>\n一\n丁\n七\n万\n丈\n三\n上\n下\n不\n与\n丐\n丑\n专\n且\n世\n丘\n丙\n业\n丛\n东\n丝\n丞\n丢\n两\n严\n丧\n个\n丫\n中\n丰\n串\n临\n丸\n丹\n为\n主\n丽\n举\n乃\n久\n么\n义\n之\n乌\n乍\n乎\n乏\n乐\n乒\n乓\n乔\n乖\n乘\n乙\n九\n乞\n也\n习\n乡\n书\n买\n乱\n乳\n乾\n了\n予\n争\n事\n二\n于\n亏\n云\n互\n五\n井\n亚\n些\n亟\n亡\n亢\n交\n亥\n亦\n产\n亨\n亩\n享\n京\n亭\n亮\n亲\n亳\n亵\n人\n亿\n什\n仁\n仄\n仅\n仇\n今\n介\n仍\n从\n仑\n仓\n仔\n仕\n他\n仗\n付\n仙\n仡\n代\n令\n以\n仨\n仪\n们\n仰\n仲\n件\n价\n任\n份\n仿\n企\n伉\n伊\n伍\n伎\n伏\n伐\n休\n众\n优\n伙\n会\n伞\n伟\n传\n伢\n伤\n伦\n伪\n伯\n估\n伴\n伶\n伸\n伺\n似\n伽\n佃\n但\n位\n低\n住\n佐\n佑\n体\n何\n佘\n余\n佛\n作\n佟\n你\n佣\n佩\n佬\n佳\n佶\n佼\n使\n侃\n侄\n侈\n例\n侍\n侑\n侗\n供\n依\n侠\n侣\n侥\n侦\n侧\n侨\n侬\n侮\n侯\n侵\n便\n促\n俄\n俊\n俏\n俐\n俗\n俘\n俚\n保\n俞\n信\n俨\n俩\n俪\n俭\n修\n俯\n俱\n俸\n俺\n俾\n倍\n倒\n倘\n候\n倚\n倜\n借\n倡\n倦\n倩\n倪\n债\n值\n倾\n假\n偏\n做\n停\n健\n偶\n偷\n偿\n傅\n傍\n傥\n储\n催\n傲\n傻\n像\n僚\n僧\n僮\n僵\n僻\n儒\n儿\n兀\n允\n元\n兄\n充\n兆\n先\n光\n克\n免\n兑\n兔\n兖\n党\n兜\n兢\n入\n全\n八\n公\n六\n兰\n共\n关\n兴\n兵\n其\n具\n典\n兹\n养\n兼\n兽\n冀\n内\n冈\n冉\n册\n再\n冒\n冕\n冗\n写\n军\n农\n冠\n冤\n冥\n冬\n冯\n冰\n冲\n决\n况\n冶\n冷\n冻\n净\n凄\n准\n凇\n凉\n凋\n凌\n减\n凑\n凝\n几\n凡\n凤\n凭\n凯\n凰\n凳\n凶\n凸\n凹\n出\n击\n函\n凿\n刀\n刁\n刃\n分\n切\n刊\n刑\n划\n列\n刘\n则\n刚\n创\n初\n删\n判\n刨\n利\n别\n刮\n到\n制\n刷\n券\n刹\n刺\n刻\n剁\n剂\n剃\n削\n前\n剐\n剑\n剔\n剖\n剥\n剧\n剩\n剪\n副\n割\n剽\n剿\n劈\n力\n劝\n办\n功\n加\n务\n劣\n动\n助\n努\n劫\n励\n劲\n劳\n劵\n势\n勃\n勇\n勉\n勋\n勒\n勘\n募\n勤\n勺\n勾\n勿\n匀\n包\n匆\n匈\n匕\n化\n北\n匙\n匝\n匠\n匡\n匣\n匪\n匮\n匹\n区\n医\n匾\n匿\n十\n千\n升\n午\n卉\n半\n华\n协\n卑\n卒\n卓\n单\n卖\n南\n博\n卜\n卞\n占\n卡\n卢\n卤\n卦\n卧\n卫\n卯\n印\n危\n卲\n即\n却\n卵\n卷\n卸\n卿\n厂\n厄\n厅\n历\n厉\n压\n厌\n厕\n厘\n厚\n原\n厢\n厥\n厦\n厨\n厩\n厮\n去\n县\n参\n又\n叉\n及\n友\n双\n反\n发\n叔\n取\n受\n变\n叙\n叛\n叠\n口\n古\n句\n另\n叨\n叩\n只\n叫\n召\n叭\n叮\n可\n台\n叱\n史\n右\n叵\n叶\n号\n司\n叹\n叼\n吁\n吃\n各\n吆\n合\n吉\n吊\n同\n名\n后\n吏\n吐\n向\n吓\n吕\n吗\n君\n吝\n吞\n吟\n否\n吧\n吨\n吩\n含\n听\n吭\n启\n吴\n吵\n吸\n吹\n吻\n吼\n吾\n吿\n呀\n呃\n呆\n呈\n告\n呐\n呕\n呗\n员\n呛\n呜\n呢\n呦\n周\n呲\n味\n呵\n呼\n命\n咀\n咄\n咋\n和\n咎\n咏\n咐\n咒\n咔\n咕\n咖\n咙\n咚\n咣\n咤\n咧\n咨\n咪\n咫\n咬\n咯\n咱\n咳\n咸\n咽\n哀\n品\n哄\n哆\n哇\n哈\n哉\n响\n哎\n哑\n哒\n哗\n哟\n哥\n哦\n哨\n哪\n哭\n哲\n哺\n哼\n哽\n唁\n唇\n唉\n唏\n唐\n唠\n唤\n唬\n售\n唯\n唱\n唾\n啃\n商\n啊\n啕\n啡\n啤\n啥\n啦\n啧\n啪\n啬\n啰\n啵\n啶\n啸\n啼\n喀\n喂\n善\n喆\n喇\n喉\n喊\n喔\n喘\n喜\n喝\n
喧\n喱\n喵\n喷\n喻\n喽\n嗅\n嗑\n嗒\n嗓\n嗡\n嗣\n嗤\n嗦\n嗨\n嗬\n嗯\n嗲\n嗷\n嗽\n嘀\n嘈\n嘉\n嘎\n嘘\n嘛\n嘟\n嘭\n嘱\n嘲\n嘴\n嘶\n嘻\n噎\n噘\n器\n噩\n噪\n噬\n噱\n噼\n嚎\n嚏\n嚓\n嚣\n嚷\n嚼\n囊\n囚\n四\n回\n因\n团\n囤\n囧\n园\n困\n围\n固\n国\n图\n圃\n圆\n圈\n土\n圣\n在\n圩\n圪\n圭\n地\n圳\n场\n圾\n址\n坂\n均\n坊\n坍\n坎\n坏\n坐\n坑\n块\n坚\n坛\n坝\n坞\n坟\n坠\n坡\n坤\n坦\n坪\n坯\n坷\n垂\n垃\n垄\n垅\n型\n垌\n垒\n垛\n垡\n垢\n垣\n垤\n垦\n垫\n垮\n埃\n埋\n城\n埔\n埜\n域\n埠\n培\n基\n堂\n堆\n堕\n堡\n堤\n堪\n堰\n堵\n塌\n塍\n塑\n塔\n塘\n塞\n填\n塬\n塾\n境\n墅\n墓\n墙\n增\n墟\n墨\n墩\n壁\n壑\n壕\n壤\n士\n壮\n声\n壳\n壶\n壹\n处\n备\n复\n夏\n夕\n外\n夙\n多\n夜\n够\n大\n天\n太\n夫\n夭\n央\n夯\n失\n头\n夷\n夸\n夹\n夺\n奂\n奇\n奈\n奉\n奋\n奎\n奏\n契\n奔\n奕\n奖\n套\n奘\n奚\n奠\n奢\n奥\n女\n奴\n奶\n奸\n她\n好\n如\n妃\n妄\n妆\n妇\n妈\n妊\n妍\n妒\n妖\n妙\n妞\n妤\n妥\n妧\n妨\n妩\n妮\n妯\n妹\n妻\n姆\n姊\n始\n姐\n姑\n姓\n委\n姗\n姚\n姜\n姝\n姣\n姥\n姨\n姬\n姻\n姿\n威\n娃\n娄\n娅\n娇\n娌\n娘\n娜\n娟\n娠\n娥\n娩\n娱\n娴\n娶\n娼\n婀\n婆\n婉\n婕\n婚\n婧\n婪\n婴\n婵\n婶\n婷\n婿\n媒\n媚\n媛\n媞\n媲\n媳\n嫁\n嫂\n嫉\n嫌\n嫔\n嫖\n嫚\n嫡\n嫣\n嫦\n嫩\n嬉\n嬛\n嬷\n孀\n子\n孔\n孕\n字\n存\n孙\n孚\n孜\n孝\n孟\n孢\n季\n孤\n学\n孩\n孪\n孰\n孱\n孵\n孺\n宁\n它\n宅\n宇\n守\n安\n宋\n完\n宏\n宓\n宕\n宗\n官\n宙\n定\n宛\n宜\n宝\n实\n宠\n审\n客\n宣\n室\n宦\n宪\n宫\n宰\n害\n宴\n宵\n家\n宸\n容\n宽\n宾\n宿\n寂\n寄\n寅\n密\n寇\n富\n寐\n寒\n寓\n寝\n寞\n察\n寡\n寥\n寨\n寮\n寰\n寸\n对\n寺\n寻\n导\n寿\n封\n射\n将\n尊\n小\n少\n尔\n尖\n尘\n尚\n尝\n尤\n尧\n尬\n就\n尴\n尸\n尹\n尺\n尼\n尽\n尾\n尿\n局\n屁\n层\n居\n屈\n届\n屋\n屌\n屎\n屏\n屑\n展\n属\n屠\n屡\n履\n屯\n山\n屹\n屿\n岁\n岂\n岌\n岐\n岔\n岖\n岗\n岚\n岛\n岩\n岬\n岭\n岱\n岳\n岷\n岸\n峁\n峙\n峡\n峥\n峨\n峪\n峭\n峰\n峻\n崂\n崃\n崇\n崎\n崔\n崖\n崛\n崧\n崩\n崭\n崴\n嵋\n嵌\n嵘\n嵛\n嵩\n嶝\n巅\n巍\n川\n州\n巡\n巢\n工\n左\n巧\n巨\n巩\n巫\n差\n己\n已\n巴\n巷\n巾\n巿\n币\n市\n布\n帅\n帆\n师\n希\n帐\n帕\n帖\n帘\n帚\n帜\n帝\n带\n席\n帮\n帷\n常\n帼\n帽\n幂\n幄\n幅\n幌\n幕\n幢\n干\n平\n年\n并\n幸\n幺\n幻\n幼\n幽\n广\n庄\n庆\n庇\n床\n序\n庐\n库\n应\n底\n店\n庙\n庚\n府\n庞\n废\n度\n座\n庭\n庵\n庶\n康\n庸\n庾\n廉\n廊\n廓\n廖\n延\n廷\n建\n开\n异\n弃\n弄\n弈\n弊\n式\n弑\n弓\n引\n弗\n弘\n弛\n弟\n张\n弥\n弦\n弧\n弩\n弯\n弱\n弹\n强\n归\n当\n录\n彝\n形\n彤\n彦\n彩\n彪\n彬\n彭\n彰\n影\n彷\n役\n彻\n彼\n彿\n往\n征\n径\n待\n徇\n很\n徉\n徊\n律\n徐\n徒\n得\n徘\n徙\n御\n循\n微\n德\n徽\n心\n必\n忆\n忌\n忍\n忐\n忑\n志\n忘\n忙\n忠\n忧\n忪\n快\n忱\n念\n忻\n忽\n怀\n态\n怂\n怅\n怎\n怒\n怕\n怖\n怜\n思\n怠\n怡\n急\n怦\n性\n怨\n怪\n怯\n怵\n总\n恋\n恍\n恐\n恒\n恙\n恢\n恣\n恤\n恨\n恩\n恪\n恬\n恭\n息\n恰\n恳\n恶\n恸\n恺\n恼\n恿\n悄\n悉\n悌\n悍\n悔\n悖\n悚\n悟\n悠\
n患\n悦\n您\n悬\n悯\n悲\n悴\n悸\n悼\n情\n惆\n惊\n惋\n惑\n惕\n惚\n惜\n惟\n惠\n惦\n惧\n惨\n惩\n惫\n惬\n惮\n惯\n惰\n想\n惶\n惹\n惺\n愁\n愈\n愉\n意\n愕\n愚\n感\n愤\n愧\n愿\n慈\n慌\n慎\n慑\n慕\n慢\n慧\n慨\n慰\n慷\n憋\n憔\n憧\n憨\n憩\n憬\n憷\n憾\n懂\n懈\n懊\n懋\n懒\n懵\n懿\n戈\n戎\n戏\n成\n我\n戒\n或\n战\n戚\n戛\n戟\n截\n戬\n戮\n戳\n戴\n户\n房\n所\n扁\n扇\n扉\n手\n才\n扎\n扑\n扒\n打\n扔\n托\n扛\n扣\n执\n扩\n扫\n扬\n扭\n扮\n扯\n扰\n扳\n扶\n批\n扼\n找\n承\n技\n抄\n抉\n把\n抑\n抒\n抓\n投\n抖\n抗\n折\n抚\n抛\n抠\n抡\n抢\n护\n报\n抨\n披\n抬\n抱\n抵\n抹\n押\n抽\n抿\n拄\n担\n拆\n拇\n拈\n拉\n拌\n拍\n拎\n拐\n拒\n拓\n拔\n拖\n拗\n拘\n拙\n招\n拜\n拟\n拢\n拣\n拥\n拦\n拧\n拨\n择\n括\n拭\n拮\n拯\n拱\n拳\n拴\n拷\n拼\n拽\n拾\n拿\n持\n挂\n指\n按\n挎\n挑\n挖\n挚\n挛\n挝\n挟\n挠\n挡\n挣\n挤\n挥\n挨\n挪\n挫\n振\n挺\n挽\n捂\n捅\n捆\n捉\n捍\n捎\n捏\n捐\n捕\n捞\n损\n捡\n换\n捣\n捧\n据\n捷\n捺\n捻\n掀\n掂\n授\n掉\n掌\n掏\n掐\n排\n掖\n掘\n掠\n探\n掣\n接\n控\n推\n掩\n措\n掬\n掮\n掰\n掳\n掴\n掷\n掺\n揄\n揉\n揍\n描\n提\n插\n握\n揣\n揩\n揪\n揭\n援\n揶\n揽\n搀\n搁\n搂\n搅\n搏\n搜\n搞\n搡\n搪\n搬\n搭\n携\n搽\n摁\n摄\n摆\n摇\n摊\n摒\n摔\n摘\n摧\n摩\n摸\n摹\n撂\n撇\n撑\n撒\n撕\n撞\n撤\n撩\n撬\n播\n撮\n撰\n撵\n撸\n撼\n擂\n擅\n操\n擎\n擒\n擘\n擞\n擦\n攀\n攒\n攥\n支\n收\n改\n攻\n放\n政\n故\n效\n敌\n敏\n救\n敖\n教\n敛\n敝\n敞\n敢\n散\n敦\n敬\n数\n敲\n整\n敷\n文\n斋\n斌\n斐\n斑\n斓\n斗\n料\n斛\n斜\n斟\n斡\n斤\n斥\n斧\n斩\n断\n斯\n新\n方\n施\n旁\n旅\n旋\n族\n旗\n无\n既\n日\n旦\n旧\n旨\n早\n旬\n旭\n旱\n时\n旷\n旺\n昀\n昂\n昆\n昊\n昌\n明\n昏\n易\n昔\n昕\n昙\n星\n映\n春\n昧\n昨\n昭\n是\n昱\n昵\n昼\n显\n晃\n晋\n晏\n晒\n晓\n晔\n晕\n晖\n晗\n晚\n晟\n晤\n晦\n晨\n普\n景\n晰\n晴\n晶\n智\n晾\n暂\n暄\n暇\n暑\n暖\n暗\n暧\n暨\n暮\n暴\n曙\n曝\n曦\n曰\n曲\n更\n曹\n曼\n曾\n替\n最\n月\n有\n朋\n服\n朐\n朔\n朗\n望\n朝\n期\n朦\n木\n未\n末\n本\n札\n术\n朱\n朴\n朵\n机\n朽\n杀\n杂\n权\n杆\n杉\n李\n杏\n材\n村\n杖\n杜\n杞\n束\n杠\n条\n来\n杨\n杭\n杯\n杰\n杳\n松\n板\n极\n构\n枉\n析\n枕\n林\n枚\n果\n枝\n枞\n枢\n枣\n枪\n枫\n枭\n枯\n架\n枷\n柄\n柏\n某\n染\n柔\n柚\n柜\n柞\n柠\n查\n柬\n柯\n柱\n柳\n柴\n柿\n栅\n标\n栈\n栋\n栏\n树\n栓\n栖\n栗\n校\n株\n样\n核\n根\n格\n栽\n栾\n桂\n桃\n框\n案\n桉\n桌\n桎\n桐\n桑\n桓\n桔\n档\n桥\n桦\n桩\n桶\n梁\n梅\n梓\n梗\n梦\n梧\n梨\n梭\n梯\n械\n梳\n梵\n检\n棉\n棋\n棍\n棒\n棕\n棘\n棚\n棠\n森\n棱\n棵\n棺\n椅\n椋\n植\n椎\n椒\n椰\n椿\n楂\n楔\n楚\n楞\n楠\n楣\n楷\n楼\n概\n榄\n榆\n榈\n榉\n榔\n榕\n榜\n榨\n榭\n榴\n榷\n榻\n槌\n槎\n槐\n槛\n槟\n槽\n槿\n樊\n樟\n模\n横\n樱\n橄\n橘\n橙\n橡\n橱\n檀\n檐\n檬\n欠\n次\n欢\n欣\n欧\n欲\n欺\n款\n歆\n歇\n歉\n歌\n止\n正\n此\n步\n武\n歧\n歪\n歹\n死\n殃\n殆\n殉\n殊\n残\n殒\n殓\n殖\n殚\n殡\n殭\n殴\n段\n殷\n殿\n毁\n毂\n毅\n毋\n母\n每\n毒\n毓\n比
\n毕\n毗\n毙\n毛\n毫\n毯\n毽\n氏\n民\n氓\n气\n氛\n氟\n氢\n氦\n氧\n氨\n氪\n氮\n氯\n氰\n水\n永\n氾\n汀\n汁\n求\n汇\n汉\n汕\n汗\n汛\n汝\n汞\n江\n池\n污\n汤\n汪\n汰\n汲\n汴\n汶\n汹\n汽\n汾\n沁\n沂\n沃\n沅\n沈\n沉\n沏\n沐\n沓\n沙\n沛\n沟\n没\n沣\n沥\n沦\n沧\n沪\n沫\n沮\n沱\n河\n沸\n油\n治\n沼\n沽\n沾\n沿\n泄\n泉\n泊\n泌\n泓\n泔\n法\n泗\n泛\n泞\n泠\n泡\n波\n泣\n泥\n注\n泪\n泯\n泰\n泱\n泳\n泵\n泷\n泸\n泻\n泼\n泽\n泾\n洁\n洋\n洒\n洗\n洙\n洛\n洞\n津\n洪\n洱\n洲\n洵\n活\n洼\n洽\n派\n流\n浅\n浆\n浇\n浈\n浊\n测\n济\n浏\n浑\n浓\n浙\n浚\n浦\n浩\n浪\n浮\n浴\n海\n浸\n涂\n涅\n消\n涉\n涌\n涎\n涓\n涕\n涛\n涝\n涞\n涟\n涠\n涡\n涤\n润\n涧\n涨\n涩\n涮\n涯\n液\n涵\n涿\n淀\n淄\n淆\n淇\n淋\n淌\n淑\n淖\n淘\n淝\n淞\n淡\n淤\n淫\n淮\n深\n淳\n混\n淹\n添\n淼\n渀\n清\n渊\n渍\n渎\n渐\n渔\n渗\n渚\n渝\n渠\n渡\n渣\n渤\n渥\n温\n渭\n港\n渲\n渴\n游\n渺\n湃\n湄\n湍\n湖\n湘\n湛\n湾\n湿\n溃\n溅\n溉\n源\n溜\n溢\n溥\n溧\n溪\n溯\n溶\n溺\n滁\n滇\n滋\n滑\n滔\n滕\n滚\n滞\n满\n滢\n滤\n滥\n滨\n滩\n滴\n漂\n漆\n漏\n漓\n演\n漕\n漠\n漩\n漫\n漭\n漯\n漱\n漳\n漾\n潇\n潘\n潜\n潞\n潢\n潦\n潭\n潮\n潼\n澄\n澈\n澎\n澜\n澡\n澳\n激\n濑\n濒\n濠\n濡\n濮\n瀑\n瀚\n瀛\n灌\n灞\n火\n灭\n灯\n灰\n灵\n灶\n灸\n灼\n灾\n灿\n炅\n炉\n炊\n炎\n炒\n炕\n炖\n炙\n炜\n炫\n炬\n炭\n炮\n炯\n炳\n炷\n炸\n点\n炼\n炽\n烁\n烂\n烃\n烈\n烊\n烘\n烙\n烛\n烟\n烤\n烦\n烧\n烨\n烫\n热\n烯\n烷\n烹\n烽\n焉\n焊\n焕\n焖\n焘\n焚\n焦\n焯\n焰\n焱\n然\n煊\n煌\n煎\n煜\n煞\n煤\n煦\n照\n煮\n煲\n熄\n熊\n熏\n熔\n熙\n熟\n熠\n熨\n熬\n熹\n燃\n燊\n燎\n燕\n燥\n爆\n爪\n爬\n爱\n爵\n父\n爷\n爸\n爹\n爽\n片\n版\n牌\n牙\n牛\n牟\n牡\n牢\n牧\n物\n牲\n牵\n特\n牺\n牾\n犀\n犁\n犄\n犊\n犒\n犬\n犯\n状\n犷\n犹\n狂\n狄\n狈\n狐\n狒\n狗\n狙\n狞\n狠\n狡\n狩\n独\n狭\n狮\n狰\n狱\n狸\n狼\n猎\n猖\n猛\n猜\n猝\n猥\n猩\n猪\n猫\n猬\n献\n猴\n猾\n猿\n獒\n獗\n獾\n玄\n率\n玉\n王\n玖\n玛\n玟\n玥\n玩\n玫\n玮\n环\n现\n玲\n玳\n玺\n玻\n珀\n珉\n珊\n珍\n珏\n珑\n珜\n珠\n班\n珮\n珲\n珺\n球\n琅\n理\n琉\n琊\n琏\n琐\n琛\n琢\n琥\n琦\n琨\n琪\n琬\n琰\n琳\n琴\n琵\n琶\n琼\n瑁\n瑄\n瑕\n瑙\n瑚\n瑛\n瑜\n瑞\n瑟\n瑰\n瑶\n瑾\n璀\n璃\n璇\n璋\n璐\n璞\n璧\n璨\n瓜\n瓢\n瓣\n瓦\n瓮\n瓯\n瓶\n瓷\n甄\n甘\n甚\n甜\n生\n甥\n用\n甩\n甫\n甬\n甯\n田\n由\n甲\n申\n电\n男\n甸\n町\n画\n畅\n畊\n界\n畏\n畔\n留\n畜\n略\n番\n畴\n畸\n畿\n疃\n疆\n疏\n疑\n疗\n疚\n疝\n疤\n疫\n疯\n疲\n疵\n疹\n疼\n疾\n病\n症\n痉\n痊\n痒\n痕\n痘\n痛\n痣\n痪\n痫\n痰\n痱\n痴\n痹\n痼\n瘀\n瘁\n瘟\n瘠\n瘤\n瘦\n瘩\n瘪\n瘫\n瘸\n瘾\n癌\n癖\n癣\n癫\n登\n白\n百\n皂\n的\n皆\n皇\n皋\n皎\n皓\n皖\n皙\n皮\n皱\n盆\n盈\n益\n盎\n盐\n监\n盒\n盔\n盖\n盗\n盘\n盛\n盟\n目\n盯\n盲\n直\n相\n盹\n盼\n盾\n省\n眈\n眉\n看\n真\n眠\n眨\n眬\n眯\n眶\n眷\n眺\n眼\n着\n睁\n睐\n睛\n睡\n督\n睦\n睫\n睬\n睹\n睾\n睿\n瞄\n瞅\n瞌\n瞎\n瞒\n瞟\n瞧\n瞩\n瞪\n瞬\n瞰\n瞳\n瞻\n瞿\n矗\n
矛\n矜\n矢\n矣\n知\n矩\n矫\n短\n矮\n石\n矶\n矸\n矿\n码\n砂\n砌\n砍\n砒\n研\n砖\n砚\n砝\n砥\n砰\n砲\n破\n砷\n砸\n砺\n砾\n础\n硅\n硕\n硚\n硝\n硫\n硬\n确\n碉\n碌\n碍\n碎\n碑\n碗\n碘\n碚\n碟\n碧\n碰\n碱\n碳\n碴\n碾\n磁\n磅\n磊\n磋\n磐\n磕\n磡\n磨\n磴\n磷\n磺\n礁\n示\n礼\n社\n祁\n祈\n祉\n祖\n祛\n祝\n神\n祠\n祢\n祥\n票\n祭\n祯\n祷\n祸\n祺\n禀\n禁\n禄\n禅\n福\n禧\n禹\n禺\n离\n禽\n禾\n秀\n私\n秃\n秆\n秉\n秋\n种\n科\n秒\n秘\n租\n秣\n秤\n秦\n秧\n秩\n积\n称\n秸\n移\n秽\n稀\n程\n稍\n税\n稚\n稠\n稣\n稳\n稻\n稼\n稽\n稿\n穆\n穗\n穴\n究\n穷\n空\n穿\n突\n窃\n窄\n窈\n窍\n窑\n窒\n窕\n窖\n窗\n窘\n窜\n窝\n窟\n窥\n窦\n窨\n窿\n立\n竖\n站\n竞\n竟\n章\n竣\n童\n竭\n端\n竲\n竹\n竺\n竽\n竿\n笃\n笈\n笋\n笑\n笔\n笙\n笛\n符\n笨\n第\n笼\n等\n筋\n筏\n筐\n筑\n筒\n答\n策\n筛\n筱\n筵\n筷\n筹\n签\n简\n箍\n箔\n箕\n算\n管\n箫\n箭\n箱\n篇\n篡\n篪\n篮\n篷\n簇\n簧\n簸\n簿\n籁\n籍\n米\n类\n籽\n粉\n粒\n粕\n粗\n粘\n粟\n粤\n粥\n粪\n粮\n粱\n粹\n粽\n精\n糊\n糕\n糖\n糗\n糙\n糟\n糯\n系\n紊\n素\n索\n紧\n紫\n累\n絮\n綦\n繁\n纠\n红\n纣\n纤\n约\n级\n纪\n纬\n纯\n纰\n纱\n纲\n纳\n纵\n纶\n纷\n纸\n纹\n纺\n纽\n线\n练\n组\n绅\n细\n织\n终\n绊\n绌\n绍\n绎\n经\n绑\n绒\n结\n绕\n绘\n给\n绚\n络\n绝\n绞\n统\n绢\n绣\n继\n绩\n绪\n续\n绮\n绯\n绰\n绳\n维\n绵\n绷\n绸\n综\n绽\n绿\n缀\n缄\n缅\n缆\n缇\n缉\n缓\n缔\n缕\n编\n缘\n缙\n缚\n缜\n缝\n缠\n缤\n缨\n缩\n缪\n缭\n缮\n缰\n缴\n缸\n缺\n罂\n罄\n罐\n网\n罕\n罗\n罚\n罡\n罢\n罩\n罪\n置\n署\n罹\n羁\n羊\n美\n羔\n羚\n羞\n羡\n羣\n群\n羲\n羹\n羽\n羿\n翁\n翅\n翌\n翔\n翘\n翟\n翠\n翡\n翩\n翰\n翱\n翻\n翼\n耀\n老\n考\n耄\n者\n耋\n而\n耍\n耐\n耒\n耕\n耗\n耘\n耳\n耶\n耷\n耸\n耻\n耽\n耿\n聂\n聆\n聊\n聋\n职\n联\n聘\n聚\n聪\n肃\n肆\n肇\n肉\n肋\n肌\n肖\n肘\n肚\n肛\n肝\n肠\n股\n肢\n肤\n肥\n肩\n肪\n肮\n肯\n育\n肴\n肺\n肾\n肿\n胀\n胁\n胃\n胆\n背\n胎\n胖\n胚\n胛\n胜\n胞\n胡\n胤\n胧\n胫\n胯\n胰\n胱\n胳\n胶\n胸\n胺\n能\n脂\n脆\n脉\n脊\n脍\n脏\n脐\n脑\n脖\n脚\n脯\n脱\n脸\n脾\n腆\n腊\n腋\n腌\n腐\n腑\n腓\n腔\n腕\n腥\n腩\n腮\n腰\n腱\n腹\n腺\n腻\n腼\n腾\n腿\n膀\n膊\n膏\n膑\n膛\n膜\n膝\n膨\n膳\n膺\n臀\n臂\n臃\n臆\n臣\n自\n臭\n至\n致\n臻\n舀\n舅\n舆\n舌\n舍\n舒\n舛\n舜\n舞\n舟\n航\n般\n舰\n舱\n舵\n舶\n舸\n船\n艇\n艋\n艘\n良\n艰\n色\n艳\n艺\n艾\n节\n芊\n芋\n芒\n芙\n芜\n芝\n芦\n芪\n芬\n芭\n芮\n芯\n花\n芳\n芷\n芸\n芽\n苇\n苍\n苏\n苑\n苗\n苛\n苟\n苡\n苣\n若\n苦\n苯\n英\n苹\n茁\n茂\n范\n茄\n茅\n茆\n茎\n茗\n茜\n茨\n茫\n茬\n茵\n茶\n茸\n茹\n荃\n荆\n荇\n草\n荐\n荒\n荔\n荚\n荞\n荟\n荡\n荣\n荤\n荧\n荫\n药\n荷\n荼\n莅\n莆\n莉\n莎\n莓\n莘\n莞\n莠\n莫\n莱\n莲\n莴\n获\n莹\n莺\n莽\n菁\n菇\n菊\n菌\n菜\n菠\n菡\n菩\n菱\n菲\n萃\n萄\n萋\n萌\n萍\n萎\n萝\n萤\n营\n萦\n萧\n萨\n萱\n落\n葆\n著\n葛\n葡\n董\n葩\n葫\n葬\n葱\n葵\n蒂\n蒋\n蒙\n蒜\n蒲\n蒸\n蒿\n蓁\n蓄\n蓉\n蓝\n蓟\n蓬\n蔑\n蔓\n蔗\n蔚\n蔡\n蔫\n蔬\
n蔷\n蔺\n蔽\n蕉\n蕊\n蕙\n蕲\n蕴\n蕾\n薄\n薇\n薙\n薛\n薪\n薯\n薰\n藏\n藜\n藤\n藩\n藻\n蘑\n虎\n虏\n虐\n虑\n虚\n虞\n虫\n虱\n虹\n虽\n虾\n蚀\n蚁\n蚂\n蚊\n蚌\n蚓\n蚕\n蚝\n蚣\n蚯\n蛀\n蛆\n蛇\n蛋\n蛐\n蛙\n蛛\n蛟\n蛮\n蛰\n蜀\n蜂\n蜇\n蜈\n蜊\n蜒\n蜓\n蜕\n蜗\n蜘\n蜚\n蜜\n蜡\n蜥\n蜴\n蜷\n蜻\n蜿\n蝇\n蝉\n蝎\n蝗\n蝙\n蝠\n蝴\n蝶\n螂\n螃\n融\n螳\n螺\n蟀\n蟋\n蟑\n蟒\n蟹\n蠕\n蠢\n血\n衅\n行\n衍\n衔\n街\n衙\n衡\n衣\n补\n表\n衫\n衬\n衰\n衷\n袁\n袂\n袄\n袆\n袈\n袋\n袍\n袒\n袖\n袜\n被\n袭\n袱\n裁\n裂\n装\n裆\n裔\n裕\n裙\n裟\n裤\n裳\n裴\n裸\n裹\n褂\n褒\n褓\n褚\n褛\n褪\n褴\n褶\n襁\n襄\n襟\n西\n要\n覃\n覆\n见\n观\n规\n觅\n视\n览\n觉\n觊\n觎\n觐\n觑\n角\n解\n觥\n触\n言\n詹\n誉\n誓\n警\n譬\n计\n订\n认\n讧\n讨\n让\n讪\n训\n议\n讯\n记\n讲\n讳\n讶\n许\n讹\n论\n讼\n讽\n设\n访\n诀\n证\n评\n诅\n识\n诈\n诉\n诊\n词\n译\n诓\n试\n诗\n诙\n诚\n话\n诞\n诟\n诠\n诡\n询\n该\n详\n诧\n诩\n诫\n诬\n语\n误\n诱\n诲\n说\n诵\n诶\n请\n诸\n诺\n读\n诽\n课\n诿\n谀\n谁\n调\n谅\n谈\n谊\n谋\n谌\n谍\n谎\n谐\n谑\n谓\n谕\n谙\n谚\n谜\n谢\n谣\n谤\n谦\n谨\n谩\n谬\n谭\n谱\n谴\n谷\n豁\n豆\n豚\n象\n豪\n豫\n豹\n貅\n貉\n貌\n貔\n贝\n贞\n负\n贡\n财\n责\n贤\n败\n账\n货\n质\n贩\n贪\n贫\n贬\n购\n贮\n贯\n贱\n贴\n贵\n贷\n贸\n费\n贺\n贼\n贾\n贿\n赁\n赂\n赃\n资\n赋\n赌\n赎\n赏\n赐\n赔\n赖\n赘\n赚\n赛\n赝\n赞\n赠\n赡\n赢\n赣\n赤\n赦\n赫\n走\n赴\n赵\n赶\n起\n趁\n超\n越\n趋\n趟\n趣\n足\n趴\n趸\n趾\n跃\n跄\n跆\n跌\n跑\n跛\n距\n跟\n跤\n跨\n跪\n路\n跳\n践\n跷\n跺\n跻\n踉\n踊\n踏\n踝\n踞\n踢\n踩\n踪\n踵\n踹\n蹂\n蹄\n蹈\n蹊\n蹚\n蹦\n蹬\n蹭\n蹲\n蹴\n蹶\n蹼\n蹿\n躁\n躏\n身\n躬\n躯\n躲\n躺\n车\n轧\n轨\n轩\n转\n轮\n软\n轰\n轴\n轶\n轻\n载\n轿\n较\n辄\n辅\n辆\n辈\n辉\n辍\n辐\n辑\n输\n辖\n辗\n辘\n辙\n辛\n辜\n辞\n辟\n辣\n辨\n辩\n辫\n辰\n辱\n边\n辽\n达\n迁\n迂\n迄\n迅\n过\n迈\n迎\n运\n近\n返\n还\n这\n进\n远\n违\n连\n迟\n迢\n迥\n迪\n迫\n迭\n述\n迷\n迸\n迹\n追\n退\n送\n适\n逃\n逅\n逆\n选\n逊\n逍\n透\n逐\n递\n途\n逗\n通\n逛\n逝\n逞\n速\n造\n逡\n逢\n逮\n逵\n逸\n逻\n逼\n逾\n遁\n遂\n遇\n遍\n遏\n遐\n道\n遗\n遛\n遢\n遣\n遥\n遨\n遭\n遮\n遴\n遵\n避\n邀\n邂\n邃\n邋\n邑\n邓\n邛\n邝\n邢\n那\n邦\n邪\n邬\n邮\n邯\n邱\n邵\n邹\n邺\n邻\n郁\n郊\n郎\n郑\n郜\n郝\n郡\n部\n郫\n郭\n郸\n都\n鄂\n鄙\n鄞\n鄢\n酋\n酌\n配\n酒\n酗\n酝\n酣\n酪\n酬\n酯\n酱\n酵\n酶\n酷\n酸\n酿\n醇\n醉\n醋\n醍\n醐\n醒\n醛\n采\n釉\n释\n里\n重\n野\n量\n金\n釜\n鉴\n鏖\n鑫\n针\n钉\n钊\n钒\n钓\n钛\n钜\n钝\n钞\n钟\n钠\n钢\n钥\n钦\n钧\n钩\n钮\n钰\n钱\n钴\n钵\n钻\n钾\n铀\n铁\n铂\n铃\n铅\n铆\n铉\n铎\n铐\n铜\n铝\n铠\n铡\n铣\n铨\n铬\n铭\n铮\n铰\n铲\n银\n铸\n铺\n链\n铿\n销\n锁\n锂\n锄\n锅\n锆\n锈\n锋\n锌\n锏\n锐\n错\n锚\n锜\n锟\n锡\n锢\n锣\n锤\n锥\n锦\n锭\n键\n锯\n锰\n锵\n锷\n锹\n锻\n镀\n镁\n镇\n镉\n镊\n镍\n镐\n镑\n镖\n镜\n镯\n镳\n镶\n长\n门\n闪\n闫\n闭\n问\n闯\n闰\n闲\n闳\n间
\n闵\n闷\n闸\n闹\n闺\n闻\n闽\n阀\n阁\n阂\n阅\n阎\n阐\n阔\n阙\n阚\n阜\n队\n阮\n阱\n防\n阳\n阴\n阵\n阶\n阻\n阿\n陀\n陂\n附\n际\n陆\n陇\n陈\n陋\n陌\n降\n限\n陕\n陡\n院\n除\n陨\n险\n陪\n陬\n陵\n陶\n陷\n隅\n隆\n隋\n隍\n随\n隐\n隔\n隘\n隙\n障\n隧\n隶\n隼\n隽\n难\n雀\n雁\n雄\n雅\n集\n雇\n雌\n雍\n雏\n雕\n雨\n雪\n雯\n雳\n零\n雷\n雾\n需\n霁\n霄\n霆\n震\n霈\n霉\n霍\n霎\n霏\n霖\n霜\n霞\n露\n霸\n霹\n霾\n靑\n青\n靓\n靖\n静\n靛\n非\n靠\n靡\n面\n革\n靳\n靴\n靶\n鞋\n鞍\n鞘\n鞠\n鞭\n韦\n韧\n韩\n韬\n音\n韵\n韶\n页\n顶\n顷\n项\n顺\n须\n顽\n顾\n顿\n颁\n颂\n预\n颅\n领\n颇\n颈\n颊\n颍\n颐\n频\n颓\n颖\n颗\n题\n颚\n颜\n额\n颠\n颤\n风\n飒\n飓\n飘\n飙\n飚\n飞\n食\n餐\n餮\n饕\n饥\n饪\n饭\n饮\n饰\n饱\n饲\n饵\n饶\n饺\n饼\n饽\n饿\n馀\n馅\n馆\n馈\n馊\n馋\n馑\n馒\n首\n馗\n香\n馥\n馨\n马\n驭\n驯\n驰\n驱\n驳\n驴\n驶\n驻\n驼\n驾\n驿\n骁\n骂\n骄\n骅\n骆\n骇\n骊\n骋\n验\n骏\n骐\n骑\n骗\n骚\n骜\n骤\n骥\n骨\n骷\n骸\n骼\n髅\n髋\n髌\n髓\n高\n髦\n鬼\n魁\n魂\n魄\n魅\n魇\n魏\n魔\n鱼\n鲁\n鲍\n鲜\n鲟\n鲤\n鲨\n鲶\n鲷\n鲸\n鳄\n鳅\n鳌\n鳖\n鳝\n鳞\n鸟\n鸠\n鸡\n鸣\n鸥\n鸦\n鸭\n鸯\n鸳\n鸵\n鸽\n鸾\n鸿\n鹃\n鹅\n鹊\n鹏\n鹜\n鹞\n鹤\n鹭\n鹰\n鹿\n麋\n麒\n麓\n麟\n麦\n麻\n麾\n黄\n黍\n黎\n黏\n黑\n黔\n默\n黛\n黝\n黯\n鼎\n鼓\n鼠\n鼻\n鼾\n齐\n齿\n龄\n龙\n龚\n龟\nａ\nｃ\nｋ\nｔ\n<eos>\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/deepspeech_tester.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Evaluation for DeepSpeech2 model.\"\"\"\nimport os\nimport sys\nfrom pathlib import Path\n\nimport paddle\n\nfrom deepspeech.frontend.featurizer.text_featurizer import TextFeaturizer\nfrom deepspeech.io.collator import SpeechCollator\nfrom deepspeech.models.ds2 import DeepSpeech2Model\nfrom deepspeech.utils import mp_tools\nfrom deepspeech.utils.utility import UpdateConfig\n\n\nclass DeepSpeech2Tester:\n    def __init__(self, config):\n        self.config = config\n        self.collate_fn_test = SpeechCollator.from_config(config)\n        self._text_featurizer = TextFeaturizer(unit_type=config.collator.unit_type, vocab_filepath=None)\n\n    def compute_result_transcripts(self, audio, audio_len, vocab_list, cfg):\n        result_transcripts = self.model.decode(\n            audio,\n            audio_len,\n            vocab_list,\n            decoding_method=cfg.decoding_method,\n            lang_model_path=cfg.lang_model_path,\n            beam_alpha=cfg.alpha,\n            beam_beta=cfg.beta,\n            beam_size=cfg.beam_size,\n            cutoff_prob=cfg.cutoff_prob,\n            cutoff_top_n=cfg.cutoff_top_n,\n            num_processes=cfg.num_proc_bsearch)\n        #replace the '<space>' with ' '\n        result_transcripts = [self._text_featurizer.detokenize(sentence) for sentence in result_transcripts]\n\n        return 
result_transcripts\n\n    @mp_tools.rank_zero_only\n    @paddle.no_grad()\n    def test(self, audio_file):\n        self.model.eval()\n        cfg = self.config\n        collate_fn_test = self.collate_fn_test\n        audio, _ = collate_fn_test.process_utterance(audio_file=audio_file, transcript=\" \")\n        audio_len = audio.shape[0]\n        audio = paddle.to_tensor(audio, dtype='float32')\n        audio_len = paddle.to_tensor(audio_len)\n        audio = paddle.unsqueeze(audio, axis=0)\n        vocab_list = collate_fn_test.vocab_list\n        result_transcripts = self.compute_result_transcripts(audio, audio_len, vocab_list, cfg.decoding)\n        return result_transcripts\n\n    def setup_model(self):\n        config = self.config.clone()\n        with UpdateConfig(config):\n            config.model.feat_size = self.collate_fn_test.feature_size\n            config.model.dict_size = self.collate_fn_test.vocab_size\n\n        model = DeepSpeech2Model.from_config(config.model)\n        self.model = model\n\n    def resume(self, checkpoint):\n        \"\"\"Load model parameters from the specified checkpoint file.\n        \"\"\"\n        model_dict = paddle.load(checkpoint)\n        self.model.set_state_dict(model_dict)\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nimport sys\n\nimport numpy as np\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom paddle.utils.download import get_path_from_url\n\ntry:\n    import swig_decoders\nexcept ModuleNotFoundError as e:\n    logger.error(e)\n    logger.info('The module requires additional dependencies: swig_decoders. 
'\n                'please install via:\\n\\'git clone https://github.com/PaddlePaddle/DeepSpeech.git '\n                '&& cd DeepSpeech && git reset --hard b53171694e7b87abe7ea96870b2f4d8e0e2b1485 '\n                '&& cd deepspeech/decoders/ctcdecoder/swig && sh setup.sh\\'')\n    sys.exit(1)\n\nimport paddle\nimport soundfile as sf\n\n# TODO: Remove system path when deepspeech can be installed via pip.\nsys.path.append(os.path.join(MODULE_HOME, 'deepspeech2_aishell'))\nfrom deepspeech.exps.deepspeech2.config import get_cfg_defaults\nfrom deepspeech.utils.utility import UpdateConfig\nfrom .deepspeech_tester import DeepSpeech2Tester\n\nLM_URL = 'https://deepspeech.bj.bcebos.com/zh_lm/zh_giga.no_cna_cmn.prune01244.klm'\nLM_MD5 = '29e02312deb2e59b3c8686c7966d4fe3'\n\n\n@moduleinfo(name=\"deepspeech2_aishell\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/asr\")\nclass DeepSpeech2(paddle.nn.Layer):\n    def __init__(self):\n        super(DeepSpeech2, self).__init__()\n\n        # resource\n        res_dir = os.path.join(MODULE_HOME, 'deepspeech2_aishell', 'assets')\n        conf_file = os.path.join(res_dir, 'conf/deepspeech2.yaml')\n        checkpoint = os.path.join(res_dir, 'checkpoints/avg_1.pdparams')\n        # Download the LM manually because of its large size.\n        lm_path = os.path.join(res_dir, 'data', 'lm')\n        lm_file = os.path.join(lm_path, LM_URL.split('/')[-1])\n        if not os.path.isfile(lm_file):\n            logger.info(f'Downloading lm from {LM_URL}.')\n            get_path_from_url(url=LM_URL, root_dir=lm_path, md5sum=LM_MD5)\n\n        # config\n        self.model_type = 'offline'\n        self.config = get_cfg_defaults(self.model_type)\n        self.config.merge_from_file(conf_file)\n\n        # TODO: Remove path updating snippet.\n        with UpdateConfig(self.config):\n            self.config.collator.mean_std_filepath = os.path.join(res_dir, self.config.collator.mean_std_filepath)\n            self.config.collator.vocab_filepath = os.path.join(res_dir, self.config.collator.vocab_filepath)\n            self.config.collator.augmentation_config = os.path.join(res_dir, self.config.collator.augmentation_config)\n            self.config.decoding.lang_model_path = os.path.join(res_dir, self.config.decoding.lang_model_path)\n\n        # model\n        self.tester = DeepSpeech2Tester(self.config)\n        self.tester.setup_model()\n        self.tester.resume(checkpoint)\n\n    @staticmethod\n    def check_audio(audio_file):\n        sig, sample_rate = sf.read(audio_file)\n        assert sample_rate == 16000, 'Expected sample rate of input audio to be 16000, but got {}'.format(sample_rate)\n\n    @serving\n    def speech_recognize(self, audio_file, device='cpu'):\n        assert os.path.isfile(audio_file), 'File does not exist: {}'.format(audio_file)\n        self.check_audio(audio_file)\n\n        paddle.set_device(device)\n        return self.tester.test(audio_file)[0]\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_aishell/requirements.txt",
    "content": "# system-level dependencies: libsndfile, swig\nloguru\nyacs\njsonlines\nscipy==1.2.1\nsentencepiece\nresampy==0.2.2\nSoundFile==0.9.0.post1\nsoxbindings\nkaldiio\ntypeguard\neditdistance\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/README.md",
    "content": "# deepspeech2_librispeech\n\n|模型名称|deepspeech2_librispeech|\n| :--- | :---: |\n|类别|语音-语音识别|\n|网络|DeepSpeech2|\n|数据集|LibriSpeech|\n|是否支持Fine-tuning|否|\n|模型大小|518MB|\n|最新更新日期|2021-10-20|\n|数据指标|英文WER 0.072|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nDeepSpeech2是百度于2015年提出的适用于英文和中文的end-to-end语音识别模型。deepspeech2_librispeech使用了DeepSpeech2离线模型的结构，模型主要由2层卷积网络和3层GRU组成，并在英文开源语音数据集[LibriSpeech ASR corpus](http://www.openslr.org/12/)进行了预训练，该模型在其测试集上的WER指标是0.072。\n\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/Hub/docs/images/ds2offlineModel.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[Deep Speech 2: End-to-End Speech Recognition in English and Mandarin](https://arxiv.org/abs/1512.02595)\n\n## 二、安装\n\n- ### 1、系统依赖\n\n  - libsndfile, swig >= 3.0\n    - Linux\n      ```shell\n      $ sudo apt-get install libsndfile swig\n      or\n      $ sudo yum install libsndfile swig\n      ```\n    - MacOs\n      ```\n      $ brew install libsndfile swig\n      ```\n\n- ### 2、环境依赖\n  - swig_decoder:\n    ```\n    git clone https://github.com/paddlepaddle/deepspeech && cd DeepSpeech && git reset --hard b53171694e7b87abe7ea96870b2f4d8e0e2b1485 && cd deepspeech/decoders/ctcdecoder/swig && sh setup.sh\n    ```\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install deepspeech2_librispeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 采样率为16k，格式为wav的英文语音音频\n    wav_file = '/PATH/TO/AUDIO'\n\n    model = hub.Module(\n        name='deepspeech2_librispeech',\n        version='1.0.0')\n    text = model.speech_recognize(wav_file)\n\n    
print(text)\n    ```\n\n- ### 2、API\n  - ```python\n    def check_audio(audio_file)\n    ```\n    - 检查输入音频的格式和采样率是否满足要求（wav格式、采样率16000）\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n\n  - ```python\n    def speech_recognize(\n        audio_file,\n        device='cpu',\n    )\n    ```\n    - 将输入的音频识别成文字\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `text`：str类型，返回输入音频的识别文字结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m deepspeech2_librispeech\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要识别的音频的存放路径，确保部署服务的机器可访问\n    file = '/path/to/input.wav'\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"audio_file\"\n    data = {\"audio_file\": file}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/deepspeech2_librispeech\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install deepspeech2_librispeech\n  ```\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/assets/conf/augmentation.json",
    "content": "{}\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/assets/conf/deepspeech2.yaml",
    "content": "# https://yaml.org/type/float.html\ndata:\n  train_manifest: data/manifest.train\n  dev_manifest: data/manifest.dev-clean\n  test_manifest: data/manifest.test-clean\n  min_input_len: 0.0\n  max_input_len: 30.0 # second\n  min_output_len: 0.0\n  max_output_len: .inf\n  min_output_input_ratio: 0.00\n  max_output_input_ratio: .inf\n\ncollator:\n  batch_size: 20\n  mean_std_filepath: data/mean_std.json\n  unit_type: char\n  vocab_filepath: data/vocab.txt\n  augmentation_config: conf/augmentation.json\n  random_seed: 0\n  spm_model_prefix:\n  spectrum_type: linear\n  target_sample_rate: 16000\n  max_freq: None\n  n_fft: None\n  stride_ms: 10.0\n  window_ms: 20.0\n  delta_delta: False\n  dither: 1.0\n  use_dB_normalization: True\n  target_dB: -20\n  random_seed: 0\n  keep_transcription_text: False\n  sortagrad: True\n  shuffle_method: batch_shuffle\n  num_workers: 2\n\nmodel:\n  num_conv_layers: 2\n  num_rnn_layers: 3\n  rnn_layer_size: 2048\n  use_gru: False\n  share_rnn_weights: True\n  blank_id: 0\n  ctc_grad_norm_type: instance\n\ntraining:\n  n_epoch: 50\n  accum_grad: 1\n  lr: 1e-3\n  lr_decay: 0.83\n  weight_decay: 1e-06\n  global_grad_clip: 5.0\n  log_interval: 100\n  checkpoint:\n    kbest_n: 50\n    latest_n: 5\n\ndecoding:\n  batch_size: 128\n  error_rate_type: wer\n  decoding_method: ctc_beam_search\n  lang_model_path: data/lm/common_crawl_00.prune01111.trie.klm\n  alpha: 1.9\n  beta: 0.3\n  beam_size: 500\n  cutoff_prob: 1.0\n  cutoff_top_n: 40\n  num_proc_bsearch: 8\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/deepspeech_tester.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Evaluation for DeepSpeech2 model.\"\"\"\nimport os\nimport sys\nfrom pathlib import Path\n\nimport paddle\n\nfrom deepspeech.frontend.featurizer.text_featurizer import TextFeaturizer\nfrom deepspeech.io.collator import SpeechCollator\nfrom deepspeech.models.ds2 import DeepSpeech2Model\nfrom deepspeech.utils import mp_tools\nfrom deepspeech.utils.utility import UpdateConfig\n\n\nclass DeepSpeech2Tester:\n    def __init__(self, config):\n        self.config = config\n        self.collate_fn_test = SpeechCollator.from_config(config)\n        self._text_featurizer = TextFeaturizer(unit_type=config.collator.unit_type, vocab_filepath=None)\n\n    def compute_result_transcripts(self, audio, audio_len, vocab_list, cfg):\n        result_transcripts = self.model.decode(\n            audio,\n            audio_len,\n            vocab_list,\n            decoding_method=cfg.decoding_method,\n            lang_model_path=cfg.lang_model_path,\n            beam_alpha=cfg.alpha,\n            beam_beta=cfg.beta,\n            beam_size=cfg.beam_size,\n            cutoff_prob=cfg.cutoff_prob,\n            cutoff_top_n=cfg.cutoff_top_n,\n            num_processes=cfg.num_proc_bsearch)\n        #replace the '<space>' with ' '\n        result_transcripts = [self._text_featurizer.detokenize(sentence) for sentence in result_transcripts]\n\n        return 
result_transcripts\n\n    @mp_tools.rank_zero_only\n    @paddle.no_grad()\n    def test(self, audio_file):\n        self.model.eval()\n        cfg = self.config\n        collate_fn_test = self.collate_fn_test\n        audio, _ = collate_fn_test.process_utterance(audio_file=audio_file, transcript=\" \")\n        audio_len = audio.shape[0]\n        audio = paddle.to_tensor(audio, dtype='float32')\n        audio_len = paddle.to_tensor(audio_len)\n        audio = paddle.unsqueeze(audio, axis=0)\n        vocab_list = collate_fn_test.vocab_list\n        result_transcripts = self.compute_result_transcripts(audio, audio_len, vocab_list, cfg.decoding)\n        return result_transcripts\n\n    def setup_model(self):\n        config = self.config.clone()\n        with UpdateConfig(config):\n            config.model.feat_size = self.collate_fn_test.feature_size\n            config.model.dict_size = self.collate_fn_test.vocab_size\n\n        model = DeepSpeech2Model.from_config(config.model)\n        self.model = model\n\n    def resume(self, checkpoint):\n        \"\"\"Resume from the checkpoint at checkpoints in the output\n        directory or load a specified checkpoint.\n        \"\"\"\n        model_dict = paddle.load(checkpoint)\n        self.model.set_state_dict(model_dict)\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nimport sys\n\nimport numpy as np\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom paddle.utils.download import get_path_from_url\n\ntry:\n    import swig_decoders\nexcept ModuleNotFoundError as e:\n    logger.error(e)\n    logger.info('The module requires additional dependencies: swig_decoders. 
'\n                'Please install via:\\n\\'git clone https://github.com/PaddlePaddle/DeepSpeech.git '\n                '&& cd DeepSpeech && git reset --hard b53171694e7b87abe7ea96870b2f4d8e0e2b1485 '\n                '&& cd deepspeech/decoders/ctcdecoder/swig && sh setup.sh\\'')\n    sys.exit(1)\n\nimport paddle\nimport soundfile as sf\n\n# TODO: Remove system path when deepspeech can be installed via pip.\nsys.path.append(os.path.join(MODULE_HOME, 'deepspeech2_librispeech'))\nfrom deepspeech.exps.deepspeech2.config import get_cfg_defaults\nfrom deepspeech.utils.utility import UpdateConfig\nfrom .deepspeech_tester import DeepSpeech2Tester\n\nLM_URL = 'https://deepspeech.bj.bcebos.com/en_lm/common_crawl_00.prune01111.trie.klm'\nLM_MD5 = '099a601759d467cd0a8523ff939819c5'\n\n\n@moduleinfo(\n    name=\"deepspeech2_librispeech\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/asr\")\nclass DeepSpeech2(paddle.nn.Layer):\n    def __init__(self):\n        super(DeepSpeech2, self).__init__()\n\n        # resource\n        res_dir = os.path.join(MODULE_HOME, 'deepspeech2_librispeech', 'assets')\n        conf_file = os.path.join(res_dir, 'conf/deepspeech2.yaml')\n        checkpoint = os.path.join(res_dir, 'checkpoints/avg_1.pdparams')\n        # Download the LM manually because of its large size.\n        lm_path = os.path.join(res_dir, 'data', 'lm')\n        lm_file = os.path.join(lm_path, LM_URL.split('/')[-1])\n        if not os.path.isfile(lm_file):\n            logger.info(f'Downloading lm from {LM_URL}.')\n            get_path_from_url(url=LM_URL, root_dir=lm_path, md5sum=LM_MD5)\n\n        # config\n        self.model_type = 'offline'\n        self.config = get_cfg_defaults(self.model_type)\n        self.config.merge_from_file(conf_file)\n\n        # TODO: Remove path updating snippet.\n        with UpdateConfig(self.config):\n            self.config.collator.mean_std_filepath = os.path.join(res_dir, 
self.config.collator.mean_std_filepath)\n            self.config.collator.vocab_filepath = os.path.join(res_dir, self.config.collator.vocab_filepath)\n            self.config.collator.augmentation_config = os.path.join(res_dir, self.config.collator.augmentation_config)\n            self.config.decoding.lang_model_path = os.path.join(res_dir, self.config.decoding.lang_model_path)\n\n        # model\n        self.tester = DeepSpeech2Tester(self.config)\n        self.tester.setup_model()\n        self.tester.resume(checkpoint)\n\n    @staticmethod\n    def check_audio(audio_file):\n        sig, sample_rate = sf.read(audio_file)\n        assert sample_rate == 16000, 'Expected sample rate of input audio to be 16000, but got {}'.format(sample_rate)\n\n    @serving\n    def speech_recognize(self, audio_file, device='cpu'):\n        assert os.path.isfile(audio_file), 'File does not exist: {}'.format(audio_file)\n        self.check_audio(audio_file)\n\n        paddle.set_device(device)\n        return self.tester.test(audio_file)[0]\n"
  },
  {
    "path": "modules/audio/asr/deepspeech2_librispeech/requirements.txt",
    "content": "loguru\nyacs\njsonlines\nscipy==1.2.1\nsentencepiece\nresampy==0.2.2\nSoundFile==0.9.0.post1\nsoxbindings\nkaldiio\ntypeguard\neditdistance\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/README.md",
    "content": "# u2_conformer_aishell\n\n|模型名称|u2_conformer_aishell|\n| :--- | :---: |\n|类别|语音-语音识别|\n|网络|Conformer|\n|数据集|AISHELL-1|\n|是否支持Fine-tuning|否|\n|模型大小|284MB|\n|最新更新日期|2021-11-01|\n|数据指标|中文CER 0.055|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nU2 Conformer模型是一种适用于英文和中文的end-to-end语音识别模型。u2_conformer_aishell采用了conformer的encoder和transformer的decoder的模型结构，并且使用了ctc-prefix beam search的方式进行一遍打分，再利用attention decoder进行二次打分的方式进行解码来得到最终结果。\n\nu2_conformer_aishell在中文普通话开源语音数据集[AISHELL-1](http://www.aishelltech.com/kysjcp)进行了预训练，该模型在其测试集上的CER指标是0.055257。\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/conformer.png\" hspace='10'/> <br />\n</p>\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/u2_conformer.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考:\n- [Unified Streaming and Non-streaming Two-pass End-to-end Model for Speech Recognition](https://arxiv.org/abs/2012.05481)\n- [Conformer: Convolution-augmented Transformer for Speech Recognition](https://arxiv.org/abs/2005.08100)\n\n## 二、安装\n\n- ### 1、系统依赖\n\n  - libsndfile\n    - Linux\n      ```shell\n      $ sudo apt-get install libsndfile\n      or\n      $ sudo yum install libsndfile\n      ```\n    - MacOs\n      ```\n      $ brew install libsndfile\n      ```\n\n- ### 2、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install u2_conformer_aishell\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 采样率为16k，格式为wav的中文语音音频\n    wav_file = '/PATH/TO/AUDIO'\n\n    model = hub.Module(\n        name='u2_conformer_aishell',\n        version='1.0.0')\n    text = 
model.speech_recognize(wav_file)\n\n    print(text)\n    ```\n\n- ### 2、API\n  - ```python\n    def check_audio(audio_file)\n    ```\n    - 检查输入音频的格式和采样率是否满足要求（wav格式、采样率16000）\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n\n  - ```python\n    def speech_recognize(\n        audio_file,\n        device='cpu',\n    )\n    ```\n    - 将输入的音频识别成文字\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `text`：str类型，返回输入音频的识别文字结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m u2_conformer_aishell\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要识别的音频的存放路径，确保部署服务的机器可访问\n    file = '/path/to/input.wav'\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"audio_file\"\n    data = {\"audio_file\": file}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/u2_conformer_aishell\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install u2_conformer_aishell\n  ```\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/assets/conf/augmentation.json",
    "content": "{}\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/assets/conf/conformer.yaml",
    "content": "data:\n  train_manifest: data/manifest.train\n  dev_manifest: data/manifest.dev\n  test_manifest: data/manifest.test\n  min_input_len: 0.5\n  max_input_len: 20.0 # second\n  min_output_len: 0.0\n  max_output_len: 400.0\n  min_output_input_ratio: 0.05\n  max_output_input_ratio: 10.0\n\ncollator:\n  vocab_filepath: data/vocab.txt\n  unit_type: 'char'\n  spm_model_prefix: ''\n  augmentation_config: conf/augmentation.json\n  batch_size: 64\n  raw_wav: True  # use raw_wav or kaldi feature\n  spectrum_type: fbank #linear, mfcc, fbank\n  feat_dim: 80\n  delta_delta: False\n  dither: 1.0\n  target_sample_rate: 16000\n  max_freq: None\n  n_fft: None\n  stride_ms: 10.0\n  window_ms: 25.0\n  use_dB_normalization: False\n  target_dB: -20\n  random_seed: 0\n  keep_transcription_text: False\n  sortagrad: True\n  shuffle_method: batch_shuffle\n  num_workers: 2\n\ndecoding:\n  alpha: 2.5\n  batch_size: 128\n  beam_size: 10\n  beta: 0.3\n  ctc_weight: 0.0\n  cutoff_prob: 1.0\n  cutoff_top_n: 0\n  decoding_chunk_size: -1\n  decoding_method: attention\n  error_rate_type: cer\n  lang_model_path: data/lm/common_crawl_00.prune01111.trie.klm\n  num_decoding_left_chunks: -1\n  num_proc_bsearch: 8\n  simulate_streaming: False\nmodel:\n  cmvn_file: data/mean_std.json\n  cmvn_file_type: json\n  decoder: transformer\n  decoder_conf:\n    attention_heads: 4\n    dropout_rate: 0.1\n    linear_units: 2048\n    num_blocks: 6\n    positional_dropout_rate: 0.1\n    self_attention_dropout_rate: 0.0\n    src_attention_dropout_rate: 0.0\n  encoder: conformer\n  encoder_conf:\n    activation_type: swish\n    attention_dropout_rate: 0.0\n    attention_heads: 4\n    cnn_module_kernel: 15\n    dropout_rate: 0.1\n    input_layer: conv2d\n    linear_units: 2048\n    normalize_before: True\n    num_blocks: 12\n    output_size: 256\n    pos_enc_layer_type: rel_pos\n    positional_dropout_rate: 0.1\n    selfattention_layer_type: rel_selfattn\n    use_cnn_module: True\n  input_dim: 0\n  
model_conf:\n    ctc_weight: 0.3\n    ctc_dropoutrate: 0.0\n    ctc_grad_norm_type: instance\n    length_normalized_loss: False\n    lsm_weight: 0.1\n  output_dim: 0\ntraining:\n  accum_grad: 2\n  global_grad_clip: 5.0\n  log_interval: 100\n  n_epoch: 300\n  optim: adam\n  optim_conf:\n    lr: 0.002\n    weight_decay: 1e-06\n  scheduler: warmuplr\n  scheduler_conf:\n    lr_decay: 1.0\n    warmup_steps: 25000\n  checkpoint:\n    kbest_n: 50\n    latest_n: 5\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/assets/data/mean_std.json",
    "content": "{\"mean_stat\": [533749178.75492024, 537379151.9412827, 553560684.251823, 587164297.7995199, 631868827.5506272, 662598279.7375823, 684377628.7270963, 695391900.076011, 692470493.5234187, 679434068.1698124, 666124153.9164762, 656323498.7897255, 665750586.0282139, 678693518.7836165, 681921713.5434498, 679622373.0941861, 669891550.4909347, 656595089.7941492, 653838531.0994304, 637678601.7858486, 628412248.7348012, 644835299.462052, 638840698.1892803, 646181879.4332589, 639724189.2981818, 642757470.3933163, 637471382.8647255, 642368839.4687729, 643414999.4559816, 647384269.1630985, 649348352.9727564, 649293860.0141628, 650234047.7200857, 654485430.6703687, 660474314.9996675, 667417041.2224753, 673157601.3226709, 675674470.304284, 675124085.6890339, 668017589.4583111, 670061307.6169846, 662625614.6886193, 663144526.4351237, 662504003.7634674, 666413530.1149732, 672263295.5639057, 678483738.2530766, 685387098.3034457, 692570857.529439, 699066050.4399202, 700784878.5879861, 701201520.50868, 702666292.305144, 705443439.2278953, 706070270.9023902, 705988909.8337733, 702843339.0362502, 699318566.4701376, 696089900.3030818, 687559674.541517, 675279201.9502573, 663676352.2301354, 662963751.7464145, 664300133.8414352, 666095384.4212626, 671682092.7777623, 676652386.6696675, 680097668.2490273, 683810023.0071762, 688701544.3655603, 692082724.9923568, 695788849.6782106, 701085780.0070009, 706389529.7959046, 711492753.1344281, 717637923.73355, 719691678.2081754, 715810733.4964175, 696362890.4862831, 604649423.9932467], \"var_stat\": [5413314850.92017, 5559847287.933615, 6150990253.613769, 6921242242.585692, 7999776708.347419, 8789877370.390867, 9405801233.462742, 9768050110.323652, 9759783206.942099, 9430647265.679018, 9090547056.72849, 8873147345.425886, 9155912918.518642, 9542539953.84679, 9653547618.806402, 9593434792.936714, 9316633026.420147, 8959273999.588833, 8863548125.445953, 8450615911.730164, 8211598033.615433, 8587083872.162145, 8432613574.987708, 
8583943640.722399, 8401731458.393406, 8439359231.367369, 8293779802.711447, 8401506934.147289, 8427506949.839874, 8525176341.071184, 8577080109.482346, 8575106681.347283, 8594987363.896849, 8701703698.13697, 8854967559.695303, 9029484499.828356, 9168774993.437275, 9221457044.693224, 9194525496.858181, 8997085233.031223, 9024585998.805922, 8819398159.92156, 8807895653.788486, 8777245867.886335, 8869681168.825321, 9017397167.041729, 9173402827.38027, 9345595113.30765, 9530638054.282673, 9701241750.610865, 9749002220.142677, 9762753891.356327, 9802020174.527405, 9874432300.977995, 9883303068.689241, 9873499335.610315, 9780680890.924107, 9672603363.913414, 9569436761.47915, 9321842521.985804, 8968140697.297707, 8646348638.918655, 8616965457.523136, 8648620220.395298, 8702086138.675117, 8859213220.99842, 8999405313.087536, 9105949447.399998, 9220413227.016796, 9358601578.269663, 9451405873.00428, 9552727080.824707, 9695443509.54488, 9836687193.669691, 9970962418.410656, 10135881535.317768, 10189390919.400673, 10070483257.345238, 9532953296.22076, 7261219636.045063], \"frame_num\": 54068199}\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/assets/data/vocab.txt",
    "content": "<blank>\n<unk>\n一\n丁\n七\n万\n丈\n三\n上\n下\n不\n与\n丐\n丑\n专\n且\n世\n丘\n丙\n业\n丛\n东\n丝\n丞\n丢\n两\n严\n丧\n个\n丫\n中\n丰\n串\n临\n丸\n丹\n为\n主\n丽\n举\n乃\n久\n么\n义\n之\n乌\n乍\n乎\n乏\n乐\n乒\n乓\n乔\n乖\n乘\n乙\n九\n乞\n也\n习\n乡\n书\n买\n乱\n乳\n乾\n了\n予\n争\n事\n二\n于\n亏\n云\n互\n五\n井\n亚\n些\n亟\n亡\n亢\n交\n亥\n亦\n产\n亨\n亩\n享\n京\n亭\n亮\n亲\n亳\n亵\n人\n亿\n什\n仁\n仄\n仅\n仇\n今\n介\n仍\n从\n仑\n仓\n仔\n仕\n他\n仗\n付\n仙\n仡\n代\n令\n以\n仨\n仪\n们\n仰\n仲\n件\n价\n任\n份\n仿\n企\n伉\n伊\n伍\n伎\n伏\n伐\n休\n众\n优\n伙\n会\n伞\n伟\n传\n伢\n伤\n伦\n伪\n伯\n估\n伴\n伶\n伸\n伺\n似\n伽\n佃\n但\n位\n低\n住\n佐\n佑\n体\n何\n佘\n余\n佛\n作\n佟\n你\n佣\n佩\n佬\n佳\n佶\n佼\n使\n侃\n侄\n侈\n例\n侍\n侑\n侗\n供\n依\n侠\n侣\n侥\n侦\n侧\n侨\n侬\n侮\n侯\n侵\n便\n促\n俄\n俊\n俏\n俐\n俗\n俘\n俚\n保\n俞\n信\n俨\n俩\n俪\n俭\n修\n俯\n俱\n俸\n俺\n俾\n倍\n倒\n倘\n候\n倚\n倜\n借\n倡\n倦\n倩\n倪\n债\n值\n倾\n假\n偏\n做\n停\n健\n偶\n偷\n偿\n傅\n傍\n傥\n储\n催\n傲\n傻\n像\n僚\n僧\n僮\n僵\n僻\n儒\n儿\n兀\n允\n元\n兄\n充\n兆\n先\n光\n克\n免\n兑\n兔\n兖\n党\n兜\n兢\n入\n全\n八\n公\n六\n兰\n共\n关\n兴\n兵\n其\n具\n典\n兹\n养\n兼\n兽\n冀\n内\n冈\n冉\n册\n再\n冒\n冕\n写\n军\n农\n冠\n冤\n冥\n冬\n冯\n冰\n冲\n决\n况\n冶\n冷\n冻\n净\n凄\n准\n凇\n凉\n凋\n凌\n减\n凑\n凝\n几\n凡\n凤\n凭\n凯\n凰\n凳\n凶\n凸\n凹\n出\n击\n函\n凿\n刀\n刁\n刃\n分\n切\n刊\n刑\n划\n列\n刘\n则\n刚\n创\n初\n删\n判\n刨\n利\n别\n刮\n到\n制\n刷\n券\n刹\n刺\n刻\n剁\n剂\n剃\n削\n前\n剐\n剑\n剔\n剖\n剥\n剧\n剩\n剪\n副\n割\n剽\n剿\n劈\n力\n劝\n办\n功\n加\n务\n劣\n动\n助\n努\n劫\n励\n劲\n劳\n劵\n势\n勃\n勇\n勉\n勋\n勒\n勘\n募\n勤\n勺\n勾\n勿\n匀\n包\n匆\n匈\n匕\n化\n北\n匙\n匝\n匠\n匡\n匣\n匪\n匮\n匹\n区\n医\n匾\n匿\n十\n千\n升\n午\n卉\n半\n华\n协\n卑\n卒\n卓\n单\n卖\n南\n博\n卜\n卞\n占\n卡\n卢\n卤\n卦\n卧\n卫\n卯\n印\n危\n卲\n即\n却\n卵\n卷\n卸\n卿\n厂\n厄\n厅\n历\n厉\n压\n厌\n厕\n厘\n厚\n原\n厢\n厥\n厦\n厨\n厩\n厮\n去\n县\n参\n又\n叉\n及\n友\n双\n反\n发\n叔\n取\n受\n变\n叙\n叛\n叠\n口\n古\n句\n另\n叨\n叩\n只\n叫\n召\n叭\n叮\n可\n台\n叱\n史\n右\n叵\n叶\n号\n司\n叹\n叼\n吁\n吃\n各\n吆\n合\n吉\n吊\n同\n名\n后\n吏\n吐\n向\n吓\n吕\n吗\n君\n吝\n吞\n吟\n否\n吧\n吨\n吩\n含\n听\n吭\n启\n吴\n吵\n吸\n吹\n吻\n吼\n吾\n吿\n呀\n呃\n呆\n呈\n告\n呐\n呕\n呗\n员\n呛\n呜\n呢\n呦\n周\n呲\n味\n呵\n呼\n命\n咀\n咄\n咋\n和\n咎\n咏\n咐\n咒\n咔\n咕\n咖\n咚\n咣\n咤\n咧\n咨\n咪\n咫\n咬\n咯\n咱\n咳\n咸\n咽\n哀\n品\n哄\n哆\n哇\n哈\n哉\n响\n哎\n哑\n哒\n哗\n哟\n哥\n哦\n哨\n哪\n哭\n哲\n哺\n哼\n哽\n唁\n唇\n唉\n唏\n唐\n唠\n唤\n唬\n售\n唯\n唱\n唾\n啃\n商\n啊\n啕\n啡\n啤\n啥\n啦\n啧\n啪\n啬\n啰\n啵\n啶\n啸\n啼\n喀\n喂\n善\n喆\n喇\n喉\n喊\n喔\n喘\n喜\n喝\n喧\n喱\n
喵\n喷\n喻\n喽\n嗅\n嗑\n嗒\n嗓\n嗡\n嗣\n嗤\n嗦\n嗨\n嗬\n嗯\n嗲\n嗷\n嗽\n嘀\n嘉\n嘎\n嘘\n嘛\n嘟\n嘭\n嘱\n嘲\n嘴\n嘻\n噎\n器\n噩\n噪\n噬\n噱\n噼\n嚎\n嚏\n嚓\n嚣\n嚷\n嚼\n囊\n囚\n四\n回\n因\n团\n囤\n囧\n园\n困\n围\n固\n国\n图\n圆\n圈\n土\n圣\n在\n圩\n圪\n圭\n地\n圳\n场\n圾\n址\n坂\n均\n坊\n坍\n坎\n坏\n坐\n坑\n块\n坚\n坛\n坝\n坞\n坟\n坠\n坡\n坤\n坦\n坪\n坯\n坷\n垂\n垃\n垄\n垅\n型\n垌\n垒\n垛\n垢\n垣\n垤\n垦\n垫\n垮\n埃\n埋\n城\n埔\n埜\n域\n培\n基\n堂\n堆\n堕\n堡\n堤\n堪\n堰\n堵\n塌\n塑\n塔\n塘\n塞\n填\n塬\n塾\n境\n墅\n墓\n墙\n增\n墟\n墨\n墩\n壁\n壑\n壕\n壤\n士\n壮\n声\n壳\n壶\n壹\n处\n备\n复\n夏\n夕\n外\n夙\n多\n夜\n够\n大\n天\n太\n夫\n夭\n央\n夯\n失\n头\n夷\n夸\n夹\n夺\n奂\n奇\n奈\n奉\n奋\n奎\n奏\n契\n奔\n奕\n奖\n套\n奘\n奚\n奠\n奢\n奥\n女\n奴\n奶\n奸\n她\n好\n如\n妃\n妄\n妆\n妇\n妈\n妊\n妍\n妒\n妖\n妙\n妞\n妤\n妥\n妧\n妨\n妩\n妮\n妯\n妹\n妻\n姆\n姊\n始\n姐\n姑\n姓\n委\n姗\n姚\n姜\n姝\n姣\n姥\n姨\n姬\n姻\n姿\n威\n娃\n娄\n娅\n娇\n娌\n娘\n娜\n娟\n娠\n娥\n娩\n娱\n娴\n娶\n娼\n婀\n婆\n婉\n婕\n婚\n婧\n婪\n婴\n婵\n婶\n婷\n婿\n媒\n媚\n媛\n媞\n媲\n媳\n嫁\n嫂\n嫉\n嫌\n嫔\n嫖\n嫚\n嫣\n嫦\n嫩\n嬉\n嬛\n嬷\n孀\n子\n孔\n孕\n字\n存\n孙\n孚\n孜\n孝\n孟\n孢\n季\n孤\n学\n孩\n孪\n孰\n孱\n孵\n孺\n宁\n它\n宅\n宇\n守\n安\n宋\n完\n宏\n宓\n宕\n宗\n官\n宙\n定\n宛\n宜\n宝\n实\n宠\n审\n客\n宣\n室\n宦\n宪\n宫\n宰\n害\n宴\n宵\n家\n宸\n容\n宽\n宾\n宿\n寂\n寄\n寅\n密\n寇\n富\n寐\n寒\n寓\n寝\n寞\n察\n寡\n寥\n寨\n寮\n寰\n寸\n对\n寺\n寻\n导\n寿\n封\n射\n将\n尊\n小\n少\n尔\n尖\n尘\n尚\n尝\n尤\n尧\n尬\n就\n尴\n尸\n尹\n尺\n尼\n尽\n尾\n尿\n局\n屁\n层\n居\n屈\n届\n屋\n屌\n屎\n屏\n屑\n展\n属\n屠\n屡\n履\n屯\n山\n屹\n屿\n岁\n岂\n岌\n岐\n岔\n岖\n岗\n岚\n岛\n岩\n岬\n岭\n岱\n岳\n岷\n岸\n峁\n峙\n峡\n峥\n峨\n峪\n峭\n峰\n峻\n崂\n崃\n崇\n崎\n崔\n崖\n崛\n崧\n崩\n崭\n崴\n嵋\n嵌\n嵘\n嵛\n嵩\n嶝\n巅\n巍\n川\n州\n巡\n巢\n工\n左\n巧\n巨\n巩\n巫\n差\n己\n已\n巴\n巷\n巾\n巿\n币\n市\n布\n帅\n帆\n师\n希\n帐\n帕\n帖\n帘\n帚\n帜\n帝\n带\n席\n帮\n帷\n常\n帼\n帽\n幂\n幄\n幅\n幌\n幕\n幢\n干\n平\n年\n并\n幸\n幺\n幻\n幼\n幽\n广\n庄\n庆\n庇\n床\n序\n庐\n库\n应\n底\n店\n庙\n庚\n府\n庞\n废\n度\n座\n庭\n庵\n康\n庸\n庾\n廉\n廊\n廓\n廖\n延\n廷\n建\n开\n异\n弃\n弄\n弈\n弊\n式\n弓\n引\n弗\n弘\n弛\n弟\n张\n弥\n弦\n弧\n弩\n弯\n弱\n弹\n强\n归\n当\n录\n彝\n形\n彤\n彦\n彩\n彪\n彬\n彭\n彰\n影\n彷\n役\n彻\n彼\n彿\n往\n征\n径\n待\n徇\n很\n徉\n徊\n律\n徐\n徒\n得\n徘\n徙\n御\n循\n微\n德\n徽\n心\n必\n忆\n忌\n忍\n忐\n忑\n志\n忘\n忙\n忠\n忧\n忪\n快\n忱\n念\n忽\n怀\n态\n怂\n怎\n怒\n怕\n怖\n怜\n思\n怠\n怡\n急\n怦\n性\n怨\n怪\n怯\n怵\n总\n恋\n恍\n恐\n恒\n恙\n恢\n恣\n恤\n恨\n恩\n恪\n恬\n恭\n息\n恰\n恳\n恶\n恸\n恺\n恼\n恿\n悄\n悉\n悍\n悔\n悖\n悚\n悟\n悠\n患\n悦\n您\n悬\n悯\n悲\n悴\n悸\n悼\n情\n惊\n惋\n惑\n惕\n惚\
n惜\n惟\n惠\n惦\n惧\n惨\n惩\n惫\n惬\n惮\n惯\n惰\n想\n惶\n惹\n惺\n愁\n愈\n愉\n意\n愕\n愚\n感\n愤\n愧\n愿\n慈\n慌\n慎\n慑\n慕\n慢\n慧\n慨\n慰\n慷\n憋\n憔\n憧\n憨\n憩\n憬\n憷\n憾\n懂\n懈\n懊\n懋\n懒\n懵\n懿\n戈\n戎\n戏\n成\n我\n戒\n或\n战\n戚\n戛\n戟\n截\n戬\n戮\n戳\n戴\n户\n房\n所\n扁\n扇\n扉\n手\n才\n扎\n扑\n扒\n打\n扔\n托\n扛\n扣\n执\n扩\n扫\n扬\n扭\n扮\n扯\n扰\n扳\n扶\n批\n扼\n找\n承\n技\n抄\n抉\n把\n抑\n抒\n抓\n投\n抖\n抗\n折\n抚\n抛\n抠\n抡\n抢\n护\n报\n抨\n披\n抬\n抱\n抵\n抹\n押\n抽\n抿\n拄\n担\n拆\n拇\n拈\n拉\n拌\n拍\n拎\n拐\n拒\n拓\n拔\n拖\n拗\n拘\n拙\n招\n拜\n拟\n拢\n拣\n拥\n拦\n拧\n拨\n择\n括\n拭\n拮\n拯\n拱\n拳\n拴\n拷\n拼\n拽\n拾\n拿\n持\n挂\n指\n按\n挎\n挑\n挖\n挚\n挛\n挝\n挟\n挠\n挡\n挣\n挤\n挥\n挨\n挪\n挫\n振\n挺\n挽\n捂\n捅\n捆\n捉\n捍\n捎\n捏\n捐\n捕\n捞\n损\n捡\n换\n捣\n捧\n据\n捷\n捺\n捻\n掀\n掂\n授\n掉\n掌\n掏\n掐\n排\n掖\n掘\n掠\n探\n掣\n接\n控\n推\n掩\n措\n掬\n掮\n掰\n掴\n掷\n掺\n揉\n揍\n描\n提\n插\n握\n揣\n揩\n揪\n揭\n援\n揽\n搀\n搁\n搂\n搅\n搏\n搜\n搞\n搡\n搪\n搬\n搭\n携\n搽\n摁\n摄\n摆\n摇\n摊\n摒\n摔\n摘\n摧\n摩\n摸\n摹\n撂\n撇\n撑\n撒\n撕\n撞\n撤\n撩\n撬\n播\n撮\n撰\n撵\n撸\n撼\n擂\n擅\n操\n擎\n擒\n擘\n擞\n擦\n攀\n攒\n攥\n支\n收\n改\n攻\n放\n政\n故\n效\n敌\n敏\n救\n敖\n教\n敛\n敝\n敞\n敢\n散\n敦\n敬\n数\n敲\n整\n敷\n文\n斌\n斐\n斑\n斓\n斗\n料\n斛\n斜\n斟\n斤\n斥\n斧\n斩\n断\n斯\n新\n方\n施\n旁\n旅\n旋\n族\n旗\n无\n既\n日\n旦\n旧\n旨\n早\n旬\n旭\n旱\n时\n旷\n旺\n昀\n昂\n昆\n昊\n昌\n明\n昏\n易\n昔\n昕\n昙\n星\n映\n春\n昧\n昨\n昭\n是\n昱\n昵\n昼\n显\n晃\n晋\n晏\n晒\n晓\n晔\n晕\n晖\n晗\n晚\n晟\n晤\n晦\n晨\n普\n景\n晰\n晴\n晶\n智\n晾\n暂\n暄\n暇\n暑\n暖\n暗\n暧\n暨\n暮\n暴\n曙\n曝\n曦\n曰\n曲\n更\n曹\n曼\n曾\n替\n最\n月\n有\n朋\n服\n朐\n朔\n朗\n望\n朝\n期\n朦\n木\n未\n末\n本\n札\n术\n朱\n朴\n朵\n机\n朽\n杀\n杂\n权\n杆\n杉\n李\n杏\n材\n村\n杖\n杜\n杞\n束\n杠\n条\n来\n杨\n杭\n杯\n杰\n杳\n松\n板\n极\n构\n枉\n析\n枕\n林\n枚\n果\n枝\n枞\n枢\n枣\n枪\n枫\n枭\n枯\n架\n枷\n柄\n柏\n某\n染\n柔\n柜\n柞\n柠\n查\n柬\n柯\n柱\n柳\n柴\n柿\n栅\n标\n栈\n栋\n栏\n树\n栓\n栖\n栗\n校\n株\n样\n核\n根\n格\n栽\n栾\n桂\n桃\n框\n案\n桉\n桌\n桎\n桐\n桑\n桓\n桔\n档\n桥\n桦\n桩\n桶\n梁\n梅\n梓\n梗\n梦\n梧\n梨\n梭\n梯\n械\n梳\n梵\n检\n棉\n棋\n棍\n棒\n棕\n棘\n棚\n棠\n森\n棱\n棵\n棺\n椅\n椋\n植\n椎\n椒\n椰\n椿\n楂\n楔\n楚\n楞\n楠\n楣\n楷\n楼\n概\n榄\n榆\n榈\n榉\n榔\n榕\n榜\n榨\n榭\n榴\n榷\n榻\n槌\n槎\n槐\n槛\n槟\n槽\n槿\n樊\n樟\n模\n横\n樱\n橄\n橘\n橙\n橡\n橱\n檀\n檐\n檬\n欠\n次\n欢\n欣\n欧\n欲\n欺\n款\n歆\n歇\n歉\n歌\n止\n正\n此\n步\n武\n歧\n歪\n歹\n死\n殃\n殆\n殉\n殊\n残\n殒\n殓\n殖\n殚\n殡\n殭\n殴\n段\n殷\n殿\n毁\n毂\n毅\n毋\n母\n每\n毒\n毓\n比\n毕\n毗\n毙\n毛\n毫\n毯\n毽\n氏\n民\n氓\n气\n氛\n氟\n氢\n氦\n氧\n氨\n氪\n氮\n氯\n氰\n水
\n永\n汀\n汁\n求\n汇\n汉\n汕\n汗\n汛\n汝\n汞\n江\n池\n污\n汤\n汪\n汰\n汲\n汴\n汶\n汹\n汽\n汾\n沁\n沃\n沅\n沈\n沉\n沏\n沐\n沓\n沙\n沛\n沟\n没\n沣\n沥\n沦\n沧\n沪\n沫\n沮\n沱\n河\n沸\n油\n治\n沼\n沽\n沾\n沿\n泄\n泉\n泊\n泌\n泓\n泔\n法\n泗\n泛\n泞\n泠\n泡\n波\n泣\n泥\n注\n泪\n泯\n泰\n泱\n泳\n泵\n泷\n泸\n泻\n泼\n泽\n泾\n洁\n洋\n洒\n洗\n洙\n洛\n洞\n津\n洪\n洱\n洲\n洵\n活\n洼\n洽\n派\n流\n浅\n浆\n浇\n浈\n浊\n测\n济\n浏\n浑\n浓\n浙\n浚\n浦\n浩\n浪\n浮\n浴\n海\n浸\n涂\n涅\n消\n涉\n涌\n涎\n涓\n涕\n涛\n涝\n涞\n涠\n涡\n涤\n润\n涧\n涨\n涩\n涮\n涯\n液\n涵\n涿\n淀\n淄\n淆\n淇\n淋\n淌\n淑\n淖\n淘\n淝\n淞\n淡\n淤\n淫\n淮\n深\n淳\n混\n淹\n添\n淼\n渀\n清\n渊\n渍\n渎\n渐\n渔\n渗\n渚\n渝\n渠\n渡\n渣\n渤\n渥\n温\n渭\n港\n渲\n渴\n游\n渺\n湃\n湍\n湖\n湘\n湛\n湾\n湿\n溃\n溅\n溉\n源\n溜\n溢\n溥\n溧\n溪\n溯\n溶\n溺\n滁\n滇\n滋\n滑\n滔\n滕\n滚\n滞\n满\n滢\n滤\n滥\n滨\n滩\n滴\n漂\n漆\n漏\n漓\n演\n漕\n漠\n漩\n漫\n漭\n漯\n漱\n漳\n漾\n潇\n潘\n潜\n潞\n潢\n潭\n潮\n潼\n澄\n澈\n澎\n澜\n澡\n澳\n激\n濑\n濒\n濠\n濡\n濮\n瀑\n瀚\n瀛\n灌\n灞\n火\n灭\n灯\n灰\n灵\n灶\n灼\n灾\n灿\n炅\n炉\n炊\n炎\n炒\n炕\n炖\n炙\n炜\n炫\n炬\n炭\n炮\n炯\n炳\n炷\n炸\n点\n炼\n炽\n烁\n烂\n烃\n烈\n烊\n烘\n烙\n烟\n烤\n烦\n烧\n烨\n烫\n热\n烯\n烷\n烹\n烽\n焉\n焊\n焕\n焖\n焘\n焚\n焦\n焯\n焰\n焱\n然\n煊\n煌\n煎\n煜\n煞\n煤\n煦\n照\n煮\n煲\n熄\n熊\n熏\n熔\n熙\n熟\n熠\n熨\n熬\n熹\n燃\n燊\n燎\n燕\n燥\n爆\n爪\n爬\n爱\n爵\n父\n爷\n爸\n爹\n爽\n片\n版\n牌\n牙\n牛\n牟\n牡\n牢\n牧\n物\n牲\n牵\n特\n牺\n牾\n犀\n犊\n犒\n犬\n犯\n状\n犷\n犹\n狂\n狄\n狈\n狐\n狗\n狙\n狞\n狠\n狡\n狩\n独\n狭\n狮\n狰\n狱\n狸\n狼\n猎\n猖\n猛\n猜\n猝\n猥\n猩\n猪\n猫\n猬\n献\n猴\n猾\n猿\n獒\n獗\n獾\n玄\n率\n玉\n王\n玖\n玛\n玟\n玥\n玩\n玫\n玮\n环\n现\n玲\n玳\n玺\n玻\n珀\n珉\n珊\n珍\n珏\n珑\n珜\n珠\n班\n珮\n珲\n珺\n球\n琅\n理\n琉\n琊\n琏\n琐\n琛\n琢\n琥\n琦\n琪\n琬\n琰\n琳\n琴\n琵\n琶\n琼\n瑁\n瑄\n瑕\n瑙\n瑚\n瑛\n瑜\n瑞\n瑟\n瑰\n瑶\n瑾\n璀\n璃\n璇\n璋\n璐\n璞\n璧\n璨\n瓜\n瓢\n瓣\n瓦\n瓮\n瓯\n瓶\n瓷\n甄\n甘\n甚\n甜\n生\n甥\n用\n甩\n甫\n甬\n田\n由\n甲\n申\n电\n男\n甸\n町\n画\n畅\n畊\n界\n畏\n畔\n留\n畜\n略\n番\n畴\n畸\n畿\n疃\n疆\n疏\n疑\n疗\n疚\n疝\n疤\n疫\n疯\n疲\n疵\n疹\n疼\n疾\n病\n症\n痉\n痊\n痒\n痕\n痘\n痛\n痣\n痪\n痫\n痰\n痱\n痴\n痹\n痼\n瘀\n瘁\n瘟\n瘠\n瘤\n瘦\n瘩\n瘪\n瘫\n瘸\n瘾\n癌\n癖\n癣\n癫\n登\n白\n百\n皂\n的\n皆\n皇\n皋\n皎\n皓\n皖\n皙\n皮\n皱\n盆\n盈\n益\n盎\n盐\n监\n盒\n盔\n盖\n盗\n盘\n盛\n盟\n目\n盯\n盲\n直\n相\n盹\n盼\n盾\n省\n眈\n眉\n看\n真\n眠\n眨\n眬\n眯\n眶\n眷\n眺\n眼\n着\n睁\n睐\n睛\n睡\n督\n睦\n睫\n睬\n睹\n睿\n瞄\n瞅\n瞌\n瞎\n瞒\n瞟\n瞧\n瞩\n瞪\n瞬\n瞰\n瞳\n瞻\n瞿\n矗\n矛\n矜\n矢\n矣\n知\n矩\n矫\n短\n矮\n石\n矶\n矿\n码\n砂\n砌\n砍\n砒\n研\n砖\n砚\n砝\n砥\n砰\n砲\n破\n砷\n砸\n砺\n砾\n础\n硅\n硕\n硚\n硝\n硫\n
硬\n确\n碉\n碌\n碍\n碎\n碑\n碗\n碘\n碚\n碟\n碧\n碰\n碱\n碳\n碴\n碾\n磁\n磅\n磊\n磋\n磐\n磕\n磡\n磨\n磴\n磷\n磺\n礁\n示\n礼\n社\n祁\n祈\n祉\n祖\n祛\n祝\n神\n祠\n祢\n祥\n票\n祭\n祯\n祷\n祸\n祺\n禀\n禁\n禄\n禅\n福\n禧\n禹\n禺\n离\n禽\n禾\n秀\n私\n秃\n秆\n秉\n秋\n种\n科\n秒\n秘\n租\n秣\n秤\n秦\n秧\n秩\n积\n称\n秸\n移\n秽\n稀\n程\n稍\n税\n稚\n稠\n稣\n稳\n稻\n稼\n稽\n稿\n穆\n穗\n穴\n究\n穷\n空\n穿\n突\n窃\n窄\n窈\n窍\n窑\n窒\n窕\n窖\n窗\n窘\n窜\n窝\n窟\n窥\n窦\n窨\n窿\n立\n竖\n站\n竞\n竟\n章\n竣\n童\n竭\n端\n竲\n竹\n竺\n竽\n竿\n笃\n笈\n笋\n笑\n笔\n笙\n笛\n符\n笨\n第\n笼\n等\n筋\n筐\n筑\n筒\n答\n策\n筛\n筱\n筵\n筷\n筹\n签\n简\n箍\n算\n管\n箫\n箭\n箱\n篇\n篡\n篪\n篮\n篷\n簇\n簧\n簸\n簿\n籁\n籍\n米\n类\n籽\n粉\n粒\n粕\n粗\n粘\n粟\n粤\n粥\n粪\n粮\n粱\n粹\n精\n糊\n糕\n糖\n糗\n糙\n糟\n糯\n系\n紊\n素\n索\n紧\n紫\n累\n絮\n綦\n繁\n纠\n红\n纣\n纤\n约\n级\n纪\n纬\n纯\n纰\n纱\n纲\n纳\n纵\n纶\n纷\n纸\n纹\n纺\n纽\n线\n练\n组\n绅\n细\n织\n终\n绊\n绌\n绍\n绎\n经\n绑\n绒\n结\n绕\n绘\n给\n绚\n络\n绝\n绞\n统\n绣\n继\n绩\n绪\n续\n绮\n绯\n绰\n绳\n维\n绵\n绷\n绸\n综\n绽\n绿\n缀\n缄\n缅\n缆\n缇\n缉\n缓\n缔\n缕\n编\n缘\n缙\n缚\n缜\n缝\n缠\n缤\n缨\n缩\n缪\n缭\n缮\n缰\n缴\n缸\n缺\n罂\n罄\n罐\n网\n罕\n罗\n罚\n罡\n罢\n罩\n罪\n置\n署\n罹\n羁\n羊\n美\n羚\n羞\n羡\n羣\n群\n羲\n羹\n羽\n羿\n翁\n翅\n翌\n翔\n翘\n翟\n翠\n翡\n翩\n翰\n翱\n翻\n翼\n耀\n老\n考\n耄\n者\n耋\n而\n耍\n耐\n耒\n耕\n耗\n耘\n耳\n耶\n耷\n耸\n耻\n耽\n耿\n聂\n聆\n聊\n聋\n职\n联\n聘\n聚\n聪\n肃\n肆\n肇\n肉\n肋\n肌\n肖\n肘\n肚\n肛\n肝\n肠\n股\n肢\n肤\n肥\n肩\n肪\n肮\n肯\n育\n肴\n肺\n肾\n肿\n胀\n胁\n胃\n胆\n背\n胎\n胖\n胚\n胛\n胜\n胞\n胡\n胤\n胧\n胫\n胯\n胰\n胱\n胳\n胶\n胸\n胺\n能\n脂\n脆\n脉\n脊\n脍\n脏\n脐\n脑\n脖\n脚\n脯\n脱\n脸\n脾\n腆\n腊\n腋\n腌\n腐\n腑\n腓\n腔\n腕\n腥\n腩\n腰\n腱\n腹\n腺\n腻\n腼\n腾\n腿\n膀\n膊\n膏\n膑\n膛\n膜\n膝\n膨\n膳\n膺\n臀\n臂\n臃\n臆\n臣\n自\n臭\n至\n致\n臻\n舀\n舅\n舆\n舌\n舍\n舒\n舛\n舜\n舞\n舟\n航\n般\n舰\n舱\n舵\n舶\n舸\n船\n艇\n艋\n艘\n良\n艰\n色\n艳\n艺\n艾\n节\n芊\n芋\n芒\n芙\n芜\n芝\n芦\n芬\n芭\n芮\n芯\n花\n芳\n芷\n芸\n芽\n苇\n苍\n苏\n苑\n苗\n苛\n苟\n苡\n苣\n若\n苦\n苯\n英\n苹\n茁\n茂\n范\n茄\n茅\n茆\n茎\n茗\n茜\n茨\n茫\n茵\n茶\n茸\n茹\n荃\n荆\n草\n荐\n荒\n荔\n荚\n荞\n荟\n荡\n荣\n荤\n荧\n荫\n药\n荷\n荼\n莅\n莆\n莉\n莎\n莓\n莘\n莞\n莠\n莫\n莱\n莲\n莴\n获\n莹\n莺\n莽\n菁\n菇\n菊\n菌\n菜\n菠\n菡\n菩\n菱\n菲\n萃\n萄\n萋\n萌\n萍\n萎\n萝\n萤\n营\n萦\n萧\n萨\n萱\n落\n葆\n著\n葛\n葡\n董\n葩\n葫\n葬\n葱\n葵\n蒂\n蒋\n蒙\n蒜\n蒲\n蒸\n蒿\n蓁\n蓄\n蓉\n蓝\n蓟\n蓬\n蔑\n蔓\n蔗\n蔚\n蔡\n蔫\n蔬\n蔷\n蔺\n蔽\n蕉\n蕊\n蕙\n蕲\n蕴\n蕾\n薄\n薇\n薛\n薪\n薯\n薰\n藏\n藜\n藤\n藩\n藻\n蘑\n虎\n虐\n虑\n虚\n虞\n虫\n虱\n虹\n虽\n虾\n蚀\n蚁\n蚂\n蚊\n蚌\n蚓\n蚕\n蚝\n蚣\n蚯\n蛀\n蛇\n蛋\n蛐\n蛙\
n蛛\n蛟\n蛮\n蛰\n蜀\n蜂\n蜇\n蜈\n蜊\n蜒\n蜓\n蜕\n蜘\n蜚\n蜜\n蜡\n蜥\n蜴\n蜷\n蜿\n蝇\n蝉\n蝎\n蝗\n蝙\n蝠\n蝴\n蝶\n螂\n螃\n融\n螳\n螺\n蟑\n蟹\n蠢\n血\n衅\n行\n衍\n衔\n街\n衙\n衡\n衣\n补\n表\n衫\n衬\n衰\n衷\n袁\n袂\n袄\n袆\n袈\n袋\n袍\n袒\n袖\n袜\n被\n袭\n袱\n裁\n裂\n装\n裆\n裔\n裕\n裙\n裟\n裤\n裳\n裴\n裸\n裹\n褂\n褒\n褓\n褚\n褛\n褪\n褴\n褶\n襁\n襄\n襟\n西\n要\n覃\n覆\n见\n观\n规\n觅\n视\n览\n觉\n觊\n觎\n觐\n觑\n角\n解\n觥\n触\n言\n詹\n誉\n誓\n警\n譬\n计\n订\n认\n讧\n讨\n让\n讪\n训\n议\n讯\n记\n讲\n讳\n讶\n许\n讹\n论\n讼\n讽\n设\n访\n诀\n证\n评\n诅\n识\n诈\n诉\n诊\n词\n译\n诓\n试\n诗\n诙\n诚\n话\n诞\n诟\n诠\n诡\n询\n该\n详\n诧\n诩\n诫\n诬\n语\n误\n诱\n诲\n说\n诵\n诶\n请\n诸\n诺\n读\n诽\n课\n诿\n谀\n谁\n调\n谅\n谈\n谊\n谋\n谌\n谍\n谎\n谐\n谑\n谓\n谕\n谙\n谚\n谜\n谢\n谣\n谤\n谦\n谨\n谩\n谬\n谭\n谱\n谴\n谷\n豁\n豆\n豚\n象\n豪\n豫\n豹\n貅\n貉\n貌\n貔\n贝\n贞\n负\n贡\n财\n责\n贤\n败\n账\n货\n质\n贩\n贪\n贫\n贬\n购\n贮\n贯\n贱\n贴\n贵\n贷\n贸\n费\n贺\n贼\n贾\n贿\n赁\n赂\n赃\n资\n赋\n赌\n赎\n赏\n赐\n赔\n赖\n赘\n赚\n赛\n赝\n赞\n赠\n赡\n赢\n赣\n赤\n赦\n赫\n走\n赴\n赵\n赶\n起\n趁\n超\n越\n趋\n趟\n趣\n足\n趴\n趸\n趾\n跃\n跄\n跆\n跌\n跑\n跛\n距\n跟\n跤\n跨\n跪\n路\n跳\n践\n跷\n跺\n跻\n踉\n踊\n踏\n踝\n踞\n踢\n踩\n踪\n踵\n踹\n蹂\n蹄\n蹈\n蹊\n蹚\n蹦\n蹬\n蹭\n蹲\n蹴\n蹶\n蹼\n蹿\n躁\n躏\n身\n躬\n躯\n躲\n躺\n车\n轧\n轨\n轩\n转\n轮\n软\n轰\n轴\n轶\n轻\n载\n轿\n较\n辄\n辅\n辆\n辈\n辉\n辍\n辐\n辑\n输\n辖\n辗\n辘\n辙\n辛\n辜\n辞\n辟\n辣\n辨\n辩\n辫\n辰\n辱\n边\n辽\n达\n迁\n迂\n迄\n迅\n过\n迈\n迎\n运\n近\n返\n还\n这\n进\n远\n违\n连\n迟\n迢\n迥\n迪\n迫\n迭\n述\n迷\n迸\n迹\n追\n退\n送\n适\n逃\n逅\n逆\n选\n逊\n逍\n透\n逐\n递\n途\n逗\n通\n逛\n逝\n逞\n速\n造\n逡\n逢\n逮\n逵\n逸\n逻\n逼\n逾\n遁\n遂\n遇\n遍\n遏\n遐\n道\n遗\n遛\n遢\n遣\n遥\n遨\n遭\n遮\n遴\n遵\n避\n邀\n邂\n邃\n邋\n邑\n邓\n邛\n邝\n邢\n那\n邦\n邪\n邬\n邮\n邯\n邱\n邵\n邹\n邺\n邻\n郁\n郊\n郎\n郑\n郜\n郝\n郡\n部\n郫\n郭\n郸\n都\n鄂\n鄙\n鄞\n鄢\n酋\n酌\n配\n酒\n酗\n酝\n酣\n酪\n酬\n酯\n酱\n酵\n酶\n酷\n酸\n酿\n醇\n醉\n醋\n醍\n醐\n醒\n醛\n采\n釉\n释\n里\n重\n野\n量\n金\n釜\n鉴\n鏖\n鑫\n针\n钉\n钊\n钓\n钛\n钝\n钞\n钟\n钠\n钢\n钥\n钦\n钧\n钩\n钮\n钰\n钱\n钵\n钻\n钾\n铀\n铁\n铂\n铃\n铅\n铆\n铉\n铎\n铐\n铜\n铝\n铠\n铣\n铨\n铬\n铭\n铮\n铰\n铲\n银\n铸\n铺\n链\n铿\n销\n锁\n锂\n锄\n锅\n锆\n锈\n锋\n锌\n锏\n锐\n错\n锜\n锟\n锡\n锢\n锣\n锤\n锥\n锦\n锭\n键\n锯\n锰\n锵\n锷\n锹\n锻\n镀\n镁\n镇\n镉\n镊\n镍\n镑\n镖\n镜\n镯\n镳\n镶\n长\n门\n闪\n闫\n闭\n问\n闯\n闰\n闲\n闳\n间\n闵\n闷\n闸\n闹\n闺\n闻\n闽\n阀\n阁\n阂\n阅\n阎\n阐\n阔\n阙\n阚\n阜\n队\n阮\n阱\n防\n阳\n阴\n阵\n阶\n阻\n阿\n陀\n陂\n附\n际\n陆\n陈\n陋\n陌\n降\n限\n陕\n陡\n院\n除\n陨\n险\n陪\n陬\n陵\n陶\n陷\n隅\n隆\n隋\n隍\n随\n隐\n隔\n隘\n隙\n障\n隧\n隶\n隼
\n隽\n难\n雀\n雁\n雄\n雅\n集\n雇\n雌\n雍\n雏\n雕\n雨\n雪\n雯\n雳\n零\n雷\n雾\n需\n霁\n霄\n霆\n震\n霈\n霉\n霍\n霎\n霏\n霖\n霜\n霞\n露\n霸\n霹\n霾\n靑\n青\n靓\n靖\n静\n靛\n非\n靠\n靡\n面\n革\n靳\n靴\n靶\n鞋\n鞍\n鞘\n鞠\n鞭\n韦\n韧\n韩\n韬\n音\n韵\n韶\n页\n顶\n顷\n项\n顺\n须\n顽\n顾\n顿\n颁\n颂\n预\n颅\n领\n颇\n颈\n颊\n颍\n颐\n频\n颓\n颖\n颗\n题\n颚\n颜\n额\n颠\n颤\n风\n飒\n飓\n飘\n飙\n飚\n飞\n食\n餐\n餮\n饕\n饥\n饪\n饭\n饮\n饰\n饱\n饲\n饵\n饶\n饺\n饼\n饽\n饿\n馀\n馅\n馆\n馈\n馊\n馋\n馑\n馒\n首\n馗\n香\n馥\n馨\n马\n驭\n驯\n驰\n驱\n驳\n驴\n驶\n驻\n驼\n驾\n驿\n骁\n骂\n骄\n骅\n骆\n骇\n骊\n骋\n验\n骏\n骐\n骑\n骗\n骚\n骜\n骤\n骥\n骨\n骷\n骸\n骼\n髅\n髋\n髓\n高\n髦\n鬼\n魁\n魂\n魄\n魅\n魇\n魏\n魔\n鱼\n鲁\n鲍\n鲜\n鲟\n鲨\n鲶\n鲷\n鲸\n鳄\n鳅\n鳌\n鳖\n鳝\n鳞\n鸟\n鸠\n鸡\n鸣\n鸥\n鸦\n鸭\n鸯\n鸳\n鸵\n鸽\n鸾\n鸿\n鹃\n鹅\n鹊\n鹏\n鹜\n鹞\n鹤\n鹭\n鹰\n鹿\n麋\n麒\n麓\n麟\n麦\n麻\n麾\n黄\n黍\n黎\n黏\n黑\n黔\n默\n黛\n黝\n黯\n鼎\n鼓\n鼠\n鼻\n鼾\n齐\n齿\n龄\n龙\n龚\n龟\n<eos>\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nimport sys\n\nimport numpy as np\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\n\nimport paddle\nimport soundfile as sf\n\n# TODO: Remove system path when deepspeech can be installed via pip.\nsys.path.append(os.path.join(MODULE_HOME, 'u2_conformer_aishell'))\nfrom deepspeech.exps.u2.config import get_cfg_defaults\nfrom deepspeech.utils.utility import UpdateConfig\nfrom .u2_conformer_tester import U2ConformerTester\n\n\n@moduleinfo(name=\"u2_conformer_aishell\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/asr\")\nclass U2Conformer(paddle.nn.Layer):\n    def __init__(self):\n        super(U2Conformer, self).__init__()\n\n        # resource\n        res_dir = os.path.join(MODULE_HOME, 'u2_conformer_aishell', 'assets')\n        conf_file = os.path.join(res_dir, 'conf/conformer.yaml')\n        checkpoint = os.path.join(res_dir, 'checkpoints/avg_20.pdparams')\n\n        # config\n        self.config = get_cfg_defaults()\n        self.config.merge_from_file(conf_file)\n\n        # TODO: Remove path updating snippet.\n        with UpdateConfig(self.config):\n            self.config.collator.vocab_filepath = os.path.join(res_dir, self.config.collator.vocab_filepath)\n           
 # self.config.collator.spm_model_prefix = os.path.join(res_dir, self.config.collator.spm_model_prefix)\n            self.config.collator.augmentation_config = os.path.join(res_dir, self.config.collator.augmentation_config)\n            self.config.model.cmvn_file = os.path.join(res_dir, self.config.model.cmvn_file)\n            self.config.decoding.decoding_method = 'attention_rescoring'\n            self.config.decoding.batch_size = 1\n\n        # model\n        self.tester = U2ConformerTester(self.config)\n        self.tester.setup_model()\n        self.tester.resume(checkpoint)\n\n    @staticmethod\n    def check_audio(audio_file):\n        sig, sample_rate = sf.read(audio_file)\n        assert sample_rate == 16000, 'Expected sample rate of input audio to be 16000, but got {}'.format(sample_rate)\n\n    @serving\n    def speech_recognize(self, audio_file, device='cpu'):\n        assert os.path.isfile(audio_file), 'File does not exist: {}'.format(audio_file)\n        self.check_audio(audio_file)\n\n        paddle.set_device(device)\n        return self.tester.test(audio_file)[0][0]\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/requirements.txt",
    "content": "loguru\nyacs\njsonlines\nscipy==1.2.1\nsentencepiece\nresampy==0.2.2\nSoundFile==0.9.0.post1\nsoxbindings\nkaldiio\ntypeguard\neditdistance\ntextgrid\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_aishell/u2_conformer_tester.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Evaluation for U2 model.\"\"\"\nimport os\nimport sys\n\nimport paddle\n\nfrom deepspeech.frontend.featurizer.text_featurizer import TextFeaturizer\nfrom deepspeech.io.collator import SpeechCollator\nfrom deepspeech.models.u2 import U2Model\nfrom deepspeech.utils import mp_tools\nfrom deepspeech.utils.utility import UpdateConfig\n\n\nclass U2ConformerTester:\n    def __init__(self, config):\n        self.config = config\n        self.collate_fn_test = SpeechCollator.from_config(config)\n        self._text_featurizer = TextFeaturizer(\n            unit_type=config.collator.unit_type, vocab_filepath=None, spm_model_prefix=config.collator.spm_model_prefix)\n\n    @mp_tools.rank_zero_only\n    @paddle.no_grad()\n    def test(self, audio_file):\n        self.model.eval()\n        cfg = self.config.decoding\n        collate_fn_test = self.collate_fn_test\n        audio, _ = collate_fn_test.process_utterance(audio_file=audio_file, transcript=\"Hello\")\n        audio_len = audio.shape[0]\n        audio = paddle.to_tensor(audio, dtype='float32')\n        audio_len = paddle.to_tensor(audio_len)\n        audio = paddle.unsqueeze(audio, axis=0)\n        vocab_list = collate_fn_test.vocab_list\n\n        text_feature = self.collate_fn_test.text_feature\n        result_transcripts = self.model.decode(\n            audio,\n            
audio_len,\n            text_feature=text_feature,\n            decoding_method=cfg.decoding_method,\n            lang_model_path=cfg.lang_model_path,\n            beam_alpha=cfg.alpha,\n            beam_beta=cfg.beta,\n            beam_size=cfg.beam_size,\n            cutoff_prob=cfg.cutoff_prob,\n            cutoff_top_n=cfg.cutoff_top_n,\n            num_processes=cfg.num_proc_bsearch,\n            ctc_weight=cfg.ctc_weight,\n            decoding_chunk_size=cfg.decoding_chunk_size,\n            num_decoding_left_chunks=cfg.num_decoding_left_chunks,\n            simulate_streaming=cfg.simulate_streaming)\n\n        return result_transcripts\n\n    def setup_model(self):\n        config = self.config.clone()\n        with UpdateConfig(config):\n            config.model.input_dim = self.collate_fn_test.feature_size\n            config.model.output_dim = self.collate_fn_test.vocab_size\n\n        self.model = U2Model.from_config(config.model)\n\n    def resume(self, checkpoint):\n        \"\"\"Load model parameters from the specified checkpoint file.\n        \"\"\"\n        model_dict = paddle.load(checkpoint)\n        self.model.set_state_dict(model_dict)\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/README.md",
    "content": "# u2_conformer_librispeech\n\n|模型名称|u2_conformer_librispeech|\n| :--- | :---: |\n|类别|语音-语音识别|\n|网络|Conformer|\n|数据集|LibriSpeech|\n|是否支持Fine-tuning|否|\n|模型大小|191MB|\n|最新更新日期|2021-11-01|\n|数据指标|英文WER 0.034|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nU2 Conformer模型是一种适用于英文和中文的end-to-end语音识别模型。u2_conformer_libirspeech采用了conformer的encoder和transformer的decoder的模型结构，并且使用了ctc-prefix beam search的方式进行一遍打分，再利用attention decoder进行二次打分的方式进行解码来得到最终结果。\n\nu2_conformer_libirspeech在英文开源语音数据集[LibriSpeech ASR corpus](http://www.openslr.org/12/)进行了预训练，该模型在其测试集上的WER指标是0.034655。\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/conformer.png\" hspace='10'/> <br />\n</p>\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/u2_conformer.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考:\n- [Unified Streaming and Non-streaming Two-pass End-to-end Model for Speech Recognition](https://arxiv.org/abs/2012.05481)\n- [Conformer: Convolution-augmented Transformer for Speech Recognition](https://arxiv.org/abs/2005.08100)\n\n## 二、安装\n\n- ### 1、系统依赖\n\n  - libsndfile\n    - Linux\n      ```shell\n      $ sudo apt-get install libsndfile\n      or\n      $ sudo yum install libsndfile\n      ```\n    - MacOs\n      ```\n      $ brew install libsndfile\n      ```\n\n- ### 2、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install u2_conformer_librispeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # 采样率为16k，格式为wav的英文语音音频\n    wav_file = '/PATH/TO/AUDIO'\n\n    model = hub.Module(\n        name='u2_conformer_librispeech',\n        
version='1.0.0')\n    text = model.speech_recognize(wav_file)\n\n    print(text)\n    ```\n\n- ### 2、API\n  - ```python\n    def check_audio(audio_file)\n    ```\n    - 检查输入音频的格式和采样率是否满足要求（采样率须为16000）\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n\n  - ```python\n    def speech_recognize(\n        audio_file,\n        device='cpu',\n    )\n    ```\n    - 将输入的音频识别成文字\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `text`：str类型，返回输入音频的识别文字结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m u2_conformer_librispeech\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要识别的音频的存放路径，确保部署服务的机器可访问\n    file = '/path/to/input.wav'\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"audio_file\"\n    data = {\"audio_file\": file}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/u2_conformer_librispeech\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install u2_conformer_librispeech\n  ```\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/assets/conf/augmentation.json",
    "content": "{}\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/assets/conf/conformer.yaml",
    "content": "# https://yaml.org/type/float.html\ndata:\n  train_manifest: data/manifest.test-clean\n  dev_manifest: data/manifest.test-clean\n  test_manifest: data/manifest.test-clean\n  min_input_len: 0.5  # seconds\n  max_input_len: 30.0 # seconds\n  min_output_len: 0.0 # tokens\n  max_output_len: 400.0 # tokens\n  min_output_input_ratio: 0.05\n  max_output_input_ratio: 100.0\n\ncollator:\n  vocab_filepath: data/vocab.txt\n  unit_type: 'spm'\n  spm_model_prefix: 'data/bpe_unigram_5000'\n  mean_std_filepath: \"\"\n  augmentation_config: conf/augmentation.json\n  batch_size: 16\n  raw_wav: True  # use raw_wav or kaldi feature\n  spectrum_type: fbank #linear, mfcc, fbank\n  feat_dim: 80\n  delta_delta: False\n  dither: 1.0\n  target_sample_rate: 16000\n  max_freq: None\n  n_fft: None\n  stride_ms: 10.0\n  window_ms: 25.0\n  use_dB_normalization: True\n  target_dB: -20\n  random_seed: 0\n  keep_transcription_text: False\n  sortagrad: True\n  shuffle_method: batch_shuffle\n  num_workers: 2\n\n\n# network architecture\nmodel:\n    cmvn_file: \"data/mean_std.json\"\n    cmvn_file_type: \"json\"\n    # encoder related\n    encoder: conformer\n    encoder_conf:\n        output_size: 256    # dimension of attention\n        attention_heads: 4\n        linear_units: 2048  # the number of units of position-wise feed forward\n        num_blocks: 12      # the number of encoder blocks\n        dropout_rate: 0.1\n        positional_dropout_rate: 0.1\n        attention_dropout_rate: 0.0\n        input_layer: conv2d # encoder input type, you can chose conv2d, conv2d6 and conv2d8\n        normalize_before: True\n        use_cnn_module: True\n        cnn_module_kernel: 15\n        activation_type: 'swish'\n        pos_enc_layer_type: 'rel_pos'\n        selfattention_layer_type: 'rel_selfattn'\n\n    # decoder related\n    decoder: transformer\n    decoder_conf:\n        attention_heads: 4\n        linear_units: 2048\n        num_blocks: 6\n        dropout_rate: 0.1\n        
positional_dropout_rate: 0.1\n        self_attention_dropout_rate: 0.0\n        src_attention_dropout_rate: 0.0\n\n    # hybrid CTC/attention\n    model_conf:\n        ctc_weight: 0.3\n        ctc_dropoutrate: 0.0\n        ctc_grad_norm_type: instance\n        lsm_weight: 0.1     # label smoothing option\n        length_normalized_loss: false\n\n\ntraining:\n  n_epoch: 120\n  accum_grad: 8\n  global_grad_clip: 3.0\n  optim: adam\n  optim_conf:\n    lr: 0.004\n    weight_decay: 1e-06\n  scheduler: warmuplr     # pytorch v1.1.0+ required\n  scheduler_conf:\n    warmup_steps: 25000\n    lr_decay: 1.0\n  log_interval: 100\n  checkpoint:\n    kbest_n: 50\n    latest_n: 5\n\n\ndecoding:\n  batch_size: 64\n  error_rate_type: wer\n  decoding_method: attention  # 'attention', 'ctc_greedy_search', 'ctc_prefix_beam_search', 'attention_rescoring'\n  lang_model_path: data/lm/common_crawl_00.prune01111.trie.klm\n  alpha: 2.5\n  beta: 0.3\n  beam_size: 10\n  cutoff_prob: 1.0\n  cutoff_top_n: 0\n  num_proc_bsearch: 8\n  ctc_weight: 0.5 # ctc weight for attention rescoring decode mode.\n  decoding_chunk_size: -1 # decoding chunk size. Defaults to -1.\n      # <0: for decoding, use full chunk.\n      # >0: for decoding, use fixed chunk size as set.\n      # 0: used for training, it's prohibited here.\n  num_decoding_left_chunks: -1  # number of left chunks for decoding. Defaults to -1.\n  simulate_streaming: False  # simulate streaming inference. Defaults to False.\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/assets/data/bpe_unigram_5000.vocab",
    "content": "<unk>\t0\n<s>\t0\n</s>\t0\n▁the\t-2.9911\ns\t-3.44691\n▁and\t-3.58286\n▁of\t-3.70894\n▁to\t-3.78001\n▁a\t-3.89871\n▁in\t-4.20996\n▁i\t-4.36145\n▁he\t-4.48281\n▁that\t-4.55289\ned\t-4.59016\n▁was\t-4.59181\n▁it\t-4.62484\n'\t-4.81583\n▁his\t-4.84177\ning\t-4.88039\n▁you\t-4.99998\n▁with\t-5.00838\n▁for\t-5.02039\nt\t-5.0555\n▁had\t-5.07751\n▁as\t-5.09744\n▁her\t-5.13191\n▁be\t-5.19505\n▁is\t-5.19882\n▁but\t-5.21324\n▁not\t-5.22608\n▁she\t-5.23394\nd\t-5.27841\n▁at\t-5.34023\n▁on\t-5.34498\nly\t-5.40443\n▁him\t-5.50709\n▁they\t-5.56045\n▁all\t-5.58704\n▁have\t-5.59768\n▁by\t-5.60002\n▁\t-5.60186\n▁so\t-5.61262\ne\t-5.61903\n▁this\t-5.62164\n▁my\t-5.64057\n▁which\t-5.64669\n▁me\t-5.69076\n▁said\t-5.70437\n▁from\t-5.70664\n▁one\t-5.7513\n▁were\t-5.78541\n▁we\t-5.82874\ny\t-5.85619\n▁no\t-5.88631\n▁there\t-5.90758\nn\t-5.91704\ner\t-5.92896\n▁or\t-5.93481\n▁an\t-5.95345\n▁when\t-5.96716\n▁are\t-6.01743\n▁their\t-6.0437\n▁would\t-6.05331\n▁if\t-6.06359\n▁what\t-6.0895\n▁them\t-6.08963\n▁who\t-6.10441\n▁do\t-6.134\n▁out\t-6.14848\n▁will\t-6.16929\n▁up\t-6.18755\nm\t-6.19966\n▁been\t-6.20889\n▁man\t-6.28662\n▁then\t-6.31167\n▁could\t-6.37658\nr\t-6.38978\np\t-6.401\n▁more\t-6.40231\n▁into\t-6.4095\n▁now\t-6.45621\nes\t-6.45723\n▁very\t-6.46767\n▁your\t-6.47768\nc\t-6.49829\n▁some\t-6.5032\n▁little\t-6.52174\n▁time\t-6.53362\n▁can\t-6.57863\n▁like\t-6.58001\nll\t-6.58456\nre\t-6.59459\n▁about\t-6.6011\n▁has\t-6.63724\n▁than\t-6.64773\n▁did\t-6.64974\n▁upon\t-6.66755\nl\t-6.67708\n▁over\t-6.6829\n▁any\t-6.69691\nin\t-6.70055\n▁well\t-6.70679\n▁only\t-6.70884\n▁see\t-6.72382\n▁good\t-6.7302\n▁other\t-6.73256\n▁two\t-6.73281\nal\t-6.76971\n▁know\t-6.77014\nb\t-6.77332\n▁go\t-6.78028\n▁down\t-6.78382\n▁before\t-6.79386\na\t-6.80864\n▁our\t-6.81482\n▁old\t-6.82309\n▁should\t-6.82836\n▁made\t-6.82895\n▁after\t-6.84628\n▁great\t-6.85243\n▁day\t-6.85544\n▁must\t-6.87627\n▁come\t-6.87777\n▁how\t-6.87869\n▁such\t-6.88362\n▁came\t-6.88807\n▁where\t-6.89779\n▁us\t-6.9003
1\n▁never\t-6.92945\nle\t-6.93511\n▁these\t-6.95338\n▁much\t-6.95525\n▁mister\t-6.96536\n▁de\t-6.975\nor\t-6.98345\n▁may\t-6.98676\n▁long\t-7.01388\n▁way\t-7.01809\n▁first\t-7.04141\n▁back\t-7.05466\n▁own\t-7.05634\n▁am\t-7.05808\n▁again\t-7.06591\n▁say\t-7.07176\n▁men\t-7.07357\n▁went\t-7.07513\n▁himself\t-7.07891\n▁here\t-7.09085\nion\t-7.10388\n▁think\t-7.10393\nness\t-7.10433\nen\t-7.11572\n▁even\t-7.12414\ng\t-7.12655\n▁thought\t-7.12694\n▁hand\t-7.1271\nu\t-7.13322\n▁just\t-7.13401\nve\t-7.14094\n▁its\t-7.15029\no\t-7.16142\n▁un\t-7.16965\n▁re\t-7.1721\n▁make\t-7.17463\n▁might\t-7.1793\nation\t-7.18013\n▁too\t-7.18635\non\t-7.1907\n▁away\t-7.19477\nst\t-7.19708\n▁life\t-7.20558\n▁without\t-7.21952\n▁o\t-7.22087\n▁through\t-7.22747\n▁most\t-7.22784\nic\t-7.22971\n▁take\t-7.23593\n▁don\t-7.23927\n▁every\t-7.24535\nth\t-7.25167\n▁shall\t-7.25978\n▁those\t-7.26214\n▁eyes\t-7.27376\n▁still\t-7.28725\n▁last\t-7.29948\n▁house\t-7.30575\n▁head\t-7.3073\n▁nothing\t-7.31319\n▁night\t-7.3151\nable\t-7.32761\n▁off\t-7.33689\nity\t-7.33883\n▁let\t-7.33975\n▁many\t-7.34144\nar\t-7.34535\n▁being\t-7.34757\n▁found\t-7.34819\n▁while\t-7.35326\ni\t-7.36804\n▁saw\t-7.37042\n▁get\t-7.37494\nan\t-7.37662\n▁people\t-7.38318\n▁face\t-7.38748\n▁young\t-7.39215\n▁under\t-7.40057\n▁once\t-7.40078\n▁tell\t-7.40791\n▁three\t-7.413\n▁place\t-7.41377\n▁room\t-7.41704\nli\t-7.42158\n▁yet\t-7.42442\n▁same\t-7.42976\nri\t-7.42985\nv\t-7.4311\n▁father\t-7.44096\n▁though\t-7.45043\nk\t-7.45091\n▁another\t-7.45131\n▁right\t-7.46533\n▁heart\t-7.46662\n▁put\t-7.48293\n▁took\t-7.48368\n▁give\t-7.48808\n▁ever\t-7.4903\n▁work\t-7.50099\nel\t-7.50309\nit\t-7.50743\n▁e\t-7.51169\n▁look\t-7.51181\nry\t-7.5122\n▁new\t-7.51353\nil\t-7.51571\ners\t-7.51791\n▁part\t-7.52099\n▁king\t-7.52387\n▁missus\t-7.52455\n▁sir\t-7.53014\n▁mind\t-7.5303\n▁looked\t-7.53104\nus\t-7.53328\n▁love\t-7.53458\nra\t-7.53906\n▁asked\t-7.53965\n▁left\t-7.54703\n▁light\t-7.56075\n▁moment\t-7.57071\nro\t-7.57073\net\t-7.5746\nive\t
-7.57948\n▁world\t-7.58543\n▁things\t-7.58651\n▁home\t-7.58975\n▁thing\t-7.6002\nf\t-7.60068\nh\t-7.60196\nful\t-7.60292\n▁why\t-7.60735\n▁mother\t-7.61051\n▁always\t-7.61115\n▁far\t-7.61265\n▁water\t-7.61901\n▁s\t-7.61926\nla\t-7.62405\nce\t-7.62873\nck\t-7.62955\n▁heard\t-7.63327\n▁something\t-7.63489\nw\t-7.63624\n▁seemed\t-7.63649\nch\t-7.64796\n▁because\t-7.65167\n▁end\t-7.65457\n▁told\t-7.66091\n▁yes\t-7.66365\n▁door\t-7.6662\nted\t-7.6708\n▁going\t-7.67276\n▁got\t-7.67607\nis\t-7.68689\nter\t-7.68801\n▁woman\t-7.68896\n▁god\t-7.68943\nol\t-7.69186\nest\t-7.69247\nent\t-7.69838\nur\t-7.70382\nte\t-7.70972\nling\t-7.71225\n▁find\t-7.71593\n▁knew\t-7.72124\nne\t-7.72399\n▁soon\t-7.72471\n▁each\t-7.72548\n▁side\t-7.72953\n▁oh\t-7.73896\nul\t-7.74838\n▁against\t-7.75871\n▁name\t-7.77125\n▁miss\t-7.77191\n▁quite\t-7.77406\n▁con\t-7.77659\n▁ma\t-7.7812\n▁want\t-7.78461\n▁years\t-7.78825\n▁few\t-7.78901\n▁better\t-7.79308\n▁half\t-7.79628\nton\t-7.79945\n▁done\t-7.80176\nment\t-7.81027\n▁also\t-7.81536\nse\t-7.81952\n▁began\t-7.82133\n▁having\t-7.82983\n▁enough\t-7.83157\n▁lady\t-7.84016\n▁whole\t-7.84092\n▁both\t-7.8452\n▁seen\t-7.84696\nled\t-7.85123\n▁set\t-7.8565\n▁white\t-7.85755\n▁course\t-7.86189\ntion\t-7.86283\n▁voice\t-7.86482\nir\t-7.865\n▁called\t-7.86562\nma\t-7.88043\nlo\t-7.88068\n▁turned\t-7.88486\n▁gave\t-7.88561\nman\t-7.89007\n▁poor\t-7.89153\n▁dear\t-7.89597\n▁girl\t-7.89892\n▁morning\t-7.90137\nless\t-7.90146\n▁between\t-7.90202\n▁nor\t-7.90275\n▁among\t-7.9053\nate\t-7.90969\nies\t-7.91089\n▁p\t-7.91307\nff\t-7.91729\nna\t-7.92272\n▁small\t-7.92689\nty\t-7.92942\nous\t-7.93067\n▁ga\t-7.93278\n▁whom\t-7.93725\n▁felt\t-7.93876\n▁hands\t-7.93947\n▁myself\t-7.94602\n▁high\t-7.94632\n▁ex\t-7.94686\n▁however\t-7.94887\nia\t-7.94934\n▁herself\t-7.95264\n▁stood\t-7.95858\n▁kind\t-7.95874\n▁hundred\t-7.95955\n▁la\t-7.96684\n▁round\t-7.97066\n▁almost\t-7.97354\nom\t-7.98129\n▁since\t-7.9813\nsh\t-7.98849\n▁c\t-7.98852\n▁ten\t-7.9898\n▁rest\t-7.9973\n▁boy\
t-7.99935\n▁mo\t-8.00015\n▁perhaps\t-8.00311\nish\t-8.0036\nru\t-8.0045\n▁words\t-8.00475\nmp\t-8.00876\n▁sat\t-8.01874\nco\t-8.02001\n▁replied\t-8.02087\n▁four\t-8.02469\n▁anything\t-8.02776\nas\t-8.02812\n▁till\t-8.02843\nx\t-8.02978\nting\t-8.0301\n▁until\t-8.03441\n▁black\t-8.03588\nated\t-8.03649\nme\t-8.03831\n▁b\t-8.04278\nid\t-8.04354\n▁cried\t-8.04406\n▁fact\t-8.05064\n▁help\t-8.05169\n▁next\t-8.05191\nie\t-8.05368\n▁looking\t-8.05378\n▁friend\t-8.05529\n▁does\t-8.05546\n▁lay\t-8.05695\n▁brought\t-8.06229\n▁fire\t-8.06598\n▁keep\t-8.06679\nver\t-8.07005\n▁sea\t-8.07356\n▁country\t-8.07394\n▁word\t-8.07524\n▁days\t-8.07754\n▁together\t-8.0803\n▁reason\t-8.0831\nut\t-8.08642\nance\t-8.0867\n▁indeed\t-8.08859\n▁matter\t-8.08986\n▁ra\t-8.09017\n▁li\t-8.09673\n▁air\t-8.09835\n▁full\t-8.09927\n▁rather\t-8.10244\n▁hope\t-8.10365\n▁land\t-8.1041\ngg\t-8.10417\nam\t-8.10449\n▁open\t-8.10788\ntic\t-8.10921\n▁feet\t-8.11058\n▁imp\t-8.11102\nke\t-8.11263\nine\t-8.11421\n▁d\t-8.11547\n▁five\t-8.11674\n▁point\t-8.11763\n▁large\t-8.1235\nci\t-8.12437\nvi\t-8.1256\n▁child\t-8.13099\n▁gone\t-8.13104\n▁ho\t-8.1317\npp\t-8.13272\n▁best\t-8.13427\n▁hard\t-8.13582\nant\t-8.13757\n▁lord\t-8.13785\n▁wife\t-8.13848\n▁sure\t-8.13962\nde\t-8.14218\npo\t-8.14226\n▁form\t-8.14557\n▁death\t-8.14965\n▁care\t-8.15583\nence\t-8.15604\n▁nature\t-8.15699\n▁co\t-8.15856\n▁believe\t-8.15947\n▁near\t-8.16247\n▁red\t-8.16407\n▁ro\t-8.16449\n▁ha\t-8.16607\n▁speak\t-8.16703\n▁fear\t-8.16889\n▁case\t-8.16944\n▁taken\t-8.17098\n▁cannot\t-8.17343\n▁hear\t-8.17518\n▁along\t-8.17564\n▁themselves\t-8.17588\num\t-8.17641\n▁present\t-8.18164\n▁master\t-8.18704\n▁son\t-8.18955\n▁war\t-8.19388\n▁po\t-8.19446\n▁thus\t-8.19772\n▁true\t-8.20459\n▁car\t-8.20477\n▁less\t-8.20846\n▁thousand\t-8.21254\n▁w\t-8.21417\nmi\t-8.2162\n▁money\t-8.21713\nnd\t-8.21716\n▁da\t-8.21888\n▁power\t-8.22077\n▁behind\t-8.22087\nard\t-8.2226\nto\t-8.22274\n▁children\t-8.2228\n▁doctor\t-8.22317\n▁dis\t-8.22371\n▁twenty\t-8.22732\n▁
wish\t-8.22739\n▁sound\t-8.22843\n▁whose\t-8.23097\n▁leave\t-8.23197\n▁answered\t-8.23298\n▁thou\t-8.23321\nac\t-8.23461\n▁dur\t-8.23471\n▁certain\t-8.2375\nge\t-8.24317\n▁cl\t-8.24703\n▁g\t-8.24779\n▁passed\t-8.24862\n▁arm\t-8.25095\nmo\t-8.25395\nious\t-8.2544\n▁state\t-8.25486\n▁alone\t-8.25597\n▁show\t-8.25689\n▁ba\t-8.25864\n▁need\t-8.25881\n▁live\t-8.26099\n▁dead\t-8.26254\n▁pro\t-8.26311\n▁mu\t-8.26701\n▁strong\t-8.26733\n▁en\t-8.26801\n▁bo\t-8.26981\n▁ground\t-8.27309\n▁short\t-8.27476\n▁st\t-8.27974\n▁horse\t-8.28616\n▁prince\t-8.28817\n▁pre\t-8.28817\nian\t-8.29122\nat\t-8.29216\nun\t-8.29302\n▁fell\t-8.2982\n▁order\t-8.29901\n▁call\t-8.29938\n▁ca\t-8.30443\n▁sun\t-8.30517\nta\t-8.30566\n▁given\t-8.30619\n▁therefore\t-8.30754\n▁dark\t-8.30758\n▁close\t-8.30816\n▁body\t-8.31022\n▁others\t-8.31043\n▁sent\t-8.31212\nad\t-8.3132\n▁second\t-8.316\nred\t-8.31726\n▁often\t-8.31883\n▁manner\t-8.32481\n▁vi\t-8.32632\n▁f\t-8.33096\n▁lo\t-8.33173\n▁question\t-8.33377\n▁hour\t-8.33469\n▁turn\t-8.33975\n▁table\t-8.34248\n▁general\t-8.34277\n▁earth\t-8.34496\n▁bed\t-8.34708\nage\t-8.3481\nward\t-8.35051\n▁really\t-8.35139\n▁six\t-8.35374\n▁become\t-8.35755\n▁read\t-8.36081\n▁use\t-8.36236\n▁coming\t-8.37141\n▁everything\t-8.37319\n▁above\t-8.37882\n▁evening\t-8.37903\n▁beautiful\t-8.3822\n▁feel\t-8.38244\n▁least\t-8.3841\nical\t-8.38416\n▁law\t-8.38452\n▁already\t-8.38637\n▁rose\t-8.38677\n▁mean\t-8.38681\n▁ran\t-8.38738\n▁itself\t-8.38828\n▁soul\t-8.39221\n▁suddenly\t-8.39493\n▁around\t-8.39553\n▁ti\t-8.39629\n▁sa\t-8.39657\n▁answer\t-8.39921\n▁em\t-8.40114\nber\t-8.40546\nque\t-8.40812\nti\t-8.40975\n▁won\t-8.41017\n▁wind\t-8.41105\n▁fine\t-8.41304\n▁whether\t-8.41526\n▁known\t-8.41725\n▁captain\t-8.42272\n▁eye\t-8.42551\n▁person\t-8.42656\n▁women\t-8.42706\n▁sort\t-8.42764\n▁ask\t-8.42963\n▁per\t-8.43123\n▁brother\t-8.43586\nni\t-8.43821\n▁used\t-8.44025\n▁held\t-8.44066\n▁big\t-8.44256\n▁returned\t-8.44473\n▁strange\t-8.44488\nno\t-8.45273\n▁free\t-8.45451\n▁either\
t-8.45513\n▁within\t-8.45564\n▁doubt\t-8.45671\n▁year\t-8.45862\n▁clear\t-8.46003\n▁sight\t-8.46043\n▁lost\t-8.46111\nho\t-8.46112\n▁se\t-8.46255\n▁le\t-8.46257\n▁kept\t-8.46289\n▁bar\t-8.46341\n▁bu\t-8.46354\n▁town\t-8.46388\nring\t-8.46594\n▁sleep\t-8.46906\nist\t-8.47099\n▁hair\t-8.47372\n▁friends\t-8.47427\nnt\t-8.4756\n▁dream\t-8.47568\n▁fellow\t-8.47629\n▁deep\t-8.47799\n▁past\t-8.4783\n▁became\t-8.47901\nop\t-8.48024\n▁making\t-8.48051\n▁act\t-8.48477\nbo\t-8.48576\nim\t-8.48695\n▁bad\t-8.4879\nary\t-8.49097\n▁ta\t-8.49642\nily\t-8.4979\n▁bring\t-8.498\nster\t-8.49837\n▁ye\t-8.50127\n▁means\t-8.50147\n▁run\t-8.50334\nmen\t-8.50338\n▁daughter\t-8.50689\n▁sense\t-8.50862\ncy\t-8.51181\n▁city\t-8.51186\n▁sometimes\t-8.51205\n▁towards\t-8.51344\n▁road\t-8.51845\n▁gra\t-8.51919\n▁ready\t-8.52448\ndy\t-8.5251\nure\t-8.52531\nson\t-8.52666\n▁mar\t-8.52707\n▁cold\t-8.53015\n▁foot\t-8.53033\n▁else\t-8.53193\n▁letter\t-8.5321\nud\t-8.53213\n▁k\t-8.53803\n▁sp\t-8.53997\n▁truth\t-8.54012\n▁idea\t-8.54104\n▁sta\t-8.54296\n▁business\t-8.54487\n▁subject\t-8.54754\n▁john\t-8.54757\n▁court\t-8.54846\n▁river\t-8.55047\n▁ru\t-8.55137\n▁di\t-8.5541\n▁family\t-8.5565\n▁didn\t-8.56006\n▁several\t-8.56147\n▁glad\t-8.56226\nens\t-8.56422\n▁understand\t-8.56476\n▁possible\t-8.56873\n▁return\t-8.56875\n▁different\t-8.56878\n▁arms\t-8.5689\nhe\t-8.57005\n▁low\t-8.57062\n▁hold\t-8.57171\nating\t-8.57288\n▁talk\t-8.57294\n▁window\t-8.57563\n▁lu\t-8.57574\n▁sh\t-8.57632\n▁interest\t-8.57875\n▁sister\t-8.57949\n▁blood\t-8.58666\n▁says\t-8.58691\nland\t-8.59031\n▁th\t-8.59363\n▁human\t-8.59452\n▁cause\t-8.59568\ngo\t-8.59691\n▁thank\t-8.59812\n▁late\t-8.59857\n▁cut\t-8.59993\n▁across\t-8.60115\nng\t-8.60191\n▁story\t-8.6039\nial\t-8.60458\n▁count\t-8.60531\nby\t-8.61141\n▁number\t-8.61156\n▁stand\t-8.61173\n▁able\t-8.61219\nper\t-8.61242\n▁church\t-8.61299\nche\t-8.61435\nles\t-8.61602\n▁thy\t-8.61746\n▁comp\t-8.61815\n▁suppose\t-8.6189\n▁effect\t-8.62111\n▁si\t-8.62299\nba\t-8.62734\n▁spok
e\t-8.62957\n▁green\t-8.6315\n▁husband\t-8.63174\n▁respect\t-8.63174\ncu\t-8.63314\n▁remember\t-8.63324\n▁followed\t-8.63382\n▁longer\t-8.63684\nions\t-8.63877\ntro\t-8.63906\n▁taking\t-8.64065\n▁seem\t-8.64106\n▁t\t-8.64367\n▁happy\t-8.64443\npe\t-8.64475\n▁line\t-8.64596\nley\t-8.64671\n▁stay\t-8.6532\n▁play\t-8.6534\n▁common\t-8.65531\nbe\t-8.65623\n▁times\t-8.65717\n▁book\t-8.65736\nund\t-8.65793\n▁object\t-8.66012\n▁seven\t-8.66091\n▁met\t-8.66215\nca\t-8.66333\n▁age\t-8.66376\n▁sha\t-8.66505\n▁pretty\t-8.6663\n▁fair\t-8.66837\ndo\t-8.66895\n▁wood\t-8.66965\nos\t-8.67011\n▁reached\t-8.6731\n▁sweet\t-8.67437\n▁appeared\t-8.67453\n▁fall\t-8.67545\n▁pass\t-8.67577\n▁sign\t-8.67655\n▁art\t-8.67659\nda\t-8.67771\n▁tree\t-8.68022\n▁garden\t-8.68055\n▁fl\t-8.68212\n▁remain\t-8.68618\n▁opened\t-8.68883\nqui\t-8.69114\n▁bright\t-8.69391\n▁street\t-8.6983\n▁hu\t-8.69925\n▁tu\t-8.70032\n▁trouble\t-8.70065\n▁pain\t-8.7029\n▁continued\t-8.70344\n▁school\t-8.70366\n▁carried\t-8.70421\n▁saying\t-8.70493\n▁follow\t-8.71325\n▁change\t-8.71328\nnce\t-8.71349\n▁gold\t-8.71391\n▁bear\t-8.71554\n▁su\t-8.71566\n▁feeling\t-8.71637\n▁command\t-8.71679\n▁certainly\t-8.71824\n▁blue\t-8.71904\n▁wild\t-8.72003\n▁account\t-8.72368\n▁ne\t-8.72403\n▁ought\t-8.72848\n▁fi\t-8.73365\n▁breath\t-8.73491\n▁wanted\t-8.73914\nov\t-8.74173\nlt\t-8.74286\n▁ill\t-8.74353\now\t-8.74421\n▁sc\t-8.74663\nder\t-8.74682\n▁heaven\t-8.74684\n▁purpose\t-8.74686\nha\t-8.74759\n▁character\t-8.74843\n▁rich\t-8.7515\nour\t-8.75547\n▁dress\t-8.75781\n▁english\t-8.76108\n▁chance\t-8.76254\n▁view\t-8.76496\n▁ship\t-8.76584\n▁toward\t-8.76672\n▁real\t-8.76718\n▁joy\t-8.76779\n▁cap\t-8.77235\n▁plan\t-8.77246\n▁neither\t-8.77275\n▁force\t-8.77285\n▁uncle\t-8.77317\n▁princess\t-8.77387\n▁har\t-8.77474\n▁hat\t-8.77801\nway\t-8.77869\n▁chief\t-8.77894\n▁lived\t-8.78017\n▁na\t-8.78141\n▁visit\t-8.7824\n▁mor\t-8.78381\n▁wall\t-8.78652\n▁pleasure\t-8.78739\n▁pe\t-8.7879\n▁smile\t-8.78797\n▁front\t-8.78866\n▁mine\t-8.78902\n▁ri
\t-8.79253\n▁deal\t-8.79282\nier\t-8.79326\n▁further\t-8.79368\n▁tried\t-8.79541\n▁none\t-8.80009\nuc\t-8.80166\n▁entered\t-8.80167\n▁pay\t-8.80408\n▁queen\t-8.80455\n▁except\t-8.80579\nva\t-8.80801\n▁forward\t-8.80805\not\t-8.80998\n▁eight\t-8.81171\n▁added\t-8.81314\n▁public\t-8.81323\n▁eighteen\t-8.81324\nft\t-8.81377\n▁star\t-8.81398\n▁happened\t-8.81873\nned\t-8.81953\n▁although\t-8.822\n▁later\t-8.82204\n▁walked\t-8.82218\n▁walk\t-8.82238\n▁spirit\t-8.8225\n▁bit\t-8.82313\n▁meet\t-8.82432\n▁led\t-8.82559\nfa\t-8.82849\n▁mouth\t-8.82946\n▁wait\t-8.83231\nrs\t-8.83281\n▁gu\t-8.83416\n▁hours\t-8.83454\nlin\t-8.83526\n▁living\t-8.83739\n▁yourself\t-8.83798\nem\t-8.83827\n▁fast\t-8.83971\n▁hall\t-8.84497\n▁beyond\t-8.84576\n▁boat\t-8.84732\n▁secret\t-8.84736\n▁chair\t-8.84911\n▁pu\t-8.85297\n▁received\t-8.85389\n▁pa\t-8.85426\n▁cat\t-8.8545\n▁desire\t-8.85826\n▁ja\t-8.8592\n▁gentleman\t-8.85927\n▁cra\t-8.85959\nress\t-8.8609\n▁laid\t-8.86415\n▁party\t-8.86721\n▁wonder\t-8.86748\n▁occasion\t-8.86751\nig\t-8.86771\n▁fish\t-8.87005\n▁mi\t-8.87027\n▁send\t-8.87486\nvo\t-8.87515\nged\t-8.87522\nak\t-8.87728\n▁nearly\t-8.87803\ncon\t-8.87846\n▁try\t-8.8788\n▁seems\t-8.88114\n▁silence\t-8.88499\n▁bell\t-8.88523\never\t-8.88574\n▁bra\t-8.88685\n▁guard\t-8.88716\n▁rep\t-8.88973\n▁die\t-8.89013\n▁doing\t-8.89179\n▁early\t-8.89211\nugh\t-8.89235\n▁bank\t-8.89235\n▁figure\t-8.89252\nden\t-8.89326\n▁england\t-8.89568\n▁mary\t-8.896\n▁fo\t-8.89799\n▁cor\t-8.89892\n▁afraid\t-8.90011\n▁watch\t-8.90402\n▁gre\t-8.90554\n▁aunt\t-8.91001\ntur\t-8.91229\n▁service\t-8.91353\n▁je\t-8.91387\n▁minutes\t-8.91421\n▁trees\t-8.91568\n▁glass\t-8.91774\n▁pan\t-8.91942\n▁va\t-8.91977\n▁tone\t-8.91998\n▁please\t-8.92034\n▁forth\t-8.92051\n▁cur\t-8.92101\n▁cross\t-8.92166\n▁fa\t-8.92184\n▁exclaimed\t-8.92273\nler\t-8.92342\n▁pen\t-8.92344\nten\t-8.92376\n▁pi\t-8.92426\n▁eat\t-8.92444\n▁drew\t-8.92453\nble\t-8.92499\nably\t-8.9255\n▁grave\t-8.92616\n▁miles\t-8.92876\n▁ago\t-8.92887\n▁position\t-8.93
04\n▁warm\t-8.93052\n▁length\t-8.93236\n▁necessary\t-8.93236\n▁thinking\t-8.93313\n▁soft\t-8.9336\n▁picture\t-8.93367\nship\t-8.93369\nations\t-8.9338\nav\t-8.93443\nible\t-8.93462\n▁ah\t-8.93999\n▁heavy\t-8.94029\n▁attention\t-8.94092\n▁dog\t-8.94119\n▁standing\t-8.94354\nrn\t-8.94361\nron\t-8.94363\n▁natural\t-8.94438\n▁appear\t-8.94438\n▁caught\t-8.94556\ngra\t-8.94669\n▁spring\t-8.94922\n▁experience\t-8.94955\n▁pat\t-8.95299\n▁pri\t-8.95372\n▁stopped\t-8.95569\n▁regard\t-8.95615\n▁hardly\t-8.95978\n▁self\t-8.96008\n▁strength\t-8.96095\nkin\t-8.96238\n▁grew\t-8.96282\n▁knight\t-8.96298\n▁opinion\t-8.96298\n▁ab\t-8.96388\nrk\t-8.96526\n▁wide\t-8.96661\n▁instead\t-8.96774\n▁south\t-8.96781\n▁trans\t-8.96816\n▁learn\t-8.9712\n▁corner\t-8.97137\n▁island\t-8.97439\n▁third\t-8.97591\n▁straight\t-8.97728\n▁tea\t-8.97822\n▁bound\t-8.97901\n▁seeing\t-8.97967\n▁cha\t-8.98025\n▁dinner\t-8.98079\n▁beauty\t-8.98209\n▁peace\t-8.98292\n▁silent\t-8.98762\n▁cre\t-8.98909\n▁sw\t-8.99093\n▁step\t-8.99147\n▁jo\t-8.99178\n▁wa\t-8.99194\n▁sitting\t-8.99214\n▁thirty\t-8.99247\n▁save\t-8.99425\n▁glance\t-8.99532\n▁loved\t-8.99677\n▁reach\t-8.99979\n▁action\t-9.00043\n▁ver\t-9.0005\nger\t-9.00278\n▁sad\t-9.00395\n▁stone\t-9.00628\nened\t-9.00671\n▁french\t-9.00862\n▁m\t-9.0087\n▁struck\t-9.01003\n▁paper\t-9.01106\nally\t-9.01111\n▁whatever\t-9.01193\n▁sub\t-9.01227\n▁distance\t-9.01287\n▁wrong\t-9.01358\n▁knowledge\t-9.01358\n▁safe\t-9.01474\n▁snow\t-9.01501\n▁fifty\t-9.01643\n▁attempt\t-9.01714\n▁music\t-9.01799\n▁government\t-9.01876\n▁crowd\t-9.02244\n▁besides\t-9.02296\n▁box\t-9.02356\n▁direction\t-9.02387\n▁train\t-9.02393\n▁north\t-9.02395\nped\t-9.02429\n▁el\t-9.02475\n▁thick\t-9.02509\n▁getting\t-9.02554\n▁floor\t-9.0289\n▁company\t-9.03007\n▁blow\t-9.03021\nbu\t-9.03086\n▁plain\t-9.03126\n▁beside\t-9.0315\nities\t-9.03293\n▁rock\t-9.03348\n▁immediately\t-9.03354\n▁shadow\t-9.03442\n▁sit\t-9.03601\n▁drink\t-9.03952\nking\t-9.04249\n▁spot\t-9.04416\n▁danger\t-9.04433\n▁wi\t-9.0453
8\n▁saint\t-9.04685\n▁slowly\t-9.04691\nah\t-9.04742\n▁palace\t-9.04831\nors\t-9.04944\n▁peter\t-9.05013\n▁result\t-9.05052\nric\t-9.05115\n▁forest\t-9.05173\n▁tears\t-9.0564\nism\t-9.05656\n▁belong\t-9.05664\n▁appearance\t-9.05678\n▁par\t-9.05711\n▁gate\t-9.05778\n▁ju\t-9.06233\n▁quickly\t-9.06437\n▁fit\t-9.06524\n▁quiet\t-9.06573\nris\t-9.06619\n▁london\t-9.06688\n▁start\t-9.06791\nrt\t-9.06846\n▁brown\t-9.06949\n▁consider\t-9.07025\n▁battle\t-9.07145\n▁anne\t-9.07195\n▁piece\t-9.07248\n▁died\t-9.07512\n▁success\t-9.07617\n▁post\t-9.07672\n▁lips\t-9.07702\n▁filled\t-9.078\n▁forget\t-9.07832\nified\t-9.08089\n▁margaret\t-9.08123\n▁food\t-9.08284\n▁pleasant\t-9.08657\nner\t-9.08809\n▁expression\t-9.08909\n▁pocket\t-9.08963\nfi\t-9.08995\n▁wear\t-9.09356\n▁fresh\t-9.09425\nau\t-9.09646\nham\t-9.09714\n▁broken\t-9.09722\n▁laughed\t-9.09757\n▁following\t-9.09843\n▁youth\t-9.09887\n▁touch\t-9.10015\n▁sal\t-9.10107\n▁week\t-9.10288\n▁remained\t-9.10418\n▁leg\t-9.10432\n▁easy\t-9.1051\n▁al\t-9.10564\n▁enter\t-9.10865\n▁ste\t-9.1089\n▁ch\t-9.10922\n▁fight\t-9.10933\n▁placed\t-9.10947\n▁travel\t-9.10964\n▁simple\t-9.11135\n▁girls\t-9.11236\n▁waiting\t-9.11512\n▁stop\t-9.11684\nif\t-9.11804\nile\t-9.11906\nning\t-9.11982\n▁camp\t-9.12002\n▁ni\t-9.12035\n▁wise\t-9.12043\n▁office\t-9.12111\n▁fe\t-9.12205\n▁grand\t-9.12295\n▁judge\t-9.12363\nny\t-9.12381\n▁quick\t-9.12617\ntri\t-9.12647\n▁du\t-9.12874\n▁fra\t-9.12979\n▁flo\t-9.1301\nging\t-9.13045\n▁comfort\t-9.13208\n▁particular\t-9.13305\n▁suit\t-9.1338\n▁started\t-9.13391\n▁top\t-9.13613\n▁hot\t-9.13623\n▁impossible\t-9.13675\nach\t-9.13707\n▁pale\t-9.13732\nments\t-9.13795\n▁ve\t-9.13914\n▁conversation\t-9.13917\n▁scene\t-9.14081\n▁boys\t-9.14082\n▁society\t-9.14402\n▁outside\t-9.14432\n▁write\t-9.14476\n▁effort\t-9.14645\n▁talking\t-9.14693\n▁fortune\t-9.14726\n▁nine\t-9.14985\n▁single\t-9.151\n▁cro\t-9.152\n▁port\t-9.15411\n▁happen\t-9.15427\n▁rule\t-9.15463\n▁cast\t-9.15628\n▁shut\t-9.15709\n▁noble\t-9.15917\n▁gun\t-9.15
924\n▁path\t-9.15997\n▁begin\t-9.16092\n▁win\t-9.16136\n▁sky\t-9.16149\n▁wonderful\t-9.16515\n▁sudden\t-9.16577\n▁army\t-9.16589\nga\t-9.16805\n▁mountain\t-9.16841\n▁worth\t-9.16959\n▁grace\t-9.17162\n▁below\t-9.17203\n▁chapter\t-9.17215\n▁turning\t-9.17273\n▁afternoon\t-9.17612\n▁iron\t-9.17626\n▁bow\t-9.17691\nup\t-9.17693\n▁evil\t-9.17696\n▁trust\t-9.17749\nag\t-9.17757\n▁recogni\t-9.1778\n▁ring\t-9.17871\n▁lad\t-9.17907\n▁sail\t-9.18071\n▁content\t-9.18118\n▁horses\t-9.18165\n▁silver\t-9.18199\nory\t-9.18236\nay\t-9.18273\n▁tri\t-9.18493\n▁running\t-9.18731\n▁hill\t-9.18744\n▁beginning\t-9.18888\n▁habit\t-9.1913\n▁mad\t-9.19289\npa\t-9.19389\n▁clothes\t-9.19512\n▁morrow\t-9.19566\n▁cry\t-9.19577\n▁fashion\t-9.1964\n▁presence\t-9.19642\n▁min\t-9.19708\n▁tra\t-9.19725\n▁arrived\t-9.19781\n▁quarter\t-9.19811\n▁perfect\t-9.19902\n▁usual\t-9.19961\n▁neck\t-9.19975\n▁married\t-9.19983\n▁seat\t-9.20022\nwi\t-9.20071\n▁sand\t-9.20413\n▁shore\t-9.20419\nries\t-9.20447\n▁giving\t-9.20584\n▁probably\t-9.2067\n▁expect\t-9.20736\n▁minute\t-9.20838\n▁shot\t-9.20958\n▁instant\t-9.21089\n▁degree\t-9.21275\n▁color\t-9.21461\n▁west\t-9.21547\n▁winter\t-9.21587\nran\t-9.21593\nval\t-9.21703\n▁march\t-9.21721\n▁gar\t-9.21774\n▁bird\t-9.21826\n▁serious\t-9.21896\n▁greater\t-9.21909\n▁showed\t-9.21924\n▁covered\t-9.21941\n▁former\t-9.21951\n▁carry\t-9.21985\n▁loud\t-9.22023\n▁moved\t-9.2207\n▁mass\t-9.22168\n▁tom\t-9.22175\nlar\t-9.22214\n▁roman\t-9.22598\n▁moon\t-9.22677\n▁stream\t-9.22937\n▁easily\t-9.23026\n▁couldn\t-9.2303\ney\t-9.23089\n▁search\t-9.23115\n▁board\t-9.23122\n▁wished\t-9.23148\nap\t-9.23201\n▁months\t-9.23242\n▁sick\t-9.23317\n▁bla\t-9.23394\n▁duty\t-9.23511\n▁twelve\t-9.23557\n▁faint\t-9.23649\n▁hi\t-9.23676\n▁stranger\t-9.23765\n▁surprise\t-9.23849\n▁kill\t-9.23864\nfe\t-9.239\n▁leaving\t-9.23913\nub\t-9.23923\n▁journey\t-9.24091\n▁raised\t-9.24202\n▁scarcely\t-9.24209\n▁speaking\t-9.2426\n▁terrible\t-9.24359\n▁game\t-9.24488\n▁field\t-9.24561\n▁mer\t-9.24586\n▁p
romise\t-9.24657\n▁condition\t-9.24771\n▁personal\t-9.24929\n▁tall\t-9.24935\n▁stick\t-9.25\n▁threw\t-9.25168\nip\t-9.25241\n▁marry\t-9.25282\native\t-9.25306\ngi\t-9.25323\n▁van\t-9.25378\n▁according\t-9.25484\n▁burn\t-9.25574\n▁sei\t-9.25721\n▁lie\t-9.25726\n▁attack\t-9.25802\n▁sword\t-9.25809\n▁rise\t-9.25828\n▁thoughts\t-9.25867\nside\t-9.25899\n▁guess\t-9.25901\n▁dar\t-9.26041\n▁calm\t-9.26116\n▁thin\t-9.2615\n▁village\t-9.26256\n▁anxious\t-9.26439\n▁expected\t-9.26601\n▁ball\t-9.26745\n▁especially\t-9.26805\n▁charge\t-9.26831\n▁measure\t-9.26897\ngn\t-9.26921\n▁seek\t-9.26938\n▁te\t-9.26963\n▁nice\t-9.2709\nher\t-9.27108\n▁trying\t-9.27193\n▁allow\t-9.27357\n▁bread\t-9.27449\n▁sharp\t-9.27462\ngu\t-9.27478\n▁honour\t-9.27541\n▁honor\t-9.27635\n▁entirely\t-9.2768\n▁bill\t-9.27739\nrous\t-9.27784\n▁bri\t-9.27788\n▁written\t-9.27819\n▁broke\t-9.27946\n▁killed\t-9.2795\nwa\t-9.28007\n▁offer\t-9.28008\n▁ladies\t-9.28047\n▁mark\t-9.28091\n▁flowers\t-9.28165\n▁learned\t-9.28181\n▁forty\t-9.28372\n▁happiness\t-9.28469\n▁pray\t-9.28486\n▁class\t-9.28584\n▁principle\t-9.28749\n▁ven\t-9.28892\ngen\t-9.28901\n▁fer\t-9.28919\n▁shape\t-9.28928\n▁summer\t-9.28943\n▁books\t-9.2895\n▁jack\t-9.28989\n▁draw\t-9.29038\ntin\t-9.2915\n▁golden\t-9.29273\n▁decided\t-9.29353\n▁unless\t-9.29627\n▁lead\t-9.29655\n▁listen\t-9.29844\n▁shook\t-9.29892\n▁noise\t-9.29931\n▁influence\t-9.29972\neth\t-9.30032\n▁perfectly\t-9.30091\n▁marriage\t-9.30257\n▁broad\t-9.30274\n▁states\t-9.30314\n▁escape\t-9.30317\n▁middle\t-9.30362\n▁plant\t-9.30436\n▁movement\t-9.30501\n▁enemy\t-9.30542\n▁break\t-9.30544\n▁history\t-9.30549\n▁understood\t-9.30637\n▁latter\t-9.30638\n▁comes\t-9.30659\nwn\t-9.30685\n▁merely\t-9.3078\n▁simply\t-9.30828\n▁imagine\t-9.31019\n▁lower\t-9.3121\n▁born\t-9.31282\n▁conduct\t-9.31306\n▁yard\t-9.31406\n▁den\t-9.31624\n▁closed\t-9.31666\n▁fro\t-9.31877\n▁makes\t-9.31891\nlie\t-9.32113\n▁exist\t-9.32174\n▁speech\t-9.3227\n▁bitter\t-9.3235\njo\t-9.3246\nhi\t-9.3254\nib\t-9.32565\n▁
grass\t-9.32705\n▁reply\t-9.32779\n▁changed\t-9.32821\n▁ka\t-9.3295\n▁dance\t-9.3312\n▁lying\t-9.33191\n▁finally\t-9.33222\n▁american\t-9.33343\n▁enjoy\t-9.33348\n▁contain\t-9.33439\n▁observed\t-9.33536\n▁meant\t-9.33571\n▁flu\t-9.3378\nev\t-9.33858\n▁laugh\t-9.34134\noo\t-9.34138\n▁afterwards\t-9.34164\npose\t-9.34235\n▁beat\t-9.34266\n▁equal\t-9.3437\n▁race\t-9.34393\n▁rain\t-9.34564\n▁steps\t-9.34565\n▁gi\t-9.3462\n▁beneath\t-9.34821\nio\t-9.34833\n▁tail\t-9.34953\n▁taste\t-9.35112\n▁che\t-9.3514\n▁char\t-9.35243\n▁grow\t-9.35273\nclock\t-9.35505\n▁repeated\t-9.3551\n▁move\t-9.3553\n▁mon\t-9.35718\n▁lot\t-9.35898\n▁note\t-9.36107\nther\t-9.36128\n▁madame\t-9.36149\n▁brave\t-9.36158\nians\t-9.36183\n▁castle\t-9.36196\nbi\t-9.36309\n▁future\t-9.36322\n▁relation\t-9.36426\n▁sorry\t-9.36427\n▁health\t-9.36434\n▁dick\t-9.36447\n▁building\t-9.36547\nlf\t-9.36874\n▁edge\t-9.36921\n▁bless\t-9.36973\n▁mis\t-9.36985\n▁spite\t-9.36994\nmer\t-9.37185\n▁mill\t-9.37444\n▁prisoner\t-9.37517\n▁allowed\t-9.37651\n▁catch\t-9.379\n▁coat\t-9.38075\n▁complete\t-9.38129\n▁wouldn\t-9.382\nthe\t-9.38299\n▁yellow\t-9.3836\n▁important\t-9.38367\n▁creature\t-9.38369\n▁passing\t-9.38461\n▁darkness\t-9.38601\n▁carriage\t-9.38669\n▁fifteen\t-9.38772\n▁hung\t-9.38791\n▁spread\t-9.38876\n▁pleased\t-9.38883\n▁curious\t-9.38918\n▁reali\t-9.38934\n▁worse\t-9.3898\nement\t-9.39043\n▁circumstances\t-9.39055\n▁qua\t-9.39079\n▁din\t-9.39256\n▁jane\t-9.39383\n▁add\t-9.39383\n▁east\t-9.3941\n▁cup\t-9.39472\n▁blind\t-9.39499\n▁passion\t-9.39519\n▁discovered\t-9.39614\n▁notice\t-9.39644\n▁report\t-9.39752\nwe\t-9.39837\n▁space\t-9.39918\n▁com\t-9.4017\n▁presently\t-9.40287\n▁sorrow\t-9.40336\n▁pack\t-9.40421\n▁dry\t-9.40549\n▁ancient\t-9.40651\nfer\t-9.40713\n▁cover\t-9.40802\n▁dressed\t-9.40804\n▁existence\t-9.40998\n▁exactly\t-9.41068\n▁beast\t-9.41096\n▁proper\t-9.41119\n▁dropped\t-9.41192\n▁clean\t-9.41286\n▁colour\t-9.41297\n▁host\t-9.41436\n▁mere\t-9.41572\nand\t-9.4175\n▁determined\t-9.41801\n▁cham
ber\t-9.41816\ncent\t-9.41871\n▁faith\t-9.41872\n▁sto\t-9.4188\n▁skin\t-9.421\n▁storm\t-9.42138\n▁persons\t-9.42186\n▁priest\t-9.42212\n▁pick\t-9.42288\n▁support\t-9.4235\n▁narrow\t-9.4235\n▁private\t-9.42457\n▁smiled\t-9.42561\n▁cousin\t-9.42672\n▁drawing\t-9.42682\n▁attend\t-9.42755\n▁cook\t-9.42811\n▁prevent\t-9.42995\n▁various\t-9.43011\n▁hole\t-9.43205\n▁weak\t-9.43221\n▁fixed\t-9.43226\nlet\t-9.43406\n▁bottom\t-9.43427\n▁nobody\t-9.43427\n▁eli\t-9.43557\n▁legs\t-9.43638\n▁ar\t-9.43728\nade\t-9.4384\n▁individual\t-9.43861\n▁dare\t-9.43865\n▁ears\t-9.44178\nug\t-9.44328\n▁advantage\t-9.44516\n▁france\t-9.44539\n▁lives\t-9.44639\n▁wine\t-9.44744\n▁walls\t-9.44867\n▁tired\t-9.44922\n▁shop\t-9.44987\n▁cru\t-9.45028\n▁animal\t-9.45076\n▁wrote\t-9.45175\n▁royal\t-9.45176\nki\t-9.45265\n▁isn\t-9.45395\n▁bon\t-9.45485\n▁considered\t-9.45562\n▁moral\t-9.45564\n▁companion\t-9.4577\n▁lose\t-9.45813\n▁lake\t-9.45864\n▁bag\t-9.46002\n▁letters\t-9.46007\n▁luck\t-9.46037\n▁sy\t-9.46198\nhood\t-9.46307\n▁inter\t-9.46621\n▁german\t-9.46634\n▁sake\t-9.46706\n▁drop\t-9.46715\n▁paid\t-9.4679\n▁ear\t-9.46913\n▁breakfast\t-9.46953\n▁labor\t-9.46955\n▁desert\t-9.47071\n▁declared\t-9.47139\n▁study\t-9.47178\n▁instance\t-9.47184\n▁song\t-9.47236\n▁somewhat\t-9.47291\n▁cloth\t-9.47377\n▁colonel\t-9.47403\n▁special\t-9.47403\n▁value\t-9.47527\nld\t-9.47606\n▁main\t-9.47694\n▁proud\t-9.47697\n▁express\t-9.47824\n▁nation\t-9.47829\n▁handsome\t-9.47938\n▁confess\t-9.47973\nps\t-9.48006\n▁passage\t-9.48021\n▁period\t-9.48082\n▁gen\t-9.4815\n▁christ\t-9.48187\n▁custom\t-9.48309\nrow\t-9.4831\n▁hurt\t-9.48337\n▁shoulder\t-9.48433\n▁cu\t-9.48495\n▁sin\t-9.48574\n▁receive\t-9.48598\nite\t-9.48641\nlight\t-9.48678\n▁difficult\t-9.48784\nple\t-9.48865\n▁depend\t-9.48879\n▁meeting\t-9.48891\n▁heat\t-9.48893\n▁believed\t-9.48972\n▁social\t-9.48997\n▁difficulty\t-9.4905\n▁greatest\t-9.4908\n▁drawn\t-9.49088\n▁grant\t-9.49184\n▁birds\t-9.49301\n▁angry\t-9.49342\nign\t-9.49466\n▁places\t-9.49511\n▁gri\
t-9.4964\n▁courage\t-9.49683\n▁disc\t-9.4972\n▁evidently\t-9.49722\n▁gentle\t-9.49742\n▁cruel\t-9.49742\n▁george\t-9.49798\n▁due\t-9.49871\n▁paris\t-9.50034\n▁knows\t-9.50057\n▁knowing\t-9.50084\n▁servant\t-9.50088\n▁writing\t-9.50377\n▁pure\t-9.50397\n▁holding\t-9.50448\n▁remembered\t-9.50481\n▁tender\t-9.5049\n▁whi\t-9.50695\n▁burst\t-9.50701\n▁surely\t-9.50748\n▁valley\t-9.50855\nhy\t-9.51064\n▁conf\t-9.51116\n▁spoken\t-9.51131\n▁christian\t-9.51262\n▁store\t-9.51318\n▁henry\t-9.51332\n▁finished\t-9.51369\n▁qui\t-9.51369\n▁ob\t-9.51392\n▁prove\t-9.51443\n▁fool\t-9.51478\n▁ban\t-9.51521\n▁soldiers\t-9.51612\n▁language\t-9.51779\n▁inside\t-9.51827\n▁fallen\t-9.5209\nitch\t-9.52244\n▁baby\t-9.52317\n▁pot\t-9.52331\n▁situation\t-9.5237\n▁ruin\t-9.52474\n▁watched\t-9.52482\n▁gentlemen\t-9.52509\n▁fancy\t-9.52617\n▁accept\t-9.52659\n▁mal\t-9.52755\n▁season\t-9.52821\n▁ourselves\t-9.52844\n▁speed\t-9.53094\nans\t-9.53103\nnic\t-9.53266\n▁fu\t-9.53441\n▁cool\t-9.53512\nform\t-9.53515\n▁vessel\t-9.53561\n▁william\t-9.53563\n▁serve\t-9.53642\n▁obliged\t-9.53681\n▁group\t-9.53691\nmy\t-9.53852\nod\t-9.53859\n▁leaves\t-9.53884\n▁goes\t-9.53981\n▁peculiar\t-9.54041\n▁news\t-9.54053\n▁vain\t-9.54213\n▁everybody\t-9.54282\n▁pin\t-9.5434\n▁forgotten\t-9.54412\n▁carefully\t-9.54456\n▁flash\t-9.54524\nuous\t-9.54561\nook\t-9.54675\nched\t-9.54731\n▁murder\t-9.54736\n▁und\t-9.54748\n▁delight\t-9.54769\n▁waited\t-9.54905\n▁roll\t-9.54927\n▁property\t-9.54931\n▁noticed\t-9.54941\n▁hum\t-9.54975\nhan\t-9.54979\n▁fur\t-9.55108\n▁knock\t-9.55131\n▁earnest\t-9.55152\n▁ge\t-9.55239\nuch\t-9.55241\n▁honest\t-9.55375\n▁promised\t-9.55457\nwood\t-9.55616\n▁san\t-9.55635\n▁walking\t-9.55738\n▁quietly\t-9.55865\n▁square\t-9.55866\n▁cloud\t-9.5589\none\t-9.55892\n▁higher\t-9.56088\n▁built\t-9.5611\n▁formed\t-9.56135\n▁teach\t-9.56201\n▁fate\t-9.56269\n▁false\t-9.56356\n▁york\t-9.56368\n▁bal\t-9.56386\n▁climb\t-9.56479\n▁dust\t-9.56506\n▁fond\t-9.56536\n▁grown\t-9.56693\n▁fruit\t-9.5685\n▁genera
lly\t-9.56896\n▁offered\t-9.57025\n▁nurse\t-9.57101\n▁spent\t-9.57227\n▁join\t-9.57301\n▁meaning\t-9.57367\n▁smoke\t-9.57471\n▁station\t-9.57515\n▁rough\t-9.57528\nline\t-9.5754\nju\t-9.57649\n▁likely\t-9.57725\n▁surface\t-9.57845\n▁month\t-9.57879\n▁r\t-9.5807\n▁possession\t-9.58089\n▁tongue\t-9.58102\nfor\t-9.58136\nang\t-9.58153\n▁duke\t-9.5827\nstra\t-9.58404\n▁laughing\t-9.58435\n▁weather\t-9.58474\n▁whispered\t-9.58519\ngan\t-9.58545\n▁rag\t-9.58575\n▁system\t-9.58599\n▁laws\t-9.58622\n▁touched\t-9.58764\n▁nose\t-9.58808\n▁surprised\t-9.58815\n▁wealth\t-9.58855\n▁trade\t-9.58885\n▁nu\t-9.58947\n▁temper\t-9.58978\n▁frank\t-9.58978\n▁arch\t-9.59065\n▁opportunity\t-9.59231\n▁animals\t-9.59345\n▁bare\t-9.59353\n▁claim\t-9.59358\n▁cost\t-9.59584\n▁opposite\t-9.59739\n▁police\t-9.59739\n▁key\t-9.59776\n▁ideas\t-9.59836\n▁wave\t-9.5985\n▁cal\t-9.5994\n▁reading\t-9.60061\n▁corn\t-9.6011\n▁collect\t-9.60123\nker\t-9.60382\n▁gray\t-9.60456\n▁crown\t-9.60465\n▁shoulders\t-9.60493\n▁swift\t-9.60507\n▁wash\t-9.60516\n▁ice\t-9.60591\n▁tar\t-9.60632\nuse\t-9.6067\n▁prepared\t-9.6068\n▁gro\t-9.60782\nlac\t-9.60967\n▁empty\t-9.61022\n▁share\t-9.61049\n▁smiling\t-9.61152\n▁avoid\t-9.61153\n▁difference\t-9.61161\n▁explain\t-9.61169\n▁pour\t-9.61217\n▁fat\t-9.61242\n▁attract\t-9.61281\n▁opening\t-9.61463\n▁breast\t-9.6154\n▁material\t-9.6154\n▁wheel\t-9.6154\nius\t-9.61563\n▁suffering\t-9.61577\n▁distinct\t-9.61639\n▁rever\t-9.61748\n▁sing\t-9.61819\n▁chi\t-9.61843\n▁fingers\t-9.61874\n▁altogether\t-9.6193\n▁papa\t-9.6196\ndding\t-9.62028\n▁brain\t-9.62096\n▁row\t-9.62113\n▁asleep\t-9.62191\n▁grey\t-9.62254\n▁windows\t-9.62363\n▁alive\t-9.62446\n▁proceed\t-9.62486\n▁flower\t-9.62538\n▁pieces\t-9.6261\n▁leap\t-9.62618\npping\t-9.62686\nef\t-9.6269\n▁alter\t-9.62705\n▁memory\t-9.62717\naw\t-9.62815\n▁fill\t-9.62844\n▁thrown\t-9.62844\n▁rode\t-9.6292\n▁kingdom\t-9.6298\n▁dish\t-9.62982\n▁mat\t-9.63055\n▁maid\t-9.6322\n▁band\t-9.63234\nsome\t-9.63329\n▁virtue\t-9.63374\n▁clo\t-9.6342
5\n▁guest\t-9.63479\n▁loss\t-9.63491\n▁caused\t-9.63624\nbra\t-9.63641\n▁motion\t-9.63672\n▁lovely\t-9.63741\n▁swa\t-9.63749\n▁million\t-9.63758\n▁fault\t-9.63772\n▁united\t-9.63911\noc\t-9.64057\n▁mountains\t-9.64071\n▁pur\t-9.64112\n▁dim\t-9.64149\n▁satisfied\t-9.6417\n▁lover\t-9.64196\n▁harm\t-9.64233\n▁dollars\t-9.64303\n▁hero\t-9.64369\n▁conceal\t-9.64437\n▁vast\t-9.64488\n▁hath\t-9.64582\n▁rush\t-9.64604\n▁despair\t-9.64704\n▁pull\t-9.64708\nlan\t-9.64708\n▁height\t-9.64721\nex\t-9.64763\n▁pet\t-9.64824\nney\t-9.64929\n▁spi\t-9.64936\n▁remark\t-9.64976\n▁pity\t-9.64999\n▁rising\t-9.65036\n▁bent\t-9.65173\n▁hurry\t-9.65242\n▁bree\t-9.65243\nddle\t-9.65325\n▁pride\t-9.65356\n▁settled\t-9.65371\n▁justice\t-9.65381\n▁finding\t-9.65389\n▁lifted\t-9.65406\n▁soldier\t-9.65444\n▁regular\t-9.65511\n▁struggle\t-9.65511\n▁machine\t-9.65512\n▁sum\t-9.65631\n▁hurried\t-9.65647\n▁sufficient\t-9.65738\n▁throw\t-9.65747\n▁represent\t-9.65772\n▁supper\t-9.65918\n▁double\t-9.65922\n▁alarm\t-9.65924\n▁dreadful\t-9.65954\n▁stock\t-9.66116\n▁flow\t-9.66166\n▁example\t-9.66189\n▁roof\t-9.66189\n▁ce\t-9.66229\n▁supposed\t-9.66546\n▁preserv\t-9.666\n▁listened\t-9.66708\n▁col\t-9.66819\n▁secure\t-9.67009\n▁frightened\t-9.67014\nka\t-9.6705\n▁drive\t-9.67127\n▁disturb\t-9.67145\n▁emotion\t-9.67283\n▁servants\t-9.6735\n▁buy\t-9.674\n▁forced\t-9.67485\n▁kitchen\t-9.67558\nrin\t-9.6761\n▁terror\t-9.67696\n▁stairs\t-9.677\n▁sixty\t-9.67838\n▁ordinary\t-9.67972\n▁directly\t-9.67979\n▁heads\t-9.67985\n▁greatly\t-9.68092\n▁method\t-9.68111\n▁forgive\t-9.68116\n▁awful\t-9.68119\n▁reflect\t-9.68138\n▁talked\t-9.68277\n▁favour\t-9.6838\nties\t-9.68388\n▁welcome\t-9.68388\n▁tin\t-9.6845\n▁yo\t-9.68486\n▁butter\t-9.68532\n▁control\t-9.68668\n▁angel\t-9.68714\n▁vo\t-9.68747\nstone\t-9.68797\n▁ordered\t-9.6884\n▁usually\t-9.68842\n▁poet\t-9.68918\n▁bold\t-9.68985\nridge\t-9.69084\n▁adventure\t-9.69092\n▁watching\t-9.69214\n▁ride\t-9.69302\n▁folk\t-9.69436\n▁mistress\t-9.69518\n▁rate\t-9.69657\n▁grow
ing\t-9.69734\n▁evidence\t-9.69788\n▁cave\t-9.69821\n▁j\t-9.69842\n▁finger\t-9.69866\nbbe\t-9.699\n▁seventeen\t-9.69929\n▁moving\t-9.69932\n▁cow\t-9.69957\n▁doesn\t-9.69962\nator\t-9.70019\n▁type\t-9.70071\n▁tale\t-9.70074\n▁boil\t-9.70121\n▁deliver\t-9.70212\nire\t-9.70237\n▁farm\t-9.70249\n▁mil\t-9.70318\n▁feelings\t-9.70333\n▁monsieur\t-9.70353\n▁gathered\t-9.7039\n▁putting\t-9.70417\n▁remarked\t-9.70434\n▁er\t-9.70444\n▁contrary\t-9.70495\niness\t-9.70602\n▁crime\t-9.7078\n▁nearer\t-9.70882\n▁shame\t-9.71081\n▁loose\t-9.71084\n▁discover\t-9.71192\n▁flat\t-9.71232\n▁fail\t-9.7131\n▁twice\t-9.7135\n▁pla\t-9.71489\n▁europe\t-9.71637\n▁patient\t-9.71637\n▁unto\t-9.71665\n▁pair\t-9.71729\n▁suffer\t-9.7173\ntte\t-9.71755\nea\t-9.71796\n▁hy\t-9.71815\n▁treasure\t-9.71925\n▁eager\t-9.72052\n▁bi\t-9.72074\n▁salt\t-9.72239\n▁fly\t-9.72313\n▁parts\t-9.7254\npec\t-9.72573\n▁arthur\t-9.72647\n▁affairs\t-9.7268\n▁slow\t-9.72704\n▁consist\t-9.72808\n▁devil\t-9.72834\n▁affection\t-9.73001\n▁bore\t-9.7301\n▁kiss\t-9.73036\n▁engaged\t-9.73052\n▁officer\t-9.73173\nification\t-9.73228\n▁milk\t-9.73339\n▁process\t-9.73375\n▁gift\t-9.73398\n▁dan\t-9.73398\n▁lamp\t-9.73427\n▁hid\t-9.73427\n▁pulled\t-9.73464\n▁excellent\t-9.73521\n▁impression\t-9.73522\n▁telling\t-9.73545\n▁proved\t-9.73575\n▁authority\t-9.73576\n▁tower\t-9.73802\n▁consequence\t-9.73814\n▁ray\t-9.73837\n▁favor\t-9.73953\n▁flew\t-9.73962\n▁charles\t-9.73993\n▁address\t-9.73994\n▁familiar\t-9.74108\n▁confidence\t-9.74112\n▁limit\t-9.74112\n▁weeks\t-9.74244\n▁woods\t-9.74288\n▁direct\t-9.74355\n▁intention\t-9.74383\n▁rare\t-9.74439\n▁perform\t-9.74547\n▁solemn\t-9.74551\n▁distant\t-9.74552\n▁bur\t-9.74558\n▁image\t-9.74713\n▁president\t-9.74847\n▁firm\t-9.74855\n▁indian\t-9.74876\n▁rid\t-9.74907\n▁rank\t-9.74916\n▁liked\t-9.74918\n▁houses\t-9.74982\n▁agree\t-9.75016\n▁ya\t-9.7506\n▁matters\t-9.7508\n▁working\t-9.75208\n▁prison\t-9.75226\n▁major\t-9.75227\n▁slip\t-9.75273\nlike\t-9.75278\n▁mode\t-9.75344\n▁aware\t-9.75452\
n▁looks\t-9.75466\n▁weight\t-9.75468\n▁busy\t-9.75475\n▁wound\t-9.7562\n▁bath\t-9.75727\nhen\t-9.75879\n▁wore\t-9.75892\n▁exercise\t-9.7604\n▁similar\t-9.7604\n▁amount\t-9.7619\n▁questions\t-9.76376\n▁violent\t-9.76642\n▁excuse\t-9.76643\n▁aside\t-9.76705\n▁dull\t-9.76778\n▁emperor\t-9.76793\n▁nevertheless\t-9.76793\n▁shout\t-9.76836\ngue\t-9.76895\n▁explained\t-9.76923\n▁accomplish\t-9.76944\nlung\t-9.77072\n▁instantly\t-9.77126\n▁mistake\t-9.77134\n▁smooth\t-9.77248\n▁strike\t-9.77248\n▁horror\t-9.77552\n▁science\t-9.77552\n▁protest\t-9.77553\n▁bob\t-9.77559\n▁obey\t-9.77567\n▁manage\t-9.77573\n▁ama\t-9.77643\n▁press\t-9.77671\n▁necessity\t-9.77704\n▁splendid\t-9.77704\n▁holy\t-9.77754\n▁interesting\t-9.7778\nath\t-9.7784\n▁religion\t-9.77857\n▁unknown\t-9.77857\n▁fierce\t-9.7801\n▁disappeared\t-9.78045\n▁unc\t-9.78099\n▁naturally\t-9.7813\n▁louis\t-9.78163\n▁drove\t-9.78164\n▁played\t-9.78241\n▁brand\t-9.78401\nford\t-9.78471\n▁hate\t-9.78556\n▁lines\t-9.78597\n▁shoot\t-9.78625\n▁consent\t-9.78635\n▁agreed\t-9.7869\n▁seated\t-9.78715\n▁stir\t-9.78774\n▁circle\t-9.78778\n▁streets\t-9.78825\nbble\t-9.78905\n▁task\t-9.78939\n▁produced\t-9.7904\n▁accident\t-9.79087\nburg\t-9.79088\n▁lin\t-9.79162\n▁witness\t-9.79162\n▁liberty\t-9.79241\n▁detail\t-9.79242\n▁minister\t-9.79242\n▁powerful\t-9.79327\n▁savage\t-9.79397\n▁sixteen\t-9.79397\n▁pretend\t-9.79552\n▁coast\t-9.79554\n▁utter\t-9.79799\n▁named\t-9.79837\n▁clever\t-9.7993\n▁admit\t-9.79966\n▁couple\t-9.80019\n▁message\t-9.80021\n▁wicked\t-9.80023\n▁bro\t-9.80067\n▁temple\t-9.80175\n▁stones\t-9.80204\n▁yesterday\t-9.80332\n▁hills\t-9.80372\n▁plea\t-9.80428\n▁sca\t-9.80497\n▁slight\t-9.80546\n▁squ\t-9.80554\n▁diamond\t-9.80646\n▁possibly\t-9.80646\n▁affair\t-9.80767\n▁hearing\t-9.8086\n▁original\t-9.80867\n▁sell\t-9.80869\n▁worthy\t-9.80872\n▁cottage\t-9.8096\n▁progress\t-9.8096\n▁sacrifice\t-9.8096\n▁shock\t-9.80961\n▁sunday\t-9.80961\n▁design\t-9.80964\n▁sought\t-9.80966\nlus\t-9.81045\n▁otherwise\t-9.81118\nright\
t-9.81118\n▁prayer\t-9.81126\n▁cabin\t-9.81127\n▁dwell\t-9.81146\n▁rev\t-9.81234\n▁bridge\t-9.81314\n▁particularly\t-9.81374\nied\t-9.81392\n▁yield\t-9.81434\n▁treat\t-9.81442\n▁oak\t-9.81465\n▁gain\t-9.81614\nwin\t-9.81616\n▁rope\t-9.81746\ntan\t-9.81759\nou\t-9.81816\n▁orders\t-9.81844\n▁suspect\t-9.8191\n▁edward\t-9.82087\n▁eleven\t-9.82229\nability\t-9.82243\n▁occurred\t-9.82244\n▁teeth\t-9.82246\n▁val\t-9.82333\n▁lion\t-9.82382\n▁america\t-9.82547\n▁falling\t-9.8255\nists\t-9.82559\n▁depart\t-9.82607\n▁keeping\t-9.82633\n▁demand\t-9.82658\nnny\t-9.82735\n▁paused\t-9.82763\n▁ceased\t-9.82864\n▁cheer\t-9.83045\n▁pardon\t-9.83193\n▁native\t-9.83194\noon\t-9.83204\n▁beg\t-9.83285\nitude\t-9.83312\n▁dogs\t-9.83322\n▁required\t-9.8337\n▁elect\t-9.83506\n▁entertain\t-9.83514\nina\t-9.83517\n▁blu\t-9.83533\n▁huge\t-9.83628\n▁carrying\t-9.83629\n▁insist\t-9.83641\n▁satisfaction\t-9.83676\nboard\t-9.83736\n▁upper\t-9.83744\nord\t-9.8376\n▁hunt\t-9.83761\n▁countenance\t-9.83838\n▁maiden\t-9.83958\n▁james\t-9.84004\n▁foreign\t-9.84011\n▁failed\t-9.84019\n▁gather\t-9.8402\n▁fun\t-9.8409\n▁test\t-9.84104\n▁pal\t-9.84163\n▁mighty\t-9.84183\n▁pit\t-9.8431\n▁silk\t-9.84328\n▁terms\t-9.8435\n▁page\t-9.84434\n▁knees\t-9.84447\n▁brothers\t-9.84472\n▁shown\t-9.8448\n▁professor\t-9.84527\n▁log\t-9.84552\nmore\t-9.84553\n▁defi\t-9.8461\n▁cart\t-9.84746\n▁charm\t-9.84749\n▁require\t-9.84799\n▁proof\t-9.84816\n▁softly\t-9.84961\n▁unfortunate\t-9.8498\n▁possessed\t-9.84987\n▁severe\t-9.85032\n▁singing\t-9.85039\n▁stage\t-9.8507\n▁medi\t-9.85097\n▁price\t-9.85122\n▁freedom\t-9.85145\n▁farther\t-9.85228\n▁shouted\t-9.85263\n▁majesty\t-9.85309\n▁previous\t-9.85309\n▁guide\t-9.85355\n▁match\t-9.85362\n▁chest\t-9.85369\n▁intended\t-9.85443\n▁excitement\t-9.85485\n▁officers\t-9.85487\n▁shake\t-9.85565\n▁sentiment\t-9.85639\n▁gently\t-9.85644\n▁succeeded\t-9.85691\n▁sur\t-9.85879\n▁ki\t-9.8588\npha\t-9.85914\n▁mention\t-9.85927\n▁acquaintance\t-9.85969\n▁imagination\t-9.85969\n▁physical\t-9.85
969\n▁leading\t-9.85978\n▁slave\t-9.8605\n▁lock\t-9.8607\n▁base\t-9.86187\n▁steam\t-9.86204\n▁term\t-9.86288\n▁pointed\t-9.86301\n▁pipe\t-9.86304\n▁shade\t-9.86323\n▁invent\t-9.86325\n▁regret\t-9.86468\n▁alas\t-9.86474\n▁faithful\t-9.86713\n▁worked\t-9.86766\n▁bay\t-9.86795\n▁record\t-9.86801\n▁complain\t-9.86802\n▁mentioned\t-9.86831\n▁superior\t-9.86969\n▁hotel\t-9.87087\n▁seventy\t-9.87096\n▁sheep\t-9.87201\n▁advice\t-9.87304\n▁hidden\t-9.8732\n▁demanded\t-9.87361\n▁fore\t-9.8737\n▁meal\t-9.87387\n▁conscious\t-9.8739\nky\t-9.87404\n▁possess\t-9.87473\n▁praise\t-9.87488\n▁brow\t-9.87501\n▁fourth\t-9.87589\n▁events\t-9.87621\n▁advanced\t-9.87786\n▁resolved\t-9.87809\n▁stuff\t-9.87809\n▁cheerful\t-9.87861\n▁fri\t-9.87884\n▁fairy\t-9.87922\n▁birth\t-9.87978\n▁afford\t-9.8798\n▁grief\t-9.87988\n▁sides\t-9.88093\n▁substance\t-9.88147\n▁article\t-9.88148\n▁level\t-9.8815\n▁wake\t-9.88165\nville\t-9.88325\n▁joined\t-9.88349\n▁mist\t-9.88439\n▁practical\t-9.88486\n▁clearly\t-9.88488\n▁trace\t-9.88538\n▁awake\t-9.8864\n▁lack\t-9.88656\n▁basket\t-9.88656\n▁observe\t-9.88658\nette\t-9.88747\n▁spirits\t-9.88853\n▁excited\t-9.88955\n▁abandon\t-9.88997\n▁shining\t-9.89001\n▁fully\t-9.89019\n▁calling\t-9.89202\nvan\t-9.89205\n▁considerable\t-9.89318\n▁sprang\t-9.8934\n▁mile\t-9.89356\n▁dangerous\t-9.89425\n▁pounds\t-9.89446\n▁jew\t-9.89454\n▁fox\t-9.89599\n▁information\t-9.89684\n▁wit\t-9.89688\n▁deck\t-9.8973\n▁lies\t-9.8975\n▁paul\t-9.89839\n▁stars\t-9.90127\n▁anger\t-9.90188\n▁strain\t-9.90201\n▁faces\t-9.90244\n▁settle\t-9.90251\n▁adam\t-9.90281\n▁smith\t-9.90373\n▁citi\t-9.90381\n▁importance\t-9.90385\n▁feather\t-9.9072\n▁willing\t-9.90763\n▁served\t-9.90764\n▁author\t-9.90817\n▁perceived\t-9.90847\n▁haven\t-9.90898\n▁flame\t-9.90907\n▁divine\t-9.90945\n▁trail\t-9.91006\n▁anybody\t-9.91068\n▁sigh\t-9.91159\n▁delicate\t-9.91243\n▁desired\t-9.91307\nwar\t-9.91329\n▁curiosity\t-9.91418\n▁practice\t-9.91418\n▁fold\t-9.91533\n▁absolutely\t-9.91541\n▁bottle\t-9.91607\n▁considerat
ion\t-9.91616\n▁prop\t-9.91638\n▁meat\t-9.91639\n▁choose\t-9.91768\n▁occupied\t-9.91768\n▁interested\t-9.91782\n▁throat\t-9.91978\n▁candle\t-9.91985\n▁dawn\t-9.91996\ncha\t-9.92028\n▁protect\t-9.92033\n▁sentence\t-9.92088\n▁rocks\t-9.92105\n▁apparently\t-9.9218\n▁portion\t-9.92182\n▁aid\t-9.92242\n▁tight\t-9.92315\n▁actually\t-9.92396\n▁presented\t-9.92442\n▁dying\t-9.92675\n▁daily\t-9.92765\n▁political\t-9.92827\n▁bodies\t-9.92828\n▁suffered\t-9.9284\n▁modern\t-9.92845\n▁completely\t-9.92895\n▁sooner\t-9.92933\n▁advance\t-9.93029\n▁refused\t-9.93067\n▁farmer\t-9.93074\n▁polite\t-9.93183\n▁plate\t-9.93356\n▁thunder\t-9.93361\n▁elsie\t-9.93364\n▁sailor\t-9.93371\n▁brief\t-9.93374\n▁suggested\t-9.93403\n▁anti\t-9.93442\n▁flesh\t-9.93541\n▁buck\t-9.93573\n▁weep\t-9.93586\n▁dri\t-9.93665\n▁ocean\t-9.93719\n▁spend\t-9.93721\n▁odd\t-9.9377\n▁governor\t-9.93809\nwell\t-9.93829\n▁entrance\t-9.93898\n▁suspicion\t-9.93898\n▁stepped\t-9.93935\n▁rapidly\t-9.93971\n▁check\t-9.93987\nlow\t-9.94128\n▁club\t-9.94131\n▁flight\t-9.94132\n▁hide\t-9.94165\n▁entire\t-9.94167\n▁indians\t-9.94179\n▁sam\t-9.94213\n▁capital\t-9.94257\n▁mamma\t-9.94258\n▁jud\t-9.94284\n▁correct\t-9.94437\n▁haste\t-9.94579\n▁pace\t-9.9458\n▁crack\t-9.94583\n▁sensation\t-9.94619\n▁worst\t-9.94619\n▁driven\t-9.94787\n▁midst\t-9.94797\n▁august\t-9.94799\n▁proportion\t-9.94799\n▁innocent\t-9.94799\nja\t-9.94854\n▁doors\t-9.94913\n▁regarded\t-9.95005\n▁education\t-9.95016\n▁employ\t-9.95052\n▁truly\t-9.95138\nliness\t-9.9516\n▁instrument\t-9.95161\n▁foolish\t-9.95213\nility\t-9.95287\n▁frame\t-9.95289\n▁taught\t-9.95343\n▁nay\t-9.95365\n▁hang\t-9.95432\n▁argument\t-9.95525\n▁nineteen\t-9.95525\n▁elder\t-9.95574\nog\t-9.95638\n▁spar\t-9.95647\n▁papers\t-9.95683\n▁neighbor\t-9.957\n▁instruct\t-9.95708\n▁reward\t-9.95728\n▁fields\t-9.95806\n▁equally\t-9.95809\n▁needed\t-9.95816\n▁conditions\t-9.95965\n▁ways\t-9.95977\n▁request\t-9.96074\n▁worn\t-9.96075\n▁dig\t-9.96135\n▁load\t-9.96212\n▁remarkable\t-9.96225\n▁worshi
p\t-9.96257\n▁park\t-9.96344\n▁interrupted\t-9.96393\n▁skill\t-9.96396\n▁critic\t-9.96441\n▁distress\t-9.96442\n▁belief\t-9.96442\n▁stern\t-9.9649\n▁track\t-9.96546\n▁hunting\t-9.96568\n▁jewel\t-9.96585\n▁gradually\t-9.96625\n▁glow\t-9.96653\n▁mental\t-9.96704\n▁rushed\t-9.96737\n▁powers\t-9.96763\n▁visitor\t-9.96783\night\t-9.96826\n▁behold\t-9.96859\n▁ski\t-9.96872\n▁picked\t-9.96903\n▁expressed\t-9.96991\nartagnan\t-9.96994\n▁moreover\t-9.96997\n▁keen\t-9.96998\n▁operation\t-9.97029\n▁careful\t-9.97036\n▁hence\t-9.97131\n▁wander\t-9.97162\n▁enemies\t-9.9718\n▁mysterious\t-9.9718\n▁assert\t-9.97181\n▁depth\t-9.97182\nium\t-9.97185\n▁prefer\t-9.97198\n▁charming\t-9.97301\n▁crossed\t-9.97306\n▁dread\t-9.97315\nnnie\t-9.97438\n▁robin\t-9.97446\n▁relief\t-9.97556\n▁inquired\t-9.9758\n▁apple\t-9.97602\n▁urge\t-9.97616\n▁wings\t-9.97698\n▁choice\t-9.97737\n▁tre\t-9.97846\n▁species\t-9.97924\n▁delighted\t-9.97997\n▁rapid\t-9.98035\n▁appeal\t-9.98111\n▁famous\t-9.98111\n▁civili\t-9.98157\n▁helen\t-9.98168\n▁useful\t-9.9818\n▁card\t-9.98181\n▁newspaper\t-9.98298\n▁plenty\t-9.98298\nqua\t-9.98375\n▁bearing\t-9.98432\n▁nervous\t-9.98486\n▁rub\t-9.98727\n▁roar\t-9.98756\n▁wounded\t-9.98825\n▁chain\t-9.98829\n▁produce\t-9.98919\n▁reflection\t-9.99014\n▁baron\t-9.99026\n▁merchant\t-9.99051\n▁quarrel\t-9.99051\n▁glory\t-9.99051\n▁begun\t-9.99086\n▁queer\t-9.99244\n▁mix\t-9.9934\n▁whisper\t-9.99361\nrg\t-9.99439\n▁buried\t-9.9944\n▁bid\t-9.99446\n▁tip\t-9.99521\n▁frequently\t-9.99541\n▁div\t-9.99601\n▁knee\t-9.99684\n▁region\t-9.99813\nctor\t-9.99893\n▁root\t-9.99909\n▁trip\t-9.99947\n▁jealous\t-10\nhead\t-10.0005\n▁saved\t-10.0006\n▁pig\t-10.0007\n▁phil\t-10.0019\n▁union\t-10.0028\n▁ships\t-10.0029\n▁companions\t-10.0031\n▁approached\t-10.0038\n▁harry\t-10.0038\n▁arrival\t-10.0038\n▁drunk\t-10.0038\n▁slept\t-10.0038\n▁furnish\t-10.0038\n▁hale\t-10.0039\n▁para\t-10.004\n▁heap\t-10.0047\n▁absence\t-10.0058\n▁shoes\t-10.0065\n▁consciousness\t-10.0067\n▁kindly\t-10.008\nbel\t-10.008
3\n▁evident\t-10.0089\n▁lest\t-10.0095\n▁grasp\t-10.0104\n▁steal\t-10.0106\nlon\t-10.0107\n▁knife\t-10.0115\n▁precious\t-10.0115\n▁element\t-10.0118\n▁proceeded\t-10.013\n▁fever\t-10.013\n▁leader\t-10.0134\n▁risk\t-10.0137\n▁ease\t-10.0139\n▁mount\t-10.0149\n▁meanwhile\t-10.0154\n▁century\t-10.0154\n▁grim\t-10.0155\n▁owe\t-10.0167\n▁judgment\t-10.0173\n▁arose\t-10.0174\n▁vision\t-10.0176\n▁sang\t-10.0177\n▁extreme\t-10.0186\n▁constant\t-10.0186\n▁asking\t-10.0188\n▁observation\t-10.0192\n▁thrust\t-10.0192\n▁delay\t-10.0193\n▁hit\t-10.0211\n▁includ\t-10.0212\n▁admire\t-10.0212\n▁lift\t-10.0219\n▁lesson\t-10.022\n▁friendship\t-10.0221\n▁spare\t-10.0222\n▁issue\t-10.0223\n▁principal\t-10.0231\n▁mourn\t-10.0232\n▁capable\t-10.0235\n▁burning\t-10.0241\n▁accepted\t-10.0242\n▁extraordinary\t-10.0251\n▁hoped\t-10.0256\n▁removed\t-10.0257\n▁horn\t-10.0261\n▁cent\t-10.0262\n▁alice\t-10.0272\n▁chap\t-10.028\n▁apartment\t-10.0284\n▁fighting\t-10.0284\n▁trembling\t-10.029\n▁somebody\t-10.029\n▁anyone\t-10.0291\n▁blame\t-10.0294\n▁bride\t-10.0299\n▁reader\t-10.0304\n▁everywhere\t-10.031\n▁labour\t-10.031\n▁recall\t-10.031\n▁rob\t-10.0317\n▁bull\t-10.0324\n▁council\t-10.0329\n▁popular\t-10.0329\n▁trial\t-10.0337\n▁wishes\t-10.0348\n▁dun\t-10.0349\n▁assured\t-10.0349\n▁brilliant\t-10.0349\n▁forgot\t-10.035\n▁cab\t-10.0352\n▁continue\t-10.0358\n▁acknowledg\t-10.0369\n▁retreat\t-10.0369\n▁increased\t-10.0374\n▁contempt\t-10.0389\n▁grandfather\t-10.0389\n▁sympathy\t-10.0389\n▁ghost\t-10.0389\n▁creatures\t-10.0407\n▁ken\t-10.0408\n▁stretched\t-10.0409\n▁playing\t-10.0415\n▁hind\t-10.0417\n▁members\t-10.0428\n▁miserable\t-10.0428\n▁kindness\t-10.0435\n▁gla\t-10.0444\n▁highest\t-10.0447\naries\t-10.0457\n▁eighty\t-10.0467\n▁kissed\t-10.0468\n▁deserve\t-10.0468\n▁begged\t-10.0474\n▁hut\t-10.0478\n▁closely\t-10.0485\n▁wondered\t-10.0499\n▁larger\t-10.0505\n▁accordingly\t-10.0508\n▁military\t-10.0508\n▁remind\t-10.0508\n▁destroy\t-10.0527\n▁maintain\t-10.0528\n▁engine\t-10.0528\n▁motive\t-1
0.0529\nwick\t-10.0531\n▁strip\t-10.0543\nison\t-10.0544\n▁hans\t-10.0548\n▁ahead\t-10.0562\n▁magic\t-10.0565\n▁infinite\t-10.0569\n▁prompt\t-10.0569\n▁informed\t-10.0571\n▁peer\t-10.0594\n▁pressed\t-10.0603\n▁somewhere\t-10.0609\n▁bought\t-10.0609\n▁trap\t-10.0621\n▁scar\t-10.0623\n▁visible\t-10.063\n▁ashamed\t-10.0631\ngar\t-10.0643\n▁neighbour\t-10.0649\n▁constitution\t-10.065\n▁intelligence\t-10.065\n▁tear\t-10.0651\n▁profession\t-10.0655\n▁hungry\t-10.0661\n▁smell\t-10.067\n▁listening\t-10.0671\n▁stories\t-10.0672\n▁approach\t-10.0676\n▁aim\t-10.0681\n▁ham\t-10.0682\n▁string\t-10.0684\n▁explanation\t-10.0691\n▁immense\t-10.0691\n▁religious\t-10.0691\n▁hollow\t-10.0691\nabeth\t-10.0691\n▁throughout\t-10.0691\n▁await\t-10.0691\n▁flying\t-10.0699\ncum\t-10.071\n▁scream\t-10.0711\n▁active\t-10.0716\nport\t-10.0718\nett\t-10.0729\n▁product\t-10.0731\n▁unhappy\t-10.0731\n▁vague\t-10.0733\n▁stupid\t-10.0752\n▁dignity\t-10.0752\n▁isabel\t-10.0752\n▁pitch\t-10.0767\n▁comrade\t-10.0773\n▁reckon\t-10.0773\n▁stiff\t-10.0773\nrick\t-10.0779\n▁spark\t-10.078\n▁sold\t-10.0785\n▁stro\t-10.0806\n▁crying\t-10.0812\n▁repeat\t-10.0817\n▁comfortable\t-10.0831\n▁marked\t-10.0834\n▁project\t-10.0835\n▁becoming\t-10.0835\n▁parents\t-10.0835\n▁shelter\t-10.0836\nfield\t-10.0839\n▁nest\t-10.0841\n▁stole\t-10.0843\n▁hint\t-10.0844\n▁trick\t-10.0849\n▁thoroughly\t-10.0852\n▁hospital\t-10.0855\n▁weapon\t-10.0855\n▁style\t-10.0856\n▁rome\t-10.0857\n▁admitted\t-10.0862\n▁safety\t-10.0866\n▁understanding\t-10.0871\n▁weary\t-10.0872\n▁slaves\t-10.088\n▁print\t-10.0886\n▁credit\t-10.0897\n▁unable\t-10.0914\n▁clouds\t-10.0917\n▁conclusion\t-10.0918\n▁seldom\t-10.0918\n▁unusual\t-10.0918\n▁hanging\t-10.0942\n▁david\t-10.096\n▁bowed\t-10.0963\nmond\t-10.0969\n▁pushed\t-10.0983\n▁escaped\t-10.0988\n▁warn\t-10.099\n▁betray\t-10.1002\n▁eggs\t-10.1024\n▁plainly\t-10.1028\n▁ser\t-10.1036\n▁exhibit\t-10.1044\n▁gay\t-10.1047\n▁display\t-10.1065\n▁member\t-10.1066\n▁grin\t-10.1078\n▁prospect\t-10.1086\n▁b
rush\t-10.1086\n▁waves\t-10.1087\n▁successful\t-10.11\n▁extent\t-10.1108\n▁persuade\t-10.1129\n▁mood\t-10.1136\n▁mid\t-10.1138\n▁arranged\t-10.115\n▁universal\t-10.115\n▁jim\t-10.1153\n▁signal\t-10.116\n▁whilst\t-10.1172\n▁wolf\t-10.1172\n▁philip\t-10.1173\n▁billy\t-10.1195\n▁eagerly\t-10.1196\n▁returning\t-10.1207\n▁conscience\t-10.1215\n▁fortunate\t-10.1215\n▁gleam\t-10.1215\n▁female\t-10.1215\n▁hastily\t-10.1216\n▁provided\t-10.1218\n▁obtain\t-10.1221\n▁render\t-10.1221\n▁instinct\t-10.1236\n▁concerning\t-10.1239\n▁concerned\t-10.1241\n▁rum\t-10.1247\n▁vol\t-10.1256\n▁somehow\t-10.1258\n▁gall\t-10.1259\n▁pink\t-10.126\n▁artist\t-10.1267\n▁accustomed\t-10.128\n▁unconscious\t-10.128\n▁advise\t-10.128\nmmed\t-10.1283\n▁tiny\t-10.1288\n▁mud\t-10.1288\n▁branches\t-10.1291\n▁refuse\t-10.1294\n▁rage\t-10.1295\n▁bishop\t-10.1301\n▁supply\t-10.1301\n▁peasant\t-10.1301\n▁lawyer\t-10.1302\n▁connection\t-10.1306\n▁develop\t-10.1316\n▁correspond\t-10.1323\n▁rang\t-10.1325\nhouse\t-10.1336\n▁plum\t-10.1345\n▁nodded\t-10.1345\n▁slipped\t-10.1347\n▁kit\t-10.1349\n▁constantly\t-10.1352\n▁earl\t-10.1356\n▁fairly\t-10.1365\n▁features\t-10.138\n▁pause\t-10.1384\n▁painful\t-10.1388\n▁super\t-10.1397\n▁laughter\t-10.1399\n▁whence\t-10.14\n▁opera\t-10.1401\n▁joe\t-10.1402\n▁eating\t-10.1408\n▁christmas\t-10.1411\ntime\t-10.1412\n▁wholly\t-10.1416\n▁apart\t-10.1418\n▁coach\t-10.1418\n▁crew\t-10.143\n▁cheeks\t-10.1431\n▁revolution\t-10.1432\n▁lonely\t-10.1433\n▁attain\t-10.1433\n▁luc\t-10.1436\n▁established\t-10.1437\n▁throne\t-10.1439\n▁dash\t-10.144\n▁friendly\t-10.1443\n▁exhaust\t-10.1454\n▁cliff\t-10.1455\n▁reveal\t-10.1455\n▁adopt\t-10.1455\n▁centre\t-10.1457\n▁merry\t-10.1469\n▁sylvia\t-10.1477\n▁misfortune\t-10.1499\n▁feast\t-10.1499\n▁arab\t-10.1509\n▁fetch\t-10.1521\n▁descend\t-10.153\nick\t-10.1531\n▁nut\t-10.1542\n▁fought\t-10.1543\nko\t-10.1545\n▁setting\t-10.1558\n▁source\t-10.1566\n▁persist\t-10.1566\n▁mercy\t-10.1571\n▁compare\t-10.1581\n▁deeply\t-10.1584\n▁pile\t-10.1584\
n▁attitude\t-10.1588\n▁delightful\t-10.1597\n▁endure\t-10.1602\n▁patience\t-10.161\n▁local\t-10.161\n▁victory\t-10.1615\n▁uttered\t-10.1622\n▁treated\t-10.1623\n▁separate\t-10.1626\n▁dragg\t-10.1627\n▁beard\t-10.1643\n▁rear\t-10.1652\n▁tied\t-10.1657\n▁title\t-10.1657\n▁triumph\t-10.1674\n▁gained\t-10.1688\n▁defend\t-10.17\nbury\t-10.1714\n▁increase\t-10.1717\n▁bark\t-10.172\n▁fled\t-10.1725\n▁pond\t-10.1728\n▁conquer\t-10.1746\n▁forehead\t-10.1746\n▁wag\t-10.1749\n▁organi\t-10.1751\n▁anxiety\t-10.1768\n▁encounter\t-10.1768\n▁sex\t-10.1773\n▁sank\t-10.1779\n▁halt\t-10.1784\nella\t-10.1789\n▁cheek\t-10.1792\n▁writer\t-10.1793\nchi\t-10.1796\n▁employed\t-10.1805\n▁humble\t-10.1806\n▁raise\t-10.181\n▁troops\t-10.1814\n▁distinguished\t-10.1816\n▁giant\t-10.1821\n▁sink\t-10.1822\n▁flag\t-10.1826\ncar\t-10.1826\n▁obtained\t-10.183\n▁discovery\t-10.1836\n▁national\t-10.1842\n▁jumped\t-10.1842\n▁commission\t-10.1859\n▁positive\t-10.1859\n▁loving\t-10.186\n▁exact\t-10.1861\n▁ideal\t-10.1862\n▁range\t-10.1864\n▁refer\t-10.1874\n▁murmured\t-10.1877\n▁encourage\t-10.1882\n▁college\t-10.1882\n▁novel\t-10.1884\nworth\t-10.1892\n▁mortal\t-10.1906\n▁fan\t-10.1914\n▁rolled\t-10.1915\n▁guilty\t-10.1918\n▁victor\t-10.1926\n▁approaching\t-10.1945\n▁relative\t-10.1952\n▁estate\t-10.1952\n▁ugly\t-10.1952\n▁metal\t-10.1967\n▁dared\t-10.1969\n▁boots\t-10.1969\n▁robert\t-10.1976\n▁clock\t-10.198\n▁admiration\t-10.1998\n▁fourteen\t-10.1998\n▁witch\t-10.1999\n▁barbar\t-10.2001\n▁pra\t-10.2017\n▁cake\t-10.2022\n▁shone\t-10.2025\n▁managed\t-10.2031\n▁volume\t-10.2045\n▁greek\t-10.2045\n▁dancing\t-10.2045\nj\t-10.2055\n▁wretched\t-10.2055\n▁condemn\t-10.2068\n▁magnificent\t-10.2068\n▁consult\t-10.2068\n▁fleet\t-10.2083\n▁arrangement\t-10.2092\n▁incident\t-10.2092\n▁misery\t-10.2092\n▁arrow\t-10.2094\n▁stroke\t-10.2099\n▁assist\t-10.21\n▁succeed\t-10.2108\n▁recent\t-10.2109\n▁build\t-10.211\n▁desperate\t-10.2115\n▁widow\t-10.2115\n▁market\t-10.2129\nfall\t-10.213\n▁wisdom\t-10.2139\n▁current\t-10
.2139\n▁spoil\t-10.2139\n▁resist\t-10.2161\n▁obvious\t-10.2163\n▁sensible\t-10.2163\n▁wooden\t-10.2166\n▁addressed\t-10.2184\n▁bade\t-10.2185\n▁counsel\t-10.2186\n▁select\t-10.2186\n▁purchase\t-10.2186\n▁useless\t-10.2187\n▁fin\t-10.2195\n▁bringing\t-10.2207\n▁arrest\t-10.221\n▁stared\t-10.2212\n▁poison\t-10.2213\n▁gil\t-10.2214\n▁swallow\t-10.2234\n▁anna\t-10.2234\nrate\t-10.2234\n▁slid\t-10.2236\n▁block\t-10.2237\n▁sport\t-10.2242\n▁ninety\t-10.2245\n▁provide\t-10.2255\n▁lamb\t-10.2259\n▁interval\t-10.226\n▁described\t-10.228\n▁provision\t-10.2282\n▁striking\t-10.2282\n▁proposed\t-10.2285\n▁jump\t-10.2287\n▁suggest\t-10.2303\n▁melancholy\t-10.2306\n▁warrior\t-10.2306\n▁burden\t-10.2308\n▁departure\t-10.2309\n▁limb\t-10.2316\n▁troubled\t-10.2325\n▁meadow\t-10.233\n▁sacred\t-10.233\n▁straw\t-10.233\n▁tru\t-10.2332\n▁solid\t-10.2334\n▁soil\t-10.2348\n▁lucy\t-10.2348\n▁civil\t-10.2348\n▁recover\t-10.2348\n▁energy\t-10.2354\n▁powder\t-10.2354\n▁resumed\t-10.2354\n▁intense\t-10.2354\n▁british\t-10.2378\n▁agreeable\t-10.2389\n▁trot\t-10.2393\n▁everyone\t-10.2393\n▁concern\t-10.2394\n▁voyage\t-10.2402\n▁southern\t-10.2402\n▁bosom\t-10.2406\n▁utterly\t-10.2424\n▁essential\t-10.2426\n▁feed\t-10.2427\n▁household\t-10.243\n▁extremely\t-10.2434\n▁wondering\t-10.2435\n▁list\t-10.2446\n▁experiment\t-10.2451\n▁joseph\t-10.2451\n▁mystery\t-10.2451\n▁restore\t-10.2455\n▁blush\t-10.2456\nfold\t-10.2459\n▁lap\t-10.2464\n▁chosen\t-10.2471\n▁epi\t-10.2472\n▁intellect\t-10.2475\n▁curtain\t-10.2475\nology\t-10.2475\n▁pine\t-10.2477\n▁mounted\t-10.2481\nhar\t-10.249\n▁punish\t-10.2492\n▁drift\t-10.2502\n▁wedding\t-10.2506\n▁ko\t-10.2508\n▁preparation\t-10.2524\n▁resolution\t-10.2524\n▁oppress\t-10.2524\n▁fix\t-10.2535\n▁sch\t-10.2548\n▁victim\t-10.2549\n▁summon\t-10.2549\n▁julia\t-10.2549\n▁flood\t-10.2551\n▁slightly\t-10.257\n▁lodge\t-10.2578\n▁unexpected\t-10.2598\n▁confusion\t-10.2598\n▁addition\t-10.2598\n▁conceive\t-10.2598\n▁jesus\t-10.2599\n▁wire\t-10.2608\nlong\t-10.2615\n▁rude\t-
10.2624\n▁fatal\t-10.2627\n▁patch\t-10.2629\n▁careless\t-10.2629\n▁vari\t-10.2635\n▁wal\t-10.2643\n▁catherine\t-10.2647\n▁parliament\t-10.2647\n▁profound\t-10.2647\n▁aloud\t-10.2648\n▁relieve\t-10.2649\n▁push\t-10.266\n▁accompanied\t-10.2672\n▁sovereign\t-10.2672\n▁singular\t-10.2672\n▁composed\t-10.2672\n▁assistance\t-10.2676\n▁echo\t-10.2678\n▁shaking\t-10.2679\n▁teacher\t-10.2684\n▁horrible\t-10.2697\n▁strict\t-10.2697\n▁gown\t-10.2703\n▁punishment\t-10.2704\n▁verse\t-10.2712\natory\t-10.2712\n▁mistaken\t-10.2716\n▁swept\t-10.2722\n▁gesture\t-10.2722\n▁steel\t-10.2724\n▁bush\t-10.2735\n▁affected\t-10.2739\n▁directed\t-10.2745\n▁absurd\t-10.2747\n▁surrounded\t-10.2747\n▁scrap\t-10.2749\n▁sugar\t-10.2749\n▁immediate\t-10.2753\n▁saddle\t-10.2753\n▁sighed\t-10.2768\n▁govern\t-10.2768\n▁pea\t-10.2769\n▁snap\t-10.2769\n▁arise\t-10.277\n▁exchange\t-10.2772\n▁impatient\t-10.2772\n▁whip\t-10.2794\n▁stretch\t-10.2797\n▁embrace\t-10.2798\n▁disease\t-10.2798\n▁profit\t-10.2798\n▁riding\t-10.2802\n▁recovered\t-10.2803\n▁convinced\t-10.2814\n▁leaning\t-10.2815\n▁domestic\t-10.2823\n▁complex\t-10.2823\n▁manifest\t-10.2823\n▁indulge\t-10.2823\n▁genius\t-10.2824\n▁agent\t-10.2841\n▁veil\t-10.2841\n▁description\t-10.2848\n▁inclined\t-10.2848\n▁deceive\t-10.2848\n▁mac\t-10.2851\n▁darling\t-10.2861\n▁reign\t-10.2866\n▁enormous\t-10.2874\n▁restrain\t-10.2874\n▁duties\t-10.2876\n▁enable\t-10.2899\nttered\t-10.2902\n▁pole\t-10.2906\n▁exception\t-10.292\n▁intimate\t-10.2925\n▁countess\t-10.2927\n▁tribe\t-10.2931\n▁oil\t-10.2938\ncast\t-10.2944\n▁handkerchief\t-10.295\n▁midnight\t-10.295\n▁problem\t-10.295\n▁reli\t-10.2951\n▁unre\t-10.2952\n▁crush\t-10.2959\n▁discuss\t-10.296\n▁tramp\t-10.296\n▁whirl\t-10.2977\n▁hori\t-10.2985\nhin\t-10.2992\n▁official\t-10.3001\n▁drown\t-10.3002\n▁pierre\t-10.3002\n▁scheme\t-10.3002\n▁locked\t-10.3006\n▁permitted\t-10.3007\n▁carr\t-10.3007\n▁connected\t-10.3008\n▁assure\t-10.3015\n▁cock\t-10.3018\n▁utmost\t-10.3027\n▁devoted\t-10.3027\n▁sufficiently\t-1
0.3036\nulation\t-10.304\n▁intellectual\t-10.3053\n▁carpet\t-10.3053\n▁objection\t-10.3062\n▁afterward\t-10.3067\n▁reality\t-10.3067\ncho\t-10.3068\ngate\t-10.3074\n▁negro\t-10.3079\n▁retain\t-10.3079\n▁ascend\t-10.3079\n▁cease\t-10.308\n▁marvel\t-10.3081\nmost\t-10.3086\n▁bond\t-10.3092\n▁kate\t-10.3101\n▁breaking\t-10.3104\n▁coal\t-10.3105\n▁ignorant\t-10.3106\n▁twin\t-10.3109\n▁astonishment\t-10.3131\n▁coffee\t-10.3131\n▁execut\t-10.3146\n▁origin\t-10.3147\n▁final\t-10.3151\n▁inhabitants\t-10.3157\n▁stable\t-10.3164\n▁parties\t-10.3169\n▁cities\t-10.3169\n▁generous\t-10.3183\n▁describe\t-10.3185\n▁jar\t-10.3187\n▁plunge\t-10.3192\n▁announced\t-10.3202\n▁merit\t-10.3207\n▁ere\t-10.3222\n▁disappoint\t-10.3228\n▁suggestion\t-10.3233\n▁doubtless\t-10.3234\n▁trunk\t-10.3236\n▁job\t-10.3253\n▁stamp\t-10.3257\n▁divided\t-10.3258\n▁appointed\t-10.3259\n▁acquainted\t-10.3262\n▁absolute\t-10.327\n▁fearful\t-10.3279\n▁privilege\t-10.3289\n▁steep\t-10.3291\n▁vote\t-10.3291\n▁craft\t-10.3296\n▁hunter\t-10.3296\n▁modest\t-10.3303\n▁forbid\t-10.3305\n▁endeavour\t-10.3315\n▁sweep\t-10.3315\n▁beheld\t-10.3315\nacious\t-10.332\n▁absorb\t-10.3342\n▁construct\t-10.3342\n▁expedition\t-10.3342\n▁empire\t-10.3342\n▁erect\t-10.3343\n▁offend\t-10.3344\n▁intend\t-10.3351\n▁chin\t-10.3356\n▁permit\t-10.3363\n▁contract\t-10.3368\n▁thirst\t-10.3369\n▁destroyed\t-10.337\n▁ger\t-10.3375\n▁wagon\t-10.3378\n▁gloom\t-10.3393\n▁atmosphere\t-10.3395\n▁reserve\t-10.3395\nlock\t-10.3412\n▁nonsense\t-10.3422\n▁prevail\t-10.3422\n▁quality\t-10.3422\n▁clasp\t-10.3422\n▁concluded\t-10.3426\n▁katy\t-10.3433\n▁eternal\t-10.3449\n▁neglect\t-10.3449\n▁creep\t-10.345\n▁squire\t-10.345\n▁muttered\t-10.3452\n▁electric\t-10.3452\n▁hay\t-10.3456\n▁expense\t-10.3476\n▁scorn\t-10.3476\n▁retired\t-10.3476\n▁murmur\t-10.3482\n▁stout\t-10.3484\n▁sharply\t-10.35\n▁district\t-10.3503\n▁leaf\t-10.3503\n▁failure\t-10.3507\n▁numerous\t-10.353\n▁infant\t-10.3531\n▁traveller\t-10.3535\n▁crep\t-10.354\n▁june\t-10.3547\nwork\t
-10.3547\n▁hunger\t-10.3548\n▁recommend\t-10.3557\n▁jean\t-10.3562\n▁richard\t-10.3571\n▁monte\t-10.3588\n▁preach\t-10.3593\n▁palm\t-10.3594\n▁tap\t-10.36\n▁anywhere\t-10.3612\n▁disposition\t-10.3612\n▁mirror\t-10.3612\n▁venture\t-10.3616\n▁pound\t-10.3638\n▁cigar\t-10.3639\n▁invited\t-10.364\n▁bench\t-10.3645\n▁protection\t-10.3653\n▁benefit\t-10.3667\n▁thomas\t-10.3667\n▁reproach\t-10.3694\n▁clerk\t-10.3694\nhu\t-10.3707\n▁uniform\t-10.3722\n▁generation\t-10.3722\n▁compass\t-10.3722\n▁warning\t-10.3723\n▁extended\t-10.3728\n▁difficulties\t-10.3731\n▁affect\t-10.374\n▁maybe\t-10.3741\n▁comb\t-10.3743\n▁seal\t-10.3743\n▁groan\t-10.3743\n▁western\t-10.3751\n▁chop\t-10.3753\n▁earn\t-10.3756\n▁score\t-10.3758\n▁idle\t-10.3761\n▁astonished\t-10.3777\n▁introduced\t-10.3777\n▁lieutenant\t-10.3777\n▁leisure\t-10.3777\n▁violence\t-10.3777\n▁firmly\t-10.3778\n▁monster\t-10.3784\n▁properly\t-10.3785\n▁rendered\t-10.3797\n▁twist\t-10.3805\n▁pirate\t-10.3807\n▁batter\t-10.3808\n▁robber\t-10.3809\n▁wept\t-10.3815\n▁descended\t-10.3821\n▁throwing\t-10.3822\n▁leaned\t-10.3823\n▁ornament\t-10.3834\n▁andrew\t-10.3839\n▁capture\t-10.3841\n▁bushes\t-10.3852\n▁republic\t-10.3861\n▁confident\t-10.3862\n▁lean\t-10.3902\n▁date\t-10.3904\n▁counter\t-10.3909\n▁northern\t-10.3918\n▁pearl\t-10.3924\n▁nearest\t-10.3933\n▁francis\t-10.3946\n▁wandering\t-10.3948\n▁frequent\t-10.3957\n▁startled\t-10.3961\n▁statement\t-10.3965\n▁occur\t-10.3971\n▁bloom\t-10.3974\n▁nerve\t-10.3974\n▁induce\t-10.3978\n▁flatter\t-10.3984\n▁ambition\t-10.4002\n▁madam\t-10.4005\n▁monk\t-10.4018\n▁rent\t-10.4023\n▁investigat\t-10.4031\n▁rabbit\t-10.4031\n▁confirm\t-10.4031\n▁regiment\t-10.4031\n▁submit\t-10.4031\n▁spell\t-10.4032\n▁eva\t-10.4033\n▁slope\t-10.4036\n▁furious\t-10.4037\n▁bestow\t-10.4047\n▁rail\t-10.4057\n▁ralph\t-10.4059\n▁compelled\t-10.4059\n▁thread\t-10.4059\n▁scattered\t-10.406\n▁deny\t-10.4067\n▁curl\t-10.4068\n▁chill\t-10.4075\n▁pronounc\t-10.4088\n▁mankind\t-10.4088\n▁cattle\t-10.4091\n▁male\t-10.4
097\n▁execution\t-10.41\n▁tide\t-10.4115\n▁supreme\t-10.4117\n▁valuable\t-10.4117\n▁likewise\t-10.4117\n▁convey\t-10.4117\n▁gloomy\t-10.4119\n▁coin\t-10.4122\n▁actual\t-10.4129\n▁fog\t-10.4136\n▁tax\t-10.4139\n▁province\t-10.4146\n▁grateful\t-10.4146\n▁spiritual\t-10.4146\n▁vanished\t-10.4146\n▁diana\t-10.4146\n▁haunt\t-10.4146\n▁dragon\t-10.4151\n▁crawl\t-10.4153\n▁neat\t-10.4154\n▁china\t-10.4171\n▁gratitude\t-10.4174\n▁gasp\t-10.4179\n▁irre\t-10.419\n▁finish\t-10.4193\n▁intent\t-10.4198\n▁fright\t-10.4202\n▁embarrass\t-10.4203\n▁thirteen\t-10.4203\n▁ruth\t-10.4209\n▁slightest\t-10.4212\n▁development\t-10.4213\n▁interview\t-10.4233\n▁spectacle\t-10.4233\n▁brook\t-10.4233\n▁weakness\t-10.4255\n▁audience\t-10.4262\n▁consequently\t-10.4262\n▁abroad\t-10.4262\n▁release\t-10.4262\n▁aspect\t-10.4263\n▁painted\t-10.4263\n▁insult\t-10.4263\n▁sooth\t-10.4269\n▁disappointment\t-10.427\n▁emerg\t-10.4271\n▁brig\t-10.4284\n▁esteem\t-10.4291\n▁publish\t-10.4291\n▁passenger\t-10.4291\n▁invitation\t-10.4291\n▁piano\t-10.4291\n▁irish\t-10.4295\n▁desk\t-10.4297\n▁beaten\t-10.4318\n▁fifth\t-10.432\n▁impulse\t-10.432\n▁swear\t-10.432\n▁purple\t-10.4322\n▁committed\t-10.4324\n▁countries\t-10.4327\n▁perceive\t-10.4328\n▁eaten\t-10.4329\n▁celebrat\t-10.435\n▁grandmother\t-10.435\n▁shudder\t-10.435\n▁spanish\t-10.435\n▁sunshine\t-10.435\n▁hitherto\t-10.4352\n▁amid\t-10.4366\n▁mock\t-10.4378\n▁marilla\t-10.4379\n▁snake\t-10.4379\n▁interfere\t-10.4381\n▁walter\t-10.4385\n▁marble\t-10.4388\nterior\t-10.4394\n▁mission\t-10.4399\n▁boot\t-10.4407\n▁furniture\t-10.4409\n▁driving\t-10.4409\n▁steady\t-10.4409\nstead\t-10.4414\n▁circumstance\t-10.4417\n▁interpret\t-10.4438\n▁enchant\t-10.4438\n▁error\t-10.4439\n▁conviction\t-10.4449\n▁helpless\t-10.445\n▁qualities\t-10.4468\n▁medicine\t-10.4468\n▁italian\t-10.447\n▁hastened\t-10.4472\n▁occasionally\t-10.4474\n▁pursued\t-10.4475\nux\t-10.4475\n▁hesitated\t-10.4493\n▁chase\t-10.4496\n▁independent\t-10.4498\n▁oliver\t-10.4498\n▁linger\t-10.4503\n▁exa
mined\t-10.4508\n▁repent\t-10.4521\n▁physician\t-10.4528\n▁beloved\t-10.4558\n▁attached\t-10.4558\n▁florence\t-10.4558\n▁honey\t-10.4565\n▁mouse\t-10.4569\n▁cries\t-10.457\n▁poem\t-10.4573\n▁ram\t-10.4588\n▁destruction\t-10.4588\n▁messenger\t-10.4588\n▁tristram\t-10.4588\n▁fulfil\t-10.4588\n▁fancied\t-10.4588\n▁excess\t-10.4588\n▁bake\t-10.4604\nmont\t-10.4613\n▁thornton\t-10.4618\n▁quantity\t-10.4618\n▁wh\t-10.4628\n▁created\t-10.4633\n▁curse\t-10.4637\n▁continually\t-10.4638\n▁lightning\t-10.4642\n▁borne\t-10.4669\n▁mild\t-10.4673\nttle\t-10.4677\n▁disposed\t-10.4679\n▁rifle\t-10.4679\n▁polly\t-10.468\n▁goat\t-10.4682\n▁total\t-10.4686\n▁virginia\t-10.4689\n▁backward\t-10.469\n▁peril\t-10.469\n▁kick\t-10.4691\n▁quo\t-10.4702\n▁glorious\t-10.471\n▁multitude\t-10.471\n▁leather\t-10.471\n▁absent\t-10.471\n▁demon\t-10.4711\n▁torture\t-10.4711\n▁debt\t-10.4712\n▁accord\t-10.4725\n▁catholic\t-10.474\n▁pill\t-10.475\n▁flour\t-10.4764\n▁library\t-10.4771\n▁pursuit\t-10.4771\n▁shirt\t-10.4771\n▁dearest\t-10.4772\n▁collar\t-10.4773\n▁declare\t-10.4781\n▁tempt\t-10.4784\n▁branch\t-10.4785\n▁steadily\t-10.4802\n▁disgust\t-10.4802\n▁silly\t-10.4803\n▁robe\t-10.481\n▁arrive\t-10.4812\n▁drank\t-10.4832\n▁communicat\t-10.4847\n▁mate\t-10.485\n▁rachel\t-10.4863\n▁washington\t-10.4863\n▁resign\t-10.4864\n▁meantime\t-10.4867\n▁engagement\t-10.4869\n▁separated\t-10.4872\n▁quiver\t-10.4872\n▁discussion\t-10.4882\n▁ventured\t-10.489\n▁nail\t-10.4894\n▁surrounding\t-10.4894\n▁polish\t-10.4895\n▁lace\t-10.4896\n▁swell\t-10.4906\n▁lincoln\t-10.4926\n▁student\t-10.4926\n▁glitter\t-10.4926\n▁joke\t-10.4931\n▁russian\t-10.4941\n▁readily\t-10.4943\n▁poverty\t-10.4957\n▁disgrace\t-10.4957\n▁heavily\t-10.4957\n▁cheese\t-10.4957\n▁staff\t-10.4984\n▁entreat\t-10.4988\n▁farewell\t-10.4988\n▁lunch\t-10.4988\n▁peep\t-10.4989\n▁someone\t-10.4997\n▁chris\t-10.5008\n▁disappear\t-10.5012\n▁decision\t-10.502\n▁pistol\t-10.502\n▁spur\t-10.5021\n▁assumed\t-10.5027\n▁extend\t-10.5044\n▁definite\t-10.5051\n▁
enthusiasm\t-10.5051\n▁undertake\t-10.5052\n▁committee\t-10.5083\n▁simon\t-10.5083\n▁scale\t-10.5094\n▁applied\t-10.5115\n▁fence\t-10.5115\n▁related\t-10.5117\n▁vice\t-10.5129\n▁unpleasant\t-10.5146\n▁probable\t-10.5146\n▁procure\t-10.5147\n▁frown\t-10.515\nistic\t-10.5168\n▁cloak\t-10.5182\n▁humanity\t-10.5191\n▁dwarf\t-10.521\n▁families\t-10.521\n▁philosopher\t-10.521\n▁overcome\t-10.521\n▁defeat\t-10.5211\n▁plac\t-10.5215\n▁fastened\t-10.5217\n▁tomb\t-10.5219\n▁classes\t-10.5236\n▁marsh\t-10.5239\n▁gracious\t-10.5243\n▁remote\t-10.5243\n▁cell\t-10.5247\n▁shriek\t-10.5275\n▁rescue\t-10.5276\n▁chose\t-10.5281\n▁pool\t-10.529\n▁slo\t-10.5298\n▁cutting\t-10.5301\n▁coward\t-10.5307\n▁dirty\t-10.5307\n▁border\t-10.5307\n▁hook\t-10.5308\n▁monkey\t-10.5308\n▁chuck\t-10.5311\n▁weigh\t-10.5321\n▁emily\t-10.5325\n▁jest\t-10.5328\n▁mule\t-10.5328\n▁associate\t-10.534\n▁glimpse\t-10.534\n▁stuck\t-10.534\n▁bolt\t-10.5369\n▁murderer\t-10.538\n▁pony\t-10.5385\n▁rattl\t-10.5401\n▁distinguish\t-10.5401\n▁institution\t-10.5405\n▁cunning\t-10.5405\n▁compliment\t-10.5405\n▁spin\t-10.5406\n▁appetite\t-10.5438\n▁reputation\t-10.5438\n▁feeble\t-10.5438\n▁series\t-10.5452\n▁graceful\t-10.5457\n▁phrase\t-10.5471\n▁platform\t-10.5471\n▁clay\t-10.5481\n▁opposition\t-10.5504\n▁boast\t-10.5505\n▁lane\t-10.551\n▁growth\t-10.5527\n▁inclination\t-10.5537\n▁behave\t-10.5537\n▁susan\t-10.5538\n▁dislike\t-10.5543\n▁distinction\t-10.5545\n▁illustrat\t-10.557\n▁nicholas\t-10.557\n▁satisfy\t-10.557\n▁drama\t-10.557\n▁elbow\t-10.557\n▁consum\t-10.5571\n▁oath\t-10.5586\n▁channel\t-10.5603\n▁spear\t-10.5603\n▁slain\t-10.5603\n▁characteristic\t-10.5605\n▁sauce\t-10.5609\n▁frog\t-10.5629\n▁conception\t-10.5637\n▁timid\t-10.5637\n▁apparent\t-10.5659\n▁center\t-10.567\n▁variety\t-10.567\n▁dusk\t-10.5679\nshire\t-10.5689\n▁apt\t-10.5693\n▁column\t-10.5704\n▁revenge\t-10.5704\n▁rival\t-10.571\n▁imitat\t-10.571\n▁passionate\t-10.5716\n▁selfish\t-10.5721\n▁norman\t-10.5725\n▁extra\t-10.5737\n▁repair\t-10.5738\n▁
thrill\t-10.5738\n▁treatment\t-10.5747\n▁rosa\t-10.575\n▁organ\t-10.5768\n▁martin\t-10.5771\n▁indifferent\t-10.5772\n▁thither\t-10.5772\n▁pepper\t-10.5772\n▁gallant\t-10.5776\n▁recollect\t-10.5784\n▁scarce\t-10.5804\n▁trembled\t-10.5804\n▁shield\t-10.5806\n▁mingled\t-10.5806\n▁brick\t-10.5829\n▁harsh\t-10.583\n▁humor\t-10.5838\n▁mischief\t-10.584\n▁tremendous\t-10.584\n▁function\t-10.584\n▁smart\t-10.584\n▁sultan\t-10.5874\n▁dismiss\t-10.5874\n▁threatened\t-10.5875\nji\t-10.5876\n▁cheap\t-10.5878\n▁vine\t-10.5878\n▁flock\t-10.5898\n▁endeavor\t-10.5908\n▁italy\t-10.5912\n▁flutter\t-10.5913\n▁whisk\t-10.5916\n▁waist\t-10.5922\n▁monarch\t-10.5943\n▁smoking\t-10.5943\n▁africa\t-10.5943\n▁accuse\t-10.5943\n▁herbert\t-10.5946\n▁refresh\t-10.5977\n▁rejoice\t-10.5977\n▁pillow\t-10.5979\n▁hopeless\t-10.5989\n▁poetry\t-10.5991\n▁perish\t-10.6007\n▁philosophy\t-10.6012\n▁bernard\t-10.6012\n▁whistle\t-10.6013\n▁lament\t-10.6014\n▁expectation\t-10.6028\n▁improve\t-10.6034\n▁fountain\t-10.6047\n▁perplex\t-10.6047\n▁despise\t-10.6047\n▁league\t-10.6047\n▁narrat\t-10.6047\n▁ignorance\t-10.6049\n▁reference\t-10.6051\n▁sunk\t-10.6052\nsail\t-10.6055\n▁wip\t-10.6057\n▁duck\t-10.6068\n▁partner\t-10.6076\n▁grove\t-10.6081\n▁prophet\t-10.6082\n▁shiver\t-10.6083\n▁neighbourhood\t-10.6083\n▁purse\t-10.6084\n▁representative\t-10.6084\n▁precisely\t-10.6104\n▁angle\t-10.6115\n▁acquired\t-10.6117\n▁chimney\t-10.6117\n▁doctrine\t-10.6117\n▁maxim\t-10.6117\n▁majority\t-10.6132\n▁autumn\t-10.6152\n▁cristo\t-10.6152\n▁disguise\t-10.6152\n▁achieve\t-10.6152\n▁confused\t-10.6152\n▁reduced\t-10.6152\n▁earlier\t-10.6155\n▁theatre\t-10.616\n▁decide\t-10.6172\nological\t-10.6188\n▁continent\t-10.6188\n▁occupation\t-10.6188\n▁vigorous\t-10.6188\n▁decline\t-10.6188\n▁community\t-10.6193\n▁motionless\t-10.6198\n▁hatred\t-10.6205\n▁communication\t-10.6206\n▁determin\t-10.6218\n▁comment\t-10.6223\n▁approve\t-10.6223\n▁ceremony\t-10.6223\n▁criminal\t-10.6223\n▁scientific\t-10.6223\n▁duchess\t-10.6223\n▁vivid\
t-10.6223\n▁shift\t-10.6223\n▁avail\t-10.6224\n▁bowl\t-10.6234\n▁johnson\t-10.6241\n▁contrast\t-10.6259\n▁slender\t-10.6259\n▁amusement\t-10.6259\n▁plot\t-10.6259\n▁damp\t-10.6261\n▁association\t-10.6294\n▁uncertain\t-10.6294\n▁snatch\t-10.6294\n▁pressure\t-10.6299\n▁apply\t-10.6306\n▁restless\t-10.6311\n▁perch\t-10.6315\n▁notwithstanding\t-10.633\n▁swung\t-10.633\n▁planet\t-10.633\n▁stirred\t-10.6337\n▁attendant\t-10.634\n▁thro\t-10.6354\n▁enjoyment\t-10.6364\n▁worry\t-10.6366\n▁albert\t-10.6366\n▁naked\t-10.6367\n▁talent\t-10.6372\n▁marian\t-10.6387\n▁reform\t-10.639\n▁lyn\t-10.6402\n▁deliberate\t-10.6402\n▁intelligent\t-10.6402\n▁sensitive\t-10.6402\n▁yonder\t-10.6402\n▁pupil\t-10.6402\n▁frightful\t-10.6409\n▁doubtful\t-10.6411\n▁standard\t-10.6423\n▁deposit\t-10.6439\n▁magistrate\t-10.6439\n▁shepherd\t-10.6439\n▁stomach\t-10.6439\n▁renew\t-10.6439\n▁hedge\t-10.6458\n▁possibility\t-10.6475\n▁fatigue\t-10.6475\n▁francs\t-10.6475\n▁portrait\t-10.6475\n▁resemble\t-10.6475\n▁favorite\t-10.6477\n▁cream\t-10.6491\n▁pope\t-10.651\n▁secretary\t-10.6524\n▁divers\t-10.6526\n▁activity\t-10.6548\n▁speculat\t-10.6548\n▁humour\t-10.6553\n▁fitted\t-10.6575\n▁external\t-10.6585\n▁cetera\t-10.6585\n▁wrapped\t-10.6586\n▁jaw\t-10.6612\n▁fred\t-10.6615\n▁examination\t-10.6622\n▁lodging\t-10.6622\n▁crow\t-10.6623\n▁owing\t-10.6625\n▁balance\t-10.6631\n▁puff\t-10.6644\n▁tenderness\t-10.6648\n▁porthos\t-10.6659\n▁anchor\t-10.666\n▁interrupt\t-10.6668\n▁driver\t-10.6689\n▁necessarily\t-10.6696\n▁perpetual\t-10.6696\n▁agony\t-10.6703\n▁scholar\t-10.6733\n▁scotland\t-10.6733\n▁suppress\t-10.6733\n▁wrath\t-10.6733\n▁wreck\t-10.6733\n▁exceed\t-10.6734\n▁perfection\t-10.6758\n▁doorway\t-10.6765\n▁india\t-10.6766\n▁clergy\t-10.6771\n▁tradition\t-10.6771\n▁section\t-10.6771\n▁eastern\t-10.6771\n▁wives\t-10.6774\n▁convention\t-10.6779\n▁announc\t-10.6782\n▁egypt\t-10.6797\n▁contradict\t-10.6808\n▁scratch\t-10.6808\n▁glove\t-10.6808\n▁central\t-10.6808\n▁wax\t-10.6826\nifying\t-10.6831\n▁prepare
\t-10.6833\n▁accompany\t-10.6846\n▁increasing\t-10.6846\n▁liberal\t-10.6846\n▁raising\t-10.6846\n▁orange\t-10.6847\n▁shoe\t-10.687\n▁attribute\t-10.6884\n▁literature\t-10.6884\n▁withdraw\t-10.6884\n▁hawk\t-10.6885\nthorpe\t-10.6886\n▁whither\t-10.6887\n▁moonlight\t-10.6887\n▁examine\t-10.6909\n▁happily\t-10.6922\n▁precede\t-10.6925\n▁detective\t-10.6927\n▁inches\t-10.6927\n▁solitary\t-10.696\n▁dutch\t-10.696\n▁napoleon\t-10.6998\n▁uneasy\t-10.6998\n▁cardinal\t-10.6998\n▁blew\t-10.6999\n▁fowl\t-10.6999\n▁decorat\t-10.6999\n▁childhood\t-10.7009\n▁torment\t-10.7012\n▁scent\t-10.7016\n▁losing\t-10.7024\n▁permission\t-10.7037\n▁blank\t-10.707\n▁upstairs\t-10.7075\n▁capacity\t-10.7075\n▁trifle\t-10.7076\n▁folly\t-10.7076\n▁remove\t-10.7102\n▁vengeance\t-10.7114\n▁enterprise\t-10.7114\n▁bedroom\t-10.7114\n▁anyhow\t-10.7114\n▁inquiry\t-10.7115\n▁ashes\t-10.714\n▁hush\t-10.7148\n▁awkward\t-10.7153\n▁saturday\t-10.7153\n▁genuine\t-10.7153\n▁surviv\t-10.7154\n▁drag\t-10.7156\n▁skirt\t-10.7156\n▁affectionate\t-10.7163\n▁tang\t-10.7179\n▁mutual\t-10.7192\n▁dispute\t-10.7192\n▁eagle\t-10.7192\n▁income\t-10.7193\n▁bind\t-10.7201\n▁wilt\t-10.7204\n▁fame\t-10.7206\n▁improvement\t-10.7208\n▁differ\t-10.7224\n▁awoke\t-10.7231\n▁sleeve\t-10.7231\n▁solitude\t-10.7231\n▁favourite\t-10.7234\n▁detect\t-10.7266\n▁comprehend\t-10.7271\n▁preparing\t-10.7271\n▁serpent\t-10.7271\n▁summit\t-10.7271\n▁knot\t-10.7271\n▁knit\t-10.7271\n▁copy\t-10.7271\n▁woe\t-10.7273\n▁stopping\t-10.7274\n▁faded\t-10.7274\n▁hideous\t-10.7279\n▁julie\t-10.7279\n▁shine\t-10.7306\n▁axe\t-10.731\n▁conflict\t-10.731\n▁proposition\t-10.731\n▁refuge\t-10.731\n▁gallery\t-10.731\n▁bundle\t-10.7311\n▁slavery\t-10.7324\n▁mask\t-10.733\n▁alyosha\t-10.735\n▁ladder\t-10.7359\n▁department\t-10.737\n▁discharge\t-10.739\n▁depress\t-10.739\n▁scarlet\t-10.7392\n▁gallop\t-10.7394\n▁kitty\t-10.7397\n▁paw\t-10.7403\n▁receiving\t-10.743\n▁surrender\t-10.743\n▁sustain\t-10.743\n▁twilight\t-10.743\n▁congress\t-10.743\n▁ireland\t-10.7431\n▁
funny\t-10.7435\n▁lend\t-10.7459\n▁constitute\t-10.747\n▁crystal\t-10.747\n▁lofty\t-10.747\n▁funeral\t-10.747\n▁spain\t-10.747\n▁exceedingly\t-10.747\n▁damn\t-10.7473\n▁commun\t-10.7503\n▁prejudice\t-10.751\n▁porch\t-10.7511\n▁assistant\t-10.7515\n▁today\t-10.7521\n▁smot\t-10.7543\n▁enclos\t-10.7545\n▁industry\t-10.7551\n▁defence\t-10.7551\n▁hither\t-10.7554\n▁coloni\t-10.7567\n▁marguerite\t-10.7591\n▁miracle\t-10.7591\n▁inherit\t-10.7592\n▁beggar\t-10.7594\n▁unlike\t-10.7613\n▁envelope\t-10.7632\n▁indignation\t-10.7632\n▁natasha\t-10.7632\n▁proposal\t-10.7632\n▁fragment\t-10.7632\n▁roast\t-10.7634\n▁roused\t-10.7635\nencies\t-10.7651\n▁commenced\t-10.7673\n▁resource\t-10.7673\n▁population\t-10.7673\n▁quoth\t-10.7683\n▁tumble\t-10.7702\n▁pursue\t-10.7705\n▁educat\t-10.7706\n▁afflict\t-10.7714\n▁contact\t-10.7714\n▁crimson\t-10.7714\n▁division\t-10.7714\n▁disorder\t-10.7714\n▁copper\t-10.7715\n▁moderate\t-10.7716\n▁drum\t-10.772\n▁swim\t-10.7727\n▁salute\t-10.7732\n▁assume\t-10.7746\n▁nav\t-10.7747\n▁emphasi\t-10.7756\n▁overwhelm\t-10.7756\n▁shakespeare\t-10.7756\n▁struggling\t-10.7756\n▁tranquil\t-10.7756\n▁muscle\t-10.7756\n▁chicken\t-10.7756\n▁tread\t-10.7761\n▁claw\t-10.7764\n▁solicit\t-10.7766\n▁bible\t-10.778\n▁threat\t-10.7796\n▁velvet\t-10.7797\n▁exposed\t-10.7797\n▁idiot\t-10.7797\n▁barrel\t-10.7798\n▁ripe\t-10.7799\n▁penny\t-10.7809\n▁temptation\t-10.7822\n▁danglars\t-10.7839\nmbled\t-10.7841\nkeep\t-10.7867\n▁chu\t-10.787\n▁centuries\t-10.7881\n▁distribut\t-10.7881\n▁reject\t-10.7881\n▁retorted\t-10.7881\n▁concentrat\t-10.7881\n▁cordial\t-10.7881\n▁motor\t-10.7882\n▁cannon\t-10.7884\n▁wretch\t-10.7905\n▁assurance\t-10.7923\n▁thief\t-10.7923\n▁survey\t-10.7923\n▁railway\t-10.7925\n▁vital\t-10.7925\n▁jackson\t-10.7933\n▁combat\t-10.7935\n▁recollection\t-10.7949\n▁security\t-10.7965\n▁nancy\t-10.7965\n▁jacob\t-10.7965\n▁clutch\t-10.7965\n▁growl\t-10.797\n▁blanket\t-10.7971\n▁cellar\t-10.7973\n▁indignant\t-10.8007\n▁convenient\t-10.8007\n▁worm\t-10.8008\n▁scre
en\t-10.8008\n▁coarse\t-10.8008\n▁transport\t-10.801\n▁determination\t-10.8019\n▁bullet\t-10.8019\n▁appreciate\t-10.805\n▁invisible\t-10.805\n▁devotion\t-10.805\n▁mixture\t-10.805\n▁candid\t-10.8051\n▁performance\t-10.8059\n▁rebel\t-10.8078\n▁exquisite\t-10.8093\n▁bargain\t-10.8093\n▁tobacco\t-10.8093\n▁loyal\t-10.8094\n▁mould\t-10.8094\n▁attentive\t-10.8135\n▁dorothy\t-10.8135\n▁brute\t-10.8136\n▁establishment\t-10.8145\n▁glen\t-10.8163\n▁inhabit\t-10.8179\n▁obscure\t-10.8179\n▁borrow\t-10.8179\n▁essence\t-10.8179\n▁dismay\t-10.8179\nhurst\t-10.8185\n▁vow\t-10.8195\n▁flee\t-10.82\n▁pluck\t-10.8222\n▁coffin\t-10.8222\n▁sunset\t-10.8224\n▁stephen\t-10.8226\n▁blade\t-10.8228\n▁holiday\t-10.8265\n▁mechanical\t-10.8265\n▁cotton\t-10.8266\n▁awakened\t-10.827\nhold\t-10.8309\n▁ridiculous\t-10.8309\n▁hesitation\t-10.8309\n▁corpse\t-10.8309\n▁saving\t-10.831\n▁sancho\t-10.831\nfoot\t-10.8316\n▁eldest\t-10.8353\n▁peak\t-10.8374\n▁despite\t-10.8397\n▁edith\t-10.8397\n▁wilson\t-10.8397\n▁cherish\t-10.8397\n▁resistance\t-10.8403\n▁argue\t-10.8405\n▁inquire\t-10.8437\n▁apprehension\t-10.8441\n▁avenue\t-10.8441\n▁drake\t-10.8441\n▁propose\t-10.8446\n▁inferior\t-10.8486\n▁staircase\t-10.8486\n▁wherefore\t-10.8486\n▁carlyle\t-10.8486\n▁couch\t-10.8496\n▁route\t-10.8504\n▁politics\t-10.853\n▁tomorrow\t-10.853\n▁confined\t-10.8531\n▁naught\t-10.8531\n▁throng\t-10.8533\n▁sunlight\t-10.854\n▁imperfect\t-10.8575\n▁indifference\t-10.8575\n▁obedience\t-10.8575\n▁reception\t-10.8575\n▁turkey\t-10.8575\n▁vegetable\t-10.8575\n▁residence\t-10.8575\n▁violet\t-10.8575\n▁sarah\t-10.8575\n▁altar\t-10.8577\n▁grieve\t-10.8579\n▁jerk\t-10.8587\n▁magician\t-10.8589\n▁ensu\t-10.8609\n▁blossom\t-10.862\n▁lantern\t-10.862\n▁resolute\t-10.862\n▁thoughtfully\t-10.8621\n▁fortnight\t-10.8665\n▁trumpet\t-10.8665\n▁unwilling\t-10.8665\n▁valjean\t-10.8665\n▁lecture\t-10.8665\n▁whereupon\t-10.8665\n▁holland\t-10.8665\n▁creek\t-10.8666\n▁changing\t-10.8666\n▁slice\t-10.8666\n▁accent\t-10.8667\n▁normal\t-10.8667\
n▁disagreeable\t-10.8711\n▁frederick\t-10.8711\n▁rubbed\t-10.8711\n▁dumb\t-10.8711\n▁establish\t-10.8736\n▁import\t-10.8754\n▁affirm\t-10.8757\n▁matthew\t-10.8757\n▁bunch\t-10.8757\n▁hoping\t-10.8758\n▁convert\t-10.8759\n▁brisk\t-10.8759\n▁bending\t-10.8763\n▁michael\t-10.8802\n▁mademoiselle\t-10.8802\n▁easier\t-10.8802\n▁facing\t-10.8803\n▁jones\t-10.8804\n▁excellency\t-10.8848\n▁literary\t-10.8849\n▁gossip\t-10.8849\n▁devour\t-10.8849\n▁stagger\t-10.8849\n▁pencil\t-10.8849\n▁average\t-10.8849\n▁hammer\t-10.8851\n▁triumphant\t-10.8855\n▁preferred\t-10.8855\nburn\t-10.8877\n▁application\t-10.8895\n▁occupy\t-10.8895\n▁authorities\t-10.8898\n▁ascertain\t-10.8941\n▁corridor\t-10.8941\n▁delicious\t-10.8941\n▁practise\t-10.8941\n▁universe\t-10.8941\n▁shilling\t-10.8941\n▁contest\t-10.8942\n▁ashore\t-10.8942\n▁commit\t-10.8983\n▁administration\t-10.8988\n▁studied\t-10.8988\n▁rigid\t-10.8988\n▁adorn\t-10.8989\n▁elsewhere\t-10.9035\n▁innocence\t-10.9035\n▁journal\t-10.9035\n▁landscape\t-10.9035\n▁telegraph\t-10.9035\n▁angrily\t-10.9035\n▁campaign\t-10.9035\n▁unjust\t-10.9035\n▁flourish\t-10.904\n▁challenge\t-10.9082\n▁torrent\t-10.9082\n▁relate\t-10.9127\n▁assembled\t-10.913\n▁impressed\t-10.913\n▁canoe\t-10.915\n▁conclud\t-10.9171\n▁quixote\t-10.9177\n▁satisfactory\t-10.9177\n▁niece\t-10.9177\n▁deaf\t-10.9178\n▁glid\t-10.9179\n▁jimmy\t-10.9179\n▁regulat\t-10.9179\n▁chatter\t-10.9215\n▁statue\t-10.9225\n▁glacier\t-10.9225\n▁envy\t-10.9225\n▁boston\t-10.9227\n▁richmond\t-10.9229\n▁denied\t-10.9229\n▁fanny\t-10.9232\n▁solomon\t-10.9273\n▁vulgar\t-10.9273\n▁stalk\t-10.9274\n▁spoon\t-10.9279\n▁abuse\t-10.928\n▁basin\t-10.9291\n▁feature\t-10.9293\n▁convict\t-10.9304\n▁admiral\t-10.9321\n▁architect\t-10.9321\n▁ribbon\t-10.9321\n▁permanent\t-10.9321\n▁april\t-10.9321\n▁jolly\t-10.9322\nborough\t-10.9322\n▁neighborhood\t-10.9323\n▁impart\t-10.9324\n▁horrid\t-10.937\n▁immortal\t-10.937\n▁penetrate\t-10.937\n▁prudence\t-10.937\n▁reconcil\t-10.937\n▁spaniard\t-10.937\n▁supposing\t-10.
937\n▁telephone\t-10.937\n▁temperature\t-10.937\n▁oyster\t-10.937\n▁appointment\t-10.9375\n▁egyptian\t-10.9384\n▁dwelt\t-10.9419\n▁nephew\t-10.9419\n▁railroad\t-10.9419\n▁september\t-10.9419\n▁gilbert\t-10.9419\n▁wheat\t-10.9419\n▁device\t-10.9419\n▁squee\t-10.9453\n▁elegant\t-10.9468\n▁advertise\t-10.9517\n▁turtle\t-10.9517\n▁rational\t-10.9517\n▁brood\t-10.9519\ncomb\t-10.9563\n▁assembly\t-10.9566\n▁cultivate\t-10.9566\n▁specimen\t-10.9566\n▁undoubtedly\t-10.9566\n▁editor\t-10.9567\n▁dropping\t-10.9567\n▁medical\t-10.9569\n▁balloon\t-10.9569\n▁whale\t-10.9574\n▁composition\t-10.9616\n▁footsteps\t-10.9616\n▁launcelot\t-10.9616\n▁discourse\t-10.9616\n▁errand\t-10.9616\n▁converse\t-10.9618\n▁advancing\t-10.9666\n▁downstairs\t-10.9666\n▁tumult\t-10.9666\n▁corrupt\t-10.9666\n▁suffice\t-10.9666\n▁anguish\t-10.9666\n▁shaggy\t-10.9666\n▁retire\t-10.9716\n▁timber\t-10.9717\n▁abstract\t-10.9767\n▁embroider\t-10.9767\n▁photograph\t-10.9767\n▁prosperity\t-10.9767\n▁terribly\t-10.9767\n▁territory\t-10.9767\n▁threshold\t-10.9767\n▁pavement\t-10.9767\n▁injured\t-10.9767\n▁levin\t-10.9767\n▁agitation\t-10.9818\n▁rascal\t-10.9818\n▁presume\t-10.9819\n▁strat\t-10.9842\n▁observing\t-10.9869\n▁obstacle\t-10.9869\n▁simplicity\t-10.9869\n▁slumber\t-10.9869\n▁supplied\t-10.9869\n▁combination\t-10.9869\n▁drain\t-10.9869\n▁wilderness\t-10.9869\n▁believing\t-10.992\n▁villain\t-10.992\n▁friday\t-10.992\n▁reckless\t-10.992\n▁injury\t-10.992\n▁clapp\t-10.9921\n▁symptom\t-10.9972\n▁kennedy\t-10.9972\n▁sledge\t-10.9972\n▁monday\t-10.9972\n▁hercules\t-10.9972\n▁ceiling\t-10.9972\n▁lemon\t-10.9972\n▁plague\t-10.9974\n▁canvas\t-10.9976\n▁impatience\t-11.0023\n▁uncomfortable\t-11.0023\n▁access\t-11.0023\n▁senator\t-11.0023\n▁swimming\t-11.0024\n▁barrier\t-11.0024\n▁adjust\t-11.0076\n▁comparison\t-11.0076\n▁proclaim\t-11.0076\n▁wrinkl\t-11.0076\n▁overlook\t-11.0076\n▁mitya\t-11.0076\n▁guilt\t-11.01\n▁distract\t-11.0128\n▁perception\t-11.0128\n▁precaution\t-11.0128\n▁spectator\t-11.0128\n▁surprising\
t-11.0128\n▁disdain\t-11.0128\n▁bonnet\t-11.0128\n▁bapti\t-11.0129\n▁profess\t-11.0154\n▁inspector\t-11.018\n▁sketch\t-11.018\n▁structure\t-11.018\n▁ultimate\t-11.018\n▁confound\t-11.0181\n▁globe\t-11.0181\n▁insect\t-11.0181\n▁orchard\t-11.0181\n▁descent\t-11.0182\n▁amiable\t-11.0183\n▁independence\t-11.0233\n▁manufacture\t-11.0233\n▁sprinkle\t-11.0233\n▁nightingale\t-11.0233\n▁cushion\t-11.0233\n▁eminent\t-11.0233\n▁array\t-11.0234\n▁scott\t-11.0234\n▁troop\t-11.0234\n▁cosette\t-11.0234\n▁waving\t-11.0234\n▁irregular\t-11.0287\n▁persecut\t-11.0287\n▁derived\t-11.0287\n▁withdrew\t-11.0287\n▁caution\t-11.0287\n▁extract\t-11.0288\n▁suspicious\t-11.034\n▁memories\t-11.034\n▁nowhere\t-11.0341\n▁tremble\t-11.0343\n▁subtle\t-11.0343\n▁thorough\t-11.0349\nq\t-11.0372\n▁appropriate\t-11.0394\n▁slaughter\t-11.0394\n▁yourselves\t-11.0394\n▁thumb\t-11.0394\n▁twas\t-11.0394\n▁stray\t-11.0395\n▁abode\t-11.0395\n▁conspicuous\t-11.0448\n▁rebecca\t-11.0448\n▁sergeant\t-11.0448\n▁woke\t-11.0448\n▁apron\t-11.0451\n▁anticipate\t-11.0502\n▁discipline\t-11.0502\n▁glancing\t-11.0502\n▁pilgrim\t-11.0502\n▁sullen\t-11.0502\n▁contribute\t-11.0557\n▁prairie\t-11.0557\n▁carved\t-11.0559\n▁hypnoti\t-11.0612\n▁commerce\t-11.0612\n▁exclamation\t-11.0612\n▁muscular\t-11.0612\n▁november\t-11.0612\n▁phenomena\t-11.0612\n▁symbol\t-11.0612\n▁umbrella\t-11.0612\n▁diminish\t-11.0612\n▁parlour\t-11.0612\n▁threatening\t-11.0612\n▁stump\t-11.0612\n▁extensive\t-11.0667\n▁remembrance\t-11.0667\n▁combined\t-11.0667\n▁sheriff\t-11.0668\n▁laura\t-11.0673\n▁intercourse\t-11.0723\n▁supplies\t-11.0723\n▁landlord\t-11.0723\n▁stricken\t-11.0723\n▁shrink\t-11.0723\n▁caesar\t-11.0723\n▁drug\t-11.0726\n▁bewildered\t-11.0778\n▁commercial\t-11.0778\n▁nautilus\t-11.0778\n▁brutal\t-11.0779\n▁maggie\t-11.0779\n▁sphere\t-11.0779\n▁virgin\t-11.0816\n▁brethren\t-11.0835\n▁terrified\t-11.0835\n▁destiny\t-11.0835\n▁policy\t-11.0835\n▁housekeeper\t-11.0835\n▁ardent\t-11.0835\n▁discern\t-11.0836\n▁marquis\t-11.0836\nmouth\t-11.08
54\n▁russia\t-11.0864\n▁wrap\t-11.0871\n▁britain\t-11.0891\n▁harbour\t-11.0891\n▁concert\t-11.0891\n▁harmony\t-11.0891\n▁donkey\t-11.0892\n▁damage\t-11.0892\n▁slim\t-11.0896\nabout\t-11.0911\n▁luxury\t-11.0948\n▁paradise\t-11.0948\n▁culture\t-11.0948\n▁monstrous\t-11.0948\n▁tendency\t-11.0948\n▁julius\t-11.0948\n▁remedy\t-11.0948\n▁raoul\t-11.0948\n▁scold\t-11.0948\n▁decay\t-11.0948\n▁split\t-11.0949\n▁assault\t-11.1005\n▁december\t-11.1005\n▁moscow\t-11.1005\n▁explore\t-11.1005\n▁trousers\t-11.1005\n▁wrist\t-11.1006\npiece\t-11.1026\n▁tyrant\t-11.1063\n▁valentine\t-11.1063\n▁musket\t-11.1063\n▁abraham\t-11.1063\n▁strait\t-11.1063\n▁artificial\t-11.112\n▁faculty\t-11.112\n▁obligation\t-11.112\n▁resemblance\t-11.112\n▁inquiries\t-11.1121\n▁detain\t-11.1121\n▁swarm\t-11.1121\n▁pledge\t-11.1121\n▁admirable\t-11.1179\n▁defect\t-11.1179\n▁superintend\t-11.1179\n▁patriot\t-11.1179\n▁breton\t-11.1179\n▁dismal\t-11.1181\n▁recit\t-11.1191\n▁ignor\t-11.1232\n▁amelia\t-11.1237\n▁elephant\t-11.1296\n▁estimate\t-11.1296\n▁knelt\t-11.1296\n▁serving\t-11.1296\n▁shrill\t-11.1296\n▁text\t-11.1296\n▁studio\t-11.13\n▁alexander\t-11.1355\n▁wrought\t-11.1355\n▁abundant\t-11.1355\n▁situated\t-11.1355\n▁regain\t-11.1355\n▁sneer\t-11.1356\n▁sweat\t-11.1357\n▁wren\t-11.1359\n▁justify\t-11.138\n▁nigh\t-11.1409\n▁escort\t-11.1415\n▁inevitable\t-11.1415\n▁psmith\t-11.1415\n▁reluctant\t-11.1415\n▁preceding\t-11.1415\n▁resort\t-11.1415\n▁outrage\t-11.1419\n▁ambassador\t-11.1474\n▁consolation\t-11.1474\n▁remorse\t-11.1474\n▁behalf\t-11.1474\n▁formidable\t-11.1474\n▁gravity\t-11.1475\n▁apologi\t-11.1482\n▁divide\t-11.1484\n▁gigantic\t-11.1535\n▁october\t-11.1535\n▁flank\t-11.1535\n▁stooped\t-11.1535\n▁slew\t-11.1535\n▁confront\t-11.1535\n▁clara\t-11.1535\n▁film\t-11.1536\n▁bulk\t-11.1536\ndolph\t-11.1545\n▁eleanor\t-11.1595\n▁exclusive\t-11.1595\n▁japanese\t-11.1595\n▁sympathi\t-11.1595\n▁cavalry\t-11.1595\n▁perfume\t-11.1595\n▁federal\t-11.1595\n▁liquid\t-11.1595\n▁rubbing\t-11.1596\n▁oven\t-11.1
597\n▁convuls\t-11.1656\n▁significant\t-11.1656\n▁deprived\t-11.1656\n▁responsibility\t-11.1656\n▁waistcoat\t-11.1656\n▁cluster\t-11.1656\n▁martha\t-11.1657\n▁attorney\t-11.1718\n▁droop\t-11.1718\n▁skilful\t-11.1718\n▁habitual\t-11.1718\n▁interven\t-11.1719\n▁owl\t-11.172\n▁conjecture\t-11.1779\n▁fantastic\t-11.1779\n▁responsible\t-11.1779\n▁destined\t-11.1779\n▁thereupon\t-11.1779\n▁goddess\t-11.178\n▁pacific\t-11.178\n▁warrant\t-11.178\n▁costume\t-11.178\n▁document\t-11.178\n▁bridle\t-11.1783\n▁california\t-11.1841\n▁democratic\t-11.1841\n▁eustace\t-11.1841\n▁squirrel\t-11.1841\n▁uncommon\t-11.1841\n▁plough\t-11.1841\n▁marvellous\t-11.1841\n▁tragedy\t-11.1841\n▁vault\t-11.1842\n▁hesitate\t-11.1853\n▁admiring\t-11.1904\n▁corporal\t-11.1904\n▁entitled\t-11.1904\n▁refrain\t-11.1904\n▁shrewd\t-11.1904\n▁strap\t-11.1927\n▁accurate\t-11.1967\n▁tempest\t-11.1967\n▁monument\t-11.1967\n▁siege\t-11.1967\n▁chinese\t-11.1967\n▁raven\t-11.1968\n▁loung\t-11.1969\nleigh\t-11.1985\n▁assassin\t-11.203\n▁inflict\t-11.203\n▁agitated\t-11.203\n▁desirable\t-11.203\n▁earliest\t-11.203\n▁launch\t-11.203\n▁pilot\t-11.2031\n▁pulse\t-11.2031\n▁liquor\t-11.2094\n▁scarecrow\t-11.2094\n▁skull\t-11.2094\n▁desolate\t-11.2094\n▁ticket\t-11.2094\n▁sublime\t-11.2094\n▁recess\t-11.2094\n▁serene\t-11.2094\n▁righteous\t-11.2094\n▁pinocchio\t-11.2158\n▁priscilla\t-11.2158\n▁charlotte\t-11.2158\n▁circular\t-11.2158\n▁injustice\t-11.2158\n▁thyself\t-11.2158\n▁occurrence\t-11.2158\n▁casual\t-11.2158\n▁trout\t-11.2158\n▁legend\t-11.2158\n▁fertil\t-11.2178\n▁background\t-11.2222\n▁comparatively\t-11.2222\n▁delicacy\t-11.2222\n▁estralla\t-11.2222\n▁manuscript\t-11.2222\n▁response\t-11.2222\n▁university\t-11.2222\n▁wolves\t-11.2222\n▁scandal\t-11.2222\n▁hoarse\t-11.2223\n▁stumble\t-11.2223\n▁convent\t-11.2272\n▁utili\t-11.2278\n▁examining\t-11.2287\n▁incapable\t-11.2287\n▁perceiving\t-11.2287\n▁philadelphia\t-11.2287\n▁subsequent\t-11.2287\n▁thieves\t-11.2287\n▁accumulat\t-11.2287\n▁damsel\t-11.2287\n▁scotch
\t-11.2287\n▁underneath\t-11.2287\n▁smash\t-11.2287\n▁nobility\t-11.2287\n▁revolt\t-11.2288\n▁engage\t-11.229\n▁cathedral\t-11.2353\n▁despatch\t-11.2353\n▁eternity\t-11.2353\n▁january\t-11.2353\n▁probability\t-11.2353\n▁parallel\t-11.2353\n▁jimmie\t-11.2353\n▁champion\t-11.2353\n▁fisherman\t-11.2353\n▁jerry\t-11.2353\n▁swore\t-11.2353\n▁draught\t-11.2419\n▁opponent\t-11.2419\n▁primitive\t-11.2419\n▁significance\t-11.2419\n▁substantial\t-11.2419\n▁dunbar\t-11.2419\n▁commend\t-11.2419\n▁jasper\t-11.2419\n▁contemplate\t-11.2485\n▁testimony\t-11.2485\n▁imperial\t-11.2485\n▁adapt\t-11.2485\n▁juice\t-11.2485\n▁calamit\t-11.2489\n▁phoenix\t-11.2551\n▁prudent\t-11.2551\n▁solution\t-11.2551\n▁villefort\t-11.2551\n▁chateau\t-11.2551\n▁reaction\t-11.2551\n▁relax\t-11.2552\n▁quaint\t-11.2552\n▁plunder\t-11.2619\n▁distrust\t-11.2619\n▁prohibit\t-11.2619\n▁welfare\t-11.2619\n▁parlor\t-11.2619\n▁navigat\t-11.262\n▁tank\t-11.2624\nthink\t-11.2657\n▁discourage\t-11.2686\n▁obstinate\t-11.2686\n▁rejoicing\t-11.2686\n▁vehicle\t-11.2686\n▁fancies\t-11.2686\n▁enlighten\t-11.2686\n▁sermon\t-11.2686\n▁illusion\t-11.2686\n▁anthea\t-11.2686\n▁martian\t-11.2688\n▁excite\t-11.2698\n▁attachment\t-11.2754\n▁generosity\t-11.2754\n▁unworthy\t-11.2754\n▁kettle\t-11.2754\n▁internal\t-11.2755\n▁incense\t-11.2756\n▁vibrat\t-11.2757\n▁adhere\t-11.2767\n▁february\t-11.2823\n▁incessant\t-11.2823\n▁mexican\t-11.2823\n▁interposed\t-11.2823\n▁granite\t-11.2823\n▁parcel\t-11.2823\n▁vexed\t-11.2823\n▁promote\t-11.2826\n▁debate\t-11.2839\nmidst\t-11.2854\n▁cyril\t-11.2892\n▁embark\t-11.2892\n▁terrace\t-11.2892\n▁abundance\t-11.2892\n▁surgeon\t-11.2892\n▁aristocrat\t-11.2892\n▁literally\t-11.2892\n▁atlantic\t-11.2892\n▁martyr\t-11.2892\n▁senate\t-11.2892\n▁speck\t-11.2892\n▁loaf\t-11.2892\nvocation\t-11.2902\n▁administer\t-11.2961\n▁apprehend\t-11.2961\n▁elaborate\t-11.2961\n▁subdued\t-11.2961\n▁temporary\t-11.2961\n▁dominion\t-11.2961\n▁dignified\t-11.2961\n▁splash\t-11.2961\n▁conseil\t-11.2961\n▁dexter\t-11.2
961\n▁unseen\t-11.2961\n▁tragic\t-11.2962\nologist\t-11.3023\n▁sympathetic\t-11.3031\n▁bachelor\t-11.3031\n▁defense\t-11.3031\n▁excursion\t-11.3031\n▁faculties\t-11.3031\n▁proprietor\t-11.3031\n▁radiant\t-11.3031\n▁unnecessary\t-11.3031\n▁vacant\t-11.3031\n▁screw\t-11.3031\n▁ounce\t-11.3031\n▁gratify\t-11.3032\n▁calculated\t-11.3101\n▁keith\t-11.3101\n▁phenomenon\t-11.3101\n▁prominent\t-11.3101\n▁worried\t-11.3101\n▁climate\t-11.3101\n▁studies\t-11.3101\n▁aramis\t-11.3101\n▁bliss\t-11.3102\n▁contend\t-11.3102\nclose\t-11.312\n▁continual\t-11.3127\n▁surpass\t-11.3172\n▁hebrew\t-11.3172\n▁identity\t-11.3172\n▁provoke\t-11.3172\n▁temperament\t-11.3172\n▁chariot\t-11.3172\n▁ninth\t-11.3172\n▁harbor\t-11.3173\n▁desirous\t-11.3244\n▁jerusalem\t-11.3244\n▁undertaking\t-11.3244\n▁chorus\t-11.3244\n▁scout\t-11.3244\n▁mirth\t-11.3244\n▁hymn\t-11.3244\n▁particle\t-11.3246\n▁apparatus\t-11.3316\n▁intelligible\t-11.3316\n▁invariably\t-11.3316\n▁pierced\t-11.3316\n▁review\t-11.3316\n▁flicker\t-11.3316\n▁exciting\t-11.3316\n▁gospel\t-11.3316\n▁dixon\t-11.3316\n▁revelation\t-11.3316\n▁constance\t-11.3316\n▁overtake\t-11.3316\n▁guinea\t-11.3316\n▁drap\t-11.3322\n▁precise\t-11.3343\n▁aladdin\t-11.3388\n▁chicago\t-11.3388\n▁tulliver\t-11.3388\n▁hamilton\t-11.3388\n▁garrison\t-11.3388\n▁disciple\t-11.3388\n▁intensity\t-11.3388\n▁traitor\t-11.3388\n▁chancellor\t-11.3388\n▁proverb\t-11.3388\n▁dagger\t-11.3389\n▁foresee\t-11.3399\n▁chauvelin\t-11.3461\n▁glimmer\t-11.3461\n▁volunteer\t-11.3461\n▁jungle\t-11.3461\n▁streak\t-11.3461\n▁sunrise\t-11.3461\n▁dissolv\t-11.3461\n▁confide\t-11.3482\n▁awhile\t-11.3535\n▁felicity\t-11.3535\n▁legislature\t-11.3535\n▁leonora\t-11.3535\n▁pitiful\t-11.3535\n▁colony\t-11.3535\n▁shawl\t-11.3536\n▁harmoni\t-11.3552\n▁arriving\t-11.3609\n▁carpenter\t-11.3609\n▁fundamental\t-11.3609\n▁overflow\t-11.3609\n▁expand\t-11.3609\n▁harvest\t-11.3609\n▁tidings\t-11.3609\nfolk\t-11.3636\n▁feminine\t-11.3683\n▁innumerable\t-11.3683\n▁twentieth\t-11.3683\n▁trifling\t-11.
3683\n▁ghastl\t-11.3683\n▁conquest\t-11.3683\n▁butterfly\t-11.3683\n▁daniel\t-11.3684\n▁scramble\t-11.3684\n▁facilit\t-11.3685\n▁forsake\t-11.3687\n▁behaviour\t-11.3759\n▁gorgeous\t-11.3759\n▁producing\t-11.3759\n▁happier\t-11.3759\n▁promising\t-11.3759\n▁rainbow\t-11.3759\n▁instinctively\t-11.3759\n▁decree\t-11.376\n▁copie\t-11.3764\n▁strew\t-11.3765\n▁eyebrows\t-11.3834\n▁irresistible\t-11.3834\n▁pharaoh\t-11.3834\n▁scrooge\t-11.3834\n▁unnatural\t-11.3834\n▁crumbs\t-11.3834\n▁refined\t-11.3834\n▁dreary\t-11.3834\n▁trench\t-11.3835\n▁clair\t-11.3838\n▁convince\t-11.386\n▁fringe\t-11.3877\n▁extremity\t-11.3911\n▁intimacy\t-11.3911\n▁scoundrel\t-11.3911\n▁suffrage\t-11.3911\n▁uneasiness\t-11.3911\n▁barricade\t-11.3911\n▁circulat\t-11.3911\n▁samuel\t-11.3911\n▁bruce\t-11.3911\n▁spake\t-11.3911\n▁ambitious\t-11.3988\n▁energetic\t-11.3988\n▁splendor\t-11.3988\n▁tuesday\t-11.3988\n▁virtuous\t-11.3988\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/assets/data/mean_std.json",
    "content": "{\"mean_stat\": [3419817384.9589553, 3554070049.1888413, 3818511309.9166613, 4066044518.3850017, 4291564631.2871633, 4447813845.146345, 4533096457.680424, 4535743891.989957, 4529762966.952207, 4506798370.255702, 4563810141.721841, 4621582319.277632, 4717208210.814803, 4782916961.295261, 4800534153.252695, 4816978042.979026, 4813370098.242317, 4783029495.131413, 4797780594.144404, 4697681126.278327, 4615891408.325888, 4660549391.6024275, 4576180438.146472, 4609080513.250168, 4575296489.058092, 4602504837.872262, 4568039825.650208, 4596829549.204861, 4590634987.343898, 4604371982.549804, 4623782318.317643, 4643582410.8842745, 4681460771.788484, 4759470876.31175, 4808639788.683043, 4828470941.416027, 4868984035.113543, 4906503986.801533, 4945995579.443381, 4936645225.986488, 4975902400.919519, 4960230208.656678, 4986734786.199859, 4983472199.8246765, 5002204376.162232, 5030432036.352981, 5060386169.086892, 5093482058.577236, 5118330657.308789, 5137270836.326198, 5140137363.319094, 5144296534.330122, 5158812605.654329, 5166263515.51458, 5156261604.282723, 5155820011.532965, 5154511256.8968, 5152063882.193671, 5153425524.412178, 5149000486.683038, 5154587156.35868, 5134412165.07972, 5092874838.792056, 5062281231.5140915, 5029059442.072953, 4996045017.917702, 4962203662.170533, 4928110046.282831, 4900476581.092096, 4881407033.533021, 4859626116.955097, 4851430742.3865795, 4850317443.454599, 4848197040.155383, 4837178106.464577, 4818448202.7298765, 4803345264.527405, 4765785994.104498, 4735296707.352132, 4699957946.40757], \"var_stat\": [39487786239.20539, 42865198005.60155, 49718916704.468704, 55953639455.490585, 62156293826.00315, 66738657819.12445, 69416921986.47835, 69657873431.17258, 69240303799.53061, 68286972351.43054, 69718367152.18843, 71405427710.7103, 74174200331.87572, 76047347951.43869, 76478048614.40665, 76810929560.19212, 76540466184.85634, 75538479521.34026, 75775624554.07217, 72775991318.16557, 70350402972.93352, 71358602366.48341, 
68872845697.9878, 69552396791.49916, 68471390455.59991, 69022047288.07498, 67982260910.11236, 68656154716.71916, 68461419064.9241, 68795285460.65717, 69270474608.52791, 69754495937.76433, 70596044579.14969, 72207936275.97945, 73629619360.65047, 74746445259.57487, 75925168496.81197, 76973508692.04265, 78074337163.3413, 77765963787.96971, 78839167623.49733, 78328768943.2287, 79016127287.03778, 78922638306.99306, 79489768324.9408, 80354861037.44005, 81311991408.12526, 82368205917.26112, 83134782296.1741, 83667769421.23245, 83673751953.46239, 83806087685.62842, 84193971202.07523, 84424752763.34825, 84092846117.64104, 84039114093.08766, 83982515225.7085, 83909645482.75613, 83947278563.15077, 83800767707.19617, 83851106027.8772, 83089292432.37892, 82056425825.3622, 81138570746.92316, 80131843258.75557, 79130160837.19037, 78092166878.71533, 77104785522.79205, 76308548392.10454, 75709445890.58063, 75084778641.6033, 74795849006.19067, 74725807683.832, 74645651838.2169, 74300193368.39339, 73696619147.86806, 73212785808.97992, 72240491743.0697, 71420246227.32545, 70457076435.4593], \"frame_num\": 345484372}\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/assets/data/vocab.txt",
    "content": "<blank>\n<unk>\n'\na\nabeth\nability\nable\nably\nabout\nac\nach\nacious\nad\nade\nag\nage\nah\nak\nal\nally\nam\nan\nance\nand\nang\nans\nant\nap\nar\nard\naries\nartagnan\nary\nas\nat\nate\nated\nath\nating\nation\nations\native\nator\natory\nau\nav\naw\nay\nb\nba\nbbe\nbble\nbe\nbel\nber\nbi\nble\nbo\nboard\nborough\nbra\nbu\nburg\nburn\nbury\nby\nc\nca\ncar\ncast\nce\ncent\nch\ncha\nche\nched\nchi\ncho\nci\nck\nclock\nclose\nco\ncomb\ncon\nctor\ncu\ncum\ncy\nd\nda\ndding\nddle\nde\nden\nder\ndo\ndolph\ndy\ne\nea\ned\nef\nel\nella\nem\nement\nen\nence\nencies\nened\nens\nent\ner\ners\nes\nest\net\neth\nett\nette\nev\never\nex\ney\nf\nfa\nfall\nfe\nfer\nff\nfi\nfield\nfold\nfolk\nfoot\nfor\nford\nform\nft\nful\ng\nga\ngan\ngar\ngate\nge\nged\ngen\nger\ngg\ngi\nging\ngn\ngo\ngra\ngu\ngue\nh\nha\nham\nhan\nhar\nhe\nhead\nhen\nher\nhi\nhin\nho\nhold\nhood\nhouse\nhu\nhurst\nhy\ni\nia\nial\nian\nians\nib\nible\nic\nical\nick\nid\nie\nied\nier\nies\nif\nification\nified\nifying\nig\night\nign\nil\nile\nility\nily\nim\nin\nina\nine\niness\ning\nio\nion\nions\nious\nip\nir\nire\nis\nish\nism\nison\nist\nistic\nists\nit\nitch\nite\nities\nitude\nity\nium\nius\nive\nj\nja\nji\njo\nju\nk\nka\nke\nkeep\nker\nki\nkin\nking\nko\nky\nl\nla\nlac\nlan\nland\nlar\nld\nle\nled\nleigh\nler\nles\nless\nlet\nley\nlf\nli\nlie\nlight\nlike\nlin\nline\nliness\nling\nll\nlo\nlock\nlon\nlong\nlow\nlt\nlung\nlus\nly\nm\nma\nman\nmbled\nme\nmen\nment\nments\nmer\nmi\nmidst\nmmed\nmo\nmond\nmont\nmore\nmost\nmouth\nmp\nmy\nn\nna\nnce\nnd\nne\nned\nner\nness\nney\nng\nni\nnic\nning\nnnie\nnny\nno\nnt\nny\no\noc\nod\nog\nol\nological\nologist\nology\nom\non\none\noo\nook\noon\nop\nor\nord\nors\nory\nos\not\nou\nour\nous\nov\now\np\npa\npe\npec\nped\nper\npha\npiece\nple\npo\nport\npose\npp\npping\nps\nq\nqua\nque\nqui\nr\nra\nran\nrate\nre\nred\nress\nrg\nri\nric\nrick\nridge\nries\nright\nrin\nring\nris\nrk\nrn\nro\nron\nrous\nrow\nrs\nrt\nru\nry\ns\nsail\nse\nsh\nship\nshire\n
side\nsome\nson\nst\nstead\nster\nstone\nstra\nt\nta\ntan\nte\nted\nten\nter\nterior\nth\nthe\nther\nthink\nthorpe\nti\ntic\nties\ntime\ntin\nting\ntion\nto\nton\ntri\ntro\ntte\nttered\nttle\ntur\nty\nu\nub\nuc\nuch\nud\nug\nugh\nul\nulation\num\nun\nund\nuous\nup\nur\nure\nus\nuse\nut\nux\nv\nva\nval\nvan\nve\nver\nvi\nville\nvo\nvocation\nw\nwa\nwar\nward\nway\nwe\nwell\nwi\nwick\nwin\nwn\nwood\nwork\nworth\nx\ny\nz\nzz\n▁\n▁a\n▁ab\n▁abandon\n▁able\n▁abode\n▁about\n▁above\n▁abraham\n▁abroad\n▁absence\n▁absent\n▁absolute\n▁absolutely\n▁absorb\n▁abstract\n▁absurd\n▁abundance\n▁abundant\n▁abuse\n▁accent\n▁accept\n▁accepted\n▁access\n▁accident\n▁accompanied\n▁accompany\n▁accomplish\n▁accord\n▁according\n▁accordingly\n▁account\n▁accumulat\n▁accurate\n▁accuse\n▁accustomed\n▁achieve\n▁acknowledg\n▁acquaintance\n▁acquainted\n▁acquired\n▁across\n▁act\n▁action\n▁active\n▁activity\n▁actual\n▁actually\n▁adam\n▁adapt\n▁add\n▁added\n▁addition\n▁address\n▁addressed\n▁adhere\n▁adjust\n▁administer\n▁administration\n▁admirable\n▁admiral\n▁admiration\n▁admire\n▁admiring\n▁admit\n▁admitted\n▁adopt\n▁adorn\n▁advance\n▁advanced\n▁advancing\n▁advantage\n▁adventure\n▁advertise\n▁advice\n▁advise\n▁affair\n▁affairs\n▁affect\n▁affected\n▁affection\n▁affectionate\n▁affirm\n▁afflict\n▁afford\n▁afraid\n▁africa\n▁after\n▁afternoon\n▁afterward\n▁afterwards\n▁again\n▁against\n▁age\n▁agent\n▁agitated\n▁agitation\n▁ago\n▁agony\n▁agree\n▁agreeable\n▁agreed\n▁ah\n▁ahead\n▁aid\n▁aim\n▁air\n▁al\n▁aladdin\n▁alarm\n▁alas\n▁albert\n▁alexander\n▁alice\n▁alive\n▁all\n▁allow\n▁allowed\n▁almost\n▁alone\n▁along\n▁aloud\n▁already\n▁also\n▁altar\n▁alter\n▁although\n▁altogether\n▁always\n▁alyosha\n▁am\n▁ama\n▁ambassador\n▁ambition\n▁ambitious\n▁amelia\n▁america\n▁american\n▁amiable\n▁amid\n▁among\n▁amount\n▁amusement\n▁an\n▁anchor\n▁ancient\n▁and\n▁andrew\n▁angel\n▁anger\n▁angle\n▁angrily\n▁angry\n▁anguish\n▁animal\n▁animals\n▁anna\n▁anne\n▁announc\n▁announced\n▁another\n▁answer\n▁answered\n▁anthea\n▁anti\n▁antic
ipate\n▁anxiety\n▁anxious\n▁any\n▁anybody\n▁anyhow\n▁anyone\n▁anything\n▁anywhere\n▁apart\n▁apartment\n▁apologi\n▁apparatus\n▁apparent\n▁apparently\n▁appeal\n▁appear\n▁appearance\n▁appeared\n▁appetite\n▁apple\n▁application\n▁applied\n▁apply\n▁appointed\n▁appointment\n▁appreciate\n▁apprehend\n▁apprehension\n▁approach\n▁approached\n▁approaching\n▁appropriate\n▁approve\n▁april\n▁apron\n▁apt\n▁ar\n▁arab\n▁aramis\n▁arch\n▁architect\n▁ardent\n▁are\n▁argue\n▁argument\n▁arise\n▁aristocrat\n▁arm\n▁arms\n▁army\n▁arose\n▁around\n▁arranged\n▁arrangement\n▁array\n▁arrest\n▁arrival\n▁arrive\n▁arrived\n▁arriving\n▁arrow\n▁art\n▁arthur\n▁article\n▁artificial\n▁artist\n▁as\n▁ascend\n▁ascertain\n▁ashamed\n▁ashes\n▁ashore\n▁aside\n▁ask\n▁asked\n▁asking\n▁asleep\n▁aspect\n▁assassin\n▁assault\n▁assembled\n▁assembly\n▁assert\n▁assist\n▁assistance\n▁assistant\n▁associate\n▁association\n▁assume\n▁assumed\n▁assurance\n▁assure\n▁assured\n▁astonished\n▁astonishment\n▁at\n▁atlantic\n▁atmosphere\n▁attached\n▁attachment\n▁attack\n▁attain\n▁attempt\n▁attend\n▁attendant\n▁attention\n▁attentive\n▁attitude\n▁attorney\n▁attract\n▁attribute\n▁audience\n▁august\n▁aunt\n▁author\n▁authorities\n▁authority\n▁autumn\n▁avail\n▁avenue\n▁average\n▁avoid\n▁await\n▁awake\n▁awakened\n▁aware\n▁away\n▁awful\n▁awhile\n▁awkward\n▁awoke\n▁axe\n▁b\n▁ba\n▁baby\n▁bachelor\n▁back\n▁background\n▁backward\n▁bad\n▁bade\n▁bag\n▁bake\n▁bal\n▁balance\n▁ball\n▁balloon\n▁ban\n▁band\n▁bank\n▁bapti\n▁bar\n▁barbar\n▁bare\n▁bargain\n▁bark\n▁baron\n▁barrel\n▁barricade\n▁barrier\n▁base\n▁basin\n▁basket\n▁bath\n▁batter\n▁battle\n▁bay\n▁be\n▁bear\n▁beard\n▁bearing\n▁beast\n▁beat\n▁beaten\n▁beautiful\n▁beauty\n▁became\n▁because\n▁become\n▁becoming\n▁bed\n▁bedroom\n▁been\n▁before\n▁beg\n▁began\n▁beggar\n▁begged\n▁begin\n▁beginning\n▁begun\n▁behalf\n▁behave\n▁behaviour\n▁beheld\n▁behind\n▁behold\n▁being\n▁belief\n▁believe\n▁believed\n▁believing\n▁bell\n▁belong\n▁beloved\n▁below\n▁bench\n▁bending\n▁beneath\n▁benefit\n▁bent\n▁bernard\n▁beside
\n▁besides\n▁best\n▁bestow\n▁betray\n▁better\n▁between\n▁bewildered\n▁beyond\n▁bi\n▁bible\n▁bid\n▁big\n▁bill\n▁billy\n▁bind\n▁bird\n▁birds\n▁birth\n▁bishop\n▁bit\n▁bitter\n▁bla\n▁black\n▁blade\n▁blame\n▁blank\n▁blanket\n▁bless\n▁blew\n▁blind\n▁bliss\n▁block\n▁blood\n▁bloom\n▁blossom\n▁blow\n▁blu\n▁blue\n▁blush\n▁bo\n▁board\n▁boast\n▁boat\n▁bob\n▁bodies\n▁body\n▁boil\n▁bold\n▁bolt\n▁bon\n▁bond\n▁bonnet\n▁book\n▁books\n▁boot\n▁boots\n▁border\n▁bore\n▁born\n▁borne\n▁borrow\n▁bosom\n▁boston\n▁both\n▁bottle\n▁bottom\n▁bought\n▁bound\n▁bow\n▁bowed\n▁bowl\n▁box\n▁boy\n▁boys\n▁bra\n▁brain\n▁branch\n▁branches\n▁brand\n▁brave\n▁bread\n▁break\n▁breakfast\n▁breaking\n▁breast\n▁breath\n▁bree\n▁brethren\n▁breton\n▁bri\n▁brick\n▁bride\n▁bridge\n▁bridle\n▁brief\n▁brig\n▁bright\n▁brilliant\n▁bring\n▁bringing\n▁brisk\n▁britain\n▁british\n▁bro\n▁broad\n▁broke\n▁broken\n▁brood\n▁brook\n▁brother\n▁brothers\n▁brought\n▁brow\n▁brown\n▁bruce\n▁brush\n▁brutal\n▁brute\n▁bu\n▁buck\n▁build\n▁building\n▁built\n▁bulk\n▁bull\n▁bullet\n▁bunch\n▁bundle\n▁bur\n▁burden\n▁buried\n▁burn\n▁burning\n▁burst\n▁bush\n▁bushes\n▁business\n▁busy\n▁but\n▁butter\n▁butterfly\n▁buy\n▁by\n▁c\n▁ca\n▁cab\n▁cabin\n▁caesar\n▁cake\n▁cal\n▁calamit\n▁calculated\n▁california\n▁call\n▁called\n▁calling\n▁calm\n▁came\n▁camp\n▁campaign\n▁can\n▁candid\n▁candle\n▁cannon\n▁cannot\n▁canoe\n▁canvas\n▁cap\n▁capable\n▁capacity\n▁capital\n▁captain\n▁capture\n▁car\n▁card\n▁cardinal\n▁care\n▁careful\n▁carefully\n▁careless\n▁carlyle\n▁carpenter\n▁carpet\n▁carr\n▁carriage\n▁carried\n▁carry\n▁carrying\n▁cart\n▁carved\n▁case\n▁cast\n▁castle\n▁casual\n▁cat\n▁catch\n▁cathedral\n▁catherine\n▁catholic\n▁cattle\n▁caught\n▁cause\n▁caused\n▁caution\n▁cavalry\n▁cave\n▁ce\n▁cease\n▁ceased\n▁ceiling\n▁celebrat\n▁cell\n▁cellar\n▁cent\n▁center\n▁central\n▁centre\n▁centuries\n▁century\n▁ceremony\n▁certain\n▁certainly\n▁cetera\n▁ch\n▁cha\n▁chain\n▁chair\n▁challenge\n▁chamber\n▁champion\n▁chance\n▁chancellor\n▁change\n▁changed\n▁changing\n▁channel\n▁chap\
n▁chapter\n▁char\n▁character\n▁characteristic\n▁charge\n▁chariot\n▁charles\n▁charlotte\n▁charm\n▁charming\n▁chase\n▁chateau\n▁chatter\n▁chauvelin\n▁che\n▁cheap\n▁check\n▁cheek\n▁cheeks\n▁cheer\n▁cheerful\n▁cheese\n▁cherish\n▁chest\n▁chi\n▁chicago\n▁chicken\n▁chief\n▁child\n▁childhood\n▁children\n▁chill\n▁chimney\n▁chin\n▁china\n▁chinese\n▁choice\n▁choose\n▁chop\n▁chorus\n▁chose\n▁chosen\n▁chris\n▁christ\n▁christian\n▁christmas\n▁chu\n▁chuck\n▁church\n▁cigar\n▁circle\n▁circular\n▁circulat\n▁circumstance\n▁circumstances\n▁citi\n▁cities\n▁city\n▁civil\n▁civili\n▁cl\n▁claim\n▁clair\n▁clapp\n▁clara\n▁clasp\n▁class\n▁classes\n▁claw\n▁clay\n▁clean\n▁clear\n▁clearly\n▁clergy\n▁clerk\n▁clever\n▁cliff\n▁climate\n▁climb\n▁clo\n▁cloak\n▁clock\n▁close\n▁closed\n▁closely\n▁cloth\n▁clothes\n▁cloud\n▁clouds\n▁club\n▁cluster\n▁clutch\n▁co\n▁coach\n▁coal\n▁coarse\n▁coast\n▁coat\n▁cock\n▁coffee\n▁coffin\n▁coin\n▁col\n▁cold\n▁collar\n▁collect\n▁college\n▁colonel\n▁coloni\n▁colony\n▁color\n▁colour\n▁column\n▁com\n▁comb\n▁combat\n▁combination\n▁combined\n▁come\n▁comes\n▁comfort\n▁comfortable\n▁coming\n▁command\n▁commenced\n▁commend\n▁comment\n▁commerce\n▁commercial\n▁commission\n▁commit\n▁committed\n▁committee\n▁common\n▁commun\n▁communicat\n▁communication\n▁community\n▁comp\n▁companion\n▁companions\n▁company\n▁comparatively\n▁compare\n▁comparison\n▁compass\n▁compelled\n▁complain\n▁complete\n▁completely\n▁complex\n▁compliment\n▁composed\n▁composition\n▁comprehend\n▁comrade\n▁con\n▁conceal\n▁conceive\n▁concentrat\n▁conception\n▁concern\n▁concerned\n▁concerning\n▁concert\n▁conclud\n▁concluded\n▁conclusion\n▁condemn\n▁condition\n▁conditions\n▁conduct\n▁conf\n▁confess\n▁confide\n▁confidence\n▁confident\n▁confined\n▁confirm\n▁conflict\n▁confound\n▁confront\n▁confused\n▁confusion\n▁congress\n▁conjecture\n▁connected\n▁connection\n▁conquer\n▁conquest\n▁conscience\n▁conscious\n▁consciousness\n▁conseil\n▁consent\n▁consequence\n▁consequently\n▁consider\n▁considerable\n▁consideration\n▁considered\n▁
consist\n▁consolation\n▁conspicuous\n▁constance\n▁constant\n▁constantly\n▁constitute\n▁constitution\n▁construct\n▁consult\n▁consum\n▁contact\n▁contain\n▁contemplate\n▁contempt\n▁contend\n▁content\n▁contest\n▁continent\n▁continual\n▁continually\n▁continue\n▁continued\n▁contract\n▁contradict\n▁contrary\n▁contrast\n▁contribute\n▁control\n▁convenient\n▁convent\n▁convention\n▁conversation\n▁converse\n▁convert\n▁convey\n▁convict\n▁conviction\n▁convince\n▁convinced\n▁convuls\n▁cook\n▁cool\n▁copie\n▁copper\n▁copy\n▁cor\n▁cordial\n▁corn\n▁corner\n▁corporal\n▁corpse\n▁correct\n▁correspond\n▁corridor\n▁corrupt\n▁cosette\n▁cost\n▁costume\n▁cottage\n▁cotton\n▁couch\n▁could\n▁couldn\n▁council\n▁counsel\n▁count\n▁countenance\n▁counter\n▁countess\n▁countries\n▁country\n▁couple\n▁courage\n▁course\n▁court\n▁cousin\n▁cover\n▁covered\n▁cow\n▁coward\n▁cra\n▁crack\n▁craft\n▁crawl\n▁cre\n▁cream\n▁created\n▁creature\n▁creatures\n▁credit\n▁creek\n▁creep\n▁crep\n▁crew\n▁cried\n▁cries\n▁crime\n▁criminal\n▁crimson\n▁cristo\n▁critic\n▁cro\n▁cross\n▁crossed\n▁crow\n▁crowd\n▁crown\n▁cru\n▁cruel\n▁crumbs\n▁crush\n▁cry\n▁crying\n▁crystal\n▁cu\n▁cultivate\n▁culture\n▁cunning\n▁cup\n▁cur\n▁curiosity\n▁curious\n▁curl\n▁current\n▁curse\n▁curtain\n▁cushion\n▁custom\n▁cut\n▁cutting\n▁cyril\n▁d\n▁da\n▁dagger\n▁daily\n▁damage\n▁damn\n▁damp\n▁damsel\n▁dan\n▁dance\n▁dancing\n▁danger\n▁dangerous\n▁danglars\n▁daniel\n▁dar\n▁dare\n▁dared\n▁dark\n▁darkness\n▁darling\n▁dash\n▁date\n▁daughter\n▁david\n▁dawn\n▁day\n▁days\n▁de\n▁dead\n▁deaf\n▁deal\n▁dear\n▁dearest\n▁death\n▁debate\n▁debt\n▁decay\n▁deceive\n▁december\n▁decide\n▁decided\n▁decision\n▁deck\n▁declare\n▁declared\n▁decline\n▁decorat\n▁decree\n▁deep\n▁deeply\n▁defeat\n▁defect\n▁defence\n▁defend\n▁defense\n▁defi\n▁definite\n▁degree\n▁delay\n▁deliberate\n▁delicacy\n▁delicate\n▁delicious\n▁delight\n▁delighted\n▁delightful\n▁deliver\n▁demand\n▁demanded\n▁democratic\n▁demon\n▁den\n▁denied\n▁deny\n▁depart\n▁department\n▁departure\n▁depend\n▁deposit\n▁depress\n▁de
prived\n▁depth\n▁derived\n▁descend\n▁descended\n▁descent\n▁describe\n▁described\n▁description\n▁desert\n▁deserve\n▁design\n▁desirable\n▁desire\n▁desired\n▁desirous\n▁desk\n▁desolate\n▁despair\n▁despatch\n▁desperate\n▁despise\n▁despite\n▁destined\n▁destiny\n▁destroy\n▁destroyed\n▁destruction\n▁detail\n▁detain\n▁detect\n▁detective\n▁determin\n▁determination\n▁determined\n▁develop\n▁development\n▁device\n▁devil\n▁devoted\n▁devotion\n▁devour\n▁dexter\n▁di\n▁diamond\n▁diana\n▁dick\n▁did\n▁didn\n▁die\n▁died\n▁differ\n▁difference\n▁different\n▁difficult\n▁difficulties\n▁difficulty\n▁dig\n▁dignified\n▁dignity\n▁dim\n▁diminish\n▁din\n▁dinner\n▁direct\n▁directed\n▁direction\n▁directly\n▁dirty\n▁dis\n▁disagreeable\n▁disappear\n▁disappeared\n▁disappoint\n▁disappointment\n▁disc\n▁discern\n▁discharge\n▁disciple\n▁discipline\n▁discourage\n▁discourse\n▁discover\n▁discovered\n▁discovery\n▁discuss\n▁discussion\n▁disdain\n▁disease\n▁disgrace\n▁disguise\n▁disgust\n▁dish\n▁dislike\n▁dismal\n▁dismay\n▁dismiss\n▁disorder\n▁display\n▁disposed\n▁disposition\n▁dispute\n▁dissolv\n▁distance\n▁distant\n▁distinct\n▁distinction\n▁distinguish\n▁distinguished\n▁distract\n▁distress\n▁distribut\n▁district\n▁distrust\n▁disturb\n▁div\n▁divers\n▁divide\n▁divided\n▁divine\n▁division\n▁dixon\n▁do\n▁doctor\n▁doctrine\n▁document\n▁does\n▁doesn\n▁dog\n▁dogs\n▁doing\n▁dollars\n▁domestic\n▁dominion\n▁don\n▁done\n▁donkey\n▁door\n▁doors\n▁doorway\n▁dorothy\n▁double\n▁doubt\n▁doubtful\n▁doubtless\n▁down\n▁downstairs\n▁drag\n▁dragg\n▁dragon\n▁drain\n▁drake\n▁drama\n▁drank\n▁drap\n▁draught\n▁draw\n▁drawing\n▁drawn\n▁dread\n▁dreadful\n▁dream\n▁dreary\n▁dress\n▁dressed\n▁drew\n▁dri\n▁drift\n▁drink\n▁drive\n▁driven\n▁driver\n▁driving\n▁droop\n▁drop\n▁dropped\n▁dropping\n▁drove\n▁drown\n▁drug\n▁drum\n▁drunk\n▁dry\n▁du\n▁duchess\n▁duck\n▁due\n▁duke\n▁dull\n▁dumb\n▁dun\n▁dunbar\n▁dur\n▁dusk\n▁dust\n▁dutch\n▁duties\n▁duty\n▁dwarf\n▁dwell\n▁dwelt\n▁dying\n▁e\n▁each\n▁eager\n▁eagerly\n▁eagle\n▁ear\n▁earl\n▁earlier\n▁earlies
t\n▁early\n▁earn\n▁earnest\n▁ears\n▁earth\n▁ease\n▁easier\n▁easily\n▁east\n▁eastern\n▁easy\n▁eat\n▁eaten\n▁eating\n▁echo\n▁edge\n▁edith\n▁editor\n▁educat\n▁education\n▁edward\n▁effect\n▁effort\n▁eggs\n▁egypt\n▁egyptian\n▁eight\n▁eighteen\n▁eighty\n▁either\n▁el\n▁elaborate\n▁elbow\n▁elder\n▁eldest\n▁eleanor\n▁elect\n▁electric\n▁elegant\n▁element\n▁elephant\n▁eleven\n▁eli\n▁else\n▁elsewhere\n▁elsie\n▁em\n▁embark\n▁embarrass\n▁embrace\n▁embroider\n▁emerg\n▁emily\n▁eminent\n▁emotion\n▁emperor\n▁emphasi\n▁empire\n▁employ\n▁employed\n▁empty\n▁en\n▁enable\n▁enchant\n▁enclos\n▁encounter\n▁encourage\n▁end\n▁endeavor\n▁endeavour\n▁endure\n▁enemies\n▁enemy\n▁energetic\n▁energy\n▁engage\n▁engaged\n▁engagement\n▁engine\n▁england\n▁english\n▁enjoy\n▁enjoyment\n▁enlighten\n▁enormous\n▁enough\n▁ensu\n▁enter\n▁entered\n▁enterprise\n▁entertain\n▁enthusiasm\n▁entire\n▁entirely\n▁entitled\n▁entrance\n▁entreat\n▁envelope\n▁envy\n▁epi\n▁equal\n▁equally\n▁er\n▁ere\n▁erect\n▁errand\n▁error\n▁escape\n▁escaped\n▁escort\n▁especially\n▁essence\n▁essential\n▁establish\n▁established\n▁establishment\n▁estate\n▁esteem\n▁estimate\n▁estralla\n▁eternal\n▁eternity\n▁europe\n▁eustace\n▁eva\n▁even\n▁evening\n▁events\n▁ever\n▁every\n▁everybody\n▁everyone\n▁everything\n▁everywhere\n▁evidence\n▁evident\n▁evidently\n▁evil\n▁ex\n▁exact\n▁exactly\n▁examination\n▁examine\n▁examined\n▁examining\n▁example\n▁exceed\n▁exceedingly\n▁excellency\n▁excellent\n▁except\n▁exception\n▁excess\n▁exchange\n▁excite\n▁excited\n▁excitement\n▁exciting\n▁exclaimed\n▁exclamation\n▁exclusive\n▁excursion\n▁excuse\n▁execut\n▁execution\n▁exercise\n▁exhaust\n▁exhibit\n▁exist\n▁existence\n▁expand\n▁expect\n▁expectation\n▁expected\n▁expedition\n▁expense\n▁experience\n▁experiment\n▁explain\n▁explained\n▁explanation\n▁explore\n▁exposed\n▁express\n▁expressed\n▁expression\n▁exquisite\n▁extend\n▁extended\n▁extensive\n▁extent\n▁external\n▁extra\n▁extract\n▁extraordinary\n▁extreme\n▁extremely\n▁extremity\n▁eye\n▁eyebrows\n▁eyes\n▁f\n▁fa\n▁face\
n▁faces\n▁facilit\n▁facing\n▁fact\n▁faculties\n▁faculty\n▁faded\n▁fail\n▁failed\n▁failure\n▁faint\n▁fair\n▁fairly\n▁fairy\n▁faith\n▁faithful\n▁fall\n▁fallen\n▁falling\n▁false\n▁fame\n▁familiar\n▁families\n▁family\n▁famous\n▁fan\n▁fancied\n▁fancies\n▁fancy\n▁fanny\n▁fantastic\n▁far\n▁farewell\n▁farm\n▁farmer\n▁farther\n▁fashion\n▁fast\n▁fastened\n▁fat\n▁fatal\n▁fate\n▁father\n▁fatigue\n▁fault\n▁favor\n▁favorite\n▁favour\n▁favourite\n▁fe\n▁fear\n▁fearful\n▁feast\n▁feather\n▁feature\n▁features\n▁february\n▁federal\n▁feeble\n▁feed\n▁feel\n▁feeling\n▁feelings\n▁feet\n▁felicity\n▁fell\n▁fellow\n▁felt\n▁female\n▁feminine\n▁fence\n▁fer\n▁fertil\n▁fetch\n▁fever\n▁few\n▁fi\n▁field\n▁fields\n▁fierce\n▁fifteen\n▁fifth\n▁fifty\n▁fight\n▁fighting\n▁figure\n▁fill\n▁filled\n▁film\n▁fin\n▁final\n▁finally\n▁find\n▁finding\n▁fine\n▁finger\n▁fingers\n▁finish\n▁finished\n▁fire\n▁firm\n▁firmly\n▁first\n▁fish\n▁fisherman\n▁fit\n▁fitted\n▁five\n▁fix\n▁fixed\n▁fl\n▁flag\n▁flame\n▁flank\n▁flash\n▁flat\n▁flatter\n▁fled\n▁flee\n▁fleet\n▁flesh\n▁flew\n▁flicker\n▁flight\n▁flo\n▁flock\n▁flood\n▁floor\n▁florence\n▁flour\n▁flourish\n▁flow\n▁flower\n▁flowers\n▁flu\n▁flutter\n▁fly\n▁flying\n▁fo\n▁fog\n▁fold\n▁folk\n▁follow\n▁followed\n▁following\n▁folly\n▁fond\n▁food\n▁fool\n▁foolish\n▁foot\n▁footsteps\n▁for\n▁forbid\n▁force\n▁forced\n▁fore\n▁forehead\n▁foreign\n▁foresee\n▁forest\n▁forget\n▁forgive\n▁forgot\n▁forgotten\n▁form\n▁formed\n▁former\n▁formidable\n▁forsake\n▁forth\n▁fortnight\n▁fortunate\n▁fortune\n▁forty\n▁forward\n▁fought\n▁found\n▁fountain\n▁four\n▁fourteen\n▁fourth\n▁fowl\n▁fox\n▁fra\n▁fragment\n▁frame\n▁france\n▁francis\n▁francs\n▁frank\n▁fred\n▁frederick\n▁free\n▁freedom\n▁french\n▁frequent\n▁frequently\n▁fresh\n▁fri\n▁friday\n▁friend\n▁friendly\n▁friends\n▁friendship\n▁fright\n▁frightened\n▁frightful\n▁fringe\n▁fro\n▁frog\n▁from\n▁front\n▁frown\n▁fruit\n▁fu\n▁fulfil\n▁full\n▁fully\n▁fun\n▁function\n▁fundamental\n▁funeral\n▁funny\n▁fur\n▁furious\n▁furnish\n▁furniture\n▁further\n▁futur
e\n▁g\n▁ga\n▁gain\n▁gained\n▁gall\n▁gallant\n▁gallery\n▁gallop\n▁game\n▁gar\n▁garden\n▁garrison\n▁gasp\n▁gate\n▁gather\n▁gathered\n▁gave\n▁gay\n▁ge\n▁gen\n▁general\n▁generally\n▁generation\n▁generosity\n▁generous\n▁genius\n▁gentle\n▁gentleman\n▁gentlemen\n▁gently\n▁genuine\n▁george\n▁ger\n▁german\n▁gesture\n▁get\n▁getting\n▁ghastl\n▁ghost\n▁gi\n▁giant\n▁gift\n▁gigantic\n▁gil\n▁gilbert\n▁girl\n▁girls\n▁give\n▁given\n▁giving\n▁gla\n▁glacier\n▁glad\n▁glance\n▁glancing\n▁glass\n▁gleam\n▁glen\n▁glid\n▁glimmer\n▁glimpse\n▁glitter\n▁globe\n▁gloom\n▁gloomy\n▁glorious\n▁glory\n▁glove\n▁glow\n▁go\n▁goat\n▁god\n▁goddess\n▁goes\n▁going\n▁gold\n▁golden\n▁gone\n▁good\n▁gorgeous\n▁gospel\n▁gossip\n▁got\n▁govern\n▁government\n▁governor\n▁gown\n▁gra\n▁grace\n▁graceful\n▁gracious\n▁gradually\n▁grand\n▁grandfather\n▁grandmother\n▁granite\n▁grant\n▁grasp\n▁grass\n▁grateful\n▁gratify\n▁gratitude\n▁grave\n▁gravity\n▁gray\n▁gre\n▁great\n▁greater\n▁greatest\n▁greatly\n▁greek\n▁green\n▁grew\n▁grey\n▁gri\n▁grief\n▁grieve\n▁grim\n▁grin\n▁gro\n▁groan\n▁ground\n▁group\n▁grove\n▁grow\n▁growing\n▁growl\n▁grown\n▁growth\n▁gu\n▁guard\n▁guess\n▁guest\n▁guide\n▁guilt\n▁guilty\n▁guinea\n▁gun\n▁ha\n▁habit\n▁habitual\n▁had\n▁hair\n▁hale\n▁half\n▁hall\n▁halt\n▁ham\n▁hamilton\n▁hammer\n▁hand\n▁handkerchief\n▁hands\n▁handsome\n▁hang\n▁hanging\n▁hans\n▁happen\n▁happened\n▁happier\n▁happily\n▁happiness\n▁happy\n▁har\n▁harbor\n▁harbour\n▁hard\n▁hardly\n▁harm\n▁harmoni\n▁harmony\n▁harry\n▁harsh\n▁harvest\n▁has\n▁haste\n▁hastened\n▁hastily\n▁hat\n▁hate\n▁hath\n▁hatred\n▁haunt\n▁have\n▁haven\n▁having\n▁hawk\n▁hay\n▁he\n▁head\n▁heads\n▁health\n▁heap\n▁hear\n▁heard\n▁hearing\n▁heart\n▁heat\n▁heaven\n▁heavily\n▁heavy\n▁hebrew\n▁hedge\n▁height\n▁held\n▁helen\n▁help\n▁helpless\n▁hence\n▁henry\n▁her\n▁herbert\n▁hercules\n▁here\n▁hero\n▁herself\n▁hesitate\n▁hesitated\n▁hesitation\n▁hi\n▁hid\n▁hidden\n▁hide\n▁hideous\n▁high\n▁higher\n▁highest\n▁hill\n▁hills\n▁him\n▁himself\n▁hind\n▁hint\n▁his\n▁history\n▁hit\n▁hither\n▁
hitherto\n▁ho\n▁hoarse\n▁hold\n▁holding\n▁hole\n▁holiday\n▁holland\n▁hollow\n▁holy\n▁home\n▁honest\n▁honey\n▁honor\n▁honour\n▁hook\n▁hope\n▁hoped\n▁hopeless\n▁hoping\n▁hori\n▁horn\n▁horrible\n▁horrid\n▁horror\n▁horse\n▁horses\n▁hospital\n▁host\n▁hot\n▁hotel\n▁hour\n▁hours\n▁house\n▁household\n▁housekeeper\n▁houses\n▁how\n▁however\n▁hu\n▁huge\n▁hum\n▁human\n▁humanity\n▁humble\n▁humor\n▁humour\n▁hundred\n▁hung\n▁hunger\n▁hungry\n▁hunt\n▁hunter\n▁hunting\n▁hurried\n▁hurry\n▁hurt\n▁husband\n▁hush\n▁hut\n▁hy\n▁hymn\n▁hypnoti\n▁i\n▁ice\n▁idea\n▁ideal\n▁ideas\n▁identity\n▁idiot\n▁idle\n▁if\n▁ignor\n▁ignorance\n▁ignorant\n▁ill\n▁illusion\n▁illustrat\n▁image\n▁imagination\n▁imagine\n▁imitat\n▁immediate\n▁immediately\n▁immense\n▁immortal\n▁imp\n▁impart\n▁impatience\n▁impatient\n▁imperfect\n▁imperial\n▁import\n▁importance\n▁important\n▁impossible\n▁impressed\n▁impression\n▁improve\n▁improvement\n▁impulse\n▁in\n▁incapable\n▁incense\n▁incessant\n▁inches\n▁incident\n▁inclination\n▁inclined\n▁includ\n▁income\n▁increase\n▁increased\n▁increasing\n▁indeed\n▁independence\n▁independent\n▁india\n▁indian\n▁indians\n▁indifference\n▁indifferent\n▁indignant\n▁indignation\n▁individual\n▁induce\n▁indulge\n▁industry\n▁inevitable\n▁infant\n▁inferior\n▁infinite\n▁inflict\n▁influence\n▁information\n▁informed\n▁inhabit\n▁inhabitants\n▁inherit\n▁injured\n▁injury\n▁injustice\n▁innocence\n▁innocent\n▁innumerable\n▁inquire\n▁inquired\n▁inquiries\n▁inquiry\n▁insect\n▁inside\n▁insist\n▁inspector\n▁instance\n▁instant\n▁instantly\n▁instead\n▁instinct\n▁instinctively\n▁institution\n▁instruct\n▁instrument\n▁insult\n▁intellect\n▁intellectual\n▁intelligence\n▁intelligent\n▁intelligible\n▁intend\n▁intended\n▁intense\n▁intensity\n▁intent\n▁intention\n▁inter\n▁intercourse\n▁interest\n▁interested\n▁interesting\n▁interfere\n▁internal\n▁interposed\n▁interpret\n▁interrupt\n▁interrupted\n▁interval\n▁interven\n▁interview\n▁intimacy\n▁intimate\n▁into\n▁introduced\n▁invariably\n▁invent\n▁investigat\n▁invisible\n▁invitat
ion\n▁invited\n▁ireland\n▁irish\n▁iron\n▁irre\n▁irregular\n▁irresistible\n▁is\n▁isabel\n▁island\n▁isn\n▁issue\n▁it\n▁italian\n▁italy\n▁its\n▁itself\n▁j\n▁ja\n▁jack\n▁jackson\n▁jacob\n▁james\n▁jane\n▁january\n▁japanese\n▁jar\n▁jasper\n▁jaw\n▁je\n▁jealous\n▁jean\n▁jerk\n▁jerry\n▁jerusalem\n▁jest\n▁jesus\n▁jew\n▁jewel\n▁jim\n▁jimmie\n▁jimmy\n▁jo\n▁job\n▁joe\n▁john\n▁johnson\n▁join\n▁joined\n▁joke\n▁jolly\n▁jones\n▁joseph\n▁journal\n▁journey\n▁joy\n▁ju\n▁jud\n▁judge\n▁judgment\n▁juice\n▁julia\n▁julie\n▁julius\n▁jump\n▁jumped\n▁june\n▁jungle\n▁just\n▁justice\n▁justify\n▁k\n▁ka\n▁kate\n▁katy\n▁keen\n▁keep\n▁keeping\n▁keith\n▁ken\n▁kennedy\n▁kept\n▁kettle\n▁key\n▁ki\n▁kick\n▁kill\n▁killed\n▁kind\n▁kindly\n▁kindness\n▁king\n▁kingdom\n▁kiss\n▁kissed\n▁kit\n▁kitchen\n▁kitty\n▁knee\n▁knees\n▁knelt\n▁knew\n▁knife\n▁knight\n▁knit\n▁knock\n▁knot\n▁know\n▁knowing\n▁knowledge\n▁known\n▁knows\n▁ko\n▁la\n▁labor\n▁labour\n▁lace\n▁lack\n▁lad\n▁ladder\n▁ladies\n▁lady\n▁laid\n▁lake\n▁lamb\n▁lament\n▁lamp\n▁land\n▁landlord\n▁landscape\n▁lane\n▁language\n▁lantern\n▁lap\n▁large\n▁larger\n▁last\n▁late\n▁later\n▁latter\n▁laugh\n▁laughed\n▁laughing\n▁laughter\n▁launcelot\n▁launch\n▁laura\n▁law\n▁laws\n▁lawyer\n▁lay\n▁le\n▁lead\n▁leader\n▁leading\n▁leaf\n▁league\n▁lean\n▁leaned\n▁leaning\n▁leap\n▁learn\n▁learned\n▁least\n▁leather\n▁leave\n▁leaves\n▁leaving\n▁lecture\n▁led\n▁left\n▁leg\n▁legend\n▁legislature\n▁legs\n▁leisure\n▁lemon\n▁lend\n▁length\n▁leonora\n▁less\n▁lesson\n▁lest\n▁let\n▁letter\n▁letters\n▁level\n▁levin\n▁li\n▁liberal\n▁liberty\n▁library\n▁lie\n▁lies\n▁lieutenant\n▁life\n▁lift\n▁lifted\n▁light\n▁lightning\n▁like\n▁liked\n▁likely\n▁likewise\n▁limb\n▁limit\n▁lin\n▁lincoln\n▁line\n▁lines\n▁linger\n▁lion\n▁lips\n▁liquid\n▁liquor\n▁list\n▁listen\n▁listened\n▁listening\n▁literally\n▁literary\n▁literature\n▁little\n▁live\n▁lived\n▁lives\n▁living\n▁lo\n▁load\n▁loaf\n▁local\n▁lock\n▁locked\n▁lodge\n▁lodging\n▁lofty\n▁log\n▁london\n▁lonely\n▁long\n▁longer\n▁look\n▁looked\n▁looking\n▁look
s\n▁loose\n▁lord\n▁lose\n▁losing\n▁loss\n▁lost\n▁lot\n▁loud\n▁louis\n▁loung\n▁love\n▁loved\n▁lovely\n▁lover\n▁loving\n▁low\n▁lower\n▁loyal\n▁lu\n▁luc\n▁luck\n▁lucy\n▁lunch\n▁luxury\n▁lying\n▁lyn\n▁m\n▁ma\n▁mac\n▁machine\n▁mad\n▁madam\n▁madame\n▁made\n▁mademoiselle\n▁maggie\n▁magic\n▁magician\n▁magistrate\n▁magnificent\n▁maid\n▁maiden\n▁main\n▁maintain\n▁majesty\n▁major\n▁majority\n▁make\n▁makes\n▁making\n▁mal\n▁male\n▁mamma\n▁man\n▁manage\n▁managed\n▁manifest\n▁mankind\n▁manner\n▁manufacture\n▁manuscript\n▁many\n▁mar\n▁marble\n▁march\n▁margaret\n▁marguerite\n▁marian\n▁marilla\n▁mark\n▁marked\n▁market\n▁marquis\n▁marriage\n▁married\n▁marry\n▁marsh\n▁martha\n▁martian\n▁martin\n▁martyr\n▁marvel\n▁marvellous\n▁mary\n▁mask\n▁mass\n▁master\n▁mat\n▁match\n▁mate\n▁material\n▁matter\n▁matters\n▁matthew\n▁maxim\n▁may\n▁maybe\n▁me\n▁meadow\n▁meal\n▁mean\n▁meaning\n▁means\n▁meant\n▁meantime\n▁meanwhile\n▁measure\n▁meat\n▁mechanical\n▁medi\n▁medical\n▁medicine\n▁meet\n▁meeting\n▁melancholy\n▁member\n▁members\n▁memories\n▁memory\n▁men\n▁mental\n▁mention\n▁mentioned\n▁mer\n▁merchant\n▁mercy\n▁mere\n▁merely\n▁merit\n▁merry\n▁message\n▁messenger\n▁met\n▁metal\n▁method\n▁mexican\n▁mi\n▁michael\n▁mid\n▁middle\n▁midnight\n▁midst\n▁might\n▁mighty\n▁mil\n▁mild\n▁mile\n▁miles\n▁military\n▁milk\n▁mill\n▁million\n▁min\n▁mind\n▁mine\n▁mingled\n▁minister\n▁minute\n▁minutes\n▁miracle\n▁mirror\n▁mirth\n▁mis\n▁mischief\n▁miserable\n▁misery\n▁misfortune\n▁miss\n▁mission\n▁missus\n▁mist\n▁mistake\n▁mistaken\n▁mister\n▁mistress\n▁mitya\n▁mix\n▁mixture\n▁mo\n▁mock\n▁mode\n▁moderate\n▁modern\n▁modest\n▁moment\n▁mon\n▁monarch\n▁monday\n▁money\n▁monk\n▁monkey\n▁monsieur\n▁monster\n▁monstrous\n▁monte\n▁month\n▁months\n▁monument\n▁mood\n▁moon\n▁moonlight\n▁mor\n▁moral\n▁more\n▁moreover\n▁morning\n▁morrow\n▁mortal\n▁moscow\n▁most\n▁mother\n▁motion\n▁motionless\n▁motive\n▁motor\n▁mould\n▁mount\n▁mountain\n▁mountains\n▁mounted\n▁mourn\n▁mouse\n▁mouth\n▁move\n▁moved\n▁movement\n▁moving\n▁mu\n▁much\n▁mud\n▁mu
le\n▁multitude\n▁murder\n▁murderer\n▁murmur\n▁murmured\n▁muscle\n▁muscular\n▁music\n▁musket\n▁must\n▁muttered\n▁mutual\n▁my\n▁myself\n▁mysterious\n▁mystery\n▁na\n▁nail\n▁naked\n▁name\n▁named\n▁nancy\n▁napoleon\n▁narrat\n▁narrow\n▁natasha\n▁nation\n▁national\n▁native\n▁natural\n▁naturally\n▁nature\n▁naught\n▁nautilus\n▁nav\n▁navigat\n▁nay\n▁ne\n▁near\n▁nearer\n▁nearest\n▁nearly\n▁neat\n▁necessarily\n▁necessary\n▁necessity\n▁neck\n▁need\n▁needed\n▁neglect\n▁negro\n▁neighbor\n▁neighborhood\n▁neighbour\n▁neighbourhood\n▁neither\n▁nephew\n▁nerve\n▁nervous\n▁nest\n▁never\n▁nevertheless\n▁new\n▁news\n▁newspaper\n▁next\n▁ni\n▁nice\n▁nicholas\n▁niece\n▁nigh\n▁night\n▁nightingale\n▁nine\n▁nineteen\n▁ninety\n▁ninth\n▁no\n▁nobility\n▁noble\n▁nobody\n▁nodded\n▁noise\n▁none\n▁nonsense\n▁nor\n▁normal\n▁norman\n▁north\n▁northern\n▁nose\n▁not\n▁note\n▁nothing\n▁notice\n▁noticed\n▁notwithstanding\n▁novel\n▁november\n▁now\n▁nowhere\n▁nu\n▁number\n▁numerous\n▁nurse\n▁nut\n▁o\n▁oak\n▁oath\n▁ob\n▁obedience\n▁obey\n▁object\n▁objection\n▁obligation\n▁obliged\n▁obscure\n▁observation\n▁observe\n▁observed\n▁observing\n▁obstacle\n▁obstinate\n▁obtain\n▁obtained\n▁obvious\n▁occasion\n▁occasionally\n▁occupation\n▁occupied\n▁occupy\n▁occur\n▁occurred\n▁occurrence\n▁ocean\n▁october\n▁odd\n▁of\n▁off\n▁offend\n▁offer\n▁offered\n▁office\n▁officer\n▁officers\n▁official\n▁often\n▁oh\n▁oil\n▁old\n▁oliver\n▁on\n▁once\n▁one\n▁only\n▁open\n▁opened\n▁opening\n▁opera\n▁operation\n▁opinion\n▁opponent\n▁opportunity\n▁opposite\n▁opposition\n▁oppress\n▁or\n▁orange\n▁orchard\n▁order\n▁ordered\n▁orders\n▁ordinary\n▁organ\n▁organi\n▁origin\n▁original\n▁ornament\n▁other\n▁others\n▁otherwise\n▁ought\n▁ounce\n▁our\n▁ourselves\n▁out\n▁outrage\n▁outside\n▁oven\n▁over\n▁overcome\n▁overflow\n▁overlook\n▁overtake\n▁overwhelm\n▁owe\n▁owing\n▁owl\n▁own\n▁oyster\n▁p\n▁pa\n▁pace\n▁pacific\n▁pack\n▁page\n▁paid\n▁pain\n▁painful\n▁painted\n▁pair\n▁pal\n▁palace\n▁pale\n▁palm\n▁pan\n▁papa\n▁paper\n▁papers\n▁par\n▁para\n▁paradise\n▁p
arallel\n▁parcel\n▁pardon\n▁parents\n▁paris\n▁park\n▁parliament\n▁parlor\n▁parlour\n▁part\n▁particle\n▁particular\n▁particularly\n▁parties\n▁partner\n▁parts\n▁party\n▁pass\n▁passage\n▁passed\n▁passenger\n▁passing\n▁passion\n▁passionate\n▁past\n▁pat\n▁patch\n▁path\n▁patience\n▁patient\n▁patriot\n▁paul\n▁pause\n▁paused\n▁pavement\n▁paw\n▁pay\n▁pe\n▁pea\n▁peace\n▁peak\n▁pearl\n▁peasant\n▁peculiar\n▁peep\n▁peer\n▁pen\n▁pencil\n▁penetrate\n▁penny\n▁people\n▁pepper\n▁per\n▁perceive\n▁perceived\n▁perceiving\n▁perception\n▁perch\n▁perfect\n▁perfection\n▁perfectly\n▁perform\n▁performance\n▁perfume\n▁perhaps\n▁peril\n▁period\n▁perish\n▁permanent\n▁permission\n▁permit\n▁permitted\n▁perpetual\n▁perplex\n▁persecut\n▁persist\n▁person\n▁personal\n▁persons\n▁persuade\n▁pet\n▁peter\n▁pharaoh\n▁phenomena\n▁phenomenon\n▁phil\n▁philadelphia\n▁philip\n▁philosopher\n▁philosophy\n▁phoenix\n▁photograph\n▁phrase\n▁physical\n▁physician\n▁pi\n▁piano\n▁pick\n▁picked\n▁picture\n▁piece\n▁pieces\n▁pierced\n▁pierre\n▁pig\n▁pile\n▁pilgrim\n▁pill\n▁pillow\n▁pilot\n▁pin\n▁pine\n▁pink\n▁pinocchio\n▁pipe\n▁pirate\n▁pistol\n▁pit\n▁pitch\n▁pitiful\n▁pity\n▁pla\n▁plac\n▁place\n▁placed\n▁places\n▁plague\n▁plain\n▁plainly\n▁plan\n▁planet\n▁plant\n▁plate\n▁platform\n▁play\n▁played\n▁playing\n▁plea\n▁pleasant\n▁please\n▁pleased\n▁pleasure\n▁pledge\n▁plenty\n▁plot\n▁plough\n▁pluck\n▁plum\n▁plunder\n▁plunge\n▁po\n▁pocket\n▁poem\n▁poet\n▁poetry\n▁point\n▁pointed\n▁poison\n▁pole\n▁police\n▁policy\n▁polish\n▁polite\n▁political\n▁politics\n▁polly\n▁pond\n▁pony\n▁pool\n▁poor\n▁pope\n▁popular\n▁population\n▁porch\n▁port\n▁porthos\n▁portion\n▁portrait\n▁position\n▁positive\n▁possess\n▁possessed\n▁possession\n▁possibility\n▁possible\n▁possibly\n▁post\n▁pot\n▁pound\n▁pounds\n▁pour\n▁poverty\n▁powder\n▁power\n▁powerful\n▁powers\n▁pra\n▁practical\n▁practice\n▁practise\n▁prairie\n▁praise\n▁pray\n▁prayer\n▁pre\n▁preach\n▁precaution\n▁precede\n▁preceding\n▁precious\n▁precise\n▁precisely\n▁prefer\n▁preferred\n▁prejudice\n▁pre
paration\n▁prepare\n▁prepared\n▁preparing\n▁presence\n▁present\n▁presented\n▁presently\n▁preserv\n▁president\n▁press\n▁pressed\n▁pressure\n▁presume\n▁pretend\n▁pretty\n▁prevail\n▁prevent\n▁previous\n▁pri\n▁price\n▁pride\n▁priest\n▁primitive\n▁prince\n▁princess\n▁principal\n▁principle\n▁print\n▁priscilla\n▁prison\n▁prisoner\n▁private\n▁privilege\n▁pro\n▁probability\n▁probable\n▁probably\n▁problem\n▁proceed\n▁proceeded\n▁process\n▁proclaim\n▁procure\n▁produce\n▁produced\n▁producing\n▁product\n▁profess\n▁profession\n▁professor\n▁profit\n▁profound\n▁progress\n▁prohibit\n▁project\n▁prominent\n▁promise\n▁promised\n▁promising\n▁promote\n▁prompt\n▁pronounc\n▁proof\n▁prop\n▁proper\n▁properly\n▁property\n▁prophet\n▁proportion\n▁proposal\n▁propose\n▁proposed\n▁proposition\n▁proprietor\n▁prospect\n▁prosperity\n▁protect\n▁protection\n▁protest\n▁proud\n▁prove\n▁proved\n▁proverb\n▁provide\n▁provided\n▁province\n▁provision\n▁provoke\n▁prudence\n▁prudent\n▁psmith\n▁pu\n▁public\n▁publish\n▁puff\n▁pull\n▁pulled\n▁pulse\n▁punish\n▁punishment\n▁pupil\n▁pur\n▁purchase\n▁pure\n▁purple\n▁purpose\n▁purse\n▁pursue\n▁pursued\n▁pursuit\n▁push\n▁pushed\n▁put\n▁putting\n▁qua\n▁quaint\n▁qualities\n▁quality\n▁quantity\n▁quarrel\n▁quarter\n▁queen\n▁queer\n▁question\n▁questions\n▁qui\n▁quick\n▁quickly\n▁quiet\n▁quietly\n▁quite\n▁quiver\n▁quixote\n▁quo\n▁quoth\n▁r\n▁ra\n▁rabbit\n▁race\n▁rachel\n▁radiant\n▁rag\n▁rage\n▁rail\n▁railroad\n▁railway\n▁rain\n▁rainbow\n▁raise\n▁raised\n▁raising\n▁ralph\n▁ram\n▁ran\n▁rang\n▁range\n▁rank\n▁raoul\n▁rapid\n▁rapidly\n▁rare\n▁rascal\n▁rate\n▁rather\n▁rational\n▁rattl\n▁raven\n▁ray\n▁re\n▁reach\n▁reached\n▁reaction\n▁read\n▁reader\n▁readily\n▁reading\n▁ready\n▁real\n▁reali\n▁reality\n▁really\n▁rear\n▁reason\n▁rebecca\n▁rebel\n▁recall\n▁receive\n▁received\n▁receiving\n▁recent\n▁reception\n▁recess\n▁recit\n▁reckless\n▁reckon\n▁recogni\n▁recollect\n▁recollection\n▁recommend\n▁reconcil\n▁record\n▁recover\n▁recovered\n▁red\n▁reduced\n▁refer\n▁reference\n▁refined\n▁refle
ct\n▁reflection\n▁reform\n▁refrain\n▁refresh\n▁refuge\n▁refuse\n▁refused\n▁regain\n▁regard\n▁regarded\n▁regiment\n▁region\n▁regret\n▁regular\n▁regulat\n▁reign\n▁reject\n▁rejoice\n▁rejoicing\n▁relate\n▁related\n▁relation\n▁relative\n▁relax\n▁release\n▁reli\n▁relief\n▁relieve\n▁religion\n▁religious\n▁reluctant\n▁remain\n▁remained\n▁remark\n▁remarkable\n▁remarked\n▁remedy\n▁remember\n▁remembered\n▁remembrance\n▁remind\n▁remorse\n▁remote\n▁remove\n▁removed\n▁render\n▁rendered\n▁renew\n▁rent\n▁rep\n▁repair\n▁repeat\n▁repeated\n▁repent\n▁replied\n▁reply\n▁report\n▁represent\n▁representative\n▁reproach\n▁republic\n▁reputation\n▁request\n▁require\n▁required\n▁rescue\n▁resemblance\n▁resemble\n▁reserve\n▁residence\n▁resign\n▁resist\n▁resistance\n▁resolute\n▁resolution\n▁resolved\n▁resort\n▁resource\n▁respect\n▁response\n▁responsibility\n▁responsible\n▁rest\n▁restless\n▁restore\n▁restrain\n▁result\n▁resumed\n▁retain\n▁retire\n▁retired\n▁retorted\n▁retreat\n▁return\n▁returned\n▁returning\n▁rev\n▁reveal\n▁revelation\n▁revenge\n▁rever\n▁review\n▁revolt\n▁revolution\n▁reward\n▁ri\n▁ribbon\n▁rich\n▁richard\n▁richmond\n▁rid\n▁ride\n▁ridiculous\n▁riding\n▁rifle\n▁right\n▁righteous\n▁rigid\n▁ring\n▁ripe\n▁rise\n▁rising\n▁risk\n▁rival\n▁river\n▁ro\n▁road\n▁roar\n▁roast\n▁rob\n▁robber\n▁robe\n▁robert\n▁robin\n▁rock\n▁rocks\n▁rode\n▁roll\n▁rolled\n▁roman\n▁rome\n▁roof\n▁room\n▁root\n▁rope\n▁rosa\n▁rose\n▁rough\n▁round\n▁roused\n▁route\n▁row\n▁royal\n▁ru\n▁rub\n▁rubbed\n▁rubbing\n▁rude\n▁ruin\n▁rule\n▁rum\n▁run\n▁running\n▁rush\n▁rushed\n▁russia\n▁russian\n▁ruth\n▁s\n▁sa\n▁sacred\n▁sacrifice\n▁sad\n▁saddle\n▁safe\n▁safety\n▁said\n▁sail\n▁sailor\n▁saint\n▁sake\n▁sal\n▁salt\n▁salute\n▁sam\n▁same\n▁samuel\n▁san\n▁sancho\n▁sand\n▁sang\n▁sank\n▁sarah\n▁sat\n▁satisfaction\n▁satisfactory\n▁satisfied\n▁satisfy\n▁saturday\n▁sauce\n▁savage\n▁save\n▁saved\n▁saving\n▁saw\n▁say\n▁saying\n▁says\n▁sc\n▁sca\n▁scale\n▁scandal\n▁scar\n▁scarce\n▁scarcely\n▁scarecrow\n▁scarlet\n▁scattered\n▁scene\n▁scent\n▁s
ch\n▁scheme\n▁scholar\n▁school\n▁science\n▁scientific\n▁scold\n▁score\n▁scorn\n▁scotch\n▁scotland\n▁scott\n▁scoundrel\n▁scout\n▁scramble\n▁scrap\n▁scratch\n▁scream\n▁screen\n▁screw\n▁scrooge\n▁se\n▁sea\n▁seal\n▁search\n▁season\n▁seat\n▁seated\n▁second\n▁secret\n▁secretary\n▁section\n▁secure\n▁security\n▁see\n▁seeing\n▁seek\n▁seem\n▁seemed\n▁seems\n▁seen\n▁sei\n▁seldom\n▁select\n▁self\n▁selfish\n▁sell\n▁senate\n▁senator\n▁send\n▁sensation\n▁sense\n▁sensible\n▁sensitive\n▁sent\n▁sentence\n▁sentiment\n▁separate\n▁separated\n▁september\n▁ser\n▁serene\n▁sergeant\n▁series\n▁serious\n▁sermon\n▁serpent\n▁servant\n▁servants\n▁serve\n▁served\n▁service\n▁serving\n▁set\n▁setting\n▁settle\n▁settled\n▁seven\n▁seventeen\n▁seventy\n▁several\n▁severe\n▁sex\n▁sh\n▁sha\n▁shade\n▁shadow\n▁shaggy\n▁shake\n▁shakespeare\n▁shaking\n▁shall\n▁shame\n▁shape\n▁share\n▁sharp\n▁sharply\n▁shawl\n▁she\n▁sheep\n▁shelter\n▁shepherd\n▁sheriff\n▁shield\n▁shift\n▁shilling\n▁shine\n▁shining\n▁ship\n▁ships\n▁shirt\n▁shiver\n▁shock\n▁shoe\n▁shoes\n▁shone\n▁shook\n▁shoot\n▁shop\n▁shore\n▁short\n▁shot\n▁should\n▁shoulder\n▁shoulders\n▁shout\n▁shouted\n▁show\n▁showed\n▁shown\n▁shrewd\n▁shriek\n▁shrill\n▁shrink\n▁shudder\n▁shut\n▁si\n▁sick\n▁side\n▁sides\n▁siege\n▁sigh\n▁sighed\n▁sight\n▁sign\n▁signal\n▁significance\n▁significant\n▁silence\n▁silent\n▁silk\n▁silly\n▁silver\n▁similar\n▁simon\n▁simple\n▁simplicity\n▁simply\n▁sin\n▁since\n▁sing\n▁singing\n▁single\n▁singular\n▁sink\n▁sir\n▁sister\n▁sit\n▁sitting\n▁situated\n▁situation\n▁six\n▁sixteen\n▁sixty\n▁sketch\n▁ski\n▁skilful\n▁skill\n▁skin\n▁skirt\n▁skull\n▁sky\n▁slain\n▁slaughter\n▁slave\n▁slavery\n▁slaves\n▁sledge\n▁sleep\n▁sleeve\n▁slender\n▁slept\n▁slew\n▁slice\n▁slid\n▁slight\n▁slightest\n▁slightly\n▁slim\n▁slip\n▁slipped\n▁slo\n▁slope\n▁slow\n▁slowly\n▁slumber\n▁small\n▁smart\n▁smash\n▁smell\n▁smile\n▁smiled\n▁smiling\n▁smith\n▁smoke\n▁smoking\n▁smooth\n▁smot\n▁snake\n▁snap\n▁snatch\n▁sneer\n▁snow\n▁so\n▁social\n▁society\n▁soft\n▁softly\n▁soil\n▁sold
\n▁soldier\n▁soldiers\n▁solemn\n▁solicit\n▁solid\n▁solitary\n▁solitude\n▁solomon\n▁solution\n▁some\n▁somebody\n▁somehow\n▁someone\n▁something\n▁sometimes\n▁somewhat\n▁somewhere\n▁son\n▁song\n▁soon\n▁sooner\n▁sooth\n▁sorrow\n▁sorry\n▁sort\n▁sought\n▁soul\n▁sound\n▁source\n▁south\n▁southern\n▁sovereign\n▁sp\n▁space\n▁spain\n▁spake\n▁spaniard\n▁spanish\n▁spar\n▁spare\n▁spark\n▁speak\n▁speaking\n▁spear\n▁special\n▁species\n▁specimen\n▁speck\n▁spectacle\n▁spectator\n▁speculat\n▁speech\n▁speed\n▁spell\n▁spend\n▁spent\n▁sphere\n▁spi\n▁spin\n▁spirit\n▁spirits\n▁spiritual\n▁spite\n▁splash\n▁splendid\n▁splendor\n▁split\n▁spoil\n▁spoke\n▁spoken\n▁spoon\n▁sport\n▁spot\n▁sprang\n▁spread\n▁spring\n▁sprinkle\n▁spur\n▁squ\n▁square\n▁squee\n▁squire\n▁squirrel\n▁st\n▁sta\n▁stable\n▁staff\n▁stage\n▁stagger\n▁staircase\n▁stairs\n▁stalk\n▁stamp\n▁stand\n▁standard\n▁standing\n▁star\n▁stared\n▁stars\n▁start\n▁started\n▁startled\n▁state\n▁statement\n▁states\n▁station\n▁statue\n▁stay\n▁ste\n▁steadily\n▁steady\n▁steal\n▁steam\n▁steel\n▁steep\n▁step\n▁stephen\n▁stepped\n▁steps\n▁stern\n▁stick\n▁stiff\n▁still\n▁stir\n▁stirred\n▁sto\n▁stock\n▁stole\n▁stomach\n▁stone\n▁stones\n▁stood\n▁stooped\n▁stop\n▁stopped\n▁stopping\n▁store\n▁stories\n▁storm\n▁story\n▁stout\n▁straight\n▁strain\n▁strait\n▁strange\n▁stranger\n▁strap\n▁strat\n▁straw\n▁stray\n▁streak\n▁stream\n▁street\n▁streets\n▁strength\n▁stretch\n▁stretched\n▁strew\n▁stricken\n▁strict\n▁strike\n▁striking\n▁string\n▁strip\n▁stro\n▁stroke\n▁strong\n▁struck\n▁structure\n▁struggle\n▁struggling\n▁stuck\n▁student\n▁studied\n▁studies\n▁studio\n▁study\n▁stuff\n▁stumble\n▁stump\n▁stupid\n▁style\n▁su\n▁sub\n▁subdued\n▁subject\n▁sublime\n▁submit\n▁subsequent\n▁substance\n▁substantial\n▁subtle\n▁succeed\n▁succeeded\n▁success\n▁successful\n▁such\n▁sudden\n▁suddenly\n▁suffer\n▁suffered\n▁suffering\n▁suffice\n▁sufficient\n▁sufficiently\n▁suffrage\n▁sugar\n▁suggest\n▁suggested\n▁suggestion\n▁suit\n▁sullen\n▁sultan\n▁sum\n▁summer\n▁summit\n▁summon\n▁sun\n▁su
nday\n▁sunk\n▁sunlight\n▁sunrise\n▁sunset\n▁sunshine\n▁super\n▁superintend\n▁superior\n▁supper\n▁supplied\n▁supplies\n▁supply\n▁support\n▁suppose\n▁supposed\n▁supposing\n▁suppress\n▁supreme\n▁sur\n▁sure\n▁surely\n▁surface\n▁surgeon\n▁surpass\n▁surprise\n▁surprised\n▁surprising\n▁surrender\n▁surrounded\n▁surrounding\n▁survey\n▁surviv\n▁susan\n▁suspect\n▁suspicion\n▁suspicious\n▁sustain\n▁sw\n▁swa\n▁swallow\n▁swarm\n▁swear\n▁sweat\n▁sweep\n▁sweet\n▁swell\n▁swept\n▁swift\n▁swim\n▁swimming\n▁sword\n▁swore\n▁swung\n▁sy\n▁sylvia\n▁symbol\n▁sympathetic\n▁sympathi\n▁sympathy\n▁symptom\n▁system\n▁t\n▁ta\n▁table\n▁tail\n▁take\n▁taken\n▁taking\n▁tale\n▁talent\n▁talk\n▁talked\n▁talking\n▁tall\n▁tang\n▁tank\n▁tap\n▁tar\n▁task\n▁taste\n▁taught\n▁tax\n▁te\n▁tea\n▁teach\n▁teacher\n▁tear\n▁tears\n▁teeth\n▁telegraph\n▁telephone\n▁tell\n▁telling\n▁temper\n▁temperament\n▁temperature\n▁tempest\n▁temple\n▁temporary\n▁tempt\n▁temptation\n▁ten\n▁tendency\n▁tender\n▁tenderness\n▁term\n▁terms\n▁terrace\n▁terrible\n▁terribly\n▁terrified\n▁territory\n▁terror\n▁test\n▁testimony\n▁text\n▁th\n▁than\n▁thank\n▁that\n▁the\n▁theatre\n▁their\n▁them\n▁themselves\n▁then\n▁there\n▁therefore\n▁thereupon\n▁these\n▁they\n▁thick\n▁thief\n▁thieves\n▁thin\n▁thing\n▁things\n▁think\n▁thinking\n▁third\n▁thirst\n▁thirteen\n▁thirty\n▁this\n▁thither\n▁thomas\n▁thornton\n▁thorough\n▁thoroughly\n▁those\n▁thou\n▁though\n▁thought\n▁thoughtfully\n▁thoughts\n▁thousand\n▁thread\n▁threat\n▁threatened\n▁threatening\n▁three\n▁threshold\n▁threw\n▁thrill\n▁thro\n▁throat\n▁throne\n▁throng\n▁through\n▁throughout\n▁throw\n▁throwing\n▁thrown\n▁thrust\n▁thumb\n▁thunder\n▁thus\n▁thy\n▁thyself\n▁ti\n▁ticket\n▁tide\n▁tidings\n▁tied\n▁tight\n▁till\n▁timber\n▁time\n▁times\n▁timid\n▁tin\n▁tiny\n▁tip\n▁tired\n▁title\n▁to\n▁tobacco\n▁today\n▁together\n▁told\n▁tom\n▁tomb\n▁tomorrow\n▁tone\n▁tongue\n▁too\n▁took\n▁top\n▁torment\n▁torrent\n▁torture\n▁total\n▁touch\n▁touched\n▁toward\n▁towards\n▁tower\n▁town\n▁tra\n▁trace\n▁track\n▁trade\n▁tradi
tion\n▁tragedy\n▁tragic\n▁trail\n▁train\n▁traitor\n▁tramp\n▁tranquil\n▁trans\n▁transport\n▁trap\n▁travel\n▁traveller\n▁tre\n▁tread\n▁treasure\n▁treat\n▁treated\n▁treatment\n▁tree\n▁trees\n▁tremble\n▁trembled\n▁trembling\n▁tremendous\n▁trench\n▁tri\n▁trial\n▁tribe\n▁trick\n▁tried\n▁trifle\n▁trifling\n▁trip\n▁tristram\n▁triumph\n▁triumphant\n▁troop\n▁troops\n▁trot\n▁trouble\n▁troubled\n▁trousers\n▁trout\n▁tru\n▁true\n▁truly\n▁trumpet\n▁trunk\n▁trust\n▁truth\n▁try\n▁trying\n▁tu\n▁tuesday\n▁tulliver\n▁tumble\n▁tumult\n▁turkey\n▁turn\n▁turned\n▁turning\n▁turtle\n▁twas\n▁twelve\n▁twentieth\n▁twenty\n▁twice\n▁twilight\n▁twin\n▁twist\n▁two\n▁type\n▁tyrant\n▁ugly\n▁ultimate\n▁umbrella\n▁un\n▁unable\n▁unc\n▁uncertain\n▁uncle\n▁uncomfortable\n▁uncommon\n▁unconscious\n▁und\n▁under\n▁underneath\n▁understand\n▁understanding\n▁understood\n▁undertake\n▁undertaking\n▁undoubtedly\n▁uneasiness\n▁uneasy\n▁unexpected\n▁unfortunate\n▁unhappy\n▁uniform\n▁union\n▁united\n▁universal\n▁universe\n▁university\n▁unjust\n▁unknown\n▁unless\n▁unlike\n▁unnatural\n▁unnecessary\n▁unpleasant\n▁unre\n▁unseen\n▁until\n▁unto\n▁unusual\n▁unwilling\n▁unworthy\n▁up\n▁upon\n▁upper\n▁upstairs\n▁urge\n▁us\n▁use\n▁used\n▁useful\n▁useless\n▁usual\n▁usually\n▁utili\n▁utmost\n▁utter\n▁uttered\n▁utterly\n▁va\n▁vacant\n▁vague\n▁vain\n▁val\n▁valentine\n▁valjean\n▁valley\n▁valuable\n▁value\n▁van\n▁vanished\n▁vari\n▁variety\n▁various\n▁vast\n▁vault\n▁ve\n▁vegetable\n▁vehicle\n▁veil\n▁velvet\n▁ven\n▁vengeance\n▁venture\n▁ventured\n▁ver\n▁verse\n▁very\n▁vessel\n▁vexed\n▁vi\n▁vibrat\n▁vice\n▁victim\n▁victor\n▁victory\n▁view\n▁vigorous\n▁village\n▁villain\n▁villefort\n▁vine\n▁violence\n▁violent\n▁violet\n▁virgin\n▁virginia\n▁virtue\n▁virtuous\n▁visible\n▁vision\n▁visit\n▁visitor\n▁vital\n▁vivid\n▁vo\n▁voice\n▁vol\n▁volume\n▁volunteer\n▁vote\n▁vow\n▁voyage\n▁vulgar\n▁w\n▁wa\n▁wag\n▁wagon\n▁waist\n▁waistcoat\n▁wait\n▁waited\n▁waiting\n▁wake\n▁wal\n▁walk\n▁walked\n▁walking\n▁wall\n▁walls\n▁walter\n▁wander\n▁wandering\n▁want\n
▁wanted\n▁war\n▁warm\n▁warn\n▁warning\n▁warrant\n▁warrior\n▁was\n▁wash\n▁washington\n▁watch\n▁watched\n▁watching\n▁water\n▁wave\n▁waves\n▁waving\n▁wax\n▁way\n▁ways\n▁we\n▁weak\n▁weakness\n▁wealth\n▁weapon\n▁wear\n▁weary\n▁weather\n▁wedding\n▁week\n▁weeks\n▁weep\n▁weigh\n▁weight\n▁welcome\n▁welfare\n▁well\n▁went\n▁wept\n▁were\n▁west\n▁western\n▁wh\n▁whale\n▁what\n▁whatever\n▁wheat\n▁wheel\n▁when\n▁whence\n▁where\n▁wherefore\n▁whereupon\n▁whether\n▁whi\n▁which\n▁while\n▁whilst\n▁whip\n▁whirl\n▁whisk\n▁whisper\n▁whispered\n▁whistle\n▁white\n▁whither\n▁who\n▁whole\n▁wholly\n▁whom\n▁whose\n▁why\n▁wi\n▁wicked\n▁wide\n▁widow\n▁wife\n▁wild\n▁wilderness\n▁will\n▁william\n▁willing\n▁wilson\n▁wilt\n▁win\n▁wind\n▁window\n▁windows\n▁wine\n▁wings\n▁winter\n▁wip\n▁wire\n▁wisdom\n▁wise\n▁wish\n▁wished\n▁wishes\n▁wit\n▁witch\n▁with\n▁withdraw\n▁withdrew\n▁within\n▁without\n▁witness\n▁wives\n▁woe\n▁woke\n▁wolf\n▁wolves\n▁woman\n▁women\n▁won\n▁wonder\n▁wondered\n▁wonderful\n▁wondering\n▁wood\n▁wooden\n▁woods\n▁word\n▁words\n▁wore\n▁work\n▁worked\n▁working\n▁world\n▁worm\n▁worn\n▁worried\n▁worry\n▁worse\n▁worship\n▁worst\n▁worth\n▁worthy\n▁would\n▁wouldn\n▁wound\n▁wounded\n▁wrap\n▁wrapped\n▁wrath\n▁wreck\n▁wren\n▁wretch\n▁wretched\n▁wrinkl\n▁wrist\n▁write\n▁writer\n▁writing\n▁written\n▁wrong\n▁wrote\n▁wrought\n▁ya\n▁yard\n▁ye\n▁year\n▁years\n▁yellow\n▁yes\n▁yesterday\n▁yet\n▁yield\n▁yo\n▁yonder\n▁york\n▁you\n▁young\n▁your\n▁yourself\n▁yourselves\n▁youth\n<eos>\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nimport sys\n\nimport numpy as np\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\n\nimport paddle\nimport soundfile as sf\n\n# TODO: Remove system path when deepspeech can be installed via pip.\nsys.path.append(os.path.join(MODULE_HOME, 'u2_conformer_librispeech'))\nfrom deepspeech.exps.u2.config import get_cfg_defaults\nfrom deepspeech.utils.utility import UpdateConfig\nfrom .u2_conformer_tester import U2ConformerTester\n\n\n@moduleinfo(\n    name=\"u2_conformer_librispeech\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/asr\")\nclass U2Conformer(paddle.nn.Layer):\n    def __init__(self):\n        super(U2Conformer, self).__init__()\n\n        # resource\n        res_dir = os.path.join(MODULE_HOME, 'u2_conformer_librispeech', 'assets')\n        conf_file = os.path.join(res_dir, 'conf/conformer.yaml')\n        checkpoint = os.path.join(res_dir, 'checkpoints/avg_30.pdparams')\n\n        # config\n        self.config = get_cfg_defaults()\n        self.config.merge_from_file(conf_file)\n\n        # TODO: Remove path updating snippet.\n        with UpdateConfig(self.config):\n            self.config.collator.vocab_filepath = os.path.join(res_dir, 
self.config.collator.vocab_filepath)\n            self.config.collator.spm_model_prefix = os.path.join(res_dir, self.config.collator.spm_model_prefix)\n            self.config.collator.augmentation_config = os.path.join(res_dir, self.config.collator.augmentation_config)\n            self.config.model.cmvn_file = os.path.join(res_dir, self.config.model.cmvn_file)\n            self.config.decoding.decoding_method = 'attention_rescoring'\n            self.config.decoding.batch_size = 1\n\n        # model\n        self.tester = U2ConformerTester(self.config)\n        self.tester.setup_model()\n        self.tester.resume(checkpoint)\n\n    @staticmethod\n    def check_audio(audio_file):\n        sig, sample_rate = sf.read(audio_file)\n        assert sample_rate == 16000, 'Expected sample rate of input audio to be 16000, but got {}'.format(sample_rate)\n\n    @serving\n    def speech_recognize(self, audio_file, device='cpu'):\n        assert os.path.isfile(audio_file), 'File does not exist: {}'.format(audio_file)\n        self.check_audio(audio_file)\n\n        paddle.set_device(device)\n        return self.tester.test(audio_file)[0][0]\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/requirements.txt",
    "content": "loguru\nyacs\njsonlines\nscipy==1.2.1\nsentencepiece\nresampy==0.2.2\nSoundFile==0.9.0.post1\nsoxbindings\nkaldiio\ntypeguard\neditdistance\ntextgrid\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_librispeech/u2_conformer_tester.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Evaluation for U2 model.\"\"\"\nimport os\nimport sys\n\nimport paddle\n\nfrom deepspeech.frontend.featurizer.text_featurizer import TextFeaturizer\nfrom deepspeech.io.collator import SpeechCollator\nfrom deepspeech.models.u2 import U2Model\nfrom deepspeech.utils import mp_tools\nfrom deepspeech.utils.utility import UpdateConfig\n\n\nclass U2ConformerTester:\n    def __init__(self, config):\n        self.config = config\n        self.collate_fn_test = SpeechCollator.from_config(config)\n        self._text_featurizer = TextFeaturizer(\n            unit_type=config.collator.unit_type, vocab_filepath=None, spm_model_prefix=config.collator.spm_model_prefix)\n\n    @mp_tools.rank_zero_only\n    @paddle.no_grad()\n    def test(self, audio_file):\n        self.model.eval()\n        cfg = self.config.decoding\n        collate_fn_test = self.collate_fn_test\n        audio, _ = collate_fn_test.process_utterance(audio_file=audio_file, transcript=\"Hello\")\n        audio_len = audio.shape[0]\n        audio = paddle.to_tensor(audio, dtype='float32')\n        audio_len = paddle.to_tensor(audio_len)\n        audio = paddle.unsqueeze(audio, axis=0)\n        vocab_list = collate_fn_test.vocab_list\n\n        text_feature = self.collate_fn_test.text_feature\n        result_transcripts = self.model.decode(\n            audio,\n            
audio_len,\n            text_feature=text_feature,\n            decoding_method=cfg.decoding_method,\n            lang_model_path=cfg.lang_model_path,\n            beam_alpha=cfg.alpha,\n            beam_beta=cfg.beta,\n            beam_size=cfg.beam_size,\n            cutoff_prob=cfg.cutoff_prob,\n            cutoff_top_n=cfg.cutoff_top_n,\n            num_processes=cfg.num_proc_bsearch,\n            ctc_weight=cfg.ctc_weight,\n            decoding_chunk_size=cfg.decoding_chunk_size,\n            num_decoding_left_chunks=cfg.num_decoding_left_chunks,\n            simulate_streaming=cfg.simulate_streaming)\n\n        return result_transcripts\n\n    def setup_model(self):\n        config = self.config.clone()\n        with UpdateConfig(config):\n            config.model.input_dim = self.collate_fn_test.feature_size\n            config.model.output_dim = self.collate_fn_test.vocab_size\n\n        self.model = U2Model.from_config(config.model)\n\n    def resume(self, checkpoint):\n        \"\"\"Load model parameters from the specified checkpoint file.\n        \"\"\"\n        model_dict = paddle.load(checkpoint)\n        self.model.set_state_dict(model_dict)\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_wenetspeech/README.md",
    "content": "# u2_conformer_wenetspeech\n\n|模型名称|u2_conformer_wenetspeech|\n| :--- | :---: |\n|类别|语音-语音识别|\n|网络|Conformer|\n|数据集|WenetSpeech|\n|是否支持Fine-tuning|否|\n|模型大小|494MB|\n|最新更新日期|2021-12-10|\n|数据指标|中文CER 0.087|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nU2 Conformer模型是一种适用于英文和中文的end-to-end语音识别模型。u2_conformer_wenetspeech采用了conformer encoder和transformer decoder的模型结构，解码时先使用ctc-prefix beam search生成候选并进行一遍打分，再利用attention decoder对候选进行二次打分，从而得到最终识别结果。\n\nu2_conformer_wenetspeech在中文普通话开源语音数据集[WenetSpeech](https://wenet-e2e.github.io/WenetSpeech/)上进行了预训练，该模型在其DEV测试集上的CER指标是0.087。\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/conformer.png\" hspace='10'/> <br />\n</p>\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/u2_conformer.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考:\n- [Unified Streaming and Non-streaming Two-pass End-to-end Model for Speech Recognition](https://arxiv.org/abs/2012.05481)\n- [Conformer: Convolution-augmented Transformer for Speech Recognition](https://arxiv.org/abs/2005.08100)\n- [WenetSpeech: A 10000+ Hours Multi-domain Mandarin Corpus for Speech Recognition](https://arxiv.org/abs/2110.03370)\n\n## 二、安装\n\n- ### 1、系统依赖\n\n  - libsndfile\n    - Linux\n      ```shell\n      $ sudo apt-get install libsndfile\n      or\n      $ sudo yum install libsndfile\n      ```\n    - macOS\n      ```\n      $ brew install libsndfile\n      ```\n\n- ### 2、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install u2_conformer_wenetspeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 
采样率为16k，格式为wav的中文语音音频\n    wav_file = '/PATH/TO/AUDIO'\n\n    model = hub.Module(\n        name='u2_conformer_wenetspeech',\n        version='1.0.0')\n    text = model.speech_recognize(wav_file)\n\n    print(text)\n    ```\n\n- ### 2、API\n  - ```python\n    def check_audio(audio_file)\n    ```\n    - 检查输入音频的格式和采样率是否为16000Hz，如果不满足，则重新采样至16000Hz并将新的音频文件保存至相同目录。\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n\n  - ```python\n    def speech_recognize(\n        audio_file,\n        device='cpu',\n    )\n    ```\n    - 将输入的音频识别成文字。\n\n    - **参数**\n\n      - `audio_file`：本地音频文件(*.wav)的路径，如`/path/to/input.wav`\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `text`：str类型，返回输入音频的识别文字结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m u2_conformer_wenetspeech\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如使用CPU预测，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要识别的音频的存放路径，确保部署服务的机器可访问\n    file = '/path/to/input.wav'\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"audio_file\"\n    data = {\"audio_file\": file}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/u2_conformer_wenetspeech\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install u2_conformer_wenetspeech\n  ```\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_wenetspeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/asr/u2_conformer_wenetspeech/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nfrom paddleaudio import load, save_wav\nfrom paddlespeech.cli import ASRExecutor\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"u2_conformer_wenetspeech\", version=\"1.0.0\", summary=\"\", author=\"Wenet\", author_email=\"\", type=\"audio/asr\")\nclass U2Conformer(paddle.nn.Layer):\n    def __init__(self):\n        super(U2Conformer, self).__init__()\n        self.asr_executor = ASRExecutor()\n        self.asr_kw_args = {\n            'model': 'conformer_wenetspeech',\n            'lang': 'zh',\n            'sample_rate': 16000,\n            'config': None,  # Set `config` and `ckpt_path` to None to use pretrained model.\n            'ckpt_path': None,\n        }\n\n    @staticmethod\n    def check_audio(audio_file):\n        assert audio_file.endswith('.wav'), 'Input file must be a wave file `*.wav`.'\n        sig, sample_rate = load(audio_file)\n        if sample_rate != 16000:\n            sig, _ = load(audio_file, 16000)\n            audio_file_16k = audio_file[:audio_file.rindex('.')] + '_16k.wav'\n            logger.info('Resampling to 16000 sample rate to new audio file: {}'.format(audio_file_16k))\n            save_wav(sig, 16000, audio_file_16k)\n            return audio_file_16k\n        else:\n            return 
audio_file\n\n    @serving\n    def speech_recognize(self, audio_file, device='cpu'):\n        assert os.path.isfile(audio_file), 'File does not exist: {}'.format(audio_file)\n        audio_file = self.check_audio(audio_file)\n        text = self.asr_executor(audio_file=audio_file, device=device, **self.asr_kw_args)\n        return text\n"
  },
  {
    "path": "modules/audio/asr/u2_conformer_wenetspeech/requirements.txt",
    "content": "paddlespeech==0.1.0a9\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn10/README.md",
    "content": "# panns_cnn10\n\n|模型名称|panns_cnn10|\n| :--- | :---: |\n|类别|语音-声音分类|\n|网络|PANNs|\n|数据集|Google Audioset|\n|是否支持Fine-tuning|是|\n|模型大小|31MB|\n|最新更新日期|2021-06-15|\n|数据指标|mAP 0.380|\n\n## 一、模型基本信息\n\n### 模型介绍\n\n`panns_cnn10`是一个基于[Google Audioset](https://research.google.com/audioset/)数据集训练的声音分类/识别的模型。该模型主要包含8个卷积层和2个全连接层，模型参数为4.9M。经过预训练后，可以用于提取音频的embedding，维度是512。\n\n更多详情请参考论文：[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install panns_cnn10\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    # ESC50声音分类预测\n    import librosa\n    import paddlehub as hub\n    from paddlehub.datasets import ESC50\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    checkpoint = 'model.pdparams' # 用于预测的模型参数\n\n    label_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\n    model = hub.Module(\n        name='panns_cnn10',\n        version='1.0.0',\n        task='sound-cls',\n        num_class=ESC50.num_class,\n        label_map=label_map,\n        load_checkpoint=checkpoint)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    print('File: {}\\tLabel: {}'.format(wav_file, result[0]))\n    ```\n\n  - ```python\n    # Audioset Tagging\n    import librosa\n    import numpy as np\n    import paddlehub as hub\n\n    def show_topk(k, label_map, file, result):\n        \"\"\"\n    
    展示得分最高的k个类别和分数。\n        \"\"\"\n        result = np.asarray(result)\n        topk_idx = (-result).argsort()[:k]\n        msg = f'[{file}]\\n'\n        for idx in topk_idx:\n            label, score = label_map[idx], result[idx]\n            msg += f'{label}: {score}\\n'\n        print(msg)\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    label_file = './audioset_labels.txt' # audioset标签文本文件\n    topk = 10 # 展示的topk数\n\n    label_map = {}\n    with open(label_file, 'r') as f:\n        for i, l in enumerate(f.readlines()):\n            label_map[i] = l.strip()\n\n    model = hub.Module(\n        name='panns_cnn10',\n        version='1.0.0',\n        task=None)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    show_topk(topk, label_map, wav_file, result[0])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n            task,\n            num_class=None,\n            label_map=None,\n            load_checkpoint=None,\n            **kwargs,\n    )\n    ```\n    - 创建Module对象。\n\n    - **参数**\n      - `task`： 任务名称，可为`sound-cls`或者`None`。`sound-cls`代表声音分类任务，可以对声音分类的数据集进行finetune；为`None`时可以获取预训练模型对音频进行分类/Tagging。\n      - `num_class`：声音分类任务的类别数，finetune时需要指定，数值与具体使用的数据集类别数一致。\n      - `label_map`：预测时的类别映射表。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n            data,\n            sample_rate,\n            batch_size=1,\n            feat_type='mel',\n            use_gpu=False\n    )\n    ```\n    - 模型预测，输入为音频波形数据，输出为分类标签。\n\n    - **参数**\n      - `data`： 待预测数据，格式为\[waveform1, waveform2…,\]，其中每个元素都是一个一维numpy数组，是音频的波形采样数值列表。\n      - `sample_rate`：音频文件的采样率。\n      - `feat_type`：音频特征的种类选取，当前支持`'mel'`(详情可查看[Mel-frequency cepstrum](https://en.wikipedia.org/wiki/Mel-frequency_cepstrum))和原始波形特征`'raw'`。\n      - `batch_size`：模型批处理大小。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n      - `results`：list类型，不同任务类型的返回结果如下：\n      - 声音分类(task参数为`sound-cls`)：列表里包含每个音频文件的分类标签。\n      - Tagging(task参数为`None`)：列表里包含每个音频文件527个类别([Audioset标签](https://research.google.com/audioset/))的得分。\n\n    详情可参考PaddleHub示例：\n    - [AudioClassification](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0/demo/audio_classification)\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布，动态图版本模型，支持声音分类`sound-cls`任务的fine-tune和基于Audioset Tagging预测。\n\n  ```shell\n  $ hub install panns_cnn10\n  ```\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn10/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn10/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom panns_cnn10.network import CNN10\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.audio_module import AudioClassifierModule\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"panns_cnn10\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/sound_classification\",\n    meta=AudioClassifierModule)\nclass PANN(nn.Layer):\n    def __init__(\n            self,\n            task: str,\n            num_class: int = None,\n            label_map: Dict = None,\n            load_checkpoint: str = None,\n            **kwargs,\n    ):\n        super(PANN, self).__init__()\n\n        if label_map:\n            self.label_map = label_map\n            self.num_class = len(label_map)\n        else:\n            self.num_class = num_class\n\n        if task == 'sound-cls':\n            self.cnn10 = CNN10(\n                extract_embedding=True, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn10', 'cnn10.pdparams'))\n            self.dropout = nn.Dropout(0.1)\n            self.fc = nn.Linear(self.cnn10.emb_size, num_class)\n            self.criterion = 
paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        else:\n            self.cnn10 = CNN10(\n                extract_embedding=False, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn10', 'cnn10.pdparams'))\n\n        self.task = task\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self, feats, labels=None):\n        # feats: (batch_size, num_frames, num_melbins) -> (batch_size, 1, num_frames, num_melbins)\n        feats = feats.unsqueeze(1)\n\n        if self.task == 'sound-cls':\n            embeddings = self.cnn10(feats)\n            embeddings = self.dropout(embeddings)\n            logits = self.fc(embeddings)\n            probs = F.softmax(logits, axis=1)\n\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            audioset_logits = self.cnn10(feats)\n            return audioset_logits\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn10/network.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlehub.utils.log import logger\n\n\nclass ConvBlock(nn.Layer):\n    def __init__(self, in_channels, out_channels):\n        super(ConvBlock, self).__init__()\n\n        self.conv1 = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=(3, 3),\n            stride=(1, 1),\n            padding=(1, 1),\n            bias_attr=False)\n        self.conv2 = nn.Conv2D(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=(3, 3),\n            stride=(1, 1),\n            padding=(1, 1),\n            bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(out_channels)\n        self.bn2 = nn.BatchNorm2D(out_channels)\n\n    def forward(self, x, pool_size=(2, 2), pool_type='avg'):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = F.relu(x)\n\n        x = self.conv2(x)\n        x = self.bn2(x)\n        x = F.relu(x)\n\n        if pool_type == 'max':\n            x = F.max_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg':\n            x = F.avg_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg+max':\n            x = F.avg_pool2d(x, kernel_size=pool_size) + F.max_pool2d(x, kernel_size=pool_size)\n      
  else:\n            raise Exception(\n                f'Pooling type of {pool_type} is not supported. It must be one of \"max\", \"avg\" and \"avg+max\".')\n        return x\n\n\nclass CNN10(nn.Layer):\n    emb_size = 512\n\n    def __init__(self, extract_embedding: bool = True, checkpoint: str = None):\n\n        super(CNN10, self).__init__()\n        self.bn0 = nn.BatchNorm2D(64)\n        self.conv_block1 = ConvBlock(in_channels=1, out_channels=64)\n        self.conv_block2 = ConvBlock(in_channels=64, out_channels=128)\n        self.conv_block3 = ConvBlock(in_channels=128, out_channels=256)\n        self.conv_block4 = ConvBlock(in_channels=256, out_channels=512)\n\n        self.fc1 = nn.Linear(512, self.emb_size)\n        self.fc_audioset = nn.Linear(self.emb_size, 527)\n\n        if checkpoint is not None and os.path.isfile(checkpoint):\n            state_dict = paddle.load(checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info(f'Loaded CNN10 pretrained parameters from: {checkpoint}')\n        else:\n            logger.error('No valid checkpoints for CNN10. 
Start training from scratch.')\n\n        self.extract_embedding = extract_embedding\n\n    def forward(self, x):\n        x.stop_gradient = False\n        x = x.transpose([0, 3, 2, 1])\n        x = self.bn0(x)\n        x = x.transpose([0, 3, 2, 1])\n\n        x = self.conv_block1(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block2(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block3(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block4(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = x.mean(axis=3)\n        x = x.max(axis=2) + x.mean(axis=2)\n\n        x = F.dropout(x, p=0.5, training=self.training)\n        x = F.relu(self.fc1(x))\n\n        if self.extract_embedding:\n            output = F.dropout(x, p=0.5, training=self.training)\n        else:\n            output = F.sigmoid(self.fc_audioset(x))\n        return output\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn10/requirements.txt",
    "content": "librosa\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn14/README.md",
    "content": "# panns_cnn14\n\n|模型名称|panns_cnn14|\n| :--- | :---: |\n|类别|语音-声音分类|\n|网络|PANNs|\n|数据集|Google Audioset|\n|是否支持Fine-tuning|是|\n|模型大小|469MB|\n|最新更新日期|2021-06-15|\n|数据指标|mAP 0.431|\n\n## 一、模型基本信息\n\n### 模型介绍\n\n`panns_cnn14`是一个基于[Google Audioset](https://research.google.com/audioset/)数据集训练的声音分类/识别的模型。该模型主要包含12个卷积层和2个全连接层，模型参数为79.6M。经过预训练后，可以用于提取音频的embedding，维度是2048。\n\n更多详情请参考论文：[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install panns_cnn14\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    # ESC50声音分类预测\n    import librosa\n    import paddlehub as hub\n    from paddlehub.datasets import ESC50\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    checkpoint = 'model.pdparams' # 用于预测的模型参数\n\n    label_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\n    model = hub.Module(\n        name='panns_cnn14',\n        version='1.0.0',\n        task='sound-cls',\n        num_class=ESC50.num_class,\n        label_map=label_map,\n        load_checkpoint=checkpoint)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    print('File: {}\\tLabel: {}'.format(wav_file, result[0]))\n    ```\n\n  - ```python\n    # Audioset Tagging\n    import librosa\n    import numpy as np\n    import paddlehub as hub\n\n    def show_topk(k, label_map, file, result):\n        
\"\"\"\n        展示得分最高的k个类别和分数。\n        \"\"\"\n        result = np.asarray(result)\n        topk_idx = (-result).argsort()[:k]\n        msg = f'[{file}]\\n'\n        for idx in topk_idx:\n            label, score = label_map[idx], result[idx]\n            msg += f'{label}: {score}\\n'\n        print(msg)\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    label_file = './audioset_labels.txt' # audioset标签文本文件\n    topk = 10 # 展示的topk数\n\n    label_map = {}\n    with open(label_file, 'r') as f:\n        for i, l in enumerate(f.readlines()):\n            label_map[i] = l.strip()\n\n    model = hub.Module(\n        name='panns_cnn14',\n        version='1.0.0',\n        task=None)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    show_topk(topk, label_map, wav_file, result[0])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n            task,\n            num_class=None,\n            label_map=None,\n            load_checkpoint=None,\n            **kwargs,\n    )\n    ```\n    - 创建Module对象。\n\n    - **参数**\n      - `task`： 任务名称，可为`sound-cls`或者`None`。`sound-cls`代表声音分类任务，可以对声音分类的数据集进行finetune；为`None`时可以获取预训练模型对音频进行分类/Tagging。\n      - `num_class`：声音分类任务的类别数，finetune时需要指定，数值与具体使用的数据集类别数一致。\n      - `label_map`：预测时的类别映射表。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n            data,\n            sample_rate,\n            batch_size=1,\n            feat_type='mel',\n            use_gpu=False\n    )\n    ```\n    - 模型预测，输入为音频波形数据，输出为分类标签。\n\n    - **参数**\n      - `data`： 待预测数据，格式为\[waveform1, waveform2…,\]，其中每个元素都是一个一维numpy数组，是音频的波形采样数值列表。\n      - `sample_rate`：音频文件的采样率。\n      - `feat_type`：音频特征的种类选取，当前支持`'mel'`(详情可查看[Mel-frequency cepstrum](https://en.wikipedia.org/wiki/Mel-frequency_cepstrum))和原始波形特征`'raw'`。\n      - `batch_size`：模型批处理大小。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n      - `results`：list类型，不同任务类型的返回结果如下：\n      - 声音分类(task参数为`sound-cls`)：列表里包含每个音频文件的分类标签。\n      - Tagging(task参数为`None`)：列表里包含每个音频文件527个类别([Audioset标签](https://research.google.com/audioset/))的得分。\n\n    详情可参考PaddleHub示例：\n    - [AudioClassification](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0/demo/audio_classification)\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布，动态图版本模型，支持声音分类`sound-cls`任务的fine-tune和基于Audioset Tagging预测。\n\n  ```shell\n  $ hub install panns_cnn14\n  ```\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn14/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn14/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom panns_cnn14.network import CNN14\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.audio_module import AudioClassifierModule\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"panns_cnn14\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/sound_classification\",\n    meta=AudioClassifierModule)\nclass PANN(nn.Layer):\n    def __init__(\n            self,\n            task: str,\n            num_class: int = None,\n            label_map: Dict = None,\n            load_checkpoint: str = None,\n            **kwargs,\n    ):\n        super(PANN, self).__init__()\n\n        if label_map:\n            self.label_map = label_map\n            self.num_class = len(label_map)\n        else:\n            self.num_class = num_class\n\n        if task == 'sound-cls':\n            self.cnn14 = CNN14(\n                extract_embedding=True, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn14', 'cnn14.pdparams'))\n            self.dropout = nn.Dropout(0.1)\n            self.fc = nn.Linear(self.cnn14.emb_size, num_class)\n            self.criterion = 
paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        else:\n            self.cnn14 = CNN14(\n                extract_embedding=False, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn14', 'cnn14.pdparams'))\n\n        self.task = task\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self, feats, labels=None):\n        # feats: (batch_size, num_frames, num_melbins) -> (batch_size, 1, num_frames, num_melbins)\n        feats = feats.unsqueeze(1)\n\n        if self.task == 'sound-cls':\n            embeddings = self.cnn14(feats)\n            embeddings = self.dropout(embeddings)\n            logits = self.fc(embeddings)\n            probs = F.softmax(logits, axis=1)\n\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            audioset_logits = self.cnn14(feats)\n            return audioset_logits\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn14/network.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlehub.utils.log import logger\n\n\nclass ConvBlock(nn.Layer):\n    def __init__(self, in_channels, out_channels):\n        super(ConvBlock, self).__init__()\n\n        self.conv1 = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=(3, 3),\n            stride=(1, 1),\n            padding=(1, 1),\n            bias_attr=False)\n        self.conv2 = nn.Conv2D(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=(3, 3),\n            stride=(1, 1),\n            padding=(1, 1),\n            bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(out_channels)\n        self.bn2 = nn.BatchNorm2D(out_channels)\n\n    def forward(self, x, pool_size=(2, 2), pool_type='avg'):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = F.relu(x)\n\n        x = self.conv2(x)\n        x = self.bn2(x)\n        x = F.relu(x)\n\n        if pool_type == 'max':\n            x = F.max_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg':\n            x = F.avg_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg+max':\n            x = F.avg_pool2d(x, kernel_size=pool_size) + F.max_pool2d(x, kernel_size=pool_size)\n      
  else:\n            raise Exception(\n                f'Pooling type of {pool_type} is not supported. It must be one of \"max\", \"avg\" and \"avg+max\".')\n        return x\n\n\nclass CNN14(nn.Layer):\n    emb_size = 2048\n\n    def __init__(self, extract_embedding: bool = True, checkpoint: str = None):\n\n        super(CNN14, self).__init__()\n        self.bn0 = nn.BatchNorm2D(64)\n        self.conv_block1 = ConvBlock(in_channels=1, out_channels=64)\n        self.conv_block2 = ConvBlock(in_channels=64, out_channels=128)\n        self.conv_block3 = ConvBlock(in_channels=128, out_channels=256)\n        self.conv_block4 = ConvBlock(in_channels=256, out_channels=512)\n        self.conv_block5 = ConvBlock(in_channels=512, out_channels=1024)\n        self.conv_block6 = ConvBlock(in_channels=1024, out_channels=2048)\n\n        self.fc1 = nn.Linear(2048, self.emb_size)\n        self.fc_audioset = nn.Linear(self.emb_size, 527)\n\n        if checkpoint is not None and os.path.isfile(checkpoint):\n            state_dict = paddle.load(checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info(f'Loaded CNN14 pretrained parameters from: {checkpoint}')\n        else:\n            logger.error('No valid checkpoints for CNN14. 
Start training from scratch.')\n\n        self.extract_embedding = extract_embedding\n\n    def forward(self, x):\n        x.stop_gradient = False\n        x = x.transpose([0, 3, 2, 1])\n        x = self.bn0(x)\n        x = x.transpose([0, 3, 2, 1])\n\n        x = self.conv_block1(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block2(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block3(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block4(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block5(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block6(x, pool_size=(1, 1), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = x.mean(axis=3)\n        x = x.max(axis=2) + x.mean(axis=2)\n\n        x = F.dropout(x, p=0.5, training=self.training)\n        x = F.relu(self.fc1(x))\n\n        if self.extract_embedding:\n            output = F.dropout(x, p=0.5, training=self.training)\n        else:\n            output = F.sigmoid(self.fc_audioset(x))\n        return output\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn14/requirements.txt",
    "content": "librosa\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn6/README.md",
    "content": "# panns_cnn6\n\n|模型名称|panns_cnn6|\n| :--- | :---: |\n|类别|语音-声音分类|\n|网络|PANNs|\n|数据集|Google Audioset|\n|是否支持Fine-tuning|是|\n|模型大小|29MB|\n|最新更新日期|2021-06-15|\n|数据指标|mAP 0.343|\n\n## 一、模型基本信息\n\n### 模型介绍\n\n`panns_cnn6`是一个基于[Google Audioset](https://research.google.com/audioset/)数据集训练的声音分类/识别的模型。该模型主要包含4个卷积层和2个全连接层，模型参数为4.5M。经过预训练后，可以用于提取音频的embbedding，维度是512。\n\n更多详情请参考：[PANNs: Large-Scale Pretrained Audio Neural Networks for Audio Pattern Recognition](https://arxiv.org/pdf/1912.10211.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install panns_cnn6\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    # ESC50声音分类预测\n    import librosa\n    import paddlehub as hub\n    from paddlehub.datasets import ESC50\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    checkpoint = 'model.pdparams' # 用于预测的模型参数\n\n    label_map = {idx: label for idx, label in enumerate(ESC50.label_list)}\n\n    model = hub.Module(\n        name='panns_cnn6',\n        version='1.0.0',\n        task='sound-cls',\n        num_class=ESC50.num_class,\n        label_map=label_map,\n        load_checkpoint=checkpoint)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    print('File: {}\\tLable: {}'.format(wav_file, result[0]))\n    ```\n\n  - ```python\n    # Audioset Tagging\n    import librosa\n    import numpy as np\n    import paddlehub as hub\n\n    def show_topk(k, label_map, file, result):\n        \"\"\"\n        
展示topk的分的类别和分数。\n        \"\"\"\n        result = np.asarray(result)\n        topk_idx = (-result).argsort()[:k]\n        msg = f'[{file}]\\n'\n        for idx in topk_idx:\n            label, score = label_map[idx], result[idx]\n            msg += f'{label}: {score}\\n'\n        print(msg)\n\n    sr = 44100 # 音频文件的采样率\n    wav_file = '/PATH/TO/AUDIO' # 用于预测的音频文件路径\n    label_file = './audioset_labels.txt' # audioset标签文本文件\n    topk = 10 # 展示的topk数\n\n    label_map = {}\n    with open(label_file, 'r') as f:\n        for i, l in enumerate(f.readlines()):\n            label_map[i] = l.strip()\n\n    model = hub.Module(\n        name='panns_cnn6',\n        version='1.0.0',\n        task=None)\n\n    data = [librosa.load(wav_file, sr=sr)[0]]\n    result = model.predict(\n        data,\n        sample_rate=sr,\n        batch_size=1,\n        feat_type='mel',\n        use_gpu=True)\n\n    show_topk(topk, label_map, wav_file, result[0])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n            task,\n            num_class=None,\n            label_map=None,\n            load_checkpoint=None,\n            **kwargs,\n    )\n    ```\n    - 创建Module对象。\n\n    - **参数**\n      - `task`： 任务名称，可为`sound-cls`或者`None`。`sound-cls`代表声音分类任务，可以对声音分类的数据集进行finetune；为`None`时可以获取预训练模型对音频进行分类/Tagging。\n      - `num_classes`：声音分类任务的类别数，finetune时需要指定，数值与具体使用的数据集类别数一致。\n      - `label_map`：预测时的类别映射表。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n            data,\n            sample_rate,\n            batch_size=1,\n            feat_type='mel',\n            use_gpu=False\n    )\n    ```\n    - 模型预测，输入为音频波形数据，输出为分类标签。\n\n    - **参数**\n      - `data`： 待预测数据，格式为\\[waveform1, wavwform2…,\\]，其中每个元素都是一个一维numpy列表，是音频的波形采样数值列表。\n      - `sample_rate`：音频文件的采样率。\n      - `feat_type`：音频特征的种类选取，当前支持`'mel'`(详情可查看[Mel-frequency 
cepstrum](https://en.wikipedia.org/wiki/Mel-frequency_cepstrum))和原始波形特征`'raw'`。\n      - `batch_size`：模型批处理大小。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n      - `results`：list类型，不同任务类型的返回结果如下\n      - 声音分类(task参数为`sound-cls`)：列表里包含每个音频文件的分类标签。\n      - Tagging(task参数为`None`)：列表里包含每个音频文件527个类别([Audioset标签](https://research.google.com/audioset/))的得分。\n\n    详情可参考PaddleHub示例：\n    - [AudioClassification](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0/demo/audio_classification)\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布，动态图版本模型，支持声音分类`sound-cls`任务的fine-tune和基于Audioset Tagging预测。\n\n  ```shell\n  $ hub install panns_cnn6\n  ```\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn6/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn6/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom panns_cnn6.network import CNN6\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.audio_module import AudioClassifierModule\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"panns_cnn6\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/sound_classification\",\n    meta=AudioClassifierModule)\nclass PANN(nn.Layer):\n    def __init__(\n            self,\n            task: str,\n            num_class: int = None,\n            label_map: Dict = None,\n            load_checkpoint: str = None,\n            **kwargs,\n    ):\n        super(PANN, self).__init__()\n\n        if label_map:\n            self.label_map = label_map\n            self.num_class = len(label_map)\n        else:\n            self.num_class = num_class\n\n        if task == 'sound-cls':\n            self.cnn6 = CNN6(\n                extract_embedding=True, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn6', 'cnn6.pdparams'))\n            self.dropout = nn.Dropout(0.1)\n            self.fc = nn.Linear(self.cnn6.emb_size, num_class)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n  
          self.metric = paddle.metric.Accuracy()\n        else:\n            self.cnn6 = CNN6(\n                extract_embedding=False, checkpoint=os.path.join(MODULE_HOME, 'panns_cnn6', 'cnn6.pdparams'))\n\n        self.task = task\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self, feats, labels=None):\n        # feats: (batch_size, num_frames, num_melbins) -> (batch_size, 1, num_frames, num_melbins)\n        feats = feats.unsqueeze(1)\n\n        if self.task == 'sound-cls':\n            embeddings = self.cnn6(feats)\n            embeddings = self.dropout(embeddings)\n            logits = self.fc(embeddings)\n            probs = F.softmax(logits, axis=1)\n\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            audioset_logits = self.cnn6(feats)\n            return audioset_logits\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn6/network.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlehub.utils.log import logger\n\n\nclass ConvBlock5x5(nn.Layer):\n    def __init__(self, in_channels, out_channels):\n        super(ConvBlock5x5, self).__init__()\n\n        self.conv1 = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=(5, 5),\n            stride=(1, 1),\n            padding=(2, 2),\n            bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(out_channels)\n\n    def forward(self, x, pool_size=(2, 2), pool_type='avg'):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = F.relu(x)\n\n        if pool_type == 'max':\n            x = F.max_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg':\n            x = F.avg_pool2d(x, kernel_size=pool_size)\n        elif pool_type == 'avg+max':\n            x = F.avg_pool2d(x, kernel_size=pool_size) + F.max_pool2d(x, kernel_size=pool_size)\n        else:\n            raise Exception(\n                f'Pooling type of {pool_type} is not supported. 
It must be one of \"max\", \"avg\" and \"avg+max\".')\n        return x\n\n\nclass CNN6(nn.Layer):\n    emb_size = 512\n\n    def __init__(self, extract_embedding: bool = True, checkpoint: str = None):\n\n        super(CNN6, self).__init__()\n        self.bn0 = nn.BatchNorm2D(64)\n        self.conv_block1 = ConvBlock5x5(in_channels=1, out_channels=64)\n        self.conv_block2 = ConvBlock5x5(in_channels=64, out_channels=128)\n        self.conv_block3 = ConvBlock5x5(in_channels=128, out_channels=256)\n        self.conv_block4 = ConvBlock5x5(in_channels=256, out_channels=512)\n\n        self.fc1 = nn.Linear(512, self.emb_size)\n        self.fc_audioset = nn.Linear(self.emb_size, 527)\n\n        if checkpoint is not None and os.path.isfile(checkpoint):\n            state_dict = paddle.load(checkpoint)\n            self.set_state_dict(state_dict)\n            print(f'Loaded CNN6 pretrained parameters from: {checkpoint}')\n        else:\n            print('No valid checkpoints for CNN6. Start training from scratch.')\n\n        self.extract_embedding = extract_embedding\n\n    def forward(self, x):\n        x.stop_gradient = False\n        x = x.transpose([0, 3, 2, 1])\n        x = self.bn0(x)\n        x = x.transpose([0, 3, 2, 1])\n\n        x = self.conv_block1(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block2(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block3(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = self.conv_block4(x, pool_size=(2, 2), pool_type='avg')\n        x = F.dropout(x, p=0.2, training=self.training)\n\n        x = x.mean(axis=3)\n        x = x.max(axis=2) + x.mean(axis=2)\n\n        x = F.dropout(x, p=0.5, training=self.training)\n        x = F.relu(self.fc1(x))\n\n        if self.extract_embedding:\n            output = F.dropout(x, 
p=0.5, training=self.training)\n        else:\n            output = F.sigmoid(self.fc_audioset(x))\n        return output\n"
  },
  {
    "path": "modules/audio/audio_classification/PANNs/cnn6/requirements.txt",
    "content": "librosa\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/README.md",
    "content": "# kwmlp_speech_commands\n\n|模型名称|kwmlp_speech_commands|\n| :--- | :---: |\n|类别|语音-语言识别|\n|网络|Keyword-MLP|\n|数据集|Google Speech Commands V2|\n|是否支持Fine-tuning|否|\n|模型大小|1.6MB|\n|最新更新日期|2022-01-04|\n|数据指标|ACC 97.56%|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nkwmlp_speech_commands采用了 [Keyword-MLP](https://arxiv.org/pdf/2110.07749v1.pdf) 的轻量级模型结构，并在 [Google Speech Commands V2](https://arxiv.org/abs/1804.03209) 数据集上进行了预训练，在其测试集的测试结果为 ACC 97.56%。\n\n<p align=\"center\">\n<img src=\"https://d3i71xaburhd42.cloudfront.net/fa690a97f76ba119ca08fb02fa524a546c47f031/2-Figure1-1.png\" hspace='10' height=\"550\"/> <br />\n</p>\n\n\n更多详情请参考\n- [Speech Commands: A Dataset for Limited-Vocabulary Speech Recognition](https://arxiv.org/abs/1804.03209)\n- [ATTENTION-FREE KEYWORD SPOTTING](https://arxiv.org/pdf/2110.07749v1.pdf)\n- [Keyword-MLP](https://github.com/AI-Research-BD/Keyword-MLP)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install kwmlp_speech_commands\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(\n        name='kwmlp_speech_commands',\n        version='1.0.0')\n\n    # 通过下列链接可下载示例音频\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/go.wav\n\n    # Keyword spotting\n    score, label = model.keyword_recognize('no.wav')\n    print(score, label)\n    # [0.89498246] no\n    score, label = model.keyword_recognize('go.wav')\n    print(score, label)\n    # [0.8997176] go\n    score, label = model.keyword_recognize('one.wav')\n    print(score, label)\n    # [0.88598305] one\n    ```\n\n- ### 2、API\n  - ```python\n    def 
keyword_recognize(\n        wav: os.PathLike,\n    )\n    ```\n    - 检测音频中包含的关键词。\n\n    - **参数**\n\n      - `wav`：输入的包含关键词的音频文件，格式为`*.wav`。\n\n    - **返回**\n\n      - 输出结果的得分和对应的关键词标签。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install kwmlp_speech_commands\n  ```\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/feature.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport numpy as np\nimport paddle\nimport paddleaudio\n\n\ndef create_dct(n_mfcc: int, n_mels: int, norm: str = 'ortho'):\n    n = paddle.arange(float(n_mels))\n    k = paddle.arange(float(n_mfcc)).unsqueeze(1)\n    dct = paddle.cos(math.pi / float(n_mels) * (n + 0.5) * k)  # size (n_mfcc, n_mels)\n    if norm is None:\n        dct *= 2.0\n    else:\n        assert norm == \"ortho\"\n        dct[0] *= 1.0 / math.sqrt(2.0)\n        dct *= math.sqrt(2.0 / float(n_mels))\n    return dct.t()\n\n\ndef compute_mfcc(\n        x: paddle.Tensor,\n        sr: int = 16000,\n        n_mels: int = 40,\n        n_fft: int = 480,\n        win_length: int = 480,\n        hop_length: int = 160,\n        f_min: float = 0.0,\n        f_max: float = None,\n        center: bool = False,\n        top_db: float = 80.0,\n        norm: str = 'ortho',\n):\n    fbank = paddleaudio.features.spectrum.MelSpectrogram(\n        sr=sr,\n        n_mels=n_mels,\n        n_fft=n_fft,\n        win_length=win_length,\n        hop_length=hop_length,\n        f_min=0.0,\n        f_max=f_max,\n        center=center)(x)  # waveforms batch ~ (B, T)\n    log_fbank = paddleaudio.features.spectrum.power_to_db(fbank, top_db=top_db)\n    dct_matrix = create_dct(n_mfcc=n_mels, n_mels=n_mels, norm=norm)\n    mfcc = paddle.matmul(log_fbank.transpose((0, 2, 1)), 
dct_matrix).transpose((0, 2, 1))  # (B, n_mels, L)\n    return mfcc\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/kwmlp.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass Residual(nn.Layer):\n    def __init__(self, fn):\n        super().__init__()\n        self.fn = fn\n\n    def forward(self, x):\n        return self.fn(x) + x\n\n\nclass PreNorm(nn.Layer):\n    def __init__(self, dim, fn):\n        super().__init__()\n        self.fn = fn\n        self.norm = nn.LayerNorm(dim)\n\n    def forward(self, x, **kwargs):\n        x = self.norm(x)\n        return self.fn(x, **kwargs)\n\n\nclass PostNorm(nn.Layer):\n    def __init__(self, dim, fn):\n        super().__init__()\n        self.norm = nn.LayerNorm(dim)\n        self.fn = fn\n\n    def forward(self, x, **kwargs):\n        return self.norm(self.fn(x, **kwargs))\n\n\nclass SpatialGatingUnit(nn.Layer):\n    def __init__(self, dim, dim_seq, act=nn.Identity(), init_eps=1e-3):\n        super().__init__()\n        dim_out = dim // 2\n\n        self.norm = nn.LayerNorm(dim_out)\n        self.proj = nn.Conv1D(dim_seq, dim_seq, 1)\n\n        self.act = act\n\n        init_eps /= dim_seq\n\n    def forward(self, x):\n        res, gate = x.split(2, axis=-1)\n        gate = self.norm(gate)\n\n        weight, bias = self.proj.weight, self.proj.bias\n        gate = F.conv1d(gate, weight, bias)\n\n        return self.act(gate) * res\n\n\nclass gMLPBlock(nn.Layer):\n    def 
__init__(self, *, dim, dim_ff, seq_len, act=nn.Identity()):\n        super().__init__()\n        self.proj_in = nn.Sequential(nn.Linear(dim, dim_ff), nn.GELU())\n\n        self.sgu = SpatialGatingUnit(dim_ff, seq_len, act)\n        self.proj_out = nn.Linear(dim_ff // 2, dim)\n\n    def forward(self, x):\n        x = self.proj_in(x)\n        x = self.sgu(x)\n        x = self.proj_out(x)\n        return x\n\n\nclass Rearrange(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        x = x.transpose([0, 1, 3, 2]).squeeze(1)\n        return x\n\n\nclass Reduce(nn.Layer):\n    def __init__(self, axis=1):\n        super().__init__()\n        self.axis = axis\n\n    def forward(self, x):\n        x = x.mean(axis=self.axis, keepdim=False)\n        return x\n\n\nclass KW_MLP(nn.Layer):\n    \"\"\"Keyword-MLP.\"\"\"\n\n    def __init__(self,\n                 input_res=[40, 98],\n                 patch_res=[40, 1],\n                 num_classes=35,\n                 dim=64,\n                 depth=12,\n                 ff_mult=4,\n                 channels=1,\n                 prob_survival=0.9,\n                 pre_norm=False,\n                 **kwargs):\n        super().__init__()\n        image_height, image_width = input_res\n        patch_height, patch_width = patch_res\n        assert (image_height % patch_height) == 0 and (\n            image_width % patch_width) == 0, 'image height and width must be divisible by patch size'\n        num_patches = (image_height // patch_height) * (image_width // patch_width)\n\n        P_Norm = PreNorm if pre_norm else PostNorm\n\n        dim_ff = dim * ff_mult\n\n        self.to_patch_embed = nn.Sequential(Rearrange(), nn.Linear(channels * patch_height * patch_width, dim))\n\n        self.prob_survival = prob_survival\n\n        self.layers = nn.LayerList(\n            [Residual(P_Norm(dim, gMLPBlock(dim=dim, dim_ff=dim_ff, seq_len=num_patches))) for i in range(depth)])\n\n        
self.to_logits = nn.Sequential(nn.LayerNorm(dim), Reduce(axis=1), nn.Linear(dim, num_classes))\n\n    def forward(self, x):\n        x = self.to_patch_embed(x)\n        layers = self.layers\n        x = nn.Sequential(*layers)(x)\n        return self.to_logits(x)\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport numpy as np\nimport paddle\nimport paddleaudio\n\nfrom .feature import compute_mfcc\nfrom .kwmlp import KW_MLP\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"kwmlp_speech_commands\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/language_identification\")\nclass KWS(paddle.nn.Layer):\n    def __init__(self):\n        super(KWS, self).__init__()\n        ckpt_path = os.path.join(self.directory, 'assets', 'model.pdparams')\n        label_path = os.path.join(self.directory, 'assets', 'label.txt')\n\n        self.label_list = []\n        with open(label_path, 'r') as f:\n            for l in f:\n                self.label_list.append(l.strip())\n\n        self.sr = 16000\n        model_conf = {\n            'input_res': [40, 98],\n            'patch_res': [40, 1],\n            'num_classes': 35,\n            'channels': 1,\n            'dim': 64,\n            'depth': 12,\n            'pre_norm': False,\n            'prob_survival': 0.9,\n        }\n        self.model = KW_MLP(**model_conf)\n        self.model.set_state_dict(paddle.load(ckpt_path))\n        self.model.eval()\n\n    def load_audio(self, wav):\n        wav = os.path.abspath(os.path.expanduser(wav))\n        assert 
os.path.isfile(wav), 'Please check wav file: {}'.format(wav)\n        waveform, _ = paddleaudio.load(wav, sr=self.sr, mono=True, normal=False)\n        return waveform\n\n    def keyword_recognize(self, wav):\n        waveform = self.load_audio(wav)\n\n        # fix_length to 1s\n        if len(waveform) > self.sr:\n            waveform = waveform[:self.sr]\n        else:\n            waveform = np.pad(waveform, (0, self.sr - len(waveform)))\n\n        logits = self(paddle.to_tensor(waveform)).reshape([-1])\n        probs = paddle.nn.functional.softmax(logits)\n        idx = paddle.argmax(probs)\n        return probs[idx].numpy(), self.label_list[idx]\n\n    def forward(self, x):\n        if len(x.shape) == 1:  # x: waveform tensors with (B, T) shape\n            x = x.unsqueeze(0)\n\n        mfcc = compute_mfcc(x).unsqueeze(1)  # (B, C, n_mels, L)\n        logits = self.model(mfcc).squeeze(1)\n\n        return logits\n"
  },
  {
    "path": "modules/audio/keyword_spotting/kwmlp_speech_commands/requirements.txt",
    "content": "paddleaudio==0.1.0\n"
  },
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/README.md",
    "content": "# ecapa_tdnn_common_language\n\n|模型名称|ecapa_tdnn_common_language|\n| :--- | :---: |\n|类别|语音-语言识别|\n|网络|ECAPA-TDNN|\n|数据集|CommonLanguage|\n|是否支持Fine-tuning|否|\n|模型大小|79MB|\n|最新更新日期|2021-12-30|\n|数据指标|ACC 84.9%|\n\n## 一、模型基本信息\n\n### 模型介绍\n\necapa_tdnn_common_language采用了[ECAPA-TDNN](https://arxiv.org/abs/2005.07143)的模型结构，并在[CommonLanguage](https://zenodo.org/record/5036977/)数据集上进行了预训练，在其测试集的测试结果为 ACC 84.9%。\n\n<p align=\"center\">\n<img src=\"https://d3i71xaburhd42.cloudfront.net/9609f4817a7e769f5e3e07084db35e46696e82cd/3-Figure2-1.png\" hspace='10' height=\"550\"/> <br />\n</p>\n\n\n更多详情请参考\n- [CommonLanguage](https://zenodo.org/record/5036977#.Yc19b5Mzb0o)\n- [ECAPA-TDNN: Emphasized Channel Attention, Propagation and Aggregation in TDNN Based Speaker Verification](https://arxiv.org/pdf/2005.07143.pdf)\n- [The SpeechBrain Toolkit](https://github.com/speechbrain/speechbrain)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ecapa_tdnn_common_language\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(\n        name='ecapa_tdnn_common_language',\n        version='1.0.0')\n\n    # 通过下列链接可下载示例音频\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/zh.wav\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/en.wav\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/it.wav\n\n    # Language Identification\n    score, label = model.speaker_verify('zh.wav')\n    print(score, label)\n    # array([0.6214552], dtype=float32), 'Chinese_China'\n    score, label = model.speaker_verify('en.wav')\n    print(score, 
label)\n    # array([0.37193954], dtype=float32), 'English'\n    score, label = model.speaker_verify('it.wav')\n    print(score, label)\n    # array([0.46913534], dtype=float32), 'Italian'\n    ```\n\n- ### 2、API\n  - ```python\n    def language_identify(\n        wav: os.PathLike,\n    )\n    ```\n    - 判断输入人声音频的语言类别。\n\n    - **参数**\n\n      - `wav`：输入的说话人的音频文件，格式为`*.wav`。\n\n    - **返回**\n\n      - 输出结果的得分和对应的语言类别。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install ecapa_tdnn_common_language\n  ```\n"
  },
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n"
  },
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/ecapa_tdnn.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef length_to_mask(length, max_len=None, dtype=None):\n    assert len(length.shape) == 1\n\n    if max_len is None:\n        max_len = length.max().astype('int').item()  # using arange to generate mask\n    mask = paddle.arange(max_len, dtype=length.dtype).expand((len(length), max_len)) < length.unsqueeze(1)\n\n    if dtype is None:\n        dtype = length.dtype\n\n    mask = paddle.to_tensor(mask, dtype=dtype)\n    return mask\n\n\nclass Conv1d(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            kernel_size,\n            stride=1,\n            padding=\"same\",\n            dilation=1,\n            groups=1,\n            bias=True,\n            padding_mode=\"reflect\",\n    ):\n        super(Conv1d, self).__init__()\n\n        self.kernel_size = kernel_size\n        self.stride = stride\n        self.dilation = dilation\n        self.padding = padding\n        self.padding_mode = padding_mode\n\n        self.conv = nn.Conv1D(\n            in_channels,\n            out_channels,\n            self.kernel_size,\n            stride=self.stride,\n            padding=0,\n            dilation=self.dilation,\n            groups=groups,\n            bias_attr=bias,\n        
)\n\n    def forward(self, x):\n        if self.padding == \"same\":\n            x = self._manage_padding(x, self.kernel_size, self.dilation, self.stride)\n        else:\n            raise ValueError(f\"Padding must be 'same'. Got {self.padding}\")\n\n        return self.conv(x)\n\n    def _manage_padding(self, x, kernel_size: int, dilation: int, stride: int):\n        L_in = x.shape[-1]  # Detecting input shape\n        padding = self._get_padding_elem(L_in, stride, kernel_size, dilation)  # Time padding\n        x = F.pad(x, padding, mode=self.padding_mode, data_format=\"NCL\")  # Applying padding\n        return x\n\n    def _get_padding_elem(self, L_in: int, stride: int, kernel_size: int, dilation: int):\n        if stride > 1:\n            n_steps = math.ceil(((L_in - kernel_size * dilation) / stride) + 1)\n            L_out = stride * (n_steps - 1) + kernel_size * dilation\n            padding = [kernel_size // 2, kernel_size // 2]\n        else:\n            L_out = (L_in - dilation * (kernel_size - 1) - 1) // stride + 1\n\n            padding = [(L_in - L_out) // 2, (L_in - L_out) // 2]\n\n        return padding\n\n\nclass BatchNorm1d(nn.Layer):\n    def __init__(\n            self,\n            input_size,\n            eps=1e-05,\n            momentum=0.9,\n            weight_attr=None,\n            bias_attr=None,\n            data_format='NCL',\n            use_global_stats=None,\n    ):\n        super(BatchNorm1d, self).__init__()\n\n        self.norm = nn.BatchNorm1D(\n            input_size,\n            epsilon=eps,\n            momentum=momentum,\n            weight_attr=weight_attr,\n            bias_attr=bias_attr,\n            data_format=data_format,\n            use_global_stats=use_global_stats,\n        )\n\n    def forward(self, x):\n        x_n = self.norm(x)\n        return x_n\n\n\nclass TDNNBlock(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            kernel_size,\n            
dilation,\n            activation=nn.ReLU,\n    ):\n        super(TDNNBlock, self).__init__()\n        self.conv = Conv1d(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            dilation=dilation,\n        )\n        self.activation = activation()\n        self.norm = BatchNorm1d(input_size=out_channels)\n\n    def forward(self, x):\n        return self.norm(self.activation(self.conv(x)))\n\n\nclass Res2NetBlock(nn.Layer):\n    def __init__(self, in_channels, out_channels, scale=8, dilation=1):\n        super(Res2NetBlock, self).__init__()\n        assert in_channels % scale == 0\n        assert out_channels % scale == 0\n\n        in_channel = in_channels // scale\n        hidden_channel = out_channels // scale\n\n        self.blocks = nn.LayerList(\n            [TDNNBlock(in_channel, hidden_channel, kernel_size=3, dilation=dilation) for i in range(scale - 1)])\n        self.scale = scale\n\n    def forward(self, x):\n        y = []\n        for i, x_i in enumerate(paddle.chunk(x, self.scale, axis=1)):\n            if i == 0:\n                y_i = x_i\n            elif i == 1:\n                y_i = self.blocks[i - 1](x_i)\n            else:\n                y_i = self.blocks[i - 1](x_i + y_i)\n            y.append(y_i)\n        y = paddle.concat(y, axis=1)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, in_channels, se_channels, out_channels):\n        super(SEBlock, self).__init__()\n\n        self.conv1 = Conv1d(in_channels=in_channels, out_channels=se_channels, kernel_size=1)\n        self.relu = paddle.nn.ReLU()\n        self.conv2 = Conv1d(in_channels=se_channels, out_channels=out_channels, kernel_size=1)\n        self.sigmoid = paddle.nn.Sigmoid()\n\n    def forward(self, x, lengths=None):\n        L = x.shape[-1]\n        if lengths is not None:\n            mask = length_to_mask(lengths * L, max_len=L)\n            mask = mask.unsqueeze(1)\n            total 
= mask.sum(axis=2, keepdim=True)\n            s = (x * mask).sum(axis=2, keepdim=True) / total\n        else:\n            s = x.mean(axis=2, keepdim=True)\n\n        s = self.relu(self.conv1(s))\n        s = self.sigmoid(self.conv2(s))\n\n        return s * x\n\n\nclass AttentiveStatisticsPooling(nn.Layer):\n    def __init__(self, channels, attention_channels=128, global_context=True):\n        super().__init__()\n\n        self.eps = 1e-12\n        self.global_context = global_context\n        if global_context:\n            self.tdnn = TDNNBlock(channels * 3, attention_channels, 1, 1)\n        else:\n            self.tdnn = TDNNBlock(channels, attention_channels, 1, 1)\n        self.tanh = nn.Tanh()\n        self.conv = Conv1d(in_channels=attention_channels, out_channels=channels, kernel_size=1)\n\n    def forward(self, x, lengths=None):\n        C, L = x.shape[1], x.shape[2]  # KP: (N, C, L)\n\n        def _compute_statistics(x, m, axis=2, eps=self.eps):\n            mean = (m * x).sum(axis)\n            std = paddle.sqrt((m * (x - mean.unsqueeze(axis)).pow(2)).sum(axis).clip(eps))\n            return mean, std\n\n        if lengths is None:\n            lengths = paddle.ones([x.shape[0]])\n\n        # Make binary mask of shape [N, 1, L]\n        mask = length_to_mask(lengths * L, max_len=L)\n        mask = mask.unsqueeze(1)\n\n        # Expand the temporal context of the pooling layer by allowing the\n        # self-attention to look at global properties of the utterance.\n        if self.global_context:\n            total = mask.sum(axis=2, keepdim=True).astype('float32')\n            mean, std = _compute_statistics(x, mask / total)\n            mean = mean.unsqueeze(2).tile((1, 1, L))\n            std = std.unsqueeze(2).tile((1, 1, L))\n            attn = paddle.concat([x, mean, std], axis=1)\n        else:\n            attn = x\n\n        # Apply layers\n        attn = self.conv(self.tanh(self.tdnn(attn)))\n\n        # Filter out zero-paddings\n        attn 
= paddle.where(mask.tile((1, C, 1)) == 0, paddle.ones_like(attn) * float(\"-inf\"), attn)\n\n        attn = F.softmax(attn, axis=2)\n        mean, std = _compute_statistics(x, attn)\n\n        # Append mean and std of the batch\n        pooled_stats = paddle.concat((mean, std), axis=1)\n        pooled_stats = pooled_stats.unsqueeze(2)\n\n        return pooled_stats\n\n\nclass SERes2NetBlock(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            res2net_scale=8,\n            se_channels=128,\n            kernel_size=1,\n            dilation=1,\n            activation=nn.ReLU,\n    ):\n        super(SERes2NetBlock, self).__init__()\n        self.out_channels = out_channels\n        self.tdnn1 = TDNNBlock(\n            in_channels,\n            out_channels,\n            kernel_size=1,\n            dilation=1,\n            activation=activation,\n        )\n        self.res2net_block = Res2NetBlock(out_channels, out_channels, res2net_scale, dilation)\n        self.tdnn2 = TDNNBlock(\n            out_channels,\n            out_channels,\n            kernel_size=1,\n            dilation=1,\n            activation=activation,\n        )\n        self.se_block = SEBlock(out_channels, se_channels, out_channels)\n\n        self.shortcut = None\n        if in_channels != out_channels:\n            self.shortcut = Conv1d(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n            )\n\n    def forward(self, x, lengths=None):\n        residual = x\n        if self.shortcut:\n            residual = self.shortcut(x)\n\n        x = self.tdnn1(x)\n        x = self.res2net_block(x)\n        x = self.tdnn2(x)\n        x = self.se_block(x, lengths)\n\n        return x + residual\n\n\nclass ECAPA_TDNN(nn.Layer):\n    def __init__(\n            self,\n            input_size,\n            lin_neurons=192,\n            activation=nn.ReLU,\n            
channels=[512, 512, 512, 512, 1536],\n            kernel_sizes=[5, 3, 3, 3, 1],\n            dilations=[1, 2, 3, 4, 1],\n            attention_channels=128,\n            res2net_scale=8,\n            se_channels=128,\n            global_context=True,\n    ):\n\n        super(ECAPA_TDNN, self).__init__()\n        assert len(channels) == len(kernel_sizes)\n        assert len(channels) == len(dilations)\n        self.channels = channels\n        self.blocks = nn.LayerList()\n        self.emb_size = lin_neurons\n\n        # The initial TDNN layer\n        self.blocks.append(TDNNBlock(\n            input_size,\n            channels[0],\n            kernel_sizes[0],\n            dilations[0],\n            activation,\n        ))\n\n        # SE-Res2Net layers\n        for i in range(1, len(channels) - 1):\n            self.blocks.append(\n                SERes2NetBlock(\n                    channels[i - 1],\n                    channels[i],\n                    res2net_scale=res2net_scale,\n                    se_channels=se_channels,\n                    kernel_size=kernel_sizes[i],\n                    dilation=dilations[i],\n                    activation=activation,\n                ))\n\n        # Multi-layer feature aggregation\n        self.mfa = TDNNBlock(\n            channels[-1],\n            channels[-1],\n            kernel_sizes[-1],\n            dilations[-1],\n            activation,\n        )\n\n        # Attentive Statistical Pooling\n        self.asp = AttentiveStatisticsPooling(\n            channels[-1],\n            attention_channels=attention_channels,\n            global_context=global_context,\n        )\n        self.asp_bn = BatchNorm1d(input_size=channels[-1] * 2)\n\n        # Final linear transformation\n        self.fc = Conv1d(\n            in_channels=channels[-1] * 2,\n            out_channels=self.emb_size,\n            kernel_size=1,\n        )\n\n    def forward(self, x, lengths=None):\n        xl = []\n        for layer in 
self.blocks:\n            try:\n                x = layer(x, lengths=lengths)\n            except TypeError:\n                x = layer(x)\n            xl.append(x)\n\n        # Multi-layer feature aggregation\n        x = paddle.concat(xl[1:], axis=1)\n        x = self.mfa(x)\n\n        # Attentive Statistical Pooling\n        x = self.asp(x, lengths=lengths)\n        x = self.asp_bn(x)\n\n        # Final linear transformation\n        x = self.fc(x)\n\n        return x\n\n\nclass Classifier(nn.Layer):\n    def __init__(self, backbone, num_class, dtype=paddle.float32):\n        super(Classifier, self).__init__()\n        self.backbone = backbone\n        self.params = nn.ParameterList(\n            [paddle.create_parameter(shape=[num_class, self.backbone.emb_size], dtype=dtype)])\n\n    def forward(self, x):\n        emb = self.backbone(x.transpose([0, 2, 1])).transpose([0, 2, 1])\n        logits = F.linear(F.normalize(emb.squeeze(1)), F.normalize(self.params[0]).transpose([1, 0]))\n\n        return logits\n"
  },
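The `length_to_mask` helper in `ecapa_tdnn.py` above builds the binary padding mask that both `SEBlock` and `AttentiveStatisticsPooling` rely on to ignore zero-padded frames. A minimal pure-Python sketch of its semantics (paddle tensors replaced by plain lists, purely for illustration):

```python
# Pure-Python sketch of the length_to_mask semantics (assumption: paddle is
# replaced by plain lists so the comparison logic can be checked in isolation).
def length_to_mask(lengths, max_len=None):
    """Return a binary mask of shape (len(lengths), max_len).

    mask[i][t] is 1 while t < lengths[i], else 0 -- mirroring the
    `paddle.arange(max_len) < length.unsqueeze(1)` comparison in the module.
    """
    if max_len is None:
        max_len = max(lengths)
    return [[1 if t < n else 0 for t in range(max_len)] for n in lengths]
```

Broadcasting the `arange` row against the column of lengths is what makes the real paddle version a single vectorized comparison instead of this double loop.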
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/feature.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\nimport paddleaudio\nfrom paddleaudio.features.spectrum import hz_to_mel\nfrom paddleaudio.features.spectrum import mel_to_hz\nfrom paddleaudio.features.spectrum import power_to_db\nfrom paddleaudio.features.spectrum import Spectrogram\nfrom paddleaudio.features.window import get_window\n\n\ndef compute_fbank_matrix(sample_rate: int = 16000,\n                         n_fft: int = 400,\n                         n_mels: int = 80,\n                         f_min: float = 0.0,\n                         f_max: float = 8000.0):\n    mel = paddle.linspace(hz_to_mel(f_min, htk=True), hz_to_mel(f_max, htk=True), n_mels + 2, dtype=paddle.float32)\n    hz = mel_to_hz(mel, htk=True)\n\n    band = hz[1:] - hz[:-1]\n    band = band[:-1]\n    f_central = hz[1:-1]\n\n    n_stft = n_fft // 2 + 1\n    all_freqs = paddle.linspace(0, sample_rate // 2, n_stft)\n    all_freqs_mat = all_freqs.tile([f_central.shape[0], 1])\n\n    f_central_mat = f_central.tile([all_freqs_mat.shape[1], 1]).transpose([1, 0])\n    band_mat = band.tile([all_freqs_mat.shape[1], 1]).transpose([1, 0])\n\n    slope = (all_freqs_mat - f_central_mat) / band_mat\n    left_side = slope + 1.0\n    right_side = -slope + 1.0\n\n    fbank_matrix = paddle.maximum(paddle.zeros_like(left_side), paddle.minimum(left_side, right_side))\n\n    return fbank_matrix\n\n\ndef 
compute_log_fbank(\n        x: paddle.Tensor,\n        sample_rate: int = 16000,\n        n_fft: int = 400,\n        hop_length: int = 160,\n        win_length: int = 400,\n        n_mels: int = 80,\n        window: str = 'hamming',\n        center: bool = True,\n        pad_mode: str = 'constant',\n        f_min: float = 0.0,\n        f_max: float = None,\n        top_db: float = 80.0,\n):\n\n    if f_max is None:\n        f_max = sample_rate / 2\n\n    spect = Spectrogram(\n        n_fft=n_fft, hop_length=hop_length, win_length=win_length, window=window, center=center, pad_mode=pad_mode)(x)\n\n    fbank_matrix = compute_fbank_matrix(\n        sample_rate=sample_rate,\n        n_fft=n_fft,\n        n_mels=n_mels,\n        f_min=f_min,\n        f_max=f_max,\n    )\n    fbank = paddle.matmul(fbank_matrix, spect)\n    log_fbank = power_to_db(fbank, top_db=top_db).transpose([0, 2, 1])\n    return log_fbank\n\n\ndef compute_stats(x: paddle.Tensor, mean_norm: bool = True, std_norm: bool = False, eps: float = 1e-10):\n    if mean_norm:\n        current_mean = paddle.mean(x, axis=0)\n    else:\n        current_mean = paddle.to_tensor([0.0])\n\n    if std_norm:\n        current_std = paddle.std(x, axis=0)\n    else:\n        current_std = paddle.to_tensor([1.0])\n\n    current_std = paddle.maximum(current_std, eps * paddle.ones_like(current_std))\n\n    return current_mean, current_std\n\n\ndef normalize(\n        x: paddle.Tensor,\n        global_mean: paddle.Tensor = None,\n        global_std: paddle.Tensor = None,\n):\n\n    for i in range(x.shape[0]):  # (B, ...)\n        if global_mean is None and global_std is None:\n            mean, std = compute_stats(x[i])\n            x[i] = (x[i] - mean) / std\n        else:\n            x[i] = (x[i] - global_mean) / global_std\n    return x\n"
  },
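The core of `compute_fbank_matrix` in `feature.py` above is the `maximum(0, minimum(left_side, right_side))` expression, which evaluates each triangular mel filter at every FFT bin frequency. A minimal scalar sketch of that per-bin weight (pure Python, for illustration only):

```python
def triangular_weight(freq, f_central, band):
    """Weight of one triangular mel filter at frequency `freq`.

    Rises linearly from 0 at (f_central - band) to 1 at f_central, then
    falls back to 0 at (f_central + band) -- the scalar form of the
    max(0, min(slope + 1, -slope + 1)) expression in compute_fbank_matrix.
    """
    slope = (freq - f_central) / band
    return max(0.0, min(slope + 1.0, -slope + 1.0))
```

In the real function this is computed for an entire (n_mels, n_stft) grid at once by tiling `f_central` and `band` into matrices and broadcasting against all FFT bin frequencies.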
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport re\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddleaudio\n\nfrom .ecapa_tdnn import Classifier\nfrom .ecapa_tdnn import ECAPA_TDNN\nfrom .feature import compute_log_fbank\nfrom .feature import normalize\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ecapa_tdnn_common_language\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/language_identification\")\nclass LanguageIdentification(paddle.nn.Layer):\n    def __init__(self):\n        super(LanguageIdentification, self).__init__()\n        ckpt_path = os.path.join(self.directory, 'assets', 'model.pdparams')\n        label_path = os.path.join(self.directory, 'assets', 'label.txt')\n\n        self.label_list = []\n        with open(label_path, 'r') as f:\n            for l in f:\n                self.label_list.append(l.strip())\n\n        self.sr = 16000\n        model_conf = {\n            'input_size': 80,\n            'channels': [1024, 1024, 1024, 1024, 3072],\n            'kernel_sizes': [5, 3, 3, 3, 1],\n            'dilations': [1, 2, 3, 4, 1],\n            'attention_channels': 128,\n            'lin_neurons': 192\n        }\n        self.model = Classifier(\n            
backbone=ECAPA_TDNN(**model_conf),\n            num_class=45,\n        )\n        self.model.set_state_dict(paddle.load(ckpt_path))\n        self.model.eval()\n\n    def load_audio(self, wav):\n        wav = os.path.abspath(os.path.expanduser(wav))\n        assert os.path.isfile(wav), 'Please check wav file: {}'.format(wav)\n        waveform, _ = paddleaudio.load(wav, sr=self.sr, mono=True, normal=False)\n        return waveform\n\n    def language_identify(self, wav):\n        waveform = self.load_audio(wav)\n        logits = self(paddle.to_tensor(waveform)).reshape([-1])\n        idx = paddle.argmax(logits)\n        return logits[idx].numpy(), self.label_list[idx]\n\n    def forward(self, x):\n        if len(x.shape) == 1:\n            x = x.unsqueeze(0)\n\n        fbank = compute_log_fbank(x)  # x: waveform tensors with (B, T) shape\n        norm_fbank = normalize(fbank)\n        logits = self.model(norm_fbank).squeeze(1)\n\n        return logits\n"
  },
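`language_identify` in `module.py` above reduces the model's per-language logits to a single prediction with `paddle.argmax` and a label lookup. A pure-Python sketch of that post-processing step (lists instead of paddle tensors, for illustration only):

```python
def pick_language(logits, label_list):
    """Mimic language_identify's post-processing: take the argmax over the
    per-language logits and return the (score, label) pair."""
    idx = max(range(len(logits)), key=lambda i: logits[i])  # argmax
    return logits[idx], label_list[idx]
```

Note the module returns the raw (cosine-based) logit of the winning class rather than a softmax probability, so the score is a similarity value, not a calibrated confidence.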
  {
    "path": "modules/audio/language_identification/ecapa_tdnn_common_language/requirements.txt",
    "content": "paddleaudio==0.1.0\n"
  },
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/README.md",
    "content": "# ecapa_tdnn_voxceleb\n\n|模型名称|ecapa_tdnn_voxceleb|\n| :--- | :---: |\n|类别|语音-声纹识别|\n|网络|ECAPA-TDNN|\n|数据集|VoxCeleb|\n|是否支持Fine-tuning|否|\n|模型大小|79MB|\n|最新更新日期|2021-12-30|\n|数据指标|EER 0.69%|\n\n## 一、模型基本信息\n\n### 模型介绍\n\necapa_tdnn_voxceleb采用了[ECAPA-TDNN](https://arxiv.org/abs/2005.07143)的模型结构，并在[VoxCeleb](http://www.robots.ox.ac.uk/~vgg/data/voxceleb/)数据集上进行了预训练，在VoxCeleb1的声纹识别测试集([veri_test.txt](https://www.robots.ox.ac.uk/~vgg/data/voxceleb/meta/veri_test.txt))上的测试结果为 EER 0.69%，达到了该数据集的SOTA。\n\n<p align=\"center\">\n<img src=\"https://d3i71xaburhd42.cloudfront.net/9609f4817a7e769f5e3e07084db35e46696e82cd/3-Figure2-1.png\" hspace='10' height=\"550\"/> <br />\n</p>\n\n\n\n更多详情请参考\n- [VoxCeleb: a large-scale speaker identification dataset](https://www.robots.ox.ac.uk/~vgg/publications/2017/Nagrani17/nagrani17.pdf)\n- [ECAPA-TDNN: Emphasized Channel Attention, Propagation and Aggregation in TDNN Based Speaker Verification](https://arxiv.org/pdf/2005.07143.pdf)\n- [The SpeechBrain Toolkit](https://github.com/speechbrain/speechbrain)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ecapa_tdnn_voxceleb\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(\n        name='ecapa_tdnn_voxceleb',\n        threshold=0.25,\n        version='1.0.0')\n\n    # 通过下列链接可下载示例音频\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/sv1.wav\n    # https://paddlehub.bj.bcebos.com/paddlehub_dev/sv2.wav\n\n    # Speaker Embedding\n    embedding = model.speaker_embedding('sv1.wav')\n    print(embedding.shape)\n    # 
(192,)\n\n    # Speaker Verification\n    score, pred = model.speaker_verify('sv1.wav', 'sv2.wav')\n    print(score, pred)\n    # [0.16354457], [False]\n    ```\n\n- ### 2、API\n  - ```python\n    def __init__(\n        threshold: float,\n    )\n    ```\n    - 初始化声纹模型，确定判别阈值。\n\n    - **参数**\n\n      - `threshold`：设定模型判别声纹相似度的得分阈值，默认为 0.25。\n\n  - ```python\n    def speaker_embedding(\n        wav: os.PathLike,\n    )\n    ```\n    - 获取输入音频的声纹特征。\n\n    - **参数**\n\n      - `wav`：输入的说话人的音频文件，格式为`*.wav`。\n\n    - **返回**\n\n      - 输出维度为 (192,) 的声纹特征向量。\n\n  - ```python\n    def speaker_verify(\n        wav1: os.PathLike,\n        wav2: os.PathLike,\n    )\n    ```\n    - 对比两段音频，分别计算其声纹特征的相似度得分，并判断是否为同一说话人。\n\n    - **参数**\n\n      - `wav1`：输入的说话人1的音频文件，格式为`*.wav`。\n      - `wav2`：输入的说话人2的音频文件，格式为`*.wav`。\n\n    - **返回**\n\n      - 返回声纹相似度得分[-1, 1]和预测结果。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install ecapa_tdnn_voxceleb\n  ```\n"
  },
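The README above says `speaker_verify` returns a similarity score in [-1, 1] and compares it against the 0.25 threshold to decide whether two clips share a speaker. A minimal sketch of that decision rule on two embedding vectors (assumption: the [-1, 1] score is cosine similarity, which is what the stated range suggests; plain Python lists stand in for the (192,) embeddings):

```python
import math

def cosine_score(emb1, emb2):
    """Cosine similarity in [-1, 1] between two speaker embeddings."""
    dot = sum(a * b for a, b in zip(emb1, emb2))
    n1 = math.sqrt(sum(a * a for a in emb1))
    n2 = math.sqrt(sum(b * b for b in emb2))
    return dot / (n1 * n2)

def verify(emb1, emb2, threshold=0.25):
    """Return (score, same_speaker) like the module's speaker_verify."""
    score = cosine_score(emb1, emb2)
    return score, score >= threshold
```

With the README's example score of 0.16 and the default threshold of 0.25, this rule yields `False`, matching the documented output.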
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n"
  },
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/ecapa_tdnn.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef length_to_mask(length, max_len=None, dtype=None):\n    assert len(length.shape) == 1\n\n    if max_len is None:\n        max_len = length.max().astype('int').item()  # using arange to generate mask\n    mask = paddle.arange(max_len, dtype=length.dtype).expand((len(length), max_len)) < length.unsqueeze(1)\n\n    if dtype is None:\n        dtype = length.dtype\n\n    mask = paddle.to_tensor(mask, dtype=dtype)\n    return mask\n\n\nclass Conv1d(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            kernel_size,\n            stride=1,\n            padding=\"same\",\n            dilation=1,\n            groups=1,\n            bias=True,\n            padding_mode=\"reflect\",\n    ):\n        super(Conv1d, self).__init__()\n\n        self.kernel_size = kernel_size\n        self.stride = stride\n        self.dilation = dilation\n        self.padding = padding\n        self.padding_mode = padding_mode\n\n        self.conv = nn.Conv1D(\n            in_channels,\n            out_channels,\n            self.kernel_size,\n            stride=self.stride,\n            padding=0,\n            dilation=self.dilation,\n            groups=groups,\n            bias_attr=bias,\n        
)\n\n    def forward(self, x):\n        if self.padding == \"same\":\n            x = self._manage_padding(x, self.kernel_size, self.dilation, self.stride)\n        else:\n            raise ValueError(f\"Padding must be 'same'. Got {self.padding}\")\n\n        return self.conv(x)\n\n    def _manage_padding(self, x, kernel_size: int, dilation: int, stride: int):\n        L_in = x.shape[-1]  # Detecting input shape\n        padding = self._get_padding_elem(L_in, stride, kernel_size, dilation)  # Time padding\n        x = F.pad(x, padding, mode=self.padding_mode, data_format=\"NCL\")  # Applying padding\n        return x\n\n    def _get_padding_elem(self, L_in: int, stride: int, kernel_size: int, dilation: int):\n        if stride > 1:\n            n_steps = math.ceil(((L_in - kernel_size * dilation) / stride) + 1)\n            L_out = stride * (n_steps - 1) + kernel_size * dilation\n            padding = [kernel_size // 2, kernel_size // 2]\n        else:\n            L_out = (L_in - dilation * (kernel_size - 1) - 1) // stride + 1\n\n            padding = [(L_in - L_out) // 2, (L_in - L_out) // 2]\n\n        return padding\n\n\nclass BatchNorm1d(nn.Layer):\n    def __init__(\n            self,\n            input_size,\n            eps=1e-05,\n            momentum=0.9,\n            weight_attr=None,\n            bias_attr=None,\n            data_format='NCL',\n            use_global_stats=None,\n    ):\n        super(BatchNorm1d, self).__init__()\n\n        self.norm = nn.BatchNorm1D(\n            input_size,\n            epsilon=eps,\n            momentum=momentum,\n            weight_attr=weight_attr,\n            bias_attr=bias_attr,\n            data_format=data_format,\n            use_global_stats=use_global_stats,\n        )\n\n    def forward(self, x):\n        x_n = self.norm(x)\n        return x_n\n\n\nclass TDNNBlock(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            kernel_size,\n            
dilation,\n            activation=nn.ReLU,\n    ):\n        super(TDNNBlock, self).__init__()\n        self.conv = Conv1d(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            dilation=dilation,\n        )\n        self.activation = activation()\n        self.norm = BatchNorm1d(input_size=out_channels)\n\n    def forward(self, x):\n        return self.norm(self.activation(self.conv(x)))\n\n\nclass Res2NetBlock(nn.Layer):\n    def __init__(self, in_channels, out_channels, scale=8, dilation=1):\n        super(Res2NetBlock, self).__init__()\n        assert in_channels % scale == 0\n        assert out_channels % scale == 0\n\n        in_channel = in_channels // scale\n        hidden_channel = out_channels // scale\n\n        self.blocks = nn.LayerList(\n            [TDNNBlock(in_channel, hidden_channel, kernel_size=3, dilation=dilation) for i in range(scale - 1)])\n        self.scale = scale\n\n    def forward(self, x):\n        y = []\n        for i, x_i in enumerate(paddle.chunk(x, self.scale, axis=1)):\n            if i == 0:\n                y_i = x_i\n            elif i == 1:\n                y_i = self.blocks[i - 1](x_i)\n            else:\n                y_i = self.blocks[i - 1](x_i + y_i)\n            y.append(y_i)\n        y = paddle.concat(y, axis=1)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, in_channels, se_channels, out_channels):\n        super(SEBlock, self).__init__()\n\n        self.conv1 = Conv1d(in_channels=in_channels, out_channels=se_channels, kernel_size=1)\n        self.relu = paddle.nn.ReLU()\n        self.conv2 = Conv1d(in_channels=se_channels, out_channels=out_channels, kernel_size=1)\n        self.sigmoid = paddle.nn.Sigmoid()\n\n    def forward(self, x, lengths=None):\n        L = x.shape[-1]\n        if lengths is not None:\n            mask = length_to_mask(lengths * L, max_len=L)\n            mask = mask.unsqueeze(1)\n            total 
= mask.sum(axis=2, keepdim=True)\n            s = (x * mask).sum(axis=2, keepdim=True) / total\n        else:\n            s = x.mean(axis=2, keepdim=True)\n\n        s = self.relu(self.conv1(s))\n        s = self.sigmoid(self.conv2(s))\n\n        return s * x\n\n\nclass AttentiveStatisticsPooling(nn.Layer):\n    def __init__(self, channels, attention_channels=128, global_context=True):\n        super().__init__()\n\n        self.eps = 1e-12\n        self.global_context = global_context\n        if global_context:\n            self.tdnn = TDNNBlock(channels * 3, attention_channels, 1, 1)\n        else:\n            self.tdnn = TDNNBlock(channels, attention_channels, 1, 1)\n        self.tanh = nn.Tanh()\n        self.conv = Conv1d(in_channels=attention_channels, out_channels=channels, kernel_size=1)\n\n    def forward(self, x, lengths=None):\n        C, L = x.shape[1], x.shape[2]  # KP: (N, C, L)\n\n        def _compute_statistics(x, m, axis=2, eps=self.eps):\n            mean = (m * x).sum(axis)\n            std = paddle.sqrt((m * (x - mean.unsqueeze(axis)).pow(2)).sum(axis).clip(eps))\n            return mean, std\n\n        if lengths is None:\n            lengths = paddle.ones([x.shape[0]])\n\n        # Make binary mask of shape [N, 1, L]\n        mask = length_to_mask(lengths * L, max_len=L)\n        mask = mask.unsqueeze(1)\n\n        # Expand the temporal context of the pooling layer by allowing the\n        # self-attention to look at global properties of the utterance.\n        if self.global_context:\n            total = mask.sum(axis=2, keepdim=True).astype('float32')\n            mean, std = _compute_statistics(x, mask / total)\n            mean = mean.unsqueeze(2).tile((1, 1, L))\n            std = std.unsqueeze(2).tile((1, 1, L))\n            attn = paddle.concat([x, mean, std], axis=1)\n        else:\n            attn = x\n\n        # Apply layers\n        attn = self.conv(self.tanh(self.tdnn(attn)))\n\n        # Filter out zero-paddings\n        attn 
= paddle.where(mask.tile((1, C, 1)) == 0, paddle.ones_like(attn) * float(\"-inf\"), attn)\n\n        attn = F.softmax(attn, axis=2)\n        mean, std = _compute_statistics(x, attn)\n\n        # Append mean and std of the batch\n        pooled_stats = paddle.concat((mean, std), axis=1)\n        pooled_stats = pooled_stats.unsqueeze(2)\n\n        return pooled_stats\n\n\nclass SERes2NetBlock(nn.Layer):\n    def __init__(\n            self,\n            in_channels,\n            out_channels,\n            res2net_scale=8,\n            se_channels=128,\n            kernel_size=1,\n            dilation=1,\n            activation=nn.ReLU,\n    ):\n        super(SERes2NetBlock, self).__init__()\n        self.out_channels = out_channels\n        self.tdnn1 = TDNNBlock(\n            in_channels,\n            out_channels,\n            kernel_size=1,\n            dilation=1,\n            activation=activation,\n        )\n        self.res2net_block = Res2NetBlock(out_channels, out_channels, res2net_scale, dilation)\n        self.tdnn2 = TDNNBlock(\n            out_channels,\n            out_channels,\n            kernel_size=1,\n            dilation=1,\n            activation=activation,\n        )\n        self.se_block = SEBlock(out_channels, se_channels, out_channels)\n\n        self.shortcut = None\n        if in_channels != out_channels:\n            self.shortcut = Conv1d(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n            )\n\n    def forward(self, x, lengths=None):\n        residual = x\n        if self.shortcut:\n            residual = self.shortcut(x)\n\n        x = self.tdnn1(x)\n        x = self.res2net_block(x)\n        x = self.tdnn2(x)\n        x = self.se_block(x, lengths)\n\n        return x + residual\n\n\nclass ECAPA_TDNN(nn.Layer):\n    def __init__(\n            self,\n            input_size,\n            lin_neurons=192,\n            activation=nn.ReLU,\n            
channels=[512, 512, 512, 512, 1536],\n            kernel_sizes=[5, 3, 3, 3, 1],\n            dilations=[1, 2, 3, 4, 1],\n            attention_channels=128,\n            res2net_scale=8,\n            se_channels=128,\n            global_context=True,\n    ):\n\n        super(ECAPA_TDNN, self).__init__()\n        assert len(channels) == len(kernel_sizes)\n        assert len(channels) == len(dilations)\n        self.channels = channels\n        self.blocks = nn.LayerList()\n        self.emb_size = lin_neurons\n\n        # The initial TDNN layer\n        self.blocks.append(TDNNBlock(\n            input_size,\n            channels[0],\n            kernel_sizes[0],\n            dilations[0],\n            activation,\n        ))\n\n        # SE-Res2Net layers\n        for i in range(1, len(channels) - 1):\n            self.blocks.append(\n                SERes2NetBlock(\n                    channels[i - 1],\n                    channels[i],\n                    res2net_scale=res2net_scale,\n                    se_channels=se_channels,\n                    kernel_size=kernel_sizes[i],\n                    dilation=dilations[i],\n                    activation=activation,\n                ))\n\n        # Multi-layer feature aggregation\n        self.mfa = TDNNBlock(\n            channels[-1],\n            channels[-1],\n            kernel_sizes[-1],\n            dilations[-1],\n            activation,\n        )\n\n        # Attentive Statistical Pooling\n        self.asp = AttentiveStatisticsPooling(\n            channels[-1],\n            attention_channels=attention_channels,\n            global_context=global_context,\n        )\n        self.asp_bn = BatchNorm1d(input_size=channels[-1] * 2)\n\n        # Final linear transformation\n        self.fc = Conv1d(\n            in_channels=channels[-1] * 2,\n            out_channels=self.emb_size,\n            kernel_size=1,\n        )\n\n    def forward(self, x, lengths=None):\n        xl = []\n        for layer in 
self.blocks:\n            try:\n                x = layer(x, lengths=lengths)\n            except TypeError:\n                x = layer(x)\n            xl.append(x)\n\n        # Multi-layer feature aggregation\n        x = paddle.concat(xl[1:], axis=1)\n        x = self.mfa(x)\n\n        # Attentive Statistical Pooling\n        x = self.asp(x, lengths=lengths)\n        x = self.asp_bn(x)\n\n        # Final linear transformation\n        x = self.fc(x)\n\n        return x\n"
  },
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/feature.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\nimport paddleaudio\nfrom paddleaudio.features.spectrum import hz_to_mel\nfrom paddleaudio.features.spectrum import mel_to_hz\nfrom paddleaudio.features.spectrum import power_to_db\nfrom paddleaudio.features.spectrum import Spectrogram\nfrom paddleaudio.features.window import get_window\n\n\ndef compute_fbank_matrix(sample_rate: int = 16000,\n                         n_fft: int = 400,\n                         n_mels: int = 80,\n                         f_min: int = 0.0,\n                         f_max: int = 8000.0):\n    mel = paddle.linspace(hz_to_mel(f_min, htk=True), hz_to_mel(f_max, htk=True), n_mels + 2, dtype=paddle.float32)\n    hz = mel_to_hz(mel, htk=True)\n\n    band = hz[1:] - hz[:-1]\n    band = band[:-1]\n    f_central = hz[1:-1]\n\n    n_stft = n_fft // 2 + 1\n    all_freqs = paddle.linspace(0, sample_rate // 2, n_stft)\n    all_freqs_mat = all_freqs.tile([f_central.shape[0], 1])\n\n    f_central_mat = f_central.tile([all_freqs_mat.shape[1], 1]).transpose([1, 0])\n    band_mat = band.tile([all_freqs_mat.shape[1], 1]).transpose([1, 0])\n\n    slope = (all_freqs_mat - f_central_mat) / band_mat\n    left_side = slope + 1.0\n    right_side = -slope + 1.0\n\n    fbank_matrix = paddle.maximum(paddle.zeros_like(left_side), paddle.minimum(left_side, right_side))\n\n    return fbank_matrix\n\n\ndef 
compute_log_fbank(\n        x: paddle.Tensor,\n        sample_rate: int = 16000,\n        n_fft: int = 400,\n        hop_length: int = 160,\n        win_length: int = 400,\n        n_mels: int = 80,\n        window: str = 'hamming',\n        center: bool = True,\n        pad_mode: str = 'constant',\n        f_min: float = 0.0,\n        f_max: float = None,\n        top_db: float = 80.0,\n):\n\n    if f_max is None:\n        f_max = sample_rate / 2\n\n    spect = Spectrogram(\n        n_fft=n_fft, hop_length=hop_length, win_length=win_length, window=window, center=center, pad_mode=pad_mode)(x)\n\n    fbank_matrix = compute_fbank_matrix(\n        sample_rate=sample_rate,\n        n_fft=n_fft,\n        n_mels=n_mels,\n        f_min=f_min,\n        f_max=f_max,\n    )\n    fbank = paddle.matmul(fbank_matrix, spect)\n    log_fbank = power_to_db(fbank, top_db=top_db).transpose([0, 2, 1])\n    return log_fbank\n\n\ndef compute_stats(x: paddle.Tensor, mean_norm: bool = True, std_norm: bool = False, eps: float = 1e-10):\n    if mean_norm:\n        current_mean = paddle.mean(x, axis=0)\n    else:\n        current_mean = paddle.to_tensor([0.0])\n\n    if std_norm:\n        current_std = paddle.std(x, axis=0)\n    else:\n        current_std = paddle.to_tensor([1.0])\n\n    current_std = paddle.maximum(current_std, eps * paddle.ones_like(current_std))\n\n    return current_mean, current_std\n\n\ndef normalize(\n        x: paddle.Tensor,\n        global_mean: paddle.Tensor = None,\n        global_std: paddle.Tensor = None,\n):\n\n    for i in range(x.shape[0]):  # (B, ...)\n        if global_mean is None and global_std is None:\n            mean, std = compute_stats(x[i])\n            x[i] = (x[i] - mean) / std\n        else:\n            x[i] = (x[i] - global_mean) / global_std\n    return x\n"
  },
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport re\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddleaudio\n\nfrom .ecapa_tdnn import ECAPA_TDNN\nfrom .feature import compute_log_fbank\nfrom .feature import normalize\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ecapa_tdnn_voxceleb\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/speaker_recognition\")\nclass SpeakerRecognition(paddle.nn.Layer):\n    def __init__(self, threshold=0.25):\n        super(SpeakerRecognition, self).__init__()\n        global_stats_path = os.path.join(self.directory, 'assets', 'global_embedding_stats.npy')\n        ckpt_path = os.path.join(self.directory, 'assets', 'model.pdparams')\n\n        self.sr = 16000\n        self.threshold = threshold\n        model_conf = {\n            'input_size': 80,\n            'channels': [1024, 1024, 1024, 1024, 3072],\n            'kernel_sizes': [5, 3, 3, 3, 1],\n            'dilations': [1, 2, 3, 4, 1],\n            'attention_channels': 128,\n            'lin_neurons': 192\n        }\n        self.model = ECAPA_TDNN(**model_conf)\n        self.model.set_state_dict(paddle.load(ckpt_path))\n        self.model.eval()\n\n        global_embedding_stats = 
np.load(global_stats_path, allow_pickle=True)\n        self.global_emb_mean = paddle.to_tensor(global_embedding_stats.item().get('global_emb_mean'))\n        self.global_emb_std = paddle.to_tensor(global_embedding_stats.item().get('global_emb_std'))\n\n        self.similarity = paddle.nn.CosineSimilarity(axis=-1, eps=1e-6)\n\n    def load_audio(self, wav):\n        wav = os.path.abspath(os.path.expanduser(wav))\n        assert os.path.isfile(wav), 'Please check wav file: {}'.format(wav)\n        waveform, _ = paddleaudio.load(wav, sr=self.sr, mono=True, normal=False)\n        return waveform\n\n    def speaker_embedding(self, wav):\n        waveform = self.load_audio(wav)\n        embedding = self(paddle.to_tensor(waveform)).reshape([-1])\n        return embedding.numpy()\n\n    def speaker_verify(self, wav1, wav2):\n        waveform1 = self.load_audio(wav1)\n        embedding1 = self(paddle.to_tensor(waveform1)).reshape([-1])\n\n        waveform2 = self.load_audio(wav2)\n        embedding2 = self(paddle.to_tensor(waveform2)).reshape([-1])\n\n        score = self.similarity(embedding1, embedding2).numpy()\n        return score, score > self.threshold\n\n    def forward(self, x):\n        if len(x.shape) == 1:\n            x = x.unsqueeze(0)\n\n        fbank = compute_log_fbank(x)  # x: waveform tensors with (B, T) shape\n        norm_fbank = normalize(fbank)\n        embedding = self.model(norm_fbank.transpose([0, 2, 1])).transpose([0, 2, 1])\n        norm_embedding = normalize(x=embedding, global_mean=self.global_emb_mean, global_std=self.global_emb_std)\n\n        return norm_embedding\n"
  },
  {
    "path": "modules/audio/speaker_recognition/ecapa_tdnn_voxceleb/requirements.txt",
    "content": "paddleaudio==0.1.0\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/README.md",
    "content": "# diffsinger\n\n|模型名称|diffsinger|\n| :--- | :---: |\n|类别|音频-歌声合成|\n|网络|DiffSinger|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|256.1MB|\n|指标|-|\n|最新更新日期|2022-10-25|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n        <img src=\"https://neuralsvb.github.io/resources/model_all7.png\"/>\n      </p>\n\n  - 样例结果示例：\n\n    |文本|音频|\n    |:-:|:-:|\n    |让 梦 恒 久 比 天 长|<audio controls=\"controls\"><source src=\"https://diffsinger.github.io/audio/singing_demo/diffsinger-base/000000007.wav\" autoplay=\"\"></audio>|\n    |我 终 于 翱 翔|<audio controls=\"controls\"><source src=\"https://diffsinger.github.io/audio/singing_demo/diffsinger-base/000000005.wav\" autoplay=\"\"></audio>|\n\n- ### 模型介绍\n\n  - DiffSinger，一个基于扩散概率模型的 SVS 声学模型。DiffSinger 是一个参数化的马尔科夫链，它可以根据乐谱的条件，迭代地将噪声转换为旋律谱。通过隐式优化变异约束，DiffSinger 可以被稳定地训练并产生真实的输出。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - onnxruntime >= 1.12.0\n\n    ```shell\n    # CPU\n    $ pip install onnxruntime\n\n    # GPU\n    $ pip install onnxruntime-gpu\n    ```\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install diffsinger\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run diffsinger \\\n        --input_type \"word\" \\\n        --text \"小酒窝长睫毛AP是你最美的记号\" \\\n        --notes \"C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4\" \\\n        --notes_duration \"0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340\" \\\n        --sample_num 1 \\\n        --save_dir \"outputs\"\n\n    $ hub run diffsinger \\\n        
--input_type \"phoneme\" \\\n        --text \"小酒窝长睫毛AP是你最美的记号\" \\\n        --ph_seq \"x iao j iu w o ch ang ang j ie ie m ao AP sh i n i z ui m ei d e j i h ao\" \\\n        --note_seq \"C#4/Db4 C#4/Db4 F#4/Gb4 F#4/Gb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 F#4/Gb4 F#4/Gb4 F#4/Gb4 C#4/Db4 C#4/Db4 C#4/Db4 rest C#4/Db4 C#4/Db4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 F4 F4 C#4/Db4 C#4/Db4\" \\\n        --note_dur_seq \"0.407140 0.407140 0.376190 0.376190 0.242180 0.242180 0.509550 0.509550 0.183420 0.315400 0.315400 0.235020 0.361660 0.361660 0.223070 0.377270 0.377270 0.340550 0.340550 0.299620 0.299620 0.344510 0.344510 0.283770 0.283770 0.323390 0.323390 0.360340 0.360340\" \\\n        --is_slur_seq \"0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0\" \\\n        --sample_num 1 \\\n        --save_dir \"outputs\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"diffsinger\")\n    results = module.singing_voice_synthesis(\n      inputs={\n        'text': '小酒窝长睫毛AP是你最美的记号',\n        'notes': 'C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4',\n        'notes_duration': '0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340',\n        'input_type': 'word'\n      },\n      sample_num=1,\n      save_audio=True,\n      save_dir='outputs'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def singing_voice_synthesis(\n      inputs: Dict[str, str],\n      sample_num: int = 1,\n      save_audio: bool = True,\n      save_dir: str = 'outputs'\n    ) -> Dict[str, Union[List[List[int]], int]]:\n    ```\n\n    - 歌声合成 API\n\n    - **参数**\n\n      * inputs (Dict\\[str, str\\]): 输入数据，支持如下两种格式；\n\n        ```python\n        {\n          'text': '小酒窝长睫毛AP是你最美的记号',\n          'notes': 
'C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4',\n          'notes_duration': '0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340',\n          'input_type': 'word'\n        }\n        {\n            'text': '小酒窝长睫毛AP是你最美的记号',\n            'ph_seq': 'x iao j iu w o ch ang ang j ie ie m ao AP sh i n i z ui m ei d e j i h ao',\n            'note_seq': 'C#4/Db4 C#4/Db4 F#4/Gb4 F#4/Gb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 F#4/Gb4 F#4/Gb4 F#4/Gb4 C#4/Db4 C#4/Db4 C#4/Db4 rest C#4/Db4 C#4/Db4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 F4 F4 C#4/Db4 C#4/Db4',\n            'note_dur_seq': '0.407140 0.407140 0.376190 0.376190 0.242180 0.242180 0.509550 0.509550 0.183420 0.315400 0.315400 0.235020 0.361660 0.361660 0.223070 0.377270 0.377270 0.340550 0.340550 0.299620 0.299620 0.344510 0.344510 0.283770 0.283770 0.323390 0.323390 0.360340 0.360340',\n            'is_slur_seq': '0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0',\n            'input_type': 'phoneme'\n        }\n        ```\n\n      * sample_num (int): 生成音频的数量；\n      * save_audio (bool): 是否保存音频文件；\n      * save\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (Dict\\[str, Union\\[List\\[List\\[int\\]\\], int\\]\\]): 歌声合成结果，一个字典，包容如下内容；\n\n        * wavs: 歌声音频数据\n        * sample_rate: 音频采样率\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个歌声合成的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m diffsinger\n    ```\n\n    - 这样就完成了一个歌声合成服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    data = {\n        'inputs': {\n                'text': '小酒窝长睫毛AP是你最美的记号',\n                'notes': 'C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | 
F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4',\n                'notes_duration': '0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340',\n                'input_type': 'word'\n            },\n        'save_audio': False,\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/diffsinger\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    results = r.json()['results']\n    ```\n\n## 五、参考资料\n\n* 论文：[DiffSinger: Singing Voice Synthesis via Shallow Diffusion Mechanism](https://arxiv.org/abs/2105.02446)\n\n* 官方实现：[MoonInTheRiver/DiffSinger](https://github.com/MoonInTheRiver/DiffSinger)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install diffsinger==1.0.0\n  ```\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/config_base.yaml",
    "content": "# task\nbinary_data_dir: ''\nwork_dir: '' # experiment directory.\ninfer: false # infer\nseed: 1234\ndebug: false\nsave_codes:\n  - configs\n  - modules\n  - tasks\n  - utils\n  - usr\n\n#############\n# dataset\n#############\nds_workers: 1\ntest_num: 100\nvalid_num: 100\nendless_ds: false\nsort_by_len: true\n\n#########\n# train and eval\n#########\nload_ckpt: ''\nsave_ckpt: true\nsave_best: false\nnum_ckpt_keep: 3\nclip_grad_norm: 0\naccumulate_grad_batches: 1\nlog_interval: 100\nnum_sanity_val_steps: 5  # steps of validation at the beginning\ncheck_val_every_n_epoch: 10\nval_check_interval: 2000\nmax_epochs: 1000\nmax_updates: 160000\nmax_tokens: 31250\nmax_sentences: 100000\nmax_eval_tokens: -1\nmax_eval_sentences: -1\ntest_input_dir: ''\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/singing/base.yaml",
    "content": "base_config:\n  - configs/tts/base.yaml\n  - configs/tts/base_zh.yaml\n\n\ndatasets: []\ntest_prefixes: []\ntest_num: 0\nvalid_num: 0\n\npre_align_cls: data_gen.singing.pre_align.SingingPreAlign\nbinarizer_cls: data_gen.singing.binarize.SingingBinarizer\npre_align_args:\n  use_tone: false # for ZH\n  forced_align: mfa\n  use_sox: true\nhop_size: 128            # Hop size.\nfft_size: 512           # FFT size.\nwin_size: 512           # FFT size.\nmax_frames: 8000\nfmin: 50                 # Minimum freq in mel basis calculation.\nfmax: 11025               # Maximum frequency in mel basis calculation.\npitch_type: frame\n\nhidden_size: 256\nmel_loss: \"ssim:0.5|l1:0.5\"\nlambda_f0: 0.0\nlambda_uv: 0.0\nlambda_energy: 0.0\nlambda_ph_dur: 0.0\nlambda_sent_dur: 0.0\nlambda_word_dur: 0.0\npredictor_grad: 0.0\nuse_spk_embed: true\nuse_spk_id: false\n\nmax_tokens: 20000\nmax_updates: 400000\nnum_spk: 100\nsave_f0: true\nuse_gt_dur: true\nuse_gt_f0: true\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/singing/fs2.yaml",
    "content": "base_config:\n  - configs/tts/fs2.yaml\n  - configs/singing/base.yaml\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/base.yaml",
    "content": "# task\nbase_config: configs/config_base.yaml\ntask_cls: ''\n#############\n# dataset\n#############\nraw_data_dir: ''\nprocessed_data_dir: ''\nbinary_data_dir: ''\ndict_dir: ''\npre_align_cls: ''\nbinarizer_cls: data_gen.tts.base_binarizer.BaseBinarizer\npre_align_args:\n  use_tone: true # for ZH\n  forced_align: mfa\n  use_sox: false\n  txt_processor: en\n  allow_no_txt: false\n  denoise: false\nbinarization_args:\n  shuffle: false\n  with_txt: true\n  with_wav: false\n  with_align: true\n  with_spk_embed: true\n  with_f0: true\n  with_f0cwt: true\n\nloud_norm: false\nendless_ds: true\nreset_phone_dict: true\n\ntest_num: 100\nvalid_num: 100\nmax_frames: 1550\nmax_input_tokens: 1550\naudio_num_mel_bins: 80\naudio_sample_rate: 22050\nhop_size: 256  # For 22050Hz, 275 ~= 12.5 ms (0.0125 * sample_rate)\nwin_size: 1024  # For 22050Hz, 1100 ~= 50 ms (If None, win_size: fft_size) (0.05 * sample_rate)\nfmin: 80  # Set this to 55 if your speaker is male! if female, 95 should help taking off noise. (To test depending on dataset. 
Pitch info: male~[65, 260], female~[100, 525])\nfmax: 7600  # To be increased/reduced depending on data.\nfft_size: 1024  # Extra window size is filled with 0 paddings to match this parameter\nmin_level_db: -100\nnum_spk: 1\nmel_vmin: -6\nmel_vmax: 1.5\nds_workers: 4\n\n#########\n# model\n#########\ndropout: 0.1\nenc_layers: 4\ndec_layers: 4\nhidden_size: 384\nnum_heads: 2\nprenet_dropout: 0.5\nprenet_hidden_size: 256\nstop_token_weight: 5.0\nenc_ffn_kernel_size: 9\ndec_ffn_kernel_size: 9\nffn_act: gelu\nffn_padding: 'SAME'\n\n\n###########\n# optimization\n###########\nlr: 2.0\nwarmup_updates: 8000\noptimizer_adam_beta1: 0.9\noptimizer_adam_beta2: 0.98\nweight_decay: 0\nclip_grad_norm: 1\n\n\n###########\n# train and eval\n###########\nmax_tokens: 30000\nmax_sentences: 100000\nmax_eval_sentences: 1\nmax_eval_tokens: 60000\ntrain_set_name: 'train'\nvalid_set_name: 'valid'\ntest_set_name: 'test'\nvocoder: pwg\nvocoder_ckpt: ''\nprofile_infer: false\nout_wav_norm: false\nsave_gt: false\nsave_f0: false\ngen_dir_name: ''\nuse_denoise: false\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/base_zh.yaml",
    "content": "pre_align_args:\n  txt_processor: zh_g2pM\nbinarizer_cls: data_gen.tts.binarizer_zh.ZhBinarizer\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/fs2.yaml",
    "content": "base_config: configs/tts/base.yaml\ntask_cls: tasks.tts.fs2.FastSpeech2Task\n\n# model\nhidden_size: 256\ndropout: 0.1\nencoder_type: fft # fft|tacotron|tacotron2|conformer\nencoder_K: 8 # for tacotron encoder\ndecoder_type: fft # fft|rnn|conv|conformer\nuse_pos_embed: true\n\n# duration\npredictor_hidden: -1\npredictor_kernel: 5\npredictor_layers: 2\ndur_predictor_kernel: 3\ndur_predictor_layers: 2\npredictor_dropout: 0.5\n\n# pitch and energy\nuse_pitch_embed: true\npitch_type: ph # frame|ph|cwt\nuse_uv: true\ncwt_hidden_size: 128\ncwt_layers: 2\ncwt_loss: l1\ncwt_add_f0_loss: false\ncwt_std_scale: 0.8\n\npitch_ar: false\n#pitch_embed_type: 0q\npitch_loss: 'l1' # l1|l2|ssim\npitch_norm: log\nuse_energy_embed: false\n\n# reference encoder and speaker embedding\nuse_spk_id: false\nuse_split_spk_id: false\nuse_spk_embed: false\nuse_var_enc: false\nlambda_commit: 0.25\nref_norm_layer: bn\npitch_enc_hidden_stride_kernel:\n  - 0,2,5 # conv_hidden_size, conv_stride, conv_kernel_size. conv_hidden_size=0: use hidden_size\n  - 0,2,5\n  - 0,2,5\ndur_enc_hidden_stride_kernel:\n  - 0,2,3 # conv_hidden_size, conv_stride, conv_kernel_size. conv_hidden_size=0: use hidden_size\n  - 0,2,3\n  - 0,1,3\n\n\n# mel\nmel_loss: l1:0.5|ssim:0.5 # l1|l2|gdl|ssim or l1:0.5|ssim:0.5\n\n# loss lambda\nlambda_f0: 1.0\nlambda_uv: 1.0\nlambda_energy: 0.1\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\n\n# train and eval\npretrain_fs_ckpt: ''\nwarmup_updates: 2000\nmax_tokens: 32000\nmax_sentences: 100000\nmax_eval_sentences: 1\nmax_updates: 120000\nnum_valid_plots: 5\nnum_test_samples: 0\ntest_ids: []\nuse_gt_dur: false\nuse_gt_f0: false\n\n# exp\ndur_loss: mse # huber|mol\nnorm_type: gn\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/hifigan.yaml",
    "content": "base_config: configs/tts/pwg.yaml\ntask_cls: tasks.vocoder.hifigan.HifiGanTask\nresblock: \"1\"\nadam_b1: 0.8\nadam_b2: 0.99\nupsample_rates: [ 8,8,2,2 ]\nupsample_kernel_sizes: [ 16,16,4,4 ]\nupsample_initial_channel: 128\nresblock_kernel_sizes: [ 3,7,11 ]\nresblock_dilation_sizes: [ [ 1,3,5 ], [ 1,3,5 ], [ 1,3,5 ] ]\n\nlambda_mel: 45.0\n\nmax_samples: 8192\nmax_sentences: 16\n\ngenerator_params:\n  lr: 0.0002            # Generator's learning rate.\n  aux_context_window: 0 # Context window size for auxiliary feature.\ndiscriminator_optimizer_params:\n  lr: 0.0002            # Discriminator's learning rate.\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/lj/base_mel2wav.yaml",
    "content": "raw_data_dir: 'data/raw/LJSpeech-1.1'\nprocessed_data_dir: 'data/processed/ljspeech'\nbinary_data_dir: 'data/binary/ljspeech_wav'\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/lj/base_text2mel.yaml",
    "content": "raw_data_dir: 'data/raw/LJSpeech-1.1'\nprocessed_data_dir: 'data/processed/ljspeech'\nbinary_data_dir: 'data/binary/ljspeech'\npre_align_cls: data_gen.tts.lj.pre_align.LJPreAlign\n\npitch_type: cwt\nmel_loss: l1\nnum_test_samples: 20\ntest_ids: [ 68, 70, 74, 87, 110, 172, 190, 215, 231, 294,\n            316, 324, 402, 422, 485, 500, 505, 508, 509, 519 ]\nuse_energy_embed: false\ntest_num: 523\nvalid_num: 348\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/lj/fs2.yaml",
    "content": "base_config:\n  - configs/tts/fs2.yaml\n  - configs/tts/lj/base_text2mel.yaml\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/lj/hifigan.yaml",
    "content": "base_config:\n  - configs/tts/hifigan.yaml\n  - configs/tts/lj/base_mel2wav.yaml\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/lj/pwg.yaml",
    "content": "base_config:\n  - configs/tts/pwg.yaml\n  - configs/tts/lj/base_mel2wav.yaml\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/configs/tts/pwg.yaml",
    "content": "base_config: configs/tts/base.yaml\ntask_cls: tasks.vocoder.pwg.PwgTask\n\nbinarization_args:\n  with_wav: true\n  with_spk_embed: false\n  with_align: false\ntest_input_dir: ''\n\n###########\n# train and eval\n###########\nmax_samples: 25600\nmax_sentences: 5\nmax_eval_sentences: 1\nmax_updates: 1000000\nval_check_interval: 2000\n\n\n###########################################################\n#                FEATURE EXTRACTION SETTING               #\n###########################################################\nsampling_rate: 22050     # Sampling rate.\nfft_size: 1024           # FFT size.\nhop_size: 256            # Hop size.\nwin_length: null         # Window length.\n# If set to null, it will be the same as fft_size.\nwindow: \"hann\"           # Window function.\nnum_mels: 80             # Number of mel basis.\nfmin: 80                 # Minimum freq in mel basis calculation.\nfmax: 7600               # Maximum frequency in mel basis calculation.\nformat: \"hdf5\"           # Feature file format. 
\"npy\" or \"hdf5\" is supported.\n\n###########################################################\n#         GENERATOR NETWORK ARCHITECTURE SETTING          #\n###########################################################\ngenerator_params:\n  in_channels: 1        # Number of input channels.\n  out_channels: 1       # Number of output channels.\n  kernel_size: 3        # Kernel size of dilated convolution.\n  layers: 30            # Number of residual block layers.\n  stacks: 3             # Number of stacks i.e., dilation cycles.\n  residual_channels: 64 # Number of channels in residual conv.\n  gate_channels: 128    # Number of channels in gated conv.\n  skip_channels: 64     # Number of channels in skip conv.\n  aux_channels: 80      # Number of channels for auxiliary feature conv.\n  # Must be the same as num_mels.\n  aux_context_window: 2 # Context window size for auxiliary feature.\n  # If set to 2, previous 2 and future 2 frames will be considered.\n  dropout: 0.0          # Dropout rate. 0.0 means no dropout applied.\n  use_weight_norm: true # Whether to use weight norm.\n  # If set to true, it will be applied to all of the conv layers.\n  upsample_net: \"ConvInUpsampleNetwork\" # Upsampling network architecture.\n  upsample_params:                      # Upsampling network parameters.\n    upsample_scales: [4, 4, 4, 4]     # Upsampling scales. 
Product of these must be the same as hop size.\n  use_pitch_embed: false\n\n###########################################################\n#       DISCRIMINATOR NETWORK ARCHITECTURE SETTING        #\n###########################################################\ndiscriminator_params:\n  in_channels: 1        # Number of input channels.\n  out_channels: 1       # Number of output channels.\n  kernel_size: 3        # Kernel size of conv layers.\n  layers: 10            # Number of conv layers.\n  conv_channels: 64     # Number of conv channels.\n  bias: true            # Whether to use bias parameter in conv.\n  use_weight_norm: true # Whether to use weight norm.\n  # If set to true, it will be applied to all of the conv layers.\n  nonlinear_activation: \"LeakyReLU\" # Nonlinear function after each conv.\n  nonlinear_activation_params:      # Nonlinear function parameters.\n    negative_slope: 0.2           # Alpha in LeakyReLU.\n\n###########################################################\n#                   STFT LOSS SETTING                     #\n###########################################################\nstft_loss_params:\n  fft_sizes: [1024, 2048, 512]  # List of FFT size for STFT-based loss.\n  hop_sizes: [120, 240, 50]     # List of hop size for STFT-based loss.\n  win_lengths: [600, 1200, 240] # List of window length for STFT-based loss.\n  window: \"hann_window\"         # Window function for STFT-based loss.\nuse_mel_loss: false\n\n###########################################################\n#               ADVERSARIAL LOSS SETTING                  #\n###########################################################\nlambda_adv: 4.0  # Loss balancing coefficient.\n\n###########################################################\n#             OPTIMIZER & SCHEDULER SETTING               #\n###########################################################\ngenerator_optimizer_params:\n
  lr: 0.0001             # Generator's learning rate.\n  eps: 1.0e-6            # Generator's epsilon.\n  weight_decay: 0.0      # Generator's weight decay coefficient.\ngenerator_scheduler_params:\n  step_size: 200000      # Generator's scheduler step size.\n  gamma: 0.5             # Generator's scheduler gamma.\n  # At each step size, lr will be multiplied by this parameter.\ngenerator_grad_norm: 10    # Generator's gradient norm.\ndiscriminator_optimizer_params:\n  lr: 0.00005            # Discriminator's learning rate.\n  eps: 1.0e-6            # Discriminator's epsilon.\n  weight_decay: 0.0      # Discriminator's weight decay coefficient.\ndiscriminator_scheduler_params:\n  step_size: 200000      # Discriminator's scheduler step size.\n  gamma: 0.5             # Discriminator's scheduler gamma.\n  # At each step size, lr will be multiplied by this parameter.\ndiscriminator_grad_norm: 1 # Discriminator's gradient norm.\ndisc_start_steps: 40000 # Number of steps to start to train discriminator.\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/infer.py",
    "content": "import os\nfrom collections import deque\n\nimport librosa\nimport numpy as np\nimport onnxruntime as rt\nfrom pypinyin import lazy_pinyin\nfrom tqdm import tqdm\n\nfrom .inference.svs.opencpop.map import cpop_pinyin2ph_func\nfrom .utils.hparams import hparams\nfrom .utils.text_encoder import TokenTextEncoder\n\n\nclass Infer:\n\n    def __init__(self, root='.', providers=None):\n        model_dir = os.path.join(root, 'model')\n        if providers is None:\n            providers = rt.get_available_providers()\n        print('Using these as onnxruntime providers:', providers)\n\n        phone_list = [\n            \"AP\", \"SP\", \"a\", \"ai\", \"an\", \"ang\", \"ao\", \"b\", \"c\", \"ch\", \"d\", \"e\", \"ei\", \"en\", \"eng\", \"er\", \"f\", \"g\", \"h\",\n            \"i\", \"ia\", \"ian\", \"iang\", \"iao\", \"ie\", \"in\", \"ing\", \"iong\", \"iu\", \"j\", \"k\", \"l\", \"m\", \"n\", \"o\", \"ong\", \"ou\",\n            \"p\", \"q\", \"r\", \"s\", \"sh\", \"t\", \"u\", \"ua\", \"uai\", \"uan\", \"uang\", \"ui\", \"un\", \"uo\", \"v\", \"van\", \"ve\", \"vn\",\n            \"w\", \"x\", \"y\", \"z\", \"zh\"\n        ]\n        self.ph_encoder = TokenTextEncoder(None, vocab_list=phone_list, replace_oov=',')\n        self.pinyin2phs = cpop_pinyin2ph_func(path=os.path.join(root, 'inference/svs/opencpop/cpop_pinyin2ph.txt'))\n        self.spk_map = {'opencpop': 0}\n\n        options = rt.SessionOptions()\n        for provider in providers:\n            if 'dml' in provider.lower():\n                options.enable_mem_pattern = False\n                options.execution_mode = rt.ExecutionMode.ORT_SEQUENTIAL\n        fs2_path = os.path.join(model_dir, 'fs2.onnx')\n        q_sample_path = os.path.join(model_dir, 'q_sample.onnx')\n        p_sample_path = os.path.join(model_dir, 'p_sample.onnx')\n        pe_path = os.path.join(model_dir, 'pe.onnx')\n        vocoder_path = os.path.join(model_dir, 'vocoder.onnx')\n        self.fs2 = 
rt.InferenceSession(fs2_path, options, providers=providers)\n        self.q_sample = rt.InferenceSession(q_sample_path, options, providers=providers)\n        self.p_sample = rt.InferenceSession(p_sample_path, options, providers=providers)\n        self.pe = rt.InferenceSession(pe_path, options, providers=providers)\n        self.vocoder = rt.InferenceSession(vocoder_path, options, providers=providers)\n\n        self.K_step = hparams['K_step']\n        self.spec_min = np.asarray(hparams['spec_min'], np.float32)[None, None, :hparams['keep_bins']]\n        self.spec_max = np.asarray(hparams['spec_max'], np.float32)[None, None, :hparams['keep_bins']]\n        self.mel_bins = hparams['audio_num_mel_bins']\n        self.use_pe = hparams.get('pe_enable') is not None and hparams['pe_enable']\n\n    def model(self, txt_tokens, **kwargs):\n        fs_input_names = [node.name for node in self.fs2.get_inputs()]\n        inputs = {'txt_tokens': txt_tokens}\n        inputs.update({k: v for k, v in kwargs.items() if isinstance(v, np.ndarray) and k in fs_input_names})\n\n        io_binding = self.fs2.io_binding()\n        for k, v in inputs.items():\n            io_binding.bind_cpu_input(k, v)\n        io_binding.bind_output('decoder_inp')\n        io_binding.bind_output('mel_out')\n        if not self.use_pe:\n            io_binding.bind_output('f0_denorm')\n        self.fs2.run_with_iobinding(io_binding)\n        decoder_inp, mel_out = io_binding.get_outputs()[:2]\n        self.device_name = mel_out.device_name()\n        ret = {'decoder_inp': decoder_inp, 'mel_out': mel_out}\n        if not self.use_pe:\n            ret.update({'f0_denorm': io_binding.get_outputs()[-1]})\n        cond = decoder_inp.numpy().transpose([0, 2, 1])\n\n        ret['fs2_mel'] = ret['mel_out']\n        fs2_mels = mel_out.numpy()\n        t = self.K_step\n        fs2_mels = self.norm_spec(fs2_mels)\n        fs2_mels = fs2_mels.transpose([0, 2, 1])[:, None, :, :]\n\n        io_binding = 
self.q_sample.io_binding()\n        io_binding.bind_cpu_input('x_start', fs2_mels)\n        io_binding.bind_cpu_input('noise', np.random.randn(*fs2_mels.shape).astype(fs2_mels.dtype))\n        io_binding.bind_cpu_input('t', np.asarray([t - 1], dtype=np.int64))\n        io_binding.bind_output('x_next')\n        self.q_sample.run_with_iobinding(io_binding)\n        x = io_binding.get_outputs()[0].numpy()\n        if hparams.get('gaussian_start') is not None and hparams['gaussian_start']:\n            print('===> gaussian start.')\n            shape = (cond.shape[0], 1, self.mel_bins, cond.shape[2])\n            x = np.random.randn(*shape).astype(fs2_mels.dtype)\n\n        cond = rt.OrtValue.ortvalue_from_numpy(cond, mel_out.device_name(), 0)\n        x = rt.OrtValue.ortvalue_from_numpy(x, mel_out.device_name(), 0)\n\n        if hparams.get('pndm_speedup'):\n            self.noise_list = deque(maxlen=4)\n            iteration_interval = hparams['pndm_speedup']\n            interval = rt.OrtValue.ortvalue_from_numpy(np.asarray([iteration_interval], np.int64),\n                                                       mel_out.device_name(), 0)\n            for i in tqdm(reversed(range(0, t, iteration_interval)),\n                          desc='sample time step',\n                          total=t // iteration_interval):\n                io_binding = self.p_sample_plms.io_binding()\n                io_binding.bind_ortvalue_input('x', x)\n                io_binding.bind_cpu_input('noise', np.random.randn(*x.shape()).astype(np.float32))\n                io_binding.bind_ortvalue_input('cond', cond)\n                io_binding.bind_cpu_input('t', np.asarray([i], dtype=np.int64))  # torch i-1 but here i\n                io_binding.bind_ortvalue_input('interval', interval)\n                io_binding.bind_output('x_next')\n                self.p_sample_plms.run_with_iobinding(io_binding)\n                x = io_binding.get_outputs()[0]\n        else:\n            for i in 
tqdm(reversed(range(0, t)), desc='sample time step', total=t):\n                io_binding = self.p_sample.io_binding()\n                io_binding.bind_ortvalue_input('x', x)\n                io_binding.bind_cpu_input('noise', np.random.randn(*x.shape()).astype(np.float32))\n                io_binding.bind_ortvalue_input('cond', cond)\n                io_binding.bind_cpu_input('t', np.asarray([i], dtype=np.int64))  # torch i-1 but here i\n                io_binding.bind_output('x_next')\n                self.p_sample.run_with_iobinding(io_binding)\n                x = io_binding.get_outputs()[0]\n        x = x.numpy()[:, 0].transpose([0, 2, 1])\n        mel2ph = kwargs.get('mel2ph', None)\n        if mel2ph is not None:  # for singing\n            ret['mel_out'] = self.denorm_spec(x) * ((mel2ph > 0).astype(np.float32)[:, :, None])\n        else:\n            ret['mel_out'] = self.denorm_spec(x)\n        return ret\n\n    def norm_spec(self, x):\n        return (x - self.spec_min) / (self.spec_max - self.spec_min) * 2 - 1\n\n    def denorm_spec(self, x):\n        return (x + 1) / 2 * (self.spec_max - self.spec_min) + self.spec_min\n\n    def forward_model(self, inp):\n        sample = self.input_to_batch(inp)\n        txt_tokens = sample['txt_tokens']  # [B, T_t]\n        spk_id = sample.get('spk_ids')\n\n        output = self.model(txt_tokens,\n                            spk_id=spk_id,\n                            ref_mels=None,\n                            infer=True,\n                            pitch_midi=sample['pitch_midi'],\n                            midi_dur=sample['midi_dur'],\n                            is_slur=sample['is_slur'])\n        mel_out = output['mel_out']  # [B, T,80]\n        mel_out = rt.OrtValue.ortvalue_from_numpy(mel_out, self.device_name, 0)\n        if hparams.get('pe_enable') is not None and hparams['pe_enable']:\n            # pe predict from Pred mel\n            io_binding = self.pe.io_binding()\n            
io_binding.bind_ortvalue_input('mel_input', mel_out)\n            io_binding.bind_output('f0_denorm_pred')\n            self.pe.run_with_iobinding(io_binding)\n            f0_pred = io_binding.get_outputs()[0]\n        else:\n            f0_pred = output['f0_denorm']\n        wav_out = self.run_vocoder(mel_out, f0=f0_pred.numpy())\n\n        return wav_out[0]\n\n    def run_vocoder(self, c, **kwargs):\n        # c = c.transpose([0, 2, 1])  # [B, 80, T]\n        f0 = kwargs.get('f0')  # [B, T]\n        if f0 is not None and hparams.get('use_nsf'):\n            y = self.vocoder.run(['wav_out'], {\n                'mel_out': c,\n                'f0': f0,\n            })[0]  # .reshape([-1])\n        else:\n            y = self.vocoder.run(['wav_out'], {\n                'mel_out': c,\n            })[0]  # .reshape([-1])\n            # [T]\n        return y  # [None]\n\n    def preprocess_word_level_input(self, inp):\n        # Pypinyin can't solve polyphonic words\n        text_raw = inp['text'].replace('最长', '最常').replace('长睫毛', '常睫毛') \\\n            .replace('那么长', '那么常').replace('多长', '多常') \\\n            .replace('很长', '很常')  # We hope someone could provide a better g2p module for us by opening pull requests.\n\n        # lyric\n        pinyins = lazy_pinyin(text_raw, strict=False)\n        ph_per_word_lst = [self.pinyin2phs[pinyin.strip()] for pinyin in pinyins if pinyin.strip() in self.pinyin2phs]\n\n        # Note\n        note_per_word_lst = [x.strip() for x in inp['notes'].split('|') if x.strip() != '']\n        mididur_per_word_lst = [x.strip() for x in inp['notes_duration'].split('|') if x.strip() != '']\n\n        if len(note_per_word_lst) == len(ph_per_word_lst) == len(mididur_per_word_lst):\n            print('Pass word-notes check.')\n        else:\n            print('The number of words doesn\\'t match the number of notes\\' windows. 
',\n                  'You should split the note(s) for each word by | mark.')\n            print(ph_per_word_lst, note_per_word_lst, mididur_per_word_lst)\n            print(len(ph_per_word_lst), len(note_per_word_lst), len(mididur_per_word_lst))\n            return None\n\n        note_lst = []\n        ph_lst = []\n        midi_dur_lst = []\n        is_slur = []\n        for idx, ph_per_word in enumerate(ph_per_word_lst):\n            # for phs in one word:\n            # single ph like ['ai']  or multiple phs like ['n', 'i']\n            ph_in_this_word = ph_per_word.split()\n\n            # for notes in one word:\n            # single note like ['D4'] or multiple notes like ['D4', 'E4'] which means a 'slur' here.\n            note_in_this_word = note_per_word_lst[idx].split()\n            midi_dur_in_this_word = mididur_per_word_lst[idx].split()\n            # process for the model input\n            # Step 1.\n            #  Deal with note of 'not slur' case or the first note of 'slur' case\n            #  j        ie\n            #  F#4/Gb4  F#4/Gb4\n            #  0        0\n            for ph in ph_in_this_word:\n                ph_lst.append(ph)\n                note_lst.append(note_in_this_word[0])\n                midi_dur_lst.append(midi_dur_in_this_word[0])\n                is_slur.append(0)\n            # step 2.\n            #  Deal with the 2nd, 3rd... notes of 'slur' case\n            #  j        ie         ie\n            #  F#4/Gb4  F#4/Gb4    C#4/Db4\n            #  0        0          1\n            # is_slur = True, we should repeat the YUNMU to match the 2nd, 3rd... 
notes.\n            if len(note_in_this_word) > 1:\n                for idx in range(1, len(note_in_this_word)):\n                    ph_lst.append(ph_in_this_word[-1])\n                    note_lst.append(note_in_this_word[idx])\n                    midi_dur_lst.append(midi_dur_in_this_word[idx])\n                    is_slur.append(1)\n        ph_seq = ' '.join(ph_lst)\n\n        if len(ph_lst) == len(note_lst) == len(midi_dur_lst):\n            print(len(ph_lst), len(note_lst), len(midi_dur_lst))\n            print('Pass word-notes check.')\n        else:\n            print('The number of words doesn\\'t match the number of notes\\' windows. ',\n                  'You should split the note(s) for each word by | mark.')\n            return None\n        return ph_seq, note_lst, midi_dur_lst, is_slur\n\n    def preprocess_phoneme_level_input(self, inp):\n        ph_seq = inp['ph_seq']\n        note_lst = inp['note_seq'].split()\n        midi_dur_lst = inp['note_dur_seq'].split()\n        is_slur = [float(x) for x in inp['is_slur_seq'].split()]\n        print(len(note_lst), len(ph_seq.split()), len(midi_dur_lst))\n        if len(note_lst) == len(ph_seq.split()) == len(midi_dur_lst):\n            print('Pass word-notes check.')\n        else:\n            print('The number of words doesn\\'t match the number of notes\\' windows. 
',\n                  'You should split the note(s) for each word by | mark.')\n            return None\n        return ph_seq, note_lst, midi_dur_lst, is_slur\n\n    def preprocess_input(self, inp, input_type='word'):\n        \"\"\"\n\n        :param inp: {'text': str, 'item_name': (str, optional), 'spk_name': (str, optional)}\n        :return:\n        \"\"\"\n\n        item_name = inp.get('item_name', '<ITEM_NAME>')\n        spk_name = inp.get('spk_name', 'opencpop')\n\n        # single spk\n        spk_id = self.spk_map[spk_name]\n\n        # get ph seq, note lst, midi dur lst, is slur lst.\n        if input_type == 'word':\n            ret = self.preprocess_word_level_input(inp)\n        # like transcriptions.txt in Opencpop dataset.\n        elif input_type == 'phoneme':\n            ret = self.preprocess_phoneme_level_input(inp)\n        else:\n            print('Invalid input type.')\n            return None\n\n        if ret:\n            ph_seq, note_lst, midi_dur_lst, is_slur = ret\n        else:\n            print('==========> Preprocess_word_level or phone_level input wrong.')\n            return None\n\n        # convert note lst to midi id; convert note dur lst to midi duration\n        try:\n            midis = [librosa.note_to_midi(x.split(\"/\")[0]) if x != 'rest' else 0 for x in note_lst]\n            midi_dur_lst = [float(x) for x in midi_dur_lst]\n        except Exception as e:\n            print(e)\n            print('Invalid Input Type.')\n            return None\n\n        ph_token = self.ph_encoder.encode(ph_seq)\n        item = {\n            'item_name': item_name,\n            'text': inp['text'],\n            'ph': ph_seq,\n            'spk_id': spk_id,\n            'ph_token': ph_token,\n            'pitch_midi': np.asarray(midis),\n            'midi_dur': np.asarray(midi_dur_lst),\n            'is_slur': np.asarray(is_slur),\n        }\n        item['ph_len'] = len(item['ph_token'])\n        return item\n\n    def 
input_to_batch(self, item):\n        item_names = [item['item_name']]\n        text = [item['text']]\n        ph = [item['ph']]\n        txt_tokens = np.int64(item['ph_token'])[None, :]\n        txt_lengths = np.int64([txt_tokens.shape[1]])\n        spk_ids = np.asarray(item['spk_id'], np.int64)[None]\n\n        pitch_midi = np.int64(item['pitch_midi'])[None, :hparams['max_frames']]\n        midi_dur = np.float32(item['midi_dur'])[None, :hparams['max_frames']]\n        is_slur = np.int64(item['is_slur'])[None, :hparams['max_frames']]\n\n        batch = {\n            'item_name': item_names,\n            'text': text,\n            'ph': ph,\n            'txt_tokens': txt_tokens,\n            'txt_lengths': txt_lengths,\n            'spk_ids': spk_ids,\n            'pitch_midi': pitch_midi,\n            'midi_dur': midi_dur,\n            'is_slur': is_slur\n        }\n        return batch\n\n    def infer_once(self, inp):\n        inp = self.preprocess_input(inp, input_type=inp['input_type'] if inp.get('input_type') else 'word')\n        output = self.forward_model(inp)\n        return output\n"
  },
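The `norm_spec`/`denorm_spec` pair in `infer.py` maps mel spectrograms into the [-1, 1] range the diffusion model works in, using the per-bin `spec_min`/`spec_max` bounds from hparams. A minimal NumPy sketch of that affine round trip (the two-bin `spec_min`/`spec_max` values here are illustrative placeholders, not the model's real bounds):

```python
import numpy as np

# Illustrative per-bin bounds; the real values come from hparams['spec_min'] / hparams['spec_max'].
spec_min = np.asarray([-5.0, -4.8], np.float32)[None, None, :]
spec_max = np.asarray([0.2, 0.4], np.float32)[None, None, :]

def norm_spec(x):
    # Affine map per mel bin: [spec_min, spec_max] -> [-1, 1], as in Infer.norm_spec.
    return (x - spec_min) / (spec_max - spec_min) * 2 - 1

def denorm_spec(x):
    # Exact inverse: [-1, 1] -> [spec_min, spec_max], as in Infer.denorm_spec.
    return (x + 1) / 2 * (spec_max - spec_min) + spec_min

# A toy [B, T, num_bins] mel; the round trip recovers it up to float error.
mel = np.random.uniform(-4.8, 0.2, size=(1, 10, 2)).astype(np.float32)
assert np.allclose(denorm_spec(norm_spec(mel)), mel, atol=1e-5)
```

The diffusion sampler operates on the normalized values, which is why `model()` calls `norm_spec` on the FastSpeech2 mel before `q_sample` and `denorm_spec` on the sampled output before vocoding.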
  {
    "path": "modules/audio/svs/diffsinger/inference/svs/opencpop/cpop_pinyin2ph.txt",
    "content": "| a      | a        |\n| ai     | ai       |\n| an     | an       |\n| ang    | ang      |\n| ao     | ao       |\n| ba     | b a      |\n| bai    | b ai     |\n| ban    | b an     |\n| bang   | b ang    |\n| bao    | b ao     |\n| bei    | b ei     |\n| ben    | b en     |\n| beng   | b eng    |\n| bi     | b i      |\n| bian   | b ian    |\n| biao   | b iao    |\n| bie    | b ie     |\n| bin    | b in     |\n| bing   | b ing    |\n| bo     | b o      |\n| bu     | b u      |\n| ca     | c a      |\n| cai    | c ai     |\n| can    | c an     |\n| cang   | c ang    |\n| cao    | c ao     |\n| ce     | c e      |\n| cei    | c ei     |\n| cen    | c en     |\n| ceng   | c eng    |\n| cha    | ch a     |\n| chai   | ch ai    |\n| chan   | ch an    |\n| chang  | ch ang   |\n| chao   | ch ao    |\n| che    | ch e     |\n| chen   | ch en    |\n| cheng  | ch eng   |\n| chi    | ch i     |\n| chong  | ch ong   |\n| chou   | ch ou    |\n| chu    | ch u     |\n| chua   | ch ua    |\n| chuai  | ch uai   |\n| chuan  | ch uan   |\n| chuang | ch uang  |\n| chui   | ch ui    |\n| chun   | ch un    |\n| chuo   | ch uo    |\n| ci     | c i      |\n| cong   | c ong    |\n| cou    | c ou     |\n| cu     | c u      |\n| cuan   | c uan    |\n| cui    | c ui     |\n| cun    | c un     |\n| cuo    | c uo     |\n| da     | d a      |\n| dai    | d ai     |\n| dan    | d an     |\n| dang   | d ang    |\n| dao    | d ao     |\n| de     | d e      |\n| dei    | d ei     |\n| den    | d en     |\n| deng   | d eng    |\n| di     | d i      |\n| dia    | d ia     |\n| dian   | d ian    |\n| diao   | d iao    |\n| die    | d ie     |\n| ding   | d ing    |\n| diu    | d iu     |\n| dong   | d ong    |\n| dou    | d ou     |\n| du     | d u      |\n| duan   | d uan    |\n| dui    | d ui     |\n| dun    | d un     |\n| duo    | d uo     |\n| e      | e        |\n| ei     | ei       |\n| en     | en       |\n| eng    | eng      |\n| er     | er       |\n| fa     | f a      |\n| fan 
   | f an     |\n| fang   | f ang    |\n| fei    | f ei     |\n| fen    | f en     |\n| feng   | f eng    |\n| fo     | f o      |\n| fou    | f ou     |\n| fu     | f u      |\n| ga     | g a      |\n| gai    | g ai     |\n| gan    | g an     |\n| gang   | g ang    |\n| gao    | g ao     |\n| ge     | g e      |\n| gei    | g ei     |\n| gen    | g en     |\n| geng   | g eng    |\n| gong   | g ong    |\n| gou    | g ou     |\n| gu     | g u      |\n| gua    | g ua     |\n| guai   | g uai    |\n| guan   | g uan    |\n| guang  | g uang   |\n| gui    | g ui     |\n| gun    | g un     |\n| guo    | g uo     |\n| ha     | h a      |\n| hai    | h ai     |\n| han    | h an     |\n| hang   | h ang    |\n| hao    | h ao     |\n| he     | h e      |\n| hei    | h ei     |\n| hen    | h en     |\n| heng   | h eng    |\n| hm     | h m      |\n| hng    | h ng     |\n| hong   | h ong    |\n| hou    | h ou     |\n| hu     | h u      |\n| hua    | h ua     |\n| huai   | h uai    |\n| huan   | h uan    |\n| huang  | h uang   |\n| hui    | h ui     |\n| hun    | h un     |\n| huo    | h uo     |\n| ji     | j i      |\n| jia    | j ia     |\n| jian   | j ian    |\n| jiang  | j iang   |\n| jiao   | j iao    |\n| jie    | j ie     |\n| jin    | j in     |\n| jing   | j ing    |\n| jiong  | j iong   |\n| jiu    | j iu     |\n| ju     | j v      |\n| juan   | j van    |\n| jue    | j ve     |\n| jun    | j vn     |\n| ka     | k a      |\n| kai    | k ai     |\n| kan    | k an     |\n| kang   | k ang    |\n| kao    | k ao     |\n| ke     | k e      |\n| kei    | k ei     |\n| ken    | k en     |\n| keng   | k eng    |\n| kong   | k ong    |\n| kou    | k ou     |\n| ku     | k u      |\n| kua    | k ua     |\n| kuai   | k uai    |\n| kuan   | k uan    |\n| kuang  | k uang   |\n| kui    | k ui     |\n| kun    | k un     |\n| kuo    | k uo     |\n| la     | l a      |\n| lai    | l ai     |\n| lan    | l an     |\n| lang   | l ang    |\n| lao    | l ao     |\n| le     | l e      |\n| 
lei    | l ei     |\n| leng   | l eng    |\n| li     | l i      |\n| lia    | l ia     |\n| lian   | l ian    |\n| liang  | l iang   |\n| liao   | l iao    |\n| lie    | l ie     |\n| lin    | l in     |\n| ling   | l ing    |\n| liu    | l iu     |\n| lo     | l o      |\n| long   | l ong    |\n| lou    | l ou     |\n| lu     | l u      |\n| luan   | l uan    |\n| lun    | l un     |\n| luo    | l uo     |\n| lv     | l v      |\n| lve    | l ve     |\n| m      | m        |\n| ma     | m a      |\n| mai    | m ai     |\n| man    | m an     |\n| mang   | m ang    |\n| mao    | m ao     |\n| me     | m e      |\n| mei    | m ei     |\n| men    | m en     |\n| meng   | m eng    |\n| mi     | m i      |\n| mian   | m ian    |\n| miao   | m iao    |\n| mie    | m ie     |\n| min    | m in     |\n| ming   | m ing    |\n| miu    | m iu     |\n| mo     | m o      |\n| mou    | m ou     |\n| mu     | m u      |\n| n      | n        |\n| na     | n a      |\n| nai    | n ai     |\n| nan    | n an     |\n| nang   | n ang    |\n| nao    | n ao     |\n| ne     | n e      |\n| nei    | n ei     |\n| nen    | n en     |\n| neng   | n eng    |\n| ng     | n g      |\n| ni     | n i      |\n| nian   | n ian    |\n| niang  | n iang   |\n| niao   | n iao    |\n| nie    | n ie     |\n| nin    | n in     |\n| ning   | n ing    |\n| niu    | n iu     |\n| nong   | n ong    |\n| nou    | n ou     |\n| nu     | n u      |\n| nuan   | n uan    |\n| nun    | n un     |\n| nuo    | n uo     |\n| nv     | n v      |\n| nve    | n ve     |\n| o      | o        |\n| ou     | ou       |\n| pa     | p a      |\n| pai    | p ai     |\n| pan    | p an     |\n| pang   | p ang    |\n| pao    | p ao     |\n| pei    | p ei     |\n| pen    | p en     |\n| peng   | p eng    |\n| pi     | p i      |\n| pian   | p ian    |\n| piao   | p iao    |\n| pie    | p ie     |\n| pin    | p in     |\n| ping   | p ing    |\n| po     | p o      |\n| pou    | p ou     |\n| pu     | p u      |\n| qi     | q i      
|\n| qia    | q ia     |\n| qian   | q ian    |\n| qiang  | q iang   |\n| qiao   | q iao    |\n| qie    | q ie     |\n| qin    | q in     |\n| qing   | q ing    |\n| qiong  | q iong   |\n| qiu    | q iu     |\n| qu     | q v      |\n| quan   | q van    |\n| que    | q ve     |\n| qun    | q vn     |\n| ran    | r an     |\n| rang   | r ang    |\n| rao    | r ao     |\n| re     | r e      |\n| ren    | r en     |\n| reng   | r eng    |\n| ri     | r i      |\n| rong   | r ong    |\n| rou    | r ou     |\n| ru     | r u      |\n| rua    | r ua     |\n| ruan   | r uan    |\n| rui    | r ui     |\n| run    | r un     |\n| ruo    | r uo     |\n| sa     | s a      |\n| sai    | s ai     |\n| san    | s an     |\n| sang   | s ang    |\n| sao    | s ao     |\n| se     | s e      |\n| sen    | s en     |\n| seng   | s eng    |\n| sha    | sh a     |\n| shai   | sh ai    |\n| shan   | sh an    |\n| shang  | sh ang   |\n| shao   | sh ao    |\n| she    | sh e     |\n| shei   | sh ei    |\n| shen   | sh en    |\n| sheng  | sh eng   |\n| shi    | sh i     |\n| shou   | sh ou    |\n| shu    | sh u     |\n| shua   | sh ua    |\n| shuai  | sh uai   |\n| shuan  | sh uan   |\n| shuang | sh uang  |\n| shui   | sh ui    |\n| shun   | sh un    |\n| shuo   | sh uo    |\n| si     | s i      |\n| song   | s ong    |\n| sou    | s ou     |\n| su     | s u      |\n| suan   | s uan    |\n| sui    | s ui     |\n| sun    | s un     |\n| suo    | s uo     |\n| ta     | t a      |\n| tai    | t ai     |\n| tan    | t an     |\n| tang   | t ang    |\n| tao    | t ao     |\n| te     | t e      |\n| tei    | t ei     |\n| teng   | t eng    |\n| ti     | t i      |\n| tian   | t ian    |\n| tiao   | t iao    |\n| tie    | t ie     |\n| ting   | t ing    |\n| tong   | t ong    |\n| tou    | t ou     |\n| tu     | t u      |\n| tuan   | t uan    |\n| tui    | t ui     |\n| tun    | t un     |\n| tuo    | t uo     |\n| wa     | w a      |\n| wai    | w ai     |\n| wan    | w an     |\n| wang   | w ang   
 |\n| wei    | w ei     |\n| wen    | w en     |\n| weng   | w eng    |\n| wo     | w o      |\n| wu     | w u      |\n| xi     | x i      |\n| xia    | x ia     |\n| xian   | x ian    |\n| xiang  | x iang   |\n| xiao   | x iao    |\n| xie    | x ie     |\n| xin    | x in     |\n| xing   | x ing    |\n| xiong  | x iong   |\n| xiu    | x iu     |\n| xu     | x v      |\n| xuan   | x van    |\n| xue    | x ve     |\n| xun    | x vn     |\n| ya     | y a      |\n| yan    | y an     |\n| yang   | y ang    |\n| yao    | y ao     |\n| ye     | y e      |\n| yi     | y i      |\n| yin    | y in     |\n| ying   | y ing    |\n| yo     | y o      |\n| yong   | y ong    |\n| you    | y ou     |\n| yu     | y v      |\n| yuan   | y van    |\n| yue    | y ve     |\n| yun    | y vn     |\n| za     | z a      |\n| zai    | z ai     |\n| zan    | z an     |\n| zang   | z ang    |\n| zao    | z ao     |\n| ze     | z e      |\n| zei    | z ei     |\n| zen    | z en     |\n| zeng   | z eng    |\n| zha    | zh a     |\n| zhai   | zh ai    |\n| zhan   | zh an    |\n| zhang  | zh ang   |\n| zhao   | zh ao    |\n| zhe    | zh e     |\n| zhei   | zh ei    |\n| zhen   | zh en    |\n| zheng  | zh eng   |\n| zhi    | zh i     |\n| zhong  | zh ong   |\n| zhou   | zh ou    |\n| zhu    | zh u     |\n| zhua   | zh ua    |\n| zhuai  | zh uai   |\n| zhuan  | zh uan   |\n| zhuang | zh uang  |\n| zhui   | zh ui    |\n| zhun   | zh un    |\n| zhuo   | zh uo    |\n| zi     | z i      |\n| zong   | z ong    |\n| zou    | z ou     |\n| zu     | z u      |\n| zuan   | z uan    |\n| zui    | z ui     |\n| zun    | z un     |\n| zuo    | z uo     |\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/inference/svs/opencpop/map.py",
    "content": "def cpop_pinyin2ph_func(path):\n    # In the README file of opencpop dataset, they defined a \"pinyin to phoneme mapping table\"\n    pinyin2phs = {'AP': 'AP', 'SP': 'SP'}\n    with open(path) as rf:\n        for line in rf.readlines():\n            elements = [x.strip() for x in line.split('|') if x.strip() != '']\n            pinyin2phs[elements[0]] = elements[1]\n    return pinyin2phs\n"
  },
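`cpop_pinyin2ph_func` turns the pipe-delimited table above into a pinyin-to-phoneme dict, pre-seeding the `AP` (aspirate) and `SP` (silence) entries. A small self-contained sketch of the same line-splitting logic, run on a few in-memory sample rows instead of the file on disk:

```python
# Sample rows in the cpop_pinyin2ph.txt format: "| pinyin | phonemes |".
sample = (
    "| a      | a        |\n"
    "| ba     | b a      |\n"
    "| zhuang | zh uang  |\n"
)

pinyin2phs = {'AP': 'AP', 'SP': 'SP'}  # pre-seeded exactly as in map.py
for line in sample.splitlines():
    # Split on '|' and drop empty cells left by the leading/trailing pipes.
    elements = [x.strip() for x in line.split('|') if x.strip() != '']
    if elements:
        pinyin2phs[elements[0]] = elements[1]

print(pinyin2phs['zhuang'])  # -> 'zh uang'
```

Words whose pinyin is missing from this table are silently dropped by `preprocess_word_level_input`, since it only keeps pinyins found in the dict.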
  {
    "path": "modules/audio/svs/diffsinger/module.py",
    "content": "import argparse\nimport os\nimport time\nfrom typing import Dict\nfrom typing import List\nfrom typing import Union\n\nfrom .infer import Infer\nfrom .utils.audio import save_wav\nfrom .utils.hparams import hparams\nfrom .utils.hparams import set_hparams\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"diffsinger\",\n            type=\"Audio/svs\",\n            author=\"\",\n            author_email=\"\",\n            summary=\"DiffSinger: Singing Voice Synthesis via Shallow Diffusion Mechanism\",\n            version=\"1.0.0\")\nclass DiffSinger:\n\n    def __init__(self, providers: List[str] = None) -> None:\n        root = self.directory\n        config = os.path.join('model', 'config.yaml')\n        set_hparams(config, root=root)\n        self.infer = Infer(root, providers=providers)\n\n    @serving\n    def singing_voice_synthesis(self,\n                                inputs: Dict[str, str],\n                                sample_num: int = 1,\n                                save_audio: bool = True,\n                                save_dir: str = 'outputs') -> Dict[str, Union[List[List[int]], int]]:\n        '''\n        inputs = {\n            'text': '小酒窝长睫毛AP是你最美的记号',\n            'notes': 'C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4',\n            'notes_duration': '0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340',\n            'input_type': 'word'\n        }  # user input: Chinese characters\n        or,\n        inputs = {\n            'text': '小酒窝长睫毛AP是你最美的记号',\n            'ph_seq': 'x iao j iu w o ch ang ang j ie ie m ao AP sh i n i z ui m ei d e j i h ao',\n            'note_seq': 'C#4/Db4 C#4/Db4 
F#4/Gb4 F#4/Gb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 F#4/Gb4 F#4/Gb4 F#4/Gb4 C#4/Db4 C#4/Db4 C#4/Db4 rest C#4/Db4 C#4/Db4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 F4 F4 C#4/Db4 C#4/Db4',\n            'note_dur_seq': '0.407140 0.407140 0.376190 0.376190 0.242180 0.242180 0.509550 0.509550 0.183420 0.315400 0.315400 0.235020 0.361660 0.361660 0.223070 0.377270 0.377270 0.340550 0.340550 0.299620 0.299620 0.344510 0.344510 0.283770 0.283770 0.323390 0.323390 0.360340 0.360340',\n            'is_slur_seq': '0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0',\n            'input_type': 'phoneme'\n        }  # input like Opencpop dataset.\n        '''\n        outputs = []\n        for i in range(sample_num):\n            output = self.infer.infer_once(inputs)\n            os.makedirs(save_dir, exist_ok=True)\n            if save_audio:\n                save_wav(output, os.path.join(save_dir, '%d_%d.wav' % (i, int(time.time()))),\n                         hparams['audio_sample_rate'])\n            outputs.append(output.tolist())\n        return {'wavs': outputs, 'sample_rate': hparams['audio_sample_rate']}\n\n    @runnable\n    def run_cmd(self, argvs: List[str]) -> str:\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_type',\n                                 type=str,\n                                 choices=['word', 'phoneme'],\n                                 required=True,\n                                 help='input type in [\"word\", \"phoneme\"].')\n        args = self.parser.parse_args(argvs[:2])\n        if args.input_type == 'word':\n            self.arg_input_group = self.parser.add_argument_group(title=\"Input options (type: 
word).\",\n                                                                  description=\"Input options (type: word).\")\n            self.arg_input_group.add_argument('--text', type=str, required=True, help='input text.')\n            self.arg_input_group.add_argument('--notes', type=str, required=True, help='input notes.')\n            self.arg_input_group.add_argument('--notes_duration', type=str, required=True, help='input notes duration.')\n        elif args.input_type == 'phoneme':\n            self.arg_input_group = self.parser.add_argument_group(title=\"Input options (type: phoneme).\",\n                                                                  description=\"Input options (type: phoneme).\")\n            self.arg_input_group.add_argument('--text', type=str, required=True, help='input text.')\n            self.arg_input_group.add_argument('--ph_seq', type=str, required=True, help='input phoneme seq.')\n            self.arg_input_group.add_argument('--note_seq', type=str, required=True, help='input note seq.')\n            self.arg_input_group.add_argument('--note_dur_seq',\n                                              type=str,\n                                              required=True,\n                                              help='input note duration seq.')\n            self.arg_input_group.add_argument('--is_slur_seq',\n                                              type=str,\n                                              required=True,\n                                              help='input if note is slur seq.')\n        else:\n            raise ValueError('Input type (--input_type) should be in [\"word\", \"phoneme\"]')\n        self.parser.add_argument('--sample_num', type=int, default=1, help='sample audios num, default=1')\n        self.parser.add_argument('--save_dir',\n                                 type=str,\n                                 default='outputs',\n                                 help='sample audios save_dir, 
default=\"outputs\"')\n        args = self.parser.parse_args(argvs)\n        kwargs = vars(args).copy()\n        kwargs.pop('sample_num')\n        kwargs.pop('save_dir')\n        self.singing_voice_synthesis(kwargs, sample_num=args.sample_num, save_dir=args.save_dir, save_audio=True)\n        return \"Audios are saved in %s\" % args.save_dir\n"
  },
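The word-level preprocessing in `infer.py` aligns each word's phonemes with its notes in two steps: every phoneme first takes the word's first note with `is_slur=0`, then any extra notes repeat the word's final phoneme (the vowel/YUNMU) with `is_slur=1`. A standalone sketch of that expansion for a single word (the helper name `expand_word` and the note strings are illustrative, not part of the module's API):

```python
def expand_word(phs, notes, durs):
    """Expand one word's phonemes/notes the way preprocess_word_level_input does."""
    ph_lst, note_lst, dur_lst, is_slur = [], [], [], []
    # Step 1: every phoneme of the word gets the word's first note (not a slur).
    for ph in phs:
        ph_lst.append(ph)
        note_lst.append(notes[0])
        dur_lst.append(durs[0])
        is_slur.append(0)
    # Step 2: the 2nd, 3rd... notes repeat the final phoneme and are marked as slurs.
    for i in range(1, len(notes)):
        ph_lst.append(phs[-1])
        note_lst.append(notes[i])
        dur_lst.append(durs[i])
        is_slur.append(1)
    return ph_lst, note_lst, dur_lst, is_slur

# 'jie' -> ['j', 'ie'] sung over two notes: the second note slurs on 'ie'.
phs, notes, durs, slur = expand_word(['j', 'ie'], ['F#4/Gb4', 'C#4/Db4'], ['0.5', '0.3'])
print(phs)   # ['j', 'ie', 'ie']
print(slur)  # [0, 0, 1]
```

This matches the `is_slur_seq` shown in the phoneme-level docstring above, where slurred syllables like 'chang' appear with their vowel repeated and a `1` flag.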
  {
    "path": "modules/audio/svs/diffsinger/requirements.txt",
    "content": "librosa>=0.9.2\nmatplotlib==3.5.3\nnumpy>=1.21.6\npycwt>=0.3.0a22\npypinyin>=0.47.1\nPyYAML>=6.0\nscipy>=1.7.3\nsix>=1.16.0\nsoundfile>=0.11.0\ntqdm>=4.64.1\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/test.py",
    "content": "import shutil\nimport unittest\n\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        cls.module = hub.Module(name=\"diffsinger\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('outputs')\n\n    def test_singing_voice_synthesis1(self):\n        results = self.module.singing_voice_synthesis(inputs={\n            'text': '小酒窝长睫毛AP是你最美的记号',\n            'notes':\n            'C#4/Db4 | F#4/Gb4 | G#4/Ab4 | A#4/Bb4 F#4/Gb4 | F#4/Gb4 C#4/Db4 | C#4/Db4 | rest | C#4/Db4 | A#4/Bb4 | G#4/Ab4 | A#4/Bb4 | G#4/Ab4 | F4 | C#4/Db4',\n            'notes_duration':\n            '0.407140 | 0.376190 | 0.242180 | 0.509550 0.183420 | 0.315400 0.235020 | 0.361660 | 0.223070 | 0.377270 | 0.340550 | 0.299620 | 0.344510 | 0.283770 | 0.323390 | 0.360340',\n            'input_type': 'word'\n        },\n                                                      sample_num=1,\n                                                      save_audio=True,\n                                                      save_dir='outputs')\n        self.assertIsInstance(results, dict)\n        self.assertIsInstance(results['wavs'], list)\n        self.assertIsInstance(results['wavs'][0], list)\n        self.assertEqual(len(results['wavs'][0]), 123776)\n        self.assertEqual(results['sample_rate'], 24000)\n\n    def test_singing_voice_synthesis2(self):\n        results = self.module.singing_voice_synthesis(inputs={\n            'text': '小酒窝长睫毛AP是你最美的记号',\n            'ph_seq': 'x iao j iu w o ch ang ang j ie ie m ao AP sh i n i z ui m ei d e j i h ao',\n            'note_seq':\n            'C#4/Db4 C#4/Db4 F#4/Gb4 F#4/Gb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 F#4/Gb4 F#4/Gb4 F#4/Gb4 C#4/Db4 C#4/Db4 C#4/Db4 rest C#4/Db4 C#4/Db4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 A#4/Bb4 A#4/Bb4 G#4/Ab4 G#4/Ab4 F4 F4 C#4/Db4 C#4/Db4',\n            'note_dur_seq':\n            '0.407140 0.407140 0.376190 0.376190 
0.242180 0.242180 0.509550 0.509550 0.183420 0.315400 0.315400 0.235020 0.361660 0.361660 0.223070 0.377270 0.377270 0.340550 0.340550 0.299620 0.299620 0.344510 0.344510 0.283770 0.283770 0.323390 0.323390 0.360340 0.360340',\n            'is_slur_seq': '0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0',\n            'input_type': 'phoneme'\n        },\n                                                      sample_num=1,\n                                                      save_audio=True,\n                                                      save_dir='outputs')\n        self.assertIsInstance(results, dict)\n        self.assertIsInstance(results['wavs'], list)\n        self.assertIsInstance(results['wavs'][0], list)\n        self.assertEqual(len(results['wavs'][0]), 123776)\n        self.assertEqual(results['sample_rate'], 24000)\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/base.yaml",
    "content": "task_cls: usr.task.DiffFsTask\npitch_type: frame\ntimesteps: 100\ndilation_cycle_length: 1\nresidual_layers: 20\nresidual_channels: 256\nlr: 0.001\ndecay_steps: 50000\nkeep_bins: 80\nspec_min: [ ]\nspec_max: [ ]\n\ncontent_cond_steps: [ ] # [ 0, 10000 ]\nspk_cond_steps: [ ] # [ 0, 10000 ]\n# train and eval\nfs2_ckpt: ''\nmax_updates: 400000\n# max_updates: 200000\nuse_gt_dur: true\nuse_gt_f0: true\ngen_tgt_spk_id: -1\nmax_sentences: 48\nnum_sanity_val_steps: 1\nnum_valid_plots: 1\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/lj_ds_beta6.yaml",
    "content": "base_config:\n  - configs/tts/lj/fs2.yaml\n  - ./base.yaml\n# spec_min and spec_max are calculated on the training set.\nspec_min: [ -4.7574, -4.6783, -4.6431, -4.5832, -4.5390, -4.6771, -4.8089, -4.7672,\n            -4.5784, -4.7755, -4.7150, -4.8919, -4.8271, -4.7389, -4.6047, -4.7759,\n            -4.6799, -4.8201, -4.7823, -4.8262, -4.7857, -4.7545, -4.9358, -4.9733,\n            -5.1134, -5.1395, -4.9016, -4.8434, -5.0189, -4.8460, -5.0529, -4.9510,\n            -5.0217, -5.0049, -5.1831, -5.1445, -5.1015, -5.0281, -4.9887, -4.9916,\n            -4.9785, -4.9071, -4.9488, -5.0342, -4.9332, -5.0650, -4.8924, -5.0875,\n            -5.0483, -5.0848, -5.1809, -5.0677, -5.0015, -5.0792, -5.0636, -5.2413,\n            -5.1421, -5.1710, -5.3256, -5.0511, -5.1186, -5.0057, -5.0446, -5.1173,\n            -5.0325, -5.1085, -5.0053, -5.0755, -5.1176, -5.1004, -5.2153, -5.2757,\n            -5.3025, -5.2867, -5.2918, -5.3328, -5.2731, -5.2985, -5.2400, -5.2211 ]\nspec_max: [ -0.5982, -0.0778,  0.1205,  0.2747,  0.4657,  0.5123,  0.5684,  0.7093,\n            0.6461,  0.6420,  0.7316,  0.7715,  0.7681,  0.8349,  0.7815,  0.7591,\n            0.7910,  0.7433,  0.7352,  0.6869,  0.6854,  0.6623,  0.5353,  0.6492,\n            0.6909,  0.6106,  0.5761,  0.5936,  0.5638,  0.4054,  0.4545,  0.3589,\n            0.3037,  0.3380,  0.1599,  0.2433,  0.2741,  0.2130,  0.1569,  0.1911,\n            0.2324,  0.1586,  0.1221,  0.0341, -0.0558,  0.0553, -0.1153, -0.0933,\n            -0.1171, -0.0050, -0.1519, -0.1629, -0.0522, -0.0739, -0.2069, -0.2405,\n            -0.1244, -0.2116, -0.1361, -0.1575, -0.1442,  0.0513, -0.1567, -0.2000,\n            0.0086, -0.0698,  0.1385,  0.0941,  0.1864,  0.1225,  0.2176,  0.2566,\n            0.1670,  0.1007,  0.1444,  0.0888,  0.1998,  0.2414,  0.2932,  0.3047 ]\n\ntask_cls: usr.diffspeech_task.DiffSpeechTask\nvocoder: vocoders.hifigan.HifiGAN\nvocoder_ckpt: checkpoints/0414_hifi_lj_1\nnum_valid_plots: 10\nuse_gt_dur: 
false\nuse_gt_f0: false\npitch_type: cwt\npitch_extractor: 'parselmouth'\nmax_updates: 160000\nlr: 0.001\ntimesteps: 100\nK_step: 71\ndiff_loss_type: l1\ndiff_decoder_type: 'wavenet'\nschedule_type: 'linear'\nmax_beta: 0.06\nfs2_ckpt: checkpoints/fs2_lj_1/model_ckpt_steps_150000.ckpt\nsave_gt: true\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/cascade/opencs/aux_rel.yaml",
    "content": "base_config:\n  - configs/singing/fs2.yaml\n  - usr/configs/midi/cascade/opencs/opencpop_statis.yaml\n\naudio_sample_rate: 24000\nhop_size: 128            # Hop size.\nfft_size: 512           # FFT size.\nwin_size: 512           # FFT size.\nfmin: 30\nfmax: 12000\nmin_level_db: -120\n\nbinarization_args:\n  with_wav: true\n  with_spk_embed: false\n  with_align: true\nraw_data_dir: 'data/raw/opencpop/segments'\nprocessed_data_dir: 'xxx'\nbinarizer_cls: data_gen.singing.binarize.OpencpopBinarizer\n\n\nbinary_data_dir: 'data/binary/opencpop-midi-dp'\nuse_midi: true  #  for midi exp\nuse_gt_f0: false  #  for midi exp\nuse_gt_dur: false  # for further midi exp\nlambda_f0: 1.0\nlambda_uv: 1.0\n#lambda_energy: 0.1\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\npe_enable: false\npe_ckpt: ''\n\nnum_spk: 1\ntest_prefixes: [\n    '2044',\n    '2086',\n    '2092',\n    '2093',\n    '2100',\n]\n\ntask_cls: usr.diffsinger_task.AuxDecoderMIDITask\n#vocoder: usr.singingvocoder.highgan.HighGAN\n#vocoder_ckpt: checkpoints/h_2_model/checkpoint-530000steps.pkl\nvocoder: vocoders.hifigan.HifiGAN\nvocoder_ckpt: checkpoints/0109_hifigan_bigpopcs_hop128\n\nuse_nsf: true\n\n# config for experiments\nmax_frames: 5000\nmax_tokens: 40000\npredictor_layers: 5\nrel_pos: true\ndur_predictor_layers: 5  # *\n\nuse_spk_embed: false\nnum_valid_plots: 10\nmax_updates: 160000\nsave_gt: true\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/cascade/opencs/ds60_rel.yaml",
    "content": "base_config:\n  - usr/configs/popcs_ds_beta6.yaml\n  - usr/configs/midi/cascade/opencs/opencpop_statis.yaml\n\nbinarizer_cls: data_gen.singing.binarize.OpencpopBinarizer\nbinary_data_dir: 'data/binary/opencpop-midi-dp'\n\n#switch_midi2f0_step: 174000\nuse_midi: true  #  for midi exp\nuse_gt_f0: false  #  for midi exp\nuse_gt_dur: false  # for further midi exp\nlambda_f0: 1.0\nlambda_uv: 1.0\n#lambda_energy: 0.1\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\npe_enable: false\npe_ckpt: ''\n\nfs2_ckpt: 'checkpoints/0302_opencpop_fs_midi/model_ckpt_steps_160000.ckpt'  #\n#num_valid_plots: 0\ntask_cls: usr.diffsinger_task.DiffSingerMIDITask\n\nK_step: 60\nmax_tokens: 40000\npredictor_layers: 5\ndilation_cycle_length: 4  # *\nrel_pos: true\ndur_predictor_layers: 5  # *\nmax_updates: 160000\ngaussian_start: false\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/cascade/opencs/opencpop_statis.yaml",
    "content": "spec_min: [-6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6., -6.,\n           -6., -6., -6., -6., -6., -6., -6., -6.]\nspec_max: [-7.9453e-01, -8.1116e-01, -6.1631e-01, -3.0679e-01, -1.3863e-01,\n           -5.0652e-02, -1.1563e-01, -1.0679e-01, -9.1068e-02, -6.2174e-02,\n           -7.5302e-02, -7.2217e-02, -6.3815e-02, -7.3299e-02,  7.3610e-03,\n           -7.2508e-02, -5.0234e-02, -1.6534e-01, -2.6928e-01, -2.0782e-01,\n           -2.0823e-01, -1.1702e-01, -7.0128e-02, -6.5868e-02, -1.2675e-02,\n           1.5121e-03, -8.9902e-02, -2.1392e-01, -2.3789e-01, -2.8922e-01,\n           -3.0405e-01, -2.3029e-01, -2.2088e-01, -2.1542e-01, -2.9367e-01,\n           -3.0137e-01, -3.8281e-01, -4.3590e-01, -2.8681e-01, -4.6855e-01,\n           -5.7485e-01, -4.7022e-01, -5.4266e-01, -4.4848e-01, -6.4120e-01,\n           -6.8700e-01, -6.4860e-01, -7.6436e-01, -4.9971e-01, -7.1068e-01,\n           -6.9724e-01, -6.1487e-01, -5.5843e-01, -6.9773e-01, -5.7502e-01,\n           -7.0919e-01, -8.2431e-01, -8.4213e-01, -9.0431e-01, -8.2840e-01,\n           -7.7945e-01, -8.2758e-01, -8.7699e-01, -1.0532e+00, -1.0766e+00,\n           -1.1198e+00, -1.0185e+00, -9.8983e-01, -1.0001e+00, -1.0756e+00,\n           -1.0024e+00, -1.0304e+00, -1.0579e+00, -1.0188e+00, -1.0500e+00,\n           -1.0842e+00, -1.0923e+00, -1.1223e+00, -1.2381e+00, -1.6467e+00]\n\nmel_vmin: -6. 
#-6.\nmel_vmax: 1.5\nwav2spec_eps: 1e-6\n\nraw_data_dir: 'data/raw/opencpop/segments'\nprocessed_data_dir: 'xxx'\nbinary_data_dir: 'data/binary/opencpop-midi-dp'\ndatasets: [\n  'opencpop',\n]\ntest_prefixes: [\n    '2044',\n    '2086',\n    '2092',\n    '2093',\n    '2100',\n]\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/e2e/opencpop/ds1000.yaml",
    "content": "base_config:\n  - usr/configs/popcs_ds_beta6.yaml\n  - usr/configs/midi/cascade/opencs/opencpop_statis.yaml\n\nbinarizer_cls: data_gen.singing.binarize.OpencpopBinarizer\nbinary_data_dir: 'data/binary/opencpop-midi-dp'\n\n#switch_midi2f0_step: 174000\nuse_midi: true  #  for midi exp\nuse_gt_dur: false  # for further midi exp\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\ndur_predictor_layers: 5  # *\n\n\nfs2_ckpt: ''  #\n#num_valid_plots: 0\ntask_cls: usr.diffsinger_task.DiffSingerMIDITask\n\n# for diffusion schedule\ntimesteps: 1000\nK_step: 1000\nmax_beta: 0.02\nmax_tokens: 36000\nmax_updates: 320000\ngaussian_start: True\npndm_speedup: 40\n\nuse_pitch_embed: false\nuse_gt_f0: false  #  for midi exp\n\nlambda_f0: 0.\nlambda_uv: 0.\ndilation_cycle_length: 4  # *\nrel_pos: true\npredictor_layers: 5\npe_enable: true\npe_ckpt: 'checkpoints/0102_xiaoma_pe'\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/e2e/opencpop/ds100_adj_rel.yaml",
    "content": "base_config:\n  - usr/configs/popcs_ds_beta6.yaml\n  - usr/configs/midi/cascade/opencs/opencpop_statis.yaml\n\nbinarizer_cls: data_gen.singing.binarize.OpencpopBinarizer\nbinary_data_dir: 'data/binary/opencpop-midi-dp'\n\n#switch_midi2f0_step: 174000\nuse_midi: true  #  for midi exp\nuse_gt_dur: false  # for further midi exp\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\ndur_predictor_layers: 5  # *\n\n\nfs2_ckpt: ''  #\n#num_valid_plots: 0\ntask_cls: usr.diffsinger_task.DiffSingerMIDITask\n\nK_step: 100\nmax_tokens: 40000\nmax_updates: 160000\ngaussian_start: True\n\nuse_pitch_embed: false\nuse_gt_f0: false  #  for midi exp\n\nlambda_f0: 0.\nlambda_uv: 0.\ndilation_cycle_length: 4  # *\nrel_pos: true\npredictor_layers: 5\npe_enable: true\npe_ckpt: 'checkpoints/0102_xiaoma_pe'\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/e2e/popcs/ds100_adj_rel.yaml",
    "content": "base_config:\n  - usr/configs/popcs_ds_beta6.yaml\n  - usr/configs/midi/cascade/popcs/popcs_statis.yaml\n\nbinarizer_cls: data_gen.singing.binarize.MidiSingingBinarizer\nbinary_data_dir: 'data/binary/popcs-midi-dp'\n\n#switch_midi2f0_step: 174000\nuse_midi: true  #  for midi exp\nuse_gt_dur: false  # for further midi exp\nlambda_ph_dur: 1.0\nlambda_sent_dur: 1.0\nlambda_word_dur: 1.0\npredictor_grad: 0.1\ndur_predictor_layers: 5  # *\n\n\nfs2_ckpt: ''  #\n#num_valid_plots: 0\ntask_cls: usr.diffsinger_task.DiffSingerMIDITask\n\nK_step: 100\nmax_tokens: 40000\nmax_updates: 160000\ngaussian_start: True\n\nuse_pitch_embed: false\nuse_gt_f0: false  #  for midi exp\n\nlambda_f0: 0.\nlambda_uv: 0.\ndilation_cycle_length: 4  # *\nrel_pos: true\npredictor_layers: 5\npe_enable: true\npe_ckpt: 'checkpoints/0102_xiaoma_pe'\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/midi/pe.yaml",
    "content": "base_config:\n  - configs/tts/lj/fs2.yaml\n\nmax_frames: 8000\naudio_sample_rate: 24000\nhop_size: 128            # Hop size.\nfft_size: 512           # FFT size.\nwin_size: 512           # FFT size.\nfmin: 30\nfmax: 12000\nmin_level_db: -120\n\nbinary_data_dir: 'xxx'\n\npitch_type: frame\ntask_cls: tasks.tts.pe.PitchExtractionTask\npitch_extractor_conv_layers: 2\n\n\n# config for experiments\nmax_tokens: 20000\nuse_spk_embed: false\nnum_valid_plots: 10\nmax_updates: 60000\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/popcs_ds_beta6.yaml",
    "content": "base_config:\n  - configs/tts/fs2.yaml\n  - configs/singing/base.yaml\n  - ./base.yaml\n\naudio_sample_rate: 24000\nhop_size: 128            # Hop size.\nfft_size: 512           # FFT size.\nwin_size: 512           # FFT size.\nfmin: 30\nfmax: 12000\nmin_level_db: -120\n\nbinarization_args:\n  with_wav: true\n  with_spk_embed: false\n  with_align: true\nraw_data_dir: 'data/raw/popcs'\nprocessed_data_dir: 'data/processed/popcs'\nbinary_data_dir: 'data/binary/popcs-pmf0'\nnum_spk: 1\ndatasets: [\n  'popcs',\n]\ntest_prefixes: [\n  'popcs-说散就散',\n  'popcs-隐形的翅膀',\n]\n\nspec_min: [-6.8276, -7.0270, -6.8142, -7.1429, -7.6669, -7.6000, -7.1148, -6.9640,\n           -6.8414, -6.6596, -6.6880, -6.7439, -6.7986, -7.4940, -7.7845, -7.6586,\n           -6.9288, -6.7639, -6.9118, -6.8246, -6.7183, -7.1769, -6.9794, -7.4513,\n           -7.3422, -7.5623, -6.9610, -6.8158, -6.9595, -6.8403, -6.5688, -6.6356,\n           -7.0209, -6.5002, -6.7819, -6.5232, -6.6927, -6.5701, -6.5531, -6.7069,\n           -6.6462, -6.4523, -6.5954, -6.4264, -6.4487, -6.7070, -6.4025, -6.3042,\n           -6.4008, -6.3857, -6.3903, -6.3094, -6.2491, -6.3518, -6.3566, -6.4168,\n           -6.2481, -6.3624, -6.2858, -6.2575, -6.3638, -6.4520, -6.1835, -6.2754,\n           -6.1253, -6.1645, -6.0638, -6.1262, -6.0710, -6.1039, -6.4428, -6.1363,\n           -6.1054, -6.1252, -6.1797, -6.0235, -6.0758, -5.9453, -6.0213, -6.0446]\nspec_max: [ 0.2645,  0.0583, -0.2344, -0.0184,  0.1227,  0.1533,  0.1103,  0.1212,\n            0.2421,  0.1809,  0.2134,  0.3161,  0.3301,  0.3289,  0.2667,  0.2421,\n            0.2581,  0.2600,  0.1394,  0.1907,  0.1082,  0.1474,  0.1680,  0.2550,\n            0.1057,  0.0826,  0.0423,  0.1203, -0.0701, -0.0056,  0.0477, -0.0639,\n            -0.0272, -0.0728, -0.1648, -0.0855, -0.2652, -0.1998, -0.1547, -0.2167,\n            -0.4181, -0.5463, -0.4161, -0.4733, -0.6518, -0.5387, -0.4290, -0.4191,\n            -0.4151, -0.3042, -0.3810, -0.4160, -0.4496, 
-0.2847, -0.4676, -0.4658,\n            -0.4931, -0.4885, -0.5547, -0.5481, -0.6948, -0.7968, -0.8455, -0.8392,\n            -0.8770, -0.9520, -0.8749, -0.7297, -0.8374, -0.8667, -0.7157, -0.9035,\n            -0.9219, -0.8801, -0.9298, -0.9009, -0.9604, -1.0537, -1.0781, -1.3766]\n\ntask_cls: usr.diffsinger_task.DiffSingerTask\n#vocoder: usr.singingvocoder.highgan.HighGAN\n#vocoder_ckpt: checkpoints/h_2_model/checkpoint-530000steps.pkl\nvocoder: vocoders.hifigan.HifiGAN\nvocoder_ckpt: checkpoints/0109_hifigan_bigpopcs_hop128\n\npitch_extractor: 'parselmouth'\n# config for experiments\nuse_spk_embed: false\nnum_valid_plots: 10\nmax_updates: 160000\nlr: 0.001\ntimesteps: 100\nK_step: 51\ndiff_loss_type: l1\ndiff_decoder_type: 'wavenet'\nschedule_type: 'linear'\nmax_beta: 0.06\nfs2_ckpt: ''\nuse_nsf: true\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/popcs_ds_beta6_offline.yaml",
    "content": "base_config:\n  - ./popcs_ds_beta6.yaml\n\nfs2_ckpt: checkpoints/popcs_fs2_pmf0_1230/model_ckpt_steps_160000.ckpt  # to be infer\nnum_valid_plots: 0\ntask_cls: usr.diffsinger_task.DiffSingerOfflineTask\n\n# tmp:\n#pe_enable: true\n#pe_ckpt: ''\nvocoder: vocoders.hifigan.HifiGAN\nvocoder_ckpt: checkpoints/0109_hifigan_bigpopcs_hop128\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/usr/configs/popcs_fs2.yaml",
    "content": "base_config:\n  - configs/singing/fs2.yaml\n\naudio_sample_rate: 24000\nhop_size: 128            # Hop size.\nfft_size: 512           # FFT size.\nwin_size: 512           # FFT size.\nfmin: 30\nfmax: 12000\nmin_level_db: -120\n\nbinarization_args:\n  with_wav: true\n  with_spk_embed: false\n  with_align: true\nraw_data_dir: 'data/raw/popcs'\nprocessed_data_dir: 'data/processed/popcs'\nbinary_data_dir: 'data/binary/popcs-pmf0'\nnum_spk: 1\ndatasets: [\n  'popcs',\n]\ntest_prefixes: [\n  'popcs-说散就散',\n  'popcs-隐形的翅膀',\n]\n\ntask_cls: tasks.tts.fs2.FastSpeech2Task\n#vocoder: usr.singingvocoder.highgan.HighGAN\n#vocoder_ckpt: checkpoints/h_2_model/checkpoint-530000steps.pkl\nvocoder: vocoders.hifigan.HifiGAN\nvocoder_ckpt: checkpoints/0109_hifigan_bigpopcs_hop128\nuse_nsf: true\n\n# config for experiments\nmax_tokens: 18000\nuse_spk_embed: false\nnum_valid_plots: 10\nmax_updates: 160000\nsave_gt: true\n\n# tmp:\n#pe_enable: true\n#pe_ckpt: ''\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/__init__.py",
    "content": "import sys\nimport time\nimport types\n\nimport numpy as np\n\n\nclass AvgrageMeter(object):\n\n    def __init__(self):\n        self.reset()\n\n    def reset(self):\n        self.avg = 0\n        self.sum = 0\n        self.cnt = 0\n\n    def update(self, val, n=1):\n        self.sum += val * n\n        self.cnt += n\n        self.avg = self.sum / self.cnt\n\n\ndef collate_1d(values, pad_idx=0, left_pad=False, shift_right=False, max_len=None, shift_id=1):\n    \"\"\"Convert a list of 1d tensors into a padded 2d tensor.\"\"\"\n    size = max(v.size(0) for v in values) if max_len is None else max_len\n    res = values[0].new(len(values), size).fill_(pad_idx)\n\n    def copy_tensor(src, dst):\n        assert dst.numel() == src.numel()\n        if shift_right:\n            dst[1:] = src[:-1]\n            dst[0] = shift_id\n        else:\n            dst.copy_(src)\n\n    for i, v in enumerate(values):\n        copy_tensor(v, res[i][size - len(v):] if left_pad else res[i][:len(v)])\n    return res\n\n\ndef collate_2d(values, pad_idx=0, left_pad=False, shift_right=False, max_len=None):\n    \"\"\"Convert a list of 2d tensors into a padded 3d tensor.\"\"\"\n    size = max(v.size(0) for v in values) if max_len is None else max_len\n    res = values[0].new(len(values), size, values[0].shape[1]).fill_(pad_idx)\n\n    def copy_tensor(src, dst):\n        assert dst.numel() == src.numel()\n        if shift_right:\n            dst[1:] = src[:-1]\n        else:\n            dst.copy_(src)\n\n    for i, v in enumerate(values):\n        copy_tensor(v, res[i][size - len(v):] if left_pad else res[i][:len(v)])\n    return res\n\n\ndef _is_batch_full(batch, num_tokens, max_tokens, max_sentences):\n    if len(batch) == 0:\n        return 0\n    if len(batch) == max_sentences:\n        return 1\n    if num_tokens > max_tokens:\n        return 1\n    return 0\n\n\ndef batch_by_size(indices,\n                  num_tokens_fn,\n                  max_tokens=None,\n             
     max_sentences=None,\n                  required_batch_size_multiple=1,\n                  distributed=False):\n    \"\"\"\n    Yield mini-batches of indices bucketed by size. Batches may contain\n    sequences of different lengths.\n\n    Args:\n        indices (List[int]): ordered list of dataset indices\n        num_tokens_fn (callable): function that returns the number of tokens at\n            a given index\n        max_tokens (int, optional): max number of tokens in each batch\n            (default: None).\n        max_sentences (int, optional): max number of sentences in each\n            batch (default: None).\n        required_batch_size_multiple (int, optional): require batch size to\n            be a multiple of N (default: 1).\n    \"\"\"\n    max_tokens = max_tokens if max_tokens is not None else sys.maxsize\n    max_sentences = max_sentences if max_sentences is not None else sys.maxsize\n    bsz_mult = required_batch_size_multiple\n\n    if isinstance(indices, types.GeneratorType):\n        indices = np.fromiter(indices, dtype=np.int64, count=-1)\n\n    sample_len = 0\n    sample_lens = []\n    batch = []\n    batches = []\n    for i in range(len(indices)):\n        idx = indices[i]\n        num_tokens = num_tokens_fn(idx)\n        sample_lens.append(num_tokens)\n        sample_len = max(sample_len, num_tokens)\n        assert sample_len <= max_tokens, (\"sentence at index {} of size {} exceeds max_tokens \"\n                                          \"limit of {}!\".format(idx, sample_len, max_tokens))\n        num_tokens = (len(batch) + 1) * sample_len\n\n        if _is_batch_full(batch, num_tokens, max_tokens, max_sentences):\n            mod_len = max(\n                bsz_mult * (len(batch) // bsz_mult),\n                len(batch) % bsz_mult,\n            )\n            batches.append(batch[:mod_len])\n            batch = batch[mod_len:]\n            sample_lens = sample_lens[mod_len:]\n            sample_len = max(sample_lens) if 
len(sample_lens) > 0 else 0\n        batch.append(idx)\n    if len(batch) > 0:\n        batches.append(batch)\n    return batches\n\n\ndef unpack_dict_to_list(samples):\n    samples_ = []\n    bsz = samples.get('outputs').size(0)\n    for i in range(bsz):\n        res = {}\n        for k, v in samples.items():\n            try:\n                res[k] = v[i]\n            except:\n                pass\n        samples_.append(res)\n    return samples_\n\n\ndef remove_padding(x, padding_idx=0):\n    if x is None:\n        return None\n    assert len(x.shape) in [1, 2]\n    if len(x.shape) == 2:  # [T, H]\n        return x[np.abs(x).sum(-1) != padding_idx]\n    elif len(x.shape) == 1:  # [T]\n        return x[x != padding_idx]\n\n\nclass Timer:\n    timer_map = {}\n\n    def __init__(self, name, print_time=False):\n        if name not in Timer.timer_map:\n            Timer.timer_map[name] = 0\n        self.name = name\n        self.print_time = print_time\n\n    def __enter__(self):\n        self.t = time.time()\n\n    def __exit__(self, exc_type, exc_val, exc_tb):\n        Timer.timer_map[self.name] += time.time() - self.t\n        if self.print_time:\n            print(self.name, Timer.timer_map[self.name])\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/audio.py",
    "content": "import subprocess\n\nimport matplotlib\n\nmatplotlib.use('Agg')\nimport librosa\nimport librosa.filters\nimport numpy as np\nfrom scipy import signal\nfrom scipy.io import wavfile\n\n\ndef save_wav(wav, path, sr, norm=False):\n    if norm:\n        wav = wav / np.abs(wav).max()\n    wav *= 32767\n    # proposed by @dsmiller\n    wavfile.write(path, sr, wav.astype(np.int16))\n\n\ndef get_hop_size(hparams):\n    hop_size = hparams['hop_size']\n    if hop_size is None:\n        assert hparams['frame_shift_ms'] is not None\n        hop_size = int(hparams['frame_shift_ms'] / 1000 * hparams['audio_sample_rate'])\n    return hop_size\n\n\n###########################################################################################\ndef _stft(y, hparams):\n    return librosa.stft(y=y,\n                        n_fft=hparams['fft_size'],\n                        hop_length=get_hop_size(hparams),\n                        win_length=hparams['win_size'],\n                        pad_mode='constant')\n\n\ndef _istft(y, hparams):\n    return librosa.istft(y, hop_length=get_hop_size(hparams), win_length=hparams['win_size'])\n\n\ndef librosa_pad_lr(x, fsize, fshift, pad_sides=1):\n    '''compute right padding (final frame) or both sides padding (first and final frames)\n    '''\n    assert pad_sides in (1, 2)\n    # return int(fsize // 2)\n    pad = (x.shape[0] // fshift + 1) * fshift - x.shape[0]\n    if pad_sides == 1:\n        return 0, pad\n    else:\n        return pad // 2, pad // 2 + pad % 2\n\n\n# Conversions\ndef amp_to_db(x):\n    return 20 * np.log10(np.maximum(1e-5, x))\n\n\ndef normalize(S, hparams):\n    return (S - hparams['min_level_db']) / -hparams['min_level_db']\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/cwt.py",
    "content": "import librosa\nimport numpy as np\nfrom pycwt import wavelet\nfrom scipy.interpolate import interp1d\n\n\ndef load_wav(wav_file, sr):\n    wav, _ = librosa.load(wav_file, sr=sr, mono=True)\n    return wav\n\n\ndef convert_continuos_f0(f0):\n    '''CONVERT F0 TO CONTINUOUS F0\n    Args:\n        f0 (ndarray): original f0 sequence with the shape (T)\n    Return:\n        (ndarray): continuous f0 with the shape (T)\n    '''\n    # get uv information as binary\n    f0 = np.copy(f0)\n    uv = np.float32(f0 != 0)\n\n    # get start and end of f0\n    if (f0 == 0).all():\n        print(\"| all of the f0 values are 0.\")\n        return uv, f0\n    start_f0 = f0[f0 != 0][0]\n    end_f0 = f0[f0 != 0][-1]\n\n    # padding start and end of f0 sequence\n    start_idx = np.where(f0 == start_f0)[0][0]\n    end_idx = np.where(f0 == end_f0)[0][-1]\n    f0[:start_idx] = start_f0\n    f0[end_idx:] = end_f0\n\n    # get non-zero frame index\n    nz_frames = np.where(f0 != 0)[0]\n\n    # perform linear interpolation\n    f = interp1d(nz_frames, f0[nz_frames])\n    cont_f0 = f(np.arange(0, f0.shape[0]))\n\n    return uv, cont_f0\n\n\ndef get_cont_lf0(f0, frame_period=5.0):\n    uv, cont_f0_lpf = convert_continuos_f0(f0)\n    # cont_f0_lpf = low_pass_filter(cont_f0_lpf, int(1.0 / (frame_period * 0.001)), cutoff=20)\n    cont_lf0_lpf = np.log(cont_f0_lpf)\n    return uv, cont_lf0_lpf\n\n\ndef get_lf0_cwt(lf0):\n    '''\n    input:\n        signal of shape (N)\n    output:\n        Wavelet_lf0 of shape(10, N), scales of shape(10)\n    '''\n    mother = wavelet.MexicanHat()\n    dt = 0.005\n    dj = 1\n    s0 = dt * 2\n    J = 9\n\n    Wavelet_lf0, scales, _, _, _, _ = wavelet.cwt(np.squeeze(lf0), dt, dj, s0, J, mother)\n    # Wavelet.shape => (J + 1, len(lf0))\n    Wavelet_lf0 = np.real(Wavelet_lf0).T\n    return Wavelet_lf0, scales\n\n\ndef norm_scale(Wavelet_lf0):\n    Wavelet_lf0_norm = np.zeros((Wavelet_lf0.shape[0], Wavelet_lf0.shape[1]))\n    mean = 
Wavelet_lf0.mean(0)[None, :]\n    std = Wavelet_lf0.std(0)[None, :]\n    Wavelet_lf0_norm = (Wavelet_lf0 - mean) / std\n    return Wavelet_lf0_norm, mean, std\n\n\ndef normalize_cwt_lf0(f0, mean, std):\n    uv, cont_lf0_lpf = get_cont_lf0(f0)\n    cont_lf0_norm = (cont_lf0_lpf - mean) / std\n    Wavelet_lf0, scales = get_lf0_cwt(cont_lf0_norm)\n    Wavelet_lf0_norm, _, _ = norm_scale(Wavelet_lf0)\n\n    return Wavelet_lf0_norm\n\n\ndef get_lf0_cwt_norm(f0s, mean, std):\n    uvs = list()\n    cont_lf0_lpfs = list()\n    cont_lf0_lpf_norms = list()\n    Wavelet_lf0s = list()\n    Wavelet_lf0s_norm = list()\n    scaless = list()\n\n    means = list()\n    stds = list()\n    for f0 in f0s:\n        uv, cont_lf0_lpf = get_cont_lf0(f0)\n        cont_lf0_lpf_norm = (cont_lf0_lpf - mean) / std\n\n        Wavelet_lf0, scales = get_lf0_cwt(cont_lf0_lpf_norm)  # [560,10]\n        Wavelet_lf0_norm, mean_scale, std_scale = norm_scale(Wavelet_lf0)  # [560,10],[1,10],[1,10]\n\n        Wavelet_lf0s_norm.append(Wavelet_lf0_norm)\n        uvs.append(uv)\n        cont_lf0_lpfs.append(cont_lf0_lpf)\n        cont_lf0_lpf_norms.append(cont_lf0_lpf_norm)\n        Wavelet_lf0s.append(Wavelet_lf0)\n        scaless.append(scales)\n        means.append(mean_scale)\n        stds.append(std_scale)\n\n    return Wavelet_lf0s_norm, scaless, means, stds\n\n\ndef inverse_cwt(Wavelet_lf0, scales):\n    b = ((np.arange(0, len(scales))[None, None, :] + 1 + 2.5)**(-2.5))\n    lf0_rec = Wavelet_lf0 * b\n    lf0_rec_sum = lf0_rec.sum(-1)\n    lf0_rec_sum = (lf0_rec_sum - lf0_rec_sum.mean(-1, keepdims=True)) / lf0_rec_sum.std(-1, keepdims=True)\n    return lf0_rec_sum\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/hparams.py",
    "content": "import argparse\nimport os\n\nimport yaml\n\nglobal_print_hparams = True\nhparams = {}\n\n\nclass Args:\n\n    def __init__(self, **kwargs):\n        for k, v in kwargs.items():\n            self.__setattr__(k, v)\n\n\ndef override_config(old_config: dict, new_config: dict):\n    for k, v in new_config.items():\n        if isinstance(v, dict) and k in old_config:\n            override_config(old_config[k], new_config[k])\n        else:\n            old_config[k] = v\n\n\ndef set_hparams(config='', exp_name='', hparams_str='', print_hparams=True, global_hparams=True, root='.'):\n    if config == '' and exp_name == '':\n        parser = argparse.ArgumentParser(description='neural music')\n        parser.add_argument('--config', type=str, default='', help='location of the data corpus')\n        parser.add_argument('--exp_name', type=str, default='', help='exp_name')\n        parser.add_argument('--hparams', type=str, default='', help='location of the data corpus')\n        parser.add_argument('--infer', action='store_true', help='infer')\n        parser.add_argument('--validate', action='store_true', help='validate')\n        parser.add_argument('--reset', action='store_true', help='reset hparams')\n        parser.add_argument('--debug', action='store_true', help='debug')\n        args, unknown = parser.parse_known_args()\n    else:\n        args = Args(config=config,\n                    exp_name=exp_name,\n                    hparams=hparams_str,\n                    infer=False,\n                    validate=False,\n                    reset=False,\n                    debug=False)\n    args_work_dir = ''\n    if args.exp_name != '':\n        args.work_dir = args.exp_name\n        args_work_dir = f'checkpoints/{args.work_dir}'\n\n    config_chains = []\n    loaded_config = set()\n\n    def load_config(config_fn):  # deep first\n        with open(os.path.join(root, config_fn)) as f:\n            hparams_ = yaml.safe_load(f)\n        
loaded_config.add(config_fn)\n        if 'base_config' in hparams_:\n            ret_hparams = {}\n            if not isinstance(hparams_['base_config'], list):\n                hparams_['base_config'] = [hparams_['base_config']]\n            for c in hparams_['base_config']:\n                if c not in loaded_config:\n                    if c.startswith('.'):\n                        c = f'{os.path.dirname(config_fn)}/{c}'\n                        c = os.path.normpath(c)\n                    override_config(ret_hparams, load_config(c))\n            override_config(ret_hparams, hparams_)\n        else:\n            ret_hparams = hparams_\n        config_chains.append(config_fn)\n        return ret_hparams\n\n    global hparams\n    assert args.config != '' or args_work_dir != ''\n    saved_hparams = {}\n    if args_work_dir != 'checkpoints/':\n        ckpt_config_path = f'{args_work_dir}/config.yaml'\n        if os.path.exists(ckpt_config_path):\n            try:\n                with open(ckpt_config_path) as f:\n                    saved_hparams.update(yaml.safe_load(f))\n            except:\n                pass\n        if args.config == '':\n            args.config = ckpt_config_path\n\n    hparams_ = {}\n\n    hparams_.update(load_config(args.config))\n\n    if not args.reset:\n        hparams_.update(saved_hparams)\n    hparams_['work_dir'] = args_work_dir\n\n    if args.hparams != \"\":\n        for new_hparam in args.hparams.split(\",\"):\n            k, v = new_hparam.split(\"=\")\n            if v in ['True', 'False'] or type(hparams_[k]) == bool:\n                hparams_[k] = eval(v)\n            else:\n                hparams_[k] = type(hparams_[k])(v)\n\n    if args_work_dir != '' and (not os.path.exists(ckpt_config_path) or args.reset) and not args.infer:\n        os.makedirs(hparams_['work_dir'], exist_ok=True)\n        with open(ckpt_config_path, 'w') as f:\n            yaml.safe_dump(hparams_, f)\n\n    hparams_['infer'] = args.infer\n    
hparams_['debug'] = args.debug\n    hparams_['validate'] = args.validate\n    global global_print_hparams\n    if global_hparams:\n        hparams.clear()\n        hparams.update(hparams_)\n\n    if print_hparams and global_print_hparams and global_hparams:\n        print('| Hparams chains: ', config_chains)\n        print('| Hparams: ')\n        for i, (k, v) in enumerate(sorted(hparams_.items())):\n            print(f\"\\033[;33;m{k}\\033[0m: {v}, \", end=\"\\n\" if i % 5 == 4 else \"\")\n        print(\"\")\n        global_print_hparams = False\n    # print(hparams_.keys())\n    if hparams.get('exp_name') is None:\n        hparams['exp_name'] = args.exp_name\n    if hparams_.get('exp_name') is None:\n        hparams_['exp_name'] = args.exp_name\n    return hparams_\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/multiprocess_utils.py",
    "content": "import os\nimport traceback\nfrom multiprocessing import Process\nfrom multiprocessing import Queue\n\n\ndef chunked_worker(worker_id, map_func, args, results_queue=None, init_ctx_func=None):\n    ctx = init_ctx_func(worker_id) if init_ctx_func is not None else None\n    for job_idx, arg in args:\n        try:\n            if ctx is not None:\n                res = map_func(*arg, ctx=ctx)\n            else:\n                res = map_func(*arg)\n            results_queue.put((job_idx, res))\n        except:\n            traceback.print_exc()\n            results_queue.put((job_idx, None))\n\n\ndef chunked_multiprocess_run(map_func, args, num_workers=None, ordered=True, init_ctx_func=None, q_max_size=1000):\n    args = zip(range(len(args)), args)\n    args = list(args)\n    n_jobs = len(args)\n    if num_workers is None:\n        num_workers = int(os.getenv('N_PROC', os.cpu_count()))\n    results_queues = []\n    if ordered:\n        for i in range(num_workers):\n            results_queues.append(Queue(maxsize=q_max_size // num_workers))\n    else:\n        results_queue = Queue(maxsize=q_max_size)\n        for i in range(num_workers):\n            results_queues.append(results_queue)\n    workers = []\n    for i in range(num_workers):\n        args_worker = args[i::num_workers]\n        p = Process(target=chunked_worker,\n                    args=(i, map_func, args_worker, results_queues[i], init_ctx_func),\n                    daemon=True)\n        workers.append(p)\n        p.start()\n    for n_finished in range(n_jobs):\n        results_queue = results_queues[n_finished % num_workers]\n        job_idx, res = results_queue.get()\n        assert job_idx == n_finished or not ordered, (job_idx, n_finished)\n        yield res\n    for w in workers:\n        w.join()\n        w.close()\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/text_encoder.py",
    "content": "import re\n\nimport six\n\nPAD = \"<pad>\"\nEOS = \"<EOS>\"\nUNK = \"<UNK>\"\nSEG = \"|\"\nRESERVED_TOKENS = [PAD, EOS, UNK]\nNUM_RESERVED_TOKENS = len(RESERVED_TOKENS)\nPAD_ID = RESERVED_TOKENS.index(PAD)  # Normally 0\nEOS_ID = RESERVED_TOKENS.index(EOS)  # Normally 1\nUNK_ID = RESERVED_TOKENS.index(UNK)  # Normally 2\n\nif six.PY2:\n    RESERVED_TOKENS_BYTES = RESERVED_TOKENS\nelse:\n    # Keep this list in sync with RESERVED_TOKENS; omitting UNK would make\n    # ByteTextEncoder.decode raise IndexError for UNK_ID.\n    RESERVED_TOKENS_BYTES = [bytes(t, \"ascii\") for t in RESERVED_TOKENS]\n\n# Regular expression for unescaping token strings.\n# '\\u' is converted to '_'\n# '\\\\' is converted to '\\'\n# '\\213;' is converted to unichr(213)\n_UNESCAPE_REGEX = re.compile(r\"\\\\u|\\\\\\\\|\\\\([0-9]+);\")\n_ESCAPE_CHARS = set(u\"\\\\_u;0123456789\")\n\n\ndef strip_ids(ids, ids_to_strip):\n    \"\"\"Strip ids_to_strip from the end of ids.\"\"\"\n    ids = list(ids)\n    while ids and ids[-1] in ids_to_strip:\n        ids.pop()\n    return ids\n\n\nclass TextEncoder(object):\n    \"\"\"Base class for converting from ints to/from human readable strings.\"\"\"\n\n    def __init__(self, num_reserved_ids=NUM_RESERVED_TOKENS):\n        self._num_reserved_ids = num_reserved_ids\n\n    @property\n    def num_reserved_ids(self):\n        return self._num_reserved_ids\n\n    def encode(self, s):\n        \"\"\"Transform a human-readable string into a sequence of int ids.\n\n        The ids should be in the range [num_reserved_ids, vocab_size). 
Ids [0,\n        num_reserved_ids) are reserved.\n\n        EOS is not appended.\n\n        Args:\n        s: human-readable string to be converted.\n\n        Returns:\n        ids: list of integers\n        \"\"\"\n        return [int(w) + self._num_reserved_ids for w in s.split()]\n\n    def decode(self, ids, strip_extraneous=False):\n        \"\"\"Transform a sequence of int ids into a human-readable string.\n\n        EOS is not expected in ids.\n\n        Args:\n        ids: list of integers to be converted.\n        strip_extraneous: bool, whether to strip off extraneous tokens\n            (EOS and PAD).\n\n        Returns:\n        s: human-readable string.\n        \"\"\"\n        if strip_extraneous:\n            ids = strip_ids(ids, list(range(self._num_reserved_ids or 0)))\n        return \" \".join(self.decode_list(ids))\n\n    def decode_list(self, ids):\n        \"\"\"Transform a sequence of int ids into a their string versions.\n\n        This method supports transforming individual input/output ids to their\n        string versions so that sequence to/from text conversions can be visualized\n        in a human readable format.\n\n        Args:\n        ids: list of integers to be converted.\n\n        Returns:\n        strs: list of human-readable string.\n        \"\"\"\n        decoded_ids = []\n        for id_ in ids:\n            if 0 <= id_ < self._num_reserved_ids:\n                decoded_ids.append(RESERVED_TOKENS[int(id_)])\n            else:\n                decoded_ids.append(id_ - self._num_reserved_ids)\n        return [str(d) for d in decoded_ids]\n\n    @property\n    def vocab_size(self):\n        raise NotImplementedError()\n\n\nclass ByteTextEncoder(TextEncoder):\n    \"\"\"Encodes each byte to an id. 
For 8-bit strings only.\"\"\"\n\n    def encode(self, s):\n        numres = self._num_reserved_ids\n        return [c + numres for c in s.encode(\"utf-8\")]\n\n    def decode(self, ids, strip_extraneous=False):\n        if strip_extraneous:\n            ids = strip_ids(ids, list(range(self._num_reserved_ids or 0)))\n        numres = self._num_reserved_ids\n        decoded_ids = []\n        int2byte = six.int2byte\n        for id_ in ids:\n            if 0 <= id_ < numres:\n                decoded_ids.append(RESERVED_TOKENS_BYTES[int(id_)])\n            else:\n                decoded_ids.append(int2byte(id_ - numres))\n        if six.PY2:\n            return \"\".join(decoded_ids)\n        # Python3: join byte arrays and then decode string\n        return b\"\".join(decoded_ids).decode(\"utf-8\", \"replace\")\n\n    def decode_list(self, ids):\n        numres = self._num_reserved_ids\n        decoded_ids = []\n        int2byte = six.int2byte\n        for id_ in ids:\n            if 0 <= id_ < numres:\n                decoded_ids.append(RESERVED_TOKENS_BYTES[int(id_)])\n            else:\n                decoded_ids.append(int2byte(id_ - numres))\n        # Return the individual byte tokens without joining or decoding.\n        return decoded_ids\n\n    @property\n    def vocab_size(self):\n        return 2**8 + self._num_reserved_ids\n\n\nclass ByteTextEncoderWithEos(ByteTextEncoder):\n    \"\"\"Encodes each byte to an id and appends the EOS token.\"\"\"\n\n    def encode(self, s):\n        return super(ByteTextEncoderWithEos, self).encode(s) + [EOS_ID]\n\n\nclass TokenTextEncoder(TextEncoder):\n    \"\"\"Encoder based on a user-supplied vocabulary (file or list).\"\"\"\n\n    def __init__(self,\n                 vocab_filename,\n                 reverse=False,\n                 vocab_list=None,\n                 replace_oov=None,\n                 num_reserved_ids=NUM_RESERVED_TOKENS):\n        \"\"\"Initialize from a file or list, one token per line.\n\n        Handling of 
reserved tokens works as follows:\n        - When initializing from a list, we add reserved tokens to the vocab.\n        - When initializing from a file, we do not add reserved tokens to the vocab.\n        - When saving vocab files, we save reserved tokens to the file.\n\n        Args:\n            vocab_filename: If not None, the full filename to read vocab from. If this\n                is not None, then vocab_list should be None.\n            reverse: Boolean indicating if tokens should be reversed during encoding\n                and decoding.\n            vocab_list: If not None, a list of elements of the vocabulary. If this is\n                not None, then vocab_filename should be None.\n            replace_oov: If not None, every out-of-vocabulary token seen when\n                encoding will be replaced by this string (which must be in vocab).\n            num_reserved_ids: Number of IDs to save for reserved tokens like <EOS>.\n        \"\"\"\n        super(TokenTextEncoder, self).__init__(num_reserved_ids=num_reserved_ids)\n        self._reverse = reverse\n        self._replace_oov = replace_oov\n        if vocab_filename:\n            self._init_vocab_from_file(vocab_filename)\n        else:\n            assert vocab_list is not None\n            self._init_vocab_from_list(vocab_list)\n        self.pad_index = self._token_to_id[PAD]\n        self.eos_index = self._token_to_id[EOS]\n        self.unk_index = self._token_to_id[UNK]\n        self.seg_index = self._token_to_id[SEG] if SEG in self._token_to_id else self.eos_index\n\n    def encode(self, s):\n        \"\"\"Converts a space-separated string of tokens to a list of ids.\"\"\"\n        sentence = s\n        tokens = sentence.strip().split()\n        if self._replace_oov is not None:\n            tokens = [t if t in self._token_to_id else self._replace_oov for t in tokens]\n        ret = [self._token_to_id[tok] for tok in tokens]\n        return ret[::-1] if self._reverse else ret\n\n    def 
decode(self, ids, strip_eos=False, strip_padding=False):\n        if strip_padding and self.pad() in list(ids):\n            pad_pos = list(ids).index(self.pad())\n            ids = ids[:pad_pos]\n        if strip_eos and self.eos() in list(ids):\n            eos_pos = list(ids).index(self.eos())\n            ids = ids[:eos_pos]\n        return \" \".join(self.decode_list(ids))\n\n    def decode_list(self, ids):\n        seq = reversed(ids) if self._reverse else ids\n        return [self._safe_id_to_token(i) for i in seq]\n\n    @property\n    def vocab_size(self):\n        return len(self._id_to_token)\n\n    def __len__(self):\n        return self.vocab_size\n\n    def _safe_id_to_token(self, idx):\n        return self._id_to_token.get(idx, \"ID_%d\" % idx)\n\n    def _init_vocab_from_file(self, filename):\n        \"\"\"Load vocab from a file.\n\n        Args:\n        filename: The file to load vocabulary from.\n        \"\"\"\n        with open(filename) as f:\n            tokens = [token.strip() for token in f.readlines()]\n\n        def token_gen():\n            for token in tokens:\n                yield token\n\n        self._init_vocab(token_gen(), add_reserved_tokens=False)\n\n    def _init_vocab_from_list(self, vocab_list):\n        \"\"\"Initialize tokens from a list of tokens.\n\n        It is ok if reserved tokens appear in the vocab list. They will be\n        removed. 
The set of tokens in vocab_list should be unique.\n\n        Args:\n        vocab_list: A list of tokens.\n        \"\"\"\n\n        def token_gen():\n            for token in vocab_list:\n                if token not in RESERVED_TOKENS:\n                    yield token\n\n        self._init_vocab(token_gen())\n\n    def _init_vocab(self, token_generator, add_reserved_tokens=True):\n        \"\"\"Initialize vocabulary with tokens from token_generator.\"\"\"\n\n        self._id_to_token = {}\n        non_reserved_start_index = 0\n\n        if add_reserved_tokens:\n            self._id_to_token.update(enumerate(RESERVED_TOKENS))\n            non_reserved_start_index = len(RESERVED_TOKENS)\n\n        self._id_to_token.update(enumerate(token_generator, start=non_reserved_start_index))\n\n        # _token_to_id is the reverse of _id_to_token\n        self._token_to_id = dict((v, k) for k, v in six.iteritems(self._id_to_token))\n\n    def pad(self):\n        return self.pad_index\n\n    def eos(self):\n        return self.eos_index\n\n    def unk(self):\n        return self.unk_index\n\n    def seg(self):\n        return self.seg_index\n\n    def store_to_file(self, filename):\n        \"\"\"Write vocab file to disk.\n\n        Vocab files have one token per line. The file ends in a newline. Reserved\n        tokens are written to the vocab file as well.\n\n        Args:\n        filename: Full path of the file to store the vocab to.\n        \"\"\"\n        with open(filename, \"w\") as f:\n            for i in range(len(self._id_to_token)):\n                f.write(self._id_to_token[i] + \"\\n\")\n\n    def sil_phonemes(self):\n        return [p for p in self._id_to_token.values() if not p[0].isalpha()]\n"
  },
  {
    "path": "modules/audio/svs/diffsinger/utils/text_norm.py",
    "content": "# coding=utf-8\n# Authors:\n#   2019.5 Zhiyang Zhou (https://github.com/Joee1995/chn_text_norm.git)\n#   2019.9 Jiayu DU\n#\n# requirements:\n#   - python 3.X\n# notes: python 2.X WILL fail or produce misleading results\nimport argparse\nimport codecs\nimport os\nimport re\nimport string\nimport sys\n\n# ================================================================================ #\n#                                    basic constant\n# ================================================================================ #\nCHINESE_DIGIS = u'零一二三四五六七八九'\nBIG_CHINESE_DIGIS_SIMPLIFIED = u'零壹贰叁肆伍陆柒捌玖'\nBIG_CHINESE_DIGIS_TRADITIONAL = u'零壹貳參肆伍陸柒捌玖'\nSMALLER_BIG_CHINESE_UNITS_SIMPLIFIED = u'十百千万'\nSMALLER_BIG_CHINESE_UNITS_TRADITIONAL = u'拾佰仟萬'\nLARGER_CHINESE_NUMERING_UNITS_SIMPLIFIED = u'亿兆京垓秭穰沟涧正载'\nLARGER_CHINESE_NUMERING_UNITS_TRADITIONAL = u'億兆京垓秭穰溝澗正載'\nSMALLER_CHINESE_NUMERING_UNITS_SIMPLIFIED = u'十百千万'\nSMALLER_CHINESE_NUMERING_UNITS_TRADITIONAL = u'拾佰仟萬'\n\nZERO_ALT = u'〇'\nONE_ALT = u'幺'\nTWO_ALTS = [u'两', u'兩']\n\nPOSITIVE = [u'正', u'正']\nNEGATIVE = [u'负', u'負']\nPOINT = [u'点', u'點']\n# PLUS = [u'加', u'加']\n# SIL = [u'杠', u'槓']\n\n# 中文数字系统类型\nNUMBERING_TYPES = ['low', 'mid', 'high']\n\nCURRENCY_NAMES = '(人民币|美元|日元|英镑|欧元|马克|法郎|加拿大元|澳元|港币|先令|芬兰马克|爱尔兰镑|' \\\n                 '里拉|荷兰盾|埃斯库多|比塞塔|印尼盾|林吉特|新西兰元|比索|卢布|新加坡元|韩元|泰铢)'\nCURRENCY_UNITS = '((亿|千万|百万|万|千|百)|(亿|千万|百万|万|千|百|)元|(亿|千万|百万|万|千|百|)块|角|毛|分)'\nCOM_QUANTIFIERS = '(匹|张|座|回|场|尾|条|个|首|阙|阵|网|炮|顶|丘|棵|只|支|袭|辆|挑|担|颗|壳|窠|曲|墙|群|腔|' \\\n                  '砣|座|客|贯|扎|捆|刀|令|打|手|罗|坡|山|岭|江|溪|钟|队|单|双|对|出|口|头|脚|板|跳|枝|件|贴|' \\\n                  '针|线|管|名|位|身|堂|课|本|页|家|户|层|丝|毫|厘|分|钱|两|斤|担|铢|石|钧|锱|忽|(千|毫|微)克|' \\\n                  '毫|厘|分|寸|尺|丈|里|寻|常|铺|程|(千|分|厘|毫|微)米|撮|勺|合|升|斗|石|盘|碗|碟|叠|桶|笼|盆|' \\\n                  '盒|杯|钟|斛|锅|簋|篮|盘|桶|罐|瓶|壶|卮|盏|箩|箱|煲|啖|袋|钵|年|月|日|季|刻|时|周|天|秒|分|旬|' \\\n                  '纪|岁|世|更|夜|春|夏|秋|冬|代|伏|辈|丸|泡|粒|颗|幢|堆|条|根|支|道|面|片|张|颗|块)'\n\n# punctuation information are based on Zhon 
project (https://github.com/tsroten/zhon.git)\nCHINESE_PUNC_STOP = '！？｡。'\nCHINESE_PUNC_NON_STOP = '＂＃＄％＆＇（）＊＋，－／：；＜＝＞＠［＼］＾＿｀｛｜｝～｟｠｢｣､、〃《》「」『』【】〔〕〖〗〘〙〚〛〜〝〞〟〰〾〿–—‘’‛“”„‟…‧﹏'\nCHINESE_PUNC_LIST = CHINESE_PUNC_STOP + CHINESE_PUNC_NON_STOP\n\n\n# ================================================================================ #\n#                                    basic class\n# ================================================================================ #\nclass ChineseChar(object):\n    \"\"\"\n    中文字符\n    每个字符对应简体和繁体,\n    e.g. 简体 = '负', 繁体 = '負'\n    转换时可转换为简体或繁体\n    \"\"\"\n\n    def __init__(self, simplified, traditional):\n        self.simplified = simplified\n        self.traditional = traditional\n        # self.__repr__ = self.__str__\n\n    def __str__(self):\n        return self.simplified or self.traditional or None\n\n    def __repr__(self):\n        return self.__str__()\n\n\nclass ChineseNumberUnit(ChineseChar):\n    \"\"\"\n    中文数字/数位字符\n    每个字符除繁简体外还有一个额外的大写字符\n    e.g. 
'陆' 和 '陸'\n    \"\"\"\n\n    def __init__(self, power, simplified, traditional, big_s, big_t):\n        super(ChineseNumberUnit, self).__init__(simplified, traditional)\n        self.power = power\n        self.big_s = big_s\n        self.big_t = big_t\n\n    def __str__(self):\n        return '10^{}'.format(self.power)\n\n    @classmethod\n    def create(cls, index, value, numbering_type=NUMBERING_TYPES[1], small_unit=False):\n\n        if small_unit:\n            return ChineseNumberUnit(power=index + 1,\n                                     simplified=value[0],\n                                     traditional=value[1],\n                                     big_s=value[1],\n                                     big_t=value[1])\n        elif numbering_type == NUMBERING_TYPES[0]:\n            return ChineseNumberUnit(power=index + 8,\n                                     simplified=value[0],\n                                     traditional=value[1],\n                                     big_s=value[0],\n                                     big_t=value[1])\n        elif numbering_type == NUMBERING_TYPES[1]:\n            return ChineseNumberUnit(power=(index + 2) * 4,\n                                     simplified=value[0],\n                                     traditional=value[1],\n                                     big_s=value[0],\n                                     big_t=value[1])\n        elif numbering_type == NUMBERING_TYPES[2]:\n            return ChineseNumberUnit(power=pow(2, index + 3),\n                                     simplified=value[0],\n                                     traditional=value[1],\n                                     big_s=value[0],\n                                     big_t=value[1])\n        else:\n            raise ValueError('Counting type should be in {0} ({1} provided).'.format(NUMBERING_TYPES, numbering_type))\n\n\nclass ChineseNumberDigit(ChineseChar):\n    \"\"\"\n    中文数字字符\n    \"\"\"\n\n    def __init__(self, 
value, simplified, traditional, big_s, big_t, alt_s=None, alt_t=None):\n        super(ChineseNumberDigit, self).__init__(simplified, traditional)\n        self.value = value\n        self.big_s = big_s\n        self.big_t = big_t\n        self.alt_s = alt_s\n        self.alt_t = alt_t\n\n    def __str__(self):\n        return str(self.value)\n\n    @classmethod\n    def create(cls, i, v):\n        return ChineseNumberDigit(i, v[0], v[1], v[2], v[3])\n\n\nclass ChineseMath(ChineseChar):\n    \"\"\"\n    中文数学符号\n    \"\"\"\n\n    def __init__(self, simplified, traditional, symbol, expression=None):\n        super(ChineseMath, self).__init__(simplified, traditional)\n        self.symbol = symbol\n        self.expression = expression\n        self.big_s = simplified\n        self.big_t = traditional\n\n\nCC, CNU, CND, CM = ChineseChar, ChineseNumberUnit, ChineseNumberDigit, ChineseMath\n\n\nclass NumberSystem(object):\n    \"\"\"\n    中文数字系统\n    \"\"\"\n    pass\n\n\nclass MathSymbol(object):\n    \"\"\"\n    用于中文数字系统的数学符号 (繁/简体), e.g.\n    positive = ['正', '正']\n    negative = ['负', '負']\n    point = ['点', '點']\n    \"\"\"\n\n    def __init__(self, positive, negative, point):\n        self.positive = positive\n        self.negative = negative\n        self.point = point\n\n    def __iter__(self):\n        for v in self.__dict__.values():\n            yield v\n\n\n# class OtherSymbol(object):\n#     \"\"\"\n#     其他符号\n#     \"\"\"\n#\n#     def __init__(self, sil):\n#         self.sil = sil\n#\n#     def __iter__(self):\n#         for v in self.__dict__.values():\n#             yield v\n\n\n# ================================================================================ #\n#                                    basic utils\n# ================================================================================ #\ndef create_system(numbering_type=NUMBERING_TYPES[1]):\n    \"\"\"\n    根据数字系统类型创建并返回相应的数字系统，默认为 mid\n    NUMBERING_TYPES = ['low', 'mid', 'high']: 中文数字系统类型\n      
  low:  '兆' = '亿' * '十' = $10^{9}$,  '京' = '兆' * '十', etc.\n        mid:  '兆' = '亿' * '万' = $10^{12}$, '京' = '兆' * '万', etc.\n        high: '兆' = '亿' * '亿' = $10^{16}$, '京' = '兆' * '兆', etc.\n    返回对应的数字系统\n    \"\"\"\n\n    # chinese number units of '亿' and larger\n    all_larger_units = zip(LARGER_CHINESE_NUMERING_UNITS_SIMPLIFIED, LARGER_CHINESE_NUMERING_UNITS_TRADITIONAL)\n    larger_units = [CNU.create(i, v, numbering_type, False) for i, v in enumerate(all_larger_units)]\n    # chinese number units of '十, 百, 千, 万'\n    all_smaller_units = zip(SMALLER_CHINESE_NUMERING_UNITS_SIMPLIFIED, SMALLER_CHINESE_NUMERING_UNITS_TRADITIONAL)\n    smaller_units = [CNU.create(i, v, small_unit=True) for i, v in enumerate(all_smaller_units)]\n    # digis\n    chinese_digis = zip(CHINESE_DIGIS, CHINESE_DIGIS, BIG_CHINESE_DIGIS_SIMPLIFIED, BIG_CHINESE_DIGIS_TRADITIONAL)\n    digits = [CND.create(i, v) for i, v in enumerate(chinese_digis)]\n    digits[0].alt_s, digits[0].alt_t = ZERO_ALT, ZERO_ALT\n    digits[1].alt_s, digits[1].alt_t = ONE_ALT, ONE_ALT\n    digits[2].alt_s, digits[2].alt_t = TWO_ALTS[0], TWO_ALTS[1]\n\n    # symbols\n    positive_cn = CM(POSITIVE[0], POSITIVE[1], '+', lambda x: x)\n    negative_cn = CM(NEGATIVE[0], NEGATIVE[1], '-', lambda x: -x)\n    point_cn = CM(POINT[0], POINT[1], '.', lambda x, y: float(str(x) + '.' 
+ str(y)))\n    # sil_cn = CM(SIL[0], SIL[1], '-', lambda x, y: float(str(x) + '-' + str(y)))\n    system = NumberSystem()\n    system.units = smaller_units + larger_units\n    system.digits = digits\n    system.math = MathSymbol(positive_cn, negative_cn, point_cn)\n    # system.symbols = OtherSymbol(sil_cn)\n    return system\n\n\ndef chn2num(chinese_string, numbering_type=NUMBERING_TYPES[1]):\n\n    def get_symbol(char, system):\n        for u in system.units:\n            if char in [u.traditional, u.simplified, u.big_s, u.big_t]:\n                return u\n        for d in system.digits:\n            if char in [d.traditional, d.simplified, d.big_s, d.big_t, d.alt_s, d.alt_t]:\n                return d\n        for m in system.math:\n            if char in [m.traditional, m.simplified]:\n                return m\n\n    def string2symbols(chinese_string, system):\n        int_string, dec_string = chinese_string, ''\n        for p in [system.math.point.simplified, system.math.point.traditional]:\n            if p in chinese_string:\n                int_string, dec_string = chinese_string.split(p)\n                break\n        return [get_symbol(c, system) for c in int_string], \\\n               [get_symbol(c, system) for c in dec_string]\n\n    def correct_symbols(integer_symbols, system):\n        \"\"\"\n        一百八 to 一百八十\n        一亿一千三百万 to 一亿 一千万 三百万\n        \"\"\"\n\n        if integer_symbols and isinstance(integer_symbols[0], CNU):\n            if integer_symbols[0].power == 1:\n                integer_symbols = [system.digits[1]] + integer_symbols\n\n        if len(integer_symbols) > 1:\n            if isinstance(integer_symbols[-1], CND) and isinstance(integer_symbols[-2], CNU):\n                integer_symbols.append(CNU(integer_symbols[-2].power - 1, None, None, None, None))\n\n        result = []\n        unit_count = 0\n        for s in integer_symbols:\n            if isinstance(s, CND):\n                result.append(s)\n                
unit_count = 0\n            elif isinstance(s, CNU):\n                current_unit = CNU(s.power, None, None, None, None)\n                unit_count += 1\n\n            if unit_count == 1:\n                result.append(current_unit)\n            elif unit_count > 1:\n                for i in range(len(result)):\n                    if isinstance(result[-i - 1], CNU) and result[-i - 1].power < current_unit.power:\n                        result[-i - 1] = CNU(result[-i - 1].power + current_unit.power, None, None, None, None)\n        return result\n\n    def compute_value(integer_symbols):\n        \"\"\"\n        Compute the value.\n        When current unit is larger than previous unit, current unit * all previous units will be used as all previous units.\n        e.g. '两千万' = 2000 * 10000 not 2000 + 10000\n        \"\"\"\n        value = [0]\n        last_power = 0\n        for s in integer_symbols:\n            if isinstance(s, CND):\n                value[-1] = s.value\n            elif isinstance(s, CNU):\n                value[-1] *= pow(10, s.power)\n                if s.power > last_power:\n                    value[:-1] = list(map(lambda v: v * pow(10, s.power), value[:-1]))\n                    last_power = s.power\n                value.append(0)\n        return sum(value)\n\n    system = create_system(numbering_type)\n    int_part, dec_part = string2symbols(chinese_string, system)\n    int_part = correct_symbols(int_part, system)\n    int_str = str(compute_value(int_part))\n    dec_str = ''.join([str(d.value) for d in dec_part])\n    if dec_part:\n        return '{0}.{1}'.format(int_str, dec_str)\n    else:\n        return int_str\n\n\ndef num2chn(number_string,\n            numbering_type=NUMBERING_TYPES[1],\n            big=False,\n            traditional=False,\n            alt_zero=False,\n            alt_one=False,\n            alt_two=True,\n            use_zeros=True,\n            use_units=True):\n\n    def get_value(value_string, 
use_zeros=True):\n\n        striped_string = value_string.lstrip('0')\n\n        # record nothing if all zeros\n        if not striped_string:\n            return []\n\n        # record one digits\n        elif len(striped_string) == 1:\n            if use_zeros and len(value_string) != len(striped_string):\n                return [system.digits[0], system.digits[int(striped_string)]]\n            else:\n                return [system.digits[int(striped_string)]]\n\n        # recursively record multiple digits\n        else:\n            result_unit = next(u for u in reversed(system.units) if u.power < len(striped_string))\n            result_string = value_string[:-result_unit.power]\n            return get_value(result_string) + [result_unit] + get_value(striped_string[-result_unit.power:])\n\n    system = create_system(numbering_type)\n\n    int_dec = number_string.split('.')\n    if len(int_dec) == 1:\n        int_string = int_dec[0]\n        dec_string = \"\"\n    elif len(int_dec) == 2:\n        int_string = int_dec[0]\n        dec_string = int_dec[1]\n    else:\n        raise ValueError(\"invalid input num string with more than one dot: {}\".format(number_string))\n\n    if use_units and len(int_string) > 1:\n        result_symbols = get_value(int_string)\n    else:\n        result_symbols = [system.digits[int(c)] for c in int_string]\n    dec_symbols = [system.digits[int(c)] for c in dec_string]\n    if dec_string:\n        result_symbols += [system.math.point] + dec_symbols\n\n    if alt_two:\n        liang = CND(2, system.digits[2].alt_s, system.digits[2].alt_t, system.digits[2].big_s, system.digits[2].big_t)\n        for i, v in enumerate(result_symbols):\n            if isinstance(v, CND) and v.value == 2:\n                next_symbol = result_symbols[i + 1] if i < len(result_symbols) - 1 else None\n                previous_symbol = result_symbols[i - 1] if i > 0 else None\n                if isinstance(next_symbol, CNU) and isinstance(previous_symbol, 
(CNU, type(None))):\n                    if next_symbol.power != 1 and ((previous_symbol is None) or (previous_symbol.power != 1)):\n                        result_symbols[i] = liang\n\n    # if big is True, '两' will not be used and `alt_two` has no impact on output\n    if big:\n        attr_name = 'big_'\n        if traditional:\n            attr_name += 't'\n        else:\n            attr_name += 's'\n    else:\n        if traditional:\n            attr_name = 'traditional'\n        else:\n            attr_name = 'simplified'\n\n    result = ''.join([getattr(s, attr_name) for s in result_symbols])\n\n    # if not use_zeros:\n    #     result = result.strip(getattr(system.digits[0], attr_name))\n\n    if alt_zero:\n        result = result.replace(getattr(system.digits[0], attr_name), system.digits[0].alt_s)\n\n    if alt_one:\n        result = result.replace(getattr(system.digits[1], attr_name), system.digits[1].alt_s)\n\n    for i, p in enumerate(POINT):\n        if result.startswith(p):\n            return CHINESE_DIGIS[0] + result\n\n    # ^10, 11, .., 19\n    if len(result) >= 2 and result[1] in [SMALLER_CHINESE_NUMERING_UNITS_SIMPLIFIED[0],\n                                          SMALLER_CHINESE_NUMERING_UNITS_TRADITIONAL[0]] and \\\n            result[0] in [CHINESE_DIGIS[1], BIG_CHINESE_DIGIS_SIMPLIFIED[1], BIG_CHINESE_DIGIS_TRADITIONAL[1]]:\n        result = result[1:]\n\n    return result\n\n\n# ================================================================================ #\n#                          different types of rewriters\n# ================================================================================ #\nclass Cardinal:\n    \"\"\"\n    CARDINAL类\n    \"\"\"\n\n    def __init__(self, cardinal=None, chntext=None):\n        self.cardinal = cardinal\n        self.chntext = chntext\n\n    def chntext2cardinal(self):\n        return chn2num(self.chntext)\n\n    def cardinal2chntext(self):\n        return num2chn(self.cardinal)\n\n\nclass 
Digit:\n    \"\"\"\n    DIGIT类\n    \"\"\"\n\n    def __init__(self, digit=None, chntext=None):\n        self.digit = digit\n        self.chntext = chntext\n\n    # def chntext2digit(self):\n    #     return chn2num(self.chntext)\n\n    def digit2chntext(self):\n        return num2chn(self.digit, alt_two=False, use_units=False)\n\n\nclass TelePhone:\n    \"\"\"\n    TELEPHONE类\n    \"\"\"\n\n    def __init__(self, telephone=None, raw_chntext=None, chntext=None):\n        self.telephone = telephone\n        self.raw_chntext = raw_chntext\n        self.chntext = chntext\n\n    # def chntext2telephone(self):\n    #     sil_parts = self.raw_chntext.split('<SIL>')\n    #     self.telephone = '-'.join([\n    #         str(chn2num(p)) for p in sil_parts\n    #     ])\n    #     return self.telephone\n\n    def telephone2chntext(self, fixed=False):\n\n        if fixed:\n            sil_parts = self.telephone.split('-')\n            self.raw_chntext = '<SIL>'.join([num2chn(part, alt_two=False, use_units=False) for part in sil_parts])\n            self.chntext = self.raw_chntext.replace('<SIL>', '')\n        else:\n            sp_parts = self.telephone.strip('+').split()\n            self.raw_chntext = '<SP>'.join([num2chn(part, alt_two=False, use_units=False) for part in sp_parts])\n            self.chntext = self.raw_chntext.replace('<SP>', '')\n        return self.chntext\n\n\nclass Fraction:\n    \"\"\"\n    FRACTION类\n    \"\"\"\n\n    def __init__(self, fraction=None, chntext=None):\n        self.fraction = fraction\n        self.chntext = chntext\n\n    def chntext2fraction(self):\n        denominator, numerator = self.chntext.split('分之')\n        return chn2num(numerator) + '/' + chn2num(denominator)\n\n    def fraction2chntext(self):\n        numerator, denominator = self.fraction.split('/')\n        return num2chn(denominator) + '分之' + num2chn(numerator)\n\n\nclass Date:\n    \"\"\"\n    DATE类\n    \"\"\"\n\n    def __init__(self, date=None, chntext=None):\n        
self.date = date\n        self.chntext = chntext\n\n    # def chntext2date(self):\n    #     chntext = self.chntext\n    #     try:\n    #         year, other = chntext.strip().split('年', maxsplit=1)\n    #         year = Digit(chntext=year).digit2chntext() + '年'\n    #     except ValueError:\n    #         other = chntext\n    #         year = ''\n    #     if other:\n    #         try:\n    #             month, day = other.strip().split('月', maxsplit=1)\n    #             month = Cardinal(chntext=month).chntext2cardinal() + '月'\n    #         except ValueError:\n    #             day = chntext\n    #             month = ''\n    #         if day:\n    #             day = Cardinal(chntext=day[:-1]).chntext2cardinal() + day[-1]\n    #     else:\n    #         month = ''\n    #         day = ''\n    #     date = year + month + day\n    #     self.date = date\n    #     return self.date\n\n    def date2chntext(self):\n        date = self.date\n        try:\n            year, other = date.strip().split('年', 1)\n            year = Digit(digit=year).digit2chntext() + '年'\n        except ValueError:\n            other = date\n            year = ''\n        if other:\n            try:\n                month, day = other.strip().split('月', 1)\n                month = Cardinal(cardinal=month).cardinal2chntext() + '月'\n            except ValueError:\n                day = other  # only the part after '年', not the whole date string\n                month = ''\n            if day:\n                day = Cardinal(cardinal=day[:-1]).cardinal2chntext() + day[-1]\n        else:\n            month = ''\n            day = ''\n        chntext = year + month + day\n        self.chntext = chntext\n        return self.chntext\n\n\nclass Money:\n    \"\"\"\n    MONEY类\n    \"\"\"\n\n    def __init__(self, money=None, chntext=None):\n        self.money = money\n        self.chntext = chntext\n\n    # def chntext2money(self):\n    #     return self.money\n\n    def money2chntext(self):\n        money = self.money\n        pattern = 
re.compile(r'(\\d+(\\.\\d+)?)')\n        matchers = pattern.findall(money)\n        if matchers:\n            for matcher in matchers:\n                money = money.replace(matcher[0], Cardinal(cardinal=matcher[0]).cardinal2chntext())\n        self.chntext = money\n        return self.chntext\n\n\nclass Percentage:\n    \"\"\"\n    PERCENTAGE类\n    \"\"\"\n\n    def __init__(self, percentage=None, chntext=None):\n        self.percentage = percentage\n        self.chntext = chntext\n\n    def chntext2percentage(self):\n        return chn2num(self.chntext.strip().strip('百分之')) + '%'\n\n    def percentage2chntext(self):\n        return '百分之' + num2chn(self.percentage.strip().strip('%'))\n\n\n# ================================================================================ #\n#                            NSW Normalizer\n# ================================================================================ #\nclass NSWNormalizer:\n\n    def __init__(self, raw_text):\n        self.raw_text = '^' + raw_text + '$'\n        self.norm_text = ''\n\n    def _particular(self):\n        text = self.norm_text\n        pattern = re.compile(r\"(([a-zA-Z]+)二([a-zA-Z]+))\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('particular')\n            for matcher in matchers:\n                text = text.replace(matcher[0], matcher[1] + '2' + matcher[2], 1)\n        self.norm_text = text\n        return self.norm_text\n\n    def normalize(self, remove_punc=True):\n        text = self.raw_text\n\n        # 规范化日期\n        pattern = re.compile(r\"\\D+((([089]\\d|(19|20)\\d{2})年)?(\\d{1,2}月(\\d{1,2}[日号])?)?)\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('date')\n            for matcher in matchers:\n                text = text.replace(matcher[0], Date(date=matcher[0]).date2chntext(), 1)\n\n        # 规范化金钱\n        pattern = re.compile(r\"\\D+((\\d+(\\.\\d+)?)[多余几]?\" + CURRENCY_UNITS + r\"(\\d\" + CURRENCY_UNITS + 
r\"?)?)\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('money')\n            for matcher in matchers:\n                text = text.replace(matcher[0], Money(money=matcher[0]).money2chntext(), 1)\n\n        # 规范化固话/手机号码\n        # 手机\n        # http://www.jihaoba.com/news/show/13680\n        # 移动：139、138、137、136、135、134、159、158、157、150、151、152、188、187、182、183、184、178、198\n        # 联通：130、131、132、156、155、186、185、176\n        # 电信：133、153、189、180、181、177\n        pattern = re.compile(r\"\\D((\\+?86 ?)?1([38]\\d|5[0-35-9]|7[678]|9[89])\\d{8})\\D\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('telephone')\n            for matcher in matchers:\n                text = text.replace(matcher[0], TelePhone(telephone=matcher[0]).telephone2chntext(), 1)\n        # 固话\n        pattern = re.compile(r\"\\D((0(10|2[1-3]|[3-9]\\d{2})-?)?[1-9]\\d{6,7})\\D\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('fixed telephone')\n            for matcher in matchers:\n                text = text.replace(matcher[0], TelePhone(telephone=matcher[0]).telephone2chntext(fixed=True), 1)\n\n        # 规范化分数\n        pattern = re.compile(r\"(\\d+/\\d+)\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('fraction')\n            for matcher in matchers:\n                text = text.replace(matcher, Fraction(fraction=matcher).fraction2chntext(), 1)\n\n        # 规范化百分数\n        text = text.replace('％', '%')\n        pattern = re.compile(r\"(\\d+(\\.\\d+)?%)\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('percentage')\n            for matcher in matchers:\n                text = text.replace(matcher[0], Percentage(percentage=matcher[0]).percentage2chntext(), 1)\n\n        # 规范化纯数+量词\n        pattern = re.compile(r\"(\\d+(\\.\\d+)?)[多余几]?\" + COM_QUANTIFIERS)\n        matchers = pattern.findall(text)\n        
if matchers:\n            # print('cardinal+quantifier')\n            for matcher in matchers:\n                text = text.replace(matcher[0], Cardinal(cardinal=matcher[0]).cardinal2chntext(), 1)\n\n        # 规范化数字编号\n        pattern = re.compile(r\"(\\d{4,32})\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('digit')\n            for matcher in matchers:\n                text = text.replace(matcher, Digit(digit=matcher).digit2chntext(), 1)\n\n        # 规范化纯数\n        pattern = re.compile(r\"(\\d+(\\.\\d+)?)\")\n        matchers = pattern.findall(text)\n        if matchers:\n            # print('cardinal')\n            for matcher in matchers:\n                text = text.replace(matcher[0], Cardinal(cardinal=matcher[0]).cardinal2chntext(), 1)\n\n        self.norm_text = text\n        self._particular()\n\n        text = self.norm_text.lstrip('^').rstrip('$')\n        if remove_punc:\n            # Punctuations removal\n            old_chars = CHINESE_PUNC_LIST + string.punctuation  # includes all CN and EN punctuations\n            new_chars = ' ' * len(old_chars)\n            del_chars = ''\n            text = text.translate(str.maketrans(old_chars, new_chars, del_chars))\n        return text\n\n\ndef nsw_test_case(raw_text):\n    print('I:' + raw_text)\n    print('O:' + NSWNormalizer(raw_text).normalize())\n    print('')\n\n\ndef nsw_test():\n    nsw_test_case('固话：0595-23865596或23880880。')\n    nsw_test_case('固话：0595-23865596或23880880。')\n    nsw_test_case('手机：+86 19859213959或15659451527。')\n    nsw_test_case('分数：32477/76391。')\n    nsw_test_case('百分数：80.03%。')\n    nsw_test_case('编号：31520181154418。')\n    nsw_test_case('纯数：2983.07克或12345.60米。')\n    nsw_test_case('日期：1999年2月20日或09年3月15号。')\n    nsw_test_case('金钱：12块5，34.5元，20.1万')\n    nsw_test_case('特殊：O2O或B2C。')\n    nsw_test_case('3456万吨')\n    nsw_test_case('2938个')\n    nsw_test_case('938')\n    nsw_test_case('今天吃了115个小笼包231个馒头')\n    nsw_test_case('有62％的概率')\n\n\nif 
__name__ == '__main__':\n    # nsw_test()\n\n    p = argparse.ArgumentParser()\n    p.add_argument('ifile', help='input filename, assume utf-8 encoding')\n    p.add_argument('ofile', help='output filename')\n    p.add_argument('--to_upper', action='store_true', help='convert to upper case')\n    p.add_argument('--to_lower', action='store_true', help='convert to lower case')\n    p.add_argument('--has_key', action='store_true', help=\"input text has Kaldi's key as first field.\")\n    p.add_argument('--log_interval', type=int, default=10000, help='log interval in number of processed lines')\n    args = p.parse_args()\n\n    ifile = codecs.open(args.ifile, 'r', 'utf8')\n    ofile = codecs.open(args.ofile, 'w+', 'utf8')\n\n    n = 0\n    for l in ifile:\n        key = ''\n        text = ''\n        if args.has_key:\n            cols = l.split(maxsplit=1)\n            key = cols[0]\n            if len(cols) == 2:\n                text = cols[1]\n            else:\n                text = ''\n        else:\n            text = l\n\n        # cases\n        if args.to_upper and args.to_lower:\n            sys.stderr.write('text norm: to_upper OR to_lower?')\n            exit(1)\n        if args.to_upper:\n            text = text.upper()\n        if args.to_lower:\n            text = text.lower()\n\n        # NSW(Non-Standard-Word) normalization\n        text = NSWNormalizer(text).normalize()\n\n        #\n        if args.has_key:\n            ofile.write(key + '\\t' + text)\n        else:\n            ofile.write(text)\n\n        n += 1\n        if n % args.log_interval == 0:\n            sys.stderr.write(\"text norm: {} lines done.\\n\".format(n))\n\n    sys.stderr.write(\"text norm: {} lines done in total.\\n\".format(n))\n\n    ifile.close()\n    ofile.close()\n"
  },
  {
    "path": "modules/audio/tts/deepvoice3_ljspeech/README.md",
    "content": "# deepvoice3_ljspeech\n\n|模型名称|deepvoice3_ljspeech|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|DeepVoice3|\n|数据集|LJSpeech-1.1|\n|是否支持Fine-tuning|否|\n|模型大小|58MB|\n|最新更新日期|2020-10-27|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nDeep Voice 3是百度研究院2017年发布的端到端的TTS模型（论文录用于ICLR 2018）。它是一个基于卷积神经网络和注意力机制的seq2seq模型,由于不包含循环神经网络，它可以并行训练，远快于基于循环神经网络的模型。Deep Voice 3可以学习到多个说话人的特征，也支持搭配多种声码器使用。deepvoice3_ljspeech是基于ljspeech英文语音数据集预训练得到的英文TTS模型，仅支持预测。\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/Parakeet/release/v0.1/examples/deepvoice3/images/model_architecture.png\" hspace='10'/> <br/>\n</p>\n\n更多详情参考论文[Deep Voice 3: Scaling Text-to-Speech with Convolutional Sequence Learning](https://arxiv.org/abs/1710.07654)\n\n\n## 二、安装\n\n- ### 1、系统依赖\n\n    对于Ubuntu用户，请执行：\n    ```\n    sudo apt-get install libsndfile1\n    ```\n    对于Centos用户，请执行：\n    ```\n    sudo yum install libsndfile\n    ```\n\n- ### 2、环境依赖\n\n  - 2.0.0 > paddlepaddle >= 1.8.2\n\n  - 2.0.0 > paddlehub >= 1.7.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install deepvoice3_ljspeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run deepvoice3_ljspeech --input_text='Simple as this proposition is, it is necessary to be stated' --use_gpu True --vocoder griffin-lim\n    ```\n  - 通过命令行方式实现语音合成模型的调用，更多请见[PaddleHub命令行指令](https://github.com/shinichiye/PaddleHub/blob/release/v2.1/docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import soundfile as sf\n\n    # Load deepvoice3_ljspeech module.\n    module = hub.Module(name=\"deepvoice3_ljspeech\")\n\n    # Predict sentiment label\n    test_texts = ['Simple 
as this proposition is, it is necessary to be stated',\n                'Parakeet stands for Paddle PARAllel text-to-speech toolkit']\n    wavs, sample_rate = module.synthesize(texts=test_texts)\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def synthesize(texts, use_gpu=False, vocoder=\"griffin-lim\"):\n    ```\n\n    - 预测API，由输入文本合成对应音频波形。\n\n    - **参数**\n      - texts (list\\[str\\]): 待预测文本；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA\\_VISIBLE\\_DEVICES环境变量**；\n      - vocoder: 指定声码器，可选 \"griffin-lim\"或\"waveflow\"\n\n    - **返回**\n      - wavs (list): 语音合成结果列表，列表中每一个元素为对应输入文本的音频波形，可使用`soundfile.write`进一步处理或保存。\n      - sample\\_rate (int): 合成音频的采样率。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线语音合成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令\n  - ```shell\n    $ hub serving start -m deepvoice3_ljspeech\n    ```\n  - 这样就完成了服务化API的部署，默认端口号为8866。  \n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    import soundfile as sf\n\n    # 发送HTTP请求\n\n    data = {'texts':['Simple as this proposition is, it is necessary to be stated',\n                    'Parakeet stands for Paddle PARAllel text-to-speech toolkit'],\n            'use_gpu':False}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/deepvoice3_ljspeech\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存结果\n    result = r.json()[\"results\"]\n    wavs = result[\"wavs\"]\n    sample_rate = result[\"sample_rate\"]\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install deepvoice3_ljspeech\n  ```\n"
  },
  {
    "path": "modules/audio/tts/deepvoice3_ljspeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/tts/deepvoice3_ljspeech/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport argparse\nimport ast\nimport importlib.util\n\nimport nltk\nimport numpy as np\nimport paddle.fluid as fluid\nimport paddle.fluid.dygraph as dg\nimport paddlehub as hub\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.common.dir import THIRD_PARTY_HOME\nfrom paddlehub.common.utils import mkdir\nfrom paddlehub.common.downloader import default_downloader\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\n\nlack_dependency = []\nfor dependency in [\"ruamel\", \"parakeet\", \"soundfile\", \"librosa\"]:\n    if not importlib.util.find_spec(dependency):\n        lack_dependency.append(dependency)\n\n# Accelerate NLTK package download via paddlehub. 
'import parakeet' will use the package.\n_PUNKT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/punkt.tar.gz\"\n_CMUDICT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/cmudict.tar.gz\"\nnltk_path = os.path.join(THIRD_PARTY_HOME, \"nltk_data\")\ntokenizers_path = os.path.join(nltk_path, \"tokenizers\")\ncorpora_path = os.path.join(nltk_path, \"corpora\")\npunkt_path = os.path.join(tokenizers_path, \"punkt\")\ncmudict_path = os.path.join(corpora_path, \"cmudict\")\n\nif not os.path.exists(punkt_path):\n    default_downloader.download_file_and_uncompress(url=_PUNKT_URL, save_path=tokenizers_path, print_progress=True)\nif not os.path.exists(cmudict_path):\n    default_downloader.download_file_and_uncompress(url=_CMUDICT_URL, save_path=corpora_path, print_progress=True)\nnltk.data.path.append(nltk_path)\n\nif not lack_dependency:\n    import soundfile as sf\n    import librosa\n    import ruamel.yaml\n    from parakeet.utils import io\n    from parakeet.g2p import en\n    from parakeet.models.deepvoice3 import Encoder, Decoder, PostNet, SpectraNet\n    from parakeet.models.waveflow import WaveFlowModule\n    from parakeet.models.deepvoice3.weight_norm_hook import remove_weight_norm\nelse:\n    raise ImportError(\n        \"The module requires additional dependencies: %s. You can install parakeet via 'git clone https://github.com/PaddlePaddle/Parakeet && cd Parakeet && pip install -e .' 
and others via pip install\"\n        % \", \".join(lack_dependency))\n\n\nclass AttrDict(dict):\n    def __init__(self, *args, **kwargs):\n        super(AttrDict, self).__init__(*args, **kwargs)\n        self.__dict__ = self\n\n\nclass WaveflowVocoder(object):\n    def __init__(self, config_path, checkpoint_path):\n        with open(config_path, 'rt') as f:\n            config = ruamel.yaml.safe_load(f)\n        ns = argparse.Namespace()\n        for k, v in config.items():\n            setattr(ns, k, v)\n        ns.use_fp16 = False\n\n        self.model = WaveFlowModule(ns)\n        io.load_parameters(self.model, checkpoint_path=checkpoint_path)\n\n    def __call__(self, mel):\n        with dg.no_grad():\n            self.model.eval()\n            audio = self.model.synthesize(mel)\n        self.model.train()\n        return audio\n\n\nclass GriffinLimVocoder(object):\n    def __init__(self, sharpening_factor=1.4, sample_rate=22050, n_fft=1024, win_length=1024, hop_length=256):\n        self.sample_rate = sample_rate\n        self.n_fft = n_fft\n        self.sharpening_factor = sharpening_factor\n        self.win_length = win_length\n        self.hop_length = hop_length\n\n    def __call__(self, mel):\n        spec = librosa.feature.inverse.mel_to_stft(\n            np.exp(mel), sr=self.sample_rate, n_fft=self.n_fft, fmin=0, fmax=8000.0, power=1.0)\n        audio = librosa.core.griffinlim(\n            spec**self.sharpening_factor, win_length=self.win_length, hop_length=self.hop_length)\n        return audio\n\n\n@moduleinfo(\n    name=\"deepvoice3_ljspeech\",\n    version=\"1.0.0\",\n    summary=\"Deep Voice 3, a fully-convolutional attention-based neural text-to-speech (TTS) system.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/tts\",\n)\nclass DeepVoice3(hub.NLPPredictionModule):\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.tts_checkpoint_path = 
os.path.join(self.directory, \"assets\", \"tts\", \"step-1780000\")\n        self.waveflow_checkpoint_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"step-2000000\")\n        self.waveflow_config_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"waveflow_ljspeech.yaml\")\n        tts_checkpoint_path = os.path.join(self.directory, \"assets\", \"tts\", \"ljspeech.yaml\")\n        with open(tts_checkpoint_path) as f:\n            self.tts_config = ruamel.yaml.safe_load(f)\n\n        with fluid.dygraph.guard(fluid.CPUPlace()):\n            char_embedding = dg.Embedding((en.n_vocab, self.tts_config[\"char_dim\"]))\n            multi_speaker = self.tts_config[\"n_speakers\"] > 1\n            speaker_embedding = dg.Embedding((self.tts_config[\"n_speakers\"], self.tts_config[\"speaker_dim\"])) \\\n                if multi_speaker else None\n            encoder = Encoder(\n                self.tts_config[\"encoder_layers\"],\n                self.tts_config[\"char_dim\"],\n                self.tts_config[\"encoder_dim\"],\n                self.tts_config[\"kernel_size\"],\n                has_bias=multi_speaker,\n                bias_dim=self.tts_config[\"speaker_dim\"],\n                keep_prob=1.0 - self.tts_config[\"dropout\"])\n            decoder = Decoder(\n                self.tts_config[\"n_mels\"],\n                self.tts_config[\"reduction_factor\"],\n                list(self.tts_config[\"prenet_sizes\"]) + [self.tts_config[\"char_dim\"]],\n                self.tts_config[\"decoder_layers\"],\n                self.tts_config[\"kernel_size\"],\n                self.tts_config[\"attention_dim\"],\n                position_encoding_weight=self.tts_config[\"position_weight\"],\n                omega=self.tts_config[\"position_rate\"],\n                has_bias=multi_speaker,\n                bias_dim=self.tts_config[\"speaker_dim\"],\n                keep_prob=1.0 - self.tts_config[\"dropout\"])\n            postnet = PostNet(\n        
        self.tts_config[\"postnet_layers\"],\n                self.tts_config[\"char_dim\"],\n                self.tts_config[\"postnet_dim\"],\n                self.tts_config[\"kernel_size\"],\n                self.tts_config[\"n_mels\"],\n                self.tts_config[\"reduction_factor\"],\n                has_bias=multi_speaker,\n                bias_dim=self.tts_config[\"speaker_dim\"],\n                keep_prob=1.0 - self.tts_config[\"dropout\"])\n            self.tts_model = SpectraNet(char_embedding, speaker_embedding, encoder, decoder, postnet)\n            io.load_parameters(model=self.tts_model, checkpoint_path=self.tts_checkpoint_path)\n            for name, layer in self.tts_model.named_sublayers():\n                try:\n                    remove_weight_norm(layer)\n                except ValueError:\n                    # this layer has not weight norm hook\n                    pass\n\n            self.waveflow = WaveflowVocoder(\n                config_path=self.waveflow_config_path, checkpoint_path=self.waveflow_checkpoint_path)\n            self.griffin = GriffinLimVocoder(\n                sharpening_factor=self.tts_config[\"sharpening_factor\"],\n                sample_rate=self.tts_config[\"sample_rate\"],\n                n_fft=self.tts_config[\"n_fft\"],\n                win_length=self.tts_config[\"win_length\"],\n                hop_length=self.tts_config[\"hop_length\"])\n\n    def synthesize(self, texts, use_gpu=False, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Get the synthetic wavs from the texts.\n\n        Args:\n             texts(list): the input texts to be predicted.\n             use_gpu(bool): whether use gpu to predict or not\n             vocoder(str): the vocoder name, \"griffin-lim\" or \"waveflow\"\n\n        Returns:\n             wavs(str): the audio wav with sample rate . 
You can use soundfile.write to save it.\n             sample_rate(int): the audio sample rate.\n        \"\"\"\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n\n        place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()\n\n        if texts and isinstance(texts, list):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        wavs = []\n        with fluid.dygraph.guard(place):\n            self.tts_model.eval()\n            self.waveflow.model.eval()\n            monotonic_layers = [4]\n            for text in predicted_data:\n                # init input\n                logger.info(\"Processing sentence: %s\" % text)\n                text = en.text_to_sequence(text, p=1.0)\n                text = np.expand_dims(np.array(text, dtype=\"int64\"), 0)\n                lengths = np.array([text.size], dtype=np.int64)\n                text_seqs = dg.to_variable(text)\n                text_lengths = dg.to_variable(lengths)\n\n                decoder_layers = self.tts_config[\"decoder_layers\"]\n                force_monotonic_attention = [False] * decoder_layers\n                for i in monotonic_layers:\n                    force_monotonic_attention[i] = True\n\n                outputs = self.tts_model(\n                    text_seqs,\n                    text_lengths,\n                    speakers=None,\n                    force_monotonic_attention=force_monotonic_attention,\n                    window=(self.tts_config[\"backward_step\"], self.tts_config[\"forward_step\"]))\n                decoded, refined, attentions = outputs\n                if vocoder == 'griffin-lim':\n                    # synthesis use griffin-lim\n                    wav 
= self.griffin(refined.numpy()[0].T)\n                elif vocoder == 'waveflow':\n                    # synthesis use waveflow\n                    wav = self.waveflow(fluid.layers.transpose(refined, [0, 2, 1])).numpy()[0]\n                else:\n                    raise ValueError(\n                        'vocoder error, we only support griffin-lim and waveflow, but received %s.' % vocoder)\n                wavs.append(wav)\n        return wavs, self.tts_config[\"sample_rate\"]\n\n    @serving\n    def serving_method(self, texts, use_gpu=False, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        wavs, sample_rate = self.synthesize(texts, use_gpu, vocoder)\n        wavs = [wav.tolist() for wav in wavs]\n        result = {\"wavs\": wavs, \"sample_rate\": sample_rate}\n        return result\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\n\n        self.arg_config_group.add_argument(\n            '--vocoder', type=str, default=\"griffin-lim\", choices=['griffin-lim', 'waveflow'], help=\"the vocoder name\")\n\n    def add_module_output_arg(self):\n        \"\"\"\n        Add the command output options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--output_path',\n            type=str,\n            default=os.path.abspath(os.path.join(os.path.curdir, f\"{self.name}_prediction\")),\n            help=\"path to save experiment results\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_output_group = self.parser.add_argument_group(\n            title=\"Output options\", description=\"Output path. Optional.\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.add_module_output_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        mkdir(args.output_path)\n        wavs, sample_rate = self.synthesize(texts=input_data, use_gpu=args.use_gpu, vocoder=args.vocoder)\n\n        for index, wav in enumerate(wavs):\n            sf.write(os.path.join(args.output_path, f\"{index}.wav\"), wav, sample_rate)\n\n        ret = f\"The synthesized wav files have been saved in {args.output_path}\"\n        return ret\n\n\nif __name__ == \"__main__\":\n    module = DeepVoice3()\n    test_text = [\n        \"Simple as this proposition is, it is necessary to be stated\",\n        \"Parakeet stands for Paddle PARAllel text-to-speech toolkit.\",\n    ]\n    wavs, sample_rate = module.synthesize(texts=test_text, vocoder=\"waveflow\")\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n
  },
  {
    "path": "modules/audio/tts/deepvoice3_ljspeech/requirements.txt",
    "content": "librosa\nsoundfile\nruamel.yaml >= 0.16.3\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/README.md",
    "content": "# fastspeech2_baker\n\n|模型名称|fastspeech2_baker|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|FastSpeech2|\n|数据集|Chinese Standard Mandarin Speech Copus|\n|是否支持Fine-tuning|否|\n|模型大小|621MB|\n|最新更新日期|2021-10-20|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nFastSpeech2是微软亚洲研究院和微软Azure语音团队联合浙江大学于2020年提出的语音合成(Text to Speech, TTS)模型。FastSpeech2是FastSpeech的改进版，解决了FastSpeech依赖Teacher-Student的知识蒸馏框架，训练流程比较复杂和训练目标相比真实语音存在信息损失的问题。\n\nFastSpeech2的模型架构如下图所示，它沿用FastSpeech中提出的Feed-Forward Transformer(FFT)架构，但在音素编码器和梅尔频谱解码器中加入了一个可变信息适配器(Variance Adaptor)，从而支持在FastSpeech2中引入更多语音中变化的信息，例如时长、音高、音量(频谱能量)等，来解决语音合成中的一对多映射问题。\n\n<p align=\"center\">\n<img src=\"https://paddlespeech.bj.bcebos.com/Parakeet/docs/images/fastspeech2.png\" hspace='10'/> <br />\n</p>\n\nParallel WaveGAN是一种使用了无蒸馏的对抗生成网络，快速且占用空间小的波形生成方法。该方法通过联合优化多分辨率谱图和对抗损失函数来训练非自回归WaveNet，可以有效捕获真实语音波形的时频分布。Parallel WaveGAN的结构如下图所示：\n\n<p align=\"center\">\n<img src=\"https://paddlespeech.bj.bcebos.com/Parakeet/docs/images/pwg.png\" hspace='10'/> <br />\n</p>\n\nfastspeech2_baker使用了FastSpeech2作为声学模型，使用Parallel WaveGAN作为声码器，并在[中文标准女声音库(Chinese Standard Mandarin Speech Copus)](https://www.data-baker.com/open_source.html)数据集上进行了预训练，可直接用于预测合成音频。\n\n更多详情请参考:\n- [FastSpeech 2: Fast and High-Quality End-to-End Text-to-Speech](https://arxiv.org/abs/2006.04558)\n- [FastSpeech语音合成系统技术升级，微软联合浙大提出FastSpeech2](https://www.msra.cn/zh-cn/news/features/fastspeech2)\n- [Parallel WaveGAN: A fast waveform generation model based on generative adversarial networks with multi-resolution spectrogram](https://arxiv.org/abs/1910.11480)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fastspeech2_baker\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 需要合成语音的文本\n    sentences = ['这是一段测试语音合成的音频。']\n\n    model = hub.Module(\n        name='fastspeech2_baker',\n        version='1.0.0')\n    wav_files =  model.generate(sentences)\n\n    # 打印合成的音频文件的路径\n    print(wav_files)\n    ```\n\n    详情可参考PaddleHub示例：\n    - [语音合成](../../../../demo/text_to_speech)\n\n\n- ### 2、API\n  - ```python\n    def __init__(output_dir)\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `output_dir`： 合成音频文件的输出目录。\n\n  - ```python\n    def generate(\n        sentences,\n        device='cpu',\n    )\n    ```\n    - 将输入的文本合成为音频文件并保存到输出目录。\n\n    - **参数**\n\n      - `sentences`：合成音频的文本列表，类型为`List[str]`。\n      - `device`：预测时使用的设备，默认为\b`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `wav_files`：`List[str]`类型，返回合成音频的存放路径。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m fastspeech2_baker\n    ```\n\n  - 这样就完成了一个语音识别服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要合成语音的文本\n    sentences = [\n        '这是第一段测试语音合成的音频。',\n        '这是第二段测试语音合成的音频。',\n    ]\n\n    # 以key的方式指定text传入预测方法的时的参数，此例中为\"sentences\"\n    data = {\"sentences\": sentences}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/fastspeech2_baker\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install fastspeech2_baker\n  ```\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/assets/fastspeech2_nosil_baker_ckpt_0.4/default.yaml",
    "content": "###########################################################\n#                FEATURE EXTRACTION SETTING               #\n###########################################################\n\nfs: 24000          # sr\nn_fft: 2048        # FFT size.\nn_shift: 300       # Hop size.\nwin_length: 1200   # Window length.\n                   # If set to null, it will be the same as fft_size.\nwindow: \"hann\"     # Window function.\n\n# Only used for feats_type != raw\n\nfmin: 80           # Minimum frequency of Mel basis.\nfmax: 7600         # Maximum frequency of Mel basis.\nn_mels: 80         # The number of mel basis.\n\n# Only used for the model using pitch features (e.g. FastSpeech2)\nf0min: 80          # Maximum f0 for pitch extraction.\nf0max: 400         # Minimum f0 for pitch extraction.\n\n\n###########################################################\n#                       DATA SETTING                      #\n###########################################################\nbatch_size: 64\nnum_workers: 4\n\n\n###########################################################\n#                       MODEL SETTING                     #\n###########################################################\nmodel:\n    adim: 384         # attention dimension\n    aheads: 2         # number of attention heads\n    elayers: 4        # number of encoder layers\n    eunits: 1536      # number of encoder ff units\n    dlayers: 4        # number of decoder layers\n    dunits: 1536      # number of decoder ff units\n    positionwise_layer_type: conv1d   # type of position-wise layer\n    positionwise_conv_kernel_size: 3  # kernel size of position wise conv layer\n    duration_predictor_layers: 2      # number of layers of duration predictor\n    duration_predictor_chans: 256     # number of channels of duration predictor\n    duration_predictor_kernel_size: 3 # filter size of duration predictor\n    postnet_layers: 5                 # number of layers of postnset\n    
postnet_filts: 5                  # filter size of conv layers in postnet\n    postnet_chans: 256                # number of channels of conv layers in postnet\n    use_masking: True                 # whether to apply masking for padded part in loss calculation\n    use_scaled_pos_enc: True          # whether to use scaled positional encoding\n    encoder_normalize_before: True    # whether to perform layer normalization before the input\n    decoder_normalize_before: True    # whether to perform layer normalization before the input\n    reduction_factor: 1               # reduction factor\n    init_type: xavier_uniform         # initialization type\n    init_enc_alpha: 1.0               # initial value of alpha of encoder scaled position encoding\n    init_dec_alpha: 1.0               # initial value of alpha of decoder scaled position encoding\n    transformer_enc_dropout_rate: 0.2            # dropout rate for transformer encoder layer\n    transformer_enc_positional_dropout_rate: 0.2 # dropout rate for transformer encoder positional encoding\n    transformer_enc_attn_dropout_rate: 0.2       # dropout rate for transformer encoder attention layer\n    transformer_dec_dropout_rate: 0.2            # dropout rate for transformer decoder layer\n    transformer_dec_positional_dropout_rate: 0.2 # dropout rate for transformer decoder positional encoding\n    transformer_dec_attn_dropout_rate: 0.2       # dropout rate for transformer decoder attention layer\n    pitch_predictor_layers: 5                  # number of conv layers in pitch predictor\n    pitch_predictor_chans: 256                 # number of channels of conv layers in pitch predictor\n    pitch_predictor_kernel_size: 5             # kernel size of conv leyers in pitch predictor\n    pitch_predictor_dropout: 0.5               # dropout rate in pitch predictor\n    pitch_embed_kernel_size: 1                 # kernel size of conv embedding layer for pitch\n    pitch_embed_dropout: 0.0                   # 
dropout rate after conv embedding layer for pitch\n    stop_gradient_from_pitch_predictor: true   # whether to stop the gradient from pitch predictor to encoder\n    energy_predictor_layers: 2                 # number of conv layers in energy predictor\n    energy_predictor_chans: 256                # number of channels of conv layers in energy predictor\n    energy_predictor_kernel_size: 3            # kernel size of conv layers in energy predictor\n    energy_predictor_dropout: 0.5              # dropout rate in energy predictor\n    energy_embed_kernel_size: 1                # kernel size of conv embedding layer for energy\n    energy_embed_dropout: 0.0                  # dropout rate after conv embedding layer for energy\n    stop_gradient_from_energy_predictor: false # whether to stop the gradient from energy predictor to encoder\n\n\n\n###########################################################\n#                       UPDATER SETTING                   #\n###########################################################\nupdater:\n    use_masking: True                 # whether to apply masking for padded part in loss calculation\n\n\n\n###########################################################\n#                     OPTIMIZER SETTING                   #\n###########################################################\noptimizer:\n  optim: adam               # optimizer type\n  learning_rate: 0.001      # learning rate\n\n###########################################################\n#                     TRAINING SETTING                    #\n###########################################################\nmax_epoch: 1000\nnum_snapshots: 5\n\n\n###########################################################\n#                       OTHER SETTING                     #\n###########################################################\nseed: 10086\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/assets/fastspeech2_nosil_baker_ckpt_0.4/phone_id_map.txt",
    "content": "<pad> 0\n<unk> 1\na1 2\na2 3\na3 4\na4 5\na5 6\nai1 7\nai2 8\nai3 9\nai4 10\nai5 11\nair2 12\nair4 13\nan1 14\nan2 15\nan3 16\nan4 17\nan5 18\nang1 19\nang2 20\nang3 21\nang4 22\nang5 23\nanr1 24\nanr3 25\nanr4 26\nao1 27\nao2 28\nao3 29\nao4 30\nao5 31\naor3 32\naor4 33\nar2 34\nar3 35\nar4 36\nb 37\nc 38\nch 39\nd 40\ne1 41\ne2 42\ne3 43\ne4 44\ne5 45\nei1 46\nei2 47\nei3 48\nei4 49\nei5 50\nen1 51\nen2 52\nen3 53\nen4 54\nen5 55\neng1 56\neng2 57\neng3 58\neng4 59\neng5 60\nenr1 61\nenr2 62\nenr4 63\nenr5 64\ner2 65\ner3 66\ner4 67\ner5 68\nf 69\ng 70\nh 71\ni1 72\ni2 73\ni3 74\ni4 75\ni5 76\nia1 77\nia2 78\nia3 79\nia4 80\nia5 81\nian1 82\nian2 83\nian3 84\nian4 85\nian5 86\niang1 87\niang2 88\niang3 89\niang4 90\niang5 91\niangr4 92\nianr1 93\nianr2 94\nianr3 95\niao1 96\niao2 97\niao3 98\niao4 99\niao5 100\niar1 101\niar3 102\nie1 103\nie2 104\nie3 105\nie4 106\nie5 107\nii1 108\nii2 109\nii3 110\nii4 111\nii5 112\niii1 113\niii2 114\niii3 115\niii4 116\niii5 117\niiir4 118\niir2 119\nin1 120\nin2 121\nin3 122\nin4 123\nin5 124\ning1 125\ning2 126\ning3 127\ning4 128\ning5 129\ningr2 130\ningr3 131\ninr4 132\nio1 133\nio5 134\niong1 135\niong2 136\niong3 137\niong4 138\niong5 139\niou1 140\niou2 141\niou3 142\niou4 143\niou5 144\niour1 145\nir1 146\nir2 147\nir3 148\nir4 149\nir5 150\nj 151\nk 152\nl 153\nm 154\nn 155\no1 156\no2 157\no3 158\no4 159\no5 160\nong1 161\nong2 162\nong3 163\nong4 164\nong5 165\nongr4 166\nou1 167\nou2 168\nou3 169\nou4 170\nou5 171\nour2 172\np 173\nq 174\nr 175\ns 176\nsh 177\nsil 178\nsp 179\nspl 180\nspn 181\nt 182\nu1 183\nu2 184\nu3 185\nu4 186\nu5 187\nua1 188\nua2 189\nua3 190\nua4 191\nua5 192\nuai1 193\nuai2 194\nuai3 195\nuai4 196\nuai5 197\nuair4 198\nuan1 199\nuan2 200\nuan3 201\nuan4 202\nuan5 203\nuang1 204\nuang2 205\nuang3 206\nuang4 207\nuang5 208\nuanr1 209\nuanr2 210\nuei1 211\nuei2 212\nuei3 213\nuei4 214\nuei5 215\nueir1 216\nueir3 217\nueir4 218\nuen1 219\nuen2 220\nuen3 221\nuen4 222\nuen5 
223\nueng1 224\nueng2 225\nueng3 226\nueng4 227\nuenr3 228\nuenr4 229\nuo1 230\nuo2 231\nuo3 232\nuo4 233\nuo5 234\nuor2 235\nuor3 236\nur3 237\nur4 238\nv1 239\nv2 240\nv3 241\nv4 242\nv5 243\nvan1 244\nvan2 245\nvan3 246\nvan4 247\nvan5 248\nvanr4 249\nve1 250\nve2 251\nve3 252\nve4 253\nve5 254\nvn1 255\nvn2 256\nvn3 257\nvn4 258\nvn5 259\nx 260\nz 261\nzh 262\n， 263\n。 264\n？ 265\n！ 266\n<eos> 267\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/assets/pwg_baker_ckpt_0.4/pwg_default.yaml",
    "content": "# This is the hyperparameter configuration file for Parallel WaveGAN.\n# Please make sure this is adjusted for the CSMSC dataset. If you want to\n# apply to the other dataset, you might need to carefully change some parameters.\n# This configuration requires 12 GB GPU memory and takes ~3 days on RTX TITAN.\n\n###########################################################\n#                FEATURE EXTRACTION SETTING               #\n###########################################################\nfs: 24000                # Sampling rate.\nn_fft: 2048              # FFT size. (in samples)\nn_shift: 300             # Hop size. (in samples)\nwin_length: 1200         # Window length. (in samples)\n                         # If set to null, it will be the same as fft_size.\nwindow: \"hann\"           # Window function.\nn_mels: 80             # Number of mel basis.\nfmin: 80                 # Minimum freq in mel basis calculation.\nfmax: 7600               # Maximum frequency in mel basis calculation.\n# global_gain_scale: 1.0   # Will be multiplied to all of waveform.\ntrim_silence: false      # Whether to trim the start and end of silence.\ntop_db: 60 # Need to tune carefully if the recording is not good.\ntrim_frame_length: 2048    # Frame size in trimming.(in samples)\ntrim_hop_length: 512       # Hop size in trimming.(in samples)\n\n###########################################################\n#         GENERATOR NETWORK ARCHITECTURE SETTING          #\n###########################################################\ngenerator_params:\n    in_channels: 1        # Number of input channels.\n    out_channels: 1       # Number of output channels.\n    kernel_size: 3        # Kernel size of dilated convolution.\n    layers: 30            # Number of residual block layers.\n    stacks: 3             # Number of stacks i.e., dilation cycles.\n    residual_channels: 64 # Number of channels in residual conv.\n    gate_channels: 128    # Number of channels in gated 
conv.\n    skip_channels: 64     # Number of channels in skip conv.\n    aux_channels: 80      # Number of channels for auxiliary feature conv.\n                          # Must be the same as num_mels.\n    aux_context_window: 2 # Context window size for auxiliary feature.\n                          # If set to 2, previous 2 and future 2 frames will be considered.\n    dropout: 0.0          # Dropout rate. 0.0 means no dropout applied.\n    bias: true            # Whether to use bias in residual blocks.\n    use_weight_norm: true # Whether to use weight norm.\n                          # If set to true, it will be applied to all of the conv layers.\n    use_causal_conv: false               # Whether to use causal conv in residual blocks and upsample layers.\n    # upsample_net: \"ConvInUpsampleNetwork\" # Upsampling network architecture.\n    upsample_scales: [4, 5, 3, 5]     # Upsampling scales. Product of these must be the same as hop size.\n    interpolate_mode: \"nearest\" # Upsample net interpolation mode.\n    freq_axis_kernel_size: 1 # Upsampling net: convolution kernel size in frequency axis.\n    nonlinear_activation: null\n    nonlinear_activation_params: {}\n\n###########################################################\n#       DISCRIMINATOR NETWORK ARCHITECTURE SETTING        #\n###########################################################\ndiscriminator_params:\n    in_channels: 1        # Number of input channels.\n    out_channels: 1       # Number of output channels.\n    kernel_size: 3        # Kernel size of conv layers.\n    layers: 10            # Number of conv layers.\n    conv_channels: 64     # Number of conv channels.\n    bias: true            # Whether to use bias parameter in conv.\n    use_weight_norm: true # Whether to use weight norm.\n                          # If set to true, it will be applied to all of the conv layers.\n    nonlinear_activation: \"LeakyReLU\" # Nonlinear function after each conv.\n    nonlinear_activation_params:      # Nonlinear function 
parameters\n        negative_slope: 0.2           # Alpha in LeakyReLU.\n\n###########################################################\n#                   STFT LOSS SETTING                     #\n###########################################################\nstft_loss_params:\n    fft_sizes: [1024, 2048, 512]  # List of FFT size for STFT-based loss.\n    hop_sizes: [120, 240, 50]     # List of hop size for STFT-based loss.\n    win_lengths: [600, 1200, 240] # List of window length for STFT-based loss.\n    window: \"hann\"         # Window function for STFT-based loss.\n\n###########################################################\n#               ADVERSARIAL LOSS SETTING                  #\n###########################################################\nlambda_adv: 4.0  # Loss balancing coefficient.\n\n###########################################################\n#                  DATA LOADER SETTING                    #\n###########################################################\nbatch_size: 6              # Batch size.\nbatch_max_steps: 25500     # Length of each audio in batch. Make sure it is divisible by hop_size.\npin_memory: true           # Whether to pin memory in Pytorch DataLoader.\nnum_workers: 4             # Number of workers in Pytorch DataLoader.\nremove_short_samples: true # Whether to remove samples whose length is less than batch_max_steps.\nallow_cache: true          # Whether to allow cache in dataset. 
If true, it requires cpu memory.\n\n###########################################################\n#             OPTIMIZER & SCHEDULER SETTING               #\n###########################################################\ngenerator_optimizer_params:\n    epsilon: 1.0e-6            # Generator's epsilon.\n    weight_decay: 0.0      # Generator's weight decay coefficient.\ngenerator_scheduler_params:\n    learning_rate: 0.0001             # Generator's learning rate.\n    step_size: 200000      # Generator's scheduler step size.\n    gamma: 0.5             # Generator's scheduler gamma.\n                           # At each step size, lr will be multiplied by this parameter.\ngenerator_grad_norm: 10    # Generator's gradient norm.\ndiscriminator_optimizer_params:\n    epsilon: 1.0e-6            # Discriminator's epsilon.\n    weight_decay: 0.0      # Discriminator's weight decay coefficient.\ndiscriminator_scheduler_params:\n    learning_rate: 0.00005            # Discriminator's learning rate.\n    step_size: 200000      # Discriminator's scheduler step size.\n    gamma: 0.5             # Discriminator's scheduler gamma.\n                           # At each step size, lr will be multiplied by this parameter.\ndiscriminator_grad_norm: 1 # Discriminator's gradient norm.\n\n###########################################################\n#                    INTERVAL SETTING                     #\n###########################################################\ndiscriminator_train_start_steps: 100000 # Number of steps to start to train discriminator.\ntrain_max_steps: 400000                 # Number of training steps.\nsave_interval_steps: 5000               # Interval steps to save checkpoint.\neval_interval_steps: 1000               # Interval steps to evaluate the network.\n\n\n###########################################################\n#                     OTHER SETTING                       
#\n###########################################################\nnum_save_intermediate_results: 4  # Number of results to be saved as intermediate results.\nnum_snapshots: 10                 # max number of snapshots to keep while training\nseed: 42                          # random seed for paddle, random, and np.random\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nfrom typing import List\n\nimport numpy as np\nimport paddle\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom parakeet.frontend.zh_frontend import Frontend\nfrom parakeet.models.fastspeech2 import FastSpeech2\nfrom parakeet.models.fastspeech2 import FastSpeech2Inference\nfrom parakeet.models.parallel_wavegan import PWGGenerator\nfrom parakeet.models.parallel_wavegan import PWGInference\nfrom parakeet.modules.normalizer import ZScore\nimport soundfile as sf\nfrom yacs.config import CfgNode\nimport yaml\n\n\n@moduleinfo(name=\"fastspeech2_baker\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/tts\")\nclass FastSpeech(paddle.nn.Layer):\n    def __init__(self, output_dir='./wavs'):\n        super(FastSpeech, self).__init__()\n        fastspeech2_res_dir = os.path.join(MODULE_HOME, 'fastspeech2_baker', 'assets/fastspeech2_nosil_baker_ckpt_0.4')\n        pwg_res_dir = os.path.join(MODULE_HOME, 'fastspeech2_baker', 'assets/pwg_baker_ckpt_0.4')\n\n        phones_dict = os.path.join(fastspeech2_res_dir, 'phone_id_map.txt')\n        with open(phones_dict, \"r\") as f:\n            phn_id = [line.strip().split() for line in f.readlines()]\n        vocab_size = len(phn_id)\n\n     
   # fastspeech2\n        fastspeech2_config = os.path.join(fastspeech2_res_dir, 'default.yaml')\n        with open(fastspeech2_config) as f:\n            fastspeech2_config = CfgNode(yaml.safe_load(f))\n        self.samplerate = fastspeech2_config.fs\n\n        fastspeech2_checkpoint = os.path.join(fastspeech2_res_dir, 'snapshot_iter_76000.pdz')\n        model = FastSpeech2(idim=vocab_size, odim=fastspeech2_config.n_mels, **fastspeech2_config[\"model\"])\n        model.set_state_dict(paddle.load(fastspeech2_checkpoint)[\"main_params\"])\n        logger.info('Load fastspeech2 params from %s' % os.path.abspath(fastspeech2_checkpoint))\n        model.eval()\n\n        # vocoder\n        pwg_config = os.path.join(pwg_res_dir, 'pwg_default.yaml')\n        with open(pwg_config) as f:\n            pwg_config = CfgNode(yaml.safe_load(f))\n\n        pwg_checkpoint = os.path.join(pwg_res_dir, 'pwg_snapshot_iter_400000.pdz')\n        vocoder = PWGGenerator(**pwg_config[\"generator_params\"])\n        vocoder.set_state_dict(paddle.load(pwg_checkpoint)[\"generator_params\"])\n        logger.info('Load vocoder params from %s' % os.path.abspath(pwg_checkpoint))\n        vocoder.remove_weight_norm()\n        vocoder.eval()\n\n        # frontend\n        self.frontend = Frontend(phone_vocab_path=phones_dict)\n\n        # stat\n        fastspeech2_stat = os.path.join(fastspeech2_res_dir, 'speech_stats.npy')\n        stat = np.load(fastspeech2_stat)\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        fastspeech2_normalizer = ZScore(mu, std)\n\n        pwg_stat = os.path.join(pwg_res_dir, 'pwg_stats.npy')\n        stat = np.load(pwg_stat)\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        pwg_normalizer = ZScore(mu, std)\n\n        # inference\n        self.fastspeech2_inference = FastSpeech2Inference(fastspeech2_normalizer, model)\n        self.pwg_inference = 
PWGInference(pwg_normalizer, vocoder)\n\n        self.output_dir = Path(output_dir)\n        self.output_dir.mkdir(parents=True, exist_ok=True)\n\n    def forward(self, text: str):\n        wav = None\n        input_ids = self.frontend.get_input_ids(text, merge_sentences=True)\n        phone_ids = input_ids[\"phone_ids\"]\n        for part_phone_ids in phone_ids:\n            with paddle.no_grad():\n                mel = self.fastspeech2_inference(part_phone_ids)\n                temp_wav = self.pwg_inference(mel)\n                if wav is None:\n                    wav = temp_wav\n                else:\n                    wav = paddle.concat([wav, temp_wav])\n\n        return wav\n\n    @serving\n    def generate(self, sentences: List[str], device='cpu'):\n        # Validate every element so an empty list or a non-str item fails with a clear\n        # message instead of an IndexError on sentences[0].\n        assert isinstance(sentences, list) and all(isinstance(s, str) for s in sentences), \\\n            'Input data should be List[str], but got {}'.format(type(sentences))\n\n        paddle.set_device(device)\n        wav_files = []\n        for i, sentence in enumerate(sentences):\n            wav = self(sentence)\n            wav_file = str(self.output_dir.absolute() / (str(i + 1) + \".wav\"))\n            sf.write(wav_file, wav.numpy(), samplerate=self.samplerate)\n            wav_files.append(wav_file)\n\n        logger.info('{} wave files have been generated in {}'.format(len(sentences), self.output_dir.absolute()))\n        return wav_files\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_baker/requirements.txt",
    "content": "git+https://github.com/PaddlePaddle/Parakeet@8040cb0#egg=paddle-parakeet\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/README.md",
    "content": "# fastspeech2_ljspeech\n\n|模型名称|fastspeech2_ljspeech|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|FastSpeech2|\n|数据集|LJSpeech-1.1|\n|是否支持Fine-tuning|否|\n|模型大小|425MB|\n|最新更新日期|2021-10-20|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nFastSpeech2是微软亚洲研究院和微软Azure语音团队联合浙江大学于2020年提出的语音合成(Text to Speech, TTS)模型。FastSpeech2是FastSpeech的改进版，解决了FastSpeech依赖Teacher-Student的知识蒸馏框架，训练流程比较复杂和训练目标相比真实语音存在信息损失的问题。\n\nFastSpeech2的模型架构如下图所示，它沿用FastSpeech中提出的Feed-Forward Transformer(FFT)架构，但在音素编码器和梅尔频谱解码器中加入了一个可变信息适配器(Variance Adaptor)，从而支持在FastSpeech2中引入更多语音中变化的信息，例如时长、音高、音量(频谱能量)等，来解决语音合成中的一对多映射问题。\n\n<p align=\"center\">\n<img src=\"https://paddlespeech.bj.bcebos.com/Parakeet/docs/images/fastspeech2.png\" hspace='10'/> <br />\n</p>\n\nParallel WaveGAN是一种使用了无蒸馏的对抗生成网络，快速且占用空间小的波形生成方法。该方法通过联合优化多分辨率谱图和对抗损失函数来训练非自回归WaveNet，可以有效捕获真实语音波形的时频分布。Parallel WaveGAN的结构如下图所示：\n\n<p align=\"center\">\n<img src=\"https://paddlespeech.bj.bcebos.com/Parakeet/docs/images/pwg.png\" hspace='10'/> <br />\n</p>\n\nfastspeech2_ljspeech使用了FastSpeech2作为声学模型，使用Parallel WaveGAN作为声码器，并在[The LJ Speech Dataset](https://keithito.com/LJ-Speech-Dataset/)数据集上进行了预训练，可直接用于预测合成音频。\n\n更多详情请参考:\n- [FastSpeech 2: Fast and High-Quality End-to-End Text-to-Speech](https://arxiv.org/abs/2006.04558)\n- [FastSpeech语音合成系统技术升级，微软联合浙大提出FastSpeech2](https://www.msra.cn/zh-cn/news/features/fastspeech2)\n- [Parallel WaveGAN: A fast waveform generation model based on generative adversarial networks with multi-resolution spectrogram](https://arxiv.org/abs/1910.11480)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fastspeech2_ljspeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    # 需要合成语音的文本\n    sentences = ['The quick brown fox jumps over a lazy dog.']\n\n    model = hub.Module(\n        name='fastspeech2_ljspeech',\n        version='1.0.0')\n    wav_files = model.generate(sentences)\n\n    # 打印合成的音频文件的路径\n    print(wav_files)\n    ```\n\n    详情可参考PaddleHub示例：\n    - [语音合成](../../../../demo/text_to_speech)\n\n\n- ### 2、API\n  - ```python\n    def __init__(output_dir)\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `output_dir`： 合成音频文件的输出目录。\n\n  - ```python\n    def generate(\n        sentences,\n        device='cpu',\n    )\n    ```\n    - 将输入的文本合成为音频文件并保存到输出目录。\n\n    - **参数**\n\n      - `sentences`：合成音频的文本列表，类型为`List[str]`。\n      - `device`：预测时使用的设备，默认为`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `wav_files`：`List[str]`类型，返回合成音频的存放路径。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的语音合成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m fastspeech2_ljspeech\n    ```\n\n  - 这样就完成了一个语音合成服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 需要合成语音的文本\n    sentences = [\n        'The quick brown fox jumps over a lazy dog.',\n        'Today is a good day!',\n    ]\n\n    # 以key的方式指定传入预测方法的参数，此例中为\"sentences\"\n    data = {\"sentences\": sentences}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/fastspeech2_ljspeech\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install fastspeech2_ljspeech\n  
```\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/assets/fastspeech2_nosil_ljspeech_ckpt_0.5/default.yaml",
    "content": "###########################################################\n#                FEATURE EXTRACTION SETTING               #\n###########################################################\n\nfs: 22050          # sr\nn_fft: 1024        # FFT size.\nn_shift: 256       # Hop size.\nwin_length: null   # Window length.\n                   # If set to null, it will be the same as fft_size.\nwindow: \"hann\"     # Window function.\n\n# Only used for feats_type != raw\n\nfmin: 80           # Minimum frequency of Mel basis.\nfmax: 7600         # Maximum frequency of Mel basis.\nn_mels: 80         # The number of mel basis.\n\n# Only used for the model using pitch features (e.g. FastSpeech2)\nf0min: 80          # Maximum f0 for pitch extraction.\nf0max: 400         # Minimum f0 for pitch extraction.\n\n\n###########################################################\n#                       DATA SETTING                      #\n###########################################################\nbatch_size: 64\nnum_workers: 4\n\n\n###########################################################\n#                       MODEL SETTING                     #\n###########################################################\nmodel:\n    adim: 384         # attention dimension\n    aheads: 2         # number of attention heads\n    elayers: 4        # number of encoder layers\n    eunits: 1536      # number of encoder ff units\n    dlayers: 4        # number of decoder layers\n    dunits: 1536      # number of decoder ff units\n    positionwise_layer_type: conv1d   # type of position-wise layer\n    positionwise_conv_kernel_size: 3  # kernel size of position wise conv layer\n    duration_predictor_layers: 2      # number of layers of duration predictor\n    duration_predictor_chans: 256     # number of channels of duration predictor\n    duration_predictor_kernel_size: 3 # filter size of duration predictor\n    postnet_layers: 5                 # number of layers of postnset\n    
postnet_filts: 5                  # filter size of conv layers in postnet\n    postnet_chans: 256                # number of channels of conv layers in postnet\n    use_masking: True                 # whether to apply masking for padded part in loss calculation\n    use_scaled_pos_enc: True          # whether to use scaled positional encoding\n    encoder_normalize_before: True    # whether to perform layer normalization before the input\n    decoder_normalize_before: True    # whether to perform layer normalization before the input\n    reduction_factor: 1               # reduction factor\n    init_type: xavier_uniform         # initialization type\n    init_enc_alpha: 1.0               # initial value of alpha of encoder scaled position encoding\n    init_dec_alpha: 1.0               # initial value of alpha of decoder scaled position encoding\n    transformer_enc_dropout_rate: 0.2            # dropout rate for transformer encoder layer\n    transformer_enc_positional_dropout_rate: 0.2 # dropout rate for transformer encoder positional encoding\n    transformer_enc_attn_dropout_rate: 0.2       # dropout rate for transformer encoder attention layer\n    transformer_dec_dropout_rate: 0.2            # dropout rate for transformer decoder layer\n    transformer_dec_positional_dropout_rate: 0.2 # dropout rate for transformer decoder positional encoding\n    transformer_dec_attn_dropout_rate: 0.2       # dropout rate for transformer decoder attention layer\n    pitch_predictor_layers: 5                  # number of conv layers in pitch predictor\n    pitch_predictor_chans: 256                 # number of channels of conv layers in pitch predictor\n    pitch_predictor_kernel_size: 5             # kernel size of conv layers in pitch predictor\n    pitch_predictor_dropout: 0.5               # dropout rate in pitch predictor\n    pitch_embed_kernel_size: 1                 # kernel size of conv embedding layer for pitch\n    pitch_embed_dropout: 0.0                   # 
dropout rate after conv embedding layer for pitch\n    stop_gradient_from_pitch_predictor: true   # whether to stop the gradient from pitch predictor to encoder\n    energy_predictor_layers: 2                 # number of conv layers in energy predictor\n    energy_predictor_chans: 256                # number of channels of conv layers in energy predictor\n    energy_predictor_kernel_size: 3            # kernel size of conv layers in energy predictor\n    energy_predictor_dropout: 0.5              # dropout rate in energy predictor\n    energy_embed_kernel_size: 1                # kernel size of conv embedding layer for energy\n    energy_embed_dropout: 0.0                  # dropout rate after conv embedding layer for energy\n    stop_gradient_from_energy_predictor: false # whether to stop the gradient from energy predictor to encoder\n\n\n\n###########################################################\n#                       UPDATER SETTING                   #\n###########################################################\nupdater:\n    use_masking: True                 # whether to apply masking for padded part in loss calculation\n\n\n\n###########################################################\n#                     OPTIMIZER SETTING                   #\n###########################################################\noptimizer:\n  optim: adam               # optimizer type\n  learning_rate: 0.001      # learning rate\n\n###########################################################\n#                     TRAINING SETTING                    #\n###########################################################\nmax_epoch: 1000\nnum_snapshots: 5\n\n\n###########################################################\n#                       OTHER SETTING                     #\n###########################################################\nseed: 10086\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/assets/fastspeech2_nosil_ljspeech_ckpt_0.5/phone_id_map.txt",
    "content": "<pad> 0\n<unk> 1\nAA0 2\nAA1 3\nAA2 4\nAE0 5\nAE1 6\nAE2 7\nAH0 8\nAH1 9\nAH2 10\nAO0 11\nAO1 12\nAO2 13\nAW0 14\nAW1 15\nAW2 16\nAY0 17\nAY1 18\nAY2 19\nB 20\nCH 21\nD 22\nDH 23\nEH0 24\nEH1 25\nEH2 26\nER0 27\nER1 28\nER2 29\nEY0 30\nEY1 31\nEY2 32\nF 33\nG 34\nHH 35\nIH0 36\nIH1 37\nIH2 38\nIY0 39\nIY1 40\nIY2 41\nJH 42\nK 43\nL 44\nM 45\nN 46\nNG 47\nOW0 48\nOW1 49\nOW2 50\nOY0 51\nOY1 52\nOY2 53\nP 54\nR 55\nS 56\nSH 57\nT 58\nTH 59\nUH0 60\nUH1 61\nUH2 62\nUW0 63\nUW1 64\nUW2 65\nV 66\nW 67\nY 68\nZ 69\nZH 70\nsil 71\nsp 72\nspl 73\nspn 74\n, 75\n. 76\n? 77\n! 78\n<eos> 79\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/assets/pwg_ljspeech_ckpt_0.5/pwg_default.yaml",
    "content": "# This is the hyperparameter configuration file for Parallel WaveGAN.\n# Please make sure this is adjusted for the LJSpeech dataset. If you want to\n# apply to the other dataset, you might need to carefully change some parameters.\n# This configuration requires 12 GB GPU memory and takes ~3 days on TITAN V.\n\n###########################################################\n#                FEATURE EXTRACTION SETTING               #\n###########################################################\nfs: 22050                # Sampling rate.\nn_fft: 1024              # FFT size. (in samples)\nn_shift: 256             # Hop size. (in samples)\nwin_length: null         # Window length. (in samples)\n                         # If set to null, it will be the same as fft_size.\nwindow: \"hann\"           # Window function.\nn_mels: 80               # Number of mel basis.\nfmin: 80                 # Minimum freq in mel basis calculation. (Hz)\nfmax: 7600               # Maximum frequency in mel basis calculation. (Hz)\ntrim_silence: false      # Whether to trim the start and end of silence.\ntop_db: 60               # Need to tune carefully if the recording is not good.\ntrim_frame_length: 2048    # Frame size in trimming. (in samples)\ntrim_hop_length: 512       # Hop size in trimming. 
(in samples)\n\n###########################################################\n#         GENERATOR NETWORK ARCHITECTURE SETTING          #\n###########################################################\ngenerator_params:\n    in_channels: 1        # Number of input channels.\n    out_channels: 1       # Number of output channels.\n    kernel_size: 3        # Kernel size of dilated convolution.\n    layers: 30            # Number of residual block layers.\n    stacks: 3             # Number of stacks i.e., dilation cycles.\n    residual_channels: 64 # Number of channels in residual conv.\n    gate_channels: 128    # Number of channels in gated conv.\n    skip_channels: 64     # Number of channels in skip conv.\n    aux_channels: 80      # Number of channels for auxiliary feature conv.\n                          # Must be the same as num_mels.\n    aux_context_window: 2 # Context window size for auxiliary feature.\n                          # If set to 2, previous 2 and future 2 frames will be considered.\n    dropout: 0.0          # Dropout rate. 0.0 means no dropout applied.\n    use_weight_norm: true # Whether to use weight norm.\n                          # If set to true, it will be applied to all of the conv layers.\n    upsample_scales: [4, 4, 4, 4]     # Upsampling scales. 
Product of these must be the same as hop size.\n\n###########################################################\n#       DISCRIMINATOR NETWORK ARCHITECTURE SETTING        #\n###########################################################\ndiscriminator_params:\n    in_channels: 1        # Number of input channels.\n    out_channels: 1       # Number of output channels.\n    kernel_size: 3        # Kernel size of conv layers.\n    layers: 10            # Number of conv layers.\n    conv_channels: 64     # Number of conv channels.\n    bias: true            # Whether to use bias parameter in conv.\n    use_weight_norm: true # Whether to use weight norm.\n                          # If set to true, it will be applied to all of the conv layers.\n    nonlinear_activation: \"LeakyReLU\" # Nonlinear function after each conv.\n    nonlinear_activation_params:      # Nonlinear function parameters\n        negative_slope: 0.2           # Alpha in LeakyReLU.\n\n###########################################################\n#                   STFT LOSS SETTING                     #\n###########################################################\nstft_loss_params:\n    fft_sizes: [1024, 2048, 512]  # List of FFT size for STFT-based loss.\n    hop_sizes: [120, 240, 50]     # List of hop size for STFT-based loss.\n    win_lengths: [600, 1200, 240] # List of window length for STFT-based loss.\n    window: \"hann\"         # Window function for STFT-based loss.\n\n###########################################################\n#               ADVERSARIAL LOSS SETTING                  #\n###########################################################\nlambda_adv: 4.0  # Loss balancing coefficient.\n\n###########################################################\n#                  DATA LOADER SETTING                    #\n###########################################################\nbatch_size: 8              # Batch size.\nbatch_max_steps: 25600     # Length of each audio in batch. 
Make sure it is divisible by hop_size.\npin_memory: true           # Whether to pin memory in PyTorch DataLoader.\nnum_workers: 4             # Number of workers in PyTorch DataLoader.\nremove_short_samples: true # Whether to remove samples whose length is less than batch_max_steps.\nallow_cache: true          # Whether to allow cache in dataset. If true, it requires CPU memory.\n\n###########################################################\n#             OPTIMIZER & SCHEDULER SETTING               #\n###########################################################\ngenerator_optimizer_params:\n    epsilon: 1.0e-6        # Generator's epsilon.\n    weight_decay: 0.0      # Generator's weight decay coefficient.\ngenerator_scheduler_params:\n    learning_rate: 0.0001  # Generator's learning rate.\n    step_size: 200000      # Generator's scheduler step size.\n    gamma: 0.5             # Generator's scheduler gamma.\n                           # At each step size, lr will be multiplied by this parameter.\ngenerator_grad_norm: 10    # Generator's gradient norm.\ndiscriminator_optimizer_params:\n    epsilon: 1.0e-6        # Discriminator's epsilon.\n    weight_decay: 0.0      # Discriminator's weight decay coefficient.\ndiscriminator_scheduler_params:\n    learning_rate: 0.00005 # Discriminator's learning rate.\n    step_size: 200000      # Discriminator's scheduler step size.\n    gamma: 0.5             # Discriminator's scheduler gamma.\n                           # At each step size, lr will be multiplied by this parameter.\ndiscriminator_grad_norm: 1 # Discriminator's gradient norm.\n\n###########################################################\n#                    INTERVAL SETTING                     #\n###########################################################\ndiscriminator_train_start_steps: 100000 # Number of steps after which discriminator training starts.\ntrain_max_steps: 400000                 # Number of training 
steps.\nsave_interval_steps: 5000               # Interval steps to save checkpoint.\neval_interval_steps: 1000               # Interval steps to evaluate the network.\n\n###########################################################\n#                     OTHER SETTING                       #\n###########################################################\nnum_save_intermediate_results: 4  # Number of results to be saved as intermediate results.\nnum_snapshots: 10                 # Max number of snapshots to keep while training.\nseed: 42                          # Random seed for paddle, random, and np.random.\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom pathlib import Path\nfrom typing import List\n\nimport numpy as np\nimport paddle\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom parakeet.frontend import English\nfrom parakeet.models.fastspeech2 import FastSpeech2\nfrom parakeet.models.fastspeech2 import FastSpeech2Inference\nfrom parakeet.models.parallel_wavegan import PWGGenerator\nfrom parakeet.models.parallel_wavegan import PWGInference\nfrom parakeet.modules.normalizer import ZScore\nimport soundfile as sf\nfrom yacs.config import CfgNode\nimport yaml\n\n\n@moduleinfo(name=\"fastspeech2_ljspeech\", version=\"1.0.0\", summary=\"\", author=\"Baidu\", author_email=\"\", type=\"audio/tts\")\nclass FastSpeech(paddle.nn.Layer):\n    def __init__(self, output_dir='./wavs'):\n        super(FastSpeech, self).__init__()\n        fastspeech2_res_dir = os.path.join(MODULE_HOME, 'fastspeech2_ljspeech',\n                                           'assets/fastspeech2_nosil_ljspeech_ckpt_0.5')\n        pwg_res_dir = os.path.join(MODULE_HOME, 'fastspeech2_ljspeech', 'assets/pwg_ljspeech_ckpt_0.5')\n\n        phones_dict = os.path.join(fastspeech2_res_dir, 'phone_id_map.txt')\n        with open(phones_dict, \"r\") as f:\n            phn_id = [line.strip().split() for line in 
f.readlines()]\n        vocab_size = len(phn_id)\n        self.phone_id_map = {}\n        for phn, _id in phn_id:\n            self.phone_id_map[phn] = int(_id)\n\n        # fastspeech2\n        fastspeech2_config = os.path.join(fastspeech2_res_dir, 'default.yaml')\n        with open(fastspeech2_config) as f:\n            fastspeech2_config = CfgNode(yaml.safe_load(f))\n        self.samplerate = fastspeech2_config.fs\n\n        fastspeech2_checkpoint = os.path.join(fastspeech2_res_dir, 'snapshot_iter_100000.pdz')\n        model = FastSpeech2(idim=vocab_size, odim=fastspeech2_config.n_mels, **fastspeech2_config[\"model\"])\n        model.set_state_dict(paddle.load(fastspeech2_checkpoint)[\"main_params\"])\n        logger.info('Load fastspeech2 params from %s' % os.path.abspath(fastspeech2_checkpoint))\n        model.eval()\n\n        # vocoder\n        pwg_config = os.path.join(pwg_res_dir, 'pwg_default.yaml')\n        with open(pwg_config) as f:\n            pwg_config = CfgNode(yaml.safe_load(f))\n\n        pwg_checkpoint = os.path.join(pwg_res_dir, 'pwg_snapshot_iter_400000.pdz')\n        vocoder = PWGGenerator(**pwg_config[\"generator_params\"])\n        vocoder.set_state_dict(paddle.load(pwg_checkpoint)[\"generator_params\"])\n        logger.info('Load vocoder params from %s' % os.path.abspath(pwg_checkpoint))\n        vocoder.remove_weight_norm()\n        vocoder.eval()\n\n        # frontend\n        self.frontend = English()\n        self.punc = \"：，；。？！“”‘’':,;.?!\"\n\n        # stat\n        fastspeech2_stat = os.path.join(fastspeech2_res_dir, 'speech_stats.npy')\n        stat = np.load(fastspeech2_stat)\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        fastspeech2_normalizer = ZScore(mu, std)\n\n        pwg_stat = os.path.join(pwg_res_dir, 'pwg_stats.npy')\n        stat = np.load(pwg_stat)\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        
pwg_normalizer = ZScore(mu, std)\n\n        # inference\n        self.fastspeech2_inference = FastSpeech2Inference(fastspeech2_normalizer, model)\n        self.pwg_inference = PWGInference(pwg_normalizer, vocoder)\n\n        self.output_dir = Path(output_dir)\n        self.output_dir.mkdir(parents=True, exist_ok=True)\n\n    def forward(self, text: str):\n        phones = self.frontend.phoneticize(text)\n        # remove start_symbol and end_symbol\n        phones = phones[1:-1]\n        phones = [phn for phn in phones if not phn.isspace()]\n        phones = [phn if (phn in self.phone_id_map and phn not in self.punc) else \"sp\" for phn in phones]\n        phone_ids = [self.phone_id_map[phn] for phn in phones]\n        phone_ids = paddle.to_tensor(phone_ids)\n\n        with paddle.no_grad():\n            mel = self.fastspeech2_inference(phone_ids)\n            wav = self.pwg_inference(mel)\n\n        return wav\n\n    @serving\n    def generate(self, sentences: List[str], device='cpu'):\n        assert isinstance(sentences, list) and isinstance(sentences[0], str), \\\n            'Input data should be List[str], but got {}'.format(type(sentences))\n\n        paddle.set_device(device)\n        wav_files = []\n        for i, sentence in enumerate(sentences):\n            wav = self(sentence)\n            wav_file = str(self.output_dir.absolute() / (str(i + 1) + \".wav\"))\n            sf.write(wav_file, wav.numpy(), samplerate=self.samplerate)\n            wav_files.append(wav_file)\n\n        logger.info('{} wave files have been generated in {}'.format(len(sentences), self.output_dir.absolute()))\n        return wav_files\n"
  },
  {
    "path": "modules/audio/tts/fastspeech2_ljspeech/requirements.txt",
    "content": "git+https://github.com/PaddlePaddle/Parakeet@8040cb0#egg=paddle-parakeet\n"
  },
  {
    "path": "modules/audio/tts/fastspeech_ljspeech/README.md",
    "content": "# fastspeech_ljspeech\n\n|模型名称|fastspeech_ljspeech|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|FastSpeech|\n|数据集|LJSpeech-1.1|\n|是否支持Fine-tuning|否|\n|模型大小|320MB|\n|最新更新日期|2020-10-27|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nFastSpeech是基于Transformer的前馈神经网络，作者从encoder-decoder结构的teacher model中提取attention对角线来做发音持续时间预测，即使用长度调节器对文本序列进行扩展来匹配目标梅尔频谱的长度，以便并行生成梅尔频谱。该模型基本上消除了复杂情况下的跳词和重复的问题，并且可以平滑地调整语音速度，更重要的是，该模型大幅度提升了梅尔频谱的生成速度。fastspeech_ljspeech是基于ljspeech英文语音数据集预训练得到的英文TTS模型，仅支持预测。\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/Parakeet/release/v0.1/examples/fastspeech/images/model_architecture.png\" hspace='10'/> <br/>\n</p>\n\n更多详情参考论文[FastSpeech: Fast, Robust and Controllable Text to Speech](https://arxiv.org/abs/1905.09263)\n\n\n## 二、安装\n\n- ### 1、系统依赖\n\n    对于Ubuntu用户，请执行：\n    ```\n    sudo apt-get install libsndfile1\n    ```\n    对于Centos用户，请执行：\n    ```\n    sudo yum install libsndfile\n    ```\n\n- ### 2、环境依赖\n\n  - 2.0.0 > paddlepaddle >= 1.8.2\n\n  - 2.0.0 > paddlehub >= 1.7.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install fastspeech_ljspeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run fastspeech_ljspeech --input_text='Simple as this proposition is, it is necessary to be stated' --use_gpu True --vocoder griffin-lim\n    ```\n  - 通过命令行方式实现语音合成模型的调用，更多请见[PaddleHub命令行指令](https://github.com/shinichiye/PaddleHub/blob/release/v2.1/docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import soundfile as sf\n\n    # Load fastspeech_ljspeech module.\n    module = hub.Module(name=\"fastspeech_ljspeech\")\n\n    # Predict sentiment label\n 
   test_texts = ['Simple as this proposition is, it is necessary to be stated',\n                'Parakeet stands for Paddle PARAllel text-to-speech toolkit']\n    wavs, sample_rate = module.synthesize(texts=test_texts)\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def synthesize(texts, use_gpu=False, speed=1.0, vocoder=\"griffin-lim\"):\n    ```\n\n    - 预测API，由输入文本合成对应音频波形。\n\n    - **参数**\n      - texts (list\\[str\\]): 待预测文本；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA\\_VISIBLE\\_DEVICES环境变量**；\n      - speed(float): 音频速度，1.0表示以原速输出。\n      - vocoder: 指定声码器，可选 \"griffin-lim\"或\"waveflow\"\n\n    - **返回**\n      - wavs (list): 语音合成结果列表，列表中每一个元素为对应输入文本的音频波形，可使用`soundfile.write`进一步处理或保存。\n      - sample\\_rate (int): 合成音频的采样率。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线语音合成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令\n  - ```shell\n    $ hub serving start -m fastspeech_ljspeech\n    ```\n  - 这样就完成了服务化API的部署，默认端口号为8866。  \n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    import soundfile as sf\n\n    # 发送HTTP请求\n\n    data = {'texts':['Simple as this proposition is, it is necessary to be stated',\n                    'Parakeet stands for Paddle PARAllel text-to-speech toolkit'],\n            'use_gpu':False}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/fastspeech_ljspeech\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存结果\n    result = r.json()[\"results\"]\n    wavs = result[\"wavs\"]\n    sample_rate = result[\"sample_rate\"]\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install 
fastspeech_ljspeech\n  ```\n"
  },
  {
    "path": "modules/audio/tts/fastspeech_ljspeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/tts/fastspeech_ljspeech/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport ast\nimport argparse\nimport importlib.util\n\nimport nltk\nimport paddle.fluid as fluid\nimport paddle.fluid.dygraph as dg\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.common.utils import mkdir\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.common.dir import THIRD_PARTY_HOME\nfrom paddlehub.common.downloader import default_downloader\n\nlack_dependency = []\nfor dependency in [\"ruamel\", \"parakeet\", \"soundfile\", \"librosa\"]:\n    if not importlib.util.find_spec(dependency):\n        lack_dependency.append(dependency)\n\n# Accelerate NLTK package download via paddlehub. 
'import parakeet' will use the package.\n_PUNKT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/punkt.tar.gz\"\n_CMUDICT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/cmudict.tar.gz\"\nnltk_path = os.path.join(THIRD_PARTY_HOME, \"nltk_data\")\ntokenizers_path = os.path.join(nltk_path, \"tokenizers\")\ncorpora_path = os.path.join(nltk_path, \"corpora\")\npunkt_path = os.path.join(tokenizers_path, \"punkt\")\ncmudict_path = os.path.join(corpora_path, \"cmudict\")\n\nif not os.path.exists(punkt_path):\n    default_downloader.download_file_and_uncompress(url=_PUNKT_URL, save_path=tokenizers_path, print_progress=True)\nif not os.path.exists(cmudict_path):\n    default_downloader.download_file_and_uncompress(url=_CMUDICT_URL, save_path=corpora_path, print_progress=True)\nnltk.data.path.append(nltk_path)\n\nif not lack_dependency:\n    import soundfile as sf\n    import librosa\n    from ruamel import yaml\n    from parakeet.models.fastspeech.fastspeech import FastSpeech as FastSpeechModel\n    from parakeet.g2p.en import text_to_sequence\n    from parakeet.models.transformer_tts.utils import *\n    from parakeet.utils import io\n    from parakeet.modules.weight_norm import WeightNormWrapper\n    from parakeet.models.waveflow import WaveFlowModule\nelse:\n    raise ImportError(\n        \"The module requires additional dependencies: %s. You can install parakeet via 'git clone https://github.com/PaddlePaddle/Parakeet && cd Parakeet && pip install -e .' and others via pip install\"\n        % \", \".join(lack_dependency))\n\n\nclass AttrDict(dict):\n    def __init__(self, *args, **kwargs):\n        super(AttrDict, self).__init__(*args, **kwargs)\n        self.__dict__ = self\n\n\n@moduleinfo(\n    name=\"fastspeech_ljspeech\",\n    version=\"1.0.0\",\n    summary=\n    \"FastSpeech proposes a novel feed-forward network based on Transformer to generate mel-spectrogram in parallel for TTS. 
See https://arxiv.org/abs/1905.09263 for details.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/tts\",\n)\nclass FastSpeech(hub.NLPPredictionModule):\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.tts_checkpoint_path = os.path.join(self.directory, \"assets\", \"tts\", \"step-162000\")\n        self.waveflow_checkpoint_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"step-2000000\")\n        self.waveflow_config_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"waveflow_ljspeech.yaml\")\n\n        tts_config_path = os.path.join(self.directory, \"assets\", \"tts\", \"ljspeech.yaml\")\n        with open(tts_config_path) as f:\n            self.tts_config = yaml.load(f, Loader=yaml.Loader)\n        with fluid.dygraph.guard(fluid.CPUPlace()):\n            self.tts_model = FastSpeechModel(self.tts_config['network'], num_mels=self.tts_config['audio']['num_mels'])\n            io.load_parameters(model=self.tts_model, checkpoint_path=self.tts_checkpoint_path)\n\n            # Build vocoder.\n            args = AttrDict()\n            args.config = self.waveflow_config_path\n            args.use_fp16 = False\n            self.waveflow_config = io.add_yaml_config_to_args(args)\n            self.waveflow = WaveFlowModule(self.waveflow_config)\n            io.load_parameters(model=self.waveflow, checkpoint_path=self.waveflow_checkpoint_path)\n\n    def synthesize(self, texts, use_gpu=False, speed=1.0, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Get the synthetic wavs from the texts.\n\n        Args:\n             texts(list): the input texts to be predicted.\n             use_gpu(bool): whether use gpu to predict or not. Default False.\n             speed(float): Controlling the voice speed. 
Default 1.0.\n             vocoder(str): the vocoder name, \"griffin-lim\" or \"waveflow\".\n\n        Returns:\n             wavs(list): the synthesized audio waveforms. You can use soundfile.write to save them.\n             sample_rate(int): the audio sample rate.\n        \"\"\"\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n        if use_gpu:\n            place = fluid.CUDAPlace(0)\n        else:\n            place = fluid.CPUPlace()\n\n        if texts and isinstance(texts, list):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        wavs = []\n        with fluid.dygraph.guard(place):\n            self.tts_model.eval()\n            self.waveflow.eval()\n            for text in predicted_data:\n                # init input\n                logger.info(\"Processing sentence: %s\" % text)\n                text = np.asarray(text_to_sequence(text))\n                text = np.expand_dims(text, axis=0)\n                pos_text = np.arange(1, text.shape[1] + 1)\n                pos_text = np.expand_dims(pos_text, axis=0)\n\n                text = dg.to_variable(text).astype(np.int64)\n                pos_text = dg.to_variable(pos_text).astype(np.int64)\n\n                _, mel_output_postnet = self.tts_model(text, pos_text, alpha=1 / speed)\n\n                if vocoder == 'griffin-lim':\n                    # synthesize with griffin-lim\n                    wav = self.synthesis_with_griffinlim(mel_output_postnet, self.tts_config['audio'])\n                elif vocoder == 'waveflow':\n                    wav = self.synthesis_with_waveflow(mel_output_postnet, self.waveflow_config.sigma)\n                else:\n                    raise ValueError(\n                        'vocoder error, we only support griffin-lim and waveflow, but received %s.' % vocoder)\n                wavs.append(wav)\n        return wavs, self.tts_config['audio']['sr']\n\n    def synthesis_with_griffinlim(self, mel_output, cfg):\n        # synthesis with griffin-lim\n        mel_output = fluid.layers.transpose(fluid.layers.squeeze(mel_output, [0]), [1, 0])\n        mel_output = np.exp(mel_output.numpy())\n        basis = librosa.filters.mel(cfg['sr'], cfg['n_fft'], cfg['num_mels'], fmin=cfg['fmin'], fmax=cfg['fmax'])\n        inv_basis = np.linalg.pinv(basis)\n        spec = np.maximum(1e-10, np.dot(inv_basis, mel_output))\n\n        wav = librosa.core.griffinlim(spec**cfg['power'], hop_length=cfg['hop_length'], win_length=cfg['win_length'])\n\n        return wav\n\n    def synthesis_with_waveflow(self, mel_output, sigma):\n        mel_spectrogram = fluid.layers.transpose(fluid.layers.squeeze(mel_output, [0]), [1, 0])\n        mel_spectrogram = fluid.layers.unsqueeze(mel_spectrogram, [0])\n\n        for layer in self.waveflow.sublayers():\n            if isinstance(layer, WeightNormWrapper):\n                layer.remove_weight_norm()\n\n        # Run model inference.\n        wav = self.waveflow.synthesize(mel_spectrogram, sigma=sigma)\n        return wav.numpy()[0]\n\n    @serving\n    def serving_method(self, texts, use_gpu=False, speed=1.0, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        wavs, sample_rate = self.synthesize(texts, use_gpu, speed, vocoder)\n        wavs = [wav.tolist() for wav in wavs]\n        result = {\"wavs\": wavs, \"sample_rate\": sample_rate}\n        return result\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\n\n        
self.arg_config_group.add_argument(\n            '--vocoder', type=str, default=\"griffin-lim\", choices=['griffin-lim', 'waveflow'], help=\"the vocoder name\")\n\n    def add_module_output_arg(self):\n        \"\"\"\n        Add the command output options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--output_path',\n            type=str,\n            default=os.path.abspath(os.path.join(os.path.curdir, f\"{self.name}_prediction\")),\n            help=\"path to save experiment results\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required.\")\n        self.arg_output_group = self.parser.add_argument_group(\n            title=\"Output options\", description=\"Output path. Optional.\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.add_module_output_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        mkdir(args.output_path)\n        wavs, sample_rate = self.synthesize(texts=input_data, use_gpu=args.use_gpu, vocoder=args.vocoder)\n\n        for index, wav in enumerate(wavs):\n            sf.write(os.path.join(args.output_path, f\"{index}.wav\"), wav, sample_rate)\n\n        ret = f\"The synthesized wav files have been saved in {args.output_path}\"\n        return ret\n\n\nif __name__ == \"__main__\":\n\n    module = FastSpeech()\n    test_text = [\n        \"Simple as this proposition is, it is necessary to be stated\",\n    ]\n    wavs, sample_rate = module.synthesize(texts=test_text, speed=1, vocoder=\"waveflow\")\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n"
  },
  {
    "path": "modules/audio/tts/fastspeech_ljspeech/requirements.txt",
    "content": "librosa\nsoundfile\nruamel.yaml >= 0.16.3\n"
  },
  {
    "path": "modules/audio/tts/transformer_tts_ljspeech/README.md",
    "content": "# transformer_tts_ljspeech\n\n|模型名称|transformer_tts_ljspeech|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|Transformer|\n|数据集|LJSpeech-1.1|\n|是否支持Fine-tuning|否|\n|模型大小|54MB|\n|最新更新日期|2020-10-27|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nTansformerTTS 是使用了 Transformer 结构的端到端语音合成模型，对 Transformer 和 Tacotron2 进行了融合，取得了令人满意的效果。因为删除了 RNN 的循环连接，可并行的提供 decoder 的输入，进行并行训练，大大提升了模型的训练速度。transformer_tts_ljspeech是基于ljspeech英文语音数据集预训练得到的英文TTS模型，仅支持预测。\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/Parakeet/release/v0.1/examples/transformer_tts/images/model_architecture.jpg\" hspace='10'/> <br/>\n</p>\n\n更多详情参考论文[Neural Speech Synthesis with Transformer Network](https://arxiv.org/abs/1809.08895)\n\n\n## 二、安装\n\n- ### 1、系统依赖\n\n    对于Ubuntu用户，请执行：\n    ```\n    sudo apt-get install libsndfile1\n    ```\n    对于Centos用户，请执行：\n    ```\n    sudo yum install libsndfile\n    ```\n\n- ### 2、环境依赖\n\n  - 2.0.0 > paddlepaddle >= 1.8.2\n\n  - 2.0.0 > paddlehub >= 1.7.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 3、安装\n\n  - ```shell\n    $ hub install transformer_tts_ljspeech\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run transformer_tts_ljspeech --input_text=\"Life was like a box of chocolates, you never know what you're gonna get.\" --use_gpu True --vocoder griffin-lim\n    ```\n  - 通过命令行方式实现语音合成模型的调用，更多请见[PaddleHub命令行指令](https://github.com/shinichiye/PaddleHub/blob/release/v2.1/docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import soundfile as sf\n\n    # Load transformer_tts_ljspeech module.\n    module = hub.Module(name=\"transformer_tts_ljspeech\")\n\n    # Predict sentiment label\n    
test_texts = [\"Life was like a box of chocolates, you never know what you're gonna get.\"]\n    wavs, sample_rate = module.synthesize(texts=test_texts, use_gpu=True, vocoder=\"waveflow\")\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def synthesize(texts, use_gpu=False, vocoder=\"griffin-lim\"):\n    ```\n\n    - 预测API，由输入文本合成对应音频波形。\n\n    - **参数**\n      - texts (list\\[str\\]): 待预测文本；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA\\_VISIBLE\\_DEVICES环境变量**；\n      - vocoder: 指定声码器，可选 \"griffin-lim\"或\"waveflow\"\n\n    - **返回**\n      - wavs (list): 语音合成结果列表，列表中每一个元素为对应输入文本的音频波形，可使用`soundfile.write`进一步处理或保存。\n      - sample\\_rate (int): 合成音频的采样率。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线语音合成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令\n\n  - ```shell\n    $ hub serving start -m transformer_tts_ljspeech\n    ```\n  - 这样就完成了服务化API的部署，默认端口号为8866。  \n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    import soundfile as sf\n\n    # 发送HTTP请求\n\n    data = {'texts':['Simple as this proposition is, it is necessary to be stated',\n                    'Parakeet stands for Paddle PARAllel text-to-speech toolkit'],\n            'use_gpu':False}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/transformer_tts_ljspeech\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存结果\n    result = r.json()[\"results\"]\n    wavs = result[\"wavs\"]\n    sample_rate = result[\"sample_rate\"]\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install transformer_tts_ljspeech\n  ```\n"
  },
  {
    "path": "modules/audio/tts/transformer_tts_ljspeech/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/tts/transformer_tts_ljspeech/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport ast\nimport argparse\nimport importlib.util\n\nimport nltk\nimport paddle.fluid as fluid\nimport paddle.fluid.dygraph as dg\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.common.utils import mkdir\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.common.dir import THIRD_PARTY_HOME\nfrom paddlehub.common.downloader import default_downloader\n\nlack_dependency = []\nfor dependency in [\"ruamel\", \"parakeet\", \"scipy\", \"soundfile\", \"librosa\"]:\n    if not importlib.util.find_spec(dependency):\n        lack_dependency.append(dependency)\n\n# Accelerate NLTK package download via paddlehub. 
'import parakeet' will use the package.\n_PUNKT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/punkt.tar.gz\"\n_CMUDICT_URL = \"https://paddlehub.bj.bcebos.com/paddlehub-thirdparty/cmudict.tar.gz\"\nnltk_path = os.path.join(THIRD_PARTY_HOME, \"nltk_data\")\ntokenizers_path = os.path.join(nltk_path, \"tokenizers\")\ncorpora_path = os.path.join(nltk_path, \"corpora\")\npunkt_path = os.path.join(tokenizers_path, \"punkt\")\ncmudict_path = os.path.join(corpora_path, \"cmudict\")\n\nif not os.path.exists(punkt_path):\n    default_downloader.download_file_and_uncompress(url=_PUNKT_URL, save_path=tokenizers_path, print_progress=True)\nif not os.path.exists(cmudict_path):\n    default_downloader.download_file_and_uncompress(url=_CMUDICT_URL, save_path=corpora_path, print_progress=True)\nnltk.data.path.append(nltk_path)\n\nif not lack_dependency:\n    import soundfile as sf\n    import librosa\n    from ruamel import yaml\n    from scipy.io.wavfile import write\n    from parakeet.g2p.en import text_to_sequence\n    from parakeet.models.transformer_tts.utils import *\n    from parakeet.models.transformer_tts import TransformerTTS as TransformerTTSModel\n    from parakeet.models.waveflow import WaveFlowModule\n    from parakeet.utils import io\n    from parakeet.modules.weight_norm import WeightNormWrapper\nelse:\n    raise ImportError(\n        \"The module requires additional dependencies: %s. You can install parakeet via 'git clone https://github.com/PaddlePaddle/Parakeet && cd Parakeet && pip install -e .' 
and others via pip install\"\n        % \", \".join(lack_dependency))\n\n\nclass AttrDict(dict):\n    def __init__(self, *args, **kwargs):\n        super(AttrDict, self).__init__(*args, **kwargs)\n        self.__dict__ = self\n\n\n@moduleinfo(\n    name=\"transformer_tts_ljspeech\",\n    version=\"1.0.0\",\n    summary=\n    \"Transformer TTS introduces and adapts the multi-head attention mechanism to replace the RNN structures and also the original attention mechanism in Tacotron2. See https://arxiv.org/abs/1809.08895 for details\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/tts\",\n)\nclass TransformerTTS(hub.NLPPredictionModule):\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.tts_checkpoint_path = os.path.join(self.directory, \"assets\", \"tts\", \"step-120000\")\n        self.waveflow_checkpoint_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"step-2000000\")\n        self.waveflow_config_path = os.path.join(self.directory, \"assets\", \"vocoder\", \"waveflow_ljspeech.yaml\")\n\n        tts_config_path = os.path.join(self.directory, \"assets\", \"tts\", \"ljspeech.yaml\")\n        with open(tts_config_path) as f:\n            self.tts_config = yaml.load(f, Loader=yaml.Loader)\n\n        # The max length of audio when synthsis.\n        self.max_len = 1000\n        # The threshold of stop token which indicates the time step should stop generate spectrum or not.\n        self.stop_threshold = 0.5\n\n        with fluid.dygraph.guard(fluid.CPUPlace()):\n            # Build TTS.\n            with fluid.unique_name.guard():\n                network_cfg = self.tts_config['network']\n                self.tts_model = TransformerTTSModel(\n                    network_cfg['embedding_size'], network_cfg['hidden_size'], network_cfg['encoder_num_head'],\n                    network_cfg['encoder_n_layers'], self.tts_config['audio']['num_mels'],\n                    
network_cfg['outputs_per_step'], network_cfg['decoder_num_head'], network_cfg['decoder_n_layers'])\n                io.load_parameters(model=self.tts_model, checkpoint_path=self.tts_checkpoint_path)\n\n            # Build vocoder.\n            args = AttrDict()\n            args.config = self.waveflow_config_path\n            args.use_fp16 = False\n            self.waveflow_config = io.add_yaml_config_to_args(args)\n            self.waveflow = WaveFlowModule(self.waveflow_config)\n            io.load_parameters(model=self.waveflow, checkpoint_path=self.waveflow_checkpoint_path)\n\n    def synthesize(self, texts, use_gpu=False, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Get the synthesized wavs from the texts.\n\n        Args:\n             texts(list): the input texts to be predicted.\n             use_gpu(bool): whether to use gpu for prediction\n             vocoder(str): the vocoder name, \"griffin-lim\" or \"waveflow\"\n\n        Returns:\n             wavs(list): the synthesized audio waveforms. 
You can use soundfile.write to save it.\n             sample_rate(int): the audio sample rate.\n        \"\"\"\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n        if use_gpu:\n            place = fluid.CUDAPlace(0)\n        else:\n            place = fluid.CPUPlace()\n\n        if texts and isinstance(texts, list):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        wavs = []\n        with fluid.dygraph.guard(place):\n            self.tts_model.eval()\n            self.waveflow.eval()\n            for text in predicted_data:\n                # init input\n                logger.info(\"Processing sentence: %s\" % text)\n                text = np.asarray(text_to_sequence(text))\n                text = fluid.layers.unsqueeze(dg.to_variable(text).astype(np.int64), [0])\n                mel_input = dg.to_variable(np.zeros([1, 1, 80])).astype(np.float32)\n                pos_text = np.arange(1, text.shape[1] + 1)\n                pos_text = fluid.layers.unsqueeze(dg.to_variable(pos_text).astype(np.int64), [0])\n\n                for i in range(self.max_len):\n                    pos_mel = np.arange(1, mel_input.shape[1] + 1)\n                    pos_mel = fluid.layers.unsqueeze(dg.to_variable(pos_mel).astype(np.int64), [0])\n                    mel_pred, postnet_pred, attn_probs, stop_preds, attn_enc, attn_dec = self.tts_model(\n                        text, mel_input, pos_text, pos_mel)\n                    if stop_preds.numpy()[0, -1] > self.stop_threshold:\n                        break\n                    mel_input = fluid.layers.concat([mel_input, postnet_pred[:, -1:, :]], axis=1)\n                if vocoder == 'griffin-lim':\n               
     # synthesize with griffin-lim\n                    wav = self.synthesis_with_griffinlim(postnet_pred, self.tts_config['audio'])\n                elif vocoder == 'waveflow':\n                    # synthesize with waveflow\n                    wav = self.synthesis_with_waveflow(postnet_pred, self.waveflow_config.sigma)\n                else:\n                    raise ValueError(\n                        'vocoder error, we only support griffin-lim and waveflow, but received %s.' % vocoder)\n                wavs.append(wav)\n        return wavs, self.tts_config['audio']['sr']\n\n    def synthesis_with_griffinlim(self, mel_output, cfg):\n        # synthesis with griffin-lim\n        mel_output = fluid.layers.transpose(fluid.layers.squeeze(mel_output, [0]), [1, 0])\n        mel_output = np.exp(mel_output.numpy())\n        basis = librosa.filters.mel(cfg['sr'], cfg['n_fft'], cfg['num_mels'], fmin=cfg['fmin'], fmax=cfg['fmax'])\n        inv_basis = np.linalg.pinv(basis)\n        spec = np.maximum(1e-10, np.dot(inv_basis, mel_output))\n\n        wav = librosa.core.griffinlim(spec**cfg['power'], hop_length=cfg['hop_length'], win_length=cfg['win_length'])\n\n        return wav\n\n    def synthesis_with_waveflow(self, mel_output, sigma):\n        mel_spectrogram = fluid.layers.transpose(fluid.layers.squeeze(mel_output, [0]), [1, 0])\n        mel_spectrogram = fluid.layers.unsqueeze(mel_spectrogram, [0])\n\n        for layer in self.waveflow.sublayers():\n            if isinstance(layer, WeightNormWrapper):\n                layer.remove_weight_norm()\n\n        # Run model inference.\n        wav = self.waveflow.synthesize(mel_spectrogram, sigma=sigma)\n        return wav.numpy()[0]\n\n    @serving\n    def serving_method(self, texts, use_gpu=False, vocoder=\"griffin-lim\"):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        wavs, sample_rate = self.synthesize(texts, use_gpu, vocoder)\n        wavs = [wav.tolist() for wav in wavs]\n        result = {\"wavs\": 
wavs, \"sample_rate\": sample_rate}\n        return result\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU for prediction\")\n\n        self.arg_config_group.add_argument(\n            '--vocoder', type=str, default=\"griffin-lim\", choices=['griffin-lim', 'waveflow'], help=\"the vocoder name\")\n\n    def add_module_output_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--output_path',\n            type=str,\n            default=os.path.abspath(os.path.join(os.path.curdir, f\"{self.name}_prediction\")),\n            help=\"path to save experiment results\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Ouput options\", description=\"Ouput path. 
Optional.\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.add_module_output_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except DataFormatError and RuntimeError:\n            self.parser.print_help()\n            return None\n\n        mkdir(args.output_path)\n        wavs, sample_rate = self.synthesize(texts=input_data, use_gpu=args.use_gpu, vocoder=args.vocoder)\n\n        for index, wav in enumerate(wavs):\n            sf.write(os.path.join(args.output_path, f\"{index}.wav\"), wav, sample_rate)\n\n        ret = f\"The synthesized wav files have been saved in {args.output_path}\"\n        return ret\n\n\nif __name__ == \"__main__\":\n\n    module = TransformerTTS()\n    test_text = [\n        \"Life was like a box of chocolates, you never know what you're gonna get.\",\n    ]\n    wavs, sample_rate = module.synthesize(texts=test_text, vocoder=\"waveflow\")\n    for index, wav in enumerate(wavs):\n        sf.write(f\"{index}.wav\", wav, sample_rate)\n"
  },
  {
    "path": "modules/audio/tts/transformer_tts_ljspeech/requirements.txt",
    "content": "scipy\nlibrosa\nsoundfile\nruamel.yaml >= 0.16.3\n"
  },
  {
    "path": "modules/audio/voice_cloning/ge2e_fastspeech2_pwgan/README.md",
    "content": "# ge2e_fastspeech2_pwgan\n\n|模型名称|ge2e_fastspeech2_pwgan|\n| :--- | :---: |\n|类别|语音-声音克隆|\n|网络|FastSpeech2|\n|数据集|AISHELL-3|\n|是否支持Fine-tuning|否|\n|模型大小|462MB|\n|最新更新日期|2021-12-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\n声音克隆是指使用特定的音色，结合文字的读音合成音频，使得合成后的音频具有目标说话人的特征，从而达到克隆的目的。\n\n在训练语音克隆模型时，目标音色作为Speaker Encoder的输入，模型会提取这段语音的说话人特征（音色）作为Speaker Embedding。接着，在训练模型重新合成此类音色的语音时，除了输入的目标文本外，说话人的特征也将成为额外条件加入模型的训练。\n\n在预测时，选取一段新的目标音色作为Speaker Encoder的输入，并提取其说话人特征，最终实现输入为一段文本和一段目标音色，模型生成目标音色说出此段文本的语音片段。\n\n![](https://ai-studio-static-online.cdn.bcebos.com/982ab955b87244d3bae3b003aff8e28d9ec159ff0d6246a79757339076dfe7d4)\n\n`ge2e_fastspeech2_pwgan`是一个支持中文的语音克隆模型，分别使用了LSTMSpeakerEncoder、FastSpeech2和PWGan模型分别用于语音特征提取、目标音频特征合成和语音波形转换。\n\n关于模型的详请可参考[PaddleSpeech](https://github.com/PaddlePaddle/PaddleSpeech)。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ge2e_fastspeech2_pwgan\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='ge2e_fastspeech2_pwgan', output_dir='./', speaker_audio='/data/man.wav')  # 指定目标音色音频文件\n    texts = [\n        '语音的表现形式在未来将变得越来越重要$',\n        '今天的天气怎么样$',  ]\n    wavs = model.generate(texts, use_gpu=True)\n\n    for text, wav in zip(texts, wavs):\n        print('='*30)\n        print(f'Text: {text}')\n        print(f'Wav: {wav}')\n    ```\n\n- ### 2、API\n  - ```python\n    def __init__(speaker_audio: str = None,\n                 output_dir: str = './')\n    ```\n    - 初始化module，可配置模型的目标音色的音频文件和输出的路径。\n\n    - **参数**\n      - `speaker_audio`(str): 
目标说话人语音音频文件(*.wav)的路径，默认为None(使用默认的女声作为目标音色)。\n      - `output_dir`(str): 合成音频的输出目录，默认为当前目录。\n\n\n  - ```python\n    def get_speaker_embedding()\n    ```\n    - 获取模型的目标说话人特征。\n\n    - **返回**\n      - `results`(numpy.ndarray): 长度为256的numpy数组，代表目标说话人的特征。\n\n  - ```python\n    def set_speaker_embedding(speaker_audio: str)\n    ```\n    - 设置模型的目标说话人特征。\n\n    - **参数**\n      - `speaker_audio`(str): 必填，目标说话人语音音频文件(*.wav)的路径。\n\n  - ```python\n    def generate(data: Union[str, List[str]], use_gpu: bool = False):\n    ```\n    - 根据输入文字，合成目标说话人的语音音频文件。\n\n    - **参数**\n      - `data`(Union[str, List[str]]): 必填，目标音频的内容文本列表，目前只支持中文，不支持添加标点符号。\n      - `use_gpu`(bool): 是否使用gpu执行计算，默认为False。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布。\n\n  ```shell\n  $ hub install ge2e_fastspeech2_pwgan\n  ```\n"
  },
  {
    "path": "modules/audio/voice_cloning/ge2e_fastspeech2_pwgan/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/voice_cloning/ge2e_fastspeech2_pwgan/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import List, Union\n\nimport numpy as np\nimport paddle\nimport soundfile as sf\nimport yaml\nfrom yacs.config import CfgNode\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom paddlespeech.t2s.frontend.zh_frontend import Frontend\nfrom paddlespeech.t2s.models.fastspeech2 import FastSpeech2\nfrom paddlespeech.t2s.models.fastspeech2 import FastSpeech2Inference\nfrom paddlespeech.t2s.models.parallel_wavegan import PWGGenerator\nfrom paddlespeech.t2s.models.parallel_wavegan import PWGInference\nfrom paddlespeech.t2s.modules.normalizer import ZScore\nfrom paddlespeech.vector.exps.ge2e.audio_processor import SpeakerVerificationPreprocessor\nfrom paddlespeech.vector.models.lstm_speaker_encoder import LSTMSpeakerEncoder\n\n\n@moduleinfo(\n    name=\"ge2e_fastspeech2_pwgan\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/voice_cloning\",\n)\nclass VoiceCloner(paddle.nn.Layer):\n    def __init__(self, speaker_audio: str = None, output_dir: str = './'):\n        super(VoiceCloner, self).__init__()\n\n        speaker_encoder_ckpt = os.path.join(MODULE_HOME, 'ge2e_fastspeech2_pwgan', 'assets',\n                                            
'ge2e_ckpt_0.3/step-3000000.pdparams')\n        synthesizer_res_dir = os.path.join(MODULE_HOME, 'ge2e_fastspeech2_pwgan', 'assets',\n                                           'fastspeech2_nosil_aishell3_vc1_ckpt_0.5')\n        vocoder_res_dir = os.path.join(MODULE_HOME, 'ge2e_fastspeech2_pwgan', 'assets', 'pwg_aishell3_ckpt_0.5')\n\n        # Speaker encoder\n        self.speaker_processor = SpeakerVerificationPreprocessor(\n            sampling_rate=16000,\n            audio_norm_target_dBFS=-30,\n            vad_window_length=30,\n            vad_moving_average_width=8,\n            vad_max_silence_length=6,\n            mel_window_length=25,\n            mel_window_step=10,\n            n_mels=40,\n            partial_n_frames=160,\n            min_pad_coverage=0.75,\n            partial_overlap_ratio=0.5)\n        self.speaker_encoder = LSTMSpeakerEncoder(n_mels=40, num_layers=3, hidden_size=256, output_size=256)\n        self.speaker_encoder.set_state_dict(paddle.load(speaker_encoder_ckpt))\n        self.speaker_encoder.eval()\n\n        # Voice synthesizer\n        with open(os.path.join(synthesizer_res_dir, 'default.yaml'), 'r') as f:\n            fastspeech2_config = CfgNode(yaml.safe_load(f))\n        with open(os.path.join(synthesizer_res_dir, 'phone_id_map.txt'), 'r') as f:\n            phn_id = [line.strip().split() for line in f.readlines()]\n\n        model = FastSpeech2(idim=len(phn_id), odim=fastspeech2_config.n_mels, **fastspeech2_config[\"model\"])\n        model.set_state_dict(paddle.load(os.path.join(synthesizer_res_dir, 'snapshot_iter_96400.pdz'))[\"main_params\"])\n        model.eval()\n\n        stat = np.load(os.path.join(synthesizer_res_dir, 'speech_stats.npy'))\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        fastspeech2_normalizer = ZScore(mu, std)\n        self.sample_rate = fastspeech2_config.fs\n\n        self.fastspeech2_inference = FastSpeech2Inference(fastspeech2_normalizer, 
model)\n        self.fastspeech2_inference.eval()\n\n        # Vocoder\n        with open(os.path.join(vocoder_res_dir, 'default.yaml')) as f:\n            pwg_config = CfgNode(yaml.safe_load(f))\n\n        vocoder = PWGGenerator(**pwg_config[\"generator_params\"])\n        vocoder.set_state_dict(\n            paddle.load(os.path.join(vocoder_res_dir, 'snapshot_iter_1000000.pdz'))[\"generator_params\"])\n        vocoder.remove_weight_norm()\n        vocoder.eval()\n\n        stat = np.load(os.path.join(vocoder_res_dir, 'feats_stats.npy'))\n        mu, std = stat\n        mu = paddle.to_tensor(mu)\n        std = paddle.to_tensor(std)\n        pwg_normalizer = ZScore(mu, std)\n\n        self.pwg_inference = PWGInference(pwg_normalizer, vocoder)\n        self.pwg_inference.eval()\n\n        # Text frontend\n        self.frontend = Frontend(phone_vocab_path=os.path.join(synthesizer_res_dir, 'phone_id_map.txt'))\n\n        # Speaker embedding\n        self._speaker_embedding = None\n        if speaker_audio is None or not os.path.isfile(speaker_audio):\n            speaker_audio = os.path.join(MODULE_HOME, 'ge2e_fastspeech2_pwgan', 'assets', 'voice_cloning.wav')\n            logger.warning(f'Since no speaker audio is specified, the speaker encoder will use the default '\n                           f'waveform({speaker_audio}) to extract the speaker embedding. 
You can use '\n                           'the \"set_speaker_embedding()\" method to reset the speaker audio for voice cloning.')\n        self.set_speaker_embedding(speaker_audio)\n\n        self.output_dir = os.path.abspath(output_dir)\n        if not os.path.exists(self.output_dir):\n            os.makedirs(self.output_dir)\n\n    def get_speaker_embedding(self):\n        return self._speaker_embedding.numpy()\n\n    @paddle.no_grad()\n    def set_speaker_embedding(self, speaker_audio: str):\n        assert os.path.exists(speaker_audio), f'Speaker audio file: {speaker_audio} does not exist.'\n        mel_sequences = self.speaker_processor.extract_mel_partials(\n            self.speaker_processor.preprocess_wav(speaker_audio))\n        self._speaker_embedding = self.speaker_encoder.embed_utterance(paddle.to_tensor(mel_sequences))\n\n        logger.info(f'Speaker embedding has been set from file: {speaker_audio}')\n\n    @paddle.no_grad()\n    def generate(self, data: Union[str, List[str]], use_gpu: bool = False):\n        assert self._speaker_embedding is not None, 'Set the speaker embedding before voice cloning.'\n\n        if isinstance(data, str):\n            data = [data]\n        elif isinstance(data, list):\n            assert len(data) > 0 and isinstance(data[0],\n                                                str) and len(data[0]) > 0, 'Input data should be str or List[str].'\n        else:\n            raise Exception('Input data should be str or List[str].')\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        files = []\n        for idx, text in enumerate(data):\n            phone_ids = self.frontend.get_input_ids(text, merge_sentences=True)[\"phone_ids\"][0]\n            wav = self.pwg_inference(self.fastspeech2_inference(phone_ids, spk_emb=self._speaker_embedding))\n            output_wav = os.path.join(self.output_dir, f'{idx+1}.wav')\n            sf.write(output_wav, wav.numpy(), samplerate=self.sample_rate)\n           
 files.append(output_wav)\n\n        return files\n"
  },
  {
    "path": "modules/audio/voice_cloning/ge2e_fastspeech2_pwgan/requirements.txt",
    "content": "paddlespeech==0.1.0a13\n"
  },
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/README.md",
    "content": "# lstm_tacotron2\n\n|模型名称|lstm_tacotron2|\n| :--- | :---: |\n|类别|语音-语音合成|\n|网络|LSTM、Tacotron2、WaveFlow|\n|数据集|AISHELL-3|\n|是否支持Fine-tuning|否|\n|模型大小|327MB|\n|最新更新日期|2021-06-15|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\n声音克隆是指使用特定的音色，结合文字的读音合成音频，使得合成后的音频具有目标说话人的特征，从而达到克隆的目的。\n\n在训练语音克隆模型时，目标音色作为Speaker Encoder的输入，模型会提取这段语音的说话人特征（音色）作为Speaker Embedding。接着，在训练模型重新合成此类音色的语音时，除了输入的目标文本外，说话人的特征也将成为额外条件加入模型的训练。\n\n在预测时，选取一段新的目标音色作为Speaker Encoder的输入，并提取其说话人特征，最终实现输入为一段文本和一段目标音色，模型生成目标音色说出此段文本的语音片段。\n\n<p align=\"center\">\n<img src=\"https://ai-studio-static-online.cdn.bcebos.com/982ab955b87244d3bae3b003aff8e28d9ec159ff0d6246a79757339076dfe7d4\" hspace='10'/> <br/>\n</p>\n\n`lstm_tacotron2`是一个支持中文的语音克隆模型，分别使用了LSTMSpeakerEncoder、Tacotron2和WaveFlow模型分别用于语音特征提取、目标音频特征合成和语音波形转换。\n\n更多详情请参考:\n- [Transfer Learning from Speaker Verification to Multispeaker Text-To-Speech Synthesis](https://arxiv.org/pdf/1806.04558.pdf)\n- [Parakeet](https://github.com/PaddlePaddle/Parakeet/tree/release/v0.3/parakeet/models)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lstm_tacotron2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='lstm_tacotron2', output_dir='/data', speaker_audio='/data/man.wav')  # 指定目标音色音频文件\n    texts = [\n        '语音的表现形式在未来将变得越来越重要$',\n        '今天的天气怎么样$',  ]\n    wavs = model.generate(texts, use_gpu=True)\n\n    for text, wav in zip(texts, wavs):\n        print('='*30)\n        print(f'Text: {text}')\n        print(f'Wav: {wav}')\n    ```\n    ```\n    ==============================\n    Text: 
语音的表现形式在未来将变得越来越重要$\n    Wav: /data/1.wav\n    ==============================\n    Text: 今天的天气怎么样$\n    Wav: /data/2.wav\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(speaker_audio: str = None,\n                 output_dir: str = './')\n    ```\n    - 初始化module，可配置模型的目标音色的音频文件和输出的路径。\n\n    - **参数**\n      - `speaker_audio`(str): 目标说话人语音音频文件(*.wav)的路径，默认为None(使用默认的女声作为目标音色)。\n      - `output_dir`(str): 合成音频的输出目录，默认为当前目录。\n\n\n  - ```python\n    def get_speaker_embedding()\n    ```\n    - 获取模型的目标说话人特征。\n\n    - **返回**\n      - `results`(numpy.ndarray): 长度为256的numpy数组，代表目标说话人的特征。\n\n  - ```python\n    def set_speaker_embedding(speaker_audio: str)\n    ```\n    - 设置模型的目标说话人特征。\n\n    - **参数**\n      - `speaker_audio`(str): 必填，目标说话人语音音频文件(*.wav)的路径。\n\n  - ```python\n    def generate(data: List[str], batch_size: int = 1, use_gpu: bool = False):\n    ```\n    - 根据输入文字，合成目标说话人的语音音频文件。\n\n    - **参数**\n      - `data`(List[str]): 必填，目标音频的内容文本列表，目前只支持中文，不支持添加标点符号。\n      - `batch_size`(int): 可选，模型合成语音时的batch_size，默认为1。\n      - `use_gpu`(bool): 是否使用gpu执行计算，默认为False。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布。\n\n```shell\n$ hub install lstm_tacotron2==1.0.0\n```\n"
  },
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/__init__.py",
    "content": ""
  },
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/audio_processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom pathlib import Path\nfrom warnings import warn\nimport struct\n\nfrom scipy.ndimage.morphology import binary_dilation\nimport numpy as np\nimport librosa\n\ntry:\n    import webrtcvad\nexcept ModuleNotFoundError:\n    warn(\"Unable to import 'webrtcvad'.\" \"This package enables noise removal and is recommended.\")\n    webrtcvad = None\n\nINT16_MAX = (2**15) - 1\n\n\ndef normalize_volume(wav, target_dBFS, increase_only=False, decrease_only=False):\n    # this function implements Loudness normalization, instead of peak\n    # normalization, See https://en.wikipedia.org/wiki/Audio_normalization\n    # dBFS: Decibels relative to full scale\n    # See https://en.wikipedia.org/wiki/DBFS for more details\n    # for 16Bit PCM audio, minimal level is -96dB\n    # compute the mean dBFS and adjust to target dBFS, with by increasing\n    # or decreasing\n    if increase_only and decrease_only:\n        raise ValueError(\"Both increase only and decrease only are set\")\n    dBFS_change = target_dBFS - 10 * np.log10(np.mean(wav**2))\n    if ((dBFS_change < 0 and increase_only) or (dBFS_change > 0 and decrease_only)):\n        return wav\n    gain = 10**(dBFS_change / 20)\n    return wav * gain\n\n\ndef trim_long_silences(wav, vad_window_length: int, vad_moving_average_width: int, vad_max_silence_length: int,\n                       
sampling_rate: int):\n    \"\"\"\n    Ensures that segments without voice in the waveform remain no longer than a\n    threshold determined by the VAD parameters in params.py.\n\n    :param wav: the raw waveform as a numpy array of floats\n    :return: the same waveform with silences trimmed away (length <= original wav length)\n    \"\"\"\n    # Compute the voice detection window size\n    samples_per_window = (vad_window_length * sampling_rate) // 1000\n\n    # Trim the end of the audio to have a multiple of the window size\n    wav = wav[:len(wav) - (len(wav) % samples_per_window)]\n\n    # Convert the float waveform to 16-bit mono PCM\n    pcm_wave = struct.pack(\"%dh\" % len(wav), *(np.round(wav * INT16_MAX)).astype(np.int16))\n\n    # Perform voice activity detection\n    voice_flags = []\n    vad = webrtcvad.Vad(mode=3)\n    for window_start in range(0, len(wav), samples_per_window):\n        window_end = window_start + samples_per_window\n        voice_flags.append(vad.is_speech(pcm_wave[window_start * 2:window_end * 2], sample_rate=sampling_rate))\n    voice_flags = np.array(voice_flags)\n\n    # Smooth the voice detection with a moving average\n    def moving_average(array, width):\n        array_padded = np.concatenate((np.zeros((width - 1) // 2), array, np.zeros(width // 2)))\n        ret = np.cumsum(array_padded, dtype=float)\n        ret[width:] = ret[width:] - ret[:-width]\n        return ret[width - 1:] / width\n\n    audio_mask = moving_average(voice_flags, vad_moving_average_width)\n    audio_mask = np.round(audio_mask).astype(bool)\n\n    # Dilate the voiced regions\n    audio_mask = binary_dilation(audio_mask, np.ones(vad_max_silence_length + 1))\n    audio_mask = np.repeat(audio_mask, samples_per_window)\n\n    return wav[audio_mask]\n\n\ndef compute_partial_slices(n_samples: int,\n                           partial_utterance_n_frames: int,\n                           hop_length: int,\n                           min_pad_coverage: float = 
0.75,\n                           overlap: float = 0.5):\n    \"\"\"\n    Computes where to split an utterance waveform and its corresponding mel spectrogram to obtain\n    partial utterances of <partial_utterance_n_frames> each. Both the waveform and the mel\n    spectrogram slices are returned, so as to make each partial utterance waveform correspond to\n    its spectrogram. This function assumes that the mel spectrogram parameters used are those\n    defined in params_data.py.\n\n    The returned ranges may be indexing further than the length of the waveform. It is\n    recommended that you pad the waveform with zeros up to wave_slices[-1].stop.\n\n    :param n_samples: the number of samples in the waveform\n    :param partial_utterance_n_frames: the number of mel spectrogram frames in each partial\n    utterance\n    :param min_pad_coverage: when reaching the last partial utterance, it may or may not have\n    enough frames. If at least <min_pad_coverage> of <partial_utterance_n_frames> are present,\n    then the last partial utterance will be considered, as if we padded the audio. Otherwise,\n    it will be discarded, as if we trimmed the audio. If there aren't enough frames for 1 partial\n    utterance, this parameter is ignored so that the function always returns at least 1 slice.\n    :param overlap: by how much the partial utterance should overlap. If set to 0, the partial\n    utterances are entirely disjoint.\n    :return: the waveform slices and mel spectrogram slices as lists of array slices. 
Index\n    respectively the waveform and the mel spectrogram with these slices to obtain the partial\n    utterances.\n    \"\"\"\n    assert 0 <= overlap < 1\n    assert 0 < min_pad_coverage <= 1\n\n    # librosa's function to compute num_frames from num_samples\n    n_frames = int(np.ceil((n_samples + 1) / hop_length))\n    # frame shift between adjacent partials\n    frame_step = max(1, int(np.round(partial_utterance_n_frames * (1 - overlap))))\n\n    # Compute the slices\n    wav_slices, mel_slices = [], []\n    steps = max(1, n_frames - partial_utterance_n_frames + frame_step + 1)\n    for i in range(0, steps, frame_step):\n        mel_range = np.array([i, i + partial_utterance_n_frames])\n        wav_range = mel_range * hop_length\n        mel_slices.append(slice(*mel_range))\n        wav_slices.append(slice(*wav_range))\n\n    # Evaluate whether extra padding is warranted or not\n    last_wav_range = wav_slices[-1]\n    coverage = (n_samples - last_wav_range.start) / (last_wav_range.stop - last_wav_range.start)\n    if coverage < min_pad_coverage and len(mel_slices) > 1:\n        mel_slices = mel_slices[:-1]\n        wav_slices = wav_slices[:-1]\n\n    return wav_slices, mel_slices\n\n\nclass SpeakerVerificationPreprocessor(object):\n    def __init__(self,\n                 sampling_rate: int,\n                 audio_norm_target_dBFS: float,\n                 vad_window_length,\n                 vad_moving_average_width,\n                 vad_max_silence_length,\n                 mel_window_length,\n                 mel_window_step,\n                 n_mels,\n                 partial_n_frames: int,\n                 min_pad_coverage: float = 0.75,\n                 partial_overlap_ratio: float = 0.5):\n        self.sampling_rate = sampling_rate\n        self.audio_norm_target_dBFS = audio_norm_target_dBFS\n\n        self.vad_window_length = vad_window_length\n        self.vad_moving_average_width = vad_moving_average_width\n        self.vad_max_silence_length 
= vad_max_silence_length\n\n        self.n_fft = int(mel_window_length * sampling_rate / 1000)\n        self.hop_length = int(mel_window_step * sampling_rate / 1000)\n        self.n_mels = n_mels\n\n        self.partial_n_frames = partial_n_frames\n        self.min_pad_coverage = min_pad_coverage\n        self.partial_overlap_ratio = partial_overlap_ratio\n\n    def preprocess_wav(self, fpath_or_wav, source_sr=None):\n        # Load the wav from disk if needed\n        if isinstance(fpath_or_wav, (str, Path)):\n            wav, source_sr = librosa.load(str(fpath_or_wav), sr=None)\n        else:\n            wav = fpath_or_wav\n\n        # Resample if numpy.array is passed and sr does not match\n        if source_sr is not None and source_sr != self.sampling_rate:\n            wav = librosa.resample(wav, source_sr, self.sampling_rate)\n\n        # loudness normalization\n        wav = normalize_volume(wav, self.audio_norm_target_dBFS, increase_only=True)\n\n        # trim long silence\n        if webrtcvad:\n            wav = trim_long_silences(wav, self.vad_window_length, self.vad_moving_average_width,\n                                     self.vad_max_silence_length, self.sampling_rate)\n        return wav\n\n    def melspectrogram(self, wav):\n        mel = librosa.feature.melspectrogram(\n            wav, sr=self.sampling_rate, n_fft=self.n_fft, hop_length=self.hop_length, n_mels=self.n_mels)\n        mel = mel.astype(np.float32).T\n        return mel\n\n    def extract_mel_partials(self, wav):\n        wav_slices, mel_slices = compute_partial_slices(\n            len(wav), self.partial_n_frames, self.hop_length, self.min_pad_coverage, self.partial_overlap_ratio)\n\n        # pad audio if needed\n        max_wave_length = wav_slices[-1].stop\n        if max_wave_length >= len(wav):\n            wav = np.pad(wav, (0, max_wave_length - len(wav)), \"constant\")\n\n        # Split the utterance into partials\n        frames = self.melspectrogram(wav)\n        
frames_batch = np.array([frames[s] for s in mel_slices])\n        return frames_batch  # [B, T, C]\n"
  },
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/chinese_g2p.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List, Tuple\nfrom pypinyin import lazy_pinyin, Style\n\nfrom .preprocess_transcription import split_syllable\n\n\ndef convert_to_pinyin(text: str) -> List[str]:\n    \"\"\"convert text into list of syllables, other characters that are not chinese, thus\n    cannot be converted to pinyin are splited.\n    \"\"\"\n    syllables = lazy_pinyin(text, style=Style.TONE3, neutral_tone_with_five=True)\n    return syllables\n\n\ndef convert_sentence(text: str) -> List[Tuple[str]]:\n    \"\"\"convert a sentence into two list: phones and tones\"\"\"\n    syllables = convert_to_pinyin(text)\n    phones = []\n    tones = []\n    for syllable in syllables:\n        p, t = split_syllable(syllable)\n        phones.extend(p)\n        tones.extend(t)\n\n    return phones, tones\n"
  },
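`convert_sentence` in `chinese_g2p.py` above delegates to `split_syllable` to turn each TONE3 pinyin syllable (tone digit appended, e.g. `zhang1`) into phones and tones. A simplified sketch of that split, ignoring the final-rewriting rules the real module applies first (the set below covers only the standard initials):

```python
INITIALS = {"b", "p", "m", "f", "d", "t", "n", "l", "g", "k", "h",
            "j", "q", "x", "zh", "ch", "sh", "r", "z", "c", "s"}

def split_tone3_syllable(syllable):
    """Split a TONE3-style syllable into (phones, tones). Initials carry
    tone '0'; the final carries the syllable's tone digit."""
    tone = syllable[-1]
    body = syllable[:-1]
    if body[:2] in INITIALS:          # two-letter initial: zh / ch / sh
        return [body[:2], body[2:]], ["0", tone]
    if body[0] in INITIALS:           # single-letter initial
        return [body[0], body[1:]], ["0", tone]
    return [body], [tone]             # no initial, e.g. 'an4'
```

The function name is hypothetical; the actual `split_syllable` additionally handles pause symbols and runs `convert` on the body first, so finals like `un` become `uen` before the split.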
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport importlib\nimport os\nfrom typing import List\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.utils.log import logger\nfrom paddlenlp.data import Pad\nfrom parakeet.models import ConditionalWaveFlow, Tacotron2\nfrom parakeet.models.lstm_speaker_encoder import LSTMSpeakerEncoder\nimport soundfile as sf\n\nfrom .audio_processor import SpeakerVerificationPreprocessor\nfrom .chinese_g2p import convert_sentence\nfrom .preprocess_transcription import voc_phones, voc_tones, phone_pad_token, tone_pad_token\n\n\n@moduleinfo(\n    name=\"lstm_tacotron2\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"audio/voice_cloning\",\n)\nclass VoiceCloner(nn.Layer):\n    def __init__(self, speaker_audio: str = None, output_dir: str = './'):\n        super(VoiceCloner, self).__init__()\n\n        self.sample_rate = 22050  # Hyper params for the following model ckpts.\n        speaker_encoder_ckpt = os.path.join(MODULE_HOME, 'lstm_tacotron2', 'assets',\n                                            'ge2e_ckpt_0.3/step-3000000.pdparams')\n        synthesizer_ckpt = os.path.join(MODULE_HOME, 'lstm_tacotron2', 'assets',\n                                        
'tacotron2_aishell3_ckpt_0.3/step-450000.pdparams')\n        vocoder_ckpt = os.path.join(MODULE_HOME, 'lstm_tacotron2', 'assets',\n                                    'waveflow_ljspeech_ckpt_0.3/step-2000000.pdparams')\n\n        # Speaker encoder\n        self.speaker_processor = SpeakerVerificationPreprocessor(\n            sampling_rate=16000,\n            audio_norm_target_dBFS=-30,\n            vad_window_length=30,\n            vad_moving_average_width=8,\n            vad_max_silence_length=6,\n            mel_window_length=25,\n            mel_window_step=10,\n            n_mels=40,\n            partial_n_frames=160,\n            min_pad_coverage=0.75,\n            partial_overlap_ratio=0.5)\n        self.speaker_encoder = LSTMSpeakerEncoder(n_mels=40, num_layers=3, hidden_size=256, output_size=256)\n        self.speaker_encoder.set_state_dict(paddle.load(speaker_encoder_ckpt))\n        self.speaker_encoder.eval()\n\n        # Voice synthesizer\n        self.synthesizer = Tacotron2(\n            vocab_size=68,\n            n_tones=10,\n            d_mels=80,\n            d_encoder=512,\n            encoder_conv_layers=3,\n            encoder_kernel_size=5,\n            d_prenet=256,\n            d_attention_rnn=1024,\n            d_decoder_rnn=1024,\n            attention_filters=32,\n            attention_kernel_size=31,\n            d_attention=128,\n            d_postnet=512,\n            postnet_kernel_size=5,\n            postnet_conv_layers=5,\n            reduction_factor=1,\n            p_encoder_dropout=0.5,\n            p_prenet_dropout=0.5,\n            p_attention_dropout=0.1,\n            p_decoder_dropout=0.1,\n            p_postnet_dropout=0.5,\n            d_global_condition=256,\n            use_stop_token=False)\n        self.synthesizer.set_state_dict(paddle.load(synthesizer_ckpt))\n        self.synthesizer.eval()\n\n        # Vocoder\n        self.vocoder = ConditionalWaveFlow(\n            upsample_factors=[16, 16], n_flows=8, 
n_layers=8, n_group=16, channels=128, n_mels=80, kernel_size=[3, 3])\n        self.vocoder.set_state_dict(paddle.load(vocoder_ckpt))\n        self.vocoder.eval()\n\n        # Speaker embedding\n        self._speaker_embedding = None\n        if speaker_audio is None or not os.path.isfile(speaker_audio):\n            speaker_audio = os.path.join(MODULE_HOME, 'lstm_tacotron2', 'assets', 'voice_cloning.wav')\n            logger.warning(f'No speaker audio is specified, so the speaker encoder will use the default '\n                           f'waveform ({speaker_audio}) to extract a speaker embedding. You can use the '\n                           '\"set_speaker_embedding()\" method to reset the speaker audio for voice cloning.')\n        self.set_speaker_embedding(speaker_audio)\n\n        self.output_dir = os.path.abspath(output_dir)\n        if not os.path.exists(self.output_dir):\n            os.makedirs(self.output_dir)\n\n    def get_speaker_embedding(self):\n        return self._speaker_embedding.numpy()\n\n    def set_speaker_embedding(self, speaker_audio: str):\n        assert os.path.exists(speaker_audio), f'Speaker audio file: {speaker_audio} does not exist.'\n        mel_sequences = self.speaker_processor.extract_mel_partials(\n            self.speaker_processor.preprocess_wav(speaker_audio))\n        self._speaker_embedding = self.speaker_encoder.embed_utterance(paddle.to_tensor(mel_sequences))\n        logger.info(f'Speaker embedding has been set from file: {speaker_audio}')\n\n    def forward(self, phones: paddle.Tensor, tones: paddle.Tensor, speaker_embeddings: paddle.Tensor):\n        outputs = self.synthesizer.infer(phones, tones=tones, global_condition=speaker_embeddings)\n        mel_input = paddle.transpose(outputs[\"mel_outputs_postnet\"], [0, 2, 1])\n        waveforms = self.vocoder.infer(mel_input)\n        return waveforms\n\n    def _convert_text_to_input(self, text: str):\n        \"\"\"\n        Convert input string to phones and tones.\n        \"\"\"\n 
       phones, tones = convert_sentence(text)\n        phones = np.array([voc_phones.lookup(item) for item in phones], dtype=np.int64)\n        tones = np.array([voc_tones.lookup(item) for item in tones], dtype=np.int64)\n        return phones, tones\n\n    def _batchify(self, data: List[str], batch_size: int):\n        \"\"\"\n        Generate input batches.\n        \"\"\"\n        phone_pad_func = Pad(voc_phones.lookup(phone_pad_token))\n        tone_pad_func = Pad(voc_tones.lookup(tone_pad_token))\n\n        def _parse_batch(batch_data):\n            phones, tones = zip(*batch_data)\n            speaker_embeddings = paddle.expand(self._speaker_embedding, shape=(len(batch_data), -1))\n            return phone_pad_func(phones), tone_pad_func(tones), speaker_embeddings\n\n        examples = []  # [(phones, tones), ...]\n        for text in data:\n            examples.append(self._convert_text_to_input(text))\n\n        # Separate the data into batches of size batch_size.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    def generate(self, data: List[str], batch_size: int = 1, use_gpu: bool = False):\n        assert self._speaker_embedding is not None, 'Set speaker embedding before voice cloning.'\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        batches = self._batchify(data, batch_size)\n\n        results = []\n        for batch in batches:\n            phones, tones, speaker_embeddings = map(paddle.to_tensor, batch)\n            waveforms = self(phones, tones, speaker_embeddings).numpy()\n            results.extend(list(waveforms))\n\n        files = []\n        for idx, waveform in enumerate(results):\n            output_wav = os.path.join(self.output_dir, f'{idx+1}.wav')\n            
sf.write(output_wav, waveform, samplerate=self.sample_rate)\n            files.append(output_wav)\n\n        return files\n"
  },
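`_batchify` in `module.py` above converts each text to `(phones, tones)` and then groups the examples into fixed-size batches, flushing a final partial batch. The chunking pattern in isolation (a generic sketch without the padding and speaker-embedding logic):

```python
def batchify(examples, batch_size):
    """Yield successive batches of at most batch_size items,
    including a final partial batch if one remains."""
    batch = []
    for example in examples:
        batch.append(example)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch
```

Because this is a generator, batches are produced lazily; `generate` above consumes them one at a time, which keeps memory bounded for long input lists.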
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/preprocess_transcription.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nfrom pathlib import Path\nimport pickle\nimport re\n\nfrom parakeet.frontend import Vocab\nimport tqdm\n\nzh_pattern = re.compile(\"[\\u4e00-\\u9fa5]\")\n\n_tones = {'<pad>', '<s>', '</s>', '0', '1', '2', '3', '4', '5'}\n\n_pauses = {'%', '$'}\n\n_initials = {\n    'b',\n    'p',\n    'm',\n    'f',\n    'd',\n    't',\n    'n',\n    'l',\n    'g',\n    'k',\n    'h',\n    'j',\n    'q',\n    'x',\n    'zh',\n    'ch',\n    'sh',\n    'r',\n    'z',\n    'c',\n    's',\n}\n\n_finals = {\n    'ii',\n    'iii',\n    'a',\n    'o',\n    'e',\n    'ea',\n    'ai',\n    'ei',\n    'ao',\n    'ou',\n    'an',\n    'en',\n    'ang',\n    'eng',\n    'er',\n    'i',\n    'ia',\n    'io',\n    'ie',\n    'iai',\n    'iao',\n    'iou',\n    'ian',\n    'ien',\n    'iang',\n    'ieng',\n    'u',\n    'ua',\n    'uo',\n    'uai',\n    'uei',\n    'uan',\n    'uen',\n    'uang',\n    'ueng',\n    'v',\n    've',\n    'van',\n    'ven',\n    'veng',\n}\n\n_ernized_symbol = {'&r'}\n\n_specials = {'<pad>', '<unk>', '<s>', '</s>'}\n\n_phones = _initials | _finals | _ernized_symbol | _specials | _pauses\n\nphone_pad_token = '<pad>'\ntone_pad_token = '<pad>'\nvoc_phones = Vocab(sorted(list(_phones)))\nvoc_tones = Vocab(sorted(list(_tones)))\n\n\ndef is_zh(word):\n    global zh_pattern\n    match = zh_pattern.search(word)\n    return 
match is not None\n\n\ndef ernized(syllable):\n    return syllable[:2] != \"er\" and syllable[-2] == 'r'\n\n\ndef convert(syllable):\n    # expansion of o -> uo\n    syllable = re.sub(r\"([bpmf])o$\", r\"\\1uo\", syllable)\n    # syllable = syllable.replace(\"bo\", \"buo\").replace(\"po\", \"puo\").replace(\"mo\", \"muo\").replace(\"fo\", \"fuo\")\n    # expansion for iong, ong\n    syllable = syllable.replace(\"iong\", \"veng\").replace(\"ong\", \"ueng\")\n\n    # expansion for ing, in\n    syllable = syllable.replace(\"ing\", \"ieng\").replace(\"in\", \"ien\")\n\n    # expansion for un, ui, iu\n    syllable = syllable.replace(\"un\", \"uen\").replace(\"ui\", \"uei\").replace(\"iu\", \"iou\")\n\n    # rule for variants of i\n    syllable = syllable.replace(\"zi\", \"zii\").replace(\"ci\", \"cii\").replace(\"si\", \"sii\")\\\n        .replace(\"zhi\", \"zhiii\").replace(\"chi\", \"chiii\").replace(\"shi\", \"shiii\")\\\n        .replace(\"ri\", \"riii\")\n\n    # rule for y preceding i, u\n    syllable = syllable.replace(\"yi\", \"i\").replace(\"yu\", \"v\").replace(\"y\", \"i\")\n\n    # rule for w\n    syllable = syllable.replace(\"wu\", \"u\").replace(\"w\", \"u\")\n\n    # rule for v following j, q, x\n    syllable = syllable.replace(\"ju\", \"jv\").replace(\"qu\", \"qv\").replace(\"xu\", \"xv\")\n\n    return syllable\n\n\ndef split_syllable(syllable: str):\n    \"\"\"Split a syllable in pinyin into a list of phones and a list of tones.\n    Initials have no tone, represented by '0', while finals have tones from\n    '1,2,3,4,5'.\n\n    e.g.\n\n    zhang -> ['zh', 'ang'], ['0', '1']\n    \"\"\"\n    if syllable in _pauses:\n        # syllable, tone\n        return [syllable], ['0']\n\n    tone = syllable[-1]\n    syllable = convert(syllable[:-1])\n\n    phones = []\n    tones = []\n\n    global _initials\n    if syllable[:2] in _initials:\n        phones.append(syllable[:2])\n        tones.append('0')\n        phones.append(syllable[2:])\n        
tones.append(tone)\n    elif syllable[0] in _initials:\n        phones.append(syllable[0])\n        tones.append('0')\n        phones.append(syllable[1:])\n        tones.append(tone)\n    else:\n        phones.append(syllable)\n        tones.append(tone)\n    return phones, tones\n"
  },
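The first rewrite in `convert` in `preprocess_transcription.py` above uses a regex to expand the final `o` into `uo` after the labial initials b/p/m/f, so that e.g. `bo` is transcribed like `buo`. Isolated for clarity (the function name is illustrative):

```python
import re

def expand_o(syllable):
    # After labial initials b/p/m/f, the final 'o' is rewritten as 'uo',
    # matching the transcription scheme used in convert().
    return re.sub(r"([bpmf])o$", r"\1uo", syllable)
```

The `$` anchor restricts the rule to syllable-final `o`, so syllables such as `ao` or `ou` are left untouched.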
  {
    "path": "modules/audio/voice_cloning/lstm_tacotron2/requirements.txt",
    "content": "paddle-parakeet\n"
  },
  {
    "path": "modules/demo/README.md",
    "content": "# 如何编写一个PaddleHub Module\n\n## 模型基本信息\n\n我们准备编写一个用于做情感分析的PaddleHub Module，Module的基本信息如下：\n```yaml\nname: senta_test\nversion: 1.0.0\nsummary: This is a PaddleHub Module. Just for test.\nauthor: anonymous\nauthor_email:\ntype: nlp/sentiment_analysis\n```\n\nModule存在一个接口sentiment_classify，用于接收传入文本，并给出文本的情感倾向（正面/负面），支持python接口调用和命令行调用。\n```python\nimport paddlehub as hub\n\nsenta_test = hub.Module(name=\"senta_test\")\nsenta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n```\n```cmd\nhub run senta_test --input_text 这部电影太差劲了\n```\n\n<br/>\n\n## 策略\n\n为了示例代码简单起见，我们使用一个非常简单的情感判断策略，当输入文本中带有词表中指定单词时，则判断文本倾向为负向，否则为正向\n\n<br/>\n\n## Module创建\n\n### step 1. 创建必要的目录与文件\n\n创建一个senta_test的目录，并在senta_test目录下分别创建__init__.py、module.py、processor.py、vocab.list，其中\n\n|文件名|用途|\n|-|-|\n|\\_\\_init\\_\\_.py|空文件|\n|module.py|主模块，提供Module的实现代码|\n|processor.py|辅助模块，提供词表加载的方法|\n|vocab.list|存放词表|\n\n```cmd\n➜  tree senta_test\nsenta_test/\n├── vocab.list\n├── __init__.py\n├── module.py\n└── processor.py\n```\n### step 2. 实现辅助模块processor\n\n在processor.py中实现一个load_vocab接口用于读取词表\n```python\ndef load_vocab(vocab_path):\n    with open(vocab_path) as file:\n        return file.read().split()\n```\n\n### step 3. 编写Module处理代码\n\nmodule.py文件为Module的入口代码所在，我们需要在其中实现预测逻辑。\n\n#### step 3_1. 引入必要的头文件\n```python\nimport argparse\nimport os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable, moduleinfo\n\nfrom senta_test.processor import load_vocab\n```\n`注意`：当引用Module中模块时，需要输入全路径，如senta_test.processor\n#### step 3_2. 定义SentaTest类\nmodule.py中需要有一个继承了hub.Module的类存在，该类负责实现预测逻辑，并使用moduleinfo填写基本信息。当使用hub.Module(name=\"senta_test\")加载Module时，PaddleHub会自动创建SentaTest的对象并返回。\n```python\n@moduleinfo(\n    name=\"senta_test\",\n    version=\"1.0.0\",\n    summary=\"This is a PaddleHub Module. Just for test.\",\n    author=\"anonymous\",\n    author_email=\"\",\n    type=\"nlp/sentiment_analysis\",\n)\nclass SentaTest(hub.Module):\n    ...\n```\n#### step 3_3. 
执行必要的初始化\n```python\ndef _initialize(self):\n    # add arg parser\n    self.parser = argparse.ArgumentParser(\n        description=\"Run the senta_test module.\",\n        prog='hub run senta_test',\n        usage='%(prog)s',\n        add_help=True)\n    self.parser.add_argument(\n        '--input_text', type=str, default=None, help=\"text to predict\")\n\n    # load word dict\n    vocab_path = os.path.join(self.directory, \"vocab.list\")\n    self.vocab = load_vocab(vocab_path)\n```\n`注意`：执行类的初始化不能使用默认的__init__接口，而是应该重载实现_initialize接口。对象默认内置了directory属性，可以直接获取到Module所在路径\n#### step 3_4. 完善预测逻辑\n```python\ndef sentiment_classify(self, texts):\n    results = []\n    for text in texts:\n        sentiment = \"positive\"\n        for word in self.vocab:\n            if word in text:\n                sentiment = \"negative\"\n                break\n        results.append({\"text\":text, \"sentiment\":sentiment})\n\n    return results\n```\n#### step 3_5. 支持命令行调用\n如果希望Module可以支持命令行调用，则需要提供一个经过runnable修饰的接口，接口负责解析传入数据并进行预测，将结果返回。\n\n如果不需要提供命令行预测功能，则可以不实现该接口，PaddleHub在用命令行执行时，会自动发现该Module不支持命令行方式，并给出提示。\n```python\n@runnable\ndef run_cmd(self, argvs):\n    args = self.parser.parse_args(argvs)\n    texts = [args.input_text]\n    return self.sentiment_classify(texts)\n```\n#### step 3_6. 
支持serving调用\n\nTODO\n\n### 完整代码\n\n* [module.py](./senta_test/module.py)\n\n* [processor.py](./senta_test/processor.py)\n\n<br/>\n\n## 测试步骤\n\n完成Module编写后，我们可以通过以下方式测试该Module\n\n### 调用方法1\n\n将Module安装到本机中，再通过hub.Module(name=...)加载\n```shell\nhub install senta_test\n```\n\n```python\nimport paddlehub as hub\n\nsenta_test = hub.Module(name=\"senta_test\")\nsenta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n```\n\n### 调用方法2\n\n直接通过hub.Module(directory=...)加载\n```python\nimport paddlehub as hub\n\nsenta_test = hub.Module(directory=\"senta_test/\")\nsenta_test.sentiment_classify(texts=[\"这部电影太差劲了\"])\n```\n\n### 调用方法3\n将senta_test作为路径加到环境变量中，直接加载SentaTest对象\n```shell\nexport PYTHONPATH=senta_test:$PYTHONPATH\n```\n\n```python\nfrom senta_test.module import SentaTest\n\nSentaTest.sentiment_classify(texts=[\"这部电影太差劲了\"])\n```\n\n### 调用方法4\n将Module安装到本机中，再通过hub run运行\n```shell\nhub install senta_test\nhub run senta_test --input_text \"这部电影太差劲了\"\n```\n"
  },
  {
    "path": "modules/demo/senta_test/__init__.py",
    "content": ""
  },
  {
    "path": "modules/demo/senta_test/module.py",
    "content": "import argparse\nimport os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable, moduleinfo\n\nfrom senta_test.processor import load_vocab\n\n\n@moduleinfo(\n    name=\"senta_test\",\n    version=\"1.0.0\",\n    summary=\"This is a PaddleHub Module. Just for test.\",\n    author=\"anonymous\",\n    author_email=\"\",\n    type=\"nlp/sentiment_analysis\",\n)\nclass SentaTest:\n    def __init__(self):\n        # add arg parser\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the senta_test module.\", prog='hub run senta_test', usage='%(prog)s', add_help=True)\n        self.parser.add_argument('--input_text', type=str, default=None, help=\"text to predict\")\n\n        # load word dict\n        vocab_path = os.path.join(self.directory, \"vocab.list\")\n        self.vocab = load_vocab(vocab_path)\n\n    def sentiment_classify(self, texts):\n        results = []\n        for text in texts:\n            sentiment = \"positive\"\n            for word in self.vocab:\n                if word in text:\n                    sentiment = \"negative\"\n                    break\n            results.append({\"text\": text, \"sentiment\": sentiment})\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        args = self.parser.parse_args(argvs)\n        texts = [args.input_text]\n        return self.sentiment_classify(texts)\n"
  },
  {
    "path": "modules/demo/senta_test/processor.py",
    "content": "def load_vocab(vocab_path):\n    with open(vocab_path) as file:\n        return file.read().split()\n"
  },
  {
    "path": "modules/demo/senta_test/vocab.list",
    "content": "糟糕\n差劲\n浪费\n"
  },
  {
    "path": "modules/demo/test.py",
    "content": "import paddlehub as hub\n\nsenta_test = hub.Module(directory=\"senta_test\")\nprint(senta_test.sentiment_classify([\"这部电影太糟糕了\", \"这部电影太棒了\"]))\n"
  },
  {
    "path": "modules/image/Image_editing/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【图像编辑】](https://www.paddlepaddle.org.cn/hublist)**\n\n\n\n### 图像编辑\n\n图像编辑是指在输入图像的基础上，对图像的像素点进行进一步的编辑和调整，输出新的目标图像，具体的应用场景有：超分辨率、黑白片上色、老照片修复等。\n\n- 精选推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [超分辨率](https://www.paddlepaddle.org.cn/hubdetail?name=realsr&en_category=ImageEditing) | 可用于图像和视频的超分模型，它能够将输入的图片和视频超分四倍。 |\n| [黑白图像上色](https://www.paddlepaddle.org.cn/hubdetail?name=deoldify&en_category=ImageEditing) | deoldify是用于图像和视频的着色渲染模型，该模型能够实现给黑白照片和视频恢复原彩。 |\n| [老照片修复](https://www.paddlepaddle.org.cn/hubdetail?name=photo_restoration&en_category=ImageEditing) | 针对老照片修复的模型。它主要由两个部分组成：着色和超分。|\n"
  },
  {
    "path": "modules/image/Image_editing/README_en.md",
    "content": "## **For a better user experience, refer to the WEB official document ->  [Image Editing](https://www.paddlepaddle.org.cn/hublist)**\n\n### Image Editing\n\nImage editing refers to further editing and adjusting the pixels of an input image to produce a new target image. Specific application scenarios include super-resolution, black and white picture coloring, old photo restoration, and so on.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Super-resolution](https://www.paddlepaddle.org.cn/hubdetail?name=realsr&en_category=ImageEditing) | A super-resolution model for images and videos that upscales the input by a factor of four. |\n| [B\\&W Image Coloring](https://www.paddlepaddle.org.cn/hubdetail?name=deoldify&en_category=ImageEditing) | deoldify is a color rendering model for images and videos that allows you to restore original color to black and white photos and videos. |\n| [Old Photo Restoration](https://www.paddlepaddle.org.cn/hubdetail?name=photo_restoration&en_category=ImageEditing) | A model for restoring old photos. It consists of two parts: colorization and super-resolution. |\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/deoldify/README.md",
    "content": "# deoldify\n\n|模型名称|deoldify|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|NoGAN|\n|数据集|ILSVRC 2012|\n|是否支持Fine-tuning|否|\n|模型大小|834MB|\n|指标|-|\n|最新更新日期|2021-04-13|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130886749-668dfa38-42ed-4a09-8d4a-b18af0475375.jpg\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130886685-76221736-839a-46a2-8415-e5e0dd3b345e.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - deoldify是用于图像和视频的着色渲染模型，该模型能够实现给黑白照片和视频恢复原彩。\n\n  - 更多详情请参考：[deoldify](https://github.com/jantic/DeOldify)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - NOTE: 使用该模型需要自行安装ffmpeg，若您使用conda环境，推荐使用如下语句进行安装。\n\n      ```shell\n      $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n      ```\n\n\n- ### 2、安装\n    - ```shell\n      $ hub install deoldify\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n  - ### 1、预测代码示例\n\n       - ```python\n         import paddlehub as hub\n\n         model = hub.Module(name='deoldify')\n         model.predict('/PATH/TO/IMAGE')\n\n         # model.predict('/PATH/TO/VIDEO')\n         ```\n\n  - ### 2、API\n\n    - ```python\n        def predict(self, input):\n        ```\n\n        - 着色变换API，得到着色后的图片或者视频。\n\n        - **参数**\n\n            - input(str): 图片或者视频的路径；\n\n        - **返回**\n\n            -  若输入是图片，返回值为：\n                - pred_img(np.ndarray): BGR图片数据；\n                - out_path(str): 保存图片路径。\n\n            - 若输入是视频，返回值为：\n                - frame_pattern_combined(str): 视频着色后单帧数据保存路径；\n                - vid_out_path(str): 
视频保存路径。\n\n    - ```python\n      def run_image(self, img):\n      ```\n        - 图像着色API，得到着色后的图片。\n\n        - **参数**\n\n            - img (str｜np.ndarray): 图片路径或者BGR格式图片。\n\n        - **返回**\n\n            - pred_img(np.ndarray): BGR图片数据；\n\n    - ```python\n      def run_video(self, video):\n      ```\n\n        - 视频着色API，得到着色后的视频。\n\n        - **参数**\n\n            - video (str): 待处理视频路径。\n\n        - **返回**\n\n            - frame_pattern_combined(str): 视频着色后单帧数据保存路径；\n            - vid_out_path(str): 视频保存路径。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线照片着色服务\n\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n        - ```shell\n          $ hub serving start -m deoldify\n          ```\n\n        - 这样就完成了一个图像着色的在线服务API的部署，默认端口号为8866。\n\n        - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n      ```python\n      import requests\n      import json\n      import base64\n\n      import cv2\n      import numpy as np\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tostring()).decode('utf8')\n      def base64_to_cv2(b64str):\n          data = base64.b64decode(b64str.encode('utf8'))\n          data = np.fromstring(data, np.uint8)\n          data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n          return data\n\n      # 发送HTTP请求\n      org_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\n      data = {'images':cv2_to_base64(org_im)}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/deoldify\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n      img = base64_to_cv2(r.json()[\"results\"])\n      cv2.imwrite('/PATH/TO/SAVE/IMAGE', img)\n      ```\n\n- ### Gradio APP 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/deoldify 在浏览器中访问 deoldify 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  
适配paddlehub2.0版本\n\n* 1.1.0\n\n  移除 Fluid API\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  ```shell\n  $ hub install deoldify == 1.2.0\n  ```\n"
  },
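The serving examples in the deoldify READMEs round-trip image bytes through base64 before and after the HTTP call. That encode/decode pair can be exercised without OpenCV using only the stdlib `base64` module (note that `ndarray.tostring()` as used in the README is deprecated in recent NumPy in favor of `tobytes()`; the function names below are illustrative):

```python
import base64

def bytes_to_b64(data: bytes) -> str:
    # Mirrors cv2_to_base64: raw encoded-image bytes -> base64 text payload
    # suitable for embedding in the JSON request body.
    return base64.b64encode(data).decode('utf8')

def b64_to_bytes(b64str: str) -> bytes:
    # Mirrors the first step of base64_to_cv2: base64 text -> raw bytes,
    # which cv2.imdecode would then turn back into an image array.
    return base64.b64decode(b64str.encode('utf8'))
```

The round trip is lossless, so the client and server can exchange arbitrary encoded-image bytes through the JSON payload.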
  {
    "path": "modules/image/Image_editing/colorization/deoldify/README_en.md",
    "content": "# deoldify\n\n| Module Name |deoldify|\n| :--- | :---: |\n|Category|Image editing|\n|Network |NoGAN|\n|Dataset|ILSVRC 2012|\n|Fine-tuning supported or not |No|\n|Module Size |834MB|\n|Data indicators|-|\n|Latest update date |2021-04-13|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130886749-668dfa38-42ed-4a09-8d4a-b18af0475375.jpg\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130886685-76221736-839a-46a2-8415-e5e0dd3b345e.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - Deoldify is a color rendering model for images and videos, which can restore color for black and white photos and videos.\n\n  - For more information, please refer to: [deoldify](https://github.com/jantic/DeOldify)\n\n## II. Installation\n\n- ### 1、Environment Dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - NOTE: This Module relies on ffmpeg. Please install ffmpeg before using this Module; if you use a conda environment, the following command is recommended.\n\n        ```shell\n        $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n        ```\n\n\n- ### 2、Installation\n    - ```shell\n      $ hub install deoldify\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n  - ### 1、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n\n      model = hub.Module(name='deoldify')\n      model.predict('/PATH/TO/IMAGE')\n\n      # model.predict('/PATH/TO/VIDEO')\n      ```\n\n  - ### 2、API\n\n    - ```python\n        def predict(self, input):\n        ```\n\n        - Prediction API.\n\n        - **Parameter**\n\n            - input (str): Image or video path.\n\n        - **Return**\n\n            - If the input is an image path, the output is:\n              - pred_img(np.ndarray): image data, ndarray.shape is in the format [H, W, C], BGR.\n              - out_path(str): save path of images.\n\n            - If the input is a video path, the output is:\n              - frame_pattern_combined(str): save path of frames from output video.\n              - vid_out_path(str): save path of output video.\n\n    - ```python\n      def run_image(self, img):\n      ```\n        - Prediction API for image.\n\n        - **Parameter**\n\n            - img (str｜np.ndarray): Image data, str or ndarray. ndarray.shape is in the format [H, W, C], BGR.\n\n        - **Return**\n\n            - pred_img(np.ndarray): Ndarray.shape is in the format [H, W, C], BGR.\n\n    - ```python\n      def run_video(self, video):\n      ```\n       - Prediction API for video.\n\n       - **Parameter**\n\n         - video(str): Video path.\n\n       - **Return**\n\n         - frame_pattern_combined(str): Save path of frames from output video.\n         - vid_out_path(str): Save path of output video.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service for colorizing old photos and videos.\n\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m deoldify\n        ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\n        data = {'images':cv2_to_base64(org_im)}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/deoldify\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        img = base64_to_cv2(r.json()[\"results\"])\n        cv2.imwrite('/PATH/TO/SAVE/IMAGE', img)\n        ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for deoldify can be accessed in the browser at http://127.0.0.1:8866/gradio/deoldify.\n\n## V. 
Release Note\n\n- 1.0.0\n\n  First release\n\n- 1.0.1\n\n  Adapt to paddlehub2.0\n\n* 1.1.0\n\n  Remove Fluid API\n\n* 1.2.0\n\n  Add Gradio APP support\n\n  ```shell\n  $ hub install deoldify == 1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/deoldify/base_module.py",
    "content": "import numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.vision.models import resnet101\n\nfrom . import utils as U\n\n\nclass SequentialEx(nn.Layer):\n    \"Like `nn.Sequential`, but with ModuleList semantics, and can access module input\"\n\n    def __init__(self, *layers):\n        super().__init__()\n        self.layers = nn.LayerList(layers)\n\n    def forward(self, x):\n        res = x\n        for l in self.layers:\n            if isinstance(l, MergeLayer):\n                l.orig = x\n            nres = l(res)\n            # We have to remove res.orig to avoid hanging refs and therefore memory leaks\n            # l.orig = None\n            res = nres\n        return res\n\n    def __getitem__(self, i):\n        return self.layers[i]\n\n    def append(self, l):\n        return self.layers.append(l)\n\n    def extend(self, l):\n        return self.layers.extend(l)\n\n    def insert(self, i, l):\n        return self.layers.insert(i, l)\n\n\nclass Deoldify(SequentialEx):\n\n    def __init__(self,\n                 encoder,\n                 n_classes,\n                 blur=False,\n                 blur_final=True,\n                 self_attention=False,\n                 y_range=None,\n                 last_cross=True,\n                 bottle=False,\n                 norm_type='Batch',\n                 nf_factor=1,\n                 **kwargs):\n\n        imsize = (256, 256)\n        sfs_szs = U.model_sizes(encoder, size=imsize)\n        sfs_idxs = list(reversed(_get_sfs_idxs(sfs_szs)))\n        self.sfs = U.hook_outputs([encoder[i] for i in sfs_idxs], detach=False)\n        x = U.dummy_eval(encoder, imsize).detach()\n\n        nf = 512 * nf_factor\n        extra_bn = norm_type == 'Spectral'\n        ni = sfs_szs[-1][1]\n        middle_conv = nn.Sequential(\n            custom_conv_layer(ni, ni * 2, norm_type=norm_type, extra_bn=extra_bn),\n            custom_conv_layer(ni * 2, ni, 
norm_type=norm_type, extra_bn=extra_bn),\n        )\n\n        layers = [encoder, nn.BatchNorm(ni), nn.ReLU(), middle_conv]\n\n        for i, idx in enumerate(sfs_idxs):\n            not_final = i != len(sfs_idxs) - 1\n            up_in_c, x_in_c = int(x.shape[1]), int(sfs_szs[idx][1])\n            do_blur = blur and (not_final or blur_final)\n            sa = self_attention and (i == len(sfs_idxs) - 3)\n\n            n_out = nf if not_final else nf // 2\n\n            unet_block = UnetBlockWide(up_in_c,\n                                       x_in_c,\n                                       n_out,\n                                       self.sfs[i],\n                                       final_div=not_final,\n                                       blur=blur,\n                                       self_attention=sa,\n                                       norm_type=norm_type,\n                                       extra_bn=extra_bn,\n                                       **kwargs)\n            unet_block.eval()\n            layers.append(unet_block)\n            x = unet_block(x)\n\n        ni = x.shape[1]\n        if imsize != sfs_szs[0][-2:]:\n            layers.append(PixelShuffle_ICNR(ni, **kwargs))\n        if last_cross:\n            layers.append(MergeLayer(dense=True))\n            ni += 3\n            layers.append(res_block(ni, bottle=bottle, norm_type=norm_type, **kwargs))\n        layers += [custom_conv_layer(ni, n_classes, ks=1, use_activ=False, norm_type=norm_type)]\n        if y_range is not None:\n            layers.append(SigmoidRange(*y_range))\n        super().__init__(*layers)\n\n\ndef custom_conv_layer(ni: int,\n                      nf: int,\n                      ks: int = 3,\n                      stride: int = 1,\n                      padding: int = None,\n                      bias: bool = None,\n                      is_1d: bool = False,\n                      norm_type='Batch',\n                      use_activ: bool = True,\n         
             leaky: float = None,\n                      transpose: bool = False,\n                      self_attention: bool = False,\n                      extra_bn: bool = False,\n                      **kwargs):\n    \"Create a sequence of convolutional (`ni` to `nf`), ReLU (if `use_activ`) and batchnorm (if `bn`) layers.\"\n    if padding is None:\n        padding = (ks - 1) // 2 if not transpose else 0\n    bn = norm_type in ('Batch', 'BatchZero') or extra_bn\n    if bias is None:\n        bias = not bn\n    conv_func = nn.Conv2DTranspose if transpose else nn.Conv1D if is_1d else nn.Conv2D\n\n    conv = conv_func(ni, nf, kernel_size=ks, bias_attr=bias, stride=stride, padding=padding)\n    if norm_type == 'Weight':\n        conv = nn.utils.weight_norm(conv)\n    elif norm_type == 'Spectral':\n        conv = U.Spectralnorm(conv)\n    layers = [conv]\n    if use_activ:\n        layers.append(relu(True, leaky=leaky))\n    if bn:\n        layers.append((nn.BatchNorm1D if is_1d else nn.BatchNorm)(nf))\n    if self_attention:\n        layers.append(SelfAttention(nf))\n\n    return nn.Sequential(*layers)\n\n\ndef relu(inplace: bool = False, leaky: float = None):\n    \"Return a relu activation, maybe `leaky` and `inplace`.\"\n    return nn.LeakyReLU(leaky) if leaky is not None else nn.ReLU()\n\n\nclass UnetBlockWide(nn.Layer):\n    \"A quasi-UNet block, using `PixelShuffle_ICNR` upsampling.\"\n\n    def __init__(self,\n                 up_in_c: int,\n                 x_in_c: int,\n                 n_out: int,\n                 hook,\n                 final_div: bool = True,\n                 blur: bool = False,\n                 leaky: float = None,\n                 self_attention: bool = False,\n                 **kwargs):\n        super().__init__()\n        self.hook = hook\n        up_out = x_out = n_out // 2\n        self.shuf = CustomPixelShuffle_ICNR(up_in_c, up_out, blur=blur, leaky=leaky, **kwargs)\n        self.bn = nn.BatchNorm(x_in_c)\n        ni = 
up_out + x_in_c\n        self.conv = custom_conv_layer(ni, x_out, leaky=leaky, self_attention=self_attention, **kwargs)\n        self.relu = relu(leaky=leaky)\n\n    def forward(self, up_in):\n        s = self.hook.stored\n        up_out = self.shuf(up_in)\n        ssh = s.shape[-2:]\n        if ssh != up_out.shape[-2:]:\n            up_out = F.interpolate(up_out, s.shape[-2:], mode='nearest')\n        cat_x = self.relu(paddle.concat([up_out, self.bn(s)], axis=1))\n        return self.conv(cat_x)\n\n\nclass UnetBlockDeep(nn.Layer):\n    \"A quasi-UNet block, using `PixelShuffle_ICNR upsampling`.\"\n\n    def __init__(\n            self,\n            up_in_c: int,\n            x_in_c: int,\n            # hook: Hook,\n            final_div: bool = True,\n            blur: bool = False,\n            leaky: float = None,\n            self_attention: bool = False,\n            nf_factor: float = 1.0,\n            **kwargs):\n        super().__init__()\n\n        self.shuf = CustomPixelShuffle_ICNR(up_in_c, up_in_c // 2, blur=blur, leaky=leaky, **kwargs)\n        self.bn = nn.BatchNorm(x_in_c)\n        ni = up_in_c // 2 + x_in_c\n        nf = int((ni if final_div else ni // 2) * nf_factor)\n        self.conv1 = custom_conv_layer(ni, nf, leaky=leaky, **kwargs)\n        self.conv2 = custom_conv_layer(nf, nf, leaky=leaky, self_attention=self_attention, **kwargs)\n        self.relu = relu(leaky=leaky)\n\n    def forward(self, up_in):\n        s = self.hook.stored\n        up_out = self.shuf(up_in)\n        ssh = s.shape[-2:]\n        if ssh != up_out.shape[-2:]:\n            up_out = F.interpolate(up_out, s.shape[-2:], mode='nearest')\n        cat_x = self.relu(paddle.concat([up_out, self.bn(s)], axis=1))\n        return self.conv2(self.conv1(cat_x))\n\n\ndef ifnone(a, b):\n    \"`a` if `a` is not None, otherwise `b`.\"\n    return b if a is None else a\n\n\nclass PixelShuffle_ICNR(nn.Layer):\n    \"Upsample by `scale` from `ni` filters to `nf` (default `ni`), using 
`nn.PixelShuffle`, \\\n    `icnr` init, and `weight_norm`.\"\n\n    def __init__(self,\n                 ni: int,\n                 nf: int = None,\n                 scale: int = 2,\n                 blur: bool = False,\n                 norm_type='Weight',\n                 leaky: float = None):\n        super().__init__()\n        nf = ifnone(nf, ni)\n        self.conv = conv_layer(ni, nf * (scale**2), ks=1, norm_type=norm_type, use_activ=False)\n\n        self.shuf = PixelShuffle(scale)\n\n        self.pad = ReplicationPad2d([1, 0, 1, 0])\n        self.blur = nn.AvgPool2D(2, stride=1)\n        self.relu = relu(True, leaky=leaky)\n\n    def forward(self, x):\n        x = self.shuf(self.relu(self.conv(x)))\n        return self.blur(self.pad(x)) if self.blur else x\n\n\ndef conv_layer(ni: int,\n               nf: int,\n               ks: int = 3,\n               stride: int = 1,\n               padding: int = None,\n               bias: bool = None,\n               is_1d: bool = False,\n               norm_type='Batch',\n               use_activ: bool = True,\n               leaky: float = None,\n               transpose: bool = False,\n               init=None,\n               self_attention: bool = False):\n    \"Create a sequence of convolutional (`ni` to `nf`), ReLU (if `use_activ`) and batchnorm (if `bn`) layers.\"\n    if padding is None: padding = (ks - 1) // 2 if not transpose else 0\n    bn = norm_type in ('Batch', 'BatchZero')\n    if bias is None: bias = not bn\n    conv_func = nn.Conv2DTranspose if transpose else nn.Conv1D if is_1d else nn.Conv2D\n\n    conv = conv_func(ni, nf, kernel_size=ks, bias_attr=bias, stride=stride, padding=padding)\n    if norm_type == 'Weight':\n        conv = nn.utils.weight_norm(conv)\n    elif norm_type == 'Spectral':\n        conv = U.Spectralnorm(conv)\n\n    layers = [conv]\n    if use_activ: layers.append(relu(True, leaky=leaky))\n    if bn: layers.append((nn.BatchNorm1D if is_1d else nn.BatchNorm)(nf))\n    if 
self_attention: layers.append(SelfAttention(nf))\n    return nn.Sequential(*layers)\n\n\nclass CustomPixelShuffle_ICNR(nn.Layer):\n    \"Upsample by `scale` from `ni` filters to `nf` (default `ni`), using `nn.PixelShuffle`, `icnr` init, \\\n    and `weight_norm`.\"\n\n    def __init__(self, ni: int, nf: int = None, scale: int = 2, blur: bool = False, leaky: float = None, **kwargs):\n        super().__init__()\n        nf = ifnone(nf, ni)\n        self.conv = custom_conv_layer(ni, nf * (scale**2), ks=1, use_activ=False, **kwargs)\n\n        self.shuf = PixelShuffle(scale)\n\n        self.pad = ReplicationPad2d([1, 0, 1, 0])\n        self.blur = nn.AvgPool2D(2, stride=1)\n        self.relu = nn.LeakyReLU(leaky) if leaky is not None else nn.ReLU()  # relu(True, leaky=leaky)\n\n    def forward(self, x):\n        x = self.shuf(self.relu(self.conv(x)))\n        return self.blur(self.pad(x)) if self.blur else x\n\n\nclass MergeLayer(nn.Layer):\n    \"Merge a shortcut with the result of the module by adding them or concatenating them if `dense=True`.\"\n\n    def __init__(self, dense: bool = False):\n        super().__init__()\n        self.dense = dense\n        self.orig = None\n\n    def forward(self, x):\n        out = paddle.concat([x, self.orig], axis=1) if self.dense else (x + self.orig)\n        self.orig = None\n        return out\n\n\ndef res_block(nf, dense: bool = False, norm_type='Batch', bottle: bool = False, **conv_kwargs):\n    \"Resnet block of `nf` features. 
`conv_kwargs` are passed to `conv_layer`.\"\n    norm2 = norm_type\n    if not dense and (norm_type == 'Batch'): norm2 = 'BatchZero'\n    nf_inner = nf // 2 if bottle else nf\n    return SequentialEx(conv_layer(nf, nf_inner, norm_type=norm_type, **conv_kwargs),\n                        conv_layer(nf_inner, nf, norm_type=norm2, **conv_kwargs), MergeLayer(dense))\n\n\nclass SigmoidRange(nn.Layer):\n    \"Sigmoid module with range `(low, high)`\"\n\n    def __init__(self, low, high):\n        super().__init__()\n        self.low, self.high = low, high\n\n    def forward(self, x):\n        return sigmoid_range(x, self.low, self.high)\n\n\ndef sigmoid_range(x, low, high):\n    \"Sigmoid function with range `(low, high)`\"\n    return F.sigmoid(x) * (high - low) + low\n\n\nclass PixelShuffle(nn.Layer):\n\n    def __init__(self, upscale_factor):\n        super(PixelShuffle, self).__init__()\n        self.upscale_factor = upscale_factor\n\n    def forward(self, x):\n        return F.pixel_shuffle(x, self.upscale_factor)\n\n\nclass ReplicationPad2d(nn.Layer):\n\n    def __init__(self, size):\n        super(ReplicationPad2d, self).__init__()\n        self.size = size\n\n    def forward(self, x):\n        return F.pad(x, self.size, mode=\"replicate\")\n\n\ndef conv1d(ni: int, no: int, ks: int = 1, stride: int = 1, padding: int = 0, bias: bool = False):\n    \"Create and initialize a `nn.Conv1D` layer with spectral normalization.\"\n    conv = nn.Conv1D(ni, no, ks, stride=stride, padding=padding, bias_attr=bias)\n    return U.Spectralnorm(conv)\n\n\nclass SelfAttention(nn.Layer):\n    \"Self attention layer for nd.\"\n\n    def __init__(self, n_channels):\n        super().__init__()\n        self.query = conv1d(n_channels, n_channels // 8)\n        self.key = conv1d(n_channels, n_channels // 8)\n        self.value = conv1d(n_channels, n_channels)\n        self.gamma = self.create_parameter(\n            shape=[1], default_initializer=paddle.nn.initializer.Constant(0.0))  # 
nn.Parameter(tensor([0.]))\n\n    def forward(self, x):\n        # Notation from https://arxiv.org/pdf/1805.08318.pdf\n        size = x.shape\n        x = paddle.reshape(x, list(size[:2]) + [-1])\n        f, g, h = self.query(x), self.key(x), self.value(x)\n\n        beta = paddle.nn.functional.softmax(paddle.bmm(paddle.transpose(f, [0, 2, 1]), g), axis=1)\n        o = self.gamma * paddle.bmm(h, beta) + x\n        return paddle.reshape(o, size)\n\n\ndef _get_sfs_idxs(sizes):\n    \"Get the indexes of the layers where the size of the activation changes.\"\n    feature_szs = [size[-1] for size in sizes]\n    sfs_idxs = list(np.where(np.array(feature_szs[:-1]) != np.array(feature_szs[1:]))[0])\n    if feature_szs[0] != feature_szs[1]:\n        sfs_idxs = [0] + sfs_idxs\n    return sfs_idxs\n\n\ndef build_model():\n    backbone = resnet101()\n    cut = -2\n    encoder = nn.Sequential(*list(backbone.children())[:cut])\n\n    model = Deoldify(encoder, 3, blur=True, y_range=(-3, 3), norm_type='Spectral', self_attention=True, nf_factor=2)\n    return model\n"
  },
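The `SigmoidRange` layer above bounds the network output by rescaling a sigmoid into `(low, high)`; `build_model` uses `y_range=(-3, 3)` so outputs stay inside the normalized-image range. A minimal standalone numpy sketch of the same formula (independent of Paddle, not the module itself):

```python
import numpy as np


def sigmoid_range(x, low, high):
    # Rescale a sigmoid from (0, 1) into (low, high), as SigmoidRange does.
    return 1.0 / (1.0 + np.exp(-x)) * (high - low) + low


# x = 0 gives sigmoid(0) = 0.5, i.e. the exact midpoint of the range.
mid = sigmoid_range(0.0, -3.0, 3.0)
# Large |x| saturates toward the bounds without ever crossing them.
hi = sigmoid_range(20.0, -3.0, 3.0)
lo = sigmoid_range(-20.0, -3.0, 3.0)
```

Because the sigmoid never reaches 0 or 1, the layer keeps activations strictly within `(low, high)` while remaining differentiable everywhere.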
  {
    "path": "modules/image/Image_editing/colorization/deoldify/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport glob\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom PIL import Image\nfrom tqdm import tqdm\n\nfrom . import utils as U\nfrom .base_module import build_model\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"deoldify\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"Deoldify is a colorizaton model\",\n            version=\"1.2.0\")\nclass DeOldifyPredictor(nn.Layer):\n\n    def __init__(self, render_factor: int = 32, output_path: int = 'output', load_checkpoint: str = None):\n        super(DeOldifyPredictor, self).__init__()\n        self.model = build_model()\n        self.render_factor = render_factor\n        self.output = os.path.join(output_path, 'DeOldify')\n        if not os.path.exists(self.output):\n            os.makedirs(self.output)\n        if load_checkpoint is not None:\n            state_dict = paddle.load(load_checkpoint)\n            self.model.load_dict(state_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'DeOldify_stable.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system('wget 
https://paddlegan.bj.bcebos.com/applications/DeOldify_stable.pdparams -O ' + checkpoint)\n            state_dict = paddle.load(checkpoint)\n            self.model.load_dict(state_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def norm(self, img, render_factor=32, render_base=16):\n        target_size = render_factor * render_base\n        img = img.resize((target_size, target_size), resample=Image.BILINEAR)\n\n        img = np.array(img).transpose([2, 0, 1]).astype('float32') / 255.0\n\n        img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\n        img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n        img -= img_mean\n        img /= img_std\n        return img.astype('float32')\n\n    def denorm(self, img):\n        img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\n        img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n        img *= img_std\n        img += img_mean\n        img = img.transpose((1, 2, 0))\n\n        return (img * 255).clip(0, 255).astype('uint8')\n\n    def post_process(self, raw_color, orig):\n        color_np = np.asarray(raw_color)\n        orig_np = np.asarray(orig)\n        color_yuv = cv2.cvtColor(color_np, cv2.COLOR_BGR2YUV)\n        orig_yuv = cv2.cvtColor(orig_np, cv2.COLOR_BGR2YUV)\n        hires = np.copy(orig_yuv)\n        hires[:, :, 1:3] = color_yuv[:, :, 1:3]\n        final = cv2.cvtColor(hires, cv2.COLOR_YUV2BGR)\n        return final\n\n    def run_image(self, img):\n        if isinstance(img, str):\n            ori_img = Image.open(img).convert('LA').convert('RGB')\n        elif isinstance(img, np.ndarray):\n            ori_img = Image.fromarray(img).convert('LA').convert('RGB')\n        elif isinstance(img, Image.Image):\n            ori_img = img\n\n        img = self.norm(ori_img, self.render_factor)\n        x = paddle.to_tensor(img[np.newaxis, ...])\n        out = self.model(x)\n\n        pred_img = self.denorm(out.numpy()[0])\n        
pred_img = Image.fromarray(pred_img)\n        pred_img = pred_img.resize(ori_img.size, resample=Image.BILINEAR)\n        pred_img = self.post_process(pred_img, ori_img)\n        pred_img = cv2.cvtColor(pred_img, cv2.COLOR_RGB2BGR)\n        return pred_img\n\n    def run_video(self, video):\n        base_name = os.path.basename(video).split('.')[0]\n        output_path = os.path.join(self.output, base_name)\n        pred_frame_path = os.path.join(output_path, 'frames_pred')\n\n        if not os.path.exists(output_path):\n            os.makedirs(output_path)\n\n        if not os.path.exists(pred_frame_path):\n            os.makedirs(pred_frame_path)\n\n        cap = cv2.VideoCapture(video)\n        fps = cap.get(cv2.CAP_PROP_FPS)\n\n        out_path = U.video2frames(video, output_path)\n\n        frames = sorted(glob.glob(os.path.join(out_path, '*.png')))\n\n        for frame in tqdm(frames):\n            pred_img = self.run_image(frame)\n            pred_img = cv2.cvtColor(pred_img, cv2.COLOR_BGR2RGB)\n            pred_img = Image.fromarray(pred_img)\n            frame_name = os.path.basename(frame)\n            pred_img.save(os.path.join(pred_frame_path, frame_name))\n\n        frame_pattern_combined = os.path.join(pred_frame_path, '%08d.png')\n\n        vid_out_path = os.path.join(output_path, '{}_deoldify_out.mp4'.format(base_name))\n        U.frames2video(frame_pattern_combined, vid_out_path, str(int(fps)))\n        print('Save video result at {}.'.format(vid_out_path))\n\n        return frame_pattern_combined, vid_out_path\n\n    def predict(self, input):\n\n        if not U.is_image(input):\n            return self.run_video(input)\n        else:\n            pred_img = self.run_image(input)\n\n            if self.output:\n                base_name = os.path.splitext(os.path.basename(input))[0]\n                out_path = os.path.join(self.output, base_name + '.png')\n                cv2.imwrite(out_path, pred_img)\n            return pred_img, out_path\n\n    
@serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = U.base64_to_cv2(images)\n        results = self.run_image(img=images_decode)\n        results = U.cv2_to_base64(results)\n        return results\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(image):\n            img, _ = self.predict(image.name)\n            return img\n\n        title = \"DeOldify\"\n        interface = gr.Interface(inference,\n                                 inputs=gr.inputs.Image(type=\"file\"),\n                                 outputs=gr.outputs.Image(type=\"numpy\"),\n                                 title=title,\n                                 allow_flagging='never')\n        return interface\n"
  },
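`DeOldifyPredictor.norm` and `denorm` in module.py are inverses built on the ImageNet channel statistics. A self-contained numpy sketch (mirroring the module's arithmetic rather than importing it) shows the round trip recovers the input to within one intensity level of float truncation:

```python
import numpy as np

# ImageNet channel statistics used by DeOldifyPredictor.norm / denorm.
MEAN = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
STD = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))


def norm(img_hwc):
    # HWC uint8 in [0, 255] -> normalized CHW float32 (the module's preprocessing).
    img = img_hwc.transpose(2, 0, 1).astype('float32') / 255.0
    return ((img - MEAN) / STD).astype('float32')


def denorm(img_chw):
    # Invert the normalization and return HWC uint8 (the module's postprocessing).
    img = img_chw * STD + MEAN
    img = img.transpose(1, 2, 0)
    return (img * 255).clip(0, 255).astype('uint8')


rng = np.random.default_rng(0)
orig = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)
roundtrip = denorm(norm(orig))  # matches orig up to float truncation
```

The `clip(0, 255)` before the uint8 cast matters: without it, tiny negative float errors would wrap around to values near 255.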
  {
    "path": "modules/image/Image_editing/colorization/deoldify/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"deoldify\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_run_image1(self):\n        results = self.module.run_image(img='tests/test.jpg')\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image2(self):\n        results = self.module.run_image(img=cv2.imread('tests/test.jpg'))\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image3(self):\n        self.assertRaises(FileNotFoundError, self.module.run_image, img='no.jpg')\n\n    def test_predict1(self):\n        pred_img, out_path = self.module.predict(input='tests/test.jpg')\n        self.assertIsInstance(pred_img, np.ndarray)\n        self.assertIsInstance(out_path, str)\n\n    def test_predict2(self):\n        self.assertRaises(RuntimeError, self.module.predict, input='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/deoldify/utils.py",
    "content": "import base64\nimport os\nimport sys\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom PIL import Image\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef is_listy(x):\n    return isinstance(x, (tuple, list))\n\n\nclass Hook():\n    \"Create a hook on `m` with `hook_func`.\"\n\n    def __init__(self, m, hook_func, is_forward=True, detach=True):\n        self.hook_func, self.detach, self.stored = hook_func, detach, None\n        f = m.register_forward_post_hook if is_forward else m.register_backward_hook\n        self.hook = f(self.hook_fn)\n        self.removed = False\n\n    def hook_fn(self, module, input, output):\n        \"Applies `hook_func` to `module`, `input`, `output`.\"\n        if self.detach:\n            input = (o.detach() for o in input) if is_listy(input) else input.detach()\n            output = (o.detach() for o in output) if is_listy(output) else output.detach()\n        self.stored = self.hook_func(module, input, output)\n\n    def remove(self):\n        \"Remove the hook from the model.\"\n        if not self.removed:\n            self.hook.remove()\n            self.removed = True\n\n    def __enter__(self, *args):\n        return self\n\n    def __exit__(self, *args):\n        self.remove()\n\n\nclass Hooks():\n    \"Create several hooks on the modules in `ms` with `hook_func`.\"\n\n    def __init__(self, ms, hook_func, is_forward=True, detach=True):\n        self.hooks = []\n        try:\n            for m in ms:\n                self.hooks.append(Hook(m, hook_func, is_forward, detach))\n        except Exception as e:\n            pass\n\n    def __getitem__(self, i: int) -> Hook:\n        return self.hooks[i]\n\n    
def __len__(self) -> int:\n        return len(self.hooks)\n\n    def __iter__(self):\n        return iter(self.hooks)\n\n    @property\n    def stored(self):\n        return [o.stored for o in self]\n\n    def remove(self):\n        \"Remove the hooks from the model.\"\n        for h in self.hooks:\n            h.remove()\n\n    def __enter__(self, *args):\n        return self\n\n    def __exit__(self, *args):\n        self.remove()\n\n\ndef _hook_inner(m, i, o):\n    return o if isinstance(o, paddle.Tensor) else o if is_listy(o) else list(o)\n\n\ndef hook_output(module, detach=True, grad=False):\n    \"Return a `Hook` that stores activations of `module` in `self.stored`\"\n    return Hook(module, _hook_inner, detach=detach, is_forward=not grad)\n\n\ndef hook_outputs(modules, detach=True, grad=False):\n    \"Return `Hooks` that store activations of all `modules` in `self.stored`\"\n    return Hooks(modules, _hook_inner, detach=detach, is_forward=not grad)\n\n\ndef model_sizes(m, size=(64, 64)):\n    \"Pass a dummy input through the model `m` to get the various sizes of activations.\"\n    with hook_outputs(m) as hooks:\n        x = dummy_eval(m, size)\n        return [o.stored.shape for o in hooks]\n\n\ndef dummy_eval(m, size=(64, 64)):\n    \"Pass a `dummy_batch` in evaluation mode in `m` with `size`.\"\n    m.eval()\n    return m(dummy_batch(size))\n\n\ndef dummy_batch(size=(64, 64), ch_in=3):\n    \"Create a dummy batch to go through `m` with `size`.\"\n    arr = np.random.rand(1, ch_in, *size).astype('float32') * 2 - 1\n    return paddle.to_tensor(arr)\n\n\nclass _SpectralNorm(nn.SpectralNorm):\n\n    def __init__(self, weight_shape, dim=0, power_iters=1, eps=1e-12, dtype='float32'):\n        super(_SpectralNorm, self).__init__(weight_shape, dim, power_iters, eps, dtype)\n\n    def forward(self, weight):\n        inputs = {'Weight': weight, 'U': self.weight_u, 'V': self.weight_v}\n        out = self._helper.create_variable_for_type_inference(self._dtype)\n      
  _power_iters = self._power_iters if self.training else 0\n        self._helper.append_op(type=\"spectral_norm\",\n                               inputs=inputs,\n                               outputs={\n                                   \"Out\": out,\n                               },\n                               attrs={\n                                   \"dim\": self._dim,\n                                   \"power_iters\": _power_iters,\n                                   \"eps\": self._eps,\n                               })\n\n        return out\n\n\nclass Spectralnorm(paddle.nn.Layer):\n\n    def __init__(self, layer, dim=0, power_iters=1, eps=1e-12, dtype='float32'):\n        super(Spectralnorm, self).__init__()\n        self.spectral_norm = _SpectralNorm(layer.weight.shape, dim, power_iters, eps, dtype)\n        self.dim = dim\n        self.power_iters = power_iters\n        self.eps = eps\n        self.layer = layer\n        weight = layer._parameters['weight']\n        del layer._parameters['weight']\n        self.weight_orig = self.create_parameter(weight.shape, dtype=weight.dtype)\n        self.weight_orig.set_value(weight)\n\n    def forward(self, x):\n        weight = self.spectral_norm(self.weight_orig)\n        self.layer.weight = weight\n        out = self.layer(x)\n        return out\n\n\ndef video2frames(video_path, outpath, **kargs):\n\n    def _dict2str(kargs):\n        cmd_str = ''\n        for k, v in kargs.items():\n            cmd_str += (' ' + str(k) + ' ' + str(v))\n        return cmd_str\n\n    ffmpeg = ['ffmpeg ', ' -y -loglevel ', ' error ']\n    vid_name = video_path.split('/')[-1].split('.')[0]\n    out_full_path = os.path.join(outpath, vid_name)\n\n    if not os.path.exists(out_full_path):\n        os.makedirs(out_full_path)\n\n    # video file name\n    outformat = out_full_path + '/%08d.png'\n\n    cmd = ffmpeg\n    cmd = ffmpeg + [' -i ', video_path, ' -start_number ', ' 0 ', outformat]\n\n    cmd = ''.join(cmd) + 
_dict2str(kargs)\n\n    if os.system(cmd) != 0:\n        raise RuntimeError('ffmpeg process video: {} error'.format(vid_name))\n\n    sys.stdout.flush()\n    return out_full_path\n\n\ndef frames2video(frame_path, video_path, r):\n    ffmpeg = ['ffmpeg ', ' -y -loglevel ', ' error ']\n    cmd = ffmpeg + [' -r ', r, ' -f ', ' image2 ', ' -i ', frame_path, ' -pix_fmt ', ' yuv420p ', video_path]\n    cmd = ''.join(cmd)\n\n    if os.system(cmd) != 0:\n        raise RuntimeError('ffmpeg process video: {} error'.format(video_path))\n\n    sys.stdout.flush()\n\n\ndef is_image(input):\n    try:\n        img = Image.open(input)\n        _ = img.size\n\n        return True\n    except:\n        return False\n"
  },
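`cv2_to_base64`/`base64_to_cv2` in utils.py shuttle JPEG buffers through base64 for serving. The sketch below isolates just the base64 leg (`bytes_to_base64`/`base64_to_array` are illustrative names, not part of the module), using `np.frombuffer` as the non-deprecated replacement for `np.fromstring`:

```python
import base64

import numpy as np


def bytes_to_base64(data: bytes) -> str:
    # Encode raw bytes as a base64 string, as the serving helpers do with JPEG buffers.
    return base64.b64encode(data).decode('utf8')


def base64_to_array(b64str: str) -> np.ndarray:
    # Decode back to a uint8 array; np.frombuffer replaces the deprecated
    # np.fromstring call and avoids an extra copy (the result is read-only).
    return np.frombuffer(base64.b64decode(b64str.encode('utf8')), dtype=np.uint8)


payload = bytes(range(16))
decoded = base64_to_array(bytes_to_base64(payload))
```

In the module the decoded uint8 buffer is then handed to `cv2.imdecode` to reconstruct the image, which is omitted here to keep the sketch stdlib-plus-numpy only.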
  {
    "path": "modules/image/Image_editing/colorization/photo_restoration/README.md",
    "content": "# photo_restoration\n\n|模型名称|photo_restoration|\n| :--- | :---: | \n|类别|图像-图像编辑|\n|网络|基于deoldify和realsr模型|\n|是否支持Fine-tuning|否|\n|模型大小|64MB+834MB|\n|指标|-|\n|最新更新日期|2021-08-19|\n\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130897828-d0c86b81-63d1-4e9a-8095-bc000b8c7ca8.jpg\" width = \"260\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130897762-5c9fa711-62bc-4067-8d44-f8feff8c574c.png\" width = \"260\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### 模型介绍\n\n    - photo_restoration 是针对老照片修复的模型。它主要由两个部分组成：着色和超分。着色模型基于deoldify\n    ，超分模型基于realsr. 用户可以根据自己的需求选择对图像进行着色或超分操作。因此在使用该模型时，请预先安装deoldify和realsr两个模型。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - NOTE: 使用该模型需要自行安装ffmpeg，若您使用conda环境，推荐使用如下语句进行安装。\n\n      ```shell\n      $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n      ```\n   \n- ### 2、安装\n    - ```shell\n      $ hub install photo_restoration\n      ```\n      \n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n  - ### 1、预测代码示例\n\n    - ```python\n      import cv2\n      import paddlehub as hub\n\n      model = hub.Module(name='photo_restoration', visualization=True)\n      im = cv2.imread('/PATH/TO/IMAGE')\n      res = model.run_image(im)\n\n      ```\n- ### 2、API\n\n\n    ```python\n    def run_image(self,\n                  input,\n                  model_select= ['Colorization', 'SuperResolution'],\n                  save_path = 'photo_restoration'):\n    ```\n\n    - 预测API，用于图片修复。\n\n    - **参数**\n\n        - input (numpy.ndarray｜str): 图片数据，numpy.ndarray 或者 str形式。ndarray.shape 为 \\[H, W, C\\]，BGR格式; 
str为图片的路径。\n\n        - model_select (list\[str\]): 选择对图片的操作，\['Colorization'\]对图像只进行着色操作， \['SuperResolution'\]对图像只进行超分操作；\n        默认值为\['Colorization', 'SuperResolution'\]。\n\n        - save_path (str): 保存图片的路径, 默认为'photo_restoration'。\n\n    - **返回**\n\n        - output (numpy.ndarray): 照片修复结果，ndarray.shape 为 \[H, W, C\]，BGR格式。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个照片修复的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n        - ```shell\n          $ hub serving start -m photo_restoration\n          ```\n\n        - 这样就完成了一个照片修复的服务化API的部署，默认端口号为8866。\n\n        - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('PATH/TO/IMAGE')\n        data = {'images':cv2_to_base64(org_im), 'model_select': ['Colorization', 'SuperResolution']}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/photo_restoration\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        img = base64_to_cv2(r.json()[\"results\"])\n        cv2.imwrite('PATH/TO/SAVE/IMAGE', img)\n        ```\n\n## 五、更新历史\n\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  适配paddlehub2.0版本\n\n  * ```shell\n    $ hub install photo_restoration==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/photo_restoration/README_en.md",
    "content": "# photo_restoration\n\n|Module Name|photo_restoration|\n| :--- | :---: | \n|Category|Image editing|\n|Network|deoldify and realsr|\n|Fine-tuning supported or not|No|\n|Module Size |64MB+834MB|\n|Data indicators|-|\n|Latest update date|2021-08-19|\n\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130897828-d0c86b81-63d1-4e9a-8095-bc000b8c7ca8.jpg\" width = \"260\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130897762-5c9fa711-62bc-4067-8d44-f8feff8c574c.png\" width = \"260\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### Module Introduction\n\n    - Photo_restoration can restore old photos. It mainly consists of two parts: coloring and super-resolution. The coloring model is deoldify\n     , and super resolution model is realsr. Therefore, when using this model, please install deoldify and realsr in advance.\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - NOTE: This Module relies on ffmpeg, Please install ffmpeg before using this Module.\n\n      ```shell\n      $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n      ```\n   \n- ### 2、Installation\n\n    - ```shell\n      $ hub install photo_restoration\n      ```\n      \n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='photo_restoration', visualization=True)\n    im = cv2.imread('/PATH/TO/IMAGE')\n    res = model.run_image(im)\n\n    ```\n- ### 2、API\n\n\n  - ```python\n    def run_image(self,\n                  input,\n                  model_select= ['Colorization', 'SuperResolution'],\n                  save_path = 'photo_restoration'):\n    ```\n\n    - Predicition API,  produce repaired photos.\n\n    - **Parameter**\n\n        - input (numpy.ndarray｜str): Image data，numpy.ndarray or str. ndarray.shape is in the format [H, W, C], BGR.\n\n        - model_select (list\\[str\\]): Mode selection，\\['Colorization'\\] only colorize the input image， \\['SuperResolution'\\] only increase the image resolution；\n        default is \\['Colorization', 'SuperResolution'\\]。\n\n        - save_path (str): Save path, default is 'photo_restoration'.\n\n     - **Return**\n\n        - output (numpy.ndarray): Restoration result，ndarray.shape is in the format [H, W, C], BGR.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of photo restoration.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m photo_restoration\n          ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('PATH/TO/IMAGE')\n        data = {'images':cv2_to_base64(org_im), 'model_select': ['Colorization', 'SuperResolution']}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/photo_restoration\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        img = base64_to_cv2(r.json()[\"results\"])\n        cv2.imwrite('PATH/TO/SAVE/IMAGE', img)\n        ```\n\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n\n- 1.1.0\n\n  Adapt to paddlehub2.0\n\n  * ```shell\n    $ hub install photo_restoration==1.1.0\n    ```\n\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/photo_restoration/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\n\nimport cv2\nimport paddle.nn as nn\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom . import utils as U\n\n\n@moduleinfo(name=\"photo_restoration\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"photo_restoration is a photo restoration model based on deoldify and realsr.\",\n            version=\"1.1.0\")\nclass PhotoRestoreModel(nn.Layer):\n    \"\"\"\n    PhotoRestoreModel\n\n    Args:\n        load_checkpoint(str): Checkpoint save path, default is None.\n        visualization (bool): Whether to save the estimation result. 
Default is True.\n    \"\"\"\n\n    def __init__(self, visualization: bool = False):\n        super(PhotoRestoreModel, self).__init__()\n        self.deoldify = hub.Module(name='deoldify')\n        self.realsr = hub.Module(name='realsr')\n        self.visualization = visualization\n\n    def run_image(self,\n                  input,\n                  model_select: list = ['Colorization', 'SuperResolution'],\n                  save_path: str = 'photo_restoration'):\n        self.models = []\n        for model in model_select:\n            print('\\n {} model proccess start..'.format(model))\n            if model == 'Colorization':\n                self.deoldify.eval()\n                self.models.append(self.deoldify)\n            if model == 'SuperResolution':\n                self.realsr.eval()\n                self.models.append(self.realsr)\n\n        for model in self.models:\n            output = model.run_image(input)\n            input = output\n        if self.visualization:\n            if not os.path.exists(save_path):\n                os.mkdir(save_path)\n            img_name = str(time.time()) + '.png'\n            save_img = os.path.join(save_path, img_name)\n            cv2.imwrite(save_img, output)\n            print(\"save result at: \", save_img)\n\n        return output\n\n    @serving\n    def serving_method(self, images, model_select):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        print(model_select)\n        images_decode = U.base64_to_cv2(images)\n        results = self.run_image(input=images_decode, model_select=model_select)\n        results = U.cv2_to_base64(results)\n        return results\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/photo_restoration/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"photo_restoration\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('photo_restoration')\n\n    def test_run_image1(self):\n        results = self.module.run_image(\n            input='tests/test.jpg'\n        )\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image2(self):\n        results = self.module.run_image(\n            input=cv2.imread('tests/test.jpg')\n        )\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image3(self):\n        self.assertRaises(\n            FileNotFoundError,\n            self.module.run_image,\n            input='no.jpg'\n        )\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/photo_restoration/utils.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/user_guided_colorization/README.md",
    "content": "# user_guided_colorization\n\n|模型名称|user_guided_colorization|\n| :--- | :---: | \n|类别|图像-图像编辑|\n|网络| Local and Global Hints Network |\n|数据集|ILSVRC 2012|\n|是否支持Fine-tuning|是|\n|模型大小|131MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n- ### 应用效果展示\n  \n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136653401-6644bd46-d280-4c15-8d48-680b7eb152cb.png\" width = \"300\" height = \"450\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136648959-40493c9c-08ec-46cd-a2a2-5e2038dcbfa7.png\" width = \"300\" height = \"450\" hspace='10'/>\n    </p>\n\n  - user_guided_colorization 是基于\"Real-Time User-Guided Image Colorization with Learned Deep Priors\"的着色模型，该模型利用预先提供的着色块对图像进行着色。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install user_guided_colorization\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run user_guided_colorization --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n\n    if __name__ == '__main__':\n\n        model = hub.Module(name='user_guided_colorization')\n        model.set_config(prob=0.1)\n        result = model.predict(images=['/PATH/TO/IMAGE'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用user_guided_colorization模型对[Canvas](../../docs/reference/datasets.md#class-hubdatasetsCanvas)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transform = 
T.Compose([T.Resize((256, 256), interpolation='NEAREST'),\n                       T.RandomPaddingCrop(crop_size=176),\n                       T.RGB2LAB()], to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Canvas\n\n              color_set = Canvas(transform=transform, mode='train')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * `hub.datasets.Canvas()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name='user_guided_colorization', load_checkpoint=None)\n              model.set_config(classification=True, prob=1)\n              ```\n                * `name`:加载模型的名字。\n                * `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n                * `classification`: 着色模型分两部分训练，开始阶段`classification`设置为True, 用于浅层网络训练。训练后期将`classification`设置为False, 用于训练网络的输出层。\n                * `prob`: 每张输入图不加一个先验彩色块的概率，默认为1，即不加入先验彩色块。例如，当`prob`设定为0.9时，一张图上有两个先验彩色块的概率为(1-0.9)*(1-0.9)*0.9=0.009.\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\n            trainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 
主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: workers的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n\n              if __name__ == '__main__':\n                  model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n                  model.set_config(prob=0.1)\n                  result = model.predict(images=['house.png'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。若想获取油画风着色效果，请下载参数文件[油画着色](https://paddlehub.bj.bcebos.com/dygraph/models/canvas_rc.pdparams)\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线着色任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m user_guided_colorization\n      ```\n\n    - 这样就完成了一个着色任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n       ```python\n       import requests\n       import json\n       import cv2\n       import base64\n       import numpy as np\n\n       def cv2_to_base64(image):\n           data = cv2.imencode('.jpg', image)[1]\n           return base64.b64encode(data.tobytes()).decode('utf8')\n\n       def base64_to_cv2(b64str):\n           data = base64.b64decode(b64str.encode('utf8'))\n           data = np.frombuffer(data, np.uint8)\n           data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n           return data\n\n       # 发送HTTP请求\n       org_im = cv2.imread('/PATH/TO/IMAGE')\n       data = {'images':[cv2_to_base64(org_im)]}\n       headers = {\"Content-type\": \"application/json\"}\n       url = \"http://127.0.0.1:8866/predict/user_guided_colorization\"\n       r = requests.post(url=url, headers=headers, data=json.dumps(data))\n       data = base64_to_cv2(r.json()[\"results\"]['data'][0]['fake_reg'])\n       cv2.imwrite('color.png', data)\n       ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install user_guided_colorization==1.0.0\n    ```\n\n\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/user_guided_colorization/README_en.md",
    "content": "# user_guided_colorization\n\n|Module Name|user_guided_colorization|\n| :--- | :---: | \n|Category |Image editing|\n|Network| Local and Global Hints Network |\n|Dataset|ILSVRC 2012|\n|Fine-tuning supported or notFine-tuning|Yes|\n|Module Size|131MB|\n|Data indicators|-|\n|Latest update date |2021-02-26|\n\n\n\n## I. Basic Information \n\n\n- ### Application Effect Display\n  \n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136653401-6644bd46-d280-4c15-8d48-680b7eb152cb.png\" width = \"300\" height = \"450\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136648959-40493c9c-08ec-46cd-a2a2-5e2038dcbfa7.png\" width = \"300\" height = \"450\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - User_guided_colorization is a colorization model based on \"Real-Time User-Guided Image Colorization with Learned Deep Priors\"，this model uses pre-supplied coloring blocks to color the gray image.\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install user_guided_colorization\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)   \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    ```shell\n    $ hub run user_guided_colorization --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n     - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n\n    if __name__ == '__main__':\n\n        model = hub.Module(name='user_guided_colorization')\n        model.set_config(prob=0.1)\n        result = model.predict(images=['/PATH/TO/IMAGE'])\n    ```\n- ### 3.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the user_guided_colorization model to fine-tune datasets such as [Canvas](../../docs/reference/datasets.md#class-hubdatasetsCanvas) by executing `python train.py`.\n\n    - Steps:\n\n        - Step1: Define the data preprocessing method\n\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transform = T.Compose([T.Resize((256, 256), interpolation='NEAREST'),\n                       T.RandomPaddingCrop(crop_size=176),\n                       T.RGB2LAB()], to_rgb=True)\n              ```\n\n              - `transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n        - Step2: Download the dataset\n            - ```python\n              from paddlehub.datasets import Canvas\n\n              color_set = Canvas(transform=transform, mode='train')\n              ```\n\n                * `transforms`: Data preprocessing methods.\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n                * `hub.datasets.Canvas()`: The dataset will be automatically downloaded from the network and decompressed to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              model = hub.Module(name='user_guided_colorization', load_checkpoint=None)\n              model.set_config(classification=True, prob=1)\n              ```\n                * `name`: Model name.\n                * `load_checkpoint`: Whether to load the self-trained model; if it is None, the provided parameters are loaded.\n                * `classification`: The model is trained in two stages. At the beginning, `classification` is set to True, which is used for shallow network training. In the later stage of training, set `classification` to False, which is used to train the output layer of the network.\n                * `prob`: The probability that a priori color block is not added to each input image, the default is 1, that is, no prior color block is added. 
For example, when `prob` is set to 0.9, the probability that there are two a priori color blocks on a picture is (1-0.9)*(1-0.9)*0.9=0.009.\n\n        - Step4: Optimization strategy\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_colorization_ckpt_cls_1')\n            trainer.train(color_set, epochs=201, batch_size=25, eval_dataset=color_set, log_interval=10, save_interval=10)\n            ```\n\n\n            - Run configuration\n\n            - `Trainer` mainly controls the training of Fine-tune, including the following controllable parameters:\n\n                * `model`: Optimized model.\n                * `optimizer`: Optimizer selection.\n                * `use_vdl`: Whether to use vdl to visualize the training process.\n                * `checkpoint_dir`: The storage address of the model parameters.\n                * `compare_metrics`: The measurement index of the optimal model.\n\n            - `trainer.train` mainly controls the specific training process, including the following controllable parameters:\n\n                * `train_dataset`: Training dataset.\n                * `epochs`: Epochs of training process.\n                * `batch_size`: Batch size.\n                * `num_workers`: Number of workers.\n                * `eval_dataset`: Validation dataset.\n                * `log_interval`: The interval for printing logs.\n                * `save_interval`: The interval for saving model parameters.\n\n    - Model prediction\n\n        -   When Fine-tune is completed, the model with the best performance on the verification set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n\n              if __name__ == '__main__':\n                  model = hub.Module(name='user_guided_colorization', load_checkpoint='/PATH/TO/CHECKPOINT')\n                  model.set_config(prob=0.1)\n                  result = model.predict(images=['/PATH/TO/IMAGE'])\n              ```\n\n\n            - **NOTE:** If you want to get the oil painting style, please download the parameter file [Canvas colorization](https://paddlehub.bj.bcebos.com/dygraph/models/canvas_rc.pdparams)\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of colorization.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m user_guided_colorization\n        ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n      - ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/user_guided_colorization\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = base64_to_cv2(r.json()[\"results\"]['data'][0]['fake_reg'])\n        cv2.imwrite('color.png', data)\n        ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n\n  - ```shell\n    $ hub install user_guided_colorization==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/user_guided_colorization/data_feed.py",
    "content": "import paddle\nimport numpy as np\n\n\nclass ColorizeHint:\n    \"\"\"Get hint and mask images for colorization.\n\n    This method is prepared for user guided colorization tasks. Take the original RGB images as imput,\n    we will obtain the local hints and correspoding mask to guid colorization process.\n\n    Args:\n       percent(float): Probability for ignoring hint in an iteration.\n       num_points(int): Number of selected hints in an iteration.\n       samp(str): Sample method, default is normal.\n       use_avg(bool): Whether to use mean in selected hint area.\n\n    Return:\n        hint(np.ndarray): hint images\n        mask(np.ndarray): mask images\n    \"\"\"\n\n    def __init__(self, percent: float, num_points: int = None, samp: str = 'normal', use_avg: bool = True):\n        self.percent = percent\n        self.num_points = num_points\n        self.samp = samp\n        self.use_avg = use_avg\n\n    def __call__(self, data: np.ndarray, hint: np.ndarray, mask: np.ndarray):\n        sample_Ps = [1, 2, 3, 4, 5, 6, 7, 8, 9]\n        self.data = data\n        self.hint = hint\n        self.mask = mask\n        N, C, H, W = data.shape\n        for nn in range(N):\n            pp = 0\n            cont_cond = True\n            while cont_cond:\n                if self.num_points is None:  # draw from geometric\n                    # embed()\n                    cont_cond = np.random.rand() < (1 - self.percent)\n                else:  # add certain number of points\n                    cont_cond = pp < self.num_points\n                if not cont_cond:  # skip out of loop if condition not met\n                    continue\n                P = np.random.choice(sample_Ps)  # patch size\n                # sample location\n                if self.samp == 'normal':  # geometric distribution\n                    h = int(np.clip(np.random.normal((H - P + 1) / 2., (H - P + 1) / 4.), 0, H - P))\n                    w = int(np.clip(np.random.normal((W - 
P + 1) / 2., (W - P + 1) / 4.), 0, W - P))\n                else:  # uniform distribution\n                    h = np.random.randint(H - P + 1)\n                    w = np.random.randint(W - P + 1)\n                # add color point\n                if self.use_avg:\n                    # embed()\n                    hint[nn, :, h:h + P, w:w + P] = np.mean(\n                        np.mean(data[nn, :, h:h + P, w:w + P], axis=2, keepdims=True), axis=1, keepdims=True).reshape(\n                            1, C, 1, 1)\n                else:\n                    hint[nn, :, h:h + P, w:w + P] = data[nn, :, h:h + P, w:w + P]\n                mask[nn, :, h:h + P, w:w + P] = 1\n                # increment counter\n                pp += 1\n\n        mask -= 0.5\n        return hint, mask\n\n\nclass ColorizePreprocess:\n    \"\"\"Prepare dataset for image Colorization.\n\n    Args:\n       ab_thresh(float): Thresh value for setting mask value.\n       p(float): Probability for ignoring hint in an iteration.\n       points(int): Number of selected hints in an iteration.\n       samp(str): Sample method, default is normal.\n       use_avg(bool): Whether to use mean in selected hint area.\n\n    Return:\n        data(dict): The preprocessed data for colorization.\n\n    \"\"\"\n\n    def __init__(self,\n                 ab_thresh: float = 0.,\n                 p: float = 0.,\n                 points: int = None,\n                 samp: str = 'normal',\n                 use_avg: bool = True):\n        self.ab_thresh = ab_thresh\n        self.p = p\n        self.num_points = points\n        self.samp = samp\n        self.use_avg = use_avg\n        self.gethint = ColorizeHint(percent=self.p, num_points=self.num_points, samp=self.samp, use_avg=self.use_avg)\n\n    def __call__(self, data_lab):\n        \"\"\"\n        This method separates the L channel and AB channel, and obtains hint, mask and real_B_enc as the input for the colorization task.\n\n        Args:\n           data_lab(np.ndarray|paddle.Tensor): LAB image.\n\n        Returns:\n            data(dict): The preprocessed data for colorization.\n        \"\"\"\n        if type(data_lab) is not np.ndarray:\n            data_lab = data_lab.numpy()\n        data = {}\n        A = 2 * 110 / 10 + 1\n        data['A'] = data_lab[:, [0], :, :]\n        data['B'] = data_lab[:, 1:, :, :]\n        if self.ab_thresh > 0:  # mask out grayscale images\n            thresh = 1. * self.ab_thresh / 110\n            mask = np.sum(\n                np.abs(np.max(np.max(data['B'], axis=3), axis=2) - np.min(np.min(data['B'], axis=3), axis=2)), axis=1)\n            mask = (mask >= thresh)\n            data['A'] = data['A'][mask, :, :, :]\n            data['B'] = data['B'][mask, :, :, :]\n            if np.sum(mask) == 0:\n                return None\n        data_ab_rs = np.round((data['B'][:, :, ::4, ::4] * 110. + 110.) / 10.)  # normalized bin number\n        data['real_B_enc'] = data_ab_rs[:, [0], :, :] * A + data_ab_rs[:, [1], :, :]\n        data['hint_B'] = np.zeros(shape=data['B'].shape)\n        data['mask_B'] = np.zeros(shape=data['A'].shape)\n        data['hint_B'], data['mask_B'] = self.gethint(data['B'], data['hint_B'], data['mask_B'])\n        data['A'] = paddle.to_tensor(data['A'].astype(np.float32))\n        data['B'] = paddle.to_tensor(data['B'].astype(np.float32))\n        data['real_B_enc'] = paddle.to_tensor(data['real_B_enc'].astype(np.int64))\n        data['hint_B'] = paddle.to_tensor(data['hint_B'].astype(np.float32))\n        data['mask_B'] = paddle.to_tensor(data['mask_B'].astype(np.float32))\n        return data\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/user_guided_colorization/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle.nn import Conv2D, Conv2DTranspose\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.transforms as T\nfrom paddlehub.module.cv_module import ImageColorizeModule\nfrom .data_feed import ColorizePreprocess\n\n\n@moduleinfo(\n    name=\"user_guided_colorization\",\n    type=\"CV/image_editing\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"User_guided_colorization is a image colorization model, this module is trained with ILSVRC2012 dataset.\",\n    version=\"1.0.0\",\n    meta=ImageColorizeModule)\nclass UserGuidedColorization(nn.Layer):\n    \"\"\"\n    Userguidedcolorization, see https://github.com/haoyuying/colorization-pytorch\n\n    Args:\n        use_tanh (bool): Whether to use tanh as final activation function.\n        load_checkpoint (str): Pretrained checkpoint path.\n\n    \"\"\"\n\n    def __init__(self, use_tanh: bool = True, load_checkpoint: str = None):\n        super(UserGuidedColorization, self).__init__()\n        self.input_nc = 4\n        self.output_nc = 2\n        # Conv1\n        model1 = (\n            Conv2D(self.input_nc, 64, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(64, 64, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(64),\n        )\n\n        # Conv2\n        model2 = (\n   
         Conv2D(64, 128, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(128, 128, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(128),\n        )\n\n        # Conv3\n        model3 = (\n            Conv2D(128, 256, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(256, 256, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(256, 256, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(256),\n        )\n\n        # Conv4\n        model4 = (\n            Conv2D(256, 512, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(512),\n        )\n\n        # Conv5\n        model5 = (\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            nn.BatchNorm(512),\n        )\n\n        # Conv6\n        model6 = (\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 2, 2),\n            nn.ReLU(),\n            nn.BatchNorm(512),\n        )\n\n        # Conv7\n        model7 = (\n            Conv2D(512, 512, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(512, 512, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(512),\n        )\n\n        # Conv8\n        model8up = (Conv2DTranspose(512, 256, kernel_size=4, stride=2, padding=1), )\n        model3short8 = (Conv2D(256, 256, 3, 1, 1), )\n        model8 = (\n            nn.ReLU(),\n            Conv2D(256, 256, 3, 1, 1),\n            nn.ReLU(),\n            Conv2D(256, 256, 3, 1, 1),\n            nn.ReLU(),\n            nn.BatchNorm(256),\n        )\n\n        # Conv9\n        model9up = (Conv2DTranspose(256, 128, 
kernel_size=4, stride=2, padding=1), )\n        model2short9 = (Conv2D(\n            128,\n            128,\n            3,\n            1,\n            1,\n        ), )\n        model9 = (nn.ReLU(), Conv2D(128, 128, 3, 1, 1), nn.ReLU(), nn.BatchNorm(128))\n\n        # Conv10\n        model10up = (Conv2DTranspose(128, 128, kernel_size=4, stride=2, padding=1), )\n        model1short10 = (Conv2D(64, 128, 3, 1, 1), )\n        model10 = (nn.ReLU(), Conv2D(128, 128, 3, 1, 1), nn.LeakyReLU(negative_slope=0.2))\n        model_class = (Conv2D(256, 529, 1), )\n\n        if use_tanh:\n            model_out = (Conv2D(128, 2, 1, 1, 0, 1), nn.Tanh())\n        else:\n            model_out = (Conv2D(128, 2, 1, 1, 0, 1), )\n\n        self.model1 = nn.Sequential(*model1)\n        self.model2 = nn.Sequential(*model2)\n        self.model3 = nn.Sequential(*model3)\n        self.model4 = nn.Sequential(*model4)\n        self.model5 = nn.Sequential(*model5)\n        self.model6 = nn.Sequential(*model6)\n        self.model7 = nn.Sequential(*model7)\n        self.model8up = nn.Sequential(*model8up)\n        self.model8 = nn.Sequential(*model8)\n        self.model9up = nn.Sequential(*model9up)\n        self.model9 = nn.Sequential(*model9)\n        self.model10up = nn.Sequential(*model10up)\n        self.model10 = nn.Sequential(*model10)\n        self.model3short8 = nn.Sequential(*model3short8)\n        self.model2short9 = nn.Sequential(*model2short9)\n        self.model1short10 = nn.Sequential(*model1short10)\n        self.model_class = nn.Sequential(*model_class)\n        self.model_out = nn.Sequential(*model_out)\n\n        self.set_config()\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'user_guided.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n  
          self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: str):\n        transform = T.Compose([T.Resize((256, 256), interpolation='NEAREST'), T.RGB2LAB()], to_rgb=True)\n        return transform(images)\n\n    def set_config(self, classification: bool = True, prob: float = 1., num_point: int = None):\n        self.classification = classification\n        self.pre_func = ColorizePreprocess(ab_thresh=0., p=prob, points=num_point)\n\n    def preprocess(self, inputs: paddle.Tensor):\n        output = self.pre_func(inputs)\n        return output\n\n    def forward(self,\n                input_A: paddle.Tensor,\n                input_B: paddle.Tensor,\n                mask_B: paddle.Tensor,\n                real_b: paddle.Tensor = None,\n                real_B_enc: paddle.Tensor = None) -> paddle.Tensor:\n        conv1_2 = self.model1(paddle.concat([input_A, input_B, mask_B], axis=1))\n        conv2_2 = self.model2(conv1_2[:, :, ::2, ::2])\n        conv3_3 = self.model3(conv2_2[:, :, ::2, ::2])\n        conv4_3 = self.model4(conv3_3[:, :, ::2, ::2])\n        conv5_3 = self.model5(conv4_3)\n        conv6_3 = self.model6(conv5_3)\n        conv7_3 = self.model7(conv6_3)\n        conv8_up = self.model8up(conv7_3) + self.model3short8(conv3_3)\n        conv8_3 = self.model8(conv8_up)\n\n        if self.classification:\n            out_class = self.model_class(conv8_3)\n            conv9_up = self.model9up(conv8_3.detach()) + self.model2short9(conv2_2.detach())\n            conv9_3 = self.model9(conv9_up)\n            conv10_up = self.model10up(conv9_3) + self.model1short10(conv1_2.detach())\n            conv10_2 = self.model10(conv10_up)\n            out_reg = self.model_out(conv10_2)\n        else:\n            out_class = self.model_class(conv8_3.detach())\n            conv9_up = self.model9up(conv8_3) + self.model2short9(conv2_2)\n            conv9_3 = self.model9(conv9_up)\n        
    conv10_up = self.model10up(conv9_3) + self.model1short10(conv1_2)\n            conv10_2 = self.model10(conv10_up)\n            out_reg = self.model_out(conv10_2)\n\n        return out_class, out_reg\n"
  },
  {
    "path": "modules/image/Image_editing/colorization/user_guided_colorization/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"user_guided_colorization\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('colorization')\n\n    def test_predict1(self):\n        results = self.module.predict(\n            images=['tests/test.jpg'],\n            visualization=False\n        )\n        gray = results[0]['gray']\n        hint = results[0]['hint']\n        real = results[0]['real']\n        fake_reg = results[0]['fake_reg']\n\n        self.assertIsInstance(gray, np.ndarray)\n        self.assertIsInstance(hint, np.ndarray)\n        self.assertIsInstance(real, np.ndarray)\n        self.assertIsInstance(fake_reg, np.ndarray)\n\n    def test_predict2(self):\n        results = self.module.predict(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        gray = results[0]['gray']\n        hint = results[0]['hint']\n        real = results[0]['real']\n        fake_reg = results[0]['fake_reg']\n\n        self.assertIsInstance(gray, np.ndarray)\n        self.assertIsInstance(hint, np.ndarray)\n        self.assertIsInstance(real, np.ndarray)\n        self.assertIsInstance(fake_reg, np.ndarray)\n\n    def test_predict3(self):\n        results = self.module.predict(\n            
images=[cv2.imread('tests/test.jpg')],\n            visualization=True\n        )\n        gray = results[0]['gray']\n        hint = results[0]['hint']\n        real = results[0]['real']\n        fake_reg = results[0]['fake_reg']\n\n        self.assertIsInstance(gray, np.ndarray)\n        self.assertIsInstance(hint, np.ndarray)\n        self.assertIsInstance(real, np.ndarray)\n        self.assertIsInstance(fake_reg, np.ndarray)\n\n    def test_predict4(self):\n        self.assertRaises(\n            IndexError,\n            self.module.predict,\n            images=['no.jpg'],\n            visualization=False\n        )\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_color/README.md",
    "content": "# fbcnn_color\n\n|模型名称|fbcnn_color|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|FBCNN|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|288MB|\n|指标|-|\n|最新更新日期|2022-10-08|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/08afa15df2e54adeb39587cd7aaa9b60fc82d349bda34f51993d6304776fd374\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/f486da7c9d5e4cac8b7ff252b5a4c17633f44f28745c4e489f31e6b78caea005\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - FBCNN 是一个基于卷积神经网络的 JPEG 图像伪影去除模型，它可以预测可调整的质量因子，以控制伪影去除和细节保留之间的权衡。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2、安装\n\n    - ```shell\n      $ hub install fbcnn_color\n      ```\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run fbcnn_color \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --quality_factor -1 \\\n        --output_dir \"fbcnn_color_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"fbcnn_color\")\n    result = module.artifacts_removal(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        quality_factor=None,\n        visualization=True,\n        output_dir='fbcnn_color_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def artifacts_removal(\n        image: Union[str, numpy.ndarray],\n        quality_factor: float = None,\n        visualization: bool = True,\n        output_dir: str = \"fbcnn_color_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - 伪影去除 API\n\n    - **参数**\n\n      * image (Union\\[str, numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * quality_factor (float): 自定义质量因子（0.0 - 1.0），默认 None 为自适应；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (numpy.ndarray): 图像伪影去除结果 (BGR)；\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个图像伪影去除的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m fbcnn_color\n    ```\n\n    - 这样就完成了一个图像伪影去除服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/fbcnn_color\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # 保存结果\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## 五、参考资料\n\n* 论文：[Towards Flexible Blind JPEG Artifacts Removal](https://arxiv.org/abs/2109.14573)\n\n* 官方实现：[jiaxi-jiang/FBCNN](https://github.com/jiaxi-jiang/FBCNN)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install fbcnn_color==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_color/fbcnn.py",
    "content": "from collections import OrderedDict\n\nimport numpy as np\nimport paddle.nn as nn\n'''\n# --------------------------------------------\n# Advanced nn.Sequential\n# https://github.com/xinntao/BasicSR\n# --------------------------------------------\n'''\n\n\ndef sequential(*args):\n    \"\"\"Advanced nn.Sequential.\n    Args:\n        args: nn.Sequential or nn.Layer instances.\n    Returns:\n        nn.Sequential\n    \"\"\"\n    if len(args) == 1:\n        if isinstance(args[0], OrderedDict):\n            raise NotImplementedError('sequential does not support OrderedDict input.')\n        return args[0]  # No sequential is needed.\n    modules = []\n    for module in args:\n        if isinstance(module, nn.Sequential):\n            for submodule in module.children():\n                modules.append(submodule)\n        elif isinstance(module, nn.Layer):\n            modules.append(module)\n    return nn.Sequential(*modules)\n\n\n# --------------------------------------------\n# return nn.Sequential of (Conv + BN + ReLU)\n# --------------------------------------------\ndef conv(in_channels=64,\n         out_channels=64,\n         kernel_size=3,\n         stride=1,\n         padding=1,\n         bias=True,\n         mode='CBR',\n         negative_slope=0.2):\n    L = []\n    for t in mode:\n        if t == 'C':\n            L.append(\n                nn.Conv2D(in_channels=in_channels,\n                          out_channels=out_channels,\n                          kernel_size=kernel_size,\n                          stride=stride,\n                          padding=padding,\n                          bias_attr=bias))\n        elif t == 'T':\n            L.append(\n                nn.Conv2DTranspose(in_channels=in_channels,\n                                   out_channels=out_channels,\n                                   kernel_size=kernel_size,\n                                   stride=stride,\n                                   padding=padding,\n                          
         bias_attr=bias))\n        elif t == 'B':\n            L.append(nn.BatchNorm2D(out_channels, momentum=0.9, epsilon=1e-04))\n        elif t == 'I':\n            L.append(nn.InstanceNorm2D(out_channels))\n        elif t == 'R':\n            L.append(nn.ReLU())\n        elif t == 'r':\n            L.append(nn.ReLU())\n        elif t == 'L':\n            L.append(nn.LeakyReLU(negative_slope=negative_slope))\n        elif t == 'l':\n            L.append(nn.LeakyReLU(negative_slope=negative_slope))\n        elif t == '2':\n            L.append(nn.PixelShuffle(upscale_factor=2))\n        elif t == '3':\n            L.append(nn.PixelShuffle(upscale_factor=3))\n        elif t == '4':\n            L.append(nn.PixelShuffle(upscale_factor=4))\n        elif t == 'U':\n            L.append(nn.Upsample(scale_factor=2, mode='nearest'))\n        elif t == 'u':\n            L.append(nn.Upsample(scale_factor=3, mode='nearest'))\n        elif t == 'v':\n            L.append(nn.Upsample(scale_factor=4, mode='nearest'))\n        elif t == 'M':\n            L.append(nn.MaxPool2D(kernel_size=kernel_size, stride=stride, padding=0))\n        elif t == 'A':\n            L.append(nn.AvgPool2D(kernel_size=kernel_size, stride=stride, padding=0))\n        else:\n            raise NotImplementedError('Undefined type: {}'.format(t))\n    return sequential(*L)\n\n\n# --------------------------------------------\n# Res Block: x + conv(relu(conv(x)))\n# --------------------------------------------\nclass ResBlock(nn.Layer):\n\n    def __init__(self,\n                 in_channels=64,\n                 out_channels=64,\n                 kernel_size=3,\n                 stride=1,\n                 padding=1,\n                 bias=True,\n                 mode='CRC',\n                 negative_slope=0.2):\n        super(ResBlock, self).__init__()\n\n        assert in_channels == out_channels, 'Only support in_channels==out_channels.'\n        if mode[0] in ['R', 'L']:\n        
    mode = mode[0].lower() + mode[1:]\n\n        self.res = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n\n    def forward(self, x):\n        res = self.res(x)\n        return x + res\n\n\n# --------------------------------------------\n# conv + subp (+ relu)\n# --------------------------------------------\ndef upsample_pixelshuffle(in_channels=64,\n                          out_channels=3,\n                          kernel_size=3,\n                          stride=1,\n                          padding=1,\n                          bias=True,\n                          mode='2R',\n                          negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    up1 = conv(in_channels,\n               out_channels * (int(mode[0])**2),\n               kernel_size,\n               stride,\n               padding,\n               bias,\n               mode='C' + mode,\n               negative_slope=negative_slope)\n    return up1\n\n\n# --------------------------------------------\n# nearest_upsample + conv (+ R)\n# --------------------------------------------\ndef upsample_upconv(in_channels=64,\n                    out_channels=3,\n                    kernel_size=3,\n                    stride=1,\n                    padding=1,\n                    bias=True,\n                    mode='2R',\n                    negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR'\n    if mode[0] == '2':\n        uc = 'UC'\n    elif mode[0] == '3':\n        uc = 'uC'\n    elif mode[0] == '4':\n        uc = 'vC'\n    mode = mode.replace(mode[0], uc)\n    up1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode=mode, negative_slope=negative_slope)\n    return up1\n\n\n# --------------------------------------------\n# convTranspose (+ relu)\n# 
--------------------------------------------\ndef upsample_convtranspose(in_channels=64,\n                           out_channels=3,\n                           kernel_size=2,\n                           stride=2,\n                           padding=0,\n                           bias=True,\n                           mode='2R',\n                           negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    kernel_size = int(mode[0])\n    stride = int(mode[0])\n    mode = mode.replace(mode[0], 'T')\n    up1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n    return up1\n\n\n'''\n# --------------------------------------------\n# Downsampler\n# Kai Zhang, https://github.com/cszn/KAIR\n# --------------------------------------------\n# downsample_strideconv\n# downsample_maxpool\n# downsample_avgpool\n# --------------------------------------------\n'''\n\n\n# --------------------------------------------\n# strideconv (+ relu)\n# --------------------------------------------\ndef downsample_strideconv(in_channels=64,\n                          out_channels=64,\n                          kernel_size=2,\n                          stride=2,\n                          padding=0,\n                          bias=True,\n                          mode='2R',\n                          negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    kernel_size = int(mode[0])\n    stride = int(mode[0])\n    mode = mode.replace(mode[0], 'C')\n    down1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n    return down1\n\n\n# --------------------------------------------\n# maxpooling + conv (+ relu)\n# --------------------------------------------\ndef downsample_maxpool(in_channels=64,\n                       out_channels=64,\n                       
kernel_size=3,\n                       stride=1,\n                       padding=0,\n                       bias=True,\n                       mode='2R',\n                       negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3'], 'mode examples: 2, 2R, 2BR, 3, ..., 3BR.'\n    kernel_size_pool = int(mode[0])\n    stride_pool = int(mode[0])\n    mode = mode.replace(mode[0], 'MC')\n    pool = conv(kernel_size=kernel_size_pool, stride=stride_pool, mode=mode[0], negative_slope=negative_slope)\n    pool_tail = conv(in_channels,\n                     out_channels,\n                     kernel_size,\n                     stride,\n                     padding,\n                     bias,\n                     mode=mode[1:],\n                     negative_slope=negative_slope)\n    return sequential(pool, pool_tail)\n\n\n# --------------------------------------------\n# averagepooling + conv (+ relu)\n# --------------------------------------------\ndef downsample_avgpool(in_channels=64,\n                       out_channels=64,\n                       kernel_size=3,\n                       stride=1,\n                       padding=1,\n                       bias=True,\n                       mode='2R',\n                       negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3'], 'mode examples: 2, 2R, 2BR, 3, ..., 3BR.'\n    kernel_size_pool = int(mode[0])\n    stride_pool = int(mode[0])\n    mode = mode.replace(mode[0], 'AC')\n    pool = conv(kernel_size=kernel_size_pool, stride=stride_pool, mode=mode[0], negative_slope=negative_slope)\n    pool_tail = conv(in_channels,\n                     out_channels,\n                     kernel_size,\n                     stride,\n                     padding,\n                     bias,\n                     mode=mode[1:],\n                     negative_slope=negative_slope)\n    return sequential(pool, pool_tail)\n\n\nclass QFAttention(nn.Layer):\n\n    def __init__(self,\n                 
in_channels=64,\n                 out_channels=64,\n                 kernel_size=3,\n                 stride=1,\n                 padding=1,\n                 bias=True,\n                 mode='CRC',\n                 negative_slope=0.2):\n        super(QFAttention, self).__init__()\n\n        assert in_channels == out_channels, 'Only support in_channels==out_channels.'\n        if mode[0] in ['R', 'L']:\n            mode = mode[0].lower() + mode[1:]\n\n        self.res = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n\n    def forward(self, x, gamma, beta):\n        gamma = gamma.unsqueeze(-1).unsqueeze(-1)\n        beta = beta.unsqueeze(-1).unsqueeze(-1)\n        res = (gamma) * self.res(x) + beta\n        return x + res\n\n\nclass FBCNN(nn.Layer):\n\n    def __init__(self,\n                 in_nc=3,\n                 out_nc=3,\n                 nc=[64, 128, 256, 512],\n                 nb=4,\n                 act_mode='R',\n                 downsample_mode='strideconv',\n                 upsample_mode='convtranspose'):\n        super(FBCNN, self).__init__()\n\n        self.m_head = conv(in_nc, nc[0], bias=True, mode='C')\n        self.nb = nb\n        self.nc = nc\n        # downsample\n        if downsample_mode == 'avgpool':\n            downsample_block = downsample_avgpool\n        elif downsample_mode == 'maxpool':\n            downsample_block = downsample_maxpool\n        elif downsample_mode == 'strideconv':\n            downsample_block = downsample_strideconv\n        else:\n            raise NotImplementedError('downsample mode [{:s}] is not found'.format(downsample_mode))\n\n        self.m_down1 = sequential(*[ResBlock(nc[0], nc[0], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  downsample_block(nc[0], nc[1], bias=True, mode='2'))\n        self.m_down2 = sequential(*[ResBlock(nc[1], nc[1], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n             
                     downsample_block(nc[1], nc[2], bias=True, mode='2'))\n        self.m_down3 = sequential(*[ResBlock(nc[2], nc[2], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  downsample_block(nc[2], nc[3], bias=True, mode='2'))\n\n        self.m_body_encoder = sequential(\n            *[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)])\n\n        self.m_body_decoder = sequential(\n            *[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)])\n\n        # upsample\n        if upsample_mode == 'upconv':\n            upsample_block = upsample_upconv\n        elif upsample_mode == 'pixelshuffle':\n            upsample_block = upsample_pixelshuffle\n        elif upsample_mode == 'convtranspose':\n            upsample_block = upsample_convtranspose\n        else:\n            raise NotImplementedError('upsample mode [{:s}] is not found'.format(upsample_mode))\n\n        self.m_up3 = nn.LayerList([\n            upsample_block(nc[3], nc[2], bias=True, mode='2'),\n            *[QFAttention(nc[2], nc[2], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_up2 = nn.LayerList([\n            upsample_block(nc[2], nc[1], bias=True, mode='2'),\n            *[QFAttention(nc[1], nc[1], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_up1 = nn.LayerList([\n            upsample_block(nc[1], nc[0], bias=True, mode='2'),\n            *[QFAttention(nc[0], nc[0], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_tail = conv(nc[0], out_nc, bias=True, mode='C')\n\n        self.qf_pred = sequential(*[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  nn.AdaptiveAvgPool2D((1, 1)), nn.Flatten(), nn.Linear(512, 512), nn.ReLU(),\n                                  nn.Linear(512, 512), nn.ReLU(), 
nn.Linear(512, 1), nn.Sigmoid())\n\n        self.qf_embed = sequential(nn.Linear(1, 512), nn.ReLU(), nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512),\n                                   nn.ReLU())\n\n        self.to_gamma_3 = sequential(nn.Linear(512, nc[2]), nn.Sigmoid())\n        self.to_beta_3 = sequential(nn.Linear(512, nc[2]), nn.Tanh())\n        self.to_gamma_2 = sequential(nn.Linear(512, nc[1]), nn.Sigmoid())\n        self.to_beta_2 = sequential(nn.Linear(512, nc[1]), nn.Tanh())\n        self.to_gamma_1 = sequential(nn.Linear(512, nc[0]), nn.Sigmoid())\n        self.to_beta_1 = sequential(nn.Linear(512, nc[0]), nn.Tanh())\n\n    def forward(self, x, qf_input=None):\n\n        h, w = x.shape[-2:]\n        paddingBottom = int(np.ceil(h / 8) * 8 - h)\n        paddingRight = int(np.ceil(w / 8) * 8 - w)\n        x = nn.functional.pad(x, (0, paddingRight, 0, paddingBottom), mode='reflect')\n\n        x1 = self.m_head(x)\n        x2 = self.m_down1(x1)\n        x3 = self.m_down2(x2)\n        x4 = self.m_down3(x3)\n        x = self.m_body_encoder(x4)\n        qf = self.qf_pred(x)\n        x = self.m_body_decoder(x)\n        qf_embedding = self.qf_embed(qf_input) if qf_input is not None else self.qf_embed(qf)\n        gamma_3 = self.to_gamma_3(qf_embedding)\n        beta_3 = self.to_beta_3(qf_embedding)\n\n        gamma_2 = self.to_gamma_2(qf_embedding)\n        beta_2 = self.to_beta_2(qf_embedding)\n\n        gamma_1 = self.to_gamma_1(qf_embedding)\n        beta_1 = self.to_beta_1(qf_embedding)\n\n        x = x + x4\n        x = self.m_up3[0](x)\n        for i in range(self.nb):\n            x = self.m_up3[i + 1](x, gamma_3, beta_3)\n\n        x = x + x3\n\n        x = self.m_up2[0](x)\n        for i in range(self.nb):\n            x = self.m_up2[i + 1](x, gamma_2, beta_2)\n        x = x + x2\n\n        x = self.m_up1[0](x)\n        for i in range(self.nb):\n            x = self.m_up1[i + 1](x, gamma_1, beta_1)\n\n        x = x + x1\n        x = self.m_tail(x)\n  
      x = x[..., :h, :w]\n\n        return x, qf\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_color/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .fbcnn import FBCNN\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='fbcnn_color',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Flexible JPEG Artifacts Removal.\",\n)\nclass FBCNNColor(nn.Layer):\n\n    def __init__(self):\n        super(FBCNNColor, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory, 'ckpts', 'fbcnn_color.pdparams')\n        self.fbcnn = FBCNN()\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.fbcnn.set_state_dict(state_dict)\n        self.fbcnn.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n        img = img.transpose((2, 0, 1))\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        img = img.transpose((1, 2, 0))\n        img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)\n        return img.astype(np.uint8)\n\n    def artifacts_removal(self,\n                          image: Union[str, np.ndarray],\n                          quality_factor: float = None,\n                          visualization: bool = True,\n                          output_dir: str = \"fbcnn_color_output\") -> np.ndarray:\n       
 if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_COLOR)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n        else:\n            raise Exception(\"image should be a str or np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n            if quality_factor is not None and 0 <= quality_factor <= 1:\n                qf_input = paddle.to_tensor([[quality_factor]], dtype=paddle.float32)\n            else:\n                qf_input = None\n            img_output, _ = self.fbcnn(img_input, qf_input)\n            img_output = img_output.numpy()[0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--quality_factor', type=float, default=None, help=\"Image quality factor (0.0-1.0).\")\n        self.parser.add_argument('--output_dir',\n                             
    type=str,\n                                 default='fbcnn_color_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.artifacts_removal(image=args.input_path,\n                               quality_factor=args.quality_factor,\n                               visualization=True,\n                               output_dir=args.output_dir)\n        return 'Artifacts removal results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.artifacts_removal(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_color/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"fbcnn_color\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('fbcnn_color_output')\n\n    def test_artifacts_removal1(self):\n        results = self.module.artifacts_removal(image='tests/test.jpg', quality_factor=None, visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal2(self):\n        results = self.module.artifacts_removal(image=cv2.imread('tests/test.jpg'),\n                                                quality_factor=None,\n                                                visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal3(self):\n        results = self.module.artifacts_removal(image=cv2.imread('tests/test.jpg'),\n                                                quality_factor=0.5,\n                                                visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal4(self):\n        self.assertRaises(Exception, self.module.artifacts_removal, image=['tests/test.jpg'])\n\n    def test_artifacts_removal5(self):\n        self.assertRaises(FileNotFoundError, self.module.artifacts_removal, 
image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_gray/README.md",
    "content": "# fbcnn_gray\n\n|模型名称|fbcnn_gray|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|FBCNN|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|288MB|\n|指标|-|\n|最新更新日期|2022-10-08|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/08afa15df2e54adeb39587cd7aaa9b60fc82d349bda34f51993d6304776fd374\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/4804ea3fff524c578014ec98e5222b6310a4cdf1ba41448c94829399e82880b6\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - FBCNN 是一个基于卷积神经网络的 JPEG 图像伪影去除模型，它可以预测可调整的质量因子，以控制伪影去除和细节保留之间的权衡。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2、安装\n\n    - ```shell\n      $ hub install fbcnn_gray\n      ```\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run fbcnn_gray \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --quality_factor -1 \\\n        --output_dir \"fbcnn_gray_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"fbcnn_gray\")\n    result = module.artifacts_removal(\n        image=cv2.imread('/PATH/TO/IMAGE', cv2.IMREAD_GRAYSCALE),\n        quality_factor=None,\n        visualization=True,\n        output_dir='fbcnn_gray_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def artifacts_removal(\n        image: Union[str, numpy.ndarray],\n        quality_factor: float = None,\n        visualization: bool = True,\n        output_dir: str = \"fbcnn_gray_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - 伪影去除 API\n\n    - **参数**\n\n      * image (Union\[str, 
numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W\\]，GRAY格式；\n      * quality_factor (float): 自定义质量因子（0.0 - 1.0），默认 None 为自适应；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (numpy.ndarray): 图像伪影去除结果 (GRAY)；\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个图像伪影去除的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m fbcnn_gray\n    ```\n\n    - 这样就完成了一个图像伪影去除服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_GRAYSCALE)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/fbcnn_gray\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # 保存结果\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## 五、参考资料\n\n* 论文：[Towards Flexible Blind JPEG Artifacts Removal](https://arxiv.org/abs/2109.14573)\n\n* 官方实现：[jiaxi-jiang/FBCNN](https://github.com/jiaxi-jiang/FBCNN)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install fbcnn_gray==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_gray/fbcnn.py",
    "content": "from collections import OrderedDict\n\nimport numpy as np\nimport paddle.nn as nn\n'''\n# --------------------------------------------\n# Advanced nn.Sequential\n# https://github.com/xinntao/BasicSR\n# --------------------------------------------\n'''\n\n\ndef sequential(*args):\n    \"\"\"Advanced nn.Sequential.\n    Args:\n        nn.Sequential, nn.Layer\n    Returns:\n        nn.Sequential\n    \"\"\"\n    if len(args) == 1:\n        if isinstance(args[0], OrderedDict):\n            raise NotImplementedError('sequential does not support OrderedDict input.')\n        return args[0]  # No sequential is needed.\n    modules = []\n    for module in args:\n        if isinstance(module, nn.Sequential):\n            for submodule in module.children():\n                modules.append(submodule)\n        elif isinstance(module, nn.Layer):\n            modules.append(module)\n    return nn.Sequential(*modules)\n\n\n# --------------------------------------------\n# return nn.Sequantial of (Conv + BN + ReLU)\n# --------------------------------------------\ndef conv(in_channels=64,\n         out_channels=64,\n         kernel_size=3,\n         stride=1,\n         padding=1,\n         bias=True,\n         mode='CBR',\n         negative_slope=0.2):\n    L = []\n    for t in mode:\n        if t == 'C':\n            L.append(\n                nn.Conv2D(in_channels=in_channels,\n                          out_channels=out_channels,\n                          kernel_size=kernel_size,\n                          stride=stride,\n                          padding=padding,\n                          bias_attr=bias))\n        elif t == 'T':\n            L.append(\n                nn.Conv2DTranspose(in_channels=in_channels,\n                                   out_channels=out_channels,\n                                   kernel_size=kernel_size,\n                                   stride=stride,\n                                   padding=padding,\n                          
         bias_attr=bias))\n        elif t == 'B':\n            L.append(nn.BatchNorm2D(out_channels, momentum=0.9, epsilon=1e-04))\n        elif t == 'I':\n            L.append(nn.InstanceNorm2D(out_channels))\n        elif t == 'R':\n            L.append(nn.ReLU())\n        elif t == 'r':\n            L.append(nn.ReLU())\n        elif t == 'L':\n            L.append(nn.LeakyReLU(negative_slope=negative_slope))\n        elif t == 'l':\n            L.append(nn.LeakyReLU(negative_slope=negative_slope))\n        elif t == '2':\n            L.append(nn.PixelShuffle(upscale_factor=2))\n        elif t == '3':\n            L.append(nn.PixelShuffle(upscale_factor=3))\n        elif t == '4':\n            L.append(nn.PixelShuffle(upscale_factor=4))\n        elif t == 'U':\n            L.append(nn.Upsample(scale_factor=2, mode='nearest'))\n        elif t == 'u':\n            L.append(nn.Upsample(scale_factor=3, mode='nearest'))\n        elif t == 'v':\n            L.append(nn.Upsample(scale_factor=4, mode='nearest'))\n        elif t == 'M':\n            L.append(nn.MaxPool2D(kernel_size=kernel_size, stride=stride, padding=0))\n        elif t == 'A':\n            L.append(nn.AvgPool2D(kernel_size=kernel_size, stride=stride, padding=0))\n        else:\n            raise NotImplementedError('Undefined type: {:s}'.format(t))\n    return sequential(*L)\n\n\n# --------------------------------------------\n# Res Block: x + conv(relu(conv(x)))\n# --------------------------------------------\nclass ResBlock(nn.Layer):\n\n    def __init__(self,\n                 in_channels=64,\n                 out_channels=64,\n                 kernel_size=3,\n                 stride=1,\n                 padding=1,\n                 bias=True,\n                 mode='CRC',\n                 negative_slope=0.2):\n        super(ResBlock, self).__init__()\n\n        assert in_channels == out_channels, 'Only support in_channels==out_channels.'\n        if mode[0] in ['R', 'L']:\n        
    mode = mode[0].lower() + mode[1:]\n\n        self.res = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n\n    def forward(self, x):\n        res = self.res(x)\n        return x + res\n\n\n# --------------------------------------------\n# conv + subp (+ relu)\n# --------------------------------------------\ndef upsample_pixelshuffle(in_channels=64,\n                          out_channels=3,\n                          kernel_size=3,\n                          stride=1,\n                          padding=1,\n                          bias=True,\n                          mode='2R',\n                          negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    up1 = conv(in_channels,\n               out_channels * (int(mode[0])**2),\n               kernel_size,\n               stride,\n               padding,\n               bias,\n               mode='C' + mode,\n               negative_slope=negative_slope)\n    return up1\n\n\n# --------------------------------------------\n# nearest_upsample + conv (+ R)\n# --------------------------------------------\ndef upsample_upconv(in_channels=64,\n                    out_channels=3,\n                    kernel_size=3,\n                    stride=1,\n                    padding=1,\n                    bias=True,\n                    mode='2R',\n                    negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR'\n    if mode[0] == '2':\n        uc = 'UC'\n    elif mode[0] == '3':\n        uc = 'uC'\n    elif mode[0] == '4':\n        uc = 'vC'\n    mode = mode.replace(mode[0], uc)\n    up1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode=mode, negative_slope=negative_slope)\n    return up1\n\n\n# --------------------------------------------\n# convTranspose (+ relu)\n# 
--------------------------------------------\ndef upsample_convtranspose(in_channels=64,\n                           out_channels=3,\n                           kernel_size=2,\n                           stride=2,\n                           padding=0,\n                           bias=True,\n                           mode='2R',\n                           negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    kernel_size = int(mode[0])\n    stride = int(mode[0])\n    mode = mode.replace(mode[0], 'T')\n    up1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n    return up1\n\n\n'''\n# --------------------------------------------\n# Downsampler\n# Kai Zhang, https://github.com/cszn/KAIR\n# --------------------------------------------\n# downsample_strideconv\n# downsample_maxpool\n# downsample_avgpool\n# --------------------------------------------\n'''\n\n\n# --------------------------------------------\n# strideconv (+ relu)\n# --------------------------------------------\ndef downsample_strideconv(in_channels=64,\n                          out_channels=64,\n                          kernel_size=2,\n                          stride=2,\n                          padding=0,\n                          bias=True,\n                          mode='2R',\n                          negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3', '4'], 'mode examples: 2, 2R, 2BR, 3, ..., 4BR.'\n    kernel_size = int(mode[0])\n    stride = int(mode[0])\n    mode = mode.replace(mode[0], 'C')\n    down1 = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n    return down1\n\n\n# --------------------------------------------\n# maxpooling + conv (+ relu)\n# --------------------------------------------\ndef downsample_maxpool(in_channels=64,\n                       out_channels=64,\n                       
kernel_size=3,\n                       stride=1,\n                       padding=0,\n                       bias=True,\n                       mode='2R',\n                       negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3'], 'mode examples: 2, 2R, 2BR, 3, ..., 3BR.'\n    kernel_size_pool = int(mode[0])\n    stride_pool = int(mode[0])\n    mode = mode.replace(mode[0], 'MC')\n    pool = conv(kernel_size=kernel_size_pool, stride=stride_pool, mode=mode[0], negative_slope=negative_slope)\n    pool_tail = conv(in_channels,\n                     out_channels,\n                     kernel_size,\n                     stride,\n                     padding,\n                     bias,\n                     mode=mode[1:],\n                     negative_slope=negative_slope)\n    return sequential(pool, pool_tail)\n\n\n# --------------------------------------------\n# averagepooling + conv (+ relu)\n# --------------------------------------------\ndef downsample_avgpool(in_channels=64,\n                       out_channels=64,\n                       kernel_size=3,\n                       stride=1,\n                       padding=1,\n                       bias=True,\n                       mode='2R',\n                       negative_slope=0.2):\n    assert len(mode) < 4 and mode[0] in ['2', '3'], 'mode examples: 2, 2R, 2BR, 3, ..., 3BR.'\n    kernel_size_pool = int(mode[0])\n    stride_pool = int(mode[0])\n    mode = mode.replace(mode[0], 'AC')\n    pool = conv(kernel_size=kernel_size_pool, stride=stride_pool, mode=mode[0], negative_slope=negative_slope)\n    pool_tail = conv(in_channels,\n                     out_channels,\n                     kernel_size,\n                     stride,\n                     padding,\n                     bias,\n                     mode=mode[1:],\n                     negative_slope=negative_slope)\n    return sequential(pool, pool_tail)\n\n\nclass QFAttention(nn.Layer):\n\n    def __init__(self,\n                 
in_channels=64,\n                 out_channels=64,\n                 kernel_size=3,\n                 stride=1,\n                 padding=1,\n                 bias=True,\n                 mode='CRC',\n                 negative_slope=0.2):\n        super(QFAttention, self).__init__()\n\n        assert in_channels == out_channels, 'Only support in_channels==out_channels.'\n        if mode[0] in ['R', 'L']:\n            mode = mode[0].lower() + mode[1:]\n\n        self.res = conv(in_channels, out_channels, kernel_size, stride, padding, bias, mode, negative_slope)\n\n    def forward(self, x, gamma, beta):\n        gamma = gamma.unsqueeze(-1).unsqueeze(-1)\n        beta = beta.unsqueeze(-1).unsqueeze(-1)\n        res = (gamma) * self.res(x) + beta\n        return x + res\n\n\nclass FBCNN(nn.Layer):\n\n    def __init__(self,\n                 in_nc=3,\n                 out_nc=3,\n                 nc=[64, 128, 256, 512],\n                 nb=4,\n                 act_mode='R',\n                 downsample_mode='strideconv',\n                 upsample_mode='convtranspose'):\n        super(FBCNN, self).__init__()\n\n        self.m_head = conv(in_nc, nc[0], bias=True, mode='C')\n        self.nb = nb\n        self.nc = nc\n        # downsample\n        if downsample_mode == 'avgpool':\n            downsample_block = downsample_avgpool\n        elif downsample_mode == 'maxpool':\n            downsample_block = downsample_maxpool\n        elif downsample_mode == 'strideconv':\n            downsample_block = downsample_strideconv\n        else:\n            raise NotImplementedError('downsample mode [{:s}] is not found'.format(downsample_mode))\n\n        self.m_down1 = sequential(*[ResBlock(nc[0], nc[0], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  downsample_block(nc[0], nc[1], bias=True, mode='2'))\n        self.m_down2 = sequential(*[ResBlock(nc[1], nc[1], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n             
                     downsample_block(nc[1], nc[2], bias=True, mode='2'))\n        self.m_down3 = sequential(*[ResBlock(nc[2], nc[2], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  downsample_block(nc[2], nc[3], bias=True, mode='2'))\n\n        self.m_body_encoder = sequential(\n            *[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)])\n\n        self.m_body_decoder = sequential(\n            *[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)])\n\n        # upsample\n        if upsample_mode == 'upconv':\n            upsample_block = upsample_upconv\n        elif upsample_mode == 'pixelshuffle':\n            upsample_block = upsample_pixelshuffle\n        elif upsample_mode == 'convtranspose':\n            upsample_block = upsample_convtranspose\n        else:\n            raise NotImplementedError('upsample mode [{:s}] is not found'.format(upsample_mode))\n\n        self.m_up3 = nn.LayerList([\n            upsample_block(nc[3], nc[2], bias=True, mode='2'),\n            *[QFAttention(nc[2], nc[2], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_up2 = nn.LayerList([\n            upsample_block(nc[2], nc[1], bias=True, mode='2'),\n            *[QFAttention(nc[1], nc[1], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_up1 = nn.LayerList([\n            upsample_block(nc[1], nc[0], bias=True, mode='2'),\n            *[QFAttention(nc[0], nc[0], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)]\n        ])\n\n        self.m_tail = conv(nc[0], out_nc, bias=True, mode='C')\n\n        self.qf_pred = sequential(*[ResBlock(nc[3], nc[3], bias=True, mode='C' + act_mode + 'C') for _ in range(nb)],\n                                  nn.AdaptiveAvgPool2D((1, 1)), nn.Flatten(), nn.Linear(512, 512), nn.ReLU(),\n                                  nn.Linear(512, 512), nn.ReLU(), 
nn.Linear(512, 1), nn.Sigmoid())\n\n        self.qf_embed = sequential(nn.Linear(1, 512), nn.ReLU(), nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512),\n                                   nn.ReLU())\n\n        self.to_gamma_3 = sequential(nn.Linear(512, nc[2]), nn.Sigmoid())\n        self.to_beta_3 = sequential(nn.Linear(512, nc[2]), nn.Tanh())\n        self.to_gamma_2 = sequential(nn.Linear(512, nc[1]), nn.Sigmoid())\n        self.to_beta_2 = sequential(nn.Linear(512, nc[1]), nn.Tanh())\n        self.to_gamma_1 = sequential(nn.Linear(512, nc[0]), nn.Sigmoid())\n        self.to_beta_1 = sequential(nn.Linear(512, nc[0]), nn.Tanh())\n\n    def forward(self, x, qf_input=None):\n\n        h, w = x.shape[-2:]\n        paddingBottom = int(np.ceil(h / 8) * 8 - h)\n        paddingRight = int(np.ceil(w / 8) * 8 - w)\n        x = nn.functional.pad(x, (0, paddingRight, 0, paddingBottom), mode='reflect')\n\n        x1 = self.m_head(x)\n        x2 = self.m_down1(x1)\n        x3 = self.m_down2(x2)\n        x4 = self.m_down3(x3)\n        x = self.m_body_encoder(x4)\n        qf = self.qf_pred(x)\n        x = self.m_body_decoder(x)\n        qf_embedding = self.qf_embed(qf_input) if qf_input is not None else self.qf_embed(qf)\n        gamma_3 = self.to_gamma_3(qf_embedding)\n        beta_3 = self.to_beta_3(qf_embedding)\n\n        gamma_2 = self.to_gamma_2(qf_embedding)\n        beta_2 = self.to_beta_2(qf_embedding)\n\n        gamma_1 = self.to_gamma_1(qf_embedding)\n        beta_1 = self.to_beta_1(qf_embedding)\n\n        x = x + x4\n        x = self.m_up3[0](x)\n        for i in range(self.nb):\n            x = self.m_up3[i + 1](x, gamma_3, beta_3)\n\n        x = x + x3\n\n        x = self.m_up2[0](x)\n        for i in range(self.nb):\n            x = self.m_up2[i + 1](x, gamma_2, beta_2)\n        x = x + x2\n\n        x = self.m_up1[0](x)\n        for i in range(self.nb):\n            x = self.m_up1[i + 1](x, gamma_1, beta_1)\n\n        x = x + x1\n        x = self.m_tail(x)\n  
      x = x[..., :h, :w]\n\n        return x, qf\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_gray/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .fbcnn import FBCNN\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_GRAYSCALE)\n    return data\n\n\n@moduleinfo(\n    name='fbcnn_gray',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Flexible JPEG Artifacts Removal.\",\n)\nclass FBCNNGray(nn.Layer):\n\n    def __init__(self):\n        super(FBCNNGray, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory, 'ckpts', 'fbcnn_gray.pdparams')\n        self.fbcnn = FBCNN(in_nc=1, out_nc=1)\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.fbcnn.set_state_dict(state_dict)\n        self.fbcnn.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img[None, ...]\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        return img.astype(np.uint8)\n\n    def artifacts_removal(self,\n                          image: Union[str, np.ndarray],\n                          quality_factor: float = None,\n                          visualization: bool = True,\n                          output_dir: str = \"fbcnn_gray_output\") -> np.ndarray:\n        if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n        else:\n            raise TypeError(\"image should be a str or np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n            # Treat 0.0 as a valid quality factor; only fall back to the\n            # self-predicted factor when quality_factor is None or out of range.\n            if quality_factor is not None and 0 <= quality_factor <= 1:\n                qf_input = paddle.to_tensor([[quality_factor]], dtype=paddle.float32)\n            else:\n                qf_input = None\n            img_output, _ = self.fbcnn(img_input, qf_input)\n            img_output = img_output.numpy()[0][0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--quality_factor', type=float, default=None, help=\"Image quality factor (0.0-1.0).\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='fbcnn_gray_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.artifacts_removal(image=args.input_path,\n                               quality_factor=args.quality_factor,\n                               visualization=True,\n                               output_dir=args.output_dir)\n        return 'Artifacts removal results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.artifacts_removal(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/enhancement/fbcnn_gray/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"fbcnn_gray\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('fbcnn_gray_output')\n\n    def test_artifacts_removal1(self):\n        results = self.module.artifacts_removal(image='tests/test.jpg', quality_factor=None, visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal2(self):\n        results = self.module.artifacts_removal(image=cv2.imread('tests/test.jpg', cv2.IMREAD_GRAYSCALE),\n                                                quality_factor=None,\n                                                visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal3(self):\n        results = self.module.artifacts_removal(image=cv2.imread('tests/test.jpg', cv2.IMREAD_GRAYSCALE),\n                                                quality_factor=0.5,\n                                                visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_artifacts_removal4(self):\n        self.assertRaises(Exception, self.module.artifacts_removal, image=['tests/test.jpg'])\n\n    def test_artifacts_removal5(self):\n        
self.assertRaises(FileNotFoundError, self.module.artifacts_removal, image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/README.md",
    "content": "# dcscn\n\n\n|模型名称|dcscn|\n| :--- | :---: | \n|类别|图像-图像编辑|\n|网络|dcscn|\n|数据集|DIV2k|\n|是否支持Fine-tuning|否|\n|模型大小|260KB|\n|指标|PSNR37.63|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  \n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### 模型介绍\n\n  - DCSCN是基于Fast and Accurate Image Super Resolution by Deep CNN with Skip Connection and Network in Network设计的轻量化超分辨模型。该模型使用残差结构和跳连的方式构建网络来提取局部和全局特征，同时使用并行1*1的卷积网络学习细节特征提升模型性能。该模型提供的超分倍数为2倍。\n\n  - 更多详情请参考：[dcscn](https://github.com/jiny2001/dcscn-super-resolution)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、安装\n    - ```shell\n      $ hub install dcscn\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n- ### 1、命令行预测\n\n  - ```\n    $ hub run dcscn --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='dcscn')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  #visualization=True可以用于查看超分图片效果，可设置为False提升运行速度。\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"dcscn_output\")\n    ```\n\n    - 预测API，用于图像超分辨率。\n\n    - **参数**\n\n      * images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path'， 'data'，对应的取值为：\n        * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n        * data (numpy.ndarray): 超分辨后图像。\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像超分的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n      - ```shell\n        $ hub serving start -m dcscn\n        ```\n\n      - 这样就完成了一个超分任务的服务化API的部署，默认端口号为8866。\n\n      - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n - ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/dcscn\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        sr = np.expand_dims(cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY), axis=2)\n        shape =sr.shape\n        org_im = 
cv2.cvtColor(org_im, cv2.COLOR_BGR2YUV)\n        uv = cv2.resize(org_im[...,1:], (shape[1], shape[0]), interpolation=cv2.INTER_CUBIC)\n        combine_im = cv2.cvtColor(np.concatenate((sr, uv), axis=2), cv2.COLOR_YUV2BGR)\n        cv2.imwrite('dcscn_X2.png', combine_im)\n        print(\"save image as dcscn_X2.png\")\n        ```\n\n\n## 五、更新历史\n\n\n* 1.0.0\n\n  初始发布\n\n\n* 1.1.0\n\n  移除 fluid API\n\n  ```shell\n  $ hub install dcscn==1.1.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/README_en.md",
    "content": "# dcscn\n\n|Module Name|dcscn|\n| :--- | :---: | \n|Category |Image editing|\n|Network|dcscn|\n|Dataset|DIV2k|\n|Fine-tuning supported or not|No|\n|Module Size|260KB|\n|Data indicators|PSNR37.63|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n  \n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### Module Introduction\n\n  - DCSCN is a super resolution model based on 'Fast and Accurate Image Super Resolution by Deep CNN with Skip Connection and Network in Network'. The model uses residual structure and skip connections to extract local and global features. It uses a parallel 1*1 convolutional network to learn detailed features to improve model performance. This model provides super resolution result with scale factor x2.\n\n  - For more information, please refer to: [dcscn](https://github.com/jiny2001/dcscn-super-resolution)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install dcscn\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```\n    $ hub run dcscn --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    sr_model = hub.Module(name='dcscn')\n    im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n    res = sr_model.reconstruct(images=[im], visualization=True)\n    print(res[0]['data'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"dcscn_output\")\n    ```\n\n    - Prediction API.\n\n    - **Parameters**\n\n      * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format \\[H, W, C\\], BGR.\n      * paths (list\\[str\\]): Image paths.\n      * use\\_gpu (bool): Use GPU or not. **Set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**.\n      * visualization (bool): Whether to save the recognition results as picture files.\n      * output\\_dir (str): Save path of images, \"dcscn_output\" by default.\n\n    - **Return**\n      * res (list\\[dict\\]): The list of model results, where each element is a dict and each field is:\n        * save\\_path (str, optional): Save path of the result (exists only when visualization is True).\n        * data (numpy.ndarray): Result of super resolution.\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of super resolution.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m dcscn\n        ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n    - ```python\n      import requests\n      import json\n      import base64\n\n      import cv2\n      import numpy as np\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tobytes()).decode('utf8')\n      def base64_to_cv2(b64str):\n          data = base64.b64decode(b64str.encode('utf8'))\n          data = np.frombuffer(data, np.uint8)\n          data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n          return data\n\n\n      org_im = cv2.imread('/PATH/TO/IMAGE')\n      data = {'images':[cv2_to_base64(org_im)]}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/dcscn\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n      sr = np.expand_dims(cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY), axis=2)\n      shape = sr.shape\n      org_im = cv2.cvtColor(org_im, cv2.COLOR_BGR2YUV)\n      uv = cv2.resize(org_im[...,1:], (shape[1], shape[0]), interpolation=cv2.INTER_CUBIC)\n      combine_im = cv2.cvtColor(np.concatenate((sr, uv), axis=2), cv2.COLOR_YUV2BGR)\n      cv2.imwrite('dcscn_X2.png', combine_im)\n      print(\"save image as dcscn_X2.png\")\n      ```\n\n## V. 
Release Note\n\n- 1.0.0\n\n  First release\n\n\n- 1.1.0\n\n  Remove Fluid API\n\n\n  ```shell\n  $ hub install dcscn==1.1.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/data_feed.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            im = im.astype(np.float32)\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2YUV)\n        shape = img.shape\n        img_x = np.expand_dims(img[:, :, 0], axis=2)\n        img_x2 = np.expand_dims(cv2.resize(img_x, (shape[1] * 2, shape[0] * 2), interpolation=cv2.INTER_CUBIC), axis=2)\n        img_x = img_x.transpose((2, 0, 1)) / 255\n        img_x2 = img_x2.transpose(2, 0, 1) / 255\n        img_x = img_x.astype(np.float32)\n        img_x2 = img_x2.astype(np.float32)\n        element['img_x'] = img_x\n        element['img_x2'] = img_x2\n        yield element\n\n\nif __name__ == \"__main__\":\n    path = ['photo.jpg']\n    reader(paths=path)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport ast\nimport os\nimport argparse\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .data_feed import reader\nfrom .processor import postprocess, base64_to_cv2, cv2_to_base64, check_dir\n\n\n@moduleinfo(\n    name=\"dcscn\",\n    type=\"CV/image_editing\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"dcscn is a super resolution model.\",\n    version=\"1.1.0\")\nclass Dcscn:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"dcscn_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n         
   gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def reconstruct(self, images=None, paths=None, use_gpu=False, visualization=False, output_dir=\"dcscn_output\"):\n        \"\"\"\n        API for super resolution.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. (Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        res = list()\n\n        for i in range(total_num):\n            image_x = np.array([all_data[i]['img_x']])\n            image_x2 = np.array([all_data[i]['img_x2']])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_x.copy())\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(image_x2.copy())\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = np.expand_dims(output_handle.copy_to_cpu(), axis=1)\n\n            out = postprocess(\n                data_out=output,\n                org_im=all_data[i]['org_im'],\n                org_im_shape=all_data[i]['org_im_shape'],\n                org_im_path=all_data[i]['org_im_path'],\n                output_dir=output_dir,\n                visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.reconstruct(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            
description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.reconstruct(\n            paths=[args.input_path], use_gpu=args.use_gpu, output_dir=args.output_dir, visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='dcscn_output', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--save_dir', type=str, default='dcscn_save_model', help=\"The directory to save model.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=ast.literal_eval, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == \"__main__\":\n    module = Dcscn()\n    #module.reconstruct(paths=[\"BSD100_001.png\",\"BSD100_002.png\"])\n    import cv2\n    img = 
cv2.imread(\"BSD100_001.png\").astype('float32')\n    res = module.reconstruct(images=[img])\n    module.save_inference_model()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport time\nimport base64\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape of original image.\n        org_im_path (str): path of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for sr in data_out:\n        sr = np.squeeze(sr, 0)\n        sr = np.clip(sr * 255, 0, 255)\n        sr = sr.astype(np.uint8)\n        shape = sr.shape\n        if visualization:\n            org_im = cv2.cvtColor(org_im, cv2.COLOR_BGR2YUV)\n            uv = cv2.resize(org_im[..., 1:], (shape[1], shape[0]), interpolation=cv2.INTER_CUBIC)\n            combine_im = cv2.cvtColor(np.concatenate((sr, uv), axis=2), cv2.COLOR_YUV2BGR)\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, combine_im)\n            print(\"save image at: \", save_im_path)\n            result['save_path'] = save_im_path\n            result['data'] = sr\n        else:\n            result['data'] = sr\n\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        
os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/dcscn/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"dcscn\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('dcscn_output')\n\n    def test_reconstruct1(self):\n        results = self.module.reconstruct(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct2(self):\n        results = self.module.reconstruct(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct3(self):\n        results = self.module.reconstruct(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct4(self):\n        results = self.module.reconstruct(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct5(self):\n        
self.assertRaises(\n            AssertionError,\n            self.module.reconstruct,\n            paths=['no.jpg']\n        )\n\n    def test_reconstruct6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.reconstruct,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/README.md",
    "content": "# falsr_a\n\n\n|模型名称|falsr_a|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|falsr_a|\n|数据集|DIV2k|\n|是否支持Fine-tuning|否|\n|模型大小|8.9MB|\n|指标|PSNR37.82|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### 模型介绍\n\n  - falsr_a是基于Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search设计的轻量化超分辨模型。该模型使用多目标方法处理超分问题，同时使用基于混合控制器的弹性搜索策略来提升模型性能。该模型提供的超分倍数为2倍。\n\n  - 更多详情请参考：[falsr_a](https://github.com/xiaomi-automl/FALSR)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、安装\n    - ```shell\n      $ hub install falsr_a\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n- ### 1、命令行预测\n\n  - ```\n    $ hub run falsr_a --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='falsr_a')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  #visualization=True可以用于查看超分图片效果，可设置为False提升运行速度。\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_a_output\")\n    ```\n\n    - 预测API，用于图像超分辨率。\n\n    - **参数**\n\n      * images (list\\[numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n        * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n        * data (numpy.ndarray): 超分辨后图像。\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像超分的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n      - ```shell\n        $ hub serving start -m falsr_a\n        ```\n\n      - 这样就完成了一个超分任务的服务化API的部署，默认端口号为8866。\n\n      - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_a\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_a_X2.png', sr)\n        print(\"save image as falsr_a_X2.png\")\n        ```\n\n- ### Gradio APP 
支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/falsr_a 在浏览器中访问 falsr_a 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 fluid API\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  ```shell\n  $ hub install falsr_a==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/README_en.md",
    "content": "# falsr_a\n\n|Module Name|falsr_a|\n| :--- | :---: |\n|Category |Image editing|\n|Network |falsr_a|\n|Dataset|DIV2k|\n|Fine-tuning supported or not|No|\n|Module Size |8.9MB|\n|Data indicators|PSNR37.82|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### Module Introduction\n\n  - Falsr_a is a lightweight super-resolution model based on \"Accurate and Lightweight Super-Resolution with Neural Architecture Search\". The model uses a multi-objective approach to deal with the over-segmentation problem, and uses an elastic search strategy based on a hybrid controller to improve the performance of the model. This model provides super resolution result with scale factor x2.\n\n  - For more information, please refer to: [falsr_a](https://github.com/xiaomi-automl/FALSR)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install falsr_a\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```\n    $ hub run falsr_a --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    sr_model = hub.Module(name='falsr_a')\n    im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n    res = sr_model.reconstruct(images=[im], visualization=True)\n    print(res[0]['data'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_a_output\")\n    ```\n\n    - Prediction API.\n\n    - **Parameters**\n\n      * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format \\[H, W, C\\], BGR.\n      * paths (list\\[str\\]): Image paths.\n      * use\\_gpu (bool): Use GPU or not. **Set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**.\n      * visualization (bool): Whether to save the recognition results as picture files.\n      * output\\_dir (str): Save path of images, \"falsr_a_output\" by default.\n\n    - **Return**\n      * res (list\\[dict\\]): The list of model results, where each element is a dict and each field is:\n        * save\\_path (str, optional): Save path of the result (exists only when visualization is True).\n        * data (numpy.ndarray): Result of super resolution.\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of super resolution.\n\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m falsr_a\n        ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_a\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_a_X2.png', sr)\n        print(\"save image as falsr_a_X2.png\")\n        ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for falsr_a can be accessed in the browser at http://127.0.0.1:8866/gradio/falsr_a.\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n* 1.2.0\n\n  Add Gradio APP support.\n\n  ```shell\n  $ hub install falsr_a==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            im = im.astype(np.float32)\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2YUV)\n        shape = img.shape\n        img_scale = cv2.resize(img, (shape[1] * 2, shape[0] * 2), interpolation=cv2.INTER_CUBIC)\n        img_y = np.expand_dims(img[:, :, 0], axis=2)\n        img_scale_pbpr = img_scale[..., 1:]\n        img_y = img_y.transpose((2, 0, 1)) / 255\n        img_scale_pbpr = img_scale_pbpr.transpose(2, 0, 1) / 255\n        element['img_y'] = img_y\n        element['img_scale_pbpr'] = img_scale_pbpr\n        yield element\n\n\nif __name__ == \"__main__\":\n    path = ['BSD100_001.png']\n    reader(paths=path)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"falsr_a\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"falsr_a is a super resolution model.\",\n            version=\"1.2.0\")\nclass Falsr_A:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"falsr_a_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = 
os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def reconstruct(self, images=None, paths=None, use_gpu=False, visualization=False, output_dir=\"falsr_a_output\"):\n        \"\"\"\n        API for super resolution.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. (Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        res = list()\n\n        for i in range(total_num):\n            image_y = np.array([all_data[i]['img_y']])\n            image_scale_pbpr = np.array([all_data[i]['img_scale_pbpr']])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_y.copy())\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(image_scale_pbpr.copy())\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output = np.expand_dims(output_handle.copy_to_cpu(), axis=1)\n            out = postprocess(data_out=output,\n                              org_im=all_data[i]['org_im'],\n                              org_im_shape=all_data[i]['org_im_shape'],\n                              org_im_path=all_data[i]['org_im_path'],\n                              output_dir=output_dir,\n                              visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.reconstruct(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = 
argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.reconstruct(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='falsr_a_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='falsr_a_save_model',\n                               
            help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.reconstruct(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='falsr_a')\n        return interface\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape pf original image.\n        org_im_path (list): path of riginal image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for sr in data_out:\n        sr = np.squeeze(sr, 0)\n        sr = np.clip(sr * 255, 0, 255)\n        sr = sr.astype(np.uint8)\n        sr = cv2.cvtColor(sr, cv2.COLOR_RGB2BGR)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, sr)\n            print(\"save image at: \", save_im_path)\n            result['save_path'] = save_im_path\n            result['data'] = sr\n        else:\n            result['data'] = sr\n\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # 
name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_a/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=120'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"falsr_a\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('falsr_a_output')\n\n    def test_reconstruct1(self):\n        results = self.module.reconstruct(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct2(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct3(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct4(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=True, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct5(self):\n        self.assertRaises(AssertionError, self.module.reconstruct, paths=['no.jpg'])\n\n    def test_reconstruct6(self):\n        self.assertRaises(AttributeError, self.module.reconstruct, 
images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/README.md",
    "content": "# falsr_b\n\n\n|模型名称|falsr_b|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|falsr_b|\n|数据集|DIV2k|\n|是否支持Fine-tuning|否|\n|模型大小|4MB|\n|指标|PSNR37.61|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### 模型介绍\n\n  - falsr_b是基于Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search设计的轻量化超分辨模型。该模型使用多目标方法处理超分问题，同时使用基于混合控制器的弹性搜索策略来提升模型性能。该模型提供的超分倍数为2倍。\n\n  - 更多详情请参考：[falsr_b](https://github.com/xiaomi-automl/FALSR)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、安装\n    - ```shell\n      $ hub install falsr_b\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n- ### 1、命令行预测\n\n  - ```\n    $ hub run falsr_b --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='falsr_b')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  #visualization=True可以用于查看超分图片效果，可设置为False提升运行速度。\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_b_output\")\n    ```\n\n    - 预测API，用于图像超分辨率。\n\n    - **参数**\n\n      * images (list\\[numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path'， 'data'，对应的取值为：\n        * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n        * data (numpy.ndarray): 超分辨后图像。\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像超分的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n      - ```shell\n        $ hub serving start -m falsr_b\n        ```\n\n      - 这样就完成了一个超分任务的服务化API的部署，默认端口号为8866。\n\n      - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n - ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_b\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_b_X2.png', sr)\n        print(\"save image as falsr_b_X2.png\")\n        ```\n\n- ### Gradio APP 
支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/falsr_b 在浏览器中访问 falsr_b 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 fluid API\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  ```shell\n  $ hub install falsr_b==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/README_en.md",
    "content": "# falsr_b\n\n|Module Name|falsr_b|\n| :--- | :---: |\n|Category |Image editing|\n|Network |falsr_b|\n|Dataset|DIV2k|\n|Fine-tuning supported or not|No|\n|Module Size |4MB|\n|Data indicators|PSNR37.61|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### Module Introduction\n\n  - Falsr_b is a lightweight super-resolution model based on \"Accurate and Lightweight Super-Resolution with Neural Architecture Search\". The model uses a multi-objective approach to deal with the over-segmentation problem, and uses an elastic search strategy based on a hybrid controller to improve the performance of the model. This model provides super resolution result with scale factor x2.\n\n  - For more information, please refer to:[falsr_b](https://github.com/xiaomi-automl/FALSR)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install falsr_b\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```\n    $ hub run falsr_b --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='falsr_b')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_b_output\")\n    ```\n\n    - Prediction API.\n\n    - **Parameters**\n\n      * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format \\[H, W, C\\], BGR.\n      * paths (list\\[str\\]): Image paths.\n      * use\\_gpu (bool): Use GPU or not. **Set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**.\n      * visualization (bool): Whether to save the results as image files.\n      * output\\_dir (str): Save path of images, \"falsr_b_output\" by default.\n\n    - **Return**\n      * res (list\\[dict\\]): The list of model results, where each element is a dict and each field is:\n        * save\\_path (str, optional): Save path of the result (exists only when visualization is True).\n        * data (numpy.ndarray): Result of super resolution.\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online super-resolution service.\n\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m falsr_b\n        ```\n\n    - The service API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_b\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_b_X2.png', sr)\n        print(\"save image as falsr_b_X2.png\")\n        ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for falsr_b can be accessed in the browser at http://127.0.0.1:8866/gradio/falsr_b.\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n* 1.2.0\n\n  Add Gradio APP support.\n\n  ```shell\n  $ hub install falsr_b==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            im = im.astype(np.float32)\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2YUV)\n        shape = img.shape\n        img_scale = cv2.resize(img, (shape[1] * 2, shape[0] * 2), interpolation=cv2.INTER_CUBIC)\n        img_y = np.expand_dims(img[:, :, 0], axis=2)\n        img_scale_pbpr = img_scale[..., 1:]\n        img_y = img_y.transpose((2, 0, 1)) / 255\n        img_scale_pbpr = img_scale_pbpr.transpose(2, 0, 1) / 255\n        element['img_y'] = img_y\n        element['img_scale_pbpr'] = img_scale_pbpr\n        yield element\n\n\nif __name__ == \"__main__\":\n    path = ['BSD100_001.png']\n    reader(paths=path)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"falsr_b\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"falsr_b is a super resolution model.\",\n            version=\"1.2.0\")\nclass Falsr_B:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"falsr_b_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = 
os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def reconstruct(self, images=None, paths=None, use_gpu=False, visualization=False, output_dir=\"falsr_b_output\"):\n        \"\"\"\n        API for super resolution.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. (Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        res = list()\n\n        for i in range(total_num):\n            image_y = np.array([all_data[i]['img_y']])\n            image_scale_pbpr = np.array([all_data[i]['img_scale_pbpr']])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_y.copy())\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(image_scale_pbpr.copy())\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output = np.expand_dims(output_handle.copy_to_cpu(), axis=1)\n            out = postprocess(data_out=output,\n                              org_im=all_data[i]['org_im'],\n                              org_im_shape=all_data[i]['org_im_shape'],\n                              org_im_path=all_data[i]['org_im_path'],\n                              output_dir=output_dir,\n                              visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.reconstruct(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = 
argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.reconstruct(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='falsr_b_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='falsr_b_save_model',\n                               
            help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.reconstruct(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='falsr_b')\n        return interface\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/processor.py",
"content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess the network output, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape of original image.\n        org_im_path (str): path of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for sr in data_out:\n        sr = np.squeeze(sr, 0)\n        sr = np.clip(sr * 255, 0, 255)\n        sr = sr.astype(np.uint8)\n        sr = cv2.cvtColor(sr, cv2.COLOR_RGB2BGR)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, sr)\n            print(\"save image at: \", save_im_path)\n            result['save_path'] = save_im_path\n            result['data'] = sr\n        else:\n            result['data'] = sr\n\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # 
name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_b/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=120'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"falsr_b\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('falsr_b_output')\n\n    def test_reconstruct1(self):\n        results = self.module.reconstruct(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct2(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct3(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct4(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=True, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct5(self):\n        self.assertRaises(AssertionError, self.module.reconstruct, paths=['no.jpg'])\n\n    def test_reconstruct6(self):\n        self.assertRaises(AttributeError, self.module.reconstruct, 
images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/README.md",
    "content": "# falsr_c\n\n\n|模型名称|falsr_c|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|falsr_c|\n|数据集|DIV2k|\n|是否支持Fine-tuning|否|\n|模型大小|4.4MB|\n|PSNR|37.66|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### 模型介绍\n\n  - falsr_c是基于Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search设计的轻量化超分辨模型。该模型使用多目标方法处理超分问题，同时使用基于混合控制器的弹性搜索策略来提升模型性能。该模型提供的超分倍数为2倍。\n\n  - 更多详情请参考：[falsr_c](https://github.com/xiaomi-automl/FALSR)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、安装\n    - ```shell\n      $ hub install falsr_c\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n- ### 1、命令行预测\n\n  - ```\n    $ hub run falsr_c --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='falsr_c')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  #visualization=True可以用于查看超分图片效果，可设置为False提升运行速度。\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_c_output\")\n    ```\n\n    - 预测API，用于图像超分辨率。\n\n    - **参数**\n\n      * images (list\\[numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n        * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n        * data (numpy.ndarray): 超分辨后图像。\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像超分的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n      - ```shell\n        $ hub serving start -m falsr_c\n        ```\n\n      - 这样就完成了一个超分任务的服务化API的部署，默认端口号为8866。\n\n      - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n - ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_c\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_c_X2.png', sr)\n        print(\"save image as falsr_c_X2.png\")\n        ```\n\n- ### Gradio APP 
支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/falsr_c 在浏览器中访问 falsr_c 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 fluid API\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  ```shell\n  $ hub install falsr_c==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/README_en.md",
"content": "# falsr_c\n\n|Module Name|falsr_c|\n| :--- | :---: |\n|Category |Image editing|\n|Network |falsr_c|\n|Dataset|DIV2K|\n|Data indicators|PSNR 37.66|\n|Fine-tuning supported or not|No|\n|Module Size |4.4MB|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130899031-a6f8c58a-5cb7-4105-b990-8cca5ae15368.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### Module Introduction\n\n  - falsr_c is a lightweight super-resolution model based on \"Fast, Accurate and Lightweight Super-Resolution with Neural Architecture Search\". The model uses a multi-objective approach to handle the super-resolution problem, together with an elastic search strategy based on a hybrid controller to improve performance. This model provides a super-resolution result with scale factor x2.\n\n  - For more information, please refer to: [falsr_c](https://github.com/xiaomi-automl/FALSR)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install falsr_c\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```\n    $ hub run falsr_c --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  ```python\n  import cv2\n  import paddlehub as hub\n\n  sr_model = hub.Module(name='falsr_c')\n  im = cv2.imread('/PATH/TO/IMAGE').astype('float32')\n  res = sr_model.reconstruct(images=[im], visualization=True)\n  print(res[0]['data'])\n  ```\n\n- ### 3、API\n\n  - ```python\n    def reconstruct(images=None,\n                    paths=None,\n                    use_gpu=False,\n                    visualization=False,\n                    output_dir=\"falsr_c_output\")\n    ```\n\n    - Prediction API.\n\n    - **Parameters**\n\n      * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format \\[H, W, C\\], BGR format.\n      * paths (list\\[str\\]): Image paths.\n      * use\\_gpu (bool): Use GPU or not. **Set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**.\n      * visualization (bool): Whether to save the results as picture files.\n      * output\\_dir (str): Save path of images, \"falsr_c_output\" by default.\n\n    - **Return**\n      * res (list\\[dict\\]): The list of model results, where each element is a dict with the following fields:\n        * save\\_path (str, optional): Save path of the result; present only when visualization is True.\n        * data (numpy.ndarray): Result of super resolution.\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of super resolution.\n\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m falsr_c\n        ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/falsr_c\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        sr = base64_to_cv2(r.json()[\"results\"][0]['data'])\n        cv2.imwrite('falsr_c_X2.png', sr)\n        print(\"save image as falsr_c_X2.png\")\n        ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for falsr_c can be accessed in the browser at http://127.0.0.1:8866/gradio/falsr_c.\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n* 1.2.0\n\n  Add Gradio APP support.\n\n  ```shell\n  $ hub install falsr_c==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            im = im.astype(np.float32)\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2YUV)\n        shape = img.shape\n        img_scale = cv2.resize(img, (shape[1] * 2, shape[0] * 2), interpolation=cv2.INTER_CUBIC)\n        img_y = np.expand_dims(img[:, :, 0], axis=2)\n        img_scale_pbpr = img_scale[..., 1:]\n        img_y = img_y.transpose((2, 0, 1)) / 255\n        img_scale_pbpr = img_scale_pbpr.transpose(2, 0, 1) / 255\n        element['img_y'] = img_y\n        element['img_scale_pbpr'] = img_scale_pbpr\n        yield element\n\n\nif __name__ == \"__main__\":\n    path = ['BSD100_001.png']\n    reader(paths=path)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"falsr_c\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"falsr_c is a super resolution model.\",\n            version=\"1.2.0\")\nclass Falsr_C:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"falsr_c_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = 
os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def reconstruct(self, images=None, paths=None, use_gpu=False, visualization=False, output_dir=\"falsr_c_output\"):\n        \"\"\"\n        API for super resolution.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. (Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        res = list()\n\n        for i in range(total_num):\n            image_y = np.array([all_data[i]['img_y']])\n            image_scale_pbpr = np.array([all_data[i]['img_scale_pbpr']])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_y.copy())\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(image_scale_pbpr.copy())\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output = np.expand_dims(output_handle.copy_to_cpu(), axis=1)\n            out = postprocess(data_out=output,\n                              org_im=all_data[i]['org_im'],\n                              org_im_shape=all_data[i]['org_im_shape'],\n                              org_im_path=all_data[i]['org_im_path'],\n                              output_dir=output_dir,\n                              visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.reconstruct(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = 
argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.reconstruct(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='falsr_c_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='falsr_c_save_model',\n                               
            help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.reconstruct(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='falsr_c')\n        return interface\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/processor.py",
"content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess the network output, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape of original image.\n        org_im_path (str): path of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for sr in data_out:\n        sr = np.squeeze(sr, 0)\n        sr = np.clip(sr * 255, 0, 255)\n        sr = sr.astype(np.uint8)\n        sr = cv2.cvtColor(sr, cv2.COLOR_RGB2BGR)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, sr)\n            print(\"save image at: \", save_im_path)\n            result['save_path'] = save_im_path\n            result['data'] = sr\n        else:\n            result['data'] = sr\n\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # 
name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/falsr_c/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=120'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"falsr_c\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('falsr_c_output')\n\n    def test_reconstruct1(self):\n        results = self.module.reconstruct(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct2(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct3(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct4(self):\n        results = self.module.reconstruct(images=[cv2.imread('tests/test.jpg')], use_gpu=True, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_reconstruct5(self):\n        self.assertRaises(AssertionError, self.module.reconstruct, paths=['no.jpg'])\n\n    def test_reconstruct6(self):\n        self.assertRaises(AttributeError, self.module.reconstruct, 
images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/README.md",
"content": "# realsr\n\n|模型名称|realsr|\n| :--- | :---: | \n|类别|图像-图像编辑|\n|网络|LP-KPN|\n|数据集|RealSR dataset|\n|是否支持Fine-tuning|否|\n|模型大小|64MB|\n|PSNR|29.05|\n|最新更新日期|2021-02-26|\n\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130789888-a0d4f78e-acd6-44c1-9570-7390e90ae8dc.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - realsr是用于图像和视频的超分模型，该模型基于Toward Real-World Single Image Super-Resolution: A New Benchmark and A New Model，它能够将输入的图片和视频超分四倍。\n  \n  - 更多详情请参考：[realsr](https://github.com/csjcai/RealSR)\n  \n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - NOTE: 使用该模型需要自行安装ffmpeg，若您使用conda环境，推荐使用如下语句进行安装。\n\n      ```shell\n      $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n      ```\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install realsr\n      ```\n      \n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n    \n\n\n## 三、模型API预测\n\n  - ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='realsr')\n    model.predict('/PATH/TO/IMAGE')\n    \n    # model.predict('/PATH/TO/VIDEO')\n    ```\n  - ### 2、API\n\n    - ```python\n      def predict(self, input):\n      ```\n\n      - 超分API，得到超分后的图片或者视频。\n\n\n      - **参数**\n\n        - input (str): 图片或者视频的路径；\n\n      - **返回**\n\n        - 若输入是图片，返回值为：\n          - pred_img(np.ndarray): BGR图片数据；\n          - out_path(str): 保存图片路径。\n\n        - 若输入是视频，返回值为：\n          - frame_pattern_combined(str): 视频超分后单帧数据保存路径；\n          - 
vid_out_path(str): 视频保存路径。\n\n    - ```python\n      def run_image(self, img):\n      ```\n      - 图像超分API， 得到超分后的图片。\n\n      - **参数**\n\n        - img (str｜np.ndarray): 图片路径或则BGR格式图片。\n\n      - **返回**\n\n        - pred_img(np.ndarray): BGR图片数据；\n\n    - ```python\n      def run_video(self, video):\n      ```\n       - 视频超分API， 得到超分后的视频。\n\n       - **参数**\n\n         - video(str): 待处理视频路径。\n\n       - **返回**\n\n         - frame_pattern_combined(str): 视频超分后单帧数据保存路径；\n         - vid_out_path(str): 视频保存路径。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线照片超分服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m realsr\n      ```\n\n    - 这样就完成了一个图像超分的在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n      import requests\n      import json\n      import base64\n\n      import cv2\n      import numpy as np\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tostring()).decode('utf8')\n      def base64_to_cv2(b64str):\n          data = base64.b64decode(b64str.encode('utf8'))\n          data = np.fromstring(data, np.uint8)\n          data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n          return data\n\n      # 发送HTTP请求\n      org_im = cv2.imread('/PATH/TO/IMAGE')\n      data = {'images':cv2_to_base64(org_im)}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/realsr\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n      img = base64_to_cv2(r.json()[\"results\"])\n      cv2.imwrite('/PATH/TO/SAVE/IMAGE', img)\n\n      ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  适配paddlehub2.0版本\n\n* 1.1.0\n\n  更新代码格式\n\n  ```shell\n  $ hub install realsr == 1.1.0\n  ```"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/README_en.md",
    "content": "# realsr\n\n|Module Name |reasr|\n| :--- | :---: | \n|Category |Image editing|\n|Network|LP-KPN|\n|Dataset |RealSR dataset|\n|Fine-tuning supported or not|No|\n|Module Size |64MB|\n|Latest update date|2021-02-26|\n|Data indicators |PSNR29.05|\n\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/133558583-0b7049db-ed1f-4a16-8676-f2141fcb3dee.png\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130789888-a0d4f78e-acd6-44c1-9570-7390e90ae8dc.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - Realsr is a super resolution model for image and video based on \"Toward Real-World Single Image Super-Resolution: A New Benchmark and A New Mode\". This model provides super resolution result with scale factor x4.\n  \n  - For more information, please refer to: [realsr](https://github.com/csjcai/RealSR)\n  \n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n    - **NOTE**: This Module relies on ffmpeg, Please install ffmpeg before using this Module.\n      ```shell\n      $ conda install x264=='1!152.20180717' ffmpeg=4.0.2 -c conda-forge\n      ```\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install realsr\n      ```\n      \n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n    \n\n## III. 
Module API Prediction\n\n  - ### 1、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n\n      model = hub.Module(name='realsr')\n      model.predict('/PATH/TO/IMAGE')\n\n      # model.predict('/PATH/TO/VIDEO')\n      ```\n  - ### 2、API\n\n    - ```python\n      def predict(self, input):\n      ```\n\n      - Prediction API.\n\n      - **Parameter**\n\n          - input (str): image path.\n\n      - **Return**\n\n          - If input is image path, the output is：\n            - pred_img(np.ndarray): image data, ndarray.shape is in the format [H, W, C], BGR.\n            - out_path(str): save path of images.\n\n          - If input is video path, the output is ：\n            - frame_pattern_combined(str): save path of frames from output video.\n            - vid_out_path(str): save path of output video.\n\n    - ```python\n      def run_image(self, img):\n      ```\n      - Prediction API for images.\n\n      - **Parameter**\n\n          - img (str｜np.ndarray): Image data,  str or ndarray. ndarray.shape is in the format [H, W, C], BGR.\n\n      - **Return**\n\n          - pred_img(np.ndarray): Prediction result, ndarray.shape is in the format [H, W, C], BGR.\n\n    - ```python\n      def run_video(self, video):\n      ```\n       -  Prediction API for video.\n\n          - **Parameter**\n\n            - video(str): Video path.\n\n          - **Return**\n\n            - frame_pattern_combined(str): Save path of frames from output video.\n            - vid_out_path(str): Save path of output video.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of image super resolution.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m realsr\n      ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':cv2_to_base64(org_im)}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/realsr\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        img = base64_to_cv2(r.json()[\"results\"])\n        cv2.imwrite('/PATH/TO/SAVE/IMAGE', img)\n\n        ```\n\n\n## V. Release Note\n\n\n- 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Support paddlehub2.0\n\n* 1.1.0\n\n  Update code format\n\n  ```shell\n  $ hub install realsr == 1.1.0\n  ```"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/module.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport glob\n\nfrom tqdm import tqdm\nimport numpy as np\nfrom PIL import Image\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom .rrdb import RRDBNet\nfrom . import utils as U\n\n\n@moduleinfo(name=\"realsr\",\n            type=\"CV/image_editing\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"realsr is a super resolution model\",\n            version=\"1.1.0\")\nclass RealSRPredictor(nn.Layer):\n    def __init__(self, output='output', weight_path=None, load_checkpoint: str = None):\n        super(RealSRPredictor, self).__init__()\n        self.input = input\n        self.output = os.path.join(output, 'RealSR')\n        self.model = RRDBNet(3, 3, 64, 23)\n\n        if load_checkpoint is not None:\n            state_dict = paddle.load(load_checkpoint)\n            self.model.load_dict(state_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'DF2K_JPEG.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system('wget https://paddlegan.bj.bcebos.com/applications/DF2K_JPEG.pdparams -O ' + checkpoint)\n            state_dict = paddle.load(checkpoint)\n            self.model.load_dict(state_dict)\n            
print(\"load pretrained checkpoint success\")\n\n        self.model.eval()\n\n    def norm(self, img):\n        img = np.array(img).transpose([2, 0, 1]).astype('float32') / 255.0\n        return img.astype('float32')\n\n    def denorm(self, img):\n        img = img.transpose((1, 2, 0))\n        return (img * 255).clip(0, 255).astype('uint8')\n\n    def run_image(self, img):\n        if isinstance(img, str):\n            ori_img = Image.open(img).convert('RGB')\n        elif isinstance(img, np.ndarray):\n            ori_img = Image.fromarray(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))\n        elif isinstance(img, Image.Image):\n            ori_img = img\n\n        img = self.norm(ori_img)\n        \n        with paddle.no_grad():\n            x = paddle.to_tensor(img[np.newaxis, ...])\n            out = self.model(x)\n\n        pred_img = self.denorm(out.numpy()[0])\n        pred_img = cv2.cvtColor(pred_img, cv2.COLOR_RGB2BGR)\n\n        return pred_img\n\n    def run_video(self, video):\n        base_name = os.path.basename(video).split('.')[0]\n        output_path = os.path.join(self.output, base_name)\n        pred_frame_path = os.path.join(output_path, 'frames_pred')\n\n        if not os.path.exists(output_path):\n            os.makedirs(output_path)\n\n        if not os.path.exists(pred_frame_path):\n            os.makedirs(pred_frame_path)\n\n        cap = cv2.VideoCapture(video)\n        fps = cap.get(cv2.CAP_PROP_FPS)\n\n        out_path = U.video2frames(video, output_path)\n\n        frames = sorted(glob.glob(os.path.join(out_path, '*.png')))\n\n        for frame in tqdm(frames):\n            pred_img = self.run_image(frame)\n            pred_img = cv2.cvtColor(pred_img, cv2.COLOR_BGR2RGB)\n            pred_img = Image.fromarray(pred_img)\n            frame_name = os.path.basename(frame)\n            pred_img.save(os.path.join(pred_frame_path, frame_name))\n\n        frame_pattern_combined = os.path.join(pred_frame_path, '%08d.png')\n\n        vid_out_path = 
os.path.join(output_path,\n                                    '{}_realsr_out.mp4'.format(base_name))\n        U.frames2video(frame_pattern_combined, vid_out_path, str(int(fps)))\n        print(\"save result at {}\".format(vid_out_path))\n\n        return frame_pattern_combined, vid_out_path\n\n    def predict(self, input):\n        if not os.path.exists(self.output):\n            os.makedirs(self.output)\n\n        if not U.is_image(input):\n            return self.run_video(input)\n        else:\n            pred_img = self.run_image(input)\n\n            out_path = None\n            if self.output:\n                final = cv2.cvtColor(pred_img, cv2.COLOR_BGR2RGB)\n                final = Image.fromarray(final)\n                base_name = os.path.splitext(os.path.basename(input))[0]\n                out_path = os.path.join(self.output, base_name + '.png')\n                final.save(out_path)\n                print('save result at {}'.format(out_path))\n\n            return pred_img, out_path\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = U.base64_to_cv2(images)\n        results = self.run_image(img=images_decode)\n        results = U.cv2_to_base64(results)\n        return results\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/rrdb.py",
    "content": "import functools\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass Registry(object):\n    \"\"\"\n    The registry that provides name -> object mapping, to support third-party users' custom modules.\n    To create a registry (inside segmentron):\n    .. code-block:: python\n        BACKBONE_REGISTRY = Registry('BACKBONE')\n    To register an object:\n    .. code-block:: python\n        @BACKBONE_REGISTRY.register()\n        class MyBackbone():\n            ...\n    Or:\n    .. code-block:: python\n        BACKBONE_REGISTRY.register(MyBackbone)\n    \"\"\"\n\n    def __init__(self, name):\n        \"\"\"\n        Args:\n            name (str): the name of this registry\n        \"\"\"\n        self._name = name\n\n        self._obj_map = {}\n\n    def _do_register(self, name, obj):\n        assert (\n            name not in self._obj_map\n        ), \"An object named '{}' was already registered in '{}' registry!\".format(name, self._name)\n        self._obj_map[name] = obj\n\n    def register(self, obj=None, name=None):\n        \"\"\"\n        Register the given object under the the name `obj.__name__`.\n        Can be used as either a decorator or not. 
See docstring of this class for usage.\n        \"\"\"\n        if obj is None:\n            # used as a decorator\n            def deco(func_or_class, name=name):\n                if name is None:\n                    name = func_or_class.__name__\n                self._do_register(name, func_or_class)\n                return func_or_class\n\n            return deco\n\n        # used as a function call\n        if name is None:\n            name = obj.__name__\n        self._do_register(name, obj)\n\n    def get(self, name):\n        ret = self._obj_map.get(name)\n        if ret is None:\n            raise KeyError(\"No object named '{}' found in '{}' registry!\".format(name, self._name))\n\n        return ret\n\n\nclass ResidualDenseBlock_5C(nn.Layer):\n    def __init__(self, nf=64, gc=32, bias=True):\n        super(ResidualDenseBlock_5C, self).__init__()\n        # gc: growth channel, i.e. intermediate channels\n        self.conv1 = nn.Conv2D(nf, gc, 3, 1, 1, bias_attr=bias)\n        self.conv2 = nn.Conv2D(nf + gc, gc, 3, 1, 1, bias_attr=bias)\n        self.conv3 = nn.Conv2D(nf + 2 * gc, gc, 3, 1, 1, bias_attr=bias)\n        self.conv4 = nn.Conv2D(nf + 3 * gc, gc, 3, 1, 1, bias_attr=bias)\n        self.conv5 = nn.Conv2D(nf + 4 * gc, nf, 3, 1, 1, bias_attr=bias)\n        self.lrelu = nn.LeakyReLU(negative_slope=0.2)\n\n    def forward(self, x):\n        x1 = self.lrelu(self.conv1(x))\n        x2 = self.lrelu(self.conv2(paddle.concat((x, x1), 1)))\n        x3 = self.lrelu(self.conv3(paddle.concat((x, x1, x2), 1)))\n        x4 = self.lrelu(self.conv4(paddle.concat((x, x1, x2, x3), 1)))\n        x5 = self.conv5(paddle.concat((x, x1, x2, x3, x4), 1))\n        return x5 * 0.2 + x\n\n\nclass RRDB(nn.Layer):\n    '''Residual in Residual Dense Block'''\n    def __init__(self, nf, gc=32):\n        super(RRDB, self).__init__()\n        self.RDB1 = ResidualDenseBlock_5C(nf, gc)\n        self.RDB2 = ResidualDenseBlock_5C(nf, gc)\n        self.RDB3 = ResidualDenseBlock_5C(nf, 
gc)\n\n    def forward(self, x):\n        out = self.RDB1(x)\n        out = self.RDB2(out)\n        out = self.RDB3(out)\n        return out * 0.2 + x\n\n\ndef make_layer(block, n_layers):\n    layers = []\n    for _ in range(n_layers):\n        layers.append(block())\n    return nn.Sequential(*layers)\n\nGENERATORS = Registry(\"GENERATOR\")\n\n@GENERATORS.register()\nclass RRDBNet(nn.Layer):\n    def __init__(self, in_nc, out_nc, nf, nb, gc=32):\n        super(RRDBNet, self).__init__()\n        RRDB_block_f = functools.partial(RRDB, nf=nf, gc=gc)\n\n        self.conv_first = nn.Conv2D(in_nc, nf, 3, 1, 1, bias_attr=True)\n        self.RRDB_trunk = make_layer(RRDB_block_f, nb)\n        self.trunk_conv = nn.Conv2D(nf, nf, 3, 1, 1, bias_attr=True)\n        #### upsampling\n        self.upconv1 = nn.Conv2D(nf, nf, 3, 1, 1, bias_attr=True)\n        self.upconv2 = nn.Conv2D(nf, nf, 3, 1, 1, bias_attr=True)\n        self.HRconv = nn.Conv2D(nf, nf, 3, 1, 1, bias_attr=True)\n        self.conv_last = nn.Conv2D(nf, out_nc, 3, 1, 1, bias_attr=True)\n\n        self.lrelu = nn.LeakyReLU(negative_slope=0.2)\n\n    def forward(self, x):\n        fea = self.conv_first(x)\n        trunk = self.trunk_conv(self.RRDB_trunk(fea))\n        fea = fea + trunk\n\n        fea = self.lrelu(\n            self.upconv1(F.interpolate(fea, scale_factor=2, mode='nearest')))\n        fea = self.lrelu(\n            self.upconv2(F.interpolate(fea, scale_factor=2, mode='nearest')))\n        out = self.conv_last(self.lrelu(self.HRconv(fea)))\n\n        return out\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/1sLIu1XKQrY/download?ixid=MnwxMjA3fDB8MXxhbGx8MTJ8fHx8fHwyfHwxNjYyMzQxNDUx&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"realsr\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_run_image1(self):\n        results = self.module.run_image(\n            img='tests/test.jpg'\n        )\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image2(self):\n        results = self.module.run_image(\n            img=cv2.imread('tests/test.jpg')\n        )\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_run_image3(self):\n        self.assertRaises(\n            FileNotFoundError,\n            self.module.run_image,\n            img='no.jpg'\n        )\n\n    def test_predict1(self):\n        pred_img, out_path = self.module.predict(\n            input='tests/test.jpg'\n        )\n        self.assertIsInstance(pred_img, np.ndarray)\n        self.assertIsInstance(out_path, str)\n\n    def test_predict2(self):\n        self.assertRaises(\n            RuntimeError,\n            self.module.predict,\n            input='no.jpg'\n        )\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/realsr/utils.py",
    "content": "import os\nimport sys\nimport base64\n\nimport cv2\nfrom PIL import Image\nimport numpy as np\n\n\ndef video2frames(video_path, outpath, **kargs):\n    def _dict2str(kargs):\n        cmd_str = ''\n        for k, v in kargs.items():\n            cmd_str += (' ' + str(k) + ' ' + str(v))\n        return cmd_str\n\n    ffmpeg = ['ffmpeg ', ' -y -loglevel ', ' error ']\n    vid_name = video_path.split('/')[-1].split('.')[0]\n    out_full_path = os.path.join(outpath, vid_name)\n\n    if not os.path.exists(out_full_path):\n        os.makedirs(out_full_path)\n\n    # video file name\n    outformat = out_full_path + '/%08d.png'\n\n    cmd = ffmpeg\n    cmd = ffmpeg + [' -i ', video_path, ' -start_number ', ' 0 ', outformat]\n\n    cmd = ''.join(cmd) + _dict2str(kargs)\n\n    if os.system(cmd) != 0:\n        raise RuntimeError('ffmpeg process video: {} error'.format(vid_name))\n\n    sys.stdout.flush()\n    return out_full_path\n\n\ndef frames2video(frame_path, video_path, r):\n    ffmpeg = ['ffmpeg ', ' -y -loglevel ', ' error ']\n    cmd = ffmpeg + [\n        ' -r ', r, ' -f ', ' image2 ', ' -i ', frame_path, ' -pix_fmt ', ' yuv420p ', video_path\n    ]\n    cmd = ''.join(cmd)\n\n    if os.system(cmd) != 0:\n        raise RuntimeError('ffmpeg process video: {} error'.format(video_path))\n\n    sys.stdout.flush()\n\n\ndef is_image(input):\n    try:\n        img = Image.open(input)\n        _ = img.size\n        return True\n    except:\n        return False\n    \n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swin2sr_real_sr_x4/README.md",
    "content": "# swin2sr_real_sr_x4\n\n|模型名称|swin2sr_real_sr_x4|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|Swin2SR|\n|数据集|DIV2K / Flickr2K|\n|是否支持Fine-tuning|否|\n|模型大小|68.4MB|\n|指标|-|\n|最新更新日期|2022-10-25|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/884d4d4472b44bf1879606374ed64a7e8d2fec0bcf034285a5cecfc582e8cd65\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c5517af6c3f944c4b281aedc417a4f8c02c0a969d0dd494c9106c4ff2709fc2f\" hspace='10'/>\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/183c5821029f45bbb78d1700ab8297baabba15f82ab4467e88414bbed056ccf0\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - Swin2SR 是一个基于 Swin Transformer v2 的图像超分辨率模型。swin2sr_real_sr_x4 是基于 Swin2SR 的 4 倍现实图像超分辨率模型。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install swin2sr_real_sr_x4\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run swin2sr_real_sr_x4 \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --output_dir \"swin2sr_real_sr_x4_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"swin2sr_real_sr_x4\")\n    result = module.real_sr(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        visualization=True,\n        output_dir='swin2sr_real_sr_x4_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def real_sr(\n        image: Union[str, numpy.ndarray],\n        visualization: bool = True,\n        output_dir: str = 
\"swin2sr_real_sr_x4_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - 超分辨率 API\n\n    - **参数**\n\n      * image (Union\\[str, numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (numpy.ndarray): 图像超分辨率结果 (BGR)；\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个图像超分辨率的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m swin2sr_real_sr_x4\n    ```\n\n    - 这样就完成了一个图像超分辨率服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/swin2sr_real_sr_x4\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # 保存结果\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## 五、参考资料\n\n* 论文：[Swin2SR: SwinV2 Transformer for Compressed Image Super-Resolution and Restoration](https://arxiv.org/abs/2209.11345)\n\n* 官方实现：[mv-lab/swin2sr](https://github.com/mv-lab/swin2sr/)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install swin2sr_real_sr_x4==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swin2sr_real_sr_x4/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .swin2sr import Swin2SR\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='swin2sr_real_sr_x4',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"SwinV2 Transformer for Compressed Image Super-Resolution and Restoration.\",\n)\nclass SwinIRMRealSR(nn.Layer):\n\n    def __init__(self):\n        super(SwinIRMRealSR, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          'Swin2SR_RealworldSR_X4_64_BSRGAN_PSNR.pdparams')\n        self.swin2sr = Swin2SR(upscale=4,\n                               in_chans=3,\n                               img_size=64,\n                               window_size=8,\n                               img_range=1.,\n                               depths=[6, 6, 6, 6, 6, 6],\n                               embed_dim=180,\n                               num_heads=[6, 6, 6, 6, 6, 6],\n                               mlp_ratio=2,\n                               upsampler='nearest+conv',\n                               resi_connection='1conv')\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.swin2sr.set_state_dict(state_dict)\n        self.swin2sr.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = cv2.cvtColor(img, 
cv2.COLOR_BGR2RGB)\n        img = img.transpose((2, 0, 1))\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        img = img.transpose((1, 2, 0))\n        img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)\n        return img.astype(np.uint8)\n\n    def real_sr(self,\n                image: Union[str, np.ndarray],\n                visualization: bool = True,\n                output_dir: str = \"swin2sr_real_sr_x4_output\") -> np.ndarray:\n        if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_COLOR)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n        else:\n            raise TypeError(\"image should be a str or np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n\n            img_output = self.swin2sr(img_input)\n            img_output = img_output.numpy()[0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='swin2sr_real_sr_x4_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.real_sr(image=args.input_path, visualization=True, output_dir=args.output_dir)\n        return 'Results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.real_sr(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swin2sr_real_sr_x4/swin2sr.py",
    "content": "import collections.abc\nimport math\nfrom itertools import repeat\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef _ntuple(n):\n\n    def parse(x):\n        if isinstance(x, collections.abc.Iterable):\n            return x\n        return tuple(repeat(x, n))\n\n    return parse\n\n\nto_2tuple = _ntuple(2)\n\n\nclass Mlp(nn.Layer):\n\n    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):\n        super().__init__()\n        out_features = out_features or in_features\n        hidden_features = hidden_features or in_features\n        self.fc1 = nn.Linear(in_features, hidden_features)\n        self.act = act_layer()\n        self.fc2 = nn.Linear(hidden_features, out_features)\n        self.drop = nn.Dropout(drop)\n\n    def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.drop(x)\n        x = self.fc2(x)\n        x = self.drop(x)\n        return x\n\n\ndef window_partition(x, window_size):\n    \"\"\"\n    Args:\n        x: (B, H, W, C)\n        window_size (int): window size\n    Returns:\n        windows: (num_windows*B, window_size, window_size, C)\n    \"\"\"\n    B, H, W, C = x.shape\n    x = x.reshape((B, H // window_size, window_size, W // window_size, window_size, C))\n    windows = x.transpose((0, 1, 3, 2, 4, 5)).reshape((-1, window_size, window_size, C))\n    return windows\n\n\ndef window_reverse(windows, window_size, H, W):\n    \"\"\"\n    Args:\n        windows: (num_windows*B, window_size, window_size, C)\n        window_size (int): Window size\n        H (int): Height of image\n        W (int): Width of image\n    Returns:\n        x: (B, H, W, C)\n    \"\"\"\n    B = int(windows.shape[0] / (H * W / window_size / window_size))\n    x = windows.reshape((B, H // window_size, W // window_size, window_size, window_size, -1))\n    x = x.transpose((0, 1, 3, 2, 4, 5)).reshape((B, H, W, -1))\n    return 
x\n\n\nclass WindowAttention(nn.Layer):\n    r\"\"\" Window based multi-head self attention (W-MSA) module with relative position bias.\n    It supports both shifted and non-shifted windows.\n    Args:\n        dim (int): Number of input channels.\n        window_size (tuple[int]): The height and width of the window.\n        num_heads (int): Number of attention heads.\n        qkv_bias (bool, optional):  If True, add a learnable bias to query, key, value. Default: True\n        attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0\n        proj_drop (float, optional): Dropout ratio of output. Default: 0.0\n        pretrained_window_size (tuple[int]): The height and width of the window in pre-training.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 window_size,\n                 num_heads,\n                 qkv_bias=True,\n                 attn_drop=0.,\n                 proj_drop=0.,\n                 pretrained_window_size=[0, 0]):\n\n        super().__init__()\n        self.dim = dim\n        self.window_size = window_size  # Wh, Ww\n        self.pretrained_window_size = pretrained_window_size\n        self.num_heads = num_heads\n\n        self.logit_scale = self.create_parameter(shape=(num_heads, 1, 1),\n                                                 dtype=paddle.float32,\n                                                 default_initializer=nn.initializer.Assign(\n                                                     paddle.log(10 * paddle.ones((num_heads, 1, 1)))))\n\n        # mlp to generate continuous relative position bias\n        self.cpb_mlp = nn.Sequential(nn.Linear(2, 512, bias_attr=True), nn.ReLU(),\n                                     nn.Linear(512, num_heads, bias_attr=False))\n\n        # get relative_coords_table\n        relative_coords_h = paddle.arange(-(self.window_size[0] - 1), self.window_size[0], dtype=paddle.float32)\n        relative_coords_w = paddle.arange(-(self.window_size[1] 
- 1), self.window_size[1], dtype=paddle.float32)\n        relative_coords_table = paddle.stack(paddle.meshgrid([relative_coords_h, relative_coords_w])).transpose(\n            (1, 2, 0)).unsqueeze(0)  # 1, 2*Wh-1, 2*Ww-1, 2\n        if pretrained_window_size[0] > 0:\n            relative_coords_table[:, :, :, 0] /= (pretrained_window_size[0] - 1)\n            relative_coords_table[:, :, :, 1] /= (pretrained_window_size[1] - 1)\n        else:\n            relative_coords_table[:, :, :, 0] /= (self.window_size[0] - 1)\n            relative_coords_table[:, :, :, 1] /= (self.window_size[1] - 1)\n        relative_coords_table *= 8  # normalize to -8, 8\n        relative_coords_table = paddle.sign(relative_coords_table) * paddle.log2(\n            paddle.abs(relative_coords_table) + 1.0) / np.log2(8)\n\n        self.register_buffer(\"relative_coords_table\", relative_coords_table)\n\n        # get pair-wise relative position index for each token inside the window\n        coords_h = paddle.arange(self.window_size[0])\n        coords_w = paddle.arange(self.window_size[1])\n        coords = paddle.stack(paddle.meshgrid([coords_h, coords_w]))  # 2, Wh, Ww\n        coords_flatten = paddle.flatten(coords, 1)  # 2, Wh*Ww\n        relative_coords = coords_flatten[:, :, None] - \\\n            coords_flatten[:, None, :]  # 2, Wh*Ww, Wh*Ww\n        relative_coords = relative_coords.transpose((1, 2, 0))  # Wh*Ww, Wh*Ww, 2\n        relative_coords[:, :, 0] += self.window_size[0] - \\\n            1  # shift to start from 0\n        relative_coords[:, :, 1] += self.window_size[1] - 1\n        relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1\n        relative_position_index = relative_coords.sum(-1)  # Wh*Ww, Wh*Ww\n        self.register_buffer(\"relative_position_index\", relative_position_index)\n\n        self.qkv = nn.Linear(dim, dim * 3, bias_attr=False)\n        if qkv_bias:\n            self.q_bias = self.create_parameter(shape=(dim, ),\n                                 
               dtype=paddle.float32,\n                                                default_initializer=nn.initializer.Constant(0.0))\n            self.v_bias = self.create_parameter(shape=(dim, ),\n                                                dtype=paddle.float32,\n                                                default_initializer=nn.initializer.Constant(0.0))\n        else:\n            self.q_bias = None\n            self.v_bias = None\n        self.attn_drop = nn.Dropout(attn_drop)\n        self.proj = nn.Linear(dim, dim)\n        self.proj_drop = nn.Dropout(proj_drop)\n        self.softmax = nn.Softmax(axis=-1)\n\n    def forward(self, x, mask=None):\n        \"\"\"\n        Args:\n            x: input features with shape of (num_windows*B, N, C)\n            mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None\n        \"\"\"\n        B_, N, C = x.shape\n        qkv_bias = None\n        if self.q_bias is not None:\n            qkv_bias = paddle.concat((self.q_bias, paddle.zeros_like(self.v_bias), self.v_bias))\n        qkv = F.linear(x=x, weight=self.qkv.weight, bias=qkv_bias)\n        qkv = qkv.reshape((B_, N, 3, self.num_heads, -1)).transpose((2, 0, 3, 1, 4))\n        # make torchscript happy (cannot use tensor as tuple)\n        q, k, v = qkv[0], qkv[1], qkv[2]\n\n        # cosine attention\n        attn = (F.normalize(q, axis=-1) @ F.normalize(k, axis=-1).transpose((0, 1, 3, 2)))\n        logit_scale = paddle.clip(self.logit_scale, max=paddle.log(paddle.to_tensor(1. 
/ 0.01))).exp()\n        attn = attn * logit_scale\n\n        relative_position_bias_table = self.cpb_mlp(self.relative_coords_table).reshape((-1, self.num_heads))\n        relative_position_bias = relative_position_bias_table[self.relative_position_index.reshape((-1, ))].reshape(\n            (self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1],\n             -1))  # Wh*Ww,Wh*Ww,nH\n        relative_position_bias = relative_position_bias.transpose((2, 0, 1))  # nH, Wh*Ww, Wh*Ww\n        relative_position_bias = 16 * \\\n            nn.functional.sigmoid(relative_position_bias)\n        attn = attn + relative_position_bias.unsqueeze(0)\n\n        if mask is not None:\n            nW = mask.shape[0]\n            attn = attn.reshape((B_ // nW, nW, self.num_heads, N, N)) + mask.unsqueeze(1).unsqueeze(0)\n            attn = attn.reshape((-1, self.num_heads, N, N))\n            attn = self.softmax(attn)\n        else:\n            attn = self.softmax(attn)\n\n        attn = self.attn_drop(attn)\n\n        x = (attn @ v).transpose((0, 2, 1, 3)).reshape((B_, N, C))\n        x = self.proj(x)\n        x = self.proj_drop(x)\n        return x\n\n    def extra_repr(self) -> str:\n        return f'dim={self.dim}, window_size={self.window_size}, ' \\\n               f'pretrained_window_size={self.pretrained_window_size}, num_heads={self.num_heads}'\n\n    def flops(self, N):\n        # calculate flops for 1 window with token length of N\n        flops = 0\n        # qkv = self.qkv(x)\n        flops += N * self.dim * 3 * self.dim\n        # attn = (q @ k.transpose(-2, -1))\n        flops += self.num_heads * N * (self.dim // self.num_heads) * N\n        #  x = (attn @ v)\n        flops += self.num_heads * N * N * (self.dim // self.num_heads)\n        # x = self.proj(x)\n        flops += N * self.dim * self.dim\n        return flops\n\n\nclass SwinTransformerBlock(nn.Layer):\n    r\"\"\" Swin Transformer Block.\n    Args:\n        dim (int): Number 
of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        num_heads (int): Number of attention heads.\n        window_size (int): Window size.\n        shift_size (int): Shift size for SW-MSA.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float, optional): Stochastic depth rate. Default: 0.0\n        act_layer (nn.Layer, optional): Activation layer. Default: nn.GELU\n        norm_layer (nn.Layer, optional): Normalization layer.  Default: nn.LayerNorm\n        pretrained_window_size (int): Window size in pre-training.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 num_heads,\n                 window_size=7,\n                 shift_size=0,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 act_layer=nn.GELU,\n                 norm_layer=nn.LayerNorm,\n                 pretrained_window_size=0):\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.num_heads = num_heads\n        self.window_size = window_size\n        self.shift_size = shift_size\n        self.mlp_ratio = mlp_ratio\n        if min(self.input_resolution) <= self.window_size:\n            # if window size is larger than input resolution, we don't partition windows\n            self.shift_size = 0\n            self.window_size = min(self.input_resolution)\n        assert 0 <= self.shift_size < self.window_size, \"shift_size must be in [0, window_size)\"\n\n        self.norm1 = norm_layer(dim)\n        self.attn = WindowAttention(dim,\n                         
           window_size=to_2tuple(self.window_size),\n                                    num_heads=num_heads,\n                                    qkv_bias=qkv_bias,\n                                    attn_drop=attn_drop,\n                                    proj_drop=drop,\n                                    pretrained_window_size=to_2tuple(pretrained_window_size))\n\n        # self.drop_path = DropPath(drop_path) if drop_path > 0. else nn.Identity()\n        self.drop_path = nn.Identity()\n        self.norm2 = norm_layer(dim)\n        mlp_hidden_dim = int(dim * mlp_ratio)\n        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)\n\n        if self.shift_size > 0:\n            attn_mask = self.calculate_mask(self.input_resolution)\n        else:\n            attn_mask = None\n\n        self.register_buffer(\"attn_mask\", attn_mask)\n\n    def calculate_mask(self, x_size):\n        # calculate attention mask for SW-MSA\n        H, W = x_size\n        img_mask = paddle.zeros((1, H, W, 1))  # 1 H W 1\n        h_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        w_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        cnt = 0\n        for h in h_slices:\n            for w in w_slices:\n                img_mask[:, h, w, :] = cnt\n                cnt += 1\n\n        # nW, window_size, window_size, 1\n        mask_windows = window_partition(img_mask, self.window_size)\n        mask_windows = mask_windows.reshape((-1, self.window_size * self.window_size))\n        attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)\n\n        _h = paddle.full_like(attn_mask, 
-100.0, dtype=paddle.float32)\n        _z = paddle.full_like(attn_mask, 0.0, dtype=paddle.float32)\n        attn_mask = paddle.where(attn_mask != 0, _h, _z)\n\n        # attn_mask = attn_mask.masked_fill(attn_mask != 0, float(-100.0)).masked_fill(attn_mask == 0, float(0.0))\n\n        return attn_mask\n\n    def forward(self, x, x_size):\n        H, W = x_size\n        B, L, C = x.shape\n        # assert L == H * W, \"input feature has wrong size\"\n\n        shortcut = x\n        x = x.reshape((B, H, W, C))\n\n        # cyclic shift\n        if self.shift_size > 0:\n            shifted_x = paddle.roll(x, shifts=(-self.shift_size, -self.shift_size), axis=(1, 2))\n        else:\n            shifted_x = x\n\n        # partition windows\n        # nW*B, window_size, window_size, C\n        x_windows = window_partition(shifted_x, self.window_size)\n        # nW*B, window_size*window_size, C\n        x_windows = x_windows.reshape((-1, self.window_size * self.window_size, C))\n\n        # W-MSA/SW-MSA (compatible with testing on images whose shapes are a multiple of the window size)\n        if self.input_resolution == x_size:\n            # nW*B, window_size*window_size, C\n            attn_windows = self.attn(x_windows, mask=self.attn_mask)\n        else:\n            attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size))\n\n        # merge windows\n        attn_windows = attn_windows.reshape((-1, self.window_size, self.window_size, C))\n        shifted_x = window_reverse(attn_windows, self.window_size, H, W)  # B H' W' C\n\n        # reverse cyclic shift\n        if self.shift_size > 0:\n            x = paddle.roll(shifted_x, shifts=(self.shift_size, self.shift_size), axis=(1, 2))\n        else:\n            x = shifted_x\n        x = x.reshape((B, H * W, C))\n        x = shortcut + self.drop_path(self.norm1(x))\n\n        # FFN\n        x = x + self.drop_path(self.norm2(self.mlp(x)))\n\n        return x\n\n    def extra_repr(self) -> str:\n        
return f\"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, \" \\\n               f\"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}\"\n\n    def flops(self):\n        flops = 0\n        H, W = self.input_resolution\n        # norm1\n        flops += self.dim * H * W\n        # W-MSA/SW-MSA\n        nW = H * W / self.window_size / self.window_size\n        flops += nW * self.attn.flops(self.window_size * self.window_size)\n        # mlp\n        flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio\n        # norm2\n        flops += self.dim * H * W\n        return flops\n\n\nclass PatchMerging(nn.Layer):\n    r\"\"\" Patch Merging Layer.\n    Args:\n        input_resolution (tuple[int]): Resolution of input feature.\n        dim (int): Number of input channels.\n        norm_layer (nn.Layer, optional): Normalization layer.  Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.dim = dim\n        self.reduction = nn.Linear(4 * dim, 2 * dim, bias_attr=False)\n        self.norm = norm_layer(2 * dim)\n\n    def forward(self, x):\n        \"\"\"\n        x: B, H*W, C\n        \"\"\"\n        H, W = self.input_resolution\n        B, L, C = x.shape\n        assert L == H * W, \"input feature has wrong size\"\n        assert H % 2 == 0 and W % 2 == 0, f\"x size ({H}*{W}) is not even.\"\n\n        x = x.reshape((B, H, W, C))\n\n        x0 = x[:, 0::2, 0::2, :]  # B H/2 W/2 C\n        x1 = x[:, 1::2, 0::2, :]  # B H/2 W/2 C\n        x2 = x[:, 0::2, 1::2, :]  # B H/2 W/2 C\n        x3 = x[:, 1::2, 1::2, :]  # B H/2 W/2 C\n        x = paddle.concat([x0, x1, x2, x3], -1)  # B H/2 W/2 4*C\n        x = x.reshape((B, -1, 4 * C))  # B H/2*W/2 4*C\n\n        x = self.reduction(x)\n        x = self.norm(x)\n\n        return x\n\n    def extra_repr(self) -> 
str:\n        return f\"input_resolution={self.input_resolution}, dim={self.dim}\"\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim\n        flops += H * W * self.dim // 2\n        return flops\n\n\nclass BasicLayer(nn.Layer):\n    \"\"\" A basic Swin Transformer layer for one stage.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. 
Default: False.\n        pretrained_window_size (int): Local window size in pre-training.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False,\n                 pretrained_window_size=0):\n\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.depth = depth\n        self.use_checkpoint = use_checkpoint\n\n        # build blocks\n        self.blocks = nn.LayerList([\n            SwinTransformerBlock(dim=dim,\n                                 input_resolution=input_resolution,\n                                 num_heads=num_heads,\n                                 window_size=window_size,\n                                 shift_size=0 if (i % 2 == 0) else window_size // 2,\n                                 mlp_ratio=mlp_ratio,\n                                 qkv_bias=qkv_bias,\n                                 drop=drop,\n                                 attn_drop=attn_drop,\n                                 drop_path=drop_path[i] if isinstance(drop_path, list) else drop_path,\n                                 norm_layer=norm_layer,\n                                 pretrained_window_size=pretrained_window_size) for i in range(depth)\n        ])\n\n        # patch merging layer\n        if downsample is not None:\n            self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)\n        else:\n            self.downsample = None\n\n    def forward(self, x, x_size):\n        for blk in self.blocks:\n            x = blk(x, x_size)\n        if self.downsample is not None:\n            x 
= self.downsample(x)\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}\"\n\n    def flops(self):\n        flops = 0\n        for blk in self.blocks:\n            flops += blk.flops()\n        if self.downsample is not None:\n            flops += self.downsample.flops()\n        return flops\n\n\nclass PatchEmbed(nn.Layer):\n    r\"\"\" Image to Patch Embedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n        self.proj = nn.Conv2D(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size)\n        if norm_layer is not None:\n            self.norm = norm_layer(embed_dim)\n        else:\n            self.norm = None\n\n    def forward(self, x):\n        B, C, H, W = x.shape\n        # FIXME look at relaxing size constraints\n        # assert H == self.img_size[0] and W == self.img_size[1],\n        #     f\"Input image size ({H}*{W}) doesn't match model ({self.img_size[0]}*{self.img_size[1]}).\"\n        x = self.proj(x).flatten(2).transpose((0, 2, 1))  # B Ph*Pw C\n        if 
self.norm is not None:\n            x = self.norm(x)\n        return x\n\n    def flops(self):\n        Ho, Wo = self.patches_resolution\n        flops = Ho * Wo * self.embed_dim * self.in_chans * \\\n            (self.patch_size[0] * self.patch_size[1])\n        if self.norm is not None:\n            flops += Ho * Wo * self.embed_dim\n        return flops\n\n\nclass RSTB(nn.Layer):\n    \"\"\"Residual Swin Transformer Block (RSTB).\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. 
Default: False.\n        img_size: Input image size.\n        patch_size: Patch size.\n        resi_connection: The convolutional block before residual connection.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False,\n                 img_size=224,\n                 patch_size=4,\n                 resi_connection='1conv'):\n        super(RSTB, self).__init__()\n\n        self.dim = dim\n        self.input_resolution = input_resolution\n\n        self.residual_group = BasicLayer(dim=dim,\n                                         input_resolution=input_resolution,\n                                         depth=depth,\n                                         num_heads=num_heads,\n                                         window_size=window_size,\n                                         mlp_ratio=mlp_ratio,\n                                         qkv_bias=qkv_bias,\n                                         drop=drop,\n                                         attn_drop=attn_drop,\n                                         drop_path=drop_path,\n                                         norm_layer=norm_layer,\n                                         downsample=downsample,\n                                         use_checkpoint=use_checkpoint)\n\n        if resi_connection == '1conv':\n            self.conv = nn.Conv2D(dim, dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv = nn.Sequential(nn.Conv2D(dim, dim // 4, 3, 1, 1), nn.LeakyReLU(negative_slope=0.2, ),\n                                      
nn.Conv2D(dim // 4, dim // 4, 1, 1, 0), nn.LeakyReLU(negative_slope=0.2, ),\n                                      nn.Conv2D(dim // 4, dim, 3, 1, 1))\n\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=dim,\n                                      embed_dim=dim,\n                                      norm_layer=None)\n\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=dim,\n                                          embed_dim=dim,\n                                          norm_layer=None)\n\n    def forward(self, x, x_size):\n        return self.patch_embed(self.conv(self.patch_unembed(self.residual_group(x, x_size), x_size))) + x\n\n    def flops(self):\n        flops = 0\n        flops += self.residual_group.flops()\n        H, W = self.input_resolution\n        flops += H * W * self.dim * self.dim * 9\n        flops += self.patch_embed.flops()\n        flops += self.patch_unembed.flops()\n\n        return flops\n\n\nclass PatchUnEmbed(nn.Layer):\n    r\"\"\" Image to Patch Unembedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n    def forward(self, x, x_size):\n        B, HW, C = x.shape\n        x = x.transpose((0, 2, 1)).reshape((B, self.embed_dim, x_size[0], x_size[1]))  # B C Ph Pw\n        return x\n\n    def flops(self):\n        flops = 0\n        return flops\n\n\nclass Upsample(nn.Sequential):\n    \"\"\"Upsample module.\n    Args:\n        scale (int): Scale factor. Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat):\n        m = []\n        if (scale & (scale - 1)) == 0:  # scale = 2^n\n            for _ in range(int(math.log(scale, 2))):\n                m.append(nn.Conv2D(num_feat, 4 * num_feat, 3, 1, 1))\n                m.append(nn.PixelShuffle(2))\n        elif scale == 3:\n            m.append(nn.Conv2D(num_feat, 9 * num_feat, 3, 1, 1))\n            m.append(nn.PixelShuffle(3))\n        else:\n            raise ValueError(f'scale {scale} is not supported. '\n                             'Supported scales: 2^n and 3.')\n        super(Upsample, self).__init__(*m)\n\n\nclass Upsample_hf(nn.Sequential):\n    \"\"\"Upsample module.\n    Args:\n        scale (int): Scale factor. 
Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat):\n        m = []\n        if (scale & (scale - 1)) == 0:  # scale = 2^n\n            for _ in range(int(math.log(scale, 2))):\n                m.append(nn.Conv2D(num_feat, 4 * num_feat, 3, 1, 1))\n                m.append(nn.PixelShuffle(2))\n        elif scale == 3:\n            m.append(nn.Conv2D(num_feat, 9 * num_feat, 3, 1, 1))\n            m.append(nn.PixelShuffle(3))\n        else:\n            raise ValueError(f'scale {scale} is not supported. '\n                             'Supported scales: 2^n and 3.')\n        super(Upsample_hf, self).__init__(*m)\n\n\nclass UpsampleOneStep(nn.Sequential):\n    \"\"\"UpsampleOneStep module (differs from Upsample in that it always uses exactly one conv + one pixelshuffle).\n       Used in lightweight SR to save parameters.\n    Args:\n        scale (int): Scale factor. Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat, num_out_ch, input_resolution=None):\n        self.num_feat = num_feat\n        self.input_resolution = input_resolution\n        m = []\n        m.append(nn.Conv2D(num_feat, (scale**2) * num_out_ch, 3, 1, 1))\n        m.append(nn.PixelShuffle(scale))\n        super(UpsampleOneStep, self).__init__(*m)\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.num_feat * 3 * 9\n        return flops\n\n\nclass Swin2SR(nn.Layer):\n    r\"\"\" Swin2SR\n        A Paddle implementation of `Swin2SR: SwinV2 Transformer for Compressed Image Super-Resolution and Restoration`.\n    Args:\n        img_size (int | tuple(int)): Input image size. Default 64\n        patch_size (int | tuple(int)): Patch size. Default: 1\n        in_chans (int): Number of input image channels. Default: 3\n        embed_dim (int): Patch embedding dimension. 
Default: 96\n        depths (tuple(int)): Depth of each Swin Transformer layer.\n        num_heads (tuple(int)): Number of attention heads in different layers.\n        window_size (int): Window size. Default: 7\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4\n        qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True\n        drop_rate (float): Dropout rate. Default: 0\n        attn_drop_rate (float): Attention dropout rate. Default: 0\n        drop_path_rate (float): Stochastic depth rate. Default: 0.1\n        norm_layer (nn.Layer): Normalization layer. Default: nn.LayerNorm.\n        ape (bool): If True, add absolute position embedding to the patch embedding. Default: False\n        patch_norm (bool): If True, add normalization after patch embedding. Default: True\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False\n        upscale: Upscale factor. 2/3/4/8 for image SR, 1 for denoising and compression artifact reduction\n        img_range: Image range. 1. or 255.\n        upsampler: The reconstruction module. 'pixelshuffle'/'pixelshuffledirect'/'nearest+conv'/None\n        resi_connection: The convolutional block before residual connection. 
'1conv'/'3conv'\n    \"\"\"\n\n    def __init__(self,\n                 img_size=64,\n                 patch_size=1,\n                 in_chans=3,\n                 embed_dim=96,\n                 depths=[6, 6, 6, 6],\n                 num_heads=[6, 6, 6, 6],\n                 window_size=7,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 drop_rate=0.,\n                 attn_drop_rate=0.,\n                 drop_path_rate=0.1,\n                 norm_layer=nn.LayerNorm,\n                 ape=False,\n                 patch_norm=True,\n                 use_checkpoint=False,\n                 upscale=2,\n                 img_range=1.,\n                 upsampler='',\n                 resi_connection='1conv',\n                 **kwargs):\n        super(Swin2SR, self).__init__()\n        num_in_ch = in_chans\n        num_out_ch = in_chans\n        num_feat = 64\n        self.img_range = img_range\n        if in_chans == 3:\n            rgb_mean = (0.4488, 0.4371, 0.4040)\n            self.mean = paddle.to_tensor(rgb_mean).reshape((1, 3, 1, 1))\n        else:\n            self.mean = paddle.zeros((1, 1, 1, 1))\n        self.upscale = upscale\n        self.upsampler = upsampler\n        self.window_size = window_size\n\n        #####################################################################################################\n        ################################### 1, shallow feature extraction ###################################\n        self.conv_first = nn.Conv2D(num_in_ch, embed_dim, 3, 1, 1)\n\n        #####################################################################################################\n        ################################### 2, deep feature extraction ######################################\n        self.num_layers = len(depths)\n        self.embed_dim = embed_dim\n        self.ape = ape\n        self.patch_norm = patch_norm\n        self.num_features = embed_dim\n        self.mlp_ratio = mlp_ratio\n\n      
  # split image into non-overlapping patches\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=embed_dim,\n                                      embed_dim=embed_dim,\n                                      norm_layer=norm_layer if self.patch_norm else None)\n        num_patches = self.patch_embed.num_patches\n        patches_resolution = self.patch_embed.patches_resolution\n        self.patches_resolution = patches_resolution\n\n        # merge non-overlapping patches into image\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=embed_dim,\n                                          embed_dim=embed_dim,\n                                          norm_layer=norm_layer if self.patch_norm else None)\n\n        # absolute position embedding\n        if self.ape:\n            self.absolute_pos_embed = self.create_parameter(shape=(1, num_patches, embed_dim),\n                                                            dtype=paddle.float32,\n                                                            default_initializer=nn.initializer.Constant(0.0))\n            # trunc_normal_(self.absolute_pos_embed, std=.02)\n\n        self.pos_drop = nn.Dropout(p=drop_rate)\n\n        # stochastic depth\n        dpr = [x.item() for x in paddle.linspace(0, drop_path_rate, sum(depths))]  # stochastic depth decay rule\n\n        # build Residual Swin Transformer blocks (RSTB)\n        self.layers = nn.LayerList()\n        for i_layer in range(self.num_layers):\n            layer = RSTB(\n                dim=embed_dim,\n                input_resolution=(patches_resolution[0], patches_resolution[1]),\n                depth=depths[i_layer],\n                num_heads=num_heads[i_layer],\n                window_size=window_size,\n                
mlp_ratio=self.mlp_ratio,\n                qkv_bias=qkv_bias,\n                drop=drop_rate,\n                attn_drop=attn_drop_rate,\n                drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],  # no impact on SR results\n                norm_layer=norm_layer,\n                downsample=None,\n                use_checkpoint=use_checkpoint,\n                img_size=img_size,\n                patch_size=patch_size,\n                resi_connection=resi_connection)\n            self.layers.append(layer)\n\n        if self.upsampler == 'pixelshuffle_hf':\n            self.layers_hf = nn.LayerList()\n            for i_layer in range(self.num_layers):\n                layer = RSTB(\n                    dim=embed_dim,\n                    input_resolution=(patches_resolution[0], patches_resolution[1]),\n                    depth=depths[i_layer],\n                    num_heads=num_heads[i_layer],\n                    window_size=window_size,\n                    mlp_ratio=self.mlp_ratio,\n                    qkv_bias=qkv_bias,\n                    drop=drop_rate,\n                    attn_drop=attn_drop_rate,\n                    drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],  # no impact on SR results\n                    norm_layer=norm_layer,\n                    downsample=None,\n                    use_checkpoint=use_checkpoint,\n                    img_size=img_size,\n                    patch_size=patch_size,\n                    resi_connection=resi_connection)\n                self.layers_hf.append(layer)\n\n        self.norm = norm_layer(self.num_features)\n\n        # build the last conv layer in deep feature extraction\n        if resi_connection == '1conv':\n            self.conv_after_body = nn.Conv2D(embed_dim, embed_dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv_after_body = nn.Sequential(nn.Conv2D(embed_dim, embed_dim // 4, 3, 1, 1),\n   
                                              nn.LeakyReLU(negative_slope=0.2, ),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim // 4, 1, 1, 0),\n                                                 nn.LeakyReLU(negative_slope=0.2, ),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim, 3, 1, 1))\n\n        #####################################################################################################\n        ################################ 3, high quality image reconstruction ################################\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n        elif self.upsampler == 'pixelshuffle_aux':\n            self.conv_bicubic = nn.Conv2D(num_in_ch, num_feat, 3, 1, 1)\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_aux = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.conv_after_aux = nn.Sequential(nn.Conv2D(3, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n\n        elif self.upsampler == 'pixelshuffle_hf':\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.upsample_hf = Upsample_hf(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.conv_first_hf = nn.Sequential(nn.Conv2D(num_feat, embed_dim, 3, 1, 1), nn.LeakyReLU())\n            self.conv_after_body_hf = nn.Conv2D(embed_dim, embed_dim, 3, 1, 1)\n            
self.conv_before_upsample_hf = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_last_hf = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR (to save parameters)\n            self.upsample = UpsampleOneStep(upscale, embed_dim, num_out_ch,\n                                            (patches_resolution[0], patches_resolution[1]))\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR (less artifacts)\n            assert self.upscale == 4, 'only support x4 now.'\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_up1 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_up2 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_hr = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.lrelu = nn.LeakyReLU(negative_slope=0.2, )\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            self.conv_last = nn.Conv2D(embed_dim, num_out_ch, 3, 1, 1)\n\n    def check_image_size(self, x):\n        _, _, h, w = x.shape\n        mod_pad_h = (self.window_size - h % self.window_size) % self.window_size\n        mod_pad_w = (self.window_size - w % self.window_size) % self.window_size\n        x = F.pad(x, (0, mod_pad_w, 0, mod_pad_h), 'reflect')\n        return x\n\n    def forward_features(self, x):\n        x_size = (x.shape[2], x.shape[3])\n        x = self.patch_embed(x)\n        if self.ape:\n            x = x + self.absolute_pos_embed\n        x = self.pos_drop(x)\n\n        for layer in self.layers:\n            x = layer(x, x_size)\n\n        x = self.norm(x)  # B L C\n        x = self.patch_unembed(x, x_size)\n\n        return x\n\n    def forward_features_hf(self, x):\n        x_size = (x.shape[2], 
x.shape[3])\n        x = self.patch_embed(x)\n        if self.ape:\n            x = x + self.absolute_pos_embed\n        x = self.pos_drop(x)\n\n        for layer in self.layers_hf:\n            x = layer(x, x_size)\n\n        x = self.norm(x)  # B L C\n        x = self.patch_unembed(x, x_size)\n\n        return x\n\n    def forward(self, x):\n        H, W = x.shape[2:]\n        x = self.check_image_size(x)\n\n        self.mean = self.mean.cast(x.dtype)\n        x = (x - self.mean) * self.img_range\n\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.conv_last(self.upsample(x))\n        elif self.upsampler == 'pixelshuffle_aux':\n            bicubic = F.interpolate(x, size=(H * self.upscale, W * self.upscale), mode='bicubic', align_corners=False)\n            bicubic = self.conv_bicubic(bicubic)\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            aux = self.conv_aux(x)  # b, 3, LR_H, LR_W\n            x = self.conv_after_aux(aux)\n            x = self.upsample(x)[:, :, :H * self.upscale, :W * self.upscale] + \\\n                bicubic[:, :, :H * self.upscale, :W * self.upscale]\n            x = self.conv_last(x)\n            aux = aux / self.img_range + self.mean\n        elif self.upsampler == 'pixelshuffle_hf':\n            # for classical SR with HF\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x_before = self.conv_before_upsample(x)\n            x_out = self.conv_last(self.upsample(x_before))\n\n            x_hf = self.conv_first_hf(x_before)\n            x_hf = self.conv_after_body_hf(self.forward_features_hf(x_hf)) + x_hf\n            x_hf = self.conv_before_upsample_hf(x_hf)\n        
    x_hf = self.conv_last_hf(self.upsample_hf(x_hf))\n            x = x_out + x_hf\n            x_hf = x_hf / self.img_range + self.mean\n\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.upsample(x)\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.lrelu(self.conv_up1(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            x = self.lrelu(self.conv_up2(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            x = self.conv_last(self.lrelu(self.conv_hr(x)))\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            x_first = self.conv_first(x)\n            res = self.conv_after_body(self.forward_features(x_first)) + x_first\n            x = x + self.conv_last(res)\n\n        x = x / self.img_range + self.mean\n        if self.upsampler == \"pixelshuffle_aux\":\n            return x[:, :, :H * self.upscale, :W * self.upscale], aux\n\n        elif self.upsampler == \"pixelshuffle_hf\":\n            x_out = x_out / self.img_range + self.mean\n            return x_out[:, :, :H * self.upscale, :W *\n                         self.upscale], x[:, :, :H * self.upscale, :W *\n                                          self.upscale], x_hf[:, :, :H * self.upscale, :W * self.upscale]\n\n        else:\n            return x[:, :, :H * self.upscale, :W * self.upscale]\n\n    def flops(self):\n        flops = 0\n        H, W = self.patches_resolution\n        flops += H * W * 3 * self.embed_dim * 9\n        flops += self.patch_embed.flops()\n        for i, layer in enumerate(self.layers):\n            flops += layer.flops()\n        flops += 
H * W * 3 * self.embed_dim * self.embed_dim\n        flops += self.upsample.flops()\n        return flops\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swin2sr_real_sr_x4/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"swin2sr_real_sr_x4\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('swin2sr_real_sr_x4_output')\n\n    def test_real_sr1(self):\n        results = self.module.real_sr(image='tests/test.jpg', visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr2(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr3(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr4(self):\n        self.assertRaises(Exception, self.module.real_sr, image=['tests/test.jpg'])\n\n    def test_real_sr5(self):\n        self.assertRaises(FileNotFoundError, self.module.real_sr, image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_l_real_sr_x4/README.md",
    "content": "# swinir_l_real_sr_x4\n\n|模型名称|swinir_l_real_sr_x4|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|SwinIR|\n|数据集|DIV2K / Flickr2K|\n|是否支持Fine-tuning|否|\n|模型大小|142.2MB|\n|指标|-|\n|最新更新日期|2022-10-10|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/b3c6bfc3dfc14078adcf3dc19acaf04acd4b064770384e2bbd8865697c7dbc91\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c5517af6c3f944c4b281aedc417a4f8c02c0a969d0dd494c9106c4ff2709fc2f\" hspace='10'/>\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/183c5821029f45bbb78d1700ab8297baabba15f82ab4467e88414bbed056ccf0\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - SwinIR 是一个基于 Swin Transformer 的图像恢复模型。swinir_l_real_sr_x4 是基于 SwinIR-L 的 4 倍现实图像超分辨率模型。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install swinir_l_real_sr_x4\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run swinir_l_real_sr_x4 \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --output_dir \"swinir_l_real_sr_x4_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"swinir_l_real_sr_x4\")\n    result = module.real_sr(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        visualization=True,\n        output_dir='swinir_l_real_sr_x4_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def real_sr(\n        image: Union[str, numpy.ndarray],\n        visualization: bool = True,\n        output_dir: str = 
\"swinir_l_real_sr_x4_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - 超分辨率 API\n\n    - **参数**\n\n      * image (Union\\[str, numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (numpy.ndarray): 图像超分辨率结果 (BGR)；\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个图像超分辨率的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m swinir_l_real_sr_x4\n    ```\n\n    - 这样就完成了一个图像超分辨率服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/swinir_l_real_sr_x4\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # 保存结果\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## 五、参考资料\n\n* 论文：[SwinIR: Image Restoration Using Swin Transformer](https://arxiv.org/abs/2108.10257)\n\n* 官方实现：[JingyunLiang/SwinIR](https://github.com/JingyunLiang/SwinIR)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install swinir_l_real_sr_x4==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_l_real_sr_x4/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .swinir import SwinIR\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='swinir_l_real_sr_x4',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Image Restoration (Real image Super Resolution) Using Swin Transformer.\",\n)\nclass SwinIRMRealSR(nn.Layer):\n\n    def __init__(self):\n        super(SwinIRMRealSR, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          '003_realSR_BSRGAN_DFOWMFC_s64w8_SwinIR-L_x4_GAN.pdparams')\n        self.swinir = SwinIR(upscale=4,\n                             in_chans=3,\n                             img_size=64,\n                             window_size=8,\n                             img_range=1.,\n                             depths=[6, 6, 6, 6, 6, 6, 6, 6, 6],\n                             embed_dim=240,\n                             num_heads=[8, 8, 8, 8, 8, 8, 8, 8, 8],\n                             mlp_ratio=2,\n                             upsampler='nearest+conv',\n                             resi_connection='3conv')\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.swinir.set_state_dict(state_dict)\n        self.swinir.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = cv2.cvtColor(img, 
cv2.COLOR_BGR2RGB)\n        img = img.transpose((2, 0, 1))\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        img = img.transpose((1, 2, 0))\n        img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)\n        return img.astype(np.uint8)\n\n    def real_sr(self,\n                image: Union[str, np.ndarray],\n                visualization: bool = True,\n                output_dir: str = \"swinir_l_real_sr_x4_output\") -> np.ndarray:\n        if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_COLOR)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n            image = image\n        else:\n            raise Exception(\"image should be a str / np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n\n            img_output = self.swinir(img_input)\n            img_output = img_output.numpy()[0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                           
   usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='swinir_l_real_sr_x4_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.real_sr(image=args.input_path, visualization=True, output_dir=args.output_dir)\n        return 'Results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.real_sr(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_l_real_sr_x4/swinir.py",
    "content": "import math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef to_2tuple(x):\n    if isinstance(x, int):\n        return (x, x)\n    else:\n        return tuple(x)\n\n\nclass Mlp(nn.Layer):\n\n    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):\n        super().__init__()\n        out_features = out_features or in_features\n        hidden_features = hidden_features or in_features\n        self.fc1 = nn.Linear(in_features, hidden_features)\n        self.act = act_layer()\n        self.fc2 = nn.Linear(hidden_features, out_features)\n        self.drop = nn.Dropout(drop)\n\n    def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.drop(x)\n        x = self.fc2(x)\n        x = self.drop(x)\n        return x\n\n\ndef window_partition(x, window_size):\n    \"\"\"\n    Args:\n        x: (B, H, W, C)\n        window_size (int): window size\n    Returns:\n        windows: (num_windows*B, window_size, window_size, C)\n    \"\"\"\n    B, H, W, C = x.shape\n    x = x.reshape((B, H // window_size, window_size, W // window_size, window_size, C))\n    windows = x.transpose((0, 1, 3, 2, 4, 5)).reshape((-1, window_size, window_size, C))\n    return windows\n\n\ndef window_reverse(windows, window_size, H, W):\n    \"\"\"\n    Args:\n        windows: (num_windows*B, window_size, window_size, C)\n        window_size (int): Window size\n        H (int): Height of image\n        W (int): Width of image\n    Returns:\n        x: (B, H, W, C)\n    \"\"\"\n    B = int(windows.shape[0] / (H * W / window_size / window_size))\n    x = windows.reshape((B, H // window_size, W // window_size, window_size, window_size, -1))\n    x = x.transpose((0, 1, 3, 2, 4, 5)).reshape((B, H, W, -1))\n    return x\n\n\nclass WindowAttention(nn.Layer):\n    r\"\"\" Window based multi-head self attention (W-MSA) module with relative position bias.\n    It supports both of 
shifted and non-shifted window.\n    Args:\n        dim (int): Number of input channels.\n        window_size (tuple[int]): The height and width of the window.\n        num_heads (int): Number of attention heads.\n        qkv_bias (bool, optional):  If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set\n        attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0\n        proj_drop (float, optional): Dropout ratio of output. Default: 0.0\n    \"\"\"\n\n    def __init__(self, dim, window_size, num_heads, qkv_bias=True, qk_scale=None, attn_drop=0., proj_drop=0.):\n\n        super().__init__()\n        self.dim = dim\n        self.window_size = window_size  # Wh, Ww\n        self.num_heads = num_heads\n        head_dim = dim // num_heads\n        self.scale = qk_scale or head_dim**-0.5\n\n        # define a parameter table of relative position bias\n        self.relative_position_bias_table = self.create_parameter(shape=((2 * window_size[0] - 1) *\n                                                                         (2 * window_size[1] - 1), num_heads),\n                                                                  default_initializer=nn.initializer.Constant(0.0))\n\n        # get pair-wise relative position index for each token inside the window\n        coords_h = paddle.arange(self.window_size[0])\n        coords_w = paddle.arange(self.window_size[1])\n        coords = paddle.stack(paddle.meshgrid([coords_h, coords_w]))  # 2, Wh, Ww\n        coords_flatten = paddle.flatten(coords, 1)  # 2, Wh*Ww\n        relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :]  # 2, Wh*Ww, Wh*Ww\n        relative_coords = relative_coords.transpose((1, 2, 0))  # Wh*Ww, Wh*Ww, 2\n        relative_coords[:, :, 0] += self.window_size[0] - 1  # shift to start from 0\n        relative_coords[:, :, 1] += self.window_size[1] - 1\n     
   relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1\n        relative_position_index = relative_coords.sum(-1)  # Wh*Ww, Wh*Ww\n        self.register_buffer(\"relative_position_index\", relative_position_index)\n\n        self.qkv = nn.Linear(dim, dim * 3, bias_attr=qkv_bias)\n        self.attn_drop = nn.Dropout(attn_drop)\n        self.proj = nn.Linear(dim, dim)\n\n        self.proj_drop = nn.Dropout(proj_drop)\n\n        self.softmax = nn.Softmax(axis=-1)\n\n    def forward(self, x, mask=None):\n        \"\"\"\n        Args:\n            x: input features with shape of (num_windows*B, N, C)\n            mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None\n        \"\"\"\n        B_, N, C = x.shape\n        qkv = self.qkv(x).reshape((B_, N, 3, self.num_heads, C // self.num_heads)).transpose((2, 0, 3, 1, 4))\n        q, k, v = qkv[0], qkv[1], qkv[2]  # make torchscript happy (cannot use tensor as tuple)\n\n        q = q * self.scale\n        attn = (q @ k.transpose((0, 1, 3, 2)))\n\n        relative_position_bias = self.relative_position_bias_table[self.relative_position_index.reshape(\n            (-1, ))].reshape((self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1],\n                              -1))  # Wh*Ww,Wh*Ww,nH\n        relative_position_bias = relative_position_bias.transpose((2, 0, 1))  # nH, Wh*Ww, Wh*Ww\n        attn = attn + relative_position_bias.unsqueeze(0)\n\n        if mask is not None:\n            nW = mask.shape[0]\n            attn = attn.reshape((B_ // nW, nW, self.num_heads, N, N)) + mask.unsqueeze(1).unsqueeze(0)\n            attn = attn.reshape((-1, self.num_heads, N, N))\n            attn = self.softmax(attn)\n        else:\n            attn = self.softmax(attn)\n\n        attn = self.attn_drop(attn)\n\n        x = (attn @ v).transpose((0, 2, 1, 3)).reshape((B_, N, C))\n        x = self.proj(x)\n        x = self.proj_drop(x)\n        return x\n\n    def extra_repr(self) -> str:\n 
       return f'dim={self.dim}, window_size={self.window_size}, num_heads={self.num_heads}'\n\n    def flops(self, N):\n        # calculate flops for 1 window with token length of N\n        flops = 0\n        # qkv = self.qkv(x)\n        flops += N * self.dim * 3 * self.dim\n        # attn = (q @ k.transpose(-2, -1))\n        flops += self.num_heads * N * (self.dim // self.num_heads) * N\n        #  x = (attn @ v)\n        flops += self.num_heads * N * N * (self.dim // self.num_heads)\n        # x = self.proj(x)\n        flops += N * self.dim * self.dim\n        return flops\n\n\nclass SwinTransformerBlock(nn.Layer):\n    r\"\"\" Swin Transformer Block.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resulotion.\n        num_heads (int): Number of attention heads.\n        window_size (int): Window size.\n        shift_size (int): Shift size for SW-MSA.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float, optional): Stochastic depth rate. Default: 0.0\n        act_layer (nn.Layer, optional): Activation layer. Default: nn.GELU\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 num_heads,\n                 window_size=7,\n                 shift_size=0,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 act_layer=nn.GELU,\n                 norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.num_heads = num_heads\n        self.window_size = window_size\n        self.shift_size = shift_size\n        self.mlp_ratio = mlp_ratio\n        if min(self.input_resolution) <= self.window_size:\n            # if window size is larger than input resolution, we don't partition windows\n            self.shift_size = 0\n            self.window_size = min(self.input_resolution)\n        assert 0 <= self.shift_size < self.window_size, \"shift_size must in 0-window_size\"\n\n        self.norm1 = norm_layer(dim)\n        self.attn = WindowAttention(dim,\n                                    window_size=to_2tuple(self.window_size),\n                                    num_heads=num_heads,\n                                    qkv_bias=qkv_bias,\n                                    qk_scale=qk_scale,\n                                    attn_drop=attn_drop,\n                                    proj_drop=drop)\n\n        self.drop_path = nn.Identity()\n        self.norm2 = norm_layer(dim)\n        mlp_hidden_dim = int(dim * mlp_ratio)\n        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)\n\n        if self.shift_size > 0:\n            attn_mask = self.calculate_mask(self.input_resolution)\n        else:\n            attn_mask = None\n\n        self.register_buffer(\"attn_mask\", attn_mask)\n\n    def calculate_mask(self, x_size):\n        
# calculate attention mask for SW-MSA\n        H, W = x_size\n        img_mask = paddle.zeros((1, H, W, 1))  # 1 H W 1\n\n        h_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        w_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        cnt = 0\n        for h in h_slices:\n            for w in w_slices:\n                img_mask[:, h, w, :] = cnt\n                cnt += 1\n\n        mask_windows = window_partition(img_mask, self.window_size)  # nW, window_size, window_size, 1\n        mask_windows = mask_windows.reshape((-1, self.window_size * self.window_size))\n\n        attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)\n        _h = paddle.full_like(attn_mask, -100.0, dtype='float32')\n        _z = paddle.full_like(attn_mask, 0.0, dtype='float32')\n        attn_mask = paddle.where(attn_mask != 0, _h, _z)\n\n        return attn_mask\n\n    def forward(self, x, x_size):\n        H, W = x_size\n        B, L, C = x.shape\n        # assert L == H * W, \"input feature has wrong size\"\n\n        shortcut = x\n        x = self.norm1(x)\n        x = x.reshape((B, H, W, C))\n\n        # cyclic shift\n        if self.shift_size > 0:\n            shifted_x = paddle.roll(x, shifts=(-self.shift_size, -self.shift_size), axis=(1, 2))\n        else:\n            shifted_x = x\n\n        # partition windows\n        x_windows = window_partition(shifted_x, self.window_size)  # nW*B, window_size, window_size, C\n        x_windows = x_windows.reshape((-1, self.window_size * self.window_size, C))  # nW*B, window_size*window_size, C\n\n        # W-MSA/SW-MSA (to be compatible for testing on images whose shapes are the 
multiples of window_size)\n        if self.input_resolution == x_size:\n            attn_windows = self.attn(x_windows, mask=self.attn_mask)  # nW*B, window_size*window_size, C\n        else:\n            attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size))\n\n        # merge windows\n        attn_windows = attn_windows.reshape((-1, self.window_size, self.window_size, C))\n        shifted_x = window_reverse(attn_windows, self.window_size, H, W)  # B H' W' C\n\n        # reverse cyclic shift\n        if self.shift_size > 0:\n            x = paddle.roll(shifted_x, shifts=(self.shift_size, self.shift_size), axis=(1, 2))\n        else:\n            x = shifted_x\n        x = x.reshape((B, H * W, C))\n\n        # FFN\n        x = shortcut + self.drop_path(x)\n        x = x + self.drop_path(self.mlp(self.norm2(x)))\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, \" \\\n               f\"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}\"\n\n    def flops(self):\n        flops = 0\n        H, W = self.input_resolution\n        # norm1\n        flops += self.dim * H * W\n        # W-MSA/SW-MSA\n        nW = H * W / self.window_size / self.window_size\n        flops += nW * self.attn.flops(self.window_size * self.window_size)\n        # mlp\n        flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio\n        # norm2\n        flops += self.dim * H * W\n        return flops\n\n\nclass PatchMerging(nn.Layer):\n    r\"\"\" Patch Merging Layer.\n    Args:\n        input_resolution (tuple[int]): Resolution of input feature.\n        dim (int): Number of input channels.\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.dim = dim\n        self.reduction = nn.Linear(4 * dim, 2 * dim, bias_attr=False)\n        self.norm = norm_layer(4 * dim)\n\n    def forward(self, x):\n        \"\"\"\n        x: B, H*W, C\n        \"\"\"\n        H, W = self.input_resolution\n        B, L, C = x.shape\n        assert L == H * W, \"input feature has wrong size\"\n        assert H % 2 == 0 and W % 2 == 0, f\"x size ({H}*{W}) is not even.\"\n\n        x = x.reshape((B, H, W, C))\n\n        x0 = x[:, 0::2, 0::2, :]  # B H/2 W/2 C\n        x1 = x[:, 1::2, 0::2, :]  # B H/2 W/2 C\n        x2 = x[:, 0::2, 1::2, :]  # B H/2 W/2 C\n        x3 = x[:, 1::2, 1::2, :]  # B H/2 W/2 C\n        x = paddle.concat([x0, x1, x2, x3], -1)  # B H/2 W/2 4*C\n        x = x.reshape((B, -1, 4 * C))  # B H/2*W/2 4*C\n\n        x = self.norm(x)\n        x = self.reduction(x)\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"input_resolution={self.input_resolution}, dim={self.dim}\"\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.dim\n        flops += (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim\n        return flops\n\n\nclass BasicLayer(nn.Layer):\n    \"\"\" A basic Swin Transformer layer for one stage.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. 
Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False):\n\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.depth = depth\n        self.use_checkpoint = use_checkpoint\n\n        # build blocks\n        self.blocks = nn.LayerList([\n            SwinTransformerBlock(dim=dim,\n                                 input_resolution=input_resolution,\n                                 num_heads=num_heads,\n                                 window_size=window_size,\n                                 shift_size=0 if (i % 2 == 0) else window_size // 2,\n                                 mlp_ratio=mlp_ratio,\n                                 qkv_bias=qkv_bias,\n                                 qk_scale=qk_scale,\n                                 drop=drop,\n                                 attn_drop=attn_drop,\n                                 drop_path=drop_path[i] if 
isinstance(drop_path, list) else drop_path,\n                                 norm_layer=norm_layer) for i in range(depth)\n        ])\n\n        # patch merging layer\n        if downsample is not None:\n            self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)\n        else:\n            self.downsample = None\n\n    def forward(self, x, x_size):\n        for blk in self.blocks:\n            x = blk(x, x_size)\n        if self.downsample is not None:\n            x = self.downsample(x)\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}\"\n\n    def flops(self):\n        flops = 0\n        for blk in self.blocks:\n            flops += blk.flops()\n        if self.downsample is not None:\n            flops += self.downsample.flops()\n        return flops\n\n\nclass RSTB(nn.Layer):\n    \"\"\"Residual Swin Transformer Block (RSTB).\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. 
Default: False.\n        img_size: Input image size.\n        patch_size: Patch size.\n        resi_connection: The convolutional block before residual connection.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False,\n                 img_size=224,\n                 patch_size=4,\n                 resi_connection='1conv'):\n        super(RSTB, self).__init__()\n\n        self.dim = dim\n        self.input_resolution = input_resolution\n\n        self.residual_group = BasicLayer(dim=dim,\n                                         input_resolution=input_resolution,\n                                         depth=depth,\n                                         num_heads=num_heads,\n                                         window_size=window_size,\n                                         mlp_ratio=mlp_ratio,\n                                         qkv_bias=qkv_bias,\n                                         qk_scale=qk_scale,\n                                         drop=drop,\n                                         attn_drop=attn_drop,\n                                         drop_path=drop_path,\n                                         norm_layer=norm_layer,\n                                         downsample=downsample,\n                                         use_checkpoint=use_checkpoint)\n\n        if resi_connection == '1conv':\n            self.conv = nn.Conv2D(dim, dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv = nn.Sequential(nn.Conv2D(dim, dim 
// 4, 3, 1, 1), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim // 4, 1, 1, 0), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim, 3, 1, 1))\n\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=0,\n                                      embed_dim=dim,\n                                      norm_layer=None)\n\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=0,\n                                          embed_dim=dim,\n                                          norm_layer=None)\n\n    def forward(self, x, x_size):\n        return self.patch_embed(self.conv(self.patch_unembed(self.residual_group(x, x_size), x_size))) + x\n\n    def flops(self):\n        flops = 0\n        flops += self.residual_group.flops()\n        H, W = self.input_resolution\n        flops += H * W * self.dim * self.dim * 9\n        flops += self.patch_embed.flops()\n        flops += self.patch_unembed.flops()\n\n        return flops\n\n\nclass PatchEmbed(nn.Layer):\n    r\"\"\" Image to Patch Embedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n        if norm_layer is not None:\n            self.norm = norm_layer(embed_dim)\n        else:\n            self.norm = None\n\n    def forward(self, x):\n        x = x.flatten(2).transpose((0, 2, 1))  # B Ph*Pw C\n        if self.norm is not None:\n            x = self.norm(x)\n        return x\n\n    def flops(self):\n        flops = 0\n        H, W = self.img_size\n        if self.norm is not None:\n            flops += H * W * self.embed_dim\n        return flops\n\n\nclass PatchUnEmbed(nn.Layer):\n    r\"\"\" Image to Patch Unembedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n    def forward(self, x, x_size):\n        B, HW, C = x.shape\n        x = x.transpose((0, 2, 1)).reshape((B, self.embed_dim, x_size[0], x_size[1]))  # B Ph*Pw C\n        return x\n\n    def flops(self):\n        flops = 0\n        return flops\n\n\nclass Upsample(nn.Sequential):\n    \"\"\"Upsample module.\n    Args:\n        scale (int): Scale factor. Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat):\n        m = []\n        if (scale & (scale - 1)) == 0:  # scale = 2^n\n            for _ in range(int(math.log(scale, 2))):\n                m.append(nn.Conv2D(num_feat, 4 * num_feat, 3, 1, 1))\n                m.append(nn.PixelShuffle(2))\n        elif scale == 3:\n            m.append(nn.Conv2D(num_feat, 9 * num_feat, 3, 1, 1))\n            m.append(nn.PixelShuffle(3))\n        else:\n            raise ValueError(f'scale {scale} is not supported. '\n                             'Supported scales: 2^n and 3.')\n        super(Upsample, self).__init__(*m)\n\n\nclass UpsampleOneStep(nn.Sequential):\n    \"\"\"UpsampleOneStep module (the difference with Upsample is that it always only has 1conv + 1pixelshuffle)\n       Used in lightweight SR to save parameters.\n    Args:\n        scale (int): Scale factor. 
Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat, num_out_ch, input_resolution=None):\n        self.num_feat = num_feat\n        self.input_resolution = input_resolution\n        m = []\n        m.append(nn.Conv2D(num_feat, (scale**2) * num_out_ch, 3, 1, 1))\n        m.append(nn.PixelShuffle(scale))\n        super(UpsampleOneStep, self).__init__(*m)\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.num_feat * 3 * 9\n        return flops\n\n\nclass SwinIR(nn.Layer):\n    r\"\"\" SwinIR\n        A PaddlePaddle implementation of `SwinIR: Image Restoration Using Swin Transformer`, based on Swin Transformer.\n    Args:\n        img_size (int | tuple(int)): Input image size. Default 64\n        patch_size (int | tuple(int)): Patch size. Default: 1\n        in_chans (int): Number of input image channels. Default: 3\n        embed_dim (int): Patch embedding dimension. Default: 96\n        depths (tuple(int)): Depth of each Swin Transformer layer.\n        num_heads (tuple(int)): Number of attention heads in different layers.\n        window_size (int): Window size. Default: 7\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4\n        qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float): Override default qk scale of head_dim ** -0.5 if set. Default: None\n        drop_rate (float): Dropout rate. Default: 0\n        attn_drop_rate (float): Attention dropout rate. Default: 0\n        drop_path_rate (float): Stochastic depth rate. Default: 0.1\n        norm_layer (nn.Layer): Normalization layer. Default: nn.LayerNorm.\n        ape (bool): If True, add absolute position embedding to the patch embedding. Default: False\n        patch_norm (bool): If True, add normalization after patch embedding. 
Default: True\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False\n        upscale: Upscale factor. 2/3/4/8 for image SR, 1 for denoising and compression artifact reduction\n        img_range: Image range. 1. or 255.\n        upsampler: The reconstruction module. 'pixelshuffle'/'pixelshuffledirect'/'nearest+conv'/None\n        resi_connection: The convolutional block before residual connection. '1conv'/'3conv'\n    \"\"\"\n\n    def __init__(self,\n                 img_size=64,\n                 patch_size=1,\n                 in_chans=3,\n                 embed_dim=96,\n                 depths=[6, 6, 6, 6],\n                 num_heads=[6, 6, 6, 6],\n                 window_size=7,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop_rate=0.,\n                 attn_drop_rate=0.,\n                 drop_path_rate=0.1,\n                 norm_layer=nn.LayerNorm,\n                 ape=False,\n                 patch_norm=True,\n                 use_checkpoint=False,\n                 upscale=2,\n                 img_range=1.,\n                 upsampler='',\n                 resi_connection='1conv',\n                 **kwargs):\n        super(SwinIR, self).__init__()\n        num_in_ch = in_chans\n        num_out_ch = in_chans\n        num_feat = 64\n        self.img_range = img_range\n        if in_chans == 3:\n            rgb_mean = (0.4488, 0.4371, 0.4040)\n            self.mean = paddle.to_tensor(rgb_mean).reshape((1, 3, 1, 1))\n        else:\n            self.mean = paddle.zeros((1, 1, 1, 1))\n        self.upscale = upscale\n        self.upsampler = upsampler\n        self.window_size = window_size\n\n        #####################################################################################################\n        ################################### 1, shallow feature extraction ###################################\n        self.conv_first = 
nn.Conv2D(num_in_ch, embed_dim, 3, 1, 1)\n\n        #####################################################################################################\n        ################################### 2, deep feature extraction ######################################\n        self.num_layers = len(depths)\n        self.embed_dim = embed_dim\n        self.ape = ape\n        self.patch_norm = patch_norm\n        self.num_features = embed_dim\n        self.mlp_ratio = mlp_ratio\n\n        # split image into non-overlapping patches\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=embed_dim,\n                                      embed_dim=embed_dim,\n                                      norm_layer=norm_layer if self.patch_norm else None)\n        num_patches = self.patch_embed.num_patches\n        patches_resolution = self.patch_embed.patches_resolution\n        self.patches_resolution = patches_resolution\n\n        # merge non-overlapping patches into image\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=embed_dim,\n                                          embed_dim=embed_dim,\n                                          norm_layer=norm_layer if self.patch_norm else None)\n\n        # absolute position embedding\n        if self.ape:\n            # self.absolute_pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))\n            self.absolute_pos_embed = self.create_parameter(shape=(1, num_patches, embed_dim),\n                                                            default_initializer=nn.initializer.Constant(0.0))\n\n        self.pos_drop = nn.Dropout(p=drop_rate)\n\n        # stochastic depth\n        dpr = [x.item() for x in paddle.linspace(0, drop_path_rate, sum(depths))]  # stochastic depth decay rule\n\n 
       # build Residual Swin Transformer blocks (RSTB)\n        self.layers = nn.LayerList()\n        for i_layer in range(self.num_layers):\n            layer = RSTB(\n                dim=embed_dim,\n                input_resolution=(patches_resolution[0], patches_resolution[1]),\n                depth=depths[i_layer],\n                num_heads=num_heads[i_layer],\n                window_size=window_size,\n                mlp_ratio=self.mlp_ratio,\n                qkv_bias=qkv_bias,\n                qk_scale=qk_scale,\n                drop=drop_rate,\n                attn_drop=attn_drop_rate,\n                drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],  # no impact on SR results\n                norm_layer=norm_layer,\n                downsample=None,\n                use_checkpoint=use_checkpoint,\n                img_size=img_size,\n                patch_size=patch_size,\n                resi_connection=resi_connection)\n            self.layers.append(layer)\n        self.norm = norm_layer(self.num_features)\n\n        # build the last conv layer in deep feature extraction\n        if resi_connection == '1conv':\n            self.conv_after_body = nn.Conv2D(embed_dim, embed_dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv_after_body = nn.Sequential(nn.Conv2D(embed_dim, embed_dim // 4, 3, 1, 1),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim // 4, 1, 1, 0),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim, 3, 1, 1))\n\n        #####################################################################################################\n        ################################ 3, high quality image reconstruction 
################################\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR (to save parameters)\n            self.upsample = UpsampleOneStep(upscale, embed_dim, num_out_ch,\n                                            (patches_resolution[0], patches_resolution[1]))\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR (fewer artifacts)\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_up1 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            if self.upscale == 4:\n                self.conv_up2 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_hr = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.lrelu = nn.LeakyReLU(negative_slope=0.2)\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            self.conv_last = nn.Conv2D(embed_dim, num_out_ch, 3, 1, 1)\n\n        self.apply(self._init_weights)\n\n    def _init_weights(self, m):\n        if isinstance(m, nn.Linear):\n            if m.bias is not None:\n                nn.initializer.Constant(0.0)(m.bias)\n        elif isinstance(m, nn.LayerNorm):\n            nn.initializer.Constant(0.0)(m.bias)\n            nn.initializer.Constant(1.0)(m.weight)\n\n    def check_image_size(self, x):\n        _, _, h, w = x.shape\n        mod_pad_h = (self.window_size - h % self.window_size) % self.window_size\n        mod_pad_w = (self.window_size - w % self.window_size) % self.window_size\n        x = 
F.pad(x, (0, mod_pad_w, 0, mod_pad_h), 'reflect')\n        return x\n\n    def forward_features(self, x):\n        x_size = (x.shape[2], x.shape[3])\n        x = self.patch_embed(x)\n        if self.ape:\n            x = x + self.absolute_pos_embed\n        x = self.pos_drop(x)\n\n        for layer in self.layers:\n            x = layer(x, x_size)\n\n        x = self.norm(x)  # B L C\n        x = self.patch_unembed(x, x_size)\n\n        return x\n\n    def forward(self, x):\n        H, W = x.shape[2:]\n        x = self.check_image_size(x)\n\n        self.mean = self.mean.cast(x.dtype)\n        x = (x - self.mean) * self.img_range\n\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.conv_last(self.upsample(x))\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.upsample(x)\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.lrelu(self.conv_up1(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            if self.upscale == 4:\n                x = self.lrelu(self.conv_up2(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            x = self.conv_last(self.lrelu(self.conv_hr(x)))\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            x_first = self.conv_first(x)\n            res = self.conv_after_body(self.forward_features(x_first)) + x_first\n            x = x + self.conv_last(res)\n\n        x = x / self.img_range + self.mean\n\n        
return x[:, :, :H * self.upscale, :W * self.upscale]\n\n    def flops(self):\n        flops = 0\n        H, W = self.patches_resolution\n        flops += H * W * 3 * self.embed_dim * 9\n        flops += self.patch_embed.flops()\n        for i, layer in enumerate(self.layers):\n            flops += layer.flops()\n        flops += H * W * 3 * self.embed_dim * self.embed_dim\n        flops += self.upsample.flops()\n        return flops\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_l_real_sr_x4/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"swinir_l_real_sr_x4\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('swinir_l_real_sr_x4_output')\n\n    def test_real_sr1(self):\n        results = self.module.real_sr(image='tests/test.jpg', visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr2(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr3(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr4(self):\n        self.assertRaises(Exception, self.module.real_sr, image=['tests/test.jpg'])\n\n    def test_real_sr5(self):\n        self.assertRaises(FileNotFoundError, self.module.real_sr, image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x2/README.md",
    "content": "# swinir_m_real_sr_x2\n\n|模型名称|swinir_m_real_sr_x2|\n| :--- | :---: |\n|类别|图像-图像编辑|\n|网络|SwinIR|\n|数据集|DIV2K / Flickr2K|\n|是否支持Fine-tuning|否|\n|模型大小|66.8MB|\n|指标|-|\n|最新更新日期|2022-10-10|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/b3c6bfc3dfc14078adcf3dc19acaf04acd4b064770384e2bbd8865697c7dbc91\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c5517af6c3f944c4b281aedc417a4f8c02c0a969d0dd494c9106c4ff2709fc2f\" hspace='10'/>\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/49502aba3d0c46b1964f294925f566f38f1544d159614a6ab12eaec0afe5da21\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - SwinIR 是一个基于 Swin Transformer 的图像恢复模型。swinir_m_real_sr_x2 是基于 SwinIR-M 的 2 倍现实图像超分辨率模型。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install swinir_m_real_sr_x2\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run swinir_m_real_sr_x2 \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --output_dir \"swinir_m_real_sr_x2_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"swinir_m_real_sr_x2\")\n    result = module.real_sr(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        visualization=True,\n        output_dir='swinir_m_real_sr_x2_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def real_sr(\n        image: Union[str, numpy.ndarray],\n        visualization: bool = True,\n        output_dir: str = 
\"swinir_m_real_sr_x2_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - 超分辨率 API\n\n    - **参数**\n\n      * image (Union\\[str, numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (numpy.ndarray): 图像超分辨率结果 (BGR)；\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个图像超分辨率的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m swinir_m_real_sr_x2\n    ```\n\n    - 这样就完成了一个图像超分辨率服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/swinir_m_real_sr_x2\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # 保存结果\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## 五、参考资料\n\n* 论文：[SwinIR: Image Restoration Using Swin Transformer](https://arxiv.org/abs/2108.10257)\n\n* 官方实现：[JingyunLiang/SwinIR](https://github.com/JingyunLiang/SwinIR)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install swinir_m_real_sr_x2==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x2/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .swinir import SwinIR\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='swinir_m_real_sr_x2',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Image Restoration (Real image Super Resolution) Using Swin Transformer.\",\n)\nclass SwinIRMRealSR(nn.Layer):\n\n    def __init__(self):\n        super(SwinIRMRealSR, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          '003_realSR_BSRGAN_DFO_s64w8_SwinIR-M_x2_GAN.pdparams')\n        self.swinir = SwinIR(upscale=2,\n                             in_chans=3,\n                             img_size=64,\n                             window_size=8,\n                             img_range=1.,\n                             depths=[6, 6, 6, 6, 6, 6],\n                             embed_dim=180,\n                             num_heads=[6, 6, 6, 6, 6, 6],\n                             mlp_ratio=2,\n                             upsampler='nearest+conv',\n                             resi_connection='1conv')\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.swinir.set_state_dict(state_dict)\n        self.swinir.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n        
img = img.transpose((2, 0, 1))\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        img = img.transpose((1, 2, 0))\n        img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)\n        return img.astype(np.uint8)\n\n    def real_sr(self,\n                image: Union[str, np.ndarray],\n                visualization: bool = True,\n                output_dir: str = \"swinir_m_real_sr_x2_output\") -> np.ndarray:\n        if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_COLOR)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n        else:\n            raise TypeError(\"image should be a str or np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n\n            img_output = self.swinir(img_input)\n            img_output = img_output.numpy()[0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n      
                                        add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='swinir_m_real_sr_x2_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.real_sr(image=args.input_path, visualization=True, output_dir=args.output_dir)\n        return 'Results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.real_sr(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x2/swinir.py",
    "content": "import math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef to_2tuple(x):\n    if isinstance(x, int):\n        return (x, x)\n    else:\n        return tuple(x)\n\n\nclass Mlp(nn.Layer):\n\n    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):\n        super().__init__()\n        out_features = out_features or in_features\n        hidden_features = hidden_features or in_features\n        self.fc1 = nn.Linear(in_features, hidden_features)\n        self.act = act_layer()\n        self.fc2 = nn.Linear(hidden_features, out_features)\n        self.drop = nn.Dropout(drop)\n\n    def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.drop(x)\n        x = self.fc2(x)\n        x = self.drop(x)\n        return x\n\n\ndef window_partition(x, window_size):\n    \"\"\"\n    Args:\n        x: (B, H, W, C)\n        window_size (int): window size\n    Returns:\n        windows: (num_windows*B, window_size, window_size, C)\n    \"\"\"\n    B, H, W, C = x.shape\n    x = x.reshape((B, H // window_size, window_size, W // window_size, window_size, C))\n    windows = x.transpose((0, 1, 3, 2, 4, 5)).reshape((-1, window_size, window_size, C))\n    return windows\n\n\ndef window_reverse(windows, window_size, H, W):\n    \"\"\"\n    Args:\n        windows: (num_windows*B, window_size, window_size, C)\n        window_size (int): Window size\n        H (int): Height of image\n        W (int): Width of image\n    Returns:\n        x: (B, H, W, C)\n    \"\"\"\n    B = int(windows.shape[0] / (H * W / window_size / window_size))\n    x = windows.reshape((B, H // window_size, W // window_size, window_size, window_size, -1))\n    x = x.transpose((0, 1, 3, 2, 4, 5)).reshape((B, H, W, -1))\n    return x\n\n\nclass WindowAttention(nn.Layer):\n    r\"\"\" Window based multi-head self attention (W-MSA) module with relative position bias.\n    It supports both of 
shifted and non-shifted window.\n    Args:\n        dim (int): Number of input channels.\n        window_size (tuple[int]): The height and width of the window.\n        num_heads (int): Number of attention heads.\n        qkv_bias (bool, optional):  If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set\n        attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0\n        proj_drop (float, optional): Dropout ratio of output. Default: 0.0\n    \"\"\"\n\n    def __init__(self, dim, window_size, num_heads, qkv_bias=True, qk_scale=None, attn_drop=0., proj_drop=0.):\n\n        super().__init__()\n        self.dim = dim\n        self.window_size = window_size  # Wh, Ww\n        self.num_heads = num_heads\n        head_dim = dim // num_heads\n        self.scale = qk_scale or head_dim**-0.5\n\n        # define a parameter table of relative position bias\n        self.relative_position_bias_table = self.create_parameter(shape=((2 * window_size[0] - 1) *\n                                                                         (2 * window_size[1] - 1), num_heads),\n                                                                  default_initializer=nn.initializer.Constant(0.0))\n\n        # get pair-wise relative position index for each token inside the window\n        coords_h = paddle.arange(self.window_size[0])\n        coords_w = paddle.arange(self.window_size[1])\n        coords = paddle.stack(paddle.meshgrid([coords_h, coords_w]))  # 2, Wh, Ww\n        coords_flatten = paddle.flatten(coords, 1)  # 2, Wh*Ww\n        relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :]  # 2, Wh*Ww, Wh*Ww\n        relative_coords = relative_coords.transpose((1, 2, 0))  # Wh*Ww, Wh*Ww, 2\n        relative_coords[:, :, 0] += self.window_size[0] - 1  # shift to start from 0\n        relative_coords[:, :, 1] += self.window_size[1] - 1\n     
   relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1\n        relative_position_index = relative_coords.sum(-1)  # Wh*Ww, Wh*Ww\n        self.register_buffer(\"relative_position_index\", relative_position_index)\n\n        self.qkv = nn.Linear(dim, dim * 3, bias_attr=qkv_bias)\n        self.attn_drop = nn.Dropout(attn_drop)\n        self.proj = nn.Linear(dim, dim)\n\n        self.proj_drop = nn.Dropout(proj_drop)\n\n        self.softmax = nn.Softmax(axis=-1)\n\n    def forward(self, x, mask=None):\n        \"\"\"\n        Args:\n            x: input features with shape of (num_windows*B, N, C)\n            mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None\n        \"\"\"\n        B_, N, C = x.shape\n        qkv = self.qkv(x).reshape((B_, N, 3, self.num_heads, C // self.num_heads)).transpose((2, 0, 3, 1, 4))\n        q, k, v = qkv[0], qkv[1], qkv[2]  # make torchscript happy (cannot use tensor as tuple)\n\n        q = q * self.scale\n        attn = (q @ k.transpose((0, 1, 3, 2)))\n\n        relative_position_bias = self.relative_position_bias_table[self.relative_position_index.reshape(\n            (-1, ))].reshape((self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1],\n                              -1))  # Wh*Ww,Wh*Ww,nH\n        relative_position_bias = relative_position_bias.transpose((2, 0, 1))  # nH, Wh*Ww, Wh*Ww\n        attn = attn + relative_position_bias.unsqueeze(0)\n\n        if mask is not None:\n            nW = mask.shape[0]\n            attn = attn.reshape((B_ // nW, nW, self.num_heads, N, N)) + mask.unsqueeze(1).unsqueeze(0)\n            attn = attn.reshape((-1, self.num_heads, N, N))\n            attn = self.softmax(attn)\n        else:\n            attn = self.softmax(attn)\n\n        attn = self.attn_drop(attn)\n\n        x = (attn @ v).transpose((0, 2, 1, 3)).reshape((B_, N, C))\n        x = self.proj(x)\n        x = self.proj_drop(x)\n        return x\n\n    def extra_repr(self) -> str:\n 
       return f'dim={self.dim}, window_size={self.window_size}, num_heads={self.num_heads}'\n\n    def flops(self, N):\n        # calculate flops for 1 window with token length of N\n        flops = 0\n        # qkv = self.qkv(x)\n        flops += N * self.dim * 3 * self.dim\n        # attn = (q @ k.transpose(-2, -1))\n        flops += self.num_heads * N * (self.dim // self.num_heads) * N\n        #  x = (attn @ v)\n        flops += self.num_heads * N * N * (self.dim // self.num_heads)\n        # x = self.proj(x)\n        flops += N * self.dim * self.dim\n        return flops\n\n\nclass SwinTransformerBlock(nn.Layer):\n    r\"\"\" Swin Transformer Block.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        num_heads (int): Number of attention heads.\n        window_size (int): Window size.\n        shift_size (int): Shift size for SW-MSA.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float, optional): Stochastic depth rate. Default: 0.0\n        act_layer (nn.Layer, optional): Activation layer. Default: nn.GELU\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 num_heads,\n                 window_size=7,\n                 shift_size=0,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 act_layer=nn.GELU,\n                 norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.num_heads = num_heads\n        self.window_size = window_size\n        self.shift_size = shift_size\n        self.mlp_ratio = mlp_ratio\n        if min(self.input_resolution) <= self.window_size:\n            # if window size is larger than input resolution, we don't partition windows\n            self.shift_size = 0\n            self.window_size = min(self.input_resolution)\n        assert 0 <= self.shift_size < self.window_size, \"shift_size must be in [0, window_size)\"\n\n        self.norm1 = norm_layer(dim)\n        self.attn = WindowAttention(dim,\n                                    window_size=to_2tuple(self.window_size),\n                                    num_heads=num_heads,\n                                    qkv_bias=qkv_bias,\n                                    qk_scale=qk_scale,\n                                    attn_drop=attn_drop,\n                                    proj_drop=drop)\n\n        self.drop_path = nn.Identity()\n        self.norm2 = norm_layer(dim)\n        mlp_hidden_dim = int(dim * mlp_ratio)\n        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)\n\n        if self.shift_size > 0:\n            attn_mask = self.calculate_mask(self.input_resolution)\n        else:\n            attn_mask = None\n\n        self.register_buffer(\"attn_mask\", attn_mask)\n\n    def calculate_mask(self, x_size):\n        
# calculate attention mask for SW-MSA\n        H, W = x_size\n        img_mask = paddle.zeros((1, H, W, 1))  # 1 H W 1\n\n        h_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        w_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        cnt = 0\n        for h in h_slices:\n            for w in w_slices:\n                img_mask[:, h, w, :] = cnt\n                cnt += 1\n\n        mask_windows = window_partition(img_mask, self.window_size)  # nW, window_size, window_size, 1\n        mask_windows = mask_windows.reshape((-1, self.window_size * self.window_size))\n\n        attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)\n        _h = paddle.full_like(attn_mask, -100.0, dtype='float32')\n        _z = paddle.full_like(attn_mask, 0.0, dtype='float32')\n        attn_mask = paddle.where(attn_mask != 0, _h, _z)\n\n        return attn_mask\n\n    def forward(self, x, x_size):\n        H, W = x_size\n        B, L, C = x.shape\n        # assert L == H * W, \"input feature has wrong size\"\n\n        shortcut = x\n        x = self.norm1(x)\n        x = x.reshape((B, H, W, C))\n\n        # cyclic shift\n        if self.shift_size > 0:\n            shifted_x = paddle.roll(x, shifts=(-self.shift_size, -self.shift_size), axis=(1, 2))\n        else:\n            shifted_x = x\n\n        # partition windows\n        x_windows = window_partition(shifted_x, self.window_size)  # nW*B, window_size, window_size, C\n        x_windows = x_windows.reshape((-1, self.window_size * self.window_size, C))  # nW*B, window_size*window_size, C\n\n        # W-MSA/SW-MSA (to be compatible for testing on images whose shapes are the 
multiple of window size)\n        if self.input_resolution == x_size:\n            attn_windows = self.attn(x_windows, mask=self.attn_mask)  # nW*B, window_size*window_size, C\n        else:\n            attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size))\n\n        # merge windows\n        attn_windows = attn_windows.reshape((-1, self.window_size, self.window_size, C))\n        shifted_x = window_reverse(attn_windows, self.window_size, H, W)  # B H' W' C\n\n        # reverse cyclic shift\n        if self.shift_size > 0:\n            x = paddle.roll(shifted_x, shifts=(self.shift_size, self.shift_size), axis=(1, 2))\n        else:\n            x = shifted_x\n        x = x.reshape((B, H * W, C))\n\n        # FFN\n        x = shortcut + self.drop_path(x)\n        x = x + self.drop_path(self.mlp(self.norm2(x)))\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, \" \\\n               f\"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}\"\n\n    def flops(self):\n        flops = 0\n        H, W = self.input_resolution\n        # norm1\n        flops += self.dim * H * W\n        # W-MSA/SW-MSA\n        nW = H * W / self.window_size / self.window_size\n        flops += nW * self.attn.flops(self.window_size * self.window_size)\n        # mlp\n        flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio\n        # norm2\n        flops += self.dim * H * W\n        return flops\n\n\nclass PatchMerging(nn.Layer):\n    r\"\"\" Patch Merging Layer.\n    Args:\n        input_resolution (tuple[int]): Resolution of input feature.\n        dim (int): Number of input channels.\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.dim = dim\n        self.reduction = nn.Linear(4 * dim, 2 * dim, bias_attr=False)\n        self.norm = norm_layer(4 * dim)\n\n    def forward(self, x):\n        \"\"\"\n        x: B, H*W, C\n        \"\"\"\n        H, W = self.input_resolution\n        B, L, C = x.shape\n        assert L == H * W, \"input feature has wrong size\"\n        assert H % 2 == 0 and W % 2 == 0, f\"x size ({H}*{W}) is not even.\"\n\n        x = x.reshape((B, H, W, C))\n\n        x0 = x[:, 0::2, 0::2, :]  # B H/2 W/2 C\n        x1 = x[:, 1::2, 0::2, :]  # B H/2 W/2 C\n        x2 = x[:, 0::2, 1::2, :]  # B H/2 W/2 C\n        x3 = x[:, 1::2, 1::2, :]  # B H/2 W/2 C\n        x = paddle.concat([x0, x1, x2, x3], -1)  # B H/2 W/2 4*C\n        x = x.reshape((B, -1, 4 * C))  # B H/2*W/2 4*C\n\n        x = self.norm(x)\n        x = self.reduction(x)\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"input_resolution={self.input_resolution}, dim={self.dim}\"\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.dim\n        flops += (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim\n        return flops\n\n\nclass BasicLayer(nn.Layer):\n    \"\"\" A basic Swin Transformer layer for one stage.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. 
Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False):\n\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.depth = depth\n        self.use_checkpoint = use_checkpoint\n\n        # build blocks\n        self.blocks = nn.LayerList([\n            SwinTransformerBlock(dim=dim,\n                                 input_resolution=input_resolution,\n                                 num_heads=num_heads,\n                                 window_size=window_size,\n                                 shift_size=0 if (i % 2 == 0) else window_size // 2,\n                                 mlp_ratio=mlp_ratio,\n                                 qkv_bias=qkv_bias,\n                                 qk_scale=qk_scale,\n                                 drop=drop,\n                                 attn_drop=attn_drop,\n                                 drop_path=drop_path[i] if 
isinstance(drop_path, list) else drop_path,\n                                 norm_layer=norm_layer) for i in range(depth)\n        ])\n\n        # patch merging layer\n        if downsample is not None:\n            self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)\n        else:\n            self.downsample = None\n\n    def forward(self, x, x_size):\n        for blk in self.blocks:\n            x = blk(x, x_size)\n        if self.downsample is not None:\n            x = self.downsample(x)\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}\"\n\n    def flops(self):\n        flops = 0\n        for blk in self.blocks:\n            flops += blk.flops()\n        if self.downsample is not None:\n            flops += self.downsample.flops()\n        return flops\n\n\nclass RSTB(nn.Layer):\n    \"\"\"Residual Swin Transformer Block (RSTB).\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. 
Default: False.\n        img_size: Input image size.\n        patch_size: Patch size.\n        resi_connection: The convolutional block before residual connection.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False,\n                 img_size=224,\n                 patch_size=4,\n                 resi_connection='1conv'):\n        super(RSTB, self).__init__()\n\n        self.dim = dim\n        self.input_resolution = input_resolution\n\n        self.residual_group = BasicLayer(dim=dim,\n                                         input_resolution=input_resolution,\n                                         depth=depth,\n                                         num_heads=num_heads,\n                                         window_size=window_size,\n                                         mlp_ratio=mlp_ratio,\n                                         qkv_bias=qkv_bias,\n                                         qk_scale=qk_scale,\n                                         drop=drop,\n                                         attn_drop=attn_drop,\n                                         drop_path=drop_path,\n                                         norm_layer=norm_layer,\n                                         downsample=downsample,\n                                         use_checkpoint=use_checkpoint)\n\n        if resi_connection == '1conv':\n            self.conv = nn.Conv2D(dim, dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv = nn.Sequential(nn.Conv2D(dim, dim 
// 4, 3, 1, 1), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim // 4, 1, 1, 0), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim, 3, 1, 1))\n\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=0,\n                                      embed_dim=dim,\n                                      norm_layer=None)\n\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=0,\n                                          embed_dim=dim,\n                                          norm_layer=None)\n\n    def forward(self, x, x_size):\n        return self.patch_embed(self.conv(self.patch_unembed(self.residual_group(x, x_size), x_size))) + x\n\n    def flops(self):\n        flops = 0\n        flops += self.residual_group.flops()\n        H, W = self.input_resolution\n        flops += H * W * self.dim * self.dim * 9\n        flops += self.patch_embed.flops()\n        flops += self.patch_unembed.flops()\n\n        return flops\n\n\nclass PatchEmbed(nn.Layer):\n    r\"\"\" Image to Patch Embedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n        if norm_layer is not None:\n            self.norm = norm_layer(embed_dim)\n        else:\n            self.norm = None\n\n    def forward(self, x):\n        x = x.flatten(2).transpose((0, 2, 1))  # B Ph*Pw C\n        if self.norm is not None:\n            x = self.norm(x)\n        return x\n\n    def flops(self):\n        flops = 0\n        H, W = self.img_size\n        if self.norm is not None:\n            flops += H * W * self.embed_dim\n        return flops\n\n\nclass PatchUnEmbed(nn.Layer):\n    r\"\"\" Image to Patch Unembedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n    def forward(self, x, x_size):\n        B, HW, C = x.shape\n        x = x.transpose((0, 2, 1)).reshape((B, self.embed_dim, x_size[0], x_size[1]))  # B Ph*Pw C\n        return x\n\n    def flops(self):\n        flops = 0\n        return flops\n\n\nclass Upsample(nn.Sequential):\n    \"\"\"Upsample module.\n    Args:\n        scale (int): Scale factor. Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat):\n        m = []\n        if (scale & (scale - 1)) == 0:  # scale = 2^n\n            for _ in range(int(math.log(scale, 2))):\n                m.append(nn.Conv2D(num_feat, 4 * num_feat, 3, 1, 1))\n                m.append(nn.PixelShuffle(2))\n        elif scale == 3:\n            m.append(nn.Conv2D(num_feat, 9 * num_feat, 3, 1, 1))\n            m.append(nn.PixelShuffle(3))\n        else:\n            raise ValueError(f'scale {scale} is not supported. '\n                             'Supported scales: 2^n and 3.')\n        super(Upsample, self).__init__(*m)\n\n\nclass UpsampleOneStep(nn.Sequential):\n    \"\"\"UpsampleOneStep module (the difference with Upsample is that it always only has 1conv + 1pixelshuffle)\n       Used in lightweight SR to save parameters.\n    Args:\n        scale (int): Scale factor. 
Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat, num_out_ch, input_resolution=None):\n        self.num_feat = num_feat\n        self.input_resolution = input_resolution\n        m = []\n        m.append(nn.Conv2D(num_feat, (scale**2) * num_out_ch, 3, 1, 1))\n        m.append(nn.PixelShuffle(scale))\n        super(UpsampleOneStep, self).__init__(*m)\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.num_feat * 3 * 9\n        return flops\n\n\nclass SwinIR(nn.Layer):\n    r\"\"\" SwinIR\n        A PaddlePaddle implementation of `SwinIR: Image Restoration Using Swin Transformer`, based on Swin Transformer.\n    Args:\n        img_size (int | tuple(int)): Input image size. Default 64\n        patch_size (int | tuple(int)): Patch size. Default: 1\n        in_chans (int): Number of input image channels. Default: 3\n        embed_dim (int): Patch embedding dimension. Default: 96\n        depths (tuple(int)): Depth of each Swin Transformer layer.\n        num_heads (tuple(int)): Number of attention heads in different layers.\n        window_size (int): Window size. Default: 7\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4\n        qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float): Override default qk scale of head_dim ** -0.5 if set. Default: None\n        drop_rate (float): Dropout rate. Default: 0\n        attn_drop_rate (float): Attention dropout rate. Default: 0\n        drop_path_rate (float): Stochastic depth rate. Default: 0.1\n        norm_layer (nn.Layer): Normalization layer. Default: nn.LayerNorm.\n        ape (bool): If True, add absolute position embedding to the patch embedding. Default: False\n        patch_norm (bool): If True, add normalization after patch embedding. 
Default: True\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False\n        upscale: Upscale factor. 2/3/4/8 for image SR, 1 for denoising and compression artifact reduction\n        img_range: Image range. 1. or 255.\n        upsampler: The reconstruction module. 'pixelshuffle'/'pixelshuffledirect'/'nearest+conv'/None\n        resi_connection: The convolutional block before residual connection. '1conv'/'3conv'\n    \"\"\"\n\n    def __init__(self,\n                 img_size=64,\n                 patch_size=1,\n                 in_chans=3,\n                 embed_dim=96,\n                 depths=[6, 6, 6, 6],\n                 num_heads=[6, 6, 6, 6],\n                 window_size=7,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop_rate=0.,\n                 attn_drop_rate=0.,\n                 drop_path_rate=0.1,\n                 norm_layer=nn.LayerNorm,\n                 ape=False,\n                 patch_norm=True,\n                 use_checkpoint=False,\n                 upscale=2,\n                 img_range=1.,\n                 upsampler='',\n                 resi_connection='1conv',\n                 **kwargs):\n        super(SwinIR, self).__init__()\n        num_in_ch = in_chans\n        num_out_ch = in_chans\n        num_feat = 64\n        self.img_range = img_range\n        if in_chans == 3:\n            rgb_mean = (0.4488, 0.4371, 0.4040)\n            self.mean = paddle.to_tensor(rgb_mean).reshape((1, 3, 1, 1))\n        else:\n            self.mean = paddle.zeros((1, 1, 1, 1))\n        self.upscale = upscale\n        self.upsampler = upsampler\n        self.window_size = window_size\n\n        #####################################################################################################\n        ################################### 1, shallow feature extraction ###################################\n        self.conv_first = 
nn.Conv2D(num_in_ch, embed_dim, 3, 1, 1)\n\n        #####################################################################################################\n        ################################### 2, deep feature extraction ######################################\n        self.num_layers = len(depths)\n        self.embed_dim = embed_dim\n        self.ape = ape\n        self.patch_norm = patch_norm\n        self.num_features = embed_dim\n        self.mlp_ratio = mlp_ratio\n\n        # split image into non-overlapping patches\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=embed_dim,\n                                      embed_dim=embed_dim,\n                                      norm_layer=norm_layer if self.patch_norm else None)\n        num_patches = self.patch_embed.num_patches\n        patches_resolution = self.patch_embed.patches_resolution\n        self.patches_resolution = patches_resolution\n\n        # merge non-overlapping patches into image\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=embed_dim,\n                                          embed_dim=embed_dim,\n                                          norm_layer=norm_layer if self.patch_norm else None)\n\n        # absolute position embedding\n        if self.ape:\n            # self.absolute_pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))\n            self.absolute_pos_embed = self.create_parameter(shape=(1, num_patches, embed_dim),\n                                                            default_initializer=nn.initializer.Constant(0.0))\n\n        self.pos_drop = nn.Dropout(p=drop_rate)\n\n        # stochastic depth\n        dpr = [x.item() for x in paddle.linspace(0, drop_path_rate, sum(depths))]  # stochastic depth decay rule\n\n 
       # build Residual Swin Transformer blocks (RSTB)\n        self.layers = nn.LayerList()\n        for i_layer in range(self.num_layers):\n            layer = RSTB(\n                dim=embed_dim,\n                input_resolution=(patches_resolution[0], patches_resolution[1]),\n                depth=depths[i_layer],\n                num_heads=num_heads[i_layer],\n                window_size=window_size,\n                mlp_ratio=self.mlp_ratio,\n                qkv_bias=qkv_bias,\n                qk_scale=qk_scale,\n                drop=drop_rate,\n                attn_drop=attn_drop_rate,\n                drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],  # no impact on SR results\n                norm_layer=norm_layer,\n                downsample=None,\n                use_checkpoint=use_checkpoint,\n                img_size=img_size,\n                patch_size=patch_size,\n                resi_connection=resi_connection)\n            self.layers.append(layer)\n        self.norm = norm_layer(self.num_features)\n\n        # build the last conv layer in deep feature extraction\n        if resi_connection == '1conv':\n            self.conv_after_body = nn.Conv2D(embed_dim, embed_dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv_after_body = nn.Sequential(nn.Conv2D(embed_dim, embed_dim // 4, 3, 1, 1),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim // 4, 1, 1, 0),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim, 3, 1, 1))\n\n        #####################################################################################################\n        ################################ 3, high quality image reconstruction 
################################\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR (to save parameters)\n            self.upsample = UpsampleOneStep(upscale, embed_dim, num_out_ch,\n                                            (patches_resolution[0], patches_resolution[1]))\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR (fewer artifacts)\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_up1 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            if self.upscale == 4:\n                self.conv_up2 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_hr = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.lrelu = nn.LeakyReLU(negative_slope=0.2)\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            self.conv_last = nn.Conv2D(embed_dim, num_out_ch, 3, 1, 1)\n\n        self.apply(self._init_weights)\n\n    def _init_weights(self, m):\n        if isinstance(m, nn.Linear):\n            if m.bias is not None:\n                nn.initializer.Constant(0.0)(m.bias)\n        elif isinstance(m, nn.LayerNorm):\n            nn.initializer.Constant(0.0)(m.bias)\n            nn.initializer.Constant(1.0)(m.weight)\n\n    def check_image_size(self, x):\n        _, _, h, w = x.shape\n        mod_pad_h = (self.window_size - h % self.window_size) % self.window_size\n        mod_pad_w = (self.window_size - w % self.window_size) % self.window_size\n        x = 
F.pad(x, (0, mod_pad_w, 0, mod_pad_h), 'reflect')\n        return x\n\n    def forward_features(self, x):\n        x_size = (x.shape[2], x.shape[3])\n        x = self.patch_embed(x)\n        if self.ape:\n            x = x + self.absolute_pos_embed\n        x = self.pos_drop(x)\n\n        for layer in self.layers:\n            x = layer(x, x_size)\n\n        x = self.norm(x)  # B L C\n        x = self.patch_unembed(x, x_size)\n\n        return x\n\n    def forward(self, x):\n        H, W = x.shape[2:]\n        x = self.check_image_size(x)\n\n        self.mean = self.mean.cast(x.dtype)\n        x = (x - self.mean) * self.img_range\n\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.conv_last(self.upsample(x))\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.upsample(x)\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.lrelu(self.conv_up1(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            if self.upscale == 4:\n                x = self.lrelu(self.conv_up2(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            x = self.conv_last(self.lrelu(self.conv_hr(x)))\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            x_first = self.conv_first(x)\n            res = self.conv_after_body(self.forward_features(x_first)) + x_first\n            x = x + self.conv_last(res)\n\n        x = x / self.img_range + self.mean\n\n        
return x[:, :, :H * self.upscale, :W * self.upscale]\n\n    def flops(self):\n        flops = 0\n        H, W = self.patches_resolution\n        flops += H * W * 3 * self.embed_dim * 9\n        flops += self.patch_embed.flops()\n        for layer in self.layers:\n            flops += layer.flops()\n        flops += H * W * 3 * self.embed_dim * self.embed_dim\n        # the upsample branch may be absent (denoising) or lack a flops() method (Upsample)\n        upsample = getattr(self, 'upsample', None)\n        if upsample is not None and hasattr(upsample, 'flops'):\n            flops += upsample.flops()\n        return flops\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x2/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.5, fy=0.5)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"swinir_m_real_sr_x2\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('swinir_m_real_sr_x2_output')\n\n    def test_real_sr1(self):\n        results = self.module.real_sr(image='tests/test.jpg', visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr2(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr3(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr4(self):\n        self.assertRaises(Exception, self.module.real_sr, image=['tests/test.jpg'])\n\n    def test_real_sr5(self):\n        self.assertRaises(FileNotFoundError, self.module.real_sr, image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x4/README.md",
"content": "# swinir_m_real_sr_x4\n\n|Model Name|swinir_m_real_sr_x4|\n| :--- | :---: |\n|Category|Image - Image Editing|\n|Network|SwinIR|\n|Dataset|DIV2K / Flickr2K|\n|Fine-tuning Supported|No|\n|Model Size|66.8MB|\n|Metrics|-|\n|Last Updated|2022-10-10|\n\n\n## I. Basic Information\n\n- ### Application Demo\n\n  - Network architecture:\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/b3c6bfc3dfc14078adcf3dc19acaf04acd4b064770384e2bbd8865697c7dbc91\" hspace='10'/> <br />\n      </p>\n\n  - Sample results:\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c5517af6c3f944c4b281aedc417a4f8c02c0a969d0dd494c9106c4ff2709fc2f\" hspace='10'/>\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/183c5821029f45bbb78d1700ab8297baabba15f82ab4467e88414bbed056ccf0\" hspace='10'/>\n      </p>\n\n- ### Model Introduction\n\n  - SwinIR is an image restoration model based on the Swin Transformer. swinir_m_real_sr_x4 is a 4x real-world image super-resolution model based on SwinIR-M.\n\n\n\n## II. Installation\n\n- ### 1. Dependencies\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2. Installation\n\n    - ```shell\n      $ hub install swinir_m_real_sr_x4\n      ```\n    - If you encounter problems during installation, please refer to: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Model API Prediction\n  - ### 1. Command-line prediction\n\n    ```shell\n    $ hub run swinir_m_real_sr_x4 \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --output_dir \"swinir_m_real_sr_x4_output\"\n    ```\n\n  - ### 2. Prediction code example\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"swinir_m_real_sr_x4\")\n    result = module.real_sr(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        visualization=True,\n        output_dir='swinir_m_real_sr_x4_output'\n    )\n    ```\n\n  - ### 3. API\n\n    ```python\n    def real_sr(\n        image: Union[str, numpy.ndarray],\n        visualization: bool = True,\n        output_dir: str = \"swinir_m_real_sr_x4_output\"\n    ) -> numpy.ndarray\n    ```\n\n    - Super-resolution API.\n\n    - **Parameters**\n\n      * image (Union\\[str, numpy.ndarray\\]): image data, with ndarray.shape \\[H, W, C\\], in BGR format;\n      * visualization (bool): whether to save the result as an image file;\n      * output\\_dir (str): directory in which to save the results.\n\n    - **Returns**\n\n      * res (numpy.ndarray): image super-resolution result (BGR);\n\n## IV. Serving Deployment\n\n- PaddleHub Serving can deploy an online image super-resolution service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n\n    ```shell\n     $ hub serving start -m swinir_m_real_sr_x4\n    ```\n\n    - This deploys an image super-resolution service API, with the default port 8866.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the few lines of code below send a prediction request and retrieve the result\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # Send the HTTP request\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im)\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/swinir_m_real_sr_x4\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Convert the result\n    results = r.json()['results']\n    results = base64_to_cv2(results)\n\n    # Save the result\n    cv2.imwrite('output.jpg', results)\n    ```\n\n## V. References\n\n* Paper: [SwinIR: Image Restoration Using Swin Transformer](https://arxiv.org/abs/2108.10257)\n\n* Official implementation: [JingyunLiang/SwinIR](https://github.com/JingyunLiang/SwinIR)\n\n## VI. Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install swinir_m_real_sr_x4==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x4/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .swinir import SwinIR\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='swinir_m_real_sr_x4',\n    version='1.0.0',\n    type=\"CV/image_editing\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Image Restoration (Real image Super Resolution) Using Swin Transformer.\",\n)\nclass SwinIRMRealSR(nn.Layer):\n\n    def __init__(self):\n        super(SwinIRMRealSR, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          '003_realSR_BSRGAN_DFO_s64w8_SwinIR-M_x4_GAN.pdparams')\n        self.swinir = SwinIR(upscale=4,\n                             in_chans=3,\n                             img_size=64,\n                             window_size=8,\n                             img_range=1.,\n                             depths=[6, 6, 6, 6, 6, 6],\n                             embed_dim=180,\n                             num_heads=[6, 6, 6, 6, 6, 6],\n                             mlp_ratio=2,\n                             upsampler='nearest+conv',\n                             resi_connection='1conv')\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.swinir.set_state_dict(state_dict)\n        self.swinir.eval()\n\n    def preprocess(self, img: np.ndarray) -> np.ndarray:\n        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n        
img = img.transpose((2, 0, 1))\n        img = img / 255.0\n        return img.astype(np.float32)\n\n    def postprocess(self, img: np.ndarray) -> np.ndarray:\n        img = img.clip(0, 1)\n        img = img * 255.0\n        img = img.transpose((1, 2, 0))\n        img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)\n        return img.astype(np.uint8)\n\n    def real_sr(self,\n                image: Union[str, np.ndarray],\n                visualization: bool = True,\n                output_dir: str = \"swinir_m_real_sr_x4_output\") -> np.ndarray:\n        if isinstance(image, str):\n            _, file_name = os.path.split(image)\n            save_name, _ = os.path.splitext(file_name)\n            save_name = save_name + '_' + str(int(time.time())) + '.jpg'\n            image = cv2.imdecode(np.fromfile(image, dtype=np.uint8), cv2.IMREAD_COLOR)\n        elif isinstance(image, np.ndarray):\n            save_name = str(int(time.time())) + '.jpg'\n        else:\n            raise TypeError(\"image should be a str / np.ndarray\")\n\n        with paddle.no_grad():\n            img_input = self.preprocess(image)\n            img_input = paddle.to_tensor(img_input[None, ...], dtype=paddle.float32)\n\n            img_output = self.swinir(img_input)\n            img_output = img_output.numpy()[0]\n            img_output = self.postprocess(img_output)\n\n        if visualization:\n            if not os.path.isdir(output_dir):\n                os.makedirs(output_dir)\n            save_path = os.path.join(output_dir, save_name)\n            cv2.imwrite(save_path, img_output)\n\n        return img_output\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n      
                                        add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"Path to image.\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='swinir_m_real_sr_x4_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.real_sr(image=args.input_path, visualization=True, output_dir=args.output_dir)\n        return 'Results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        img_output = self.real_sr(image=image, **kwargs)\n\n        return cv2_to_base64(img_output)\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x4/swinir.py",
"content": "import math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef to_2tuple(x):\n    if isinstance(x, int):\n        return (x, x)\n    else:\n        return tuple(x)\n\n\nclass Mlp(nn.Layer):\n\n    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):\n        super().__init__()\n        out_features = out_features or in_features\n        hidden_features = hidden_features or in_features\n        self.fc1 = nn.Linear(in_features, hidden_features)\n        self.act = act_layer()\n        self.fc2 = nn.Linear(hidden_features, out_features)\n        self.drop = nn.Dropout(drop)\n\n    def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.drop(x)\n        x = self.fc2(x)\n        x = self.drop(x)\n        return x\n\n\ndef window_partition(x, window_size):\n    \"\"\"\n    Args:\n        x: (B, H, W, C)\n        window_size (int): window size\n    Returns:\n        windows: (num_windows*B, window_size, window_size, C)\n    \"\"\"\n    B, H, W, C = x.shape\n    x = x.reshape((B, H // window_size, window_size, W // window_size, window_size, C))\n    windows = x.transpose((0, 1, 3, 2, 4, 5)).reshape((-1, window_size, window_size, C))\n    return windows\n\n\ndef window_reverse(windows, window_size, H, W):\n    \"\"\"\n    Args:\n        windows: (num_windows*B, window_size, window_size, C)\n        window_size (int): Window size\n        H (int): Height of image\n        W (int): Width of image\n    Returns:\n        x: (B, H, W, C)\n    \"\"\"\n    B = int(windows.shape[0] / (H * W / window_size / window_size))\n    x = windows.reshape((B, H // window_size, W // window_size, window_size, window_size, -1))\n    x = x.transpose((0, 1, 3, 2, 4, 5)).reshape((B, H, W, -1))\n    return x\n\n\nclass WindowAttention(nn.Layer):\n    r\"\"\" Window based multi-head self attention (W-MSA) module with relative position bias.\n    It supports both shifted and non-shifted windows.\n    Args:\n        dim (int): Number of input channels.\n        window_size (tuple[int]): The height and width of the window.\n        num_heads (int): Number of attention heads.\n        qkv_bias (bool, optional):  If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set\n        attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0\n        proj_drop (float, optional): Dropout ratio of output. Default: 0.0\n    \"\"\"\n\n    def __init__(self, dim, window_size, num_heads, qkv_bias=True, qk_scale=None, attn_drop=0., proj_drop=0.):\n\n        super().__init__()\n        self.dim = dim\n        self.window_size = window_size  # Wh, Ww\n        self.num_heads = num_heads\n        head_dim = dim // num_heads\n        self.scale = qk_scale or head_dim**-0.5\n\n        # define a parameter table of relative position bias\n        self.relative_position_bias_table = self.create_parameter(shape=((2 * window_size[0] - 1) *\n                                                                         (2 * window_size[1] - 1), num_heads),\n                                                                  default_initializer=nn.initializer.Constant(0.0))\n\n        # get pair-wise relative position index for each token inside the window\n        coords_h = paddle.arange(self.window_size[0])\n        coords_w = paddle.arange(self.window_size[1])\n        coords = paddle.stack(paddle.meshgrid([coords_h, coords_w]))  # 2, Wh, Ww\n        coords_flatten = paddle.flatten(coords, 1)  # 2, Wh*Ww\n        relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :]  # 2, Wh*Ww, Wh*Ww\n        relative_coords = relative_coords.transpose((1, 2, 0))  # Wh*Ww, Wh*Ww, 2\n        relative_coords[:, :, 0] += self.window_size[0] - 1  # shift to start from 0\n        relative_coords[:, :, 1] += self.window_size[1] - 1\n     
   relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1\n        relative_position_index = relative_coords.sum(-1)  # Wh*Ww, Wh*Ww\n        self.register_buffer(\"relative_position_index\", relative_position_index)\n\n        self.qkv = nn.Linear(dim, dim * 3, bias_attr=qkv_bias)\n        self.attn_drop = nn.Dropout(attn_drop)\n        self.proj = nn.Linear(dim, dim)\n\n        self.proj_drop = nn.Dropout(proj_drop)\n\n        self.softmax = nn.Softmax(axis=-1)\n\n    def forward(self, x, mask=None):\n        \"\"\"\n        Args:\n            x: input features with shape of (num_windows*B, N, C)\n            mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None\n        \"\"\"\n        B_, N, C = x.shape\n        qkv = self.qkv(x).reshape((B_, N, 3, self.num_heads, C // self.num_heads)).transpose((2, 0, 3, 1, 4))\n        q, k, v = qkv[0], qkv[1], qkv[2]  # make torchscript happy (cannot use tensor as tuple)\n\n        q = q * self.scale\n        attn = (q @ k.transpose((0, 1, 3, 2)))\n\n        relative_position_bias = self.relative_position_bias_table[self.relative_position_index.reshape(\n            (-1, ))].reshape((self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1],\n                              -1))  # Wh*Ww,Wh*Ww,nH\n        relative_position_bias = relative_position_bias.transpose((2, 0, 1))  # nH, Wh*Ww, Wh*Ww\n        attn = attn + relative_position_bias.unsqueeze(0)\n\n        if mask is not None:\n            nW = mask.shape[0]\n            attn = attn.reshape((B_ // nW, nW, self.num_heads, N, N)) + mask.unsqueeze(1).unsqueeze(0)\n            attn = attn.reshape((-1, self.num_heads, N, N))\n            attn = self.softmax(attn)\n        else:\n            attn = self.softmax(attn)\n\n        attn = self.attn_drop(attn)\n\n        x = (attn @ v).transpose((0, 2, 1, 3)).reshape((B_, N, C))\n        x = self.proj(x)\n        x = self.proj_drop(x)\n        return x\n\n    def extra_repr(self) -> str:\n 
       return f'dim={self.dim}, window_size={self.window_size}, num_heads={self.num_heads}'\n\n    def flops(self, N):\n        # calculate flops for 1 window with token length of N\n        flops = 0\n        # qkv = self.qkv(x)\n        flops += N * self.dim * 3 * self.dim\n        # attn = (q @ k.transpose(-2, -1))\n        flops += self.num_heads * N * (self.dim // self.num_heads) * N\n        #  x = (attn @ v)\n        flops += self.num_heads * N * N * (self.dim // self.num_heads)\n        # x = self.proj(x)\n        flops += N * self.dim * self.dim\n        return flops\n\n\nclass SwinTransformerBlock(nn.Layer):\n    r\"\"\" Swin Transformer Block.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        num_heads (int): Number of attention heads.\n        window_size (int): Window size.\n        shift_size (int): Shift size for SW-MSA.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float, optional): Stochastic depth rate. Default: 0.0\n        act_layer (nn.Layer, optional): Activation layer. Default: nn.GELU\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 num_heads,\n                 window_size=7,\n                 shift_size=0,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 act_layer=nn.GELU,\n                 norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.num_heads = num_heads\n        self.window_size = window_size\n        self.shift_size = shift_size\n        self.mlp_ratio = mlp_ratio\n        if min(self.input_resolution) <= self.window_size:\n            # if window size is larger than input resolution, we don't partition windows\n            self.shift_size = 0\n            self.window_size = min(self.input_resolution)\n        assert 0 <= self.shift_size < self.window_size, \"shift_size must be in range [0, window_size)\"\n\n        self.norm1 = norm_layer(dim)\n        self.attn = WindowAttention(dim,\n                                    window_size=to_2tuple(self.window_size),\n                                    num_heads=num_heads,\n                                    qkv_bias=qkv_bias,\n                                    qk_scale=qk_scale,\n                                    attn_drop=attn_drop,\n                                    proj_drop=drop)\n\n        self.drop_path = nn.Identity()  # DropPath is replaced by Identity (stochastic depth is not used at inference)\n        self.norm2 = norm_layer(dim)\n        mlp_hidden_dim = int(dim * mlp_ratio)\n        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)\n\n        if self.shift_size > 0:\n            attn_mask = self.calculate_mask(self.input_resolution)\n        else:\n            attn_mask = None\n\n        self.register_buffer(\"attn_mask\", attn_mask)\n\n    def calculate_mask(self, x_size):\n        
# calculate attention mask for SW-MSA\n        H, W = x_size\n        img_mask = paddle.zeros((1, H, W, 1))  # 1 H W 1\n\n        h_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        w_slices = (slice(0, -self.window_size), slice(-self.window_size,\n                                                       -self.shift_size if self.shift_size else None),\n                    slice(-self.shift_size, None))\n        cnt = 0\n        for h in h_slices:\n            for w in w_slices:\n                img_mask[:, h, w, :] = cnt\n                cnt += 1\n\n        mask_windows = window_partition(img_mask, self.window_size)  # nW, window_size, window_size, 1\n        mask_windows = mask_windows.reshape((-1, self.window_size * self.window_size))\n\n        attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)\n        _h = paddle.full_like(attn_mask, -100.0, dtype='float32')\n        _z = paddle.full_like(attn_mask, 0.0, dtype='float32')\n        attn_mask = paddle.where(attn_mask != 0, _h, _z)\n\n        return attn_mask\n\n    def forward(self, x, x_size):\n        H, W = x_size\n        B, L, C = x.shape\n        # assert L == H * W, \"input feature has wrong size\"\n\n        shortcut = x\n        x = self.norm1(x)\n        x = x.reshape((B, H, W, C))\n\n        # cyclic shift\n        if self.shift_size > 0:\n            shifted_x = paddle.roll(x, shifts=(-self.shift_size, -self.shift_size), axis=(1, 2))\n        else:\n            shifted_x = x\n\n        # partition windows\n        x_windows = window_partition(shifted_x, self.window_size)  # nW*B, window_size, window_size, C\n        x_windows = x_windows.reshape((-1, self.window_size * self.window_size, C))  # nW*B, window_size*window_size, C\n\n        # W-MSA/SW-MSA (to be compatible for testing on images whose shapes are the 
multiple of window size)\n        if self.input_resolution == x_size:\n            attn_windows = self.attn(x_windows, mask=self.attn_mask)  # nW*B, window_size*window_size, C\n        else:\n            attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size))\n\n        # merge windows\n        attn_windows = attn_windows.reshape((-1, self.window_size, self.window_size, C))\n        shifted_x = window_reverse(attn_windows, self.window_size, H, W)  # B H' W' C\n\n        # reverse cyclic shift\n        if self.shift_size > 0:\n            x = paddle.roll(shifted_x, shifts=(self.shift_size, self.shift_size), axis=(1, 2))\n        else:\n            x = shifted_x\n        x = x.reshape((B, H * W, C))\n\n        # FFN\n        x = shortcut + self.drop_path(x)\n        x = x + self.drop_path(self.mlp(self.norm2(x)))\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, \" \\\n               f\"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}\"\n\n    def flops(self):\n        flops = 0\n        H, W = self.input_resolution\n        # norm1\n        flops += self.dim * H * W\n        # W-MSA/SW-MSA\n        nW = H * W / self.window_size / self.window_size\n        flops += nW * self.attn.flops(self.window_size * self.window_size)\n        # mlp\n        flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio\n        # norm2\n        flops += self.dim * H * W\n        return flops\n\n\nclass PatchMerging(nn.Layer):\n    r\"\"\" Patch Merging Layer.\n    Args:\n        input_resolution (tuple[int]): Resolution of input feature.\n        dim (int): Number of input channels.\n        norm_layer (nn.Layer, optional): Normalization layer.  
Default: nn.LayerNorm\n    \"\"\"\n\n    def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.dim = dim\n        self.reduction = nn.Linear(4 * dim, 2 * dim, bias=False)\n        self.norm = norm_layer(4 * dim)\n\n    def forward(self, x):\n        \"\"\"\n        x: B, H*W, C\n        \"\"\"\n        H, W = self.input_resolution\n        B, L, C = x.shape\n        assert L == H * W, \"input feature has wrong size\"\n        assert H % 2 == 0 and W % 2 == 0, f\"x size ({H}*{W}) are not even.\"\n\n        x = x.reshape((B, H, W, C))\n\n        x0 = x[:, 0::2, 0::2, :]  # B H/2 W/2 C\n        x1 = x[:, 1::2, 0::2, :]  # B H/2 W/2 C\n        x2 = x[:, 0::2, 1::2, :]  # B H/2 W/2 C\n        x3 = x[:, 1::2, 1::2, :]  # B H/2 W/2 C\n        x = paddle.concat([x0, x1, x2, x3], -1)  # B H/2 W/2 4*C\n        x = x.reshape((B, -1, 4 * C))  # B H/2*W/2 4*C\n\n        x = self.norm(x)\n        x = self.reduction(x)\n\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"input_resolution={self.input_resolution}, dim={self.dim}\"\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.dim\n        flops += (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim\n        return flops\n\n\nclass BasicLayer(nn.Layer):\n    \"\"\" A basic Swin Transformer layer for one stage.\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. 
Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False):\n\n        super().__init__()\n        self.dim = dim\n        self.input_resolution = input_resolution\n        self.depth = depth\n        self.use_checkpoint = use_checkpoint\n\n        # build blocks\n        self.blocks = nn.LayerList([\n            SwinTransformerBlock(dim=dim,\n                                 input_resolution=input_resolution,\n                                 num_heads=num_heads,\n                                 window_size=window_size,\n                                 shift_size=0 if (i % 2 == 0) else window_size // 2,\n                                 mlp_ratio=mlp_ratio,\n                                 qkv_bias=qkv_bias,\n                                 qk_scale=qk_scale,\n                                 drop=drop,\n                                 attn_drop=attn_drop,\n                                 drop_path=drop_path[i] if 
isinstance(drop_path, list) else drop_path,\n                                 norm_layer=norm_layer) for i in range(depth)\n        ])\n\n        # patch merging layer\n        if downsample is not None:\n            self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)\n        else:\n            self.downsample = None\n\n    def forward(self, x, x_size):\n        for blk in self.blocks:\n            x = blk(x, x_size)\n        if self.downsample is not None:\n            x = self.downsample(x)\n        return x\n\n    def extra_repr(self) -> str:\n        return f\"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}\"\n\n    def flops(self):\n        flops = 0\n        for blk in self.blocks:\n            flops += blk.flops()\n        if self.downsample is not None:\n            flops += self.downsample.flops()\n        return flops\n\n\nclass RSTB(nn.Layer):\n    \"\"\"Residual Swin Transformer Block (RSTB).\n    Args:\n        dim (int): Number of input channels.\n        input_resolution (tuple[int]): Input resolution.\n        depth (int): Number of blocks.\n        num_heads (int): Number of attention heads.\n        window_size (int): Local window size.\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.\n        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.\n        drop (float, optional): Dropout rate. Default: 0.0\n        attn_drop (float, optional): Attention dropout rate. Default: 0.0\n        drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0\n        norm_layer (nn.Layer, optional): Normalization layer. Default: nn.LayerNorm\n        downsample (nn.Layer | None, optional): Downsample layer at the end of the layer. Default: None\n        use_checkpoint (bool): Whether to use checkpointing to save memory. 
Default: False.\n        img_size: Input image size.\n        patch_size: Patch size.\n        resi_connection: The convolutional block before residual connection.\n    \"\"\"\n\n    def __init__(self,\n                 dim,\n                 input_resolution,\n                 depth,\n                 num_heads,\n                 window_size,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 norm_layer=nn.LayerNorm,\n                 downsample=None,\n                 use_checkpoint=False,\n                 img_size=224,\n                 patch_size=4,\n                 resi_connection='1conv'):\n        super(RSTB, self).__init__()\n\n        self.dim = dim\n        self.input_resolution = input_resolution\n\n        self.residual_group = BasicLayer(dim=dim,\n                                         input_resolution=input_resolution,\n                                         depth=depth,\n                                         num_heads=num_heads,\n                                         window_size=window_size,\n                                         mlp_ratio=mlp_ratio,\n                                         qkv_bias=qkv_bias,\n                                         qk_scale=qk_scale,\n                                         drop=drop,\n                                         attn_drop=attn_drop,\n                                         drop_path=drop_path,\n                                         norm_layer=norm_layer,\n                                         downsample=downsample,\n                                         use_checkpoint=use_checkpoint)\n\n        if resi_connection == '1conv':\n            self.conv = nn.Conv2D(dim, dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv = nn.Sequential(nn.Conv2D(dim, dim 
// 4, 3, 1, 1), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim // 4, 1, 1, 0), nn.LeakyReLU(negative_slope=0.2),\n                                      nn.Conv2D(dim // 4, dim, 3, 1, 1))\n\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=0,\n                                      embed_dim=dim,\n                                      norm_layer=None)\n\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=0,\n                                          embed_dim=dim,\n                                          norm_layer=None)\n\n    def forward(self, x, x_size):\n        return self.patch_embed(self.conv(self.patch_unembed(self.residual_group(x, x_size), x_size))) + x\n\n    def flops(self):\n        flops = 0\n        flops += self.residual_group.flops()\n        H, W = self.input_resolution\n        flops += H * W * self.dim * self.dim * 9\n        flops += self.patch_embed.flops()\n        flops += self.patch_unembed.flops()\n\n        return flops\n\n\nclass PatchEmbed(nn.Layer):\n    r\"\"\" Image to Patch Embedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n        if norm_layer is not None:\n            self.norm = norm_layer(embed_dim)\n        else:\n            self.norm = None\n\n    def forward(self, x):\n        x = x.flatten(2).transpose((0, 2, 1))  # B Ph*Pw C\n        if self.norm is not None:\n            x = self.norm(x)\n        return x\n\n    def flops(self):\n        flops = 0\n        H, W = self.img_size\n        if self.norm is not None:\n            flops += H * W * self.embed_dim\n        return flops\n\n\nclass PatchUnEmbed(nn.Layer):\n    r\"\"\" Image to Patch Unembedding\n    Args:\n        img_size (int): Image size.  Default: 224.\n        patch_size (int): Patch token size. Default: 4.\n        in_chans (int): Number of input image channels. Default: 3.\n        embed_dim (int): Number of linear projection output channels. Default: 96.\n        norm_layer (nn.Layer, optional): Normalization layer. 
Default: None\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.patches_resolution = patches_resolution\n        self.num_patches = patches_resolution[0] * patches_resolution[1]\n\n        self.in_chans = in_chans\n        self.embed_dim = embed_dim\n\n    def forward(self, x, x_size):\n        B, HW, C = x.shape\n        x = x.transpose((0, 2, 1)).reshape((B, self.embed_dim, x_size[0], x_size[1]))  # B Ph*Pw C\n        return x\n\n    def flops(self):\n        flops = 0\n        return flops\n\n\nclass Upsample(nn.Sequential):\n    \"\"\"Upsample module.\n    Args:\n        scale (int): Scale factor. Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat):\n        m = []\n        if (scale & (scale - 1)) == 0:  # scale = 2^n\n            for _ in range(int(math.log(scale, 2))):\n                m.append(nn.Conv2D(num_feat, 4 * num_feat, 3, 1, 1))\n                m.append(nn.PixelShuffle(2))\n        elif scale == 3:\n            m.append(nn.Conv2D(num_feat, 9 * num_feat, 3, 1, 1))\n            m.append(nn.PixelShuffle(3))\n        else:\n            raise ValueError(f'scale {scale} is not supported. '\n                             'Supported scales: 2^n and 3.')\n        super(Upsample, self).__init__(*m)\n\n\nclass UpsampleOneStep(nn.Sequential):\n    \"\"\"UpsampleOneStep module (the difference with Upsample is that it always only has 1conv + 1pixelshuffle)\n       Used in lightweight SR to save parameters.\n    Args:\n        scale (int): Scale factor. 
Supported scales: 2^n and 3.\n        num_feat (int): Channel number of intermediate features.\n    \"\"\"\n\n    def __init__(self, scale, num_feat, num_out_ch, input_resolution=None):\n        self.num_feat = num_feat\n        self.input_resolution = input_resolution\n        m = []\n        m.append(nn.Conv2D(num_feat, (scale**2) * num_out_ch, 3, 1, 1))\n        m.append(nn.PixelShuffle(scale))\n        super(UpsampleOneStep, self).__init__(*m)\n\n    def flops(self):\n        H, W = self.input_resolution\n        flops = H * W * self.num_feat * 3 * 9\n        return flops\n\n\nclass SwinIR(nn.Layer):\n    r\"\"\" SwinIR\n        A PaddlePaddle implementation of `SwinIR: Image Restoration Using Swin Transformer`, based on Swin Transformer.\n    Args:\n        img_size (int | tuple(int)): Input image size. Default 64\n        patch_size (int | tuple(int)): Patch size. Default: 1\n        in_chans (int): Number of input image channels. Default: 3\n        embed_dim (int): Patch embedding dimension. Default: 96\n        depths (tuple(int)): Depth of each Swin Transformer layer.\n        num_heads (tuple(int)): Number of attention heads in different layers.\n        window_size (int): Window size. Default: 7\n        mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4\n        qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True\n        qk_scale (float): Override default qk scale of head_dim ** -0.5 if set. Default: None\n        drop_rate (float): Dropout rate. Default: 0\n        attn_drop_rate (float): Attention dropout rate. Default: 0\n        drop_path_rate (float): Stochastic depth rate. Default: 0.1\n        norm_layer (nn.Layer): Normalization layer. Default: nn.LayerNorm.\n        ape (bool): If True, add absolute position embedding to the patch embedding. Default: False\n        patch_norm (bool): If True, add normalization after patch embedding. 
Default: True\n        use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False\n        upscale: Upscale factor. 2/3/4/8 for image SR, 1 for denoising and compression artifact reduction\n        img_range: Image range. 1. or 255.\n        upsampler: The reconstruction module. 'pixelshuffle'/'pixelshuffledirect'/'nearest+conv'/None\n        resi_connection: The convolutional block before residual connection. '1conv'/'3conv'\n    \"\"\"\n\n    def __init__(self,\n                 img_size=64,\n                 patch_size=1,\n                 in_chans=3,\n                 embed_dim=96,\n                 depths=[6, 6, 6, 6],\n                 num_heads=[6, 6, 6, 6],\n                 window_size=7,\n                 mlp_ratio=4.,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop_rate=0.,\n                 attn_drop_rate=0.,\n                 drop_path_rate=0.1,\n                 norm_layer=nn.LayerNorm,\n                 ape=False,\n                 patch_norm=True,\n                 use_checkpoint=False,\n                 upscale=2,\n                 img_range=1.,\n                 upsampler='',\n                 resi_connection='1conv',\n                 **kwargs):\n        super(SwinIR, self).__init__()\n        num_in_ch = in_chans\n        num_out_ch = in_chans\n        num_feat = 64\n        self.img_range = img_range\n        if in_chans == 3:\n            rgb_mean = (0.4488, 0.4371, 0.4040)\n            self.mean = paddle.to_tensor(rgb_mean).reshape((1, 3, 1, 1))\n        else:\n            self.mean = paddle.zeros((1, 1, 1, 1))\n        self.upscale = upscale\n        self.upsampler = upsampler\n        self.window_size = window_size\n\n        #####################################################################################################\n        ################################### 1, shallow feature extraction ###################################\n        self.conv_first = 
nn.Conv2D(num_in_ch, embed_dim, 3, 1, 1)\n\n        #####################################################################################################\n        ################################### 2, deep feature extraction ######################################\n        self.num_layers = len(depths)\n        self.embed_dim = embed_dim\n        self.ape = ape\n        self.patch_norm = patch_norm\n        self.num_features = embed_dim\n        self.mlp_ratio = mlp_ratio\n\n        # split image into non-overlapping patches\n        self.patch_embed = PatchEmbed(img_size=img_size,\n                                      patch_size=patch_size,\n                                      in_chans=embed_dim,\n                                      embed_dim=embed_dim,\n                                      norm_layer=norm_layer if self.patch_norm else None)\n        num_patches = self.patch_embed.num_patches\n        patches_resolution = self.patch_embed.patches_resolution\n        self.patches_resolution = patches_resolution\n\n        # merge non-overlapping patches into image\n        self.patch_unembed = PatchUnEmbed(img_size=img_size,\n                                          patch_size=patch_size,\n                                          in_chans=embed_dim,\n                                          embed_dim=embed_dim,\n                                          norm_layer=norm_layer if self.patch_norm else None)\n\n        # absolute position embedding\n        if self.ape:\n            # self.absolute_pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))\n            self.absolute_pos_embed = self.create_parameter(shape=(1, num_patches, embed_dim),\n                                                            default_initializer=nn.initializer.Constant(0.0))\n\n        self.pos_drop = nn.Dropout(p=drop_rate)\n\n        # stochastic depth\n        dpr = [x.item() for x in paddle.linspace(0, drop_path_rate, sum(depths))]  # stochastic depth decay rule\n\n 
       # build Residual Swin Transformer blocks (RSTB)\n        self.layers = nn.LayerList()\n        for i_layer in range(self.num_layers):\n            layer = RSTB(\n                dim=embed_dim,\n                input_resolution=(patches_resolution[0], patches_resolution[1]),\n                depth=depths[i_layer],\n                num_heads=num_heads[i_layer],\n                window_size=window_size,\n                mlp_ratio=self.mlp_ratio,\n                qkv_bias=qkv_bias,\n                qk_scale=qk_scale,\n                drop=drop_rate,\n                attn_drop=attn_drop_rate,\n                drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],  # no impact on SR results\n                norm_layer=norm_layer,\n                downsample=None,\n                use_checkpoint=use_checkpoint,\n                img_size=img_size,\n                patch_size=patch_size,\n                resi_connection=resi_connection)\n            self.layers.append(layer)\n        self.norm = norm_layer(self.num_features)\n\n        # build the last conv layer in deep feature extraction\n        if resi_connection == '1conv':\n            self.conv_after_body = nn.Conv2D(embed_dim, embed_dim, 3, 1, 1)\n        elif resi_connection == '3conv':\n            # to save parameters and memory\n            self.conv_after_body = nn.Sequential(nn.Conv2D(embed_dim, embed_dim // 4, 3, 1, 1),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim // 4, 1, 1, 0),\n                                                 nn.LeakyReLU(negative_slope=0.2),\n                                                 nn.Conv2D(embed_dim // 4, embed_dim, 3, 1, 1))\n\n        #####################################################################################################\n        ################################ 3, high quality image reconstruction 
################################\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.upsample = Upsample(upscale, num_feat)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR (to save parameters)\n            self.upsample = UpsampleOneStep(upscale, embed_dim, num_out_ch,\n                                            (patches_resolution[0], patches_resolution[1]))\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR (fewer artifacts)\n            self.conv_before_upsample = nn.Sequential(nn.Conv2D(embed_dim, num_feat, 3, 1, 1), nn.LeakyReLU())\n            self.conv_up1 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            if self.upscale == 4:\n                self.conv_up2 = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_hr = nn.Conv2D(num_feat, num_feat, 3, 1, 1)\n            self.conv_last = nn.Conv2D(num_feat, num_out_ch, 3, 1, 1)\n            self.lrelu = nn.LeakyReLU(negative_slope=0.2)\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            self.conv_last = nn.Conv2D(embed_dim, num_out_ch, 3, 1, 1)\n\n        self.apply(self._init_weights)\n\n    def _init_weights(self, m):\n        if isinstance(m, nn.Linear):\n            if m.bias is not None:\n                nn.initializer.Constant(0.0)(m.bias)\n        elif isinstance(m, nn.LayerNorm):\n            nn.initializer.Constant(0.0)(m.bias)\n            nn.initializer.Constant(1.0)(m.weight)\n\n    def check_image_size(self, x):\n        _, _, h, w = x.shape\n        mod_pad_h = (self.window_size - h % self.window_size) % self.window_size\n        mod_pad_w = (self.window_size - w % self.window_size) % self.window_size\n        x = 
F.pad(x, (0, mod_pad_w, 0, mod_pad_h), 'reflect')\n        return x\n\n    def forward_features(self, x):\n        x_size = (x.shape[2], x.shape[3])\n        x = self.patch_embed(x)\n        if self.ape:\n            x = x + self.absolute_pos_embed\n        x = self.pos_drop(x)\n\n        for layer in self.layers:\n            x = layer(x, x_size)\n\n        x = self.norm(x)  # B L C\n        x = self.patch_unembed(x, x_size)\n\n        return x\n\n    def forward(self, x):\n        H, W = x.shape[2:]\n        x = self.check_image_size(x)\n\n        self.mean = self.mean.cast(x.dtype)\n        x = (x - self.mean) * self.img_range\n\n        if self.upsampler == 'pixelshuffle':\n            # for classical SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.conv_last(self.upsample(x))\n        elif self.upsampler == 'pixelshuffledirect':\n            # for lightweight SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.upsample(x)\n        elif self.upsampler == 'nearest+conv':\n            # for real-world SR\n            x = self.conv_first(x)\n            x = self.conv_after_body(self.forward_features(x)) + x\n            x = self.conv_before_upsample(x)\n            x = self.lrelu(self.conv_up1(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            if self.upscale == 4:\n                x = self.lrelu(self.conv_up2(nn.functional.interpolate(x, scale_factor=2, mode='nearest')))\n            x = self.conv_last(self.lrelu(self.conv_hr(x)))\n        else:\n            # for image denoising and JPEG compression artifact reduction\n            x_first = self.conv_first(x)\n            res = self.conv_after_body(self.forward_features(x_first)) + x_first\n            x = x + self.conv_last(res)\n\n        x = x / self.img_range + self.mean\n\n        
return x[:, :, :H * self.upscale, :W * self.upscale]\n\n    def flops(self):\n        flops = 0\n        H, W = self.patches_resolution\n        flops += H * W * 3 * self.embed_dim * 9\n        flops += self.patch_embed.flops()\n        for i, layer in enumerate(self.layers):\n            flops += layer.flops()\n        flops += H * W * 3 * self.embed_dim * self.embed_dim\n        flops += self.upsample.flops()\n        return flops\n"
  },
  {
    "path": "modules/image/Image_editing/super_resolution/swinir_m_real_sr_x4/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"swinir_m_real_sr_x4\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('swinir_m_real_sr_x4_output')\n\n    def test_real_sr1(self):\n        results = self.module.real_sr(image='tests/test.jpg', visualization=False)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr2(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr3(self):\n        results = self.module.real_sr(image=cv2.imread('tests/test.jpg'), visualization=True)\n\n        self.assertIsInstance(results, np.ndarray)\n\n    def test_real_sr4(self):\n        self.assertRaises(Exception, self.module.real_sr, image=['tests/test.jpg'])\n\n    def test_real_sr5(self):\n        self.assertRaises(FileNotFoundError, self.module.real_sr, image='no.jpg')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【图像生成】](https://www.paddlepaddle.org.cn/hublist)**\n\n\n\n### 图像生成\n\n图像生成是指根据输入向量，生成目标图像。这里的输入向量可以是随机的噪声或用户指定的条件向量。具体的应用场景有：风格迁移、图像动漫画等。\n\n- 精选推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n | [艺术风格迁移](https://www.paddlepaddle.org.cn/hubdetail?name=stylepro_artistic&en_category=GANs) | 将给定的图像转换为任意的艺术风格。确保模型高保真还原内容图片的语义细节信息与风格图片的风格信息。 |\n | [图像动漫化-新海诚](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_shinkai_53&en_category=GANs) | AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成新海诚动漫风格 |\n  | [图像动漫化-宫崎骏](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_hayao_64&en_category=GANs) | AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成宫崎骏动漫风格|\n  | [图像动漫化-今敏红辣椒](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_paprika_97&en_category=GANs) | AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成今敏红辣椒动漫风格。|\n"
  },
  {
    "path": "modules/image/Image_gan/README_en.md",
    "content": "## **For better user experience, refer to the Web official document ->  [Image Generation](https://www.paddlepaddle.org.cn/hublist)**\n\n### Image Generation\n\nImage generation is the generation of a target image based on an input vector. The input vector here can be a random noise or a user-specified conditional vector. Specific application scenarios include style migration, image cartoons, and so on.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Art Style Migration](https://www.paddlepaddle.org.cn/hubdetail?name=stylepro_artistic&en_category=GANs) | Convert a given image into an arbitrary artistic style. Ensure that high fidelity model reproduction of semantic detail information of content images is consistent with the style information of style images. |\n| [Image Animation - Makoto Shinkai](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_shinkai_53&en_category=GANs) | AnimeGAN V2 image style conversion model. It is a model that converts the input image into a  Makoto Shinkai style. |\n| [Image Animation - Hayao Miyazaki](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_hayao_64&en_category=GANs) | AnimeGAN V2 image style conversion model. It is a model that converts the input image into a Miyazaki anime style. |\n| [Image Animation - Satoshi Kon Paprika](https://www.paddlepaddle.org.cn/hubdetail?name=animegan_v2_paprika_97&en_category=GANs) | AnimeGAN V2 image style conversion model. It is a model that converts the input image into an Satoshi Kon Paprika style anime style. |\n"
  },
  {
    "path": "modules/image/Image_gan/attgan_celeba/README.md",
    "content": "# attgan_celeba\n\n|模型名称|attgan_celeba|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AttGAN|\n|数据集|Celeba|\n|是否支持Fine-tuning|否|\n|模型大小|167MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137855667-43c5c40c-28f5-45d8-accc-028e185b988f.JPG\" width=1200><br/>\n    图1. AttGAN的效果图(图片属性分别为：original image, Bald, Bangs, Black_Hair, Blond_Hair, Brown_Hair, Bushy_Eyebrows, Eyeglasses, Gender, Mouth_Slightly_Open, Mustache, No_Beard, Pale_Skin, Aged)<br/>\n    </p>\n\n\n- ### 模型介绍\n\n  - AttGAN 是一种生成对抗网络(Generative Adversarial Networks)，它利用分类损失和重构损失来保证改变特定的属性。该 PaddleHub Module 使用 Celeba 数据集训练完成，目前支持 \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\" 这十三种人脸属性转换。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install attgan_celeba==1.0.0\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run attgan_celeba --image \"/PATH/TO/IMAGE\" --style \"target_attribute\" \n    ```\n  - **参数**\n\n    - image ：指定图片路径。\n\n    - style 指定拟转换的属性，可选择 \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\" 中的一种。\n\n\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    attgan = hub.Module(name=\"attgan_celeba\")\n\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    
trans_attr = [\"Bangs\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr}\n\n    # execute predict and print the result\n    results = attgan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - 风格转换API，用于图像生成。\n\n    - **参数**\n\n      - data： dict 类型，有以下字段\n          - image (list\\[str\\])： list中每个元素为待转换的图片路径。\n          - style (list\\[str\\])： list中每个元素为字符串，填写待转换的人脸属性。\n\n    - **返回**\n      - res (list\\[str\\]): 提示生成图片的保存路径。\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n\n"
  },
  {
    "path": "modules/image/Image_gan/attgan_celeba/README_en.md",
    "content": "# attgan_celeba\n\n|Module Name|attgan_celeba|\n| :--- | :---: |\n|Category |image generation|\n|Network |AttGAN|\n|Dataset|Celeba|\n|Fine-tuning supported or not |No|\n|Module Size |167MB|\n|Latest update date|2021-02-26|\n|Data indicators |-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137855667-43c5c40c-28f5-45d8-accc-028e185b988f.JPG\" width=1200><br/>\n    The image attributes are: original image, Bald, Bangs, Black_Hair, Blond_Hair, Brown_Hair, Bushy_Eyebrows, Eyeglasses, Gender, Mouth_Slightly_Open, Mustache, No_Beard, Pale_Skin, Aged<br/>\n    </p>\n\n\n- ### Module Introduction\n\n  - AttGAN is a Generative Adversarial Network, which uses classification loss and reconstruction loss to train the network. The PaddleHub Module is trained one Celeba dataset and currently supports attributes of \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\".\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install attgan_celeba==1.0.0\n    ```\n\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md).\n\n \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run attgan_celeba --image \"/PATH/TO/IMAGE\" --style \"target_attribute\" \n    ```\n\n  - **Parameters**\n\n    - image: Input image path.\n\n    - style: Specify the attributes to be converted. The options are \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\". You can choose one of the options.\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    attgan = hub.Module(name=\"attgan_celeba\")\n\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    trans_attr = [\"Bangs\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr}\n\n    # execute predict and print the result\n    results = attgan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - Style transfer API.\n\n    - **Parameter**\n\n      - data(list[dict]): Each element in the list is dict and each field is: \n          - image (list\\[str\\])： Each element in the list is the path of the image to be converted.\n          - style (list\\[str\\])： Each element in the list is a string, fill in the face attributes to be converted.\n\n    - **Return**\n      - res (list\\[str\\]): Save path of the result.\n\n\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n\n\n"
  },
  {
    "path": "modules/image/Image_gan/cyclegan_cityscapes/README.md",
    "content": "# cyclegan_cityscapes\n\n|模型名称|cyclegan_cityscapes|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|CycleGAN|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|否|\n|模型大小|33MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137839740-4be4cf40-816f-401e-a73f-6cda037041dd.png\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输入图像\n     <br />\n    <img src=\"https://user-images.githubusercontent.com/35907364/137839777-89fc705b-f0d7-4a93-94e2-76c0d3c5a0b0.png\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输出图像\n     <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - CycleGAN是生成对抗网络（Generative Adversarial Networks ）的一种，与传统的GAN只能单向生成图片不同，CycleGAN可以同时完成两个domain的图片进行相互转换。该PaddleHub Module使用Cityscapes数据集训练完成，支持图片从实景图转换为语义分割结果，也支持从语义分割结果转换为实景图。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0\n\n  - paddlehub >= 1.1.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install cyclegan_cityscapes==1.0.0\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run cyclegan_cityscapes --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - **参数**\n\n    - input_path ：指定图片路径。\n\n\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    cyclegan = hub.Module(name=\"cyclegan_cityscapes\")\n\n    test_img_path = \"/PATH/TO/IMAGE\"\n\n    # set input dict\n    input_dict = {\"image\": [test_img_path]}\n\n    # execute predict and print the result\n    results = cyclegan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - 风格转换API，用于图像生成。\n\n    
- **参数**\n\n      - data： dict 类型，有以下字段：\n          - image (list\\[str\\])： list中每个元素为待转换的图片路径。\n\n    - **返回**\n      - res (list\\[str\\]): 每个元素为对应输入图片的预测结果。预测结果为dict类型，有以下字段：\n          - origin: 原输入图片路径。\n          - generated: 生成图片的路径。\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n"
  },
  {
    "path": "modules/image/Image_gan/cyclegan_cityscapes/README_en.md",
    "content": "# cyclegan_cityscapes\n\n|Module Name|cyclegan_cityscapes|\n| :--- | :---: |\n|Category |Image generation|\n|Network |CycleGAN|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not |No|\n|Module Size |33MB|\n|Latest update date |2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n\n- ### Application Effect Display\n\n  - Sample results:\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137839740-4be4cf40-816f-401e-a73f-6cda037041dd.png\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Input image\n     <br />\n    <img src=\"https://user-images.githubusercontent.com/35907364/137839777-89fc705b-f0d7-4a93-94e2-76c0d3c5a0b0.png\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Output image\n     <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - CycleGAN belongs to Generative Adversarial Networks(GANs). Unlike traditional GANs that can only generate pictures in one direction, CycleGAN can simultaneously complete the style transfer of two domains. The PaddleHub Module is trained by Cityscapes dataset, and supports the conversion from real images to semantic segmentation results, and also supports conversion from semantic segmentation results to real images.\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0\n\n  - paddlehub >= 1.1.0 \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install cyclegan_cityscapes==1.0.0\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n \n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run cyclegan_cityscapes --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - **Parameters**\n\n    - input_path: Input image path.\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    cyclegan = hub.Module(name=\"cyclegan_cityscapes\")\n\n    test_img_path = \"/PATH/TO/IMAGE\"\n\n    # set input dict\n    input_dict = {\"image\": [test_img_path]}\n\n    # execute predict and print the result\n    results = cyclegan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - data (dict): input data, with the following fields:\n          - image (list\\[str\\])： Each element in the list is the path of an image to be converted.\n\n    - **Return**\n      - res (list\\[str\\]): The list of style transfer results, where each element is a dict with the following fields:\n          - origin: Original input path.\n          - generated: Save path of the generated image.\n\n\n\n## IV. Release Note\n\n* 1.0.0\n\n  First release\n\n"
  },
  {
    "path": "modules/image/Image_gan/gan/README.md",
    "content": ""
  },
  {
    "path": "modules/image/Image_gan/gan/first_order_motion/README.md",
    "content": "# first_order_motion\n\n|模型名称|first_order_motion|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|S3FD|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|343MB|\n|最新更新日期|2021-12-24|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/147347145-1a7e84b6-2853-4490-8eaf-caf9cfdca79b.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/147347151-d6c5690b-00cd-433f-b82b-3f8bb90bc7bd.gif\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入视频\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/147348127-52eb3f26-9b2c-49d5-a4a2-20a31f159802.gif\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出视频\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - First Order Motion的任务是图像动画/Image Animation，即输入为一张源图片和一个驱动视频，源图片中的人物则会做出驱动视频中的动作。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - paddlepaddle >= 2.1.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install first_order_motion\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run first_order_motion --source_image \"/PATH/TO/IMAGE\" --driving_video \"/PATH/TO/VIDEO\"  --use_gpu\n    ```\n  - 通过命令行方式实现视频驱动生成模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"first_order_motion\")\n    module.generate(source_image=\"/PATH/TO/IMAGE\", driving_video=\"/PATH/TO/VIDEO\", ratio=0.4, image_size=256, output_dir='./motion_driving_result/', filename='result.mp4', use_gpu=False)\n    ```\n\n- ### 
3、API\n\n  - ```python\n    generate(self, source_image=None, driving_video=None, ratio=0.4, image_size=256, output_dir='./motion_driving_result/', filename='result.mp4', use_gpu=False)\n    ```\n    - 视频驱动生成API。\n\n    - **参数**\n      - source_image (str): 原始图片，支持单人图片和多人图片，视频中人物的表情动作将迁移到该原始图片中的人物上。\n      - driving_video (str): 驱动视频，视频中人物的表情动作作为待迁移的对象。\n      - ratio (float): 贴回驱动生成的人脸区域占原图的比例，用户需要根据生成的效果调整该参数，尤其在多人脸距离比较近的情况下更需要调整该参数。默认为0.4，调整范围是[0.4, 0.5]。\n      - image_size (int): 图片人脸大小，默认为256，可设置为512。\n      - output\\_dir (str): 结果保存的文件夹名。\n      - filename (str): 结果保存的文件名。\n      - use\\_gpu (bool): 是否使用 GPU。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install first_order_motion==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/first_order_motion/model.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport os\nimport sys\nimport math\nimport pickle\n\nimport yaml\nimport imageio\nimport numpy as np\nfrom tqdm import tqdm\nfrom scipy.spatial import ConvexHull\nimport cv2\nimport paddle\nfrom ppgan.utils.download import get_path_from_url\nfrom ppgan.utils.animate import normalize_kp\nfrom ppgan.modules.keypoint_detector import KPDetector\nfrom ppgan.models.generators.occlusion_aware import OcclusionAwareGenerator\nfrom ppgan.faceutils import face_detection\n\n\nclass FirstOrderPredictor:\n    def __init__(self,\n                 weight_path=None,\n                 config=None,\n                 image_size=256,\n                 relative=True,\n                 adapt_scale=False,\n                 find_best_frame=False,\n                 best_frame=None,\n                 face_detector='sfd',\n                 multi_person=False,\n                 face_enhancement=True,\n                 batch_size=1,\n                 mobile_net=False):\n        if config is not None and isinstance(config, str):\n            with open(config) as f:\n                self.cfg = yaml.load(f, Loader=yaml.SafeLoader)\n        elif isinstance(config, dict):\n            self.cfg = config\n        elif config is None:\n            self.cfg = {\n                'model': {\n                    'common_params': {\n                        'num_kp': 10,\n             
           'num_channels': 3,\n                        'estimate_jacobian': True\n                    },\n                    'generator': {\n                        'kp_detector_cfg': {\n                            'temperature': 0.1,\n                            'block_expansion': 32,\n                            'max_features': 1024,\n                            'scale_factor': 0.25,\n                            'num_blocks': 5\n                        },\n                        'generator_cfg': {\n                            'block_expansion': 64,\n                            'max_features': 512,\n                            'num_down_blocks': 2,\n                            'num_bottleneck_blocks': 6,\n                            'estimate_occlusion_map': True,\n                            'dense_motion_params': {\n                                'block_expansion': 64,\n                                'max_features': 1024,\n                                'num_blocks': 5,\n                                'scale_factor': 0.25\n                            }\n                        }\n                    }\n                }\n            }\n        self.image_size = image_size\n        if weight_path is None:\n            if mobile_net:\n                vox_cpk_weight_url = 'https://paddlegan.bj.bcebos.com/applications/first_order_model/vox-mobile.pdparams'\n\n            else:\n                if self.image_size == 512:\n                    vox_cpk_weight_url = 'https://paddlegan.bj.bcebos.com/applications/first_order_model/vox-cpk-512.pdparams'\n                else:\n                    vox_cpk_weight_url = 'https://paddlegan.bj.bcebos.com/applications/first_order_model/vox-cpk.pdparams'\n            weight_path = get_path_from_url(vox_cpk_weight_url)\n\n        self.weight_path = weight_path\n        self.relative = relative\n        self.adapt_scale = adapt_scale\n        self.find_best_frame = find_best_frame\n        self.best_frame = best_frame\n        
self.face_detector = face_detector\n        self.generator, self.kp_detector = self.load_checkpoints(self.cfg, self.weight_path)\n        self.multi_person = multi_person\n        self.face_enhancement = face_enhancement\n        self.batch_size = batch_size\n        if face_enhancement:\n            from ppgan.faceutils.face_enhancement import FaceEnhancement\n            self.faceenhancer = FaceEnhancement(batch_size=batch_size)\n\n    def read_img(self, path):\n        img = imageio.imread(path)\n        if img.ndim == 2:\n            img = np.expand_dims(img, axis=2)\n        # som images have 4 channels\n        if img.shape[2] > 3:\n            img = img[:, :, :3]\n        return img\n\n    def run(self, source_image, driving_video, ratio, image_size, output_dir, filename):\n        self.ratio = ratio\n        self.image_size = image_size\n        self.output = output_dir\n        self.filename = filename\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n\n        def get_prediction(face_image):\n            if self.find_best_frame or self.best_frame is not None:\n                i = self.best_frame if self.best_frame is not None else self.find_best_frame_func(\n                    source_image, driving_video)\n\n                print(\"Best frame: \" + str(i))\n                driving_forward = driving_video[i:]\n                driving_backward = driving_video[:(i + 1)][::-1]\n                predictions_forward = self.make_animation(\n                    face_image,\n                    driving_forward,\n                    self.generator,\n                    self.kp_detector,\n                    relative=self.relative,\n                    adapt_movement_scale=self.adapt_scale)\n                predictions_backward = self.make_animation(\n                    face_image,\n                    driving_backward,\n                    self.generator,\n                    self.kp_detector,\n                    
relative=self.relative,\n                    adapt_movement_scale=self.adapt_scale)\n                predictions = predictions_backward[::-1] + predictions_forward[1:]\n            else:\n                predictions = self.make_animation(\n                    face_image,\n                    driving_video,\n                    self.generator,\n                    self.kp_detector,\n                    relative=self.relative,\n                    adapt_movement_scale=self.adapt_scale)\n            return predictions\n\n        source_image = self.read_img(source_image)\n        reader = imageio.get_reader(driving_video)\n        fps = reader.get_meta_data()['fps']\n        driving_video = []\n        try:\n            for im in reader:\n                driving_video.append(im)\n        except RuntimeError:\n            print(\"Read driving video error!\")\n            pass\n        reader.close()\n\n        driving_video = [cv2.resize(frame, (self.image_size, self.image_size)) / 255.0 for frame in driving_video]\n        results = []\n\n        bboxes = self.extract_bbox(source_image.copy())\n        print(str(len(bboxes)) + \" persons have been detected\")\n\n        # for multi person\n        for rec in bboxes:\n            face_image = source_image.copy()[rec[1]:rec[3], rec[0]:rec[2]]\n            face_image = cv2.resize(face_image, (self.image_size, self.image_size)) / 255.0\n            predictions = get_prediction(face_image)\n            results.append({'rec': rec, 'predict': [predictions[i] for i in range(predictions.shape[0])]})\n            if len(bboxes) == 1 or not self.multi_person:\n                break\n        out_frame = []\n\n        for i in range(len(driving_video)):\n            frame = source_image.copy()\n            for result in results:\n                x1, y1, x2, y2, _ = result['rec']\n                h = y2 - y1\n                w = x2 - x1\n                out = result['predict'][i]\n                out = 
cv2.resize(out.astype(np.uint8), (x2 - x1, y2 - y1))\n                if len(results) == 1:\n                    frame[y1:y2, x1:x2] = out\n                    break\n                else:\n                    patch = np.zeros(frame.shape).astype('uint8')\n                    patch[y1:y2, x1:x2] = out\n                    mask = np.zeros(frame.shape[:2]).astype('uint8')\n                    cx = int((x1 + x2) / 2)\n                    cy = int((y1 + y2) / 2)\n                    cv2.circle(mask, (cx, cy), math.ceil(h * self.ratio), (255, 255, 255), -1, 8, 0)\n                    frame = cv2.copyTo(patch, mask, frame)\n\n            out_frame.append(frame)\n        imageio.mimsave(os.path.join(self.output, self.filename), [frame for frame in out_frame], fps=fps)\n\n    def load_checkpoints(self, config, checkpoint_path):\n\n        generator = OcclusionAwareGenerator(\n            **config['model']['generator']['generator_cfg'], **config['model']['common_params'], inference=True)\n\n        kp_detector = KPDetector(**config['model']['generator']['kp_detector_cfg'], **config['model']['common_params'])\n\n        checkpoint = paddle.load(self.weight_path)\n        generator.set_state_dict(checkpoint['generator'])\n\n        kp_detector.set_state_dict(checkpoint['kp_detector'])\n\n        generator.eval()\n        kp_detector.eval()\n\n        return generator, kp_detector\n\n    def make_animation(self,\n                       source_image,\n                       driving_video,\n                       generator,\n                       kp_detector,\n                       relative=True,\n                       adapt_movement_scale=True):\n        with paddle.no_grad():\n            predictions = []\n            source = paddle.to_tensor(source_image[np.newaxis].astype(np.float32)).transpose([0, 3, 1, 2])\n\n            driving = paddle.to_tensor(np.array(driving_video).astype(np.float32)).transpose([0, 3, 1, 2])\n            kp_source = kp_detector(source)\n          
  kp_driving_initial = kp_detector(driving[0:1])\n            kp_source_batch = {}\n            kp_source_batch[\"value\"] = paddle.tile(kp_source[\"value\"], repeat_times=[self.batch_size, 1, 1])\n            kp_source_batch[\"jacobian\"] = paddle.tile(kp_source[\"jacobian\"], repeat_times=[self.batch_size, 1, 1, 1])\n            source = paddle.tile(source, repeat_times=[self.batch_size, 1, 1, 1])\n            begin_idx = 0\n            for frame_idx in tqdm(range(int(np.ceil(float(driving.shape[0]) / self.batch_size)))):\n                frame_num = min(self.batch_size, driving.shape[0] - begin_idx)\n                driving_frame = driving[begin_idx:begin_idx + frame_num]\n                kp_driving = kp_detector(driving_frame)\n                kp_source_img = {}\n                kp_source_img[\"value\"] = kp_source_batch[\"value\"][0:frame_num]\n                kp_source_img[\"jacobian\"] = kp_source_batch[\"jacobian\"][0:frame_num]\n\n                kp_norm = normalize_kp(\n                    kp_source=kp_source,\n                    kp_driving=kp_driving,\n                    kp_driving_initial=kp_driving_initial,\n                    use_relative_movement=relative,\n                    use_relative_jacobian=relative,\n                    adapt_movement_scale=adapt_movement_scale)\n\n                out = generator(source[0:frame_num], kp_source=kp_source_img, kp_driving=kp_norm)\n                img = np.transpose(out['prediction'].numpy(), [0, 2, 3, 1]) * 255.0\n\n                if self.face_enhancement:\n                    img = self.faceenhancer.enhance_from_batch(img)\n\n                predictions.append(img)\n                begin_idx += frame_num\n        return np.concatenate(predictions)\n\n    def find_best_frame_func(self, source, driving):\n        import face_alignment\n\n        def normalize_kp(kp):\n            kp = kp - kp.mean(axis=0, keepdims=True)\n            area = ConvexHull(kp[:, :2]).volume\n            area = np.sqrt(area)\n     
       kp[:, :2] = kp[:, :2] / area\n            return kp\n\n        fa = face_alignment.FaceAlignment(face_alignment.LandmarksType._2D, flip_input=True)\n\n        kp_source = fa.get_landmarks(255 * source)[0]\n        kp_source = normalize_kp(kp_source)\n        norm = float('inf')\n        frame_num = 0\n        for i, image in tqdm(enumerate(driving)):\n            kp_driving = fa.get_landmarks(255 * image)[0]\n            kp_driving = normalize_kp(kp_driving)\n            new_norm = (np.abs(kp_source - kp_driving)**2).sum()\n            if new_norm < norm:\n                norm = new_norm\n                frame_num = i\n        return frame_num\n\n    def extract_bbox(self, image):\n        detector = face_detection.FaceAlignment(\n            face_detection.LandmarksType._2D, flip_input=False, face_detector=self.face_detector)\n\n        frame = [image]\n        predictions = detector.get_detections_for_image(np.array(frame))\n        person_num = len(predictions)\n        if person_num == 0:\n            return np.array([])\n        results = []\n        h, w, _ = image.shape\n        for rect in predictions:\n            bh = rect[3] - rect[1]\n            bw = rect[2] - rect[0]\n            cy = rect[1] + int(bh / 2)\n            cx = rect[0] + int(bw / 2)\n            margin = max(bh, bw)\n            y1 = max(0, cy - margin)\n            x1 = max(0, cx - int(0.8 * margin))\n            y2 = min(h, cy + margin)\n            x2 = min(w, cx + int(0.8 * margin))\n            area = (y2 - y1) * (x2 - x1)\n            results.append([x1, y1, x2, y2, area])\n        # if a person has more than one bbox, keep the largest one\n        # maybe greedy will be better?\n        # sort in place so the largest box comes first (a bare sorted() call discards its result)\n        results.sort(key=lambda box: box[4], reverse=True)\n        results_box = [results[0]]\n        for i in range(1, person_num):\n            num = len(results_box)\n            add_person = True\n            for j in range(num):\n                pre_person = results_box[j]\n                iou = self.IOU(pre_person[0], pre_person[1], pre_person[2], pre_person[3], pre_person[4], results[i][0],\n                               results[i][1], results[i][2], results[i][3], results[i][4])\n                if iou > 0.5:\n                    add_person = False\n                    break\n            if add_person:\n                results_box.append(results[i])\n        boxes = np.array(results_box)\n        return boxes\n\n    def IOU(self, ax1, ay1, ax2, ay2, sa, bx1, by1, bx2, by2, sb):\n        #sa = abs((ax2 - ax1) * (ay2 - ay1))\n        #sb = abs((bx2 - bx1) * (by2 - by1))\n        x1, y1 = max(ax1, bx1), max(ay1, by1)\n        x2, y2 = min(ax2, bx2), min(ay2, by2)\n        w = x2 - x1\n        h = y2 - y1\n        if w < 0 or h < 0:\n            return 0.0\n        else:\n            return 1.0 * w * h / (sa + sb - w * h)\n"
  },
  {
    "path": "modules/image/Image_gan/gan/first_order_motion/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\nfrom skimage.transform import rescale, resize\n\nfrom .model import FirstOrderPredictor\n\n\n@moduleinfo(\n    name=\"first_order_motion\", type=\"CV/gan\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass FirstOrderMotion:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"vox-cpk.pdparams\")\n        self.network = FirstOrderPredictor(weight_path=self.pretrained_model, face_enhancement=True)\n\n    def generate(self,\n                 source_image=None,\n                 driving_video=None,\n                 ratio=0.4,\n                 image_size=256,\n                 output_dir='./motion_driving_result/',\n                 filename='result.mp4',\n                 use_gpu=False):\n        '''\n        source_image (str): path to image<br/>\n        driving_video (str) : path to driving_video<br/>\n        ratio: margin ratio\n        image_size: size of image\n        output_dir: the dir to save the results\n        filename: filename to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        '''\n    
    paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if source_image is None or driving_video is None:\n            print('No image or driving video provided. Please input an image and a driving video.')\n            return\n        self.network.run(source_image, driving_video, ratio, image_size, output_dir, filename)\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        self.generate(\n            source_image=self.args.source_image,\n            driving_video=self.args.driving_video,\n            ratio=self.args.ratio,\n            image_size=self.args.image_size,\n            output_dir=self.args.output_dir,\n            filename=self.args.filename,\n            use_gpu=self.args.use_gpu)\n        return\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='motion_driving_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument(\"--filename\", type=str, default='result.mp4', help=\"filename to output\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\"--source_image\", type=str, help=\"path to source image\")\n        self.arg_input_group.add_argument(\"--driving_video\", type=str, help=\"path to driving video\")\n        self.arg_input_group.add_argument(\"--ratio\", dest=\"ratio\", type=float, default=0.4, help=\"margin ratio\")\n        self.arg_input_group.add_argument(\n            \"--image_size\", dest=\"image_size\", type=int, default=256, help=\"size of image\")\n"
  },
  {
    "path": "modules/image/Image_gan/gan/first_order_motion/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/README.md",
    "content": "# photopen\n\n|模型名称|photopen|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|SPADEGenerator|\n|数据集|coco_stuff|\n|是否支持Fine-tuning|否|\n|模型大小|74MB|\n|最新更新日期|2021-12-14|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://camo.githubusercontent.com/22e94b0c7278af08da8c475a3d968ba2f3cd565fcb2ad6b9a165c8a65f2d12f8/68747470733a2f2f61692d73747564696f2d7374617469632d6f6e6c696e652e63646e2e626365626f732e636f6d2f39343733313032336561623934623162393762396361383062643362333038333063393138636631363264303436626438383534306464613435303239356133\"  width = \"90%\"  hspace='10'/>\n    <br />\n\n- ### 模型介绍\n\n  - 本模块采用一个像素风格迁移网络 Pix2PixHD，能够根据输入的语义分割标签生成照片风格的图片。为了解决模型归一化层导致标签语义信息丢失的问题，向 Pix2PixHD 的生成器网络中添加了 SPADE（Spatially-Adaptive\n      Normalization）空间自适应归一化模块，通过两个卷积层保留了归一化时训练的缩放与偏置参数的空间维度，以增强生成图片的质量。语义风格标签图像可以参考[coco_stuff数据集](https://github.com/nightrome/cocostuff)获取, 也可以通过[PaddleGAN repo中的该项目](https://github.com/PaddlePaddle/PaddleGAN/blob/87537ad9d4eeda17eaa5916c6a585534ab989ea8/docs/zh_CN/tutorials/photopen.md)来自定义生成图像进行体验。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install photopen\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run photopen --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像生成模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"photopen\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.photo_transfer(paths=input_path, output_dir='./transfer_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    
photo_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True):\n    ```\n    - 图像转换生成API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 结果保存的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像转换生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m photopen\n    ```\n\n  - 这样就完成了一个图像转换生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/photopen\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install photopen==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom PIL import Image\nfrom PIL import ImageOps\nfrom ppgan.models.generators import SPADEGenerator\nfrom ppgan.utils.filesystem import load\nfrom ppgan.utils.photopen import data_onehot_pro\n\n\nclass PhotoPenPredictor:\n    def __init__(self, weight_path, gen_cfg):\n\n        # 初始化模型\n        gen = SPADEGenerator(\n            gen_cfg.ngf,\n            gen_cfg.num_upsampling_layers,\n            gen_cfg.crop_size,\n            gen_cfg.aspect_ratio,\n            gen_cfg.norm_G,\n            gen_cfg.semantic_nc,\n            gen_cfg.use_vae,\n            gen_cfg.nef,\n        )\n        gen.eval()\n        para = load(weight_path)\n        if 'net_gen' in para:\n            gen.set_state_dict(para['net_gen'])\n        else:\n            gen.set_state_dict(para)\n\n        self.gen = gen\n        self.gen_cfg = gen_cfg\n\n    def run(self, image):\n        sem = Image.fromarray(image).convert('L')\n        sem = sem.resize((self.gen_cfg.crop_size, self.gen_cfg.crop_size), Image.NEAREST)\n        sem = np.array(sem).astype('float32')\n        sem = paddle.to_tensor(sem)\n        sem = sem.reshape([1, 1, self.gen_cfg.crop_size, self.gen_cfg.crop_size])\n\n        one_hot = data_onehot_pro(sem, self.gen_cfg)\n        predicted = self.gen(one_hot)\n        pic = 
predicted.numpy()[0].reshape((3, self.gen_cfg.crop_size, self.gen_cfg.crop_size)).transpose((1, 2, 0))\n        pic = ((pic + 1.) / 2. * 255).astype('uint8')\n\n        return pic\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom ppgan.utils.config import get_config\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PhotoPenPredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"photopen\", type=\"CV/style_transfer\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass Photopen:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"photopen.pdparams\")\n        cfg = get_config(os.path.join(self.directory, \"photopen.yaml\"))\n        self.network = PhotoPenPredictor(weight_path=self.pretrained_model, gen_cfg=cfg.predict)\n\n    def photo_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR(read by cv2).\n        
paths (list[str]): paths to images\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out is not None:\n                    cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.photo_transfer(\n            paths=[self.args.input_path],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.photo_transfer(images=images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/photopen.yaml",
    "content": "total_iters: 1\noutput_dir: output_dir\ncheckpoints_dir: checkpoints\n\nmodel:\n  name: PhotoPenModel\n  generator:\n    name: SPADEGenerator\n    ngf: 24\n    num_upsampling_layers: normal\n    crop_size: 256\n    aspect_ratio: 1.0\n    norm_G: spectralspadebatch3x3\n    semantic_nc: 14\n    use_vae: False\n    nef: 16\n  discriminator:\n    name: MultiscaleDiscriminator\n    ndf: 128\n    num_D: 4\n    crop_size: 256\n    label_nc: 12\n    output_nc: 3\n    contain_dontcare_label: True\n    no_instance: False\n    n_layers_D: 6\n  criterion:\n    name: PhotoPenPerceptualLoss\n    crop_size: 224\n    lambda_vgg: 1.6\n  label_nc: 12\n  contain_dontcare_label: True\n  batchSize: 1\n  crop_size: 256\n  lambda_feat: 10.0\n\ndataset:\n  train:\n    name: PhotoPenDataset\n    content_root: test/coco_stuff\n    load_size: 286\n    crop_size: 256\n    num_workers: 0\n    batch_size: 1\n  test:\n    name: PhotoPenDataset_test\n    content_root: test/coco_stuff\n    load_size: 286\n    crop_size: 256\n    num_workers: 0\n    batch_size: 1\n\nlr_scheduler: # abundoned\n  name: LinearDecay\n  learning_rate: 0.0001\n  start_epoch: 99999\n  decay_epochs: 99999\n  # will get from real dataset\n  iters_per_epoch: 1\n\noptimizer:\n  lr: 0.0001\n  optimG:\n    name: Adam\n    net_names:\n      - net_gen\n    beta1: 0.9\n    beta2: 0.999\n  optimD:\n    name: Adam\n    net_names:\n      - net_des\n    beta1: 0.9\n    beta2: 0.999\n\nlog_config:\n  interval: 1\n  visiual_interval: 1\n\nsnapshot_config:\n  interval: 1\n\npredict:\n  name: SPADEGenerator\n  ngf: 24\n  num_upsampling_layers: normal\n  crop_size: 256\n  aspect_ratio: 1.0\n  norm_G: spectralspadebatch3x3\n  semantic_nc: 14\n  use_vae: False\n  nef: 16\n  contain_dontcare_label: True\n  label_nc: 12\n  batchSize: 1\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/gan/photopen/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/gan/pixel2style2pixel/README.md",
    "content": "# pixel2style2pixel\n\n|模型名称|pixel2style2pixel|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|Pixel2Style2Pixel|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|1.7GB|\n|最新更新日期|2021-12-14|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/146486444-63637926-4e46-4299-8905-d93f529d9d54.jpg\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/146486413-0447dcc8-80ac-4b2c-8a7a-69347d60a2c4.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - Pixel2Style2Pixel使用相当大的模型对图像进行编码，将图像编码到StyleGAN V2的风格向量空间中，使编码前的图像和解码后的图像具有强关联性。该模块应用于人脸转正任务。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n- ### 2、安装\n\n  - ```shell\n    $ hub install pixel2style2pixel\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run pixel2style2pixel --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸转正模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"pixel2style2pixel\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.style_transfer(paths=input_path, output_dir='./transfer_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True):\n    ```\n    - 人脸转正生成API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 结果保存的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸转正服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pixel2style2pixel\n    ```\n\n  - 这样就完成了一个人脸转正的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pixel2style2pixel\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install pixel2style2pixel==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/pixel2style2pixel/model.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport scipy\nimport random\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport ppgan.faceutils as futils\nfrom ppgan.models.generators import Pixel2Style2Pixel\nfrom ppgan.utils.download import get_path_from_url\nfrom PIL import Image\n\nmodel_cfgs = {\n    'ffhq-inversion': {\n        'model_urls':\n        'https://paddlegan.bj.bcebos.com/models/pSp-ffhq-inversion.pdparams',\n        'transform':\n        T.Compose([T.Resize((256, 256)),\n                   T.Transpose(),\n                   T.Normalize([127.5, 127.5, 127.5], [127.5, 127.5, 127.5])]),\n        'size':\n        1024,\n        'style_dim':\n        512,\n        'n_mlp':\n        8,\n        'channel_multiplier':\n        2\n    },\n    'ffhq-toonify': {\n        'model_urls':\n        'https://paddlegan.bj.bcebos.com/models/pSp-ffhq-toonify.pdparams',\n        'transform':\n        T.Compose([T.Resize((256, 256)),\n                   T.Transpose(),\n                   T.Normalize([127.5, 127.5, 127.5], [127.5, 127.5, 127.5])]),\n        'size':\n        1024,\n        'style_dim':\n        512,\n        'n_mlp':\n        8,\n        'channel_multiplier':\n        2\n    },\n    'default': {\n        'transform':\n        T.Compose([T.Resize((256, 256)),\n                   T.Transpose(),\n                   
T.Normalize([127.5, 127.5, 127.5], [127.5, 127.5, 127.5])])\n    }\n}\n\n\ndef run_alignment(image):\n    img = Image.fromarray(image).convert(\"RGB\")\n    face = futils.dlib.detect(img)\n    if not face:\n        raise Exception('Could not find a face in the given image.')\n    face_on_image = face[0]\n    lm = futils.dlib.landmarks(img, face_on_image)\n    lm = np.array(lm)[:, ::-1]\n    lm_eye_left = lm[36:42]\n    lm_eye_right = lm[42:48]\n    lm_mouth_outer = lm[48:60]\n\n    output_size = 1024\n    transform_size = 4096\n    enable_padding = True\n\n    # Calculate auxiliary vectors.\n    eye_left = np.mean(lm_eye_left, axis=0)\n    eye_right = np.mean(lm_eye_right, axis=0)\n    eye_avg = (eye_left + eye_right) * 0.5\n    eye_to_eye = eye_right - eye_left\n    mouth_left = lm_mouth_outer[0]\n    mouth_right = lm_mouth_outer[6]\n    mouth_avg = (mouth_left + mouth_right) * 0.5\n    eye_to_mouth = mouth_avg - eye_avg\n\n    # Choose oriented crop rectangle.\n    x = eye_to_eye - np.flipud(eye_to_mouth) * [-1, 1]\n    x /= np.hypot(*x)\n    x *= max(np.hypot(*eye_to_eye) * 2.0, np.hypot(*eye_to_mouth) * 1.8)\n    y = np.flipud(x) * [-1, 1]\n    c = eye_avg + eye_to_mouth * 0.1\n    quad = np.stack([c - x - y, c - x + y, c + x + y, c + x - y])\n    qsize = np.hypot(*x) * 2\n\n    # Shrink.\n    shrink = int(np.floor(qsize / output_size * 0.5))\n    if shrink > 1:\n        rsize = (int(np.rint(float(img.size[0]) / shrink)), int(np.rint(float(img.size[1]) / shrink)))\n        img = img.resize(rsize, Image.ANTIALIAS)\n        quad /= shrink\n        qsize /= shrink\n\n    # Crop.\n    border = max(int(np.rint(qsize * 0.1)), 3)\n    crop = (int(np.floor(min(quad[:, 0]))), int(np.floor(min(quad[:, 1]))), int(np.ceil(max(quad[:, 0]))),\n            int(np.ceil(max(quad[:, 1]))))\n    crop = (max(crop[0] - border, 0), max(crop[1] - border, 0), min(crop[2] + border, img.size[0]),\n            min(crop[3] + border, img.size[1]))\n    if crop[2] - crop[0] < img.size[0] or 
crop[3] - crop[1] < img.size[1]:\n        img = img.crop(crop)\n        quad -= crop[0:2]\n\n    # Pad.\n    pad = (int(np.floor(min(quad[:, 0]))), int(np.floor(min(quad[:, 1]))), int(np.ceil(max(quad[:, 0]))),\n           int(np.ceil(max(quad[:, 1]))))\n    pad = (max(-pad[0] + border, 0), max(-pad[1] + border, 0), max(pad[2] - img.size[0] + border, 0),\n           max(pad[3] - img.size[1] + border, 0))\n    if enable_padding and max(pad) > border - 4:\n        pad = np.maximum(pad, int(np.rint(qsize * 0.3)))\n        img = np.pad(np.float32(img), ((pad[1], pad[3]), (pad[0], pad[2]), (0, 0)), 'reflect')\n        h, w, _ = img.shape\n        y, x, _ = np.ogrid[:h, :w, :1]\n        mask = np.maximum(1.0 - np.minimum(np.float32(x) / pad[0],\n                                           np.float32(w - 1 - x) / pad[2]),\n                          1.0 - np.minimum(np.float32(y) / pad[1],\n                                           np.float32(h - 1 - y) / pad[3]))\n        blur = qsize * 0.02\n        img += (scipy.ndimage.gaussian_filter(img, [blur, blur, 0]) - img) * np.clip(mask * 3.0 + 1.0, 0.0, 1.0)\n        img += (np.median(img, axis=(0, 1)) - img) * np.clip(mask, 0.0, 1.0)\n        img = Image.fromarray(np.uint8(np.clip(np.rint(img), 0, 255)), 'RGB')\n        quad += pad[:2]\n\n    # Transform.\n    img = img.transform((transform_size, transform_size), Image.QUAD, (quad + 0.5).flatten(), Image.BILINEAR)\n\n    return img\n\n\nclass AttrDict(dict):\n    def __init__(self, *args, **kwargs):\n        super(AttrDict, self).__init__(*args, **kwargs)\n        self.__dict__ = self\n\n\nclass Pixel2Style2PixelPredictor:\n    def __init__(self,\n                 weight_path=None,\n                 model_type=None,\n                 seed=None,\n                 size=1024,\n                 style_dim=512,\n                 n_mlp=8,\n                 channel_multiplier=2):\n\n        if weight_path is None and model_type != 'default':\n            if model_type in 
model_cfgs.keys():\n                weight_path = get_path_from_url(model_cfgs[model_type]['model_urls'])\n                size = model_cfgs[model_type].get('size', size)\n                style_dim = model_cfgs[model_type].get('style_dim', style_dim)\n                n_mlp = model_cfgs[model_type].get('n_mlp', n_mlp)\n                channel_multiplier = model_cfgs[model_type].get('channel_multiplier', channel_multiplier)\n                checkpoint = paddle.load(weight_path)\n            else:\n                raise ValueError('Predictor need a weight path or a pretrained model type')\n        else:\n            checkpoint = paddle.load(weight_path)\n\n        opts = checkpoint.pop('opts')\n        opts = AttrDict(opts)\n        opts['size'] = size\n        opts['style_dim'] = style_dim\n        opts['n_mlp'] = n_mlp\n        opts['channel_multiplier'] = channel_multiplier\n\n        self.generator = Pixel2Style2Pixel(opts)\n        self.generator.set_state_dict(checkpoint)\n        self.generator.eval()\n\n        if seed is not None:\n            paddle.seed(seed)\n            random.seed(seed)\n            np.random.seed(seed)\n\n        self.model_type = 'default' if model_type is None else model_type\n\n    def run(self, image):\n        src_img = run_alignment(image)\n        src_img = np.asarray(src_img)\n        transformed_image = model_cfgs[self.model_type]['transform'](src_img)\n        dst_img, latents = self.generator(\n            paddle.to_tensor(transformed_image[None, ...]), resize=False, return_latents=True)\n        dst_img = (dst_img * 0.5 + 0.5)[0].numpy() * 255\n        dst_img = dst_img.transpose((1, 2, 0))\n        dst_npy = latents[0].numpy()\n\n        return dst_img, dst_npy\n"
  },
  {
    "path": "modules/image/Image_gan/gan/pixel2style2pixel/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\nfrom skimage.transform import rescale, resize\n\nfrom .model import Pixel2Style2PixelPredictor\nfrom .util import base64_to_cv2\n\n\n@moduleinfo(\n    name=\"pixel2style2pixel\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass pixel2style2pixel:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"pSp-ffhq-inversion.pdparams\")\n\n        self.network = Pixel2Style2PixelPredictor(weight_path=self.pretrained_model, model_type='ffhq-inversion')\n\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='./transfer_result/',\n                       use_gpu=False,\n                       visualization=True):\n        '''\n\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR(read by cv2).\n        paths (list[str]): paths to images\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        
visualization: if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out is not None:\n                    cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[0][:, :, ::-1])\n                    np.save(os.path.join(output_dir, 'output_{}.npy'.format(i)), out[1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.style_transfer(\n            paths=[self.args.input_path],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.style_transfer(images=images_decode, **kwargs)\n        # each result is an (image, latent) pair of ndarrays\n        tolist = [[part.tolist() for part in result] for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/gan/pixel2style2pixel/requirements.txt",
    "content": "ppgan\ndlib\n"
  },
  {
    "path": "modules/image/Image_gan/gan/pixel2style2pixel/util.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/README.md",
    "content": "# stgan_bald\n\n|模型名称|stgan_bald|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|STGAN|\n|数据集|CelebA|\n|是否支持Fine-tuning|否|\n|模型大小|287MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 详情请查看此链接：https://aistudio.baidu.com/aistudio/projectdetail/1145381\n\n- ### 模型介绍\n\n  - stgan_bald 以STGAN 为模型，使用 CelebA 数据集训练完成，该模型可自动根据图像生成1年、3年、5年的秃头效果。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stgan_bald\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    stgan_bald = hub.Module(name=\"stgan_bald\")\n    result = stgan_bald.bald(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = stgan_bald.bald(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def bald(images=None,\n             paths=None,\n             use_gpu=False,\n             visualization=False,\n             output_dir=\"bald_output\")\n    ```\n\n    - 秃头生成器API预测接口, 预测输入一张人像，输出三张秃头效果(1年、3年、5年)。\n\n    - **参数**\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否将结果保存为图片，默认为 False; <br/>\n      - output\\_dir (str): 图片的保存路径，默认设为bald\\_output。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个秃头生成器服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m stgan_bald\n    ```\n\n  - 这样就完成了一个秃头生成器API的部署，默认端口号为8866。\n\n  - 
**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n      data = base64.b64decode(b64str.encode('utf8'))\n      data = np.frombuffer(data, np.uint8)\n      data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n      return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stgan_bald\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存图片（1年、3年、5年）\n    one_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_0']), cv2.COLOR_RGB2BGR)\n    three_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_1']), cv2.COLOR_RGB2BGR)\n    five_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_2']), cv2.COLOR_RGB2BGR)\n    cv2.imwrite(\"stgan_bald_server.png\", one_year)\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install stgan_bald==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/README_en.md",
    "content": "# stgan_bald\n\n|Module Name|stgan_bald|\n| :--- | :---: |\n|Category|image generation|\n|Network|STGAN|\n|Dataset|CelebA|\n|Fine-tuning supported or not|No|\n|Module Size|287MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Please refer to this [link](https://aistudio.baidu.com/aistudio/projectdetail/1145381)\n\n- ### Module Introduction\n\n  - This module is based on STGAN model, trained on CelebA dataset, and can be used to predict bald appearance after 1, 3 and 5 years.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install stgan_bald\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    stgan_bald = hub.Module(name=\"stgan_bald\")\n    result = stgan_bald.bald(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = stgan_bald.bald(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def bald(images=None,\n             paths=None,\n             use_gpu=False,\n             visualization=False,\n             output_dir=\"bald_output\")\n    ```\n\n    - Bald appearance generation API.\n\n    - **Parameters**\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - visualization 
(bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n\n      **NOTE:** provide data with either paths or images, not both\n\n    - **Return**\n\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of bald appearance generation.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m stgan_bald\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n      data = base64.b64decode(b64str.encode('utf8'))\n      data = np.frombuffer(data, np.uint8)\n      data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n      return data\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stgan_bald\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # save results (1, 3 and 5 years)\n    one_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_0']), cv2.COLOR_RGB2BGR)\n    three_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_1']), cv2.COLOR_RGB2BGR)\n    five_year = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"]['data_2']), cv2.COLOR_RGB2BGR)\n    cv2.imwrite(\"stgan_bald_server.png\", one_year)\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install stgan_bald==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/data_feed.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None, org_labels=None, target_labels=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n        org_labels (list): original attribute labels, one per image.\n        target_labels (list): target attribute labels; defaults to org_labels when not given.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for i, im_path in enumerate(paths):\n            each = OrderedDict()\n            assert os.path.isfile(\n                im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_label'] = np.array(org_labels[i]).astype('float32')\n            if not target_labels:\n                each['target_label'] = np.array(\n                    org_labels[i]).astype('float32')\n            else:\n                each['target_label'] = np.array(\n                    target_labels[i]).astype('float32')\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for i, im in enumerate(images):\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(\n                round(time.time(), 6) * 1e6)\n            each['org_label'] = np.array(org_labels[i]).astype('float32')\n            if not target_labels:\n                each['target_label'] = np.array(\n                    org_labels[i]).astype('float32')\n            else:\n                each['target_label'] = np.array(\n                    target_labels[i]).astype('float32')\n            component.append(each)\n\n    for element in component:\n        img = 
cv2.cvtColor(element['org_im'], cv2.COLOR_BGR2RGB)\n        img = cv2.resize(img, (128, 128), interpolation=cv2.INTER_LINEAR)\n        img = (img.astype('float32') / 255.0 - 0.5) / 0.5\n        img = img.transpose([2, 0, 1])\n        element['img'] = img[np.newaxis, :, :, :]\n\n        yield element\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport copy\nimport paddle\nimport numpy as np\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, serving\nfrom .data_feed import reader\nfrom .processor import postprocess, base64_to_cv2, cv2_to_base64\n\n\ndef check_attribute_conflict(label_batch):\n    ''' Based on https://github.com/LynnHo/AttGAN-Tensorflow'''\n    attrs = \"Bald,Bangs,Black_Hair,Blond_Hair,Brown_Hair,Bushy_Eyebrows,Eyeglasses,Male,Mouth_Slightly_Open,Mustache,No_Beard,Pale_Skin,Young\".split(\n        ',')\n\n    def _set(label, value, attr):\n        if attr in attrs:\n            label[attrs.index(attr)] = value\n\n    attr_id = attrs.index('Bald')\n    for label in label_batch:\n        if label[attr_id] != 0:\n            _set(label, 0, 'Bangs')\n\n    return label_batch\n\n\n@moduleinfo(\n    name=\"stgan_bald\",\n    version=\"1.1.0\",\n    summary=\"Baldness generator\",\n    author=\"Arrow, 七年期限，Mr.郑先生_\",\n    author_email=\"1084667371@qq.com，2733821739@qq.com\",\n    type=\"image/gan\")\nclass StganBald:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"module\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n            self.place = paddle.CUDAPlace(0)\n        except Exception:\n            use_gpu = False\n            self.place = paddle.CPUPlace()\n\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(\n                memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def bald(self,\n             images=None,\n             paths=None,\n             data=None,\n             use_gpu=False,\n             org_labels=[[0., 0., 1., 0., 0., 1., 1., 1., 0., 0., 0., 0., 1.]],\n             target_labels=None,\n             visualization=True,\n             output_dir=\"bald_output\"):\n        \"\"\"\n        API for baldness generation.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            data (dict): key is 'image', the corresponding value is the path to image.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. 
(Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if data and 'image' in data:\n            if paths is None:\n                paths = list()\n            paths += data['image']\n\n        all_data = list()\n        for yield_data in reader(images, paths, org_labels, target_labels):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        res = list()\n        outputs = []\n        for i in range(total_num):\n            image_np = all_data[i]['img']\n            org_label_np = [all_data[i]['org_label']]\n            target_label_np = [all_data[i]['target_label']]\n            for j in range(5):\n                if j % 2 == 0:\n                    label_trg_tmp = copy.deepcopy(target_label_np)\n                    new_i = 0\n                    label_trg_tmp[0][new_i] = 1.0 - label_trg_tmp[0][new_i]\n                    label_trg_tmp = check_attribute_conflict(\n                        label_trg_tmp)\n                    change_num = j * 0.02 + 0.3\n                    label_org_tmp = list(\n                        map(lambda x: ((x * 2) - 1) * change_num, org_label_np))\n                    label_trg_tmp = list(\n                        map(lambda x: ((x * 2) - 1) * change_num, label_trg_tmp))\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    
input_handle.copy_from_cpu(image_np.copy())\n                    input_handle = predictor.get_input_handle(input_names[1])\n                    input_handle.copy_from_cpu(\n                        np.array(label_org_tmp).astype('float32'))\n                    input_handle = predictor.get_input_handle(input_names[2])\n                    input_handle.copy_from_cpu(\n                        np.array(label_trg_tmp).astype('float32'))\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(\n                        output_names[0])\n                    outputs.append(output_handle)\n\n            out = postprocess(\n                data_out=outputs,\n                org_im=all_data[i]['org_im'],\n                org_im_path=all_data[i]['org_im_path'],\n                output_dir=output_dir,\n                visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.bald(images=images_decode, **kwargs)\n        output = {}\n        for key, value in results[0].items():\n            output[key] = cv2_to_base64(value)\n\n        return output\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport base64\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out,\n                org_im,\n                org_im_path,\n                output_dir,\n                visualization,\n                thresh=120):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (list): output tensors of network, one per generated image.\n        org_im (numpy.ndarray): original image.\n        org_im_path (str): path of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        thresh (int): threshold (currently unused).\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for i, img in enumerate(data_out):\n\n        img = np.squeeze(img.copy_to_cpu(), 0).transpose((1, 2, 0))\n        img = ((img + 1) * 127.5).astype(np.uint8)\n        img = cv2.resize(img, (256, 341), interpolation=cv2.INTER_CUBIC)\n        fake_image = Image.fromarray(img)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im_path, output_dir, i)\n            img_name = '{}.png'.format(i)\n            fake_image.save(os.path.join(output_dir, img_name))\n\n        result['data_{}'.format(i)] = img\n\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im_path, output_dir, num):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(\n            output_dir, im_prefix + str(num) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/Image_gan/gan/stgan_bald/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"stgan_bald\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('bald_output')\n\n    def test_bald1(self):\n        results = self.module.bald(\n            paths=['tests/test.jpg']\n        )\n        data_0 = results[0]['data_0']\n        data_1 = results[0]['data_1']\n        data_2 = results[0]['data_2']\n        self.assertIsInstance(data_0, np.ndarray)\n        self.assertIsInstance(data_1, np.ndarray)\n        self.assertIsInstance(data_2, np.ndarray)\n\n    def test_bald2(self):\n        results = self.module.bald(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        data_0 = results[0]['data_0']\n        data_1 = results[0]['data_1']\n        data_2 = results[0]['data_2']\n        self.assertIsInstance(data_0, np.ndarray)\n        self.assertIsInstance(data_1, np.ndarray)\n        self.assertIsInstance(data_2, np.ndarray)\n\n    def test_bald3(self):\n        results = self.module.bald(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        data_0 = results[0]['data_0']\n        data_1 = results[0]['data_1']\n        data_2 = results[0]['data_2']\n        self.assertIsInstance(data_0, np.ndarray)\n        self.assertIsInstance(data_1, 
np.ndarray)\n        self.assertIsInstance(data_2, np.ndarray)\n\n    def test_bald4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.bald,\n            paths=['no.jpg']\n        )\n\n    def test_bald5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.bald,\n            images=['tests/test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/README.md",
    "content": "# styleganv2_editing\n\n|模型名称|styleganv2_editing|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|StyleGAN V2|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|190MB|\n|最新更新日期|2021-12-15|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/146483720-fb0ea3c0-b259-4ad6-b176-966675b9b164.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/146483730-3104795e-4ee6-43de-b4dc-b7760d502b50.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出图像(修改age)\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - StyleGAN V2 的任务是使用风格向量进行image generation，而Editing模块则是利用预先对多图的风格向量进行分类回归得到的属性操纵向量来操纵生成图像的属性。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install styleganv2_editing\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run styleganv2_editing --input_path \"/PATH/TO/IMAGE\" --direction_name age --direction_offset 5\n    ```\n  - 通过命令行方式实现人脸编辑模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"styleganv2_editing\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.generate(paths=input_path, direction_name='age', direction_offset=5, output_dir='./editing_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    generate(self, images=None, paths=None, direction_name='age', direction_offset=0.0, output_dir='./editing_result/', use_gpu=False, visualization=True)\n    ```\n    - 人脸编辑生成API。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据 <br/>\n      - paths (list\\[str\\]): 图片路径；<br/>\n      - direction_name (str): 要编辑的属性名称，对于ffhq-config-f有预先准备的这些属性: age、eyes_open、eye_distance、eye_eyebrow_distance、eye_ratio、gender、lip_ratio、mouth_open、mouth_ratio、nose_mouth_distance、nose_ratio、nose_tip、pitch、roll、smile、yaw <br/>\n      - direction_offset (float): 属性的偏移强度 <br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸编辑服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m styleganv2_editing\n    ```\n\n  - 这样就完成了一个人脸编辑的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/styleganv2_editing\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install styleganv2_editing==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/basemodel.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport random\nimport numpy as np\nimport paddle\nfrom ppgan.models.generators import StyleGANv2Generator\nfrom ppgan.utils.download import get_path_from_url\nfrom ppgan.utils.visual import make_grid, tensor2img, save_image\n\nmodel_cfgs = {\n    'ffhq-config-f': {\n        'model_urls': 'https://paddlegan.bj.bcebos.com/models/stylegan2-ffhq-config-f.pdparams',\n        'size': 1024,\n        'style_dim': 512,\n        'n_mlp': 8,\n        'channel_multiplier': 2\n    },\n    'animeface-512': {\n        'model_urls': 'https://paddlegan.bj.bcebos.com/models/stylegan2-animeface-512.pdparams',\n        'size': 512,\n        'style_dim': 512,\n        'n_mlp': 8,\n        'channel_multiplier': 2\n    }\n}\n\n\n@paddle.no_grad()\ndef get_mean_style(generator):\n    mean_style = None\n\n    for i in range(10):\n        style = generator.mean_latent(1024)\n\n        if mean_style is None:\n            mean_style = style\n\n        else:\n            mean_style += style\n\n    mean_style /= 10\n    return mean_style\n\n\n@paddle.no_grad()\ndef sample(generator, mean_style, n_sample):\n    image = generator(\n        [paddle.randn([n_sample, generator.style_dim])],\n        truncation=0.7,\n        truncation_latent=mean_style,\n    )[0]\n\n    return image\n\n\n@paddle.no_grad()\ndef style_mixing(generator, mean_style, n_source, 
n_target):\n    source_code = paddle.randn([n_source, generator.style_dim])\n    target_code = paddle.randn([n_target, generator.style_dim])\n\n    resolution = 2**((generator.n_latent + 2) // 2)\n\n    images = [paddle.ones([1, 3, resolution, resolution]) * -1]\n\n    source_image = generator([source_code], truncation_latent=mean_style, truncation=0.7)[0]\n    target_image = generator([target_code], truncation_latent=mean_style, truncation=0.7)[0]\n\n    images.append(source_image)\n\n    for i in range(n_target):\n        image = generator(\n            [target_code[i].unsqueeze(0).tile([n_source, 1]), source_code],\n            truncation_latent=mean_style,\n            truncation=0.7,\n        )[0]\n        images.append(target_image[i].unsqueeze(0))\n        images.append(image)\n\n    images = paddle.concat(images, 0)\n\n    return images\n\n\nclass StyleGANv2Predictor:\n    def __init__(self,\n                 output_path='output_dir',\n                 weight_path=None,\n                 model_type=None,\n                 seed=None,\n                 size=1024,\n                 style_dim=512,\n                 n_mlp=8,\n                 channel_multiplier=2):\n        self.output_path = output_path\n\n        if weight_path is None:\n            if model_type in model_cfgs.keys():\n                weight_path = get_path_from_url(model_cfgs[model_type]['model_urls'])\n                size = model_cfgs[model_type].get('size', size)\n                style_dim = model_cfgs[model_type].get('style_dim', style_dim)\n                n_mlp = model_cfgs[model_type].get('n_mlp', n_mlp)\n                channel_multiplier = model_cfgs[model_type].get('channel_multiplier', channel_multiplier)\n                checkpoint = paddle.load(weight_path)\n            else:\n                raise ValueError('Predictor need a weight path or a pretrained model type')\n        else:\n            checkpoint = paddle.load(weight_path)\n\n        self.generator = 
StyleGANv2Generator(size, style_dim, n_mlp, channel_multiplier)\n        self.generator.set_state_dict(checkpoint)\n        self.generator.eval()\n\n        if seed is not None:\n            paddle.seed(seed)\n            random.seed(seed)\n            np.random.seed(seed)\n\n    def run(self, n_row=3, n_col=5):\n        os.makedirs(self.output_path, exist_ok=True)\n        mean_style = get_mean_style(self.generator)\n\n        img = sample(self.generator, mean_style, n_row * n_col)\n        save_image(tensor2img(make_grid(img, nrow=n_col)), f'{self.output_path}/sample.png')\n\n        for j in range(2):\n            img = style_mixing(self.generator, mean_style, n_col, n_row)\n            save_image(tensor2img(make_grid(img, nrow=n_col + 1)), f'{self.output_path}/sample_mixing_{j}.png')\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/model.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport numpy as np\nimport paddle\n\nfrom ppgan.utils.download import get_path_from_url\nfrom .basemodel import StyleGANv2Predictor\n\nmodel_cfgs = {\n    'ffhq-config-f': {\n        'direction_urls': 'https://paddlegan.bj.bcebos.com/models/stylegan2-ffhq-config-f-directions.pdparams'\n    }\n}\n\n\ndef make_image(tensor):\n    return (((tensor.detach() + 1) / 2 * 255).clip(min=0, max=255).transpose((0, 2, 3, 1)).numpy().astype('uint8'))\n\n\nclass StyleGANv2EditingPredictor(StyleGANv2Predictor):\n    def __init__(self, model_type=None, direction_path=None, **kwargs):\n        super().__init__(model_type=model_type, **kwargs)\n\n        if direction_path is None and model_type is not None:\n            assert model_type in model_cfgs, f'There is not any pretrained direction file for {model_type} model.'\n            direction_path = get_path_from_url(model_cfgs[model_type]['direction_urls'])\n        self.directions = paddle.load(direction_path)\n\n    @paddle.no_grad()\n    def run(self, latent, direction, offset):\n\n        latent = paddle.to_tensor(latent).unsqueeze(0).astype('float32')\n        direction = self.directions[direction].unsqueeze(0).astype('float32')\n\n        latent_n = paddle.concat([latent, latent + offset * direction], 0)\n        generator = self.generator\n        img_gen, _ = 
generator([latent_n], input_is_latent=True, randomize_noise=False)\n        imgs = make_image(img_gen)\n        src_img = imgs[0]\n        dst_img = imgs[1]\n\n        dst_latent = (latent + offset * direction)[0].numpy().astype('float32')\n\n        return src_img, dst_img, dst_latent\n"
  },
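The `make_image` helper in the entry above maps generator output from NCHW floats in [-1, 1] to NHWC uint8 images. A minimal numpy sketch of the same normalization (numpy arrays stand in for paddle tensors, so `detach()` is omitted; the function name `make_image_np` and the sample values are illustrative, not part of the module):

```python
import numpy as np

def make_image_np(tensor):
    """Mirror of make_image: map NCHW floats in [-1, 1] to NHWC uint8 in [0, 255]."""
    out = (tensor + 1) / 2 * 255                         # [-1, 1] -> [0, 255]
    out = np.clip(out, 0, 255)                           # guard against overshoot
    return out.transpose((0, 2, 3, 1)).astype('uint8')   # NCHW -> NHWC

batch = np.array([[[[-1.0, 1.0]], [[0.0, 0.5]], [[2.0, -2.0]]]])  # shape (1, 3, 1, 2)
imgs = make_image_np(batch)
print(imgs.shape)  # (1, 1, 2, 3)
```

The clip matters: StyleGAN outputs are only approximately bounded, so values like 2.0 above are saturated to 255 rather than wrapping around in the uint8 cast.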
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\nfrom skimage.transform import rescale, resize\n\nfrom .model import StyleGANv2EditingPredictor\nfrom .util import base64_to_cv2\n\n\n@moduleinfo(\n    name=\"styleganv2_editing\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass styleganv2_editing:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"stylegan2-ffhq-config-f-directions.pdparams\")\n\n        self.network = StyleGANv2EditingPredictor(direction_path=self.pretrained_model, model_type='ffhq-config-f')\n        self.pixel2style2pixel_module = hub.Module(name='pixel2style2pixel')\n\n    def generate(self,\n                 images=None,\n                 paths=None,\n                 direction_name='age',\n                 direction_offset=0.0,\n                 output_dir='./editing_result/',\n                 use_gpu=False,\n                 visualization=True):\n        '''\n\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR(read by cv2).\n        paths (list[str]): paths to 
images.\n        direction_name(str): Attribute to be manipulated. For ffhq-config-f, we have: age, eyes_open, eye_distance, eye_eyebrow_distance, eye_ratio, gender, lip_ratio, mouth_open, mouth_ratio, nose_mouth_distance, nose_ratio, nose_tip, pitch, roll, smile, yaw.\n        direction_offset(float): Offset strength of the attribute.\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        visualization: if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                _, latent = self.pixel2style2pixel_module.network.run(image)\n                out = self.network.run(latent, direction_name, direction_offset)\n                results.append(out)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                _, latent = self.pixel2style2pixel_module.network.run(image)\n                out = self.network.run(latent, direction_name, direction_offset)\n                results.append(out)\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out is not None:\n                    cv2.imwrite(os.path.join(output_dir, 'src_{}.png'.format(i)), out[0][:, :, ::-1])\n                    cv2.imwrite(os.path.join(output_dir, 'dst_{}.png'.format(i)), out[1][:, :, ::-1])\n                    np.save(os.path.join(output_dir, 'dst_{}.npy'.format(i)), out[2])\n\n        return results\n\n    
@runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.generate(\n            paths=[self.args.input_path],\n            direction_name=self.args.direction_name,\n            direction_offset=self.args.direction_offset,\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.generate(images=images_decode, **kwargs)\n        # each result is a (src_img, dst_img, dst_latent) tuple of ndarrays\n        tolist = [[item.tolist() for item in result] for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='editing_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n        self.arg_input_group.add_argument(\n            '--direction_name',\n            type=str,\n            default='age',\n            help=\n            \"Attribute to be manipulated. For ffhq-config-f, we have: age, eyes_open, eye_distance, eye_eyebrow_distance, eye_ratio, gender, lip_ratio, mouth_open, mouth_ratio, nose_mouth_distance, nose_ratio, nose_tip, pitch, roll, smile, yaw.\"\n        )\n        self.arg_input_group.add_argument('--direction_offset', type=float, help=\"Offset strength of the attribute.\")\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_editing/util.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
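`base64_to_cv2` above is the server-side half of the round trip whose client-side half (`cv2_to_base64`) appears in the module READMEs. A sketch of just the byte-level handling, without cv2 so only the base64/buffer conversion is exercised (`bytes_to_base64` and `base64_to_array` are illustrative names; `np.frombuffer` is the non-deprecated counterpart of `np.fromstring`):

```python
import base64
import numpy as np

def bytes_to_base64(data: bytes) -> str:
    # Client direction: raw encoded image bytes -> base64 string for the JSON payload.
    return base64.b64encode(data).decode('utf8')

def base64_to_array(b64str: str) -> np.ndarray:
    # Server direction: base64 string -> uint8 buffer, as base64_to_cv2 does
    # before handing the buffer to cv2.imdecode.
    data = base64.b64decode(b64str.encode('utf8'))
    return np.frombuffer(data, np.uint8)

payload = bytes(range(10))
restored = base64_to_array(bytes_to_base64(payload))
print(restored.tolist())  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```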
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/README.md",
    "content": "# styleganv2_mixing\n\n|模型名称|styleganv2_mixing|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|StyleGAN V2|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|190MB|\n|最新更新日期|2021-12-23|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/147241001-3babb1bd-98d4-4a9c-a61d-2298fca041e1.jpg\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像1\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/147241006-0bc2cda8-d271-4cfd-8a0d-e6feea7bf167.jpg\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像2\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/147241020-f4420729-c489-4661-b43f-c929c62c0ce7.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - StyleGAN V2 的任务是使用风格向量进行image generation，而Mixing模块则是利用其风格向量实现两张生成图像不同层次不同比例的混合。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - paddlepaddle >= 2.1.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install styleganv2_mixing\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run styleganv2_mixing --image1 \"/PATH/TO/IMAGE1\" --image2 \"/PATH/TO/IMAGE2\"\n    ```\n  - 通过命令行方式实现人脸融合模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"styleganv2_mixing\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.generate(paths=input_path, direction_name = 'age', direction_offset = 5, output_dir='./editing_result/', use_gpu=True)  \n    
```\n\n- ### 3、API\n\n  - ```python\n    generate(self, images=None, paths=None, weights = [0.5] * 18, output_dir='./mixing_result/', use_gpu=False, visualization=True)\n    ```\n    - 人脸融合生成API。\n\n    - **参数**\n      - images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 image1, image2, 相应取值为：\n          - image1 (numpy.ndarray): 待融合的图片1，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n          - image2 (numpy.ndarray) : 待融合的图片2，shape为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list[str]): paths to images, 每一个元素都为一个dict, 有关键字 image1, image2, 相应取值为：\n          - image1 (str): 待融合的图片1的路径；<br/>\n          - image2 (str) : 待融合的图片2的路径；<br/>\n      - weights (list(float)): 融合的权重；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization(bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸融合服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m styleganv2_mixing\n    ```\n\n  - 这样就完成了一个人脸融合的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[{'image1': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\")),'image2': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE2\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/styleganv2_mixing\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install styleganv2_mixing==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/basemodel.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport random\nimport numpy as np\nimport paddle\nfrom ppgan.models.generators import StyleGANv2Generator\nfrom ppgan.utils.download import get_path_from_url\nfrom ppgan.utils.visual import make_grid, tensor2img, save_image\n\nmodel_cfgs = {\n    'ffhq-config-f': {\n        'model_urls': 'https://paddlegan.bj.bcebos.com/models/stylegan2-ffhq-config-f.pdparams',\n        'size': 1024,\n        'style_dim': 512,\n        'n_mlp': 8,\n        'channel_multiplier': 2\n    },\n    'animeface-512': {\n        'model_urls': 'https://paddlegan.bj.bcebos.com/models/stylegan2-animeface-512.pdparams',\n        'size': 512,\n        'style_dim': 512,\n        'n_mlp': 8,\n        'channel_multiplier': 2\n    }\n}\n\n\n@paddle.no_grad()\ndef get_mean_style(generator):\n    mean_style = None\n\n    for i in range(10):\n        style = generator.mean_latent(1024)\n\n        if mean_style is None:\n            mean_style = style\n\n        else:\n            mean_style += style\n\n    mean_style /= 10\n    return mean_style\n\n\n@paddle.no_grad()\ndef sample(generator, mean_style, n_sample):\n    image = generator(\n        [paddle.randn([n_sample, generator.style_dim])],\n        truncation=0.7,\n        truncation_latent=mean_style,\n    )[0]\n\n    return image\n\n\n@paddle.no_grad()\ndef style_mixing(generator, mean_style, n_source, 
n_target):\n    source_code = paddle.randn([n_source, generator.style_dim])\n    target_code = paddle.randn([n_target, generator.style_dim])\n\n    resolution = 2**((generator.n_latent + 2) // 2)\n\n    images = [paddle.ones([1, 3, resolution, resolution]) * -1]\n\n    source_image = generator([source_code], truncation_latent=mean_style, truncation=0.7)[0]\n    target_image = generator([target_code], truncation_latent=mean_style, truncation=0.7)[0]\n\n    images.append(source_image)\n\n    for i in range(n_target):\n        image = generator(\n            [target_code[i].unsqueeze(0).tile([n_source, 1]), source_code],\n            truncation_latent=mean_style,\n            truncation=0.7,\n        )[0]\n        images.append(target_image[i].unsqueeze(0))\n        images.append(image)\n\n    images = paddle.concat(images, 0)\n\n    return images\n\n\nclass StyleGANv2Predictor:\n    def __init__(self,\n                 output_path='output_dir',\n                 weight_path=None,\n                 model_type=None,\n                 seed=None,\n                 size=1024,\n                 style_dim=512,\n                 n_mlp=8,\n                 channel_multiplier=2):\n        self.output_path = output_path\n\n        if weight_path is None:\n            if model_type in model_cfgs.keys():\n                weight_path = get_path_from_url(model_cfgs[model_type]['model_urls'])\n                size = model_cfgs[model_type].get('size', size)\n                style_dim = model_cfgs[model_type].get('style_dim', style_dim)\n                n_mlp = model_cfgs[model_type].get('n_mlp', n_mlp)\n                channel_multiplier = model_cfgs[model_type].get('channel_multiplier', channel_multiplier)\n                checkpoint = paddle.load(weight_path)\n            else:\n                raise ValueError('Predictor need a weight path or a pretrained model type')\n        else:\n            checkpoint = paddle.load(weight_path)\n\n        self.generator = 
StyleGANv2Generator(size, style_dim, n_mlp, channel_multiplier)\n        self.generator.set_state_dict(checkpoint)\n        self.generator.eval()\n\n        if seed is not None:\n            paddle.seed(seed)\n            random.seed(seed)\n            np.random.seed(seed)\n\n    def run(self, n_row=3, n_col=5):\n        os.makedirs(self.output_path, exist_ok=True)\n        mean_style = get_mean_style(self.generator)\n\n        img = sample(self.generator, mean_style, n_row * n_col)\n        save_image(tensor2img(make_grid(img, nrow=n_col)), f'{self.output_path}/sample.png')\n\n        for j in range(2):\n            img = style_mixing(self.generator, mean_style, n_col, n_row)\n            save_image(tensor2img(make_grid(img, nrow=n_col + 1)), f'{self.output_path}/sample_mixing_{j}.png')\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/model.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport numpy as np\nimport paddle\n\nfrom .basemodel import StyleGANv2Predictor\n\n\ndef make_image(tensor):\n    return (((tensor.detach() + 1) / 2 * 255).clip(min=0, max=255).transpose((0, 2, 3, 1)).numpy().astype('uint8'))\n\n\nclass StyleGANv2MixingPredictor(StyleGANv2Predictor):\n    @paddle.no_grad()\n    def run(self, latent1, latent2, weights=[0.5] * 18):\n\n        latent1 = paddle.to_tensor(latent1).unsqueeze(0)\n        latent2 = paddle.to_tensor(latent2).unsqueeze(0)\n        assert latent1.shape[1] == latent2.shape[1] == len(\n            weights), 'latents and their weights should have the same level nums.'\n        mix_latent = []\n        for i, weight in enumerate(weights):\n            mix_latent.append(latent1[:, i:i + 1] * weight + latent2[:, i:i + 1] * (1 - weight))\n        mix_latent = paddle.concat(mix_latent, 1)\n        latent_n = paddle.concat([latent1, latent2, mix_latent], 0)\n        generator = self.generator\n        img_gen, _ = generator([latent_n], input_is_latent=True, randomize_noise=False)\n        imgs = make_image(img_gen)\n        src_img1 = imgs[0]\n        src_img2 = imgs[1]\n        dst_img = imgs[2]\n\n        return src_img1, src_img2, dst_img\n"
  },
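`StyleGANv2MixingPredictor.run` above blends two W+ latents level by level: level `i` of the mix is `weights[i] * latent1 + (1 - weights[i]) * latent2`. A numpy sketch of that interpolation for the 18-level, 512-dim latents of ffhq-config-f (the helper name `mix_latents` and the random latents are illustrative):

```python
import numpy as np

def mix_latents(latent1, latent2, weights):
    """Per-level blend, as in StyleGANv2MixingPredictor.run:
    level i of the result is w_i * latent1[i] + (1 - w_i) * latent2[i]."""
    assert latent1.shape[0] == latent2.shape[0] == len(weights)
    w = np.asarray(weights)[:, None]      # (levels, 1), broadcasts over the style dim
    return w * latent1 + (1 - w) * latent2

rng = np.random.default_rng(0)
l1 = rng.standard_normal((18, 512))       # W+ latent: 18 levels x 512 dims
l2 = rng.standard_normal((18, 512))
mixed = mix_latents(l1, l2, [0.5] * 18)   # the module's default: even blend everywhere
coarse = mix_latents(l1, l2, [1.0] * 9 + [0.0] * 9)  # keep l1's coarse levels, l2's fine levels
```

Non-uniform weight vectors like `coarse` are what make style mixing interesting: early levels control pose and face shape, later levels control texture and color.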
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\nfrom skimage.transform import rescale, resize\n\nfrom .model import StyleGANv2MixingPredictor\nfrom .util import base64_to_cv2\n\n\n@moduleinfo(\n    name=\"styleganv2_mixing\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass styleganv2_mixing:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"stylegan2-ffhq-config-f.pdparams\")\n        self.network = StyleGANv2MixingPredictor(weight_path=self.pretrained_model, model_type='ffhq-config-f')\n        self.pixel2style2pixel_module = hub.Module(name='pixel2style2pixel')\n\n    def generate(self,\n                 images=None,\n                 paths=None,\n                 weights=[0.5] * 18,\n                 output_dir='./mixing_result/',\n                 use_gpu=False,\n                 visualization=True):\n        '''\n        images (list[dict]): data of images, each element is a dict，the keys are as below：\n          - image1 (numpy.ndarray): image1 to be mixed，shape is \\[H, W, C\\]，BGR format；<br/>\n          - image2 (numpy.ndarray) : image2 
to be mixed, shape is \\[H, W, C\\], BGR format;<br/>\n        paths (list[str]): paths to images, each element is a dict, the keys are as below:\n          - image1 (str): path to image1;<br/>\n          - image2 (str): path to image2;<br/>\n        weights (list(float)): weight for mixing\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        visualization: if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n        if images is not None:\n            for image_dict in images:\n                image1 = image_dict['image1'][:, :, ::-1]\n                image2 = image_dict['image2'][:, :, ::-1]\n                _, latent1 = self.pixel2style2pixel_module.network.run(image1)\n                _, latent2 = self.pixel2style2pixel_module.network.run(image2)\n                results.append(self.network.run(latent1, latent2, weights))\n\n        if paths is not None:\n            for path_dict in paths:\n                path1 = path_dict['image1']\n                path2 = path_dict['image2']\n                image1 = cv2.imread(path1)[:, :, ::-1]\n                image2 = cv2.imread(path2)[:, :, ::-1]\n                _, latent1 = self.pixel2style2pixel_module.network.run(image1)\n                _, latent2 = self.pixel2style2pixel_module.network.run(image2)\n                results.append(self.network.run(latent1, latent2, weights))\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out is not None:\n                    cv2.imwrite(os.path.join(output_dir, 
'src_{}_image1.png'.format(i)), out[0][:, :, ::-1])\n                    cv2.imwrite(os.path.join(output_dir, 'src_{}_image2.png'.format(i)), out[1][:, :, ::-1])\n                    cv2.imwrite(os.path.join(output_dir, 'dst_{}.png'.format(i)), out[2][:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.generate(\n            paths=[{\n                'image1': self.args.image1,\n                'image2': self.args.image2\n            }],\n            weights=self.args.weights,\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['image1'] = base64_to_cv2(image['image1'])\n            image['image2'] = base64_to_cv2(image['image2'])\n        results = self.generate(images=images_decode, **kwargs)\n        # each result is a (src_img1, src_img2, dst_img) tuple of ndarrays\n        tolist = [[item.tolist() for item in result] for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config 
options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='mixing_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--image1', type=str, help=\"path to input image1.\")\n        self.arg_input_group.add_argument('--image2', type=str, help=\"path to input image2.\")\n        self.arg_input_group.add_argument(\n            \"--weights\",\n            type=float,\n            nargs=\"+\",\n            default=[0.5] * 18,\n            help=\"different weights at each level of two latent codes\")\n"
  },
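`run_cmd` in the module above accepts one float per latent level through `--weights` via argparse's `nargs="+"`. A standalone sketch of how that flag parses (the argument definitions mirror `add_module_input_arg`; the sample file names are illustrative):

```python
import argparse

parser = argparse.ArgumentParser(prog='hub run styleganv2_mixing')
# Mirrors add_module_input_arg: one float per latent level, 18 levels by default.
parser.add_argument('--weights', type=float, nargs='+', default=[0.5] * 18,
                    help='different weights at each level of two latent codes')
parser.add_argument('--image1', type=str, help='path to input image1.')
parser.add_argument('--image2', type=str, help='path to input image2.')

args = parser.parse_args(['--image1', 'a.png', '--image2', 'b.png',
                          '--weights', '1.0', '1.0', '0.0'])
print(args.weights)  # [1.0, 1.0, 0.0]
```

Note that `nargs="+"` with `type=float` converts each token individually, so a partial weight list (here 3 of 18 levels) parses without error; the module's own `run` asserts the length matches the latent's level count.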
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/gan/styleganv2_mixing/util.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/gan/wav2lip/README.md",
    "content": "# wav2lip\n\n|模型名称|wav2lip|\n| :--- | :---: |\n|类别|图像 - 视频生成|\n|网络|Wav2Lip|\n|数据集|LRS2|\n|是否支持Fine-tuning|否|\n|模型大小|139MB|\n|最新更新日期|2021-12-14|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/146481773-4ec50285-3b13-4a86-84a2-b105787b63d1.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/146482210-5f309fc3-7582-452d-bcf5-f2c54b5c8dc8.gif\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出视频\n     <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - Wav2Lip实现的是视频人物根据输入音频生成与语音同步的人物唇形，使得生成的视频人物口型与输入语音同步。Wav2Lip不仅可以基于静态图像来输出与目标语音匹配的唇形同步视频，还可以直接将动态的视频进行唇形转换，输出与目标语音匹配的视频。Wav2Lip实现唇形与语音精准同步突破的关键在于，它采用了唇形同步判别器，以强制生成器持续产生准确而逼真的唇部运动。此外，它通过在鉴别器中使用多个连续帧而不是单个帧，并使用视觉质量损失（而不仅仅是对比损失）来考虑时间相关性，从而改善了视觉质量。Wav2Lip适用于任何人脸、任何语言，对任意视频都能达到很高都准确率，可以无缝地与原始视频融合，还可以用于转换动画人脸。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ffmpeg\n  - libsndfile\n- ### 2、安装\n\n  - ```shell\n    $ hub install wav2lip\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run wav2lip --face \"/PATH/TO/VIDEO or IMAGE\" --audio \"/PATH/TO/AUDIO\"\n    ```\n  - 通过命令行方式人物唇形生成模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"wav2lip\")\n    face_input_path = \"/PATH/TO/VIDEO or IMAGE\"\n    audio_input_path = \"/PATH/TO/AUDIO\"\n    module.wav2lip_transfer(face=face_input_path, audio=audio_input_path, output_dir='./transfer_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    def wav2lip_transfer(face, audio, output_dir 
='./output_result/', use_gpu=False, visualization=True):\n    ```\n    - 人脸唇形生成API。\n\n    - **参数**\n\n      - face (str): 视频或图像文件的路径<br/>\n      - audio (str): 音频文件的路径<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization(bool): 是否保存结果到本地文件夹\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install wav2lip==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/gan/wav2lip/model.py",
    "content": "from os import listdir, path, makedirs\nimport platform\nimport numpy as np\nimport scipy, cv2, os, sys, argparse\nimport json, subprocess, random, string\nfrom tqdm import tqdm\nfrom glob import glob\nimport paddle\nfrom paddle.utils.download import get_weights_path_from_url\nfrom ppgan.faceutils import face_detection\nfrom ppgan.utils import audio\nfrom ppgan.models.generators.wav2lip import Wav2Lip\n\nWAV2LIP_WEIGHT_URL = 'https://paddlegan.bj.bcebos.com/models/wav2lip_hq.pdparams'\nmel_step_size = 16\n\n\nclass Wav2LipPredictor:\n    def __init__(self,\n                 checkpoint_path=None,\n                 static=False,\n                 fps=25,\n                 pads=[0, 10, 0, 0],\n                 face_det_batch_size=16,\n                 wav2lip_batch_size=128,\n                 resize_factor=1,\n                 crop=[0, -1, 0, -1],\n                 box=[-1, -1, -1, -1],\n                 rotate=False,\n                 nosmooth=False,\n                 face_detector='sfd',\n                 face_enhancement=False):\n        self.img_size = 96\n        self.checkpoint_path = checkpoint_path\n        self.static = static\n        self.fps = fps\n        self.pads = pads\n        self.face_det_batch_size = face_det_batch_size\n        self.wav2lip_batch_size = wav2lip_batch_size\n        self.resize_factor = resize_factor\n        self.crop = crop\n        self.box = box\n        self.rotate = rotate\n        self.nosmooth = nosmooth\n        self.face_detector = face_detector\n        self.face_enhancement = face_enhancement\n        if face_enhancement:\n            from ppgan.faceutils.face_enhancement import FaceEnhancement\n            self.faceenhancer = FaceEnhancement()\n        makedirs('./temp', exist_ok=True)\n\n    def get_smoothened_boxes(self, boxes, T):\n        for i in range(len(boxes)):\n            if i + T > len(boxes):\n                window = boxes[len(boxes) - T:]\n            else:\n                window = 
boxes[i:i + T]\n            boxes[i] = np.mean(window, axis=0)\n        return boxes\n\n    def face_detect(self, images):\n        detector = face_detection.FaceAlignment(\n            face_detection.LandmarksType._2D, flip_input=False, face_detector=self.face_detector)\n\n        batch_size = self.face_det_batch_size\n\n        while 1:\n            predictions = []\n            try:\n                for i in tqdm(range(0, len(images), batch_size)):\n                    predictions.extend(detector.get_detections_for_batch(np.array(images[i:i + batch_size])))\n            except RuntimeError:\n                if batch_size == 1:\n                    raise RuntimeError(\n                        'Image too big to run face detection on GPU. Please use the --resize_factor argument')\n                batch_size //= 2\n                print('Recovering from OOM error; New batch size: {}'.format(batch_size))\n                continue\n            break\n\n        results = []\n        pady1, pady2, padx1, padx2 = self.pads\n        for rect, image in zip(predictions, images):\n            if rect is None:\n                cv2.imwrite('temp/faulty_frame.jpg', image)  # check this frame where the face was not detected.\n                raise ValueError('Face not detected! 
Ensure the video contains a face in all the frames.')\n\n            y1 = max(0, rect[1] - pady1)\n            y2 = min(image.shape[0], rect[3] + pady2)\n            x1 = max(0, rect[0] - padx1)\n            x2 = min(image.shape[1], rect[2] + padx2)\n\n            results.append([x1, y1, x2, y2])\n\n        boxes = np.array(results)\n        if not self.nosmooth: boxes = self.get_smoothened_boxes(boxes, T=5)\n        results = [[image[y1:y2, x1:x2], (y1, y2, x1, x2)] for image, (x1, y1, x2, y2) in zip(images, boxes)]\n\n        del detector\n        return results\n\n    def datagen(self, frames, mels):\n        img_batch, mel_batch, frame_batch, coords_batch = [], [], [], []\n\n        if self.box[0] == -1:\n            if not self.static:\n                face_det_results = self.face_detect(frames)  # BGR2RGB for CNN face detection\n            else:\n                face_det_results = self.face_detect([frames[0]])\n        else:\n            print('Using the specified bounding box instead of face detection...')\n            y1, y2, x1, x2 = self.box\n            face_det_results = [[f[y1:y2, x1:x2], (y1, y2, x1, x2)] for f in frames]\n\n        for i, m in enumerate(mels):\n            idx = 0 if self.static else i % len(frames)\n            frame_to_save = frames[idx].copy()\n            face, coords = face_det_results[idx].copy()\n\n            face = cv2.resize(face, (self.img_size, self.img_size))\n\n            img_batch.append(face)\n            mel_batch.append(m)\n            frame_batch.append(frame_to_save)\n            coords_batch.append(coords)\n\n            if len(img_batch) >= self.wav2lip_batch_size:\n                img_batch, mel_batch = np.asarray(img_batch), np.asarray(mel_batch)\n\n                img_masked = img_batch.copy()\n                img_masked[:, self.img_size // 2:] = 0\n\n                img_batch = np.concatenate((img_masked, img_batch), axis=3) / 255.\n                mel_batch = np.reshape(mel_batch, [len(mel_batch), 
mel_batch.shape[1], mel_batch.shape[2], 1])\n\n                yield img_batch, mel_batch, frame_batch, coords_batch\n                img_batch, mel_batch, frame_batch, coords_batch = [], [], [], []\n\n        if len(img_batch) > 0:\n            img_batch, mel_batch = np.asarray(img_batch), np.asarray(mel_batch)\n\n            img_masked = img_batch.copy()\n            img_masked[:, self.img_size // 2:] = 0\n\n            img_batch = np.concatenate((img_masked, img_batch), axis=3) / 255.\n            mel_batch = np.reshape(mel_batch, [len(mel_batch), mel_batch.shape[1], mel_batch.shape[2], 1])\n\n            yield img_batch, mel_batch, frame_batch, coords_batch\n\n    def run(self, face, audio_seq, output_dir, visualization=True):\n        # Use the last dot-separated component as the extension so paths with extra dots still work.\n        if os.path.isfile(face) and path.basename(face).split('.')[-1] in ['jpg', 'png', 'jpeg']:\n            self.static = True\n\n        if not os.path.isfile(face):\n            raise ValueError('--face argument must be a valid path to video/image file')\n\n        elif path.basename(face).split('.')[-1] in ['jpg', 'png', 'jpeg']:\n            full_frames = [cv2.imread(face)]\n            fps = self.fps\n\n        else:\n            video_stream = cv2.VideoCapture(face)\n            fps = video_stream.get(cv2.CAP_PROP_FPS)\n\n            print('Reading video frames...')\n\n            full_frames = []\n            while 1:\n                still_reading, frame = video_stream.read()\n                if not still_reading:\n                    video_stream.release()\n                    break\n                if self.resize_factor > 1:\n                    frame = cv2.resize(frame,\n                                       (frame.shape[1] // self.resize_factor, frame.shape[0] // self.resize_factor))\n\n                if self.rotate:\n                    frame = cv2.rotate(frame, cv2.ROTATE_90_CLOCKWISE)\n\n                y1, y2, x1, x2 = self.crop\n                if x2 == -1: x2 = frame.shape[1]\n                if y2 == -1: y2 = 
frame.shape[0]\n\n                frame = frame[y1:y2, x1:x2]\n\n                full_frames.append(frame)\n\n        print(\"Number of frames available for inference: \" + str(len(full_frames)))\n\n        if not audio_seq.endswith('.wav'):\n            print('Extracting raw audio...')\n            command = 'ffmpeg -y -i {} -strict -2 {}'.format(audio_seq, 'temp/temp.wav')\n\n            subprocess.call(command, shell=True)\n            audio_seq = 'temp/temp.wav'\n\n        wav = audio.load_wav(audio_seq, 16000)\n        mel = audio.melspectrogram(wav)\n        if np.isnan(mel.reshape(-1)).sum() > 0:\n            raise ValueError(\n                'Mel contains nan! Using a TTS voice? Add a small epsilon noise to the wav file and try again')\n\n        mel_chunks = []\n        mel_idx_multiplier = 80. / fps\n        i = 0\n        while 1:\n            start_idx = int(i * mel_idx_multiplier)\n            if start_idx + mel_step_size > len(mel[0]):\n                mel_chunks.append(mel[:, len(mel[0]) - mel_step_size:])\n                break\n            mel_chunks.append(mel[:, start_idx:start_idx + mel_step_size])\n            i += 1\n\n        print(\"Length of mel chunks: {}\".format(len(mel_chunks)))\n\n        full_frames = full_frames[:len(mel_chunks)]\n\n        batch_size = self.wav2lip_batch_size\n        gen = self.datagen(full_frames.copy(), mel_chunks)\n\n        model = Wav2Lip()\n        if self.checkpoint_path is None:\n            model_weights_path = get_weights_path_from_url(WAV2LIP_WEIGHT_URL)\n            weights = paddle.load(model_weights_path)\n        else:\n            weights = paddle.load(self.checkpoint_path)\n        model.load_dict(weights)\n        model.eval()\n        print(\"Model loaded\")\n        for i, (img_batch, mel_batch, frames, coords) in enumerate(\n                tqdm(gen, total=int(np.ceil(float(len(mel_chunks)) / batch_size)))):\n            if i == 0:\n\n                frame_h, frame_w = 
full_frames[0].shape[:-1]\n                out = cv2.VideoWriter('temp/result.avi', cv2.VideoWriter_fourcc(*'DIVX'), fps, (frame_w, frame_h))\n\n            img_batch = paddle.to_tensor(np.transpose(img_batch, (0, 3, 1, 2))).astype('float32')\n            mel_batch = paddle.to_tensor(np.transpose(mel_batch, (0, 3, 1, 2))).astype('float32')\n\n            with paddle.no_grad():\n                pred = model(mel_batch, img_batch)\n\n            pred = pred.numpy().transpose(0, 2, 3, 1) * 255.\n\n            for p, f, c in zip(pred, frames, coords):\n                y1, y2, x1, x2 = c\n                if self.face_enhancement:\n                    p = self.faceenhancer.enhance_from_image(p)\n                p = cv2.resize(p.astype(np.uint8), (x2 - x1, y2 - y1))\n\n                f[y1:y2, x1:x2] = p\n                out.write(f)\n\n        out.release()\n        os.makedirs(output_dir, exist_ok=True)\n        if visualization:\n            command = 'ffmpeg -y -i {} -i {} -strict -2 -q:v 1 {}'.format(audio_seq, 'temp/result.avi',\n                                                                          os.path.join(output_dir, 'result.avi'))\n            subprocess.call(command, shell=platform.system() != 'Windows')\n"
  },
  {
    "path": "modules/image/Image_gan/gan/wav2lip/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\n\nfrom .model import Wav2LipPredictor\n\n\n@moduleinfo(name=\"wav2lip\", type=\"CV/generation\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass wav2lip:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"wav2lip_hq.pdparams\")\n\n        self.network = Wav2LipPredictor(\n            checkpoint_path=self.pretrained_model,\n            static=False,\n            fps=25,\n            pads=[0, 10, 0, 0],\n            face_det_batch_size=16,\n            wav2lip_batch_size=128,\n            resize_factor=1,\n            crop=[0, -1, 0, -1],\n            box=[-1, -1, -1, -1],\n            rotate=False,\n            nosmooth=False,\n            face_detector='sfd',\n            face_enhancement=True)\n\n    def wav2lip_transfer(self, face, audio, output_dir='./output_result/', use_gpu=False, visualization=True):\n        '''\n        face (str): path to video/image that contains faces to use.\n        audio (str): path to input audio.\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        visualization: if 
True, save results in output_dir.\n        '''\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        self.network.run(face, audio, output_dir, visualization)\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        self.wav2lip_transfer(\n            face=self.args.face,\n            audio=self.args.audio,\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='output_result', help='output directory for saving result.')\n        # Use store_true: argparse's type=bool treats any non-empty string (including \"False\") as True.\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--audio', type=str, help=\"path to input audio.\")\n        
self.arg_input_group.add_argument('--face', type=str, help=\"path to video/image that contains faces to use.\")\n"
  },
  {
    "path": "modules/image/Image_gan/gan/wav2lip/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/stargan_celeba/README.md",
    "content": "# stargan_celeba\n\n|模型名称|stargan_celeba|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|StarGAN|\n|数据集|Celeba|\n|是否支持Fine-tuning|否|\n|模型大小|33MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137855887-f0abca76-2735-4275-b7ad-242decf31bb3.PNG\" width=600><br/>\n    图1. StarGAN的效果图 (属性分别为：original image, Black_Hair, Blond_Hair, Brown_Hair, Male, Aged)<br/>\n    </p>\n\n\n- ### 模型介绍\n\n  - StarGAN 是为了解决跨多个域、多个数据集的训练而提出的生成对抗网络模型。单个 StarGAN 模型就可以实现多个风格域的转换。 该 PaddleHub Module 使用 Celeba 数据集训练完成，目前支持 \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Female\", \"Male\", \"Aged\" 这六种人脸属性转换。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stargan_celeba==1.0.0\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run stargan_celeba --image \"/PATH/TO/IMAGE\" --style \"target_attribute\"\n    ```\n  - **参数**\n\n    - image ：指定图片路径。\n\n    - style ：指定拟转换的属性，可选择 \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Female\", \"Male\", \"Aged\" 中的一个。\n\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    stargan = hub.Module(name=\"stargan_celeba\")\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    trans_attr = [\"Blond_Hair\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr}\n\n    # execute predict and print the result\n    results = stargan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - 风格转换API，用于图像生成。\n\n    
- **参数**\n\n      - data： dict 类型，有以下字段\n          - image (list\\[str\\])： list中每个元素为待转换的图片路径。\n          - style (list\\[str\\])： list中每个元素为字符串，填写待转换的人脸属性。\n\n    - **返回**\n      - res (list\\[str\\]): 提示生成图片的保存路径。\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n"
  },
  {
    "path": "modules/image/Image_gan/stargan_celeba/README_en.md",
    "content": "# stargan_celeba\n\n|Module Name|stargan_celeba|\n| :--- | :---: |\n|Category|image generation|\n|Network|StarGAN|\n|Dataset|Celeba|\n|Fine-tuning supported or not|No|\n|Module Size |33MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137855887-f0abca76-2735-4275-b7ad-242decf31bb3.PNG\" width=600><br/>\n     The image attributes are: original image, Black_Hair, Blond_Hair, Brown_Hair, Male, Aged<br/>\n    </p>\n\n\n- ### Module Introduction\n\n  - StarGAN is a generative adversarial network model proposed to handle image translation across multiple domains and multiple datasets; a single StarGAN model can perform transfers among several style domains. The PaddleHub Module is trained on the Celeba dataset and currently supports the attributes \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Female\", \"Male\", \"Aged\".\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install stargan_celeba==1.0.0\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n \n\n## III. Module API Prediction\n\n- ### 1、Command line Prediction\n\n    - ```shell\n      $ hub run stargan_celeba --image \"/PATH/TO/IMAGE\" --style \"target_attribute\"\n      ```\n\n    - **Parameters**\n\n      - image: image path\n\n      - style: Specify the attributes to be converted. The options are \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Female\", \"Male\", \"Aged\". 
You can choose one of the options.\n\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    stargan = hub.Module(name=\"stargan_celeba\")\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    trans_attr = [\"Blond_Hair\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr}\n\n    # execute predict and print the result\n    results = stargan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - Style transfer API.\n\n    - **Parameter**\n\n      - data (dict): Input data with the following fields: \n          - image (list\\[str\\])： Each element in the list is the path of the image to be converted.\n          - style (list\\[str\\])： Each element in the list is a string specifying the face attribute to be converted.\n\n    - **Return**\n      - res (list\\[str\\]): Save path of the result.\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/Image_gan/stgan_celeba/README.md",
    "content": "# stgan_celeba\n\n|模型名称|stgan_celeba|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|STGAN|\n|数据集|Celeba|\n|是否支持Fine-tuning|否|\n|模型大小|287MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137856070-2a43facd-cda0-473f-8935-e61f5dd583d8.JPG\" width=1200><br/>\n    STGAN的效果图(图片属性分别为：original image, Bald, Bangs, Black_Hair, Blond_Hair, Brown_Hair, Bushy_Eyebrows, Eyeglasses, Gender, Mouth_Slightly_Open, Mustache, No_Beard, Pale_Skin, Aged)<br/>\n    </p>\n\n\n- ### 模型介绍\n\n  - STGAN 以原属性和目标属性的差值作为输入，并创造性地提出了 STUs (Selective transfer units) 来选择和修改 encoder 的特征，从而改善了转换效果和处理能力。 该 PaddleHub Module 使用 Celeba 数据集训练完成，目前支持 \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\" 这十三种人脸属性转换。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stgan_celeba==1.0.0\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run stgan_celeba --image \"/PATH/TO/IMAGE\" --info \"original_attributes\" --style \"target_attribute\" \n    ```\n  - **参数**\n\n    - image ：指定图片路径。\n\n    - info ：原图的属性，必须填写性别（ \"Male\" 或者 \"Female\"）。可选值有：\"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\" 。比如输入图片是一个女孩，有着黑头发，那么就填写为 \"Female,Black_Hair\"。建议尽可能完整地填写原图具备的属性，比如一个黑发女孩还戴了眼镜，那么应填写为 
\"Female,Black_Hair,Eyeglasses\"，否则有可能转换失败。\n    \n    - style ：指定拟转换的属性，可选择 \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\" 中的一种。\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    stgan = hub.Module(name=\"stgan_celeba\")\n\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    org_info = [\"Female,Black_Hair\"]\n    trans_attr = [\"Bangs\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr, \"info\": org_info}\n\n    # execute predict and print the result\n    results = stgan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - 风格转换API，用于图像生成。\n\n    - **参数**\n\n      - data： dict 类型，有以下字段\n          - image (list\\[str\\])： list中每个元素为待转换的图片路径。\n          - style (list\\[str\\])： list中每个元素为字符串，填写待转换的人脸属性。\n          - info (list\\[str\\])： 表示原图具备的人脸属性，填得越详细效果会越好，不同属性用逗号隔开。\n          \n\n    - **返回**\n      - res (list\\[str\\]): 提示生成图片的保存路径。\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/Image_gan/stgan_celeba/README_en.md",
    "content": "# stgan_celeba\n\n|Module Name|stgan_celeba|\n| :--- | :---: |\n|Category|image generation|\n|Network|STGAN|\n|Dataset|Celeba|\n|Fine-tuning supported or not|No|\n|Module Size |287MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/137856070-2a43facd-cda0-473f-8935-e61f5dd583d8.JPG\" width=1200><br/>\n    The image attributes are: original image, Bald, Bangs, Black_Hair, Blond_Hair, Brown_Hair, Bushy_Eyebrows, Eyeglasses, Gender, Mouth_Slightly_Open, Mustache, No_Beard, Pale_Skin, Aged<br/>\n    </p>\n\n\n- ### Module Introduction\n\n  - STGAN takes the difference between the original and target attributes as input, and proposes STUs (Selective transfer units) to select and modify features of the encoder, which improves transfer quality and capability. The PaddleHub Module is trained on the Celeba dataset and currently supports attributes of \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\".\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.5.2 \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install stgan_celeba==1.0.0\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run stgan_celeba --image \"/PATH/TO/IMAGE\" --info \"original_attributes\" --style \"target_attribute\" \n    ```\n    - **Parameters**\n\n      - image: Image path\n\n      - info: Attributes of the original image, which must include the gender (\"Male\" or \"Female\"). The available options are \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\". For example, if the input picture is a girl with black hair, fill in \"Female,Black_Hair\". It is recommended to describe the original attributes as completely as possible; for instance, a black-haired girl who also wears glasses should be described as \"Female,Black_Hair,Eyeglasses\", otherwise the conversion may fail. \n    \n      - style: Specify the attributes to be converted. The options are \"Bald\", \"Bangs\", \"Black_Hair\", \"Blond_Hair\", \"Brown_Hair\", \"Bushy_Eyebrows\", \"Eyeglasses\", \"Gender\", \"Mouth_Slightly_Open\", \"Mustache\", \"No_Beard\", \"Pale_Skin\", \"Aged\". You can choose one of the options.\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    stgan = hub.Module(name=\"stgan_celeba\")\n\n    test_img_path = [\"/PATH/TO/IMAGE\"]\n    org_info = [\"Female,Black_Hair\"]\n    trans_attr = [\"Bangs\"]\n\n    # set input dict\n    input_dict = {\"image\": test_img_path, \"style\": trans_attr, \"info\": org_info}\n\n    # execute predict and print the result\n    results = stgan.generate(data=input_dict)\n    print(results)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(data)\n    ```\n\n    - Style transfer API.\n\n    - **Parameter**\n\n      - data (dict): Input data with the following fields: \n          - image (list\\[str\\])： Each element in the list is the path of the image to be converted.\n          - style (list\\[str\\])： Each element in the list is a string specifying the face attribute to be converted.\n          - info (list\\[str\\])： The face attributes of the original image, separated by commas; the more complete the description, the better the result.\n          \n\n    - **Return**\n      - res (list\\[str\\]): Save path of the result.\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/ID_Photo_GEN/README.md",
    "content": "# ID_Photo_GEN\n\n|模型名称|ID_Photo_GEN|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|HRNet_W18|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|28KB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201224163307901.jpg\" > \n    </p>\n\n\n- ### 模型介绍\n\n  - 基于face_landmark_localization和FCN_HRNet_W18_Face_Seg模型实现的证件照生成模型，一键生成白底、红底和蓝底的人像照片\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ID_Photo_GEN\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n \n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='ID_Photo_GEN')\n\n    result = model.Photo_GEN(\n    images=[cv2.imread('/PATH/TO/IMAGE')],\n    paths=None,\n    batch_size=1,\n    output_dir='output',\n    visualization=True,\n    use_gpu=False)\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Photo_GEN(\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False,\n        use_gpu=False):\n    ```\n\n    - 证件照生成API\n\n    - **参数**\n        * images (list[np.ndarray]) : 输入图像数据列表（BGR）\n        * paths (list[str]) : 输入图像路径列表\n        * batch_size (int) : 数据批大小\n        * output_dir (str) : 可视化图像输出目录\n        * visualization (bool) : 是否可视化\n        * use_gpu (bool) : 是否使用 GPU 进行推理\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n    \n      * results (list[dict{\"write\":np.ndarray,\"blue\":np.ndarray,\"red\":np.ndarray}]): 输出图像数据列表\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/ID_Photo_GEN/README_en.md",
    "content": "# ID_Photo_GEN\n\n|Module Name |ID_Photo_GEN|\n| :--- | :---: |\n|Category|Image generation|\n|Network|HRNet_W18|\n|Dataset |-|\n|Fine-tuning supported or not |No|\n|Module Size|28KB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201224163307901.jpg\" > \n    </p>\n\n\n- ### Module Introduction\n\n  - This model is based on face_landmark_localization and FCN_HRNet_W18_Face_Seg. It can generate ID photos with white, red and blue background\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence \n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ID_Photo_GEN\n    ```\n\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n \n \n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='ID_Photo_GEN')\n\n    result = model.Photo_GEN(\n    images=[cv2.imread('/PATH/TO/IMAGE')],\n    paths=None,\n    batch_size=1,\n    output_dir='output',\n    visualization=True,\n    use_gpu=False)\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Photo_GEN(\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False,\n        use_gpu=False):\n    ```\n\n    - Prediction API, generating ID photos.\n\n    - **Parameter**\n        * images (list[np.ndarray]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list[str]): Image path\n        * batch_size (int): Batch size\n        * output_dir (str): Save path of images, output by default.\n        * visualization (bool): Whether to save the recognition results as picture files.\n        * use_gpu (bool): Use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n\n        **NOTE:** Choose one of `paths` and `images` to provide input data.\n\n    - **Return**\n    \n      * results (list[dict{\"write\":np.ndarray,\"blue\":np.ndarray,\"red\":np.ndarray}]): The list of generation results.\n\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/ID_Photo_GEN/module.py",
    "content": "import os\r\nimport cv2\r\nimport math\r\nimport paddle\r\nimport numpy as np\r\nimport paddle.nn as nn\r\nimport paddlehub as hub\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"ID_Photo_GEN\",  # 模型名称\r\n    type=\"CV\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"ID_Photo_GEN\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass ID_Photo_GEN(nn.Layer):\r\n    def __init__(self):\r\n        super(ID_Photo_GEN, self).__init__()\r\n        # 加载人脸关键点检测模型\r\n        self.face_detector = hub.Module(name=\"face_landmark_localization\")\r\n\r\n        # 加载人脸分割模型\r\n        self.seg = hub.Module(name='FCN_HRNet_W18_Face_Seg')\r\n\r\n    # 读取数据函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    def preprocess(self, images, batch_size, use_gpu):\r\n        # 获取人脸关键点\r\n        outputs = self.face_detector.keypoint_detection(images=images, batch_size=batch_size, use_gpu=use_gpu)\r\n\r\n        crops = []\r\n        for output, image in zip(outputs, images):\r\n            for landmarks in output['data']:\r\n                landmarks = np.array(landmarks)\r\n\r\n                # rotation angle\r\n                left_eye_corner = landmarks[36]\r\n                right_eye_corner = landmarks[45]\r\n                radian = np.arctan(\r\n                    (left_eye_corner[1] - right_eye_corner[1]) / (left_eye_corner[0] - right_eye_corner[0]))\r\n\r\n                # image size after rotating\r\n              
  height, width, _ = image.shape\r\n                cos = math.cos(radian)\r\n                sin = math.sin(radian)\r\n                new_w = int(width * abs(cos) + height * abs(sin))\r\n                new_h = int(width * abs(sin) + height * abs(cos))\r\n\r\n                # translation\r\n                Tx = new_w // 2 - width // 2\r\n                Ty = new_h // 2 - height // 2\r\n\r\n                # affine matrix\r\n                M = np.array([[cos, sin, (1 - cos) * width / 2. - sin * height / 2. + Tx],\r\n                              [-sin, cos, sin * width / 2. + (1 - cos) * height / 2. + Ty]])\r\n\r\n                image = cv2.warpAffine(image, M, (new_w, new_h), borderValue=(255, 255, 255))\r\n\r\n                landmarks = np.concatenate([landmarks, np.ones((landmarks.shape[0], 1))], axis=1)\r\n                landmarks = np.dot(M, landmarks.T).T\r\n                landmarks_top = np.min(landmarks[:, 1])\r\n                landmarks_bottom = np.max(landmarks[:, 1])\r\n                landmarks_left = np.min(landmarks[:, 0])\r\n                landmarks_right = np.max(landmarks[:, 0])\r\n\r\n                # expand bbox\r\n                top = int(landmarks_top - 0.8 * (landmarks_bottom - landmarks_top))\r\n                bottom = int(landmarks_bottom + 0.3 * (landmarks_bottom - landmarks_top))\r\n                left = int(landmarks_left - 0.3 * (landmarks_right - landmarks_left))\r\n                right = int(landmarks_right + 0.3 * (landmarks_right - landmarks_left))\r\n\r\n                # crop\r\n                if bottom - top > right - left:\r\n                    left -= ((bottom - top) - (right - left)) // 2\r\n                    right = left + (bottom - top)\r\n                else:\r\n                    top -= ((right - left) - (bottom - top)) // 2\r\n                    bottom = top + (right - left)\r\n\r\n                image_crop = np.ones((bottom - top + 1, right - left + 1, 3), np.uint8) * 255\r\n\r\n                h, w 
= image.shape[:2]\r\n                left_white = max(0, -left)\r\n                left = max(0, left)\r\n                right = min(right, w - 1)\r\n                right_white = left_white + (right - left)\r\n                top_white = max(0, -top)\r\n                top = max(0, top)\r\n                bottom = min(bottom, h - 1)\r\n                bottom_white = top_white + (bottom - top)\r\n\r\n                image_crop[top_white:bottom_white + 1, left_white:right_white + 1] = image[top:bottom + 1, left:right +\r\n                                                                                           1].copy()\r\n                crops.append(image_crop)\r\n\r\n        # 获取人像分割的输出\r\n        results = self.seg.Segmentation(images=crops, batch_size=batch_size)\r\n\r\n        faces = []\r\n        masks = []\r\n\r\n        for result in results:\r\n            # 提取MASK和输出图像\r\n            face = result['face']\r\n            mask = result['mask']\r\n\r\n            faces.append(face)\r\n            masks.append(mask)\r\n\r\n        return faces, masks\r\n\r\n    # 模型预测函数\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n\r\n        for data in input_datas:\r\n            # 转换数据为Tensor\r\n            data = paddle.to_tensor(data)\r\n\r\n            # 模型前向计算\r\n            cartoon = self.net(data)\r\n\r\n            outputs.append(cartoon[0].numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    # 结果后处理函数\r\n    @staticmethod\r\n    def postprocess(faces, masks, visualization, output_dir):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        results = []\r\n\r\n        for i, (face, mask) in enumerate(zip(faces, masks)):\r\n            mask = mask[..., np.newaxis] / 255\r\n            white = face * mask + (1 - mask) * 255\r\n            blue = face * mask + (1 - mask) * [255, 0, 0]\r\n            red = face * mask + (1 - mask) * [0, 0, 255]\r\n\r\n            # 可视化结果保存\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'white_%d.jpg' % i), white)\r\n                cv2.imwrite(os.path.join(output_dir, 'blue_%d.jpg' % i), blue)\r\n                cv2.imwrite(os.path.join(output_dir, 'red_%d.jpg' % i), red)\r\n\r\n            results.append({'white': white, 'blue': blue, 'red': red})\r\n\r\n        return results\r\n\r\n    def Photo_GEN(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False, use_gpu=False):\r\n\r\n        # 获取输入数据\r\n        images = self.load_datas(paths, images)\r\n\r\n        # 数据预处理\r\n        faces, masks = self.preprocess(images, batch_size, use_gpu)\r\n\r\n        # 结果后处理\r\n        results = self.postprocess(faces, masks, visualization, output_dir)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/Photo2Cartoon/README.md",
    "content": "# Photo2Cartoon\n\n|模型名称|Photo2Cartoon|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|U-GAT-IT|\n|数据集|cartoon_data|\n|是否支持Fine-tuning|否|\n|模型大小|205MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201224164040624.jpg\"   hspace='10'/> <br />\n    </p>\n\n\n\n- ### 模型介绍\n\n  - 本模型封装自[小视科技photo2cartoon项目的paddlepaddle版本](https://github.com/minivision-ai/photo2cartoon-paddle)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install Photo2Cartoon\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"Photo2Cartoon\")\n    result = model.Cartoon_GEN(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.Cartoon_GEN(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Cartoon_GEN(images=None,\n                    paths=None,\n                    batch_size=1,\n                    output_dir='output',\n                    visualization=False,\n                    use_gpu=False):\n    ```\n\n    - 人像卡通化图像生成API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 输入图像路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - batch_size (int) : batch大小；<br/>\n      - visualization (bool) : 是否将结果保存为图片文件；<br/>\n      - use_gpu (bool) : 是否使用 GPU 进行推理。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 
输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install Photo2Cartoon==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/Photo2Cartoon/README_en.md",
    "content": "# Photo2Cartoon\n\n|Module Name|Photo2Cartoon|\n| :--- | :---: |\n|Category|image generation|\n|Network|U-GAT-IT|\n|Dataset|cartoon_data|\n|Fine-tuning supported or not|No|\n|Module Size|205MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201224164040624.jpg\"   hspace='10'/> <br />\n    </p>\n\n\n\n- ### Module Introduction\n\n  - This module encapsulates project [photo2cartoon](https://github.com/minivision-ai/photo2cartoon-paddle).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0   | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install Photo2Cartoon\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"Photo2Cartoon\")\n    result = model.Cartoon_GEN(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.Cartoon_GEN(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Cartoon_GEN(images=None,\n                    paths=None,\n                    batch_size=1,\n                    output_dir='output',\n                    visualization=False,\n                    use_gpu=False):\n    ```\n\n    - Cartoon style generation API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image 
path;\n      - output_dir (str): save path of images;\n      - batch_size (int): the size of batch;\n      - visualization (bool): whether to save the results as picture files;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n\n      **NOTE:** choose either paths or images to provide the input data\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install Photo2Cartoon==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/Photo2Cartoon/model/__init__.py",
    "content": "from .networks import ResnetGenerator\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/Photo2Cartoon/model/networks.py",
    "content": "import paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass ResnetGenerator(nn.Layer):\n    def __init__(self, ngf=32, img_size=256, n_blocks=4, light=True):\n        super(ResnetGenerator, self).__init__()\n        self.light = light\n        self.n_blocks = n_blocks\n\n        DownBlock = []\n        DownBlock += [\n            nn.Pad2D([3, 3, 3, 3], 'reflect'),\n            nn.Conv2D(3, ngf, kernel_size=7, stride=1, bias_attr=False),\n            nn.InstanceNorm2D(ngf, weight_attr=False, bias_attr=False),\n            nn.ReLU()\n        ]\n\n        DownBlock += [HourGlass(ngf, ngf), HourGlass(ngf, ngf)]\n\n        # Down-Sampling\n        n_downsampling = 2\n        for i in range(n_downsampling):\n            mult = 2**i\n            DownBlock += [\n                nn.Pad2D([1, 1, 1, 1], 'reflect'),\n                nn.Conv2D(ngf * mult, ngf * mult * 2, kernel_size=3, stride=2, bias_attr=False),\n                nn.InstanceNorm2D(ngf * mult * 2, weight_attr=False, bias_attr=False),\n                nn.ReLU()\n            ]\n\n        # Encoder Bottleneck\n        mult = 2**n_downsampling\n        for i in range(n_blocks):\n            setattr(self, 'EncodeBlock' + str(i + 1), ResnetBlock(ngf * mult))\n\n        # Class Activation Map\n        self.gap_fc = nn.Linear(ngf * mult, 1, bias_attr=False)\n        self.gmp_fc = nn.Linear(ngf * mult, 1, bias_attr=False)\n        self.conv1x1 = nn.Conv2D(ngf * mult * 2, ngf * mult, kernel_size=1, stride=1)\n        self.relu = nn.ReLU()\n\n        # Gamma, Beta block\n        FC = []\n        if self.light:\n            FC += [\n                nn.Linear(ngf * mult, ngf * mult, bias_attr=False),\n                nn.ReLU(),\n                nn.Linear(ngf * mult, ngf * mult, bias_attr=False),\n                nn.ReLU()\n            ]\n\n        else:\n            FC += [\n                nn.Linear(img_size // mult * img_size // mult * ngf * mult, ngf * mult, bias_attr=False),\n       
         nn.ReLU(),\n                nn.Linear(ngf * mult, ngf * mult, bias_attr=False),\n                nn.ReLU()\n            ]\n\n        # Decoder Bottleneck\n        mult = 2**n_downsampling\n        for i in range(n_blocks):\n            setattr(self, 'DecodeBlock' + str(i + 1), ResnetSoftAdaLINBlock(ngf * mult))\n\n        # Up-Sampling\n        UpBlock = []\n        for i in range(n_downsampling):\n            mult = 2**(n_downsampling - i)\n            UpBlock += [\n                nn.Upsample(scale_factor=2),\n                nn.Pad2D([1, 1, 1, 1], 'reflect'),\n                nn.Conv2D(ngf * mult, ngf * mult // 2, kernel_size=3, stride=1, bias_attr=False),\n                LIN(ngf * mult // 2),\n                nn.ReLU()\n            ]\n\n        UpBlock += [HourGlass(ngf, ngf), HourGlass(ngf, ngf, False)]\n\n        UpBlock += [\n            nn.Pad2D([3, 3, 3, 3], 'reflect'),\n            nn.Conv2D(3, 3, kernel_size=7, stride=1, bias_attr=False),\n            nn.Tanh()\n        ]\n\n        self.DownBlock = nn.Sequential(*DownBlock)\n        self.FC = nn.Sequential(*FC)\n        self.UpBlock = nn.Sequential(*UpBlock)\n\n    def forward(self, x):\n        bs = x.shape[0]\n\n        x = self.DownBlock(x)\n\n        content_features = []\n        for i in range(self.n_blocks):\n            x = getattr(self, 'EncodeBlock' + str(i + 1))(x)\n            content_features.append(F.adaptive_avg_pool2d(x, 1).reshape([bs, -1]))\n\n        gap = F.adaptive_avg_pool2d(x, 1)\n        gap_logit = self.gap_fc(gap.reshape([bs, -1]))\n        gap_weight = list(self.gap_fc.parameters())[0].transpose([1, 0])\n        gap = x * gap_weight.unsqueeze(2).unsqueeze(3)\n\n        gmp = F.adaptive_max_pool2d(x, 1)\n        gmp_logit = self.gmp_fc(gmp.reshape([bs, -1]))\n        gmp_weight = list(self.gmp_fc.parameters())[0].transpose([1, 0])\n        gmp = x * gmp_weight.unsqueeze(2).unsqueeze(3)\n\n        cam_logit = paddle.concat([gap_logit, gmp_logit], 1)\n        x = 
paddle.concat([gap, gmp], 1)\n        x = self.relu(self.conv1x1(x))\n\n        heatmap = paddle.sum(x, axis=1, keepdim=True)\n\n        if self.light:\n            x_ = F.adaptive_avg_pool2d(x, 1)\n            style_features = self.FC(x_.reshape([bs, -1]))\n        else:\n            style_features = self.FC(x.reshape([bs, -1]))\n\n        for i in range(self.n_blocks):\n            x = getattr(self, 'DecodeBlock' + str(i + 1))(x, content_features[self.n_blocks - i - 1], style_features)\n\n        out = self.UpBlock(x)\n\n        return out, cam_logit, heatmap\n\n\nclass ConvBlock(nn.Layer):\n    def __init__(self, dim_in, dim_out):\n        super(ConvBlock, self).__init__()\n        self.dim_in = dim_in\n        self.dim_out = dim_out\n\n        self.conv_block1 = self.__convblock(dim_in, dim_out // 2)\n        self.conv_block2 = self.__convblock(dim_out // 2, dim_out // 4)\n        self.conv_block3 = self.__convblock(dim_out // 4, dim_out // 4)\n\n        if self.dim_in != self.dim_out:\n            self.conv_skip = nn.Sequential(\n                nn.InstanceNorm2D(dim_in, weight_attr=False, bias_attr=False), nn.ReLU(),\n                nn.Conv2D(dim_in, dim_out, kernel_size=1, stride=1, bias_attr=False))\n\n    @staticmethod\n    def __convblock(dim_in, dim_out):\n        return nn.Sequential(\n            nn.InstanceNorm2D(dim_in, weight_attr=False, bias_attr=False), nn.ReLU(), nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(dim_in, dim_out, kernel_size=3, stride=1, bias_attr=False))\n\n    def forward(self, x):\n        residual = x\n\n        x1 = self.conv_block1(x)\n        x2 = self.conv_block2(x1)\n        x3 = self.conv_block3(x2)\n        out = paddle.concat([x1, x2, x3], 1)\n\n        if self.dim_in != self.dim_out:\n            residual = self.conv_skip(residual)\n\n        return residual + out\n\n\nclass HourGlassBlock(nn.Layer):\n    def __init__(self, dim_in):\n        super(HourGlassBlock, self).__init__()\n\n        self.n_skip = 4\n        self.n_block = 9\n\n        for i in range(self.n_skip):\n            setattr(self, 'ConvBlockskip' + str(i + 1), ConvBlock(dim_in, dim_in))\n\n        for i in range(self.n_block):\n            setattr(self, 'ConvBlock' + str(i + 1), ConvBlock(dim_in, dim_in))\n\n    def forward(self, x):\n        skips = []\n        for i in range(self.n_skip):\n            skips.append(getattr(self, 'ConvBlockskip' + str(i + 1))(x))\n            x = F.avg_pool2d(x, 2)\n            x = getattr(self, 'ConvBlock' + str(i + 1))(x)\n\n        x = self.ConvBlock5(x)\n\n        for i in range(self.n_skip):\n            x = getattr(self, 'ConvBlock' + str(i + 6))(x)\n            x = F.upsample(x, scale_factor=2)\n            x = skips[self.n_skip - i - 1] + x\n\n        return x\n\n\nclass HourGlass(nn.Layer):\n    def __init__(self, dim_in, dim_out, use_res=True):\n        super(HourGlass, self).__init__()\n        self.use_res = use_res\n\n        self.HG = nn.Sequential(\n            HourGlassBlock(dim_in), ConvBlock(dim_out, dim_out),\n            nn.Conv2D(dim_out, dim_out, kernel_size=1, stride=1, bias_attr=False),\n            nn.InstanceNorm2D(dim_out, weight_attr=False, bias_attr=False), nn.ReLU())\n\n        self.Conv1 = nn.Conv2D(dim_out, 3, kernel_size=1, stride=1)\n\n        if self.use_res:\n            self.Conv2 = nn.Conv2D(dim_out, dim_out, kernel_size=1, stride=1)\n            self.Conv3 = nn.Conv2D(3, dim_out, kernel_size=1, stride=1)\n\n    def forward(self, x):\n        ll = self.HG(x)\n        tmp_out = self.Conv1(ll)\n\n        if self.use_res:\n            ll = self.Conv2(ll)\n            tmp_out_ = self.Conv3(tmp_out)\n            return x + ll + tmp_out_\n\n        else:\n            return tmp_out\n\n\nclass ResnetBlock(nn.Layer):\n    def __init__(self, dim, use_bias=False):\n        super(ResnetBlock, self).__init__()\n        conv_block = []\n        conv_block += [\n            nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(dim, dim, 
kernel_size=3, stride=1, bias_attr=use_bias),\n            nn.InstanceNorm2D(dim, weight_attr=False, bias_attr=False),\n            nn.ReLU()\n        ]\n\n        conv_block += [\n            nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(dim, dim, kernel_size=3, stride=1, bias_attr=use_bias),\n            nn.InstanceNorm2D(dim, weight_attr=False, bias_attr=False)\n        ]\n\n        self.conv_block = nn.Sequential(*conv_block)\n\n    def forward(self, x):\n        out = x + self.conv_block(x)\n        return out\n\n\nclass ResnetSoftAdaLINBlock(nn.Layer):\n    def __init__(self, dim, use_bias=False):\n        super(ResnetSoftAdaLINBlock, self).__init__()\n        self.pad1 = nn.Pad2D([1, 1, 1, 1], 'reflect')\n        self.conv1 = nn.Conv2D(dim, dim, kernel_size=3, stride=1, bias_attr=use_bias)\n        self.norm1 = SoftAdaLIN(dim)\n        self.relu1 = nn.ReLU()\n\n        self.pad2 = nn.Pad2D([1, 1, 1, 1], 'reflect')\n        self.conv2 = nn.Conv2D(dim, dim, kernel_size=3, stride=1, bias_attr=use_bias)\n        self.norm2 = SoftAdaLIN(dim)\n\n    def forward(self, x, content_features, style_features):\n        out = self.pad1(x)\n        out = self.conv1(out)\n        out = self.norm1(out, content_features, style_features)\n        out = self.relu1(out)\n\n        out = self.pad2(out)\n        out = self.conv2(out)\n        out = self.norm2(out, content_features, style_features)\n        return out + x\n\n\nclass SoftAdaLIN(nn.Layer):\n    def __init__(self, num_features, eps=1e-5):\n        super(SoftAdaLIN, self).__init__()\n        self.norm = AdaLIN(num_features, eps)\n\n        self.w_gamma = self.create_parameter([1, num_features], default_initializer=nn.initializer.Constant(0.))\n        self.w_beta = self.create_parameter([1, num_features], default_initializer=nn.initializer.Constant(0.))\n\n        self.c_gamma = nn.Sequential(\n            nn.Linear(num_features, num_features, bias_attr=False), nn.ReLU(),\n            
nn.Linear(num_features, num_features, bias_attr=False))\n        self.c_beta = nn.Sequential(\n            nn.Linear(num_features, num_features, bias_attr=False), nn.ReLU(),\n            nn.Linear(num_features, num_features, bias_attr=False))\n        self.s_gamma = nn.Linear(num_features, num_features, bias_attr=False)\n        self.s_beta = nn.Linear(num_features, num_features, bias_attr=False)\n\n    def forward(self, x, content_features, style_features):\n        content_gamma, content_beta = self.c_gamma(content_features), self.c_beta(content_features)\n        style_gamma, style_beta = self.s_gamma(style_features), self.s_beta(style_features)\n\n        # w_gamma_ = nn.clip(self.w_gamma, 0, 1)\n        # w_beta_ = nn.clip(self.w_beta, 0, 1)\n\n        w_gamma_, w_beta_ = self.w_gamma.expand([x.shape[0], -1]), self.w_beta.expand([x.shape[0], -1])\n        soft_gamma = (1. - w_gamma_) * style_gamma + w_gamma_ * content_gamma\n        soft_beta = (1. - w_beta_) * style_beta + w_beta_ * content_beta\n\n        out = self.norm(x, soft_gamma, soft_beta)\n        return out\n\n\nclass AdaLIN(nn.Layer):\n    def __init__(self, num_features, eps=1e-5):\n        super(AdaLIN, self).__init__()\n        self.eps = eps\n        self.rho = self.create_parameter([1, num_features, 1, 1], default_initializer=nn.initializer.Constant(0.9))\n\n    def forward(self, x, gamma, beta):\n        in_mean, in_var = paddle.mean(x, axis=[2, 3], keepdim=True), paddle.var(x, axis=[2, 3], keepdim=True)\n        out_in = (x - in_mean) / paddle.sqrt(in_var + self.eps)\n        ln_mean, ln_var = paddle.mean(x, axis=[1, 2, 3], keepdim=True), paddle.var(x, axis=[1, 2, 3], keepdim=True)\n        out_ln = (x - ln_mean) / paddle.sqrt(ln_var + self.eps)\n        out = self.rho.expand([x.shape[0], -1, -1, -1]) * out_in + \\\n              (1-self.rho.expand([x.shape[0], -1, -1, -1])) * out_ln\n        out = out * gamma.unsqueeze(2).unsqueeze(3) + beta.unsqueeze(2).unsqueeze(3)\n\n        return 
out\n\n\nclass LIN(nn.Layer):\n    def __init__(self, num_features, eps=1e-5):\n        super(LIN, self).__init__()\n        self.eps = eps\n        self.rho = self.create_parameter([1, num_features, 1, 1], default_initializer=nn.initializer.Constant(0.))\n        self.gamma = self.create_parameter([1, num_features, 1, 1], default_initializer=nn.initializer.Constant(1.))\n        self.beta = self.create_parameter([1, num_features, 1, 1], default_initializer=nn.initializer.Constant(0.))\n\n    def forward(self, x):\n        in_mean, in_var = paddle.mean(x, axis=[2, 3], keepdim=True), paddle.var(x, axis=[2, 3], keepdim=True)\n        out_in = (x - in_mean) / paddle.sqrt(in_var + self.eps)\n        ln_mean, ln_var = paddle.mean(x, axis=[1, 2, 3], keepdim=True), paddle.var(x, axis=[1, 2, 3], keepdim=True)\n        out_ln = (x - ln_mean) / paddle.sqrt(ln_var + self.eps)\n        out = self.rho.expand([x.shape[0], -1, -1, -1]) * out_in + \\\n              (1-self.rho.expand([x.shape[0], -1, -1, -1])) * out_ln\n        out = out * self.gamma.expand([x.shape[0], -1, -1, -1]) + self.beta.expand([x.shape[0], -1, -1, -1])\n\n        return out\n\n\nif __name__ == '__main__':\n    #d = Discriminator(3)\n    # paddle.summary(d, (4, 3, 256, 256))\n    #out, cam_logit, heatmap = d(paddle.ones([4, 3, 256, 256]))\n    #print(out.shape, cam_logit.shape, heatmap.shape)\n\n    g = ResnetGenerator(ngf=32, img_size=256, light=True)\n    out, cam_logit, heatmap = g(paddle.ones([4, 3, 256, 256]))\n    print(out.shape, cam_logit.shape, heatmap.shape)\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/Photo2Cartoon/module.py",
    "content": "import os\r\nimport cv2\r\nimport math\r\nimport paddle\r\nimport numpy as np\r\nimport paddle.nn as nn\r\nimport paddlehub as hub\r\nfrom Photo2Cartoon.model import ResnetGenerator\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"Photo2Cartoon\",  # 模型名称\r\n    type=\"CV\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"Photo2Cartoon\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass Photo2Cartoon(nn.Layer):\r\n    def __init__(self):\r\n        super(Photo2Cartoon, self).__init__()\r\n        # 加载人脸关键点检测模型\r\n        self.face_detector = hub.Module(name=\"face_landmark_localization\")\r\n\r\n        # 加载人脸分割模型\r\n        self.seg = hub.Module(name='FCN_HRNet_W18_Face_Seg')\r\n\r\n        # 加载人脸动漫化模型\r\n        self.net = ResnetGenerator(ngf=32, img_size=256, light=True)\r\n\r\n        # 加载人脸动漫化模型参数\r\n        state_dict = paddle.load(os.path.join(self.directory, 'photo2cartoon_weights.pdparams'))\r\n        self.net.set_state_dict(state_dict['genA2B'])\r\n\r\n        # 将人脸动漫化模型设为评估模式\r\n        self.net.eval()\r\n\r\n    # 读取数据函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    def preprocess(self, images, batch_size, use_gpu):\r\n        # 获取人脸关键点\r\n        outputs = self.face_detector.keypoint_detection(images=images, batch_size=batch_size, use_gpu=use_gpu)\r\n\r\n        crops = []\r\n        for output, image in zip(outputs, images):\r\n            for landmarks in output['data']:\r\n           
     landmarks = np.array(landmarks)\r\n\r\n                # rotation angle\r\n                left_eye_corner = landmarks[36]\r\n                right_eye_corner = landmarks[45]\r\n                radian = np.arctan(\r\n                    (left_eye_corner[1] - right_eye_corner[1]) / (left_eye_corner[0] - right_eye_corner[0]))\r\n\r\n                # image size after rotating\r\n                height, width, _ = image.shape\r\n                cos = math.cos(radian)\r\n                sin = math.sin(radian)\r\n                new_w = int(width * abs(cos) + height * abs(sin))\r\n                new_h = int(width * abs(sin) + height * abs(cos))\r\n\r\n                # translation\r\n                Tx = new_w // 2 - width // 2\r\n                Ty = new_h // 2 - height // 2\r\n\r\n                # affine matrix\r\n                M = np.array([[cos, sin, (1 - cos) * width / 2. - sin * height / 2. + Tx],\r\n                              [-sin, cos, sin * width / 2. + (1 - cos) * height / 2. 
+ Ty]])\r\n\r\n                image = cv2.warpAffine(image, M, (new_w, new_h), borderValue=(255, 255, 255))\r\n\r\n                landmarks = np.concatenate([landmarks, np.ones((landmarks.shape[0], 1))], axis=1)\r\n                landmarks = np.dot(M, landmarks.T).T\r\n                landmarks_top = np.min(landmarks[:, 1])\r\n                landmarks_bottom = np.max(landmarks[:, 1])\r\n                landmarks_left = np.min(landmarks[:, 0])\r\n                landmarks_right = np.max(landmarks[:, 0])\r\n\r\n                # expand bbox\r\n                top = int(landmarks_top - 0.8 * (landmarks_bottom - landmarks_top))\r\n                bottom = int(landmarks_bottom + 0.3 * (landmarks_bottom - landmarks_top))\r\n                left = int(landmarks_left - 0.3 * (landmarks_right - landmarks_left))\r\n                right = int(landmarks_right + 0.3 * (landmarks_right - landmarks_left))\r\n\r\n                # crop\r\n                if bottom - top > right - left:\r\n                    left -= ((bottom - top) - (right - left)) // 2\r\n                    right = left + (bottom - top)\r\n                else:\r\n                    top -= ((right - left) - (bottom - top)) // 2\r\n                    bottom = top + (right - left)\r\n\r\n                image_crop = np.ones((bottom - top + 1, right - left + 1, 3), np.uint8) * 255\r\n\r\n                h, w = image.shape[:2]\r\n                left_white = max(0, -left)\r\n                left = max(0, left)\r\n                right = min(right, w - 1)\r\n                right_white = left_white + (right - left)\r\n                top_white = max(0, -top)\r\n                top = max(0, top)\r\n                bottom = min(bottom, h - 1)\r\n                bottom_white = top_white + (bottom - top)\r\n\r\n                image_crop[top_white:bottom_white + 1, left_white:right_white + 1] = image[top:bottom + 1, left:right +\r\n                                                                                   
        1].copy()\r\n                crops.append(image_crop)\r\n\r\n        # 获取人像分割的输出\r\n        results = self.seg.Segmentation(images=crops, batch_size=batch_size)\r\n\r\n        faces = []\r\n        masks = []\r\n\r\n        for result in results:\r\n            # 提取MASK和输出图像\r\n            face = result['face']\r\n            mask = result['mask']\r\n\r\n            # 图像格式转换\r\n            face = cv2.cvtColor(face, cv2.COLOR_BGR2RGB)\r\n\r\n            # 图像拼接\r\n            face_rgba = np.dstack((face, mask))\r\n\r\n            # 图像缩放\r\n            face_rgba = cv2.resize(face_rgba, (256, 256), interpolation=cv2.INTER_AREA)\r\n\r\n            # 拆分图像\r\n            face = face_rgba[:, :, :3].copy()\r\n            mask = face_rgba[:, :, 3][:, :, np.newaxis].copy() / 255.\r\n\r\n            # 数据格式转换\r\n            face = np.transpose(face[np.newaxis, :, :, :], (0, 3, 1, 2)).astype(np.float32)\r\n\r\n            faces.append(face)\r\n            masks.append(mask)\r\n\r\n        input_datas = np.concatenate(faces, 0)\r\n\r\n        # 切分数据\r\n        datas_num = input_datas.shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        return input_datas, masks\r\n\r\n    # 模型预测函数\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n\r\n        for data in input_datas:\r\n            # 转换数据为Tensor\r\n            data = paddle.to_tensor(data)\r\n\r\n            # 模型前向计算\r\n            cartoon = self.net(data)\r\n\r\n            outputs.append(cartoon[0].numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    # 结果后处理函数\r\n    @staticmethod\r\n    def postprocess(outputs, masks, visualization, output_dir):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        cartoons = 
[]\r\n\r\n        for cartoon, mask, i in zip(outputs, masks, range(len(masks))):\r\n            # 格式转换\r\n            cartoon = np.transpose(cartoon, (1, 2, 0))\r\n            cartoon = (cartoon + 1) * 127.5\r\n\r\n            # 计算输出图像\r\n            cartoon = (cartoon * mask + 255 * (1 - mask)).astype(np.uint8)\r\n            cartoon = cv2.cvtColor(cartoon, cv2.COLOR_RGB2BGR)\r\n\r\n            # 可视化结果保存\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), cartoon)\r\n\r\n            cartoons.append(cartoon)\r\n\r\n        return cartoons\r\n\r\n    def Cartoon_GEN(self,\r\n                    images=None,\r\n                    paths=None,\r\n                    batch_size=1,\r\n                    output_dir='output',\r\n                    visualization=False,\r\n                    use_gpu=False):\r\n\r\n        # 获取输入数据\r\n        images = self.load_datas(paths, images)\r\n\r\n        # 数据预处理\r\n        input_datas, masks = self.preprocess(images, batch_size, use_gpu)\r\n\r\n        # 模型预测\r\n        outputs = self.predict(input_datas)\r\n\r\n        # 结果后处理\r\n        cartoons = self.postprocess(outputs, masks, visualization, output_dir)\r\n\r\n        return cartoons\r\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/U2Net_Portrait/README.md",
    "content": "# U2Net_Portrait\n\n|模型名称|U2Net_Portrait|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|U^2Net|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|254MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/07f73466f3294373965e06c141c4781992f447104a94471dadfabc1c3d920861\"  height='50%' hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c6ab02cf27414a5ba5921d9e6b079b487f6cda6026dc4d6dbca8f0167ad7cae3\"   height='50%' hspace='10'/>\n    <br />\n    输出图像\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - U2Net_Portrait 可以用于提取人脸的素描结果。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install U2Net_Portrait\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"U2Net_Portrait\")\n    result = model.Portrait_GEN(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.Portrait_GEN(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Portrait_GEN(images=None,\n                    paths=None,\n                    scale=1,\n                    batch_size=1,\n                    output_dir='output',\n                    face_detection=True,\n                    visualization=False):\n    ```\n\n    - 人脸画像生成API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 输入图像路径；<br/>\n      - scale (float) : 缩放因子（与face_detection相关联)；<br/>\n      - 
batch_size (int) : batch大小；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - face_detection (bool) : 是否使用人脸检测；<br/>\n      - visualization (bool) : 是否将结果保存为图片文件；<br/>\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install U2Net_Portrait==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/U2Net_Portrait/README_en.md",
    "content": "# U2Net_Portrait\n\n|Module Name|U2Net_Portrait|\n| :--- | :---: |\n|Category|image generation|\n|Network|U^2Net|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|254MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/07f73466f3294373965e06c141c4781992f447104a94471dadfabc1c3d920861\"  height='50%' hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/c6ab02cf27414a5ba5921d9e6b079b487f6cda6026dc4d6dbca8f0167ad7cae3\"   height='50%' hspace='10'/>\n    <br />\n    Output image\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - U2Net_Portrait can be used to create a face portrait.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install U2Net_Portrait\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"U2Net_Portrait\")\n    result = model.Portrait_GEN(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.Portrait_GEN(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Portrait_GEN(images=None,\n                    paths=None,\n                    scale=1,\n                    batch_size=1,\n                    output_dir='output',\n             
       face_detection=True,\n                    visualization=False):\n    ```\n\n    - Portrait generation API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format \\[H, W, C\\], BGR;\n      - paths (list\\[str\\]): image paths;\n      - scale (float): scale factor for cropping the detected face region; larger values keep more context around the face;\n      - batch_size (int): the size of batch;\n      - output_dir (str): save path of images;\n      - face_detection (bool): whether to detect the face and crop around it before generation;\n      - visualization (bool): whether to save the results as picture files;\n\n      **NOTE:** Provide the input data through either paths or images, not both.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install U2Net_Portrait==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/U2Net_Portrait/module.py",
    "content": "import os\r\nimport paddle\r\nimport paddle.nn as nn\r\nimport numpy as np\r\nfrom U2Net_Portrait.u2net import U2NET\r\nfrom U2Net_Portrait.processor import Processor\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"U2Net_Portrait\",  # module name\r\n    type=\"CV\",  # module type\r\n    author=\"jm12138\",  # author name\r\n    author_email=\"jm12138@qq.com\",  # author email\r\n    summary=\"U2Net_Portrait\",  # module summary\r\n    version=\"1.0.0\"  # version number\r\n)\r\nclass U2Net_Portrait(nn.Layer):\r\n    def __init__(self):\r\n        super(U2Net_Portrait, self).__init__()\r\n        self.model = U2NET(3, 1)\r\n        state_dict = paddle.load(os.path.join(self.directory, 'u2net_portrait.pdparams'))\r\n        self.model.set_dict(state_dict)\r\n        self.model.eval()\r\n\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n        for data in input_datas:\r\n            data = paddle.to_tensor(data, 'float32')\r\n            # the network returns seven side outputs; only the fused first output is kept\r\n            d1, d2, d3, d4, d5, d6, d7 = self.model(data)\r\n            outputs.append(d1.numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    def Portrait_GEN(self,\r\n                     images=None,\r\n                     paths=None,\r\n                     scale=1,\r\n                     batch_size=1,\r\n                     output_dir='output',\r\n                     face_detection=True,\r\n                     visualization=False):\r\n\r\n        # initialize the data processor\r\n        processor = Processor(paths, images, batch_size, face_detection, scale)\r\n\r\n        # model prediction\r\n        outputs = self.predict(processor.input_datas)\r\n\r\n        # post-process the prediction results\r\n        results = processor.postprocess(outputs, visualization=visualization, output_dir=output_dir)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/U2Net_Portrait/processor.py",
    "content": "import os\r\nimport cv2\r\nimport numpy as np\r\nimport paddlehub as hub\r\n\r\n__all__ = ['Processor']\r\n\r\n\r\nclass Processor():\r\n    def __init__(self, paths, images, batch_size, face_detection=True, scale=1):\r\n        # image list\r\n        self.imgs = self.load_datas(paths, images)\r\n\r\n        # preprocessed input batches\r\n        self.input_datas = self.preprocess(self.imgs, batch_size, face_detection, scale)\r\n\r\n    # load the input images\r\n    def load_datas(self, paths, images):\r\n        datas = []\r\n\r\n        # read images from paths\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # return the image list\r\n        return datas\r\n\r\n    # preprocessing\r\n    def preprocess(self, imgs, batch_size=1, face_detection=True, scale=1):\r\n        if face_detection:\r\n            # face detection\r\n            face_detector = hub.Module(name=\"pyramidbox_lite_mobile\")\r\n            results = face_detector.face_detection(images=imgs, use_gpu=False, visualization=False, confs_threshold=0.5)\r\n            im_faces = []\r\n            for datas, img in zip(results, imgs):\r\n                for face in datas['data']:\r\n                    # get detection result\r\n                    l, r, t, b = [face['left'], face['right'], face['top'], face['bottom']]\r\n\r\n                    # square crop around the face center, scaled by 'scale'\r\n                    pad = max(int(scale * (r - l)), int(scale * (b - t)))\r\n                    c_w, c_h = (r - l) // 2 + l, (b - t) // 2 + t\r\n                    top = 0 if c_h - pad < 0 else c_h - pad\r\n                    bottom = pad + c_h\r\n                    left = 0 if c_w - pad < 0 else c_w - pad\r\n                    right = pad + c_w\r\n                    crop = img[top:bottom, left:right]\r\n\r\n                    # resize\r\n                    im_face = cv2.resize(crop, (512, 512), interpolation=cv2.INTER_AREA)\r\n                    im_faces.append(im_face)\r\n        else:\r\n            im_faces = []\r\n            for img in imgs:\r\n                # pad the image to a square with white borders\r\n                h, w = img.shape[:2]\r\n                if h > w:\r\n                    if (h - w) % 2 == 0:\r\n                        img = np.pad(\r\n                            img, ((0, 0), ((h - w) // 2, (h - w) // 2), (0, 0)),\r\n                            mode='constant',\r\n                            constant_values=((255, 255), (255, 255), (255, 255)))\r\n                    else:\r\n                        img = np.pad(\r\n                            img, ((0, 0), ((h - w) // 2, (h - w) // 2 + 1), (0, 0)),\r\n                            mode='constant',\r\n                            constant_values=((255, 255), (255, 255), (255, 255)))\r\n                else:\r\n                    if (w - h) % 2 == 0:\r\n                        img = np.pad(\r\n                            img, (((w - h) // 2, (w - h) // 2), (0, 0), (0, 0)),\r\n                            mode='constant',\r\n                            constant_values=((255, 255), (255, 255), (255, 255)))\r\n                    else:\r\n                        img = np.pad(\r\n                            img, (((w - h) // 2, (w - h) // 2 + 1), (0, 0), (0, 0)),\r\n                            mode='constant',\r\n                            constant_values=((255, 255), (255, 255), (255, 255)))\r\n                im_face = cv2.resize(img, (512, 512), interpolation=cv2.INTER_AREA)\r\n                im_faces.append(im_face)\r\n\r\n        input_datas = []\r\n        for im_face in im_faces:\r\n            tmpImg = np.zeros((im_face.shape[0], im_face.shape[1], 3))\r\n            im_face = im_face / np.max(im_face)\r\n\r\n            # per-channel normalization; BGR is converted to RGB here\r\n            tmpImg[:, :, 0] = (im_face[:, :, 2] - 0.406) / 0.225\r\n            tmpImg[:, :, 1] = (im_face[:, :, 1] - 0.456) / 0.224\r\n            tmpImg[:, :, 2] = (im_face[:, :, 0] - 0.485) / 0.229\r\n\r\n            # HWC to CHW, then add a batch dimension\r\n            tmpImg = tmpImg.transpose((2, 0, 1))\r\n            tmpImg = tmpImg[np.newaxis, :, :, :]\r\n            input_datas.append(tmpImg)\r\n\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        datas_num = input_datas.shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        return input_datas\r\n\r\n    def normPRED(self, d):\r\n        ma = np.max(d)\r\n        mi = np.min(d)\r\n\r\n        dn = (d - mi) / (ma - mi)\r\n\r\n        return dn\r\n\r\n    # postprocessing\r\n    def postprocess(self, outputs, visualization=False, output_dir='output'):\r\n        results = []\r\n        if visualization and not os.path.exists(output_dir):\r\n            os.mkdir(output_dir)\r\n\r\n        for i in range(outputs.shape[0]):\r\n            # normalization\r\n            pred = 1.0 - outputs[i, 0, :, :]\r\n\r\n            pred = self.normPRED(pred)\r\n\r\n            # scale to an 8-bit grayscale image\r\n            pred = pred.squeeze()\r\n            pred = (pred * 255).astype(np.uint8)\r\n\r\n            results.append(pred)\r\n\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), pred)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/U2Net_Portrait/u2net.py",
    "content": "import paddle\r\nimport paddle.nn as nn\r\nimport paddle.nn.functional as F\r\n\r\n__all__ = ['U2NETP', 'U2NET']\r\n\r\n\r\nclass REBNCONV(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=3, dirate=1):\r\n        super(REBNCONV, self).__init__()\r\n\r\n        self.conv_s1 = nn.Conv2D(in_ch, out_ch, 3, padding=1 * dirate, dilation=1 * dirate)\r\n        self.bn_s1 = nn.BatchNorm2D(out_ch)\r\n        self.relu_s1 = nn.ReLU()\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        xout = self.relu_s1(self.bn_s1(self.conv_s1(hx)))\r\n\r\n        return xout\r\n\r\n\r\n## upsample tensor 'src' to have the same spatial size with tensor 'tar'\r\ndef _upsample_like(src, tar):\r\n\r\n    src = F.upsample(src, size=tar.shape[2:], mode='bilinear')\r\n\r\n    return src\r\n\r\n\r\n### RSU-7 ###\r\nclass RSU7(nn.Layer):  #UNet07DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU7, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool5 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv7 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv6d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        
self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n        hx = self.pool5(hx5)\r\n\r\n        hx6 = self.rebnconv6(hx)\r\n\r\n        hx7 = self.rebnconv7(hx6)\r\n\r\n        hx6d = self.rebnconv6d(paddle.concat((hx7, hx6), 1))\r\n        hx6dup = _upsample_like(hx6d, hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6dup, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-6 ###\r\nclass RSU6(nn.Layer):  #UNet06DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU6, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n     
   self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n\r\n        hx6 = self.rebnconv6(hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-5 ###\r\nclass RSU5(nn.Layer):  #UNet05DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU5, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, 
dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n\r\n        hx5 = self.rebnconv5(hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4 ###\r\nclass RSU4(nn.Layer):  #UNet04DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4F ###\r\nclass RSU4F(nn.Layer):  #UNet04FRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4F, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=4)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=8)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=4)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=2)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        
hx1 = self.rebnconv1(hxin)\r\n        hx2 = self.rebnconv2(hx1)\r\n        hx3 = self.rebnconv3(hx2)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3d, hx2), 1))\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2d, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n##### U^2-Net ####\r\nclass U2NET(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NET, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 32, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 32, 128)\r\n        self.pool23 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(128, 64, 256)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(256, 128, 512)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(512, 256, 512)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(512, 256, 512)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(1024, 256, 512)\r\n        self.stage4d = RSU4(1024, 128, 256)\r\n        self.stage3d = RSU5(512, 64, 128)\r\n        self.stage2d = RSU6(256, 32, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(128, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(256, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        
hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #-------------------- decoder --------------------\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n\r\n\r\n### U^2-Net small ###\r\nclass U2NETP(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NETP, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 16, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 16, 64)\r\n        self.pool23 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(64, 16, 64)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(64, 16, 64)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(64, 16, 64)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(64, 16, 64)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(128, 16, 64)\r\n        self.stage4d = RSU4(128, 16, 64)\r\n        self.stage3d = RSU5(128, 16, 64)\r\n        self.stage2d = RSU6(128, 16, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #decoder\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = 
self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_100w/README.md",
    "content": "# UGATIT_100w\n\n|模型名称|UGATIT_100w|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|U-GAT-IT|\n|数据集|selfie2anime|\n|是否支持Fine-tuning|否|\n|模型大小|41MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/d130fabd8bd34e53b2f942b3766eb6bbd3c19c0676d04abfbd5cc4b83b66f8b6\"  height='80%' hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/8538af03b3f14b1884fcf4eec48965baf939e35a783d40129085102057438c77\"   height='80%' hspace='10'/>\n    <br />\n    输出图像\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - UGATIT图像风格转换模型, 模型可将输入的人脸图像转换成动漫风格, 模型详情请参考[UGATIT-Paddle开源项目](https://github.com/miraiwk/UGATIT-paddle)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install UGATIT_100w\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"UGATIT_100w\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       batch_size=1,\n                       output_dir='output',\n                       visualization=False)\n    ```\n\n    - 风格转换API，将输入的人脸图像转换成动漫风格。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - batch\\_size (int): 
batch的大小；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m UGATIT_100w\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/UGATIT_100w\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install UGATIT_100w==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_100w/README_en.md",
    "content": "# UGATIT_100w\n\n|Module Name|UGATIT_100w|\n| :--- | :---: |\n|Category|image generation|\n|Network|U-GAT-IT|\n|Dataset|selfie2anime|\n|Fine-tuning supported or not|No|\n|Module Size|41MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/d130fabd8bd34e53b2f942b3766eb6bbd3c19c0676d04abfbd5cc4b83b66f8b6\"  height='80%' hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/8538af03b3f14b1884fcf4eec48965baf939e35a783d40129085102057438c77\"   height='80%' hspace='10'/>\n    <br />\n    Output image\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - UGATIT is a model for style transfer. This module can be used to transfer a face image to cartoon style. For more information, please refer to [UGATIT-Paddle Project](https://github.com/miraiwk/UGATIT-paddle).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install UGATIT_100w\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"UGATIT_100w\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def 
style_transfer(images=None,\n                       paths=None,\n                       batch_size=1,\n                       output_dir='output',\n                       visualization=False)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format \\[H, W, C\\], BGR;\n      - paths (list\\[str\\]): image paths;\n      - batch_size (int): the size of batch;\n      - visualization (bool): whether to save the results as picture files;\n      - output_dir (str): save path of images;\n\n      **NOTE:** Provide the input data through either paths or images, not both.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m UGATIT_100w\n    ```\n\n  - The style transfer service API is now deployed; the default port number is 8866.\n\n  - **NOTE:** If you want to use GPU for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/UGATIT_100w\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install UGATIT_100w==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_100w/model.py",
    "content": "import os\nimport numpy as np\n\nfrom paddle.inference import create_predictor, Config\n\n__all__ = ['Model']\n\n\nclass Model():\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, use_mkldnn=True, combined=True):\n        # 加载模型预测器\n        self.predictor = self.load_model(modelpath, use_gpu, use_mkldnn, combined)\n\n        # 获取模型的输入输出\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n        self.input_handle = self.predictor.get_input_handle(self.input_names[0])\n        self.output_handle = self.predictor.get_output_handle(self.output_names[0])\n\n    # 模型加载函数\n    def load_model(self, modelpath, use_gpu, use_mkldnn, combined):\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    'Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU.'\n                )\n                use_gpu = False\n\n        # 加载模型参数\n        if combined:\n            model = os.path.join(modelpath, \"__model__\")\n            params = os.path.join(modelpath, \"__params__\")\n            config = Config(model, params)\n        else:\n            config = Config(modelpath)\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, 0)\n        else:\n            config.disable_gpu()\n            if use_mkldnn:\n                config.enable_mkldnn()\n        config.disable_glog_info()\n        config.switch_ir_optim(True)\n        config.enable_memory_optim()\n        config.switch_use_feed_fetch_ops(False)\n        config.switch_specify_input_names(True)\n\n        # 通过参数加载模型预测器\n        predictor = create_predictor(config)\n\n        # 返回预测器\n        return predictor\n\n    # 模型预测函数\n    def predict(self, input_datas):\n        outputs = []\n\n        # 遍历输入数据进行预测\n        
for input_data in input_datas:\n            inputs = input_data.copy()\n            self.input_handle.copy_from_cpu(inputs)\n            self.predictor.run()\n            output = self.output_handle.copy_to_cpu()\n            outputs.append(output)\n\n        # 预测结果合并\n        outputs = np.concatenate(outputs, 0)\n\n        # 返回预测结果\n        return outputs\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_100w/module.py",
    "content": "import os\n\nfrom paddlehub import Module\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom UGATIT_100w.model import Model\nfrom UGATIT_100w.processor import base64_to_cv2, cv2_to_base64, Processor\n\n\n@moduleinfo(\n    name=\"UGATIT_100w\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"UGATIT_100w\",  # 模型介绍\n    version=\"1.0.1\"  # 版本号\n)\nclass UGATIT_100w(Module):\n    # 初始化函数\n    def __init__(self, name=None, use_gpu=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"UGATIT_100w\")\n\n        # 加载模型\n        self.model = Model(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=False, combined=False)\n\n    # 风格转换函数\n    def style_transfer(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\n        # 加载数据处理器\n        processor = Processor(images, paths, output_dir, batch_size)\n\n        # 模型预测\n        outputs = self.model.predict(processor.input_datas)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_100w/processor.py",
    "content": "import os\nimport cv2\nimport time\nimport base64\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, output_dir='output', batch_size=1):\n        # 变量设置\n        self.images = images\n        self.paths = paths\n        self.output_dir = output_dir\n        self.batch_size = batch_size\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 图像缩放\n            img = cv2.resize(img, (256, 256))\n\n            # 归一化\n            img = (img.astype('float32') / 255.0 - 0.5) / 0.5\n\n            # 转置\n            img = img.transpose((2, 0, 1))\n\n            # 增加维度\n            img = np.expand_dims(img, 
axis=0)\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分（向上取整）\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = (len(self.datas) + self.batch_size - 1) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    # 数据后处理函数\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 图像后处理\n            img = (output * 0.5 + 0.5) * 255.\n\n            # 限幅\n            img = np.clip(img, 0, 255).astype(np.uint8)\n\n            # 转置\n            img = img.transpose((1, 2, 0))\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), img)\n\n            results.append(img)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_83w/README.md",
    "content": "# UGATIT_83w\n\n|模型名称|UGATIT_83w|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|U-GAT-IT|\n|数据集|selfie2anime|\n|是否支持Fine-tuning|否|\n|模型大小|41MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136651638-33cac040-edad-41ac-a9ce-7c0e678d8c52.jpg\" width = \"400\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136651644-dd1d3836-99b3-40f0-8543-37de18f9cfd9.jpg\" width = \"400\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### 模型介绍\n\n  - UGATIT 图像风格转换模型, 模型可将输入的人脸图像转换成动漫风格.\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.2  \n\n  - paddlehub >= 1.8.0\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install UGATIT_83w\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n \n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    # 模型加载\n    # use_gpu：是否使用GPU进行预测\n    model = hub.Module(name='UGATIT_83w', use_gpu=False)\n\n    # 模型预测\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(\n        self,\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False\n    )\n    ```\n\n    - 风格转换API，将输入的人脸图像转换成动漫风格。\n\n    - **参数**\n        * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，默认为 None；\n        * paths (list\\[str\\]): 图片的路径，默认为 None；\n        * batch\\_size (int): batch 的大小，默认设为 1；\n        * visualization (bool): 是否将识别结果保存为图片文件，默认设为 False；\n        * output\\_dir (str): 图片的保存路径，默认设为 output\n\n   
   **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n      \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  \n  - ```shell\n    $ hub serving start -m UGATIT_83w\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果：\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/UGATIT_83w\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_83w/README_en.md",
    "content": "# UGATIT_83w\n\n|Module Name|UGATIT_83w|\n| :--- | :---: |\n|Category|Image editing|\n|Network |U-GAT-IT|\n|Dataset|selfie2anime|\n|Fine-tuning supported or not|No|\n|Module Size|41MB|\n|Latest update date |2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136651638-33cac040-edad-41ac-a9ce-7c0e678d8c52.jpg\" width = \"400\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136651644-dd1d3836-99b3-40f0-8543-37de18f9cfd9.jpg\" width = \"400\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### Module Introduction\n\n  - UGATIT  can transfer the input face image into the anime style.\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.8.2  \n\n  - paddlehub >= 1.8.0\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install UGATIT_83w\n    ```\n\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n \n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='UGATIT_83w', use_gpu=False)\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(\n        self,\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False\n    )\n    ```\n\n    - Style transfer API, which converts the input face image into anime style.\n\n    - **Parameters**\n        * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\\[str\\]): Image paths, default is None.\n        * batch\\_size (int): Batch size, default is 1.\n        * visualization (bool): Whether to save the results as image files, default is False.\n        * output\\_dir (str): Save path of images, `output` by default.\n\n      **NOTE:** Choose one of `paths` and `images` to provide data.\n\n    - **Return**\n\n      - res (list\\[numpy.ndarray\\]): Result, ndarray.shape is in the format [H, W, C].\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of the style transfer task.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  \n    - ```shell\n      $ hub serving start -m UGATIT_83w\n      ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, use the following lines of code to send a prediction request and obtain the result:\n\n    - ```python\n      import requests\n      import json\n      import cv2\n      import base64\n\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n      # Send an HTTP request\n      data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/UGATIT_83w\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n      # Print the prediction results\n      print(r.json()[\"results\"])\n      ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_83w/model.py",
    "content": "import os\nimport numpy as np\n\nfrom paddle.inference import create_predictor, Config\n\n__all__ = ['Model']\n\n\nclass Model():\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, use_mkldnn=True, combined=True):\n        # 加载模型预测器\n        self.predictor = self.load_model(modelpath, use_gpu, use_mkldnn, combined)\n\n        # 获取模型的输入输出\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n        self.input_handle = self.predictor.get_input_handle(self.input_names[0])\n        self.output_handle = self.predictor.get_output_handle(self.output_names[0])\n\n    # 模型加载函数\n    def load_model(self, modelpath, use_gpu, use_mkldnn, combined):\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    'Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU.'\n                )\n                use_gpu = False\n\n        # 加载模型参数\n        if combined:\n            model = os.path.join(modelpath, \"__model__\")\n            params = os.path.join(modelpath, \"__params__\")\n            config = Config(model, params)\n        else:\n            config = Config(modelpath)\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, 0)\n        else:\n            config.disable_gpu()\n            if use_mkldnn:\n                config.enable_mkldnn()\n        config.disable_glog_info()\n        config.switch_ir_optim(True)\n        config.enable_memory_optim()\n        config.switch_use_feed_fetch_ops(False)\n        config.switch_specify_input_names(True)\n\n        # 通过参数加载模型预测器\n        predictor = create_predictor(config)\n\n        # 返回预测器\n        return predictor\n\n    # 模型预测函数\n    def predict(self, input_datas):\n        outputs = []\n\n        # 遍历输入数据进行预测\n        
for input_data in input_datas:\n            inputs = input_data.copy()\n            self.input_handle.copy_from_cpu(inputs)\n            self.predictor.run()\n            output = self.output_handle.copy_to_cpu()\n            outputs.append(output)\n\n        # 预测结果合并\n        outputs = np.concatenate(outputs, 0)\n\n        # 返回预测结果\n        return outputs\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_83w/module.py",
    "content": "import os\n\nfrom paddlehub import Module\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom UGATIT_83w.model import Model\nfrom UGATIT_83w.processor import base64_to_cv2, cv2_to_base64, Processor\n\n\n@moduleinfo(\n    name=\"UGATIT_83w\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"UGATIT\",  # 模型介绍\n    version=\"1.0.1\"  # 版本号\n)\nclass UGATIT_83w(Module):\n    # 初始化函数\n    def __init__(self, name=None, use_gpu=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"UGATIT_83w\")\n\n        # 加载模型\n        self.model = Model(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=False, combined=False)\n\n    # 风格转换函数\n    def style_transfer(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\n        # 加载数据处理器\n        processor = Processor(images, paths, output_dir, batch_size)\n\n        # 模型预测\n        outputs = self.model.predict(processor.input_datas)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_83w/processor.py",
    "content": "import os\nimport cv2\nimport time\nimport base64\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, output_dir='output', batch_size=1):\n        # 变量设置\n        self.images = images\n        self.paths = paths\n        self.output_dir = output_dir\n        self.batch_size = batch_size\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 图像缩放\n            img = cv2.resize(img, (256, 256))\n\n            # 归一化\n            img = (img.astype('float32') / 255.0 - 0.5) / 0.5\n\n            # 转置\n            img = img.transpose((2, 0, 1))\n\n            # 增加维度\n            img = np.expand_dims(img, 
axis=0)\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分（向上取整）\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = (len(self.datas) + self.batch_size - 1) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    # 数据后处理函数\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 图像后处理\n            img = (output * 0.5 + 0.5) * 255.\n\n            # 限幅\n            img = np.clip(img, 0, 255).astype(np.uint8)\n\n            # 转置\n            img = img.transpose((1, 2, 0))\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), img)\n\n            results.append(img)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_92w/README.md",
    "content": "# UGATIT_92w\n\n|模型名称|UGATIT_92w|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|U-GAT-IT|\n|数据集|selfie2anime|\n|是否支持Fine-tuning|否|\n|模型大小|41MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136651638-33cac040-edad-41ac-a9ce-7c0e678d8c52.jpg\" width = \"400\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136653047-f00c30fb-521f-486f-8247-8d8f63649473.jpg\" width = \"400\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### 模型介绍\n\n  - UGATIT 图像风格转换模型, 模型可将输入的人脸图像转换成动漫风格.\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.2  \n\n  - paddlehub >= 1.8.0\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install UGATIT_92w\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n \n \n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    # 模型加载\n    # use_gpu：是否使用GPU进行预测\n    model = hub.Module(name='UGATIT_92w', use_gpu=False)\n\n    # 模型预测\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(\n        self,\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False\n    )\n    ```\n\n    - 风格转换API，将输入的人脸图像转换成动漫风格。\n\n    - **参数**\n        * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，默认为 None；\n        * paths (list\\[str\\]): 图片的路径，默认为 None；\n        * batch\\_size (int): batch 的大小，默认设为 1；\n        * visualization (bool): 是否将识别结果保存为图片文件，默认设为 False；\n        * output\\_dir (str): 图片的保存路径，默认设为 output\n\n   
   **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n      \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  \n  - ```shell\n    $ hub serving start -m UGATIT_92w\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果：\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/UGATIT_92w\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_92w/README_en.md",
    "content": "# UGATIT_92w\n\n|Module Name|UGATIT_92w|\n| :--- | :---: |\n|Category|Image editing|\n|Network |U-GAT-IT|\n|Dataset|selfie2anime|\n|Fine-tuning supported or not|No|\n|Module Size|41MB|\n|Latest update date |2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136651638-33cac040-edad-41ac-a9ce-7c0e678d8c52.jpg\" width = \"400\" height = \"400\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/136653047-f00c30fb-521f-486f-8247-8d8f63649473.jpg\" width = \"400\" height = \"400\" hspace='10'/>\n    </p>\n\n\n\n- ### Module Introduction\n\n  - UGATIT  can transfer the input face image into the anime style.\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.8.2  \n\n  - paddlehub >= 1.8.0\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install UGATIT_92w\n    ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n \n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='UGATIT_92w', use_gpu=False)\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(\n        self,\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False\n    )\n    ```\n\n    - Style transfer API, which converts the input face image into anime style.\n\n    - **Parameters**\n        * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\\[str\\]): Image paths, default is None.\n        * batch\\_size (int): Batch size, default is 1.\n        * visualization (bool): Whether to save the results as image files, default is False.\n        * output\\_dir (str): Save path of images, `output` by default.\n\n      **NOTE:** Choose one of `paths` and `images` to provide input data.\n\n    - **Return**\n\n      - res (list\\[numpy.ndarray\\]): Style transfer result, ndarray.shape is in the format [H, W, C].\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of the style transfer task.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  \n    - ```shell\n      $ hub serving start -m UGATIT_92w\n      ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, use the following lines of code to send a prediction request and obtain the result:\n\n    - ```python\n      import requests\n      import json\n      import cv2\n      import base64\n\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n      # Send an HTTP request\n      data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/UGATIT_92w\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n      # Print the prediction results\n      print(r.json()[\"results\"])\n      ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_92w/model.py",
    "content": "import os\nimport numpy as np\n\nfrom paddle.inference import create_predictor, Config\n\n__all__ = ['Model']\n\n\nclass Model():\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, use_mkldnn=True, combined=True):\n        # 加载模型预测器\n        self.predictor = self.load_model(modelpath, use_gpu, use_mkldnn, combined)\n\n        # 获取模型的输入输出\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n        self.input_handle = self.predictor.get_input_handle(self.input_names[0])\n        self.output_handle = self.predictor.get_output_handle(self.output_names[0])\n\n    # 模型加载函数\n    def load_model(self, modelpath, use_gpu, use_mkldnn, combined):\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    'Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU.'\n                )\n                use_gpu = False\n\n        # 加载模型参数\n        if combined:\n            model = os.path.join(modelpath, \"__model__\")\n            params = os.path.join(modelpath, \"__params__\")\n            config = Config(model, params)\n        else:\n            config = Config(modelpath)\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, 0)\n        else:\n            config.disable_gpu()\n            if use_mkldnn:\n                config.enable_mkldnn()\n        config.disable_glog_info()\n        config.switch_ir_optim(True)\n        config.enable_memory_optim()\n        config.switch_use_feed_fetch_ops(False)\n        config.switch_specify_input_names(True)\n\n        # 通过参数加载模型预测器\n        predictor = create_predictor(config)\n\n        # 返回预测器\n        return predictor\n\n    # 模型预测函数\n    def predict(self, input_datas):\n        outputs = []\n\n        # 遍历输入数据进行预测\n        
for input_data in input_datas:\n            inputs = input_data.copy()\n            self.input_handle.copy_from_cpu(inputs)\n            self.predictor.run()\n            output = self.output_handle.copy_to_cpu()\n            outputs.append(output)\n\n        # 预测结果合并\n        outputs = np.concatenate(outputs, 0)\n\n        # 返回预测结果\n        return outputs\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_92w/module.py",
    "content": "import os\n\nfrom paddlehub import Module\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom UGATIT_92w.model import Model\nfrom UGATIT_92w.processor import base64_to_cv2, cv2_to_base64, Processor\n\n\n@moduleinfo(\n    name=\"UGATIT_92w\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"UGATIT_92w\",  # 模型介绍\n    version=\"1.0.1\"  # 版本号\n)\nclass UGATIT_92w(Module):\n    # 初始化函数\n    def __init__(self, name=None, use_gpu=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"UGATIT_92w\")\n\n        # 加载模型\n        self.model = Model(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=False, combined=False)\n\n    # 风格转换函数\n    def style_transfer(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\n        # 加载数据处理器\n        processor = Processor(images, paths, output_dir, batch_size)\n\n        # 模型预测\n        outputs = self.model.predict(processor.input_datas)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/UGATIT_92w/processor.py",
"content": "import os\nimport cv2\nimport time\nimport base64\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, output_dir='output', batch_size=1):\n        # 变量设置\n        self.images = images\n        self.paths = paths\n        self.output_dir = output_dir\n        self.batch_size = batch_size\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 图像缩放\n            img = cv2.resize(img, (256, 256))\n\n            # 归一化\n            img = (img.astype('float32') / 255.0 - 0.5) / 0.5\n\n            # 转置\n            img = img.transpose((2, 0, 1))\n\n            # 增加维度\n            img = np.expand_dims(img, axis=0)\n\n
            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 图像后处理\n            img = (output * 0.5 + 0.5) * 255.\n\n            # 限幅\n            img = np.clip(img, 0, 255).astype(np.uint8)\n\n            # 转置\n            img = img.transpose((1, 2, 0))\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), img)\n\n            results.append(img)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/README.md",
    "content": "# animegan_v1_hayao_60\n\n|模型名称|animegan_v1_hayao_60|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|The Wind Rises|\n|是否支持Fine-tuning|否|\n|模型大小|18MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输入图像\n     <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/10175bb964e94ce18608a84b0ab6ebfe154b523df42f44a3a851b2d91dd17a63\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输出图像\n     <br />\n    </p>\n\n\n\n- ### 模型介绍\n\n  - AnimeGAN V1 图像风格转换模型, 模型可将输入的图像转换成宫崎骏动漫风格，模型权重转换自[AnimeGAN V1官方开源项目](https://github.com/TachibanaYoshino/AnimeGAN)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v1_hayao_60\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v1_hayao_60\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的短边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数任选其一提供数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v1_hayao_60\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，使用以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v1_hayao_60\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v1_hayao_60==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/README_en.md",
"content": "# animegan_v1_hayao_60\n\n|Module Name|animegan_v1_hayao_60|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|The Wind Rises|\n|Fine-tuning supported or not|No|\n|Module Size|18MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Input Image\n     <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/10175bb964e94ce18608a84b0ab6ebfe154b523df42f44a3a851b2d91dd17a63\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Output Image\n     <br />\n    </p>\n\n\n\n- ### Module Introduction\n\n  - AnimeGAN V1 is a style transfer model, which can transfer an image into the Miyazaki cartoon style. For more information, please refer to [AnimeGAN V1 Project](https://github.com/TachibanaYoshino/AnimeGAN).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v1_hayao_60\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v1_hayao_60\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n
  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): min size of image shape, default is 32;\n      - max\\_size (int): max size of image shape, default is 1024.\n\n      **NOTE:** Choose either paths or images to provide the input data\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m animegan_v1_hayao_60\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n
    url = \"http://127.0.0.1:8866/predict/animegan_v1_hayao_60\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v1_hayao_60==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        # Config对象可直接使用，需在路径检查之前判断\n        if isinstance(modelpath, Config):\n            config = modelpath\n        elif os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n
    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n
            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/module.py",
"content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v1_hayao_60\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v1_hayao_60\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V1_Hayao_60:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v1_hayao_60\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 风格转换函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = self.style_transfer(images_decode, **kwargs)\n\n
        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/processor.py",
"content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n
                img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v1_hayao_60/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v1_hayao_60\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/README.md",
"content": "# animegan_v2_hayao_64\n\n|模型名称|animegan_v2_hayao_64|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|The Wind Rises|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/49620341f1fe4f00af4d93c22694897a1ae578a235844a1db1bbb4bd37bf750b\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成宫崎骏动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_hayao_64\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_hayao_64\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n
      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的短边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数任选其一提供数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_hayao_64\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，使用以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_hayao_64\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_hayao_64==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/README_en.md",
"content": "# animegan_v2_hayao_64\n\n|Module Name|animegan_v2_hayao_64|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|The Wind Rises|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/49620341f1fe4f00af4d93c22694897a1ae578a235844a1db1bbb4bd37bf750b\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model, which can transfer an image into the Miyazaki cartoon style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_hayao_64\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_hayao_64\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n
  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): min size of image shape, default is 32;\n      - max\\_size (int): max size of image shape, default is 1024.\n\n      **NOTE:** Choose either paths or images to provide the input data\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m animegan_v2_hayao_64\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n
    url = \"http://127.0.0.1:8866/predict/animegan_v2_hayao_64\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_hayao_64==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        elif isinstance(modelpath, Config):\n            config = modelpath\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                
self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_hayao_64\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_hayao_64\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Hayao_64:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_hayao_64\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_64/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_hayao_64\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/README.md",
    "content": "# animegan_v2_hayao_99\n\n|模型名称|animegan_v2_hayao_99|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|The Wind Rises|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/16195e03d7e0412d990349587c587a26d9ae9e2ed1ec4fa1b4dc994e948d1f7d\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成宫崎骏动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_hayao_99\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_hayao_99\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的短边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_hayao_99\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_hayao_99\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_hayao_99==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/README_en.md",
    "content": "# animegan_v2_hayao_99\n\n|Module Name|animegan_v2_hayao_99|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|The Wind Rises|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/16195e03d7e0412d990349587c587a26d9ae9e2ed1ec4fa1b4dc994e948d1f7d\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model, which can transfer a image style to Miyazaki carton style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_hayao_99\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_hayao_99\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): min size of image shape，default is 32；\n      - max\\_size (int): max size of image shape，default is 1024.\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list，ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m animegan_v2_hayao_99\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    
url = \"http://127.0.0.1:8866/predict/animegan_v2_hayao_99\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_hayao_99==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        elif isinstance(modelpath, Config):\n            config = modelpath\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                
self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_hayao_99\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_hayao_99\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Hayao_99:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_hayao_99\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
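Both `Processor.preprocess` above and `InferenceModel.forward` in the companion `model.py` compute the number of batches with a conditional expression that amounts to a ceiling division, then hand the count to `np.array_split`. A minimal standalone sketch (the helper name `split_batches` is hypothetical, NumPy only):

```python
import numpy as np

def split_batches(data, batch_size):
    # Same arithmetic as processor.py/model.py: ceil(n / batch_size) chunks
    n = len(data)
    split_num = n // batch_size + 1 if n % batch_size != 0 else n // batch_size
    return np.array_split(data, split_num)

batches = split_batches(np.zeros((5, 2), dtype='float32'), batch_size=2)
print([len(b) for b in batches])
```

Note that `np.array_split` spreads the remainder over the first chunks, so five samples at `batch_size=2` come back as chunks of 2, 2 and 1 — every sample is covered, no chunk exceeds the batch size by more than the remainder allows.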
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_hayao_99/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_hayao_99\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/README.md",
    "content": "# animegan_v2_paprika_54\n\n|模型名称|animegan_v2_paprika_54|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Paprika|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/6574669d87b24bab9627c6e33896528b4a0bf5af1cd84ca29655d68719f2d551\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成今敏红辣椒动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_paprika_54\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_54\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的长边最大尺寸，默认设为 1024。\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_54\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_54\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_54==1.1.0\n    ```\n"
  },
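The `min_size`/`max_size` parameters documented above act on different sides of the image: `max_size` caps the long side, and only when that cap is not triggered does `min_size` raise the short side. A pure-Python sketch of the sizing rule from `processor.py` (the helper name `target_size` is hypothetical; note that cv2's resize takes `(width, height)`):

```python
def target_size(h, w, min_size=32, max_size=1024):
    # Mirror of the preprocess scaling rule; returns (new_w, new_h)
    if max(h, w) > max_size:
        # Cap the long side at max_size, keep the aspect ratio
        if h < w:
            return (max_size, int(h / w * max_size))
        return (int(w / h * max_size), max_size)
    elif min(h, w) < min_size:
        # Raise the short side to min_size, keep the aspect ratio
        if h > w:
            return (min_size, int(h / w * min_size))
        return (int(w / h * min_size), min_size)
    return (w, h)

print(target_size(2048, 1024))  # tall image: long side capped at 1024
print(target_size(16, 64))      # short image: short side raised to 32
```

A 2048x1024 input therefore comes out at width 512, height 1024, and a 16x64 input at width 128, height 32; anything already inside the bounds passes through unchanged.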
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/README_en.md",
    "content": "# animegan_v2_paprika_54\n\n|Module Name |animegan_v2_paprika_54|\n| :--- | :---: |\n|Category |Image generation|\n|Network|AnimeGAN|\n|Dataset|Paprika|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/6574669d87b24bab9627c6e33896528b4a0bf5af1cd84ca29655d68719f2d551\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 image style stransfer model, the model can convert the input image into red pepper anime style, the model weight is converted from[AnimeGAN V2 official repo](https://github.com/TachibanaYoshino/AnimeGAN)。\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_paprika_54\n    ```\n\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_54\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n      - paths (list\\[str\\]): Image path.\n      - output\\_dir (str): Save path of images, `output` by default.\n      - visualization (bool): Whether to save the results as picture files.\n      - min\\_size (int): Minimum size, default is  32.\n      - max\\_size (int): Maximum size, default is 1024.\n\n      **NOTE:** Choose one of `paths` and `images` to provide input data.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): The list of style transfer results，ndarray.shape is in the format [H, W, C].\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m animegan_v2_paprika_54\n      ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n   - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n   - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n\n        def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tobytes()).decode('utf8')\n\n        # Send an HTTP request\n        data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_54\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        # print prediction results\n        print(r.json()[\"results\"])\n        ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release.\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_54==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if isinstance(modelpath, Config):\n            # A ready-made Config object (checked first, since os.path calls reject it)\n            config = modelpath\n        elif os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # Apply runtime settings\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # Return the config\n        return config\n\n    # Predictor creation\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # Create the predictor\n        self.predictor = create_predictor(self.config)\n\n        # Get the input and output names\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # Get the number of input and output nodes\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # Get the input handles\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # Get the output handles\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # Forward inference\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # Split the input data into batches\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # Run prediction batch by batch\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # Merge the batch outputs\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # Return the predictions\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
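`InferenceModel.forward` accumulates each output node's per-batch arrays in a dict keyed by node index, concatenates them at the end, and returns a bare array for a single output node or a tuple otherwise. A standalone sketch of that merge with made-up batch outputs (NumPy only; no real predictor involved):

```python
import numpy as np

# Pretend the predictor produced two batches for a single output node
fake_batches = [np.ones((2, 3), dtype='float32'), np.zeros((1, 3), dtype='float32')]

outputs = {}
for batch_out in fake_batches:
    outputs.setdefault(0, []).append(batch_out)  # collect per-node batch results

# Merge the batch outputs, as forward() does
for key in outputs.keys():
    outputs[key] = np.concatenate(outputs[key], 0)

merged = [v for v in outputs.values()]
result = tuple(merged) if len(merged) > 1 else merged[0]
print(result.shape)  # single output node -> a bare (3, 3) array, not a 1-tuple
```

The single-output shortcut is why callers like `module.py` can use the return value directly as an array.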
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_paprika_54\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_paprika_54\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Paprika_54:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_paprika_54\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
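`style_transfer` relies on the processor's normalization pair: `img / 127.5 - 1.0` on the way into the network and `(out + 1.) / 2 * 255` on the way out. A quick sketch verifying that the two mappings are inverses (NumPy only; the sample pixel values are arbitrary):

```python
import numpy as np

img = np.array([[0, 64, 127, 255]], dtype=np.uint8)

norm = img / 127.5 - 1.0                       # preprocess: [0, 255] -> [-1, 1]
restored = np.clip((norm + 1.) / 2 * 255, 0, 255)  # postprocess: [-1, 1] -> [0, 255]

print(norm.min(), norm.max())
print(np.allclose(restored, img))
```

Since `(x / 127.5 - 1 + 1) / 2 * 255` reduces algebraically to `x`, the round trip is lossless up to floating-point error, which is why `postprocess` only needs a clip before casting back to `uint8`.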
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
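The crop step in `preprocess` trims the bottom and right edges so both sides are multiples of 32, a common stride constraint for fully-convolutional generators. A minimal sketch (the helper name `crop_to_multiple` is hypothetical); note that a side shorter than 32 would be cropped to zero height or width, which is why `min_size` defaults to 32:

```python
import numpy as np

def crop_to_multiple(img, base=32):
    # Same crop as preprocess(): trim so H and W are multiples of base
    h, w = img.shape[:2]
    return img[:h - (h % base), :w - (w % base), :]

cropped = crop_to_multiple(np.zeros((70, 65, 3), dtype='float32'))
print(cropped.shape)  # (64, 64, 3)
```

A 70x65 input loses 6 rows and 1 column; inputs already aligned to 32 pass through unchanged.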
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_54/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_paprika_54\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/README.md",
    "content": "# animegan_v2_paprika_74\n\n|模型名称|animegan_v2_paprika_74|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Paprika|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/6574669d87b24bab9627c6e33896528b4a0bf5af1cd84ca29655d68719f2d551\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成今敏红辣椒动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_paprika_74\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_74\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的长边最大尺寸，默认设为 1024。\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_74\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_74\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_74==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/README_en.md",
    "content": "# animegan_v2_paprika_74\n\n|Module Name|animegan_v2_paprika_74|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|Paprika|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input Image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/6574669d87b24bab9627c6e33896528b4a0bf5af1cd84ca29655d68719f2d551\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output Image\n     <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model, which can transfer a image style to paprika carton style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_paprika_74\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_74\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images, `output` by default;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): minimum short-side size of the input image, default is 32;\n      - max\\_size (int): maximum long-side size of the input image, default is 1024.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_74\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_74\"\n    r = 
requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_74==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        elif isinstance(modelpath, Config):\n            config = modelpath\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                
self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_paprika_74\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_paprika_74\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Paprika_74:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_paprika_74\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_74/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_paprika_74\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/README.md",
    "content": "# animegan_v2_paprika_97\n\n|模型名称|animegan_v2_paprika_97|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Paprika|\n|是否支持Fine-tuning|否|\n|模型大小|9.7MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136652269-48b8c902-3a2b-46b7-a9f2-d500097bbb0e.jpg\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输入图像\n     <br />\n    <img src=\"https://user-images.githubusercontent.com/35907364/136652280-7e9ebfd2-8a45-4b5b-b3ac-f107770525c4.jpg\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    输出图像\n     <br />\n    </p>\n\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成红辣椒动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGAN)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_paprika_97\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_97\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 
图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的短边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_97\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_97\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_97==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/README_en.md",
    "content": "# animegan_v2_paprika_97\n\n|Module Name |animegan_v2_paprika_97|\n| :--- | :---: |\n|Category |Image generation|\n|Network|AnimeGAN|\n|Dataset|Paprika|\n|Fine-tuning supported or not|No|\n|Module Size|9.7MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/136652269-48b8c902-3a2b-46b7-a9f2-d500097bbb0e.jpg\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Input image\n     <br />\n    <img src=\"https://user-images.githubusercontent.com/35907364/136652280-7e9ebfd2-8a45-4b5b-b3ac-f107770525c4.jpg\"  width = \"450\" height = \"300\" hspace='10'/>\n     <br />\n    Output image\n     <br />\n    </p>\n\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 image style stransfer model, the model can convert the input image into red pepper anime style, the model weight is converted from[AnimeGAN V2 official repo](https://github.com/TachibanaYoshino/AnimeGAN)。\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_paprika_97\n    ```\n\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_97\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n      - paths (list\\[str\\]): Image path.\n      - output\\_dir (str): Save path of images, `output` by default.\n      - visualization (bool): Whether to save the results as picture files.\n      - min\\_size (int): Minimum size, default is  32.\n      - max\\_size (int): Maximum size, default is 1024.\n\n      **NOTE:** Choose one of `paths` and `images` to provide input data.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): The list of style transfer results，ndarray.shape is in the format [H, W, C].\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m animegan_v2_paprika_97\n      ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n   - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n   - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n\n        def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tostring()).decode('utf8')\n\n        # Send an HTTP request\n        data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_97\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        # print prediction results\n        print(r.json()[\"results\"])\n        ```\n\n\n## V. Release Note\n\n- 1.0.0\n\n  First release.\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_97==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        elif isinstance(modelpath, Config):\n            config = modelpath\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                
self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_paprika_97\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_paprika_97\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Paprika_97:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_paprika_97\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_97/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_paprika_97\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/README.md",
    "content": "# animegan_v2_paprika_98\n\n|模型名称|animegan_v2_paprika_98|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Paprika|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/495436a627ef423ab572536c5f2ba6d0eb99b1ce098947a5ac02af36e7eb85f7\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成今敏红辣椒动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_paprika_98\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_98\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的长边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数选择其一提供输入数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_98\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_98\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_98==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/README_en.md",
    "content": "# animegan_v2_paprika_98\n\n|Module Name|animegan_v2_paprika_98|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|Paprika|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/495436a627ef423ab572536c5f2ba6d0eb99b1ce098947a5ac02af36e7eb85f7\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model, which can transfer a image style to paprika carton style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_paprika_98\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_paprika_98\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): min size of image shape，default is 32；\n      - max\\_size (int): max size of image shape，default is 1024.\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list，ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m animegan_v2_paprika_98\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n 
   url = \"http://127.0.0.1:8866/predict/animegan_v2_paprika_98\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_paprika_98==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # 初始化函数\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 加载模型配置\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # 打印函数\n    def __repr__(self):\n        '''\n        get the numbers and name of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # 类调用函数\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # 模型参数加载函数\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # 对运行位置进行配置\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print(\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...'''\n                )\n                use_gpu = False\n\n        if os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can\\'t find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        elif isinstance(modelpath, Config):\n            config = modelpath\n        else:\n            raise Exception(\"Error! Can\\'t find the model in: %s. 
Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # 设置参数\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # 返回配置\n        return config\n\n    # 预测器创建函数\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # 创建预测器\n        self.predictor = create_predictor(self.config)\n\n        # 获取模型的输入输出名称\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # 获取模型的输入输出节点数量\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # 获取输入\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # 获取输出\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # 前向计算函数\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # 切分输入数据\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # 遍历输入数据进行预测\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                
self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # 预测结果合并\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # 返回预测结果\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_paprika_98\",  # 模型名称\n    type=\"CV/style_transfer\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"animegan_v2_paprika_98\",  # 模型介绍\n    version=\"1.1.0\"  # 版本号\n)\nclass Animegan_V2_Paprika_98:\n    # 初始化函数\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # 设置模型路径\n        self.model_path = os.path.join(self.directory, \"animegan_v2_paprika_98\", \"model\")\n\n        # 加载模型\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # 关键点检测函数\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # 加载数据处理器\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # 模型预测\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # 结果后处理\n        results = processor.postprocess(outputs, visualization)\n\n        # 返回结果\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # 获取输入数据\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # 图片风格转换\n        results = 
self.style_transfer(images_decode, **kwargs)\n\n        # 对输出图片进行编码\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # 返回结果\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # 目录检查函数\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # base64转cv2函数\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # cv2转base64函数\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nclass Processor():\n    # 初始化函数\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # 变量设置\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # 获取原始输入数据\n        self.datas = self.load_datas()\n\n        # 对原始输入数据进行预处理\n        self.input_datas = self.preprocess()\n\n    # 读取数据函数\n    def load_datas(self):\n        datas = []\n\n        # 读取数据列表\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # 返回数据列表\n        return datas\n\n    # 数据预处理函数\n    def preprocess(self):\n        input_datas = []\n\n        # 数据预处理\n        for i, img in enumerate(self.datas):\n            # 格式转换\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # 缩放图片\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n          
      img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # 裁剪图片\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # 归一化\n            img = img / 127.5 - 1.0\n\n            # 新建维度\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # 加入输入数据列表\n            input_datas.append(img)\n\n        # 数据按batch_size切分\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # 返回预处理完成的数据\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # 反归一化\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # 限幅\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # 格式转换\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # 可视化\n            if visualization:\n                # 检查输出目录\n                check_dir(self.output_dir)\n\n                # 写入输出图片\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # 返回结果\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_paprika_98/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_paprika_98\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/README.md",
    "content": "# animegan_v2_shinkai_33\n\n|模型名称|animegan_v2_shinkai_33|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Your Name, Weathering with you|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/776a84a0d97c452bbbe479592fbb8f5c6fe9c45f3b7e41fd8b7da80bf52ee668\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成新海诚动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_33\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_shinkai_33\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - 
images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的长边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数选择其一提供输入数据即可\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_shinkai_33\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_shinkai_33\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_33==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/README_en.md",
    "content": "# animegan_v2_shinkai_33\n\n|Module Name|animegan_v2_shinkai_33|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|Your Name, Weathering with you|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/776a84a0d97c452bbbe479592fbb8f5c6fe9c45f3b7e41fd8b7da80bf52ee668\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model, which can transfer a image style to Shinkai carton style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_33\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_shinkai_33\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): min size of image shape，default is 32；\n      - max\\_size (int): max size of image shape，default is 1024.\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list，ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m animegan_v2_shinkai_33\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": 
\"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_shinkai_33\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_33==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # Initialization function\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # Load the model config\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # Print function\n    def __repr__(self):\n        '''\n        get the numbers and names of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # Call function\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # Model config loading function\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # Configure the running device\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print('''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. Now switch to CPU to continue...''')\n                use_gpu = False\n\n        # Check the Config case first: the path operations below would raise on a Config object\n        if isinstance(modelpath, Config):\n            config = modelpath\n        elif os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can't find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        else:\n            raise Exception(\"Error! Can't find the model in: %s. Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # Set the config options\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # Return the config\n        return config\n\n    # Predictor creation function\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # Create the predictor\n        self.predictor = create_predictor(self.config)\n\n        # Get the input and output names of the model\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # Get the numbers of input and output nodes\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # Get the input handles\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # Get the output handles\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # Forward computation function\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # Split the input data into batches\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # Iterate over the batches and run the predictor\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # Concatenate the prediction results\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # Return the prediction results\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_shinkai_33\",  # model name\n    type=\"CV/style_transfer\",  # model type\n    author=\"jm12138\",  # author name\n    author_email=\"jm12138@qq.com\",  # author email\n    summary=\"animegan_v2_shinkai_33\",  # model summary\n    version=\"1.1.0\"  # version number\n)\nclass Animegan_V2_Shinkai_33:\n    # Initialization function\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # Set the model path\n        self.model_path = os.path.join(self.directory, \"animegan_v2_shinkai_33\", \"model\")\n\n        # Load the model\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # Style transfer function\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # Create the data processor\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # Model prediction\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # Postprocess the results\n        results = processor.postprocess(outputs, visualization)\n\n        # Return the results\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # Decode the input data\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # Image style transfer\n        results = self.style_transfer(images_decode, **kwargs)\n\n        # Encode the output images\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # Return the results\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # Check the output directory\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # Convert base64 to a cv2 image\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # Convert a cv2 image to base64 (tobytes replaces the deprecated tostring)\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nclass Processor():\n    # Initialization function\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # Set the variables\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # Load the raw input data\n        self.datas = self.load_datas()\n\n        # Preprocess the raw input data\n        self.input_datas = self.preprocess()\n\n    # Data loading function\n    def load_datas(self):\n        datas = []\n\n        # Read the data list\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # Return the data list\n        return datas\n\n    # Data preprocessing function\n    def preprocess(self):\n        input_datas = []\n\n        # Preprocess each image\n        for i, img in enumerate(self.datas):\n            # Convert the color format\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # Resize the image\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n                img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # Crop the image so both sides are multiples of 32\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # Normalize to [-1, 1]\n            img = img / 127.5 - 1.0\n\n            # Add the batch dimension\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # Append to the input data list\n            input_datas.append(img)\n\n        # Split the data by batch_size\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # Return the preprocessed data\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # Denormalize\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # Clip to [0, 255]\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # Convert the color format\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # Visualization\n            if visualization:\n                # Check the output directory\n                check_dir(self.output_dir)\n\n                # Write the output image\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # Return the results\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_33/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_shinkai_33\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/README.md",
    "content": "# animegan_v2_shinkai_53\n\n|模型名称|animegan_v2_shinkai_53|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|AnimeGAN|\n|数据集|Your Name, Weathering with you|\n|是否支持Fine-tuning|否|\n|模型大小|9.4MB|\n|最新更新日期|2021-07-30|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/fa4ba157e73c48658c4c9c6b8b92f5c99231d1d19556472788b1e5dd58d5d6cc\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - AnimeGAN V2 图像风格转换模型, 模型可将输入的图像转换成新海诚动漫风格，模型权重转换自[AnimeGAN V2官方开源项目](https://github.com/TachibanaYoshino/AnimeGANv2)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_53\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_shinkai_53\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - 风格转换API，将输入的图片转换为漫画风格。\n\n    - **参数**\n\n      - 
images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - min\\_size (int): 输入图片的短边最小尺寸，默认设为 32；<br/>\n      - max\\_size (int): 输入图片的短边最大尺寸，默认设为 1024。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 输出图像数据，ndarray.shape 为 \\[H, W, C\\]\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m animegan_v2_shinkai_53\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/animegan_v2_shinkai_53\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_53==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/README_en.md",
    "content": "# animegan_v2_shinkai_53\n\n|Module Name|animegan_v2_shinkai_53|\n| :--- | :---: |\n|Category|image generation|\n|Network|AnimeGAN|\n|Dataset|Your Name, Weathering with You|\n|Fine-tuning supported or not|No|\n|Module Size|9.4MB|\n|Latest update date|2021-07-30|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/bd002c4bb6a7427daf26988770bb18648b7d8d2bfd6746bfb9a429db4867727f\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/fa4ba157e73c48658c4c9c6b8b92f5c99231d1d19556472788b1e5dd58d5d6cc\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    Output image\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - AnimeGAN V2 is a style transfer model that can transfer an image into the Shinkai cartoon style. For more information, please refer to [AnimeGAN V2 Project](https://github.com/TachibanaYoshino/AnimeGANv2).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_53\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"animegan_v2_shinkai_53\")\n    result = model.style_transfer(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
model.style_transfer(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024)\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format \\[H, W, C\\], BGR;\n      - paths (list\\[str\\]): image paths;\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - min\\_size (int): minimum size of the image, default is 32;\n      - max\\_size (int): maximum size of the image, default is 1024.\n\n      **NOTE:** Use either paths or images to provide the input data; only one of the two is required.\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): result list, ndarray.shape is \\[H, W, C\\]\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m animegan_v2_shinkai_53\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n 
   url = \"http://127.0.0.1:8866/predict/animegan_v2_shinkai_53\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install animegan_v2_shinkai_53==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/model.py",
    "content": "import os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\n__all__ = ['InferenceModel']\n\n\nclass InferenceModel:\n    # Initialization function\n    def __init__(self, modelpath, use_gpu=False, gpu_id=0, use_mkldnn=False, cpu_threads=1):\n        '''\n        init the inference model\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # Load the model config\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\n\n    # Print function\n    def __repr__(self):\n        '''\n        get the numbers and names of inputs and outputs\n        '''\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\n            self.input_num, str(self.input_names), self.output_num, str(self.output_names))\n\n    # Call function\n    def __call__(self, *input_datas, batch_size=1):\n        '''\n        call function\n        '''\n        return self.forward(*input_datas, batch_size=batch_size)\n\n    # Model config loading function\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\n        '''\n        load the model config\n        modelpath: inference model path\n        use_gpu: use gpu or not\n        use_mkldnn: use mkldnn or not\n        '''\n        # Configure the running device\n        if use_gpu:\n            try:\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\n            except Exception:\n                print('''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. Now switch to CPU to continue...''')\n                use_gpu = False\n\n        # Check the Config case first: the path operations below would raise on a Config object\n        if isinstance(modelpath, Config):\n            config = modelpath\n        elif os.path.isdir(modelpath):\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\n                # __model__ + __params__\n                model = os.path.join(modelpath, \"__model__\")\n                params = os.path.join(modelpath, \"__params__\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\n                # model + params\n                model = os.path.join(modelpath, \"model\")\n                params = os.path.join(modelpath, \"params\")\n                config = Config(model, params)\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\n                # __model__ + others\n                config = Config(modelpath)\n            else:\n                raise Exception(\"Error! Can't find the model in: %s. Please check your model path.\" %\n                                os.path.abspath(modelpath))\n        elif os.path.exists(modelpath + \".pdmodel\"):\n            # *.pdmodel + *.pdiparams\n            model = modelpath + \".pdmodel\"\n            params = modelpath + \".pdiparams\"\n            config = Config(model, params)\n        else:\n            raise Exception(\"Error! Can't find the model in: %s. Please check your model path.\" %\n                            os.path.abspath(modelpath))\n\n        # Set the config options\n        if use_gpu:\n            config.enable_use_gpu(100, gpu_id)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(cpu_threads)\n            if use_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # Return the config\n        return config\n\n    # Predictor creation function\n    def eval(self):\n        '''\n        create the model predictor by model config\n        '''\n        # Create the predictor\n        self.predictor = create_predictor(self.config)\n\n        # Get the input and output names of the model\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n\n        # Get the numbers of input and output nodes\n        self.input_num = len(self.input_names)\n        self.output_num = len(self.output_names)\n\n        # Get the input handles\n        self.input_handles = []\n        for input_name in self.input_names:\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\n\n        # Get the output handles\n        self.output_handles = []\n        for output_name in self.output_names:\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\n\n    # Forward computation function\n    def forward(self, *input_datas, batch_size=1):\n        \"\"\"\n        model inference\n        batch_size: batch size\n        *input_datas: x1, x2, ..., xn\n        \"\"\"\n        # Split the input data into batches\n        datas_num = input_datas[0].shape[0]\n        split_num = datas_num // batch_size + \\\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\n\n        # Iterate over the batches and run the predictor\n        outputs = {}\n        for step in range(split_num):\n            for i in range(self.input_num):\n                input_data = input_datas[i][step].copy()\n                self.input_handles[i].copy_from_cpu(input_data)\n\n            self.predictor.run()\n\n            for i in range(self.output_num):\n                output = self.output_handles[i].copy_to_cpu()\n                if i in outputs:\n                    outputs[i].append(output)\n                else:\n                    outputs[i] = [output]\n\n        # Concatenate the prediction results\n        for key in outputs.keys():\n            outputs[key] = np.concatenate(outputs[key], 0)\n\n        outputs = [v for v in outputs.values()]\n\n        # Return the prediction results\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/module.py",
    "content": "import os\n\nfrom .model import InferenceModel\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import Processor\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"animegan_v2_shinkai_53\",  # model name\n    type=\"CV/style_transfer\",  # model type\n    author=\"jm12138\",  # author name\n    author_email=\"jm12138@qq.com\",  # author email\n    summary=\"animegan_v2_shinkai_53\",  # model summary\n    version=\"1.1.0\"  # version number\n)\nclass Animegan_V2_Shinkai_53:\n    # Initialization function\n    def __init__(self, use_gpu=False, use_mkldnn=False):\n        # Set the model path\n        self.model_path = os.path.join(self.directory, \"animegan_v2_shinkai_53\", \"model\")\n\n        # Load the model\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu, use_mkldnn=use_mkldnn)\n\n        self.model.eval()\n\n    # Style transfer function\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       output_dir='output',\n                       visualization=False,\n                       min_size=32,\n                       max_size=1024):\n        # Create the data processor\n        processor = Processor(images=images,\n                              paths=paths,\n                              batch_size=1,\n                              output_dir=output_dir,\n                              min_size=min_size,\n                              max_size=max_size)\n\n        # Model prediction\n        outputs = []\n        for input_data in processor.input_datas:\n            output = self.model(input_data)\n            outputs.append(output)\n\n        # Postprocess the results\n        results = processor.postprocess(outputs, visualization)\n\n        # Return the results\n        return results\n\n    # Hub Serving\n    @serving\n    def serving_method(self, images, **kwargs):\n        # Decode the input data\n        images_decode = [base64_to_cv2(image) for image in images]\n\n        # Image style transfer\n        results = self.style_transfer(images_decode, **kwargs)\n\n        # Encode the output images\n        encodes = []\n        for result in results:\n            encode = cv2_to_base64(result)\n            encodes.append(encode)\n\n        # Return the results\n        return encodes\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/processor.py",
    "content": "import base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'cv2_to_base64', 'Processor']\n\n\ndef check_dir(dir_path):\n    # Check the output directory\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef base64_to_cv2(b64str):\n    # Convert base64 to a cv2 image\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # Convert a cv2 image to base64 (tobytes replaces the deprecated tostring)\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nclass Processor():\n    # Initialization function\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output', min_size=32, max_size=1024):\n        # Set the variables\n        self.min_size = min_size\n        self.max_size = max_size\n\n        self.images = images\n        self.paths = paths\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n\n        # Load the raw input data\n        self.datas = self.load_datas()\n\n        # Preprocess the raw input data\n        self.input_datas = self.preprocess()\n\n    # Data loading function\n    def load_datas(self):\n        datas = []\n\n        # Read the data list\n        if self.paths is not None:\n            for im_path in self.paths:\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n                im = cv2.imread(im_path)\n                datas.append(im)\n\n        if self.images is not None:\n            datas = self.images\n\n        # Return the data list\n        return datas\n\n    # Data preprocessing function\n    def preprocess(self):\n        input_datas = []\n\n        # Preprocess each image\n        for i, img in enumerate(self.datas):\n            # Convert the color format\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n\n            # Resize the image\n            h, w = img.shape[:2]\n            if max(h, w) > self.max_size:\n                img = cv2.resize(img, (self.max_size, int(h / w * self.max_size))) if h < w else cv2.resize(\n                    img, (int(w / h * self.max_size), self.max_size))\n            elif min(h, w) < self.min_size:\n                img = cv2.resize(img, (self.min_size, int(h / w * self.min_size))) if h > w else cv2.resize(\n                    img, (int(w / h * self.min_size), self.min_size))\n\n            # Crop the image so both sides are multiples of 32\n            h, w = img.shape[:2]\n            img = img[:h - (h % 32), :w - (w % 32), :]\n\n            # Normalize to [-1, 1]\n            img = img / 127.5 - 1.0\n\n            # Add the batch dimension\n            img = np.expand_dims(img, axis=0).astype('float32')\n\n            # Append to the input data list\n            input_datas.append(img)\n\n        # Split the data by batch_size\n        input_datas = np.concatenate(input_datas, 0)\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\n            self.datas) // self.batch_size\n        input_datas = np.array_split(input_datas, split_num)\n\n        # Return the preprocessed data\n        return input_datas\n\n    def postprocess(self, outputs, visualization):\n        results = []\n\n        for im_id, output in enumerate(outputs):\n            # Denormalize\n            image = (output.squeeze() + 1.) / 2 * 255\n\n            # Clip to [0, 255]\n            image = np.clip(image, 0, 255).astype(np.uint8)\n\n            # Convert the color format\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n\n            # Visualization\n            if visualization:\n                # Check the output directory\n                check_dir(self.output_dir)\n\n                # Write the output image\n                cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), image)\n\n            results.append(image)\n\n        # Return the results\n        return results\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/animegan_v2_shinkai_53/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        img = cv2.imread('tests/test.jpg')\n        img = cv2.resize(img, (0, 0), fx=0.25, fy=0.25)\n        cv2.imwrite('tests/test.jpg', img)\n        cls.module = hub.Module(name=\"animegan_v2_shinkai_53\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_style_transfer1(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer2(self):\n        results = self.module.style_transfer(paths=['tests/test.jpg'], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer3(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')])\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer4(self):\n        results = self.module.style_transfer(images=[cv2.imread('tests/test.jpg')], visualization=True)\n        self.assertIsInstance(results[0], np.ndarray)\n\n    def test_style_transfer5(self):\n        self.assertRaises(AssertionError, self.module.style_transfer, paths=['no.jpg'])\n\n    def test_style_transfer6(self):\n        self.assertRaises(cv2.error, self.module.style_transfer, 
images=['tests/test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/face_parse/README.md",
    "content": "# face_parse\n\n|模型名称|face_parse|\n| :--- | :---: |\n|类别|图像 - 人脸解析|\n|网络|BiSeNet|\n|数据集|COCO-Stuff|\n|是否支持Fine-tuning|否|\n|模型大小|77MB|\n|最新更新日期|2021-12-07|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/157190651-595b6964-97c5-4b0b-ac0a-c30c8520a972.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/157192693-b3f737ed-1a24-4ef9-8454-bfd9d51755af.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - 人脸解析是语义图像分割的一种特殊情况，人脸解析是计算人脸图像中不同语义成分(如头发、嘴唇、鼻子、眼睛等)的像素级标签映射。给定一个输入的人脸图像，人脸解析将为每个语义成分分配一个像素级标签。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n  - dlib\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install face_parse\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run face_parse --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸解析模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"face_parse\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.style_transfer(paths=input_path, output_dir='./transfer_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True):\n    ```\n    - 人脸解析转换API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - 
use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸解析转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m face_parse\n    ```\n\n  - 这样就完成了一个人脸解析转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/face_parse\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install face_parse==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/face_parse/model.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport sys\nimport argparse\n\nfrom PIL import Image\nimport numpy as np\nimport cv2\n\nimport ppgan.faceutils as futils\nfrom ppgan.utils.preprocess import *\nfrom ppgan.utils.visual import mask2image\n\n\nclass FaceParsePredictor:\n    def __init__(self):\n        self.input_size = (512, 512)\n        self.up_ratio = 0.6 / 0.85\n        self.down_ratio = 0.2 / 0.85\n        self.width_ratio = 0.2 / 0.85\n        self.face_parser = futils.mask.FaceParser()\n\n    def run(self, image):\n        image = Image.fromarray(image)\n        face = futils.dlib.detect(image)\n\n        if not face:\n            return\n        face_on_image = face[0]\n        image, face, crop_face = futils.dlib.crop(image, face_on_image, self.up_ratio, self.down_ratio,\n                                                  self.width_ratio)\n        np_image = np.array(image)\n        mask = self.face_parser.parse(np.float32(cv2.resize(np_image, self.input_size)))\n        mask = cv2.resize(mask.numpy(), (256, 256))\n        mask = mask.astype(np.uint8)\n        mask = mask2image(mask)\n\n        return mask\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/face_parse/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import FaceParsePredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"face_parse\", type=\"CV/style_transfer\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass Face_parse:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"bisenet.pdparams\")\n\n        self.network = FaceParsePredictor()\n\n    def style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR(read by cv2).\n        paths (list[str]): paths to images\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, 
otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                out = self.network.run(image)\n                results.append(out)\n\n        if visualization:\n            if not os.path.exists(output_dir):\n                os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out is not None:\n                    cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.style_transfer(\n            paths=[self.args.input_path],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.style_transfer(images=images_decode, **kwargs)\n        # network.run returns None when no face is detected, so guard before tolist()\n        tolist = [result.tolist() if result is not None else None for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/face_parse/requirements.txt",
    "content": "ppgan\ndlib\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/face_parse/util.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_circuit/README.md",
    "content": "# lapstyle_circuit\n\n|模型名称|lapstyle_circuit|\n| :--- | :---: |\n|类别|图像 - 风格迁移|\n|网络|LapStyle|\n|数据集|COCO|\n|是否支持Fine-tuning|否|\n|模型大小|121MB|\n|最新更新日期|2021-12-07|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995283-77ddba45-9efe-4f72-914c-1bff734372ed.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    输入内容图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144997574-8b4028ad-d871-4caf-87d1-191582bba805.jpg\"  width = \"50%\" hspace='10'/>\n    <br />\n    输入风格图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144997589-407a12b9-95bf-44e7-b558-b1026ef3cd5a.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - LapStyle--拉普拉斯金字塔风格化网络，是一种能够生成高质量风格化图的快速前馈风格化网络，能渐进地生成复杂的纹理迁移效果，同时能够在512分辨率下达到100fps的速度。可实现多种不同艺术风格的快速迁移，在艺术图像生成、滤镜等领域有广泛的应用。\n\n  - 更多详情参考：[Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer](https://arxiv.org/pdf/2104.05376.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lapstyle_circuit\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run lapstyle_circuit --content \"/PATH/TO/IMAGE\" --style \"/PATH/TO/IMAGE1\"\n    ```\n  - 通过命令行方式实现风格转换模型的调用，更多请见 [PaddleHub命令行指令](../../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import cv2\n\n    import paddlehub as hub\n\n    module = hub.Module(name=\"lapstyle_circuit\")\n    content = cv2.imread(\"/PATH/TO/IMAGE\")\n    style = cv2.imread(\"/PATH/TO/IMAGE1\")\n    results = 
module.style_transfer(images=[{'content':content, 'style':style}], output_dir='./transfer_result', use_gpu=True)\n    ```\n\n- ### 3、API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True)\n    ```\n    - 风格转换API。\n\n    - **参数**\n\n      - images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 content, style, 相应取值为：\n        - content (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n        - style (numpy.ndarray) : 风格图像，shape为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list[dict]): paths to images, 每一个元素都为一个dict, 有关键字 content, style, 相应取值为：\n        - content (str): 待转换的图片的路径；<br/>\n        - style (str) : 风格图像的路径；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m lapstyle_circuit\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[{'content': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\")), 'style': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/lapstyle_circuit\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install lapstyle_circuit==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_circuit/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport urllib.request\n\nimport cv2 as cv\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddle.vision.transforms import functional\nfrom PIL import Image\nfrom ppgan.models.generators import DecoderNet\nfrom ppgan.models.generators import Encoder\nfrom ppgan.models.generators import RevisionNet\nfrom ppgan.utils.visual import tensor2img\n\n\ndef img(img):\n    # some images have 4 channels; keep only the first 3\n    if img.shape[2] > 3:\n        img = img[:, :, :3]\n    return img\n\n\ndef img_totensor(content_img, style_img):\n    if content_img.ndim == 2:\n        content_img = cv.cvtColor(content_img, cv.COLOR_GRAY2RGB)\n    else:\n        content_img = cv.cvtColor(content_img, cv.COLOR_BGR2RGB)\n    h, w, c = content_img.shape\n    content_img = Image.fromarray(content_img)\n    content_img = content_img.resize((512, 512), Image.BILINEAR)\n    content_img = np.array(content_img)\n    content_img = img(content_img)\n    content_img = functional.to_tensor(content_img)\n\n    style_img = cv.cvtColor(style_img, cv.COLOR_BGR2RGB)\n    style_img = Image.fromarray(style_img)\n    style_img = style_img.resize((512, 512), Image.BILINEAR)\n    style_img = np.array(style_img)\n    style_img = img(style_img)\n    style_img = functional.to_tensor(style_img)\n\n    content_img = paddle.unsqueeze(content_img, 
axis=0)\n    style_img = paddle.unsqueeze(style_img, axis=0)\n    return content_img, style_img, h, w\n\n\ndef tensor_resample(tensor, dst_size, mode='bilinear'):\n    return F.interpolate(tensor, dst_size, mode=mode, align_corners=False)\n\n\ndef laplacian(x):\n    \"\"\"\n    Laplacian\n\n    return:\n       x - upsample(downsample(x))\n    \"\"\"\n    return x - tensor_resample(tensor_resample(x, [x.shape[2] // 2, x.shape[3] // 2]), [x.shape[2], x.shape[3]])\n\n\ndef make_laplace_pyramid(x, levels):\n    \"\"\"\n    Make Laplacian Pyramid\n    \"\"\"\n    pyramid = []\n    current = x\n    for i in range(levels):\n        pyramid.append(laplacian(current))\n        current = tensor_resample(current, (max(current.shape[2] // 2, 1), max(current.shape[3] // 2, 1)))\n    pyramid.append(current)\n    return pyramid\n\n\ndef fold_laplace_pyramid(pyramid):\n    \"\"\"\n    Fold Laplacian Pyramid\n    \"\"\"\n    current = pyramid[-1]\n    for i in range(len(pyramid) - 2, -1, -1):  # iterate from len-2 to 0\n        up_h, up_w = pyramid[i].shape[2], pyramid[i].shape[3]\n        current = pyramid[i] + tensor_resample(current, (up_h, up_w))\n    return current\n\n\nclass LapStylePredictor:\n    def __init__(self, weight_path=None):\n\n        self.net_enc = Encoder()\n        self.net_dec = DecoderNet()\n        self.net_rev = RevisionNet()\n        self.net_rev_2 = RevisionNet()\n\n        self.net_enc.set_dict(paddle.load(weight_path)['net_enc'])\n        self.net_enc.eval()\n        self.net_dec.set_dict(paddle.load(weight_path)['net_dec'])\n        self.net_dec.eval()\n        self.net_rev.set_dict(paddle.load(weight_path)['net_rev'])\n        self.net_rev.eval()\n        self.net_rev_2.set_dict(paddle.load(weight_path)['net_rev_2'])\n        self.net_rev_2.eval()\n\n    def run(self, content_img, style_image):\n        content_img, style_img, h, w = img_totensor(content_img, style_image)\n        pyr_ci = make_laplace_pyramid(content_img, 2)\n        pyr_si = 
make_laplace_pyramid(style_img, 2)\n        pyr_ci.append(content_img)\n        pyr_si.append(style_img)\n        cF = self.net_enc(pyr_ci[2])\n        sF = self.net_enc(pyr_si[2])\n        stylized_small = self.net_dec(cF, sF)\n        stylized_up = F.interpolate(stylized_small, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[1], stylized_up], axis=1)\n        stylized_rev_lap = self.net_rev(revnet_input)\n        stylized_rev = fold_laplace_pyramid([stylized_rev_lap, stylized_small])\n\n        stylized_up = F.interpolate(stylized_rev, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[0], stylized_up], axis=1)\n        stylized_rev_lap_second = self.net_rev_2(revnet_input)\n        stylized_rev_second = fold_laplace_pyramid([stylized_rev_lap_second, stylized_rev_lap, stylized_small])\n\n        stylized = stylized_rev_second\n        stylized_visual = tensor2img(stylized, min_max=(0., 1.))\n\n        return stylized_visual\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_circuit/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LapStylePredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"lapstyle_circuit\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass Lapstyle_circuit:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"lapstyle_circuit.pdparams\")\n\n        self.network = LapStylePredictor(weight_path=self.pretrained_model)\n\n    def style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n        Transfer an image to circuit style.\n\n        images (list[dict]): data of images, each element is a dict:\n          - content (numpy.ndarray): input image, shape is [H, W, C], BGR format.\n          - style (numpy.ndarray): style image, shape is [H, W, C], BGR format.\n        paths (list[dict]): paths to images, each element is a dict:\n          - content (str): path to input image.\n          - style (str): path to style image.\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image_dict in images:\n                content_img = image_dict['content']\n                style_img = image_dict['style']\n                results.append(self.network.run(content_img, style_img))\n\n        if paths is not None:\n            for path_dict in paths:\n                content_img = cv2.imread(path_dict['content'])\n                style_img = cv2.imread(path_dict['style'])\n                results.append(self.network.run(content_img, style_img))\n\n        if visualization:\n            if not os.path.exists(output_dir):\n                os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.style_transfer(\n            paths=[{\n                'content': self.args.content,\n                'style': self.args.style\n            }],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['style'] = base64_to_cv2(image['style'])\n        results = self.style_transfer(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content image.\")\n        self.arg_input_group.add_argument('--style', type=str, help=\"path to style image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_circuit/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_circuit/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_ocean/README.md",
    "content": "# lapstyle_ocean\n\n|模型名称|lapstyle_ocean|\n| :--- | :---: |\n|类别|图像 - 风格迁移|\n|网络|LapStyle|\n|数据集|COCO|\n|是否支持Fine-tuning|否|\n|模型大小|121MB|\n|最新更新日期|2021-12-07|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995283-77ddba45-9efe-4f72-914c-1bff734372ed.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    输入内容图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144997958-9162c304-dff4-4048-a197-607882ded00c.png\"  width = \"50%\" hspace='10'/>\n    <br />\n    输入风格图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144997967-43d7579c-cc73-452e-a920-5759eb5a5d67.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - LapStyle--拉普拉斯金字塔风格化网络，是一种能够生成高质量风格化图的快速前馈风格化网络，能渐进地生成复杂的纹理迁移效果，同时能够在512分辨率下达到100fps的速度。可实现多种不同艺术风格的快速迁移，在艺术图像生成、滤镜等领域有广泛的应用。\n\n  - 更多详情参考：[Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer](https://arxiv.org/pdf/2104.05376.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lapstyle_ocean\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run lapstyle_ocean --content \"/PATH/TO/IMAGE\" --style \"/PATH/TO/IMAGE1\"\n    ```\n  - 通过命令行方式实现风格转换模型的调用，更多请见 [PaddleHub命令行指令](../../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import cv2\n\n    import paddlehub as hub\n\n    module = hub.Module(name=\"lapstyle_ocean\")\n    content = cv2.imread(\"/PATH/TO/IMAGE\")\n    style = cv2.imread(\"/PATH/TO/IMAGE1\")\n    results = 
module.style_transfer(images=[{'content':content, 'style':style}], output_dir='./transfer_result', use_gpu=True)\n    ```\n\n- ### 3、API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True)\n    ```\n    - 风格转换API。\n\n    - **参数**\n\n      - images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 content, style, 相应取值为：\n        - content (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n        - style (numpy.ndarray) : 风格图像，shape为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list[str]): paths to images, 每一个元素都为一个dict, 有关键字 content, style, 相应取值为：\n        - content (str): 待转换的图片的路径；<br/>\n        - style (str) : 风格图像的路径；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization(bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m lapstyle_ocean\n    ```\n\n  - 这样就完成了一个图像风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[{'content': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\")), 'style': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/lapstyle_ocean\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install lapstyle_ocean==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_ocean/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport cv2 as cv\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddle.vision.transforms import functional\nfrom PIL import Image\nfrom ppgan.models.generators import DecoderNet\nfrom ppgan.models.generators import Encoder\nfrom ppgan.models.generators import RevisionNet\nfrom ppgan.utils.visual import tensor2img\n\n\ndef drop_extra_channels(img):\n    # some images have 4 channels (e.g. RGBA); keep only the first 3\n    if img.shape[2] > 3:\n        img = img[:, :, :3]\n    return img\n\n\ndef img_totensor(content_img, style_img):\n    if content_img.ndim == 2:\n        content_img = cv.cvtColor(content_img, cv.COLOR_GRAY2RGB)\n    else:\n        content_img = cv.cvtColor(content_img, cv.COLOR_BGR2RGB)\n    h, w, c = content_img.shape\n    content_img = Image.fromarray(content_img)\n    content_img = content_img.resize((512, 512), Image.BILINEAR)\n    content_img = np.array(content_img)\n    content_img = drop_extra_channels(content_img)\n    content_img = functional.to_tensor(content_img)\n\n    style_img = cv.cvtColor(style_img, cv.COLOR_BGR2RGB)\n    style_img = Image.fromarray(style_img)\n    style_img = style_img.resize((512, 512), Image.BILINEAR)\n    style_img = np.array(style_img)\n    style_img = drop_extra_channels(style_img)\n    style_img = functional.to_tensor(style_img)\n\n    content_img = paddle.unsqueeze(content_img, 
axis=0)\n    style_img = paddle.unsqueeze(style_img, axis=0)\n    return content_img, style_img, h, w\n\n\ndef tensor_resample(tensor, dst_size, mode='bilinear'):\n    return F.interpolate(tensor, dst_size, mode=mode, align_corners=False)\n\n\ndef laplacian(x):\n    \"\"\"\n    Laplacian\n\n    return:\n       x - upsample(downsample(x))\n    \"\"\"\n    return x - tensor_resample(tensor_resample(x, [x.shape[2] // 2, x.shape[3] // 2]), [x.shape[2], x.shape[3]])\n\n\ndef make_laplace_pyramid(x, levels):\n    \"\"\"\n    Make Laplacian Pyramid\n    \"\"\"\n    pyramid = []\n    current = x\n    for i in range(levels):\n        pyramid.append(laplacian(current))\n        current = tensor_resample(current, (max(current.shape[2] // 2, 1), max(current.shape[3] // 2, 1)))\n    pyramid.append(current)\n    return pyramid\n\n\ndef fold_laplace_pyramid(pyramid):\n    \"\"\"\n    Fold Laplacian Pyramid\n    \"\"\"\n    current = pyramid[-1]\n    for i in range(len(pyramid) - 2, -1, -1):  # iterate from len-2 to 0\n        up_h, up_w = pyramid[i].shape[2], pyramid[i].shape[3]\n        current = pyramid[i] + tensor_resample(current, (up_h, up_w))\n    return current\n\n\nclass LapStylePredictor:\n    def __init__(self, weight_path=None):\n\n        self.net_enc = Encoder()\n        self.net_dec = DecoderNet()\n        self.net_rev = RevisionNet()\n        self.net_rev_2 = RevisionNet()\n\n        self.net_enc.set_dict(paddle.load(weight_path)['net_enc'])\n        self.net_enc.eval()\n        self.net_dec.set_dict(paddle.load(weight_path)['net_dec'])\n        self.net_dec.eval()\n        self.net_rev.set_dict(paddle.load(weight_path)['net_rev'])\n        self.net_rev.eval()\n        self.net_rev_2.set_dict(paddle.load(weight_path)['net_rev_2'])\n        self.net_rev_2.eval()\n\n    def run(self, content_img, style_image):\n        content_img, style_img, h, w = img_totensor(content_img, style_image)\n        pyr_ci = make_laplace_pyramid(content_img, 2)\n        pyr_si = 
make_laplace_pyramid(style_img, 2)\n        pyr_ci.append(content_img)\n        pyr_si.append(style_img)\n        cF = self.net_enc(pyr_ci[2])\n        sF = self.net_enc(pyr_si[2])\n        stylized_small = self.net_dec(cF, sF)\n        stylized_up = F.interpolate(stylized_small, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[1], stylized_up], axis=1)\n        stylized_rev_lap = self.net_rev(revnet_input)\n        stylized_rev = fold_laplace_pyramid([stylized_rev_lap, stylized_small])\n\n        stylized_up = F.interpolate(stylized_rev, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[0], stylized_up], axis=1)\n        stylized_rev_lap_second = self.net_rev_2(revnet_input)\n        stylized_rev_second = fold_laplace_pyramid([stylized_rev_lap_second, stylized_rev_lap, stylized_small])\n\n        stylized = stylized_rev_second\n        stylized_visual = tensor2img(stylized, min_max=(0., 1.))\n\n        return stylized_visual\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_ocean/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport paddle\n\nfrom .model import LapStylePredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"lapstyle_ocean\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass Lapstyle_ocean:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"lapstyle_ocean.pdparams\")\n\n        self.network = LapStylePredictor(weight_path=self.pretrained_model)\n\n    def style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n        Transfer an image to ocean style.\n\n        images (list[dict]): data of images, each element is a dict:\n          - content (numpy.ndarray): input image, shape is [H, W, C], BGR format;\n          - style (numpy.ndarray): style image, shape is [H, W, C], BGR format;\n        paths (list[dict]): paths to images, each element is a dict:\n          - content (str): path to input image;\n          - style (str): path to style image;\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image_dict in images:\n                content_img = image_dict['content']\n                style_img = image_dict['style']\n                results.append(self.network.run(content_img, style_img))\n\n        if paths is not None:\n            for path_dict in paths:\n                content_img = cv2.imread(path_dict['content'])\n                style_img = cv2.imread(path_dict['style'])\n                results.append(self.network.run(content_img, style_img))\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                # outputs are RGB; flip to BGR for cv2.imwrite\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.style_transfer(\n            paths=[{\n                'content': self.args.content,\n                'style': self.args.style\n            }],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['style'] = base64_to_cv2(image['style'])\n        results = self.style_transfer(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        # type=bool would treat any non-empty string (including 'False') as True\n        self.arg_config_group.add_argument(\n            '--visualization', type=lambda x: str(x).lower() in ('true', '1'), default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content image.\")\n        self.arg_input_group.add_argument('--style', type=str, help=\"path to style image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_ocean/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_ocean/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated for binary data; frombuffer is the supported API\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_starrynew/README.md",
    "content": "# lapstyle_starrynew\n\n|Module Name|lapstyle_starrynew|\n| :--- | :---: |\n|Category|Image - Style Transfer|\n|Network|LapStyle|\n|Dataset|COCO|\n|Fine-tuning supported or not|No|\n|Module Size|121MB|\n|Latest update date|2021-12-07|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995283-77ddba45-9efe-4f72-914c-1bff734372ed.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    Input content image\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995349-59651a1d-7be4-479f-ad58-063b4fc6dded.png\"  width = \"50%\" hspace='10'/>\n    <br />\n    Input style image\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995779-bb87c39e-643c-4c75-be49-7de5f8b52a17.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n- ### Module Introduction\n\n  - LapStyle, the Laplacian pyramid stylization network, is a fast feed-forward network that produces high-quality stylized images. It progressively synthesizes complex texture-transfer effects and reaches 100 fps at 512 resolution. It enables fast transfer of many different artistic styles and has wide applications in artistic image generation, photo filters and so on.\n\n  - For more details, please refer to: [Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer](https://arxiv.org/pdf/2104.05376.pdf)\n\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n  - ppgan\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install lapstyle_starrynew\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../../docs/docs_ch/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    # Read from a file\n    $ hub run lapstyle_starrynew --content \"/PATH/TO/IMAGE\" --style \"/PATH/TO/IMAGE1\"\n    ```\n  - For more information on running the module from the command line, please refer to: [PaddleHub Command Line Instruction](../../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    module = hub.Module(name=\"lapstyle_starrynew\")\n    content = cv2.imread(\"/PATH/TO/IMAGE\")\n    style = cv2.imread(\"/PATH/TO/IMAGE1\")\n    results = module.style_transfer(images=[{'content': content, 'style': style}], output_dir='./transfer_result', use_gpu=True)\n    ```\n\n- ### 3. API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True)\n    ```\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list[dict]): image data; each element is a dict with the keys content and style:\n        - content (numpy.ndarray): image to be transferred, shape [H, W, C], BGR format;\n        - style (numpy.ndarray): style image, shape [H, W, C], BGR format;\n      - paths (list[dict]): image paths; each element is a dict with the keys content and style:\n        - content (str): path to the image to be transferred;\n        - style (str): path to the style image;\n      - output_dir (str): directory in which to save the results;\n      - use_gpu (bool): whether to use GPU;\n      - visualization (bool): whether to save the results to a local folder\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m lapstyle_starrynew\n    ```\n\n  - The servitization API of image style transfer is now deployed; the default port is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With the service configured, the following lines of code send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      # tostring() is deprecated in NumPy; tobytes() is the supported spelling\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[{'content': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\")), 'style': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/lapstyle_starrynew\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install lapstyle_starrynew==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_starrynew/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport cv2 as cv\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddle.vision.transforms import functional\nfrom PIL import Image\nfrom ppgan.models.generators import DecoderNet\nfrom ppgan.models.generators import Encoder\nfrom ppgan.models.generators import RevisionNet\nfrom ppgan.utils.visual import tensor2img\n\n\ndef drop_extra_channels(img):\n    # some images have 4 channels (e.g. RGBA); keep only the first 3\n    if img.shape[2] > 3:\n        img = img[:, :, :3]\n    return img\n\n\ndef img_totensor(content_img, style_img):\n    if content_img.ndim == 2:\n        content_img = cv.cvtColor(content_img, cv.COLOR_GRAY2RGB)\n    else:\n        content_img = cv.cvtColor(content_img, cv.COLOR_BGR2RGB)\n    h, w, c = content_img.shape\n    content_img = Image.fromarray(content_img)\n    content_img = content_img.resize((512, 512), Image.BILINEAR)\n    content_img = np.array(content_img)\n    content_img = drop_extra_channels(content_img)\n    content_img = functional.to_tensor(content_img)\n\n    style_img = cv.cvtColor(style_img, cv.COLOR_BGR2RGB)\n    style_img = Image.fromarray(style_img)\n    style_img = style_img.resize((512, 512), Image.BILINEAR)\n    style_img = np.array(style_img)\n    style_img = drop_extra_channels(style_img)\n    style_img = functional.to_tensor(style_img)\n\n    content_img = paddle.unsqueeze(content_img, 
axis=0)\n    style_img = paddle.unsqueeze(style_img, axis=0)\n    return content_img, style_img, h, w\n\n\ndef tensor_resample(tensor, dst_size, mode='bilinear'):\n    return F.interpolate(tensor, dst_size, mode=mode, align_corners=False)\n\n\ndef laplacian(x):\n    \"\"\"\n    Laplacian\n\n    return:\n       x - upsample(downsample(x))\n    \"\"\"\n    return x - tensor_resample(tensor_resample(x, [x.shape[2] // 2, x.shape[3] // 2]), [x.shape[2], x.shape[3]])\n\n\ndef make_laplace_pyramid(x, levels):\n    \"\"\"\n    Make Laplacian Pyramid\n    \"\"\"\n    pyramid = []\n    current = x\n    for i in range(levels):\n        pyramid.append(laplacian(current))\n        current = tensor_resample(current, (max(current.shape[2] // 2, 1), max(current.shape[3] // 2, 1)))\n    pyramid.append(current)\n    return pyramid\n\n\ndef fold_laplace_pyramid(pyramid):\n    \"\"\"\n    Fold Laplacian Pyramid\n    \"\"\"\n    current = pyramid[-1]\n    for i in range(len(pyramid) - 2, -1, -1):  # iterate from len-2 to 0\n        up_h, up_w = pyramid[i].shape[2], pyramid[i].shape[3]\n        current = pyramid[i] + tensor_resample(current, (up_h, up_w))\n    return current\n\n\nclass LapStylePredictor:\n    def __init__(self, weight_path=None):\n\n        self.net_enc = Encoder()\n        self.net_dec = DecoderNet()\n        self.net_rev = RevisionNet()\n        self.net_rev_2 = RevisionNet()\n\n        self.net_enc.set_dict(paddle.load(weight_path)['net_enc'])\n        self.net_enc.eval()\n        self.net_dec.set_dict(paddle.load(weight_path)['net_dec'])\n        self.net_dec.eval()\n        self.net_rev.set_dict(paddle.load(weight_path)['net_rev'])\n        self.net_rev.eval()\n        self.net_rev_2.set_dict(paddle.load(weight_path)['net_rev_2'])\n        self.net_rev_2.eval()\n\n    def run(self, content_img, style_image):\n        content_img, style_img, h, w = img_totensor(content_img, style_image)\n        pyr_ci = make_laplace_pyramid(content_img, 2)\n        pyr_si = 
make_laplace_pyramid(style_img, 2)\n        pyr_ci.append(content_img)\n        pyr_si.append(style_img)\n        cF = self.net_enc(pyr_ci[2])\n        sF = self.net_enc(pyr_si[2])\n        stylized_small = self.net_dec(cF, sF)\n        stylized_up = F.interpolate(stylized_small, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[1], stylized_up], axis=1)\n        stylized_rev_lap = self.net_rev(revnet_input)\n        stylized_rev = fold_laplace_pyramid([stylized_rev_lap, stylized_small])\n\n        stylized_up = F.interpolate(stylized_rev, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[0], stylized_up], axis=1)\n        stylized_rev_lap_second = self.net_rev_2(revnet_input)\n        stylized_rev_second = fold_laplace_pyramid([stylized_rev_lap_second, stylized_rev_lap, stylized_small])\n\n        stylized = stylized_rev_second\n        stylized_visual = tensor2img(stylized, min_max=(0., 1.))\n\n        return stylized_visual\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_starrynew/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport paddle\n\nfrom .model import LapStylePredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"lapstyle_starrynew\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass Lapstyle_starrynew:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"lapstyle_starrynew.pdparams\")\n\n        self.network = LapStylePredictor(weight_path=self.pretrained_model)\n\n    def style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n        Transfer an image to starrynew style.\n\n        images (list[dict]): data of images, each element is a dict:\n          - content (numpy.ndarray): input image, shape is [H, W, C], BGR format;\n          - style (numpy.ndarray): style image, shape is [H, W, C], BGR format;\n        paths (list[dict]): paths to images, each element is a dict:\n          - content (str): path to input image;\n          - style (str): path to style image;\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image_dict in images:\n                content_img = image_dict['content']\n                style_img = image_dict['style']\n                results.append(self.network.run(content_img, style_img))\n\n        if paths is not None:\n            for path_dict in paths:\n                content_img = cv2.imread(path_dict['content'])\n                style_img = cv2.imread(path_dict['style'])\n                results.append(self.network.run(content_img, style_img))\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                # outputs are RGB; flip to BGR for cv2.imwrite\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.style_transfer(\n            paths=[{\n                'content': self.args.content,\n                'style': self.args.style\n            }],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['style'] = base64_to_cv2(image['style'])\n        results = self.style_transfer(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        # type=bool would treat any non-empty string (including 'False') as True\n        self.arg_config_group.add_argument(\n            '--visualization', type=lambda x: str(x).lower() in ('true', '1'), default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content image.\")\n        self.arg_input_group.add_argument('--style', type=str, help=\"path to style image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_starrynew/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_starrynew/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated for binary data; frombuffer is the supported API\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_stars/README.md",
    "content": "# lapstyle_stars\n\n|Module Name|lapstyle_stars|\n| :--- | :---: |\n|Category|Image - Style Transfer|\n|Network|LapStyle|\n|Dataset|COCO|\n|Fine-tuning supported or not|No|\n|Module Size|121MB|\n|Latest update date|2021-12-07|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/144995283-77ddba45-9efe-4f72-914c-1bff734372ed.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    Input content image\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144998358-14b87265-e966-422e-95f7-1738407e84ee.png\"  width = \"50%\" hspace='10'/>\n    <br />\n    Input style image\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/144998367-5bc21fae-27fc-4c0e-8e1e-9702c7ee2b26.png\"  width = \"50%\"  hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n- ### Module Introduction\n\n  - LapStyle, the Laplacian pyramid stylization network, is a fast feed-forward network that produces high-quality stylized images. It progressively synthesizes complex texture-transfer effects and reaches 100 fps at 512 resolution. It enables fast transfer of many different artistic styles and has wide applications in artistic image generation, photo filters and so on.\n\n  - For more details, please refer to: [Drafting and Revision: Laplacian Pyramid Network for Fast High-Quality Artistic Style Transfer](https://arxiv.org/pdf/2104.05376.pdf)\n\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n  - ppgan\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install lapstyle_stars\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../../docs/docs_ch/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    # Read from a file\n    $ hub run lapstyle_stars --content \"/PATH/TO/IMAGE\" --style \"/PATH/TO/IMAGE1\"\n    ```\n  - For more information on running the module from the command line, please refer to: [PaddleHub Command Line Instruction](../../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    module = hub.Module(name=\"lapstyle_stars\")\n    content = cv2.imread(\"/PATH/TO/IMAGE\")\n    style = cv2.imread(\"/PATH/TO/IMAGE1\")\n    results = module.style_transfer(images=[{'content': content, 'style': style}], output_dir='./transfer_result', use_gpu=True)\n    ```\n\n- ### 3. API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True)\n    ```\n    - Style transfer API.\n\n    - **Parameters**\n\n      - images (list[dict]): image data; each element is a dict with the keys content and style:\n        - content (numpy.ndarray): image to be transferred, shape [H, W, C], BGR format;\n        - style (numpy.ndarray): style image, shape [H, W, C], BGR format;\n      - paths (list[dict]): image paths; each element is a dict with the keys content and style:\n        - content (str): path to the image to be transferred;\n        - style (str): path to the style image;\n      - output_dir (str): directory in which to save the results;\n      - use_gpu (bool): whether to use GPU;\n      - visualization (bool): whether to save the results to a local folder\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m lapstyle_stars\n    ```\n\n  - The servitization API of image style transfer is now deployed; the default port is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With the service configured, the following lines of code send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      # tostring() is deprecated in NumPy; tobytes() is the supported spelling\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[{'content': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\")), 'style': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/lapstyle_stars\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install lapstyle_stars==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_stars/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport urllib.request\n\nimport cv2 as cv\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddle.vision.transforms import functional\nfrom PIL import Image\nfrom ppgan.models.generators import DecoderNet\nfrom ppgan.models.generators import Encoder\nfrom ppgan.models.generators import RevisionNet\nfrom ppgan.utils.visual import tensor2img\n\n\ndef img(img):\n    # some images have 4 channels\n    if img.shape[2] > 3:\n        img = img[:, :, :3]\n    # HWC to CHW\n    return img\n\n\ndef img_totensor(content_img, style_img):\n    if content_img.ndim == 2:\n        content_img = cv.cvtColor(content_img, cv.COLOR_GRAY2RGB)\n    else:\n        content_img = cv.cvtColor(content_img, cv.COLOR_BGR2RGB)\n    h, w, c = content_img.shape\n    content_img = Image.fromarray(content_img)\n    content_img = content_img.resize((512, 512), Image.BILINEAR)\n    content_img = np.array(content_img)\n    content_img = img(content_img)\n    content_img = functional.to_tensor(content_img)\n\n    style_img = cv.cvtColor(style_img, cv.COLOR_BGR2RGB)\n    style_img = Image.fromarray(style_img)\n    style_img = style_img.resize((512, 512), Image.BILINEAR)\n    style_img = np.array(style_img)\n    style_img = img(style_img)\n    style_img = functional.to_tensor(style_img)\n\n    content_img = paddle.unsqueeze(content_img, 
axis=0)\n    style_img = paddle.unsqueeze(style_img, axis=0)\n    return content_img, style_img, h, w\n\n\ndef tensor_resample(tensor, dst_size, mode='bilinear'):\n    return F.interpolate(tensor, dst_size, mode=mode, align_corners=False)\n\n\ndef laplacian(x):\n    \"\"\"\n    Laplacian\n\n    return:\n       x - upsample(downsample(x))\n    \"\"\"\n    return x - tensor_resample(tensor_resample(x, [x.shape[2] // 2, x.shape[3] // 2]), [x.shape[2], x.shape[3]])\n\n\ndef make_laplace_pyramid(x, levels):\n    \"\"\"\n    Make Laplacian Pyramid\n    \"\"\"\n    pyramid = []\n    current = x\n    for i in range(levels):\n        pyramid.append(laplacian(current))\n        current = tensor_resample(current, (max(current.shape[2] // 2, 1), max(current.shape[3] // 2, 1)))\n    pyramid.append(current)\n    return pyramid\n\n\ndef fold_laplace_pyramid(pyramid):\n    \"\"\"\n    Fold Laplacian Pyramid\n    \"\"\"\n    current = pyramid[-1]\n    for i in range(len(pyramid) - 2, -1, -1):  # iterate from len-2 to 0\n        up_h, up_w = pyramid[i].shape[2], pyramid[i].shape[3]\n        current = pyramid[i] + tensor_resample(current, (up_h, up_w))\n    return current\n\n\nclass LapStylePredictor:\n    def __init__(self, weight_path=None):\n\n        self.net_enc = Encoder()\n        self.net_dec = DecoderNet()\n        self.net_rev = RevisionNet()\n        self.net_rev_2 = RevisionNet()\n\n        self.net_enc.set_dict(paddle.load(weight_path)['net_enc'])\n        self.net_enc.eval()\n        self.net_dec.set_dict(paddle.load(weight_path)['net_dec'])\n        self.net_dec.eval()\n        self.net_rev.set_dict(paddle.load(weight_path)['net_rev'])\n        self.net_rev.eval()\n        self.net_rev_2.set_dict(paddle.load(weight_path)['net_rev_2'])\n        self.net_rev_2.eval()\n\n    def run(self, content_img, style_image):\n        content_img, style_img, h, w = img_totensor(content_img, style_image)\n        pyr_ci = make_laplace_pyramid(content_img, 2)\n        pyr_si = 
make_laplace_pyramid(style_img, 2)\n        pyr_ci.append(content_img)\n        pyr_si.append(style_img)\n        cF = self.net_enc(pyr_ci[2])\n        sF = self.net_enc(pyr_si[2])\n        stylized_small = self.net_dec(cF, sF)\n        stylized_up = F.interpolate(stylized_small, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[1], stylized_up], axis=1)\n        stylized_rev_lap = self.net_rev(revnet_input)\n        stylized_rev = fold_laplace_pyramid([stylized_rev_lap, stylized_small])\n\n        stylized_up = F.interpolate(stylized_rev, scale_factor=2)\n\n        revnet_input = paddle.concat(x=[pyr_ci[0], stylized_up], axis=1)\n        stylized_rev_lap_second = self.net_rev_2(revnet_input)\n        stylized_rev_second = fold_laplace_pyramid([stylized_rev_lap_second, stylized_rev_lap, stylized_small])\n\n        stylized = stylized_rev_second\n        stylized_visual = tensor2img(stylized, min_max=(0., 1.))\n\n        return stylized_visual\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_stars/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LapStylePredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"lapstyle_stars\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass Lapstyle_stars:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"lapstyle_stars.pdparams\")\n\n        self.network = LapStylePredictor(weight_path=self.pretrained_model)\n\n    def style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       visualization: bool = True):\n        '''\n        Transfer a image to stars style.\n\n        images (list[dict]): data of images, each element is a dict:\n          - content (numpy.ndarray): input image，shape is \\[H, W, C\\]，BGR format；<br/>\n          - style 
(numpy.ndarray) : style image，shape is \[H, W, C\]，BGR format；<br/>\n        paths (list[dict]): paths to images, each element is a dict:\n          - content (str): path to input image；<br/>\n          - style (str) : path to style image；<br/>\n\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image_dict in images:\n                content_img = image_dict['content']\n                style_img = image_dict['style']\n                results.append(self.network.run(content_img, style_img))\n\n        if paths is not None:\n            for path_dict in paths:\n                content_img = cv2.imread(path_dict['content'])\n                style_img = cv2.imread(path_dict['style'])\n                results.append(self.network.run(content_img, style_img))\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input 
options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.style_transfer(\n            paths=[{\n                'content': self.args.content,\n                'style': self.args.style\n            }],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['style'] = base64_to_cv2(image['style'])\n        results = self.style_transfer(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', type=bool, default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content image.\")\n        self.arg_input_group.add_argument('--style', type=str, help=\"path to style image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_stars/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/lapstyle_stars/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/msgnet/README.md",
    "content": "# msgnet\n\n|模型名称|msgnet|\n| :--- | :---: | \n|类别|图像-图像编辑|\n|网络|msgnet|\n|数据集|COCO2014|\n|是否支持Fine-tuning|是|\n|模型大小|68MB|\n|指标|-|\n|最新更新日期|2021-07-29|\n\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130910325-d72f34b2-d567-4e77-bb60-35148864301e.jpg\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130910195-9433e4a7-3596-4677-85d2-2ffc16939597.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[msgnet](https://github.com/zhanghang1989/PyTorch-Multi-Style-Transfer)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install msgnet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n```\n$ hub run msgnet --input_path \"/PATH/TO/ORIGIN/IMAGE\" --style_path \"/PATH/TO/STYLE/IMAGE\"\n```\n\n- ### 2.预测代码示例\n\n\n```python\nimport paddle\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='msgnet')\n    result = model.predict(origin=[\"/PATH/TO/ORIGIN/IMAGE\"], style=\"/PATH/TO/STYLE/IMAGE\", visualization=True, save_path =\"/PATH/TO/SAVE/IMAGE\")\n```\n\n\n\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用msgnet模型对[MiniCOCO](../../docs/reference/datasets.md#class-hubdatasetsMiniCOCO)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n              ```\n\n            - 
`transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets.minicoco import MiniCOCO\n\n              styledata = MiniCOCO(transform=transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`， 默认为`train`。\n\n                - 数据集的准备代码可以参考 [minicoco.py](../../paddlehub/datasets/minicoco.py)。`hub.datasets.MiniCOCO()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name='msgnet', load_checkpoint=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\n              trainer.train(styledata, epochs=101, batch_size=4, eval_dataset=styledata, log_interval=10, save_interval=10)\n              ```\n\n\n\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n                result = model.predict(origin=[\"/PATH/TO/ORIGIN/IMAGE\"], style=\"/PATH/TO/STYLE/IMAGE\", visualization=True, save_path =\"/PATH/TO/SAVE/IMAGE\")\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`， 加载模型具体可参见[加载](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/framework/io/load_cn.html#load)。\n\n            - **Args**\n                * 
`origin`: 原始图像路径或BGR格式图片；\n                * `style`: 风格图像路径；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'style_tranfer'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线风格迁移服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m msgnet\n      ```\n\n    - 这样就完成了一个风格迁移服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\n        style_im = cv2.imread('/PATH/TO/STYLE/IMAGE')\n        data = {'images':[[cv2_to_base64(org_im)], cv2_to_base64(style_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/msgnet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = base64_to_cv2(r.json()[\"results\"]['data'][0])\n        cv2.imwrite('style.png', data)\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/msgnet/README_en.md",
    "content": "# msgnet\n\n|Module Name|msgnet|\n| :--- | :---: | \n|Category|Image editing|\n|Network|msgnet|\n|Dataset|COCO2014|\n|Fine-tuning supported or not|Yes|\n|Module Size|68MB|\n|Data indicators|-|\n|Latest update date|2021-07-29|\n\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130910325-d72f34b2-d567-4e77-bb60-35148864301e.jpg\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130910195-9433e4a7-3596-4677-85d2-2ffc16939597.png\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - Msgnet is a style transfer model. We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to [msgnet](https://github.com/zhanghang1989/PyTorch-Multi-Style-Transfer)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install msgnet\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```\n    $ hub run msgnet --input_path \"/PATH/TO/ORIGIN/IMAGE\" --style_path \"/PATH/TO/STYLE/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    -  ```python\n        import paddle\n        import paddlehub as hub\n\n        if __name__ == '__main__':\n            model = hub.Module(name='msgnet')\n            result = model.predict(origin=[\"/PATH/TO/ORIGIN/IMAGE\"], style=\"/PATH/TO/STYLE/IMAGE\", visualization=True, save_path=\"/PATH/TO/SAVE/IMAGE\")\n        ```\n\n- ### 3、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start fine-tuning the msgnet model on datasets such as [MiniCOCO](../../docs/reference/datasets.md#class-hubdatasetsMiniCOCO) by executing `python train.py`.\n\n    - Steps:\n\n        - Step1: Define the data preprocessing method\n\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transform = T.Compose([T.Resize((256, 256), interpolation='LINEAR')])\n              ```\n\n            - The `transforms` data augmentation module defines a rich set of data preprocessing methods. Users can replace them according to their needs.\n\n        - Step2: Download the dataset\n            - ```python\n              from paddlehub.datasets.minicoco import MiniCOCO\n\n              styledata = MiniCOCO(transform=transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n                * `mode`: Select the data mode; the options are `train`, `test`, `val`. Default is `train`.\n\n                - For dataset preparation, refer to [minicoco.py](../../paddlehub/datasets/minicoco.py). 
`hub.datasets.MiniCOCO()` automatically downloads the dataset from the network and decompresses it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              model = hub.Module(name='msgnet', load_checkpoint=None)\n              ```\n                * `name`: model name.\n                * `load_checkpoint`: Whether to load a self-trained checkpoint; if it is None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              optimizer = paddle.optimizer.Adam(learning_rate=0.0001, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_style_ckpt')\n              trainer.train(styledata, epochs=101, batch_size=4, eval_dataset=styledata, log_interval=10, save_interval=10)\n              ```\n\n\n    - Model prediction\n\n        -   When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:\n            -   ```python\n                import paddle\n                import paddlehub as hub\n\n                if __name__ == '__main__':\n                    model = hub.Module(name='msgnet', load_checkpoint=\"/PATH/TO/CHECKPOINT\")\n                    result = model.predict(origin=[\"/PATH/TO/ORIGIN/IMAGE\"], style=\"/PATH/TO/STYLE/IMAGE\", visualization=True, save_path=\"/PATH/TO/SAVE/IMAGE\")\n                ```\n\n                - **Parameters**\n                    * `origin`: Image path or ndarray data with format [H, W, C], BGR.\n                    * `style`: Style image path.\n                    * `visualization`: Whether to save the results as image files.\n                    * `save_path`: Save path of the result, default is 'style_tranfer'.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m msgnet\n          ```\n\n    - The style transfer service API is now deployed; the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        -   ```python\n            import requests\n            import json\n            import cv2\n            import base64\n\n            import numpy as np\n\n\n            def cv2_to_base64(image):\n                data = cv2.imencode('.jpg', image)[1]\n                return base64.b64encode(data.tobytes()).decode('utf8')\n\n            def base64_to_cv2(b64str):\n                data = base64.b64decode(b64str.encode('utf8'))\n                data = np.frombuffer(data, np.uint8)\n                data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n                return data\n\n            # Send an HTTP request\n            org_im = cv2.imread('/PATH/TO/ORIGIN/IMAGE')\n            style_im = cv2.imread('/PATH/TO/STYLE/IMAGE')\n            data = {'images':[[cv2_to_base64(org_im)], cv2_to_base64(style_im)]}\n            headers = {\"Content-type\": \"application/json\"}\n            url = \"http://127.0.0.1:8866/predict/msgnet\"\n            r = requests.post(url=url, headers=headers, data=json.dumps(data))\n            data = base64_to_cv2(r.json()[\"results\"]['data'][0])\n            cv2.imwrite('style.png', data)\n            ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/msgnet/module.py",
    "content": "import os\nimport argparse\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddle.nn.functional as F\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.vision.transforms import Compose, Resize, CenterCrop\nfrom paddlehub.module.cv_module import StyleTransferModule\n\n\nclass GramMatrix(nn.Layer):\n    \"\"\"Calculate gram matrix\"\"\"\n\n    def forward(self, y):\n        (b, ch, h, w) = y.shape\n        features = y.reshape((b, ch, w * h))\n        features_t = features.transpose((0, 2, 1))\n        gram = features.bmm(features_t) / (ch * h * w)\n        return gram\n\n\nclass ConvLayer(nn.Layer):\n    \"\"\"Basic conv layer with reflection padding layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, stride: int):\n        super(ConvLayer, self).__init__()\n        pad = int(np.floor(kernel_size / 2))\n        self.reflection_pad = nn.Pad2D([pad, pad, pad, pad], mode='reflect')\n        self.conv2d = nn.Conv2D(in_channels, out_channels, kernel_size, stride)\n\n    def forward(self, x: paddle.Tensor):\n        out = self.reflection_pad(x)\n        out = self.conv2d(out)\n        return out\n\n\nclass UpsampleConvLayer(nn.Layer):\n    \"\"\"\n    Upsamples the input and then does a convolution. 
This method gives better results compared to ConvTranspose2d.\n    ref: http://distill.pub/2016/deconv-checkerboard/\n\n    Args:\n       in_channels(int): Number of input channels.\n       out_channels(int): Number of output channels.\n       kernel_size(int): Number of kernel size.\n       stride(int): Number of stride.\n       upsample(int): Scale factor for upsample layer, default is None.\n\n    Return:\n        img(paddle.Tensor): UpsampleConvLayer output.\n    \"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, stride: int, upsample=None):\n        super(UpsampleConvLayer, self).__init__()\n        self.upsample = upsample\n        if upsample:\n            self.upsample_layer = nn.Upsample(scale_factor=upsample)\n        self.pad = int(np.floor(kernel_size / 2))\n        if self.pad != 0:\n            self.reflection_pad = nn.Pad2D([self.pad, self.pad, self.pad, self.pad], mode='reflect')\n        self.conv2d = nn.Conv2D(in_channels, out_channels, kernel_size, stride)\n\n    def forward(self, x):\n        if self.upsample:\n            x = self.upsample_layer(x)\n        if self.pad != 0:\n            x = self.reflection_pad(x)\n        out = self.conv2d(x)\n        return out\n\n\nclass Bottleneck(nn.Layer):\n    \"\"\" Pre-activation residual block\n        Identity Mapping in Deep Residual Networks\n        ref https://arxiv.org/abs/1603.05027\n\n    Args:\n       inplanes(int): Number of input channels.\n       planes(int): Number of output channels.\n       stride(int): Number of stride.\n       downsample(int): Scale factor for downsample layer, default is None.\n       norm_layer(nn.Layer): Batch norm layer, default is nn.BatchNorm2D.\n\n    Return:\n        img(paddle.Tensor): Bottleneck output.\n    \"\"\"\n\n    def __init__(self,\n                 inplanes: int,\n                 planes: int,\n                 stride: int = 1,\n                 downsample: int = None,\n                 norm_layer: nn.Layer = 
nn.BatchNorm2D):\n        super(Bottleneck, self).__init__()\n        self.expansion = 4\n        self.downsample = downsample\n        if self.downsample is not None:\n            self.residual_layer = nn.Conv2D(inplanes, planes * self.expansion, kernel_size=1, stride=stride)\n        conv_block = (norm_layer(inplanes), nn.ReLU(), nn.Conv2D(inplanes, planes, kernel_size=1, stride=1),\n                      norm_layer(planes), nn.ReLU(), ConvLayer(planes, planes, kernel_size=3, stride=stride),\n                      norm_layer(planes), nn.ReLU(), nn.Conv2D(\n                          planes, planes * self.expansion, kernel_size=1, stride=1))\n        self.conv_block = nn.Sequential(*conv_block)\n\n    def forward(self, x: paddle.Tensor):\n        if self.downsample is not None:\n            residual = self.residual_layer(x)\n        else:\n            residual = x\n        return residual + self.conv_block(x)\n\n\nclass UpBottleneck(nn.Layer):\n    \"\"\" Up-sample residual block (from MSG-Net paper)\n    Enables passing identity all the way through the generator\n    ref https://arxiv.org/abs/1703.06953\n\n    Args:\n       inplanes(int): Number of input channels.\n       planes(int): Number of output channels.\n       stride(int): Number of stride, default is 2.\n       norm_layer(nn.Layer): Batch norm layer, default is nn.BatchNorm2D.\n\n    Return:\n        img(paddle.Tensor): UpBottleneck output.\n    \"\"\"\n\n    def __init__(self, inplanes: int, planes: int, stride: int = 2, norm_layer: nn.Layer = nn.BatchNorm2D):\n        super(UpBottleneck, self).__init__()\n        self.expansion = 4\n        self.residual_layer = UpsampleConvLayer(\n            inplanes, planes * self.expansion, kernel_size=1, stride=1, upsample=stride)\n        conv_block = []\n        conv_block += [norm_layer(inplanes), nn.ReLU(), nn.Conv2D(inplanes, planes, kernel_size=1, stride=1)]\n        conv_block += [\n            norm_layer(planes),\n           
 nn.ReLU(),\n            UpsampleConvLayer(planes, planes, kernel_size=3, stride=1, upsample=stride)\n        ]\n        conv_block += [\n            norm_layer(planes),\n            nn.ReLU(),\n            nn.Conv2D(planes, planes * self.expansion, kernel_size=1, stride=1)\n        ]\n        self.conv_block = nn.Sequential(*conv_block)\n\n    def forward(self, x: paddle.Tensor):\n        return self.residual_layer(x) + self.conv_block(x)\n\n\nclass Inspiration(nn.Layer):\n    \"\"\" Inspiration Layer (from MSG-Net paper)\n    tuning the featuremap with target Gram Matrix\n    ref https://arxiv.org/abs/1703.06953\n\n    Args:\n       C(int): Number of input channels.\n       B(int): B is equal to 1 or input mini_batch, default is 1.\n\n    Return:\n        img(paddle.Tensor): Inspiration output.\n    \"\"\"\n\n    def __init__(self, C: int, B: int = 1):\n        super(Inspiration, self).__init__()\n\n        self.weight = paddle.create_parameter(shape=[1, C, C], dtype='float32')\n        # non-parameter buffer\n        self.G = paddle.to_tensor(np.random.rand(B, C, C))\n        self.C = C\n\n    def setTarget(self, target: paddle.Tensor):\n        self.G = target\n\n    def forward(self, X: paddle.Tensor):\n        # input X is a 3D feature map\n        self.P = paddle.bmm(self.weight.expand_as(self.G), self.G)\n\n        x = paddle.bmm(\n            self.P.transpose((0, 2, 1)).expand((X.shape[0], self.C, self.C)), X.reshape((X.shape[0], X.shape[1],\n                                                                                         -1))).reshape(X.shape)\n        return x\n\n    def __repr__(self):\n        return self.__class__.__name__ + '(' \\\n               + 'N x ' + str(self.C) + ')'\n\n\nclass Vgg16(nn.Layer):\n    \"\"\" First four layers from Vgg16.\"\"\"\n\n    def __init__(self):\n        super(Vgg16, self).__init__()\n        self.conv1_1 = nn.Conv2D(3, 64, kernel_size=3, stride=1, padding=1)\n        self.conv1_2 = nn.Conv2D(64, 
64, kernel_size=3, stride=1, padding=1)\n\n        self.conv2_1 = nn.Conv2D(64, 128, kernel_size=3, stride=1, padding=1)\n        self.conv2_2 = nn.Conv2D(128, 128, kernel_size=3, stride=1, padding=1)\n\n        self.conv3_1 = nn.Conv2D(128, 256, kernel_size=3, stride=1, padding=1)\n        self.conv3_2 = nn.Conv2D(256, 256, kernel_size=3, stride=1, padding=1)\n        self.conv3_3 = nn.Conv2D(256, 256, kernel_size=3, stride=1, padding=1)\n\n        self.conv4_1 = nn.Conv2D(256, 512, kernel_size=3, stride=1, padding=1)\n        self.conv4_2 = nn.Conv2D(512, 512, kernel_size=3, stride=1, padding=1)\n        self.conv4_3 = nn.Conv2D(512, 512, kernel_size=3, stride=1, padding=1)\n\n        self.conv5_1 = nn.Conv2D(512, 512, kernel_size=3, stride=1, padding=1)\n        self.conv5_2 = nn.Conv2D(512, 512, kernel_size=3, stride=1, padding=1)\n        self.conv5_3 = nn.Conv2D(512, 512, kernel_size=3, stride=1, padding=1)\n\n        checkpoint = os.path.join(MODULE_HOME, 'msgnet', 'vgg16.pdparams')\n        model_dict = paddle.load(checkpoint)\n        self.set_dict(model_dict)\n        print(\"load pretrained vgg16 checkpoint success\")\n\n    def forward(self, X):\n        h = F.relu(self.conv1_1(X))\n        h = F.relu(self.conv1_2(h))\n        relu1_2 = h\n        h = F.max_pool2d(h, kernel_size=2, stride=2)\n\n        h = F.relu(self.conv2_1(h))\n        h = F.relu(self.conv2_2(h))\n        relu2_2 = h\n        h = F.max_pool2d(h, kernel_size=2, stride=2)\n\n        h = F.relu(self.conv3_1(h))\n        h = F.relu(self.conv3_2(h))\n        h = F.relu(self.conv3_3(h))\n        relu3_3 = h\n        h = F.max_pool2d(h, kernel_size=2, stride=2)\n\n        h = F.relu(self.conv4_1(h))\n        h = F.relu(self.conv4_2(h))\n        h = F.relu(self.conv4_3(h))\n        relu4_3 = h\n\n        return [relu1_2, relu2_2, relu3_3, relu4_3]\n\n\n@moduleinfo(\n    name=\"msgnet\",\n    type=\"CV/image_editing\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\"Msgnet 
is an image style transfer model; this module is trained on the COCO2014 dataset.\",\n    version=\"1.0.0\",\n    meta=StyleTransferModule)\nclass MSGNet(nn.Layer):\n    \"\"\" MSGNet (from MSG-Net paper)\n    Enables passing identity all the way through the generator.\n    ref https://arxiv.org/abs/1703.06953\n\n    Args:\n       input_nc(int): Number of input channels, default is 3.\n       output_nc(int): Number of output channels, default is 3.\n       ngf(int): Number of channels in the middle layers, default is 128.\n       n_blocks(int): Number of residual blocks, default is 6.\n       norm_layer(nn.Layer): Normalization layer, default is nn.InstanceNorm2D.\n       load_checkpoint(str): Pretrained checkpoint path, default is None.\n\n    Return:\n        img(paddle.Tensor): MSGNet output.\n    \"\"\"\n\n    def __init__(self, input_nc=3, output_nc=3, ngf=128, n_blocks=6, norm_layer=nn.InstanceNorm2D,\n                 load_checkpoint=None):\n        super(MSGNet, self).__init__()\n        self.gram = GramMatrix()\n        block = Bottleneck\n        upblock = UpBottleneck\n        expansion = 4\n\n        model1 = [\n            ConvLayer(input_nc, 64, kernel_size=7, stride=1),\n            norm_layer(64),\n            nn.ReLU(),\n            block(64, 32, 2, 1, norm_layer),\n            block(32 * expansion, ngf, 2, 1, norm_layer)\n        ]\n\n        self.model1 = nn.Sequential(*tuple(model1))\n\n        model = []\n        model += model1\n\n        self.ins = Inspiration(ngf * expansion)\n        model.append(self.ins)\n        for i in range(n_blocks):\n            model += [block(ngf * expansion, ngf, 1, None, norm_layer)]\n\n        model += [\n            upblock(ngf * expansion, 32, 2, norm_layer),\n            upblock(32 * expansion, 16, 2, norm_layer),\n            norm_layer(16 * expansion),\n            nn.ReLU(),\n            ConvLayer(16 * expansion, output_nc, kernel_size=7, stride=1)\n        ]\n        model = tuple(model)\n        self.model = 
nn.Sequential(*model)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'style_paddle.pdparams')\n            model_dict = paddle.load(checkpoint)\n            model_dict_clone = model_dict.copy()\n            for key, value in model_dict_clone.items():\n                if key.endswith((\"scale\")):\n                    name = key.rsplit('.', 1)[0] + '.bias'\n                    model_dict[name] = paddle.zeros(shape=model_dict[name].shape, dtype='float32')\n                    model_dict[key] = paddle.ones(shape=model_dict[key].shape, dtype='float32')\n            self.set_dict(model_dict)\n            self.model_dict = model_dict\n            print(\"load pretrained checkpoint success\")\n\n        self._vgg = None\n\n    def transform(self, path: str):\n        transform = Compose([Resize((256, 256), interpolation='LINEAR')])\n        return transform(path)\n\n    def setTarget(self, Xs: paddle.Tensor):\n        \"\"\"Calculate feature gram matrix\"\"\"\n        F = self.model1(Xs)\n        G = self.gram(F)\n        self.ins.setTarget(G)\n\n    def getFeature(self, input: paddle.Tensor):\n        if not self._vgg:\n            self._vgg = Vgg16()\n        return self._vgg(input)\n\n    def forward(self, input: paddle.Tensor):\n        return self.model(input)\n"
  },
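The Inspiration layer above tunes a feature map against a target Gram matrix produced by the module's `GramMatrix` layer (not shown in this excerpt). The normalization used in the MSG-Net reference code can be sketched in NumPy; the `gram_matrix` helper name here is ours:

```python
import numpy as np

def gram_matrix(feat):
    """Gram matrix of a (B, C, H, W) feature map, normalized by C*H*W.

    Mirrors what MSG-Net computes on style features before the result is
    passed to Inspiration.setTarget; the helper name is an assumption.
    """
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)                   # flatten spatial dims
    return f @ f.transpose(0, 2, 1) / (c * h * w)   # (B, C, C), symmetric
```

The result is always symmetric per batch item, which is why Inspiration can store it as a plain `(B, C, C)` buffer.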
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/README.md",
    "content": "# paint_transformer\n\n|Module Name|paint_transformer|\n| :--- | :---: |\n|Category|Image - Style Transfer|\n|Network|Paint Transformer|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|77MB|\n|Latest update date|2021-12-07|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/145002878-ffdeea71-8ff4-48cc-88d0-fba1aa1dce4b.jpg\"  width = \"40%\"  hspace='10'/>\n    <br />\n    Input image\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/145002301-97c45887-cb2e-4a06-9d00-07b74080effa.png\"  width = \"40%\"  hspace='10'/>\n    <br />\n    Output image\n     <br />\n    </p>\n\n- ### Module Introduction\n\n  - This module transfers an input image into an oil-painting style.\n  - For more details, please refer to: [Paint Transformer: Feed Forward Neural Painting with Stroke Prediction](https://github.com/wzmsltw/PaintTransformer)\n\n\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n  - ppgan\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install paint_transformer\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    # Read from a file\n    $ hub run paint_transformer --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - This command runs the style transfer module. For more usage, please refer to [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"paint_transformer\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    module.style_transfer(paths=input_path, output_dir='./transfer_result/', use_gpu=True)\n    ```\n\n- ### 3. API\n\n  - ```python\n    style_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, need_animation=False, visualization=True):\n    ```\n    - Oil-painting style transfer API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, with each ndarray of shape \\[H, W, C\\]; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - output\\_dir (str): directory to save the results; <br/>\n      - use\\_gpu (bool): whether to use GPU; <br/>\n      - need_animation (bool): whether to save intermediate results as animation frames;\n      - visualization (bool): whether to save the results to a local folder\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of oil-painting style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m paint_transformer\n    ```\n\n  - The servitization API is now deployed; the default port is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/paint_transformer\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install paint_transformer==1.0.0\n    ```\n"
  },
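The README's serving example encodes images with cv2, but the request body itself depends only on the standard library: a JSON object of the form `{"images": [<base64 string>, ...]}`. A minimal standard-library sketch of building that payload from raw image bytes (the `make_payload` helper name is ours):

```python
import base64
import json

def make_payload(image_bytes_list):
    # Build the JSON body expected by the paint_transformer service:
    # {"images": [<base64-encoded image bytes>, ...]}
    images = [base64.b64encode(b).decode('utf8') for b in image_bytes_list]
    return json.dumps({'images': images})
```

The server decodes each entry back to bytes and then to an image, so any encoding that `cv2.imdecode` accepts (e.g. JPEG or PNG file bytes) works here.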
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/inference.py",
    "content": "import numpy as np\nfrom PIL import Image\nimport network\nimport os\nimport math\nimport render_utils\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport cv2\nimport render_parallel\nimport render_serial\n\n\ndef main(input_path, model_path, output_dir, need_animation=False, resize_h=None, resize_w=None, serial=False):\n    if not os.path.exists(output_dir):\n        os.mkdir(output_dir)\n    input_name = os.path.basename(input_path)\n    output_path = os.path.join(output_dir, input_name)\n    frame_dir = None\n    if need_animation:\n        if not serial:\n            print('It must be under serial mode if animation results are required, so serial flag is set to True!')\n            serial = True\n        frame_dir = os.path.join(output_dir, input_name[:input_name.find('.')])\n        if not os.path.exists(frame_dir):\n            os.mkdir(frame_dir)\n    stroke_num = 8\n\n    #* ----- load model ----- *#\n    paddle.set_device('gpu')\n    net_g = network.Painter(5, stroke_num, 256, 8, 3, 3)\n    net_g.set_state_dict(paddle.load(model_path))\n    net_g.eval()\n    for param in net_g.parameters():\n        param.stop_gradient = True\n\n    #* ----- load brush ----- *#\n    brush_large_vertical = render_utils.read_img('brush/brush_large_vertical.png', 'L')\n    brush_large_horizontal = render_utils.read_img('brush/brush_large_horizontal.png', 'L')\n    meta_brushes = paddle.concat([brush_large_vertical, brush_large_horizontal], axis=0)\n\n    import time\n    t0 = time.time()\n\n    original_img = render_utils.read_img(input_path, 'RGB', resize_h, resize_w)\n    if serial:\n        final_result_list = render_serial.render_serial(original_img, net_g, meta_brushes)\n        if need_animation:\n\n            print(\"total frame:\", len(final_result_list))\n            for idx, frame in enumerate(final_result_list):\n                cv2.imwrite(os.path.join(frame_dir, '%03d.png' % idx), frame)\n        else:\n            
cv2.imwrite(output_path, final_result_list[-1])\n    else:\n        final_result = render_parallel.render_parallel(original_img, net_g, meta_brushes)\n        cv2.imwrite(output_path, final_result)\n\n    print(\"total infer time:\", time.time() - t0)\n\n\nif __name__ == '__main__':\n\n    main(\n        input_path='input/chicago.jpg',\n        model_path='paint_best.pdparams',\n        output_dir='output/',\n        need_animation=True,  # whether need intermediate results for animation.\n        resize_h=512,  # resize original input to this size. None means do not resize.\n        resize_w=512,  # resize original input to this size. None means do not resize.\n        serial=True)  # if need animation, serial must be True.\n"
  },
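`main()` above derives the animation frame directory by slicing the file name at the first `'.'`; the same intent can be expressed with `os.path.splitext`, which also handles names containing extra dots or no extension. A small sketch under that assumption (`frame_dir_for` is our name, not part of the module):

```python
import os

def frame_dir_for(input_path, output_dir):
    # Equivalent of main()'s frame_dir computation, but using
    # os.path.splitext so "my.photo.jpg" maps to "my.photo" and a
    # name without any extension is kept whole.
    stem = os.path.splitext(os.path.basename(input_path))[0]
    return os.path.join(output_dir, stem)
```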
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/model.py",
    "content": "import paddle\nimport paddle.nn as nn\nimport math\n\n\nclass Painter(nn.Layer):\n    \"\"\"\n    network architecture written in paddle.\n    \"\"\"\n\n    def __init__(self, param_per_stroke, total_strokes, hidden_dim, n_heads=8, n_enc_layers=3, n_dec_layers=3):\n        super().__init__()\n        self.enc_img = nn.Sequential(\n            nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(3, 32, 3, 1),\n            nn.BatchNorm2D(32),\n            nn.ReLU(),  # maybe replace with the inplace version\n            nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(32, 64, 3, 2),\n            nn.BatchNorm2D(64),\n            nn.ReLU(),\n            nn.Pad2D([1, 1, 1, 1], 'reflect'),\n            nn.Conv2D(64, 128, 3, 2),\n            nn.BatchNorm2D(128),\n            nn.ReLU())\n        self.enc_canvas = nn.Sequential(\n            nn.Pad2D([1, 1, 1, 1], 'reflect'), nn.Conv2D(3, 32, 3, 1), nn.BatchNorm2D(32), nn.ReLU(),\n            nn.Pad2D([1, 1, 1, 1], 'reflect'), nn.Conv2D(32, 64, 3, 2), nn.BatchNorm2D(64), nn.ReLU(),\n            nn.Pad2D([1, 1, 1, 1], 'reflect'), nn.Conv2D(64, 128, 3, 2), nn.BatchNorm2D(128), nn.ReLU())\n        self.conv = nn.Conv2D(128 * 2, hidden_dim, 1)\n        self.transformer = nn.Transformer(hidden_dim, n_heads, n_enc_layers, n_dec_layers)\n        self.linear_param = nn.Sequential(\n            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),\n            nn.Linear(hidden_dim, param_per_stroke))\n        self.linear_decider = nn.Linear(hidden_dim, 1)\n        self.query_pos = paddle.static.create_parameter([total_strokes, hidden_dim],\n                                                        dtype='float32',\n                                                        default_initializer=nn.initializer.Uniform(0, 1))\n        self.row_embed = paddle.static.create_parameter([8, hidden_dim // 2],\n                                                        
dtype='float32',\n                                                        default_initializer=nn.initializer.Uniform(0, 1))\n        self.col_embed = paddle.static.create_parameter([8, hidden_dim // 2],\n                                                        dtype='float32',\n                                                        default_initializer=nn.initializer.Uniform(0, 1))\n\n    def forward(self, img, canvas):\n        \"\"\"\n        prediction\n        \"\"\"\n        b, _, H, W = img.shape\n        img_feat = self.enc_img(img)\n        canvas_feat = self.enc_canvas(canvas)\n        h, w = img_feat.shape[-2:]\n        feat = paddle.concat([img_feat, canvas_feat], axis=1)\n        feat_conv = self.conv(feat)\n\n        pos_embed = paddle.concat([\n            self.col_embed[:w].unsqueeze(0).tile([h, 1, 1]),\n            self.row_embed[:h].unsqueeze(1).tile([1, w, 1]),\n        ],\n                                  axis=-1).flatten(0, 1).unsqueeze(1)\n\n        hidden_state = self.transformer((pos_embed + feat_conv.flatten(2).transpose([2, 0, 1])).transpose([1, 0, 2]),\n                                        self.query_pos.unsqueeze(1).tile([1, b, 1]).transpose([1, 0, 2]))\n\n        param = self.linear_param(hidden_state)\n        decision = self.linear_decider(hidden_state)\n        return param, decision\n"
  },
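`Painter.forward` builds its 2D positional embedding by tiling a learned column half along the rows and a learned row half along the columns, then concatenating. The tiling arithmetic can be sketched in NumPy (shapes only; `build_pos_embed` is our name):

```python
import numpy as np

def build_pos_embed(row_embed, col_embed, h, w):
    """Tile (H_max, D/2) row and (W_max, D/2) column embeddings into a
    (h*w, 1, D) positional embedding, one D-dim vector per spatial
    location, mirroring the concat/tile logic in Painter.forward."""
    col = np.tile(col_embed[:w][np.newaxis, :, :], (h, 1, 1))  # (h, w, D/2)
    row = np.tile(row_embed[:h][:, np.newaxis, :], (1, w, 1))  # (h, w, D/2)
    return np.concatenate([col, row], axis=-1).reshape(h * w, 1, -1)
```

Location `(y, x)` thus receives `concat(col_embed[x], row_embed[y])`, flattened in row-major order, matching the `flatten(0, 1).unsqueeze(1)` in the paddle code.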
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\nimport copy\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\nfrom skimage.transform import rescale, resize\n\nfrom .model import Painter\nfrom .render_utils import totensor, read_img\nfrom .render_serial import render_serial\nfrom .util import base64_to_cv2\n\n\n@moduleinfo(\n    name=\"paint_transformer\",\n    type=\"CV/style_transfer\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"\",\n    version=\"1.0.0\")\nclass paint_transformer:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"paint_best.pdparams\")\n\n        self.network = Painter(5, 8, 256, 8, 3, 3)\n        self.network.set_state_dict(paddle.load(self.pretrained_model))\n        self.network.eval()\n        for param in self.network.parameters():\n            param.stop_gradient = True\n        #* ----- load brush ----- *#\n        brush_large_vertical = read_img(os.path.join(self.directory, 'brush/brush_large_vertical.png'), 'L')\n        brush_large_horizontal = read_img(os.path.join(self.directory, 'brush/brush_large_horizontal.png'), 'L')\n        self.meta_brushes = paddle.concat([brush_large_vertical, brush_large_horizontal], axis=0)\n\n    def 
style_transfer(self,\n                       images: list = None,\n                       paths: list = None,\n                       output_dir: str = './transfer_result/',\n                       use_gpu: bool = False,\n                       need_animation: bool = False,\n                       visualization: bool = True):\n        '''\n        Oil-painting style transfer API.\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR (as read by cv2).\n        paths (list[str]): paths to images.\n        output_dir (str): the dir to save the results.\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        need_animation (bool): if True, save every frame to show the process of painting.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                image = totensor(image)\n                final_result_list = render_serial(image, self.network, self.meta_brushes)\n                results.append(final_result_list)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                image = totensor(image)\n                final_result_list = render_serial(image, self.network, self.meta_brushes)\n                results.append(final_result_list)\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                if out:\n                    if need_animation:\n                        curoutputdir = os.path.join(output_dir, 'output_{}'.format(i))\n                        os.makedirs(curoutputdir, exist_ok=True)\n                        for j, outimg in enumerate(out):\n                            cv2.imwrite(os.path.join(curoutputdir, 'frame_{}.png'.format(j)), outimg)\n                    else:\n                        cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.style_transfer(\n            paths=[self.args.input_path],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            need_animation=self.args.need_animation,\n            visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.style_transfer(images=images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', type=bool, default=False, help='save results or not.')\n        self.arg_config_group.add_argument(\n            '--need_animation', type=bool, default=False, help='save intermediate results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
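`style_transfer` writes either a single `output_{i}.png` per input, or, when `need_animation` is set, a directory `output_{i}/` containing `frame_{j}.png` for every intermediate canvas. The on-disk layout can be sketched as a small helper (the `output_paths` name is ours):

```python
import os

def output_paths(output_dir, idx, n_frames, need_animation):
    # Mirror the file layout used by style_transfer's visualization step:
    # one final image per input, or one frame per intermediate canvas.
    if need_animation:
        sub = os.path.join(output_dir, 'output_{}'.format(idx))
        return [os.path.join(sub, 'frame_{}.png'.format(j)) for j in range(n_frames)]
    return [os.path.join(output_dir, 'output_{}.png'.format(idx))]
```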
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/render_parallel.py",
    "content": "import render_utils\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport numpy as np\nimport math\n\n\ndef crop(img, h, w):\n    H, W = img.shape[-2:]\n    pad_h = (H - h) // 2\n    pad_w = (W - w) // 2\n    remainder_h = (H - h) % 2\n    remainder_w = (W - w) % 2\n    img = img[:, :, pad_h:H - pad_h - remainder_h, pad_w:W - pad_w - remainder_w]\n    return img\n\n\ndef stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num, patch_num):\n    \"\"\"\n    stroke_net_predict\n    \"\"\"\n    img_patch = img_patch.transpose([0, 2, 1]).reshape([-1, 3, patch_size, patch_size])\n    result_patch = result_patch.transpose([0, 2, 1]).reshape([-1, 3, patch_size, patch_size])\n    #*----- Stroke Predictor -----*#\n    shape_param, stroke_decision = net_g(img_patch, result_patch)\n    stroke_decision = (stroke_decision > 0).astype('float32')\n    #*----- sampling color -----*#\n    grid = shape_param[:, :, :2].reshape([img_patch.shape[0] * stroke_num, 1, 1, 2])\n    img_temp = img_patch.unsqueeze(1).tile([1, stroke_num, 1, 1,\n                                            1]).reshape([img_patch.shape[0] * stroke_num, 3, patch_size, patch_size])\n    color = nn.functional.grid_sample(\n        img_temp, 2 * grid - 1, align_corners=False).reshape([img_patch.shape[0], stroke_num, 3])\n    param = paddle.concat([shape_param, color], axis=-1)\n\n    param = param.reshape([-1, 8])\n    param[:, :2] = param[:, :2] / 2 + 0.25\n    param[:, 2:4] = param[:, 2:4] / 2\n    param = param.reshape([1, patch_num, patch_num, stroke_num, 8])\n    decision = stroke_decision.reshape([1, patch_num, patch_num, stroke_num])  #.astype('bool')\n    return param, decision\n\n\ndef param2img_parallel(param, decision, meta_brushes, cur_canvas, stroke_num=8):\n    \"\"\"\n        Input stroke parameters and decisions for each patch, meta brushes, current canvas, frame directory,\n        and whether there is a border (if intermediate painting 
results are required).\n        Output the painting results of adding the corresponding strokes on the current canvas.\n        Args:\n            param: a tensor with shape batch size x patch along height dimension x patch along width dimension\n             x n_stroke_per_patch x n_param_per_stroke\n            decision: a 01 tensor with shape batch size x patch along height dimension x patch along width dimension\n             x n_stroke_per_patch\n            meta_brushes: a tensor with shape 2 x 3 x meta_brush_height x meta_brush_width.\n            The first slice on the batch dimension denotes vertical brush and the second one denotes horizontal brush.\n            cur_canvas: a tensor with shape batch size x 3 x H x W,\n             where H and W denote height and width of padded results of original images.\n\n        Returns:\n            cur_canvas: a tensor with shape batch size x 3 x H x W, denoting painting results.\n        \"\"\"\n    # param: b, h, w, stroke_per_patch, param_per_stroke\n    # decision: b, h, w, stroke_per_patch\n    b, h, w, s, p = param.shape\n    h, w = int(h), int(w)\n    param = param.reshape([-1, 8])\n    decision = decision.reshape([-1, 8])\n\n    H, W = cur_canvas.shape[-2:]\n    is_odd_y = h % 2 == 1\n    is_odd_x = w % 2 == 1\n    render_size_y = 2 * H // h\n    render_size_x = 2 * W // w\n\n    even_idx_y = paddle.arange(0, h, 2)\n    even_idx_x = paddle.arange(0, w, 2)\n    if h > 1:\n        odd_idx_y = paddle.arange(1, h, 2)\n    if w > 1:\n        odd_idx_x = paddle.arange(1, w, 2)\n\n    cur_canvas = F.pad(cur_canvas, [render_size_x // 4, render_size_x // 4, render_size_y // 4, render_size_y // 4])\n\n    valid_foregrounds = render_utils.param2stroke(param, render_size_y, render_size_x, meta_brushes)\n\n    #* ----- load dilation/erosion ---- *#\n    dilation = render_utils.Dilation2d(m=1)\n    erosion = render_utils.Erosion2d(m=1)\n\n    #* ----- generate alphas ----- *#\n    valid_alphas = (valid_foregrounds > 
0).astype('float32')\n    valid_foregrounds = valid_foregrounds.reshape([-1, stroke_num, 1, render_size_y, render_size_x])\n    valid_alphas = valid_alphas.reshape([-1, stroke_num, 1, render_size_y, render_size_x])\n\n    temp = [dilation(valid_foregrounds[:, i, :, :, :]) for i in range(stroke_num)]\n    valid_foregrounds = paddle.stack(temp, axis=1)\n    valid_foregrounds = valid_foregrounds.reshape([-1, 1, render_size_y, render_size_x])\n\n    temp = [erosion(valid_alphas[:, i, :, :, :]) for i in range(stroke_num)]\n    valid_alphas = paddle.stack(temp, axis=1)\n    valid_alphas = valid_alphas.reshape([-1, 1, render_size_y, render_size_x])\n\n    foregrounds = valid_foregrounds.reshape([-1, h, w, stroke_num, 1, render_size_y, render_size_x])\n    alphas = valid_alphas.reshape([-1, h, w, stroke_num, 1, render_size_y, render_size_x])\n    decision = decision.reshape([-1, h, w, stroke_num, 1, 1, 1])\n    param = param.reshape([-1, h, w, stroke_num, 8])\n\n    def partial_render(this_canvas, patch_coord_y, patch_coord_x):\n        canvas_patch = F.unfold(\n            this_canvas, [render_size_y, render_size_x], strides=[render_size_y // 2, render_size_x // 2])\n        # canvas_patch: b, 3 * py * px, h * w\n        canvas_patch = canvas_patch.reshape([b, 3, render_size_y, render_size_x, h, w])\n        canvas_patch = canvas_patch.transpose([0, 4, 5, 1, 2, 3])\n        selected_canvas_patch = paddle.gather(canvas_patch, patch_coord_y, 1)\n        selected_canvas_patch = paddle.gather(selected_canvas_patch, patch_coord_x, 2)\n        selected_canvas_patch = selected_canvas_patch.reshape([0, 0, 0, 1, 3, render_size_y, render_size_x])\n        selected_foregrounds = paddle.gather(foregrounds, patch_coord_y, 1)\n        selected_foregrounds = paddle.gather(selected_foregrounds, patch_coord_x, 2)\n        selected_alphas = paddle.gather(alphas, patch_coord_y, 1)\n        selected_alphas = paddle.gather(selected_alphas, patch_coord_x, 2)\n        selected_decisions = 
paddle.gather(decision, patch_coord_y, 1)\n        selected_decisions = paddle.gather(selected_decisions, patch_coord_x, 2)\n        selected_color = paddle.gather(param, patch_coord_y, 1)\n        selected_color = paddle.gather(selected_color, patch_coord_x, 2)\n        selected_color = paddle.gather(selected_color, paddle.to_tensor([5, 6, 7]), 4)\n        selected_color = selected_color.reshape([0, 0, 0, stroke_num, 3, 1, 1])\n\n        for i in range(stroke_num):\n            i = paddle.to_tensor(i)\n\n            cur_foreground = paddle.gather(selected_foregrounds, i, 3)\n            cur_alpha = paddle.gather(selected_alphas, i, 3)\n            cur_decision = paddle.gather(selected_decisions, i, 3)\n            cur_color = paddle.gather(selected_color, i, 3)\n            cur_foreground = cur_foreground * cur_color\n            selected_canvas_patch = cur_foreground * cur_alpha * cur_decision + selected_canvas_patch * (\n                1 - cur_alpha * cur_decision)\n\n        selected_canvas_patch = selected_canvas_patch.reshape([0, 0, 0, 3, render_size_y, render_size_x])\n        this_canvas = selected_canvas_patch.transpose([0, 3, 1, 4, 2, 5])\n\n        # this_canvas: b, 3, h_half, py, w_half, px\n        h_half = this_canvas.shape[2]\n        w_half = this_canvas.shape[4]\n        this_canvas = this_canvas.reshape([b, 3, h_half * render_size_y, w_half * render_size_x])\n        # this_canvas: b, 3, h_half * py, w_half * px\n        return this_canvas\n\n    # even - even area\n    # 1 | 0\n    # 0 | 0\n    canvas = partial_render(cur_canvas, even_idx_y, even_idx_x)\n    if not is_odd_y:\n        canvas = paddle.concat([canvas, cur_canvas[:, :, -render_size_y // 2:, :canvas.shape[3]]], axis=2)\n    if not is_odd_x:\n        canvas = paddle.concat([canvas, cur_canvas[:, :, :canvas.shape[2], -render_size_x // 2:]], axis=3)\n    cur_canvas = canvas\n\n    # odd - odd area\n    # 0 | 0\n    # 0 | 1\n    if h > 1 and w > 1:\n        canvas = 
partial_render(cur_canvas, odd_idx_y, odd_idx_x)\n        canvas = paddle.concat([cur_canvas[:, :, :render_size_y // 2, -canvas.shape[3]:], canvas], axis=2)\n        canvas = paddle.concat([cur_canvas[:, :, -canvas.shape[2]:, :render_size_x // 2], canvas], axis=3)\n        if is_odd_y:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, -render_size_y // 2:, :canvas.shape[3]]], axis=2)\n        if is_odd_x:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, :canvas.shape[2], -render_size_x // 2:]], axis=3)\n        cur_canvas = canvas\n\n    # odd - even area\n    # 0 | 0\n    # 1 | 0\n    if h > 1:\n        canvas = partial_render(cur_canvas, odd_idx_y, even_idx_x)\n        canvas = paddle.concat([cur_canvas[:, :, :render_size_y // 2, :canvas.shape[3]], canvas], axis=2)\n        if is_odd_y:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, -render_size_y // 2:, :canvas.shape[3]]], axis=2)\n        if not is_odd_x:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, :canvas.shape[2], -render_size_x // 2:]], axis=3)\n        cur_canvas = canvas\n\n    # even - odd area\n    # 0 | 1\n    # 0 | 0\n    if w > 1:\n        canvas = partial_render(cur_canvas, even_idx_y, odd_idx_x)\n        canvas = paddle.concat([cur_canvas[:, :, :canvas.shape[2], :render_size_x // 2], canvas], axis=3)\n        if not is_odd_y:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, -render_size_y // 2:, -canvas.shape[3]:]], axis=2)\n        if is_odd_x:\n            canvas = paddle.concat([canvas, cur_canvas[:, :, :canvas.shape[2], -render_size_x // 2:]], axis=3)\n        cur_canvas = canvas\n\n    cur_canvas = cur_canvas[:, :, render_size_y // 4:-render_size_y // 4, render_size_x // 4:-render_size_x // 4]\n\n    return cur_canvas\n\n\ndef render_parallel(original_img, net_g, meta_brushes):\n\n    patch_size = 32\n    stroke_num = 8\n\n    with paddle.no_grad():\n\n        original_h, original_w = original_img.shape[-2:]\n        K = 
max(math.ceil(math.log2(max(original_h, original_w) / patch_size)), 0)\n        original_img_pad_size = patch_size * (2**K)\n        original_img_pad = render_utils.pad(original_img, original_img_pad_size, original_img_pad_size)\n        final_result = paddle.zeros_like(original_img)\n\n        for layer in range(0, K + 1):\n            layer_size = patch_size * (2**layer)\n\n            img = F.interpolate(original_img_pad, (layer_size, layer_size))\n            result = F.interpolate(final_result, (layer_size, layer_size))\n            img_patch = F.unfold(img, [patch_size, patch_size], strides=[patch_size, patch_size])\n            result_patch = F.unfold(result, [patch_size, patch_size], strides=[patch_size, patch_size])\n\n            # There are patch_num * patch_num patches in total\n            patch_num = (layer_size - patch_size) // patch_size + 1\n            param, decision = stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num, patch_num)\n\n            #print(param.shape, decision.shape)\n            final_result = param2img_parallel(param, decision, meta_brushes, final_result)\n\n        # paint another time for last layer\n        border_size = original_img_pad_size // (2 * patch_num)\n        img = F.interpolate(original_img_pad, (layer_size, layer_size))\n        result = F.interpolate(final_result, (layer_size, layer_size))\n        img = F.pad(img, [patch_size // 2, patch_size // 2, patch_size // 2, patch_size // 2])\n        result = F.pad(result, [patch_size // 2, patch_size // 2, patch_size // 2, patch_size // 2])\n        img_patch = F.unfold(img, [patch_size, patch_size], strides=[patch_size, patch_size])\n        result_patch = F.unfold(result, [patch_size, patch_size], strides=[patch_size, patch_size])\n        final_result = F.pad(final_result, [border_size, border_size, border_size, border_size])\n        patch_num = (img.shape[2] - patch_size) // patch_size + 1\n        #w = (img.shape[3] - patch_size) // 
patch_size + 1\n\n        param, decision = stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num, patch_num)\n\n        final_result = param2img_parallel(param, decision, meta_brushes, final_result)\n\n        final_result = final_result[:, :, border_size:-border_size, border_size:-border_size]\n        final_result = (final_result.numpy().squeeze().transpose([1, 2, 0])[:, :, ::-1] * 255).astype(np.uint8)\n        return final_result\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/render_serial.py",
    "content": "# !/usr/bin/env python3\n\"\"\"\ncodes for oilpainting style transfer.\n\"\"\"\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom PIL import Image\nimport math\nimport cv2\nimport time\nfrom .render_utils import param2stroke, Dilation2d, Erosion2d\n\n\ndef get_single_layer_lists(param, decision, ori_img, render_size_x, render_size_y, h, w, meta_brushes, dilation,\n                           erosion, stroke_num):\n    \"\"\"\n    get_single_layer_lists\n    \"\"\"\n    valid_foregrounds = param2stroke(param[:, :], render_size_y, render_size_x, meta_brushes)\n\n    valid_alphas = (valid_foregrounds > 0).astype('float32')\n    valid_foregrounds = valid_foregrounds.reshape([-1, stroke_num, 1, render_size_y, render_size_x])\n    valid_alphas = valid_alphas.reshape([-1, stroke_num, 1, render_size_y, render_size_x])\n\n    temp = [dilation(valid_foregrounds[:, i, :, :, :]) for i in range(stroke_num)]\n    valid_foregrounds = paddle.stack(temp, axis=1)\n    valid_foregrounds = valid_foregrounds.reshape([-1, 1, render_size_y, render_size_x])\n\n    temp = [erosion(valid_alphas[:, i, :, :, :]) for i in range(stroke_num)]\n    valid_alphas = paddle.stack(temp, axis=1)\n    valid_alphas = valid_alphas.reshape([-1, 1, render_size_y, render_size_x])\n\n    patch_y = 4 * render_size_y // 5\n    patch_x = 4 * render_size_x // 5\n\n    img_patch = ori_img.reshape([1, 3, h, ori_img.shape[2] // h, w, ori_img.shape[3] // w])\n    img_patch = img_patch.transpose([0, 2, 4, 1, 3, 5])[0]\n\n    xid_list = []\n    yid_list = []\n    error_list = []\n\n    for flag_idx, flag in enumerate(decision.cpu().numpy()):\n        if flag:\n            flag_idx = flag_idx // stroke_num\n            x_id = flag_idx % w\n            flag_idx = flag_idx // w\n            y_id = flag_idx % h\n            xid_list.append(x_id)\n            yid_list.append(y_id)\n\n    inner_fores = valid_foregrounds[:, :, render_size_y // 10:9 * render_size_y 
// 10, render_size_x // 10:9 *\n                                    render_size_x // 10]\n    inner_alpha = valid_alphas[:, :, render_size_y // 10:9 * render_size_y // 10, render_size_x // 10:9 *\n                               render_size_x // 10]\n    inner_fores = inner_fores.reshape([h * w, stroke_num, 1, patch_y, patch_x])\n    inner_alpha = inner_alpha.reshape([h * w, stroke_num, 1, patch_y, patch_x])\n    inner_real = img_patch.reshape([h * w, 3, patch_y, patch_x]).unsqueeze(1)\n\n    R = param[:, 5]\n    G = param[:, 6]\n    B = param[:, 7]  #, G, B = param[5:]\n    R = R.reshape([-1, stroke_num]).unsqueeze(-1).unsqueeze(-1).unsqueeze(-1)\n    G = G.reshape([-1, stroke_num]).unsqueeze(-1).unsqueeze(-1).unsqueeze(-1)\n    B = B.reshape([-1, stroke_num]).unsqueeze(-1).unsqueeze(-1).unsqueeze(-1)\n    error_R = R * inner_fores - inner_real[:, :, 0:1, :, :]\n    error_G = G * inner_fores - inner_real[:, :, 1:2, :, :]\n    error_B = B * inner_fores - inner_real[:, :, 2:3, :, :]\n    error = paddle.abs(error_R) + paddle.abs(error_G) + paddle.abs(error_B)\n\n    error = error * inner_alpha\n    error = paddle.sum(error, axis=(2, 3, 4)) / paddle.sum(inner_alpha, axis=(2, 3, 4))\n    error_list = error.reshape([-1]).numpy()[decision.numpy()]\n    error_list = list(error_list)\n\n    valid_foregrounds = paddle.to_tensor(valid_foregrounds.numpy()[decision.numpy()])\n    valid_alphas = paddle.to_tensor(valid_alphas.numpy()[decision.numpy()])\n\n    selected_param = paddle.to_tensor(param.numpy()[decision.numpy()])\n    return xid_list, yid_list, valid_foregrounds, valid_alphas, error_list, selected_param\n\n\ndef get_single_stroke_on_full_image_A(x_id, y_id, valid_foregrounds, valid_alphas, param, original_img, render_size_x,\n                                      render_size_y, patch_x, patch_y):\n    \"\"\"\n    get_single_stroke_on_full_image_A\n    \"\"\"\n    tmp_foreground = paddle.zeros_like(original_img)\n\n    patch_y_num = original_img.shape[2] // patch_y\n   
 patch_x_num = original_img.shape[3] // patch_x\n\n    brush = valid_foregrounds.unsqueeze(0)\n    color_map = param[5:]\n    brush = brush.tile([1, 3, 1, 1])\n    color_map = color_map.unsqueeze(-1).unsqueeze(-1).unsqueeze(0)  #.repeat(1, 1, H, W)\n    brush = brush * color_map\n\n    pad_l = x_id * patch_x\n    pad_r = (patch_x_num - x_id - 1) * patch_x\n    pad_t = y_id * patch_y\n    pad_b = (patch_y_num - y_id - 1) * patch_y\n    tmp_foreground = nn.functional.pad(brush, [pad_l, pad_r, pad_t, pad_b])\n    tmp_foreground = tmp_foreground[:, :, render_size_y // 10:-render_size_y // 10, render_size_x //\n                                    10:-render_size_x // 10]\n\n    tmp_alpha = nn.functional.pad(valid_alphas.unsqueeze(0), [pad_l, pad_r, pad_t, pad_b])\n    tmp_alpha = tmp_alpha[:, :, render_size_y // 10:-render_size_y // 10, render_size_x // 10:-render_size_x // 10]\n    return tmp_foreground, tmp_alpha\n\n\ndef get_single_stroke_on_full_image_B(x_id, y_id, valid_foregrounds, valid_alphas, param, original_img, render_size_x,\n                                      render_size_y, patch_x, patch_y):\n    \"\"\"\n    get_single_stroke_on_full_image_B\n    \"\"\"\n    x_expand = patch_x // 2 + render_size_x // 10\n    y_expand = patch_y // 2 + render_size_y // 10\n\n    pad_l = x_id * patch_x\n    pad_r = original_img.shape[3] + 2 * x_expand - (x_id * patch_x + render_size_x)\n    pad_t = y_id * patch_y\n    pad_b = original_img.shape[2] + 2 * y_expand - (y_id * patch_y + render_size_y)\n\n    brush = valid_foregrounds.unsqueeze(0)\n    color_map = param[5:]\n    brush = brush.tile([1, 3, 1, 1])\n    color_map = color_map.unsqueeze(-1).unsqueeze(-1).unsqueeze(0)  #.repeat(1, 1, H, W)\n    brush = brush * color_map\n\n    tmp_foreground = nn.functional.pad(brush, [pad_l, pad_r, pad_t, pad_b])\n\n    tmp_foreground = tmp_foreground[:, :, y_expand:-y_expand, x_expand:-x_expand]\n    tmp_alpha = nn.functional.pad(valid_alphas.unsqueeze(0), [pad_l, pad_r, pad_t, 
pad_b])\n    tmp_alpha = tmp_alpha[:, :, y_expand:-y_expand, x_expand:-x_expand]\n    return tmp_foreground, tmp_alpha\n\n\ndef stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num):\n    \"\"\"\n    stroke_net_predict\n    \"\"\"\n    img_patch = img_patch.transpose([0, 2, 1]).reshape([-1, 3, patch_size, patch_size])\n    result_patch = result_patch.transpose([0, 2, 1]).reshape([-1, 3, patch_size, patch_size])\n    #*----- Stroke Predictor -----*#\n    shape_param, stroke_decision = net_g(img_patch, result_patch)\n    stroke_decision = (stroke_decision > 0).astype('float32')\n    #*----- sampling color -----*#\n    grid = shape_param[:, :, :2].reshape([img_patch.shape[0] * stroke_num, 1, 1, 2])\n    img_temp = img_patch.unsqueeze(1).tile([1, stroke_num, 1, 1,\n                                            1]).reshape([img_patch.shape[0] * stroke_num, 3, patch_size, patch_size])\n    color = nn.functional.grid_sample(\n        img_temp, 2 * grid - 1, align_corners=False).reshape([img_patch.shape[0], stroke_num, 3])\n    stroke_param = paddle.concat([shape_param, color], axis=-1)\n\n    param = stroke_param.reshape([-1, 8])\n    decision = stroke_decision.reshape([-1]).astype('bool')\n    param[:, :2] = param[:, :2] / 1.25 + 0.1\n    param[:, 2:4] = param[:, 2:4] / 1.25\n    return param, decision\n\n\ndef sort_strokes(params, decision, scores):\n    \"\"\"\n    sort_strokes\n    \"\"\"\n    sorted_scores, sorted_index = paddle.sort(scores, axis=1, descending=False)\n    sorted_params = []\n    for idx in range(8):\n        tmp_pick_params = paddle.gather(params[:, :, idx], axis=1, index=sorted_index)\n        sorted_params.append(tmp_pick_params)\n    sorted_params = paddle.stack(sorted_params, axis=2)\n    sorted_decison = paddle.gather(decision.squeeze(2), axis=1, index=sorted_index)\n    return sorted_params, sorted_decison\n\n\ndef render_serial(original_img, net_g, meta_brushes):\n\n    patch_size = 32\n    stroke_num = 8\n    H, W = 
original_img.shape[-2:]\n    K = max(math.ceil(math.log2(max(H, W) / patch_size)), 0)\n\n    dilation = Dilation2d(m=1)\n    erosion = Erosion2d(m=1)\n    frames_per_layer = [20, 20, 30, 40, 60]\n    final_frame_list = []\n\n    with paddle.no_grad():\n        #* ----- read in image and init canvas ----- *#\n        final_result = paddle.zeros_like(original_img)\n\n        for layer in range(0, K + 1):\n            t0 = time.time()\n            layer_size = patch_size * (2**layer)\n\n            img = nn.functional.interpolate(original_img, (layer_size, layer_size))\n            result = nn.functional.interpolate(final_result, (layer_size, layer_size))\n            img_patch = nn.functional.unfold(img, [patch_size, patch_size], strides=[patch_size, patch_size])\n            result_patch = nn.functional.unfold(result, [patch_size, patch_size], strides=[patch_size, patch_size])\n            h = (img.shape[2] - patch_size) // patch_size + 1\n            w = (img.shape[3] - patch_size) // patch_size + 1\n            render_size_y = int(1.25 * H // h)\n            render_size_x = int(1.25 * W // w)\n\n            #* -------------------------------------------------------------*#\n            #* -------------generate strokes on window type A---------------*#\n            #* -------------------------------------------------------------*#\n            param, decision = stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num)\n            expand_img = original_img\n            wA_xid_list, wA_yid_list, wA_fore_list, wA_alpha_list, wA_error_list, wA_params = \\\n                get_single_layer_lists(param, decision, original_img, render_size_x, render_size_y, h, w,\n                                        meta_brushes, dilation, erosion, stroke_num)\n\n            #* -------------------------------------------------------------*#\n            #* -------------generate strokes on window type B---------------*#\n            #* 
-------------------------------------------------------------*#\n            #*----- generate input canvas and target patches -----*#\n            wB_error_list = []\n\n            img = nn.functional.pad(img, [patch_size // 2, patch_size // 2, patch_size // 2, patch_size // 2])\n            result = nn.functional.pad(result, [patch_size // 2, patch_size // 2, patch_size // 2, patch_size // 2])\n            img_patch = nn.functional.unfold(img, [patch_size, patch_size], strides=[patch_size, patch_size])\n            result_patch = nn.functional.unfold(result, [patch_size, patch_size], strides=[patch_size, patch_size])\n            h += 1\n            w += 1\n\n            param, decision = stroke_net_predict(img_patch, result_patch, patch_size, net_g, stroke_num)\n\n            patch_y = 4 * render_size_y // 5\n            patch_x = 4 * render_size_x // 5\n            expand_img = nn.functional.pad(original_img, [patch_x // 2, patch_x // 2, patch_y // 2, patch_y // 2])\n            wB_xid_list, wB_yid_list, wB_fore_list, wB_alpha_list, wB_error_list, wB_params = \\\n                get_single_layer_lists(param, decision, expand_img, render_size_x, render_size_y, h, w,\n                                        meta_brushes, dilation, erosion, stroke_num)\n            #* -------------------------------------------------------------*#\n            #* -------------rank strokes and plot stroke one by one---------*#\n            #* -------------------------------------------------------------*#\n            numA = len(wA_error_list)\n            numB = len(wB_error_list)\n            total_error_list = wA_error_list + wB_error_list\n            sort_list = list(np.argsort(total_error_list))\n\n            sample = 0\n            samples = np.linspace(0, len(sort_list) - 2, frames_per_layer[layer]).astype(int)\n            for ii in sort_list:\n                ii = int(ii)\n                if ii < numA:\n                    x_id = wA_xid_list[ii]\n                    y_id 
= wA_yid_list[ii]\n                    valid_foregrounds = wA_fore_list[ii]\n                    valid_alphas = wA_alpha_list[ii]\n                    sparam = wA_params[ii]\n                    tmp_foreground, tmp_alpha = get_single_stroke_on_full_image_A(\n                        x_id, y_id, valid_foregrounds, valid_alphas, sparam, original_img, render_size_x, render_size_y,\n                        patch_x, patch_y)\n                else:\n                    x_id = wB_xid_list[ii - numA]\n                    y_id = wB_yid_list[ii - numA]\n                    valid_foregrounds = wB_fore_list[ii - numA]\n                    valid_alphas = wB_alpha_list[ii - numA]\n                    sparam = wB_params[ii - numA]\n                    tmp_foreground, tmp_alpha = get_single_stroke_on_full_image_B(\n                        x_id, y_id, valid_foregrounds, valid_alphas, sparam, original_img, render_size_x, render_size_y,\n                        patch_x, patch_y)\n\n                final_result = tmp_foreground * tmp_alpha + (1 - tmp_alpha) * final_result\n                if sample in samples:\n                    saveframe = (final_result.numpy().squeeze().transpose([1, 2, 0])[:, :, ::-1] * 255).astype(np.uint8)\n                    final_frame_list.append(saveframe)\n                    #saveframe = cv2.resize(saveframe, (ow, oh))\n\n                sample += 1\n            print(\"layer %d cost: %.02f\" % (layer, time.time() - t0))\n\n        saveframe = (final_result.numpy().squeeze().transpose([1, 2, 0])[:, :, ::-1] * 255).astype(np.uint8)\n        final_frame_list.append(saveframe)\n    return final_frame_list\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/render_utils.py",
    "content": "import paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport cv2\nimport numpy as np\nfrom PIL import Image\nimport math\n\n\nclass Erosion2d(nn.Layer):\n    \"\"\"\n    Erosion2d\n    \"\"\"\n\n    def __init__(self, m=1):\n        super(Erosion2d, self).__init__()\n        self.m = m\n        self.pad = [m, m, m, m]\n\n    def forward(self, x):\n        batch_size, c, h, w = x.shape\n        x_pad = F.pad(x, pad=self.pad, mode='constant', value=1e9)\n        channel = nn.functional.unfold(x_pad, 2 * self.m + 1, strides=1, paddings=0).reshape([batch_size, c, -1, h, w])\n        result = paddle.min(channel, axis=2)\n        return result\n\n\nclass Dilation2d(nn.Layer):\n    \"\"\"\n    Dilation2d\n    \"\"\"\n\n    def __init__(self, m=1):\n        super(Dilation2d, self).__init__()\n        self.m = m\n        self.pad = [m, m, m, m]\n\n    def forward(self, x):\n        batch_size, c, h, w = x.shape\n        x_pad = F.pad(x, pad=self.pad, mode='constant', value=-1e9)\n        channel = nn.functional.unfold(x_pad, 2 * self.m + 1, strides=1, paddings=0).reshape([batch_size, c, -1, h, w])\n        result = paddle.max(channel, axis=2)\n        return result\n\n\ndef param2stroke(param, H, W, meta_brushes):\n    \"\"\"\n    param2stroke\n    \"\"\"\n    b = param.shape[0]\n    param_list = paddle.split(param, 8, axis=1)\n    x0, y0, w, h, theta = [item.squeeze(-1) for item in param_list[:5]]\n    sin_theta = paddle.sin(math.pi * theta)\n    cos_theta = paddle.cos(math.pi * theta)\n    index = paddle.full((b, ), -1, dtype='int64').numpy()\n\n    index[(h > w).numpy()] = 0\n    index[(h <= w).numpy()] = 1\n    meta_brushes_resize = F.interpolate(meta_brushes, (H, W)).numpy()\n    brush = paddle.to_tensor(meta_brushes_resize[index])\n\n    warp_00 = cos_theta / w\n    warp_01 = sin_theta * H / (W * w)\n    warp_02 = (1 - 2 * x0) * cos_theta / w + (1 - 2 * y0) * sin_theta * H / (W * w)\n    warp_10 = -sin_theta * W / (H * h)\n    
warp_11 = cos_theta / h\n    warp_12 = (1 - 2 * y0) * cos_theta / h - (1 - 2 * x0) * sin_theta * W / (H * h)\n    warp_0 = paddle.stack([warp_00, warp_01, warp_02], axis=1)\n    warp_1 = paddle.stack([warp_10, warp_11, warp_12], axis=1)\n    warp = paddle.stack([warp_0, warp_1], axis=1)\n    grid = nn.functional.affine_grid(warp, [b, 3, H, W])  # paddle and torch use opposite defaults here\n    brush = nn.functional.grid_sample(brush, grid)\n    return brush\n\n\ndef read_img(img_path, img_type='RGB', h=None, w=None):\n    \"\"\"\n    read img\n    \"\"\"\n    img = Image.open(img_path).convert(img_type)\n    if h is not None and w is not None:\n        img = img.resize((w, h), resample=Image.NEAREST)\n    img = np.array(img)\n    if img.ndim == 2:\n        img = np.expand_dims(img, axis=-1)\n    img = img.transpose((2, 0, 1))\n    img = paddle.to_tensor(img).unsqueeze(0).astype('float32') / 255.\n    return img\n\n\ndef preprocess(img, w=512, h=512):\n    image = cv2.resize(img, (w, h), interpolation=cv2.INTER_NEAREST)\n    image = image.transpose((2, 0, 1))\n    image = paddle.to_tensor(image).unsqueeze(0).astype('float32') / 255.\n    return image\n\n\ndef totensor(img):\n    image = img.transpose((2, 0, 1))\n    image = paddle.to_tensor(image).unsqueeze(0).astype('float32') / 255.\n    return image\n\n\ndef pad(img, H, W):\n    b, c, h, w = img.shape\n    pad_h = (H - h) // 2\n    pad_w = (W - w) // 2\n    remainder_h = (H - h) % 2\n    remainder_w = (W - w) % 2\n    expand_img = nn.functional.pad(img, [pad_w, pad_w + remainder_w, pad_h, pad_h + remainder_h])\n    return expand_img\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/requirements.txt",
    "content": "ppgan\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/paint_transformer/util.py",
    "content": "import base64\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/README.md",
    "content": "# psgan\n\n|模型名称|psgan|\n| :--- | :---: |\n|类别|图像 - 妆容迁移|\n|网络|PSGAN|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|121MB|\n|最新更新日期|2021-12-07|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/157190651-595b6964-97c5-4b0b-ac0a-c30c8520a972.png\"  width = \"30%\"  hspace='10'/>\n    <br />\n    输入内容图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/145003966-c5c2e6ad-d306-4eaf-89a2-965a3dbf3675.jpg\"  width = \"30%\" hspace='10'/>\n    <br />\n    输入妆容图形\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/157190800-b1dd79d4-0eca-4b36-b091-6fcd2c00dcf6.png\"  width = \"30%\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - PSGAN模型的任务是妆容迁移， 即将任意参照图像上的妆容迁移到不带妆容的源图像上。很多人像美化应用都需要这种技术。\n\n  - 更多详情参考：[PSGAN: Pose and Expression Robust Spatial-Aware GAN for Customizable Makeup Transfer](https://arxiv.org/pdf/1909.06956.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - ppgan\n  - dlib\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install psgan\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run psgan --content \"/PATH/TO/IMAGE\" --style \"/PATH/TO/IMAGE1\"\n    ```\n  - 通过命令行方式实现妆容转换模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"psgan\")\n    content = cv2.imread(\"/PATH/TO/IMAGE\")\n    style = cv2.imread(\"/PATH/TO/IMAGE1\")\n    results = module.makeup_transfer(images=[{'content':content, 'style':style}], output_dir='./transfer_result', use_gpu=True)\n    ```\n\n- ### 3、API\n\n  - 
```python\n    makeup_transfer(images=None, paths=None, output_dir='./transfer_result/', use_gpu=False, visualization=True)\n    ```\n    - 妆容风格转换API。\n\n    - **参数**\n\n      - images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 content, style, 相应取值为：\n        - content (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n        - style (numpy.ndarray) : 风格图像，shape为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list[str]): paths to images, 每一个元素都为一个dict, 有关键字 content, style, 相应取值为：\n        - content (str): 待转换的图片的路径；<br/>\n        - style (str) : 风格图像的路径；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization(bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线妆容风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m psgan\n    ```\n\n  - 这样就完成了一个妆容风格转换的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[{'content': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\")), 'style': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/psgan\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install psgan==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/makeup.yaml",
    "content": "epochs: 100\noutput_dir: tmp\ncheckpoints_dir: checkpoints\nfind_unused_parameters: True\n\nmodel:\n  name: MakeupModel\n  generator:\n    name: GeneratorPSGANAttention\n    conv_dim: 64\n    repeat_num: 6\n  discriminator:\n    name: NLayerDiscriminator\n    ndf: 64\n    n_layers: 3\n    input_nc: 3\n    norm_type: spectral\n  cycle_criterion:\n    name: L1Loss\n  idt_criterion:\n    name: L1Loss\n    loss_weight: 0.5\n  l1_criterion:\n    name: L1Loss\n  l2_criterion:\n    name: MSELoss\n  gan_criterion:\n    name: GANLoss\n    gan_mode: lsgan\n\ndataset:\n  train:\n    name: MakeupDataset\n    trans_size: 256\n    dataroot: data/MT-Dataset\n    cls_list: [non-makeup, makeup]\n    phase: train\n  test:\n    name: MakeupDataset\n    trans_size: 256\n    dataroot: data/MT-Dataset\n    cls_list: [non-makeup, makeup]\n    phase: test\n\n\nlr_scheduler:\n  name: LinearDecay\n  learning_rate: 0.0002\n  start_epoch: 100\n  decay_epochs: 100\n  # will get from real dataset\n  iters_per_epoch: 1\n\noptimizer:\n  optimizer_G:\n    name: Adam\n    net_names:\n      - netG\n    beta1: 0.5\n  optimizer_DA:\n    name: Adam\n    net_names:\n      - netD_A\n    beta1: 0.5\n  optimizer_DB:\n    name: Adam\n    net_names:\n      - netD_B\n    beta1: 0.5\n\nlog_config:\n  interval: 10\n  visiual_interval: 500\n\nsnapshot_config:\n  interval: 5\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/model.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport os\nimport sys\nfrom pathlib import Path\n\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport ppgan.faceutils as futils\nfrom paddle.utils.download import get_weights_path_from_url\nfrom PIL import Image\nfrom ppgan.models.builder import build_model\nfrom ppgan.utils.config import get_config\nfrom ppgan.utils.filesystem import load\nfrom ppgan.utils.options import parse_args\nfrom ppgan.utils.preprocess import *\n\n\ndef toImage(net_output):\n    img = net_output.squeeze(0).transpose((1, 2, 0)).numpy()  # [1,c,h,w]->[h,w,c]\n    img = (img * 255.0).clip(0, 255)\n    img = np.uint8(img)\n    img = Image.fromarray(img, mode='RGB')\n    return img\n\n\nPS_WEIGHT_URL = \"https://paddlegan.bj.bcebos.com/models/psgan_weight.pdparams\"\n\n\nclass PreProcess:\n    def __init__(self, config, need_parser=True):\n        self.img_size = 256\n        self.transform = transform = T.Compose([\n            T.Resize(size=256),\n            T.ToTensor(),\n        ])\n        self.norm = T.Normalize([0.5, 0.5, 0.5], [0.5, 0.5, 0.5])\n        if need_parser:\n            self.face_parser = futils.mask.FaceParser()\n        self.up_ratio = 0.6 / 0.85\n        self.down_ratio = 0.2 / 0.85\n        self.width_ratio = 0.2 / 0.85\n\n    def __call__(self, image):\n        face = 
futils.dlib.detect(image)\n\n        if not face:\n            return\n        face_on_image = face[0]\n        image, face, crop_face = futils.dlib.crop(image, face_on_image, self.up_ratio, self.down_ratio,\n                                                  self.width_ratio)\n        np_image = np.array(image)\n        image_trans = self.transform(np_image)\n        mask = self.face_parser.parse(np.float32(cv2.resize(np_image, (512, 512))))\n        mask = cv2.resize(mask.numpy(), (self.img_size, self.img_size), interpolation=cv2.INTER_NEAREST)\n        mask = mask.astype(np.uint8)\n        mask_tensor = paddle.to_tensor(mask)\n\n        lms = futils.dlib.landmarks(image, face) / image_trans.shape[:2] * self.img_size\n        lms = lms.round()\n\n        P_np = generate_P_from_lmks(lms, self.img_size, self.img_size, self.img_size)\n\n        mask_aug = generate_mask_aug(mask, lms)\n\n        return [self.norm(image_trans).unsqueeze(0),\n                np.float32(mask_aug),\n                np.float32(P_np),\n                np.float32(mask)], face_on_image, crop_face\n\n\nclass PostProcess:\n    def __init__(self, config):\n        self.denoise = True\n        self.img_size = 256\n\n    def __call__(self, source: Image, result: Image):\n        # TODO: Refactor -> name, resize\n        source = np.array(source)\n        result = np.array(result)\n\n        height, width = source.shape[:2]\n        small_source = cv2.resize(source, (self.img_size, self.img_size))\n        laplacian_diff = source.astype(np.float64) - cv2.resize(small_source, (width, height)).astype(np.float64)\n        result = (cv2.resize(result, (width, height)) + laplacian_diff).round().clip(0, 255).astype(np.uint8)\n        if self.denoise:\n            result = cv2.fastNlMeansDenoisingColored(result)\n        result = Image.fromarray(result).convert('RGB')\n        return result\n\n\nclass Inference:\n    def __init__(self, config, model_path=''):\n        self.model = build_model(config.model)\n   
     self.preprocess = PreProcess(config)\n        self.model_path = model_path\n\n    def transfer(self, source, reference, with_face=False):\n        source_input, face, crop_face = self.preprocess(source)\n        reference_input, face, crop_face = self.preprocess(reference)\n\n        consis_mask = np.float32(calculate_consis_mask(source_input[1], reference_input[1]))\n        consis_mask = paddle.to_tensor(np.expand_dims(consis_mask, 0))\n\n        if not (source_input and reference_input):\n            if with_face:\n                return None, None\n            return\n\n        for i in range(1, len(source_input) - 1):\n            source_input[i] = paddle.to_tensor(np.expand_dims(source_input[i], 0))\n\n        for i in range(1, len(reference_input) - 1):\n            reference_input[i] = paddle.to_tensor(np.expand_dims(reference_input[i], 0))\n\n        input_data = {\n            'image_A': source_input[0],\n            'image_B': reference_input[0],\n            'mask_A_aug': source_input[1],\n            'mask_B_aug': reference_input[1],\n            'P_A': source_input[2],\n            'P_B': reference_input[2],\n            'consis_mask': consis_mask\n        }\n\n        state_dicts = load(self.model_path)\n        for net_name, net in self.model.nets.items():\n            net.set_state_dict(state_dicts[net_name])\n        result, _ = self.model.test(input_data)\n        min_, max_ = result.min(), result.max()\n        result += -min_\n        result = paddle.divide(result, max_ - min_ + 1e-5)\n        img = toImage(result)\n\n        if with_face:\n            return img, crop_face\n\n        return img\n\n\nclass PSGANPredictor:\n    def __init__(self, cfg, weight_path):\n        self.cfg = cfg\n        self.weight_path = weight_path\n\n    def run(self, source, reference):\n        source = Image.fromarray(source)\n        reference = Image.fromarray(reference)\n        inference = Inference(self.cfg, self.weight_path)\n        postprocess = 
PostProcess(self.cfg)\n\n        # Transfer the makeup from the reference image to the source image.\n        image, face = inference.transfer(source, reference, with_face=True)\n        source_crop = source.crop((face.left(), face.top(), face.right(), face.bottom()))\n        image = postprocess(source_crop, image)\n        image = np.array(image)\n        return image\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom ppgan.utils.config import get_config\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PSGANPredictor\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"psgan\", type=\"CV/gan\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass psgan:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"psgan_weight.pdparams\")\n        cfg = get_config(os.path.join(self.directory, 'makeup.yaml'))\n        self.network = PSGANPredictor(cfg, self.pretrained_model)\n\n    def makeup_transfer(self,\n                        images=None,\n                        paths=None,\n                        output_dir='./transfer_result/',\n                        use_gpu=False,\n                        visualization=True):\n        '''\n        Transfer makeup from a style image onto a content image.\n\n        images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 content, style, 相应取值为：\n          - content (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n          - 
style (numpy.ndarray) : 妆容图像，shape为 \\[H, W, C\\]，BGR格式；<br/>\n        paths (list[str]): paths to images, 每一个元素都为一个dict, 有关键字 content, style, 相应取值为：\n          - content (str): 待转换的图片的路径；<br/>\n          - style (str) : 妆容图像的路径；<br/>\n\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        visualization: if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images == None and paths == None:\n            print('No image provided. Please input an image or a image path.')\n            return\n\n        if images != None:\n            for image_dict in images:\n                content_img = image_dict['content'][:, :, ::-1]\n                style_img = image_dict['style'][:, :, ::-1]\n                results.append(self.network.run(content_img, style_img))\n\n        if paths != None:\n            for path_dict in paths:\n                content_img = cv2.imread(path_dict['content'])[:, :, ::-1]\n                style_img = cv2.imread(path_dict['style'])[:, :, ::-1]\n                results.append(self.network.run(content_img, style_img))\n\n        if visualization == True:\n            if not os.path.exists(output_dir):\n                os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = 
self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.makeup_transfer(\n            paths=[{\n                'content': self.args.content,\n                'style': self.args.style\n            }],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['style'] = base64_to_cv2(image['style'])\n        results = self.makeup_transfer(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='transfer_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', type=bool, default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content image.\")\n        self.arg_input_group.add_argument('--style', type=str, help=\"path to style image.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/requirements.txt",
    "content": "ppgan\ndlib\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/psgan/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/README.md",
    "content": "# stylepro_artistic\n\n|模型名称|stylepro_artistic|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|StyleProNet|\n|数据集|MS-COCO + WikiArt|\n|是否支持Fine-tuning|否|\n|模型大小|28MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://paddlehub.bj.bcebos.com/resources/style.png\"  width='80%' hspace='10'/> <br />\n    </p>\n\n- ### 模型介绍\n\n  - 艺术风格迁移模型可以将给定的图像转换为任意的艺术风格。本模型StyleProNet整体采用全卷积神经网络架构(FCNs)，通过encoder-decoder重建艺术风格图片。StyleProNet的核心是无参数化的内容-风格融合算法Style Projection，模型规模小，响应速度快。模型训练的损失函数包含style loss、content perceptual loss以及content KL loss，确保模型高保真还原内容图片的语义细节信息与风格图片的风格信息。预训练数据集采用MS-COCO数据集作为内容端图像，WikiArt数据集作为风格端图像。更多详情请参考StyleProNet论文[https://arxiv.org/abs/2003.07694](https://arxiv.org/abs/2003.07694)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0     | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stylepro_artistic\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run stylepro_artistic --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现风格转换模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    result = stylepro_artistic.style_transfer(\n    images=[{\n        'content': cv2.imread('/PATH/TO/CONTENT_IMAGE'),\n        'styles': [cv2.imread('/PATH/TO/STYLE_IMAGE')]\n    }])\n\n    # or\n    # result = stylepro_artistic.style_transfer(\n    #     paths=[{\n    #         'content': '/PATH/TO/CONTENT_IMAGE',\n    #         'styles': ['/PATH/TO/STYLE_IMAGE']\n    # 
    }])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       alpha=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='transfer_result')\n    ```\n\n    - 对图片进行风格转换。\n\n    - **参数**\n      - images (list\\[dict\\]): ndarray 格式的图片数据。每一个元素都为一个 dict，有关键字 content, styles, weights(可选)，相应取值为：\n        - content (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n        - styles (list\\[numpy.ndarray\\]) : 作为底色的风格图片组成的列表，各个图片数组的shape 都是 \\[H, W, C\\]，BGR格式；<br/>\n        - weights (list\\[float\\], optional) : 各个 style 对应的权重。当不设置 weights 时，默认各个 style 有着相同的权重；<br/>\n      - paths (list\\[dict\\]): 图片的路径。每一个元素都为一个 dict，有关键字 content, styles, weights(可选)，相应取值为：\n        - content (str): 待转换的图片的路径；<br/>\n        - styles (list\\[str\\]) : 作为底色的风格图片的路径；<br/>\n        - weights (list\\[float\\], optional) : 各个 style 对应的权重。当不设置 weights 时，各个 style 的权重相同；<br/>\n      - alpha (float) : 转换的强度，\\[0, 1\\] 之间，默认值为1；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否将结果保存为图片，默认为 False; <br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 transfer\\_result。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 OrderedDict，关键字有 data, save\\_path，相应取值为：\n        - save\\_path (str): 保存图片的路径\n        - data (numpy.ndarray): 风格转换的图片数据\n\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 保存模型的目录名称； <br/>\n      - model\\_filename: 模型文件名称，默认为\\_\\_model\\_\\_； <br/>\n      - params\\_filename: 参数文件名称，默认为\\_\\_params\\_\\_(仅当`combined`为True时生效)；<br/>\n      - combined: 是否将参数保存到统一的一个文件中。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线风格转换服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m stylepro_artistic\n    ```\n\n  - 这样就完成了一个风格转换服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n      data = base64.b64decode(b64str.encode('utf8'))\n      data = np.frombuffer(data, np.uint8)\n      data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n      return data\n\n    # 发送HTTP请求\n    data = {'images':[\n    {\n        'content':cv2_to_base64(cv2.imread('/PATH/TO/CONTENT_IMAGE')),\n        'styles':[cv2_to_base64(cv2.imread('/PATH/TO/STYLE_IMAGE'))]\n    }\n    ]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stylepro_artistic\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(base64_to_cv2(r.json()[\"results\"][0]['data']))\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.3\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install stylepro_artistic==1.0.3\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/README_en.md",
    "content": "# stylepro_artistic\n\n|Module Name|stylepro_artistic|\n| :--- | :---: |\n|Category|image generation|\n|Network|StyleProNet|\n|Dataset|MS-COCO + WikiArt|\n|Fine-tuning supported or not|No|\n|Module Size|28MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://paddlehub.bj.bcebos.com/resources/style.png\"  width='80%' hspace='10'/> <br />\n    </p>\n\n- ### Module Introduction\n\n  - StyleProNet is a model for style transfer, which is light-weight and responds quickly. This module is based on StyleProNet, trained on WikiArt(MS-COCO) and WikiArt(style) datasets, and can be used for style transfer. For more information, please refer to [StyleProNet](https://arxiv.org/abs/2003.07694).\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0     | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install stylepro_artistic\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run stylepro_artistic --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    stylepro_artistic = hub.Module(name=\"stylepro_artistic\")\n    result = stylepro_artistic.style_transfer(\n    images=[{\n        
'content': cv2.imread('/PATH/TO/CONTENT_IMAGE'),\n        'styles': [cv2.imread('/PATH/TO/STYLE_IMAGE')]\n    }])\n\n    # or\n    # result = stylepro_artistic.style_transfer(\n    #     paths=[{\n    #         'content': '/PATH/TO/CONTENT_IMAGE',\n    #         'styles': ['/PATH/TO/STYLE_IMAGE']\n    #     }])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def style_transfer(images=None,\n                       paths=None,\n                       alpha=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='transfer_result')\n    ```\n\n    - Style transfer API.\n\n    - **Parameters**\n      - images (list\\[dict\\]): each element is a dict that includes:\n        - content (numpy.ndarray): input image array, shape is \\[H, W, C\\], BGR format;<br/>\n        - styles (list\\[numpy.ndarray\\]) : list of style image arrays, each with shape \\[H, W, C\\], BGR format;<br/>\n        - weights (list\\[float\\], optional) : weight for each style; if not set, each style has the same weight;<br/>\n      - paths (list\\[dict\\]): each element is a dict that includes:\n        - content (str): path for input image;<br/>\n        - styles (list\\[str\\]) : paths for style images;<br/>\n        - weights (list\\[float\\], optional) : weight for each style; if not set, each style has the same weight;<br/>\n      - alpha (float) : stylization strength within \\[0, 1\\], default is 1<br/>\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n\n      **NOTE:** choose either paths or images to provide the input data\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - save\\_path (str): save path of the output image\n        - data (numpy.ndarray): output image\n\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: output dir for saving model\n      - model\\_filename: filename for saving model\n      - params\\_filename: filename for saving parameters\n      - combined: whether to save parameters into one file\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of style transfer.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m stylepro_artistic\n    ```\n\n  - The style transfer service API is now deployed; the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it is not required.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n      data = base64.b64decode(b64str.encode('utf8'))\n      data = np.frombuffer(data, np.uint8)\n      data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n      return data\n\n    # Send an HTTP request\n    data = {'images':[\n    {\n        'content':cv2_to_base64(cv2.imread('/PATH/TO/CONTENT_IMAGE')),\n        'styles':[cv2_to_base64(cv2.imread('/PATH/TO/STYLE_IMAGE'))]\n    }\n    ]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stylepro_artistic\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(base64_to_cv2(r.json()[\"results\"][0]['data']))\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.3\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install stylepro_artistic==1.0.3\n    ```\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to get image data.\n\n    Args:\n        images (list): list of dict objects, each dict contains key:\n            content(str): value is a numpy.ndarry with shape [H, W, C], content data.\n            styles(str): value is a list of numpy.ndarray with shape [H, W, C], styles data.\n            weights(str, optional): value is the interpolation weights correspond to styles.\n        paths (list): list of dict objects, each dict contains key:\n            content(str): value is the path to content.\n            styles(str): value is the paths to styles.\n            weights(str, optional): value is the interpolation weights correspond to styles.\n    Yield:\n        im (numpy.ndarray): preprocessed data, with shape (1, 3, 512, 512).\n    \"\"\"\n    pipeline_list = list()\n    # images\n    for key, data in [('im_arr', images), ('im_path', paths)]:\n        if data is not None:\n            for component in data:\n                each_res = OrderedDict()\n                # content_arr\n                each_res['content_arr'], w, h = _handle_single(**{key: component['content']})\n                # styles_arr_list\n                styles_list = component['styles']\n                styles_num = len(styles_list)\n                each_res['styles_arr_list'] = []\n                for i, style_arr in enumerate(styles_list):\n                    each_res['styles_arr_list'].append(_handle_single(**{key: style_arr})[0])\n                # style_interpolation_weights\n                if 'weights' in component:\n                    assert len(component['weights']\n                               ) == styles_num, \"The number of weights must be equal to the number of styles.\"\n                    each_res['style_interpolation_weights'] = 
component['weights']\n                else:\n                    each_res['style_interpolation_weights'] = np.ones(styles_num)\n                each_res['style_interpolation_weights'] = [\n                    each_res['style_interpolation_weights'][j] / sum(each_res['style_interpolation_weights'])\n                    for j in range(styles_num)\n                ]\n                pipeline_list.append([each_res, w, h])\n\n    # yield\n    for element in pipeline_list:\n        yield element\n\n\ndef _handle_single(im_path=None, im_arr=None):\n    \"\"\"\n    Preprocess to get image data.\n    Args:\n        im_path (str): path to image.\n        im_arr (numpy.ndarray): image data, with shape (H, W, 3).\n    Returns:\n        im (numpy.ndarray): preprocessed data, with shape (1, 3, 512, 512).\n    \"\"\"\n    im = None\n    if im_path is not None:\n        im = cv2.imread(im_path)\n        if im is None:\n            raise FileNotFoundError('Error: The file path \"{}\"  may not exist or is not a valid image file, please provide a valid path.'.format(im_path))\n        else:\n            assert(len(im.shape) == 3, 'The input image shape should be [H, W, 3], but got {}'.format(im.shape))\n            assert(im.shape[2] == 3,  'The input image should have 3 channels, but got {}'.format(im.shape[2]))\n            im = im[:, :, ::-1].astype(np.float32)    ### Image should have 3-channels, and BGR format is arranged by cv2, we should change it to RGB.\n    if im_arr is not None:\n        im = im_arr[:, :, ::-1].astype(np.float32)\n    if im is None:\n        raise ValueError('No image data is provided. Please check the input \"images\" and \"paths\".')\n    w, h = im.shape[1], im.shape[0]\n    im = cv2.resize(im, (512, 512), interpolation=cv2.INTER_LINEAR)\n    im = im.transpose((2, 0, 1))\n    im = np.expand_dims(im, axis=0)\n    im /= 255.0\n    return im, w, h\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport copy\nimport os\nimport time\n\nimport numpy as np\nimport paddle\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom stylepro_artistic.data_feed import reader\nfrom stylepro_artistic.processor import base64_to_cv2\nfrom stylepro_artistic.processor import cv2_to_base64\nfrom stylepro_artistic.processor import fr\nfrom stylepro_artistic.processor import postprocess\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n# coding=utf-8\n\n\n@moduleinfo(\n    name=\"stylepro_artistic\",\n    version=\"1.0.3\",\n    type=\"cv/style_transfer\",\n    summary=\"StylePro Artistic is an algorithm for Arbitrary image style, which is parameter-free, fast yet effective.\",\n    author=\"baidu-bdl\",\n    author_email=\"\")\nclass StyleProjection(hub.Module):\n\n    def _initialize(self):\n        self.pretrained_encoder_net = os.path.join(self.directory, \"style_projection_enc\")\n        self.pretrained_decoder_net = os.path.join(self.directory, \"style_projection_dec\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        # encoder\n        cpu_config_enc = Config(self.pretrained_encoder_net)\n        cpu_config_enc.disable_glog_info()\n        cpu_config_enc.disable_gpu()\n        self.cpu_predictor_enc = create_predictor(cpu_config_enc)\n        # decoder\n        cpu_config_dec = Config(self.pretrained_decoder_net)\n        cpu_config_dec.disable_glog_info()\n        cpu_config_dec.disable_gpu()\n        self.cpu_predictor_dec = create_predictor(cpu_config_dec)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        
except:\n            use_gpu = False\n        if use_gpu:\n            # encoder\n            gpu_config_enc = Config(self.pretrained_encoder_net)\n            gpu_config_enc.disable_glog_info()\n            gpu_config_enc.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor_enc = create_predictor(gpu_config_enc)\n            # decoder\n            gpu_config_dec = Config(self.pretrained_decoder_net)\n            gpu_config_dec.disable_glog_info()\n            gpu_config_dec.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor_dec = create_predictor(gpu_config_dec)\n\n    def style_transfer(self,\n                       images=None,\n                       paths=None,\n                       alpha=1,\n                       use_gpu=False,\n                       output_dir='transfer_result',\n                       visualization=False):\n        \"\"\"\n        API for image style transfer.\n\n        Args:\n            images (list): list of dict objects, each dict contains key:\n                content(str): value is a numpy.ndarry with shape [H, W, C], content data.\n                styles(str): value is a list of numpy.ndarray with shape [H, W, C], styles data.\n                weights(str, optional): value is the interpolation weights correspond to styles.\n            paths (list): list of dict objects, each dict contains key:\n                content(str): value is the path to content.\n                styles(str): value is the paths to styles.\n                weights(str, optional): value is the interpolation weights correspond to styles.\n            alpha (float): The weight that controls the degree of stylization. 
Should be between 0 and 1.\n            use_gpu (bool): whether to use gpu.\n            output_dir (str): the path to store output images.\n            visualization (bool): whether to save image or not.\n\n        Returns:\n            im_output (list[dict()]): list of output images and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        predictor_enc = self.gpu_predictor_enc if use_gpu else self.cpu_predictor_enc\n        input_names_enc = predictor_enc.get_input_names()\n        input_handle_enc = predictor_enc.get_input_handle(input_names_enc[0])\n        output_names_enc = predictor_enc.get_output_names()\n        output_handle_enc = predictor_enc.get_output_handle(output_names_enc[0])\n\n        predictor_dec = self.gpu_predictor_dec if use_gpu else self.cpu_predictor_dec\n        input_names_dec = predictor_dec.get_input_names()\n        input_handle_dec = predictor_dec.get_input_handle(input_names_dec[0])\n        output_names_dec = predictor_dec.get_output_names()\n        output_handle_dec = predictor_dec.get_output_handle(output_names_dec[0])\n\n        im_output = []\n        for component, w, h in reader(images, paths):\n            input_handle_enc.copy_from_cpu(component['content_arr'])\n            predictor_enc.run()\n            content_feats = output_handle_enc.copy_to_cpu()\n            accumulate = np.zeros((3, 512, 512))\n            for idx, style_arr in enumerate(component['styles_arr_list']):\n                # encode\n                input_handle_enc.copy_from_cpu(style_arr)\n                predictor_enc.run()\n                style_feats = output_handle_enc.copy_to_cpu()\n 
               fr_feats = fr(content_feats, style_feats, alpha)\n                # decode\n                input_handle_dec.copy_from_cpu(fr_feats)\n                predictor_dec.run()\n                predict_outputs = output_handle_dec.copy_to_cpu()\n                # interpolation\n                accumulate += predict_outputs[0] * component['style_interpolation_weights'][idx]\n            # postprocess\n            save_im_name = 'ndarray_{}.jpg'.format(time.time())\n            result = postprocess(accumulate, output_dir, save_im_name, visualization, size=(w, h))\n            im_output.append(result)\n        return im_output\n\n    def save_inference_model(self, dirname, model_filename=None, params_filename=None, combined=True):\n        encode_dirname = os.path.join(dirname, 'encoder')\n        decode_dirname = os.path.join(dirname, 'decoder')\n        self._save_encode_model(encode_dirname, model_filename, params_filename, combined)\n        self._save_decode_model(decode_dirname, model_filename, params_filename, combined)\n\n    def _save_encode_model(self, dirname, model_filename=None, params_filename=None, combined=True):\n        if combined:\n            model_filename = \"__model__\" if not model_filename else model_filename\n            params_filename = \"__params__\" if not params_filename else params_filename\n        place = paddle.CPUPlace()\n        exe = paddle.Executor(place)\n\n        encode_program, encode_feeded_var_names, encode_target_vars = paddle.static.load_inference_model(\n            dirname=self.pretrained_encoder_net, executor=exe)\n\n        paddle.static.save_inference_model(dirname=dirname,\n                                           main_program=encode_program,\n                                           executor=exe,\n                                           feeded_var_names=encode_feeded_var_names,\n                                           target_vars=encode_target_vars,\n                                           
model_filename=model_filename,\n                                           params_filename=params_filename)\n\n    def _save_decode_model(self, dirname, model_filename=None, params_filename=None, combined=True):\n        if combined:\n            model_filename = \"__model__\" if not model_filename else model_filename\n            params_filename = \"__params__\" if not params_filename else params_filename\n        place = paddle.CPUPlace()\n        exe = paddle.Executor(place)\n\n        decode_program, decode_feeded_var_names, decode_target_vars = paddle.static.load_inference_model(\n            dirname=self.pretrained_decoder_net, executor=exe)\n\n        paddle.static.save_inference_model(dirname=dirname,\n                                           main_program=decode_program,\n                                           executor=exe,\n                                           feeded_var_names=decode_feeded_var_names,\n                                           target_vars=decode_target_vars,\n                                           model_filename=model_filename,\n                                           params_filename=params_filename)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['content'] = base64_to_cv2(image['content'])\n            image['styles'] = [base64_to_cv2(style) for style in image['styles']]\n        results = self.style_transfer(images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                 
                             usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        if args.weights is None:\n            paths = [{'content': args.content, 'styles': args.styles.split(',')}]\n        else:\n            paths = [{'content': args.content, 'styles': args.styles.split(','), 'weights': list(args.weights)}]\n        results = self.style_transfer(paths=paths,\n                                      alpha=args.alpha,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=True)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='transfer_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           
help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--content', type=str, help=\"path to content.\")\n        self.arg_input_group.add_argument('--styles', type=str, help=\"path to styles.\")\n        self.arg_input_group.add_argument('--weights',\n                                          type=ast.literal_eval,\n                                          default=None,\n                                          help=\"interpolation weights of styles.\")\n        self.arg_config_group.add_argument('--alpha',\n                                           type=ast.literal_eval,\n                                           default=1,\n                                           help=\"The parameter to control the transform degree.\")\n"
  },
  {
    "path": "modules/image/Image_gan/style_transfer/stylepro_artistic/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\n\nimport base64\nimport cv2\nimport numpy as np\n\n__all__ = ['postprocess', 'fr']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(im, output_dir, save_im_name, visualization, size):\n    im = np.multiply(im, 255.0) + 0.5\n    im = np.clip(im, 0, 255)\n    im = im.astype(np.uint8)\n    im = im.transpose((1, 2, 0))\n    im = im[:, :, ::-1]\n    im = cv2.resize(im, (size[0], size[1]), interpolation=cv2.INTER_LINEAR)\n    result = {'data': im}\n    if visualization:\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        elif os.path.isfile(output_dir):\n            os.remove(output_dir)\n            os.makedirs(output_dir)\n        # save image\n        save_path = os.path.join(output_dir, save_im_name)\n        try:\n            cv2.imwrite(save_path, im)\n            print('Notice: an image has been processed and saved in path \"{}\".'.format(os.path.abspath(save_path)))\n        except Exception as e:\n            print('Exception {}: Failed to save output image in path \"{}\".'.format(e, os.path.abspath(save_path)))\n        result['save_path'] = save_path\n    return result\n\n\ndef fr(content_feat, style_feat, alpha):\n    content_feat = np.reshape(content_feat, (512, -1))\n    style_feat = np.reshape(style_feat, (512, -1))\n\n    content_feat_index = np.argsort(content_feat, axis=1)\n    style_feat = np.sort(style_feat, axis=1)\n\n    fr_feat = scatter_numpy(dim=1, index=content_feat_index, src=style_feat)\n    fr_feat = fr_feat * alpha + content_feat * (1 - alpha)\n    fr_feat = 
np.reshape(fr_feat, (1, 512, 64, 64))\n    return fr_feat\n\n\ndef scatter_numpy(dim, index, src):\n    \"\"\"\n    Writes all values from the Tensor src into dst at the indices specified in the index Tensor.\n\n    :param dim: The axis along which to index\n    :param index: The indices of elements to scatter\n    :param src: The source element(s) to scatter\n    :return: dst\n    \"\"\"\n    dst = src.copy()\n    idx_xsection_shape = index.shape[:dim] + index.shape[dim + 1:]\n    dst_xsection_shape = dst.shape[:dim] + dst.shape[dim + 1:]\n    if idx_xsection_shape != dst_xsection_shape:\n        raise ValueError(\"Except for dimension \" + str(dim) +\n                         \", all dimensions of index and output should be the same size\")\n    if (index >= dst.shape[dim]).any() or (index < 0).any():\n        raise IndexError(\"The values of index must be between 0 and {}.\".format(dst.shape[dim] - 1))\n\n    def make_slice(arr, dim, i):\n        slc = [slice(None)] * arr.ndim\n        slc[dim] = i\n        return tuple(slc)\n\n    # We use index and dim parameters to create idx\n    # idx is in a form that can be used as a NumPy advanced index for scattering of src param.\n    idx = [[\n        *np.indices(idx_xsection_shape).reshape(index.ndim - 1, -1), index[make_slice(index, dim, i)].reshape(1, -1)[0]\n    ] for i in range(index.shape[dim])]\n    idx = list(np.concatenate(idx, axis=1))\n    idx.insert(dim, idx.pop())\n\n    if not np.isscalar(src):\n        if index.shape[dim] > src.shape[dim]:\n            raise IndexError(\"Dimension \" + str(dim) + \" of index cannot be bigger than that of src\")\n        src_xsection_shape = src.shape[:dim] + src.shape[dim + 1:]\n        if idx_xsection_shape != src_xsection_shape:\n            raise ValueError(\"Except for dimension \" + str(dim) +\n                             \", all dimensions of index and src should be the same size\")\n        # src_idx is a NumPy advanced index for indexing of elements in the 
src\n        src_idx = list(idx)\n        src_idx.pop(dim)\n        src_idx.insert(dim, np.repeat(np.arange(index.shape[dim]), np.prod(idx_xsection_shape)))\n        dst[tuple(idx)] = src[tuple(src_idx)]\n    else:\n        dst[idx] = src\n    return dst\n"
  },
  {
    "path": "modules/image/README.md",
    "content": ""
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/README.md",
    "content": "# DriverStatusRecognition\n\n|模型名称|DriverStatusRecognition|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|MobileNetV3_small_ssld|\n|数据集|分心司机检测数据集|\n|是否支持Fine-tuning|否|\n|模型大小|6MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 驾驶员状态识别（DriverStatusRecognition），该模型可挖掘出人在疲劳状态下的表情特征，然后将这些定性的表情特征进行量化，提取出面部特征点及特征指标作为判断依据，再结合实验数据总结出基于这些参数的识别方法，最后输入获取到的状态数据进行识别和判断。该PaddleHub Module支持API预测及命令行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install DriverStatusRecognition\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n- ### 3、在线体验\n  [AI Studio 快速体验](https://aistudio.baidu.com/aistudio/projectdetail/1649513)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run DriverStatusRecognition --input_path /PATH/TO/IMAGE\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"DriverStatusRecognition\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images：list类型，待检测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install DriverStatusRecognition==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/README_en.md",
    "content": "# DriverStatusRecognition\n\n|Module Name|DriverStatusRecognition|\n| :--- | :---: |\n|Category|image classification|\n|Network|MobileNetV3_small_ssld|\n|Dataset|Distractible Driver Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|6MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module can be used for recognizing distractible drivers by analysing the expression on the face.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install DriverStatusRecognition\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n- ### 3、Online experience\n  [AI Studio](https://aistudio.baidu.com/aistudio/projectdetail/1649513)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run DriverStatusRecognition --input_path /PATH/TO/IMAGE\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"DriverStatusRecognition\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list\\[numpy.ndarray\\]): 
image data, ndarray.shape is in the format [H, W, C], BGR;\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install DriverStatusRecognition==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/assets/model.yml",
    "content": "Model: MobileNetV3_small_ssld\nTransforms:\n- ResizeByShort:\n    max_size: -1\n    short_size: 256\n- CenterCrop:\n    crop_size: 224\n- Normalize:\n    mean:\n    - 0.485\n    - 0.456\n    - 0.406\n    std:\n    - 0.229\n    - 0.224\n    - 0.225\nTransformsMode: RGB\n_Attributes:\n  eval_metrics:\n    acc1: 0.9888542131074454\n  fixed_input_shape: null\n  labels:\n  - ch0\n  - ch1\n  - ch2\n  - ch3\n  - ch4\n  - ch5\n  - ch6\n  - ch7\n  - ch8\n  - ch9\n  model_type: classifier\n  num_classes: 10\n_ModelInputsOutputs:\n  test_inputs:\n  - - image\n    - image\n  test_outputs:\n  - - predict\n    - softmax_0.tmp_0\n_init_params:\n  num_classes: 10\ncompleted_epochs: 0\nstatus: Infer\nversion: 1.3.7\n"
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name='DriverStatusRecognition',\n    type='cv/classification',\n    author='郑博培、彭兆帅',\n    author_email='2733821739@qq.com',\n    summary=\"Distinguish the driver's normal driving, making a phone call, drinking water and other different actions\",\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            
res.extend(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = value.item()\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = value.item()\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        # Note: argparse's type=bool treats any non-empty string (including 'False') as True.\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=lambda x: str(x).lower() in ('true', '1'),\n                                           default=False,\n                                           help=\"whether to use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n"
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/requirements.txt",
    "content": "paddlex==1.3.7\nchardet\n"
  },
  {
    "path": "modules/image/classification/DriverStatusRecognition/serving_client_demo.py",
    "content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # Encode the images in base64 format\n    img1 = cv2_to_base64(cv2.imread(\"IMAGE_PATH1\"))\n    img2 = cv2_to_base64(cv2.imread(\"IMAGE_PATH2\"))\n    data = {'images': [img1, img2]}\n    # Specify the content-type\n    headers = {\"Content-type\": \"application/json\"}\n    # Send the HTTP request\n    url = \"http://127.0.0.1:8866/predict/DriverStatusRecognition\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n"
  },
  {
    "path": "modules/image/classification/README.md",
    "content": "\n## **更好用户体验，建议参考WEB端官方文档 -> [【图像分类】](https://www.paddlepaddle.org.cn/hublist)**\n\n\n### 图像分类\n图像分类是根据图像的语义信息对不同类别图像进行区分，是计算机视觉中重要的基础问题，是物体检测、图像分割、物体跟踪、行为分析、人脸识别等其他高层视觉任务的基础，在许多领域都有着广泛的应用。如：安防领域的人脸识别和智能视频分析等，交通领域的交通场景识别，互联网领域基于内容的图像检索和相册自动归类，医学领域的图像识别等。\n\n**注：** **如果你是资深开发者，那可以随意按需使用**，**假如你是新手，服务器端优先选择Resnet50，移动端优先选择MobileNetV3**\n\n- 精选模型推荐\n\n|            | **模型名称**                                                 | **模型特色**                                       |\n| ---------- | :----------------------------------------------------------- | ---------------------------------------------------------- |\n| 图像分类 | [菜品识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_dishes&en_category=ImageClassification) | 私有数据集训练，支持8416种菜品的分类识别，适合进一步菜品方向微调 |\n| 图像分类 | [动物识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification) | 私有数据集训练，支持7978种动物的分类识别，适合进一步动物方向微调 |\n| 图像分类 | [野生动物制品识别](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_wildanimals&en_category=ImageClassification) | 支持'象牙制品', '象牙', '大象', '虎皮', '老虎', '虎牙/虎爪/虎骨', '穿山甲甲片', '穿山甲', '穿山甲爪子', '其他' 这十个标签的识别。 |\n\n\n- 更多模型\n\n| **模型名称** | **模型简介** |\n| - | - |\n| [AlexNet](https://www.paddlepaddle.org.cn/hubdetail?name=alexnet_imagenet&en_category=ImageClassification) | 首次在 CNN 中成功的应用了 ReLU, Dropout 和 LRN，并使用 GPU 进行运算加速 |\n| [VGG19](https://www.paddlepaddle.org.cn/hubdetail?name=vgg19_imagenet&en_category=ImageClassification) | 在 AlexNet 的基础上使用 3*3 小卷积核，增加网络深度，具有很好的泛化能力 |\n| [GoogLeNet](https://github.com/PaddlePaddle/models/tree/release/1.7/PaddleCV/image_classification) | 在不增加计算负载的前提下增加了网络的深度和宽度，性能更加优越 |\n| [ResNet50](https://www.paddlepaddle.org.cn/hubdetail?name=resnet_v2_50_imagenet&en_category=ImageClassification) | Residual Network，引入了新的残差结构，解决了随着网络加深，准确率下降的问题 |\n| [Inceptionv4](https://www.paddlepaddle.org.cn/hubdetail?name=inception_v4_imagenet&en_category=ImageClassification) | 将 Inception 模块与 Residual Connection 
进行结合，通过ResNet的结构极大地加速训练并获得性能的提升 |\n| [MobileNetV2](https://www.paddlepaddle.org.cn/hubdetail?name=mobilenet_v2_imagenet&en_category=ImageClassification) | MobileNet结构的微调，直接在 thinner 的 bottleneck层上进行 skip learning 连接以及对 bottleneck layer 不进行 ReLU 非线性处理可取得更好的结果 |\n| [se_resnext50](https://www.paddlepaddle.org.cn/hubdetail?name=se_resnext50_32x4d_imagenet&en_category=ImageClassification) | 在ResNeXt 基础上加入了 SE(Squeeze-and-Excitation) 模块，提高了识别准确率，在 ILSVRC 2017 的分类项目中取得了第一名 |\n| [ShuffleNetV2](https://www.paddlepaddle.org.cn/hubdetail?name=shufflenet_v2_imagenet&en_category=ImageClassification) | ECCV2018，轻量级 CNN 网络，在速度和准确度之间做了很好的平衡。在同等复杂度下，比 ShuffleNet 和 MobileNetv2 更准确，更适合移动端以及无人车领域 |\n| [efficientNetb7](https://www.paddlepaddle.org.cn/hubdetail?name=efficientnetb7_imagenet&en_category=ImageClassification) | 同时对模型的分辨率，通道数和深度进行缩放，用极少的参数就可以达到SOTA的精度。 |\n| [xception71](https://www.paddlepaddle.org.cn/hubdetail?name=xception71_imagenet&en_category=ImageClassification) | 对inception-v3的改进，用深度可分离卷积代替普通卷积，降低参数量同时提高了精度。 |\n| [dpn107](https://www.paddlepaddle.org.cn/hubdetail?name=dpn107_imagenet&en_category=ImageClassification) | 融合了densenet和resnext的特点。 |\n| [DarkNet53](https://www.paddlepaddle.org.cn/hubdetail?name=darknet53_imagenet&en_category=ImageClassification) | 检测框架yolov3使用的backbone，在分类和检测任务上都有不错表现。 |\n| [DenseNet161](https://www.paddlepaddle.org.cn/hubdetail?name=densenet161_imagenet&en_category=ImageClassification) | 提出了密集连接的网络结构，更加有利于信息流的传递。 |\n| [ResNeXt152_vd](https://www.paddlepaddle.org.cn/hubdetail?name=resnext152_64x4d_imagenet&en_category=ImageClassification) | 提出了cardinality的概念，用于作为模型复杂度的另外一个度量，有效地提升模型精度。 |\n"
  },
  {
    "path": "modules/image/classification/README_en.md",
    "content": "## **For better user experience, refer to the official web documentation -> [Image Classification](https://www.paddlepaddle.org.cn/hublist)**\n\n### Image Classification\n\nImage classification distinguishes between different categories of images based on the semantic information. This is an important basic issue in computer vision and is the basis for object detection, image segmentation, object tracking, behavior analysis, face recognition and other high-level vision tasks. It is widely applied in many fields. For example, face recognition and intelligent video analysis in the security field, traffic scene recognition in the transportation field, content-based image retrieval and automatic classification of photo albums in the Internet field, and image recognition in the medicine field.\n\n**Note:** If you are an experienced developer, feel free to use as required. If you are a newcomer, choose Resnet50 first at the server side and MobileNetV3 for mobile.\n\n- Recommended Models\n\n|                      | **Model Name**                                               | **Model Features**                                           |\n| -------------------- | :----------------------------------------------------------- | ------------------------------------------------------------ |\n| Image Classification | [Dish Identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_dishes&en_category=ImageClassification) | Private dataset training. It supports the category identification of 8416 dishes. It is suitable for further fine-tuning in the dish orientation. |\n| Image Classification | [Animal Identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_animals&en_category=ImageClassification) | Private dataset training. It supports the category identification of 7978 kinds of animals. It is suitable for further fine-tuning in the animal orientation. 
|\n| Image Classification | [Wildlife product identification](https://www.paddlepaddle.org.cn/hubdetail?name=resnet50_vd_wildanimals&en_category=ImageClassification) | Supports identification of the ten tags 'Ivory product', 'Ivory', 'Elephant', 'Tiger Skin', 'Tiger', 'Tiger Tusk/Claw/Bone', 'Pangolin squama', 'Pangolin', 'Pangolin Paw', and 'Other'. |\n\n- More Models\n\n| **Model Name**                                               | **Model Introduction**                                       |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [AlexNet](https://www.paddlepaddle.org.cn/hubdetail?name=alexnet_imagenet&en_category=ImageClassification) | The first successful application of ReLU, Dropout and LRN in a CNN, using GPU for acceleration. |\n| [VGG19](https://www.paddlepaddle.org.cn/hubdetail?name=vgg19_imagenet&en_category=ImageClassification) | Uses 3x3 small convolution kernels on top of AlexNet to increase network depth, with good generalization capability. |\n| [GoogLeNet](https://github.com/PaddlePaddle/models/tree/release/1.7/PaddleCV/image_classification) | Increased network depth and width without increasing computational load for superior performance. 
|\n| [ResNet50](https://www.paddlepaddle.org.cn/hubdetail?name=resnet_v2_50_imagenet&en_category=ImageClassification) | Residual Network. Introduces a new residual structure that solves the problem of decreasing accuracy as the network deepens. |\n| [Inceptionv4](https://www.paddlepaddle.org.cn/hubdetail?name=inception_v4_imagenet&en_category=ImageClassification) | Combines the Inception module with the Residual Connection; the ResNet-style structure greatly accelerates training and improves performance. |\n| [MobileNetV2](https://www.paddlepaddle.org.cn/hubdetail?name=mobilenet_v2_imagenet&en_category=ImageClassification) | Fine-tuning of the MobileNet structure. Performs the skip connections directly on the thinner bottleneck layers and applies no ReLU nonlinearity to the bottleneck layers, which yields better results. |\n| [se_resnext50](https://www.paddlepaddle.org.cn/hubdetail?name=se_resnext50_32x4d_imagenet&en_category=ImageClassification) | Adds the SE (Squeeze-and-Excitation) module to ResNeXt to improve recognition accuracy, achieving first place in the ILSVRC 2017 classification task. |\n| [ShuffleNetV2](https://www.paddlepaddle.org.cn/hubdetail?name=shufflenet_v2_imagenet&en_category=ImageClassification) | ECCV2018, lightweight CNN network. It offers a good balance between speed and accuracy. It is more accurate than ShuffleNet and MobileNetv2 at the same level of complexity. It is more suitable for mobile and unmanned vehicle fields. |\n| [efficientNetb7](https://www.paddlepaddle.org.cn/hubdetail?name=efficientnetb7_imagenet&en_category=ImageClassification) | Scales the model's resolution, channel count and depth simultaneously; SOTA accuracy can be achieved with very few parameters. |\n| [xception71](https://www.paddlepaddle.org.cn/hubdetail?name=xception71_imagenet&en_category=ImageClassification) | Improvements to inception-v3. Replaces regular convolution with depthwise separable convolution, reducing the number of parameters while increasing accuracy. |\n| [dpn107](https://www.paddlepaddle.org.cn/hubdetail?name=dpn107_imagenet&en_category=ImageClassification) | A fusion of densenet and resnext features. |\n| [DarkNet53](https://www.paddlepaddle.org.cn/hubdetail?name=darknet53_imagenet&en_category=ImageClassification) | The backbone used by the detection framework yolov3. It has good performance for both classification and detection tasks. |\n| [DenseNet161](https://www.paddlepaddle.org.cn/hubdetail?name=densenet161_imagenet&en_category=ImageClassification) | A densely connected network structure is proposed, which is more conducive to the flow of information. 
|\n| [ResNeXt152_vd](https://www.paddlepaddle.org.cn/hubdetail?name=resnext152_64x4d_imagenet&en_category=ImageClassification) | The concept of cardinality is proposed as an additional measure of model complexity, effectively improving model accuracy. |\n"
  },
  {
    "path": "modules/image/classification/SnakeIdentification/README.md",
    "content": "# SnakeIdentification\n\n|模型名称|SnakeIdentification|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet50_vd_ssld|\n|数据集|蛇种数据集|\n|是否支持Fine-tuning|否|\n|模型大小|84MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 蛇种识别（SnakeIdentification），该模型可准确识别蛇的种类，并精准判断蛇的毒性。该PaddleHub Module支持API预测及命令行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install SnakeIdentification\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n- ### 3、在线体验\n  [AI Studio 快速体验](https://aistudio.baidu.com/aistudio/projectdetail/1646951)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run SnakeIdentification --input_path /PATH/TO/IMAGE\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"SnakeIdentification\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images：list类型，待检测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install SnakeIdentification==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/SnakeIdentification/README_en.md",
    "content": "# SnakeIdentification\n\n|Module Name|SnakeIdentification|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet50_vd_ssld|\n|Dataset|Snake Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|84MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module can be used to identify the kind of snake, and judge the toxicity.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install SnakeIdentification\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n- ### 3、Online experience\n  [AI Studio](https://aistudio.baidu.com/aistudio/projectdetail/1646951)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run SnakeIdentification --input_path /PATH/TO/IMAGE\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"SnakeIdentification\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n\n    
- **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install SnakeIdentification==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/SnakeIdentification/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/SnakeIdentification/assets/model.yml",
    "content": "Model: ResNet50_vd_ssld\nTransforms:\n- ResizeByShort:\n    max_size: -1\n    short_size: 256\n- CenterCrop:\n    crop_size: 224\n- Normalize:\n    mean:\n    - 0.485\n    - 0.456\n    - 0.406\n    std:\n    - 0.229\n    - 0.224\n    - 0.225\nTransformsMode: RGB\n_Attributes:\n  eval_metrics:\n    acc1: 1.0\n  fixed_input_shape: null\n  labels:\n  - \"\\u6C34\\u86C7\"\n  - \"\\u5251\\u7EB9\\u5E26\\u86C7\"\n  - \"\\u5FB7\\u51EF\\u65AF\\u6C0F\\u86C7\"\n  - \"\\u9ED1\\u9F20\\u86C7\"\n  - \"\\u897F\\u90E8\\u83F1\\u6591\\u54CD\\u5C3E\\u86C7\"\n  model_type: classifier\n  num_classes: 5\n_ModelInputsOutputs:\n  test_inputs:\n  - - image\n    - image\n  test_outputs:\n  - - predict\n    - softmax_0.tmp_0\n_init_params:\n  num_classes: 5\ncompleted_epochs: 0\nstatus: Infer\nversion: 1.1.0\n"
  },
  {
    "path": "modules/image/classification/SnakeIdentification/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name='SnakeIdentification',\n    type='cv/classification',\n    author='郑博培',\n    author_email='2733821739@qq.com',\n    summary='Identify snake species',\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n    @serving\n    def serving_method(self, images, 
**kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = value.item()\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = value.item()\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        # argparse's type=bool treats any non-empty string as True, so expose a flag instead\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"whether to use GPU\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n    print(res)\n"
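The `predict` method above walks the inputs in `ceil(total / batch_size)` steps, letting the last batch run short. That batching logic can be sketched on its own in plain Python (`batch_items` is a hypothetical helper for illustration, not part of the module):

```python
import math

def batch_items(items, batch_size):
    """Split a sequence into consecutive batches; the last batch may be short,
    mirroring the IndexError-guarded loop in predict()."""
    loop_num = math.ceil(len(items) / batch_size)
    return [items[i * batch_size:(i + 1) * batch_size] for i in range(loop_num)]

# 7 inputs with batch_size=3 -> batches of 3, 3 and 1
batches = batch_items(list(range(7)), 3)
```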
  },
  {
    "path": "modules/image/classification/SnakeIdentification/requirements.txt",
    "content": "paddlex==1.3.7\n"
  },
  {
    "path": "modules/image/classification/SnakeIdentification/serving_client_demo.py",
    "content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # Encode the images as base64\n    img1 = cv2_to_base64(cv2.imread(\"IMAGE_PATH1\"))\n    img2 = cv2_to_base64(cv2.imread(\"IMAGE_PATH2\"))\n    data = {'images': [img1, img2]}\n    # Specify the content type\n    headers = {\"Content-type\": \"application/json\"}\n    # Send the HTTP request\n    url = \"http://127.0.0.1:8866/predict/SnakeIdentification\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n"
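The demo ships binary JPEG data inside a JSON body by base64-encoding it; the encode/decode round trip can be exercised with the standard library alone (raw bytes stand in for an actual encoded image here, so cv2 is not needed):

```python
import base64

def bytes_to_base64(data: bytes) -> str:
    # Client side: binary image data -> JSON-safe text
    return base64.b64encode(data).decode('utf8')

def base64_to_bytes(b64str: str) -> bytes:
    # Server side: text back to the bytes handed to cv2.imdecode
    return base64.b64decode(b64str.encode('utf8'))

payload = bytes(range(16))  # stand-in for JPEG bytes
round_tripped = base64_to_bytes(bytes_to_base64(payload))
```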
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/README.md",
    "content": "# PaddleHub SpinalNet\n\nThis example shows how to use PaddleHub's pretrained SpinalNet models to identify gemstones, or to fine-tune them and run gemstone prediction.\n\n## 1. Install PaddleHub 2.0\n\n```shell\n$ pip install -U paddlehub==2.0.0\n```\n\n## 2. Load the packaged models locally\n\n```Python\nimport paddlehub as hub\n```\n### Load spinalnet_res50_gemstone\n```Python\nspinal_res50 = hub.Module(name=\"spinalnet_res50_gemstone\")\n```\n### Load spinalnet_vgg16_gemstone\n```Python\nspinal_vgg16 = hub.Module(name=\"spinalnet_vgg16_gemstone\")\n```\n### Load spinalnet_res101_gemstone\n```Python\nspinal_res101 = hub.Module(name=\"spinalnet_res101_gemstone\")\n```\n## 3. Prediction\n\n### Predict with spinalnet_res50_gemstone\n```Python\nresult_res50 = spinal_res50.predict(['/PATH/TO/IMAGE'])\nprint(result_res50)\n```\n### Predict with spinalnet_vgg16_gemstone\n```Python\nresult_vgg16 = spinal_vgg16.predict(['/PATH/TO/IMAGE'])\nprint(result_vgg16)\n```\n### Predict with spinalnet_res101_gemstone\n```Python\nresult_res101 = spinal_res101.predict(['/PATH/TO/IMAGE'])\nprint(result_res101)\n```\n## 4. Command-line prediction\n\n```shell\n$ hub run spinalnet_res50_gemstone --input_path \"/PATH/TO/IMAGE\" --top_k 5\n```\n\n## 5. Fine-tuning the PaddleHub models\n\n## How to start fine-tuning\n\nOnce PaddlePaddle and PaddleHub are installed, the SpinalNet models can be fine-tuned on the gemstone dataset.\n\n## Code steps\n\nFine-tuning with the PaddleHub Fine-tune API breaks down into five steps.\n\n### Step1: Import the required libraries\n```python\nfrom paddlehub.finetune.trainer import Trainer\nfrom gem_dataset import GemStones\nfrom paddlehub.vision import transforms as T\nimport paddle\n```\n\n\n### Step2: Define the data preprocessing\n```python\n\ntrain_transforms = T.Compose([T.Resize((256, 256)), T.CenterCrop(224), T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])], to_rgb=True)\neval_transforms = T.Compose([T.Resize((256, 256)), T.CenterCrop(224), T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])], to_rgb=True)\n```\n\nThe `transforms` data augmentation module defines a rich set of preprocessing operations; swap in whichever preprocessing your task requires.\n\n### Step3: Define the datasets\n```python\ngem_train = GemStones(transforms=train_transforms, mode='train')\ngem_validate = GemStones(transforms=eval_transforms, mode='eval')\n```\n\n\nThe dataset preparation code is available in [gem_dataset.py](PaddleHub/modules/thirdparty/image/classification/SpinalNet_Gemstones/gem_dataset.py).\n\n\n### Step4: Start fine-tuning\n\n```python\noptimizer = paddle.optimizer.Momentum(learning_rate=0.001, momentum=0.9, parameters=spinal_res50.parameters())\ntrainer = Trainer(spinal_res50, optimizer, use_gpu=True, checkpoint_dir='fine_tuned_model')\ntrainer.train(gem_train, epochs=5, batch_size=128, eval_dataset=gem_validate, save_interval=1, log_interval=10)\n```\n\n### Step5: Predict after fine-tuning\n\n```python\nspinal_res50 = hub.Module(name=\"spinalnet_res50_gemstone\")\nresult_res50 = spinal_res50.predict(['/PATH/TO/IMAGE'])\nprint(result_res50)\n```\n\n\n### View the code\n\nhttps://github.com/PaddleHub/modules/thirdparty/image/classification/SpinalNet_Gemstones/spinalnet_res50_gemstone/module.py\n\nhttps://github.com/PaddleHub/modules/thirdparty/image/classification/SpinalNet_Gemstones/spinalnet_res101_gemstone/module.py\n\nhttps://github.com/PaddleHub/modules/thirdparty/image/classification/SpinalNet_Gemstones/spinalnet_vgg16_gemstone/module.py\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
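The `Normalize` op in Step2 shifts and scales each channel by the ImageNet mean and standard deviation; numerically it is just per-channel arithmetic, sketched here in NumPy (a sketch of the transform, not PaddleHub's implementation):

```python
import numpy as np

mean = np.array([0.485, 0.456, 0.406], dtype='float32').reshape(3, 1, 1)
std = np.array([0.229, 0.224, 0.225], dtype='float32').reshape(3, 1, 1)

img = np.ones((3, 224, 224), dtype='float32')  # CHW image, all pixels 1.0
normalized = (img - mean) / std                # broadcasts per channel
# channel 0 becomes (1 - 0.485) / 0.229, roughly 2.249
```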
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/gem_dataset.py",
    "content": "import paddle\nimport numpy as np\nfrom typing import Callable\nfrom code.config import config_parameters\n\n\nclass GemStones(paddle.io.Dataset):\n    \"\"\"\n    Step 1: subclass paddle.io.Dataset\n    \"\"\"\n\n    def __init__(self, transforms: Callable, mode: str = 'train'):\n        \"\"\"\n        Step 2: implement the constructor\n        \"\"\"\n        super(GemStones, self).__init__()\n\n        self.mode = mode\n        self.transforms = transforms\n\n        train_image_dir = config_parameters['train_image_dir']\n        eval_image_dir = config_parameters['eval_image_dir']\n        test_image_dir = config_parameters['test_image_dir']\n\n        train_data_folder = paddle.vision.DatasetFolder(train_image_dir)\n        eval_data_folder = paddle.vision.DatasetFolder(eval_image_dir)\n        test_data_folder = paddle.vision.DatasetFolder(test_image_dir)\n\n        config_parameters['label_dict'] = train_data_folder.class_to_idx\n\n        if self.mode == 'train':\n            self.data = train_data_folder\n        elif self.mode == 'eval':\n            self.data = eval_data_folder\n        elif self.mode == 'test':\n            self.data = test_data_folder\n\n    def __getitem__(self, index):\n        \"\"\"\n        Step 3: implement __getitem__\n        \"\"\"\n        data = np.array(self.data[index][0]).astype('float32')\n\n        data = self.transforms(data)\n\n        label = np.array(self.data[index][1]).astype('int64')\n\n        return data, label\n\n    def __len__(self):\n        \"\"\"\n        Step 4: implement __len__\n        \"\"\"\n        return len(self.data)\n"
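GemStones above follows the standard map-style dataset contract: `__getitem__` returns one `(data, label)` pair and `__len__` the number of samples. The contract itself needs nothing from paddle, as this minimal stand-in shows (`ListDataset` is illustrative only, not part of the module):

```python
class ListDataset:
    """Minimal map-style dataset: holds (data, label) pairs in a list."""

    def __init__(self, samples):
        self.samples = samples

    def __getitem__(self, index):
        # One sample per index, exactly what a data loader expects
        data, label = self.samples[index]
        return data, label

    def __len__(self):
        return len(self.samples)

ds = ListDataset([([0.1, 0.2], 0), ([0.3, 0.4], 1)])
```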
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res101_gemstone/README.md",
    "content": "## Overview\n* The network architecture of [SpinalNet](https://arxiv.org/abs/2007.03347) is shown below:\n\n![Network architecture](https://ai-studio-static-online.cdn.bcebos.com/0c58fff63018401089f92085a2aea5d46921351012e64ac4b7d5a8e1370c463f)\n\nThis module is SpinalNet pretrained on a gemstone dataset; with PaddleHub installed it supports one-line prediction and fine-tuning.\n\n## Pretrained model\n\nThe pretrained model is available at https://aistudio.baidu.com/aistudio/datasetdetail/69923\n\n## API\nAfter loading the module, use PaddleHub 2.0's default image classification API:\n```\ndef predict(images, batch_size, top_k):\n```\n\n**Parameters**\n* images (list[str], image paths): list of input images\n* batch_size: batch size, defaults to 1\n* top_k: number of top predicted classes returned per image\n"
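`top_k` keeps the k highest-probability classes for each image. With NumPy, the selection behind such an API can be sketched like this (the probabilities and the shortened label list are made up for the example):

```python
import numpy as np

labels = ['Ruby', 'Opal', 'Jade', 'Pearl']
probs = np.array([0.10, 0.55, 0.05, 0.30])

top_k = 2
top_idx = np.argsort(probs)[::-1][:top_k]  # indices by descending probability
top = [(labels[i], float(probs[i])) for i in top_idx]
# top == [('Opal', 0.55), ('Pearl', 0.3)]
```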
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res101_gemstone/label_list.txt",
    "content": "Alexandrite\nAlmandine\nAmazonite\nAmber\nAmethyst\nAmetrine\nAndalusite\nAndradite\nAquamarine\nAventurine Green\nAventurine Yellow\nBenitoite\nBeryl Golden\nBixbite\nBloodstone\nBlue Lace Agate\nCarnelian\nCats Eye\nChalcedony\nChalcedony Blue\nChrome Diopside\nChrysoberyl\nChrysocolla\nChrysoprase\nCitrine\nCoral\nDanburite\nDiamond\nDiaspore\nDumortierite\nEmerald\nFluorite\nGarnet Red\nGoshenite\nGrossular\nHessonite\nHiddenite\nIolite\nJade\nJasper\nKunzite\nKyanite\nLabradorite\nLapis Lazuli\nLarimar\nMalachite\nMoonstone\nMorganite\nOnyx Black\nOnyx Green\nOnyx Red\nOpal\nPearl\nPeridot\nPrehnite\nPyrite\nPyrope\nQuartz Beer\nQuartz Lemon\nQuartz Rose\nQuartz Rutilated\nQuartz Smoky\nRhodochrosite\nRhodolite\nRhodonite\nRuby\nSapphire Blue\nSapphire Pink\nSapphire Purple\nSapphire Yellow\nScapolite\nSerpentine\nSodalite\nSpessartite\nSphene\nSpinel\nSpodumene\nSunstone\nTanzanite\nTigers Eye\nTopaz\nTourmaline\nTsavorite\nTurquoise\nVariscite\nZircon\nZoisite\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res101_gemstone/module.py",
    "content": "# copyright (c) 2021 nanting03. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass BottleneckBlock(nn.Layer):\n\n    expansion = 4\n\n    def __init__(self,\n                 inplanes,\n                 planes,\n                 stride=1,\n                 downsample=None,\n                 groups=1,\n                 base_width=64,\n                 dilation=1,\n                 norm_layer=None):\n        super(BottleneckBlock, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        width = int(planes * (base_width / 64.)) * groups\n\n        self.conv1 = nn.Conv2D(inplanes, width, 1, bias_attr=False)\n        self.bn1 = norm_layer(width)\n\n        self.conv2 = nn.Conv2D(\n            width, width, 3, padding=dilation, stride=stride, groups=groups, dilation=dilation, bias_attr=False)\n        self.bn2 = norm_layer(width)\n\n        self.conv3 = nn.Conv2D(width, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = norm_layer(planes * self.expansion)\n        self.relu = nn.ReLU()\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        identity = x\n\n        out = 
self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n        out = self.relu(out)\n\n        out = self.conv3(out)\n        out = self.bn3(out)\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n\n        return out\n\n\nclass ResNet(nn.Layer):\n    def __init__(self, block=BottleneckBlock, depth=101, with_pool=True):\n        super(ResNet, self).__init__()\n        layer_cfg = {18: [2, 2, 2, 2], 34: [3, 4, 6, 3], 50: [3, 4, 6, 3], 101: [3, 4, 23, 3], 152: [3, 8, 36, 3]}\n        layers = layer_cfg[depth]\n        self.with_pool = with_pool\n        self._norm_layer = nn.BatchNorm2D\n\n        self.inplanes = 64\n        self.dilation = 1\n\n        self.conv1 = nn.Conv2D(3, self.inplanes, kernel_size=7, stride=2, padding=3, bias_attr=False)\n        self.bn1 = self._norm_layer(self.inplanes)\n        self.relu = nn.ReLU()\n        self.maxpool = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, 64, layers[0])\n        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)\n        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)\n        self.layer4 = self._make_layer(block, 512, layers[3], stride=2)\n        if with_pool:\n            self.avgpool = nn.AdaptiveAvgPool2D((1, 1))\n\n    def _make_layer(self, block, planes, blocks, stride=1, dilate=False):\n        norm_layer = self._norm_layer\n        downsample = None\n        previous_dilation = self.dilation\n        if dilate:\n            self.dilation *= stride\n            stride = 1\n        if stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(\n                nn.Conv2D(self.inplanes, planes * block.expansion, 1, stride=stride, bias_attr=False),\n                norm_layer(planes * block.expansion),\n            
)\n\n        layers = []\n        layers.append(block(self.inplanes, planes, stride, downsample, 1, 64, previous_dilation, norm_layer))\n        self.inplanes = planes * block.expansion\n        for _ in range(1, blocks):\n            layers.append(block(self.inplanes, planes, norm_layer=norm_layer))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x = self.maxpool(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n\n        if self.with_pool:\n            x = self.avgpool(x)\n\n        return x\n\n\n@moduleinfo(\n    name=\"spinalnet_res101_gemstone\",\n    type=\"CV/classification\",\n    author=\"nanting03\",\n    author_email=\"975348977@qq.com\",\n    summary=\"spinalnet_res101_gemstone is a classification model, \"\n    \"this module is trained on the Gemstone dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass SpinalNet_ResNet101(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(SpinalNet_ResNet101, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as f:\n                for line in f:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.backbone = ResNet()\n\n        half_in_size = round(2048 / 2)\n        layer_width = 20\n\n        self.half_in_size = half_in_size\n\n        self.fc_spinal_layer1 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size, layer_width), nn.BatchNorm1D(layer_width), nn.ReLU())\n        
self.fc_spinal_layer2 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_spinal_layer3 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_spinal_layer4 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_out = nn.Sequential(\n            nn.Dropout(p=0.5),\n            nn.Linear(layer_width * 4, class_dim),\n        )\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'spinalnet_res101.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.backbone(inputs)\n        feature = y\n        y = paddle.flatten(y, 1)\n\n        y1 = self.fc_spinal_layer1(y[:, 0:self.half_in_size])\n        y2 = self.fc_spinal_layer2(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y1], axis=1))\n        y3 = self.fc_spinal_layer3(paddle.concat([y[:, 0:self.half_in_size], y2], axis=1))\n        y4 = self.fc_spinal_layer4(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y3], 
axis=1))\n\n        y = paddle.concat([y1, y2, y3, y4], axis=1)\n\n        y = self.fc_out(y)\n        return y, feature\n"
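In the forward pass above, each spinal layer sees half of the flattened backbone features concatenated with the previous layer's output, which is why every layer after the first takes `half_in_size + layer_width` inputs. The shape bookkeeping can be checked with NumPy (sizes shrunk from the real 1024 and 20 for readability):

```python
import numpy as np

half = 4    # stands in for half_in_size (1024 in the module)
width = 2   # stands in for layer_width (20 in the module)
batch = 3

y = np.zeros((batch, 2 * half), dtype='float32')   # flattened backbone features
y1 = np.zeros((batch, width), dtype='float32')     # output of fc_spinal_layer1

# fc_spinal_layer2 input: second half of the features plus y1
layer2_in = np.concatenate([y[:, half:2 * half], y1], axis=1)
```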
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res50_gemstone/README.md",
    "content": "## Overview\n* The network architecture of [SpinalNet](https://arxiv.org/abs/2007.03347) is shown below:\n\n![Network architecture](https://ai-studio-static-online.cdn.bcebos.com/0c58fff63018401089f92085a2aea5d46921351012e64ac4b7d5a8e1370c463f)\n\nThis module is SpinalNet pretrained on a gemstone dataset; with PaddleHub installed it supports one-line prediction and fine-tuning.\n\n## Pretrained model\n\nThe pretrained model is available at https://aistudio.baidu.com/aistudio/datasetdetail/69923\n\n## API\nAfter loading the module, use PaddleHub 2.0's default image classification API:\n```\ndef predict(images, batch_size, top_k):\n```\n\n**Parameters**\n* images (list[str], image paths): list of input images\n* batch_size: batch size, defaults to 1\n* top_k: number of top predicted classes returned per image\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res50_gemstone/label_list.txt",
    "content": "Alexandrite\nAlmandine\nAmazonite\nAmber\nAmethyst\nAmetrine\nAndalusite\nAndradite\nAquamarine\nAventurine Green\nAventurine Yellow\nBenitoite\nBeryl Golden\nBixbite\nBloodstone\nBlue Lace Agate\nCarnelian\nCats Eye\nChalcedony\nChalcedony Blue\nChrome Diopside\nChrysoberyl\nChrysocolla\nChrysoprase\nCitrine\nCoral\nDanburite\nDiamond\nDiaspore\nDumortierite\nEmerald\nFluorite\nGarnet Red\nGoshenite\nGrossular\nHessonite\nHiddenite\nIolite\nJade\nJasper\nKunzite\nKyanite\nLabradorite\nLapis Lazuli\nLarimar\nMalachite\nMoonstone\nMorganite\nOnyx Black\nOnyx Green\nOnyx Red\nOpal\nPearl\nPeridot\nPrehnite\nPyrite\nPyrope\nQuartz Beer\nQuartz Lemon\nQuartz Rose\nQuartz Rutilated\nQuartz Smoky\nRhodochrosite\nRhodolite\nRhodonite\nRuby\nSapphire Blue\nSapphire Pink\nSapphire Purple\nSapphire Yellow\nScapolite\nSerpentine\nSodalite\nSpessartite\nSphene\nSpinel\nSpodumene\nSunstone\nTanzanite\nTigers Eye\nTopaz\nTourmaline\nTsavorite\nTurquoise\nVariscite\nZircon\nZoisite\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_res50_gemstone/module.py",
    "content": "# copyright (c) 2021 nanting03. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass BottleneckBlock(nn.Layer):\n\n    expansion = 4\n\n    def __init__(self,\n                 inplanes,\n                 planes,\n                 stride=1,\n                 downsample=None,\n                 groups=1,\n                 base_width=64,\n                 dilation=1,\n                 norm_layer=None):\n        super(BottleneckBlock, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        width = int(planes * (base_width / 64.)) * groups\n\n        self.conv1 = nn.Conv2D(inplanes, width, 1, bias_attr=False)\n        self.bn1 = norm_layer(width)\n\n        self.conv2 = nn.Conv2D(\n            width, width, 3, padding=dilation, stride=stride, groups=groups, dilation=dilation, bias_attr=False)\n        self.bn2 = norm_layer(width)\n\n        self.conv3 = nn.Conv2D(width, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = norm_layer(planes * self.expansion)\n        self.relu = nn.ReLU()\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        identity = x\n\n        out = 
self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n        out = self.relu(out)\n\n        out = self.conv3(out)\n        out = self.bn3(out)\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n\n        return out\n\n\nclass ResNet(nn.Layer):\n    def __init__(self, block=BottleneckBlock, depth=50, with_pool=True):\n        super(ResNet, self).__init__()\n        layer_cfg = {18: [2, 2, 2, 2], 34: [3, 4, 6, 3], 50: [3, 4, 6, 3], 101: [3, 4, 23, 3], 152: [3, 8, 36, 3]}\n        layers = layer_cfg[depth]\n        self.with_pool = with_pool\n        self._norm_layer = nn.BatchNorm2D\n\n        self.inplanes = 64\n        self.dilation = 1\n\n        self.conv1 = nn.Conv2D(3, self.inplanes, kernel_size=7, stride=2, padding=3, bias_attr=False)\n        self.bn1 = self._norm_layer(self.inplanes)\n        self.relu = nn.ReLU()\n        self.maxpool = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, 64, layers[0])\n        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)\n        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)\n        self.layer4 = self._make_layer(block, 512, layers[3], stride=2)\n        if with_pool:\n            self.avgpool = nn.AdaptiveAvgPool2D((1, 1))\n\n    def _make_layer(self, block, planes, blocks, stride=1, dilate=False):\n        norm_layer = self._norm_layer\n        downsample = None\n        previous_dilation = self.dilation\n        if dilate:\n            self.dilation *= stride\n            stride = 1\n        if stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(\n                nn.Conv2D(self.inplanes, planes * block.expansion, 1, stride=stride, bias_attr=False),\n                norm_layer(planes * block.expansion),\n            )\n\n 
       layers = []\n        layers.append(block(self.inplanes, planes, stride, downsample, 1, 64, previous_dilation, norm_layer))\n        self.inplanes = planes * block.expansion\n        for _ in range(1, blocks):\n            layers.append(block(self.inplanes, planes, norm_layer=norm_layer))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x = self.maxpool(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n\n        if self.with_pool:\n            x = self.avgpool(x)\n\n        return x\n\n\n@moduleinfo(\n    name=\"spinalnet_res50_gemstone\",\n    type=\"CV/classification\",\n    author=\"nanting03\",\n    author_email=\"975348977@qq.com\",\n    summary=\"spinalnet_res50_gemstone is a classification model, \"\n    \"this module is trained on the Gemstone dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass SpinalNet_ResNet50(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(SpinalNet_ResNet50, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as f:\n                for line in f:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.backbone = ResNet()\n\n        half_in_size = round(2048 / 2)\n        layer_width = 20\n\n        self.half_in_size = half_in_size\n\n        self.fc_spinal_layer1 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size, layer_width), nn.BatchNorm1D(layer_width), nn.ReLU())\n        self.fc_spinal_layer2 = 
nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_spinal_layer3 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_spinal_layer4 = nn.Sequential(\n            nn.Dropout(p=0.5), nn.Linear(half_in_size + layer_width, layer_width), nn.BatchNorm1D(layer_width),\n            nn.ReLU())\n        self.fc_out = nn.Sequential(\n            nn.Dropout(p=0.5),\n            nn.Linear(layer_width * 4, class_dim),\n        )\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'spinalnet_res50.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.backbone(inputs)\n        feature = y\n        y = paddle.flatten(y, 1)\n\n        y1 = self.fc_spinal_layer1(y[:, 0:self.half_in_size])\n        y2 = self.fc_spinal_layer2(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y1], axis=1))\n        y3 = self.fc_spinal_layer3(paddle.concat([y[:, 0:self.half_in_size], y2], axis=1))\n        y4 = self.fc_spinal_layer4(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y3], axis=1))\n\n        y = 
paddle.concat([y1, y2, y3, y4], axis=1)\n\n        y = self.fc_out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_vgg16_gemstone/README.md",
    "content": "## Overview\n* The network architecture of [SpinalNet](https://arxiv.org/abs/2007.03347) is shown below:\n\n![Network architecture](https://ai-studio-static-online.cdn.bcebos.com/0c58fff63018401089f92085a2aea5d46921351012e64ac4b7d5a8e1370c463f)\n\nThis module is SpinalNet pretrained on a gemstone dataset; with PaddleHub installed it supports one-line prediction and fine-tuning.\n\n## Pretrained model\n\nThe pretrained model is available at https://aistudio.baidu.com/aistudio/datasetdetail/69923\n\n## API\nAfter loading the module, use PaddleHub 2.0's default image classification API:\n```\ndef predict(images, batch_size, top_k):\n```\n\n**Parameters**\n* images (list[str], image paths): list of input images\n* batch_size: batch size, defaults to 1\n* top_k: number of top predicted classes returned per image\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_vgg16_gemstone/label_list.txt",
    "content": "Alexandrite\nAlmandine\nAmazonite\nAmber\nAmethyst\nAmetrine\nAndalusite\nAndradite\nAquamarine\nAventurine Green\nAventurine Yellow\nBenitoite\nBeryl Golden\nBixbite\nBloodstone\nBlue Lace Agate\nCarnelian\nCats Eye\nChalcedony\nChalcedony Blue\nChrome Diopside\nChrysoberyl\nChrysocolla\nChrysoprase\nCitrine\nCoral\nDanburite\nDiamond\nDiaspore\nDumortierite\nEmerald\nFluorite\nGarnet Red\nGoshenite\nGrossular\nHessonite\nHiddenite\nIolite\nJade\nJasper\nKunzite\nKyanite\nLabradorite\nLapis Lazuli\nLarimar\nMalachite\nMoonstone\nMorganite\nOnyx Black\nOnyx Green\nOnyx Red\nOpal\nPearl\nPeridot\nPrehnite\nPyrite\nPyrope\nQuartz Beer\nQuartz Lemon\nQuartz Rose\nQuartz Rutilated\nQuartz Smoky\nRhodochrosite\nRhodolite\nRhodonite\nRuby\nSapphire Blue\nSapphire Pink\nSapphire Purple\nSapphire Yellow\nScapolite\nSerpentine\nSodalite\nSpessartite\nSphene\nSpinel\nSpodumene\nSunstone\nTanzanite\nTigers Eye\nTopaz\nTourmaline\nTsavorite\nTurquoise\nVariscite\nZircon\nZoisite\n"
  },
  {
    "path": "modules/image/classification/SpinalNet_Gemstones/spinalnet_vgg16_gemstone/module.py",
    "content": "# copyright (c) 2021 nanting03. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass VGG(nn.Layer):\n    def __init__(self, features, with_pool=True):\n        super(VGG, self).__init__()\n        self.features = features\n        self.with_pool = with_pool\n\n        if with_pool:\n            self.avgpool = nn.AdaptiveAvgPool2D((7, 7))\n\n    def forward(self, x):\n        x = self.features(x)\n\n        if self.with_pool:\n            x = self.avgpool(x)\n\n        return x\n\n\ndef make_layers(cfg, batch_norm=False):\n    layers = []\n    in_channels = 3\n    for v in cfg:\n        if v == 'M':\n            layers += [nn.MaxPool2D(kernel_size=2, stride=2)]\n        else:\n            conv2d = nn.Conv2D(in_channels, v, kernel_size=3, padding=1)\n            if batch_norm:\n                layers += [conv2d, nn.BatchNorm2D(v), nn.ReLU()]\n            else:\n                layers += [conv2d, nn.ReLU()]\n            in_channels = v\n    return nn.Sequential(*layers)\n\n\ncfgs = {\n    'A': [64, 'M', 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],\n    'B': [64, 64, 'M', 128, 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 
'M'],\n    'D': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512, 'M'],\n    'E': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 256, 'M', 512, 512, 512, 512, 'M', 512, 512, 512, 512, 'M'],\n}\n\n\ndef _vgg(arch, cfg, batch_norm, **kwargs):\n    model = VGG(make_layers(cfgs[cfg], batch_norm=batch_norm), **kwargs)\n    return model\n\n\ndef vgg16(batch_norm=False, **kwargs):\n    model_name = 'vgg16'\n    if batch_norm:\n        model_name += '_bn'\n    return _vgg(model_name, 'D', batch_norm, **kwargs)\n\n\n@moduleinfo(\n    name=\"spinalnet_vgg16_gemstone\",\n    type=\"CV/classification\",\n    author=\"nanting03\",\n    author_email=\"975348977@qq.com\",\n    summary=\"spinalnet_vgg16_gemstone is a classification model \"\n    \"trained on the Gemstone dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass SpinalNet_VGG16(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(SpinalNet_VGG16, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.backbone = vgg16()\n\n        half_in_size = round(512 * 7 * 7 / 2)\n        layer_width = 4096\n\n        self.half_in_size = half_in_size\n\n        self.fc_spinal_layer1 = nn.Sequential(\n            nn.Dropout(p=0.5),\n            nn.Linear(half_in_size, layer_width),\n            nn.BatchNorm1D(layer_width),\n            nn.ReLU(),\n        )\n        self.fc_spinal_layer2 = nn.Sequential(\n            nn.Dropout(p=0.5),\n            
nn.Linear(half_in_size + layer_width, layer_width),\n            nn.BatchNorm1D(layer_width),\n            nn.ReLU(),\n        )\n        self.fc_spinal_layer3 = nn.Sequential(\n            nn.Dropout(p=0.5),\n            nn.Linear(half_in_size + layer_width, layer_width),\n            nn.BatchNorm1D(layer_width),\n            nn.ReLU(),\n        )\n        self.fc_spinal_layer4 = nn.Sequential(\n            nn.Dropout(p=0.5),\n            nn.Linear(half_in_size + layer_width, layer_width),\n            nn.BatchNorm1D(layer_width),\n            nn.ReLU(),\n        )\n        self.fc_out = nn.Sequential(nn.Dropout(p=0.5), nn.Linear(layer_width * 4, class_dim))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'spinalnet_vgg16.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images)\n\n    def forward(self, inputs: paddle.Tensor):\n\n        y = self.backbone(inputs)\n        feature = y\n        y = paddle.flatten(y, 1)\n\n        y1 = self.fc_spinal_layer1(y[:, 0:self.half_in_size])\n        y2 = self.fc_spinal_layer2(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y1], axis=1))\n        y3 = self.fc_spinal_layer3(paddle.concat([y[:, 0:self.half_in_size], y2], axis=1))\n        y4 = self.fc_spinal_layer4(paddle.concat([y[:, self.half_in_size:2 * self.half_in_size], y3], axis=1))\n\n        y 
= paddle.concat([y1, y2, y3, y4], axis=1)\n\n        y = self.fc_out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/alexnet_imagenet/README.md",
    "content": "# alexnet_imagenet\n\n|模型名称|alexnet_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|AlexNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|234MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - AlexNet是图像分类中的经典模型。模型由Alex Krizhevsky于2012年提出，并在2012年ILSVRC比赛中夺得冠军。该PaddleHub Module结构为AlexNet，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install alexnet_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run alexnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"alexnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install alexnet_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/alexnet_imagenet/README_en.md",
"content": "# alexnet_imagenet\n\n|Module Name|alexnet_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|AlexNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|234MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - AlexNet is a classic image classification model proposed by Alex Krizhevsky in 2012; it won the ILSVRC 2012 championship. This module is based on AlexNet, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install alexnet_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run alexnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"alexnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install alexnet_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/README.md",
"content": "# darknet53_imagenet\n\n|模型名称|darknet53_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DarkNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|160MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DarkNet 是由 Joseph Redmon 提出的图像分类模型，并应用于Yolov3 中作为 Backbone 来完成特征提取。该网络采用连续的 3*3 和 1*1 卷积进行连接，并像ResNet 一样有ShortCut连接。该 PaddleHub Module 基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install darknet53_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run darknet53_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"darknet53_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install darknet53_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/README_en.md",
"content": "# darknet53_imagenet\n\n|Module Name|darknet53_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DarkNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|160MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DarkNet is a classification model proposed by Joseph Redmon, and it serves as the backbone of Yolov3 for feature extraction. This module is based on darknet53, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install darknet53_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run darknet53_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"darknet53_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install darknet53_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/darknet.py",
"content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport six\nimport math\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.regularizer import L2Decay\n\n__all__ = ['DarkNet']\n\n\nclass DarkNet(object):\n    \"\"\"DarkNet, see https://pjreddie.com/darknet/yolo/\n    Args:\n        depth (int): network depth, currently only DarkNet-53 is supported\n        norm_type (str): normalization type, 'bn' and 'sync_bn' are supported\n        norm_decay (float): weight decay for normalization layer weights\n        weight_prefix_name (str): prefix prepended to all weight names\n        get_prediction (bool): whether to append the classification head and return class probabilities\n        class_dim (int): number of classes for classification\n    \"\"\"\n\n    def __init__(self,\n                 depth=53,\n                 norm_type='sync_bn',\n                 norm_decay=0.,\n                 weight_prefix_name='',\n                 get_prediction=False,\n                 class_dim=1000):\n        assert depth in [53], \"unsupported depth value\"\n        self.depth = depth\n        self.norm_type = norm_type\n        self.norm_decay = norm_decay\n        self.depth_cfg = {53: ([1, 2, 8, 8, 4], self.basicblock)}\n        self.prefix_name = weight_prefix_name\n        self.class_dim = class_dim\n        self.get_prediction = get_prediction\n\n    def _conv_norm(self, input, ch_out, filter_size, stride, padding, act='leaky', name=None):\n        conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=ch_out,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            act=None,\n            param_attr=ParamAttr(name=name + \".conv.weights\"),\n            bias_attr=False)\n\n        bn_name = name + \".bn\"\n        bn_param_attr = ParamAttr(regularizer=L2Decay(float(self.norm_decay)), name=bn_name + '.scale')\n        bn_bias_attr = 
ParamAttr(regularizer=L2Decay(float(self.norm_decay)), name=bn_name + '.offset')\n\n        out = fluid.layers.batch_norm(\n            input=conv,\n            act=None,\n            param_attr=bn_param_attr,\n            bias_attr=bn_bias_attr,\n            moving_mean_name=bn_name + '.mean',\n            moving_variance_name=bn_name + '.var')\n\n        # leaky relu here has `alpha` as 0.1, can not be set by\n        # `act` param in fluid.layers.batch_norm above.\n        if act == 'leaky':\n            out = fluid.layers.leaky_relu(x=out, alpha=0.1)\n\n        return out\n\n    def _downsample(self, input, ch_out, filter_size=3, stride=2, padding=1, name=None):\n        return self._conv_norm(input, ch_out=ch_out, filter_size=filter_size, stride=stride, padding=padding, name=name)\n\n    def basicblock(self, input, ch_out, name=None):\n        conv1 = self._conv_norm(input, ch_out=ch_out, filter_size=1, stride=1, padding=0, name=name + \".0\")\n        conv2 = self._conv_norm(conv1, ch_out=ch_out * 2, filter_size=3, stride=1, padding=1, name=name + \".1\")\n        out = fluid.layers.elementwise_add(x=input, y=conv2, act=None)\n        return out\n\n    def layer_warp(self, block_func, input, ch_out, count, name=None):\n        out = block_func(input, ch_out=ch_out, name='{}.0'.format(name))\n        for j in six.moves.xrange(1, count):\n            out = block_func(out, ch_out=ch_out, name='{}.{}'.format(name, j))\n        return out\n\n    def __call__(self, input):\n        \"\"\"Get the backbone of DarkNet, that is output for the 5 stages.\n\n        :param input: Variable of input image\n        :type input: Variable\n        :Returns: The last variables of each stage.\n        \"\"\"\n        stages, block_func = self.depth_cfg[self.depth]\n        stages = stages[0:5]\n        conv = self._conv_norm(\n            input=input, ch_out=32, filter_size=3, stride=1, padding=1, name=self.prefix_name + \"yolo_input\")\n        downsample_ = self._downsample(\n 
           input=conv, ch_out=conv.shape[1] * 2, name=self.prefix_name + \"yolo_input.downsample\")\n        blocks = []\n        for i, stage in enumerate(stages):\n            block = self.layer_warp(\n                block_func=block_func,\n                input=downsample_,\n                ch_out=32 * 2**i,\n                count=stage,\n                name=self.prefix_name + \"stage.{}\".format(i))\n            blocks.append(block)\n            if i < len(stages) - 1:  # do not downsample in the last stage\n                downsample_ = self._downsample(\n                    input=block, ch_out=block.shape[1] * 2, name=self.prefix_name + \"stage.{}.downsample\".format(i))\n        if self.get_prediction:\n            pool = fluid.layers.pool2d(input=block, pool_type='avg', global_pooling=True)\n            stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)\n            out = fluid.layers.fc(\n                input=pool,\n                size=self.class_dim,\n                param_attr=ParamAttr(initializer=fluid.initializer.Uniform(-stdv, stdv), name='fc_weights'),\n                bias_attr=ParamAttr(name='fc_offset'))\n            out = fluid.layers.softmax(out)\n            return out\n        else:\n            return blocks\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/data_feed.py",
"content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageEnhance\nfrom paddle import fluid\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        # use integer division so the crop box coordinates are ints\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    #img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"data generator\n    :param paths: path to images.\n    :type paths: list, each element is a str\n    :param images: data of images, [N, H, W, C]\n    :type images: numpy.ndarray\n    \"\"\"\n    img_list = []\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = 
Image.open(img_path)\n            #img = cv2.imread(img_path)\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(Image.fromarray(np.uint8(img)))\n    for im in img_list:\n        im = process_image(im)\n        yield im\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/label_file.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/module.py",
    "content": "import os\nimport ast\nimport argparse\n\nimport numpy as np\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.io.parser import txt_parser\n\nfrom darknet53_imagenet.darknet import DarkNet\nfrom darknet53_imagenet.processor import load_label_info\nfrom darknet53_imagenet.data_feed import test_reader\n\n\n@moduleinfo(\n    name=\"darknet53_imagenet\",\n    version=\"1.1.0\",\n    type=\"cv/classification\",\n    summary=\"DarkNet53 is a image classfication model trained with ImageNet-2012 dataset.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass DarkNet53(hub.Module):\n    def _initialize(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"darknet53_model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self.infer_prog = None\n        self.pred_out = None\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        cpu_config = AnalysisConfig(self.default_pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_paddle_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        
except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = AnalysisConfig(self.default_pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_paddle_predictor(gpu_config)\n\n    def context(self, input_image=None, trainable=True, pretrained=True, param_prefix='', get_prediction=False):\n        \"\"\"Distill the Head Features, so as to perform transfer learning.\n\n        :param input_image: image tensor.\n        :type input_image: <class 'paddle.fluid.framework.Variable'>\n        :param trainable: whether to set parameters trainable.\n        :type trainable: bool\n        :param pretrained: whether to load default pretrained model.\n        :type pretrained: bool\n        :param param_prefix: the prefix of parameters in yolo_head and backbone\n        :type param_prefix: str\n        :param get_prediction: whether to get prediction,\n            if True, outputs is {'bbox_out': bbox_out},\n            if False, outputs is {'head_features': head_features}.\n        :type get_prediction: bool\n        \"\"\"\n        context_prog = input_image.block.program if input_image else fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            image = input_image if input_image else fluid.data(\n                name='image', shape=[-1, 3, 224, 224], dtype='float32', lod_level=0)\n            backbone = DarkNet(get_prediction=get_prediction)\n            out = backbone(image)\n            inputs = {'image': image}\n            if get_prediction:\n                outputs = {'pred_out': out}\n            else:\n                outputs = {'body_feats': out}\n\n            place = fluid.CPUPlace()\n            exe = fluid.Executor(place)\n            if pretrained:\n\n                def _if_exist(var):\n                    return 
os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                if not param_prefix:\n                    fluid.io.load_vars(\n                        exe, self.default_pretrained_model_path, main_program=context_prog, predicate=_if_exist)\n            else:\n                exe.run(startup_program)\n            return inputs, outputs, context_prog\n\n    def classification(self, paths=None, images=None, use_gpu=False, batch_size=1, top_k=2):\n        \"\"\"API of Classification.\n        :param paths: the path of images.\n        :type paths: list, each element is correspond to the path of an image.\n        :param images: data of images, [N, H, W, C]\n        :type images: numpy.ndarray\n        :param use_gpu: whether to use gpu or not.\n        :type use_gpu: bool\n        :param batch_size: batch size.\n        :type batch_size: int\n        :param top_k : top k\n        :type top_k : int\n        \"\"\"\n        if self.infer_prog is None:\n            inputs, outputs, self.infer_prog = self.context(trainable=False, pretrained=True, get_prediction=True)\n            self.infer_prog = self.infer_prog.clone(for_test=True)\n            self.pred_out = outputs['pred_out']\n        place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()\n        exe = fluid.Executor(place)\n        all_images = []\n        paths = paths if paths else []\n        for yield_data in test_reader(paths, images):\n            all_images.append(yield_data)\n\n        images_num = len(all_images)\n        loop_num = int(np.ceil(images_num / batch_size))\n\n        res_list = []\n        top_k = max(min(top_k, 1000), 1)\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except:\n                    pass\n            batch_data = 
np.array(batch_data).astype('float32')\n            data_tensor = PaddleTensor(batch_data.copy())\n            if use_gpu:\n                result = self.gpu_predictor.run([data_tensor])\n            else:\n                result = self.cpu_predictor.run([data_tensor])\n            for i, res in enumerate(result[0].as_ndarray()):\n                res_dict = {}\n                pred_label = np.argsort(res)[::-1][:top_k]\n                for k in pred_label:\n                    class_name = self.label_names[int(k)].split(',')[0]\n                    max_prob = res[k]\n                    res_dict[class_name] = max_prob\n                res_list.append(res_dict)\n        return res_list\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"input data\")\n\n        self.arg_input_group.add_argument('--input_file', type=str, default=None, help=\"file contain input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s is not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            
prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\"File %s or %s is not exist.\" % image_path)\n        return self.classification(paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "modules/image/classification/darknet53_imagenet/processor.py",
    "content": "# coding=utf-8\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        return fr.read().split(\"\\n\")[:-1]\n"
  },
  {
    "path": "modules/image/classification/densenet121_imagenet/README.md",
    "content": "# densenet121_imagenet\n\n|模型名称|densenet121_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DenseNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|34MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DenseNet 是 CVPR 2017 最佳论文的模型，DenseNet 以前馈方式将每一层与其他层连接，从而 L 层网络就有 L(L+1)/2 个直接连接。对于每一层，其输入是之前的所有层的特征图，而自己的特征图作为之后所有层的输入。DenseNet 缓解了梯度消失问题，加强特征传播，促进了特征重用，并大幅减少了参数量。该PaddleHub Module结构为 DenseNet121，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install densenet121_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run densenet121_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet121_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install densenet121_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet121_imagenet/README_en.md",
    "content": "# densenet121_imagenet\n\n|Module Name|densenet121_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DenseNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|34MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DenseNet is the model in CVPR2017 best paper. Every layer outputs its result as input for the layer after it, and forms the dense connection topology. The dense connection ease the probblem of vanishing gradient and improve the information flow. This module is based on DenseNet121, trained on ImageNet-2012, and can predict an image of size 224*224*3.  \n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install densenet121_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run densenet121_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet121_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    
- classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install densenet121_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet161_imagenet/README.md",
    "content": "# densenet161_imagenet\n\n|模型名称|densenet161_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DenseNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|114MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DenseNet 是 CVPR 2017 最佳论文的模型，DenseNet 以前馈方式将每一层与其他层连接，从而 L 层网络就有 L(L+1)/2 个直接连接。对于每一层，其输入是之前的所有层的特征图，而自己的特征图作为之后所有层的输入。DenseNet 缓解了梯度消失问题，加强特征传播，促进了特征重用，并大幅减少了参数量。该PaddleHub Module结构为 DenseNet161，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install densenet161_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run densenet161_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet161_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install densenet161_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet161_imagenet/README_en.md",
    "content": "# densenet161_imagenet\n\n|Module Name|densenet161_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DenseNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|114MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DenseNet is the model in CVPR2017 best paper. Every layer outputs its result as input for the layer after it, and forms the dense connection topology. The dense connection ease the probblem of vanishing gradient and improve the information flow. This module is based on DenseNet161, trained on ImageNet-2012, and can predict an image of size 224*224*3.  \n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install densenet161_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run densenet161_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet161_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n   
 - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install densenet161_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet169_imagenet/README.md",
    "content": "# densenet169_imagenet\n\n|模型名称|densenet169_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DenseNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|59MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DenseNet 是 CVPR 2017 最佳论文的模型，DenseNet 以前馈方式将每一层与其他层连接，从而 L 层网络就有 L(L+1)/2 个直接连接。对于每一层，其输入是之前的所有层的特征图，而自己的特征图作为之后所有层的输入。DenseNet 缓解了梯度消失问题，加强特征传播，促进了特征重用，并大幅减少了参数量。该PaddleHub Module结构为 DenseNet169，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install densenet169_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run densenet169_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet169_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install densenet169_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet169_imagenet/README_en.md",
    "content": "# densenet169_imagenet\n\n|Module Name|densenet169_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DenseNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|59MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DenseNet is the model of the CVPR 2017 best paper. Every layer takes the feature maps of all preceding layers as input and passes its own feature maps to all subsequent layers, forming a densely connected topology. The dense connections ease the problem of vanishing gradients and improve information flow. This module is based on DenseNet169, trained on ImageNet-2012, and can predict an image of size 224*224*3.  \n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install densenet169_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run densenet169_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet169_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install densenet169_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet201_imagenet/README.md",
    "content": "# densenet201_imagenet\n\n|模型名称|densenet201_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DenseNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|82MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DenseNet 是 CVPR 2017 最佳论文的模型，DenseNet 以前馈方式将每一层与其他层连接，从而 L 层网络就有 L(L+1)/2 个直接连接。对于每一层，其输入是之前的所有层的特征图，而自己的特征图作为之后所有层的输入。DenseNet 缓解了梯度消失问题，加强特征传播，促进了特征重用，并大幅减少了参数量。该PaddleHub Module结构为 DenseNet201，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install densenet201_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run densenet201_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet201_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install densenet201_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet201_imagenet/README_en.md",
    "content": "# densenet201_imagenet\n\n|Module Name|densenet201_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DenseNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|82MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DenseNet is the model of the CVPR 2017 best paper. Every layer takes the feature maps of all preceding layers as input and passes its own feature maps to all subsequent layers, forming a densely connected topology. The dense connections ease the problem of vanishing gradients and improve information flow. This module is based on DenseNet201, trained on ImageNet-2012, and can predict an image of size 224*224*3.  \n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install densenet201_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run densenet201_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet201_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install densenet201_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet264_imagenet/README.md",
    "content": "# densenet264_imagenet\n\n|模型名称|densenet264_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DenseNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|135MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DenseNet 是 CVPR 2017 最佳论文的模型，DenseNet 以前馈方式将每一层与其他层连接，从而 L 层网络就有 L(L+1)/2 个直接连接。对于每一层，其输入是之前的所有层的特征图，而自己的特征图作为之后所有层的输入。DenseNet 缓解了梯度消失问题，加强特征传播，促进了特征重用，并大幅减少了参数量。该PaddleHub Module结构为 DenseNet264，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install densenet264_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run densenet264_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet264_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install densenet264_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/densenet264_imagenet/README_en.md",
    "content": "# densenet264_imagenet\n\n|Module Name|densenet264_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DenseNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|135MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DenseNet is the model of the CVPR 2017 best paper. Every layer takes the feature maps of all preceding layers as input and passes its own feature maps to all subsequent layers, forming a densely connected topology. The dense connections ease the problem of vanishing gradients and improve information flow. This module is based on DenseNet264, trained on ImageNet-2012, and can predict an image of size 224*224*3.  \n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install densenet264_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run densenet264_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"densenet264_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install densenet264_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn107_imagenet/README.md",
    "content": "# dpn107_imagenet\n\n|模型名称|dpn107_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DPN|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|335MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DPN(Dual Path Networks) 是 ImageNet 2017 目标定位冠军的图像分类模型，融合了 ResNet 和 DenseNet 的核心思想。该PaddleHub Module结构为 DPN107，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install dpn107_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dpn107_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn107_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install dpn107_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn107_imagenet/README_en.md",
    "content": "# dpn107_imagenet\n\n|Module Name|dpn107_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DPN|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|335MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DPN (Dual Path Networks) is the champion model of the ILSVRC 2017 object localization task. This module is based on DPN107, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install dpn107_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dpn107_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn107_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install dpn107_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn131_imagenet/README.md",
    "content": "# dpn131_imagenet\n\n|模型名称|dpn131_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DPN|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|306MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DPN(Dual Path Networks) 是 ImageNet 2017 目标定位冠军的图像分类模型，融合了 ResNet 和 DenseNet 的核心思想。该PaddleHub Module结构为 DPN131，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install dpn131_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dpn131_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn131_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install dpn131_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn131_imagenet/README_en.md",
    "content": "# dpn131_imagenet\n\n|Module Name|dpn131_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DPN|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|306MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DPN (Dual Path Networks) is the champion model of the ILSVRC 2017 object localization task. This module is based on DPN131, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install dpn131_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dpn131_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn131_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install dpn131_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn68_imagenet/README.md",
    "content": "# dpn68_imagenet\n\n|模型名称|dpn68_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DPN|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|50MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DPN(Dual Path Networks) 是 ImageNet 2017 目标定位冠军的图像分类模型，融合了 ResNet 和 DenseNet 的核心思想。该PaddleHub Module结构为 DPN68，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install dpn68_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dpn68_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn68_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install dpn68_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn68_imagenet/README_en.md",
    "content": "# dpn68_imagenet\n\n|Module Name|dpn68_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DPN|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|50MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DPN (Dual Path Networks) is the champion model of the ILSVRC 2017 object localization task. This module is based on DPN68, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install dpn68_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dpn68_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn68_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install dpn68_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn92_imagenet/README.md",
    "content": "# dpn92_imagenet\n\n|模型名称|dpn92_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DPN|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|146MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DPN(Dual Path Networks) 是 ImageNet 2017 目标定位冠军的图像分类模型，融合了 ResNet 和 DenseNet 的核心思想。该PaddleHub Module结构为 DPN92，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install dpn92_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dpn92_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn92_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install dpn92_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn92_imagenet/README_en.md",
    "content": "# dpn92_imagenet\n\n|Module Name|dpn92_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DPN|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|146MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DPN (Dual Path Networks) is the champion model of the ILSVRC 2017 object localization task. This module is based on DPN92, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install dpn92_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dpn92_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn92_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install dpn92_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn98_imagenet/README.md",
    "content": "# dpn98_imagenet\n\n|模型名称|dpn98_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|DPN|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|238MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - DPN(Dual Path Networks) 是 ImageNet 2017 目标定位冠军的图像分类模型，融合了 ResNet 和 DenseNet 的核心思想。该PaddleHub Module结构为 DPN98，基于ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install dpn98_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dpn98_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn98_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install dpn98_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/dpn98_imagenet/README_en.md",
    "content": "# dpn98_imagenet\n\n|Module Name|dpn98_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|DPN|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|238MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - DPN (Dual Path Networks) is the champion model of the ILSVRC 2017 object localization task. This module is based on DPN98, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install dpn98_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dpn98_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"dpn98_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install dpn98_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/README.md",
    "content": "# efficientnetb0_imagenet\n\n|模型名称|efficientnetb0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|22MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB0，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb0_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb0_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb0_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb0_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb0_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb0_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/README_en.md",
    "content": "# efficientnetb0_imagenet\n\n|Module Name|efficientnetb0_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|22MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB0, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb0_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb0_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb0_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\[dict\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb0_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the results\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb0_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve prediction performance and user experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb0_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): image data, the shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb0_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB0 is an image classification model; this module is trained on the ImageNet-2012 dataset.\",\n            version=\"1.2.0\")\nclass EfficientNetB0ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb0_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classify(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n               
                               add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classify(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b0 = EfficientNetB0ImageNet()\n    b0.context()\n    import cv2\n    test_image = [\n        cv2.imread(\n            '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg')\n    ]\n    res = b0.classification(images=test_image)\n    print(res)\n    res = b0.classification(paths=[\n        '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg'\n    ])\n    print(res)\n    res = b0.classification(images=test_image)\n    
print(res)\n    res = b0.classify(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
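The `postprocess` helper in `processor.py` above converts raw network outputs to probabilities with a numerically stable softmax (subtracting the row maximum before exponentiating, so large logits cannot overflow `np.exp`), then keeps the top-k labels. A minimal standalone sketch of that normalization, using a made-up 2x3 logit batch rather than real model output:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the row max before exponentiating.
    # Same math as processor.softmax, written without in-place mutation.
    if x.ndim > 1:
        x = x - x.max(axis=1, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=1, keepdims=True)
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 0.5, 3.0]])  # hypothetical logits for two images
probs = softmax(logits)
top1 = probs.argmax(axis=1)
print(probs.sum(axis=1))  # each row sums to 1
print(top1)
```

`postprocess` applies the same idea per image, then uses `np.argsort(result_i)[::-1][0:top_k]` to select the labels it reports.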
  {
    "path": "modules/image/classification/efficientnetb0_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb0_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/README.md",
    "content": "# efficientnetb0_small_imagenet\n\n|模型名称|efficientnetb0_small_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|20MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB0，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb0_small_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb0_small_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb0_small_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 
GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb0_small_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb0_small_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb0_small_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/README_en.md",
    "content": "# efficientnetb0_small_imagenet\n\n|Module Name|efficientnetb0_small_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|20MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB0, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb0_small_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb0_small_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb0_small_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n\n  - ```python\n    def classification(images=None,\n                       
paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m efficientnetb0_small_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb0_small_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## 
V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb0_small_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert type(images), \"images is a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
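The preprocessing in `data_feed.py` above resizes the short side to 256, center-crops to `DATA_DIM` (224), converts HWC to CHW, and standardizes with the ImageNet channel mean/std. The normalization step alone can be sketched in plain NumPy on a synthetic crop (the PIL resize/crop steps are omitted, and the constant-gray input is purely illustrative):

```python
import numpy as np

# Same constants as data_feed.py, shaped for CHW broadcasting.
img_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
img_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))

# Synthetic HWC uint8 image standing in for the cropped PIL image.
img = np.full((224, 224, 3), 128, dtype=np.uint8)

# Same math as process_image: HWC -> CHW, scale to [0, 1], then standardize.
chw = img.astype('float32').transpose((2, 0, 1)) / 255
chw = (chw - img_mean) / img_std

print(chw.shape)  # (3, 224, 224)
```

The `(3, 1, 1)` shapes let the per-channel mean and std broadcast across the 224x224 spatial dimensions without an explicit loop.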
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb0_small_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB0 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.1.0\")\nclass EfficientNetB0SmallImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb0_small_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n 
   def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is 
not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n       
                                       add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    import cv2\n\n    b0 = EfficientNetB0SmallImageNet()\n    test_image = [cv2.imread('dog.jpeg')]\n    res = b0.classification(images=test_image)\n    print(res)\n    res = b0.classification(paths=['dog.jpeg'])\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    orig_shape = x.shape\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n  
  return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb0_small_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb0_small_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/README.md",
    "content": "# efficientnetb1_imagenet\n\n|模型名称|efficientnetb1_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|33MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB1，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb1_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb1_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb1_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图像类别，value 为置信度。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb1_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb1_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb1_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/README_en.md",
    "content": "# efficientnetb1_imagenet\n\n|Module Name|efficientnetb1_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|33MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB1, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb1_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb1_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb1_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, whose key is the label name and value is the corresponding probability\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb1_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb1_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb1_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert type(images), \"images is a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb1_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB1 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.2.0\")\nclass EfficientNetB1ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb1_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classify(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classify(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b1 = EfficientNetB1ImageNet()\n    b1.context()\n    import cv2\n    test_image = [\n        cv2.imread(\n            '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg')\n    ]\n    res = b1.classification(images=test_image)\n    print(res)\n    res = b1.classification(paths=[\n        '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg'\n    ])\n    print(res)\n    res = b1.classification(images=test_image)\n    print(res)\n    res = b1.classify(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb1_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb1_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/README.md",
    "content": "# efficientnetb2_imagenet\n\n|模型名称|efficientnetb2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|38MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB2，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb2_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb2_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb2_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb2_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb2_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/README_en.md",
    "content": "# efficientnetb2_imagenet\n\n|Module Name|efficientnetb2_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|38MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB2, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb2_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb2_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m efficientnetb2_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb2_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb2_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert type(images), \"images is a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb2_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB2 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.2.0\")\nclass EfficientNetB2ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb2_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classify(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                 
                             add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classify(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b2 = EfficientNetB2ImageNet()\n    b2.context()\n    import cv2\n    test_image = [\n        cv2.imread(\n            '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg')\n    ]\n    res = b2.classification(images=test_image)\n    print(res)\n    res = b2.classification(paths=[\n        '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg'\n    ])\n    print(res)\n    res = b2.classification(images=test_image)\n    
print(res)\n    res = b2.classify(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb2_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb2_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/README.md",
    "content": "# efficientnetb3_imagenet\n\n|模型名称|efficientnetb3_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|51MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB3，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb3_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb3_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb3_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb3_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb3_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb3_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/README_en.md",
    "content": "# efficientnetb3_imagenet\n\n|Module Name|efficientnetb3_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|51MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB3, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb3_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb3_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb3_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m efficientnetb3_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb3_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb3_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert type(images), \"images is a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb3_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB3 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.2.0\")\nclass EfficientNetB3ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb3_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            batch_image = np.array([data['image'] for data in batch_data])\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classify(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n    
                                          usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classify(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b3 = EfficientNetB3ImageNet()\n    b3.context()\n    import cv2\n    test_image = [\n        cv2.imread(\n            '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg')\n    ]\n    res = b3.classification(images=test_image)\n    print(res)\n    res = b3.classification(paths=[\n        '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg'\n    ])\n 
   print(res)\n    res = b3.classification(images=test_image)\n    print(res)\n    res = b3.classify(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb3_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb3_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/README.md",
    "content": "# efficientnetb4_imagenet\n\n|模型名称|efficientnetb4_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|77MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌开源的轻量级网络，它的主干网络由 MBConv 构成，同时采用 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module 结构为 EfficientNetB4，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb4_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础Windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb4_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb4_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的 shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图片类别，value 为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb4_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb4_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb4_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/README_en.md",
    "content": "# efficientnetb4_imagenet\n\n|Module Name|efficientnetb4_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|77MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a lightweight model proposed by Google. Its backbone consists of MBConv blocks and it takes advantage of the squeeze-and-excitation operation. This module is based on EfficientNetB4, trained on the ImageNet-2012 dataset, and accepts input images of size 224 x 224 x 3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb4_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb4_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb4_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): batch size;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, where key is the label name and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb4_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb4_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Improve the prediction performance and user experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb4_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        # Center crop: integer division keeps the crop box integral.\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            # Convert the BGR ndarray to an RGB PIL image.\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(int(round(time.time(), 6) * 1e6))\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb4_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB4 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.2.0\")\nclass EfficientNetB4ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb4_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except (KeyError, IndexError, ValueError):\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except (KeyError, IndexError, ValueError):\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            handle_id = iter_id * batch_size\n            batch_data = all_data[handle_id:handle_id + batch_size]\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n               
                               add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu, top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b4 = EfficientNetB4ImageNet()\n    import cv2\n    test_image = [\n        cv2.imread(\n            '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg')\n    ]\n    res = b4.classification(images=test_image)\n    print(res)\n    res = b4.classification(paths=[\n        '/mnt/zhangxuefei/program-paddle/PaddleHub/hub_module/tests/image_dataset/classification/animals/dog.jpeg'\n    ])\n    print(res)\n    res = b4.classification(images=test_image)\n    
print(res)\n    res = b4.classification(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/processor.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb4_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb4_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/README.md",
    "content": "# efficientnetb5_imagenet\n\n|模型名称|efficientnetb5_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|121MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB5，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb5_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb5_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb5_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/README_en.md",
    "content": "# efficientnetb5_imagenet\n\n|Module Name|efficientnetb5_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|121MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB5, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb5_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the size of each batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb5_imagenet\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb5_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
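The label file above stores one ImageNet class per line, with comma-separated synonyms; the module's `postprocess` reports only the first synonym of the predicted line via `split(',')[0]`. A minimal sketch of that lookup, with sample lines copied from the list above (`primary_label` is a hypothetical helper, not part of the module):

```python
# Sample lines taken verbatim from label_list.txt above.
sample_lines = [
    "tench, Tinca tinca",
    "giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca",
    "toilet tissue, toilet paper, bathroom tissue",
]


def primary_label(line):
    """Return the first comma-separated synonym, as postprocess() does."""
    return line.split(',')[0]


names = [primary_label(line) for line in sample_lines]
print(names)  # ['tench', 'giant panda', 'toilet tissue']
```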
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/module.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb5_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB5 is an image classification model; this module is trained on the ImageNet-2012 dataset.\",\n            version=\"1.2.0\")\nclass EfficientNetB5ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb5_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu, top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b5 = EfficientNetB5ImageNet()\n    import cv2\n    test_image = [cv2.imread('dog.jpeg')]\n    res = b5.classification(images=test_image)\n    print(res)\n    res = b5.classification(paths=['dog.jpeg'])\n    print(res)\n    res = b5.classification(images=test_image)\n    print(res)\n"
  },
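The `classification` method above walks the preprocessed samples in fixed-size batches: `loop_num = ceil(total / batch_size)`, and the final batch simply carries the leftover images. The loop can be sketched standalone like this, using list slicing in place of the module's index-and-`except` padding (`iter_batches` is a hypothetical helper, not part of the module):

```python
import math


def iter_batches(all_data, batch_size):
    """Yield fixed-size batches; the last batch may hold fewer items."""
    loop_num = math.ceil(len(all_data) / batch_size)
    for iter_id in range(loop_num):
        handle_id = iter_id * batch_size  # start index of this batch
        yield all_data[handle_id:handle_id + batch_size]


batches = list(iter_batches(list(range(7)), batch_size=3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

Slicing past the end of a list is safe in Python, which is why no explicit padding or exception handling is needed here.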
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/processor.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indices = np.argsort(result_i)[::-1][0:top_k]\n        for index in indices:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
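`postprocess` above converts raw logits to probabilities with a max-shifted softmax (subtracting the maximum before exponentiating, for numerical stability), then keeps the `top_k` highest-scoring labels. A self-contained sketch of the same steps for a single image; the three label lines and logits are made-up examples:

```python
import numpy as np


def softmax(x):
    x = x - np.max(x)  # shift by the max for numerical stability
    e = np.exp(x)
    return e / np.sum(e)


def postprocess_one(logits, label_list, top_k):
    """Mirror of processor.postprocess() for a single logits vector."""
    probs = softmax(np.asarray(logits, dtype='float64'))
    indices = np.argsort(probs)[::-1][:top_k]  # descending by probability
    return {label_list[i].split(',')[0]: float(probs[i]) for i in indices}


labels = ["tench, Tinca tinca", "goldfish, Carassius auratus", "great white shark, white shark"]
out = postprocess_one([0.5, 2.0, 1.0], labels, top_k=2)
print(out)  # 'goldfish' (largest logit) comes first
```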
  {
    "path": "modules/image/classification/efficientnetb5_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb5_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/README.md",
    "content": "# efficientnetb6_imagenet\n\n|模型名称|efficientnetb6_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|170MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB6，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb6_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb6_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb6_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的类别名称，value 为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb6_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb6_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb6_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/README_en.md",
    "content": "# efficientnetb6_imagenet\n\n|Module Name|efficientnetb6_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|170MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a light-weight model proposed by google, which consists of MBConv, and takes advantage of squeeze-and-excitation operation. This module is based on EfficientNetB6, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb6_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb6_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb6_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, where key is the label name and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb6_imagenet\n    ```\n\n  - The service is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb6_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb6_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return 
img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert type(images), \"images is a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb6_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB6 is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.2.0\")\nclass EfficientNetB6ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb6_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classify(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n               
                               add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classify(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    b6 = EfficientNetB6ImageNet()\n    b6.context()\n    import cv2\n    test_image = [cv2.imread('dog.jpeg')]\n    res = b6.classification(images=test_image)\n    print(res)\n    res = b6.classification(paths=['dog.jpeg'])\n    print(res)\n    res = b6.classification(images=test_image)\n    print(res)\n    res = b6.classify(images=test_image)\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/processor.py",
    "content": "# -*- coding:utf-8 -*-\n# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    
return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb6_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb6_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/README.md",
    "content": "# efficientnetb7_imagenet\n\n|模型名称|efficientnetb7_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|EfficientNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|260MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - EfficientNet 是谷歌的开源新模型，是一个轻量级网络，它的主干网络由 MBConv 构成，同时采取了 squeeze-and-excitation 操作对网络结构进行优化。该 PaddleHub Module结构为 EfficientNetB7，基于 ImageNet-2012 数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install efficientnetb7_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run efficientnetb7_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb7_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的类别名称，value 为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m efficientnetb7_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb7_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  提升预测性能以及易用性\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install efficientnetb7_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/README_en.md",
    "content": "# efficientnetb7_imagenet\n\n|Module Name|efficientnetb7_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|EfficientNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|260MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - EfficientNet is a lightweight model proposed by Google, which consists of MBConv blocks and takes advantage of the squeeze-and-excitation operation. This module is based on EfficientNetB7, trained on the ImageNet-2012 dataset, and accepts input images of size 224 x 224 x 3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install efficientnetb7_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run efficientnetb7_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"efficientnetb7_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       
batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m efficientnetb7_imagenet\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/efficientnetb7_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 
1.1.0\n\n  Improve the prediction performance and users' experience\n\n* 1.2.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install efficientnetb7_imagenet==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/data_feed.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/module.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"efficientnetb7_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"EfficientNetB7 is an image classification model; this module is trained on the ImageNet dataset.\",\n            version=\"1.2.0\")\nclass EfficientNetB7ImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"efficientnetb7_imagenet_infer_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def 
get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set 
correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            # feed batch image; slicing naturally yields a short final batch\n            handle_id = iter_id * batch_size\n            batch_data = all_data[handle_id:handle_id + batch_size]\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n               
                               add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    import cv2\n\n    b7 = EfficientNetB7ImageNet()\n    test_image = [cv2.imread('dog.jpeg')]\n    res = b7.classification(images=test_image)\n    print(res)\n    res = b7.classification(paths=['dog.jpeg'])\n    print(res)\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/processor.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; frombuffer interprets the raw bytes without copying\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n\n    Returns:\n        list[dict]: one dict per image, mapping each top-k label to its probability.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/efficientnetb7_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"efficientnetb7_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Cardigan' in data)\n        self.assertTrue(data['Cardigan'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Cardigan' in data)\n        self.assertTrue(data['Cardigan'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Cardigan' in data)\n        self.assertTrue(data['Cardigan'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_25_imagenet/README.md",
"content": "# esnet_x0_25_imagenet\n\n|Module Name|esnet_x0_25_imagenet|\n| :--- | :---: |\n|Category|Image - Image Classification|\n|Network|ESNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported|No|\n|Module Size|10 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - ESNet (Enhanced ShuffleNet) is a lightweight network developed by Baidu. Building on ShuffleNetV2, it combines the strengths of MobileNetV3, GhostNet, and PPLCNet into a network that is both faster and more accurate on ARM devices. Owing to this performance, [PP-PicoDet](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.3/configs/picodet), released by PaddleDetection, adopts it as the backbone and, combined with a stronger detection algorithm, set a new SOTA for object detection on ARM devices. This module is the ESNet model with scale parameter x0.25.\n\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install esnet_x0_25_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run esnet_x0_25_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - This invokes the classification model from the command line. For more information, see [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"esnet_x0_25_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, each with shape \\[H, W, C\\], in BGR color space; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\_size (int): batch size; <br/>\n      - use\_gpu (bool): whether to use GPU; **if GPU is used, set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\[dict\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class index), 'scores' (confidence), and 'label_names' (class name)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online image-recognition service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m esnet_x0_25_imagenet\n    ```\n\n  - This deploys an online image-recognition service; the default port is 8866.\n\n  - **NOTE:** To predict with GPU, set the CUDA\_VISIBLE\_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the few lines below send a prediction request and fetch the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send the HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/esnet_x0_25_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install esnet_x0_25_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_25_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import concat\nfrom paddle import ParamAttr\nfrom paddle import reshape\nfrom paddle import split\nfrom paddle import transpose\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn import MaxPool2D\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\nMODEL_STAGES_PATTERN = {\"ESNet\": [\"blocks[2]\", \"blocks[9]\", \"blocks[12]\"]}\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n        
    # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_patterns' would be ignored when 'return_stages' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"The 'return_stages' set error. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of layer to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify target layer specified by 'layer_name_pattern'. 
Its parameters are the layer (nn.Layer) and the pattern (str) that matched it (a member of 'layer_name_pattern' when that argument is a list), and it returns the processed layer.\n\n        Returns:\n            List[str]: The pattern(s) in 'layer_name_pattern' that were matched and replaced successfully.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                
setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of layer that stop forward and backward after this layer.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers that after stop_layer_name('{stop_layer_name}') to IdentityLayer. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> Dict[str, nn.Layer]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of layer to return output.\n\n        Returns:\n            Dict[str, nn.Layer]: The pattern(str) and corresponding layer(nn.Layer) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                
layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of target layer specified by layer_name and layer_index.\n        layer_name (str): The name of target layer to be set to Identity.\n        layer_index (str, optional): The index of target layer to be set to Identity in parent_layer. 
Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern that describes the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n    Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if failed; if successful, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. 
Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') specified in pattern('{pattern}') was found.\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer found by index('{target_layer_index}') specified in pattern('{pattern}'). The index should be >= 0 and < {len(target_layer)}.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\ndef channel_shuffle(x, groups):\n    batch_size, num_channels, height, width = x.shape[0:4]\n    channels_per_group = num_channels // groups\n    x = reshape(x=x, shape=[batch_size, groups, channels_per_group, height, width])\n    x = transpose(x=x, perm=[0, 2, 1, 3, 4])\n    x = reshape(x=x, shape=[batch_size, num_channels, height, width])\n    return x\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, if_act=True):\n        
super().__init__()\n        self.conv = Conv2D(in_channels=in_channels,\n                           out_channels=out_channels,\n                           kernel_size=kernel_size,\n                           stride=stride,\n                           padding=(kernel_size - 1) // 2,\n                           groups=groups,\n                           weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(out_channels,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.if_act = if_act\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        if self.if_act:\n            x = self.hardswish(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, y=x)\n        return x\n\n\nclass ESBlock1(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.pw_1_1 = ConvBNLayer(in_channels=in_channels // 2, out_channels=out_channels // 2, kernel_size=1, stride=1)\n        self.dw_1 = ConvBNLayer(in_channels=out_channels // 2,\n                                out_channels=out_channels // 2,\n                    
            kernel_size=3,\n                                stride=1,\n                                groups=out_channels // 2,\n                                if_act=False)\n        self.se = SEModule(out_channels)\n\n        self.pw_1_2 = ConvBNLayer(in_channels=out_channels, out_channels=out_channels // 2, kernel_size=1, stride=1)\n\n    def forward(self, x):\n        x1, x2 = split(x, num_or_sections=[x.shape[1] // 2, x.shape[1] // 2], axis=1)\n        x2 = self.pw_1_1(x2)\n        x3 = self.dw_1(x2)\n        x3 = concat([x2, x3], axis=1)\n        x3 = self.se(x3)\n        x3 = self.pw_1_2(x3)\n        x = concat([x1, x3], axis=1)\n        return channel_shuffle(x, 2)\n\n\nclass ESBlock2(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n\n        # branch1\n        self.dw_1 = ConvBNLayer(in_channels=in_channels,\n                                out_channels=in_channels,\n                                kernel_size=3,\n                                stride=2,\n                                groups=in_channels,\n                                if_act=False)\n        self.pw_1 = ConvBNLayer(in_channels=in_channels, out_channels=out_channels // 2, kernel_size=1, stride=1)\n        # branch2\n        self.pw_2_1 = ConvBNLayer(in_channels=in_channels, out_channels=out_channels // 2, kernel_size=1)\n        self.dw_2 = ConvBNLayer(in_channels=out_channels // 2,\n                                out_channels=out_channels // 2,\n                                kernel_size=3,\n                                stride=2,\n                                groups=out_channels // 2,\n                                if_act=False)\n        self.se = SEModule(out_channels // 2)\n        self.pw_2_2 = ConvBNLayer(in_channels=out_channels // 2, out_channels=out_channels // 2, kernel_size=1)\n        self.concat_dw = ConvBNLayer(in_channels=out_channels,\n                                     out_channels=out_channels,\n              
                       kernel_size=3,\n                                     groups=out_channels)\n        self.concat_pw = ConvBNLayer(in_channels=out_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x):\n        x1 = self.dw_1(x)\n        x1 = self.pw_1(x1)\n        x2 = self.pw_2_1(x)\n        x2 = self.dw_2(x2)\n        x2 = self.se(x2)\n        x2 = self.pw_2_2(x2)\n        x = concat([x1, x2], axis=1)\n        x = self.concat_dw(x)\n        x = self.concat_pw(x)\n        return x\n\n\nclass ESNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 class_num=1000,\n                 scale=1.0,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_num = class_num\n        self.class_expand = class_expand\n        stage_repeats = [3, 7, 3]\n        stage_out_channels = [\n            -1, 24, make_divisible(116 * scale),\n            make_divisible(232 * scale),\n            make_divisible(464 * scale), 1024\n        ]\n\n        self.conv1 = ConvBNLayer(in_channels=3, out_channels=stage_out_channels[1], kernel_size=3, stride=2)\n        self.max_pool = MaxPool2D(kernel_size=3, stride=2, padding=1)\n\n        block_list = []\n        for stage_id, num_repeat in enumerate(stage_repeats):\n            for i in range(num_repeat):\n                if i == 0:\n                    block = ESBlock2(in_channels=stage_out_channels[stage_id + 1],\n                                     out_channels=stage_out_channels[stage_id + 2])\n                else:\n                    block = ESBlock1(in_channels=stage_out_channels[stage_id + 2],\n                                     out_channels=stage_out_channels[stage_id + 2])\n                block_list.append(block)\n        self.blocks = nn.Sequential(*block_list)\n\n        self.conv2 = 
ConvBNLayer(in_channels=stage_out_channels[-2], out_channels=stage_out_channels[-1], kernel_size=1)\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=stage_out_channels[-1],\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n        self.fc = Linear(self.class_expand, self.class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.max_pool(x)\n        x = self.blocks(x)\n        x = self.conv2(x)\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef ESNet_x0_25(pretrained=False, use_ssld=False, **kwargs):\n    \"\"\"\n    ESNet_x0_25\n    Args:\n        pretrained: bool or str, default False. If True, load pretrained parameters;\n                    if a str, it is treated as the path of the pretrained model.\n        use_ssld: bool, default False. Whether to use the distillation-pretrained model when pretrained=True.\n    Returns:\n        model: nn.Layer. An `ESNet_x0_25` model built according to the args.\n    \"\"\"\n    model = ESNet(scale=0.25, stages_pattern=MODEL_STAGES_PATTERN[\"ESNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_25_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import ESNet_x0_25\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"esnet_x0_25_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass Esnet_x0_25_Imagenet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'ESNet_x0_25.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'ESNet_x0_25_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = ESNet_x0_25()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_25_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_25_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if ks[0] not in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_5_imagenet/README.md",
    "content": "# esnet_x0_5_imagenet\n\n|Model Name|esnet_x0_5_imagenet|\n| :--- | :---: |\n|Category|Image - Image Classification|\n|Network|ESNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning Supported|No|\n|Model Size|12 MB|\n|Latest Update|2022-04-02|\n|Metric|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - ESNet (Enhanced ShuffleNet) is a lightweight network developed by Baidu. Based on ShuffleNetV2, it combines the strengths of MobileNetV3, GhostNet and PPLCNet, yielding a network that is faster and more accurate on ARM devices. Owing to this strong performance, [PP-PicoDet](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.3/configs/picodet), released by PaddleDetection, adopts it as the backbone; combined with a stronger detection algorithm, it set a new SOTA for object detection on ARM devices. This module is the ESNet model with scale x0.5.\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install esnet_x0_5_imagenet\n    ```\n  - If you encounter problems during installation, see: [Windows installation guide](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux installation guide](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS installation guide](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command-line Prediction\n\n  - ```shell\n    $ hub run esnet_x0_5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - This invokes the classification model from the command line; for more details see [PaddleHub command-line usage](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"esnet_x0_5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, the shape of each image is \\[H, W, C\\], and the color space must be BGR; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if you use GPU, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidences) and 'label_names' (class names)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online image-recognition service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m esnet_x0_5_imagenet\n    ```\n\n  - This deploys an online image-recognition service; the default port is 8866.\n\n  - **NOTE:** If you predict with GPU, set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise no setting is needed.\n\n- ### Step 2: Send a Prediction Request\n\n  - With the server configured, the few lines of code below send a prediction request and fetch the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/esnet_x0_5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install esnet_x0_5_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_5_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import concat\nfrom paddle import ParamAttr\nfrom paddle import reshape\nfrom paddle import split\nfrom paddle import transpose\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn import MaxPool2D\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\nMODEL_STAGES_PATTERN = {\"ESNet\": [\"blocks[2]\", \"blocks[9]\", \"blocks[12]\"]}\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n        
    # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict
\n\n    def init_res(self, stages_pattern, return_patterns=None, return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_stages' would be ignored when 'return_patterns' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"The 'return_stages' set error. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)
\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)
\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify the target layer specified by 'layer_name_pattern'. 
Its formal params are the layer (nn.Layer) to be modified and the pattern (str) from 'layer_name_pattern' that matched it, and it returns the processed layer.
\n\n        Returns:\n            List[str]: The patterns in 'layer_name_pattern' that were matched and handled successfully.
\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"
\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return hit_layer_pattern_list
\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward computation stop.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True
\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of the layer(s) whose output should be returned.\n\n        Returns:\n            List[str]: The patterns that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list
\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output
\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of target layer specified by layer_name and layer_index.\n        layer_name (str): The name of target layer to be set to Identity.\n        layer_index (str, optional): The index of target layer to be set to Identity in parent_layer. 
Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"
\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after
\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern describing the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n    Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if parsing failed. Otherwise, the layers parsed, in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"
\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named '{target_layer_name}' was found as specified in pattern('{pattern}').\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer found at index('{target_layer_index}') specified in pattern('{pattern}'). The index should be >= 0 and < {len(target_layer)}.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list
\n\n\ndef channel_shuffle(x, groups):\n    batch_size, num_channels, height, width = x.shape[0:4]\n    channels_per_group = num_channels // groups\n    x = reshape(x=x, shape=[batch_size, groups, channels_per_group, height, width])\n    x = transpose(x=x, perm=[0, 2, 1, 3, 4])\n    x = reshape(x=x, shape=[batch_size, num_channels, height, width])\n    return x
\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v
\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, if_act=True):\n        super().__init__()\n        self.conv = Conv2D(in_channels=in_channels,\n                           out_channels=out_channels,\n                           kernel_size=kernel_size,\n                           stride=stride,\n                           padding=(kernel_size - 1) // 2,\n                           groups=groups,\n                           weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(out_channels,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.if_act = if_act\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        if self.if_act:\n            x = self.hardswish(x)\n        return x
\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, y=x)\n        return x
\n\n\nclass ESBlock1(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.pw_1_1 = ConvBNLayer(in_channels=in_channels // 2, out_channels=out_channels // 2, kernel_size=1, stride=1)\n        self.dw_1 = ConvBNLayer(in_channels=out_channels // 2,\n                                out_channels=out_channels // 2,\n                    
            kernel_size=3,\n                                stride=1,\n                                groups=out_channels // 2,\n                                if_act=False)\n        self.se = SEModule(out_channels)\n\n        self.pw_1_2 = ConvBNLayer(in_channels=out_channels, out_channels=out_channels // 2, kernel_size=1, stride=1)\n\n    def forward(self, x):\n        x1, x2 = split(x, num_or_sections=[x.shape[1] // 2, x.shape[1] // 2], axis=1)\n        x2 = self.pw_1_1(x2)\n        x3 = self.dw_1(x2)\n        x3 = concat([x2, x3], axis=1)\n        x3 = self.se(x3)\n        x3 = self.pw_1_2(x3)\n        x = concat([x1, x3], axis=1)\n        return channel_shuffle(x, 2)\n\n\nclass ESBlock2(TheseusLayer):\n\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n\n        # branch1\n        self.dw_1 = ConvBNLayer(in_channels=in_channels,\n                                out_channels=in_channels,\n                                kernel_size=3,\n                                stride=2,\n                                groups=in_channels,\n                                if_act=False)\n        self.pw_1 = ConvBNLayer(in_channels=in_channels, out_channels=out_channels // 2, kernel_size=1, stride=1)\n        # branch2\n        self.pw_2_1 = ConvBNLayer(in_channels=in_channels, out_channels=out_channels // 2, kernel_size=1)\n        self.dw_2 = ConvBNLayer(in_channels=out_channels // 2,\n                                out_channels=out_channels // 2,\n                                kernel_size=3,\n                                stride=2,\n                                groups=out_channels // 2,\n                                if_act=False)\n        self.se = SEModule(out_channels // 2)\n        self.pw_2_2 = ConvBNLayer(in_channels=out_channels // 2, out_channels=out_channels // 2, kernel_size=1)\n        self.concat_dw = ConvBNLayer(in_channels=out_channels,\n                                     out_channels=out_channels,\n              
                       kernel_size=3,\n                                     groups=out_channels)\n        self.concat_pw = ConvBNLayer(in_channels=out_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x):\n        x1 = self.dw_1(x)\n        x1 = self.pw_1(x1)\n        x2 = self.pw_2_1(x)\n        x2 = self.dw_2(x2)\n        x2 = self.se(x2)\n        x2 = self.pw_2_2(x2)\n        x = concat([x1, x2], axis=1)\n        x = self.concat_dw(x)\n        x = self.concat_pw(x)\n        return x\n\n\nclass ESNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 class_num=1000,\n                 scale=1.0,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_num = class_num\n        self.class_expand = class_expand\n        stage_repeats = [3, 7, 3]\n        stage_out_channels = [\n            -1, 24, make_divisible(116 * scale),\n            make_divisible(232 * scale),\n            make_divisible(464 * scale), 1024\n        ]\n\n        self.conv1 = ConvBNLayer(in_channels=3, out_channels=stage_out_channels[1], kernel_size=3, stride=2)\n        self.max_pool = MaxPool2D(kernel_size=3, stride=2, padding=1)\n\n        block_list = []\n        for stage_id, num_repeat in enumerate(stage_repeats):\n            for i in range(num_repeat):\n                if i == 0:\n                    block = ESBlock2(in_channels=stage_out_channels[stage_id + 1],\n                                     out_channels=stage_out_channels[stage_id + 2])\n                else:\n                    block = ESBlock1(in_channels=stage_out_channels[stage_id + 2],\n                                     out_channels=stage_out_channels[stage_id + 2])\n                block_list.append(block)\n        self.blocks = nn.Sequential(*block_list)\n\n        self.conv2 = 
ConvBNLayer(in_channels=stage_out_channels[-2], out_channels=stage_out_channels[-1], kernel_size=1)\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=stage_out_channels[-1],\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n        self.fc = Linear(self.class_expand, self.class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)
\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.max_pool(x)\n        x = self.blocks(x)\n        x = self.conv2(x)\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x
\n\n\ndef ESNet_x0_5(pretrained=False, use_ssld=False, **kwargs):\n    \"\"\"\n    ESNet_x0_5\n    Args:\n        pretrained: bool=False or str. If `True`, load pretrained parameters; if `False`, do not.\n                    If str, it is the path of the pretrained model.\n        use_ssld: bool=False. Whether to use the distillation pretrained model when pretrained=True.\n    Returns:\n        model: nn.Layer. The `ESNet_x0_5` model built with the given args.\n    \"\"\"\n    model = ESNet(scale=0.5, stages_pattern=MODEL_STAGES_PATTERN[\"ESNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_5_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import ESNet_x0_5\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"esnet_x0_5_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass Esnet_x0_5_Imagenet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'ESNet_x0_5.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'ESNet_x0_5_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = ESNet_x0_5()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images == None and paths == None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)
\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results
\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_5_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image: subtract mean, divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, int)\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/esnet_x0_5_imagenet/utils.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, memo):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace a value in a nested dict or list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/README.md",
    "content": "# fix_resnext101_32x48d_wsl_imagenet\n\n|模型名称|fix_resnext101_32x48d_wsl_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|3.1GB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fix_resnext101_32x48d_wsl_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run fix_resnext101_32x48d_wsl_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"fix_resnext101_32x48d_wsl_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n   
   - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图像类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m fix_resnext101_32x48d_wsl_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/fix_resnext101_32x48d_wsl_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install fix_resnext101_32x48d_wsl_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/README_en.md",
"content": "# fix_resnext101_32x48d_wsl_imagenet\n\n|Module Name|fix_resnext101_32x48d_wsl_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|3.1GB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt was proposed by UC San Diego and Facebook AI Research in 2017. This module is based on the ResNeXt model. It was trained with weak supervision on billions of social media images, fine-tuned on the ImageNet-2012 dataset, and accepts input images of size 224 x 224 x 3.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install fix_resnext101_32x48d_wsl_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run fix_resnext101_32x48d_wsl_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"fix_resnext101_32x48d_wsl_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       
paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, the key is the label name, and the value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m fix_resnext101_32x48d_wsl_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/fix_resnext101_32x48d_wsl_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    
```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install fix_resnext101_32x48d_wsl_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/data_feed.py",
"content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = 
Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"fix_resnext101_32x48d_wsl_imagenet\",\n    type=\"CV/image_classification\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\"fix_resnext101_32x48d_wsl is a image classfication model, this module is trained with imagenet datasets.\",\n    version=\"1.1.0\")\nclass FixResnext10132x48dwslImagenet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self.predictor_set = False\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = 
create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        if not self.predictor_set:\n            self._set_config()\n            self.predictor_set = True\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n               
 except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/processor.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/fix_resnext101_32x48d_wsl_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"fix_resnext101_32x48d_wsl_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/food_classification/README.md",
    "content": "# food_classification\n\n|模型名称|food_classification|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet50_vd_ssld|\n|数据集|美食数据集|\n|是否支持Fine-tuning|否|\n|模型大小|91MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 美食分类（food_classification），该模型可识别苹果派，小排骨，烤面包，牛肉馅饼，牛肉鞑靼。该PaddleHub Module支持API预测及命令行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install food_classification\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run food_classification --input_path /PATH/TO/IMAGE\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"food_classification\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images：list类型，待检测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型:\n        - category_id (int): 类别的id；\n        - category（str）: 类别;\n        - score（float）: 准确率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install food_classification==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/food_classification/README_en.md",
    "content": "# food_classification\n\n|Module Name|food_classification|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet50_vd_ssld|\n|Dataset|Food Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|91MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module can be used for food classification.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n  - paddlex >= 1.3.7\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install food_classification\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run food_classification --input_path /PATH/TO/IMAGE\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"food_classification\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n 
       - category_id (int): category id；\n        - category（str）: category name;\n        - score（float）: probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install food_classification==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/food_classification/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/food_classification/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    # return base64.b64encode(image)\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name='food_classification',\n    type='cv/classification',\n    author='郑博培、彭兆帅',\n    author_email='2733821739@qq.com, 1084667371@qq.com',\n    summary='Food classification',\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n    @serving\n    def 
serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                # result_new = dict()\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = np.asscalar(value)\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = np.asscalar(value)\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', type=bool, default=False, help=\"whether use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n"
  },
  {
    "path": "modules/image/classification/food_classification/requirements.txt",
    "content": "paddlex==1.3.7\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x0_5_imagenet/README.md",
    "content": "# ghostnet_x0_5_imagenet\n\n|模型名称|ghostnet_x0_5_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|GhostNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|15MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - GhostNet是华为在2020年提出的全新轻量级网络结构，通过引入ghost模块，大大缓解了传统深度网络中特征的冗余计算问题，大大减少了网络参数和计算量。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install ghostnet_x0_5_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run ghostnet_x0_5_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='ghostnet_x0_5_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ghostnet_x0_5_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"ghostnet_x0_5_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              
import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='ghostnet_x0_5_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ghostnet_x0_5_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            # tobytes() 替代已废弃的 tostring()\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            # np.frombuffer() 替代已废弃的 np.fromstring()\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images': [cv2_to_base64(org_im)], 'top_k': 2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ghostnet_x0_5_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x0_5_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x0_5_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle import ParamAttr\nfrom paddle.nn.initializer import Uniform, KaimingNormal\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(initializer=KaimingNormal(), name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + \"_bn\"\n\n        self._batch_norm = nn.BatchNorm(\n            num_channels=out_channels,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            bias_attr=ParamAttr(name=bn_name + \"_offset\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            moving_mean_name=bn_name + \"_mean\",\n          
  moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, num_channels, reduction_ratio=4, name=None):\n        super(SEBlock, self).__init__()\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        self._num_channels = num_channels\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        med_ch = num_channels // reduction_ratio\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_channels,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs):\n        pool = self.pool2d_gap(inputs)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = paddle.clip(x=excitation, min=0, max=1)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = paddle.multiply(inputs, excitation)\n        return out\n\n\nclass GhostModule(nn.Layer):\n    def __init__(self, in_channels, output_channels, kernel_size=1, ratio=2, dw_size=3, stride=1, relu=True, name=None):\n        super(GhostModule, self).__init__()\n        init_channels = int(math.ceil(output_channels / ratio))\n        new_channels = int(init_channels * (ratio - 1))\n        self.primary_conv = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=init_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            
groups=1,\n            act=\"relu\" if relu else None,\n            name=name + \"_primary_conv\")\n        self.cheap_operation = ConvBNLayer(\n            in_channels=init_channels,\n            out_channels=new_channels,\n            kernel_size=dw_size,\n            stride=1,\n            groups=init_channels,\n            act=\"relu\" if relu else None,\n            name=name + \"_cheap_operation\")\n\n    def forward(self, inputs):\n        x = self.primary_conv(inputs)\n        y = self.cheap_operation(x)\n        out = paddle.concat([x, y], axis=1)\n        return out\n\n\nclass GhostBottleneck(nn.Layer):\n    def __init__(self, in_channels, hidden_dim, output_channels, kernel_size, stride, use_se, name=None):\n        super(GhostBottleneck, self).__init__()\n        self._stride = stride\n        self._use_se = use_se\n        self._num_channels = in_channels\n        self._output_channels = output_channels\n        self.ghost_module_1 = GhostModule(\n            in_channels=in_channels,\n            output_channels=hidden_dim,\n            kernel_size=1,\n            stride=1,\n            relu=True,\n            name=name + \"_ghost_module_1\")\n        if stride == 2:\n            self.depthwise_conv = ConvBNLayer(\n                in_channels=hidden_dim,\n                out_channels=hidden_dim,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=hidden_dim,\n                act=None,\n                name=name + \"_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n        if use_se:\n            self.se_block = SEBlock(num_channels=hidden_dim, name=name + \"_se\")\n        self.ghost_module_2 = GhostModule(\n            in_channels=hidden_dim,\n            output_channels=output_channels,\n            kernel_size=1,\n            relu=False,\n            name=name + \"_ghost_module_2\")\n        if stride != 1 or in_channels != output_channels:\n            
self.shortcut_depthwise = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=in_channels,\n                act=None,\n                name=name + \"_shortcut_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n            self.shortcut_conv = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=output_channels,\n                kernel_size=1,\n                stride=1,\n                groups=1,\n                act=None,\n                name=name + \"_shortcut_conv\")\n\n    def forward(self, inputs):\n        x = self.ghost_module_1(inputs)\n        if self._stride == 2:\n            x = self.depthwise_conv(x)\n        if self._use_se:\n            x = self.se_block(x)\n        x = self.ghost_module_2(x)\n        if self._stride == 1 and self._num_channels == self._output_channels:\n            shortcut = inputs\n        else:\n            shortcut = self.shortcut_depthwise(inputs)\n            shortcut = self.shortcut_conv(shortcut)\n        return paddle.add(x=x, y=shortcut)\n\n\n@moduleinfo(\n    name=\"ghostnet_x0_5_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ghostnet_x0_5_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass GhostNet(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(GhostNet, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in 
files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.cfgs = [\n            # k, t, c, SE, s\n            [3, 16, 16, 0, 1],\n            [3, 48, 24, 0, 2],\n            [3, 72, 24, 0, 1],\n            [5, 72, 40, 1, 2],\n            [5, 120, 40, 1, 1],\n            [3, 240, 80, 0, 2],\n            [3, 200, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 480, 112, 1, 1],\n            [3, 672, 112, 1, 1],\n            [5, 672, 160, 1, 2],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1]\n        ]\n        self.scale = 0.5\n        output_channels = int(self._make_divisible(16 * self.scale, 4))\n        self.conv1 = ConvBNLayer(\n            in_channels=3, out_channels=output_channels, kernel_size=3, stride=2, groups=1, act=\"relu\", name=\"conv1\")\n        # build inverted residual blocks\n        idx = 0\n        self.ghost_bottleneck_list = []\n        for k, exp_size, c, use_se, s in self.cfgs:\n            in_channels = output_channels\n            output_channels = int(self._make_divisible(c * self.scale, 4))\n            hidden_dim = int(self._make_divisible(exp_size * self.scale, 4))\n            ghost_bottleneck = self.add_sublayer(\n                name=\"_ghostbottleneck_\" + str(idx),\n                sublayer=GhostBottleneck(\n                    in_channels=in_channels,\n                    hidden_dim=hidden_dim,\n                    output_channels=output_channels,\n                    kernel_size=k,\n                    stride=s,\n                    use_se=use_se,\n                    name=\"_ghostbottleneck_\" + str(idx)))\n            self.ghost_bottleneck_list.append(ghost_bottleneck)\n            idx += 1\n        # build last several layers\n        in_channels = 
output_channels\n        output_channels = int(self._make_divisible(exp_size * self.scale, 4))\n        self.conv_last = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=1,\n            groups=1,\n            act=\"relu\",\n            name=\"conv_last\")\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        in_channels = output_channels\n        self._fc0_output_channels = 1280\n        self.fc_0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=self._fc0_output_channels,\n            kernel_size=1,\n            stride=1,\n            act=\"relu\",\n            name=\"fc_0\")\n        self.dropout = nn.Dropout(p=0.2)\n        stdv = 1.0 / math.sqrt(self._fc0_output_channels * 1.0)\n        self.fc_1 = nn.Linear(\n            self._fc0_output_channels,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_1_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_1_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, inputs):\n        x = self.conv1(inputs)\n        for ghost_bottleneck in self.ghost_bottleneck_list:\n        
    x = ghost_bottleneck(x)\n        x = self.conv_last(x)\n        feature = self.pool2d_gap(x)\n        x = self.fc_0(feature)\n        x = self.dropout(x)\n        x = paddle.reshape(x, shape=[-1, self._fc0_output_channels])\n        x = self.fc_1(x)\n        return x, feature\n\n    def _make_divisible(self, v, divisor, min_value=None):\n        \"\"\"\n        This function is taken from the original tf repo.\n        It ensures that all layers have a channel number that is divisible by 8\n        It can be seen here:\n        https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py\n        \"\"\"\n        if min_value is None:\n            min_value = divisor\n        new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n        # Make sure that round down does not go down by more than 10%.\n        if new_v < 0.9 * v:\n            new_v += divisor\n        return new_v\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_0_imagenet/README.md",
    "content": "# ghostnet_x1_0_imagenet\n\n|模型名称|ghostnet_x1_0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|GhostNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|30MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - GhostNet是华为在2020年提出的全新轻量级网络结构，通过引入ghost模块，大大缓解了传统深度网络中特征的冗余计算问题，大大减少了网络参数和计算量。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install ghostnet_x1_0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run ghostnet_x1_0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='ghostnet_x1_0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ghostnet_x1_0_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"ghostnet_x1_0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的worker数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n
              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='ghostnet_x1_0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ghostnet_x1_0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ghostnet_x1_0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle import ParamAttr\nfrom paddle.nn.initializer import Uniform, KaimingNormal\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(initializer=KaimingNormal(), name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + \"_bn\"\n\n        self._batch_norm = nn.BatchNorm(\n            num_channels=out_channels,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            bias_attr=ParamAttr(name=bn_name + \"_offset\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            moving_mean_name=bn_name + \"_mean\",\n          
  moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, num_channels, reduction_ratio=4, name=None):\n        super(SEBlock, self).__init__()\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        self._num_channels = num_channels\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        med_ch = num_channels // reduction_ratio\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_channels,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs):\n        pool = self.pool2d_gap(inputs)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = paddle.clip(x=excitation, min=0, max=1)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = paddle.multiply(inputs, excitation)\n        return out\n\n\nclass GhostModule(nn.Layer):\n    def __init__(self, in_channels, output_channels, kernel_size=1, ratio=2, dw_size=3, stride=1, relu=True, name=None):\n        super(GhostModule, self).__init__()\n        init_channels = int(math.ceil(output_channels / ratio))\n        new_channels = int(init_channels * (ratio - 1))\n        self.primary_conv = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=init_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            
groups=1,\n            act=\"relu\" if relu else None,\n            name=name + \"_primary_conv\")\n        self.cheap_operation = ConvBNLayer(\n            in_channels=init_channels,\n            out_channels=new_channels,\n            kernel_size=dw_size,\n            stride=1,\n            groups=init_channels,\n            act=\"relu\" if relu else None,\n            name=name + \"_cheap_operation\")\n\n    def forward(self, inputs):\n        x = self.primary_conv(inputs)\n        y = self.cheap_operation(x)\n        out = paddle.concat([x, y], axis=1)\n        return out\n\n\nclass GhostBottleneck(nn.Layer):\n    def __init__(self, in_channels, hidden_dim, output_channels, kernel_size, stride, use_se, name=None):\n        super(GhostBottleneck, self).__init__()\n        self._stride = stride\n        self._use_se = use_se\n        self._num_channels = in_channels\n        self._output_channels = output_channels\n        self.ghost_module_1 = GhostModule(\n            in_channels=in_channels,\n            output_channels=hidden_dim,\n            kernel_size=1,\n            stride=1,\n            relu=True,\n            name=name + \"_ghost_module_1\")\n        if stride == 2:\n            self.depthwise_conv = ConvBNLayer(\n                in_channels=hidden_dim,\n                out_channels=hidden_dim,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=hidden_dim,\n                act=None,\n                name=name + \"_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n        if use_se:\n            self.se_block = SEBlock(num_channels=hidden_dim, name=name + \"_se\")\n        self.ghost_module_2 = GhostModule(\n            in_channels=hidden_dim,\n            output_channels=output_channels,\n            kernel_size=1,\n            relu=False,\n            name=name + \"_ghost_module_2\")\n        if stride != 1 or in_channels != output_channels:\n            
self.shortcut_depthwise = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=in_channels,\n                act=None,\n                name=name + \"_shortcut_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n            self.shortcut_conv = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=output_channels,\n                kernel_size=1,\n                stride=1,\n                groups=1,\n                act=None,\n                name=name + \"_shortcut_conv\")\n\n    def forward(self, inputs):\n        x = self.ghost_module_1(inputs)\n        if self._stride == 2:\n            x = self.depthwise_conv(x)\n        if self._use_se:\n            x = self.se_block(x)\n        x = self.ghost_module_2(x)\n        if self._stride == 1 and self._num_channels == self._output_channels:\n            shortcut = inputs\n        else:\n            shortcut = self.shortcut_depthwise(inputs)\n            shortcut = self.shortcut_conv(shortcut)\n        return paddle.add(x=x, y=shortcut)\n\n\n@moduleinfo(\n    name=\"ghostnet_x1_0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ghostnet_x1_0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass GhostNet(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(GhostNet, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n
            # read the default ImageNet labels; use a context manager so the file handle is closed\n            with open(label_file) as files:\n                for line in files:\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.cfgs = [\n            # k, t, c, SE, s\n            [3, 16, 16, 0, 1],\n            [3, 48, 24, 0, 2],\n            [3, 72, 24, 0, 1],\n            [5, 72, 40, 1, 2],\n            [5, 120, 40, 1, 1],\n            [3, 240, 80, 0, 2],\n            [3, 200, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 480, 112, 1, 1],\n            [3, 672, 112, 1, 1],\n            [5, 672, 160, 1, 2],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1]\n        ]\n        self.scale = 1.0\n        output_channels = int(self._make_divisible(16 * self.scale, 4))\n        self.conv1 = ConvBNLayer(\n            in_channels=3, out_channels=output_channels, kernel_size=3, stride=2, groups=1, act=\"relu\", name=\"conv1\")\n        # build inverted residual blocks\n        idx = 0\n        self.ghost_bottleneck_list = []\n        for k, exp_size, c, use_se, s in self.cfgs:\n            in_channels = output_channels\n            output_channels = int(self._make_divisible(c * self.scale, 4))\n            hidden_dim = int(self._make_divisible(exp_size * self.scale, 4))\n            ghost_bottleneck = self.add_sublayer(\n                name=\"_ghostbottleneck_\" + str(idx),\n                sublayer=GhostBottleneck(\n                    in_channels=in_channels,\n                    hidden_dim=hidden_dim,\n                    output_channels=output_channels,\n                    kernel_size=k,\n                    stride=s,\n                    use_se=use_se,\n                    name=\"_ghostbottleneck_\" + str(idx)))\n            self.ghost_bottleneck_list.append(ghost_bottleneck)\n            idx += 1\n        # build last several layers\n        in_channels =
output_channels\n        output_channels = int(self._make_divisible(exp_size * self.scale, 4))\n        self.conv_last = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=1,\n            groups=1,\n            act=\"relu\",\n            name=\"conv_last\")\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        in_channels = output_channels\n        self._fc0_output_channels = 1280\n        self.fc_0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=self._fc0_output_channels,\n            kernel_size=1,\n            stride=1,\n            act=\"relu\",\n            name=\"fc_0\")\n        self.dropout = nn.Dropout(p=0.2)\n        stdv = 1.0 / math.sqrt(self._fc0_output_channels * 1.0)\n        self.fc_1 = nn.Linear(\n            self._fc0_output_channels,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_1_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_1_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, inputs):\n        x = self.conv1(inputs)\n        for ghost_bottleneck in self.ghost_bottleneck_list:\n        
    x = ghost_bottleneck(x)\n        x = self.conv_last(x)\n        feature = self.pool2d_gap(x)\n        x = self.fc_0(feature)\n        x = self.dropout(x)\n        x = paddle.reshape(x, shape=[-1, self._fc0_output_channels])\n        x = self.fc_1(x)\n        return x, feature\n\n    def _make_divisible(self, v, divisor, min_value=None):\n        \"\"\"\n        This function is taken from the original tf repo.\n        It ensures that all layers have a channel number that is divisible by 8\n        It can be seen here:\n        https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py\n        \"\"\"\n        if min_value is None:\n            min_value = divisor\n        new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n        # Make sure that round down does not go down by more than 10%.\n        if new_v < 0.9 * v:\n            new_v += divisor\n        return new_v\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet/README.md",
    "content": "# ghostnet_x1_3_imagenet\n\n|模型名称|ghostnet_x1_3_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|GhostNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|43MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - GhostNet是华为在2020年提出的全新轻量级网络结构，通过引入ghost模块，大大缓解了传统深度网络中特征的冗余计算问题，大大减少了网络参数和计算量。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install ghostnet_x1_3_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run ghostnet_x1_3_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='ghostnet_x1_3_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ghostnet_x1_3_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"ghostnet_x1_3_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的worker数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n
              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='ghostnet_x1_3_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ghostnet_x1_3_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ghostnet_x1_3_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle import ParamAttr\nfrom paddle.nn.initializer import Uniform, KaimingNormal\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(initializer=KaimingNormal(), name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + \"_bn\"\n\n        self._batch_norm = nn.BatchNorm(\n            num_channels=out_channels,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            bias_attr=ParamAttr(name=bn_name + \"_offset\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            moving_mean_name=bn_name + \"_mean\",\n          
  moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, num_channels, reduction_ratio=4, name=None):\n        super(SEBlock, self).__init__()\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        self._num_channels = num_channels\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        med_ch = num_channels // reduction_ratio\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_channels,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs):\n        pool = self.pool2d_gap(inputs)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = paddle.clip(x=excitation, min=0, max=1)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = paddle.multiply(inputs, excitation)\n        return out\n\n\nclass GhostModule(nn.Layer):\n    def __init__(self, in_channels, output_channels, kernel_size=1, ratio=2, dw_size=3, stride=1, relu=True, name=None):\n        super(GhostModule, self).__init__()\n        init_channels = int(math.ceil(output_channels / ratio))\n        new_channels = int(init_channels * (ratio - 1))\n        self.primary_conv = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=init_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            
groups=1,\n            act=\"relu\" if relu else None,\n            name=name + \"_primary_conv\")\n        self.cheap_operation = ConvBNLayer(\n            in_channels=init_channels,\n            out_channels=new_channels,\n            kernel_size=dw_size,\n            stride=1,\n            groups=init_channels,\n            act=\"relu\" if relu else None,\n            name=name + \"_cheap_operation\")\n\n    def forward(self, inputs):\n        x = self.primary_conv(inputs)\n        y = self.cheap_operation(x)\n        out = paddle.concat([x, y], axis=1)\n        return out\n\n\nclass GhostBottleneck(nn.Layer):\n    def __init__(self, in_channels, hidden_dim, output_channels, kernel_size, stride, use_se, name=None):\n        super(GhostBottleneck, self).__init__()\n        self._stride = stride\n        self._use_se = use_se\n        self._num_channels = in_channels\n        self._output_channels = output_channels\n        self.ghost_module_1 = GhostModule(\n            in_channels=in_channels,\n            output_channels=hidden_dim,\n            kernel_size=1,\n            stride=1,\n            relu=True,\n            name=name + \"_ghost_module_1\")\n        if stride == 2:\n            self.depthwise_conv = ConvBNLayer(\n                in_channels=hidden_dim,\n                out_channels=hidden_dim,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=hidden_dim,\n                act=None,\n                name=name + \"_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n        if use_se:\n            self.se_block = SEBlock(num_channels=hidden_dim, name=name + \"_se\")\n        self.ghost_module_2 = GhostModule(\n            in_channels=hidden_dim,\n            output_channels=output_channels,\n            kernel_size=1,\n            relu=False,\n            name=name + \"_ghost_module_2\")\n        if stride != 1 or in_channels != output_channels:\n            
self.shortcut_depthwise = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=in_channels,\n                act=None,\n                name=name + \"_shortcut_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n            self.shortcut_conv = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=output_channels,\n                kernel_size=1,\n                stride=1,\n                groups=1,\n                act=None,\n                name=name + \"_shortcut_conv\")\n\n    def forward(self, inputs):\n        x = self.ghost_module_1(inputs)\n        if self._stride == 2:\n            x = self.depthwise_conv(x)\n        if self._use_se:\n            x = self.se_block(x)\n        x = self.ghost_module_2(x)\n        if self._stride == 1 and self._num_channels == self._output_channels:\n            shortcut = inputs\n        else:\n            shortcut = self.shortcut_depthwise(inputs)\n            shortcut = self.shortcut_conv(shortcut)\n        return paddle.add(x=x, y=shortcut)\n\n\n@moduleinfo(\n    name=\"ghostnet_x1_3_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ghostnet_x1_3_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass GhostNet(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(GhostNet, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in 
files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.cfgs = [\n            # k, t, c, SE, s\n            [3, 16, 16, 0, 1],\n            [3, 48, 24, 0, 2],\n            [3, 72, 24, 0, 1],\n            [5, 72, 40, 1, 2],\n            [5, 120, 40, 1, 1],\n            [3, 240, 80, 0, 2],\n            [3, 200, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 480, 112, 1, 1],\n            [3, 672, 112, 1, 1],\n            [5, 672, 160, 1, 2],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1]\n        ]\n        self.scale = 1.3\n        output_channels = int(self._make_divisible(16 * self.scale, 4))\n        self.conv1 = ConvBNLayer(\n            in_channels=3, out_channels=output_channels, kernel_size=3, stride=2, groups=1, act=\"relu\", name=\"conv1\")\n        # build inverted residual blocks\n        idx = 0\n        self.ghost_bottleneck_list = []\n        for k, exp_size, c, use_se, s in self.cfgs:\n            in_channels = output_channels\n            output_channels = int(self._make_divisible(c * self.scale, 4))\n            hidden_dim = int(self._make_divisible(exp_size * self.scale, 4))\n            ghost_bottleneck = self.add_sublayer(\n                name=\"_ghostbottleneck_\" + str(idx),\n                sublayer=GhostBottleneck(\n                    in_channels=in_channels,\n                    hidden_dim=hidden_dim,\n                    output_channels=output_channels,\n                    kernel_size=k,\n                    stride=s,\n                    use_se=use_se,\n                    name=\"_ghostbottleneck_\" + str(idx)))\n            self.ghost_bottleneck_list.append(ghost_bottleneck)\n            idx += 1\n        # build last several layers\n        in_channels = 
output_channels\n        output_channels = int(self._make_divisible(exp_size * self.scale, 4))\n        self.conv_last = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=1,\n            groups=1,\n            act=\"relu\",\n            name=\"conv_last\")\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        in_channels = output_channels\n        self._fc0_output_channels = 1280\n        self.fc_0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=self._fc0_output_channels,\n            kernel_size=1,\n            stride=1,\n            act=\"relu\",\n            name=\"fc_0\")\n        self.dropout = nn.Dropout(p=0.2)\n        stdv = 1.0 / math.sqrt(self._fc0_output_channels * 1.0)\n        self.fc_1 = nn.Linear(\n            self._fc0_output_channels,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_1_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_1_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, inputs):\n        x = self.conv1(inputs)\n        for ghost_bottleneck in self.ghost_bottleneck_list:\n        
    x = ghost_bottleneck(x)\n        x = self.conv_last(x)\n        feature = self.pool2d_gap(x)\n        x = self.fc_0(feature)\n        x = self.dropout(x)\n        x = paddle.reshape(x, shape=[-1, self._fc0_output_channels])\n        x = self.fc_1(x)\n        return x, feature\n\n    def _make_divisible(self, v, divisor, min_value=None):\n        \"\"\"\n        This function is taken from the original tf repo.\n        It ensures that all layers have a channel number that is divisible by 8\n        It can be seen here:\n        https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py\n        \"\"\"\n        if min_value is None:\n            min_value = divisor\n        new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n        # Make sure that round down does not go down by more than 10%.\n        if new_v < 0.9 * v:\n            new_v += divisor\n        return new_v\n"
  },
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet_ssld/README.md",
"content": "# ghostnet_x1_3_imagenet_ssld\n\n|模型名称|ghostnet_x1_3_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|GhostNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|43MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - GhostNet是华为在2020年提出的全新轻量级网络结构，通过引入ghost模块，缓解了传统深度网络中特征的冗余计算问题，大大减少了网络参数和计算量。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install ghostnet_x1_3_imagenet_ssld\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run ghostnet_x1_3_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='ghostnet_x1_3_imagenet_ssld')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ghostnet_x1_3_imagenet_ssld对[Flowers](../../../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n
                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"ghostnet_x1_3_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数：\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: workers的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数；\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n
            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='ghostnet_x1_3_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ghostnet_x1_3_imagenet_ssld\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，执行以下几行代码即可发送预测请求并获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ghostnet_x1_3_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
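The serving example in the README above base64-encodes the JPEG bytes before packing them into a JSON body. A stdlib-only sketch of that payload round-trip, where `raw` is a stand-in for the encoded-image bytes that `cv2.imencode` would return (no cv2 or requests needed to see the format; names other than the payload fields are illustrative):

```python
import base64
import json

def bytes_to_base64(data: bytes) -> str:
    # Same role as cv2_to_base64 in the README, applied to raw JPEG bytes.
    return base64.b64encode(data).decode('utf8')

def base64_to_bytes(b64str: str) -> bytes:
    # Inverse step, as base64_to_cv2 does before calling cv2.imdecode.
    return base64.b64decode(b64str.encode('utf8'))

raw = b'\xff\xd8\xff\xe0 stand-in for cv2.imencode output'
payload = json.dumps({'images': [bytes_to_base64(raw)], 'top_k': 2})

# The server decodes the field back to the original bytes before imdecode.
decoded = base64_to_bytes(json.loads(payload)['images'][0])
assert decoded == raw
```

Base64 is needed because JSON cannot carry raw binary; the round-trip above is exactly what the README's `cv2_to_base64`/`base64_to_cv2` pair does around the HTTP call.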
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet_ssld/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
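`module.py` in this directory reads the file above line by line to build its default `self.labels` when no `label_list` is passed in. A minimal sketch of that loop, using an inline sample instead of the real `label_list.txt` path:

```python
def load_label_list(text: str) -> list:
    # Mirrors the loop in module.py: one label per line, newline stripped.
    labels = []
    for line in text.splitlines():
        line = line.strip('\n')
        if line:
            labels.append(line)
    return labels

sample = "tench\ngoldfish\ngreat white shark\n"
assert load_label_list(sample) == ["tench", "goldfish", "great white shark"]
```

The resulting list length (1000 for the full file) becomes `class_dim`, i.e. the output size of the classifier's final linear layer.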
  {
    "path": "modules/image/classification/ghostnet_x1_3_imagenet_ssld/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle import ParamAttr\nfrom paddle.nn.initializer import Uniform, KaimingNormal\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(initializer=KaimingNormal(), name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + \"_bn\"\n\n        self._batch_norm = nn.BatchNorm(\n            num_channels=out_channels,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            bias_attr=ParamAttr(name=bn_name + \"_offset\", regularizer=paddle.regularizer.L2Decay(0.0)),\n            moving_mean_name=bn_name + \"_mean\",\n          
  moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SEBlock(nn.Layer):\n    def __init__(self, num_channels, reduction_ratio=4, name=None):\n        super(SEBlock, self).__init__()\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        self._num_channels = num_channels\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        med_ch = num_channels // reduction_ratio\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_channels,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs):\n        pool = self.pool2d_gap(inputs)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = paddle.clip(x=excitation, min=0, max=1)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = paddle.multiply(inputs, excitation)\n        return out\n\n\nclass GhostModule(nn.Layer):\n    def __init__(self, in_channels, output_channels, kernel_size=1, ratio=2, dw_size=3, stride=1, relu=True, name=None):\n        super(GhostModule, self).__init__()\n        init_channels = int(math.ceil(output_channels / ratio))\n        new_channels = int(init_channels * (ratio - 1))\n        self.primary_conv = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=init_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            
groups=1,\n            act=\"relu\" if relu else None,\n            name=name + \"_primary_conv\")\n        self.cheap_operation = ConvBNLayer(\n            in_channels=init_channels,\n            out_channels=new_channels,\n            kernel_size=dw_size,\n            stride=1,\n            groups=init_channels,\n            act=\"relu\" if relu else None,\n            name=name + \"_cheap_operation\")\n\n    def forward(self, inputs):\n        x = self.primary_conv(inputs)\n        y = self.cheap_operation(x)\n        out = paddle.concat([x, y], axis=1)\n        return out\n\n\nclass GhostBottleneck(nn.Layer):\n    def __init__(self, in_channels, hidden_dim, output_channels, kernel_size, stride, use_se, name=None):\n        super(GhostBottleneck, self).__init__()\n        self._stride = stride\n        self._use_se = use_se\n        self._num_channels = in_channels\n        self._output_channels = output_channels\n        self.ghost_module_1 = GhostModule(\n            in_channels=in_channels,\n            output_channels=hidden_dim,\n            kernel_size=1,\n            stride=1,\n            relu=True,\n            name=name + \"_ghost_module_1\")\n        if stride == 2:\n            self.depthwise_conv = ConvBNLayer(\n                in_channels=hidden_dim,\n                out_channels=hidden_dim,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=hidden_dim,\n                act=None,\n                name=name + \"_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n        if use_se:\n            self.se_block = SEBlock(num_channels=hidden_dim, name=name + \"_se\")\n        self.ghost_module_2 = GhostModule(\n            in_channels=hidden_dim,\n            output_channels=output_channels,\n            kernel_size=1,\n            relu=False,\n            name=name + \"_ghost_module_2\")\n        if stride != 1 or in_channels != output_channels:\n            
self.shortcut_depthwise = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                kernel_size=kernel_size,\n                stride=stride,\n                groups=in_channels,\n                act=None,\n                name=name + \"_shortcut_depthwise_depthwise\"  # looks strange due to an old typo, will be fixed later.\n            )\n            self.shortcut_conv = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=output_channels,\n                kernel_size=1,\n                stride=1,\n                groups=1,\n                act=None,\n                name=name + \"_shortcut_conv\")\n\n    def forward(self, inputs):\n        x = self.ghost_module_1(inputs)\n        if self._stride == 2:\n            x = self.depthwise_conv(x)\n        if self._use_se:\n            x = self.se_block(x)\n        x = self.ghost_module_2(x)\n        if self._stride == 1 and self._num_channels == self._output_channels:\n            shortcut = inputs\n        else:\n            shortcut = self.shortcut_depthwise(inputs)\n            shortcut = self.shortcut_conv(shortcut)\n        return paddle.add(x=x, y=shortcut)\n\n\n@moduleinfo(\n    name=\"ghostnet_x1_3_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ghostnet_x1_3_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass GhostNet(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(GhostNet, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in 
files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.cfgs = [\n            # k, t, c, SE, s\n            [3, 16, 16, 0, 1],\n            [3, 48, 24, 0, 2],\n            [3, 72, 24, 0, 1],\n            [5, 72, 40, 1, 2],\n            [5, 120, 40, 1, 1],\n            [3, 240, 80, 0, 2],\n            [3, 200, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 184, 80, 0, 1],\n            [3, 480, 112, 1, 1],\n            [3, 672, 112, 1, 1],\n            [5, 672, 160, 1, 2],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1],\n            [5, 960, 160, 0, 1],\n            [5, 960, 160, 1, 1]\n        ]\n        self.scale = 1.3\n        output_channels = int(self._make_divisible(16 * self.scale, 4))\n        self.conv1 = ConvBNLayer(\n            in_channels=3, out_channels=output_channels, kernel_size=3, stride=2, groups=1, act=\"relu\", name=\"conv1\")\n        # build inverted residual blocks\n        idx = 0\n        self.ghost_bottleneck_list = []\n        for k, exp_size, c, use_se, s in self.cfgs:\n            in_channels = output_channels\n            output_channels = int(self._make_divisible(c * self.scale, 4))\n            hidden_dim = int(self._make_divisible(exp_size * self.scale, 4))\n            ghost_bottleneck = self.add_sublayer(\n                name=\"_ghostbottleneck_\" + str(idx),\n                sublayer=GhostBottleneck(\n                    in_channels=in_channels,\n                    hidden_dim=hidden_dim,\n                    output_channels=output_channels,\n                    kernel_size=k,\n                    stride=s,\n                    use_se=use_se,\n                    name=\"_ghostbottleneck_\" + str(idx)))\n            self.ghost_bottleneck_list.append(ghost_bottleneck)\n            idx += 1\n        # build last several layers\n        in_channels = 
output_channels\n        output_channels = int(self._make_divisible(exp_size * self.scale, 4))\n        self.conv_last = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=1,\n            groups=1,\n            act=\"relu\",\n            name=\"conv_last\")\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n        in_channels = output_channels\n        self._fc0_output_channels = 1280\n        self.fc_0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=self._fc0_output_channels,\n            kernel_size=1,\n            stride=1,\n            act=\"relu\",\n            name=\"fc_0\")\n        self.dropout = nn.Dropout(p=0.2)\n        stdv = 1.0 / math.sqrt(self._fc0_output_channels * 1.0)\n        self.fc_1 = nn.Linear(\n            self._fc0_output_channels,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_1_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_1_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, inputs):\n        x = self.conv1(inputs)\n        for ghost_bottleneck in self.ghost_bottleneck_list:\n        
    x = ghost_bottleneck(x)\n        x = self.conv_last(x)\n        feature = self.pool2d_gap(x)\n        x = self.fc_0(feature)\n        x = self.dropout(x)\n        x = paddle.reshape(x, shape=[-1, self._fc0_output_channels])\n        x = self.fc_1(x)\n        return x, feature\n\n    def _make_divisible(self, v, divisor, min_value=None):\n        \"\"\"\n        This function is taken from the original tf repo.\n        It ensures that all layers have a channel number that is divisible by 8\n        It can be seen here:\n        https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py\n        \"\"\"\n        if min_value is None:\n            min_value = divisor\n        new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n        # Make sure that round down does not go down by more than 10%.\n        if new_v < 0.9 * v:\n            new_v += divisor\n        return new_v\n"
  },
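Every channel count in the GhostNet above passes through `_make_divisible` with `scale=1.3` and divisor 4. Extracted as a standalone sketch, the helper rounds to the nearest multiple of the divisor while refusing to drop more than 10% below the requested value:

```python
def make_divisible(v, divisor, min_value=None):
    # Round v to the nearest multiple of divisor, never below min_value,
    # and never more than 10% below the original value.
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

scale = 1.3
# First conv: 16 channels * 1.3 = 20.8, rounded to a multiple of 4.
assert make_divisible(16 * scale, 4) == 20
# A cfgs entry: c=24 scaled to 31.2 rounds up to 32.
assert make_divisible(24 * scale, 4) == 32
# Nearest-multiple rounding would undershoot 11 by more than 10%, so bump up.
assert make_divisible(11, 8) == 16
```

Keeping channel counts divisible by a small power of two is a common hardware-friendliness trick; the 10% guard prevents the rounding from shrinking a layer too aggressively.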
  {
    "path": "modules/image/classification/googlenet_imagenet/README.md",
"content": "# googlenet_imagenet\n\n|模型名称|googlenet_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|GoogleNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|28MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - GoogleNet是图像分类中的经典模型。由Christian Szegedy等人在2014年提出，并获得了2014年ILSVRC竞赛冠军。该PaddleHub Module结构为GoogleNet，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install googlenet_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run googlenet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"googlenet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为\"image\"，value为待检测图片路径组成的list，每个元素为str类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install googlenet_imagenet==1.0.0\n    ```\n"
  },
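The `classification` API documented above returns one `{label: probability}` dict per input image. A sketch of how a raw score vector maps onto that format, assuming hypothetical logits and a plain softmax (the actual scoring happens inside the module; names here are illustrative):

```python
import math

def scores_to_result(logits, labels, top_k=1):
    # Softmax over the logits, then keep the top_k {label: probability}
    # pairs, matching the documented result format of classification().
    peak = max(logits)
    exps = [math.exp(v - peak) for v in logits]  # shift for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(labels, probs), key=lambda p: p[1], reverse=True)
    return {label: prob for label, prob in ranked[:top_k]}

labels = ['tabby', 'tiger cat', 'Persian cat']
result = scores_to_result([2.0, 1.0, 0.1], labels, top_k=2)
assert list(result) == ['tabby', 'tiger cat']
```

Calling the real module with a batch of images yields a list of such dicts, one per input path in `data["image"]`.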
  {
    "path": "modules/image/classification/googlenet_imagenet/README_en.md",
"content": "# googlenet_imagenet\n\n|Module Name|googlenet_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|GoogleNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|28MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - GoogleNet was proposed by Christian Szegedy et al. in 2014 and won the ILSVRC 2014 competition. This module is based on GoogleNet, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install googlenet_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run googlenet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"googlenet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install googlenet_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/googlenet_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef xavier(channels: int, filter_size: int, name: str):\n    \"\"\"Initialize the weights by uniform distribution.\"\"\"\n    stdv = (3.0 / (filter_size**2 * channels))**0.5\n    param_attr = ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_weights\")\n    return param_attr\n\n\nclass ConvLayer(nn.Layer):\n    \"\"\"Basic conv2d layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 name: str = None):\n        super(ConvLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            
bias_attr=False)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        return y\n\n\nclass Inception(nn.Layer):\n    \"\"\"Inception block.\"\"\"\n\n    def __init__(self,\n                 input_channels: int,\n                 output_channels: int,\n                 filter1: int,\n                 filter3R: int,\n                 filter3: int,\n                 filter5R: int,\n                 filter5: int,\n                 proj: int,\n                 name: str = None):\n        super(Inception, self).__init__()\n\n        self._conv1 = ConvLayer(input_channels, filter1, 1, name=\"inception_\" + name + \"_1x1\")\n        self._conv3r = ConvLayer(input_channels, filter3R, 1, name=\"inception_\" + name + \"_3x3_reduce\")\n        self._conv3 = ConvLayer(filter3R, filter3, 3, name=\"inception_\" + name + \"_3x3\")\n        self._conv5r = ConvLayer(input_channels, filter5R, 1, name=\"inception_\" + name + \"_5x5_reduce\")\n        self._conv5 = ConvLayer(filter5R, filter5, 5, name=\"inception_\" + name + \"_5x5\")\n        self._pool = MaxPool2d(kernel_size=3, stride=1, padding=1)\n\n        self._convprj = ConvLayer(input_channels, proj, 1, name=\"inception_\" + name + \"_3x3_proj\")\n\n    def forward(self, inputs: paddle.Tensor):\n        conv1 = self._conv1(inputs)\n\n        conv3r = self._conv3r(inputs)\n        conv3 = self._conv3(conv3r)\n\n        conv5r = self._conv5r(inputs)\n        conv5 = self._conv5(conv5r)\n\n        pool = self._pool(inputs)\n        convprj = self._convprj(pool)\n\n        cat = paddle.concat([conv1, conv3, conv5, convprj], axis=1)\n        cat = F.relu(cat)\n        return cat\n\n\n@moduleinfo(\n    name=\"googlenet_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GoogleNet_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass 
GoogleNet(nn.Layer):\n    \"\"\"GoogleNet model\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(GoogleNet, self).__init__()\n        self._conv = ConvLayer(3, 64, 7, 2, name=\"conv1\")\n        self._pool = MaxPool2d(kernel_size=3, stride=2)\n        self._conv_1 = ConvLayer(64, 64, 1, name=\"conv2_1x1\")\n        self._conv_2 = ConvLayer(64, 192, 3, name=\"conv2_3x3\")\n\n        self._ince3a = Inception(192, 192, 64, 96, 128, 16, 32, 32, name=\"ince3a\")\n        self._ince3b = Inception(256, 256, 128, 128, 192, 32, 96, 64, name=\"ince3b\")\n\n        self._ince4a = Inception(480, 480, 192, 96, 208, 16, 48, 64, name=\"ince4a\")\n        self._ince4b = Inception(512, 512, 160, 112, 224, 24, 64, 64, name=\"ince4b\")\n        self._ince4c = Inception(512, 512, 128, 128, 256, 24, 64, 64, name=\"ince4c\")\n        self._ince4d = Inception(512, 512, 112, 144, 288, 32, 64, 64, name=\"ince4d\")\n        self._ince4e = Inception(528, 528, 256, 160, 320, 32, 128, 128, name=\"ince4e\")\n\n        self._ince5a = Inception(832, 832, 256, 160, 320, 32, 128, 128, name=\"ince5a\")\n        self._ince5b = Inception(832, 832, 384, 192, 384, 48, 128, 128, name=\"ince5b\")\n\n        self._pool_5 = AvgPool2d(kernel_size=7, stride=7)\n\n        self._drop = Dropout(p=0.4, mode=\"downscale_in_infer\")\n        self._fc_out = Linear(\n            1024, class_dim, weight_attr=xavier(1024, 1, \"out\"), bias_attr=ParamAttr(name=\"out_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'googlenet_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/googlenet_imagenet.pdparams -O' +\n      
              checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv(inputs)\n        x = self._pool(x)\n        x = self._conv_1(x)\n        x = self._conv_2(x)\n        x = self._pool(x)\n\n        x = self._ince3a(x)\n        x = self._ince3b(x)\n        x = self._pool(x)\n\n        ince4a = self._ince4a(x)\n        x = self._ince4b(ince4a)\n        x = self._ince4c(x)\n        ince4d = self._ince4d(x)\n        x = self._ince4e(ince4d)\n        x = self._pool(x)\n\n        x = self._ince5a(x)\n        ince5b = self._ince5b(x)\n\n        x = self._pool_5(ince5b)\n        x = self._drop(x)\n        x = paddle.squeeze(x, axis=[2, 3])\n        out = self._fc_out(x)\n        out = F.softmax(out)\n\n        return out\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet/README.md",
    "content": "# hrnet18_imagenet\n\n|Module Name|hrnet18_imagenet|\n| :--- | :---: |\n|Category|Image - Image Classification|\n|Network|HRNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported|Yes|\n|Module Size|124MB|\n|Metrics|-|\n|Latest update date|2021-09-14|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - HRNet is a neural network proposed by Microsoft Research Asia in 2019. Unlike earlier convolutional networks, it maintains high-resolution representations through the deep layers of the network, so the predicted keypoint heatmaps are more accurate and spatially more precise. It also performs particularly well on other resolution-sensitive vision tasks, such as detection and segmentation.\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2. Installation\n    - ```shell\n      $ hub install hrnet18_imagenet\n      ```\n\n    - If you encounter problems during installation, see: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## III. Module API Prediction\n\n- ### 1. Command-line Prediction\n\n    ```shell\n    $ hub run hrnet18_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2. Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet18_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3. How to Start Fine-tuning\n\n    - After installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune hrnet18_imagenet on datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers).\n\n    - Steps\n\n        - Step1: Define the data preprocessing\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - The `transforms` module provides a rich set of data preprocessing operations; replace them as your task requires.\n\n        - Step2: Download and use the dataset\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: the data preprocessing pipeline.\n                * `mode`: the dataset split; one of `train`, `test`, `val`. Defaults to `train`.\n\n                * For dataset preparation, see [flowers.py](../../paddlehub/datasets/flowers.py). `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n\n        - Step3: Load the pretrained model\n\n            - ```python\n              model = hub.Module(name=\"hrnet18_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: the name of the pretrained model.\n                * `label_list`: the output classification labels. Defaults to the ImageNet-2012 labels.\n\n        - Step4: Choose the optimization strategy and run configuration\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Run configuration\n\n            - `Trainer` controls the fine-tuning process, with the following configurable parameters:\n\n                * `model`: the model to be optimized;\n                * `optimizer`: the optimizer;\n                * `use_vdl`: whether to use VisualDL to visualize training;\n                * `checkpoint_dir`: the directory for saving model parameters;\n                * `compare_metrics`: the metric used to select the best model;\n\n            - `trainer.train` controls the training loop, with the following configurable parameters:\n\n                * `train_dataset`: the training dataset;\n                * `epochs`: the number of training epochs;\n                * `batch_size`: the training batch size; when using a GPU, adjust it to fit the available memory;\n                * `num_workers`: the number of data-loading workers, 0 by default;\n                * `eval_dataset`: the validation dataset;\n                * `log_interval`: the logging interval, in training steps.\n                * `save_interval`: the checkpoint-saving interval, in epochs.\n\n    - Model prediction\n\n        -   After fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning. Use that model for prediction with a predict.py script like the following:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet18_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** The module, checkpoint_dir, and dataset used for prediction must match those used for fine-tuning.\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the start command:\n\n    - ```shell\n      $ hub serving start -m hrnet18_imagenet\n      ```\n\n    - This deploys the classification service API, listening on port 8866 by default.\n\n    - **NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a Prediction Request\n\n    - With the server configured, a few lines of code send a prediction request and retrieve the result\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # Send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet18_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Release Note\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet18_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet18_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet18(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet18, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 18\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet_ssld/README.md",
    "content": "# hrnet18_imagenet_ssld\n\n|模型名称|hrnet18_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|124MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install hrnet18_imagenet_ssld\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run hrnet18_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet18_imagenet_ssld')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hrnet18_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n         
     flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"hrnet18_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: worker的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              
import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet18_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m hrnet18_imagenet_ssld\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet18_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet_ssld/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet18_imagenet_ssld/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            # keep the residual branch at num_filters channels so it can be added to conv2's output\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet18_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet18_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet18(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet18, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 18\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 
120], [60, 120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n             
           num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet30_imagenet/README.md",
    "content": "# hrnet30_imagenet\n\n|Module Name|hrnet30_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|HRNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|218MB|\n|Metrics|-|\n|Latest update date|2021-09-14|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - HRNet is a neural network proposed by Microsoft Research Asia in 2019. Unlike earlier convolutional networks, it keeps a high-resolution representation through the deep layers of the network, so the predicted keypoint heatmaps are more accurate and more precisely localized in space. It also performs particularly well on other resolution-sensitive vision tasks, such as detection and segmentation.\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2. Installation\n    - ```shell\n      $ hub install hrnet30_imagenet\n      ```\n\n    - If you encounter problems during installation, please refer to: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command-line Prediction\n\n    ```shell\n    $ hub run hrnet30_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2. Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet30_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3. How to Start Fine-tuning\n\n    - After installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune hrnet30_imagenet on datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers).\n\n    - Steps\n\n        - Step 1: Define the data preprocessing pipeline\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - The `transforms` module defines a rich set of data preprocessing operations; replace them as needed for your own data.\n\n        - Step 2: Download and load the dataset\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: the data preprocessing pipeline.\n                * `mode`: the dataset split; one of `train`, `test`, `val`. Defaults to `train`.\n\n                * See [flowers.py](../../paddlehub/datasets/flowers.py) for the dataset preparation code. `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset` in the user directory.\n\n\n        - Step 3: Load the pretrained model\n\n            - ```python\n              model = hub.Module(name=\"hrnet30_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: name of the pretrained model.\n                * `label_list`: the output classification labels; defaults to the ImageNet-2012 categories.\n\n        - Step 4: Choose an optimization strategy and runtime configuration\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Runtime configuration\n\n            - `Trainer` controls fine-tuning and takes the following configurable parameters:\n\n                * `model`: the model to optimize;\n                * `optimizer`: the optimizer;\n                * `use_vdl`: whether to visualize training with VisualDL;\n                * `checkpoint_dir`: directory where model parameters are saved;\n                * `compare_metrics`: the metric used to select the best model;\n\n            - `trainer.train` controls the training loop and takes the following configurable parameters:\n\n                * `train_dataset`: the training dataset;\n                * `epochs`: number of training epochs;\n                * `batch_size`: training batch size; when training on GPU, adjust it to your available memory;\n                * `num_workers`: number of data-loading workers, defaults to 0;\n                * `eval_dataset`: the validation dataset;\n                * `log_interval`: logging interval, measured in training steps.\n                * `save_interval`: checkpoint saving interval, measured in training epochs.\n\n    - Model prediction\n\n        - When fine-tuning completes, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning. Use that model for prediction. A predict.py script looks like this:\n\n            - ```python\n              import paddle\n              
import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet30_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** For prediction, the module, checkpoint_dir, and dataset must be the same as those used for fine-tuning.\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the start command:\n\n    - ```shell\n      $ hub serving start -m hrnet30_imagenet\n      ```\n\n    - This deploys the classification API service, listening on port 8866 by default.\n\n    - **NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With the server running, the few lines of code below send a prediction request and fetch the result\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images': [cv2_to_base64(org_im)], 'top_k': 2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet30_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Release Note\n\n* 1.0.0\n\n  Initial release\n"
  },
  {
    "path": "modules/image/classification/hrnet30_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet30_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet30_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet30_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet30(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet30, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 30\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet32_imagenet/README.md",
    "content": "# hrnet32_imagenet\n\n|模型名称|hrnet32_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|238MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install hrnet32_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run hrnet32_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet32_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hrnet32_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"hrnet32_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              
import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet32_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m hrnet32_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet32_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/hrnet32_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet32_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet32_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet32_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet32(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet32, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 32\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet40_imagenet/README.md",
    "content": "# hrnet40_imagenet\n\n|模型名称|hrnet40_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|333MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install hrnet40_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run hrnet40_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet40_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hrnet40_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"hrnet40_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: workers的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              
import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet40_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m hrnet40_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet40_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/hrnet40_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet40_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet40_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet40_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet40(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet40, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 40\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet44_imagenet/README.md",
    "content": "# hrnet44_imagenet\n\n|模型名称|hrnet44_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|388MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install hrnet44_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run hrnet44_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet44_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hrnet44_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"hrnet44_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              
import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet44_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m hrnet44_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet44_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/hrnet44_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet44_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet44_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet44_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet44(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet44, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 44\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet/README.md",
    "content": "# hrnet48_imagenet\n\n|模型名称|hrnet48_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|448MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install hrnet48_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run hrnet48_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet48_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hrnet48_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"hrnet48_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: worker的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n
              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet48_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m hrnet48_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet48_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            # 1x1 projection (no activation) so the residual matches conv2's channel count\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet48_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet48_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet48(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet48, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 48\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet_ssld/README.md",
    "content": "# hrnet48_imagenet_ssld\n\n|Module Name|hrnet48_imagenet_ssld|\n| :--- | :---: |\n|Category|Image - Image Classification|\n|Network|HRNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning Supported|Yes|\n|Module Size|446MB|\n|Metrics|-|\n|Latest Update Date|2021-09-14|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - HRNet is a neural network proposed by Microsoft Research Asia in 2019. Unlike earlier convolutional networks, it maintains high-resolution representations through the deep layers of the network, so predicted keypoint heatmaps are more accurate and spatially precise. The network also performs particularly well on other resolution-sensitive vision tasks, such as detection and segmentation.\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2. Installation\n    - ```shell\n      $ hub install hrnet48_imagenet_ssld\n      ```\n\n    - If you encounter problems during installation, please refer to: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command-line Prediction\n\n    ```shell\n    $ hub run hrnet48_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2. Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet48_imagenet_ssld')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3. How to Fine-tune\n\n    - After installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune hrnet48_imagenet_ssld on datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers).\n\n    - Steps\n\n        - Step 1: Define the data preprocessing pipeline\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                      T.CenterCrop(224),\n                                      T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                     to_rgb=True)\n              ```\n\n                - The `transforms` module provides a rich set of preprocessing operations; replace them with whatever your task requires.\n\n        - Step 2: Download and load the dataset\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: the data preprocessing pipeline.\n                * `mode`: the dataset split, one of `train`, `test`, `val`; defaults to `train`.\n\n                * See [flowers.py](../../paddlehub/datasets/flowers.py) for how the dataset is prepared. `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n\n        - Step 3: Load the pretrained model\n\n            - ```python\n              model = hub.Module(name=\"hrnet48_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: the name of the pretrained model.\n                * `label_list`: the output class labels; defaults to the ImageNet-2012 labels.\n\n        - Step 4: Choose the optimization strategy and run configuration\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Run configuration\n\n            - `Trainer` controls the fine-tuning process with the following configurable parameters:\n\n                * `model`: the model to optimize;\n                * `optimizer`: the optimizer;\n                * `use_vdl`: whether to visualize training with VisualDL;\n                * `checkpoint_dir`: the directory for saving model parameters;\n                * `compare_metrics`: the metric used to select the best model;\n\n            - `trainer.train` controls the training loop with the following configurable parameters:\n\n                * `train_dataset`: the dataset used for training;\n                * `epochs`: the number of training epochs;\n                * `batch_size`: the training batch size; when using a GPU, adjust it to the available memory;\n                * `num_workers`: the number of data-loading workers, 0 by default;\n                * `eval_dataset`: the validation dataset;\n                * `log_interval`: the logging interval, in training steps.\n                * `save_interval`: the checkpoint-saving interval, in epochs.\n\n    - Model prediction\n\n        - When fine-tuning completes, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning. Use it for prediction with a predict.py script such as:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet48_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** at prediction time, the module, checkpoint_dir, and dataset must match those used for fine-tuning.\n\n## IV. Serving Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the start command:\n\n    - ```shell\n      $ hub serving start -m hrnet48_imagenet_ssld\n      ```\n\n    - This deploys the classification service API, listening on port 8866 by default.\n\n    - **NOTE:** to predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With the server running, the few lines of code below send a prediction request and retrieve the result\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet48_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Update History\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet_ssld/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet48_imagenet_ssld/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet48_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet48_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet48(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet48, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 48\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 
120], [60, 120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n             
           num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/hrnet64_imagenet/README.md",
    "content": "# hrnet64_imagenet\n\n|Module Name|hrnet64_imagenet|\n| :--- | :---: |\n|Category|Image - Image Classification|\n|Network|HRNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported|Yes|\n|Module Size|740MB|\n|Metrics|-|\n|Latest update date|2021-09-14|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - HRNet is a neural network proposed by Microsoft Research Asia in 2019. Unlike earlier convolutional networks, it maintains high-resolution representations through the deep layers of the network, so the predicted keypoint heatmaps are more accurate and spatially more precise. The network also performs particularly well on other resolution-sensitive vision tasks, such as detection and segmentation.\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2. Installation\n    - ```shell\n      $ hub install hrnet64_imagenet\n      ```\n\n    -  If you encounter problems during installation, see: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## III. Module API and Prediction\n\n- ### 1. Command-line Prediction\n\n    ```shell\n    $ hub run hrnet64_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2. Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='hrnet64_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3. Fine-tuning\n\n    - After installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune hrnet64_imagenet on datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers).\n\n    - Steps\n\n        - Step 1: Define the data preprocessing pipeline\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                      T.CenterCrop(224),\n                                      T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                     to_rgb=True)\n              ```\n\n                - The `transforms` module offers a rich set of data preprocessing operations; replace them as your task requires.\n\n        - Step 2: Download and load the dataset\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: the data preprocessing pipeline.\n                * `mode`: the dataset split; one of `train`, `test`, `val`. Defaults to `train`.\n\n                * See [flowers.py](../../paddlehub/datasets/flowers.py) for how the dataset is prepared. `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n\n        - Step 3: Load the pretrained model\n\n            - ```python\n              model = hub.Module(name=\"hrnet64_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: the name of the pretrained model.\n                * `label_list`: the output classification labels; defaults to the ImageNet-2012 categories.\n\n        - Step 4: Choose the optimization strategy and runtime configuration\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Runtime configuration\n\n            - `Trainer` controls the fine-tuning process with the following configurable parameters:\n\n                * `model`: the model to be optimized;\n                * `optimizer`: the optimizer;\n                * `use_vdl`: whether to visualize the training process with VisualDL;\n                * `checkpoint_dir`: the directory in which model parameters are saved;\n                * `compare_metrics`: the metric used to select the best model;\n\n            - `trainer.train` controls the training loop with the following configurable parameters:\n\n                * `train_dataset`: the dataset used for training;\n                * `epochs`: the number of training epochs;\n                * `batch_size`: the training batch size; when training on GPU, adjust it to the available memory;\n                * `num_workers`: the number of data-loading workers, 0 by default;\n                * `eval_dataset`: the validation dataset;\n                * `log_interval`: the logging interval, in training steps.\n                * `save_interval`: the checkpoint-saving interval, in epochs.\n\n    - Model prediction\n\n        -   After fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning. Use that model for prediction with a predict.py script such as:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='hrnet64_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** For prediction, the module, checkpoint_dir, and dataset must be the same as those used for fine-tuning.\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the start command:\n\n    - ```shell\n      $ hub serving start -m hrnet64_imagenet\n      ```\n\n    - This deploys the classification service API, listening on port 8866 by default.\n\n    - **NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With the server running, the few lines below send a prediction request and retrieve the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # Send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/hrnet64_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Release Note\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/classification/hrnet64_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/hrnet64_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"hrnet64_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"hrnet64_imagenet is a classification model, \"\n    \"this module is trained on the ImageNet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass HRNet64(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(HRNet64, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # Use a context manager so the label file handle is closed after reading.\n            with open(label_file) as files:\n                for line in files:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 64\n        self.has_se = False\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: [[60, 120], [60, 
120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n                        
num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 1], 
self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/inception_v4_imagenet/README.md",
    "content": "# inception_v4_imagenet\n\n|模型名称|inception_v4_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Inception_V4|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|167MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Inception 结构最初由 GoogLeNet 引入，因此 GoogLeNet 也被称为 Inception-v1，通过在 Inception-v1 的基础上引入Batch Normalization、分解、残差连接等技术，设计出了Inception-v4。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install inception_v4_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run inception_v4_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"inception_v4_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install inception_v4_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/inception_v4_imagenet/README_en.md",
    "content": "# inception_v4_imagenet\n\n|Module Name|inception_v4_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Inception_V4|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|167MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n  - The Inception structure was first introduced in GoogLeNet, so GoogLeNet is also known as Inception-v1. Inception-v4 is an improvement on it, which takes advantage of several useful strategies such as batch normalization and residual learning. This module is based on Inception-v4, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install inception_v4_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run inception_v4_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"inception_v4_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 
classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install inception_v4_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/inceptionv4_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 padding: int = 0,\n                 groups: int = 1,\n                 act: str = 'relu',\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + \"_bn\"\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            
bias_attr=ParamAttr(name=bn_name + \"_offset\"),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass InceptionStem(nn.Layer):\n    \"\"\"InceptionV4 stem module.\"\"\"\n\n    def __init__(self):\n        super(InceptionStem, self).__init__()\n        self._conv_1 = ConvBNLayer(3, 32, 3, stride=2, act=\"relu\", name=\"conv1_3x3_s2\")\n        self._conv_2 = ConvBNLayer(32, 32, 3, act=\"relu\", name=\"conv2_3x3_s1\")\n        self._conv_3 = ConvBNLayer(32, 64, 3, padding=1, act=\"relu\", name=\"conv3_3x3_s1\")\n        self._pool = MaxPool2d(kernel_size=3, stride=2, padding=0)\n        self._conv2 = ConvBNLayer(64, 96, 3, stride=2, act=\"relu\", name=\"inception_stem1_3x3_s2\")\n        self._conv1_1 = ConvBNLayer(160, 64, 1, act=\"relu\", name=\"inception_stem2_3x3_reduce\")\n        self._conv1_2 = ConvBNLayer(64, 96, 3, act=\"relu\", name=\"inception_stem2_3x3\")\n        self._conv2_1 = ConvBNLayer(160, 64, 1, act=\"relu\", name=\"inception_stem2_1x7_reduce\")\n        self._conv2_2 = ConvBNLayer(64, 64, (7, 1), padding=(3, 0), act=\"relu\", name=\"inception_stem2_1x7\")\n        self._conv2_3 = ConvBNLayer(64, 64, (1, 7), padding=(0, 3), act=\"relu\", name=\"inception_stem2_7x1\")\n        self._conv2_4 = ConvBNLayer(64, 96, 3, act=\"relu\", name=\"inception_stem2_3x3_2\")\n        self._conv3 = ConvBNLayer(192, 192, 3, stride=2, act=\"relu\", name=\"inception_stem3_3x3_s2\")\n\n    def forward(self, inputs: paddle.Tensor):\n        conv = self._conv_1(inputs)\n        conv = self._conv_2(conv)\n        conv = self._conv_3(conv)\n\n        pool1 = self._pool(conv)\n        conv2 = self._conv2(conv)\n        concat = paddle.concat([pool1, conv2], axis=1)\n\n        conv1 = self._conv1_1(concat)\n        conv1 = self._conv1_2(conv1)\n\n        conv2 = 
self._conv2_1(concat)\n        conv2 = self._conv2_2(conv2)\n        conv2 = self._conv2_3(conv2)\n        conv2 = self._conv2_4(conv2)\n\n        concat = paddle.concat([conv1, conv2], axis=1)\n\n        conv1 = self._conv3(concat)\n        pool1 = self._pool(concat)\n\n        concat = paddle.concat([conv1, pool1], axis=1)\n        return concat\n\n\nclass InceptionA(nn.Layer):\n    \"\"\"InceptionA module for InceptionV4.\"\"\"\n\n    def __init__(self, name: str):\n        super(InceptionA, self).__init__()\n        self._pool = AvgPool2d(kernel_size=3, stride=1, padding=1)\n        self._conv1 = ConvBNLayer(384, 96, 1, act=\"relu\", name=\"inception_a\" + name + \"_1x1\")\n        self._conv2 = ConvBNLayer(384, 96, 1, act=\"relu\", name=\"inception_a\" + name + \"_1x1_2\")\n        self._conv3_1 = ConvBNLayer(384, 64, 1, act=\"relu\", name=\"inception_a\" + name + \"_3x3_reduce\")\n        self._conv3_2 = ConvBNLayer(64, 96, 3, padding=1, act=\"relu\", name=\"inception_a\" + name + \"_3x3\")\n        self._conv4_1 = ConvBNLayer(384, 64, 1, act=\"relu\", name=\"inception_a\" + name + \"_3x3_2_reduce\")\n        self._conv4_2 = ConvBNLayer(64, 96, 3, padding=1, act=\"relu\", name=\"inception_a\" + name + \"_3x3_2\")\n        self._conv4_3 = ConvBNLayer(96, 96, 3, padding=1, act=\"relu\", name=\"inception_a\" + name + \"_3x3_3\")\n\n    def forward(self, inputs: paddle.Tensor):\n        pool1 = self._pool(inputs)\n        conv1 = self._conv1(pool1)\n\n        conv2 = self._conv2(inputs)\n\n        conv3 = self._conv3_1(inputs)\n        conv3 = self._conv3_2(conv3)\n\n        conv4 = self._conv4_1(inputs)\n        conv4 = self._conv4_2(conv4)\n        conv4 = self._conv4_3(conv4)\n\n        concat = paddle.concat([conv1, conv2, conv3, conv4], axis=1)\n        return concat\n\n\nclass ReductionA(nn.Layer):\n    \"\"\"ReductionA module for InceptionV4.\"\"\"\n\n    def __init__(self):\n        super(ReductionA, self).__init__()\n        self._pool = 
MaxPool2d(kernel_size=3, stride=2, padding=0)\n        self._conv2 = ConvBNLayer(384, 384, 3, stride=2, act=\"relu\", name=\"reduction_a_3x3\")\n        self._conv3_1 = ConvBNLayer(384, 192, 1, act=\"relu\", name=\"reduction_a_3x3_2_reduce\")\n        self._conv3_2 = ConvBNLayer(192, 224, 3, padding=1, act=\"relu\", name=\"reduction_a_3x3_2\")\n        self._conv3_3 = ConvBNLayer(224, 256, 3, stride=2, act=\"relu\", name=\"reduction_a_3x3_3\")\n\n    def forward(self, inputs: paddle.Tensor):\n        pool1 = self._pool(inputs)\n        conv2 = self._conv2(inputs)\n        conv3 = self._conv3_1(inputs)\n        conv3 = self._conv3_2(conv3)\n        conv3 = self._conv3_3(conv3)\n        concat = paddle.concat([pool1, conv2, conv3], axis=1)\n        return concat\n\n\nclass InceptionB(nn.Layer):\n    \"\"\"InceptionB module for InceptionV4.\"\"\"\n\n    def __init__(self, name: str = None):\n        super(InceptionB, self).__init__()\n        self._pool = AvgPool2d(kernel_size=3, stride=1, padding=1)\n        self._conv1 = ConvBNLayer(1024, 128, 1, act=\"relu\", name=\"inception_b\" + name + \"_1x1\")\n        self._conv2 = ConvBNLayer(1024, 384, 1, act=\"relu\", name=\"inception_b\" + name + \"_1x1_2\")\n        self._conv3_1 = ConvBNLayer(1024, 192, 1, act=\"relu\", name=\"inception_b\" + name + \"_1x7_reduce\")\n        self._conv3_2 = ConvBNLayer(192, 224, (1, 7), padding=(0, 3), act=\"relu\", name=\"inception_b\" + name + \"_1x7\")\n        self._conv3_3 = ConvBNLayer(224, 256, (7, 1), padding=(3, 0), act=\"relu\", name=\"inception_b\" + name + \"_7x1\")\n        self._conv4_1 = ConvBNLayer(1024, 192, 1, act=\"relu\", name=\"inception_b\" + name + \"_7x1_2_reduce\")\n        self._conv4_2 = ConvBNLayer(192, 192, (1, 7), padding=(0, 3), act=\"relu\", name=\"inception_b\" + name + \"_1x7_2\")\n        self._conv4_3 = ConvBNLayer(192, 224, (7, 1), padding=(3, 0), act=\"relu\", name=\"inception_b\" + name + \"_7x1_2\")\n        self._conv4_4 = ConvBNLayer(224, 224, 
(1, 7), padding=(0, 3), act=\"relu\", name=\"inception_b\" + name + \"_1x7_3\")\n        self._conv4_5 = ConvBNLayer(224, 256, (7, 1), padding=(3, 0), act=\"relu\", name=\"inception_b\" + name + \"_7x1_3\")\n\n    def forward(self, inputs: paddle.Tensor):\n        pool1 = self._pool(inputs)\n        conv1 = self._conv1(pool1)\n\n        conv2 = self._conv2(inputs)\n\n        conv3 = self._conv3_1(inputs)\n        conv3 = self._conv3_2(conv3)\n        conv3 = self._conv3_3(conv3)\n\n        conv4 = self._conv4_1(inputs)\n        conv4 = self._conv4_2(conv4)\n        conv4 = self._conv4_3(conv4)\n        conv4 = self._conv4_4(conv4)\n        conv4 = self._conv4_5(conv4)\n\n        concat = paddle.concat([conv1, conv2, conv3, conv4], axis=1)\n        return concat\n\n\nclass ReductionB(nn.Layer):\n    \"\"\"ReductionB module for InceptionV4.\"\"\"\n\n    def __init__(self):\n        super(ReductionB, self).__init__()\n        self._pool = MaxPool2d(kernel_size=3, stride=2, padding=0)\n        self._conv2_1 = ConvBNLayer(1024, 192, 1, act=\"relu\", name=\"reduction_b_3x3_reduce\")\n        self._conv2_2 = ConvBNLayer(192, 192, 3, stride=2, act=\"relu\", name=\"reduction_b_3x3\")\n        self._conv3_1 = ConvBNLayer(1024, 256, 1, act=\"relu\", name=\"reduction_b_1x7_reduce\")\n        self._conv3_2 = ConvBNLayer(256, 256, (1, 7), padding=(0, 3), act=\"relu\", name=\"reduction_b_1x7\")\n        self._conv3_3 = ConvBNLayer(256, 320, (7, 1), padding=(3, 0), act=\"relu\", name=\"reduction_b_7x1\")\n        self._conv3_4 = ConvBNLayer(320, 320, 3, stride=2, act=\"relu\", name=\"reduction_b_3x3_2\")\n\n    def forward(self, inputs: paddle.Tensor):\n        pool1 = self._pool(inputs)\n\n        conv2 = self._conv2_1(inputs)\n        conv2 = self._conv2_2(conv2)\n\n        conv3 = self._conv3_1(inputs)\n        conv3 = self._conv3_2(conv3)\n        conv3 = self._conv3_3(conv3)\n        conv3 = self._conv3_4(conv3)\n\n        concat = paddle.concat([pool1, conv2, conv3], 
axis=1)\n\n        return concat\n\n\nclass InceptionC(nn.Layer):\n    \"\"\"InceptionC module for InceptionV4.\"\"\"\n\n    def __init__(self, name: str = None):\n        super(InceptionC, self).__init__()\n        self._pool = AvgPool2d(kernel_size=3, stride=1, padding=1)\n        self._conv1 = ConvBNLayer(1536, 256, 1, act=\"relu\", name=\"inception_c\" + name + \"_1x1\")\n        self._conv2 = ConvBNLayer(1536, 256, 1, act=\"relu\", name=\"inception_c\" + name + \"_1x1_2\")\n        self._conv3_0 = ConvBNLayer(1536, 384, 1, act=\"relu\", name=\"inception_c\" + name + \"_1x1_3\")\n        self._conv3_1 = ConvBNLayer(384, 256, (1, 3), padding=(0, 1), act=\"relu\", name=\"inception_c\" + name + \"_1x3\")\n        self._conv3_2 = ConvBNLayer(384, 256, (3, 1), padding=(1, 0), act=\"relu\", name=\"inception_c\" + name + \"_3x1\")\n        self._conv4_0 = ConvBNLayer(1536, 384, 1, act=\"relu\", name=\"inception_c\" + name + \"_1x1_4\")\n        self._conv4_00 = ConvBNLayer(384, 448, (1, 3), padding=(0, 1), act=\"relu\", name=\"inception_c\" + name + \"_1x3_2\")\n        self._conv4_000 = ConvBNLayer(\n            448, 512, (3, 1), padding=(1, 0), act=\"relu\", name=\"inception_c\" + name + \"_3x1_2\")\n        self._conv4_1 = ConvBNLayer(512, 256, (1, 3), padding=(0, 1), act=\"relu\", name=\"inception_c\" + name + \"_1x3_3\")\n        self._conv4_2 = ConvBNLayer(512, 256, (3, 1), padding=(1, 0), act=\"relu\", name=\"inception_c\" + name + \"_3x1_3\")\n\n    def forward(self, inputs: paddle.Tensor):\n        pool1 = self._pool(inputs)\n        conv1 = self._conv1(pool1)\n\n        conv2 = self._conv2(inputs)\n\n        conv3 = self._conv3_0(inputs)\n        conv3_1 = self._conv3_1(conv3)\n        conv3_2 = self._conv3_2(conv3)\n\n        conv4 = self._conv4_0(inputs)\n        conv4 = self._conv4_00(conv4)\n        conv4 = self._conv4_000(conv4)\n        conv4_1 = self._conv4_1(conv4)\n        conv4_2 = self._conv4_2(conv4)\n\n        concat = paddle.concat([conv1, 
conv2, conv3_1, conv3_2, conv4_1, conv4_2], axis=1)\n\n        return concat\n\n\n@moduleinfo(\n    name=\"inceptionv4_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"InceptionV4_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass InceptionV4(nn.Layer):\n    \"\"\"InceptionV4 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(InceptionV4, self).__init__()\n        self._inception_stem = InceptionStem()\n\n        self._inceptionA_1 = InceptionA(name=\"1\")\n        self._inceptionA_2 = InceptionA(name=\"2\")\n        self._inceptionA_3 = InceptionA(name=\"3\")\n        self._inceptionA_4 = InceptionA(name=\"4\")\n        self._reductionA = ReductionA()\n\n        self._inceptionB_1 = InceptionB(name=\"1\")\n        self._inceptionB_2 = InceptionB(name=\"2\")\n        self._inceptionB_3 = InceptionB(name=\"3\")\n        self._inceptionB_4 = InceptionB(name=\"4\")\n        self._inceptionB_5 = InceptionB(name=\"5\")\n        self._inceptionB_6 = InceptionB(name=\"6\")\n        self._inceptionB_7 = InceptionB(name=\"7\")\n        self._reductionB = ReductionB()\n\n        self._inceptionC_1 = InceptionC(name=\"1\")\n        self._inceptionC_2 = InceptionC(name=\"2\")\n        self._inceptionC_3 = InceptionC(name=\"3\")\n\n        self.avg_pool = AdaptiveAvgPool2d(1)\n        self._drop = Dropout(p=0.2, mode=\"downscale_in_infer\")\n        stdv = 1.0 / math.sqrt(1536 * 1.0)\n        self.out = Linear(\n            1536,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"final_fc_weights\"),\n            bias_attr=ParamAttr(name=\"final_fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n        
    print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'inceptionv4_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/inceptionv4_imagenet.pdparams -O'\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs):\n        x = self._inception_stem(inputs)\n\n        x = self._inceptionA_1(x)\n        x = self._inceptionA_2(x)\n        x = self._inceptionA_3(x)\n        x = self._inceptionA_4(x)\n        x = self._reductionA(x)\n\n        x = self._inceptionB_1(x)\n        x = self._inceptionB_2(x)\n        x = self._inceptionB_3(x)\n        x = self._inceptionB_4(x)\n        x = self._inceptionB_5(x)\n        x = self._inceptionB_6(x)\n        x = self._inceptionB_7(x)\n        x = self._reductionB(x)\n\n        x = self._inceptionC_1(x)\n        x = self._inceptionC_2(x)\n        x = self._inceptionC_3(x)\n\n        x = self.avg_pool(x)\n        x = paddle.squeeze(x, axis=[2, 3])\n        x = self._drop(x)\n        x = self.out(x)\n        return x\n"
  },
  {
    "path": "modules/image/classification/levit_128_imagenet/README.md",
    "content": "# levit_128_imagenet\n\n|模型名称|levit_128_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|LeViT|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|54 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n  - LeViT 是一种快速推理的、用于图像分类任务的混合神经网络。其设计之初考虑了网络模型在不同的硬件平台上的性能，因此能够更好地反映普遍应用的真实场景。通过大量实验，作者找到了卷积神经网络与 Transformer 体系更好的结合方式，并且提出了 attention-based 方法，用于整合 Transformer 中的位置信息编码, 该模块的模型结构配置为LeViT128, 详情可参考[论文地址](https://arxiv.org/abs/2104.01136)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install levit_128_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run levit_128_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"levit_128_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** 
<br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m levit_128_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"\\}\n    url = \"http://127.0.0.1:8866/predict/levit_128_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install levit_128_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/levit_128_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# Code was based on https://github.com/facebookresearch/LeViT\nimport itertools\nimport math\nimport warnings\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import TruncatedNormal\nfrom paddle.regularizer import L2Decay\n\nfrom .vision_transformer import Identity\nfrom .vision_transformer import ones_\nfrom .vision_transformer import trunc_normal_\nfrom .vision_transformer import zeros_\n\n\ndef cal_attention_biases(attention_biases, attention_bias_idxs):\n    gather_list = []\n    attention_bias_t = paddle.transpose(attention_biases, (1, 0))\n    nums = attention_bias_idxs.shape[0]\n    for idx in range(nums):\n        gather = paddle.gather(attention_bias_t, attention_bias_idxs[idx])\n        gather_list.append(gather)\n    shape0, shape1 = attention_bias_idxs.shape\n    gather = paddle.concat(gather_list)\n    return paddle.transpose(gather, (1, 0)).reshape((0, shape0, shape1))\n\n\nclass Conv2d_BN(nn.Sequential):\n\n    def __init__(self, a, b, ks=1, stride=1, pad=0, dilation=1, groups=1, bn_weight_init=1, resolution=-10000):\n        super().__init__()\n        self.add_sublayer('c', nn.Conv2D(a, b, ks, stride, pad, dilation, groups, bias_attr=False))\n        bn = nn.BatchNorm2D(b)\n        ones_(bn.weight)\n        
zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n\nclass Linear_BN(nn.Sequential):\n\n    def __init__(self, a, b, bn_weight_init=1):\n        super().__init__()\n        self.add_sublayer('c', nn.Linear(a, b, bias_attr=False))\n        bn = nn.BatchNorm1D(b)\n        if bn_weight_init == 0:\n            zeros_(bn.weight)\n        else:\n            ones_(bn.weight)\n        zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n    def forward(self, x):\n        l, bn = self._sub_layers.values()\n        x = l(x)\n        return paddle.reshape(bn(x.flatten(0, 1)), x.shape)\n\n\nclass BN_Linear(nn.Sequential):\n\n    def __init__(self, a, b, bias=True, std=0.02):\n        super().__init__()\n        self.add_sublayer('bn', nn.BatchNorm1D(a))\n        l = nn.Linear(a, b, bias_attr=bias)\n        trunc_normal_(l.weight)\n        if bias:\n            zeros_(l.bias)\n        self.add_sublayer('l', l)\n\n\ndef b16(n, activation, resolution=224):\n    return nn.Sequential(Conv2d_BN(3, n // 8, 3, 2, 1, resolution=resolution), activation(),\n                         Conv2d_BN(n // 8, n // 4, 3, 2, 1, resolution=resolution // 2), activation(),\n                         Conv2d_BN(n // 4, n // 2, 3, 2, 1, resolution=resolution // 4), activation(),\n                         Conv2d_BN(n // 2, n, 3, 2, 1, resolution=resolution // 8))\n\n\nclass Residual(nn.Layer):\n\n    def __init__(self, m, drop):\n        super().__init__()\n        self.m = m\n        self.drop = drop\n\n    def forward(self, x):\n        if self.training and self.drop > 0:\n            # drop path: keep the residual branch with probability 1 - drop,\n            # rescaled by 1 / (1 - drop) so the expectation is unchanged\n            y = paddle.rand(shape=[x.shape[0], 1, 1]).__ge__(self.drop).astype(\"float32\")\n            y = y.divide(paddle.full_like(y, 1 - self.drop))\n            return paddle.add(x, self.m(x) * y)\n        else:\n            return paddle.add(x, self.m(x))\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, activation=None, resolution=14):\n        super().__init__()\n        
self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * num_heads\n        self.attn_ratio = attn_ratio\n        self.h = self.dh + nh_kd * 2\n        self.qkv = Linear_BN(dim, self.h)\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, dim, bn_weight_init=0))\n        points = list(itertools.product(range(resolution), range(resolution)))\n        N = len(points)\n        attention_offsets = {}\n        idxs = []\n        for p1 in points:\n            for p2 in points:\n                offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n        tensor_idxs = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs, [N, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        qkv = self.qkv(x)\n        qkv = paddle.reshape(qkv, [B, N, self.num_heads, self.h // self.num_heads])\n        q, k, v = paddle.split(qkv, [self.key_dim, self.key_dim, self.d], axis=3)\n        q = paddle.transpose(q, 
perm=[0, 2, 1, 3])\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        k_transpose = paddle.transpose(k, perm=[0, 1, 3, 2])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n        attn = (paddle.matmul(q, k_transpose) * self.scale + attention_biases)\n        attn = F.softmax(attn)\n        x = paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3])\n        x = paddle.reshape(x, [B, N, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass Subsample(nn.Layer):\n\n    def __init__(self, stride, resolution):\n        super().__init__()\n        self.stride = stride\n        self.resolution = resolution\n\n    def forward(self, x):\n        B, N, C = x.shape\n        x = paddle.reshape(x, [B, self.resolution, self.resolution, C])\n        end1, end2 = x.shape[1], x.shape[2]\n        x = x[:, 0:end1:self.stride, 0:end2:self.stride]\n        x = paddle.reshape(x, [B, -1, C])\n        return x\n\n\nclass AttentionSubsample(nn.Layer):\n\n    def __init__(self,\n                 in_dim,\n                 out_dim,\n                 key_dim,\n                 num_heads=8,\n                 attn_ratio=2,\n                 activation=None,\n                 stride=2,\n                 resolution=14,\n                 resolution_=7):\n        super().__init__()\n        self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * self.num_heads\n        self.attn_ratio = attn_ratio\n        self.resolution_ = resolution_\n        self.resolution_2 = resolution_**2\n        self.training = True\n        h = self.dh + nh_kd\n        self.kv = Linear_BN(in_dim, h)\n\n        self.q = 
nn.Sequential(Subsample(stride, resolution), Linear_BN(in_dim, nh_kd))\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, out_dim))\n\n        self.stride = stride\n        self.resolution = resolution\n        points = list(itertools.product(range(resolution), range(resolution)))\n        points_ = list(itertools.product(range(resolution_), range(resolution_)))\n\n        N = len(points)\n        N_ = len(points_)\n        attention_offsets = {}\n        idxs = []\n        i = 0\n        j = 0\n        for p1 in points_:\n            i += 1\n            for p2 in points:\n                j += 1\n                size = 1\n                offset = (abs(p1[0] * stride - p2[0] + (size - 1) / 2), abs(p1[1] * stride - p2[1] + (size - 1) / 2))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n\n        tensor_idxs_ = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs_, [N_, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        kv = self.kv(x)\n        kv = paddle.reshape(kv, [B, N, self.num_heads, -1])\n        k, v = paddle.split(kv, [self.key_dim, self.d], axis=3)\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])  
# BHNC\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        q = paddle.reshape(self.q(x), [B, self.resolution_2, self.num_heads, self.key_dim])\n        q = paddle.transpose(q, perm=[0, 2, 1, 3])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n\n        attn = (paddle.matmul(q, paddle.transpose(k, perm=[0, 1, 3, 2]))) * self.scale + attention_biases\n        attn = F.softmax(attn)\n\n        x = paddle.reshape(paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3]), [B, -1, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass LeViT(nn.Layer):\n    \"\"\" Vision Transformer with support for patch or hybrid CNN input stage\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=[192],\n                 key_dim=[64],\n                 depth=[12],\n                 num_heads=[3],\n                 attn_ratio=[2],\n                 mlp_ratio=[2],\n                 hybrid_backbone=None,\n                 down_ops=[],\n                 attention_activation=nn.Hardswish,\n                 mlp_activation=nn.Hardswish,\n                 distillation=True,\n                 drop_path=0):\n        super().__init__()\n\n        self.class_num = class_num\n        self.num_features = embed_dim[-1]\n        self.embed_dim = embed_dim\n        self.distillation = distillation\n\n        self.patch_embed = hybrid_backbone\n\n        self.blocks = []\n        down_ops.append([''])\n        resolution = img_size // patch_size\n        for i, (ed, kd, dpth, nh, ar, mr,\n                do) in enumerate(zip(embed_dim, key_dim, depth, num_heads, attn_ratio, mlp_ratio, down_ops)):\n            for _ in range(dpth):\n                self.blocks.append(\n                    Residual(\n         
               Attention(\n                            ed,\n                            kd,\n                            nh,\n                            attn_ratio=ar,\n                            activation=attention_activation,\n                            resolution=resolution,\n                        ), drop_path))\n                if mr > 0:\n                    h = int(ed * mr)\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(ed, h),\n                                mlp_activation(),\n                                Linear_BN(h, ed, bn_weight_init=0),\n                            ), drop_path))\n            if do[0] == 'Subsample':\n                #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n                resolution_ = (resolution - 1) // do[5] + 1\n                self.blocks.append(\n                    AttentionSubsample(*embed_dim[i:i + 2],\n                                       key_dim=do[1],\n                                       num_heads=do[2],\n                                       attn_ratio=do[3],\n                                       activation=attention_activation,\n                                       stride=do[5],\n                                       resolution=resolution,\n                                       resolution_=resolution_))\n                resolution = resolution_\n                if do[4] > 0:  # mlp_ratio\n                    h = int(embed_dim[i + 1] * do[4])\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(embed_dim[i + 1], h),\n                                mlp_activation(),\n                                Linear_BN(h, embed_dim[i + 1], bn_weight_init=0),\n                            ), drop_path))\n        self.blocks = nn.Sequential(*self.blocks)\n\n        # 
Classifier head\n        self.head = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n        if distillation:\n            self.head_dist = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n\n    def forward(self, x):\n        x = self.patch_embed(x)\n        x = x.flatten(2)\n        x = paddle.transpose(x, perm=[0, 2, 1])\n        x = self.blocks(x)\n        x = x.mean(1)\n\n        x = paddle.reshape(x, [-1, self.embed_dim[-1]])\n        if self.distillation:\n            x = self.head(x), self.head_dist(x)\n            if not self.training:\n                x = (x[0] + x[1]) / 2\n        else:\n            x = self.head(x)\n        return x\n\n\ndef model_factory(C, D, X, N, drop_path, class_num, distillation):\n    embed_dim = [int(x) for x in C.split('_')]\n    num_heads = [int(x) for x in N.split('_')]\n    depth = [int(x) for x in X.split('_')]\n    act = nn.Hardswish\n    model = LeViT(\n        patch_size=16,\n        embed_dim=embed_dim,\n        num_heads=num_heads,\n        key_dim=[D] * 3,\n        depth=depth,\n        attn_ratio=[2, 2, 2],\n        mlp_ratio=[2, 2, 2],\n        down_ops=[\n            #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n            ['Subsample', D, embed_dim[0] // D, 4, 2, 2],\n            ['Subsample', D, embed_dim[1] // D, 4, 2, 2],\n        ],\n        attention_activation=act,\n        mlp_activation=act,\n        hybrid_backbone=b16(embed_dim[0], activation=act),\n        class_num=class_num,\n        drop_path=drop_path,\n        distillation=distillation)\n\n    return model\n\n\nspecification = {\n    'LeViT_128S': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_6_8',\n        'X': '2_3_4',\n        'drop_path': 0\n    },\n    'LeViT_128': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_8_12',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_192': {\n        'C': '192_288_384',\n        'D': 32,\n     
   'N': '3_5_6',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_256': {\n        'C': '256_384_512',\n        'D': 32,\n        'N': '4_6_8',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_384': {\n        'C': '384_512_768',\n        'D': 32,\n        'N': '6_9_12',\n        'X': '4_4_4',\n        'drop_path': 0.1\n    },\n}\n\n\ndef LeViT_128(**kwargs):\n    model = model_factory(**specification['LeViT_128'], class_num=1000, distillation=False)\n    return model\n"
  },
  {
    "path": "modules/image/classification/levit_128_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LeViT_128\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"levit_128_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass LeViT_128_ImageNet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'LeViT_128.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'LeViT_128_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = LeViT_128()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/levit_128_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/levit_128_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"dl should be a list or a dict\")\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if ks[0] not in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file ({}) does not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/levit_128s_imagenet/README.md",
    "content": "# levit_128s_imagenet\n\n|模型名称|levit_128s_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|LeViT|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|45 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n  - LeViT 是一种快速推理的、用于图像分类任务的混合神经网络。其设计之初考虑了网络模型在不同的硬件平台上的性能，因此能够更好地反映普遍应用的真实场景。通过大量实验，作者找到了卷积神经网络与 Transformer 体系更好的结合方式，并且提出了 attention-based 方法，用于整合 Transformer 中的位置信息编码, 该模块的模型结构配置为LeViT128s, 详情可参考[论文地址](https://arxiv.org/abs/2104.01136)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install levit_128s_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run levit_128s_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"levit_128s_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** 
<br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m levit_128s_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/levit_128s_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install levit_128s_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/levit_128s_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# Code was based on https://github.com/facebookresearch/LeViT\nimport itertools\nimport math\nimport warnings\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import TruncatedNormal\nfrom paddle.regularizer import L2Decay\n\nfrom .vision_transformer import Identity\nfrom .vision_transformer import ones_\nfrom .vision_transformer import trunc_normal_\nfrom .vision_transformer import zeros_\n\n\ndef cal_attention_biases(attention_biases, attention_bias_idxs):\n    gather_list = []\n    attention_bias_t = paddle.transpose(attention_biases, (1, 0))\n    nums = attention_bias_idxs.shape[0]\n    for idx in range(nums):\n        gather = paddle.gather(attention_bias_t, attention_bias_idxs[idx])\n        gather_list.append(gather)\n    shape0, shape1 = attention_bias_idxs.shape\n    gather = paddle.concat(gather_list)\n    return paddle.transpose(gather, (1, 0)).reshape((0, shape0, shape1))\n\n\nclass Conv2d_BN(nn.Sequential):\n\n    def __init__(self, a, b, ks=1, stride=1, pad=0, dilation=1, groups=1, bn_weight_init=1, resolution=-10000):\n        super().__init__()\n        self.add_sublayer('c', nn.Conv2D(a, b, ks, stride, pad, dilation, groups, bias_attr=False))\n        bn = nn.BatchNorm2D(b)\n        ones_(bn.weight)\n        
zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n\nclass Linear_BN(nn.Sequential):\n\n    def __init__(self, a, b, bn_weight_init=1):\n        super().__init__()\n        self.add_sublayer('c', nn.Linear(a, b, bias_attr=False))\n        bn = nn.BatchNorm1D(b)\n        if bn_weight_init == 0:\n            zeros_(bn.weight)\n        else:\n            ones_(bn.weight)\n        zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n    def forward(self, x):\n        l, bn = self._sub_layers.values()\n        x = l(x)\n        return paddle.reshape(bn(x.flatten(0, 1)), x.shape)\n\n\nclass BN_Linear(nn.Sequential):\n\n    def __init__(self, a, b, bias=True, std=0.02):\n        super().__init__()\n        self.add_sublayer('bn', nn.BatchNorm1D(a))\n        l = nn.Linear(a, b, bias_attr=bias)\n        trunc_normal_(l.weight)\n        if bias:\n            zeros_(l.bias)\n        self.add_sublayer('l', l)\n\n\ndef b16(n, activation, resolution=224):\n    return nn.Sequential(Conv2d_BN(3, n // 8, 3, 2, 1, resolution=resolution), activation(),\n                         Conv2d_BN(n // 8, n // 4, 3, 2, 1, resolution=resolution // 2), activation(),\n                         Conv2d_BN(n // 4, n // 2, 3, 2, 1, resolution=resolution // 4), activation(),\n                         Conv2d_BN(n // 2, n, 3, 2, 1, resolution=resolution // 8))\n\n\nclass Residual(nn.Layer):\n\n    def __init__(self, m, drop):\n        super().__init__()\n        self.m = m\n        self.drop = drop\n\n    def forward(self, x):\n        if self.training and self.drop > 0:\n            # drop path: randomly zero the residual branch per sample and rescale the survivors\n            y = paddle.rand(shape=[x.shape[0], 1, 1]).__ge__(self.drop).astype(\"float32\")\n            y = y.divide(paddle.full_like(y, 1 - self.drop))\n            return paddle.add(x, self.m(x) * y)\n        else:\n            return paddle.add(x, self.m(x))\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, activation=None, resolution=14):\n        super().__init__()\n        
self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * num_heads\n        self.attn_ratio = attn_ratio\n        self.h = self.dh + nh_kd * 2\n        self.qkv = Linear_BN(dim, self.h)\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, dim, bn_weight_init=0))\n        points = list(itertools.product(range(resolution), range(resolution)))\n        N = len(points)\n        attention_offsets = {}\n        idxs = []\n        for p1 in points:\n            for p2 in points:\n                offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n        tensor_idxs = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs, [N, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        qkv = self.qkv(x)\n        qkv = paddle.reshape(qkv, [B, N, self.num_heads, self.h // self.num_heads])\n        q, k, v = paddle.split(qkv, [self.key_dim, self.key_dim, self.d], axis=3)\n        q = paddle.transpose(q, 
perm=[0, 2, 1, 3])\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        k_transpose = paddle.transpose(k, perm=[0, 1, 3, 2])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n        attn = (paddle.matmul(q, k_transpose) * self.scale + attention_biases)\n        attn = F.softmax(attn)\n        x = paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3])\n        x = paddle.reshape(x, [B, N, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass Subsample(nn.Layer):\n\n    def __init__(self, stride, resolution):\n        super().__init__()\n        self.stride = stride\n        self.resolution = resolution\n\n    def forward(self, x):\n        B, N, C = x.shape\n        x = paddle.reshape(x, [B, self.resolution, self.resolution, C])\n        end1, end2 = x.shape[1], x.shape[2]\n        x = x[:, 0:end1:self.stride, 0:end2:self.stride]\n        x = paddle.reshape(x, [B, -1, C])\n        return x\n\n\nclass AttentionSubsample(nn.Layer):\n\n    def __init__(self,\n                 in_dim,\n                 out_dim,\n                 key_dim,\n                 num_heads=8,\n                 attn_ratio=2,\n                 activation=None,\n                 stride=2,\n                 resolution=14,\n                 resolution_=7):\n        super().__init__()\n        self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * self.num_heads\n        self.attn_ratio = attn_ratio\n        self.resolution_ = resolution_\n        self.resolution_2 = resolution_**2\n        self.training = True\n        h = self.dh + nh_kd\n        self.kv = Linear_BN(in_dim, h)\n\n        self.q = 
nn.Sequential(Subsample(stride, resolution), Linear_BN(in_dim, nh_kd))\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, out_dim))\n\n        self.stride = stride\n        self.resolution = resolution\n        points = list(itertools.product(range(resolution), range(resolution)))\n        points_ = list(itertools.product(range(resolution_), range(resolution_)))\n\n        N = len(points)\n        N_ = len(points_)\n        attention_offsets = {}\n        idxs = []\n        i = 0\n        j = 0\n        for p1 in points_:\n            i += 1\n            for p2 in points:\n                j += 1\n                size = 1\n                offset = (abs(p1[0] * stride - p2[0] + (size - 1) / 2), abs(p1[1] * stride - p2[1] + (size - 1) / 2))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n\n        tensor_idxs_ = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs_, [N_, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        kv = self.kv(x)\n        kv = paddle.reshape(kv, [B, N, self.num_heads, -1])\n        k, v = paddle.split(kv, [self.key_dim, self.d], axis=3)\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])  
# BHNC\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        q = paddle.reshape(self.q(x), [B, self.resolution_2, self.num_heads, self.key_dim])\n        q = paddle.transpose(q, perm=[0, 2, 1, 3])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n\n        attn = (paddle.matmul(q, paddle.transpose(k, perm=[0, 1, 3, 2]))) * self.scale + attention_biases\n        attn = F.softmax(attn)\n\n        x = paddle.reshape(paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3]), [B, -1, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass LeViT(nn.Layer):\n    \"\"\" Vision Transformer with support for patch or hybrid CNN input stage\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=[192],\n                 key_dim=[64],\n                 depth=[12],\n                 num_heads=[3],\n                 attn_ratio=[2],\n                 mlp_ratio=[2],\n                 hybrid_backbone=None,\n                 down_ops=[],\n                 attention_activation=nn.Hardswish,\n                 mlp_activation=nn.Hardswish,\n                 distillation=True,\n                 drop_path=0):\n        super().__init__()\n\n        self.class_num = class_num\n        self.num_features = embed_dim[-1]\n        self.embed_dim = embed_dim\n        self.distillation = distillation\n\n        self.patch_embed = hybrid_backbone\n\n        self.blocks = []\n        down_ops.append([''])\n        resolution = img_size // patch_size\n        for i, (ed, kd, dpth, nh, ar, mr,\n                do) in enumerate(zip(embed_dim, key_dim, depth, num_heads, attn_ratio, mlp_ratio, down_ops)):\n            for _ in range(dpth):\n                self.blocks.append(\n                    Residual(\n         
               Attention(\n                            ed,\n                            kd,\n                            nh,\n                            attn_ratio=ar,\n                            activation=attention_activation,\n                            resolution=resolution,\n                        ), drop_path))\n                if mr > 0:\n                    h = int(ed * mr)\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(ed, h),\n                                mlp_activation(),\n                                Linear_BN(h, ed, bn_weight_init=0),\n                            ), drop_path))\n            if do[0] == 'Subsample':\n                #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n                resolution_ = (resolution - 1) // do[5] + 1\n                self.blocks.append(\n                    AttentionSubsample(*embed_dim[i:i + 2],\n                                       key_dim=do[1],\n                                       num_heads=do[2],\n                                       attn_ratio=do[3],\n                                       activation=attention_activation,\n                                       stride=do[5],\n                                       resolution=resolution,\n                                       resolution_=resolution_))\n                resolution = resolution_\n                if do[4] > 0:  # mlp_ratio\n                    h = int(embed_dim[i + 1] * do[4])\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(embed_dim[i + 1], h),\n                                mlp_activation(),\n                                Linear_BN(h, embed_dim[i + 1], bn_weight_init=0),\n                            ), drop_path))\n        self.blocks = nn.Sequential(*self.blocks)\n\n        # 
Classifier head\n        self.head = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n        if distillation:\n            self.head_dist = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n\n    def forward(self, x):\n        x = self.patch_embed(x)\n        x = x.flatten(2)\n        x = paddle.transpose(x, perm=[0, 2, 1])\n        x = self.blocks(x)\n        x = x.mean(1)\n\n        x = paddle.reshape(x, [-1, self.embed_dim[-1]])\n        if self.distillation:\n            x = self.head(x), self.head_dist(x)\n            if not self.training:\n                x = (x[0] + x[1]) / 2\n        else:\n            x = self.head(x)\n        return x\n\n\ndef model_factory(C, D, X, N, drop_path, class_num, distillation):\n    embed_dim = [int(x) for x in C.split('_')]\n    num_heads = [int(x) for x in N.split('_')]\n    depth = [int(x) for x in X.split('_')]\n    act = nn.Hardswish\n    model = LeViT(\n        patch_size=16,\n        embed_dim=embed_dim,\n        num_heads=num_heads,\n        key_dim=[D] * 3,\n        depth=depth,\n        attn_ratio=[2, 2, 2],\n        mlp_ratio=[2, 2, 2],\n        down_ops=[\n            #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n            ['Subsample', D, embed_dim[0] // D, 4, 2, 2],\n            ['Subsample', D, embed_dim[1] // D, 4, 2, 2],\n        ],\n        attention_activation=act,\n        mlp_activation=act,\n        hybrid_backbone=b16(embed_dim[0], activation=act),\n        class_num=class_num,\n        drop_path=drop_path,\n        distillation=distillation)\n\n    return model\n\n\nspecification = {\n    'LeViT_128S': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_6_8',\n        'X': '2_3_4',\n        'drop_path': 0\n    },\n    'LeViT_128': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_8_12',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_192': {\n        'C': '192_288_384',\n        'D': 32,\n     
   'N': '3_5_6',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_256': {\n        'C': '256_384_512',\n        'D': 32,\n        'N': '4_6_8',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_384': {\n        'C': '384_512_768',\n        'D': 32,\n        'N': '6_9_12',\n        'X': '4_4_4',\n        'drop_path': 0.1\n    },\n}\n\n\ndef LeViT_128S(**kwargs):\n    model = model_factory(**specification['LeViT_128S'], class_num=1000, distillation=False)\n    return model\n"
  },
  {
    "path": "modules/image/classification/levit_128s_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LeViT_128S\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"levit_128s_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass LeViT_128S_ImageNet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'LeViT_128S.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'LeViT_128S_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = LeViT_128S()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains the keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/levit_128s_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract the mean and divide by the std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
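For reference, the index-selection rule inside `Topk.__call__` (top-k via argsort for single-label, a 0.5 probability threshold for multi-label) can be sketched in pure NumPy. `topk_indices` is a hypothetical helper name used for illustration only; the module's class additionally maps class ids to label names and rounds the scores:

```python
import numpy as np

def topk_indices(probs, topk=1, multilabel=False, threshold=0.5):
    """Sketch of the Topk post-processor's index selection.

    Single-label: indices of the top-k probabilities, highest first.
    Multi-label: all indices whose probability clears the threshold.
    """
    if multilabel:
        return np.where(probs >= threshold)[0].astype("int32")
    # argsort ascending, take the last k, reverse so the best comes first
    return probs.argsort(axis=0)[-topk:][::-1].astype("int32")

probs = np.array([0.05, 0.70, 0.20, 0.05])
top2 = topk_indices(probs, topk=2)           # indices of the two largest probs
multi = topk_indices(probs, multilabel=True)  # indices clearing the threshold
```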
  {
    "path": "modules/image/classification/levit_128s_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
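The dotted-key overrides that `override_config` accepts (e.g. `'VALID.transforms.1.ResizeImage.resize_short=300'`) walk the nested config one segment at a time, treating numeric segments as list indices. A minimal sketch of that walk, assuming plain dicts and lists and skipping the module's `str2num` value coercion (`override_path` is a hypothetical name):

```python
def override_path(cfg, dotted_key, value):
    """Set a nested value addressed by a dotted key path.

    'VALID.transforms.1.resize_short' walks cfg['VALID']['transforms'][1]
    and sets its 'resize_short' entry; numeric segments index into lists.
    """
    keys = dotted_key.split('.')
    node = cfg
    for k in keys[:-1]:
        node = node[int(k)] if isinstance(node, list) else node[k]
    last = keys[-1]
    if isinstance(node, list):
        node[int(last)] = value
    else:
        node[last] = value
    return cfg

cfg = {'topk': 1, 'VALID': {'transforms': [{'decode': None}, {'resize_short': 256}]}}
override_path(cfg, 'topk', 2)
override_path(cfg, 'VALID.transforms.1.resize_short', 300)
```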
  {
    "path": "modules/image/classification/levit_192_imagenet/README.md",
    "content": "# levit_192_imagenet\n\n|模型名称|levit_192_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|LeViT|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|64 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n  - LeViT 是一种快速推理的、用于图像分类任务的混合神经网络。其设计之初考虑了网络模型在不同的硬件平台上的性能，因此能够更好地反映普遍应用的真实场景。通过大量实验，作者找到了卷积神经网络与 Transformer 体系更好的结合方式，并且提出了 attention-based 方法，用于整合 Transformer 中的位置信息编码, 该模块的模型结构配置为LeViT192, 详情可参考[论文地址](https://arxiv.org/abs/2104.01136)。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install levit_192_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run levit_192_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"levit_192_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** 
<br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m levit_192_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"\\}\n    url = \"http://127.0.0.1:8866/predict/levit_192_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install levit_192_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/levit_192_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# Code was based on https://github.com/facebookresearch/LeViT\nimport itertools\nimport math\nimport warnings\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import TruncatedNormal\nfrom paddle.regularizer import L2Decay\n\nfrom .vision_transformer import Identity\nfrom .vision_transformer import ones_\nfrom .vision_transformer import trunc_normal_\nfrom .vision_transformer import zeros_\n\n\ndef cal_attention_biases(attention_biases, attention_bias_idxs):\n    gather_list = []\n    attention_bias_t = paddle.transpose(attention_biases, (1, 0))\n    nums = attention_bias_idxs.shape[0]\n    for idx in range(nums):\n        gather = paddle.gather(attention_bias_t, attention_bias_idxs[idx])\n        gather_list.append(gather)\n    shape0, shape1 = attention_bias_idxs.shape\n    gather = paddle.concat(gather_list)\n    return paddle.transpose(gather, (1, 0)).reshape((0, shape0, shape1))\n\n\nclass Conv2d_BN(nn.Sequential):\n\n    def __init__(self, a, b, ks=1, stride=1, pad=0, dilation=1, groups=1, bn_weight_init=1, resolution=-10000):\n        super().__init__()\n        self.add_sublayer('c', nn.Conv2D(a, b, ks, stride, pad, dilation, groups, bias_attr=False))\n        bn = nn.BatchNorm2D(b)\n        ones_(bn.weight)\n        
zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n\nclass Linear_BN(nn.Sequential):\n\n    def __init__(self, a, b, bn_weight_init=1):\n        super().__init__()\n        self.add_sublayer('c', nn.Linear(a, b, bias_attr=False))\n        bn = nn.BatchNorm1D(b)\n        if bn_weight_init == 0:\n            zeros_(bn.weight)\n        else:\n            ones_(bn.weight)\n        zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n    def forward(self, x):\n        l, bn = self._sub_layers.values()\n        x = l(x)\n        return paddle.reshape(bn(x.flatten(0, 1)), x.shape)\n\n\nclass BN_Linear(nn.Sequential):\n\n    def __init__(self, a, b, bias=True, std=0.02):\n        super().__init__()\n        self.add_sublayer('bn', nn.BatchNorm1D(a))\n        l = nn.Linear(a, b, bias_attr=bias)\n        trunc_normal_(l.weight)\n        if bias:\n            zeros_(l.bias)\n        self.add_sublayer('l', l)\n\n\ndef b16(n, activation, resolution=224):\n    return nn.Sequential(Conv2d_BN(3, n // 8, 3, 2, 1, resolution=resolution), activation(),\n                         Conv2d_BN(n // 8, n // 4, 3, 2, 1, resolution=resolution // 2), activation(),\n                         Conv2d_BN(n // 4, n // 2, 3, 2, 1, resolution=resolution // 4), activation(),\n                         Conv2d_BN(n // 2, n, 3, 2, 1, resolution=resolution // 8))\n\n\nclass Residual(nn.Layer):\n\n    def __init__(self, m, drop):\n        super().__init__()\n        self.m = m\n        self.drop = drop\n\n    def forward(self, x):\n        if self.training and self.drop > 0:\n            y = paddle.rand(shape=[x.shape[0], 1, 1]).__ge__(self.drop).astype(\"float32\")\n            y = y.divide(paddle.full_like(y, 1 - self.drop))\n            # drop path: add the residual branch scaled by the kept/rescaled mask\n            return paddle.add(x, self.m(x) * y)\n        else:\n            return paddle.add(x, self.m(x))\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, activation=None, resolution=14):\n        super().__init__()\n        
self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * num_heads\n        self.attn_ratio = attn_ratio\n        self.h = self.dh + nh_kd * 2\n        self.qkv = Linear_BN(dim, self.h)\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, dim, bn_weight_init=0))\n        points = list(itertools.product(range(resolution), range(resolution)))\n        N = len(points)\n        attention_offsets = {}\n        idxs = []\n        for p1 in points:\n            for p2 in points:\n                offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n        tensor_idxs = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs, [N, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        qkv = self.qkv(x)\n        qkv = paddle.reshape(qkv, [B, N, self.num_heads, self.h // self.num_heads])\n        q, k, v = paddle.split(qkv, [self.key_dim, self.key_dim, self.d], axis=3)\n        q = paddle.transpose(q, 
perm=[0, 2, 1, 3])\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        k_transpose = paddle.transpose(k, perm=[0, 1, 3, 2])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n        attn = (paddle.matmul(q, k_transpose) * self.scale + attention_biases)\n        attn = F.softmax(attn)\n        x = paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3])\n        x = paddle.reshape(x, [B, N, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass Subsample(nn.Layer):\n\n    def __init__(self, stride, resolution):\n        super().__init__()\n        self.stride = stride\n        self.resolution = resolution\n\n    def forward(self, x):\n        B, N, C = x.shape\n        x = paddle.reshape(x, [B, self.resolution, self.resolution, C])\n        end1, end2 = x.shape[1], x.shape[2]\n        x = x[:, 0:end1:self.stride, 0:end2:self.stride]\n        x = paddle.reshape(x, [B, -1, C])\n        return x\n\n\nclass AttentionSubsample(nn.Layer):\n\n    def __init__(self,\n                 in_dim,\n                 out_dim,\n                 key_dim,\n                 num_heads=8,\n                 attn_ratio=2,\n                 activation=None,\n                 stride=2,\n                 resolution=14,\n                 resolution_=7):\n        super().__init__()\n        self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * self.num_heads\n        self.attn_ratio = attn_ratio\n        self.resolution_ = resolution_\n        self.resolution_2 = resolution_**2\n        self.training = True\n        h = self.dh + nh_kd\n        self.kv = Linear_BN(in_dim, h)\n\n        self.q = 
nn.Sequential(Subsample(stride, resolution), Linear_BN(in_dim, nh_kd))\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, out_dim))\n\n        self.stride = stride\n        self.resolution = resolution\n        points = list(itertools.product(range(resolution), range(resolution)))\n        points_ = list(itertools.product(range(resolution_), range(resolution_)))\n\n        N = len(points)\n        N_ = len(points_)\n        attention_offsets = {}\n        idxs = []\n        i = 0\n        j = 0\n        for p1 in points_:\n            i += 1\n            for p2 in points:\n                j += 1\n                size = 1\n                offset = (abs(p1[0] * stride - p2[0] + (size - 1) / 2), abs(p1[1] * stride - p2[1] + (size - 1) / 2))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n\n        tensor_idxs_ = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs_, [N_, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        kv = self.kv(x)\n        kv = paddle.reshape(kv, [B, N, self.num_heads, -1])\n        k, v = paddle.split(kv, [self.key_dim, self.d], axis=3)\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])  
# BHNC\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        q = paddle.reshape(self.q(x), [B, self.resolution_2, self.num_heads, self.key_dim])\n        q = paddle.transpose(q, perm=[0, 2, 1, 3])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n\n        attn = (paddle.matmul(q, paddle.transpose(k, perm=[0, 1, 3, 2]))) * self.scale + attention_biases\n        attn = F.softmax(attn)\n\n        x = paddle.reshape(paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3]), [B, -1, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass LeViT(nn.Layer):\n    \"\"\" Vision Transformer with support for patch or hybrid CNN input stage\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=[192],\n                 key_dim=[64],\n                 depth=[12],\n                 num_heads=[3],\n                 attn_ratio=[2],\n                 mlp_ratio=[2],\n                 hybrid_backbone=None,\n                 down_ops=[],\n                 attention_activation=nn.Hardswish,\n                 mlp_activation=nn.Hardswish,\n                 distillation=True,\n                 drop_path=0):\n        super().__init__()\n\n        self.class_num = class_num\n        self.num_features = embed_dim[-1]\n        self.embed_dim = embed_dim\n        self.distillation = distillation\n\n        self.patch_embed = hybrid_backbone\n\n        self.blocks = []\n        down_ops.append([''])\n        resolution = img_size // patch_size\n        for i, (ed, kd, dpth, nh, ar, mr,\n                do) in enumerate(zip(embed_dim, key_dim, depth, num_heads, attn_ratio, mlp_ratio, down_ops)):\n            for _ in range(dpth):\n                self.blocks.append(\n                    Residual(\n         
               Attention(\n                            ed,\n                            kd,\n                            nh,\n                            attn_ratio=ar,\n                            activation=attention_activation,\n                            resolution=resolution,\n                        ), drop_path))\n                if mr > 0:\n                    h = int(ed * mr)\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(ed, h),\n                                mlp_activation(),\n                                Linear_BN(h, ed, bn_weight_init=0),\n                            ), drop_path))\n            if do[0] == 'Subsample':\n                #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n                resolution_ = (resolution - 1) // do[5] + 1\n                self.blocks.append(\n                    AttentionSubsample(*embed_dim[i:i + 2],\n                                       key_dim=do[1],\n                                       num_heads=do[2],\n                                       attn_ratio=do[3],\n                                       activation=attention_activation,\n                                       stride=do[5],\n                                       resolution=resolution,\n                                       resolution_=resolution_))\n                resolution = resolution_\n                if do[4] > 0:  # mlp_ratio\n                    h = int(embed_dim[i + 1] * do[4])\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(embed_dim[i + 1], h),\n                                mlp_activation(),\n                                Linear_BN(h, embed_dim[i + 1], bn_weight_init=0),\n                            ), drop_path))\n        self.blocks = nn.Sequential(*self.blocks)\n\n        # 
Classifier head\n        self.head = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n        if distillation:\n            self.head_dist = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n\n    def forward(self, x):\n        x = self.patch_embed(x)\n        x = x.flatten(2)\n        x = paddle.transpose(x, perm=[0, 2, 1])\n        x = self.blocks(x)\n        x = x.mean(1)\n\n        x = paddle.reshape(x, [-1, self.embed_dim[-1]])\n        if self.distillation:\n            x = self.head(x), self.head_dist(x)\n            if not self.training:\n                x = (x[0] + x[1]) / 2\n        else:\n            x = self.head(x)\n        return x\n\n\ndef model_factory(C, D, X, N, drop_path, class_num, distillation):\n    embed_dim = [int(x) for x in C.split('_')]\n    num_heads = [int(x) for x in N.split('_')]\n    depth = [int(x) for x in X.split('_')]\n    act = nn.Hardswish\n    model = LeViT(\n        patch_size=16,\n        embed_dim=embed_dim,\n        num_heads=num_heads,\n        key_dim=[D] * 3,\n        depth=depth,\n        attn_ratio=[2, 2, 2],\n        mlp_ratio=[2, 2, 2],\n        down_ops=[\n            #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n            ['Subsample', D, embed_dim[0] // D, 4, 2, 2],\n            ['Subsample', D, embed_dim[1] // D, 4, 2, 2],\n        ],\n        attention_activation=act,\n        mlp_activation=act,\n        hybrid_backbone=b16(embed_dim[0], activation=act),\n        class_num=class_num,\n        drop_path=drop_path,\n        distillation=distillation)\n\n    return model\n\n\nspecification = {\n    'LeViT_128S': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_6_8',\n        'X': '2_3_4',\n        'drop_path': 0\n    },\n    'LeViT_128': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_8_12',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_192': {\n        'C': '192_288_384',\n        'D': 32,\n     
   'N': '3_5_6',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_256': {\n        'C': '256_384_512',\n        'D': 32,\n        'N': '4_6_8',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_384': {\n        'C': '384_512_768',\n        'D': 32,\n        'N': '6_9_12',\n        'X': '4_4_4',\n        'drop_path': 0.1\n    },\n}\n\n\ndef LeViT_192(**kwargs):\n    model = model_factory(**specification['LeViT_192'], class_num=1000, distillation=False)\n    return model\n"
  },
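LeViT's relative attention biases are shared across all point pairs on the feature grid that have the same absolute offset, which is exactly what the `attention_offsets`/`idxs` bookkeeping in `Attention.__init__` computes: for an R x R grid there are R*R distinct (|dx|, |dy|) offsets, indexed for all (R*R)**2 ordered pairs. A standalone sketch of that bookkeeping (`build_offset_index` is a hypothetical name):

```python
import itertools

def build_offset_index(resolution):
    """Map every ordered pair of grid points to the id of its absolute offset.

    Biases are then shared across pairs with the same offset, so only
    len(offsets) bias parameters are needed per head instead of N*N.
    """
    points = list(itertools.product(range(resolution), range(resolution)))
    attention_offsets, idxs = {}, []
    for p1 in points:
        for p2 in points:
            offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))
            if offset not in attention_offsets:
                attention_offsets[offset] = len(attention_offsets)
            idxs.append(attention_offsets[offset])
    return attention_offsets, idxs

offsets, idxs = build_offset_index(4)
# 16 distinct offsets cover all 256 ordered pairs of a 4x4 grid
```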
  {
    "path": "modules/image/classification/levit_192_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LeViT_192\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"levit_192_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass LeViT_192_ImageNet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'LeViT_192.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'LeViT_192_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = LeViT_192()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
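The `classification` method batches preprocessed inputs with a flush-on-full loop: a batch is emitted when it reaches `batch_size`, and whatever remains is flushed at the last input so a trailing partial batch is never dropped. A minimal sketch of just that loop (`batched` is a hypothetical name; the module runs the model on each emitted batch instead of collecting them):

```python
def batched(items, batch_size):
    """Group items into consecutive batches of at most batch_size.

    Mirrors the classification() loop: flush when the buffer is full
    or when the last item has been appended.
    """
    batch, out = [], []
    for idx, item in enumerate(items):
        batch.append(item)
        if len(batch) >= batch_size or idx == len(items) - 1:
            out.append(list(batch))
            batch.clear()
    return out
```

Collecting into `out` here stands in for the `self.model(batch_tensor)` call; the flush condition is identical.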
  {
    "path": "modules/image/classification/levit_192_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image: subtract the mean and divide by the std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: class_id_map_file does not exist. To use your own label_dict, please provide a valid path; otherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/levit_192_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can only be one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/levit_256_imagenet/README.md",
    "content": "# levit_256_imagenet\n\n|Module Name|levit_256_imagenet|\n| :--- | :---: |\n|Category|Image Classification|\n|Network|LeViT|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|109 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n- ### Module Introduction\n\n  - LeViT is a hybrid neural network designed for fast-inference image classification. Its design takes the model's performance on different hardware platforms into account, so it better reflects real-world application scenarios. Through extensive experiments, the authors found a better way to combine convolutional networks with the Transformer architecture, and proposed an attention-based method to integrate positional information into the Transformer. This module uses the LeViT256 configuration. For details, please refer to the [paper](https://arxiv.org/abs/2104.01136).\n\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install levit_256_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run levit_256_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - For more information about running the classification model from the command line, please refer to [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"levit_256_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, each image with shape \\[H, W, C\\] in BGR color space; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidences) and 'label_names' (class names)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online image recognition service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start-up command:\n  - ```shell\n    $ hub serving start -m levit_256_imagenet\n    ```\n\n  - This deploys an online image recognition service; the default port is 8866.\n\n  - **NOTE:** If GPU is used for prediction, please set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the following few lines of code send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send the HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/levit_256_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install levit_256_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/levit_256_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# Code was based on https://github.com/facebookresearch/LeViT\nimport itertools\nimport math\nimport warnings\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import TruncatedNormal\nfrom paddle.regularizer import L2Decay\n\nfrom .vision_transformer import Identity\nfrom .vision_transformer import ones_\nfrom .vision_transformer import trunc_normal_\nfrom .vision_transformer import zeros_\n\n\ndef cal_attention_biases(attention_biases, attention_bias_idxs):\n    gather_list = []\n    attention_bias_t = paddle.transpose(attention_biases, (1, 0))\n    nums = attention_bias_idxs.shape[0]\n    for idx in range(nums):\n        gather = paddle.gather(attention_bias_t, attention_bias_idxs[idx])\n        gather_list.append(gather)\n    shape0, shape1 = attention_bias_idxs.shape\n    gather = paddle.concat(gather_list)\n    return paddle.transpose(gather, (1, 0)).reshape((0, shape0, shape1))\n\n\nclass Conv2d_BN(nn.Sequential):\n\n    def __init__(self, a, b, ks=1, stride=1, pad=0, dilation=1, groups=1, bn_weight_init=1, resolution=-10000):\n        super().__init__()\n        self.add_sublayer('c', nn.Conv2D(a, b, ks, stride, pad, dilation, groups, bias_attr=False))\n        bn = nn.BatchNorm2D(b)\n        ones_(bn.weight)\n        
zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n\nclass Linear_BN(nn.Sequential):\n\n    def __init__(self, a, b, bn_weight_init=1):\n        super().__init__()\n        self.add_sublayer('c', nn.Linear(a, b, bias_attr=False))\n        bn = nn.BatchNorm1D(b)\n        if bn_weight_init == 0:\n            zeros_(bn.weight)\n        else:\n            ones_(bn.weight)\n        zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n    def forward(self, x):\n        l, bn = self._sub_layers.values()\n        x = l(x)\n        return paddle.reshape(bn(x.flatten(0, 1)), x.shape)\n\n\nclass BN_Linear(nn.Sequential):\n\n    def __init__(self, a, b, bias=True, std=0.02):\n        super().__init__()\n        self.add_sublayer('bn', nn.BatchNorm1D(a))\n        l = nn.Linear(a, b, bias_attr=bias)\n        trunc_normal_(l.weight)\n        if bias:\n            zeros_(l.bias)\n        self.add_sublayer('l', l)\n\n\ndef b16(n, activation, resolution=224):\n    return nn.Sequential(Conv2d_BN(3, n // 8, 3, 2, 1, resolution=resolution), activation(),\n                         Conv2d_BN(n // 8, n // 4, 3, 2, 1, resolution=resolution // 2), activation(),\n                         Conv2d_BN(n // 4, n // 2, 3, 2, 1, resolution=resolution // 4), activation(),\n                         Conv2d_BN(n // 2, n, 3, 2, 1, resolution=resolution // 8))\n\n\nclass Residual(nn.Layer):\n\n    def __init__(self, m, drop):\n        super().__init__()\n        self.m = m\n        self.drop = drop\n\n    def forward(self, x):\n        if self.training and self.drop > 0:\n            # drop path: randomly drop the residual branch and rescale it by 1 / (1 - drop)\n            y = paddle.rand(shape=[x.shape[0], 1, 1]).__ge__(self.drop).astype(\"float32\")\n            y = y.divide(paddle.full_like(y, 1 - self.drop))\n            return paddle.add(x, self.m(x) * y)\n        else:\n            return paddle.add(x, self.m(x))\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, activation=None, resolution=14):\n        super().__init__()\n        
self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * num_heads\n        self.attn_ratio = attn_ratio\n        self.h = self.dh + nh_kd * 2\n        self.qkv = Linear_BN(dim, self.h)\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, dim, bn_weight_init=0))\n        points = list(itertools.product(range(resolution), range(resolution)))\n        N = len(points)\n        attention_offsets = {}\n        idxs = []\n        for p1 in points:\n            for p2 in points:\n                offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n        tensor_idxs = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs, [N, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        qkv = self.qkv(x)\n        qkv = paddle.reshape(qkv, [B, N, self.num_heads, self.h // self.num_heads])\n        q, k, v = paddle.split(qkv, [self.key_dim, self.key_dim, self.d], axis=3)\n        q = paddle.transpose(q, 
perm=[0, 2, 1, 3])\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        k_transpose = paddle.transpose(k, perm=[0, 1, 3, 2])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n        attn = (paddle.matmul(q, k_transpose) * self.scale + attention_biases)\n        attn = F.softmax(attn)\n        x = paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3])\n        x = paddle.reshape(x, [B, N, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass Subsample(nn.Layer):\n\n    def __init__(self, stride, resolution):\n        super().__init__()\n        self.stride = stride\n        self.resolution = resolution\n\n    def forward(self, x):\n        B, N, C = x.shape\n        x = paddle.reshape(x, [B, self.resolution, self.resolution, C])\n        end1, end2 = x.shape[1], x.shape[2]\n        x = x[:, 0:end1:self.stride, 0:end2:self.stride]\n        x = paddle.reshape(x, [B, -1, C])\n        return x\n\n\nclass AttentionSubsample(nn.Layer):\n\n    def __init__(self,\n                 in_dim,\n                 out_dim,\n                 key_dim,\n                 num_heads=8,\n                 attn_ratio=2,\n                 activation=None,\n                 stride=2,\n                 resolution=14,\n                 resolution_=7):\n        super().__init__()\n        self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * self.num_heads\n        self.attn_ratio = attn_ratio\n        self.resolution_ = resolution_\n        self.resolution_2 = resolution_**2\n        self.training = True\n        h = self.dh + nh_kd\n        self.kv = Linear_BN(in_dim, h)\n\n        self.q = 
nn.Sequential(Subsample(stride, resolution), Linear_BN(in_dim, nh_kd))\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, out_dim))\n\n        self.stride = stride\n        self.resolution = resolution\n        points = list(itertools.product(range(resolution), range(resolution)))\n        points_ = list(itertools.product(range(resolution_), range(resolution_)))\n\n        N = len(points)\n        N_ = len(points_)\n        attention_offsets = {}\n        idxs = []\n        i = 0\n        j = 0\n        for p1 in points_:\n            i += 1\n            for p2 in points:\n                j += 1\n                size = 1\n                offset = (abs(p1[0] * stride - p2[0] + (size - 1) / 2), abs(p1[1] * stride - p2[1] + (size - 1) / 2))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n\n        tensor_idxs_ = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs_, [N_, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        kv = self.kv(x)\n        kv = paddle.reshape(kv, [B, N, self.num_heads, -1])\n        k, v = paddle.split(kv, [self.key_dim, self.d], axis=3)\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])  
# BHNC\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        q = paddle.reshape(self.q(x), [B, self.resolution_2, self.num_heads, self.key_dim])\n        q = paddle.transpose(q, perm=[0, 2, 1, 3])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n\n        attn = (paddle.matmul(q, paddle.transpose(k, perm=[0, 1, 3, 2]))) * self.scale + attention_biases\n        attn = F.softmax(attn)\n\n        x = paddle.reshape(paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3]), [B, -1, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass LeViT(nn.Layer):\n    \"\"\" Vision Transformer with support for patch or hybrid CNN input stage\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=[192],\n                 key_dim=[64],\n                 depth=[12],\n                 num_heads=[3],\n                 attn_ratio=[2],\n                 mlp_ratio=[2],\n                 hybrid_backbone=None,\n                 down_ops=[],\n                 attention_activation=nn.Hardswish,\n                 mlp_activation=nn.Hardswish,\n                 distillation=True,\n                 drop_path=0):\n        super().__init__()\n\n        self.class_num = class_num\n        self.num_features = embed_dim[-1]\n        self.embed_dim = embed_dim\n        self.distillation = distillation\n\n        self.patch_embed = hybrid_backbone\n\n        self.blocks = []\n        down_ops.append([''])\n        resolution = img_size // patch_size\n        for i, (ed, kd, dpth, nh, ar, mr,\n                do) in enumerate(zip(embed_dim, key_dim, depth, num_heads, attn_ratio, mlp_ratio, down_ops)):\n            for _ in range(dpth):\n                self.blocks.append(\n                    Residual(\n         
               Attention(\n                            ed,\n                            kd,\n                            nh,\n                            attn_ratio=ar,\n                            activation=attention_activation,\n                            resolution=resolution,\n                        ), drop_path))\n                if mr > 0:\n                    h = int(ed * mr)\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(ed, h),\n                                mlp_activation(),\n                                Linear_BN(h, ed, bn_weight_init=0),\n                            ), drop_path))\n            if do[0] == 'Subsample':\n                #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n                resolution_ = (resolution - 1) // do[5] + 1\n                self.blocks.append(\n                    AttentionSubsample(*embed_dim[i:i + 2],\n                                       key_dim=do[1],\n                                       num_heads=do[2],\n                                       attn_ratio=do[3],\n                                       activation=attention_activation,\n                                       stride=do[5],\n                                       resolution=resolution,\n                                       resolution_=resolution_))\n                resolution = resolution_\n                if do[4] > 0:  # mlp_ratio\n                    h = int(embed_dim[i + 1] * do[4])\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(embed_dim[i + 1], h),\n                                mlp_activation(),\n                                Linear_BN(h, embed_dim[i + 1], bn_weight_init=0),\n                            ), drop_path))\n        self.blocks = nn.Sequential(*self.blocks)\n\n        # 
Classifier head\n        self.head = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n        if distillation:\n            self.head_dist = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n\n    def forward(self, x):\n        x = self.patch_embed(x)\n        x = x.flatten(2)\n        x = paddle.transpose(x, perm=[0, 2, 1])\n        x = self.blocks(x)\n        x = x.mean(1)\n\n        x = paddle.reshape(x, [-1, self.embed_dim[-1]])\n        if self.distillation:\n            x = self.head(x), self.head_dist(x)\n            if not self.training:\n                x = (x[0] + x[1]) / 2\n        else:\n            x = self.head(x)\n        return x\n\n\ndef model_factory(C, D, X, N, drop_path, class_num, distillation):\n    embed_dim = [int(x) for x in C.split('_')]\n    num_heads = [int(x) for x in N.split('_')]\n    depth = [int(x) for x in X.split('_')]\n    act = nn.Hardswish\n    model = LeViT(\n        patch_size=16,\n        embed_dim=embed_dim,\n        num_heads=num_heads,\n        key_dim=[D] * 3,\n        depth=depth,\n        attn_ratio=[2, 2, 2],\n        mlp_ratio=[2, 2, 2],\n        down_ops=[\n            #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n            ['Subsample', D, embed_dim[0] // D, 4, 2, 2],\n            ['Subsample', D, embed_dim[1] // D, 4, 2, 2],\n        ],\n        attention_activation=act,\n        mlp_activation=act,\n        hybrid_backbone=b16(embed_dim[0], activation=act),\n        class_num=class_num,\n        drop_path=drop_path,\n        distillation=distillation)\n\n    return model\n\n\nspecification = {\n    'LeViT_128S': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_6_8',\n        'X': '2_3_4',\n        'drop_path': 0\n    },\n    'LeViT_128': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_8_12',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_192': {\n        'C': '192_288_384',\n        'D': 32,\n     
   'N': '3_5_6',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_256': {\n        'C': '256_384_512',\n        'D': 32,\n        'N': '4_6_8',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_384': {\n        'C': '384_512_768',\n        'D': 32,\n        'N': '6_9_12',\n        'X': '4_4_4',\n        'drop_path': 0.1\n    },\n}\n\n\ndef LeViT_256(**kwargs):\n    model = model_factory(**specification['LeViT_256'], class_num=1000, distillation=False)\n    return model\n"
  },
  {
    "path": "modules/image/classification/levit_256_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LeViT_256\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"levit_256_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass LeViT_256_ImageNet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'LeViT_256.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'LeViT_256_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = LeViT_256()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/levit_256_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/levit_256_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/levit_384_imagenet/README.md",
    "content": "# levit_384_imagenet\n\n|Module Name|levit_384_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|LeViT|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|225 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n- ### Module Introduction\n\n  - LeViT is a hybrid neural network for fast-inference image classification. Its design takes the performance of the network on different hardware platforms into account, so it can better reflect real application scenarios. Through extensive experiments, the authors found a better way to combine convolutional networks with the Transformer architecture, and proposed an attention-based method to integrate positional information into the Transformer. This module uses the LeViT-384 model configuration. For more details, please refer to the [paper](https://arxiv.org/abs/2104.01136).\n\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install levit_384_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run levit_384_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - For more information about the command line, please refer to [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"levit_384_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, each with shape \\[H, W, C\\] and BGR color space; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): use GPU or not; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class index), 'scores' (confidence) and 'label_names' (class name)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m levit_384_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, please set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/levit_384_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install levit_384_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/levit_384_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# Code was based on https://github.com/facebookresearch/LeViT\nimport itertools\nimport math\nimport warnings\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import TruncatedNormal\nfrom paddle.regularizer import L2Decay\n\nfrom .vision_transformer import Identity\nfrom .vision_transformer import ones_\nfrom .vision_transformer import trunc_normal_\nfrom .vision_transformer import zeros_\n\n\ndef cal_attention_biases(attention_biases, attention_bias_idxs):\n    gather_list = []\n    attention_bias_t = paddle.transpose(attention_biases, (1, 0))\n    nums = attention_bias_idxs.shape[0]\n    for idx in range(nums):\n        gather = paddle.gather(attention_bias_t, attention_bias_idxs[idx])\n        gather_list.append(gather)\n    shape0, shape1 = attention_bias_idxs.shape\n    gather = paddle.concat(gather_list)\n    return paddle.transpose(gather, (1, 0)).reshape((0, shape0, shape1))\n\n\nclass Conv2d_BN(nn.Sequential):\n\n    def __init__(self, a, b, ks=1, stride=1, pad=0, dilation=1, groups=1, bn_weight_init=1, resolution=-10000):\n        super().__init__()\n        self.add_sublayer('c', nn.Conv2D(a, b, ks, stride, pad, dilation, groups, bias_attr=False))\n        bn = nn.BatchNorm2D(b)\n        ones_(bn.weight)\n        
zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n\nclass Linear_BN(nn.Sequential):\n\n    def __init__(self, a, b, bn_weight_init=1):\n        super().__init__()\n        self.add_sublayer('c', nn.Linear(a, b, bias_attr=False))\n        bn = nn.BatchNorm1D(b)\n        if bn_weight_init == 0:\n            zeros_(bn.weight)\n        else:\n            ones_(bn.weight)\n        zeros_(bn.bias)\n        self.add_sublayer('bn', bn)\n\n    def forward(self, x):\n        l, bn = self._sub_layers.values()\n        x = l(x)\n        return paddle.reshape(bn(x.flatten(0, 1)), x.shape)\n\n\nclass BN_Linear(nn.Sequential):\n\n    def __init__(self, a, b, bias=True, std=0.02):\n        super().__init__()\n        self.add_sublayer('bn', nn.BatchNorm1D(a))\n        l = nn.Linear(a, b, bias_attr=bias)\n        trunc_normal_(l.weight)\n        if bias:\n            zeros_(l.bias)\n        self.add_sublayer('l', l)\n\n\ndef b16(n, activation, resolution=224):\n    return nn.Sequential(Conv2d_BN(3, n // 8, 3, 2, 1, resolution=resolution), activation(),\n                         Conv2d_BN(n // 8, n // 4, 3, 2, 1, resolution=resolution // 2), activation(),\n                         Conv2d_BN(n // 4, n // 2, 3, 2, 1, resolution=resolution // 4), activation(),\n                         Conv2d_BN(n // 2, n, 3, 2, 1, resolution=resolution // 8))\n\n\nclass Residual(nn.Layer):\n\n    def __init__(self, m, drop):\n        super().__init__()\n        self.m = m\n        self.drop = drop\n\n    def forward(self, x):\n        if self.training and self.drop > 0:\n            y = paddle.rand(shape=[x.shape[0], 1, 1]).__ge__(self.drop).astype(\"float32\")\n            y = y.divide(paddle.full_like(y, 1 - self.drop))\n            # drop path: add the residual branch scaled by the random keep mask\n            return paddle.add(x, self.m(x).multiply(y))\n        else:\n            return paddle.add(x, self.m(x))\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, key_dim, num_heads=8, attn_ratio=4, activation=None, resolution=14):\n        super().__init__()\n        
self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * num_heads\n        self.attn_ratio = attn_ratio\n        self.h = self.dh + nh_kd * 2\n        self.qkv = Linear_BN(dim, self.h)\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, dim, bn_weight_init=0))\n        points = list(itertools.product(range(resolution), range(resolution)))\n        N = len(points)\n        attention_offsets = {}\n        idxs = []\n        for p1 in points:\n            for p2 in points:\n                offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n        tensor_idxs = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs, [N, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        qkv = self.qkv(x)\n        qkv = paddle.reshape(qkv, [B, N, self.num_heads, self.h // self.num_heads])\n        q, k, v = paddle.split(qkv, [self.key_dim, self.key_dim, self.d], axis=3)\n        q = paddle.transpose(q, 
perm=[0, 2, 1, 3])\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        k_transpose = paddle.transpose(k, perm=[0, 1, 3, 2])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n        attn = (paddle.matmul(q, k_transpose) * self.scale + attention_biases)\n        attn = F.softmax(attn)\n        x = paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3])\n        x = paddle.reshape(x, [B, N, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass Subsample(nn.Layer):\n\n    def __init__(self, stride, resolution):\n        super().__init__()\n        self.stride = stride\n        self.resolution = resolution\n\n    def forward(self, x):\n        B, N, C = x.shape\n        x = paddle.reshape(x, [B, self.resolution, self.resolution, C])\n        end1, end2 = x.shape[1], x.shape[2]\n        x = x[:, 0:end1:self.stride, 0:end2:self.stride]\n        x = paddle.reshape(x, [B, -1, C])\n        return x\n\n\nclass AttentionSubsample(nn.Layer):\n\n    def __init__(self,\n                 in_dim,\n                 out_dim,\n                 key_dim,\n                 num_heads=8,\n                 attn_ratio=2,\n                 activation=None,\n                 stride=2,\n                 resolution=14,\n                 resolution_=7):\n        super().__init__()\n        self.num_heads = num_heads\n        self.scale = key_dim**-0.5\n        self.key_dim = key_dim\n        self.nh_kd = nh_kd = key_dim * num_heads\n        self.d = int(attn_ratio * key_dim)\n        self.dh = int(attn_ratio * key_dim) * self.num_heads\n        self.attn_ratio = attn_ratio\n        self.resolution_ = resolution_\n        self.resolution_2 = resolution_**2\n        self.training = True\n        h = self.dh + nh_kd\n        self.kv = Linear_BN(in_dim, h)\n\n        self.q = 
nn.Sequential(Subsample(stride, resolution), Linear_BN(in_dim, nh_kd))\n        self.proj = nn.Sequential(activation(), Linear_BN(self.dh, out_dim))\n\n        self.stride = stride\n        self.resolution = resolution\n        points = list(itertools.product(range(resolution), range(resolution)))\n        points_ = list(itertools.product(range(resolution_), range(resolution_)))\n\n        N = len(points)\n        N_ = len(points_)\n        attention_offsets = {}\n        idxs = []\n        i = 0\n        j = 0\n        for p1 in points_:\n            i += 1\n            for p2 in points:\n                j += 1\n                size = 1\n                offset = (abs(p1[0] * stride - p2[0] + (size - 1) / 2), abs(p1[1] * stride - p2[1] + (size - 1) / 2))\n                if offset not in attention_offsets:\n                    attention_offsets[offset] = len(attention_offsets)\n                idxs.append(attention_offsets[offset])\n        self.attention_biases = self.create_parameter(shape=(num_heads, len(attention_offsets)),\n                                                      default_initializer=zeros_,\n                                                      attr=paddle.ParamAttr(regularizer=L2Decay(0.0)))\n\n        tensor_idxs_ = paddle.to_tensor(idxs, dtype='int64')\n        self.register_buffer('attention_bias_idxs', paddle.reshape(tensor_idxs_, [N_, N]))\n\n    @paddle.no_grad()\n    def train(self, mode=True):\n        if mode:\n            super().train()\n        else:\n            super().eval()\n        if mode and hasattr(self, 'ab'):\n            del self.ab\n        else:\n            self.ab = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n\n    def forward(self, x):\n        self.training = True\n        B, N, C = x.shape\n        kv = self.kv(x)\n        kv = paddle.reshape(kv, [B, N, self.num_heads, -1])\n        k, v = paddle.split(kv, [self.key_dim, self.d], axis=3)\n        k = paddle.transpose(k, perm=[0, 2, 1, 3])  
# BHNC\n        v = paddle.transpose(v, perm=[0, 2, 1, 3])\n        q = paddle.reshape(self.q(x), [B, self.resolution_2, self.num_heads, self.key_dim])\n        q = paddle.transpose(q, perm=[0, 2, 1, 3])\n\n        if self.training:\n            attention_biases = cal_attention_biases(self.attention_biases, self.attention_bias_idxs)\n        else:\n            attention_biases = self.ab\n\n        attn = (paddle.matmul(q, paddle.transpose(k, perm=[0, 1, 3, 2]))) * self.scale + attention_biases\n        attn = F.softmax(attn)\n\n        x = paddle.reshape(paddle.transpose(paddle.matmul(attn, v), perm=[0, 2, 1, 3]), [B, -1, self.dh])\n        x = self.proj(x)\n        return x\n\n\nclass LeViT(nn.Layer):\n    \"\"\" Vision Transformer with support for patch or hybrid CNN input stage\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=[192],\n                 key_dim=[64],\n                 depth=[12],\n                 num_heads=[3],\n                 attn_ratio=[2],\n                 mlp_ratio=[2],\n                 hybrid_backbone=None,\n                 down_ops=[],\n                 attention_activation=nn.Hardswish,\n                 mlp_activation=nn.Hardswish,\n                 distillation=True,\n                 drop_path=0):\n        super().__init__()\n\n        self.class_num = class_num\n        self.num_features = embed_dim[-1]\n        self.embed_dim = embed_dim\n        self.distillation = distillation\n\n        self.patch_embed = hybrid_backbone\n\n        self.blocks = []\n        down_ops.append([''])\n        resolution = img_size // patch_size\n        for i, (ed, kd, dpth, nh, ar, mr,\n                do) in enumerate(zip(embed_dim, key_dim, depth, num_heads, attn_ratio, mlp_ratio, down_ops)):\n            for _ in range(dpth):\n                self.blocks.append(\n                    Residual(\n         
               Attention(\n                            ed,\n                            kd,\n                            nh,\n                            attn_ratio=ar,\n                            activation=attention_activation,\n                            resolution=resolution,\n                        ), drop_path))\n                if mr > 0:\n                    h = int(ed * mr)\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(ed, h),\n                                mlp_activation(),\n                                Linear_BN(h, ed, bn_weight_init=0),\n                            ), drop_path))\n            if do[0] == 'Subsample':\n                #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n                resolution_ = (resolution - 1) // do[5] + 1\n                self.blocks.append(\n                    AttentionSubsample(*embed_dim[i:i + 2],\n                                       key_dim=do[1],\n                                       num_heads=do[2],\n                                       attn_ratio=do[3],\n                                       activation=attention_activation,\n                                       stride=do[5],\n                                       resolution=resolution,\n                                       resolution_=resolution_))\n                resolution = resolution_\n                if do[4] > 0:  # mlp_ratio\n                    h = int(embed_dim[i + 1] * do[4])\n                    self.blocks.append(\n                        Residual(\n                            nn.Sequential(\n                                Linear_BN(embed_dim[i + 1], h),\n                                mlp_activation(),\n                                Linear_BN(h, embed_dim[i + 1], bn_weight_init=0),\n                            ), drop_path))\n        self.blocks = nn.Sequential(*self.blocks)\n\n        # 
Classifier head\n        self.head = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n        if distillation:\n            self.head_dist = BN_Linear(embed_dim[-1], class_num) if class_num > 0 else Identity()\n\n    def forward(self, x):\n        x = self.patch_embed(x)\n        x = x.flatten(2)\n        x = paddle.transpose(x, perm=[0, 2, 1])\n        x = self.blocks(x)\n        x = x.mean(1)\n\n        x = paddle.reshape(x, [-1, self.embed_dim[-1]])\n        if self.distillation:\n            x = self.head(x), self.head_dist(x)\n            if not self.training:\n                x = (x[0] + x[1]) / 2\n        else:\n            x = self.head(x)\n        return x\n\n\ndef model_factory(C, D, X, N, drop_path, class_num, distillation):\n    embed_dim = [int(x) for x in C.split('_')]\n    num_heads = [int(x) for x in N.split('_')]\n    depth = [int(x) for x in X.split('_')]\n    act = nn.Hardswish\n    model = LeViT(\n        patch_size=16,\n        embed_dim=embed_dim,\n        num_heads=num_heads,\n        key_dim=[D] * 3,\n        depth=depth,\n        attn_ratio=[2, 2, 2],\n        mlp_ratio=[2, 2, 2],\n        down_ops=[\n            #('Subsample',key_dim, num_heads, attn_ratio, mlp_ratio, stride)\n            ['Subsample', D, embed_dim[0] // D, 4, 2, 2],\n            ['Subsample', D, embed_dim[1] // D, 4, 2, 2],\n        ],\n        attention_activation=act,\n        mlp_activation=act,\n        hybrid_backbone=b16(embed_dim[0], activation=act),\n        class_num=class_num,\n        drop_path=drop_path,\n        distillation=distillation)\n\n    return model\n\n\nspecification = {\n    'LeViT_128S': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_6_8',\n        'X': '2_3_4',\n        'drop_path': 0\n    },\n    'LeViT_128': {\n        'C': '128_256_384',\n        'D': 16,\n        'N': '4_8_12',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_192': {\n        'C': '192_288_384',\n        'D': 32,\n     
   'N': '3_5_6',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_256': {\n        'C': '256_384_512',\n        'D': 32,\n        'N': '4_6_8',\n        'X': '4_4_4',\n        'drop_path': 0\n    },\n    'LeViT_384': {\n        'C': '384_512_768',\n        'D': 32,\n        'N': '6_9_12',\n        'X': '4_4_4',\n        'drop_path': 0.1\n    },\n}\n\n\ndef LeViT_384(**kwargs):\n    model = model_factory(**specification['LeViT_384'], class_num=1000, distillation=False)\n    return model\n"
  },
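The `Attention` and `AttentionSubsample` layers in the model file above share one learnable bias per absolute grid offset `(|dx|, |dy|)`: every (query, key) pair of grid points is mapped to the index of its offset, and `cal_attention_biases` gathers those entries into an `N × N` bias matrix per head. A minimal, dependency-free sketch of the index construction (the helper name `build_offset_index_map` is hypothetical, but the loop mirrors the source):

```python
import itertools


def build_offset_index_map(resolution):
    """Map each (query, key) grid-point pair to a shared bias index.

    Pairs with the same absolute offset (|dx|, |dy|) share one learnable
    bias entry, so the table holds resolution**2 unique entries rather
    than one entry per pair.
    """
    points = list(itertools.product(range(resolution), range(resolution)))
    attention_offsets = {}
    idxs = []
    for p1 in points:
        for p2 in points:
            offset = (abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))
            if offset not in attention_offsets:
                attention_offsets[offset] = len(attention_offsets)
            idxs.append(attention_offsets[offset])
    return attention_offsets, idxs


offsets, idxs = build_offset_index_map(3)
# 3x3 grid: |dx| and |dy| each range over 0..2, so there are 9 unique
# offsets shared across 81 (query, key) pairs, and the table is symmetric.
```

Because the offset uses absolute differences, the resulting index table is symmetric in (query, key), which is why a single parameter row per offset suffices.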
  {
    "path": "modules/image/classification/levit_384_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import LeViT_384\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"levit_384_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass LeViT_384_ImageNet:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'LeViT_384.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'LeViT_384_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = LeViT_384()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]  # BGR -> RGB\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]  # BGR -> RGB\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
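The module's `classification` method hands raw logits to a `Topk` post-processor (defined in `processor.py`): softmax over the class axis, then the k highest class ids with rounded scores. The same arithmetic can be sketched with NumPy alone (hypothetical helper, no label map):

```python
import numpy as np


def topk_from_logits(logits, k=1):
    """Softmax over the last axis, then the k highest class ids per row."""
    # subtract the row max before exponentiating for numerical stability
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = e / e.sum(axis=-1, keepdims=True)
    results = []
    for row in probs:
        ids = row.argsort()[-k:][::-1]  # top-k indices, highest first
        results.append({
            "class_ids": ids.tolist(),
            "scores": np.around(row[ids], decimals=5).tolist(),
        })
    return results


out = topk_from_logits(np.array([[0.1, 2.0, 0.3]]), k=2)
# the class with the highest logit (index 1) comes first
```

This mirrors the shape of the dicts returned by `Topk.__call__`, minus the optional `label_names`/`file_name` fields.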
  {
    "path": "modules/image/classification/levit_384_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: \"\n                                     \"both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\"Warning: class_id_map_file does not exist, so label_names will be empty! Please provide a valid path to use your own label dict.\")\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
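The inference transforms in `processor.py` above first scale an image so its short side hits a target length (`ResizeImage` with `resize_short`), then take a center crop (`CropImage`). The shape arithmetic, sketched with NumPy only (helper names are hypothetical, not part of the module):

```python
import numpy as np


def resize_short_shape(h, w, resize_short):
    """Target (h, w) after scaling so the shorter side equals resize_short."""
    percent = float(resize_short) / min(h, w)
    return int(round(h * percent)), int(round(w * percent))


def center_crop(img, size):
    """Center-crop an HWC array to (size, size)."""
    h, w = img.shape[:2]
    h0 = (h - size) // 2
    w0 = (w - size) // 2
    return img[h0:h0 + size, w0:w0 + size, :]


# 480x640 image, short side scaled to 256, then a 224x224 center crop
h, w = resize_short_shape(480, 640, 256)
crop = center_crop(np.zeros((h, w, 3)), 224)
```

Scaling the short side first (rather than resizing directly to 224x224) preserves aspect ratio, so the crop never stretches the image.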
  {
    "path": "modules/image/classification/levit_384_imagenet/utils.py",
"content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace a value in a nested dict or list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
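The `override`/`override_config` pair above walks a dotted key path such as `VALID.transforms.1.ResizeImage.resize_short=300`, where integer-like path segments index into lists. A simplified sketch of that traversal (assertions and new-key reporting dropped; `override_path` is an illustrative name, not the module's function):

```python
def str2num(v):
    """Turn '300' into 300; leave plain strings alone (as in override())."""
    try:
        return eval(v)
    except Exception:
        return v

def override_path(dl, ks, v):
    """Walk a dotted key path; integer-like keys index into lists."""
    k = str2num(ks[0])
    if len(ks) == 1:
        dl[k] = str2num(v)
    else:
        override_path(dl[k], ks[1:], v)

cfg = {"topk": 1, "VALID": {"transforms": [{"ResizeImage": {"resize_short": 256}}]}}
key, value = "VALID.transforms.0.ResizeImage.resize_short=300".split("=")
override_path(cfg, key.split("."), value)
print(cfg["VALID"]["transforms"][0]["ResizeImage"]["resize_short"])  # 300
```

`eval` on a non-numeric segment like `"transforms"` raises and falls back to the string itself, while `"0"` becomes the list index `0`; the final value is likewise coerced, so `"300"` lands in the config as an int.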
  {
    "path": "modules/image/classification/marine_biometrics/README.md",
    "content": "# marine_biometrics\n\n|模型名称|marine_biometrics|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet50_vd_ssld|\n|数据集|Fish4Knowledge|\n|是否支持Fine-tuning|否|\n|模型大小|84MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 海洋生物识别（marine_biometrics），该模型可准确识别鱼的种类。该PaddleHub Module支持API预测及命令行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install marine_biometrics\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run marine_biometrics --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"marine_biometrics\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images：list类型，待检测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install marine_biometrics==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/marine_biometrics/README_en.md",
"content": "# marine_biometrics\n\n|Module Name|marine_biometrics|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet50_vd_ssld|\n|Dataset|Fish4Knowledge|\n|Fine-tuning supported or not|No|\n|Module Size|84MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module can be used to classify marine biometrics.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install marine_biometrics\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run marine_biometrics --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"marine_biometrics\")\n    images = [cv2.imread('/PATH/TO/IMAGE')]\n    results = classifier.predict(images=images)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is dict, key is the label name, and value is the corresponding 
probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install marine_biometrics==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/marine_biometrics/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/marine_biometrics/module.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name='marine_biometrics',\n    type='cv/classification',\n    author='郑博培、彭兆帅',\n    author_email='2733821739@qq.com, 1084667371@qq.com',\n    summary=\n    'The model uses a convolutional neural network to identify marine fish species, so that anyone can call out the names of the creatures.',\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = 
self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = value.item()\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = value.item()\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', type=bool, default=False, help=\"whether use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n"
  },
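`MODULE.predict` above splits its input list into `ceil(total_num / batch_size)` batches and lets the last batch run short rather than padding it. The slicing logic can be sketched without paddlex, using a hypothetical `make_batches` helper:

```python
import math

def make_batches(items, batch_size):
    """Mirror MODULE.predict's batching: ceil(total/batch) slices,
    the last one possibly shorter than batch_size."""
    loop_num = math.ceil(len(items) / batch_size)
    return [items[i * batch_size:(i + 1) * batch_size] for i in range(loop_num)]

print(make_batches(list(range(5)), batch_size=2))  # [[0, 1], [2, 3], [4]]
```

Slicing past the end of a Python list simply truncates, which is why the module's `try/except IndexError` loop and this slice-based version produce the same batches.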
  {
    "path": "modules/image/classification/marine_biometrics/requirements.txt",
    "content": "paddlex==1.3.7\n"
  },
  {
    "path": "modules/image/classification/marine_biometrics/serving_client_demo.py",
"content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # 获取图片的base64编码格式\n    img1 = cv2_to_base64(cv2.imread(\"IMAGE_PATH1\"))\n    img2 = cv2_to_base64(cv2.imread(\"IMAGE_PATH2\"))\n    data = {'images': [img1, img2]}\n    # 指定content-type\n    headers = {\"Content-type\": \"application/json\"}\n    # 发送HTTP请求\n    url = \"http://127.0.0.1:8866/predict/marine_biometrics\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v1_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import MSRA\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 channels: int = None,\n                 num_groups: int = 1,\n                 act: str = 'relu',\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(initializer=MSRA(), name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name + \"_bn_offset\"),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass DepthwiseSeparable(nn.Layer):\n    \"\"\"Depthwise and pointwise conv layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters1: int,\n                 num_filters2: int,\n                 num_groups: int,\n                 stride: int,\n                 scale: float,\n                 name: str = None):\n        super(DepthwiseSeparable, self).__init__()\n\n        self._depthwise_conv = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=int(num_filters1 * scale),\n            filter_size=3,\n            stride=stride,\n            padding=1,\n            num_groups=int(num_groups * scale),\n            name=name + \"_dw\")\n\n        self._pointwise_conv = ConvBNLayer(\n            num_channels=int(num_filters1 * scale),\n            filter_size=1,\n            num_filters=int(num_filters2 * scale),\n            stride=1,\n            padding=0,\n            name=name + \"_sep\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._depthwise_conv(inputs)\n        y = self._pointwise_conv(y)\n        return y\n\n\n@moduleinfo(\n    name=\"mobilenet_v1_imagenet\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v1_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNetV1(nn.Layer):\n    \"\"\"MobileNetV1\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNetV1, self).__init__()\n        
self.block_list = []\n\n        self.conv1 = ConvBNLayer(\n            num_channels=3, filter_size=3, channels=3, num_filters=int(32), stride=2, padding=1, name=\"conv1\")\n\n        conv2_1 = self.add_sublayer(\n            \"conv2_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(32),\n                num_filters1=32,\n                num_filters2=64,\n                num_groups=32,\n                stride=1,\n                scale=1,\n                name=\"conv2_1\"))\n        self.block_list.append(conv2_1)\n\n        conv2_2 = self.add_sublayer(\n            \"conv2_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(64),\n                num_filters1=64,\n                num_filters2=128,\n                num_groups=64,\n                stride=2,\n                scale=1,\n                name=\"conv2_2\"))\n        self.block_list.append(conv2_2)\n\n        conv3_1 = self.add_sublayer(\n            \"conv3_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(128),\n                num_filters1=128,\n                num_filters2=128,\n                num_groups=128,\n                stride=1,\n                scale=1,\n                name=\"conv3_1\"))\n        self.block_list.append(conv3_1)\n\n        conv3_2 = self.add_sublayer(\n            \"conv3_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(128),\n                num_filters1=128,\n                num_filters2=256,\n                num_groups=128,\n                stride=2,\n                scale=1,\n                name=\"conv3_2\"))\n        self.block_list.append(conv3_2)\n\n        conv4_1 = self.add_sublayer(\n            \"conv4_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(256),\n                num_filters1=256,\n                num_filters2=256,\n                num_groups=256,\n                stride=1,\n                scale=1,\n      
          name=\"conv4_1\"))\n        self.block_list.append(conv4_1)\n\n        conv4_2 = self.add_sublayer(\n            \"conv4_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(256),\n                num_filters1=256,\n                num_filters2=512,\n                num_groups=256,\n                stride=2,\n                scale=1,\n                name=\"conv4_2\"))\n        self.block_list.append(conv4_2)\n\n        for i in range(5):\n            conv5 = self.add_sublayer(\n                \"conv5_\" + str(i + 1),\n                sublayer=DepthwiseSeparable(\n                    num_channels=int(512),\n                    num_filters1=512,\n                    num_filters2=512,\n                    num_groups=512,\n                    stride=1,\n                    scale=1,\n                    name=\"conv5_\" + str(i + 1)))\n            self.block_list.append(conv5)\n\n        conv5_6 = self.add_sublayer(\n            \"conv5_6\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(512),\n                num_filters1=512,\n                num_filters2=1024,\n                num_groups=512,\n                stride=2,\n                scale=1,\n                name=\"conv5_6\"))\n        self.block_list.append(conv5_6)\n\n        conv6 = self.add_sublayer(\n            \"conv6\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(1024),\n                num_filters1=1024,\n                num_filters2=1024,\n                num_groups=1024,\n                stride=1,\n                scale=1,\n                name=\"conv6\"))\n        self.block_list.append(conv6)\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.out = Linear(\n            int(1024),\n            class_dim,\n            weight_attr=ParamAttr(initializer=MSRA(), name=\"fc7_weights\"),\n            bias_attr=ParamAttr(name=\"fc7_offset\"))\n\n        if load_checkpoint is not None:\n            
model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v1_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v1_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1(inputs)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, 1024])\n        y = self.out(y)\n        return y\n"
  },
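The `DepthwiseSeparable` block above factors a standard k×k convolution into a depthwise k×k conv plus a pointwise 1×1 conv, which is where MobileNetV1's parameter savings come from. A back-of-envelope weight count (BatchNorm parameters ignored; the helper names are illustrative, not part of the module):

```python
def standard_conv_weights(cin, cout, k):
    # one k x k kernel per (input channel, output channel) pair
    return cin * cout * k * k

def depthwise_separable_weights(cin, cout, k):
    # depthwise: one k x k kernel per input channel;
    # pointwise: a 1 x 1 conv mixing cin channels into cout
    return cin * k * k + cin * cout

cin, cout, k = 256, 512, 3  # e.g. the conv4_2 stage above
std = standard_conv_weights(cin, cout, k)
sep = depthwise_separable_weights(cin, cout, k)
print(std, sep, round(std / sep, 1))  # 1179648 133376 8.8
```

For 3×3 kernels the reduction approaches 9× as the channel count grows, matching the roughly 8-9× factor reported for MobileNet-style factorizations.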
  {
    "path": "modules/image/classification/mobilenet_v1_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import MSRA\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 channels: int = None,\n                 num_groups: int = 1,\n                 act: str = 'relu',\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(initializer=MSRA(), name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name + \"_bn_offset\"),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass DepthwiseSeparable(nn.Layer):\n    \"\"\"Depthwise and pointwise conv layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters1: int,\n                 num_filters2: int,\n                 num_groups: int,\n                 stride: int,\n                 scale: float,\n                 name: str = None):\n        super(DepthwiseSeparable, self).__init__()\n\n        self._depthwise_conv = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=int(num_filters1 * scale),\n            filter_size=3,\n            stride=stride,\n            padding=1,\n            num_groups=int(num_groups * scale),\n            name=name + \"_dw\")\n\n        self._pointwise_conv = ConvBNLayer(\n            num_channels=int(num_filters1 * scale),\n            filter_size=1,\n            num_filters=int(num_filters2 * scale),\n            stride=1,\n            padding=0,\n            name=name + \"_sep\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._depthwise_conv(inputs)\n        y = self._pointwise_conv(y)\n        return y\n\n\n@moduleinfo(\n    name=\"mobilenet_v1_imagenet_ssld\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v1_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNetV1(nn.Layer):\n    \"\"\"MobileNetV1\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNetV1, self).__init__()\n        
self.block_list = []\n\n        self.conv1 = ConvBNLayer(\n            num_channels=3, filter_size=3, channels=3, num_filters=int(32), stride=2, padding=1, name=\"conv1\")\n\n        conv2_1 = self.add_sublayer(\n            \"conv2_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(32),\n                num_filters1=32,\n                num_filters2=64,\n                num_groups=32,\n                stride=1,\n                scale=1,\n                name=\"conv2_1\"))\n        self.block_list.append(conv2_1)\n\n        conv2_2 = self.add_sublayer(\n            \"conv2_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(64),\n                num_filters1=64,\n                num_filters2=128,\n                num_groups=64,\n                stride=2,\n                scale=1,\n                name=\"conv2_2\"))\n        self.block_list.append(conv2_2)\n\n        conv3_1 = self.add_sublayer(\n            \"conv3_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(128),\n                num_filters1=128,\n                num_filters2=128,\n                num_groups=128,\n                stride=1,\n                scale=1,\n                name=\"conv3_1\"))\n        self.block_list.append(conv3_1)\n\n        conv3_2 = self.add_sublayer(\n            \"conv3_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(128),\n                num_filters1=128,\n                num_filters2=256,\n                num_groups=128,\n                stride=2,\n                scale=1,\n                name=\"conv3_2\"))\n        self.block_list.append(conv3_2)\n\n        conv4_1 = self.add_sublayer(\n            \"conv4_1\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(256),\n                num_filters1=256,\n                num_filters2=256,\n                num_groups=256,\n                stride=1,\n                scale=1,\n      
          name=\"conv4_1\"))\n        self.block_list.append(conv4_1)\n\n        conv4_2 = self.add_sublayer(\n            \"conv4_2\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(256),\n                num_filters1=256,\n                num_filters2=512,\n                num_groups=256,\n                stride=2,\n                scale=1,\n                name=\"conv4_2\"))\n        self.block_list.append(conv4_2)\n\n        for i in range(5):\n            conv5 = self.add_sublayer(\n                \"conv5_\" + str(i + 1),\n                sublayer=DepthwiseSeparable(\n                    num_channels=int(512),\n                    num_filters1=512,\n                    num_filters2=512,\n                    num_groups=512,\n                    stride=1,\n                    scale=1,\n                    name=\"conv5_\" + str(i + 1)))\n            self.block_list.append(conv5)\n\n        conv5_6 = self.add_sublayer(\n            \"conv5_6\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(512),\n                num_filters1=512,\n                num_filters2=1024,\n                num_groups=512,\n                stride=2,\n                scale=1,\n                name=\"conv5_6\"))\n        self.block_list.append(conv5_6)\n\n        conv6 = self.add_sublayer(\n            \"conv6\",\n            sublayer=DepthwiseSeparable(\n                num_channels=int(1024),\n                num_filters1=1024,\n                num_filters2=1024,\n                num_groups=1024,\n                stride=1,\n                scale=1,\n                name=\"conv6\"))\n        self.block_list.append(conv6)\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.out = Linear(\n            int(1024),\n            class_dim,\n            weight_attr=ParamAttr(initializer=MSRA(), name=\"fc7_weights\"),\n            bias_attr=ParamAttr(name=\"fc7_offset\"))\n\n        if load_checkpoint is not None:\n            
model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v1_ssld_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v1_ssld_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1(inputs)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, 1024])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/README.md",
    "content": "# mobilenet_v2_animals\n\n|模型名称|mobilenet_v2_animals|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|MobileNet_v2|\n|数据集|百度自建动物数据集|\n|是否支持Fine-tuning|否|\n|模型大小|50MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNet V2 是一个轻量化的卷积神经网络，它在 MobileNet 的基础上，做了 Inverted Residuals 和 Linear bottlenecks 这两大改进。该 PaddleHub Module 是在百度自建动物数据集上训练得到的，可用于图像分类和特征提取，当前已支持7978种动物的分类识别。模型的详情可参考[论文](https://arxiv.org/pdf/1801.04381.pdf)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v2_animals\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v2_animals --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_animals\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的动物类别，value为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个动物识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m mobilenet_v2_animals\n    ```\n\n  - 这样就完成了一个动物识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_animals\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install mobilenet_v2_animals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/README_en.md",
    "content": "# mobilenet_v2_animals\n\n|Module Name|mobilenet_v2_animals|\n| :--- | :---: |\n|Category|image classification|\n|Network|MobileNet_v2|\n|Dataset|Baidu Animal Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|50MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNet V2 is a light-weight convolutional network. This module is trained on the Baidu animal dataset, and can classify 7978 kinds of animals.\n  - For more information, please refer to: [MobileNetV2: Inverted Residuals and Linear Bottlenecks](https://arxiv.org/pdf/1801.04381.pdf)\n\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v2_animals\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v2_animals --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_animals\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m mobilenet_v2_animals\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_animals\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install mobilenet_v2_animals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/label_list.txt",
    "content": "东方铃蟾\n中国小鲵\n中国树蟾\n北方狭口蛙\n华南湍蛙\n华西蟾蜍\n吻蚓\n哀牢髭蟾\n商城肥鲵\n大凉疣螈\n大鲵\n姬蛙\n弹琴蛙\n极北小鲵\n林蛙\n树蛙\n棘胸蛙\n棘腹蛙\n沼蛙\n泽蛙\n海蛙\n淡肩角蟾\n爪蟾\n牛蛙\n癞蛤蟆\n箭毒蛙\n红点齿蟾\n绿臭蛙\n花背蟾蜍\n花臭蛙\n草蛙\n蓝尾蝾螈\n虎纹蛙\n蝌蚪\n蝾螈\n海蟾蜍\n黑框蟾蜍\n角蛙\n负子蟾\n赤蛙\n金线蛙\n镇海棘螈\n雨蛙\n香港瘰螈\n鱼螈\n鳗螈\n黑斑肥螈\n黑斑蛙\n喜玛拉雅兔\n大耳白兔\n泽西长毛兔\n海棠兔\n荷兰兔\n巨型格仔兔\n巨型花明兔\n新西兰兔\n暹罗兔\n标准金吉拉兔\n比利时野兔\n沙漠棉兔\n法国垂耳兔\n波兰兔\n狮子头兔\n美国费斯垂耳兔\n英国垂耳兔\n英国斑点兔\n英种小型兔\n东北兔\n力斯兔\n加利福尼亚兔\n华南兔\n塔里木兔\n安哥拉兔\n比利时兔\n海南兔\n美国迷你垂耳兔\n美国长毛垂耳兔\n花巨兔\n蒙古兔\n高原兔\n银狐兔\n雷克斯兔\n香槟兔\n九绊犰狳\n仓鼠\n兔豚鼠\n八齿鼠\n布氏田鼠\n日本飞鼠\n星鼻鼹鼠\n更格卢鼠\n松鼠\n毛丝鼠\n水豚\n沙鼠\n河狸\n河狸鼠\n海狸鼠\n花栗鼠\n草原犬鼠\n豚鼠\n豪猪\n迷你刺猬\n针鼹\n马岛猬\n河马\n利木赞牛\n吉安黄牛\n夏洛来牛\n娟珊牛\n安格斯牛\n德克斯特牛\n水牛\n海福特牛\n牦牛\n瑞士褐牛\n羚牛\n西班牙斗牛\n西门塔尔牛\n高地牛\n印度犀\n爪哇犀\n白犀\n苏门答腊犀牛\n黑犀\n三元苗猪\n东北民猪\n两头乌\n太湖猪\n姜曲海猪\n巴克夏猪\n杜洛克猪\n架子猪\n汉普夏猪\n波中猪\n皮特兰\n监利猪\n约克夏猪\n花斑猪\n荣昌猪\n辽宁黑猪\n金华猪\n长白猪\n东北细毛羊\n侏儒山羊\n冰岛羊\n北山羊\n卡拉库尔羊\n印度羚\n原羚\n叉角羚\n安哥拉山羊\n布尔山羊\n捻角山羊\n捻角羚\n杜泊绵羊\n林肯羊\n济宁青山羊\n浏阳黑山羊\n海门山羊\n白绒山羊\n纯种小尾寒羊\n罗姆尼羊\n美丽诺绵羊\n美利奴羊\n萨能山羊\n藏羚羊\n跳羚\n非洲大羚羊\n非洲瞪羚\n黄淮山羊\n黑面羊\n亚洲象\n猛犸象\n非洲象\n貘\n南非长颈鹿\n安哥拉长颈鹿\n马赛长颈鹿\n三河马\n伊犁马\n克莱兹代尔马\n哈萨克马\n德保矮马\n摩根马\n斑马\n普氏野马\n欧洲野马\n汗血马\n河曲马\n田纳西走马\n矮种马\n美洲野马\n蒙古马\n西南马\n西藏野驴\n角马\n阿帕卢萨马\n阿拉伯马\n驴\n骡子\n黄骠马\n原驼\n无峰驼\n羊驼\n野骆驼\n阿拉善双峰驼\n驼马\n梅花鹿\n毛冠鹿\n水鹿\n海南坡鹿\n狍\n白唇鹿\n白尾鹿\n霍加狓\n香獐\n马鹿\n驯鹿\n驼鹿\n麋鹿\n黑尾鹿\n林麝\n马麝\n黑麂\n二趾树懒\n土豚\n树袋熊\n侏儒狨猴\n侏长尾猴\n倭狐猴\n冕狐猴\n几内亚狒狒\n叶猴\n喜山长尾叶猴\n大狐猴\n大猩猩\n婆罗洲猩猩\n婴猴\n山魈\n峨眉山猴\n懒猴\n指猴\n狒狒\n狮面狨\n猕猴\n环尾狐猴\n疣猴\n白脸猴\n白面僧面猴\n白额卷尾猴\n白额长尾猴\n红吼猴\n红领狐猴\n绒毛蛛猴\n绿猴\n肥尾鼠狐猴\n节尾猴\n苏门达腊猩猩\n菲氏叶猴\n豚尾叶猴\n赤猴\n跗猴\n金丝猴\n长尾猴\n长毛蜘蛛猴\n长臂猿\n长鼻猴\n黑吼猴\n黑猩猩\n黑疣猴\n鼠狐猴\n鼬狐猴\n伏翼\n吸血蝠\n大耳蝠\n大蝙蝠\n棕蝠\n狐蝠\n红蝠\n菊头蝠\n马来大狐蝠\n袋鼠\n亚洲黑熊\n北极熊\n国宝大熊猫\n大灰熊\n小熊猫\n懒熊\n棕熊\n浣熊\n狼獾\n眼镜熊\n美洲黑熊\n马来熊\n北极狐\n南非狐\n大耳狐\n孟加拉狐\n山狐\n沙狐\n灰狐\n耳廓狐\n苍狐\n草原狐\n藏狐\n赤狐\n达尔文狐\n阿富汗狐\n丝毛梗\n中亚牧羊犬\n中华田园犬\n中国冠毛犬\n中国昆明犬\n中国沙皮犬\n中国细犬\n中国藏獒\n丹迪丁蒙梗\n京巴\n伯瑞犬\n依比沙猎犬\n俄罗斯南部牧羊犬\n俄罗斯高加索犬\n克伦伯犬\n冰岛牧羊犬\n凯恩梗\n切萨皮克海湾寻回犬\n刚毛指示格里芬犬\n刚毛猎狐梗\n匈牙利维斯拉犬\n博美\n卷毛寻回犬\n卷毛比熊犬\n可卡\n可蒙犬\n史宾格犬\n吉娃娃犬\n哈士奇犬\n哥顿雪达犬\n喜乐蒂\n圣伯纳犬\n大丹\n德国狐狸犬\n大瑞士山地犬\n大白熊\n威尔士柯基\n威尔士梗\n威玛猎犬\n川东猎犬\n巴吉度犬\n八哥犬\n巴西獒犬\n布鲁塞尔格里芬犬\n平毛巡回猎犬\n平毛猎狐梗\n弗兰德牧羊犬\n德国刚毛指示犬\n德国短毛指示犬\n德牧/德国牧羊犬
\n惠比特犬\n意大利卡斯罗犬\n意大利灵缇犬\n拉布拉多\n拉萨犬\n拳师犬\n挪威猎犬\n斯凯梗\n斯塔福斗牛梗\n日本土佐犬\n日本尖嘴犬\n日本柴犬\n日本狆\n日本秋田犬\n曼彻斯特梗\n杜宾犬\n杰克罗素梗\n松狮犬\n查理士王小猎犬\n比利时牧羊犬\n比格猎犬\n法兰德斯畜牧犬\n法兰西斗牛犬\n法国波尔多犬\n法国狼犬\n法国獒犬\n法老王猎犬\n波利犬\n波士顿梗犬\n波音达猎犬\n湖畔梗\n澳大利亚卡尔比犬\n澳大利亚梗\n澳大利亚牧羊犬\n爱尔兰塞特犬\n爱尔兰梗\n爱尔兰水猎犬\n爱尔兰猎狼犬\n爱尔兰长毛猎犬\n爱斯基摩犬\n牛头梗\n猎浣熊犬\n瑞士伯恩山犬\n白色德国牧羊犬\n约克夏梗\n纽波利顿獒犬\n纽芬兰犬\n罗威纳犬\n罗得西亚脊背犬\n比特狗\n美国水猎犬\n美国猎狐犬\n美系秋田犬\n腊肠犬\n芬兰狐狸犬\n苏俄猎狼犬\n苏格兰梗\n苏格兰猎鹿犬\n苏牧/苏格兰牧羊犬\n英国寻血猎犬\n英国猎狐犬\n英国老式牧羊犬\n荷兰毛狮犬\n萨摩耶\n萨路基猎犬\n葡萄牙水犬\n蝴蝶犬\n西施犬\n西班牙加纳利犬\n西藏梗\n西藏猎犬\n西里汉梗\n西高地白梗\n豺狗\n贝灵顿梗\n贵宾犬/贵妇犬\n软毛麦色梗\n边境梗\n边境牧羊犬\n迷你雪那瑞\n迷你鹿犬\n金毛犬\n长须牧羊犬\n阿富汗猎犬\n阿拉斯加雪橇犬\n阿根廷杜高犬\n雪纳瑞/标准雪纳瑞\n马尔济斯犬/玛尔基斯犬\n狮子\n亚洲胡狼\n北极狼\n印度狼\n埃塞俄比亚狼\n墨西哥狼\n意大利狼\n日本狼\n灰狼\n红狼\n藏狼\n袋狼\n貉\n郊狼\n鬃狼\n三色猫\n东奇尼猫\n东方短毛猫\n东方长毛猫\n亚洲猫\n伯曼猫\n俄罗斯蓝猫\n加州闪亮猫\n卡特尔猫\n哈瓦那棕猫\n喜马拉雅猫\n四川简州猫\n土耳其安哥拉猫\n土耳其梵猫\n埃及猫\n威尔士猫\n孟买猫\n巴厘猫\n布偶猫\n彼得秃猫\n德国卷毛猫\n拿破仑猫\n挪威森林猫\n斯芬克斯猫\n新加坡猫\n日本短尾猫\n曼切堪猫\n果子狸\n欧洲短毛猫\n波斯猫\n波米拉猫\n泰国暹罗猫\n澳大利亚雾猫\n灵猫\n热带草原猫\n爪哇猫\n狮子猫\n科拉特猫\n索马里猫\n缅因猫\n缅甸猫\n美国反耳猫\n美国短尾猫\n美国短毛猫\n美国硬毛猫\n肯尼亚猫\n苏格兰折耳猫\n英国短毛猫\n英国长毛猫\n虎猫\n褴褛猫\n西伯利亚猫\n豹猫\n重点色短毛猫\n金吉拉猫\n阿比西尼亚猫\n雪鞋猫\n马恩岛猫\n华南虎\n印度支那虎\n孟加拉虎\n新疆虎\n美洲虎\n苏门答腊虎\n西伯利亚虎\n云豹\n亚洲猎豹\n南非猎豹\n印度支那豹\n斯里兰卡豹\n波斯豹\n远东豹\n阿拉伯豹\n雪豹\n非洲豹\n黑豹\n海狗\n海狮\n海象\n海豹\n伶鼬\n巨獭\n日本貂\n松貂\n树懒\n水獭\n水貂\n江獭\n海獭\n渔貂\n狐鼬\n猞猁\n猪獾\n白鼬\n石貂\n紫貂\n美洲獾\n美洲貂\n艾鼬\n蜜獾\n食蚁兽\n香鼬\n黄喉貂\n黄鼬\n黑足鼬\n鼬獾\n小头鼠海豚\n巨头鲸\n抹香鲸\n江豚\n河豚\n海牛\n中华白海豚\n宽吻海豚\n白鳍豚\n真海豚\n独角鲸\n瓜头鲸\n白鲸\n短肢领航鲸\n蓝鲸\n虎鲸\n角岛鲸\n长肢领航鲸\n露脊鲸\n小须鲸\n布氏鲸\n灰鲸\n鳁鲸\n齿鲸\n鸭嘴兽\n丝虫\n利什曼虫\n吸吮线虫\n毛囊螨\n猪肉寄生虫\n盲鳗\n蓝氏贾第鞭毛虫\n蛔虫\n血吸虫\n钩虫\n阿米巴\n太平洋牡蛎\n海参\n海鳃\n黄金螺\n两头蛇\n乌梢蛇\n剑纹带蛇\n响尾蛇\n圆斑蝰\n太攀蛇\n山王蛇\n岩蟒\n束带蛇\n棕伊澳蛇\n森蚺\n水蚺\n水蛇\n水蟒\n沙漠王蛇\n沙漠腹蛇\n沙蟒\n海蛇\n烙铁头\n牛奶蛇\n猪鼻蛇\n玉米蛇\n玫瑰蟒\n环颈蛇\n琴蛇\n白唇竹叶青蛇\n白头蛇\n白眉蝮蛇\n白蟒\n盲蛇\n眼镜蛇\n红蟒\n线虫\n绿曼巴蛇\n绿树蟒\n绿瘦蛇\n绿蟒\n缅甸蟒\n腾蛇\n花浪蛇\n草蛇\n菱斑响尾蛇\n葡萄树蛇\n蕲蛇\n虎斑游蛇\n虎蛇\n蝮蛇\n蝰蛇\n蟒蛇\n裂须海蛇\n赤链蛇\n金环蛇\n金花蛇\n铜斑蛇\n银环蛇\n锦蛇\n闪鳞蛇\n魔鬼蛇\n黄脊游蛇\n黑蛇\n鼓腹巨蝰\n丽斑麻蜥\n加拉帕戈斯海鬣蜥\n北草蜥\n变色树蜥\n变色蜥\n墨西哥毒蜥\n壁虎\n尼罗河巨蜥\n平原巨蜥\n无脚蜥蜴\n楔齿蜥\n沙漠角蜥\n澳洲魔蜥\n犀牛鬣蜥\n王者蜥\n科莫多巨蜥\n绿鬣蜥\n蓝尾金蜥\n蓝舌蜥\n豹纹守宫\n马蛇子\n鬃狮蜥\n麻蜥\n古巴鳄\n奥利诺科鳄\n宽吻凯门鳄\n尼罗鳄\n巴拉圭凯门鳄\n暹罗鳄\n沼泽鳄\n湾鳄\n澳洲淡水鳄\n眼镜凯门鳄\n短吻鳄\n福鳄\n美洲鳄\n菲律宾鳄\n钝吻古鳄\n锥吻古
鳄\n长嘴鳄\n食鱼鳄\n马来鳄\n黑凯门鳄\n中南半岛大鳖\n丽龟\n亚达伯拉象龟\n佛罗里达鳖\n几何陆龟\n凹甲陆龟\n加拉帕戈斯象龟\n南美小鳄龟\n卡罗莱纳箱龟\n印度星龟\n印度陆龟\n龟鳖四爪陆龟\n地龟\n埃及陆龟\n塞舌尔巨龟\n墨西哥箱龟\n安哥洛卡象龟\n山瑞鳖\n山鳖\n山龟\n巨型侧颈龟\n巴西彩龟\n平背龟\n扁尾陆龟\n斑鳖\n棱皮龟\n欧洲陆龟\n沙漠箱龟\n沼泽箱龟\n泥龟\n海龟\n湾岸箱龟\n牟氏水龟\n玛塔龟\n盒龟\n眼斑水龟\n红腿陆龟\n缅甸星龟\n缅甸陆龟\n美国鳖\n翘缘陆龟\n花龟\n苏卡达象龟\n蛛网陆龟\n西部箱龟\n豹纹陆龟\n赫曼陆龟\n辐射陆龟\n钟纹折背陆龟\n锦箱龟\n闭壳龟\n阿尔达布拉象龟\n阿根廷陆龟\n非洲折背陆龟\n饼干陆龟\n马来西亚巨龟\n马来鳖\n鹰嘴龟\n黄喉拟水龟\n黄斑侧颈龟\n黑颈乌龟\n黑鳖\n齿缘摄龟\n山蛭\n水丝蚓\n沙蚕\n水蛭\n蚯蚓\n面象虫\n僧帽水母\n栉水母\n根口水母\n水螅\n海羽星\n海葵\n淡水水母\n深海水母\n盒水母\n立方水母\n管水母目动物\n箱水母\n钵水母\n霞水母\n三叶虫\n蚰蜒\n中国红巨龙蜈蚣\n亚马逊巨人蜈蚣\n加拉帕格斯巨人蜈蚣\n北美巨人蜈蚣\n哈氏蜈蚣\n多棘蜈蚣\n少棘蜈蚣\n秘鲁巨人蜈蚣\n越南巨人蜈蚣\n马陆\n叩头虫\n吉丁虫\n大蜡螟\n大麦虫\n天幕毛虫\n天牛\n姬兜\n孑孓\n射炮步甲\n小蠹科\n尼科巴弓背蚁\n斑蝥\n木蛀虫\n树蟋\n棉蚜虫\n樗蚕\n水黾\n活板门蛛\n犀牛甲虫\n独角仙\n狮蚁\n瓢虫\n田鳖\n石蚕\n稚虫\n竹节虫\n粘虫\n胭脂虫\n腻虫\n芫菁\n菜青虫\n萤火虫\n蓟马\n虎甲\n体虱\n头虱\n木虱\n粉虱\n跳蚤\n阴虱\n飞虱\n虻\n蚂蚁\n切叶蚁\n子弹蚁\n微距蚂蚁\n斗牛犬蚁\n牛头犬蚁\n猛蚁\n织叶蚁\n臭蚁\n行军蚁\n阿根廷蚁\n三带喙库蚊\n埃及伊蚊\n库蚊\n按蚊\n白蚁\n长脚蚊\n蚜虫\n草蛉\n草蜻蛉\n蚊蝎蛉\n蝶角蛉\n变色夜蛾\n地老虎\n食心虫\n大造桥虫\n家蚕\n尺蛾\n帝王蛾\n斜纹夜蛾\n月形天蚕蛾\n枯叶蛾\n梨大食心虫\n棉红铃虫\n棉褐带卷蛾\n棉铃虫\n灯蛾\n烟草天蛾\n甜菜夜蛾\n眉纹天蚕蛾\n稻苞虫\n箩纹蛾\n美国白蛾\n舞毒蛾\n舟形毛虫\n苹果蠹蛾\n葡萄天蛾\n虎蛾\n螟蛾\n衣蛾\n透翅蛾\n马铃薯块茎蛾\n鹿蛾\n东方蜜蜂\n切叶蜂\n姬蜂\n木蜂\n瘿蜂\n虎头蜂\n蚁蜂\n蜂王\n西方蜜蜂\n长脚蜂\n熊蜂\n非洲蜂\n黑蜂\n蜉蝣\n乐仙蜻蜓\n杜松蜻蜓\n碧伟蜓\n红蜻蜓\n薄翅蜻蜓\n丝光绿蝇\n地中海实蝇\n寄蝇\n果蝇\n石蝇\n突眼蝇\n粪蝇\n胃蝇\n舌蝇\n苍蝇\n蚤蝇\n蝇蛆\n采采蝇\n食蚜蝇\n马蝇\n蝈蝈\n八点广翅蜡蝉\n斑衣蜡蝉\n柿广翅蜡蝉\n沫蝉\n角蝉\n龙眼鸡\n蝗虫\n副王蛱蝶\n喙凤蝶\n斑蝶\n曙凤蝶\n木兰青凤蝶\n灰蝶\n环蝶\n白蝴蝶\n眼蝶\n稻弄蝶\n粉蝶\n紫斑蝶\n纹白蝶\n绢蝶\n蓝蝶\n虎斑蝶\n蛱蝶\n赤蛱蝶\n金凤蝶\n金带喙凤蝶\n金裳凤蝶\n闪蝶\n黄斑弄蝶\n黄裳凤蝶\n蝼蛄\n牧草盲蝽\n硕蝽\n稻黑蝽\n荔蝽\n蝽\n麻皮蝽\n螳螂\n棕静螳\n北京油葫芦\n大棺头蟋蟀\n斗蟋\n中华灶蟋\n蟑螂\n蠓\n豆娘\n甘薯蚁象\n轮虫\n金龟子\n锹形虫\n长戟大兜\n隐翅虫\n马铃薯甲虫\n黄足黄守瓜\n龙虱\n东方扁虾\n中国对虾\n中国毛虾\n中国龙虾\n克氏原螯虾\n南极虾\n斑琴虾蛄\n日本对虾\n日本沼虾\n日本龙虾\n明虾\n杂色龙虾\n棘刺龙虾\n波纹龙虾\n白对虾\n白须龙虾\n皮皮虾\n红琵琶虾\n草虾\n蓝魔虾\n蜜蜂虾\n蝉虾\n螯虾\n褐虾\n铠甲虾\n锦绣龙虾\n长臂虾\n中华虎头蟹\n关公蟹\n北海道毛蟹\n圣诞岛红蟹\n大闸蟹\n寄居蟹\n帝王蟹\n拟穴青蟹\n旭蟹\n梭子蟹\n江蟹\n沙蟹\n珍宝蟹\n瓷蟹\n甘氏巨螯蟹\n短足拟石蟹\n石蟹\n老虎蟹\n蓝蟹\n蜘蛛蟹\n远海梭子蟹\n锈斑蟳\n锯齿溪蟹\n雪蟹\n面包蟹\n香槟蟹\n鹅颈藤壶\n印度华丽雨林蜘蛛\n圆蛛\n地蛛\n塔兰图拉毒蛛\n墨西哥火脚蜘蛛\n姬蛛\n巴西白膝头蜘蛛\n捕鱼蛛\n捕鸟蛛\n棒络新妇\n橙巴布蜘蛛\n毛蜘蛛\n水蜘蛛\n洪都拉斯卷毛蜘蛛\n漏斗蛛\n火玫瑰蜘蛛\n欧洲狼蛛\n白额巨蟹蛛\n皇帝巴布蜘蛛\n盗蛛\n盲蜘蛛\n红蜘蛛\n蟹蛛\n象牙华丽雨林蜘蛛\n跳蛛\n避日蛛\n金属蓝蜘蛛\n长脚盲蛛\n高脚蜘蛛\n黑腹狼蛛\n蜱虫\n东亚钳蝎\n中东金蝎\n亚洲雨林蝎
\n以色列杀人蝎\n八重山蝎\n利比亚金蝎\n北非黑肥尾蝎\n双针蝎\n土耳其黑肥尾蝎\n巴西金幽灵\n帝王蝎\n沙漠金蝎\n红爪蝎\n红爪雨林蝎\n荧光蝎\n雷达蝎\n鞭蝎\n马来西亚雨林蝎\n黄肥尾蝎\n黑粗尾蝎\n鲎\n乌贼\n九孔螺\n吸血鬼乌贼\n唐冠螺\n囊螺\n帽贝\n文蛤\n杂色鲍\n栉孔扇贝\n椎实螺\n河蚌\n泥螺\n海湾扇贝\n海蛞蝓\n烟管螺\n玉螺\n玉黍螺\n珍珠贝\n珠蚌\n田螺\n白玉蜗牛\n石鳖\n竹蛏\n船蛆\n船蛸\n芋螺\n虎斑宝贝\n蚬\n蚶\n蛞蝓\n蜒螺\n货贝\n贻贝\n马氏珍珠贝\n鱿鱼\n鲍\n鹦鹉螺\n黄宝螺\n三文鱼\n马哈鱼\n主刺盖鱼\n二长棘鲷\n六带石斑鱼\n六斑刺鲀\n刺鲳\n半滑舌鳎\n吉富罗非鱼\n四指马鲅\n土耳其神仙\n多鳞鱚\n大弹涂鱼\n大眼鲷\n大西洋庸鲽\n宝刀鱼\n宝石石斑鱼\n宽体舌鳎\n射水鱼\n小丑鱼\n小公鱼\n小蜜蜂鱼\n带鱼\n斑点鸡笼鲳\n斑鰶\n旗鱼\n日本刺尾鱼\n日本金线鱼\n星点东方鲀\n暗纹东方鲀\n木叶鲽\n条纹东方鲀\n柔鱼\n横带髭鲷\n毛烟管鱼\n沙丁鱼\n海鲶\n海鳗\n澳洲三间火箭\n澳洲彩虹鱼\n灰星鲨\n牛头鲷\n狼牙鰕虎鱼\n白姑鱼\n白斑星鲨\n白胸刺尾鱼\n皮氏叫姑鱼\n短鳍红娘鱼\n石斑鱼\n石鲽\n红笛鲷\n红苹果美人\n绿河豚\n绿鳍马面鲀\n绿鳍鱼\n美丽硬仆骨舌鱼\n美国鲥鱼\n羽毛关刀\n胡椒鲷\n草莓鱼\n荷兰凤凰\n蓝点马鲛\n蓝线雀\n蓝魔鬼鱼\n虫纹东方鲀\n蝶鱼科\n褐牙鲆\n褐菖鲉\n角蝶\n赤魟\n路氏双髻鲨\n达氏鲟\n遮目鱼\n金线鱼\n霞蝶\n青干金枪鱼\n马面鱼\n高眼鲽\n鮸鱼\n鲅鱼\n鲈鱼\n鲐\n鲐鱼\n鲨鱼\n尖吻鲭鲨\n青鲨\n鲱\n鲳鱼\n刺鳐\n蓝点魟\n鳕鱼\n鳟鱼\n麒麟神仙\n黄姑鱼\n黄尾副刺尾鱼\n黄火箭\n黄鲷\n黄鳍马面鲀\n黑鮶\n黑鲷\n黑鳃刺尾鱼\n一眉道人鱼\n一线铅笔\n七星刀鱼\n三段红白锦鲤\n三色水泡\n三色虎头\n三间鼠鱼\n中华细鲫\n中华花鳅\n丹顶锦鲤鱼\n乌云盖雪\n乌鲤\n五点铅笔\n五花兰寿\n五花狮\n五花短尾琉金\n五花龙睛\n企鹅灯\n凤尾龙睛\n刚果恐龙王\n刚果扯旗\n包金狮\n华鲮\n印加鹦鹉\n印第安鼠\n叉尾斗鱼\n反天刀\n反游猫\n口红红白锦鲤\n咖啡鼠鱼\n唐鱼\n喷火灯鱼\n团头鲂\n地图鱼\n埃及神仙\n墨头鱼\n墨衣锦鲤\n墨龙睛\n壮体沙鳅\n夜明珠鱼\n大和锦\n大花恐龙\n头尾灯鱼\n奥尼罗非鱼\n孔雀光写锦鲤\n孔雀鱼\n孔雀黄金锦鲤\n宁德鲤鱼溪\n宽鳍鱲\n富士红白锦鲤\n寿星虎头\n小精灵鱼\n尖嘴铅笔\n尼罗罗非鱼\n巨骨舌鱼\n帝王老虎魟\n弓鱼\n彩虹鲨\n德国三色锦鲤\n德国写鲤\n德国别甲锦鲤\n德国昭和锦鲤\n德州豹\n成吉思汗鱼\n扯旗红水泡\n拿破仑红白锦鲤\n接吻鱼\n斑尾凤凰\n斑点短鲷\n斑节恐龙\n斑马鱼\n新大钩扯旗\n新疆大头鱼\n星点龙\n朱球高头珍珠\n朱砂红白水泡\n条纹小鲃\n柠檬灯鱼\n樱花短尾琉金\n水浅黄锦鲤\n沙鳅\n泥鳅\n浅黄三色锦鲤\n清苔鼠\n清道夫\n漂亮宝贝鱼\n火口鱼\n火焰变色龙鱼\n火焰灯鱼\n火焰铅笔\n火鹤鱼\n熊猫异形鱼\n熊猫鼠\n燕子美人\n玉兔金鱼\n玉印头皇冠\n玉顶蝶尾\n玉顶银狮\n王字虎头\n玛丽鱼\n玫瑰旗\n玻利维亚凤凰\n玻璃扯旗鱼\n玻璃拉拉\n珍珠灯鱼\n珍珠马甲\n珍珠魟\n瓦氏雅罗鱼\n电鲶\n画眉鱼\n白云山鱼\n白云金丝鱼\n白写锦鲤\n白水泡\n白豹鼠\n皇冠三间\n皇冠六间\n皇冠棋盘\n皇冠直升机鱼\n神仙鱼\n紫望天\n紫白龙睛\n紫罗兰鼠\n紫蝶尾\n紫身枝牙虾虎\n红十字鱼\n红头鼠\n红宝石鱼\n红寿\n红尾梦幻旗\n红尾玻璃\n红尾皇冠\n红尾金龙鱼\n红尾鲶\n红尾黑鲨\n红扯旗\n红望天\n红水泡\n红玫瑰鱼\n红珍珠鱼\n红白兰寿\n红白狮头龙睛\n红白皇冠\n红白短尾琉金\n红白蝶尾\n红绿灯鱼\n红背朱砂水泡\n红衣梦幻旗\n红顶五花狮\n红顶兰寿\n红顶猫狮\n红顶虎头\n红顶黑珍珠\n红魔鬼鱼\n红鳍鲌\n红鼻剪刀鱼\n红龙鱼\n纹唇鱼\n细鳞斜颌鲴\n绯昭和锦鲤\n维吉塔短鲷\n绿巨人鱼\n绿裳红剑尾坦克\n罗汉鱼\n翘嘴红鲌\n花斑副沙鳅\n花椒鼠\n花秋翠锦鲤\n花老虎\n苏眉鱼\n茶鲤\n草绳恐龙\n草鱼\n荧鳍龙睛\n荧鳞短尾琉金\n蓝宝石鱼\n蓝帝王灯\n蓝曼龙\n蓝灯鱼\n蓝眼鳉\n蓝蝶尾\n蓝衣锦鲤\n蓝袖短鲷\n虎皮鱼\n虎纹恐龙王\n血鹦鹉\n豹魟\n贴分黄金锦鲤\n赤三色锦鲤\n赤别甲锦鲤\n赤眼鳟\n过背金龙鱼\n
酋长短鲷\n金头黑白狮\n金恐龙鱼\n金曼龙\n金松叶锦鲤\n金点魟\n金翅帝王鼠\n金菠萝鱼\n钻石灯鱼\n钻石皇冠豹\n铜盆鱼\n银松叶锦鲤\n银燕子\n银鲴\n银鳞三色锦鲤\n银龙鱼\n长尾紫琉金\n长尾红白琉金\n长尾黑琉金\n长薄鳅\n闪电红白锦鲤\n阿卡西短鲷\n霓虹燕子\n青海湖裸鲤\n青铜鼠\n青鱼\n非洲凤凰\n非洲十间\n非洲王子鱼\n马口鱼\n鲂\n鲇鱼\n鲢\n鳄鱼恐龙王\n鳊\n鳙\n鳡\n鹅头红\n黄写锦鲤\n黄金达摩\n黄金鳉\n黄鲤\n黑旗鱼\n黑望天\n黑白水泡\n黑白蝶尾\n黑白魟\n黑线飞狐鱼\n黑莲灯\n黑裙鱼\n黑金红头鼠\n黑龙鱼\n龙睛珍珠\n龙睛虎头\n加岛环企鹅\n南跳岩企鹅\n小蓝企鹅\n帝王企鹅\n帽带企鹅\n斑嘴环企鹅\n斯岛黄眉企鹅\n洪堡企鹅\n白颊黄眉企鹅\n金图企鹅\n阿德利企鹅\n马可罗尼企鹅\n麦哲伦企鹅\n黄眉企鹅\n美洲鸵\n鸵鸟\n鸸鹋\n鹤鸵\n三宝鸟\n冠鱼狗\n斑鱼狗\n栗喉蜂虎\n棕胸佛法僧\n犀鸟\n白领翡翠\n笑翠鸟\n绿喉蜂虎\n蓝喉蜂虎\n蓝翡翠\n蓝胸佛法僧\n蓝须夜蜂虎\n蓝颊蜂虎\n赤翡翠\n鹳嘴翡翠\n黄喉蜂虎\n漂泊信天翁\n潜鸟\n乌灰鹞\n乌雕\n兀鹫\n冕雕\n冠雕\n冠鹰雕\n凤头卡拉鹰\n凤头蜂鹰\n凤头鹃隼\n凤头鹰\n加州神鹫\n南非兀鹫\n卡拉卡拉鹰\n双齿鹰\n大鵟\n小雀鹰\n小雕\n巨翅鵟\n斑头雁\n日本松雀鹰\n普通鵟\n暗色歌鹰\n条纹鹰\n栗翅鹰\n栗鸢\n棕尾鵟\n棕榈鹫\n棕翅鵟鹰\n棕腹隼雕\n楔尾雕\n毛腿鵟\n海雕\n淡色歌鹰\n渔雕\n游隼\n灰背隼\n灰脸鵟鹰\n灰隼\n燕尾鸢\n燕隼\n爪哇鹰雕\n猎隼\n猛隼\n猛雕\n王鹫\n白兀鹫\n白头鹞\n白眼鵟鹰\n白肩雕\n白腹雕\n白腿小隼\n皱脸秃鹫\n矛隼\n短尾雕\n短趾雕\n秃鹫\n红头美洲鹫\n红尾鵟\n红脚隼\n红腿小隼\n红隼\n红鸢\n纹翅鸢\n美洲隼\n肉垂秃鹫\n胡兀鹫\n苍鹭\n苍鹰\n茶色雕\n草原隼\n草原鹰\n菲律宾雕\n菲律宾鹰\n蛇雕\n角雕\n赤腹鹰\n赤鸢\n赤鹰\n金雕\n阿尔泰隼\n雀鹰\n非洲鹰\n食猴鹰\n食猿雕\n饰冠鹰雕\n马来鹰雕\n鱼雕\n鸡鹰\n鹃头蜂鹰\n鹗\n鹞\n鹰雕\n黄爪隼\n黑冠鹃隼\n黑美洲鹫\n黑翅鸢\n黑耳鸢\n黑隼\n黑雕\n黑鸢\n三趾鸦雀\n三道眉草鹀\n丛林鸦\n东方大苇莺\n主红雀\n丽色奇鹛\n乌嘴柳莺\n乌鸦\n云雀\n仙八色鸫\n伯劳\n光背地鸫\n八哥\n冕雀\n冠蓝鸦\n冬鹪鹩\n凤头雀莺\n凤头鹀\n剑嘴鹛\n动冠伞鸟\n北扑翅鴷\n北朱雀\n北红尾鸲\n北鹨\n印支绿鹊\n厚嘴苇莺\n叉尾太阳鸟\n双辫八色鸫\n发冠卷尾\n台湾紫啸鸫\n台湾蓝鹊\n台湾黄山雀\n和平鸟\n咬鹃\n唐纳雀\n赤胸啄木鸟\n金喉拟啄木鸟\n黄纹拟啄木鸟\n斑姬啄木鸟\n黄嘴栗啄木鸟\n竹啄木鸟\n白翅啄木鸟\n三趾啄木鸟\n黄颈啄木鸟\n小星头啄木鸟\n绿啄木鸟\n白背啄木鸟\n红头啄木鸟\n栗啄木鸟\n白腹黑啄木鸟\n棕腹啄木鸟\n黄冠啄木鸟\n大斑啄木鸟\n小斑啄木鸟\n纹胸啄木鸟\n喜鹊\n园丁鸟\n夏候鸟\n夜鹰\n夜莺\n新疆歌鸲\n大噪鹛\n大朱雀\n大盘尾\n大绿雀鹎\n大苇莺\n大草鹛\n太平鸟\n宝兴鹛雀\n家鸦\n小仙鹟\n小燕尾\n小盘尾\n小鹀\n屠夫鸟\n山噪鹛\n山蓝仙鹟\n山雀\n山鹛\n山鹡鸰\n山鹨\n山麻雀\n岛鸫\n岩燕\n巨嘴沙雀\n布谷鸟\n弄岗穗鹛\n强脚树莺\n戈氏岩鹀\n戴菊\n拟大朱雀\n文须雀\n文鸟\n斑喉希鹛\n斑椋鸟\n斑翅朱雀\n斑背噪鹛\n斑背大尾莺\n斑胸短翅莺\n斑胸钩嘴鹛\n斑腰燕\n斑颈穗鹛\n旋木雀\n日本树莺\n日本歌鸲\n日本鹡鸰\n星鸦\n普通鳾\n暗冠蓝鸦\n暗绿柳莺\n暗胸朱雀\n暗色鸦雀\n曙红朱雀\n朱背啄花鸟\n朱鹂\n杂色山雀\n条纹噪鹛\n松雀\n松鸦\n极北朱顶雀\n林岭雀\n林柳莺\n林鹨\n柳莺\n树燕\n栗头八色鸫\n栗头地莺\n栗头雀鹛\n栗头鹟莺\n栗背短脚鹎\n栗腹歌鸲\n栗腹矶鸫\n栗臀噪鹛\n栗臀鳾\n栗颈噪鹛\n棕噪鹛\n棕头钩嘴鹛\n棕头雀鹛\n棕头鸦雀\n棕扇尾莺\n棕朱雀\n棕眉山岩鹨\n棕眉柳莺\n棕背雪雀\n棕胸蓝姬鹟\n棕脸鹟莺\n棕腹大仙鹟\n棕腹树鹊\n棕草鹛\n棕褐短翅莺\n棕顶树莺\n棕颈雪雀\n椋鸟\n楼燕\n横斑林莺\n橙头地鸫\n橙斑翅柳莺\n橙翅噪鹛\n橙胸姬鹟\n橙腹叶鹎\n歌百灵\n毛脚燕\n水蒲苇莺\n水鸫\n沙色朱雀\n河乌\n沼泽大尾莺\n泥色鸫\n海南蓝仙鹟\n海雀\n海鸠\n淡脚树莺\n淡黄腰柳莺\n火冠雀\
n火尾太阳鸟\n火尾绿鹛\n灰冠鸦雀\n灰卷尾\n灰喉山椒鸟\n灰喉鸦雀\n灰头斑翅鹛\n灰头椋鸟\n灰头灰雀\n灰头钩嘴鹛\n灰头鸦雀\n灰头鸫\n灰头鹀\n灰山椒鸟\n灰树鹊\n灰椋鸟\n灰眶雀鹛\n灰短脚鹎\n灰翅噪鹛\n灰翅鸫\n灰背燕尾\n灰背鸫\n灰脸鹟莺\n灰腹噪鹛\n灰蓝山雀\n灰颈鹀\n灰鹀\n灰鹡鸰\n灶鸟\n点翅朱雀\n点胸鸦雀\n烟柳莺\n烟腹毛脚燕\n煤山雀\n燕子\n燕雀\n玉山噪鹛\n珠颈斑鹑\n理氏鹨\n琴鸟\n田鸫\n画眉\n白冠噪鹛\n白冠攀雀\n白喉噪鹛\n白喉姬鹟\n白喉扇尾鹟\n白喉林莺\n白喉林鹟\n白喉短翅鸫\n白喉矶鸫\n白喉红尾鸲\n白喉红臀鹎\n白尾蓝仙鹟\n白斑翅雪雀\n白眉地鸫\n白眉姬\n白眉姬鹟\n白眉山雀\n白眉扇尾鹟\n白眉朱雀\n白眉歌鸫\n白眉蓝姬鹟\n白眉雀鹛\n白眶斑翅鹛\n白眶雀鹛\n白眶鸦雀\n白眶鹟莺\n白翅交嘴雀\n白脸山雀\n白腰朱顶雀\n白腰雪雀\n白腹凤鹛\n白腹短翅鸲\n白腹鸫\n白领凤鹛\n白颈鸦\n白颈鸫\n白颊噪鹛\n白颊鹎\n白鹡鸰\n百灵鸟\n盘尾树鹊\n相思鸟\n眼纹噪鹛\n矛斑蝗莺\n知更鸟\n短嘴山椒鸟\n石雀\n禾花雀\n稻田苇莺\n粉红山椒鸟\n粉红椋鸟\n粉红腹岭雀\n紫啸鸫\n紫翅椋鸟\n紫背椋鸟\n红交嘴雀\n红喉姬鹟\n红喉歌鸲\n红喉鹨\n红嘴山鸦\n红嘴蓝鹊\n红嘴钩嘴鹛\n红嘴鸦雀\n红头灰雀\n红头穗鹛\n红头鸦雀\n红尾水鸲\n红尾鸫\n红梅花雀\n红眉朱雀\n红翅旋壁雀\n红翅薮鹛\n红翼鸫\n红胁蓝尾鸲\n红背红尾鸲\n红胸啄花鸟\n红腰朱雀\n红腹灰雀\n红腹红尾鸲\n纯色噪鹛\n纯色山鹪莺\n纵纹绿鹎\n纹喉鹎\n纹背捕蛛鸟\n细纹噪鹛\n细纹苇莺\n织巢鸟\n织布鸟\n绣眼鸟\n维达鸟\n绿喉太阳鸟\n绿翅短脚鹎\n绿背山雀\n绿胸八色鸫\n缎蓝亭鸟\n翠鸟\n花彩雀莺\n草地鹨\n草雀\n蒙古沙雀\n蓝八色鸫\n蓝喉歌鸲\n蓝大翅鸲\n蓝头矶鸫\n蓝枕八色鸫\n蓝枕花蜜鸟\n蓝歌鸲\n蓝短翅鸫\n蓝矶鸫\n蓝绿鹊\n蓝翅八色鸫\n蓝翅噪鹛\n藏雀\n藏鹀\n藏黄雀\n虎斑地鸫\n蚁鸟\n蜂鸟\n蜡嘴雀\n血雀\n褐冠山雀\n褐头凤鹛\n褐头岭雀\n褐岩鹨\n褐灰雀\n褐翅雪雀\n褐翅鸦雀\n褐胁雀鹛\n褐脸雀鹛\n西域山雀\n贺兰山岩鹨\n赤朱雀\n赤红山椒鸟\n远东树莺\n远东苇莺\n酒红朱雀\n金丝雀鸟\n金冠地莺\n金冠树八哥\n金头扇尾莺\n金头穗鹛\n金枕黑雀\n金眼鹛雀\n金翅雀\n金胸雀鹛\n金色林鸲\n金色鸦雀\n金额丝雀\n金额雀鹛\n钩嘴鹛\n铜蓝鹟\n银胸丝冠鸟\n锈腹短翅鸫\n长嘴地鸫\n长嘴捕蛛鸟\n长嘴钩嘴鹛\n长尾地鸫\n长尾奇鹛\n长尾山椒鸟\n长尾缝叶莺\n长尾雀\n长尾鹩鹛\n阔嘴鸟\n雨燕\n雪鹀\n震旦鸦雀\n靛冠噪鹛\n靛蓝彩鹀\n靴篱莺\n领岩鹨\n食蜂鸟\n高山雀鹛\n鳞头树莺\n鸦嘴卷尾\n鸫鸟\n鸲岩鹨\n鹀\n鹊鸲\n鹟\n鹩哥\n鹪鹩\n鹬鸵\n麝雉\n黄喉噪鹛\n黄喉雀鹛\n黄嘴山鸦\n黄嘴朱顶雀\n黄嘴蓝鹊\n黄头鹡鸰\n黄眉\n黄眉姬\n黄眉林雀\n黄眉鹀\n黄绿鹎\n黄胸柳莺\n黄腰太阳鸟\n黄腹冠鹎\n黄腹啄花鸟\n黄腹山雀\n黄腹柳莺\n黄腹树莺\n黄腹花蜜鸟\n黄腹鹟莺\n黄腹鹨\n黄腹鹪莺\n黄臀鹎\n黄莺\n黄雀\n黄颈拟蜡嘴雀\n黄颊山雀\n黄鹡鸰\n黑冠椋鸟\n黑冠黄鹎\n黑卷尾\n黑喉石鵖\n黑喉红臀鹎\n黑喉缝叶莺\n黑头噪鸦\n黑头奇鹛\n黑头鳾\n黑头鹀\n黑头鹎\n黑尾地鸦\n黑斑蝗莺\n黑枕王鹟\n黑百灵\n黑眉苇莺\n黑短脚鹎\n黑翅雀鹎\n黑背燕尾\n黑胸太阳鸟\n黑胸鸫\n黑脸鹟莺\n黑顶噪鹛\n黑顶林莺\n黑顶白颊林莺\n黑领噪鹛\n黑额树鹊\n黑鸫\n（美洲产）伞鸟\n加拿大鹅\n小天鹅\n扁嘴天鹅\n斑嘴鸭\n斑头秋沙鸭\n林鹳\n棕硬尾鸭\n溆浦鹅\n灰雁\n灰鹅\n狮头鹅\n琵嘴鸭\n瓣蹼鹬\n瘤头鸭\n白鹅\n秋沙鸭\n红头潜鸭\n绒鸭\n绿翅鸭\n罗纹鸭\n花脸鸭\n蓝翅鸭\n豁眼鹅\n豆雁\n赤膀鸭\n赤麻鸭\n野鸭\n针尾鸭\n长尾鸭\n雁鹅\n雪雁\n非洲雁\n鸳鸯\n鸳鸯鸭\n麝香鸭\n麻鸭\n黑天鹅\n黑海番鸭\n黑雁\n巨嘴鸟\n蚁鴷\n黄腰响蜜鴷\n三趾鹑\n丛冢雉\n元宝鸡\n冕鹧鸪\n刚果孔雀\n勺鸡\n南秧鸟\n原鸡\n台湾山鹧鸪\n吐绶鸡\n大凤冠雉\n大骨鸡\n寿光鸡\n山鹑\n岩雷鸟\n彩雉\n彩鸡鹑\n斑翅山鹑\n斑胸田鸡\n松鸡\n柳雷鸟\n榛鸡\n毛脚鸡\n黑水鸡\n紫水鸡\n沙鸡\n济宁百日鸡\n海兰褐壳蛋鸡\n海南孔雀雉\n海南山鹧鸪\n灰头麦鸡\n
灰孔雀雉\n狼山鸡\n环颈雉\n珍珠鸡\n白尾雷鸟\n白颊山鹧鸪\n白鹇\n眼斑吐绶鸡\n眼斑火鸡\n眼斑雉\n矮脚鸡\n石鸡\n竹鸡\n紫青水鸡\n红喉山鹧鸪\n红胸山鹧鸪\n绿孔雀\n绿脚山鹧鸪\n绿雉\n艾草松鸡\n芦花鸡\n草原榛鸡\n蓝胸鹑\n蓝鹇\n虹雉\n血雉\n褐胸山鹧鸪\n角雉\n贵妃鸡\n走鹃\n铜尾孔雀雉\n铜长尾雉\n锦鸡\n长尾雉\n马鸡\n骨顶鸡\n高地鹧鸪\n鹅雏\n鹌鹑\n鹧鸪\n鹫珠鸡\n黄羽肉鸡\n黄鸡\n黄麻鸡\n黑凤冠雉\n黑腹翎鹑\n黑鸡\n黑鹇\n三趾鸥\n剪嘴鸥\n巨鹱\n江鸥\n海燕\n海鸥\n灰背鸥\n燕鸥\n白翅浮鸥\n白鸥\n贼鸥\n遗鸥\n须浮鸥\n黄嘴河燕鸥\n黑嘴鸥\n黑尾鸥\n大灰猫头鹰\n斑头鸺鹠\n林雕鸮\n大雕鸮\n灰林鸮\n猛鸮\n短耳鸮\n斑点猫头鹰\n纵纹腹小鸮\n褐渔鸮\n仓鸮\n毛腿渔鸮\n纵纹角鸮\n雪鸮\n草鸮\n花头鸺鹠\n长耳鸮\n雕鸮\n黄嘴角鸮\n丘鹬\n双领鸻\n反嘴鹬\n小嘴鸻\n小黄脚鹬\n弯嘴滨鹬\n扇尾沙锥\n斑胸滨鹬\n杓鹬\n林沙锥\n林鹬\n流苏鹬\n海鹦\n灰尾鹬\n燕鸻\n笛鸻\n红脚鹬\n翻石鹬\n蛎鹬\n金鸻\n原鸽\n山斑鸠\n岩鸽\n斑尾林鸽\n斑尾鹃鸠\n棕斑鸠\n楔尾绿鸠\n橙胸绿鸠\n渡渡鸟\n火斑鸠\n灰林鸽\n点子鸽\n珠颈斑鸠\n瘤鼻鸽\n白羽王鸽\n筋斗鸽\n红翅绿鸠\n绿皇鸠\n绿翅金鸠\n苍白骑士\n雪鸽\n黑颏果鸠\n乌鹃\n八声杜鹃\n小鸦鹃\n斑翅凤头鹃\n棕腹杜鹃\n紫金鹃\n红翅凤头鹃\n绿嘴地鹃\n翠金鹃\n蕉鹃\n褐翅鸦鹃\n鹰鹃\n军舰鸟\n北鲣鸟\n双冠鸬鹚\n斑头鸬鹚\n鸬鹚\n澳洲鹈鹕\n白尾鹲\n红嘴鹲\n红蛇鹈\n美洲蛇鹈\n蓝眼鸬鹚\n蓝脸鲣鸟\n鲣鸟\n鹈鹕\n黑腹军舰鸟\n黑腹蛇鹈\n黑颈鸬鹚\n姬田鸡\n小田鸡\n斑胁田鸡\n沙丘鹤\n灰鹤\n白头鹤\n白枕鹤\n白眉田鸡\n白胸苦恶鸟\n白鹤\n秧鸡\n秧鹤\n红胸田鸡\n红鹤\n蓑羽鹤\n蓝胸秧鸡\n赤颈鹤\n鸨\n黑颈鹤\n吸蜜鹦鹉\n塞内加尔鹦鹉\n大绯胸鹦鹉\n太阳鹦哥\n彩虹鹦鹉\n灰头鹦鹉\n灰鹦鹉\n牡丹鹦鹉\n玄凤鹦鹉\n白鹦鹉\n短尾鹦鹉\n米切氏凤头鹦鹉\n红嘴鹦鹉\n红肩金刚鹦鹉\n红腹鹦鹉\n红色的鹦鹉\n红额金刚鹦鹉\n花头鹦鹉\n葵花凤头鹦鹉\n蓝冠吸蜜鹦鹉\n蓝冠短尾鹦鹉\n蓝头金刚鹦鹉\n蓝颊玫瑰鹦鹉\n虎皮鹦鹉\n金刚鹦鹉\n金刚鹦鹉族\n金尾折衷鹦鹉\n金领金刚鹦鹉\n长尾鹦鹉\n食肉鹦鹉\n黄颈亚马逊鹦鹉\n黄额鹦鹉\n三色鹭\n印度池鹭\n啸鹭\n噪鹮\n埃及圣鹮\n夜鹭\n大蓝鹭\n岩鹭\n彩鹮\n彩鹳\n斑鹭\n朱鹭\n朱鹮\n栗腹鹭\n棕夜鹭\n棕颈鹭\n池鹭\n漂鹬\n澳洲白鹮\n牛背鹭\n琵鹭\n白脸鹭\n白腹鹳\n白鹭\n白鹳\n秃鹮\n秃鹳\n紫背苇鳽\n红鹳\n绿鹭\n绿鹮\n美洲白鹮\n美洲红鹮\n美洲绿鹭\n草鹭\n蓝灰鹭\n裸颈鹳\n钳嘴鹳\n锤头鹳\n长脚鹬\n非洲秃鹳\n鞍嘴鹳\n鲸头鹳\n麻鹭\n黄嘴鹮鹳\n黄脸鹳\n黄苇鳽\n黑头白鹮\n黑颈鹳\n黑鹭\n黑鹳\n黑喉潜鸟\n黄胡蜂\n短毛家猫\n蕈蚊\n北极兔\n凤尾蝶\n柯利牧羊犬\n斑背潜鸭\n椰子蟹\n卷叶蛾\n异国短毛猫\n细嘴黄鹂\n黑顶麻雀\n毒蛾\n格雷伊猎犬\n博伊金猎犬\n巴贝犬\n柯利犬\n杜高塞德斯科犬\n因纽特犬\n大型英法三色猎犬\n峡谷㹴\n阿尔卑斯达切斯勃拉克犬\n班道戈犬\n坎高犬\n俄罗斯－欧洲莱卡犬\n马瑞马·阿布鲁佐牧羊犬\n塞尔维亚猎犬\n阿拉帕哈蓝血斗牛犬\n丰山犬\n西班牙獒犬\n黑棕褐梗\n狮子狗\n阿特拉斯梗\n卡罗来纳犬\n斯洛伐克硬毛指示猎犬\n瑞典牧羊犬\n警犬\n拉贾帕拉耶姆犬\n小福克斯犬\n比利犬\n阿札瓦克犬\n波斯尼亚的牧羊犬\n威尔士牧羊犬\n加迪库塔犬\n霍夫瓦尔特犬\n布列塔尼猎犬\n奥弗涅指示猎犬\n捕鼠㹴犬\n智利狐狸梗\n印第安松鼠狗\n苏塞克斯猎犬\n蛋糕罐罐狗\n拉戈托罗马阁挪露犬\n布里吉特格里芬犬\n鬼脸狗\n山恶狗\n卡累利阿熊犬\n棉花面纱犬\n拉坎诺斯犬\n瑞典拉普猎犬\n阿兰多獒犬\n日本㹴\n挪威卢德杭犬\n史其派克犬\n瑞典腊肠犬\n库瓦兹犬\n威斯特达克斯布若卡犬\n澳大利亚牧牛犬\n贝克哈沃德狗\n博洛尼亚犬\n约克波犬\n爱罗狗\n爱迪犬\n甲斐犬\n波多黎各獒犬\n波兰猎犬\n罗秦犬\n英国猎浣熊犬\n罗威纳布林迪西狗\n拉布拉多哈士奇\n卡南犬\n大麦町\n迷你猎狐㹴\n佩特戴尔㹴\n约克夏㹴\n美国无毛㹴犬\n班特牛头犬\n中国福犬\n墨西哥无毛犬\n吉斯熊狗\n老式英国斗牛犬\n瓷
器犬\n西班牙水犬\n意大利指示猎犬\n蒙特内哥罗山猎犬\n斗牛獒\n比韦尔狗\n猴㹴\n芬兰驯鹿犬\n奥地利黑褐猎犬\n保罗蓝梗\n波隆卡\n芬兰猎犬\n俄罗斯黑㹴\n猎水獭犬\n杂交狗\n大明斯特兰德犬\n阿图瓦猎犬\n荷兰斯牟雄德犬\n拉普尔灰狗\n维汗牧羊犬\n米奇狗\n帕森拉塞尔㹴犬\n诺维奇梗\n阿登牧牛犬\n美洲印第安人狗\n金塔马尼\n布劳特猎犬\n委内瑞拉牧羊犬\n埃什特雷拉山犬\n德国猎㹴\n贝里亚尔狗\n兰雷西犬\n罗马尼亚喀尔巴阡山脉牧羊犬\n纪州犬\n普里亚指狗\n捷克梗\n葡萄牙牧羊犬\n葡萄牙指示犬\n喜玛拉雅牧羊犬\n特洛曼犬\n立陶宛猎犬\n伊斯特拉粗毛猎犬\n塞尔维亚三色猎犬\n法连尼犬\n短毛视觉猎犬\n罗马尼亚米利泰克牧羊犬\n塞佩莱西伯利亚雪橇犬\n布拉克杜佩\n波兰狩猎犬\n布拉格瑟瑞克犬\n古梗犬\n乌拉圭西马伦犬\n印度野猪猎犬\n捷克狼犬\n丹麦老式指示猎犬\n撒阿路斯狼猎犬\n非洲家犬\n杂交狮子狗\n德国猎犬\n泰岗猎犬\n东西伯利亚雷卡犬\n威尔士猎犬\n凯利蓝㹴\n古代西班牙指示猎犬\n普德尔向导猎犬\n诺波丹狐狸犬\n巴吉度犬布列塔尼\n芬兰波美拉尼亚丝毛狗\n斯莫兰德猎犬\n树枝田纳西州斑点狗\n哈巴小猎犬\n复兴斗牛犬\n巨人狗\n希腊猎犬\n老式英国斗牛犬\n阿富汗牧羊犬\n施普斯狗\n克里特岛猎犬\n圣东基犬\n保加利亚牧羊犬\n那不勒斯獒犬\n瑞典猎鹿犬\n秘鲁无毛犬\n挪威布哈德犬\n格雷伊猎犬\n豹犬\n巴伐利亚山地犬\n桦太犬\n帕内尔的卡罗莱纳州恶狗\n特兰西瓦尼亚猎犬\n巴基斯坦斗牛犬\n亚伦狗\n兰伯格犬\n奥斯卡贵宾\n克龙弗兰德犬\n荷兰牧羊犬\n提洛尔猎犬\n英国獒犬\n英国白梗\n树丛杂种犬\n西班牙斗牛犬\n爱沙尼亚猎犬\n德克萨斯犬\n田特菲㹴\n巴西㹴\n瑞士猎犬\n哈尔登猎犬\n波密犬\n泰迪罗斯福梗\n英国玩具犬\n白色牧羊犬\n戈登塞特犬\n荷兰水猎犬\n雪贵宾\n布拉克·杜·波旁犬\n白英国斗牛犬\n塔马斯卡犬\n双鼻安第斯虎猎犬\n丝绸惠比特犬\n萨普兰尼那克犬\n马地犬\n贝加马斯卡犬\n标准贵宾犬\n台湾犬\n安纳托利亚牧羊犬\n科克尔犬\n比利牛斯獒犬\n伯尔尼劳佛犬\n汉密尔顿猎犬\n小型明斯特兰德犬\n阿里埃日犬\n乌托尼亚犬\n俄罗斯猎狼犬\n蓝色匹卡迪档猎犬\n基里奥犬\n伯格尔・德皮卡第犬\n加斯科尼蓝色矮腿猎犬\n安达卢西亚酒窖大猎狗\n袋鼠狗\n艾尔曼特狗\n麦克纳布犬\n恩特雷布赫山地犬\n俄罗斯玩具犬\n奥地利平犬\n大树猎犬\n威尔士矮脚狗\n穆托尔猎犬\n乞沙比克猎犬\n海根猎犬\n北海道犬\n田野猎犬\n欧洲猎犬\n凯利小猎犬\n长狗\n迷你贝吉格里芬凡丁犬\n大蓝加斯科涅猎犬\n苏利莫夫犬\n澳洲粗短尾牧牛犬\n马尔济斯\n库达犬\n丹麦布罗荷马獒\n兰开夏跟脚犬\n斯卢夫猎犬\n格陵兰犬\n兰西尔犬\n皮卡第猎犬\n美洲猎鹿犬\n谷斗牛犬\n欧亚大陆犬\n四国犬\n短毛意大利猎犬\n潘布魯克威尔斯柯基犬\n爱尔兰红白雪达犬\n英国指示犬\n波利尼西亚犬\n阿兹卡尔\n马鲁索斯犬\n斯塔比嚎犬\n汝拉布鲁诺猎犬\n小猎犬拉布拉多杂交狗\n金德利犬\n史毕诺犬\n小猎兔犬\n土耳其阿卡巴士犬\n圣日尔曼指示猎犬\n美国玩具㹴\n法国黑白色犬\n马林斯费斯特狗\n德冷特斯奇鹧鸪犬\n树丛竞走猎浣熊犬\n黑嘴杂种犬\n泰国脊背犬\n澳洲野狗\n庞特－奥德门猎犬\n韩国杜莎犬\n皮罗·德·伯里沙·马罗奎因犬\n巨型雪纳瑞\n斯恰潘道斯犬\n马拉石狗\n巴辛吉犬\n拉布拉多贵宾狗\n巴西非勒\n巴斯克牧羊犬\n马扎尔犬\n波希米亚牧羊犬\n汉诺威猎犬\n加拿大爱斯基摩犬\n莫斯科水犬\n哈瓦那犬\n库克颇犬\n沟壑梗犬\n坎尼狗\n比利牛斯牧羊犬\n艾特拉科尼克犬\n卡林犬\n黑褐弗吉尼亚猎狐犬\n非洲野犬\n诺福克㹴\n印度斯皮茨狗\n普露马梗\n西西伯利亚莱卡犬\n卡塔胡拉牛头犬\n运动卢卡斯㹴\n国王牧羊犬\n波萨维茨猎犬\n波西米亚硬毛格里芬指示犬\n皮卡普狗\n泰迪罗斯福梗\n丹麦瑞典农场犬\n波兰低地牧羊犬\n骑士比熊犬\n塔特拉山牧羊犬\n芬兰拉普猎犬\n英国雪达犬\n珍岛犬\n硬毛维兹拉犬\n斗牛梗\n克罗地亚牧羊犬\n德国粗毛指示猎犬\n夏伊洛牧羊犬\n西班牙灰狗\n费斯特山狗\n玩具斗牛犬\n夏威夷猫\n地丛鹟\n火红鸲鹟\n白斑尾柳莺\n黑背麦鸡\n纽氏梅花雀\n栗翅牛鹂\n北蝗莺\n白颊长尾山雀\n淡红鹪鹩\n黑嘴天鹅\n五彩鹦鹉\n红头咬鹃\n黑头织雀\n侏莺雀\n淡尾苇鹪鹩\n黑脸鸨\n蓝眼凤头鹦鹉\n红胸皇鸠\n柠黄雀鹀\n沼泽幽鹛\n番红头鹦哥\n白凤头鹦鹉\n翘眉企鹅
\n埃及雁\n白腹地鸫\n灰食籽雀\n红尾歌鸲\n红白针尾雀\n北美蛎鹬\n红嘴巨嘴鸟\n火尾希鹛\n白腰草鹬\n印度鸲\n蓝背娇鹟\n厚嘴绿鸠\n黑头针尾雀\n黑腹树鸭\n凤头山雀\n灰噪刺莺\n黄纹薮雀\n红胸长爪鹡鸰\n阿拉伯啄木鸟\n灰蚁鵙\n日本柳莺\n黑喉隐蜂鸟\n黑肩鸢\n火喉蜂鸟\n凤头距翅麦鸡\n灰褐噪鹛\n酒红斑鸠\n地犀鸟\n白翅薮鹟\n黄腹绿鸠\n澳洲鹨\n巨辉椋鸟\n黑领鹎\n环蚁鹨\n白额饰眼鹟\n白颊黑雁\n彩鹬\n凤冠火背鹇\n侏长尾山雀\n白尾雀鹎\n栗顶薮雀\n马岛鹡鸰\n小杓鹬\n灰连雀\n黑腹食蚊鸟\n笛声鹪鹩\n白腹灰蕉鹃\n白眼褐鹎\n紫头蜂鸟\n美洲乌夜鹰\n黑腹辉椋鸟\n杂色须霸鹟\n锈腹薮雀\n黑喉粗嘴雀\n白眼潜鸭\n斯里兰卡蓝鹊\n小灰啄木鸟\n暗黄顶黑唐纳雀\n黑领啸鹟\n淡头玫瑰鹦鹉\n白胸窜鸟\n巨鹭\n黑腹花蜜鸟\n刀嘴海雀\n绯腰巨嘴鸟\n南美黑啄木鸟\n蓝头八色鸫\n红翅斑腹雀\n丛林山鹪莺\n红地鸠\n褐绣眼鸟\n蓝翅希鹛\n勺嘴鹬\n紫滨鹬\n白颊犀鸟\n内岛柳莺\n棕背蚁鸟\n白腰树燕\n白喉唧鹀\n沟嘴巨嘴鸟\n黑澳鳾\n棕耳噪鹛\n黑头拟鹂\n橙冠拟鹂\n橄榄篱莺\n黄腹噪刺莺\n黄纹绿鹦鹉\n白脸角鸮\n灰头林鸽\n澳洲灰雁\n非洲鹨\n镰翅夜鹰\n白腹海雀\n侏鸊鷉\n沼泽山鹪莺\n朱腹啄木鸟\n鹤鹬\n红冠啄花鸟\n黑颈天鹅\n苇鹀\n白眉林鸲\n美洲银鸥\n叙利亚啄木鸟\n短尾翠蜂鸟\n绿食蜜鸟\n白眉鹀\n黑头鸦雀\n粉胸斑鸠\n毛腿沙鸡\n红喉鸡鸠\n棕薮鸟\n黄色白斑翅雀\n长冠鹰雕\n灰背织雀\n长尾椋鸟\n白胸噪鹛\n淡嘴火雀\n华美极乐鸟\n丛蚁鵙\n尤卡坦啄木鸟\n五彩文鸟\n蒲苇莺\n半蹼鹬\n白头薮雀\n冠卷尾\n楔尾伯劳\n白背鸭\n红翅鹦鹉\n灰眉岩鹀\n亨岛苇莺\n乳白背啄木鸟\n灰嘴慧星蜂鸟\n白脸噪鹛\n点斑林鸮\n金眶鸻\n科氏蜂鸟\n青脚滨鹬\n杂色麦鸡\n黑顶唐纳雀\n印度走鸻\n白喉针尾雨燕\n澳洲苇莺\n黄胸鹬\n白翅噪椋鸟\n烟色绿霸鹟\n绿加岛莺雀\n秘鲁[红翅]共鸟\n灰地霸鹟\n史氏蝗莺\n白背矶鸫\n黑喉绿阔嘴鸟\n长尾山鸠\n黑眉绿鹎\n白喉鹧鸪\n亚历山大鹦鹉\n白腹麻鸭\n疣鼻栖鸭\n白眉鸣鹃鵙\n棕背小嘲鸫\n橙腹果鸠\n双日阔嘴蜂鸟\n黑顶唐加拉雀\n紫胸佛法僧\n黑冠噪鹛\n云斑林鹑\n矶鹬\n金胸歌鸲\n卷冠蓝鸦\n爪哇短翅莺\n黑冠啄木鸟\n黄褐食籽雀\n南美长尾蜂鸟\n蓝头佛法僧\n灰胸籽鹬\n黑腹蜂鸟\n黄垂麦鸡\n非洲褐鸫鹛\n灰头斑鸠\n凤头蜂鸟\n南非企鹅\n绿额矛嘴蜂鸟\n燕尾夜鹰\n长趾滨鹬\n黑眉信天翁\n塔劳秧鸡\n灰头文鸟\n墨西哥鹪鹩\n黑尾塍鹬\n加岛柳莺\n灰头啄木鸟\n白胸鸬鹚\n白喉角鸮\n黑腹鹱\n黄嘴鹊鵙\n白斑燕\n灰背南美鵟\n黑凤头鹦鹉\n黄臀鹎\n中杓鹬\n凤头海雀\n白冠啄木鸟\n凤头潜鸭\n非洲鸵鸟\n白顶地鸽\n黑头薮雀\n白翅夜鹰\n红侏霸鹟\n白眉雨燕\n小滨鹬\n白冠带鹀\n褐头山雀\n米岛扇尾鹟\n林攀雀\n纵纹蚁鸫\n白斑文鸟\n灰啸鹟\n灰头果鸠\n黑喉拟鴷\n黑嘴山巨嘴鸟\n索马里鸵鸟\n东方鸻\n黄褐翅铲嘴雀\n亮丽太阳鸟\n乌顶灌丛唐纳雀\n白腹针尾雨燕\n灰胸啄木鸟\n直嘴芦雀\n栗耳凤鹛\n红尾鹦哥\n紫辉椋鸟\n白翅澳鸦\n白尾蚁鸫\n蓝额红尾鸲\n红腹金刚鹦鹉\n南极鹱\n黑翅麦鸡\n灰啄木鸟\n红胸黑雁\n卢氏丽椋鸟\n白腰叉尾海燕\n凤头犀鸟\n厄瓜多尔啄木鸟\n非洲硬尾鸭\n淡胁秧鸡\n黑胸三趾鹑\n红尾猛雀鹀\n白翅丽唐纳雀\n马岛秧鸡\n灰头翡翠\n北长尾山雀\n红胸姬鹟\n栗背山雀\n黑蜂虎\n北马岛苇莺\n白点扇尾鹟\n马岛戴胜\n灰顶伯劳\n灰冠黑雀\n灰苇鹪鹩\n白喉爬地雀\n辉亭鸟\n白额雁\n北美鸺鹠\n乌灰鸫\n领月胸窜鸟\n点斑背蚁鸟\n蓝尾翠蜂鸟\n灰雕鸮\n印加燕鸥\n黄胸蚁鸫\n云南柳莺\n白簇岩吸蜜鸟\n大滨鹬\n黑剪嘴鸥\n黑疣皇鸠\n白胸蚁鸟\n茶胸吸蜜鸟\n白顶溪鸲\n白颈穗鹛\n橙胸咬鹃\n橙嘴蓝脸鲣鸟\n小杜鹃\n南美草鹀\n凤头褐鹎\n短嘴长尾山雀\n裸顶蚁鸟\n乌双斑雀\n普通秋沙鸭\n华西柳莺\n红头鸽\n白燕鸥\n蒙古沙鸻\n亚马逊灌丛霸鹟\n三趾鹟鴷\n凤头[共鸟]\n白胸绣眼鸟\n澳洲鹤\n蓝顶蚁鸫\n长冠八哥\n黑顶柳莺\n大杓鹬\n戴胜\n青脚鹬\n欧亚攀雀\n南美蚁鸫\n蓝喉翠鴗\n冠绣眼鸟\n非洲绿胸八色鸫\n鹊色唐纳雀\n黑短尾蓬背鹟\n领雀嘴鹎\n黑背白斑翅雀\n棕颈地霸鹟\n黑冕鹤\n杂色林鵙鹟\n灰颊夜鸫\n白喉蚁鸟\n南美针尾雀\n乌鸫
\n铁嘴沙鸻\n厄瓜多尔姬啄木鸟\n纯胸姬啄木鸟\n黑胸麻鸭\n长嘴剑鸻\n美洲鹤\n黄翅莺雀\n白眼莺雀\n红颈滨鹬\n冕麦鸡\n白喉拾叶雀\n红颈黑鵙\n斑尾娇鹟\n锈喉马岛鹃\n纹针尾雀\n白南美鵟\n俾岛翠鸟\n暗色秧鸡\n白头麦鸡\n丽蓝头鹊\n三色伞鸟\n短嘴鸽\n栗鹀\n黑喉红尾鸲\n黑喉织雀\n黑脸石鸡\n黑皇鸠\n黑冠山雀\n鸡尾鹦鹉\n白喉绣眼鸟\n细斑姬啄木鸟\n灰鹨\n台湾拟啄木鸟\n赤颈鸭\n黑噪鹛\n红耳鹎\n白喉鸫鹛\n剑鸻\n棕钩嘴鵙\n短嘴地鹃\n马来犀鸟\n黑额燕鸥\n赭红尾鸲\n黑领鸲莺\n蓝山雀\n厄瓜多尔姬霸鹟\n亚马逊鹦哥\n厄瓜多尔小霸鹟\n黑颏岭裸鼻雀\n凤头黄眉企鹅\n灰头树鹛\n印支雀鹛\n四声杜鹃\n红颊蓝饰雀\n苍头薮雀\n棕颈拾叶雀\n白尾蓝胸蜂鸟\n尖嘴蜜鴷\n十二线极乐鸟\n冠鸦\n冠鸠\n冠鸭\n红额噪鹛\n白眼吸蜜鸟\n黑脚信天翁\n东非长尾伯劳\n白纹鴷雀\n黑翅长脚鹬\n斑喉鵖\n鹊鸭\n鹊雁\n沼泽鹎\n黄腹织雀\n南非鹨\n乌鵙鹟\n南非鹎\n红白唐纳雀\n狭尾椋鸟\n冠纹柳莺\n白喉林鸽\n厚嘴地鸠\n北杂鹛\n冠针尾雀\n皇辉蜂鸟\n三色薮雀\n灰额绿鸠\n绿嘴鸦鹃\n田雀鹀\n环颈鸻\n非洲白腰雨燕\n环颈鸭\n点胸蚁鵙\n紫喉领蜂鸟\n亮丽歌雀\n欧柳莺\n栗背丽鸫\n鬼鸮\n绿尾翠蜂鸟\n橙额黄雀鹀\n辉蓝细尾鹩莺\n南红蜂虎\n灰胸雅鹛\n丽色山雀\n黑脸王鹟\n珠眼嘲鸫\n黑白噪鹛\n黑枕威森莺\n北灰山雀\n黑嘴蕉鹃\n黄喉鹎\n黄喉鹀\n灰头绿鸠\n黑尾啸鹟\n小青脚鹬\n黑斑裸眼蚁鸟\n黑脸厚嘴雀\n短尾鹨\n白腹蜂鸟\n七彩文鸟\n短趾矶鸫\n细嘴雁\n棕顶大尾莺\n金眶鹟莺\n白眼先蚁鸫\n白头刺莺\n沼泽短翅莺\n细嘴杓鹬\n绿颈鹑鸠\n爪哇绿鹊\n黄腿鸥\n黑背信天翁\n栗胸地鹃\n白眼蚁鹩\n黄背拟鹂\n伯氏鹨\n白胸森鸠\n黄眉蚁鸟\n灰颊绿鸠\n灰头鹑鸠\n米岛鹎\n针尾沙锥\n斑翅山鹪莺\n仙蓝王鹟\n赤顶娇鹟\n点额啄木鸟\n南方蚁鵖\n红胸蜂虎\n凤头麦鸡\n灰腰燕\n斑鸫\n白尾姬霸鹟\n南非丝雀\n水雉\n马来鸻\n丽绣眼鸟\n姬隐蜂鸟\n凤头主红雀\n紫蓝饰雀\n北非石鸡\n华丽八色鸫\n棕背伯劳\n白头鹎\n攀雀\n白尾美洲咬鹃\n泽鹬\n北极海鹦\n爪哇缝叶莺\n白腹鸨\n台湾棕颈钩嘴鹛\n乳色走鸻\n洪都拉斯蜂鸟\n橙胸噪鹛\n民岛角鸮\n爱琴海猫\n阿拉伯茂猫\n巴比诺猫\n孟加拉猫\n巴西短毛猫\n查达利猫\n塞浦路斯短毛猫\n德文帝王猫\n顿斯科伊猫\n狸花猫\n侏儒猫\n科恩家猫\n拉邦猫\n尼比龙猫\n欧西猫\n欧斯亚史烈斯猫\n东方双色猫\n折叠猫\n北美洲短毛猫\n罗奥斯猫\n塞尔凯克卷毛猫\n塞伦盖蒂猫\n玩具虎猫\n乌克兰勒夫科伊猫\n黄粉蝶\n领燕鸻\n粉头斑鸠\n三线闭壳龟\n红斑马\n豹纹睑虎\n摄龟\n白头叶猴\n种猪\n西班牙羱羊\n白带锯蛱蝶\n非洲猛鱼\n阿拉伯大羚羊\n枝角类\n斜线天蛾\n钟虫\n拟光腹弓背蚁\n彩蚕\n圭亚那蓝牙\n蜢蜘\n美国红雀鱼\n茶黄螨\n蓬尾浣熊\n巨型羽翅鲎\n大极乐鸟\n黑长尾雉\n中华凤头燕鸥\n梳额蜴鲶\n山地田鼠\n歌鸲\n布列塔尼犬\n蓝面蝴蝶\n棘烟管鱼\n雀鱼\n尾斑圆颌针鱼\n松江鲈\n绿步甲\n黑额山噪鹛\n黑地蜂\n朝氏圆颈天牛\n灰驯狐猴\n红唇鱼\n星椿象\n淡水龙虾\n落基山大角羊\n五彩吊鱼\n青毛鼠\n狐狸鱼\n鬟羚\n獭兔\n彩蝴蝶\n啄木雀\n藏猕猴\n金斑鱼\n小麦红蜘蛛\n雾社黑燕蝶\n工程鲫\n哀鸽\n益蝽\n花罗汉\n卷尾\n小虾虎鱼\n斑星弄蝶\n红脸地犀鸟\n春尺蠖\n湖羊\n广东潮州犬\n非洲角蝰蛇\n水陆两栖动物\n赤线虫\n倭猪\n侏蓝仙鹟\n黑枪鱼\n纳氏鹞鲼\n红月光鱼\n棋盘凤凰\n长江鲟\n珍珠龙王鲷\n尖嘴扁颌针鱼\n箭头鱼\n挺胸龟\n火焰神仙\n玄灰蝶\n邓氏鱼\n玻璃灯鱼\n蜜蚁\n蛇鲭\n夏威夷僧海豹\n绿腰鹦哥\n斑纹缨口鳅\n英国跳跃长耳猎犬\n小鹦鹉\n白尾蜻蜓\n云南大头鱼\n兔耳袋狸\n红鳟鱼\n螺\n天使长尾天蚕蛾\n黑瘤地图龟\n密星海蝠鱼\n曼赤肯猫\n迷你贵宾犬\n稻管蓟马\n狗熊\n大角鹿\n毛驴\n高背红尾金龙鱼\n大角羊\n蓝海绵珊瑚\n蓝狐\n黑寡妇蛛\n尖鼻箱鲀\n小鸺鹠\n塘鲺\n普氏原羚\n红袖蜡蝉\n黑雀\n变形虫\n钩虾\n葡萄缺角天蛾\n红嘴朱雀\n鸻鹬\n褐蓝子鱼\n霸王蝾螈\n点玄灰蝶\n雌雄嵌合体\n弧边招潮蟹\n红丽丽\n东其尼猫\n红尾副鳅\n鹅喉羚\n塘鹅\n跑山鸡\n榆蓝叶甲\n蛇蛉\n牛角龙\n沙漠蛙\n栎鹰翅天蛾\
n锤尾凤蝶\n叶形鱼\n八线腹链蛇\n葱蓟马\n黑长脚鹬\n花蜜长舌蝠\n沙蜥\n龙背种金鱼\n金壳虫\n大亭鸟\n银鲨\n红狼牙鰕虎鱼\n丁公鱼\n石金钱\n虾虎鱼\n黑背波鱼\n鲴鱼\n红脖颈槽蛇\n印头鱼\n蓝神仙\n马头鱼\n肥头鱼\n棕额长尾山雀\n黄河鳖\n黑马羚\n蝇虎\n斜斑彩灰蝶\n白顶鵖\n油松毛虫\n罗岛蓝鸠\n长腿鹬\n似鸡龙\n腹斑水蛇\n布里塔尼犬\n华鳊\n独角犀牛\n欧洲野牛\n丽纹云南鳅\n梳骨螺\n红绿金刚鹦鹉\n土拨鼠\n迷你灯鱼\n基氏细猛蚁\n中国水龙\n禄丰龙\n电蛱蝶\n川蜷\n尖尾雨燕\n小黄帽亚马逊鹦鹉\n蓝鳍金枪鱼\n碧蛾蜡蝉\n黄尾鱼\n丝足鱼\n莱芜黑山羊\n龙睛金鱼\n无恒变形虫\n铜嘴雀\n彩石鳑鲏\n夏赤蜻\n大眼鳊\n温泉蛇\n蛭\n盔龙\n吐尼尔鸽系\n褐翅燕鸥\n黑鱼\n彩虹蟾蜍\n欧洲乌鸫\n囊地鼠\n日本须鲨\n喀尔巴阡蜂\n玻璃海象\n横纹金蛛\n茧蜂\n红鳍笛鲷\n蓝额长脚地鸲\n红棕象甲虫\n鼯猴\n灰天鹅\n普罗特猎犬\n平菇厉眼蕈蚊\n电鳐\n叽咋柳莺\n鼬鼠\n红领吸蜜鹦鹉\n东亚喜鹊\n大嘴籽雀\n鳄龟\n棕头穗鹛\n蛇颈龙\n花枝鼠\n鸭嘴鲟\n山窗萤\n嘉翠蛱蝶\n小香猪\n西非短鲷\n红瓜子斑\n糙齿长吻海豚\n黑面狷羚\n狗鱼\n黑额光叶甲\n大口虾虎鱼\n赤嘴潜鸭\n黑眉柳莺\n棘鮋\n蓝冠鸽\n翱翔蓑鲉\n密纹飒弄蝶\n艾芬品\n黑尾灰蜻\n刺背球状蜘蛛\n长尾大蚕蛾\n四纹豆象\n白雉鸡\n欧洲盘羊\n大地懒\n笔尾树鼩\n短嘴金丝燕\n介壳虫\n过树蛇\n隐纹长脚胡蜂\n红脚苦恶鸟\n非洲金猫\n条石鲷\n齿缘龙虱\n皇带鱼\n鬼王螽斯\n稻象甲\n床虫\n南极冰鱼\n狗鲛\n麻田黄刺蛾\n鳀鱼\n黄金鸻\n石狩红蚁\n须红鲂鮄\n苹果全爪螨\n橙胸摄蜜鸟\n公主兔\n雉鹑\n宁都黄鸡\n叉尾卷尾\n五色锦鲤\n管鼻鯙\n赤麂\n中国虎纹捕鸟蛛\n乌鳢\n黑嘴树鸭\n蜜袋鼯\n拟态革鲀\n莱昂贝格犬\n卷叶螟\n水栖龟\n滑鼠蛇\n扭颈鸟\n丁丁猫儿\n角蟾\n黑龙江豹\n纤毛虫\n梅花龟\n黄天堂鸟\n凤头企鹅\n白肩鹊鵙\n安氏蜂鸟\n福建丽纹蛇\n灰莺\n普通长脚马蜂\n李子食心虫\n赤链华游蛇\n长尾巧织雀\n褐色树蛇\n黄蚬\n海帆蜥\n薮羚\n枪猎犬\n大鳞脂鲤\n灰鹭\n银鱼\n王锦蛇\n日本龟蜡蚧\n北极鸥\n樱桃鱼\n大头蚁\n赤胸鸫\n豹纹鼠鱼\n淡水苏眉\n点荷包鱼\n大棘烛光鱼\n安第斯火烈鸟\n德国硬毛指示猎犬\n小黄斑挵蝶\n长颈龟\n波斯鲟\n尖吻鲈\n蛾蚋\n原矛头蝮\n翡翠鸟\n蓝蛱蝶\n鞭毛虫\n扁形虫\n松茸毒蛾\n秀丽白虾\n文种金鱼\n雕鴞\n袋熊\n宝石海鳝\n科尼斯雷克斯猫\n双头五腿龟\n划蝽\n裸鼹鼠\n柳毒蛾\n曲纹紫灰蝶\n螟虫\n六点带蛱蝶\n蓝鳁鲸\n大蜥蜴\n美国银色短毛猫\n禾花鱼\n瑶山鳄蜥\n鸢\n苹掌舟蛾\n蝶蛾\n野狗\n根瘤蚜\n鸡苗\n鯮鱼\n蟛蜞\n吐鲁番斗鸡\n中间银鮈\n美洲斗牛犬\n扁刺蛾\n日鳽\n台湾大象鼻虫\n卷毛猫\n叉尾鱼\n黑枕燕鸥\n小型狮子犬\n鸭嘴螺\n芋双线天蛾\n蓝鳍鱼\n桃桑白蚧\n弹涂鱼\n墨西哥黑熊\n笋壳鱼\n圆顶珠蚌\n海南蜜蜂\n珍珠马三甲\n金披凤玫瑰鹦鹉\n红花实蝇\n澳洲皇帝蟹\n火焰龟\n红头爱情鸟\n中村锯天牛\n樟青凤蝶\n比尤伊克天鹅\n镖鲈\n鳃鱼\n小齿椰子猫\n华山松大小蠹\n黄冠亚马逊鹦鹉\n白腰雨燕\n高冠变色龙\n膨腹海马\n八色鸟\n桃花虫\n海老鼠\n披毛犀\n丝绵木金星尺蠖\n竹叶龟\n吻海马\n窃蠹\n柳圆叶甲\n星期狗\n冠海马\n赤似谷盗\n北京油鸡\n小蠹虫\n锯鳞蝰\n红额鹦鹉\n泰虎\n方正银鲫\n红肚凤凰\n双带弄蝶\n稻褐飞虱\n头条波鱼\n蜂虻\n达摩翠凤蝶\n中华虎甲\n信天翁\n帝鳄\n猪鼻龟\n斑林狸\n二尾蛱蝶\n红粉佳人鱼\n食木甲鲶鱼\n池龟\n沙特尔猫\n大雕\n沙漠之舟\n黄杨绢野螟\n虹鳟鱼\n金青鸟\n白蜘蛛\n幽灵鲨\n榕透翅毒蛾\n狸奴\n棕榈象鼻虫\n雷龙鱼\n彼斯奎氏鹦鹉\n双斑绿柳莺\n小麦吸浆虫\n北京雨燕\n向日葵螟\n棕灶鸟\n盾蝽\n沼泽山雀\n天蚕蛾\n渔鸥\n蟾福蛱蝶\n台湾山蜗牛\n荒漠沙蜥\n华夏剑凤蝶\n阿拉斯加内陆狼\n北极蛤\n鼹鼠\n黑额伯劳\n白环蚊霸鹟\n番茄蛙\n黑白兀鹫\n野生甲鱼\n熊猴\n鹞鹰\n稻螟虫\n鳄冰鱼\n太行犬\n红螺\n美国大鲵\n薮枝螅\n暗腹雪鸡\n波路豆齿蛇鳗\n黄足条蜂\n爬虫类\n稻秆蝇\n光脸鲷鱼\n斜纹天蛾\n鲌\n白斑斑鲨\n蟹守螺\n管鳗\n十一间仙\n西瓜龟\n水蛛
\n斑鹞\n海瓜子\n蒙古羊\n岩羊\n黑枕绿啄木鸟\n红点鲑\n针大蚊\n鼋\n南美短鲷\n东德牧羊犬\n长嘴半蹼鹬\n银白鱼\n多耙红钻鱼\n荷花瓦特犬\n蓝枪鱼\n圆鮀鲣\n黄尾蜜鲴\n银鸭\n红灶鸟\n长体小鳔鮈\n朝鲜少鳞鳜\n巨型蚂蚁\n五彩鳗\n栗斑鵎鵼\n德国镜鲤\n木槿蚜虫\n巧克力娃娃\n橙翅亚马逊\n沙漠地鼠龟\n三尾褐凤蝶\n沙鳢\n银屏灯鱼\n臭椿皮蛾\n台湾山椒鱼\n臭肚鱼\n非洲树蛇\n放射鼠\n斑珍蝶\n粉红鲑\n咖啡貂\n豆眼白\n麻步甲\n金鲳鱼\n红嘴海鸥\n大红蛱蝶\n勃氏新热鳚\n青头潜鸭\n欧鳗\n前肛鳗\n白斑眼蝶\n中华鳑鲏\n赛克斯长毛猎犬\n丰年虾\n叶泡蛙\n乌鬃鹅\n真鳕\n七星鳢\n琉璃石斑鱼\n南浣熊\n灰腹鹰\n麂鹿\n哈什蚂\n中华真地鳖\n古铜谷弄蝶\n红脚细腰蜂\n乌珠穆沁马\n紫红冠鹦哥\n黑龙江茴鱼\n金灯虾虎\n咖啡透翅天蛾\n香猪\n日本鳗鲡\n缟鬣狗\n澳大利亚肺鱼\n东菲比霸鹟\n迷你兔\n豹尺蛾\n白星花金龟\n好胜金蛛\n家鸽\n硬头海鲇\n石蜈蚣\n华吸鳅\n沙鵖\n毛鳞鱼\n密疣蜥虎\n黄嘴白鹭\n豹纹鲨\n黑纹颈槽蛇\n珍珠狗\n玉米粘虫\n辫子鱼\n六带鰺\n金额叶鹎\n银耳相思鸟\n大泷线鱼\n荒漠睡鼠\n薮犬\n硫磺鹀\n幸运辘蛱蝶\n四绒球金鱼\n加州秃鹫\n姬蟋\n吊死鬼\n芦莺\n基龙\n白颈长尾雉\n白翅蓝鹊\n白脸牛羚\n鼩鼱\n锦鲫鱼\n金头珍珠虎\n兔狲\n夏洛来羊\n狗鳄\n蓝银白色猫\n白腹紫椋鸟\n大龙虾\n毛魔目夜蛾\n玛伦牧羊犬\n纳米比亚沙漠测行蛇\n碎蛇\n蓝星珍珠\n花雕鱼\n腕足动物\n乌蚁鸟\n紫玫瑰凤蝶\n大连湾牡蛎\n黄额鸦雀\n南蛇\n雷氏大疣蛛\n红鳍鮊\n红箭鱼\n粉蚧\n粉白龙猫\n巨蜘蛛\n叙利亚刺尾蜥\n人面蜘蛛\n橙仙鱼\n湘云鲫\n太平洋鲑鱼\n红点颏\n金花马骝\n灌木新园蛛\n全蝎\n中华曙猿\n凤眼蝶\n玻璃红鲤鱼\n蚓腹银斑蛛\n非鲫\n加拿大马鹿\n银鲛\n苎麻珍蝶\n缝叶蚁\n乌燕鸥\n花脊游蛇\n红丽丽鱼\n小潜鸭\n橙带蓝尺蛾\n太阳鸟\n大鳞青眼鱼\n中国狸花猫\n貂鼠\n盾天蛾\n茶树假眼小绿叶蝉\n姬鹬\n鸡头\n螳蛉\n鼠鸟\n红尾珠蝶\n蹄兔\n长嘴鹬\n大山羊\n蓝宝石华丽雨林\n方水母\n三角鲤\n大枇杷螺\n丝绒吊鱼\n长牡蛎\n印鱼\n蛀虫\n黑羽狨\n巨犀类\n彩虹八色鸫\n糖果蜗牛\n白马头鱼\n柳杉毛虫\n孤沙锥\n褐背拟地鸦\n蛾螺\n绯鲤\n法系安哥拉兔\n扁玉螺\n黑翅红蝉\n玉米象\n金龟甲\n西藏翠蛱蝶\n飞蚂蚁\n欧洲鲟\n军曹鱼\n黑脸鸬鹚\n哈威那\n美洲肺鱼\n旌蛉\n锦鲤\n毒鲉\n高体鳜\n树皮螳螂\n水老鼠\n铜头蛇\n七彩吊鱼\n锦龟\n日本青鳉\n鳞喉隐蜂鸟\n真枝角鹿\n渔猫\n海月水母\n海蜷\n北锯龟甲\n花斑裸鲤\n大鳞锥颌象鼻鱼\n中华大刀螂\n白唇竹叶青\n花八哥花鹩哥\n渔游蛇\n关中驴\n浅蜊\n泰国斑马脚蜘蛛\n哥伦比亚泥龟\n绿鹦嘴鹎\n四喜鸟\n黑鳍巨脂鲤\n三眼甲虫\n灰林鵖\n柴棺龟\n大珠母贝\n领蝴蝶鱼\n猪蛙\n老板鱼\n矛头蝮\n东南亚虎\n蓝灰扁尾海蛇\n异色多纹蜻\n阿拉斯加克利凯犬\n哈尔滨白猪\n鹤鹰\n棕黑锦蛇\n黄边小太阳鹦鹉\n沼泽章鱼\n巨龟\n叶齿金绿泥蜂\n雀鲷\n棕顶王森莺\n简牙下鱵鱼\n黑伯劳\n刀鲳鱼\n家鹅\n伟格仕太阳鸟\n红珠灰蝶\n花鸡\n豹斑象龟\n鳞眼鮃\n兵蚁\n池沼公鱼\n雾姥甲虫\n织雀\n花生大蟋\n法国三色猎犬\n硬毛猎狐梗\n家猫\n宾沙犬\n橙线吊\n翼法螺\n拟黑多刺蚁\n岛子鱼\n玫瑰鲫\n拉蒂松猫蛛\n霸王角蛙\n高角羚\n白痣广翅蜡蝉\n蓝顶蓝饰雀\n黑尾袋鼠\n加州红腹蝾螈\n雪雀\n雪花豹\n日本似织螽\n砗磲贝\n家猪\n飞凤鱼\n蟹黄水晶毛虫\n大口蛇鳗\n带猴\n斑马蟹\n真骨鱼类\n绿凤蝶\n佛法僧\n齐口裂腹鱼\n长嘴啄木鸟\n豹纹海鳝\n四川凉山犬\n迷你狗\n中华鲟\n刺鲀鱼\n斑蛾\n二线琴尾鳉\n洞螈\n白斑狗鱼\n伏地魔猫\n白背飞虱\n蚁形甲\n蓝鸫\n小蜂\n灰犀鸟\n红靛颏儿\n黄锡鲷\n草蜘蛛\n菱蝗\n白马鸡\n云猫\n丽眼斑螳\n新疆岩蜥\n马面鲷\n诺氏鹬\n褐柳莺\n蝙蝠鱼\n花萤\n大马蹄蝠\n神秘螺\n长颈驼\n宽节巨首蚁\n红姬缘椿象\n日本双棘长蠹\n鼠鼬\n花曲柳窄吉丁\n阴阳蝶\n滨鹬\n滨浪鹬\n白背夜鹭\n亮柔拟步甲\n黄点黒蝉\n大千手螺\n金黄眼镜蛇\n极北蝰\n丛林猫\n猿叶虫\n蓝鹤\n红腰勺鹬\n三化螟\n西猯\n斗笠螺\n杜父鱼\n总鳍鱼\n惊讶猫\n
矛头蝮蛇\n三角锤天蛾\n黄尾巴鲢\n突灶螽\n迷你力斯兔\n萨伊蓝六间\n褐蛇\n红腹金鸡\n西帕基犬\n黑点灯\n叶蜂\n花鲢鱼\n中华剑角蝗\n白狮\n山黄鳝\n鬘螺\n日本鹰翅天蛾\n长鼻螺\n海兔螺\n交嘴雀\n玉米钻心虫\n花蜘蛛\n青蛙鱼\n黑纹伟蜓\n宅泥鱼\n金斑虎甲\n掘土蜂\n安东尼斯闪蝶\n水虿\n黑豚\n法国水犬\n鵟\n大骨顶\n森林漏斗蛛\n鱼鳍\n玉筋鱼\n银钩青凤蝶\n荔枝蒂蛀虫\n斑背燕尾\n马岛獴\n蓝仙鱼\n四川雉鹑\n青鳞鱼\n小地霸鹟\n大背天蛾\n安布闭壳龟\n真蛸\n蜥代龙\n雪鹑\n瓜实蝇\n灰胸竹鸡\n蜥龙鳄\n棱龟\n刺缘大薄翅天牛\n异齿龙\n细鳞太攀蛇\n花鳅\n玉米螟赤眼蜂\n西班牙指示犬\n鲈鲤\n橡树啄木鸟\n油葫芦\n东乡绿壳蛋鸡\n棕尾鼬狐猴\n麝鸭\n黄金鲤\n金钱龟\n雀鹛\n从江香猪\n铃蟾\n北部黑瘤地图龟\n黑线鳕\n姥鹬\n基围虾\n天螺蛳\n黄头庙龟\n舍腰蜂\n褐塘鳢\n巨头麝香龟\n混合蜓\n狼鳍鱼\n紫雷达鱼\n八线龙鱼\n棕头鸥\n海角鹦鹉\n鼠鱼\n六线鱼\n斑鬣狗\n雪鸡\n毛虫\n象拔蚌\n奥尼鱼\n青豹蛱蝶\n林鸮\n珠蝴蝶鱼\n新加坡红耳鹎\n三角鲂\n红松鼠\n大龙虱\n黑线银鲛\n红鳍鲫\n栗树鸭\n杨二尾舟蛾\n指标犬\n黑头鸭\n葱兰夜蛾\n盲椿象\n黑白关刀\n油彩粉红趾\n亮灰蝶\n白腰滨鹬\n非洲獴\n柯莫德熊\n桂花鱼\n半边鱼\n黑蚱蝉\n黄环林蛇\n美洲蓝凤蝶\n简阳大耳羊\n距翅雁\n银白蛱蝶\n巨叉深山锹甲\n金星宝螺\n印度棱背龟\n短舌鳎\n赛级犬\n赤鸽\n珍贵妩灰蝶\n混血猫\n麝香凤蝶\n玫红眉朱雀\n白牦牛\n雪蛤\n日本弓背蚁\n拉塞尔蝰蛇\n舒柏奇犬\n海蛾鱼\n九带犰狳\n细鳞鲴\n丁卡扁隆头鱼\n半月鱼\n红山椒鸟\n桑蚕\n南方波鱼\n斑大蚊\n幻紫斑蝶\n印度眼斑螳\n沧龙\n红斑美凤蝶\n黑脚企鹅\n美国树蛙\n豆荚螟\n山地麻蜥\n蚜狮\n豹纹魟\n响蜜鴷\n红嘴奎利亚雀\n赤颈鸫\n苇莺\n夏威夷蜗牛\n海鞘\n小瑞士犬\n果马蜂\n绿猫鸟\n刺鲃\n枸杞蚜虫\n瘦虾\n肺鱼\n柑桔凤蝶\n茶小绿叶蝉\n兰开夏赫勒犬\n韭菜迟眼蕈蚊\n蓝太平洋鹦鹉\n小啄木鸟\n达尔文雀\n达呼尔鼠兔\n横斑林鸮\n裂头蚴\n南方马口鱼\n红腰鹦鹉\n蜂鹰\n漏斗蜘蛛\n大蛙眼守宫\n太平洋玉筋鱼\n猩红蜻蜓\n米色龙猫\n飞海蛾鱼\n撒旦鸭嘴\n黑冠鳽\n疣鼻天鹅\n巨型毛冠鸽\n雉鸡\n和尚鹦鹉\n高原裸鲤\n龙须狮子鱼\n鳗鲡\n棉大卷叶螟\n荧光鼠\n白鲢鱼\n高原鼠兔\n灰蓝蚋莺\n双斑草雀\n丁鱥\n澳洲国王鹦鹉\n横带此鮆鲷\n鸣官鸟\n疣猪\n中国胭脂鱼\n鄂伦春猎犬\n唇鱼\n垂耳兔\n尖牙鱼\n黄河鸽子鱼\n半线波鱼\n鲐鲅鱼\n中华攀雀\n巴西流浪蜘蛛\n中红侧沟茧蜂\n鳉鱼\n台湾地蜥\n淡紫色猫\n双角老鼠宝螺\n长颌鲚\n蓝尖尾无须鳕\n水稻稻纵卷叶螟\n小白鹭\n大眼狮鲈\n长足虻\n龙睛鱼\n葫芦锹形虫\n黄唇鱼\n圣甲虫\n秘鲁蜂鸟\n赤腹松鼠\n东方菜粉蝶\n岩滨鹬\n拟鲶高原鳅\n非洲八色鸫\n疣蝗\n黑喉噪鹛\n黄腹旱獭\n家鸡\n天鹅绒虫\n玉米红蜘蛛\n山水牛\n海牛鲸\n榧螺\n大黑蚂蚁\n冬瓜蝶鱼\n西表山猫\n皇帝鱼\n蜜熊\n鬣狗\n小掩鼻风鸟\n白耳鹎\n钝翅苇莺\n黄潜蝇\n潜鸭\n绿鸦鹃\n藏獒\n红白锦鲤\n裸臀鱼\n白化花斑蛇\n粒纹大齿猛蚁\n阳彩臂金龟\n巨沙螽\n魔花螳螂\n紫鳗虾虎鱼\n麦町犬\n赤须夜蜂虎\n亚历山大群岛狼\n桨足纲\n鳚鱼\n白条鱼\n海鲋\n舒伯齐犬\n东流水牛\n黄绣眼鸟\n欧洲梭鲈\n深海鮟鱇\n甘蔗螟虫\n斑石鲷\n迷你马\n气步甲\n棘鱼类\n斜纹蝴蝶鱼\n家马蜂\n沙漏状蜘蛛\n切尔诺贝利巨鼠\n野鸽\n黄豹盛蛱蝶\n腔棘鱼\n希拉里蟾头龟\n蓝马鸡\n园蛛\n花骨鱼\n一点突额秆蝇\n玉米叶甲\n赤峰锦蛇\n白蜡蚧\n苏铁小灰蝶\n棋盘鲫\n陆龟\n红眼斑鸠\n娄费尔德折背陆龟\n星甲鱼\n中间黄颡鱼\n七彩马鞍鱼\n蓝帝提灯\n苜蓿多节天牛\n柳叶鱼\n东方泥龟\n花田鸡\n蠕鳗\n白鼻丛尾猴\n雪梨漏斗网蜘蛛\n绦虫\n耗子\n虎鲶\n玉女宝螺\n白扇蟌\n黄金鲫\n毛黄鳃金龟\n彩虹帝王灯鱼\n转基因鱼\n浅翅凤蛾\n金鲤鱼\n白头鹎雏鸟\n白天鹅\n台湾小头蛇\n竹笋象鼻虫\n印加鹦鹉短鲷\n亚非马蜂\n鲫\n豚鹿\n红尾热带鸟\n苹果剑鱼\n水泡金鱼\n野大白羊\n獐\n小黑水鸡\n粗皮鲷\n齿舌\n大麻鳽\n亚洲金猫\n雪花蛇鳝\n船丁鱼\n黄尾球跳甲\n虎绳蜘蛛\n小角锹甲\
n漠百灵\n水稻大螟\n洋葱螺\n乌头鸽\n透明鳞金鱼\n金腹小鹟\n南阳桃花水母\n海鲢\n稻水象甲\n红翅黑鹂\n四翼鸟\n暗色伞鸟\n绿颊锥尾鹦鹉\n维那斯骨螺\n草鸽\n白钩蛱蝶\n鬣羚\n夜鹦鹉\n靛颏\n德氏长尾猴\n菜蚜\n山鸺鹠\n伊犁鹅\n马奇异春蜓\n澳洲鹦鹉\n小白额雁\n红翅鵙鹛\n四川裂腹鱼\n大壁虎\n竹叶青蛇\n七夕鱼\n中华新锹甲\n十二红龙睛\n金鲶鱼\n埃及胡子鲶\n红腹食人鲳\n褶伞蜥\n阿勒泰驯鹿\n中华盗虻\n麻鳽\n金斑鸻\n似竺朴丽鱼\n狗獾\n兀鹰\n巨齿蛉\n黑燕鱼\n指示猎犬\n毛股沟臀叶甲\n多毛纲\n窄曙凤蝶\n科琪娜斗鱼\n方尾鹟\n短尾黄鼠狼\n瘿蚊\n角斑樗蚕蛾\n蓝太阳鱼\n黄帅蛱蝶\n大鱼狗\n灰地鸫\n李鸟\n豇豆钻心虫\n中国荷斯坦奶牛\n孟加拉巨蜥\n蓝翠鸟\n螯蛱蝶\n优哉闪蝶\n金化科大甲虫\n乌山羊\n海格力斯巨人巴布\n摩来彩灰蝶\n蛲虫\n斑缘豆粉蝶\n黄猄蚁\n火冠蜂鸟\n蒺藜纹脉蛱蝶\n黑山猪\n弯角大羚羊\n白尾地鸦\n观赏鱼\n非纯种猫\n丽鳾\n刺鳅\n柴鸡\n长尾鸬鹚\n长须狮子鱼\n红鲤\n苔藓螳螂\n傲白蛱蝶\n隐士蜘蛛\n黄波罗凤蝶\n美洲狮\n亚历山大女皇鸟翼凤蝶\n黑线姬鼠\n巴图迪古阿蜘蛛\n黄斑地图龟\n号半龙鱼\n最美紫蛱蝶\n栉鼠\n盾皮鱼\n林肯港鹦鹉\n鳄蜥\n沟眶象\n指角蝇\n冠鸮\n玉米旋心虫\n黄顶丝雀\n细鳞壮鳕\n金黄锥尾鹦鹉\n萨尔路斯猎狼犬\n王蛇\n桦斑蝶\n头索动物\n绮蛳螺\n血虫\n燕尾鱼\n绿毛龟\n老爷树蛙\n稠李巢蛾\n翠袖锯眼蝶\n大余鸭\n树鹨\n小灵狗\n波纹鳜\n二元母猪\n黄金龟甲虫\n雎鸠\n玉杵带蛱蝶\n石鲷\n细毛羊\n二角尘蛛\n纹腹鹰\n蜡螟\n黄珠宝螺\n杀人蜂\n丽彩鹀\n海鱼\n木胡蜂\n竹虫\n条鳎\n澳洲短颈龟\n红顶雀\n白骨鱼\n剑射鱼\n斑点池龟\n红尾金灯\n莱卡犬\n狮子头金鱼\n细雕织纹螺\n竹蝗\n北非大羚羊\n军金刚鹦鹉\n绿草蜥\n喇叭鱼\n春鲤\n红头环蛇\n漏斗形蜘蛛\n锯腹脂鲤\n中华红林蚁\n蓝眼皇冠豹\n古白鲑\n硫黄丝雀\n非洲野驴\n沼泽横颈龟\n榕小蜂\n欧洲玉米螟\n达摩麝凤蝶\n新西兰黑嘴鸥\n黄金鱼\n河虾\n蝲蛄\n短吻飞旋原海豚\n大鳞锯鳞鱼\n异育银鲫\n红胸果鸠\n白眉山鹧鸪\n竹刀鱼\n孔雀鸽\n河川沙塘鳢\n马来亚虎\n两色绿刺蛾\n黄鳍鲔\n硬毛指示格里芬犬\n鬼鮋\n斑姬地鸠\n丽鹰雕\n少女鱼\n摩门蟋蟀\n海滨灰雀\n松突圆蚧\n马岛鹦鹉\n蚜茧蜂\n斑鹿\n大锹形虫\n纹耳鹎\n蟪蛄\n彩虹鲷\n斑纹鸟\n斑点狗\n粉红胸鹨\n菜蚤子\n西番翠凤蝶\n扁吻鱼\n家鸭\n皮蠹\n黑鲫\n三线蛇\n长手鱼\n五罗鱼\n泰坦甲虫\n南美林猫\n竹象虫\n约安巨马陆\n马尾藻鱼\n台湾纹白蝶\n加南犬\n青文鱼\n翼凤蝶\n奶茶仓鼠\n泰国鳄鱼\n绿奇花金龟\n花条蛇\n美洲大赤鱿\n豆蓝丽金龟\n东雀鹰\n萧氏松茎象\n长角羚\n宠物鸟\n尾羽龙\n慈鲷科鱼\n缟蝇\n尖吻小公鱼\n巨锯锹甲\n竹夹鱼\n象甲\n蚯蚓螺\n斑马鲶鱼\n闪蓝丽大蜻\n折带黄毒蛾\n达尔文蛙\n白鲫\n狼青犬\n柯岛浣熊\n黄红眼鹦鹉\n红尾鸲\n郭公虫\n蓝背鱼\n砗蚝\n墨西哥鹿\n弯嘴鸻\n园鼠\n黑颈鸊鷉\n长翅稻蝗\n台湾牛\n小黄铃\n桃粉大尾蚜\n草履虫\n红脚毛毛虫\n红腹山雀\n须鲸\n荔枝螺\n扁鲨\n双头蛇\n小胫刺蝗\n钩尾鸟翼凤蝶\n大腹园蛛\n草原斑猫\n泥蛉\n花蟹蛛\n彩虹灯鱼\n高原岩鹨\n缘钻嘴鱼\n龙头鱼\n白兔鱼\n蠼螋\n巴西菲勒犬\n多棘倒吊\n塍鹬\n巨蝮\n乌蕯拉巴树蝰\n潘帕斯猫\n美凤蝶\n六间鱼\n紫色花蜜鸟\n德文莱克斯猫\n甲鲶鱼\n花豹鱼\n盔头蛇\n绿冠蕉鹃\n黄斑椿象\n黄斑宽套大蜓\n石爬鮡\n墨胸胡蜂\n小家鼠\n岩松鼠\n蓝头蜂鸟\n蚁蛉\n莺雀\n考艾岛洞狼蛛\n大草螳\n灰胸鹪莺\n金斑蛱蝶\n粉壳蛋鸡\n扁缘宝螺\n棘牡蛎\n巴卡雷龙\n恒河猴\n巴达库尔龟\n类人猿\n福建颈斑蛇\n巨人守宫\n宠物蚕宝宝\n斑眼食蚜蝇\n九尾鲍\n驼鸟\n红鲫\n褐马鸡\n紫点葵珊瑚\n巧克力色猫\n隆头蛛\n翘嘴红鮊\n长嘴乌鸦\n黑眉锦蛇\n虾蟹\n多趾猫\n宽带凤蝶\n巨圆臀大蜓\n台湾鬣羚\n波士顿龙虾\n搏鱼\n吸血蛾\n长尾麝凤蝶\n大银腹蛛\n白领八哥\n斑驴\n红目天蚕蛾\n花鼠\n洪堡鱿鱼\n星鳗\n蓝脚寄居蟹\n银线灰蝶\n红高头龙睛\n巴氏豆丁海马\n桃花水母\n假鳄龟\n金枣宝螺\n无尾瓢鸡\n红蛱蝶\n海绵动物\n大家鼠\n沙即鸟\n
扁担钩\n蓝锥嘴雀\n禽龙\n短须狮子鱼\n山寨熊猫\n姬透目天蚕蛾\n黄鮟鱇\n白腹幽鹛\n异色灰蜻\n鲈鳗\n特种野猪\n峨眉柳莺\n爬杈\n苏尼特双峰驼\n蓝线金灯\n马斯提夫犬\n麝鼠\n龙眼合夜蛾\n星头啄木鸟\n青带凤蝶\n紫鱼\n绿盲蝽\n秦岭虎\n丽鱼\n铠甲蝮\n葡萄牙波登可犬\n横斑腹小鸮\n鯕鳅\n针尾绿鸠\n刀鱼\n大贾丁氏鹦鹉\n感鱼\n黑狮犬\n陆鳄\n大眼斜鳞蛇\n鲮鲤\n红足穴猛蚁\n藏绵羊\n黄蝮海蛇\n玛雷玛牧羊犬\n巴西鲷\n拟矛尾虾虎鱼\n倭岩羊\n吸血鬼鹿\n奎利亚雀\n绿巨螳螂\n灯颊鲷\n啄花鸟\n火山兔\n金湖乌凤鸡\n棕臀凤鹛\n布履阑珊猫\n猫鼬\n黄尾白毒蛾\n七鳃鳗\n周氏啮小蜂\n鹪莺\n摇蚊幼虫\n矛纹草鹛\n毛貘\n海子水牛\n托佩克种猪\n澳洲鲭\n猫鸟\n长鳓\n招财猫鱼\n金蛉子\n安南龟\n烟青虫\n沙鯭鱼\n高山岭雀\n长趾蛙\n虎鲶 虎鲶\n天堂极乐鸟\n玫红领绿鹦鹉\n家蚕蛾\n红尾鲢\n褐镰翅冠雉\n火腹蟾蜍\n巨海燕\n圆鳍鱼\n维希拉猎犬\n黑头鸥\n联纹小叶春蜓\n眼镜鳄\n哥斯达黎加斑马脚蜘蛛\n斯马蜂\n晰蜴\n扁蜗牛\n鸽子鱼\n水晶虾\n负鼠\n罗非鱼\n青蛾蜡蝉\n敏麻蜥\n大野牛\n蛹期\n线纹蜻蜓\n黑褐色浣熊猎犬\n狭鳕\n小嘲鸫\n皮堡斯\n枯叶蟾蜍\n绿蝎\n飞蜥蜴\n红海星\n熊狸\n黑孔雀\n塞加羚羊\n巴拉金梭鱼\n高鼻羚羊\n玻璃猫\n三道鳞\n鼠妇\n树粉蝶\n蝉脱\n深海龙鱼\n星子鱼\n九斤黄\n金鲫鱼\n鬼子斗鸡\n小眉眼蝶\n白眉丝刺莺\n八带鱼\n褐头鸫\n巨林猪\n库车沙蜥\n红水泡眼\n红狮头金鱼\n臭屁虫\n短尾琉金鱼\n鸣角鸮\n花蚤\n斑灵狸\n尖头鱼\n金背蟾蜍\n菊天牛\n小鹪鹩\n绿缘扁角叶甲\n意大利蜂\n小红鹳\n比利时马\n梧桐鸟\n刺鲀\n塞特猎犬\n蔷薇三节叶蜂\n鱇康良白鱼\n水虱\n孔蛛\n矢尖蚧\n日本红珊瑚\n棕雨燕\n阿拉斯加狭鳕\n小夜鹰\n白鞘嘴鸥\n丝带凤蝶\n雪獒\n建鲤\n小蓝翠鸟\n乌冉克羊\n巨人甲虫\n风头麦鸡\n滩头雅罗鱼\n火焰吊\n美洲鳗鲡\n沙棘木蠹蛾\n麝猫\n柑橘凤蝶\n红斑翠蛱蝶\n大羚羊\n写鲤锦鲤\n眼镜仙\n玻璃鱼\n红玫瑰毛蜘蛛\n中华秋沙鸭\n珍珠狗头鱼\n和平鸽\n谷仓猫头鹰\n台湾鲷\n双瘤槽缝叩甲\n卡申夫鬼美人凤蝶\n樱桃鸡\n普通蛙\n地栖蜘蛛\n纳马雨蛙\n七彩神仙\n小佛塔芋螺\n横纹虎鹭\n蚕蛾\n高白鲑\n蝽象\n麂子\n青黄枯叶蛾\n山楂红蜘蛛\n褐林鸮\n金线灯鱼\n奇异盗蛛\n光明女神闪蝶\n蛇鹫\n豆粉蝶\n社鼠\n马士提夫犬\n棕色隐遁蛛\n林雕\n变种鲤\n低泡飞鼠\n谷米螺\n古铜色卷尾\n马雷马牧羊犬\n红尾鹲\n彩鹑\n马来雕鸮\n土公蛇\n克拉特猫\n悦目金蛛\n广翅蜡蝉\n细粒蝾螺\n带蛾\n石蝶鱼\n择长翅尺蛾\n海百合\n曾氏兔银鲛\n美蜘蛛\n藏马鸡\n原龟\n香蕉象虫\n海蜥蜴\n法螺\n德国绒毛犬\n笙珊瑚\n女王凤凰螺\n淡水鲨鱼\n啮龟\n柑桔木虱\n圆口纲\n亚马逊伞鸟\n竹象\n周斑水虻\n灰斑羚\n大穿山甲\n夜蛾\n黄腹角雉\n丑鸭\n黄山短尾猴\n食人动物\n靴雕\n费氏穿草鸫\n德氏大羚羊\n天蚕\n枣实蝇\n丛林八哥\n红薄荷神仙鱼\n始暴龙\n姬赤星椿象\n原角龙\n超级麦皮虫\n黄板鲫\n高原鼢鼠\n日本蚱\n青山羊\n高鳍鹦哥鱼\n姜弄蝶\n黏液鳗\n雨燕鹦鹉\n美国斗牛犬\n伊朗巨鼠\n彭泽鲫\n旗腹姬蜂\n雪鲷\n白腹姬鹟\n牛奶蛙\n白眉翡翠\n米奇鱼\n红嘴巨鸥\n鹮嘴鹬\n北风鸟\n蛇皮鱼\n甘薯天蛾\n帝啄木鸟\n细鳞鱼\n剑吻海蛇\n茶杯犬\n绵羊猪\n维兹拉犬\n双垂鹤驼\n云花斑裸胸鯙\n鱵\n黑嘴鸟\n牙鲆鱼\n长鬃山羊\n青魔鱼\n澳洲蓝面神仙\n白眉鵐\n斑节对虾\n大王虎甲\n黑顶吸蜜鹦鹉\n孔雀蛾\n枯叶蛱蝶\n透明短吻狮子鱼\n猎猴鸟\n树虱\n秀蛱蝶\n黄金蛙\n漠鵖\n双斑萤叶甲\n加州鲈鱼\n鹿狗\n红眼蝉\n间色宝螺\n丁蛎\n栗翅斑伞鸟\n非洲渡鸦\n刺豚\n真鳄龟\n白面水鸡\n白喉卷尾猴\n刚果灯鱼\n恩特布山地犬\n白黇鹿\n北京鸭\n吸血鬼鱼\n红戟虾\n素叶螳螂\n白眼河燕\n芝麻螺\n闪电王子\n巨鲈\n小团扇春蜓\n杨扇舟蛾\n草海龙\n鹿角虫\n红颈绿鸠\n褐美狐猴\n千手螺\n芬氏花仙螺\n南方豆天蛾\n棕顶雀鹀\n花蝇\n雪山宝螺\n暇虎鱼\n大鳞结鱼\n绿莲灯鱼\n无尾凤蝶\n瓜绢螟\n柳蝙蛾\n刺鲶\n加拉帕戈斯企鹅\n绿弄蝶\n加勒比尖背角鲨\n云南松毛虫\n涡蛛\n维多利亚鲈鱼\n玉带蜻\n
罗威士梗\n大丽灯蛾\n越南大麂\n鳞头大鳄龟\n蓝燕雀\n短尾粗吻海龙\n日本猫\n黄腹鼬\n刁子\n香蛇\n太平洋刺狮子鱼\n长角蜂\n红喉蜂鸟\n树白蚁\n斑马贝\n沟齿鼠\n斑点鹦鹉\n龙宫翁戎螺\n蓝色虎斑重点色猫\n黑玉翅鸽\n蝴蝶鱼\n食肉军蚁\n巨蛇\n黄绿游蛇\n巨陶锹甲\n沼泽兔\n金银鳞锦鲤\n冠羽画眉\n巴西涡螺\n獴\n冰川黑熊\n珍珠鳞金鱼\n毛皮兽\n桃小食心虫\n红九棘鲈\n白眉长臂猿\n长尾蓝灰蝶\n竹直锥大象虫\n台湾美凤蝶\n中华乌塘鳢\n重唇鱼\n金环胡蜂\n水熊\n白秃猴\n无鳞鱼\n眼纹广翅蜡蝉\n橙颊梅花雀\n麻羊\n牙签鸟\n黑白蝴蝶\n似鲶高原鳅\n小豹蛱蝶\n火狐狸鱼\n蝽蟓\n绿蛙\n胖虎猫\n盱眙小龙虾\n笠头螈\n吼猴\n纹面弹簧蜥\n藏酋猴\n巴西巨人金直间\n猫头鹰环蝶\n暗蓝异花金龟\n单角鼻鱼\n白带螯蛱蝶\n大团扇春蜓\n杯斑小蟌\n扭法螺\n丽色噪鹛\n黑脚蚂蚁蜘蛛\n桃红鹦鹉\n拟旖斑蝶\n鲯鳅\n虹彩吸蜜鹦鹉族\n巨斧燕子\n肉兔\n斑羚\n家隅蛛\n翱翔飞鱼\n芒部锦鲤\n金花鼠\n棘茄鱼\n黄冠鹎\n黄腹莺\n改良肉驴\n豆象\n加利克瑟犬\n对虾\n白眉蝮\n白眶蛇\n雪达犬\n狭腹灰蜻\n卷心菜毛虫\n小鳍脚企鹅\n钝吻棒花鱼\n蓝脸叶鹎\n红鲷鱼\n钩翅眼蛱蝶\n红点蟾蜍\n屎屁虫\n长吻银鲛\n蛇尾纲\n黑鹂\n白金黄金锦鲤\n蓝凤冠鸠\n郝波特犬\n帚吼猴\n噪鹛\n食蜥王龙\n野马\n负泥虫\n蜘蛛蜂\n香醇雁\n灰飞虱\n宠物鼠\n虎纹麝香龟\n花鼠鱼\n草蝉\n小旗鱼\n鹦鹉嘴龙\n豹点七彩\n紫颊太阳鸟\n黑喉山鹪莺\n蜘蛛螺\n红九纹龙锦鲤\n河猪\n箭蚁\n金线鲃\n黑翅蝉\n巴西河豚\n白腹鹭\n白丝鱼\n芒鼠\n黄雕鸮\n海蜘蛛\n鳖\n阔嘴鹬\n美国野牛\n蔗鼠\n红喉鹧鸪\n沙半鸡\n巴西游走蛛\n彩裙鱼\n红极乐鸟\n狍子\n圣文生亚马逊鹦鹉\n兜虫\n君主绢蝶\n青花鱼\n琉球松鸦\n棕腹蓝仙鹟\n橙眼白猫\n白金鲤鱼\n长鳍斑竹鲨\n高体鰤\n亮大蜗牛\n黄翅蜻\n科氏倭狐猴\n大头鳕\n南亚虎\n网纹龟\n长丝鲈\n尖头银鱼\n长脚沙漠蚂蚁\n大鸊鷉\n魟\n眼斑龟\n针毛鼠\n蚧壳虫\n厚唇凤凰螺\n山田大蚊\n小型荷兰水猎犬\n巴西夜猴\n甜甜圈龟\n黑边公子小丑\n红宝石金刚鹦鹉\n辉腹翠蜂鸟\n拟高脚蛛\n高加索山羊\n云斑尖塘鳢\n尼罗非鲫\n蓑蛾\n财神鱼\n长尾鳕\n非洲大蜗牛\n鼷鹿\n大兀鹰\n球蛛\n钱鼠\n雷公山牛王\n加勒比海红鹳\n大嘴麻鳽\n玻璃炮弹\n银杏大蚕蛾\n巴西巨人金毛\n鳓鱼\n麒麟猫\n德国向导猎犬\n红毛猩猩\n白孔雀\n东玫瑰鹦鹉\n史毕诺犬\n蓝圆鲹\n小极乐鸟\n鞍背鸦\n嘲鸫\n裂腹鱼\n箱鲀\n金丝鲶\n僵蚕\n日本鳗\n斑蝇\n鹫\n大螳螂\n无毒蛇\n透明斑马鱼\n大头鲤\n乌鹟\n爱尔兰鹿\n梨桃小食心虫\n非洲白背兀鹫\n鹨\n蜂鸟鹰蛾\n蜥鵟\n红斑鱼\n纹猫\n旋角羚\n桃蛀螟\n埃及艳后鱼\n沼泽田鼠\n中国少鳞鳜\n裸胸鳝\n石鸫\n石鸻\n班鸠\n三线短鲷\n小雕鸮\n雪羊\n野象\n黄斑黑蟋蟀\n瞎疙瘩\n黑寡妇蜻蜓\n小飞象章鱼\n马士提夫獒犬\n黑纹片角叶蝉\n短毛猫\n大白鼠\n安灰蝶\n中介蝮\n龙种金鱼\n葡萄昼天蛾\n斑啄木鸟\n露尾甲\n彩虹锹甲\n白花蛇\n鹤类\n扁虫\n黑隆头蛛\n巴西梗犬\n流浪汉蜘蛛\n玉斑美凤蝶\n淡水观赏鱼\n蜘蛛猴\n雪虎\n赤鲑\n热带真海豚\n帝王鲷\n绿眉鸭\n北美负鼠\n普度鹿\n红壁蛙\n斗羊\n锥螺\n滨鼠\n宽尾琉金\n鸭嘴鱼\n薮猫\n曦和绢蝶\n周氏闭壳龟\n白额高脚蛛\n琉璃草蝉\n黑尾红月光\n青步甲\n古菱齿象\n高加索蜜蜂\n青鳉鱼\n花鲷\n巨暹罗鲤\n麝鹿\n非洲侏儒鳄\n塞鲸\n撅嘴鲢\n双须重唇鱼\n胆形织纹螺\n黄河象\n凤头阿鹨儿\n鸟蛤\n后鳍锯鳐\n水字蜘蛛螺\n棘头梅童鱼\n七带犰狳\n两爬动物\n军配虫\n海龙鱼\n眼斑龙虾\n白腹锦鸡苗\n美洲狗鱼\n蓝尾翠凤蝶\n苏雀\n双斑大蟋\n巨蛇颈龟\n南部锦龟\n德系獭兔\n白头蝰\n红白金鲫\n极北柳莺\n长牙青蛙\n花羔红点鲑\n新西兰岸鸻\n大眼金枪鱼\n菜花原矛头蝮\n佛可宝螺\n棉花蚜虫\n大蜜蜂\n鲸类动物\n平腹小蜂\n赤眼蜂\n黄腰胡蜂\n肥螈\n斑皇鸠\n穴兔\n野山羊\n银狨\n北豚尾猴\n卢卡斯梗犬\n妞妞鱼\n哈威那犬\n土鳖虫\n彩色水母\n澳意蜜蜂\n生态甲鱼\n合欢木虱\n菜蛾\n厚唇光唇鱼\n斑脸海番鸭\n二点委夜蛾\n红缘灯蛾\n乌鸦凤蝶\n普通田鼠\n隆林黄牛\n果实
蝇\n小豆长喙天蛾\n绿巨嘴鸟\n金环宝螺\n基菱背螳\n三须公\n东北鳈\n南非剑羚\n犰狳\n粗鳞鮻\n华丽巨蚊\n数字蛱蝶\n沂蒙全蝎\n鳎目鱼\n渡边氏东方蜡蝉\n黄蜂鱼\n海蝎子\n黑鮟鱇\n水稻负泥虫\n卷贝\n柿绵蚧\n小口裂腹鱼\n新西兰秧鸡\n大颚细锯脂鲤\n棘星螺\n侏儒猪\n柳鱼\n黑腹滨鹬\n瑞士狐\n松鼠猴\n美国恶霸犬\n鬼脸天蛾\n英国马士提夫犬\n曲纹黛眼蝶\n台湾鲷鱼\n贾丁氏鹦鹉\n藏狮\n斑青花金龟\n红嘴山雀\n上户蜘蛛\n灰鼠\n巴马油鱼\n瑞典拉普杭犬\n灰树雀\n印度鸬鹚\n环颈斑鸠\n巨型鱿鱼\n撒坝猪\n图鲁兹鹅\n棘鳅\n高产奶牛\n半蓝魔鱼\n大赤旋螺\n松石七彩\n花文鱼\n叶海龙\n魟鱼\n普通鸮\n婆罗洲巨人橙边\n苹果实蝇\n紫林鸽\n澳洲仙\n勇斧螳\n蝼步甲\n狼鱼\n薄翅蝉\n笔螺\n竹针鱼\n金黄阿奎登牛\n丽绿刺蛾\n灰冕鹤\n黑胸歌鸲\n非洲侧颈龟\n七彩蓝眼鳉\n车轮螺\n海猫\n食蟹海豹\n墨龙睛蝶尾鱼\n豚海豹\n褐雨燕\n白鹮\n西班牙跳猎犬\n雪蛙\n栗耳鹀\n毛法螺\n伪鳖\n红石斑鱼\n犰狳环尾蜥\n王极乐鸟\n黑足熊蜂\n图卢兹鹅\n蜘蛛蝇\n棕腹鹰鹃\n孟加拉国豆娘鱼\n淡黄蝶\n动胸龟\n北京大蜓\n赤翅长颈金花虫\n肥胖园蛛\n黑鳍蛇鲭\n黑背心小丑\n巴西彩虹蟒\n篱螺或笠螺\n侧裸蜣螂\n食鱼蝮\n小卷蛾\n玳瑁猫\n短尾鸦雀\n深山扁锹形虫\n跳鼠\n盲鱼\n宏凯棘冠螺\n白甲乌鳢\n小黄赤蜻\n宠物猪\n啮齿类\n四川黑鲤\n青鲷\n富平奶山羊\n犁沟木纹龟\n上树鱼\n网蝽\n虻鲉\n白鱼\n歌山雀\n美蝴蝶鱼\n东非侧颈龟\n烟灰蛸\n真马\n海水神仙鱼\n核桃扁叶甲\n中稻缘蝽\n稻眉眼蝶\n合趾猿\n天蛾\n迷你垂耳兔\n南美珊瑚蛇\n变色蛇\n大理弓鱼\n短尾绿鹊\n绿茸线蛇\n琉球歌鸲\n穹翠凤蝶\n小黑斑凤蝶\n月光螺\n食蝠鸢\n鯷鱼\n苔蛾\n山字宽盾蝽\n温室白粉虱\n异鳞蛇鲭\n毛螳\n银大眼鲳\n四角山羊\n红眼鱼\n阿根廷龙\n大鼻鸽\n红腿叫鹤\n黑颈长脚鹬\n长吻鱼\n短面熊\n蓝面神仙鱼\n小鸊鷉\n地蛆\n佩尔什马\n伪死人头蟑螂\n北露脊海豚\n阿比西尼亚蓝猫\n苹果红蜘蛛\n蓝眼灯鱼\n刻克罗普斯蚕蛾\n蛀螟\n食蚊鱼\n蓝点钝口螈\n大豆食心虫\n仙女鱼\n黑森林猎犬\n蓝带翠鸟\n斯里兰卡凤头鹰\n波纹眼蛱蝶\n灰纹鹟\n大圆鄂针鱼\n中华彩丽金龟\n褐头鹪莺\n牛皮蝇\n鹠\n天堂鸟\n环喉雀\n榆绿毛萤叶甲\n石宾光唇鱼\n圣诞树珊瑚\n彩裳蜻蜓\n冠恐鸟\n海天使\n耐寒龟\n红蚂蚁\n三角鲤鱼\n长刀鱼\n台湾草蜥\n甲蝇\n赤尾噪鹛\n蓝孔雀\n大甲鰺\n日本短毛猫\n花笠螺\n大仙鹟\n深山锹形虫\n菱鲷\n斑纹泥龟\n中华九刺鱼\n黑斑原鮡\n尼姑鸽\n白鲟\n狮虎兽\n红线波鱼\n月光蝶鱼\n蛆虫\n高山天牛\n黑斑鲫\n密鳞牡蛎\n丁鲷\n黑龙江满洲龙\n南极狼\n美国银虎斑猫\n灰鹰\n商城黑猪\n云南秃角蝉\n野桑蚕\n巨型蜻蜓\n十姊妹\n长鬣蜥\n多鳍鱼\n牧迪犬\n斗米虫\n铲齿象\n红颊獴\n碎斑青凤蝶\n绿树蜥\n鲷鱼\n蓝蝴蝶鱼\n中国黑白花奶牛\n赤石斑鱼\n指狐猴\n暗棕鵟\n肯氏龙\n山坑螺\n黑带二尾舟蛾\n双锯鱼\n青玉粗尾蝎\n绿壳蛋鸡\n英国马士提夫\n星螺\n皮金龟\n凤头䴙䴘\n美丽蚁蛛\n美洲斑潜蝇\n加泰隆牧羊犬\n黑菌虫\n帕特大勒梗犬\n灰喉针尾雨燕\n梭鲈\n鹿角花金龟\n火兔灯鱼\n水虎鱼\n绿蓑鸽\n山溪鲵\n短翅船鸭\n墨脱竹叶青蛇\n鲃鲤\n田鱼\n臭椿沟眶蟓\n胡蝉\n细鼷鹿\n棉蝗\n欧洲泽龟\n冰清绢蝶\n扁颅蝠\n帝王紫蛱蝶\n潮汕蝾螈\n白斑翅拟蜡嘴雀\n角额壁蜂\n科罗澳拟蟾\n星文鸟\n赤颈鵟\n搜救犬\n多异瓢虫\n珠星雅罗鱼\n眼镜食叶猴\n烟草甲\n海南八哥\n金纹细蛾\n河狐\n扇尾斗鱼\n扶桑绵粉蚧\n曲纹袖弄蝶\n双环凤蝶\n泥东风螺\n印度猫\n太平洋岩鱼\n白角星弄蝶\n紫光箩纹蛾\n美洲金翅雀\n紫蛙\n涡虫\n婆欧里鸟\n锦鲫\n摇鹊鸲\n凤头燕鸥\n大牙土锯天牛\n牧羊犬\n大嘴苇莺\n金腰燕\n白唇龟\n小巴丹鹦鹉\n瓦灰鸽\n河鲈\n金翅\n水蚤\n白圈三线蝶\n珍珠蚌\n金条鱼\n孔雀鲤\n獐鹿\n赖蛤蟆\n大蜻蜓\n三突花蛛\n海虾\n浅黄锦鲤\n林鸱\n蠹虫\n圣赫勒拿蠼螋\n绒冠蓝鸦\n金带美法螺\n星点石斑鱼\n白肌银鱼\n小盗龙\n俄罗斯黑梗\n美国王鸽\n松阿扁叶蜂\n小羚羊\n刺尾鱼\n黄翅灰蜻\n美洲凤头山雀\n燕尾凤蝶\n海百合纲\n巢鼠\n红尾鱼\n真鲷
\n黄颡\n两栖龟\n黑粉虫\n象拔\n棕尾褐鹟\n柑桔小实蝇\n蓝豆娘\n中华哲水蚤\n小型猪\n黄金灯鱼\n冠翠鸟\n草鞋蚧\n拟鳄龟\n红角鸮\n耳斑神仙\n韭蛆\n珍珠龙\n福建钝头蛇\n紫寿带鸟\n俄罗斯鲟\n中环蛱蝶\n非洲野猫\n纹喉凤鹛\n长尾鸮\n膨跗毛蚊\n珠螺\n侧颈龟\n白翅叶蝉\n黄斑园蛛\n斑鸽\n细手指珊瑚\n棕色大熊猫\n高砂熊蝉\n东非低地大猩猩\n皮球鱼\n点目大蚕蛾\n锯线天蛾\n哈萨克猎犬\n榆绵蚜\n水龙兽\n黄兔尾鼠\n中国水蛇\n鼩\n紫蜻蜓\n麻田金龟子\n东亚狼\n竖琴螺\n玉带凤蝶\n斑须蝽\n鹩鹛\n希氏蟾龟\n黄胸田鸡\n鳞鲤\n旅鸫\n白顶信天翁\n莓鲈\n黑叶猴\n金鲫\n茶壶鱼\n窄斑凤尾蛱蝶\n俄罗斯短毛猫\n溪蟹\n黄金欧洲陆龟\n鵟雕\n桃红颈天牛\n珍达鹦鹉\n黄帆一点贝\n金刚虎鱼\n多齿暴龙\n彩雀\n斑海鲶\n黑龙睛\n谷盗\n钝吻鮠\n尖嘴鸟\n艾氏拟水龟\n黑丛尾猴\n乌头驴\n潭门砗磲\n大林姬鼠\n松毛虫\n泰国鲫\n龙纹短鲷\n台湾大虎头蜂\n松叶锦鲤\n双齿多刺蚁\n东方鲀\n台湾双尾燕蝶\n暗色凤头鹟\n落叶松球蚜\n马鲭鱼\n黑兰寿\n小犰狳\n西伯利亚鲟\n史氏中喙鲸\n裸滨鼠\n和牛\n犀金龟\n8\n梨蚜\n叫鸭\n斑点叉尾回\n漏斗网蜘蛛\n苏丹盾甲蜥\n小鸥\n麝牛\n红琉金金鱼\n鸡龟\n红林翡翠\n红苹果鱼\n鹤顶红金鱼\n黄尾蓝魔\n西部锦龟\n大眼红鲌\n高加索野牛\n东非角巴布\n木匠蜂\n埃姆登鹅\n瓜黑斑瓢虫\n顿河马\n五彩吊\n斑鳜\n观赏虾\n美洲钢铁蓝蜘蛛\n黄缘蛱蝶\n黑尾鹬\n锈凹螺\n丝丁鱼\n大绿叶鹎\n眼镜鸮\n角色卷蛾\n刚毛太仓小猎犬\n琥珀闪蝶\n躄鱼\n牛头鸭嘴\n骚扰阿蚊\n角蝰\n白唇泥龟\n红带袖蝶\n狮王斗鱼\n多腿鸡\n猎豹\n中国昆仑山脉犬\n翘嘴鮊\n皖南龙\n多鳞铲颌鱼\n平颌鱲\n乌苏里貉\n红狮牧羊犬\n剑齿象\n铅笔鱼\n琉璃小灰蝶\n家茸天牛\n刺花螳螂\n真鲷鱼\n红绿橙\n埃及螳螂\n石蚌\n大扁头蟋\n无齿鲳\n日本锦蛇\n粗灰蜻\n秦川牛\n啄花雀\n棕三趾鹑\n长鳍金枪鱼\n梭鱼\n栗背林鸲\n银叶猴\n大猿叶甲\n重庆原矛头蝮\n古骆驼\n红羽极乐鸟\n中华婪步甲\n草金鱼\n阿文绶螺\n迷蛱蝶\n麂羚\n鸮鹦鹉\n幻紫斑蛱蝶\n潜水钟蜘蛛\n黑巨口鱼\n黑夜猴\n陕西卫矛\n点鱼\n大红鱼\n黄胸鼠\n矮岩羊\n叶蝉\n全州禾花鱼\n老鼠鱼\n黑浮鸥\n虎灰蝶\n红白鹦鹉\n古巴雀鳝\n薮猪\n远古蜈蚣虫\n咖灰蝶\n诸城暴龙\n珀翠蛱蝶\n细鳞鳑鲏\n飞龙粉蝶\n谷蠹\n绿毛乌龟\n斑鱧\n黄树巨蜥\n斜绿天蛾\n珠履带蛱蝶\n纹凤蝶\n黄翼蓝顶亚马逊鹦鹉\n蜥鳄\n绿蛱蝶\n柑橘木虱\n图库曼鸺鹠\n中华鳖\n蒙山金蝉\n香蕉交脉蚜\n虎狮兽\n印度鳄龙\n海鲂\n蓝色猫\n木纹龟\n蛀木虫\n牙签鱼\n日本七鳃鳗\n鹭鸟\n高体型异育银鲫\n乔氏猫\n珍珠虹鱼\n小天使翠凤蝶\n印度紫蛙\n红衣凤头鸟\n豆眼黑鸽\n彩蝶\n松巴皱盔犀鸟\n圆澳龟\n栗斑腹鹀\n巴里坤马\n大足黑山羊\n黑顶山雀\n金矛头蝮\n平毛寻回犬\n大山雀\n长尾鼬\n丝尾鳠\n非洲隼雕\n白针狮子鱼\n诸罗树蛙\n鲂鮄\n缩骨花鲢鱼\n大豆胞囊线虫\n大紫蛱蝶\n小魔花螳螂\n黄边胡蜂\n非洲屋蛇\n中华板齿犀\n柑桔潜叶蛾\n波蛱蝶\n竹鼠\n红色猫\n中华忍冬圆尾蚜\n德国莱克斯猫\n彩艳吉丁虫\n伊洛瓦底江豚\n北美凤头卡拉鹰\n环海豹\n非洲鹿\n梭子鱼\n西北非洲猎豹\n蓝蜻蛉\n黑苇鳽\n白耳狨\n巨型沙螽\n白腹长尾猴\n紫吊\n石决明\n土巴龙\n黄喉柳莺\n帆鳉\n广西笔尾灰犬\n菜叶蜂\n彗尾蛾\n菠萝鱼\n山椒鱼\n三角灯鱼\n烟蚜\n鹰隼\n阿彭策尔山犬\n鹊鹞\n黑银板\n鳎\n白耳画眉\n烟台黑猪\n镶黄蜾蠃\n五道黑\n烟粉虱\n白尾角马\n九间贝\n华贵折衷鹦鹉\n木虱子\n加那利犬\n鸭绿江面条鱼\n骨鳞鱼\n单猎犬\n荔枝蝽\n麝鼹\n油蝉\n阿图瓦犬\n大眼鱼龙\n烟实夜蛾\n黄脚鸥\n灰噪鸦\n比格\n红胸吸汁啄木鸟\n老挝岩鼠\n刺鱼\n鱼鸦\n伯恩山犬\n中华虎凤蝶\n长吻海蛇\n拟蚺蛇\n纨裤麝凤蝶\n亲亲按摩鱼\n玄猫\n锯缘龟\n眼虫\n黑眉鸦雀\n鲟鳇鱼\n褐鱼鸮\n马鲛鱼\n中华棘鳅\n蒙古鼠兔\n拟椋鸟\n青脚麻鸡\n维多凤冠鸠\n彩蛱蝶\n四腮鲈鱼\n红衫鱼\n剑龙类\n灰乌鸦\n松十二齿小蠹\n东方雀鹰\n黑弄蝶\n鸟嘴斜带天蛾\n鲮鱼\n眼纹斑叩甲\n瞪羚\n熊狗\n爪哇豹\n雉\n奶牛猫\n宝兴歌鸫\n亚
马逊泥龟\n铜色腰蜂鸟\n马来食螺龟\n猫眼蝾螺\n小凤头鹦鹉\n侧点桨鳍丽鱼\n尖音库蚊\n山孔雀雉\n摇蚊\n犬齿牛尾鱼\n南滑蜥\n豆螺\n葡萄透翅蛾\n大伞弄蝶\n翡翠凤凰鱼\n烟黑色波斯猫\n圣多美鸽\n大帆倒鲷\n小地老虎\n黑胸胡蜂\n花豹\n台湾鹎\n西班牙狼\n珊瑚虫\n木蠹蛾\n玉米毛蚁\n大豆蚜虫\n鸡鱼\n暮眼蝶\n平头岭鳅\n茎蜂\n蚁狮\n英国波音达\n石墻蝶\n狗蚤\n奶牛豹鲸\n澳洲麦鸡\n翘嘴鲌\n粉红双锯鱼\n塔螺\n大蚕蛾\n中国癞象\n鬣鳞蜥\n红松石\n及达尖犁头鳐\n黑喉辉蜂鸟\n长白山蚂蚁\n玉米田棉铃虫\n红白鲤鱼\n珍珠虎\n缨冠蜂鸟\n裸蛛甲\n青山蜗牛\n大眼海鲢\n家畜绵羊\n海鲫\n童蟒\n中国蜂\n垃圾鱼\n锯犁鳗\n华艳色蟌\n罗福梗\n褐黄前锹甲\n大星步甲\n乌鸽\n日本松干蚧\n马岛鸭\n榆绿天蛾\n蛤蚧\n鼠鹿\n西貒\n叉牙指鱼\n双斑长脚蜂\n散纹盛蛱蝶\n亚洲大蚂蚁\n花粉蝶\n四眼鱼\n凤头苍鹰\n素饰蛱蝶\n缝叶莺\n豆虫\n多眼灰蝶\n青苔鼠\n夜光虫\n虎纹伯劳\n黄眉柳莺\n浅缝骨螺\n哈萨克犬\n琴蛙\n白条锦蛇\n细腰褐泥壶蜂\n扁豆小灰蝶\n桑氏锯脂鲤\n亚马逊鲇鱼\n树丛浣熊猎犬\n报喜斑粉蝶\n牛獒\n湖鲟\n高身银板鱼\n白龟\n大二尾蛱蝶\n哈瓦那卷毛狮子犬\n福寿螺\n琴鱼\n大灰象甲\n蚬蝶\n钟鹊\n山鹧鸪\n黑胸钩嘴鸢\n家鼠\n红尼罗鱼\n滇池金线鲃\n隐形大熊宝螺\n扇砗磲\n比例猎犬\n三湖鱼\n哥法地鼠龟\n澳洲长颈龟\n虎斑猫\n白面粗尾猿\n蛾蠓\n蚤蝼\n鹰头鹦鹉\n泰坦大天牛\n新西兰红嘴鸥\n蓝虎\n星点水龟\n姬鹟\n红头虾\n马蜂\n红白锦鲤鱼\n鲱形白鲑\n斑胁水鸡\n天竺鲷\n硬毛鼠\n五彩金刚鹦鹉\n栗背短翅鸫\n白乌鸦\n太阳猫\n扁口鱼\n石扒子\n中臀拟鲿\n茶斑蛾\n后鳄\n山斑鱼\n欧斑鸠\n台湾噪鹛\n东方田鼠\n改良肉牛\n中华蝾螈\n乌线雀鹛\n雉形鸦鹃\n烛光鱼\n苦恶鸟\n鲑冠凤头鹦鹉\n高地猫\n鸡冠蛇\n黄河小鲶鱼\n噪鹃\n蜂蝇\n红娘鱼\n小左鲆\n明蛤\n帝王亚马逊鹦鹉\n草鸟\n中华枯叶蝶\n观赏鸽\n花斑刺鳃鮨\n德国钢毛指示犬\n冬眠\n小黄家蚁\n巨型上户蜘蛛\n费里拉马蹄螺\n粉红圈锯背龟\n秘鲁鹈鹕\n白带燕凤蝶\n无翅亚纲\n林山雀\n狞猫\n短毛太仓小猎犬\n特纳兹铅笔鱼\n黑眉长尾山雀\n红姑鱼\n阉鸡\n紫崖燕\n吸石鱼\n眼斑螳螂\n孔雀鲷\n巨腿螳\n雕齿兽\n斑鳍飞鱼\n金目鲷\n夹竹桃天蛾\n箭毒青蛙\n斑鳟\n潘氏闭壳龟\n金带花鲭\n仙企鹅\n日本锦鲤\n吕宋蜻蜓\n大叶黄杨斑蛾\n双棘长蠹\n窄斑翠凤蝶\n德国猎梗\n歌篱莺\n戴褐臂金龟\n蛾\n刀鳅鱼\n卡斯特罗拉博雷罗犬\n长尾鵟\n斑马神仙鱼\n中华草龟\n印度豹\n肿腿蜂\n北极拟圆鳍鱼\n残锷线蛱蝶\n蓝眼贝\n别光锦鲤\n地图龟\n红腹啄木鸟\n戈登赛特犬\n鹑螺\n飞虎\n东非冕鹤\n钻石哨兵\n栗褐鹃鸠\n加拉帕戈斯地雀\n波鱼\n日本犬\n台湾黄蝶\n绿鸠\n楔尾丽椋鸟\n中国大锹\n泰国虎鱼\n树大蚕蛾\n大乌龟\n白尾蓝鸦\n巴西黑\n叶结鱼\n额河银鲫\n白蛤\n五间鱼\n云南巴狗\n鹤驼\n花伞软珊瑚\n扭弯布纹螺\n四耳猫\n相思带蛱蝶\n铜翅水雉\n后毒牙蛇\n白文鸟\n短尾突吻鳗\n鲷\n山烙铁头蛇\n穆蛱蝶\n棘冠海星\n黎明闪蝶\n奥锹甲\n紫胶蚧\n麻料\n蚝\n僧帽猴\n元鱼\n海黑鱼\n大西洋黄鱼\n绿颊亚马逊鹦鹉\n蹼鹬\n角原矛头蝮\n日行守宫\n双全白环蛇\n爱尔兰萨特狗\n土灰笋螺\n九间鱼\n钻心虫\n黄足黑守瓜\n维几尼亚负鼠\n米蛾\n长嘴鸟\n凤头鸊鷉\n蓝翅叶鹎\n尖鼻蝰蛇\n非洲慈鲷\n花纹细螯蟹\n黄金鲤鱼\n大凤蝶\n巨嘴柳莺\n卷象\n海南马蜂\n毒蜘蛛\n乌鱼\n卑怯管叶虫\n青铜金龟\n蝎泽龟\n沙漠角蝰\n台湾宽尾凤蝶\n海鲂鱼\n云南裂腹鱼\n石龙子\n白腹鹞\n细蛾\n长湖银鱼\n小青虫\n洋燕\n棕长脚蜂\n短鳍澳洲鳗鲡\n大眼真鲨\n樱桃谷鸭\n马岛缟狸\n蛴螬虫\n古巴钩嘴鸢\n林鼠\n负子蜍\n睡鼠\n扁尾海蛇\n帝王伟蜓\n巨蛤\n棕足鼯鼠\n白颈乌鸦\n豹鲂鮄鱼\n双钩异翅长蠹\n蓝喉拟啄木鸟\n雉鸠\n黑色蝇虎\n中国鹦鹉嘴龙\n灰斑鸠\n大吻虾虎\n异粉颜蚜蝇\n翡翠贻贝\n鸊鷉\n兔鼠\n反舌鸟\n茸毒蛾\n尖嘴地雀\n红珊瑚\n狐獴\n狸猫\n黑腹燕鸥\n气蛤蟆\n菜虫象\n无翅昆虫\n小粉蝶\n水稻灰飞虱\n细尾鹩莺\n红嘴蓝尾喜鹊\n蓝斑马\n蜻蜓稚虫\n月季长管蚜\n拟龟壳花\
n水字螺\n直纹蜘蛱蝶\n无齿芙蓉龙\n加州海狮\n德国锦鲤\n蛇鳗\n红鮋\n绿篱莺\n花角罗汉鱼\n红翅皱膝蝗\n大绿金刚鹦鹉\n草花蛇\n维多利亚肺鱼\n漂白蜻蜓\n黄三角吊\n沟鼠\n马尾斗鱼\n旅鼠\n白嘴潜鸟\n陆鬣蜥\n帝王三间鱼\n大海鲢\n蒙古寒蝉\n虹银汉鱼\n东非水字螺\n铜头蝮\n红颊鹦哥\n水稻福寿螺\n无眼蜘蛛\n灵缇拉萨犬\n淡水鳗\n收割蚁\n黑帝王魟\n黑辉极乐鸟\n非洲虎鱼\n蓝翠蛛\n管猫\n副狮子鱼\n鹭鹤\n少鳞裂腹鱼\n透明鱼\n纵纹锦鱼\n无眼金线鲃\n棉蚜刺茧蜂\n鞭蝴蝶鱼\n大猫\n天蓬鱼\n世界上最丑的鹦鹉\n小白鼠\n安娜图牧羊犬\n蛇王孔雀鱼\n石首鮰鱼\n雪貂\n太阳龟\n澳大利亚红背蜘蛛\n鱼蛉\n万能梗\n木虫\n悬蛹\n荨麻蛱蝶\n秋赤蜻\n果核泥龟\n黄刺鱼\n鳗鳞\n红背蜘蛛\n腊肠猫\n凤鲚\n波斑鹭\n凤头鹧鸪\n三线舌鳎\n四须盘鮈\n棕污斑螳\n麦穗鱼\n裸鲤\n星斑川鲽\n白蛉\n德州条纹木蝎\n团团圆圆\n麝香龟\n非洲冕雕\n高加索羊\n黑窜鸟\n黑丽翅蜻\n钉螺\n南洋大兜虫\n黑端豹斑蝶\n黑白鹰雕\n凹尖腹蜂\n菜螟\n宽带鹿花金龟\n黑狼犬\n巨型海蜘蛛\n菲律宾穿山甲\n水游蛇\n蓝鳍剃刀鱼\n乌信天翁\n白尾鸢\n火焰仙\n长江间银鱼\n蟋螽\n黑尾胡蜂\n奶蛇\n巨松鼠\n长春鳊\n蒙面神仙\n斑胁火尾雀\n胎生蜥蜴\n大鹮\n高山兀鹫\n蓝牛羚\n布龙度蝎子\n火鸭\n疙瘩宝螺\n柞蚕\n跳虫\n银带鳚杜父鱼\n角菱背网蝽\n细胸金针虫\n剥皮鱼\n黄金鲷\n克拉克星鸦\n肉用大雁\n松瘤象\n蜡蚧\n红肛凤头鹦鹉\n丁氏蝴蝶鱼\n蜜罐蚁\n金灯鱼\n北美灰熊\n高山蛙\n飞蚁\n白毒蛾\n新西兰信天翁\n拟八哥\n导聋犬\n白腹蓝姬鹟\n白乌鱼\n戴笠鸽\n鹤嘴鱼\n灰文鸟\n扁鼠\n蓝嘴神仙鱼\n橙腹草原田鼠\n蜘蛛甲\n日本地龟\n龟板\n食蜗步甲\n大鼠耳蝠\n双脊龙\n棘角蛇纹春蜓\n牙克煞龙\n孔雀巨蜥\n金苔鼠\n食蟹狐\n鬼獒\n吞噬鳗\n小黄蛉\n阿法六线风鸟\n马哈美凤蝶\n白腹鹦哥\n鳙鱼头\n金钱鮸\n布里牧羊犬\n新西兰白兔\n法国八达鸽\n短鼻六间鱼\n香港蝾螈\n太湖新银鱼\n新疆细毛羊\n虎鼬\n勃氏蝴蝶鱼\n彩虹蜂虎\n德国熊犬\n梨星毛虫\n黑鼠\n云斑裸颊虾虎鱼\n梅白鱼\n鲀鱼\n造纸胡蜂\n白头鹀\n云斑鹦鹉\n茜草白腰天蛾\n日本金鱼\n笑隼\n北美鼠兔\n榆绿叶甲\n兰寿金鱼\n红头噪鹛\n史奇派克犬\n清水石斑鱼\n龟甲宝螺\n山羚\n惊天兽\n蝎子鱼\n稻雀\n粉点虾虎\n唾蛇\n白胫长大蚊\n纠结清白招潮蟹\n绿雀鹎\n猫蛱蝶\n山地鸡\n灰斑鸻\n电鳗\n舞虻\n蒙古野驴\n康氏矮海豚\n白毛小灰蝶\n德国黄胡蜂\n草蜥\n天使尾天蚕蛾\n单环刺螠\n青海沙蜥\n五莲黑猪\n长吻鮠\n小丑蛙\n叶吻银鲛\n红宽带蝴蝶\n野蚕\n蛇龟\n鲁北白山羊\n版纳蛙\n尖头塘鳢\n英国玳瑁猫\n侏狨\n光端缺翅虎甲\n地鳖\n长尾妍蜂鸟\n幽灵蛛\n七彩鲑\n黄鸲鹟\n西昌高山黑猪\n短吻间银鱼\n日本大鲵\n钻石虾\n金猫\n白斑鸠\n蹬倒山\n费氏灰色牡丹鹦鹉\n黄金豹纹狗头\n花甲螺\n野蚕蛾\n黑尾神仙\n方蟹\n象鼻龟\n黄鼠\n马蹄螺\n圆仔\n似鱤\n哥威斯犬\n马岛潜鸭\n火焰神仙鱼\n拉氏清溪蟹\n树蛇\n艾伦水虎鱼\n长颈羚\n路亚黑鱼\n仙琴蛙\n琥珀蚕\n短翅蝙蝠\n东方角鸮\n切斯特白猪\n屁步甲\n月光蝶\n东部菱斑响尾蛇\n兰花虫\n孔夫子锯锹\n金光鲫\n黑树巨蜥\n黄铜翠蛱蝶\n咕咕鸟\n鱇鱼良白鱼\n黒靴陆龟\n紫红蜻蜓\n迁粉蝶\n黄长脚蜂\n青山雀\n油鱼\n高山短翅莺\n宽翅树蟋\n阿穆尔隼\n肯龙\n龟金花虫\n刺魟鱼\n棒刺大头蚁\n彩虹巨嘴鸟\n麻鹬\n喜马拉雅塔尔羊\n穹宇萤\n红帽金鱼\n软珊瑚\n波纹小灰蝶\n西班牙舞娘\n绿背金鸠\n拉格多尔猫\n亚洲长纺器蛛\n小苇鳽\n山鵟\n卷尾猴\n沟齿鼩\n尖喙蛇\n中华蜜蜂\n土猫\n中华狼青\n军犬\n大地鸲\n纹蓝小蜻\n长臂金龟\n东方雪人\n巨型木虱\n大春蜓\n乳白啄木鸟\n罗氏帆鳍鳗\n暗黑鳃金龟\n双尾虫\n棕泥壶蜂\n灰头群织雀\n鼠猴\n花豹盛蛱蝶\n柳刺皮瘿螨\n刺豚鼠\n裸眼小凤头鹦鹉\n加岛花鹿\n竹红天牛\n大嘴鲸\n水虻\n黄脚虎头蜂\n鸭嘴鲨\n黄蚂蚁\n灰冠鹤\n林鸫\n非洲巨蛙\n黑长喙天蛾\n箱豚\n南非林羚\n泰国虾\n白头秃鹫\n北京黑猪\n阔臂螳螂\n细腰蜂\n蓝极乐鸟\n粉脚雁\n巨蜥\n泥蜂\n美洲森林野牛\n大连香螺\n巴西黄金蛙\n尖翅翠
蛱蝶\n绿眼蛙\n绿带翠凤蝶\n野杜鹃\n葡萄虎天牛\n新疆漠虎\n欧鸬鹚\n北美蓝鸟\n喜庆亚马逊鹦鹉\n高脚蟹\n欧洲松貂\n刀背麝香龟\n凤头雀嘴鹎\n剑水蚤\n吸蜜鸟\n绒毛猴\n淡水石斑鱼\n黑青小蜂\n沼蝇\n折衷鹦鹉\n台湾猴\n易碎双腔龙\n普通黄胡蜂\n非洲鬣狗\n中国地鼠\n暗色水鸡\n欧洲缅甸猫\n青绿顶亚马逊鹦鹉\n黏虫\n石头鱼\n斑粉蝶\n阿波罗娟蝶\n金鹦鹉\n华尾天蚕蛾\n九香虫\n欧鼹\n黄蜂蝇\n绿豆象\n绿拟啄木鸟\n古代长须牧羊犬\n褐麻鳽\n屎壳郎\n剑羚\n美洲草原野牛\n细纹斑马\n拟地甲\n无霸勾蜓\n塔蓝图拉毒蛛\n玫瑰葵花鹦鹉\n麝香鹿\n天眼猫\n黄裳眼蛱蝶\n幽灵箭毒蛙\n和马蜂\n墨西哥火膝\n扇尾鸽\n山山牛\n桃潜叶蛾\n黄裙凤蝶\n统帅青凤蝶\n印度舌鳎\n白喉冠鹎\n鸲蝗莺\n黄河鲤\n草鳊\n布鲁芬氏金刚鹦鹉\n艾蛛\n长毛天牛\n巨石斑鱼\n中华露螽\n红面具锥尾鹦鹉\n气泡珊瑚\n黑冠猕猴\n热带鱼蓝鲨\n乌母鱼\n澳洲折衷鹦鹉\n卵圆蝎蝽\n非洲劲蜂\n古巴蟑螂\n鹿豚\n绿鹦鹉\n厚颌鲂\n秋刀鱼\n宽纹北箭蜓\n草原鸡鵟\n东方果实蝇\n云南蝗\n桃舌蜥\n青竹鱼\n湿地苇莺\n笠螺\n黄帆天堂鸟\n黄藤鸟\n七腿花蜘蛛\n棒花鱼\n江鳕\n鳞翅目昆虫\n欧亚红尾鸲\n赤星瓢虫\n毛虾\n犊牛\n仙女虾\n髭海豹\n蒙链荫眼蝶\n土熊蜂\n猪肉绦虫\n麦二叉蚜\n虬眉带蛱蝶\n小绿鸠\n鳝鱼\n海蛳螺\n佛鳄\n叶甲\n象海豹\n壁蜑螺\n角红蟹蛛\n蛹化\n沼泽侧颈龟\n穴小鸮\n金头缝叶莺\n玩赏犬\n红玉颈绿啄木鸟\n眼鳢\n黄金神仙鱼\n茴香虫\n非洲牛箱头蛙\n泰坦蟒\n獾\n古巴亚马逊鹦鹉\n断板龟\n书虱\n鲟\n疑步甲\n哥斯达黎加老虎尾\n云南盘鮈\n灰泥甲虫\n锹甲\n圆口铜鱼\n文昌鱼\n欧洲鳇\n三角蟹蛛\n龟背天牛\n野兔子\n健壮刺蛾寄蝇\n美洲野牛\n五间半\n沙蚤\n大胡蜂\n麂\n波多黎各亚马逊鹦鹉\n九纹龙锦鲤\n吹绵蚧\n黄胸泥壶蜂\n大明斯特兰犬\n双冠龙\n蓝九间\n乳黄色猫\n普通朱雀\n树蜂\n食人鲳\n中华红蚁\n彩色球形珍珠金鱼\n巴厘虎\n翠鸟大山雀\n枯叶螳螂\n侏儒眼镜猴\n法国蜗牛\n巴拉望巨扁锹形虫\n方尾鸢\n蝰鱼\n牙獐\n沟牙鼯鼠\n矛蚁\n猎蝽\n冰岛犬\n小吻虾虎鱼\n红梢子\n菲律宾铠甲蝮\n黑啄木鸟\n琉璃蛱蝶\n脊青步甲\n招财鱼\n阿岛信天翁\n雀鳝类\n大白凤头鹦鹉\n锯翅天蛾\n巨扁锹甲\n山蓝鸭\n宝珈枪吻海龙\n秦岭四宝\n喋喋吸蜜鹦鹉\n食蜗龟\n褐家鼠\n琉金金鱼\n码绢金龟\n岩鼠\n鳞鱼\n红疣猴\n红艳天牛\n褐隼\n普鲁斯鳄\n火缘步甲\n加拿大臭鼬\n柿蒂虫\n翠青蛇\n白尾金喉蜂鸟\n巨山龟\n双带鰺\n丝蝴蝶鱼\n黄芦蜂\n白尾灰蜻\n褐红钝猛蚁\n雀鳝\n小鳄鱼\n暗背雨燕\n海洋鱼类\n蓝金花虫\n贝加马斯卡犬\n广场鸽\n银鲫\n花金龟\n南极小须鲸\n乌塘鳢\n稀有白甲鱼\n七彩鱼神仙鱼\n圣多美绿鸽\n小猎兔狗\n蓝喉仙鹟\n秘鲁马驼鹿\n杨桃鸟羽蛾\n白甲鱼\n北极七鳃鳗\n风头鹰\n巨型哲罗鲑\n欧歌鸫\n阳江鹅\n美洲角雕\n开角龙\n小火蚁\n列日鸽\n相模湾玉螺\n刀锹形虫\n夜光猫\n瓜斑潜蝇\n小丑灯鱼\n巴拿马金蛙\n公鱼\n靴脚陆龟\n姬啄木鸟\n白头鸽\n红黄拟啄木鸟\n荷包鱼\n蓝艳金花虫\n松江鲈鱼\n苏里南负子蟾\n稻飞虱\n米诺鱼\n火焰贝\n沙塘鳢\n爬蚱\n六斑曲缘蜻\n四川柳莺\n飞蛾\n流星泽龟\n篮子鱼\n蛙眼守宫\n白金龙鱼\n透明蛙\n喜马拉雅蜜蜂\n长尾鸡\n麦蛾\n七彩燕鱼\n红带新鹿蛾\n隐鹮\n粉蠹\n日本梗\n海王龙\n达乌尔黄鼠\n百色闭壳龟\n蛇岛蝮\n柳根鱼\n跳甲\n快盗龙\n意大利硬毛指示犬\n条尾鸢魟\n黄纹三锥象鼻虫\n紫薇绒蚧\n蓝凤蝶\n迷你金刚鹦鹉\n鸡泡\n北美知更鸟\n长颈鹿锯锹\n彩臂金龟\n臆羚\n台湾温剑水蚤\n潜水艇鱼\n新西兰大蜥蜴\n黄鹀\n珍珠大帆吊鱼\n喜蛛\n有刺胞动物\n棉红蜘蛛\n普通夜鹰\n飞狐\n双角鬼面蛛\n金钱鱼\n花生蚜\n西班牙山羊\n宽尾凤蝶\n红里螺\n白点叉鼻鲀\n巨脉蜻蜓\n红领绿鹦鹉\n雀斜纹天蛾\n黑缘红瓢虫\n白长角羚\n莹斑篮子鱼\n大赤鼯鼠\n山形金鱼\n红铃虫\n鱼鸥\n山椒鸟\n羽蛾\n淮阳驴\n扁卷螺\n蜥脚龙\n南方皇家信天翁\n阎甲\n翠叶红颈凤蝶\n山獐\n北美山雀\n彩色水泡\n正鲣\n非洲水牛\n鼠兔\n冠企鹅\n蝎蛉\n蓝颊鹦嘴鱼\n白狭扇蟌\n红岩鹨\n帝王灯鱼\n白斑羚\n恐龙鱼\n中国绿刺
蛾\n管口鱼\n海蛎子\n铜点花金龟\n弓斑东方鲀\n狐貉\n小螳螂\n泰国虎纹蛙\n长鳍箭口鱼\n白须公\n长嘴苇莺\n野生娃娃鱼\n加州兀鹫\n毛骨鱼\n鼢鼠\n白老鼠\n多宝鱼\n黑鹮\n弓背蚁\n泡桐细毛蝽\n印度巨松鼠\n云南弓鱼\n透顶单脉色蟌\n北美红雀\n兔宠物\n粉灰鸽\n女巫骨螺\n斑节鳉\n鬣蜥\n檗黄粉蝶\n平鲷\n非洲疣猪\n钟螺\n黑蜂蚜蝇\n鹤顶粉蝶\n贝尔普施六丝极乐鸟\n西藏毛腿沙鸡\n孔雀鸡\n霜天蛾\n深海鱼\n棕色野兔\n灶马蟋\n奇虾\n剑齿虎\n红嘴牛椋鸟\n中华龟\n长江胭脂鱼\n花边星齿蛉\n中华田园猫\n佛罗里达咬龟\n蒙古细狗\n红大马哈鱼\n黑葵花鹦鹉\n灰脚柳莺\n玉兔螺\n斑马吊鱼\n鳌花\n灌丛鸦\n赤鲈\n招财龟\n卡尔特猫\n珍珠罗汉\n蓝倒吊鱼\n火星蚂蚁\n蛙蛙鱼\n倒立鱼\n小灰蝶\n印度野牛\n长脚赤蛙\n中华马蜂\n黑俄罗斯梗\n墨珍珠龙睛\n赛布斯奥长耳犬\n野鲮\n樟翠尺蛾\n贴树皮\n台湾竹叶青蛇\n丽纹蛇\n墨西哥泥龟\n黎氏青凤蝶\n隐纹花松鼠\n大蓝蝶\n棕腹仙鹟\n长毛垂耳西班牙犬\n白眉鸫\n赤条蜂\n隐鼠\n湘鲫\n斑潜蝇\n冰蚕\n锯鳐\n浙江双栉蝠蛾\n栗苇鳽\n跳兔\n袋貂\n裂海豚\n西藏牧羊犬\n墨瑞鳕\n柑橘大实蝇\n加州鹰\n尖吻鲟\n憨鲣鸟\n螟黄抱缘姬蜂\n斜纹炮弹\n北非猎犬\n米特雷锥尾鹦鹉\n黄鳍金枪鱼\n东星斑\n万宝螺\n中华倒刺鲃\n长须黄颡鱼\n粗点哑蟋\n白头缅蝰\n眼斑螳\n爬猴\n花翅大蚊\n缨翅\n黄蛱蝶\n鬼艳锹形虫\n胡瓜鱼\n独眼猫\n沼泽猴\n马塔贝勒蚁\n角箱鲀\n蚁后\n高跷鹬\n山羌\n棕榈凤头鹦鹉\n青花龙鱼\n埃及蜜蜂\n阿克巴士犬\n黑守瓜\n金草鱼\n角鸮\n巨锯陶锹甲\n鳗鲶\n电鲇\n赤蛙螺\n素弄蝶\n山叫驴\n角香螺\n朴喙蝶\n橙粉蝶\n红黄金刚鹦鹉\n金盾龟金花虫\n米林肥腹蛛\n肥熊宝螺\n皇家企鹅\n越南刺鳑鲏\n圆形棘腹蛛\n岩原鲤\n鳑鮍儿\n双吻前口蝠鲼\n四盘耳乌贼\n食火鸟\n玉米螟\n平原斑马\n黄腿鸽\n栗鸮\n孔雀苗\n蝴蝶蜥\n泽鹿\n宽咽鱼\n玉米金针虫\n多型蓝凤蝶\n克氏海马鱼\n凤蝶毛虫\n纤粉蝶\n高腰翁戎螺\n金十字灯鱼\n青桐木虱\n雨伞巴丹鹦鹉\n蛾子\n斑凤蝶\n乌苏里拟鲿\n黑林鸽\n南美浣熊\n小渔雕\n罗曼粉壳蛋鸡\n丝光椋鸟\n鲩\n英格兰大猫\n小线角木蠹蛾\n槐尺蠖\n圆斑硬象鼻虫\n午仔鱼\n桑树桑毛虫\n鹊色鹂\n棉红蝽\n蓝月轮鹦鹉\n杨雪毒蛾\n莱茵鹅\n喷火鱼\n长嘴秧鸡\n普京虎\n草履蚧\n无眼希特勒\n粉色海豚\n夏洛伊牧羊犬\n短嘴鹬\n黄褐天幕毛虫\n纹腹叉鼻鲀\n龟壳攀鲈\n细纹狮子鱼\n富春江鲥鱼\n黑白关刀鱼\n大耳果蝠\n红珠凤蝶\n赤鼻棱鳀\n黑尾原鸡\n黄丝雀\n啮虫\n蒙古獒\n大口康吉鳗\n翠蓝斑凤蝶\n薮猫獛\n蒙古红鲌\n灯鱼\n侧斑兵鲶\n红闪电\n黑灰蝶\n绿罗花金龟\n翼龙\n果鸠\n忘忧尾蛱蝶\n黄额盒龟\n锯缘圆龟\n丽齿兽\n大铅笔鱼\n石蛾\n悬铃木方翅网蝽\n蝶尾\n长痣绿蜓\n斯洛伐克库瓦克犬\n眼鱼\n育肥肉牛犊\n扁锹形虫\n广西林蛇\n骆马\n九角龙鱼\n稻纵卷叶螟\n骨唇黄河鱼\n豇豆螟\n东方宽吻海豚\n后肛鱼\n黄泥龟\n袋獾\n碧翠凤蝶\n女皇凤凰螺\n黄纹丽龙虱\n秋白鲑\n剑尾鱼\n德国宾莎犬\n耳鲍\n猿面天蛾\n黑食人鱼\n桶眼鱼\n黄翅鹦哥\n荷兰奶牛\n明窗蛱蝶\n线翎电鳗\n圆蛤\n紫长尾蜂鸟\n大紫艳金花虫\n亚辟猴\n三色蝎\n满洲龟\n时鳞鱼\n蓝枕绿雀\n初雪宝螺\n狭颅田鼠\n刺桐姬小蜂\n粒突箱鲀\n星洲红鱼\n萨克森极乐鸟\n巨皇鸠\n太阳闪蝶\n朱领锡嘴雀\n靴隼雕\n马岗鹅\n大眼松鼠\n山甲珠\n长尾蜂鸟\n步行虫\n多彩仙\n梗类犬\n黑鹞\n细尾獴\n复齿鼯鼠\n虫草蝙蝠蛾\n章公鱼\n竹象鼻虫\n猩红蛇\n香蕉弄蝶\n背点棘赤刀鱼\n英格兰牧羊犬\n短尾信天翁\n蛇鳕\n锯谷盗\n冠麻鸭\n坦鲷\n红嘴黑鹎\n白弄蝶\n酢浆灰蝶\n鹳嘴翠鸟\n姬独角仙\n蜾蠃\n西西里猎犬\n鲬\n拟黑多刺蚂蚁\n白腹珍珠鸡\n苍羚\n羊羔\n蓖麻蚕\n岩鹨\n马尾松毛虫\n樱桃灯鱼\n长耳跳鼠\n短尾蝮蛇\n网丝蛱蝶\n非洲岬水牛\n重点色英短\n畜牧犬\n南疆黄羊\n耳鼠\n鼯猫\n匈牙利维兹拉犬\n大耳沙蜥\n黄鳍刺尾鲷\n松岗象大蚊\n人科动物\n小锥螺\n短棘银鲈\n潜叶蝇\n黄襟蛱蝶\n蚁蛛\n红腹拟鹩哥\n鹳\n钻穴鱼\n臭蜣螂\n角鹰\n南极鱼\n家庭短毛子猫\n象龟\n翻车鱼\n卡尼鄂拉蜜蜂\n金钱豹
\n蚜灰蝶\n柳珊瑚\n日本蜘蛛蟹\n驼\n蝾螺\n褐头鸥\n葱须鳞蛾\n海鲇\n食人蚁\n斑点木蝶\n沼泽野蜓\n美国青蛙\n长鳍真鮰\n乌羊\n豹蠹蛾\n花斑鱼\n河麂\n暗色斑纹海豚\n红背啄木鸟\n花龙虾\n红肚火口鱼\n日本长尾蜓\n黑头绣眼鸟\n金胸丽椋鸟\n小绿叶蝉\n鲑\n剃刀龟\n美人虾\n锯缘摄龟\n青根鱼\n黄獒\n丽金龟\n菊花鹦鹉\n刺盖鱼\n粗皮姬蛙\n鲭\n黑尾叶蝉\n蛋龟\n蓝环章鱼\n广斧螳螂\n纯蓝仙鹟\n海洋爬行动物\n织纹螺\n田鹬\n法老蚁\n炸弹鱼\n黑熊犬\n花冠皱盔犀鸟\n巨猪\n海马鱼\n巨型非洲食人鱼\n伯克氏鹦鹉\n褐鹰\n中华广肩步甲\n红腹滨鹬\n金头仙\n紫沙蛇\n黄脚渔鸮\n新西兰鸠\n白羽鸡\n黑头剑蛇\n针嘴鱼\n大蓝蜻蜓\n黑大蜜蜂\n红鳍东方鲀\n重点色长毛猫\n美神闪蝶\n丝毛狗\n小口脂鲤\n白翅蜡蝉\n翘嘴鱼\n褐鸦雀\n茶杯猫\n长毛对虾\n白粉虱\n松雀鹰\n纯色鹪莺\n虎斑犬\n棕背麻雀\n船嘴鹭\n沼泽鹧鸪\n亚速尔牧牛犬\n牛角鱼\n老豹蛱蝶\n小麂\n赤褐灰蜻\n青狼獒\n扁嘴海雀\n犏牛\n步甲虫\n硬鳞鱼\n辉斑唐加拉雀\n大王具足虫\n黑小蜜蜂\n纯种猫\n银石鲈\n茶尺蠖\n球鼠妇\n稻瘿蚊\n雷州黄牛\n雅灰蝶\n俄罗斯牧羊犬\n鰕虎鱼\n中华奥锹甲\n浓紫彩灰蝶\n闪绿宽腹蜻\n紫石房蛤\n亚东鱼\n喜马拉雅悬崖蜂\n白尾麦鸡\n黑蚁\n歌鹰\n镜子鱼\n新月锦鱼\n红鲫鱼\n美姝凤蝶\n粗深海鳐\n巨型远古昆虫\n黑皂鸽\n墨吉对虾\n大青蝗\n青衣鱼\n美洲鸬鹚\n华西树蟾\n普派恩旅鸫\n鹰爪虾\n牛头伯劳\n二条叶甲\n黑色金龟子\n兰坪乌骨绵羊\n哥伦比亚红脚蜘蛛\n灰燕\n方格星虫\n美洲龙鱼\n海龙类\n稻螟\n雪猪\n蒙丹鸽\n灰胸薮鹛\n柳尖胸沫蝉\n日本石龟\n兰屿壁虎\n蛇目褐蚬蝶\n佛罗里达拟鳄龟\n臭鼬\n桃脸情侣鹦鹉\n黑红阔嘴鸟\n棕顶鹧鸪\n澳大利亚箱形水母\n魔鬼砗磲\n美国刚毛猫\n硬壳虫\n大嘴鸟\n红松石七彩\n小嘴鱼\n蝠鲼\n鸥嘴噪鸥\n普通燕鸥\n大军舰鸟\n蔷薇叶蜂\n黄脚胡蜂\n埋葬虫\n壁蜂\n小眼鱼\n长钻光鱼\n地懒\n黄颡鱼\n毛角豆芫菁\n斑林鸮\n微独角仙\n大木蜂\n江黄颡鱼\n蛇鮈\n五彩神仙鱼\n黄豚\n鸺鹠\n产奶鼠\n油彩粉红脚蜘蛛\n中华斑鱼蛉\n咖啡木蠹蛾\n串串狗\n跳跃蟑螂\n五条鰤\n苹果螺\n扁船蛸\n小翠鸟\n黄尖襟粉蝶\n伯格马斯科犬\n虱子\n雄蜂\n镏金金鱼\n金头蜈蚣\n短吻鳄龟\n棕鼯鼠\n山蜂\n灰头麻雀\n柑橘红蜘蛛\n阴阳燕子灯鱼\n细嘴鸥\n湘西黄牛\n海鬼鱼\n洋辣子\n噪犀鸟\n圃鹀\n栗背皇鸠\n丽眦蜥\n大目鲭\n大正三色锦鲤\n澳洲咸水鳄\n斑尾塍鹬\n南方牛\n中华地鳖\n斗篷蜥\n斑点鹰\n白化球蟒\n新月带蛱蝶\n中国野兔\n中国犀牛\n爪哇野牛\n汉氏金刚鹦鹉\n金边龙虱\n红尾炮弹鱼\n东部刺鳖\n珍珠鱼\n豆天蛾\n台湾小鼯鼠\n黄眼鸽\n钩巨蟹蛛\n阿波罗绢蝶\n斑马驴\n酷拉龙\n喜马拉雅旱獭\n东西伯利亚莱卡犬\n深蓝吸蜜鹦鹉\n喜玛拉雅白头蛇\n刺河豚\n大理裂腹鱼\n泰坦炮弹\n鮟鱇鱼\n青川犬\n颇赛克猎犬\n拟大腹园蛛\n中华大扁锹\n蓝子鱼\n三角蜻蜓\n鲥鱼\n线尾燕\n茶杯猪\n骏马\n鼎突多刺蚁\n金色曼蛙\n黄蝶\n大西洋鳕鱼\n雏鸡\n蛇皮\n棘穗软珊瑚\n思茅松毛虫\n柑橘类大实蝇\n灰鸭\n虹鳟\n岩羚羊\n青蜂\n死亡蝮蛇\n锈额斑翅鹛\n赤旋螺\n黑头丝雀\n七彩神仙鱼\n翁戎螺\n黑眉拟啄木鸟\n红獒\n玉斑凤蝶\n特蓝斑马\n翻飞鸽\n枣尺蠖\n马达加斯加长尾灵猫\n白颊雀鹰\n花园鳗\n白鳞鱼\n鼠蛇\n褐耳鹰\n龙凤锦鲤\n鹦哥鲨\n小雨蛙\n喜鹊蛋\n大口黑鲈\n北美旅鸽\n黑帽悬猴\n金针虫\n毛里求斯粉鸽\n金多犬\n东部玫瑰鹦鹉\n萨塞克斯猎犬\n碧凤蝶\n蓝舌石龙子\n哲罗鲑鱼\n睇暮眼蝶\n小鹿瞪羚\n小斑姬鹟\n送奶鸟\n小雅鹃\n裂唇鱼\n油桐尺蠖\n东风螺\n黄金吊\n鳞烟管鱼\n红冠雀\n舟蛾\n田野小猎犬\n黑草鹩莺\n紫红笛鲷\n麋羚\n竹长蠹\n海底珊瑚\n黑头山雀\n白顶鸽\n水鹨\n巽他山椒鸟\n非洲草原野兔\n刷尾负鼠\n罗氏琴尾鱼\n左旋香螺\n驼背大麻哈鱼\n白面鼯鼠\n客纹凤蝶\n铁爪鹀\n卡巴金马\n若虫\n鲑鳟鱼\n安格鲁貂\n貂猫\n脆肉皖\n塔形马蹄螺\n草原扑翅鴷\n阿坝蜜蜂\n怪眼白色猫\n根粉蚧\n黄金猛鱼\n鲹\n圃拟鹂\n伪虎鲸\n白翅鳍脚企鹅\n乳牛\n极北蝰蛇\n木蜂天蛾\n侏儒猫头鹰\n下司犬\n白色蟹
蛛\n太平洋鳗鱼\n紫吊鱼\n马达加斯加日守宫\n蝙蝠蛾\n黑带食蚜蝇\n蚜小蜂\n日落蛾\n榆毒蛾\n黄猩猩\n非洲猎鹰\n玉米蓟马\n沟金针虫\n蓝眼白猫\n红弄蝶\n山原猫\n夏威夷绿雀\n海南睑虎\n南江黄羊\n非动物\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport ast\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"mobilenet_v2_animals\",\n    type=\"CV/image_classification\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\n    \"Mobilenet_V2 is an image classification model; this module is trained with Baidu's self-built animals dataset.\",\n    version=\"1.1.0\")\nclass MobileNetV2Animals:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = 
self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_animals/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport cv2\n\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indices = np.argsort(result_i)[::-1][0:top_k]\n        for index in indices:\n            label = label_list[index]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
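The `postprocess` helper above pairs a numerically stable softmax (subtracting the row max before exponentiating) with an `argsort`-based top-k selection. A minimal standalone sketch of that logic, assuming only NumPy; the labels and logits here are made-up placeholders:

```python
import numpy as np

def stable_softmax(x):
    # subtract the max before exponentiating so large logits cannot overflow
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

def select_top_k(probs, label_list, top_k):
    # argsort is ascending, so reverse it and keep the first top_k indices
    indexs = np.argsort(probs)[::-1][0:top_k]
    return {label_list[i]: float(probs[i]) for i in indexs}

logits = np.array([2.0, 1.0, 0.1])
probs = stable_softmax(logits)   # probabilities summing to 1
result = select_top_k(probs, ['cat', 'dog', 'fish'], 2)
```

Because a Python dict preserves insertion order, the keys of `result` come out highest-probability first, which is what the module returns per image.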
  {
    "path": "modules/image/classification/mobilenet_v2_animals/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"mobilenet_v2_animals\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(\n            paths=['tests/test.jpg']\n        )\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True\n        )\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.classification,\n            paths=['no.jpg']\n        )\n\n    def test_classification5(self):\n        self.assertRaises(\n            TypeError,\n            self.module.classification,\n            
images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/README.md",
    "content": "# mobilenet_v2_dishes\n\n|模型名称|mobilenet_v2_dishes|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|MobileNet_v2|\n|数据集|百度自建菜品数据集|\n|是否支持Fine-tuning|否|\n|模型大小|52MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNet V2 是一个轻量化的卷积神经网络，它在 MobileNet 的基础上，做了 Inverted Residuals 和 Linear bottlenecks 这两大改进。该 PaddleHub Module 是在百度自建菜品数据集上训练得到的，可用于图像分类和特征提取，当前已支持8416种菜品的分类识别。\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/e7b22762cf42ab0e1e1fab6b8720938b?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-04-08T11%3A49%3A16Z%2F1800%2F%2Faf385f56da3c8ee1298588939d93533a72203c079ae1187affa2da555b9898ea\" width = \"800\"  hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[MobileNetV2: Inverted Residuals and Linear Bottlenecks](https://arxiv.org/pdf/1801.04381.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v2_dishes\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v2_dishes --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现菜品分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_dishes\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       
use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个菜品分类的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m mobilenet_v2_dishes\n    ```\n\n  - 这样就完成了一个菜品分类的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_dishes\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install mobilenet_v2_dishes==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/README_en.md",
    "content": "# mobilenet_v2_dishes\n\n|Module Name|mobilenet_v2_dishes|\n| :--- | :---: |\n|Category|image classification|\n|Network|MobileNet_v2|\n|Dataset|Baidu food Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|52MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNet is a light-weight convolution network. This module is trained on Baidu food dataset, and can classify 8416 kinds of food.\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/e7b22762cf42ab0e1e1fab6b8720938b?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-04-08T11%3A49%3A16Z%2F1800%2F%2Faf385f56da3c8ee1298588939d93533a72203c079ae1187affa2da555b9898ea\" width = \"800\"  hspace='10'/> <br />\n</p>\n\n  - For more information, please refer to：[MobileNetV2: Inverted Residuals and Linear Bottlenecks](https://arxiv.org/pdf/1801.04381.pdf)\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v2_dishes\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v2_dishes --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = 
hub.Module(name=\"mobilenet_v2_dishes\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the first k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m mobilenet_v2_dishes\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n  
  headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_dishes\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install mobilenet_v2_dishes==1.1.0\n    ```\n"
  },
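Both READMEs build the serving request body the same way: encode the image to bytes, base64 it, and wrap it in a JSON object under the `images` key. A round-trip sketch of just that packaging, using placeholder bytes instead of a real cv2-encoded image (`bytes_to_base64` and `fake_jpeg` are illustrative names, not part of the module):

```python
import base64
import json

def bytes_to_base64(raw):
    # the serving endpoint expects each image as base64 text inside JSON
    return base64.b64encode(raw).decode('utf8')

# placeholder standing in for cv2.imencode('.jpg', image)[1].tobytes()
fake_jpeg = b'\xff\xd8\xff\xe0 not a real jpeg'
payload = json.dumps({'images': [bytes_to_base64(fake_jpeg)]})

# what the server side does to recover the raw bytes
decoded = base64.b64decode(json.loads(payload)['images'][0])
```

Base64 is needed because raw image bytes are not valid JSON text; the decode on the server side recovers exactly the bytes that were encoded.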
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(\n                im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            
each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(\n                round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
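The preprocessing in data_feed.py is two geometric steps: scale so the shorter side hits 256, then take a centered 224x224 crop. A pure-arithmetic sketch of that math, with no PIL dependency; note that where the module computes the crop origin with float division, this sketch uses integer division to get clean pixel coordinates:

```python
def resize_short(width, height, target_size):
    # scale both sides so the shorter one becomes target_size,
    # preserving the aspect ratio
    percent = float(target_size) / min(width, height)
    return int(round(width * percent)), int(round(height * percent))

def center_crop_box(width, height, size):
    # top-left and bottom-right corners of a centered size x size crop
    w_start = (width - size) // 2
    h_start = (height - size) // 2
    return (w_start, h_start, w_start + size, h_start + size)

w, h = resize_short(640, 480, 256)   # shorter side 480 is scaled to 256
box = center_crop_box(w, h, 224)     # (left, top, right, bottom)
```

The resulting `box` tuple is in the (left, top, right, bottom) order that `PIL.Image.crop` expects.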
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/label_list.txt",
    "content": "非菜\n红烧肉\n牛肉饭\n酸菜鱼\n炒金针菇\n土豆泥\n小炒肉\n牛仔骨\n毛血旺\n刀削面\n三明治\n土豆丝\n粉丝煲\n巧克力\n肉夹膜\n花生米\n卤肉饭\n炸酱面\n清江鱼\n鸡肉饭\n水煮鱼\n狮子头\n豆腐汤\n牛肉粒\n猪颈肉\n柠檬汁\n四季豆\n豆角炒肉\n鸡排饭\n流沙包\n辣子鸡\n羊蝎子\n臭豆腐\n豆腐丝\n龙利鱼\n龟苓膏\n老豆腐\n西兰花\n厚多士\n回锅肉\n老鸭汤\n酸梅汤\n海鲜饭\n云吞面\n牛肉汤\n牛腩面\n茶树菇\n海鲜汤\n小炒皇\n米饭套餐\n炒牛肉\n龙眼粉蒸肉\n牛肉锅\n小黄鱼\n绵绵冰\n八爪鱼\n空心菜\n油麦菜\n拌饭\n小牛肉\n萝卜皮\n雪花冰\n大黄鱼\n热干面\n疙瘩汤\n一品酱萝卜\n鸡尾酒\n湄公鱼\n羊腿\n汉堡套餐\n肥肠面\n螺蛳粉\n鸭血酸辣粉\n天妇罗\n炒鸡蛋\n烧豆腐\n窝窝头\n麻辣全家福\n土豆粉\n牛百叶\n一品鸡蛋\n参鸡汤\n多春鱼\n优质鸭血粉丝汤\n拼盘\n鱼豆腐\n萝卜糕\n厥根粉\n三杯鸡\n千叶豆腐\n土鸡蛋\n掌中宝\n一品海蜇头\n鲈鱼\n原味鸡腿肉\n小炒黄牛肉\n炒腊肉\n牛肉粉\n章鱼烧\n酸汤鱼\n午餐肉\n盖浇饭\n鱿鱼圈\n鸡中翅\n黑糯米\n咖喱牛腩\n麻椒牛肉\n玉米汁\n手撕鸡\n一品锅\n羊腿肉\n酸辣汤\n鸡肉卷\n鸡脆骨\n一锅鲜\n咕咾肉\n仔姜炒小公鸡\n石锅鱼\n米豆腐\n羊上脑\n菠萝饭\n豆腐\n香豆腐\n油泼面\n脆皮鸡\n菠萝油\n炒河粉\n麻辣香锅\n拌苦菊\n水晶粉\n牛肉饼\n秋刀鱼\n羊肉片\n肉焗饭\n鸡毛菜\n秘制豆腐\n黄金大鱿鱼\n干豆腐\n武昌鱼\n炒牛河\n炸豆腐\n烧排骨\n烧牛腩\n鸡米花\n咖喱蟹\n忌廉汤\n手抓饭\n排骨煲\n木桶饭\n金汤酸汤肥牛\n仙草牛奶冰\n肥牛饭\n葱油饼\n蒸扇贝\n海藻菜\n野山菌\n金錢肚\n鱼丸汤\n松板肉\n桂花鱼\n石斑鱼\n焖锅\n仔姜耗儿鱼\n腐皮卷\n萝卜丝\n铜锣烧\n鹌鹑蛋\n三黄鸡\n烤三文鱼\n南瓜饼\n小鲍鱼\n布朗尼\n斑鱼\n西芹炒木耳\n煮干丝\n原味牛肋骨\n罗非鱼\n肉丝面\n肉盖饭\n菌汤锅\n石锅豆腐\n雪梨汁\n海鲜焗饭\n鸡蛋仔\n乌龙奶盖茶\n小肥羊\n杂粮包\n梅花肉\n炒乌冬\n爆炒鱿鱼\n烤鳗鱼\n焖豆腐\n焗蜗牛\n仔姜牛肉丝\n牛腩煲\n牛腩饭\n猪排饭\n酿豆腐\n金桔茶\n钵钵鸡\n铁板饭\n锅中锅\n龙骨汤\n乌龙茶\n佛跳墙\n小吃拼盘\n奥利奥\n小河虾\n水煮巴沙鱼\n沙拉\n椰汁糕\n炒山药\n蒸鲈鱼\n豆腐花\n青口贝\n蔬菜豆腐\n小馒头\n拿破仑\n笋片王\n梭边鱼\n麻鸡\n汽锅鸡\n滋补锅\n烩豆腐\n牛尾汤\n养生素什锦\n紫米粥\n啤酒老鸭\n牛羊肉组合\n烧肉豆腐\n菌菇汤\n西多士\n通心粉\n里脊肉\n面疙瘩\n主食拼盘\n鲶鱼\n鸡腿饭\n三味锅\n丸子拼盘\n乌鸡\n冬瓜汤\n大拉皮\n手撕包菜\n明太鱼\n烧汁豆腐\n油条虾\n洋葱圈\n土豆焖牛腩\n杂菌牛柳粒\n神仙鸡\n茶香虾\n皮蛋豆腐\n西冷扒\n香葉西米糕\n西红柿\n铁板烧\n锅包肉\n烤面包片\n十三香龙虾\n剁椒鱼头\n鲜嫩豆腐\n原味椰子鸡火锅\n功夫鱼\n千层面\n叉烧酥\n墨鱼仔\n墨鱼滑\n招牌沸腾鱼\n炒杂菜\n牛腩\n烤脑花\n烧卖\n烧海参\n猪肚鸡\n珍宝蟹\n优质羊肉汤\n优质牛羊肉泡馍\n葡萄酒\n蒸鲥鱼\n虫草花\n酥皮汤\n肉酱焗饭\n冰镇鲍鱼\n鳗鱼卷\n龙鱼锅\n草鱼\n卡布奇诺\n时令蔬\n日本豆腐\n江团鱼\n炒猪肝\n压锅土豆\n红烧肉饭\n牛肋排\n牛肋条\n猪黄喉\n幼羊肋卷\n脆皮肠\n花果茶\n什锦蘑菇\n梅干菜扣肉\n泡菜火锅\n豆腐脑\n豌杂面\n青花鱼\n青菜钵\n千页豆腐\n鱿鱼虾\n鲜鱿鱼\n东坡肘子\n八宝饭\n冻豆腐\n龙利鱼柳\n小圆子\n手抓肉\n手抓饼\n水豆腐\n原味文昌椰子鸡\n气泡水\n卤水拼盘\n肥牛\n长豆角\n烤羊肉\n爆米花\n牛蛙煲\n牛蛙锅\n盐水鸭\n童子鸡\n浓汤\n桂花糯米莲藕\n庆红薯粉\n麻辣羊肚丝\n肉松卷\n芋儿鸡\n三文鱼芝士卷\n麻辣烫\n豆腐干\n酱骨头\n铁板鸡\n银耳羹\n银鱼羹\n火锅套餐\n石锅牛肉\n雪媚娘\n手卷\n海鲜拼盘\n丸子汤\n乌东面\n虾仁豆腐\n卷心菜\n吊烧鸡\n香芋地瓜球\n坛子肉\n玉子豆腐\n煎帆立贝\n鸡蛋炒拉条子\n拌莜面\n拌菠菜\n三文鱼头\n越南水晶虾\n年糕料理\n海鲜羹\n火锅面\n炒通菜\n牛上脑\n牛拼盘\n脆皮豆腐\n石榴汁\n笋烧肉\n组合\n糯米糍\n糯米鸡\n羊棒骨\
n拌面\n肥牛煲\n肥肠煲\n臭桂鱼\n芥兰苗\n苏打水\n荔枝肉\n鲜菌拼盘\n蒸鱼头\n蔬菜粥\n大虾沙拉\n原味蛋黄酥\n招牌白切走地鸡\n干锅肥肠\n冬阴功汤\n雪花鱼\n香酥鸡\n马哈鱼\n鱼套餐\n鱿鱼丝\n鸡蛋汤\n黄辣丁\n丝瓜尖\n乌鸡卷\n九宫格\n南瓜羹\n双拼\n烤肥牛\n土鸡煲\n地锅鸡\n培根面\n大花卷\n大阪烧\n大骨汤\n小海鲜\n小炒鸡\n干锅鸡\n拆骨肉\n拌时蔬\n鲜柠檬水\n毛豆腐\n水晶鸡\n韩风泡菜锅\n海鲜煲\n火腿肠\n炒海鲜\n炒香干\n炒鲜鱿\n炖萝卜\n水煮牛肉\n爆三样\n炒牛眼肉\n牛肉滑\n牛肉煲\n牛肝菌\n猪肚汤\n琵琶鸭\n印式番茄汤\n鸡蛋紫菜卷\n老火锅\n鸡肉咖喱\n鳗鱼肉酱饭\n肥牛面\n腊味饭\n鳜鱼\n炒芒果螺\n芝士条\n草原肚\n荷包蛋\n原只鲍鱼荷叶饭\n萝卜汤\n原笼蒸甲鱼\n蛋花汤\n西洋菜\n豆腐鱼\n豌豆黄\n香锅羊肉\n面包汤\n巴沙鱼、清江鱼、草鱼\n鱼头皇\n鲜鹅肠\n焖鲟龙鱼\n鸡肉套餐\n鸡肉锅\n麦旋风\n黄骨鱼\n黑珍珠\n万年青\n水东芥菜\n双拼饭\n咖喱锅\n四角豆\n土猪肉\n地三鲜\n大杂烩\n大烩菜\n夫妻肺片\n香糯小米糕\n香酥小鱼干\n红烧手抓骨\n三文鱼腩\n松鼠鱼\n冰茶\n松饼\n核桃露\n桂花糕\n棒棒鸡\n萝卜\n泡菜汤\n瓜子\n冰淇淋火锅\n炒米条\n炒肉片\n酸辣炒鸡杂\n炖木瓜\n炸鲜奶\n烤牛蛙\n烧土豆\n烧肥肠\n烩三鲜\n焖牛肉\n鹅肝\n牛腱肉\n原味瑞士卷\n野生甲鱼\n番茄鱼\n奶盖乌龙\n目鱼仔\n蟹籽手卷\n香素鲍鱼\n红米肠\n老南瓜\n老虎菜\n肉卷饼\n铁板凯撒肉眼扒\n肉蟹煲\n肉锅仔\n香肠拼盘\n螺脆鱼锅\n艇仔粥\n花椒鸡\n茄子饭\n荷兰豆\n菊花茶\n野生菌王汤\n泡菜双拼\n葫芦鸡\n瑶柱蒸丝瓜\n蘑菇包\n西柚汁\n豆沙包\n豆腐卷\n豆花鱼\n豌豆尖\n仔姜跳水蛙\n野生菌\n原味部队火锅\n莜面鱼鱼\n印象巴黎月饼\n香猪肉\n香芋卷\n马蹄糕\n三文鱼拼盘\n焗饭\n虾鱼片粥\n芝士蛋卷\n乌江鱼\n乌鱼片\n养生茶\n冰桔茶\n面包的诱惑\n卤豆腐\n口口脆\n芝士夏威夷\n原汤大骨锅\n火锅\n小羔羊\n开胃菜\n手撕饼\n拌蛰头\n三文鱼卷\n三文鱼骨\n陈年普洱茶\n烧汁鲈鱼\n泡菜饼\n原味海中虾\n白斩鸡\n海带面\n海蛎煎\n涮羊肉\n炒大虾\n炒牛肚\n爆炒肥肠\n炒芦笋\n炒韭菜\n炖乌鸡\n炖土豆\n烧土鸡\n烧鲈鱼\n烧鹅饭\n焗豆腐\n水煮肉片\n爆羊肉\n牛筋面\n牛肉羹\n黑牛腹肉\n牛里脊\n猪肉滑\n珍菌猪肚煲\n猪脆骨\n玉米羹\n瓦块鱼\n韭香白米虾\n海皇豆腐\n石烧饭\n砂锅粥\n碟鱼头\n稻香蛙\n筒骨锅\n米纸卷\n蟹籽沙拉\n糯米糕\n绿茶饼\n绿豆糕\n牛羊组合\n羊肉煲\n老腊肉\n牛肉河粉\n辣椒肉炒肉\n肉燥饭\n肉皮冻\n牛肉脆饼\n肉骨茶\n四色拼盘\n莴笋丝\n肥牛粒\n萝卜干\n蒸肉饼\n蔬菜饼\n面包蛤蜊汤\n蝴蝶虾\n豌豆苗\n酱豆腐\n酸菜锅\n糖醋里脊\n野猪肉\n金边粉\n阳春面\n面类套餐\n香辣鸡\n鱿鱼筒\n鲜花饼\n黄蚬子\n黄金糕\n一品香锅\n三文治\n东山羊\n春卷\n八宝菜\n冬瓜茶\n凤尾鱼\n卤拼盘\n叉烧肠\n和风豆腐\n叫花鸡\n原味松饼\n咖喱鱼\n豆品拼盘\n精品羔羊\n啤酒鸭\n咖喱蔬菜\n土鸡锅\n圣女果\n外婆菜\n奶油汤\n菌汤子母锅\n签子牛肉\n寿喜锅\n小豆腐\n干贝粥\n各式牛肉\n扇子骨\n手撕菜\n拌三丝\n马兰头拌香干\n排骨虾\n擂茄子\n招牌无骨鱼\n有机鱼\n柠檬鸡\n棉花糖\n水爆肚\n蜜汁大虾\n长江白鱼\n泡饭\n冰淇淋松饼\n清补凉\n老火靓汤\n炒培根\n茶树菇炒爽肉\n炒肥牛\n炖老鸡\n炝锅鱼\n虾卷\n西点拼盘\n印度烤鸡肉\n蹄筋\n烧鳗鱼\n烧黄鱼\n土豆焖茄子\n爬爬虾\n麻辣牛杂煲\n牛筋腩\n牛腩筋\n牛蹄筋\n猪梅肉\n玉米片\n百香果\n盐焗鸡\n碗仔翅\n云南竹筒饭\n原味牛板筋\n糯米甜甜\n粒粒香\n糯米骨\n桂圆红枣茶\n红菜汤\n羊肉锅\n美极虾\n烩面\n骨肉相连\n胡椒虾\n腰花面\n腱子肉\n膨膨冰\n臭鲑鱼\n芒果卷\n番茄炖牛腩\n莲子羹\n白菜炒肉\n煎饼\n芹菜猪肉\n菠菜面\n萝卜片\n蒸三鲜\n蔬菜卷\n蔬菜面\n虾仁饭\n蛋蒸肉\n象拔蚌\n过水鱼\n原味酸奶酪\n酸辣锅\n醉鱼干\n银耳汤\n锅巴饭\n压锅牛腩\n石锅肥牛\n雪梨汤\n渝都风味鸡珍\n香鲈鱼\n马鲛鱼\n排骨套餐\n高丽菜\n魔芋丝\n鱼头汤\n鱿鱼拌饭\n鱼面筋\n鱿鱼嘴\n鲜肥牛\n
鲜虾肠\n鲜黄喉\n鳕鱼堡\n鸡汤锅\n印度黑椒鸡腿\n鸭脯\n鸭锁骨\n麻辣鸡\n黄瓜卷\n一品鲜\n三角烧\n三重奏\n上上签\n上海青\n下饭菜\n东星斑\n乌梅汁\n什锦菜\n什锦锅\n招牌烧仙草\n仙草冻\n宫保鸡丁\n冷吃兔\n剑骨鱼\n巧克力松饼\n冬荫功虾汤\n自助火锅\n千层饼\n南瓜煲\n叉烧面\n牛蛙\n和乐蟹\n咖喱龙虾\n四喜锅\n脊骨土豆汤\n芝士火锅\n黄金大排面\n大片肉\n大连鲍\n奇异果\n多宝拼盘\n富贵虾\n小拌菜\n遇上小辣椒\n干豆角\n榴莲酥\n炒年糕\n拌折耳根\n拌茼蒿\n排骨粥\n排骨面\n杀猪菜\n松子鱼\n鲍鱼松茸汤\n五仁核桃包\n铁棍山药\n椒鱼片\n椰浆饭\n印度金芒果气泡\n水波蛋\n原汁牛肉\n鱼片\n沙丁鱼\n黄河鲤鱼\n沸腾饭\n葱油拌面\n泉水鸡\n辽参\n海鲜汇\n涮肚\n淮王鱼\n深海鱼\n清凉锅\n滑类组合\n滑蛋饭\n炒双脆\n醋炒土鸡\n芥兰炒百合\n野生菌炒脆骨\n炒膏蟹\n炖水蛋\n原生态炖羊肉\n霸王花炖龙骨\n烤猪肘\n板栗烧鸡\n烧老鹅\n烧豆角\n烧鸭饭\n烫干丝\n青椒焖甲鱼\n焖羊肉\n蛋黄焗南瓜\n焗薯蓉\n原汁焗蘑菇\n煲龙骨\n仙草\n牛仔肉\n牛板腱\n牛柳面\n黑椒牛肉条\n牛肉粥\n牛肋肉\n牛腩粉\n猪尾煲\n猪排肉\n猪肉饭\n猫耳朵\n瓦罐汤\n生拼盘\n田鸡王\n白水鱼\n白灵菇\n白玉菇\n百灵菇\n脆皮火烧\n石板饭\n秘制肉\n蜂窝豆腐\n竹丝鸡\n笋腊肉\n手工粗粮馒头\n紫甘蓝\n香芋紫米露\n紫菜汤\n红油锅\n芋圆红豆汤\n老母鸡\n肉双拼\n肉片汤\n肉沙拉\n香芋扣肉\n芒果捞\n芝士派\n花雕鸡\n招牌茶香肉\n菊花鱼\n菌菇煲\n菜排骨\n肉丝\n萝卜丸\n蒸桂鱼\n蒸肉排\n蔬菜拼\n蘑菇面\n虾手卷\n虾火锅\n蜜柚茶\n欧式土豆浓汤\n土豆炖牛腩\n鱼贴饼子\n跑山鸡\n麻辣火锅\n辣鲶鱼\n多士配雪糕\n海鲜酱油水\n酱肉丝\n酱肘花\n锅\n干锅包菜\n雪花牛\n雪花肉\n青斑鱼\n青椒鸡\n面筋煲\n飘香鸡\n牛腱\n鱼香肉丝\n香芋球\n飘香鲫鱼\n马兰头\n麻辣鮰鱼锅\n鱼军舰\n反卷\n龙利鱼捞饭\n鱼翅羹\n鱿鱼卷\n鱿鱼头\n海鲜套餐\n鲜椒鱼\n海鲜泡饭\n鲷鱼烧\n鳕鱼羹\n鸡腿排\n鸡蛋饼\n鸭嘴鱼\n鸭架汤\n黄金蟹\n黄鱼鲞\n黄鸭叫\n黑百叶\n龙虾片\n龙须菜\n水果披萨\n一把骨\n丁香鱼\n三剑客\n三线肉\n三色面\n鸡肉丸子饭\n河粉\n全品烧\n冷面鸡\n冰激凌火锅\n凤梨汁\n毛肚\n秘制排骨\n纸包豆腐\n千层酥\n千张包\n半筋半肉面\n南瓜丝\n原味鸡\n叉叉冰\n辣口味虾\n口水鱼\n爽口牛肉\n可丽饼\n组合焖锅\n吾桑格\n三拼\n花生\n咕噜鱼\n哈密瓜\n酒酿圆子羹\n圆舞曲\n圆子豆花\n烤土豆皮\n地瓜条\n北塘小炒\n芝士排骨\n原味大头鱼\n大碗鱼\n大红袍\n大肠面\n大虾锅\n大骨头\n大鳊鱼\n优质大麦茶\n筋头巴脑\n奶豆腐\n蟹子手卷\n鸡饭\n寿喜烧\n小猪包\n小甲鱼\n小番茄\n笋丝小菠菜\n小蘑菇\n小鱿鱼仔\n山药卷\n山药粥\n山野菜\n手工饼干\n韩式泡菜\n酸奶慕斯杯\n扁豆丝\n手剥笋\n扒皮鱼\n拌云耳\n拌干丝\n拌猪耳\n拌肚丝\n拌野菜\n凉拌鲫鱼\n拍青瓜\n招牌卷\n招牌面\n双拼套餐\n猪排定食\n鸡排面\n明炉鱼\n月牙骨\n羊杂拼盘\n杏仁茶\n板栗鸡\n果苏打\n香椿核桃仁\n梅子酒\n梅花糕\n梳乎厘\n醋椒豆腐\n特色椒麻锅\n榴莲蛋\n比目鱼\n水晶饺\n水滑肉\n三汁焖锅\n莜面烤姥姥\n金汤鲈鱼\n马家沟芹菜\n油浸鱼\n油焖笋\n流星肉\n海底捞\n海皇羹\n海鲜烩\n浸肥牛\n涮毛肚\n溜鱼片\n火锅粉\n火龙果\n炒包菜\n炒合菜\n小炒羊肉\n炒腊肠\n炒苦瓜\n炒蔬菜\n响油鳝丝\n炒酸菜\n炒面线\n炖肉汁\n炖豆角\n炸大虾\n招牌潮汕炸果肉\n炸蘑菇\n炸里脊\n烤子鱼\n烤肋排\n烤青鱼\n韭菜\n烧春鸡\n烧鹿筋\n焖鲶鱼\n焗土豆\n意式芝士焗番薯\n招牌芝士焗薯泥\n焗蟹宝\n燕麦包\n爆鳝面\n爽口菜\n牛杂锅\n牛肉肠\n牛黄喉\n拌猪耳朵\n猪肉面\n猪肉饼\n猪软骨\n猪骨汤\n玉子烧\n玫瑰茶\n珍珠斑\n番茄面\n炒时蔬菜\n竹荪汤\n西米布甸\n粟米羹\n蜜糖松饼\n糖芋苗\n糯米卷\n绿豆粥\n绿豆芽\n牛羊双拼\n羊肉面\n鲍参翅捞饭\n老坛子\n清蒸深海老虎\n聚宝盆\n肉炒粉\n肉类砂锅\n牛肉锅巴\n肉锅贴\n肥肠鱼\n肥肠鸡\n脆萝卜\n芒之恋\n芒果茶\n芝士挞\n芝士球\n芝士虾\n茉莉花炒蛋\n五花肉锅\n
西芹百合\n茄汁面\n茶碗蒸\n山药炒木耳\n蓝莓果醋\n榴莲千层\n菌菇豆腐\n菜包肉\n菜泡饭\n菜炒蛋\n菠菜花生\n萝卜丁\n萝卜苗\n蒙古肉\n蒸南瓜\n蒸笼牛肉\n蒸白鱼\n豆豉蒸秋葵\n蒸钳鱼\n蒸鲍鱼\n蒸鲜鱿\n蓝莓汁\n藏龙鱼\n虾焖锅\n窝蛋肥牛\n蟹肉棒\n赤豆元宵\n豆烧肉\n豉油鸡\n跳水鱼\n跳跳虾\n原味蹄花汤\n麻辣排骨\n梭边鱼锅\n酒香肉\n香酥鲫鱼\n招牌酱大骨\n酸辣鱼\n金丝虾\n招牌金汤锅\n金沙包\n铁观音\n铁锅面\n砂锅山药\n雪布蕾\n雪梨茶\n青椒鱼\n拌青笋丝\n青豆泥\n面包条\n汤莜面窝窝\n韭菜盒\n印巷香水鱼\n肥肠\n马卡龙\n马面鱼\n驴打滚\n鱼头锅\n鱼子蛋\n鱼翅汤\n鱼肚羹\n鱼茸粥\n乌鱼蛋汤\n饭团\n鱿鱼饭\n鲍鱼菇\n鲜汤锅\n鲜竹卷\n鲶鱼鸡翅鲜虾锅\n锅巴菜\n鸡丁饭\n鸡肉拌面\n鸡捞面\n鸡焖锅\n鸭下巴\n竹笋鸭汤锅\n鸳鸯火锅\n鹅肝酱\n芝麻牛肉\n黄瓜汁\n黄金虾\n原味黑鱼煲\n炸酥黄金龙俐鱼\nBB鸭\n燒三文魚\n三杯鸭\n三鲜面\n阿公下酒菜\n鸡丝拌面\n乌鱼锅\n乌鸡粥\n九折板\n亲子饭\n八宝冰\n六合鱼\n经典便餐\n养乐多\n农家鸡\n冰榴莲\n冰橘茶\n冷豆腐\n黑猪排骨凉瓜汤\n红烧刁子鱼\n龙利鱼锅\n巧克力喷泉\n巧克力拉瓦\n功夫鸡\n乌龙杏仁加红豆\n南瓜盅\n招牌海南鸡饭\n卤肉卷\n卤花生\n可乐饼\n吉事果\n云吞捞面\n味噌汤\n土豆\n风味鮰鱼\n奇味鸡煲\n黑鱼\n芝士肥牛\n咸煎饼\n精品上脑\n极品大虾\n眼肉\n哨子面\n啤酒鱼\n仙草喜圆烧\n四喜拼盘\n咖喱海鲜\n咖喱猪肉\n四味锅\n土匪鸭\n土猪汤\n海鲜土瓶蒸\n土豆锅\n土鸡脚\n烤地瓜片\n地锅鱼\n金牌坛香肉\n海鲜墨鱼面\n大板烧\n大白刁\n白虾\n大碗饭\n大虾煲\n大馒头\n大骨棒\n鱼头泡饼\n沙冰\n奶酪棒\n姜母鸭\n嫩鱼片\n军舰\n蟹子沙律\n安东鸡\n密瓜汁\n富贵鱼\n小皇堡\n山楂汁\n山菌汤\n井冈山豆皮\n卡布基诺\n干渣肉\n干烧鱼\n干贝汤\n例汤\n日式火锅\n日式烧肉\n熏鱼\n当家肉\n心里美\n维怡豆奶\n海鲜\n情人果\n扇贝肉\n担仔面\n拌海草\n拌螺片\n拌豆皮\n拌香菜\n叉烧烧拼排骨\n捞牛肉\n排骨串\n推推乐\n摊鸡蛋\n手撕豆腐\n老虎斑鱼锅\n斗碗鱼\n尼斯沙拉\n安格斯牛肉\n方便面\n昂刺鱼\n重庆鸡公煲\n木瓜丝\n杏仁烧\n马来咖喱\n铁板鲈鱼\n榆林豆腐\n水果之恋\n水果乳酪\n水果冰粉\n果盛宴\n腰果鸡丁\n柠檬鸭\n九宫格火锅\n梦幻雪\n棒棒虾\n椒香鸡\n原味椰子冻\n椰子糕\n柠檬苏打\n水晶包\n蜜汁山药\n汁锅巴\n酸汤挂面\n汤火锅\n酸汤牛蛙\n沙拉虾\n沙棘汁\n沙茶面\n酱油拉面\n油淋鸡\n油酥饼\n油鸡饭\n泉水蛙\n金牌泉水鱼\n醋泡花生\n包浆豆腐\n海参斑\n海胆丸\n涮涮锅\n冰淇淋球\n混合锅\n太湖白鱼\n溜肉段\n腿卷\n炒三鲜\n炒冬笋\n优质小炒泡馍\n炒莴笋\n炒菠菜\n香炒藕片\n炒青菜\n招牌咖喱炒青蟹\n爆炒鱼片\n炒鲍鱼\n白辣椒炒鸡胗\n炖瘦肉\n鸡肉炖蘑菇\n炖鹧鸪\n炝锅面\n炸花生\n炸蛋散\n炸香蕉\n烤乳猪\n烤五花\n烤全鸡\n烤南瓜\n烤梅肉\n烤蘑菇\n烤蛏子\n烤香菇\n烤鸭饭\n黔鱼\n烧杂鱼\n酱烧鱼头\n红烧鲫鱼\n照烧鸡肉\n焖猪肚\n焖老鹅\n焗口蘑\n焗豆泥\n焗金瓜\n芝士焗鳕鱼\n孜然羊肉\n煎鸡蛋\n煨牛腩\n煮三鲜\n自煮海鲜\n煮豆腐\n熊手包\n火爆猪肝\n爆脆肠\n肥牛套餐\n牛火锅\n牛角面包\n大牡丹虾\n猪脚饭\n猪颈扒\n猴头菇\n山珍豆腐\n海蜇\n松茸甩袖汤\n甲鱼汤\n炒番薯叶\n白萝卜\n百叶包\n百叶卷\n乳鸽\n冰皮月饼\n盆菜\n仙草奶盖\n碎碎鸡\n章鱼饼\n竹笙汤\n竹节虾\n笋牛腩\n黄米凉糕\n米粉肉\n糊涂面\n糖火烧\n糯米鸭\n糯香骨\n紫薯饼\n红豆卷\n鲜榨红豆汁\n米线套餐\n绿豆汁\n绿豆汤\n香辣羊排锅\n鲍汁翅肚羹\n肉松饼\n肉板面\n鸡肉烩饭\n招牌肚包鸡\n猪肚煲鸡\n鹅肝冻糕\n高钙肥羊肉\n烧腊拼盘\n香腊牛肉\n鸡腿肉饭\n虾膏豆腐\n芋圆冰\n香芋招牌\n芙蓉虾\n芙蓉蛋\n芝士饼\n招牌芥菜饭\n花淇淋\n花甲粉\n花蚬子\n芹菜炒香干\n苦瓜羹\n茶壶汤\n抹茶红豆\n柠檬C芦荟优多\n排骨莲藕汤\n莴笋片\n金针菇肚丝\n鸡肉\n菌皇汤\n菌菇面\n青菜大拼\n酸菜鲈鱼\n萝卜酥\n萝卜鱼\n葡萄汁\n韩式葱煎饼\n蒜香味鱼\n蒜香肉\n蒜香鸡\n蒸膏蟹\n蒸蛏子\n鱿鱼\
n樱桃蔓越莓\n蔬菜锅\n薏米汤\n蓝莓山药\n藤椒鱼\n虾仁面\n龙虾伊面\n蛋糕卷\n蛋黄卷\n蜂窝煤\n蝴蝶卷\n凉衣白肉\n西京烧\n西葫芦\n豆乳锅\n豆腐饭\n红豆豆花\n豉油虾\n意式奶油培根贝壳面\n贵妃鸡\n走油肉\n软壳蟹\n轻乳酪\n麻辣汤锅\n麻辣海带\n辣章鱼\n辣鱿鱼\n辣鸡肉\n精选牛肉\n通心菜\n香酥排骨\n酱排骨\n酱棒骨\n酱菠菜\n酱香肉\n酱香锅\n酱驴肉\n酸菜面\n酸酸鸡\n酸黄瓜\n醉蟹钳\n野菜饼\n铁板野蘑菇\n金瓜丝\n黄金豆腐\n银丝饼\n香锅土豆\n香锅排骨\n锅烧面\n锅草鱼\n锅驴肉\n石锅鲫鱼\n什锦砂锅\n长寿菜\n长寿面\n霸王鸡\n青菜汤\n风味肉\n风味鱼\n和风牛肉\n套餐\n馋嘴鱼\n百香果汁\n拌香椿苗\n香牛腩\n肚丝\n香茜饺\n蚕豆\n香酥卷\n鸡翼\n起司马铃薯\n鱼子酱\n鲍鱼炖蛋\n生饭\n鱼米羹\n鱼籽饭\n鲍鱼汤\n鲍鱼盅\n鱼头煲\n鲜果茶\n海鲜砂锅\n海鲜组合\n鲜腐竹\n鲜虾堡\n鲜锅仔\n鲜鱼锅\n鲮鱼滑\n鸡丝面\n鸡块面\n鸡煲翅\n鸡翅煲\n玉米鸡肉滑\n意式罗勒鸡肉面\n鸡腿锅\n鸦片鱼\n麻辣虾\n麻辣面\n麻麻鱼\n黄金卷\n黄金饼\n黑啤酒\n黑米糕\n黑米饭\n黑豚肉\nHi辣鱼\n西班牙Tapas\nQ麻糬\n馕炒肉\n一碗香\n三下锅\n三丝羹\n三合一\n三味鱼\n三杯雞\n三角峰\n三鲜煲\n关东木耳\n蒜香银丝元贝\n粉丝包菜\n酸汤丝娃娃\n肉丝带底\n两面黄\n焖中华鲟\n肉丸火锅\n原味丹麦棒\n乌头鱼\n九肚鱼\n家乡豆腐\n草花乳鸽汤\n亲亲肠\n菜煲\n虾仁炒蛋\n桃仁苦菊\n果仁菠菜\n锅仔土豆\n仔排煲\n芝士蛋糕\n巧克力面包\n巧克力味\n公主卷\n养生粥\n冷锅鱼\n金桔晶冻柠檬\n龙利鱼片\n秘制叉烧\n秘制梅肉\n秘制红枣\n桃仁剔炒鸡\n巧克力冰莹\n巧克力巴菲\n巧克力熔岩\n包公鱼\n千张丝\n南乳肉\n南煎肝\n南瓜糕\n摩卡奇诺\n卡特鱼\n卤肉面\n烙饼卷带鱼\n可口小菜\n鸡胗\n叶包鸡\n一号肥牛\n沙面\n合家欢\n综合拼盘\n吞拿鱼\n味三鲜\n冒菜\n卤蛋\n奶酪\n泡菜\n风味海藻\n猪肝\n各种系列\n脑花\n腊肠\n炒花菜\n西式培根\n咸水角\n咸水鸭\n豆制品组合\n龙虾\n马蹄响铃卷\n咖喱时蔬\n咖喱杂菜\n咖喱肥牛\n咖喱膏蟹\n四宝蔬\n四宝饭\n四小拼\n四果汤\n四色饺\n四重奏\n法国鹅肝\n芋圆二号\n地瓜叶\n地皮菜\n墨鱼卷\n芝士肋排\n樱花芝士虾卷\n芝士鳗鱼\n软壳蟹卷\n大嘴蛙\n点心大团圆\n大扁鱼\n大海螺\n大盆鸡\n大盘鱼\n大肠煲\n大豆腐\n大锅巴\n大麻球\n大黄鸭\n明太鱼丝\n招牌火车头河粉\n头道菜\n特色奥灶面\n老奶洋芋\n奶酪卷\n夏威夷贝\n嫩笋尖\n狮子头饭\n骄子肥牛\n子锅仔\n原味椰子鸡\n安康鱼\n元宝甲鱼\n烩菜\n农家锅巴\n至尊拼盘\n百叶小堂菜\n新鲜油菜\n高山小猪肉\n油焗小青龙\n小食组\n薰衣草小饼干\n小鲜肉\n可尔必思\n千层豆腐\n山楂茶\n韶山炒鸡\n山猪肉\n山药羹\n小炒\n农家手工麻糍\n巴旦木\n干烧肉\n干逼鹅\n干酪鱼\n年糕虾\n开胃酒\n乳酪\n小菜\n意式浓汤\n烧鸭\n三式组合\n蛋卷\n五彩时蔬\n彩缤纷\n彩虹卷\n烤玉米\n德克士\n心太软\n生态甲鱼\n思慕雪\n怪味鸡\n手扒鸡\n手抓蟹\n手掰肠\n手撕兔\n手撕笋\n扒广肚\n拌冰藻\n拌板筋\n拌桃仁\n拌海菜\n拌耳丝\n拌花菜\n凉拌茄子\n拌莴笋\n拌蜇头\n菠菜拌豆干\n拌面筋\n拌驴肉\n拌鱼皮\n拌鸡杂\n拌鸡肚\n辣拌鸡胗\n招牌虾\n猪排火锅\n排骨粉\n手掰豆腐\n手撕牛肉\n手撕茄子\n三文鱼皮\n阿拉斯加蟹\n蒜蓉粉丝\n昔果乐\n水晶牛肉\n水晶肴肉\n朝天锅\n木瓜汁\n日本料理\n猪杂汤饭\n砂锅\n杂粮粥\n杂粮糕\n老油条牛肉\n杭三鲜\n三杯鸡饭\n松花鱼\n美极海虾\n鲜果乐园\n优格\n芒果捞野\n鲜果百汇\n果雪芭\n架子肉\n柚子蜜\n柠檬红\n青柠蒸鱼\n柴火鸡\n栗子鸡\n核桃派\n花菜炒肉\n格拉条\n桂花鸭\n蟠桃祝寿\n过桥豆腐\n桶子鸡\n木桶羊肉\n梅子肉\n青椒肉丝\n滴脆腰花\n金椒鱼锅\n椒鸡杂\n椒麻鱼\n木瓜椰奶爽\n椰子汁\n椰子饭\n椰香鸡\n榴莲卷\n樱桃肉\n橘柠檬\n柠檬冰茶\n水库鱼\n水蜜桃\n水豆芽\n蜜汁木瓜\n鸡汁脆笋\n椰汁鸡汤\n咸肉粒汤丝螺\n酸汤揪片\n江团\n百叶\n酸汤羊肉\n青菜\n沙姜鸡\n沙茶酱\n油多士\n酱油拌饭\n曲奇\n油鳝糊\n泡萝卜\n气泡饮料\n蒜泥白肉
\n澳洲牛肉\n海上鲜\n海参汤\n上海熏鱼\n海藻丝\n海螺汤\n海鲜拼\n鲤鱼\n溜肉片\n滋补汤\n火豆腐\n火锅鸡\n炒三丝\n炒五花\n炒什锦\n炒仔鸡\n炒咸肉\n炒墨鱼\n炒杂蔬\n炒米糕\n炒粉皮\n炒粿条\n板炒肝尖\n炒蘑菇\n炒蚬子\n炒螺片\n炒野菜\n炒钉螺\n炒鳝柳\n炒鸡肉\n酸辣炒鸡肫\n炒鹅肝\n炖辽参\n风味炝花生\n炸刀鱼\n油炸双拼\n香炸土豆\n炸排骨\n炸腐竹\n油炸臭干\n炸虾片\n烤乳鸽\n烤兔腿\n燒烤大拼\n烤大肠\n烤春鸡\n烤海虾\n烤羊背\n烤羊腰\n烤花蛤\n烤酸菜\n烤面筋\n烤鲷鱼\n烤鸡头\n烤鸡心\n烤鸭包\n煎烤黄鱼\n烧三鲜\n叉烧乌冬\n烧桂鱼\n烧羊肉\n烧青口\n烧鲤鱼\n烧鲳鱼\n香菇烧鸡饭\n烧鹅皇\n浓郁鲜香烩杂菇\n烩松肉\n烩海参\n烩羊肉\n烩蘑菇\n烩蹄筋\n烩鱼肚\n热狗堡\n焖江团\n焖鮰鱼\n焖鲜虾\n黄焖鸡饭\n奶油焗杂拌\n孜然土豆\n香煎藕饼\n煎蛋汤\n煎血肠\n煎鱼饼\n煎鸡饭\n北京煨牛肉\n煮海参\n生煮花生\n干煸龙虾\n熊猫包\n熏排骨\n熏牛肉\n熏鸭肉\n葱爆河虾\n爆爆珠\n爆爆蛋\n爆爽肉\n爆肚丝\n酱爆茄子\n油爆螺片\n爆鳝背\n爬虾肉\n爱丽丝\n爱玉冰\n鸦片鱼头\n金牌羊肉\n招牌豆腐\n招牌豆花\n招牌龙虾\n牛三宝\n牛丸粉\n肥牛乌冬\n牛仔腿\n手抓牛大骨\n牛尾锅\n牛尾骨\n牛排煲\n牛杂粉\n牛柳饭\n牛肉冻\n牛肉干\n芝士牛肉酱\n牛肚面\n牛肩肉\n牛胸口\n牛胸肉\n牛脆骨\n牛蛙虾\n牛骨头\n原味牛骨汤\n牛魔王\n猪排面\n猪肉汤\n猪肝面\n猪肺汤\n猪蹄虾\n猪骨煲\n猕猴桃汁\n金玉满堂\n玉米酥\n玫瑰饼\n珍珠丸\n珍珠马蹄\n玫瑰花茶\n椰奶\n麻辣瓦香鸡\n甜不辣\n百丽甜情人\n甜玫瑰\n甜辣虾\n生姜茶\n生炒骨\n野生菌汤\n田园鸡\n田螺鸡\n甲鱼锅\n意式薯条配芝士碎\n瘦肉汤\n白切肉\n白玉卷\n白糖糕\n白蚬子\n百叶丝\n菠菜\n黄瓜\n脆皮烧肉\n带皮羊肉\n脆皮蛋卷\n皮青菜\n带皮驴肉\n鸡蛋\n大盆牛蛙\n盐煎肉\n盲公鱼\n石榴鸡\n破仑酥\n碎米鸡\n洛神花茶\n福寿螺\n禾花鱼\n稻香肉\n竹筒肉\n竹筒鸡\n竹蛏王\n蒸笋壳鱼\n笑脸薯\n金米海参\n米焗饭\n稀饭\n米粑粑\n馒头\n菌类大拼\n鱼籽豆腐\n粉圆子\n粉粿冰\n粒粒爽\n粘豆包\n粟米汤\n糍饭糕\n年糕排骨\n焦糖炖蛋\n冰糖葫芦\n糖醋鱼\n红糖锅盔\n素凉菜\n荤素拼盘\n紫菜饭\n紫薯粉\n红咖喱\n招牌脆皮红子鸡\n红枣汁\n红枣糕\n招牌红烧鱼\n红腰豆\n烤红薯片\n红薯饼\n红豆粥\n红豆饼\n结烧肉\n绿豆沙\n绿豆酥\n缩骨鱼\n羊肉筋\n美人蛙\n美人蹄\n美人鱼\n美味蛙\n美极蛙\n美果冰\n群英会\n招牌骨皇翅\n翡翠卷\n卤面\n韭菜老烧蛋\n老肉片\n耙泥鳅\n木耳肉片\n肉三鲜\n肉乌冬\n烧肉夹饼\n肉抓饭\n肉汤锅\n肉清汤\n干锅花菜\n咸肉菜饭\n肉蛋卷\n牛肉蛋堡\n肉馅锅盔\n肉香菇\n肚丝面\n肠砂锅\n虾滑\n海胆豆腐\n胖头鱼\n胡萝卜\n脆皮鱼\n脚趾肉\n腌黄瓜\n豆腐烧肉\n肝腰合炒\n臭大元\n臭鲈鱼\n钢管鸡\n彩色心情\n特色肥牛\n雪砖\n芒果干\n芝士味\n芝麻球\n芝麻饼\n芥菜汤\n芦花鸡\n芦荟爽\n芦荟茶\n花生汤\n花生芽\n花生酱\n枣花蜂蜜\n花鲢鱼\n苹果派\n茶叶蛋\n茶套餐\n欧蕾\n草莓汁\n荞麦饼\n薄荷苏打\n櫻桃蔓越莓\n莓类甜心\n莴笋干\n莼菜汤\n蘑菇火锅\n菇炒肉\n蔬菜乌冬\n菜桃仁\n蔬菜浓汤\n炒粉\n雪菜笋丝\n韭菜粑粑\n香菜羊肉\n菜脯蛋\n酸菜鱼面\n菠萝冻\n菠萝蜜\n菠萝鸡\n萝卜汁\n冰火菠萝油包\n菠萝焗饭\n印度飞饼\n拌葫芦丝\n葫芦头\n葱包烩\n烂蒜肥肠\n蒸洄鱼\n蒸大肠\n蒸带子\n蒸皖鱼\n蒸鱼腩\n蒸鲟鱼\n清蒸黄鱼\n时蔬大拼\n蔬菜煲\n薏米水\n原味藏香猪\n海藻沙拉\n虾扯蛋\n龙虾捞饭\n虾烩饭\n原味蜂蜜松饼\n蝴蝶骨\n蟹肉卷\n蟹肉煲\n西蓝花\n西餐肠\n小豆凉糕\n刨冰\n蔬菜拼盘\n土豆沙律\n豆瓣鱼\n豆腐串\n冻豆腐拼\n豆花糕\n土豆花菜\n豉香鸡\n豚肉扒\n豬頸肉\n香脆跳跳骨\n跳跳鱼\n蹄花锅\n兔头\n特色辣子鱼\n辣脆笋\n辣萝卜\n辣酱面\n麻辣鲈鱼\n香辣鸡煲\n香辣鸡球\n鸭胗\n香辣达仔鱼\n大连鲜鲍\n迷踪蟹\n退秋鱼\n配土豆\n配苹果\n酱油饭\n牡蛎\n酱糖饼\n酱通菜\n酱骨棒\n酸汤面\n酸菜鸡\n酸豆角\n酿尖椒\n醉椒鸡\n醋萝卜\n老醋蛰头\n重
乳酪\n娇娇\n野蕨菜\n金宝鱼\n金桔汁\n黄金泡菜\n招牌金牌骨\n炸馒头\n钵钵菜\n干锅素菜\n铁锅杂鱼\n铁锅炖鱼\n干锅牛杂\n锅甲鱼\n石锅鳝鱼\n什锦素菜\n长腿蟹\n大阪煎饼\n雅片鱼\n青咖喱\n青柠汁\n青柠茶\n青瓜卷\n青石斑\n青菠面\n面包篮\n咖喱面包鸡\n面线糊\n风吹肉\n大土豆\n风干肉\n酸梅风干鱼\n飘香肉\n饭团烧\n饭定食\n饺子皮\n饼套餐\n煎饼果子\n饼羊肉\n馋嘴鸭\n香啡包\n麦香山药\n香猪肝\n香瓜子\n香芋丸\n香芋酥\n五香花生\n香草汁\n香菇面\n香蕉卷\n香蕉塔\n香蕉派\n香辣酱\n香锅虾\n香锅鸡\n酒香鲅鱼\n鹅掌\n马拉糕\n马蹄水\n骨浓汤\n无骨炸鸡\n化骨绵掌\n三吃\n班鱼两吃\n鱼丸面\n鱼包蛋\n海鲜大烤\n鱼头面\n墨鱼汁面\n鱼汤面\n鱼片面\n鱼皮饺\n鱼砂锅\n鱼肉卷\n鱼脆饼\n三文鱼腐皮\n鱿鱼片\n鱿鱼酥\n鱿鱼面\n鲍螺片\n鲍鱼粥\n海鲜乌冬\n海鲜汤底\n鲜山药\n海鲜锅贴\n鲶鱼片\n鲷鱼片\n鳕鱼块\n鳕鱼条\n鳝鱼煲\n辣子鸡便当\n炸鸡拼盘\n鸡排锅\n鸡油饭\n鸡肉粥\n鸡肉饼\n鸡腿菇\n鸡鸭杂\n鸭五件\n鸭泡饭\n鸳鸯汤底\n鹅火锅\n咸鹅锅仔\n麦片虾\n麻油鸡\n麻辣串\n麻辣拌\n蟹黄南瓜\n蟹黄小笼\n黄豆煲\n黄金包\n黄金鱼\n黄鳝饭\n三黄鸡锅\n芥兰黑山羊\n黑椒面\n黑芝麻\n招牌椒盐龙头\n盘龙茄子\n乌龙茶奶\n龙须面\nQ麻薯\n一品包\n五花肉\n满满一桌菜\n一洋卷\n布丁可可\n布丁奶绿\n丁桂鱼\n炒饭\n七虾堡\n千丝万缕虾\n三层肉\n三杯虾\n三色糕\n丝丁鱼\n拉皮\n丝荞面\n金丝虾卷\n萝卜丝酥饼\n丸子面\n芥末乌贼烧\n欢乐时光\n香芒轻乳\n乳酪条\n五花趾\n五花锅\n五谷饭\n五香卷\n什锦煲\n杏仁月饼\n生煎包\n果仁蛋卷\n蚝仔煎蛋\n仔米糕\n时令凉菜\n时令小菜\n特价羊肉\n优乐酷\n考伯沙律\n伯爵茶\n黑松露低温蛋\n有机活体豆苗\n俱乐部\n倭豆饭\n超值晚餐\n健康菌\n李師傅脆肚\n芋儿烧鸡\n全羊汤\n八味碟\n八宝茶\n养生煲\n牛排\n冬瓜盅\n冰柠茶\n冰桔汁\n冰皮月\n冷菜区\n冻柠茶\n华夫\n冰激凌月饼\n龙凤什锦\n大刀耳片\n刨猪汤\n卤鸭\n豆制拼盘\n皮冻\n自制羊肉\n秘制龙虾\n招牌炒出前一丁\n餐前饮料\n剪刀面\n巧克力千层\n巧克力薄饼\n冬荫功汤面\n薯条加奶酪\n加明虾\n蔬菜\n鳕鱼\n自助晚餐\n自助水果\n腐皮包黄鱼\n陕北小菜\n西北面筋\n土匪猪肝\n千张卷\n半只鸡\n华夫筒\n中华海藻\n越南河粉\n南瓜挞\n南瓜汁\n卜卜甲\n萝卜牛杂\n牛奶卡仕达\n卤双拼\n卤菜粉\n卤鸡蛋\n卷大虾\n压土豆\n厚蛋烧\n草原肥羊\n老友牛杂\n双人锅\n双味菇\n双拼锅\n爽口凉菜\n口口香\n锅仔口味鸭\n口水虾\n脆口萝卜\n一口西多\n脆口黄瓜\n叶儿粑\n百叶拼盘\n五合一面\n混合果汁\n吊龙膀\n蛋\n五花\n海带\n腊味煲饭\n秋葵\n茄条\n味菌汤\n炸虾饼\n风味螺肉\n鱼锅\n风味鳕鱼\n和子饭\n和豆腐\n炒咕噜球\n咖喱汤\n芋泥烩咸蛋黄\n咸鸭蛋\n精品小菜\n一品蛋酥\n墨西哥脆饼\n菠菜面筋\n咖哩牛腩\n喜头鱼\n四喜烤夫\n四喜豆腐\n咖喱叻沙\n咖喱土豆\n咖喱拼盘\n咖喱牛楠\n咖喱鱼头\n喷柠鸡\n四冷拼\n四喜面\n北京四大碗\n四大缸\n回卤干\n回味虾\n回鱼锅\n沁园牛肉\n国宝鸡\n芋圆1\n圆子锅\n土匪鱼\n土匪鸡\n土豆松\n大块豆腐\n坨坨肉\n坨坨牛肉\n培根虾\n塌豆腐\n塘坝鱼\n石墨豆腐\n墨鱼柳\n墨鱼烧\n墨鱼饭\n墨鱼饼\n芝士地瓜\n芝士豆腐\n芝士鸡煲\n芝士鸡锅\n芝士龙虾\n多味鱼\n大块肉\n大杂扒\n大满贯\n大炖菜\n大牛肉\n大王蛇\n大盘菜\n大红肠\n大肉串\n大腰子\n大饼子\n大马哈\n大鸽饭\n大麻鱼\n飞天通菜\n天鹅蛋\n太鱼汤\n葫芦头泡馍\n夏威夷匹萨\n夹牛肉\n曲奇香奶\n奥尔良小吃\n奶冻糕\n水果\n奶油杯\n奶油烧\n双皮奶菜菜\n奶酪鱼\n奶酪鸡\n奶香片\n妇罗虾\n烤包\n娘惹羹\n阿婆醉鱼\n嫩千张\n腰片\n麻辣嫩鲶鱼\n嫩鸡肉\n拉茶\n栗子糖水\n带子裙边\n松子鲈鱼\n普宁豆腐\n同安封肉\n宝宝龟\n肉冻\n至尊批萨\n至尊眼肉\n小咸菜\n小墨鱼\n小山芋\n小棠菜\n小煎鸡\n雪山小王子\n小盆栽\n小米蒸\n抹茶小红豆\n小肥鹅\n小花菇\n小菜们\n小鸭酥\n脆耳\n山楂糕\n火山牛肉\n山竹笋\n脆笋\n巴山腊肉\n特色海鲜\n栾川豆腐\n泰
州干丝\n潮州汤底\n兰州烩菜\n血鸭\n手工曲奇\n锅巴盖肉\n筋头巴脑锅\n帝皇蟹\n带鱼串\n海带鸡汤\n本帮熏鱼\n笋干丝瓜\n干锅鱼\n什锦年糕饼\n德庄鸭肠\n开心果\n开花肠\n奇异果冰\n奇异果爽\n各式水果\n酱汤\n法式金砖\n招牌奶茶\n微辣锅\n点心套餐\n心心相印\n热情蓝莓\n扇贝卷\n手把肉\n手火锅\n打边炉\n扣三丝\n扣肉面\n扣鲍鱼\n扣鹅掌\n把子肉\n手抓牛肉\n手抓羊肉\n抹茶冻\n拇指包\n沙拉黄瓜\n拌海茸\n拌牛肚\n拌猪肝\n拌田七\n拌笋丝\n凉拌苦瓜\n凉拌茭瓜\n拌荆芥\n拌菠萝\n拌藕带\n拌贝裙\n拌青瓜\n鸟贝\n拌鸡架\n拌黄豆\n表叔招牌骨\n手指牛肉\n鸡蛋挑石头\n干捞苕粉\n掌亦煲\n羊排抓饭\n排盖饭\n猪皮冻\n提子酥\n鸡肉丸\n凯撒沙律\n文鱼汤\n斑鱼滑\n调料组合\n德克萨斯牧场\n安格斯肉眼\n时蔬拼\n曲奇饼\n木瓜奶\n木瓜水\n木瓜羹\n木瓜酥\n肉末茄子\n杂粮汤\n杂鱼锅\n三杯中卷\n肉松军舰\n松坂肉\n黑松露包\n板栗粥\n板栗饼\n铁板羊肉\n铁板肥牛\n铁板香芋\n铁板鳕鱼\n板鸭煲\n南极冰藻\n南极冰鱼\n腰果丹麦\n鲜果什锦\n芒果啫喱\n果巴菲\n鲜果庄园\n蜜果甜心\n芒果糯米\n芒果\n芒果雪泥\n芒果雪花\n枣皇糕\n红枣莲心\n柑橘蜜\n柠檬虾\n瑶柱白粥\n栗子塔\n栗子汤\n板栗酥饼\n爆核桃鸡\n培根意麵\n培根浓汤\n培根芦笋\n玛格丽特\n果漾\n格瓦斯\n樱桃之恋\n樱桃炖蛋\n蟠桃\n桑拿虾\n百香桑椹果\n过桥排骨\n马桥香干\n怡宝桶装水\n木桶豆腐\n话梅山药\n话梅花生\n话梅芸豆\n梳芙厘\n棉棉冰\n椒乌鱼\n尖椒肥肠\n椒脆肚\n蘑菇\n椒香鱼\n黑椒鸡肉\n椰子挞\n椰汁冰\n意式椰香包\n榴莲糕\n榴莲角\n鲜橙木瓜\n柠檬蜂蜜\n柠檬蜜茶\n柠檬鲈鱼\n牛肚\n毛圆汤\n毛肚锅\n和民拉面\n牛气冲天\n热气羊肉\n原味气锅鸡\n水果卷\n水火锅\n水煎肉\n八宝\n鸡汁木耳\n蜜汁梅肉\n白汁河豚\n炸鸡\n墨鱼汁烩饭\n干锅花蛤\n汁豆干\n汁豆角\n汁鱿鱼\n汆白肉\n洄鱼\n西江钳鱼\n长江鮰鱼\n清锅\n竹荪\n优质粉汤羊血\n汤苋菜\n上汤豆苗\n汤酥肉\n鱼翅\n汽水肉\n巴沙鱼片\n巴沙鱼锅\n河豚鱼\n沸腾虾\n油墩子\n牛油果卷\n红油百叶\n油粑粑\n油耳片\n红油肚丝\n油腐竹\n油薄饼\n招牌葱油蚕豆\n油面筋\n葱油飞鱼\n宇治抹茶\n青椒蛙\n泡泡锅\n泰香鸡\n泼妇鱼\n洋葱塔\n洛神花\n蛋黄流沙卷\n啤酒\n意式浓菜汤\n海三鲜\n海之恋\n海参粥\n海棠糕\n海苔角\n海蚌片\n三文鱼海蜇边\n海螺肉\n拌海野菜\n青斑\n印度焗咖喱\n海鲜粉\n海鲜酱\n深海鳕鱼\n浸蚌片\n冰激淋月饼\n淋豆腐\n油淋鲈鱼\n淮山卷\n清水锅\n清炖鸡\n清蒸蟹\n油渣莲白\n美丽温哥华\n温泉鸭\n港小炒\n湘之驴\n溜肝尖\n滑双拼\n滚肥牛\n满堂红\n火锅包\n火锅菜\n火龙卷\n火龙虾\n乌鱼\n美极炒三脆\n炒三蔬\n印式风味炒土豆\n炒大肠\n炒小笋\n炒揪片\n渔家小炒海肠\n炒猪皮\n炒猪肚\n炒瑶柱\n炒紫菜\n炒羊杂\n炒胜瓜\n炝炒腰花\n炒苕皮\n蒜苗炒莲菜\n炒螺肉\n炒豆芽\n清炒豆苗\n炒魔芋\n炒鸡架\n炒鸡肾\n炒鹅肠\n炖大鹅\n炖水鸭\n炖酥肉\n炖鹌鹑\n北京炸咯吱\n炸响铃\n炸子鸡\n年糕\n炸灌肠\n炸蚕蛹\n炸饺子\n鱼饼\n口蘑\n燒烤小拼\n烤心管\n原味烤海鲜\n烤牡蛎\n烤目鱼\n烤米糕\n烤羊棒\n烤肉汁\n烤肉酱\n烤脆骨\n招牌烤脑子\n烤豆干\n烤豆腐\n烤银杏\n烤香蕉\n烤香鱼\n烤鱼片\n鲭鱼\n烤鸡胗\n烤鸡脖\n烤鸭卷\n烤鸭羹\n烧乳鸭\n烧元贝\n叉烧双拼\n干烧四宝\n红烧河豚\n烧海螺\n烧海鱼\n烤腌肉\n烧牛尾\n烧笋干\n烧羊棒\n皮蛋烧藕丸\n烧青菜\n叉烧餐包\n烧鮰鱼\n烧鲶鱼\n鸭胸\n鹅酥\n烧鹿肉\n红烧龙虾\n烩丝瓜\n干肉烩土豆\n西芹炒山药\n烩肚条\n烩麻什\n烩麻食\n焖全猪\n焖杂鱼\n焖肉面\n焖萝卜\n焖鸡煲\n芝士焗紫薯\n芝士焗蕃薯\n焗蟹斗\n芝士焗青口\n芝士焗香蕉\n芝士焗鲍鱼\n焦炸丸\n火焰烧虾\n甲鱼\n孜然脆骨\n孜然豆腐\n孜然鱿鱼\n煎海虾\n煎蛋卷\n煎蛋饼\n煎香蕉\n煎鱼柳\n煨豆腐\n煨龙骨\n煲套餐\n煲筒骨\n干煸土豆\n煸鱿鱼\n熏鲳鱼\n熔岩烧\n熘肉段\n熘鱼片\n燕麦粥\n爆仔鸭\n爆爆冰\n爆爆虾\n黄喉\n八爪鱼锅\n爱上鱼\n波霸蟹小姐爱奶油\n真爱排骨\n王牌牛肉\n招牌牛腩\n锅
烙\n招牌鱼粉\n牛三鲜\n牛心管\n牛扒餐\n牛杂汤\n牛油果\n牛筋锅\n酱肉包\n牛肉筋\n牛肩峰\n牛脸肉\n牛舌饼\n鲶鱼锅\n牛蛙面\n牛轧糖\n肥牛金针菇\n牛鞭汤\n牛骨煲\n鸡公煲\n特色鱼\n狗肉汤\n独面筋\n猪杂汤\n黑椒猪爽肉\n猪猪组合\n猪肉煲\n香猪肉片\n猪蹄煲\n猪蹄筋\n猪里脊\n猪颈排\n猪香肉\n柚子玄米茶\n玉柠檬\n玉米粥\n玉米蟹\n玉米面\n焦糖玛其朵\n玫瑰露\n玻璃片\n珊瑚皮\n珍珠汤\n琵琶虾\n青瓜小卷\n香拌木耳\n苦瓜炒蛋\n木瓜炖奶\n煎蛋\n红枣\n肘花\n雪梨\n瓜果鲜奶\n腊味瓦煲饭\n甜甜拼\n甜蜜蜜\n甜面酱\n田源鸡\n田鸡煲\n田鸡粥\n北疆菜羹\n疙瘩面\n白切粉\n白切羊\n白森林\n白菜丝\n白菜汤\n白骨鱼\n百味鸡\n百岁鱼\n咖喱皇飞蟹\n鳕球\n酥皮浓汤\n脆皮炸鸡\n脆皮牛腩\n皮蛋汤\n青椒\n脆皮香蕉\n大盆牛肉\n水盆羊肉\n牛心菜\n益菌多\n盏通菜\n盐水虾\n盐水鹅\n盐焗鸽\n椒盐牛蛙\n椒盐龙虾\n盖锅巴\n目鱼花\n骨肉相连串\n石子饼\n石榴花\n石锅粉\n石锅蛙\n矿泉水\n石锅砂锅面\n牛肉\n碳烧肉\n神之恋\n汤锅\n食神豆腐\n鸡三宝\n马来西亚福建面\n私房虾\n章鱼须\n芋儿竹笋鸡\n竹荪鹅\n笋尖粉\n莴笋山药\n笋干煲\n筋皮子\n方签牛肉\n奶露\n玉米松仁\n米血糕\n意大利肉酱意粉\n蟹粉小笼\n糊塌子\n年糕套餐\n芝士年糕\n红糖冰粉\n糖红枣\n糖醋肉\n红糖馒头\n糯米肉\n素排骨\n素火腿\n素烧鸭\n素鸡煲\n紫米烧\n紫菜煲\n紫薯包\n紫薯卷\n紫薯托\n紫薯汁\n碳火紫薯秀\n红宝石\n红汤面\n红沙鱼\n红白锅\n红糖饼\n红菜苔\n红豆烧\n缤纷欢舞\n纸包鸡\n锡纸鲈鱼\n综合冰\n绿豆面\n绿豆饼\n缤纷卷\n罐焖牛肉\n罐罐菜\n招牌罗宋包\n罗氏虾\n罗汉肚\n羊头肉\n烤羊套餐\n羊小排\n羊脆骨\n羊腩煲\n羊腿排\n美蛙鱼\n翡翠包\n老三鲜\n老火汤\n老爆三\n老陕菜\n银耳南瓜\n木耳桃仁\n木耳炒蛋\n炖雪梨\n肉丝汤\n牛肉冒菜\n牛肉土豆条\n塔可\n大葱\n驴肉大饼\n牛肉宽面\n肉松包\n肉松饭\n牛肉泡饭\n肉浓汤\n羊肉焖饼\n牛肉粿条\n肉饼饭\n烤鹅肝\n肠火锅\n肥肠饭\n开胃萝卜\n开胃豆腐\n开胃鸡片\n胚芽乳\n胡椒肚\n招牌能量\n脆肉鲩\n鲜虾脆脆棒\n脆虾肠\n脆锅巴\n脆鱿鱼\n脚牛肉\n腊八蒜\n腊排骨\n腊肉饭\n腌生虾\n腌萝卜\n豆腐拼盘\n臭豆腐虾皮\n牛腩捞面\n咖喱腰果鸡\n沸腾草鱼\n沸腾鱼片\n沸腾鲶鱼\n火腿微萨\n炸膝软骨\n油炸臭干子\n牛舌拼盘\n特色凉菜\n蓝色妖姬\n白色恋人\n特色蘸料\n诱惑\n艾窝窝\n芋圆捞\n芋头冰\n芋头煲\n芋泥鸭\n芋艿煲\n芒果杯\n芒果派\n芒果贝\n芙蓉汤\n芙蓉蟹\n芝士包\n芝士棒\n芦笋卷\n芦笋汤\n芭夯兔\n花之卷\n桂花乌龙\n花排骨\n枝烧\n花生汁\n花生浆\n花生粥\n花生酥\n百花稍梅\n桂花芋艿\n芸豆卷\n豆芽炒粉\n苏子叶\n海苔拌饭\n苦荞馍\n番茄浓汤\n番茄鱼片\n番茄鱼锅\n茴香豆\n茗茶一壶\n茶大福\n抹茶松饼\n茶泡饭\n茶皇虾\n抹茶雪糕\n香草排骨\n荔枝蜜\n荞麦\n竹荪三鲜\n荷叶饼\n茉莉花茶\n莓优格\n莓千层\n莓多士\n莓新地\n梅类果汁\n草莓薄饼\n莓金砖\n班戟\n榴莲布蕾\n蘑菇城堡\n香菇油菜\n蘑菇烩饭\n金菇脆笋\n菇鸡煲\n炒菊花菌\n玉米菜团子\n青菜小拼\n菜淡皮\n清炒\n白菜炒虾\n德式泡菜烩饭\n豆皮\n鲜肉\n韭菜鸡蛋\n菠菜卷\n菠菜塔\n菠菜汤\n菠菜饭\n菠萝堡\n萝卜包\n菠萝可颂\n葛根粉\n葡国鸡\n葡萄干\n葡萄柚\n葡萄鱼\n香葱豆腐\n葱香鸡\n蒙布朗\n蒜羊血\n蒸盆子\n蒸粉果\n蒸肠头\n蒸肥肠\n蒸蚬子\n芙蓉蟹粉\n蒜蓉龙虾\n蓝莓味\n蓝莓派\n鲜蔬组合\n蔬菜吧\n蔬菜盘\n蔬菜羹\n香蕉松饼\n薯小弟\n金薯沙律\n藤椒鸡\n西域蘸酱菜\n虹鳟鱼\n活虾双吃\n龙虾拌面\n牛肉丸\n炖蛋\n虾肉卷\n蚝仔煎\n蚬子肉\n蚵仔煎\n蛋拌饭\n蛋灌饼\n蛋牛肉\n蛋花粥\n蛋角煲\n皮蛋豆花\n鸡蛋醪糟\n牡蛎煎蛋\n青蛙撞奶\n蛙蛙鸡\n雪蛤炖蛋\n蜗牛汤\n意式蜗牛饭\n蜜桃汁\n蜜鹅肝\n蝴蝶酥\n螺旋藻\n蟹壳黄\n蟹籽丸\n蟹籽包\n补炖汤\n油泼裤带面\n煎裹蒸粽\n西柚茶\n西瓜冰\n诱惑鸡\n调料区\n蔬菜大丰收\n五谷香馍\n豆塔塔\n红豆抹茶\n土豆排骨\n豆曲奇\n豆沙粽\n豆皮卷\n豆腐丸\n豆腐泡\n豆腐虾\n
红豆薄撑\n豆角丝\n烤鱼\n香煎饼\n豚平烧\n川贝雪梨\n费城卷\n赖尿虾\n越南卷\n踏板鱼\n猪蹄火锅\n蹦鲤鱼\n刺身拼盤\n辣三鲜\n辣乌鱼\n辣双脆\n香辣板筋\n辣椒面\n辣汤面\n麻辣爆肚\n麻辣牛肚\n瓜条\n竹笋\n辣羊蹄\n辣肉面\n肋条\n麻辣脑花\n草鸡\n麻辣蜗牛\n麻辣诱惑\n麻辣鱼片\n糟辣鲤鱼\n香辣鸡杂\n鸭肠\n边馍馍\n过江肠\n过江鱼\n鸿运当头\n大连鲍鱼\n起酥肉松\n酥脆鸡\n奶酪山药\n酱凤尾\n酱卤肉\n腌酱拼盘\n云吞炸酱捞面\n酱料\n酱汁肉\n汁锅\n酱油渍\n酱油鸡\n麻酱花卷\n酱香骨\n酱脊骨\n酸辣碟\n酒酿元宵\n酿皮子\n酿秋茄\n鱼酿鱼腐\n醉排骨\n醉牛肉\n双脆\n醋溜鸡\n醋烧鸡\n野菌菇\n野菜卷\n野菠菜\n野香菇\n野鸡蛋\n招牌金瑶鸡\n黄金虾卷\n金钱手\n金鸭梨\n钵仔糕\n铁锅炖\n银雪鱼\n银馒头\n石锅三国\n干锅三素\n锅仔鸡\n大肠\n虾串\n火锅大锅\n干锅掌翅\n锅汤底\n石锅泡饭\n大锅海鲜\n羊杂\n美蛙\n石锅菜饭\n干锅藕条\n干锅豆角\n石锅香芋\n平锅鲤鱼\n干锅鲶鱼\n锅鳕鱼\n鸡杂\n无锡排骨\n冰镇龙虾\n长脚蟹\n澳门烧肉\n洛阳燕菜\n陆双拼\n雪花肥牛\n花雕醉枣\n花雕醉鸡\n雞尾酒\n雪文治\n雪梅娘\n雪梨粥\n雪花糕\n雪莲子\n雪蛤粥\n霹雳泡饭\n招牌雷公鸭\n抹茶霜淇淋\n青瓜烙\n面包圈\n面包塔\n面包干\n面包虾\n拼盤\n风塘虾\n风沙鸡\n餐包仔\n饸饹面\n饼夹肉\n奶绿\n五香牛杂\n绿茶\n酱香肘花\n香肠面\n香肠饭\n香肺片\n香芋派\n香芋饼\n香芒糕\n苦菊\n香蕉酥\n蛋仔\n香豆皮\n香辣汤\n香辣鸭\n香酥饼\n香锅兔\n奶香馒头\n香鱼块\n酱香鳕鱼\n麻辣\n马步鱼\n闷骚南瓜\n骨拼盘\n骨汤面\n清汤\n骨湯底\n脆骨牛肉\n脆骨饭团\n骨香鸡\n骨鱼煲\n无骨鸭蹼\n魔芋结\n魔鬼鱼\n鱼下巴\n烤鱼伴侣\n鱼唇汤\n鱼嘴巴\n鱼子卷\n手握\n拌菜\n烤鱼泡泡\n鱼泡馍\n鱼清汤\n鱿鱼火烧\n炒蛋\n鱼炖鸡\n鱼细卷\n有鲜鱼\n鱼薄切\n鱼蛋皇\n鱼蛋角\n鱼钩鱼\n鱼锅贴\n闷锅\n鱼香锅\n鱿鱼饼\n鲍鱼片\n鲜兔腰\n鲜果杯\n鲜果汁\n鲜果花\n海鲜焗面\n鲜牛肝\n鲜百叶\n鲜百合\n鲜肉粽\n鲜蔬卷\n鲜蔬汤\n鲜鱼面\n鲨鱼肚\n鳄鱼汤\n宫爆鳕鱼丁\n鳕鱼煲\n鳕鱼球\n鳗龙卷\n鳝丝面\n烤鸡匹萨\n鸡杂面\n鸡杂饭\n招牌鸡汤饭\n鸡爪煲\n鸡肉片\n奥尔良鸡肉粒\n鸡薄饼\n东坡鸡豆花\n鸡里蹦\n鸭头虾\n鸭板肠\n鸭架子\n鸭爪煲\n招牌莓酱\n鸭肉饭\n鸭脆肠\n鸭脚煲\n鸭脚筋\n鸭腿饭\n鸳鸯中锅\n鸳鸯卷\n鸳鸯糊\n鸳鸯虾\n鹅胸肉\n麦乐鸡\n蟹钳爱上麻得跳\n麻椒鸡\n芝麻火烧\n麻辣粉\n黄桂粥\n黄泥螺\n韭黄炒蛋\n黄牛汤\n黄腊丁\n黄金贝\n蛋黄锅巴\n黄鱼面\n黑土鸡\n黑曲奇\n黑米粽\n黑豆芽\n黑鮰鱼\n松鼠桂鱼\n龙利柳\n龙头鱼\n龙胆鱼\n招牌龙虾饭\n宫保鸡丁盖饭\n上脑\n焖鱼下巴锅\n绝世超芒\n东东包\n拔丝土豆\n拔丝地瓜\n金丝排骨\n关中杂拌\n鸡中翅锅\n雪中送炭\n烤串拼盘\n丽娜卷\n乌冬\n酷乐鲜柠\n南乳排骨\n乳猪\n仔猪煲\n味付三拼\n味付海藻\n伙伴肥牛\n宫保大虾\n菌汤兔肉锅\n全牛\n新西兰肥羊\n米兰虾饼\n兰馨锅\n绍兴醉鸡\n经典生煎\n冒牛肉\n鲜冬瓜片\n海鲜芝士乌冬面锅\n冰沙\n冰粉\n冰粥\n冷菜\n冷面\n冻柠七\n冻柠红\n凉皮\n凉粉\n熊猫凉糕\n凉茶\n凤爪\n大刀羊肉\n太极金刚煲锅\n秘制酱骨\n秘制鹅肝\n刺身\n餐前小食\n巧克力派对\n巧克力瀑布\n冬阴功汤锅\n面包切片\n大肠包小肠\n北冰洋\n披萨\n鸡叻\n千张\n元朗荣华月饼\n套餐+单点\n南瓜\n小园子\n萝卜丝饼\n萝卜炖牛腩\n卤味\n卤煮\n主厨浓汤\n叉烧肉\n一口多士\n爽口泡菜\n爽口苦菊\n可可\n虾吃虾涮\n小吊梨汤\n吐司\n腊味合蒸\n风味拉皮\n原味豆滋\n四味锅2\n拾味鱼扒\n绝味鸭架\n烤肉和生肉\n咖啡\n咖喱饭\n豆制品四拼\n一品锅饭\n一品鲈鱼\n咖喱火锅\n泡馍\n四拼锅\n江团焖锅\n江团鱼锅\n田园小炒\n庄园肥牛\n芋圆\n芋圆系列\n拌土鸡\n圣代\n地瓜\n老坛酸菜\n好莱坞炸篮\n培根卷\n培根锅\n避风塘海虾\n芝士宝贝\n芝士牛肉\n茉莉绿茶\n芝士薯球\n芝士蟹柳\n波士顿卷\n贝壳+虾\n塞外锅巴\n马来四大天王\n三文鱼大满足\n巧克力大爆炸\n大盘兔\n大锅\n大饼\n大骨\n寒天
爱玉\n渔夫泡饭\n功夫青笋\n功夫鲈鱼\n桥头炸肉\n鱼头诱惑\n菌王奇香锅\n咖啡奥利奥松饼\n椰奶凉糕\n法式奶油鸡\n奶茶\n外婆鱼头\n宋嫂鱼羹\n嫩肉\n梅子奶酪\n正宗羊肉\n客家咸鸡\n宽粉\n富贵卷\n寒之寒\n金丝对对虾\n寿司\n法式甜蜜小包包\n麻酱调料\n火车头小焖锅\n小鱼小牛锅\n开胃小菜\n尚一汤\n九尺鹅肠\n长岛冰茶\n川粉\n杭州卤鸭\n兰州酿皮\n巨蛋烧\n锅巴排骨\n锅巴藕丸\n海蛎子巴蛸锅\n铁板帆蛎贝\n花生海带双拼\n带子\n带鱼\n干碟\n干锅\n管鱼\n江南的年糕蟹\n印度薄饼\n一座城池\n宫廷奶酪\n浪漫法式塔锅\n清蒸彩虹斑\n德庄汤\n回忆盒饭\n鱼\n夏威夷风情沙律\n两情相悦\n三文鱼蛋黄汁烩意粉\n意面\n孝感米酒\n扇贝\n焖扇贝锅\n芒果牛奶手杯冰\n打糕\n扯面\n猫抓糍粑\n抹茶\n炸鸡沙拉拌饭\n大拌菜\n芋圆招牌餐\n刺身拼盘小\n拼锅\n双椒鱼头\n鸡排伴侣\n鸡排大亨\n接吻鱼\n雪蟹脚+海虹+撒尿虾\n凯撒萨拉\n文鱼\n万岁寿司·料理\n新地\n早餐\n比利时薯球\n甜虾明太子\n东北春饼\n水晶焖锅\n水晶皮冻\n水晶鸭肝\n切片布朗尼塔\n沙朗牛肉\n木桶鱼\n木桶鸡\n木瓜\n炒木耳\n木须肉\n章鱼\n皇牌玛奇朵抹茶\n杂菜\n马来风光\n松茸\n牛板筋\n铁板肥肠\n铁板蛏子\n板鸭\n黑森林金砖\n芒果千层\n芒果椰奶\n水果盆栽\n时令水果拼盘\n果立方\n金枪鱼香果脆饼\n辣子鸡配芒果蛋黄\n雪冰\n荔枝烧肉\n枣糕\n枣豆糕\n蜜枣银心\n西红柿蛋汤\n比格自助\n木桶牛肉\n木桶葡挞\n南瓜条\n梅肉\n泡椒酸菜\n麦香芒椰果捞\n榴莲\n冰爽柠檬柚子\n正蟹煲\n酒水饮料\n蜜汁火方\n嫩汁牛烧\n蜜汁红枣\n鸡汁锅贴\n汉堡\n江蟹生\n汤包\n汤圆\n酸汤羊肚\n汽水\n云南汽锅鱼\n尖椒兔丁\n小油条\n三文鱼牛油果塔\n香油润鱼\n蒜泥香油\n葱油鲈鱼\n泡泡油糕\n泡芙\n泡饼\n芝士波波肠\n蒜泥青瓜\n滇香三阳开泰养生\n徐同泰酱油\n非洲冰草\n澳洲龙虾\n派大星\n泡菜流川锅\n海宝船\n海星\n海盗船\n海胆\n爽口海草\n超级海陆空\n海鲜区\n涮菜\n海鲜火锅\n清汤面\n清蒸鱼\n清酒\n田园清香蛙\n台湾卤肉\n田源鸡汤\n日月潭豆腐\n肉火烧\n火腿\n穿心莲\n白灼章鱼\n芥兰\n桂花炒元宵\n栗子\n炒面\n猪油菜炒饭\n拆骨肉炖菜干\n炖鸡\n火炙蟹棒\n炸支竹\n炸肉\n炸薯角\n炸雞\n芝士炸鸡锅\n烙饼\n烤乳酪\n芝士烤元贝\n香烤兔\n京酱烧烤寸骨\n烤海螺\n烤猪皮\n烤盘\n烤全羊\n烤翅\n烤肉\n烤肉筋\n烤腰子\n烤薯皮\n烤饼\n烤鸡\n烧卖三鲜\n牛肉烧多士\n锅烧春笋\n招牌烧肉粒\n烧饼\n烧鹅\n大热狗\n焖面\n焖黑鱼\n盐焗乳鸽\n芝士焗野菌\n火焰海螺\n煎面\n煎饺\n煎鱼\n小笨鸡\n浓汁煮鱼唇\n木瓜芦荟鱼肚\n煲蒸饭\n熏肉\n燃面\n燕窝\n燕饺\n爆肚\n甜汁爆鲜虾\n爱琴海\n招牌卷饼\n妈妈牌杂菇\n招牌海苔\n招牌牛河\n招牌贡茶\n香辣咖喱牛肉丸＆弹牙鱼蛋\n牛太郎\n牛奶\n牛排套餐\n牛柳\n牛筋\n牛肝\n牛舌\n虾、鱼、牛蛙\n虾、牛肉、牛蛙\n牛骨髓\n炸物拼盘\n狮子王\n狮王冻\n猪排\n猪皮\n猪肚\n猪血猪红\n猪蹄\n熊猫宝宝\n熊猫果茶\n玉米烙\n霸王鸡条\n环境\n玫瑰之恋\n玫瑰圆子\n木瓜雪杏\n香甜炸鸡\n野生草鱼\n三巴酱野生菌煲\n炒田螺\n吐鲁番米糕\n新疆香囊\n瘦肉粉\n纠结的大饼\n秋天的童话\n脆皮椰奶\n北欧鲑鱼皮焗卷\n皮蛋炒椒\n脆皮鲜奶\n椒盐排条\n鸭架\n石锅鸡\n肉碎锅巴\n碓窝鸡\n研磨时光\n青柠洛神冰茶\n福寿桃\n青稞扎萨\n海陆空焖锅\n大切章鱼皇\n笃笃笃糖粥\n笋\n笋片\n筋面\n筋饼\n试管饮料\n米皮\n米皮儿\n炒米粉\n鸡爪米糕铲\n过桥米线\n米酒\n糯米饭\n芭夯菌类汤锅\n粉丝\n粉条\n粥底\n年糕香排\n糖不甩\n枫糖唱片\n糖粥\n糖蒜\n糖饼\n素菜\n素鸡\n素鸭\n紫薯\n特色干煸粉絲蟹煲\n红杏鸡\n红肠\n红茶\n小红薯\n烈焰阿根廷红虾卷\n相思红豆奶\n红酒\n红锅\n将军高级拌饭\n综合锅\n罐牛\n摩托罗拉卷\n精灵署小弟\n特色羊排\n海鲜+羊肉\n羊血\n精美水果\n羹\n翅尖\n烤翅根\n翠鱼\n翡翠鱼\n老碗鱼\n茶树菇老鸭饭\n木耳辣根\n鲜肉月饼\n肉松酥\n肉滑\n肉筋\n肉粽\n牛肉诱惑\n烂肉豌豆\n牛肉辣汤\n肉饼\n羊肉馍馍\n囊包肉\n五香肘\n烧肠粉\n羊肉\n肥羊粉\n肯德基\n开胃泡菜\n
海胆手卷\n龙脂猪血\n脆骨\n筋头巴脑小锅\n腊肉\n秘制腐乳鸭\n拌腐竹\n腰子\n非一般鲈鱼\n特色果汁\n三色炸鸡\n香芋排骨\n榴芒双拼\n泡芙小姐\n荷花卷\n木瓜银耳\n花茶\n鲜花菜\n樱花虾羹\n炒花蛤\n花鲢\n豆芽面筋\n提拉米苏切块\n提拉米苏切片\n苕皮\n苕粉\n鸡汤苦麦菜\n番茄汤底\n绿茶佛饼\n抹茶杏仁\n藏茶火锅\n草鱼锅\n生蚝\n草莓罐头\n榴莲匹萨\n榴莲忘返\n榴莲飘香\n榴莲飞饼\n榴莲鸡锅\n莴笋\n莴笋尖\n野生菌六拼\n菌菇\n香菜干丝\n油菜心\n菜汤\n韭菜盒子\n菠萝派\n201108312499華萊士\n紫葡之恋\n内蒙肥羊\n蒸蛋\n蒸面\n蒸鱼\n蒸鸡\n红虾\n蒜蓉茄子\n蒜蓉茼蒿\n香蕉薄饼\n炸薯条\n香脆薯格\n薯球\n薯角\n薯饼\n鲜藕片\n香辣锅\n炒虾仁\n虾片\n虾球\n虾籽面\n蚂蚁上树\n香辣蚝子鱼\n蛋挞\n奶油培根蛋汁面\n蛋饺\n炒竹蛏\n黄金蜂蜜绿\n蛤蜊炖蛋\n闺蜜咖喱\n螺肉\n海螺酱烧\n蟹味菇\n蟹大脚\n蟹柳\n蟹棒\n螃蟹焗饭\n营养蟹籽饭\n蟹足棒\n血肠\n菌类番茄滋补三锅\n口袋豆腐\n褡裢火烧\n西瓜\n茄角之恋\n让豆腐\n麻辣诱惑虾\n豆浆\n脆豆腐\n豌豆肥肠\n黄豆芽\n干煸豆角\n铁板豆豉鸡\n豚肉排\n虾贝组合\n西贝面筋\n土贡梅煎\n财鱼\n水货沙律\n刺身双拼\n刺身小拼\n瘦身海藻\n九转大肠\n辣松\n南洋鸡肉卷&辣翅\n辣锅\n酱汁香辣鮰鱼\n麻辣鳝段\n鸿运龙船\n七里香迷中蟹\n普通羊肉\n通菜\n郡花\n京都豆腐\n酒\n红酒雪梨\n油酥火烧\n酥肉\n酥饼\n麻酱拉皮\n麻酱酿皮\n海鲜酱闷锅\n酸奶\n酸汤\n酸菜\n酿皮\n醉鸡\n糖醋鲤鱼\n野菜\n黄金大饼\n黄金蟹斗\n铜锅\n吴铭毛肚\n组合焖锅之一\n石锅扇贝\n闷锅生料\n锅贴\n石锅酱汤\n冰镇黄桃\n鱼焖锅\n冬阴功面\n乾隆鱼头\n自助食材集体照\n雪娃娃\n雪碧\n雪蟹\n波霸奶绿\n青笋片\n青笋尖\n熊猫靓雪饼\n面包等\n面片\n热面皮\n自选青菜、菌类、面类等\n韭菜饼\n养颜木瓜\n肉丁馒头\n韩餐生肉\n蜂蜜柚子茶\n水饺\n饺子宴\n馄饨\n馅饼\n馒头片\n馕\n香妃鸡\n芥香章鱼\n香肠\n香草\n菌类合盘\n百香雪酪\n酱香龙骨\n驴杂汤\n驴肉\n骨棒\n筒骨汤\n蒜香排骨火烧\n鬼鸡\n魔芋\n鮰鱼\n探鱼冰粉\n鱼排\n鱼滑\n鱼肉\n鳗鱼蒲烧\n鱼面\n鱿鱼脆\n鲍鱼饭\n鲑之恋\n咖喱海鲜喇萨\n鲜奶\n海鲜葱饼\n鲷鱼\n鲽鱼柳\n黑椒鲽鱼煲\n大盘鸡\n鸡块\n和风香鸡堡1\n鸡尖\n鸡心\n鸡排\n雪花鸡柳\n鸡皮\n鸡翅\n炸鸡腿\n鸭头\n鸭心\n鸭汤\n鸭爪\n老卤鸭翅\n鸭肚\n极品鹅肠\n鸭脖\n鸭舌\n鸭血\n糍粑\n乳鸽焗饭\n鹅肠\n蒙鹿肥羊\n麻团\n麻花\n芝麻香饼\n黄啤\n黄馍馍\n黑啤\n黑拉皮\n黑鱼锅\n三足鼎立锅\n松鼠鲈鱼\n松鼠鳜鱼\n龙鱼\n绿茶龟宝宝\n西米露\n红豆沙\n双皮奶\n水果捞\n甜甜圈\n红豆冰\n杨枝甘露\n红豆糕\n蛋糕\n冰淇淋\n华夫饼\n冰激淋\n冰淇凌\n冰激凌\n西米捞\n凤梨酥\n薯糕\n芒果冰\n金捞\n千层糕\n鲜芋仙\n薄饼\n炖雪蛤\n下午茶\n布甸\n三兄弟\n莲子汤\n芒果爽\n雪山包\n紫米糕\n水果塔\n温泉蛋\n慕斯蛋糕\n巧克力蛋糕\n酪蛋糕\n奶蛋糕\n奶油蛋糕\n黑森林\n猪肉丸\n生日蛋糕\n杯子蛋糕\n布丁蛋糕\n栗子蛋糕\n小蛋糕\n起司蛋糕\n榴莲蛋糕\n抹茶蛋糕\n戚风蛋糕\n凹蛋糕\n墨鱼丸\n花枝丸\n甜品小丸子\n章鱼丸\n鱼丸鱼蛋\n贡丸\n虾丸\n撒尿牛丸\n炸丸子\n红烧丸子\n鱼球\n芝士丸\n牛筋丸\n慕斯\n千层\n芝士\n菜丸子\n芋丸\n酒酿丸子\n圆子\n丸子\n双层蛋糕\n水果蛋糕\n公主蛋糕\n彩虹蛋糕\n熔岩蛋糕\n寿司蛋糕\n冰淇淋蛋糕\n果蛋糕\n宝丸\n土笋冻\n冰激凌蛋糕\n莓蛋糕\n年轮蛋糕\n茶蛋糕\n鸡蛋糕\n羊肉丸\n桃蛋糕\n岩盐蛋糕\n红丝绒\n小青菜\n茼蒿\n蓬蒿菜\n皇帝菜\n蒿子杆\n蒿子秆\n杂青菜\n空菜苗\n西瓜汁\n柠檬茶\n桔柠檬\n水果茶\n可乐\n竹蔗水\n奶盖可可\n奇异果汁\n荞麦面\n年糕火锅\n辛拉面\n三文鱼刺身\n三文鱼北极贝\n牛肉刺身\n虾刺身\n蚌刺身\n水果沙拉\n蔬果沙拉\n芒果沙拉\n炒乌东\n威士忌\n酱鳕鱼\n猪骨锅\n牛油果沙拉\n鲜果沙拉\n海鲜粉丝沙律\n土豆丸\n番茄炒蛋\n罗汉笋\n鸭四宝\n香芒卷\n抹茶金砖\n什锦蔬菜\n杏鲍菇\n辣椒炒肉\n烧牛肉\n烧甲鱼\
n炭烧肉\n粉烧肉\n咖喱牛肉饭\n柠檬鱼\n秋椒鱼\n双椒鱼\n青椒钵钵鱼\n烤肉金针菇卷\n海鲜菇\n干拌面\n韩国拌面\n猪头肉\n黑猪肉\n口水鸡\n茶香鸡\n大什扒\n海杂鱼\n香鱼片\n葱油鸡\n白切鸡\n小海参\n剔骨肉\n玉米排骨汤\n排骨汤\n紫菜包饭\n蛋包饭\n鸡翅包饭\n鱼包饭\n虾炒饭\n蛋炒饭\n海鲜炒饭\n牛肉炒饭\n菠萝炒饭\n什锦炒饭\n蔬菜炒饭\n鸡肉炒饭\n鹅肝炒饭\n肉粒炒饭\n海苔炒饭\n扬州炒饭\n鱼炒饭\n芝士炒饭\n咖喱炒饭\n叉烧炒饭\n铜锅饭\n葱烧虾\n椒炒蛋\n培根炒饭\n千刀肉\n豆角肉末\n南瓜饭\n酱油炒饭\n鸡肉串\n羊肉串\n牛肉串\n串串香\n串串虾\n猪肉串\n叉烧包\n小笼包\n奶黄包\n烤包子\n大包子\n碎肉生菜包\n素菜包\n肉包\n牛肉披萨\n鸡肉披萨\n海鲜披萨\n培根披萨\n榴莲披萨\n土豆比萨\n芝士披萨\n至尊披萨\n风情披萨\n香肠披萨\n烤肉披萨\n咖喱鸡\n土豆片\n小土豆\n土豆球\n烤土豆\n香辣土豆\n土豆饼\n糖醋排骨\n蒸排骨\n香汁排骨\n话梅排骨\n蒜香骨\n排骨饭\n蜜汁骨\n芒果汁\n苹果汁\n橙汁\n炖鸡汤\n蘑菇汤\n海带汤\n南瓜汤\n清远鸡\n酥皮\n土豆沙拉\n蔬菜沙拉\n玉米沙拉\n鸡肉沙拉\n凯撒沙拉\n木瓜沙拉\n海鲜沙拉\n蟹子沙拉\n牛肉沙拉\n帝王蟹\n基围虾\n咖喱虾\n香辣虾\n甜虾\n油爆虾\n焗大虾\n香酥虾\n黑虎虾\n玉米虾\n串烧虾\n元宝虾\n濑尿虾\n烤明虾\n焗明虾\n盆盆虾\n煮花螺\n面包蟹\n炒花蟹\n辣大闸蟹\n大闸蟹\n凤尾虾\n大明虾\n开背虾\n椒盐虾\n开边虾\n白灼虾\n南美虾\n皮皮虾\n酥皮虾\n炒河虾\n酱油虾\n蒜蓉虾\n鸡翅虾\n盐焗虾\n烤虾\n蹄花虾\n粉丝虾\n铁板虾\n斑节虾\n花螺\n炒螃蟹\n烤鸭\n片皮鸭\n香酥鸭\n手撕鸭\n加积鸭\n茶香鸭\n脆皮烧鹅\n樟茶鸭\n酱鸭\n脆皮鸭\n小刀鸭\n排骨\n牛排饭\n猪小排\n辣小排\n蜜汁小排\n嫩牛肉\n牛五花\n酱牛肉\n五香牛肉\n嫩汁肥牛\n烤牛肉\n牛肉片\n香爆牛肉\n麻辣牛肉\n蒙古牛肉\n肥羊\n牛排肉\n孜然牛肉\n酸菜牛肉\n拌牛肉\n五香驴肉\n滑牛肉\n牦牛肉\n生牛肉\n布丁\n提拉米苏\n暴风雪\n榴莲雪糕\n上汤娃娃菜\n小白菜\n大白菜\n辣白菜\n圆白菜\n洋白菜\n海白菜\n南瓜粥\n海鲜粥\n虾蟹粥\n小米粥\n皮蛋瘦肉粥\n八宝粥\n黑米粥\n瘦肉粥\n鲜虾粥\n桂花莲藕粥\n蚝仔粥\n菌菇粥\n百合粥\n芦荟羹\n牛肉卷\n羊肉卷\n猪肉卷\n小龙虾\n大龙虾\n护心肉\n梅花猪肉\n焖大虾\n招财鱼\n手抓排骨\n九节虾\n辣鸡架\n茄子煲\n酱茄子\n烤茄子\n烧茄子\n风味茄盒\n铁板茄子\n鱼香茄子\n豆角茄子\n毛豆\n炒生菜\n橄榄菜\n香辣蟹\n梭子蟹\n蟹脚\n蟹钳\n豆腐煲\n炖豆腐\n麻婆豆腐\n家常豆腐\n自磨豆腐\n麻豆腐\n风味豆腐\n一品豆腐\n蟹粉嫩豆腐\n肉酱面\n牛肉拉面\n海鲜面\n打卤面\n手擀面\n豚骨拉面\n车仔面\n公仔面\n牛杂面\n鸡蛋面\n牛杂河粉\n手撕面包\n菠萝包\n红豆包\n烤面包\n奶酪面包\n冰淇淋面包\n香蒜面包\n水果面包\n奶油面包\n葡萄干面包\n牛奶面包\n核桃面包\n咖啡面包\n小面包\n肉松面包\n毛毛虫面包\n蜂蜜面包\n石锅饭\n滑鸡饭\n蘑菇拌饭\n茄子拌饭\n煲仔饭\n一品煲\n泡菜饭\n虾饺\n抄手\n煎鳕鱼\n银鳕鱼\n烤鳕鱼\n烧鳕鱼\n香辣鱼\n麻辣鱼\n黄焖鸡\n鸡肉煲\n白菜心\n白灼菜心\n杏仁豆腐\n芝士玉米\n蛋黄松子玉米粒\n玉米饼\n泡椒牛蛙\n香锅牛蛙\n馋嘴蛙\n炒牛蛙\n焖牛蛙\n香辣牛蛙\n跳跳蛙\n水煮牛蛙\n霸王蛙\n黄焖鱼\n小面\n臊子面\n担担面\n摊摊面\n酸辣面\n味千拉面\n芥末墩\n奶酪牛排\n沙朗牛排\nT骨牛排\n排骨肉\n冰镇秋葵\n炒肉丝\n肉丝饭\n陈村粉\n咖喱炒河粉\n烤肉饭\n北极贝\n泡椒黄喉\n黄烩饭\n雪菜蒸黄鱼\n泰式酿番茄\n太爷鸡\n西芹果仁炒虾\n海鲜豆腐\n鲜木耳\n芹菜花生\n鲜金针菇\n烤金针菇\n白菜炖粉条\n绿菜炖粉条\n新鲜娃娃菜\n粉丝娃娃菜\n丝瓜汤\n月亮虾饼\n鲜玉米\n玉米杯\n鱼皮寿司\n凉拌香干\n韭菜炒香干\n咖喱鸡肉饭\n咖喱猪排饭\n榄菜四季豆\n鲜生菜\n鲜圆生菜\n蒜台炒肉\n纸杯蛋糕\n菲力牛排\n珍珠奶茶\n口味蛋糕\n芝士焗饭\n曲奇饼干\n西冷牛排\n牛肉汉堡\n三文鱼寿司\n巧克力慕斯\n海绵蛋糕\n夹心蛋糕\n焦糖布丁\n翻糖蛋糕\n黑椒
牛排\n牛肉火锅\n火腿披萨\n肉三明治\n鸳鸯锅底\n芒果布丁\n海鲜意面\n热巧克力\n咖喱牛肉\n肉酱意面\n吐司面包\n卡通蛋糕\n番茄意面\n金枪鱼沙拉\n培根意面\n香草冰淇淋\n辫子面包\n烤五花肉\n原味奶茶\n波士顿龙虾\n肉眼牛排\n炒乌冬面\n黑森林蛋糕\n娃娃蛋糕\n鲜肉馄饨\n鳗鱼寿司\n酸奶冰淇淋\n全麦面包\n抹茶拿铁\n鸡蛋煎饼\n红豆奶茶\n猪肉水饺\n特色炒饭\n三文鱼沙拉\n芒果慕斯\n肉馅饺子\n西冷牛扒\n鱼炖豆腐\n金枪鱼三明治\n奶油意面\n猪肉饺子\n柠檬红茶\n莫吉托\n牛奶布丁\n鸡蛋三明治\n布丁奶茶\n火锅米线\n鱼豆腐汤\n豆腐丸子\n肠煲仔饭\n鸡煲仔饭\n面包布丁\n蜜糖吐司\n火腿三明治\n鸡肉焗饭\n肉酱意粉\n鹅肝寿司\n鸡肉意面\n田园沙拉\n水果色拉\n豆沙面包\n双拼披萨\n排骨煲仔饭\n咖喱鸡饭\n蘑菇披萨\n芒果冰沙\n鸡腿汉堡\n鸡肉色拉\n牛肉米线\n芒果冰淇淋\n苏打饼干\n拿铁咖啡\n萝卜丸子\n芝士汉堡\n金枪鱼寿司\n麻辣锅底\n酸奶慕斯\n芝士慕斯\n烟熏三文鱼\n奶油蛋糕卷\n可乐鸡翅\n丝袜奶茶\n蘑菇意面\n牛肉乌冬面\n夹心饼干\n烧牛肉面\n梅菜扣肉\n手握寿司\n豆沙月饼\n鸡肉汉堡\n裱花蛋糕\n葡式蛋挞\n砂锅米线\n玫瑰蛋糕\n燕麦饼干\n黄油饼干\n三文鱼腩寿司\n鱼烧豆腐\n糖玛奇朵\n焗土豆泥\n椰蓉面包\n雪花牛排\n迷你蛋糕\n天使蛋糕\n肉丝炒面\n特色烤鱼\n乳脂蛋糕\n美式咖啡\n摩卡咖啡\n山水豆腐\n芥末章鱼\n芝士意面\n老母鸡汤\n白菜豆腐\n烤猪肋排\n香蕉蛋糕\n戚风蛋糕卷\n肉馅水饺\n鲜肉大混沌\n清汤锅底\n海鲜意粉\n汤鸳鸯锅\n芭比蛋糕\n脆皮玉米\n欧式蛋糕\n巧克力冰淇淋\n黑椒牛扒\n蘑菇浓汤\n牛奶吐司\n蔬菜沙律\n萝卜蛋糕\n萝卜丝汤\n牛肉意面\n花朵面包\n芝麻汤圆\n粉蒸排骨\n港式奶茶\n干锅牛蛙\n黑椒牛柳\n芝士沙拉\n芒果班戟\n火腿炒饭\n海鲜烩饭\n牛油果色拉\n杏仁饼干\n蜜汁叉烧\n菲力牛扒\n芝士吐司\n羊肉火锅\n水果沙律\n手工豆腐\n蓝莓芝士\n玛芬蛋糕\n牛肉拌面\n汽车蛋糕\n阿根廷红虾\n手握披萨\n鸡蛋布丁\n香蕉奶昔\n重庆小面\n蟹黄豆腐\n芒果蛋糕\n肉炖粉条\n糖霜饼干\n牛肉肠粉\n煎三文鱼\n炒黑木耳\n泡椒凤爪\n酸奶冰激凌\n萝卜排骨汤\n泡菜炒饭\n杏仁蛋糕\n时蔬沙拉\n意大利粉\n鱼头豆腐汤\n南瓜浓汤\n巧克力饼干\n牛小排\n骨汤拉面\n马芬蛋糕\n野生木耳\n蔓越莓饼干\n蘑菇沙拉\n花环面包\n军舰寿司\n鸡蛋饺子\n芝麻饼干\n牛肉丸子\n炖五花肉\n豆角焖面\n莲蓉月饼\n芒果奶昔\n盆栽奶茶\n牛肉米粉\n水晶虾仁\n排骨米饭\n哈根达斯\n可可蛋糕\n三鲜水饺\n铁板豆腐\n酸奶沙拉\n炖排骨\n蜂蜜蛋糕\n笋炒腊肉\n秘制牛肉\n猪肉馅饼\n时令蔬菜\n手指饼干\n咖喱鸡肉\n黑椒牛肉\n鸡蛋水饺\n菜炒牛肉\n肉酱拌面\n肉丸子汤\n椰果奶茶\n三文鱼色拉\n夹心面包\n土司面包\n咖喱鱼蛋\n鲜虾披萨\n鱼腩刺身\n豆腐沙拉\n草莓慕斯\n芝心披萨\n肉末豆腐\n桂林米粉\n木耳炒肉\n有机时蔬\n奶酪披萨\n培根焗饭\n吐司披萨\n乳酪慕斯\n鲜肉水饺\n香辣烤鱼\n金针菇卷\n蟹粉豆腐\n芭比娃娃\n焦糖拿铁\n烤肉拼盘\n金枪鱼披萨\n抹茶慕斯\n炒酸奶\n叉烧肠粉\n鸭血粉丝\n鲜虾云吞\n香肠炒饭\n饺子\n菜烧豆腐\n炒粉丝\n芝士烤饭\n祝寿蛋糕\n番茄沙拉\n牛肉拼盘\n烧味拼盘\n火腿面包\n果木烤鸭\n手工水饺\n仙草奶茶\n鲜虾肠粉\n饭团便当\n闪电泡芙\n酱烧茄子\n土豆泥沙拉\n猪肉馄饨\n火腿沙拉\n海鲜色拉\n鸡粒炒饭\n蛋黄月饼\n肋眼牛排\n猪扒焗饭\n柠檬绿茶\n波斯顿龙虾\n巧克力酱\n奶油泡芙\n五花肉饭\n馋嘴牛蛙\n韭菜饺子\n银耳糖水\n酱蛋糕卷\n肉松寿司\n玫瑰奶茶\n火焰牛排\n核桃蛋糕\n土豆披萨\n乳酪面包\n香蕉牛奶\n蔬菜披萨\n菇炒肉片\n芒果酸奶\n肉蛋包饭\n米蒸排骨\n牛肉汤面\n金枪鱼刺身\n水果裸蛋糕\n有机鱼头\n排骨火锅\n手工巧克力\n巧克力曲奇\n五仁月饼\n鸳鸯奶茶\n蔓越莓吐司\n炖粉条\n抹茶冰激凌\n芝士薯条\n黑胡椒牛排\n肥牛米线\n空心菜梗\n点心拼盘\n炒西红柿\n水库鱼头\n榛果拿铁\n吞拿鱼沙拉\n小黄花鱼\n夏威夷披萨\n奶油布丁\n凉拌黄瓜\n全麦吐司\n巧克力布丁\n香草拿铁\n蒸土鸡蛋\n胡萝卜饺子\n肉眼牛扒\n肉丝盖饭\n糯米丸子\n猪肉丸子\n牛仔粒\n沙拉三明治\n土豆丝饼
\n咖啡蛋糕\n南瓜馒头\n切片蛋糕\n巧克力奶茶\n价钱-c\n乌龙奶茶\n麻辣豆腐\n鸡蛋炒面\n铁板炒饭\n豆腐砂锅\n虾仁馄饨\n抹茶星冰乐\n芝麻面包\n艺术蛋糕\n肉小馄饨\n牛肉粉丝\n海鲜浓汤\n榴莲芝士\n奶油吐司\n切块蛋糕\n主题蛋糕\n香辣鸡翅\n雪域蛋糕\n酸奶奶昔\n辣鸡腿堡\n萝卜炒肉\n菇炒牛柳\n苹果蛋糕\n花式面包\n经典牛排\n牛肉炒面\n火龙果汁\n海鲜拉面\n开花馒头\n大虾沙律\n土豆焖饭\n四喜丸子\n养生豆腐\n大连diy蛋糕\n鸡蛋卷饼\n鲜肉小笼\n蜜汁鸡翅\n葡萄干吐司\n番茄锅底\n甜虾刺身\n鸳鸯锅\n椒炒肉丝\n招牌炒饭\n抹茶布丁\n意大利饭\n奶酪焗饭\n奶酪沙拉\n三文鱼饭\n鱼籽寿司\n鱼子寿司\n豆花米线\n西米布丁\n豆腐羹\n肉馅包子\n肉通心粉\n肉丝炒饭\n白菜粉丝\n牛肉馅饼\n牛排披萨\n牛奶奶茶\n炖土鸡汤\n炒火腿肠\n海胆刺身\n牛肉面\n拼三文鱼\n巧克力塔\n川北凉粉\n奶酪布丁\n刀切馒头\n麦芬蛋糕\n鲜蔬沙拉\n香肠面包\n帕尼尼\n红茶拿铁\n烧烤拼盘\n法式面包\n上汤浸时蔬\n椒土豆片\n小锅米线\n奶酪吐司\n原味酸奶\n摩卡星冰乐\n麻辣米线\n雪花牛扒\n蔓越莓面包\n蓝山咖啡\n芒果沙冰\n腊肠披萨\n腊肉炒饭\n腊味炒饭\n秘制牛排\n甜品拼盘\n炒鱿鱼须\n炒豆腐皮\n澳洲牛排\n水煮鱼片\n棒棒糖蛋糕\n核桃吐司\n布朗尼蛋糕\n数码蛋糕\n手切羊肉\n忌廉蛋糕\n卤水豆腐\n丝绒蛋糕\n麻辣烤鱼\n鸭煲仔饭\n香肠意面\n酸奶吐司\n酥皮月饼\n茄子土豆\n系列蛋糕\n糯米奶茶\n提拉米苏蛋糕\n笋炒肉丝\n白菜饺子\n牛肉水饺\n波斯龙虾\n沙拉面包\n沙拉寿司\n水果酸奶\n果炒虾仁\n无骨凤爪\n培根意粉\n味增拉面\n冰激淋球\n什锦披萨\n鸡蛋吐司\n鸡肉丸子\n鲜果沙律\n雪耳糖水\n雪梨糖水\n蜜糖土司\n虾仁饺子\n虾仁水饺\n虾仁意面\n菜土豆泥\n菜丸子汤\n草莓奶昔\n肉粉丝煲\n肉松蛋糕\n经典奶茶\n红油抄手\n白菜水饺\n野生大黄鱼\n玫瑰馒头\n牛肉寿司\n热狗面包\n北极贝寿司\n木瓜牛奶\n广式月饼\n奥尔良烤翅\n奶油松饼\n培根面包\n吐司布丁\n巧克力奶昔\n薄饼披萨\n芝士色拉\n砂锅鱼头\n玉米披萨\n烧鸡翅根\n烤鸡腿堡\n柠檬汽水\n肉松面包卷\n三文鱼意面\n手工酸奶\n台塑牛排\n南瓜发糕\n儿童蛋糕\n什锦沙拉\n鲜虾色拉\n风味披萨\n金针肥牛\n酸奶雪糕\n虾仁披萨\n虎皮尖椒\n蓝莓慕斯\n葱烧海参\n菜肉馄饨\n草莓酸奶\n茄汁意面\n肉炒河粉\n发糕\n牛肉定食\n炒鸡腿肉\n海鲜炒面\n有机蔬菜\n心形蛋糕\n巧克力派\n回锅肉饭\n咖喱牛腩煲\n巧克力泡芙\n丸粉丝汤\n三鲜饺子\n鸡蛋馅饼\n鸡蛋盖饭\n鱼头火锅\n金针菇汤\n肉酱千层面\n辣鸳鸯锅\n西红柿汤\n菠菜沙拉\n芝麻烧饼\n羊肉丸子\n红豆布丁\n盒子蛋糕\n烤肉年糕\n烤海鲈鱼\n炒笨鸡蛋\n大虾色拉\n南瓜布丁\n三文鱼扒\n丁骨牛排\nrisotto\n麻酱拌面\n麻将蛋糕\n鸡肉拌饭\n鸡扒焗饭\n魔芋豆腐\n青酱意面\n金枪鱼卷\n土豆炖牛肉\n蟹肉沙拉\n蒸金针菇\n酸菜牛肉面\n芒果色拉\n红豆吐司\n红枣糖水\n糯米烧卖\n眼肉牛排\n田园披萨\n玫瑰花馒头\n牛柳意面\n法式鹅肝\n柠檬蛋糕\n金枪鱼色拉\n果酱蛋糕\n手指泡芙\n挪威三文鱼\n奶盖红茶\n卡通饼干\n伯爵奶茶\n黄金炒饭\n鸡胸沙拉\n鸡盖浇饭\n干香茶树菇\n蛋拌豆腐\n蔬菜三明治\n荞麦冷面\n芝麻吐司\n芝士饼干\n腊肠炒饭\n羊蝎子锅\n红豆冰沙\n秘制烤鱼\n辣白菜炒饭\n牛肉丸汤\n海鲜年糕\n海胆寿司\n沙律寿司\n樱桃蛋糕\n果酱面包\n果仁面包\n加拿大龙虾\n招牌咖啡\n干捞粉丝\n小笼汤包\n奶黄月饼\n奶香吐司\n土豆丸子\n南瓜面包\n八宝辣酱\n鲜肉饺子\n香菜丸子\n野生黄鱼\n蛋糕切块\n肉丸汤\n菜粉丝汤\n金菇肥牛卷\n荷叶炒饭\n茄子豆角\n桂花糯米藕\n肉肠披萨\n红烧猪蹄\n田园色拉\n玫瑰花卷\n牛五花肉\n炖老鸡汤\n炒鸡胸肉\n炒韭菜花\n法式羊排\n汁蒸凤爪\n樱桃萝卜\n棉花蛋糕\n果蛋糕卷\n杂粮面包\n巧克力球\n寿司定食\n意大利宽面\n墨西哥卷\n培根沙拉\n双色馒头\n厚切牛舌\n伯爵红茶\n什锦拼盘\n玛格丽特饼干\n麻辣龙虾\n芝麻菜沙拉\n鸡蛋沙拉\n鸡蛋汤面\n鸡肉沙律\n鸡扒意粉\n鲜虾沙律\n鱿鱼寿司\n香草奶昔\n酸菜米线\n酪冰淇淋\n蔓越莓曲奇\n虾仁沙拉\n荠菜馄饨\n芝士意粉\n腿煲仔饭\n肉豆腐汤\n肉末
炒饭\n翻糖饼干\n经典披萨\n红糖糍粑\n红枣豆浆\n粉丝扇贝\n私房豆腐\n白切鸡饭\n甜虾寿司\n现磨咖啡\n玉米面包\n牛肉砂锅\n爆浆鸡排\n炒西瓜皮\n炒佛手瓜\n澳洲肥牛\n乌冬面\n榛子蛋糕\n核桃饼干\n果园蛋糕\n极品肥牛\n三文鱼沙律\n招牌披萨\n拌海带丝\n干锅鸭头\n巧克力卷\n宫爆鸡丁\n小餐包\n咖喱大虾\n原味炸鸡\n巧克力摩卡\nLatte\nbiangbiang面\n黑麦面包\n黄油面包\n鸡蛋拌面\n酸奶芝士\n酱蒸凤爪\n调味牛排\n蜂蜜吐司\n虾仁肠粉\n蔬菜意面\n蓝莓酸奶\n蒸大闸蟹\n萝卜牛腩\n炒肉末\n荷叶蒸饭\n芝士鸡排\n芝士奶茶\n腊肠焖饭\n脆皮蛋糕\n肉金针菇\n肉炒鸡蛋\n肉沫茄子\n老式面包\n羊肉饺子\n红酒牛排\n百香果茶\n牛肉馄饨\n爆珠奶茶\n燕麦奶茶\n烤鸡披萨\n炒油豆腐\n黄河大鲤鱼\n沙朗牛扒\n芒果糯米饭\n磅蛋糕\n杏仁曲奇\n披萨双拼\n大虾披萨\n芝士焗扇贝\n土司披萨\n羔羊肉\n咖喱焗饭\n朱古力蛋糕\n全蛋蛋挞\n五谷豆浆\n丸子砂锅\n丝滑奶茶\n龙井虾仁\n黄油蛋糕\n麻酱凉面\n鸭胸沙拉\n鲅鱼水饺\n鱼盖浇饭\n鱼握寿司\n香菇饺子\n雪龙牛肉\n铁杆山药\n酸奶戚风\n酥粒面包\n土豆盖浇饭\n豆炒肉末\n西兰羊排\n鲜虾腐皮卷\n蒸黄花鱼\n青菜炒豆腐\n大馄饨\n银耳莲子糖水\n芝士切件\n肉酱披萨\n红枣蛋糕\n红枣发糕\n红枣粥\n丝鸡汤\n礼盒月饼\n田园时蔬\n猪颈肉饭\n牛肉饺子\n牛肉意粉\n牛排汉堡\n烤鸭披萨\n烤土豆片\n深井烧鹅\n冰淇淋华夫\n海鲜沙律\n水晶虾饺\n萨拉米披萨\n奥尔良鸡翅\n寿司组合\n咖喱乌冬面\n巧克力芝士\n巧克力拿铁\n什锦火锅\n鸡蛋糖水\n鳗鱼炒饭\n鱼丸米线\n莲藕汤\n酒酿小圆子\n酥皮蛋挞\n迷你汉堡\n蔓越莓蛋糕\n豆焖排骨\n豆沙汤圆\n装饰蛋糕\n蟹柳寿司\n蟹子寿司\n蜜豆吐司\n萝卜烧肉\n芝士蛋挞\n芒果雪糕\n脱骨凤爪\n肉三文治\n肉丁炒饭\n羊肉水饺\n经典咖啡\n红豆蛋糕\n糯米蛋糕\n立贝寿司\n相间肥牛\n番茄披萨\n现炸酥肉\n猪肉锅贴\n猪肉蒸饺\n特级肥牛\n牛奶冰沙\n八爪鱼寿司\n焦糖奶茶\n炖乌鸡汤\n炒玉米粒\n大鱼头\n水果布丁\n金枪鱼意面\n果酸奶杯\n水果小丸子\n木耳山药\n提子面包\n排骨蒸饭\n抹茶曲奇\n奶酪慕斯\n大片牛肉\n意大利烩饭\n土豆煎饼\n金针菇\n南瓜蛋糕\n农家豆腐\n全麦土司\n巧克力礼盒\n五花肉卷\n龙虾意面\n黄桃罐头\n酱烧饼\n麻薯面包\n骨汤锅底\n香辣猪蹄\n香蕉披萨\n铁板鱿鱼\n酱焗意粉\n辣炒花蛤\n豆腐锅仔\n豆腐炒肉\n蟹黄汤包\n蜜汁烤翅\n蛋糕奶茶\n蛋白炒饭\n蛋奶布丁\n蒜蓉面包\n菌汤锅底\n草莓奶茶\n芝士热狗\n芝士寿司\n脆皮鸡翅\n脆皮烤鸭\n脆皮乳鸽\n胡萝卜饼\n胡萝卜丁\n肉汤河粉\n羊肉烩面\n紫薯面包\n紫菜卷饭\n糖冰淇淋\n粉丝沙拉\n番茄色拉\n瓜炒虾仁\n牛腩河粉\n牛肉浓汤\n牛奶炖蛋\n烫面蒸饺\n炒河虾仁\n炒包心菜\n海鲜酒家\n海鲜大咖\n北海道吐司\n椒鱼头王\n椒多宝鱼\n果酱酸奶\n排骨焖饭\n拌豆腐皮\n拌海蜇头\n手抓羊排\n大黄花鱼\n四喜烤麸\n咖啡慕斯\n可可奶茶\n原味蛋糕\n卤味拼盘\n北京烤鸭\ntoro\n鹅肝沙拉\n鸡蛋肠粉\n鸡蛋包子\n鱼蒸肉饼\n鱼寿司卷\n香菇鸡汤\n风味烤鱼\n超级至尊\n超大鸡排\n豆腐圆子\n豆沙吐司\n蟹棒寿司\n蛋黄寿司\n虾仁滑蛋\n薄底披萨\n蒜蓉生蚝\n青菜炒虾仁\n莓裸蛋糕\n花生豆浆\n芝士拼盘\n猪肉馅馄饨\n荷叶饭\n鲜肉水煎包\n罗勒意面\n红茶奶盖\n红汤锅底\n米小丸子\n番薯糖水\n生蚝刺身\n玉米面饼\n玉米发糕\n牛肉沙律\n牛肉汤粉\n牛奶土司\n炒酸豆角\n火腿意面\n澳洲和牛\n牛油果寿司\n椰蓉吐司\n桂花糖藕\n安格斯牛排\n树根蛋糕\n果粒酸奶\n松仁玉米\n木糠布丁\n无骨鸡柳\n方形蛋糕\n三文鱼披萨\n提子蛋糕\n排骨年糕\n拌北极贝\n抹茶泡芙\n手冲咖啡\n玛德琳蛋糕\n奶酪饼干\n大肉饺子\n咖喱羊肉\n双色吐司\n冰糖燕窝\n迷迭香\n龙利鱼排\n鸡蛋软饼\n鸡肉意粉\n鲜酿酵素\n鲜虾春卷\n鲜奶布丁\n高钙羊肉\n香葱面包\n香菇焖饭\n香菇水饺\n香煎鸡扒\n香滑奶茶\n风味茄子\n韩国料理\n野生鲫鱼\n贝壳蛋糕\n豆豉烤鱼\n豆腐盖饭\n新西兰青口\n蟹籽寿司\n蜜豆面包\n蜂蜜绿茶\n蛤蜊意面\n蘑菇焗饭\n蒜香鸡翅\n蒜蓉扇贝\n鸡蛋卷\n草莓布丁\n花样馒头\n芦笋意面\n黑芝麻豆浆\n芝
心年糕\n芝士薄饼\n腓力牛排\n香脆鸡腿堡\n脆皮鸡排\n胡萝卜粥\n肥牛拌饭\n肉松饭团\n肉拌黄瓜\n老火例汤\n罐罐米线\n红糖面包\n拿破仑蛋糕\n白菜粉条\n白灼芥兰\n米线\n生菜色拉\n牛肉烧饼\n牛奶刨冰\n照片蛋糕\n焗银鳕鱼\n焗基围虾\n烤馒头片\n烤肉炒饭\n炒牛仔粒\n冰淇淋泡芙\n柠檬特饮\n板牛仔骨\n排骨米线\n吞拿鱼寿司\n扇贝寿司\n情趣蛋糕\n山药木耳\n奶香面包\n奶酥面包\n大骨头汤\n培根汉堡\n培根土豆\n咖喱鸡翅\n原味芝士\n千层肉饼\n制作过程\n养生锅底\n巧克力麦芬\n巧克力吐司\n优格冰沙\n乡村面包\n酸辣粉\n丝豆腐羹\n黑糖奶茶\n鸡肉炖饭\n鲷鱼寿司\n鱼头豆腐\n炖鱼五花肉\n风琴土豆\n风味牛排\n风味咖啡\n铁板牛排\n酸辣米线\n酸菜粉条\n软骨拉面\n虾肉馄饨\n椰蓉面包卷\n萝卜沙拉\n萝卜水饺\n菠萝披萨\n猪肉包\n菜炒木耳\n菌类拼盘\n草莓泡芙\n茄子盖饭\n雪花牛小排\n雪花牛仔粒\n脆皮猪手\n肥肠米线\n肉炒蒜苔\n肉松吐司\n罗马生菜\n绿豆糖水\n纽约芝士\n红豆刨冰\n紫薯馒头\n空心挂面\n百合糖水\n番茄面包\n田园沙律\n特色炒鸡\n牛杂火锅\n焦糖摩卡\n焖黄花鱼\n烧四季豆\n烤鸡翅根\n德普烘焙食谱\n炸花生米\n滋补锅底\n沙茶牛肉\n鸡扒饭\n水煮鲶鱼\n辣椒小皮蛋\n西红柿炒鸡蛋\n红枣小米粥\n果干面包\n杂粮煎饼\n有机菜心\n抹茶芝士\n抹茶冰沙\n慕斯切块\n慕司蛋糕\n意面沙拉\n总汇披萨\n带骨牛排\n带皮牛肉\n带子寿司\n千岛湖鱼头\n卡布奇诺咖啡\n天妇罗卷\n土豆丝汤\n咖啡冰淇淋\n味大鸡排\n台湾香肠\n巧克力热饮\n巧克力咖啡\n三鲜\nmuffin\n龙利鱼扒\n黑胡椒汁\n黄瓜沙拉\n麻酱凉皮\n麻辣小面\n鹅肝蒸蛋\n鸡排沙拉\n鸡丁炒饭\n鲷鱼刺身\n鲜虾馄饨\n鲜蔬沙律\n鲜肉汤包\n鲜肉云吞\n鱼鸳鸯锅\n骨炖酸菜\n香蕉酸奶\n戚风小蛋糕\n韭菜水饺\n锅全家福\n野生河虾\n醋海蜇头\n酸辣凤爪\n酸奶布丁\n酥皮泡芙\n辣酱拌面\n辣牛肉汤\n辣子鸡丁\n蟹肉寿司\n蜂蜜酸奶\n蛋糕系列\n蛋糕切片\n蘑菇意粉\n藜麦沙拉\n蔬菜炒香肠\n炒豆皮\n鲍菇牛仔粒\n茄汁意粉\n芦荟绿茶\n芝麻酥饼\n芝士春卷\n芒果芝士\n脆皮泡芙\n胡萝卜面\n牛肉盖浇面\n腊肉娃娃菜\n肉丝米线\n老醋花生\n美国肥牛\n红糖发糕\n红烧乳鸽\n素馅包子\n糯米圆子\n玉米骨头汤\n秘制鸡翅\n秘制花甲\n秘制肥牛\n石屏豆腐\n野生大甲鱼\n现磨豆浆\n玉米寿司\n玉子寿司\n猪肉汉堡\n猪扒意粉\n狮子头面\n牛油果酱\n燕麦酸奶\n烤牛小排\n炒大头菜\n比萨\n火腿寿司\n海鲜盖饭\n海鲜汤面\n泰式奶茶\n水牛芝士\n榴莲布丁\n炒茄子\n椒味烤鱼\n木耳鸡蛋\n木瓜雪蛤\n木瓜糖水\n无水蛋糕\n排骨汤锅\n拌核桃仁\n蛋拌乌冬面\n抹茶奶盖\n手工意面\n戚風蛋糕\n小炒肉饭\n孜然羊排\n玉子豆腐煲\n天妇罗拼盘\n奶酪意面\n大虾意面\n德国咸猪手\n咖啡冰沙\n全家福\n可可饼干\n切鲜羊肉\n全麦馒头\n巧克力戚风\n云南小瓜\n丸类组合\n上汤时蔬\n鸭胸色拉\n鸡汤米线\n鸡排汉堡\n海鲜豆腐汤\n鲜虾烧卖\n海鲜烩意粉\n鱼蒸豆腐\n飞鱼籽沙拉\n一锅出\n鱿鱼须\n香草雪糕\n香草蛋糕\n香煎牛排\n香烤鸡腿\n风味牛肉\n韩国泡菜\n肉酱烩意粉\n塔塔酱\n造型饼干\n蔓越莓司康\n捷赛私房菜\n豆骨头汤\n黄豆炖猪蹄\n西葫芦丝\n蔬菜组合\n蒸龙利鱼\n萝卜煎饼\n菠菜色拉\n菇炒青菜\n芹菜饺子\n芦笋沙拉\n芝士松饼\n芝士扇贝\n芒果沙律\n牛肉粒披萨\n肉焖土豆\n肉烧土豆\n肉炒青椒\n炒芹菜\n炒芥兰\n肉炒白菜\n牛肉土豆粉\n義大利麵\n缤纷蛋糕\n红糖吐司\n糖醋丸子\n玉米炖排骨\n石鍋拌飯\n百叶豆腐\n番茄意粉\n小木耳\n玫瑰花蛋糕\n猪肉煎饺\n牛肉锅仔\n牛肉清汤\n牛奶小方\n骨扒\n星冰乐\n照烧鸡排\n鸡胸肉\n焗鸡扒饭\n焗多宝鱼\n烤鸡胸肉\n烤肠拼盘\n辣炒小公鸡\n冰火菠萝油\n火腿色拉\n火腿煎饼\n滑蛋虾仁\n清炒虾仁\n海鲜锅底\n奶油冰淇淋\n辣椒炒花甲\n椒五花肉\n核桃磅蛋糕\n桂花山药\n核桃沙拉\n柠檬可乐\n柚子沙律\n摩卡蛋糕\n提拉米蘇\n排骨砂锅\n排骨土豆\n招牌拉面\n手打鱼丸\n战斧牛排\n慕丝蛋糕\n干菜扣肉\n手工冰淇淋\n宝宝蛋羔\n大肉水饺\n大块牛肉\n芝士焗龙虾\n培根寿司\n圣诞蛋糕\n土豆盖饭\n咖啡布丁\n腊味糯米饭\n参乌鸡汤\n萝卜粉丝汤\n萝卜炖排骨
\n冰糖雪梨\n巧克力冰沙\n五花肉片\n主厨沙拉\n串烧拼盘\n三文鱼面\n黑椒意面\n鸭血豆腐\n鸡蛋蒸饺\n鸡蛋盒子\n鱼片拼盘\n鱼一锅鲜\n骨炖土豆\n香蕉麦芬\n香蕉马芬\n香蕉吐司\n香菇馄饨\n香菇豆腐\n香芋奶茶\n香肠焗饭\n香肠焖饭\n风干牛肉\n波士顿大龙虾\n油面筋塞肉\n醋汁沙拉\n酸奶盆栽\n酱爆鸡丁\n豆角土豆\n豆腐鱼汤\n豆腐色拉\n豆炒虾仁\n豆渣馒头\n炒肉\n西米奶茶\n蛋黄饼干\n鲜虾净云吞\n蓑衣黄瓜\n蒜蓉西兰花\n萝卜馅饼\n萝卜泡菜\n炒毛豆\n菜拌豆腐\n菌菇披萨\n菌菇意面\n菇蛋花汤\n莲子豆浆\n草莓冰沙\n苹果豆浆\n芹菜水饺\n花甲米线\n花生饼干\n花生冰沙\n花型面包\n芥末虾球\n芝麻蛋糕\n芝麻花卷\n芝士布丁\n肉西兰花\n牛肉蔬菜汤\n肉炖白菜\n肉炒豆角\n炒菜花\n肉桂面包\n虾仁\n红豆糖水\n红烧带鱼\n糯米烧麦\n瑶柱炒饭\n玫瑰花面包\n西班牙香肠\n玫瑰面包\n牛奶馒头\n牛奶燕窝\n爆牛仔粒\n燕麦面包\n焦糖慕斯\n鳗鱼饭\n烤牛肋骨\n炒鱿鱼干\n炒木耳菜\n深海黄鱼\n深海大虾\n浓汤拉面\n洋芋擦擦\n牛油果手卷\n榴莲慕斯\n椰蓉饼干\n椒炒牛柳\n果仁蛋糕\n木糠蛋糕\n木糠布甸\n时令水果\n攸县香干\n排骨套饭\n拉面定食\n抹茶奶茶\n手工披萨\n手切牛肉\n意式咖啡\n夹心小蛋糕\n年糕拉面\n布丁面包\n客家豆腐\n奶酪司康\n奶油曲奇\n培根吐司\n坚果沙拉\n地狱拉面\n咖喱鸡块\n咖喱鱼丸\n咖啡拿铁\n南瓜沙拉\n动物饼干\n冰滴咖啡\n三鲜米线\n黑椒猪排\n黄瓜咸菜\n麻辣香肠\n鸡肉火锅\n鸡汤馄饨\n鸡排便当\n鸡丁披萨\n鲜奶小方\n鲑鱼沙拉\n生鱼片\n一夜干\n香酥鸡排\n香辣锅底\n香草面包\n香草布丁\n香煎鹅肝\n风情蛋糕\n面包披萨\n青瓜沙拉\n青柠苏打\n酸甜排骨\n酸汤鲈鱼\n酸汤锅底\n酸奶马芬\n酸奶冰棒\n酱酸辣粉\n果酱冰淇淋\n豆腐火锅\n黄豆猪蹄汤\n西洋菜汤\n血糯米粥\n蛋糕切件\n蛋乌冬面\n蚝油生菜\n虫草花汤\n虎皮凤爪\n蒜泥茄子\n萝卜蒸饺\n菠菜粉丝\n菠菜意面\n菌菇浓汤\n菇扒菜胆\n蒸荔浦芋头\n苹果面包\n芝麻月饼\n芝心比萨\n腰内猪排\n肉馅蒸饺\n肉炒香干\n肉炒韭菜\n肉沫豆腐\n银耳雪梨羹\n木耳红枣汤\n美味蛋糕\n红豆沙冰\n红糖蛋糕\n素馅饺子\n粉条包子\n蟹粉小笼包\n粉丝虾煲\n筒子骨汤\n笋烧牛腩\n生炒菜心\n牛腩火锅\n牛腩汤面\n牛肉薄片\n牛肉蒸饺\n牛肉煎饼\n牛奶沙冰\n牛奶冰棒\n牛三宝面\n爆炒腰花\n烧大黄鱼\n海鲜蒸蛋\n海鲜米线\n海鲜炖饭\n泡菜拉面\n牛油果奶昔\n沙拉军舰\n沙律军舰\n水煮活鱼\n椰奶布丁\n椒黑木耳\n椒盐花卷\n核桃豆浆\n枸杞鸡汤\n果酱饼干\n木瓜色拉\n曲奇泡芙\n时蔬色拉\n手卷寿司\n意大利麵\n干蒸烧卖\n巧克力挞\n班尼迪克蛋\n小麦啤酒\n大脸鸡排\n培根拌饭\n坚果蛋糕\n土豆烧鸡\n土豆炖肉\n咖喱炒蟹\n娃娃菜\n叶炒鸡蛋\n双拼蛋糕\n卡布其诺\n南瓜吐司\n刺身组合\n切片面包\n冰镇奶茶\n冰巧克力\n儿童牛排\n伍仁月饼\n什锦海鲜\n乳酪布丁\n乌龙奶盖\n上脑肥牛\n三鲜锅底\n黑豆豆浆\n黄豆豆浆\n麻辣牛蛙\n麻辣兔头\n麻花面包\n鸡蛋薄饼\n鸡腿排饭\n鸡肉宽面\n鸡肉卷饼\n鸡炖榛蘑\n鸡排焗饭\n海鲜全家福\n鲍菇牛柳\n鱼炖茄子\n鱼炖排骨\n骨头火锅\n香辣龙虾\n香辣豆腐\n多宝鱼\n风味沙拉\n风光披萨\n铁锅焖面\n酸辣白菜\n酸奶冰沙\n酱油炸鸡\n部队年糕\n软欧面包\n蔓越莓月饼\n蔓越莓土司\n越南春卷\n豌豆凉粉\n红豆小餐包\n调味炸鸡\n蝴蝶意面\n蛋牛肉粥\n虾握寿司\n虾仁蒸饺\n虾仁烧卖\n虾仁春卷\n蘑菇汉堡\n蔬菜拌饭\n蔬菜拉面\n蒸鸡蛋羹\n蒜香鸡排\n菠萝咕咾肉\n萝卜炒饭\n萝卜包子\n葡萄干蛋糕\n菠萝面包\n鲜肉包\n菜馅水饺\n蔬菜猪肝汤\n猪肉饺\n菜炒粉皮\n菜土豆汤\n菌菇焗饭\n金针菇牛肉卷\n菇炖鸡汤\n莓炒酸奶\n草莓果酱\n草莓松饼\n芹菜炒肉\n青花鱼定食\n芝士虾球\n芝士匹萨\n芒果绿茶\n芋圆奶茶\n胡萝卜片\n黑胡椒牛扒\n肉饺子馅\n肉酱薯条\n肉牛肉面\n羊肉大串\n羊后腿肉\n绿色毛肚\n红枣鸡汤\n糯米蒸鸡\n小米烩辽参\n芦笋炒虾仁\n生菜沙律\n瓜肉丸汤\n理石芝士\n珍珠圆子\n玉米炒饭\n猪排焗饭\n牛扒\n牛腩锅仔\n牛肉烧卖\n牛肉小面\n牛排意面\n牛奶慕斯\n爆浆猪排\n熊猫饼干\n焗三文鱼\n烤鸡沙拉\n烤鱿鱼须\n烤土豆泥\n烟
肉意粉\n炒糯米饭\n炒柴鸡蛋\n火腿月饼\n泡泡浴蛋糕\n法棍面包\n汁烩意粉\n水果吐司\n水果冰沙\n尖椒四季豆\n梅花猪扒\n安格斯牛柳\n柚子沙拉\n杏仁面包\n粉丝汤\n时蔬披萨\n排骨煨汤\n排骨焖面\n吞拿鱼沙律\n拌黄瓜丝\n拉面年糕\n抹茶饼干\n手撕鸡饭\n手工蛋糕\n慕斯奶茶\n意式奶冻\n干菜烧肉\n帝王蟹脚\n巧克力豆\n巧克力棒\n巧克力杯\n帕尔玛火腿\n小熊蛋糕\n奶油焗饭\n多利亚饭\n芝士焗红薯\n坚果面包\n土豆鸡块\n咖啡奶茶\n吉列猪扒\n可可戚风\n双色饼干\n双色豆腐\n参炖鸡汤\n去骨鸭掌\n卷筒披萨\n歌剧院蛋糕\n乌冬面定食\n关东辽参\n巧克力玛芬\n克力松露\n义大利面\n牛肉丸汤河粉\n个性蛋糕\n丝炒粉条\n三文鱼籽\n黄瓜炒肉\n麻辣鸡翅\n鸡蛋面条\n鸡蛋肉饼\n鸡丝沙拉\n鸡丁米线\n鳄梨沙拉\n鲜虾寿司\n鲜花蛋糕\n海鲜娃娃菜\n鱼汁意面\n高汤锅底\n排骨盖浇饭\n五香豆腐干\n香蕉班戟\n香草鸡扒\n香草泡芙\n香干肉丝\n香干炒肉\n奶香小餐包\n青椒炒肉\n霜降牛肉\n雪花和牛\n锡兰红茶\n金枪鱼酱\n酸菜鱼锅\n贺寿蛋糕\n豉汁排骨\n紅豆薏米粥\n炒香肠\n土豆丝卷饼\n滋补鸳鸯锅\n蟹粉汤包\n蟹籽炒饭\n螺旋意面\n蜜豆土司\n牛蛙煲仔饭\n蛋炒韭菜\n天使面\n虾仁焗饭\n虾仁炒面\n蘑菇拼盘\n蓝莓奶昔\n椰蓉小餐包\n葡萄面包\n菠萝古老肉\n萝卜焖饭\n葡萄干玛芬\n菠萝虾球\n豆腐锅\n肉水饺\n菜煎豆腐\n炒腐竹\n梅菜扣肉饭\n蘑菇牛肉饭\n菇炖排骨\n蘑菇奶油汤\n荞麦凉面\n苏式月饼\n豆芽炒粉条\n花边披萨\n花形面包\n芝麻蛋卷\n芝士馅饼\n芝士饭团\n芒果肠粉\n奥尔良烤鸡翅\n腐皮寿司\n脆皮鸡饭\n鸡胸三明治\n肥肠酸辣粉\n肉馅锅贴\n肉酱热狗\n肉豆腐锅\n肉蒸水蛋\n鸡肉米纸卷\n肉炒粉丝\n汉堡包\n肉末蒸蛋\n肉末粉丝\n肉丝汤面\n银耳红枣羹\n翻转蛋糕\n缤纷披萨\n综合刺身\n红薯丸子\n红烧鸡块\n红油锅底\n紫薯月饼\n糟溜鱼片\n蛋糕冰淇淋\n竹笋烧牛肉\n秘制烤翅\n碳烤生蚝\n盐焗鸡翅\n白菜包子\n玉米馒头\n玉米煎饼\n猪肉火锅\n猪肉拌饭\n特级牛排\n牛骨汤底\n牛腩米粉\n牛肉汤锅\n牛奶咖啡\n牙签牛肉\n爆海螺片\n煎五花肉\n焦糖咖啡\n烧鸡腿堡\n肉炖白萝卜\n炒馒头丁\n炒莴笋叶\n炒水晶粉\n炒三层肉\n冰淇淋吐司\n海鲜米粉\n浓缩咖啡\n流心蛋糕\n泡椒田鸡\n烤鸡腿\n水晶粉丝\n柠檬蜂蜜水\n榨菜肉丝\n椒爆鸡胗\n炒鸡丁\n红枣枸杞汤\n果木牛扒\n松露蛋糕\n杂菜沙律\n木耳饺子\n时光蛋糕\n无骨鸡爪\n无骨鸡块\n三文鱼盖饭\n排骨拉面\n招牌沙拉\n招牌寿司\n拌金枪鱼\n拌海蜇皮\n拌明太鱼\n手打肉丸\n冰咖啡\n水库大鱼头\n带子刺身\n奶香土司\n奶酪色拉\n奶酥饼干\n奶茶大杯\n奶油芝士\n奶油慕斯\n奶油小方\n酸奶水果杯\n夹心年糕\n芝士帕尼尼\n坚果酸奶\n咖喱虾仁\n和风沙拉\n和牛寿司\n可可蛋糕卷\n叉烧焗饭\n叉烧排骨\n卤鹌鹑蛋\n卡仕达酱\n萝卜骨头汤\n萝卜蛋炒饭\n南瓜汤圆\n千层班戟\n加州鲈鱼\n刺身定食\n五香熏鱼\n乳酪披萨\n一品肥牛\n奶酪蛋糕\n黄油吐司\n麻酱小料\n麦香奶茶\n鸭胸寿司\n鸡蛋色拉\n鸡肉馅饼\n鸡块盖饭\n鸡丝米线\n鳗鱼盖饭\n鲜榨橙汁\n海鲜大拼盘\n鱼炖粉条\n鱼汤米线\n带骨小牛排\n香肠沙拉\n香小饼干\n风味鸡翅\n风味意面\n韩式拌饭\n雪花刨冰\n金枪鱼饭\n金枪鱼腩\n野生鲈鱼\n醪糟汤圆\n酿油豆腐\n酸菜锅底\n酸菜粉丝\n酸菜炒饭\n酸汤水饺\n酸奶棒冰\n辣炒蛤蜊\n超值牛排\n豆角肉沫\n豆角焖饭\n豆芽粉条\n豆皮寿司\n豆类牛奶冰\n豆浆布丁\n豆丝炒肉\n葫芦饼\n西点蛋糕\n西施豆腐\n蛤蜊浓汤\n蛋黄豆腐\n蛋糕内部\n蓝莓果酱\n蒜香茄子\n蒜蓉大虾\n菠萝海鲜饭\n胡萝卜面包\n葡萄干司康\n菠萝色拉\n鱼火锅\n菜猪肉丸\n菊花茄子\n菇土鸡汤\n草莓芝士\n茶玛奇朵\n花鲢鱼头\n花卷馒头\n芝士猪排\n芝士牛排\n芝士焗面\n芝士大虾\n芝士土豆\n芝士土司\n芝士切块\n芒果寿司\n豆腐鲫鱼汤\n豆腐五花肉\n豆腐丸子汤\n脆皮烧鸡\n肉酱米线\n腊肉糯米饭\n牛肉粉/面\n炒豆干\n牛肉炒乌冬\n肉丸披萨\n肉丝米粉\n肉丝炒粉\n耶加雪菲\n绿茶拿铁\n纸上烤鱼\n特级牛小排\n红酒鹅肝\n红薯糖水\n红薯粉丝\n红烧鸡翅\n红烧鱼块\n紫薯吐司\n紫菜蛋汤\n糯米糍粑\n糯米团子\n粉丝蒸虾\n米饭布丁\n米酒汤圆\n
笋小炒肉\n石锅炒饭\n相扑火锅\n盐焗鸡饭\n虾皮鸡蛋饼\n脆皮大泡芙\n白菜蒸饺\n白汁意面\n白汁意粉\n花生酱饼干\n花生牛扎糖\n现酿酸奶\n玫瑰拿铁\n玉米排骨\n猫耳朵面\n猪软骨饭\n特色豆腐\n特色咖啡\n牛腩米线\n牛肉焖饭\n牛肉干锅\n牛肉咖喱\n牛排沙拉\n牛奶雪糕\n肥牛乌冬面\n燕麦曲奇\n熟豆豆浆\n焦糖苹果\n烩牛肉饭\n烧油豆腐\n烧大明虾\n锅烧乌冬面\n烤金枪鱼\n烤羊蝎子\n烤多宝鱼\n炒黄花菜\n炒黄瓜片\n炒鸡软骨\n炒红薯叶\n深海鱼头\n冰淇淋咖啡\n海鲜什锦\n海皇炒饭\n泡椒烤鱼\n油酥烧饼\n豉油皇炒面\n沙律虾球\n水牛毛肚\n水果慕斯\n榴莲泡芙\n榄菜炒饭\n椰蓉月饼\n蒸凤爪\n辣椒猪颈肉\n辣椒拌茄子\n青椒拌皮蛋\n辣椒嫩牛肉\n樱桃小萝卜\n培根芦笋卷\n金枪鱼面包\n果冻布丁\n木耳肉丝\n时蔬炒饭\n无骨鸭掌\n三文鱼骨腩\n排骨锅底\n招牌牛排\n吉拉多生蚝\n私房牛肉面\n年糕吐司\n干炒芹菜\n安康鱼肝\n嫩肩牛排\n奶油炸糕\n奶类冻芝士\n大虾寿司\n芝士焗意粉\n培根煎饼\n场景蛋糕\n土豆豆角\n土豆茄子\n土豆炖鸡\n土豆泥饼\n四川泡菜\n咖喱拉面\n咖喱意面\n台山菜花\n可可面包\n双色鱼头\n双拼比萨\n去骨凤爪\n厚片吐司\n萝卜鲫鱼汤\n萝卜馅饺子\n萝卜素丸子\n胡萝卜玉米汤\n南瓜百合\n单品咖啡\n半筋半肉\n千层毛肚\n巧克力华夫饼\n凉菜拼盘\n巧克力马芬\n巧克力月饼\n丸子组合\n丸子米线\n丸子火锅\n中华海草\n上汤菠菜\nTORO\n黑黄豆浆\n黄鱼春卷\n麻辣鸭头\n麻辣鸡块\n鹰嘴豆泥\n土鸡鸳鸯锅\n鸡蛋披萨\n鸡腿盖饭\n鸡翅拼盘\n鸡排意面\n三鲜馅饺子\n海鲜炒年糕\n鲜果松饼\n鲜果披萨\n鲑鱼色拉\n鲍鱼刺身\n鱿鱼盖饭\n鱼味牛油果\n鱼片火锅\n加州卷\n鸡腿堡\n香酥饼干\n香辣猪手\n香蕉面包\n香草慕斯\n香烤排骨\n电饭煲蛋糕\n猪颈肉炒饭\n韩式炸鸡\n青瓜沙律\n长棍面包\n金钻奶茶\n野生大虾\n里拉芝士\n酸菜黑鱼\n酸菜饺子\n酸菜肥牛\n酱椒鱼头\n酱三文鱼\n奶酪小餐包\n奶酪小蛋糕\n红酒烩牛肉\n番茄酱\n辫子吐司\n辣烤鸡翅\n辣椒皮蛋\n辣味烤鱼\n蹄花汤锅\n豹皮豆腐\n豆花牛肉\n豆腐肉饼\n豆腐粉条\n豆渣煎饼\n豆渣丸子\n土豆泥色拉\n豆沙餐包\n西红柿汁\n西冷牛肉\n蟹黄豆花\n蟹子炒饭\n羊蝎子火锅\n羊蝎子小锅\n蛋白饼干\n虾酱炒饭\n虾仁锅贴\n虾仁色拉\n藕炖排骨\n薏米糖水\n蔬菜焗饭\n蔬菜咖喱\n蓝莓乳酪\n椰蓉小面包\n蒸扇贝王\n蒜香龙虾\n葱拌豆腐\n萝卜酿肉\n葡萄干发糕\n菠萝油条\n鲜肉饺\n拌豆干\n菌鸳鸯锅\n菌炒牛柳\n菌类排骨汤\n菊花豆腐\n乌鸡汤\n荠菜饺子\n茶树菇汤\n苹果果酱\n花朵饼干\n花园蛋糕\n芝麻餐包\n火腿肠面包\n腰果虾仁\n豆腐鸡蛋羹\n脆皮猪肘\n胶酿油条\n胡萝卜泥\n肥牛盖饭\n肉饼蒸蛋\n牛肉蘑菇饭\n肉松炒饭\n鸡肉拌米粉\n牛肉土豆饭\n肉丸意面\n肉丝炒饼\n耳雪梨汤\n木耳炒腐竹\n考伯色拉\n羊排火锅\n红豆牛奶\n西红柿炒蛋\n红枣南瓜\n紫薯蛋糕\n紫薯发糕\n焦糖玛琪朵\n红薯粥\n秘制鲈鱼\n秘制猪手\n盱眙龙虾\n酱皇蒸凤爪\n皇后披萨\n小点心\n辣白菜火锅\n番茄牛肉\n番茄沙司\n甜虾沙拉\n紫甘蓝沙拉\n炒百合\n南瓜奶油汤\n琵琶豆腐\n大理石蛋糕\n西班牙火腿\n玫瑰慕斯\n玉米饺子\n猴子面包\n猪肉香肠\n猪肉拼盘\n特大鸡排\n牛肝菌汤\n牛肉土豆\n牛肉便当\n牛粉丝煲\n牛柳披萨\n牛排色拉\n爆鱿鱼须\n燕麦吐司\n烧汁茄子\n炸酱拌面\n炸冰淇淋\n炒鱿鱼花\n炒海螺片\n炒海带丝\n炒山药片\n火腿焗饭\n火腿焖饭\n冰激凌松饼\n混合沙拉\n深海鱼排\n海鲜粉丝\n海鲜清汤\n海鲜总汇\n洋葱炒肉\n洋芋焖饭\n泡菜肥牛\n泡菜拌饭\n法式吐司\n油面包卷\n油炒时蔬\n沙拉手卷\n豆沙小餐包\n樱桃果酱\n樱桃慕斯\n椰子雪糕\n辣椒炒豆干\n梅子绿茶\n金桔柠檬茶\n柠檬慕斯\n柚子绿茶\n枸杞燕窝\n果馅蛋糕\n果肉酸奶\n果肉果汁\n水果小圆子\n果味酸奶\n板腱牛排\n杨枝金露\n肉末炒豆角\n早餐面包\n无骨鸡排\n方块饼干\n新鲜蔬菜\n斯牛小排\n挤挤面包\n提拉米苏杯\n手工鱼丸\n手工凉皮\n意式烩饭\n彩色汤圆\n带骨肉眼\n巧克力饼\n山药鸡汤\n双层鸡腿堡\n小牛菲力\n莲子百合汤\n茄子煲仔饭\n牛奶鸡蛋羹\n奶香饼干\n奶酪拼盘\n奶酪年糕\n奶酪土司\n奶类蛋糕卷\n奶茶系列\n酸奶紫米露\n奶
油意粉\n奶油司康\n焗意面\n比基尼蛋糕\n培根匹萨\n圆形蛋糕\n咖喱鸡扒饭\n咖啡星冰乐\n烤鲶鱼\n可可曲奇\n反卷寿司\n双色花卷\n参炖乌鸡\n原味奶酪\n卷边披萨\n印尼炒饭\n萝卜丝丸子\n南美大虾\n刺猬馒头\n奥利奥奶茶\n伊比利亚火腿\n巧克力雪糕\n巧克力牛奶\n巧克力火锅\n巧克力冰泥\n五谷丰登\n乌骨鸡汤\n粉丝蒸带子\n上脑牛排\n三鲜锅贴\n三鲜砂锅\n三文鱼片\nlaksa\n黑毛猪肉\n黄焖排骨\n黄桥烧饼\n麦穗面包\n鸭胸沙律\n鸡蛋土司\n鸡肉串烧\n鸡汤锅底\n鸡丝色拉\n鲜虾焗饭\n鲜肉煎饺\n鲜肉汤圆\n鱿鱼炒饭\n鱿鱼沙拉\n鱿鱼年糕\n鱼汤锅底\n三文鱼土豆饼\n骨炖山药\n马蹄糖水\n香辣肉丝\n香菇炖鸡\n香菇包子\n香草烤鸡\n香草奶冻\n牛肉堡\n香煎鲈鱼\n香浓咖啡\n百香果慕斯\n小排骨\n香味烤鱼\n东坡肉\n馅儿包子\n饼干礼盒\n风情烤翅\n韭菜馅饼\n韩式蛋糕\n雪花牛柳\n雪人蛋糕\n铁板牛柳\n金汤肥牛\n野生鲶鱼\n野生杂鱼\n里脊猪排\n酸辣粉丝\n酸奶土司\n酱爆鱿鱼\n酱烤鱿鱼\n酥皮面包\n辣味炒饭\n转化糖浆\n豉汁凤爪\n豆腐蒸蛋\n豆腐粉丝\n豆腐年糕\n豆腐包子\n黄豆煲排骨\n豆炒茄子\n谷物面包\n诱惑蛋糕\n豆角炒肉丝\n西葫芦片\n蟹肉炒饭\n螺丝意面\n蝴蝶馒头\n蜜豆蛋糕\n鸡蛋馅饺子\n鸡蛋豆腐羹\n蛋糕礼盒\n蛋糕盒子\n鸡蛋炒木耳\n大虾蛋包饭\n虾卷寿司\n虾仁蒸蛋\n虾仁生煎\n虎皮鸡蛋\n蘑菇色拉\n蘑菇炒肉\n薄脆披萨\n香蕉磅蛋糕\n蔬菜面包\n葱烧豆腐\n营养米糊\n葡萄干马芬\n葡萄干饼干\n葡萄干土司\n包子\n炒香菇\n炒汤圆\n青菜年糕汤\n菌菇锅底\n菌菇米线\n老鸡汤\n扒时蔬\n蔓越莓磅蛋糕\n松茸蘑菇汤\n番茄酱意面\n番茄海鲜汤\n茄子拌面\n苹果奶昔\n英式奶茶\n五花肉炒饭\n花甲粉丝\n花生酥饼\n花生汤圆\n芝麻布丁\n芝士生蚝\n芝士烤饼\n芝士拌饭\n芝士批萨\n芒果奶茶\n藍色夏威夷\n火腿肠炒饭\n火腿炒鸡蛋\n豆腐白菜汤\n豆腐炖白菜\n脆脆蛋糕\n胚芽奶茶\n肥牛锅仔\n肉铁板烧\n芝士卷\n肉紫菜卷\n肉碎炒饭\n砂锅饭\n肉炒青瓜\n炒粉条\n牛肉炒意粉\n炒土豆\n肉松餐包\n肉松月饼\n肉末拌面\n肉之近照\n老鸭粉丝\n老汤豆腐\n翡翠豆腐\n罗汉果茶\n经典沙拉\n经典汉堡\n红豆酸奶\n红豆凉粉\n红薯粉条\n红糖年糕\n红烩牛肉\n红枣奶茶\n紫薯汤圆\n糯米饭团\n糖水罐头\n玉米炒虾仁\n笋炒鸡蛋\n支竹猪肚煲\n立体蛋糕\n秘制牛扒\n盆栽酸奶\n辣椒酱\n百合南瓜\n白菜炒面\n白肉血肠\n白灼生菜\n番茄焗饭\n甜筒披萨\n甘梅地瓜\n瓜类粉丝汤\n琥珀核桃\n千层班戟蛋糕\n玫瑰红茶\n玉米面条\n玉米脆片\n猪血丸子\n猪肉沙拉\n狗肉火锅\n牛腩锅底\n牛肉石锅\n牛肉煲仔\n牛牛小排\n牛油红锅\n爱心蛋糕\n燕麦豆浆\n燕麦牛奶\n煎酿三宝\n煎多宝鱼\n烧鱿鱼须\n烧小土豆\n烧娃娃菜\n烧大白菜\n烤黄花鱼\n烤鸡肉饭\n烤鸡比萨\n香烤小羊排\n炼奶吐司\n炸鱼薯条\n雪里红\n炒血豆腐\n炒花椰菜\n炒猪肠粉\n炒猪大肠\n炒牛肉末\n炒手捏菜\n炒卤牛肉\n火腿芝士\n火腿炒面\n液体酸奶\n海鲜叻沙\n海苔饭团\n海胆蒸蛋\n奶油磅蛋糕\n油泼辣子\n豆沙小面包\n水果缤纷\n水果华夫\n歌剧蛋糕\n青椒松花蛋\n梅花猪排\n桂花年糕\n桂圆面包\n安格斯牛扒\n格子松饼\n核桃麦芬\n柠檬芝士\n红枣乌鸡汤\n果粒奶茶\n果干吐司\n果味奶茶\n果仁酸奶\n枸杞银耳汤\n木瓜奶昔\n有機花菜\n星鳗寿司\n无锡小笼\n三文魚壽司\n排骨焗饭\n排骨便当\n挂炉烤鸭\n招牌色拉\n拌黑腐竹\n拌野木耳\n拌荞麦面\n拌海螺片\n拌水萝卜\n手绘蛋糕\n手撕面筋\n手撕吐司\n手工香肠\n手工饺子\n扁豆焖面\n慕斯泡芙\n彩虹沙拉\n广东芥兰\n希腊酸奶\n小熊饼干\n九宫格锅底\n宫保虾球\n定制蛋糕\n奶香曲奇\n奶油戚风\n酸奶华夫饼\n玛奇朵咖啡\n天鹅泡芙\n天使意面\n壽司拼盤\n芝士小蛋糕\n墨西哥饼\n基围虾粥\n鸡块盖浇饭\n四季豆饭\n咸肉炒饭\n咖喱鸡腿\n咖喱锅底\n咖喱乌冬\n和牛牛排\n鱼头王\n烤鱿鱼\n烤全鱼\n风味炸鸡翅\n百合炒虾球\n可可慕斯\n原味豆浆\n原味蛋挞\n原味牛排\n萝卜炖羊排\n北极甜虾\n包心鱼丸\n巧克力棒棒糖\n利梭多饭\n创意寿司\n凉拌豆腐\n凉拌米线\n冬荫功汤\n内酯豆腐\n养生火锅\n全麦饼干\n全麦贝果\n全麦欧包\n什锦豆腐\n乳酪月饼\n拌黄瓜
\n粉丝丸子汤\n东海带鱼\n上海菜饭\n上校鸡块\n三文鱼柳\n三丝春卷\n万州烤鱼\nCrepe\n柠檬酸奶\nT骨牛扒\n金枪鱼\n黑糖咖啡\n黑椒鸡翅\n黄金馒头\n黄河刀鱼\n芝麻菜披萨\n鸭肉沙拉\n鸡蛋锅贴\n鸡蛋炒饼\n鸡蛋摊饼\n鸡蛋捞面\n鸡腿焖饭\n鸡腿便当\n鸡肉焖饭\n鸡炖粉条\n鸡扒定食\n鲜虾豆腐\n海鲜炒意粉\n鲜果虾球\n鲍汁凤爪\n鱿鱼披萨\n鱼烧年糕\n鱼子沙拉\n鱼头砂锅\n山药汤\n香酥带鱼\n香辣鸡排\n香辣牛排\n香辣烤鸡\n香辣意面\n香辣大虾\n香菇面筋\n香菇肉片\n香菇肉包\n香菇炒饭\n香草奶茶\n香芒布丁\n香脆豆腐\n香煎带鱼\n香浓奶茶\n香橙蛋糕\n水煎包\n风味奶茶\n青椒茄子\n青岩豆腐\n雪顶咖啡\n雪梨燕窝\n门钉肉饼\n锦绣拼盘\n石锅拌饭2\n银耳甜汤\n铁板鸡排\n野生虾仁\n煎酿杏鲍菇\n酱鸡蛋饼\n酱蒸鱼头\n酱炒豆腐\n酱炒意面\n酱娃娃菜\n酥脆饼干\n酒酿汤圆\n烤蔬菜\n牛肉酱\n辣子炒肉\n辣味披萨\n辣椒八爪鱼\n刺身大拼盘\n起酥面包\n贡丸米线\n豚肉拉面\n豆豉酱拌面\n豆豉排骨\n豆腐虾仁\n豆腐水饺\n四角豆炒腊味\n谷饲牛排\n西芹木耳\n墨西哥面包\n羊扒\n蟹蒸粉丝\n蟹脚寿司\n蜂蜜芥末酱\n蜂蜜土司\n蛋糕吐司\n蛋浸时蔬\n虾寿司卷\n虾兵蟹将\n虾仁煎饺\n虎皮豆腐\n蘑菇肉片\n蘑菇土豆\n薄荷汽水\n薄荷拿铁\n蔬菜蛋饼\n蔬果沙律\n蓝莓马芬\n蒸石斑鱼\n葱爆牛柳\n葡萄吐司\n阿萨姆奶茶\n萝卜鸡汤\n萝卜肉丸\n萝卜肉丝\n萝卜牛肉\n黑木耳\n鲫鱼汤\n豆腐皮\n蒸茄子\n菜炒猪肚\n炒千张\n油豆腐\n菜土豆条\n土鸡汤\n蒸滑鸡\n炖小鸡\n蘑菇炒油菜\n鲜菇丸子汤\n莲藕丸子\n得莫利炖鱼\n草莓冻芝士\n香草星冰乐\n香草三文鱼\n茉莉奶茶\n茄子沙拉\n花生牛奶\n芝麻沙拉\n芝麻桃酥\n芝麻小饼\n芝士绿茶\n芝士烩饭\n芝士曲奇\n芝士奶盖\n芝士乳酪\n豆腐小炒肉\n脆皮鸡腿\n脆皮鸡扒\n脆皮叉烧\n肥肠米粉\n肥肠小面\n肥牛鹅肠\n肥牛炒饭\n肠仔披萨\n肉饼汉堡\n肉酿豆腐\n肉类芦笋卷\n肉碎豆腐\n肉砂锅粥\n炒西芹\n炒笋干\n肉炒尖椒\n肉沫炒饭\n肉松排包\n肉松披萨\n肉末豆角\n大酱汤\n肉土豆汤\n肉丸焗饭\n肉丝卷饼\n银耳百合羹\n羊肉烧麦\n羊肉炒饭\n羊肉拼盘\n绿豆凉粉\n经典牛扒\n顶级羔羊肉\n起泡酒\n红豆豆浆\n红豆汤圆\n红豆拿铁\n红豆年糕\n红豆土司\n红糖姜茶\n红米炒饭\n红烧肉面\n红油水饺\n西红柿牛腩\n红枣面包\n红枣牛奶\n紫薯沙拉\n紫菜饭团\n糯米鸡翅\n糯米粽子\n糖炒板栗\n粗粮面包\n粉丝蟹煲\n玉米炒肉丁\n芦笋培根卷\n神户牛肉\n礼盒蛋糕\n石烤鱿鱼\n石板豆腐\n皇冠面包\n鸡蛋羹\n百合甜汤\n番茄沙律\n野生小鲫鱼\n甜心蛋糕\n甜甜圈蛋糕\n玫瑰饼干\n玉米蛋糕\n猪肚鸡煲\n猪肉蒸包\n猪肉色拉\n猪肉肠粉\n猪肉大包\n特色凉面\n特浓咖啡\n牛腩煲仔\n牛腩焗饭\n牛肚火锅\n牛肋条肉\n牛肉月饼\n牛肉冷面\n牛筋牛腩\n牛柳意粉\n牛排骨肉\n牛排定食\n牛奶饼干\n牛奶排包\n牛仔骨煲\n牛乳蛋糕\n燕窝蛋挞\n熏鸡披萨\n煮荷包蛋\n焗龙利鱼\n烧肉炒饭\n烧牛小排\n烤鸭半只\n烤鸡软骨\n烤娃娃菜\n炼乳刨冰\n炸鸡软骨\n炸鸡沙拉\n炸五花肉\n炖油豆角\n炒鸡毛菜\n炒蛤蜊肉\n炒花甲螺\n炒花枝片\n炒红萝卜\n炒红苋菜\n炒小黄瓜\n炒夜开花\n炒南瓜藤\n炒北极贝\n火腿吐司\n火腿卷饼\n火炙寿司\n澳洲西冷\n滋补烩面\n深海鲽鱼\n消化饼干\n海鲜河粉\n海鲜拌饭\n海鲜伊面\n海盐拿铁\n海参捞饭\n洋葱披萨\n泡菜拼盘\n法式蜗牛\n法国面包\n法国蜗牛\n法国生蚝\n温泉蛋色拉\n牛油果慕斯\n沙爹牛肉\n沙姜猪手\n小馄饨\n蜜汁叉烧饭\n水晶虾球\n椰汁西米\n椰子蛋糕\n桃胶糖水\n安格斯西冷\n核桃色拉\n栗子白菜\n柠檬烤鸡\n柠檬果汁\n金枪鱼比萨\n果酸奶昔\n牛奶冰\n松子意面\n杭椒牛柳\n枸杞鸽子汤\n杏鲍菇丝\n杏仁脆饼\n杏仁布丁\n杂粮米糊\n朝鲜冷面\n时蔬豆腐\n三文鱼塔塔\n摩卡冰沙\n吞拿鱼意粉\n拿铁奶茶\n拌黑豆丝\n拌海蜇丝\n拉丝面包\n猪扒石锅饭\n手摇酸奶\n手打年糕\n手工馒头\n手工糖果\n手工丸子\n手卷披萨\n德国猪手\n年糕锅底\n干锅菜花\n干锅排骨\n常州百叶\n海带炖排骨\n帝王蟹腿\n巴沙鱼柳\n巧克力奶\n少的可怜\n茄子炒豆角\n娃娃菜汤\n奶酪汉堡\n奶茶布丁\n奶盖贡茶\n奶油饼干\n奶
油排包\n奶油华夫\n天山雪莲\n大虾意粉\n大碗花菜\n意大利披萨\n多菌锅底\n土豆鸡翅\n土豆面包\n土豆肉片\n如豆粉条\n土豆拼盘\n哈斯面包\n咖喱鲈鱼\n咖喱豆腐\n咖喱猪排\n咖啡沙冰\n和牛西冷\n烤黑鱼\n吞拿鱼卷\n双拼奶茶\n原味泡芙\n原味曲奇\n萝卜馅包子\n萝卜焖牛腩\n萝卜炖羊肉\n南非干鲍\n千层面包\n包菜粉丝\n巧克力软曲奇\n秘制黑豆腐\n凤梨虾球\n养生汤锅\n巧克力华夫\n优格奶昔\n仙草芋圆\n什锦焗饭\n什锦时蔬\n什锦寿司\n五花肉串\n云吞汤面\n乳酪吐司\n玛格丽特披萨\n牛肉丸粿条汤\n东海黄鱼\n三鲜烧麦\n三鲜包子\n三角饭团\n三角蛋糕\n三文鱼腹\n盛り合わせ\notoro寿司\nBagel\n龙虾泡饭\n龙利鱼饭\n龙俐鱼柳\n黑毛和牛\n黑椒鸡排\n黑椒猪扒\n黄鱼馄饨\n黄瓜蘸酱\n黄瓜泡菜\n黄油土豆\n芝麻菜色拉\n鸭血米线\n鸡蛋面饼\n鸡蛋煎饺\n鸡蛋汉堡\n鸡蛋小饼\n鸡蘑菇粉\n鸡肉米线\n鸡石锅饭\n鸡柳汉堡\n鸡块米线\n鲜肉生煎\n鲜肉烧麦\n海鲜小火锅\n鲜奶泡芙\n鱼籽刺身\n鱼头锅底\n鳗鱼卷寿司\n鱼丸子汤\n鰻魚壽司\n排骨糯米饭\n焖土豆\n烩酸菜\n香酥鸡翅\n香辣鸡块\n香辣花甲\n辣椒油\n香蕉沙拉\n香蕉华夫\n香蕉冰沙\n香草羊排\n香芋丸子\n香煎鸡排\n香烤猪蹄\n炸猪排\n香炸大排\n香星冰乐\n风味烤翅\n风味炒饭\n风味拿铁\n顶级牛排\n韭菜包子\n面包机版\n青柠绿茶\n霸王鱼头\n霸王牛肉\n雪顶拿铁\n长江鲥鱼\n银鳕鱼扒\n银鱼蒸蛋\n铁盘披萨\n铁棍淮山\n铁板牛蛙\n拿铁星冰乐\n金枪鱼头\n金枪刺身\n野生黄鳝\n野生水鱼\n野生扇贝\n醋溜白菜\n醇香奶茶\n酸汤鱼片\n酸奶麦片\n酸奶冰棍\n酱鱿鱼须\n酱金钱肚\n酱蒸茄子\n酱蒸肠粉\n酱爆牛蛙\n酱焖排骨\n酱烤肋排\n酱烤生蚝\n酱炒豆角\n酱炒花甲\n酱拌茄子\n乳酪磅蛋糕\n酒巧克力\n道口烧鸡\n迷糊娃娃\n辣酱炸鸡\n辣椒炒饭\n辣三文鱼\n转印蛋糕\n海皇轩国门店\n蔓越莓软欧\n豆角烧肉\n豆角炒饭\n豆蓉月饼\n豆芽炒肉\n豆腐饺子\n豆腐酿肉\n豆腐蛋糕\n豆腐肉丸\n豆腐炖鱼\n豆腐塞肉\n焖猪蹄\n土豆烧鸡翅\n土豆炖鸡块\n炒肉丁\n豆沙粽子\n杂酱面\n话梅猪手\n虾饺皇\n西红柿面\n西米糖水\n西施泡饭\n裸麦面包\n裙边寿司\n蜜豆奶茶\n蜗牛浓汤\n蛋黄鸡翅\n蛋糕组合\n蛋糕制作\n蚝油牛肉\n虾蒸水蛋\n虾仁沙律\n虾仁拌面\n虎虾寿司\n糯米饼\n薯条拼盘\n香蕉冰淇淋\n蔬菜烩饭\n蔬菜卷饼\n蓝莓麦芬\n蓝莓冰沙\n蓑衣茄子\n蒜香炸鸡\n蒜泥黄瓜\n蒜味面包\n葱香面包\n葱爆肥牛\n胡萝卜蛋饼\n胡萝卜炒面\n萝卜浓汤\n萝卜拼盘\n菲肋牛排\n菲利牛排\n菠萝蛋糕\n菠萝果酱\n虾仁粥\n肉圆汤\n炒肉沫\n菜炒米饭\n菜炒猪肉\n菇肉片饭\n菌菇牛肉面\n炖鸡面\n莓莫吉托\n荷包蛋面\n山药炖排骨\n草莓大福\n茶碗蒸蛋\n茴香饺子\n番茄三明治\n苹果芝士\n苹果色拉\n花生豆腐\n花式馒头\n花卷面包\n芦笋浓汤\n芥末鸭掌\n芝麻馒头\n芝士雪冰\n芝士切片\n芒果蛋挞\n芋头糖水\n腊味拼盘\n脊骨火锅\n脆骨丸子\n脆皮茄子\n脆皮肠粉\n脆皮牛肉\n脆皮烤鸡\n胚芽面包\n肩胛牛排\n肩胛牛扒\n肥牛定食\n肠仔面包\n鸡肉芝士饭\n肉砂锅面\n烧萝卜\n炖酸菜\n肉炖粉皮\n肉炒洋葱\n肉松蛋卷\n肉小丸子\n寿司卷\n肉丝套饭\n肉丁拌面\n绿茶奶盖\n绣球馒头\n红酱意面\n红酒牛肉\n红豆餐包\n红豆雪糕\n红豆松饼\n红薯面包\n红糖鸡蛋\n紫薯糖水\n糯米面包\n粉丝包子\n米饭沙拉\n米饭披萨\n笋炖排骨\n芝麻糊\n磨牙饼干\n盒子鸡排\n百合木耳\n白酱意面\n白玉豆腐\n白切羊肉\n番茄莎莎\n番茄沙沙\n甲鱼火锅\n生态黄鱼\n甜辣炸鸡\n瓦罐煨汤\n炒鱿鱼\n小丸子\n玻璃虾寿司\n珍珠奶盖\n猪软骨面\n猪油拌饭\n果汁\n特色米线\n物语蛋糕\n牛腩肠粉\n牛腩盖饭\n牛肉薄烧\n牛肉米饭\n牛肉汤饭\n牛肉批萨\n牛肉年糕\n牛肉丸面\n黑椒牛柳炒面\n牛尾浓汤\n牛奶燕麦\n牛三星汤\n爆蛋奶茶\n爆浆蛋糕\n燕麦蛋糕\n熊猫饭团\n煲娃娃菜\n水煮黄骨鱼\n煎黄花鱼\n煎金枪鱼\n煎酿茄子\n焗大龙虾\n焖牛肉饭\n烧酿茄子\n烧橡皮鱼\n炼乳面包\n炸鸡拌饭\n炸虾寿司\n炭烧奶茶\n炖娃娃菜\n炖大白菜\n炒黑鱼片\n炒黄秋葵\n炒韭菜苔\n炒野木耳\n炒油菜蕻\n炒山鸡蛋\n炒千张丝\n火鸡披萨\n火腿意粉\n澳洲牛柳\n湘西腊肉
\n冰淇淋奶昔\n海鮮炒飯\n北海道扇贝\n海苔寿司\n海捕大虾\n浓汤锅底\n澳洲大龙虾\n洋葱炒饭\n泡椒鸡爪\n油醋沙拉\n豉油皇鹅肠\n牛油果沙律\n沙爹肉串\n沙律手卷\n水果酵素\n水果汤圆\n水果奶昔\n水果圣代\n水晶奶茶\n水信玄饼\n气泡果汁\n欧式面包\n榴莲蛋挞\n榴莲双拼\n椰香面包\n椰香蛋糕\n椰子炖鸡\n椰丝蛋糕\n黑椒鸡扒饭\n椒蒸茄子\n黑椒牛肋骨\n炒鸡块\n辣椒炒火腿\n青椒炒木耳\n肉桂面包卷\n桂花汤圆\n培根焗意面\n核桃酥饼\n核桃曲奇\n茶树菇牛柳\n柴火香干\n柠檬冰饮\n枸杞豆浆\n枣泥蛋糕\n果酱酥饼\n果粒蛋糕\n果木牛排\n果冻芝士\n果仁饼干\n北极贝裙边\n肉松小面包\n杏仁酥饼\n杂鱼锅仔\n木耳白菜\n有机豆芽\n有机木耳\n普罗旺斯\n时蔬焗饭\n时蔬沙律\n无花果酱\n无花果汤\n新竹米粉\n手撕包心菜\n排骨煲仔\n排骨汤饭\n排骨拌饭\n招牌烧鹅\n拌黄花菜\n拉面火锅\n手撕羊排\n手撕猪心\n手抓龙虾\n手工拉面\n手工奶茶\n手切肥牛\n慕思蛋糕\n意式饺子\n忌廉浓汤\n心花怒放\n德国香肠\n开心花甲\n广东丝瓜\n年糕雪冰\n年糕火鍋\n带子肠粉\n布丁奶昔\n山楂果酱\n小米辽参\n孜然牛排\n夏威夷比萨\n天妇罗寿司\n天妇罗定食\n奶酪曲奇\n奶盖咖啡\n炖燕窝\n奶油蛋挞\n奶油奶酪\n奶油土司\n奶星冰乐\n大酥牛肉\n大虾小锅\n意大利扁面\n芝士牛肉堡\n墨鱼寿司\n培根饭团\n培根蛋饼\n坚果饼干\n圣诞饼干\n圣诞老人\n土豆蛋饼\n土豆肉丝\n土豆炒饭\n土司盒子\n咖喱鸡块饭\n千层肚\n咖喱肉蟹\n咖啡曲奇\n咖哩牛肉\n咕噜虾球\n烤草鱼\n炸鸡块\n大鱿鱼\n吐司盒子\n百合炒木耳\n可爱蛋糕\n可可吐司\n双薯沙拉\n双色蛋糕\n叉烧拼盘\n厚片土司\n萝卜丸子汤\n南瓜饼干\n南瓜色拉\n南瓜排骨\n南瓜戚风\n南瓜丸子\n千层盒子\n千层油糕\n刺身寿司\n切片吐司\n凤尾虾球\n凤凰奶糊\n凉拌鸡丝\n冰淇淋杯\n冰淇淋卷\n农家土鸡\n全麦软欧\n巧克力披萨\n巧克力慕司\n巧克力慕丝\n五香豆干\n五花肉汤\n五花肉2\n五花拌饭\n二豆豆浆\n乳酪沙拉\n乡里腊肉\n乐园蛋糕\n乌冬汤面\n乌冬定食\n丹麦面包\n上脑牛肉\n上脑牛扒\n三明治2\n三文鱼粥\n三文鱼挞\n丁丁炒面\n龙虾盖饭\n火龙果酸奶\n黑豆腐皮\n黑椒鸡扒\n黑椒排骨\n黄瓜炒蛋\n黄油扇贝\n麻辣花生\n芝麻糯米饼\n全麦小餐包\n鹌鹑皮蛋\n鹅肝牛排\n鸡蛋虾仁\n鸡蛋粉丝\n鸡腿焗饭\n鸡腿披萨\n鸡胸脯肉\n鸡肉拼盘\n鸡肉匹萨\n蓝鳍金枪鱼\n虾意粉\n鲜肉馅饼\n海鲜焗意粉\n鲜汤锅底\n鲅鱼饺子\n鲜鱼萝卜汤\n鱼籽蒸蛋\n鱼籽拌饭\n三文鱼籽军舰\n鱼沙拉卷\n骨粉丝汤\n骨头锅底\n养生汤\n鲜香鸳鸯锅\n香辣羊排\n香辣烤翅\n香蕉戚风\n香葱花卷\n香菜饺子\n香菜牛肉\n香菇肉丸\n香草曲奇\n香草咖啡\n香肠拌饭\n玛德琳\n香煎猪排\n香煎猪扒\n炸鸡翅\n香大鸡排\n电饭锅蛋糕\n风干火腿\n风味鸡排\n风味薯条\n风味排骨\n风味凉皮\n猪颈肉沙拉\n顶级肥牛\n韭菜锅贴\n韭菜蒸饺\n韭菜丸子\n韩式烤肉\n面包沙拉\n面包拼盘\n青菜拼盘\n青椒肉片\n雪菜肉丝\n雪糕蛋糕\n雪糕芭菲\n阿拉斯加\n长江回鱼\n银芽炒面\n银耳燕窝\n铁板牛扒\n金牌猪手\n金枪鱼面\n野菌浓汤\n酸辣米粉\n酸辣笋尖\n酸菜血肠\n酸菜白肉\n酸菜烤鱼\n酸奶玛芬\n酸奶沙冰\n酱猪肠粉\n酱焗扇贝\n酱拌黄瓜\n酥肉米线\n酥皮点心\n黑松露\n酸椰菜\n迷你蛋挞\n香辣美容蹄\n起司焗饭\n鸡脚汤\n豆豉蒸鱼\n豆花烤鱼\n豆腐近景\n豆腐白菜\n糯米粽\n焖茄子\n焖猪手\n豆渣面包\n豆沙春卷\n谷饲牛肉\n调味猪排\n调味牛肉\n诱惑披萨\n西芹虾仁\n西瓜冰沙\n墨西哥披萨\n新西兰牛排\n蜜酿酸奶\n蜜豆马芬\n蜜汁鸡块\n蜜汁南瓜\n蜂蜜松饼\n蛋白脆饼\n蛋白吐司\n鸡蛋炒馒头\n鸡蛋炒粉丝\n虾烧白菜\n虾烧卖皇\n虾炒白菜\n蘑菇面包\n蘑菇炖鸡\n蘑菇炖饭\n蘑菇炒饭\n薯泥沙拉\n山药糕\n薄荷奶昔\n薄脆饼干\n香蕉鸡蛋饼\n蔬菜炒面\n蓝莓蛋挞\n蒸糯米饭\n蒸太阳鱼\n蒸大元贝\n蒜香生蚝\n蒜香烤鱼\n蒜香意面\n蒜蓉蒸虾\n蒜苗炒肉\n葡萄蛋糕\n萝卜馄饨\n萝卜肉片\n萝卜火锅\n菠萝排骨\n菠萝奶昔\n菠菜蛋饼\n韭菜饺子馅\n菜面片汤\n菜花炒肉\n泡菜肥牛锅\n青菜肉馅饼\n芹菜肉饺子\n肉丸子\n菜素包子\n菜粒炒饭\n猪肝粥\n菜煮粉条\n煮年糕\n炒豆角
\n炒蚕豆\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport ast\nimport argparse\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"mobilenet_v2_dishes\",\n    type=\"CV/image_classification\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\n    \"Mobilenet_V2 is a image classfication model, this module is trained with Baidu self-built dishes dataset.\",\n    version=\"1.1.0\")\nclass MobileNetV2Dishes:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        
except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(\n                memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self,\n                       images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (numpy.ndarray): data of images, shape of each is [H, W, C], color space must be BGR. .\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            out = postprocess(\n                data_out=output_handle.copy_to_cpu(),\n                label_list=self.label_list,\n                top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            
add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch size.\")\n        self.arg_config_group.add_argument(\n            '--top_k',\n            type=ast.literal_eval,\n            default=1,\n            help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport cv2\n\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    orig_shape = x.shape\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_dishes/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/rAyCBQTH7ws/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYzMTIzODM5&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"mobilenet_v2_dishes\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(\n            paths=['tests/test.jpg']\n        )\n        data = results[0]\n        self.assertTrue('海鲜面' in data)\n        self.assertTrue(data['海鲜面'] > 0.01)\n\n    def test_classification2(self):\n        results = self.module.classification(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        data = results[0]\n        self.assertTrue('海鲜面' in data)\n        self.assertTrue(data['海鲜面'] > 0.01)\n\n    def test_classification3(self):\n        results = self.module.classification(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True\n        )\n        data = results[0]\n        self.assertTrue('海鲜面' in data)\n        self.assertTrue(data['海鲜面'] > 0.01)\n\n    def test_classification4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.classification,\n            paths=['no.jpg']\n        )\n\n    def test_classification5(self):\n        self.assertRaises(\n            TypeError,\n            self.module.classification,\n            images=['test.jpg']\n        )\n\n  
  def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet/README.md",
    "content": "# mobilenet_v2_imagenet\n\n|模型名称|mobilenet_v2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Mobilenet_v2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|15MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNet V2是Mark Sandler, Andrew Howard等人在2018年提出的一个图像分类模型，该系列模型（MobileNet）是为移动和嵌入式设备提出的高效模型，在模型参数较少的情况下仍然保持了较高的分类准确率。该PaddleHub Module基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python2中编码问题\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet/README_en.md",
    "content": "# mobilenet_v2_imagenet\n\n|Module Name|mobilenet_v2_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Mobilenet_v2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|15MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNet V2 is an image classification model proposed by Mark Sandler, Andrew Howard et al. in 2018. This model is a light-weight model for mobile and embedded device, and can reach high accurary with a few parameters. This module is based on MobileNet V2, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification 
API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix the problem of encoding in python2\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 num_groups: int = 1,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            param_attr=ParamAttr(name=name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\"),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + 
\"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor, if_act: bool = True):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        if if_act:\n            y = F.relu6(y)\n        return y\n\n\nclass InvertedResidualUnit(nn.Layer):\n    \"\"\"Inverted Residual unit.\"\"\"\n\n    def __init__(self, num_channels: int, num_in_filter: int, num_filters: int, stride: int, filter_size: int,\n                 padding: int, expansion_factor: int, name: str):\n        super(InvertedResidualUnit, self).__init__()\n\n        num_expfilter = int(round(num_in_filter * expansion_factor))\n        self._expand_conv = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_expfilter,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_expand\")\n\n        self._bottleneck_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_expfilter,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            num_groups=num_expfilter,\n            name=name + \"_dwise\")\n\n        self._linear_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_filters,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_linear\")\n\n    def forward(self, inputs: paddle.Tensor, ifshortcut: bool):\n        y = self._expand_conv(inputs, if_act=True)\n        y = self._bottleneck_conv(y, if_act=True)\n        y = self._linear_conv(y, if_act=False)\n        if ifshortcut:\n            y = paddle.elementwise_add(inputs, y)\n        return y\n\n\nclass InversiBlocks(nn.Layer):\n    \"\"\"Inverted residual block composed by inverted residual unit.\"\"\"\n\n    def __init__(self, in_c: int, t: int, c: int, n: int, s: int, name: str):\n        super(InversiBlocks, self).__init__()\n\n        
self._first_block = InvertedResidualUnit(\n            num_channels=in_c,\n            num_in_filter=in_c,\n            num_filters=c,\n            stride=s,\n            filter_size=3,\n            padding=1,\n            expansion_factor=t,\n            name=name + \"_1\")\n\n        self._block_list = []\n        for i in range(1, n):\n            block = self.add_sublayer(\n                name + \"_\" + str(i + 1),\n                sublayer=InvertedResidualUnit(\n                    num_channels=c,\n                    num_in_filter=c,\n                    num_filters=c,\n                    stride=1,\n                    filter_size=3,\n                    padding=1,\n                    expansion_factor=t,\n                    name=name + \"_\" + str(i + 1)))\n            self._block_list.append(block)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._first_block(inputs, ifshortcut=False)\n        for block in self._block_list:\n            y = block(y, ifshortcut=True)\n        return y\n\n\n@moduleinfo(\n    name=\"mobilenet_v2_imagenet\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v2_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNet(nn.Layer):\n    \"\"\"MobileNetV2\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNet, self).__init__()\n\n        self.class_dim = class_dim\n\n        bottleneck_params_list = [(1, 16, 1, 1), (6, 24, 2, 2), (6, 32, 3, 2), (6, 64, 4, 2), (6, 96, 3, 1),\n                                  (6, 160, 3, 2), (6, 320, 1, 1)]\n\n        self.conv1 = ConvBNLayer(\n            num_channels=3, num_filters=int(32), filter_size=3, stride=2, padding=1, name=\"conv1_1\")\n\n        self.block_list = []\n        i = 1\n        in_c = int(32)\n        for layer_setting in 
bottleneck_params_list:\n            t, c, n, s = layer_setting\n            i += 1\n            block = self.add_sublayer(\n                \"conv\" + str(i), sublayer=InversiBlocks(in_c=in_c, t=t, c=int(c), n=n, s=s, name=\"conv\" + str(i)))\n            self.block_list.append(block)\n            in_c = int(c)\n\n        self.out_c = 1280\n        self.conv9 = ConvBNLayer(\n            num_channels=in_c, num_filters=self.out_c, filter_size=1, stride=1, padding=0, name=\"conv9\")\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.out = Linear(\n            self.out_c, class_dim, weight_attr=ParamAttr(name=\"fc10_weights\"), bias_attr=ParamAttr(name=\"fc10_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v2_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v2_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1(inputs, if_act=True)\n        for block in self.block_list:\n            y = block(y)\n        y = self.conv9(y, if_act=True)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.out_c])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet_ssld/README.md",
    "content": "# mobilenet_v2_imagenet_ssld\n\n|模型名称|mobilenet_v2_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Mobilenet_v2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|15MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNet V2是Mark Sandler, Andrew Howard等人在2018年提出的一个图像分类模型，该系列模型（MobileNet）是为移动和嵌入式设备提出的高效模型，在模型参数较少的情况下仍然保持了较高的分类准确率。该PaddleHub Module基于ImageNet-2012数据集并采用PaddleClas提供的SSLD蒸馏方法训练得到，接受输入图片大小为224 x 224 x 3，支持finetune，也可以直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet_ssld\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v2_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 
GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的类别，value为置信度。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m mobilenet_v2_imagenet_ssld\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet_ssld/README_en.md",
    "content": "# mobilenet_v2_imagenet_ssld\n\n|Module Name|mobilenet_v2_imagenet_ssld|\n| :--- | :---: |\n|Category|image classification|\n|Network|Mobilenet_v2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|15MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNet V2 is an image classification model proposed by Mark Sandler, Andrew Howard et al. in 2018. This model is a light-weight model for mobile and embedded device, and can reach high accurary with a few parameters. This module is based on MobileNet V2, trained on ImageNet-2012 with SSLD distillation strategy, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet_ssld\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v2_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v2_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  
- ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the batch size;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m mobilenet_v2_imagenet_ssld\n    ```\n\n  - The serving API is now deployed, and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the following lines of code send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v2_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction 
results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install mobilenet_v2_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v2_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 num_groups: int = 1,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            param_attr=ParamAttr(name=name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\"),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + 
\"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor, if_act: bool = True):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        if if_act:\n            y = F.relu6(y)\n        return y\n\n\nclass InvertedResidualUnit(nn.Layer):\n    \"\"\"Inverted Residual unit.\"\"\"\n\n    def __init__(self, num_channels: int, num_in_filter: int, num_filters: int, stride: int, filter_size: int,\n                 padding: int, expansion_factor: int, name: str):\n        super(InvertedResidualUnit, self).__init__()\n\n        num_expfilter = int(round(num_in_filter * expansion_factor))\n        self._expand_conv = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_expfilter,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_expand\")\n\n        self._bottleneck_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_expfilter,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            num_groups=num_expfilter,\n            name=name + \"_dwise\")\n\n        self._linear_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_filters,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_linear\")\n\n    def forward(self, inputs: paddle.Tensor, ifshortcut: bool):\n        y = self._expand_conv(inputs, if_act=True)\n        y = self._bottleneck_conv(y, if_act=True)\n        y = self._linear_conv(y, if_act=False)\n        if ifshortcut:\n            y = paddle.elementwise_add(inputs, y)\n        return y\n\n\nclass InversiBlocks(nn.Layer):\n    \"\"\"Inverted residual block composed by inverted residual unit.\"\"\"\n\n    def __init__(self, in_c: int, t: int, c: int, n: int, s: int, name: str):\n        super(InversiBlocks, self).__init__()\n\n        
self._first_block = InvertedResidualUnit(\n            num_channels=in_c,\n            num_in_filter=in_c,\n            num_filters=c,\n            stride=s,\n            filter_size=3,\n            padding=1,\n            expansion_factor=t,\n            name=name + \"_1\")\n\n        self._block_list = []\n        for i in range(1, n):\n            block = self.add_sublayer(\n                name + \"_\" + str(i + 1),\n                sublayer=InvertedResidualUnit(\n                    num_channels=c,\n                    num_in_filter=c,\n                    num_filters=c,\n                    stride=1,\n                    filter_size=3,\n                    padding=1,\n                    expansion_factor=t,\n                    name=name + \"_\" + str(i + 1)))\n            self._block_list.append(block)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._first_block(inputs, ifshortcut=False)\n        for block in self._block_list:\n            y = block(y, ifshortcut=True)\n        return y\n\n\n@moduleinfo(\n    name=\"mobilenet_v2_imagenet_ssld\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v2_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNet(nn.Layer):\n    \"\"\"MobileNetV2\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNet, self).__init__()\n\n        self.class_dim = class_dim\n\n        bottleneck_params_list = [(1, 16, 1, 1), (6, 24, 2, 2), (6, 32, 3, 2), (6, 64, 4, 2), (6, 96, 3, 1),\n                                  (6, 160, 3, 2), (6, 320, 1, 1)]\n\n        self.conv1 = ConvBNLayer(\n            num_channels=3, num_filters=int(32), filter_size=3, stride=2, padding=1, name=\"conv1_1\")\n\n        self.block_list = []\n        i = 1\n        in_c = int(32)\n        for layer_setting in 
bottleneck_params_list:\n            t, c, n, s = layer_setting\n            i += 1\n            block = self.add_sublayer(\n                \"conv\" + str(i), sublayer=InversiBlocks(in_c=in_c, t=t, c=int(c), n=n, s=s, name=\"conv\" + str(i)))\n            self.block_list.append(block)\n            in_c = int(c)\n\n        self.out_c = 1280\n        self.conv9 = ConvBNLayer(\n            num_channels=in_c, num_filters=self.out_c, filter_size=1, stride=1, padding=0, name=\"conv9\")\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.out = Linear(\n            self.out_c, class_dim, weight_attr=ParamAttr(name=\"fc10_weights\"), bias_attr=ParamAttr(name=\"fc10_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v2_ssld.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v2_ssld.pdparams -O ' +\n                    checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1(inputs, if_act=True)\n        for block in self.block_list:\n            y = block(y)\n        y = self.conv9(y, if_act=True)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.out_c])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_large_imagenet_ssld/README.md",
    "content": "# mobilenet_v3_large_imagenet_ssld\n\n|模型名称|mobilenet_v3_large_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Mobilenet_v3_large|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|23MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNetV3是Google在2019年发布的新模型，作者通过结合NAS与NetAdapt进行搜索得到该网络结构，提供了Large和Small两个版本，分别适用于对资源不同要求的情况。对比于MobileNetV2，新的模型在速度和精度方面均有提升。该PaddleHubModule的模型结构为MobileNetV3 Large，基于ImageNet-2012数据集并采用PaddleClas提供的SSLD蒸馏方法训练得到，接受输入图片大小为224 x 224 x 3，支持finetune，也可以直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v3_large_imagenet_ssld\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v3_large_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v3_large_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - 
batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图片类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m mobilenet_v3_large_imagenet_ssld\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v3_large_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install mobilenet_v3_large_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_large_imagenet_ssld/README_en.md",
    "content": "# mobilenet_v3_large_imagenet_ssld\n\n|Module Name|mobilenet_v3_large_imagenet_ssld|\n| :--- | :---: |\n|Category|image classification|\n|Network|Mobilenet_v3_large|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|23MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNetV3 is an image classification model proposed by Google in 2019. The authors proposed to search the network architecture by combination of NAS and NetAdapt, and provide two versions of this model, i.e. Large and Small version. This module is based on MobileNetV3 Large, trained on ImageNet-2012 with SSLD distillation strategy, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v3_large_imagenet_ssld\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v3_large_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v3_large_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m mobilenet_v3_large_imagenet_ssld\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v3_large_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install mobilenet_v3_large_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_large_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.regularizer import L2Decay\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef make_divisible(v: int, divisor: int = 8, min_value: int = None):\n    \"\"\"\n        This function is taken from the original tf repo.\n        It ensures that all layers have a channel number that is divisible by 8\n        It can be seen here:\n        https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py\n    \"\"\"\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\n@moduleinfo(\n    name=\"mobilenet_v3_large_imagenet_ssld\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v3_large_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNetV3Large(nn.Layer):\n    \"\"\"MobileNetV3Large 
module.\"\"\"\n\n    def __init__(self, dropout_prob: float = 0.2, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNetV3Large, self).__init__()\n\n        inplanes = 16\n        self.cfg = [\n            # k, exp, c,  se,     nl,  s,\n            [3, 16, 16, False, \"relu\", 1],\n            [3, 64, 24, False, \"relu\", 2],\n            [3, 72, 24, False, \"relu\", 1],\n            [5, 72, 40, True, \"relu\", 2],\n            [5, 120, 40, True, \"relu\", 1],\n            [5, 120, 40, True, \"relu\", 1],\n            [3, 240, 80, False, \"hard_swish\", 2],\n            [3, 200, 80, False, \"hard_swish\", 1],\n            [3, 184, 80, False, \"hard_swish\", 1],\n            [3, 184, 80, False, \"hard_swish\", 1],\n            [3, 480, 112, True, \"hard_swish\", 1],\n            [3, 672, 112, True, \"hard_swish\", 1],\n            [5, 672, 160, True, \"hard_swish\", 2],\n            [5, 960, 160, True, \"hard_swish\", 1],\n            [5, 960, 160, True, \"hard_swish\", 1]\n        ]\n        self.cls_ch_squeeze = 960\n        self.cls_ch_expand = 1280\n\n        self.conv1 = ConvBNLayer(\n            in_c=3,\n            out_c=make_divisible(inplanes),\n            filter_size=3,\n            stride=2,\n            padding=1,\n            num_groups=1,\n            if_act=True,\n            act=\"hard_swish\",\n            name=\"conv1\")\n\n        self.block_list = []\n        i = 0\n        inplanes = make_divisible(inplanes)\n        for (k, exp, c, se, nl, s) in self.cfg:\n            self.block_list.append(\n                ResidualUnit(\n                    in_c=inplanes,\n                    mid_c=make_divisible(exp),\n                    out_c=make_divisible(c),\n                    filter_size=k,\n                    stride=s,\n                    use_se=se,\n                    act=nl,\n                    name=\"conv\" + str(i + 2)))\n            self.add_sublayer(sublayer=self.block_list[-1], name=\"conv\" + str(i + 2))\n        
    inplanes = make_divisible(c)\n            i += 1\n\n        self.last_second_conv = ConvBNLayer(\n            in_c=inplanes,\n            out_c=make_divisible(self.cls_ch_squeeze),\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            if_act=True,\n            act=\"hard_swish\",\n            name=\"conv_last\")\n\n        self.pool = AdaptiveAvgPool2d(1)\n\n        self.last_conv = Conv2d(\n            in_channels=make_divisible(self.cls_ch_squeeze),\n            out_channels=self.cls_ch_expand,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            weight_attr=ParamAttr(name=\"last_1x1_conv_weights\"),\n            bias_attr=False)\n\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n\n        self.out = Linear(\n            self.cls_ch_expand, class_dim, weight_attr=ParamAttr(\"fc_weights\"), bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v3_large_ssld.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v3_large_ssld.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self.conv1(inputs)\n        for block in self.block_list:\n            x = block(x)\n\n        x = self.last_second_conv(x)\n        x = self.pool(x)\n\n        x = self.last_conv(x)\n        x = F.hard_swish(x)\n        x = self.dropout(x)\n        x = paddle.reshape(x, 
shape=[x.shape[0], x.shape[1]])\n        x = self.out(x)\n        return x\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 in_c: int,\n                 out_c: int,\n                 filter_size: int,\n                 stride: int,\n                 padding: int,\n                 num_groups: int = 1,\n                 if_act: bool = True,\n                 act: str = None,\n                 name: str = \"\"):\n        super(ConvBNLayer, self).__init__()\n        self.if_act = if_act\n        self.act = act\n        self.conv = Conv2d(\n            in_channels=in_c,\n            out_channels=out_c,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        self.bn = BatchNorm(\n            num_channels=out_c,\n            act=None,\n            param_attr=ParamAttr(name=name + \"_bn_scale\", regularizer=L2Decay(0.0)),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\", regularizer=L2Decay(0.0)),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, x: paddle.Tensor):\n        x = self.conv(x)\n        x = self.bn(x)\n        if self.if_act:\n            if self.act == \"relu\":\n                x = F.relu(x)\n            elif self.act == \"hard_swish\":\n                x = F.hard_swish(x)\n            else:\n                print(\"The activation function is selected incorrectly.\")\n                exit()\n        return x\n\n\nclass ResidualUnit(nn.Layer):\n    \"\"\"Residual unit for MobileNetV3.\"\"\"\n\n    def __init__(self,\n                 in_c: int,\n                 mid_c: int,\n                 out_c: int,\n                 filter_size: int,\n                 stride: int,\n                 use_se: int,\n                 act: str = 
None,\n                 name: str = ''):\n        super(ResidualUnit, self).__init__()\n        self.if_shortcut = stride == 1 and in_c == out_c\n        self.if_se = use_se\n\n        self.expand_conv = ConvBNLayer(\n            in_c=in_c, out_c=mid_c, filter_size=1, stride=1, padding=0, if_act=True, act=act, name=name + \"_expand\")\n        self.bottleneck_conv = ConvBNLayer(\n            in_c=mid_c,\n            out_c=mid_c,\n            filter_size=filter_size,\n            stride=stride,\n            padding=int((filter_size - 1) // 2),\n            num_groups=mid_c,\n            if_act=True,\n            act=act,\n            name=name + \"_depthwise\")\n        if self.if_se:\n            self.mid_se = SEModule(mid_c, name=name + \"_se\")\n        self.linear_conv = ConvBNLayer(\n            in_c=mid_c, out_c=out_c, filter_size=1, stride=1, padding=0, if_act=False, act=None, name=name + \"_linear\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self.expand_conv(inputs)\n        x = self.bottleneck_conv(x)\n        if self.if_se:\n            x = self.mid_se(x)\n        x = self.linear_conv(x)\n        if self.if_shortcut:\n            x = paddle.elementwise_add(inputs, x)\n        return x\n\n\nclass SEModule(nn.Layer):\n    \"\"\"Basic model for ResidualUnit.\"\"\"\n\n    def __init__(self, channel: int, reduction: int = 4, name: str = \"\"):\n        super(SEModule, self).__init__()\n        self.avg_pool = AdaptiveAvgPool2d(1)\n        self.conv1 = Conv2d(\n            in_channels=channel,\n            out_channels=channel // reduction,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            weight_attr=ParamAttr(name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        self.conv2 = Conv2d(\n            in_channels=channel // reduction,\n            out_channels=channel,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            
weight_attr=ParamAttr(name=name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs: paddle.Tensor):\n        outputs = self.avg_pool(inputs)\n        outputs = self.conv1(outputs)\n        outputs = F.relu(outputs)\n        outputs = self.conv2(outputs)\n        outputs = F.hard_sigmoid(outputs)\n        return paddle.multiply(x=inputs, y=outputs, axis=0)\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_small_imagenet_ssld/README.md",
    "content": "# mobilenet_v3_small_imagenet_ssld\n\n|模型名称|mobilenet_v3_small_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Mobilenet_v3_Small|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|13MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - MobileNetV3是Google在2019年发布的新模型，作者通过结合NAS与NetAdapt进行搜索得到该网络结构，提供了Large和Small两个版本，分别适用于对资源不同要求的情况。对比于MobileNetV2，新的模型在速度和精度方面均有提升。该PaddleHubModule的模型结构为MobileNetV3 Small，基于ImageNet-2012数据集并采用PaddleClas提供的SSLD蒸馏方法训练得到，接受输入图片大小为224 x 224 x 3，支持finetune，也可以直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install mobilenet_v3_small_imagenet_ssld\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run mobilenet_v3_small_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v3_small_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - 
batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图片类别，value为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m mobilenet_v3_small_imagenet_ssld\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v3_small_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install mobilenet_v3_small_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_small_imagenet_ssld/README_en.md",
    "content": "# mobilenet_v3_small_imagenet_ssld\n\n|Module Name|mobilenet_v3_small_imagenet_ssld|\n| :--- | :---: |\n|Category|image classification|\n|Network|Mobilenet_v3_Small|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|13MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - MobileNetV3 is an image classification model proposed by Google in 2019. The authors proposed to search the network architecture by combination of NAS and NetAdapt, and provide two versions of this model, i.e. Large and Small version. This module is based on MobileNetV3 Small, trained on ImageNet-2012 with SSLD distillation strategy, and can predict an image of size 224*224*3.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install mobilenet_v3_small_imagenet_ssld\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run mobilenet_v3_small_imagenet_ssld --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"mobilenet_v3_small_imagenet_ssld\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = 
classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m mobilenet_v3_small_imagenet_ssld\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/mobilenet_v3_small_imagenet_ssld\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install mobilenet_v3_small_imagenet_ssld==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/mobilenet_v3_small_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.regularizer import L2Decay\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\n@moduleinfo(\n    name=\"mobilenet_v3_small_imagenet_ssld\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"mobilenet_v3_small_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass MobileNetV3Small(nn.Layer):\n    \"\"\"MobileNetV3Small module.\"\"\"\n\n    def __init__(self, dropout_prob: float = 0.2, class_dim: int = 1000, load_checkpoint: str = None):\n        super(MobileNetV3Small, self).__init__()\n\n        inplanes = 16\n        self.cfg = [\n            # k, exp, c,  se,     nl,  s,\n            [3, 16, 16, True, \"relu\", 2],\n         
   [3, 72, 24, False, \"relu\", 2],\n            [3, 88, 24, False, \"relu\", 1],\n            [5, 96, 40, True, \"hard_swish\", 2],\n            [5, 240, 40, True, \"hard_swish\", 1],\n            [5, 240, 40, True, \"hard_swish\", 1],\n            [5, 120, 48, True, \"hard_swish\", 1],\n            [5, 144, 48, True, \"hard_swish\", 1],\n            [5, 288, 96, True, \"hard_swish\", 2],\n            [5, 576, 96, True, \"hard_swish\", 1],\n            [5, 576, 96, True, \"hard_swish\", 1],\n        ]\n        self.cls_ch_squeeze = 576\n        self.cls_ch_expand = 1280\n\n        self.conv1 = ConvBNLayer(\n            in_c=3,\n            out_c=make_divisible(inplanes),\n            filter_size=3,\n            stride=2,\n            padding=1,\n            num_groups=1,\n            if_act=True,\n            act=\"hard_swish\",\n            name=\"conv1\")\n\n        self.block_list = []\n        i = 0\n        inplanes = make_divisible(inplanes)\n        for (k, exp, c, se, nl, s) in self.cfg:\n            self.block_list.append(\n                ResidualUnit(\n                    in_c=inplanes,\n                    mid_c=make_divisible(exp),\n                    out_c=make_divisible(c),\n                    filter_size=k,\n                    stride=s,\n                    use_se=se,\n                    act=nl,\n                    name=\"conv\" + str(i + 2)))\n            self.add_sublayer(sublayer=self.block_list[-1], name=\"conv\" + str(i + 2))\n            inplanes = make_divisible(c)\n            i += 1\n\n        self.last_second_conv = ConvBNLayer(\n            in_c=inplanes,\n            out_c=make_divisible(self.cls_ch_squeeze),\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            if_act=True,\n            act=\"hard_swish\",\n            name=\"conv_last\")\n\n        self.pool = AdaptiveAvgPool2d(1)\n\n        self.last_conv = Conv2d(\n            
in_channels=make_divisible(self.cls_ch_squeeze),\n            out_channels=self.cls_ch_expand,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            weight_attr=ParamAttr(name=\"last_1x1_conv_weights\"),\n            bias_attr=False)\n\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n\n        self.out = Linear(\n            self.cls_ch_expand, class_dim, weight_attr=ParamAttr(\"fc_weights\"), bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'mobilenet_v3_small_ssld.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/mobilenet_v3_small_ssld.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self.conv1(inputs)\n        for block in self.block_list:\n            x = block(x)\n\n        x = self.last_second_conv(x)\n        x = self.pool(x)\n\n        x = self.last_conv(x)\n        x = F.hard_swish(x)\n        x = self.dropout(x)\n        x = paddle.reshape(x, shape=[x.shape[0], x.shape[1]])\n        x = self.out(x)\n        return x\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 in_c: int,\n                 out_c: int,\n                 filter_size: int,\n                 stride: int,\n                 padding: int,\n                 num_groups: int = 1,\n                 if_act: bool = True,\n                 act: str = None,\n                 name: str = 
\"\"):\n        super(ConvBNLayer, self).__init__()\n        self.if_act = if_act\n        self.act = act\n        self.conv = Conv2d(\n            in_channels=in_c,\n            out_channels=out_c,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        self.bn = BatchNorm(\n            num_channels=out_c,\n            act=None,\n            param_attr=ParamAttr(name=name + \"_bn_scale\", regularizer=L2Decay(0.0)),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\", regularizer=L2Decay(0.0)),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        if self.if_act:\n            if self.act == \"relu\":\n                x = F.relu(x)\n            elif self.act == \"hard_swish\":\n                x = F.hard_swish(x)\n            else:\n                print(\"The activation function is selected incorrectly.\")\n                exit()\n        return x\n\n\nclass ResidualUnit(nn.Layer):\n    \"\"\"Residual unit for MobileNetV3.\"\"\"\n\n    def __init__(self,\n                 in_c: int,\n                 mid_c: int,\n                 out_c: int,\n                 filter_size: int,\n                 stride: int,\n                 use_se: bool,\n                 act: str = None,\n                 name: str = ''):\n        super(ResidualUnit, self).__init__()\n        self.if_shortcut = stride == 1 and in_c == out_c\n        self.if_se = use_se\n\n        self.expand_conv = ConvBNLayer(\n            in_c=in_c, out_c=mid_c, filter_size=1, stride=1, padding=0, if_act=True, act=act, name=name + \"_expand\")\n        self.bottleneck_conv = ConvBNLayer(\n            in_c=mid_c,\n            out_c=mid_c,\n            filter_size=filter_size,\n            
stride=stride,\n            padding=int((filter_size - 1) // 2),\n            num_groups=mid_c,\n            if_act=True,\n            act=act,\n            name=name + \"_depthwise\")\n        if self.if_se:\n            self.mid_se = SEModule(mid_c, name=name + \"_se\")\n        self.linear_conv = ConvBNLayer(\n            in_c=mid_c, out_c=out_c, filter_size=1, stride=1, padding=0, if_act=False, act=None, name=name + \"_linear\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self.expand_conv(inputs)\n        x = self.bottleneck_conv(x)\n        if self.if_se:\n            x = self.mid_se(x)\n        x = self.linear_conv(x)\n        if self.if_shortcut:\n            x = paddle.elementwise_add(inputs, x)\n        return x\n\n\nclass SEModule(nn.Layer):\n    \"\"\"Basic model for ResidualUnit.\"\"\"\n\n    def __init__(self, channel: int, reduction: int = 4, name: str = \"\"):\n        super(SEModule, self).__init__()\n        self.avg_pool = AdaptiveAvgPool2d(1)\n        self.conv1 = Conv2d(\n            in_channels=channel,\n            out_channels=channel // reduction,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            weight_attr=ParamAttr(name=name + \"_1_weights\"),\n            bias_attr=ParamAttr(name=name + \"_1_offset\"))\n        self.conv2 = Conv2d(\n            in_channels=channel // reduction,\n            out_channels=channel,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_2_weights\"),\n            bias_attr=ParamAttr(name=name + \"_2_offset\"))\n\n    def forward(self, inputs: paddle.Tensor):\n        outputs = self.avg_pool(inputs)\n        outputs = self.conv1(outputs)\n        outputs = F.relu(outputs)\n        outputs = self.conv2(outputs)\n        outputs = F.hard_sigmoid(outputs)\n        return paddle.multiply(x=inputs, y=outputs, axis=0)\n"
  },
  {
    "path": "modules/image/classification/nasnet_imagenet/README.md",
    "content": "# nasnet_imagenet\n\n|模型名称|nasnet_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|NASNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|345MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - NASNet是Google通过AutoML自动训练出来的图像分类模型。该PaddleHub Module基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install nasnet_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run nasnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"nasnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python2中编码问题\n  - ```shell\n    $ hub install nasnet_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/nasnet_imagenet/README_en.md",
    "content": "# nasnet_imagenet\n\n|Module Name|nasnet_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|NASNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|345MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n  - NASNet is an image classification model proposed by Google, whose architecture was searched automatically by AutoML. This module is based on NASNet, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install nasnet_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run nasnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"nasnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix the problem of encoding in python2\n  - ```shell\n    $ hub install nasnet_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/pnasnet_imagenet/README.md",
    "content": "# pnasnet_imagenet\n\n|模型名称|pnasnet_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|PNASNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|333MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - PNASNet是Google通过AutoML自动训练出来的图像分类模型。该PaddleHub Module基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pnasnet_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pnasnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pnasnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python2中编码问题\n  - ```shell\n    $ hub install pnasnet_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/pnasnet_imagenet/README_en.md",
    "content": "# pnasnet_imagenet\n\n|Module Name|pnasnet_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|PNASNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|333MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - PNASNet is an image classification model proposed by Google, whose architecture was searched automatically by AutoML. This module is based on PNASNet, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pnasnet_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run pnasnet_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pnasnet_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix the problem of encoding in python2\n  - ```shell\n    $ hub install pnasnet_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_25_imagenet/README.md",
    "content": "# pplcnet_x0_25_imagenet\n\n|模型名称|pplcnet_x0_25_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|PPLCNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|5 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - PP-LCNet是百度针对Intel CPU 设备以及其加速库 MKLDNN 设计的特定骨干网络 ，比起其他的轻量级的 SOTA 模型，该骨干网络可以在不增加推理时间的情况下，进一步提升模型的性能，最终大幅度超越现有的 SOTA 模型。该模型为模型规模参数scale为x0.25下的PP-LCNet模型，关于模型结构的更多信息，可参考[论文](https://arxiv.org/pdf/2109.15099.pdf)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pplcnet_x0_25_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pplcnet_x0_25_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x0_25_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n     
 - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pplcnet_x0_25_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x0_25_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install pplcnet_x0_25_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_25_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_patterns' would be ignored when 'return_stages' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"Invalid 'return_stages' value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name(s) of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify target layer specified by 'layer_name_pattern'. The formal params are the layer(nn.Layer) and pattern(str) that is (a member of) layer_name_pattern (when layer_name_pattern is List type). 
And the return is the layer processed.\n\n        Returns:\n            List[str]: The patterns in 'layer_name_pattern' that matched a sub-layer and were handled by 'handle_func()'.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward computation stops.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The failing layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name(s) of the layer(s) whose output should be returned.\n\n        Returns:\n            List[str]: The patterns that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layers after the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of the target layer specified by layer_name and layer_index.\n        layer_name (str): The name of the layer after which subsequent layers are set to Identity.\n        layer_index (str, optional): The index, within the layer named layer_name, after which subsequent sub-layers are set to Identity. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern to describe the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if parsing failed. If successful, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is invalid. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') specified in pattern('{pattern}') was found.\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer found at index('{target_layer_index}') specified in pattern('{pattern}'). 
The index should be less than {len(target_layer)} and not less than 0.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x0_25(**kwargs):\n    model = PPLCNet(scale=0.25, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_25_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x0_25\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x0_25_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x0_5:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x0_25.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x0_25_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x0_25()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images == None and paths == None:\n            print('No image provided. 
Please input an image or a image path.')\n            return\n\n        if images != None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths != None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_25_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ReisizeImage for '\\\n                'both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If want to use your own label_dict, please input legal path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_25_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\")\n    assert len(ks) > 0, ('lenght of keys should larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} is not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new filed ({}) detected!'.format(ks[0], dl))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a =\"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only a = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) is not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_35_imagenet/README.md",
    "content": "# pplcnet_x0_35_imagenet\n\n|模型名称|pplcnet_x0_35_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|PPLCNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|6 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - PP-LCNet是百度针对Intel CPU 设备以及其加速库 MKLDNN 设计的特定骨干网络 ，比起其他的轻量级的 SOTA 模型，该骨干网络可以在不增加推理时间的情况下，进一步提升模型的性能，最终大幅度超越现有的 SOTA 模型。该模型为模型规模参数scale为x0.35下的PP-LCNet模型，关于模型结构的更多信息，可参考[论文](https://arxiv.org/pdf/2109.15099.pdf)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pplcnet_x0_35_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pplcnet_x0_35_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x0_35_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n     
 - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pplcnet_x0_35_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"\\}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x0_35_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install pplcnet_x0_35_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_35_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = \"The 'return_stages' would be ignored when 'return_patterns' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"Invalid 'return_stages'. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"Use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function that modifies the target layer specified by 'layer_name_pattern'. Its arguments are the layer (nn.Layer) and the pattern (str), i.e. (a member of) 'layer_name_pattern' (when 'layer_name_pattern' is a list). 
And it returns the processed layer.\n\n        Returns:\n            List[str]: The patterns that matched a sub-layer and were handled successfully.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return 
hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"Stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward stop.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The failing layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"Update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name(s) of the layer(s) whose output should be returned.\n\n        Returns:\n            List[str]: The patterns that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"Set the sub-layers after the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of the target layer specified by layer_name and layer_index.\n        layer_name (str): The name of the target layer; the sub-layers after it are set to Identity.\n        layer_index (str, optional): The index of the target layer within parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"Parse the string-type pattern.\n\n    Args:\n        pattern (str): The pattern that describes the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if failed. If successful, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if it exists},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if it exists},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"Not found layer named('{target_layer_name}') specified in pattern('{pattern}').\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"Not found layer by index('{target_layer_index}') specified in pattern('{pattern}'). The index should be >= 0 and < {len(target_layer)}.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x0_35(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=0.35, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_35_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x0_35\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x0_35_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x0_35:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x0_35.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x0_35_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x0_35()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_35_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ReisizeImage for '\\\n                'both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image: subtract mean and divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If want to use your own label_dict, please input legal path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_35_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"dl should be a list or a dict\")\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_5_imagenet/README.md",
    "content": "# pplcnet_x0_5_imagenet\n\n|Module Name|pplcnet_x0_5_imagenet|\n| :--- | :---: |\n|Category|Image Classification|\n|Network|PPLCNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|7 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - PP-LCNet is a lightweight backbone network designed by Baidu for Intel CPU devices and their acceleration library MKLDNN. Compared with other lightweight SOTA models, this backbone further improves model performance without increasing inference time, and substantially outperforms the existing SOTA models. This module is the PP-LCNet model at scale x0.5. For more information about the network architecture, please refer to the [paper](https://arxiv.org/pdf/2109.15099.pdf).\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install pplcnet_x0_5_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run pplcnet_x0_5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - Use the command line to call the classification module; for more information, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x0_5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, the shape of each image is \\[H, W, C\\], and the color space is BGR; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidences) and 'label_names' (class names)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pplcnet_x0_5_imagenet\n    ```\n\n  - The online service of image classification is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, please set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With the server configured, the following lines of code can send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x0_5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install pplcnet_x0_5_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_5_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_patterns' would be ignored when 'return_stages' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) > len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"The 'return_stages' set error. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> Dict[str, nn.Layer]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of layer to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify target layer specified by 'layer_name_pattern'. The formal params are the layer(nn.Layer) and pattern(str) that is (a member of) layer_name_pattern (when layer_name_pattern is List type). 
And the return is the layer processed.\n\n        Returns:\n            List[str]: The pattern(s) in 'layer_name_pattern' that were matched and handled successfully.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward computation stops.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of layer to return output.\n\n        Returns:\n            List[str]: The pattern(s) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of target layer specified by layer_name and layer_index.\n        layer_name (str): The name of target layer to be set to Identity.\n        layer_index (str, optional): The index of target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern describing the target layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n    Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if failed. If successful, the members are layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') specified in pattern('{pattern}') was found.\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer was found by index('{target_layer_index}') specified in pattern('{pattern}'). The index should be less than {len(target_layer)} and not less than 0.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x0_5(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=0.5, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_5_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x0_5\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x0_5_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x0_5:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x0_5.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x0_5_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x0_5()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_5_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ReisizeImage for '\\\n                'both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract mean and divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path.\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_5_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if ks[0] not in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can only be one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_75_imagenet/README.md",
    "content": "# pplcnet_x0_75_imagenet\n\n|模型名称|pplcnet_x0_75_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|PPLCNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|9 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - PP-LCNet是百度针对Intel CPU 设备以及其加速库 MKLDNN 设计的特定骨干网络 ，比起其他的轻量级的 SOTA 模型，该骨干网络可以在不增加推理时间的情况下，进一步提升模型的性能，最终大幅度超越现有的 SOTA 模型。该模型为模型规模参数scale为x0.75下的PP-LCNet模型，关于模型结构的更多信息，可参考[论文](https://arxiv.org/pdf/2109.15099.pdf)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pplcnet_x0_75_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pplcnet_x0_75_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x0_75_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n     
 - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pplcnet_x0_75_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"\\}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x0_75_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install pplcnet_x0_75_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_75_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_patterns' would be ignored when 'return_stages' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) > len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"The 'return_stages' set error. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> Dict[str, nn.Layer]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of layer to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify target layer specified by 'layer_name_pattern'. The formal params are the layer(nn.Layer) and pattern(str) that is (a member of) layer_name_pattern (when layer_name_pattern is List type). 
And the return is the layer processed.\n\n        Returns:\n            Dict[str, nn.Layer]: The key is the pattern and corresponding value is the result returned by 'handle_func()'.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # {'blocks[11].depthwise_conv.conv': the corresponding new_layer, 'blocks[12].depthwise_conv.conv': the corresponding new_layer}\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return 
hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of layer that stop forward and backward after this layer.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers that after stop_layer_name('{stop_layer_name}') to IdentityLayer. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> Dict[str, nn.Layer]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of layer to return output.\n\n        Returns:\n            Dict[str, nn.Layer]: The pattern(str) and corresponding layer(nn.Layer) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of the target layer specified by layer_name and layer_index.\n        layer_name (str): The name of the target layer to be set to Identity.\n        layer_index (str, optional): The index of the target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern describing the target layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if parsing failed. If successful, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if it exists},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if it exists},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') was found as specified in pattern('{pattern}').\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer was found at index('{target_layer_index}') specified in pattern('{pattern}'). 
The index should be >= 0 and < {len(target_layer)}.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x0_75(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=0.75, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_75_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x0_75\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x0_75_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x0_75:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x0_75.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x0_75_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x0_75()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_75_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract the mean and divide by the std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: if you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x0_75_imagenet/utils.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace a value in a nested dict or list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n
        v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can only be one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
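The `override` / `override_config` helpers above walk a nested config along a dotted key path (e.g. `'VALID.transforms.1.ResizeImage.resize_short=300'`), treating numeric segments as list indices and converting literal-looking string values. A minimal standalone sketch of that behavior (the `set_by_path` helper is illustrative, not part of the module):

```python
from ast import literal_eval

def set_by_path(cfg, dotted_key, value):
    # Walk the nested config along the dotted key path; integer
    # segments index into lists, other segments index into dicts.
    keys = dotted_key.split('.')
    node = cfg
    for k in keys[:-1]:
        node = node[int(k)] if isinstance(node, list) else node[k]
    # Convert numeric/literal strings ('300', 'True'), like str2num does.
    try:
        value = literal_eval(value)
    except (ValueError, SyntaxError):
        pass
    last = keys[-1]
    if isinstance(node, list):
        node[int(last)] = value
    else:
        node[last] = value

cfg = {'VALID': {'transforms': [{'DecodeImage': None},
                                {'ResizeImage': {'resize_short': 256}}]}}
set_by_path(cfg, 'VALID.transforms.1.ResizeImage.resize_short', '300')
print(cfg['VALID']['transforms'][1]['ResizeImage']['resize_short'])  # 300
```

Note that the string `'300'` arrives as the integer `300`, matching the module's `str2num` conversion.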
  {
    "path": "modules/image/classification/pplcnet_x1_0_imagenet/README.md",
    "content": "# pplcnet_x1_0_imagenet\n\n|Module Name|pplcnet_x1_0_imagenet|\n| :--- | :---: |\n|Category|Image Classification|\n|Network|PPLCNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|11 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - PP-LCNet is a backbone network designed by Baidu for Intel CPU devices and their MKLDNN acceleration library. Compared with other lightweight SOTA models, this backbone further improves model performance without increasing inference time, and substantially outperforms the existing SOTA models. This module is the PP-LCNet model at scale x1.0. For more information about the network architecture, please refer to the [paper](https://arxiv.org/pdf/2109.15099.pdf).\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install pplcnet_x1_0_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run pplcnet_x1_0_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - Use the command line to call the classification module. For more information, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x1_0_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, the shape of each image is \\[H, W, C\\], and the color space is BGR; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidences) and 'label_names' (class names)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image recognition.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pplcnet_x1_0_imagenet\n    ```\n\n  - The online image recognition service is now deployed; the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, please set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With the server configured, the following lines of code send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x1_0_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print the prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install pplcnet_x1_0_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_0_imagenet/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_stages' would be ignored when 'return_patterns' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"Invalid 'return_stages'. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"Use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify the target layer specified by 'layer_name_pattern'. The formal params are the layer(nn.Layer) and pattern(str) that is (a member of) layer_name_pattern (when layer_name_pattern is List type). 
The function should return the processed layer.\n\n        Returns:\n            List[str]: The pattern(s) that matched and were processed.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"Stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward are stopped.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The failing layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"Update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name(s) of the layer(s) whose output should be returned.\n\n        Returns:\n            List[str]: The pattern(s) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"Set the layers after the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of the target layer specified by layer_name and layer_index.\n        layer_name (str): The name of the target layer to be set to Identity.\n        layer_index (str, optional): The index of the target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"Parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern to describe the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n
    Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if failed. If successful, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') specified in pattern('{pattern}') was found.\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer found by index('{target_layer_index}') specified in pattern('{pattern}'). The index should be < {len(target_layer)} and >= 0.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x1_0(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=1.0, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
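`make_divisible` in model.py rounds scaled channel counts to a multiple of 8 (which is hardware friendly), and bumps the result up one step whenever rounding would lose more than about 10% of the requested width. A standalone copy to illustrate the rounding:

```python
def make_divisible(v, divisor=8, min_value=None):
    # Round v to the nearest multiple of `divisor` (never below `min_value`),
    # then bump up one step if rounding lost more than ~10% of v.
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

print(make_divisible(16 * 0.35))  # 8: scaled 5.6 is pulled up to the minimum of 8
print(make_divisible(100))        # 104: nearest multiple of 8, rounding half up
print(make_divisible(11))         # 16: rounding down to 8 would lose >10%, so bump up
```

This is why every `num_channels`/`num_filters` in the PPLCNet blocks above is wrapped in `make_divisible(… * scale)`.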
  {
    "path": "modules/image/classification/pplcnet_x1_0_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x1_0\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x1_0_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x1_0:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x1_0.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x1_0_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x1_0()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
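The `classification()` method in module.py accumulates preprocessed images and flushes them to the model whenever the buffer reaches `batch_size`, plus one final partial batch. That grouping logic, as a standalone sketch (the `iter_batches` name is illustrative, not part of the module):

```python
def iter_batches(items, batch_size):
    # Yield fixed-size groups and flush the final partial batch,
    # mirroring the accumulate-and-clear loop in classification().
    batch = []
    for idx, item in enumerate(items):
        batch.append(item)
        if len(batch) >= batch_size or idx == len(items) - 1:
            yield list(batch)
            batch.clear()

print(list(iter_batches([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```

Yielding a copy (`list(batch)`) before `batch.clear()` matters: the module avoids the same aliasing issue by building a fresh tensor from `batch_data` before clearing it.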
  {
    "path": "modules/image/classification/pplcnet_x1_0_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract mean, divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; use np.frombuffer instead\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_0_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace a value in a nested dict or list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n        v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a '=' \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one '=' in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_5_imagenet/README.md",
    "content": "# pplcnet_x1_5_imagenet\n\n|Module Name|pplcnet_x1_5_imagenet|\n| :--- | :---: |\n|Category|Image Classification|\n|Network|PPLCNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|17 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - PP-LCNet is a backbone network designed by Baidu for Intel CPU devices and the MKLDNN acceleration library. Compared with other lightweight SOTA models, it further improves accuracy without increasing inference time, and substantially outperforms existing SOTA models. This module is the PP-LCNet model with scale x1.5. For more details about the architecture, please refer to the [paper](https://arxiv.org/pdf/2109.15099.pdf).\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install pplcnet_x1_5_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run pplcnet_x1_5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - Use the command line to call the classification model. For more information, please refer to [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x1_5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, each image with shape \\[H, W, C\\], BGR color space; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element of the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidence scores) and 'label_names' (class names)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pplcnet_x1_5_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, the CUDA\\_VISIBLE\\_DEVICES environment variable must be set before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, the following lines of code can send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x1_5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install pplcnet_x1_5_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_5_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_stages' would be ignored when 'return_patterns' is set.\"\n            print(msg)\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"Invalid 'return_stages' setting; illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                print(msg)\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name(s) of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify the target layer specified by 'layer_name_pattern'. Its formal params are the layer(nn.Layer) and the pattern(str) that is (a member of) 'layer_name_pattern' (when 'layer_name_pattern' is a list), and its return value is the processed layer.\n\n        Returns:\n            List[str]: The pattern(s) in 'layer_name_pattern' that were matched and processed.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return 
hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of the layer after which forward and backward stop.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers after stop_layer_name('{stop_layer_name}') to Identity. The error layer's name is '{name}'.\"\n                print(msg)\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> List[str]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name(s) of the layer(s) whose output should be returned.\n\n        Returns:\n            List[str]: The pattern(s) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of the target layer specified by layer_name and layer_index.\n        layer_name (str): The name of the target layer to be set to Identity.\n        layer_index (str, optional): The index of the target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern to describe the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None on failure. On success, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if it exists},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if it exists},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        print(msg)\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"No layer named('{target_layer_name}') was found, as specified in pattern('{pattern}').\"\n            print(msg)\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"No layer with index('{target_layer_index}') was found, as specified in pattern('{pattern}'). 
The index should be less than {len(target_layer)} and not less than 0.\"\n                print(msg)\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x1_5(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=1.5, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_5_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x1_5\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x1_5_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x1_5:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x1_5.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x1_5_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x1_5()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results; each result dict contains the keys 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_5_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract mean and divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; frombuffer is the supported equivalent here\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x1_5_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\".format(dl))\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} is not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_0_imagenet/README.md",
    "content": "# pplcnet_x2_0_imagenet\n\n|Module Name|pplcnet_x2_0_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|PPLCNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|24 MB|\n|Latest update date|2022-04-02|\n|Data indicators|Acc|\n\n\n## I. Basic Information\n\n\n\n- ### Module Introduction\n\n  - PP-LCNet is a backbone network designed by Baidu specifically for Intel CPU devices and the MKLDNN acceleration library. Compared with other lightweight SOTA models, this backbone further improves model performance without increasing inference time, and significantly outperforms the existing SOTA models. This module is the PP-LCNet model with the scale parameter x2.0. For more information about the network architecture, please refer to the [paper](https://arxiv.org/pdf/2109.15099.pdf).\n\n## II. Installation\n\n- ### 1. Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install pplcnet_x2_0_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command line Prediction\n\n  - ```shell\n    $ hub run pplcnet_x2_0_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - For more information about the command line, please refer to [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x2_0_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3. API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - Classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, the shape of each is \\[H, W, C\\], color space is BGR; <br/>\n      - paths (list\\[str\\]): image paths; <br/>\n      - batch\\_size (int): batch size; <br/>\n      - use\\_gpu (bool): whether to use GPU; **if GPU is used, please set the CUDA_VISIBLE_DEVICES environment variable first** <br/>\n      - top\\_k (int): return the top k prediction results.\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results. Each element in the list is a dict whose keys include 'class_ids' (class indices), 'scores' (confidences) and 'label_names' (class names).\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image recognition.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pplcnet_x2_0_imagenet\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x2_0_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install pplcnet_x2_0_imagenet==1.0.0\n    ```\n"
  },
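The serving example in the README above wraps each image as a base64 string inside a JSON payload. A minimal sketch of that encode/decode round trip, using a placeholder byte string in place of a real `cv2.imencode` JPEG so no OpenCV install is assumed:

```python
import base64
import json


def bytes_to_base64(data: bytes) -> str:
    # mirrors cv2_to_base64: raw encoded-image bytes -> utf8 base64 string
    return base64.b64encode(data).decode('utf8')


def base64_to_bytes(b64: str) -> bytes:
    # the inverse transform the server applies before decoding the image
    return base64.b64decode(b64)


# stand-in for cv2.imencode('.jpg', image)[1].tobytes()
fake_jpeg = b'\xff\xd8\xff\xe0fake-jpeg-payload'

payload = json.dumps({'images': [bytes_to_base64(fake_jpeg)]})
decoded = base64_to_bytes(json.loads(payload)['images'][0])
assert decoded == fake_jpeg
```

The same round trip is what `cv2_to_base64` on the client and `base64_to_cv2` in `processor.py` perform, with JPEG encode/decode on either side.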
  {
    "path": "modules/image/classification/pplcnet_x2_0_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = \"The 'return_patterns' would be ignored when 'return_stages' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"Invalid 'return_stages'. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"Use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of the layer(s) to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify the target layer specified by 'layer_name_pattern'. The formal params are the layer (nn.Layer) and the pattern (str) that is (a member of) layer_name_pattern (when layer_name_pattern is a list). And the return is the layer processed.\n\n        Returns:\n            List[str]: The pattern(s) that matched a sub-layer and were handled by 'handle_func'.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return 
hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of layer that stop forward and backward after this layer.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers that after stop_layer_name('{stop_layer_name}') to IdentityLayer. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> Dict[str, nn.Layer]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of layer to return output.\n\n        Returns:\n            Dict[str, nn.Layer]: The pattern(str) and corresponding layer(nn.Layer) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of target layer specified by layer_name and layer_index.\n        layer_name (str): The name of target layer to be set to Identity.\n        layer_index (str, optional): The index of target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern to describe the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None on failure. On success, the members are the layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"Not found layer named('{target_layer_name}') specified in pattern('{pattern}').\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"Not found layer by index('{target_layer_index}') specified in pattern('{pattern}'). The index should be >= 0 and < {len(target_layer)}.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x2_0(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=2.0, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
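Every channel count in `model.py` is scaled by `scale` and then rounded with `make_divisible`, which snaps the value to the nearest multiple of `divisor` (default 8) while never falling below `min_value` or 90% of the original value. A standalone copy of that helper to illustrate the rounding at scale x2.0:

```python
def make_divisible(v, divisor=8, min_value=None):
    # copied from model.py: round `v` to the nearest multiple of `divisor`,
    # clamped to at least `min_value`, and bumped up one step if rounding
    # would lose more than 10% of the original value
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v


# at scale=2.0 the stem width 16 becomes 32, and the blocks6 output 512 becomes 1024
print(make_divisible(16 * 2.0))    # 32
print(make_divisible(512 * 2.0))   # 1024
# a non-multiple rounds down when within 10%: 35 -> 32
print(make_divisible(100 * 0.35))  # 32
```

This hardware-friendly rounding (multiples of 8) is the same trick used by MobileNet-family models so channel counts align with vectorized kernels.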
  {
    "path": "modules/image/classification/pplcnet_x2_0_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x2_0\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x2_0_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x2_0:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x2_0.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x2_0_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x2_0()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
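`classification` in `module.py` accumulates preprocessed images and flushes a batch whenever `batch_size` is reached or the last input has been consumed. That control flow can be sketched without Paddle; `run_model` below is a hypothetical stand-in for the forward pass:

```python
def batched_run(inputs, batch_size, run_model):
    # mirrors module.py: collect items into batch_data, flush when the
    # batch is full or when the final input has been appended
    results = []
    batch_data = []
    for idx, item in enumerate(inputs):
        batch_data.append(item)
        if len(batch_data) >= batch_size or idx == len(inputs) - 1:
            results.extend(run_model(list(batch_data)))
            batch_data.clear()
    return results


seen_batches = []

def run_model(batch):
    # identity "model" that records the batches it receives
    seen_batches.append(batch)
    return batch


out = batched_run(['a', 'b', 'c', 'd', 'e'], 2, run_model)
print(seen_batches)  # [['a', 'b'], ['c', 'd'], ['e']]
print(out)           # ['a', 'b', 'c', 'd', 'e']
```

The `idx == len(inputs) - 1` check is what guarantees a trailing partial batch (here `['e']`) is still run instead of being silently dropped.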
  {
    "path": "modules/image/classification/pplcnet_x2_0_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image, e.g. subtract mean and divide by std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list:  # only present when a class_id_map is available\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_0_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\")\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if not ks[0] in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_5_imagenet/README.md",
    "content": "# pplcnet_x2_5_imagenet\n\n|模型名称|pplcnet_x2_5_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|PPLCNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|34 MB|\n|最新更新日期|2022-04-02|\n|数据指标|Acc|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - PP-LCNet是百度针对Intel CPU 设备以及其加速库 MKLDNN 设计的特定骨干网络 ，比起其他的轻量级的 SOTA 模型，该骨干网络可以在不增加推理时间的情况下，进一步提升模型的性能，最终大幅度超越现有的 SOTA 模型。该模型为模型规模参数scale为x2.5下的PP-LCNet模型，关于模型结构的更多信息，可参考[论文](https://arxiv.org/pdf/2109.15099.pdf)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pplcnet_x2_5_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pplcnet_x2_5_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"pplcnet_x2_5_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 包括'class_ids'（种类索引）, 'scores'（置信度） 和 'label_names'（种类名称）\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pplcnet_x2_5_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pplcnet_x2_5_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install pplcnet_x2_5_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_5_imagenet/model.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Dict\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import KaimingNormal\nfrom paddle.regularizer import L2Decay\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, inputs):\n        return inputs\n\n\nclass TheseusLayer(nn.Layer):\n\n    def __init__(self, *args, **kwargs):\n        super(TheseusLayer, self).__init__()\n        self.res_dict = {}\n        self.res_name = self.full_name()\n        self.pruner = None\n        self.quanter = None\n\n    def _return_dict_hook(self, layer, input, output):\n        res_dict = {\"output\": output}\n        # 'list' is needed to avoid error raised by popping self.res_dict\n        for res_key in list(self.res_dict):\n            # clear the res_dict because the forward process may change according to input\n            res_dict[res_key] = self.res_dict.pop(res_key)\n        return res_dict\n\n    def init_res(self, stages_pattern, return_patterns=None, 
return_stages=None):\n        if return_patterns and return_stages:\n            msg = f\"The 'return_stages' would be ignored when 'return_patterns' is set.\"\n            return_stages = None\n\n        if return_stages is True:\n            return_patterns = stages_pattern\n        # return_stages is int or bool\n        if type(return_stages) is int:\n            return_stages = [return_stages]\n        if isinstance(return_stages, list):\n            if max(return_stages) >= len(stages_pattern) or min(return_stages) < 0:\n                msg = f\"The 'return_stages' set error. Illegal value(s) have been ignored. The stages' pattern list is {stages_pattern}.\"\n                return_stages = [val for val in return_stages if val >= 0 and val < len(stages_pattern)]\n            return_patterns = [stages_pattern[i] for i in return_stages]\n\n        if return_patterns:\n            self.update_res(return_patterns)\n\n    def replace_sub(self, *args, **kwargs) -> None:\n        msg = \"The function 'replace_sub()' is deprecated, please use 'upgrade_sublayer()' instead.\"\n        raise DeprecationWarning(msg)\n\n    def upgrade_sublayer(self, layer_name_pattern: Union[str, List[str]],\n                         handle_func: Callable[[nn.Layer, str], nn.Layer]) -> List[str]:\n        \"\"\"use 'handle_func' to modify the sub-layer(s) specified by 'layer_name_pattern'.\n\n        Args:\n            layer_name_pattern (Union[str, List[str]]): The name of layer to be modified by 'handle_func'.\n            handle_func (Callable[[nn.Layer, str], nn.Layer]): The function to modify target layer specified by 'layer_name_pattern'. The formal params are the layer(nn.Layer) and pattern(str) that is (a member of) layer_name_pattern (when layer_name_pattern is List type). 
And the return is the layer processed.\n\n        Returns:\n            List[str]: The patterns in 'layer_name_pattern' that were matched and handled successfully.\n\n        Examples:\n\n            from paddle import nn\n            import paddleclas\n\n            def rep_func(layer: nn.Layer, pattern: str):\n                new_layer = nn.Conv2D(\n                    in_channels=layer._in_channels,\n                    out_channels=layer._out_channels,\n                    kernel_size=5,\n                    padding=2\n                )\n                return new_layer\n\n            net = paddleclas.MobileNetV1()\n            res = net.upgrade_sublayer(layer_name_pattern=[\"blocks[11].depthwise_conv.conv\", \"blocks[12].depthwise_conv.conv\"], handle_func=rep_func)\n            print(res)\n            # ['blocks[11].depthwise_conv.conv', 'blocks[12].depthwise_conv.conv']\n        \"\"\"\n\n        if not isinstance(layer_name_pattern, list):\n            layer_name_pattern = [layer_name_pattern]\n\n        hit_layer_pattern_list = []\n        for pattern in layer_name_pattern:\n            # parse pattern to find target layer and its parent\n            layer_list = parse_pattern_str(pattern=pattern, parent_layer=self)\n            if not layer_list:\n                continue\n            sub_layer_parent = layer_list[-2][\"layer\"] if len(layer_list) > 1 else self\n\n            sub_layer = layer_list[-1][\"layer\"]\n            sub_layer_name = layer_list[-1][\"name\"]\n            sub_layer_index = layer_list[-1][\"index\"]\n\n            new_sub_layer = handle_func(sub_layer, pattern)\n\n            if sub_layer_index:\n                getattr(sub_layer_parent, sub_layer_name)[sub_layer_index] = new_sub_layer\n            else:\n                setattr(sub_layer_parent, sub_layer_name, new_sub_layer)\n\n            hit_layer_pattern_list.append(pattern)\n        return 
hit_layer_pattern_list\n\n    def stop_after(self, stop_layer_name: str) -> bool:\n        \"\"\"stop forward and backward after 'stop_layer_name'.\n\n        Args:\n            stop_layer_name (str): The name of layer that stop forward and backward after this layer.\n\n        Returns:\n            bool: 'True' if successful, 'False' otherwise.\n        \"\"\"\n\n        layer_list = parse_pattern_str(stop_layer_name, self)\n        if not layer_list:\n            return False\n\n        parent_layer = self\n        for layer_dict in layer_list:\n            name, index = layer_dict[\"name\"], layer_dict[\"index\"]\n            if not set_identity(parent_layer, name, index):\n                msg = f\"Failed to set the layers that after stop_layer_name('{stop_layer_name}') to IdentityLayer. The error layer's name is '{name}'.\"\n                return False\n            parent_layer = layer_dict[\"layer\"]\n\n        return True\n\n    def update_res(self, return_patterns: Union[str, List[str]]) -> Dict[str, nn.Layer]:\n        \"\"\"update the result(s) to be returned.\n\n        Args:\n            return_patterns (Union[str, List[str]]): The name of layer to return output.\n\n        Returns:\n            Dict[str, nn.Layer]: The pattern(str) and corresponding layer(nn.Layer) that have been set successfully.\n        \"\"\"\n\n        # clear res_dict that could have been set\n        self.res_dict = {}\n\n        class Handler(object):\n\n            def __init__(self, res_dict):\n                # res_dict is a reference\n                self.res_dict = res_dict\n\n            def __call__(self, layer, pattern):\n                layer.res_dict = self.res_dict\n                layer.res_name = pattern\n                if hasattr(layer, \"hook_remove_helper\"):\n                    layer.hook_remove_helper.remove()\n                layer.hook_remove_helper = layer.register_forward_post_hook(save_sub_res_hook)\n                return layer\n\n        handle_func = 
Handler(self.res_dict)\n\n        hit_layer_pattern_list = self.upgrade_sublayer(return_patterns, handle_func=handle_func)\n\n        if hasattr(self, \"hook_remove_helper\"):\n            self.hook_remove_helper.remove()\n        self.hook_remove_helper = self.register_forward_post_hook(self._return_dict_hook)\n\n        return hit_layer_pattern_list\n\n\ndef save_sub_res_hook(layer, input, output):\n    layer.res_dict[layer.res_name] = output\n\n\ndef set_identity(parent_layer: nn.Layer, layer_name: str, layer_index: str = None) -> bool:\n    \"\"\"set the layer specified by layer_name and layer_index to Identity.\n\n    Args:\n        parent_layer (nn.Layer): The parent layer of target layer specified by layer_name and layer_index.\n        layer_name (str): The name of target layer to be set to Identity.\n        layer_index (str, optional): The index of target layer to be set to Identity in parent_layer. Defaults to None.\n\n    Returns:\n        bool: True if successful, False otherwise.\n    \"\"\"\n\n    stop_after = False\n    for sub_layer_name in parent_layer._sub_layers:\n        if stop_after:\n            parent_layer._sub_layers[sub_layer_name] = Identity()\n            continue\n        if sub_layer_name == layer_name:\n            stop_after = True\n\n    if layer_index and stop_after:\n        stop_after = False\n        for sub_layer_index in parent_layer._sub_layers[layer_name]._sub_layers:\n            if stop_after:\n                parent_layer._sub_layers[layer_name][sub_layer_index] = Identity()\n                continue\n            if layer_index == sub_layer_index:\n                stop_after = True\n\n    return stop_after\n\n\ndef parse_pattern_str(pattern: str, parent_layer: nn.Layer) -> Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]:\n    \"\"\"parse the string type pattern.\n\n    Args:\n        pattern (str): The pattern describing the layer.\n        parent_layer (nn.Layer): The root layer relative to the pattern.\n\n  
  Returns:\n        Union[None, List[Dict[str, Union[nn.Layer, str, None]]]]: None if failed. If successful, the members are layers parsed in order:\n                                                                [\n                                                                    {\"layer\": first layer, \"name\": first layer's name parsed, \"index\": first layer's index parsed if exist},\n                                                                    {\"layer\": second layer, \"name\": second layer's name parsed, \"index\": second layer's index parsed if exist},\n                                                                    ...\n                                                                ]\n    \"\"\"\n\n    pattern_list = pattern.split(\".\")\n    if not pattern_list:\n        msg = f\"The pattern('{pattern}') is illegal. Please check and retry.\"\n        return None\n\n    layer_list = []\n    while len(pattern_list) > 0:\n        if '[' in pattern_list[0]:\n            target_layer_name = pattern_list[0].split('[')[0]\n            target_layer_index = pattern_list[0].split('[')[1].split(']')[0]\n        else:\n            target_layer_name = pattern_list[0]\n            target_layer_index = None\n\n        target_layer = getattr(parent_layer, target_layer_name, None)\n\n        if target_layer is None:\n            msg = f\"Not found layer named('{target_layer_name}') specified in pattern('{pattern}').\"\n            return None\n\n        if target_layer_index and target_layer:\n            if int(target_layer_index) < 0 or int(target_layer_index) >= len(target_layer):\n                msg = f\"Not found layer by index('{target_layer_index}') specified in pattern('{pattern}'). 
The index should < {len(target_layer)} and > 0.\"\n                return None\n\n            target_layer = target_layer[target_layer_index]\n\n        layer_list.append({\"layer\": target_layer, \"name\": target_layer_name, \"index\": target_layer_index})\n\n        pattern_list = pattern_list[1:]\n        parent_layer = target_layer\n    return layer_list\n\n\nMODEL_STAGES_PATTERN = {\"PPLCNet\": [\"blocks2\", \"blocks3\", \"blocks4\", \"blocks5\", \"blocks6\"]}\n\n# Each element(list) represents a depthwise block, which is composed of k, in_c, out_c, s, use_se.\n# k: kernel_size\n# in_c: input channel number in depthwise block\n# out_c: output channel number in depthwise block\n# s: stride in depthwise block\n# use_se: whether to use SE block\n\nNET_CONFIG = {\n    \"blocks2\":\n    #k, in_c, out_c, s, use_se\n    [[3, 16, 32, 1, False]],\n    \"blocks3\": [[3, 32, 64, 2, False], [3, 64, 64, 1, False]],\n    \"blocks4\": [[3, 64, 128, 2, False], [3, 128, 128, 1, False]],\n    \"blocks5\": [[3, 128, 256, 2, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False], [5, 256, 256, 1, False],\n                [5, 256, 256, 1, False], [5, 256, 256, 1, False]],\n    \"blocks6\": [[5, 256, 512, 2, True], [5, 512, 512, 1, True]]\n}\n\n\ndef make_divisible(v, divisor=8, min_value=None):\n    if min_value is None:\n        min_value = divisor\n    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n    if new_v < 0.9 * v:\n        new_v += divisor\n    return new_v\n\n\nclass ConvBNLayer(TheseusLayer):\n\n    def __init__(self, num_channels, filter_size, num_filters, stride, num_groups=1):\n        super().__init__()\n\n        self.conv = Conv2D(in_channels=num_channels,\n                           out_channels=num_filters,\n                           kernel_size=filter_size,\n                           stride=stride,\n                           padding=(filter_size - 1) // 2,\n                           groups=num_groups,\n                           
weight_attr=ParamAttr(initializer=KaimingNormal()),\n                           bias_attr=False)\n\n        self.bn = BatchNorm(num_filters,\n                            param_attr=ParamAttr(regularizer=L2Decay(0.0)),\n                            bias_attr=ParamAttr(regularizer=L2Decay(0.0)))\n        self.hardswish = nn.Hardswish()\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.hardswish(x)\n        return x\n\n\nclass DepthwiseSeparable(TheseusLayer):\n\n    def __init__(self, num_channels, num_filters, stride, dw_size=3, use_se=False):\n        super().__init__()\n        self.use_se = use_se\n        self.dw_conv = ConvBNLayer(num_channels=num_channels,\n                                   num_filters=num_channels,\n                                   filter_size=dw_size,\n                                   stride=stride,\n                                   num_groups=num_channels)\n        if use_se:\n            self.se = SEModule(num_channels)\n        self.pw_conv = ConvBNLayer(num_channels=num_channels, filter_size=1, num_filters=num_filters, stride=1)\n\n    def forward(self, x):\n        x = self.dw_conv(x)\n        if self.use_se:\n            x = self.se(x)\n        x = self.pw_conv(x)\n        return x\n\n\nclass SEModule(TheseusLayer):\n\n    def __init__(self, channel, reduction=4):\n        super().__init__()\n        self.avg_pool = AdaptiveAvgPool2D(1)\n        self.conv1 = Conv2D(in_channels=channel, out_channels=channel // reduction, kernel_size=1, stride=1, padding=0)\n        self.relu = nn.ReLU()\n        self.conv2 = Conv2D(in_channels=channel // reduction, out_channels=channel, kernel_size=1, stride=1, padding=0)\n        self.hardsigmoid = nn.Hardsigmoid()\n\n    def forward(self, x):\n        identity = x\n        x = self.avg_pool(x)\n        x = self.conv1(x)\n        x = self.relu(x)\n        x = self.conv2(x)\n        x = self.hardsigmoid(x)\n        x = paddle.multiply(x=identity, 
y=x)\n        return x\n\n\nclass PPLCNet(TheseusLayer):\n\n    def __init__(self,\n                 stages_pattern,\n                 scale=1.0,\n                 class_num=1000,\n                 dropout_prob=0.2,\n                 class_expand=1280,\n                 return_patterns=None,\n                 return_stages=None):\n        super().__init__()\n        self.scale = scale\n        self.class_expand = class_expand\n\n        self.conv1 = ConvBNLayer(num_channels=3, filter_size=3, num_filters=make_divisible(16 * scale), stride=2)\n\n        self.blocks2 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks2\"])\n        ])\n\n        self.blocks3 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks3\"])\n        ])\n\n        self.blocks4 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks4\"])\n        ])\n\n        self.blocks5 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n          
                     stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks5\"])\n        ])\n\n        self.blocks6 = nn.Sequential(*[\n            DepthwiseSeparable(num_channels=make_divisible(in_c * scale),\n                               num_filters=make_divisible(out_c * scale),\n                               dw_size=k,\n                               stride=s,\n                               use_se=se) for i, (k, in_c, out_c, s, se) in enumerate(NET_CONFIG[\"blocks6\"])\n        ])\n\n        self.avg_pool = AdaptiveAvgPool2D(1)\n\n        self.last_conv = Conv2D(in_channels=make_divisible(NET_CONFIG[\"blocks6\"][-1][2] * scale),\n                                out_channels=self.class_expand,\n                                kernel_size=1,\n                                stride=1,\n                                padding=0,\n                                bias_attr=False)\n\n        self.hardswish = nn.Hardswish()\n        self.dropout = Dropout(p=dropout_prob, mode=\"downscale_in_infer\")\n        self.flatten = nn.Flatten(start_axis=1, stop_axis=-1)\n\n        self.fc = Linear(self.class_expand, class_num)\n\n        super().init_res(stages_pattern, return_patterns=return_patterns, return_stages=return_stages)\n\n    def forward(self, x):\n        x = self.conv1(x)\n\n        x = self.blocks2(x)\n        x = self.blocks3(x)\n        x = self.blocks4(x)\n        x = self.blocks5(x)\n        x = self.blocks6(x)\n\n        x = self.avg_pool(x)\n        x = self.last_conv(x)\n        x = self.hardswish(x)\n        x = self.dropout(x)\n        x = self.flatten(x)\n        x = self.fc(x)\n        return x\n\n\ndef PPLCNet_x2_5(pretrained=False, use_ssld=False, **kwargs):\n    model = PPLCNet(scale=2.5, stages_pattern=MODEL_STAGES_PATTERN[\"PPLCNet\"], **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_5_imagenet/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .model import PPLCNet_x2_5\nfrom .processor import base64_to_cv2\nfrom .processor import create_operators\nfrom .processor import Topk\nfrom .utils import get_config\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pplcnet_x2_5_imagenet\",\n            type=\"cv/classification\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass PPLcNet_x2_5:\n\n    def __init__(self):\n        self.config = get_config(os.path.join(self.directory, 'PPLCNet_x2_5.yaml'), show=False)\n        self.label_path = os.path.join(self.directory, 'imagenet1k_label_list.txt')\n        self.pretrain_path = os.path.join(self.directory, 'PPLCNet_x2_5_pretrained.pdparams')\n        self.config['Infer']['PostProcess']['class_id_map_file'] = self.label_path\n        self.model = PPLCNet_x2_5()\n        param_state_dict = paddle.load(self.pretrain_path)\n        self.model.set_dict(param_state_dict)\n        self.preprocess_funcs = 
create_operators(self.config[\"Infer\"][\"transforms\"])\n\n    def classification(self,\n                       images: list = None,\n                       paths: list = None,\n                       batch_size: int = 1,\n                       use_gpu: bool = False,\n                       top_k: int = 1):\n        '''\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results, each result dict contains key 'class_ids', 'scores' and 'label_names'.\n        '''\n        postprocess_func = Topk(top_k, self.label_path)\n        inputs = []\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. 
Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                inputs.append(image)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                inputs.append(image)\n\n        batch_data = []\n        for idx, imagedata in enumerate(inputs):\n            for process in self.preprocess_funcs:\n                imagedata = process(imagedata)\n            batch_data.append(imagedata)\n            if len(batch_data) >= batch_size or idx == len(inputs) - 1:\n                batch_tensor = paddle.to_tensor(batch_data)\n                out = self.model(batch_tensor)\n                if isinstance(out, list):\n                    out = out[0]\n                if isinstance(out, dict) and \"logits\" in out:\n                    out = out[\"logits\"]\n                if isinstance(out, dict) and \"output\" in out:\n                    out = out[\"output\"]\n                result = postprocess_func(out)\n                results.extend(result)\n                batch_data.clear()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[self.args.input_path],\n                                      use_gpu=self.args.use_gpu,\n                                      batch_size=self.args.batch_size,\n                                      top_k=self.args.top_k)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size')\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help='Return top k results.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_5_imagenet/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport base64\nimport inspect\nimport math\nimport os\nimport random\nimport sys\nfrom functools import partial\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nimport six\nfrom paddle.vision.transforms import ColorJitter as RawColorJitter\nfrom PIL import Image\n\n\ndef create_operators(params, class_num=None):\n    \"\"\"\n    create operators based on the config\n\n    Args:\n        params(list): a dict list, used to create some operators\n    \"\"\"\n    assert isinstance(params, list), ('operator config should be a list')\n    ops = []\n    current_module = sys.modules[__name__]\n    for operator in params:\n        assert isinstance(operator, dict) and len(operator) == 1, \"yaml format error\"\n        op_name = list(operator)[0]\n        param = {} if operator[op_name] is None else operator[op_name]\n        op_func = getattr(current_module, op_name)\n        if \"class_num\" in inspect.getfullargspec(op_func).args:\n            param.update({\"class_num\": class_num})\n        op = op_func(**param)\n        ops.append(op)\n\n    return ops\n\n\nclass UnifiedResize(object):\n\n    def __init__(self, interpolation=None, backend=\"cv2\"):\n    
    _cv2_interp_from_str = {\n            'nearest': cv2.INTER_NEAREST,\n            'bilinear': cv2.INTER_LINEAR,\n            'area': cv2.INTER_AREA,\n            'bicubic': cv2.INTER_CUBIC,\n            'lanczos': cv2.INTER_LANCZOS4\n        }\n        _pil_interp_from_str = {\n            'nearest': Image.NEAREST,\n            'bilinear': Image.BILINEAR,\n            'bicubic': Image.BICUBIC,\n            'box': Image.BOX,\n            'lanczos': Image.LANCZOS,\n            'hamming': Image.HAMMING\n        }\n\n        def _pil_resize(src, size, resample):\n            pil_img = Image.fromarray(src)\n            pil_img = pil_img.resize(size, resample)\n            return np.asarray(pil_img)\n\n        if backend.lower() == \"cv2\":\n            if isinstance(interpolation, str):\n                interpolation = _cv2_interp_from_str[interpolation.lower()]\n            # compatible with opencv < version 4.4.0\n            elif interpolation is None:\n                interpolation = cv2.INTER_LINEAR\n            self.resize_func = partial(cv2.resize, interpolation=interpolation)\n        elif backend.lower() == \"pil\":\n            if isinstance(interpolation, str):\n                interpolation = _pil_interp_from_str[interpolation.lower()]\n            self.resize_func = partial(_pil_resize, resample=interpolation)\n        else:\n            self.resize_func = cv2.resize\n\n    def __call__(self, src, size):\n        return self.resize_func(src, size)\n\n\nclass OperatorParamError(ValueError):\n    \"\"\" OperatorParamError\n    \"\"\"\n    pass\n\n\nclass DecodeImage(object):\n    \"\"\" decode image \"\"\"\n\n    def __init__(self, to_rgb=True, to_np=False, channel_first=False):\n        self.to_rgb = to_rgb\n        self.to_np = to_np  # to numpy\n        self.channel_first = channel_first  # only enabled when to_np is True\n\n    def __call__(self, img):\n        if six.PY2:\n            assert type(img) is str and len(img) > 0, \"invalid input 'img' in 
DecodeImage\"\n        else:\n            assert type(img) is bytes and len(img) > 0, \"invalid input 'img' in DecodeImage\"\n        data = np.frombuffer(img, dtype='uint8')\n        img = cv2.imdecode(data, 1)\n        if self.to_rgb:\n            assert img.shape[2] == 3, 'invalid shape of image[%s]' % (img.shape)\n            img = img[:, :, ::-1]\n\n        if self.channel_first:\n            img = img.transpose((2, 0, 1))\n\n        return img\n\n\nclass ResizeImage(object):\n    \"\"\" resize image \"\"\"\n\n    def __init__(self, size=None, resize_short=None, interpolation=None, backend=\"cv2\"):\n        if resize_short is not None and resize_short > 0:\n            self.resize_short = resize_short\n            self.w = None\n            self.h = None\n        elif size is not None:\n            self.resize_short = None\n            self.w = size if type(size) is int else size[0]\n            self.h = size if type(size) is int else size[1]\n        else:\n            raise OperatorParamError(\"invalid params for ResizeImage: both 'size' and 'resize_short' are None\")\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        img_h, img_w = img.shape[:2]\n        if self.resize_short is not None:\n            percent = float(self.resize_short) / min(img_w, img_h)\n            w = int(round(img_w * percent))\n            h = int(round(img_h * percent))\n        else:\n            w = self.w\n            h = self.h\n        return self._resize_func(img, (w, h))\n\n\nclass CropImage(object):\n    \"\"\" crop image \"\"\"\n\n    def __init__(self, size):\n        if type(size) is int:\n            self.size = (size, size)\n        else:\n            self.size = size  # (h, w)\n\n    def __call__(self, img):\n        w, h = self.size\n        img_h, img_w = img.shape[:2]\n        w_start = (img_w - w) // 2\n        h_start = (img_h - h) // 2\n\n        w_end = 
w_start + w\n        h_end = h_start + h\n        return img[h_start:h_end, w_start:w_end, :]\n\n\nclass RandCropImage(object):\n    \"\"\" random crop image \"\"\"\n\n    def __init__(self, size, scale=None, ratio=None, interpolation=None, backend=\"cv2\"):\n        if type(size) is int:\n            self.size = (size, size)  # (h, w)\n        else:\n            self.size = size\n\n        self.scale = [0.08, 1.0] if scale is None else scale\n        self.ratio = [3. / 4., 4. / 3.] if ratio is None else ratio\n\n        self._resize_func = UnifiedResize(interpolation=interpolation, backend=backend)\n\n    def __call__(self, img):\n        size = self.size\n        scale = self.scale\n        ratio = self.ratio\n\n        aspect_ratio = math.sqrt(random.uniform(*ratio))\n        w = 1. * aspect_ratio\n        h = 1. / aspect_ratio\n\n        img_h, img_w = img.shape[:2]\n\n        bound = min((float(img_w) / img_h) / (w**2), (float(img_h) / img_w) / (h**2))\n        scale_max = min(scale[1], bound)\n        scale_min = min(scale[0], bound)\n\n        target_area = img_w * img_h * random.uniform(scale_min, scale_max)\n        target_size = math.sqrt(target_area)\n        w = int(target_size * w)\n        h = int(target_size * h)\n\n        i = random.randint(0, img_w - w)\n        j = random.randint(0, img_h - h)\n\n        img = img[j:j + h, i:i + w, :]\n\n        return self._resize_func(img, size)\n\n\nclass RandFlipImage(object):\n    \"\"\" random flip image\n        flip_code:\n            1: Flipped Horizontally\n            0: Flipped Vertically\n            -1: Flipped Horizontally & Vertically\n    \"\"\"\n\n    def __init__(self, flip_code=1):\n        assert flip_code in [-1, 0, 1], \"flip_code should be a value in [-1, 0, 1]\"\n        self.flip_code = flip_code\n\n    def __call__(self, img):\n        if random.randint(0, 1) == 1:\n            return cv2.flip(img, self.flip_code)\n        else:\n            return img\n\n\nclass 
NormalizeImage(object):\n    \"\"\" normalize image such as substract mean, divide std\n    \"\"\"\n\n    def __init__(self, scale=None, mean=None, std=None, order='chw', output_fp16=False, channel_num=3):\n        if isinstance(scale, str):\n            scale = eval(scale)\n        assert channel_num in [3, 4], \"channel number of input image should be set to 3 or 4.\"\n        self.channel_num = channel_num\n        self.output_dtype = 'float16' if output_fp16 else 'float32'\n        self.scale = np.float32(scale if scale is not None else 1.0 / 255.0)\n        self.order = order\n        mean = mean if mean is not None else [0.485, 0.456, 0.406]\n        std = std if std is not None else [0.229, 0.224, 0.225]\n\n        shape = (3, 1, 1) if self.order == 'chw' else (1, 1, 3)\n        self.mean = np.array(mean).reshape(shape).astype('float32')\n        self.std = np.array(std).reshape(shape).astype('float32')\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        assert isinstance(img, np.ndarray), \"invalid input 'img' in NormalizeImage\"\n\n        img = (img.astype('float32') * self.scale - self.mean) / self.std\n\n        if self.channel_num == 4:\n            img_h = img.shape[1] if self.order == 'chw' else img.shape[0]\n            img_w = img.shape[2] if self.order == 'chw' else img.shape[1]\n            pad_zeros = np.zeros((1, img_h, img_w)) if self.order == 'chw' else np.zeros((img_h, img_w, 1))\n            img = (np.concatenate((img, pad_zeros), axis=0) if self.order == 'chw' else np.concatenate(\n                (img, pad_zeros), axis=2))\n        return img.astype(self.output_dtype)\n\n\nclass ToCHWImage(object):\n    \"\"\" convert hwc image to chw image\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, img):\n        from PIL import Image\n        if isinstance(img, Image.Image):\n            img = np.array(img)\n\n        
return img.transpose((2, 0, 1))\n\n\nclass ColorJitter(RawColorJitter):\n    \"\"\"ColorJitter.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n    def __call__(self, img):\n        if not isinstance(img, Image.Image):\n            img = np.ascontiguousarray(img)\n            img = Image.fromarray(img)\n        img = super()._apply_image(img)\n        if isinstance(img, Image.Image):\n            img = np.asarray(img)\n        return img\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\nclass Topk(object):\n\n    def __init__(self, topk=1, class_id_map_file=None):\n        assert isinstance(topk, (int, ))\n        self.class_id_map = self.parse_class_id_map(class_id_map_file)\n        self.topk = topk\n\n    def parse_class_id_map(self, class_id_map_file):\n        if class_id_map_file is None:\n            return None\n        if not os.path.exists(class_id_map_file):\n            print(\n                \"Warning: If you want to use your own label_dict, please provide a valid path!\\nOtherwise label_names will be empty!\"\n            )\n            return None\n\n        try:\n            class_id_map = {}\n            with open(class_id_map_file, \"r\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    partition = line.split(\"\\n\")[0].partition(\" \")\n                    class_id_map[int(partition[0])] = str(partition[-1])\n        except Exception as ex:\n            print(ex)\n            class_id_map = None\n        return class_id_map\n\n    def __call__(self, x, file_names=None, multilabel=False):\n        assert isinstance(x, paddle.Tensor)\n        if file_names is not None:\n            assert x.shape[0] == len(file_names)\n        x = F.softmax(x, axis=-1) if not multilabel else F.sigmoid(x)\n        x = 
x.numpy()\n        y = []\n        for idx, probs in enumerate(x):\n            index = probs.argsort(axis=0)[-self.topk:][::-1].astype(\"int32\") if not multilabel else np.where(\n                probs >= 0.5)[0].astype(\"int32\")\n            clas_id_list = []\n            score_list = []\n            label_name_list = []\n            for i in index:\n                clas_id_list.append(i.item())\n                score_list.append(probs[i].item())\n                if self.class_id_map is not None:\n                    label_name_list.append(self.class_id_map[i.item()])\n            result = {\n                \"class_ids\": clas_id_list,\n                \"scores\": np.around(score_list, decimals=5).tolist(),\n            }\n            if file_names is not None:\n                result[\"file_name\"] = file_names[idx]\n            if label_name_list is not None:\n                result[\"label_names\"] = label_name_list\n            y.append(result)\n        return y\n"
  },
  {
    "path": "modules/image/classification/pplcnet_x2_5_imagenet/utils.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport yaml\n\n__all__ = ['get_config']\n\n\nclass AttrDict(dict):\n\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\n    def __deepcopy__(self, content):\n        return copy.deepcopy(dict(self))\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.SafeLoader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef override(dl, ks, v):\n    \"\"\"\n    Recursively replace dict of list\n    Args:\n        dl(dict or list): dict or list to be replaced\n        ks(list): list of keys\n   
     v(str): value to be replaced\n    \"\"\"\n\n    def str2num(v):\n        try:\n            return eval(v)\n        except Exception:\n            return v\n\n    assert isinstance(dl, (list, dict)), (\"{} should be a list or a dict\")\n    assert len(ks) > 0, ('length of keys should be larger than 0')\n    if isinstance(dl, list):\n        k = str2num(ks[0])\n        if len(ks) == 1:\n            assert k < len(dl), ('index({}) out of range({})'.format(k, dl))\n            dl[k] = str2num(v)\n        else:\n            override(dl[k], ks[1:], v)\n    else:\n        if len(ks) == 1:\n            # assert ks[0] in dl, ('{} does not exist in {}'.format(ks[0], dl))\n            if ks[0] not in dl:\n                print('A new field ({}) detected!'.format(ks[0]))\n            dl[ks[0]] = str2num(v)\n        else:\n            override(dl[ks[0]], ks[1:], v)\n\n\ndef override_config(config, options=None):\n    \"\"\"\n    Recursively override the config\n    Args:\n        config(dict): dict to be replaced\n        options(list): list of pairs(key0.key1.idx.key2=value)\n            such as: [\n                'topk=2',\n                'VALID.transforms.1.ResizeImage.resize_short=300'\n            ]\n    Returns:\n        config(dict): replaced config\n    \"\"\"\n    if options is not None:\n        for opt in options:\n            assert isinstance(opt, str), (\"option({}) should be a str\".format(opt))\n            assert \"=\" in opt, (\"option({}) should contain a = \"\n                                \"to distinguish between key and value\".format(opt))\n            pair = opt.split('=')\n            assert len(pair) == 2, (\"there can be only one = in the option\")\n            key, value = pair\n            keys = key.split('.')\n            override(config, keys, value)\n    return config\n\n\ndef get_config(fname, overrides=None, show=False):\n    \"\"\"\n    Read config from file\n    \"\"\"\n    assert os.path.exists(fname), ('config file({}) does not 
exist'.format(fname))\n    config = parse_config(fname)\n    override_config(config, overrides)\n    return config\n"
  },
  {
    "path": "modules/image/classification/repvgg_a0_imagenet/README.md",
    "content": "# repvgg_a0_imagenet\n\n|模型名称|repvgg_a0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|53MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单而强大的卷积神经网络架构。其推理时的主体结构类似 VGG，仅由 3x3 卷积和 ReLU 堆叠而成，而训练时的模型具有多分支拓扑。训练结构与推理结构的解耦通过结构重参数化（re-parameterization）技术实现，因此该模型被称为 RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_a0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_a0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_a0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_a0_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets 
import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_a0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: workers的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_a0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_a0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_a0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_a0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_a0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_a0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_a0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_A0(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_A0, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [2, 4, 14, 1]\n        width_multiplier = [0.75, 0.75, 0.75, 2.5]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load 
pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
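The re-parameterization in `_fuse_bn_tensor` above rests on one identity: a BatchNorm's inference-time affine transform can be folded into the preceding convolution's weights as `kernel * (gamma / std)` with bias `beta - mean * gamma / std`, where `std = sqrt(var + eps)`. A minimal numeric sketch of that identity, simplified to a 1x1 convolution on a single pixel (which reduces to a matrix multiply); all the names here are illustrative, not part of the module's API:

```python
import numpy as np

# Numeric check of the BN-folding formula used by _fuse_bn_tensor:
# fused_kernel = kernel * (gamma / std), fused_bias = beta - mean * gamma / std.
rng = np.random.default_rng(0)
out_c, in_c = 4, 3
kernel = rng.standard_normal((out_c, in_c))   # 1x1 conv on one pixel == matmul
x = rng.standard_normal((in_c,))
mean = rng.standard_normal((out_c,))          # BN running mean
var = rng.random((out_c,)) + 0.5              # BN running variance
gamma = rng.standard_normal((out_c,))         # BN scale
beta = rng.standard_normal((out_c,))          # BN shift
eps = 1e-5

std = np.sqrt(var + eps)
# Reference path: convolution followed by inference-mode batch norm.
y_ref = gamma * (kernel @ x - mean) / std + beta
# Fused path: a single convolution with re-parameterized weight and bias.
fused_kernel = kernel * (gamma / std)[:, None]
fused_bias = beta - mean * gamma / std
y_fused = fused_kernel @ x + fused_bias
print(np.allclose(y_ref, y_fused))  # True
```

The same scaling is what `get_equivalent_kernel_bias` applies per branch before summing the 3x3, padded 1x1, and identity kernels into the single `rbr_reparam` convolution.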
  {
    "path": "modules/image/classification/repvgg_a1_imagenet/README.md",
    "content": "# repvgg_a1_imagenet\n\n|模型名称|repvgg_a1_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|82MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁桂光团队）、旷视科技（孙建等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。有一个类似于 VGG 的推理时间代理。主体由3x3卷积和relu stack组成，而训练时间模型具有多分支拓扑。训练时间和推理时间的解耦是通过重新参数化技术实现的，因此该模型被称为repvgg。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_a1_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_a1_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_a1_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_a1_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets 
import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: the data preprocessing method.\n                * `mode`: the dataset split to use; options are `train`, `test` and `val`, defaulting to `train`.\n\n                * The dataset preparation code can be found in [flowers.py](../../paddlehub/datasets/flowers.py). `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset` under the user directory.\n\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              model = hub.Module(name=\"repvgg_a1_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: the name of the pre-trained model.\n                * `label_list`: the output classification labels, defaulting to the ImageNet-2012 labels.\n\n        - Step4: Choose the optimization strategy and run configuration\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Run configuration\n\n            - `Trainer` mainly controls the Fine-tune training and takes the following configurable parameters:\n\n                * `model`: the model to be optimized;\n                * `optimizer`: the optimizer to use;\n                * `use_vdl`: whether to visualize the training process with VisualDL;\n                * `checkpoint_dir`: the directory in which to save model parameters;\n                * `compare_metrics`: the metric used to select the best model;\n\n            - `trainer.train` mainly controls the training loop itself and takes the following configurable parameters:\n\n                * `train_dataset`: the dataset used for training;\n                * `epochs`: the number of training epochs;\n                * `batch_size`: the training batch size; when using a GPU, adjust it to fit the actual hardware;\n                * `num_workers`: the number of data-loading workers, defaulting to 0;\n                * `eval_dataset`: the validation dataset;\n                * `log_interval`: the logging interval, measured in training steps.\n                * `save_interval`: the checkpoint-saving interval, measured in training epochs.\n\n    - Model prediction\n\n        -   After Fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for Fine-tuning. 
We use this model to make predictions. The predict.py script is as follows:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_a1_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** When predicting, the selected module, checkpoint_dir and dataset must be the same as those used for Fine-tuning.\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the start command:\n\n    - ```shell\n      $ hub serving start -m repvgg_a1_imagenet\n      ```\n\n    - This deploys the classification service API, listening on port 8866 by default.\n\n    - **NOTE:** To predict on a GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With the server configured, the following few lines of code send a prediction request and fetch the result\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # Send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_a1_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Update History\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/classification/repvgg_a1_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_a1_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_a1_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_a1_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_A1(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_A1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [2, 4, 14, 1]\n        width_multiplier = [1, 1, 1, 2.5]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained 
checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_a2_imagenet/README.md",
    "content": "# repvgg_a2_imagenet\n\n|模型名称|repvgg_a2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|163MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。该模型在推理时具有类似 VGG 的简单结构，主体仅由 3x3 卷积和 ReLU 堆叠而成，而训练时的模型则具有多分支拓扑结构。训练结构与推理结构的解耦通过结构重参数化技术实现，因此该模型被称为 RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_a2_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_a2_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_a2_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_a2_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_a2_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: worker的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_a2_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_a2_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_a2_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_a2_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_a2_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_a2_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_a2_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_A2(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_A2, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # use a context manager so the label file is closed after reading\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [2, 4, 14, 1]\n        width_multiplier = [1.5, 1.5, 1.5, 2.75]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b0_imagenet/README.md",
    "content": "# repvgg_b0_imagenet\n\n|模型名称|repvgg_b0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|92MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。该模型在推理阶段采用类似 VGG 的单路结构，主体由3x3卷积和ReLU堆叠而成，而训练阶段的模型具有多分支拓扑。训练结构与推理结构的解耦通过结构重参数化（re-parameterization）技术实现，该模型也因此得名RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b0_imagenet对[Flowers](../../../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            import paddle\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: worker的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_B0(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B0, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # use a context manager so the label file is closed after reading\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [1, 1, 1, 2.5]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1_imagenet/README.md",
    "content": "# repvgg_b1_imagenet\n\n|模型名称|repvgg_b1_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|332MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。该模型推理时采用类似 VGG 的单路结构，主体仅由 3x3 卷积和 ReLU 堆叠而成，而训练时模型具有多分支拓扑。训练结构与推理结构的解耦通过结构重参数化（re-parameterization）技术实现，因此该模型被称为 RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b1_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b1_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b1_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b1_imagenet对[Flowers](../../../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets 
import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b1_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的 worker 数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b1_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b1_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b1_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b1_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b1_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_B1(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [2, 2, 2, 4]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print("load custom checkpoint success")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print("load pretrained 
checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g2_imagenet/README.md",
    "content": "# repvgg_b1g2_imagenet\n\n|模型名称|repvgg_b1g2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|264MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。该模型推理时采用类似 VGG 的单路结构，主体仅由 3x3 卷积和 ReLU 堆叠而成，而训练时模型具有多分支拓扑。训练结构与推理结构的解耦通过结构重参数化（re-parameterization）技术实现，因此该模型被称为 RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b1g2_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b1g2_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b1g2_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b1g2_imagenet对[Flowers](../../../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from 
paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b1g2_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的 worker 数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   
当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b1g2_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b1g2_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b1g2_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g2_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g2_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b1g2_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b1g2_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_B1G2(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B1G2, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [2, 2, 2, 4]\n        optional_groupwise_layers = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]\n        self.override_groups_map = {l: 2 for l in optional_groupwise_layers}\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            
self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g4_imagenet/README.md",
    "content": "# repvgg_b1g4_imagenet\n\n|模型名称|repvgg_b1g4_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|231MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁桂光团队）、旷视科技（孙建等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。有一个类似于 VGG 的推理时间代理。主体由3x3卷积和relu stack组成，而训练时间模型具有多分支拓扑。训练时间和推理时间的解耦是通过重新参数化技术实现的，因此该模型被称为repvgg。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b1g4_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b1g4_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b1g4_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b1g4_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from 
paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b1g4_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   
当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b1g4_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b1g4_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b1g4_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g4_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b1g4_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b1g4_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b1g4_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_B1G4(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B1G4, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [2, 2, 2, 4]\n        optional_groupwise_layers = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]\n        self.override_groups_map = {l: 4 for l in optional_groupwise_layers}\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            
self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2_imagenet/README.md",
    "content": "# repvgg_b2_imagenet\n\n|模型名称|repvgg_b2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|514MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁贵广团队）、旷视科技（孙剑等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单而强大的卷积神经网络架构。该模型在推理时具有类似VGG的单路主体结构，仅由3x3卷积和ReLU堆叠而成，而训练时的模型则采用多分支拓扑结构。训练结构与推理结构的解耦通过结构重参数化（re-parameterization）技术实现，模型因此得名RepVGG。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b2_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b2_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b2_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b2_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets 
import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b2_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的进程数量（workers），默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b2_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b2_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b2_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b2_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b2_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
meta=ImageClassifierModule)\nclass RepVGG_B2(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B2, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [2.5, 2.5, 2.5, 5]\n        self.override_groups_map = dict()\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained 
checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2g4_imagenet/README.md",
    "content": "# repvgg_b2g4_imagenet\n\n|模型名称|repvgg_b2g4_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|357MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁桂光团队）、旷视科技（孙建等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。有一个类似于 VGG 的推理时间代理。主体由3x3卷积和relu stack组成，而训练时间模型具有多分支拓扑。训练时间和推理时间的解耦是通过重新参数化技术实现的，因此该模型被称为repvgg。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b2g4_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b2g4_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b2g4_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b2g4_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from 
paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b2g4_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   
当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b2g4_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b2g4_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b2g4_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2g4_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/repvgg_b2g4_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b2g4_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b2g4_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
 meta=ImageClassifierModule)\nclass RepVGG_B2G4(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B2G4, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # Read one label per line; the context manager ensures the file is closed.\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [2.5, 2.5, 2.5, 5]\n        optional_groupwise_layers = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]\n        self.override_groups_map = {l: 4 for l in optional_groupwise_layers}\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
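The RepVGG `module.py` files above train a multi-branch block but swap in a single fused `rbr_reparam` convolution at `eval()` time. The swap is valid because convolution is linear in its kernel: the 3x3 branch, the 1x1 branch zero-padded to 3x3, and the identity branch sum to a single 3x3 convolution. A minimal single-channel, stride-1 numpy check of that identity (the naive `conv2d_3x3` helper is illustrative, not part of the module):

```python
import numpy as np

def conv2d_3x3(x, k):
    """Naive single-channel 3x3 cross-correlation with zero padding 1, stride 1."""
    h, w = x.shape
    xp = np.pad(x, 1)
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(xp[i:i + 3, j:j + 3] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5))
k3 = rng.standard_normal((3, 3))   # stands in for the rbr_dense kernel
k1 = rng.standard_normal((1, 1))   # stands in for the rbr_1x1 kernel

k1_pad = np.pad(k1, 1)             # 1x1 kernel placed at the 3x3 center (cf. _pad_1x1_to_3x3_tensor)
identity = np.zeros((3, 3))
identity[1, 1] = 1.0               # identity branch written as a 3x3 kernel (cf. id_tensor)

branches = conv2d_3x3(x, k3) + conv2d_3x3(x, k1_pad) + x
fused = conv2d_3x3(x, k3 + k1_pad + identity)
assert np.allclose(branches, fused)
```

The module's `get_equivalent_kernel_bias` applies the same sum after first folding each branch's BatchNorm into its kernel and bias.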
  {
    "path": "modules/image/classification/repvgg_b3g4_imagenet/README.md",
    "content": "# repvgg_b3g4_imagenet\n\n|模型名称|repvgg_b3g4_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|RepVGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|485MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - RepVGG（Making VGG-style ConvNets Great Again）系列模型是清华大学（丁桂光团队）、旷视科技（孙建等）、香港科技大学和阿伯里斯特威斯大学于2021年提出的一种简单但功能强大的卷积神经网络架构。有一个类似于 VGG 的推理时间代理。主体由3x3卷积和relu stack组成，而训练时间模型具有多分支拓扑。训练时间和推理时间的解耦是通过重新参数化技术实现的，因此该模型被称为repvgg。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install repvgg_b3g4_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run repvgg_b3g4_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='repvgg_b3g4_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用repvgg_b3g4_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from 
paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"repvgg_b3g4_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   
当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='repvgg_b3g4_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m repvgg_b3g4_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/repvgg_b3g4_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/repvgg_b3g4_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
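In the `module.py` files of these RepVGG modules, `_make_stage` numbers blocks globally (`cur_layer_idx` starts at 1, with stage0 being block 0), and `override_groups_map` switches every even-indexed block from 2 through 26 to grouped convolution with groups=4. A standalone sketch of that bookkeeping for the B2g4/B3g4 layout:

```python
# Reproduce the per-block group assignment of the RepVGG-B*g4 modules.
num_blocks = [4, 6, 16, 1]                     # blocks per stage (stages 1-4)
optional_groupwise_layers = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]
override_groups_map = {l: 4 for l in optional_groupwise_layers}

cur_layer_idx = 1                              # stage0 is block 0; counting starts at 1
groups_per_block = []
for n in num_blocks:
    for _ in range(n):
        groups_per_block.append(override_groups_map.get(cur_layer_idx, 1))
        cur_layer_idx += 1

# 27 blocks in total; groups alternate 1, 4, 1, 4, ... across stage boundaries
assert len(groups_per_block) == sum(num_blocks) == 27
assert groups_per_block.count(4) == 13
```

This is why `assert 0 not in self.override_groups_map` holds: the group override never applies to stage0.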
  {
    "path": "modules/image/classification/repvgg_b3g4_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, stride, padding, groups=1):\n        super(ConvBN, self).__init__()\n        self.conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups,\n            bias_attr=False)\n        self.bn = nn.BatchNorm2D(num_features=out_channels)\n\n    def forward(self, x):\n        y = self.conv(x)\n        y = self.bn(y)\n        return y\n\n\nclass RepVGGBlock(nn.Layer):\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 padding_mode='zeros'):\n        super(RepVGGBlock, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = kernel_size\n        self.stride = stride\n        
self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.padding_mode = padding_mode\n\n        assert kernel_size == 3\n        assert padding == 1\n\n        padding_11 = padding - kernel_size // 2\n\n        self.nonlinearity = nn.ReLU()\n\n        self.rbr_identity = nn.BatchNorm2D(\n            num_features=in_channels) if out_channels == in_channels and stride == 1 else None\n        self.rbr_dense = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=padding,\n            groups=groups)\n        self.rbr_1x1 = ConvBN(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=padding_11,\n            groups=groups)\n\n    def forward(self, inputs):\n        if not self.training:\n            return self.nonlinearity(self.rbr_reparam(inputs))\n\n        if self.rbr_identity is None:\n            id_out = 0\n        else:\n            id_out = self.rbr_identity(inputs)\n        return self.nonlinearity(self.rbr_dense(inputs) + self.rbr_1x1(inputs) + id_out)\n\n    def eval(self):\n        if not hasattr(self, 'rbr_reparam'):\n            self.rbr_reparam = nn.Conv2D(\n                in_channels=self.in_channels,\n                out_channels=self.out_channels,\n                kernel_size=self.kernel_size,\n                stride=self.stride,\n                padding=self.padding,\n                dilation=self.dilation,\n                groups=self.groups,\n                padding_mode=self.padding_mode)\n        self.training = False\n        kernel, bias = self.get_equivalent_kernel_bias()\n        self.rbr_reparam.weight.set_value(kernel)\n        self.rbr_reparam.bias.set_value(bias)\n        for layer in self.sublayers():\n            layer.eval()\n\n    def get_equivalent_kernel_bias(self):\n     
   kernel3x3, bias3x3 = self._fuse_bn_tensor(self.rbr_dense)\n        kernel1x1, bias1x1 = self._fuse_bn_tensor(self.rbr_1x1)\n        kernelid, biasid = self._fuse_bn_tensor(self.rbr_identity)\n        return kernel3x3 + self._pad_1x1_to_3x3_tensor(kernel1x1) + kernelid, bias3x3 + bias1x1 + biasid\n\n    def _pad_1x1_to_3x3_tensor(self, kernel1x1):\n        if kernel1x1 is None:\n            return 0\n        else:\n            return nn.functional.pad(kernel1x1, [1, 1, 1, 1])\n\n    def _fuse_bn_tensor(self, branch):\n        if branch is None:\n            return 0, 0\n        if isinstance(branch, ConvBN):\n            kernel = branch.conv.weight\n            running_mean = branch.bn._mean\n            running_var = branch.bn._variance\n            gamma = branch.bn.weight\n            beta = branch.bn.bias\n            eps = branch.bn._epsilon\n        else:\n            assert isinstance(branch, nn.BatchNorm2D)\n            if not hasattr(self, 'id_tensor'):\n                input_dim = self.in_channels // self.groups\n                kernel_value = np.zeros((self.in_channels, input_dim, 3, 3), dtype=np.float32)\n                for i in range(self.in_channels):\n                    kernel_value[i, i % input_dim, 1, 1] = 1\n                self.id_tensor = paddle.to_tensor(kernel_value)\n            kernel = self.id_tensor\n            running_mean = branch._mean\n            running_var = branch._variance\n            gamma = branch.weight\n            beta = branch.bias\n            eps = branch._epsilon\n        std = (running_var + eps).sqrt()\n        t = (gamma / std).reshape((-1, 1, 1, 1))\n        return kernel * t, beta - running_mean * gamma / std\n\n\n@moduleinfo(\n    name=\"repvgg_b3g4_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"repvgg_b3g4_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    
 meta=ImageClassifierModule)\nclass RepVGG_B3G4(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(RepVGG_B3G4, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # Read one label per line; the context manager ensures the file is closed.\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        num_blocks = [4, 6, 16, 1]\n        width_multiplier = [3, 3, 3, 5]\n        optional_groupwise_layers = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]\n        self.override_groups_map = {l: 4 for l in optional_groupwise_layers}\n\n        assert 0 not in self.override_groups_map\n\n        self.in_planes = min(64, int(64 * width_multiplier[0]))\n\n        self.stage0 = RepVGGBlock(in_channels=3, out_channels=self.in_planes, kernel_size=3, stride=2, padding=1)\n        self.cur_layer_idx = 1\n        self.stage1 = self._make_stage(int(64 * width_multiplier[0]), num_blocks[0], stride=2)\n        self.stage2 = self._make_stage(int(128 * width_multiplier[1]), num_blocks[1], stride=2)\n        self.stage3 = self._make_stage(int(256 * width_multiplier[2]), num_blocks[2], stride=2)\n        self.stage4 = self._make_stage(int(512 * width_multiplier[3]), num_blocks[3], stride=2)\n        self.gap = nn.AdaptiveAvgPool2D(output_size=1)\n        self.linear = nn.Linear(int(512 * width_multiplier[3]), class_dim)\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def _make_stage(self, planes, num_blocks, stride):\n        strides = [stride] + [1] * (num_blocks - 1)\n        blocks = []\n        for stride in strides:\n            cur_groups = self.override_groups_map.get(self.cur_layer_idx, 1)\n            blocks.append(\n                RepVGGBlock(\n                    in_channels=self.in_planes,\n                    out_channels=planes,\n                    kernel_size=3,\n                    stride=stride,\n                    padding=1,\n                    groups=cur_groups))\n            self.in_planes = planes\n            self.cur_layer_idx += 1\n        return nn.Sequential(*blocks)\n\n    def eval(self):\n        self.training = False\n        for layer in self.sublayers():\n            layer.training = False\n            layer.eval()\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        out = self.stage0(x)\n        out = self.stage1(out)\n        out = self.stage2(out)\n        out = self.stage3(out)\n        out = self.stage4(out)\n        feature = self.gap(out)\n        out = paddle.flatten(feature, start_axis=1)\n        out = self.linear(out)\n        return out, feature\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/README.md",
    "content": "# res2net101_vd_26w_4s_imagenet\n\n|模型名称|res2net101_vd_26w_4s_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Res2Net|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|179MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Res2Net是2019年提出的一种全新的对ResNet的改进方案，该方案可以和现有其他优秀模块轻松整合，在不增加计算负载量的情况下，在ImageNet、CIFAR-100等数据集上的测试性能超过了ResNet。Res2Net结构简单，性能优越，进一步探索了CNN在更细粒度级别的多尺度表示能力。 该 PaddleHub Module 使用 ImageNet-2012数据集训练，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install res2net101_vd_26w_4s_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run res2net101_vd_26w_4s_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"res2net101_vd_26w_4s_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 
是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\[dict\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别出的图像类别，value 为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m res2net101_vd_26w_4s_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/res2net101_vd_26w_4s_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install res2net101_vd_26w_4s_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/README_en.md",
    "content": "# res2net101_vd_26w_4s_imagenet\n\n|Module Name|res2net101_vd_26w_4s_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Res2Net|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|179MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Res2Net is an improvement on ResNet, which can improve performance without increasing computation. This module is based on Res2Net, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install res2net101_vd_26w_4s_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run res2net101_vd_26w_4s_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"res2net101_vd_26w_4s_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       
use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\[dict\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m res2net101_vd_26w_4s_imagenet\n    ```\n\n  - The model serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, use the following lines of code to send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/res2net101_vd_26w_4s_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install res2net101_vd_26w_4s_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = 
Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/module.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"res2net101_vd_26w_4s_imagenet\",\n    type=\"CV/image_classification\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\"res2net101_vd_26w_4s is an image classification model, this module is trained with the ImageNet dataset.\",\n    version=\"1.1.0\")\nclass Res2Net101vd26w4sImagenet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"res2net101_vd_26w_4s_imagenet_model\",\n                                                          \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self.predictor_set = False\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        
cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except Exception:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        if not self.predictor_set:\n            self._set_config()\n            self.predictor_set = True\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    # the last batch may be smaller than batch_size\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n         
   input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/processor.py",
"content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; frombuffer is the supported equivalent\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indices = np.argsort(result_i)[::-1][0:top_k]\n        for index in indices:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/res2net101_vd_26w_4s_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"res2net101_vd_26w_4s_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/resnet101_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=ParamAttr(bn_name + \"_offset\"),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet101.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, shortcut: bool = True, name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act=\"relu\", name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n        self._num_channels_out = num_filters * 4\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act=\"relu\")\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet101.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: 
int, shortcut: bool = True, name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act=\"relu\")\n        return y\n\n\n@moduleinfo(\n    name=\"resnet101_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet101_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet101(nn.Layer):\n    \"\"\"ResNet101 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet101, self).__init__()\n\n        self.layers = 101\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act=\"relu\", name=\"conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = 
[]\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    conv_name,\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet101_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet101_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            
self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet101_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet101_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n  
      if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet101_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet101_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet101_vd_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet101_vd(nn.Layer):\n    
\"\"\"ResNet101_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet101_vd, self).__init__()\n\n        self.layers = 101\n\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 
1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet101_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet101_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet101_vd_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet101_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n  
      if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet101_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet101_vd_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet101_vd_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass 
ResNet101_vd(nn.Layer):\n    \"\"\"ResNet101_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet101_vd, self).__init__()\n\n        self.layers = 101\n\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / 
math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet101_vd_ssld_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet101_vd_ssld_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet152_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=ParamAttr(bn_name + \"_offset\"),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet152.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, shortcut: bool = True, name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act=\"relu\", name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n        self._num_channels_out = num_filters * 4\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act=\"relu\")\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet152.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: 
int, shortcut: bool = True, name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act=\"relu\")\n        return y\n\n\n@moduleinfo(\n    name=\"resnet152_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet152_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet152(nn.Layer):\n    \"\"\"ResNet152 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet152, self).__init__()\n\n        self.layers = 152\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act=\"relu\", name=\"conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = 
[]\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                bottleneck_block = self.add_sublayer(\n                    conv_name,\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet152_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet152_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            
self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet152_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet152_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n  
      if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet152_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet152_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet152_vd_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet152_vd(nn.Layer):\n    
\"\"\"ResNet152_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet152_vd, self).__init__()\n\n        self.layers = 152\n\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 
1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet152_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet152_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet18_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=ParamAttr(bn_name + \"_offset\"),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet18.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, shortcut: bool = True, name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act=\"relu\", name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n        self._num_channels_out = num_filters * 4\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act=\"relu\")\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet18.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, 
shortcut: bool = True, name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act=\"relu\")\n        return y\n\n\n@moduleinfo(\n    name=\"resnet18_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet18_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet18(nn.Layer):\n    \"\"\"ResNet18 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet18, self).__init__()\n\n        self.layers = 18\n        depth = [2, 2, 2, 2]\n        num_channels = [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act=\"relu\", name=\"conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for 
block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                basic_block = self.add_sublayer(\n                    conv_name,\n                    BasicBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block],\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(basic_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet18_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet18_imagenet.pdparams -O ' +\n                    checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = 
paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet18_vd_imagenet/README.md",
    "content": "# resnet18_vd_imagenet\n\n|模型名称|resnet18_vd_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|46MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。ResNet-vd 其实就是 ResNet-D，是ResNet 原始结构的变种。该PaddleHub Module结构为ResNet_vd，基于ImageNet-2012数据集训练得到，接受输入图片大小为224 x 224 x 3，不支持finetune，可以直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet18_vd_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet18_vd_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet18_vd_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - 
top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的类别名称，value 为对应的置信度。\n\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m resnet18_vd_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet18_vd_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnet18_vd_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet18_vd_imagenet/README_en.md",
    "content": "# resnet18_vd_imagenet\n\n|Module Name|resnet18_vd_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|46MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. ResNet-vd is a variant of ResNet. This module is based on ResNet_vd, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet18_vd_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet18_vd_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet18_vd_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                  
     batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m resnet18_vd_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet18_vd_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - 
```shell\n    $ hub install resnet18_vd_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet18_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet18_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n   
     if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet18_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet18_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet18_vd_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet18_vd(nn.Layer):\n    
\"\"\"ResNet18_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet18_vd, self).__init__()\n\n        self.layers = 18\n        depth = [2, 2, 2, 2]\n        num_channels = [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                basic_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BasicBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block],\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(basic_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            
model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet18_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet18_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet200_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet200_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n  
      if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet200_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet200_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet200_vd_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet200_vd(nn.Layer):\n    
\"\"\"ResNet200_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet200_vd, self).__init__()\n\n        self.layers = 200\n\n        depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 
1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet200_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet200_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet34_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=ParamAttr(bn_name + \"_offset\"),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet34.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, shortcut: bool = True, name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act=\"relu\", name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n        self._num_channels_out = num_filters * 4\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act=\"relu\")\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet34.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, 
shortcut: bool = True, name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act=\"relu\")\n        return y\n\n\n@moduleinfo(\n    name=\"resnet34_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet34_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet34(nn.Layer):\n    \"\"\"ResNet34 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet34, self).__init__()\n\n        self.layers = 34\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act=\"relu\", name=\"conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for 
block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                basic_block = self.add_sublayer(\n                    conv_name,\n                    BasicBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block],\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(basic_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet34_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet34_imagenet.pdparams -O ' +\n                    checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = 
paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageEnhance\nfrom paddle import fluid\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    #img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"data generator\n    :param paths: path to images.\n    :type paths: list, each element is a str\n    :param images: data of images, [N, H, W, C]\n    :type images: numpy.ndarray\n    \"\"\"\n    img_list = []\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = 
Image.open(img_path)\n            #img = cv2.imread(img_path)\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(Image.fromarray(np.uint8(img)))\n    for im in img_list:\n        im = process_image(im)\n        yield im\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/label_file.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/module.py",
    "content": "import os\nimport ast\nimport argparse\n\nimport numpy as np\nimport paddlehub as hub\nimport paddle.fluid as fluid\nfrom paddlehub.module.module import moduleinfo, runnable\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.io.parser import txt_parser\n\nfrom resnet34_v2_imagenet.resnet import ResNet, ResNetC5\nfrom resnet34_v2_imagenet.processor import load_label_info\nfrom resnet34_v2_imagenet.data_feed import test_reader\n\n\n@moduleinfo(\n    name=\"resnet34_v2_imagenet\",\n    version=\"1.1.0\",\n    type=\"cv/classification\",\n    summary=\"ResNet34 is a image classfication model trained with ImageNet-2012 dataset.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass ResNet34(hub.Module):\n    def _initialize(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"resnet34_v2_model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self.infer_prog = None\n        self.pred_out = None\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        cpu_config = AnalysisConfig(self.default_pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_paddle_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n      
  if use_gpu:\n            gpu_config = AnalysisConfig(self.default_pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_paddle_predictor(gpu_config)\n\n    def context(self,\n                input_image=None,\n                trainable=True,\n                pretrained=True,\n                param_prefix='',\n                get_prediction=False,\n                variant='d',\n                norm_type='bn',\n                feature_maps=[3, 4, 5],\n                return_c5=False):\n        \"\"\"Distill the Head Features, so as to perform transfer learning.\n\n        :param input_image: image tensor.\n        :type input_image: <class 'paddle.fluid.framework.Variable'>\n        :param trainable: whether to set parameters trainable.\n        :type trainable: bool\n        :param pretrained: whether to load the default pretrained model.\n        :type pretrained: bool\n        :param param_prefix: the prefix of parameters in yolo_head and backbone\n        :type param_prefix: str\n        :param get_prediction: whether to get prediction,\n            if True, outputs is {'bbox_out': bbox_out},\n            if False, outputs is {'head_features': head_features}.\n        :type get_prediction: bool\n        :param depth: depth of network\n        :type depth: int\n        :param variant: type of resnet\n        :type variant: str\n        :param norm_type: type of normalization\n        :type norm_type: str\n        :param feature_maps: stage of output\n        :type feature_maps: list\n        \"\"\"\n        context_prog = input_image.block.program if input_image else fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            if return_c5:\n                return ResNetC5(depth=34, norm_type=norm_type, variant=variant, feature_maps=feature_maps)\n           
 image = input_image if input_image else fluid.data(\n                name='image', shape=[-1, 3, 224, 224], dtype='float32', lod_level=0)\n            backbone = ResNet(depth=34, variant=variant, norm_type=norm_type,\\\n                              feature_maps=feature_maps, get_prediction=get_prediction)\n\n            out = backbone(image)\n            inputs = {'image': image}\n            if get_prediction:\n                outputs = {'pred_out': out}\n            else:\n                outputs = {'body_feats': out}\n\n            place = fluid.CPUPlace()\n            exe = fluid.Executor(place)\n            if pretrained:\n\n                def _if_exist(var):\n                    return os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                if not param_prefix:\n                    fluid.io.load_vars(\n                        exe, self.default_pretrained_model_path, main_program=context_prog, predicate=_if_exist)\n            else:\n                exe.run(startup_program)\n            return inputs, outputs, context_prog\n\n    def classification(self, paths=None, images=None, use_gpu=False, batch_size=1, top_k=2):\n        \"\"\"API of Classification.\n        :param paths: the paths of the images.\n        :type paths: list, each element corresponds to the path of an image.\n        :param images: data of images, [N, H, W, C]\n        :type images: numpy.ndarray\n        :param use_gpu: whether to use gpu or not.\n        :type use_gpu: bool\n        :param batch_size: batch size.\n        :type batch_size: int\n        :param top_k: number of top results to return\n        :type top_k: int\n        \"\"\"\n        if self.infer_prog is None:\n            inputs, outputs, self.infer_prog = self.context(trainable=False, pretrained=True, get_prediction=True)\n            self.infer_prog = self.infer_prog.clone(for_test=True)\n            self.pred_out = outputs['pred_out']\n        place = fluid.CUDAPlace(0) if use_gpu else 
fluid.CPUPlace()\n        exe = fluid.Executor(place)\n        all_images = []\n        paths = paths if paths else []\n        for yield_data in test_reader(paths, images):\n            all_images.append(yield_data)\n\n        images_num = len(all_images)\n        loop_num = int(np.ceil(images_num / batch_size))\n\n        res_list = []\n        top_k = max(min(top_k, 1000), 1)\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except:\n                    pass\n            batch_data = np.array(batch_data).astype('float32')\n            data_tensor = PaddleTensor(batch_data.copy())\n            if use_gpu:\n                result = self.gpu_predictor.run([data_tensor])\n            else:\n                result = self.cpu_predictor.run([data_tensor])\n            for i, res in enumerate(result[0].as_ndarray()):\n                res_dict = {}\n                pred_label = np.argsort(res)[::-1][:top_k]\n                for k in pred_label:\n                    class_name = self.label_names[int(k)].split(',')[0]\n                    max_prob = res[k]\n                    res_dict[class_name] = max_prob\n                res_list.append(res_dict)\n        return res_list\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"input 
data\")\n        self.arg_input_group.add_argument('--input_file', type=str, default=None, help=\"file containing input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s does not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\"File %s does not exist.\" % image_path)\n        return self.classification(paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/name_adapter.py",
    "content": "# coding=utf-8\n\n\nclass NameAdapter(object):\n    \"\"\"Adapt the backbone variable names to match the pretrained weights.\"\"\"\n\n    def __init__(self, model):\n        super(NameAdapter, self).__init__()\n        self.model = model\n\n    @property\n    def model_type(self):\n        return getattr(self.model, '_model_type', '')\n\n    @property\n    def variant(self):\n        return getattr(self.model, 'variant', '')\n\n    def fix_conv_norm_name(self, name):\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        # the naming rule is the same as for the pretrained weights\n        if self.model_type == 'SEResNeXt':\n            bn_name = name + \"_bn\"\n        return bn_name\n\n    def fix_shortcut_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            name = 'conv' + name + '_prj'\n        return name\n\n    def fix_bottleneck_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            conv_name1 = 'conv' + name + '_x1'\n            conv_name2 = 'conv' + name + '_x2'\n            conv_name3 = 'conv' + name + '_x3'\n            shortcut_name = name\n        else:\n            conv_name1 = name + \"_branch2a\"\n            conv_name2 = name + \"_branch2b\"\n            conv_name3 = name + \"_branch2c\"\n            shortcut_name = name + \"_branch1\"\n        return conv_name1, conv_name2, conv_name3, shortcut_name\n\n    def fix_layer_warp_name(self, stage_num, count, i):\n        name = 'res' + str(stage_num)\n        if count > 10 and stage_num == 4:\n            if i == 0:\n                conv_name = name + \"a\"\n            else:\n                conv_name = name + \"b\" + str(i)\n        else:\n            conv_name = name + chr(ord(\"a\") + i)\n        if self.model_type == 'SEResNeXt':\n            conv_name = str(stage_num + 2) + '_' + str(i + 1)\n        return conv_name\n\n    def fix_c1_stage_name(self):\n        return \"res_conv1\" if 
self.model_type == 'ResNeXt' else \"conv1\"\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/nonlocal_helper.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\nnonlocal_params = {\n    \"use_zero_init_conv\": False,\n    \"conv_init_std\": 0.01,\n    \"no_bias\": True,\n    \"use_maxpool\": False,\n    \"use_softmax\": True,\n    \"use_bn\": False,\n    \"use_scale\": True,  # vital for model performance\n    \"use_affine\": False,\n    \"bn_momentum\": 0.9,\n    \"bn_epsilon\": 1.0000001e-5,\n    \"bn_init_gamma\": 0.9,\n    \"weight_decay_bn\": 1.e-4,\n}\n\n\ndef space_nonlocal(input, dim_in, dim_out, prefix, dim_inner, max_pool_stride=2):\n    cur = input\n    theta = fluid.layers.conv2d(input = cur, num_filters = dim_inner, \\\n                                filter_size = [1, 1], stride = [1, 1], \\\n                                padding = [0, 0], \\\n                                param_attr=ParamAttr(name = prefix + '_theta' + \"_w\", \\\n                                    initializer = fluid.initializer.Normal(loc = 0.0,\n                                    scale = nonlocal_params[\"conv_init_std\"])), \\\n                                bias_attr = ParamAttr(name = prefix + '_theta' + \"_b\", \\\n                                    initializer = fluid.initializer.Constant(value = 0.)) \\\n                                        if not nonlocal_params[\"no_bias\"] else False, \\\n                                name = prefix + '_theta')\n    theta_shape = theta.shape\n    theta_shape_op = fluid.layers.shape(theta)\n    theta_shape_op.stop_gradient = True\n\n    if nonlocal_params[\"use_maxpool\"]:\n        max_pool = fluid.layers.pool2d(input = cur, \\\n                                        pool_size = [max_pool_stride, max_pool_stride], \\\n                                        pool_type = 'max', \\\n                                        pool_stride = 
[max_pool_stride, max_pool_stride], \\\n                                        pool_padding = [0, 0], \\\n                                        name = prefix + '_pool')\n    else:\n        max_pool = cur\n\n    phi = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                             filter_size = [1, 1], stride = [1, 1], \\\n                             padding = [0, 0], \\\n                             param_attr = ParamAttr(name = prefix + '_phi' + \"_w\", \\\n                                 initializer = fluid.initializer.Normal(loc = 0.0,\n                                 scale = nonlocal_params[\"conv_init_std\"])), \\\n                             bias_attr = ParamAttr(name = prefix + '_phi' + \"_b\", \\\n                                 initializer = fluid.initializer.Constant(value = 0.)) \\\n                                      if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                             name = prefix + '_phi')\n    phi_shape = phi.shape\n\n    g = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                 filter_size = [1, 1], stride = [1, 1], \\\n                 padding = [0, 0], \\\n                 param_attr = ParamAttr(name = prefix + '_g' + \"_w\", \\\n                     initializer = fluid.initializer.Normal(loc = 0.0, scale = nonlocal_params[\"conv_init_std\"])), \\\n                 bias_attr = ParamAttr(name = prefix + '_g' + \"_b\", \\\n                     initializer = fluid.initializer.Constant(value = 0.)) if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                 name = prefix + '_g')\n    g_shape = g.shape\n    # we have to use explicit batch size (to support arbitrary spacetime size)\n    # e.g. 
(8, 1024, 4, 14, 14) => (8, 1024, 784)\n    theta = fluid.layers.reshape(theta, shape=(0, 0, -1))\n    theta = fluid.layers.transpose(theta, [0, 2, 1])\n    phi = fluid.layers.reshape(phi, [0, 0, -1])\n    theta_phi = fluid.layers.matmul(theta, phi, name=prefix + '_affinity')\n    g = fluid.layers.reshape(g, [0, 0, -1])\n\n    if nonlocal_params[\"use_softmax\"]:\n        if nonlocal_params[\"use_scale\"]:\n            theta_phi_sc = fluid.layers.scale(theta_phi, scale=dim_inner**-.5)\n        else:\n            theta_phi_sc = theta_phi\n        p = fluid.layers.softmax(theta_phi_sc, name=prefix + '_affinity' + '_prob')\n    else:\n        # raising a plain string is invalid in Python 3; use a real exception type\n        raise NotImplementedError(\"non-softmax affinity is not implemented\")\n\n    # note g's axis[2] corresponds to p's axis[2]\n    # e.g. g(8, 1024, 784_2) * p(8, 784_1, 784_2) => (8, 1024, 784_1)\n    p = fluid.layers.transpose(p, [0, 2, 1])\n    t = fluid.layers.matmul(g, p, name=prefix + '_y')\n\n    # reshape back\n    # e.g. (8, 1024, 784) => (8, 1024, 4, 14, 14)\n    t_shape = t.shape\n    t_re = fluid.layers.reshape(t, shape=list(theta_shape), actual_shape=theta_shape_op)\n    blob_out = t_re\n    blob_out = fluid.layers.conv2d(input = blob_out, num_filters = dim_out, \\\n                                  filter_size = [1, 1], stride = [1, 1], padding = [0, 0], \\\n                                  param_attr = ParamAttr(name = prefix + '_out' + \"_w\", \\\n                                      initializer = fluid.initializer.Constant(value = 0.) 
\\\n                                        if nonlocal_params[\"use_zero_init_conv\"] \\\n                                        else fluid.initializer.Normal(loc = 0.0,\n                                            scale = nonlocal_params[\"conv_init_std\"])), \\\n                                  bias_attr = ParamAttr(name = prefix + '_out' + \"_b\", \\\n                                          initializer = fluid.initializer.Constant(value = 0.)) \\\n                                           if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                                  name = prefix + '_out')\n    blob_out_shape = blob_out.shape\n\n    if nonlocal_params[\"use_bn\"]:\n        bn_name = prefix + \"_bn\"\n        blob_out = fluid.layers.batch_norm(blob_out, \\\n                      # is_test = test_mode, \\\n                      momentum = nonlocal_params[\"bn_momentum\"], \\\n                      epsilon = nonlocal_params[\"bn_epsilon\"], \\\n                      name = bn_name, \\\n                      param_attr = ParamAttr(name = bn_name + \"_s\", \\\n                      initializer = fluid.initializer.Constant(value = nonlocal_params[\"bn_init_gamma\"]), \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      bias_attr = ParamAttr(name = bn_name + \"_b\", \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      moving_mean_name = bn_name + \"_rm\", \\\n                      moving_variance_name = bn_name + \"_riv\") # add bn\n\n    if nonlocal_params[\"use_affine\"]:\n        affine_scale = fluid.layers.create_parameter(\\\n                       shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                       attr=ParamAttr(name=prefix + '_affine' + '_s'), \\\n                       default_initializer = fluid.initializer.Constant(value = 1.))\n        affine_bias = 
fluid.layers.create_parameter(\\\n                      shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                      attr=ParamAttr(name=prefix + '_affine' + '_b'), \\\n                      default_initializer = fluid.initializer.Constant(value = 0.))\n        blob_out = fluid.layers.affine_channel(blob_out, scale = affine_scale, \\\n                      bias = affine_bias, name = prefix + '_affine')   # add affine\n\n    return blob_out\n\n\ndef add_space_nonlocal(input, dim_in, dim_out, prefix, dim_inner):\n    '''\n    add_space_nonlocal:\n        Non-local Neural Networks: see https://arxiv.org/abs/1711.07971\n    '''\n    conv = space_nonlocal(input, dim_in, dim_out, prefix, dim_inner)\n    output = fluid.layers.elementwise_add(input, conv, name=prefix + '_sum')\n    return output\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/processor.py",
    "content": "# coding=utf-8\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        return fr.read().split(\"\\n\")[:-1]\n"
  },
  {
    "path": "modules/image/classification/resnet34_v2_imagenet/resnet.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nfrom collections import OrderedDict\nfrom numbers import Integral\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.framework import Variable\nfrom paddle.fluid.regularizer import L2Decay\nfrom paddle.fluid.initializer import Constant\n\nfrom .nonlocal_helper import add_space_nonlocal\nfrom .name_adapter import NameAdapter\n\n__all__ = ['ResNet', 'ResNetC5']\n\n\nclass ResNet(object):\n    \"\"\"\n    Residual Network, see https://arxiv.org/abs/1512.03385\n    Args:\n        depth (int): ResNet depth, should be 34 or 50.\n        freeze_at (int): the stage at which to freeze the backbone\n        norm_type (str): normalization type, 'bn'/'sync_bn'/'affine_channel'\n        freeze_norm (bool): freeze normalization layers\n        norm_decay (float): weight decay for normalization layer weights\n        variant (str): ResNet variant, supports 'a', 'b', 'c', 'd' currently\n        feature_maps (list): indices of the stages whose feature maps are returned\n        dcn_v2_stages (list): indices of the stages that use deformable conv v2\n        nonlocal_stages (list): indices of the stages that use non-local blocks\n    \"\"\"\n    __shared__ = ['norm_type', 'freeze_norm', 'weight_prefix_name']\n\n    def __init__(self,\n                 depth=34,\n                 freeze_at=0,\n                 norm_type='sync_bn',\n                 freeze_norm=False,\n                 norm_decay=0.,\n                 variant='b',\n                 feature_maps=[3, 4, 5],\n                 dcn_v2_stages=[],\n                 weight_prefix_name='',\n                 nonlocal_stages=[],\n                 get_prediction=False,\n                 class_dim=1000):\n        super(ResNet, self).__init__()\n\n        if isinstance(feature_maps, Integral):\n            feature_maps = [feature_maps]\n\n        assert 
depth in [34, 50], \\\n            \"depth {} not in [34, 50]\".format(depth)\n        assert variant in ['a', 'b', 'c', 'd'], \"invalid ResNet variant\"\n        assert 0 <= freeze_at <= 4, \"freeze_at should be 0, 1, 2, 3 or 4\"\n        assert len(feature_maps) > 0, \"need one or more feature maps\"\n        assert norm_type in ['bn', 'sync_bn', 'affine_channel']\n        assert not (len(nonlocal_stages)>0 and depth<50), \\\n                    \"non-local is not supported for resnet18 or resnet34\"\n\n        self.depth = depth\n        self.freeze_at = freeze_at\n        self.norm_type = norm_type\n        self.norm_decay = norm_decay\n        self.freeze_norm = freeze_norm\n        self.variant = variant\n        self._model_type = 'ResNet'\n        self.feature_maps = feature_maps\n        self.dcn_v2_stages = dcn_v2_stages\n        self.depth_cfg = {\n            34: ([3, 4, 6, 3], self.basicblock),\n            50: ([3, 4, 6, 3], self.bottleneck),\n        }\n        self.stage_filters = [64, 128, 256, 512]\n        self._c1_out_chan_num = 64\n        self.na = NameAdapter(self)\n        self.prefix_name = weight_prefix_name\n\n        self.nonlocal_stages = nonlocal_stages\n        self.nonlocal_mod_cfg = {\n            50: 2,\n            101: 5,\n            152: 8,\n            200: 12,\n        }\n        self.get_prediction = get_prediction\n        self.class_dim = class_dim\n\n    def _conv_offset(self, input, filter_size, stride, padding, act=None, name=None):\n        out_channel = filter_size * filter_size * 3\n        out = fluid.layers.conv2d(\n            input,\n            num_filters=out_channel,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            param_attr=ParamAttr(initializer=Constant(0.0), name=name + \".w_0\"),\n            bias_attr=ParamAttr(initializer=Constant(0.0), name=name + \".b_0\"),\n            act=act,\n            name=name)\n        return out\n\n    def _conv_norm(self, input, 
num_filters, filter_size, stride=1, groups=1, act=None, name=None, dcn_v2=False):\n        _name = self.prefix_name + name if self.prefix_name != '' else name\n        if not dcn_v2:\n            conv = fluid.layers.conv2d(\n                input=input,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                act=None,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + '.conv2d.output.1')\n        else:\n            # select deformable conv\n            offset_mask = self._conv_offset(\n                input=input,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                act=None,\n                name=_name + \"_conv_offset\")\n            offset_channel = filter_size**2 * 2\n            mask_channel = filter_size**2\n            offset, mask = fluid.layers.split(input=offset_mask, num_or_sections=[offset_channel, mask_channel], dim=1)\n            mask = fluid.layers.sigmoid(mask)\n            conv = fluid.layers.deformable_conv(\n                input=input,\n                offset=offset,\n                mask=mask,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                deformable_groups=1,\n                im2col_step=1,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + \".conv2d.output.1\")\n\n        bn_name = self.na.fix_conv_norm_name(name)\n        bn_name = self.prefix_name + bn_name if self.prefix_name != '' else bn_name\n\n        norm_lr = 0. 
if self.freeze_norm else 1.\n        norm_decay = self.norm_decay\n        pattr = ParamAttr(name=bn_name + '_scale', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n        battr = ParamAttr(name=bn_name + '_offset', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n\n        if self.norm_type in ['bn', 'sync_bn']:\n            global_stats = True if self.freeze_norm else False\n            out = fluid.layers.batch_norm(\n                input=conv,\n                act=act,\n                name=bn_name + '.output.1',\n                param_attr=pattr,\n                bias_attr=battr,\n                moving_mean_name=bn_name + '_mean',\n                moving_variance_name=bn_name + '_variance',\n                use_global_stats=global_stats)\n            scale = fluid.framework._get_var(pattr.name)\n            bias = fluid.framework._get_var(battr.name)\n        elif self.norm_type == 'affine_channel':\n            scale = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=pattr, default_initializer=fluid.initializer.Constant(1.))\n            bias = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=battr, default_initializer=fluid.initializer.Constant(0.))\n            out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias, act=act)\n        if self.freeze_norm:\n            scale.stop_gradient = True\n            bias.stop_gradient = True\n        return out\n\n    def _shortcut(self, input, ch_out, stride, is_first, name):\n        max_pooling_in_short_cut = self.variant == 'd'\n        ch_in = input.shape[1]\n        # the naming rule is same as pretrained weight\n        name = self.na.fix_shortcut_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if ch_in != ch_out or stride != 1 or (self.depth < 50 and is_first):\n            if std_senet:\n                if is_first:\n                    return 
self._conv_norm(input, ch_out, 1, stride, name=name)\n                else:\n                    return self._conv_norm(input, ch_out, 3, stride, name=name)\n            if max_pooling_in_short_cut and not is_first:\n                input = fluid.layers.pool2d(\n                    input=input, pool_size=2, pool_stride=2, pool_padding=0, ceil_mode=True, pool_type='avg')\n                return self._conv_norm(input, ch_out, 1, 1, name=name)\n            return self._conv_norm(input, ch_out, 1, stride, name=name)\n        else:\n            return input\n\n    def bottleneck(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        if self.variant == 'a':\n            stride1, stride2 = stride, 1\n        else:\n            stride1, stride2 = 1, stride\n\n        # ResNeXt\n        groups = getattr(self, 'groups', 1)\n        group_width = getattr(self, 'group_width', -1)\n        if groups == 1:\n            expand = 4\n        elif (groups * group_width) == 256:\n            expand = 1\n        else:  # FIXME hard code for now, handles 32x4d, 64x4d and 32x8d\n            num_filters = num_filters // 2\n            expand = 2\n\n        conv_name1, conv_name2, conv_name3, \\\n            shortcut_name = self.na.fix_bottleneck_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if std_senet:\n            conv_def = [[int(num_filters / 2), 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n        else:\n            conv_def = [[num_filters, 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n\n        residual = input\n        for i, (c, k, s, act, g, _name) in enumerate(conv_def):\n            residual = self._conv_norm(\n                
input=residual,\n                num_filters=c,\n                filter_size=k,\n                stride=s,\n                act=act,\n                groups=g,\n                name=_name,\n                dcn_v2=(i == 1 and dcn_v2))\n        short = self._shortcut(input, num_filters * expand, stride, is_first=is_first, name=shortcut_name)\n        # Squeeze-and-Excitation\n        if callable(getattr(self, '_squeeze_excitation', None)):\n            residual = self._squeeze_excitation(input=residual, num_channels=num_filters, name='fc' + name)\n        return fluid.layers.elementwise_add(x=short, y=residual, act='relu', name=name + \".add.output.5\")\n\n    def basicblock(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        assert dcn_v2 is False, \"Not implemented yet.\"\n        conv0 = self._conv_norm(\n            input=input, num_filters=num_filters, filter_size=3, act='relu', stride=stride, name=name + \"_branch2a\")\n        conv1 = self._conv_norm(input=conv0, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n        short = self._shortcut(input, num_filters, stride, is_first, name=name + \"_branch1\")\n        return fluid.layers.elementwise_add(x=short, y=conv1, act='relu')\n\n    def layer_warp(self, input, stage_num):\n        \"\"\"\n        Args:\n            input (Variable): input variable.\n            stage_num (int): the stage number, should be 2, 3, 4, 5\n\n        Returns:\n            The last variable in endpoint-th stage.\n        \"\"\"\n        assert stage_num in [2, 3, 4, 5]\n\n        stages, block_func = self.depth_cfg[self.depth]\n        count = stages[stage_num - 2]\n\n        ch_out = self.stage_filters[stage_num - 2]\n        is_first = False if stage_num != 2 else True\n        dcn_v2 = True if stage_num in self.dcn_v2_stages else False\n\n        nonlocal_mod = 1000\n        if stage_num in self.nonlocal_stages:\n            nonlocal_mod = self.nonlocal_mod_cfg[self.depth] if 
stage_num == 4 else 2\n\n        # Make the layer name and parameter name consistent\n        # with ImageNet pre-trained model\n        conv = input\n        for i in range(count):\n            conv_name = self.na.fix_layer_warp_name(stage_num, count, i)\n            if self.depth < 50:\n                is_first = True if i == 0 and stage_num == 2 else False\n            conv = block_func(\n                input=conv,\n                num_filters=ch_out,\n                stride=2 if i == 0 and stage_num != 2 else 1,\n                is_first=is_first,\n                name=conv_name,\n                dcn_v2=dcn_v2)\n\n            # add non local model\n            dim_in = conv.shape[1]\n            nonlocal_name = \"nonlocal_conv{}\".format(stage_num)\n            if i % nonlocal_mod == nonlocal_mod - 1:\n                conv = add_space_nonlocal(conv, dim_in, dim_in, nonlocal_name + '_{}'.format(i), int(dim_in / 2))\n        return conv\n\n    def c1_stage(self, input):\n        out_chan = self._c1_out_chan_num\n\n        conv1_name = self.na.fix_c1_stage_name()\n\n        if self.variant in ['c', 'd']:\n            conv_def = [\n                [out_chan // 2, 3, 2, \"conv1_1\"],\n                [out_chan // 2, 3, 1, \"conv1_2\"],\n                [out_chan, 3, 1, \"conv1_3\"],\n            ]\n        else:\n            conv_def = [[out_chan, 7, 2, conv1_name]]\n\n        for (c, k, s, _name) in conv_def:\n            input = self._conv_norm(input=input, num_filters=c, filter_size=k, stride=s, act='relu', name=_name)\n\n        output = fluid.layers.pool2d(input=input, pool_size=3, pool_stride=2, pool_padding=1, pool_type='max')\n        return output\n\n    def __call__(self, input):\n        assert isinstance(input, Variable)\n        assert not (set(self.feature_maps) - set([2, 3, 4, 5])), \\\n            \"feature maps {} not in [2, 3, 4, 5]\".format(self.feature_maps)\n\n        res_endpoints = []\n\n        res = input\n        feature_maps = 
self.feature_maps\n        severed_head = getattr(self, 'severed_head', False)\n        if not severed_head:\n            res = self.c1_stage(res)\n            feature_maps = range(2, max(self.feature_maps) + 1)\n\n        for i in feature_maps:\n            res = self.layer_warp(res, i)\n            if i in self.feature_maps:\n                res_endpoints.append(res)\n            if self.freeze_at >= i:\n                res.stop_gradient = True\n        if self.get_prediction:\n            pool = fluid.layers.pool2d(input=res, pool_type='avg', global_pooling=True)\n            stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)\n\n            out = fluid.layers.fc(\n                input=pool,\n                size=self.class_dim,\n                param_attr=fluid.param_attr.ParamAttr(initializer=fluid.initializer.Uniform(-stdv, stdv)))\n            out = fluid.layers.softmax(out)\n            return out\n        return OrderedDict(\n            [('res{}_sum'.format(self.feature_maps[idx]), feat) for idx, feat in enumerate(res_endpoints)])\n\n\nclass ResNetC5(ResNet):\n    def __init__(self,\n                 depth=50,\n                 freeze_at=2,\n                 norm_type='affine_channel',\n                 freeze_norm=True,\n                 norm_decay=0.,\n                 variant='b',\n                 feature_maps=[5],\n                 weight_prefix_name=''):\n        super(ResNetC5, self).__init__(depth, freeze_at, norm_type, freeze_norm, norm_decay, variant, feature_maps)\n        self.severed_head = True\n"
  },
  {
    "path": "modules/image/classification/resnet34_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet34_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n   
     if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet34_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet34_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet34_vd_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet34_vd(nn.Layer):\n    
\"\"\"ResNet34_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet34_vd, self).__init__()\n\n        self.layers = 34\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                basic_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BasicBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block],\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(basic_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            
model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet34_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet34_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet34_vd_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet34_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n   
     if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet34_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet34_vd_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet34_vd_imagenet_ssld is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass 
ResNet34_vd(nn.Layer):\n    \"\"\"ResNet34_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet34_vd, self).__init__()\n\n        self.layers = 34\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                basic_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BasicBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block],\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(basic_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is 
not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet34_vd_ssld_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet34_vd_ssld_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet50_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=ParamAttr(bn_name + \"_offset\"),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + \"_variance\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet50.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, shortcut: bool = True, name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act=\"relu\", name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n        self._num_channels_out = num_filters * 4\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act=\"relu\")\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet50.\"\"\"\n\n    def __init__(self, num_channels: int, num_filters: int, stride: int, 
shortcut: bool = True, name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act=\"relu\")\n        return y\n\n\n@moduleinfo(\n    name=\"resnet50_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet50_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet50(nn.Layer):\n    \"\"\"ResNet50 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet50, self).__init__()\n\n        self.layers = 50\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act=\"relu\", name=\"conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        
for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    conv_name,\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet50_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet50_imagenet.pdparams -O ' +\n                    checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n     
   y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageEnhance\nfrom paddle import fluid\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    #img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"data generator\n    :param paths: path to images.\n    :type paths: list, each element is a str\n    :param images: data of images, [N, H, W, C]\n    :type images: numpy.ndarray\n    \"\"\"\n    img_list = []\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = 
Image.open(img_path)\n            #img = cv2.imread(img_path)\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(Image.fromarray(np.uint8(img)))\n    for im in img_list:\n        im = process_image(im)\n        yield im\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/label_file.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/module.py",
    "content": "import os\nimport ast\nimport argparse\n\nimport numpy as np\nimport paddlehub as hub\nimport paddle.fluid as fluid\nfrom paddlehub.module.module import moduleinfo, runnable\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.io.parser import txt_parser\n\nfrom resnet50_v2_imagenet.resnet import ResNet, ResNetC5\nfrom resnet50_v2_imagenet.processor import load_label_info\nfrom resnet50_v2_imagenet.data_feed import test_reader\n\n\n@moduleinfo(\n    name=\"resnet50_v2_imagenet\",\n    version=\"1.1.0\",\n    type=\"cv/classification\",\n    summary=\"ResNet50 is an image classification model trained with the ImageNet-2012 dataset.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass ResNet50(hub.Module):\n    def _initialize(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"resnet50_v2_model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self.infer_prog = None\n        self.pred_out = None\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        cpu_config = AnalysisConfig(self.default_pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_paddle_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except Exception:\n            use_gpu = False\n      
  if use_gpu:\n            gpu_config = AnalysisConfig(self.default_pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_paddle_predictor(gpu_config)\n\n    def context(self,\n                input_image=None,\n                trainable=True,\n                pretrained=True,\n                param_prefix='',\n                get_prediction=False,\n                variant='d',\n                norm_type='bn',\n                feature_maps=[3, 4, 5],\n                return_c5=False):\n        \"\"\"Distill the head features, so as to perform transfer learning.\n\n        :param input_image: image tensor.\n        :type input_image: <class 'paddle.fluid.framework.Variable'>\n        :param trainable: whether to set parameters trainable.\n        :type trainable: bool\n        :param pretrained: whether to load the default pretrained model.\n        :type pretrained: bool\n        :param param_prefix: the prefix of the backbone parameters\n        :type param_prefix: str\n        :param get_prediction: whether to get prediction,\n            if True, outputs is {'pred_out': out},\n            if False, outputs is {'body_feats': out}.\n        :type get_prediction: bool\n        :param variant: type of resnet\n        :type variant: str\n        :param norm_type: type of normalization\n        :type norm_type: str\n        :param feature_maps: stages whose feature maps are returned\n        :type feature_maps: list\n        \"\"\"\n        context_prog = input_image.block.program if input_image else fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            if return_c5:\n                return ResNetC5(depth=50, norm_type=norm_type, variant=variant, feature_maps=feature_maps)\n           
 image = input_image if input_image else fluid.data(\n                name='image', shape=[-1, 3, 224, 224], dtype='float32', lod_level=0)\n            backbone = ResNet(depth=50, variant=variant, norm_type=norm_type,\\\n                              feature_maps=feature_maps, get_prediction=get_prediction)\n\n            out = backbone(image)\n            inputs = {'image': image}\n            if get_prediction:\n                outputs = {'pred_out': out}\n            else:\n                outputs = {'body_feats': out}\n\n            place = fluid.CPUPlace()\n            exe = fluid.Executor(place)\n            if pretrained:\n\n                def _if_exist(var):\n                    return os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                if not param_prefix:\n                    fluid.io.load_vars(\n                        exe, self.default_pretrained_model_path, main_program=context_prog, predicate=_if_exist)\n            else:\n                exe.run(startup_program)\n            return inputs, outputs, context_prog\n\n    def classification(self, paths=None, images=None, use_gpu=False, batch_size=1, top_k=2):\n        \"\"\"API of Classification.\n        :param paths: the paths of images.\n        :type paths: list, each element corresponds to the path of an image.\n        :param images: data of images, [N, H, W, C]\n        :type images: numpy.ndarray\n        :param use_gpu: whether to use gpu or not.\n        :type use_gpu: bool\n        :param batch_size: batch size.\n        :type batch_size: int\n        :param top_k: the number of top classification results to return\n        :type top_k: int\n        \"\"\"\n        if self.infer_prog is None:\n            inputs, outputs, self.infer_prog = self.context(trainable=False, pretrained=True, get_prediction=True)\n            self.infer_prog = self.infer_prog.clone(for_test=True)\n            self.pred_out = outputs['pred_out']\n        place = fluid.CUDAPlace(0) if use_gpu else 
fluid.CPUPlace()\n        exe = fluid.Executor(place)\n        all_images = []\n        paths = paths if paths else []\n        for yield_data in test_reader(paths, images):\n            all_images.append(yield_data)\n\n        images_num = len(all_images)\n        loop_num = int(np.ceil(images_num / batch_size))\n\n        res_list = []\n        top_k = max(min(top_k, 1000), 1)\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except IndexError:\n                    pass\n            batch_data = np.array(batch_data).astype('float32')\n            data_tensor = PaddleTensor(batch_data.copy())\n            if use_gpu:\n                result = self.gpu_predictor.run([data_tensor])\n            else:\n                result = self.cpu_predictor.run([data_tensor])\n            for i, res in enumerate(result[0].as_ndarray()):\n                res_dict = {}\n                pred_label = np.argsort(res)[::-1][:top_k]\n                for k in pred_label:\n                    class_name = self.label_names[int(k)].split(',')[0]\n                    max_prob = res[k]\n                    res_dict[class_name] = max_prob\n                res_list.append(res_dict)\n        return res_list\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"input 
data\")\n        self.arg_input_group.add_argument('--input_file', type=str, default=None, help=\"file containing input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s does not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\"File %s does not exist.\" % image_path)\n        return self.classification(paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/name_adapter.py",
    "content": "# coding=utf-8\n\n\nclass NameAdapter(object):\n    \"\"\"Fix the backbones variable names for pretrained weight\"\"\"\n\n    def __init__(self, model):\n        super(NameAdapter, self).__init__()\n        self.model = model\n\n    @property\n    def model_type(self):\n        return getattr(self.model, '_model_type', '')\n\n    @property\n    def variant(self):\n        return getattr(self.model, 'variant', '')\n\n    def fix_conv_norm_name(self, name):\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        # the naming rule is same as pretrained weight\n        if self.model_type == 'SEResNeXt':\n            bn_name = name + \"_bn\"\n        return bn_name\n\n    def fix_shortcut_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            name = 'conv' + name + '_prj'\n        return name\n\n    def fix_bottleneck_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            conv_name1 = 'conv' + name + '_x1'\n            conv_name2 = 'conv' + name + '_x2'\n            conv_name3 = 'conv' + name + '_x3'\n            shortcut_name = name\n        else:\n            conv_name1 = name + \"_branch2a\"\n            conv_name2 = name + \"_branch2b\"\n            conv_name3 = name + \"_branch2c\"\n            shortcut_name = name + \"_branch1\"\n        return conv_name1, conv_name2, conv_name3, shortcut_name\n\n    def fix_layer_warp_name(self, stage_num, count, i):\n        name = 'res' + str(stage_num)\n        if count > 10 and stage_num == 4:\n            if i == 0:\n                conv_name = name + \"a\"\n            else:\n                conv_name = name + \"b\" + str(i)\n        else:\n            conv_name = name + chr(ord(\"a\") + i)\n        if self.model_type == 'SEResNeXt':\n            conv_name = str(stage_num + 2) + '_' + str(i + 1)\n        return conv_name\n\n    def fix_c1_stage_name(self):\n        return \"res_conv1\" if 
self.model_type == 'ResNeXt' else \"conv1\"\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/nonlocal_helper.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\nnonlocal_params = {\n    \"use_zero_init_conv\": False,\n    \"conv_init_std\": 0.01,\n    \"no_bias\": True,\n    \"use_maxpool\": False,\n    \"use_softmax\": True,\n    \"use_bn\": False,\n    \"use_scale\": True,  # vital for the model performance!!!\n    \"use_affine\": False,\n    \"bn_momentum\": 0.9,\n    \"bn_epsilon\": 1.0000001e-5,\n    \"bn_init_gamma\": 0.9,\n    \"weight_decay_bn\": 1.e-4,\n}\n\n\ndef space_nonlocal(input, dim_in, dim_out, prefix, dim_inner, max_pool_stride=2):\n    cur = input\n    theta = fluid.layers.conv2d(input = cur, num_filters = dim_inner, \\\n                                filter_size = [1, 1], stride = [1, 1], \\\n                                padding = [0, 0], \\\n                                param_attr=ParamAttr(name = prefix + '_theta' + \"_w\", \\\n                                    initializer = fluid.initializer.Normal(loc = 0.0,\n                                    scale = nonlocal_params[\"conv_init_std\"])), \\\n                                bias_attr = ParamAttr(name = prefix + '_theta' + \"_b\", \\\n                                    initializer = fluid.initializer.Constant(value = 0.)) \\\n                                        if not nonlocal_params[\"no_bias\"] else False, \\\n                                name = prefix + '_theta')\n    theta_shape = theta.shape\n    theta_shape_op = fluid.layers.shape(theta)\n    theta_shape_op.stop_gradient = True\n\n    if nonlocal_params[\"use_maxpool\"]:\n        max_pool = fluid.layers.pool2d(input = cur, \\\n                                        pool_size = [max_pool_stride, max_pool_stride], \\\n                                        pool_type = 'max', \\\n                                        pool_stride = 
[max_pool_stride, max_pool_stride], \\\n                                        pool_padding = [0, 0], \\\n                                        name = prefix + '_pool')\n    else:\n        max_pool = cur\n\n    phi = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                             filter_size = [1, 1], stride = [1, 1], \\\n                             padding = [0, 0], \\\n                             param_attr = ParamAttr(name = prefix + '_phi' + \"_w\", \\\n                                 initializer = fluid.initializer.Normal(loc = 0.0,\n                                 scale = nonlocal_params[\"conv_init_std\"])), \\\n                             bias_attr = ParamAttr(name = prefix + '_phi' + \"_b\", \\\n                                 initializer = fluid.initializer.Constant(value = 0.)) \\\n                                      if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                             name = prefix + '_phi')\n    phi_shape = phi.shape\n\n    g = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                 filter_size = [1, 1], stride = [1, 1], \\\n                 padding = [0, 0], \\\n                 param_attr = ParamAttr(name = prefix + '_g' + \"_w\", \\\n                     initializer = fluid.initializer.Normal(loc = 0.0, scale = nonlocal_params[\"conv_init_std\"])), \\\n                 bias_attr = ParamAttr(name = prefix + '_g' + \"_b\", \\\n                     initializer = fluid.initializer.Constant(value = 0.)) if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                 name = prefix + '_g')\n    g_shape = g.shape\n    # we have to use explicit batch size (to support arbitrary spacetime size)\n    # e.g. 
(8, 1024, 4, 14, 14) => (8, 1024, 784)\n    theta = fluid.layers.reshape(theta, shape=(0, 0, -1))\n    theta = fluid.layers.transpose(theta, [0, 2, 1])\n    phi = fluid.layers.reshape(phi, [0, 0, -1])\n    theta_phi = fluid.layers.matmul(theta, phi, name=prefix + '_affinity')\n    g = fluid.layers.reshape(g, [0, 0, -1])\n\n    if nonlocal_params[\"use_softmax\"]:\n        if nonlocal_params[\"use_scale\"]:\n            theta_phi_sc = fluid.layers.scale(theta_phi, scale=dim_inner**-.5)\n        else:\n            theta_phi_sc = theta_phi\n        p = fluid.layers.softmax(theta_phi_sc, name=prefix + '_affinity' + '_prob')\n    else:\n        # the no-softmax branch of the reference implementation is unclear, so it is not supported here\n        raise NotImplementedError(\"The case without softmax is not implemented\")\n\n    # note g's axis[2] corresponds to p's axis[2]\n    # e.g. g(8, 1024, 784_2) * p(8, 784_1, 784_2) => (8, 1024, 784_1)\n    p = fluid.layers.transpose(p, [0, 2, 1])\n    t = fluid.layers.matmul(g, p, name=prefix + '_y')\n\n    # reshape back\n    # e.g. (8, 1024, 784) => (8, 1024, 4, 14, 14)\n    t_shape = t.shape\n    t_re = fluid.layers.reshape(t, shape=list(theta_shape), actual_shape=theta_shape_op)\n    blob_out = t_re\n    blob_out = fluid.layers.conv2d(input = blob_out, num_filters = dim_out, \\\n                                  filter_size = [1, 1], stride = [1, 1], padding = [0, 0], \\\n                                  param_attr = ParamAttr(name = prefix + '_out' + \"_w\", \\\n                                      initializer = fluid.initializer.Constant(value = 0.) 
\\\n                                        if nonlocal_params[\"use_zero_init_conv\"] \\\n                                        else fluid.initializer.Normal(loc = 0.0,\n                                            scale = nonlocal_params[\"conv_init_std\"])), \\\n                                  bias_attr = ParamAttr(name = prefix + '_out' + \"_b\", \\\n                                          initializer = fluid.initializer.Constant(value = 0.)) \\\n                                           if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                                  name = prefix + '_out')\n    blob_out_shape = blob_out.shape\n\n    if nonlocal_params[\"use_bn\"]:\n        bn_name = prefix + \"_bn\"\n        blob_out = fluid.layers.batch_norm(blob_out, \\\n                      # is_test = test_mode, \\\n                      momentum = nonlocal_params[\"bn_momentum\"], \\\n                      epsilon = nonlocal_params[\"bn_epsilon\"], \\\n                      name = bn_name, \\\n                      param_attr = ParamAttr(name = bn_name + \"_s\", \\\n                      initializer = fluid.initializer.Constant(value = nonlocal_params[\"bn_init_gamma\"]), \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      bias_attr = ParamAttr(name = bn_name + \"_b\", \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      moving_mean_name = bn_name + \"_rm\", \\\n                      moving_variance_name = bn_name + \"_riv\") # add bn\n\n    if nonlocal_params[\"use_affine\"]:\n        affine_scale = fluid.layers.create_parameter(\\\n                       shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                       attr=ParamAttr(name=prefix + '_affine' + '_s'), \\\n                       default_initializer = fluid.initializer.Constant(value = 1.))\n        affine_bias = 
fluid.layers.create_parameter(\\\n                      shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                      attr=ParamAttr(name=prefix + '_affine' + '_b'), \\\n                      default_initializer = fluid.initializer.Constant(value = 0.))\n        blob_out = fluid.layers.affine_channel(blob_out, scale = affine_scale, \\\n                      bias = affine_bias, name = prefix + '_affine')   # add affine\n\n    return blob_out\n\n\ndef add_space_nonlocal(input, dim_in, dim_out, prefix, dim_inner):\n    '''\n    add_space_nonlocal:\n        Non-local Neural Networks: see https://arxiv.org/abs/1711.07971\n    '''\n    conv = space_nonlocal(input, dim_in, dim_out, prefix, dim_inner)\n    output = fluid.layers.elementwise_add(input, conv, name=prefix + '_sum')\n    return output\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/processor.py",
    "content": "# coding=utf-8\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        return fr.read().split(\"\\n\")[:-1]\n"
  },
  {
    "path": "modules/image/classification/resnet50_v2_imagenet/resnet.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nfrom collections import OrderedDict\nfrom numbers import Integral\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.framework import Variable\nfrom paddle.fluid.regularizer import L2Decay\nfrom paddle.fluid.initializer import Constant\n\nfrom .nonlocal_helper import add_space_nonlocal\nfrom .name_adapter import NameAdapter\n\n__all__ = ['ResNet', 'ResNetC5']\n\n\nclass ResNet(object):\n    \"\"\"\n    Residual Network, see https://arxiv.org/abs/1512.03385\n    Args:\n        depth (int): ResNet depth, should be 34, 50.\n        freeze_at (int): freeze the backbone at which stage\n        norm_type (str): normalization type, 'bn'/'sync_bn'/'affine_channel'\n        freeze_norm (bool): freeze normalization layers\n        norm_decay (float): weight decay for normalization layer weights\n        variant (str): ResNet variant, supports 'a', 'b', 'c', 'd' currently\n        feature_maps (list): index of stages whose feature maps are returned\n        dcn_v2_stages (list): index of stages who select deformable conv v2\n        nonlocal_stages (list): index of stages who select nonlocal networks\n    \"\"\"\n    __shared__ = ['norm_type', 'freeze_norm', 'weight_prefix_name']\n\n    def __init__(self,\n                 depth=50,\n                 freeze_at=0,\n                 norm_type='sync_bn',\n                 freeze_norm=False,\n                 norm_decay=0.,\n                 variant='d',\n                 feature_maps=[3, 4, 5],\n                 dcn_v2_stages=[],\n                 weight_prefix_name='',\n                 nonlocal_stages=[],\n                 get_prediction=False,\n                 class_dim=1000):\n        super(ResNet, self).__init__()\n\n        if isinstance(feature_maps, Integral):\n            feature_maps = [feature_maps]\n\n        assert 
depth in [34, 50], \\\n            \"depth {} not in [34, 50]\".format(depth)\n        assert variant in ['a', 'b', 'c', 'd'], \"invalid ResNet variant\"\n        assert 0 <= freeze_at <= 4, \"freeze_at should be 0, 1, 2, 3 or 4\"\n        assert len(feature_maps) > 0, \"need one or more feature maps\"\n        assert norm_type in ['bn', 'sync_bn', 'affine_channel']\n        assert not (len(nonlocal_stages) > 0 and depth < 50), \\\n                    \"non-local is not supported for resnet18 or resnet34\"\n\n        self.depth = depth\n        self.freeze_at = freeze_at\n        self.norm_type = norm_type\n        self.norm_decay = norm_decay\n        self.freeze_norm = freeze_norm\n        self.variant = variant\n        self._model_type = 'ResNet'\n        self.feature_maps = feature_maps\n        self.dcn_v2_stages = dcn_v2_stages\n        self.depth_cfg = {\n            34: ([3, 4, 6, 3], self.basicblock),\n            50: ([3, 4, 6, 3], self.bottleneck),\n        }\n        self.stage_filters = [64, 128, 256, 512]\n        self._c1_out_chan_num = 64\n        self.na = NameAdapter(self)\n        self.prefix_name = weight_prefix_name\n\n        self.nonlocal_stages = nonlocal_stages\n        self.nonlocal_mod_cfg = {\n            50: 2,\n            101: 5,\n            152: 8,\n            200: 12,\n        }\n        self.get_prediction = get_prediction\n        self.class_dim = class_dim\n\n    def _conv_offset(self, input, filter_size, stride, padding, act=None, name=None):\n        out_channel = filter_size * filter_size * 3\n        out = fluid.layers.conv2d(\n            input,\n            num_filters=out_channel,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            param_attr=ParamAttr(initializer=Constant(0.0), name=name + \".w_0\"),\n            bias_attr=ParamAttr(initializer=Constant(0.0), name=name + \".b_0\"),\n            act=act,\n            name=name)\n        return out\n\n    def _conv_norm(self, input, 
num_filters, filter_size, stride=1, groups=1, act=None, name=None, dcn_v2=False):\n        _name = self.prefix_name + name if self.prefix_name != '' else name\n        if not dcn_v2:\n            conv = fluid.layers.conv2d(\n                input=input,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                act=None,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + '.conv2d.output.1')\n        else:\n            # select deformable conv\"\n            offset_mask = self._conv_offset(\n                input=input,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                act=None,\n                name=_name + \"_conv_offset\")\n            offset_channel = filter_size**2 * 2\n            mask_channel = filter_size**2\n            offset, mask = fluid.layers.split(input=offset_mask, num_or_sections=[offset_channel, mask_channel], dim=1)\n            mask = fluid.layers.sigmoid(mask)\n            conv = fluid.layers.deformable_conv(\n                input=input,\n                offset=offset,\n                mask=mask,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                deformable_groups=1,\n                im2col_step=1,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + \".conv2d.output.1\")\n\n        bn_name = self.na.fix_conv_norm_name(name)\n        bn_name = self.prefix_name + bn_name if self.prefix_name != '' else bn_name\n\n        norm_lr = 0. 
if self.freeze_norm else 1.\n        norm_decay = self.norm_decay\n        pattr = ParamAttr(name=bn_name + '_scale', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n        battr = ParamAttr(name=bn_name + '_offset', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n\n        if self.norm_type in ['bn', 'sync_bn']:\n            global_stats = True if self.freeze_norm else False\n            out = fluid.layers.batch_norm(\n                input=conv,\n                act=act,\n                name=bn_name + '.output.1',\n                param_attr=pattr,\n                bias_attr=battr,\n                moving_mean_name=bn_name + '_mean',\n                moving_variance_name=bn_name + '_variance',\n                use_global_stats=global_stats)\n            scale = fluid.framework._get_var(pattr.name)\n            bias = fluid.framework._get_var(battr.name)\n        elif self.norm_type == 'affine_channel':\n            scale = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=pattr, default_initializer=fluid.initializer.Constant(1.))\n            bias = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=battr, default_initializer=fluid.initializer.Constant(0.))\n            out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias, act=act)\n        if self.freeze_norm:\n            scale.stop_gradient = True\n            bias.stop_gradient = True\n        return out\n\n    def _shortcut(self, input, ch_out, stride, is_first, name):\n        max_pooling_in_short_cut = self.variant == 'd'\n        ch_in = input.shape[1]\n        # the naming rule is same as pretrained weight\n        name = self.na.fix_shortcut_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if ch_in != ch_out or stride != 1 or (self.depth < 50 and is_first):\n            if std_senet:\n                if is_first:\n                    return 
self._conv_norm(input, ch_out, 1, stride, name=name)\n                else:\n                    return self._conv_norm(input, ch_out, 3, stride, name=name)\n            if max_pooling_in_short_cut and not is_first:\n                input = fluid.layers.pool2d(\n                    input=input, pool_size=2, pool_stride=2, pool_padding=0, ceil_mode=True, pool_type='avg')\n                return self._conv_norm(input, ch_out, 1, 1, name=name)\n            return self._conv_norm(input, ch_out, 1, stride, name=name)\n        else:\n            return input\n\n    def bottleneck(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        if self.variant == 'a':\n            stride1, stride2 = stride, 1\n        else:\n            stride1, stride2 = 1, stride\n\n        # ResNeXt\n        groups = getattr(self, 'groups', 1)\n        group_width = getattr(self, 'group_width', -1)\n        if groups == 1:\n            expand = 4\n        elif (groups * group_width) == 256:\n            expand = 1\n        else:  # FIXME hard code for now, handles 32x4d, 64x4d and 32x8d\n            num_filters = num_filters // 2\n            expand = 2\n\n        conv_name1, conv_name2, conv_name3, \\\n            shortcut_name = self.na.fix_bottleneck_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if std_senet:\n            conv_def = [[int(num_filters / 2), 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n        else:\n            conv_def = [[num_filters, 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n\n        residual = input\n        for i, (c, k, s, act, g, _name) in enumerate(conv_def):\n            residual = self._conv_norm(\n                
input=residual,\n                num_filters=c,\n                filter_size=k,\n                stride=s,\n                act=act,\n                groups=g,\n                name=_name,\n                dcn_v2=(i == 1 and dcn_v2))\n        short = self._shortcut(input, num_filters * expand, stride, is_first=is_first, name=shortcut_name)\n        # Squeeze-and-Excitation\n        if callable(getattr(self, '_squeeze_excitation', None)):\n            residual = self._squeeze_excitation(input=residual, num_channels=num_filters, name='fc' + name)\n        return fluid.layers.elementwise_add(x=short, y=residual, act='relu', name=name + \".add.output.5\")\n\n    def basicblock(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        assert dcn_v2 is False, \"Not implemented yet.\"\n        conv0 = self._conv_norm(\n            input=input, num_filters=num_filters, filter_size=3, act='relu', stride=stride, name=name + \"_branch2a\")\n        conv1 = self._conv_norm(input=conv0, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n        short = self._shortcut(input, num_filters, stride, is_first, name=name + \"_branch1\")\n        return fluid.layers.elementwise_add(x=short, y=conv1, act='relu')\n\n    def layer_warp(self, input, stage_num):\n        \"\"\"\n        Args:\n            input (Variable): input variable.\n            stage_num (int): the stage number, should be 2, 3, 4, 5\n\n        Returns:\n            The last variable in endpoint-th stage.\n        \"\"\"\n        assert stage_num in [2, 3, 4, 5]\n\n        stages, block_func = self.depth_cfg[self.depth]\n        count = stages[stage_num - 2]\n\n        ch_out = self.stage_filters[stage_num - 2]\n        is_first = False if stage_num != 2 else True\n        dcn_v2 = True if stage_num in self.dcn_v2_stages else False\n\n        nonlocal_mod = 1000\n        if stage_num in self.nonlocal_stages:\n            nonlocal_mod = self.nonlocal_mod_cfg[self.depth] if 
stage_num == 4 else 2\n\n        # Make the layer name and parameter name consistent\n        # with ImageNet pre-trained model\n        conv = input\n        for i in range(count):\n            conv_name = self.na.fix_layer_warp_name(stage_num, count, i)\n            if self.depth < 50:\n                is_first = True if i == 0 and stage_num == 2 else False\n            conv = block_func(\n                input=conv,\n                num_filters=ch_out,\n                stride=2 if i == 0 and stage_num != 2 else 1,\n                is_first=is_first,\n                name=conv_name,\n                dcn_v2=dcn_v2)\n\n            # add non local model\n            dim_in = conv.shape[1]\n            nonlocal_name = \"nonlocal_conv{}\".format(stage_num)\n            if i % nonlocal_mod == nonlocal_mod - 1:\n                conv = add_space_nonlocal(conv, dim_in, dim_in, nonlocal_name + '_{}'.format(i), int(dim_in / 2))\n        return conv\n\n    def c1_stage(self, input):\n        out_chan = self._c1_out_chan_num\n\n        conv1_name = self.na.fix_c1_stage_name()\n\n        if self.variant in ['c', 'd']:\n            conv_def = [\n                [out_chan // 2, 3, 2, \"conv1_1\"],\n                [out_chan // 2, 3, 1, \"conv1_2\"],\n                [out_chan, 3, 1, \"conv1_3\"],\n            ]\n        else:\n            conv_def = [[out_chan, 7, 2, conv1_name]]\n\n        for (c, k, s, _name) in conv_def:\n            input = self._conv_norm(input=input, num_filters=c, filter_size=k, stride=s, act='relu', name=_name)\n\n        output = fluid.layers.pool2d(input=input, pool_size=3, pool_stride=2, pool_padding=1, pool_type='max')\n        return output\n\n    def __call__(self, input):\n        assert isinstance(input, Variable)\n        assert not (set(self.feature_maps) - set([2, 3, 4, 5])), \\\n            \"feature maps {} not in [2, 3, 4, 5]\".format(self.feature_maps)\n\n        res_endpoints = []\n\n        res = input\n        feature_maps = 
self.feature_maps\n        severed_head = getattr(self, 'severed_head', False)\n        if not severed_head:\n            res = self.c1_stage(res)\n            feature_maps = range(2, max(self.feature_maps) + 1)\n\n        for i in feature_maps:\n            res = self.layer_warp(res, i)\n            if i in self.feature_maps:\n                res_endpoints.append(res)\n            if self.freeze_at >= i:\n                res.stop_gradient = True\n        if self.get_prediction:\n            pool = fluid.layers.pool2d(input=res, pool_type='avg', global_pooling=True)\n            stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)\n\n            out = fluid.layers.fc(\n                input=pool,\n                size=self.class_dim,\n                param_attr=fluid.param_attr.ParamAttr(initializer=fluid.initializer.Uniform(-stdv, stdv)))\n            out = fluid.layers.softmax(out)\n            return out\n        return OrderedDict(\n            [('res{}_sum'.format(self.feature_maps[idx]), feat) for idx, feat in enumerate(res_endpoints)])\n\n\nclass ResNetC5(ResNet):\n    def __init__(self,\n                 depth=50,\n                 freeze_at=2,\n                 norm_type='affine_channel',\n                 freeze_norm=True,\n                 norm_decay=0.,\n                 variant='b',\n                 feature_maps=[5],\n                 weight_prefix_name=''):\n        super(ResNetC5, self).__init__(depth, freeze_at, norm_type, freeze_norm, norm_decay, variant, feature_maps)\n        self.severed_head = True\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_10w/README.md",
    "content": "# resnet50_vd_10w\n\n|模型名称|resnet50_vd_10w|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet_vd|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|92MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率，ResNet-vd 其实就是 ResNet-D，是ResNet 原始结构的变种。该PaddleHub Module结构为ResNet_vd，使用百度自研的基于10万种类别、4千多万的有标签数据进行训练，接受输入图片大小为224 x 224 x 3，支持finetune。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet50_vd_10w\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_10w\")\n    input_dict, output_dict, program = classifier.context(trainable=True)\n    ```\n\n- ### 2、API\n\n  - ```python\n    def context(trainable=True, pretrained=True)\n    ```\n    - **参数**\n      - trainable (bool): 计算图的参数是否为可训练的；<br/>\n      - pretrained (bool): 是否加载默认的预训练模型。\n\n    - **返回**\n      - inputs (dict): 计算图的输入，key 为 'image', value 为图片的张量；<br/>\n      - outputs (dict): 计算图的输出，key 为 'classification' 和 'feature_map'，其相应的值为：\n        - classification (paddle.fluid.framework.Variable): 分类结果，也就是全连接层的输出；\n        - feature\\_map (paddle.fluid.framework.Variable): 特征匹配，全连接层前面的那个张量。\n      - context\\_prog(fluid.Program): 计算图，用于迁移学习。\n\n\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - **参数**\n      - dirname: 存在模型的目录名称；<br/>\n      - model_filename: 
模型文件名称，默认为\\_\\_model\\_\\_; <br/>\n      - params_filename: 参数文件名称，默认为\\_\\_params\\_\\_(仅当`combined`为True时生效); <br/>\n      - combined: 是否将参数保存到统一的一个文件中。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnet50_vd_10w==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_10w/README_en.md",
    "content": "# resnet50_vd_10w\n\n|Module Name|resnet50_vd_10w|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet_vd|\n|Dataset|Baidu Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|92MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. ResNet-vd is a variant of ResNet. This module is based on ResNet_vd, trained on Baidu dataset(consists of 100 thousand classes, 40 million pairs of data), and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet50_vd_10w\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_10w\")\n    input_dict, output_dict, program = classifier.context(trainable=True)\n    ```\n\n- ### 2、API\n\n  - ```python\n    def context(trainable=True, pretrained=True)\n    ```\n    - **Parameters**\n      - trainable (bool): whether parameters are trainable；<br/>\n      - pretrained (bool): whether load the pre-trained model.\n\n    - **Return**\n      - inputs (dict): model inputs，key is 'image', value is the image tensor；<br/>\n      - outputs (dict): model outputs，key is 'classification' and 'feature_map'，values：\n        - classification 
(paddle.fluid.framework.Variable): classification result;\n        - feature\\_map (paddle.fluid.framework.Variable): feature map extracted by the model.\n      - context\\_prog(fluid.Program): computation graph, used for transfer learning.\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - **Parameters**\n      - dirname: output dir for saving the model; <br/>\n      - model_filename: filename of the model, default is \\_\\_model\\_\\_; <br/>\n      - params_filename: filename of the parameters, default is \\_\\_params\\_\\_ (only effective when `combined` is True); <br/>\n      - combined: whether to save the parameters into one file.\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnet50_vd_10w==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_10w/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n   
     if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet50_vd_10w\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet50_vd_imagenet_ssld is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet50_vd(nn.Layer):\n    
\"\"\"ResNet50_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet50_vd, self).__init__()\n\n        self.layers = 50\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not 
None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet50_vd_10w.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet50_vd_10w.pdparams -O ' +\n                    checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/README.md",
    "content": "# resnet50_vd_animals\n\n|模型名称|resnet50_vd_animals|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet50_vd|\n|数据集|百度自建动物数据集|\n|是否支持Fine-tuning|否|\n|模型大小|154MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n    - ResNet-vd 其实就是 ResNet-D，是ResNet 原始结构的变种，可用于图像分类和特征提取。该 PaddleHub Module 采用百度自建动物数据集训练得到，支持7978种动物的分类识别。\n\n    - 模型的详情可参考[论文](https://arxiv.org/pdf/1812.01187.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install resnet50_vd_animals\n      ```\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n    - ```\n      hub run resnet50_vd_animals --input_path \"/PATH/TO/IMAGE\"\n      ```\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      classifier = hub.Module(name=\"resnet50_vd_animals\")\n\n      result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n      # or\n      # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n      ```\n- ### 3、API\n\n    - ```python\n      def get_expected_image_width()\n      ```\n\n        - 返回预处理的图片宽度，也就是224。\n\n    - ```python\n      def get_expected_image_height()\n      ```\n\n        - 返回预处理的图片高度，也就是224。\n\n    - ```python\n      def get_pretrained_images_mean()\n      ```\n\n        - 返回预处理的图片均值，也就是 \\[0.485, 0.456, 0.406\\]。\n\n    - ```python\n      def get_pretrained_images_std()\n      ```\n\n        - 返回预处理的图片标准差，也就是 \\[0.229, 0.224, 0.225\\]。\n\n\n    - ```python\n      def classification(images=None,\n                         paths=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         top_k=1):\n      ```\n\n        - **参数**\n\n            * images 
(list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR；\n            * paths (list\\[str\\]): 图片的路径；\n            * batch\\_size (int): batch 的大小；\n            * use\\_gpu (bool): 是否使用 GPU 来预测；\n            * top\\_k (int): 返回预测结果的前 k 个。\n\n        - **返回**\n\n            -   res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别动物的类别，value为置信度。\n\n    - ```python\n      def save_inference_model(dirname,\n                               model_filename=None,\n                               params_filename=None,\n                               combined=True)\n      ```\n\n        - 将模型保存到指定路径。\n\n        - **参数**\n\n            * dirname: 保存模型的目录名称\n            * model_filename: 模型文件名称，默认为\\_\\_model\\_\\_\n            * params_filename: 参数文件名称，默认为\\_\\_params\\_\\_(仅当`combined`为True时生效)\n            * combined: 是否将参数保存到统一的一个文件中\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线动物识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n        - ```shell\n          $ hub serving start -m resnet50_vd_animals\n          ```\n\n        - 这样就完成了一个在线动物识别服务化API的部署，默认端口号为8866。\n\n        - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n- 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet50_vd_animals\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_animals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/README_en.md",
    "content": "# resnet50_vd_animals\n\n|Module Name|resnet50_vd_animals|\n| :--- | :---: |\n|Category |Image classification|\n|Network|ResNet50_vd|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|154MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - ResNet-vd is a variant of ResNet, which can be used for image classification and feature extraction. This module is trained by Baidu self-built animal data set and supports the classification and recognition of 7,978 animal species.\n  - For more information, please refer to [ResNet-vd](https://arxiv.org/pdf/1812.01187.pdf)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet50_vd_animals\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet50_vd_animals --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      classifier = hub.Module(name=\"resnet50_vd_animals\")\n\n      result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n      # or\n      # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n      ```\n\n- ### 3、API\n\n    - ```python\n      def get_expected_image_width()\n      ```\n\n        - Returns the preprocessed image width, which is 224.\n\n    - ```python\n      def get_expected_image_height()\n      ```\n\n        - Returns the preprocessed image height, which is 224.\n\n    - ```python\n      def get_pretrained_images_mean()\n      ```\n\n        - Returns the mean value of the preprocessed image, which is \\[0.485, 0.456, 0.406\\].\n\n    - ```python\n      def get_pretrained_images_std()\n      ```\n\n        - Return the standard deviation of the preprocessed image, which is \\[0.229, 0.224, 0.225\\].\n\n\n    - ```python\n      def classification(images=None,\n                         paths=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         top_k=1):\n      ```\n\n        - **Parameter**\n\n            * images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n            * paths (list\\[str\\]): image path;\n            * batch\\_size (int): batch size;\n            * use\\_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n            * top\\_k (int): return the top k prediction results.\n\n        - **Return**\n\n            -   res 
(list\\[dict\\]): The list of classification results; key is the prediction label, value is the corresponding confidence.\n\n    - ```python\n      def save_inference_model(dirname,\n                               model_filename=None,\n                               params_filename=None,\n                               combined=True)\n      ```\n\n        - Save the model to the specified path.\n\n        - **Parameters**\n            * dirname: Save path.\n            * model\\_filename: model file name, default is \\_\\_model\\_\\_\n            * params\\_filename: parameter file name, default is \\_\\_params\\_\\_ (only takes effect when `combined` is True)\n            * combined: Whether to save the parameters to a unified file.\n\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of animal classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m resnet50_vd_animals\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it is not required.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n        # Send an HTTP request\n        data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/resnet50_vd_animals\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        # print prediction results\n        print(r.json()[\"results\"])\n        ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_animals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): image data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/label_list.txt",
    "content": "东方铃蟾\n中国小鲵\n中国树蟾\n北方狭口蛙\n华南湍蛙\n华西蟾蜍\n吻蚓\n哀牢髭蟾\n商城肥鲵\n大凉疣螈\n大鲵\n姬蛙\n弹琴蛙\n极北小鲵\n林蛙\n树蛙\n棘胸蛙\n棘腹蛙\n沼蛙\n泽蛙\n海蛙\n淡肩角蟾\n爪蟾\n牛蛙\n癞蛤蟆\n箭毒蛙\n红点齿蟾\n绿臭蛙\n花背蟾蜍\n花臭蛙\n草蛙\n蓝尾蝾螈\n虎纹蛙\n蝌蚪\n蝾螈\n海蟾蜍\n黑框蟾蜍\n角蛙\n负子蟾\n赤蛙\n金线蛙\n镇海棘螈\n雨蛙\n香港瘰螈\n鱼螈\n鳗螈\n黑斑肥螈\n黑斑蛙\n喜玛拉雅兔\n大耳白兔\n泽西长毛兔\n海棠兔\n荷兰兔\n巨型格仔兔\n巨型花明兔\n新西兰兔\n暹罗兔\n标准金吉拉兔\n比利时野兔\n沙漠棉兔\n法国垂耳兔\n波兰兔\n狮子头兔\n美国费斯垂耳兔\n英国垂耳兔\n英国斑点兔\n英种小型兔\n东北兔\n力斯兔\n加利福尼亚兔\n华南兔\n塔里木兔\n安哥拉兔\n比利时兔\n海南兔\n美国迷你垂耳兔\n美国长毛垂耳兔\n花巨兔\n蒙古兔\n高原兔\n银狐兔\n雷克斯兔\n香槟兔\n九绊犰狳\n仓鼠\n兔豚鼠\n八齿鼠\n布氏田鼠\n日本飞鼠\n星鼻鼹鼠\n更格卢鼠\n松鼠\n毛丝鼠\n水豚\n沙鼠\n河狸\n河狸鼠\n海狸鼠\n花栗鼠\n草原犬鼠\n豚鼠\n豪猪\n迷你刺猬\n针鼹\n马岛猬\n河马\n利木赞牛\n吉安黄牛\n夏洛来牛\n娟珊牛\n安格斯牛\n德克斯特牛\n水牛\n海福特牛\n牦牛\n瑞士褐牛\n羚牛\n西班牙斗牛\n西门塔尔牛\n高地牛\n印度犀\n爪哇犀\n白犀\n苏门答腊犀牛\n黑犀\n三元苗猪\n东北民猪\n两头乌\n太湖猪\n姜曲海猪\n巴克夏猪\n杜洛克猪\n架子猪\n汉普夏猪\n波中猪\n皮特兰\n监利猪\n约克夏猪\n花斑猪\n荣昌猪\n辽宁黑猪\n金华猪\n长白猪\n东北细毛羊\n侏儒山羊\n冰岛羊\n北山羊\n卡拉库尔羊\n印度羚\n原羚\n叉角羚\n安哥拉山羊\n布尔山羊\n捻角山羊\n捻角羚\n杜泊绵羊\n林肯羊\n济宁青山羊\n浏阳黑山羊\n海门山羊\n白绒山羊\n纯种小尾寒羊\n罗姆尼羊\n美丽诺绵羊\n美利奴羊\n萨能山羊\n藏羚羊\n跳羚\n非洲大羚羊\n非洲瞪羚\n黄淮山羊\n黑面羊\n亚洲象\n猛犸象\n非洲象\n貘\n南非长颈鹿\n安哥拉长颈鹿\n马赛长颈鹿\n三河马\n伊犁马\n克莱兹代尔马\n哈萨克马\n德保矮马\n摩根马\n斑马\n普氏野马\n欧洲野马\n汗血马\n河曲马\n田纳西走马\n矮种马\n美洲野马\n蒙古马\n西南马\n西藏野驴\n角马\n阿帕卢萨马\n阿拉伯马\n驴\n骡子\n黄骠马\n原驼\n无峰驼\n羊驼\n野骆驼\n阿拉善双峰驼\n驼马\n梅花鹿\n毛冠鹿\n水鹿\n海南坡鹿\n狍\n白唇鹿\n白尾鹿\n霍加狓\n香獐\n马鹿\n驯鹿\n驼鹿\n麋鹿\n黑尾鹿\n林麝\n马麝\n黑麂\n二趾树懒\n土豚\n树袋熊\n侏儒狨猴\n侏长尾猴\n倭狐猴\n冕狐猴\n几内亚狒狒\n叶猴\n喜山长尾叶猴\n大狐猴\n大猩猩\n婆罗洲猩猩\n婴猴\n山魈\n峨眉山猴\n懒猴\n指猴\n狒狒\n狮面狨\n猕猴\n环尾狐猴\n疣猴\n白脸猴\n白面僧面猴\n白额卷尾猴\n白额长尾猴\n红吼猴\n红领狐猴\n绒毛蛛猴\n绿猴\n肥尾鼠狐猴\n节尾猴\n苏门达腊猩猩\n菲氏叶猴\n豚尾叶猴\n赤猴\n跗猴\n金丝猴\n长尾猴\n长毛蜘蛛猴\n长臂猿\n长鼻猴\n黑吼猴\n黑猩猩\n黑疣猴\n鼠狐猴\n鼬狐猴\n伏翼\n吸血蝠\n大耳蝠\n大蝙蝠\n棕蝠\n狐蝠\n红蝠\n菊头蝠\n马来大狐蝠\n袋鼠\n亚洲黑熊\n北极熊\n国宝大熊猫\n大灰熊\n小熊猫\n懒熊\n棕熊\n浣熊\n狼獾\n眼镜熊\n美洲黑熊\n马来熊\n北极狐\n南非狐\n大耳狐\n孟加拉狐\n山狐\n沙狐\n灰狐\n耳廓狐\n苍狐\n草原狐\n藏狐\n赤狐\n达尔文狐\n阿富汗狐\n丝毛梗\n中亚牧羊犬\n中华田园犬\n中国冠毛犬\n中国昆明犬\n中国沙皮犬\n中国细犬\n中国藏獒\n丹迪丁蒙梗\n京巴\n伯瑞犬\n依比沙猎犬\n俄罗斯南部牧羊犬\n俄罗斯高加索犬\n克伦伯犬\n冰岛牧羊犬\n凯恩梗\n切萨皮克海湾寻回犬\n刚毛指示格里芬犬\n刚毛猎狐梗\n匈牙利维斯拉犬\n博美\n卷毛寻回犬\n卷毛比熊犬\n可卡\n可蒙犬\n史宾格犬\n吉娃娃犬\n哈士奇犬\n哥顿雪达犬\n喜乐蒂\n圣伯纳犬\n大丹\n德国狐狸犬\n大瑞士山地犬\n大白熊\n威尔士柯基\n威尔士梗\n威玛猎犬\n川东猎犬\n巴吉度犬\n八哥犬\n巴西獒犬\n布鲁塞尔格里芬犬\n平毛巡回猎犬\n平毛猎狐梗\n弗兰德牧羊犬\n德国刚毛指示犬\n德国短毛指示犬\n德牧/德国牧羊犬
\n惠比特犬\n意大利卡斯罗犬\n意大利灵缇犬\n拉布拉多\n拉萨犬\n拳师犬\n挪威猎犬\n斯凯梗\n斯塔福斗牛梗\n日本土佐犬\n日本尖嘴犬\n日本柴犬\n日本狆\n日本秋田犬\n曼彻斯特梗\n杜宾犬\n杰克罗素梗\n松狮犬\n查理士王小猎犬\n比利时牧羊犬\n比格猎犬\n法兰德斯畜牧犬\n法兰西斗牛犬\n法国波尔多犬\n法国狼犬\n法国獒犬\n法老王猎犬\n波利犬\n波士顿梗犬\n波音达猎犬\n湖畔梗\n澳大利亚卡尔比犬\n澳大利亚梗\n澳大利亚牧羊犬\n爱尔兰塞特犬\n爱尔兰梗\n爱尔兰水猎犬\n爱尔兰猎狼犬\n爱尔兰长毛猎犬\n爱斯基摩犬\n牛头梗\n猎浣熊犬\n瑞士伯恩山犬\n白色德国牧羊犬\n约克夏梗\n纽波利顿獒犬\n纽芬兰犬\n罗威纳犬\n罗得西亚脊背犬\n比特狗\n美国水猎犬\n美国猎狐犬\n美系秋田犬\n腊肠犬\n芬兰狐狸犬\n苏俄猎狼犬\n苏格兰梗\n苏格兰猎鹿犬\n苏牧/苏格兰牧羊犬\n英国寻血猎犬\n英国猎狐犬\n英国老式牧羊犬\n荷兰毛狮犬\n萨摩耶\n萨路基猎犬\n葡萄牙水犬\n蝴蝶犬\n西施犬\n西班牙加纳利犬\n西藏梗\n西藏猎犬\n西里汉梗\n西高地白梗\n豺狗\n贝灵顿梗\n贵宾犬/贵妇犬\n软毛麦色梗\n边境梗\n边境牧羊犬\n迷你雪那瑞\n迷你鹿犬\n金毛犬\n长须牧羊犬\n阿富汗猎犬\n阿拉斯加雪橇犬\n阿根廷杜高犬\n雪纳瑞/标准雪纳瑞\n马尔济斯犬/玛尔基斯犬\n狮子\n亚洲胡狼\n北极狼\n印度狼\n埃塞俄比亚狼\n墨西哥狼\n意大利狼\n日本狼\n灰狼\n红狼\n藏狼\n袋狼\n貉\n郊狼\n鬃狼\n三色猫\n东奇尼猫\n东方短毛猫\n东方长毛猫\n亚洲猫\n伯曼猫\n俄罗斯蓝猫\n加州闪亮猫\n卡特尔猫\n哈瓦那棕猫\n喜马拉雅猫\n四川简州猫\n土耳其安哥拉猫\n土耳其梵猫\n埃及猫\n威尔士猫\n孟买猫\n巴厘猫\n布偶猫\n彼得秃猫\n德国卷毛猫\n拿破仑猫\n挪威森林猫\n斯芬克斯猫\n新加坡猫\n日本短尾猫\n曼切堪猫\n果子狸\n欧洲短毛猫\n波斯猫\n波米拉猫\n泰国暹罗猫\n澳大利亚雾猫\n灵猫\n热带草原猫\n爪哇猫\n狮子猫\n科拉特猫\n索马里猫\n缅因猫\n缅甸猫\n美国反耳猫\n美国短尾猫\n美国短毛猫\n美国硬毛猫\n肯尼亚猫\n苏格兰折耳猫\n英国短毛猫\n英国长毛猫\n虎猫\n褴褛猫\n西伯利亚猫\n豹猫\n重点色短毛猫\n金吉拉猫\n阿比西尼亚猫\n雪鞋猫\n马恩岛猫\n华南虎\n印度支那虎\n孟加拉虎\n新疆虎\n美洲虎\n苏门答腊虎\n西伯利亚虎\n云豹\n亚洲猎豹\n南非猎豹\n印度支那豹\n斯里兰卡豹\n波斯豹\n远东豹\n阿拉伯豹\n雪豹\n非洲豹\n黑豹\n海狗\n海狮\n海象\n海豹\n伶鼬\n巨獭\n日本貂\n松貂\n树懒\n水獭\n水貂\n江獭\n海獭\n渔貂\n狐鼬\n猞猁\n猪獾\n白鼬\n石貂\n紫貂\n美洲獾\n美洲貂\n艾鼬\n蜜獾\n食蚁兽\n香鼬\n黄喉貂\n黄鼬\n黑足鼬\n鼬獾\n小头鼠海豚\n巨头鲸\n抹香鲸\n江豚\n河豚\n海牛\n中华白海豚\n宽吻海豚\n白鳍豚\n真海豚\n独角鲸\n瓜头鲸\n白鲸\n短肢领航鲸\n蓝鲸\n虎鲸\n角岛鲸\n长肢领航鲸\n露脊鲸\n小须鲸\n布氏鲸\n灰鲸\n鳁鲸\n齿鲸\n鸭嘴兽\n丝虫\n利什曼虫\n吸吮线虫\n毛囊螨\n猪肉寄生虫\n盲鳗\n蓝氏贾第鞭毛虫\n蛔虫\n血吸虫\n钩虫\n阿米巴\n太平洋牡蛎\n海参\n海鳃\n黄金螺\n两头蛇\n乌梢蛇\n剑纹带蛇\n响尾蛇\n圆斑蝰\n太攀蛇\n山王蛇\n岩蟒\n束带蛇\n棕伊澳蛇\n森蚺\n水蚺\n水蛇\n水蟒\n沙漠王蛇\n沙漠腹蛇\n沙蟒\n海蛇\n烙铁头\n牛奶蛇\n猪鼻蛇\n玉米蛇\n玫瑰蟒\n环颈蛇\n琴蛇\n白唇竹叶青蛇\n白头蛇\n白眉蝮蛇\n白蟒\n盲蛇\n眼镜蛇\n红蟒\n线虫\n绿曼巴蛇\n绿树蟒\n绿瘦蛇\n绿蟒\n缅甸蟒\n腾蛇\n花浪蛇\n草蛇\n菱斑响尾蛇\n葡萄树蛇\n蕲蛇\n虎斑游蛇\n虎蛇\n蝮蛇\n蝰蛇\n蟒蛇\n裂须海蛇\n赤链蛇\n金环蛇\n金花蛇\n铜斑蛇\n银环蛇\n锦蛇\n闪鳞蛇\n魔鬼蛇\n黄脊游蛇\n黑蛇\n鼓腹巨蝰\n丽斑麻蜥\n加拉帕戈斯海鬣蜥\n北草蜥\n变色树蜥\n变色蜥\n墨西哥毒蜥\n壁虎\n尼罗河巨蜥\n平原巨蜥\n无脚蜥蜴\n楔齿蜥\n沙漠角蜥\n澳洲魔蜥\n犀牛鬣蜥\n王者蜥\n科莫多巨蜥\n绿鬣蜥\n蓝尾金蜥\n蓝舌蜥\n豹纹守宫\n马蛇子\n鬃狮蜥\n麻蜥\n古巴鳄\n奥利诺科鳄\n宽吻凯门鳄\n尼罗鳄\n巴拉圭凯门鳄\n暹罗鳄\n沼泽鳄\n湾鳄\n澳洲淡水鳄\n眼镜凯门鳄\n短吻鳄\n福鳄\n美洲鳄\n菲律宾鳄\n钝吻古鳄\n锥吻古
鳄\n长嘴鳄\n食鱼鳄\n马来鳄\n黑凯门鳄\n中南半岛大鳖\n丽龟\n亚达伯拉象龟\n佛罗里达鳖\n几何陆龟\n凹甲陆龟\n加拉帕戈斯象龟\n南美小鳄龟\n卡罗莱纳箱龟\n印度星龟\n印度陆龟\n龟鳖四爪陆龟\n地龟\n埃及陆龟\n塞舌尔巨龟\n墨西哥箱龟\n安哥洛卡象龟\n山瑞鳖\n山鳖\n山龟\n巨型侧颈龟\n巴西彩龟\n平背龟\n扁尾陆龟\n斑鳖\n棱皮龟\n欧洲陆龟\n沙漠箱龟\n沼泽箱龟\n泥龟\n海龟\n湾岸箱龟\n牟氏水龟\n玛塔龟\n盒龟\n眼斑水龟\n红腿陆龟\n缅甸星龟\n缅甸陆龟\n美国鳖\n翘缘陆龟\n花龟\n苏卡达象龟\n蛛网陆龟\n西部箱龟\n豹纹陆龟\n赫曼陆龟\n辐射陆龟\n钟纹折背陆龟\n锦箱龟\n闭壳龟\n阿尔达布拉象龟\n阿根廷陆龟\n非洲折背陆龟\n饼干陆龟\n马来西亚巨龟\n马来鳖\n鹰嘴龟\n黄喉拟水龟\n黄斑侧颈龟\n黑颈乌龟\n黑鳖\n齿缘摄龟\n山蛭\n水丝蚓\n沙蚕\n水蛭\n蚯蚓\n面象虫\n僧帽水母\n栉水母\n根口水母\n水螅\n海羽星\n海葵\n淡水水母\n深海水母\n盒水母\n立方水母\n管水母目动物\n箱水母\n钵水母\n霞水母\n三叶虫\n蚰蜒\n中国红巨龙蜈蚣\n亚马逊巨人蜈蚣\n加拉帕格斯巨人蜈蚣\n北美巨人蜈蚣\n哈氏蜈蚣\n多棘蜈蚣\n少棘蜈蚣\n秘鲁巨人蜈蚣\n越南巨人蜈蚣\n马陆\n叩头虫\n吉丁虫\n大蜡螟\n大麦虫\n天幕毛虫\n天牛\n姬兜\n孑孓\n射炮步甲\n小蠹科\n尼科巴弓背蚁\n斑蝥\n木蛀虫\n树蟋\n棉蚜虫\n樗蚕\n水黾\n活板门蛛\n犀牛甲虫\n独角仙\n狮蚁\n瓢虫\n田鳖\n石蚕\n稚虫\n竹节虫\n粘虫\n胭脂虫\n腻虫\n芫菁\n菜青虫\n萤火虫\n蓟马\n虎甲\n体虱\n头虱\n木虱\n粉虱\n跳蚤\n阴虱\n飞虱\n虻\n蚂蚁\n切叶蚁\n子弹蚁\n微距蚂蚁\n斗牛犬蚁\n牛头犬蚁\n猛蚁\n织叶蚁\n臭蚁\n行军蚁\n阿根廷蚁\n三带喙库蚊\n埃及伊蚊\n库蚊\n按蚊\n白蚁\n长脚蚊\n蚜虫\n草蛉\n草蜻蛉\n蚊蝎蛉\n蝶角蛉\n变色夜蛾\n地老虎\n食心虫\n大造桥虫\n家蚕\n尺蛾\n帝王蛾\n斜纹夜蛾\n月形天蚕蛾\n枯叶蛾\n梨大食心虫\n棉红铃虫\n棉褐带卷蛾\n棉铃虫\n灯蛾\n烟草天蛾\n甜菜夜蛾\n眉纹天蚕蛾\n稻苞虫\n箩纹蛾\n美国白蛾\n舞毒蛾\n舟形毛虫\n苹果蠹蛾\n葡萄天蛾\n虎蛾\n螟蛾\n衣蛾\n透翅蛾\n马铃薯块茎蛾\n鹿蛾\n东方蜜蜂\n切叶蜂\n姬蜂\n木蜂\n瘿蜂\n虎头蜂\n蚁蜂\n蜂王\n西方蜜蜂\n长脚蜂\n熊蜂\n非洲蜂\n黑蜂\n蜉蝣\n乐仙蜻蜓\n杜松蜻蜓\n碧伟蜓\n红蜻蜓\n薄翅蜻蜓\n丝光绿蝇\n地中海实蝇\n寄蝇\n果蝇\n石蝇\n突眼蝇\n粪蝇\n胃蝇\n舌蝇\n苍蝇\n蚤蝇\n蝇蛆\n采采蝇\n食蚜蝇\n马蝇\n蝈蝈\n八点广翅蜡蝉\n斑衣蜡蝉\n柿广翅蜡蝉\n沫蝉\n角蝉\n龙眼鸡\n蝗虫\n副王蛱蝶\n喙凤蝶\n斑蝶\n曙凤蝶\n木兰青凤蝶\n灰蝶\n环蝶\n白蝴蝶\n眼蝶\n稻弄蝶\n粉蝶\n紫斑蝶\n纹白蝶\n绢蝶\n蓝蝶\n虎斑蝶\n蛱蝶\n赤蛱蝶\n金凤蝶\n金带喙凤蝶\n金裳凤蝶\n闪蝶\n黄斑弄蝶\n黄裳凤蝶\n蝼蛄\n牧草盲蝽\n硕蝽\n稻黑蝽\n荔蝽\n蝽\n麻皮蝽\n螳螂\n棕静螳\n北京油葫芦\n大棺头蟋蟀\n斗蟋\n中华灶蟋\n蟑螂\n蠓\n豆娘\n甘薯蚁象\n轮虫\n金龟子\n锹形虫\n长戟大兜\n隐翅虫\n马铃薯甲虫\n黄足黄守瓜\n龙虱\n东方扁虾\n中国对虾\n中国毛虾\n中国龙虾\n克氏原螯虾\n南极虾\n斑琴虾蛄\n日本对虾\n日本沼虾\n日本龙虾\n明虾\n杂色龙虾\n棘刺龙虾\n波纹龙虾\n白对虾\n白须龙虾\n皮皮虾\n红琵琶虾\n草虾\n蓝魔虾\n蜜蜂虾\n蝉虾\n螯虾\n褐虾\n铠甲虾\n锦绣龙虾\n长臂虾\n中华虎头蟹\n关公蟹\n北海道毛蟹\n圣诞岛红蟹\n大闸蟹\n寄居蟹\n帝王蟹\n拟穴青蟹\n旭蟹\n梭子蟹\n江蟹\n沙蟹\n珍宝蟹\n瓷蟹\n甘氏巨螯蟹\n短足拟石蟹\n石蟹\n老虎蟹\n蓝蟹\n蜘蛛蟹\n远海梭子蟹\n锈斑蟳\n锯齿溪蟹\n雪蟹\n面包蟹\n香槟蟹\n鹅颈藤壶\n印度华丽雨林蜘蛛\n圆蛛\n地蛛\n塔兰图拉毒蛛\n墨西哥火脚蜘蛛\n姬蛛\n巴西白膝头蜘蛛\n捕鱼蛛\n捕鸟蛛\n棒络新妇\n橙巴布蜘蛛\n毛蜘蛛\n水蜘蛛\n洪都拉斯卷毛蜘蛛\n漏斗蛛\n火玫瑰蜘蛛\n欧洲狼蛛\n白额巨蟹蛛\n皇帝巴布蜘蛛\n盗蛛\n盲蜘蛛\n红蜘蛛\n蟹蛛\n象牙华丽雨林蜘蛛\n跳蛛\n避日蛛\n金属蓝蜘蛛\n长脚盲蛛\n高脚蜘蛛\n黑腹狼蛛\n蜱虫\n东亚钳蝎\n中东金蝎\n亚洲雨林蝎
\n以色列杀人蝎\n八重山蝎\n利比亚金蝎\n北非黑肥尾蝎\n双针蝎\n土耳其黑肥尾蝎\n巴西金幽灵\n帝王蝎\n沙漠金蝎\n红爪蝎\n红爪雨林蝎\n荧光蝎\n雷达蝎\n鞭蝎\n马来西亚雨林蝎\n黄肥尾蝎\n黑粗尾蝎\n鲎\n乌贼\n九孔螺\n吸血鬼乌贼\n唐冠螺\n囊螺\n帽贝\n文蛤\n杂色鲍\n栉孔扇贝\n椎实螺\n河蚌\n泥螺\n海湾扇贝\n海蛞蝓\n烟管螺\n玉螺\n玉黍螺\n珍珠贝\n珠蚌\n田螺\n白玉蜗牛\n石鳖\n竹蛏\n船蛆\n船蛸\n芋螺\n虎斑宝贝\n蚬\n蚶\n蛞蝓\n蜒螺\n货贝\n贻贝\n马氏珍珠贝\n鱿鱼\n鲍\n鹦鹉螺\n黄宝螺\n三文鱼\n马哈鱼\n主刺盖鱼\n二长棘鲷\n六带石斑鱼\n六斑刺鲀\n刺鲳\n半滑舌鳎\n吉富罗非鱼\n四指马鲅\n土耳其神仙\n多鳞鱚\n大弹涂鱼\n大眼鲷\n大西洋庸鲽\n宝刀鱼\n宝石石斑鱼\n宽体舌鳎\n射水鱼\n小丑鱼\n小公鱼\n小蜜蜂鱼\n带鱼\n斑点鸡笼鲳\n斑鰶\n旗鱼\n日本刺尾鱼\n日本金线鱼\n星点东方鲀\n暗纹东方鲀\n木叶鲽\n条纹东方鲀\n柔鱼\n横带髭鲷\n毛烟管鱼\n沙丁鱼\n海鲶\n海鳗\n澳洲三间火箭\n澳洲彩虹鱼\n灰星鲨\n牛头鲷\n狼牙鰕虎鱼\n白姑鱼\n白斑星鲨\n白胸刺尾鱼\n皮氏叫姑鱼\n短鳍红娘鱼\n石斑鱼\n石鲽\n红笛鲷\n红苹果美人\n绿河豚\n绿鳍马面鲀\n绿鳍鱼\n美丽硬仆骨舌鱼\n美国鲥鱼\n羽毛关刀\n胡椒鲷\n草莓鱼\n荷兰凤凰\n蓝点马鲛\n蓝线雀\n蓝魔鬼鱼\n虫纹东方鲀\n蝶鱼科\n褐牙鲆\n褐菖鲉\n角蝶\n赤魟\n路氏双髻鲨\n达氏鲟\n遮目鱼\n金线鱼\n霞蝶\n青干金枪鱼\n马面鱼\n高眼鲽\n鮸鱼\n鲅鱼\n鲈鱼\n鲐\n鲐鱼\n鲨鱼\n尖吻鲭鲨\n青鲨\n鲱\n鲳鱼\n刺鳐\n蓝点魟\n鳕鱼\n鳟鱼\n麒麟神仙\n黄姑鱼\n黄尾副刺尾鱼\n黄火箭\n黄鲷\n黄鳍马面鲀\n黑鮶\n黑鲷\n黑鳃刺尾鱼\n一眉道人鱼\n一线铅笔\n七星刀鱼\n三段红白锦鲤\n三色水泡\n三色虎头\n三间鼠鱼\n中华细鲫\n中华花鳅\n丹顶锦鲤鱼\n乌云盖雪\n乌鲤\n五点铅笔\n五花兰寿\n五花狮\n五花短尾琉金\n五花龙睛\n企鹅灯\n凤尾龙睛\n刚果恐龙王\n刚果扯旗\n包金狮\n华鲮\n印加鹦鹉\n印第安鼠\n叉尾斗鱼\n反天刀\n反游猫\n口红红白锦鲤\n咖啡鼠鱼\n唐鱼\n喷火灯鱼\n团头鲂\n地图鱼\n埃及神仙\n墨头鱼\n墨衣锦鲤\n墨龙睛\n壮体沙鳅\n夜明珠鱼\n大和锦\n大花恐龙\n头尾灯鱼\n奥尼罗非鱼\n孔雀光写锦鲤\n孔雀鱼\n孔雀黄金锦鲤\n宁德鲤鱼溪\n宽鳍鱲\n富士红白锦鲤\n寿星虎头\n小精灵鱼\n尖嘴铅笔\n尼罗罗非鱼\n巨骨舌鱼\n帝王老虎魟\n弓鱼\n彩虹鲨\n德国三色锦鲤\n德国写鲤\n德国别甲锦鲤\n德国昭和锦鲤\n德州豹\n成吉思汗鱼\n扯旗红水泡\n拿破仑红白锦鲤\n接吻鱼\n斑尾凤凰\n斑点短鲷\n斑节恐龙\n斑马鱼\n新大钩扯旗\n新疆大头鱼\n星点龙\n朱球高头珍珠\n朱砂红白水泡\n条纹小鲃\n柠檬灯鱼\n樱花短尾琉金\n水浅黄锦鲤\n沙鳅\n泥鳅\n浅黄三色锦鲤\n清苔鼠\n清道夫\n漂亮宝贝鱼\n火口鱼\n火焰变色龙鱼\n火焰灯鱼\n火焰铅笔\n火鹤鱼\n熊猫异形鱼\n熊猫鼠\n燕子美人\n玉兔金鱼\n玉印头皇冠\n玉顶蝶尾\n玉顶银狮\n王字虎头\n玛丽鱼\n玫瑰旗\n玻利维亚凤凰\n玻璃扯旗鱼\n玻璃拉拉\n珍珠灯鱼\n珍珠马甲\n珍珠魟\n瓦氏雅罗鱼\n电鲶\n画眉鱼\n白云山鱼\n白云金丝鱼\n白写锦鲤\n白水泡\n白豹鼠\n皇冠三间\n皇冠六间\n皇冠棋盘\n皇冠直升机鱼\n神仙鱼\n紫望天\n紫白龙睛\n紫罗兰鼠\n紫蝶尾\n紫身枝牙虾虎\n红十字鱼\n红头鼠\n红宝石鱼\n红寿\n红尾梦幻旗\n红尾玻璃\n红尾皇冠\n红尾金龙鱼\n红尾鲶\n红尾黑鲨\n红扯旗\n红望天\n红水泡\n红玫瑰鱼\n红珍珠鱼\n红白兰寿\n红白狮头龙睛\n红白皇冠\n红白短尾琉金\n红白蝶尾\n红绿灯鱼\n红背朱砂水泡\n红衣梦幻旗\n红顶五花狮\n红顶兰寿\n红顶猫狮\n红顶虎头\n红顶黑珍珠\n红魔鬼鱼\n红鳍鲌\n红鼻剪刀鱼\n红龙鱼\n纹唇鱼\n细鳞斜颌鲴\n绯昭和锦鲤\n维吉塔短鲷\n绿巨人鱼\n绿裳红剑尾坦克\n罗汉鱼\n翘嘴红鲌\n花斑副沙鳅\n花椒鼠\n花秋翠锦鲤\n花老虎\n苏眉鱼\n茶鲤\n草绳恐龙\n草鱼\n荧鳍龙睛\n荧鳞短尾琉金\n蓝宝石鱼\n蓝帝王灯\n蓝曼龙\n蓝灯鱼\n蓝眼鳉\n蓝蝶尾\n蓝衣锦鲤\n蓝袖短鲷\n虎皮鱼\n虎纹恐龙王\n血鹦鹉\n豹魟\n贴分黄金锦鲤\n赤三色锦鲤\n赤别甲锦鲤\n赤眼鳟\n过背金龙鱼\n
酋长短鲷\n金头黑白狮\n金恐龙鱼\n金曼龙\n金松叶锦鲤\n金点魟\n金翅帝王鼠\n金菠萝鱼\n钻石灯鱼\n钻石皇冠豹\n铜盆鱼\n银松叶锦鲤\n银燕子\n银鲴\n银鳞三色锦鲤\n银龙鱼\n长尾紫琉金\n长尾红白琉金\n长尾黑琉金\n长薄鳅\n闪电红白锦鲤\n阿卡西短鲷\n霓虹燕子\n青海湖裸鲤\n青铜鼠\n青鱼\n非洲凤凰\n非洲十间\n非洲王子鱼\n马口鱼\n鲂\n鲇鱼\n鲢\n鳄鱼恐龙王\n鳊\n鳙\n鳡\n鹅头红\n黄写锦鲤\n黄金达摩\n黄金鳉\n黄鲤\n黑旗鱼\n黑望天\n黑白水泡\n黑白蝶尾\n黑白魟\n黑线飞狐鱼\n黑莲灯\n黑裙鱼\n黑金红头鼠\n黑龙鱼\n龙睛珍珠\n龙睛虎头\n加岛环企鹅\n南跳岩企鹅\n小蓝企鹅\n帝王企鹅\n帽带企鹅\n斑嘴环企鹅\n斯岛黄眉企鹅\n洪堡企鹅\n白颊黄眉企鹅\n金图企鹅\n阿德利企鹅\n马可罗尼企鹅\n麦哲伦企鹅\n黄眉企鹅\n美洲鸵\n鸵鸟\n鸸鹋\n鹤鸵\n三宝鸟\n冠鱼狗\n斑鱼狗\n栗喉蜂虎\n棕胸佛法僧\n犀鸟\n白领翡翠\n笑翠鸟\n绿喉蜂虎\n蓝喉蜂虎\n蓝翡翠\n蓝胸佛法僧\n蓝须夜蜂虎\n蓝颊蜂虎\n赤翡翠\n鹳嘴翡翠\n黄喉蜂虎\n漂泊信天翁\n潜鸟\n乌灰鹞\n乌雕\n兀鹫\n冕雕\n冠雕\n冠鹰雕\n凤头卡拉鹰\n凤头蜂鹰\n凤头鹃隼\n凤头鹰\n加州神鹫\n南非兀鹫\n卡拉卡拉鹰\n双齿鹰\n大鵟\n小雀鹰\n小雕\n巨翅鵟\n斑头雁\n日本松雀鹰\n普通鵟\n暗色歌鹰\n条纹鹰\n栗翅鹰\n栗鸢\n棕尾鵟\n棕榈鹫\n棕翅鵟鹰\n棕腹隼雕\n楔尾雕\n毛腿鵟\n海雕\n淡色歌鹰\n渔雕\n游隼\n灰背隼\n灰脸鵟鹰\n灰隼\n燕尾鸢\n燕隼\n爪哇鹰雕\n猎隼\n猛隼\n猛雕\n王鹫\n白兀鹫\n白头鹞\n白眼鵟鹰\n白肩雕\n白腹雕\n白腿小隼\n皱脸秃鹫\n矛隼\n短尾雕\n短趾雕\n秃鹫\n红头美洲鹫\n红尾鵟\n红脚隼\n红腿小隼\n红隼\n红鸢\n纹翅鸢\n美洲隼\n肉垂秃鹫\n胡兀鹫\n苍鹭\n苍鹰\n茶色雕\n草原隼\n草原鹰\n菲律宾雕\n菲律宾鹰\n蛇雕\n角雕\n赤腹鹰\n赤鸢\n赤鹰\n金雕\n阿尔泰隼\n雀鹰\n非洲鹰\n食猴鹰\n食猿雕\n饰冠鹰雕\n马来鹰雕\n鱼雕\n鸡鹰\n鹃头蜂鹰\n鹗\n鹞\n鹰雕\n黄爪隼\n黑冠鹃隼\n黑美洲鹫\n黑翅鸢\n黑耳鸢\n黑隼\n黑雕\n黑鸢\n三趾鸦雀\n三道眉草鹀\n丛林鸦\n东方大苇莺\n主红雀\n丽色奇鹛\n乌嘴柳莺\n乌鸦\n云雀\n仙八色鸫\n伯劳\n光背地鸫\n八哥\n冕雀\n冠蓝鸦\n冬鹪鹩\n凤头雀莺\n凤头鹀\n剑嘴鹛\n动冠伞鸟\n北扑翅鴷\n北朱雀\n北红尾鸲\n北鹨\n印支绿鹊\n厚嘴苇莺\n叉尾太阳鸟\n双辫八色鸫\n发冠卷尾\n台湾紫啸鸫\n台湾蓝鹊\n台湾黄山雀\n和平鸟\n咬鹃\n唐纳雀\n赤胸啄木鸟\n金喉拟啄木鸟\n黄纹拟啄木鸟\n斑姬啄木鸟\n黄嘴栗啄木鸟\n竹啄木鸟\n白翅啄木鸟\n三趾啄木鸟\n黄颈啄木鸟\n小星头啄木鸟\n绿啄木鸟\n白背啄木鸟\n红头啄木鸟\n栗啄木鸟\n白腹黑啄木鸟\n棕腹啄木鸟\n黄冠啄木鸟\n大斑啄木鸟\n小斑啄木鸟\n纹胸啄木鸟\n喜鹊\n园丁鸟\n夏候鸟\n夜鹰\n夜莺\n新疆歌鸲\n大噪鹛\n大朱雀\n大盘尾\n大绿雀鹎\n大苇莺\n大草鹛\n太平鸟\n宝兴鹛雀\n家鸦\n小仙鹟\n小燕尾\n小盘尾\n小鹀\n屠夫鸟\n山噪鹛\n山蓝仙鹟\n山雀\n山鹛\n山鹡鸰\n山鹨\n山麻雀\n岛鸫\n岩燕\n巨嘴沙雀\n布谷鸟\n弄岗穗鹛\n强脚树莺\n戈氏岩鹀\n戴菊\n拟大朱雀\n文须雀\n文鸟\n斑喉希鹛\n斑椋鸟\n斑翅朱雀\n斑背噪鹛\n斑背大尾莺\n斑胸短翅莺\n斑胸钩嘴鹛\n斑腰燕\n斑颈穗鹛\n旋木雀\n日本树莺\n日本歌鸲\n日本鹡鸰\n星鸦\n普通鳾\n暗冠蓝鸦\n暗绿柳莺\n暗胸朱雀\n暗色鸦雀\n曙红朱雀\n朱背啄花鸟\n朱鹂\n杂色山雀\n条纹噪鹛\n松雀\n松鸦\n极北朱顶雀\n林岭雀\n林柳莺\n林鹨\n柳莺\n树燕\n栗头八色鸫\n栗头地莺\n栗头雀鹛\n栗头鹟莺\n栗背短脚鹎\n栗腹歌鸲\n栗腹矶鸫\n栗臀噪鹛\n栗臀鳾\n栗颈噪鹛\n棕噪鹛\n棕头钩嘴鹛\n棕头雀鹛\n棕头鸦雀\n棕扇尾莺\n棕朱雀\n棕眉山岩鹨\n棕眉柳莺\n棕背雪雀\n棕胸蓝姬鹟\n棕脸鹟莺\n棕腹大仙鹟\n棕腹树鹊\n棕草鹛\n棕褐短翅莺\n棕顶树莺\n棕颈雪雀\n椋鸟\n楼燕\n横斑林莺\n橙头地鸫\n橙斑翅柳莺\n橙翅噪鹛\n橙胸姬鹟\n橙腹叶鹎\n歌百灵\n毛脚燕\n水蒲苇莺\n水鸫\n沙色朱雀\n河乌\n沼泽大尾莺\n泥色鸫\n海南蓝仙鹟\n海雀\n海鸠\n淡脚树莺\n淡黄腰柳莺\n火冠雀\
n火尾太阳鸟\n火尾绿鹛\n灰冠鸦雀\n灰卷尾\n灰喉山椒鸟\n灰喉鸦雀\n灰头斑翅鹛\n灰头椋鸟\n灰头灰雀\n灰头钩嘴鹛\n灰头鸦雀\n灰头鸫\n灰头鹀\n灰山椒鸟\n灰树鹊\n灰椋鸟\n灰眶雀鹛\n灰短脚鹎\n灰翅噪鹛\n灰翅鸫\n灰背燕尾\n灰背鸫\n灰脸鹟莺\n灰腹噪鹛\n灰蓝山雀\n灰颈鹀\n灰鹀\n灰鹡鸰\n灶鸟\n点翅朱雀\n点胸鸦雀\n烟柳莺\n烟腹毛脚燕\n煤山雀\n燕子\n燕雀\n玉山噪鹛\n珠颈斑鹑\n理氏鹨\n琴鸟\n田鸫\n画眉\n白冠噪鹛\n白冠攀雀\n白喉噪鹛\n白喉姬鹟\n白喉扇尾鹟\n白喉林莺\n白喉林鹟\n白喉短翅鸫\n白喉矶鸫\n白喉红尾鸲\n白喉红臀鹎\n白尾蓝仙鹟\n白斑翅雪雀\n白眉地鸫\n白眉姬\n白眉姬鹟\n白眉山雀\n白眉扇尾鹟\n白眉朱雀\n白眉歌鸫\n白眉蓝姬鹟\n白眉雀鹛\n白眶斑翅鹛\n白眶雀鹛\n白眶鸦雀\n白眶鹟莺\n白翅交嘴雀\n白脸山雀\n白腰朱顶雀\n白腰雪雀\n白腹凤鹛\n白腹短翅鸲\n白腹鸫\n白领凤鹛\n白颈鸦\n白颈鸫\n白颊噪鹛\n白颊鹎\n白鹡鸰\n百灵鸟\n盘尾树鹊\n相思鸟\n眼纹噪鹛\n矛斑蝗莺\n知更鸟\n短嘴山椒鸟\n石雀\n禾花雀\n稻田苇莺\n粉红山椒鸟\n粉红椋鸟\n粉红腹岭雀\n紫啸鸫\n紫翅椋鸟\n紫背椋鸟\n红交嘴雀\n红喉姬鹟\n红喉歌鸲\n红喉鹨\n红嘴山鸦\n红嘴蓝鹊\n红嘴钩嘴鹛\n红嘴鸦雀\n红头灰雀\n红头穗鹛\n红头鸦雀\n红尾水鸲\n红尾鸫\n红梅花雀\n红眉朱雀\n红翅旋壁雀\n红翅薮鹛\n红翼鸫\n红胁蓝尾鸲\n红背红尾鸲\n红胸啄花鸟\n红腰朱雀\n红腹灰雀\n红腹红尾鸲\n纯色噪鹛\n纯色山鹪莺\n纵纹绿鹎\n纹喉鹎\n纹背捕蛛鸟\n细纹噪鹛\n细纹苇莺\n织巢鸟\n织布鸟\n绣眼鸟\n维达鸟\n绿喉太阳鸟\n绿翅短脚鹎\n绿背山雀\n绿胸八色鸫\n缎蓝亭鸟\n翠鸟\n花彩雀莺\n草地鹨\n草雀\n蒙古沙雀\n蓝八色鸫\n蓝喉歌鸲\n蓝大翅鸲\n蓝头矶鸫\n蓝枕八色鸫\n蓝枕花蜜鸟\n蓝歌鸲\n蓝短翅鸫\n蓝矶鸫\n蓝绿鹊\n蓝翅八色鸫\n蓝翅噪鹛\n藏雀\n藏鹀\n藏黄雀\n虎斑地鸫\n蚁鸟\n蜂鸟\n蜡嘴雀\n血雀\n褐冠山雀\n褐头凤鹛\n褐头岭雀\n褐岩鹨\n褐灰雀\n褐翅雪雀\n褐翅鸦雀\n褐胁雀鹛\n褐脸雀鹛\n西域山雀\n贺兰山岩鹨\n赤朱雀\n赤红山椒鸟\n远东树莺\n远东苇莺\n酒红朱雀\n金丝雀鸟\n金冠地莺\n金冠树八哥\n金头扇尾莺\n金头穗鹛\n金枕黑雀\n金眼鹛雀\n金翅雀\n金胸雀鹛\n金色林鸲\n金色鸦雀\n金额丝雀\n金额雀鹛\n钩嘴鹛\n铜蓝鹟\n银胸丝冠鸟\n锈腹短翅鸫\n长嘴地鸫\n长嘴捕蛛鸟\n长嘴钩嘴鹛\n长尾地鸫\n长尾奇鹛\n长尾山椒鸟\n长尾缝叶莺\n长尾雀\n长尾鹩鹛\n阔嘴鸟\n雨燕\n雪鹀\n震旦鸦雀\n靛冠噪鹛\n靛蓝彩鹀\n靴篱莺\n领岩鹨\n食蜂鸟\n高山雀鹛\n鳞头树莺\n鸦嘴卷尾\n鸫鸟\n鸲岩鹨\n鹀\n鹊鸲\n鹟\n鹩哥\n鹪鹩\n鹬鸵\n麝雉\n黄喉噪鹛\n黄喉雀鹛\n黄嘴山鸦\n黄嘴朱顶雀\n黄嘴蓝鹊\n黄头鹡鸰\n黄眉\n黄眉姬\n黄眉林雀\n黄眉鹀\n黄绿鹎\n黄胸柳莺\n黄腰太阳鸟\n黄腹冠鹎\n黄腹啄花鸟\n黄腹山雀\n黄腹柳莺\n黄腹树莺\n黄腹花蜜鸟\n黄腹鹟莺\n黄腹鹨\n黄腹鹪莺\n黄臀鹎\n黄莺\n黄雀\n黄颈拟蜡嘴雀\n黄颊山雀\n黄鹡鸰\n黑冠椋鸟\n黑冠黄鹎\n黑卷尾\n黑喉石鵖\n黑喉红臀鹎\n黑喉缝叶莺\n黑头噪鸦\n黑头奇鹛\n黑头鳾\n黑头鹀\n黑头鹎\n黑尾地鸦\n黑斑蝗莺\n黑枕王鹟\n黑百灵\n黑眉苇莺\n黑短脚鹎\n黑翅雀鹎\n黑背燕尾\n黑胸太阳鸟\n黑胸鸫\n黑脸鹟莺\n黑顶噪鹛\n黑顶林莺\n黑顶白颊林莺\n黑领噪鹛\n黑额树鹊\n黑鸫\n（美洲产）伞鸟\n加拿大鹅\n小天鹅\n扁嘴天鹅\n斑嘴鸭\n斑头秋沙鸭\n林鹳\n棕硬尾鸭\n溆浦鹅\n灰雁\n灰鹅\n狮头鹅\n琵嘴鸭\n瓣蹼鹬\n瘤头鸭\n白鹅\n秋沙鸭\n红头潜鸭\n绒鸭\n绿翅鸭\n罗纹鸭\n花脸鸭\n蓝翅鸭\n豁眼鹅\n豆雁\n赤膀鸭\n赤麻鸭\n野鸭\n针尾鸭\n长尾鸭\n雁鹅\n雪雁\n非洲雁\n鸳鸯\n鸳鸯鸭\n麝香鸭\n麻鸭\n黑天鹅\n黑海番鸭\n黑雁\n巨嘴鸟\n蚁鴷\n黄腰响蜜鴷\n三趾鹑\n丛冢雉\n元宝鸡\n冕鹧鸪\n刚果孔雀\n勺鸡\n南秧鸟\n原鸡\n台湾山鹧鸪\n吐绶鸡\n大凤冠雉\n大骨鸡\n寿光鸡\n山鹑\n岩雷鸟\n彩雉\n彩鸡鹑\n斑翅山鹑\n斑胸田鸡\n松鸡\n柳雷鸟\n榛鸡\n毛脚鸡\n黑水鸡\n紫水鸡\n沙鸡\n济宁百日鸡\n海兰褐壳蛋鸡\n海南孔雀雉\n海南山鹧鸪\n灰头麦鸡\n
灰孔雀雉\n狼山鸡\n环颈雉\n珍珠鸡\n白尾雷鸟\n白颊山鹧鸪\n白鹇\n眼斑吐绶鸡\n眼斑火鸡\n眼斑雉\n矮脚鸡\n石鸡\n竹鸡\n紫青水鸡\n红喉山鹧鸪\n红胸山鹧鸪\n绿孔雀\n绿脚山鹧鸪\n绿雉\n艾草松鸡\n芦花鸡\n草原榛鸡\n蓝胸鹑\n蓝鹇\n虹雉\n血雉\n褐胸山鹧鸪\n角雉\n贵妃鸡\n走鹃\n铜尾孔雀雉\n铜长尾雉\n锦鸡\n长尾雉\n马鸡\n骨顶鸡\n高地鹧鸪\n鹅雏\n鹌鹑\n鹧鸪\n鹫珠鸡\n黄羽肉鸡\n黄鸡\n黄麻鸡\n黑凤冠雉\n黑腹翎鹑\n黑鸡\n黑鹇\n三趾鸥\n剪嘴鸥\n巨鹱\n江鸥\n海燕\n海鸥\n灰背鸥\n燕鸥\n白翅浮鸥\n白鸥\n贼鸥\n遗鸥\n须浮鸥\n黄嘴河燕鸥\n黑嘴鸥\n黑尾鸥\n大灰猫头鹰\n斑头鸺鹠\n林雕鸮\n大雕鸮\n灰林鸮\n猛鸮\n短耳鸮\n斑点猫头鹰\n纵纹腹小鸮\n褐渔鸮\n仓鸮\n毛腿渔鸮\n纵纹角鸮\n雪鸮\n草鸮\n花头鸺鹠\n长耳鸮\n雕鸮\n黄嘴角鸮\n丘鹬\n双领鸻\n反嘴鹬\n小嘴鸻\n小黄脚鹬\n弯嘴滨鹬\n扇尾沙锥\n斑胸滨鹬\n杓鹬\n林沙锥\n林鹬\n流苏鹬\n海鹦\n灰尾鹬\n燕鸻\n笛鸻\n红脚鹬\n翻石鹬\n蛎鹬\n金鸻\n原鸽\n山斑鸠\n岩鸽\n斑尾林鸽\n斑尾鹃鸠\n棕斑鸠\n楔尾绿鸠\n橙胸绿鸠\n渡渡鸟\n火斑鸠\n灰林鸽\n点子鸽\n珠颈斑鸠\n瘤鼻鸽\n白羽王鸽\n筋斗鸽\n红翅绿鸠\n绿皇鸠\n绿翅金鸠\n苍白骑士\n雪鸽\n黑颏果鸠\n乌鹃\n八声杜鹃\n小鸦鹃\n斑翅凤头鹃\n棕腹杜鹃\n紫金鹃\n红翅凤头鹃\n绿嘴地鹃\n翠金鹃\n蕉鹃\n褐翅鸦鹃\n鹰鹃\n军舰鸟\n北鲣鸟\n双冠鸬鹚\n斑头鸬鹚\n鸬鹚\n澳洲鹈鹕\n白尾鹲\n红嘴鹲\n红蛇鹈\n美洲蛇鹈\n蓝眼鸬鹚\n蓝脸鲣鸟\n鲣鸟\n鹈鹕\n黑腹军舰鸟\n黑腹蛇鹈\n黑颈鸬鹚\n姬田鸡\n小田鸡\n斑胁田鸡\n沙丘鹤\n灰鹤\n白头鹤\n白枕鹤\n白眉田鸡\n白胸苦恶鸟\n白鹤\n秧鸡\n秧鹤\n红胸田鸡\n红鹤\n蓑羽鹤\n蓝胸秧鸡\n赤颈鹤\n鸨\n黑颈鹤\n吸蜜鹦鹉\n塞内加尔鹦鹉\n大绯胸鹦鹉\n太阳鹦哥\n彩虹鹦鹉\n灰头鹦鹉\n灰鹦鹉\n牡丹鹦鹉\n玄凤鹦鹉\n白鹦鹉\n短尾鹦鹉\n米切氏凤头鹦鹉\n红嘴鹦鹉\n红肩金刚鹦鹉\n红腹鹦鹉\n红色的鹦鹉\n红额金刚鹦鹉\n花头鹦鹉\n葵花凤头鹦鹉\n蓝冠吸蜜鹦鹉\n蓝冠短尾鹦鹉\n蓝头金刚鹦鹉\n蓝颊玫瑰鹦鹉\n虎皮鹦鹉\n金刚鹦鹉\n金刚鹦鹉族\n金尾折衷鹦鹉\n金领金刚鹦鹉\n长尾鹦鹉\n食肉鹦鹉\n黄颈亚马逊鹦鹉\n黄额鹦鹉\n三色鹭\n印度池鹭\n啸鹭\n噪鹮\n埃及圣鹮\n夜鹭\n大蓝鹭\n岩鹭\n彩鹮\n彩鹳\n斑鹭\n朱鹭\n朱鹮\n栗腹鹭\n棕夜鹭\n棕颈鹭\n池鹭\n漂鹬\n澳洲白鹮\n牛背鹭\n琵鹭\n白脸鹭\n白腹鹳\n白鹭\n白鹳\n秃鹮\n秃鹳\n紫背苇鳽\n红鹳\n绿鹭\n绿鹮\n美洲白鹮\n美洲红鹮\n美洲绿鹭\n草鹭\n蓝灰鹭\n裸颈鹳\n钳嘴鹳\n锤头鹳\n长脚鹬\n非洲秃鹳\n鞍嘴鹳\n鲸头鹳\n麻鹭\n黄嘴鹮鹳\n黄脸鹳\n黄苇鳽\n黑头白鹮\n黑颈鹳\n黑鹭\n黑鹳\n黑喉潜鸟\n黄胡蜂\n短毛家猫\n蕈蚊\n北极兔\n凤尾蝶\n柯利牧羊犬\n斑背潜鸭\n椰子蟹\n卷叶蛾\n异国短毛猫\n细嘴黄鹂\n黑顶麻雀\n毒蛾\n格雷伊猎犬\n博伊金猎犬\n巴贝犬\n柯利犬\n杜高塞德斯科犬\n因纽特犬\n大型英法三色猎犬\n峡谷㹴\n阿尔卑斯达切斯勃拉克犬\n班道戈犬\n坎高犬\n俄罗斯－欧洲莱卡犬\n马瑞马·阿布鲁佐牧羊犬\n塞尔维亚猎犬\n阿拉帕哈蓝血斗牛犬\n丰山犬\n西班牙獒犬\n黑棕褐梗\n狮子狗\n阿特拉斯梗\n卡罗来纳犬\n斯洛伐克硬毛指示猎犬\n瑞典牧羊犬\n警犬\n拉贾帕拉耶姆犬\n小福克斯犬\n比利犬\n阿札瓦克犬\n波斯尼亚的牧羊犬\n威尔士牧羊犬\n加迪库塔犬\n霍夫瓦尔特犬\n布列塔尼猎犬\n奥弗涅指示猎犬\n捕鼠㹴犬\n智利狐狸梗\n印第安松鼠狗\n苏塞克斯猎犬\n蛋糕罐罐狗\n拉戈托罗马阁挪露犬\n布里吉特格里芬犬\n鬼脸狗\n山恶狗\n卡累利阿熊犬\n棉花面纱犬\n拉坎诺斯犬\n瑞典拉普猎犬\n阿兰多獒犬\n日本㹴\n挪威卢德杭犬\n史其派克犬\n瑞典腊肠犬\n库瓦兹犬\n威斯特达克斯布若卡犬\n澳大利亚牧牛犬\n贝克哈沃德狗\n博洛尼亚犬\n约克波犬\n爱罗狗\n爱迪犬\n甲斐犬\n波多黎各獒犬\n波兰猎犬\n罗秦犬\n英国猎浣熊犬\n罗威纳布林迪西狗\n拉布拉多哈士奇\n卡南犬\n大麦町\n迷你猎狐㹴\n佩特戴尔㹴\n约克夏㹴\n美国无毛㹴犬\n班特牛头犬\n中国福犬\n墨西哥无毛犬\n吉斯熊狗\n老式英国斗牛犬\n瓷
器犬\n西班牙水犬\n意大利指示猎犬\n蒙特内哥罗山猎犬\n斗牛獒\n比韦尔狗\n猴㹴\n芬兰驯鹿犬\n奥地利黑褐猎犬\n保罗蓝梗\n波隆卡\n芬兰猎犬\n俄罗斯黑㹴\n猎水獭犬\n杂交狗\n大明斯特兰德犬\n阿图瓦猎犬\n荷兰斯牟雄德犬\n拉普尔灰狗\n维汗牧羊犬\n米奇狗\n帕森拉塞尔㹴犬\n诺维奇梗\n阿登牧牛犬\n美洲印第安人狗\n金塔马尼\n布劳特猎犬\n委内瑞拉牧羊犬\n埃什特雷拉山犬\n德国猎㹴\n贝里亚尔狗\n兰雷西犬\n罗马尼亚喀尔巴阡山脉牧羊犬\n纪州犬\n普里亚指狗\n捷克梗\n葡萄牙牧羊犬\n葡萄牙指示犬\n喜玛拉雅牧羊犬\n特洛曼犬\n立陶宛猎犬\n伊斯特拉粗毛猎犬\n塞尔维亚三色猎犬\n法连尼犬\n短毛视觉猎犬\n罗马尼亚米利泰克牧羊犬\n塞佩莱西伯利亚雪橇犬\n布拉克杜佩\n波兰狩猎犬\n布拉格瑟瑞克犬\n古梗犬\n乌拉圭西马伦犬\n印度野猪猎犬\n捷克狼犬\n丹麦老式指示猎犬\n撒阿路斯狼猎犬\n非洲家犬\n杂交狮子狗\n德国猎犬\n泰岗猎犬\n东西伯利亚雷卡犬\n威尔士猎犬\n凯利蓝㹴\n古代西班牙指示猎犬\n普德尔向导猎犬\n诺波丹狐狸犬\n巴吉度犬布列塔尼\n芬兰波美拉尼亚丝毛狗\n斯莫兰德猎犬\n树枝田纳西州斑点狗\n哈巴小猎犬\n复兴斗牛犬\n巨人狗\n希腊猎犬\n老式英国斗牛犬\n阿富汗牧羊犬\n施普斯狗\n克里特岛猎犬\n圣东基犬\n保加利亚牧羊犬\n那不勒斯獒犬\n瑞典猎鹿犬\n秘鲁无毛犬\n挪威布哈德犬\n格雷伊猎犬\n豹犬\n巴伐利亚山地犬\n桦太犬\n帕内尔的卡罗莱纳州恶狗\n特兰西瓦尼亚猎犬\n巴基斯坦斗牛犬\n亚伦狗\n兰伯格犬\n奥斯卡贵宾\n克龙弗兰德犬\n荷兰牧羊犬\n提洛尔猎犬\n英国獒犬\n英国白梗\n树丛杂种犬\n西班牙斗牛犬\n爱沙尼亚猎犬\n德克萨斯犬\n田特菲㹴\n巴西㹴\n瑞士猎犬\n哈尔登猎犬\n波密犬\n泰迪罗斯福梗\n英国玩具犬\n白色牧羊犬\n戈登塞特犬\n荷兰水猎犬\n雪贵宾\n布拉克·杜·波旁犬\n白英国斗牛犬\n塔马斯卡犬\n双鼻安第斯虎猎犬\n丝绸惠比特犬\n萨普兰尼那克犬\n马地犬\n贝加马斯卡犬\n标准贵宾犬\n台湾犬\n安纳托利亚牧羊犬\n科克尔犬\n比利牛斯獒犬\n伯尔尼劳佛犬\n汉密尔顿猎犬\n小型明斯特兰德犬\n阿里埃日犬\n乌托尼亚犬\n俄罗斯猎狼犬\n蓝色匹卡迪档猎犬\n基里奥犬\n伯格尔・德皮卡第犬\n加斯科尼蓝色矮腿猎犬\n安达卢西亚酒窖大猎狗\n袋鼠狗\n艾尔曼特狗\n麦克纳布犬\n恩特雷布赫山地犬\n俄罗斯玩具犬\n奥地利平犬\n大树猎犬\n威尔士矮脚狗\n穆托尔猎犬\n乞沙比克猎犬\n海根猎犬\n北海道犬\n田野猎犬\n欧洲猎犬\n凯利小猎犬\n长狗\n迷你贝吉格里芬凡丁犬\n大蓝加斯科涅猎犬\n苏利莫夫犬\n澳洲粗短尾牧牛犬\n马尔济斯\n库达犬\n丹麦布罗荷马獒\n兰开夏跟脚犬\n斯卢夫猎犬\n格陵兰犬\n兰西尔犬\n皮卡第猎犬\n美洲猎鹿犬\n谷斗牛犬\n欧亚大陆犬\n四国犬\n短毛意大利猎犬\n潘布魯克威尔斯柯基犬\n爱尔兰红白雪达犬\n英国指示犬\n波利尼西亚犬\n阿兹卡尔\n马鲁索斯犬\n斯塔比嚎犬\n汝拉布鲁诺猎犬\n小猎犬拉布拉多杂交狗\n金德利犬\n史毕诺犬\n小猎兔犬\n土耳其阿卡巴士犬\n圣日尔曼指示猎犬\n美国玩具㹴\n法国黑白色犬\n马林斯费斯特狗\n德冷特斯奇鹧鸪犬\n树丛竞走猎浣熊犬\n黑嘴杂种犬\n泰国脊背犬\n澳洲野狗\n庞特－奥德门猎犬\n韩国杜莎犬\n皮罗·德·伯里沙·马罗奎因犬\n巨型雪纳瑞\n斯恰潘道斯犬\n马拉石狗\n巴辛吉犬\n拉布拉多贵宾狗\n巴西非勒\n巴斯克牧羊犬\n马扎尔犬\n波希米亚牧羊犬\n汉诺威猎犬\n加拿大爱斯基摩犬\n莫斯科水犬\n哈瓦那犬\n库克颇犬\n沟壑梗犬\n坎尼狗\n比利牛斯牧羊犬\n艾特拉科尼克犬\n卡林犬\n黑褐弗吉尼亚猎狐犬\n非洲野犬\n诺福克㹴\n印度斯皮茨狗\n普露马梗\n西西伯利亚莱卡犬\n卡塔胡拉牛头犬\n运动卢卡斯㹴\n国王牧羊犬\n波萨维茨猎犬\n波西米亚硬毛格里芬指示犬\n皮卡普狗\n泰迪罗斯福梗\n丹麦瑞典农场犬\n波兰低地牧羊犬\n骑士比熊犬\n塔特拉山牧羊犬\n芬兰拉普猎犬\n英国雪达犬\n珍岛犬\n硬毛维兹拉犬\n斗牛梗\n克罗地亚牧羊犬\n德国粗毛指示猎犬\n夏伊洛牧羊犬\n西班牙灰狗\n费斯特山狗\n玩具斗牛犬\n夏威夷猫\n地丛鹟\n火红鸲鹟\n白斑尾柳莺\n黑背麦鸡\n纽氏梅花雀\n栗翅牛鹂\n北蝗莺\n白颊长尾山雀\n淡红鹪鹩\n黑嘴天鹅\n五彩鹦鹉\n红头咬鹃\n黑头织雀\n侏莺雀\n淡尾苇鹪鹩\n黑脸鸨\n蓝眼凤头鹦鹉\n红胸皇鸠\n柠黄雀鹀\n沼泽幽鹛\n番红头鹦哥\n白凤头鹦鹉\n翘眉企鹅
\n埃及雁\n白腹地鸫\n灰食籽雀\n红尾歌鸲\n红白针尾雀\n北美蛎鹬\n红嘴巨嘴鸟\n火尾希鹛\n白腰草鹬\n印度鸲\n蓝背娇鹟\n厚嘴绿鸠\n黑头针尾雀\n黑腹树鸭\n凤头山雀\n灰噪刺莺\n黄纹薮雀\n红胸长爪鹡鸰\n阿拉伯啄木鸟\n灰蚁鵙\n日本柳莺\n黑喉隐蜂鸟\n黑肩鸢\n火喉蜂鸟\n凤头距翅麦鸡\n灰褐噪鹛\n酒红斑鸠\n地犀鸟\n白翅薮鹟\n黄腹绿鸠\n澳洲鹨\n巨辉椋鸟\n黑领鹎\n环蚁鹨\n白额饰眼鹟\n白颊黑雁\n彩鹬\n凤冠火背鹇\n侏长尾山雀\n白尾雀鹎\n栗顶薮雀\n马岛鹡鸰\n小杓鹬\n灰连雀\n黑腹食蚊鸟\n笛声鹪鹩\n白腹灰蕉鹃\n白眼褐鹎\n紫头蜂鸟\n美洲乌夜鹰\n黑腹辉椋鸟\n杂色须霸鹟\n锈腹薮雀\n黑喉粗嘴雀\n白眼潜鸭\n斯里兰卡蓝鹊\n小灰啄木鸟\n暗黄顶黑唐纳雀\n黑领啸鹟\n淡头玫瑰鹦鹉\n白胸窜鸟\n巨鹭\n黑腹花蜜鸟\n刀嘴海雀\n绯腰巨嘴鸟\n南美黑啄木鸟\n蓝头八色鸫\n红翅斑腹雀\n丛林山鹪莺\n红地鸠\n褐绣眼鸟\n蓝翅希鹛\n勺嘴鹬\n紫滨鹬\n白颊犀鸟\n内岛柳莺\n棕背蚁鸟\n白腰树燕\n白喉唧鹀\n沟嘴巨嘴鸟\n黑澳鳾\n棕耳噪鹛\n黑头拟鹂\n橙冠拟鹂\n橄榄篱莺\n黄腹噪刺莺\n黄纹绿鹦鹉\n白脸角鸮\n灰头林鸽\n澳洲灰雁\n非洲鹨\n镰翅夜鹰\n白腹海雀\n侏鸊鷉\n沼泽山鹪莺\n朱腹啄木鸟\n鹤鹬\n红冠啄花鸟\n黑颈天鹅\n苇鹀\n白眉林鸲\n美洲银鸥\n叙利亚啄木鸟\n短尾翠蜂鸟\n绿食蜜鸟\n白眉鹀\n黑头鸦雀\n粉胸斑鸠\n毛腿沙鸡\n红喉鸡鸠\n棕薮鸟\n黄色白斑翅雀\n长冠鹰雕\n灰背织雀\n长尾椋鸟\n白胸噪鹛\n淡嘴火雀\n华美极乐鸟\n丛蚁鵙\n尤卡坦啄木鸟\n五彩文鸟\n蒲苇莺\n半蹼鹬\n白头薮雀\n冠卷尾\n楔尾伯劳\n白背鸭\n红翅鹦鹉\n灰眉岩鹀\n亨岛苇莺\n乳白背啄木鸟\n灰嘴慧星蜂鸟\n白脸噪鹛\n点斑林鸮\n金眶鸻\n科氏蜂鸟\n青脚滨鹬\n杂色麦鸡\n黑顶唐纳雀\n印度走鸻\n白喉针尾雨燕\n澳洲苇莺\n黄胸鹬\n白翅噪椋鸟\n烟色绿霸鹟\n绿加岛莺雀\n秘鲁[红翅]共鸟\n灰地霸鹟\n史氏蝗莺\n白背矶鸫\n黑喉绿阔嘴鸟\n长尾山鸠\n黑眉绿鹎\n白喉鹧鸪\n亚历山大鹦鹉\n白腹麻鸭\n疣鼻栖鸭\n白眉鸣鹃鵙\n棕背小嘲鸫\n橙腹果鸠\n双日阔嘴蜂鸟\n黑顶唐加拉雀\n紫胸佛法僧\n黑冠噪鹛\n云斑林鹑\n矶鹬\n金胸歌鸲\n卷冠蓝鸦\n爪哇短翅莺\n黑冠啄木鸟\n黄褐食籽雀\n南美长尾蜂鸟\n蓝头佛法僧\n灰胸籽鹬\n黑腹蜂鸟\n黄垂麦鸡\n非洲褐鸫鹛\n灰头斑鸠\n凤头蜂鸟\n南非企鹅\n绿额矛嘴蜂鸟\n燕尾夜鹰\n长趾滨鹬\n黑眉信天翁\n塔劳秧鸡\n灰头文鸟\n墨西哥鹪鹩\n黑尾塍鹬\n加岛柳莺\n灰头啄木鸟\n白胸鸬鹚\n白喉角鸮\n黑腹鹱\n黄嘴鹊鵙\n白斑燕\n灰背南美鵟\n黑凤头鹦鹉\n黄臀鹎\n中杓鹬\n凤头海雀\n白冠啄木鸟\n凤头潜鸭\n非洲鸵鸟\n白顶地鸽\n黑头薮雀\n白翅夜鹰\n红侏霸鹟\n白眉雨燕\n小滨鹬\n白冠带鹀\n褐头山雀\n米岛扇尾鹟\n林攀雀\n纵纹蚁鸫\n白斑文鸟\n灰啸鹟\n灰头果鸠\n黑喉拟鴷\n黑嘴山巨嘴鸟\n索马里鸵鸟\n东方鸻\n黄褐翅铲嘴雀\n亮丽太阳鸟\n乌顶灌丛唐纳雀\n白腹针尾雨燕\n灰胸啄木鸟\n直嘴芦雀\n栗耳凤鹛\n红尾鹦哥\n紫辉椋鸟\n白翅澳鸦\n白尾蚁鸫\n蓝额红尾鸲\n红腹金刚鹦鹉\n南极鹱\n黑翅麦鸡\n灰啄木鸟\n红胸黑雁\n卢氏丽椋鸟\n白腰叉尾海燕\n凤头犀鸟\n厄瓜多尔啄木鸟\n非洲硬尾鸭\n淡胁秧鸡\n黑胸三趾鹑\n红尾猛雀鹀\n白翅丽唐纳雀\n马岛秧鸡\n灰头翡翠\n北长尾山雀\n红胸姬鹟\n栗背山雀\n黑蜂虎\n北马岛苇莺\n白点扇尾鹟\n马岛戴胜\n灰顶伯劳\n灰冠黑雀\n灰苇鹪鹩\n白喉爬地雀\n辉亭鸟\n白额雁\n北美鸺鹠\n乌灰鸫\n领月胸窜鸟\n点斑背蚁鸟\n蓝尾翠蜂鸟\n灰雕鸮\n印加燕鸥\n黄胸蚁鸫\n云南柳莺\n白簇岩吸蜜鸟\n大滨鹬\n黑剪嘴鸥\n黑疣皇鸠\n白胸蚁鸟\n茶胸吸蜜鸟\n白顶溪鸲\n白颈穗鹛\n橙胸咬鹃\n橙嘴蓝脸鲣鸟\n小杜鹃\n南美草鹀\n凤头褐鹎\n短嘴长尾山雀\n裸顶蚁鸟\n乌双斑雀\n普通秋沙鸭\n华西柳莺\n红头鸽\n白燕鸥\n蒙古沙鸻\n亚马逊灌丛霸鹟\n三趾鹟鴷\n凤头[共鸟]\n白胸绣眼鸟\n澳洲鹤\n蓝顶蚁鸫\n长冠八哥\n黑顶柳莺\n大杓鹬\n戴胜\n青脚鹬\n欧亚攀雀\n南美蚁鸫\n蓝喉翠鴗\n冠绣眼鸟\n非洲绿胸八色鸫\n鹊色唐纳雀\n黑短尾蓬背鹟\n领雀嘴鹎\n黑背白斑翅雀\n棕颈地霸鹟\n黑冕鹤\n杂色林鵙鹟\n灰颊夜鸫\n白喉蚁鸟\n南美针尾雀\n乌鸫
\n铁嘴沙鸻\n厄瓜多尔姬啄木鸟\n纯胸姬啄木鸟\n黑胸麻鸭\n长嘴剑鸻\n美洲鹤\n黄翅莺雀\n白眼莺雀\n红颈滨鹬\n冕麦鸡\n白喉拾叶雀\n红颈黑鵙\n斑尾娇鹟\n锈喉马岛鹃\n纹针尾雀\n白南美鵟\n俾岛翠鸟\n暗色秧鸡\n白头麦鸡\n丽蓝头鹊\n三色伞鸟\n短嘴鸽\n栗鹀\n黑喉红尾鸲\n黑喉织雀\n黑脸石鸡\n黑皇鸠\n黑冠山雀\n鸡尾鹦鹉\n白喉绣眼鸟\n细斑姬啄木鸟\n灰鹨\n台湾拟啄木鸟\n赤颈鸭\n黑噪鹛\n红耳鹎\n白喉鸫鹛\n剑鸻\n棕钩嘴鵙\n短嘴地鹃\n马来犀鸟\n黑额燕鸥\n赭红尾鸲\n黑领鸲莺\n蓝山雀\n厄瓜多尔姬霸鹟\n亚马逊鹦哥\n厄瓜多尔小霸鹟\n黑颏岭裸鼻雀\n凤头黄眉企鹅\n灰头树鹛\n印支雀鹛\n四声杜鹃\n红颊蓝饰雀\n苍头薮雀\n棕颈拾叶雀\n白尾蓝胸蜂鸟\n尖嘴蜜鴷\n十二线极乐鸟\n冠鸦\n冠鸠\n冠鸭\n红额噪鹛\n白眼吸蜜鸟\n黑脚信天翁\n东非长尾伯劳\n白纹鴷雀\n黑翅长脚鹬\n斑喉鵖\n鹊鸭\n鹊雁\n沼泽鹎\n黄腹织雀\n南非鹨\n乌鵙鹟\n南非鹎\n红白唐纳雀\n狭尾椋鸟\n冠纹柳莺\n白喉林鸽\n厚嘴地鸠\n北杂鹛\n冠针尾雀\n皇辉蜂鸟\n三色薮雀\n灰额绿鸠\n绿嘴鸦鹃\n田雀鹀\n环颈鸻\n非洲白腰雨燕\n环颈鸭\n点胸蚁鵙\n紫喉领蜂鸟\n亮丽歌雀\n欧柳莺\n栗背丽鸫\n鬼鸮\n绿尾翠蜂鸟\n橙额黄雀鹀\n辉蓝细尾鹩莺\n南红蜂虎\n灰胸雅鹛\n丽色山雀\n黑脸王鹟\n珠眼嘲鸫\n黑白噪鹛\n黑枕威森莺\n北灰山雀\n黑嘴蕉鹃\n黄喉鹎\n黄喉鹀\n灰头绿鸠\n黑尾啸鹟\n小青脚鹬\n黑斑裸眼蚁鸟\n黑脸厚嘴雀\n短尾鹨\n白腹蜂鸟\n七彩文鸟\n短趾矶鸫\n细嘴雁\n棕顶大尾莺\n金眶鹟莺\n白眼先蚁鸫\n白头刺莺\n沼泽短翅莺\n细嘴杓鹬\n绿颈鹑鸠\n爪哇绿鹊\n黄腿鸥\n黑背信天翁\n栗胸地鹃\n白眼蚁鹩\n黄背拟鹂\n伯氏鹨\n白胸森鸠\n黄眉蚁鸟\n灰颊绿鸠\n灰头鹑鸠\n米岛鹎\n针尾沙锥\n斑翅山鹪莺\n仙蓝王鹟\n赤顶娇鹟\n点额啄木鸟\n南方蚁鵖\n红胸蜂虎\n凤头麦鸡\n灰腰燕\n斑鸫\n白尾姬霸鹟\n南非丝雀\n水雉\n马来鸻\n丽绣眼鸟\n姬隐蜂鸟\n凤头主红雀\n紫蓝饰雀\n北非石鸡\n华丽八色鸫\n棕背伯劳\n白头鹎\n攀雀\n白尾美洲咬鹃\n泽鹬\n北极海鹦\n爪哇缝叶莺\n白腹鸨\n台湾棕颈钩嘴鹛\n乳色走鸻\n洪都拉斯蜂鸟\n橙胸噪鹛\n民岛角鸮\n爱琴海猫\n阿拉伯茂猫\n巴比诺猫\n孟加拉猫\n巴西短毛猫\n查达利猫\n塞浦路斯短毛猫\n德文帝王猫\n顿斯科伊猫\n狸花猫\n侏儒猫\n科恩家猫\n拉邦猫\n尼比龙猫\n欧西猫\n欧斯亚史烈斯猫\n东方双色猫\n折叠猫\n北美洲短毛猫\n罗奥斯猫\n塞尔凯克卷毛猫\n塞伦盖蒂猫\n玩具虎猫\n乌克兰勒夫科伊猫\n黄粉蝶\n领燕鸻\n粉头斑鸠\n三线闭壳龟\n红斑马\n豹纹睑虎\n摄龟\n白头叶猴\n种猪\n西班牙羱羊\n白带锯蛱蝶\n非洲猛鱼\n阿拉伯大羚羊\n枝角类\n斜线天蛾\n钟虫\n拟光腹弓背蚁\n彩蚕\n圭亚那蓝牙\n蜢蜘\n美国红雀鱼\n茶黄螨\n蓬尾浣熊\n巨型羽翅鲎\n大极乐鸟\n黑长尾雉\n中华凤头燕鸥\n梳额蜴鲶\n山地田鼠\n歌鸲\n布列塔尼犬\n蓝面蝴蝶\n棘烟管鱼\n雀鱼\n尾斑圆颌针鱼\n松江鲈\n绿步甲\n黑额山噪鹛\n黑地蜂\n朝氏圆颈天牛\n灰驯狐猴\n红唇鱼\n星椿象\n淡水龙虾\n落基山大角羊\n五彩吊鱼\n青毛鼠\n狐狸鱼\n鬟羚\n獭兔\n彩蝴蝶\n啄木雀\n藏猕猴\n金斑鱼\n小麦红蜘蛛\n雾社黑燕蝶\n工程鲫\n哀鸽\n益蝽\n花罗汉\n卷尾\n小虾虎鱼\n斑星弄蝶\n红脸地犀鸟\n春尺蠖\n湖羊\n广东潮州犬\n非洲角蝰蛇\n水陆两栖动物\n赤线虫\n倭猪\n侏蓝仙鹟\n黑枪鱼\n纳氏鹞鲼\n红月光鱼\n棋盘凤凰\n长江鲟\n珍珠龙王鲷\n尖嘴扁颌针鱼\n箭头鱼\n挺胸龟\n火焰神仙\n玄灰蝶\n邓氏鱼\n玻璃灯鱼\n蜜蚁\n蛇鲭\n夏威夷僧海豹\n绿腰鹦哥\n斑纹缨口鳅\n英国跳跃长耳猎犬\n小鹦鹉\n白尾蜻蜓\n云南大头鱼\n兔耳袋狸\n红鳟鱼\n螺\n天使长尾天蚕蛾\n黑瘤地图龟\n密星海蝠鱼\n曼赤肯猫\n迷你贵宾犬\n稻管蓟马\n狗熊\n大角鹿\n毛驴\n高背红尾金龙鱼\n大角羊\n蓝海绵珊瑚\n蓝狐\n黑寡妇蛛\n尖鼻箱鲀\n小鸺鹠\n塘鲺\n普氏原羚\n红袖蜡蝉\n黑雀\n变形虫\n钩虾\n葡萄缺角天蛾\n红嘴朱雀\n鸻鹬\n褐蓝子鱼\n霸王蝾螈\n点玄灰蝶\n雌雄嵌合体\n弧边招潮蟹\n红丽丽\n东其尼猫\n红尾副鳅\n鹅喉羚\n塘鹅\n跑山鸡\n榆蓝叶甲\n蛇蛉\n牛角龙\n沙漠蛙\n栎鹰翅天蛾\
n锤尾凤蝶\n叶形鱼\n八线腹链蛇\n葱蓟马\n黑长脚鹬\n花蜜长舌蝠\n沙蜥\n龙背种金鱼\n金壳虫\n大亭鸟\n银鲨\n红狼牙鰕虎鱼\n丁公鱼\n石金钱\n虾虎鱼\n黑背波鱼\n鲴鱼\n红脖颈槽蛇\n印头鱼\n蓝神仙\n马头鱼\n肥头鱼\n棕额长尾山雀\n黄河鳖\n黑马羚\n蝇虎\n斜斑彩灰蝶\n白顶鵖\n油松毛虫\n罗岛蓝鸠\n长腿鹬\n似鸡龙\n腹斑水蛇\n布里塔尼犬\n华鳊\n独角犀牛\n欧洲野牛\n丽纹云南鳅\n梳骨螺\n红绿金刚鹦鹉\n土拨鼠\n迷你灯鱼\n基氏细猛蚁\n中国水龙\n禄丰龙\n电蛱蝶\n川蜷\n尖尾雨燕\n小黄帽亚马逊鹦鹉\n蓝鳍金枪鱼\n碧蛾蜡蝉\n黄尾鱼\n丝足鱼\n莱芜黑山羊\n龙睛金鱼\n无恒变形虫\n铜嘴雀\n彩石鳑鲏\n夏赤蜻\n大眼鳊\n温泉蛇\n蛭\n盔龙\n吐尼尔鸽系\n褐翅燕鸥\n黑鱼\n彩虹蟾蜍\n欧洲乌鸫\n囊地鼠\n日本须鲨\n喀尔巴阡蜂\n玻璃海象\n横纹金蛛\n茧蜂\n红鳍笛鲷\n蓝额长脚地鸲\n红棕象甲虫\n鼯猴\n灰天鹅\n普罗特猎犬\n平菇厉眼蕈蚊\n电鳐\n叽咋柳莺\n鼬鼠\n红领吸蜜鹦鹉\n东亚喜鹊\n大嘴籽雀\n鳄龟\n棕头穗鹛\n蛇颈龙\n花枝鼠\n鸭嘴鲟\n山窗萤\n嘉翠蛱蝶\n小香猪\n西非短鲷\n红瓜子斑\n糙齿长吻海豚\n黑面狷羚\n狗鱼\n黑额光叶甲\n大口虾虎鱼\n赤嘴潜鸭\n黑眉柳莺\n棘鮋\n蓝冠鸽\n翱翔蓑鲉\n密纹飒弄蝶\n艾芬品\n黑尾灰蜻\n刺背球状蜘蛛\n长尾大蚕蛾\n四纹豆象\n白雉鸡\n欧洲盘羊\n大地懒\n笔尾树鼩\n短嘴金丝燕\n介壳虫\n过树蛇\n隐纹长脚胡蜂\n红脚苦恶鸟\n非洲金猫\n条石鲷\n齿缘龙虱\n皇带鱼\n鬼王螽斯\n稻象甲\n床虫\n南极冰鱼\n狗鲛\n麻田黄刺蛾\n鳀鱼\n黄金鸻\n石狩红蚁\n须红鲂鮄\n苹果全爪螨\n橙胸摄蜜鸟\n公主兔\n雉鹑\n宁都黄鸡\n叉尾卷尾\n五色锦鲤\n管鼻鯙\n赤麂\n中国虎纹捕鸟蛛\n乌鳢\n黑嘴树鸭\n蜜袋鼯\n拟态革鲀\n莱昂贝格犬\n卷叶螟\n水栖龟\n滑鼠蛇\n扭颈鸟\n丁丁猫儿\n角蟾\n黑龙江豹\n纤毛虫\n梅花龟\n黄天堂鸟\n凤头企鹅\n白肩鹊鵙\n安氏蜂鸟\n福建丽纹蛇\n灰莺\n普通长脚马蜂\n李子食心虫\n赤链华游蛇\n长尾巧织雀\n褐色树蛇\n黄蚬\n海帆蜥\n薮羚\n枪猎犬\n大鳞脂鲤\n灰鹭\n银鱼\n王锦蛇\n日本龟蜡蚧\n北极鸥\n樱桃鱼\n大头蚁\n赤胸鸫\n豹纹鼠鱼\n淡水苏眉\n点荷包鱼\n大棘烛光鱼\n安第斯火烈鸟\n德国硬毛指示猎犬\n小黄斑挵蝶\n长颈龟\n波斯鲟\n尖吻鲈\n蛾蚋\n原矛头蝮\n翡翠鸟\n蓝蛱蝶\n鞭毛虫\n扁形虫\n松茸毒蛾\n秀丽白虾\n文种金鱼\n雕鴞\n袋熊\n宝石海鳝\n科尼斯雷克斯猫\n双头五腿龟\n划蝽\n裸鼹鼠\n柳毒蛾\n曲纹紫灰蝶\n螟虫\n六点带蛱蝶\n蓝鳁鲸\n大蜥蜴\n美国银色短毛猫\n禾花鱼\n瑶山鳄蜥\n鸢\n苹掌舟蛾\n蝶蛾\n野狗\n根瘤蚜\n鸡苗\n鯮鱼\n蟛蜞\n吐鲁番斗鸡\n中间银鮈\n美洲斗牛犬\n扁刺蛾\n日鳽\n台湾大象鼻虫\n卷毛猫\n叉尾鱼\n黑枕燕鸥\n小型狮子犬\n鸭嘴螺\n芋双线天蛾\n蓝鳍鱼\n桃桑白蚧\n弹涂鱼\n墨西哥黑熊\n笋壳鱼\n圆顶珠蚌\n海南蜜蜂\n珍珠马三甲\n金披凤玫瑰鹦鹉\n红花实蝇\n澳洲皇帝蟹\n火焰龟\n红头爱情鸟\n中村锯天牛\n樟青凤蝶\n比尤伊克天鹅\n镖鲈\n鳃鱼\n小齿椰子猫\n华山松大小蠹\n黄冠亚马逊鹦鹉\n白腰雨燕\n高冠变色龙\n膨腹海马\n八色鸟\n桃花虫\n海老鼠\n披毛犀\n丝绵木金星尺蠖\n竹叶龟\n吻海马\n窃蠹\n柳圆叶甲\n星期狗\n冠海马\n赤似谷盗\n北京油鸡\n小蠹虫\n锯鳞蝰\n红额鹦鹉\n泰虎\n方正银鲫\n红肚凤凰\n双带弄蝶\n稻褐飞虱\n头条波鱼\n蜂虻\n达摩翠凤蝶\n中华虎甲\n信天翁\n帝鳄\n猪鼻龟\n斑林狸\n二尾蛱蝶\n红粉佳人鱼\n食木甲鲶鱼\n池龟\n沙特尔猫\n大雕\n沙漠之舟\n黄杨绢野螟\n虹鳟鱼\n金青鸟\n白蜘蛛\n幽灵鲨\n榕透翅毒蛾\n狸奴\n棕榈象鼻虫\n雷龙鱼\n彼斯奎氏鹦鹉\n双斑绿柳莺\n小麦吸浆虫\n北京雨燕\n向日葵螟\n棕灶鸟\n盾蝽\n沼泽山雀\n天蚕蛾\n渔鸥\n蟾福蛱蝶\n台湾山蜗牛\n荒漠沙蜥\n华夏剑凤蝶\n阿拉斯加内陆狼\n北极蛤\n鼹鼠\n黑额伯劳\n白环蚊霸鹟\n番茄蛙\n黑白兀鹫\n野生甲鱼\n熊猴\n鹞鹰\n稻螟虫\n鳄冰鱼\n太行犬\n红螺\n美国大鲵\n薮枝螅\n暗腹雪鸡\n波路豆齿蛇鳗\n黄足条蜂\n爬虫类\n稻秆蝇\n光脸鲷鱼\n斜纹天蛾\n鲌\n白斑斑鲨\n蟹守螺\n管鳗\n十一间仙\n西瓜龟\n水蛛
\n斑鹞\n海瓜子\n蒙古羊\n岩羊\n黑枕绿啄木鸟\n红点鲑\n针大蚊\n鼋\n南美短鲷\n东德牧羊犬\n长嘴半蹼鹬\n银白鱼\n多耙红钻鱼\n荷花瓦特犬\n蓝枪鱼\n圆鮀鲣\n黄尾蜜鲴\n银鸭\n红灶鸟\n长体小鳔鮈\n朝鲜少鳞鳜\n巨型蚂蚁\n五彩鳗\n栗斑鵎鵼\n德国镜鲤\n木槿蚜虫\n巧克力娃娃\n橙翅亚马逊\n沙漠地鼠龟\n三尾褐凤蝶\n沙鳢\n银屏灯鱼\n臭椿皮蛾\n台湾山椒鱼\n臭肚鱼\n非洲树蛇\n放射鼠\n斑珍蝶\n粉红鲑\n咖啡貂\n豆眼白\n麻步甲\n金鲳鱼\n红嘴海鸥\n大红蛱蝶\n勃氏新热鳚\n青头潜鸭\n欧鳗\n前肛鳗\n白斑眼蝶\n中华鳑鲏\n赛克斯长毛猎犬\n丰年虾\n叶泡蛙\n乌鬃鹅\n真鳕\n七星鳢\n琉璃石斑鱼\n南浣熊\n灰腹鹰\n麂鹿\n哈什蚂\n中华真地鳖\n古铜谷弄蝶\n红脚细腰蜂\n乌珠穆沁马\n紫红冠鹦哥\n黑龙江茴鱼\n金灯虾虎\n咖啡透翅天蛾\n香猪\n日本鳗鲡\n缟鬣狗\n澳大利亚肺鱼\n东菲比霸鹟\n迷你兔\n豹尺蛾\n白星花金龟\n好胜金蛛\n家鸽\n硬头海鲇\n石蜈蚣\n华吸鳅\n沙鵖\n毛鳞鱼\n密疣蜥虎\n黄嘴白鹭\n豹纹鲨\n黑纹颈槽蛇\n珍珠狗\n玉米粘虫\n辫子鱼\n六带鰺\n金额叶鹎\n银耳相思鸟\n大泷线鱼\n荒漠睡鼠\n薮犬\n硫磺鹀\n幸运辘蛱蝶\n四绒球金鱼\n加州秃鹫\n姬蟋\n吊死鬼\n芦莺\n基龙\n白颈长尾雉\n白翅蓝鹊\n白脸牛羚\n鼩鼱\n锦鲫鱼\n金头珍珠虎\n兔狲\n夏洛来羊\n狗鳄\n蓝银白色猫\n白腹紫椋鸟\n大龙虾\n毛魔目夜蛾\n玛伦牧羊犬\n纳米比亚沙漠测行蛇\n碎蛇\n蓝星珍珠\n花雕鱼\n腕足动物\n乌蚁鸟\n紫玫瑰凤蝶\n大连湾牡蛎\n黄额鸦雀\n南蛇\n雷氏大疣蛛\n红鳍鮊\n红箭鱼\n粉蚧\n粉白龙猫\n巨蜘蛛\n叙利亚刺尾蜥\n人面蜘蛛\n橙仙鱼\n湘云鲫\n太平洋鲑鱼\n红点颏\n金花马骝\n灌木新园蛛\n全蝎\n中华曙猿\n凤眼蝶\n玻璃红鲤鱼\n蚓腹银斑蛛\n非鲫\n加拿大马鹿\n银鲛\n苎麻珍蝶\n缝叶蚁\n乌燕鸥\n花脊游蛇\n红丽丽鱼\n小潜鸭\n橙带蓝尺蛾\n太阳鸟\n大鳞青眼鱼\n中国狸花猫\n貂鼠\n盾天蛾\n茶树假眼小绿叶蝉\n姬鹬\n鸡头\n螳蛉\n鼠鸟\n红尾珠蝶\n蹄兔\n长嘴鹬\n大山羊\n蓝宝石华丽雨林\n方水母\n三角鲤\n大枇杷螺\n丝绒吊鱼\n长牡蛎\n印鱼\n蛀虫\n黑羽狨\n巨犀类\n彩虹八色鸫\n糖果蜗牛\n白马头鱼\n柳杉毛虫\n孤沙锥\n褐背拟地鸦\n蛾螺\n绯鲤\n法系安哥拉兔\n扁玉螺\n黑翅红蝉\n玉米象\n金龟甲\n西藏翠蛱蝶\n飞蚂蚁\n欧洲鲟\n军曹鱼\n黑脸鸬鹚\n哈威那\n美洲肺鱼\n旌蛉\n锦鲤\n毒鲉\n高体鳜\n树皮螳螂\n水老鼠\n铜头蛇\n七彩吊鱼\n锦龟\n日本青鳉\n鳞喉隐蜂鸟\n真枝角鹿\n渔猫\n海月水母\n海蜷\n北锯龟甲\n花斑裸鲤\n大鳞锥颌象鼻鱼\n中华大刀螂\n白唇竹叶青\n花八哥花鹩哥\n渔游蛇\n关中驴\n浅蜊\n泰国斑马脚蜘蛛\n哥伦比亚泥龟\n绿鹦嘴鹎\n四喜鸟\n黑鳍巨脂鲤\n三眼甲虫\n灰林鵖\n柴棺龟\n大珠母贝\n领蝴蝶鱼\n猪蛙\n老板鱼\n矛头蝮\n东南亚虎\n蓝灰扁尾海蛇\n异色多纹蜻\n阿拉斯加克利凯犬\n哈尔滨白猪\n鹤鹰\n棕黑锦蛇\n黄边小太阳鹦鹉\n沼泽章鱼\n巨龟\n叶齿金绿泥蜂\n雀鲷\n棕顶王森莺\n简牙下鱵鱼\n黑伯劳\n刀鲳鱼\n家鹅\n伟格仕太阳鸟\n红珠灰蝶\n花鸡\n豹斑象龟\n鳞眼鮃\n兵蚁\n池沼公鱼\n雾姥甲虫\n织雀\n花生大蟋\n法国三色猎犬\n硬毛猎狐梗\n家猫\n宾沙犬\n橙线吊\n翼法螺\n拟黑多刺蚁\n岛子鱼\n玫瑰鲫\n拉蒂松猫蛛\n霸王角蛙\n高角羚\n白痣广翅蜡蝉\n蓝顶蓝饰雀\n黑尾袋鼠\n加州红腹蝾螈\n雪雀\n雪花豹\n日本似织螽\n砗磲贝\n家猪\n飞凤鱼\n蟹黄水晶毛虫\n大口蛇鳗\n带猴\n斑马蟹\n真骨鱼类\n绿凤蝶\n佛法僧\n齐口裂腹鱼\n长嘴啄木鸟\n豹纹海鳝\n四川凉山犬\n迷你狗\n中华鲟\n刺鲀鱼\n斑蛾\n二线琴尾鳉\n洞螈\n白斑狗鱼\n伏地魔猫\n白背飞虱\n蚁形甲\n蓝鸫\n小蜂\n灰犀鸟\n红靛颏儿\n黄锡鲷\n草蜘蛛\n菱蝗\n白马鸡\n云猫\n丽眼斑螳\n新疆岩蜥\n马面鲷\n诺氏鹬\n褐柳莺\n蝙蝠鱼\n花萤\n大马蹄蝠\n神秘螺\n长颈驼\n宽节巨首蚁\n红姬缘椿象\n日本双棘长蠹\n鼠鼬\n花曲柳窄吉丁\n阴阳蝶\n滨鹬\n滨浪鹬\n白背夜鹭\n亮柔拟步甲\n黄点黒蝉\n大千手螺\n金黄眼镜蛇\n极北蝰\n丛林猫\n猿叶虫\n蓝鹤\n红腰勺鹬\n三化螟\n西猯\n斗笠螺\n杜父鱼\n总鳍鱼\n惊讶猫\n
矛头蝮蛇\n三角锤天蛾\n黄尾巴鲢\n突灶螽\n迷你力斯兔\n萨伊蓝六间\n褐蛇\n红腹金鸡\n西帕基犬\n黑点灯\n叶蜂\n花鲢鱼\n中华剑角蝗\n白狮\n山黄鳝\n鬘螺\n日本鹰翅天蛾\n长鼻螺\n海兔螺\n交嘴雀\n玉米钻心虫\n花蜘蛛\n青蛙鱼\n黑纹伟蜓\n宅泥鱼\n金斑虎甲\n掘土蜂\n安东尼斯闪蝶\n水虿\n黑豚\n法国水犬\n鵟\n大骨顶\n森林漏斗蛛\n鱼鳍\n玉筋鱼\n银钩青凤蝶\n荔枝蒂蛀虫\n斑背燕尾\n马岛獴\n蓝仙鱼\n四川雉鹑\n青鳞鱼\n小地霸鹟\n大背天蛾\n安布闭壳龟\n真蛸\n蜥代龙\n雪鹑\n瓜实蝇\n灰胸竹鸡\n蜥龙鳄\n棱龟\n刺缘大薄翅天牛\n异齿龙\n细鳞太攀蛇\n花鳅\n玉米螟赤眼蜂\n西班牙指示犬\n鲈鲤\n橡树啄木鸟\n油葫芦\n东乡绿壳蛋鸡\n棕尾鼬狐猴\n麝鸭\n黄金鲤\n金钱龟\n雀鹛\n从江香猪\n铃蟾\n北部黑瘤地图龟\n黑线鳕\n姥鹬\n基围虾\n天螺蛳\n黄头庙龟\n舍腰蜂\n褐塘鳢\n巨头麝香龟\n混合蜓\n狼鳍鱼\n紫雷达鱼\n八线龙鱼\n棕头鸥\n海角鹦鹉\n鼠鱼\n六线鱼\n斑鬣狗\n雪鸡\n毛虫\n象拔蚌\n奥尼鱼\n青豹蛱蝶\n林鸮\n珠蝴蝶鱼\n新加坡红耳鹎\n三角鲂\n红松鼠\n大龙虱\n黑线银鲛\n红鳍鲫\n栗树鸭\n杨二尾舟蛾\n指标犬\n黑头鸭\n葱兰夜蛾\n盲椿象\n黑白关刀\n油彩粉红趾\n亮灰蝶\n白腰滨鹬\n非洲獴\n柯莫德熊\n桂花鱼\n半边鱼\n黑蚱蝉\n黄环林蛇\n美洲蓝凤蝶\n简阳大耳羊\n距翅雁\n银白蛱蝶\n巨叉深山锹甲\n金星宝螺\n印度棱背龟\n短舌鳎\n赛级犬\n赤鸽\n珍贵妩灰蝶\n混血猫\n麝香凤蝶\n玫红眉朱雀\n白牦牛\n雪蛤\n日本弓背蚁\n拉塞尔蝰蛇\n舒柏奇犬\n海蛾鱼\n九带犰狳\n细鳞鲴\n丁卡扁隆头鱼\n半月鱼\n红山椒鸟\n桑蚕\n南方波鱼\n斑大蚊\n幻紫斑蝶\n印度眼斑螳\n沧龙\n红斑美凤蝶\n黑脚企鹅\n美国树蛙\n豆荚螟\n山地麻蜥\n蚜狮\n豹纹魟\n响蜜鴷\n红嘴奎利亚雀\n赤颈鸫\n苇莺\n夏威夷蜗牛\n海鞘\n小瑞士犬\n果马蜂\n绿猫鸟\n刺鲃\n枸杞蚜虫\n瘦虾\n肺鱼\n柑桔凤蝶\n茶小绿叶蝉\n兰开夏赫勒犬\n韭菜迟眼蕈蚊\n蓝太平洋鹦鹉\n小啄木鸟\n达尔文雀\n达呼尔鼠兔\n横斑林鸮\n裂头蚴\n南方马口鱼\n红腰鹦鹉\n蜂鹰\n漏斗蜘蛛\n大蛙眼守宫\n太平洋玉筋鱼\n猩红蜻蜓\n米色龙猫\n飞海蛾鱼\n撒旦鸭嘴\n黑冠鳽\n疣鼻天鹅\n巨型毛冠鸽\n雉鸡\n和尚鹦鹉\n高原裸鲤\n龙须狮子鱼\n鳗鲡\n棉大卷叶螟\n荧光鼠\n白鲢鱼\n高原鼠兔\n灰蓝蚋莺\n双斑草雀\n丁鱥\n澳洲国王鹦鹉\n横带此鮆鲷\n鸣官鸟\n疣猪\n中国胭脂鱼\n鄂伦春猎犬\n唇鱼\n垂耳兔\n尖牙鱼\n黄河鸽子鱼\n半线波鱼\n鲐鲅鱼\n中华攀雀\n巴西流浪蜘蛛\n中红侧沟茧蜂\n鳉鱼\n台湾地蜥\n淡紫色猫\n双角老鼠宝螺\n长颌鲚\n蓝尖尾无须鳕\n水稻稻纵卷叶螟\n小白鹭\n大眼狮鲈\n长足虻\n龙睛鱼\n葫芦锹形虫\n黄唇鱼\n圣甲虫\n秘鲁蜂鸟\n赤腹松鼠\n东方菜粉蝶\n岩滨鹬\n拟鲶高原鳅\n非洲八色鸫\n疣蝗\n黑喉噪鹛\n黄腹旱獭\n家鸡\n天鹅绒虫\n玉米红蜘蛛\n山水牛\n海牛鲸\n榧螺\n大黑蚂蚁\n冬瓜蝶鱼\n西表山猫\n皇帝鱼\n蜜熊\n鬣狗\n小掩鼻风鸟\n白耳鹎\n钝翅苇莺\n黄潜蝇\n潜鸭\n绿鸦鹃\n藏獒\n红白锦鲤\n裸臀鱼\n白化花斑蛇\n粒纹大齿猛蚁\n阳彩臂金龟\n巨沙螽\n魔花螳螂\n紫鳗虾虎鱼\n麦町犬\n赤须夜蜂虎\n亚历山大群岛狼\n桨足纲\n鳚鱼\n白条鱼\n海鲋\n舒伯齐犬\n东流水牛\n黄绣眼鸟\n欧洲梭鲈\n深海鮟鱇\n甘蔗螟虫\n斑石鲷\n迷你马\n气步甲\n棘鱼类\n斜纹蝴蝶鱼\n家马蜂\n沙漏状蜘蛛\n切尔诺贝利巨鼠\n野鸽\n黄豹盛蛱蝶\n腔棘鱼\n希拉里蟾头龟\n蓝马鸡\n园蛛\n花骨鱼\n一点突额秆蝇\n玉米叶甲\n赤峰锦蛇\n白蜡蚧\n苏铁小灰蝶\n棋盘鲫\n陆龟\n红眼斑鸠\n娄费尔德折背陆龟\n星甲鱼\n中间黄颡鱼\n七彩马鞍鱼\n蓝帝提灯\n苜蓿多节天牛\n柳叶鱼\n东方泥龟\n花田鸡\n蠕鳗\n白鼻丛尾猴\n雪梨漏斗网蜘蛛\n绦虫\n耗子\n虎鲶\n玉女宝螺\n白扇蟌\n黄金鲫\n毛黄鳃金龟\n彩虹帝王灯鱼\n转基因鱼\n浅翅凤蛾\n金鲤鱼\n白头鹎雏鸟\n白天鹅\n台湾小头蛇\n竹笋象鼻虫\n印加鹦鹉短鲷\n亚非马蜂\n鲫\n豚鹿\n红尾热带鸟\n苹果剑鱼\n水泡金鱼\n野大白羊\n獐\n小黑水鸡\n粗皮鲷\n齿舌\n大麻鳽\n亚洲金猫\n雪花蛇鳝\n船丁鱼\n黄尾球跳甲\n虎绳蜘蛛\n小角锹甲\
n漠百灵\n水稻大螟\n洋葱螺\n乌头鸽\n透明鳞金鱼\n金腹小鹟\n南阳桃花水母\n海鲢\n稻水象甲\n红翅黑鹂\n四翼鸟\n暗色伞鸟\n绿颊锥尾鹦鹉\n维那斯骨螺\n草鸽\n白钩蛱蝶\n鬣羚\n夜鹦鹉\n靛颏\n德氏长尾猴\n菜蚜\n山鸺鹠\n伊犁鹅\n马奇异春蜓\n澳洲鹦鹉\n小白额雁\n红翅鵙鹛\n四川裂腹鱼\n大壁虎\n竹叶青蛇\n七夕鱼\n中华新锹甲\n十二红龙睛\n金鲶鱼\n埃及胡子鲶\n红腹食人鲳\n褶伞蜥\n阿勒泰驯鹿\n中华盗虻\n麻鳽\n金斑鸻\n似竺朴丽鱼\n狗獾\n兀鹰\n巨齿蛉\n黑燕鱼\n指示猎犬\n毛股沟臀叶甲\n多毛纲\n窄曙凤蝶\n科琪娜斗鱼\n方尾鹟\n短尾黄鼠狼\n瘿蚊\n角斑樗蚕蛾\n蓝太阳鱼\n黄帅蛱蝶\n大鱼狗\n灰地鸫\n李鸟\n豇豆钻心虫\n中国荷斯坦奶牛\n孟加拉巨蜥\n蓝翠鸟\n螯蛱蝶\n优哉闪蝶\n金化科大甲虫\n乌山羊\n海格力斯巨人巴布\n摩来彩灰蝶\n蛲虫\n斑缘豆粉蝶\n黄猄蚁\n火冠蜂鸟\n蒺藜纹脉蛱蝶\n黑山猪\n弯角大羚羊\n白尾地鸦\n观赏鱼\n非纯种猫\n丽鳾\n刺鳅\n柴鸡\n长尾鸬鹚\n长须狮子鱼\n红鲤\n苔藓螳螂\n傲白蛱蝶\n隐士蜘蛛\n黄波罗凤蝶\n美洲狮\n亚历山大女皇鸟翼凤蝶\n黑线姬鼠\n巴图迪古阿蜘蛛\n黄斑地图龟\n号半龙鱼\n最美紫蛱蝶\n栉鼠\n盾皮鱼\n林肯港鹦鹉\n鳄蜥\n沟眶象\n指角蝇\n冠鸮\n玉米旋心虫\n黄顶丝雀\n细鳞壮鳕\n金黄锥尾鹦鹉\n萨尔路斯猎狼犬\n王蛇\n桦斑蝶\n头索动物\n绮蛳螺\n血虫\n燕尾鱼\n绿毛龟\n老爷树蛙\n稠李巢蛾\n翠袖锯眼蝶\n大余鸭\n树鹨\n小灵狗\n波纹鳜\n二元母猪\n黄金龟甲虫\n雎鸠\n玉杵带蛱蝶\n石鲷\n细毛羊\n二角尘蛛\n纹腹鹰\n蜡螟\n黄珠宝螺\n杀人蜂\n丽彩鹀\n海鱼\n木胡蜂\n竹虫\n条鳎\n澳洲短颈龟\n红顶雀\n白骨鱼\n剑射鱼\n斑点池龟\n红尾金灯\n莱卡犬\n狮子头金鱼\n细雕织纹螺\n竹蝗\n北非大羚羊\n军金刚鹦鹉\n绿草蜥\n喇叭鱼\n春鲤\n红头环蛇\n漏斗形蜘蛛\n锯腹脂鲤\n中华红林蚁\n蓝眼皇冠豹\n古白鲑\n硫黄丝雀\n非洲野驴\n沼泽横颈龟\n榕小蜂\n欧洲玉米螟\n达摩麝凤蝶\n新西兰黑嘴鸥\n黄金鱼\n河虾\n蝲蛄\n短吻飞旋原海豚\n大鳞锯鳞鱼\n异育银鲫\n红胸果鸠\n白眉山鹧鸪\n竹刀鱼\n孔雀鸽\n河川沙塘鳢\n马来亚虎\n两色绿刺蛾\n黄鳍鲔\n硬毛指示格里芬犬\n鬼鮋\n斑姬地鸠\n丽鹰雕\n少女鱼\n摩门蟋蟀\n海滨灰雀\n松突圆蚧\n马岛鹦鹉\n蚜茧蜂\n斑鹿\n大锹形虫\n纹耳鹎\n蟪蛄\n彩虹鲷\n斑纹鸟\n斑点狗\n粉红胸鹨\n菜蚤子\n西番翠凤蝶\n扁吻鱼\n家鸭\n皮蠹\n黑鲫\n三线蛇\n长手鱼\n五罗鱼\n泰坦甲虫\n南美林猫\n竹象虫\n约安巨马陆\n马尾藻鱼\n台湾纹白蝶\n加南犬\n青文鱼\n翼凤蝶\n奶茶仓鼠\n泰国鳄鱼\n绿奇花金龟\n花条蛇\n美洲大赤鱿\n豆蓝丽金龟\n东雀鹰\n萧氏松茎象\n长角羚\n宠物鸟\n尾羽龙\n慈鲷科鱼\n缟蝇\n尖吻小公鱼\n巨锯锹甲\n竹夹鱼\n象甲\n蚯蚓螺\n斑马鲶鱼\n闪蓝丽大蜻\n折带黄毒蛾\n达尔文蛙\n白鲫\n狼青犬\n柯岛浣熊\n黄红眼鹦鹉\n红尾鸲\n郭公虫\n蓝背鱼\n砗蚝\n墨西哥鹿\n弯嘴鸻\n园鼠\n黑颈鸊鷉\n长翅稻蝗\n台湾牛\n小黄铃\n桃粉大尾蚜\n草履虫\n红脚毛毛虫\n红腹山雀\n须鲸\n荔枝螺\n扁鲨\n双头蛇\n小胫刺蝗\n钩尾鸟翼凤蝶\n大腹园蛛\n草原斑猫\n泥蛉\n花蟹蛛\n彩虹灯鱼\n高原岩鹨\n缘钻嘴鱼\n龙头鱼\n白兔鱼\n蠼螋\n巴西菲勒犬\n多棘倒吊\n塍鹬\n巨蝮\n乌蕯拉巴树蝰\n潘帕斯猫\n美凤蝶\n六间鱼\n紫色花蜜鸟\n德文莱克斯猫\n甲鲶鱼\n花豹鱼\n盔头蛇\n绿冠蕉鹃\n黄斑椿象\n黄斑宽套大蜓\n石爬鮡\n墨胸胡蜂\n小家鼠\n岩松鼠\n蓝头蜂鸟\n蚁蛉\n莺雀\n考艾岛洞狼蛛\n大草螳\n灰胸鹪莺\n金斑蛱蝶\n粉壳蛋鸡\n扁缘宝螺\n棘牡蛎\n巴卡雷龙\n恒河猴\n巴达库尔龟\n类人猿\n福建颈斑蛇\n巨人守宫\n宠物蚕宝宝\n斑眼食蚜蝇\n九尾鲍\n驼鸟\n红鲫\n褐马鸡\n紫点葵珊瑚\n巧克力色猫\n隆头蛛\n翘嘴红鮊\n长嘴乌鸦\n黑眉锦蛇\n虾蟹\n多趾猫\n宽带凤蝶\n巨圆臀大蜓\n台湾鬣羚\n波士顿龙虾\n搏鱼\n吸血蛾\n长尾麝凤蝶\n大银腹蛛\n白领八哥\n斑驴\n红目天蚕蛾\n花鼠\n洪堡鱿鱼\n星鳗\n蓝脚寄居蟹\n银线灰蝶\n红高头龙睛\n巴氏豆丁海马\n桃花水母\n假鳄龟\n金枣宝螺\n无尾瓢鸡\n红蛱蝶\n海绵动物\n大家鼠\n沙即鸟\n
扁担钩\n蓝锥嘴雀\n禽龙\n短须狮子鱼\n山寨熊猫\n姬透目天蚕蛾\n黄鮟鱇\n白腹幽鹛\n异色灰蜻\n鲈鳗\n特种野猪\n峨眉柳莺\n爬杈\n苏尼特双峰驼\n蓝线金灯\n马斯提夫犬\n麝鼠\n龙眼合夜蛾\n星头啄木鸟\n青带凤蝶\n紫鱼\n绿盲蝽\n秦岭虎\n丽鱼\n铠甲蝮\n葡萄牙波登可犬\n横斑腹小鸮\n鯕鳅\n针尾绿鸠\n刀鱼\n大贾丁氏鹦鹉\n感鱼\n黑狮犬\n陆鳄\n大眼斜鳞蛇\n鲮鲤\n红足穴猛蚁\n藏绵羊\n黄蝮海蛇\n玛雷玛牧羊犬\n巴西鲷\n拟矛尾虾虎鱼\n倭岩羊\n吸血鬼鹿\n奎利亚雀\n绿巨螳螂\n灯颊鲷\n啄花鸟\n火山兔\n金湖乌凤鸡\n棕臀凤鹛\n布履阑珊猫\n猫鼬\n黄尾白毒蛾\n七鳃鳗\n周氏啮小蜂\n鹪莺\n摇蚊幼虫\n矛纹草鹛\n毛貘\n海子水牛\n托佩克种猪\n澳洲鲭\n猫鸟\n长鳓\n招财猫鱼\n金蛉子\n安南龟\n烟青虫\n沙鯭鱼\n高山岭雀\n长趾蛙\n虎鲶 虎鲶\n天堂极乐鸟\n玫红领绿鹦鹉\n家蚕蛾\n红尾鲢\n褐镰翅冠雉\n火腹蟾蜍\n巨海燕\n圆鳍鱼\n维希拉猎犬\n黑头鸥\n联纹小叶春蜓\n眼镜鳄\n哥斯达黎加斑马脚蜘蛛\n斯马蜂\n晰蜴\n扁蜗牛\n鸽子鱼\n水晶虾\n负鼠\n罗非鱼\n青蛾蜡蝉\n敏麻蜥\n大野牛\n蛹期\n线纹蜻蜓\n黑褐色浣熊猎犬\n狭鳕\n小嘲鸫\n皮堡斯\n枯叶蟾蜍\n绿蝎\n飞蜥蜴\n红海星\n熊狸\n黑孔雀\n塞加羚羊\n巴拉金梭鱼\n高鼻羚羊\n玻璃猫\n三道鳞\n鼠妇\n树粉蝶\n蝉脱\n深海龙鱼\n星子鱼\n九斤黄\n金鲫鱼\n鬼子斗鸡\n小眉眼蝶\n白眉丝刺莺\n八带鱼\n褐头鸫\n巨林猪\n库车沙蜥\n红水泡眼\n红狮头金鱼\n臭屁虫\n短尾琉金鱼\n鸣角鸮\n花蚤\n斑灵狸\n尖头鱼\n金背蟾蜍\n菊天牛\n小鹪鹩\n绿缘扁角叶甲\n意大利蜂\n小红鹳\n比利时马\n梧桐鸟\n刺鲀\n塞特猎犬\n蔷薇三节叶蜂\n鱇康良白鱼\n水虱\n孔蛛\n矢尖蚧\n日本红珊瑚\n棕雨燕\n阿拉斯加狭鳕\n小夜鹰\n白鞘嘴鸥\n丝带凤蝶\n雪獒\n建鲤\n小蓝翠鸟\n乌冉克羊\n巨人甲虫\n风头麦鸡\n滩头雅罗鱼\n火焰吊\n美洲鳗鲡\n沙棘木蠹蛾\n麝猫\n柑橘凤蝶\n红斑翠蛱蝶\n大羚羊\n写鲤锦鲤\n眼镜仙\n玻璃鱼\n红玫瑰毛蜘蛛\n中华秋沙鸭\n珍珠狗头鱼\n和平鸽\n谷仓猫头鹰\n台湾鲷\n双瘤槽缝叩甲\n卡申夫鬼美人凤蝶\n樱桃鸡\n普通蛙\n地栖蜘蛛\n纳马雨蛙\n七彩神仙\n小佛塔芋螺\n横纹虎鹭\n蚕蛾\n高白鲑\n蝽象\n麂子\n青黄枯叶蛾\n山楂红蜘蛛\n褐林鸮\n金线灯鱼\n奇异盗蛛\n光明女神闪蝶\n蛇鹫\n豆粉蝶\n社鼠\n马士提夫犬\n棕色隐遁蛛\n林雕\n变种鲤\n低泡飞鼠\n谷米螺\n古铜色卷尾\n马雷马牧羊犬\n红尾鹲\n彩鹑\n马来雕鸮\n土公蛇\n克拉特猫\n悦目金蛛\n广翅蜡蝉\n细粒蝾螺\n带蛾\n石蝶鱼\n择长翅尺蛾\n海百合\n曾氏兔银鲛\n美蜘蛛\n藏马鸡\n原龟\n香蕉象虫\n海蜥蜴\n法螺\n德国绒毛犬\n笙珊瑚\n女王凤凰螺\n淡水鲨鱼\n啮龟\n柑桔木虱\n圆口纲\n亚马逊伞鸟\n竹象\n周斑水虻\n灰斑羚\n大穿山甲\n夜蛾\n黄腹角雉\n丑鸭\n黄山短尾猴\n食人动物\n靴雕\n费氏穿草鸫\n德氏大羚羊\n天蚕\n枣实蝇\n丛林八哥\n红薄荷神仙鱼\n始暴龙\n姬赤星椿象\n原角龙\n超级麦皮虫\n黄板鲫\n高原鼢鼠\n日本蚱\n青山羊\n高鳍鹦哥鱼\n姜弄蝶\n黏液鳗\n雨燕鹦鹉\n美国斗牛犬\n伊朗巨鼠\n彭泽鲫\n旗腹姬蜂\n雪鲷\n白腹姬鹟\n牛奶蛙\n白眉翡翠\n米奇鱼\n红嘴巨鸥\n鹮嘴鹬\n北风鸟\n蛇皮鱼\n甘薯天蛾\n帝啄木鸟\n细鳞鱼\n剑吻海蛇\n茶杯犬\n绵羊猪\n维兹拉犬\n双垂鹤驼\n云花斑裸胸鯙\n鱵\n黑嘴鸟\n牙鲆鱼\n长鬃山羊\n青魔鱼\n澳洲蓝面神仙\n白眉鵐\n斑节对虾\n大王虎甲\n黑顶吸蜜鹦鹉\n孔雀蛾\n枯叶蛱蝶\n透明短吻狮子鱼\n猎猴鸟\n树虱\n秀蛱蝶\n黄金蛙\n漠鵖\n双斑萤叶甲\n加州鲈鱼\n鹿狗\n红眼蝉\n间色宝螺\n丁蛎\n栗翅斑伞鸟\n非洲渡鸦\n刺豚\n真鳄龟\n白面水鸡\n白喉卷尾猴\n刚果灯鱼\n恩特布山地犬\n白黇鹿\n北京鸭\n吸血鬼鱼\n红戟虾\n素叶螳螂\n白眼河燕\n芝麻螺\n闪电王子\n巨鲈\n小团扇春蜓\n杨扇舟蛾\n草海龙\n鹿角虫\n红颈绿鸠\n褐美狐猴\n千手螺\n芬氏花仙螺\n南方豆天蛾\n棕顶雀鹀\n花蝇\n雪山宝螺\n暇虎鱼\n大鳞结鱼\n绿莲灯鱼\n无尾凤蝶\n瓜绢螟\n柳蝙蛾\n刺鲶\n加拉帕戈斯企鹅\n绿弄蝶\n加勒比尖背角鲨\n云南松毛虫\n涡蛛\n维多利亚鲈鱼\n玉带蜻\n
罗威士梗\n大丽灯蛾\n越南大麂\n鳞头大鳄龟\n蓝燕雀\n短尾粗吻海龙\n日本猫\n黄腹鼬\n刁子\n香蛇\n太平洋刺狮子鱼\n长角蜂\n红喉蜂鸟\n树白蚁\n斑马贝\n沟齿鼠\n斑点鹦鹉\n龙宫翁戎螺\n蓝色虎斑重点色猫\n黑玉翅鸽\n蝴蝶鱼\n食肉军蚁\n巨蛇\n黄绿游蛇\n巨陶锹甲\n沼泽兔\n金银鳞锦鲤\n冠羽画眉\n巴西涡螺\n獴\n冰川黑熊\n珍珠鳞金鱼\n毛皮兽\n桃小食心虫\n红九棘鲈\n白眉长臂猿\n长尾蓝灰蝶\n竹直锥大象虫\n台湾美凤蝶\n中华乌塘鳢\n重唇鱼\n金环胡蜂\n水熊\n白秃猴\n无鳞鱼\n眼纹广翅蜡蝉\n橙颊梅花雀\n麻羊\n牙签鸟\n黑白蝴蝶\n似鲶高原鳅\n小豹蛱蝶\n火狐狸鱼\n蝽蟓\n绿蛙\n胖虎猫\n盱眙小龙虾\n笠头螈\n吼猴\n纹面弹簧蜥\n藏酋猴\n巴西巨人金直间\n猫头鹰环蝶\n暗蓝异花金龟\n单角鼻鱼\n白带螯蛱蝶\n大团扇春蜓\n杯斑小蟌\n扭法螺\n丽色噪鹛\n黑脚蚂蚁蜘蛛\n桃红鹦鹉\n拟旖斑蝶\n鲯鳅\n虹彩吸蜜鹦鹉族\n巨斧燕子\n肉兔\n斑羚\n家隅蛛\n翱翔飞鱼\n芒部锦鲤\n金花鼠\n棘茄鱼\n黄冠鹎\n黄腹莺\n改良肉驴\n豆象\n加利克瑟犬\n对虾\n白眉蝮\n白眶蛇\n雪达犬\n狭腹灰蜻\n卷心菜毛虫\n小鳍脚企鹅\n钝吻棒花鱼\n蓝脸叶鹎\n红鲷鱼\n钩翅眼蛱蝶\n红点蟾蜍\n屎屁虫\n长吻银鲛\n蛇尾纲\n黑鹂\n白金黄金锦鲤\n蓝凤冠鸠\n郝波特犬\n帚吼猴\n噪鹛\n食蜥王龙\n野马\n负泥虫\n蜘蛛蜂\n香醇雁\n灰飞虱\n宠物鼠\n虎纹麝香龟\n花鼠鱼\n草蝉\n小旗鱼\n鹦鹉嘴龙\n豹点七彩\n紫颊太阳鸟\n黑喉山鹪莺\n蜘蛛螺\n红九纹龙锦鲤\n河猪\n箭蚁\n金线鲃\n黑翅蝉\n巴西河豚\n白腹鹭\n白丝鱼\n芒鼠\n黄雕鸮\n海蜘蛛\n鳖\n阔嘴鹬\n美国野牛\n蔗鼠\n红喉鹧鸪\n沙半鸡\n巴西游走蛛\n彩裙鱼\n红极乐鸟\n狍子\n圣文生亚马逊鹦鹉\n兜虫\n君主绢蝶\n青花鱼\n琉球松鸦\n棕腹蓝仙鹟\n橙眼白猫\n白金鲤鱼\n长鳍斑竹鲨\n高体鰤\n亮大蜗牛\n黄翅蜻\n科氏倭狐猴\n大头鳕\n南亚虎\n网纹龟\n长丝鲈\n尖头银鱼\n长脚沙漠蚂蚁\n大鸊鷉\n魟\n眼斑龟\n针毛鼠\n蚧壳虫\n厚唇凤凰螺\n山田大蚊\n小型荷兰水猎犬\n巴西夜猴\n甜甜圈龟\n黑边公子小丑\n红宝石金刚鹦鹉\n辉腹翠蜂鸟\n拟高脚蛛\n高加索山羊\n云斑尖塘鳢\n尼罗非鲫\n蓑蛾\n财神鱼\n长尾鳕\n非洲大蜗牛\n鼷鹿\n大兀鹰\n球蛛\n钱鼠\n雷公山牛王\n加勒比海红鹳\n大嘴麻鳽\n玻璃炮弹\n银杏大蚕蛾\n巴西巨人金毛\n鳓鱼\n麒麟猫\n德国向导猎犬\n红毛猩猩\n白孔雀\n东玫瑰鹦鹉\n史毕诺犬\n蓝圆鲹\n小极乐鸟\n鞍背鸦\n嘲鸫\n裂腹鱼\n箱鲀\n金丝鲶\n僵蚕\n日本鳗\n斑蝇\n鹫\n大螳螂\n无毒蛇\n透明斑马鱼\n大头鲤\n乌鹟\n爱尔兰鹿\n梨桃小食心虫\n非洲白背兀鹫\n鹨\n蜂鸟鹰蛾\n蜥鵟\n红斑鱼\n纹猫\n旋角羚\n桃蛀螟\n埃及艳后鱼\n沼泽田鼠\n中国少鳞鳜\n裸胸鳝\n石鸫\n石鸻\n班鸠\n三线短鲷\n小雕鸮\n雪羊\n野象\n黄斑黑蟋蟀\n瞎疙瘩\n黑寡妇蜻蜓\n小飞象章鱼\n马士提夫獒犬\n黑纹片角叶蝉\n短毛猫\n大白鼠\n安灰蝶\n中介蝮\n龙种金鱼\n葡萄昼天蛾\n斑啄木鸟\n露尾甲\n彩虹锹甲\n白花蛇\n鹤类\n扁虫\n黑隆头蛛\n巴西梗犬\n流浪汉蜘蛛\n玉斑美凤蝶\n淡水观赏鱼\n蜘蛛猴\n雪虎\n赤鲑\n热带真海豚\n帝王鲷\n绿眉鸭\n北美负鼠\n普度鹿\n红壁蛙\n斗羊\n锥螺\n滨鼠\n宽尾琉金\n鸭嘴鱼\n薮猫\n曦和绢蝶\n周氏闭壳龟\n白额高脚蛛\n琉璃草蝉\n黑尾红月光\n青步甲\n古菱齿象\n高加索蜜蜂\n青鳉鱼\n花鲷\n巨暹罗鲤\n麝鹿\n非洲侏儒鳄\n塞鲸\n撅嘴鲢\n双须重唇鱼\n胆形织纹螺\n黄河象\n凤头阿鹨儿\n鸟蛤\n后鳍锯鳐\n水字蜘蛛螺\n棘头梅童鱼\n七带犰狳\n两爬动物\n军配虫\n海龙鱼\n眼斑龙虾\n白腹锦鸡苗\n美洲狗鱼\n蓝尾翠凤蝶\n苏雀\n双斑大蟋\n巨蛇颈龟\n南部锦龟\n德系獭兔\n白头蝰\n红白金鲫\n极北柳莺\n长牙青蛙\n花羔红点鲑\n新西兰岸鸻\n大眼金枪鱼\n菜花原矛头蝮\n佛可宝螺\n棉花蚜虫\n大蜜蜂\n鲸类动物\n平腹小蜂\n赤眼蜂\n黄腰胡蜂\n肥螈\n斑皇鸠\n穴兔\n野山羊\n银狨\n北豚尾猴\n卢卡斯梗犬\n妞妞鱼\n哈威那犬\n土鳖虫\n彩色水母\n澳意蜜蜂\n生态甲鱼\n合欢木虱\n菜蛾\n厚唇光唇鱼\n斑脸海番鸭\n二点委夜蛾\n红缘灯蛾\n乌鸦凤蝶\n普通田鼠\n隆林黄牛\n果实
蝇\n小豆长喙天蛾\n绿巨嘴鸟\n金环宝螺\n基菱背螳\n三须公\n东北鳈\n南非剑羚\n犰狳\n粗鳞鮻\n华丽巨蚊\n数字蛱蝶\n沂蒙全蝎\n鳎目鱼\n渡边氏东方蜡蝉\n黄蜂鱼\n海蝎子\n黑鮟鱇\n水稻负泥虫\n卷贝\n柿绵蚧\n小口裂腹鱼\n新西兰秧鸡\n大颚细锯脂鲤\n棘星螺\n侏儒猪\n柳鱼\n黑腹滨鹬\n瑞士狐\n松鼠猴\n美国恶霸犬\n鬼脸天蛾\n英国马士提夫犬\n曲纹黛眼蝶\n台湾鲷鱼\n贾丁氏鹦鹉\n藏狮\n斑青花金龟\n红嘴山雀\n上户蜘蛛\n灰鼠\n巴马油鱼\n瑞典拉普杭犬\n灰树雀\n印度鸬鹚\n环颈斑鸠\n巨型鱿鱼\n撒坝猪\n图鲁兹鹅\n棘鳅\n高产奶牛\n半蓝魔鱼\n大赤旋螺\n松石七彩\n花文鱼\n叶海龙\n魟鱼\n普通鸮\n婆罗洲巨人橙边\n苹果实蝇\n紫林鸽\n澳洲仙\n勇斧螳\n蝼步甲\n狼鱼\n薄翅蝉\n笔螺\n竹针鱼\n金黄阿奎登牛\n丽绿刺蛾\n灰冕鹤\n黑胸歌鸲\n非洲侧颈龟\n七彩蓝眼鳉\n车轮螺\n海猫\n食蟹海豹\n墨龙睛蝶尾鱼\n豚海豹\n褐雨燕\n白鹮\n西班牙跳猎犬\n雪蛙\n栗耳鹀\n毛法螺\n伪鳖\n红石斑鱼\n犰狳环尾蜥\n王极乐鸟\n黑足熊蜂\n图卢兹鹅\n蜘蛛蝇\n棕腹鹰鹃\n孟加拉国豆娘鱼\n淡黄蝶\n动胸龟\n北京大蜓\n赤翅长颈金花虫\n肥胖园蛛\n黑鳍蛇鲭\n黑背心小丑\n巴西彩虹蟒\n篱螺或笠螺\n侧裸蜣螂\n食鱼蝮\n小卷蛾\n玳瑁猫\n短尾鸦雀\n深山扁锹形虫\n跳鼠\n盲鱼\n宏凯棘冠螺\n白甲乌鳢\n小黄赤蜻\n宠物猪\n啮齿类\n四川黑鲤\n青鲷\n富平奶山羊\n犁沟木纹龟\n上树鱼\n网蝽\n虻鲉\n白鱼\n歌山雀\n美蝴蝶鱼\n东非侧颈龟\n烟灰蛸\n真马\n海水神仙鱼\n核桃扁叶甲\n中稻缘蝽\n稻眉眼蝶\n合趾猿\n天蛾\n迷你垂耳兔\n南美珊瑚蛇\n变色蛇\n大理弓鱼\n短尾绿鹊\n绿茸线蛇\n琉球歌鸲\n穹翠凤蝶\n小黑斑凤蝶\n月光螺\n食蝠鸢\n鯷鱼\n苔蛾\n山字宽盾蝽\n温室白粉虱\n异鳞蛇鲭\n毛螳\n银大眼鲳\n四角山羊\n红眼鱼\n阿根廷龙\n大鼻鸽\n红腿叫鹤\n黑颈长脚鹬\n长吻鱼\n短面熊\n蓝面神仙鱼\n小鸊鷉\n地蛆\n佩尔什马\n伪死人头蟑螂\n北露脊海豚\n阿比西尼亚蓝猫\n苹果红蜘蛛\n蓝眼灯鱼\n刻克罗普斯蚕蛾\n蛀螟\n食蚊鱼\n蓝点钝口螈\n大豆食心虫\n仙女鱼\n黑森林猎犬\n蓝带翠鸟\n斯里兰卡凤头鹰\n波纹眼蛱蝶\n灰纹鹟\n大圆鄂针鱼\n中华彩丽金龟\n褐头鹪莺\n牛皮蝇\n鹠\n天堂鸟\n环喉雀\n榆绿毛萤叶甲\n石宾光唇鱼\n圣诞树珊瑚\n彩裳蜻蜓\n冠恐鸟\n海天使\n耐寒龟\n红蚂蚁\n三角鲤鱼\n长刀鱼\n台湾草蜥\n甲蝇\n赤尾噪鹛\n蓝孔雀\n大甲鰺\n日本短毛猫\n花笠螺\n大仙鹟\n深山锹形虫\n菱鲷\n斑纹泥龟\n中华九刺鱼\n黑斑原鮡\n尼姑鸽\n白鲟\n狮虎兽\n红线波鱼\n月光蝶鱼\n蛆虫\n高山天牛\n黑斑鲫\n密鳞牡蛎\n丁鲷\n黑龙江满洲龙\n南极狼\n美国银虎斑猫\n灰鹰\n商城黑猪\n云南秃角蝉\n野桑蚕\n巨型蜻蜓\n十姊妹\n长鬣蜥\n多鳍鱼\n牧迪犬\n斗米虫\n铲齿象\n红颊獴\n碎斑青凤蝶\n绿树蜥\n鲷鱼\n蓝蝴蝶鱼\n中国黑白花奶牛\n赤石斑鱼\n指狐猴\n暗棕鵟\n肯氏龙\n山坑螺\n黑带二尾舟蛾\n双锯鱼\n青玉粗尾蝎\n绿壳蛋鸡\n英国马士提夫\n星螺\n皮金龟\n凤头䴙䴘\n美丽蚁蛛\n美洲斑潜蝇\n加泰隆牧羊犬\n黑菌虫\n帕特大勒梗犬\n灰喉针尾雨燕\n梭鲈\n鹿角花金龟\n火兔灯鱼\n水虎鱼\n绿蓑鸽\n山溪鲵\n短翅船鸭\n墨脱竹叶青蛇\n鲃鲤\n田鱼\n臭椿沟眶蟓\n胡蝉\n细鼷鹿\n棉蝗\n欧洲泽龟\n冰清绢蝶\n扁颅蝠\n帝王紫蛱蝶\n潮汕蝾螈\n白斑翅拟蜡嘴雀\n角额壁蜂\n科罗澳拟蟾\n星文鸟\n赤颈鵟\n搜救犬\n多异瓢虫\n珠星雅罗鱼\n眼镜食叶猴\n烟草甲\n海南八哥\n金纹细蛾\n河狐\n扇尾斗鱼\n扶桑绵粉蚧\n曲纹袖弄蝶\n双环凤蝶\n泥东风螺\n印度猫\n太平洋岩鱼\n白角星弄蝶\n紫光箩纹蛾\n美洲金翅雀\n紫蛙\n涡虫\n婆欧里鸟\n锦鲫\n摇鹊鸲\n凤头燕鸥\n大牙土锯天牛\n牧羊犬\n大嘴苇莺\n金腰燕\n白唇龟\n小巴丹鹦鹉\n瓦灰鸽\n河鲈\n金翅\n水蚤\n白圈三线蝶\n珍珠蚌\n金条鱼\n孔雀鲤\n獐鹿\n赖蛤蟆\n大蜻蜓\n三突花蛛\n海虾\n浅黄锦鲤\n林鸱\n蠹虫\n圣赫勒拿蠼螋\n绒冠蓝鸦\n金带美法螺\n星点石斑鱼\n白肌银鱼\n小盗龙\n俄罗斯黑梗\n美国王鸽\n松阿扁叶蜂\n小羚羊\n刺尾鱼\n黄翅灰蜻\n美洲凤头山雀\n燕尾凤蝶\n海百合纲\n巢鼠\n红尾鱼\n真鲷
\n黄颡\n两栖龟\n黑粉虫\n象拔\n棕尾褐鹟\n柑桔小实蝇\n蓝豆娘\n中华哲水蚤\n小型猪\n黄金灯鱼\n冠翠鸟\n草鞋蚧\n拟鳄龟\n红角鸮\n耳斑神仙\n韭蛆\n珍珠龙\n福建钝头蛇\n紫寿带鸟\n俄罗斯鲟\n中环蛱蝶\n非洲野猫\n纹喉凤鹛\n长尾鸮\n膨跗毛蚊\n珠螺\n侧颈龟\n白翅叶蝉\n黄斑园蛛\n斑鸽\n细手指珊瑚\n棕色大熊猫\n高砂熊蝉\n东非低地大猩猩\n皮球鱼\n点目大蚕蛾\n锯线天蛾\n哈萨克猎犬\n榆绵蚜\n水龙兽\n黄兔尾鼠\n中国水蛇\n鼩\n紫蜻蜓\n麻田金龟子\n东亚狼\n竖琴螺\n玉带凤蝶\n斑须蝽\n鹩鹛\n希氏蟾龟\n黄胸田鸡\n鳞鲤\n旅鸫\n白顶信天翁\n莓鲈\n黑叶猴\n金鲫\n茶壶鱼\n窄斑凤尾蛱蝶\n俄罗斯短毛猫\n溪蟹\n黄金欧洲陆龟\n鵟雕\n桃红颈天牛\n珍达鹦鹉\n黄帆一点贝\n金刚虎鱼\n多齿暴龙\n彩雀\n斑海鲶\n黑龙睛\n谷盗\n钝吻鮠\n尖嘴鸟\n艾氏拟水龟\n黑丛尾猴\n乌头驴\n潭门砗磲\n大林姬鼠\n松毛虫\n泰国鲫\n龙纹短鲷\n台湾大虎头蜂\n松叶锦鲤\n双齿多刺蚁\n东方鲀\n台湾双尾燕蝶\n暗色凤头鹟\n落叶松球蚜\n马鲭鱼\n黑兰寿\n小犰狳\n西伯利亚鲟\n史氏中喙鲸\n裸滨鼠\n和牛\n犀金龟\n8\n梨蚜\n叫鸭\n斑点叉尾回\n漏斗网蜘蛛\n苏丹盾甲蜥\n小鸥\n麝牛\n红琉金金鱼\n鸡龟\n红林翡翠\n红苹果鱼\n鹤顶红金鱼\n黄尾蓝魔\n西部锦龟\n大眼红鲌\n高加索野牛\n东非角巴布\n木匠蜂\n埃姆登鹅\n瓜黑斑瓢虫\n顿河马\n五彩吊\n斑鳜\n观赏虾\n美洲钢铁蓝蜘蛛\n黄缘蛱蝶\n黑尾鹬\n锈凹螺\n丝丁鱼\n大绿叶鹎\n眼镜鸮\n角色卷蛾\n刚毛太仓小猎犬\n琥珀闪蝶\n躄鱼\n牛头鸭嘴\n骚扰阿蚊\n角蝰\n白唇泥龟\n红带袖蝶\n狮王斗鱼\n多腿鸡\n猎豹\n中国昆仑山脉犬\n翘嘴鮊\n皖南龙\n多鳞铲颌鱼\n平颌鱲\n乌苏里貉\n红狮牧羊犬\n剑齿象\n铅笔鱼\n琉璃小灰蝶\n家茸天牛\n刺花螳螂\n真鲷鱼\n红绿橙\n埃及螳螂\n石蚌\n大扁头蟋\n无齿鲳\n日本锦蛇\n粗灰蜻\n秦川牛\n啄花雀\n棕三趾鹑\n长鳍金枪鱼\n梭鱼\n栗背林鸲\n银叶猴\n大猿叶甲\n重庆原矛头蝮\n古骆驼\n红羽极乐鸟\n中华婪步甲\n草金鱼\n阿文绶螺\n迷蛱蝶\n麂羚\n鸮鹦鹉\n幻紫斑蛱蝶\n潜水钟蜘蛛\n黑巨口鱼\n黑夜猴\n陕西卫矛\n点鱼\n大红鱼\n黄胸鼠\n矮岩羊\n叶蝉\n全州禾花鱼\n老鼠鱼\n黑浮鸥\n虎灰蝶\n红白鹦鹉\n古巴雀鳝\n薮猪\n远古蜈蚣虫\n咖灰蝶\n诸城暴龙\n珀翠蛱蝶\n细鳞鳑鲏\n飞龙粉蝶\n谷蠹\n绿毛乌龟\n斑鱧\n黄树巨蜥\n斜绿天蛾\n珠履带蛱蝶\n纹凤蝶\n黄翼蓝顶亚马逊鹦鹉\n蜥鳄\n绿蛱蝶\n柑橘木虱\n图库曼鸺鹠\n中华鳖\n蒙山金蝉\n香蕉交脉蚜\n虎狮兽\n印度鳄龙\n海鲂\n蓝色猫\n木纹龟\n蛀木虫\n牙签鱼\n日本七鳃鳗\n鹭鸟\n高体型异育银鲫\n乔氏猫\n珍珠虹鱼\n小天使翠凤蝶\n印度紫蛙\n红衣凤头鸟\n豆眼黑鸽\n彩蝶\n松巴皱盔犀鸟\n圆澳龟\n栗斑腹鹀\n巴里坤马\n大足黑山羊\n黑顶山雀\n金矛头蝮\n平毛寻回犬\n大山雀\n长尾鼬\n丝尾鳠\n非洲隼雕\n白针狮子鱼\n诸罗树蛙\n鲂鮄\n缩骨花鲢鱼\n大豆胞囊线虫\n大紫蛱蝶\n小魔花螳螂\n黄边胡蜂\n非洲屋蛇\n中华板齿犀\n柑桔潜叶蛾\n波蛱蝶\n竹鼠\n红色猫\n中华忍冬圆尾蚜\n德国莱克斯猫\n彩艳吉丁虫\n伊洛瓦底江豚\n北美凤头卡拉鹰\n环海豹\n非洲鹿\n梭子鱼\n西北非洲猎豹\n蓝蜻蛉\n黑苇鳽\n白耳狨\n巨型沙螽\n白腹长尾猴\n紫吊\n石决明\n土巴龙\n黄喉柳莺\n帆鳉\n广西笔尾灰犬\n菜叶蜂\n彗尾蛾\n菠萝鱼\n山椒鱼\n三角灯鱼\n烟蚜\n鹰隼\n阿彭策尔山犬\n鹊鹞\n黑银板\n鳎\n白耳画眉\n烟台黑猪\n镶黄蜾蠃\n五道黑\n烟粉虱\n白尾角马\n九间贝\n华贵折衷鹦鹉\n木虱子\n加那利犬\n鸭绿江面条鱼\n骨鳞鱼\n单猎犬\n荔枝蝽\n麝鼹\n油蝉\n阿图瓦犬\n大眼鱼龙\n烟实夜蛾\n黄脚鸥\n灰噪鸦\n比格\n红胸吸汁啄木鸟\n老挝岩鼠\n刺鱼\n鱼鸦\n伯恩山犬\n中华虎凤蝶\n长吻海蛇\n拟蚺蛇\n纨裤麝凤蝶\n亲亲按摩鱼\n玄猫\n锯缘龟\n眼虫\n黑眉鸦雀\n鲟鳇鱼\n褐鱼鸮\n马鲛鱼\n中华棘鳅\n蒙古鼠兔\n拟椋鸟\n青脚麻鸡\n维多凤冠鸠\n彩蛱蝶\n四腮鲈鱼\n红衫鱼\n剑龙类\n灰乌鸦\n松十二齿小蠹\n东方雀鹰\n黑弄蝶\n鸟嘴斜带天蛾\n鲮鱼\n眼纹斑叩甲\n瞪羚\n熊狗\n爪哇豹\n雉\n奶牛猫\n宝兴歌鸫\n亚
马逊泥龟\n铜色腰蜂鸟\n马来食螺龟\n猫眼蝾螺\n小凤头鹦鹉\n侧点桨鳍丽鱼\n尖音库蚊\n山孔雀雉\n摇蚊\n犬齿牛尾鱼\n南滑蜥\n豆螺\n葡萄透翅蛾\n大伞弄蝶\n翡翠凤凰鱼\n烟黑色波斯猫\n圣多美鸽\n大帆倒鲷\n小地老虎\n黑胸胡蜂\n花豹\n台湾鹎\n西班牙狼\n珊瑚虫\n木蠹蛾\n玉米毛蚁\n大豆蚜虫\n鸡鱼\n暮眼蝶\n平头岭鳅\n茎蜂\n蚁狮\n英国波音达\n石墻蝶\n狗蚤\n奶牛豹鲸\n澳洲麦鸡\n翘嘴鲌\n粉红双锯鱼\n塔螺\n大蚕蛾\n中国癞象\n鬣鳞蜥\n红松石\n及达尖犁头鳐\n黑喉辉蜂鸟\n长白山蚂蚁\n玉米田棉铃虫\n红白鲤鱼\n珍珠虎\n缨冠蜂鸟\n裸蛛甲\n青山蜗牛\n大眼海鲢\n家畜绵羊\n海鲫\n童蟒\n中国蜂\n垃圾鱼\n锯犁鳗\n华艳色蟌\n罗福梗\n褐黄前锹甲\n大星步甲\n乌鸽\n日本松干蚧\n马岛鸭\n榆绿天蛾\n蛤蚧\n鼠鹿\n西貒\n叉牙指鱼\n双斑长脚蜂\n散纹盛蛱蝶\n亚洲大蚂蚁\n花粉蝶\n四眼鱼\n凤头苍鹰\n素饰蛱蝶\n缝叶莺\n豆虫\n多眼灰蝶\n青苔鼠\n夜光虫\n虎纹伯劳\n黄眉柳莺\n浅缝骨螺\n哈萨克犬\n琴蛙\n白条锦蛇\n细腰褐泥壶蜂\n扁豆小灰蝶\n桑氏锯脂鲤\n亚马逊鲇鱼\n树丛浣熊猎犬\n报喜斑粉蝶\n牛獒\n湖鲟\n高身银板鱼\n白龟\n大二尾蛱蝶\n哈瓦那卷毛狮子犬\n福寿螺\n琴鱼\n大灰象甲\n蚬蝶\n钟鹊\n山鹧鸪\n黑胸钩嘴鸢\n家鼠\n红尼罗鱼\n滇池金线鲃\n隐形大熊宝螺\n扇砗磲\n比例猎犬\n三湖鱼\n哥法地鼠龟\n澳洲长颈龟\n虎斑猫\n白面粗尾猿\n蛾蠓\n蚤蝼\n鹰头鹦鹉\n泰坦大天牛\n新西兰红嘴鸥\n蓝虎\n星点水龟\n姬鹟\n红头虾\n马蜂\n红白锦鲤鱼\n鲱形白鲑\n斑胁水鸡\n天竺鲷\n硬毛鼠\n五彩金刚鹦鹉\n栗背短翅鸫\n白乌鸦\n太阳猫\n扁口鱼\n石扒子\n中臀拟鲿\n茶斑蛾\n后鳄\n山斑鱼\n欧斑鸠\n台湾噪鹛\n东方田鼠\n改良肉牛\n中华蝾螈\n乌线雀鹛\n雉形鸦鹃\n烛光鱼\n苦恶鸟\n鲑冠凤头鹦鹉\n高地猫\n鸡冠蛇\n黄河小鲶鱼\n噪鹃\n蜂蝇\n红娘鱼\n小左鲆\n明蛤\n帝王亚马逊鹦鹉\n草鸟\n中华枯叶蝶\n观赏鸽\n花斑刺鳃鮨\n德国钢毛指示犬\n冬眠\n小黄家蚁\n巨型上户蜘蛛\n费里拉马蹄螺\n粉红圈锯背龟\n秘鲁鹈鹕\n白带燕凤蝶\n无翅亚纲\n林山雀\n狞猫\n短毛太仓小猎犬\n特纳兹铅笔鱼\n黑眉长尾山雀\n红姑鱼\n阉鸡\n紫崖燕\n吸石鱼\n眼斑螳螂\n孔雀鲷\n巨腿螳\n雕齿兽\n斑鳍飞鱼\n金目鲷\n夹竹桃天蛾\n箭毒青蛙\n斑鳟\n潘氏闭壳龟\n金带花鲭\n仙企鹅\n日本锦鲤\n吕宋蜻蜓\n大叶黄杨斑蛾\n双棘长蠹\n窄斑翠凤蝶\n德国猎梗\n歌篱莺\n戴褐臂金龟\n蛾\n刀鳅鱼\n卡斯特罗拉博雷罗犬\n长尾鵟\n斑马神仙鱼\n中华草龟\n印度豹\n肿腿蜂\n北极拟圆鳍鱼\n残锷线蛱蝶\n蓝眼贝\n别光锦鲤\n地图龟\n红腹啄木鸟\n戈登赛特犬\n鹑螺\n飞虎\n东非冕鹤\n钻石哨兵\n栗褐鹃鸠\n加拉帕戈斯地雀\n波鱼\n日本犬\n台湾黄蝶\n绿鸠\n楔尾丽椋鸟\n中国大锹\n泰国虎鱼\n树大蚕蛾\n大乌龟\n白尾蓝鸦\n巴西黑\n叶结鱼\n额河银鲫\n白蛤\n五间鱼\n云南巴狗\n鹤驼\n花伞软珊瑚\n扭弯布纹螺\n四耳猫\n相思带蛱蝶\n铜翅水雉\n后毒牙蛇\n白文鸟\n短尾突吻鳗\n鲷\n山烙铁头蛇\n穆蛱蝶\n棘冠海星\n黎明闪蝶\n奥锹甲\n紫胶蚧\n麻料\n蚝\n僧帽猴\n元鱼\n海黑鱼\n大西洋黄鱼\n绿颊亚马逊鹦鹉\n蹼鹬\n角原矛头蝮\n日行守宫\n双全白环蛇\n爱尔兰萨特狗\n土灰笋螺\n九间鱼\n钻心虫\n黄足黑守瓜\n维几尼亚负鼠\n米蛾\n长嘴鸟\n凤头鸊鷉\n蓝翅叶鹎\n尖鼻蝰蛇\n非洲慈鲷\n花纹细螯蟹\n黄金鲤鱼\n大凤蝶\n巨嘴柳莺\n卷象\n海南马蜂\n毒蜘蛛\n乌鱼\n卑怯管叶虫\n青铜金龟\n蝎泽龟\n沙漠角蝰\n台湾宽尾凤蝶\n海鲂鱼\n云南裂腹鱼\n石龙子\n白腹鹞\n细蛾\n长湖银鱼\n小青虫\n洋燕\n棕长脚蜂\n短鳍澳洲鳗鲡\n大眼真鲨\n樱桃谷鸭\n马岛缟狸\n蛴螬虫\n古巴钩嘴鸢\n林鼠\n负子蜍\n睡鼠\n扁尾海蛇\n帝王伟蜓\n巨蛤\n棕足鼯鼠\n白颈乌鸦\n豹鲂鮄鱼\n双钩异翅长蠹\n蓝喉拟啄木鸟\n雉鸠\n黑色蝇虎\n中国鹦鹉嘴龙\n灰斑鸠\n大吻虾虎\n异粉颜蚜蝇\n翡翠贻贝\n鸊鷉\n兔鼠\n反舌鸟\n茸毒蛾\n尖嘴地雀\n红珊瑚\n狐獴\n狸猫\n黑腹燕鸥\n气蛤蟆\n菜虫象\n无翅昆虫\n小粉蝶\n水稻灰飞虱\n细尾鹩莺\n红嘴蓝尾喜鹊\n蓝斑马\n蜻蜓稚虫\n月季长管蚜\n拟龟壳花\
n水字螺\n直纹蜘蛱蝶\n无齿芙蓉龙\n加州海狮\n德国锦鲤\n蛇鳗\n红鮋\n绿篱莺\n花角罗汉鱼\n红翅皱膝蝗\n大绿金刚鹦鹉\n草花蛇\n维多利亚肺鱼\n漂白蜻蜓\n黄三角吊\n沟鼠\n马尾斗鱼\n旅鼠\n白嘴潜鸟\n陆鬣蜥\n帝王三间鱼\n大海鲢\n蒙古寒蝉\n虹银汉鱼\n东非水字螺\n铜头蝮\n红颊鹦哥\n水稻福寿螺\n无眼蜘蛛\n灵缇拉萨犬\n淡水鳗\n收割蚁\n黑帝王魟\n黑辉极乐鸟\n非洲虎鱼\n蓝翠蛛\n管猫\n副狮子鱼\n鹭鹤\n少鳞裂腹鱼\n透明鱼\n纵纹锦鱼\n无眼金线鲃\n棉蚜刺茧蜂\n鞭蝴蝶鱼\n大猫\n天蓬鱼\n世界上最丑的鹦鹉\n小白鼠\n安娜图牧羊犬\n蛇王孔雀鱼\n石首鮰鱼\n雪貂\n太阳龟\n澳大利亚红背蜘蛛\n鱼蛉\n万能梗\n木虫\n悬蛹\n荨麻蛱蝶\n秋赤蜻\n果核泥龟\n黄刺鱼\n鳗鳞\n红背蜘蛛\n腊肠猫\n凤鲚\n波斑鹭\n凤头鹧鸪\n三线舌鳎\n四须盘鮈\n棕污斑螳\n麦穗鱼\n裸鲤\n星斑川鲽\n白蛉\n德州条纹木蝎\n团团圆圆\n麝香龟\n非洲冕雕\n高加索羊\n黑窜鸟\n黑丽翅蜻\n钉螺\n南洋大兜虫\n黑端豹斑蝶\n黑白鹰雕\n凹尖腹蜂\n菜螟\n宽带鹿花金龟\n黑狼犬\n巨型海蜘蛛\n菲律宾穿山甲\n水游蛇\n蓝鳍剃刀鱼\n乌信天翁\n白尾鸢\n火焰仙\n长江间银鱼\n蟋螽\n黑尾胡蜂\n奶蛇\n巨松鼠\n长春鳊\n蒙面神仙\n斑胁火尾雀\n胎生蜥蜴\n大鹮\n高山兀鹫\n蓝牛羚\n布龙度蝎子\n火鸭\n疙瘩宝螺\n柞蚕\n跳虫\n银带鳚杜父鱼\n角菱背网蝽\n细胸金针虫\n剥皮鱼\n黄金鲷\n克拉克星鸦\n肉用大雁\n松瘤象\n蜡蚧\n红肛凤头鹦鹉\n丁氏蝴蝶鱼\n蜜罐蚁\n金灯鱼\n北美灰熊\n高山蛙\n飞蚁\n白毒蛾\n新西兰信天翁\n拟八哥\n导聋犬\n白腹蓝姬鹟\n白乌鱼\n戴笠鸽\n鹤嘴鱼\n灰文鸟\n扁鼠\n蓝嘴神仙鱼\n橙腹草原田鼠\n蜘蛛甲\n日本地龟\n龟板\n食蜗步甲\n大鼠耳蝠\n双脊龙\n棘角蛇纹春蜓\n牙克煞龙\n孔雀巨蜥\n金苔鼠\n食蟹狐\n鬼獒\n吞噬鳗\n小黄蛉\n阿法六线风鸟\n马哈美凤蝶\n白腹鹦哥\n鳙鱼头\n金钱鮸\n布里牧羊犬\n新西兰白兔\n法国八达鸽\n短鼻六间鱼\n香港蝾螈\n太湖新银鱼\n新疆细毛羊\n虎鼬\n勃氏蝴蝶鱼\n彩虹蜂虎\n德国熊犬\n梨星毛虫\n黑鼠\n云斑裸颊虾虎鱼\n梅白鱼\n鲀鱼\n造纸胡蜂\n白头鹀\n云斑鹦鹉\n茜草白腰天蛾\n日本金鱼\n笑隼\n北美鼠兔\n榆绿叶甲\n兰寿金鱼\n红头噪鹛\n史奇派克犬\n清水石斑鱼\n龟甲宝螺\n山羚\n惊天兽\n蝎子鱼\n稻雀\n粉点虾虎\n唾蛇\n白胫长大蚊\n纠结清白招潮蟹\n绿雀鹎\n猫蛱蝶\n山地鸡\n灰斑鸻\n电鳗\n舞虻\n蒙古野驴\n康氏矮海豚\n白毛小灰蝶\n德国黄胡蜂\n草蜥\n天使尾天蚕蛾\n单环刺螠\n青海沙蜥\n五莲黑猪\n长吻鮠\n小丑蛙\n叶吻银鲛\n红宽带蝴蝶\n野蚕\n蛇龟\n鲁北白山羊\n版纳蛙\n尖头塘鳢\n英国玳瑁猫\n侏狨\n光端缺翅虎甲\n地鳖\n长尾妍蜂鸟\n幽灵蛛\n七彩鲑\n黄鸲鹟\n西昌高山黑猪\n短吻间银鱼\n日本大鲵\n钻石虾\n金猫\n白斑鸠\n蹬倒山\n费氏灰色牡丹鹦鹉\n黄金豹纹狗头\n花甲螺\n野蚕蛾\n黑尾神仙\n方蟹\n象鼻龟\n黄鼠\n马蹄螺\n圆仔\n似鱤\n哥威斯犬\n马岛潜鸭\n火焰神仙鱼\n拉氏清溪蟹\n树蛇\n艾伦水虎鱼\n长颈羚\n路亚黑鱼\n仙琴蛙\n琥珀蚕\n短翅蝙蝠\n东方角鸮\n切斯特白猪\n屁步甲\n月光蝶\n东部菱斑响尾蛇\n兰花虫\n孔夫子锯锹\n金光鲫\n黑树巨蜥\n黄铜翠蛱蝶\n咕咕鸟\n鱇鱼良白鱼\n黒靴陆龟\n紫红蜻蜓\n迁粉蝶\n黄长脚蜂\n青山雀\n油鱼\n高山短翅莺\n宽翅树蟋\n阿穆尔隼\n肯龙\n龟金花虫\n刺魟鱼\n棒刺大头蚁\n彩虹巨嘴鸟\n麻鹬\n喜马拉雅塔尔羊\n穹宇萤\n红帽金鱼\n软珊瑚\n波纹小灰蝶\n西班牙舞娘\n绿背金鸠\n拉格多尔猫\n亚洲长纺器蛛\n小苇鳽\n山鵟\n卷尾猴\n沟齿鼩\n尖喙蛇\n中华蜜蜂\n土猫\n中华狼青\n军犬\n大地鸲\n纹蓝小蜻\n长臂金龟\n东方雪人\n巨型木虱\n大春蜓\n乳白啄木鸟\n罗氏帆鳍鳗\n暗黑鳃金龟\n双尾虫\n棕泥壶蜂\n灰头群织雀\n鼠猴\n花豹盛蛱蝶\n柳刺皮瘿螨\n刺豚鼠\n裸眼小凤头鹦鹉\n加岛花鹿\n竹红天牛\n大嘴鲸\n水虻\n黄脚虎头蜂\n鸭嘴鲨\n黄蚂蚁\n灰冠鹤\n林鸫\n非洲巨蛙\n黑长喙天蛾\n箱豚\n南非林羚\n泰国虾\n白头秃鹫\n北京黑猪\n阔臂螳螂\n细腰蜂\n蓝极乐鸟\n粉脚雁\n巨蜥\n泥蜂\n美洲森林野牛\n大连香螺\n巴西黄金蛙\n尖翅翠
蛱蝶\n绿眼蛙\n绿带翠凤蝶\n野杜鹃\n葡萄虎天牛\n新疆漠虎\n欧鸬鹚\n北美蓝鸟\n喜庆亚马逊鹦鹉\n高脚蟹\n欧洲松貂\n刀背麝香龟\n凤头雀嘴鹎\n剑水蚤\n吸蜜鸟\n绒毛猴\n淡水石斑鱼\n黑青小蜂\n沼蝇\n折衷鹦鹉\n台湾猴\n易碎双腔龙\n普通黄胡蜂\n非洲鬣狗\n中国地鼠\n暗色水鸡\n欧洲缅甸猫\n青绿顶亚马逊鹦鹉\n黏虫\n石头鱼\n斑粉蝶\n阿波罗娟蝶\n金鹦鹉\n华尾天蚕蛾\n九香虫\n欧鼹\n黄蜂蝇\n绿豆象\n绿拟啄木鸟\n古代长须牧羊犬\n褐麻鳽\n屎壳郎\n剑羚\n美洲草原野牛\n细纹斑马\n拟地甲\n无霸勾蜓\n塔蓝图拉毒蛛\n玫瑰葵花鹦鹉\n麝香鹿\n天眼猫\n黄裳眼蛱蝶\n幽灵箭毒蛙\n和马蜂\n墨西哥火膝\n扇尾鸽\n山山牛\n桃潜叶蛾\n黄裙凤蝶\n统帅青凤蝶\n印度舌鳎\n白喉冠鹎\n鸲蝗莺\n黄河鲤\n草鳊\n布鲁芬氏金刚鹦鹉\n艾蛛\n长毛天牛\n巨石斑鱼\n中华露螽\n红面具锥尾鹦鹉\n气泡珊瑚\n黑冠猕猴\n热带鱼蓝鲨\n乌母鱼\n澳洲折衷鹦鹉\n卵圆蝎蝽\n非洲劲蜂\n古巴蟑螂\n鹿豚\n绿鹦鹉\n厚颌鲂\n秋刀鱼\n宽纹北箭蜓\n草原鸡鵟\n东方果实蝇\n云南蝗\n桃舌蜥\n青竹鱼\n湿地苇莺\n笠螺\n黄帆天堂鸟\n黄藤鸟\n七腿花蜘蛛\n棒花鱼\n江鳕\n鳞翅目昆虫\n欧亚红尾鸲\n赤星瓢虫\n毛虾\n犊牛\n仙女虾\n髭海豹\n蒙链荫眼蝶\n土熊蜂\n猪肉绦虫\n麦二叉蚜\n虬眉带蛱蝶\n小绿鸠\n鳝鱼\n海蛳螺\n佛鳄\n叶甲\n象海豹\n壁蜑螺\n角红蟹蛛\n蛹化\n沼泽侧颈龟\n穴小鸮\n金头缝叶莺\n玩赏犬\n红玉颈绿啄木鸟\n眼鳢\n黄金神仙鱼\n茴香虫\n非洲牛箱头蛙\n泰坦蟒\n獾\n古巴亚马逊鹦鹉\n断板龟\n书虱\n鲟\n疑步甲\n哥斯达黎加老虎尾\n云南盘鮈\n灰泥甲虫\n锹甲\n圆口铜鱼\n文昌鱼\n欧洲鳇\n三角蟹蛛\n龟背天牛\n野兔子\n健壮刺蛾寄蝇\n美洲野牛\n五间半\n沙蚤\n大胡蜂\n麂\n波多黎各亚马逊鹦鹉\n九纹龙锦鲤\n吹绵蚧\n黄胸泥壶蜂\n大明斯特兰犬\n双冠龙\n蓝九间\n乳黄色猫\n普通朱雀\n树蜂\n食人鲳\n中华红蚁\n彩色球形珍珠金鱼\n巴厘虎\n翠鸟大山雀\n枯叶螳螂\n侏儒眼镜猴\n法国蜗牛\n巴拉望巨扁锹形虫\n方尾鸢\n蝰鱼\n牙獐\n沟牙鼯鼠\n矛蚁\n猎蝽\n冰岛犬\n小吻虾虎鱼\n红梢子\n菲律宾铠甲蝮\n黑啄木鸟\n琉璃蛱蝶\n脊青步甲\n招财鱼\n阿岛信天翁\n雀鳝类\n大白凤头鹦鹉\n锯翅天蛾\n巨扁锹甲\n山蓝鸭\n宝珈枪吻海龙\n秦岭四宝\n喋喋吸蜜鹦鹉\n食蜗龟\n褐家鼠\n琉金金鱼\n码绢金龟\n岩鼠\n鳞鱼\n红疣猴\n红艳天牛\n褐隼\n普鲁斯鳄\n火缘步甲\n加拿大臭鼬\n柿蒂虫\n翠青蛇\n白尾金喉蜂鸟\n巨山龟\n双带鰺\n丝蝴蝶鱼\n黄芦蜂\n白尾灰蜻\n褐红钝猛蚁\n雀鳝\n小鳄鱼\n暗背雨燕\n海洋鱼类\n蓝金花虫\n贝加马斯卡犬\n广场鸽\n银鲫\n花金龟\n南极小须鲸\n乌塘鳢\n稀有白甲鱼\n七彩鱼神仙鱼\n圣多美绿鸽\n小猎兔狗\n蓝喉仙鹟\n秘鲁马驼鹿\n杨桃鸟羽蛾\n白甲鱼\n北极七鳃鳗\n风头鹰\n巨型哲罗鲑\n欧歌鸫\n阳江鹅\n美洲角雕\n开角龙\n小火蚁\n列日鸽\n相模湾玉螺\n刀锹形虫\n夜光猫\n瓜斑潜蝇\n小丑灯鱼\n巴拿马金蛙\n公鱼\n靴脚陆龟\n姬啄木鸟\n白头鸽\n红黄拟啄木鸟\n荷包鱼\n蓝艳金花虫\n松江鲈鱼\n苏里南负子蟾\n稻飞虱\n米诺鱼\n火焰贝\n沙塘鳢\n爬蚱\n六斑曲缘蜻\n四川柳莺\n飞蛾\n流星泽龟\n篮子鱼\n蛙眼守宫\n白金龙鱼\n透明蛙\n喜马拉雅蜜蜂\n长尾鸡\n麦蛾\n七彩燕鱼\n红带新鹿蛾\n隐鹮\n粉蠹\n日本梗\n海王龙\n达乌尔黄鼠\n百色闭壳龟\n蛇岛蝮\n柳根鱼\n跳甲\n快盗龙\n意大利硬毛指示犬\n条尾鸢魟\n黄纹三锥象鼻虫\n紫薇绒蚧\n蓝凤蝶\n迷你金刚鹦鹉\n鸡泡\n北美知更鸟\n长颈鹿锯锹\n彩臂金龟\n臆羚\n台湾温剑水蚤\n潜水艇鱼\n新西兰大蜥蜴\n黄鹀\n珍珠大帆吊鱼\n喜蛛\n有刺胞动物\n棉红蜘蛛\n普通夜鹰\n飞狐\n双角鬼面蛛\n金钱鱼\n花生蚜\n西班牙山羊\n宽尾凤蝶\n红里螺\n白点叉鼻鲀\n巨脉蜻蜓\n红领绿鹦鹉\n雀斜纹天蛾\n黑缘红瓢虫\n白长角羚\n莹斑篮子鱼\n大赤鼯鼠\n山形金鱼\n红铃虫\n鱼鸥\n山椒鸟\n羽蛾\n淮阳驴\n扁卷螺\n蜥脚龙\n南方皇家信天翁\n阎甲\n翠叶红颈凤蝶\n山獐\n北美山雀\n彩色水泡\n正鲣\n非洲水牛\n鼠兔\n冠企鹅\n蝎蛉\n蓝颊鹦嘴鱼\n白狭扇蟌\n红岩鹨\n帝王灯鱼\n白斑羚\n恐龙鱼\n中国绿刺
蛾\n管口鱼\n海蛎子\n铜点花金龟\n弓斑东方鲀\n狐貉\n小螳螂\n泰国虎纹蛙\n长鳍箭口鱼\n白须公\n长嘴苇莺\n野生娃娃鱼\n加州兀鹫\n毛骨鱼\n鼢鼠\n白老鼠\n多宝鱼\n黑鹮\n弓背蚁\n泡桐细毛蝽\n印度巨松鼠\n云南弓鱼\n透顶单脉色蟌\n北美红雀\n兔宠物\n粉灰鸽\n女巫骨螺\n斑节鳉\n鬣蜥\n檗黄粉蝶\n平鲷\n非洲疣猪\n钟螺\n黑蜂蚜蝇\n鹤顶粉蝶\n贝尔普施六丝极乐鸟\n西藏毛腿沙鸡\n孔雀鸡\n霜天蛾\n深海鱼\n棕色野兔\n灶马蟋\n奇虾\n剑齿虎\n红嘴牛椋鸟\n中华龟\n长江胭脂鱼\n花边星齿蛉\n中华田园猫\n佛罗里达咬龟\n蒙古细狗\n红大马哈鱼\n黑葵花鹦鹉\n灰脚柳莺\n玉兔螺\n斑马吊鱼\n鳌花\n灌丛鸦\n赤鲈\n招财龟\n卡尔特猫\n珍珠罗汉\n蓝倒吊鱼\n火星蚂蚁\n蛙蛙鱼\n倒立鱼\n小灰蝶\n印度野牛\n长脚赤蛙\n中华马蜂\n黑俄罗斯梗\n墨珍珠龙睛\n赛布斯奥长耳犬\n野鲮\n樟翠尺蛾\n贴树皮\n台湾竹叶青蛇\n丽纹蛇\n墨西哥泥龟\n黎氏青凤蝶\n隐纹花松鼠\n大蓝蝶\n棕腹仙鹟\n长毛垂耳西班牙犬\n白眉鸫\n赤条蜂\n隐鼠\n湘鲫\n斑潜蝇\n冰蚕\n锯鳐\n浙江双栉蝠蛾\n栗苇鳽\n跳兔\n袋貂\n裂海豚\n西藏牧羊犬\n墨瑞鳕\n柑橘大实蝇\n加州鹰\n尖吻鲟\n憨鲣鸟\n螟黄抱缘姬蜂\n斜纹炮弹\n北非猎犬\n米特雷锥尾鹦鹉\n黄鳍金枪鱼\n东星斑\n万宝螺\n中华倒刺鲃\n长须黄颡鱼\n粗点哑蟋\n白头缅蝰\n眼斑螳\n爬猴\n花翅大蚊\n缨翅\n黄蛱蝶\n鬼艳锹形虫\n胡瓜鱼\n独眼猫\n沼泽猴\n马塔贝勒蚁\n角箱鲀\n蚁后\n高跷鹬\n山羌\n棕榈凤头鹦鹉\n青花龙鱼\n埃及蜜蜂\n阿克巴士犬\n黑守瓜\n金草鱼\n角鸮\n巨锯陶锹甲\n鳗鲶\n电鲇\n赤蛙螺\n素弄蝶\n山叫驴\n角香螺\n朴喙蝶\n橙粉蝶\n红黄金刚鹦鹉\n金盾龟金花虫\n米林肥腹蛛\n肥熊宝螺\n皇家企鹅\n越南刺鳑鲏\n圆形棘腹蛛\n岩原鲤\n鳑鮍儿\n双吻前口蝠鲼\n四盘耳乌贼\n食火鸟\n玉米螟\n平原斑马\n黄腿鸽\n栗鸮\n孔雀苗\n蝴蝶蜥\n泽鹿\n宽咽鱼\n玉米金针虫\n多型蓝凤蝶\n克氏海马鱼\n凤蝶毛虫\n纤粉蝶\n高腰翁戎螺\n金十字灯鱼\n青桐木虱\n雨伞巴丹鹦鹉\n蛾子\n斑凤蝶\n乌苏里拟鲿\n黑林鸽\n南美浣熊\n小渔雕\n罗曼粉壳蛋鸡\n丝光椋鸟\n鲩\n英格兰大猫\n小线角木蠹蛾\n槐尺蠖\n圆斑硬象鼻虫\n午仔鱼\n桑树桑毛虫\n鹊色鹂\n棉红蝽\n蓝月轮鹦鹉\n杨雪毒蛾\n莱茵鹅\n喷火鱼\n长嘴秧鸡\n普京虎\n草履蚧\n无眼希特勒\n粉色海豚\n夏洛伊牧羊犬\n短嘴鹬\n黄褐天幕毛虫\n纹腹叉鼻鲀\n龟壳攀鲈\n细纹狮子鱼\n富春江鲥鱼\n黑白关刀鱼\n大耳果蝠\n红珠凤蝶\n赤鼻棱鳀\n黑尾原鸡\n黄丝雀\n啮虫\n蒙古獒\n大口康吉鳗\n翠蓝斑凤蝶\n薮猫獛\n蒙古红鲌\n灯鱼\n侧斑兵鲶\n红闪电\n黑灰蝶\n绿罗花金龟\n翼龙\n果鸠\n忘忧尾蛱蝶\n黄额盒龟\n锯缘圆龟\n丽齿兽\n大铅笔鱼\n石蛾\n悬铃木方翅网蝽\n蝶尾\n长痣绿蜓\n斯洛伐克库瓦克犬\n眼鱼\n育肥肉牛犊\n扁锹形虫\n广西林蛇\n骆马\n九角龙鱼\n稻纵卷叶螟\n骨唇黄河鱼\n豇豆螟\n东方宽吻海豚\n后肛鱼\n黄泥龟\n袋獾\n碧翠凤蝶\n女皇凤凰螺\n黄纹丽龙虱\n秋白鲑\n剑尾鱼\n德国宾莎犬\n耳鲍\n猿面天蛾\n黑食人鱼\n桶眼鱼\n黄翅鹦哥\n荷兰奶牛\n明窗蛱蝶\n线翎电鳗\n圆蛤\n紫长尾蜂鸟\n大紫艳金花虫\n亚辟猴\n三色蝎\n满洲龟\n时鳞鱼\n蓝枕绿雀\n初雪宝螺\n狭颅田鼠\n刺桐姬小蜂\n粒突箱鲀\n星洲红鱼\n萨克森极乐鸟\n巨皇鸠\n太阳闪蝶\n朱领锡嘴雀\n靴隼雕\n马岗鹅\n大眼松鼠\n山甲珠\n长尾蜂鸟\n步行虫\n多彩仙\n梗类犬\n黑鹞\n细尾獴\n复齿鼯鼠\n虫草蝙蝠蛾\n章公鱼\n竹象鼻虫\n猩红蛇\n香蕉弄蝶\n背点棘赤刀鱼\n英格兰牧羊犬\n短尾信天翁\n蛇鳕\n锯谷盗\n冠麻鸭\n坦鲷\n红嘴黑鹎\n白弄蝶\n酢浆灰蝶\n鹳嘴翠鸟\n姬独角仙\n蜾蠃\n西西里猎犬\n鲬\n拟黑多刺蚂蚁\n白腹珍珠鸡\n苍羚\n羊羔\n蓖麻蚕\n岩鹨\n马尾松毛虫\n樱桃灯鱼\n长耳跳鼠\n短尾蝮蛇\n网丝蛱蝶\n非洲岬水牛\n重点色英短\n畜牧犬\n南疆黄羊\n耳鼠\n鼯猫\n匈牙利维兹拉犬\n大耳沙蜥\n黄鳍刺尾鲷\n松岗象大蚊\n人科动物\n小锥螺\n短棘银鲈\n潜叶蝇\n黄襟蛱蝶\n蚁蛛\n红腹拟鹩哥\n鹳\n钻穴鱼\n臭蜣螂\n角鹰\n南极鱼\n家庭短毛子猫\n象龟\n翻车鱼\n卡尼鄂拉蜜蜂\n金钱豹
\n蚜灰蝶\n柳珊瑚\n日本蜘蛛蟹\n驼\n蝾螺\n褐头鸥\n葱须鳞蛾\n海鲇\n食人蚁\n斑点木蝶\n沼泽野蜓\n美国青蛙\n长鳍真鮰\n乌羊\n豹蠹蛾\n花斑鱼\n河麂\n暗色斑纹海豚\n红背啄木鸟\n花龙虾\n红肚火口鱼\n日本长尾蜓\n黑头绣眼鸟\n金胸丽椋鸟\n小绿叶蝉\n鲑\n剃刀龟\n美人虾\n锯缘摄龟\n青根鱼\n黄獒\n丽金龟\n菊花鹦鹉\n刺盖鱼\n粗皮姬蛙\n鲭\n黑尾叶蝉\n蛋龟\n蓝环章鱼\n广斧螳螂\n纯蓝仙鹟\n海洋爬行动物\n织纹螺\n田鹬\n法老蚁\n炸弹鱼\n黑熊犬\n花冠皱盔犀鸟\n巨猪\n海马鱼\n巨型非洲食人鱼\n伯克氏鹦鹉\n褐鹰\n中华广肩步甲\n红腹滨鹬\n金头仙\n紫沙蛇\n黄脚渔鸮\n新西兰鸠\n白羽鸡\n黑头剑蛇\n针嘴鱼\n大蓝蜻蜓\n黑大蜜蜂\n红鳍东方鲀\n重点色长毛猫\n美神闪蝶\n丝毛狗\n小口脂鲤\n白翅蜡蝉\n翘嘴鱼\n褐鸦雀\n茶杯猫\n长毛对虾\n白粉虱\n松雀鹰\n纯色鹪莺\n虎斑犬\n棕背麻雀\n船嘴鹭\n沼泽鹧鸪\n亚速尔牧牛犬\n牛角鱼\n老豹蛱蝶\n小麂\n赤褐灰蜻\n青狼獒\n扁嘴海雀\n犏牛\n步甲虫\n硬鳞鱼\n辉斑唐加拉雀\n大王具足虫\n黑小蜜蜂\n纯种猫\n银石鲈\n茶尺蠖\n球鼠妇\n稻瘿蚊\n雷州黄牛\n雅灰蝶\n俄罗斯牧羊犬\n鰕虎鱼\n中华奥锹甲\n浓紫彩灰蝶\n闪绿宽腹蜻\n紫石房蛤\n亚东鱼\n喜马拉雅悬崖蜂\n白尾麦鸡\n黑蚁\n歌鹰\n镜子鱼\n新月锦鱼\n红鲫鱼\n美姝凤蝶\n粗深海鳐\n巨型远古昆虫\n黑皂鸽\n墨吉对虾\n大青蝗\n青衣鱼\n美洲鸬鹚\n华西树蟾\n普派恩旅鸫\n鹰爪虾\n牛头伯劳\n二条叶甲\n黑色金龟子\n兰坪乌骨绵羊\n哥伦比亚红脚蜘蛛\n灰燕\n方格星虫\n美洲龙鱼\n海龙类\n稻螟\n雪猪\n蒙丹鸽\n灰胸薮鹛\n柳尖胸沫蝉\n日本石龟\n兰屿壁虎\n蛇目褐蚬蝶\n佛罗里达拟鳄龟\n臭鼬\n桃脸情侣鹦鹉\n黑红阔嘴鸟\n棕顶鹧鸪\n澳大利亚箱形水母\n魔鬼砗磲\n美国刚毛猫\n硬壳虫\n大嘴鸟\n红松石七彩\n小嘴鱼\n蝠鲼\n鸥嘴噪鸥\n普通燕鸥\n大军舰鸟\n蔷薇叶蜂\n黄脚胡蜂\n埋葬虫\n壁蜂\n小眼鱼\n长钻光鱼\n地懒\n黄颡鱼\n毛角豆芫菁\n斑林鸮\n微独角仙\n大木蜂\n江黄颡鱼\n蛇鮈\n五彩神仙鱼\n黄豚\n鸺鹠\n产奶鼠\n油彩粉红脚蜘蛛\n中华斑鱼蛉\n咖啡木蠹蛾\n串串狗\n跳跃蟑螂\n五条鰤\n苹果螺\n扁船蛸\n小翠鸟\n黄尖襟粉蝶\n伯格马斯科犬\n虱子\n雄蜂\n镏金金鱼\n金头蜈蚣\n短吻鳄龟\n棕鼯鼠\n山蜂\n灰头麻雀\n柑橘红蜘蛛\n阴阳燕子灯鱼\n细嘴鸥\n湘西黄牛\n海鬼鱼\n洋辣子\n噪犀鸟\n圃鹀\n栗背皇鸠\n丽眦蜥\n大目鲭\n大正三色锦鲤\n澳洲咸水鳄\n斑尾塍鹬\n南方牛\n中华地鳖\n斗篷蜥\n斑点鹰\n白化球蟒\n新月带蛱蝶\n中国野兔\n中国犀牛\n爪哇野牛\n汉氏金刚鹦鹉\n金边龙虱\n红尾炮弹鱼\n东部刺鳖\n珍珠鱼\n豆天蛾\n台湾小鼯鼠\n黄眼鸽\n钩巨蟹蛛\n阿波罗绢蝶\n斑马驴\n酷拉龙\n喜马拉雅旱獭\n东西伯利亚莱卡犬\n深蓝吸蜜鹦鹉\n喜玛拉雅白头蛇\n刺河豚\n大理裂腹鱼\n泰坦炮弹\n鮟鱇鱼\n青川犬\n颇赛克猎犬\n拟大腹园蛛\n中华大扁锹\n蓝子鱼\n三角蜻蜓\n鲥鱼\n线尾燕\n茶杯猪\n骏马\n鼎突多刺蚁\n金色曼蛙\n黄蝶\n大西洋鳕鱼\n雏鸡\n蛇皮\n棘穗软珊瑚\n思茅松毛虫\n柑橘类大实蝇\n灰鸭\n虹鳟\n岩羚羊\n青蜂\n死亡蝮蛇\n锈额斑翅鹛\n赤旋螺\n黑头丝雀\n七彩神仙鱼\n翁戎螺\n黑眉拟啄木鸟\n红獒\n玉斑凤蝶\n特蓝斑马\n翻飞鸽\n枣尺蠖\n马达加斯加长尾灵猫\n白颊雀鹰\n花园鳗\n白鳞鱼\n鼠蛇\n褐耳鹰\n龙凤锦鲤\n鹦哥鲨\n小雨蛙\n喜鹊蛋\n大口黑鲈\n北美旅鸽\n黑帽悬猴\n金针虫\n毛里求斯粉鸽\n金多犬\n东部玫瑰鹦鹉\n萨塞克斯猎犬\n碧凤蝶\n蓝舌石龙子\n哲罗鲑鱼\n睇暮眼蝶\n小鹿瞪羚\n小斑姬鹟\n送奶鸟\n小雅鹃\n裂唇鱼\n油桐尺蠖\n东风螺\n黄金吊\n鳞烟管鱼\n红冠雀\n舟蛾\n田野小猎犬\n黑草鹩莺\n紫红笛鲷\n麋羚\n竹长蠹\n海底珊瑚\n黑头山雀\n白顶鸽\n水鹨\n巽他山椒鸟\n非洲草原野兔\n刷尾负鼠\n罗氏琴尾鱼\n左旋香螺\n驼背大麻哈鱼\n白面鼯鼠\n客纹凤蝶\n铁爪鹀\n卡巴金马\n若虫\n鲑鳟鱼\n安格鲁貂\n貂猫\n脆肉皖\n塔形马蹄螺\n草原扑翅鴷\n阿坝蜜蜂\n怪眼白色猫\n根粉蚧\n黄金猛鱼\n鲹\n圃拟鹂\n伪虎鲸\n白翅鳍脚企鹅\n乳牛\n极北蝰蛇\n木蜂天蛾\n侏儒猫头鹰\n下司犬\n白色蟹
蛛\n太平洋鳗鱼\n紫吊鱼\n马达加斯加日守宫\n蝙蝠蛾\n黑带食蚜蝇\n蚜小蜂\n日落蛾\n榆毒蛾\n黄猩猩\n非洲猎鹰\n玉米蓟马\n沟金针虫\n蓝眼白猫\n红弄蝶\n山原猫\n夏威夷绿雀\n海南睑虎\n南江黄羊\n非动物\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"resnet50_vd_animals\",\n    type=\"CV/image_classification\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\"ResNet50vd is a image classfication model, this module is trained with Baidu's self-built animals dataset.\",\n    version=\"1.0.1\")\nclass ResNet50vdAnimals:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _get_device_id(self, places):\n        try:\n            places = os.environ[places]\n            id = int(places)\n        except:\n            id = -1\n        return id\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n\n        # create default cpu predictor\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + 
'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        # create predictors using various types of devices\n\n        # npu\n        npu_id = self._get_device_id(\"FLAGS_selected_npus\")\n        if npu_id != -1:\n            # use npu\n            npu_config = Config(model, params)\n            npu_config.disable_glog_info()\n            npu_config.enable_npu(device_id=npu_id)\n            self.npu_predictor = create_predictor(npu_config)\n\n        # gpu\n        gpu_id = self._get_device_id(\"CUDA_VISIBLE_DEVICES\")\n        if gpu_id != -1:\n            # use gpu\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=gpu_id)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n        # xpu\n        xpu_id = self._get_device_id(\"XPU_VISIBLE_DEVICES\")\n        if xpu_id != -1:\n            # use xpu\n            xpu_config = Config(model, params)\n            xpu_config.disable_glog_info()\n            xpu_config.enable_xpu(100)\n            self.xpu_predictor = create_predictor(xpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1, use_device=None):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): image data; each image has shape [H, W, C] and must be in BGR color space.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n            use_device (str): use cpu, gpu, xpu or npu, overwrites use_gpu flag.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n\n        # real predictor to use\n        if 
use_device is not None:\n            if use_device == \"cpu\":\n                predictor = self.cpu_predictor\n            elif use_device == \"xpu\":\n                predictor = self.xpu_predictor\n            elif use_device == \"npu\":\n                predictor = self.npu_predictor\n            elif use_device == \"gpu\":\n                predictor = self.gpu_predictor\n            else:\n                raise Exception(\"Unsupported device: \" + use_device)\n        else:\n            # use_device is not set, therefore follow use_gpu\n            if use_gpu:\n                predictor = self.gpu_predictor\n            else:\n                predictor = self.cpu_predictor\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    # the last batch may contain fewer than batch_size images\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            input_names = predictor.get_input_names()\n            input_tensor = predictor.get_input_handle(input_names[0])\n            input_tensor.reshape(batch_image.shape)\n            input_tensor.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            predictor_output = output_handle.copy_to_cpu()\n            out = postprocess(data_out=predictor_output, label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, 
images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      top_k=args.top_k,\n                                      use_device=args.use_device)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', 
type=ast.literal_eval, default=1, help=\"Return top k results.\")\n        self.arg_config_group.add_argument('--use_device',\n                                           choices=[\"cpu\", \"gpu\", \"xpu\", \"npu\"],\n                                           help=\"Use cpu, gpu, xpu or npu. Overrides the use_gpu flag.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"Path to the input image.\")\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indices = np.argsort(result_i)[::-1][0:top_k]\n        for index in indices:\n            label = label_list[index]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_animals/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"resnet50_vd_animals\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('威尔士柯基' in data)\n        self.assertTrue(data['威尔士柯基'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/README.md",
    "content": "# resnet50_vd_dishes\n\n|模型名称|resnet50_vd_dishes|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet50_vd|\n|数据集|百度自建菜品数据集|\n|是否支持Fine-tuning|否|\n|模型大小|158MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet-vd是ResNet原始结构的变种，可用于图像分类和特征提取。该 PaddleHub Module 采用百度自建菜品数据集训练得到，支持8416种菜品的分类识别。\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/77fa9b7003e4665867855b2b65216519?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-04-08T11%3A05%3A10Z%2F1800%2F%2F1df0ecb4a52adefeae240c9e2189e8032560333e399b3187ef1a76e4ffa5f19f\" width = \"800\"  hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[Bag of Tricks for Image Classification with Convolutional Neural Networks](https://arxiv.org/pdf/1812.01187.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet50_vd_dishes\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet50_vd_dishes --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现菜品分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_dishes\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 
分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的菜品类别，value为置信度。\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个菜品分类的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m resnet50_vd_dishes\n    ```\n\n  - 这样就完成了一个菜品分类的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet50_vd_dishes\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_dishes==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/README_en.md",
    "content": "# resnet50_vd_dishes\n\n|Module Name|resnet50_vd_dishes|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet50_vd|\n|Dataset|Baidu Food Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|158MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. ResNet-vd is a variant of ResNet. This module is based on ResNet-vd and can classify 8416 kinds of food.\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/77fa9b7003e4665867855b2b65216519?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-04-08T11%3A05%3A10Z%2F1800%2F%2F1df0ecb4a52adefeae240c9e2189e8032560333e399b3187ef1a76e4ffa5f19f\" width = \"800\"  hspace='10'/> <br />\n</p>\n\n  - For more information, please refer to：[Bag of Tricks for Image Classification with Convolutional Neural Networks](https://arxiv.org/pdf/1812.01187.pdf)\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet50_vd_dishes\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet50_vd_dishes --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 
2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_dishes\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image paths;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m resnet50_vd_dishes\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return 
base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet50_vd_dishes\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_dishes==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = 
Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/label_list.txt",
    "content": "非菜\n红烧肉\n牛肉饭\n酸菜鱼\n炒金针菇\n土豆泥\n小炒肉\n牛仔骨\n毛血旺\n刀削面\n三明治\n土豆丝\n粉丝煲\n巧克力\n肉夹膜\n花生米\n卤肉饭\n炸酱面\n清江鱼\n鸡肉饭\n水煮鱼\n狮子头\n豆腐汤\n牛肉粒\n猪颈肉\n柠檬汁\n四季豆\n豆角炒肉\n鸡排饭\n流沙包\n辣子鸡\n羊蝎子\n臭豆腐\n豆腐丝\n龙利鱼\n龟苓膏\n老豆腐\n西兰花\n厚多士\n回锅肉\n老鸭汤\n酸梅汤\n海鲜饭\n云吞面\n牛肉汤\n牛腩面\n茶树菇\n海鲜汤\n小炒皇\n米饭套餐\n炒牛肉\n龙眼粉蒸肉\n牛肉锅\n小黄鱼\n绵绵冰\n八爪鱼\n空心菜\n油麦菜\n拌饭\n小牛肉\n萝卜皮\n雪花冰\n大黄鱼\n热干面\n疙瘩汤\n一品酱萝卜\n鸡尾酒\n湄公鱼\n羊腿\n汉堡套餐\n肥肠面\n螺蛳粉\n鸭血酸辣粉\n天妇罗\n炒鸡蛋\n烧豆腐\n窝窝头\n麻辣全家福\n土豆粉\n牛百叶\n一品鸡蛋\n参鸡汤\n多春鱼\n优质鸭血粉丝汤\n拼盘\n鱼豆腐\n萝卜糕\n厥根粉\n三杯鸡\n千叶豆腐\n土鸡蛋\n掌中宝\n一品海蜇头\n鲈鱼\n原味鸡腿肉\n小炒黄牛肉\n炒腊肉\n牛肉粉\n章鱼烧\n酸汤鱼\n午餐肉\n盖浇饭\n鱿鱼圈\n鸡中翅\n黑糯米\n咖喱牛腩\n麻椒牛肉\n玉米汁\n手撕鸡\n一品锅\n羊腿肉\n酸辣汤\n鸡肉卷\n鸡脆骨\n一锅鲜\n咕咾肉\n仔姜炒小公鸡\n石锅鱼\n米豆腐\n羊上脑\n菠萝饭\n豆腐\n香豆腐\n油泼面\n脆皮鸡\n菠萝油\n炒河粉\n麻辣香锅\n拌苦菊\n水晶粉\n牛肉饼\n秋刀鱼\n羊肉片\n肉焗饭\n鸡毛菜\n秘制豆腐\n黄金大鱿鱼\n干豆腐\n武昌鱼\n炒牛河\n炸豆腐\n烧排骨\n烧牛腩\n鸡米花\n咖喱蟹\n忌廉汤\n手抓饭\n排骨煲\n木桶饭\n金汤酸汤肥牛\n仙草牛奶冰\n肥牛饭\n葱油饼\n蒸扇贝\n海藻菜\n野山菌\n金錢肚\n鱼丸汤\n松板肉\n桂花鱼\n石斑鱼\n焖锅\n仔姜耗儿鱼\n腐皮卷\n萝卜丝\n铜锣烧\n鹌鹑蛋\n三黄鸡\n烤三文鱼\n南瓜饼\n小鲍鱼\n布朗尼\n斑鱼\n西芹炒木耳\n煮干丝\n原味牛肋骨\n罗非鱼\n肉丝面\n肉盖饭\n菌汤锅\n石锅豆腐\n雪梨汁\n海鲜焗饭\n鸡蛋仔\n乌龙奶盖茶\n小肥羊\n杂粮包\n梅花肉\n炒乌冬\n爆炒鱿鱼\n烤鳗鱼\n焖豆腐\n焗蜗牛\n仔姜牛肉丝\n牛腩煲\n牛腩饭\n猪排饭\n酿豆腐\n金桔茶\n钵钵鸡\n铁板饭\n锅中锅\n龙骨汤\n乌龙茶\n佛跳墙\n小吃拼盘\n奥利奥\n小河虾\n水煮巴沙鱼\n沙拉\n椰汁糕\n炒山药\n蒸鲈鱼\n豆腐花\n青口贝\n蔬菜豆腐\n小馒头\n拿破仑\n笋片王\n梭边鱼\n麻鸡\n汽锅鸡\n滋补锅\n烩豆腐\n牛尾汤\n养生素什锦\n紫米粥\n啤酒老鸭\n牛羊肉组合\n烧肉豆腐\n菌菇汤\n西多士\n通心粉\n里脊肉\n面疙瘩\n主食拼盘\n鲶鱼\n鸡腿饭\n三味锅\n丸子拼盘\n乌鸡\n冬瓜汤\n大拉皮\n手撕包菜\n明太鱼\n烧汁豆腐\n油条虾\n洋葱圈\n土豆焖牛腩\n杂菌牛柳粒\n神仙鸡\n茶香虾\n皮蛋豆腐\n西冷扒\n香葉西米糕\n西红柿\n铁板烧\n锅包肉\n烤面包片\n十三香龙虾\n剁椒鱼头\n鲜嫩豆腐\n原味椰子鸡火锅\n功夫鱼\n千层面\n叉烧酥\n墨鱼仔\n墨鱼滑\n招牌沸腾鱼\n炒杂菜\n牛腩\n烤脑花\n烧卖\n烧海参\n猪肚鸡\n珍宝蟹\n优质羊肉汤\n优质牛羊肉泡馍\n葡萄酒\n蒸鲥鱼\n虫草花\n酥皮汤\n肉酱焗饭\n冰镇鲍鱼\n鳗鱼卷\n龙鱼锅\n草鱼\n卡布奇诺\n时令蔬\n日本豆腐\n江团鱼\n炒猪肝\n压锅土豆\n红烧肉饭\n牛肋排\n牛肋条\n猪黄喉\n幼羊肋卷\n脆皮肠\n花果茶\n什锦蘑菇\n梅干菜扣肉\n泡菜火锅\n豆腐脑\n豌杂面\n青花鱼\n青菜钵\n千页豆腐\n鱿鱼虾\n鲜鱿鱼\n东坡肘子\n八宝饭\n冻豆腐\n龙利鱼柳\n小圆子\n手抓肉\n手抓饼\n水豆腐\n原味文昌椰子鸡\n气泡水\n卤水拼盘\n肥牛\n长豆角\n烤羊肉\n爆米花\n牛蛙煲\n牛蛙锅\n盐水鸭\n童子鸡\n浓汤\n桂花糯米莲藕\n庆红薯粉\n麻辣羊肚丝\n肉松卷\n芋儿鸡\n三文鱼芝士卷\n麻辣烫\n豆腐干\n酱骨头\n铁板鸡\n银耳羹\n银鱼羹\n火锅套餐\n石锅牛肉\n雪媚娘\n手卷\n海鲜拼盘\n丸子汤\n乌东面\n虾仁豆腐\n卷心菜\n吊烧鸡\n香芋地瓜球\n坛子肉\n玉子豆腐\n煎帆立贝\n鸡蛋炒拉条子\n拌莜面\n拌菠菜\n三文鱼头\n越南水晶虾\n年糕料理\n海鲜羹\n火锅面\n炒通菜\n牛上脑\n牛拼盘\n脆皮豆腐\n石榴汁\n笋烧肉\n组合\n糯米糍\n糯米鸡\n羊棒骨\
n拌面\n肥牛煲\n肥肠煲\n臭桂鱼\n芥兰苗\n苏打水\n荔枝肉\n鲜菌拼盘\n蒸鱼头\n蔬菜粥\n大虾沙拉\n原味蛋黄酥\n招牌白切走地鸡\n干锅肥肠\n冬阴功汤\n雪花鱼\n香酥鸡\n马哈鱼\n鱼套餐\n鱿鱼丝\n鸡蛋汤\n黄辣丁\n丝瓜尖\n乌鸡卷\n九宫格\n南瓜羹\n双拼\n烤肥牛\n土鸡煲\n地锅鸡\n培根面\n大花卷\n大阪烧\n大骨汤\n小海鲜\n小炒鸡\n干锅鸡\n拆骨肉\n拌时蔬\n鲜柠檬水\n毛豆腐\n水晶鸡\n韩风泡菜锅\n海鲜煲\n火腿肠\n炒海鲜\n炒香干\n炒鲜鱿\n炖萝卜\n水煮牛肉\n爆三样\n炒牛眼肉\n牛肉滑\n牛肉煲\n牛肝菌\n猪肚汤\n琵琶鸭\n印式番茄汤\n鸡蛋紫菜卷\n老火锅\n鸡肉咖喱\n鳗鱼肉酱饭\n肥牛面\n腊味饭\n鳜鱼\n炒芒果螺\n芝士条\n草原肚\n荷包蛋\n原只鲍鱼荷叶饭\n萝卜汤\n原笼蒸甲鱼\n蛋花汤\n西洋菜\n豆腐鱼\n豌豆黄\n香锅羊肉\n面包汤\n巴沙鱼、清江鱼、草鱼\n鱼头皇\n鲜鹅肠\n焖鲟龙鱼\n鸡肉套餐\n鸡肉锅\n麦旋风\n黄骨鱼\n黑珍珠\n万年青\n水东芥菜\n双拼饭\n咖喱锅\n四角豆\n土猪肉\n地三鲜\n大杂烩\n大烩菜\n夫妻肺片\n香糯小米糕\n香酥小鱼干\n红烧手抓骨\n三文鱼腩\n松鼠鱼\n冰茶\n松饼\n核桃露\n桂花糕\n棒棒鸡\n萝卜\n泡菜汤\n瓜子\n冰淇淋火锅\n炒米条\n炒肉片\n酸辣炒鸡杂\n炖木瓜\n炸鲜奶\n烤牛蛙\n烧土豆\n烧肥肠\n烩三鲜\n焖牛肉\n鹅肝\n牛腱肉\n原味瑞士卷\n野生甲鱼\n番茄鱼\n奶盖乌龙\n目鱼仔\n蟹籽手卷\n香素鲍鱼\n红米肠\n老南瓜\n老虎菜\n肉卷饼\n铁板凯撒肉眼扒\n肉蟹煲\n肉锅仔\n香肠拼盘\n螺脆鱼锅\n艇仔粥\n花椒鸡\n茄子饭\n荷兰豆\n菊花茶\n野生菌王汤\n泡菜双拼\n葫芦鸡\n瑶柱蒸丝瓜\n蘑菇包\n西柚汁\n豆沙包\n豆腐卷\n豆花鱼\n豌豆尖\n仔姜跳水蛙\n野生菌\n原味部队火锅\n莜面鱼鱼\n印象巴黎月饼\n香猪肉\n香芋卷\n马蹄糕\n三文鱼拼盘\n焗饭\n虾鱼片粥\n芝士蛋卷\n乌江鱼\n乌鱼片\n养生茶\n冰桔茶\n面包的诱惑\n卤豆腐\n口口脆\n芝士夏威夷\n原汤大骨锅\n火锅\n小羔羊\n开胃菜\n手撕饼\n拌蛰头\n三文鱼卷\n三文鱼骨\n陈年普洱茶\n烧汁鲈鱼\n泡菜饼\n原味海中虾\n白斩鸡\n海带面\n海蛎煎\n涮羊肉\n炒大虾\n炒牛肚\n爆炒肥肠\n炒芦笋\n炒韭菜\n炖乌鸡\n炖土豆\n烧土鸡\n烧鲈鱼\n烧鹅饭\n焗豆腐\n水煮肉片\n爆羊肉\n牛筋面\n牛肉羹\n黑牛腹肉\n牛里脊\n猪肉滑\n珍菌猪肚煲\n猪脆骨\n玉米羹\n瓦块鱼\n韭香白米虾\n海皇豆腐\n石烧饭\n砂锅粥\n碟鱼头\n稻香蛙\n筒骨锅\n米纸卷\n蟹籽沙拉\n糯米糕\n绿茶饼\n绿豆糕\n牛羊组合\n羊肉煲\n老腊肉\n牛肉河粉\n辣椒肉炒肉\n肉燥饭\n肉皮冻\n牛肉脆饼\n肉骨茶\n四色拼盘\n莴笋丝\n肥牛粒\n萝卜干\n蒸肉饼\n蔬菜饼\n面包蛤蜊汤\n蝴蝶虾\n豌豆苗\n酱豆腐\n酸菜锅\n糖醋里脊\n野猪肉\n金边粉\n阳春面\n面类套餐\n香辣鸡\n鱿鱼筒\n鲜花饼\n黄蚬子\n黄金糕\n一品香锅\n三文治\n东山羊\n春卷\n八宝菜\n冬瓜茶\n凤尾鱼\n卤拼盘\n叉烧肠\n和风豆腐\n叫花鸡\n原味松饼\n咖喱鱼\n豆品拼盘\n精品羔羊\n啤酒鸭\n咖喱蔬菜\n土鸡锅\n圣女果\n外婆菜\n奶油汤\n菌汤子母锅\n签子牛肉\n寿喜锅\n小豆腐\n干贝粥\n各式牛肉\n扇子骨\n手撕菜\n拌三丝\n马兰头拌香干\n排骨虾\n擂茄子\n招牌无骨鱼\n有机鱼\n柠檬鸡\n棉花糖\n水爆肚\n蜜汁大虾\n长江白鱼\n泡饭\n冰淇淋松饼\n清补凉\n老火靓汤\n炒培根\n茶树菇炒爽肉\n炒肥牛\n炖老鸡\n炝锅鱼\n虾卷\n西点拼盘\n印度烤鸡肉\n蹄筋\n烧鳗鱼\n烧黄鱼\n土豆焖茄子\n爬爬虾\n麻辣牛杂煲\n牛筋腩\n牛腩筋\n牛蹄筋\n猪梅肉\n玉米片\n百香果\n盐焗鸡\n碗仔翅\n云南竹筒饭\n原味牛板筋\n糯米甜甜\n粒粒香\n糯米骨\n桂圆红枣茶\n红菜汤\n羊肉锅\n美极虾\n烩面\n骨肉相连\n胡椒虾\n腰花面\n腱子肉\n膨膨冰\n臭鲑鱼\n芒果卷\n番茄炖牛腩\n莲子羹\n白菜炒肉\n煎饼\n芹菜猪肉\n菠菜面\n萝卜片\n蒸三鲜\n蔬菜卷\n蔬菜面\n虾仁饭\n蛋蒸肉\n象拔蚌\n过水鱼\n原味酸奶酪\n酸辣锅\n醉鱼干\n银耳汤\n锅巴饭\n压锅牛腩\n石锅肥牛\n雪梨汤\n渝都风味鸡珍\n香鲈鱼\n马鲛鱼\n排骨套餐\n高丽菜\n魔芋丝\n鱼头汤\n鱿鱼拌饭\n鱼面筋\n鱿鱼嘴\n鲜肥牛\n
鲜虾肠\n鲜黄喉\n鳕鱼堡\n鸡汤锅\n印度黑椒鸡腿\n鸭脯\n鸭锁骨\n麻辣鸡\n黄瓜卷\n一品鲜\n三角烧\n三重奏\n上上签\n上海青\n下饭菜\n东星斑\n乌梅汁\n什锦菜\n什锦锅\n招牌烧仙草\n仙草冻\n宫保鸡丁\n冷吃兔\n剑骨鱼\n巧克力松饼\n冬荫功虾汤\n自助火锅\n千层饼\n南瓜煲\n叉烧面\n牛蛙\n和乐蟹\n咖喱龙虾\n四喜锅\n脊骨土豆汤\n芝士火锅\n黄金大排面\n大片肉\n大连鲍\n奇异果\n多宝拼盘\n富贵虾\n小拌菜\n遇上小辣椒\n干豆角\n榴莲酥\n炒年糕\n拌折耳根\n拌茼蒿\n排骨粥\n排骨面\n杀猪菜\n松子鱼\n鲍鱼松茸汤\n五仁核桃包\n铁棍山药\n椒鱼片\n椰浆饭\n印度金芒果气泡\n水波蛋\n原汁牛肉\n鱼片\n沙丁鱼\n黄河鲤鱼\n沸腾饭\n葱油拌面\n泉水鸡\n辽参\n海鲜汇\n涮肚\n淮王鱼\n深海鱼\n清凉锅\n滑类组合\n滑蛋饭\n炒双脆\n醋炒土鸡\n芥兰炒百合\n野生菌炒脆骨\n炒膏蟹\n炖水蛋\n原生态炖羊肉\n霸王花炖龙骨\n烤猪肘\n板栗烧鸡\n烧老鹅\n烧豆角\n烧鸭饭\n烫干丝\n青椒焖甲鱼\n焖羊肉\n蛋黄焗南瓜\n焗薯蓉\n原汁焗蘑菇\n煲龙骨\n仙草\n牛仔肉\n牛板腱\n牛柳面\n黑椒牛肉条\n牛肉粥\n牛肋肉\n牛腩粉\n猪尾煲\n猪排肉\n猪肉饭\n猫耳朵\n瓦罐汤\n生拼盘\n田鸡王\n白水鱼\n白灵菇\n白玉菇\n百灵菇\n脆皮火烧\n石板饭\n秘制肉\n蜂窝豆腐\n竹丝鸡\n笋腊肉\n手工粗粮馒头\n紫甘蓝\n香芋紫米露\n紫菜汤\n红油锅\n芋圆红豆汤\n老母鸡\n肉双拼\n肉片汤\n肉沙拉\n香芋扣肉\n芒果捞\n芝士派\n花雕鸡\n招牌茶香肉\n菊花鱼\n菌菇煲\n菜排骨\n肉丝\n萝卜丸\n蒸桂鱼\n蒸肉排\n蔬菜拼\n蘑菇面\n虾手卷\n虾火锅\n蜜柚茶\n欧式土豆浓汤\n土豆炖牛腩\n鱼贴饼子\n跑山鸡\n麻辣火锅\n辣鲶鱼\n多士配雪糕\n海鲜酱油水\n酱肉丝\n酱肘花\n锅\n干锅包菜\n雪花牛\n雪花肉\n青斑鱼\n青椒鸡\n面筋煲\n飘香鸡\n牛腱\n鱼香肉丝\n香芋球\n飘香鲫鱼\n马兰头\n麻辣鮰鱼锅\n鱼军舰\n反卷\n龙利鱼捞饭\n鱼翅羹\n鱿鱼卷\n鱿鱼头\n海鲜套餐\n鲜椒鱼\n海鲜泡饭\n鲷鱼烧\n鳕鱼羹\n鸡腿排\n鸡蛋饼\n鸭嘴鱼\n鸭架汤\n黄金蟹\n黄鱼鲞\n黄鸭叫\n黑百叶\n龙虾片\n龙须菜\n水果披萨\n一把骨\n丁香鱼\n三剑客\n三线肉\n三色面\n鸡肉丸子饭\n河粉\n全品烧\n冷面鸡\n冰激凌火锅\n凤梨汁\n毛肚\n秘制排骨\n纸包豆腐\n千层酥\n千张包\n半筋半肉面\n南瓜丝\n原味鸡\n叉叉冰\n辣口味虾\n口水鱼\n爽口牛肉\n可丽饼\n组合焖锅\n吾桑格\n三拼\n花生\n咕噜鱼\n哈密瓜\n酒酿圆子羹\n圆舞曲\n圆子豆花\n烤土豆皮\n地瓜条\n北塘小炒\n芝士排骨\n原味大头鱼\n大碗鱼\n大红袍\n大肠面\n大虾锅\n大骨头\n大鳊鱼\n优质大麦茶\n筋头巴脑\n奶豆腐\n蟹子手卷\n鸡饭\n寿喜烧\n小猪包\n小甲鱼\n小番茄\n笋丝小菠菜\n小蘑菇\n小鱿鱼仔\n山药卷\n山药粥\n山野菜\n手工饼干\n韩式泡菜\n酸奶慕斯杯\n扁豆丝\n手剥笋\n扒皮鱼\n拌云耳\n拌干丝\n拌猪耳\n拌肚丝\n拌野菜\n凉拌鲫鱼\n拍青瓜\n招牌卷\n招牌面\n双拼套餐\n猪排定食\n鸡排面\n明炉鱼\n月牙骨\n羊杂拼盘\n杏仁茶\n板栗鸡\n果苏打\n香椿核桃仁\n梅子酒\n梅花糕\n梳乎厘\n醋椒豆腐\n特色椒麻锅\n榴莲蛋\n比目鱼\n水晶饺\n水滑肉\n三汁焖锅\n莜面烤姥姥\n金汤鲈鱼\n马家沟芹菜\n油浸鱼\n油焖笋\n流星肉\n海底捞\n海皇羹\n海鲜烩\n浸肥牛\n涮毛肚\n溜鱼片\n火锅粉\n火龙果\n炒包菜\n炒合菜\n小炒羊肉\n炒腊肠\n炒苦瓜\n炒蔬菜\n响油鳝丝\n炒酸菜\n炒面线\n炖肉汁\n炖豆角\n炸大虾\n招牌潮汕炸果肉\n炸蘑菇\n炸里脊\n烤子鱼\n烤肋排\n烤青鱼\n韭菜\n烧春鸡\n烧鹿筋\n焖鲶鱼\n焗土豆\n意式芝士焗番薯\n招牌芝士焗薯泥\n焗蟹宝\n燕麦包\n爆鳝面\n爽口菜\n牛杂锅\n牛肉肠\n牛黄喉\n拌猪耳朵\n猪肉面\n猪肉饼\n猪软骨\n猪骨汤\n玉子烧\n玫瑰茶\n珍珠斑\n番茄面\n炒时蔬菜\n竹荪汤\n西米布甸\n粟米羹\n蜜糖松饼\n糖芋苗\n糯米卷\n绿豆粥\n绿豆芽\n牛羊双拼\n羊肉面\n鲍参翅捞饭\n老坛子\n清蒸深海老虎\n聚宝盆\n肉炒粉\n肉类砂锅\n牛肉锅巴\n肉锅贴\n肥肠鱼\n肥肠鸡\n脆萝卜\n芒之恋\n芒果茶\n芝士挞\n芝士球\n芝士虾\n茉莉花炒蛋\n五花肉锅\n
西芹百合\n茄汁面\n茶碗蒸\n山药炒木耳\n蓝莓果醋\n榴莲千层\n菌菇豆腐\n菜包肉\n菜泡饭\n菜炒蛋\n菠菜花生\n萝卜丁\n萝卜苗\n蒙古肉\n蒸南瓜\n蒸笼牛肉\n蒸白鱼\n豆豉蒸秋葵\n蒸钳鱼\n蒸鲍鱼\n蒸鲜鱿\n蓝莓汁\n藏龙鱼\n虾焖锅\n窝蛋肥牛\n蟹肉棒\n赤豆元宵\n豆烧肉\n豉油鸡\n跳水鱼\n跳跳虾\n原味蹄花汤\n麻辣排骨\n梭边鱼锅\n酒香肉\n香酥鲫鱼\n招牌酱大骨\n酸辣鱼\n金丝虾\n招牌金汤锅\n金沙包\n铁观音\n铁锅面\n砂锅山药\n雪布蕾\n雪梨茶\n青椒鱼\n拌青笋丝\n青豆泥\n面包条\n汤莜面窝窝\n韭菜盒\n印巷香水鱼\n肥肠\n马卡龙\n马面鱼\n驴打滚\n鱼头锅\n鱼子蛋\n鱼翅汤\n鱼肚羹\n鱼茸粥\n乌鱼蛋汤\n饭团\n鱿鱼饭\n鲍鱼菇\n鲜汤锅\n鲜竹卷\n鲶鱼鸡翅鲜虾锅\n锅巴菜\n鸡丁饭\n鸡肉拌面\n鸡捞面\n鸡焖锅\n鸭下巴\n竹笋鸭汤锅\n鸳鸯火锅\n鹅肝酱\n芝麻牛肉\n黄瓜汁\n黄金虾\n原味黑鱼煲\n炸酥黄金龙俐鱼\nBB鸭\n燒三文魚\n三杯鸭\n三鲜面\n阿公下酒菜\n鸡丝拌面\n乌鱼锅\n乌鸡粥\n九折板\n亲子饭\n八宝冰\n六合鱼\n经典便餐\n养乐多\n农家鸡\n冰榴莲\n冰橘茶\n冷豆腐\n黑猪排骨凉瓜汤\n红烧刁子鱼\n龙利鱼锅\n巧克力喷泉\n巧克力拉瓦\n功夫鸡\n乌龙杏仁加红豆\n南瓜盅\n招牌海南鸡饭\n卤肉卷\n卤花生\n可乐饼\n吉事果\n云吞捞面\n味噌汤\n土豆\n风味鮰鱼\n奇味鸡煲\n黑鱼\n芝士肥牛\n咸煎饼\n精品上脑\n极品大虾\n眼肉\n哨子面\n啤酒鱼\n仙草喜圆烧\n四喜拼盘\n咖喱海鲜\n咖喱猪肉\n四味锅\n土匪鸭\n土猪汤\n海鲜土瓶蒸\n土豆锅\n土鸡脚\n烤地瓜片\n地锅鱼\n金牌坛香肉\n海鲜墨鱼面\n大板烧\n大白刁\n白虾\n大碗饭\n大虾煲\n大馒头\n大骨棒\n鱼头泡饼\n沙冰\n奶酪棒\n姜母鸭\n嫩鱼片\n军舰\n蟹子沙律\n安东鸡\n密瓜汁\n富贵鱼\n小皇堡\n山楂汁\n山菌汤\n井冈山豆皮\n卡布基诺\n干渣肉\n干烧鱼\n干贝汤\n例汤\n日式火锅\n日式烧肉\n熏鱼\n当家肉\n心里美\n维怡豆奶\n海鲜\n情人果\n扇贝肉\n担仔面\n拌海草\n拌螺片\n拌豆皮\n拌香菜\n叉烧烧拼排骨\n捞牛肉\n排骨串\n推推乐\n摊鸡蛋\n手撕豆腐\n老虎斑鱼锅\n斗碗鱼\n尼斯沙拉\n安格斯牛肉\n方便面\n昂刺鱼\n重庆鸡公煲\n木瓜丝\n杏仁烧\n马来咖喱\n铁板鲈鱼\n榆林豆腐\n水果之恋\n水果乳酪\n水果冰粉\n果盛宴\n腰果鸡丁\n柠檬鸭\n九宫格火锅\n梦幻雪\n棒棒虾\n椒香鸡\n原味椰子冻\n椰子糕\n柠檬苏打\n水晶包\n蜜汁山药\n汁锅巴\n酸汤挂面\n汤火锅\n酸汤牛蛙\n沙拉虾\n沙棘汁\n沙茶面\n酱油拉面\n油淋鸡\n油酥饼\n油鸡饭\n泉水蛙\n金牌泉水鱼\n醋泡花生\n包浆豆腐\n海参斑\n海胆丸\n涮涮锅\n冰淇淋球\n混合锅\n太湖白鱼\n溜肉段\n腿卷\n炒三鲜\n炒冬笋\n优质小炒泡馍\n炒莴笋\n炒菠菜\n香炒藕片\n炒青菜\n招牌咖喱炒青蟹\n爆炒鱼片\n炒鲍鱼\n白辣椒炒鸡胗\n炖瘦肉\n鸡肉炖蘑菇\n炖鹧鸪\n炝锅面\n炸花生\n炸蛋散\n炸香蕉\n烤乳猪\n烤五花\n烤全鸡\n烤南瓜\n烤梅肉\n烤蘑菇\n烤蛏子\n烤香菇\n烤鸭饭\n黔鱼\n烧杂鱼\n酱烧鱼头\n红烧鲫鱼\n照烧鸡肉\n焖猪肚\n焖老鹅\n焗口蘑\n焗豆泥\n焗金瓜\n芝士焗鳕鱼\n孜然羊肉\n煎鸡蛋\n煨牛腩\n煮三鲜\n自煮海鲜\n煮豆腐\n熊手包\n火爆猪肝\n爆脆肠\n肥牛套餐\n牛火锅\n牛角面包\n大牡丹虾\n猪脚饭\n猪颈扒\n猴头菇\n山珍豆腐\n海蜇\n松茸甩袖汤\n甲鱼汤\n炒番薯叶\n白萝卜\n百叶包\n百叶卷\n乳鸽\n冰皮月饼\n盆菜\n仙草奶盖\n碎碎鸡\n章鱼饼\n竹笙汤\n竹节虾\n笋牛腩\n黄米凉糕\n米粉肉\n糊涂面\n糖火烧\n糯米鸭\n糯香骨\n紫薯饼\n红豆卷\n鲜榨红豆汁\n米线套餐\n绿豆汁\n绿豆汤\n香辣羊排锅\n鲍汁翅肚羹\n肉松饼\n肉板面\n鸡肉烩饭\n招牌肚包鸡\n猪肚煲鸡\n鹅肝冻糕\n高钙肥羊肉\n烧腊拼盘\n香腊牛肉\n鸡腿肉饭\n虾膏豆腐\n芋圆冰\n香芋招牌\n芙蓉虾\n芙蓉蛋\n芝士饼\n招牌芥菜饭\n花淇淋\n花甲粉\n花蚬子\n芹菜炒香干\n苦瓜羹\n茶壶汤\n抹茶红豆\n柠檬C芦荟优多\n排骨莲藕汤\n莴笋片\n金针菇肚丝\n鸡肉\n菌皇汤\n菌菇面\n青菜大拼\n酸菜鲈鱼\n萝卜酥\n萝卜鱼\n葡萄汁\n韩式葱煎饼\n蒜香味鱼\n蒜香肉\n蒜香鸡\n蒸膏蟹\n蒸蛏子\n鱿鱼\
n樱桃蔓越莓\n蔬菜锅\n薏米汤\n蓝莓山药\n藤椒鱼\n虾仁面\n龙虾伊面\n蛋糕卷\n蛋黄卷\n蜂窝煤\n蝴蝶卷\n凉衣白肉\n西京烧\n西葫芦\n豆乳锅\n豆腐饭\n红豆豆花\n豉油虾\n意式奶油培根贝壳面\n贵妃鸡\n走油肉\n软壳蟹\n轻乳酪\n麻辣汤锅\n麻辣海带\n辣章鱼\n辣鱿鱼\n辣鸡肉\n精选牛肉\n通心菜\n香酥排骨\n酱排骨\n酱棒骨\n酱菠菜\n酱香肉\n酱香锅\n酱驴肉\n酸菜面\n酸酸鸡\n酸黄瓜\n醉蟹钳\n野菜饼\n铁板野蘑菇\n金瓜丝\n黄金豆腐\n银丝饼\n香锅土豆\n香锅排骨\n锅烧面\n锅草鱼\n锅驴肉\n石锅鲫鱼\n什锦砂锅\n长寿菜\n长寿面\n霸王鸡\n青菜汤\n风味肉\n风味鱼\n和风牛肉\n套餐\n馋嘴鱼\n百香果汁\n拌香椿苗\n香牛腩\n肚丝\n香茜饺\n蚕豆\n香酥卷\n鸡翼\n起司马铃薯\n鱼子酱\n鲍鱼炖蛋\n生饭\n鱼米羹\n鱼籽饭\n鲍鱼汤\n鲍鱼盅\n鱼头煲\n鲜果茶\n海鲜砂锅\n海鲜组合\n鲜腐竹\n鲜虾堡\n鲜锅仔\n鲜鱼锅\n鲮鱼滑\n鸡丝面\n鸡块面\n鸡煲翅\n鸡翅煲\n玉米鸡肉滑\n意式罗勒鸡肉面\n鸡腿锅\n鸦片鱼\n麻辣虾\n麻辣面\n麻麻鱼\n黄金卷\n黄金饼\n黑啤酒\n黑米糕\n黑米饭\n黑豚肉\nHi辣鱼\n西班牙Tapas\nQ麻糬\n馕炒肉\n一碗香\n三下锅\n三丝羹\n三合一\n三味鱼\n三杯雞\n三角峰\n三鲜煲\n关东木耳\n蒜香银丝元贝\n粉丝包菜\n酸汤丝娃娃\n肉丝带底\n两面黄\n焖中华鲟\n肉丸火锅\n原味丹麦棒\n乌头鱼\n九肚鱼\n家乡豆腐\n草花乳鸽汤\n亲亲肠\n菜煲\n虾仁炒蛋\n桃仁苦菊\n果仁菠菜\n锅仔土豆\n仔排煲\n芝士蛋糕\n巧克力面包\n巧克力味\n公主卷\n养生粥\n冷锅鱼\n金桔晶冻柠檬\n龙利鱼片\n秘制叉烧\n秘制梅肉\n秘制红枣\n桃仁剔炒鸡\n巧克力冰莹\n巧克力巴菲\n巧克力熔岩\n包公鱼\n千张丝\n南乳肉\n南煎肝\n南瓜糕\n摩卡奇诺\n卡特鱼\n卤肉面\n烙饼卷带鱼\n可口小菜\n鸡胗\n叶包鸡\n一号肥牛\n沙面\n合家欢\n综合拼盘\n吞拿鱼\n味三鲜\n冒菜\n卤蛋\n奶酪\n泡菜\n风味海藻\n猪肝\n各种系列\n脑花\n腊肠\n炒花菜\n西式培根\n咸水角\n咸水鸭\n豆制品组合\n龙虾\n马蹄响铃卷\n咖喱时蔬\n咖喱杂菜\n咖喱肥牛\n咖喱膏蟹\n四宝蔬\n四宝饭\n四小拼\n四果汤\n四色饺\n四重奏\n法国鹅肝\n芋圆二号\n地瓜叶\n地皮菜\n墨鱼卷\n芝士肋排\n樱花芝士虾卷\n芝士鳗鱼\n软壳蟹卷\n大嘴蛙\n点心大团圆\n大扁鱼\n大海螺\n大盆鸡\n大盘鱼\n大肠煲\n大豆腐\n大锅巴\n大麻球\n大黄鸭\n明太鱼丝\n招牌火车头河粉\n头道菜\n特色奥灶面\n老奶洋芋\n奶酪卷\n夏威夷贝\n嫩笋尖\n狮子头饭\n骄子肥牛\n子锅仔\n原味椰子鸡\n安康鱼\n元宝甲鱼\n烩菜\n农家锅巴\n至尊拼盘\n百叶小堂菜\n新鲜油菜\n高山小猪肉\n油焗小青龙\n小食组\n薰衣草小饼干\n小鲜肉\n可尔必思\n千层豆腐\n山楂茶\n韶山炒鸡\n山猪肉\n山药羹\n小炒\n农家手工麻糍\n巴旦木\n干烧肉\n干逼鹅\n干酪鱼\n年糕虾\n开胃酒\n乳酪\n小菜\n意式浓汤\n烧鸭\n三式组合\n蛋卷\n五彩时蔬\n彩缤纷\n彩虹卷\n烤玉米\n德克士\n心太软\n生态甲鱼\n思慕雪\n怪味鸡\n手扒鸡\n手抓蟹\n手掰肠\n手撕兔\n手撕笋\n扒广肚\n拌冰藻\n拌板筋\n拌桃仁\n拌海菜\n拌耳丝\n拌花菜\n凉拌茄子\n拌莴笋\n拌蜇头\n菠菜拌豆干\n拌面筋\n拌驴肉\n拌鱼皮\n拌鸡杂\n拌鸡肚\n辣拌鸡胗\n招牌虾\n猪排火锅\n排骨粉\n手掰豆腐\n手撕牛肉\n手撕茄子\n三文鱼皮\n阿拉斯加蟹\n蒜蓉粉丝\n昔果乐\n水晶牛肉\n水晶肴肉\n朝天锅\n木瓜汁\n日本料理\n猪杂汤饭\n砂锅\n杂粮粥\n杂粮糕\n老油条牛肉\n杭三鲜\n三杯鸡饭\n松花鱼\n美极海虾\n鲜果乐园\n优格\n芒果捞野\n鲜果百汇\n果雪芭\n架子肉\n柚子蜜\n柠檬红\n青柠蒸鱼\n柴火鸡\n栗子鸡\n核桃派\n花菜炒肉\n格拉条\n桂花鸭\n蟠桃祝寿\n过桥豆腐\n桶子鸡\n木桶羊肉\n梅子肉\n青椒肉丝\n滴脆腰花\n金椒鱼锅\n椒鸡杂\n椒麻鱼\n木瓜椰奶爽\n椰子汁\n椰子饭\n椰香鸡\n榴莲卷\n樱桃肉\n橘柠檬\n柠檬冰茶\n水库鱼\n水蜜桃\n水豆芽\n蜜汁木瓜\n鸡汁脆笋\n椰汁鸡汤\n咸肉粒汤丝螺\n酸汤揪片\n江团\n百叶\n酸汤羊肉\n青菜\n沙姜鸡\n沙茶酱\n油多士\n酱油拌饭\n曲奇\n油鳝糊\n泡萝卜\n气泡饮料\n蒜泥白肉
\n澳洲牛肉\n海上鲜\n海参汤\n上海熏鱼\n海藻丝\n海螺汤\n海鲜拼\n鲤鱼\n溜肉片\n滋补汤\n火豆腐\n火锅鸡\n炒三丝\n炒五花\n炒什锦\n炒仔鸡\n炒咸肉\n炒墨鱼\n炒杂蔬\n炒米糕\n炒粉皮\n炒粿条\n板炒肝尖\n炒蘑菇\n炒蚬子\n炒螺片\n炒野菜\n炒钉螺\n炒鳝柳\n炒鸡肉\n酸辣炒鸡肫\n炒鹅肝\n炖辽参\n风味炝花生\n炸刀鱼\n油炸双拼\n香炸土豆\n炸排骨\n炸腐竹\n油炸臭干\n炸虾片\n烤乳鸽\n烤兔腿\n燒烤大拼\n烤大肠\n烤春鸡\n烤海虾\n烤羊背\n烤羊腰\n烤花蛤\n烤酸菜\n烤面筋\n烤鲷鱼\n烤鸡头\n烤鸡心\n烤鸭包\n煎烤黄鱼\n烧三鲜\n叉烧乌冬\n烧桂鱼\n烧羊肉\n烧青口\n烧鲤鱼\n烧鲳鱼\n香菇烧鸡饭\n烧鹅皇\n浓郁鲜香烩杂菇\n烩松肉\n烩海参\n烩羊肉\n烩蘑菇\n烩蹄筋\n烩鱼肚\n热狗堡\n焖江团\n焖鮰鱼\n焖鲜虾\n黄焖鸡饭\n奶油焗杂拌\n孜然土豆\n香煎藕饼\n煎蛋汤\n煎血肠\n煎鱼饼\n煎鸡饭\n北京煨牛肉\n煮海参\n生煮花生\n干煸龙虾\n熊猫包\n熏排骨\n熏牛肉\n熏鸭肉\n葱爆河虾\n爆爆珠\n爆爆蛋\n爆爽肉\n爆肚丝\n酱爆茄子\n油爆螺片\n爆鳝背\n爬虾肉\n爱丽丝\n爱玉冰\n鸦片鱼头\n金牌羊肉\n招牌豆腐\n招牌豆花\n招牌龙虾\n牛三宝\n牛丸粉\n肥牛乌冬\n牛仔腿\n手抓牛大骨\n牛尾锅\n牛尾骨\n牛排煲\n牛杂粉\n牛柳饭\n牛肉冻\n牛肉干\n芝士牛肉酱\n牛肚面\n牛肩肉\n牛胸口\n牛胸肉\n牛脆骨\n牛蛙虾\n牛骨头\n原味牛骨汤\n牛魔王\n猪排面\n猪肉汤\n猪肝面\n猪肺汤\n猪蹄虾\n猪骨煲\n猕猴桃汁\n金玉满堂\n玉米酥\n玫瑰饼\n珍珠丸\n珍珠马蹄\n玫瑰花茶\n椰奶\n麻辣瓦香鸡\n甜不辣\n百丽甜情人\n甜玫瑰\n甜辣虾\n生姜茶\n生炒骨\n野生菌汤\n田园鸡\n田螺鸡\n甲鱼锅\n意式薯条配芝士碎\n瘦肉汤\n白切肉\n白玉卷\n白糖糕\n白蚬子\n百叶丝\n菠菜\n黄瓜\n脆皮烧肉\n带皮羊肉\n脆皮蛋卷\n皮青菜\n带皮驴肉\n鸡蛋\n大盆牛蛙\n盐煎肉\n盲公鱼\n石榴鸡\n破仑酥\n碎米鸡\n洛神花茶\n福寿螺\n禾花鱼\n稻香肉\n竹筒肉\n竹筒鸡\n竹蛏王\n蒸笋壳鱼\n笑脸薯\n金米海参\n米焗饭\n稀饭\n米粑粑\n馒头\n菌类大拼\n鱼籽豆腐\n粉圆子\n粉粿冰\n粒粒爽\n粘豆包\n粟米汤\n糍饭糕\n年糕排骨\n焦糖炖蛋\n冰糖葫芦\n糖醋鱼\n红糖锅盔\n素凉菜\n荤素拼盘\n紫菜饭\n紫薯粉\n红咖喱\n招牌脆皮红子鸡\n红枣汁\n红枣糕\n招牌红烧鱼\n红腰豆\n烤红薯片\n红薯饼\n红豆粥\n红豆饼\n结烧肉\n绿豆沙\n绿豆酥\n缩骨鱼\n羊肉筋\n美人蛙\n美人蹄\n美人鱼\n美味蛙\n美极蛙\n美果冰\n群英会\n招牌骨皇翅\n翡翠卷\n卤面\n韭菜老烧蛋\n老肉片\n耙泥鳅\n木耳肉片\n肉三鲜\n肉乌冬\n烧肉夹饼\n肉抓饭\n肉汤锅\n肉清汤\n干锅花菜\n咸肉菜饭\n肉蛋卷\n牛肉蛋堡\n肉馅锅盔\n肉香菇\n肚丝面\n肠砂锅\n虾滑\n海胆豆腐\n胖头鱼\n胡萝卜\n脆皮鱼\n脚趾肉\n腌黄瓜\n豆腐烧肉\n肝腰合炒\n臭大元\n臭鲈鱼\n钢管鸡\n彩色心情\n特色肥牛\n雪砖\n芒果干\n芝士味\n芝麻球\n芝麻饼\n芥菜汤\n芦花鸡\n芦荟爽\n芦荟茶\n花生汤\n花生芽\n花生酱\n枣花蜂蜜\n花鲢鱼\n苹果派\n茶叶蛋\n茶套餐\n欧蕾\n草莓汁\n荞麦饼\n薄荷苏打\n櫻桃蔓越莓\n莓类甜心\n莴笋干\n莼菜汤\n蘑菇火锅\n菇炒肉\n蔬菜乌冬\n菜桃仁\n蔬菜浓汤\n炒粉\n雪菜笋丝\n韭菜粑粑\n香菜羊肉\n菜脯蛋\n酸菜鱼面\n菠萝冻\n菠萝蜜\n菠萝鸡\n萝卜汁\n冰火菠萝油包\n菠萝焗饭\n印度飞饼\n拌葫芦丝\n葫芦头\n葱包烩\n烂蒜肥肠\n蒸洄鱼\n蒸大肠\n蒸带子\n蒸皖鱼\n蒸鱼腩\n蒸鲟鱼\n清蒸黄鱼\n时蔬大拼\n蔬菜煲\n薏米水\n原味藏香猪\n海藻沙拉\n虾扯蛋\n龙虾捞饭\n虾烩饭\n原味蜂蜜松饼\n蝴蝶骨\n蟹肉卷\n蟹肉煲\n西蓝花\n西餐肠\n小豆凉糕\n刨冰\n蔬菜拼盘\n土豆沙律\n豆瓣鱼\n豆腐串\n冻豆腐拼\n豆花糕\n土豆花菜\n豉香鸡\n豚肉扒\n豬頸肉\n香脆跳跳骨\n跳跳鱼\n蹄花锅\n兔头\n特色辣子鱼\n辣脆笋\n辣萝卜\n辣酱面\n麻辣鲈鱼\n香辣鸡煲\n香辣鸡球\n鸭胗\n香辣达仔鱼\n大连鲜鲍\n迷踪蟹\n退秋鱼\n配土豆\n配苹果\n酱油饭\n牡蛎\n酱糖饼\n酱通菜\n酱骨棒\n酸汤面\n酸菜鸡\n酸豆角\n酿尖椒\n醉椒鸡\n醋萝卜\n老醋蛰头\n重
乳酪\n娇娇\n野蕨菜\n金宝鱼\n金桔汁\n黄金泡菜\n招牌金牌骨\n炸馒头\n钵钵菜\n干锅素菜\n铁锅杂鱼\n铁锅炖鱼\n干锅牛杂\n锅甲鱼\n石锅鳝鱼\n什锦素菜\n长腿蟹\n大阪煎饼\n雅片鱼\n青咖喱\n青柠汁\n青柠茶\n青瓜卷\n青石斑\n青菠面\n面包篮\n咖喱面包鸡\n面线糊\n风吹肉\n大土豆\n风干肉\n酸梅风干鱼\n飘香肉\n饭团烧\n饭定食\n饺子皮\n饼套餐\n煎饼果子\n饼羊肉\n馋嘴鸭\n香啡包\n麦香山药\n香猪肝\n香瓜子\n香芋丸\n香芋酥\n五香花生\n香草汁\n香菇面\n香蕉卷\n香蕉塔\n香蕉派\n香辣酱\n香锅虾\n香锅鸡\n酒香鲅鱼\n鹅掌\n马拉糕\n马蹄水\n骨浓汤\n无骨炸鸡\n化骨绵掌\n三吃\n班鱼两吃\n鱼丸面\n鱼包蛋\n海鲜大烤\n鱼头面\n墨鱼汁面\n鱼汤面\n鱼片面\n鱼皮饺\n鱼砂锅\n鱼肉卷\n鱼脆饼\n三文鱼腐皮\n鱿鱼片\n鱿鱼酥\n鱿鱼面\n鲍螺片\n鲍鱼粥\n海鲜乌冬\n海鲜汤底\n鲜山药\n海鲜锅贴\n鲶鱼片\n鲷鱼片\n鳕鱼块\n鳕鱼条\n鳝鱼煲\n辣子鸡便当\n炸鸡拼盘\n鸡排锅\n鸡油饭\n鸡肉粥\n鸡肉饼\n鸡腿菇\n鸡鸭杂\n鸭五件\n鸭泡饭\n鸳鸯汤底\n鹅火锅\n咸鹅锅仔\n麦片虾\n麻油鸡\n麻辣串\n麻辣拌\n蟹黄南瓜\n蟹黄小笼\n黄豆煲\n黄金包\n黄金鱼\n黄鳝饭\n三黄鸡锅\n芥兰黑山羊\n黑椒面\n黑芝麻\n招牌椒盐龙头\n盘龙茄子\n乌龙茶奶\n龙须面\nQ麻薯\n一品包\n五花肉\n满满一桌菜\n一洋卷\n布丁可可\n布丁奶绿\n丁桂鱼\n炒饭\n七虾堡\n千丝万缕虾\n三层肉\n三杯虾\n三色糕\n丝丁鱼\n拉皮\n丝荞面\n金丝虾卷\n萝卜丝酥饼\n丸子面\n芥末乌贼烧\n欢乐时光\n香芒轻乳\n乳酪条\n五花趾\n五花锅\n五谷饭\n五香卷\n什锦煲\n杏仁月饼\n生煎包\n果仁蛋卷\n蚝仔煎蛋\n仔米糕\n时令凉菜\n时令小菜\n特价羊肉\n优乐酷\n考伯沙律\n伯爵茶\n黑松露低温蛋\n有机活体豆苗\n俱乐部\n倭豆饭\n超值晚餐\n健康菌\n李師傅脆肚\n芋儿烧鸡\n全羊汤\n八味碟\n八宝茶\n养生煲\n牛排\n冬瓜盅\n冰柠茶\n冰桔汁\n冰皮月\n冷菜区\n冻柠茶\n华夫\n冰激凌月饼\n龙凤什锦\n大刀耳片\n刨猪汤\n卤鸭\n豆制拼盘\n皮冻\n自制羊肉\n秘制龙虾\n招牌炒出前一丁\n餐前饮料\n剪刀面\n巧克力千层\n巧克力薄饼\n冬荫功汤面\n薯条加奶酪\n加明虾\n蔬菜\n鳕鱼\n自助晚餐\n自助水果\n腐皮包黄鱼\n陕北小菜\n西北面筋\n土匪猪肝\n千张卷\n半只鸡\n华夫筒\n中华海藻\n越南河粉\n南瓜挞\n南瓜汁\n卜卜甲\n萝卜牛杂\n牛奶卡仕达\n卤双拼\n卤菜粉\n卤鸡蛋\n卷大虾\n压土豆\n厚蛋烧\n草原肥羊\n老友牛杂\n双人锅\n双味菇\n双拼锅\n爽口凉菜\n口口香\n锅仔口味鸭\n口水虾\n脆口萝卜\n一口西多\n脆口黄瓜\n叶儿粑\n百叶拼盘\n五合一面\n混合果汁\n吊龙膀\n蛋\n五花\n海带\n腊味煲饭\n秋葵\n茄条\n味菌汤\n炸虾饼\n风味螺肉\n鱼锅\n风味鳕鱼\n和子饭\n和豆腐\n炒咕噜球\n咖喱汤\n芋泥烩咸蛋黄\n咸鸭蛋\n精品小菜\n一品蛋酥\n墨西哥脆饼\n菠菜面筋\n咖哩牛腩\n喜头鱼\n四喜烤夫\n四喜豆腐\n咖喱叻沙\n咖喱土豆\n咖喱拼盘\n咖喱牛楠\n咖喱鱼头\n喷柠鸡\n四冷拼\n四喜面\n北京四大碗\n四大缸\n回卤干\n回味虾\n回鱼锅\n沁园牛肉\n国宝鸡\n芋圆1\n圆子锅\n土匪鱼\n土匪鸡\n土豆松\n大块豆腐\n坨坨肉\n坨坨牛肉\n培根虾\n塌豆腐\n塘坝鱼\n石墨豆腐\n墨鱼柳\n墨鱼烧\n墨鱼饭\n墨鱼饼\n芝士地瓜\n芝士豆腐\n芝士鸡煲\n芝士鸡锅\n芝士龙虾\n多味鱼\n大块肉\n大杂扒\n大满贯\n大炖菜\n大牛肉\n大王蛇\n大盘菜\n大红肠\n大肉串\n大腰子\n大饼子\n大马哈\n大鸽饭\n大麻鱼\n飞天通菜\n天鹅蛋\n太鱼汤\n葫芦头泡馍\n夏威夷匹萨\n夹牛肉\n曲奇香奶\n奥尔良小吃\n奶冻糕\n水果\n奶油杯\n奶油烧\n双皮奶菜菜\n奶酪鱼\n奶酪鸡\n奶香片\n妇罗虾\n烤包\n娘惹羹\n阿婆醉鱼\n嫩千张\n腰片\n麻辣嫩鲶鱼\n嫩鸡肉\n拉茶\n栗子糖水\n带子裙边\n松子鲈鱼\n普宁豆腐\n同安封肉\n宝宝龟\n肉冻\n至尊批萨\n至尊眼肉\n小咸菜\n小墨鱼\n小山芋\n小棠菜\n小煎鸡\n雪山小王子\n小盆栽\n小米蒸\n抹茶小红豆\n小肥鹅\n小花菇\n小菜们\n小鸭酥\n脆耳\n山楂糕\n火山牛肉\n山竹笋\n脆笋\n巴山腊肉\n特色海鲜\n栾川豆腐\n泰
州干丝\n潮州汤底\n兰州烩菜\n血鸭\n手工曲奇\n锅巴盖肉\n筋头巴脑锅\n帝皇蟹\n带鱼串\n海带鸡汤\n本帮熏鱼\n笋干丝瓜\n干锅鱼\n什锦年糕饼\n德庄鸭肠\n开心果\n开花肠\n奇异果冰\n奇异果爽\n各式水果\n酱汤\n法式金砖\n招牌奶茶\n微辣锅\n点心套餐\n心心相印\n热情蓝莓\n扇贝卷\n手把肉\n手火锅\n打边炉\n扣三丝\n扣肉面\n扣鲍鱼\n扣鹅掌\n把子肉\n手抓牛肉\n手抓羊肉\n抹茶冻\n拇指包\n沙拉黄瓜\n拌海茸\n拌牛肚\n拌猪肝\n拌田七\n拌笋丝\n凉拌苦瓜\n凉拌茭瓜\n拌荆芥\n拌菠萝\n拌藕带\n拌贝裙\n拌青瓜\n鸟贝\n拌鸡架\n拌黄豆\n表叔招牌骨\n手指牛肉\n鸡蛋挑石头\n干捞苕粉\n掌亦煲\n羊排抓饭\n排盖饭\n猪皮冻\n提子酥\n鸡肉丸\n凯撒沙律\n文鱼汤\n斑鱼滑\n调料组合\n德克萨斯牧场\n安格斯肉眼\n时蔬拼\n曲奇饼\n木瓜奶\n木瓜水\n木瓜羹\n木瓜酥\n肉末茄子\n杂粮汤\n杂鱼锅\n三杯中卷\n肉松军舰\n松坂肉\n黑松露包\n板栗粥\n板栗饼\n铁板羊肉\n铁板肥牛\n铁板香芋\n铁板鳕鱼\n板鸭煲\n南极冰藻\n南极冰鱼\n腰果丹麦\n鲜果什锦\n芒果啫喱\n果巴菲\n鲜果庄园\n蜜果甜心\n芒果糯米\n芒果\n芒果雪泥\n芒果雪花\n枣皇糕\n红枣莲心\n柑橘蜜\n柠檬虾\n瑶柱白粥\n栗子塔\n栗子汤\n板栗酥饼\n爆核桃鸡\n培根意麵\n培根浓汤\n培根芦笋\n玛格丽特\n果漾\n格瓦斯\n樱桃之恋\n樱桃炖蛋\n蟠桃\n桑拿虾\n百香桑椹果\n过桥排骨\n马桥香干\n怡宝桶装水\n木桶豆腐\n话梅山药\n话梅花生\n话梅芸豆\n梳芙厘\n棉棉冰\n椒乌鱼\n尖椒肥肠\n椒脆肚\n蘑菇\n椒香鱼\n黑椒鸡肉\n椰子挞\n椰汁冰\n意式椰香包\n榴莲糕\n榴莲角\n鲜橙木瓜\n柠檬蜂蜜\n柠檬蜜茶\n柠檬鲈鱼\n牛肚\n毛圆汤\n毛肚锅\n和民拉面\n牛气冲天\n热气羊肉\n原味气锅鸡\n水果卷\n水火锅\n水煎肉\n八宝\n鸡汁木耳\n蜜汁梅肉\n白汁河豚\n炸鸡\n墨鱼汁烩饭\n干锅花蛤\n汁豆干\n汁豆角\n汁鱿鱼\n汆白肉\n洄鱼\n西江钳鱼\n长江鮰鱼\n清锅\n竹荪\n优质粉汤羊血\n汤苋菜\n上汤豆苗\n汤酥肉\n鱼翅\n汽水肉\n巴沙鱼片\n巴沙鱼锅\n河豚鱼\n沸腾虾\n油墩子\n牛油果卷\n红油百叶\n油粑粑\n油耳片\n红油肚丝\n油腐竹\n油薄饼\n招牌葱油蚕豆\n油面筋\n葱油飞鱼\n宇治抹茶\n青椒蛙\n泡泡锅\n泰香鸡\n泼妇鱼\n洋葱塔\n洛神花\n蛋黄流沙卷\n啤酒\n意式浓菜汤\n海三鲜\n海之恋\n海参粥\n海棠糕\n海苔角\n海蚌片\n三文鱼海蜇边\n海螺肉\n拌海野菜\n青斑\n印度焗咖喱\n海鲜粉\n海鲜酱\n深海鳕鱼\n浸蚌片\n冰激淋月饼\n淋豆腐\n油淋鲈鱼\n淮山卷\n清水锅\n清炖鸡\n清蒸蟹\n油渣莲白\n美丽温哥华\n温泉鸭\n港小炒\n湘之驴\n溜肝尖\n滑双拼\n滚肥牛\n满堂红\n火锅包\n火锅菜\n火龙卷\n火龙虾\n乌鱼\n美极炒三脆\n炒三蔬\n印式风味炒土豆\n炒大肠\n炒小笋\n炒揪片\n渔家小炒海肠\n炒猪皮\n炒猪肚\n炒瑶柱\n炒紫菜\n炒羊杂\n炒胜瓜\n炝炒腰花\n炒苕皮\n蒜苗炒莲菜\n炒螺肉\n炒豆芽\n清炒豆苗\n炒魔芋\n炒鸡架\n炒鸡肾\n炒鹅肠\n炖大鹅\n炖水鸭\n炖酥肉\n炖鹌鹑\n北京炸咯吱\n炸响铃\n炸子鸡\n年糕\n炸灌肠\n炸蚕蛹\n炸饺子\n鱼饼\n口蘑\n燒烤小拼\n烤心管\n原味烤海鲜\n烤牡蛎\n烤目鱼\n烤米糕\n烤羊棒\n烤肉汁\n烤肉酱\n烤脆骨\n招牌烤脑子\n烤豆干\n烤豆腐\n烤银杏\n烤香蕉\n烤香鱼\n烤鱼片\n鲭鱼\n烤鸡胗\n烤鸡脖\n烤鸭卷\n烤鸭羹\n烧乳鸭\n烧元贝\n叉烧双拼\n干烧四宝\n红烧河豚\n烧海螺\n烧海鱼\n烤腌肉\n烧牛尾\n烧笋干\n烧羊棒\n皮蛋烧藕丸\n烧青菜\n叉烧餐包\n烧鮰鱼\n烧鲶鱼\n鸭胸\n鹅酥\n烧鹿肉\n红烧龙虾\n烩丝瓜\n干肉烩土豆\n西芹炒山药\n烩肚条\n烩麻什\n烩麻食\n焖全猪\n焖杂鱼\n焖肉面\n焖萝卜\n焖鸡煲\n芝士焗紫薯\n芝士焗蕃薯\n焗蟹斗\n芝士焗青口\n芝士焗香蕉\n芝士焗鲍鱼\n焦炸丸\n火焰烧虾\n甲鱼\n孜然脆骨\n孜然豆腐\n孜然鱿鱼\n煎海虾\n煎蛋卷\n煎蛋饼\n煎香蕉\n煎鱼柳\n煨豆腐\n煨龙骨\n煲套餐\n煲筒骨\n干煸土豆\n煸鱿鱼\n熏鲳鱼\n熔岩烧\n熘肉段\n熘鱼片\n燕麦粥\n爆仔鸭\n爆爆冰\n爆爆虾\n黄喉\n八爪鱼锅\n爱上鱼\n波霸蟹小姐爱奶油\n真爱排骨\n王牌牛肉\n招牌牛腩\n锅
烙\n招牌鱼粉\n牛三鲜\n牛心管\n牛扒餐\n牛杂汤\n牛油果\n牛筋锅\n酱肉包\n牛肉筋\n牛肩峰\n牛脸肉\n牛舌饼\n鲶鱼锅\n牛蛙面\n牛轧糖\n肥牛金针菇\n牛鞭汤\n牛骨煲\n鸡公煲\n特色鱼\n狗肉汤\n独面筋\n猪杂汤\n黑椒猪爽肉\n猪猪组合\n猪肉煲\n香猪肉片\n猪蹄煲\n猪蹄筋\n猪里脊\n猪颈排\n猪香肉\n柚子玄米茶\n玉柠檬\n玉米粥\n玉米蟹\n玉米面\n焦糖玛其朵\n玫瑰露\n玻璃片\n珊瑚皮\n珍珠汤\n琵琶虾\n青瓜小卷\n香拌木耳\n苦瓜炒蛋\n木瓜炖奶\n煎蛋\n红枣\n肘花\n雪梨\n瓜果鲜奶\n腊味瓦煲饭\n甜甜拼\n甜蜜蜜\n甜面酱\n田源鸡\n田鸡煲\n田鸡粥\n北疆菜羹\n疙瘩面\n白切粉\n白切羊\n白森林\n白菜丝\n白菜汤\n白骨鱼\n百味鸡\n百岁鱼\n咖喱皇飞蟹\n鳕球\n酥皮浓汤\n脆皮炸鸡\n脆皮牛腩\n皮蛋汤\n青椒\n脆皮香蕉\n大盆牛肉\n水盆羊肉\n牛心菜\n益菌多\n盏通菜\n盐水虾\n盐水鹅\n盐焗鸽\n椒盐牛蛙\n椒盐龙虾\n盖锅巴\n目鱼花\n骨肉相连串\n石子饼\n石榴花\n石锅粉\n石锅蛙\n矿泉水\n石锅砂锅面\n牛肉\n碳烧肉\n神之恋\n汤锅\n食神豆腐\n鸡三宝\n马来西亚福建面\n私房虾\n章鱼须\n芋儿竹笋鸡\n竹荪鹅\n笋尖粉\n莴笋山药\n笋干煲\n筋皮子\n方签牛肉\n奶露\n玉米松仁\n米血糕\n意大利肉酱意粉\n蟹粉小笼\n糊塌子\n年糕套餐\n芝士年糕\n红糖冰粉\n糖红枣\n糖醋肉\n红糖馒头\n糯米肉\n素排骨\n素火腿\n素烧鸭\n素鸡煲\n紫米烧\n紫菜煲\n紫薯包\n紫薯卷\n紫薯托\n紫薯汁\n碳火紫薯秀\n红宝石\n红汤面\n红沙鱼\n红白锅\n红糖饼\n红菜苔\n红豆烧\n缤纷欢舞\n纸包鸡\n锡纸鲈鱼\n综合冰\n绿豆面\n绿豆饼\n缤纷卷\n罐焖牛肉\n罐罐菜\n招牌罗宋包\n罗氏虾\n罗汉肚\n羊头肉\n烤羊套餐\n羊小排\n羊脆骨\n羊腩煲\n羊腿排\n美蛙鱼\n翡翠包\n老三鲜\n老火汤\n老爆三\n老陕菜\n银耳南瓜\n木耳桃仁\n木耳炒蛋\n炖雪梨\n肉丝汤\n牛肉冒菜\n牛肉土豆条\n塔可\n大葱\n驴肉大饼\n牛肉宽面\n肉松包\n肉松饭\n牛肉泡饭\n肉浓汤\n羊肉焖饼\n牛肉粿条\n肉饼饭\n烤鹅肝\n肠火锅\n肥肠饭\n开胃萝卜\n开胃豆腐\n开胃鸡片\n胚芽乳\n胡椒肚\n招牌能量\n脆肉鲩\n鲜虾脆脆棒\n脆虾肠\n脆锅巴\n脆鱿鱼\n脚牛肉\n腊八蒜\n腊排骨\n腊肉饭\n腌生虾\n腌萝卜\n豆腐拼盘\n臭豆腐虾皮\n牛腩捞面\n咖喱腰果鸡\n沸腾草鱼\n沸腾鱼片\n沸腾鲶鱼\n火腿微萨\n炸膝软骨\n油炸臭干子\n牛舌拼盘\n特色凉菜\n蓝色妖姬\n白色恋人\n特色蘸料\n诱惑\n艾窝窝\n芋圆捞\n芋头冰\n芋头煲\n芋泥鸭\n芋艿煲\n芒果杯\n芒果派\n芒果贝\n芙蓉汤\n芙蓉蟹\n芝士包\n芝士棒\n芦笋卷\n芦笋汤\n芭夯兔\n花之卷\n桂花乌龙\n花排骨\n枝烧\n花生汁\n花生浆\n花生粥\n花生酥\n百花稍梅\n桂花芋艿\n芸豆卷\n豆芽炒粉\n苏子叶\n海苔拌饭\n苦荞馍\n番茄浓汤\n番茄鱼片\n番茄鱼锅\n茴香豆\n茗茶一壶\n茶大福\n抹茶松饼\n茶泡饭\n茶皇虾\n抹茶雪糕\n香草排骨\n荔枝蜜\n荞麦\n竹荪三鲜\n荷叶饼\n茉莉花茶\n莓优格\n莓千层\n莓多士\n莓新地\n梅类果汁\n草莓薄饼\n莓金砖\n班戟\n榴莲布蕾\n蘑菇城堡\n香菇油菜\n蘑菇烩饭\n金菇脆笋\n菇鸡煲\n炒菊花菌\n玉米菜团子\n青菜小拼\n菜淡皮\n清炒\n白菜炒虾\n德式泡菜烩饭\n豆皮\n鲜肉\n韭菜鸡蛋\n菠菜卷\n菠菜塔\n菠菜汤\n菠菜饭\n菠萝堡\n萝卜包\n菠萝可颂\n葛根粉\n葡国鸡\n葡萄干\n葡萄柚\n葡萄鱼\n香葱豆腐\n葱香鸡\n蒙布朗\n蒜羊血\n蒸盆子\n蒸粉果\n蒸肠头\n蒸肥肠\n蒸蚬子\n芙蓉蟹粉\n蒜蓉龙虾\n蓝莓味\n蓝莓派\n鲜蔬组合\n蔬菜吧\n蔬菜盘\n蔬菜羹\n香蕉松饼\n薯小弟\n金薯沙律\n藤椒鸡\n西域蘸酱菜\n虹鳟鱼\n活虾双吃\n龙虾拌面\n牛肉丸\n炖蛋\n虾肉卷\n蚝仔煎\n蚬子肉\n蚵仔煎\n蛋拌饭\n蛋灌饼\n蛋牛肉\n蛋花粥\n蛋角煲\n皮蛋豆花\n鸡蛋醪糟\n牡蛎煎蛋\n青蛙撞奶\n蛙蛙鸡\n雪蛤炖蛋\n蜗牛汤\n意式蜗牛饭\n蜜桃汁\n蜜鹅肝\n蝴蝶酥\n螺旋藻\n蟹壳黄\n蟹籽丸\n蟹籽包\n补炖汤\n油泼裤带面\n煎裹蒸粽\n西柚茶\n西瓜冰\n诱惑鸡\n调料区\n蔬菜大丰收\n五谷香馍\n豆塔塔\n红豆抹茶\n土豆排骨\n豆曲奇\n豆沙粽\n豆皮卷\n豆腐丸\n豆腐泡\n豆腐虾\n
红豆薄撑\n豆角丝\n烤鱼\n香煎饼\n豚平烧\n川贝雪梨\n费城卷\n赖尿虾\n越南卷\n踏板鱼\n猪蹄火锅\n蹦鲤鱼\n刺身拼盤\n辣三鲜\n辣乌鱼\n辣双脆\n香辣板筋\n辣椒面\n辣汤面\n麻辣爆肚\n麻辣牛肚\n瓜条\n竹笋\n辣羊蹄\n辣肉面\n肋条\n麻辣脑花\n草鸡\n麻辣蜗牛\n麻辣诱惑\n麻辣鱼片\n糟辣鲤鱼\n香辣鸡杂\n鸭肠\n边馍馍\n过江肠\n过江鱼\n鸿运当头\n大连鲍鱼\n起酥肉松\n酥脆鸡\n奶酪山药\n酱凤尾\n酱卤肉\n腌酱拼盘\n云吞炸酱捞面\n酱料\n酱汁肉\n汁锅\n酱油渍\n酱油鸡\n麻酱花卷\n酱香骨\n酱脊骨\n酸辣碟\n酒酿元宵\n酿皮子\n酿秋茄\n鱼酿鱼腐\n醉排骨\n醉牛肉\n双脆\n醋溜鸡\n醋烧鸡\n野菌菇\n野菜卷\n野菠菜\n野香菇\n野鸡蛋\n招牌金瑶鸡\n黄金虾卷\n金钱手\n金鸭梨\n钵仔糕\n铁锅炖\n银雪鱼\n银馒头\n石锅三国\n干锅三素\n锅仔鸡\n大肠\n虾串\n火锅大锅\n干锅掌翅\n锅汤底\n石锅泡饭\n大锅海鲜\n羊杂\n美蛙\n石锅菜饭\n干锅藕条\n干锅豆角\n石锅香芋\n平锅鲤鱼\n干锅鲶鱼\n锅鳕鱼\n鸡杂\n无锡排骨\n冰镇龙虾\n长脚蟹\n澳门烧肉\n洛阳燕菜\n陆双拼\n雪花肥牛\n花雕醉枣\n花雕醉鸡\n雞尾酒\n雪文治\n雪梅娘\n雪梨粥\n雪花糕\n雪莲子\n雪蛤粥\n霹雳泡饭\n招牌雷公鸭\n抹茶霜淇淋\n青瓜烙\n面包圈\n面包塔\n面包干\n面包虾\n拼盤\n风塘虾\n风沙鸡\n餐包仔\n饸饹面\n饼夹肉\n奶绿\n五香牛杂\n绿茶\n酱香肘花\n香肠面\n香肠饭\n香肺片\n香芋派\n香芋饼\n香芒糕\n苦菊\n香蕉酥\n蛋仔\n香豆皮\n香辣汤\n香辣鸭\n香酥饼\n香锅兔\n奶香馒头\n香鱼块\n酱香鳕鱼\n麻辣\n马步鱼\n闷骚南瓜\n骨拼盘\n骨汤面\n清汤\n骨湯底\n脆骨牛肉\n脆骨饭团\n骨香鸡\n骨鱼煲\n无骨鸭蹼\n魔芋结\n魔鬼鱼\n鱼下巴\n烤鱼伴侣\n鱼唇汤\n鱼嘴巴\n鱼子卷\n手握\n拌菜\n烤鱼泡泡\n鱼泡馍\n鱼清汤\n鱿鱼火烧\n炒蛋\n鱼炖鸡\n鱼细卷\n有鲜鱼\n鱼薄切\n鱼蛋皇\n鱼蛋角\n鱼钩鱼\n鱼锅贴\n闷锅\n鱼香锅\n鱿鱼饼\n鲍鱼片\n鲜兔腰\n鲜果杯\n鲜果汁\n鲜果花\n海鲜焗面\n鲜牛肝\n鲜百叶\n鲜百合\n鲜肉粽\n鲜蔬卷\n鲜蔬汤\n鲜鱼面\n鲨鱼肚\n鳄鱼汤\n宫爆鳕鱼丁\n鳕鱼煲\n鳕鱼球\n鳗龙卷\n鳝丝面\n烤鸡匹萨\n鸡杂面\n鸡杂饭\n招牌鸡汤饭\n鸡爪煲\n鸡肉片\n奥尔良鸡肉粒\n鸡薄饼\n东坡鸡豆花\n鸡里蹦\n鸭头虾\n鸭板肠\n鸭架子\n鸭爪煲\n招牌莓酱\n鸭肉饭\n鸭脆肠\n鸭脚煲\n鸭脚筋\n鸭腿饭\n鸳鸯中锅\n鸳鸯卷\n鸳鸯糊\n鸳鸯虾\n鹅胸肉\n麦乐鸡\n蟹钳爱上麻得跳\n麻椒鸡\n芝麻火烧\n麻辣粉\n黄桂粥\n黄泥螺\n韭黄炒蛋\n黄牛汤\n黄腊丁\n黄金贝\n蛋黄锅巴\n黄鱼面\n黑土鸡\n黑曲奇\n黑米粽\n黑豆芽\n黑鮰鱼\n松鼠桂鱼\n龙利柳\n龙头鱼\n龙胆鱼\n招牌龙虾饭\n宫保鸡丁盖饭\n上脑\n焖鱼下巴锅\n绝世超芒\n东东包\n拔丝土豆\n拔丝地瓜\n金丝排骨\n关中杂拌\n鸡中翅锅\n雪中送炭\n烤串拼盘\n丽娜卷\n乌冬\n酷乐鲜柠\n南乳排骨\n乳猪\n仔猪煲\n味付三拼\n味付海藻\n伙伴肥牛\n宫保大虾\n菌汤兔肉锅\n全牛\n新西兰肥羊\n米兰虾饼\n兰馨锅\n绍兴醉鸡\n经典生煎\n冒牛肉\n鲜冬瓜片\n海鲜芝士乌冬面锅\n冰沙\n冰粉\n冰粥\n冷菜\n冷面\n冻柠七\n冻柠红\n凉皮\n凉粉\n熊猫凉糕\n凉茶\n凤爪\n大刀羊肉\n太极金刚煲锅\n秘制酱骨\n秘制鹅肝\n刺身\n餐前小食\n巧克力派对\n巧克力瀑布\n冬阴功汤锅\n面包切片\n大肠包小肠\n北冰洋\n披萨\n鸡叻\n千张\n元朗荣华月饼\n套餐+单点\n南瓜\n小园子\n萝卜丝饼\n萝卜炖牛腩\n卤味\n卤煮\n主厨浓汤\n叉烧肉\n一口多士\n爽口泡菜\n爽口苦菊\n可可\n虾吃虾涮\n小吊梨汤\n吐司\n腊味合蒸\n风味拉皮\n原味豆滋\n四味锅2\n拾味鱼扒\n绝味鸭架\n烤肉和生肉\n咖啡\n咖喱饭\n豆制品四拼\n一品锅饭\n一品鲈鱼\n咖喱火锅\n泡馍\n四拼锅\n江团焖锅\n江团鱼锅\n田园小炒\n庄园肥牛\n芋圆\n芋圆系列\n拌土鸡\n圣代\n地瓜\n老坛酸菜\n好莱坞炸篮\n培根卷\n培根锅\n避风塘海虾\n芝士宝贝\n芝士牛肉\n茉莉绿茶\n芝士薯球\n芝士蟹柳\n波士顿卷\n贝壳+虾\n塞外锅巴\n马来四大天王\n三文鱼大满足\n巧克力大爆炸\n大盘兔\n大锅\n大饼\n大骨\n寒天
爱玉\n渔夫泡饭\n功夫青笋\n功夫鲈鱼\n桥头炸肉\n鱼头诱惑\n菌王奇香锅\n咖啡奥利奥松饼\n椰奶凉糕\n法式奶油鸡\n奶茶\n外婆鱼头\n宋嫂鱼羹\n嫩肉\n梅子奶酪\n正宗羊肉\n客家咸鸡\n宽粉\n富贵卷\n寒之寒\n金丝对对虾\n寿司\n法式甜蜜小包包\n麻酱调料\n火车头小焖锅\n小鱼小牛锅\n开胃小菜\n尚一汤\n九尺鹅肠\n长岛冰茶\n川粉\n杭州卤鸭\n兰州酿皮\n巨蛋烧\n锅巴排骨\n锅巴藕丸\n海蛎子巴蛸锅\n铁板帆蛎贝\n花生海带双拼\n带子\n带鱼\n干碟\n干锅\n管鱼\n江南的年糕蟹\n印度薄饼\n一座城池\n宫廷奶酪\n浪漫法式塔锅\n清蒸彩虹斑\n德庄汤\n回忆盒饭\n鱼\n夏威夷风情沙律\n两情相悦\n三文鱼蛋黄汁烩意粉\n意面\n孝感米酒\n扇贝\n焖扇贝锅\n芒果牛奶手杯冰\n打糕\n扯面\n猫抓糍粑\n抹茶\n炸鸡沙拉拌饭\n大拌菜\n芋圆招牌餐\n刺身拼盘小\n拼锅\n双椒鱼头\n鸡排伴侣\n鸡排大亨\n接吻鱼\n雪蟹脚+海虹+撒尿虾\n凯撒萨拉\n文鱼\n万岁寿司·料理\n新地\n早餐\n比利时薯球\n甜虾明太子\n东北春饼\n水晶焖锅\n水晶皮冻\n水晶鸭肝\n切片布朗尼塔\n沙朗牛肉\n木桶鱼\n木桶鸡\n木瓜\n炒木耳\n木须肉\n章鱼\n皇牌玛奇朵抹茶\n杂菜\n马来风光\n松茸\n牛板筋\n铁板肥肠\n铁板蛏子\n板鸭\n黑森林金砖\n芒果千层\n芒果椰奶\n水果盆栽\n时令水果拼盘\n果立方\n金枪鱼香果脆饼\n辣子鸡配芒果蛋黄\n雪冰\n荔枝烧肉\n枣糕\n枣豆糕\n蜜枣银心\n西红柿蛋汤\n比格自助\n木桶牛肉\n木桶葡挞\n南瓜条\n梅肉\n泡椒酸菜\n麦香芒椰果捞\n榴莲\n冰爽柠檬柚子\n正蟹煲\n酒水饮料\n蜜汁火方\n嫩汁牛烧\n蜜汁红枣\n鸡汁锅贴\n汉堡\n江蟹生\n汤包\n汤圆\n酸汤羊肚\n汽水\n云南汽锅鱼\n尖椒兔丁\n小油条\n三文鱼牛油果塔\n香油润鱼\n蒜泥香油\n葱油鲈鱼\n泡泡油糕\n泡芙\n泡饼\n芝士波波肠\n蒜泥青瓜\n滇香三阳开泰养生\n徐同泰酱油\n非洲冰草\n澳洲龙虾\n派大星\n泡菜流川锅\n海宝船\n海星\n海盗船\n海胆\n爽口海草\n超级海陆空\n海鲜区\n涮菜\n海鲜火锅\n清汤面\n清蒸鱼\n清酒\n田园清香蛙\n台湾卤肉\n田源鸡汤\n日月潭豆腐\n肉火烧\n火腿\n穿心莲\n白灼章鱼\n芥兰\n桂花炒元宵\n栗子\n炒面\n猪油菜炒饭\n拆骨肉炖菜干\n炖鸡\n火炙蟹棒\n炸支竹\n炸肉\n炸薯角\n炸雞\n芝士炸鸡锅\n烙饼\n烤乳酪\n芝士烤元贝\n香烤兔\n京酱烧烤寸骨\n烤海螺\n烤猪皮\n烤盘\n烤全羊\n烤翅\n烤肉\n烤肉筋\n烤腰子\n烤薯皮\n烤饼\n烤鸡\n烧卖三鲜\n牛肉烧多士\n锅烧春笋\n招牌烧肉粒\n烧饼\n烧鹅\n大热狗\n焖面\n焖黑鱼\n盐焗乳鸽\n芝士焗野菌\n火焰海螺\n煎面\n煎饺\n煎鱼\n小笨鸡\n浓汁煮鱼唇\n木瓜芦荟鱼肚\n煲蒸饭\n熏肉\n燃面\n燕窝\n燕饺\n爆肚\n甜汁爆鲜虾\n爱琴海\n招牌卷饼\n妈妈牌杂菇\n招牌海苔\n招牌牛河\n招牌贡茶\n香辣咖喱牛肉丸＆弹牙鱼蛋\n牛太郎\n牛奶\n牛排套餐\n牛柳\n牛筋\n牛肝\n牛舌\n虾、鱼、牛蛙\n虾、牛肉、牛蛙\n牛骨髓\n炸物拼盘\n狮子王\n狮王冻\n猪排\n猪皮\n猪肚\n猪血猪红\n猪蹄\n熊猫宝宝\n熊猫果茶\n玉米烙\n霸王鸡条\n环境\n玫瑰之恋\n玫瑰圆子\n木瓜雪杏\n香甜炸鸡\n野生草鱼\n三巴酱野生菌煲\n炒田螺\n吐鲁番米糕\n新疆香囊\n瘦肉粉\n纠结的大饼\n秋天的童话\n脆皮椰奶\n北欧鲑鱼皮焗卷\n皮蛋炒椒\n脆皮鲜奶\n椒盐排条\n鸭架\n石锅鸡\n肉碎锅巴\n碓窝鸡\n研磨时光\n青柠洛神冰茶\n福寿桃\n青稞扎萨\n海陆空焖锅\n大切章鱼皇\n笃笃笃糖粥\n笋\n笋片\n筋面\n筋饼\n试管饮料\n米皮\n米皮儿\n炒米粉\n鸡爪米糕铲\n过桥米线\n米酒\n糯米饭\n芭夯菌类汤锅\n粉丝\n粉条\n粥底\n年糕香排\n糖不甩\n枫糖唱片\n糖粥\n糖蒜\n糖饼\n素菜\n素鸡\n素鸭\n紫薯\n特色干煸粉絲蟹煲\n红杏鸡\n红肠\n红茶\n小红薯\n烈焰阿根廷红虾卷\n相思红豆奶\n红酒\n红锅\n将军高级拌饭\n综合锅\n罐牛\n摩托罗拉卷\n精灵署小弟\n特色羊排\n海鲜+羊肉\n羊血\n精美水果\n羹\n翅尖\n烤翅根\n翠鱼\n翡翠鱼\n老碗鱼\n茶树菇老鸭饭\n木耳辣根\n鲜肉月饼\n肉松酥\n肉滑\n肉筋\n肉粽\n牛肉诱惑\n烂肉豌豆\n牛肉辣汤\n肉饼\n羊肉馍馍\n囊包肉\n五香肘\n烧肠粉\n羊肉\n肥羊粉\n肯德基\n开胃泡菜\n
海胆手卷\n龙脂猪血\n脆骨\n筋头巴脑小锅\n腊肉\n秘制腐乳鸭\n拌腐竹\n腰子\n非一般鲈鱼\n特色果汁\n三色炸鸡\n香芋排骨\n榴芒双拼\n泡芙小姐\n荷花卷\n木瓜银耳\n花茶\n鲜花菜\n樱花虾羹\n炒花蛤\n花鲢\n豆芽面筋\n提拉米苏切块\n提拉米苏切片\n苕皮\n苕粉\n鸡汤苦麦菜\n番茄汤底\n绿茶佛饼\n抹茶杏仁\n藏茶火锅\n草鱼锅\n生蚝\n草莓罐头\n榴莲匹萨\n榴莲忘返\n榴莲飘香\n榴莲飞饼\n榴莲鸡锅\n莴笋\n莴笋尖\n野生菌六拼\n菌菇\n香菜干丝\n油菜心\n菜汤\n韭菜盒子\n菠萝派\n201108312499華萊士\n紫葡之恋\n内蒙肥羊\n蒸蛋\n蒸面\n蒸鱼\n蒸鸡\n红虾\n蒜蓉茄子\n蒜蓉茼蒿\n香蕉薄饼\n炸薯条\n香脆薯格\n薯球\n薯角\n薯饼\n鲜藕片\n香辣锅\n炒虾仁\n虾片\n虾球\n虾籽面\n蚂蚁上树\n香辣蚝子鱼\n蛋挞\n奶油培根蛋汁面\n蛋饺\n炒竹蛏\n黄金蜂蜜绿\n蛤蜊炖蛋\n闺蜜咖喱\n螺肉\n海螺酱烧\n蟹味菇\n蟹大脚\n蟹柳\n蟹棒\n螃蟹焗饭\n营养蟹籽饭\n蟹足棒\n血肠\n菌类番茄滋补三锅\n口袋豆腐\n褡裢火烧\n西瓜\n茄角之恋\n让豆腐\n麻辣诱惑虾\n豆浆\n脆豆腐\n豌豆肥肠\n黄豆芽\n干煸豆角\n铁板豆豉鸡\n豚肉排\n虾贝组合\n西贝面筋\n土贡梅煎\n财鱼\n水货沙律\n刺身双拼\n刺身小拼\n瘦身海藻\n九转大肠\n辣松\n南洋鸡肉卷&辣翅\n辣锅\n酱汁香辣鮰鱼\n麻辣鳝段\n鸿运龙船\n七里香迷中蟹\n普通羊肉\n通菜\n郡花\n京都豆腐\n酒\n红酒雪梨\n油酥火烧\n酥肉\n酥饼\n麻酱拉皮\n麻酱酿皮\n海鲜酱闷锅\n酸奶\n酸汤\n酸菜\n酿皮\n醉鸡\n糖醋鲤鱼\n野菜\n黄金大饼\n黄金蟹斗\n铜锅\n吴铭毛肚\n组合焖锅之一\n石锅扇贝\n闷锅生料\n锅贴\n石锅酱汤\n冰镇黄桃\n鱼焖锅\n冬阴功面\n乾隆鱼头\n自助食材集体照\n雪娃娃\n雪碧\n雪蟹\n波霸奶绿\n青笋片\n青笋尖\n熊猫靓雪饼\n面包等\n面片\n热面皮\n自选青菜、菌类、面类等\n韭菜饼\n养颜木瓜\n肉丁馒头\n韩餐生肉\n蜂蜜柚子茶\n水饺\n饺子宴\n馄饨\n馅饼\n馒头片\n馕\n香妃鸡\n芥香章鱼\n香肠\n香草\n菌类合盘\n百香雪酪\n酱香龙骨\n驴杂汤\n驴肉\n骨棒\n筒骨汤\n蒜香排骨火烧\n鬼鸡\n魔芋\n鮰鱼\n探鱼冰粉\n鱼排\n鱼滑\n鱼肉\n鳗鱼蒲烧\n鱼面\n鱿鱼脆\n鲍鱼饭\n鲑之恋\n咖喱海鲜喇萨\n鲜奶\n海鲜葱饼\n鲷鱼\n鲽鱼柳\n黑椒鲽鱼煲\n大盘鸡\n鸡块\n和风香鸡堡1\n鸡尖\n鸡心\n鸡排\n雪花鸡柳\n鸡皮\n鸡翅\n炸鸡腿\n鸭头\n鸭心\n鸭汤\n鸭爪\n老卤鸭翅\n鸭肚\n极品鹅肠\n鸭脖\n鸭舌\n鸭血\n糍粑\n乳鸽焗饭\n鹅肠\n蒙鹿肥羊\n麻团\n麻花\n芝麻香饼\n黄啤\n黄馍馍\n黑啤\n黑拉皮\n黑鱼锅\n三足鼎立锅\n松鼠鲈鱼\n松鼠鳜鱼\n龙鱼\n绿茶龟宝宝\n西米露\n红豆沙\n双皮奶\n水果捞\n甜甜圈\n红豆冰\n杨枝甘露\n红豆糕\n蛋糕\n冰淇淋\n华夫饼\n冰激淋\n冰淇凌\n冰激凌\n西米捞\n凤梨酥\n薯糕\n芒果冰\n金捞\n千层糕\n鲜芋仙\n薄饼\n炖雪蛤\n下午茶\n布甸\n三兄弟\n莲子汤\n芒果爽\n雪山包\n紫米糕\n水果塔\n温泉蛋\n慕斯蛋糕\n巧克力蛋糕\n酪蛋糕\n奶蛋糕\n奶油蛋糕\n黑森林\n猪肉丸\n生日蛋糕\n杯子蛋糕\n布丁蛋糕\n栗子蛋糕\n小蛋糕\n起司蛋糕\n榴莲蛋糕\n抹茶蛋糕\n戚风蛋糕\n凹蛋糕\n墨鱼丸\n花枝丸\n甜品小丸子\n章鱼丸\n鱼丸鱼蛋\n贡丸\n虾丸\n撒尿牛丸\n炸丸子\n红烧丸子\n鱼球\n芝士丸\n牛筋丸\n慕斯\n千层\n芝士\n菜丸子\n芋丸\n酒酿丸子\n圆子\n丸子\n双层蛋糕\n水果蛋糕\n公主蛋糕\n彩虹蛋糕\n熔岩蛋糕\n寿司蛋糕\n冰淇淋蛋糕\n果蛋糕\n宝丸\n土笋冻\n冰激凌蛋糕\n莓蛋糕\n年轮蛋糕\n茶蛋糕\n鸡蛋糕\n羊肉丸\n桃蛋糕\n岩盐蛋糕\n红丝绒\n小青菜\n茼蒿\n蓬蒿菜\n皇帝菜\n蒿子杆\n蒿子秆\n杂青菜\n空菜苗\n西瓜汁\n柠檬茶\n桔柠檬\n水果茶\n可乐\n竹蔗水\n奶盖可可\n奇异果汁\n荞麦面\n年糕火锅\n辛拉面\n三文鱼刺身\n三文鱼北极贝\n牛肉刺身\n虾刺身\n蚌刺身\n水果沙拉\n蔬果沙拉\n芒果沙拉\n炒乌东\n威士忌\n酱鳕鱼\n猪骨锅\n牛油果沙拉\n鲜果沙拉\n海鲜粉丝沙律\n土豆丸\n番茄炒蛋\n罗汉笋\n鸭四宝\n香芒卷\n抹茶金砖\n什锦蔬菜\n杏鲍菇\n辣椒炒肉\n烧牛肉\n烧甲鱼\
n炭烧肉\n粉烧肉\n咖喱牛肉饭\n柠檬鱼\n秋椒鱼\n双椒鱼\n青椒钵钵鱼\n烤肉金针菇卷\n海鲜菇\n干拌面\n韩国拌面\n猪头肉\n黑猪肉\n口水鸡\n茶香鸡\n大什扒\n海杂鱼\n香鱼片\n葱油鸡\n白切鸡\n小海参\n剔骨肉\n玉米排骨汤\n排骨汤\n紫菜包饭\n蛋包饭\n鸡翅包饭\n鱼包饭\n虾炒饭\n蛋炒饭\n海鲜炒饭\n牛肉炒饭\n菠萝炒饭\n什锦炒饭\n蔬菜炒饭\n鸡肉炒饭\n鹅肝炒饭\n肉粒炒饭\n海苔炒饭\n扬州炒饭\n鱼炒饭\n芝士炒饭\n咖喱炒饭\n叉烧炒饭\n铜锅饭\n葱烧虾\n椒炒蛋\n培根炒饭\n千刀肉\n豆角肉末\n南瓜饭\n酱油炒饭\n鸡肉串\n羊肉串\n牛肉串\n串串香\n串串虾\n猪肉串\n叉烧包\n小笼包\n奶黄包\n烤包子\n大包子\n碎肉生菜包\n素菜包\n肉包\n牛肉披萨\n鸡肉披萨\n海鲜披萨\n培根披萨\n榴莲披萨\n土豆比萨\n芝士披萨\n至尊披萨\n风情披萨\n香肠披萨\n烤肉披萨\n咖喱鸡\n土豆片\n小土豆\n土豆球\n烤土豆\n香辣土豆\n土豆饼\n糖醋排骨\n蒸排骨\n香汁排骨\n话梅排骨\n蒜香骨\n排骨饭\n蜜汁骨\n芒果汁\n苹果汁\n橙汁\n炖鸡汤\n蘑菇汤\n海带汤\n南瓜汤\n清远鸡\n酥皮\n土豆沙拉\n蔬菜沙拉\n玉米沙拉\n鸡肉沙拉\n凯撒沙拉\n木瓜沙拉\n海鲜沙拉\n蟹子沙拉\n牛肉沙拉\n帝王蟹\n基围虾\n咖喱虾\n香辣虾\n甜虾\n油爆虾\n焗大虾\n香酥虾\n黑虎虾\n玉米虾\n串烧虾\n元宝虾\n濑尿虾\n烤明虾\n焗明虾\n盆盆虾\n煮花螺\n面包蟹\n炒花蟹\n辣大闸蟹\n大闸蟹\n凤尾虾\n大明虾\n开背虾\n椒盐虾\n开边虾\n白灼虾\n南美虾\n皮皮虾\n酥皮虾\n炒河虾\n酱油虾\n蒜蓉虾\n鸡翅虾\n盐焗虾\n烤虾\n蹄花虾\n粉丝虾\n铁板虾\n斑节虾\n花螺\n炒螃蟹\n烤鸭\n片皮鸭\n香酥鸭\n手撕鸭\n加积鸭\n茶香鸭\n脆皮烧鹅\n樟茶鸭\n酱鸭\n脆皮鸭\n小刀鸭\n排骨\n牛排饭\n猪小排\n辣小排\n蜜汁小排\n嫩牛肉\n牛五花\n酱牛肉\n五香牛肉\n嫩汁肥牛\n烤牛肉\n牛肉片\n香爆牛肉\n麻辣牛肉\n蒙古牛肉\n肥羊\n牛排肉\n孜然牛肉\n酸菜牛肉\n拌牛肉\n五香驴肉\n滑牛肉\n牦牛肉\n生牛肉\n布丁\n提拉米苏\n暴风雪\n榴莲雪糕\n上汤娃娃菜\n小白菜\n大白菜\n辣白菜\n圆白菜\n洋白菜\n海白菜\n南瓜粥\n海鲜粥\n虾蟹粥\n小米粥\n皮蛋瘦肉粥\n八宝粥\n黑米粥\n瘦肉粥\n鲜虾粥\n桂花莲藕粥\n蚝仔粥\n菌菇粥\n百合粥\n芦荟羹\n牛肉卷\n羊肉卷\n猪肉卷\n小龙虾\n大龙虾\n护心肉\n梅花猪肉\n焖大虾\n招财鱼\n手抓排骨\n九节虾\n辣鸡架\n茄子煲\n酱茄子\n烤茄子\n烧茄子\n风味茄盒\n铁板茄子\n鱼香茄子\n豆角茄子\n毛豆\n炒生菜\n橄榄菜\n香辣蟹\n梭子蟹\n蟹脚\n蟹钳\n豆腐煲\n炖豆腐\n麻婆豆腐\n家常豆腐\n自磨豆腐\n麻豆腐\n风味豆腐\n一品豆腐\n蟹粉嫩豆腐\n肉酱面\n牛肉拉面\n海鲜面\n打卤面\n手擀面\n豚骨拉面\n车仔面\n公仔面\n牛杂面\n鸡蛋面\n牛杂河粉\n手撕面包\n菠萝包\n红豆包\n烤面包\n奶酪面包\n冰淇淋面包\n香蒜面包\n水果面包\n奶油面包\n葡萄干面包\n牛奶面包\n核桃面包\n咖啡面包\n小面包\n肉松面包\n毛毛虫面包\n蜂蜜面包\n石锅饭\n滑鸡饭\n蘑菇拌饭\n茄子拌饭\n煲仔饭\n一品煲\n泡菜饭\n虾饺\n抄手\n煎鳕鱼\n银鳕鱼\n烤鳕鱼\n烧鳕鱼\n香辣鱼\n麻辣鱼\n黄焖鸡\n鸡肉煲\n白菜心\n白灼菜心\n杏仁豆腐\n芝士玉米\n蛋黄松子玉米粒\n玉米饼\n泡椒牛蛙\n香锅牛蛙\n馋嘴蛙\n炒牛蛙\n焖牛蛙\n香辣牛蛙\n跳跳蛙\n水煮牛蛙\n霸王蛙\n黄焖鱼\n小面\n臊子面\n担担面\n摊摊面\n酸辣面\n味千拉面\n芥末墩\n奶酪牛排\n沙朗牛排\nT骨牛排\n排骨肉\n冰镇秋葵\n炒肉丝\n肉丝饭\n陈村粉\n咖喱炒河粉\n烤肉饭\n北极贝\n泡椒黄喉\n黄烩饭\n雪菜蒸黄鱼\n泰式酿番茄\n太爷鸡\n西芹果仁炒虾\n海鲜豆腐\n鲜木耳\n芹菜花生\n鲜金针菇\n烤金针菇\n白菜炖粉条\n绿菜炖粉条\n新鲜娃娃菜\n粉丝娃娃菜\n丝瓜汤\n月亮虾饼\n鲜玉米\n玉米杯\n鱼皮寿司\n凉拌香干\n韭菜炒香干\n咖喱鸡肉饭\n咖喱猪排饭\n榄菜四季豆\n鲜生菜\n鲜圆生菜\n蒜台炒肉\n纸杯蛋糕\n菲力牛排\n珍珠奶茶\n口味蛋糕\n芝士焗饭\n曲奇饼干\n西冷牛排\n牛肉汉堡\n三文鱼寿司\n巧克力慕斯\n海绵蛋糕\n夹心蛋糕\n焦糖布丁\n翻糖蛋糕\n黑椒
牛排\n牛肉火锅\n火腿披萨\n肉三明治\n鸳鸯锅底\n芒果布丁\n海鲜意面\n热巧克力\n咖喱牛肉\n肉酱意面\n吐司面包\n卡通蛋糕\n番茄意面\n金枪鱼沙拉\n培根意面\n香草冰淇淋\n辫子面包\n烤五花肉\n原味奶茶\n波士顿龙虾\n肉眼牛排\n炒乌冬面\n黑森林蛋糕\n娃娃蛋糕\n鲜肉馄饨\n鳗鱼寿司\n酸奶冰淇淋\n全麦面包\n抹茶拿铁\n鸡蛋煎饼\n红豆奶茶\n猪肉水饺\n特色炒饭\n三文鱼沙拉\n芒果慕斯\n肉馅饺子\n西冷牛扒\n鱼炖豆腐\n金枪鱼三明治\n奶油意面\n猪肉饺子\n柠檬红茶\n莫吉托\n牛奶布丁\n鸡蛋三明治\n布丁奶茶\n火锅米线\n鱼豆腐汤\n豆腐丸子\n肠煲仔饭\n鸡煲仔饭\n面包布丁\n蜜糖吐司\n火腿三明治\n鸡肉焗饭\n肉酱意粉\n鹅肝寿司\n鸡肉意面\n田园沙拉\n水果色拉\n豆沙面包\n双拼披萨\n排骨煲仔饭\n咖喱鸡饭\n蘑菇披萨\n芒果冰沙\n鸡腿汉堡\n鸡肉色拉\n牛肉米线\n芒果冰淇淋\n苏打饼干\n拿铁咖啡\n萝卜丸子\n芝士汉堡\n金枪鱼寿司\n麻辣锅底\n酸奶慕斯\n芝士慕斯\n烟熏三文鱼\n奶油蛋糕卷\n可乐鸡翅\n丝袜奶茶\n蘑菇意面\n牛肉乌冬面\n夹心饼干\n烧牛肉面\n梅菜扣肉\n手握寿司\n豆沙月饼\n鸡肉汉堡\n裱花蛋糕\n葡式蛋挞\n砂锅米线\n玫瑰蛋糕\n燕麦饼干\n黄油饼干\n三文鱼腩寿司\n鱼烧豆腐\n糖玛奇朵\n焗土豆泥\n椰蓉面包\n雪花牛排\n迷你蛋糕\n天使蛋糕\n肉丝炒面\n特色烤鱼\n乳脂蛋糕\n美式咖啡\n摩卡咖啡\n山水豆腐\n芥末章鱼\n芝士意面\n老母鸡汤\n白菜豆腐\n烤猪肋排\n香蕉蛋糕\n戚风蛋糕卷\n肉馅水饺\n鲜肉大混沌\n清汤锅底\n海鲜意粉\n汤鸳鸯锅\n芭比蛋糕\n脆皮玉米\n欧式蛋糕\n巧克力冰淇淋\n黑椒牛扒\n蘑菇浓汤\n牛奶吐司\n蔬菜沙律\n萝卜蛋糕\n萝卜丝汤\n牛肉意面\n花朵面包\n芝麻汤圆\n粉蒸排骨\n港式奶茶\n干锅牛蛙\n黑椒牛柳\n芝士沙拉\n芒果班戟\n火腿炒饭\n海鲜烩饭\n牛油果色拉\n杏仁饼干\n蜜汁叉烧\n菲力牛扒\n芝士吐司\n羊肉火锅\n水果沙律\n手工豆腐\n蓝莓芝士\n玛芬蛋糕\n牛肉拌面\n汽车蛋糕\n阿根廷红虾\n手握披萨\n鸡蛋布丁\n香蕉奶昔\n重庆小面\n蟹黄豆腐\n芒果蛋糕\n肉炖粉条\n糖霜饼干\n牛肉肠粉\n煎三文鱼\n炒黑木耳\n泡椒凤爪\n酸奶冰激凌\n萝卜排骨汤\n泡菜炒饭\n杏仁蛋糕\n时蔬沙拉\n意大利粉\n鱼头豆腐汤\n南瓜浓汤\n巧克力饼干\n牛小排\n骨汤拉面\n马芬蛋糕\n野生木耳\n蔓越莓饼干\n蘑菇沙拉\n花环面包\n军舰寿司\n鸡蛋饺子\n芝麻饼干\n牛肉丸子\n炖五花肉\n豆角焖面\n莲蓉月饼\n芒果奶昔\n盆栽奶茶\n牛肉米粉\n水晶虾仁\n排骨米饭\n哈根达斯\n可可蛋糕\n三鲜水饺\n铁板豆腐\n酸奶沙拉\n炖排骨\n蜂蜜蛋糕\n笋炒腊肉\n秘制牛肉\n猪肉馅饼\n时令蔬菜\n手指饼干\n咖喱鸡肉\n黑椒牛肉\n鸡蛋水饺\n菜炒牛肉\n肉酱拌面\n肉丸子汤\n椰果奶茶\n三文鱼色拉\n夹心面包\n土司面包\n咖喱鱼蛋\n鲜虾披萨\n鱼腩刺身\n豆腐沙拉\n草莓慕斯\n芝心披萨\n肉末豆腐\n桂林米粉\n木耳炒肉\n有机时蔬\n奶酪披萨\n培根焗饭\n吐司披萨\n乳酪慕斯\n鲜肉水饺\n香辣烤鱼\n金针菇卷\n蟹粉豆腐\n芭比娃娃\n焦糖拿铁\n烤肉拼盘\n金枪鱼披萨\n抹茶慕斯\n炒酸奶\n叉烧肠粉\n鸭血粉丝\n鲜虾云吞\n香肠炒饭\n饺子\n菜烧豆腐\n炒粉丝\n芝士烤饭\n祝寿蛋糕\n番茄沙拉\n牛肉拼盘\n烧味拼盘\n火腿面包\n果木烤鸭\n手工水饺\n仙草奶茶\n鲜虾肠粉\n饭团便当\n闪电泡芙\n酱烧茄子\n土豆泥沙拉\n猪肉馄饨\n火腿沙拉\n海鲜色拉\n鸡粒炒饭\n蛋黄月饼\n肋眼牛排\n猪扒焗饭\n柠檬绿茶\n波斯顿龙虾\n巧克力酱\n奶油泡芙\n五花肉饭\n馋嘴牛蛙\n韭菜饺子\n银耳糖水\n酱蛋糕卷\n肉松寿司\n玫瑰奶茶\n火焰牛排\n核桃蛋糕\n土豆披萨\n乳酪面包\n香蕉牛奶\n蔬菜披萨\n菇炒肉片\n芒果酸奶\n肉蛋包饭\n米蒸排骨\n牛肉汤面\n金枪鱼刺身\n水果裸蛋糕\n有机鱼头\n排骨火锅\n手工巧克力\n巧克力曲奇\n五仁月饼\n鸳鸯奶茶\n蔓越莓吐司\n炖粉条\n抹茶冰激凌\n芝士薯条\n黑胡椒牛排\n肥牛米线\n空心菜梗\n点心拼盘\n炒西红柿\n水库鱼头\n榛果拿铁\n吞拿鱼沙拉\n小黄花鱼\n夏威夷披萨\n奶油布丁\n凉拌黄瓜\n全麦吐司\n巧克力布丁\n香草拿铁\n蒸土鸡蛋\n胡萝卜饺子\n肉眼牛扒\n肉丝盖饭\n糯米丸子\n猪肉丸子\n牛仔粒\n沙拉三明治\n土豆丝饼
\n咖啡蛋糕\n南瓜馒头\n切片蛋糕\n巧克力奶茶\n价钱-c\n乌龙奶茶\n麻辣豆腐\n鸡蛋炒面\n铁板炒饭\n豆腐砂锅\n虾仁馄饨\n抹茶星冰乐\n芝麻面包\n艺术蛋糕\n肉小馄饨\n牛肉粉丝\n海鲜浓汤\n榴莲芝士\n奶油吐司\n切块蛋糕\n主题蛋糕\n香辣鸡翅\n雪域蛋糕\n酸奶奶昔\n辣鸡腿堡\n萝卜炒肉\n菇炒牛柳\n苹果蛋糕\n花式面包\n经典牛排\n牛肉炒面\n火龙果汁\n海鲜拉面\n开花馒头\n大虾沙律\n土豆焖饭\n四喜丸子\n养生豆腐\n大连diy蛋糕\n鸡蛋卷饼\n鲜肉小笼\n蜜汁鸡翅\n葡萄干吐司\n番茄锅底\n甜虾刺身\n鸳鸯锅\n椒炒肉丝\n招牌炒饭\n抹茶布丁\n意大利饭\n奶酪焗饭\n奶酪沙拉\n三文鱼饭\n鱼籽寿司\n鱼子寿司\n豆花米线\n西米布丁\n豆腐羹\n肉馅包子\n肉通心粉\n肉丝炒饭\n白菜粉丝\n牛肉馅饼\n牛排披萨\n牛奶奶茶\n炖土鸡汤\n炒火腿肠\n海胆刺身\n牛肉面\n拼三文鱼\n巧克力塔\n川北凉粉\n奶酪布丁\n刀切馒头\n麦芬蛋糕\n鲜蔬沙拉\n香肠面包\n帕尼尼\n红茶拿铁\n烧烤拼盘\n法式面包\n上汤浸时蔬\n椒土豆片\n小锅米线\n奶酪吐司\n原味酸奶\n摩卡星冰乐\n麻辣米线\n雪花牛扒\n蔓越莓面包\n蓝山咖啡\n芒果沙冰\n腊肠披萨\n腊肉炒饭\n腊味炒饭\n秘制牛排\n甜品拼盘\n炒鱿鱼须\n炒豆腐皮\n澳洲牛排\n水煮鱼片\n棒棒糖蛋糕\n核桃吐司\n布朗尼蛋糕\n数码蛋糕\n手切羊肉\n忌廉蛋糕\n卤水豆腐\n丝绒蛋糕\n麻辣烤鱼\n鸭煲仔饭\n香肠意面\n酸奶吐司\n酥皮月饼\n茄子土豆\n系列蛋糕\n糯米奶茶\n提拉米苏蛋糕\n笋炒肉丝\n白菜饺子\n牛肉水饺\n波斯龙虾\n沙拉面包\n沙拉寿司\n水果酸奶\n果炒虾仁\n无骨凤爪\n培根意粉\n味增拉面\n冰激淋球\n什锦披萨\n鸡蛋吐司\n鸡肉丸子\n鲜果沙律\n雪耳糖水\n雪梨糖水\n蜜糖土司\n虾仁饺子\n虾仁水饺\n虾仁意面\n菜土豆泥\n菜丸子汤\n草莓奶昔\n肉粉丝煲\n肉松蛋糕\n经典奶茶\n红油抄手\n白菜水饺\n野生大黄鱼\n玫瑰馒头\n牛肉寿司\n热狗面包\n北极贝寿司\n木瓜牛奶\n广式月饼\n奥尔良烤翅\n奶油松饼\n培根面包\n吐司布丁\n巧克力奶昔\n薄饼披萨\n芝士色拉\n砂锅鱼头\n玉米披萨\n烧鸡翅根\n烤鸡腿堡\n柠檬汽水\n肉松面包卷\n三文鱼意面\n手工酸奶\n台塑牛排\n南瓜发糕\n儿童蛋糕\n什锦沙拉\n鲜虾色拉\n风味披萨\n金针肥牛\n酸奶雪糕\n虾仁披萨\n虎皮尖椒\n蓝莓慕斯\n葱烧海参\n菜肉馄饨\n草莓酸奶\n茄汁意面\n肉炒河粉\n发糕\n牛肉定食\n炒鸡腿肉\n海鲜炒面\n有机蔬菜\n心形蛋糕\n巧克力派\n回锅肉饭\n咖喱牛腩煲\n巧克力泡芙\n丸粉丝汤\n三鲜饺子\n鸡蛋馅饼\n鸡蛋盖饭\n鱼头火锅\n金针菇汤\n肉酱千层面\n辣鸳鸯锅\n西红柿汤\n菠菜沙拉\n芝麻烧饼\n羊肉丸子\n红豆布丁\n盒子蛋糕\n烤肉年糕\n烤海鲈鱼\n炒笨鸡蛋\n大虾色拉\n南瓜布丁\n三文鱼扒\n丁骨牛排\nrisotto\n麻酱拌面\n麻将蛋糕\n鸡肉拌饭\n鸡扒焗饭\n魔芋豆腐\n青酱意面\n金枪鱼卷\n土豆炖牛肉\n蟹肉沙拉\n蒸金针菇\n酸菜牛肉面\n芒果色拉\n红豆吐司\n红枣糖水\n糯米烧卖\n眼肉牛排\n田园披萨\n玫瑰花馒头\n牛柳意面\n法式鹅肝\n柠檬蛋糕\n金枪鱼色拉\n果酱蛋糕\n手指泡芙\n挪威三文鱼\n奶盖红茶\n卡通饼干\n伯爵奶茶\n黄金炒饭\n鸡胸沙拉\n鸡盖浇饭\n干香茶树菇\n蛋拌豆腐\n蔬菜三明治\n荞麦冷面\n芝麻吐司\n芝士饼干\n腊肠炒饭\n羊蝎子锅\n红豆冰沙\n秘制烤鱼\n辣白菜炒饭\n牛肉丸汤\n海鲜年糕\n海胆寿司\n沙律寿司\n樱桃蛋糕\n果酱面包\n果仁面包\n加拿大龙虾\n招牌咖啡\n干捞粉丝\n小笼汤包\n奶黄月饼\n奶香吐司\n土豆丸子\n南瓜面包\n八宝辣酱\n鲜肉饺子\n香菜丸子\n野生黄鱼\n蛋糕切块\n肉丸汤\n菜粉丝汤\n金菇肥牛卷\n荷叶炒饭\n茄子豆角\n桂花糯米藕\n肉肠披萨\n红烧猪蹄\n田园色拉\n玫瑰花卷\n牛五花肉\n炖老鸡汤\n炒鸡胸肉\n炒韭菜花\n法式羊排\n汁蒸凤爪\n樱桃萝卜\n棉花蛋糕\n果蛋糕卷\n杂粮面包\n巧克力球\n寿司定食\n意大利宽面\n墨西哥卷\n培根沙拉\n双色馒头\n厚切牛舌\n伯爵红茶\n什锦拼盘\n玛格丽特饼干\n麻辣龙虾\n芝麻菜沙拉\n鸡蛋沙拉\n鸡蛋汤面\n鸡肉沙律\n鸡扒意粉\n鲜虾沙律\n鱿鱼寿司\n香草奶昔\n酸菜米线\n酪冰淇淋\n蔓越莓曲奇\n虾仁沙拉\n荠菜馄饨\n芝士意粉\n腿煲仔饭\n肉豆腐汤\n肉末
炒饭\n翻糖饼干\n经典披萨\n红糖糍粑\n红枣豆浆\n粉丝扇贝\n私房豆腐\n白切鸡饭\n甜虾寿司\n现磨咖啡\n玉米面包\n牛肉砂锅\n爆浆鸡排\n炒西瓜皮\n炒佛手瓜\n澳洲肥牛\n乌冬面\n榛子蛋糕\n核桃饼干\n果园蛋糕\n极品肥牛\n三文鱼沙律\n招牌披萨\n拌海带丝\n干锅鸭头\n巧克力卷\n宫爆鸡丁\n小餐包\n咖喱大虾\n原味炸鸡\n巧克力摩卡\nLatte\nbiangbiang面\n黑麦面包\n黄油面包\n鸡蛋拌面\n酸奶芝士\n酱蒸凤爪\n调味牛排\n蜂蜜吐司\n虾仁肠粉\n蔬菜意面\n蓝莓酸奶\n蒸大闸蟹\n萝卜牛腩\n炒肉末\n荷叶蒸饭\n芝士鸡排\n芝士奶茶\n腊肠焖饭\n脆皮蛋糕\n肉金针菇\n肉炒鸡蛋\n肉沫茄子\n老式面包\n羊肉饺子\n红酒牛排\n百香果茶\n牛肉馄饨\n爆珠奶茶\n燕麦奶茶\n烤鸡披萨\n炒油豆腐\n黄河大鲤鱼\n沙朗牛扒\n芒果糯米饭\n磅蛋糕\n杏仁曲奇\n披萨双拼\n大虾披萨\n芝士焗扇贝\n土司披萨\n羔羊肉\n咖喱焗饭\n朱古力蛋糕\n全蛋蛋挞\n五谷豆浆\n丸子砂锅\n丝滑奶茶\n龙井虾仁\n黄油蛋糕\n麻酱凉面\n鸭胸沙拉\n鲅鱼水饺\n鱼盖浇饭\n鱼握寿司\n香菇饺子\n雪龙牛肉\n铁杆山药\n酸奶戚风\n酥粒面包\n土豆盖浇饭\n豆炒肉末\n西兰羊排\n鲜虾腐皮卷\n蒸黄花鱼\n青菜炒豆腐\n大馄饨\n银耳莲子糖水\n芝士切件\n肉酱披萨\n红枣蛋糕\n红枣发糕\n红枣粥\n丝鸡汤\n礼盒月饼\n田园时蔬\n猪颈肉饭\n牛肉饺子\n牛肉意粉\n牛排汉堡\n烤鸭披萨\n烤土豆片\n深井烧鹅\n冰淇淋华夫\n海鲜沙律\n水晶虾饺\n萨拉米披萨\n奥尔良鸡翅\n寿司组合\n咖喱乌冬面\n巧克力芝士\n巧克力拿铁\n什锦火锅\n鸡蛋糖水\n鳗鱼炒饭\n鱼丸米线\n莲藕汤\n酒酿小圆子\n酥皮蛋挞\n迷你汉堡\n蔓越莓蛋糕\n豆焖排骨\n豆沙汤圆\n装饰蛋糕\n蟹柳寿司\n蟹子寿司\n蜜豆吐司\n萝卜烧肉\n芝士蛋挞\n芒果雪糕\n脱骨凤爪\n肉三文治\n肉丁炒饭\n羊肉水饺\n经典咖啡\n红豆蛋糕\n糯米蛋糕\n立贝寿司\n相间肥牛\n番茄披萨\n现炸酥肉\n猪肉锅贴\n猪肉蒸饺\n特级肥牛\n牛奶冰沙\n八爪鱼寿司\n焦糖奶茶\n炖乌鸡汤\n炒玉米粒\n大鱼头\n水果布丁\n金枪鱼意面\n果酸奶杯\n水果小丸子\n木耳山药\n提子面包\n排骨蒸饭\n抹茶曲奇\n奶酪慕斯\n大片牛肉\n意大利烩饭\n土豆煎饼\n金针菇\n南瓜蛋糕\n农家豆腐\n全麦土司\n巧克力礼盒\n五花肉卷\n龙虾意面\n黄桃罐头\n酱烧饼\n麻薯面包\n骨汤锅底\n香辣猪蹄\n香蕉披萨\n铁板鱿鱼\n酱焗意粉\n辣炒花蛤\n豆腐锅仔\n豆腐炒肉\n蟹黄汤包\n蜜汁烤翅\n蛋糕奶茶\n蛋白炒饭\n蛋奶布丁\n蒜蓉面包\n菌汤锅底\n草莓奶茶\n芝士热狗\n芝士寿司\n脆皮鸡翅\n脆皮烤鸭\n脆皮乳鸽\n胡萝卜饼\n胡萝卜丁\n肉汤河粉\n羊肉烩面\n紫薯面包\n紫菜卷饭\n糖冰淇淋\n粉丝沙拉\n番茄色拉\n瓜炒虾仁\n牛腩河粉\n牛肉浓汤\n牛奶炖蛋\n烫面蒸饺\n炒河虾仁\n炒包心菜\n海鲜酒家\n海鲜大咖\n北海道吐司\n椒鱼头王\n椒多宝鱼\n果酱酸奶\n排骨焖饭\n拌豆腐皮\n拌海蜇头\n手抓羊排\n大黄花鱼\n四喜烤麸\n咖啡慕斯\n可可奶茶\n原味蛋糕\n卤味拼盘\n北京烤鸭\ntoro\n鹅肝沙拉\n鸡蛋肠粉\n鸡蛋包子\n鱼蒸肉饼\n鱼寿司卷\n香菇鸡汤\n风味烤鱼\n超级至尊\n超大鸡排\n豆腐圆子\n豆沙吐司\n蟹棒寿司\n蛋黄寿司\n虾仁滑蛋\n薄底披萨\n蒜蓉生蚝\n青菜炒虾仁\n莓裸蛋糕\n花生豆浆\n芝士拼盘\n猪肉馅馄饨\n荷叶饭\n鲜肉水煎包\n罗勒意面\n红茶奶盖\n红汤锅底\n米小丸子\n番薯糖水\n生蚝刺身\n玉米面饼\n玉米发糕\n牛肉沙律\n牛肉汤粉\n牛奶土司\n炒酸豆角\n火腿意面\n澳洲和牛\n牛油果寿司\n椰蓉吐司\n桂花糖藕\n安格斯牛排\n树根蛋糕\n果粒酸奶\n松仁玉米\n木糠布丁\n无骨鸡柳\n方形蛋糕\n三文鱼披萨\n提子蛋糕\n排骨年糕\n拌北极贝\n抹茶泡芙\n手冲咖啡\n玛德琳蛋糕\n奶酪饼干\n大肉饺子\n咖喱羊肉\n双色吐司\n冰糖燕窝\n迷迭香\n龙利鱼排\n鸡蛋软饼\n鸡肉意粉\n鲜酿酵素\n鲜虾春卷\n鲜奶布丁\n高钙羊肉\n香葱面包\n香菇焖饭\n香菇水饺\n香煎鸡扒\n香滑奶茶\n风味茄子\n韩国料理\n野生鲫鱼\n贝壳蛋糕\n豆豉烤鱼\n豆腐盖饭\n新西兰青口\n蟹籽寿司\n蜜豆面包\n蜂蜜绿茶\n蛤蜊意面\n蘑菇焗饭\n蒜香鸡翅\n蒜蓉扇贝\n鸡蛋卷\n草莓布丁\n花样馒头\n芦笋意面\n黑芝麻豆浆\n芝
心年糕\n芝士薄饼\n腓力牛排\n香脆鸡腿堡\n脆皮鸡排\n胡萝卜粥\n肥牛拌饭\n肉松饭团\n肉拌黄瓜\n老火例汤\n罐罐米线\n红糖面包\n拿破仑蛋糕\n白菜粉条\n白灼芥兰\n米线\n生菜色拉\n牛肉烧饼\n牛奶刨冰\n照片蛋糕\n焗银鳕鱼\n焗基围虾\n烤馒头片\n烤肉炒饭\n炒牛仔粒\n冰淇淋泡芙\n柠檬特饮\n板牛仔骨\n排骨米线\n吞拿鱼寿司\n扇贝寿司\n情趣蛋糕\n山药木耳\n奶香面包\n奶酥面包\n大骨头汤\n培根汉堡\n培根土豆\n咖喱鸡翅\n原味芝士\n千层肉饼\n制作过程\n养生锅底\n巧克力麦芬\n巧克力吐司\n优格冰沙\n乡村面包\n酸辣粉\n丝豆腐羹\n黑糖奶茶\n鸡肉炖饭\n鲷鱼寿司\n鱼头豆腐\n炖鱼五花肉\n风琴土豆\n风味牛排\n风味咖啡\n铁板牛排\n酸辣米线\n酸菜粉条\n软骨拉面\n虾肉馄饨\n椰蓉面包卷\n萝卜沙拉\n萝卜水饺\n菠萝披萨\n猪肉包\n菜炒木耳\n菌类拼盘\n草莓泡芙\n茄子盖饭\n雪花牛小排\n雪花牛仔粒\n脆皮猪手\n肥肠米线\n肉炒蒜苔\n肉松吐司\n罗马生菜\n绿豆糖水\n纽约芝士\n红豆刨冰\n紫薯馒头\n空心挂面\n百合糖水\n番茄面包\n田园沙律\n特色炒鸡\n牛杂火锅\n焦糖摩卡\n焖黄花鱼\n烧四季豆\n烤鸡翅根\n德普烘焙食谱\n炸花生米\n滋补锅底\n沙茶牛肉\n鸡扒饭\n水煮鲶鱼\n辣椒小皮蛋\n西红柿炒鸡蛋\n红枣小米粥\n果干面包\n杂粮煎饼\n有机菜心\n抹茶芝士\n抹茶冰沙\n慕斯切块\n慕司蛋糕\n意面沙拉\n总汇披萨\n带骨牛排\n带皮牛肉\n带子寿司\n千岛湖鱼头\n卡布奇诺咖啡\n天妇罗卷\n土豆丝汤\n咖啡冰淇淋\n味大鸡排\n台湾香肠\n巧克力热饮\n巧克力咖啡\n三鲜\nmuffin\n龙利鱼扒\n黑胡椒汁\n黄瓜沙拉\n麻酱凉皮\n麻辣小面\n鹅肝蒸蛋\n鸡排沙拉\n鸡丁炒饭\n鲷鱼刺身\n鲜虾馄饨\n鲜蔬沙律\n鲜肉汤包\n鲜肉云吞\n鱼鸳鸯锅\n骨炖酸菜\n香蕉酸奶\n戚风小蛋糕\n韭菜水饺\n锅全家福\n野生河虾\n醋海蜇头\n酸辣凤爪\n酸奶布丁\n酥皮泡芙\n辣酱拌面\n辣牛肉汤\n辣子鸡丁\n蟹肉寿司\n蜂蜜酸奶\n蛋糕系列\n蛋糕切片\n蘑菇意粉\n藜麦沙拉\n蔬菜炒香肠\n炒豆皮\n鲍菇牛仔粒\n茄汁意粉\n芦荟绿茶\n芝麻酥饼\n芝士春卷\n芒果芝士\n脆皮泡芙\n胡萝卜面\n牛肉盖浇面\n腊肉娃娃菜\n肉丝米线\n老醋花生\n美国肥牛\n红糖发糕\n红烧乳鸽\n素馅包子\n糯米圆子\n玉米骨头汤\n秘制鸡翅\n秘制花甲\n秘制肥牛\n石屏豆腐\n野生大甲鱼\n现磨豆浆\n玉米寿司\n玉子寿司\n猪肉汉堡\n猪扒意粉\n狮子头面\n牛油果酱\n燕麦酸奶\n烤牛小排\n炒大头菜\n比萨\n火腿寿司\n海鲜盖饭\n海鲜汤面\n泰式奶茶\n水牛芝士\n榴莲布丁\n炒茄子\n椒味烤鱼\n木耳鸡蛋\n木瓜雪蛤\n木瓜糖水\n无水蛋糕\n排骨汤锅\n拌核桃仁\n蛋拌乌冬面\n抹茶奶盖\n手工意面\n戚風蛋糕\n小炒肉饭\n孜然羊排\n玉子豆腐煲\n天妇罗拼盘\n奶酪意面\n大虾意面\n德国咸猪手\n咖啡冰沙\n全家福\n可可饼干\n切鲜羊肉\n全麦馒头\n巧克力戚风\n云南小瓜\n丸类组合\n上汤时蔬\n鸭胸色拉\n鸡汤米线\n鸡排汉堡\n海鲜豆腐汤\n鲜虾烧卖\n海鲜烩意粉\n鱼蒸豆腐\n飞鱼籽沙拉\n一锅出\n鱿鱼须\n香草雪糕\n香草蛋糕\n香煎牛排\n香烤鸡腿\n风味牛肉\n韩国泡菜\n肉酱烩意粉\n塔塔酱\n造型饼干\n蔓越莓司康\n捷赛私房菜\n豆骨头汤\n黄豆炖猪蹄\n西葫芦丝\n蔬菜组合\n蒸龙利鱼\n萝卜煎饼\n菠菜色拉\n菇炒青菜\n芹菜饺子\n芦笋沙拉\n芝士松饼\n芝士扇贝\n芒果沙律\n牛肉粒披萨\n肉焖土豆\n肉烧土豆\n肉炒青椒\n炒芹菜\n炒芥兰\n肉炒白菜\n牛肉土豆粉\n義大利麵\n缤纷蛋糕\n红糖吐司\n糖醋丸子\n玉米炖排骨\n石鍋拌飯\n百叶豆腐\n番茄意粉\n小木耳\n玫瑰花蛋糕\n猪肉煎饺\n牛肉锅仔\n牛肉清汤\n牛奶小方\n骨扒\n星冰乐\n照烧鸡排\n鸡胸肉\n焗鸡扒饭\n焗多宝鱼\n烤鸡胸肉\n烤肠拼盘\n辣炒小公鸡\n冰火菠萝油\n火腿色拉\n火腿煎饼\n滑蛋虾仁\n清炒虾仁\n海鲜锅底\n奶油冰淇淋\n辣椒炒花甲\n椒五花肉\n核桃磅蛋糕\n桂花山药\n核桃沙拉\n柠檬可乐\n柚子沙律\n摩卡蛋糕\n提拉米蘇\n排骨砂锅\n排骨土豆\n招牌拉面\n手打鱼丸\n战斧牛排\n慕丝蛋糕\n干菜扣肉\n手工冰淇淋\n宝宝蛋羔\n大肉水饺\n大块牛肉\n芝士焗龙虾\n培根寿司\n圣诞蛋糕\n土豆盖饭\n咖啡布丁\n腊味糯米饭\n参乌鸡汤\n萝卜粉丝汤\n萝卜炖排骨
\n冰糖雪梨\n巧克力冰沙\n五花肉片\n主厨沙拉\n串烧拼盘\n三文鱼面\n黑椒意面\n鸭血豆腐\n鸡蛋蒸饺\n鸡蛋盒子\n鱼片拼盘\n鱼一锅鲜\n骨炖土豆\n香蕉麦芬\n香蕉马芬\n香蕉吐司\n香菇馄饨\n香菇豆腐\n香芋奶茶\n香肠焗饭\n香肠焖饭\n风干牛肉\n波士顿大龙虾\n油面筋塞肉\n醋汁沙拉\n酸奶盆栽\n酱爆鸡丁\n豆角土豆\n豆腐鱼汤\n豆腐色拉\n豆炒虾仁\n豆渣馒头\n炒肉\n西米奶茶\n蛋黄饼干\n鲜虾净云吞\n蓑衣黄瓜\n蒜蓉西兰花\n萝卜馅饼\n萝卜泡菜\n炒毛豆\n菜拌豆腐\n菌菇披萨\n菌菇意面\n菇蛋花汤\n莲子豆浆\n草莓冰沙\n苹果豆浆\n芹菜水饺\n花甲米线\n花生饼干\n花生冰沙\n花型面包\n芥末虾球\n芝麻蛋糕\n芝麻花卷\n芝士布丁\n肉西兰花\n牛肉蔬菜汤\n肉炖白菜\n肉炒豆角\n炒菜花\n肉桂面包\n虾仁\n红豆糖水\n红烧带鱼\n糯米烧麦\n瑶柱炒饭\n玫瑰花面包\n西班牙香肠\n玫瑰面包\n牛奶馒头\n牛奶燕窝\n爆牛仔粒\n燕麦面包\n焦糖慕斯\n鳗鱼饭\n烤牛肋骨\n炒鱿鱼干\n炒木耳菜\n深海黄鱼\n深海大虾\n浓汤拉面\n洋芋擦擦\n牛油果手卷\n榴莲慕斯\n椰蓉饼干\n椒炒牛柳\n果仁蛋糕\n木糠蛋糕\n木糠布甸\n时令水果\n攸县香干\n排骨套饭\n拉面定食\n抹茶奶茶\n手工披萨\n手切牛肉\n意式咖啡\n夹心小蛋糕\n年糕拉面\n布丁面包\n客家豆腐\n奶酪司康\n奶油曲奇\n培根吐司\n坚果沙拉\n地狱拉面\n咖喱鸡块\n咖喱鱼丸\n咖啡拿铁\n南瓜沙拉\n动物饼干\n冰滴咖啡\n三鲜米线\n黑椒猪排\n黄瓜咸菜\n麻辣香肠\n鸡肉火锅\n鸡汤馄饨\n鸡排便当\n鸡丁披萨\n鲜奶小方\n鲑鱼沙拉\n生鱼片\n一夜干\n香酥鸡排\n香辣锅底\n香草面包\n香草布丁\n香煎鹅肝\n风情蛋糕\n面包披萨\n青瓜沙拉\n青柠苏打\n酸甜排骨\n酸汤鲈鱼\n酸汤锅底\n酸奶马芬\n酸奶冰棒\n酱酸辣粉\n果酱冰淇淋\n豆腐火锅\n黄豆猪蹄汤\n西洋菜汤\n血糯米粥\n蛋糕切件\n蛋乌冬面\n蚝油生菜\n虫草花汤\n虎皮凤爪\n蒜泥茄子\n萝卜蒸饺\n菠菜粉丝\n菠菜意面\n菌菇浓汤\n菇扒菜胆\n蒸荔浦芋头\n苹果面包\n芝麻月饼\n芝心比萨\n腰内猪排\n肉馅蒸饺\n肉炒香干\n肉炒韭菜\n肉沫豆腐\n银耳雪梨羹\n木耳红枣汤\n美味蛋糕\n红豆沙冰\n红糖蛋糕\n素馅饺子\n粉条包子\n蟹粉小笼包\n粉丝虾煲\n筒子骨汤\n笋烧牛腩\n生炒菜心\n牛腩火锅\n牛腩汤面\n牛肉薄片\n牛肉蒸饺\n牛肉煎饼\n牛奶沙冰\n牛奶冰棒\n牛三宝面\n爆炒腰花\n烧大黄鱼\n海鲜蒸蛋\n海鲜米线\n海鲜炖饭\n泡菜拉面\n牛油果奶昔\n沙拉军舰\n沙律军舰\n水煮活鱼\n椰奶布丁\n椒黑木耳\n椒盐花卷\n核桃豆浆\n枸杞鸡汤\n果酱饼干\n木瓜色拉\n曲奇泡芙\n时蔬色拉\n手卷寿司\n意大利麵\n干蒸烧卖\n巧克力挞\n班尼迪克蛋\n小麦啤酒\n大脸鸡排\n培根拌饭\n坚果蛋糕\n土豆烧鸡\n土豆炖肉\n咖喱炒蟹\n娃娃菜\n叶炒鸡蛋\n双拼蛋糕\n卡布其诺\n南瓜吐司\n刺身组合\n切片面包\n冰镇奶茶\n冰巧克力\n儿童牛排\n伍仁月饼\n什锦海鲜\n乳酪布丁\n乌龙奶盖\n上脑肥牛\n三鲜锅底\n黑豆豆浆\n黄豆豆浆\n麻辣牛蛙\n麻辣兔头\n麻花面包\n鸡蛋薄饼\n鸡腿排饭\n鸡肉宽面\n鸡肉卷饼\n鸡炖榛蘑\n鸡排焗饭\n海鲜全家福\n鲍菇牛柳\n鱼炖茄子\n鱼炖排骨\n骨头火锅\n香辣龙虾\n香辣豆腐\n多宝鱼\n风味沙拉\n风光披萨\n铁锅焖面\n酸辣白菜\n酸奶冰沙\n酱油炸鸡\n部队年糕\n软欧面包\n蔓越莓月饼\n蔓越莓土司\n越南春卷\n豌豆凉粉\n红豆小餐包\n调味炸鸡\n蝴蝶意面\n蛋牛肉粥\n虾握寿司\n虾仁蒸饺\n虾仁烧卖\n虾仁春卷\n蘑菇汉堡\n蔬菜拌饭\n蔬菜拉面\n蒸鸡蛋羹\n蒜香鸡排\n菠萝咕咾肉\n萝卜炒饭\n萝卜包子\n葡萄干蛋糕\n菠萝面包\n鲜肉包\n菜馅水饺\n蔬菜猪肝汤\n猪肉饺\n菜炒粉皮\n菜土豆汤\n菌菇焗饭\n金针菇牛肉卷\n菇炖鸡汤\n莓炒酸奶\n草莓果酱\n草莓松饼\n芹菜炒肉\n青花鱼定食\n芝士虾球\n芝士匹萨\n芒果绿茶\n芋圆奶茶\n胡萝卜片\n黑胡椒牛扒\n肉饺子馅\n肉酱薯条\n肉牛肉面\n羊肉大串\n羊后腿肉\n绿色毛肚\n红枣鸡汤\n糯米蒸鸡\n小米烩辽参\n芦笋炒虾仁\n生菜沙律\n瓜肉丸汤\n理石芝士\n珍珠圆子\n玉米炒饭\n猪排焗饭\n牛扒\n牛腩锅仔\n牛肉烧卖\n牛肉小面\n牛排意面\n牛奶慕斯\n爆浆猪排\n熊猫饼干\n焗三文鱼\n烤鸡沙拉\n烤鱿鱼须\n烤土豆泥\n烟
肉意粉\n炒糯米饭\n炒柴鸡蛋\n火腿月饼\n泡泡浴蛋糕\n法棍面包\n汁烩意粉\n水果吐司\n水果冰沙\n尖椒四季豆\n梅花猪扒\n安格斯牛柳\n柚子沙拉\n杏仁面包\n粉丝汤\n时蔬披萨\n排骨煨汤\n排骨焖面\n吞拿鱼沙律\n拌黄瓜丝\n拉面年糕\n抹茶饼干\n手撕鸡饭\n手工蛋糕\n慕斯奶茶\n意式奶冻\n干菜烧肉\n帝王蟹脚\n巧克力豆\n巧克力棒\n巧克力杯\n帕尔玛火腿\n小熊蛋糕\n奶油焗饭\n多利亚饭\n芝士焗红薯\n坚果面包\n土豆鸡块\n咖啡奶茶\n吉列猪扒\n可可戚风\n双色饼干\n双色豆腐\n参炖鸡汤\n去骨鸭掌\n卷筒披萨\n歌剧院蛋糕\n乌冬面定食\n关东辽参\n巧克力玛芬\n克力松露\n义大利面\n牛肉丸汤河粉\n个性蛋糕\n丝炒粉条\n三文鱼籽\n黄瓜炒肉\n麻辣鸡翅\n鸡蛋面条\n鸡蛋肉饼\n鸡丝沙拉\n鸡丁米线\n鳄梨沙拉\n鲜虾寿司\n鲜花蛋糕\n海鲜娃娃菜\n鱼汁意面\n高汤锅底\n排骨盖浇饭\n五香豆腐干\n香蕉班戟\n香草鸡扒\n香草泡芙\n香干肉丝\n香干炒肉\n奶香小餐包\n青椒炒肉\n霜降牛肉\n雪花和牛\n锡兰红茶\n金枪鱼酱\n酸菜鱼锅\n贺寿蛋糕\n豉汁排骨\n紅豆薏米粥\n炒香肠\n土豆丝卷饼\n滋补鸳鸯锅\n蟹粉汤包\n蟹籽炒饭\n螺旋意面\n蜜豆土司\n牛蛙煲仔饭\n蛋炒韭菜\n天使面\n虾仁焗饭\n虾仁炒面\n蘑菇拼盘\n蓝莓奶昔\n椰蓉小餐包\n葡萄面包\n菠萝古老肉\n萝卜焖饭\n葡萄干玛芬\n菠萝虾球\n豆腐锅\n肉水饺\n菜煎豆腐\n炒腐竹\n梅菜扣肉饭\n蘑菇牛肉饭\n菇炖排骨\n蘑菇奶油汤\n荞麦凉面\n苏式月饼\n豆芽炒粉条\n花边披萨\n花形面包\n芝麻蛋卷\n芝士馅饼\n芝士饭团\n芒果肠粉\n奥尔良烤鸡翅\n腐皮寿司\n脆皮鸡饭\n鸡胸三明治\n肥肠酸辣粉\n肉馅锅贴\n肉酱热狗\n肉豆腐锅\n肉蒸水蛋\n鸡肉米纸卷\n肉炒粉丝\n汉堡包\n肉末蒸蛋\n肉末粉丝\n肉丝汤面\n银耳红枣羹\n翻转蛋糕\n缤纷披萨\n综合刺身\n红薯丸子\n红烧鸡块\n红油锅底\n紫薯月饼\n糟溜鱼片\n蛋糕冰淇淋\n竹笋烧牛肉\n秘制烤翅\n碳烤生蚝\n盐焗鸡翅\n白菜包子\n玉米馒头\n玉米煎饼\n猪肉火锅\n猪肉拌饭\n特级牛排\n牛骨汤底\n牛腩米粉\n牛肉汤锅\n牛奶咖啡\n牙签牛肉\n爆海螺片\n煎五花肉\n焦糖咖啡\n烧鸡腿堡\n肉炖白萝卜\n炒馒头丁\n炒莴笋叶\n炒水晶粉\n炒三层肉\n冰淇淋吐司\n海鲜米粉\n浓缩咖啡\n流心蛋糕\n泡椒田鸡\n烤鸡腿\n水晶粉丝\n柠檬蜂蜜水\n榨菜肉丝\n椒爆鸡胗\n炒鸡丁\n红枣枸杞汤\n果木牛扒\n松露蛋糕\n杂菜沙律\n木耳饺子\n时光蛋糕\n无骨鸡爪\n无骨鸡块\n三文鱼盖饭\n排骨拉面\n招牌沙拉\n招牌寿司\n拌金枪鱼\n拌海蜇皮\n拌明太鱼\n手打肉丸\n冰咖啡\n水库大鱼头\n带子刺身\n奶香土司\n奶酪色拉\n奶酥饼干\n奶茶大杯\n奶油芝士\n奶油慕斯\n奶油小方\n酸奶水果杯\n夹心年糕\n芝士帕尼尼\n坚果酸奶\n咖喱虾仁\n和风沙拉\n和牛寿司\n可可蛋糕卷\n叉烧焗饭\n叉烧排骨\n卤鹌鹑蛋\n卡仕达酱\n萝卜骨头汤\n萝卜蛋炒饭\n南瓜汤圆\n千层班戟\n加州鲈鱼\n刺身定食\n五香熏鱼\n乳酪披萨\n一品肥牛\n奶酪蛋糕\n黄油吐司\n麻酱小料\n麦香奶茶\n鸭胸寿司\n鸡蛋色拉\n鸡肉馅饼\n鸡块盖饭\n鸡丝米线\n鳗鱼盖饭\n鲜榨橙汁\n海鲜大拼盘\n鱼炖粉条\n鱼汤米线\n带骨小牛排\n香肠沙拉\n香小饼干\n风味鸡翅\n风味意面\n韩式拌饭\n雪花刨冰\n金枪鱼饭\n金枪鱼腩\n野生鲈鱼\n醪糟汤圆\n酿油豆腐\n酸菜锅底\n酸菜粉丝\n酸菜炒饭\n酸汤水饺\n酸奶棒冰\n辣炒蛤蜊\n超值牛排\n豆角肉沫\n豆角焖饭\n豆芽粉条\n豆皮寿司\n豆类牛奶冰\n豆浆布丁\n豆丝炒肉\n葫芦饼\n西点蛋糕\n西施豆腐\n蛤蜊浓汤\n蛋黄豆腐\n蛋糕内部\n蓝莓果酱\n蒜香茄子\n蒜蓉大虾\n菠萝海鲜饭\n胡萝卜面包\n葡萄干司康\n菠萝色拉\n鱼火锅\n菜猪肉丸\n菊花茄子\n菇土鸡汤\n草莓芝士\n茶玛奇朵\n花鲢鱼头\n花卷馒头\n芝士猪排\n芝士牛排\n芝士焗面\n芝士大虾\n芝士土豆\n芝士土司\n芝士切块\n芒果寿司\n豆腐鲫鱼汤\n豆腐五花肉\n豆腐丸子汤\n脆皮烧鸡\n肉酱米线\n腊肉糯米饭\n牛肉粉/面\n炒豆干\n牛肉炒乌冬\n肉丸披萨\n肉丝米粉\n肉丝炒粉\n耶加雪菲\n绿茶拿铁\n纸上烤鱼\n特级牛小排\n红酒鹅肝\n红薯糖水\n红薯粉丝\n红烧鸡翅\n红烧鱼块\n紫薯吐司\n紫菜蛋汤\n糯米糍粑\n糯米团子\n粉丝蒸虾\n米饭布丁\n米酒汤圆\n
笋小炒肉\n石锅炒饭\n相扑火锅\n盐焗鸡饭\n虾皮鸡蛋饼\n脆皮大泡芙\n白菜蒸饺\n白汁意面\n白汁意粉\n花生酱饼干\n花生牛扎糖\n现酿酸奶\n玫瑰拿铁\n玉米排骨\n猫耳朵面\n猪软骨饭\n特色豆腐\n特色咖啡\n牛腩米线\n牛肉焖饭\n牛肉干锅\n牛肉咖喱\n牛排沙拉\n牛奶雪糕\n肥牛乌冬面\n燕麦曲奇\n熟豆豆浆\n焦糖苹果\n烩牛肉饭\n烧油豆腐\n烧大明虾\n锅烧乌冬面\n烤金枪鱼\n烤羊蝎子\n烤多宝鱼\n炒黄花菜\n炒黄瓜片\n炒鸡软骨\n炒红薯叶\n深海鱼头\n冰淇淋咖啡\n海鲜什锦\n海皇炒饭\n泡椒烤鱼\n油酥烧饼\n豉油皇炒面\n沙律虾球\n水牛毛肚\n水果慕斯\n榴莲泡芙\n榄菜炒饭\n椰蓉月饼\n蒸凤爪\n辣椒猪颈肉\n辣椒拌茄子\n青椒拌皮蛋\n辣椒嫩牛肉\n樱桃小萝卜\n培根芦笋卷\n金枪鱼面包\n果冻布丁\n木耳肉丝\n时蔬炒饭\n无骨鸭掌\n三文鱼骨腩\n排骨锅底\n招牌牛排\n吉拉多生蚝\n私房牛肉面\n年糕吐司\n干炒芹菜\n安康鱼肝\n嫩肩牛排\n奶油炸糕\n奶类冻芝士\n大虾寿司\n芝士焗意粉\n培根煎饼\n场景蛋糕\n土豆豆角\n土豆茄子\n土豆炖鸡\n土豆泥饼\n四川泡菜\n咖喱拉面\n咖喱意面\n台山菜花\n可可面包\n双色鱼头\n双拼比萨\n去骨凤爪\n厚片吐司\n萝卜鲫鱼汤\n萝卜馅饺子\n萝卜素丸子\n胡萝卜玉米汤\n南瓜百合\n单品咖啡\n半筋半肉\n千层毛肚\n巧克力华夫饼\n凉菜拼盘\n巧克力马芬\n巧克力月饼\n丸子组合\n丸子米线\n丸子火锅\n中华海草\n上汤菠菜\nTORO\n黑黄豆浆\n黄鱼春卷\n麻辣鸭头\n麻辣鸡块\n鹰嘴豆泥\n土鸡鸳鸯锅\n鸡蛋披萨\n鸡腿盖饭\n鸡翅拼盘\n鸡排意面\n三鲜馅饺子\n海鲜炒年糕\n鲜果松饼\n鲜果披萨\n鲑鱼色拉\n鲍鱼刺身\n鱿鱼盖饭\n鱼味牛油果\n鱼片火锅\n加州卷\n鸡腿堡\n香酥饼干\n香辣猪手\n香蕉面包\n香草慕斯\n香烤排骨\n电饭煲蛋糕\n猪颈肉炒饭\n韩式炸鸡\n青瓜沙律\n长棍面包\n金钻奶茶\n野生大虾\n里拉芝士\n酸菜黑鱼\n酸菜饺子\n酸菜肥牛\n酱椒鱼头\n酱三文鱼\n奶酪小餐包\n奶酪小蛋糕\n红酒烩牛肉\n番茄酱\n辫子吐司\n辣烤鸡翅\n辣椒皮蛋\n辣味烤鱼\n蹄花汤锅\n豹皮豆腐\n豆花牛肉\n豆腐肉饼\n豆腐粉条\n豆渣煎饼\n豆渣丸子\n土豆泥色拉\n豆沙餐包\n西红柿汁\n西冷牛肉\n蟹黄豆花\n蟹子炒饭\n羊蝎子火锅\n羊蝎子小锅\n蛋白饼干\n虾酱炒饭\n虾仁锅贴\n虾仁色拉\n藕炖排骨\n薏米糖水\n蔬菜焗饭\n蔬菜咖喱\n蓝莓乳酪\n椰蓉小面包\n蒸扇贝王\n蒜香龙虾\n葱拌豆腐\n萝卜酿肉\n葡萄干发糕\n菠萝油条\n鲜肉饺\n拌豆干\n菌鸳鸯锅\n菌炒牛柳\n菌类排骨汤\n菊花豆腐\n乌鸡汤\n荠菜饺子\n茶树菇汤\n苹果果酱\n花朵饼干\n花园蛋糕\n芝麻餐包\n火腿肠面包\n腰果虾仁\n豆腐鸡蛋羹\n脆皮猪肘\n胶酿油条\n胡萝卜泥\n肥牛盖饭\n肉饼蒸蛋\n牛肉蘑菇饭\n肉松炒饭\n鸡肉拌米粉\n牛肉土豆饭\n肉丸意面\n肉丝炒饼\n耳雪梨汤\n木耳炒腐竹\n考伯色拉\n羊排火锅\n红豆牛奶\n西红柿炒蛋\n红枣南瓜\n紫薯蛋糕\n紫薯发糕\n焦糖玛琪朵\n红薯粥\n秘制鲈鱼\n秘制猪手\n盱眙龙虾\n酱皇蒸凤爪\n皇后披萨\n小点心\n辣白菜火锅\n番茄牛肉\n番茄沙司\n甜虾沙拉\n紫甘蓝沙拉\n炒百合\n南瓜奶油汤\n琵琶豆腐\n大理石蛋糕\n西班牙火腿\n玫瑰慕斯\n玉米饺子\n猴子面包\n猪肉香肠\n猪肉拼盘\n特大鸡排\n牛肝菌汤\n牛肉土豆\n牛肉便当\n牛粉丝煲\n牛柳披萨\n牛排色拉\n爆鱿鱼须\n燕麦吐司\n烧汁茄子\n炸酱拌面\n炸冰淇淋\n炒鱿鱼花\n炒海螺片\n炒海带丝\n炒山药片\n火腿焗饭\n火腿焖饭\n冰激凌松饼\n混合沙拉\n深海鱼排\n海鲜粉丝\n海鲜清汤\n海鲜总汇\n洋葱炒肉\n洋芋焖饭\n泡菜肥牛\n泡菜拌饭\n法式吐司\n油面包卷\n油炒时蔬\n沙拉手卷\n豆沙小餐包\n樱桃果酱\n樱桃慕斯\n椰子雪糕\n辣椒炒豆干\n梅子绿茶\n金桔柠檬茶\n柠檬慕斯\n柚子绿茶\n枸杞燕窝\n果馅蛋糕\n果肉酸奶\n果肉果汁\n水果小圆子\n果味酸奶\n板腱牛排\n杨枝金露\n肉末炒豆角\n早餐面包\n无骨鸡排\n方块饼干\n新鲜蔬菜\n斯牛小排\n挤挤面包\n提拉米苏杯\n手工鱼丸\n手工凉皮\n意式烩饭\n彩色汤圆\n带骨肉眼\n巧克力饼\n山药鸡汤\n双层鸡腿堡\n小牛菲力\n莲子百合汤\n茄子煲仔饭\n牛奶鸡蛋羹\n奶香饼干\n奶酪拼盘\n奶酪年糕\n奶酪土司\n奶类蛋糕卷\n奶茶系列\n酸奶紫米露\n奶
油意粉\n奶油司康\n焗意面\n比基尼蛋糕\n培根匹萨\n圆形蛋糕\n咖喱鸡扒饭\n咖啡星冰乐\n烤鲶鱼\n可可曲奇\n反卷寿司\n双色花卷\n参炖乌鸡\n原味奶酪\n卷边披萨\n印尼炒饭\n萝卜丝丸子\n南美大虾\n刺猬馒头\n奥利奥奶茶\n伊比利亚火腿\n巧克力雪糕\n巧克力牛奶\n巧克力火锅\n巧克力冰泥\n五谷丰登\n乌骨鸡汤\n粉丝蒸带子\n上脑牛排\n三鲜锅贴\n三鲜砂锅\n三文鱼片\nlaksa\n黑毛猪肉\n黄焖排骨\n黄桥烧饼\n麦穗面包\n鸭胸沙律\n鸡蛋土司\n鸡肉串烧\n鸡汤锅底\n鸡丝色拉\n鲜虾焗饭\n鲜肉煎饺\n鲜肉汤圆\n鱿鱼炒饭\n鱿鱼沙拉\n鱿鱼年糕\n鱼汤锅底\n三文鱼土豆饼\n骨炖山药\n马蹄糖水\n香辣肉丝\n香菇炖鸡\n香菇包子\n香草烤鸡\n香草奶冻\n牛肉堡\n香煎鲈鱼\n香浓咖啡\n百香果慕斯\n小排骨\n香味烤鱼\n东坡肉\n馅儿包子\n饼干礼盒\n风情烤翅\n韭菜馅饼\n韩式蛋糕\n雪花牛柳\n雪人蛋糕\n铁板牛柳\n金汤肥牛\n野生鲶鱼\n野生杂鱼\n里脊猪排\n酸辣粉丝\n酸奶土司\n酱爆鱿鱼\n酱烤鱿鱼\n酥皮面包\n辣味炒饭\n转化糖浆\n豉汁凤爪\n豆腐蒸蛋\n豆腐粉丝\n豆腐年糕\n豆腐包子\n黄豆煲排骨\n豆炒茄子\n谷物面包\n诱惑蛋糕\n豆角炒肉丝\n西葫芦片\n蟹肉炒饭\n螺丝意面\n蝴蝶馒头\n蜜豆蛋糕\n鸡蛋馅饺子\n鸡蛋豆腐羹\n蛋糕礼盒\n蛋糕盒子\n鸡蛋炒木耳\n大虾蛋包饭\n虾卷寿司\n虾仁蒸蛋\n虾仁生煎\n虎皮鸡蛋\n蘑菇色拉\n蘑菇炒肉\n薄脆披萨\n香蕉磅蛋糕\n蔬菜面包\n葱烧豆腐\n营养米糊\n葡萄干马芬\n葡萄干饼干\n葡萄干土司\n包子\n炒香菇\n炒汤圆\n青菜年糕汤\n菌菇锅底\n菌菇米线\n老鸡汤\n扒时蔬\n蔓越莓磅蛋糕\n松茸蘑菇汤\n番茄酱意面\n番茄海鲜汤\n茄子拌面\n苹果奶昔\n英式奶茶\n五花肉炒饭\n花甲粉丝\n花生酥饼\n花生汤圆\n芝麻布丁\n芝士生蚝\n芝士烤饼\n芝士拌饭\n芝士批萨\n芒果奶茶\n藍色夏威夷\n火腿肠炒饭\n火腿炒鸡蛋\n豆腐白菜汤\n豆腐炖白菜\n脆脆蛋糕\n胚芽奶茶\n肥牛锅仔\n肉铁板烧\n芝士卷\n肉紫菜卷\n肉碎炒饭\n砂锅饭\n肉炒青瓜\n炒粉条\n牛肉炒意粉\n炒土豆\n肉松餐包\n肉松月饼\n肉末拌面\n肉之近照\n老鸭粉丝\n老汤豆腐\n翡翠豆腐\n罗汉果茶\n经典沙拉\n经典汉堡\n红豆酸奶\n红豆凉粉\n红薯粉条\n红糖年糕\n红烩牛肉\n红枣奶茶\n紫薯汤圆\n糯米饭团\n糖水罐头\n玉米炒虾仁\n笋炒鸡蛋\n支竹猪肚煲\n立体蛋糕\n秘制牛扒\n盆栽酸奶\n辣椒酱\n百合南瓜\n白菜炒面\n白肉血肠\n白灼生菜\n番茄焗饭\n甜筒披萨\n甘梅地瓜\n瓜类粉丝汤\n琥珀核桃\n千层班戟蛋糕\n玫瑰红茶\n玉米面条\n玉米脆片\n猪血丸子\n猪肉沙拉\n狗肉火锅\n牛腩锅底\n牛肉石锅\n牛肉煲仔\n牛牛小排\n牛油红锅\n爱心蛋糕\n燕麦豆浆\n燕麦牛奶\n煎酿三宝\n煎多宝鱼\n烧鱿鱼须\n烧小土豆\n烧娃娃菜\n烧大白菜\n烤黄花鱼\n烤鸡肉饭\n烤鸡比萨\n香烤小羊排\n炼奶吐司\n炸鱼薯条\n雪里红\n炒血豆腐\n炒花椰菜\n炒猪肠粉\n炒猪大肠\n炒牛肉末\n炒手捏菜\n炒卤牛肉\n火腿芝士\n火腿炒面\n液体酸奶\n海鲜叻沙\n海苔饭团\n海胆蒸蛋\n奶油磅蛋糕\n油泼辣子\n豆沙小面包\n水果缤纷\n水果华夫\n歌剧蛋糕\n青椒松花蛋\n梅花猪排\n桂花年糕\n桂圆面包\n安格斯牛扒\n格子松饼\n核桃麦芬\n柠檬芝士\n红枣乌鸡汤\n果粒奶茶\n果干吐司\n果味奶茶\n果仁酸奶\n枸杞银耳汤\n木瓜奶昔\n有機花菜\n星鳗寿司\n无锡小笼\n三文魚壽司\n排骨焗饭\n排骨便当\n挂炉烤鸭\n招牌色拉\n拌黑腐竹\n拌野木耳\n拌荞麦面\n拌海螺片\n拌水萝卜\n手绘蛋糕\n手撕面筋\n手撕吐司\n手工香肠\n手工饺子\n扁豆焖面\n慕斯泡芙\n彩虹沙拉\n广东芥兰\n希腊酸奶\n小熊饼干\n九宫格锅底\n宫保虾球\n定制蛋糕\n奶香曲奇\n奶油戚风\n酸奶华夫饼\n玛奇朵咖啡\n天鹅泡芙\n天使意面\n壽司拼盤\n芝士小蛋糕\n墨西哥饼\n基围虾粥\n鸡块盖浇饭\n四季豆饭\n咸肉炒饭\n咖喱鸡腿\n咖喱锅底\n咖喱乌冬\n和牛牛排\n鱼头王\n烤鱿鱼\n烤全鱼\n风味炸鸡翅\n百合炒虾球\n可可慕斯\n原味豆浆\n原味蛋挞\n原味牛排\n萝卜炖羊排\n北极甜虾\n包心鱼丸\n巧克力棒棒糖\n利梭多饭\n创意寿司\n凉拌豆腐\n凉拌米线\n冬荫功汤\n内酯豆腐\n养生火锅\n全麦饼干\n全麦贝果\n全麦欧包\n什锦豆腐\n乳酪月饼\n拌黄瓜
\n粉丝丸子汤\n东海带鱼\n上海菜饭\n上校鸡块\n三文鱼柳\n三丝春卷\n万州烤鱼\nCrepe\n柠檬酸奶\nT骨牛扒\n金枪鱼\n黑糖咖啡\n黑椒鸡翅\n黄金馒头\n黄河刀鱼\n芝麻菜披萨\n鸭肉沙拉\n鸡蛋锅贴\n鸡蛋炒饼\n鸡蛋摊饼\n鸡蛋捞面\n鸡腿焖饭\n鸡腿便当\n鸡肉焖饭\n鸡炖粉条\n鸡扒定食\n鲜虾豆腐\n海鲜炒意粉\n鲜果虾球\n鲍汁凤爪\n鱿鱼披萨\n鱼烧年糕\n鱼子沙拉\n鱼头砂锅\n山药汤\n香酥带鱼\n香辣鸡排\n香辣牛排\n香辣烤鸡\n香辣意面\n香辣大虾\n香菇面筋\n香菇肉片\n香菇肉包\n香菇炒饭\n香草奶茶\n香芒布丁\n香脆豆腐\n香煎带鱼\n香浓奶茶\n香橙蛋糕\n水煎包\n风味奶茶\n青椒茄子\n青岩豆腐\n雪顶咖啡\n雪梨燕窝\n门钉肉饼\n锦绣拼盘\n石锅拌饭2\n银耳甜汤\n铁板鸡排\n野生虾仁\n煎酿杏鲍菇\n酱鸡蛋饼\n酱蒸鱼头\n酱炒豆腐\n酱炒意面\n酱娃娃菜\n酥脆饼干\n酒酿汤圆\n烤蔬菜\n牛肉酱\n辣子炒肉\n辣味披萨\n辣椒八爪鱼\n刺身大拼盘\n起酥面包\n贡丸米线\n豚肉拉面\n豆豉酱拌面\n豆豉排骨\n豆腐虾仁\n豆腐水饺\n四角豆炒腊味\n谷饲牛排\n西芹木耳\n墨西哥面包\n羊扒\n蟹蒸粉丝\n蟹脚寿司\n蜂蜜芥末酱\n蜂蜜土司\n蛋糕吐司\n蛋浸时蔬\n虾寿司卷\n虾兵蟹将\n虾仁煎饺\n虎皮豆腐\n蘑菇肉片\n蘑菇土豆\n薄荷汽水\n薄荷拿铁\n蔬菜蛋饼\n蔬果沙律\n蓝莓马芬\n蒸石斑鱼\n葱爆牛柳\n葡萄吐司\n阿萨姆奶茶\n萝卜鸡汤\n萝卜肉丸\n萝卜肉丝\n萝卜牛肉\n黑木耳\n鲫鱼汤\n豆腐皮\n蒸茄子\n菜炒猪肚\n炒千张\n油豆腐\n菜土豆条\n土鸡汤\n蒸滑鸡\n炖小鸡\n蘑菇炒油菜\n鲜菇丸子汤\n莲藕丸子\n得莫利炖鱼\n草莓冻芝士\n香草星冰乐\n香草三文鱼\n茉莉奶茶\n茄子沙拉\n花生牛奶\n芝麻沙拉\n芝麻桃酥\n芝麻小饼\n芝士绿茶\n芝士烩饭\n芝士曲奇\n芝士奶盖\n芝士乳酪\n豆腐小炒肉\n脆皮鸡腿\n脆皮鸡扒\n脆皮叉烧\n肥肠米粉\n肥肠小面\n肥牛鹅肠\n肥牛炒饭\n肠仔披萨\n肉饼汉堡\n肉酿豆腐\n肉类芦笋卷\n肉碎豆腐\n肉砂锅粥\n炒西芹\n炒笋干\n肉炒尖椒\n肉沫炒饭\n肉松排包\n肉松披萨\n肉末豆角\n大酱汤\n肉土豆汤\n肉丸焗饭\n肉丝卷饼\n银耳百合羹\n羊肉烧麦\n羊肉炒饭\n羊肉拼盘\n绿豆凉粉\n经典牛扒\n顶级羔羊肉\n起泡酒\n红豆豆浆\n红豆汤圆\n红豆拿铁\n红豆年糕\n红豆土司\n红糖姜茶\n红米炒饭\n红烧肉面\n红油水饺\n西红柿牛腩\n红枣面包\n红枣牛奶\n紫薯沙拉\n紫菜饭团\n糯米鸡翅\n糯米粽子\n糖炒板栗\n粗粮面包\n粉丝蟹煲\n玉米炒肉丁\n芦笋培根卷\n神户牛肉\n礼盒蛋糕\n石烤鱿鱼\n石板豆腐\n皇冠面包\n鸡蛋羹\n百合甜汤\n番茄沙律\n野生小鲫鱼\n甜心蛋糕\n甜甜圈蛋糕\n玫瑰饼干\n玉米蛋糕\n猪肚鸡煲\n猪肉蒸包\n猪肉色拉\n猪肉肠粉\n猪肉大包\n特色凉面\n特浓咖啡\n牛腩煲仔\n牛腩焗饭\n牛肚火锅\n牛肋条肉\n牛肉月饼\n牛肉冷面\n牛筋牛腩\n牛柳意粉\n牛排骨肉\n牛排定食\n牛奶饼干\n牛奶排包\n牛仔骨煲\n牛乳蛋糕\n燕窝蛋挞\n熏鸡披萨\n煮荷包蛋\n焗龙利鱼\n烧肉炒饭\n烧牛小排\n烤鸭半只\n烤鸡软骨\n烤娃娃菜\n炼乳刨冰\n炸鸡软骨\n炸鸡沙拉\n炸五花肉\n炖油豆角\n炒鸡毛菜\n炒蛤蜊肉\n炒花甲螺\n炒花枝片\n炒红萝卜\n炒红苋菜\n炒小黄瓜\n炒夜开花\n炒南瓜藤\n炒北极贝\n火腿吐司\n火腿卷饼\n火炙寿司\n澳洲西冷\n滋补烩面\n深海鲽鱼\n消化饼干\n海鲜河粉\n海鲜拌饭\n海鲜伊面\n海盐拿铁\n海参捞饭\n洋葱披萨\n泡菜拼盘\n法式蜗牛\n法国面包\n法国蜗牛\n法国生蚝\n温泉蛋色拉\n牛油果慕斯\n沙爹牛肉\n沙姜猪手\n小馄饨\n蜜汁叉烧饭\n水晶虾球\n椰汁西米\n椰子蛋糕\n桃胶糖水\n安格斯西冷\n核桃色拉\n栗子白菜\n柠檬烤鸡\n柠檬果汁\n金枪鱼比萨\n果酸奶昔\n牛奶冰\n松子意面\n杭椒牛柳\n枸杞鸽子汤\n杏鲍菇丝\n杏仁脆饼\n杏仁布丁\n杂粮米糊\n朝鲜冷面\n时蔬豆腐\n三文鱼塔塔\n摩卡冰沙\n吞拿鱼意粉\n拿铁奶茶\n拌黑豆丝\n拌海蜇丝\n拉丝面包\n猪扒石锅饭\n手摇酸奶\n手打年糕\n手工馒头\n手工糖果\n手工丸子\n手卷披萨\n德国猪手\n年糕锅底\n干锅菜花\n干锅排骨\n常州百叶\n海带炖排骨\n帝王蟹腿\n巴沙鱼柳\n巧克力奶\n少的可怜\n茄子炒豆角\n娃娃菜汤\n奶酪汉堡\n奶茶布丁\n奶盖贡茶\n奶油饼干\n奶
油排包\n奶油华夫\n天山雪莲\n大虾意粉\n大碗花菜\n意大利披萨\n多菌锅底\n土豆鸡翅\n土豆面包\n土豆肉片\n如豆粉条\n土豆拼盘\n哈斯面包\n咖喱鲈鱼\n咖喱豆腐\n咖喱猪排\n咖啡沙冰\n和牛西冷\n烤黑鱼\n吞拿鱼卷\n双拼奶茶\n原味泡芙\n原味曲奇\n萝卜馅包子\n萝卜焖牛腩\n萝卜炖羊肉\n南非干鲍\n千层面包\n包菜粉丝\n巧克力软曲奇\n秘制黑豆腐\n凤梨虾球\n养生汤锅\n巧克力华夫\n优格奶昔\n仙草芋圆\n什锦焗饭\n什锦时蔬\n什锦寿司\n五花肉串\n云吞汤面\n乳酪吐司\n玛格丽特披萨\n牛肉丸粿条汤\n东海黄鱼\n三鲜烧麦\n三鲜包子\n三角饭团\n三角蛋糕\n三文鱼腹\n盛り合わせ\notoro寿司\nBagel\n龙虾泡饭\n龙利鱼饭\n龙俐鱼柳\n黑毛和牛\n黑椒鸡排\n黑椒猪扒\n黄鱼馄饨\n黄瓜蘸酱\n黄瓜泡菜\n黄油土豆\n芝麻菜色拉\n鸭血米线\n鸡蛋面饼\n鸡蛋煎饺\n鸡蛋汉堡\n鸡蛋小饼\n鸡蘑菇粉\n鸡肉米线\n鸡石锅饭\n鸡柳汉堡\n鸡块米线\n鲜肉生煎\n鲜肉烧麦\n海鲜小火锅\n鲜奶泡芙\n鱼籽刺身\n鱼头锅底\n鳗鱼卷寿司\n鱼丸子汤\n鰻魚壽司\n排骨糯米饭\n焖土豆\n烩酸菜\n香酥鸡翅\n香辣鸡块\n香辣花甲\n辣椒油\n香蕉沙拉\n香蕉华夫\n香蕉冰沙\n香草羊排\n香芋丸子\n香煎鸡排\n香烤猪蹄\n炸猪排\n香炸大排\n香星冰乐\n风味烤翅\n风味炒饭\n风味拿铁\n顶级牛排\n韭菜包子\n面包机版\n青柠绿茶\n霸王鱼头\n霸王牛肉\n雪顶拿铁\n长江鲥鱼\n银鳕鱼扒\n银鱼蒸蛋\n铁盘披萨\n铁棍淮山\n铁板牛蛙\n拿铁星冰乐\n金枪鱼头\n金枪刺身\n野生黄鳝\n野生水鱼\n野生扇贝\n醋溜白菜\n醇香奶茶\n酸汤鱼片\n酸奶麦片\n酸奶冰棍\n酱鱿鱼须\n酱金钱肚\n酱蒸茄子\n酱蒸肠粉\n酱爆牛蛙\n酱焖排骨\n酱烤肋排\n酱烤生蚝\n酱炒豆角\n酱炒花甲\n酱拌茄子\n乳酪磅蛋糕\n酒巧克力\n道口烧鸡\n迷糊娃娃\n辣酱炸鸡\n辣椒炒饭\n辣三文鱼\n转印蛋糕\n海皇轩国门店\n蔓越莓软欧\n豆角烧肉\n豆角炒饭\n豆蓉月饼\n豆芽炒肉\n豆腐饺子\n豆腐酿肉\n豆腐蛋糕\n豆腐肉丸\n豆腐炖鱼\n豆腐塞肉\n焖猪蹄\n土豆烧鸡翅\n土豆炖鸡块\n炒肉丁\n豆沙粽子\n杂酱面\n话梅猪手\n虾饺皇\n西红柿面\n西米糖水\n西施泡饭\n裸麦面包\n裙边寿司\n蜜豆奶茶\n蜗牛浓汤\n蛋黄鸡翅\n蛋糕组合\n蛋糕制作\n蚝油牛肉\n虾蒸水蛋\n虾仁沙律\n虾仁拌面\n虎虾寿司\n糯米饼\n薯条拼盘\n香蕉冰淇淋\n蔬菜烩饭\n蔬菜卷饼\n蓝莓麦芬\n蓝莓冰沙\n蓑衣茄子\n蒜香炸鸡\n蒜泥黄瓜\n蒜味面包\n葱香面包\n葱爆肥牛\n胡萝卜蛋饼\n胡萝卜炒面\n萝卜浓汤\n萝卜拼盘\n菲肋牛排\n菲利牛排\n菠萝蛋糕\n菠萝果酱\n虾仁粥\n肉圆汤\n炒肉沫\n菜炒米饭\n菜炒猪肉\n菇肉片饭\n菌菇牛肉面\n炖鸡面\n莓莫吉托\n荷包蛋面\n山药炖排骨\n草莓大福\n茶碗蒸蛋\n茴香饺子\n番茄三明治\n苹果芝士\n苹果色拉\n花生豆腐\n花式馒头\n花卷面包\n芦笋浓汤\n芥末鸭掌\n芝麻馒头\n芝士雪冰\n芝士切片\n芒果蛋挞\n芋头糖水\n腊味拼盘\n脊骨火锅\n脆骨丸子\n脆皮茄子\n脆皮肠粉\n脆皮牛肉\n脆皮烤鸡\n胚芽面包\n肩胛牛排\n肩胛牛扒\n肥牛定食\n肠仔面包\n鸡肉芝士饭\n肉砂锅面\n烧萝卜\n炖酸菜\n肉炖粉皮\n肉炒洋葱\n肉松蛋卷\n肉小丸子\n寿司卷\n肉丝套饭\n肉丁拌面\n绿茶奶盖\n绣球馒头\n红酱意面\n红酒牛肉\n红豆餐包\n红豆雪糕\n红豆松饼\n红薯面包\n红糖鸡蛋\n紫薯糖水\n糯米面包\n粉丝包子\n米饭沙拉\n米饭披萨\n笋炖排骨\n芝麻糊\n磨牙饼干\n盒子鸡排\n百合木耳\n白酱意面\n白玉豆腐\n白切羊肉\n番茄莎莎\n番茄沙沙\n甲鱼火锅\n生态黄鱼\n甜辣炸鸡\n瓦罐煨汤\n炒鱿鱼\n小丸子\n玻璃虾寿司\n珍珠奶盖\n猪软骨面\n猪油拌饭\n果汁\n特色米线\n物语蛋糕\n牛腩肠粉\n牛腩盖饭\n牛肉薄烧\n牛肉米饭\n牛肉汤饭\n牛肉批萨\n牛肉年糕\n牛肉丸面\n黑椒牛柳炒面\n牛尾浓汤\n牛奶燕麦\n牛三星汤\n爆蛋奶茶\n爆浆蛋糕\n燕麦蛋糕\n熊猫饭团\n煲娃娃菜\n水煮黄骨鱼\n煎黄花鱼\n煎金枪鱼\n煎酿茄子\n焗大龙虾\n焖牛肉饭\n烧酿茄子\n烧橡皮鱼\n炼乳面包\n炸鸡拌饭\n炸虾寿司\n炭烧奶茶\n炖娃娃菜\n炖大白菜\n炒黑鱼片\n炒黄秋葵\n炒韭菜苔\n炒野木耳\n炒油菜蕻\n炒山鸡蛋\n炒千张丝\n火鸡披萨\n火腿意粉\n澳洲牛柳\n湘西腊肉
\n冰淇淋奶昔\n海鮮炒飯\n北海道扇贝\n海苔寿司\n海捕大虾\n浓汤锅底\n澳洲大龙虾\n洋葱炒饭\n泡椒鸡爪\n油醋沙拉\n豉油皇鹅肠\n牛油果沙律\n沙爹肉串\n沙律手卷\n水果酵素\n水果汤圆\n水果奶昔\n水果圣代\n水晶奶茶\n水信玄饼\n气泡果汁\n欧式面包\n榴莲蛋挞\n榴莲双拼\n椰香面包\n椰香蛋糕\n椰子炖鸡\n椰丝蛋糕\n黑椒鸡扒饭\n椒蒸茄子\n黑椒牛肋骨\n炒鸡块\n辣椒炒火腿\n青椒炒木耳\n肉桂面包卷\n桂花汤圆\n培根焗意面\n核桃酥饼\n核桃曲奇\n茶树菇牛柳\n柴火香干\n柠檬冰饮\n枸杞豆浆\n枣泥蛋糕\n果酱酥饼\n果粒蛋糕\n果木牛排\n果冻芝士\n果仁饼干\n北极贝裙边\n肉松小面包\n杏仁酥饼\n杂鱼锅仔\n木耳白菜\n有机豆芽\n有机木耳\n普罗旺斯\n时蔬焗饭\n时蔬沙律\n无花果酱\n无花果汤\n新竹米粉\n手撕包心菜\n排骨煲仔\n排骨汤饭\n排骨拌饭\n招牌烧鹅\n拌黄花菜\n拉面火锅\n手撕羊排\n手撕猪心\n手抓龙虾\n手工拉面\n手工奶茶\n手切肥牛\n慕思蛋糕\n意式饺子\n忌廉浓汤\n心花怒放\n德国香肠\n开心花甲\n广东丝瓜\n年糕雪冰\n年糕火鍋\n带子肠粉\n布丁奶昔\n山楂果酱\n小米辽参\n孜然牛排\n夏威夷比萨\n天妇罗寿司\n天妇罗定食\n奶酪曲奇\n奶盖咖啡\n炖燕窝\n奶油蛋挞\n奶油奶酪\n奶油土司\n奶星冰乐\n大酥牛肉\n大虾小锅\n意大利扁面\n芝士牛肉堡\n墨鱼寿司\n培根饭团\n培根蛋饼\n坚果饼干\n圣诞饼干\n圣诞老人\n土豆蛋饼\n土豆肉丝\n土豆炒饭\n土司盒子\n咖喱鸡块饭\n千层肚\n咖喱肉蟹\n咖啡曲奇\n咖哩牛肉\n咕噜虾球\n烤草鱼\n炸鸡块\n大鱿鱼\n吐司盒子\n百合炒木耳\n可爱蛋糕\n可可吐司\n双薯沙拉\n双色蛋糕\n叉烧拼盘\n厚片土司\n萝卜丸子汤\n南瓜饼干\n南瓜色拉\n南瓜排骨\n南瓜戚风\n南瓜丸子\n千层盒子\n千层油糕\n刺身寿司\n切片吐司\n凤尾虾球\n凤凰奶糊\n凉拌鸡丝\n冰淇淋杯\n冰淇淋卷\n农家土鸡\n全麦软欧\n巧克力披萨\n巧克力慕司\n巧克力慕丝\n五香豆干\n五花肉汤\n五花肉2\n五花拌饭\n二豆豆浆\n乳酪沙拉\n乡里腊肉\n乐园蛋糕\n乌冬汤面\n乌冬定食\n丹麦面包\n上脑牛肉\n上脑牛扒\n三明治2\n三文鱼粥\n三文鱼挞\n丁丁炒面\n龙虾盖饭\n火龙果酸奶\n黑豆腐皮\n黑椒鸡扒\n黑椒排骨\n黄瓜炒蛋\n黄油扇贝\n麻辣花生\n芝麻糯米饼\n全麦小餐包\n鹌鹑皮蛋\n鹅肝牛排\n鸡蛋虾仁\n鸡蛋粉丝\n鸡腿焗饭\n鸡腿披萨\n鸡胸脯肉\n鸡肉拼盘\n鸡肉匹萨\n蓝鳍金枪鱼\n虾意粉\n鲜肉馅饼\n海鲜焗意粉\n鲜汤锅底\n鲅鱼饺子\n鲜鱼萝卜汤\n鱼籽蒸蛋\n鱼籽拌饭\n三文鱼籽军舰\n鱼沙拉卷\n骨粉丝汤\n骨头锅底\n养生汤\n鲜香鸳鸯锅\n香辣羊排\n香辣烤翅\n香蕉戚风\n香葱花卷\n香菜饺子\n香菜牛肉\n香菇肉丸\n香草曲奇\n香草咖啡\n香肠拌饭\n玛德琳\n香煎猪排\n香煎猪扒\n炸鸡翅\n香大鸡排\n电饭锅蛋糕\n风干火腿\n风味鸡排\n风味薯条\n风味排骨\n风味凉皮\n猪颈肉沙拉\n顶级肥牛\n韭菜锅贴\n韭菜蒸饺\n韭菜丸子\n韩式烤肉\n面包沙拉\n面包拼盘\n青菜拼盘\n青椒肉片\n雪菜肉丝\n雪糕蛋糕\n雪糕芭菲\n阿拉斯加\n长江回鱼\n银芽炒面\n银耳燕窝\n铁板牛扒\n金牌猪手\n金枪鱼面\n野菌浓汤\n酸辣米粉\n酸辣笋尖\n酸菜血肠\n酸菜白肉\n酸菜烤鱼\n酸奶玛芬\n酸奶沙冰\n酱猪肠粉\n酱焗扇贝\n酱拌黄瓜\n酥肉米线\n酥皮点心\n黑松露\n酸椰菜\n迷你蛋挞\n香辣美容蹄\n起司焗饭\n鸡脚汤\n豆豉蒸鱼\n豆花烤鱼\n豆腐近景\n豆腐白菜\n糯米粽\n焖茄子\n焖猪手\n豆渣面包\n豆沙春卷\n谷饲牛肉\n调味猪排\n调味牛肉\n诱惑披萨\n西芹虾仁\n西瓜冰沙\n墨西哥披萨\n新西兰牛排\n蜜酿酸奶\n蜜豆马芬\n蜜汁鸡块\n蜜汁南瓜\n蜂蜜松饼\n蛋白脆饼\n蛋白吐司\n鸡蛋炒馒头\n鸡蛋炒粉丝\n虾烧白菜\n虾烧卖皇\n虾炒白菜\n蘑菇面包\n蘑菇炖鸡\n蘑菇炖饭\n蘑菇炒饭\n薯泥沙拉\n山药糕\n薄荷奶昔\n薄脆饼干\n香蕉鸡蛋饼\n蔬菜炒面\n蓝莓蛋挞\n蒸糯米饭\n蒸太阳鱼\n蒸大元贝\n蒜香生蚝\n蒜香烤鱼\n蒜香意面\n蒜蓉蒸虾\n蒜苗炒肉\n葡萄蛋糕\n萝卜馄饨\n萝卜肉片\n萝卜火锅\n菠萝排骨\n菠萝奶昔\n菠菜蛋饼\n韭菜饺子馅\n菜面片汤\n菜花炒肉\n泡菜肥牛锅\n青菜肉馅饼\n芹菜肉饺子\n肉丸子\n菜素包子\n菜粒炒饭\n猪肝粥\n菜煮粉条\n煮年糕\n炒豆角
\n炒蚕豆\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"resnet50_vd_dishes\",\n    type=\"CV/image_classification\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\"ResNet50vd is an image classification model; this module is trained on a Baidu self-built dishes dataset.\",\n    version=\"1.1.0\")\nclass ResNet50vdDishes:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = 
os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (numpy.ndarray): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classfication results.\n        \"\"\"\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = 
predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_dishes/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/YpfRCe5lda0/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MTZ8fG5vb2RsZXxlbnwwfHx8fDE2NjU2MjQwMzA&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"resnet50_vd_dishes\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('麻辣烫' in data)\n        self.assertTrue(data['麻辣烫'] > 0.01)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('麻辣烫' in data)\n        self.assertTrue(data['麻辣烫'] > 0.01)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('麻辣烫' in data)\n        self.assertTrue(data['麻辣烫'] > 0.01)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n   
     if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.elementwise_add(x=short, y=conv1, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnet50_vd_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet50_vd_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet50_vd(nn.Layer):\n    
\"\"\"ResNet50_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNet50_vd, self).__init__()\n\n        self.layers = 50\n\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not 
None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet50_vd_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnet50_vd_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_imagenet_ssld/README.md",
    "content": "# resnet50_vd_imagenet_ssld\n\n|模型名称|resnet50_vd_imagenet_ssld|\n| :--- | :---: | \n|类别|图像-图像分类|\n|网络|ResNet_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|148MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ResNet-vd 其实就是 ResNet-D，是ResNet 原始结构的变种，可用于图像分类和特征提取。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install resnet50_vd_imagenet_ssld\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run resnet50_vd_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n\n    if __name__ == '__main__':\n\n        model = hub.Module(name='resnet50_vd_imagenet_ssld')\n        result = model.predict(['/PATH/TO/IMAGE'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用resnet50_vd_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n\n              flowers = Flowers(transforms)\n\n              flowers_validate = 
Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"resnet50_vd_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: workers的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n         
     import paddlehub as hub\n\n              if __name__ == '__main__':\n\n                  model = hub.Module(name='resnet50_vd_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['/PATH/TO/IMAGE'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m resnet50_vd_imagenet_ssld\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/resnet50_vd_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n    \n  升级为动态图版本\n\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_imagenet_ssld/README_en.md",
    "content": "# resnet50_vd_imagenet_ssld\n\n|Module Name|resnet50_vd_imagenet_ssld|\n| :--- | :---: | \n|Category |Image classification|\n|Network|ResNet_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or notFine-tuning|Yes|\n|Module Size|148MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information \n\n- ### Module Introduction\n\n  - ResNet-vd is a variant of ResNet, which can be used for image classification and feature extraction.\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install resnet50_vd_imagenet_ssld\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)   \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    ```shell\n    $ hub run resnet50_vd_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2、Prediction Code Example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n\n    if __name__ == '__main__':\n\n        model = hub.Module(name='resnet50_vd_imagenet_ssld')\n        result = model.predict(['/PATH/TO/IMAGE'])\n    ```\n- ### 3.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the resnet50_vd_imagenet_ssld model to fine-tune datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers) by executing `python train.py`.\n\n    - Steps:\n\n        - Step1: Define the data preprocessing method\n            - ```python\n              import paddlehub.vision.transforms as T\n\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n             - `transforms`: The data augmentation module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n\n        - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import Flowers\n\n              flowers = Flowers(transforms)\n\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: data preprocessing methods.\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n                * `hub.datasets.Flowers()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              model = hub.Module(name=\"resnet50_vd_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: model name.\n                * `label_list`: set the output classification category. Default is Imagenet2012 category.\n\n        - Step4: Optimization strategy\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Run configuration\n\n            - `Trainer` mainly controls the training of Fine-tune, including the following controllable parameters:\n\n                * `model`: Optimized model.\n                * `optimizer`: Optimizer selection.\n                * `use_vdl`: Whether to use vdl to visualize the training process.\n                * `checkpoint_dir`: The storage address of the model parameters.\n                * `compare_metrics`: The measurement index of the optimal model.\n\n            - `trainer.train` mainly controls the specific training process, including the following controllable parameters:\n\n                * `train_dataset`: Training dataset.\n                * `epochs`: Epochs of training process.\n                * `batch_size`: Batch size.\n                * `num_workers`: Number of workers.\n                * `eval_dataset`: Validation dataset.\n                * `log_interval`: The interval for printing logs.\n                * `save_interval`: The interval for saving 
model parameters.\n\n\n    - Model prediction\n\n        -   When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n\n              if __name__ == '__main__':\n\n                  model = hub.Module(name='resnet50_vd_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['/PATH/TO/IMAGE'])\n              ```\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m resnet50_vd_imagenet_ssld\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n    \n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n      - ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n\n        data = 
{'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/resnet50_vd_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n    \n  Upgrade to dynamic version\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_imagenet_ssld/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_imagenet_ssld/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nfrom paddle.nn import Conv2D, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2D, MaxPool2D, AvgPool2D\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n           
 bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n       
 y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic block for ResNet50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n        return y\n\n\n@moduleinfo(\n    name=\"resnet50_vd_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnet50_vd_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet 
dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNet50_vd(nn.Layer):\n    \"\"\"ResNet50_vd model.\"\"\"\n\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(ResNet50_vd, self).__init__()\n\n        self.layers = 50\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = MaxPool2D(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i 
== 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2D(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_0.w_0\"),\n            bias_attr=ParamAttr(name=\"fc_0.b_0\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnet50_vd_ssld.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images)\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/README.md",
    "content": "# resnet50_vd_wildanimals\n\n|模型名称|resnet50_vd_wildanimals|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet_vd|\n|数据集|IFAW 自建野生动物数据集|\n|是否支持Fine-tuning|否|\n|模型大小|92MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet-vd 其实就是 ResNet-D，是ResNet 原始结构的变种，可用于图像分类和特征提取。该 PaddleHub Module 采用百度自建野生动物数据集训练得到，支持'象牙制品','象牙', '大象', '虎皮', '老虎', '虎牙/虎爪/虎骨', '穿山甲甲片', '穿山甲', '穿山甲爪子', '其他' 这十个标签的识别。模型的详情可参考[论文](https://arxiv.org/pdf/1812.01187.pdf)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet50_vd_wildanimals\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet50_vd_wildanimals --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_wildanimals\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 
GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的野生动物及其制品类别，value 为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个野生动物及其制品识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m resnet50_vd_wildanimals\n    ```\n\n  - 这样就完成了一个野生动物及其制品识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet50_vd_wildanimals\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_wildanimals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/README_en.md",
    "content": "# resnet50_vd_wildanimals\n\n|Module Name|resnet50_vd_wildanimals|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet_vd|\n|Dataset|IFAW Wild Animal Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|92MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. ResNet-vd is a variant of ResNet. This module is based on ResNet_vd, trained on IFAW Wild Animal dataset, and can predict ten kinds of wild animal components.\n\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet50_vd_wildanimals\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet50_vd_wildanimals --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet50_vd_wildanimals\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n              
         paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict, where key is the label name and value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m resnet50_vd_wildanimals\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/resnet50_vd_wildanimals\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release 
Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install resnet50_vd_wildanimals==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = 
Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/label_list.txt",
    "content": "其他\n象牙制品\n象牙\n大象\n虎皮\n老虎\n虎牙/虎爪/虎骨\n穿山甲甲片\n穿山甲\n穿山甲爪子\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"resnet50_vd_wildanimals\",\n    type=\"CV/image_classification\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\n    \"ResNet50vd is an image classification model; this module is trained with IFAW's self-built wild animals dataset.\",\n    version=\"1.1.0\")\nclass ResNet50vdWildAnimals:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n 
           _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except Exception:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C].\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = 
predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of label.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indexs = np.argsort(result_i)[::-1][0:top_k]\n        for index in indexs:\n            label = label_list[index]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/resnet50_vd_wildanimals/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/J33o16cP0SA/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8Mnx8aXZvcnl8ZW58MHx8fHwxNjY1NTUwNjk4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"resnet50_vd_wildanimals\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('象牙' in data)\n        self.assertTrue(data['象牙'] > 0.2)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('象牙' in data)\n        self.assertTrue(data['象牙'] > 0.2)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('象牙' in data)\n        self.assertTrue(data['象牙'] > 0.2)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_101_imagenet/README.md",
    "content": "# resnet_v2_101_imagenet\n\n|模型名称|resnet_v2_101_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet V2 101|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|173MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。该PaddleHub Module结构为ResNet101，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet_v2_101_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet_v2_101_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_101_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n  修复python2中编码问题\n  - ```shell\n    $ hub install resnet_v2_101_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_101_imagenet/README_en.md",
    "content": "# resnet_v2_101_imagenet\n\n|Module Name|resnet_v2_101_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet V2 101|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|173MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. This module is based on ResNet101, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet_v2_101_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet_v2_101_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_101_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", 
value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix an encoding problem in Python 2\n  - ```shell\n    $ hub install resnet_v2_101_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_152_imagenet/README.md",
    "content": "# resnet_v2_152_imagenet\n\n|模型名称|resnet_v2_152_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet V2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|234MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。该PaddleHub Module结构为ResNet152，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet_v2_152_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet_v2_152_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_152_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n  修复python2中编码问题\n  - ```shell\n    $ hub install resnet_v2_152_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_152_imagenet/README_en.md",
    "content": "# resnet_v2_152_imagenet\n\n|Module Name|resnet_v2_152_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet V2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|234MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. This module is based on ResNet152, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet_v2_152_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet_v2_152_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_152_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value 
is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix an encoding problem in Python 2\n  - ```shell\n    $ hub install resnet_v2_152_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_18_imagenet/README.md",
    "content": "# resnet_v2_18_imagenet\n\n|模型名称|resnet_v2_18_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet V2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|46MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。该PaddleHub Module结构为ResNet18，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet_v2_18_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet_v2_18_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_18_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnet_v2_18_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_18_imagenet/README_en.md",
    "content": "# resnet_v2_18_imagenet\n\n|Module Name|resnet_v2_18_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet V2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|46MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. This module is based on ResNet18, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet_v2_18_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet_v2_18_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_18_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a 
list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnet_v2_18_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_34_imagenet/README.md",
    "content": "# resnet_v2_34_imagenet\n\n|模型名称|resnet_v2_34_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet V2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|85MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。该PaddleHub Module结构为ResNet34，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet_v2_34_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet_v2_34_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_34_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnet_v2_34_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_34_imagenet/README_en.md",
    "content": "# resnet_v2_34_imagenet\n\n|Module Name|resnet_v2_34_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNet V2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|85MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNet proposed a residual unit to solve the problem of training an extremely deep network, and improved the prediction accuracy of models. This module is based on ResNet34, trained on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet_v2_34_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet_v2_34_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_34_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a 
list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnet_v2_34_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_50_imagenet/README.md",
    "content": "# resnet_v2_50_imagenet\n\n|模型名称|resnet_v2_50_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNet V2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|99MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNet系列模型是图像分类领域的重要模型之一，模型中提出的残差单元有效地解决了深度网络训练困难的问题，通过增加模型的深度提升了模型的准确率。该PaddleHub Module结构为ResNet50，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnet_v2_50_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnet_v2_50_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_50_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n  修复python2中编码问题\n  - ```shell\n    $ hub install resnet_v2_50_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnet_v2_50_imagenet/README_en.md",
    "content": "# resnet_v2_50_imagenet\n\n|Module Name|resnet_v2_50_imagenet|\n| :--- | :---: |\n|Category |Image classification|\n|Network|ResNet V2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|99MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - This module utilizes ResNet50 structure and it is trained on ImageNet-2012.\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnet_v2_50_imagenet\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnet_v2_50_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnet_v2_50_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - Prediction API for classification.\n\n    - **Parameter**\n      - data (dict): Key is 'image'，value is the list of image path.\n\n    - **Return**\n      - result (list[dict]): The list of classification results，key is the prediction label, value is the corresponding confidence.\n\n\n\n\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n\n- 1.0.1\n\n  Fix encoding problem in python2\n\n  - ```shell\n    $ hub install resnet_v2_50_imagenet==1.0.1\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x16d_wsl/README.md",
    "content": "# resnext101_32x16d_wsl\n\n|模型名称|resnext101_32x16d_wsl|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_wsl|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|744MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 由于人工标注的数据集在规模上已经接近其函数极限，Facebook 的研发人员采用了一种独特的迁移学习研究，通过使用 hashtag 作为标注，在包含数十亿张社交媒体图片的数据集上进行训练，这为大规模训练转向弱监督学习(Weakly Supervised Learning) 取得了重大突破。在 ImageNet 图像识别基准上，ResNeXt101_32x16d_wsl 的 Top-1 达到了 84.24% 的准确率。该 PaddleHub Module结构为 ResNeXt101_32x16d_wsl，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_32x16d_wsl\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_32x16d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x16d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_32x16d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x16d_wsl/README_en.md",
    "content": "# resnext101_32x16d_wsl\n\n|Module Name|resnext101_32x16d_wsl|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_wsl|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|744MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - The scale of dataset annotated by people is close to limit, researchers in Facebook adopt a new method of transfer learning to train the network. They use hashtag to annotate images, and trained on billions of social images, then transfer to weakly supervised learning. The top-1 accuracy of ResNeXt101_32x16d_wsl on ImageNet reaches 84.24%. This module is based on ResNeXt101_32x16d_wsl, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_32x16d_wsl\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_32x16d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x16d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = 
classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_32x16d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x32d_wsl/README.md",
    "content": "# resnext101_32x32d_wsl\n\n|模型名称|resnext101_32x32d_wsl|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_wsl|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|1.8GB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 由于人工标注的数据集在规模上已经接近其函数极限，Facebook 的研发人员采用了一种独特的迁移学习研究，通过使用 hashtag 作为标注，在包含数十亿张社交媒体图片的数据集上进行训练，这为大规模训练转向弱监督学习(Weakly Supervised Learning) 取得了重大突破。在 ImageNet 图像识别基准上，ResNeXt101_32x32d_wsl 的 Top-1 达到了 84.97% 的准确率。该 PaddleHub Module结构为 ResNeXt101_32x32d_wsl，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_32x32d_wsl\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_32x32d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x32d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_32x32d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x32d_wsl/README_en.md",
    "content": "# resnext101_32x32d_wsl\n\n|Module Name|resnext101_32x32d_wsl|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_wsl|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|1.8GB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - The scale of dataset annotated by people is close to limit, researchers in Facebook adopt a new method of transfer learning to train the network. They use hashtag to annotate images, and trained on billions of social images, then transfer to weakly supervised learning. The top-1 accuracy of ResNeXt101_32x32d_wsl on ImageNet reaches 84.97%. This module is based on ResNeXt101_32x32d_wsl, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_32x32d_wsl\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_32x32d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x32d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = 
classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_32x32d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x48d_wsl/README.md",
    "content": "# resnext101_32x48d_wsl\n\n|模型名称|resnext101_32x48d_wsl|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_wsl|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|342MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 由于人工标注的数据集在规模上已经接近其函数极限，Facebook 的研发人员采用了一种独特的迁移学习研究，通过使用 hashtag 作为标注，在包含数十亿张社交媒体图片的数据集上进行训练，这为大规模训练转向弱监督学习(Weakly Supervised Learning) 取得了重大突破。在 ImageNet 图像识别基准上，ResNeXt101_32x48d_wsl 的 Top-1 达到了 85.4% 的准确率。该 PaddleHub Module结构为 ResNeXt101_32x48d_wsl，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_32x48d_wsl\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_32x48d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x48d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_32x48d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x48d_wsl/README_en.md",
    "content": "# resnext101_32x48d_wsl\n\n|Module Name|resnext101_32x48d_wsl|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_wsl|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|342MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - The scale of dataset annotated by people is close to limit, researchers in Facebook adopt a new method of transfer learning to train the network. They use hashtag to annotate images, and trained on billions of social images, then transfer to weakly supervised learning. The top-1 accuracy of ResNeXt101_32x48d_wsl on ImageNet reaches 85.4%. This module is based on ResNeXt101_32x48d_wsl, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_32x48d_wsl\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_32x48d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x48d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = 
classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_32x48d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x4d_imagenet/README.md",
    "content": "# resnext101_32x4d_imagenet\n\n|模型名称|resnext101_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|172MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext101_32x4d，表示 layers 为 101， 分支数为 32，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x4d_imagenet/README_en.md",
    "content": "# resnext101_32x4d_imagenet\n\n|Module Name|resnext101_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|172MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext101_32x4d, which denotes 101 layers ，32 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def 
classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt101.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext101_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext101_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt101_32x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt101_32x4d, self).__init__()\n\n        self.layers = 101\n        self.cardinality = 32\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n               
 self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext101_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext101_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x8d_wsl/README.md",
    "content": "# resnext101_32x8d_wsl\n\n|模型名称|resnext101_32x8d_wsl|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_wsl|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|317MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 由于人工标注的数据集在规模上已经接近其函数极限，Facebook 的研发人员采用了一种独特的迁移学习研究，通过使用 hashtag 作为标注，在包含数十亿张社交媒体图片的数据集上进行训练，这为大规模训练转向弱监督学习(Weakly Supervised Learning) 取得了重大突破。在 ImageNet 图像识别基准上，ResNeXt101_32x8d_wsl 的 Top-1 达到了 82.55% 的准确率。该 PaddleHub Module结构为 ResNeXt101_32x8d_wsl，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_32x8d_wsl\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_32x8d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x8d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_32x8d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_32x8d_wsl/README_en.md",
    "content": "# resnext101_32x8d_wsl\n\n|Module Name|resnext101_32x8d_wsl|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_wsl|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|317MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - The scale of dataset annotated by people is close to limit, researchers in Facebook adopt a new method of transfer learning to train the network. They use hashtag to annotate images, and trained on billions of social images, then transfer to weakly supervised learning. The top-1 accuracy of ResNeXt101_32x8d_wsl on ImageNet reaches 82.55%. This module is based on ResNeXt101_32x8d_wsl, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_32x8d_wsl\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_32x8d_wsl --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_32x8d_wsl\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    
```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_32x8d_wsl==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_64x4d_imagenet/README.md",
    "content": "# resnext101_64x4d_imagenet\n\n|模型名称|resnext101_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|322MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext101_64x4d，表示 layers 为 101， 分支数为 64，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext101_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_64x4d_imagenet/README_en.md",
    "content": "# resnext101_64x4d_imagenet\n\n|Module Name|resnext101_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|322MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext101_64x4d, which denotes 101 layers ，64 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def 
classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext101_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt101.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext101_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext101_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt101_64x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt101_64x4d, self).__init__()\n\n        self.layers = 101\n        self.cardinality = 64\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n              
  self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext101_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext101_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_32x4d_imagenet/README.md",
    "content": "# resnext101_vd_32x4d_imagenet\n\n|模型名称|resnext101_vd_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|172MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext101_vd_32x4d，表示 layers 为 101， 分支数为 32，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_vd_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_vd_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_vd_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install resnext101_vd_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_32x4d_imagenet/README_en.md",
    "content": "# resnext101_vd_32x4d_imagenet\n\n|Module Name|resnext101_vd_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|172MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext101_vd_32x4d, which denotes 101 layers ，32 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_vd_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_vd_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_vd_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - 
```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths.\n\n    - **Return**\n      - result (list[dict]): classification results. Each element in the list is a dict whose keys are the label names and whose values are the corresponding probabilities.\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install resnext101_vd_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext101_vd_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext101_vd_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt101_vd(nn.Layer):\n    \"\"\"ResNeXt101_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt101_vd, self).__init__()\n\n        self.layers = 101\n        self.cardinality = 32\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    
conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext101_vd_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext101_vd_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = 
self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_64x4d_imagenet/README.md",
    "content": "# resnext101_vd_64x4d_imagenet\n\n|模型名称|resnext101_vd_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|172MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext101_vd_64x4d，表示 layers 为 101， 分支数为 64，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext101_vd_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext101_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install resnext101_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_64x4d_imagenet/README_en.md",
    "content": "# resnext101_vd_64x4d_imagenet\n\n|Module Name|resnext101_vd_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|172MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext101_vd_64x4d_imagenet, which denotes 101 layers ，64 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext101_vd_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext101_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext101_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 
3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths.\n\n    - **Return**\n      - result (list[dict]): classification results. Each element in the list is a dict whose keys are the label names and whose values are the corresponding probabilities.\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install resnext101_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext101_vd_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext101_vd_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext101_vd_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt101_vd(nn.Layer):\n    \"\"\"ResNeXt101_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt101_vd, self).__init__()\n\n        self.layers = 101\n        self.cardinality = 64\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    
conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext101_vd_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext101_vd_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = 
self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext152_32x4d_imagenet/README.md",
    "content": "# resnext152_32x4d_imagenet\n\n|模型名称|resnext152_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|233MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext152_32x4d，表示 layers 为 152， 分支数为32，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext152_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext152_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext152_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext152_32x4d_imagenet/README_en.md",
    "content": "# resnext152_32x4d_imagenet\n\n|Module Name|resnext152_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|233MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext152_32x4d_imagenet, which denotes 152 layers ，32 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext152_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext152_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n  
  def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext152_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext152_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt152.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext152_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext152_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt152_32x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt152_32x4d, self).__init__()\n\n        self.layers = 152\n        self.cardinality = 32\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n               
 self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext152_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext152_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext152_64x4d_imagenet/README.md",
    "content": "# resnext152_64x4d_imagenet\n\n|模型名称|resnext152_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|444MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext152_64x4d，表示 layers 为 152， 分支数为64，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext152_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext152_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext152_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext152_64x4d_imagenet/README_en.md",
    "content": "# resnext152_64x4d_imagenet\n\n|Module Name|resnext152_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|444MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext152_64x4d_imagenet, which denotes 152 layers ，64 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext152_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext152_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - 
```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classication results, each element in the list is dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext152_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext152_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt152.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext152_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext152_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt152_64x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt152_64x4d, self).__init__()\n\n        self.layers = 152\n        self.cardinality = 64\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n              
  self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext152_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext152_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext152_vd_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt152_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext152_vd_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext152_vd_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt152_vd(nn.Layer):\n    \"\"\"ResNeXt152_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt152_vd, self).__init__()\n\n        self.layers = 152\n        self.cardinality = 32\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    
conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext152_vd_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext152_vd_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = 
self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext152_vd_64x4d_imagenet/README.md",
    "content": "# resnext152_vd_64x4d_imagenet\n\n|模型名称|resnext152_vd_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|444MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext152_vd_64x4d，表示 layers 为 152， 分支数为64，每个分支的输入输出 channels, 并采用了 3 个 3*3 的卷积核替代 ResNeXt152_64x4d 中第一个 7*7 的卷积核。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext152_vd_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext152_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install 
resnext152_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext152_vd_64x4d_imagenet/README_en.md",
    "content": "# resnext152_vd_64x4d_imagenet\n\n|Module Name|resnext152_vd_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|444MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt was proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext152_vd_64x4d_imagenet, which denotes 152 layers, 64 branches, and 4 input and output channels per branch. It was trained with weak supervision on billions of social-media images, fine-tuned on the ImageNet-2012 dataset, and accepts input images of size 224*224*3.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext152_vd_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext152_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext152_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 
3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose keys are label names and whose values are the corresponding probabilities\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext152_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
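Both READMEs above document `classification` as returning one dict per input image, mapping label names to probabilities. As a minimal sketch of consuming that structure (the `result` payload below is hypothetical example data, not real model output):

```python
# Hypothetical payload shaped like the documented return value of
# classifier.classification(): one dict per input image, mapping
# label name -> probability.
result = [
    {"tabby cat": 0.87, "tiger cat": 0.10, "lynx": 0.03},
    {"golden retriever": 0.65, "labrador retriever": 0.30, "kuvasz": 0.05},
]

def top1(per_image):
    """Return the (label, probability) pair with the highest score."""
    label = max(per_image, key=per_image.get)
    return label, per_image[label]

for preds in result:
    label, prob = top1(preds)
    print(f"{label}: {prob:.2f}")
```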
  {
    "path": "modules/image/classification/resnext152_vd_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt152_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext152_vd_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext152_vd_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt152_vd(nn.Layer):\n    \"\"\"ResNeXt152_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt152_vd, self).__init__()\n\n        self.layers = 152\n        self.cardinality = 64\n        depth = [3, 8, 36, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                if block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    
conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext152_vd_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext152_vd_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = 
self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
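The stage depths `[3, 8, 36, 3]` in `module.py` above are what make this the 152-layer variant: each entry counts `BottleneckBlock`s per stage, and every block contains three convolutions. A quick sanity check of that arithmetic (counting the stem conv and the final fc as the remaining two layers follows the usual ResNet/ResNeXt naming convention, which the module itself does not spell out):

```python
# Stage depths from module.py: BottleneckBlocks per stage.
depth = [3, 8, 36, 3]

blocks = sum(depth)           # 50 bottleneck blocks in total
convs_in_blocks = 3 * blocks  # each block: 1x1 -> grouped 3x3 -> 1x1

# Conventional naming adds the stem conv and the final fc layer
# on top of the block convolutions: 150 + 2 = 152.
named_layers = convs_in_blocks + 2
print(named_layers)  # 152
```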
  {
    "path": "modules/image/classification/resnext50_32x4d_imagenet/README.md",
    "content": "# resnext50_32x4d_imagenet\n\n|模型名称|resnext50_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|97MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext50_32x4d 表示 layers 为 50，分支数为 32，每个分支的输入输出 channels 为 4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext50_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext50_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image（str类型），value为待检测图片路径组成的list。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext50_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_32x4d_imagenet/README_en.md",
    "content": "# resnext50_32x4d_imagenet\n\n|Module Name|resnext50_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|97MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n  - ResNeXt was proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext50_32x4d, which denotes 50 layers, 32 branches, and 4 input and output channels per branch. It was trained with weak supervision on billions of social-media images, fine-tuned on the ImageNet-2012 dataset, and accepts input images of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext50_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext50_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def 
classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose keys are label names and whose values are the corresponding probabilities\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext50_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext50_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext50_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt50_32x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt50_32x4d, self).__init__()\n\n        self.layers = 50\n        self.cardinality = 32\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = 
Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext50_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext50_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
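A detail worth noticing in `module.py` above is the parameter-naming loop: stages are numbered from 2 (the stem is `conv1`), and blocks within a stage are lettered `a`, `b`, `c`, … via `chr(97 + i)`; these names must line up with the pretrained checkpoint's parameter names. Reproducing just that loop in isolation:

```python
# Reproduces the block-naming scheme from module.py for ResNeXt50:
# "res" + stage number (starting at 2) + block letter (a, b, c, ...).
depth = [3, 4, 6, 3]  # ResNeXt50 stage depths

names = []
for block in range(len(depth)):
    for i in range(depth[block]):
        names.append("res" + str(block + 2) + chr(97 + i))

print(names[:4])  # ['res2a', 'res2b', 'res2c', 'res3a']
print(len(names))  # 16
```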
  {
    "path": "modules/image/classification/resnext50_64x4d_imagenet/README.md",
    "content": "# resnext50_64x4d_imagenet\n\n|模型名称|resnext50_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|174MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext50_64x4d 表示 layers 为 50，分支数为 64，每个分支的输入输出 channels 为 4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext50_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext50_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image（str类型），value为待检测图片路径组成的list。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率。\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext50_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_64x4d_imagenet/README_en.md",
    "content": "# resnext50_64x4d_imagenet\n\n|Module Name|resnext50_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|174MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt was proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext50_64x4d_imagenet, which denotes 50 layers, 64 branches, and 4 input and output channels per branch. It was trained with weak supervision on billions of social-media images, fine-tuned on the ImageNet-2012 dataset, and accepts input images of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext50_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext50_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def 
classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose keys are label names and whose values are the corresponding probabilities\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext50_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            
param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=stride,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = 
paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext50_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext50_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt50_64x4d(nn.Layer):\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt50_64x4d, self).__init__()\n\n        self.layers = 50\n        self.cardinality = 64\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv = ConvBNLayer(num_channels=3, num_filters=64, filter_size=7, stride=2, act='relu', name=\"res_conv1\")\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n        self.pool2d_avg_channels = num_channels[-1] * 2\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = 
Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext50_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext50_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv(inputs)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_32x4d_imagenet/README.md",
    "content": "# resnext50_vd_32x4d_imagenet\n\n|模型名称|resnext50_vd_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|98MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext50_vd_32x4d，表示 layers 为 50， 分支数为 32，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext50_vd_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext50_vd_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_vd_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install resnext50_vd_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_32x4d_imagenet/README_en.md",
    "content": "# resnext50_vd_32x4d_imagenet\n\n|Module Name|resnext50_vd_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|98MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext50_vd_32x4d, which denotes 50 layers ，32 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext50_vd_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext50_vd_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_vd_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - 
```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install resnext50_vd_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_32x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext50_vd_32x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext50_vd_32x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt50_vd(nn.Layer):\n    \"\"\"ResNeXt50_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt50_vd, self).__init__()\n\n        self.layers = 50\n        self.cardinality = 32\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [128, 256, 512, 1024]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if i 
== 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext50_vd_32x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext50_vd_32x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_64x4d_imagenet/README.md",
    "content": "# resnext50_vd_64x4d_imagenet\n\n|模型名称|resnext50_vd_64x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ResNeXt_vd|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|175MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ResNeXt 是由 UC San Diego 和 Facebook AI 研究所于2017年提出的图像分类模型，模型沿袭了 VGG/ResNets 的堆叠思想，并采用 split-transform-merge 策略来增加网络的分支数。resnext50_vd_64x4d，表示 layers 为 50， 分支数为 64，每个分支的输入输出 channels 为4。该 PaddleHub Module 在包含数十亿张社交媒体图片的数据集上进行弱监督训练，并使用ImageNet-2012数据集finetune，接受输入图片大小为 224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install resnext50_vd_64x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run resnext50_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install resnext50_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_64x4d_imagenet/README_en.md",
    "content": "# resnext50_vd_64x4d_imagenet\n\n|Module Name|resnext50_vd_64x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ResNeXt_vd|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|175MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ResNeXt is proposed by UC San Diego and Facebook AI Research in 2017. This module is based on resnext50_vd_64x4d_imagenet, which denotes 50 layers ，64 branches，and the number of input and output branch channels is 4 in the network. It is weak-supervised trained on billions of socail images, finetuned on ImageNet-2012 dataset, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install resnext50_vd_64x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run resnext50_vd_64x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"resnext50_vd_64x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  
- ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install resnext50_vd_64x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/resnext50_vd_64x4d_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(\n            self,\n            num_channels: int,\n            num_filters: int,\n            filter_size: int,\n            stride: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2d(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        
else:\n            bn_name = \"bn\" + name[3:]\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Bottleneck Block for ResNeXt50_vd.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 cardinality: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            num_channels=num_channels, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        self.conv1 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            groups=cardinality,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n            filter_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 2 if cardinality == 32 else num_filters,\n                filter_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + 
\"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.elementwise_add(x=short, y=conv2, act='relu')\n        return y\n\n\n@moduleinfo(\n    name=\"resnext50_vd_64x4d_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"resnext50_vd_64x4d_imagenet is a classification model, \"\n    \"this module is trained with Baidu open sourced dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ResNeXt50_vd(nn.Layer):\n    \"\"\"ResNeXt50_vd model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ResNeXt50_vd, self).__init__()\n\n        self.layers = 50\n        self.cardinality = 64\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [256, 512, 1024, 2048]\n\n        self.conv1_1 = ConvBNLayer(num_channels=3, num_filters=32, filter_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = ConvBNLayer(num_channels=32, num_filters=32, filter_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = ConvBNLayer(num_channels=32, num_filters=64, filter_size=3, stride=1, act='relu', name=\"conv1_3\")\n\n        self.pool2d_max = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        self.block_list = []\n        for block in range(len(depth)):\n            shortcut = False\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    BottleneckBlock(\n                        num_channels=num_channels[block]\n                        if 
i == 0 else num_filters[block] * int(64 // self.cardinality),\n                        num_filters=num_filters[block],\n                        stride=2 if i == 0 and block != 0 else 1,\n                        cardinality=self.cardinality,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name))\n                self.block_list.append(bottleneck_block)\n                shortcut = True\n\n        self.pool2d_avg = AdaptiveAvgPool2d(1)\n\n        self.pool2d_avg_channels = num_channels[-1] * 2\n\n        stdv = 1.0 / math.sqrt(self.pool2d_avg_channels * 1.0)\n\n        self.out = Linear(\n            self.pool2d_avg_channels,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'resnext50_vd_64x4d_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/resnext50_vd_64x4d_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        for block in self.block_list:\n            y = block(y)\n        y = self.pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self.pool2d_avg_channels])\n        y = self.out(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_0_imagenet/README.md",
    "content": "# rexnet_1_0_imagenet\n\n|模型名称|rexnet_1_0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ReXNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|28MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ReXNet 由 NAVER AI Lab 提出的基于新的网络设计原则而设计的网络。作者针对现有网络中具有代表性的瓶颈问题，提出了一套设计原则，他们认为，常规设计会产生代表性瓶颈，这会影响模型性能。为了研究表征瓶颈，作者研究了由一万个随机网络生成的特征的矩阵秩。此外，还研究了整个层的通道配置以设计更准确的网络架构。最后，作者提出了一套简单有效的设计原则来缓解表征瓶颈。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install rexnet_1_0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run rexnet_1_0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='rexnet_1_0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用rexnet_1_0_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n       
       flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"rexnet_1_0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: worker的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='rexnet_1_0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m rexnet_1_0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images': [cv2_to_base64(org_im)], 'top_k': 2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/rexnet_1_0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom math import ceil\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddlehub.vision.transforms as T\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef conv_bn_act(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1, active=True, relu6=False):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    if active:\n        out.append(nn.ReLU6() if relu6 else nn.ReLU())\n\n\ndef conv_bn_swish(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    out.append(nn.Swish())\n\n\nclass SE(nn.Layer):\n    def __init__(self, in_channels, channels, se_ratio=12):\n        super(SE, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(\n            nn.Conv2D(in_channels, channels // se_ratio, kernel_size=1, padding=0),\n            nn.BatchNorm2D(channels // se_ratio), nn.ReLU(),\n            nn.Conv2D(channels // se_ratio, channels, kernel_size=1, padding=0), nn.Sigmoid())\n\n   
 def forward(self, x):\n        y = self.avg_pool(x)\n        y = self.fc(y)\n        return x * y\n\n\nclass LinearBottleneck(nn.Layer):\n    def __init__(self, in_channels, channels, t, stride, use_se=True, se_ratio=12, **kwargs):\n        super(LinearBottleneck, self).__init__(**kwargs)\n        self.use_shortcut = stride == 1 and in_channels <= channels\n        self.in_channels = in_channels\n        self.out_channels = channels\n\n        out = []\n        if t != 1:\n            dw_channels = in_channels * t\n            conv_bn_swish(out, in_channels=in_channels, channels=dw_channels)\n        else:\n            dw_channels = in_channels\n\n        conv_bn_act(\n            out,\n            in_channels=dw_channels,\n            channels=dw_channels,\n            kernel=3,\n            stride=stride,\n            pad=1,\n            num_group=dw_channels,\n            active=False)\n\n        if use_se:\n            out.append(SE(dw_channels, dw_channels, se_ratio))\n\n        out.append(nn.ReLU6())\n        conv_bn_act(out, in_channels=dw_channels, channels=channels, active=False, relu6=True)\n        self.out = nn.Sequential(*out)\n\n    def forward(self, x):\n        out = self.out(x)\n        if self.use_shortcut:\n            out[:, 0:self.in_channels] += x\n\n        return out\n\n\n@moduleinfo(\n    name=\"rexnet_1_0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"rexnet_1_0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass ReXNetV1(nn.Layer):\n    def __init__(self,\n                 label_list: list = None,\n                 load_checkpoint: str = None,\n                 input_ch=16,\n                 final_ch=180,\n                 width_mult=1.0,\n                 depth_mult=1.0,\n                 class_dim=1000,\n                 use_se=True,\n                 
se_ratio=12,\n                 dropout_ratio=0.2,\n                 bn_momentum=0.9):\n\n        super(ReXNetV1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            with open(label_file) as files:\n                for line in files.readlines():\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        layers = [1, 2, 2, 3, 3, 5]\n        strides = [1, 2, 2, 2, 1, 2]\n        use_ses = [False, False, True, True, True, True]\n\n        layers = [ceil(element * depth_mult) for element in layers]\n        strides = sum([[element] + [1] * (layers[idx] - 1) for idx, element in enumerate(strides)], [])\n        if use_se:\n            use_ses = sum([[element] * layers[idx] for idx, element in enumerate(use_ses)], [])\n        else:\n            use_ses = [False] * sum(layers[:])\n        ts = [1] * layers[0] + [6] * sum(layers[1:])\n\n        self.depth = sum(layers[:]) * 3\n        stem_channel = 32 / width_mult if width_mult < 1.0 else 32\n        inplanes = input_ch / width_mult if width_mult < 1.0 else input_ch\n\n        features = []\n        in_channels_group = []\n        channels_group = []\n\n        # The following channel configuration is a simple instance to make each layer become an expand layer.\n        for i in range(self.depth // 3):\n            if i == 0:\n                in_channels_group.append(int(round(stem_channel * width_mult)))\n                channels_group.append(int(round(inplanes * width_mult)))\n            else:\n                in_channels_group.append(int(round(inplanes * width_mult)))\n                inplanes += final_ch / (self.depth // 3 * 1.0)\n                channels_group.append(int(round(inplanes * width_mult)))\n\n        conv_bn_swish(features, 3, int(round(stem_channel * width_mult)), kernel=3, stride=2, pad=1)\n\n        for block_idx, (in_c, c, t, s, se) in enumerate(zip(in_channels_group, channels_group, ts, strides, use_ses)):\n            features.append(LinearBottleneck(in_channels=in_c, channels=c, t=t, stride=s, use_se=se, se_ratio=se_ratio))\n\n        pen_channels = int(1280 * width_mult)\n        conv_bn_swish(features, channels_group[-1], pen_channels)\n\n        features.append(nn.AdaptiveAvgPool2D(1))\n        self.features = nn.Sequential(*features)\n        self.output = nn.Sequential(nn.Dropout(dropout_ratio), nn.Conv2D(pen_channels, class_dim, 1, bias_attr=True))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        feat = self.features(x)\n        x = self.output(feat).squeeze(axis=-1).squeeze(axis=-1)\n        return x, feat\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_3_imagenet/README.md",
    "content": "# rexnet_1_3_imagenet\n\n|模型名称|rexnet_1_3_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ReXNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|44MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ReXNet 是 NAVER AI Lab 基于一套新的网络设计原则设计的网络。作者认为，常规的网络设计会产生表征瓶颈，从而影响模型性能。为了研究表征瓶颈，作者分析了由一万个随机网络生成的特征的矩阵秩，并研究了整个网络各层的通道配置，以设计更准确的网络架构。最后，作者提出了一套简单有效的设计原则来缓解表征瓶颈。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install rexnet_1_3_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run rexnet_1_3_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='rexnet_1_3_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用rexnet_1_3_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换为自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"rexnet_1_3_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的工作进程数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数；\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='rexnet_1_3_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m rexnet_1_3_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如使用CPU预测则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/rexnet_1_3_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_3_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_3_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom math import ceil\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddlehub.vision.transforms as T\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef conv_bn_act(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1, active=True, relu6=False):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    if active:\n        out.append(nn.ReLU6() if relu6 else nn.ReLU())\n\n\ndef conv_bn_swish(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    out.append(nn.Swish())\n\n\nclass SE(nn.Layer):\n    def __init__(self, in_channels, channels, se_ratio=12):\n        super(SE, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(\n            nn.Conv2D(in_channels, channels // se_ratio, kernel_size=1, padding=0),\n            nn.BatchNorm2D(channels // se_ratio), nn.ReLU(),\n            nn.Conv2D(channels // se_ratio, channels, kernel_size=1, padding=0), nn.Sigmoid())\n\n   
 def forward(self, x):\n        y = self.avg_pool(x)\n        y = self.fc(y)\n        return x * y\n\n\nclass LinearBottleneck(nn.Layer):\n    def __init__(self, in_channels, channels, t, stride, use_se=True, se_ratio=12, **kwargs):\n        super(LinearBottleneck, self).__init__(**kwargs)\n        self.use_shortcut = stride == 1 and in_channels <= channels\n        self.in_channels = in_channels\n        self.out_channels = channels\n\n        out = []\n        if t != 1:\n            dw_channels = in_channels * t\n            conv_bn_swish(out, in_channels=in_channels, channels=dw_channels)\n        else:\n            dw_channels = in_channels\n\n        conv_bn_act(\n            out,\n            in_channels=dw_channels,\n            channels=dw_channels,\n            kernel=3,\n            stride=stride,\n            pad=1,\n            num_group=dw_channels,\n            active=False)\n\n        if use_se:\n            out.append(SE(dw_channels, dw_channels, se_ratio))\n\n        out.append(nn.ReLU6())\n        conv_bn_act(out, in_channels=dw_channels, channels=channels, active=False, relu6=True)\n        self.out = nn.Sequential(*out)\n\n    def forward(self, x):\n        out = self.out(x)\n        if self.use_shortcut:\n            out[:, 0:self.in_channels] += x\n\n        return out\n\n\n@moduleinfo(\n    name=\"rexnet_1_3_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"rexnet_1_3_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass ReXNetV1(nn.Layer):\n    def __init__(self,\n                 label_list: list = None,\n                 load_checkpoint: str = None,\n                 input_ch=16,\n                 final_ch=180,\n                 width_mult=1.3,\n                 depth_mult=1.0,\n                 class_dim=1000,\n                 use_se=True,\n                 
se_ratio=12,\n                 dropout_ratio=0.2,\n                 bn_momentum=0.9):\n\n        super(ReXNetV1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # Use a context manager so the label file is closed after reading.\n            with open(label_file) as files:\n                for line in files:\n                    label_list.append(line.strip('\\n'))\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        layers = [1, 2, 2, 3, 3, 5]\n        strides = [1, 2, 2, 2, 1, 2]\n        use_ses = [False, False, True, True, True, True]\n\n        layers = [ceil(element * depth_mult) for element in layers]\n        strides = sum([[element] + [1] * (layers[idx] - 1) for idx, element in enumerate(strides)], [])\n        if use_se:\n            use_ses = sum([[element] * layers[idx] for idx, element in enumerate(use_ses)], [])\n        else:\n            use_ses = [False] * sum(layers[:])\n        ts = [1] * layers[0] + [6] * sum(layers[1:])\n\n        self.depth = sum(layers[:]) * 3\n        stem_channel = 32 / width_mult if width_mult < 1.0 else 32\n        inplanes = input_ch / width_mult if width_mult < 1.0 else input_ch\n\n        features = []\n        in_channels_group = []\n        channels_group = []\n\n        # The following channel configuration is a simple instance to make each layer become an expand layer.\n        for i in range(self.depth // 3):\n            if i == 0:\n                in_channels_group.append(int(round(stem_channel * width_mult)))\n                channels_group.append(int(round(inplanes * width_mult)))\n            else:\n                in_channels_group.append(int(round(inplanes * width_mult)))\n                inplanes += final_ch / (self.depth // 3 * 1.0)\n                channels_group.append(int(round(inplanes * width_mult)))\n\n
        conv_bn_swish(features, 3, int(round(stem_channel * width_mult)), kernel=3, stride=2, pad=1)\n\n        for block_idx, (in_c, c, t, s, se) in enumerate(zip(in_channels_group, channels_group, ts, strides, use_ses)):\n            features.append(LinearBottleneck(in_channels=in_c, channels=c, t=t, stride=s, use_se=se, se_ratio=se_ratio))\n\n        pen_channels = int(1280 * width_mult)\n        conv_bn_swish(features, c, pen_channels)\n\n        features.append(nn.AdaptiveAvgPool2D(1))\n        self.features = nn.Sequential(*features)\n        self.output = nn.Sequential(nn.Dropout(dropout_ratio), nn.Conv2D(pen_channels, class_dim, 1, bias_attr=True))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        feat = self.features(x)\n        x = self.output(feat).squeeze(axis=-1).squeeze(axis=-1)\n        return x, feat\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_5_imagenet/README.md",
    "content": "# rexnet_1_5_imagenet\n\n|Module Name|rexnet_1_5_imagenet|\n| :--- | :---: |\n|Category|Image / Image Classification|\n|Network|ReXNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported|Yes|\n|Module Size|57MB|\n|Metrics|-|\n|Latest update date|2021-09-14|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - ReXNet is a network designed by NAVER AI Lab following a new set of network design principles. The authors argue that conventional designs produce a representational bottleneck, common in existing networks, that degrades model performance. To study this bottleneck, they examined the matrix rank of the features generated by ten thousand random networks, and further investigated the channel configuration across layers in order to design more accurate architectures. Finally, they propose a set of simple yet effective design principles to mitigate the representational bottleneck.\n\n\n## II. Installation\n\n- ### 1. Environment dependencies\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2. Installation\n    - ```shell\n      $ hub install rexnet_1_5_imagenet\n      ```\n\n    -  If you encounter problems during installation, see: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## III. Module API Prediction\n\n- ### 1. Command-line prediction\n\n    ```shell\n    $ hub run rexnet_1_5_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2. Prediction code example\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='rexnet_1_5_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3. How to start Fine-tuning\n\n    - After installing PaddlePaddle and PaddleHub, run `python train.py` to start Fine-tuning rexnet_1_5_imagenet on datasets such as [Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers).\n\n    - Steps\n\n        - Step 1: Define the data preprocessing\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - The `transforms` module offers a rich set of data preprocessing operations; replace them according to your needs.\n\n        - Step 2: Download and use the dataset\n            - ```python\n              from paddlehub.datasets import Flowers\n
              flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: data preprocessing.\n                * `mode`: dataset mode, one of `train`, `test`, `val`; defaults to `train`.\n\n                * See [flowers.py](../../paddlehub/datasets/flowers.py) for the dataset preparation code. `hub.datasets.Flowers()` automatically downloads the dataset and extracts it to the `$HOME/.paddlehub/dataset` directory.\n\n\n        - Step 3: Load the pre-trained model\n\n            - ```python\n              model = hub.Module(name=\"rexnet_1_5_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: name of the pre-trained model.\n                * `label_list`: output classification labels; defaults to the ImageNet-2012 labels.\n\n        - Step 4: Choose the optimization strategy and runtime configuration\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - Runtime configuration\n\n            - `Trainer` controls the Fine-tuning process with the following configurable parameters:\n\n                * `model`: the model to optimize;\n                * `optimizer`: the optimizer;\n                * `use_vdl`: whether to visualize training with VisualDL;\n                * `checkpoint_dir`: directory in which model parameters are saved;\n                * `compare_metrics`: metric used to select the best model;\n\n            - `trainer.train` controls the training process with the following configurable parameters:\n\n                * `train_dataset`: training dataset;\n                * `epochs`: number of training epochs;\n                * `batch_size`: training batch size; on GPU, adjust it to the available memory;\n                * `num_workers`: number of workers, defaults to 0;\n                * `eval_dataset`: validation dataset;\n                * `log_interval`: logging interval, in number of training batches.\n                * `save_interval`: model saving interval, in number of training epochs.\n\n    - Model prediction\n\n        -   After Fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for Fine-tuning. We use this model for prediction. A predict.py script looks like this:\n\n
            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='rexnet_1_5_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** For prediction, the module, checkpoint_dir, and dataset must match those used for Fine-tuning.\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online classification service.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m rexnet_1_5_imagenet\n      ```\n\n    - This deploys the classification service API, listening on port 8866 by default.\n\n    - **NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it is not needed.\n\n- ### Step 2: Send a prediction request\n\n    - With the server running, the following lines of code send a prediction request and fetch the result\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # Send the HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/rexnet_1_5_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n## V. Release Note\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_5_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/rexnet_1_5_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom math import ceil\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddlehub.vision.transforms as T\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef conv_bn_act(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1, active=True, relu6=False):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    if active:\n        out.append(nn.ReLU6() if relu6 else nn.ReLU())\n\n\ndef conv_bn_swish(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    out.append(nn.Swish())\n\n\nclass SE(nn.Layer):\n    def __init__(self, in_channels, channels, se_ratio=12):\n        super(SE, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(\n            nn.Conv2D(in_channels, channels // se_ratio, kernel_size=1, padding=0),\n            nn.BatchNorm2D(channels // se_ratio), nn.ReLU(),\n            nn.Conv2D(channels // se_ratio, channels, kernel_size=1, padding=0), nn.Sigmoid())\n\n   
 def forward(self, x):\n        y = self.avg_pool(x)\n        y = self.fc(y)\n        return x * y\n\n\nclass LinearBottleneck(nn.Layer):\n    def __init__(self, in_channels, channels, t, stride, use_se=True, se_ratio=12, **kwargs):\n        super(LinearBottleneck, self).__init__(**kwargs)\n        self.use_shortcut = stride == 1 and in_channels <= channels\n        self.in_channels = in_channels\n        self.out_channels = channels\n\n        out = []\n        if t != 1:\n            dw_channels = in_channels * t\n            conv_bn_swish(out, in_channels=in_channels, channels=dw_channels)\n        else:\n            dw_channels = in_channels\n\n        conv_bn_act(\n            out,\n            in_channels=dw_channels,\n            channels=dw_channels,\n            kernel=3,\n            stride=stride,\n            pad=1,\n            num_group=dw_channels,\n            active=False)\n\n        if use_se:\n            out.append(SE(dw_channels, dw_channels, se_ratio))\n\n        out.append(nn.ReLU6())\n        conv_bn_act(out, in_channels=dw_channels, channels=channels, active=False, relu6=True)\n        self.out = nn.Sequential(*out)\n\n    def forward(self, x):\n        out = self.out(x)\n        if self.use_shortcut:\n            out[:, 0:self.in_channels] += x\n\n        return out\n\n\n@moduleinfo(\n    name=\"rexnet_1_5_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"rexnet_1_5_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass ReXNetV1(nn.Layer):\n    def __init__(self,\n                 label_list: list = None,\n                 load_checkpoint: str = None,\n                 input_ch=16,\n                 final_ch=180,\n                 width_mult=1.5,\n                 depth_mult=1.0,\n                 class_dim=1000,\n                 use_se=True,\n                 
se_ratio=12,\n                 dropout_ratio=0.2,\n                 bn_momentum=0.9):\n\n        super(ReXNetV1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        layers = [1, 2, 2, 3, 3, 5]\n        strides = [1, 2, 2, 2, 1, 2]\n        use_ses = [False, False, True, True, True, True]\n\n        layers = [ceil(element * depth_mult) for element in layers]\n        strides = sum([[element] + [1] * (layers[idx] - 1) for idx, element in enumerate(strides)], [])\n        if use_se:\n            use_ses = sum([[element] * layers[idx] for idx, element in enumerate(use_ses)], [])\n        else:\n            use_ses = [False] * sum(layers[:])\n        ts = [1] * layers[0] + [6] * sum(layers[1:])\n\n        self.depth = sum(layers[:]) * 3\n        stem_channel = 32 / width_mult if width_mult < 1.0 else 32\n        inplanes = input_ch / width_mult if width_mult < 1.0 else input_ch\n\n        features = []\n        in_channels_group = []\n        channels_group = []\n\n        # The following channel configuration is a simple instance to make each layer become an expand layer.\n        for i in range(self.depth // 3):\n            if i == 0:\n                in_channels_group.append(int(round(stem_channel * width_mult)))\n                channels_group.append(int(round(inplanes * width_mult)))\n            else:\n                in_channels_group.append(int(round(inplanes * width_mult)))\n                inplanes += final_ch / (self.depth // 3 * 1.0)\n                channels_group.append(int(round(inplanes * width_mult)))\n\n 
       conv_bn_swish(features, 3, int(round(stem_channel * width_mult)), kernel=3, stride=2, pad=1)\n\n        for block_idx, (in_c, c, t, s, se) in enumerate(zip(in_channels_group, channels_group, ts, strides, use_ses)):\n            features.append(LinearBottleneck(in_channels=in_c, channels=c, t=t, stride=s, use_se=se, se_ratio=se_ratio))\n\n        pen_channels = int(1280 * width_mult)\n        conv_bn_swish(features, c, pen_channels)\n\n        features.append(nn.AdaptiveAvgPool2D(1))\n        self.features = nn.Sequential(*features)\n        self.output = nn.Sequential(nn.Dropout(dropout_ratio), nn.Conv2D(pen_channels, class_dim, 1, bias_attr=True))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        feat = self.features(x)\n        x = self.output(feat).squeeze(axis=-1).squeeze(axis=-1)\n        return x, feat\n"
  },
  {
    "path": "modules/image/classification/rexnet_2_0_imagenet/README.md",
    "content": "# rexnet_2_0_imagenet\n\n|模型名称|rexnet_2_0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ReXNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|95MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ReXNet 由 NAVER AI Lab 提出的基于新的网络设计原则而设计的网络。作者针对现有网络中具有代表性的瓶颈问题，提出了一套设计原则，他们认为，常规设计会产生代表性瓶颈，这会影响模型性能。为了研究表征瓶颈，作者研究了由一万个随机网络生成的特征的矩阵秩。此外，还研究了整个层的通道配置以设计更准确的网络架构。最后，作者提出了一套简单有效的设计原则来缓解表征瓶颈。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install rexnet_2_0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run rexnet_2_0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='rexnet_2_0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用rexnet_2_0_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n       
       flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"rexnet_2_0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='rexnet_2_0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m rexnet_2_0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/rexnet_2_0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data =r.json()[\"results\"]['data']\n        ```\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/rexnet_2_0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/rexnet_2_0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom math import ceil\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddlehub.vision.transforms as T\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef conv_bn_act(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1, active=True, relu6=False):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    if active:\n        out.append(nn.ReLU6() if relu6 else nn.ReLU())\n\n\ndef conv_bn_swish(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    out.append(nn.Swish())\n\n\nclass SE(nn.Layer):\n    def __init__(self, in_channels, channels, se_ratio=12):\n        super(SE, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(\n            nn.Conv2D(in_channels, channels // se_ratio, kernel_size=1, padding=0),\n            nn.BatchNorm2D(channels // se_ratio), nn.ReLU(),\n            nn.Conv2D(channels // se_ratio, channels, kernel_size=1, padding=0), nn.Sigmoid())\n\n   
 def forward(self, x):\n        y = self.avg_pool(x)\n        y = self.fc(y)\n        return x * y\n\n\nclass LinearBottleneck(nn.Layer):\n    def __init__(self, in_channels, channels, t, stride, use_se=True, se_ratio=12, **kwargs):\n        super(LinearBottleneck, self).__init__(**kwargs)\n        self.use_shortcut = stride == 1 and in_channels <= channels\n        self.in_channels = in_channels\n        self.out_channels = channels\n\n        out = []\n        if t != 1:\n            dw_channels = in_channels * t\n            conv_bn_swish(out, in_channels=in_channels, channels=dw_channels)\n        else:\n            dw_channels = in_channels\n\n        conv_bn_act(\n            out,\n            in_channels=dw_channels,\n            channels=dw_channels,\n            kernel=3,\n            stride=stride,\n            pad=1,\n            num_group=dw_channels,\n            active=False)\n\n        if use_se:\n            out.append(SE(dw_channels, dw_channels, se_ratio))\n\n        out.append(nn.ReLU6())\n        conv_bn_act(out, in_channels=dw_channels, channels=channels, active=False, relu6=True)\n        self.out = nn.Sequential(*out)\n\n    def forward(self, x):\n        out = self.out(x)\n        if self.use_shortcut:\n            out[:, 0:self.in_channels] += x\n\n        return out\n\n\n@moduleinfo(\n    name=\"rexnet_2_0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"rexnet_2_0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass ReXNetV1(nn.Layer):\n    def __init__(self,\n                 label_list: list = None,\n                 load_checkpoint: str = None,\n                 input_ch=16,\n                 final_ch=180,\n                 width_mult=2.0,\n                 depth_mult=1.0,\n                 class_dim=1000,\n                 use_se=True,\n                 
se_ratio=12,\n                 dropout_ratio=0.2,\n                 bn_momentum=0.9):\n\n        super(ReXNetV1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        layers = [1, 2, 2, 3, 3, 5]\n        strides = [1, 2, 2, 2, 1, 2]\n        use_ses = [False, False, True, True, True, True]\n\n        layers = [ceil(element * depth_mult) for element in layers]\n        strides = sum([[element] + [1] * (layers[idx] - 1) for idx, element in enumerate(strides)], [])\n        if use_se:\n            use_ses = sum([[element] * layers[idx] for idx, element in enumerate(use_ses)], [])\n        else:\n            use_ses = [False] * sum(layers[:])\n        ts = [1] * layers[0] + [6] * sum(layers[1:])\n\n        self.depth = sum(layers[:]) * 3\n        stem_channel = 32 / width_mult if width_mult < 1.0 else 32\n        inplanes = input_ch / width_mult if width_mult < 1.0 else input_ch\n\n        features = []\n        in_channels_group = []\n        channels_group = []\n\n        # The following channel configuration is a simple instance to make each layer become an expand layer.\n        for i in range(self.depth // 3):\n            if i == 0:\n                in_channels_group.append(int(round(stem_channel * width_mult)))\n                channels_group.append(int(round(inplanes * width_mult)))\n            else:\n                in_channels_group.append(int(round(inplanes * width_mult)))\n                inplanes += final_ch / (self.depth // 3 * 1.0)\n                channels_group.append(int(round(inplanes * width_mult)))\n\n 
       conv_bn_swish(features, 3, int(round(stem_channel * width_mult)), kernel=3, stride=2, pad=1)\n\n        for block_idx, (in_c, c, t, s, se) in enumerate(zip(in_channels_group, channels_group, ts, strides, use_ses)):\n            features.append(LinearBottleneck(in_channels=in_c, channels=c, t=t, stride=s, use_se=se, se_ratio=se_ratio))\n\n        pen_channels = int(1280 * width_mult)\n        conv_bn_swish(features, c, pen_channels)\n\n        features.append(nn.AdaptiveAvgPool2D(1))\n        self.features = nn.Sequential(*features)\n        self.output = nn.Sequential(nn.Dropout(dropout_ratio), nn.Conv2D(pen_channels, class_dim, 1, bias_attr=True))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        feat = self.features(x)\n        x = self.output(feat).squeeze(axis=-1).squeeze(axis=-1)\n        return x, feat\n"
  },
  {
    "path": "modules/image/classification/rexnet_3_0_imagenet/README.md",
    "content": "# rexnet_3_0_imagenet\n\n|模型名称|rexnet_3_0_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ReXNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|200MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ReXNet 由 NAVER AI Lab 提出的基于新的网络设计原则而设计的网络。作者针对现有网络中具有代表性的瓶颈问题，提出了一套设计原则，他们认为，常规设计会产生代表性瓶颈，这会影响模型性能。为了研究表征瓶颈，作者研究了由一万个随机网络生成的特征的矩阵秩。此外，还研究了整个层的通道配置以设计更准确的网络架构。最后，作者提出了一套简单有效的设计原则来缓解表征瓶颈。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install rexnet_3_0_imagenet\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run rexnet_3_0_imagenet --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='rexnet_3_0_imagenet')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用rexnet_3_0_imagenet对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n      
        flowers = Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`， 默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"rexnet_3_0_imagenet\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: works的数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。 
我们使用该模型来进行预测。predict.py脚本如下：\n\n            - ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='rexnet_3_0_imagenet', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m rexnet_3_0_imagenet\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            # tostring已废弃，改用tobytes\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            # fromstring已废弃，改用frombuffer\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/rexnet_3_0_imagenet\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/rexnet_3_0_imagenet/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/rexnet_3_0_imagenet/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom math import ceil\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nimport paddlehub.vision.transforms as T\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef conv_bn_act(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1, active=True, relu6=False):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    if active:\n        out.append(nn.ReLU6() if relu6 else nn.ReLU())\n\n\ndef conv_bn_swish(out, in_channels, channels, kernel=1, stride=1, pad=0, num_group=1):\n    out.append(nn.Conv2D(in_channels, channels, kernel, stride, pad, groups=num_group, bias_attr=False))\n    out.append(nn.BatchNorm2D(channels))\n    out.append(nn.Swish())\n\n\nclass SE(nn.Layer):\n    def __init__(self, in_channels, channels, se_ratio=12):\n        super(SE, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(\n            nn.Conv2D(in_channels, channels // se_ratio, kernel_size=1, padding=0),\n            nn.BatchNorm2D(channels // se_ratio), nn.ReLU(),\n            nn.Conv2D(channels // se_ratio, channels, kernel_size=1, padding=0), nn.Sigmoid())\n\n   
 def forward(self, x):\n        y = self.avg_pool(x)\n        y = self.fc(y)\n        return x * y\n\n\nclass LinearBottleneck(nn.Layer):\n    def __init__(self, in_channels, channels, t, stride, use_se=True, se_ratio=12, **kwargs):\n        super(LinearBottleneck, self).__init__(**kwargs)\n        self.use_shortcut = stride == 1 and in_channels <= channels\n        self.in_channels = in_channels\n        self.out_channels = channels\n\n        out = []\n        if t != 1:\n            dw_channels = in_channels * t\n            conv_bn_swish(out, in_channels=in_channels, channels=dw_channels)\n        else:\n            dw_channels = in_channels\n\n        conv_bn_act(\n            out,\n            in_channels=dw_channels,\n            channels=dw_channels,\n            kernel=3,\n            stride=stride,\n            pad=1,\n            num_group=dw_channels,\n            active=False)\n\n        if use_se:\n            out.append(SE(dw_channels, dw_channels, se_ratio))\n\n        out.append(nn.ReLU6())\n        conv_bn_act(out, in_channels=dw_channels, channels=channels, active=False, relu6=True)\n        self.out = nn.Sequential(*out)\n\n    def forward(self, x):\n        out = self.out(x)\n        if self.use_shortcut:\n            out[:, 0:self.in_channels] += x\n\n        return out\n\n\n@moduleinfo(\n    name=\"rexnet_3_0_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"rexnet_3_0_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass ReXNetV1(nn.Layer):\n    def __init__(self,\n                 label_list: list = None,\n                 load_checkpoint: str = None,\n                 input_ch=16,\n                 final_ch=180,\n                 width_mult=3.0,\n                 depth_mult=1.0,\n                 class_dim=1000,\n                 use_se=True,\n                 
se_ratio=12,\n                 dropout_ratio=0.2,\n                 bn_momentum=0.9):\n\n        super(ReXNetV1, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            # 使用上下文管理器读取标签文件，确保文件句柄被关闭\n            with open(label_file) as files:\n                for line in files:\n                    line = line.strip('\\n')\n                    label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        layers = [1, 2, 2, 3, 3, 5]\n        strides = [1, 2, 2, 2, 1, 2]\n        use_ses = [False, False, True, True, True, True]\n\n        layers = [ceil(element * depth_mult) for element in layers]\n        strides = sum([[element] + [1] * (layers[idx] - 1) for idx, element in enumerate(strides)], [])\n        if use_se:\n            use_ses = sum([[element] * layers[idx] for idx, element in enumerate(use_ses)], [])\n        else:\n            use_ses = [False] * sum(layers[:])\n        ts = [1] * layers[0] + [6] * sum(layers[1:])\n\n        self.depth = sum(layers[:]) * 3\n        stem_channel = 32 / width_mult if width_mult < 1.0 else 32\n        inplanes = input_ch / width_mult if width_mult < 1.0 else input_ch\n\n        features = []\n        in_channels_group = []\n        channels_group = []\n\n        # The following channel configuration is a simple instance to make each layer become an expand layer.\n        for i in range(self.depth // 3):\n            if i == 0:\n                in_channels_group.append(int(round(stem_channel * width_mult)))\n                channels_group.append(int(round(inplanes * width_mult)))\n            else:\n                in_channels_group.append(int(round(inplanes * width_mult)))\n                inplanes += final_ch / (self.depth // 3 * 1.0)\n                channels_group.append(int(round(inplanes * width_mult)))\n\n 
       conv_bn_swish(features, 3, int(round(stem_channel * width_mult)), kernel=3, stride=2, pad=1)\n\n        for block_idx, (in_c, c, t, s, se) in enumerate(zip(in_channels_group, channels_group, ts, strides, use_ses)):\n            features.append(LinearBottleneck(in_channels=in_c, channels=c, t=t, stride=s, use_se=se, se_ratio=se_ratio))\n\n        pen_channels = int(1280 * width_mult)\n        conv_bn_swish(features, c, pen_channels)\n\n        features.append(nn.AdaptiveAvgPool2D(1))\n        self.features = nn.Sequential(*features)\n        self.output = nn.Sequential(nn.Dropout(dropout_ratio), nn.Conv2D(pen_channels, class_dim, 1, bias_attr=True))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, x):\n        feat = self.features(x)\n        x = self.output(feat).squeeze(axis=-1).squeeze(axis=-1)\n        return x, feat\n"
  },
  {
    "path": "modules/image/classification/se_hrnet64_imagenet_ssld/README.md",
    "content": "# se_hrnet64_imagenet_ssld\n\n|模型名称|se_hrnet64_imagenet_ssld|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|HRNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|是|\n|模型大小|493MB|\n|指标|-|\n|最新更新日期|2021-09-14|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - HRNet是微软亚洲研究院在2019年提出的全新神经网络。与之前的卷积神经网络不同，这个网络在网络的深层依然可以保持高分辨率，所以预测的关键点的热图更加准确，而且在空间上也更加准确。此外，该网络在其他对分辨率敏感的视觉任务中表现特别好，例如检测和分割。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n    - ```shell\n      $ hub install se_hrnet64_imagenet_ssld\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n    ```shell\n    $ hub run se_hrnet64_imagenet_ssld --input_path \"/PATH/TO/IMAGE\" --top_k 5\n    ```\n- ### 2.预测代码示例\n\n    ```python\n    import paddle\n    import paddlehub as hub\n    if __name__ == '__main__':\n        model = hub.Module(name='se_hrnet64_imagenet_ssld')\n        result = model.predict(['flower.jpg'])\n    ```\n- ### 3.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用se_hrnet64_imagenet_ssld对[Flowers](../../docs/reference/datasets.md#class-hubdatasetsflowers)等数据集进行Fine-tune。\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              import paddlehub.vision.transforms as T\n              transforms = T.Compose([T.Resize((256, 256)),\n                                    T.CenterCrop(224),\n                                    T.Normalize(mean=[0.485, 0.456, 0.406], std = [0.229, 0.224, 0.225])],\n                                    to_rgb=True)\n              ```\n\n                - `transforms` 数据增强模块定义了丰富的数据预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import Flowers\n              flowers = 
Flowers(transforms)\n              flowers_validate = Flowers(transforms, mode='val')\n              ```\n\n                * `transforms`: 数据预处理方式。\n                * `mode`: 选择数据模式，可选项有 `train`, `test`, `val`，默认为`train`。\n\n                * 数据集的准备代码可以参考 [flowers.py](../../paddlehub/datasets/flowers.py)。`hub.datasets.Flowers()` 会自动从网络下载数据集并解压到用户目录下的`$HOME/.paddlehub/dataset`目录。\n\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              model = hub.Module(name=\"se_hrnet64_imagenet_ssld\", label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"])\n              ```\n                * `name`: 选择预训练模型的名字。\n                * `label_list`: 设置输出分类类别，默认为Imagenet2012类别。\n\n        - Step4: 选择优化策略和运行配置\n\n            ```python\n            from paddlehub.finetune.trainer import Trainer\n\n            optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())\n            trainer = Trainer(model, optimizer, checkpoint_dir='img_classification_ckpt')\n            trainer.train(flowers, epochs=100, batch_size=32, eval_dataset=flowers_validate, save_interval=1)\n            ```\n\n\n            - 运行配置\n\n            - `Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n                * `model`: 被优化模型；\n                * `optimizer`: 优化器选择；\n                * `use_vdl`: 是否使用vdl可视化训练过程；\n                * `checkpoint_dir`: 保存模型参数的地址；\n                * `compare_metrics`: 保存最优模型的衡量指标；\n\n            - `trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n                * `train_dataset`: 训练时所用的数据集；\n                * `epochs`: 训练轮数；\n                * `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n                * `num_workers`: 加载数据的子进程（worker）数量，默认为0；\n                * `eval_dataset`: 验证集；\n                * `log_interval`: 打印日志的间隔，单位为执行批训练的次数；\n                * `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n    - 模型预测\n\n        -   当完成Fine-tune后，Fine-tune过程中在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`为Fine-tune时所选择的保存checkpoint的目录。 我们使用该模型来进行预测。predict.py脚本如下：\n\n            
- ```python\n              import paddle\n              import paddlehub as hub\n              if __name__ == '__main__':\n                  model = hub.Module(name='se_hrnet64_imagenet_ssld', label_list=[\"roses\", \"tulips\", \"daisy\", \"sunflowers\", \"dandelion\"], load_checkpoint='/PATH/TO/CHECKPOINT')\n                  result = model.predict(['flower.jpg'])\n              ```\n\n\n            - **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一致。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线分类任务服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m se_hrnet64_imagenet_ssld\n      ```\n\n    - 这样就完成了一个分类任务服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n        import numpy as np\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            # tostring已废弃，改用tobytes\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            # fromstring已废弃，改用frombuffer\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)], 'top_k':2}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/se_hrnet64_imagenet_ssld\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        data = r.json()[\"results\"]['data']\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/classification/se_hrnet64_imagenet_ssld/label_list.txt",
    "content": "tench\ngoldfish\ngreat white shark\ntiger shark\nhammerhead\nelectric ray\nstingray\ncock\nhen\nostrich\nbrambling\ngoldfinch\nhouse finch\njunco\nindigo bunting\nrobin\nbulbul\njay\nmagpie\nchickadee\nwater ouzel\nkite\nbald eagle\nvulture\ngreat grey owl\nEuropean fire salamander\ncommon newt\neft\nspotted salamander\naxolotl\nbullfrog\ntree frog\ntailed frog\nloggerhead\nleatherback turtle\nmud turtle\nterrapin\nbox turtle\nbanded gecko\ncommon iguana\nAmerican chameleon\nwhiptail\nagama\nfrilled lizard\nalligator lizard\nGila monster\ngreen lizard\nAfrican chameleon\nKomodo dragon\nAfrican crocodile\nAmerican alligator\ntriceratops\nthunder snake\nringneck snake\nhognose snake\ngreen snake\nking snake\ngarter snake\nwater snake\nvine snake\nnight snake\nboa constrictor\nrock python\nIndian cobra\ngreen mamba\nsea snake\nhorned viper\ndiamondback\nsidewinder\ntrilobite\nharvestman\nscorpion\nblack and gold garden spider\nbarn spider\ngarden spider\nblack widow\ntarantula\nwolf spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse\nprairie chicken\npeacock\nquail\npartridge\nAfrican grey\nmacaw\nsulphur-crested cockatoo\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser\ngoose\nblack swan\ntusker\nechidna\nplatypus\nwallaby\nkoala\nwombat\njellyfish\nsea anemone\nbrain coral\nflatworm\nnematode\nconch\nsnail\nslug\nsea slug\nchiton\nchambered nautilus\nDungeness crab\nrock crab\nfiddler crab\nking crab\nAmerican lobster\nspiny lobster\ncrayfish\nhermit crab\nisopod\nwhite stork\nblack stork\nspoonbill\nflamingo\nlittle blue heron\nAmerican egret\nbittern\ncrane\nlimpkin\nEuropean gallinule\nAmerican coot\nbustard\nruddy turnstone\nred-backed sandpiper\nredshank\ndowitcher\noystercatcher\npelican\nking penguin\nalbatross\ngrey whale\nkiller whale\ndugong\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog\nPekinese\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian 
ridgeback\nAfghan hound\nbasset\nbeagle\nbloodhound\nbluetick\nblack-and-tan coonhound\nWalker hound\nEnglish foxhound\nredbone\nborzoi\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound\nNorwegian elkhound\notterhound\nSaluki\nScottish deerhound\nWeimaraner\nStaffordshire bullterrier\nAmerican Staffordshire terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier\nAiredale\ncairn\nAustralian terrier\nDandie Dinmont\nBoston bull\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier\nTibetan terrier\nsilky terrier\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla\nEnglish setter\nIrish setter\nGordon setter\nBrittany spaniel\nclumber\nEnglish springer\nWelsh springer spaniel\ncocker spaniel\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog\nShetland sheepdog\ncollie\nBorder collie\nBouvier des Flandres\nRottweiler\nGerman shepherd\nDoberman\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard\nEskimo dog\nmalamute\nSiberian husky\ndalmatian\naffenpinscher\nbasenji\npug\nLeonberg\nNewfoundland\nGreat Pyrenees\nSamoyed\nPomeranian\nchow\nkeeshond\nBrabancon griffon\nPembroke\nCardigan\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf\nwhite wolf\nred wolf\ncoyote\ndingo\ndhole\nAfrican hunting dog\nhyena\nred fox\nkit fox\nArctic fox\ngrey fox\ntabby\ntiger cat\nPersian cat\nSiamese cat\nEgyptian cat\ncougar\nlynx\nleopard\nsnow leopard\njaguar\nlion\ntiger\ncheetah\nbrown bear\nAmerican black bear\nice bear\nsloth 
bear\nmongoose\nmeerkat\ntiger beetle\nladybug\nground beetle\nlong-horned beetle\nleaf beetle\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant\ngrasshopper\ncricket\nwalking stick\ncockroach\nmantis\ncicada\nleafhopper\nlacewing\ndragonfly\ndamselfly\nadmiral\nringlet\nmonarch\ncabbage butterfly\nsulphur butterfly\nlycaenid\nstarfish\nsea urchin\nsea cucumber\nwood rabbit\nhare\nAngora\nhamster\nporcupine\nfox squirrel\nmarmot\nbeaver\nguinea pig\nsorrel\nzebra\nhog\nwild boar\nwarthog\nhippopotamus\nox\nwater buffalo\nbison\nram\nbighorn\nibex\nhartebeest\nimpala\ngazelle\nArabian camel\nllama\nweasel\nmink\npolecat\nblack-footed ferret\notter\nskunk\nbadger\narmadillo\nthree-toed sloth\norangutan\ngorilla\nchimpanzee\ngibbon\nsiamang\nguenon\npatas\nbaboon\nmacaque\nlangur\ncolobus\nproboscis monkey\nmarmoset\ncapuchin\nhowler monkey\ntiti\nspider monkey\nsquirrel monkey\nMadagascar cat\nindri\nIndian elephant\nAfrican elephant\nlesser panda\ngiant panda\nbarracouta\neel\ncoho\nrock beauty\nanemone fish\nsturgeon\ngar\nlionfish\npuffer\nabacus\nabaya\nacademic gown\naccordion\nacoustic guitar\naircraft carrier\nairliner\nairship\naltar\nambulance\namphibian\nanalog clock\napiary\napron\nashcan\nassault rifle\nbackpack\nbakery\nbalance beam\nballoon\nballpoint\nBand Aid\nbanjo\nbannister\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel\nbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap\nbath towel\nbathtub\nbeach wagon\nbeacon\nbeaker\nbearskin\nbeer bottle\nbeer glass\nbell cote\nbib\nbicycle-built-for-two\nbikini\nbinder\nbinoculars\nbirdhouse\nboathouse\nbobsled\nbolo tie\nbonnet\nbookcase\nbookshop\nbottlecap\nbow\nbow tie\nbrass\nbrassiere\nbreakwater\nbreastplate\nbroom\nbucket\nbuckle\nbulletproof vest\nbullet train\nbutcher shop\ncab\ncaldron\ncandle\ncannon\ncanoe\ncan opener\ncardigan\ncar mirror\ncarousel\ncarpenters kit\ncarton\ncar wheel\ncash machine\ncassette\ncassette player\ncastle\ncatamaran\nCD 
player\ncello\ncellular telephone\nchain\nchainlink fence\nchain mail\nchain saw\nchest\nchiffonier\nchime\nchina cabinet\nChristmas stocking\nchurch\ncinema\ncleaver\ncliff dwelling\ncloak\nclog\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil\ncombination lock\ncomputer keyboard\nconfectionery\ncontainer ship\nconvertible\ncorkscrew\ncornet\ncowboy boot\ncowboy hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam\ndesk\ndesktop computer\ndial telephone\ndiaper\ndigital clock\ndigital watch\ndining table\ndishrag\ndishwasher\ndisk brake\ndock\ndogsled\ndome\ndoormat\ndrilling platform\ndrum\ndrumstick\ndumbbell\nDutch oven\nelectric fan\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa\nfile\nfireboat\nfire engine\nfire screen\nflagpole\nflute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn\nfrying pan\nfur coat\ngarbage truck\ngasmask\ngas pump\ngoblet\ngo-kart\ngolf ball\ngolfcart\ngondola\ngong\ngown\ngrand piano\ngreenhouse\ngrille\ngrocery store\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower\nhand-held computer\nhandkerchief\nhard disc\nharmonica\nharp\nharvester\nhatchet\nholster\nhome theater\nhoneycomb\nhook\nhoopskirt\nhorizontal bar\nhorse cart\nhourglass\niPod\niron\njack-o-lantern\njean\njeep\njersey\njigsaw puzzle\njinrikisha\njoystick\nkimono\nknee pad\nknot\nlab coat\nladle\nlampshade\nlaptop\nlawn mower\nlens cap\nletter opener\nlibrary\nlifeboat\nlighter\nlimousine\nliner\nlipstick\nLoafer\nlotion\nloudspeaker\nloupe\nlumbermill\nmagnetic compass\nmailbag\nmailbox\nmaillot\nmaillot\nmanhole cover\nmaraca\nmarimba\nmask\nmatchstick\nmaypole\nmaze\nmeasuring cup\nmedicine chest\nmegalith\nmicrophone\nmicrowave\nmilitary uniform\nmilk can\nminibus\nminiskirt\nminivan\nmissile\nmitten\nmixing bowl\nmobile home\nModel 
T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter\nmountain bike\nmountain tent\nmouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook\nobelisk\noboe\nocarina\nodometer\noil filter\norgan\noscilloscope\noverskirt\noxcart\noxygen mask\npacket\npaddle\npaddlewheel\npadlock\npaintbrush\npajama\npalace\npanpipe\npaper towel\nparachute\nparallel bars\npark bench\nparking meter\npassenger car\npatio\npay-phone\npedestal\npencil box\npencil sharpener\nperfume\nPetri dish\nphotocopier\npick\npickelhaube\npicket fence\npickup\npier\npiggy bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate\npitcher\nplane\nplanetarium\nplastic bag\nplate rack\nplow\nplunger\nPolaroid camera\npole\npolice van\nponcho\npool table\npop bottle\npot\npotters wheel\npower drill\nprayer rug\nprinter\nprison\nprojectile\nprojector\npuck\npunching bag\npurse\nquill\nquilt\nracer\nracket\nradiator\nradio\nradio telescope\nrain barrel\nrecreational vehicle\nreel\nreflex camera\nrefrigerator\nremote control\nrestaurant\nrevolver\nrifle\nrocking chair\nrotisserie\nrubber eraser\nrugby ball\nrule\nrunning shoe\nsafe\nsafety pin\nsaltshaker\nsandal\nsarong\nsax\nscabbard\nscale\nschool bus\nschooner\nscoreboard\nscreen\nscrew\nscrewdriver\nseat belt\nsewing machine\nshield\nshoe shop\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule\nsliding door\nslot\nsnorkel\nsnowmobile\nsnowplow\nsoap dispenser\nsoccer ball\nsock\nsolar dish\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web\nspindle\nsports car\nspotlight\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch\nstove\nstrainer\nstreetcar\nstretcher\nstudio couch\nstupa\nsubmarine\nsuit\nsundial\nsunglass\nsunglasses\nsunscreen\nsuspension bridge\nswab\nsweatshirt\nswimming trunks\nswing\nswitch\nsyringe\ntable 
lamp\ntank\ntape player\nteapot\nteddy\ntelevision\ntennis ball\nthatch\ntheater curtain\nthimble\nthresher\nthrone\ntile roof\ntoaster\ntobacco shop\ntoilet seat\ntorch\ntotem pole\ntow truck\ntoyshop\ntractor\ntrailer truck\ntray\ntrench coat\ntricycle\ntrimaran\ntripod\ntriumphal arch\ntrolleybus\ntrombone\ntub\nturnstile\ntypewriter keyboard\numbrella\nunicycle\nupright\nvacuum\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin\nvolleyball\nwaffle iron\nwall clock\nwallet\nwardrobe\nwarplane\nwashbasin\nwasher\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool\nworm fence\nwreck\nyawl\nyurt\nweb site\ncomic book\ncrossword puzzle\nstreet sign\ntraffic light\nbook jacket\nmenu\nplate\nguacamole\nconsomme\nhot pot\ntrifle\nice cream\nice lolly\nFrench loaf\nbagel\npretzel\ncheeseburger\nhotdog\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber\nartichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple\nbanana\njackfruit\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce\ndough\nmeat loaf\npizza\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff\ncoral reef\ngeyser\nlakeside\npromontory\nsandbar\nseashore\nvalley\nvolcano\nballplayer\ngroom\nscuba diver\nrapeseed\ndaisy\nyellow ladys slipper\ncorn\nacorn\nhip\nbuckeye\ncoral fungus\nagaric\ngyromitra\nstinkhorn\nearthstar\nhen-of-the-woods\nbolete\near\ntoilet tissue\n"
  },
  {
    "path": "modules/image/classification/se_hrnet64_imagenet_ssld/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport math\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.transforms as T\nimport numpy as np\nfrom paddle.nn.initializer import Uniform\nfrom paddle import ParamAttr\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, filter_size, stride=1, groups=1, act=\"relu\", name=None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = nn.Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = name + '_bn'\n        self._batch_norm = nn.BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + '_scale'),\n            bias_attr=ParamAttr(bn_name + '_offset'),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, input):\n        y = self._conv(input)\n        y = self._batch_norm(y)\n        return 
y\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(4):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else 256,\n                    num_filters=64,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, input):\n        conv = input\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        out = []\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[i],\n                            num_filters=out_channels[i],\n                            filter_size=3,\n                            name=name + '_layer_' + str(i + 1)))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNLayer(\n                        num_channels=in_channels[-1],\n                        num_filters=out_channels[i],\n                        filter_size=3,\n         
               stride=2,\n                        name=name + '_layer_' + str(i + 1)))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, input):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(input[idx])\n            else:\n                if idx < len(input):\n                    outs.append(conv_bn_func(input[idx]))\n                else:\n                    outs.append(conv_bn_func(input[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, block_num, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(block_num):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            conv = input\n            basic_block_list = self.basic_block_list[idx]\n            for basic_block_func in basic_block_list:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        
self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=1,\n            act=\"relu\",\n            name=name + \"_conv1\",\n        )\n        self.conv2 = ConvBNLayer(\n            num_channels=num_filters,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv2\")\n        self.conv3 = ConvBNLayer(\n            num_channels=num_filters, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_conv3\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=None,\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = paddle.add(x=residual, y=conv3)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_filters,\n            filter_size=3,\n            stride=stride,\n            act=\"relu\",\n            name=name + \"_conv1\")\n        self.conv2 = ConvBNLayer(\n            
num_channels=num_filters, num_filters=num_filters, filter_size=3, stride=1, act=None, name=name + \"_conv2\")\n\n        if self.downsample:\n            self.conv_down = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=num_filters * 4,\n                filter_size=1,\n                act=\"relu\",\n                name=name + \"_downsample\")\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name='fc' + name)\n\n    def forward(self, input):\n        residual = input\n        conv1 = self.conv1(input)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(input)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = paddle.add(x=residual, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_sqz_weights\"),\n            bias_attr=ParamAttr(name=name + '_sqz_offset'))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=name + \"_exc_weights\"),\n            bias_attr=ParamAttr(name=name + '_exc_offset'))\n\n    def forward(self, input):\n        pool = self.pool2d_gap(input)\n        pool = paddle.squeeze(pool, axis=[2, 3])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = 
self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.unsqueeze(excitation, axis=[2, 3])\n        out = input * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self, num_channels, num_modules, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1)))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels, num_filters=num_filters, has_se=has_se,\n                        name=name + '_' + str(i + 1)))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, input):\n        out = input\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se=False, multi_scale_output=True, name=None):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            block_num=4, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters, out_channels=num_filters, multi_scale_output=multi_scale_output, name=name)\n\n    
def forward(self, input):\n        out = self.branches_func(input)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                residual_func = None\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBNLayer(\n                            num_channels=in_channels[j],\n                            num_filters=out_channels[i],\n                            filter_size=1,\n                            stride=1,\n                            act=None,\n                            name=name + '_layer_' + str(i + 1) + '_' + str(j + 1)))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[i],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=None,\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[i]\n   
                     else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNLayer(\n                                    num_channels=pre_num_filters,\n                                    num_filters=out_channels[j],\n                                    filter_size=3,\n                                    stride=2,\n                                    act=\"relu\",\n                                    name=name + '_layer_' + str(i + 1) + '_' + str(j + 1) + '_' + str(k + 1)))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, input):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = input[i]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](input[j])\n                    residual_func_idx += 1\n\n                    y = F.upsample(y, scale_factor=2**(j - i), mode=\"nearest\")\n                    residual = paddle.add(x=residual, y=y)\n                elif j < i:\n                    y = input[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = paddle.add(x=residual, y=y)\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\nclass LastClsOut(nn.Layer):\n    def __init__(self, num_channel_list, has_se, num_filters_list=[32, 64, 128, 256], name=None):\n        super(LastClsOut, self).__init__()\n\n        self.func_list = []\n        for idx in range(len(num_channel_list)):\n            func = self.add_sublayer(\n                
\"conv_{}_conv_{}\".format(name, idx + 1),\n                BottleneckBlock(\n                    num_channels=num_channel_list[idx],\n                    num_filters=num_filters_list[idx],\n                    has_se=has_se,\n                    downsample=True,\n                    name=name + 'conv_' + str(idx + 1)))\n            self.func_list.append(func)\n\n    def forward(self, inputs):\n        outs = []\n        for idx, input in enumerate(inputs):\n            out = self.func_list[idx](input)\n            outs.append(out)\n        return outs\n\n\n@moduleinfo(\n    name=\"se_hrnet64_imagenet_ssld\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"se_hrnet64_imagenet_ssld is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.0.0\",\n    meta=ImageClassifierModule)\nclass SE_HRNet64(nn.Layer):\n    def __init__(self, label_list: list = None, load_checkpoint: str = None):\n        super(SE_HRNet64, self).__init__()\n\n        if label_list is not None:\n            self.labels = label_list\n            class_dim = len(self.labels)\n        else:\n            label_list = []\n            label_file = os.path.join(self.directory, 'label_list.txt')\n            files = open(label_file)\n            for line in files.readlines():\n                line = line.strip('\\n')\n                label_list.append(line)\n            self.labels = label_list\n            class_dim = len(self.labels)\n\n        self.width = 64\n        self.has_se = True\n        self.channels = {\n            18: [[18, 36], [18, 36, 72], [18, 36, 72, 144]],\n            30: [[30, 60], [30, 60, 120], [30, 60, 120, 240]],\n            32: [[32, 64], [32, 64, 128], [32, 64, 128, 256]],\n            40: [[40, 80], [40, 80, 160], [40, 80, 160, 320]],\n            44: [[44, 88], [44, 88, 176], [44, 88, 176, 352]],\n            48: [[48, 96], [48, 96, 192], [48, 96, 192, 384]],\n            60: 
[[60, 120], [60, 120, 240], [60, 120, 240, 480]],\n            64: [[64, 128], [64, 128, 256], [64, 128, 256, 512]]\n        }\n        self._class_dim = class_dim\n\n        channels_2, channels_3, channels_4 = self.channels[self.width]\n        num_modules_2, num_modules_3, num_modules_4 = 1, 4, 3\n\n        self.conv_layer1_1 = ConvBNLayer(\n            num_channels=3, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_1\")\n\n        self.conv_layer1_2 = ConvBNLayer(\n            num_channels=64, num_filters=64, filter_size=3, stride=2, act='relu', name=\"layer1_2\")\n\n        self.la1 = Layer1(num_channels=64, has_se=self.has_se, name=\"layer2\")\n\n        self.tr1 = TransitionLayer(in_channels=[256], out_channels=channels_2, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=channels_2, num_modules=num_modules_2, num_filters=channels_2, has_se=self.has_se, name=\"st2\")\n\n        self.tr2 = TransitionLayer(in_channels=channels_2, out_channels=channels_3, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=channels_3, num_modules=num_modules_3, num_filters=channels_3, has_se=self.has_se, name=\"st3\")\n\n        self.tr3 = TransitionLayer(in_channels=channels_3, out_channels=channels_4, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=channels_4, num_modules=num_modules_4, num_filters=channels_4, has_se=self.has_se, name=\"st4\")\n\n        # classification\n        num_filters_list = [32, 64, 128, 256]\n        self.last_cls = LastClsOut(\n            num_channel_list=channels_4,\n            has_se=self.has_se,\n            num_filters_list=num_filters_list,\n            name=\"cls_head\",\n        )\n\n        last_num_filters = [256, 512, 1024]\n        self.cls_head_conv_list = []\n        for idx in range(3):\n            self.cls_head_conv_list.append(\n                self.add_sublayer(\n                    \"cls_head_add{}\".format(idx + 1),\n                    ConvBNLayer(\n       
                 num_channels=num_filters_list[idx] * 4,\n                        num_filters=last_num_filters[idx],\n                        filter_size=3,\n                        stride=2,\n                        name=\"cls_head_add\" + str(idx + 1))))\n\n        self.conv_last = ConvBNLayer(\n            num_channels=1024, num_filters=2048, filter_size=1, stride=1, name=\"cls_head_last_conv\")\n\n        self.pool2d_avg = nn.AdaptiveAvgPool2D(1)\n\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n\n        self.out = nn.Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(initializer=Uniform(-stdv, stdv), name=\"fc_weights\"),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def transforms(self, images: Union[str, np.ndarray]):\n        transforms = T.Compose([\n            T.Resize((256, 256)),\n            T.CenterCrop(224),\n            T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])\n        ],\n                               to_rgb=True)\n        return transforms(images).astype('float32')\n\n    def forward(self, input):\n        conv1 = self.conv_layer1_1(input)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        last_cls = self.last_cls(st4)\n\n        y = last_cls[0]\n        for idx in range(3):\n            y = paddle.add(last_cls[idx + 
1], self.cls_head_conv_list[idx](y))\n\n        y = self.conv_last(y)\n        feature = self.pool2d_avg(y)\n        y = paddle.reshape(feature, shape=[-1, feature.shape[1]])\n        y = self.out(y)\n        return y, feature\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/README.md",
    "content": "# se_resnet18_vd_imagenet\n\n|模型名称|se_resnet18_vd_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|SE-ResNet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|48MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Squeeze-and-Excitation Networks是由Momenta在2017年提出的一种图像分类结构。该结构通过对特征通道间的相关性进行建模，把重要的特征进行强化来提升准确率。SE_ResNet基于ResNet模型添加了SE Block。该PaddleHub Module结构为SE_ResNet18，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install se_resnet18_vd_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run se_resnet18_vd_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnet18_vd_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - 分类接口API。\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，每一个图片数据的shape 均为 \\[H, W, C\\]，颜色空间为 BGR； <br/>\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 
GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - top\\_k (int): 返回预测结果的前 k 个。\n\n    - **返回**\n\n      - res (list\\[dict\\]): 分类结果，列表的每一个元素均为字典，其中 key 为识别的图像类别，value为置信度。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个图像识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m se_resnet18_vd_imagenet\n    ```\n\n  - 这样就完成了一个图像识别的在线服务的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/se_resnet18_vd_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install se_resnet18_vd_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/README_en.md",
    "content": "# se_resnet18_vd_imagenet\n\n|Module Name|se_resnet18_vd_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|SE-ResNet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|48MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Squeeze-and-Excitation Networks is an image classification architecture proposed by Momenta in 2017. It models the inter-dependencies between feature channels and strengthens informative features to improve accuracy. SE_ResNet adds SE blocks on top of ResNet. This module is based on SE_ResNet18, trained on ImageNet-2012, accepts input images of size 224 x 224 x 3, and supports prediction via command line or Python API.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install se_resnet18_vd_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run se_resnet18_vd_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnet18_vd_imagenet\")\n    result = classifier.classification(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = classifier.classification(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def classification(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       top_k=1):\n    ```\n    - classification API.\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - top\\_k (int): return the top k results\n\n    - **Return**\n\n      - res (list\\[dict\\]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of image classification.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m se_resnet18_vd_imagenet\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/se_resnet18_vd_imagenet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Remove Fluid API\n\n  - ```shell\n    $ hub install se_resnet18_vd_imagenet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    width, height = img.size\n    size = target_size\n    if center == True:\n        w_start = (width - size) / 2\n        h_start = (height - size) / 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list[numpy.ndarray]): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im_path'] = im_path\n            each['org_im'] = 
Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(im[:, :, ::-1])\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            component.append(each)\n\n    for element in component:\n        element['image'] = process_image(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/label_list.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"se_resnet18_vd_imagenet\",\n            type=\"CV/image_classification\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\",\n            summary=\"SE_ResNet18_vd is a image classfication model, this module is trained with imagenet datasets.\",\n            version=\"1.1.0\")\nclass SEResNet18vdImageNet:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"se_resnet18_vd_imagenet_model\", \"model\")\n        label_file = os.path.join(self.directory, \"label_list.txt\")\n        with open(label_file, 'r', encoding='utf-8') as file:\n            self.label_list = file.read().split(\"\\n\")[:-1]\n        self.predictor_set = False\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor 
= create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except Exception:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def classification(self, images=None, paths=None, batch_size=1, use_gpu=False, top_k=1):\n        \"\"\"\n        API for image classification.\n\n        Args:\n            images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            top_k (int): Return top k results.\n\n        Returns:\n            res (list[dict]): The classification results.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        if not self.predictor_set:\n            self._set_config()\n            self.predictor_set = True\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            out = postprocess(data_out=output_handle.copy_to_cpu(), label_list=self.label_list, top_k=top_k)\n            res += out\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.classification(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.classification(paths=[args.input_path], batch_size=args.batch_size, use_gpu=args.use_gpu, top_k=args.top_k)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_config_group.add_argument('--top_k', type=ast.literal_eval, default=1, help=\"Return top k results.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/processor.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef softmax(x):\n    orig_shape = x.shape\n    if len(x.shape) > 1:\n        tmp = np.max(x, axis=1)\n        x -= tmp.reshape((x.shape[0], 1))\n        x = np.exp(x)\n        tmp = np.sum(x, axis=1)\n        x /= tmp.reshape((x.shape[0], 1))\n    else:\n        tmp = np.max(x)\n        x -= tmp\n        x = np.exp(x)\n        tmp = np.sum(x)\n        x /= tmp\n    return x\n\n\ndef postprocess(data_out, label_list, top_k):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output data of network.\n        label_list (list): list of labels.\n        top_k (int): Return top k results.\n    \"\"\"\n    output = []\n    for result in data_out:\n        result_i = softmax(result)\n        output_i = {}\n        indices = np.argsort(result_i)[::-1][0:top_k]\n        for index in indices:\n            label = label_list[index].split(',')[0]\n            output_i[label] = float(result_i[index])\n        output.append(output_i)\n    return output\n"
  },
  {
    "path": "modules/image/classification/se_resnet18_vd_imagenet/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/brFsZ7qszSY/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8OHx8ZG9nfGVufDB8fHx8MTY2MzA1ODQ1MQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"se_resnet18_vd_imagenet\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n\n    def test_classification1(self):\n        results = self.module.classification(paths=['tests/test.jpg'])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification2(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')])\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification3(self):\n        results = self.module.classification(images=[cv2.imread('tests/test.jpg')], use_gpu=True)\n        data = results[0]\n        self.assertTrue('Pembroke' in data)\n        self.assertTrue(data['Pembroke'] > 0.5)\n\n    def test_classification4(self):\n        self.assertRaises(AssertionError, self.module.classification, paths=['no.jpg'])\n\n    def test_classification5(self):\n        self.assertRaises(TypeError, self.module.classification, images=['tests/test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/classification/se_resnext101_32x4d_imagenet/README.md",
    "content": "# se_resnext101_32x4d_imagenet\n\n|模型名称|se_resnext101_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|SE_ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|191MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Squeeze-and-Excitation Networks是由Momenta在2017年提出的一种图像分类结构。该结构通过对特征通道间的相关性进行建模，把重要的特征进行强化来提升准确率。SE_ResNeXt基于ResNeXt模型添加了SE Block，并获得了2017 ILSVR竞赛的冠军。该PaddleHub Module结构为SE_ResNeXt101_32x4d，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install se_resnext101_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run se_resnext101_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnext101_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install se_resnext101_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/se_resnext101_32x4d_imagenet/README_en.md",
    "content": "# se_resnext101_32x4d_imagenet\n\n|Module Name|se_resnext101_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|SE_ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|191MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Squeeze-and-Excitation Network was proposed by Momenta in 2017. This model learns channel weights to strengthen important feature channels and improve classification accuracy, and won the ILSVRC 2017 classification competition. This module is based on se_resnext101_32x4d, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install se_resnext101_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run se_resnext101_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnext101_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install se_resnext101_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/se_resnext50_32x4d_imagenet/README.md",
    "content": "# se_resnext50_32x4d_imagenet\n\n|模型名称|se_resnext50_32x4d_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|SE_ResNeXt|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|107MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Squeeze-and-Excitation Networks是由Momenta在2017年提出的一种图像分类结构。该结构通过对特征通道间的相关性进行建模，把重要的特征进行强化来提升准确率。SE_ResNeXt基于ResNeXt模型添加了SE Block，并获得了2017 ILSVR竞赛的冠军。该PaddleHub Module结构为SE_ResNeXt50_32x4d，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install se_resnext50_32x4d_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run se_resnext50_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnext50_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install se_resnext50_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/se_resnext50_32x4d_imagenet/README_en.md",
    "content": "# se_resnext50_32x4d_imagenet\n\n|Module Name|se_resnext50_32x4d_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|SE_ResNeXt|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|107MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Squeeze-and-Excitation Network was proposed by Momenta in 2017. This model learns channel weights to strengthen important feature channels and improve classification accuracy, and won the ILSVRC 2017 classification competition. This module is based on SE_ResNeXt50_32x4d, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install se_resnext50_32x4d_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run se_resnext50_32x4d_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"se_resnext50_32x4d_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install se_resnext50_32x4d_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/shufflenet_v2_imagenet/README.md",
    "content": "# shufflenet_v2_imagenet\n\n|模型名称|shufflenet_v2_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|ShuffleNet V2|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|11MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - ShuffleNet V2是由旷视科技在2018年提出的轻量级图像分类模型，该模型通过pointwise group convolution和channel shuffle两种方式，在保持精度的同时大大降低了模型的计算量。该PaddleHub Module结构为ShuffleNet V2，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install shufflenet_v2_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run shufflenet_v2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"shufflenet_v2_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install shufflenet_v2_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/shufflenet_v2_imagenet/README_en.md",
    "content": "# shufflenet_v2_imagenet\n\n|Module Name|shufflenet_v2_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|ShuffleNet V2|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|11MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - ShuffleNet V2 is a light-weight model proposed by MEGVII in 2018. This model uses pointwise group convolution and channel shuffle to maintain accuracy while reducing the amount of computation. This module is based on ShuffleNet V2, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install shufflenet_v2_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run shufflenet_v2_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"shufflenet_v2_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install shufflenet_v2_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/shufflenet_v2_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import MSRA\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\ndef channel_shuffle(x: paddle.Tensor, groups: int):\n    \"\"\"Shuffle input channels.\"\"\"\n    batchsize, num_channels, height, width = x.shape[0], x.shape[1], x.shape[2], x.shape[3]\n    channels_per_group = num_channels // groups\n\n    # reshape\n    x = paddle.reshape(x=x, shape=[batchsize, groups, channels_per_group, height, width])\n\n    x = paddle.transpose(x=x, perm=[0, 2, 1, 3, 4])\n    # flatten\n    x = paddle.reshape(x=x, shape=[batchsize, num_channels, height, width])\n    return x\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 channels: int = None,\n                 num_groups: int = 1,\n                 if_act: bool = True,\n                 act: str = 'relu',\n                 name: str = 
None):\n        super(ConvBNLayer, self).__init__()\n        self._if_act = if_act\n        assert act in ['relu', 'swish'], \\\n            \"supported act are {} but your act is {}\".format(\n                ['relu', 'swish'], act)\n        self._act = act\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(initializer=MSRA(), name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            param_attr=ParamAttr(name=name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\"),\n            moving_mean_name=name + \"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor, if_act: bool = True):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        if self._if_act:\n            y = F.relu(y) if self._act == 'relu' else F.swish(y)\n        return y\n\n\nclass InvertedResidualUnit(nn.Layer):\n    \"\"\"Inverted Residual unit.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int,\n                 benchmodel: int,\n                 act: str = 'relu',\n                 name: str = None):\n        super(InvertedResidualUnit, self).__init__()\n        assert stride in [1, 2], \\\n            \"supported stride are {} but your stride is {}\".format([1, 2], stride)\n        self.benchmodel = benchmodel\n        oup_inc = num_filters // 2\n        inp = num_channels\n        if benchmodel == 1:\n            self._conv_pw = ConvBNLayer(\n                num_channels=num_channels // 2,\n                num_filters=oup_inc,\n                filter_size=1,\n                stride=1,\n                
padding=0,\n                num_groups=1,\n                if_act=True,\n                act=act,\n                name='stage_' + name + '_conv1')\n            self._conv_dw = ConvBNLayer(\n                num_channels=oup_inc,\n                num_filters=oup_inc,\n                filter_size=3,\n                stride=stride,\n                padding=1,\n                num_groups=oup_inc,\n                if_act=False,\n                act=act,\n                name='stage_' + name + '_conv2')\n            self._conv_linear = ConvBNLayer(\n                num_channels=oup_inc,\n                num_filters=oup_inc,\n                filter_size=1,\n                stride=1,\n                padding=0,\n                num_groups=1,\n                if_act=True,\n                act=act,\n                name='stage_' + name + '_conv3')\n        else:\n            # branch1\n            self._conv_dw_1 = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=inp,\n                filter_size=3,\n                stride=stride,\n                padding=1,\n                num_groups=inp,\n                if_act=False,\n                act=act,\n                name='stage_' + name + '_conv4')\n            self._conv_linear_1 = ConvBNLayer(\n                num_channels=inp,\n                num_filters=oup_inc,\n                filter_size=1,\n                stride=1,\n                padding=0,\n                num_groups=1,\n                if_act=True,\n                act=act,\n                name='stage_' + name + '_conv5')\n            # branch2\n            self._conv_pw_2 = ConvBNLayer(\n                num_channels=num_channels,\n                num_filters=oup_inc,\n                filter_size=1,\n                stride=1,\n                padding=0,\n                num_groups=1,\n                if_act=True,\n                act=act,\n                name='stage_' + name + '_conv1')\n            self._conv_dw_2 = 
ConvBNLayer(\n                num_channels=oup_inc,\n                num_filters=oup_inc,\n                filter_size=3,\n                stride=stride,\n                padding=1,\n                num_groups=oup_inc,\n                if_act=False,\n                act=act,\n                name='stage_' + name + '_conv2')\n            self._conv_linear_2 = ConvBNLayer(\n                num_channels=oup_inc,\n                num_filters=oup_inc,\n                filter_size=1,\n                stride=1,\n                padding=0,\n                num_groups=1,\n                if_act=True,\n                act=act,\n                name='stage_' + name + '_conv3')\n\n    def forward(self, inputs: paddle.Tensor):\n        if self.benchmodel == 1:\n            x1, x2 = paddle.split(inputs, num_or_sections=[inputs.shape[1] // 2, inputs.shape[1] // 2], axis=1)\n            x2 = self._conv_pw(x2)\n            x2 = self._conv_dw(x2)\n            x2 = self._conv_linear(x2)\n            out = paddle.concat([x1, x2], axis=1)\n        else:\n            x1 = self._conv_dw_1(inputs)\n            x1 = self._conv_linear_1(x1)\n\n            x2 = self._conv_pw_2(inputs)\n            x2 = self._conv_dw_2(x2)\n            x2 = self._conv_linear_2(x2)\n            out = paddle.concat([x1, x2], axis=1)\n\n        return channel_shuffle(out, 2)\n\n\n@moduleinfo(\n    name=\"shufflenet_v2_imagenet\",\n    type=\"cv/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"shufflenet_v2_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass ShuffleNet(nn.Layer):\n    \"\"\"ShuffleNet model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(ShuffleNet, self).__init__()\n        self.scale = 1\n        self.class_dim = class_dim\n        stage_repeats = [4, 8, 4]\n        stage_out_channels = [-1, 24, 116, 
232, 464, 1024]\n\n        # 1. conv1\n        self._conv1 = ConvBNLayer(\n            num_channels=3,\n            num_filters=stage_out_channels[1],\n            filter_size=3,\n            stride=2,\n            padding=1,\n            if_act=True,\n            act='relu',\n            name='stage1_conv')\n        self._max_pool = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n        # 2. bottleneck sequences\n        self._block_list = []\n        i = 1\n        in_c = int(32)\n        for idxstage in range(len(stage_repeats)):\n            numrepeat = stage_repeats[idxstage]\n            output_channel = stage_out_channels[idxstage + 2]\n            for i in range(numrepeat):\n                if i == 0:\n                    block = self.add_sublayer(\n                        str(idxstage + 2) + '_' + str(i + 1),\n                        InvertedResidualUnit(\n                            num_channels=stage_out_channels[idxstage + 1],\n                            num_filters=output_channel,\n                            stride=2,\n                            benchmodel=2,\n                            act='relu',\n                            name=str(idxstage + 2) + '_' + str(i + 1)))\n                    self._block_list.append(block)\n                else:\n                    block = self.add_sublayer(\n                        str(idxstage + 2) + '_' + str(i + 1),\n                        InvertedResidualUnit(\n                            num_channels=output_channel,\n                            num_filters=output_channel,\n                            stride=1,\n                            benchmodel=1,\n                            act='relu',\n                            name=str(idxstage + 2) + '_' + str(i + 1)))\n                    self._block_list.append(block)\n\n        # 3. 
last_conv\n        self._last_conv = ConvBNLayer(\n            num_channels=stage_out_channels[-2],\n            num_filters=stage_out_channels[-1],\n            filter_size=1,\n            stride=1,\n            padding=0,\n            if_act=True,\n            act='relu',\n            name='conv5')\n\n        # 4. pool\n        self._pool2d_avg = AdaptiveAvgPool2d(1)\n        self._out_c = stage_out_channels[-1]\n        # 5. fc\n        self._fc = Linear(\n            stage_out_channels[-1],\n            class_dim,\n            weight_attr=ParamAttr(name='fc6_weights'),\n            bias_attr=ParamAttr(name='fc6_offset'))\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'shufflenet_v2_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/shufflenet_v2_imagenet.pdparams -O '\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv1(inputs)\n        y = self._max_pool(y)\n        for inv in self._block_list:\n            y = inv(y)\n        y = self._last_conv(y)\n        y = self._pool2d_avg(y)\n        y = paddle.reshape(y, shape=[-1, self._out_c])\n        y = self._fc(y)\n        return y\n"
  },
  {
    "path": "modules/image/classification/spinalnet_res101_gemstone/README.md",
    "content": "# spinalnet_res101_gemstone\n\n|模型名称|spinalnet_res101_gemstone|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|resnet101|\n|数据集|gemstone|\n|是否支持Fine-tuning|否|\n|模型大小|246MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 使用PaddleHub的SpinalNet预训练模型进行宝石识别或finetune并完成宝石的预测任务。\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install spinalnet_res101_gemstone\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run spinalnet_res101_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_res101_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images: list类型，待预测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install spinalnet_res101_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/spinalnet_res101_gemstone/README_en.md",
    "content": "# spinalnet_res101_gemstone\n\n|Module Name|spinalnet_res101_gemstone|\n| :--- | :---: |\n|Category|image classification|\n|Network|resnet101|\n|Dataset|gemstone|\n|Fine-tuning supported or not|No|\n|Module Size|246MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module is based on SpinalNet trained on the gemstone dataset, and can be used to classify a gemstone.\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install spinalnet_res101_gemstone\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run spinalnet_res101_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_res101_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images(list[numpy.ndarray]): image data.\n\n    - **Return**\n      - result(list[dict]): classification results, each element in the list is a dict, key is the label name, and value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install spinalnet_res101_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/spinalnet_res50_gemstone/README.md",
    "content": "# spinalnet_res50_gemstone\n\n|模型名称|spinalnet_res50_gemstone|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|resnet50|\n|数据集|gemstone|\n|是否支持Fine-tuning|否|\n|模型大小|137MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 使用PaddleHub的SpinalNet预训练模型进行宝石识别或finetune并完成宝石的预测任务。\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install spinalnet_res50_gemstone\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run spinalnet_res50_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_res50_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images: list类型，待预测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install spinalnet_res50_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/spinalnet_res50_gemstone/README_en.md",
    "content": "# spinalnet_res50_gemstone\n\n|Module Name|spinalnet_res50_gemstone|\n| :--- | :---: |\n|Category|image classification|\n|Network|resnet50|\n|Dataset|gemstone|\n|Fine-tuning supported or not|No|\n|Module Size|137MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module is based on SpinalNet, trained on the gemstone dataset, and can be used to classify gemstones.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install spinalnet_res50_gemstone\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run spinalnet_res50_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_res50_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list[numpy.ndarray]): image data.\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - 
```shell\n    $ hub install spinalnet_res50_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/spinalnet_vgg16_gemstone/README.md",
    "content": "# spinalnet_vgg16_gemstone\n\n|模型名称|spinalnet_vgg16_gemstone|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|vgg16|\n|数据集|gemstone|\n|是否支持Fine-tuning|否|\n|模型大小|1.5GB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - 使用PaddleHub的SpinalNet预训练模型进行宝石识别或finetune并完成宝石的预测任务。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install spinalnet_vgg16_gemstone\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run spinalnet_vgg16_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_vgg16_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - 分类接口API。\n    - **参数**\n      - images: list类型，待预测的图像。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install spinalnet_vgg16_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/spinalnet_vgg16_gemstone/README_en.md",
    "content": "# spinalnet_vgg16_gemstone\n\n|Module Name|spinalnet_vgg16_gemstone|\n| :--- | :---: |\n|Category|image classification|\n|Network|vgg16|\n|Dataset|gemstone|\n|Fine-tuning supported or not|No|\n|Module Size|1.5GB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - This module is based on SpinalNet, trained on the gemstone dataset, and can be used to classify gemstones.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install spinalnet_vgg16_gemstone\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run spinalnet_vgg16_gemstone --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"spinalnet_vgg16_gemstone\")\n    result = classifier.predict(['/PATH/TO/IMAGE'])\n    print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(images)\n    ```\n    - classification API.\n    - **Parameters**\n      - images (list\[numpy.ndarray\]): image data; ndarray.shape is [H, W, C], BGR channel order\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding 
probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install spinalnet_vgg16_gemstone==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg11_imagenet/README.md",
    "content": "# vgg11_imagenet\n\n|模型名称|vgg11_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|VGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|507MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - VGG是牛津大学计算机视觉组和DeepMind在2014年提出的一种图像分类模型。该系列模型探索了卷积神经网络的深度与其性能之间的关系，通过实验证明了增加网络的深度能够在一定程度上影响网络最终的性能，到目前为止，VGG仍然被许多其他图像任务用作特征提取的BackBone网络。该PaddleHub Module结构为VGG11，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install vgg11_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run vgg11_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg11_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install vgg11_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg11_imagenet/README_en.md",
    "content": "# vgg11_imagenet\n\n|Module Name|vgg11_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|VGG|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|507MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - VGG is a series of image classification models proposed in 2014 by the Visual Geometry Group at the University of Oxford and DeepMind. The series explores the relationship between network depth and performance, showing that deeper networks can improve accuracy, and VGG is still widely used as a backbone for feature extraction in other image tasks. This module uses the VGG11 architecture, is trained on ImageNet-2012, and accepts input images of size 224 x 224 x 3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install vgg11_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run vgg11_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg11_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 
classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install vgg11_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg13_imagenet/README.md",
    "content": "# vgg13_imagenet\n\n|模型名称|vgg13_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|VGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|508MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - VGG是牛津大学计算机视觉组和DeepMind在2014年提出的一种图像分类模型。该系列模型探索了卷积神经网络的深度与其性能之间的关系，通过实验证明了增加网络的深度能够在一定程度上影响网络最终的性能，到目前为止，VGG仍然被许多其他图像任务用作特征提取的BackBone网络。该PaddleHub Module结构为VGG13，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install vgg13_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run vgg13_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg13_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install vgg13_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg13_imagenet/README_en.md",
    "content": "# vgg13_imagenet\n\n|Module Name|vgg13_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|VGG|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|508MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - VGG is a series of image classification models proposed in 2014 by the Visual Geometry Group at the University of Oxford and DeepMind. The series explores the relationship between network depth and performance, showing that deeper networks can improve accuracy, and VGG is still widely used as a backbone for feature extraction in other image tasks. This module uses the VGG13 architecture, is trained on ImageNet-2012, and accepts input images of size 224 x 224 x 3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install vgg13_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run vgg13_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg13_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 
classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install vgg13_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/README.md",
    "content": "# vgg16_imagenet\n\n|模型名称|vgg16_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|VGG|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|528MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - VGG是牛津大学计算机视觉组和DeepMind在2014年提出的一种图像分类模型。该系列模型探索了卷积神经网络的深度与其性能之间的关系，通过实验证明了增加网络的深度能够在一定程度上影响网络最终的性能，到目前为止，VGG仍然被许多其他图像任务用作特征提取的BackBone网络。该PaddleHub Module结构为VGG16，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install vgg16_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run vgg16_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg16_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install vgg16_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/README_en.md",
    "content": "# vgg16_imagenet\n\n|Module Name|vgg16_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|VGG|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|528MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - VGG is a series of image classification models proposed in 2014 by the Visual Geometry Group at the University of Oxford and DeepMind. The series explores the relationship between network depth and performance, showing that deeper networks can improve accuracy, and VGG is still widely used as a backbone for feature extraction in other image tasks. This module uses the VGG16 architecture, is trained on ImageNet-2012, and accepts input images of size 224 x 224 x 3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install vgg16_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run vgg16_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg16_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 
classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install vgg16_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport numpy as np\nfrom PIL import Image\n\nDATA_DIM = 224\nimg_mean = np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\nimg_std = np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n\n\ndef resize_short(img, target_size):\n    \"\"\"Resize so that the shorter side equals target_size, keeping the aspect ratio.\"\"\"\n    percent = float(target_size) / min(img.size[0], img.size[1])\n    resized_width = int(round(img.size[0] * percent))\n    resized_height = int(round(img.size[1] * percent))\n    img = img.resize((resized_width, resized_height), Image.LANCZOS)\n    return img\n\n\ndef crop_image(img, target_size, center):\n    \"\"\"Crop a target_size x target_size patch, centered or at a random position.\"\"\"\n    width, height = img.size\n    size = target_size\n    if center:\n        w_start = (width - size) // 2\n        h_start = (height - size) // 2\n    else:\n        w_start = np.random.randint(0, width - size + 1)\n        h_start = np.random.randint(0, height - size + 1)\n    w_end = w_start + size\n    h_end = h_start + size\n    img = img.crop((w_start, h_start, w_end, h_end))\n    return img\n\n\ndef process_image(img):\n    \"\"\"Resize, center-crop and normalize an image into a CHW float32 array.\"\"\"\n    img = resize_short(img, target_size=256)\n    img = crop_image(img, target_size=DATA_DIM, center=True)\n    if img.mode != 'RGB':\n        img = img.convert('RGB')\n    img = np.array(img).astype('float32').transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"Data generator.\n    :param paths: paths to images.\n    :type paths: list, each element is a str\n    :param images: image data, [N, H, W, C]\n    :type images: numpy.ndarray\n    \"\"\"\n    img_list = []\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img_list.append(Image.open(img_path))\n    if images is not None:\n        for img in images:\n            img_list.append(Image.fromarray(np.uint8(img)))\n    for im in img_list:\n        yield process_image(im)\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/label_file.txt",
    "content": "tench, Tinca tinca\ngoldfish, Carassius auratus\ngreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\ntiger shark, Galeocerdo cuvieri\nhammerhead, hammerhead shark\nelectric ray, crampfish, numbfish, torpedo\nstingray\ncock\nhen\nostrich, Struthio camelus\nbrambling, Fringilla montifringilla\ngoldfinch, Carduelis carduelis\nhouse finch, linnet, Carpodacus mexicanus\njunco, snowbird\nindigo bunting, indigo finch, indigo bird, Passerina cyanea\nrobin, American robin, Turdus migratorius\nbulbul\njay\nmagpie\nchickadee\nwater ouzel, dipper\nkite\nbald eagle, American eagle, Haliaeetus leucocephalus\nvulture\ngreat grey owl, great gray owl, Strix nebulosa\nEuropean fire salamander, Salamandra salamandra\ncommon newt, Triturus vulgaris\neft\nspotted salamander, Ambystoma maculatum\naxolotl, mud puppy, Ambystoma mexicanum\nbullfrog, Rana catesbeiana\ntree frog, tree-frog\ntailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nloggerhead, loggerhead turtle, Caretta caretta\nleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nmud turtle\nterrapin\nbox turtle, box tortoise\nbanded gecko\ncommon iguana, iguana, Iguana iguana\nAmerican chameleon, anole, Anolis carolinensis\nwhiptail, whiptail lizard\nagama\nfrilled lizard, Chlamydosaurus kingi\nalligator lizard\nGila monster, Heloderma suspectum\ngreen lizard, Lacerta viridis\nAfrican chameleon, Chamaeleo chamaeleon\nKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nAfrican crocodile, Nile crocodile, Crocodylus niloticus\nAmerican alligator, Alligator mississipiensis\ntriceratops\nthunder snake, worm snake, Carphophis amoenus\nringneck snake, ring-necked snake, ring snake\nhognose snake, puff adder, sand viper\ngreen snake, grass snake\nking snake, kingsnake\ngarter snake, grass snake\nwater snake\nvine snake\nnight snake, Hypsiglena torquata\nboa constrictor, Constrictor constrictor\nrock python, rock snake, Python 
sebae\nIndian cobra, Naja naja\ngreen mamba\nsea snake\nhorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\ndiamondback, diamondback rattlesnake, Crotalus adamanteus\nsidewinder, horned rattlesnake, Crotalus cerastes\ntrilobite\nharvestman, daddy longlegs, Phalangium opilio\nscorpion\nblack and gold garden spider, Argiope aurantia\nbarn spider, Araneus cavaticus\ngarden spider, Aranea diademata\nblack widow, Latrodectus mactans\ntarantula\nwolf spider, hunting spider\ntick\ncentipede\nblack grouse\nptarmigan\nruffed grouse, partridge, Bonasa umbellus\nprairie chicken, prairie grouse, prairie fowl\npeacock\nquail\npartridge\nAfrican grey, African gray, Psittacus erithacus\nmacaw\nsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nlorikeet\ncoucal\nbee eater\nhornbill\nhummingbird\njacamar\ntoucan\ndrake\nred-breasted merganser, Mergus serrator\ngoose\nblack swan, Cygnus atratus\ntusker\nechidna, spiny anteater, anteater\nplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nwallaby, brush kangaroo\nkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nwombat\njellyfish\nsea anemone, anemone\nbrain coral\nflatworm, platyhelminth\nnematode, nematode worm, roundworm\nconch\nsnail\nslug\nsea slug, nudibranch\nchiton, coat-of-mail shell, sea cradle, polyplacophore\nchambered nautilus, pearly nautilus, nautilus\nDungeness crab, Cancer magister\nrock crab, Cancer irroratus\nfiddler crab\nking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\ncrayfish, crawfish, crawdad, crawdaddy\nhermit crab\nisopod\nwhite stork, Ciconia ciconia\nblack stork, Ciconia nigra\nspoonbill\nflamingo\nlittle blue heron, Egretta caerulea\nAmerican egret, great white heron, Egretta albus\nbittern\ncrane\nlimpkin, Aramus pictus\nEuropean 
gallinule, Porphyrio porphyrio\nAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nbustard\nruddy turnstone, Arenaria interpres\nred-backed sandpiper, dunlin, Erolia alpina\nredshank, Tringa totanus\ndowitcher\noystercatcher, oyster catcher\npelican\nking penguin, Aptenodytes patagonica\nalbatross, mollymawk\ngrey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus\nkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\ndugong, Dugong dugon\nsea lion\nChihuahua\nJapanese spaniel\nMaltese dog, Maltese terrier, Maltese\nPekinese, Pekingese, Peke\nShih-Tzu\nBlenheim spaniel\npapillon\ntoy terrier\nRhodesian ridgeback\nAfghan hound, Afghan\nbasset, basset hound\nbeagle\nbloodhound, sleuthhound\nbluetick\nblack-and-tan coonhound\nWalker hound, Walker foxhound\nEnglish foxhound\nredbone\nborzoi, Russian wolfhound\nIrish wolfhound\nItalian greyhound\nwhippet\nIbizan hound, Ibizan Podenco\nNorwegian elkhound, elkhound\notterhound, otter hound\nSaluki, gazelle hound\nScottish deerhound, deerhound\nWeimaraner\nStaffordshire bullterrier, Staffordshire bull terrier\nAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nBedlington terrier\nBorder terrier\nKerry blue terrier\nIrish terrier\nNorfolk terrier\nNorwich terrier\nYorkshire terrier\nwire-haired fox terrier\nLakeland terrier\nSealyham terrier, Sealyham\nAiredale, Airedale terrier\ncairn, cairn terrier\nAustralian terrier\nDandie Dinmont, Dandie Dinmont terrier\nBoston bull, Boston terrier\nminiature schnauzer\ngiant schnauzer\nstandard schnauzer\nScotch terrier, Scottish terrier, Scottie\nTibetan terrier, chrysanthemum dog\nsilky terrier, Sydney silky\nsoft-coated wheaten terrier\nWest Highland white terrier\nLhasa, Lhasa apso\nflat-coated retriever\ncurly-coated retriever\ngolden retriever\nLabrador retriever\nChesapeake Bay retriever\nGerman short-haired pointer\nvizsla, Hungarian pointer\nEnglish setter\nIrish setter, red 
setter\nGordon setter\nBrittany spaniel\nclumber, clumber spaniel\nEnglish springer, English springer spaniel\nWelsh springer spaniel\ncocker spaniel, English cocker spaniel, cocker\nSussex spaniel\nIrish water spaniel\nkuvasz\nschipperke\ngroenendael\nmalinois\nbriard\nkelpie\nkomondor\nOld English sheepdog, bobtail\nShetland sheepdog, Shetland sheep dog, Shetland\ncollie\nBorder collie\nBouvier des Flandres, Bouviers des Flandres\nRottweiler\nGerman shepherd, German shepherd dog, German police dog, alsatian\nDoberman, Doberman pinscher\nminiature pinscher\nGreater Swiss Mountain dog\nBernese mountain dog\nAppenzeller\nEntleBucher\nboxer\nbull mastiff\nTibetan mastiff\nFrench bulldog\nGreat Dane\nSaint Bernard, St Bernard\nEskimo dog, husky\nmalamute, malemute, Alaskan malamute\nSiberian husky\ndalmatian, coach dog, carriage dog\naffenpinscher, monkey pinscher, monkey dog\nbasenji\npug, pug-dog\nLeonberg\nNewfoundland, Newfoundland dog\nGreat Pyrenees\nSamoyed, Samoyede\nPomeranian\nchow, chow chow\nkeeshond\nBrabancon griffon\nPembroke, Pembroke Welsh corgi\nCardigan, Cardigan Welsh corgi\ntoy poodle\nminiature poodle\nstandard poodle\nMexican hairless\ntimber wolf, grey wolf, gray wolf, Canis lupus\nwhite wolf, Arctic wolf, Canis lupus tundrarum\nred wolf, maned wolf, Canis rufus, Canis niger\ncoyote, prairie wolf, brush wolf, Canis latrans\ndingo, warrigal, warragal, Canis dingo\ndhole, Cuon alpinus\nAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nhyena, hyaena\nred fox, Vulpes vulpes\nkit fox, Vulpes macrotis\nArctic fox, white fox, Alopex lagopus\ngrey fox, gray fox, Urocyon cinereoargenteus\ntabby, tabby cat\ntiger cat\nPersian cat\nSiamese cat, Siamese\nEgyptian cat\ncougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nlynx, catamount\nleopard, Panthera pardus\nsnow leopard, ounce, Panthera uncia\njaguar, panther, Panthera onca, Felis onca\nlion, king of beasts, Panthera leo\ntiger, Panthera tigris\ncheetah, chetah, 
Acinonyx jubatus\nbrown bear, bruin, Ursus arctos\nAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nsloth bear, Melursus ursinus, Ursus ursinus\nmongoose\nmeerkat, mierkat\ntiger beetle\nladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nground beetle, carabid beetle\nlong-horned beetle, longicorn, longicorn beetle\nleaf beetle, chrysomelid\ndung beetle\nrhinoceros beetle\nweevil\nfly\nbee\nant, emmet, pismire\ngrasshopper, hopper\ncricket\nwalking stick, walkingstick, stick insect\ncockroach, roach\nmantis, mantid\ncicada, cicala\nleafhopper\nlacewing, lacewing fly\ndragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk\ndamselfly\nadmiral\nringlet, ringlet butterfly\nmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\ncabbage butterfly\nsulphur butterfly, sulfur butterfly\nlycaenid, lycaenid butterfly\nstarfish, sea star\nsea urchin\nsea cucumber, holothurian\nwood rabbit, cottontail, cottontail rabbit\nhare\nAngora, Angora rabbit\nhamster\nporcupine, hedgehog\nfox squirrel, eastern fox squirrel, Sciurus niger\nmarmot\nbeaver\nguinea pig, Cavia cobaya\nsorrel\nzebra\nhog, pig, grunter, squealer, Sus scrofa\nwild boar, boar, Sus scrofa\nwarthog\nhippopotamus, hippo, river horse, Hippopotamus amphibius\nox\nwater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nbison\nram, tup\nbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nibex, Capra ibex\nhartebeest\nimpala, Aepyceros melampus\ngazelle\nArabian camel, dromedary, Camelus dromedarius\nllama\nweasel\nmink\npolecat, fitch, foulmart, foumart, Mustela putorius\nblack-footed ferret, ferret, Mustela nigripes\notter\nskunk, polecat, wood pussy\nbadger\narmadillo\nthree-toed sloth, ai, Bradypus tridactylus\norangutan, orang, orangutang, Pongo pygmaeus\ngorilla, Gorilla gorilla\nchimpanzee, chimp, 
Pan troglodytes\ngibbon, Hylobates lar\nsiamang, Hylobates syndactylus, Symphalangus syndactylus\nguenon, guenon monkey\npatas, hussar monkey, Erythrocebus patas\nbaboon\nmacaque\nlangur\ncolobus, colobus monkey\nproboscis monkey, Nasalis larvatus\nmarmoset\ncapuchin, ringtail, Cebus capucinus\nhowler monkey, howler\ntiti, titi monkey\nspider monkey, Ateles geoffroyi\nsquirrel monkey, Saimiri sciureus\nMadagascar cat, ring-tailed lemur, Lemur catta\nindri, indris, Indri indri, Indri brevicaudatus\nIndian elephant, Elephas maximus\nAfrican elephant, Loxodonta africana\nlesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens\ngiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nbarracouta, snoek\neel\ncoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nrock beauty, Holocanthus tricolor\nanemone fish\nsturgeon\ngar, garfish, garpike, billfish, Lepisosteus osseus\nlionfish\npuffer, pufferfish, blowfish, globefish\nabacus\nabaya\nacademic gown, academic robe, judge's robe\naccordion, piano accordion, squeeze box\nacoustic guitar\naircraft carrier, carrier, flattop, attack aircraft carrier\nairliner\nairship, dirigible\naltar\nambulance\namphibian, amphibious vehicle\nanalog clock\napiary, bee house\napron\nashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nassault rifle, assault gun\nbackpack, back pack, knapsack, packsack, rucksack, haversack\nbakery, bakeshop, bakehouse\nbalance beam, beam\nballoon\nballpoint, ballpoint pen, ballpen, Biro\nBand Aid\nbanjo\nbannister, banister, balustrade, balusters, handrail\nbarbell\nbarber chair\nbarbershop\nbarn\nbarometer\nbarrel, cask\nbarrow, garden cart, lawn cart, wheelbarrow\nbaseball\nbasketball\nbassinet\nbassoon\nbathing cap, swimming cap\nbath towel\nbathtub, bathing tub, bath, tub\nbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nbeacon, lighthouse, beacon light, pharos\nbeaker\nbearskin, 
busby, shako\nbeer bottle\nbeer glass\nbell cote, bell cot\nbib\nbicycle-built-for-two, tandem bicycle, tandem\nbikini, two-piece\nbinder, ring-binder\nbinoculars, field glasses, opera glasses\nbirdhouse\nboathouse\nbobsled, bobsleigh, bob\nbolo tie, bolo, bola tie, bola\nbonnet, poke bonnet\nbookcase\nbookshop, bookstore, bookstall\nbottlecap\nbow\nbow tie, bow-tie, bowtie\nbrass, memorial tablet, plaque\nbrassiere, bra, bandeau\nbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nbreastplate, aegis, egis\nbroom\nbucket, pail\nbuckle\nbulletproof vest\nbullet train, bullet\nbutcher shop, meat market\ncab, hack, taxi, taxicab\ncaldron, cauldron\ncandle, taper, wax light\ncannon\ncanoe\ncan opener, tin opener\ncardigan\ncar mirror\ncarousel, carrousel, merry-go-round, roundabout, whirligig\ncarpenter's kit, tool kit\ncarton\ncar wheel\ncash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\ncassette\ncassette player\ncastle\ncatamaran\nCD player\ncello, violoncello\ncellular telephone, cellular phone, cellphone, cell, mobile phone\nchain\nchainlink fence\nchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nchain saw, chainsaw\nchest\nchiffonier, commode\nchime, bell, gong\nchina cabinet, china closet\nChristmas stocking\nchurch, church building\ncinema, movie theater, movie theatre, movie house, picture palace\ncleaver, meat cleaver, chopper\ncliff dwelling\ncloak\nclog, geta, patten, sabot\ncocktail shaker\ncoffee mug\ncoffeepot\ncoil, spiral, volute, whorl, helix\ncombination lock\ncomputer keyboard, keypad\nconfectionery, confectionary, candy store\ncontainer ship, containership, container vessel\nconvertible\ncorkscrew, bottle screw\ncornet, horn, trumpet, trump\ncowboy boot\ncowboy hat, ten-gallon hat\ncradle\ncrane\ncrash helmet\ncrate\ncrib, cot\nCrock Pot\ncroquet ball\ncrutch\ncuirass\ndam, dike, dyke\ndesk\ndesktop computer\ndial telephone, dial 
phone\ndiaper, nappy, napkin\ndigital clock\ndigital watch\ndining table, board\ndishrag, dishcloth\ndishwasher, dish washer, dishwashing machine\ndisk brake, disc brake\ndock, dockage, docking facility\ndogsled, dog sled, dog sleigh\ndome\ndoormat, welcome mat\ndrilling platform, offshore rig\ndrum, membranophone, tympan\ndrumstick\ndumbbell\nDutch oven\nelectric fan, blower\nelectric guitar\nelectric locomotive\nentertainment center\nenvelope\nespresso maker\nface powder\nfeather boa, boa\nfile, file cabinet, filing cabinet\nfireboat\nfire engine, fire truck\nfire screen, fireguard\nflagpole, flagstaff\nflute, transverse flute\nfolding chair\nfootball helmet\nforklift\nfountain\nfountain pen\nfour-poster\nfreight car\nFrench horn, horn\nfrying pan, frypan, skillet\nfur coat\ngarbage truck, dustcart\ngasmask, respirator, gas helmet\ngas pump, gasoline pump, petrol pump, island dispenser\ngoblet\ngo-kart\ngolf ball\ngolfcart, golf cart\ngondola\ngong, tam-tam\ngown\ngrand piano, grand\ngreenhouse, nursery, glasshouse\ngrille, radiator grille\ngrocery store, grocery, food market, market\nguillotine\nhair slide\nhair spray\nhalf track\nhammer\nhamper\nhand blower, blow dryer, blow drier, hair dryer, hair drier\nhand-held computer, hand-held microcomputer\nhandkerchief, hankie, hanky, hankey\nhard disc, hard disk, fixed disk\nharmonica, mouth organ, harp, mouth harp\nharp\nharvester, reaper\nhatchet\nholster\nhome theater, home theatre\nhoneycomb\nhook, claw\nhoopskirt, crinoline\nhorizontal bar, high bar\nhorse cart, horse-cart\nhourglass\niPod\niron, smoothing iron\njack-o'-lantern\njean, blue jean, denim\njeep, landrover\njersey, T-shirt, tee shirt\njigsaw puzzle\njinrikisha, ricksha, rickshaw\njoystick\nkimono\nknee pad\nknot\nlab coat, laboratory coat\nladle\nlampshade, lamp shade\nlaptop, laptop computer\nlawn mower, mower\nlens cap, lens cover\nletter opener, paper knife, paperknife\nlibrary\nlifeboat\nlighter, light, igniter, ignitor\nlimousine, limo\nliner, 
ocean liner\nlipstick, lip rouge\nLoafer\nlotion\nloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nloupe, jeweler's loupe\nlumbermill, sawmill\nmagnetic compass\nmailbag, postbag\nmailbox, letter box\nmaillot\nmaillot, tank suit\nmanhole cover\nmaraca\nmarimba, xylophone\nmask\nmatchstick\nmaypole\nmaze, labyrinth\nmeasuring cup\nmedicine chest, medicine cabinet\nmegalith, megalithic structure\nmicrophone, mike\nmicrowave, microwave oven\nmilitary uniform\nmilk can\nminibus\nminiskirt, mini\nminivan\nmissile\nmitten\nmixing bowl\nmobile home, manufactured home\nModel T\nmodem\nmonastery\nmonitor\nmoped\nmortar\nmortarboard\nmosque\nmosquito net\nmotor scooter, scooter\nmountain bike, all-terrain bike, off-roader\nmountain tent\nmouse, computer mouse\nmousetrap\nmoving van\nmuzzle\nnail\nneck brace\nnecklace\nnipple\nnotebook, notebook computer\nobelisk\noboe, hautboy, hautbois\nocarina, sweet potato\nodometer, hodometer, mileometer, milometer\noil filter\norgan, pipe organ\noscilloscope, scope, cathode-ray oscilloscope, CRO\noverskirt\noxcart\noxygen mask\npacket\npaddle, boat paddle\npaddlewheel, paddle wheel\npadlock\npaintbrush\npajama, pyjama, pj's, jammies\npalace\npanpipe, pandean pipe, syrinx\npaper towel\nparachute, chute\nparallel bars, bars\npark bench\nparking meter\npassenger car, coach, carriage\npatio, terrace\npay-phone, pay-station\npedestal, plinth, footstall\npencil box, pencil case\npencil sharpener\nperfume, essence\nPetri dish\nphotocopier\npick, plectrum, plectron\npickelhaube\npicket fence, paling\npickup, pickup truck\npier\npiggy bank, penny bank\npill bottle\npillow\nping-pong ball\npinwheel\npirate, pirate ship\npitcher, ewer\nplane, carpenter's plane, woodworking plane\nplanetarium\nplastic bag\nplate rack\nplow, plough\nplunger, plumber's helper\nPolaroid camera, Polaroid Land camera\npole\npolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nponcho\npool table, billiard table, snooker 
table\npop bottle, soda bottle\npot, flowerpot\npotter's wheel\npower drill\nprayer rug, prayer mat\nprinter\nprison, prison house\nprojectile, missile\nprojector\npuck, hockey puck\npunching bag, punch bag, punching ball, punchball\npurse\nquill, quill pen\nquilt, comforter, comfort, puff\nracer, race car, racing car\nracket, racquet\nradiator\nradio, wireless\nradio telescope, radio reflector\nrain barrel\nrecreational vehicle, RV, R.V.\nreel\nreflex camera\nrefrigerator, icebox\nremote control, remote\nrestaurant, eating house, eating place, eatery\nrevolver, six-gun, six-shooter\nrifle\nrocking chair, rocker\nrotisserie\nrubber eraser, rubber, pencil eraser\nrugby ball\nrule, ruler\nrunning shoe\nsafe\nsafety pin\nsaltshaker, salt shaker\nsandal\nsarong\nsax, saxophone\nscabbard\nscale, weighing machine\nschool bus\nschooner\nscoreboard\nscreen, CRT screen\nscrew\nscrewdriver\nseat belt, seatbelt\nsewing machine\nshield, buckler\nshoe shop, shoe-shop, shoe store\nshoji\nshopping basket\nshopping cart\nshovel\nshower cap\nshower curtain\nski\nski mask\nsleeping bag\nslide rule, slipstick\nsliding door\nslot, one-armed bandit\nsnorkel\nsnowmobile\nsnowplow, snowplough\nsoap dispenser\nsoccer ball\nsock\nsolar dish, solar collector, solar furnace\nsombrero\nsoup bowl\nspace bar\nspace heater\nspace shuttle\nspatula\nspeedboat\nspider web, spider's web\nspindle\nsports car, sport car\nspotlight, spot\nstage\nsteam locomotive\nsteel arch bridge\nsteel drum\nstethoscope\nstole\nstone wall\nstopwatch, stop watch\nstove\nstrainer\nstreetcar, tram, tramcar, trolley, trolley car\nstretcher\nstudio couch, day bed\nstupa, tope\nsubmarine, pigboat, sub, U-boat\nsuit, suit of clothes\nsundial\nsunglass\nsunglasses, dark glasses, shades\nsunscreen, sunblock, sun blocker\nsuspension bridge\nswab, swob, mop\nsweatshirt\nswimming trunks, bathing trunks\nswing\nswitch, electric switch, electrical switch\nsyringe\ntable lamp\ntank, army tank, armored combat vehicle, armoured 
combat vehicle\ntape player\nteapot\nteddy, teddy bear\ntelevision, television system\ntennis ball\nthatch, thatched roof\ntheater curtain, theatre curtain\nthimble\nthresher, thrasher, threshing machine\nthrone\ntile roof\ntoaster\ntobacco shop, tobacconist shop, tobacconist\ntoilet seat\ntorch\ntotem pole\ntow truck, tow car, wrecker\ntoyshop\ntractor\ntrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\ntray\ntrench coat\ntricycle, trike, velocipede\ntrimaran\ntripod\ntriumphal arch\ntrolleybus, trolley coach, trackless trolley\ntrombone\ntub, vat\nturnstile\ntypewriter keyboard\numbrella\nunicycle, monocycle\nupright, upright piano\nvacuum, vacuum cleaner\nvase\nvault\nvelvet\nvending machine\nvestment\nviaduct\nviolin, fiddle\nvolleyball\nwaffle iron\nwall clock\nwallet, billfold, notecase, pocketbook\nwardrobe, closet, press\nwarplane, military plane\nwashbasin, handbasin, washbowl, lavabo, wash-hand basin\nwasher, automatic washer, washing machine\nwater bottle\nwater jug\nwater tower\nwhiskey jug\nwhistle\nwig\nwindow screen\nwindow shade\nWindsor tie\nwine bottle\nwing\nwok\nwooden spoon\nwool, woolen, woollen\nworm fence, snake fence, snake-rail fence, Virginia fence\nwreck\nyawl\nyurt\nweb site, website, internet site, site\ncomic book\ncrossword puzzle, crossword\nstreet sign\ntraffic light, traffic signal, stoplight\nbook jacket, dust cover, dust jacket, dust wrapper\nmenu\nplate\nguacamole\nconsomme\nhot pot, hotpot\ntrifle\nice cream, icecream\nice lolly, lolly, lollipop, popsicle\nFrench loaf\nbagel, beigel\npretzel\ncheeseburger\nhotdog, hot dog, red hot\nmashed potato\nhead cabbage\nbroccoli\ncauliflower\nzucchini, courgette\nspaghetti squash\nacorn squash\nbutternut squash\ncucumber, cuke\nartichoke, globe artichoke\nbell pepper\ncardoon\nmushroom\nGranny Smith\nstrawberry\norange\nlemon\nfig\npineapple, ananas\nbanana\njackfruit, jak, jack\ncustard apple\npomegranate\nhay\ncarbonara\nchocolate sauce, chocolate 
syrup\ndough\nmeat loaf, meatloaf\npizza, pizza pie\npotpie\nburrito\nred wine\nespresso\ncup\neggnog\nalp\nbubble\ncliff, drop, drop-off\ncoral reef\ngeyser\nlakeside, lakeshore\npromontory, headland, head, foreland\nsandbar, sand bar\nseashore, coast, seacoast, sea-coast\nvalley, vale\nvolcano\nballplayer, baseball player\ngroom, bridegroom\nscuba diver\nrapeseed\ndaisy\nyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\ncorn\nacorn\nhip, rose hip, rosehip\nbuckeye, horse chestnut, conker\ncoral fungus\nagaric\ngyromitra\nstinkhorn, carrion fungus\nearthstar\nhen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nbolete\near, spike, capitulum\ntoilet tissue, toilet paper, bathroom tissue\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/module.py",
    "content": "# coding=utf-8\nimport os\nimport ast\nimport argparse\n\nimport numpy as np\nimport paddlehub as hub\nimport paddle.fluid as fluid\nfrom paddlehub.module.module import moduleinfo, runnable\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.io.parser import txt_parser\n\nfrom vgg16_imagenet.vgg import VGG\nfrom vgg16_imagenet.processor import load_label_info\nfrom vgg16_imagenet.data_feed import test_reader\n\n\n@moduleinfo(\n    name=\"vgg16_imagenet\",\n    version=\"1.1.0\",\n    type=\"cv/classification\",\n    summary=\"VGG16 is a image classfication model trained with ImageNet-2012 dataset.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass VGG16(hub.Module):\n    def _initialize(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"vgg16_model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self.infer_prog = None\n        self.pred_out = None\n        self._set_config()\n\n    def get_expected_image_width(self):\n        return 224\n\n    def get_expected_image_height(self):\n        return 224\n\n    def get_pretrained_images_mean(self):\n        im_mean = np.array([0.485, 0.456, 0.406]).reshape(1, 3)\n        return im_mean\n\n    def get_pretrained_images_std(self):\n        im_std = np.array([0.229, 0.224, 0.225]).reshape(1, 3)\n        return im_std\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        cpu_config = AnalysisConfig(self.default_pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_paddle_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = 
False\n        if use_gpu:\n            gpu_config = AnalysisConfig(self.default_pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_paddle_predictor(gpu_config)\n\n    def context(self,\n                input_image=None,\n                trainable=True,\n                pretrained=True,\n                param_prefix='',\n                get_prediction=False,\n                extra_block_filters=((256, 512, 1, 2, 3), (128, 256, 1, 2, 3), (128, 256, 0, 1, 3), (128, 256, 0, 1,\n                                                                                                     3)),\n                normalizations=(20., -1, -1, -1, -1, -1)):\n        \"\"\"Distill the Head Features, so as to perform transfer learning.\n\n        :param input_image: image tensor.\n        :type input_image: <class 'paddle.fluid.framework.Variable'>\n        :param trainable: whether to set parameters trainable.\n        :type trainable: bool\n        :param pretrained: whether to load default pretrained model.\n        :type pretrained: bool\n        :param param_prefix: the prefix of parameters.\n        :type param_prefix: str\n        :param get_prediction: whether to get prediction.\n        :type get_prediction: bool\n        :param extra_block_filters: in each extra block, params:\n            [in_channel, out_channel, padding_size, stride_size, filter_size]\n        :type extra_block_filters: list\n        :param normalizations: params list of init scale in l2 norm, skip init\n            scale if param is -1.\n        :type normalizations: list\n        \"\"\"\n        context_prog = input_image.block.program if input_image else fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            image = input_image if input_image else fluid.data(\n                
name='image', shape=[-1, 3, 224, 224], dtype='float32', lod_level=0)\n\n            backbone = VGG(\n                depth=16,\n                with_extra_blocks=not get_prediction,\n                normalizations=normalizations,\n                extra_block_filters=extra_block_filters)\n\n            out = backbone(image)\n            inputs = {'image': image}\n            if get_prediction:\n                outputs = {'pred_out': out}\n            else:\n                outputs = {'body_feats': out}\n\n            place = fluid.CPUPlace()\n            exe = fluid.Executor(place)\n            if pretrained:\n\n                def _if_exist(var):\n                    return os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                if not param_prefix:\n                    fluid.io.load_vars(\n                        exe, self.default_pretrained_model_path, main_program=context_prog, predicate=_if_exist)\n            else:\n                exe.run(startup_program)\n            return inputs, outputs, context_prog\n\n    def classification(self, paths=None, images=None, use_gpu=False, batch_size=1, top_k=1):\n        \"\"\"API of Classification.\n        :param paths: the paths of images.\n        :type paths: list, each element corresponds to the path of an image.\n        :param images: data of images, [N, H, W, C]\n        :type images: numpy.ndarray\n        :param use_gpu: whether to use gpu or not.\n        :type use_gpu: bool\n        :param batch_size: batch size.\n        :type batch_size: int\n        :param top_k: number of top results to return.\n        :type top_k: int\n        \"\"\"\n        if self.infer_prog is None:\n            inputs, outputs, self.infer_prog = self.context(trainable=False, pretrained=True, get_prediction=True)\n            self.infer_prog = self.infer_prog.clone(for_test=True)\n            self.pred_out = outputs['pred_out']\n        place = fluid.CUDAPlace(0) if use_gpu else fluid.CPUPlace()\n        exe = 
fluid.Executor(place)\n        all_images = []\n        paths = paths if paths else []\n        for yield_data in test_reader(paths, images):\n            all_images.append(yield_data)\n\n        images_num = len(all_images)\n        loop_num = int(np.ceil(images_num / batch_size))\n        res_list = []\n        top_k = max(min(top_k, 1000), 1)\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except:\n                    pass\n            batch_data = np.array(batch_data).astype('float32')\n            data_tensor = PaddleTensor(batch_data.copy())\n            if use_gpu:\n                result = self.gpu_predictor.run([data_tensor])\n            else:\n                result = self.cpu_predictor.run([data_tensor])\n            for i, res in enumerate(result[0].as_ndarray()):\n                res_dict = {}\n                pred_label = np.argsort(res)[::-1][:top_k]\n                for k in pred_label:\n                    class_name = self.label_names[int(k)].split(',')[0]\n                    max_prob = res[k]\n                    res_dict[class_name] = max_prob\n                res_list.append(res_dict)\n        return res_list\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"input data\")\n\n        
self.arg_input_group.add_argument('--input_file', type=str, default=None, help=\"file containing input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s does not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\"File %s does not exist.\" % image_path)\n        return self.classification(paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/processor.py",
    "content": "# coding=utf-8\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        return fr.read().split(\"\\n\")[:-1]\n"
  },
  {
    "path": "modules/image/classification/vgg16_imagenet/vgg.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\n\n__all__ = ['VGG']\n\n\nclass VGG(object):\n    \"\"\"\n    VGG, see https://arxiv.org/abs/1409.1556\n\n    Args:\n        depth (int): the VGG net depth (16 or 19)\n        normalizations (list): params list of init scale in l2 norm, skip init\n            scale if param is -1.\n        with_extra_blocks (bool): whether or not extra blocks should be added\n        extra_block_filters (list): in each extra block, params:\n            [in_channel, out_channel, padding_size, stride_size, filter_size]\n        class_dim (int): number of class while classification\n    \"\"\"\n\n    def __init__(self,\n                 depth=16,\n                 with_extra_blocks=False,\n                 normalizations=[20., -1, -1, -1, -1, -1],\n                 extra_block_filters=[[256, 512, 1, 2, 3], [128, 256, 1, 2, 3], [128, 256, 0, 1, 3],\n                                      [128, 256, 0, 1, 3]],\n                 class_dim=1000):\n        assert depth in [16, 19], \"depth {} not in [16, 19]\"\n        self.depth = depth\n        self.depth_cfg = {16: [2, 2, 3, 3, 3], 19: [2, 2, 4, 4, 4]}\n        self.with_extra_blocks = with_extra_blocks\n        self.normalizations = normalizations\n        self.extra_block_filters = extra_block_filters\n        self.class_dim = class_dim\n\n    def __call__(self, input):\n        layers = []\n        layers += self._vgg_block(input)\n\n        if not self.with_extra_blocks:\n            return layers[-1]\n\n        layers += self._add_extras_block(layers[-1])\n        norm_cfg = self.normalizations\n        for k, v in enumerate(layers):\n            if not norm_cfg[k] == -1:\n                layers[k] = self._l2_norm_scale(v, init_scale=norm_cfg[k])\n\n        return layers\n\n    def _vgg_block(self, input):\n        
nums = self.depth_cfg[self.depth]\n        vgg_base = [64, 128, 256, 512, 512]\n        conv = input\n        res_layer = []\n        layers = []\n        for k, v in enumerate(vgg_base):\n            conv = self._conv_block(conv, v, nums[k], name=\"conv{}_\".format(k + 1))\n            layers.append(conv)\n            if self.with_extra_blocks:\n                if k == 4:\n                    conv = self._pooling_block(conv, 3, 1, pool_padding=1)\n                else:\n                    conv = self._pooling_block(conv, 2, 2)\n            else:\n                conv = self._pooling_block(conv, 2, 2)\n        if not self.with_extra_blocks:\n            fc_dim = 4096\n            fc_name = [\"fc6\", \"fc7\", \"fc8\"]\n            fc1 = fluid.layers.fc(\n                input=conv,\n                size=fc_dim,\n                act='relu',\n                param_attr=fluid.param_attr.ParamAttr(name=fc_name[0] + \"_weights\"),\n                bias_attr=fluid.param_attr.ParamAttr(name=fc_name[0] + \"_offset\"))\n            fc2 = fluid.layers.fc(\n                input=fc1,\n                size=fc_dim,\n                act='relu',\n                param_attr=fluid.param_attr.ParamAttr(name=fc_name[1] + \"_weights\"),\n                bias_attr=fluid.param_attr.ParamAttr(name=fc_name[1] + \"_offset\"))\n            out = fluid.layers.fc(\n                input=fc2,\n                size=self.class_dim,\n                param_attr=fluid.param_attr.ParamAttr(name=fc_name[2] + \"_weights\"),\n                bias_attr=fluid.param_attr.ParamAttr(name=fc_name[2] + \"_offset\"))\n            out = fluid.layers.softmax(out)\n            res_layer.append(out)\n            return [out]\n        else:\n            fc6 = self._conv_layer(conv, 1024, 3, 1, 6, dilation=6, name=\"fc6\")\n            fc7 = self._conv_layer(fc6, 1024, 1, 1, 0, name=\"fc7\")\n            return [layers[3], fc7]\n\n    def _add_extras_block(self, input):\n        cfg = self.extra_block_filters\n      
  conv = input\n        layers = []\n        for k, v in enumerate(cfg):\n            assert len(v) == 5, \"extra_block_filters size not fix\"\n            conv = self._extra_block(conv, v[0], v[1], v[2], v[3], v[4], name=\"conv{}_\".format(6 + k))\n            layers.append(conv)\n\n        return layers\n\n    def _conv_block(self, input, num_filter, groups, name=None):\n        conv = input\n        for i in range(groups):\n            conv = self._conv_layer(\n                input=conv,\n                num_filters=num_filter,\n                filter_size=3,\n                stride=1,\n                padding=1,\n                act='relu',\n                name=name + str(i + 1))\n        return conv\n\n    def _extra_block(self, input, num_filters1, num_filters2, padding_size, stride_size, filter_size, name=None):\n        # 1x1 conv\n        conv_1 = self._conv_layer(\n            input=input, num_filters=int(num_filters1), filter_size=1, stride=1, act='relu', padding=0, name=name + \"1\")\n\n        # 3x3 conv\n        conv_2 = self._conv_layer(\n            input=conv_1,\n            num_filters=int(num_filters2),\n            filter_size=filter_size,\n            stride=stride_size,\n            act='relu',\n            padding=padding_size,\n            name=name + \"2\")\n        return conv_2\n\n    def _conv_layer(self,\n                    input,\n                    num_filters,\n                    filter_size,\n                    stride,\n                    padding,\n                    dilation=1,\n                    act='relu',\n                    use_cudnn=True,\n                    name=None):\n        conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=num_filters,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            dilation=dilation,\n            act=act,\n            use_cudnn=use_cudnn,\n            param_attr=ParamAttr(name=name + \"_weights\"),\n     
       bias_attr=ParamAttr(name=name + \"_biases\") if self.with_extra_blocks else False,\n            name=name + '.conv2d.output.1')\n        return conv\n\n    def _pooling_block(self, conv, pool_size, pool_stride, pool_padding=0, ceil_mode=True):\n        pool = fluid.layers.pool2d(\n            input=conv,\n            pool_size=pool_size,\n            pool_type='max',\n            pool_stride=pool_stride,\n            pool_padding=pool_padding,\n            ceil_mode=ceil_mode)\n        return pool\n\n    def _l2_norm_scale(self, input, init_scale=1.0, channel_shared=False):\n        from paddle.fluid.layer_helper import LayerHelper\n        from paddle.fluid.initializer import Constant\n        helper = LayerHelper(\"Scale\")\n        l2_norm = fluid.layers.l2_normalize(input, axis=1)  # l2 norm along channel\n        shape = [1] if channel_shared else [input.shape[1]]\n        scale = helper.create_parameter(\n            attr=helper.param_attr, shape=shape, dtype=input.dtype, default_initializer=Constant(init_scale))\n        out = fluid.layers.elementwise_mul(\n            x=l2_norm, y=scale, axis=-1 if channel_shared else 1, name=\"conv4_3_norm_scale\")\n        return out\n"
  },
  {
    "path": "modules/image/classification/vgg19_imagenet/README.md",
    "content": "# vgg19_imagenet\n\n|模型名称|vgg19_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|vgg19_imagenet|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|549MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - VGG是牛津大学计算机视觉组和DeepMind在2014年提出的一种图像分类模型。该系列模型探索了卷积神经网络的深度与其性能之间的关系，通过实验证明了增加网络的深度能够在一定程度上影响网络最终的性能，到目前为止，VGG仍然被许多其他图像任务用作特征提取的BackBone网络。该PaddleHub Module结构为VGG19，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者Python接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install vgg19_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run vgg19_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg19_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install vgg19_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/vgg19_imagenet/README_en.md",
    "content": "# vgg19_imagenet\n\n|Module Name|vgg19_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|vgg19_imagenet|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|549MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n  - VGG is a serial of models for image classification proposed by university of Oxford and DeepMind. The serial models demonstrate 'the deeper the network is, the better the performance is'. And VGG is used for feature extraction as the backbone by most image classification tasks. This module is based on VGG19, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install vgg19_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run vgg19_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"vgg19_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    
- classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result (list[dict]): classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install vgg19_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception41_imagenet/README.md",
    "content": "# xception41_imagenet\n\n|模型名称|xception41_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Xception|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Xception 全称为 Extreme Inception，是 Google 于 2016年提出的 Inception V3 的改进模型。Xception 采用了深度可分离卷积(depthwise separable convolution) 来替换原来 Inception V3 中的卷积操作，整体的网络结构是带有残差连接的深度可分离卷积层的线性堆叠。该PaddleHub Module结构为Xception41，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install xception41_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run xception41_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception41_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install xception41_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception41_imagenet/README_en.md",
    "content": "# xception41_imagenet\n\n|Module Name|xception41_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Xception|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Xception is a model proposed by Google in 2016, which is an improvement on Inception V3. This module is based on Xception41, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install xception41_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run xception41_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception41_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): 
classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install xception41_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception41_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport sys\nimport math\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = \"bn_\" + name\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            
bias_attr=ParamAttr(name=bn_name + \"_offset\"),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SeparableConv(nn.Layer):\n    \"\"\"Basic separable conv layer, it contains pointwise conv and depthwise conv.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, stride: int = 1, name: str = None):\n        super(SeparableConv, self).__init__()\n\n        self._pointwise_conv = ConvBNLayer(input_channels, output_channels, 1, name=name + \"_sep\")\n        self._depthwise_conv = ConvBNLayer(\n            output_channels, output_channels, 3, stride=stride, groups=output_channels, name=name + \"_dw\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._pointwise_conv(inputs)\n        x = self._depthwise_conv(x)\n        return x\n\n\nclass EntryFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic entry flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self,\n                 input_channels: int,\n                 output_channels: int,\n                 stride: int = 2,\n                 name: str = None,\n                 relu_first: bool = False):\n        super(EntryFlowBottleneckBlock, self).__init__()\n        self.relu_first = relu_first\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv1 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=stride, 
padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = inputs\n        short = self._short(inputs)\n        if self.relu_first:\n            conv0 = F.relu(conv0)\n        conv1 = self._conv1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass EntryFlow(nn.Layer):\n    \"\"\"Entry flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 3):\n        super(EntryFlow, self).__init__()\n\n        name = \"entry_flow\"\n        self.block_num = block_num\n        self._conv1 = ConvBNLayer(3, 32, 3, stride=2, act=\"relu\", name=name + \"_conv1\")\n        self._conv2 = ConvBNLayer(32, 64, 3, act=\"relu\", name=name + \"_conv2\")\n        if block_num == 3:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=2, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 728, stride=2, name=name + \"_2\", relu_first=True)\n        elif block_num == 5:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=1, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 256, stride=2, name=name + \"_2\", relu_first=True)\n            self._conv_3 = EntryFlowBottleneckBlock(256, 728, stride=1, name=name + \"_3\", relu_first=True)\n            self._conv_4 = EntryFlowBottleneckBlock(728, 728, stride=2, name=name + \"_4\", relu_first=True)\n        else:\n            sys.exit(-1)\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv1(inputs)\n        x = self._conv2(x)\n\n        if self.block_num == 3:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = 
self._conv_2(x)\n        elif self.block_num == 5:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = self._conv_2(x)\n            x = self._conv_3(x)\n            x = self._conv_4(x)\n        return x\n\n\nclass MiddleFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic middle flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, name: str):\n        super(MiddleFlowBottleneckBlock, self).__init__()\n\n        self._conv_0 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv_1 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._conv_2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2c_weights\")\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = F.relu(inputs)\n        conv0 = self._conv_0(conv0)\n        conv1 = F.relu(conv0)\n        conv1 = self._conv_1(conv1)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        return paddle.elementwise_add(x=inputs, y=conv2)\n\n\nclass MiddleFlow(nn.Layer):\n    \"\"\"Middle flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 8):\n        super(MiddleFlow, self).__init__()\n\n        self.block_num = block_num\n        self._conv_0 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_0\")\n        self._conv_1 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_1\")\n        self._conv_2 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_2\")\n        self._conv_3 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_3\")\n        self._conv_4 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_4\")\n        self._conv_5 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_5\")\n        self._conv_6 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_6\")\n        self._conv_7 = MiddleFlowBottleneckBlock(728, 
728, name=\"middle_flow_7\")\n        if block_num == 16:\n            self._conv_8 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_8\")\n            self._conv_9 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_9\")\n            self._conv_10 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_10\")\n            self._conv_11 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_11\")\n            self._conv_12 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_12\")\n            self._conv_13 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_13\")\n            self._conv_14 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_14\")\n            self._conv_15 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_15\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv_0(inputs)\n        x = self._conv_1(x)\n        x = self._conv_2(x)\n        x = self._conv_3(x)\n        x = self._conv_4(x)\n        x = self._conv_5(x)\n        x = self._conv_6(x)\n        x = self._conv_7(x)\n        if self.block_num == 16:\n            x = self._conv_8(x)\n            x = self._conv_9(x)\n            x = self._conv_10(x)\n            x = self._conv_11(x)\n            x = self._conv_12(x)\n            x = self._conv_13(x)\n            x = self._conv_14(x)\n            x = self._conv_15(x)\n        return x\n\n\nclass ExitFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic exit flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels1: int, output_channels2: int, name: str):\n        super(ExitFlowBottleneckBlock, self).__init__()\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels2,\n            kernel_size=1,\n            stride=2,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv_1 = SeparableConv(input_channels, 
output_channels1, stride=1, name=name + \"_branch2a_weights\")\n        self._conv_2 = SeparableConv(output_channels1, output_channels2, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        short = self._short(inputs)\n        conv0 = F.relu(inputs)\n        conv1 = self._conv_1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass ExitFlow(nn.Layer):\n    \"\"\"Exit flow for Xception.\"\"\"\n\n    def __init__(self, class_dim: int):\n        super(ExitFlow, self).__init__()\n\n        name = \"exit_flow\"\n\n        self._conv_0 = ExitFlowBottleneckBlock(728, 728, 1024, name=name + \"_1\")\n        self._conv_1 = SeparableConv(1024, 1536, stride=1, name=name + \"_2\")\n        self._conv_2 = SeparableConv(1536, 2048, stride=1, name=name + \"_3\")\n        self._pool = AdaptiveAvgPool2d(1)\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n        self._out = Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = self._conv_0(inputs)\n        conv1 = self._conv_1(conv0)\n        conv1 = F.relu(conv1)\n        conv2 = self._conv_2(conv1)\n        conv2 = F.relu(conv2)\n        pool = self._pool(conv2)\n        pool = paddle.reshape(pool, [0, -1])\n        out = self._out(pool)\n        return out\n\n\n@moduleinfo(\n    name=\"xception41_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Xception41_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass 
Xception41(nn.Layer):\n    \"\"\"Xception41 model.\"\"\"\n\n    def __init__(self, class_dim: int = 1000, load_checkpoint: str = None):\n        super(Xception41, self).__init__()\n        self.entry_flow_block_num = 3\n        self.middle_flow_block_num = 8\n        self._entry_flow = EntryFlow(self.entry_flow_block_num)\n        self._middle_flow = MiddleFlow(self.middle_flow_block_num)\n        self._exit_flow = ExitFlow(class_dim)\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'xception41_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/xception41_imagenet.pdparams -O'\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._entry_flow(inputs)\n        x = self._middle_flow(x)\n        x = self._exit_flow(x)\n        return x\n"
  },
  {
    "path": "modules/image/classification/xception65_imagenet/README.md",
    "content": "# xception65_imagenet\n\n|模型名称|xception65_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Xception|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|140MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Xception 全称为 Extreme Inception，是 Google 于 2016年提出的 Inception V3 的改进模型。Xception 采用了深度可分离卷积(depthwise separable convolution) 来替换原来 Inception V3 中的卷积操作，整体的网络结构是带有残差连接的深度可分离卷积层的线性堆叠。该PaddleHub Module结构为Xception65，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install xception65_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run xception65_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception65_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install xception65_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception65_imagenet/README_en.md",
    "content": "# xception65_imagenet\n\n|Module Name|xception65_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Xception|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|140MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Xception is a model proposed by Google in 2016, which is an improvement on Inception V3. This module is based on Xception65, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install xception65_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run xception65_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception65_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): 
classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install xception65_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception65_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport sys\nimport math\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = \"bn_\" + name\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            
bias_attr=ParamAttr(name=bn_name + \"_offset\"),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SeparableConv(nn.Layer):\n    \"\"\"Basic separable conv layer, it contains pointwise conv and depthwise conv.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, stride: int = 1, name: str = None):\n        super(SeparableConv, self).__init__()\n\n        self._pointwise_conv = ConvBNLayer(input_channels, output_channels, 1, name=name + \"_sep\")\n        self._depthwise_conv = ConvBNLayer(\n            output_channels, output_channels, 3, stride=stride, groups=output_channels, name=name + \"_dw\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._pointwise_conv(inputs)\n        x = self._depthwise_conv(x)\n        return x\n\n\nclass EntryFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic entry flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self,\n                 input_channels: int,\n                 output_channels: int,\n                 stride: int = 2,\n                 name: str = None,\n                 relu_first: bool = False):\n        super(EntryFlowBottleneckBlock, self).__init__()\n        self.relu_first = relu_first\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv1 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=stride, 
padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = inputs\n        short = self._short(inputs)\n        if self.relu_first:\n            conv0 = F.relu(conv0)\n        conv1 = self._conv1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass EntryFlow(nn.Layer):\n    \"\"\"Entry flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 3):\n        super(EntryFlow, self).__init__()\n\n        name = \"entry_flow\"\n        self.block_num = block_num\n        self._conv1 = ConvBNLayer(3, 32, 3, stride=2, act=\"relu\", name=name + \"_conv1\")\n        self._conv2 = ConvBNLayer(32, 64, 3, act=\"relu\", name=name + \"_conv2\")\n        if block_num == 3:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=2, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 728, stride=2, name=name + \"_2\", relu_first=True)\n        elif block_num == 5:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=1, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 256, stride=2, name=name + \"_2\", relu_first=True)\n            self._conv_3 = EntryFlowBottleneckBlock(256, 728, stride=1, name=name + \"_3\", relu_first=True)\n            self._conv_4 = EntryFlowBottleneckBlock(728, 728, stride=2, name=name + \"_4\", relu_first=True)\n        else:\n            sys.exit(-1)\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv1(inputs)\n        x = self._conv2(x)\n\n        if self.block_num == 3:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = 
self._conv_2(x)\n        elif self.block_num == 5:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = self._conv_2(x)\n            x = self._conv_3(x)\n            x = self._conv_4(x)\n        return x\n\n\nclass MiddleFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic middle flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, name: str):\n        super(MiddleFlowBottleneckBlock, self).__init__()\n\n        self._conv_0 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv_1 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._conv_2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2c_weights\")\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = F.relu(inputs)\n        conv0 = self._conv_0(conv0)\n        conv1 = F.relu(conv0)\n        conv1 = self._conv_1(conv1)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        return paddle.elementwise_add(x=inputs, y=conv2)\n\n\nclass MiddleFlow(nn.Layer):\n    \"\"\"Middle flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 8):\n        super(MiddleFlow, self).__init__()\n\n        self.block_num = block_num\n        self._conv_0 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_0\")\n        self._conv_1 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_1\")\n        self._conv_2 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_2\")\n        self._conv_3 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_3\")\n        self._conv_4 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_4\")\n        self._conv_5 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_5\")\n        self._conv_6 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_6\")\n        self._conv_7 = MiddleFlowBottleneckBlock(728, 
728, name=\"middle_flow_7\")\n        if block_num == 16:\n            self._conv_8 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_8\")\n            self._conv_9 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_9\")\n            self._conv_10 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_10\")\n            self._conv_11 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_11\")\n            self._conv_12 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_12\")\n            self._conv_13 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_13\")\n            self._conv_14 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_14\")\n            self._conv_15 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_15\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv_0(inputs)\n        x = self._conv_1(x)\n        x = self._conv_2(x)\n        x = self._conv_3(x)\n        x = self._conv_4(x)\n        x = self._conv_5(x)\n        x = self._conv_6(x)\n        x = self._conv_7(x)\n        if self.block_num == 16:\n            x = self._conv_8(x)\n            x = self._conv_9(x)\n            x = self._conv_10(x)\n            x = self._conv_11(x)\n            x = self._conv_12(x)\n            x = self._conv_13(x)\n            x = self._conv_14(x)\n            x = self._conv_15(x)\n        return x\n\n\nclass ExitFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic exit flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels, output_channels1, output_channels2, name):\n        super(ExitFlowBottleneckBlock, self).__init__()\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels2,\n            kernel_size=1,\n            stride=2,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv_1 = SeparableConv(input_channels, output_channels1, 
stride=1, name=name + \"_branch2a_weights\")\n        self._conv_2 = SeparableConv(output_channels1, output_channels2, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        short = self._short(inputs)\n        conv0 = F.relu(inputs)\n        conv1 = self._conv_1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass ExitFlow(nn.Layer):\n    \"\"\"Exit flow for Xception.\"\"\"\n\n    def __init__(self, class_dim):\n        super(ExitFlow, self).__init__()\n\n        name = \"exit_flow\"\n\n        self._conv_0 = ExitFlowBottleneckBlock(728, 728, 1024, name=name + \"_1\")\n        self._conv_1 = SeparableConv(1024, 1536, stride=1, name=name + \"_2\")\n        self._conv_2 = SeparableConv(1536, 2048, stride=1, name=name + \"_3\")\n        self._pool = AdaptiveAvgPool2d(1)\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n        self._out = Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = self._conv_0(inputs)\n        conv1 = self._conv_1(conv0)\n        conv1 = F.relu(conv1)\n        conv2 = self._conv_2(conv1)\n        conv2 = F.relu(conv2)\n        pool = self._pool(conv2)\n        pool = paddle.reshape(pool, [0, -1])\n        out = self._out(pool)\n        return out\n\n\n@moduleinfo(\n    name=\"xception65_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Xception65_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass Xception65(nn.Layer):\n    def 
__init__(self, class_dim=1000, load_checkpoint: str = None):\n        super(Xception65, self).__init__()\n        self.entry_flow_block_num = 3\n        self.middle_flow_block_num = 16\n        self._entry_flow = EntryFlow(self.entry_flow_block_num)\n        self._middle_flow = MiddleFlow(self.middle_flow_block_num)\n        self._exit_flow = ExitFlow(class_dim)\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'xception65_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/xception65_imagenet.pdparams -O'\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs):\n        x = self._entry_flow(inputs)\n        x = self._middle_flow(x)\n        x = self._exit_flow(x)\n        return x\n"
  },
  {
    "path": "modules/image/classification/xception71_imagenet/README.md",
    "content": "# xception71_imagenet\n\n|模型名称|xception71_imagenet|\n| :--- | :---: |\n|类别|图像-图像分类|\n|网络|Xception|\n|数据集|ImageNet-2012|\n|是否支持Fine-tuning|否|\n|模型大小|147MB|\n|最新更新日期|-|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n\n\n- ### 模型介绍\n\n  - Xception 全称为 Extreme Inception，是 Google 于 2016年提出的 Inception V3 的改进模型。Xception 采用了深度可分离卷积(depthwise separable convolution) 来替换原来 Inception V3 中的卷积操作，整体的网络结构是带有残差连接的深度可分离卷积层的线性堆叠。该PaddleHub Module结构为Xception71，基于ImageNet-2012数据集训练，接受输入图片大小为224 x 224 x 3，支持直接通过命令行或者 Python 接口进行预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install xception71_imagenet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run xception71_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现图像分类模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception71_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - 分类接口API。\n    - **参数**\n      - data：dict类型，key为image，str类型，value为待检测的图片路径，list类型。\n\n    - **返回**\n      - result：list类型，每个元素为对应输入图片的预测结果。预测结果为dict类型，key为该图片分类结果label，value为该label对应的概率\n\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install xception71_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception71_imagenet/README_en.md",
    "content": "# xception71_imagenet\n\n|Module Name|xception71_imagenet|\n| :--- | :---: |\n|Category|image classification|\n|Network|Xception|\n|Dataset|ImageNet-2012|\n|Fine-tuning supported or not|No|\n|Module Size|147MB|\n|Latest update date|-|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n\n\n- ### Module Introduction\n\n  - Xception is a model proposed by Google in 2016, which is an improvement on Inception V3. This module is based on Xception71, trained on ImageNet-2012, and can predict an image of size 224*224*3.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.4.0  \n\n  - paddlehub >= 1.0.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install xception71_imagenet\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run xception71_imagenet --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    classifier = hub.Module(name=\"xception71_imagenet\")\n    test_img_path = \"/PATH/TO/IMAGE\"\n    input_dict = {\"image\": [test_img_path]}\n    result = classifier.classification(data=input_dict)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def classification(data)\n    ```\n    - classification API.\n    - **Parameters**\n      - data (dict): key is \"image\", value is a list of image paths\n\n    - **Return**\n      - result(list[dict]): 
classification results; each element in the list is a dict whose key is the label name and whose value is the corresponding probability\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install xception71_imagenet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/classification/xception71_imagenet/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport sys\nimport math\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2d, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2d, MaxPool2d, AvgPool2d\nfrom paddle.nn.initializer import Uniform\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.cv_module import ImageClassifierModule\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 filter_size: int,\n                 stride: int = 1,\n                 groups: int = 1,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2d(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        bn_name = \"bn_\" + name\n        self._batch_norm = BatchNorm(\n            num_filters,\n            act=act,\n            param_attr=ParamAttr(name=bn_name + \"_scale\"),\n            
bias_attr=ParamAttr(name=bn_name + \"_offset\"),\n            moving_mean_name=bn_name + '_mean',\n            moving_variance_name=bn_name + '_variance')\n\n    def forward(self, inputs: paddle.Tensor):\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        return y\n\n\nclass SeparableConv(nn.Layer):\n    \"\"\"Basic separable conv layer, it contains pointwise conv and depthwise conv.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, stride: int = 1, name: str = None):\n        super(SeparableConv, self).__init__()\n\n        self._pointwise_conv = ConvBNLayer(input_channels, output_channels, 1, name=name + \"_sep\")\n        self._depthwise_conv = ConvBNLayer(\n            output_channels, output_channels, 3, stride=stride, groups=output_channels, name=name + \"_dw\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._pointwise_conv(inputs)\n        x = self._depthwise_conv(x)\n        return x\n\n\nclass EntryFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic entry flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self,\n                 input_channels: int,\n                 output_channels: int,\n                 stride: int = 2,\n                 name: str = None,\n                 relu_first: bool = False):\n        super(EntryFlowBottleneckBlock, self).__init__()\n        self.relu_first = relu_first\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels,\n            kernel_size=1,\n            stride=stride,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv1 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=stride, 
padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = inputs\n        short = self._short(inputs)\n        if self.relu_first:\n            conv0 = F.relu(conv0)\n        conv1 = self._conv1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass EntryFlow(nn.Layer):\n    \"\"\"Entry flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 3):\n        super(EntryFlow, self).__init__()\n\n        name = \"entry_flow\"\n        self.block_num = block_num\n        self._conv1 = ConvBNLayer(3, 32, 3, stride=2, act=\"relu\", name=name + \"_conv1\")\n        self._conv2 = ConvBNLayer(32, 64, 3, act=\"relu\", name=name + \"_conv2\")\n        if block_num == 3:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=2, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 728, stride=2, name=name + \"_2\", relu_first=True)\n        elif block_num == 5:\n            self._conv_0 = EntryFlowBottleneckBlock(64, 128, stride=2, name=name + \"_0\", relu_first=False)\n            self._conv_1 = EntryFlowBottleneckBlock(128, 256, stride=1, name=name + \"_1\", relu_first=True)\n            self._conv_2 = EntryFlowBottleneckBlock(256, 256, stride=2, name=name + \"_2\", relu_first=True)\n            self._conv_3 = EntryFlowBottleneckBlock(256, 728, stride=1, name=name + \"_3\", relu_first=True)\n            self._conv_4 = EntryFlowBottleneckBlock(728, 728, stride=2, name=name + \"_4\", relu_first=True)\n        else:\n            sys.exit(-1)\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv1(inputs)\n        x = self._conv2(x)\n\n        if self.block_num == 3:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = 
self._conv_2(x)\n        elif self.block_num == 5:\n            x = self._conv_0(x)\n            x = self._conv_1(x)\n            x = self._conv_2(x)\n            x = self._conv_3(x)\n            x = self._conv_4(x)\n        return x\n\n\nclass MiddleFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic middle flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels: int, name: str):\n        super(MiddleFlowBottleneckBlock, self).__init__()\n\n        self._conv_0 = SeparableConv(input_channels, output_channels, stride=1, name=name + \"_branch2a_weights\")\n        self._conv_1 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2b_weights\")\n        self._conv_2 = SeparableConv(output_channels, output_channels, stride=1, name=name + \"_branch2c_weights\")\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = F.relu(inputs)\n        conv0 = self._conv_0(conv0)\n        conv1 = F.relu(conv0)\n        conv1 = self._conv_1(conv1)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        return paddle.elementwise_add(x=inputs, y=conv2)\n\n\nclass MiddleFlow(nn.Layer):\n    \"\"\"Middle flow for Xception.\"\"\"\n\n    def __init__(self, block_num: int = 8):\n        super(MiddleFlow, self).__init__()\n\n        self.block_num = block_num\n        self._conv_0 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_0\")\n        self._conv_1 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_1\")\n        self._conv_2 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_2\")\n        self._conv_3 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_3\")\n        self._conv_4 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_4\")\n        self._conv_5 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_5\")\n        self._conv_6 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_6\")\n        self._conv_7 = MiddleFlowBottleneckBlock(728, 
728, name=\"middle_flow_7\")\n        if block_num == 16:\n            self._conv_8 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_8\")\n            self._conv_9 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_9\")\n            self._conv_10 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_10\")\n            self._conv_11 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_11\")\n            self._conv_12 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_12\")\n            self._conv_13 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_13\")\n            self._conv_14 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_14\")\n            self._conv_15 = MiddleFlowBottleneckBlock(728, 728, name=\"middle_flow_15\")\n\n    def forward(self, inputs: paddle.Tensor):\n        x = self._conv_0(inputs)\n        x = self._conv_1(x)\n        x = self._conv_2(x)\n        x = self._conv_3(x)\n        x = self._conv_4(x)\n        x = self._conv_5(x)\n        x = self._conv_6(x)\n        x = self._conv_7(x)\n        if self.block_num == 16:\n            x = self._conv_8(x)\n            x = self._conv_9(x)\n            x = self._conv_10(x)\n            x = self._conv_11(x)\n            x = self._conv_12(x)\n            x = self._conv_13(x)\n            x = self._conv_14(x)\n            x = self._conv_15(x)\n        return x\n\n\nclass ExitFlowBottleneckBlock(nn.Layer):\n    \"\"\"Basic exit flow bottleneck block for Xception.\"\"\"\n\n    def __init__(self, input_channels: int, output_channels1: int, output_channels2: int, name: str):\n        super(ExitFlowBottleneckBlock, self).__init__()\n\n        self._short = Conv2d(\n            in_channels=input_channels,\n            out_channels=output_channels2,\n            kernel_size=1,\n            stride=2,\n            padding=0,\n            weight_attr=ParamAttr(name + \"_branch1_weights\"),\n            bias_attr=False)\n        self._conv_1 = SeparableConv(input_channels, 
output_channels1, stride=1, name=name + \"_branch2a_weights\")\n        self._conv_2 = SeparableConv(output_channels1, output_channels2, stride=1, name=name + \"_branch2b_weights\")\n        self._pool = MaxPool2d(kernel_size=3, stride=2, padding=1)\n\n    def forward(self, inputs: paddle.Tensor):\n        short = self._short(inputs)\n        conv0 = F.relu(inputs)\n        conv1 = self._conv_1(conv0)\n        conv2 = F.relu(conv1)\n        conv2 = self._conv_2(conv2)\n        pool = self._pool(conv2)\n        return paddle.elementwise_add(x=short, y=pool)\n\n\nclass ExitFlow(nn.Layer):\n    def __init__(self, class_dim: int):\n        super(ExitFlow, self).__init__()\n\n        name = \"exit_flow\"\n\n        self._conv_0 = ExitFlowBottleneckBlock(728, 728, 1024, name=name + \"_1\")\n        self._conv_1 = SeparableConv(1024, 1536, stride=1, name=name + \"_2\")\n        self._conv_2 = SeparableConv(1536, 2048, stride=1, name=name + \"_3\")\n        self._pool = AdaptiveAvgPool2d(1)\n        stdv = 1.0 / math.sqrt(2048 * 1.0)\n        self._out = Linear(\n            2048,\n            class_dim,\n            weight_attr=ParamAttr(name=\"fc_weights\", initializer=Uniform(-stdv, stdv)),\n            bias_attr=ParamAttr(name=\"fc_offset\"))\n\n    def forward(self, inputs: paddle.Tensor):\n        conv0 = self._conv_0(inputs)\n        conv1 = self._conv_1(conv0)\n        conv1 = F.relu(conv1)\n        conv2 = self._conv_2(conv1)\n        conv2 = F.relu(conv2)\n        pool = self._pool(conv2)\n        pool = paddle.reshape(pool, [0, -1])\n        out = self._out(pool)\n        return out\n\n\n@moduleinfo(\n    name=\"xception71_imagenet\",\n    type=\"CV/classification\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Xception71_imagenet is a classification model, \"\n    \"this module is trained with Imagenet dataset.\",\n    version=\"1.1.0\",\n    meta=ImageClassifierModule)\nclass Xception71(nn.Layer):\n    def __init__(self, class_dim=1000, 
load_checkpoint: str = None):\n        super(Xception71, self).__init__()\n        self.entry_flow_block_num = 5\n        self.middle_flow_block_num = 16\n        self._entry_flow = EntryFlow(self.entry_flow_block_num)\n        self._middle_flow = MiddleFlow(self.middle_flow_block_num)\n        self._exit_flow = ExitFlow(class_dim)\n\n        if load_checkpoint is not None:\n            model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'xception71_imagenet.pdparams')\n            if not os.path.exists(checkpoint):\n                os.system(\n                    'wget https://paddlehub.bj.bcebos.com/dygraph/image_classification/xception71_imagenet.pdparams -O'\n                    + checkpoint)\n            model_dict = paddle.load(checkpoint)[0]\n            self.set_dict(model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def forward(self, inputs):\n        x = self._entry_flow(inputs)\n        x = self._middle_flow(x)\n        x = self._exit_flow(x)\n        return x\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/README.md",
    "content": "# MiDaS_Large\n\n|模型名称|MiDaS_Large|\n| :--- | :---: |\n|类别|图像 - 深度估计|\n|网络|-|\n|数据集|3D Movies, WSVD, ReDWeb, MegaDepth|\n|是否支持Fine-tuning|否|\n|模型大小|399MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201227112600975.jpg\"  width='70%' hspace='10'/> <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - MiDaS_Large是一个单目深度估计模型，模型可通过输入图像估计其中的深度信息。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install MiDaS_Large\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"MiDaS_Large\")\n    result = model.depth_estimation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.depth_estimation(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def depth_estimation(images=None,\n                    paths=None,\n                    batch_size=1,\n                    output_dir='output',\n                    visualization=False):\n    ```\n\n    - 深度估计API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - batch_size (int) : batch 的大小；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 output；<br/>\n      - visualization (bool) : 是否将结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n      - res (list\\[numpy.ndarray\\]): 图像深度数据，ndarray.shape 为 \\[H, W\\]\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install MiDaS_Large==1.0.0\n    
```\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/README_en.md",
    "content": "# MiDaS_Large\n\n|Module Name|MiDaS_Large|\n| :--- | :---: |\n|Category|depth estimation|\n|Network|-|\n|Dataset|3D Movies, WSVD, ReDWeb, MegaDepth|\n|Fine-tuning supported or not|No|\n|Module Size|399MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://img-blog.csdnimg.cn/20201227112600975.jpg\"  width='70%' hspace='10'/> <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - MiDas_Large module is used for monocular depth estimation.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0   | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install MiDaS_Large\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"MiDaS_Large\")\n    result = model.depth_estimation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.depth_estimation(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def depth_estimation(images=None,\n                    paths=None,\n                    batch_size=1,\n                    output_dir='output',\n                    visualization=False):\n    ```\n\n    - depth estimation API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - 
output_dir (str): save path of output images;\n      - visualization (bool): whether to save the results as image files.\n\n      **NOTE:** provide input data through either paths or images, not both\n\n    - **Return**\n      - res (list\\[numpy.ndarray\\]): depth data, ndarray.shape is \\[H, W\\]\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install MiDaS_Large==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/inference.py",
    "content": "import os\r\nimport numpy as np\r\n\r\nfrom paddle.inference import create_predictor, Config\r\n\r\n__all__ = ['InferenceModel']\r\n\r\n\r\nclass InferenceModel():\r\n    # 初始化函数\r\n    def __init__(self, modelpath, use_gpu=False, use_mkldnn=False, combined=True):\r\n        '''\r\n        init the inference model\r\n\r\n        modelpath: inference model path\r\n\r\n        use_gpu: use gpu or not\r\n\r\n        use_mkldnn: use mkldnn or not\r\n\r\n        combined: inference model format is combined or not\r\n        '''\r\n        # 加载模型配置\r\n        self.config = self.load_config(modelpath, use_gpu, use_mkldnn, combined)\r\n\r\n    # 打印函数\r\n    def __repr__(self):\r\n        '''\r\n        get the numbers and name of inputs and outputs\r\n        '''\r\n        return 'inputs_num: %d\\ninputs_names: %s\\noutputs_num: %d\\noutputs_names: %s' % (len(\r\n            self.input_handles), str(self.input_names), len(self.output_handles), str(self.output_names))\r\n\r\n    # 类调用函数\r\n    def __call__(self, *input_datas, batch_size=1):\r\n        '''\r\n        call function\r\n        '''\r\n        return self.forward(*input_datas, batch_size=batch_size)\r\n\r\n    # 模型参数加载函数\r\n    def load_config(self, modelpath, use_gpu, use_mkldnn, combined):\r\n        '''\r\n        load the model config\r\n\r\n        modelpath: inference model path\r\n\r\n        use_gpu: use gpu or not\r\n\r\n        use_mkldnn: use mkldnn or not\r\n\r\n        combined: inference model format is combined or not\r\n        '''\r\n        # 对运行位置进行配置\r\n        if use_gpu:\r\n            try:\r\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\r\n            except Exception:\r\n                print(\r\n                    'Error! Unable to use GPU. 
Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU.'\r\n                )\r\n                use_gpu = False\r\n\r\n        # 加载模型参数\r\n        if combined:\r\n            model = os.path.join(modelpath, \"__model__\")\r\n            params = os.path.join(modelpath, \"__params__\")\r\n            config = Config(model, params)\r\n        else:\r\n            config = Config(modelpath)\r\n\r\n        # 设置参数\r\n        if use_gpu:\r\n            config.enable_use_gpu(100, 0)\r\n        else:\r\n            config.disable_gpu()\r\n            if use_mkldnn:\r\n                config.enable_mkldnn()\r\n\r\n        # 返回配置\r\n        return config\r\n\r\n    # 预测器创建函数\r\n    def eval(self):\r\n        '''\r\n        create the model predictor by model config\r\n        '''\r\n        # 创建预测器\r\n        self.predictor = create_predictor(self.config)\r\n\r\n        # 获取模型的输入输出名称\r\n        self.input_names = self.predictor.get_input_names()\r\n        self.output_names = self.predictor.get_output_names()\r\n\r\n        # 获取输入\r\n        self.input_handles = []\r\n        for input_name in self.input_names:\r\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\r\n\r\n        # 获取输出\r\n        self.output_handles = []\r\n        for output_name in self.output_names:\r\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\r\n\r\n    # 前向计算函数\r\n    def forward(self, *input_datas, batch_size=1):\r\n        \"\"\"\r\n        model inference\r\n\r\n        batch_size: batch size\r\n\r\n        *input_datas: x1, x2, ..., xn\r\n        \"\"\"\r\n        # 切分输入数据\r\n        datas_num = input_datas[0].shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\r\n\r\n        # 遍历输入数据进行预测\r\n        outputs = {}\r\n        for step in 
range(split_num):\r\n            for i in range(len(self.input_handles)):\r\n                input_data = input_datas[i][step].copy()\r\n                self.input_handles[i].copy_from_cpu(input_data)\r\n\r\n            self.predictor.run()\r\n\r\n            for i in range(len(self.output_handles)):\r\n                output = self.output_handles[i].copy_to_cpu()\r\n                if i in outputs:\r\n                    outputs[i].append(output)\r\n                else:\r\n                    outputs[i] = [output]\r\n\r\n        # 预测结果合并\r\n        for key in outputs.keys():\r\n            outputs[key] = np.concatenate(outputs[key], 0)\r\n\r\n        # 返回预测结果\r\n        return outputs\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/module.py",
    "content": "import os\r\nimport cv2\r\nimport numpy as np\r\n\r\nfrom paddlehub import Module\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\nfrom paddle.vision.transforms import Compose\r\nfrom MiDaS_Large.utils import write_depth\r\nfrom MiDaS_Large.inference import InferenceModel\r\nfrom MiDaS_Large.transforms import Resize, NormalizeImage, PrepareForNet\r\n\r\n\r\n@moduleinfo(\r\n    name=\"MiDaS_Large\",  # 模型名称\r\n    type=\"CV/style_transfer\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"MiDaS_Large\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass MiDaS_Large(Module):\r\n    # 初始化函数\r\n    def __init__(self, name=None, directory=None, use_gpu=False):\r\n        # 设置模型路径\r\n        model_path = os.path.join(self.directory, \"model-f6b98070\")\r\n\r\n        # 加载模型\r\n        self.model = InferenceModel(modelpath=model_path, use_gpu=use_gpu, use_mkldnn=False, combined=True)\r\n        self.model.eval()\r\n\r\n        # 数据预处理配置\r\n        self.net_h, self.net_w = 384, 384\r\n        self.transform = Compose([\r\n            Resize(\r\n                self.net_w,\r\n                self.net_h,\r\n                resize_target=None,\r\n                keep_aspect_ratio=False,\r\n                ensure_multiple_of=32,\r\n                resize_method=\"upper_bound\",\r\n                image_interpolation_method=cv2.INTER_CUBIC,\r\n            ),\r\n            NormalizeImage(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\r\n            PrepareForNet()\r\n        ])\r\n\r\n    # 数据读取函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not 
None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    def preprocess(self, datas):\r\n        input_datas = []\r\n\r\n        for img in datas:\r\n            # 归一化\r\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB) / 255.0\r\n\r\n            # 图像变换\r\n            img = self.transform({\"image\": img})[\"image\"]\r\n\r\n            # 新增维度\r\n            input_data = img[np.newaxis, ...]\r\n\r\n            input_datas.append(input_data)\r\n\r\n        # 拼接数据\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        return input_datas\r\n\r\n    # 数据后处理函数\r\n    @staticmethod\r\n    def postprocess(datas, results, output_dir='output', visualization=False):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        outputs = []\r\n\r\n        for img, result, count in zip(datas, results, range(len(datas))):\r\n            # 缩放回原尺寸\r\n            output = cv2.resize(result, (img.shape[1], img.shape[0]), interpolation=cv2.INTER_CUBIC)\r\n\r\n            # 可视化输出\r\n            if visualization:\r\n                pfm_f, png_f = write_depth(os.path.join(output_dir, str(count)), output, bits=2)\r\n\r\n            outputs.append(output)\r\n\r\n        return outputs\r\n\r\n    # 深度估计函数\r\n    def depth_estimation(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\r\n        # 加载数据\r\n        datas = self.load_datas(paths, images)\r\n\r\n        # 数据预处理\r\n        input_datas = self.preprocess(datas)\r\n\r\n        # 模型预测\r\n        results = self.model(input_datas, batch_size=batch_size)[0]\r\n\r\n        # 结果后处理\r\n        outputs = self.postprocess(datas, results, output_dir, visualization)\r\n\r\n        return outputs\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/transforms.py",
    "content": "# Refer https://github.com/intel-isl/MiDaS\r\n\r\nimport numpy as np\r\nimport cv2\r\n\r\n\r\nclass Resize(object):\r\n    \"\"\"Resize sample to given size (width, height).\r\n    \"\"\"\r\n\r\n    def __init__(self,\r\n                 width,\r\n                 height,\r\n                 resize_target=True,\r\n                 keep_aspect_ratio=False,\r\n                 ensure_multiple_of=1,\r\n                 resize_method=\"lower_bound\",\r\n                 image_interpolation_method=cv2.INTER_AREA):\r\n        \"\"\"Init.\r\n\r\n        Args:\r\n            width (int): desired output width\r\n            height (int): desired output height\r\n            resize_target (bool, optional):\r\n                True: Resize the full sample (image, mask, target).\r\n                False: Resize image only.\r\n                Defaults to True.\r\n            keep_aspect_ratio (bool, optional):\r\n                True: Keep the aspect ratio of the input sample.\r\n                Output sample might not have the given width and height, and\r\n                resize behaviour depends on the parameter 'resize_method'.\r\n                Defaults to False.\r\n            ensure_multiple_of (int, optional):\r\n                Output width and height is constrained to be multiple of this parameter.\r\n                Defaults to 1.\r\n            resize_method (str, optional):\r\n                \"lower_bound\": Output will be at least as large as the given size.\r\n                \"upper_bound\": Output will be at max as large as the given size. (Output size might be smaller than given size.)\r\n                \"minimal\": Scale as least as possible.  
(Output size might be smaller than given size.)\r\n                Defaults to \"lower_bound\".\r\n        \"\"\"\r\n        self.__width = width\r\n        self.__height = height\r\n\r\n        self.__resize_target = resize_target\r\n        self.__keep_aspect_ratio = keep_aspect_ratio\r\n        self.__multiple_of = ensure_multiple_of\r\n        self.__resize_method = resize_method\r\n        self.__image_interpolation_method = image_interpolation_method\r\n\r\n    def constrain_to_multiple_of(self, x, min_val=0, max_val=None):\r\n        y = (np.round(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        if max_val is not None and y > max_val:\r\n            y = (np.floor(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        if y < min_val:\r\n            y = (np.ceil(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        return y\r\n\r\n    def get_size(self, width, height):\r\n        # determine new height and width\r\n        scale_height = self.__height / height\r\n        scale_width = self.__width / width\r\n\r\n        if self.__keep_aspect_ratio:\r\n            if self.__resize_method == \"lower_bound\":\r\n                # scale such that output size is lower bound\r\n                if scale_width > scale_height:\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            elif self.__resize_method == \"upper_bound\":\r\n                # scale such that output size is upper bound\r\n                if scale_width < scale_height:\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            elif self.__resize_method == \"minimal\":\r\n                # scale as least as possbile\r\n                if 
abs(1 - scale_width) < abs(1 - scale_height):\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            else:\r\n                raise ValueError(f\"resize_method {self.__resize_method} not implemented\")\r\n\r\n        if self.__resize_method == \"lower_bound\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height, min_val=self.__height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width, min_val=self.__width)\r\n        elif self.__resize_method == \"upper_bound\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height, max_val=self.__height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width, max_val=self.__width)\r\n        elif self.__resize_method == \"minimal\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width)\r\n        else:\r\n            raise ValueError(f\"resize_method {self.__resize_method} not implemented\")\r\n\r\n        return (new_width, new_height)\r\n\r\n    def __call__(self, sample):\r\n        width, height = self.get_size(sample[\"image\"].shape[1], sample[\"image\"].shape[0])\r\n\r\n        # resize sample\r\n        sample[\"image\"] = cv2.resize(\r\n            sample[\"image\"],\r\n            (width, height),\r\n            interpolation=self.__image_interpolation_method,\r\n        )\r\n\r\n        if self.__resize_target:\r\n            if \"disparity\" in sample:\r\n                sample[\"disparity\"] = cv2.resize(\r\n                    sample[\"disparity\"],\r\n                    (width, height),\r\n                    interpolation=cv2.INTER_NEAREST,\r\n                )\r\n\r\n            if \"depth\" in sample:\r\n                sample[\"depth\"] = 
cv2.resize(sample[\"depth\"], (width, height), interpolation=cv2.INTER_NEAREST)\r\n\r\n            sample[\"mask\"] = cv2.resize(\r\n                sample[\"mask\"].astype(np.float32),\r\n                (width, height),\r\n                interpolation=cv2.INTER_NEAREST,\r\n            )\r\n            sample[\"mask\"] = sample[\"mask\"].astype(bool)\r\n\r\n        return sample\r\n\r\n\r\nclass NormalizeImage(object):\r\n    \"\"\"Normlize image by given mean and std.\r\n    \"\"\"\r\n\r\n    def __init__(self, mean, std):\r\n        self.__mean = mean\r\n        self.__std = std\r\n\r\n    def __call__(self, sample):\r\n        sample[\"image\"] = (sample[\"image\"] - self.__mean) / self.__std\r\n\r\n        return sample\r\n\r\n\r\nclass PrepareForNet(object):\r\n    \"\"\"Prepare sample for usage as network input.\r\n    \"\"\"\r\n\r\n    def __init__(self):\r\n        pass\r\n\r\n    def __call__(self, sample):\r\n        image = np.transpose(sample[\"image\"], (2, 0, 1))\r\n        sample[\"image\"] = np.ascontiguousarray(image).astype(np.float32)\r\n\r\n        if \"mask\" in sample:\r\n            sample[\"mask\"] = sample[\"mask\"].astype(np.float32)\r\n            sample[\"mask\"] = np.ascontiguousarray(sample[\"mask\"])\r\n\r\n        if \"disparity\" in sample:\r\n            disparity = sample[\"disparity\"].astype(np.float32)\r\n            sample[\"disparity\"] = np.ascontiguousarray(disparity)\r\n\r\n        if \"depth\" in sample:\r\n            depth = sample[\"depth\"].astype(np.float32)\r\n            sample[\"depth\"] = np.ascontiguousarray(depth)\r\n\r\n        return sample\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Large/utils.py",
    "content": "# Refer https://github.com/intel-isl/MiDaS\r\n\"\"\"Utils for monoDepth.\r\n\"\"\"\r\nimport sys\r\nimport numpy as np\r\nimport cv2\r\n\r\n\r\ndef write_pfm(path, image, scale=1):\r\n    \"\"\"Write pfm file.\r\n\r\n    Args:\r\n        path (str): pathto file\r\n        image (array): data\r\n        scale (int, optional): Scale. Defaults to 1.\r\n    \"\"\"\r\n\r\n    with open(path, \"wb\") as file:\r\n        color = None\r\n\r\n        if image.dtype.name != \"float32\":\r\n            raise Exception(\"Image dtype must be float32.\")\r\n\r\n        image = np.flipud(image)\r\n\r\n        if len(image.shape) == 3 and image.shape[2] == 3:  # color image\r\n            color = True\r\n        elif (len(image.shape) == 2 or len(image.shape) == 3 and image.shape[2] == 1):  # greyscale\r\n            color = False\r\n        else:\r\n            raise Exception(\"Image must have H x W x 3, H x W x 1 or H x W dimensions.\")\r\n\r\n        file.write(\"PF\\n\" if color else \"Pf\\n\".encode())\r\n        file.write(\"%d %d\\n\".encode() % (image.shape[1], image.shape[0]))\r\n\r\n        endian = image.dtype.byteorder\r\n\r\n        if endian == \"<\" or endian == \"=\" and sys.byteorder == \"little\":\r\n            scale = -scale\r\n\r\n        file.write(\"%f\\n\".encode() % scale)\r\n\r\n        image.tofile(file)\r\n\r\n\r\ndef read_image(path):\r\n    \"\"\"Read image and output RGB image (0-1).\r\n\r\n    Args:\r\n        path (str): path to file\r\n\r\n    Returns:\r\n        array: RGB image (0-1)\r\n    \"\"\"\r\n    img = cv2.imread(path)\r\n    if img.ndim == 2:\r\n        img = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)\r\n    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB) / 255.0\r\n    return img\r\n\r\n\r\ndef write_depth(path, depth, bits=1):\r\n    \"\"\"Write depth map to pfm and png file.\r\n\r\n    Args:\r\n        path (str): filepath without extension\r\n        depth (array): depth\r\n    \"\"\"\r\n    write_pfm(path + \".pfm\", 
depth.astype(np.float32))\r\n\r\n    depth_min = depth.min()\r\n    depth_max = depth.max()\r\n\r\n    max_val = (2**(8 * bits)) - 1\r\n\r\n    if depth_max - depth_min > np.finfo(\"float\").eps:\r\n        out = max_val * (depth - depth_min) / (depth_max - depth_min)\r\n    else:\r\n        out = np.zeros(depth.shape, dtype=depth.type)\r\n\r\n    if bits == 1:\r\n        cv2.imwrite(path + \".png\", out.astype(\"uint8\"))\r\n    elif bits == 2:\r\n        cv2.imwrite(path + \".png\", out.astype(\"uint16\"))\r\n    return path + '.pfm', path + \".png\"\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Small/README.md",
    "content": "## 模型概述\r\nMiDas v2.1 small 单目深度估计模型\r\n\r\n模型可通过输入图像估计其中的深度信息\r\n\r\n模型权重转换自 [MiDas](https://github.com/intel-isl/MiDaS) 官方开源项目\r\n\r\n\r\n## 模型安装\r\n\r\n```shell\r\n$hub install MiDaS_Small\r\n```\r\n\r\n## 效果展示\r\n![效果展示](https://img-blog.csdnimg.cn/20201227112553903.jpg)\r\n\r\n## API 说明\r\n\r\n```python\r\ndef depth_estimation(\r\n    images=None,\r\n    paths=None,\r\n    batch_size=1,\r\n    output_dir='output',\r\n    visualization=False\r\n)\r\n```\r\n\r\n深度估计API\r\n\r\n**参数**\r\n\r\n* images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，默认为 None；\r\n* paths (list\\[str\\]): 图片的路径，默认为 None；\r\n* batch\\_size (int): batch 的大小，默认设为 1；\r\n* visualization (bool): 是否将识别结果保存为图片文件，默认设为 False；\r\n* output\\_dir (str): 图片的保存路径，默认设为 output。\r\n\r\n\r\n**返回**\r\n\r\n* res (list\\[numpy.ndarray\\]): 图像深度数据，ndarray.shape 为 \\[H, W\\]。\r\n\r\n\r\n## 预测代码示例\r\n\r\n```python\r\nimport cv2\r\nimport paddlehub as hub\r\n\r\n# 模型加载\r\n# use_gpu：是否使用GPU进行预测\r\nmodel = hub.Module(name='MiDaS_Small', use_gpu=False)\r\n\r\n# 模型预测\r\nresult = model.depth_estimation(images=[cv2.imread('/PATH/TO/IMAGE')])\r\n\r\n# or\r\n# result = model.style_transfer(paths=['/PATH/TO/IMAGE'])\r\n```\r\n\r\n## 模型相关信息\r\n\r\n### 模型代码\r\n\r\nhttps://github.com/intel-isl/MiDaS\r\n\r\n### 依赖\r\n\r\npaddlepaddle >= 2.0.0rc0\r\n\r\npaddlehub >= 2.0.0b1\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Small/inference.py",
    "content": "import os\r\nimport numpy as np\r\n\r\nfrom paddle.inference import create_predictor, Config\r\n\r\n__all__ = ['InferenceModel']\r\n\r\n\r\nclass InferenceModel():\r\n    # 初始化函数\r\n    def __init__(self, modelpath, use_gpu=False, use_mkldnn=False, combined=True):\r\n        '''\r\n        init the inference model\r\n\r\n        modelpath: inference model path\r\n\r\n        use_gpu: use gpu or not\r\n\r\n        use_mkldnn: use mkldnn or not\r\n\r\n        combined: inference model format is combined or not\r\n        '''\r\n        # 加载模型配置\r\n        self.config = self.load_config(modelpath, use_gpu, use_mkldnn, combined)\r\n\r\n    # 打印函数\r\n    def __repr__(self):\r\n        '''\r\n        get the numbers and name of inputs and outputs\r\n        '''\r\n        return 'inputs_num: %d\\ninputs_names: %s\\noutputs_num: %d\\noutputs_names: %s' % (len(\r\n            self.input_handles), str(self.input_names), len(self.output_handles), str(self.output_names))\r\n\r\n    # 类调用函数\r\n    def __call__(self, *input_datas, batch_size=1):\r\n        '''\r\n        call function\r\n        '''\r\n        return self.forward(*input_datas, batch_size=batch_size)\r\n\r\n    # 模型参数加载函数\r\n    def load_config(self, modelpath, use_gpu, use_mkldnn, combined):\r\n        '''\r\n        load the model config\r\n\r\n        modelpath: inference model path\r\n\r\n        use_gpu: use gpu or not\r\n\r\n        use_mkldnn: use mkldnn or not\r\n\r\n        combined: inference model format is combined or not\r\n        '''\r\n        # 对运行位置进行配置\r\n        if use_gpu:\r\n            try:\r\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\r\n            except Exception:\r\n                print(\r\n                    'Error! Unable to use GPU. 
Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU.'\r\n                )\r\n                use_gpu = False\r\n\r\n        # 加载模型参数\r\n        if combined:\r\n            model = os.path.join(modelpath, \"__model__\")\r\n            params = os.path.join(modelpath, \"__params__\")\r\n            config = Config(model, params)\r\n        else:\r\n            config = Config(modelpath)\r\n\r\n        # 设置参数\r\n        if use_gpu:\r\n            config.enable_use_gpu(100, 0)\r\n        else:\r\n            config.disable_gpu()\r\n            if use_mkldnn:\r\n                config.enable_mkldnn()\r\n\r\n        # 返回配置\r\n        return config\r\n\r\n    # 预测器创建函数\r\n    def eval(self):\r\n        '''\r\n        create the model predictor by model config\r\n        '''\r\n        # 创建预测器\r\n        self.predictor = create_predictor(self.config)\r\n\r\n        # 获取模型的输入输出名称\r\n        self.input_names = self.predictor.get_input_names()\r\n        self.output_names = self.predictor.get_output_names()\r\n\r\n        # 获取输入\r\n        self.input_handles = []\r\n        for input_name in self.input_names:\r\n            self.input_handles.append(self.predictor.get_input_handle(input_name))\r\n\r\n        # 获取输出\r\n        self.output_handles = []\r\n        for output_name in self.output_names:\r\n            self.output_handles.append(self.predictor.get_output_handle(output_name))\r\n\r\n    # 前向计算函数\r\n    def forward(self, *input_datas, batch_size=1):\r\n        \"\"\"\r\n        model inference\r\n\r\n        batch_size: batch size\r\n\r\n        *input_datas: x1, x2, ..., xn\r\n        \"\"\"\r\n        # 切分输入数据\r\n        datas_num = input_datas[0].shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n        input_datas = [np.array_split(input_data, split_num) for input_data in input_datas]\r\n\r\n        # 遍历输入数据进行预测\r\n        outputs = {}\r\n        for step in 
range(split_num):\r\n            for i in range(len(self.input_handles)):\r\n                input_data = input_datas[i][step].copy()\r\n                self.input_handles[i].copy_from_cpu(input_data)\r\n\r\n            self.predictor.run()\r\n\r\n            for i in range(len(self.output_handles)):\r\n                output = self.output_handles[i].copy_to_cpu()\r\n                if i in outputs:\r\n                    outputs[i].append(output)\r\n                else:\r\n                    outputs[i] = [output]\r\n\r\n        # 预测结果合并\r\n        for key in outputs.keys():\r\n            outputs[key] = np.concatenate(outputs[key], 0)\r\n\r\n        # 返回预测结果\r\n        return outputs\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Small/module.py",
    "content": "import os\r\nimport cv2\r\nimport numpy as np\r\n\r\nfrom paddlehub import Module\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\nfrom paddle.vision.transforms import Compose\r\nfrom MiDaS_Small.utils import write_depth\r\nfrom MiDaS_Small.inference import InferenceModel\r\nfrom MiDaS_Small.transforms import Resize, NormalizeImage, PrepareForNet\r\n\r\n\r\n@moduleinfo(\r\n    name=\"MiDaS_Small\",  # 模型名称\r\n    type=\"CV/style_transfer\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"MiDaS_Small\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass MiDaS_Small(Module):\r\n    # 初始化函数\r\n    def __init__(self, name=None, directory=None, use_gpu=False):\r\n        # 设置模型路径\r\n        model_path = os.path.join(self.directory, \"model-small\")\r\n\r\n        # 加载模型\r\n        self.model = InferenceModel(modelpath=model_path, use_gpu=use_gpu, use_mkldnn=False, combined=True)\r\n        self.model.eval()\r\n\r\n        # 数据预处理配置\r\n        self.net_h, self.net_w = 256, 256\r\n        self.transform = Compose([\r\n            Resize(\r\n                self.net_w,\r\n                self.net_h,\r\n                resize_target=None,\r\n                keep_aspect_ratio=False,\r\n                ensure_multiple_of=32,\r\n                resize_method=\"upper_bound\",\r\n                image_interpolation_method=cv2.INTER_CUBIC,\r\n            ),\r\n            NormalizeImage(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\r\n            PrepareForNet()\r\n        ])\r\n\r\n    # 数据读取函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not 
None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    def preprocess(self, datas):\r\n        input_datas = []\r\n\r\n        for img in datas:\r\n            # 归一化\r\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB) / 255.0\r\n\r\n            # 图像变换\r\n            img = self.transform({\"image\": img})[\"image\"]\r\n\r\n            # 新增维度\r\n            input_data = img[np.newaxis, ...]\r\n\r\n            input_datas.append(input_data)\r\n\r\n        # 拼接数据\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        return input_datas\r\n\r\n    # 数据后处理函数\r\n    @staticmethod\r\n    def postprocess(datas, results, output_dir='output', visualization=False):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        outputs = []\r\n\r\n        for img, result, count in zip(datas, results, range(len(datas))):\r\n            # 缩放回原尺寸\r\n            output = cv2.resize(result, (img.shape[1], img.shape[0]), interpolation=cv2.INTER_CUBIC)\r\n\r\n            # 可视化输出\r\n            if visualization:\r\n                pfm_f, png_f = write_depth(os.path.join(output_dir, str(count)), output, bits=2)\r\n\r\n            outputs.append(output)\r\n\r\n        return outputs\r\n\r\n    # 深度估计函数\r\n    def depth_estimation(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\r\n        # 加载数据\r\n        datas = self.load_datas(paths, images)\r\n\r\n        # 数据预处理\r\n        input_datas = self.preprocess(datas)\r\n\r\n        # 模型预测\r\n        results = self.model(input_datas, batch_size=batch_size)[0]\r\n\r\n        # 结果后处理\r\n        outputs = self.postprocess(datas, results, output_dir, visualization)\r\n\r\n        return outputs\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Small/transforms.py",
    "content": "# Refer https://github.com/intel-isl/MiDaS\r\n\r\nimport numpy as np\r\nimport cv2\r\n\r\n\r\nclass Resize(object):\r\n    \"\"\"Resize sample to given size (width, height).\r\n    \"\"\"\r\n\r\n    def __init__(self,\r\n                 width,\r\n                 height,\r\n                 resize_target=True,\r\n                 keep_aspect_ratio=False,\r\n                 ensure_multiple_of=1,\r\n                 resize_method=\"lower_bound\",\r\n                 image_interpolation_method=cv2.INTER_AREA):\r\n        \"\"\"Init.\r\n\r\n        Args:\r\n            width (int): desired output width\r\n            height (int): desired output height\r\n            resize_target (bool, optional):\r\n                True: Resize the full sample (image, mask, target).\r\n                False: Resize image only.\r\n                Defaults to True.\r\n            keep_aspect_ratio (bool, optional):\r\n                True: Keep the aspect ratio of the input sample.\r\n                Output sample might not have the given width and height, and\r\n                resize behaviour depends on the parameter 'resize_method'.\r\n                Defaults to False.\r\n            ensure_multiple_of (int, optional):\r\n                Output width and height is constrained to be multiple of this parameter.\r\n                Defaults to 1.\r\n            resize_method (str, optional):\r\n                \"lower_bound\": Output will be at least as large as the given size.\r\n                \"upper_bound\": Output will be at max as large as the given size. (Output size might be smaller than given size.)\r\n                \"minimal\": Scale as least as possible.  
(Output size might be smaller than given size.)\r\n                Defaults to \"lower_bound\".\r\n        \"\"\"\r\n        self.__width = width\r\n        self.__height = height\r\n\r\n        self.__resize_target = resize_target\r\n        self.__keep_aspect_ratio = keep_aspect_ratio\r\n        self.__multiple_of = ensure_multiple_of\r\n        self.__resize_method = resize_method\r\n        self.__image_interpolation_method = image_interpolation_method\r\n\r\n    def constrain_to_multiple_of(self, x, min_val=0, max_val=None):\r\n        y = (np.round(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        if max_val is not None and y > max_val:\r\n            y = (np.floor(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        if y < min_val:\r\n            y = (np.ceil(x / self.__multiple_of) * self.__multiple_of).astype(int)\r\n\r\n        return y\r\n\r\n    def get_size(self, width, height):\r\n        # determine new height and width\r\n        scale_height = self.__height / height\r\n        scale_width = self.__width / width\r\n\r\n        if self.__keep_aspect_ratio:\r\n            if self.__resize_method == \"lower_bound\":\r\n                # scale such that output size is lower bound\r\n                if scale_width > scale_height:\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            elif self.__resize_method == \"upper_bound\":\r\n                # scale such that output size is upper bound\r\n                if scale_width < scale_height:\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            elif self.__resize_method == \"minimal\":\r\n                # scale as least as possbile\r\n                if 
abs(1 - scale_width) < abs(1 - scale_height):\r\n                    # fit width\r\n                    scale_height = scale_width\r\n                else:\r\n                    # fit height\r\n                    scale_width = scale_height\r\n            else:\r\n                raise ValueError(f\"resize_method {self.__resize_method} not implemented\")\r\n\r\n        if self.__resize_method == \"lower_bound\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height, min_val=self.__height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width, min_val=self.__width)\r\n        elif self.__resize_method == \"upper_bound\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height, max_val=self.__height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width, max_val=self.__width)\r\n        elif self.__resize_method == \"minimal\":\r\n            new_height = self.constrain_to_multiple_of(scale_height * height)\r\n            new_width = self.constrain_to_multiple_of(scale_width * width)\r\n        else:\r\n            raise ValueError(f\"resize_method {self.__resize_method} not implemented\")\r\n\r\n        return (new_width, new_height)\r\n\r\n    def __call__(self, sample):\r\n        width, height = self.get_size(sample[\"image\"].shape[1], sample[\"image\"].shape[0])\r\n\r\n        # resize sample\r\n        sample[\"image\"] = cv2.resize(\r\n            sample[\"image\"],\r\n            (width, height),\r\n            interpolation=self.__image_interpolation_method,\r\n        )\r\n\r\n        if self.__resize_target:\r\n            if \"disparity\" in sample:\r\n                sample[\"disparity\"] = cv2.resize(\r\n                    sample[\"disparity\"],\r\n                    (width, height),\r\n                    interpolation=cv2.INTER_NEAREST,\r\n                )\r\n\r\n            if \"depth\" in sample:\r\n                sample[\"depth\"] = 
cv2.resize(sample[\"depth\"], (width, height), interpolation=cv2.INTER_NEAREST)\r\n\r\n            sample[\"mask\"] = cv2.resize(\r\n                sample[\"mask\"].astype(np.float32),\r\n                (width, height),\r\n                interpolation=cv2.INTER_NEAREST,\r\n            )\r\n            sample[\"mask\"] = sample[\"mask\"].astype(bool)\r\n\r\n        return sample\r\n\r\n\r\nclass NormalizeImage(object):\r\n    \"\"\"Normlize image by given mean and std.\r\n    \"\"\"\r\n\r\n    def __init__(self, mean, std):\r\n        self.__mean = mean\r\n        self.__std = std\r\n\r\n    def __call__(self, sample):\r\n        sample[\"image\"] = (sample[\"image\"] - self.__mean) / self.__std\r\n\r\n        return sample\r\n\r\n\r\nclass PrepareForNet(object):\r\n    \"\"\"Prepare sample for usage as network input.\r\n    \"\"\"\r\n\r\n    def __init__(self):\r\n        pass\r\n\r\n    def __call__(self, sample):\r\n        image = np.transpose(sample[\"image\"], (2, 0, 1))\r\n        sample[\"image\"] = np.ascontiguousarray(image).astype(np.float32)\r\n\r\n        if \"mask\" in sample:\r\n            sample[\"mask\"] = sample[\"mask\"].astype(np.float32)\r\n            sample[\"mask\"] = np.ascontiguousarray(sample[\"mask\"])\r\n\r\n        if \"disparity\" in sample:\r\n            disparity = sample[\"disparity\"].astype(np.float32)\r\n            sample[\"disparity\"] = np.ascontiguousarray(disparity)\r\n\r\n        if \"depth\" in sample:\r\n            depth = sample[\"depth\"].astype(np.float32)\r\n            sample[\"depth\"] = np.ascontiguousarray(depth)\r\n\r\n        return sample\r\n"
  },
  {
    "path": "modules/image/depth_estimation/MiDaS_Small/utils.py",
    "content": "# Refer https://github.com/intel-isl/MiDaS\r\n\"\"\"Utils for monoDepth.\r\n\"\"\"\r\nimport sys\r\nimport numpy as np\r\nimport cv2\r\n\r\n\r\ndef write_pfm(path, image, scale=1):\r\n    \"\"\"Write pfm file.\r\n\r\n    Args:\r\n        path (str): pathto file\r\n        image (array): data\r\n        scale (int, optional): Scale. Defaults to 1.\r\n    \"\"\"\r\n\r\n    with open(path, \"wb\") as file:\r\n        color = None\r\n\r\n        if image.dtype.name != \"float32\":\r\n            raise Exception(\"Image dtype must be float32.\")\r\n\r\n        image = np.flipud(image)\r\n\r\n        if len(image.shape) == 3 and image.shape[2] == 3:  # color image\r\n            color = True\r\n        elif (len(image.shape) == 2 or len(image.shape) == 3 and image.shape[2] == 1):  # greyscale\r\n            color = False\r\n        else:\r\n            raise Exception(\"Image must have H x W x 3, H x W x 1 or H x W dimensions.\")\r\n\r\n        file.write(\"PF\\n\" if color else \"Pf\\n\".encode())\r\n        file.write(\"%d %d\\n\".encode() % (image.shape[1], image.shape[0]))\r\n\r\n        endian = image.dtype.byteorder\r\n\r\n        if endian == \"<\" or endian == \"=\" and sys.byteorder == \"little\":\r\n            scale = -scale\r\n\r\n        file.write(\"%f\\n\".encode() % scale)\r\n\r\n        image.tofile(file)\r\n\r\n\r\ndef read_image(path):\r\n    \"\"\"Read image and output RGB image (0-1).\r\n\r\n    Args:\r\n        path (str): path to file\r\n\r\n    Returns:\r\n        array: RGB image (0-1)\r\n    \"\"\"\r\n    img = cv2.imread(path)\r\n    if img.ndim == 2:\r\n        img = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)\r\n    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB) / 255.0\r\n    return img\r\n\r\n\r\ndef write_depth(path, depth, bits=1):\r\n    \"\"\"Write depth map to pfm and png file.\r\n\r\n    Args:\r\n        path (str): filepath without extension\r\n        depth (array): depth\r\n    \"\"\"\r\n    write_pfm(path + \".pfm\", 
depth.astype(np.float32))\r\n\r\n    depth_min = depth.min()\r\n    depth_max = depth.max()\r\n\r\n    max_val = (2**(8 * bits)) - 1\r\n\r\n    if depth_max - depth_min > np.finfo(\"float\").eps:\r\n        out = max_val * (depth - depth_min) / (depth_max - depth_min)\r\n    else:\r\n        out = np.zeros(depth.shape, dtype=depth.type)\r\n\r\n    if bits == 1:\r\n        cv2.imwrite(path + \".png\", out.astype(\"uint8\"))\r\n    elif bits == 2:\r\n        cv2.imwrite(path + \".png\", out.astype(\"uint16\"))\r\n    return path + '.pfm', path + \".png\"\r\n"
  },
  {
    "path": "modules/image/face_detection/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【人脸检测】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 人脸检测\n人脸检测属于目标检测的一个重要分支，由于近年来安防市场、人脸识别、人脸安全方面的原因，成为目标检测中最重要的任务之一。\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n | [人脸检测](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server&en_category=FaceDetection) | 百度自研，18年3月WIDER Face 数据集**冠军模型**，           |\n| [超轻量人脸检测](https://www.paddlepaddle.org.cn/hubdetail?name=ultra_light_fast_generic_face_detector_1mb_640&en_category=FaceDetection) | 针对边缘计算设备或低算力设备(如用ARM推理)设计的实时超轻量级通用人脸检测模型，可以在低算力设备中如用ARM进行实时的通用场景的人脸检测推理。 |\n| [口罩人脸检测与识别](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server_mask&en_category=FaceDetection) | 业界**首个开源口罩人脸检测与识别模型**，引起广泛关注。     |\n"
  },
  {
    "path": "modules/image/face_detection/README_en.md",
    "content": "## **For better user experience, refer to the Web official documents -> [Face Detection](https://www.paddlepaddle.org.cn/hublist)**\n\n### Face Detection\n\nFace detection belongs to an important branch of the object detection. It has become one of the most important tasks in target detection in recent years in the security market, face recognition, and face security.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Face Detection](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server&en_category=FaceDetection) | Baidu self-developed model. It is the WIDER Face dataset ►Champion Model◄ in March 2018. |\n| [Ultra Lightweight Face Detection](https://www.paddlepaddle.org.cn/hubdetail?name=ultra_light_fast_generic_face_detector_1mb_640&en_category=FaceDetection) | Real-time ultra-lightweight general-purpose face detection model designed for edge computing devices or low computational power devices (e.g., ARM inference). It can perform real-time face detection inference in general-purpose scenarios in low computational power devices, such as ARM. |\n| [Mask face detection and recognition.](https://www.paddlepaddle.org.cn/hubdetail?name=pyramidbox_lite_server_mask&en_category=FaceDetection) | It is the First open source mask face detection and recognition model in the industry, with attracting widespread attention. |\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/README.md",
    "content": "# pyramidbox_face_detection\n\n|模型名称|pyramidbox_face_detection|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|PyramidBox|\n|数据集|WIDER FACE数据集|\n|是否支持Fine-tuning|否|\n|模型大小|220MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - PyramidBox是一种基于SSD的单阶段人脸检测器，它利用上下文信息解决困难人脸的检测问题。PyramidBox在六个尺度的特征图上进行不同层级的预测。该工作主要包括以下模块：LFPN、PyramidAnchors、CPM、Data-anchor-sampling。该PaddleHub Module的预训练数据集为WIDER FACE数据集，可支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pyramidbox_face_detection\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pyramidbox_face_detection --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"pyramidbox_face_detection\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,  \n                       score_thresh=0.15)\n    ```\n\n    - 
检测输入图片中的所有人脸位置。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - score_thresh (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pyramidbox_face_detection\n    ```\n\n  - 这样就完成了一个人脸检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_face_detection\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  修复numpy数据读取问题\n\n* 1.2.0\n\n  修复无法导出推理模型的问题\n\n  - ```shell\n    $ hub install pyramidbox_face_detection==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/README_en.md",
    "content": "# pyramidbox_face_detection\n\n|Module Name|pyramidbox_face_detection|\n| :--- | :---: |\n|Category|face detection|\n|Network|PyramidBox|\n|Dataset|WIDER FACEDataset|\n|Fine-tuning supported or not|No|\n|Module Size|220MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - PyramidBox is a one-stage face detector based on SSD. It can redict results across six scale levels of feature maps. This module is based on PyramidBox, trained on WIDER FACE Dataset, and supports face detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pyramidbox_face_detection\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run pyramidbox_face_detection --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"pyramidbox_face_detection\")\n    result = 
face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,  \n                       score_thresh=0.15)\n    ```\n\n    - Detect the position of all faces in the input image.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - use_gpu (bool): whether to use GPU; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - visualization (bool): whether to save the results as picture files;\n      - score_thresh (float): the confidence threshold\n\n      **NOTE:** provide data via either paths or images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to the specified path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m pyramidbox_face_detection\n    ```\n\n  - The face detection service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_face_detection\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Fix the problem of reading numpy data\n\n* 1.2.0\n\n  Fix a bug of save_inference_model\n\n  - ```shell\n    $ hub install pyramidbox_face_detection==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\n\ndef preprocess(image):\n    if image.mode == 'L':\n        image = image.convert('RGB')\n    shrink, max_shrink = get_shrink(image.size[1], image.size[0])\n    image_shape = [3, image.size[1], image.size[0]]\n    if shrink != 1:\n        h, w = int(image_shape[1] * shrink), int(image_shape[2] * shrink)\n        image = image.resize((w, h), Image.ANTIALIAS)\n        image_shape = [3, h, w]\n\n    img = np.array(image)\n    img = to_chw_bgr(img)\n    mean = [104., 117., 123.]\n    scale = 0.007843\n    img = img.astype('float32')\n    img -= np.array(mean)[:, np.newaxis, np.newaxis].astype('float32')\n    img = img * scale\n    img = np.array(img)\n    return img\n\n\ndef to_chw_bgr(image):\n    \"\"\"\n    Transpose image from HWC to CHW and from RBG to BGR.\n\n    Args:\n        image (np.array): an image with HWC and RBG layout.\n    \"\"\"\n    # HWC to CHW\n    if len(image.shape) == 3:\n        image = np.swapaxes(image, 1, 2)\n        image = np.swapaxes(image, 1, 0)\n    # RBG to BGR\n    image = image[[2, 1, 0], :, :]\n    return image\n\n\ndef get_shrink(height, width):\n    \"\"\"\n    shrink the original image according to the org_im_height and org_im_width.\n    calculate the value of shrink.\n\n    Args:\n        height (int): image height.\n        width (int): image width.\n    \"\"\"\n    max_shrink_v1 = (0x7fffffff / 577.0 / (height * width))**0.5\n    max_shrink_v2 = ((678 * 1024 * 2.0 * 2.0) / (height * width))**0.5\n\n    def get_round(x, loc):\n        str_x = str(x)\n        if '.' in str_x:\n            str_before, str_after = str_x.split('.')\n            len_after = len(str_after)\n            if len_after >= 3:\n                str_final = str_before + '.' 
+ str_after[0:loc]\n                return float(str_final)\n            else:\n                return x\n        # fall back to x when it has no decimal point\n        return x\n\n    max_shrink = get_round(min(max_shrink_v1, max_shrink_v2), 2) - 0.3\n    if max_shrink >= 1.5 and max_shrink < 2:\n        max_shrink = max_shrink - 0.1\n    elif max_shrink >= 2 and max_shrink < 3:\n        max_shrink = max_shrink - 0.2\n    elif max_shrink >= 3 and max_shrink < 4:\n        max_shrink = max_shrink - 0.3\n    elif max_shrink >= 4 and max_shrink < 5:\n        max_shrink = max_shrink - 0.4\n    elif max_shrink >= 5:\n        max_shrink = max_shrink - 0.5\n    elif max_shrink <= 0.1:\n        max_shrink = 0.1\n    shrink = max_shrink if max_shrink < 1 else 1\n    return shrink, max_shrink\n\n\ndef reader(images, paths):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths is not None:\n        assert type(paths) is list, \"paths should be a list.\"\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            each['org_im'] = Image.open(im_path)\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = Image.fromarray(cv2.cvtColor(im, cv2.COLOR_BGR2RGB))\n            each['org_im_width'], each['org_im_height'] = each['org_im'].size\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            component.append(each)\n\n    for element in component:\n        element['image'] = preprocess(element['org_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport ast\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .data_feed import reader\nfrom .processor import postprocess, base64_to_cv2\n\n\n@moduleinfo(\n    name=\"pyramidbox_face_detection\",\n    type=\"CV/face_detection\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\"Baidu's PyramidBox model for face detection.\",\n    version=\"1.2.0\")\nclass PyramidBoxFaceDetection:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pyramidbox_face_detection_widerface\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def face_detection(self,\n                       images=None,\n                       paths=None,\n                       data=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,\n        
               score_thresh=0.15):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): score threshold to limit the detection result.\n\n        Returns:\n            res (list[dict]): The result of face detection, keys are 'data' and 'path', the corresponding values are:\n            data (list[dict]): 5 keys, where\n                'left', 'top', 'right', 'bottom' are the coordinates of the detection bounding box,\n                'confidence' is the confidence of this bbox.\n            path (str): The path of original image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data:\n            if 'image' in data:\n                if paths is None:\n                    paths = list()\n                paths += data['image']\n\n        res = list()\n        # process one by one\n        for element in reader(images, paths):\n            image = np.expand_dims(element['image'], axis=0).astype('float32')\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image)\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output = output_handle.copy_to_cpu()\n            out = postprocess(\n                data_out=output,\n                org_im=element['org_im'],\n                org_im_path=element['org_im_path'],\n                org_im_width=element['org_im_width'],\n                org_im_height=element['org_im_height'],\n                output_dir=output_dir,\n                visualization=visualization,\n                score_thresh=score_thresh)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(\n            paths=[args.input_path],\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='detection_result', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=ast.literal_eval, default=False, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh', type=ast.literal_eval, default=0.15, help=\"score threshold of face detection.\")\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport time\n\nimport base64\nimport cv2\nimport numpy as np\nfrom PIL import ImageDraw\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, org_im_path, output_dir):\n    \"\"\"\n    Get save image name.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    if img.mode == 'RGBA':\n        ext = '.png'\n    else:\n        ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n    return save_im_path\n\n\ndef draw_bboxes(image, bboxes, org_im_path, output_dir):\n    \"\"\"\n    Draw bounding boxes on image.\n\n    Args:\n        bboxes (np.array): bounding boxes.\n    \"\"\"\n    draw = ImageDraw.Draw(image)\n    for i in range(len(bboxes)):\n        xmin, ymin, xmax, ymax = bboxes[i]\n        (left, right, top, bottom) = (xmin, xmax, ymin, ymax)\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=4, fill='red')\n    save_name = get_save_image_name(image, org_im_path, output_dir)\n    image.save(save_name)\n\n\ndef 
postprocess(data_out, org_im, org_im_path, org_im_width, org_im_height, output_dir, visualization, score_thresh):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im: (PIL.Image object): original image.\n        org_im_path (str): path of original image.\n        org_im_width (int): width of original image.\n        org_im_height (int): height of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        output (dict): keys are 'data' and 'path', the correspoding values are:\n            data (list[dict]): 5 keys, where\n                'left', 'top', 'right', 'bottom' are the coordinate of detection bounding box,\n                'confidence' is the confidence this bbox.\n            path (str): The path of original image.\n    \"\"\"\n    output = dict()\n    output['data'] = list()\n    output['path'] = org_im_path\n\n    if data_out.shape[0] == 0:\n        print(\"No face detected in {}\".format(org_im_path))\n    else:\n        det_conf = data_out[:, 1]\n        det_xmin = org_im_width * data_out[:, 2]\n        det_ymin = org_im_height * data_out[:, 3]\n        det_xmax = org_im_width * data_out[:, 4]\n        det_ymax = org_im_height * data_out[:, 5]\n        dets = np.column_stack((det_xmin, det_ymin, det_xmax, det_ymax, det_conf))\n        keep_index = np.where(dets[:, 4] >= score_thresh)[0]\n        dets = dets[keep_index, :]\n\n        if dets.shape[0] == 0:\n            print(\"No face detected in {}\".format(org_im_path))\n        else:\n            for detect_face in dets:\n                dt_i = dict()\n                dt_i['left'] = float(detect_face[0])\n                dt_i['top'] = float(detect_face[1])\n                dt_i['right'] = float(detect_face[2])\n                dt_i['bottom'] = float(detect_face[3])\n                dt_i['confidence'] 
= float(detect_face[4])\n                output['data'].append(dt_i)\n\n        if visualization:\n            check_dir(output_dir)\n            draw_bboxes(org_im, dets[:, 0:4], org_im_path, output_dir)\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_face_detection/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pyramidbox_face_detection\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        
self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.face_detection,\n            paths=['no.jpg']\n        )\n\n    def test_face_detection6(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.face_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/README.md",
    "content": "# pyramidbox_lite_mobile\n\n|模型名称|pyramidbox_lite_mobile|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|PyramidBox|\n|数据集|WIDER FACE数据集 + 百度自采人脸数据集|\n|是否支持Fine-tuning|否|\n|模型大小|7.3MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - PyramidBox-Lite是基于2018年百度发表于计算机视觉顶级会议ECCV 2018的论文PyramidBox而研发的轻量级模型，模型基于主干网络FaceBoxes，对于光照、口罩遮挡、表情变化、尺度变化等常见问题具有很强的鲁棒性。该PaddleHub Module是针对于移动端优化过的模型，适合部署于移动端或者边缘检测等算力受限的设备上，并基于WIDER FACE数据集和百度自采人脸数据集进行训练，支持预测，可用于人脸检测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pyramidbox_lite_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"pyramidbox_lite_mobile\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,\n                       
shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - 检测输入图片中的所有人脸位置。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - shrink (float): 用于设置图片的缩放比例，该值越大，则对于输入图片中的小尺寸人脸有更好的检测效果（模型计算成本越高），反之则对于大尺寸人脸有更好的检测效果；<br/>\n      - confs\\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_mobile\n    ```\n\n  - 这样就完成了一个人脸检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP 支持\n\n  从 PaddleHub 2.3.1 开始，可通过链接 http://127.0.0.1:8866/gradio/pyramidbox_lite_mobile 在浏览器中访问 pyramidbox_lite_mobile 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.2.1\n\n  移除 fluid api\n\n* 1.3.0\n\n  修复无法导出推理模型的问题\n\n* 1.4.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile==1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/README_en.md",
    "content": "# pyramidbox_lite_mobile\n\n|Module Name|pyramidbox_lite_mobile|\n| :--- | :---: |\n|Category|face detection|\n|Network|PyramidBox|\n|Dataset|WIDER FACEDataset + Baidu Face Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|7.3MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - PyramidBox-Lite is a light-weight model based  on PyramidBox proposed by Baidu in ECCV 2018. This model has solid robustness against interferences such as light and scale variation. This module is optimized for mobile device, based on PyramidBox, trained on WIDER FACE Dataset and Baidu Face Dataset, and can be used for face detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run pyramidbox_lite_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector 
= hub.Module(name=\"pyramidbox_lite_mobile\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,\n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - Detect all faces in image\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - shrink (float): the scale to resize image\n      - confs\\_threshold (float): the confidence threshold\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online 
service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_mobile\n    ```\n\n  - The face detection service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - Once the server is configured, use the following lines of code to send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP of pyramidbox_lite_mobile can be accessed in the browser at http://127.0.0.1:8866/gradio/pyramidbox_lite_mobile.\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.2.1\n\n  Remove fluid api\n\n* 1.3.0\n\n  Fix a bug in save_inference_model\n\n* 1.4.0\n\n  Add Gradio APP support.\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile==1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef preprocess(org_im, shrink):\n    image = org_im.copy()\n    image_height, image_width, image_channel = image.shape\n    if shrink != 1:\n        image_height, image_width = int(image_height * shrink), int(image_width * shrink)\n        image = cv2.resize(image, (image_width, image_height), cv2.INTER_NEAREST)\n    # HWC to CHW\n    if len(image.shape) == 3:\n        image = np.swapaxes(image, 1, 2)\n        image = np.swapaxes(image, 1, 0)\n    # mean, std\n    mean = [104., 117., 123.]\n    scale = 0.007843\n    image = image.astype('float32')\n    image -= np.array(mean)[:, np.newaxis, np.newaxis].astype('float32')\n    image = image * scale\n    return image, image_height, image_width\n\n\ndef reader(images, paths, shrink):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n        paths (list[str]): paths to images.\n        shrink (float): parameter to control the resize scale in preprocess.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths is not None:\n        assert type(paths) is list, \"paths should be a list.\"\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            
component.append(each)\n\n    for element in component:\n        element['image'], element['image_height'], element['image_width'] = preprocess(element['org_im'], shrink)\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/label_list.txt",
    "content": "BACKGROUND\nface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pyramidbox_lite_mobile\",\n            type=\"CV/face_detection\",\n            author=\"baidu-vis\",\n            author_email=\"\",\n            summary=\"PyramidBox-Lite-Mobile is a high-performance face detection model.\",\n            version=\"1.4.0\")\nclass PyramidBoxLiteMobile:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pyramidbox_lite_mobile_face_detection\",\n                                                          \"model\")\n        self._set_config()\n        self.processor = self\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def face_detection(self,\n                       images=None,\n   
                    paths=None,\n                       data=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,\n                       shrink=0.5,\n                       confs_threshold=0.6):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            shrink (float): parameter to control the resize scale in preprocess.\n            confs_threshold (float): confidence threshold.\n\n        Returns:\n            res (list[dict]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data:\n            if 'image' in data:\n                if paths is None:\n                    paths = list()\n                paths += data['image']\n            elif 'data' in data:\n                if images is None:\n                    images = list()\n                images += data['data']\n\n        res = list()\n        # process one by one\n        for element in reader(images, paths, shrink):\n            image = np.expand_dims(element['image'], axis=0).astype('float32')\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output_data = output_handle.copy_to_cpu()\n\n            out = postprocess(data_out=output_data,\n                              org_im=element['org_im'],\n                              org_im_path=element['org_im_path'],\n                              image_width=element['image_width'],\n                              image_height=element['image_height'],\n                              output_dir=output_dir,\n                              visualization=visualization,\n                              shrink=shrink,\n                              confs_threshold=confs_threshold)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    
def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization,\n                                      shrink=args.shrink,\n                                      confs_threshold=args.confs_threshold)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                               
            type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--shrink',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"resize the image to shrink * original_shape before feeding into network.\")\n        self.arg_input_group.add_argument('--confs_threshold',\n                                          type=ast.literal_eval,\n                                          default=0.6,\n                                          help=\"confidence threshold.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, shrink, confs_threshold):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.face_detection(paths=[image],\n                                    use_gpu=False,\n                                    visualization=True,\n                                    output_dir=temp_dir,\n                                    shrink=shrink,\n                                    confs_threshold=confs_threshold)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(inference, [\n            gr.inputs.Image(type=\"filepath\"),\n            gr.Slider(0.0, 1.0, 0.5, step=0.01),\n            gr.Slider(0.0, 1.0, 0.6, step=0.01)\n        ],\n                                 gr.outputs.Image(type=\"ndarray\"),\n                                 title='pyramidbox_lite_mobile')\n        return interface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/processor.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    img = Image.fromarray(org_im[:, :, ::-1])\n    if img.mode == 'RGBA':\n        ext = '.png'\n    elif img.mode == 'RGB':\n        ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n\n\ndef clip_bbox(bbox, img_height, img_width):\n    bbox['left'] = int(max(min(bbox['left'], img_width), 0.))\n    bbox['top'] = int(max(min(bbox['top'], img_height), 0.))\n    bbox['right'] = int(max(min(bbox['right'], img_width), 0.))\n    bbox['bottom'] = int(max(min(bbox['bottom'], img_height), 0.))\n    return bbox\n\n\ndef postprocess(data_out, org_im, org_im_path, image_height, image_width, output_dir, visualization, shrink,\n                confs_threshold):\n    \"\"\"\n    Postprocess output 
of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_path (list): path of riginal image.\n        image_height (int): height of preprocessed image.\n        image_width (int): width of preprocessed image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        shrink (float): parameter to control the resize scale in preprocess.\n        confs_threshold (float): confidence threshold.\n\n    Returns:\n        output (dict): keys are 'data' and 'path', the correspoding values are:\n            data (list[dict]): 5 keys, where\n                'left', 'top', 'right', 'bottom' are the coordinate of detection bounding box,\n                'confidence' is the confidence this bbox.\n            path (str): The path of original image.\n    \"\"\"\n    output = dict()\n    output['data'] = list()\n    output['path'] = org_im_path\n\n    for each_data in data_out:\n        # each_data is a list: [label, confidence, left, top, right, bottom]\n        if each_data[1] > confs_threshold:\n            dt_bbox = dict()\n            dt_bbox['confidence'] = float(each_data[1])\n            dt_bbox['left'] = image_width * each_data[2] / shrink\n            dt_bbox['top'] = image_height * each_data[3] / shrink\n            dt_bbox['right'] = image_width * each_data[4] / shrink\n            dt_bbox['bottom'] = image_height * each_data[5] / shrink\n            dt_bbox = clip_bbox(dt_bbox, org_im.shape[0], org_im.shape[1])\n            output['data'].append(dt_bbox)\n\n    if visualization:\n        check_dir(output_dir)\n        save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n        im_out = org_im.copy()\n        if len(output['data']) > 0:\n            for bbox in output['data']:\n                cv2.rectangle(im_out, (bbox['left'], bbox['top']), (bbox['right'], bbox['bottom']), 
(255, 255, 0), 2)\n        cv2.imwrite(save_im_path, im_out)\n\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pyramidbox_lite_mobile\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        
self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.face_detection,\n            paths=['no.jpg']\n        )\n\n    def test_face_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.face_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/README.md",
    "content": "# pyramidbox_lite_mobile_mask\n\n|模型名称|pyramidbox_lite_mobile_mask|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|PyramidBox|\n|数据集|WIDER FACE数据集 + 百度自采人脸数据集|\n|是否支持Fine-tuning|否|\n|模型大小|1.2MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131603304-690a2e3b-9f24-42f6-9297-a12ada884191.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - PyramidBox-Lite是基于2018年百度发表于计算机视觉顶级会议ECCV 2018的论文PyramidBox而研发的轻量级模型，模型基于主干网络FaceBoxes，对于光照、口罩遮挡、表情变化、尺度变化等常见问题具有很强的鲁棒性。该PaddleHub Module是针对于移动端优化过的模型，适合部署于移动端或者边缘检测等算力受限的设备上，并基于WIDER FACE数据集和百度自采人脸数据集进行训练，支持预测，可用于检测人脸是否佩戴口罩。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile_mask\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pyramidbox_lite_mobile_mask --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    mask_detector = hub.Module(name=\"pyramidbox_lite_mobile_mask\")\n    result = mask_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = mask_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(face_detector_module=None)\n    ```\n\n    - **参数**\n\n      - face\\_detector\\_module (class): 人脸检测模型，默认为 pyramidbox\\_lite\\_mobile。\n\n  - ```python\n    def face_detection(images=None,\n    
                   paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - 识别输入图片中的所有的人脸，并判断有无戴口罩。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - use\\_multi\\_scale (bool) : 用于设置是否开启多尺度的人脸检测，开启多尺度人脸检测能够更好的检测到输入图像中不同尺寸的人脸，但是会增加模型计算量，降低预测速度；<br/>\n      - shrink (float): 用于设置图片的缩放比例，该值越大，则对于输入图片中的小尺寸人脸有更好的检测效果（模型计算成本越高），反之则对于大尺寸人脸有更好的检测效果；<br/>\n      - confs\\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - label (str): 识别标签，为 'NO MASK' 或者 'MASK'；\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n\n  - ```python\n    def set_face_detector_module(face_detector_module)\n    ```\n    - 设置口罩检测模型中进行人脸检测的底座模型。\n    - **参数**\n\n      - face\\_detector\\_module (class): 人脸检测模型。\n\n  - ```python\n    def get_face_detector_module()\n    ```\n    - 获取口罩检测模型中进行人脸检测的底座模型。\n    - **返回**\n\n      - 当前模型使用的人脸检测模型\n\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线口罩检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n  
  $ hub serving start -m pyramidbox_lite_mobile_mask\n    ```\n\n  - 这样就完成了一个口罩检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_mobile_mask\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP 支持\n\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/pyramidbox_lite_mobile_mask 在浏览器中访问 pyramidbox_lite_mobile_mask 的 Gradio APP。\n\n## 五、Paddle Lite部署\n- ### 通过python执行以下代码，保存模型\n  - ```python\n    import paddlehub as hub\n    pyramidbox_lite_mobile_mask = hub.Module(name=\"pyramidbox_lite_mobile_mask\")\n\n    # 将模型保存在test_program文件夹之中\n    pyramidbox_lite_mobile_mask.save_inference_model(dirname=\"test_program\")\n    ```\n\n- ### 进行模型转换\n  - 从paddlehub下载的是预测模型，可以使用PaddleLite提供的模型优化工具OPT对预测模型进行转换，转换之后进而可以实现在手机等端侧硬件上的部署，具体请参考[OPT工具](https://paddle-lite.readthedocs.io/zh/latest/user_guides/model_optimize_tool.html)\n\n- ### 模型通过Paddle Lite进行部署\n  - 参考[Paddle-Lite口罩检测模型部署教程](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/cxx)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.3.1\n\n  移除 fluid api\n\n* 1.4.0\n\n  修复无法导出模型的问题\n\n* 1.5.0\n\n  添加 Gradio APP 的支持\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile_mask==1.5.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/README_en.md",
    "content": "# pyramidbox_lite_mobile_mask\n\n|Module Name|pyramidbox_lite_mobile_mask|\n| :--- | :---: |\n|Category|face detection|\n|Network|PyramidBox|\n|Dataset|WIDER FACE Dataset + Baidu Face Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|1.2MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131603304-690a2e3b-9f24-42f6-9297-a12ada884191.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - PyramidBox-Lite is a light-weight model based on PyramidBox, proposed by Baidu in ECCV 2018. The model is robust against interference such as lighting and scale variation. This module is optimized for mobile devices, based on PyramidBox, trained on the WIDER FACE Dataset and the Baidu Face Dataset, and can be used for mask detection.\n\n\n\n## II.Installation\n\n- ### 1、Environment Dependencies  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile_mask\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command Line Prediction\n\n  - ```shell\n    $ hub run pyramidbox_lite_mobile_mask --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import 
cv2\n\n    mask_detector = hub.Module(name=\"pyramidbox_lite_mobile_mask\")\n    result = mask_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = mask_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - Detect all the faces in an image and determine whether each face is wearing a mask.\n\n    - **Parameters**\n\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n      - use\_multi\_scale (bool): whether to detect faces across multiple scales;\n      - shrink (float): the scale used to resize the input image;\n      - confs\_threshold (float): the confidence threshold\n\n      **NOTE:** provide the input data with either paths or images, not both\n\n    - **Return**\n\n      - res (list\[dict\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - label (str): 'NO MASK' or 'MASK';\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection 
box\n          - bottom (int): the lower right corner y coordinate of the detection box\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of mask detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_mobile_mask\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_mobile_mask\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for pyramidbox_lite_mobile_mask can be accessed in the browser at http://127.0.0.1:8866/gradio/pyramidbox_lite_mobile_mask.\n\n## V.Paddle Lite Deployment\n- ### Save the model\n  - ```python\n    import paddlehub as hub\n    pyramidbox_lite_mobile_mask = hub.Module(name=\"pyramidbox_lite_mobile_mask\")\n\n    # save model in directory named test_program\n    
pyramidbox_lite_mobile_mask.save_inference_model(dirname=\"test_program\")\n    ```\n\n- ### Transform the model\n\n  - The model downloaded from PaddleHub is an inference model. To deploy it on mobile devices, we can use the OPT tool provided by Paddle Lite to transform the model. For more information, please refer to [OPT tool](https://paddle-lite.readthedocs.io/zh/latest/user_guides/model_optimize_tool.html)\n\n- ### Deploy the model with Paddle Lite\n  - Please refer to [Paddle-Lite mask detection model deployment demo](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/cxx)\n\n## VI.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.3.1\n\n  Remove fluid api\n\n* 1.4.0\n\n  Fix a bug of save_inference_model\n\n* 1.5.0\n\n  Add Gradio APP support.\n\n  - ```shell\n    $ hub install pyramidbox_lite_mobile_mask==1.5.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/data_feed.py",
    "content": "# coding=utf-8\nimport math\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\nmulti_scales = [0.3, 0.6, 0.9]\n\n\ndef bbox_vote(det):\n    order = det[:, 4].ravel().argsort()[::-1]\n    det = det[order, :]\n    if det.shape[0] == 0:\n        dets = np.array([[10, 10, 20, 20, 0.002]])\n        det = np.empty(shape=[0, 5])\n    while det.shape[0] > 0:\n        # IOU\n        area = (det[:, 2] - det[:, 0] + 1) * (det[:, 3] - det[:, 1] + 1)\n        xx1 = np.maximum(det[0, 0], det[:, 0])\n        yy1 = np.maximum(det[0, 1], det[:, 1])\n        xx2 = np.minimum(det[0, 2], det[:, 2])\n        yy2 = np.minimum(det[0, 3], det[:, 3])\n        w = np.maximum(0.0, xx2 - xx1 + 1)\n        h = np.maximum(0.0, yy2 - yy1 + 1)\n        inter = w * h\n        o = inter / (area[0] + area[:] - inter)\n        # nms\n        merge_index = np.where(o >= 0.3)[0]\n        det_accu = det[merge_index, :]\n        det = np.delete(det, merge_index, 0)\n        if merge_index.shape[0] <= 1:\n            if det.shape[0] == 0:\n                try:\n                    dets = np.row_stack((dets, det_accu))\n                except:\n                    dets = det_accu\n            continue\n        det_accu[:, 0:4] = det_accu[:, 0:4] * np.tile(det_accu[:, -1:], (1, 4))\n        max_score = np.max(det_accu[:, 4])\n        det_accu_sum = np.zeros((1, 5))\n        det_accu_sum[:, 0:4] = np.sum(det_accu[:, 0:4], axis=0) / np.sum(det_accu[:, -1:])\n        det_accu_sum[:, 4] = max_score\n        try:\n            dets = np.row_stack((dets, det_accu_sum))\n        except:\n            dets = det_accu_sum\n    dets = dets[0:750, :]\n    return dets\n\n\ndef crop(image, pts, shift=0, scale=1.5, rotate=0, res_width=128, res_height=128):\n    res = (res_width, res_height)\n    idx1 = 0\n    idx2 = 1\n    # angle\n    alpha = 0\n    if pts[idx2, 0] != -1 and pts[idx2, 1] != -1 and pts[idx1, 0] != -1 and pts[idx1, 1] != 
-1:\n        alpha = math.atan2(pts[idx2, 1] - pts[idx1, 1], pts[idx2, 0] - pts[idx1, 0]) * 180 / math.pi\n    pts[pts == -1] = np.inf\n    coord_min = np.min(pts, 0)\n    pts[pts == np.inf] = -1\n    coord_max = np.max(pts, 0)\n    # coordinates of center point\n    c = np.array([coord_max[0] - (coord_max[0] - coord_min[0]) / 2,\n                  coord_max[1] - (coord_max[1] - coord_min[1]) / 2])  # center\n    max_wh = max((coord_max[0] - coord_min[0]) / 2, (coord_max[1] - coord_min[1]) / 2)\n    # shift the center point; add the eye-line angle to the rotation\n    c = c + shift * max_wh\n    rotate = rotate + alpha\n    M = cv2.getRotationMatrix2D((c[0], c[1]), rotate, res[0] / (2 * max_wh * scale))\n    M[0, 2] = M[0, 2] - (c[0] - res[0] / 2.0)\n    M[1, 2] = M[1, 2] - (c[1] - res[1] / 2.0)\n    image_out = cv2.warpAffine(image, M, res)\n    return image_out, M\n\n\ndef color_normalize(image, mean, std=None):\n    if image.shape[-1] == 1:\n        # replicate the single channel to 3 channels\n        image = np.repeat(image, 3, axis=2)\n    h, w, c = image.shape\n    image = np.transpose(image, (2, 0, 1))\n    image = np.subtract(image.reshape(c, -1), mean[:, np.newaxis]).reshape(-1, h, w)\n    image = np.transpose(image, (1, 2, 0))\n    return image\n\n\ndef process_image(org_im, face):\n    pts = np.array([\n        face['left'], face['top'], face['right'], face['top'], face['left'], face['bottom'], face['right'],\n        face['bottom']\n    ]).reshape(4, 2).astype(np.float32)\n    image_in, M = crop(org_im, pts)\n    image_in = image_in / 256.0\n    image_in = color_normalize(image_in, mean=np.array([0.5, 0.5, 0.5]))\n    image_in = image_in.astype(np.float32).transpose([2, 0, 1]).reshape(-1, 3, 128, 128)\n    return image_in\n\n\ndef reader(face_detector, shrink, confs_threshold, images, paths, use_gpu, use_multi_scale):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        face_detector (class): class to detect faces.\n        shrink (float): parameter to control the resize scale in face_detector.\n        
confs_threshold (float): confidence threshold of face_detector.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n        paths (list[str]): paths to images.\n        use_gpu (bool): whether to use gpu in face_detector.\n        use_multi_scale (bool): whether to enable multi-scale face detection.\n    Yield:\n        element (collections.OrderedDict): info of original image, preprocessed image, contains 3 keys:\n            org_im (numpy.ndarray) : original image.\n            org_im_path (str): path to original image.\n            preprocessed (list[OrderedDict]):each element contains 2 keys:\n               face  (dict): face detected in the original image.\n               image (numpy.ndarray): data to be fed into neural network.\n    \"\"\"\n    component = list()\n    if paths is not None:\n        assert type(paths) is list, \"paths should be a list.\"\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            component.append(each)\n\n    for element in component:\n        if use_multi_scale:\n            scale_res = list()\n            detect_faces = list()\n            for scale in multi_scales:\n                _detect_res = face_detector.face_detection(images=[element['org_im']],\n                                                           use_gpu=use_gpu,\n                                                           visualization=False,\n                                   
                        shrink=scale,\n                                                           confs_threshold=confs_threshold)\n\n                _s = list()\n                for _face in _detect_res[0]['data']:\n                    _face_list = [_face['left'], _face['top'], _face['right'], _face['bottom'], _face['confidence']]\n                    _s.append(_face_list)\n\n                if _s:\n                    scale_res.append(np.array(_s))\n\n            scale_res = np.row_stack(scale_res)\n            scale_res = bbox_vote(scale_res)\n            keep_index = np.where(scale_res[:, 4] >= confs_threshold)[0]\n            scale_res = scale_res[keep_index, :]\n            for data in scale_res:\n                face = {'left': data[0], 'top': data[1], 'right': data[2], 'bottom': data[3], 'confidence': data[4]}\n                detect_faces.append(face)\n        else:\n            _detect_res = face_detector.face_detection(images=[element['org_im']],\n                                                       use_gpu=use_gpu,\n                                                       visualization=False,\n                                                       shrink=shrink,\n                                                       confs_threshold=confs_threshold)\n            detect_faces = _detect_res[0]['data']\n\n        element['preprocessed'] = list()\n        for face in detect_faces:\n            handled = OrderedDict()\n            handled['face'] = face\n            handled['image'] = process_image(element['org_im'], face)\n            element['preprocessed'].append(handled)\n\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/label_list.txt",
    "content": "NO MASK\nMASK\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nimport paddlehub as hub\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"pyramidbox_lite_mobile_mask\",\n    type=\"CV/face_detection\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\n    \"Pyramidbox-Lite-Mobile-Mask is a high-performance face detection model used to detect whether people wear masks.\",\n    version=\"1.5.0\")\nclass PyramidBoxLiteMobileMask:\n\n    def __init__(self, face_detector_module=None):\n        \"\"\"\n        Args:\n            face_detector_module (class): module to detect face.\n        \"\"\"\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pyramidbox_lite_mobile_mask_model\", \"model\")\n        if face_detector_module is None:\n            self.face_detector = hub.Module(name='pyramidbox_lite_mobile')\n        else:\n            self.face_detector = face_detector_module\n        self._set_config()\n        self.processor = self\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if 
use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def set_face_detector_module(self, face_detector_module):\n        \"\"\"\n        Set face detector.\n        Args:\n            face_detector_module (class): module to detect face.\n        \"\"\"\n        self.face_detector = face_detector_module\n\n    def get_face_detector_module(self):\n        return self.face_detector\n\n    def face_detection(self,\n                       images=None,\n                       paths=None,\n                       data=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size of image tensor to be fed into the later classification network.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n            use_multi_scale (bool): whether to enable multi-scale face detection. 
Enabling multi-scale face detection\n                can improve the accuracy for faces of different sizes; however,\n                it reduces the prediction speed due to the extra computation.\n            shrink (float): parameter to control the resize scale in preprocess.\n            confs_threshold (float): confidence threshold.\n\n        Returns:\n            res (list[dict]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except (KeyError, IndexError, ValueError):\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data:\n            if 'image' in data:\n                if paths is None:\n                    paths = list()\n                paths += data['image']\n            elif 'data' in data:\n                if images is None:\n                    images = list()\n                images += data['data']\n\n        # get all data\n        all_element = list()\n        for yield_data in reader(self.face_detector, shrink, confs_threshold, images, paths, use_gpu, use_multi_scale):\n            all_element.append(yield_data)\n\n        image_list = list()\n        element_image_num = list()\n        for i in range(len(all_element)):\n            element_image = [handled['image'] for handled in all_element[i]['preprocessed']]\n            element_image_num.append(len(element_image))\n            image_list.extend(element_image)\n\n        total_num = len(image_list)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        predict_out = np.zeros((1, 2))\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for element_id 
in range(batch_size):\n                try:\n                    batch_data.append(image_list[handle_id + element_id])\n                except:\n                    pass\n\n            image_arr = np.squeeze(np.array(batch_data), axis=1)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_arr)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output_data = output_handle.copy_to_cpu()\n\n            predict_out = np.concatenate((predict_out, output_data))\n\n        predict_out = predict_out[1:]\n        # postprocess one by one\n        res = list()\n        for i in range(len(all_element)):\n            detect_faces_list = [handled['face'] for handled in all_element[i]['preprocessed']]\n            interval_left = sum(element_image_num[0:i])\n            interval_right = interval_left + element_image_num[i]\n            out = postprocess(confidence_out=predict_out[interval_left:interval_right],\n                              org_im=all_element[i]['org_im'],\n                              org_im_path=all_element[i]['org_im_path'],\n                              detected_faces=detect_faces_list,\n                              output_dir=output_dir,\n                              visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        
self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization,\n                                      shrink=args.shrink,\n                                      confs_threshold=args.confs_threshold)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           
default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--shrink',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"resize the image to `shrink * original_shape` before feeding into network.\")\n        self.arg_input_group.add_argument('--confs_threshold',\n                                          type=ast.literal_eval,\n                                          default=0.6,\n                                          help=\"confidence threshold.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, shrink, confs_threshold):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.face_detection(paths=[image],\n                                    use_gpu=False,\n                                    visualization=True,\n                                    output_dir=temp_dir,\n                                    shrink=shrink,\n                                    confs_threshold=confs_threshold)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(inference, [\n            gr.inputs.Image(type=\"filepath\"),\n            gr.Slider(0.0, 1.0, 0.5, step=0.01),\n            gr.Slider(0.0, 1.0, 0.6, step=0.01)\n        ],\n                                 gr.outputs.Image(type=\"ndarray\"),\n                                 title='pyramidbox_lite_mobile_mask')\n        return interface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\nlabel_list = ['NO MASK', 'MASK']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    img = Image.fromarray(org_im[:, :, ::-1])\n    if img.mode == 'RGBA':\n        ext = '.png'\n    elif img.mode == 'RGB':\n        ext = '.jpg'\n    elif img.mode == 'L':  # black and white\n        ext = '.jpg'\n    else:  # fall back to .jpg for any other mode\n        ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n\n\ndef draw_bounding_box_on_image(save_im_path, output_data):\n    image = Image.open(save_im_path)\n    draw = ImageDraw.Draw(image)\n    for bbox in output_data:\n        # draw bounding box\n        if bbox['label'] == \"MASK\":\n            draw.line([(bbox['left'], bbox['top']), (bbox['left'], bbox['bottom']), (bbox['right'], bbox['bottom']),\n          
             (bbox['right'], bbox['top']), (bbox['left'], bbox['top'])],\n                      width=2,\n                      fill='green')\n        else:\n            draw.line([(bbox['left'], bbox['top']), (bbox['left'], bbox['bottom']), (bbox['right'], bbox['bottom']),\n                       (bbox['right'], bbox['top']), (bbox['left'], bbox['top'])],\n                      width=2,\n                      fill='red')\n        # draw label\n        text = bbox['label'] + \": %.2f%%\" % (100 * bbox['confidence'])\n        textsize_width, textsize_height = draw.textsize(text=text)\n        if image.mode == 'RGB' or image.mode == 'RGBA':\n            box_fill = (255, 255, 255)\n            text_fill = (0, 0, 0)\n        else:\n            box_fill = (255)\n            text_fill = (0)\n\n        draw.rectangle(xy=(bbox['left'], bbox['top'] - (textsize_height + 5), bbox['left'] + textsize_width + 10,\n                           bbox['top'] - 3),\n                       fill=box_fill)\n        draw.text(xy=(bbox['left'], bbox['top'] - 15), text=text, fill=text_fill)\n    image.save(save_im_path)\n\n\ndef postprocess(confidence_out, org_im, org_im_path, detected_faces, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network. 
one element at a time.\n\n    Args:\n        confidence_out (numpy.ndarray): confidences of each label.\n        org_im (numpy.ndarray): original image.\n        org_im_path (str): path of original image.\n        detected_faces (list): faces detected in a picture.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        output (dict): keys are 'data' and 'path', the corresponding values are:\n            data (list[dict]): 6 keys, where\n                'label' is `MASK` or `NO MASK`,\n                'left', 'top', 'right', 'bottom' are the coordinates of the detection bounding box,\n                'confidence' is the confidence of mask detection.\n            path (str): The path of original image.\n    \"\"\"\n    output = dict()\n    output['data'] = list()\n    output['path'] = org_im_path\n\n    for index, face in enumerate(detected_faces):\n        label_idx = np.argmax(confidence_out[index])\n        label_confidence = confidence_out[index][label_idx]\n        bbox = dict()\n        bbox['label'] = label_list[label_idx]\n        bbox['confidence'] = label_confidence\n        bbox['top'] = detected_faces[index]['top']\n        bbox['bottom'] = detected_faces[index]['bottom']\n        bbox['left'] = detected_faces[index]['left']\n        bbox['right'] = detected_faces[index]['right']\n        output['data'].append(bbox)\n\n    if visualization:\n        check_dir(output_dir)\n        save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n        cv2.imwrite(save_im_path, org_im)\n        draw_bounding_box_on_image(save_im_path, output['data'])\n\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_mobile_mask/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pyramidbox_lite_mobile_mask\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        
self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=True, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(AssertionError, self.module.face_detection, paths=['no.jpg'])\n\n    def test_face_detection6(self):\n        self.assertRaises(AttributeError, self.module.face_detection, images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/face_detector.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model/face_detector.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/README.md",
    "content": "# pyramidbox_lite_server\n\n|模型名称|pyramidbox_lite_server|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|PyramidBox|\n|数据集|WIDER FACE数据集 + 百度自采人脸数据集|\n|是否支持Fine-tuning|否|\n|模型大小|8MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - PyramidBox-Lite是基于2018年百度发表于计算机视觉顶级会议ECCV 2018的论文PyramidBox而研发的轻量级模型，模型基于主干网络FaceBoxes，对于光照、口罩遮挡、表情变化、尺度变化等常见问题具有很强的鲁棒性。该PaddleHub Module基于WIDER FACE数据集和百度自采人脸数据集进行训练，支持预测，可用于人脸检测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pyramidbox_lite_server\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pyramidbox_lite_server --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"pyramidbox_lite_server\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,  \n                       shrink=0.5,\n                       
confs_threshold=0.6)\n    ```\n\n    - 检测输入图片中的所有人脸位置。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - shrink (float): 用于设置图片的缩放比例，该值越大，则对于输入图片中的小尺寸人脸有更好的检测效果（模型计算成本越高），反之则对于大尺寸人脸有更好的检测效果；<br/>\n      - confs\\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_server\n    ```\n\n  - 这样就完成了一个人脸检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_server\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### gradio app 支持\n\n  从 
PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/pyramidbox_lite_server 在浏览器中访问 pyramidbox_lite_server 的 Gradio APP。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.2.0\n\n  修复numpy数据读取问题\n\n* 1.2.1\n\n  移除 fluid api\n\n* 1.3.0\n\n  修复无法导出推理模型的问题\n\n* 1.4.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install pyramidbox_lite_server==1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/README_en.md",
    "content": "# pyramidbox_lite_server\n\n|Module Name|pyramidbox_lite_server|\n| :--- | :---: |\n|Category|face detection|\n|Network|PyramidBox|\n|Dataset|WIDER FACEDataset + Baidu Face Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|8MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131602468-351eb3fb-81e3-4294-ac8e-b49a3a0232cb.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - PyramidBox-Lite is a light-weight model based  on PyramidBox proposed by Baidu in ECCV 2018. This model has solid robustness against interferences such as light and scale variation. This module is based on PyramidBox, trained on WIDER FACE Dataset and Baidu Face Dataset, and can be used for face detection.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pyramidbox_lite_server\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run pyramidbox_lite_server --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = 
hub.Module(name=\"pyramidbox_lite_server\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,  \n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - Detect all faces in image\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - shrink (float): the scale to resize image\n      - confs\\_threshold (float): the confidence threshold\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online 
service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_server\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If you want to use GPU for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_server\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for pyramidbox_lite_server can be accessed in the browser at http://127.0.0.1:8866/gradio/pyramidbox_lite_server.\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.2.0\n\n  Fix the problem of reading numpy data\n\n* 1.2.1\n\n  Remove fluid api\n\n* 1.3.0\n\n  Fix a bug of save_inference_model\n\n* 1.4.0\n\n  Add Gradio APP support.\n\n  - ```shell\n    $ hub install pyramidbox_lite_server==1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/data_feed.py",
    "content": "import os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef preprocess(org_im, shrink):\n    image = org_im.copy()\n    image_height, image_width, image_channel = image.shape\n    if shrink != 1:\n        image_height, image_width = int(image_height * shrink), int(image_width * shrink)\n        # interpolation must be passed by keyword: the third positional argument of cv2.resize is dst\n        image = cv2.resize(image, (image_width, image_height), interpolation=cv2.INTER_NEAREST)\n    # HWC to CHW\n    if len(image.shape) == 3:\n        image = np.swapaxes(image, 1, 2)\n        image = np.swapaxes(image, 1, 0)\n    # mean, std\n    mean = [104., 117., 123.]\n    scale = 0.007843\n    image = image.astype('float32')\n    image -= np.array(mean)[:, np.newaxis, np.newaxis].astype('float32')\n    image = image * scale\n    return image, image_height, image_width\n\n\ndef reader(images, paths, shrink):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n        paths (list[str]): paths to images.\n        shrink (float): parameter to control the resize scale in preprocess.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths is not None:\n        assert type(paths) is list, \"paths should be a list.\"\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            
component.append(each)\n\n    for element in component:\n        element['image'], element['image_height'], element['image_width'] = preprocess(element['org_im'], shrink)\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/label_list.txt",
    "content": "BACKGROUND\nface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"pyramidbox_lite_server\",\n            type=\"CV/face_detection\",\n            author=\"baidu-vis\",\n            author_email=\"\",\n            summary=\"PyramidBox-Lite-Server is a high-performance face detection model.\",\n            version=\"1.4.0\")\nclass PyramidBoxLiteServer:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pyramidbox_lite_server_face_detection\",\n                                                          \"model\")\n        self._set_config()\n        self.processor = self\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def face_detection(self,\n                       images=None,\n                  
     paths=None,\n                       data=None,\n                       use_gpu=False,\n                       output_dir='detection_result',\n                       visualization=False,\n                       shrink=0.5,\n                       confs_threshold=0.6):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            shrink (float): parameter to control the resize scale in preprocess.\n            confs_threshold (float): confidence threshold.\n\n        Returns:\n            res (list[dict]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data:\n            if 'image' in data:\n                if paths is None:\n                    paths = list()\n                paths += data['image']\n            elif 'data' in data:\n                if images is None:\n                    images = list()\n                images += data['data']\n\n        res = list()\n        # process one by one\n        for element in reader(images, paths, shrink):\n            image = np.expand_dims(element['image'], axis=0).astype('float32')\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output_data = output_handle.copy_to_cpu()\n\n            out = postprocess(data_out=output_data,\n                              org_im=element['org_im'],\n                              org_im_path=element['org_im_path'],\n                              image_width=element['image_width'],\n                              image_height=element['image_height'],\n                              output_dir=output_dir,\n                              visualization=visualization,\n                              shrink=shrink,\n                              confs_threshold=confs_threshold)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    
def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization,\n                                      shrink=args.shrink,\n                                      confs_threshold=args.confs_threshold)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                               
            type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--shrink',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"resize the image to shrink * original_shape before feeding into network.\")\n        self.arg_input_group.add_argument('--confs_threshold',\n                                          type=ast.literal_eval,\n                                          default=0.6,\n                                          help=\"confidence threshold.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, shrink, confs_threshold):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.face_detection(paths=[image],\n                                    use_gpu=False,\n                                    visualization=True,\n                                    output_dir=temp_dir,\n                                    shrink=shrink,\n                                    confs_threshold=confs_threshold)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(inference, [\n            gr.inputs.Image(type=\"filepath\"),\n            gr.Slider(0.0, 1.0, 0.5, step=0.01),\n            gr.Slider(0.0, 1.0, 0.6, step=0.01)\n        ],\n                                 gr.outputs.Image(type=\"ndarray\"),\n                                 title='pyramidbox_lite_server')\n        return interface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/processor.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; frombuffer is the drop-in replacement here\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    img = Image.fromarray(org_im[:, :, ::-1])\n    if img.mode == 'RGBA':\n        ext = '.png'\n    elif img.mode == 'RGB':\n        ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n\n\ndef clip_bbox(bbox, img_height, img_width):\n    bbox['left'] = int(max(min(bbox['left'], img_width), 0.))\n    bbox['top'] = int(max(min(bbox['top'], img_height), 0.))\n    bbox['right'] = int(max(min(bbox['right'], img_width), 0.))\n    bbox['bottom'] = int(max(min(bbox['bottom'], img_height), 0.))\n    return bbox\n\n\ndef postprocess(data_out, org_im, org_im_path, image_height, image_width, output_dir, visualization, shrink,\n                confs_threshold):\n    \"\"\"\n    Postprocess output 
of network. One image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_path (list): path of original image.\n        image_height (int): height of preprocessed image.\n        image_width (int): width of preprocessed image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        shrink (float): parameter to control the resize scale in preprocess.\n        confs_threshold (float): confidence threshold.\n\n    Returns:\n        output (dict): keys are 'data' and 'path', the corresponding values are:\n            data (list[dict]): 5 keys, where\n                'left', 'top', 'right', 'bottom' are the coordinates of the detection bounding box,\n                'confidence' is the confidence of this bbox.\n            path (str): The path of the original image.\n    \"\"\"\n    output = dict()\n    output['data'] = list()\n    output['path'] = org_im_path\n\n    for each_data in data_out:\n        # each_data is a list: [label, confidence, left, top, right, bottom]\n        if each_data[1] > confs_threshold:\n            dt_bbox = dict()\n            dt_bbox['confidence'] = float(each_data[1])\n            dt_bbox['left'] = image_width * each_data[2] / shrink\n            dt_bbox['top'] = image_height * each_data[3] / shrink\n            dt_bbox['right'] = image_width * each_data[4] / shrink\n            dt_bbox['bottom'] = image_height * each_data[5] / shrink\n            dt_bbox = clip_bbox(dt_bbox, org_im.shape[0], org_im.shape[1])\n            output['data'].append(dt_bbox)\n\n    if visualization:\n        check_dir(output_dir)\n        save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n        im_out = org_im.copy()\n        if len(output['data']) > 0:\n            for bbox in output['data']:\n                cv2.rectangle(im_out, (bbox['left'], bbox['top']), (bbox['right'], bbox['bottom']), 
(255, 255, 0), 2)\n        cv2.imwrite(save_im_path, im_out)\n\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/iFgRcqHznqg/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MXx8ZmFjZXxlbnwwfHx8fDE2NjE5ODAyMTc&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pyramidbox_lite_server\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n    
    self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.face_detection,\n            paths=['no.jpg']\n        )\n\n    def test_face_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.face_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/README.md",
    "content": "# pyramidbox_lite_server_mask\n\n|模型名称|pyramidbox_lite_server_mask|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|PyramidBox|\n|数据集|WIDER FACE数据集 + 百度自采人脸数据集|\n|是否支持Fine-tuning|否|\n|模型大小|1.2MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131603304-690a2e3b-9f24-42f6-9297-a12ada884191.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - PyramidBox-Lite是基于2018年百度发表于计算机视觉顶级会议ECCV 2018的论文PyramidBox而研发的轻量级模型，模型基于主干网络FaceBoxes，对于光照、口罩遮挡、表情变化、尺度变化等常见问题具有很强的鲁棒性。该PaddleHub Module基于WIDER FACE数据集和百度自采人脸数据集进行训练，支持预测，可用于检测人脸是否佩戴口罩。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pyramidbox_lite_server_mask\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pyramidbox_lite_server_mask --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    mask_detector = hub.Module(name=\"pyramidbox_lite_server_mask\")\n    result = mask_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = mask_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(face_detector_module=None)\n    ```\n\n    - **参数**\n\n      - face\\_detector\\_module (class): 人脸检测模型，默认为 pyramidbox\\_lite\\_server。\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n       
                batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - 识别输入图片中的所有的人脸，并判断有无戴口罩。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - use\\_multi\\_scale (bool) : 用于设置是否开启多尺度的人脸检测，开启多尺度人脸检测能够更好的检测到输入图像中不同尺寸的人脸，但是会增加模型计算量，降低预测速度；<br/>\n      - shrink (float): 用于设置图片的缩放比例，该值越大，则对于输入图片中的小尺寸人脸有更好的检测效果（模型计算成本越高），反之则对于大尺寸人脸有更好的检测效果；<br/>\n      - confs\\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - label (str): 识别标签，为 'NO MASK' 或者 'MASK'；\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n\n  - ```python\n    def set_face_detector_module(face_detector_module)\n    ```\n    - 设置口罩检测模型中进行人脸检测的底座模型。\n    - **参数**\n\n      - face\\_detector\\_module (class): 人脸检测模型。\n\n  - ```python\n    def get_face_detector_module()\n    ```\n    - 获取口罩检测模型中进行人脸检测的底座模型。\n    - **返回**\n\n      - 当前模型使用的人脸检测模型\n\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线口罩检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m 
pyramidbox_lite_server_mask\n    ```\n\n  - 这样就完成了一个口罩检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_server_mask\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### gradio app 支持\n\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/pyramidbox_lite_server_mask 在浏览器中访问 pyramidbox_lite_server_mask 的 Gradio APP。\n\n## 五、Paddle Lite部署\n- ### 通过python执行以下代码，保存模型\n  - ```python\n    import paddlehub as hub\n    pyramidbox_lite_server_mask = hub.Module(name=\"pyramidbox_lite_server_mask\")\n\n    # 将模型保存在test_program文件夹之中\n    pyramidbox_lite_server_mask.save_inference_model(dirname=\"test_program\")\n    ```\n\n- ### 进行模型转换\n  - 从paddlehub下载的是预测模型，可以使用PaddleLite提供的模型优化工具OPT对预测模型进行转换，转换之后即可部署到手机等端侧硬件上，具体请参考[OPT工具](https://paddle-lite.readthedocs.io/zh/latest/user_guides/model_optimize_tool.html)\n\n- ### 模型通过Paddle Lite进行部署\n  - 参考[Paddle-Lite口罩检测模型部署教程](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/cxx)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.3.2\n\n  移除 fluid api\n\n* 1.4.0\n\n  修复无法导出推理模型的问题\n\n* 1.5.0\n\n  添加 Gradio APP 的支持\n\n  - ```shell\n    $ hub install pyramidbox_lite_server_mask==1.5.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/README_en.md",
    "content": "# pyramidbox_lite_server_mask\n\n|Module Name|pyramidbox_lite_server_mask|\n| :--- | :---: |\n|Category|face detection|\n|Network|PyramidBox|\n|Dataset|WIDER FACEDataset + Baidu Face Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|1.2MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131603304-690a2e3b-9f24-42f6-9297-a12ada884191.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - PyramidBox-Lite is a light-weight model based  on PyramidBox proposed by Baidu in ECCV 2018. This model has solid robustness against interferences such as light and scale variation. This module is based on PyramidBox, trained on WIDER FACE Dataset and Baidu Face Dataset, and can be used for mask detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install pyramidbox_lite_server_mask\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run pyramidbox_lite_server_mask --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    mask_detector = 
hub.Module(name=\"pyramidbox_lite_server_mask\")\n    result = mask_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = mask_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6)\n    ```\n\n    - Detect all faces in image, and judge the existence of mask.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n      - use\\_multi\\_scale (bool) : whether to detect across multiple scales;\n      - shrink (float): the scale to resize image\n      - confs\\_threshold (float): the confidence threshold\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - label (str): 'NO MASK' or 'MASK'；\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom 
(int): the lower right corner y coordinate of the detection box\n\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m pyramidbox_lite_server_mask\n    ```\n\n  - The serving API is now deployed, and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pyramidbox_lite_server_mask\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP support\n  Starting with PaddleHub 2.3.1, the Gradio APP for pyramidbox_lite_server_mask can be accessed in the browser using the link http://127.0.0.1:8866/gradio/pyramidbox_lite_server_mask.\n\n## V.Paddle Lite Deployment\n- ### Save model demo\n  - ```python\n    import paddlehub as hub\n    pyramidbox_lite_server_mask = hub.Module(name=\"pyramidbox_lite_server_mask\")\n\n    # save model in directory named test_program\n    pyramidbox_lite_server_mask.save_inference_model(dirname=\"test_program\")\n    ```\n\n\n- ### transform model\n\n  - The model downloaded from paddlehub is a prediction model. To deploy it on mobile devices, use the OPT tool provided by Paddle Lite to transform the model. For more information, please refer to [OPT tool](https://paddle-lite.readthedocs.io/zh/latest/user_guides/model_optimize_tool.html)\n\n- ### Deploy the model with Paddle Lite\n  - Please refer to [Paddle-Lite mask detection model deployment demo](https://github.com/PaddlePaddle/Paddle-Lite/tree/develop/lite/demo/cxx)\n\n## VI.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.3.2\n\n  Remove fluid api\n\n* 1.4.0\n\n  Fix a bug of save_inference_model\n\n* 1.5.0\n\n  Add Gradio APP support.\n\n  - ```shell\n    $ hub install pyramidbox_lite_server_mask==1.5.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/data_feed.py",
    "content": "# coding=utf-8\nimport math\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\nmulti_scales = [0.3, 0.6, 0.9]\n\n\ndef bbox_vote(det):\n    order = det[:, 4].ravel().argsort()[::-1]\n    det = det[order, :]\n    if det.shape[0] == 0:\n        dets = np.array([[10, 10, 20, 20, 0.002]])\n        det = np.empty(shape=[0, 5])\n    while det.shape[0] > 0:\n        # IOU\n        area = (det[:, 2] - det[:, 0] + 1) * (det[:, 3] - det[:, 1] + 1)\n        xx1 = np.maximum(det[0, 0], det[:, 0])\n        yy1 = np.maximum(det[0, 1], det[:, 1])\n        xx2 = np.minimum(det[0, 2], det[:, 2])\n        yy2 = np.minimum(det[0, 3], det[:, 3])\n        w = np.maximum(0.0, xx2 - xx1 + 1)\n        h = np.maximum(0.0, yy2 - yy1 + 1)\n        inter = w * h\n        o = inter / (area[0] + area[:] - inter)\n        # nms\n        merge_index = np.where(o >= 0.3)[0]\n        det_accu = det[merge_index, :]\n        det = np.delete(det, merge_index, 0)\n        if merge_index.shape[0] <= 1:\n            if det.shape[0] == 0:\n                try:\n                    dets = np.row_stack((dets, det_accu))\n                except:\n                    dets = det_accu\n            continue\n        det_accu[:, 0:4] = det_accu[:, 0:4] * np.tile(det_accu[:, -1:], (1, 4))\n        max_score = np.max(det_accu[:, 4])\n        det_accu_sum = np.zeros((1, 5))\n        det_accu_sum[:, 0:4] = np.sum(det_accu[:, 0:4], axis=0) / np.sum(det_accu[:, -1:])\n        det_accu_sum[:, 4] = max_score\n        try:\n            dets = np.row_stack((dets, det_accu_sum))\n        except:\n            dets = det_accu_sum\n    dets = dets[0:750, :]\n    return dets\n\n\ndef crop(image, pts, shift=0, scale=1.5, rotate=0, res_width=128, res_height=128):\n    res = (res_width, res_height)\n    idx1 = 0\n    idx2 = 1\n    # angle\n    alpha = 0\n    if pts[idx2, 0] != -1 and pts[idx2, 1] != -1 and pts[idx1, 0] != -1 and pts[idx1, 1] != 
-1:\n        alpha = math.atan2(pts[idx2, 1] - pts[idx1, 1], pts[idx2, 0] - pts[idx1, 0]) * 180 / math.pi\n    pts[pts == -1] = np.inf\n    coord_min = np.min(pts, 0)\n    pts[pts == np.inf] = -1\n    coord_max = np.max(pts, 0)\n    # coordinates of center point\n    c = np.array([coord_max[0] - (coord_max[0] - coord_min[0]) / 2,\n                  coord_max[1] - (coord_max[1] - coord_min[1]) / 2])  # center\n    max_wh = max((coord_max[0] - coord_min[0]) / 2, (coord_max[1] - coord_min[1]) / 2)\n    # shift the center point and add the eye angle to the rotation\n    c = c + shift * max_wh\n    rotate = rotate + alpha\n    M = cv2.getRotationMatrix2D((c[0], c[1]), rotate, res[0] / (2 * max_wh * scale))\n    M[0, 2] = M[0, 2] - (c[0] - res[0] / 2.0)\n    M[1, 2] = M[1, 2] - (c[1] - res[0] / 2.0)\n    image_out = cv2.warpAffine(image, M, res)\n    return image_out, M\n\n\ndef color_normalize(image, mean, std=None):\n    if image.shape[-1] == 1:\n        # replicate the single channel to get a 3-channel image\n        image = np.repeat(image, 3, axis=2)\n    h, w, c = image.shape\n    image = np.transpose(image, (2, 0, 1))\n    image = np.subtract(image.reshape(c, -1), mean[:, np.newaxis]).reshape(-1, h, w)\n    image = np.transpose(image, (1, 2, 0))\n    return image\n\n\ndef process_image(org_im, face):\n    pts = np.array([\n        face['left'], face['top'], face['right'], face['top'], face['left'], face['bottom'], face['right'],\n        face['bottom']\n    ]).reshape(4, 2).astype(np.float32)\n    image_in, M = crop(org_im, pts)\n    image_in = image_in / 256.0\n    image_in = color_normalize(image_in, mean=np.array([0.5, 0.5, 0.5]))\n    image_in = image_in.astype(np.float32).transpose([2, 0, 1]).reshape(-1, 3, 128, 128)\n    return image_in\n\n\ndef reader(face_detector, shrink, confs_threshold, images, paths, use_gpu, use_multi_scale):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        face_detector (class): class to detect faces.\n        shrink (float): parameter to control the resize scale in face_detector.\n        
confs_threshold (float): confidence threshold of face_detector.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n        paths (list[str]): paths to images.\n        use_gpu (bool): whether to use gpu in face_detector.\n        use_multi_scale (bool): whether to enable multi-scale face detection.\n    Yield:\n        element (collections.OrderedDict): info of original image, preprocessed image, contains 3 keys:\n            org_im (numpy.ndarray) : original image.\n            org_im_path (str): path to original image.\n            preprocessed (list[OrderedDict]):each element contains 2 keys:\n               face  (dict): face detected in the original image.\n               image (numpy.ndarray): data to be fed into neural network.\n    \"\"\"\n    component = list()\n    if paths is not None:\n        assert type(paths) is list, \"paths should be a list.\"\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            component.append(each)\n\n    for element in component:\n        if use_multi_scale:\n            scale_res = list()\n            detect_faces = list()\n            for scale in multi_scales:\n                _detect_res = face_detector.face_detection(images=[element['org_im']],\n                                                           use_gpu=use_gpu,\n                                                           visualization=False,\n                                   
                        shrink=scale,\n                                                           confs_threshold=confs_threshold)\n\n                _s = list()\n                for _face in _detect_res[0]['data']:\n                    _face_list = [_face['left'], _face['top'], _face['right'], _face['bottom'], _face['confidence']]\n                    _s.append(_face_list)\n\n                if _s:\n                    scale_res.append(np.array(_s))\n            if scale_res:\n                scale_res = np.row_stack(scale_res)\n                scale_res = bbox_vote(scale_res)\n                keep_index = np.where(scale_res[:, 4] >= confs_threshold)[0]\n                scale_res = scale_res[keep_index, :]\n                for data in scale_res:\n                    face = {'left': data[0], 'top': data[1], 'right': data[2], 'bottom': data[3], 'confidence': data[4]}\n                    detect_faces.append(face)\n            else:\n                detect_faces = []\n        else:\n            _detect_res = face_detector.face_detection(images=[element['org_im']],\n                                                       use_gpu=use_gpu,\n                                                       visualization=False,\n                                                       shrink=shrink,\n                                                       confs_threshold=confs_threshold)\n            detect_faces = _detect_res[0]['data']\n\n        element['preprocessed'] = list()\n        for face in detect_faces:\n            handled = OrderedDict()\n            handled['face'] = face\n            handled['image'] = process_image(element['org_im'], face)\n            element['preprocessed'].append(handled)\n\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/label_list.txt",
    "content": "NO MASK\nMASK\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nimport paddlehub as hub\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"pyramidbox_lite_server_mask\",\n    type=\"CV/face_detection\",\n    author=\"baidu-vis\",\n    author_email=\"\",\n    summary=\n    \"PyramidBox-Lite-Server-Mask is a high-performance face detection model used to detect whether people wear masks.\",\n    version=\"1.4.0\")\nclass PyramidBoxLiteServerMask:\n\n    def __init__(self, face_detector_module=None):\n        \"\"\"\n        Args:\n            face_detector_module (class): module to detect face.\n        \"\"\"\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pyramidbox_lite_server_mask_model\", \"model\")\n        if face_detector_module is None:\n            self.face_detector = hub.Module(name='pyramidbox_lite_server')\n        else:\n            self.face_detector = face_detector_module\n        self._set_config()\n        self.processor = self\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = 
False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def set_face_detector_module(self, face_detector_module):\n        \"\"\"\n        Set face detector.\n        Args:\n            face_detector_module (class): module to detect faces.\n        \"\"\"\n        self.face_detector = face_detector_module\n\n    def get_face_detector_module(self):\n        return self.face_detector\n\n    def face_detection(self,\n                       images=None,\n                       paths=None,\n                       data=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='detection_result',\n                       use_multi_scale=False,\n                       shrink=0.5,\n                       confs_threshold=0.6):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space must be BGR.\n            paths (list[str]): The paths of images.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n            use_multi_scale (bool): whether to enable multi-scale face detection. 
Enabling multi-scale face detection\n                can improve the detection accuracy, but it also increases\n                the model computation and therefore reduces the prediction speed.\n            shrink (float): parameter to control the resize scale in preprocess.\n            confs_threshold (float): confidence threshold.\n\n        Returns:\n            res (list[dict]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        # compatibility with older versions\n        if data:\n            if 'image' in data:\n                if paths is None:\n                    paths = list()\n                paths += data['image']\n            elif 'data' in data:\n                if images is None:\n                    images = list()\n                images += data['data']\n\n        # get all data\n        all_element = list()\n        for yield_data in reader(self.face_detector, shrink, confs_threshold, images, paths, use_gpu, use_multi_scale):\n            all_element.append(yield_data)\n\n        image_list = list()\n        element_image_num = list()\n        for i in range(len(all_element)):\n            element_image = [handled['image'] for handled in all_element[i]['preprocessed']]\n            element_image_num.append(len(element_image))\n            image_list.extend(element_image)\n\n        total_num = len(image_list)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        predict_out = np.zeros((1, 2))\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for element_id in range(batch_size):\n                try:\n                    batch_data.append(image_list[handle_id + element_id])\n                except IndexError:\n                    # the last batch may contain fewer than batch_size images\n                    pass\n\n            image_arr = np.squeeze(np.array(batch_data), axis=1)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(image_arr)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output_data = output_handle.copy_to_cpu()\n\n            predict_out = np.concatenate((predict_out, output_data))\n\n        predict_out = predict_out[1:]\n        # postprocess one by one\n        res = list()\n        for i in range(len(all_element)):\n            detect_faces_list = [handled['face'] for handled in all_element[i]['preprocessed']]\n            interval_left = sum(element_image_num[0:i])\n            interval_right = interval_left + element_image_num[i]\n            out = postprocess(confidence_out=predict_out[interval_left:interval_right],\n                              org_im=all_element[i]['org_im'],\n                              org_im_path=all_element[i]['org_im_path'],\n                              detected_faces=detect_faces_list,\n                              output_dir=output_dir,\n                              visualization=visualization)\n            res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = 
argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization,\n                                      shrink=args.shrink,\n                                      confs_threshold=args.confs_threshold)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n             
                              help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--shrink',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"resize the image to `shrink * original_shape` before feeding into network.\")\n        self.arg_input_group.add_argument('--confs_threshold',\n                                          type=ast.literal_eval,\n                                          default=0.6,\n                                          help=\"confidence threshold.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, shrink, confs_threshold):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.face_detection(paths=[image],\n                                    use_gpu=False,\n                                    visualization=True,\n                                    output_dir=temp_dir,\n                                    shrink=shrink,\n                                    confs_threshold=confs_threshold)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(inference, [\n            gr.inputs.Image(type=\"filepath\"),\n            gr.Slider(0.0, 1.0, 0.5, step=0.01),\n            gr.Slider(0.0, 1.0, 0.6, step=0.01)\n        ],\n                                 gr.outputs.Image(type=\"ndarray\"),\n                                 title='pyramidbox_lite_server_mask')\n        return interface\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\nlabel_list = ['NO MASK', 'MASK']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    img = Image.fromarray(org_im[:, :, ::-1])\n    if img.mode == 'RGBA':\n        ext = '.png'\n    elif img.mode == 'RGB':\n        ext = '.jpg'\n    elif img.mode == 'L':  # black and white\n        ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n\n\ndef draw_bounding_box_on_image(save_im_path, output_data):\n    image = Image.open(save_im_path)\n    draw = ImageDraw.Draw(image)\n    for bbox in output_data:\n        # draw bounding box\n        if bbox['label'] == \"MASK\":\n            draw.line([(bbox['left'], bbox['top']), (bbox['left'], bbox['bottom']), (bbox['right'], bbox['bottom']),\n          
             (bbox['right'], bbox['top']), (bbox['left'], bbox['top'])],\n                      width=2,\n                      fill='green')\n        else:\n            draw.line([(bbox['left'], bbox['top']), (bbox['left'], bbox['bottom']), (bbox['right'], bbox['bottom']),\n                       (bbox['right'], bbox['top']), (bbox['left'], bbox['top'])],\n                      width=2,\n                      fill='red')\n        # draw label\n        text = bbox['label'] + \": %.2f%%\" % (100 * bbox['confidence'])\n        textsize_width, textsize_height = draw.textsize(text=text)\n        if image.mode == 'RGB' or image.mode == 'RGBA':\n            box_fill = (255, 255, 255)\n            text_fill = (0, 0, 0)\n        else:\n            box_fill = (255)\n            text_fill = (0)\n\n        draw.rectangle(xy=(bbox['left'], bbox['top'] - (textsize_height + 5), bbox['left'] + textsize_width + 10,\n                           bbox['top'] - 3),\n                       fill=box_fill)\n        draw.text(xy=(bbox['left'], bbox['top'] - 15), text=text, fill=text_fill)\n    image.save(save_im_path)\n\n\ndef postprocess(confidence_out, org_im, org_im_path, detected_faces, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network. 
one element at a time.\n\n    Args:\n        confidence_out (numpy.ndarray): confidences of each label.\n        org_im (numpy.ndarray): original image.\n        org_im_path (str): path of the original image.\n        detected_faces (list): faces detected in a picture.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        output (dict): keys are 'data' and 'path', the corresponding values are:\n            data (list[dict]): 6 keys, where\n                'label' is `MASK` or `NO MASK`,\n                'left', 'top', 'right', 'bottom' are the coordinates of the detection bounding box,\n                'confidence' is the confidence of mask detection.\n            path (str): The path of the original image.\n    \"\"\"\n    output = dict()\n    output['data'] = list()\n    output['path'] = org_im_path\n\n    for index, face in enumerate(detected_faces):\n        label_idx = np.argmax(confidence_out[index])\n        label_confidence = confidence_out[index][label_idx]\n        bbox = dict()\n        bbox['label'] = label_list[label_idx]\n        bbox['confidence'] = label_confidence\n        bbox['top'] = detected_faces[index]['top']\n        bbox['bottom'] = detected_faces[index]['bottom']\n        bbox['left'] = detected_faces[index]['left']\n        bbox['right'] = detected_faces[index]['right']\n        output['data'].append(bbox)\n\n    if visualization:\n        check_dir(output_dir)\n        save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n        cv2.imwrite(save_im_path, org_im)\n        draw_bounding_box_on_image(save_im_path, output['data'])\n\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/pyramidbox_lite_server_mask/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/iFgRcqHznqg/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MXx8ZmFjZXxlbnwwfHx8fDE2NjE5ODAyMTc&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pyramidbox_lite_server_mask\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n      
  self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(images=[cv2.imread('tests/test.jpg')], use_gpu=True, visualization=False)\n        bbox = results[0]['data'][0]\n\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'NO MASK')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 2000)\n        self.assertTrue(0 < right < 2000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(AssertionError, self.module.face_detection, paths=['no.jpg'])\n\n    def test_face_detection6(self):\n        self.assertRaises(AttributeError, self.module.face_detection, images=['test.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/face_detector.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model/face_detector.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/README.md",
    "content": "# ultra_light_fast_generic_face_detector_1mb_320\n\n|模型名称|ultra_light_fast_generic_face_detector_1mb_320|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|Ultra-Light-Fast-Generic-Face-Detector-1MB|\n|数据集|WIDER FACE数据集|\n|是否支持Fine-tuning|否|\n|模型大小|2.6MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131604811-bce29c3f-66f7-45cb-a388-d739368bfeb9.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - Ultra-Light-Fast-Generic-Face-Detector-1MB是针对边缘计算设备或低算力设备(如用ARM推理)设计的实时超轻量级通用人脸检测模型，可以在低算力设备中如用ARM进行实时的通用场景的人脸检测推理。该PaddleHub Module的预训练数据集为WIDER FACE数据集，可支持预测，在预测时会将图片输入缩放为320 * 240。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_320\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ultra_light_fast_generic_face_detector_1mb_320 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_320\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch\\_size=1,\n                       
use_gpu=False,\n                       output_dir='face_detector_640_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5)\n    ```\n\n    - 检测输入图片中的所有人脸位置。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 face\\_detector\\_640\\_predict\\_output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - confs\\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path 字段为可视化图片的保存路径（仅当visualization=True时存在）\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ultra_light_fast_generic_face_detector_1mb_320\n    ```\n\n  - 这样就完成了一个人脸检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = 
\"http://127.0.0.1:8866/predict/ultra_light_fast_generic_face_detector_1mb_320\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.3\n\n  移除 fluid api\n\n* 1.2.0\n  \n  修复无法导出推理模型的问题\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_320==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/README_en.md",
    "content": "# ultra_light_fast_generic_face_detector_1mb_320\n\n|Module Name|ultra_light_fast_generic_face_detector_1mb_320|\n| :--- | :---: |\n|Category|face detection|\n|Network|Ultra-Light-Fast-Generic-Face-Detector-1MB|\n|Dataset|WIDER FACE dataset|\n|Fine-tuning supported or not|No|\n|Module Size|2.6MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131604811-bce29c3f-66f7-45cb-a388-d739368bfeb9.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - Ultra-Light-Fast-Generic-Face-Detector-1MB is an extremely lightweight model for real-time face detection on low-compute devices. This module is based on Ultra-Light-Fast-Generic-Face-Detector-1MB, trained on the WIDER FACE dataset, and can be used for face detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_320\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run ultra_light_fast_generic_face_detector_1mb_320 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub 
as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_320\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       output_dir='face_detector_320_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5)\n    ```\n\n    - Detect all faces in an image.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list\\[str\\]): image paths;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - confs\\_threshold (float): the confidence threshold.\n\n      **NOTE:** provide the input data via either paths or images.\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str): path for saving output image\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to 
the specified path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m ultra_light_fast_generic_face_detector_1mb_320\n    ```\n\n  - The face detection service API is now deployed; the default port is 8866.\n\n  - **NOTE:** To use GPU for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ultra_light_fast_generic_face_detector_1mb_320\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.3\n\n  Remove fluid api\n\n* 1.2.0\n\n  Fix a bug of save_inference_model\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_320==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/data_feed.py",
    "content": "# coding=utf-8\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef preprocess(orig_image):\n    image = cv2.cvtColor(orig_image, cv2.COLOR_BGR2RGB)\n    image = cv2.resize(image, (320, 240))\n    image_mean = np.array([127, 127, 127])\n    image = (image - image_mean) / 128.0\n    image = np.transpose(image, [2, 0, 1])\n    return image\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['orig_im'] = im\n            each['orig_im_shape'] = im.shape  # height, width, channel\n            each['orig_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['orig_im'] = im\n            each['orig_im_path'] = None\n            each['orig_im_shape'] = im.shape  # height, width, channel\n            component.append(each)\n\n    for element in component:\n        element['image'] = preprocess(element['orig_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"ultra_light_fast_generic_face_detector_1mb_320\",\n    type=\"CV/face_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\n    \"Ultra-Light-Fast-Generic-Face-Detector-1MB is a high-performance face detection model released on https://github.com/Linzaer/Ultra-Light-Fast-Generic-Face-Detector-1MB.\",\n    version=\"1.2.0\")\nclass FaceDetector320:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          \"ultra_light_fast_generic_face_detector_1mb_320\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            
self.gpu_predictor = create_predictor(gpu_config)\n\n    def face_detection(self,\n                       images=None,\n                       paths=None,\n                       data=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       output_dir='face_detector_320_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5,\n                       iou_threshold=0.5):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], color space is BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            confs_threshold (float): threshold for confidence coefficient.\n            iou_threshold (float): threshold for iou.\n\n        Returns:\n            res (list[dict()]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data and 'image' in data:\n            if paths is None:\n                paths = []\n            paths += data['image']\n\n        # get all data\n        all_data = []\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data]).astype('float32')\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            confidences = output_handle.copy_to_cpu()\n\n            output_handle = predictor.get_output_handle(output_names[1])\n            boxes = output_handle.copy_to_cpu()\n\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(confidences=confidences[i],\n                                  boxes=boxes[i],\n                                  orig_im=batch_data[i]['orig_im'],\n                                  orig_im_shape=batch_data[i]['orig_im_shape'],\n                                  orig_im_path=batch_data[i]['orig_im_path'],\n    
                              output_dir=output_dir,\n                                  visualization=visualization,\n                                  confs_threshold=confs_threshold,\n                                  iou_threshold=iou_threshold)\n                res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='face_detector_320_predict_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport time\n\nimport base64\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef area_of(left_top, right_bottom):\n    hw = np.clip(right_bottom - left_top, 0.0, None)\n    return hw[..., 0] * hw[..., 1]\n\n\ndef iou_of(boxes0, boxes1, eps=1e-5):\n    overlap_left_top = np.maximum(boxes0[..., :2], boxes1[..., :2])\n    overlap_right_bottom = np.minimum(boxes0[..., 2:], boxes1[..., 2:])\n    overlap_area = area_of(overlap_left_top, overlap_right_bottom)\n    area0 = area_of(boxes0[..., :2], boxes0[..., 2:])\n    area1 = area_of(boxes1[..., :2], boxes1[..., 2:])\n    return overlap_area / (area0 + area1 - overlap_area + eps)\n\n\ndef hard_nms(box_scores, iou_threshold, top_k=-1, candidate_size=200):\n    scores = box_scores[:, -1]\n    boxes = box_scores[:, :-1]\n    picked = []\n    # _, indexes = scores.sort(descending=True)\n    indexes = np.argsort(scores)\n    # indexes = indexes[:candidate_size]\n    indexes = indexes[-candidate_size:]\n    while len(indexes) > 0:\n        # current = indexes[0]\n        current = indexes[-1]\n        picked.append(current)\n        if 0 < top_k == len(picked) or len(indexes) == 1:\n            break\n        current_box = boxes[current, :]\n        # indexes = indexes[1:]\n        indexes = indexes[:-1]\n        rest_boxes = boxes[indexes, :]\n        iou = iou_of(rest_boxes, np.expand_dims(current_box, axis=0))\n        indexes = indexes[iou <= iou_threshold]\n    return box_scores[picked, :]\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        
os.makedirs(dir_path)\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef postprocess(confidences,\n                boxes,\n                orig_im,\n                orig_im_shape,\n                orig_im_path,\n                output_dir,\n                visualization,\n                confs_threshold=0.5,\n                iou_threshold=0.5):\n    \"\"\"\n    Postprocess the network output, one image at a time.\n\n    Args:\n        confidences (numpy.ndarray): confidences, with shape [num, 2]\n        boxes (numpy.ndarray): box coordinates, with shape [num, 4]\n        orig_im (numpy.ndarray): original image.\n        orig_im_shape (list): shape of the original image.\n        orig_im_path (str): path of the original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n    \"\"\"\n    output = {}\n    output['data'] = []\n    if orig_im_path:\n        output['path'] = orig_im_path\n    picked_box_probs = []\n    picked_labels = []\n    for class_index in range(1, confidences.shape[1]):\n        probs = confidences[:, class_index]\n        mask = probs > confs_threshold\n        probs = probs[mask]\n        if probs.shape[0] == 0:\n            continue\n        subset_boxes = boxes[mask, :]\n        box_probs = np.concatenate([subset_boxes, probs.reshape(-1, 1)], axis=1)\n        box_probs = hard_nms(box_probs, iou_threshold=iou_threshold, top_k=-1)\n        picked_box_probs.append(box_probs)\n        picked_labels.extend([class_index] * box_probs.shape[0])\n\n    if not picked_box_probs:\n        return output\n\n    picked_box_probs = np.concatenate(picked_box_probs)\n    picked_box_probs[:, 0] *= orig_im_shape[1]\n    picked_box_probs[:, 1] *= orig_im_shape[0]\n    picked_box_probs[:, 2] *= orig_im_shape[1]\n    picked_box_probs[:, 3] *= orig_im_shape[0]\n\n    for data in picked_box_probs:\n        output['data'].append({\n       
     'left': float(data[0]),\n            'right': float(data[2]),\n            'top': float(data[1]),\n            'bottom': float(data[3]),\n            'confidence': float(data[4])\n        })\n\n    picked_box_probs = picked_box_probs[:, :4].astype(np.int32)\n    if visualization:\n        for i in range(picked_box_probs.shape[0]):\n            box = picked_box_probs[i]\n            cv2.rectangle(orig_im, (box[0], box[1]), (box[2], box[3]), (255, 255, 0), 2)\n        check_dir(output_dir)\n        ext = os.path.splitext(orig_im_path)[1] if orig_im_path else ''\n        ext = ext if ext else get_image_ext(orig_im)\n        orig_im_path = orig_im_path if orig_im_path else 'ndarray_{}{}'.format(time.time(), ext)\n        im_name = os.path.basename(orig_im_path)\n        im_save_path = os.path.join(output_dir, im_name)\n        output['save_path'] = im_save_path\n        cv2.imwrite(im_save_path, orig_im)\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_320/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_320\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('face_detector_320_predict_output')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        
self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.face_detection,\n            paths=['no.jpg']\n        )\n\n    def test_face_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.face_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/README.md",
    "content": "# ultra_light_fast_generic_face_detector_1mb_640\n\n|模型名称|ultra_light_fast_generic_face_detector_1mb_640|\n| :--- | :---: |\n|类别|图像 - 人脸检测|\n|网络|Ultra-Light-Fast-Generic-Face-Detector-1MB|\n|数据集|WIDER FACE数据集|\n|是否支持Fine-tuning|否|\n|模型大小|2.9MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131604811-bce29c3f-66f7-45cb-a388-d739368bfeb9.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### 模型介绍\n\n  - Ultra-Light-Fast-Generic-Face-Detector-1MB是针对边缘计算设备或低算力设备(如用ARM推理)设计的实时超轻量级通用人脸检测模型。该PaddleHub Module的预训练数据集为WIDER FACE数据集，可支持预测，在预测时会将输入图片缩放为640 * 480。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_640\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ultra_light_fast_generic_face_detector_1mb_640 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现人脸检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch_size=1,\n                       
use_gpu=False,\n                       output_dir='face_detector_640_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5)\n    ```\n\n    - 检测输入图片中的所有人脸位置。\n\n    - **参数**\n\n      - images (list\[numpy.ndarray\]): 图片数据，ndarray.shape 为 \[H, W, C\]，BGR格式；<br/>\n      - paths (list\[str\]): 图片的路径；<br/>\n      - batch\_size (int): batch 的大小；<br/>\n      - use\_gpu (bool): 是否使用 GPU；<br/>\n      - output\_dir (str): 图片的保存路径，默认设为 face\_detector\_640\_predict\_output；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件；<br/>\n      - confs\_threshold (float): 置信度的阈值。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\[dict\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - path (str): 原输入图片的路径\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\_path 字段为可视化图片的保存路径（仅当visualization=True时存在）\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ultra_light_fast_generic_face_detector_1mb_640\n    ```\n\n  - 这样就完成了一个人脸检测服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = 
\"http://127.0.0.1:8866/predict/ultra_light_fast_generic_face_detector_1mb_640\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.3\n\n  移除 fluid api\n\n* 1.2.0\n  \n  修复无法导出推理模型的问题\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_640==1.2.0\n    ```\n\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/README_en.md",
    "content": "# ultra_light_fast_generic_face_detector_1mb_640\n\n|Module Name|ultra_light_fast_generic_face_detector_1mb_640|\n| :--- | :---: |\n|Category|face detection|\n|Network|Ultra-Light-Fast-Generic-Face-Detector-1MB|\n|Dataset|WIDER FACE Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|2.9MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/131604811-bce29c3f-66f7-45cb-a388-d739368bfeb9.jpg\"   width='50%' hspace='10'/>\n    <br />\n    </p>\n\n- ### Module Introduction\n\n  - Ultra-Light-Fast-Generic-Face-Detector-1MB is an extremely lightweight model for real-time face detection on low computing power devices. This module is based on Ultra-Light-Fast-Generic-Face-Detector-1MB, trained on the WIDER FACE dataset, and can be used for face detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_640\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run ultra_light_fast_generic_face_detector_1mb_640 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub 
as hub\n    import cv2\n\n    face_detector = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n    result = face_detector.face_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_detector.face_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_detection(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       output_dir='face_detector_640_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5)\n    ```\n\n    - Detect all faces in the input image\n\n    - **Parameters**\n\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - visualization (bool): Whether to save the results as picture files;\n      - confs\_threshold (float): the confidence threshold\n\n      **NOTE:** provide the input data via either paths or images\n\n    - **Return**\n\n      - res (list\[dict\]): results\n        - path (str): path for input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\_path (str): path for saving output image\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to 
the specified path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of face detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m ultra_light_fast_generic_face_detector_1mb_640\n    ```\n\n  - The service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ultra_light_fast_generic_face_detector_1mb_640\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.3\n\n  Remove fluid api\n\n* 1.2.0\n\n  Fix a bug of save_inference_model\n\n  - ```shell\n    $ hub install ultra_light_fast_generic_face_detector_1mb_640==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/data_feed.py",
    "content": "# coding=utf-8\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef preprocess(orig_image):\n    image = cv2.cvtColor(orig_image, cv2.COLOR_BGR2RGB)\n    image = cv2.resize(image, (640, 480))\n    image_mean = np.array([127, 127, 127])\n    image = (image - image_mean) / 128.0\n    image = np.transpose(image, [2, 0, 1])\n    return image\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['orig_im'] = im\n            each['orig_im_shape'] = im.shape  # height, width, channel\n            each['orig_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['orig_im'] = im\n            each['orig_im_path'] = None\n            each['orig_im_shape'] = im.shape  # height, width, channel\n            component.append(each)\n\n    for element in component:\n        element['image'] = preprocess(element['orig_im'])\n        yield element\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"ultra_light_fast_generic_face_detector_1mb_640\",\n    type=\"CV/face_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\n    \"Ultra-Light-Fast-Generic-Face-Detector-1MB is a high-performance face detection model released at https://github.com/Linzaer/Ultra-Light-Fast-Generic-Face-Detector-1MB.\",\n    version=\"1.2.0\")\nclass FaceDetector640:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory,\n                                                          \"ultra_light_fast_generic_face_detector_1mb_640\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n        
    self.gpu_predictor = create_predictor(gpu_config)\n\n    def face_detection(self,\n                       images=None,\n                       paths=None,\n                       data=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       output_dir='face_detector_640_predict_output',\n                       visualization=False,\n                       confs_threshold=0.5,\n                       iou_threshold=0.5):\n        \"\"\"\n        API for face detection.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            confs_threshold (float): threshold for confidence coefficient.\n            iou_threshold (float): threshold for iou.\n        Returns:\n            res (list[dict()]): The result of face detection and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data and 'image' in data:\n            if paths is None:\n                paths = []\n            paths += data['image']\n\n        # get all data\n        all_data = []\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data]).astype('float32')\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            confidences = output_handle.copy_to_cpu()\n\n            output_handle = predictor.get_output_handle(output_names[1])\n            boxes = output_handle.copy_to_cpu()\n\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(confidences=confidences[i],\n                                  boxes=boxes[i],\n                                  orig_im=batch_data[i]['orig_im'],\n                                  orig_im_shape=batch_data[i]['orig_im_shape'],\n                                  orig_im_path=batch_data[i]['orig_im_path'],\n    
                              output_dir=output_dir,\n                                  visualization=visualization,\n                                  confs_threshold=confs_threshold,\n                                  iou_threshold=iou_threshold)\n                res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.face_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.face_detection(paths=[args.input_path],\n                                      batch_size=args.batch_size,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='face_detector_640_predict_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport time\n\nimport base64\nimport cv2\nimport numpy as np\n\n__all__ = ['postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef area_of(left_top, right_bottom):\n    hw = np.clip(right_bottom - left_top, 0.0, None)\n    return hw[..., 0] * hw[..., 1]\n\n\ndef iou_of(boxes0, boxes1, eps=1e-5):\n    overlap_left_top = np.maximum(boxes0[..., :2], boxes1[..., :2])\n    overlap_right_bottom = np.minimum(boxes0[..., 2:], boxes1[..., 2:])\n    overlap_area = area_of(overlap_left_top, overlap_right_bottom)\n    area0 = area_of(boxes0[..., :2], boxes0[..., 2:])\n    area1 = area_of(boxes1[..., :2], boxes1[..., 2:])\n    return overlap_area / (area0 + area1 - overlap_area + eps)\n\n\ndef hard_nms(box_scores, iou_threshold, top_k=-1, candidate_size=200):\n    scores = box_scores[:, -1]\n    boxes = box_scores[:, :-1]\n    picked = []\n    # _, indexes = scores.sort(descending=True)\n    indexes = np.argsort(scores)\n    # indexes = indexes[:candidate_size]\n    indexes = indexes[-candidate_size:]\n    while len(indexes) > 0:\n        # current = indexes[0]\n        current = indexes[-1]\n        picked.append(current)\n        if 0 < top_k == len(picked) or len(indexes) == 1:\n            break\n        current_box = boxes[current, :]\n        # indexes = indexes[1:]\n        indexes = indexes[:-1]\n        rest_boxes = boxes[indexes, :]\n        iou = iou_of(rest_boxes, np.expand_dims(current_box, axis=0))\n        indexes = indexes[iou <= iou_threshold]\n    return box_scores[picked, :]\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        
os.makedirs(dir_path)\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef postprocess(confidences,\n                boxes,\n                orig_im,\n                orig_im_shape,\n                orig_im_path,\n                output_dir,\n                visualization,\n                confs_threshold=0.5,\n                iou_threshold=0.5):\n    \"\"\"\n    Postprocess the network output, one image at a time.\n\n    Args:\n        confidences (numpy.ndarray): confidences, with shape [num, 2]\n        boxes (numpy.ndarray): box coordinates, with shape [num, 4]\n        orig_im (numpy.ndarray): original image.\n        orig_im_shape (list): shape of the original image.\n        orig_im_path (str): path of the original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n    \"\"\"\n    output = {}\n    output['data'] = []\n    if orig_im_path:\n        output['path'] = orig_im_path\n    picked_box_probs = []\n    picked_labels = []\n    for class_index in range(1, confidences.shape[1]):\n        probs = confidences[:, class_index]\n        mask = probs > confs_threshold\n        probs = probs[mask]\n        if probs.shape[0] == 0:\n            continue\n        subset_boxes = boxes[mask, :]\n        box_probs = np.concatenate([subset_boxes, probs.reshape(-1, 1)], axis=1)\n        box_probs = hard_nms(box_probs, iou_threshold=iou_threshold, top_k=-1)\n        picked_box_probs.append(box_probs)\n        picked_labels.extend([class_index] * box_probs.shape[0])\n\n    if not picked_box_probs:\n        return output\n\n    picked_box_probs = np.concatenate(picked_box_probs)\n    picked_box_probs[:, 0] *= orig_im_shape[1]\n    picked_box_probs[:, 1] *= orig_im_shape[0]\n    picked_box_probs[:, 2] *= orig_im_shape[1]\n    picked_box_probs[:, 3] *= orig_im_shape[0]\n\n    for data in picked_box_probs:\n        output['data'].append({\n       
     'left': float(data[0]),\n            'right': float(data[2]),\n            'top': float(data[1]),\n            'bottom': float(data[3]),\n            'confidence': float(data[4])\n        })\n\n    picked_box_probs = picked_box_probs[:, :4].astype(np.int32)\n    if visualization:\n        for i in range(picked_box_probs.shape[0]):\n            box = picked_box_probs[i]\n            cv2.rectangle(orig_im, (box[0], box[1]), (box[2], box[3]), (255, 255, 0), 2)\n        check_dir(output_dir)\n        ext = os.path.splitext(orig_im_path) if orig_im_path else ''\n        ext = ext if ext else get_image_ext(orig_im)\n        orig_im_path = orig_im_path if orig_im_path else 'ndarray_{}{}'.format(time.time(), ext)\n        im_name = os.path.basename(orig_im_path)\n        im_save_path = os.path.join(output_dir, im_name)\n        output['save_path'] = im_save_path\n        cv2.imwrite(im_save_path, orig_im)\n    return output\n"
  },
  {
    "path": "modules/image/face_detection/ultra_light_fast_generic_face_detector_1mb_640/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('face_detector_640_predict_output')\n\n    def test_face_detection1(self):\n        results = self.module.face_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection2(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        
self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection3(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection4(self):\n        results = self.module.face_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n        \n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(1000 < left < 4000)\n        self.assertTrue(1000 < right < 4000)\n        self.assertTrue(0 < top < 2000)\n        self.assertTrue(0 < bottom < 2000)\n\n    def test_face_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.face_detection,\n            paths=['no.jpg']\n        )\n\n    def test_face_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.face_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        
self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/image_processing/enlightengan/README.md",
    "content": "# enlightengan\n\n|模型名称|enlightengan|\n| :--- | :---: |\n|类别|图像 - 暗光增强|\n|网络|EnlightenGAN|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|83MB|\n|最新更新日期|2021-11-04|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/142827116-76d713c6-94d9-410d-830a-65135cd856b8.jpeg\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/142827262-97317323-f6bd-4aa4-b7ac-c69436c4d576.png\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - EnlightenGAN使用非成对的数据进行训练，通过设计自特征保留损失函数和自约束注意力机制，训练的网络可以应用到多种场景下的暗光增强中。\n\n  - 更多详情参考：[EnlightenGAN: Deep Light Enhancement without Paired Supervision](https://arxiv.org/abs/1906.06972)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - onnxruntime\n  - x2paddle\n  - pillow\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install enlightengan\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a file\n    $ hub run enlightengan --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现暗光增强模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    enlightener = hub.Module(name=\"enlightengan\")\n    input_path = [\"/PATH/TO/IMAGE\"]\n    # Read from a file\n    enlightener.enlightening(paths=input_path, output_dir='./enlightening_result/', use_gpu=True)  \n    ```\n\n- ### 3、API\n\n  - ```python\n    def enlightening(images=None, paths=None, output_dir='./enlightening_result/', use_gpu=False, visualization=True)\n    ```\n    - 暗光增强API。\n\n    - **参数**\n\n      - images 
(list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n      - paths (list\\[str\\]): 图片的路径；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线暗光增强服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m enlightengan\n    ```\n\n  - 这样就完成了一个暗光增强的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/enlightengan\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install enlightengan==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/image_processing/enlightengan/enlighten_inference/pd_model/x2paddle_code.py",
    "content": "import paddle\nimport math\n\n\nclass ONNXModel(paddle.nn.Layer):\n    def __init__(self):\n        super(ONNXModel, self).__init__()\n        self.conv0 = paddle.nn.Conv2D(in_channels=3, out_channels=3, kernel_size=[1, 1], groups=3)\n        self.pool0 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.pool1 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.conv1 = paddle.nn.Conv2D(in_channels=4, out_channels=32, kernel_size=[3, 3], padding=1)\n        self.pool2 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.leakyrelu0 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.pool3 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.batchnorm0 = paddle.nn.BatchNorm(\n            num_channels=32, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv2 = paddle.nn.Conv2D(in_channels=32, out_channels=32, kernel_size=[3, 3], padding=1)\n        self.leakyrelu1 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm1 = paddle.nn.BatchNorm(\n            num_channels=32, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.pool4 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.conv3 = paddle.nn.Conv2D(in_channels=32, out_channels=64, kernel_size=[3, 3], padding=1)\n        self.leakyrelu2 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm2 = paddle.nn.BatchNorm(\n            num_channels=64, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv4 = paddle.nn.Conv2D(in_channels=64, out_channels=64, kernel_size=[3, 3], padding=1)\n        self.leakyrelu3 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm3 = paddle.nn.BatchNorm(\n            num_channels=64, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.pool5 = 
paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.conv5 = paddle.nn.Conv2D(in_channels=64, out_channels=128, kernel_size=[3, 3], padding=1)\n        self.leakyrelu4 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm4 = paddle.nn.BatchNorm(\n            num_channels=128, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv6 = paddle.nn.Conv2D(in_channels=128, out_channels=128, kernel_size=[3, 3], padding=1)\n        self.leakyrelu5 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm5 = paddle.nn.BatchNorm(\n            num_channels=128, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.pool6 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.conv7 = paddle.nn.Conv2D(in_channels=128, out_channels=256, kernel_size=[3, 3], padding=1)\n        self.leakyrelu6 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm6 = paddle.nn.BatchNorm(\n            num_channels=256, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv8 = paddle.nn.Conv2D(in_channels=256, out_channels=256, kernel_size=[3, 3], padding=1)\n        self.leakyrelu7 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm7 = paddle.nn.BatchNorm(\n            num_channels=256, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.pool7 = paddle.nn.MaxPool2D(kernel_size=[2, 2], stride=2)\n        self.conv9 = paddle.nn.Conv2D(in_channels=256, out_channels=512, kernel_size=[3, 3], padding=1)\n        self.leakyrelu8 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm8 = paddle.nn.BatchNorm(\n            num_channels=512, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv10 = paddle.nn.Conv2D(in_channels=512, out_channels=512, kernel_size=[3, 3], 
padding=1)\n        self.leakyrelu9 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm9 = paddle.nn.BatchNorm(\n            num_channels=512, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv11 = paddle.nn.Conv2D(in_channels=512, out_channels=256, kernel_size=[3, 3], padding=1)\n        self.conv12 = paddle.nn.Conv2D(in_channels=512, out_channels=256, kernel_size=[3, 3], padding=1)\n        self.leakyrelu10 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm10 = paddle.nn.BatchNorm(\n            num_channels=256, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv13 = paddle.nn.Conv2D(in_channels=256, out_channels=256, kernel_size=[3, 3], padding=1)\n        self.leakyrelu11 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm11 = paddle.nn.BatchNorm(\n            num_channels=256, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv14 = paddle.nn.Conv2D(in_channels=256, out_channels=128, kernel_size=[3, 3], padding=1)\n        self.conv15 = paddle.nn.Conv2D(in_channels=256, out_channels=128, kernel_size=[3, 3], padding=1)\n        self.leakyrelu12 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm12 = paddle.nn.BatchNorm(\n            num_channels=128, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv16 = paddle.nn.Conv2D(in_channels=128, out_channels=128, kernel_size=[3, 3], padding=1)\n        self.leakyrelu13 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm13 = paddle.nn.BatchNorm(\n            num_channels=128, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv17 = paddle.nn.Conv2D(in_channels=128, out_channels=64, kernel_size=[3, 3], padding=1)\n        self.conv18 = paddle.nn.Conv2D(in_channels=128, 
out_channels=64, kernel_size=[3, 3], padding=1)\n        self.leakyrelu14 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm14 = paddle.nn.BatchNorm(\n            num_channels=64, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv19 = paddle.nn.Conv2D(in_channels=64, out_channels=64, kernel_size=[3, 3], padding=1)\n        self.leakyrelu15 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm15 = paddle.nn.BatchNorm(\n            num_channels=64, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv20 = paddle.nn.Conv2D(in_channels=64, out_channels=32, kernel_size=[3, 3], padding=1)\n        self.conv21 = paddle.nn.Conv2D(in_channels=64, out_channels=32, kernel_size=[3, 3], padding=1)\n        self.leakyrelu16 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.batchnorm16 = paddle.nn.BatchNorm(\n            num_channels=32, momentum=0.8999999761581421, epsilon=9.999999747378752e-06, is_test=True)\n        self.conv22 = paddle.nn.Conv2D(in_channels=32, out_channels=32, kernel_size=[3, 3], padding=1)\n        self.leakyrelu17 = paddle.nn.LeakyReLU(negative_slope=0.20000000298023224)\n        self.conv23 = paddle.nn.Conv2D(in_channels=32, out_channels=3, kernel_size=[1, 1])\n\n    def forward(self, x2paddle_input):\n        x2paddle_137 = paddle.full(dtype='float32', shape=[1], fill_value=1.0)\n        x2paddle_145 = paddle.full(dtype='float32', shape=[1], fill_value=0.29899999499320984)\n        x2paddle_147 = paddle.full(dtype='float32', shape=[1], fill_value=0.5870000123977661)\n        x2paddle_150 = paddle.full(dtype='float32', shape=[1], fill_value=0.11400000005960464)\n        x2paddle_153 = paddle.full(dtype='float32', shape=[1], fill_value=2.0)\n        x2paddle_155 = paddle.full(dtype='float32', shape=[1], fill_value=1.0)\n        x2paddle_256 = paddle.full(dtype='float32', shape=[1], 
fill_value=1.0)\n        x2paddle_134 = self.conv0(x2paddle_input)\n        x2paddle_135, = paddle.split(x=x2paddle_134, num_or_sections=[1])\n        x2paddle_257 = paddle.multiply(x=x2paddle_134, y=x2paddle_256)\n        x2paddle_136 = paddle.squeeze(x=x2paddle_135, axis=[0])\n        x2paddle_138 = paddle.add(x=x2paddle_136, y=x2paddle_137)\n        x2paddle_139_p0, x2paddle_139_p1, x2paddle_139_p2 = paddle.split(x=x2paddle_138, num_or_sections=[1, 1, 1])\n        x2paddle_142 = paddle.squeeze(x=x2paddle_139_p0, axis=[0])\n        x2paddle_143 = paddle.squeeze(x=x2paddle_139_p1, axis=[0])\n        x2paddle_144 = paddle.squeeze(x=x2paddle_139_p2, axis=[0])\n        x2paddle_146 = paddle.multiply(x=x2paddle_142, y=x2paddle_145)\n        x2paddle_148 = paddle.multiply(x=x2paddle_143, y=x2paddle_147)\n        x2paddle_151 = paddle.multiply(x=x2paddle_144, y=x2paddle_150)\n        x2paddle_149 = paddle.add(x=x2paddle_146, y=x2paddle_148)\n        x2paddle_152 = paddle.add(x=x2paddle_149, y=x2paddle_151)\n        x2paddle_154 = paddle.divide(x=x2paddle_152, y=x2paddle_153)\n        x2paddle_156 = paddle.subtract(x=x2paddle_155, y=x2paddle_154)\n        x2paddle_157 = paddle.unsqueeze(x=x2paddle_156, axis=[0])\n        x2paddle_158 = paddle.unsqueeze(x=x2paddle_157, axis=[0])\n        x2paddle_159 = self.pool0(x2paddle_158)\n        x2paddle_163 = paddle.concat(x=[x2paddle_134, x2paddle_158], axis=1)\n        x2paddle_160 = self.pool1(x2paddle_159)\n        x2paddle_164 = self.conv1(x2paddle_163)\n        x2paddle_161 = self.pool2(x2paddle_160)\n        x2paddle_165 = self.leakyrelu0(x2paddle_164)\n        x2paddle_162 = self.pool3(x2paddle_161)\n        x2paddle_166 = self.batchnorm0(x2paddle_165)\n        x2paddle_167 = self.conv2(x2paddle_166)\n        x2paddle_168 = self.leakyrelu1(x2paddle_167)\n        x2paddle_169 = self.batchnorm1(x2paddle_168)\n        x2paddle_170 = self.pool4(x2paddle_169)\n        x2paddle_246 = paddle.multiply(x=x2paddle_169, 
y=x2paddle_158)\n        x2paddle_171 = self.conv3(x2paddle_170)\n        x2paddle_172 = self.leakyrelu2(x2paddle_171)\n        x2paddle_173 = self.batchnorm2(x2paddle_172)\n        x2paddle_174 = self.conv4(x2paddle_173)\n        x2paddle_175 = self.leakyrelu3(x2paddle_174)\n        x2paddle_176 = self.batchnorm3(x2paddle_175)\n        x2paddle_177 = self.pool5(x2paddle_176)\n        x2paddle_232 = paddle.multiply(x=x2paddle_176, y=x2paddle_159)\n        x2paddle_178 = self.conv5(x2paddle_177)\n        x2paddle_179 = self.leakyrelu4(x2paddle_178)\n        x2paddle_180 = self.batchnorm4(x2paddle_179)\n        x2paddle_181 = self.conv6(x2paddle_180)\n        x2paddle_182 = self.leakyrelu5(x2paddle_181)\n        x2paddle_183 = self.batchnorm5(x2paddle_182)\n        x2paddle_184 = self.pool6(x2paddle_183)\n        x2paddle_218 = paddle.multiply(x=x2paddle_183, y=x2paddle_160)\n        x2paddle_185 = self.conv7(x2paddle_184)\n        x2paddle_186 = self.leakyrelu6(x2paddle_185)\n        x2paddle_187 = self.batchnorm6(x2paddle_186)\n        x2paddle_188 = self.conv8(x2paddle_187)\n        x2paddle_189 = self.leakyrelu7(x2paddle_188)\n        x2paddle_190 = self.batchnorm7(x2paddle_189)\n        x2paddle_191 = self.pool7(x2paddle_190)\n        x2paddle_204 = paddle.multiply(x=x2paddle_190, y=x2paddle_161)\n        x2paddle_192 = self.conv9(x2paddle_191)\n        x2paddle_193 = self.leakyrelu8(x2paddle_192)\n        x2paddle_194 = self.batchnorm8(x2paddle_193)\n        x2paddle_195 = paddle.multiply(x=x2paddle_194, y=x2paddle_162)\n        x2paddle_196 = self.conv10(x2paddle_195)\n        x2paddle_197 = self.leakyrelu9(x2paddle_196)\n        x2paddle_198 = self.batchnorm9(x2paddle_197)\n        x2paddle_203 = paddle.nn.functional.interpolate(x=x2paddle_198, scale_factor=[2.0, 2.0], mode='bilinear')\n        x2paddle_205 = self.conv11(x2paddle_203)\n        x2paddle_206 = paddle.concat(x=[x2paddle_205, x2paddle_204], axis=1)\n        x2paddle_207 = 
self.conv12(x2paddle_206)\n        x2paddle_208 = self.leakyrelu10(x2paddle_207)\n        x2paddle_209 = self.batchnorm10(x2paddle_208)\n        x2paddle_210 = self.conv13(x2paddle_209)\n        x2paddle_211 = self.leakyrelu11(x2paddle_210)\n        x2paddle_212 = self.batchnorm11(x2paddle_211)\n        x2paddle_217 = paddle.nn.functional.interpolate(x=x2paddle_212, scale_factor=[2.0, 2.0], mode='bilinear')\n        x2paddle_219 = self.conv14(x2paddle_217)\n        x2paddle_220 = paddle.concat(x=[x2paddle_219, x2paddle_218], axis=1)\n        x2paddle_221 = self.conv15(x2paddle_220)\n        x2paddle_222 = self.leakyrelu12(x2paddle_221)\n        x2paddle_223 = self.batchnorm12(x2paddle_222)\n        x2paddle_224 = self.conv16(x2paddle_223)\n        x2paddle_225 = self.leakyrelu13(x2paddle_224)\n        x2paddle_226 = self.batchnorm13(x2paddle_225)\n        x2paddle_231 = paddle.nn.functional.interpolate(x=x2paddle_226, scale_factor=[2.0, 2.0], mode='bilinear')\n        x2paddle_233 = self.conv17(x2paddle_231)\n        x2paddle_234 = paddle.concat(x=[x2paddle_233, x2paddle_232], axis=1)\n        x2paddle_235 = self.conv18(x2paddle_234)\n        x2paddle_236 = self.leakyrelu14(x2paddle_235)\n        x2paddle_237 = self.batchnorm14(x2paddle_236)\n        x2paddle_238 = self.conv19(x2paddle_237)\n        x2paddle_239 = self.leakyrelu15(x2paddle_238)\n        x2paddle_240 = self.batchnorm15(x2paddle_239)\n        x2paddle_245 = paddle.nn.functional.interpolate(x=x2paddle_240, scale_factor=[2.0, 2.0], mode='bilinear')\n        x2paddle_247 = self.conv20(x2paddle_245)\n        x2paddle_248 = paddle.concat(x=[x2paddle_247, x2paddle_246], axis=1)\n        x2paddle_249 = self.conv21(x2paddle_248)\n        x2paddle_250 = self.leakyrelu16(x2paddle_249)\n        x2paddle_251 = self.batchnorm16(x2paddle_250)\n        x2paddle_252 = self.conv22(x2paddle_251)\n        x2paddle_253 = self.leakyrelu17(x2paddle_252)\n        x2paddle_254 = self.conv23(x2paddle_253)\n        
x2paddle_255 = paddle.multiply(x=x2paddle_254, y=x2paddle_158)\n        x2paddle_output = paddle.add(x=x2paddle_255, y=x2paddle_257)\n        return x2paddle_output, x2paddle_255\n"
  },
  {
    "path": "modules/image/image_processing/enlightengan/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\n\nimport paddlehub as hub\nfrom .enlighten_inference.pd_model.x2paddle_code import ONNXModel\nfrom .util import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"enlightengan\",\n            type=\"CV/enlighten\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"\",\n            version=\"1.0.0\")\nclass EnlightenGAN:\n\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"enlighten_inference/pd_model\")\n        self.model = ONNXModel()\n        params = paddle.load(os.path.join(self.pretrained_model, 'model.pdparams'))\n        self.model.set_dict(params, use_structured_name=True)\n\n    def enlightening(self,\n                     images: list = None,\n                     paths: list = None,\n                     output_dir: str = './enlightening_result/',\n                     use_gpu: bool = False,\n                     visualization: bool = True):\n        '''\n        enlighten images in the low-light scene.\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W, C], color space must be BGR(read by cv2).\n        paths 
(list[str]): paths to images\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n        self.model.eval()\n\n        if images is not None:\n            for image in images:\n                image = image[:, :, ::-1]\n                image = np.expand_dims(np.transpose(image, (2, 0, 1)).astype(np.float32) / 255., 0)\n                inputtensor = paddle.to_tensor(image)\n                out, _ = self.model(inputtensor)\n                out = out.numpy()[0]\n                out = (np.transpose(out, (1, 2, 0)) + 1) / 2.0 * 255.0\n                out = np.clip(out, 0, 255)\n                out = out.astype('uint8')\n                results.append(out)\n\n        if paths is not None:\n            for path in paths:\n                image = cv2.imread(path)[:, :, ::-1]\n                image = np.expand_dims(np.transpose(image, (2, 0, 1)).astype(np.float32) / 255., 0)\n                inputtensor = paddle.to_tensor(image)\n                out, _ = self.model(inputtensor)\n                out = out.numpy()[0]\n                out = (np.transpose(out, (1, 2, 0)) + 1) / 2.0 * 255.0\n                out = np.clip(out, 0, 255)\n                out = out.astype('uint8')\n                results.append(out)\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        results = self.enlightening(paths=[self.args.input_path],\n                                    output_dir=self.args.output_dir,\n                                    use_gpu=self.args.use_gpu,\n                                    visualization=self.args.visualization)\n        return results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.enlightening(images=images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='enlightening_result',\n                                           help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', action='store_true', help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to input image.\")\n"
  },
  {
    "path": "modules/image/image_processing/enlightengan/util.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated for binary data; frombuffer is the supported API.\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/image_processing/prnet/README.md",
    "content": "# prnet\n\n|模型名称|prnet|\n| :--- | :---: |\n|类别|图像 - 图像生成|\n|网络|PRN|\n|数据集|300W-LP|\n|是否支持Fine-tuning|否|\n|模型大小|154MB|\n|最新更新日期|2021-11-20|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/157190651-595b6964-97c5-4b0b-ac0a-c30c8520a972.png\"  width = \"450\"  hspace='10'/>\n    <br />\n    输入原图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/142995636-dd5e1f0a-3810-4ae9-b680-4b2482858001.jpg\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入参考图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/157205282-89c9ace9-5fec-4112-ace1-edaebbfc30f8.png\"  width = \"450\"  hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - PRNet提出一种同时进行3D人脸重建和稠密人脸对齐的方法，可应用于人脸对齐、3D人脸重建、脸部纹理编辑等任务。该模块引入了脸部纹理编辑的功能，可以将参考图像的脸部纹理转移到原图像上。\n\n  - 更多详情参考：[Joint 3D Face Reconstruction and Dense Alignment with Position Map Regression Network](https://arxiv.org/pdf/1803.07835.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - dlib\n  - scikit-image\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install prnet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run prnet --source \"/PATH/TO/IMAGE1\" --ref \"/PATH/TO/IMAGE2\"\n    ```\n  - 通过命令行方式实现脸部纹理编辑的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    module = hub.Module(name=\"prnet\")\n    source_path = \"/PATH/TO/IMAGE1\"\n    ref_path = \"/PATH/TO/IMAGE2\"\n    module.face_swap(paths=[{'source': source_path, 'ref': ref_path}],\n                    mode=0,\n                    output_dir='./swapping_result/',\n                    use_gpu=True,\n                    visualization=True)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def face_swap(self,\n                images=None,\n                paths=None,\n                mode=0,\n                output_dir='./swapping_result/',\n                use_gpu=False,\n                visualization=True):\n    ```\n    - 脸部纹理编辑API，将参考图像的脸部纹理转移到原图像上。\n\n    - **参数**\n      - images (list[dict]): data of images, 每一个元素都为一个 dict，有关键字 source, ref, 相应取值为：\n          - source (numpy.ndarray): 待转换的图片，shape 为 \\[H, W, C\\]，BGR格式；<br/>\n          - ref (numpy.ndarray) : 参考图像，shape为 \\[H, W, C\\]，BGR格式；<br/>\n      - paths (list[dict]): paths to images, 每一个元素都为一个dict, 有关键字 source, ref, 相应取值为：\n          - source (str): 待转换的图片的路径；<br/>\n          - ref (str) : 参考图像的路径；<br/>\n      - mode (int): 可选项，0表示改变局部纹理, 1表示改变整个脸；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization (bool): 是否保存结果到本地文件夹\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线脸部纹理编辑服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m prnet\n    ```\n\n  - 这样就完成了一个脸部纹理编辑的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[{'source': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE1\")), 'ref': cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE2\"))}]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/prnet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install prnet==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/image_processing/prnet/api.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom time import time\n\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.io import imsave\nfrom skimage.transform import estimate_transform\nfrom skimage.transform import warp\n\nfrom .predictor import PosPrediction\n\n\nclass PRN:\n    ''' Joint 3D Face Reconstruction and Dense Alignment with Position Map Regression Network\n    Args:\n        is_dlib(bool, optional): If true, dlib is used for detecting faces.\n        prefix(str, optional): If run at another folder, the absolute path is needed to load the data.\n    '''\n\n    def __init__(self, is_dlib=False, prefix='.'):\n\n        # resolution of input and output image size.\n        self.resolution_inp = 256\n        self.resolution_op = 256\n\n        #---- load detectors\n        if is_dlib:\n            import dlib\n            detector_path = os.path.join(prefix, 'Data/net-data/mmod_human_face_detector.dat')\n            self.face_detector = dlib.cnn_face_detection_model_v1(detector_path)\n\n        #---- load PRN\n        params = paddle.load(os.path.join(prefix, \"pd_model/model.pdparams\"))\n        self.pos_predictor = PosPrediction(params, self.resolution_inp, self.resolution_op)\n\n        # uv file\n        self.uv_kpt_ind = np.loadtxt(os.path.join(prefix,\n                                                  
'Data/uv-data/uv_kpt_ind.txt')).astype(np.int32)  # 2 x 68 get kpt\n        self.face_ind = np.loadtxt(os.path.join(prefix, 'Data/uv-data/face_ind.txt')).astype(\n            np.int32)  # get valid vertices in the pos map\n        self.triangles = np.loadtxt(os.path.join(prefix, 'Data/uv-data/triangles.txt')).astype(np.int32)  # ntri x 3\n\n        self.uv_coords = self.generate_uv_coords()\n\n    def generate_uv_coords(self):\n        resolution = self.resolution_op\n        uv_coords = np.meshgrid(range(resolution), range(resolution))\n        uv_coords = np.transpose(np.array(uv_coords), [1, 2, 0])\n        uv_coords = np.reshape(uv_coords, [resolution**2, -1])\n        uv_coords = uv_coords[self.face_ind, :]\n        uv_coords = np.hstack((uv_coords[:, :2], np.zeros([uv_coords.shape[0], 1])))\n        return uv_coords\n\n    def dlib_detect(self, image):\n        return self.face_detector(image, 1)\n\n    def net_forward(self, image):\n        ''' The core of our method: regress the position map of a given image.\n        Args:\n            image: (256,256,3) array. value range: 0~1\n        Returns:\n            pos: the 3D position map. (256, 256, 3) array.\n        '''\n        return self.pos_predictor.predict(image)\n\n    def process(self, input, image_info=None):\n        ''' process image with crop operation.\n        Args:\n            input: (h,w,3) array or str (image path). image value range: 1~255.\n            image_info(optional): the bounding box information of faces. if None, will use dlib to detect face.\n\n        Returns:\n            pos: the 3D position map. 
(256, 256, 3).\n        '''\n        if isinstance(input, str):\n            try:\n                image = imread(input)\n            except IOError:\n                print(\"error opening file: \", input)\n                return None\n        else:\n            image = input\n\n        if image.ndim < 3:\n            image = np.tile(image[:, :, np.newaxis], [1, 1, 3])\n\n        if image_info is not None:\n            if np.max(image_info.shape) > 4:  # key points to get bounding box\n                kpt = image_info\n                if kpt.shape[0] > 3:\n                    kpt = kpt.T\n                left = np.min(kpt[0, :])\n                right = np.max(kpt[0, :])\n                top = np.min(kpt[1, :])\n                bottom = np.max(kpt[1, :])\n            else:  # bounding box\n                bbox = image_info\n                left = bbox[0]\n                right = bbox[1]\n                top = bbox[2]\n                bottom = bbox[3]\n            old_size = (right - left + bottom - top) / 2\n            center = np.array([right - (right - left) / 2.0, bottom - (bottom - top) / 2.0])\n            size = int(old_size * 1.6)\n        else:\n            detected_faces = self.dlib_detect(image)\n            if len(detected_faces) == 0:\n                print('warning: no detected face')\n                return None\n\n            d = detected_faces[\n                0].rect  ## only use the first detected face (assume that each input image only contains one face)\n            left = d.left()\n            right = d.right()\n            top = d.top()\n            bottom = d.bottom()\n            old_size = (right - left + bottom - top) / 2\n            center = np.array([right - (right - left) / 2.0, bottom - (bottom - top) / 2.0 + old_size * 0.14])\n            size = int(old_size * 1.58)\n\n        # crop image\n        src_pts = np.array([[center[0] - size / 2, center[1] - size / 2], [center[0] - size / 2, center[1] + size / 2],\n                       
     [center[0] + size / 2, center[1] - size / 2]])\n        DST_PTS = np.array([[0, 0], [0, self.resolution_inp - 1], [self.resolution_inp - 1, 0]])\n        tform = estimate_transform('similarity', src_pts, DST_PTS)\n\n        image = image / 255.\n        cropped_image = warp(image, tform.inverse, output_shape=(self.resolution_inp, self.resolution_inp))\n\n        cropped_pos = self.net_forward(cropped_image)\n\n        # restore\n        cropped_vertices = np.reshape(cropped_pos, [-1, 3]).T\n        z = cropped_vertices[2, :].copy() / tform.params[0, 0]\n        cropped_vertices[2, :] = 1\n        vertices = np.dot(np.linalg.inv(tform.params), cropped_vertices)\n        vertices = np.vstack((vertices[:2, :], z))\n        pos = np.reshape(vertices.T, [self.resolution_op, self.resolution_op, 3])\n\n        return pos\n\n    def get_landmarks(self, pos):\n        '''\n        Args:\n            pos: the 3D position map. shape = (256, 256, 3).\n        Returns:\n            kpt: 68 3D landmarks. shape = (68, 3).\n        '''\n        kpt = pos[self.uv_kpt_ind[1, :], self.uv_kpt_ind[0, :], :]\n        return kpt\n\n    def get_vertices(self, pos):\n        '''\n        Args:\n            pos: the 3D position map. shape = (256, 256, 3).\n        Returns:\n            vertices: the vertices(point cloud). shape = (num of points, 3). n is about 40K here.\n        '''\n        all_vertices = np.reshape(pos, [self.resolution_op**2, -1])\n        vertices = all_vertices[self.face_ind, :]\n\n        return vertices\n\n    def get_colors_from_texture(self, texture):\n        '''\n        Args:\n            texture: the texture map. shape = (256, 256, 3).\n        Returns:\n            colors: the corresponding colors of vertices. shape = (num of points, 3). 
n is 45128 here.\n        '''\n        all_colors = np.reshape(texture, [self.resolution_op**2, -1])\n        colors = all_colors[self.face_ind, :]\n\n        return colors\n\n    def get_colors(self, image, vertices):\n        '''\n        Args:\n            image: the input image. shape = (h, w, 3).\n            vertices: the vertices (point cloud). shape = (num of points, 3).\n        Returns:\n            colors: the corresponding colors of vertices. shape = (num of points, 3). n is 45128 here.\n        '''\n        [h, w, _] = image.shape\n        vertices[:, 0] = np.minimum(np.maximum(vertices[:, 0], 0), w - 1)  # x\n        vertices[:, 1] = np.minimum(np.maximum(vertices[:, 1], 0), h - 1)  # y\n        ind = np.round(vertices).astype(np.int32)\n        colors = image[ind[:, 1], ind[:, 0], :]  # n x 3\n\n        return colors\n"
  },
  {
    "path": "modules/image/image_processing/prnet/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport copy\nimport os\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom skimage.io import imread\nfrom skimage.transform import rescale\nfrom skimage.transform import resize\n\nimport paddlehub as hub\nfrom .api import PRN\nfrom .predictor import PosPrediction\nfrom .util import base64_to_cv2\nfrom .utils.render import render_texture\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"prnet\", type=\"CV/\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass PRNet:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"pd_model/model.pdparams\")\n        self.network = PRN(is_dlib=True, prefix=self.directory)\n\n    def face_swap(self,\n                  images: list = None,\n                  paths: list = None,\n                  mode: int = 0,\n                  output_dir: str = './swapping_result/',\n                  use_gpu: bool = False,\n                  visualization: bool = True):\n        '''\n        Denoise a raw image in the low-light scene.\n\n        images (list[dict]): data of images, each element is a dict:\n          - source (numpy.ndarray): input image，shape is \\[H, W, C\\]，BGR format；<br/>\n          - ref 
(numpy.ndarray): reference image, shape is \\[H, W, C\\], BGR format;<br/>\n        paths (list[dict]): paths to images, each element is a dict:\n          - source (str): path to input image;<br/>\n          - ref (str): path to reference image;<br/>\n        mode (int): option, 0 to change part of the texture, 1 to change the whole face\n        output_dir (str): the dir to save the results\n        use_gpu (bool): if True, use gpu to perform the computation, otherwise cpu.\n        visualization (bool): if True, save results in output_dir.\n        '''\n        results = []\n        paddle.disable_static()\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n        if images is None and paths is None:\n            print('No image provided. Please input an image or an image path.')\n            return\n\n        if images is not None:\n            for image_dict in images:\n                source_img = image_dict['source'][:, :, ::-1]\n                ref_img = image_dict['ref'][:, :, ::-1]\n                results.append(self.texture_editing(source_img, ref_img, mode))\n\n        if paths is not None:\n            for path_dict in paths:\n                source_img = cv2.imread(path_dict['source'])[:, :, ::-1]\n                ref_img = cv2.imread(path_dict['ref'])[:, :, ::-1]\n                results.append(self.texture_editing(source_img, ref_img, mode))\n\n        if visualization:\n            os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    def texture_editing(self, source_img, ref_img, mode):\n        # read image\n        image = source_img\n        [h, w, _] = image.shape\n        prn = self.network\n        #-- 1. 
3D reconstruction -> get texture.\n        pos = prn.process(image)\n        vertices = prn.get_vertices(pos)\n        image = image / 255.\n        texture = cv2.remap(\n            image,\n            pos[:, :, :2].astype(np.float32),\n            None,\n            interpolation=cv2.INTER_NEAREST,\n            borderMode=cv2.BORDER_CONSTANT,\n            borderValue=(0))\n\n        #-- 2. Texture Editing\n        # change part of texture (for data augmentation/selfie editing; here we modify the eyes as an example)\n        if mode == 0:\n            # load eye mask\n            uv_face_eye = imread(os.path.join(self.directory, 'Data/uv-data/uv_face_eyes.png'), as_gray=True) / 255.\n            uv_face = imread(os.path.join(self.directory, 'Data/uv-data/uv_face.png'), as_gray=True) / 255.\n            eye_mask = (abs(uv_face_eye - uv_face) > 0).astype(np.float32)\n\n            # texture from another image or a processed texture\n            ref_image = ref_img\n            ref_pos = prn.process(ref_image)\n            ref_image = ref_image / 255.\n            ref_texture = cv2.remap(\n                ref_image,\n                ref_pos[:, :, :2].astype(np.float32),\n                None,\n                interpolation=cv2.INTER_NEAREST,\n                borderMode=cv2.BORDER_CONSTANT,\n                borderValue=(0))\n\n            # modify texture\n            new_texture = texture * (1 - eye_mask[:, :, np.newaxis]) + ref_texture * eye_mask[:, :, np.newaxis]\n\n        # change whole face (face swap)\n        elif mode == 1:\n            # texture from another image or a processed texture\n            ref_image = ref_img\n            ref_pos = prn.process(ref_image)\n            ref_image = ref_image / 255.\n            ref_texture = cv2.remap(\n                ref_image,\n                ref_pos[:, :, :2].astype(np.float32),\n                None,\n                interpolation=cv2.INTER_NEAREST,\n                borderMode=cv2.BORDER_CONSTANT,\n   
             borderValue=(0))\n            ref_vertices = prn.get_vertices(ref_pos)\n            new_texture = ref_texture  #(texture + ref_texture)/2.\n\n        else:\n            print('Wrong Mode! Mode should be 0 or 1.')\n            exit()\n\n        #-- 3. remap to the input image (render)\n        vis_colors = np.ones((vertices.shape[0], 1))\n        face_mask = render_texture(vertices.T, vis_colors.T, prn.triangles.T, h, w, c=1)\n        face_mask = np.squeeze(face_mask > 0).astype(np.float32)\n\n        new_colors = prn.get_colors_from_texture(new_texture)\n        new_image = render_texture(vertices.T, new_colors.T, prn.triangles.T, h, w, c=3)\n        new_image = image * (1 - face_mask[:, :, np.newaxis]) + new_image * face_mask[:, :, np.newaxis]\n\n        # Poisson editing for blending the image\n        vis_ind = np.argwhere(face_mask > 0)\n        vis_min = np.min(vis_ind, 0)\n        vis_max = np.max(vis_ind, 0)\n        center = (int((vis_min[1] + vis_max[1]) / 2 + 0.5), int((vis_min[0] + vis_max[0]) / 2 + 0.5))\n        output = cv2.seamlessClone((new_image * 255).astype(np.uint8), (image * 255).astype(np.uint8),\n                                   (face_mask * 255).astype(np.uint8), center, cv2.NORMAL_CLONE)\n\n        return output\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n\n        self.face_swap(\n            paths=[{\n                'source': self.args.source,\n                'ref': self.args.ref\n            }],\n            mode=self.args.mode,\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = copy.deepcopy(images)\n        for image in images_decode:\n            image['source'] = base64_to_cv2(image['source'])\n            image['ref'] = base64_to_cv2(image['ref'])\n        results = self.face_swap(images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--mode', type=int, default=0, help='process option, 0 for part texture, 1 for whole face.', choices=[0, 1])\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='swapping_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', type=bool, default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--source', type=str, help=\"path to source image.\")\n        
self.arg_input_group.add_argument('--ref', type=str, help=\"path to reference image.\")\n"
  },
  {
    "path": "modules/image/image_processing/prnet/pd_model/x2paddle_code.py",
    "content": "import paddle\nimport math\n\n\nclass TFModel(paddle.nn.Layer):\n    def __init__(self):\n        super(TFModel, self).__init__()\n        self.conv0 = paddle.nn.Conv2D(\n            weight_attr='conv0.weight',\n            bias_attr=False,\n            in_channels=3,\n            out_channels=16,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn0 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_Conv_BatchNorm_FusedBatchNorm_resfcn256_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_Conv_BatchNorm_FusedBatchNorm_resfcn256_Conv_BatchNorm_beta',\n            moving_mean_name='resfcn256_Conv_BatchNorm_FusedBatchNorm_resfcn256_Conv_BatchNorm_moving_mean',\n            moving_variance_name='resfcn256_Conv_BatchNorm_FusedBatchNorm_resfcn256_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu0 = paddle.nn.ReLU()\n        self.conv1 = paddle.nn.Conv2D(\n            weight_attr='conv1.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=32,\n            kernel_size=[1, 1],\n            stride=2,\n            padding='SAME')\n        self.conv2 = paddle.nn.Conv2D(\n            weight_attr='conv2.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=16,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn1 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n          
  'resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu1 = paddle.nn.ReLU()\n        self.conv3 = paddle.nn.Conv2D(\n            weight_attr='conv3.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=16,\n            kernel_size=[4, 4],\n            stride=2,\n            padding='SAME')\n        self.bn2 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu2 = paddle.nn.ReLU()\n        self.conv4 = paddle.nn.Conv2D(\n            weight_attr='conv4.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=32,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn3 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_BatchNorm_FusedBatchNorm_resfcn256_resBlock_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_BatchNorm_FusedBatchNorm_resfcn256_resBlock_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_BatchNorm_FusedBatchNorm_resfcn256_resBlock_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_BatchNorm_FusedBatchNorm_resfcn256_resBlock_BatchNorm_moving_variance',\n            is_test=True)\n        
self.relu3 = paddle.nn.ReLU()\n        self.conv5 = paddle.nn.Conv2D(\n            weight_attr='conv5.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=16,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn4 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu4 = paddle.nn.ReLU()\n        self.conv6 = paddle.nn.Conv2D(\n            weight_attr='conv6.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=16,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn5 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu5 = paddle.nn.ReLU()\n        self.conv7 = 
paddle.nn.Conv2D(\n            weight_attr='conv7.weight',\n            bias_attr=False,\n            in_channels=16,\n            out_channels=32,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn6 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu6 = paddle.nn.ReLU()\n        self.conv8 = paddle.nn.Conv2D(\n            weight_attr='conv8.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=64,\n            kernel_size=[1, 1],\n            stride=2,\n            padding='SAME')\n        self.conv9 = paddle.nn.Conv2D(\n            weight_attr='conv9.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=32,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn7 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            
'resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu7 = paddle.nn.ReLU()\n        self.conv10 = paddle.nn.Conv2D(\n            weight_attr='conv10.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=32,\n            kernel_size=[4, 4],\n            stride=2,\n            padding='SAME')\n        self.bn8 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu8 = paddle.nn.ReLU()\n        self.conv11 = paddle.nn.Conv2D(\n            weight_attr='conv11.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=64,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn9 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_2_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_2_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_2_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_BatchNorm_moving_mean',\n            moving_variance_name=\n            
'resfcn256_resBlock_2_BatchNorm_FusedBatchNorm_resfcn256_resBlock_2_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu9 = paddle.nn.ReLU()\n        self.conv12 = paddle.nn.Conv2D(\n            weight_attr='conv12.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=32,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn10 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu10 = paddle.nn.ReLU()\n        self.conv13 = paddle.nn.Conv2D(\n            weight_attr='conv13.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=32,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn11 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            
'resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu11 = paddle.nn.ReLU()\n        self.conv14 = paddle.nn.Conv2D(\n            weight_attr='conv14.weight',\n            bias_attr=False,\n            in_channels=32,\n            out_channels=64,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn12 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_3_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_3_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_3_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_3_BatchNorm_FusedBatchNorm_resfcn256_resBlock_3_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu12 = paddle.nn.ReLU()\n        self.conv15 = paddle.nn.Conv2D(\n            weight_attr='conv15.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=128,\n            kernel_size=[1, 1],\n            stride=2,\n            padding='SAME')\n        self.conv16 = paddle.nn.Conv2D(\n            weight_attr='conv16.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=64,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn13 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_BatchNorm_beta',\n            moving_mean_name=\n            
'resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu13 = paddle.nn.ReLU()\n        self.conv17 = paddle.nn.Conv2D(\n            weight_attr='conv17.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=64,\n            kernel_size=[4, 4],\n            stride=2,\n            padding='SAME')\n        self.bn14 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu14 = paddle.nn.ReLU()\n        self.conv18 = paddle.nn.Conv2D(\n            weight_attr='conv18.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=128,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn15 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_4_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_4_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_BatchNorm_beta',\n            
moving_mean_name='resfcn256_resBlock_4_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_4_BatchNorm_FusedBatchNorm_resfcn256_resBlock_4_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu15 = paddle.nn.ReLU()\n        self.conv19 = paddle.nn.Conv2D(\n            weight_attr='conv19.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=64,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn16 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu16 = paddle.nn.ReLU()\n        self.conv20 = paddle.nn.Conv2D(\n            weight_attr='conv20.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=64,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn17 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            
'resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu17 = paddle.nn.ReLU()\n        self.conv21 = paddle.nn.Conv2D(\n            weight_attr='conv21.weight',\n            bias_attr=False,\n            in_channels=64,\n            out_channels=128,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn18 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_5_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_5_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_5_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_5_BatchNorm_FusedBatchNorm_resfcn256_resBlock_5_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu18 = paddle.nn.ReLU()\n        self.conv22 = paddle.nn.Conv2D(\n            weight_attr='conv22.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=256,\n            kernel_size=[1, 1],\n            stride=2,\n            padding='SAME')\n        self.conv23 = paddle.nn.Conv2D(\n            weight_attr='conv23.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=128,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn19 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_BatchNorm_gamma',\n            
bias_attr='resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu19 = paddle.nn.ReLU()\n        self.conv24 = paddle.nn.Conv2D(\n            weight_attr='conv24.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=128,\n            kernel_size=[4, 4],\n            stride=2,\n            padding='SAME')\n        self.bn20 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu20 = paddle.nn.ReLU()\n        self.conv25 = paddle.nn.Conv2D(\n            weight_attr='conv25.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=256,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn21 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_6_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_BatchNorm_gamma',\n            
bias_attr='resfcn256_resBlock_6_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_6_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_6_BatchNorm_FusedBatchNorm_resfcn256_resBlock_6_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu21 = paddle.nn.ReLU()\n        self.conv26 = paddle.nn.Conv2D(\n            weight_attr='conv26.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=128,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn22 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu22 = paddle.nn.ReLU()\n        self.conv27 = paddle.nn.Conv2D(\n            weight_attr='conv27.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=128,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn23 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_1_BatchNorm_gamma',\n            
bias_attr='resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu23 = paddle.nn.ReLU()\n        self.conv28 = paddle.nn.Conv2D(\n            weight_attr='conv28.weight',\n            bias_attr=False,\n            in_channels=128,\n            out_channels=256,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn24 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_7_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_7_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_7_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_7_BatchNorm_FusedBatchNorm_resfcn256_resBlock_7_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu24 = paddle.nn.ReLU()\n        self.conv29 = paddle.nn.Conv2D(\n            weight_attr='conv29.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=512,\n            kernel_size=[1, 1],\n            stride=2,\n            padding='SAME')\n        self.conv30 = paddle.nn.Conv2D(\n            weight_attr='conv30.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=256,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn25 = paddle.nn.BatchNorm(\n            num_channels=256,\n            
epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu25 = paddle.nn.ReLU()\n        self.conv31 = paddle.nn.Conv2D(\n            weight_attr='conv31.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=256,\n            kernel_size=[4, 4],\n            stride=2,\n            padding='SAME')\n        self.bn26 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu26 = paddle.nn.ReLU()\n        self.conv32 = paddle.nn.Conv2D(\n            weight_attr='conv32.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=512,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn27 = paddle.nn.BatchNorm(\n            num_channels=512,\n            
epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_8_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_8_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_8_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_8_BatchNorm_FusedBatchNorm_resfcn256_resBlock_8_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu27 = paddle.nn.ReLU()\n        self.conv33 = paddle.nn.Conv2D(\n            weight_attr='conv33.weight',\n            bias_attr=False,\n            in_channels=512,\n            out_channels=256,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn28 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu28 = paddle.nn.ReLU()\n        self.conv34 = paddle.nn.Conv2D(\n            weight_attr='conv34.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=256,\n            kernel_size=[4, 4],\n            padding='SAME')\n        self.bn29 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            
'resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_1_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_Conv_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu29 = paddle.nn.ReLU()\n        self.conv35 = paddle.nn.Conv2D(\n            weight_attr='conv35.weight',\n            bias_attr=False,\n            in_channels=256,\n            out_channels=512,\n            kernel_size=[1, 1],\n            padding='SAME')\n        self.bn30 = paddle.nn.BatchNorm(\n            num_channels=512,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_resBlock_9_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_BatchNorm_gamma',\n            bias_attr='resfcn256_resBlock_9_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_BatchNorm_beta',\n            moving_mean_name='resfcn256_resBlock_9_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_resBlock_9_BatchNorm_FusedBatchNorm_resfcn256_resBlock_9_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu30 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_conv2d_transpose_conv36_weight = self.create_parameter(\n            shape=(512, 512, 4, 4), attr='conv36.weight')\n        self.bn31 = paddle.nn.BatchNorm(\n            num_channels=512,\n            epsilon=0.0010000000474974513,\n            param_attr='resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_BatchNorm_gamma',\n            
bias_attr='resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu31 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_1_conv2d_transpose_conv37_weight = self.create_parameter(\n            shape=(512, 256, 4, 4), attr='conv37.weight')\n        self.bn32 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_1_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_1_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_1_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_1_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu32 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_2_conv2d_transpose_conv38_weight = self.create_parameter(\n            shape=(256, 256, 4, 4), attr='conv38.weight')\n        self.bn33 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_2_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_2_BatchNorm_beta',\n            moving_mean_name=\n     
       'resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_2_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_2_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu33 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_3_conv2d_transpose_conv39_weight = self.create_parameter(\n            shape=(256, 256, 4, 4), attr='conv39.weight')\n        self.bn34 = paddle.nn.BatchNorm(\n            num_channels=256,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_3_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_3_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_3_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_3_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu34 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_4_conv2d_transpose_conv40_weight = self.create_parameter(\n            shape=(256, 128, 4, 4), attr='conv40.weight')\n        self.bn35 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_4_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_4_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_4_BatchNorm_moving_mean',\n            
moving_variance_name=\n            'resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_4_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu35 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_5_conv2d_transpose_conv41_weight = self.create_parameter(\n            shape=(128, 128, 4, 4), attr='conv41.weight')\n        self.bn36 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_5_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_5_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_5_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_5_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu36 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_6_conv2d_transpose_conv42_weight = self.create_parameter(\n            shape=(128, 128, 4, 4), attr='conv42.weight')\n        self.bn37 = paddle.nn.BatchNorm(\n            num_channels=128,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_6_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_6_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_6_BatchNorm_moving_mean',\n            moving_variance_name=\n            
'resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_6_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu37 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_7_conv2d_transpose_conv43_weight = self.create_parameter(\n            shape=(128, 64, 4, 4), attr='conv43.weight')\n        self.bn38 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_7_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_7_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_7_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_7_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu38 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_8_conv2d_transpose_conv44_weight = self.create_parameter(\n            shape=(64, 64, 4, 4), attr='conv44.weight')\n        self.bn39 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_8_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_8_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_8_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_8_BatchNorm_moving_variance',\n            is_test=True)\n 
       self.relu39 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_9_conv2d_transpose_conv45_weight = self.create_parameter(\n            shape=(64, 64, 4, 4), attr='conv45.weight')\n        self.bn40 = paddle.nn.BatchNorm(\n            num_channels=64,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_9_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_9_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_9_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_9_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu40 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_10_conv2d_transpose_conv46_weight = self.create_parameter(\n            shape=(64, 32, 4, 4), attr='conv46.weight')\n        self.bn41 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_10_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_10_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_10_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_10_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu41 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_11_conv2d_transpose_conv47_weight = 
self.create_parameter(\n            shape=(32, 32, 4, 4), attr='conv47.weight')\n        self.bn42 = paddle.nn.BatchNorm(\n            num_channels=32,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_11_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_11_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_11_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_11_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu42 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_12_conv2d_transpose_conv48_weight = self.create_parameter(\n            shape=(32, 16, 4, 4), attr='conv48.weight')\n        self.bn43 = paddle.nn.BatchNorm(\n            num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_12_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_12_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_12_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_12_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu43 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_13_conv2d_transpose_conv49_weight = self.create_parameter(\n            shape=(16, 16, 4, 4), attr='conv49.weight')\n        self.bn44 = paddle.nn.BatchNorm(\n 
           num_channels=16,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_13_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_13_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_13_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_13_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu44 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_14_conv2d_transpose_conv50_weight = self.create_parameter(\n            shape=(16, 3, 4, 4), attr='conv50.weight')\n        self.bn45 = paddle.nn.BatchNorm(\n            num_channels=3,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_14_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_14_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_14_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_14_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu45 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_15_conv2d_transpose_conv51_weight = self.create_parameter(\n            shape=(3, 3, 4, 4), attr='conv51.weight')\n        self.bn46 = paddle.nn.BatchNorm(\n            num_channels=3,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            
'resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_15_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_15_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_15_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_15_BatchNorm_moving_variance',\n            is_test=True)\n        self.relu46 = paddle.nn.ReLU()\n        self.resfcn256_Conv2d_transpose_16_conv2d_transpose_conv52_weight = self.create_parameter(\n            shape=(3, 3, 4, 4), attr='conv52.weight')\n        self.bn47 = paddle.nn.BatchNorm(\n            num_channels=3,\n            epsilon=0.0010000000474974513,\n            param_attr=\n            'resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_16_BatchNorm_gamma',\n            bias_attr=\n            'resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_16_BatchNorm_beta',\n            moving_mean_name=\n            'resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_16_BatchNorm_moving_mean',\n            moving_variance_name=\n            'resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm_resfcn256_Conv2d_transpose_16_BatchNorm_moving_variance',\n            is_test=True)\n        self.sigmoid0 = paddle.nn.Sigmoid()\n\n    def forward(self, Placeholder):\n        resfcn256_Conv2d_transpose_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=512)\n        resfcn256_Conv2d_transpose_1_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n 
       resfcn256_Conv2d_transpose_1_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_1_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=256)\n        resfcn256_Conv2d_transpose_2_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_2_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_2_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=256)\n        resfcn256_Conv2d_transpose_3_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_3_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_3_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=256)\n        resfcn256_Conv2d_transpose_4_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_4_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_4_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=128)\n        resfcn256_Conv2d_transpose_5_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_5_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_5_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=128)\n        resfcn256_Conv2d_transpose_6_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_6_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_6_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=128)\n        resfcn256_Conv2d_transpose_7_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_7_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_7_stack_3 = paddle.full(dtype='int32', shape=[1], 
fill_value=64)\n        resfcn256_Conv2d_transpose_8_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_8_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_8_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=64)\n        resfcn256_Conv2d_transpose_9_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_9_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_9_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=64)\n        resfcn256_Conv2d_transpose_10_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_10_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_10_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=32)\n        resfcn256_Conv2d_transpose_11_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_11_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_11_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=32)\n        resfcn256_Conv2d_transpose_12_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_12_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=2)\n        resfcn256_Conv2d_transpose_12_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=16)\n        resfcn256_Conv2d_transpose_13_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_13_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_13_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=16)\n        resfcn256_Conv2d_transpose_14_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_14_mul_1_y = paddle.full(dtype='int32', 
shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_14_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=3)\n        resfcn256_Conv2d_transpose_15_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_15_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_15_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=3)\n        resfcn256_Conv2d_transpose_16_mul_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_16_mul_1_y = paddle.full(dtype='int32', shape=[1], fill_value=1)\n        resfcn256_Conv2d_transpose_16_stack_3 = paddle.full(dtype='int32', shape=[1], fill_value=3)\n        conv2d_transpose_0 = paddle.transpose(x=Placeholder, perm=[0, 3, 1, 2])\n        resfcn256_Conv_Conv2D = self.conv0(conv2d_transpose_0)\n        resfcn256_Conv_BatchNorm_FusedBatchNorm = self.bn0(resfcn256_Conv_Conv2D)\n        resfcn256_Conv_Relu = self.relu0(resfcn256_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_shortcut_Conv2D = self.conv1(resfcn256_Conv_Relu)\n        resfcn256_resBlock_Conv_Conv2D = self.conv2(resfcn256_Conv_Relu)\n        resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm = self.bn1(resfcn256_resBlock_Conv_Conv2D)\n        resfcn256_resBlock_Conv_Relu = self.relu1(resfcn256_resBlock_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_Conv_1_Conv2D = self.conv3(resfcn256_resBlock_Conv_Relu)\n        resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm = self.bn2(resfcn256_resBlock_Conv_1_Conv2D)\n        resfcn256_resBlock_Conv_1_Relu = self.relu2(resfcn256_resBlock_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_Conv_2_Conv2D = self.conv4(resfcn256_resBlock_Conv_1_Relu)\n        resfcn256_resBlock_add = paddle.add(x=resfcn256_resBlock_Conv_2_Conv2D, y=resfcn256_resBlock_shortcut_Conv2D)\n        resfcn256_resBlock_BatchNorm_FusedBatchNorm = self.bn3(resfcn256_resBlock_add)\n        
resfcn256_resBlock_Relu = self.relu3(resfcn256_resBlock_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_1_Conv_Conv2D = self.conv5(resfcn256_resBlock_Relu)\n        resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm = self.bn4(resfcn256_resBlock_1_Conv_Conv2D)\n        resfcn256_resBlock_1_Conv_Relu = self.relu4(resfcn256_resBlock_1_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_1_Conv_1_Conv2D = self.conv6(resfcn256_resBlock_1_Conv_Relu)\n        resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm = self.bn5(resfcn256_resBlock_1_Conv_1_Conv2D)\n        resfcn256_resBlock_1_Conv_1_Relu = self.relu5(resfcn256_resBlock_1_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_1_Conv_2_Conv2D = self.conv7(resfcn256_resBlock_1_Conv_1_Relu)\n        resfcn256_resBlock_1_add = paddle.add(x=resfcn256_resBlock_1_Conv_2_Conv2D, y=resfcn256_resBlock_Relu)\n        resfcn256_resBlock_1_BatchNorm_FusedBatchNorm = self.bn6(resfcn256_resBlock_1_add)\n        resfcn256_resBlock_1_Relu = self.relu6(resfcn256_resBlock_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_2_shortcut_Conv2D = self.conv8(resfcn256_resBlock_1_Relu)\n        resfcn256_resBlock_2_Conv_Conv2D = self.conv9(resfcn256_resBlock_1_Relu)\n        resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm = self.bn7(resfcn256_resBlock_2_Conv_Conv2D)\n        resfcn256_resBlock_2_Conv_Relu = self.relu7(resfcn256_resBlock_2_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_2_Conv_1_Conv2D = self.conv10(resfcn256_resBlock_2_Conv_Relu)\n        resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm = self.bn8(resfcn256_resBlock_2_Conv_1_Conv2D)\n        resfcn256_resBlock_2_Conv_1_Relu = self.relu8(resfcn256_resBlock_2_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_2_Conv_2_Conv2D = self.conv11(resfcn256_resBlock_2_Conv_1_Relu)\n        resfcn256_resBlock_2_add = paddle.add(\n            x=resfcn256_resBlock_2_Conv_2_Conv2D, y=resfcn256_resBlock_2_shortcut_Conv2D)\n        
resfcn256_resBlock_2_BatchNorm_FusedBatchNorm = self.bn9(resfcn256_resBlock_2_add)\n        resfcn256_resBlock_2_Relu = self.relu9(resfcn256_resBlock_2_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_3_Conv_Conv2D = self.conv12(resfcn256_resBlock_2_Relu)\n        resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm = self.bn10(resfcn256_resBlock_3_Conv_Conv2D)\n        resfcn256_resBlock_3_Conv_Relu = self.relu10(resfcn256_resBlock_3_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_3_Conv_1_Conv2D = self.conv13(resfcn256_resBlock_3_Conv_Relu)\n        resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm = self.bn11(resfcn256_resBlock_3_Conv_1_Conv2D)\n        resfcn256_resBlock_3_Conv_1_Relu = self.relu11(resfcn256_resBlock_3_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_3_Conv_2_Conv2D = self.conv14(resfcn256_resBlock_3_Conv_1_Relu)\n        resfcn256_resBlock_3_add = paddle.add(x=resfcn256_resBlock_3_Conv_2_Conv2D, y=resfcn256_resBlock_2_Relu)\n        resfcn256_resBlock_3_BatchNorm_FusedBatchNorm = self.bn12(resfcn256_resBlock_3_add)\n        resfcn256_resBlock_3_Relu = self.relu12(resfcn256_resBlock_3_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_4_shortcut_Conv2D = self.conv15(resfcn256_resBlock_3_Relu)\n        resfcn256_resBlock_4_Conv_Conv2D = self.conv16(resfcn256_resBlock_3_Relu)\n        resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm = self.bn13(resfcn256_resBlock_4_Conv_Conv2D)\n        resfcn256_resBlock_4_Conv_Relu = self.relu13(resfcn256_resBlock_4_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_4_Conv_1_Conv2D = self.conv17(resfcn256_resBlock_4_Conv_Relu)\n        resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm = self.bn14(resfcn256_resBlock_4_Conv_1_Conv2D)\n        resfcn256_resBlock_4_Conv_1_Relu = self.relu14(resfcn256_resBlock_4_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_4_Conv_2_Conv2D = self.conv18(resfcn256_resBlock_4_Conv_1_Relu)\n        resfcn256_resBlock_4_add = 
paddle.add(\n            x=resfcn256_resBlock_4_Conv_2_Conv2D, y=resfcn256_resBlock_4_shortcut_Conv2D)\n        resfcn256_resBlock_4_BatchNorm_FusedBatchNorm = self.bn15(resfcn256_resBlock_4_add)\n        resfcn256_resBlock_4_Relu = self.relu15(resfcn256_resBlock_4_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_5_Conv_Conv2D = self.conv19(resfcn256_resBlock_4_Relu)\n        resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm = self.bn16(resfcn256_resBlock_5_Conv_Conv2D)\n        resfcn256_resBlock_5_Conv_Relu = self.relu16(resfcn256_resBlock_5_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_5_Conv_1_Conv2D = self.conv20(resfcn256_resBlock_5_Conv_Relu)\n        resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm = self.bn17(resfcn256_resBlock_5_Conv_1_Conv2D)\n        resfcn256_resBlock_5_Conv_1_Relu = self.relu17(resfcn256_resBlock_5_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_5_Conv_2_Conv2D = self.conv21(resfcn256_resBlock_5_Conv_1_Relu)\n        resfcn256_resBlock_5_add = paddle.add(x=resfcn256_resBlock_5_Conv_2_Conv2D, y=resfcn256_resBlock_4_Relu)\n        resfcn256_resBlock_5_BatchNorm_FusedBatchNorm = self.bn18(resfcn256_resBlock_5_add)\n        resfcn256_resBlock_5_Relu = self.relu18(resfcn256_resBlock_5_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_6_shortcut_Conv2D = self.conv22(resfcn256_resBlock_5_Relu)\n        resfcn256_resBlock_6_Conv_Conv2D = self.conv23(resfcn256_resBlock_5_Relu)\n        resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm = self.bn19(resfcn256_resBlock_6_Conv_Conv2D)\n        resfcn256_resBlock_6_Conv_Relu = self.relu19(resfcn256_resBlock_6_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_6_Conv_1_Conv2D = self.conv24(resfcn256_resBlock_6_Conv_Relu)\n        resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm = self.bn20(resfcn256_resBlock_6_Conv_1_Conv2D)\n        resfcn256_resBlock_6_Conv_1_Relu = self.relu20(resfcn256_resBlock_6_Conv_1_BatchNorm_FusedBatchNorm)\n        
resfcn256_resBlock_6_Conv_2_Conv2D = self.conv25(resfcn256_resBlock_6_Conv_1_Relu)\n        resfcn256_resBlock_6_add = paddle.add(\n            x=resfcn256_resBlock_6_Conv_2_Conv2D, y=resfcn256_resBlock_6_shortcut_Conv2D)\n        resfcn256_resBlock_6_BatchNorm_FusedBatchNorm = self.bn21(resfcn256_resBlock_6_add)\n        resfcn256_resBlock_6_Relu = self.relu21(resfcn256_resBlock_6_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_7_Conv_Conv2D = self.conv26(resfcn256_resBlock_6_Relu)\n        resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm = self.bn22(resfcn256_resBlock_7_Conv_Conv2D)\n        resfcn256_resBlock_7_Conv_Relu = self.relu22(resfcn256_resBlock_7_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_7_Conv_1_Conv2D = self.conv27(resfcn256_resBlock_7_Conv_Relu)\n        resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm = self.bn23(resfcn256_resBlock_7_Conv_1_Conv2D)\n        resfcn256_resBlock_7_Conv_1_Relu = self.relu23(resfcn256_resBlock_7_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_7_Conv_2_Conv2D = self.conv28(resfcn256_resBlock_7_Conv_1_Relu)\n        resfcn256_resBlock_7_add = paddle.add(x=resfcn256_resBlock_7_Conv_2_Conv2D, y=resfcn256_resBlock_6_Relu)\n        resfcn256_resBlock_7_BatchNorm_FusedBatchNorm = self.bn24(resfcn256_resBlock_7_add)\n        resfcn256_resBlock_7_Relu = self.relu24(resfcn256_resBlock_7_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_8_shortcut_Conv2D = self.conv29(resfcn256_resBlock_7_Relu)\n        resfcn256_resBlock_8_Conv_Conv2D = self.conv30(resfcn256_resBlock_7_Relu)\n        resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm = self.bn25(resfcn256_resBlock_8_Conv_Conv2D)\n        resfcn256_resBlock_8_Conv_Relu = self.relu25(resfcn256_resBlock_8_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_8_Conv_1_Conv2D = self.conv31(resfcn256_resBlock_8_Conv_Relu)\n        resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm = self.bn26(resfcn256_resBlock_8_Conv_1_Conv2D)\n      
  resfcn256_resBlock_8_Conv_1_Relu = self.relu26(resfcn256_resBlock_8_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_8_Conv_2_Conv2D = self.conv32(resfcn256_resBlock_8_Conv_1_Relu)\n        resfcn256_resBlock_8_add = paddle.add(\n            x=resfcn256_resBlock_8_Conv_2_Conv2D, y=resfcn256_resBlock_8_shortcut_Conv2D)\n        resfcn256_resBlock_8_BatchNorm_FusedBatchNorm = self.bn27(resfcn256_resBlock_8_add)\n        resfcn256_resBlock_8_Relu = self.relu27(resfcn256_resBlock_8_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_9_Conv_Conv2D = self.conv33(resfcn256_resBlock_8_Relu)\n        resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm = self.bn28(resfcn256_resBlock_9_Conv_Conv2D)\n        resfcn256_resBlock_9_Conv_Relu = self.relu28(resfcn256_resBlock_9_Conv_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_9_Conv_1_Conv2D = self.conv34(resfcn256_resBlock_9_Conv_Relu)\n        resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm = self.bn29(resfcn256_resBlock_9_Conv_1_Conv2D)\n        resfcn256_resBlock_9_Conv_1_Relu = self.relu29(resfcn256_resBlock_9_Conv_1_BatchNorm_FusedBatchNorm)\n        resfcn256_resBlock_9_Conv_2_Conv2D = self.conv35(resfcn256_resBlock_9_Conv_1_Relu)\n        resfcn256_resBlock_9_add = paddle.add(x=resfcn256_resBlock_9_Conv_2_Conv2D, y=resfcn256_resBlock_8_Relu)\n        resfcn256_resBlock_9_BatchNorm_FusedBatchNorm = self.bn30(resfcn256_resBlock_9_add)\n        resfcn256_resBlock_9_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_resBlock_9_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_resBlock_9_Relu = self.relu30(resfcn256_resBlock_9_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_Shape = paddle.shape(input=resfcn256_resBlock_9_Relu)\n        resfcn256_Conv2d_transpose_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_strided_slice_1 = paddle.slice(\n           
 input=resfcn256_Conv2d_transpose_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_strided_slice_1, y=resfcn256_Conv2d_transpose_mul_y)\n        resfcn256_Conv2d_transpose_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_strided_slice_2, y=resfcn256_Conv2d_transpose_mul_1_y)\n        resfcn256_Conv2d_transpose_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_strided_slice, resfcn256_Conv2d_transpose_mul, resfcn256_Conv2d_transpose_mul_1,\n            resfcn256_Conv2d_transpose_stack_3\n        ])\n        resfcn256_Conv2d_transpose_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_stack, shape=[-1])\n        conv2dbackpropinput_transpose_0 = paddle.transpose(x=resfcn256_resBlock_9_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_conv2d_transpose_conv36_weight = self.resfcn256_Conv2d_transpose_conv2d_transpose_conv36_weight\n        resfcn256_Conv2d_transpose_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_0,\n            weight=resfcn256_Conv2d_transpose_conv2d_transpose_conv36_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[8, 8])\n        resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm = self.bn31(resfcn256_Conv2d_transpose_conv2d_transpose)\n        resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_Relu = self.relu31(resfcn256_Conv2d_transpose_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_1_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_Relu)\n        
resfcn256_Conv2d_transpose_1_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_1_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_1_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_1_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_1_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_1_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_1_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_1_strided_slice_1, y=resfcn256_Conv2d_transpose_1_mul_y)\n        resfcn256_Conv2d_transpose_1_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_1_strided_slice_2, y=resfcn256_Conv2d_transpose_1_mul_1_y)\n        resfcn256_Conv2d_transpose_1_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_1_strided_slice, resfcn256_Conv2d_transpose_1_mul,\n            resfcn256_Conv2d_transpose_1_mul_1, resfcn256_Conv2d_transpose_1_stack_3\n        ])\n        resfcn256_Conv2d_transpose_1_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_1_stack, shape=[-1])\n        conv2dbackpropinput_transpose_1 = paddle.transpose(x=resfcn256_Conv2d_transpose_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_1_conv2d_transpose_conv37_weight = self.resfcn256_Conv2d_transpose_1_conv2d_transpose_conv37_weight\n        resfcn256_Conv2d_transpose_1_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_1,\n            weight=resfcn256_Conv2d_transpose_1_conv2d_transpose_conv37_weight,\n            stride=[2, 2],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[16, 16])\n        resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm = self.bn32(resfcn256_Conv2d_transpose_1_conv2d_transpose)\n        resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm = paddle.transpose(\n            
x=resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_1_Relu = self.relu32(resfcn256_Conv2d_transpose_1_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_2_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_1_Relu)\n        resfcn256_Conv2d_transpose_2_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_2_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_2_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_2_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_2_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_2_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_2_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_2_strided_slice_1, y=resfcn256_Conv2d_transpose_2_mul_y)\n        resfcn256_Conv2d_transpose_2_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_2_strided_slice_2, y=resfcn256_Conv2d_transpose_2_mul_1_y)\n        resfcn256_Conv2d_transpose_2_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_2_strided_slice, resfcn256_Conv2d_transpose_2_mul,\n            resfcn256_Conv2d_transpose_2_mul_1, resfcn256_Conv2d_transpose_2_stack_3\n        ])\n        resfcn256_Conv2d_transpose_2_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_2_stack, shape=[-1])\n        conv2dbackpropinput_transpose_2 = paddle.transpose(x=resfcn256_Conv2d_transpose_1_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_2_conv2d_transpose_conv38_weight = self.resfcn256_Conv2d_transpose_2_conv2d_transpose_conv38_weight\n        resfcn256_Conv2d_transpose_2_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_2,\n            weight=resfcn256_Conv2d_transpose_2_conv2d_transpose_conv38_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            
padding='SAME',\n            output_size=[16, 16])\n        resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm = self.bn33(resfcn256_Conv2d_transpose_2_conv2d_transpose)\n        resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_2_Relu = self.relu33(resfcn256_Conv2d_transpose_2_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_3_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_2_Relu)\n        resfcn256_Conv2d_transpose_3_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_3_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_3_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_3_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_3_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_3_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_3_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_3_strided_slice_1, y=resfcn256_Conv2d_transpose_3_mul_y)\n        resfcn256_Conv2d_transpose_3_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_3_strided_slice_2, y=resfcn256_Conv2d_transpose_3_mul_1_y)\n        resfcn256_Conv2d_transpose_3_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_3_strided_slice, resfcn256_Conv2d_transpose_3_mul,\n            resfcn256_Conv2d_transpose_3_mul_1, resfcn256_Conv2d_transpose_3_stack_3\n        ])\n        resfcn256_Conv2d_transpose_3_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_3_stack, shape=[-1])\n        conv2dbackpropinput_transpose_3 = paddle.transpose(x=resfcn256_Conv2d_transpose_2_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_3_conv2d_transpose_conv39_weight = self.resfcn256_Conv2d_transpose_3_conv2d_transpose_conv39_weight\n        
resfcn256_Conv2d_transpose_3_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_3,\n            weight=resfcn256_Conv2d_transpose_3_conv2d_transpose_conv39_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[16, 16])\n        resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm = self.bn34(resfcn256_Conv2d_transpose_3_conv2d_transpose)\n        resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_3_Relu = self.relu34(resfcn256_Conv2d_transpose_3_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_4_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_3_Relu)\n        resfcn256_Conv2d_transpose_4_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_4_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_4_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_4_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_4_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_4_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_4_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_4_strided_slice_1, y=resfcn256_Conv2d_transpose_4_mul_y)\n        resfcn256_Conv2d_transpose_4_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_4_strided_slice_2, y=resfcn256_Conv2d_transpose_4_mul_1_y)\n        resfcn256_Conv2d_transpose_4_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_4_strided_slice, resfcn256_Conv2d_transpose_4_mul,\n            resfcn256_Conv2d_transpose_4_mul_1, resfcn256_Conv2d_transpose_4_stack_3\n        ])\n        resfcn256_Conv2d_transpose_4_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_4_stack, 
shape=[-1])\n        conv2dbackpropinput_transpose_4 = paddle.transpose(x=resfcn256_Conv2d_transpose_3_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_4_conv2d_transpose_conv40_weight = self.resfcn256_Conv2d_transpose_4_conv2d_transpose_conv40_weight\n        resfcn256_Conv2d_transpose_4_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_4,\n            weight=resfcn256_Conv2d_transpose_4_conv2d_transpose_conv40_weight,\n            stride=[2, 2],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[32, 32])\n        resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm = self.bn35(resfcn256_Conv2d_transpose_4_conv2d_transpose)\n        resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_4_Relu = self.relu35(resfcn256_Conv2d_transpose_4_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_5_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_4_Relu)\n        resfcn256_Conv2d_transpose_5_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_5_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_5_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_5_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_5_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_5_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_5_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_5_strided_slice_1, y=resfcn256_Conv2d_transpose_5_mul_y)\n        resfcn256_Conv2d_transpose_5_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_5_strided_slice_2, y=resfcn256_Conv2d_transpose_5_mul_1_y)\n        resfcn256_Conv2d_transpose_5_stack = paddle.stack(x=[\n            
resfcn256_Conv2d_transpose_5_strided_slice, resfcn256_Conv2d_transpose_5_mul,\n            resfcn256_Conv2d_transpose_5_mul_1, resfcn256_Conv2d_transpose_5_stack_3\n        ])\n        resfcn256_Conv2d_transpose_5_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_5_stack, shape=[-1])\n        conv2dbackpropinput_transpose_5 = paddle.transpose(x=resfcn256_Conv2d_transpose_4_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_5_conv2d_transpose_conv41_weight = self.resfcn256_Conv2d_transpose_5_conv2d_transpose_conv41_weight\n        resfcn256_Conv2d_transpose_5_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_5,\n            weight=resfcn256_Conv2d_transpose_5_conv2d_transpose_conv41_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[32, 32])\n        resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm = self.bn36(resfcn256_Conv2d_transpose_5_conv2d_transpose)\n        resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_5_Relu = self.relu36(resfcn256_Conv2d_transpose_5_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_6_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_5_Relu)\n        resfcn256_Conv2d_transpose_6_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_6_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_6_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_6_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_6_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_6_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_6_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_6_strided_slice_1, 
y=resfcn256_Conv2d_transpose_6_mul_y)\n        resfcn256_Conv2d_transpose_6_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_6_strided_slice_2, y=resfcn256_Conv2d_transpose_6_mul_1_y)\n        resfcn256_Conv2d_transpose_6_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_6_strided_slice, resfcn256_Conv2d_transpose_6_mul,\n            resfcn256_Conv2d_transpose_6_mul_1, resfcn256_Conv2d_transpose_6_stack_3\n        ])\n        resfcn256_Conv2d_transpose_6_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_6_stack, shape=[-1])\n        conv2dbackpropinput_transpose_6 = paddle.transpose(x=resfcn256_Conv2d_transpose_5_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_6_conv2d_transpose_conv42_weight = self.resfcn256_Conv2d_transpose_6_conv2d_transpose_conv42_weight\n        resfcn256_Conv2d_transpose_6_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_6,\n            weight=resfcn256_Conv2d_transpose_6_conv2d_transpose_conv42_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[32, 32])\n        resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm = self.bn37(resfcn256_Conv2d_transpose_6_conv2d_transpose)\n        resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_6_Relu = self.relu37(resfcn256_Conv2d_transpose_6_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_7_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_6_Relu)\n        resfcn256_Conv2d_transpose_7_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_7_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_7_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_7_Shape, axes=[0], starts=[1], ends=[2])\n        
resfcn256_Conv2d_transpose_7_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_7_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_7_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_7_strided_slice_1, y=resfcn256_Conv2d_transpose_7_mul_y)\n        resfcn256_Conv2d_transpose_7_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_7_strided_slice_2, y=resfcn256_Conv2d_transpose_7_mul_1_y)\n        resfcn256_Conv2d_transpose_7_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_7_strided_slice, resfcn256_Conv2d_transpose_7_mul,\n            resfcn256_Conv2d_transpose_7_mul_1, resfcn256_Conv2d_transpose_7_stack_3\n        ])\n        resfcn256_Conv2d_transpose_7_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_7_stack, shape=[-1])\n        conv2dbackpropinput_transpose_7 = paddle.transpose(x=resfcn256_Conv2d_transpose_6_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_7_conv2d_transpose_conv43_weight = self.resfcn256_Conv2d_transpose_7_conv2d_transpose_conv43_weight\n        resfcn256_Conv2d_transpose_7_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_7,\n            weight=resfcn256_Conv2d_transpose_7_conv2d_transpose_conv43_weight,\n            stride=[2, 2],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[64, 64])\n        resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm = self.bn38(resfcn256_Conv2d_transpose_7_conv2d_transpose)\n        resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_7_Relu = self.relu38(resfcn256_Conv2d_transpose_7_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_8_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_7_Relu)\n        resfcn256_Conv2d_transpose_8_strided_slice = 
paddle.slice(\n            input=resfcn256_Conv2d_transpose_8_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_8_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_8_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_8_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_8_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_8_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_8_strided_slice_1, y=resfcn256_Conv2d_transpose_8_mul_y)\n        resfcn256_Conv2d_transpose_8_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_8_strided_slice_2, y=resfcn256_Conv2d_transpose_8_mul_1_y)\n        resfcn256_Conv2d_transpose_8_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_8_strided_slice, resfcn256_Conv2d_transpose_8_mul,\n            resfcn256_Conv2d_transpose_8_mul_1, resfcn256_Conv2d_transpose_8_stack_3\n        ])\n        resfcn256_Conv2d_transpose_8_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_8_stack, shape=[-1])\n        conv2dbackpropinput_transpose_8 = paddle.transpose(x=resfcn256_Conv2d_transpose_7_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_8_conv2d_transpose_conv44_weight = self.resfcn256_Conv2d_transpose_8_conv2d_transpose_conv44_weight\n        resfcn256_Conv2d_transpose_8_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_8,\n            weight=resfcn256_Conv2d_transpose_8_conv2d_transpose_conv44_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[64, 64])\n        resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm = self.bn39(resfcn256_Conv2d_transpose_8_conv2d_transpose)\n        resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n    
    resfcn256_Conv2d_transpose_8_Relu = self.relu39(resfcn256_Conv2d_transpose_8_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_9_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_8_Relu)\n        resfcn256_Conv2d_transpose_9_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_9_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_9_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_9_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_9_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_9_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_9_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_9_strided_slice_1, y=resfcn256_Conv2d_transpose_9_mul_y)\n        resfcn256_Conv2d_transpose_9_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_9_strided_slice_2, y=resfcn256_Conv2d_transpose_9_mul_1_y)\n        resfcn256_Conv2d_transpose_9_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_9_strided_slice, resfcn256_Conv2d_transpose_9_mul,\n            resfcn256_Conv2d_transpose_9_mul_1, resfcn256_Conv2d_transpose_9_stack_3\n        ])\n        resfcn256_Conv2d_transpose_9_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_9_stack, shape=[-1])\n        conv2dbackpropinput_transpose_9 = paddle.transpose(x=resfcn256_Conv2d_transpose_8_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_9_conv2d_transpose_conv45_weight = self.resfcn256_Conv2d_transpose_9_conv2d_transpose_conv45_weight\n        resfcn256_Conv2d_transpose_9_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_9,\n            weight=resfcn256_Conv2d_transpose_9_conv2d_transpose_conv45_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[64, 64])\n        
resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm = self.bn40(resfcn256_Conv2d_transpose_9_conv2d_transpose)\n        resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_9_Relu = self.relu40(resfcn256_Conv2d_transpose_9_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_10_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_9_Relu)\n        resfcn256_Conv2d_transpose_10_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_10_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_10_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_10_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_10_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_10_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_10_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_10_strided_slice_1, y=resfcn256_Conv2d_transpose_10_mul_y)\n        resfcn256_Conv2d_transpose_10_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_10_strided_slice_2, y=resfcn256_Conv2d_transpose_10_mul_1_y)\n        resfcn256_Conv2d_transpose_10_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_10_strided_slice, resfcn256_Conv2d_transpose_10_mul,\n            resfcn256_Conv2d_transpose_10_mul_1, resfcn256_Conv2d_transpose_10_stack_3\n        ])\n        resfcn256_Conv2d_transpose_10_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_10_stack, shape=[-1])\n        conv2dbackpropinput_transpose_10 = paddle.transpose(x=resfcn256_Conv2d_transpose_9_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_10_conv2d_transpose_conv46_weight = self.resfcn256_Conv2d_transpose_10_conv2d_transpose_conv46_weight\n        resfcn256_Conv2d_transpose_10_conv2d_transpose = 
paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_10,\n            weight=resfcn256_Conv2d_transpose_10_conv2d_transpose_conv46_weight,\n            stride=[2, 2],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[128, 128])\n        resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm = self.bn41(\n            resfcn256_Conv2d_transpose_10_conv2d_transpose)\n        resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_10_Relu = self.relu41(resfcn256_Conv2d_transpose_10_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_11_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_10_Relu)\n        resfcn256_Conv2d_transpose_11_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_11_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_11_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_11_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_11_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_11_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_11_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_11_strided_slice_1, y=resfcn256_Conv2d_transpose_11_mul_y)\n        resfcn256_Conv2d_transpose_11_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_11_strided_slice_2, y=resfcn256_Conv2d_transpose_11_mul_1_y)\n        resfcn256_Conv2d_transpose_11_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_11_strided_slice, resfcn256_Conv2d_transpose_11_mul,\n            resfcn256_Conv2d_transpose_11_mul_1, resfcn256_Conv2d_transpose_11_stack_3\n        ])\n        resfcn256_Conv2d_transpose_11_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_11_stack, shape=[-1])\n  
      conv2dbackpropinput_transpose_11 = paddle.transpose(x=resfcn256_Conv2d_transpose_10_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_11_conv2d_transpose_conv47_weight = self.resfcn256_Conv2d_transpose_11_conv2d_transpose_conv47_weight\n        resfcn256_Conv2d_transpose_11_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_11,\n            weight=resfcn256_Conv2d_transpose_11_conv2d_transpose_conv47_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[128, 128])\n        resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm = self.bn42(\n            resfcn256_Conv2d_transpose_11_conv2d_transpose)\n        resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_11_Relu = self.relu42(resfcn256_Conv2d_transpose_11_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_12_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_11_Relu)\n        resfcn256_Conv2d_transpose_12_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_12_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_12_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_12_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_12_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_12_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_12_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_12_strided_slice_1, y=resfcn256_Conv2d_transpose_12_mul_y)\n        resfcn256_Conv2d_transpose_12_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_12_strided_slice_2, y=resfcn256_Conv2d_transpose_12_mul_1_y)\n        resfcn256_Conv2d_transpose_12_stack = 
paddle.stack(x=[\n            resfcn256_Conv2d_transpose_12_strided_slice, resfcn256_Conv2d_transpose_12_mul,\n            resfcn256_Conv2d_transpose_12_mul_1, resfcn256_Conv2d_transpose_12_stack_3\n        ])\n        resfcn256_Conv2d_transpose_12_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_12_stack, shape=[-1])\n        conv2dbackpropinput_transpose_12 = paddle.transpose(x=resfcn256_Conv2d_transpose_11_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_12_conv2d_transpose_conv48_weight = self.resfcn256_Conv2d_transpose_12_conv2d_transpose_conv48_weight\n        resfcn256_Conv2d_transpose_12_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_12,\n            weight=resfcn256_Conv2d_transpose_12_conv2d_transpose_conv48_weight,\n            stride=[2, 2],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[256, 256])\n        resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm = self.bn43(\n            resfcn256_Conv2d_transpose_12_conv2d_transpose)\n        resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_12_Relu = self.relu43(resfcn256_Conv2d_transpose_12_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_13_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_12_Relu)\n        resfcn256_Conv2d_transpose_13_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_13_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_13_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_13_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_13_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_13_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_13_mul = paddle.multiply(\n   
         x=resfcn256_Conv2d_transpose_13_strided_slice_1, y=resfcn256_Conv2d_transpose_13_mul_y)\n        resfcn256_Conv2d_transpose_13_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_13_strided_slice_2, y=resfcn256_Conv2d_transpose_13_mul_1_y)\n        resfcn256_Conv2d_transpose_13_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_13_strided_slice, resfcn256_Conv2d_transpose_13_mul,\n            resfcn256_Conv2d_transpose_13_mul_1, resfcn256_Conv2d_transpose_13_stack_3\n        ])\n        resfcn256_Conv2d_transpose_13_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_13_stack, shape=[-1])\n        conv2dbackpropinput_transpose_13 = paddle.transpose(x=resfcn256_Conv2d_transpose_12_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_13_conv2d_transpose_conv49_weight = self.resfcn256_Conv2d_transpose_13_conv2d_transpose_conv49_weight\n        resfcn256_Conv2d_transpose_13_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_13,\n            weight=resfcn256_Conv2d_transpose_13_conv2d_transpose_conv49_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[256, 256])\n        resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm = self.bn44(\n            resfcn256_Conv2d_transpose_13_conv2d_transpose)\n        resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_13_Relu = self.relu44(resfcn256_Conv2d_transpose_13_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_14_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_13_Relu)\n        resfcn256_Conv2d_transpose_14_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_14_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_14_strided_slice_1 = paddle.slice(\n 
           input=resfcn256_Conv2d_transpose_14_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_14_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_14_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_14_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_14_strided_slice_1, y=resfcn256_Conv2d_transpose_14_mul_y)\n        resfcn256_Conv2d_transpose_14_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_14_strided_slice_2, y=resfcn256_Conv2d_transpose_14_mul_1_y)\n        resfcn256_Conv2d_transpose_14_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_14_strided_slice, resfcn256_Conv2d_transpose_14_mul,\n            resfcn256_Conv2d_transpose_14_mul_1, resfcn256_Conv2d_transpose_14_stack_3\n        ])\n        resfcn256_Conv2d_transpose_14_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_14_stack, shape=[-1])\n        conv2dbackpropinput_transpose_14 = paddle.transpose(x=resfcn256_Conv2d_transpose_13_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_14_conv2d_transpose_conv50_weight = self.resfcn256_Conv2d_transpose_14_conv2d_transpose_conv50_weight\n        resfcn256_Conv2d_transpose_14_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_14,\n            weight=resfcn256_Conv2d_transpose_14_conv2d_transpose_conv50_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[256, 256])\n        resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm = self.bn45(\n            resfcn256_Conv2d_transpose_14_conv2d_transpose)\n        resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_14_Relu = self.relu45(resfcn256_Conv2d_transpose_14_BatchNorm_FusedBatchNorm)\n        
resfcn256_Conv2d_transpose_15_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_14_Relu)\n        resfcn256_Conv2d_transpose_15_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_15_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_15_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_15_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_15_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_15_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_15_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_15_strided_slice_1, y=resfcn256_Conv2d_transpose_15_mul_y)\n        resfcn256_Conv2d_transpose_15_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_15_strided_slice_2, y=resfcn256_Conv2d_transpose_15_mul_1_y)\n        resfcn256_Conv2d_transpose_15_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_15_strided_slice, resfcn256_Conv2d_transpose_15_mul,\n            resfcn256_Conv2d_transpose_15_mul_1, resfcn256_Conv2d_transpose_15_stack_3\n        ])\n        resfcn256_Conv2d_transpose_15_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_15_stack, shape=[-1])\n        conv2dbackpropinput_transpose_15 = paddle.transpose(x=resfcn256_Conv2d_transpose_14_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_15_conv2d_transpose_conv51_weight = self.resfcn256_Conv2d_transpose_15_conv2d_transpose_conv51_weight\n        resfcn256_Conv2d_transpose_15_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            x=conv2dbackpropinput_transpose_15,\n            weight=resfcn256_Conv2d_transpose_15_conv2d_transpose_conv51_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[256, 256])\n        resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm = self.bn46(\n            
resfcn256_Conv2d_transpose_15_conv2d_transpose)\n        resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_15_Relu = self.relu46(resfcn256_Conv2d_transpose_15_BatchNorm_FusedBatchNorm)\n        resfcn256_Conv2d_transpose_16_Shape = paddle.shape(input=resfcn256_Conv2d_transpose_15_Relu)\n        resfcn256_Conv2d_transpose_16_strided_slice = paddle.slice(\n            input=resfcn256_Conv2d_transpose_16_Shape, axes=[0], starts=[0], ends=[1])\n        resfcn256_Conv2d_transpose_16_strided_slice_1 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_16_Shape, axes=[0], starts=[1], ends=[2])\n        resfcn256_Conv2d_transpose_16_strided_slice_2 = paddle.slice(\n            input=resfcn256_Conv2d_transpose_16_Shape, axes=[0], starts=[2], ends=[3])\n        resfcn256_Conv2d_transpose_16_mul = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_16_strided_slice_1, y=resfcn256_Conv2d_transpose_16_mul_y)\n        resfcn256_Conv2d_transpose_16_mul_1 = paddle.multiply(\n            x=resfcn256_Conv2d_transpose_16_strided_slice_2, y=resfcn256_Conv2d_transpose_16_mul_1_y)\n        resfcn256_Conv2d_transpose_16_stack = paddle.stack(x=[\n            resfcn256_Conv2d_transpose_16_strided_slice, resfcn256_Conv2d_transpose_16_mul,\n            resfcn256_Conv2d_transpose_16_mul_1, resfcn256_Conv2d_transpose_16_stack_3\n        ])\n        resfcn256_Conv2d_transpose_16_stack = paddle.reshape(x=resfcn256_Conv2d_transpose_16_stack, shape=[-1])\n        conv2dbackpropinput_transpose_16 = paddle.transpose(x=resfcn256_Conv2d_transpose_15_Relu, perm=[0, 3, 1, 2])\n        resfcn256_Conv2d_transpose_16_conv2d_transpose_conv52_weight = self.resfcn256_Conv2d_transpose_16_conv2d_transpose_conv52_weight\n        resfcn256_Conv2d_transpose_16_conv2d_transpose = paddle.nn.functional.conv2d_transpose(\n            
x=conv2dbackpropinput_transpose_16,\n            weight=resfcn256_Conv2d_transpose_16_conv2d_transpose_conv52_weight,\n            stride=[1, 1],\n            dilation=[1, 1],\n            padding='SAME',\n            output_size=[256, 256])\n        resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm = self.bn47(\n            resfcn256_Conv2d_transpose_16_conv2d_transpose)\n        resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm = paddle.transpose(\n            x=resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm, perm=[0, 2, 3, 1])\n        resfcn256_Conv2d_transpose_16_Sigmoid = self.sigmoid0(resfcn256_Conv2d_transpose_16_BatchNorm_FusedBatchNorm)\n        return resfcn256_Conv2d_transpose_16_Sigmoid\n\n\ndef main(Placeholder):\n    # There are 1 inputs.\n    # Placeholder: shape-[-1, 256, 256, 3], type-float32.\n\n    paddle.disable_static()\n    params = paddle.load('/work/ToTransferInHub/PRNet-Paddle/pd_model/model.pdparams')\n    model = TFModel()\n    model.set_dict(params, use_structured_name=False)\n    model.eval()\n    out = model(Placeholder)\n    return out\n\n\nif __name__ == '__main__':\n    tensor = paddle.randn([1, 256, 256, 3])\n    print(main(tensor).shape)\n"
  },
  {
    "path": "modules/image/image_processing/prnet/predictor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\n\nfrom .pd_model.x2paddle_code import TFModel\n\n\nclass PosPrediction():\n    def __init__(self, params, resolution_inp=256, resolution_op=256):\n        # -- hyper settings\n        self.resolution_inp = resolution_inp\n        self.resolution_op = resolution_op\n        self.MaxPos = resolution_inp * 1.1\n\n        # network type\n        self.network = TFModel()\n        self.network.set_dict(params, use_structured_name=False)\n        self.network.eval()\n\n    def predict(self, image):\n        paddle.disable_static()\n        image_tensor = paddle.to_tensor(image[np.newaxis, :, :, :], dtype='float32')\n        pos = self.network(image_tensor)\n        pos = pos.numpy()\n        pos = np.squeeze(pos)\n        return pos * self.MaxPos\n\n    def predict_batch(self, images):\n        # Forward a batch of NHWC images through the Paddle network\n        # (the previous TensorFlow session call was a leftover from the source repo).\n        paddle.disable_static()\n        images_tensor = paddle.to_tensor(images, dtype='float32')\n        pos = self.network(images_tensor)\n        return pos.numpy() * self.MaxPos\n"
  },
  {
    "path": "modules/image/image_processing/prnet/requirements.txt",
    "content": "dlib\nscikit-image\n"
  },
  {
    "path": "modules/image/image_processing/prnet/util.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport base64\n\nimport cv2\nimport numpy as np\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated for binary data; frombuffer is the drop-in replacement.\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_GRAYSCALE)\n    return data\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/image_processing/prnet/utils/cv_plot.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport cv2\nimport numpy as np\n\nend_list = np.array([17, 22, 27, 42, 48, 31, 36, 68], dtype=np.int32) - 1\n\n\ndef plot_kpt(image, kpt):\n    ''' Draw 68 key points\n    Args:\n        image: the input image\n        kpt: (68, 3).\n    '''\n    image = image.copy()\n    kpt = np.round(kpt).astype(np.int32)\n    for i in range(kpt.shape[0]):\n        st = kpt[i, :2]\n        image = cv2.circle(image, (st[0], st[1]), 1, (0, 0, 255), 2)\n        if i in end_list:\n            continue\n        ed = kpt[i + 1, :2]\n        image = cv2.line(image, (st[0], st[1]), (ed[0], ed[1]), (255, 255, 255), 1)\n    return image\n\n\ndef plot_vertices(image, vertices):\n    image = image.copy()\n    vertices = np.round(vertices).astype(np.int32)\n    for i in range(0, vertices.shape[0], 2):\n        st = vertices[i, :2]\n        image = cv2.circle(image, (st[0], st[1]), 1, (255, 0, 0), -1)\n    return image\n\n\ndef plot_pose_box(image, P, kpt, color=(0, 255, 0), line_width=2):\n    ''' Draw a 3D box as annotation of pose. Ref:https://github.com/yinguobing/head-pose-estimation/blob/master/pose_estimator.py\n    Args:\n        image: the input image\n        P: (3, 4). 
Affine Camera Matrix.\n        kpt: (68, 3).\n    '''\n    image = image.copy()\n\n    point_3d = []\n    rear_size = 90\n    rear_depth = 0\n    point_3d.append((-rear_size, -rear_size, rear_depth))\n    point_3d.append((-rear_size, rear_size, rear_depth))\n    point_3d.append((rear_size, rear_size, rear_depth))\n    point_3d.append((rear_size, -rear_size, rear_depth))\n    point_3d.append((-rear_size, -rear_size, rear_depth))\n\n    front_size = 105\n    front_depth = 110\n    point_3d.append((-front_size, -front_size, front_depth))\n    point_3d.append((-front_size, front_size, front_depth))\n    point_3d.append((front_size, front_size, front_depth))\n    point_3d.append((front_size, -front_size, front_depth))\n    point_3d.append((-front_size, -front_size, front_depth))\n    # np.float was removed in NumPy 1.24; use the explicit float64 dtype.\n    point_3d = np.array(point_3d, dtype=np.float64).reshape(-1, 3)\n\n    # Map to 2d image points\n    point_3d_homo = np.hstack((point_3d, np.ones([point_3d.shape[0], 1])))  #n x 4\n    point_2d = point_3d_homo.dot(P.T)[:, :2]\n    point_2d[:, :2] = point_2d[:, :2] - np.mean(point_2d[:4, :2], 0) + np.mean(kpt[:27, :2], 0)\n    point_2d = np.int32(point_2d.reshape(-1, 2))\n\n    # Draw all the lines\n    cv2.polylines(image, [point_2d], True, color, line_width, cv2.LINE_AA)\n    cv2.line(image, tuple(point_2d[1]), tuple(point_2d[6]), color, line_width, cv2.LINE_AA)\n    cv2.line(image, tuple(point_2d[2]), tuple(point_2d[7]), color, line_width, cv2.LINE_AA)\n    cv2.line(image, tuple(point_2d[3]), tuple(point_2d[8]), color, line_width, cv2.LINE_AA)\n\n    return image\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/estimate_pose.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom math import asin\nfrom math import atan2\nfrom math import cos\nfrom math import sin\n\nimport numpy as np\n\n\ndef isRotationMatrix(R):\n    ''' checks whether a matrix is a valid rotation matrix (i.e. whether it is orthogonal)\n    '''\n    Rt = np.transpose(R)\n    shouldBeIdentity = np.dot(Rt, R)\n    I = np.identity(3, dtype=R.dtype)\n    n = np.linalg.norm(I - shouldBeIdentity)\n    return n < 1e-6\n\n\ndef matrix2angle(R):\n    ''' compute three Euler angles from a Rotation Matrix. Ref: http://www.gregslabaugh.net/publications/euler.pdf\n    Args:\n        R: (3,3). rotation matrix\n    Returns:\n        x: yaw\n        y: pitch\n        z: roll\n    '''\n    # assert(isRotationMatrix(R))\n\n    # Note: this must be 'and', not 'or'; with 'or' the condition is always\n    # true and the gimbal-lock branch below is unreachable.\n    if R[2, 0] != 1 and R[2, 0] != -1:\n        x = asin(R[2, 0])\n        y = atan2(R[2, 1] / cos(x), R[2, 2] / cos(x))\n        z = atan2(R[1, 0] / cos(x), R[0, 0] / cos(x))\n\n    else:  # Gimbal lock\n        z = 0  #can be anything\n        if R[2, 0] == -1:\n            x = np.pi / 2\n            y = z + atan2(R[0, 1], R[0, 2])\n        else:\n            x = -np.pi / 2\n            y = -z + atan2(-R[0, 1], -R[0, 2])\n\n    return x, y, z\n\n\ndef P2sRt(P):\n    ''' decomposing camera matrix P.\n    Args:\n        P: (3, 4). Affine Camera Matrix.\n    Returns:\n        s: scale factor.\n        R: (3, 3). 
rotation matrix.\n        t2d: (2,). 2d translation.\n    '''\n    t2d = P[:2, 3]\n    R1 = P[0:1, :3]\n    R2 = P[1:2, :3]\n    s = (np.linalg.norm(R1) + np.linalg.norm(R2)) / 2.0\n    r1 = R1 / np.linalg.norm(R1)\n    r2 = R2 / np.linalg.norm(R2)\n    r3 = np.cross(r1, r2)\n\n    R = np.concatenate((r1, r2, r3), 0)\n    return s, R, t2d\n\n\ndef compute_similarity_transform(points_static, points_to_transform):\n    #http://nghiaho.com/?page_id=671\n    p0 = np.copy(points_static).T\n    p1 = np.copy(points_to_transform).T\n\n    t0 = -np.mean(p0, axis=1).reshape(3, 1)\n    t1 = -np.mean(p1, axis=1).reshape(3, 1)\n    t_final = t1 - t0\n\n    p0c = p0 + t0\n    p1c = p1 + t1\n\n    covariance_matrix = p0c.dot(p1c.T)\n    U, S, V = np.linalg.svd(covariance_matrix)\n    R = U.dot(V)\n    if np.linalg.det(R) < 0:\n        R[:, 2] *= -1\n\n    rms_d0 = np.sqrt(np.mean(np.linalg.norm(p0c, axis=0)**2))\n    rms_d1 = np.sqrt(np.mean(np.linalg.norm(p1c, axis=0)**2))\n\n    s = (rms_d0 / rms_d1)\n    P = np.c_[s * np.eye(3).dot(R), t_final]\n    return P\n\n\ndef estimate_pose(vertices):\n    canonical_vertices = np.load('Data/uv-data/canonical_vertices.npy')\n    P = compute_similarity_transform(vertices, canonical_vertices)\n    _, R, _ = P2sRt(P)  # decompose affine matrix to s, R, t\n    pose = matrix2angle(R)\n\n    return P, pose\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/render.py",
    "content": "'''\nAuthor: YadiraF\nMail: fengyao@sjtu.edu.cn\n'''\nimport numpy as np\n\n\ndef isPointInTri(point, tri_points):\n    ''' Judge whether the point is in the triangle\n    Method:\n        http://blackpawn.com/texts/pointinpoly/\n    Args:\n        point: [u, v] or [x, y]\n        tri_points: three vertices(2d points) of a triangle. 2 coords x 3 vertices\n    Returns:\n        bool: true for in triangle\n    '''\n    tp = tri_points\n\n    # vectors\n    v0 = tp[:, 2] - tp[:, 0]\n    v1 = tp[:, 1] - tp[:, 0]\n    v2 = point - tp[:, 0]\n\n    # dot products\n    dot00 = np.dot(v0.T, v0)\n    dot01 = np.dot(v0.T, v1)\n    dot02 = np.dot(v0.T, v2)\n    dot11 = np.dot(v1.T, v1)\n    dot12 = np.dot(v1.T, v2)\n\n    # barycentric coordinates\n    if dot00 * dot11 - dot01 * dot01 == 0:\n        inverDeno = 0\n    else:\n        inverDeno = 1 / (dot00 * dot11 - dot01 * dot01)\n\n    u = (dot11 * dot02 - dot01 * dot12) * inverDeno\n    v = (dot00 * dot12 - dot01 * dot02) * inverDeno\n\n    # check if point in triangle\n    return (u >= 0) & (v >= 0) & (u + v < 1)\n\n\ndef get_point_weight(point, tri_points):\n    ''' Get the weights of the position\n    Methods: https://gamedev.stackexchange.com/questions/23743/whats-the-most-efficient-way-to-find-barycentric-coordinates\n     -m1.compute the area of the triangles formed by embedding the point P inside the triangle\n     -m2.Christer Ericson's book \"Real-Time Collision Detection\". faster, so I used this.\n    Args:\n        point: [u, v] or [x, y]\n        tri_points: three vertices(2d points) of a triangle. 
2 coords x 3 vertices\n    Returns:\n        w0: weight of v0\n        w1: weight of v1\n        w2: weight of v2\n     '''\n    tp = tri_points\n    # vectors\n    v0 = tp[:, 2] - tp[:, 0]\n    v1 = tp[:, 1] - tp[:, 0]\n    v2 = point - tp[:, 0]\n\n    # dot products\n    dot00 = np.dot(v0.T, v0)\n    dot01 = np.dot(v0.T, v1)\n    dot02 = np.dot(v0.T, v2)\n    dot11 = np.dot(v1.T, v1)\n    dot12 = np.dot(v1.T, v2)\n\n    # barycentric coordinates\n    if dot00 * dot11 - dot01 * dot01 == 0:\n        inverDeno = 0\n    else:\n        inverDeno = 1 / (dot00 * dot11 - dot01 * dot01)\n\n    u = (dot11 * dot02 - dot01 * dot12) * inverDeno\n    v = (dot00 * dot12 - dot01 * dot02) * inverDeno\n\n    w0 = 1 - u - v\n    w1 = v\n    w2 = u\n\n    return w0, w1, w2\n\n\ndef render_texture(vertices, colors, triangles, h, w, c=3):\n    ''' render mesh by z buffer\n    Args:\n        vertices: 3 x nver\n        colors: 3 x nver\n        triangles: 3 x ntri\n        h: height\n        w: width\n    '''\n    # initial\n    image = np.zeros((h, w, c))\n\n    depth_buffer = np.zeros([h, w]) - 999999.\n    # triangle depth: approximate the depth to the average value of z in each vertex(v0, v1, v2), since the vertices are close to each other\n    tri_depth = (vertices[2, triangles[0, :]] + vertices[2, triangles[1, :]] + vertices[2, triangles[2, :]]) / 3.\n    tri_tex = (colors[:, triangles[0, :]] + colors[:, triangles[1, :]] + colors[:, triangles[2, :]]) / 3.\n\n    for i in range(triangles.shape[1]):\n        tri = triangles[:, i]  # 3 vertex indices\n\n        # the inner bounding box\n        umin = max(int(np.ceil(np.min(vertices[0, tri]))), 0)\n        umax = min(int(np.floor(np.max(vertices[0, tri]))), w - 1)\n\n        vmin = max(int(np.ceil(np.min(vertices[1, tri]))), 0)\n        vmax = min(int(np.floor(np.max(vertices[1, tri]))), h - 1)\n\n        if umax < umin or vmax < vmin:\n            continue\n\n        for u in range(umin, umax + 1):\n            for v in 
range(vmin, vmax + 1):\n                if tri_depth[i] > depth_buffer[v, u] and isPointInTri([u, v], vertices[:2, tri]):\n                    depth_buffer[v, u] = tri_depth[i]\n                    image[v, u, :] = tri_tex[:, i]\n    return image\n\n\ndef map_texture(src_image,\n                src_vertices,\n                dst_vertices,\n                dst_triangle_buffer,\n                triangles,\n                h,\n                w,\n                c=3,\n                mapping_type='bilinear'):\n    '''\n    Args:\n        triangles: 3 x ntri\n\n        # src\n        src_image: height x width x nchannels\n        src_vertices: 3 x nver\n\n        # dst\n        dst_vertices: 3 x nver\n        dst_triangle_buffer: height x width. the triangle index of each pixel in dst image\n\n    Returns:\n        dst_image: height x width x nchannels\n\n    '''\n    [sh, sw, sc] = src_image.shape\n    dst_image = np.zeros((h, w, c))\n    for y in range(h):\n        for x in range(w):\n            tri_ind = dst_triangle_buffer[y, x]\n            if tri_ind < 0:  # no tri in dst image\n                continue\n            #if src_triangles_vis[tri_ind]: # the corresponding triangle in src image is invisible\n            #   continue\n\n            # then. For this triangle index, map corresponding pixels(in triangles) in src image to dst image\n            # Two Methods:\n            # M1. Calculate the corresponding affine matrix from src triangle to dst triangle. Then find the corresponding src position of this dst pixel.\n            # -- ToDo\n            # M2. 
Calculate the relative position of three vertices in dst triangle, then find the corresponding src position relative to three src vertices.\n            tri = triangles[:, tri_ind]\n            # dst weight, here directly use the center to approximate because the tri is small\n            # if tri_ind < 366:\n            # print tri_ind\n            w0, w1, w2 = get_point_weight([x, y], dst_vertices[:2, tri])\n            # else:\n            #     w0 = w1 = w2 = 1./3\n            # src\n            src_texel = w0 * src_vertices[:2, tri[0]] + w1 * src_vertices[:2, tri[1]] + w2 * src_vertices[:2, tri[2]]  #\n            #\n            if src_texel[0] < 0 or src_texel[0] > sw - 1 or src_texel[1] < 0 or src_texel[1] > sh - 1:\n                dst_image[y, x, :] = 0\n                continue\n            # As the coordinates of the transformed pixel in the image will most likely not lie on a texel, we have to choose how to\n            # calculate the pixel colors depending on the next texels\n            # there are three different texture interpolation methods: area, bilinear and nearest neighbour\n            # print y, x, src_texel\n            # nearest neighbour\n            if mapping_type == 'nearest':\n                dst_image[y, x, :] = src_image[int(round(src_texel[1])), int(round(src_texel[0])), :]\n            # bilinear\n            elif mapping_type == 'bilinear':\n                # next 4 pixels\n                ul = src_image[int(np.floor(src_texel[1])), int(np.floor(src_texel[0])), :]\n                ur = src_image[int(np.floor(src_texel[1])), int(np.ceil(src_texel[0])), :]\n                dl = src_image[int(np.ceil(src_texel[1])), int(np.floor(src_texel[0])), :]\n                dr = src_image[int(np.ceil(src_texel[1])), int(np.ceil(src_texel[0])), :]\n\n                yd = src_texel[1] - np.floor(src_texel[1])\n                xd = src_texel[0] - np.floor(src_texel[0])\n                dst_image[y, x, :] = ul * (1 - xd) * (1 - yd) + ur * xd * (1 
- yd) + dl * (1 - xd) * yd + dr * xd * yd\n\n    return dst_image\n\n\ndef get_depth_buffer(vertices, triangles, h, w):\n    '''\n    Args:\n        vertices: 3 x nver\n        triangles: 3 x ntri\n        h: height\n        w: width\n    Returns:\n        depth_buffer: height x width\n    ToDo:\n        should x and y be offset by 0.5 to sample the pixel center?\n        m3 seems to be wrong somewhere\n    # Each triangle has 3 vertices & Each vertex has 3 coordinates x, y, z.\n    # Here, the bigger the z, the closer the point to the viewer.\n    '''\n    # initial\n    depth_buffer = np.zeros([h, w]) - 999999.  #+ np.min(vertices[2,:]) - 999999. # set the initial z to the farthest position\n\n    ## calculate the depth(z) of each triangle\n    #-m1. z = the center of sphere(through 3 vertices)\n    #center3d = (vertices[:, triangles[0,:]] + vertices[:,triangles[1,:]] + vertices[:, triangles[2,:]])/3.\n    #tri_depth = np.sum(center3d**2, axis = 0)\n    #-m2. z = the center of z(v0, v1, v2)\n    tri_depth = (vertices[2, triangles[0, :]] + vertices[2, triangles[1, :]] + vertices[2, triangles[2, :]]) / 3.\n\n    for i in range(triangles.shape[1]):\n        tri = triangles[:, i]  # 3 vertex indices\n\n        # the inner bounding box\n        umin = max(int(np.ceil(np.min(vertices[0, tri]))), 0)\n        umax = min(int(np.floor(np.max(vertices[0, tri]))), w - 1)\n\n        vmin = max(int(np.ceil(np.min(vertices[1, tri]))), 0)\n        vmax = min(int(np.floor(np.max(vertices[1, tri]))), h - 1)\n\n        if umax < umin or vmax < vmin:\n            continue\n\n        for u in range(umin, umax + 1):\n            for v in range(vmin, vmax + 1):\n                #-m3. calculate the accurate depth(z) of each pixel by barycentric weights\n                #w0, w1, w2 = weightsOfpoint([u,v], vertices[:2, tri])\n                #tri_depth = w0*vertices[2,tri[0]] + w1*vertices[2,tri[1]] + w2*vertices[2,tri[2]]\n                if tri_depth[i] > depth_buffer[v, u]:  # and is_pointIntri([u,v], vertices[:2, tri]):\n                    depth_buffer[v, u] = tri_depth[i]\n\n    return depth_buffer\n\n\ndef get_triangle_buffer(vertices, triangles, h, w):\n    '''\n    Args:\n        vertices: 3 x nver\n        triangles: 3 x ntri\n        h: height\n        w: width\n    Returns:\n        triangle_buffer: height x width. the triangle index of each pixel; -1 if none\n    ToDo:\n        should x and y be offset by 0.5 to sample the pixel center?\n        m3 seems to be wrong somewhere\n    # Each triangle has 3 vertices & Each vertex has 3 coordinates x, y, z.\n    # Here, the bigger the z, the closer the point to the viewer.\n    '''\n    # initial\n    depth_buffer = np.zeros([h, w]) - 999999.  #+ np.min(vertices[2,:]) - 999999. # set the initial z to the farthest position\n    triangle_buffer = np.zeros_like(depth_buffer, dtype=np.int32) - 1  # if -1, the pixel has no triangle correspondence\n\n    ## calculate the depth(z) of each triangle\n    #-m1. z = the center of sphere(through 3 vertices)\n    #center3d = (vertices[:, triangles[0,:]] + vertices[:,triangles[1,:]] + vertices[:, triangles[2,:]])/3.\n    #tri_depth = np.sum(center3d**2, axis = 0)\n    #-m2. 
z = the center of z(v0, v1, v2)\n    tri_depth = (vertices[2, triangles[0, :]] + vertices[2, triangles[1, :]] + vertices[2, triangles[2, :]]) / 3.\n\n    for i in range(triangles.shape[1]):\n        tri = triangles[:, i]  # 3 vertex indices\n\n        # the inner bounding box\n        umin = max(int(np.ceil(np.min(vertices[0, tri]))), 0)\n        umax = min(int(np.floor(np.max(vertices[0, tri]))), w - 1)\n\n        vmin = max(int(np.ceil(np.min(vertices[1, tri]))), 0)\n        vmax = min(int(np.floor(np.max(vertices[1, tri]))), h - 1)\n\n        if umax < umin or vmax < vmin:\n            continue\n\n        for u in range(umin, umax + 1):\n            for v in range(vmin, vmax + 1):\n                #-m3. calculate the accurate depth(z) of each pixel by barycentric weights\n                #w0, w1, w2 = weightsOfpoint([u,v], vertices[:2, tri])\n                #tri_depth = w0*vertices[2,tri[0]] + w1*vertices[2,tri[1]] + w2*vertices[2,tri[2]]\n                if tri_depth[i] > depth_buffer[v, u] and isPointInTri([u, v], vertices[:2, tri]):\n                    depth_buffer[v, u] = tri_depth[i]\n                    triangle_buffer[v, u] = i\n\n    return triangle_buffer\n\n\ndef vis_of_vertices(vertices, triangles, h, w, depth_buffer=None):\n    '''\n    Args:\n        vertices: 3 x nver\n        triangles: 3 x ntri\n        depth_buffer: height x width\n    Returns:\n        vertices_vis: nver. 
the visibility of each vertex\n    '''\n    if depth_buffer is None:\n        depth_buffer = get_depth_buffer(vertices, triangles, h, w)\n\n    vertices_vis = np.zeros(vertices.shape[1], dtype=bool)\n\n    depth_tmp = np.zeros_like(depth_buffer) - 99999\n    for i in range(vertices.shape[1]):\n        vertex = vertices[:, i]\n\n        if np.floor(vertex[0]) < 0 or np.ceil(vertex[0]) > w - 1 or np.floor(vertex[1]) < 0 or np.ceil(\n                vertex[1]) > h - 1:\n            continue\n\n        # bilinear interp\n        # ul = depth_buffer[int(np.floor(vertex[1])), int(np.floor(vertex[0]))]\n        # ur = depth_buffer[int(np.floor(vertex[1])), int(np.ceil(vertex[0]))]\n        # dl = depth_buffer[int(np.ceil(vertex[1])), int(np.floor(vertex[0]))]\n        # dr = depth_buffer[int(np.ceil(vertex[1])), int(np.ceil(vertex[0]))]\n\n        # yd = vertex[1] - np.floor(vertex[1])\n        # xd = vertex[0] - np.floor(vertex[0])\n\n        # vertex_depth = ul*(1-xd)*(1-yd) + ur*xd*(1-yd) + dl*(1-xd)*yd + dr*xd*yd\n\n        # nearest\n        px = int(np.round(vertex[0]))\n        py = int(np.round(vertex[1]))\n\n        # if (vertex[2] > depth_buffer[ul[0], ul[1]]) & (vertex[2] > depth_buffer[ur[0], ur[1]]) & (vertex[2] > depth_buffer[dl[0], dl[1]]) & (vertex[2] > depth_buffer[dr[0], dr[1]]):\n        if vertex[2] < depth_tmp[py, px]:\n            continue\n\n        # if vertex[2] > depth_buffer[py, px]:\n        #     vertices_vis[i] = True\n        #     depth_tmp[py, px] = vertex[2]\n        # elif np.abs(vertex[2] - depth_buffer[py, px]) < 1:\n        #     vertices_vis[i] = True\n\n        threshold = 2  # needs tuning.\n        if np.abs(vertex[2] - depth_buffer[py, px]) < threshold:\n            # if np.abs(vertex[2] - vertex_depth) < threshold:\n            vertices_vis[i] = True\n            depth_tmp[py, px] = vertex[2]\n\n    return vertices_vis\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/render_app.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nfrom scipy import ndimage\nfrom utils.render import render_texture\nfrom utils.render import vis_of_vertices\n\n\ndef get_visibility(vertices, triangles, h, w):\n    triangles = triangles.T\n    vertices_vis = vis_of_vertices(vertices.T, triangles, h, w)\n    vertices_vis = vertices_vis.astype(bool)\n    for k in range(2):\n        tri_vis = vertices_vis[triangles[0, :]] | vertices_vis[triangles[1, :]] | vertices_vis[triangles[2, :]]\n        ind = triangles[:, tri_vis]\n        vertices_vis[ind] = True\n    # for k in range(2):\n    #     tri_vis = vertices_vis[triangles[0,:]] & vertices_vis[triangles[1,:]] & vertices_vis[triangles[2,:]]\n    #     ind = triangles[:, tri_vis]\n    #     vertices_vis[ind] = True\n    vertices_vis = vertices_vis.astype(np.float32)  #1 for visible and 0 for non-visible\n    return vertices_vis\n\n\ndef get_uv_mask(vertices_vis, triangles, uv_coords, h, w, resolution):\n    triangles = triangles.T\n    vertices_vis = vertices_vis.astype(np.float32)\n    uv_mask = render_texture(uv_coords.T, vertices_vis[np.newaxis, :], triangles, resolution, resolution, 1)\n    uv_mask = np.squeeze(uv_mask > 0)\n    uv_mask = ndimage.binary_closing(uv_mask)\n    uv_mask = ndimage.binary_erosion(uv_mask, structure=np.ones((4, 4)))\n    uv_mask = ndimage.binary_closing(uv_mask)\n    uv_mask = 
ndimage.binary_erosion(uv_mask, structure=np.ones((4, 4)))\n    uv_mask = ndimage.binary_erosion(uv_mask, structure=np.ones((4, 4)))\n    uv_mask = ndimage.binary_erosion(uv_mask, structure=np.ones((4, 4)))\n    uv_mask = uv_mask.astype(np.float32)\n\n    return np.squeeze(uv_mask)\n\n\ndef get_depth_image(vertices, triangles, h, w, isShow=False):\n    z = vertices[:, 2:]\n    if isShow:\n        z = z / np.max(z)\n    depth_image = render_texture(vertices.T, z.T, triangles.T, h, w, 1)\n    return np.squeeze(depth_image)\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/rotate_vertices.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\n\n\ndef frontalize(vertices):\n    canonical_vertices = np.load('Data/uv-data/canonical_vertices.npy')\n\n    vertices_homo = np.hstack((vertices, np.ones([vertices.shape[0], 1])))  #n x 4\n    P = np.linalg.lstsq(vertices_homo, canonical_vertices, rcond=None)[0].T  # Affine matrix. 3 x 4\n    front_vertices = vertices_homo.dot(P.T)\n\n    return front_vertices\n"
  },
  {
    "path": "modules/image/image_processing/prnet/utils/write.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport numpy as np\nfrom skimage.io import imsave\n\n\ndef write_asc(path, vertices):\n    '''\n    Args:\n        vertices: shape = (nver, 3)\n    '''\n    if path.split('.')[-1] == 'asc':\n        np.savetxt(path, vertices)\n    else:\n        np.savetxt(path + '.asc', vertices)\n\n\ndef write_obj_with_colors(obj_name, vertices, triangles, colors):\n    ''' Save 3D face model with texture represented by colors.\n    Args:\n        obj_name: str\n        vertices: shape = (nver, 3)\n        colors: shape = (nver, 3)\n        triangles: shape = (ntri, 3)\n    '''\n    triangles = triangles.copy()\n    triangles += 1  # meshlab start with 1\n\n    if obj_name.split('.')[-1] != 'obj':\n        obj_name = obj_name + '.obj'\n\n    # write obj\n    with open(obj_name, 'w') as f:\n\n        # write vertices & colors\n        for i in range(vertices.shape[0]):\n            # s = 'v {} {} {} \\n'.format(vertices[0,i], vertices[1,i], vertices[2,i])\n            s = 'v {} {} {} {} {} {}\\n'.format(vertices[i, 0], vertices[i, 1], vertices[i, 2], colors[i, 0],\n                                               colors[i, 1], colors[i, 2])\n            f.write(s)\n\n        # write f: ver ind/ uv ind\n        [k, ntri] = triangles.shape\n        for i in range(triangles.shape[0]):\n            # s = 'f {} {} {}\\n'.format(triangles[i, 0], 
triangles[i, 1], triangles[i, 2])\n            s = 'f {} {} {}\\n'.format(triangles[i, 2], triangles[i, 1], triangles[i, 0])\n            f.write(s)\n\n\ndef write_obj_with_texture(obj_name, vertices, triangles, texture, uv_coords):\n    ''' Save 3D face model with texture represented by texture map.\n    Ref: https://github.com/patrikhuber/eos/blob/bd00155ebae4b1a13b08bf5a991694d682abbada/include/eos/core/Mesh.hpp\n    Args:\n        obj_name: str\n        vertices: shape = (nver, 3)\n        triangles: shape = (ntri, 3)\n        texture: shape = (256,256,3)\n        uv_coords: shape = (nver, 3) max value<=1\n    '''\n    if obj_name.split('.')[-1] != 'obj':\n        obj_name = obj_name + '.obj'\n    mtl_name = obj_name.replace('.obj', '.mtl')\n    texture_name = obj_name.replace('.obj', '_texture.png')\n\n    triangles = triangles.copy()\n    triangles += 1  # mesh lab start with 1\n\n    # write obj\n    with open(obj_name, 'w') as f:\n        # first line: write mtlib(material library)\n        s = \"mtllib {}\\n\".format(os.path.abspath(mtl_name))\n        f.write(s)\n\n        # write vertices\n        for i in range(vertices.shape[0]):\n            s = 'v {} {} {}\\n'.format(vertices[i, 0], vertices[i, 1], vertices[i, 2])\n            f.write(s)\n\n        # write uv coords\n        for i in range(uv_coords.shape[0]):\n            s = 'vt {} {}\\n'.format(uv_coords[i, 0], 1 - uv_coords[i, 1])\n            f.write(s)\n\n        f.write(\"usemtl FaceTexture\\n\")\n\n        # write f: ver ind/ uv ind\n        for i in range(triangles.shape[0]):\n            # s = 'f {}/{} {}/{} {}/{}\\n'.format(triangles[i,0], triangles[i,0], triangles[i,1], triangles[i,1], triangles[i,2], triangles[i,2])\n            s = 'f {}/{} {}/{} {}/{}\\n'.format(triangles[i, 2], triangles[i, 2], triangles[i, 1], triangles[i, 1],\n                                               triangles[i, 0], triangles[i, 0])\n            f.write(s)\n\n    # write mtl\n    with open(mtl_name, 'w') as 
f:\n        f.write(\"newmtl FaceTexture\\n\")\n        s = 'map_Kd {}\\n'.format(os.path.abspath(texture_name))  # map to image\n        f.write(s)\n\n    # write texture as png\n    imsave(texture_name, texture)\n\n\ndef write_obj_with_colors_texture(obj_name, vertices, colors, triangles, texture, uv_coords):\n    ''' Save 3D face model with texture.\n    Ref: https://github.com/patrikhuber/eos/blob/bd00155ebae4b1a13b08bf5a991694d682abbada/include/eos/core/Mesh.hpp\n    Args:\n        obj_name: str\n        vertices: shape = (nver, 3)\n        colors: shape = (nver, 3)\n        triangles: shape = (ntri, 3)\n        texture: shape = (256,256,3)\n        uv_coords: shape = (nver, 3) max value<=1\n    '''\n    if obj_name.split('.')[-1] != 'obj':\n        obj_name = obj_name + '.obj'\n    mtl_name = obj_name.replace('.obj', '.mtl')\n    texture_name = obj_name.replace('.obj', '_texture.png')\n\n    triangles = triangles.copy()\n    triangles += 1  # mesh lab start with 1\n\n    # write obj\n    with open(obj_name, 'w') as f:\n        # first line: write mtlib(material library)\n        s = \"mtllib {}\\n\".format(os.path.abspath(mtl_name))\n        f.write(s)\n\n        # write vertices\n        for i in range(vertices.shape[0]):\n            s = 'v {} {} {} {} {} {}\\n'.format(vertices[i, 0], vertices[i, 1], vertices[i, 2], colors[i, 0],\n                                               colors[i, 1], colors[i, 2])\n            f.write(s)\n\n        # write uv coords\n        for i in range(uv_coords.shape[0]):\n            s = 'vt {} {}\\n'.format(uv_coords[i, 0], 1 - uv_coords[i, 1])\n            f.write(s)\n\n        f.write(\"usemtl FaceTexture\\n\")\n\n        # write f: ver ind/ uv ind\n        for i in range(triangles.shape[0]):\n            # s = 'f {}/{} {}/{} {}/{}\\n'.format(triangles[i,0], triangles[i,0], triangles[i,1], triangles[i,1], triangles[i,2], triangles[i,2])\n            s = 'f {}/{} {}/{} {}/{}\\n'.format(triangles[i, 2], triangles[i, 2], 
triangles[i, 1], triangles[i, 1],\n                                               triangles[i, 0], triangles[i, 0])\n            f.write(s)\n\n    # write mtl\n    with open(mtl_name, 'w') as f:\n        f.write(\"newmtl FaceTexture\\n\")\n        s = 'map_Kd {}\\n'.format(os.path.abspath(texture_name))  # map to image\n        f.write(s)\n\n    # write texture as png\n    imsave(texture_name, texture)\n"
  },
  {
    "path": "modules/image/image_processing/seeinthedark/README.md",
    "content": "# seeinthedark\n\n|模型名称|seeinthedark|\n| :--- | :---: |\n|类别|图像 - 暗光增强|\n|网络|ConvNet|\n|数据集|SID dataset|\n|是否支持Fine-tuning|否|\n|模型大小|120MB|\n|最新更新日期|2021-11-02|\n|数据指标|-|\n\n\n## 一、模型基本信息  \n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/142962370-a957d7b3-8050-4f5a-8462-3d6e49facb33.png\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/142962460-4a1b31ef-0eec-423b-ab3d-8622f3e8261a.png\"  width = \"450\" height = \"300\" hspace='10'/>\n    <br />\n    输出图像\n     <br />\n    </p>\n\n- ### 模型介绍\n\n  - 通过大量暗光条件下短曝光和长曝光组成的图像对，以RAW图像为输入，RGB图像为参照进行训练，该模型实现端到端直接将暗光下的RAW图像处理得到可见的RGB图像。\n\n  - 更多详情参考：[Learning to See in the Dark](http://cchen156.github.io/paper/18CVPR_SID.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n  - rawpy\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install seeinthedark\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a raw(Sony, .ARW) file\n    $ hub run seeinthedark --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现暗光增强模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    denoiser = hub.Module(name=\"seeinthedark\")\n    input_path = \"/PATH/TO/IMAGE\"\n    # Read from a raw file\n    denoiser.denoising(paths=[input_path], output_dir='./denoising_result', use_gpu=True)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def denoising(images=None, paths=None, output_dir='./denoising_result/', use_gpu=False, visualization=True)\n    ```\n    - 暗光增强API，完成对暗光RAW图像的降噪并处理生成RGB图像。\n\n    - **参数**\n      - images 
(list\\[numpy.ndarray\\]): 输入的图像，单通道的马赛克图像; <br/>\n      - paths (list\\[str\\]): 暗光图像文件的路径，Sony的RAW格式；<br/>\n      - output\\_dir (str): 结果保存的路径； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - visualization(bool): 是否保存结果到本地文件夹\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线暗光增强服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m seeinthedark\n    ```\n\n  - 这样就完成了一个暗光增强的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import rawpy\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(rawpy.imread(\"/PATH/TO/IMAGE\").raw_image_visible)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/seeinthedark/\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install seeinthedark==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/image_processing/seeinthedark/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport argparse\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport numpy as np\nimport rawpy\nimport cv2\n\nfrom .util import base64_to_cv2\n\n\ndef pack_raw(raw):\n    # pack Bayer image to 4 channels\n    im = raw\n    if not isinstance(raw, np.ndarray):\n        im = raw.raw_image_visible.astype(np.float32)\n    im = np.maximum(im - 512, 0) / (16383 - 512)  # subtract the black level\n\n    im = np.expand_dims(im, axis=2)\n    img_shape = im.shape\n    H = img_shape[0]\n    W = img_shape[1]\n\n    out = np.concatenate((im[0:H:2, 0:W:2, :], im[0:H:2, 1:W:2, :], im[1:H:2, 1:W:2, :], im[1:H:2, 0:W:2, :]), axis=2)\n    return out\n\n\n@moduleinfo(\n    name=\"seeinthedark\", type=\"CV/denoising\", author=\"paddlepaddle\", author_email=\"\", summary=\"\", version=\"1.0.0\")\nclass LearningToSeeInDark:\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"pd_model/inference_model\")\n        self.cpu_have_loaded = False\n        self.gpu_have_loaded = False\n\n    def set_device(self, use_gpu=False):\n        if use_gpu == False:\n            if not self.cpu_have_loaded:\n                exe = paddle.static.Executor(paddle.CPUPlace())\n                [prog, inputs, outputs] = paddle.static.load_inference_model(\n             
       path_prefix=self.pretrained_model,\n                    executor=exe,\n                    model_filename=\"model.pdmodel\",\n                    params_filename=\"model.pdiparams\")\n                self.cpuexec, self.cpuprog, self.cpuinputs, self.cpuoutputs = exe, prog, inputs, outputs\n                self.cpu_have_loaded = True\n\n            return self.cpuexec, self.cpuprog, self.cpuinputs, self.cpuoutputs\n\n        else:\n            if not self.gpu_have_loaded:\n                exe = paddle.static.Executor(paddle.CUDAPlace(0))\n                [prog, inputs, outputs] = paddle.static.load_inference_model(\n                    path_prefix=self.pretrained_model,\n                    executor=exe,\n                    model_filename=\"model.pdmodel\",\n                    params_filename=\"model.pdiparams\")\n                self.gpuexec, self.gpuprog, self.gpuinputs, self.gpuoutputs = exe, prog, inputs, outputs\n                self.gpu_have_loaded = True\n\n            return self.gpuexec, self.gpuprog, self.gpuinputs, self.gpuoutputs\n\n    def denoising(self,\n                  images: list = None,\n                  paths: list = None,\n                  output_dir: str = './denoising_result/',\n                  use_gpu: bool = False,\n                  visualization: bool = True):\n        '''\n        Denoise a raw image in the low-light scene.\n\n        images (list[numpy.ndarray]): data of images, shape of each is [H, W]; each must be a single-channel Bayer image captured by the camera.\n        paths (list[str]): paths to images\n        output_dir: the dir to save the results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        visualization: if True, save results in output_dir.\n        '''\n        results = []\n        paddle.enable_static()\n        exe, prog, inputs, outputs = self.set_device(use_gpu)\n\n        if images is not None:\n            for raw in images:\n                input_full = 
np.expand_dims(pack_raw(raw), axis=0) * 300\n                px = input_full.shape[1] // 512\n                py = input_full.shape[2] // 512\n                rx, ry = px * 512, py * 512\n                input_full = input_full[:, :rx, :ry, :]\n                output = np.random.randn(rx * 2, ry * 2, 3)\n                input_full = np.minimum(input_full, 1.0)\n                for i in range(px):\n                    for j in range(py):\n                        input_patch = input_full[:, i * 512:i * 512 + 512, j * 512:j * 512 + 512, :]\n                        result = exe.run(prog, feed={inputs[0]: input_patch}, fetch_list=outputs)\n                        output[i * 512 * 2:i * 512 * 2 + 512 * 2, j * 512 * 2:j * 512 * 2 + 512 * 2, :] = result[0][0]\n                output = np.minimum(np.maximum(output, 0), 1)\n                output = output * 255\n                output = np.clip(output, 0, 255)\n                output = output.astype('uint8')\n                results.append(output)\n        if paths != None:\n            for path in paths:\n                raw = rawpy.imread(path)\n                input_full = np.expand_dims(pack_raw(raw), axis=0) * 300\n                px = input_full.shape[1] // 512\n                py = input_full.shape[2] // 512\n                rx, ry = px * 512, py * 512\n                input_full = input_full[:, :rx, :ry, :]\n                output = np.random.randn(rx * 2, ry * 2, 3)\n                input_full = np.minimum(input_full, 1.0)\n                for i in range(px):\n                    for j in range(py):\n                        input_patch = input_full[:, i * 512:i * 512 + 512, j * 512:j * 512 + 512, :]\n                        result = exe.run(prog, feed={inputs[0]: input_patch}, fetch_list=outputs)\n                        output[i * 512 * 2:i * 512 * 2 + 512 * 2, j * 512 * 2:j * 512 * 2 + 512 * 2, :] = result[0][0]\n                output = np.minimum(np.maximum(output, 0), 1)\n                output = output * 255\n 
               output = np.clip(output, 0, 255)\n                output = output.astype('uint8')\n                results.append(output)\n\n        if visualization == True:\n            if not os.path.exists(output_dir):\n                os.makedirs(output_dir, exist_ok=True)\n            for i, out in enumerate(results):\n                cv2.imwrite(os.path.join(output_dir, 'output_{}.png'.format(i)), out[:, :, ::-1])\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        self.denoising(\n            paths=[self.args.input_path],\n            output_dir=self.args.output_dir,\n            use_gpu=self.args.use_gpu,\n            visualization=self.args.visualization)\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.denoising(images=images_decode, **kwargs)\n        tolist = [result.tolist() for result in results]\n        return tolist\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        
self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='denoising_result', help='output directory for saving result.')\n        self.arg_config_group.add_argument('--visualization', type=bool, default=False, help='save results or not.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to input raw image, should be raw file captured by camera.\")\n"
  },
  {
    "path": "modules/image/image_processing/seeinthedark/requirements.txt",
    "content": "rawpy\n"
  },
  {
    "path": "modules/image/industrial_application/meter_readings/barometer_reader/README.md",
    "content": "## 模型概述\n\nbarometer_reader是基于PaddleX实现对传统机械式指针表计的检测与自动读数功能的模型。该模型首先使用目标检测模型检测出图像中的表计，随后使用语义分割模型将各表计的指针和刻度分割，最后根据指针的相对位置和预知的量程计算出各表计的读数。\n\n## 命令行预测\n\n```\n$ hub run barometer_reader --input_path \"/PATH/TO/IMAGE\"\n\n```\n\n## API\n\n```python\ndef predict(self,\n            im_file: Union[str, np.ndarray],\n            score_threshold: float = 0.5,\n            seg_batch_size: int = 2,\n            erode_kernel: int = 4,\n            use_erode: bool = True,\n            visualization: bool = False,\n            save_dir: str = 'output'):\n```\n\n预测API，用于表计读数。\n\n**参数**\n\n* im_file (Union\\[str, np.ndarray\\]): 图片路径或者图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n* score\\_threshold (float): 检测模型输出结果中，预测得分低于该阈值的框将被滤除，默认值为0.5；\n* seg\\_batch\\_size (int): 分割的批量大小，默认为2；\n* erode\\_kernel (int): 图像腐蚀操作时的卷积核大小，默认值为4；\n* use\\_erode (bool): 是否使用图像腐蚀对分割预测图进行细分，默认为True；\n* visualization (bool): 是否将可视化图片保存；\n* save_dir (str): 保存图片到路径， 默认为\"output\"。\n\n**返回**\n\n* res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'category\\_id', 'bbox', 'score', 'category', 对应的取值为：\n  * category\\_id (int): bounding box框出的图片的类别号；\n  * bbox (list): bounding box数值；\n  * score (float): bbox类别得分；\n  * category (str): bounding box框出的图片的类别名称。\n\n\n## 代码示例\n\n```python\nimport cv2\nimport paddlehub as hub\n\nmodel = hub.Module(name='barometer_reader')\nres = model.predict('/PATH/TO/IMAGE')\nprint(res)\n```\n\n## 服务部署\n\nPaddleHub Serving可以部署一个表计识别的在线服务。\n\n## 第一步：启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m barometer_reader\n```\n\n默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n## 第二步：发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # 获取图片的base64编码格式\n    img = 
cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))\n    data = {'image': img}\n    # 指定content-type\n    headers = {\"Content-type\": \"application/json\"}\n    # 发送HTTP请求\n    url = \"http://127.0.0.1:8866/predict/barometer_reader\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    # 打印预测结果\n    print(r.json())\n```\n\n## 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleX\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\npaddlex >= 1.3.0\n"
  },
  {
    "path": "modules/image/industrial_application/meter_readings/barometer_reader/module.py",
    "content": "import math\nimport os\nimport base64\nfrom typing import Union\n\nimport argparse\nimport cv2\nimport numpy as np\nimport paddlehub as hub\nimport paddlex as pdx\nimport paddle.nn as nn\nfrom paddlex.seg import transforms as T\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nMETER_SHAPE = 512\nCIRCLE_CENTER = [256, 256]\nCIRCLE_RADIUS = 250\nPI = 3.1415926536\nLINE_HEIGHT = 120\nLINE_WIDTH = 1570\nTYPE_THRESHOLD = 40\nMETER_CONFIG = [{\n    'scale_value': 25.0 / 50.0,\n    'range': 25.0,\n    'unit': \"(MPa)\"\n}, {\n    'scale_value': 1.6 / 32.0,\n    'range': 1.6,\n    'unit': \"(MPa)\"\n}]\n\n\ndef base64_to_cv2(b64str: str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image: np.ndarray):\n    # return base64.b64encode(image)\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\n@moduleinfo(\n    name=\"barometer_reader\",\n    type=\"CV/image_editing\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\n    \"meter_reader implements the detection and automatic reading of traditional mechanical pointer meters based on Meter detection and  pointer segmentation.\",\n    version=\"1.0.0\")\nclass BarometerReader(nn.Layer):\n    def __init__(self):\n        super(BarometerReader, self).__init__()\n        self.detector = pdx.load_model(os.path.join(self.directory, 'meter_det_inference_model'))\n        self.segmenter = pdx.load_model(os.path.join(self.directory, 'meter_seg_inference_model'))\n        self.seg_transform = T.Compose([T.Normalize()])\n\n    def read_process(self, label_maps: np.ndarray):\n        line_images = self.creat_line_image(label_maps)\n        scale_data, pointer_data = self.convert_1d_data(line_images)\n        self.scale_mean_filtration(scale_data)\n        result = 
self.get_meter_reader(scale_data, pointer_data)\n        return result\n\n    def creat_line_image(self, meter_image: np.ndarray):\n        line_image = np.zeros((LINE_HEIGHT, LINE_WIDTH), dtype=np.uint8)\n        for row in range(LINE_HEIGHT):\n            for col in range(LINE_WIDTH):\n                theta = PI * 2 / LINE_WIDTH * (col + 1)\n                rho = CIRCLE_RADIUS - row - 1\n                x = int(CIRCLE_CENTER[0] + rho * math.cos(theta) + 0.5)\n                y = int(CIRCLE_CENTER[1] - rho * math.sin(theta) + 0.5)\n                line_image[row, col] = meter_image[x, y]\n        return line_image\n\n    def convert_1d_data(self, meter_image: np.ndarray):\n        scale_data = np.zeros((LINE_WIDTH), dtype=np.uint8)\n        pointer_data = np.zeros((LINE_WIDTH), dtype=np.uint8)\n        for col in range(LINE_WIDTH):\n            for row in range(LINE_HEIGHT):\n                if meter_image[row, col] == 1:\n                    pointer_data[col] += 1\n                elif meter_image[row, col] == 2:\n                    scale_data[col] += 1\n        return scale_data, pointer_data\n\n    def scale_mean_filtration(self, scale_data: np.ndarray):\n        mean_data = np.mean(scale_data)\n        for col in range(LINE_WIDTH):\n            if scale_data[col] < mean_data:\n                scale_data[col] = 0\n\n    def get_meter_reader(self, scale_data: np.ndarray, pointer_data: np.ndarray):\n        scale_flag = False\n        pointer_flag = False\n        one_scale_start = 0\n        one_scale_end = 0\n        one_pointer_start = 0\n        one_pointer_end = 0\n        scale_location = list()\n        pointer_location = 0\n        for i in range(LINE_WIDTH - 1):\n            if scale_data[i] > 0 and scale_data[i + 1] > 0:\n                if scale_flag == False:\n                    one_scale_start = i\n                    scale_flag = True\n            if scale_flag:\n                if scale_data[i] == 0 and scale_data[i + 1] == 0:\n                   
 one_scale_end = i - 1\n                    one_scale_location = (one_scale_start + one_scale_end) / 2\n                    scale_location.append(one_scale_location)\n                    one_scale_start = 0\n                    one_scale_end = 0\n                    scale_flag = False\n            if pointer_data[i] > 0 and pointer_data[i + 1] > 0:\n                if pointer_flag == False:\n                    one_pointer_start = i\n                    pointer_flag = True\n            if pointer_flag:\n                if pointer_data[i] == 0 and pointer_data[i + 1] == 0:\n                    one_pointer_end = i - 1\n                    pointer_location = (one_pointer_start + one_pointer_end) / 2\n                    one_pointer_start = 0\n                    one_pointer_end = 0\n                    pointer_flag = False\n\n        scale_num = len(scale_location)\n        scales = -1\n        ratio = -1\n        if scale_num > 0:\n            for i in range(scale_num - 1):\n                if scale_location[i] <= pointer_location and pointer_location < scale_location[i + 1]:\n                    scales = i + (pointer_location - scale_location[i]) / (\n                        scale_location[i + 1] - scale_location[i] + 1e-05) + 1\n            ratio = (pointer_location - scale_location[0]) / (scale_location[scale_num - 1] - scale_location[0] + 1e-05)\n        result = {'scale_num': scale_num, 'scales': scales, 'ratio': ratio}\n        return result\n\n    def predict(self,\n                im_file: Union[str, np.ndarray],\n                score_threshold: float = 0.5,\n                seg_batch_size: int = 2,\n                erode_kernel: int = 4,\n                use_erode: bool = True,\n                visualization: bool = False,\n                save_dir: str = 'output'):\n\n        if isinstance(im_file, str):\n            im = cv2.imread(im_file).astype('float32')\n        else:\n            im = im_file.copy()\n        det_results = self.detector.predict(im)\n 
       filtered_results = list()\n        for res in det_results:\n            if res['score'] > score_threshold:\n                filtered_results.append(res)\n\n        resized_meters = list()\n\n        for res in filtered_results:\n            xmin, ymin, w, h = res['bbox']\n            xmin = max(0, int(xmin))\n            ymin = max(0, int(ymin))\n            xmax = min(im.shape[1], int(xmin + w - 1))\n            ymax = min(im.shape[0], int(ymin + h - 1))\n            sub_image = im[ymin:(ymax + 1), xmin:(xmax + 1), :]\n\n            # Resize the image with shape (METER_SHAPE, METER_SHAPE)\n            meter_shape = sub_image.shape\n            scale_x = float(METER_SHAPE) / float(meter_shape[1])\n            scale_y = float(METER_SHAPE) / float(meter_shape[0])\n            meter_meter = cv2.resize(sub_image, None, None, fx=scale_x, fy=scale_y, interpolation=cv2.INTER_LINEAR)\n            meter_meter = meter_meter.astype('float32')\n            resized_meters.append(meter_meter)\n\n        meter_num = len(resized_meters)\n        seg_results = list()\n        for i in range(0, meter_num, seg_batch_size):\n            im_size = min(meter_num, i + seg_batch_size)\n            meter_images = list()\n            for j in range(i, im_size):\n                meter_images.append(resized_meters[j])\n\n            result = self.segmenter.batch_predict(transforms=self.seg_transform, img_file_list=meter_images)\n\n            if use_erode:\n                kernel = np.ones((erode_kernel, erode_kernel), np.uint8)\n                for k in range(len(result)):\n                    result[k]['label_map'] = cv2.erode(result[k]['label_map'], kernel)\n            seg_results.extend(result)\n\n        results = list()\n        for i, seg_result in enumerate(seg_results):\n            result = self.read_process(seg_result['label_map'])\n            results.append(result)\n\n        meter_values = list()\n        for i, result in enumerate(results):\n            if 
result['scale_num'] > TYPE_THRESHOLD:\n                value = result['scales'] * METER_CONFIG[0]['scale_value']\n            else:\n                value = result['scales'] * METER_CONFIG[1]['scale_value']\n            meter_values.append(value)\n            print(\"-- Meter {} -- result: {} --\\n\".format(i, value))\n        # visualize the results\n        visual_results = list()\n        for i, res in enumerate(filtered_results):\n            # Use `score` to represent the meter value\n            res['score'] = meter_values[i]\n            visual_results.append(res)\n        if visualization:\n            pdx.det.visualize(im_file, visual_results, -1, save_dir=save_dir)\n\n        return visual_results\n\n    @serving\n    def serving_method(self, image: str, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = base64_to_cv2(image)\n        results = self.predict(im_file=images_decode, **kwargs)\n        res = []\n        for result in results:\n            result['category_id'] = int(result['category_id'])\n            result['bbox'] = [float(value) for value in result['bbox']]\n            result['category'] = str(result['category'])\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(im_file=args.input_path)\n        return results\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/industrial_application/meter_readings/barometer_reader/requirements.txt",
    "content": "paddlex == 1.3.0\n"
  },
  {
    "path": "modules/image/instance_segmentation/solov2/README.md",
    "content": "# solov2\n\n| 模型名称            |    solov2     |\n| :------------------ | :-----------: |\n| 类别                | 图像-实例分割 |\n| 网络                |       -       |\n| 数据集              |   COCO2014    |\n| 是否支持Fine-tuning |      否       |\n| 模型大小            |     165M      |\n| 最新更新日期        |  2021-02-26   |\n| 数据指标            |       -       |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n<div align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133250741-83040204-acfc-4348-af90-acac74f40cd8.png\"   height = \"300\" />\n</div>\n\n- ### 模型介绍\n  - solov2是基于\"SOLOv2: Dynamic, Faster and Stronger\"实现的快速实例分割的模型。该模型基于SOLOv1, 并且针对mask的检测效果和运行效率进行改进，在实例分割任务中表现优秀。相对语义分割，实例分割需要标注出图上同一物体的不同个体。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install solov2\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run solov2 --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    img = cv2.imread('/PATH/TO/IMAGE')\n    model = hub.Module(name='solov2', use_gpu=False)\n    output = model.predict(image=img, visualization=False)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict(image: Union[str, np.ndarray],\n                threshold: float = 0.5,\n                visualization: bool = False,\n                save_dir: str = 'solov2_result'):\n    ```\n\n    - 预测API，实例分割。\n    - **参数**\n      - image (Union[str, np.ndarray]): 图片路径或者图片数据，ndarray.shape 为 [H, W, 
C]，BGR格式；\n      - threshold (float): 检测模型输出结果中，预测得分低于该阈值的框将被滤除，默认值为0.5；\n      - visualization (bool): 是否将可视化图片保存；\n      - save_dir (str): 保存图片到路径， 默认为\"solov2_result\"。\n    - **返回**\n      - res (dict): 识别结果，关键字有 'segm', 'label', 'score'对应的取值为：\n        - segm (np.ndarray): 实例分割结果,取值为0或1。0表示背景，1为实例；\n        - label (list): 实例分割结果类别id；\n        - score (list):实例分割结果类别得分；\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个实例分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m solov2 -p 8866\n    ```\n\n  - 这样就完成了一个实例分割的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    h, w, c = org_im.shape\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/solov2\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    seg = base64.b64decode(r.json()[\"results\"]['segm'].encode('utf8'))\n    seg = np.fromstring(seg, dtype=np.int32).reshape((-1, h, w))\n\n    label = base64.b64decode(r.json()[\"results\"]['label'].encode('utf8'))\n    label = np.fromstring(label, dtype=np.int64)\n\n    score = base64.b64decode(r.json()[\"results\"]['score'].encode('utf8'))\n    score = np.fromstring(score, dtype=np.float32)\n\n    print('seg', seg)\n    print('label', label)\n    print('score', score)\n    
```\n\n- ### Gradio App 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/solov2 在浏览器中访问 solov2 的 Gradio App。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  适配 PaddlePaddle 2.2.0+\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install solov2==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/instance_segmentation/solov2/data_feed.py",
    "content": "import base64\n\nimport cv2\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom paddle.inference import PrecisionType\nfrom PIL import Image\nfrom PIL import ImageDraw\n\n\ndef create_inputs(im, im_info):\n    \"\"\"generate input for different model type\n    Args:\n        im (np.ndarray): image (np.ndarray)\n        im_info (dict): info of image\n    Returns:\n        inputs (dict): input of model\n    \"\"\"\n    inputs = {}\n    inputs['image'] = im\n    origin_shape = list(im_info['origin_shape'])\n    resize_shape = list(im_info['resize_shape'])\n    pad_shape = list(im_info['pad_shape']) if im_info['pad_shape'] is not None else list(im_info['resize_shape'])\n    scale_x, scale_y = im_info['scale']\n    scale = scale_x\n    im_info = np.array([resize_shape + [scale]]).astype('float32')\n    inputs['im_info'] = im_info\n    inputs['scale_factor'] = np.array([scale_x, scale_x]).astype('float32').reshape(-1, 2)\n    inputs['im_shape'] = np.array(resize_shape).astype('float32').reshape(-1, 2)\n    return inputs\n\n\ndef visualize_box_mask(im, results, labels=None, mask_resolution=14, threshold=0.5):\n    \"\"\"\n    Args:\n        im (str/np.ndarray): path of image/np.ndarray read by cv2\n        results (dict): include 'boxes': np.ndarray: shape:[N,6], N: number of box,\n                        matix element:[class, score, x_min, y_min, x_max, y_max]\n                        MaskRCNN's results include 'masks': np.ndarray:\n                        shape:[N, class_num, mask_resolution, mask_resolution]\n        labels (list): labels:['class1', ..., 'classn']\n        mask_resolution (int): shape of a mask is:[mask_resolution, mask_resolution]\n        threshold (float): Threshold of score.\n    Returns:\n        im (PIL.Image.Image): visualized image\n    \"\"\"\n    if not labels:\n        labels = [\n            'background', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 
'train', 'truck', 'boat',\n            'traffic light', 'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird', 'cat', 'dog', 'horse',\n            'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie',\n            'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball', 'kite', 'baseball bat', 'baseball glove',\n            'skateboard', 'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup', 'fork', 'knife', 'spoon', 'bowl',\n            'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza', 'donut', 'cake', 'chair',\n            'couch', 'potted plant', 'bed', 'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard',\n            'cell phone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors',\n            'teddy bear', 'hair drier', 'toothbrush'\n        ]\n    if isinstance(im, str):\n        im = Image.open(im).convert('RGB')\n    else:\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im = Image.fromarray(im)\n    if 'masks' in results and 'boxes' in results:\n        im = draw_mask(im, results['boxes'], results['masks'], labels, resolution=mask_resolution)\n    if 'boxes' in results:\n        im = draw_box(im, results['boxes'], labels)\n    if 'segm' in results:\n        im = draw_segm(im, results['segm'], results['label'], results['score'], labels, threshold=threshold)\n    return im\n\n\ndef get_color_map_list(num_classes):\n    \"\"\"\n    Args:\n        num_classes (int): number of classes\n    Returns:\n        color_map (list): RGB color list\n    \"\"\"\n    color_map = num_classes * [0, 0, 0]\n    for i in range(0, num_classes):\n        j = 0\n        lab = i\n        while lab:\n            color_map[i * 3] |= (((lab >> 0) & 1) << (7 - j))\n            color_map[i * 3 + 1] |= (((lab >> 1) & 1) << (7 - j))\n            color_map[i * 3 + 2] |= (((lab >> 2) & 1) << (7 - j))\n            j += 1\n      
      lab >>= 3\n    color_map = [color_map[i:i + 3] for i in range(0, len(color_map), 3)]\n    return color_map\n\n\ndef expand_boxes(boxes, scale=0.0):\n    \"\"\"\n    Args:\n        boxes (np.ndarray): shape:[N,4], N:number of box,\n                            matix element:[x_min, y_min, x_max, y_max]\n        scale (float): scale of boxes\n    Returns:\n        boxes_exp (np.ndarray): expanded boxes\n    \"\"\"\n    w_half = (boxes[:, 2] - boxes[:, 0]) * .5\n    h_half = (boxes[:, 3] - boxes[:, 1]) * .5\n    x_c = (boxes[:, 2] + boxes[:, 0]) * .5\n    y_c = (boxes[:, 3] + boxes[:, 1]) * .5\n    w_half *= scale\n    h_half *= scale\n    boxes_exp = np.zeros(boxes.shape)\n    boxes_exp[:, 0] = x_c - w_half\n    boxes_exp[:, 2] = x_c + w_half\n    boxes_exp[:, 1] = y_c - h_half\n    boxes_exp[:, 3] = y_c + h_half\n    return boxes_exp\n\n\ndef draw_mask(im, np_boxes, np_masks, labels, resolution=14, threshold=0.5):\n    \"\"\"\n    Args:\n        im (PIL.Image.Image): PIL image\n        np_boxes (np.ndarray): shape:[N,6], N: number of box,\n                               matix element:[class, score, x_min, y_min, x_max, y_max]\n        np_masks (np.ndarray): shape:[N, class_num, resolution, resolution]\n        labels (list): labels:['class1', ..., 'classn']\n        resolution (int): shape of a mask is:[resolution, resolution]\n        threshold (float): threshold of mask\n    Returns:\n        im (PIL.Image.Image): visualized image\n    \"\"\"\n    color_list = get_color_map_list(len(labels))\n    scale = (resolution + 2.0) / resolution\n    im_w, im_h = im.size\n    w_ratio = 0.4\n    alpha = 0.7\n    im = np.array(im).astype('float32')\n    rects = np_boxes[:, 2:]\n    expand_rects = expand_boxes(rects, scale)\n    expand_rects = expand_rects.astype(np.int32)\n    clsid_scores = np_boxes[:, 0:2]\n    padded_mask = np.zeros((resolution + 2, resolution + 2), dtype=np.float32)\n    clsid2color = {}\n    for idx in range(len(np_boxes)):\n        clsid, score = 
clsid_scores[idx].tolist()\n        clsid = int(clsid)\n        xmin, ymin, xmax, ymax = expand_rects[idx].tolist()\n        w = xmax - xmin + 1\n        h = ymax - ymin + 1\n        w = np.maximum(w, 1)\n        h = np.maximum(h, 1)\n        padded_mask[1:-1, 1:-1] = np_masks[idx, int(clsid), :, :]\n        resized_mask = cv2.resize(padded_mask, (w, h))\n        resized_mask = np.array(resized_mask > threshold, dtype=np.uint8)\n        x0 = min(max(xmin, 0), im_w)\n        x1 = min(max(xmax + 1, 0), im_w)\n        y0 = min(max(ymin, 0), im_h)\n        y1 = min(max(ymax + 1, 0), im_h)\n        im_mask = np.zeros((im_h, im_w), dtype=np.uint8)\n        im_mask[y0:y1, x0:x1] = resized_mask[(y0 - ymin):(y1 - ymin), (x0 - xmin):(x1 - xmin)]\n        if clsid not in clsid2color:\n            clsid2color[clsid] = color_list[clsid]\n        color_mask = clsid2color[clsid]\n        for c in range(3):\n            color_mask[c] = color_mask[c] * (1 - w_ratio) + w_ratio * 255\n        idx = np.nonzero(im_mask)\n        color_mask = np.array(color_mask)\n        im[idx[0], idx[1], :] *= 1.0 - alpha\n        im[idx[0], idx[1], :] += alpha * color_mask\n    return Image.fromarray(im.astype('uint8'))\n\n\ndef draw_box(im, np_boxes, labels):\n    \"\"\"\n    Args:\n        im (PIL.Image.Image): PIL image\n        np_boxes (np.ndarray): shape:[N,6], N: number of box,\n                               matix element:[class, score, x_min, y_min, x_max, y_max]\n        labels (list): labels:['class1', ..., 'classn']\n    Returns:\n        im (PIL.Image.Image): visualized image\n    \"\"\"\n    draw_thickness = min(im.size) // 320\n    draw = ImageDraw.Draw(im)\n    clsid2color = {}\n    color_list = get_color_map_list(len(labels))\n\n    for dt in np_boxes:\n        clsid, bbox, score = int(dt[0]), dt[2:], dt[1]\n        xmin, ymin, xmax, ymax = bbox\n        w = xmax - xmin\n        h = ymax - ymin\n        if clsid not in clsid2color:\n            clsid2color[clsid] = 
color_list[clsid]\n        color = tuple(clsid2color[clsid])\n\n        # draw bbox\n        draw.line([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin), (xmin, ymin)],\n                  width=draw_thickness,\n                  fill=color)\n\n        # draw label\n        text = \"{} {:.4f}\".format(labels[clsid], score)\n        tw, th = draw.textsize(text)\n        draw.rectangle([(xmin + 1, ymin - th), (xmin + tw + 1, ymin)], fill=color)\n        draw.text((xmin + 1, ymin - th), text, fill=(255, 255, 255))\n    return im\n\n\ndef draw_segm(im, np_segms, np_label, np_score, labels, threshold=0.5, alpha=0.7):\n    \"\"\"\n    Draw segmentation on image.\n    \"\"\"\n    mask_color_id = 0\n    w_ratio = .4\n    color_list = get_color_map_list(len(labels))\n    im = np.array(im).astype('float32')\n    clsid2color = {}\n    np_segms = np_segms.astype(np.uint8)\n\n    for i in range(np_segms.shape[0]):\n        mask, score, clsid = np_segms[i], np_score[i], np_label[i] + 1\n        if score < threshold:\n            continue\n        if clsid not in clsid2color:\n            clsid2color[clsid] = color_list[clsid]\n        color_mask = clsid2color[clsid]\n        for c in range(3):\n            color_mask[c] = color_mask[c] * (1 - w_ratio) + w_ratio * 255\n        idx = np.nonzero(mask)\n        color_mask = np.array(color_mask)\n        im[idx[0], idx[1], :] *= 1.0 - alpha\n        im[idx[0], idx[1], :] += alpha * color_mask\n        sum_x = np.sum(mask, axis=0)\n        x = np.where(sum_x > 0.5)[0]\n        sum_y = np.sum(mask, axis=1)\n        y = np.where(sum_y > 0.5)[0]\n        x0, x1, y0, y1 = x[0], x[-1], y[0], y[-1]\n        cv2.rectangle(im, (x0, y0), (x1, y1), tuple(color_mask.astype('int32').tolist()), 1)\n        bbox_text = '%s %.2f' % (labels[clsid], score)\n        t_size = cv2.getTextSize(bbox_text, 0, 0.3, thickness=1)[0]\n        cv2.rectangle(im, (x0, y0), (x0 + t_size[0], y0 - t_size[1] - 3), tuple(color_mask.astype('int32').tolist()),\n     
                 -1)\n        cv2.putText(im, bbox_text, (x0, y0 - 2), cv2.FONT_HERSHEY_SIMPLEX, 0.3, (0, 0, 0), 1, lineType=cv2.LINE_AA)\n\n    return Image.fromarray(im.astype('uint8'))\n\n\ndef load_predictor(model_dir, run_mode='paddle', batch_size=1, use_gpu=False, min_subgraph_size=3):\n    \"\"\"set AnalysisConfig, generate AnalysisPredictor\n    Args:\n        model_dir (str): root path of __model__ and __params__\n        use_gpu (bool): whether use gpu\n    Returns:\n        predictor (PaddlePredictor): AnalysisPredictor\n    Raises:\n        ValueError: predict by TensorRT need use_gpu == True.\n    \"\"\"\n    if not use_gpu and not run_mode == 'paddle':\n        raise ValueError(\"Predict by TensorRT mode: {}, expect use_gpu==True, but use_gpu == {}\".format(\n            run_mode, use_gpu))\n    if run_mode == 'trt_int8':\n        raise ValueError(\"TensorRT int8 mode is not supported now, \"\n                         \"please use trt_fp32 or trt_fp16 instead.\")\n    precision_map = {'trt_int8': PrecisionType.Int8, 'trt_fp32': PrecisionType.Float32, 'trt_fp16': PrecisionType.Half}\n    config = Config(model_dir + '.pdmodel', model_dir + '.pdiparams')\n    if use_gpu:\n        # initial GPU memory(M), device ID\n        config.enable_use_gpu(100, 0)\n        # optimize graph and fuse op\n        config.switch_ir_optim(True)\n    else:\n        config.disable_gpu()\n\n    if run_mode in precision_map.keys():\n        config.enable_tensorrt_engine(workspace_size=1 << 10,\n                                      max_batch_size=batch_size,\n                                      min_subgraph_size=min_subgraph_size,\n                                      precision_mode=precision_map[run_mode],\n                                      use_static=False,\n                                      use_calib_mode=False)\n\n    # disable print log when predict\n    config.disable_glog_info()\n    # enable shared memory\n    config.enable_memory_optim()\n    # disable 
feed, fetch OP, needed by zero_copy_run\n    config.switch_use_feed_fetch_ops(False)\n    predictor = create_predictor(config)\n    return predictor\n\n\ndef cv2_to_base64(image: np.ndarray):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/instance_segmentation/solov2/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport base64\nimport os\nimport time\nfrom functools import reduce\nfrom typing import Union\n\nimport numpy as np\nimport solov2.data_feed as D\nimport solov2.processor as P\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\nclass Detector:\n    \"\"\"\n    Args:\n        min_subgraph_size (int): number of tensorRT graphs.\n        use_gpu (bool): whether use gpu\n        threshold (float): threshold to reserve the result for output.\n    \"\"\"\n\n    def __init__(self, min_subgraph_size: int = 60, use_gpu=False):\n\n        self.default_pretrained_model_path = os.path.join(self.directory, 'solov2_r50_fpn_1x', 'model')\n        self.predictor = D.load_predictor(self.default_pretrained_model_path,\n                                          min_subgraph_size=min_subgraph_size,\n                                          use_gpu=use_gpu)\n        self.compose = [\n            P.Resize(max_size=1333),\n            P.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),\n            P.Permute(),\n            P.PadStride(stride=32)\n        ]\n\n    def transform(self, im: Union[str, np.ndarray]):\n        im, im_info = P.preprocess(im, self.compose)\n        inputs = D.create_inputs(im, im_info)\n        return inputs, im_info\n\n    def postprocess(self, np_boxes: np.ndarray, 
np_masks: np.ndarray, threshold: float = 0.5):\n        # postprocess output of predictor\n        results = {}\n        expect_boxes = (np_boxes[:, 1] > threshold) & (np_boxes[:, 0] > -1)\n        np_boxes = np_boxes[expect_boxes, :]\n        for box in np_boxes:\n            print('class_id:{:d}, confidence:{:.4f},'\n                  'left_top:[{:.2f},{:.2f}],'\n                  ' right_bottom:[{:.2f},{:.2f}]'.format(int(box[0]), box[1], box[2], box[3], box[4], box[5]))\n        results['boxes'] = np_boxes\n        if np_masks is not None:\n            np_masks = np_masks[expect_boxes, :, :, :]\n            results['masks'] = np_masks\n        return results\n\n    def predict(self, image: Union[str, np.ndarray], threshold: float = 0.5):\n        '''\n        Args:\n            image (str/np.ndarray): path of image/ np.ndarray read by cv2\n            threshold (float): threshold of predicted box' score\n        Returns:\n            results (dict): include 'boxes': np.ndarray: shape:[N,6], N: number of box,\n                            matix element:[class, score, x_min, y_min, x_max, y_max]\n                            MaskRCNN's results include 'masks': np.ndarray:\n                            shape:[N, class_num, mask_resolution, mask_resolution]\n        '''\n        inputs, im_info = self.transform(image)\n        np_boxes, np_masks = None, None\n\n        input_names = self.predictor.get_input_names()\n        for i in range(len(input_names)):\n            input_tensor = self.predictor.get_input_handle(input_names[i])\n            input_tensor.copy_from_cpu(inputs[input_names[i]])\n\n        self.predictor.run()\n        output_names = self.predictor.get_output_names()\n        boxes_tensor = self.predictor.get_output_handle(output_names[0])\n        np_boxes = boxes_tensor.copy_to_cpu()\n        # do not perform postprocess in benchmark mode\n        results = []\n        if reduce(lambda x, y: x * y, np_boxes.shape) < 6:\n            print('[WARNNING] 
No object detected.')\n            results = {'boxes': np.array([])}\n        else:\n            results = self.postprocess(np_boxes, np_masks, im_info, threshold=threshold)\n        return results\n\n\n@moduleinfo(name=\"solov2\",\n            type=\"CV/instance_segmentation\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"solov2 is a detection model, this module is trained with COCO dataset.\",\n            version=\"1.2.0\")\nclass DetectorSOLOv2(Detector):\n    \"\"\"\n    Args:\n        use_gpu (bool): whether use gpu\n        threshold (float): threshold to reserve the result for output.\n    \"\"\"\n\n    def __init__(self, use_gpu: bool = False):\n        super(DetectorSOLOv2, self).__init__(use_gpu=use_gpu)\n\n    def predict(self,\n                image: Union[str, np.ndarray],\n                threshold: float = 0.5,\n                visualization: bool = False,\n                save_dir: str = 'solov2_result'):\n        '''\n        Args:\n            image (str/np.ndarray): path of image/ np.ndarray read by cv2\n            threshold (float): threshold of predicted box' score\n            visualization (bool): Whether to save visualization result.\n            save_dir (str): save path.\n\n        '''\n\n        inputs, im_info = self.transform(image)\n        np_label, np_score, np_segms = None, None, None\n\n        input_names = self.predictor.get_input_names()\n        for i in range(len(input_names)):\n            input_tensor = self.predictor.get_input_handle(input_names[i])\n            input_tensor.copy_from_cpu(inputs[input_names[i]])\n\n        self.predictor.run()\n        output_names = self.predictor.get_output_names()\n        np_label = self.predictor.get_output_handle(output_names[1]).copy_to_cpu()\n        np_score = self.predictor.get_output_handle(output_names[2]).copy_to_cpu()\n        np_segms = self.predictor.get_output_handle(output_names[3]).copy_to_cpu()\n        output = 
dict(segm=np_segms, label=np_label, score=np_score)\n\n        if visualization:\n            if not os.path.exists(save_dir):\n                os.makedirs(save_dir)\n            image = D.visualize_box_mask(im=image, results=output, threshold=threshold)\n            name = str(time.time()) + '.png'\n            save_path = os.path.join(save_dir, name)\n            image.save(save_path)\n        return output\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = D.base64_to_cv2(images[0])\n        results = self.predict(image=images_decode, **kwargs)\n        final = {}\n        final['segm'] = base64.b64encode(results['segm']).decode('utf8')\n        final['label'] = base64.b64encode(results['label']).decode('utf8')\n        final['score'] = base64.b64encode(results['score']).decode('utf8')\n        return final\n\n    def create_gradio_app(self):\n        import os\n        import tempfile\n        import gradio as gr\n\n        from PIL import Image\n\n        def inference(img, threshold):\n            with tempfile.TemporaryDirectory() as tempdir_name:\n                self.predict(image=img, threshold=threshold, visualization=True, save_dir=tempdir_name)\n                result_names = os.listdir(tempdir_name)\n                return Image.open(os.path.join(tempdir_name, result_names[0]))\n\n        interface = gr.Interface(inference,\n                                 inputs=[gr.inputs.Image(type=\"filepath\"),\n                                         gr.Slider(0.0, 1.0, value=0.5)],\n                                 outputs=gr.Image(label='segmentation'),\n                                 title='SOLOv2',\n                                 allow_flagging='never')\n\n        return interface\n"
  },
  {
    "path": "modules/image/instance_segmentation/solov2/processor.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n\ndef decode_image(im_file, im_info):\n    \"\"\"read rgb image\n    Args:\n        im_file (str/np.ndarray): path of image/ np.ndarray read by cv2\n        im_info (dict): info of image\n    Returns:\n        im (np.ndarray):  processed image (np.ndarray)\n        im_info (dict): info of processed image\n    \"\"\"\n    if isinstance(im_file, str):\n        with open(im_file, 'rb') as f:\n            im_read = f.read()\n        data = np.frombuffer(im_read, dtype='uint8')\n        im = cv2.imdecode(data, 1)  # BGR mode, but need RGB mode\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im_info['origin_shape'] = im.shape[:2]\n        im_info['resize_shape'] = im.shape[:2]\n    else:\n        im = im_file\n        #im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im_info['origin_shape'] = im.shape[:2]\n        im_info['resize_shape'] = im.shape[:2]\n    return im, im_info\n\n\nclass Resize(object):\n    \"\"\"resize image by target_size and max_size\n    Args:\n        arch (str): model type\n        target_size (int): the target size of image\n        max_size (int): the max size of image\n        use_cv2 (bool): whether us cv2\n        image_shape (list): input shape of model\n        interp (int): method of resize\n    \"\"\"\n\n    def __init__(self,\n              
   target_size=800,\n                 max_size=1333,\n                 use_cv2=True,\n                 image_shape=None,\n                 interp=cv2.INTER_LINEAR,\n                 resize_box=False):\n        self.target_size = target_size\n        self.max_size = max_size\n        self.image_shape = image_shape\n        self.use_cv2 = use_cv2\n        self.interp = interp\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        im_channel = im.shape[2]\n        im_scale_x, im_scale_y = self.generate_scale(im)\n        if self.use_cv2:\n            im = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=self.interp)\n        else:\n            resize_w = int(im_scale_x * float(im.shape[1]))\n            resize_h = int(im_scale_y * float(im.shape[0]))\n            if self.max_size != 0:\n                raise TypeError('If you set max_size to cap the maximum size of image,'\n                                'please set use_cv2 to True to resize the image.')\n            im = im.astype('uint8')\n            im = Image.fromarray(im)\n            im = im.resize((int(resize_w), int(resize_h)), self.interp)\n            im = np.array(im)\n\n        # padding im when image_shape fixed by infer_cfg.yml\n        if self.max_size != 0 and self.image_shape is not None:\n            padding_im = np.zeros((self.max_size, self.max_size, im_channel), dtype=np.float32)\n            im_h, im_w = im.shape[:2]\n            padding_im[:im_h, :im_w, :] = im\n            im = padding_im\n\n        im_info['scale'] = [im_scale_x, im_scale_y]\n        im_info['resize_shape'] = im.shape[:2]\n        return im, im_info\n\n    def generate_scale(self, im):\n        \"\"\"\n        Args:\n            im 
(np.ndarray): image (np.ndarray)\n        Returns:\n            im_scale_x: the resize ratio of X\n            im_scale_y: the resize ratio of Y\n        \"\"\"\n        origin_shape = im.shape[:2]\n        im_c = im.shape[2]\n        if self.max_size != 0:\n            im_size_min = np.min(origin_shape[0:2])\n            im_size_max = np.max(origin_shape[0:2])\n            im_scale = float(self.target_size) / float(im_size_min)\n            if np.round(im_scale * im_size_max) > self.max_size:\n                im_scale = float(self.max_size) / float(im_size_max)\n            im_scale_x = im_scale\n            im_scale_y = im_scale\n        else:\n            im_scale_x = float(self.target_size) / float(origin_shape[1])\n            im_scale_y = float(self.target_size) / float(origin_shape[0])\n        return im_scale_x, im_scale_y\n\n\nclass Normalize(object):\n    \"\"\"normalize image\n    Args:\n        mean (list): im - mean\n        std (list): im / std\n        is_scale (bool): whether need im / 255\n        is_channel_first (bool): if True: image shape is CHW, else: HWC\n    \"\"\"\n\n    def __init__(self, mean, std, is_scale=True, is_channel_first=False):\n        self.mean = mean\n        self.std = std\n        self.is_scale = is_scale\n        self.is_channel_first = is_channel_first\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        im = im.astype(np.float32, copy=False)\n        if self.is_channel_first:\n            mean = np.array(self.mean)[:, np.newaxis, np.newaxis]\n            std = np.array(self.std)[:, np.newaxis, np.newaxis]\n        else:\n            mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n            std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        
if self.is_scale:\n            im = im / 255.0\n        im -= mean\n        im /= std\n        return im, im_info\n\n\nclass Permute(object):\n    \"\"\"permute image\n    Args:\n        to_bgr (bool): whether to convert RGB to BGR\n        channel_first (bool): whether to convert HWC to CHW\n    \"\"\"\n\n    def __init__(self, to_bgr=False, channel_first=True):\n        self.to_bgr = to_bgr\n        self.channel_first = channel_first\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        if self.channel_first:\n            im = im.transpose((2, 0, 1)).copy()\n        if self.to_bgr:\n            im = im[[2, 1, 0], :, :]\n        return im, im_info\n\n\nclass PadStride(object):\n    \"\"\"padding image for models with FPN\n    Args:\n        stride (int): models with FPN require image height and width to be divisible by stride\n    \"\"\"\n\n    def __init__(self, stride=0):\n        self.coarsest_stride = stride\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        coarsest_stride = self.coarsest_stride\n        if coarsest_stride == 0:\n            return im, im_info\n        im_c, im_h, im_w = im.shape\n        pad_h = int(np.ceil(float(im_h) / coarsest_stride) * coarsest_stride)\n        pad_w = int(np.ceil(float(im_w) / coarsest_stride) * coarsest_stride)\n        padding_im = np.zeros((im_c, pad_h, pad_w), dtype=np.float32)\n        padding_im[:, :im_h, :im_w] = im\n        im_info['pad_shape'] = padding_im.shape[1:]\n        return padding_im, im_info\n\n\ndef preprocess(im, 
preprocess_ops):\n    # process image by preprocess_ops\n    im_info = {\n        'scale': [1., 1.],\n        'origin_shape': None,\n        'resize_shape': None,\n        'pad_shape': None,\n    }\n    im, im_info = decode_image(im, im_info)\n    for operator in preprocess_ops:\n        im, im_info = operator(im, im_info)\n    im = np.array((im, )).astype('float32')\n    return im, im_info\n"
  },
  {
    "path": "modules/image/instance_segmentation/solov2/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"solov2\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('solov2_result')\n\n    def test_predict1(self):\n        results = self.module.predict(image='tests/test.jpg', visualization=False)\n        segm = results['segm']\n        label = results['label']\n        score = results['score']\n        self.assertIsInstance(segm, np.ndarray)\n        self.assertIsInstance(label, np.ndarray)\n        self.assertIsInstance(score, np.ndarray)\n\n    def test_predict2(self):\n        results = self.module.predict(image=cv2.imread('tests/test.jpg'), visualization=False)\n        segm = results['segm']\n        label = results['label']\n        score = results['score']\n        self.assertIsInstance(segm, np.ndarray)\n        self.assertIsInstance(label, np.ndarray)\n        self.assertIsInstance(score, np.ndarray)\n\n    def test_predict3(self):\n        results = self.module.predict(image=cv2.imread('tests/test.jpg'), visualization=True)\n        segm = results['segm']\n        label = results['label']\n        score = results['score']\n        self.assertIsInstance(segm, np.ndarray)\n        self.assertIsInstance(label, np.ndarray)\n        self.assertIsInstance(score, 
np.ndarray)\n\n    def test_predict4(self):\n        module = hub.Module(name=\"solov2\", use_gpu=True)\n        results = module.predict(image=cv2.imread('tests/test.jpg'), visualization=True)\n        segm = results['segm']\n        label = results['label']\n        score = results['score']\n        self.assertIsInstance(segm, np.ndarray)\n        self.assertIsInstance(label, np.ndarray)\n        self.assertIsInstance(score, np.ndarray)\n\n    def test_predict5(self):\n        self.assertRaises(FileNotFoundError, self.module.predict, image='no.jpg')\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【关键点检测】](https://www.paddlepaddle.org.cn/hublist)**\n\n#### 关键点检测\n\n人体骨骼关键点检测 (Pose Estimation) 主要检测人体的一些关键点，如关节，五官等，通过关键点描述人体骨骼信息。人体骨骼关键点检测对于描述人体姿态，预测人体行为至关重要。是诸多计算机视觉任务的基础，例如动作分类，异常行为检测，以及自动驾驶等等。\n\n- 精选模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [单人--人体骨骼关键点检测](https://www.paddlepaddle.org.cn/hubdetail?name=human_pose_estimation_resnet50_mpii&en_category=KeyPointDetection) | 可用于行为识别、人物跟踪、步态识别等相关领域。具体应用主要集中在智能视频监控，病人监护系统，人机交互，虚拟现实，人体动画，智能家居，智能安防，运动员辅助训练等等。  |\n| [多人-人体骨骼关键点检测](https://www.paddlepaddle.org.cn/hubdetail?name=openpose_body_estimation&en_category=KeyPointDetection) | 可用于行为识别、人物跟踪、步态识别等相关领域。具体应用主要集中在智能视频监控，病人监护系统，人机交互，虚拟现实，人体动画，智能家居，智能安防，运动员辅助训练等等。  |\n| [面部关键点检测](https://www.paddlepaddle.org.cn/hubdetail?name=face_landmark_localization&en_category=KeyPointDetection) |可用于人脸识别、表情分析、三维人脸重建及三维动画等其它人脸相关问题，支持同一张图中的多个人脸检测  |\n| [手部关键点检测](https://www.paddlepaddle.org.cn/hubdetail?name=hand_pose_localization&en_category=KeyPointDetection) |可用于手势识别，配合人体骨骼关键点，可用于异常行为检测等多种场景  |\n"
  },
  {
    "path": "modules/image/keypoint_detection/README_en.md",
    "content": "## **For better user experience, refer to the Web official document ->  [Key Points Detection](https://www.paddlepaddle.org.cn/hublist)**\n\n#### Key Point Detection\n\nPose Estimation refers to the detection of key points in the human body, such as joints, facial features, and so on. It describes information about the human skeleton through these key points. Pose estimation is essential for describing human posture and predicting human behaviors. It is the basis for many computer vision tasks, such as motion classification, abnormal behavior detection, and automated driving.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Single Person - Pose Estimation](https://www.paddlepaddle.org.cn/hubdetail?name=human_pose_estimation_resnet50_mpii&en_category=KeyPointDetection) | It can be used in behavior recognition, person tracking, gait recognition and other related fields. Specific applications include intelligent video surveillance, patient monitoring systems, human-computer interaction, virtual reality, human animation, smart home, intelligent security, athlete training, and so on. |\n| [Multi Person - Pose Estimation](https://www.paddlepaddle.org.cn/hubdetail?name=openpose_body_estimation&en_category=KeyPointDetection) | It can be used in behavior recognition, person tracking, gait recognition and other related fields. Specific applications include intelligent video surveillance, patient monitoring systems, human-computer interaction, virtual reality, human animation, smart home, intelligent security, athlete training, and so on. 
|\n| [Facial Key Points Detection](https://www.paddlepaddle.org.cn/hubdetail?name=face_landmark_localization&en_category=KeyPointDetection) | It can be used for face recognition, expression analysis, 3D face reconstruction, 3D animation and other face-related issues. It supports multiple face detection in the same picture. |\n| [Hand Key Point Detection](https://www.paddlepaddle.org.cn/hubdetail?name=hand_pose_localization&en_category=KeyPointDetection) | It can be used for gesture recognition. In combination with Pose Estimation, it can be used for abnormal behavior detection and many other scenarios. |\n"
  },
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/README.md",
    "content": "# face_landmark_localization\n\n| 模型名称            | face_landmark_localization |\n| :------------------ | :------------------------: |\n| 类别                |      图像-关键点检测       |\n| 网络                |       Face_Landmark        |\n| 数据集              |          AFW/AFLW          |\n| 是否支持Fine-tuning |             否             |\n| 模型大小            |             3M             |\n| 最新更新日期        |         2021-02-26         |\n| 数据指标            |             -              |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 人脸关键点（左）、模型检测效果（右）\n\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/76040149/133222449-a1c2d444-a839-4c0e-9203-30d67eeb2246.jpeg\"  hspace=\"5\" width=\"300\"/> <img src=\"https://user-images.githubusercontent.com/76040149/133229934-e7357767-28e0-4253-bf71-f948de9966f1.jpg\"  hspace=\"5\" height=\"300\"/>\n    </p>\n\n- ### 模型介绍\n\n  - 人脸关键点检测是人脸识别和分析领域中的关键一步，它是诸如自动人脸识别、表情分析、三维人脸重建及三维动画等其它人脸相关问题的前提和突破口。该 PaddleHub Module 的模型转换自 https://github.com/lsy17096535/face-landmark ，支持同一张图中的多个人脸检测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install face_landmark_localization\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run face_landmark_localization --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    face_landmark = hub.Module(name=\"face_landmark_localization\")\n\n    # Replace face detection module to speed up predictions but reduce 
accuracy\n    # face_landmark.set_face_detector_module(hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_320\"))\n\n    result = face_landmark.keypoint_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = face_landmark.keypoint_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(face_detector_module=None):\n    ```\n\n    - **参数**\n      - face_detector_module (class): 人脸检测模型，默认为 ultra_light_fast_generic_face_detector_1mb_640.\n\n  - ```python\n    def keypoint_detection(images=None,\n                           paths=None,\n                           batch_size=1,\n                           use_gpu=False,\n                           output_dir='face_landmark_output',\n                           visualization=False):\n    ```\n\n    - 识别输入图片中的所有人脸关键点，每张人脸检测出68个关键点（人脸轮廓17个点，左右眉毛各5个点，左右眼睛各6个点，鼻子9个点，嘴巴20个点）\n\n\n    - **参数**\n\n      - images (list[numpy.ndarray]): 图片数据，ndarray.shape 为 [H, W, C]，BGR格式；\n      - paths (list[str]): 图片的路径；\n      - batch_size (int): batch 的大小；\n      - use_gpu (bool): 是否使用 GPU；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output_dir (str): 图片的保存路径，当为 None 时，默认设为face_landmark_output。\n\n    - **返回**\n\n      - res (list[dict]): 识别结果的列表，列表元素为 dict, 有以下两个字段：\n        - save_path : 可视化图片的保存路径（仅当visualization=True时存在）；\n        - data: 图片中每张人脸的关键点坐标\n\n  - ```python\n    def set_face_detector_module(face_detector_module):\n    ```\n\n    - 设置为人脸关键点检测模型进行人脸检测的底座模型\n    - **参数**\n      - face_detector_module (class): 人脸检测模型\n\n  - ```python\n    def get_face_detector_module():\n    ```\n\n    - 获取为人脸关键点检测模型进行人脸检测的底座模型\n    - **返回**\n      - 当前模型使用的人脸检测模型。\n\n  - ```python\n    def save_inference_model(dirname):\n    ```\n    \n    - **参数**\n      - dirname: 模型保存路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人脸关键点检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m face_landmark_localization -p 8866\n    ```\n\n  - 这样就完成了一个人脸关键点服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/face_landmark_localization\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n* 1.0.2\n\n* 1.0.3\n\n  移除 fluid api\n\n* 1.1.0\n\n  修复无法导出推理模型的问题\n\n  * ```shell\n    $ hub install face_landmark_localization==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/data_feed.py",
    "content": "# coding=utf-8\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(face_detector, images=None, paths=None, use_gpu=False):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C].\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    components = []\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['orig_im'] = im\n            each['orig_im_shape'] = im.shape\n            each['orig_im_path'] = im_path\n            components.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['orig_im'] = im\n            each['orig_im_path'] = None\n            each['orig_im_shape'] = im.shape\n            components.append(each)\n\n    for idx, item in enumerate(\n            face_detector.face_detection(\n                images=[component['orig_im'] for component in components], use_gpu=use_gpu, visualization=False)):\n        for face in item['data']:\n            width = int(components[idx]['orig_im_shape'][1])\n            height = int(components[idx]['orig_im_shape'][0])\n            x1 = 0 if int(face['left']) < 0 else int(face['left'])\n            x2 = width if int(face['right']) > width else int(face['right'])\n            y1 = 0 if int(face['top']) < 0 else int(face['top'])\n            y2 = height if int(face['bottom']) > height else int(face['bottom'])\n            roi = components[idx]['orig_im'][y1:y2 + 1, x1:x2 + 1, :]\n            gray_img = cv2.cvtColor(roi, cv2.COLOR_RGB2GRAY)\n     
       gray_img = cv2.resize(gray_img, (60, 60), interpolation=cv2.INTER_CUBIC)\n            mean, std_dev = cv2.meanStdDev(gray_img)\n            gray_img = (gray_img - mean[0][0]) / (0.000001 + std_dev[0][0])\n            gray_img = np.expand_dims(gray_img, axis=0)\n            yield {\n                'face': gray_img,\n                'x1': x1,\n                'y1': y1,\n                'x2': x2,\n                'y2': y2,\n                'orig_im': components[idx]['orig_im'],\n                'orig_im_path': components[idx]['orig_im_path'],\n                'orig_im_shape': components[idx]['orig_im_shape'],\n                'id': idx\n            }\n"
  },
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import postprocess\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"face_landmark_localization\",\n    type=\"CV/keypoint_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\n    \"Face_Landmark_Localization can be used to locate face landmark. This Module is trained through the MPII Human Pose dataset.\",\n    version=\"1.1.0\")\nclass FaceLandmarkLocalization:\n    def __init__(self, face_detector_module=None):\n        \"\"\"\n        Args:\n            face_detector_module (class): module to detect face.\n        \"\"\"\n        self.default_pretrained_model_path = os.path.join(self.directory, \"face_landmark_localization\", \"model\")\n        if face_detector_module is None:\n            self.face_detector = hub.Module(name=\"ultra_light_fast_generic_face_detector_1mb_640\")\n        else:\n            self.face_detector = face_detector_module\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            
use_gpu = True\n        except Exception:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def set_face_detector_module(self, face_detector_module):\n        \"\"\"\n        Set face detector.\n\n        Args:\n            face_detector_module (class): module to detect face.\n        \"\"\"\n        self.face_detector = face_detector_module\n\n    def get_face_detector_module(self):\n        return self.face_detector\n\n    def keypoint_detection(self,\n                           images=None,\n                           paths=None,\n                           batch_size=1,\n                           use_gpu=False,\n                           output_dir='face_landmark_output',\n                           visualization=False):\n        \"\"\"\n        API for face landmark.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C].\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n\n        Returns:\n            res (list[dict()]): The key points of face landmark and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES to the GPU device id.\"\n                )\n\n        # get all data\n        all_data = []\n        for yield_data in reader(self.face_detector, images, paths, use_gpu):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            # feed batch image\n            batch_image = np.array([data['face'] for data in batch_data]).astype('float32')\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            points = output_handle.copy_to_cpu()\n\n            for idx, sample in enumerate(batch_data):\n                sample['points'] = points[idx].reshape(68, -1)\n            res += batch_data\n\n        res = postprocess(res, output_dir, visualization)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.keypoint_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} 
module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.keypoint_detection(paths=[args.input_path],\n                                          use_gpu=args.use_gpu,\n                                          output_dir=args.output_dir,\n                                          visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default=None,\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        
\"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
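The serving path above accepts base64-encoded image bytes in the JSON payload and decodes them back into a `uint8` buffer before `cv2.imdecode`. A minimal sketch of that round trip, using only the standard library and NumPy (the `cv2.imdecode` step is omitted to keep it dependency-light, and the `raw` bytes are fake placeholder data, not a real JPEG):

```python
import base64

import numpy as np

# Client side: raw encoded-image bytes -> base64 text for the JSON payload.
raw = b'\xff\xd8\xff\xe0fake-jpeg-bytes'
b64str = base64.b64encode(raw).decode('utf8')

# Server side (as in serving_method / base64_to_cv2): text -> bytes -> uint8
# buffer that cv2.imdecode would then turn into an image array.
# np.frombuffer is the non-deprecated replacement for np.fromstring.
buf = np.frombuffer(base64.b64decode(b64str.encode('utf8')), np.uint8)
print(buf[:4])  # the first bytes of the buffer, unchanged by the round trip
```

The round trip is lossless, which is why the serving client can simply `json.dumps` the base64 strings.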
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['check_dir', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef postprocess(res, output_dir, visualization):\n    \"\"\"\n    postprocess ouput of network, one face at a time.\n    \"\"\"\n    output = []\n    _cur_id = -1\n    for idx, _result in enumerate(res):\n        if _result['id'] != _cur_id:\n            _cur_id = _result['id']\n            output.append({'data': []})\n        _result['points'][:, 0] *= (_result['x2'] - _result['x1'])\n        _result['points'][:, 0] += _result['x1']\n        _result['points'][:, 1] *= (_result['y2'] - _result['y1'])\n        _result['points'][:, 1] += _result['y1']\n        output[-1]['data'].append(_result['points'].tolist())\n\n    if visualization:\n        check_dir(output_dir)\n        for idx, sample in enumerate(output):\n            orig_im = res[idx]['orig_im']\n            for points in sample['data']:\n                for x, y in points:\n                    cv2.circle(orig_im, (int(x), int(y)), 1, (0, 0, 255), 2)\n            orig_im_path = res[idx]['orig_im_path']\n            ext = os.path.splitext(orig_im_path) if orig_im_path else ''\n            ext = ext if ext else get_image_ext(orig_im)\n            org_im_path = orig_im_path if orig_im_path else 'ndarray_{}{}'.format(time.time(), ext)\n            im_name = 
os.path.basename(org_im_path)\n            im_save_path = os.path.join(output_dir, im_name)\n            sample['save_path'] = im_save_path\n            cv2.imwrite(im_save_path, orig_im)\n\n    return output\n"
  },
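`postprocess` maps each landmark, predicted as normalized coordinates relative to the detected face crop, back into full-image pixel coordinates by scaling with the crop size and offsetting by the crop origin. A NumPy sketch of that in-place rescaling (the helper name `rescale_points` is mine, not from the module):

```python
import numpy as np


def rescale_points(points, box):
    """Map landmarks normalized to a face crop back to full-image pixels.

    points: (N, 2) array with x, y in [0, 1] relative to the crop.
    box: (x1, y1, x2, y2) crop corners in the original image.
    """
    x1, y1, x2, y2 = box
    out = points.copy()
    out[:, 0] = out[:, 0] * (x2 - x1) + x1  # scale by crop width, shift by x1
    out[:, 1] = out[:, 1] * (y2 - y1) + y1  # scale by crop height, shift by y1
    return out


# A point at the centre of a crop spanning (10, 20)-(110, 220)
pts = np.array([[0.5, 0.5]])
print(rescale_points(pts, (10, 20, 110, 220)))  # [[ 60. 120.]]
```

This is why the result dict carries the `x1/y1/x2/y2` crop corners alongside the raw `points`.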
  {
    "path": "modules/image/keypoint_detection/face_landmark_localization/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"face_landmark_localization\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('face_landmark_output')\n\n    def test_keypoint_detection1(self):\n        results = self.module.keypoint_detection(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        kps = results[0]['data'][0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection2(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        kps = results[0]['data'][0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection3(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        kps = results[0]['data'][0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection4(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n      
  kps = results[0]['data'][0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.keypoint_detection,\n            paths=['no.jpg']\n        )\n\n    def test_keypoint_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.keypoint_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/model.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/face_detector.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/face_detector.pdiparams'))\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/hand_pose_localization/model.py",
    "content": "import os\r\nimport numpy as np\r\n\r\nfrom paddle.inference import create_predictor, Config\r\n\r\n__all__ = ['InferenceModel']\r\n\r\n\r\nclass InferenceModel:\r\n    # 初始化函数\r\n    def __init__(self,\r\n                 modelpath,\r\n                 use_gpu=False,\r\n                 gpu_id=0,\r\n                 use_mkldnn=False,\r\n                 cpu_threads=1):\r\n        '''\r\n        init the inference model\r\n        modelpath: inference model path\r\n        use_gpu: use gpu or not\r\n        use_mkldnn: use mkldnn or not\r\n        '''\r\n        # 加载模型配置\r\n        self.config = self.load_config(modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads)\r\n\r\n    # 打印函数\r\n    def __repr__(self):\r\n        '''\r\n        get the numbers and name of inputs and outputs \r\n        '''\r\n        return 'input_num: %d\\ninput_names: %s\\noutput_num: %d\\noutput_names: %s' % (\r\n            self.input_num,\r\n            str(self.input_names),\r\n            self.output_num,\r\n            str(self.output_names)\r\n        )\r\n\r\n    # 类调用函数\r\n    def __call__(self, *input_datas, batch_size=1):\r\n        '''\r\n        call function\r\n        '''\r\n        return self.forward(*input_datas, batch_size=batch_size)\r\n\r\n    # 模型参数加载函数\r\n    def load_config(self, modelpath, use_gpu, gpu_id, use_mkldnn, cpu_threads):\r\n        '''\r\n        load the model config\r\n        modelpath: inference model path\r\n        use_gpu: use gpu or not\r\n        use_mkldnn: use mkldnn or not\r\n        '''\r\n        # 对运行位置进行配置\r\n        if use_gpu:\r\n            try:\r\n                int(os.environ.get('CUDA_VISIBLE_DEVICES'))\r\n            except Exception:\r\n                print(\r\n                    '''Error! Unable to use GPU. Please set the environment variables \"CUDA_VISIBLE_DEVICES=GPU_id\" to use GPU. 
Now switch to CPU to continue...''')\r\n                use_gpu = False\r\n\r\n        if os.path.isdir(modelpath):\r\n            if os.path.exists(os.path.join(modelpath, \"__params__\")):\r\n                # __model__ + __params__\r\n                model = os.path.join(modelpath, \"__model__\")\r\n                params = os.path.join(modelpath, \"__params__\")\r\n                config = Config(model, params)\r\n            elif os.path.exists(os.path.join(modelpath, \"params\")):\r\n                # model + params\r\n                model = os.path.join(modelpath, \"model\")\r\n                params = os.path.join(modelpath, \"params\")\r\n                config = Config(model, params)\r\n            elif os.path.exists(os.path.join(modelpath, \"__model__\")):\r\n                # __model__ + others\r\n                config = Config(modelpath)\r\n            else:\r\n                raise Exception(\r\n                    \"Error! Can\\'t find the model in: %s. Please check your model path.\" % os.path.abspath(modelpath))\r\n        elif os.path.exists(modelpath + \".pdmodel\"):\r\n            # *.pdmodel + *.pdiparams\r\n            model = modelpath + \".pdmodel\"\r\n            params = modelpath + \".pdiparams\"\r\n            config = Config(model, params)\r\n        elif isinstance(modelpath, Config):\r\n            config = modelpath\r\n        else:\r\n            raise Exception(\r\n                \"Error! Can\\'t find the model in: %s. 
Please check your model path.\" % os.path.abspath(modelpath))\r\n\r\n        # 设置参数\r\n        if use_gpu:\r\n            config.enable_use_gpu(100, gpu_id)\r\n        else:\r\n            config.disable_gpu()\r\n            config.set_cpu_math_library_num_threads(cpu_threads)\r\n            if use_mkldnn:\r\n                config.enable_mkldnn()\r\n\r\n        config.disable_glog_info()\r\n\r\n        # 返回配置\r\n        return config\r\n\r\n    # 预测器创建函数\r\n    def eval(self):\r\n        '''\r\n        create the model predictor by model config\r\n        '''\r\n        # 创建预测器\r\n        self.predictor = create_predictor(self.config)\r\n\r\n        # 获取模型的输入输出名称\r\n        self.input_names = self.predictor.get_input_names()\r\n        self.output_names = self.predictor.get_output_names()\r\n\r\n        # 获取模型的输入输出节点数量\r\n        self.input_num = len(self.input_names)\r\n        self.output_num = len(self.output_names)\r\n\r\n        # 获取输入\r\n        self.input_handles = []\r\n        for input_name in self.input_names:\r\n            self.input_handles.append(\r\n                self.predictor.get_input_handle(input_name))\r\n\r\n        # 获取输出\r\n        self.output_handles = []\r\n        for output_name in self.output_names:\r\n            self.output_handles.append(\r\n                self.predictor.get_output_handle(output_name))\r\n\r\n    # 前向计算函数\r\n    def forward(self, *input_datas, batch_size=1):\r\n        \"\"\"\r\n        model inference\r\n        batch_size: batch size\r\n        *input_datas: x1, x2, ..., xn\r\n        \"\"\"\r\n        # 切分输入数据\r\n        datas_num = input_datas[0].shape[0]\r\n        split_num = datas_num // batch_size + \\\r\n                    1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n        input_datas = [np.array_split(input_data, split_num)\r\n                       for input_data in input_datas]\r\n\r\n        # 遍历输入数据进行预测\r\n        outputs = {}\r\n        for step in range(split_num):\r\n       
     for i in range(self.input_num):\r\n                input_data = input_datas[i][step].copy()\r\n                self.input_handles[i].copy_from_cpu(input_data)\r\n\r\n            self.predictor.run()\r\n\r\n            for i in range(self.output_num):\r\n                output = self.output_handles[i].copy_to_cpu()\r\n                if i in outputs:\r\n                    outputs[i].append(output)\r\n                else:\r\n                    outputs[i] = [output]\r\n\r\n        # 预测结果合并\r\n        for key in outputs.keys():\r\n            outputs[key] = np.concatenate(outputs[key], 0)\r\n\r\n        outputs = [v for v in outputs.values()]\r\n\r\n        # 返回预测结果\r\n        return tuple(outputs) if len(outputs) > 1 else outputs[0]"
  },
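`InferenceModel.forward` splits the stacked inputs into `ceil(n / batch_size)` chunks with `np.array_split` before feeding the predictor, so the last chunk may be smaller than `batch_size`. A standalone sketch of just that batching step (the wrapper name `make_batches` is mine):

```python
import numpy as np


def make_batches(data, batch_size=1):
    """Split n samples into ceil(n / batch_size) roughly equal chunks,
    mirroring the split_num computation in InferenceModel.forward."""
    n = data.shape[0]
    split_num = n // batch_size + 1 if n % batch_size != 0 else n // batch_size
    return np.array_split(data, split_num)


batches = make_batches(np.zeros((5, 3)), batch_size=2)
print([b.shape[0] for b in batches])  # [2, 2, 1]
```

Note that `np.array_split` balances the chunks rather than filling every batch to exactly `batch_size`, which is harmless here because each chunk is copied to the input handle independently.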
  {
    "path": "modules/image/keypoint_detection/hand_pose_localization/module.py",
    "content": "# coding=utf-8\r\nimport os\r\n\r\nimport numpy as np\r\nfrom paddlehub.module.module import moduleinfo, serving\r\n\r\nfrom .model import InferenceModel\r\nfrom .processor import base64_to_cv2, Processor\r\n\r\n\r\n@moduleinfo(\r\n    name=\"hand_pose_localization\",  # 模型名称\r\n    type=\"CV/keypoint_detection\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"hand_pose_localization\",  # 模型介绍\r\n    version=\"1.1.0\"  # 版本号\r\n)\r\nclass Hand_Pose_Localization:\r\n    # 初始化函数\r\n    def __init__(self, use_gpu=False):\r\n        # 设置模型路径\r\n        self.model_path = os.path.join(self.directory, \"hand_pose_localization\", \"model\")\r\n\r\n        # 加载模型\r\n        self.model = InferenceModel(modelpath=self.model_path, use_gpu=use_gpu)\r\n\r\n        self.model.eval()\r\n\r\n    # 关键点检测函数\r\n    def keypoint_detection(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\r\n        # 加载数据处理器\r\n        processor = Processor(images, paths, batch_size, output_dir)\r\n\r\n        # 模型预测\r\n        outputs = []\r\n        for input_data in processor.input_datas:\r\n            output = self.model(input_data)\r\n            outputs.append(output)\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        # 结果后处理\r\n        results = processor.postprocess(outputs, visualization)\r\n\r\n        # 返回结果\r\n        return results\r\n\r\n    # Hub Serving\r\n    @serving\r\n    def serving_method(self, images, **kwargs):\r\n        # 获取输入数据\r\n        images_decode = [base64_to_cv2(image) for image in images]\r\n        # 关键点检测\r\n        results = self.keypoint_detection(images_decode, **kwargs)\r\n        # 返回结果\r\n        return results\r\n"
  },
  {
    "path": "modules/image/keypoint_detection/hand_pose_localization/processor.py",
    "content": "import os\r\nimport cv2\r\nimport time\r\nimport base64\r\nimport numpy as np\r\n\r\n__all__ = ['base64_to_cv2', 'Processor']\r\n\r\n\r\ndef check_dir(dir_path):\r\n    # 目录检查函数\r\n    if not os.path.exists(dir_path):\r\n        os.makedirs(dir_path)\r\n    elif os.path.isfile(dir_path):\r\n        os.remove(dir_path)\r\n        os.makedirs(dir_path)\r\n\r\n\r\ndef base64_to_cv2(b64str):\r\n    # base64转cv2函数\r\n    data = base64.b64decode(b64str.encode('utf8'))\r\n    data = np.fromstring(data, np.uint8)\r\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\r\n    return data\r\n\r\n\r\nclass Processor():\r\n    # 初始化函数\r\n    def __init__(self, images=None, paths=None, batch_size=1, output_dir='output'):\r\n        # 变量设置\r\n        self.num_points = 21\r\n        self.inHeight = 368\r\n        self.threshold = 0.1\r\n        self.point_pairs = [[0, 1], [1, 2], [2, 3], [3, 4], [0, 5], [5, 6], [6, 7], [7, 8], [0, 9], [9, 10], [10, 11],\r\n                            [11, 12], [0, 13], [13, 14], [14, 15], [15, 16], [0, 17], [17, 18], [18, 19], [19, 20]]\r\n\r\n        self.images = images\r\n        self.paths = paths\r\n        self.batch_size = batch_size\r\n        self.output_dir = output_dir\r\n\r\n        # 获取原始输入数据\r\n        self.datas = self.load_datas()\r\n\r\n        # 对原始输入数据进行预处理\r\n        self.input_datas = self.preprocess()\r\n\r\n    # 读取数据函数\r\n    def load_datas(self):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if self.paths is not None:\r\n            for im_path in self.paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path).astype('float32')\r\n                datas.append(im)\r\n\r\n        if self.images is not None:\r\n            datas = self.images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    def preprocess(self):\r\n        input_datas = []\r\n\r\n        # 数据预处理\r\n        for i, img 
in enumerate(self.datas):\r\n            img_height, img_width, _ = img.shape\r\n            aspect_ratio = img_width / img_height\r\n            inWidth = int(((aspect_ratio * self.inHeight) * 8) // 8)\r\n            inpBlob = cv2.dnn.blobFromImage(\r\n                img, 1.0 / 255, (inWidth, self.inHeight), (0, 0, 0), swapRB=False, crop=False)\r\n            input_datas.append(inpBlob)\r\n\r\n        # 数据按batch_size切分\r\n        input_datas = np.concatenate(input_datas, 0)\r\n        split_num = len(self.datas) // self.batch_size + 1 if len(self.datas) % self.batch_size != 0 else len(\r\n            self.datas) // self.batch_size\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        # 返回预处理完成的数据\r\n        return input_datas\r\n\r\n    # 结果后处理函数\r\n    def postprocess(self, outputs, visualization):\r\n        all_points = []\r\n\r\n        # 结果后处理\r\n        for im_id, img in enumerate(self.datas):\r\n            points = []\r\n            for idx in range(self.num_points):\r\n                probMap = outputs[im_id, idx, :, :]\r\n                img_height, img_width, _ = img.shape\r\n                probMap = cv2.resize(probMap, (img_width, img_height))\r\n                minVal, prob, minLoc, point = cv2.minMaxLoc(probMap)\r\n\r\n                if prob > self.threshold:\r\n                    points.append([int(point[0]), int(point[1])])\r\n                else:\r\n                    points.append(None)\r\n\r\n            all_points.append(points)\r\n\r\n            # 结果可视化\r\n            if visualization:\r\n                # 检查输出目录\r\n                check_dir(self.output_dir)\r\n                # 结果可视化\r\n                self.vis_pose(img, points, im_id)\r\n\r\n        # 返回后处理结果\r\n        return all_points\r\n\r\n    # 结果可视化\r\n    def vis_pose(self, img, points, im_id):\r\n        # 根据结果绘制关键点到原图像上\r\n        for pair in self.point_pairs:\r\n            partA = pair[0]\r\n            partB = pair[1]\r\n\r\n            if 
points[partA] and points[partB]:\r\n                cv2.line(img, tuple(points[partA]), tuple(points[partB]), (0, 255, 255), 3)\r\n                cv2.circle(img, tuple(points[partA]), 8, (0, 0, 255), thickness=-1, lineType=cv2.FILLED)\r\n\r\n        # 可视化图像保存\r\n        cv2.imwrite(os.path.join(self.output_dir, '%d_%d.jpg' % (im_id, time.time())), img)\r\n"
  },
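`Processor.postprocess` turns each of the 21 heatmap channels into a keypoint by taking the location of the maximum response and discarding it when the peak falls below `threshold`. A NumPy-only sketch of that per-channel step (the helper name `heatmap_to_point` is mine; the module itself uses `cv2.minMaxLoc`, which this mirrors without the OpenCV dependency):

```python
import numpy as np


def heatmap_to_point(prob_map, threshold=0.1):
    """Return the [x, y] location of the heatmap peak, or None if the
    peak confidence is at or below the threshold."""
    idx = np.unravel_index(np.argmax(prob_map), prob_map.shape)
    prob = prob_map[idx]
    if prob > threshold:
        return [int(idx[1]), int(idx[0])]  # column -> x, row -> y
    return None


hm = np.zeros((46, 46), dtype='float32')
hm[12, 30] = 0.9  # a single strong response at row 12, column 30
print(heatmap_to_point(hm))  # [30, 12]
```

Returning `None` for weak peaks is what lets downstream drawing code in `vis_pose` skip limbs whose endpoints were not detected.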
  {
    "path": "modules/image/keypoint_detection/hand_pose_localization/readme.md",
    "content": "# hand_pose_localization\n\n| 模型名称            | hand_pose_localization |\n| :------------------ | :--------------------: |\n| 类别                |    图像-关键点检测     |\n| 网络                |                        |\n| 数据集              |       MPII, NZSL       |\n| 是否支持Fine-tuning |           否           |\n| 模型大小            |          130M          |\n| 最新更新日期        |       2021-06-02       |\n| 数据指标            |                        |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133246893-f47cfdce-b9c1-490b-b1de-f837b61caf18.png\" align=\"center\" width=\"500\">\n</p>\n  \n- ### 模型介绍\n  - openpose 手部关键点检测模型。更多详情请参考：[openpose开源项目](https://github.com/CMU-Perceptual-Computing-Lab/openpose)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install hand_pose_localization\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- 本模型不支持命令行预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n    \n    # use_gpu：是否使用GPU进行预测\n    model = hub.Module(name='hand_pose_localization', use_gpu=False)\n    \n    # 调用关键点检测API\n    result = model.keypoint_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.keypoint_detection(paths=['/PATH/TO/IMAGE'])\n    \n    # 打印预测结果\n    print(result)\n    ```\n  \n- ### 2、API\n\n  - ```python\n    def keypoint_detection(images=None,\n                           paths=None,\n                           batch_size=1,\n                           output_dir='output',\n                           visualization=False)：\n    ```\n    \n    - 
预测API，识别出人体手部关键点。\n    - **参数**\n      - images (list[numpy.ndarray]): 图片数据，ndarray.shape 为 [H, W, C], 默认设为 None；\n      - paths (list[str]): 图片的路径, 默认设为 None；\n      - batch_size (int): batch 的大小，默认设为 1；\n      - visualization (bool): 是否将识别结果保存为图片文件，默认设为 False；\n      - output_dir (str): 图片的保存路径，默认设为 output。\n    - **返回**\n      - res (list[list[list[int]]]): 每张图片识别到的21个手部关键点组成的列表，每个关键点的格式为 [x, y]，若有关键点未识别到则为 None。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人体手部关键点检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m hand_pose_localization -p 8866\n    ```\n\n  - 这样就完成了一个人体手部关键点检测的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    \n    # 图片Base64编码函数\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n    \n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/hand_pose_localization\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  适配paddlehub 2.0\n\n* 1.1.0\n\n  * ```shell\n    $ hub install hand_pose_localization==1.1.0\n    ```\n\n    \n"
  },
  {
    "path": "modules/image/keypoint_detection/hand_pose_localization/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/8UAUuP97RlY/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYxODQxMzI1&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"hand_pose_localization\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('output')\n\n    def test_keypoint_detection1(self):\n        results = self.module.keypoint_detection(\n            paths=['tests/test.jpg'],\n            visualization=False\n        )\n        kps = results[0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection2(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        kps = results[0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection3(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=True\n        )\n        kps = results[0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection4(self):\n        self.module = hub.Module(name=\"hand_pose_localization\", use_gpu=True)\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        kps = results[0]\n        self.assertIsInstance(kps, list)\n\n    def test_keypoint_detection5(self):\n        
self.assertRaises(\n            AssertionError,\n            self.module.keypoint_detection,\n            paths=['no.jpg']\n        )\n\n    def test_keypoint_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.keypoint_detection,\n            images=['test.jpg']\n        )\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/README.md",
    "content": "# human_pose_estimation_resnet50_mpii\n\n| 模型名称            | human_pose_estimation_resnet50_mpii |\n| :------------------ | :---------------------------------: |\n| 类别                |           图像-关键点检测           |\n| 网络                |            Pose_Resnet50            |\n| 数据集              |                MPII                 |\n| 是否支持Fine-tuning |                 否                  |\n| 模型大小            |                121M                 |\n| 最新更新日期        |             2021-02-26              |\n| 数据指标            |                                     |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133231581-1dffc391-652d-417f-b8ad-a3c22b8092e8.jpg\" width=\"300\">\n</p>\n  \n- ### 模型介绍\n  - 人体骨骼关键点检测(Pose Estimation) 是计算机视觉的基础算法之一，在很多cv任务中起到了基础性的作用，如行为识别、人物跟踪、步态识别等领域。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install human_pose_estimation_resnet50_mpii\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run human_pose_estimation_resnet50_mpii --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n    \n    pose_estimation = hub.Module(name=\"human_pose_estimation_resnet50_mpii\")\n    \n    result = pose_estimation.keypoint_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = pose_estimation.keypoint_detection(paths=['/PATH/TO/IMAGE'])\n    \n    # PaddleHub示例图片下载方法：\n    # wget 
https://paddlehub.bj.bcebos.com/resources/test_image.jpg\n    ```\n  \n- ### 3、API\n\n  - ```python\n    def keypoint_detection(images=None,\n                           paths=None,\n                           batch_size=1,\n                           use_gpu=False,\n                           output_dir='output_pose',\n                           visualization=False):\n    ```\n    \n    - 预测API，识别出人体骨骼关键点。\n    - **参数**\n      - images (list[numpy.ndarray]): 图片数据，ndarray.shape 为 [H, W, C]；\n      - paths (list[str]): 图片的路径；\n      - batch_size (int): batch 的大小；\n      - use_gpu (bool): 是否使用 GPU；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output_dir (str): 图片的保存路径，默认设为 output_pose。\n    - **返回**\n      - res (list): 识别元素的列表，列表元素为 dict，关键字为 'path', 'data'，相应的取值为：\n        - path (str): 原图的路径；\n        - data (OrderedDict): 人体骨骼关键点的坐标。\n    \n  - ```python\n    def save_inference_model(dirname):\n    ```\n  \n    - 将模型保存到指定路径。\n  \n    - **参数**\n  \n      - dirname: 模型保存路径\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线人体骨骼关键点识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m human_pose_estimation_resnet50_mpii -p 8866\n    ```\n\n  - 这样就完成了一个人体骨骼关键点识别的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    \n    \n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n    \n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/human_pose_estimation_resnet50_mpii\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # 打印预测结果\n    print(r.json()[\"results\"])\n    \n    # r.json()['results']即为keypoint_detection函数返回的结果\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n* 1.1.0\n\n* 1.1.1\n\n* 1.2.0\n\n  移除 fluid api\n\n  * ```shell\n    $ hub install human_pose_estimation_resnet50_mpii==1.2.0\n    ```\n"
  },
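The README above documents the `keypoint_detection` return format: each result is a dict with a 'path' string and a 'data' OrderedDict mapping joint names to [x, y] coordinates. A minimal sketch of consuming that format follows; the helper `keypoints_bbox` and the mocked coordinates are illustrative only, not part of the module.

```python
from collections import OrderedDict


def keypoints_bbox(result):
    # Compute a rough bounding box around all detected keypoints.
    # `result` follows the documented shape: {'path': str, 'data': OrderedDict}.
    points = list(result['data'].values())
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)


# A mocked result in the documented shape (coordinate values are made up).
res = {
    'path': '/PATH/TO/IMAGE',
    'data': OrderedDict([
        ('head_top', [160, 40]),
        ('upper_neck', [158, 90]),
        ('left_ankle', [120, 380]),
        ('right_ankle', [200, 382]),
    ]),
}
print(keypoints_bbox(res))  # (120, 40, 200, 382)
```

In real use, `res` would be one element of the list returned by `pose_estimation.keypoint_detection(...)`.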
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        im = element['org_im'].copy()\n        im = cv2.resize(im, (384, 384))\n        im = im.astype('float32')\n        im = im.transpose((2, 0, 1)) / 255\n        im -= np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))\n        im /= np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))\n        element['image'] = im\n        yield element\n"
  },
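The preprocessing in `data_feed.reader` resizes to 384x384, converts HWC to CHW, scales to [0, 1], then normalizes with the ImageNet mean and std. The sketch below isolates that normalization as a standalone function (`preprocess` is a hypothetical name; the resize step is omitted and the input is assumed to be 384x384 already).

```python
import numpy as np


def preprocess(im):
    # Mirror of the per-image math in data_feed.reader:
    # HWC -> CHW, scale to [0, 1], subtract ImageNet mean, divide by std.
    im = im.astype('float32')
    im = im.transpose((2, 0, 1)) / 255
    im -= np.array([0.485, 0.456, 0.406]).reshape((3, 1, 1))
    im /= np.array([0.229, 0.224, 0.225]).reshape((3, 1, 1))
    return im


im = np.full((384, 384, 3), 255, dtype='uint8')  # a pure-white test image
out = preprocess(im)
print(out.shape)  # (3, 384, 384)
```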
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport ast\nimport os\nimport argparse\n\nimport paddle\nimport paddle.jit\nimport paddle.static\nimport numpy as np\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import base64_to_cv2, postprocess\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"human_pose_estimation_resnet50_mpii\",\n    type=\"CV/keypoint_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    summary=\n    \"Paddle implementation for the paper `Simple baselines for human pose estimation and tracking`, trained with the MPII dataset.\",\n    version=\"1.2.0\")\nclass HumanPoseEstimation:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"pose-resnet50-mpii-384x384\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        Predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except (KeyError, IndexError, ValueError):\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def keypoint_detection(self,\n                           images=None,\n                           paths=None,\n                           batch_size=1,\n                
           use_gpu=False,\n                           output_dir='output_pose',\n                           visualization=False):\n        \"\"\"\n        API for human pose estimation and tracking.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C].\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use the GPU.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n\n        Returns:\n            res (list[dict]): each element of res is a dict whose keys are 'path' and 'data', with the corresponding values:\n                path (str): the path of original image.\n                data (OrderedDict): The key points of human pose.\n        \"\"\"\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image)\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output = np.expand_dims(output_handle.copy_to_cpu(), 
axis=1)\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(\n                    out_heatmaps=output[i],\n                    org_im=batch_data[i]['org_im'],\n                    org_im_shape=batch_data[i]['org_im_shape'],\n                    org_im_path=batch_data[i]['org_im_path'],\n                    output_dir=output_dir,\n                    visualization=visualization)\n                res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.keypoint_detection(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the human_pose_estimation_resnet50_mpii module.\",\n            prog='hub run human_pose_estimation_resnet50_mpii',\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.keypoint_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='output_pose', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=ast.literal_eval, default=False, help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
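`keypoint_detection` above walks its inputs in `ceil(total_num / batch_size)` iterations and lets the last batch come up short. The sketch below shows the same partitioning with slicing instead of the index-and-catch loop; `make_batches` is an illustrative helper, not module API.

```python
import numpy as np


def make_batches(samples, batch_size):
    # Split `samples` into ceil(len / batch_size) batches; the final batch
    # may contain fewer than batch_size elements.
    loop_num = int(np.ceil(len(samples) / batch_size))
    batches = []
    for iter_id in range(loop_num):
        handle_id = iter_id * batch_size
        batches.append(samples[handle_id:handle_id + batch_size])
    return batches


print(make_batches(list(range(5)), 2))  # [[0, 1], [2, 3], [4]]
```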
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['base64_to_cv2', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef get_max_preds(batch_heatmaps):\n    \"\"\"\n    Get predictions from score maps.\n\n    Args:\n        batch_heatmaps (numpy.ndarray): output of the network, with shape [N, C, H, W]\n    \"\"\"\n    assert isinstance(batch_heatmaps, np.ndarray), \\\n        'batch_heatmaps should be numpy.ndarray'\n    assert batch_heatmaps.ndim == 4, 'batch_images should be 4-ndim'\n\n    batch_size = batch_heatmaps.shape[0]\n    num_joints = batch_heatmaps.shape[1]\n    width = batch_heatmaps.shape[3]\n    heatmaps_reshaped = batch_heatmaps.reshape((batch_size, num_joints, -1))\n    idx = np.argmax(heatmaps_reshaped, 2)\n    maxvals = np.amax(heatmaps_reshaped, 2)\n    maxvals = maxvals.reshape((batch_size, num_joints, 1))\n    idx = idx.reshape((batch_size, num_joints, 1))\n    preds = np.tile(idx, (1, 1, 2)).astype(np.float32)\n    preds[:, :, 0] = (preds[:, :, 0]) % width\n    preds[:, :, 1] = np.floor((preds[:, :, 1]) / width)\n    pred_mask = np.tile(np.greater(maxvals, 0.0), (1, 1, 2))\n    pred_mask = pred_mask.astype(np.float32)\n    preds *= pred_mask\n    return preds, maxvals\n\n\ndef predict_results(batch_heatmaps):\n    batch_size, num_joints, heatmap_height, heatmap_width = batch_heatmaps.shape\n    preds, maxvals = get_max_preds(batch_heatmaps)\n    return preds[0] * 4, num_joints\n\n\ndef postprocess(out_heatmaps, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess the output of the network.
One image at a time.\n\n    Args:\n        out_heatmaps (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape of the original image.\n        org_im_path (str): path of the original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        res (dict): Output of postprocess; keys are 'path' and 'data', with the corresponding values:\n            path (str): the path of original image.\n            data (OrderedDict): The key points of human pose.\n    \"\"\"\n    res = dict()\n    res['path'] = org_im_path\n    res['data'] = OrderedDict()\n    preds, num_joints = predict_results(out_heatmaps)\n    scale_horizon = org_im_shape[1] * 1.0 / 384\n    scale_vertical = org_im_shape[0] * 1.0 / 384\n    preds = np.multiply(preds, (scale_horizon, scale_vertical)).astype(int)\n    if visualization:\n        icolor = (255, 137, 0)\n        ocolor = (138, 255, 0)\n        rendered_im = org_im.copy()\n        for j in range(num_joints):\n            x, y = preds[j]\n            cv2.circle(rendered_im, (x, y), 3, icolor, -1, 16)\n            cv2.circle(rendered_im, (x, y), 6, ocolor, 1, 16)\n        check_dir(output_dir)\n        save_im_name = get_save_image_name(org_im, org_im_path, output_dir)\n        cv2.imwrite(save_im_name, rendered_im)\n        print('image saved in {}'.format(save_im_name))\n\n    # articulation\n    preds = list(map(lambda pred: [int(_) for _ in pred], preds))\n    res['data']['left_ankle'] = list(preds[0])\n    res['data']['left_knee'] = list(preds[1])\n    res['data']['left_hip'] = list(preds[2])\n    res['data']['right_hip'] = list(preds[3])\n    res['data']['right_knee'] = list(preds[4])\n    res['data']['right_ankle'] = list(preds[5])\n    res['data']['pelvis'] = list(preds[6])\n    res['data']['thorax'] = list(preds[7])\n    res['data']['upper_neck'] = list(preds[8])\n    res['data']['head_top'] = 
list(preds[9])\n    res['data']['right_wrist'] = list(preds[10])\n    res['data']['right_elbow'] = list(preds[11])\n    res['data']['right_shoulder'] = list(preds[12])\n    res['data']['left_shoulder'] = list(preds[13])\n    res['data']['left_elbow'] = list(preds[14])\n    res['data']['left_wrist'] = list(preds[15])\n\n    return res\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    # extension\n    ext = '.jpg'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
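`get_max_preds` in processor.py recovers each joint's (x, y) location by taking the argmax of the flattened heatmap and converting the flat index back with `idx % width` and `idx // width`. A minimal sketch of that decoding (the helper name `max_pred_xy` is hypothetical; the confidence masking of the original is omitted):

```python
import numpy as np


def max_pred_xy(batch_heatmaps):
    # For heatmaps of shape [N, C, H, W], take the argmax of each flattened
    # C-channel map and convert the flat index to (x, y) coordinates.
    n, c, h, w = batch_heatmaps.shape
    flat = batch_heatmaps.reshape((n, c, -1))
    idx = np.argmax(flat, 2)
    x = idx % w
    y = idx // w
    return np.stack([x, y], axis=-1)


hm = np.zeros((1, 1, 4, 4), dtype='float32')
hm[0, 0, 2, 3] = 1.0  # peak at row 2 (y), column 3 (x)
print(max_pred_xy(hm)[0, 0])  # [3 2]
```

Note that `predict_results` then multiplies these coordinates by 4, the stride between the 96x96 heatmap grid and the 384x384 network input.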
  {
    "path": "modules/image/keypoint_detection/human_pose_estimation_resnet50_mpii/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"human_pose_estimation_resnet50_mpii\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('output_pose')\n\n    def test_keypoint_detection1(self):\n        results = self.module.keypoint_detection(\n            paths=['tests/test.jpg']\n        )\n        kps = results[0]['data']\n        self.assertIsInstance(kps, dict)\n\n    def test_keypoint_detection2(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        kps = results[0]['data']\n        self.assertIsInstance(kps, dict)\n\n    def test_keypoint_detection3(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=True\n        )\n        kps = results[0]['data']\n        self.assertIsInstance(kps, dict)\n\n    def test_keypoint_detection4(self):\n        results = self.module.keypoint_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True\n        )\n        kps = results[0]['data']\n        self.assertIsInstance(kps, dict)\n\n    def test_keypoint_detection5(self):\n        self.assertRaises(\n            AssertionError,\n            
self.module.keypoint_detection,\n            paths=['no.jpg']\n        )\n\n    def test_keypoint_detection6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.keypoint_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_body_estimation/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport time\nimport copy\nimport argparse\nfrom typing import Union\nfrom collections import OrderedDict\n\nimport cv2\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport paddlehub.vision.transforms as T\nfrom . import processor as P\n\n\n@moduleinfo(\n    name=\"openpose_body_estimation\",\n    type=\"CV/keypoint_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Openpose_body_estimation is a body pose estimation model based on Realtime Multi-Person 2D Pose \\\n            Estimation using Part Affinity Fields.\",\n    version=\"1.1.0\")\nclass BodyPoseModel(nn.Layer):\n    \"\"\"\n    BodyposeModel\n\n    Args:\n        load_checkpoint(str): Checkpoint save path, default is None.\n    \"\"\"\n\n    def __init__(self, load_checkpoint: str = None):\n        super(BodyPoseModel, self).__init__()\n\n        self.resize_func = P.ResizeScaling()\n        self.norm_func = T.Normalize(std=[1, 1, 1])\n        self.pad_func = P.PadDownRight()\n        self.remove_pad = P.RemovePadding()\n        self.get_peak = P.GetPeak()\n        self.get_connection = P.Connection()\n        self.get_candidate = P.Candidate()\n        self.draw_pose = P.DrawPose()\n\n        no_relu_layers = ['conv5_5_CPM_L1', 'conv5_5_CPM_L2', 
'Mconv7_stage2_L1', \\\n                          'Mconv7_stage2_L2', 'Mconv7_stage3_L1', 'Mconv7_stage3_L2', \\\n                          'Mconv7_stage4_L1', 'Mconv7_stage4_L2', 'Mconv7_stage5_L1', \\\n                          'Mconv7_stage5_L2', 'Mconv7_stage6_L1', 'Mconv7_stage6_L2']\n        blocks = {}\n        block0 = OrderedDict([('conv1_1', [3, 64, 3, 1, 1]), ('conv1_2', [64, 64, 3, 1, 1]),\n                              ('pool1_stage1', [2, 2, 0]), ('conv2_1', [64, 128, 3, 1, 1]),\n                              ('conv2_2', [128, 128, 3, 1, 1]), ('pool2_stage1', [2, 2, 0]),\n                              ('conv3_1', [128, 256, 3, 1, 1]), ('conv3_2', [256, 256, 3, 1, 1]),\n                              ('conv3_3', [256, 256, 3, 1, 1]), ('conv3_4', [256, 256, 3, 1, 1]),\n                              ('pool3_stage1', [2, 2, 0]), ('conv4_1', [256, 512, 3, 1, 1]),\n                              ('conv4_2', [512, 512, 3, 1, 1]), ('conv4_3_CPM', [512, 256, 3, 1, 1]),\n                              ('conv4_4_CPM', [256, 128, 3, 1, 1])])\n\n        block1_1 = OrderedDict([('conv5_1_CPM_L1', [128, 128, 3, 1, 1]), ('conv5_2_CPM_L1', [128, 128, 3, 1, 1]),\n                                ('conv5_3_CPM_L1', [128, 128, 3, 1, 1]), ('conv5_4_CPM_L1', [128, 512, 1, 1, 0]),\n                                ('conv5_5_CPM_L1', [512, 38, 1, 1, 0])])\n\n        block1_2 = OrderedDict([('conv5_1_CPM_L2', [128, 128, 3, 1, 1]), ('conv5_2_CPM_L2', [128, 128, 3, 1, 1]),\n                                ('conv5_3_CPM_L2', [128, 128, 3, 1, 1]), ('conv5_4_CPM_L2', [128, 512, 1, 1, 0]),\n                                ('conv5_5_CPM_L2', [512, 19, 1, 1, 0])])\n        blocks['block1_1'] = block1_1\n        blocks['block1_2'] = block1_2\n\n        self.model0 = self.make_layers(block0, no_relu_layers)\n\n        for i in range(2, 7):\n            blocks['block%d_1' % i] = OrderedDict([('Mconv1_stage%d_L1' 
% i, [185, 128, 7, 1, 3]),\n                                                   ('Mconv2_stage%d_L1' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv3_stage%d_L1' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv4_stage%d_L1' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv5_stage%d_L1' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv6_stage%d_L1' % i, [128, 128, 1, 1, 0]),\n                                                   ('Mconv7_stage%d_L1' % i, [128, 38, 1, 1, 0])])\n\n            blocks['block%d_2' % i] = OrderedDict([('Mconv1_stage%d_L2' % i, [185, 128, 7, 1, 3]),\n                                                   ('Mconv2_stage%d_L2' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv3_stage%d_L2' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv4_stage%d_L2' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv5_stage%d_L2' % i, [128, 128, 7, 1, 3]),\n                                                   ('Mconv6_stage%d_L2' % i, [128, 128, 1, 1, 0]),\n                                                   ('Mconv7_stage%d_L2' % i, [128, 19, 1, 1, 0])])\n\n        for k in blocks.keys():\n            blocks[k] = self.make_layers(blocks[k], no_relu_layers)\n\n        self.model1_1 = blocks['block1_1']\n        self.model2_1 = blocks['block2_1']\n        self.model3_1 = blocks['block3_1']\n        self.model4_1 = blocks['block4_1']\n        self.model5_1 = blocks['block5_1']\n        self.model6_1 = blocks['block6_1']\n\n        self.model1_2 = blocks['block1_2']\n        self.model2_2 = blocks['block2_2']\n        self.model3_2 = blocks['block3_2']\n        self.model4_2 = blocks['block4_2']\n        self.model5_2 = blocks['block5_2']\n        self.model6_2 = blocks['block6_2']\n\n        if load_checkpoint is 
not None:\n            self.model_dict = paddle.load(load_checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'openpose_body.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n\n    def make_layers(self, block: dict, no_relu_layers: list):\n        layers = []\n        for layer_name, v in block.items():\n            if 'pool' in layer_name:\n                layer = nn.MaxPool2D(kernel_size=v[0], stride=v[1], padding=v[2])\n                layers.append((layer_name, layer))\n            else:\n                conv2d = nn.Conv2D(in_channels=v[0], out_channels=v[1], kernel_size=v[2], stride=v[3], padding=v[4])\n                layers.append((layer_name, conv2d))\n                if layer_name not in no_relu_layers:\n                    layers.append(('relu_' + layer_name, nn.ReLU()))\n        layers = tuple(layers)\n        return nn.Sequential(*layers)\n\n    def transform(self, orgimg: np.ndarray, scale_search: float = 0.5):\n        process = self.resize_func(orgimg, scale_search)\n        imageToTest_padded, pad = self.pad_func(process)\n        process = self.norm_func(imageToTest_padded)\n        process = np.ascontiguousarray(np.transpose(process[:, :, :, np.newaxis], (3, 2, 0, 1))).astype(\"float32\")\n\n        return process, imageToTest_padded, pad\n\n    def forward(self, x: paddle.Tensor):\n\n        out1 = self.model0(x)\n\n        out1_1 = self.model1_1(out1)\n        out1_2 = self.model1_2(out1)\n        out2 = paddle.concat([out1_1, out1_2, out1], 1)\n\n        out2_1 = self.model2_1(out2)\n        out2_2 = self.model2_2(out2)\n        out3 = paddle.concat([out2_1, out2_2, out1], 1)\n\n        out3_1 = self.model3_1(out3)\n        out3_2 = self.model3_2(out3)\n        out4 = paddle.concat([out3_1, out3_2, 
out1], 1)\n\n        out4_1 = self.model4_1(out4)\n        out4_2 = self.model4_2(out4)\n        out5 = paddle.concat([out4_1, out4_2, out1], 1)\n\n        out5_1 = self.model5_1(out5)\n        out5_2 = self.model5_2(out5)\n        out6 = paddle.concat([out5_1, out5_2, out1], 1)\n\n        out6_1 = self.model6_1(out6)\n        out6_2 = self.model6_2(out6)\n\n        return out6_1, out6_2\n\n    def predict(self, img: Union[str, np.ndarray], save_path: str = \"openpose_body\", visualization: bool = True):\n        self.eval()\n        self.visualization = visualization\n        if isinstance(img, str):\n            orgImg = cv2.imread(img)\n        else:\n            orgImg = img\n        data, imageToTest_padded, pad = self.transform(orgImg)\n        Mconv7_stage6_L1, Mconv7_stage6_L2 = self.forward(paddle.to_tensor(data))\n        Mconv7_stage6_L1 = Mconv7_stage6_L1.numpy()\n        Mconv7_stage6_L2 = Mconv7_stage6_L2.numpy()\n\n        heatmap_avg = self.remove_pad(Mconv7_stage6_L2, imageToTest_padded, orgImg, pad)\n        paf_avg = self.remove_pad(Mconv7_stage6_L1, imageToTest_padded, orgImg, pad)\n\n        all_peaks = self.get_peak(heatmap_avg)\n        connection_all, special_k = self.get_connection(all_peaks, paf_avg, orgImg)\n        candidate, subset = self.get_candidate(all_peaks, connection_all, special_k)\n\n        canvas = copy.deepcopy(orgImg)\n        canvas = self.draw_pose(canvas, candidate, subset)\n        if self.visualization:\n            if not os.path.exists(save_path):\n                os.mkdir(save_path)\n            img_name = str(time.time()) + '.png'\n            save_path = os.path.join(save_path, img_name)\n            cv2.imwrite(save_path, canvas)\n\n        results = {'candidate': candidate, 'subset': subset, 'data': canvas}\n\n        return results\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for 
image in images]\n        results = self.predict(img=images_decode[0], **kwargs)\n        final = {}\n        final['candidate'] = P.cv2_to_base64(results['candidate'])\n        final['subset'] = P.cv2_to_base64(results['subset'])\n        final['data'] = P.cv2_to_base64(results['data'])\n\n        return final\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(img=args.input_path, save_path=args.output_dir, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='openpose_body', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_body_estimation/processor.py",
    "content": "import base64\nimport math\nfrom typing import Callable\n\nimport cv2\nimport numpy as np\nfrom scipy.ndimage import gaussian_filter\n\n\nclass PadDownRight:\n    \"\"\"\n    Get padding images.\n\n    Args:\n        stride(int): Stride for calculate pad value for edges.\n        padValue(int): Initialization for new area.\n    \"\"\"\n\n    def __init__(self, stride: int = 8, padValue: int = 128):\n        self.stride = stride\n        self.padValue = padValue\n\n    def __call__(self, img: np.ndarray):\n        h, w = img.shape[0:2]\n        pad = 4 * [0]\n        pad[2] = 0 if (h % self.stride == 0) else self.stride - (h % self.stride)  # down\n        pad[3] = 0 if (w % self.stride == 0) else self.stride - (w % self.stride)  # right\n\n        img_padded = img\n        pad_up = np.tile(img_padded[0:1, :, :] * 0 + self.padValue, (pad[0], 1, 1))\n        img_padded = np.concatenate((pad_up, img_padded), axis=0)\n        pad_left = np.tile(img_padded[:, 0:1, :] * 0 + self.padValue, (1, pad[1], 1))\n        img_padded = np.concatenate((pad_left, img_padded), axis=1)\n        pad_down = np.tile(img_padded[-2:-1, :, :] * 0 + self.padValue, (pad[2], 1, 1))\n        img_padded = np.concatenate((img_padded, pad_down), axis=0)\n        pad_right = np.tile(img_padded[:, -2:-1, :] * 0 + self.padValue, (1, pad[3], 1))\n        img_padded = np.concatenate((img_padded, pad_right), axis=1)\n\n        return img_padded, pad\n\n\nclass RemovePadding:\n    \"\"\"\n    Remove the padding values.\n\n    Args:\n        stride(int): Scales for resizing the images.\n\n    \"\"\"\n\n    def __init__(self, stride: int = 8):\n        self.stride = stride\n\n    def __call__(self, data: np.ndarray, imageToTest_padded: np.ndarray, oriImg: np.ndarray, pad: list) -> np.ndarray:\n        heatmap = np.transpose(np.squeeze(data), (1, 2, 0))\n        heatmap = cv2.resize(heatmap, (0, 0), fx=self.stride, fy=self.stride, interpolation=cv2.INTER_CUBIC)\n        heatmap = 
heatmap[:imageToTest_padded.shape[0] - pad[2], :imageToTest_padded.shape[1] - pad[3], :]\n        heatmap = cv2.resize(heatmap, (oriImg.shape[1], oriImg.shape[0]), interpolation=cv2.INTER_CUBIC)\n\n        return heatmap\n\n\nclass GetPeak:\n    \"\"\"\n    Get peak values and coordinate from input.\n\n    Args:\n        thresh(float): Threshold value for selecting peak value, default is 0.1.\n    \"\"\"\n\n    def __init__(self, thresh=0.1):\n        self.thresh = thresh\n\n    def __call__(self, heatmap: np.ndarray):\n        all_peaks = []\n        peak_counter = 0\n        for part in range(18):\n            map_ori = heatmap[:, :, part]\n            one_heatmap = gaussian_filter(map_ori, sigma=3)\n\n            map_left = np.zeros(one_heatmap.shape)\n            map_left[1:, :] = one_heatmap[:-1, :]\n            map_right = np.zeros(one_heatmap.shape)\n            map_right[:-1, :] = one_heatmap[1:, :]\n            map_up = np.zeros(one_heatmap.shape)\n            map_up[:, 1:] = one_heatmap[:, :-1]\n            map_down = np.zeros(one_heatmap.shape)\n            map_down[:, :-1] = one_heatmap[:, 1:]\n\n            peaks_binary = np.logical_and.reduce(\n                (one_heatmap >= map_left, one_heatmap >= map_right, one_heatmap >= map_up, one_heatmap >= map_down,\n                 one_heatmap > self.thresh))\n            peaks = list(zip(np.nonzero(peaks_binary)[1], np.nonzero(peaks_binary)[0]))  # note reverse\n            peaks_with_score = [x + (map_ori[x[1], x[0]], ) for x in peaks]\n            peak_id = range(peak_counter, peak_counter + len(peaks))\n            peaks_with_score_and_id = [peaks_with_score[i] + (peak_id[i], ) for i in range(len(peak_id))]\n\n            all_peaks.append(peaks_with_score_and_id)\n            peak_counter += len(peaks)\n\n        return all_peaks\n\n\nclass Connection:\n    \"\"\"\n    Get connection for selected estimation points.\n\n    Args:\n        mapIdx(list): Part Affinity Fields map index, default is None.\n    
    limbSeq(list): Peak candidate map index, default is None.\n\n    \"\"\"\n\n    def __init__(self, mapIdx: list = None, limbSeq: list = None):\n        if mapIdx and limbSeq:\n            self.mapIdx = mapIdx\n            self.limbSeq = limbSeq\n        else:\n            self.mapIdx = [[31, 32], [39, 40], [33, 34], [35, 36], [41, 42], [43, 44], [19, 20], [21, 22], \\\n                           [23, 24], [25, 26], [27, 28], [29, 30], [47, 48], [49, 50], [53, 54], [51, 52], \\\n                           [55, 56], [37, 38], [45, 46]]\n\n            self.limbSeq = [[2, 3], [2, 6], [3, 4], [4, 5], [6, 7], [7, 8], [2, 9], [9, 10], \\\n                            [10, 11], [2, 12], [12, 13], [13, 14], [2, 1], [1, 15], [15, 17], \\\n                            [1, 16], [16, 18], [3, 17], [6, 18]]\n        self.calculate_vector = CalculateVector()\n\n    def __call__(self, all_peaks: list, paf_avg: np.ndarray, orgimg: np.ndarray):\n        connection_all = []\n        special_k = []\n        for k in range(len(self.mapIdx)):\n            score_mid = paf_avg[:, :, [x - 19 for x in self.mapIdx[k]]]\n            candA = all_peaks[self.limbSeq[k][0] - 1]\n            candB = all_peaks[self.limbSeq[k][1] - 1]\n            nA = len(candA)\n            nB = len(candB)\n            if nA != 0 and nB != 0:\n                connection_candidate = self.calculate_vector(candA, candB, nA, nB, score_mid, orgimg)\n                connection_candidate = sorted(connection_candidate, key=lambda x: x[2], reverse=True)\n                connection = np.zeros((0, 5))\n                for c in range(len(connection_candidate)):\n                    i, j, s = connection_candidate[c][0:3]\n                    if i not in connection[:, 3] and j not in connection[:, 4]:\n                        connection = np.vstack([connection, [candA[i][3], candB[j][3], s, i, j]])\n                        if len(connection) >= min(nA, nB):\n                            break\n\n                
connection_all.append(connection)\n            else:\n                special_k.append(k)\n                connection_all.append([])\n\n        return connection_all, special_k\n\n\nclass CalculateVector:\n    \"\"\"\n    Vector decomposition and normalization, refer Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields\n    for more details.\n\n    Args:\n        thresh(float): Threshold value for selecting candidate vector, default is 0.05.\n    \"\"\"\n\n    def __init__(self, thresh: float = 0.05):\n        self.thresh = thresh\n\n    def __call__(self, candA: list, candB: list, nA: int, nB: int, score_mid: np.ndarray, oriImg: np.ndarray):\n        connection_candidate = []\n        for i in range(nA):\n            for j in range(nB):\n                vec = np.subtract(candB[j][:2], candA[i][:2])\n                norm = math.sqrt(vec[0] * vec[0] + vec[1] * vec[1]) + 1e-5\n                vec = np.divide(vec, norm)\n\n                startend = list(zip(np.linspace(candA[i][0], candB[j][0], num=10), \\\n                                    np.linspace(candA[i][1], candB[j][1], num=10)))\n\n                vec_x = np.array([score_mid[int(round(startend[I][1])), int(round(startend[I][0])), 0] \\\n                                  for I in range(len(startend))])\n                vec_y = np.array([score_mid[int(round(startend[I][1])), int(round(startend[I][0])), 1] \\\n                                  for I in range(len(startend))])\n\n                score_midpts = np.multiply(vec_x, vec[0]) + np.multiply(vec_y, vec[1])\n                score_with_dist_prior = sum(score_midpts) / len(score_midpts) + min(0.5 * oriImg.shape[0] / norm - 1, 0)\n                criterion1 = len(np.nonzero(score_midpts > self.thresh)[0]) > 0.8 * len(score_midpts)\n                criterion2 = score_with_dist_prior > 0\n                if criterion1 and criterion2:\n                    connection_candidate.append(\n                        [i, j, score_with_dist_prior, 
score_with_dist_prior + candA[i][2] + candB[j][2]])\n        return connection_candidate\n\n\nclass DrawPose:\n    \"\"\"\n    Draw Pose estimation results on canvas.\n\n    Args:\n        stickwidth(int): Angle value to draw approximate ellipse curve, default is 4.\n\n    \"\"\"\n\n    def __init__(self, stickwidth: int = 4):\n        self.stickwidth = stickwidth\n\n        self.limbSeq = [[2, 3], [2, 6], [3, 4], [4, 5], [6, 7], [7, 8], [2, 9], [9, 10], [10, 11], [2, 12], [12, 13],\n                        [13, 14], [2, 1], [1, 15], [15, 17], [1, 16], [16, 18], [3, 17], [6, 18]]\n\n        self.colors = [[255, 0, 0], [255, 85, 0], [255, 170, 0], [255, 255, 0],\n                       [170, 255, 0], [85, 255, 0], [0, 255, 0], [0, 255, 85], [0, 255, 170], [0, 255, 255],\n                       [0, 170, 255], [0, 85, 255], [0, 0, 255], [85, 0, 255], [170, 0, 255], [255, 0, 255],\n                       [255, 0, 170], [255, 0, 85]]\n\n    def __call__(self, canvas: np.ndarray, candidate: np.ndarray, subset: np.ndarray):\n        for i in range(18):\n            for n in range(len(subset)):\n                index = int(subset[n][i])\n                if index == -1:\n                    continue\n                x, y = candidate[index][0:2]\n                cv2.circle(canvas, (int(x), int(y)), 4, self.colors[i], thickness=-1)\n        for i in range(17):\n            for n in range(len(subset)):\n                index = subset[n][np.array(self.limbSeq[i]) - 1]\n                if -1 in index:\n                    continue\n                cur_canvas = canvas.copy()\n                Y = candidate[index.astype(int), 0]\n                X = candidate[index.astype(int), 1]\n                mX = np.mean(X)\n                mY = np.mean(Y)\n                length = ((X[0] - X[1])**2 + (Y[0] - Y[1])**2)**0.5\n                angle = math.degrees(math.atan2(X[0] - X[1], Y[0] - Y[1]))\n                polygon = cv2.ellipse2Poly((int(mY), int(mX)), (int(length / 2), 
self.stickwidth), \\\n                                           int(angle), 0, 360, 1)\n                cv2.fillConvexPoly(cur_canvas, polygon, self.colors[i])\n                canvas = cv2.addWeighted(canvas, 0.4, cur_canvas, 0.6, 0)\n        return canvas\n\n\nclass Candidate:\n    \"\"\"\n    Select candidate for body pose estimation.\n\n    Args:\n        mapIdx(list): Part Affinity Fields map index, default is None.\n        limbSeq(list): Peak candidate map index, default is None.\n    \"\"\"\n\n    def __init__(self, mapIdx: list = None, limbSeq: list = None):\n        if mapIdx and limbSeq:\n            self.mapIdx = mapIdx\n            self.limbSeq = limbSeq\n        else:\n            self.mapIdx = [[31, 32], [39, 40], [33, 34], [35, 36], [41, 42], [43, 44], [19, 20], [21, 22], \\\n                           [23, 24], [25, 26], [27, 28], [29, 30], [47, 48], [49, 50], [53, 54], [51, 52], \\\n                           [55, 56], [37, 38], [45, 46]]\n            self.limbSeq = [[2, 3], [2, 6], [3, 4], [4, 5], [6, 7], [7, 8], [2, 9], [9, 10], \\\n                            [10, 11], [2, 12], [12, 13], [13, 14], [2, 1], [1, 15], [15, 17], \\\n                            [1, 16], [16, 18], [3, 17], [6, 18]]\n\n    def __call__(self, all_peaks: list, connection_all: list, special_k: list):\n        subset = -1 * np.ones((0, 20))\n        candidate = np.array([item for sublist in all_peaks for item in sublist])\n        for k in range(len(self.mapIdx)):\n            if k not in special_k:\n                partAs = connection_all[k][:, 0]\n                partBs = connection_all[k][:, 1]\n                indexA, indexB = np.array(self.limbSeq[k]) - 1\n\n                for i in range(len(connection_all[k])):  # = 1:size(temp,1)\n                    found = 0\n                    subset_idx = [-1, -1]\n                    for j in range(len(subset)):  # 1:size(subset,1):\n                        if subset[j][indexA] == partAs[i] or subset[j][indexB] == 
partBs[i]:\n                            subset_idx[found] = j\n                            found += 1\n\n                    if found == 1:\n                        j = subset_idx[0]\n                        if subset[j][indexB] != partBs[i]:\n                            subset[j][indexB] = partBs[i]\n                            subset[j][-1] += 1\n                            subset[j][-2] += candidate[partBs[i].astype(int), 2] + connection_all[k][i][2]\n                    elif found == 2:  # if found 2 and disjoint, merge them\n                        j1, j2 = subset_idx\n                        membership = ((subset[j1] >= 0).astype(int) + (subset[j2] >= 0).astype(int))[:-2]\n                        if len(np.nonzero(membership == 2)[0]) == 0:  # merge\n                            subset[j1][:-2] += (subset[j2][:-2] + 1)\n                            subset[j1][-2:] += subset[j2][-2:]\n                            subset[j1][-2] += connection_all[k][i][2]\n                            subset = np.delete(subset, j2, 0)\n                        else:  # as like found == 1\n                            subset[j1][indexB] = partBs[i]\n                            subset[j1][-1] += 1\n                            subset[j1][-2] += candidate[partBs[i].astype(int), 2] + connection_all[k][i][2]\n\n                    # if find no partA in the subset, create a new subset\n                    elif not found and k < 17:\n                        row = -1 * np.ones(20)\n                        row[indexA] = partAs[i]\n                        row[indexB] = partBs[i]\n                        row[-1] = 2\n                        row[-2] = sum(candidate[connection_all[k][i, :2].astype(int), 2]) + connection_all[k][i][2]\n                        subset = np.vstack([subset, row])\n        # delete some rows of subset which has few parts occur\n        deleteIdx = []\n        for i in range(len(subset)):\n            if subset[i][-1] < 4 or subset[i][-2] / subset[i][-1] < 0.4:\n          
      deleteIdx.append(i)\n        subset = np.delete(subset, deleteIdx, axis=0)\n\n        return candidate, subset\n\n\nclass ResizeScaling:\n    \"\"\"Resize images by scaling method.\n\n    Args:\n        target(int): Target image size.\n        interpolation(Callable): Interpolation method.\n    \"\"\"\n\n    def __init__(self, target: int = 368, interpolation: Callable = cv2.INTER_CUBIC):\n        self.target = target\n        self.interpolation = interpolation\n\n    def __call__(self, img, scale_search):\n        scale = scale_search * self.target / img.shape[0]\n        resize_img = cv2.resize(img, (0, 0), fx=scale, fy=scale, interpolation=self.interpolation)\n        return resize_img\n\n\ndef cv2_to_base64(image: np.ndarray):\n    data = cv2.imencode('.jpg', image)[1]\n    # ndarray.tostring() is deprecated; tobytes() is the supported equivalent.\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring() is deprecated for binary input; use np.frombuffer().\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_body_estimation/readme.md",
    "content": "# openpose_body_estimation\n\n| Module Name         |  openpose_body_estimation  |\n| :------------------ | :------------------------: |\n| Category            | Image - Keypoint Detection |\n| Network             | two-branch multi-stage CNN |\n| Dataset             |      MPII, COCO 2016       |\n| Fine-tuning supported |            No            |\n| Module Size         |            185M            |\n| Latest update date  |         2021-06-28         |\n| Data metrics        |             -              |\n\n## I. Basic Information\n\n- ### Application Effect Display\n  - Body keypoints (left) and model prediction (right)\n  \n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/76040149/133232647-011528a1-32f3-416f-a618-17ffbeba6bab.png\" height = \"300\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/76040149/133232724-30979d86-8688-483e-abc3-a9159695a56c.png\" height = \"300\" hspace='10'/>\n    </p>\n    \n- ### Module Introduction\n  - openpose_body_estimation is a body keypoint detection model based on 'Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields'. It can be used together with the openpose_hands_estimation module to detect both body and hand keypoints.\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install openpose_body_estimation\n    ```\n\n  - If you encounter problems during installation, see: [Windows installation guide](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [Linux installation guide](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS installation guide](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## III. Module API Prediction\n\n- ### 1. Command-Line Prediction\n\n  - ```shell\n    $ hub run openpose_body_estimation --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - This invokes the module from the command line; for more options see [PaddleHub command-line usage](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    \n    model = hub.Module(name='openpose_body_estimation')\n    result = model.predict('/PATH/TO/IMAGE')\n    model.save_inference_model('/PATH/TO/SAVE/MODEL')\n    \n    # To download the PaddleHub example image:\n    # wget https://paddlehub.bj.bcebos.com/resources/test_image.jpg\n    ```\n  \n- ### 3. API\n\n  - ```python\n    def __init__(self, load_checkpoint: str = None):\n    ```\n\n    - Model initialization.\n    - **Parameters**\n      - load_checkpoint(str): path to a user-provided body estimation checkpoint. When None (the default), the pretrained model provided by PaddleHub is used.\n\n  - ```python\n    def predict(self,\n                img, \n                save_path='openpose_body',  \n                visualization=True):\n    ```\n    \n    - Detect the body keypoints of every person in the input image.\n    - **Parameters**\n      - img (numpy.ndarray|str): image data, either an image path or a numpy.ndarray in BGR format;\n      - save_path (str): save path for output images, default is 'openpose_body';\n      - visualization (bool): whether to save the prediction result as an image file;\n    - **Returns**\n      - res (dict): prediction results, a dict with the following three fields:\n        - data: the visualized image (numpy.ndarray, BGR format);\n        - candidate: coordinates of all body keypoints in the image;\n        - subset: per-person indices into the keypoint coordinates.\n    \n  - ```python\n    def save_inference_model(save_dir):\n    ```\n\n    - Save the model to the specified path.\n    - **Parameters**\n      - save_dir(str): name of the directory in which to save the model.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online body keypoint detection service.\n\n- ### Step 1: Start the PaddleHub Serving service\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m openpose_body_estimation -p 8866\n    ```\n\n  - This deploys the body keypoint detection API service; the default port is 8866.\n\n  - **NOTE:** To predict on GPU, set the CUDA\\_VISIBLE\\_DEVICES environment variable before starting the service; otherwise no setting is needed.\n\n- ### Step 2: Send prediction requests\n\n  - With the server configured, the few lines of code below send a prediction request and retrieve the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    \n    import numpy as np\n    \n    \n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n    \n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n    \n    # Send the HTTP request\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/openpose_body_estimation\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    canvas = base64_to_cv2(r.json()[\"results\"]['data'])\n    cv2.imwrite('keypoint_body.png', canvas)\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  * ```shell\n    $ hub install openpose_body_estimation==1.1.0\n    ```\n\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_body_estimation/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"openpose_body_estimation\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('openpose_body')\n\n    def test_predict1(self):\n        results = self.module.predict(\n            img='tests/test.jpg',\n            visualization=False\n        )\n        kps = results['candidate'].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict2(self):\n        results = self.module.predict(\n            img=cv2.imread('tests/test.jpg'),\n            visualization=False\n        )\n        kps = results['candidate'].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict3(self):\n        results = self.module.predict(\n            img=cv2.imread('tests/test.jpg'),\n            visualization=True\n        )\n        kps = results['candidate'].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict4(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.predict,\n            img='no.jpg'\n        )\n\n    def test_predict5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.predict,\n            img=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n    
    self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/openpose_body_estimation.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/openpose_body_estimation.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_hands_estimation/module.py",
    "content": "# copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport copy\nimport time\nimport argparse\nfrom typing import Union\nfrom collections import OrderedDict\n\nimport cv2\nimport paddle\nimport numpy as np\nimport paddle.nn as nn\nimport paddlehub as hub\nfrom skimage.measure import label\nfrom scipy.ndimage import gaussian_filter\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport paddlehub.vision.transforms as T\n\nfrom . 
import processor as P\n\n\n@moduleinfo(\n    name=\"openpose_hands_estimation\",\n    type=\"CV/image_editing\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Openpose_hands_estimation is a hand pose estimation model based on Hand Keypoint Detection in \\\n            Single Images using Multiview Bootstrapping.\",\n    version=\"1.1.0\")\nclass HandPoseModel(nn.Layer):\n    \"\"\"\n    HandposeModel\n\n    Args:\n        load_checkpoint(str): Checkpoint save path, default is None.\n    \"\"\"\n\n    def __init__(self, load_checkpoint: str = None):\n        super(HandPoseModel, self).__init__()\n\n        self.norm_func = T.Normalize(std=[1, 1, 1])\n        self.resize_func = P.ResizeScaling()\n        self.hand_detect = P.HandDetect()\n        self.pad_func = P.PadDownRight()\n        self.remove_pad = P.RemovePadding()\n        self.draw_pose = P.DrawPose()\n        self.draw_hand = P.DrawHandPose()\n\n        no_relu_layers = ['conv6_2_CPM', 'Mconv7_stage2', 'Mconv7_stage3', \\\n                          'Mconv7_stage4', 'Mconv7_stage5', 'Mconv7_stage6']\n\n        block1_0 = OrderedDict([('conv1_1', [3, 64, 3, 1, 1]), ('conv1_2', [64, 64, 3, 1, 1]),\n                                ('pool1_stage1', [2, 2, 0]), ('conv2_1', [64, 128, 3, 1, 1]),\n                                ('conv2_2', [128, 128, 3, 1, 1]), ('pool2_stage1', [2, 2, 0]),\n                                ('conv3_1', [128, 256, 3, 1, 1]), ('conv3_2', [256, 256, 3, 1, 1]),\n                                ('conv3_3', [256, 256, 3, 1, 1]), ('conv3_4', [256, 256, 3, 1, 1]),\n                                ('pool3_stage1', [2, 2, 0]), ('conv4_1', [256, 512, 3, 1, 1]),\n                                ('conv4_2', [512, 512, 3, 1, 1]), ('conv4_3', [512, 512, 3, 1, 1]),\n                                ('conv4_4', [512, 512, 3, 1, 1]), ('conv5_1', [512, 512, 3, 1, 1]),\n                                ('conv5_2', [512, 512, 3, 1, 1]), ('conv5_3_CPM', [512, 128, 3, 1, 1])])\n\n     
   block1_1 = OrderedDict([('conv6_1_CPM', [128, 512, 1, 1, 0]), ('conv6_2_CPM', [512, 22, 1, 1, 0])])\n\n        blocks = {}\n        blocks['block1_0'] = block1_0\n        blocks['block1_1'] = block1_1\n\n        for i in range(2, 7):\n            blocks['block%d' % i] = OrderedDict([('Mconv1_stage%d' % i, [150, 128, 7, 1, 3]),\n                                                 ('Mconv2_stage%d' % i, [128, 128, 7, 1, 3]),\n                                                 ('Mconv3_stage%d' % i, [128, 128, 7, 1, 3]),\n                                                 ('Mconv4_stage%d' % i, [128, 128, 7, 1, 3]),\n                                                 ('Mconv5_stage%d' % i, [128, 128, 7, 1, 3]),\n                                                 ('Mconv6_stage%d' % i, [128, 128, 1, 1, 0]),\n                                                 ('Mconv7_stage%d' % i, [128, 22, 1, 1, 0])])\n\n        for k in blocks.keys():\n            blocks[k] = self.make_layers(blocks[k], no_relu_layers)\n\n        self.model1_0 = blocks['block1_0']\n        self.model1_1 = blocks['block1_1']\n        self.model2 = blocks['block2']\n        self.model3 = blocks['block3']\n        self.model4 = blocks['block4']\n        self.model5 = blocks['block5']\n        self.model6 = blocks['block6']\n\n        if load_checkpoint is not None:\n            self.model_dict = paddle.load(load_checkpoint)[0]\n            self.set_dict(self.model_dict)\n            print(\"load custom checkpoint success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'openpose_hand.pdparams')\n            self.model_dict = paddle.load(checkpoint)\n            self.set_dict(self.model_dict)\n            print(\"load pretrained checkpoint success\")\n        self.body_model = None\n\n    def make_layers(self, block: dict, no_relu_layers: list):\n        layers = []\n        for layer_name, v in block.items():\n            if 'pool' in layer_name:\n                layer = 
nn.MaxPool2D(kernel_size=v[0], stride=v[1], padding=v[2])\n                layers.append((layer_name, layer))\n            else:\n                conv2d = nn.Conv2D(in_channels=v[0], out_channels=v[1], kernel_size=v[2], stride=v[3], padding=v[4])\n                layers.append((layer_name, conv2d))\n                if layer_name not in no_relu_layers:\n                    layers.append(('relu_' + layer_name, nn.ReLU()))\n        layers = tuple(layers)\n        return nn.Sequential(*layers)\n\n    def forward(self, x: paddle.Tensor):\n        out1_0 = self.model1_0(x)\n        out1_1 = self.model1_1(out1_0)\n        concat_stage2 = paddle.concat([out1_1, out1_0], 1)\n        out_stage2 = self.model2(concat_stage2)\n        concat_stage3 = paddle.concat([out_stage2, out1_0], 1)\n        out_stage3 = self.model3(concat_stage3)\n        concat_stage4 = paddle.concat([out_stage3, out1_0], 1)\n        out_stage4 = self.model4(concat_stage4)\n        concat_stage5 = paddle.concat([out_stage4, out1_0], 1)\n        out_stage5 = self.model5(concat_stage5)\n        concat_stage6 = paddle.concat([out_stage5, out1_0], 1)\n        out_stage6 = self.model6(concat_stage6)\n        return out_stage6\n\n    def hand_estimation(self, handimg: np.ndarray, scale_search: list):\n        heatmap_avg = np.zeros((handimg.shape[0], handimg.shape[1], 22))\n        for scale in scale_search:\n            process = self.resize_func(handimg, scale)\n            imageToTest_padded, pad = self.pad_func(process)\n            process = self.norm_func(imageToTest_padded)\n            process = np.ascontiguousarray(np.transpose(process[:, :, :, np.newaxis], (3, 2, 0, 1))).astype(\"float32\")\n            data = self.forward(paddle.to_tensor(process))\n            data = data.numpy()\n            heatmap = self.remove_pad(data, imageToTest_padded, handimg, pad)\n            heatmap_avg += heatmap / len(scale_search)\n\n        all_peaks = []\n        for part in range(21):\n            map_ori = 
heatmap_avg[:, :, part]\n            one_heatmap = gaussian_filter(map_ori, sigma=3)\n            binary = np.ascontiguousarray(one_heatmap > 0.05, dtype=np.uint8)\n            if np.sum(binary) == 0:\n                all_peaks.append([0, 0])\n                continue\n            label_img, label_numbers = label(binary, return_num=True, connectivity=binary.ndim)\n            max_index = np.argmax([np.sum(map_ori[label_img == i]) for i in range(1, label_numbers + 1)]) + 1\n            label_img[label_img != max_index] = 0\n            map_ori[label_img == 0] = 0\n\n            y, x = P.npmax(map_ori)\n            all_peaks.append([x, y])\n\n        return np.array(all_peaks)\n\n    def predict(self,\n                img: Union[str, np.ndarray],\n                save_path: str = 'openpose_hand',\n                scale: list = [0.5, 1.0, 1.5, 2.0],\n                visualization: bool = True):\n        self.eval()\n        self.visualization = visualization\n        if isinstance(img, str):\n            org_img = cv2.imread(img)\n        else:\n            org_img = img\n\n        if not self.body_model:\n            self.body_model = hub.Module(name='openpose_body_estimation')\n            self.body_model.eval()\n\n        body_result = self.body_model.predict(org_img)\n        hands_list = self.hand_detect(body_result['candidate'], body_result['subset'], org_img)\n\n        all_hand_peaks = []\n\n        for x, y, w, is_left in hands_list:\n            peaks = self.hand_estimation(org_img[y:y + w, x:x + w, :], scale)\n            peaks[:, 0] = np.where(peaks[:, 0] == 0, peaks[:, 0], peaks[:, 0] + x)\n            peaks[:, 1] = np.where(peaks[:, 1] == 0, peaks[:, 1], peaks[:, 1] + y)\n            all_hand_peaks.append(peaks)\n        canvas = copy.deepcopy(org_img)\n        canvas = self.draw_pose(\n            canvas,\n            body_result['candidate'],\n            body_result['subset'],\n        )\n        canvas = self.draw_hand(canvas, all_hand_peaks)\n       
 if self.visualization:\n            if not os.path.exists(save_path):\n                os.mkdir(save_path)\n            img_name = str(time.time()) + '.png'\n            save_path = os.path.join(save_path, img_name)\n            cv2.imwrite(save_path, canvas)\n\n        results = {'all_hand_peaks': all_hand_peaks, 'data': canvas}\n\n        return results\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n        results = self.predict(img=images_decode[0], **kwargs)\n        final = {}\n        final['all_hand_peaks'] = [peak.tolist() for peak in results['all_hand_peaks']]\n        final['data'] = P.cv2_to_base64(results['data'])\n        return final\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(\n            img=args.input_path, save_path=args.output_dir, scale=args.scale, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='openpose_hand', help=\"The directory to save output images.\")\n        # argparse's type=list would split the argument string into single characters,\n        # so parse a comma-separated list of floats instead.\n        self.arg_config_group.add_argument(\n            '--scale',\n            type=lambda s: [float(x) for x in s.split(',')],\n            default=[0.5, 1.0, 1.5, 2.0],\n            help=\"Comma-separated search scales for the openpose hands model.\")\n        # type=bool is a trap: bool('False') is True, so parse the flag string explicitly.\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=lambda s: str(s).lower() in ('true', 't', 'yes', 'y', '1'),\n            default=True,\n            help=\"Whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_hands_estimation/processor.py",
    "content": "import os\nimport math\nimport base64\nfrom typing import Callable\n\nimport cv2\nimport numpy as np\nimport matplotlib\n# Select the non-interactive backend before pyplot is imported.\nmatplotlib.use('Agg')\nfrom matplotlib import pyplot as plt\nfrom matplotlib.figure import Figure\nfrom matplotlib.backends.backend_agg import FigureCanvasAgg as FigureCanvas\n\n\nclass HandDetect:\n    \"\"\"\n    Detect hand pose information from body pose estimation result.\n\n    Args:\n        ratioWristElbow(float): Ratio used to adjust the wrist center, default is 0.33.\n    \"\"\"\n\n    def __init__(self, ratioWristElbow: float = 0.33):\n        self.ratioWristElbow = ratioWristElbow\n\n    def __call__(self, candidate: np.ndarray, subset: np.ndarray, oriImg: np.ndarray):\n        detect_result = []\n        image_height, image_width = oriImg.shape[0:2]\n        for person in subset.astype(int):\n            has_left = np.sum(person[[5, 6, 7]] == -1) == 0\n            has_right = np.sum(person[[2, 3, 4]] == -1) == 0\n            if not (has_left or has_right):\n                continue\n            hands = []\n            # left hand\n            if has_left:\n                left_shoulder_index, left_elbow_index, left_wrist_index = person[[5, 6, 7]]\n                x1, y1 = candidate[left_shoulder_index][:2]\n                x2, y2 = candidate[left_elbow_index][:2]\n                x3, y3 = candidate[left_wrist_index][:2]\n                hands.append([x1, y1, x2, y2, x3, y3, True])\n            # right hand\n            if has_right:\n                right_shoulder_index, right_elbow_index, right_wrist_index = person[[2, 3, 4]]\n                x1, y1 = candidate[right_shoulder_index][:2]\n                x2, y2 = candidate[right_elbow_index][:2]\n                x3, y3 = candidate[right_wrist_index][:2]\n                hands.append([x1, y1, x2, y2, x3, y3, False])\n\n            for x1, y1, x2, y2, x3, y3, is_left in hands:\n\n                x = x3 + self.ratioWristElbow * (x3 - x2)\n                y = y3 + 
self.ratioWristElbow * (y3 - y2)\n                distanceWristElbow = math.sqrt((x3 - x2)**2 + (y3 - y2)**2)\n                distanceElbowShoulder = math.sqrt((x2 - x1)**2 + (y2 - y1)**2)\n                width = 1.5 * max(distanceWristElbow, 0.9 * distanceElbowShoulder)\n\n                x -= width / 2\n                y -= width / 2\n\n                if x < 0: x = 0\n                if y < 0: y = 0\n                width1 = width\n                width2 = width\n                if x + width > image_width: width1 = image_width - x\n                if y + width > image_height: width2 = image_height - y\n                width = min(width1, width2)\n\n                if width >= 20:\n                    detect_result.append([int(x), int(y), int(width), is_left])\n\n        return detect_result\n\n\nclass PadDownRight:\n    \"\"\"\n    Pad the bottom and right edges of an image to a multiple of stride.\n\n    Args:\n        stride(int): Stride used to compute the padding for the bottom and right edges.\n        padValue(int): Fill value for the padded area.\n    \"\"\"\n\n    def __init__(self, stride: int = 8, padValue: int = 128):\n        self.stride = stride\n        self.padValue = padValue\n\n    def __call__(self, img: np.ndarray):\n        h, w = img.shape[0:2]\n        pad = 4 * [0]\n        pad[2] = 0 if (h % self.stride == 0) else self.stride - (h % self.stride)  # down\n        pad[3] = 0 if (w % self.stride == 0) else self.stride - (w % self.stride)  # right\n\n        img_padded = img\n        pad_up = np.tile(img_padded[0:1, :, :] * 0 + self.padValue, (pad[0], 1, 1))\n        img_padded = np.concatenate((pad_up, img_padded), axis=0)\n        pad_left = np.tile(img_padded[:, 0:1, :] * 0 + self.padValue, (1, pad[1], 1))\n        img_padded = np.concatenate((pad_left, img_padded), axis=1)\n        pad_down = np.tile(img_padded[-2:-1, :, :] * 0 + self.padValue, (pad[2], 1, 1))\n        img_padded = np.concatenate((img_padded, pad_down), axis=0)\n        pad_right = np.tile(img_padded[:, -2:-1, :] * 0 + self.padValue, (1, 
pad[3], 1))\n        img_padded = np.concatenate((img_padded, pad_right), axis=1)\n\n        return img_padded, pad\n\n\nclass RemovePadding:\n    \"\"\"\n    Remove the padding values.\n\n    Args:\n        stride(int): Upsampling factor used to resize the heatmap back to the input resolution.\n\n    \"\"\"\n\n    def __init__(self, stride: int = 8):\n        self.stride = stride\n\n    def __call__(self, data: np.ndarray, imageToTest_padded: np.ndarray, oriImg: np.ndarray, pad: list) -> np.ndarray:\n        heatmap = np.transpose(np.squeeze(data), (1, 2, 0))\n        heatmap = cv2.resize(heatmap, (0, 0), fx=self.stride, fy=self.stride, interpolation=cv2.INTER_CUBIC)\n        heatmap = heatmap[:imageToTest_padded.shape[0] - pad[2], :imageToTest_padded.shape[1] - pad[3], :]\n        heatmap = cv2.resize(heatmap, (oriImg.shape[1], oriImg.shape[0]), interpolation=cv2.INTER_CUBIC)\n\n        return heatmap\n\n\nclass DrawPose:\n    \"\"\"\n    Draw pose estimation results on canvas.\n\n    Args:\n        stickwidth(int): Half-width of the ellipse used to draw each limb, default is 4.\n\n    \"\"\"\n\n    def __init__(self, stickwidth: int = 4):\n        self.stickwidth = stickwidth\n\n        self.limbSeq = [[2, 3], [2, 6], [3, 4], [4, 5], [6, 7], [7, 8], [2, 9], [9, 10], [10, 11], [2, 12], [12, 13],\n                        [13, 14], [2, 1], [1, 15], [15, 17], [1, 16], [16, 18], [3, 17], [6, 18]]\n\n        self.colors = [[255, 0, 0], [255, 85, 0], [255, 170, 0], [255, 255, 0],\n                       [170, 255, 0], [85, 255, 0], [0, 255, 0], [0, 255, 85], [0, 255, 170], [0, 255, 255],\n                       [0, 170, 255], [0, 85, 255], [0, 0, 255], [85, 0, 255], [170, 0, 255], [255, 0, 255],\n                       [255, 0, 170], [255, 0, 85]]\n\n    def __call__(self, canvas: np.ndarray, candidate: np.ndarray, subset: np.ndarray):\n        for i in range(18):\n            for n in range(len(subset)):\n                index = int(subset[n][i])\n                if index == -1:\n                    continue\n                x, y = candidate[index][0:2]\n                cv2.circle(canvas, (int(x), int(y)), 4, self.colors[i], thickness=-1)\n        for i in range(17):\n            for n in range(len(subset)):\n                index = subset[n][np.array(self.limbSeq[i]) - 1]\n                if -1 in index:\n                    continue\n                cur_canvas = canvas.copy()\n                Y = candidate[index.astype(int), 0]\n                X = candidate[index.astype(int), 1]\n                mX = np.mean(X)\n                mY = np.mean(Y)\n                length = ((X[0] - X[1])**2 + (Y[0] - Y[1])**2)**0.5\n                angle = math.degrees(math.atan2(X[0] - X[1], Y[0] - Y[1]))\n                polygon = cv2.ellipse2Poly((int(mY), int(mX)), (int(length / 2), self.stickwidth), int(angle), 0, 360,\n                                           1)\n                cv2.fillConvexPoly(cur_canvas, polygon, self.colors[i])\n                canvas = cv2.addWeighted(canvas, 0.4, cur_canvas, 0.6, 0)\n        return canvas\n\n\nclass DrawHandPose:\n    \"\"\"\n    Draw hand pose estimation results on canvas.\n\n    Args:\n        show_number(bool): Whether to show estimation ids in canvas, default is False.\n    \"\"\"\n\n    def __init__(self, show_number: bool = False):\n        self.edges = [[0, 1], [1, 2], [2, 3], [3, 4], [0, 5], [5, 6], [6, 7], [7, 8], [0, 9], [9, 10], \\\n                      [10, 11], [11, 12], [0, 13], [13, 14], [14, 15], [15, 16], [0, 17], [17, 18], [18, 19], [19, 20]]\n        self.show_number = show_number\n\n    def __call__(self, canvas: np.ndarray, all_hand_peaks: list):\n        fig = Figure(figsize=plt.figaspect(canvas))\n\n        fig.subplots_adjust(0, 0, 1, 1)\n        fig.subplots_adjust(bottom=0, top=1, left=0, right=1)\n        bg = FigureCanvas(fig)\n        ax = fig.subplots()\n        ax.axis('off')\n        ax.imshow(canvas)\n\n        width, height = ax.figure.get_size_inches() * ax.figure.get_dpi()\n\n        for peaks in all_hand_peaks:\n            for ie, e in enumerate(self.edges):\n                if np.sum(np.all(peaks[e], axis=1) == 0) == 0:\n                    x1, y1 = peaks[e[0]]\n                    x2, y2 = peaks[e[1]]\n                    ax.plot([x1, x2], [y1, y2],\n                            color=matplotlib.colors.hsv_to_rgb([ie / float(len(self.edges)), 1.0, 1.0]))\n\n            for i, keypoint in enumerate(peaks):\n                x, y = keypoint\n                ax.plot(x, y, 'r.')\n                if self.show_number:\n                    ax.text(x, y, str(i))\n        bg.draw()\n        canvas = np.frombuffer(bg.tostring_rgb(), dtype='uint8').reshape(int(height), int(width), 3)\n        return canvas\n\n\nclass ResizeScaling:\n    \"\"\"Resize images by scaling method.\n\n    Args:\n        target(int): Target image size.\n        interpolation(Callable): Interpolation method.\n    \"\"\"\n\n    def __init__(self, target: int = 368, interpolation: Callable = cv2.INTER_CUBIC):\n        self.target = target\n        self.interpolation = interpolation\n\n    def __call__(self, img, scale_search):\n        scale = scale_search * self.target / img.shape[0]\n        resize_img = cv2.resize(img, (0, 0), fx=scale, fy=scale, interpolation=self.interpolation)\n        return resize_img\n\n\ndef npmax(array: np.ndarray):\n    \"\"\"Get max value and index.\"\"\"\n    arrayindex = array.argmax(1)\n    arrayvalue = array.max(1)\n    i = arrayvalue.argmax()\n    j = arrayindex[i]\n    return i, j\n\n\ndef cv2_to_base64(image: np.ndarray):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n
  },
  {
    "path": "modules/image/keypoint_detection/openpose_hands_estimation/readme.md",
    "content": "# openpose_hands_estimation\n\n| 模型名称            | hand_pose_localization |\n| :------------------ | :--------------------: |\n| 类别                |    图像-关键点检测     |\n| 网络                |           -            |\n| 数据集              |       MPII, NZSL       |\n| 是否支持Fine-tuning |           否           |\n| 模型大小            |          130M          |\n| 最新更新日期        |       2021-06-02       |\n| 数据指标            |           -            |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 手部关键点展示（左）、预测效果（右）\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133233743-dc6e3aaf-27fd-4f7d-95be-21c9383a2ea1.png\" height=\"300\"><img src=\"https://user-images.githubusercontent.com/76040149/133234189-f7a47940-2be2-445c-8043-b490b5402e15.png\" height=\"300\">\n</p>\n\n- ### 模型介绍\n  - openpose_hands_estimation是基于 'Hand Keypoint Detection in Single Images using Multiview Bootstrapping' 构建的用于手部关键点检测的模型。\n  \n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  - scikit-image\n  - scipy\n  - ```shell\n    $ pip install scikit-image scipy\n    ```\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install openpose_hands_estimation\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run openpose_hands_estimation --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - Note：本模型先识别人体关键点以确定2个手的位置，再识别手部关键点；输入图片建议为半身照或全身照，手部没有遮挡；本模型需要用到openpose_body_estimation，若未安装则推理前会自动安装\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    model = hub.Module(name='openpose_hands_estimation')\n    result = 
model.predict('/PATH/TO/IMAGE')\n    model.save_inference_model('/PATH/TO/SAVE/MODEL')\n    ```\n  \n- ### 3、API\n\n  - ```python\n    def __init__(load_checkpoint: str = None):\n    ```\n\n    - **参数**\n      - load_checkpoint(str): 手部检测模型，用户可以指定自己的模型地址。 默认为None时，会使用PaddleHub提供的默认模型。\n\n  - ```python\n    def predict(img, \n                save_path='openpose_hand', \n                scale=[0.5, 1.0, 1.5, 2.0], \n                visualization=True):\n    ```\n\n    - 识别输入图片中的所有人手部关键点。\n    - **参数**\n      - img (numpy.ndarray|str): 图片数据，使用图片路径或者输入numpy.ndarray，BGR格式；\n      - save_path (str): 图片保存路径， 默认为openpose_hand；\n      - scale (list): 搜索关键点时使用图片的不同尺度；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n    - **返回**\n      - res (dict): 识别结果，dict 中包含以下两个字段：\n        - data : 可视化图片内容（numpy.ndarray，BGR格式）；\n        - all_hand_peaks: 图片中手部关键点坐标\n\n  - ```python\n    def save_inference_model(save_dir):\n    ```\n\n    - 将模型保存到指定路径。\n    - **参数**\n      - save_dir(str): 存放模型的目录名称\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线手部关键点检测服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m openpose_hands_estimation -p 8866\n    ```\n\n  - 这样就完成了一个人体手部关键点检测的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    \n    import numpy as np\n    \n    \n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n    \n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n    \n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = 
{\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/openpose_hands_estimation\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    canvas = base64_to_cv2(r.json()[\"results\"][\"data\"])\n    cv2.imwrite('keypoint.png', canvas)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  * ```shell\n    $ hub install hand_pose_localization==1.1.0\n    ```\n\n    \n"
  },
  {
    "path": "modules/image/keypoint_detection/openpose_hands_estimation/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/QUVLQPt37n0/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8cGVyc29uJTIwaGFuZHMlMjBoZWxsb3xlbnwwfHx8fDE2NjE4NjE1MzE&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"openpose_hands_estimation\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('openpose_body')\n        shutil.rmtree('openpose_hand')\n\n    def test_predict1(self):\n        results = self.module.predict(\n            img='tests/test.jpg',\n            visualization=False\n        )\n        kps = results['all_hand_peaks'][0].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict2(self):\n        results = self.module.predict(\n            img=cv2.imread('tests/test.jpg'),\n            visualization=False\n        )\n        kps = results['all_hand_peaks'][0].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict3(self):\n        results = self.module.predict(\n            img=cv2.imread('tests/test.jpg'),\n            visualization=True\n        )\n        kps = results['all_hand_peaks'][0].tolist()\n        self.assertIsInstance(kps, list)\n\n    def test_predict4(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.predict,\n            img='no.jpg'\n        )\n\n    def test_predict5(self):\n        self.assertRaises(\n            AttributeError,\n            
self.module.predict,\n            img=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/openpose_hands_estimation.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/openpose_hands_estimation.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/README.md",
    "content": "# pp-tinypose\n\n|模型名称|pp-tinypose|\n| :--- | :---: |\n|类别|图像-关键点检测|\n|网络|PicoDet + tinypose|\n|数据集|COCO + AI Challenger|\n|是否支持Fine-tuning|否|\n|模型大小|125M|\n|最新更新日期|2022-05-20|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n<p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/169768593-9fcf729a-458e-4bb1-bb3c-b005ff7bcec2.jpg\"   hspace='10'/>\n    <br />\n    输入图像\n    <br />\n    <img src=\"https://user-images.githubusercontent.com/22424850/170029768-3c60def2-7c87-4e8a-98bc-1bbc912204e7.jpg\"   hspace='10'/>\n    <br />\n    输出图像\n</p>\n\n- ### 模型介绍\n\n  - PP-TinyPose是PaddleDetection针对移动端设备优化的实时关键点检测模型，可流畅地在移动端设备上执行多人姿态估计任务。借助PaddleDetection自研的优秀轻量级检测模型PicoDet以及轻量级姿态估计任务骨干网络Tinypose, 结合多种策略有效平衡了模型的速度和精度表现。\n\n  - 更多详情参考：[PP-TinyPose](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.4/configs/keypoint/tiny_pose)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.2\n\n  - paddlehub >= 2.2   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install pp-tinypose\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run pp-tinypose --input_path \"/PATH/TO/IMAGE\" --visualization True --use_gpu\n    ```\n  - 通过命令行方式实现关键点检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"pp-tinypose\")\n    result = model.predict('/PATH/TO/IMAGE', save_path='pp_tinypose_output', visualization=True, use_gpu=True)\n    ```\n\n- ### 3、API\n\n\n  - ```python\n    def predict(self, img: Union[str, np.ndarray], save_path: str = \"pp_tinypose_output\", visualization: bool = 
True, use_gpu = False)\n    ```\n\n    - 预测API，识别输入图片中的所有人肢体关键点。\n\n    - **参数**\n\n      - img (numpy.ndarray|str): 图片数据，使用图片路径或者输入numpy.ndarray，BGR格式；\n      - save_path (str): 图片保存路径， 默认为'pp_tinypose_output'；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - use_gpu: 是否使用gpu；\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表元素依然为列表，存的内容为[图像名称，检测框，关键点]。\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个关键点检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m pp-tinypose\n    ```\n\n  - 这样就完成了一个关键点检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/pp-tinypose\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    ```\n\n- ### Gradio APP 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/pp-tinypose 在浏览器中访问 pp-tinypose 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  修复使用 ndarray 输入时无法保存可视化图片的问题\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install pp-tinypose==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/det_keypoint_unite_infer.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\n\nfrom .keypoint_postprocess import translate_to_ori_images\n\nKEYPOINT_SUPPORT_MODELS = {'HigherHRNet': 'keypoint_bottomup', 'HRNet': 'keypoint_topdown'}\n\n\ndef predict_with_given_det(image, det_res, keypoint_detector, keypoint_batch_size, run_benchmark):\n    rec_images, records, det_rects = keypoint_detector.get_person_from_rect(image, det_res)\n    keypoint_vector = []\n    score_vector = []\n\n    rect_vector = det_rects\n    keypoint_results = keypoint_detector.predict_image(rec_images, run_benchmark, repeats=10, visual=False)\n    keypoint_vector, score_vector = translate_to_ori_images(keypoint_results, np.array(records))\n    keypoint_res = {}\n    keypoint_res['keypoint'] = [keypoint_vector.tolist(), score_vector.tolist()] if len(keypoint_vector) > 0 else [[],\n                                                                                                                   []]\n    keypoint_res['bbox'] = rect_vector\n    return keypoint_res\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/infer.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport glob\nimport json\nimport math\nimport os\nfrom pathlib import Path\n\nimport cv2\nimport numpy as np\nimport yaml\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .keypoint_preprocess import EvalAffine\nfrom .keypoint_preprocess import expand_crop\nfrom .keypoint_preprocess import TopDownEvalAffine\nfrom .preprocess import decode_image\nfrom .preprocess import LetterBoxResize\nfrom .preprocess import NormalizeImage\nfrom .preprocess import Pad\nfrom .preprocess import PadStride\nfrom .preprocess import Permute\nfrom .preprocess import preprocess\nfrom .preprocess import Resize\nfrom .preprocess import WarpAffine\nfrom .visualize import visualize_box\n\n# Global dictionary\nSUPPORT_MODELS = {\n    'YOLO',\n    'RCNN',\n    'SSD',\n    'Face',\n    'FCOS',\n    'SOLOv2',\n    'TTFNet',\n    'S2ANet',\n    'JDE',\n    'FairMOT',\n    'DeepSORT',\n    'GFL',\n    'PicoDet',\n    'CenterNet',\n    'TOOD',\n    'RetinaNet',\n    'StrongBaseline',\n    'STGCN',\n    'YOLOX',\n}\n\n\nclass Detector(object):\n    \"\"\"\n    Args:\n        pred_config (object): config of model, defined by `Config(model_dir)`\n        model_dir (str): root path of model.pdiparams, model.pdmodel and infer_cfg.yml\n        device (str): Choose the device you want to run, it can be: CPU/GPU/XPU, default is CPU\n   
     run_mode (str): mode of running (paddle/trt_fp32/trt_fp16)\n        batch_size (int): size of per batch in inference\n        trt_min_shape (int): min shape for dynamic shape in trt\n        trt_max_shape (int): max shape for dynamic shape in trt\n        trt_opt_shape (int): opt shape for dynamic shape in trt\n        trt_calib_mode (bool): If the model is produced by TRT offline quantitative\n            calibration, trt_calib_mode needs to be set to True\n        cpu_threads (int): cpu threads\n        enable_mkldnn (bool): whether to enable MKLDNN\n        enable_mkldnn_bfloat16 (bool): whether to turn on mkldnn bfloat16\n        output_dir (str): The path of output\n        threshold (float): The threshold of score for visualization\n        delete_shuffle_pass (bool): whether to remove shuffle_channel_detect_pass in TensorRT.\n                                    Used by action model.\n    \"\"\"\n\n    def __init__(self,\n                 model_dir,\n                 device='CPU',\n                 run_mode='paddle',\n                 batch_size=1,\n                 trt_min_shape=1,\n                 trt_max_shape=1280,\n                 trt_opt_shape=640,\n                 trt_calib_mode=False,\n                 cpu_threads=1,\n                 enable_mkldnn=False,\n                 enable_mkldnn_bfloat16=False,\n                 output_dir='output',\n                 threshold=0.5,\n                 delete_shuffle_pass=False):\n        self.pred_config = self.set_config(model_dir)\n        self.device = device\n        self.predictor, self.config = load_predictor(model_dir,\n                                                     run_mode=run_mode,\n                                                     batch_size=batch_size,\n                                                     min_subgraph_size=self.pred_config.min_subgraph_size,\n                                                     device=device,\n                                                     
use_dynamic_shape=self.pred_config.use_dynamic_shape,\n                                                     trt_min_shape=trt_min_shape,\n                                                     trt_max_shape=trt_max_shape,\n                                                     trt_opt_shape=trt_opt_shape,\n                                                     trt_calib_mode=trt_calib_mode,\n                                                     cpu_threads=cpu_threads,\n                                                     enable_mkldnn=enable_mkldnn,\n                                                     enable_mkldnn_bfloat16=enable_mkldnn_bfloat16,\n                                                     delete_shuffle_pass=delete_shuffle_pass)\n        self.cpu_mem, self.gpu_mem, self.gpu_util = 0, 0, 0\n        self.batch_size = batch_size\n        self.output_dir = output_dir\n        self.threshold = threshold\n\n    def set_config(self, model_dir):\n        return PredictConfig(model_dir)\n\n    def preprocess(self, image_list):\n        preprocess_ops = []\n        for op_info in self.pred_config.preprocess_infos:\n            new_op_info = op_info.copy()\n            op_type = new_op_info.pop('type')\n            preprocess_ops.append(eval(op_type)(**new_op_info))\n\n        input_im_lst = []\n        input_im_info_lst = []\n        for im_path in image_list:\n            im, im_info = preprocess(im_path, preprocess_ops)\n            input_im_lst.append(im)\n            input_im_info_lst.append(im_info)\n        inputs = create_inputs(input_im_lst, input_im_info_lst)\n        input_names = self.predictor.get_input_names()\n        for i in range(len(input_names)):\n            input_tensor = self.predictor.get_input_handle(input_names[i])\n            input_tensor.copy_from_cpu(inputs[input_names[i]])\n\n        return inputs\n\n    def postprocess(self, inputs, result):\n        # postprocess output of predictor\n        np_boxes_num = result['boxes_num']\n        
if np_boxes_num[0] <= 0:\n            print('[WARNING] No object detected.')\n            result = {'boxes': np.zeros([0, 6]), 'boxes_num': [0]}\n        result = {k: v for k, v in result.items() if v is not None}\n        return result\n\n    def filter_box(self, result, threshold):\n        np_boxes_num = result['boxes_num']\n        boxes = result['boxes']\n        start_idx = 0\n        filter_boxes = []\n        filter_num = []\n        for i in range(len(np_boxes_num)):\n            boxes_num = np_boxes_num[i]\n            boxes_i = boxes[start_idx:start_idx + boxes_num, :]\n            idx = boxes_i[:, 1] > threshold\n            filter_boxes_i = boxes_i[idx, :]\n            filter_boxes.append(filter_boxes_i)\n            filter_num.append(filter_boxes_i.shape[0])\n            start_idx += boxes_num\n        boxes = np.concatenate(filter_boxes)\n        filter_num = np.array(filter_num)\n        filter_res = {'boxes': boxes, 'boxes_num': filter_num}\n        return filter_res\n\n    def predict(self, repeats=1):\n        '''\n        Args:\n            repeats (int): number of repeats for prediction\n        Returns:\n            result (dict): include 'boxes': np.ndarray: shape:[N,6], N: number of boxes,\n                            matrix element:[class, score, x_min, y_min, x_max, y_max]\n                            MaskRCNN's result include 'masks': np.ndarray:\n                            shape: [N, im_h, im_w]\n        '''\n        # model prediction\n        np_boxes, np_masks = None, None\n        for i in range(repeats):\n            self.predictor.run()\n            output_names = self.predictor.get_output_names()\n            boxes_tensor = self.predictor.get_output_handle(output_names[0])\n            np_boxes = boxes_tensor.copy_to_cpu()\n            boxes_num = self.predictor.get_output_handle(output_names[1])\n            np_boxes_num = boxes_num.copy_to_cpu()\n            if self.pred_config.mask:\n                masks_tensor = 
self.predictor.get_output_handle(output_names[2])\n                np_masks = masks_tensor.copy_to_cpu()\n        result = dict(boxes=np_boxes, masks=np_masks, boxes_num=np_boxes_num)\n        return result\n\n    def merge_batch_result(self, batch_result):\n        if len(batch_result) == 1:\n            return batch_result[0]\n        res_key = batch_result[0].keys()\n        results = {k: [] for k in res_key}\n        for res in batch_result:\n            for k, v in res.items():\n                results[k].append(v)\n        for k, v in results.items():\n            if k != 'masks':\n                results[k] = np.concatenate(v)\n        return results\n\n    def predict_image(self, image_list, run_benchmark=False, repeats=1, visual=True, save_file=None):\n        batch_loop_cnt = math.ceil(float(len(image_list)) / self.batch_size)\n        results = []\n        for i in range(batch_loop_cnt):\n            start_index = i * self.batch_size\n            end_index = min((i + 1) * self.batch_size, len(image_list))\n            batch_image_list = image_list[start_index:end_index]\n            # preprocess\n            inputs = self.preprocess(batch_image_list)\n\n            # model prediction\n            result = self.predict()\n\n            # postprocess\n\n            result = self.postprocess(inputs, result)\n\n            if visual:\n                visualize(batch_image_list,\n                          result,\n                          self.pred_config.labels,\n                          output_dir=self.output_dir,\n                          threshold=self.threshold)\n\n            results.append(result)\n            if visual:\n                print('Test iter {}'.format(i))\n\n        if save_file is not None:\n            Path(self.output_dir).mkdir(exist_ok=True)\n            self.format_coco_results(image_list, results, save_file=save_file)\n\n        results = self.merge_batch_result(results)\n        return results\n\n    def predict_video(self, 
video_file, camera_id):\n        video_out_name = 'output.mp4'\n        if camera_id != -1:\n            capture = cv2.VideoCapture(camera_id)\n        else:\n            capture = cv2.VideoCapture(video_file)\n            video_out_name = os.path.split(video_file)[-1]\n        # Get Video info : resolution, fps, frame count\n        width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        fps = int(capture.get(cv2.CAP_PROP_FPS))\n        frame_count = int(capture.get(cv2.CAP_PROP_FRAME_COUNT))\n        print(\"fps: %d, frame_count: %d\" % (fps, frame_count))\n\n        if not os.path.exists(self.output_dir):\n            os.makedirs(self.output_dir)\n        out_path = os.path.join(self.output_dir, video_out_name)\n        fourcc = cv2.VideoWriter_fourcc(*'mp4v')\n        writer = cv2.VideoWriter(out_path, fourcc, fps, (width, height))\n        index = 1\n        while (1):\n            ret, frame = capture.read()\n            if not ret:\n                break\n            print('detect frame: %d' % (index))\n            index += 1\n            results = self.predict_image([frame[:, :, ::-1]], visual=False)\n\n            im = visualize_box(frame, results, self.pred_config.labels, threshold=self.threshold)\n            im = np.array(im)\n            writer.write(im)\n            if camera_id != -1:\n                cv2.imshow('Mask Detection', im)\n                if cv2.waitKey(1) & 0xFF == ord('q'):\n                    break\n        writer.release()\n\n    @staticmethod\n    def format_coco_results(image_list, results, save_file=None):\n        coco_results = []\n        image_id = 0\n\n        for result in results:\n            start_idx = 0\n            for box_num in result['boxes_num']:\n                idx_slice = slice(start_idx, start_idx + box_num)\n                start_idx += box_num\n\n                image_file = image_list[image_id]\n                image_id += 1\n\n                
if 'boxes' in result:\n                    boxes = result['boxes'][idx_slice, :]\n                    per_result = [\n                        {\n                            'image_file': image_file,\n                            'bbox': [box[2], box[3], box[4] - box[2], box[5] - box[3]],  # xyxy -> xywh\n                            'score': box[1],\n                            'category_id': int(box[0]),\n                        } for k, box in enumerate(boxes.tolist())\n                    ]\n\n                elif 'segm' in result:\n                    import pycocotools.mask as mask_util\n\n                    scores = result['score'][idx_slice].tolist()\n                    category_ids = result['label'][idx_slice].tolist()\n                    segms = result['segm'][idx_slice, :]\n                    rles = [\n                        mask_util.encode(np.array(mask[:, :, np.newaxis], dtype=np.uint8, order='F'))[0]\n                        for mask in segms\n                    ]\n                    for rle in rles:\n                        rle['counts'] = rle['counts'].decode('utf-8')\n\n                    per_result = [{\n                        'image_file': image_file,\n                        'segmentation': rle,\n                        'score': scores[k],\n                        'category_id': category_ids[k],\n                    } for k, rle in enumerate(rles)]\n\n                else:\n                    raise RuntimeError('')\n\n                # per_result = [item for item in per_result if item['score'] > threshold]\n                coco_results.extend(per_result)\n\n        if save_file:\n            with open(os.path.join(save_file), 'w') as f:\n                json.dump(coco_results, f)\n\n        return coco_results\n\n\ndef create_inputs(imgs, im_info):\n    \"\"\"generate input for different model type\n    Args:\n        imgs (list(numpy)): list of images (np.ndarray)\n        im_info (list(dict)): list of image info\n    Returns:\n        
inputs (dict): input of model\n    \"\"\"\n    inputs = {}\n\n    im_shape = []\n    scale_factor = []\n    if len(imgs) == 1:\n        inputs['image'] = np.array((imgs[0], )).astype('float32')\n        inputs['im_shape'] = np.array((im_info[0]['im_shape'], )).astype('float32')\n        inputs['scale_factor'] = np.array((im_info[0]['scale_factor'], )).astype('float32')\n        return inputs\n\n    for e in im_info:\n        im_shape.append(np.array((e['im_shape'], )).astype('float32'))\n        scale_factor.append(np.array((e['scale_factor'], )).astype('float32'))\n\n    inputs['im_shape'] = np.concatenate(im_shape, axis=0)\n    inputs['scale_factor'] = np.concatenate(scale_factor, axis=0)\n\n    imgs_shape = [[e.shape[1], e.shape[2]] for e in imgs]\n    max_shape_h = max([e[0] for e in imgs_shape])\n    max_shape_w = max([e[1] for e in imgs_shape])\n    padding_imgs = []\n    for img in imgs:\n        im_c, im_h, im_w = img.shape[:]\n        padding_im = np.zeros((im_c, max_shape_h, max_shape_w), dtype=np.float32)\n        padding_im[:, :im_h, :im_w] = img\n        padding_imgs.append(padding_im)\n    inputs['image'] = np.stack(padding_imgs, axis=0)\n    return inputs\n\n\nclass PredictConfig():\n    \"\"\"set config of preprocess, postprocess and visualize\n    Args:\n        model_dir (str): root path of model.yml\n    \"\"\"\n\n    def __init__(self, model_dir):\n        # parsing Yaml config for Preprocess\n        deploy_file = os.path.join(model_dir, 'infer_cfg.yml')\n        with open(deploy_file) as f:\n            yml_conf = yaml.safe_load(f)\n        self.check_model(yml_conf)\n        self.arch = yml_conf['arch']\n        self.preprocess_infos = yml_conf['Preprocess']\n        self.min_subgraph_size = yml_conf['min_subgraph_size']\n        self.labels = yml_conf['label_list']\n        self.mask = False\n        self.use_dynamic_shape = yml_conf['use_dynamic_shape']\n        if 'mask' in yml_conf:\n            self.mask = yml_conf['mask']\n        
self.tracker = None\n        if 'tracker' in yml_conf:\n            self.tracker = yml_conf['tracker']\n        if 'NMS' in yml_conf:\n            self.nms = yml_conf['NMS']\n        if 'fpn_stride' in yml_conf:\n            self.fpn_stride = yml_conf['fpn_stride']\n        if self.arch == 'RCNN' and yml_conf.get('export_onnx', False):\n            print('The RCNN export model is used for ONNX and it only supports batch_size = 1')\n        self.print_config()\n\n    def check_model(self, yml_conf):\n        \"\"\"\n        Raises:\n            ValueError: loaded model not in supported model type\n        \"\"\"\n        for support_model in SUPPORT_MODELS:\n            if support_model in yml_conf['arch']:\n                return True\n        raise ValueError(\"Unsupported arch: {}, expect {}\".format(yml_conf['arch'], SUPPORT_MODELS))\n\n    def print_config(self):\n        print('-----------  Model Configuration -----------')\n        print('%s: %s' % ('Model Arch', self.arch))\n        print('%s: ' % ('Transform Order'))\n        for op_info in self.preprocess_infos:\n            print('--%s: %s' % ('transform op', op_info['type']))\n        print('--------------------------------------------')\n\n\ndef load_predictor(model_dir,\n                   run_mode='paddle',\n                   batch_size=1,\n                   device='CPU',\n                   min_subgraph_size=3,\n                   use_dynamic_shape=False,\n                   trt_min_shape=1,\n                   trt_max_shape=1280,\n                   trt_opt_shape=640,\n                   trt_calib_mode=False,\n                   cpu_threads=1,\n                   enable_mkldnn=False,\n                   enable_mkldnn_bfloat16=False,\n                   delete_shuffle_pass=False):\n    \"\"\"set AnalysisConfig, generate AnalysisPredictor\n    Args:\n        model_dir (str): root path of __model__ and __params__\n        device (str): Choose the device you want to run, it can be: CPU/GPU/XPU, 
default is CPU\n        run_mode (str): mode of running(paddle/trt_fp32/trt_fp16/trt_int8)\n        use_dynamic_shape (bool): use dynamic shape or not\n        trt_min_shape (int): min shape for dynamic shape in trt\n        trt_max_shape (int): max shape for dynamic shape in trt\n        trt_opt_shape (int): opt shape for dynamic shape in trt\n        trt_calib_mode (bool): If the model is produced by TRT offline quantitative\n            calibration, trt_calib_mode need to set True\n        delete_shuffle_pass (bool): whether to remove shuffle_channel_detect_pass in TensorRT.\n                                    Used by action model.\n    Returns:\n        predictor (PaddlePredictor): AnalysisPredictor\n    Raises:\n        ValueError: predict by TensorRT need device == 'GPU'.\n    \"\"\"\n    if device != 'GPU' and run_mode != 'paddle':\n        raise ValueError(\"Predict by TensorRT mode: {}, expect device=='GPU', but device == {}\".format(\n            run_mode, device))\n    config = Config(os.path.join(model_dir, 'model.pdmodel'), os.path.join(model_dir, 'model.pdiparams'))\n    if device == 'GPU':\n        # initial GPU memory(M), device ID\n        config.enable_use_gpu(200, 0)\n        # optimize graph and fuse op\n        config.switch_ir_optim(True)\n    elif device == 'XPU':\n        config.enable_lite_engine()\n        config.enable_xpu(10 * 1024 * 1024)\n    else:\n        config.disable_gpu()\n        config.set_cpu_math_library_num_threads(cpu_threads)\n        if enable_mkldnn:\n            try:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n                if enable_mkldnn_bfloat16:\n                    config.enable_mkldnn_bfloat16()\n            except Exception as e:\n                print(\"The current environment does not support `mkldnn`, so disable mkldnn.\")\n                pass\n\n    precision_map = {\n        
'trt_int8': Config.Precision.Int8,\n        'trt_fp32': Config.Precision.Float32,\n        'trt_fp16': Config.Precision.Half\n    }\n    if run_mode in precision_map.keys():\n        config.enable_tensorrt_engine(workspace_size=(1 << 25) * batch_size,\n                                      max_batch_size=batch_size,\n                                      min_subgraph_size=min_subgraph_size,\n                                      precision_mode=precision_map[run_mode],\n                                      use_static=False,\n                                      use_calib_mode=trt_calib_mode)\n\n        if use_dynamic_shape:\n            min_input_shape = {'image': [batch_size, 3, trt_min_shape, trt_min_shape]}\n            max_input_shape = {'image': [batch_size, 3, trt_max_shape, trt_max_shape]}\n            opt_input_shape = {'image': [batch_size, 3, trt_opt_shape, trt_opt_shape]}\n            config.set_trt_dynamic_shape_info(min_input_shape, max_input_shape, opt_input_shape)\n            print('trt set dynamic shape done!')\n\n    # disable print log when predict\n    config.disable_glog_info()\n    # enable shared memory\n    config.enable_memory_optim()\n    # disable feed, fetch OP, needed by zero_copy_run\n    config.switch_use_feed_fetch_ops(False)\n    if delete_shuffle_pass:\n        config.delete_pass(\"shuffle_channel_detect_pass\")\n    predictor = create_predictor(config)\n    return predictor, config\n\n\ndef get_test_images(infer_dir, infer_img):\n    \"\"\"\n    Get image path list in TEST mode\n    \"\"\"\n    assert infer_img is not None or infer_dir is not None, \\\n        \"--image_file or --image_dir should be set\"\n    assert infer_img is None or os.path.isfile(infer_img), \\\n            \"{} is not a file\".format(infer_img)\n    assert infer_dir is None or os.path.isdir(infer_dir), \\\n            \"{} is not a directory\".format(infer_dir)\n\n    # infer_img has a higher priority\n    if infer_img and os.path.isfile(infer_img):\n      
  return [infer_img]\n\n    images = set()\n    infer_dir = os.path.abspath(infer_dir)\n    assert os.path.isdir(infer_dir), \\\n        \"infer_dir {} is not a directory\".format(infer_dir)\n    exts = ['jpg', 'jpeg', 'png', 'bmp']\n    exts += [ext.upper() for ext in exts]\n    for ext in exts:\n        images.update(glob.glob('{}/*.{}'.format(infer_dir, ext)))\n    images = list(images)\n\n    assert len(images) > 0, \"no image found in {}\".format(infer_dir)\n    print(\"Found {} inference images in total.\".format(len(images)))\n\n    return images\n\n\ndef visualize(image_list, result, labels, output_dir='output/', threshold=0.5):\n    # visualize the predict result\n    start_idx = 0\n    for idx, image_file in enumerate(image_list):\n        im_bboxes_num = result['boxes_num'][idx]\n        im_results = {}\n        if 'boxes' in result:\n            im_results['boxes'] = result['boxes'][start_idx:start_idx + im_bboxes_num, :]\n        if 'masks' in result:\n            im_results['masks'] = result['masks'][start_idx:start_idx + im_bboxes_num, :]\n        if 'segm' in result:\n            im_results['segm'] = result['segm'][start_idx:start_idx + im_bboxes_num, :]\n        if 'label' in result:\n            im_results['label'] = result['label'][start_idx:start_idx + im_bboxes_num]\n        if 'score' in result:\n            im_results['score'] = result['score'][start_idx:start_idx + im_bboxes_num]\n\n        start_idx += im_bboxes_num\n        im = visualize_box(image_file, im_results, labels, threshold=threshold)\n        img_name = os.path.split(image_file)[-1]\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        out_path = os.path.join(output_dir, img_name)\n        im.save(out_path, quality=95)\n        print(\"save result to: \" + out_path)\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/keypoint_infer.py",
"content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\n\nimport cv2\nimport numpy as np\nimport yaml\n\nfrom .infer import Detector\nfrom .keypoint_postprocess import HRNetPostProcess\nfrom .keypoint_preprocess import expand_crop\nfrom .visualize import visualize_pose\n\n# Global dictionary\nKEYPOINT_SUPPORT_MODELS = {'HigherHRNet': 'keypoint_bottomup', 'HRNet': 'keypoint_topdown'}\n\n\nclass KeyPointDetector(Detector):\n    \"\"\"\n    Args:\n        model_dir (str): root path of model.pdiparams, model.pdmodel and infer_cfg.yml\n        device (str): Choose the device you want to run, it can be: CPU/GPU/XPU, default is CPU\n        run_mode (str): mode of running (paddle/trt_fp32/trt_fp16)\n        batch_size (int): batch size for inference\n        trt_min_shape (int): min shape for dynamic shape in trt\n        trt_max_shape (int): max shape for dynamic shape in trt\n        trt_opt_shape (int): opt shape for dynamic shape in trt\n        trt_calib_mode (bool): If the model is produced by TRT offline quantitative\n            calibration, trt_calib_mode needs to be set to True\n        cpu_threads (int): number of CPU threads\n        enable_mkldnn (bool): whether to enable MKLDNN\n        use_dark (bool): whether to use DarkPose post-processing\n    \"\"\"\n\n    def __init__(self,\n                 model_dir,\n                 device='CPU',\n                 
run_mode='paddle',\n                 batch_size=1,\n                 trt_min_shape=1,\n                 trt_max_shape=1280,\n                 trt_opt_shape=640,\n                 trt_calib_mode=False,\n                 cpu_threads=1,\n                 enable_mkldnn=False,\n                 output_dir='output',\n                 threshold=0.5,\n                 use_dark=True):\n        super(KeyPointDetector, self).__init__(\n            model_dir=model_dir,\n            device=device,\n            run_mode=run_mode,\n            batch_size=batch_size,\n            trt_min_shape=trt_min_shape,\n            trt_max_shape=trt_max_shape,\n            trt_opt_shape=trt_opt_shape,\n            trt_calib_mode=trt_calib_mode,\n            cpu_threads=cpu_threads,\n            enable_mkldnn=enable_mkldnn,\n            output_dir=output_dir,\n            threshold=threshold,\n        )\n        self.use_dark = use_dark\n\n    def set_config(self, model_dir):\n        return PredictConfig_KeyPoint(model_dir)\n\n    def get_person_from_rect(self, image, results):\n        # crop the person result from image\n        valid_rects = results['boxes']\n        rect_images = []\n        new_rects = []\n        org_rects = []\n        for rect in valid_rects:\n            rect_image, new_rect, org_rect = expand_crop(image, rect)\n            if rect_image is None or rect_image.size == 0:\n                continue\n            rect_images.append(rect_image)\n            new_rects.append(new_rect)\n            org_rects.append(org_rect)\n        return rect_images, new_rects, org_rects\n\n    def postprocess(self, inputs, result):\n        np_heatmap = result['heatmap']\n        np_masks = result['masks']\n        # postprocess output of predictor\n        if KEYPOINT_SUPPORT_MODELS[self.pred_config.arch] == 'keypoint_bottomup':\n            results = {}\n            h, w = inputs['im_shape'][0]\n            preds = [np_heatmap]\n            if np_masks is not None:\n                
preds += np_masks\n            preds += [h, w]\n            keypoint_postprocess = HRNetPostProcess()\n            kpts, scores = keypoint_postprocess(*preds)\n            results['keypoint'] = kpts\n            results['score'] = scores\n            return results\n        elif KEYPOINT_SUPPORT_MODELS[self.pred_config.arch] == 'keypoint_topdown':\n            results = {}\n            imshape = inputs['im_shape'][:, ::-1]\n            center = np.round(imshape / 2.)\n            scale = imshape / 200.\n            keypoint_postprocess = HRNetPostProcess(use_dark=self.use_dark)\n            kpts, scores = keypoint_postprocess(np_heatmap, center, scale)\n            results['keypoint'] = kpts\n            results['score'] = scores\n            return results\n        else:\n            raise ValueError(\"Unsupported arch: {}, expect {}\".format(self.pred_config.arch, KEYPOINT_SUPPORT_MODELS))\n\n    def predict(self, repeats=1):\n        '''\n        Args:\n            repeats (int): repeat number for prediction\n        Returns:\n            results (dict): include 'boxes': np.ndarray: shape:[N,6], N: number of boxes,\n                            matrix element: [class, score, x_min, y_min, x_max, y_max]\n                            MaskRCNN's results include 'masks': np.ndarray:\n                            shape: [N, im_h, im_w]\n        '''\n        # model prediction\n        np_heatmap, np_masks = None, None\n        for i in range(repeats):\n            self.predictor.run()\n            output_names = self.predictor.get_output_names()\n            heatmap_tensor = self.predictor.get_output_handle(output_names[0])\n            np_heatmap = heatmap_tensor.copy_to_cpu()\n            if self.pred_config.tagmap:\n                masks_tensor = self.predictor.get_output_handle(output_names[1])\n                heat_k = self.predictor.get_output_handle(output_names[2])\n                inds_k = self.predictor.get_output_handle(output_names[3])\n                np_masks 
= [masks_tensor.copy_to_cpu(), heat_k.copy_to_cpu(), inds_k.copy_to_cpu()]\n        result = dict(heatmap=np_heatmap, masks=np_masks)\n        return result\n\n    def predict_image(self, image_list, run_benchmark=False, repeats=1, visual=True):\n        results = []\n        batch_loop_cnt = math.ceil(float(len(image_list)) / self.batch_size)\n        for i in range(batch_loop_cnt):\n            start_index = i * self.batch_size\n            end_index = min((i + 1) * self.batch_size, len(image_list))\n            batch_image_list = image_list[start_index:end_index]\n            # preprocess\n            inputs = self.preprocess(batch_image_list)\n\n            # model prediction\n            result = self.predict()\n            # postprocess\n            result = self.postprocess(inputs, result)\n\n            if visual:\n                if not os.path.exists(self.output_dir):\n                    os.makedirs(self.output_dir)\n                visualize(batch_image_list, result, visual_thresh=self.threshold, save_dir=self.output_dir)\n\n            results.append(result)\n            if visual:\n                print('Test iter {}'.format(i))\n        results = self.merge_batch_result(results)\n        return results\n\n    def predict_video(self, video_file, camera_id):\n        video_name = 'output.mp4'\n        if camera_id != -1:\n            capture = cv2.VideoCapture(camera_id)\n        else:\n            capture = cv2.VideoCapture(video_file)\n            video_name = os.path.split(video_file)[-1]\n        # Get Video info : resolution, fps, frame count\n        width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        fps = int(capture.get(cv2.CAP_PROP_FPS))\n        frame_count = int(capture.get(cv2.CAP_PROP_FRAME_COUNT))\n        print(\"fps: %d, frame_count: %d\" % (fps, frame_count))\n\n        if not os.path.exists(self.output_dir):\n            os.makedirs(self.output_dir)\n        
out_path = os.path.join(self.output_dir, video_name)\n        fourcc = cv2.VideoWriter_fourcc(*'mp4v')\n        writer = cv2.VideoWriter(out_path, fourcc, fps, (width, height))\n        index = 1\n        while True:\n            ret, frame = capture.read()\n            if not ret:\n                break\n            print('detect frame: %d' % (index))\n            index += 1\n            results = self.predict_image([frame[:, :, ::-1]], visual=False)\n            im_results = {}\n            im_results['keypoint'] = [results['keypoint'], results['score']]\n            im = visualize_pose(frame, im_results, visual_thresh=self.threshold, returnimg=True)\n            writer.write(im)\n            if camera_id != -1:\n                cv2.imshow('KeyPoint Detection', im)\n                if cv2.waitKey(1) & 0xFF == ord('q'):\n                    break\n        writer.release()\n\n\ndef create_inputs(imgs, im_info):\n    \"\"\"generate input for different model types\n    Args:\n        imgs (list(numpy)): list of images (np.ndarray)\n        im_info (list(dict)): list of image info\n    Returns:\n        inputs (dict): input of model\n    \"\"\"\n    inputs = {}\n    inputs['image'] = np.stack(imgs, axis=0).astype('float32')\n    im_shape = []\n    for e in im_info:\n        im_shape.append(np.array((e['im_shape'])).astype('float32'))\n    inputs['im_shape'] = np.stack(im_shape, axis=0)\n    return inputs\n\n\nclass PredictConfig_KeyPoint():\n    \"\"\"set config of preprocess, postprocess and visualize\n    Args:\n        model_dir (str): root path of model.yml\n    \"\"\"\n\n    def __init__(self, model_dir):\n        # parsing Yaml config for Preprocess\n        deploy_file = os.path.join(model_dir, 'infer_cfg.yml')\n        with open(deploy_file) as f:\n            yml_conf = yaml.safe_load(f)\n        self.check_model(yml_conf)\n        self.arch = yml_conf['arch']\n        self.archcls = KEYPOINT_SUPPORT_MODELS[yml_conf['arch']]\n        self.preprocess_infos = 
yml_conf['Preprocess']\n        self.min_subgraph_size = yml_conf['min_subgraph_size']\n        self.labels = yml_conf['label_list']\n        self.tagmap = False\n        self.use_dynamic_shape = yml_conf['use_dynamic_shape']\n        if 'keypoint_bottomup' == self.archcls:\n            self.tagmap = True\n        self.print_config()\n\n    def check_model(self, yml_conf):\n        \"\"\"\n        Raises:\n            ValueError: loaded model not in supported model type\n        \"\"\"\n        for support_model in KEYPOINT_SUPPORT_MODELS:\n            if support_model in yml_conf['arch']:\n                return True\n        raise ValueError(\"Unsupported arch: {}, expect {}\".format(yml_conf['arch'], KEYPOINT_SUPPORT_MODELS))\n\n    def print_config(self):\n        print('-----------  Model Configuration -----------')\n        print('%s: %s' % ('Model Arch', self.arch))\n        print('%s: ' % ('Transform Order'))\n        for op_info in self.preprocess_infos:\n            print('--%s: %s' % ('transform op', op_info['type']))\n        print('--------------------------------------------')\n\n\ndef visualize(image_list, results, visual_thresh=0.6, save_dir='output'):\n    im_results = {}\n    for i, image_file in enumerate(image_list):\n        skeletons = results['keypoint']\n        scores = results['score']\n        skeleton = skeletons[i:i + 1]\n        score = scores[i:i + 1]\n        im_results['keypoint'] = [skeleton, score]\n        visualize_pose(image_file, im_results, visual_thresh=visual_thresh, save_dir=save_dir)\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/keypoint_postprocess.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport cv2\nimport numpy as np\n\nfrom .keypoint_preprocess import get_affine_transform\n\n\nclass HRNetPostProcess(object):\n\n    def __init__(self, use_dark=True):\n        self.use_dark = use_dark\n\n    def flip_back(self, output_flipped, matched_parts):\n        assert output_flipped.ndim == 4,\\\n                'output_flipped should be [batch_size, num_joints, height, width]'\n\n        output_flipped = output_flipped[:, :, :, ::-1]\n\n        for pair in matched_parts:\n            tmp = output_flipped[:, pair[0], :, :].copy()\n            output_flipped[:, pair[0], :, :] = output_flipped[:, pair[1], :, :]\n            output_flipped[:, pair[1], :, :] = tmp\n\n        return output_flipped\n\n    def get_max_preds(self, heatmaps):\n        \"\"\"get predictions from score maps\n\n        Args:\n            heatmaps: numpy.ndarray([batch_size, num_joints, height, width])\n\n        Returns:\n            preds: numpy.ndarray([batch_size, num_joints, 2]), keypoints coords\n            maxvals: numpy.ndarray([batch_size, num_joints, 2]), the maximum confidence of the keypoints\n        \"\"\"\n        assert isinstance(heatmaps, np.ndarray), 'heatmaps should be numpy.ndarray'\n        assert heatmaps.ndim == 4, 'batch_images should be 4-ndim'\n\n        batch_size = heatmaps.shape[0]\n        num_joints = 
heatmaps.shape[1]\n        width = heatmaps.shape[3]\n        heatmaps_reshaped = heatmaps.reshape((batch_size, num_joints, -1))\n        idx = np.argmax(heatmaps_reshaped, 2)\n        maxvals = np.amax(heatmaps_reshaped, 2)\n\n        maxvals = maxvals.reshape((batch_size, num_joints, 1))\n        idx = idx.reshape((batch_size, num_joints, 1))\n\n        preds = np.tile(idx, (1, 1, 2)).astype(np.float32)\n\n        preds[:, :, 0] = (preds[:, :, 0]) % width\n        preds[:, :, 1] = np.floor((preds[:, :, 1]) / width)\n\n        pred_mask = np.tile(np.greater(maxvals, 0.0), (1, 1, 2))\n        pred_mask = pred_mask.astype(np.float32)\n\n        preds *= pred_mask\n\n        return preds, maxvals\n\n    def gaussian_blur(self, heatmap, kernel):\n        border = (kernel - 1) // 2\n        batch_size = heatmap.shape[0]\n        num_joints = heatmap.shape[1]\n        height = heatmap.shape[2]\n        width = heatmap.shape[3]\n        for i in range(batch_size):\n            for j in range(num_joints):\n                origin_max = np.max(heatmap[i, j])\n                dr = np.zeros((height + 2 * border, width + 2 * border))\n                dr[border:-border, border:-border] = heatmap[i, j].copy()\n                dr = cv2.GaussianBlur(dr, (kernel, kernel), 0)\n                heatmap[i, j] = dr[border:-border, border:-border].copy()\n                heatmap[i, j] *= origin_max / np.max(heatmap[i, j])\n        return heatmap\n\n    def dark_parse(self, hm, coord):\n        heatmap_height = hm.shape[0]\n        heatmap_width = hm.shape[1]\n        px = int(coord[0])\n        py = int(coord[1])\n        if 1 < px < heatmap_width - 2 and 1 < py < heatmap_height - 2:\n            dx = 0.5 * (hm[py][px + 1] - hm[py][px - 1])\n            dy = 0.5 * (hm[py + 1][px] - hm[py - 1][px])\n            dxx = 0.25 * (hm[py][px + 2] - 2 * hm[py][px] + hm[py][px - 2])\n            dxy = 0.25 * (hm[py+1][px+1] - hm[py-1][px+1] - hm[py+1][px-1] \\\n                + hm[py-1][px-1])\n  
          dyy = 0.25 * (hm[py + 2 * 1][px] - 2 * hm[py][px] + hm[py - 2 * 1][px])\n            derivative = np.matrix([[dx], [dy]])\n            hessian = np.matrix([[dxx, dxy], [dxy, dyy]])\n            if dxx * dyy - dxy**2 != 0:\n                hessianinv = hessian.I\n                offset = -hessianinv * derivative\n                offset = np.squeeze(np.array(offset.T), axis=0)\n                coord += offset\n        return coord\n\n    def dark_postprocess(self, hm, coords, kernelsize):\n        \"\"\"\n        refer to https://github.com/ilovepose/DarkPose/lib/core/inference.py\n\n        \"\"\"\n        hm = self.gaussian_blur(hm, kernelsize)\n        hm = np.maximum(hm, 1e-10)\n        hm = np.log(hm)\n        for n in range(coords.shape[0]):\n            for p in range(coords.shape[1]):\n                coords[n, p] = self.dark_parse(hm[n][p], coords[n][p])\n        return coords\n\n    def get_final_preds(self, heatmaps, center, scale, kernelsize=3):\n        \"\"\"the highest heatvalue location with a quarter offset in the\n        direction from the highest response to the second highest response.\n\n        Args:\n            heatmaps (numpy.ndarray): The predicted heatmaps\n            center (numpy.ndarray): The boxes center\n            scale (numpy.ndarray): The scale factor\n\n        Returns:\n            preds: numpy.ndarray([batch_size, num_joints, 2]), keypoints coords\n            maxvals: numpy.ndarray([batch_size, num_joints, 1]), the maximum confidence of the keypoints\n        \"\"\"\n\n        coords, maxvals = self.get_max_preds(heatmaps)\n\n        heatmap_height = heatmaps.shape[2]\n        heatmap_width = heatmaps.shape[3]\n\n        if self.use_dark:\n            coords = self.dark_postprocess(heatmaps, coords, kernelsize)\n        else:\n            for n in range(coords.shape[0]):\n                for p in range(coords.shape[1]):\n                    hm = heatmaps[n][p]\n                    px = int(math.floor(coords[n][p][0] 
+ 0.5))\n                    py = int(math.floor(coords[n][p][1] + 0.5))\n                    if 1 < px < heatmap_width - 1 and 1 < py < heatmap_height - 1:\n                        diff = np.array([hm[py][px + 1] - hm[py][px - 1], hm[py + 1][px] - hm[py - 1][px]])\n                        coords[n][p] += np.sign(diff) * .25\n        preds = coords.copy()\n\n        # Transform back\n        for i in range(coords.shape[0]):\n            preds[i] = transform_preds(coords[i], center[i], scale[i], [heatmap_width, heatmap_height])\n\n        return preds, maxvals\n\n    def __call__(self, output, center, scale):\n        preds, maxvals = self.get_final_preds(output, center, scale)\n        return np.concatenate((preds, maxvals), axis=-1), np.mean(maxvals, axis=1)\n\n\ndef transform_preds(coords, center, scale, output_size):\n    target_coords = np.zeros(coords.shape)\n    trans = get_affine_transform(center, scale * 200, 0, output_size, inv=1)\n    for p in range(coords.shape[0]):\n        target_coords[p, 0:2] = affine_transform(coords[p, 0:2], trans)\n    return target_coords\n\n\ndef affine_transform(pt, t):\n    new_pt = np.array([pt[0], pt[1], 1.]).T\n    new_pt = np.dot(t, new_pt)\n    return new_pt[:2]\n\n\ndef translate_to_ori_images(keypoint_result, batch_records):\n    kpts = keypoint_result['keypoint']\n    scores = keypoint_result['score']\n    kpts[..., 0] += batch_records[:, 0:1]\n    kpts[..., 1] += batch_records[:, 1:2]\n    return kpts, scores\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/keypoint_preprocess.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nthis code is based on https://github.com/open-mmlab/mmpose/mmpose/core/post_processing/post_transforms.py\n\"\"\"\nimport cv2\nimport numpy as np\n\n\nclass EvalAffine(object):\n\n    def __init__(self, size, stride=64):\n        super(EvalAffine, self).__init__()\n        self.size = size\n        self.stride = stride\n\n    def __call__(self, image, im_info):\n        s = self.size\n        h, w, _ = image.shape\n        trans, size_resized = get_affine_mat_kernel(h, w, s, inv=False)\n        image_resized = cv2.warpAffine(image, trans, size_resized)\n        return image_resized, im_info\n\n\ndef get_affine_mat_kernel(h, w, s, inv=False):\n    if w < h:\n        w_ = s\n        h_ = int(np.ceil((s / w * h) / 64.) * 64)\n        scale_w = w\n        scale_h = h_ / w_ * w\n\n    else:\n        h_ = s\n        w_ = int(np.ceil((s / h * w) / 64.) 
* 64)\n        scale_h = h\n        scale_w = w_ / h_ * h\n\n    center = np.array([np.round(w / 2.), np.round(h / 2.)])\n\n    size_resized = (w_, h_)\n    trans = get_affine_transform(center, np.array([scale_w, scale_h]), 0, size_resized, inv=inv)\n\n    return trans, size_resized\n\n\ndef get_affine_transform(center, input_size, rot, output_size, shift=(0., 0.), inv=False):\n    \"\"\"Get the affine transform matrix, given the center/input_size/rot/output_size.\n\n    Args:\n        center (np.ndarray[2, ]): Center of the bounding box (x, y).\n        input_size (np.ndarray[2, ]): Scale of the bounding box\n            wrt [width, height].\n        rot (float): Rotation angle (degree).\n        output_size (np.ndarray[2, ]): Size of the destination heatmaps.\n        shift (0-100%): Shift translation ratio wrt the width/height.\n            Default (0., 0.).\n        inv (bool): Option to inverse the affine transform direction.\n            (inv=False: src->dst or inv=True: dst->src)\n\n    Returns:\n        np.ndarray: The transform matrix.\n    \"\"\"\n    assert len(center) == 2\n    assert len(output_size) == 2\n    assert len(shift) == 2\n    if not isinstance(input_size, (np.ndarray, list)):\n        input_size = np.array([input_size, input_size], dtype=np.float32)\n    scale_tmp = input_size\n\n    shift = np.array(shift)\n    src_w = scale_tmp[0]\n    dst_w = output_size[0]\n    dst_h = output_size[1]\n\n    rot_rad = np.pi * rot / 180\n    src_dir = rotate_point([0., src_w * -0.5], rot_rad)\n    dst_dir = np.array([0., dst_w * -0.5])\n\n    src = np.zeros((3, 2), dtype=np.float32)\n    src[0, :] = center + scale_tmp * shift\n    src[1, :] = center + src_dir + scale_tmp * shift\n    src[2, :] = _get_3rd_point(src[0, :], src[1, :])\n\n    dst = np.zeros((3, 2), dtype=np.float32)\n    dst[0, :] = [dst_w * 0.5, dst_h * 0.5]\n    dst[1, :] = np.array([dst_w * 0.5, dst_h * 0.5]) + dst_dir\n    dst[2, :] = _get_3rd_point(dst[0, :], dst[1, :])\n\n    if inv:\n        
trans = cv2.getAffineTransform(np.float32(dst), np.float32(src))\n    else:\n        trans = cv2.getAffineTransform(np.float32(src), np.float32(dst))\n\n    return trans\n\n\ndef get_warp_matrix(theta, size_input, size_dst, size_target):\n    \"\"\"This code is based on\n        https://github.com/open-mmlab/mmpose/blob/master/mmpose/core/post_processing/post_transforms.py\n\n        Calculate the transformation matrix under the constraint of unbiased.\n    Paper ref: Huang et al. The Devil is in the Details: Delving into Unbiased\n    Data Processing for Human Pose Estimation (CVPR 2020).\n\n    Args:\n        theta (float): Rotation angle in degrees.\n        size_input (np.ndarray): Size of input image [w, h].\n        size_dst (np.ndarray): Size of output image [w, h].\n        size_target (np.ndarray): Size of ROI in input plane [w, h].\n\n    Returns:\n        matrix (np.ndarray): A matrix for transformation.\n    \"\"\"\n    theta = np.deg2rad(theta)\n    matrix = np.zeros((2, 3), dtype=np.float32)\n    scale_x = size_dst[0] / size_target[0]\n    scale_y = size_dst[1] / size_target[1]\n    matrix[0, 0] = np.cos(theta) * scale_x\n    matrix[0, 1] = -np.sin(theta) * scale_x\n    matrix[0, 2] = scale_x * (-0.5 * size_input[0] * np.cos(theta) + 0.5 * size_input[1] * np.sin(theta) +\n                              0.5 * size_target[0])\n    matrix[1, 0] = np.sin(theta) * scale_y\n    matrix[1, 1] = np.cos(theta) * scale_y\n    matrix[1, 2] = scale_y * (-0.5 * size_input[0] * np.sin(theta) - 0.5 * size_input[1] * np.cos(theta) +\n                              0.5 * size_target[1])\n    return matrix\n\n\ndef rotate_point(pt, angle_rad):\n    \"\"\"Rotate a point by an angle.\n\n    Args:\n        pt (list[float]): 2 dimensional point to be rotated\n        angle_rad (float): rotation angle by radian\n\n    Returns:\n        list[float]: Rotated point.\n    \"\"\"\n    assert len(pt) == 2\n    sn, cs = np.sin(angle_rad), np.cos(angle_rad)\n    new_x = pt[0] * cs - 
pt[1] * sn\n    new_y = pt[0] * sn + pt[1] * cs\n    rotated_pt = [new_x, new_y]\n\n    return rotated_pt\n\n\ndef _get_3rd_point(a, b):\n    \"\"\"To calculate the affine matrix, three pairs of points are required. This\n    function is used to get the 3rd point, given 2D points a & b.\n\n    The 3rd point is defined by rotating vector `a - b` by 90 degrees\n    anticlockwise, using b as the rotation center.\n\n    Args:\n        a (np.ndarray): point(x,y)\n        b (np.ndarray): point(x,y)\n\n    Returns:\n        np.ndarray: The 3rd point.\n    \"\"\"\n    assert len(a) == 2\n    assert len(b) == 2\n    direction = a - b\n    third_pt = b + np.array([-direction[1], direction[0]], dtype=np.float32)\n\n    return third_pt\n\n\nclass TopDownEvalAffine(object):\n    \"\"\"Apply an affine transform to the image and coords.\n\n    Args:\n        trainsize (list): [w, h], the standard input size used in training\n        use_udp (bool): whether to use Unbiased Data Processing.\n        records (dict): the dict containing the image and coords\n\n    Returns:\n        records (dict): the image and coords after the transform\n\n    \"\"\"\n\n    def __init__(self, trainsize, use_udp=False):\n        self.trainsize = trainsize\n        self.use_udp = use_udp\n\n    def __call__(self, image, im_info):\n        rot = 0\n        imshape = im_info['im_shape'][::-1]\n        center = im_info['center'] if 'center' in im_info else imshape / 2.\n        scale = im_info['scale'] if 'scale' in im_info else imshape\n        if self.use_udp:\n            trans = get_warp_matrix(rot, center * 2.0, [self.trainsize[0] - 1.0, self.trainsize[1] - 1.0], scale)\n            image = cv2.warpAffine(image,\n                                   trans, (int(self.trainsize[0]), int(self.trainsize[1])),\n                                   flags=cv2.INTER_LINEAR)\n        else:\n            trans = get_affine_transform(center, scale, rot, self.trainsize)\n            image = cv2.warpAffine(image,\n                 
                  trans, (int(self.trainsize[0]), int(self.trainsize[1])),\n                                   flags=cv2.INTER_LINEAR)\n\n        return image, im_info\n\n\ndef expand_crop(images, rect, expand_ratio=0.3):\n    imgh, imgw, c = images.shape\n    label, conf, xmin, ymin, xmax, ymax = [int(x) for x in rect.tolist()]\n    if label != 0:\n        return None, None, None\n    org_rect = [xmin, ymin, xmax, ymax]\n    h_half = (ymax - ymin) * (1 + expand_ratio) / 2.\n    w_half = (xmax - xmin) * (1 + expand_ratio) / 2.\n    if h_half > w_half * 4 / 3:\n        w_half = h_half * 0.75\n    center = [(ymin + ymax) / 2., (xmin + xmax) / 2.]\n    ymin = max(0, int(center[0] - h_half))\n    ymax = min(imgh - 1, int(center[0] + h_half))\n    xmin = max(0, int(center[1] - w_half))\n    xmax = min(imgw - 1, int(center[1] + w_half))\n    return images[ymin:ymax, xmin:xmax, :], [xmin, ymin, xmax, ymax], org_rect\n"
  },
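The three-point construction in `get_affine_transform` above (the center, a rotated direction point, and the third point produced by `_get_3rd_point`) fully determines a 2x3 affine matrix. A minimal numpy-only sketch of that idea, using `numpy.linalg.solve` in place of `cv2.getAffineTransform` (an illustrative substitution, not the module's code; `affine_from_3_points` is a hypothetical helper):

```python
import numpy as np


def rotate_point(pt, angle_rad):
    # rotate a 2D point about the origin, as in the module
    sn, cs = np.sin(angle_rad), np.cos(angle_rad)
    return np.array([pt[0] * cs - pt[1] * sn, pt[0] * sn + pt[1] * cs])


def get_3rd_point(a, b):
    # rotate the vector (a - b) by 90 degrees around b, as in _get_3rd_point
    d = a - b
    return b + np.array([-d[1], d[0]], dtype=np.float32)


def affine_from_3_points(src, dst):
    # solve the six unknowns of a 2x3 affine matrix from three correspondences,
    # standing in for cv2.getAffineTransform
    A = np.hstack([src, np.ones((3, 1))])  # (3, 3) homogeneous source points
    return np.linalg.solve(A, dst).T       # (2, 3) matrix M with M @ [x, y, 1] = [x', y']


# a pure scale+translate mapping: three source points and their images
src = np.array([[10., 10.], [10., 0.], [20., 10.]])
dst = np.array([[64., 64.], [64., 32.], [96., 64.]])
M = affine_from_3_points(src, dst)
mapped = M @ np.array([10., 10., 1.])  # -> [64., 64.]
```

Three non-collinear point pairs are exactly enough to pin down the six degrees of freedom of an affine map, which is why the source only needs to synthesize one extra point per triangle.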
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport json\nimport os\nimport time\nfrom typing import Union\n\nimport cv2\nimport numpy as np\n\nfrom .det_keypoint_unite_infer import predict_with_given_det\nfrom .infer import Detector\nfrom .keypoint_infer import KeyPointDetector\nfrom .preprocess import base64_to_cv2\nfrom .preprocess import decode_image\nfrom .visualize import visualize_pose\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"pp-tinypose\",\n    type=\"CV/keypoint_detection\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"PP-TinyPose is a real-time keypoint detection model optimized by PaddleDetection for mobile devices.\",\n    version=\"1.2.0\")\nclass PP_TinyPose:\n    \"\"\"\n    PP-TinyPose Model.\n    \"\"\"\n\n    def __init__(self):\n        self.det_model_dir = os.path.join(self.directory, 'model/picodet_s_320_coco_lcnet/')\n        self.keypoint_model_dir = os.path.join(self.directory, 'model/tinypose_256x192/')\n        self.detector = Detector(self.det_model_dir)\n        self.topdown_keypoint_detector = KeyPointDetector(self.keypoint_model_dir)\n\n    def predict(self,\n                img: Union[str, np.ndarray],\n                save_path: str = \"pp_tinypose_output\",\n                
visualization: bool = False,\n                use_gpu=False):\n        if use_gpu:\n            device = 'GPU'\n        else:\n            device = 'CPU'\n        if self.detector.device != device:\n            self.detector = Detector(self.det_model_dir, device=device)\n            self.topdown_keypoint_detector = KeyPointDetector(self.keypoint_model_dir, device=device)\n\n        self.visualization = visualization\n        store_res = []\n\n        # Decode image in advance in det + pose prediction\n        image, _ = decode_image(img, {})\n        results = self.detector.predict_image([image], visual=False)\n        results = self.detector.filter_box(results, 0.5)\n        if results['boxes_num'] > 0:\n            keypoint_res = predict_with_given_det(image, results, self.topdown_keypoint_detector, 1, False)\n            save_name = img if isinstance(img, str) else os.path.join(save_path, (str(time.time()) + '.png'))\n            store_res.append(\n                [save_name, keypoint_res['bbox'], [keypoint_res['keypoint'][0], keypoint_res['keypoint'][1]]])\n            if not os.path.exists(save_path):\n                os.makedirs(save_path)\n            if self.visualization:\n                if isinstance(img, np.ndarray):\n                    cv2.imwrite(save_name, img)\n                visualize_pose(save_name, keypoint_res, visual_thresh=0.5, save_dir=save_path)\n        return store_res\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(img=images_decode[0], **kwargs)\n        results = json.dumps(results)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                        
      prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(img=args.input_path,\n                               save_path=args.output_dir,\n                               visualization=args.visualization,\n                               use_gpu=args.use_gpu)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='pp_tinypose_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=bool,\n                                           default=True,\n                                           help=\"whether to save output as images.\")\n\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def 
inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.predict(img=image, use_gpu=use_gpu, visualization=True, save_path=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='pp-tinypose')\n        return interface\n"
  },
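The `run_cmd` entry point in module.py assembles an argparse parser with separate input and config argument groups, then forwards the parsed flags into `predict`. A stripped-down, framework-free sketch of that parser (the `build_parser` helper is hypothetical; the flags and defaults mirror the module's):

```python
import argparse


def build_parser(name="pp-tinypose"):
    # hypothetical helper mirroring the parser assembled in run_cmd
    parser = argparse.ArgumentParser(description="Run the {} module.".format(name),
                                     prog='hub run {}'.format(name),
                                     usage='%(prog)s',
                                     add_help=True)
    config = parser.add_argument_group(
        title="Config options",
        description="Run configuration for controlling module behavior, not required.")
    config.add_argument('--output_dir', type=str, default='pp_tinypose_output',
                        help="The directory to save output images.")
    config.add_argument('--visualization', type=bool, default=True,
                        help="whether to save output as images.")
    config.add_argument('--use_gpu', action='store_true', help="use GPU or not")
    parser.add_argument('--input_path', type=str, help="path to image.")
    return parser


# simulate `hub run pp-tinypose --input_path test.jpg --use_gpu`
args = build_parser().parse_args(['--input_path', 'test.jpg', '--use_gpu'])
```

Note that `--use_gpu` uses `action='store_true'`, so its mere presence flips the flag, whereas `--output_dir` and `--input_path` take values.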
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/preprocess.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport base64\n\nimport cv2\nimport numpy as np\n\nfrom .keypoint_preprocess import get_affine_transform\n\n\ndef decode_image(im_file, im_info):\n    \"\"\"read rgb image\n    Args:\n        im_file (str|np.ndarray): input can be image path or np.ndarray\n        im_info (dict): info of image\n    Returns:\n        im (np.ndarray):  processed image (np.ndarray)\n        im_info (dict): info of processed image\n    \"\"\"\n    if isinstance(im_file, str):\n        with open(im_file, 'rb') as f:\n            im_read = f.read()\n        data = np.frombuffer(im_read, dtype='uint8')\n        im = cv2.imdecode(data, 1)  # BGR mode, but need RGB mode\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n    else:\n        im = cv2.cvtColor(im_file, cv2.COLOR_BGR2RGB)\n    im_info['im_shape'] = np.array(im.shape[:2], dtype=np.float32)\n    im_info['scale_factor'] = np.array([1., 1.], dtype=np.float32)\n    return im, im_info\n\n\nclass Resize(object):\n    \"\"\"resize image by target_size and max_size\n    Args:\n        target_size (int): the target size of image\n        keep_ratio (bool): whether keep_ratio or not, default true\n        interp (int): method of resize\n    \"\"\"\n\n    def __init__(self, target_size, keep_ratio=True, interp=cv2.INTER_LINEAR):\n        if isinstance(target_size, int):\n            target_size = [target_size, 
target_size]\n        self.target_size = target_size\n        self.keep_ratio = keep_ratio\n        self.interp = interp\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        assert len(self.target_size) == 2\n        assert self.target_size[0] > 0 and self.target_size[1] > 0\n        im_channel = im.shape[2]\n        im_scale_y, im_scale_x = self.generate_scale(im)\n        im = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=self.interp)\n        im_info['im_shape'] = np.array(im.shape[:2]).astype('float32')\n        im_info['scale_factor'] = np.array([im_scale_y, im_scale_x]).astype('float32')\n        return im, im_info\n\n    def generate_scale(self, im):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n        Returns:\n            im_scale_x: the resize ratio of X\n            im_scale_y: the resize ratio of Y\n        \"\"\"\n        origin_shape = im.shape[:2]\n        im_c = im.shape[2]\n        if self.keep_ratio:\n            im_size_min = np.min(origin_shape)\n            im_size_max = np.max(origin_shape)\n            target_size_min = np.min(self.target_size)\n            target_size_max = np.max(self.target_size)\n            im_scale = float(target_size_min) / float(im_size_min)\n            if np.round(im_scale * im_size_max) > target_size_max:\n                im_scale = float(target_size_max) / float(im_size_max)\n            im_scale_x = im_scale\n            im_scale_y = im_scale\n        else:\n            resize_h, resize_w = self.target_size\n            im_scale_y = resize_h / float(origin_shape[0])\n            im_scale_x = resize_w / float(origin_shape[1])\n        return im_scale_y, im_scale_x\n\n\nclass 
NormalizeImage(object):\n    \"\"\"normalize image\n    Args:\n        mean (list): im - mean\n        std (list): im / std\n        is_scale (bool): whether to scale im by 1 / 255 first\n    \"\"\"\n\n    def __init__(self, mean, std, is_scale=True):\n        self.mean = mean\n        self.std = std\n        self.is_scale = is_scale\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n\n        if self.is_scale:\n            im = im / 255.0\n        im -= mean\n        im /= std\n        return im, im_info\n\n\nclass Permute(object):\n    \"\"\"permute image from HWC to CHW\n    \"\"\"\n\n    def __init__(self, ):\n        super(Permute, self).__init__()\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        im = im.transpose((2, 0, 1)).copy()\n        return im, im_info\n\n\nclass PadStride(object):\n    \"\"\"Pad the image for models with FPN, instead of PadBatch(pad_to_stride) in the original config\n    Args:\n        stride (int): models with FPN require image shape % stride == 0\n    \"\"\"\n\n    def __init__(self, stride=0):\n        self.coarsest_stride = stride\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        
Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        coarsest_stride = self.coarsest_stride\n        if coarsest_stride <= 0:\n            return im, im_info\n        im_c, im_h, im_w = im.shape\n        pad_h = int(np.ceil(float(im_h) / coarsest_stride) * coarsest_stride)\n        pad_w = int(np.ceil(float(im_w) / coarsest_stride) * coarsest_stride)\n        padding_im = np.zeros((im_c, pad_h, pad_w), dtype=np.float32)\n        padding_im[:, :im_h, :im_w] = im\n        return padding_im, im_info\n\n\nclass LetterBoxResize(object):\n\n    def __init__(self, target_size):\n        \"\"\"\n        Resize image to target size, convert normalized xywh to pixel xyxy\n        format ([x_center, y_center, width, height] -> [x0, y0, x1, y1]).\n        Args:\n            target_size (int|list): image target size.\n        \"\"\"\n        super(LetterBoxResize, self).__init__()\n        if isinstance(target_size, int):\n            target_size = [target_size, target_size]\n        self.target_size = target_size\n\n    def letterbox(self, img, height, width, color=(127.5, 127.5, 127.5)):\n        # letterbox: resize a rectangular image to a padded rectangular\n        shape = img.shape[:2]  # [height, width]\n        ratio_h = float(height) / shape[0]\n        ratio_w = float(width) / shape[1]\n        ratio = min(ratio_h, ratio_w)\n        new_shape = (round(shape[1] * ratio), round(shape[0] * ratio))  # [width, height]\n        padw = (width - new_shape[0]) / 2\n        padh = (height - new_shape[1]) / 2\n        top, bottom = round(padh - 0.1), round(padh + 0.1)\n        left, right = round(padw - 0.1), round(padw + 0.1)\n\n        img = cv2.resize(img, new_shape, interpolation=cv2.INTER_AREA)  # resized, no border\n        img = cv2.copyMakeBorder(img, top, bottom, left, 
right, cv2.BORDER_CONSTANT, value=color)  # padded rectangular\n        return img, ratio, padw, padh\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        assert len(self.target_size) == 2\n        assert self.target_size[0] > 0 and self.target_size[1] > 0\n        height, width = self.target_size\n        h, w = im.shape[:2]\n        im, ratio, padw, padh = self.letterbox(im, height=height, width=width)\n\n        new_shape = [round(h * ratio), round(w * ratio)]\n        im_info['im_shape'] = np.array(new_shape, dtype=np.float32)\n        im_info['scale_factor'] = np.array([ratio, ratio], dtype=np.float32)\n        return im, im_info\n\n\nclass Pad(object):\n\n    def __init__(self, size, fill_value=[114.0, 114.0, 114.0]):\n        \"\"\"\n        Pad image to a specified size.\n        Args:\n            size (list[int]): image target size\n            fill_value (list[float]): rgb value of pad area, default (114.0, 114.0, 114.0)\n        \"\"\"\n        super(Pad, self).__init__()\n        if isinstance(size, int):\n            size = [size, size]\n        self.size = size\n        self.fill_value = fill_value\n\n    def __call__(self, im, im_info):\n        im_h, im_w = im.shape[:2]\n        h, w = self.size\n        if h == im_h and w == im_w:\n            im = im.astype(np.float32)\n            return im, im_info\n\n        canvas = np.ones((h, w, 3), dtype=np.float32)\n        canvas *= np.array(self.fill_value, dtype=np.float32)\n        canvas[0:im_h, 0:im_w, :] = im.astype(np.float32)\n        im = canvas\n        return im, im_info\n\n\nclass WarpAffine(object):\n    \"\"\"Warp affine the image\n    \"\"\"\n\n    def __init__(self, keep_res=False, pad=31, input_h=512, input_w=512, scale=0.4, 
shift=0.1):\n        self.keep_res = keep_res\n        self.pad = pad\n        self.input_h = input_h\n        self.input_w = input_w\n        self.scale = scale\n        self.shift = shift\n\n    def __call__(self, im, im_info):\n        \"\"\"\n        Args:\n            im (np.ndarray): image (np.ndarray)\n            im_info (dict): info of image\n        Returns:\n            im (np.ndarray):  processed image (np.ndarray)\n            im_info (dict): info of processed image\n        \"\"\"\n        img = cv2.cvtColor(im, cv2.COLOR_RGB2BGR)\n\n        h, w = img.shape[:2]\n\n        if self.keep_res:\n            input_h = (h | self.pad) + 1\n            input_w = (w | self.pad) + 1\n            s = np.array([input_w, input_h], dtype=np.float32)\n            c = np.array([w // 2, h // 2], dtype=np.float32)\n\n        else:\n            s = max(h, w) * 1.0\n            input_h, input_w = self.input_h, self.input_w\n            c = np.array([w / 2., h / 2.], dtype=np.float32)\n\n        trans_input = get_affine_transform(c, s, 0, [input_w, input_h])\n        img = cv2.resize(img, (w, h))\n        inp = cv2.warpAffine(img, trans_input, (input_w, input_h), flags=cv2.INTER_LINEAR)\n        return inp, im_info\n\n\ndef preprocess(im, preprocess_ops):\n    # process image by preprocess_ops\n    im_info = {\n        'scale_factor': np.array([1., 1.], dtype=np.float32),\n        'im_shape': None,\n    }\n    im, im_info = decode_image(im, im_info)\n    for operator in preprocess_ops:\n        im, im_info = operator(im, im_info)\n    return im, im_info\n\n\ndef cv2_to_base64(image: np.ndarray):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
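The keep-ratio branch of `Resize.generate_scale` in preprocess.py fits the shorter image side to `min(target_size)`, then backs off to the longer-side ratio if the longer side would overflow `max(target_size)`. A condensed, framework-free sketch of that logic (`generate_scale` here is a hypothetical free function mirroring the method):

```python
def generate_scale(origin_shape, target_size, keep_ratio=True):
    # mirrors Resize.generate_scale: one shared scale when the aspect ratio is kept
    if keep_ratio:
        im_size_min, im_size_max = min(origin_shape), max(origin_shape)
        target_min, target_max = min(target_size), max(target_size)
        scale = float(target_min) / float(im_size_min)
        # back off if the longer side would overflow the target
        if round(scale * im_size_max) > target_max:
            scale = float(target_max) / float(im_size_max)
        return scale, scale  # (scale_y, scale_x) are equal in this branch
    # otherwise each axis is scaled independently (aspect ratio may change)
    resize_h, resize_w = target_size
    h, w = origin_shape
    return resize_h / float(h), resize_w / float(w)


# a 480x640 image resized toward a 320x320 target: 320/480 would push the long
# side to ~427, so the scale backs off to 320/640 = 0.5
scales = generate_scale((480, 640), (320, 320))
```

The returned pair feeds straight into `cv2.resize(..., fx=im_scale_x, fy=im_scale_y)` in the module, which is why both branches return per-axis factors rather than a target shape.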
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/7799a8ccc5f6471b9d56fb6eff94f82a08b70ca2c7594d3f99877e366c0a2619'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"pp-tinypose\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('pp_tinypose_output')\n\n    def test_predict1(self):\n        results = self.module.predict(img='tests/test.jpg', visualization=False)\n        kps = results[0][1]\n        self.assertIsInstance(kps, list)\n\n    def test_predict2(self):\n        results = self.module.predict(img=cv2.imread('tests/test.jpg'), visualization=False)\n        kps = results[0][1]\n        self.assertIsInstance(kps, list)\n\n    def test_predict3(self):\n        results = self.module.predict(img=cv2.imread('tests/test.jpg'), visualization=True)\n        kps = results[0][1]\n        self.assertIsInstance(kps, list)\n\n    def test_predict4(self):\n        self.assertRaises(FileNotFoundError, self.module.predict, img='no.jpg')\n\n    def test_predict5(self):\n        self.assertRaises(TypeError, self.module.predict, img=['test.jpg'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/keypoint_detection/pp-tinypose/visualize.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import division\n\nimport math\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\nfrom PIL import ImageFile\n\nImageFile.LOAD_TRUNCATED_IMAGES = True\n\n\ndef visualize_box(im, results, labels, threshold=0.5):\n    \"\"\"\n    Args:\n        im (str/np.ndarray): path of image/np.ndarray read by cv2\n        results (dict): include 'boxes': np.ndarray: shape:[N,6], N: number of boxes,\n                        matrix element:[class, score, x_min, y_min, x_max, y_max]\n                        MaskRCNN's results include 'masks': np.ndarray:\n                        shape:[N, im_h, im_w]\n        labels (list): labels:['class1', ..., 'classn']\n        threshold (float): Threshold of score.\n    Returns:\n        im (PIL.Image.Image): visualized image\n    \"\"\"\n    if isinstance(im, str):\n        im = Image.open(im).convert('RGB')\n    elif isinstance(im, np.ndarray):\n        im = Image.fromarray(im)\n    if 'boxes' in results and len(results['boxes']) > 0:\n        im = draw_box(im, results['boxes'], labels, threshold=threshold)\n    return im\n\n\ndef get_color_map_list(num_classes):\n    \"\"\"\n    Args:\n        num_classes (int): number of classes\n    Returns:\n        color_map (list): RGB color list\n    \"\"\"\n    color_map = num_classes * [0, 0, 0]\n    for i in 
range(0, num_classes):\n        j = 0\n        lab = i\n        while lab:\n            color_map[i * 3] |= (((lab >> 0) & 1) << (7 - j))\n            color_map[i * 3 + 1] |= (((lab >> 1) & 1) << (7 - j))\n            color_map[i * 3 + 2] |= (((lab >> 2) & 1) << (7 - j))\n            j += 1\n            lab >>= 3\n    color_map = [color_map[i:i + 3] for i in range(0, len(color_map), 3)]\n    return color_map\n\n\ndef draw_box(im, np_boxes, labels, threshold=0.5):\n    \"\"\"\n    Args:\n        im (PIL.Image.Image): PIL image\n        np_boxes (np.ndarray): shape:[N,6], N: number of boxes,\n                               matrix element:[class, score, x_min, y_min, x_max, y_max]\n        labels (list): labels:['class1', ..., 'classn']\n        threshold (float): threshold of box\n    Returns:\n        im (PIL.Image.Image): visualized image\n    \"\"\"\n    draw_thickness = min(im.size) // 320\n    draw = ImageDraw.Draw(im)\n    clsid2color = {}\n    color_list = get_color_map_list(len(labels))\n    expect_boxes = (np_boxes[:, 1] > threshold) & (np_boxes[:, 0] > -1)\n    np_boxes = np_boxes[expect_boxes, :]\n\n    for dt in np_boxes:\n        clsid, bbox, score = int(dt[0]), dt[2:], dt[1]\n        if clsid not in clsid2color:\n            clsid2color[clsid] = color_list[clsid]\n        color = tuple(clsid2color[clsid])\n\n        if len(bbox) == 4:\n            xmin, ymin, xmax, ymax = bbox\n            print('class_id:{:d}, confidence:{:.4f}, left_top:[{:.2f},{:.2f}],'\n                  'right_bottom:[{:.2f},{:.2f}]'.format(int(clsid), score, xmin, ymin, xmax, ymax))\n            # draw bbox\n            draw.line([(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin), (xmin, ymin)],\n                      width=draw_thickness,\n                      fill=color)\n        elif len(bbox) == 8:\n            x1, y1, x2, y2, x3, y3, x4, y4 = bbox\n            draw.line([(x1, y1), (x2, y2), (x3, y3), (x4, y4), (x1, y1)], width=2, fill=color)\n            xmin = min(x1, x2, 
x3, x4)\n            ymin = min(y1, y2, y3, y4)\n\n        # draw label\n        text = \"{} {:.4f}\".format(labels[clsid], score)\n        tw, th = draw.textsize(text)\n        draw.rectangle([(xmin + 1, ymin - th), (xmin + tw + 1, ymin)], fill=color)\n        draw.text((xmin + 1, ymin - th), text, fill=(255, 255, 255))\n    return im\n\n\ndef get_color(idx):\n    idx = idx * 3\n    color = ((37 * idx) % 255, (17 * idx) % 255, (29 * idx) % 255)\n    return color\n\n\ndef visualize_pose(imgfile,\n                   results,\n                   visual_thresh=0.6,\n                   save_name='pose.jpg',\n                   save_dir='output',\n                   returnimg=False,\n                   ids=None):\n    try:\n        import matplotlib.pyplot as plt\n        import matplotlib\n        plt.switch_backend('agg')\n    except Exception as e:\n        raise e\n    skeletons, scores = results['keypoint']\n    skeletons = np.array(skeletons)\n    kpt_nums = 17\n    if len(skeletons) > 0:\n        kpt_nums = skeletons.shape[1]\n    if kpt_nums == 17:  #plot coco keypoint\n        EDGES = [(0, 1), (0, 2), (1, 3), (2, 4), (3, 5), (4, 6), (5, 7), (6, 8), (7, 9), (8, 10), (5, 11), (6, 12),\n                 (11, 13), (12, 14), (13, 15), (14, 16), (11, 12)]\n    else:  #plot mpii keypoint\n        EDGES = [(0, 1), (1, 2), (3, 4), (4, 5), (2, 6), (3, 6), (6, 7), (7, 8), (8, 9), (10, 11), (11, 12), (13, 14),\n                 (14, 15), (8, 12), (8, 13)]\n    NUM_EDGES = len(EDGES)\n\n    colors = [[255, 0, 0], [255, 85, 0], [255, 170, 0], [255, 255, 0], [170, 255, 0], [85, 255, 0], [0, 255, 0], \\\n            [0, 255, 85], [0, 255, 170], [0, 255, 255], [0, 170, 255], [0, 85, 255], [0, 0, 255], [85, 0, 255], \\\n            [170, 0, 255], [255, 0, 255], [255, 0, 170], [255, 0, 85]]\n    cmap = matplotlib.cm.get_cmap('hsv')\n    plt.figure()\n\n    img = cv2.imread(imgfile) if type(imgfile) == str else imgfile\n\n    color_set = results['colors'] if 'colors' in results 
else None\n\n    if 'bbox' in results and ids is None:\n        bboxs = results['bbox']\n        for j, rect in enumerate(bboxs):\n            xmin, ymin, xmax, ymax = rect\n            color = colors[0] if color_set is None else colors[color_set[j] % len(colors)]\n            cv2.rectangle(img, (xmin, ymin), (xmax, ymax), color, 1)\n\n    canvas = img.copy()\n    for i in range(kpt_nums):\n        for j in range(len(skeletons)):\n            if skeletons[j][i, 2] < visual_thresh:\n                continue\n            if ids is None:\n                color = colors[i] if color_set is None else colors[color_set[j] % len(colors)]\n            else:\n                color = get_color(ids[j])\n\n            cv2.circle(canvas, tuple(skeletons[j][i, 0:2].astype('int32')), 2, color, thickness=-1)\n\n    to_plot = cv2.addWeighted(img, 0.3, canvas, 0.7, 0)\n    fig = matplotlib.pyplot.gcf()\n\n    stickwidth = 2\n\n    for i in range(NUM_EDGES):\n        for j in range(len(skeletons)):\n            edge = EDGES[i]\n            if skeletons[j][edge[0], 2] < visual_thresh or skeletons[j][edge[1], 2] < visual_thresh:\n                continue\n\n            cur_canvas = canvas.copy()\n            X = [skeletons[j][edge[0], 1], skeletons[j][edge[1], 1]]\n            Y = [skeletons[j][edge[0], 0], skeletons[j][edge[1], 0]]\n            mX = np.mean(X)\n            mY = np.mean(Y)\n            length = ((X[0] - X[1])**2 + (Y[0] - Y[1])**2)**0.5\n            angle = math.degrees(math.atan2(X[0] - X[1], Y[0] - Y[1]))\n            polygon = cv2.ellipse2Poly((int(mY), int(mX)), (int(length / 2), stickwidth), int(angle), 0, 360, 1)\n            if ids is None:\n                color = colors[i] if color_set is None else colors[color_set[j] % len(colors)]\n            else:\n                color = get_color(ids[j])\n            cv2.fillConvexPoly(cur_canvas, polygon, color)\n            canvas = cv2.addWeighted(canvas, 0.4, cur_canvas, 0.6, 0)\n    if returnimg:\n        return 
canvas\n    save_name = os.path.join(save_dir, os.path.splitext(os.path.basename(imgfile))[0] + '_vis.jpg')\n    plt.imsave(save_name, canvas[:, :, ::-1])\n    print(\"keypoint visualize image saved to: \" + save_name)\n    plt.close()\n"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/README.md",
    "content": "# dim_vgg16_matting\n\n|模型名称|dim_vgg16_matting|\n| :--- | :---: |\n|类别|图像-抠图|\n|网络|dim_vgg16|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|164MB|\n|指标|SAD 112.73|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10' />\n    <img src=\"https://user-images.githubusercontent.com/35907364/144779164-47146d3a-58c9-4a38-b968-3530aa9a0137.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - Matting（精细化分割/影像去背/抠图）是指借由计算前景的颜色和透明度，将前景从影像中撷取出来的技术，可用于替换背景、影像合成、视觉特效，在电影工业中被广泛地使用。影像中的每个像素会有代表其前景透明度的值，称作阿法值（Alpha），一张影像中所有阿法值的集合称作阿法遮罩（Alpha Matte），将影像被遮罩所涵盖的部分取出即可完成前景的分离。dim_vgg16_matting是一种需要trimap作为输入的matting模型。\n\n\n\n  - 更多详情请参考：[dim_vgg16_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install dim_vgg16_matting\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run dim_vgg16_matting --input_path \"/PATH/TO/IMAGE\" --trimap_path \"/PATH/TO/TRIMAP\"\n    ```\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"dim_vgg16_matting\")\n\n      result = model.predict(image_list=[\"/PATH/TO/IMAGE\"], trimap_list=[\"/PATH/TO/TRIMAP\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def 
predict(self,\n                    image_list,\n                    trimap_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - 人像matting预测API，用于将输入图片中的人像分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)): 图片输入路径或者BGR格式numpy数据。\n            - trimap_list (list(str | numpy.ndarray)): trimap输入路径或者单通道灰度图片。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"dim_vgg16_matting_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署人像matting在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m dim_vgg16_matting\n      ```\n\n    - 这样就完成了一个人像matting在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量；如使用CPU预测则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果：\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))], 'trimaps':[cv2_to_base64(cv2.imread(\"/PATH/TO/TRIMAP\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/dim_vgg16_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## 五、更新历史\n\n* 
1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/README_en.md",
    "content": "# dim_vgg16_matting\n\n|Module Name|dim_vgg16_matting|\n| :--- | :---: |\n|Category|Matting|\n|Network|dim_vgg16|\n|Dataset|Baidu self-built dataset|\n|Support Fine-tuning|No|\n|Module Size|164MB|\n|Data Indicators|SAD 112.73|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/>\n    <img src=\"https://user-images.githubusercontent.com/35907364/144779164-47146d3a-58c9-4a38-b968-3530aa9a0137.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - Matting is the technique of extracting the foreground from an image by calculating its color and transparency. It can be used for background replacement, image composition, and visual effects, and is widely used in the film industry. Each pixel in the image has a value that represents its foreground transparency, called Alpha. The set of all Alpha values in an image is called the Alpha Matte. The part of the image covered by the matte can be extracted to complete foreground separation. dim_vgg16_matting is a matting model that requires a trimap as input.\n\n\n\n  - For more information, please refer to: [dim_vgg16_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install dim_vgg16_matting\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run dim_vgg16_matting --input_path \"/PATH/TO/IMAGE\" --trimap_path \"/PATH/TO/TRIMAP\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"dim_vgg16_matting\")\n\n      result = model.predict(image_list=[\"/PATH/TO/IMAGE\"], trimap_list=[\"/PATH/TO/TRIMAP\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                    image_list,\n                    trimap_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - Prediction API for matting.\n\n        - **Parameter**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data; ndarray.shape is in the format \[H, W, C\], BGR.\n            - trimap_list (list(str | numpy.ndarray)): Trimap path or trimap data; ndarray.shape is in the format \[H, W\], grayscale.\n            - visualization (bool): Whether to save the prediction results as image files, default is False.\n            - save_path (str): Save path of images, \"dim_vgg16_matting_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray)): The list of model results.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of matting.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m dim_vgg16_matting\n      ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))], 'trimaps':[cv2_to_base64(cv2.imread(\"/PATH/TO/TRIMAP\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/dim_vgg16_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport time\nimport argparse\nfrom typing import Callable, Union, List, Tuple\n\nimport numpy as np\nimport cv2\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nfrom paddleseg.models import layers\n\nfrom dim_vgg16_matting.vgg import VGG16\nimport dim_vgg16_matting.processor as P\n\n\n@moduleinfo(\n    name=\"dim_vgg16_matting\",\n    type=\"CV/matting\",\n    author=\"paddlepaddle\",\n    summary=\"dim_vgg16_matting is a matting model\",\n    version=\"1.0.0\"\n)\nclass DIMVGG16(nn.Layer):\n    \"\"\"\n    The DIM implementation based on PaddlePaddle.\n\n    The original article refers to\n    Ning Xu, et al. \"Deep Image Matting\"\n    (https://arxiv.org/pdf/1703.03872.pdf).\n\n    Args:\n        stage (int, optional): The stage of the model. Default: 3.\n        decoder_input_channels (int, optional): The number of channels of the decoder input. Default: 512.\n        pretrained (str, optional): The path of the pretrained model. 
Default: None.\n\n    \"\"\"\n    def __init__(self,\n                 stage: int = 3,\n                 decoder_input_channels: int = 512,\n                 pretrained: str = None):\n        super(DIMVGG16, self).__init__()\n\n        self.backbone = VGG16()\n        self.pretrained = pretrained\n        self.stage = stage\n\n        decoder_output_channels = [64, 128, 256, 512]\n        self.decoder = Decoder(\n            input_channels=decoder_input_channels,\n            output_channels=decoder_output_channels)\n        if self.stage == 2:\n            for param in self.backbone.parameters():\n                param.stop_gradient = True\n            for param in self.decoder.parameters():\n                param.stop_gradient = True\n        if self.stage >= 2:\n            self.refine = Refine()\n\n        self.transforms = P.Compose([P.LoadImages(), P.LimitLong(max_long=3840), P.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'dim-vgg16.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def preprocess(self, img: Union[str, np.ndarray], transforms: Callable, trimap: Union[str, np.ndarray] = None) -> dict:\n        data = {}\n        data['img'] = img\n        if trimap is not None:\n            data['trimap'] = trimap\n            data['gt_fields'] = ['trimap']\n        data['trans_info'] = []\n        data = self.transforms(data)\n        data['img'] = paddle.to_tensor(data['img'])\n        data['img'] = data['img'].unsqueeze(0)\n        if trimap is not None:\n            data['trimap'] = paddle.to_tensor(data['trimap'])\n            data['trimap'] = data['trimap'].unsqueeze((0, 1))\n\n        return data
\n    \n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        input_shape = paddle.shape(inputs['img'])[-2:]\n        x = paddle.concat([inputs['img'], inputs['trimap'] / 255], axis=1)\n        fea_list = self.backbone(x)\n\n        # decoder stage\n        up_shape = []\n        for i in range(5):\n            up_shape.append(paddle.shape(fea_list[i])[-2:])\n        alpha_raw = self.decoder(fea_list, up_shape)\n        alpha_raw = F.interpolate(\n            alpha_raw, input_shape, mode='bilinear', align_corners=False)\n        logit_dict = {'alpha_raw': alpha_raw}\n        if self.stage < 2:\n            return logit_dict\n\n        if self.stage >= 2:\n            # refine stage\n            refine_input = paddle.concat([inputs['img'], alpha_raw], axis=1)\n            alpha_refine = self.refine(refine_input)\n\n            # finally alpha\n            alpha_pred = alpha_refine + alpha_raw\n            alpha_pred = F.interpolate(\n                alpha_pred, input_shape, mode='bilinear', align_corners=False)\n            if not self.training:\n                alpha_pred = paddle.clip(alpha_pred, min=0, max=1)\n            logit_dict['alpha_pred'] = alpha_pred\n   \n        return alpha_pred\n    \n    def predict(self, image_list: list, trimap_list: list, visualization: bool =False, save_path: str = \"dim_vgg16_matting_output\") -> list:\n        self.eval()\n        result= []\n        with paddle.no_grad():\n            for i, im_path in enumerate(image_list):\n                trimap = trimap_list[i] if trimap_list is not None else None\n                data = self.preprocess(img=im_path, transforms=self.transforms, trimap=trimap)\n                alpha_pred = self.forward(data)\n                alpha_pred = P.reverse_transform(alpha_pred, data['trans_info'])\n                alpha_pred = (alpha_pred.numpy()).squeeze()\n                alpha_pred = (alpha_pred * 255).astype('uint8')\n                alpha_pred = P.save_alpha_pred(alpha_pred, 
trimap)\n                result.append(alpha_pred)\n                if visualization:\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    cv2.imwrite(image_save_path, alpha_pred)\n\n        return result\n    \n    @serving\n    def serving_method(self, images: list, trimaps:list, **kwargs) -> dict:\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n\n        if trimaps is not None:\n            trimap_decoder = [cv2.cvtColor(P.base64_to_cv2(trimap), cv2.COLOR_BGR2GRAY) for trimap in trimaps]\n        else:\n            trimap_decoder = None\n\n        outputs = self.predict(image_list=images_decode, trimap_list= trimap_decoder, **kwargs)\n        \n        serving_data = [P.cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list) -> list:\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        if args.trimap_path is not None:\n            trimap_list = [args.trimap_path]\n        else:\n            trimap_list = None\n\n        results = self.predict(image_list=[args.input_path], trimap_list=trimap_list, save_path=args.output_dir, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default=\"dim_vgg16_matting_output\", help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--trimap_path', type=str, help=\"path to trimap.\")                \n                   \n    \nclass Up(nn.Layer):\n    def __init__(self, input_channels: int, output_channels: int):\n        super().__init__()\n        self.conv = layers.ConvBNReLU(\n            input_channels,\n            output_channels,\n            kernel_size=5,\n            padding=2,\n            bias_attr=False)\n\n    def forward(self, x: paddle.Tensor, skip: paddle.Tensor, output_shape: list) -> paddle.Tensor:\n        x = F.interpolate(\n            x, size=output_shape, mode='bilinear', align_corners=False)\n        x = x + skip\n        x = self.conv(x)\n        x = F.relu(x)\n\n        return 
x\n\n\nclass Decoder(nn.Layer):\n    def __init__(self, input_channels: int, output_channels: list = [64, 128, 256, 512]):\n        super().__init__()\n        self.deconv6 = nn.Conv2D(\n            input_channels, input_channels, kernel_size=1, bias_attr=False)\n        self.deconv5 = Up(input_channels, output_channels[-1])\n        self.deconv4 = Up(output_channels[-1], output_channels[-2])\n        self.deconv3 = Up(output_channels[-2], output_channels[-3])\n        self.deconv2 = Up(output_channels[-3], output_channels[-4])\n        self.deconv1 = Up(output_channels[-4], 64)\n\n        self.alpha_conv = nn.Conv2D(\n            64, 1, kernel_size=5, padding=2, bias_attr=False)\n\n    def forward(self, fea_list: list, shape_list: list) -> paddle.Tensor:\n        x = fea_list[-1]\n        x = self.deconv6(x)\n        x = self.deconv5(x, fea_list[4], shape_list[4])\n        x = self.deconv4(x, fea_list[3], shape_list[3])\n        x = self.deconv3(x, fea_list[2], shape_list[2])\n        x = self.deconv2(x, fea_list[1], shape_list[1])\n        x = self.deconv1(x, fea_list[0], shape_list[0])\n        alpha = self.alpha_conv(x)\n        alpha = F.sigmoid(alpha)\n\n        return alpha\n\n\nclass Refine(nn.Layer):\n    def __init__(self):\n        super().__init__()\n        self.conv1 = layers.ConvBNReLU(\n            4, 64, kernel_size=3, padding=1, bias_attr=False)\n        self.conv2 = layers.ConvBNReLU(\n            64, 64, kernel_size=3, padding=1, bias_attr=False)\n        self.conv3 = layers.ConvBNReLU(\n            64, 64, kernel_size=3, padding=1, bias_attr=False)\n        self.alpha_pred = layers.ConvBNReLU(\n            64, 1, kernel_size=3, padding=1, bias_attr=False)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv1(x)\n        x = self.conv2(x)\n        x = self.conv3(x)\n        alpha = self.alpha_pred(x)\n\n        return alpha\n"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport random\nimport base64\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddleseg.transforms import functional\nfrom PIL import Image\n\n\nclass Compose:\n    \"\"\"\n    Do transformation on input data with corresponding pre-processing and augmentation operations.\n    The shape of input data to all operations is [height, width, channels].\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if 'trans_info' not in data:\n            data['trans_info'] = []\n        for op in self.transforms:\n            data = op(data)\n            if data is None:\n                return None\n\n        data['img'] = np.transpose(data['img'], (2, 0, 1))\n        for key in data.get('gt_fields', []):\n            if len(data[key].shape) == 2:\n                continue\n            data[key] = np.transpose(data[key], (2, 0, 1))\n\n        return data\n\n\nclass LoadImages:\n    \"\"\"\n    Read images from image path.\n\n    Args:\n        to_rgb (bool, optional): If converting image to RGB color 
space. Default: True.\n    \"\"\"\n    def __init__(self, to_rgb: bool = True):\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if isinstance(data['img'], str):\n            data['img'] = cv2.imread(data['img'])\n\n        for key in data.get('gt_fields', []):\n            if isinstance(data[key], str):\n                data[key] = cv2.imread(data[key], cv2.IMREAD_UNCHANGED)\n            # if alpha or trimap has 3 channels, extract one.\n            if key in ['alpha', 'trimap']:\n                if len(data[key].shape) > 2:\n                    data[key] = data[key][:, :, 0]\n\n        if self.to_rgb:\n            data['img'] = cv2.cvtColor(data['img'], cv2.COLOR_BGR2RGB)\n            for key in data.get('gt_fields', []):\n                if len(data[key].shape) == 2:\n                    continue\n                data[key] = cv2.cvtColor(data[key], cv2.COLOR_BGR2RGB)\n\n        return data\n\n\nclass LimitLong:\n    \"\"\"\n    Limit the long edge of the image.\n\n    If the long edge is larger than max_long, resize it\n    to max_long, while scaling the short edge proportionally.\n\n    If the long edge is smaller than min_long, resize it\n    to min_long, while scaling the short edge proportionally.\n\n    Args:\n        max_long (int, optional): If the long edge of the image is larger than max_long,\n            it will be resized to max_long. Default: None.\n        min_long (int, optional): If the long edge of the image is smaller than min_long,\n            it will be resized to min_long. Default: None.\n    \"\"\"\n\n    def __init__(self, max_long=None, min_long=None):\n        if max_long is not None:\n            if not isinstance(max_long, int):\n                raise TypeError(\n                    \"Type of `max_long` is invalid. 
It should be int, but it is {}\"\n                    .format(type(max_long)))\n        if min_long is not None:\n            if not isinstance(min_long, int):\n                raise TypeError(\n                    \"Type of `min_long` is invalid. It should be int, but it is {}\"\n                    .format(type(min_long)))\n        if (max_long is not None) and (min_long is not None):\n            if min_long > max_long:\n                raise ValueError(\n                    '`max_long` should not be smaller than `min_long`, but they are {} and {}'\n                    .format(max_long, min_long))\n        self.max_long = max_long\n        self.min_long = min_long\n\n    def __call__(self, data):\n        h, w = data['img'].shape[:2]\n        long_edge = max(h, w)\n        target = long_edge\n        if (self.max_long is not None) and (long_edge > self.max_long):\n            target = self.max_long\n        elif (self.min_long is not None) and (long_edge < self.min_long):\n            target = self.min_long\n\n        if target != long_edge:\n            data['trans_info'].append(('resize', data['img'].shape[0:2]))\n            data['img'] = functional.resize_long(data['img'], target)\n            for key in data.get('gt_fields', []):\n                data[key] = functional.resize_long(data[key], target)\n\n        return data\n\n\nclass Normalize:\n    \"\"\"\n    Normalize an image.\n\n    Args:\n        mean (list, optional): The mean value of a data set. Default: [0.5, 0.5, 0.5].\n        std (list, optional): The standard deviation of a data set. 
Default: [0.5, 0.5, 0.5].\n\n    Raises:\n        ValueError: When mean/std is not list or any value in std is 0.\n    \"\"\"\n\n    def __init__(self, mean: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5), std: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5)):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, (list, tuple))\n                and isinstance(self.std, (list, tuple))):\n            raise ValueError(\n                \"{}: input type is invalid. It should be list or tuple\".format(\n                    self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, data: dict) -> dict:\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        data['img'] = functional.normalize(data['img'], mean, std)\n        if 'fg' in data.get('gt_fields', []):\n            data['fg'] = functional.normalize(data['fg'], mean, std)\n        if 'bg' in data.get('gt_fields', []):\n            data['bg'] = functional.normalize(data['bg'], mean, std)\n\n        return data\n\n\ndef reverse_transform(alpha: paddle.Tensor, trans_info: List[Tuple]):\n    \"\"\"Recover the prediction to its original shape.\"\"\"\n    for item in trans_info[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            alpha = F.interpolate(alpha, [h, w], mode='bilinear')\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            alpha = alpha[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return alpha\n\n\ndef save_alpha_pred(alpha: np.ndarray, trimap: np.ndarray = None):\n    \"\"\"\n    Apply the trimap constraint to the predicted alpha. The value of alpha\n    should be in range [0, 255] and its shape should be [h, w].\n    \"\"\"\n    if isinstance(trimap, str):\n        trimap = cv2.imread(trimap, 0)\n    
if trimap is not None:\n        alpha[trimap == 0] = 0\n        alpha[trimap == 255] = 255\n    alpha = alpha.astype('uint8')\n    return alpha\n\n\ndef cv2_to_base64(image: np.ndarray):\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/matting/dim_vgg16_matting/vgg.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List, Tuple\n\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2D, MaxPool2D, AvgPool2D\n\nfrom paddleseg.utils import utils\n\n\nclass ConvBlock(nn.Layer):\n    def __init__(self, input_channels: int, output_channels: int, groups: int, name: str = None):\n        super(ConvBlock, self).__init__()\n\n        self.groups = groups\n        self._conv_1 = Conv2D(\n            in_channels=input_channels,\n            out_channels=output_channels,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            weight_attr=ParamAttr(name=name + \"1_weights\"),\n            bias_attr=False)\n        if groups == 2 or groups == 3 or groups == 4:\n            self._conv_2 = Conv2D(\n                in_channels=output_channels,\n                out_channels=output_channels,\n                kernel_size=3,\n                stride=1,\n                padding=1,\n                weight_attr=ParamAttr(name=name + \"2_weights\"),\n                bias_attr=False)\n        if groups == 3 or groups == 4:\n            self._conv_3 = Conv2D(\n                in_channels=output_channels,\n                out_channels=output_channels,\n                
kernel_size=3,\n                stride=1,\n                padding=1,\n                weight_attr=ParamAttr(name=name + \"3_weights\"),\n                bias_attr=False)\n        if groups == 4:\n            self._conv_4 = Conv2D(\n                in_channels=output_channels,\n                out_channels=output_channels,\n                kernel_size=3,\n                stride=1,\n                padding=1,\n                weight_attr=ParamAttr(name=name + \"4_weights\"),\n                bias_attr=False)\n\n        self._pool = MaxPool2D(\n            kernel_size=2, stride=2, padding=0, return_mask=True)\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        x = self._conv_1(inputs)\n        x = F.relu(x)\n        if self.groups == 2 or self.groups == 3 or self.groups == 4:\n            x = self._conv_2(x)\n            x = F.relu(x)\n        if self.groups == 3 or self.groups == 4:\n            x = self._conv_3(x)\n            x = F.relu(x)\n        if self.groups == 4:\n            x = self._conv_4(x)\n            x = F.relu(x)\n        skip = x\n        x, max_indices = self._pool(x)\n        return x, max_indices, skip\n\n\nclass VGGNet(nn.Layer):\n    def __init__(self, input_channels: int = 4, layers: int = 11, pretrained: str = None):\n        super(VGGNet, self).__init__()\n        self.pretrained = pretrained\n\n        self.layers = layers\n        self.vgg_configure = {\n            11: [1, 1, 2, 2, 2],\n            13: [2, 2, 2, 2, 2],\n            16: [2, 2, 3, 3, 3],\n            19: [2, 2, 4, 4, 4]\n        }\n        assert self.layers in self.vgg_configure.keys(), \\\n            \"supported layers are {} but input layer is {}\".format(\n                self.vgg_configure.keys(), layers)\n        self.groups = self.vgg_configure[self.layers]\n\n        # For matting, the first conv layer takes a 4-channel input and is directly zero-initialized.\n        self._conv_block_1 = ConvBlock(\n            input_channels, 64, self.groups[0], name=\"conv1_\")\n        self._conv_block_2 = 
ConvBlock(64, 128, self.groups[1], name=\"conv2_\")\n        self._conv_block_3 = ConvBlock(128, 256, self.groups[2], name=\"conv3_\")\n        self._conv_block_4 = ConvBlock(256, 512, self.groups[3], name=\"conv4_\")\n        self._conv_block_5 = ConvBlock(512, 512, self.groups[4], name=\"conv5_\")\n\n        # This layer should be initialized from the converted VGG fc6 weights; initialization can be skipped for now.\n        self._conv_6 = Conv2D(\n            512, 512, kernel_size=3, padding=1, bias_attr=False)\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        fea_list = []\n        ids_list = []\n        x, ids, skip = self._conv_block_1(inputs)\n        fea_list.append(skip)\n        ids_list.append(ids)\n        x, ids, skip = self._conv_block_2(x)\n        fea_list.append(skip)\n        ids_list.append(ids)\n        x, ids, skip = self._conv_block_3(x)\n        fea_list.append(skip)\n        ids_list.append(ids)\n        x, ids, skip = self._conv_block_4(x)\n        fea_list.append(skip)\n        ids_list.append(ids)\n        x, ids, skip = self._conv_block_5(x)\n        fea_list.append(skip)\n        ids_list.append(ids)\n        x = F.relu(self._conv_6(x))\n        fea_list.append(x)\n        return fea_list\n\n\ndef VGG16(**args):\n    model = VGGNet(layers=16, **args)\n    return model"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/README.md",
    "content": "# gfm_resnet34_matting\n\n|模型名称|gfm_resnet34_matting|\n| :--- | :---: |\n|类别|图像-抠图|\n|网络|gfm_resnet34|\n|数据集|AM-2k|\n|是否支持Fine-tuning|否|\n|模型大小|562MB|\n|指标|SAD10.89|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145993777-9b69a85d-d31c-4743-8620-82b2a56ca1e7.jpg\" width = \"480\" height = \"350\" hspace='10'/>\n    <img src=\"https://user-images.githubusercontent.com/35907364/145993809-b0fb4bae-2c64-4868-99fc-500f19343442.png\" width = \"480\" height = \"350\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - Matting（精细化分割/影像去背/抠图）是指借由计算前景的颜色和透明度，将前景从影像中撷取出来的技术，可用于替换背景、影像合成、视觉特效，在电影工业中被广泛地使用。影像中的每个像素会有代表其前景透明度的值，称作阿法值（Alpha），一张影像中所有阿法值的集合称作阿法遮罩（Alpha Matte），将影像被遮罩所涵盖的部分取出即可完成前景的分离。gfm_resnet34_matting可生成抠图结果。\n\n\n\n  - 更多详情请参考：[gfm_resnet34_matting](https://github.com/JizhiziLi/GFM)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install gfm_resnet34_matting\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run gfm_resnet34_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"gfm_resnet34_matting\")\n      result = model.predict([cv2.imread(\"/PATH/TO/IMAGE\")])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                    image_list,\n                    visualization,\n                    
save_path):\n      ```\n\n        - 动物matting预测API，用于将输入图片中的动物分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)): 图片输入路径或者BGR格式numpy数据。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"gfm_resnet34_matting_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署动物matting在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m gfm_resnet34_matting\n      ```\n\n    - 这样就完成了一个动物matting在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/gfm_resnet34_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/README_en.md",
    "content": "# gfm_resnet34_matting\n\n|Module Name|gfm_resnet34_matting|\n| :--- | :---: |\n|Category|Image Matting|\n|Network|gfm_resnet34|\n|Dataset|AM-2k|\n|Support Fine-tuning|No|\n|Module Size|562MB|\n|Data Indicators|SAD10.89|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145993777-9b69a85d-d31c-4743-8620-82b2a56ca1e7.jpg\" width = \"480\" height = \"350\" hspace='10'/>\n    <img src=\"https://user-images.githubusercontent.com/35907364/145993809-b0fb4bae-2c64-4868-99fc-500f19343442.png\" width = \"480\" height = \"350\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - Matting is the technique of extracting the foreground from an image by computing its color and transparency. It can be used for background replacement, image composition, and visual effects, and is widely used in the film industry. Each pixel in the image has a value that represents its foreground transparency, called alpha; the set of all alpha values in an image is called the alpha matte. Extracting the part of the image covered by the matte completes the foreground separation.\n\n\n\n  - For more information, please refer to: [gfm_resnet34_matting](https://github.com/JizhiziLi/GFM)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install gfm_resnet34_matting\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run gfm_resnet34_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"gfm_resnet34_matting\")\n      result = model.predict([cv2.imread(\"/PATH/TO/IMAGE\")])\n      print(result)\n\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                    image_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - Animal matting prediction API, used to extract the animal foreground from the input images.\n\n        - **Parameter**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data, ndarray.shape is in the format \\[H, W, C\\], BGR.\n            - visualization (bool): Whether to save the recognition results as picture files, default is False.\n            - save_path (str): Save path of images, \"gfm_resnet34_matting_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray)): The list of model results.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online matting service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m gfm_resnet34_matting\n      ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/gfm_resnet34_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/gfm.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Callable\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom gfm_resnet34_matting.resnet import resnet34\n\n\ndef conv3x3(in_planes: int, out_planes: int, stride: int = 1) -> Callable:\n    \"\"\"3x3 convolution with padding\"\"\"\n    return nn.Conv2D(in_planes, out_planes, kernel_size=3, stride=stride, padding=1, bias_attr=False)\n\n\ndef conv_up_psp(in_channels: int, out_channels: int, up_sample: float) -> Callable:\n    return nn.Sequential(nn.Conv2D(in_channels, out_channels, 3, padding=1), nn.BatchNorm2D(out_channels), nn.ReLU(),\n                         nn.Upsample(scale_factor=up_sample, mode='bilinear', align_corners=False))\n\n\ndef build_bb(in_channels: int, mid_channels: int, out_channels: int) -> Callable:\n    return nn.Sequential(nn.Conv2D(in_channels, mid_channels, 3, dilation=2, padding=2), nn.BatchNorm2D(mid_channels),\n                         nn.ReLU(), nn.Conv2D(mid_channels, out_channels, 3, dilation=2, padding=2),\n                         nn.BatchNorm2D(out_channels), nn.ReLU(),\n                         nn.Conv2D(out_channels, out_channels, 3, dilation=2, padding=2), nn.BatchNorm2D(out_channels),\n                         nn.ReLU())\n\n\ndef build_decoder(in_channels: int, mid_channels_1: int, mid_channels_2: int, out_channels: int, 
last_bnrelu: bool,\n                  upsample_flag: bool) -> Callable:\n    layers = []\n    layers += [\n        nn.Conv2D(in_channels, mid_channels_1, 3, padding=1),\n        nn.BatchNorm2D(mid_channels_1),\n        nn.ReLU(),\n        nn.Conv2D(mid_channels_1, mid_channels_2, 3, padding=1),\n        nn.BatchNorm2D(mid_channels_2),\n        nn.ReLU(),\n        nn.Conv2D(mid_channels_2, out_channels, 3, padding=1)\n    ]\n    if last_bnrelu:\n        layers += [nn.BatchNorm2D(out_channels), nn.ReLU()]\n\n    if upsample_flag:\n        layers += [nn.Upsample(scale_factor=2, mode='bilinear')]\n\n    sequential = nn.Sequential(*layers)\n    return sequential\n\n\nclass BasicBlock(nn.Layer):\n    expansion = 1\n\n    def __init__(self, inplanes: int, planes: int, stride: int = 1, downsample=None):\n        super(BasicBlock, self).__init__()\n        self.conv1 = conv3x3(inplanes, planes, stride)\n        self.bn1 = nn.BatchNorm2D(planes)\n        self.relu = nn.ReLU()\n        self.conv2 = conv3x3(planes, planes)\n        self.bn2 = nn.BatchNorm2D(planes)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x: paddle.Tensor) -> Callable:\n        residual = x\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n        out = self.conv2(out)\n        out = self.bn2(out)\n        if self.downsample is not None:\n            residual = self.downsample(x)\n        out += residual\n        out = self.relu(out)\n        return out\n\n\nclass PSPModule(nn.Layer):\n\n    def __init__(self, features: paddle.Tensor, out_features: int = 1024, sizes: List[int] = (1, 2, 3, 6)):\n        super().__init__()\n        #self.stages = []\n        self.stages = nn.LayerList([self._make_stage(features, size) for size in sizes])\n        self.bottleneck = nn.Conv2D(features * (len(sizes) + 1), out_features, kernel_size=1)\n        self.relu = nn.ReLU()\n\n    def _make_stage(self, features: paddle.Tensor, size: 
int) -> Callable:\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = nn.Conv2D(features, features, kernel_size=1, bias_attr=False)\n        return nn.Sequential(prior, conv)\n\n    def forward(self, feats: paddle.Tensor) -> paddle.Tensor:\n        h, w = feats.shape[2], feats.shape[3]\n        priors = [F.upsample(stage(feats), size=(h, w), mode='bilinear', align_corners=True)\n                  for stage in self.stages] + [feats]\n        bottle = self.bottleneck(paddle.concat(priors, 1))\n        return self.relu(bottle)\n\n\nclass SELayer(nn.Layer):\n\n    def __init__(self, channel: int, reduction: int = 4):\n        super(SELayer, self).__init__()\n        self.avg_pool = nn.AdaptiveAvgPool2D(1)\n        self.fc = nn.Sequential(nn.Linear(channel, channel // reduction, bias_attr=False), nn.ReLU(),\n                                nn.Linear(channel // reduction, channel, bias_attr=False), nn.Sigmoid())\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        # Paddle tensors use .shape / .reshape (not PyTorch's .size() / .view())\n        b, c, _, _ = x.shape\n        y = self.avg_pool(x).reshape([b, c])\n        y = self.fc(y).reshape([b, c, 1, 1])\n        return x * y.expand_as(x)\n\n\nclass GFM(nn.Layer):\n    \"\"\"\n    The GFM implementation based on PaddlePaddle.\n\n    The original article refers to：\n    Bridging Composite and Real: Towards End-to-end Deep Image Matting [IJCV-2021]\n    Main network file (GFM).\n\n    Copyright (c) 2021, Jizhizi Li (jili8515@uni.sydney.edu.au)\n    Licensed under the MIT License (see LICENSE for details)\n    Github repo: https://github.com/JizhiziLi/GFM\n    Paper link (Arxiv): https://arxiv.org/abs/2010.16188\n\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.backbone = 'r34_2b'\n        self.rosta = 'TT'\n        if self.rosta == 'TT':\n            self.gd_channel = 3\n        else:\n            self.gd_channel = 2\n        if self.backbone == 'r34_2b':\n            self.resnet = resnet34()\n            self.encoder0 = 
nn.Sequential(nn.Conv2D(3, 64, 3, padding=1), nn.BatchNorm2D(64), nn.ReLU())\n            self.encoder1 = self.resnet.layer1\n            self.encoder2 = self.resnet.layer2\n            self.encoder3 = self.resnet.layer3\n            self.encoder4 = self.resnet.layer4\n            self.encoder5 = nn.Sequential(nn.MaxPool2D(2, 2, ceil_mode=True), BasicBlock(512, 512),\n                                          BasicBlock(512, 512), BasicBlock(512, 512))\n            self.encoder6 = nn.Sequential(nn.MaxPool2D(2, 2, ceil_mode=True), BasicBlock(512, 512),\n                                          BasicBlock(512, 512), BasicBlock(512, 512))\n            self.psp_module = PSPModule(512, 512, (1, 3, 5))\n            self.psp6 = conv_up_psp(512, 512, 2)\n            self.psp5 = conv_up_psp(512, 512, 4)\n            self.psp4 = conv_up_psp(512, 256, 8)\n            self.psp3 = conv_up_psp(512, 128, 16)\n            self.psp2 = conv_up_psp(512, 64, 32)\n            self.psp1 = conv_up_psp(512, 64, 32)\n            self.decoder6_g = build_decoder(1024, 512, 512, 512, True, True)\n            self.decoder5_g = build_decoder(1024, 512, 512, 512, True, True)\n            self.decoder4_g = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_g = build_decoder(512, 256, 256, 128, True, True)\n            self.decoder2_g = build_decoder(256, 128, 128, 64, True, True)\n            self.decoder1_g = build_decoder(128, 64, 64, 64, True, False)\n            self.bridge_block = build_bb(512, 512, 512)\n            self.decoder6_f = build_decoder(1024, 512, 512, 512, True, True)\n            self.decoder5_f = build_decoder(1024, 512, 512, 512, True, True)\n            self.decoder4_f = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_f = build_decoder(512, 256, 256, 128, True, True)\n            self.decoder2_f = build_decoder(256, 128, 128, 64, True, True)\n            self.decoder1_f = build_decoder(128, 64, 64, 64, True, False)\n          
  if self.rosta == 'RIM':\n                self.decoder0_g_tt = nn.Sequential(nn.Conv2D(64, 3, 3, padding=1))\n                self.decoder0_g_ft = nn.Sequential(nn.Conv2D(64, 2, 3, padding=1))\n                self.decoder0_g_bt = nn.Sequential(nn.Conv2D(64, 2, 3, padding=1))\n                self.decoder0_f_tt = nn.Sequential(nn.Conv2D(64, 1, 3, padding=1))\n                self.decoder0_f_ft = nn.Sequential(nn.Conv2D(64, 1, 3, padding=1))\n                self.decoder0_f_bt = nn.Sequential(nn.Conv2D(64, 1, 3, padding=1))\n            else:\n                self.decoder0_g = nn.Sequential(nn.Conv2D(64, self.gd_channel, 3, padding=1))\n                self.decoder0_f = nn.Sequential(nn.Conv2D(64, 1, 3, padding=1))\n        if self.backbone == 'r34':\n            self.encoder0 = nn.Sequential(self.resnet.conv1, self.resnet.bn1, self.resnet.relu)\n\n            self.encoder1 = nn.Sequential(self.resnet.maxpool, self.resnet.layer1)\n            self.encoder2 = self.resnet.layer2\n            self.encoder3 = self.resnet.layer3\n            self.encoder4 = self.resnet.layer4\n            self.psp_module = PSPModule(512, 512, (1, 3, 5))\n            self.psp4 = conv_up_psp(512, 256, 2)\n            self.psp3 = conv_up_psp(512, 128, 4)\n            self.psp2 = conv_up_psp(512, 64, 8)\n            self.psp1 = conv_up_psp(512, 64, 16)\n            self.decoder4_g = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_g = build_decoder(512, 256, 256, 128, True, True)\n            self.decoder2_g = build_decoder(256, 128, 128, 64, True, True)\n            self.decoder1_g = build_decoder(128, 64, 64, 64, True, True)\n            self.bridge_block = build_bb(512, 512, 512)\n            self.decoder4_f = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_f = build_decoder(512, 256, 256, 128, True, True)\n            self.decoder2_f = build_decoder(256, 128, 128, 64, True, True)\n            self.decoder1_f = build_decoder(128, 64, 64, 
64, True, True)\n            if self.rosta == 'RIM':\n                self.decoder0_g_tt = build_decoder(128, 64, 64, 3, False, True)\n                self.decoder0_g_ft = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_g_bt = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_f_tt = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_ft = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_bt = build_decoder(128, 64, 64, 1, False, True)\n            else:\n                self.decoder0_g = build_decoder(128, 64, 64, self.gd_channel, False, True)\n                self.decoder0_f = build_decoder(128, 64, 64, 1, False, True)\n        elif self.backbone == 'r101':\n            self.encoder0 = nn.Sequential(self.resnet.conv1, self.resnet.bn1, self.resnet.relu)\n            self.encoder1 = nn.Sequential(self.resnet.maxpool, self.resnet.layer1)\n            self.encoder2 = self.resnet.layer2\n            self.encoder3 = self.resnet.layer3\n            self.encoder4 = self.resnet.layer4\n            self.psp_module = PSPModule(2048, 2048, (1, 3, 5))\n            self.bridge_block = build_bb(2048, 2048, 2048)\n            self.psp4 = conv_up_psp(2048, 1024, 2)\n            self.psp3 = conv_up_psp(2048, 512, 4)\n            self.psp2 = conv_up_psp(2048, 256, 8)\n            self.psp1 = conv_up_psp(2048, 64, 16)\n            self.decoder4_g = build_decoder(4096, 2048, 1024, 1024, True, True)\n            self.decoder3_g = build_decoder(2048, 1024, 512, 512, True, True)\n            self.decoder2_g = build_decoder(1024, 512, 256, 256, True, True)\n            self.decoder1_g = build_decoder(512, 256, 128, 64, True, True)\n            self.decoder4_f = build_decoder(4096, 2048, 1024, 1024, True, True)\n            self.decoder3_f = build_decoder(2048, 1024, 512, 512, True, True)\n            self.decoder2_f = build_decoder(1024, 512, 256, 256, True, True)\n            
self.decoder1_f = build_decoder(512, 256, 128, 64, True, True)\n            if self.rosta == 'RIM':\n                self.decoder0_g_tt = build_decoder(128, 64, 64, 3, False, True)\n                self.decoder0_g_ft = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_g_bt = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_f_tt = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_ft = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_bt = build_decoder(128, 64, 64, 1, False, True)\n            else:\n                self.decoder0_g = build_decoder(128, 64, 64, self.gd_channel, False, True)\n                self.decoder0_f = build_decoder(128, 64, 64, 1, False, True)\n        elif self.backbone == 'd121':\n            self.encoder0 = nn.Sequential(self.densenet.features.conv0, self.densenet.features.norm0,\n                                          self.densenet.features.relu0)\n            self.encoder1 = nn.Sequential(self.densenet.features.denseblock1, self.densenet.features.transition1)\n            self.encoder2 = nn.Sequential(self.densenet.features.denseblock2, self.densenet.features.transition2)\n            self.encoder3 = nn.Sequential(self.densenet.features.denseblock3, self.densenet.features.transition3)\n            self.encoder4 = nn.Sequential(self.densenet.features.denseblock4, nn.Conv2D(1024, 512, 3, padding=1),\n                                          nn.BatchNorm2D(512), nn.ReLU(), nn.MaxPool2D(2, 2, ceil_mode=True))\n            self.psp_module = PSPModule(512, 512, (1, 3, 5))\n            self.psp4 = conv_up_psp(512, 256, 2)\n            self.psp3 = conv_up_psp(512, 128, 4)\n            self.psp2 = conv_up_psp(512, 64, 8)\n            self.psp1 = conv_up_psp(512, 64, 16)\n            self.decoder4_g = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_g = build_decoder(512, 256, 256, 128, True, True)\n            
self.decoder2_g = build_decoder(256, 128, 128, 64, True, True)\n            self.decoder1_g = build_decoder(128, 64, 64, 64, True, True)\n            self.bridge_block = build_bb(512, 512, 512)\n            self.decoder4_f = build_decoder(1024, 512, 512, 256, True, True)\n            self.decoder3_f = build_decoder(768, 256, 256, 128, True, True)\n            self.decoder2_f = build_decoder(384, 128, 128, 64, True, True)\n            self.decoder1_f = build_decoder(192, 64, 64, 64, True, True)\n            if self.rosta == 'RIM':\n                self.decoder0_g_tt = build_decoder(128, 64, 64, 3, False, True)\n                self.decoder0_g_ft = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_g_bt = build_decoder(128, 64, 64, 2, False, True)\n                self.decoder0_f_tt = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_ft = build_decoder(128, 64, 64, 1, False, True)\n                self.decoder0_f_bt = build_decoder(128, 64, 64, 1, False, True)\n            else:\n                self.decoder0_g = build_decoder(128, 64, 64, self.gd_channel, False, True)\n                self.decoder0_f = build_decoder(128, 64, 64, 1, False, True)\n        if self.rosta == 'RIM':\n            self.rim = nn.Sequential(nn.Conv2D(3, 16, 1), SELayer(16), nn.Conv2D(16, 1, 1))\n\n    def forward(self, input: paddle.Tensor) -> List[paddle.Tensor]:\n        glance_sigmoid = paddle.zeros(input.shape)\n        glance_sigmoid.stop_gradient = True\n        focus_sigmoid = paddle.zeros(input.shape)\n        focus_sigmoid.stop_gradient = True\n        fusion_sigmoid = paddle.zeros(input.shape)\n        fusion_sigmoid.stop_gradient = True\n        e0 = self.encoder0(input)\n        e1 = self.encoder1(e0)\n        e2 = self.encoder2(e1)\n        e3 = self.encoder3(e2)\n        e4 = self.encoder4(e3)\n        if self.backbone == 'r34_2b':\n            e5 = self.encoder5(e4)\n            e6 = self.encoder6(e5)\n            psp = 
self.psp_module(e6)\n            d6_g = self.decoder6_g(paddle.concat((psp, e6), 1))\n            d5_g = self.decoder5_g(paddle.concat((self.psp6(psp), d6_g), 1))\n            d4_g = self.decoder4_g(paddle.concat((self.psp5(psp), d5_g), 1))\n        else:\n            psp = self.psp_module(e4)\n            d4_g = self.decoder4_g(paddle.concat((psp, e4), 1))\n        d3_g = self.decoder3_g(paddle.concat((self.psp4(psp), d4_g), 1))\n        d2_g = self.decoder2_g(paddle.concat((self.psp3(psp), d3_g), 1))\n        d1_g = self.decoder1_g(paddle.concat((self.psp2(psp), d2_g), 1))\n        if self.backbone == 'r34_2b':\n            if self.rosta == 'RIM':\n                d0_g_tt = self.decoder0_g_tt(d1_g)\n                d0_g_ft = self.decoder0_g_ft(d1_g)\n                d0_g_bt = self.decoder0_g_bt(d1_g)\n            else:\n                d0_g = self.decoder0_g(d1_g)\n        elif self.rosta == 'RIM':\n            d0_g_tt = self.decoder0_g_tt(paddle.concat((self.psp1(psp), d1_g), 1))\n            d0_g_ft = self.decoder0_g_ft(paddle.concat((self.psp1(psp), d1_g), 1))\n            d0_g_bt = self.decoder0_g_bt(paddle.concat((self.psp1(psp), d1_g), 1))\n        else:\n            d0_g = self.decoder0_g(paddle.concat((self.psp1(psp), d1_g), 1))\n        if self.rosta == 'RIM':\n            glance_sigmoid_tt = F.sigmoid(d0_g_tt)\n            glance_sigmoid_ft = F.sigmoid(d0_g_ft)\n            glance_sigmoid_bt = F.sigmoid(d0_g_bt)\n        else:\n            glance_sigmoid = F.sigmoid(d0_g)\n        if self.backbone == 'r34_2b':\n            bb = self.bridge_block(e6)\n            d6_f = self.decoder6_f(paddle.concat((bb, e6), 1))\n            d5_f = self.decoder5_f(paddle.concat((d6_f, e5), 1))\n            d4_f = self.decoder4_f(paddle.concat((d5_f, e4), 1))\n        else:\n            bb = self.bridge_block(e4)\n            d4_f = self.decoder4_f(paddle.concat((bb, e4), 1))\n        d3_f = self.decoder3_f(paddle.concat((d4_f, e3), 1))\n        d2_f = 
self.decoder2_f(paddle.concat((d3_f, e2), 1))\n        d1_f = self.decoder1_f(paddle.concat((d2_f, e1), 1))\n        if self.backbone == 'r34_2b':\n            if self.rosta == 'RIM':\n                d0_f_tt = self.decoder0_f_tt(d1_f)\n                d0_f_ft = self.decoder0_f_ft(d1_f)\n                d0_f_bt = self.decoder0_f_bt(d1_f)\n            else:\n                d0_f = self.decoder0_f(d1_f)\n        elif self.rosta == 'RIM':\n            d0_f_tt = self.decoder0_f_tt(paddle.concat((d1_f, e0), 1))\n            d0_f_ft = self.decoder0_f_ft(paddle.concat((d1_f, e0), 1))\n            d0_f_bt = self.decoder0_f_bt(paddle.concat((d1_f, e0), 1))\n        else:\n            d0_f = self.decoder0_f(paddle.concat((d1_f, e0), 1))\n        if self.rosta == 'RIM':\n            focus_sigmoid_tt = F.sigmoid(d0_f_tt)\n            focus_sigmoid_ft = F.sigmoid(d0_f_ft)\n            focus_sigmoid_bt = F.sigmoid(d0_f_bt)\n        else:\n            focus_sigmoid = F.sigmoid(d0_f)\n        if self.rosta == 'RIM':\n            fusion_sigmoid_tt = collaborative_matting('TT', glance_sigmoid_tt, focus_sigmoid_tt)\n            fusion_sigmoid_ft = collaborative_matting('FT', glance_sigmoid_ft, focus_sigmoid_ft)\n            fusion_sigmoid_bt = collaborative_matting('BT', glance_sigmoid_bt, focus_sigmoid_bt)\n            fusion_sigmoid = paddle.concat((fusion_sigmoid_tt, fusion_sigmoid_ft, fusion_sigmoid_bt), 1)\n            fusion_sigmoid = self.rim(fusion_sigmoid)\n            return [[glance_sigmoid_tt, focus_sigmoid_tt, fusion_sigmoid_tt],\n                    [glance_sigmoid_ft, focus_sigmoid_ft, fusion_sigmoid_ft],\n                    [glance_sigmoid_bt, focus_sigmoid_bt, fusion_sigmoid_bt], fusion_sigmoid]\n        else:\n            fusion_sigmoid = collaborative_matting(self.rosta, glance_sigmoid, focus_sigmoid)\n            return glance_sigmoid, focus_sigmoid, fusion_sigmoid\n\n\ndef collaborative_matting(rosta, glance_sigmoid, focus_sigmoid):\n    if rosta == 'TT':\n      
  values = paddle.max(glance_sigmoid, axis=1)\n        index = paddle.argmax(glance_sigmoid, axis=1)\n        index = index[:, None, :, :].cast(paddle.float32)\n        bg_mask = index.clone()\n        bg_mask[bg_mask == 2] = 1\n        bg_mask = 1 - bg_mask\n        trimap_mask = index.clone()\n        trimap_mask[trimap_mask == 2] = 0\n        fg_mask = index.clone()\n        fg_mask[fg_mask == 1] = 0\n        fg_mask[fg_mask == 2] = 1\n        focus_sigmoid = focus_sigmoid.cpu()\n        trimap_mask = trimap_mask.cpu()\n        fg_mask = fg_mask.cpu()\n        fusion_sigmoid = focus_sigmoid * trimap_mask + fg_mask\n    elif rosta == 'BT':\n        values = paddle.max(glance_sigmoid, axis=1)\n        index = paddle.argmax(glance_sigmoid, axis=1)\n        index = index[:, None, :, :].cast(paddle.float32)\n        fusion_sigmoid = index - focus_sigmoid\n        fusion_sigmoid[fusion_sigmoid < 0] = 0\n    else:\n        values = paddle.max(glance_sigmoid, axis=1)\n        index = paddle.argmax(glance_sigmoid, axis=1)\n        index = index[:, None, :, :].cast(paddle.float32)\n        fusion_sigmoid = index + focus_sigmoid\n        fusion_sigmoid[fusion_sigmoid > 1] = 1\n    return fusion_sigmoid\n\n\nif __name__ == \"__main__\":\n    model = GFM()\n    x = paddle.ones([1, 3, 256, 256])\n    result = model(x)\n    print(result)\n"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport os\nimport time\nfrom typing import List\nfrom typing import Union\n\nimport cv2\nimport gfm_resnet34_matting.processor as P\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom gfm_resnet34_matting.gfm import GFM\nfrom PIL import Image\nfrom skimage.transform import resize\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"gfm_resnet34_matting\",\n            type=\"CV/matting\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"gfm_resnet34_matting is an animal matting model.\",\n            version=\"1.0.0\")\nclass GFMResNet34(nn.Layer):\n    \"\"\"\n    The GFM implementation based on PaddlePaddle.\n\n    The original article refers to：\n    Bridging Composite and Real: Towards End-to-end Deep Image Matting [IJCV-2021]\n    Main network file (GFM).\n\n    Github repo: https://github.com/JizhiziLi/GFM\n    Paper link (Arxiv): https://arxiv.org/abs/2010.16188\n    \"\"\"\n\n    def __init__(self, pretrained: str = None):\n        super(GFMResNet34, self).__init__()\n\n        self.model = GFM()\n        self.resize_by_short = P.ResizeByShort(1080)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n         
   self.model.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.model.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def preprocess(self, img: Union[str, np.ndarray], h: int, w: int) -> paddle.Tensor:\n        if min(h, w) > 1080:\n            img = self.resize_by_short(img)\n        tensor_img = self.scale_image(img, h, w)\n        return tensor_img\n\n    def scale_image(self, img: np.ndarray, h: int, w: int, ratio: float = 1 / 3):\n        new_h = min(1600, h - (h % 32))\n        new_w = min(1600, w - (w % 32))\n        resize_h = int(h * ratio)\n        resize_w = int(w * ratio)\n        new_h = min(1600, resize_h - (resize_h % 32))\n        new_w = min(1600, resize_w - (resize_w % 32))\n\n        scale_img = resize(img, (new_h, new_w)) * 255\n        tensor_img = paddle.to_tensor(scale_img.astype(np.float32)[np.newaxis, :, :, :])\n        tensor_img = tensor_img.transpose([0, 3, 1, 2])\n        return tensor_img\n\n    def inference_img_scale(self, input: paddle.Tensor) -> List[paddle.Tensor]:\n        pred_global, pred_local, pred_fusion = self.model(input)\n        pred_global = P.gen_trimap_from_segmap_e2e(pred_global)\n        pred_local = pred_local.numpy()[0, 0, :, :]\n        pred_fusion = pred_fusion.numpy()[0, 0, :, :]\n        return pred_global, pred_local, pred_fusion\n\n    def predict(self, image_list: list, visualization: bool = True, save_path: str = \"gfm_resnet34_matting_output\"):\n        self.model.eval()\n        result = []\n        with paddle.no_grad():\n            for i, img in enumerate(image_list):\n                if isinstance(img, str):\n                    img = np.array(Image.open(img))[:, :, :3]\n                else:\n                    img = img[:, :, ::-1]\n                h, w, _ = img.shape\n         
       tensor_img = self.preprocess(img, h, w)\n                pred_glance_1, pred_focus_1, pred_fusion_1 = self.inference_img_scale(tensor_img)\n                pred_glance_1 = resize(pred_glance_1, (h, w)) * 255.0\n                tensor_img = self.scale_image(img, h, w, 1 / 2)\n                pred_glance_2, pred_focus_2, pred_fusion_2 = self.inference_img_scale(tensor_img)\n                pred_focus_2 = resize(pred_focus_2, (h, w))\n                pred_fusion = P.get_masked_local_from_global_test(pred_glance_1, pred_focus_2)\n                pred_fusion = (pred_fusion * 255).astype(np.uint8)\n                if visualization:\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    cv2.imwrite(image_save_path, pred_fusion)\n                result.append(pred_fusion)\n        return result\n\n    @serving\n    def serving_method(self, images: str, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n        outputs = self.predict(image_list=images_decode, **kwargs)\n        serving_data = [P.cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n\n        results = self.predict(image_list=[args.input_path],\n                               save_path=args.output_dir,\n                               visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default=\"gfm_resnet34_matting_output\",\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=bool,\n                                           default=True,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport base64\n\nimport cv2\nimport numpy as np\nfrom paddleseg.transforms import functional\n\n\nclass ResizeByLong:\n    \"\"\"\n    Resize the long side of an image to given size, and then scale the other side proportionally.\n\n    Args:\n        long_size (int): The target size of long side.\n    \"\"\"\n\n    def __init__(self, long_size):\n        self.long_size = long_size\n\n    def __call__(self, data):\n        data = functional.resize_long(data, self.long_size)\n        return data\n\n\nclass ResizeByShort:\n    \"\"\"\n    Resize the short side of an image to given size, and then scale the other side proportionally.\n\n    Args:\n        short_size (int): The target size of short side.\n    \"\"\"\n\n    def __init__(self, short_size):\n        self.short_size = short_size\n\n    def __call__(self, data):\n        \n        data = functional.resize_short(data, self.short_size)\n        \n        return data\n\ndef gen_trimap_from_segmap_e2e(segmap):\n\ttrimap = np.argmax(segmap, axis=1)[0]\n\ttrimap = trimap.astype(np.int64)\t\n\ttrimap[trimap==1]=128\n\ttrimap[trimap==2]=255\n\treturn trimap.astype(np.uint8)\n\ndef get_masked_local_from_global_test(global_result, local_result):\n\tweighted_global = np.ones(global_result.shape)\n\tweighted_global[global_result==255] = 0\n\tweighted_global[global_result==0] = 
0\n\tfusion_result = global_result*(1.-weighted_global)/255+local_result*weighted_global\n\treturn fusion_result\n\ndef cv2_to_base64(image: np.ndarray):\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.png', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/requirements.txt",
    "content": "paddleseg >= 2.3.0\nscikit-image\n"
  },
  {
    "path": "modules/image/matting/gfm_resnet34_matting/resnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nfrom typing import Type, Any, Callable, Union, List, Optional\n\n\ndef conv3x3(in_planes: int, out_planes: int, stride: int=1, groups: int=1,\n    dilation: int=1) ->paddle.nn.Conv2D:\n    \"\"\"3x3 convolution with padding\"\"\"\n    return nn.Conv2D(in_planes, out_planes, kernel_size=3, stride=stride,\n        padding=dilation, groups=groups, dilation=dilation, bias_attr=False)\n\n\ndef conv1x1(in_planes: int, out_planes: int, stride: int=1) ->paddle.nn.Conv2D:\n    \"\"\"1x1 convolution\"\"\"\n    return nn.Conv2D(in_planes, out_planes, kernel_size=1, stride=stride,\n        bias_attr=False)\n\n\nclass BasicBlock(nn.Layer):\n    expansion: int = 1\n\n    def __init__(self, inplanes: int, planes: int, stride: int=1,\n        downsample: Optional[nn.Layer]=None, groups: int=1, base_width:\n        int=64, dilation: int=1, norm_layer: Optional[Callable[..., paddle.\n        nn.Layer]]=None) ->None:\n        super(BasicBlock, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        if groups != 1 or base_width != 64:\n            raise ValueError(\n                'BasicBlock only supports groups=1 and base_width=64')\n        if dilation > 1:\n            raise NotImplementedError(\n                'Dilation > 1 not supported in BasicBlock')\n        
self.conv1 = conv3x3(inplanes, planes, stride)\n        self.bn1 = norm_layer(planes)\n        self.relu = paddle.nn.ReLU()\n        self.conv2 = conv3x3(planes, planes)\n        self.bn2 = norm_layer(planes)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        identity = x\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n        out = self.conv2(out)\n        out = self.bn2(out)\n        if self.downsample is not None:\n            identity = self.downsample(x)\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass Bottleneck(nn.Layer):\n    expansion: int = 4\n\n    def __init__(self, inplanes: int, planes: int, stride: int=1,\n        downsample: Optional[nn.Layer]=None, groups: int=1, base_width:\n        int=64, dilation: int=1, norm_layer: Optional[Callable[..., paddle.\n        nn.Layer]]=None) ->None:\n        super(Bottleneck, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        width = int(planes * (base_width / 64.0)) * groups\n        self.conv1 = conv1x1(inplanes, width)\n        self.bn1 = norm_layer(width)\n        self.conv2 = conv3x3(width, width, stride, groups, dilation)\n        self.bn2 = norm_layer(width)\n        self.conv3 = conv1x1(width, planes * self.expansion)\n        self.bn3 = norm_layer(planes * self.expansion)\n        self.relu = paddle.nn.ReLU()\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        identity = x\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n        out = self.conv2(out)\n        out = self.bn2(out)\n        out = self.relu(out)\n        out = self.conv3(out)\n        out = self.bn3(out)\n        if self.downsample is not None:\n            identity = self.downsample(x)\n        out += 
identity\n        out = self.relu(out)\n        return out\n\n\nclass ResNet(nn.Layer):\n\n    def __init__(self, block: Type[Union[BasicBlock, Bottleneck]], layers:\n        List[int], num_classes: int=1000, zero_init_residual: bool=False,\n        groups: int=1, width_per_group: int=64,\n        replace_stride_with_dilation: Optional[List[bool]]=None, norm_layer:\n        Optional[Callable[..., paddle.nn.Layer]]=None) ->None:\n        super(ResNet, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        self._norm_layer = norm_layer\n        self.inplanes = 64\n        self.dilation = 1\n        if replace_stride_with_dilation is None:\n            replace_stride_with_dilation = [False, False, False]\n        if len(replace_stride_with_dilation) != 3:\n            raise ValueError(\n                'replace_stride_with_dilation should be None or a 3-element tuple, got {}'\n                .format(replace_stride_with_dilation))\n        self.groups = groups\n        self.base_width = width_per_group\n        self.conv1 = nn.Conv2D(3, self.inplanes, kernel_size=7, stride=2,\n            padding=3, bias_attr=False)\n        self.bn1 = norm_layer(self.inplanes)\n        self.relu = paddle.nn.ReLU()\n        self.maxpool = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, 64, layers[0])\n        self.layer2 = self._make_layer(block, 128, layers[1], stride=2,\n            dilate=replace_stride_with_dilation[0])\n        self.layer3 = self._make_layer(block, 256, layers[2], stride=2,\n            dilate=replace_stride_with_dilation[1])\n        self.layer4 = self._make_layer(block, 512, layers[3], stride=2,\n            dilate=replace_stride_with_dilation[2])\n        self.avgpool = nn.AdaptiveAvgPool2D((1, 1))\n        self.fc = nn.Linear(512 * block.expansion, num_classes)\n\n    def _make_layer(self, block: Type[Union[BasicBlock, Bottleneck]],\n        planes: int, blocks: int, 
stride: int=1, dilate: bool=False\n        ) ->paddle.nn.Sequential:\n        norm_layer = self._norm_layer\n        downsample = None\n        previous_dilation = self.dilation\n        if dilate:\n            self.dilation *= stride\n            stride = 1\n        if stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(conv1x1(self.inplanes, planes *\n                block.expansion, stride), norm_layer(planes * block.expansion))\n        layers = []\n        layers.append(block(self.inplanes, planes, stride, downsample, self\n            .groups, self.base_width, previous_dilation, norm_layer))\n        self.inplanes = planes * block.expansion\n        for _ in range(1, blocks):\n            layers.append(block(self.inplanes, planes, groups=self.groups,\n                base_width=self.base_width, dilation=self.dilation,\n                norm_layer=norm_layer))\n        return nn.Sequential(*layers)\n\n    def _forward_impl(self, x: paddle.Tensor) ->paddle.Tensor:\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x = self.maxpool(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.avgpool(x)\n        x= paddle.flatten(x,1)\n        x = self.fc(x)\n        return x\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        return self._forward_impl(x)\n\n\ndef _resnet(arch: str, block: Type[Union[BasicBlock, Bottleneck]], layers:\n    List[int], pretrained: bool, progress: bool, **kwargs: Any) ->ResNet:\n    model = ResNet(block, layers, **kwargs)\n    return model\n\n\ndef resnet34(pretrained: bool=False, progress: bool=True, **kwargs: Any\n    ) ->ResNet:\n    \"\"\"ResNet-34 model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_.\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n        progress 
(bool): If True, displays a progress bar of the download to stderr\n    \"\"\"\n    return _resnet('resnet34', BasicBlock, [3, 4, 6, 3], pretrained,\n        progress, **kwargs)\n"
  },
  {
    "path": "modules/image/matting/modnet_hrnet18_matting/README.md",
    "content": "# modnet_hrnet18_matting\n\n|模型名称|modnet_hrnet18_matting|\n| :--- | :---: | \n|类别|图像-抠图|\n|网络|modnet_hrnet18|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|60MB|\n|指标|SAD77.96|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/> \n    <img src=\"https://user-images.githubusercontent.com/35907364/144780857-13c63c21-5d12-4028-985b-378776f58220.png\" width = \"337\" height = \"505\" hspace='10'/> \n    </p>\n\n- ### 模型介绍\n\n  - Matting（精细化分割/影像去背/抠图）是指借由计算前景的颜色和透明度，将前景从影像中撷取出来的技术，可用于替换背景、影像合成、视觉特效，在电影工业中被广泛地使用。影像中的每个像素会有代表其前景透明度的值，称作阿法值（Alpha），一张影像中所有阿法值的集合称作阿法遮罩（Alpha Matte），将影像被遮罩所涵盖的部分取出即可完成前景的分离。modnet_hrnet18_matting可生成抠图结果。\n\n\n  \n  - 更多详情请参考：[modnet_hrnet18_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n  \n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install modnet_hrnet18_matting\n      ```\n      \n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n    \n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run modnet_hrnet18_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_hrnet18_matting\")\n\n      result = model.predict([\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self, \n                    
image_list, \n                    trimap_list, \n                    visualization, \n                    save_path):\n      ```\n\n        - 人像matting预测API，用于将输入图片中的人像分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)): 图片输入路径或者BGR格式numpy数据。\n            - trimap_list (list(str | numpy.ndarray)): trimap输入路径或者单通道灰度图格式图片。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"modnet_hrnet18_matting_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果。\n\n \n## 四、服务部署\n\n- PaddleHub Serving可以部署人像matting在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m modnet_hrnet18_matting\n      ```\n\n    - 这样就完成了一个人像matting在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果：\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_hrnet18_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/matting/modnet_hrnet18_matting/README_en.md",
    "content": "# modnet_hrnet18_matting\n\n|Module Name|modnet_hrnet18_matting|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|modnet_mobilenetv2|\n|Dataset|Baidu self-built dataset|\n|Support Fine-tuning|No|\n|Module Size|60MB|\n|Data Indicators|SAD77.96|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/> \n    <img src=\"https://user-images.githubusercontent.com/35907364/144780857-13c63c21-5d12-4028-985b-378776f58220.png\" width = \"337\" height = \"505\" hspace='10'/> \n    </p>\n\n- ### Module Introduction\n\n  - Mating is the technique of extracting foreground from an image by calculating its color and transparency. It is widely used in the film industry to replace background, image composition, and visual effects. Each pixel in the image will have a value that represents its foreground transparency, called Alpha. The set of all Alpha values in an image is called Alpha Matte. The part of the image covered by the mask can be extracted to complete foreground separation.\n\n\n  \n  - For more information, please refer to: [modnet_hrnet18_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n  \n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install modnet_hrnet18_matting\n      ```\n      \n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n    \n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run modnet_hrnet18_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_hrnet18_matting\")\n\n      result = model.predict([\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self, \n                    image_list, \n                    trimap_list, \n                    visualization, \n                    save_path):\n      ```\n\n        - Prediction API for matting.\n\n        - **Parameter**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data, ndarray.shape is in the format \\[H, W, C\\]，BGR.\n            - trimap_list(list(str | numpy.ndarray)): Trimap path or trimap data, ndarray.shape is in the format \\[H, W]，gray. Default is None\n            - visualization (bool): Whether to save the recognition results as picture files, default is False.\n            - save_path (str): Save path of images, \"modnet_hrnet18_matting_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray))：The list of model results.\n\n \n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of matting.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m modnet_hrnet18_matting\n      ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_hrnet18_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path =str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n      ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/matting/modnet_hrnet18_matting/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddleseg.cvlibs import manager, param_init\nfrom paddleseg.models import layers\nfrom paddleseg.utils import utils\n\n__all__ = [\"HRNet_W18\"]\n\n\nclass HRNet(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et, al. \"HRNet：Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        pretrained (str, optional): The path of pretrained model.\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (18, 36).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. 
Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. Default (18, 36, 72).\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default (4, 4, 4, 4).\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default (18, 36, 72, 144).\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 input_channels: int = 3,\n                 pretrained: str = None,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: list = (4, ),\n                 stage1_num_channels: list = (64, ),\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: list = (4, 4),\n                 stage2_num_channels: list = (18, 36),\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: list = (4, 4, 4),\n                 stage3_num_channels: list = (18, 36, 72),\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: list = (4, 4, 4, 4),\n                 stage4_num_channels: list = (18, 36, 72, 144),\n                 has_se: bool = False,\n                 align_corners: bool = False,\n                 padding_same: bool = True):\n        super(HRNet, self).__init__()\n        self.pretrained = pretrained\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        
self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        self.stage4_num_modules = stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n\n        self.feat_channels = [i for i in stage4_num_channels]\n        self.feat_channels = [64] + self.feat_channels\n\n        self.conv_layer1_1 = layers.ConvBNReLU(\n            in_channels=input_channels,\n            out_channels=64,\n            kernel_size=3,\n            stride=2,\n            padding=1 if not padding_same else 'same',\n            bias_attr=False)\n\n        self.conv_layer1_2 = layers.ConvBNReLU(\n            in_channels=64,\n            out_channels=64,\n            kernel_size=3,\n            stride=2,\n            padding=1 if not padding_same else 'same',\n            bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\",\n            padding_same=padding_same)\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4],\n            out_channels=self.stage2_num_channels,\n            name=\"tr1\",\n            padding_same=padding_same)\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners,\n            padding_same=padding_same)\n\n        self.tr2 = TransitionLayer(\n            
in_channels=self.stage2_num_channels,\n            out_channels=self.stage3_num_channels,\n            name=\"tr2\",\n            padding_same=padding_same)\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners,\n            padding_same=padding_same)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels,\n            out_channels=self.stage4_num_channels,\n            name=\"tr3\",\n            padding_same=padding_same)\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners,\n            padding_same=padding_same)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feat_list = []\n        conv1 = self.conv_layer1_1(x)\n        feat_list.append(conv1)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        feat_list = feat_list + st4\n\n        return feat_list\n\n\nclass Layer1(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 num_blocks: int,\n                 has_se: bool = False,\n                 name: str = None,\n                 padding_same: bool = True):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n      
      bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1),\n                    padding_same=padding_same))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor):\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: list,\n                 out_channels: list,\n                 name: str = None,\n                 padding_same: bool = True):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        layers.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding=1 if not padding_same else 'same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    layers.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                     
   kernel_size=3,\n                        stride=2,\n                        padding=1 if not padding_same else 'same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: list) -> list:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self,\n                 num_blocks: list,\n                 in_channels: list,\n                 out_channels: list,\n                 has_se: bool = False,\n                 name: str = None,\n                 padding_same: bool = True):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' +\n                        str(j + 1),\n                        padding_same=padding_same))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: list) -> list:\n        outs = []\n        for idx, inp in enumerate(x):\n            conv = inp\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = 
basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name:str = None,\n                 padding_same: bool = True):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=1,\n            bias_attr=False)\n\n        self.conv2 = layers.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding=1 if not padding_same else 'same',\n            bias_attr=False)\n\n        self.conv3 = layers.ConvBN(\n            in_channels=num_filters,\n            out_channels=num_filters * 4,\n            kernel_size=1,\n            bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBN(\n                in_channels=num_channels,\n                out_channels=num_filters * 4,\n                kernel_size=1,\n                bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4,\n                num_filters=num_filters * 4,\n                reduction_ratio=16,\n                name=name + '_fc')\n\n        self.add = layers.Add()\n        self.relu = layers.Activation(\"relu\")\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = 
self.se(conv3)\n\n        y = self.add(conv3, residual)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int, \n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None,\n                 padding_same: bool = True):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding=1 if not padding_same else 'same',\n            bias_attr=False)\n        self.conv2 = layers.ConvBN(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            padding=1 if not padding_same else 'same',\n            bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBNReLU(\n                in_channels=num_channels,\n                out_channels=num_filters,\n                kernel_size=1,\n                bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters,\n                num_filters=num_filters,\n                reduction_ratio=16,\n                name=name + '_fc')\n\n        self.add = layers.Add()\n        self.relu = layers.Activation(\"relu\")\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = self.add(conv2, residual)\n        y = self.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, 
num_channels: int, num_filters: int, reduction_ratio: int, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels,\n            med_ch,\n            weight_attr=paddle.ParamAttr(\n                initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch,\n            num_filters,\n            weight_attr=paddle.ParamAttr(\n                initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(\n            excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False,\n                 padding_same: bool = True):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n             
       HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners,\n                        padding_same=padding_same))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners,\n                        padding_same=padding_same))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False,\n                 padding_same: bool = True):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks,\n            in_channels=num_channels,\n            out_channels=num_filters,\n            has_se=has_se,\n            name=name,\n            padding_same=padding_same)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            
out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners,\n            padding_same=padding_same)\n\n    def forward(self, x: list) -> list:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: list,\n                 out_channels: list,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False,\n                 padding_same: bool = True):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        layers.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(\n                                    name, i + 1, j + 1, k + 1),\n                                layers.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    
out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding=1 if not padding_same else 'same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(\n                                    name, i + 1, j + 1, k + 1),\n                                layers.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding=1 if not padding_same else 'same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = paddle.shape(residual)[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(\n                        y,\n                        residual_shape,\n                        mode='bilinear',\n                        align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n              
          residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\ndef HRNet_W18(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[18, 36],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[18, 36, 72],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[18, 36, 72, 144],\n        **kwargs)\n    return model"
  },
  {
    "path": "modules/image/matting/modnet_hrnet18_matting/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport time\nimport argparse\nfrom typing import Callable, Union, List, Tuple\n\nimport numpy as np\nimport cv2\nimport scipy.ndimage\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom modnet_hrnet18_matting.hrnet import HRNet_W18\nimport modnet_hrnet18_matting.processor as P\n\n\n@moduleinfo(\n    name=\"modnet_hrnet18_matting\",\n    type=\"CV/matting\",\n    author=\"paddlepaddle\",\n    summary=\"modnet_hrnet18_matting is a matting model\",\n    version=\"1.0.0\"\n)\nclass MODNetHRNet18(nn.Layer):\n    \"\"\"\n    The MODNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhanghan Ke, et al. \"Is a Green Screen Really Necessary for Real-Time Portrait Matting?\"\n    (https://arxiv.org/pdf/2011.11961.pdf).\n\n    Args:\n        hr_channels(int, optional): The channels of the high resolution branch. Default: 32.\n        pretrained(str, optional): The path of the pretrained model. 
Default: None.\n    \"\"\"\n\n    def __init__(self, hr_channels: int = 32, pretrained=None):\n        super(MODNetHRNet18, self).__init__()\n\n        self.backbone = HRNet_W18()\n        self.pretrained = pretrained\n\n        self.head = MODNetHead(\n            hr_channels=hr_channels, backbone_channels=self.backbone.feat_channels)\n        self.blurer = GaussianBlurLayer(1, 3)\n        self.transforms = P.Compose([P.LoadImages(), P.ResizeByShort(), P.ResizeToIntMult(), P.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'modnet-hrnet_w18.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def preprocess(self, img: Union[str, np.ndarray], transforms: Callable, trimap: Union[str, np.ndarray] = None):\n        data = {}\n        data['img'] = img\n        if trimap is not None:\n            data['trimap'] = trimap\n            data['gt_fields'] = ['trimap']\n        data['trans_info'] = []\n        data = self.transforms(data)\n        data['img'] = paddle.to_tensor(data['img'])\n        data['img'] = data['img'].unsqueeze(0)\n        if trimap is not None:\n            data['trimap'] = paddle.to_tensor(data['trimap'])\n            data['trimap'] = data['trimap'].unsqueeze((0, 1))\n\n        return data\n\n    def forward(self, inputs: dict) -> paddle.Tensor:\n        x = inputs['img']\n        feat_list = self.backbone(x)\n        y = self.head(inputs=inputs, feat_list=feat_list)\n        return y\n\n    def predict(self, image_list: list, trimap_list: list = None, visualization: bool = False, save_path: str = \"modnet_hrnet18_matting_output\") -> list:\n        self.eval()\n        result = []\n        with 
paddle.no_grad():\n            for i, im_path in enumerate(image_list):\n                trimap = trimap_list[i] if trimap_list is not None else None\n                data = self.preprocess(img=im_path, transforms=self.transforms, trimap=trimap)\n                alpha_pred = self.forward(data)\n                alpha_pred = P.reverse_transform(alpha_pred, data['trans_info'])\n                alpha_pred = alpha_pred.numpy().squeeze()\n                alpha_pred = (alpha_pred * 255).astype('uint8')\n                alpha_pred = P.save_alpha_pred(alpha_pred, trimap)\n                result.append(alpha_pred)\n                if visualization:\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    cv2.imwrite(image_save_path, alpha_pred)\n\n        return result\n\n    @serving\n    def serving_method(self, images: list, trimaps: list = None, **kwargs) -> dict:\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n        if trimaps is not None:\n            trimap_decoder = [cv2.cvtColor(P.base64_to_cv2(trimap), cv2.COLOR_BGR2GRAY) for trimap in trimaps]\n        else:\n            trimap_decoder = None\n\n        outputs = self.predict(image_list=images_decode, trimap_list=trimap_decoder, **kwargs)\n        serving_data = [P.cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n      
  self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        if args.trimap_path is not None:\n            trimap_list = [args.trimap_path]\n        else:\n            trimap_list = None\n\n        results = self.predict(image_list=[args.input_path], trimap_list=trimap_list, save_path=args.output_dir, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default=\"modnet_hrnet18_matting_output\", help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--trimap_path', type=str, default=None, help=\"path to trimap.\")\n\n\nclass MODNetHead(nn.Layer):\n    \"\"\"\n    Segmentation head.\n    \"\"\"\n    def __init__(self, hr_channels: int, backbone_channels: list):\n        super().__init__()\n\n        self.lr_branch = LRBranch(backbone_channels)\n        self.hr_branch = HRBranch(hr_channels, backbone_channels)\n        self.f_branch = FusionBranch(hr_channels, backbone_channels)\n\n    def forward(self, inputs: dict, feat_list: list):\n        pred_semantic, lr8x, [enc2x, 
enc4x] = self.lr_branch(feat_list)\n        pred_detail, hr2x = self.hr_branch(inputs['img'], enc2x, enc4x, lr8x)\n        pred_matte = self.f_branch(inputs['img'], lr8x, hr2x)\n\n        if self.training:\n            logit_dict = {\n                'semantic': pred_semantic,\n                'detail': pred_detail,\n                'matte': pred_matte\n            }\n            return logit_dict\n        else:\n            return pred_matte\n\n\n\nclass FusionBranch(nn.Layer):\n    def __init__(self, hr_channels: int, enc_channels: int):\n        super().__init__()\n        self.conv_lr4x = Conv2dIBNormRelu(\n            enc_channels[2], hr_channels, 5, stride=1, padding=2)\n\n        self.conv_f2x = Conv2dIBNormRelu(\n            2 * hr_channels, hr_channels, 3, stride=1, padding=1)\n        self.conv_f = nn.Sequential(\n            Conv2dIBNormRelu(\n                hr_channels + 3, int(hr_channels / 2), 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                int(hr_channels / 2),\n                1,\n                1,\n                stride=1,\n                padding=0,\n                with_ibn=False,\n                with_relu=False))\n\n    def forward(self, img: paddle.Tensor, lr8x: paddle.Tensor, hr2x: paddle.Tensor):\n        lr4x = F.interpolate(\n            lr8x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr4x = self.conv_lr4x(lr4x)\n        lr2x = F.interpolate(\n            lr4x, scale_factor=2, mode='bilinear', align_corners=False)\n\n        f2x = self.conv_f2x(paddle.concat((lr2x, hr2x), axis=1))\n        f = F.interpolate(\n            f2x, scale_factor=2, mode='bilinear', align_corners=False)\n        f = self.conv_f(paddle.concat((f, img), axis=1))\n        pred_matte = F.sigmoid(f)\n\n        return pred_matte\n\n\nclass HRBranch(nn.Layer):\n    \"\"\"\n    High Resolution Branch of MODNet\n    \"\"\"\n\n    def __init__(self, hr_channels: int, enc_channels:int):\n        super().__init__()\n\n        
self.tohr_enc2x = Conv2dIBNormRelu(\n            enc_channels[0], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc2x = Conv2dIBNormRelu(\n            hr_channels + 3, hr_channels, 3, stride=2, padding=1)\n\n        self.tohr_enc4x = Conv2dIBNormRelu(\n            enc_channels[1], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc4x = Conv2dIBNormRelu(\n            2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1)\n\n        self.conv_hr4x = nn.Sequential(\n            Conv2dIBNormRelu(\n                2 * hr_channels + enc_channels[2] + 3,\n                2 * hr_channels,\n                3,\n                stride=1,\n                padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr2x = nn.Sequential(\n            Conv2dIBNormRelu(\n                2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr = nn.Sequential(\n            Conv2dIBNormRelu(\n                hr_channels + 3, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                hr_channels,\n                1,\n                1,\n                stride=1,\n                padding=0,\n                with_ibn=False,\n                with_relu=False))\n\n    def forward(self, img: paddle.Tensor, enc2x: paddle.Tensor, enc4x: paddle.Tensor, lr8x: paddle.Tensor):\n        img2x = F.interpolate(\n            img, scale_factor=1 / 2, mode='bilinear', align_corners=False)\n        img4x = F.interpolate(\n            img, scale_factor=1 / 4, mode='bilinear', 
align_corners=False)\n\n        enc2x = self.tohr_enc2x(enc2x)\n        hr4x = self.conv_enc2x(paddle.concat((img2x, enc2x), axis=1))\n\n        enc4x = self.tohr_enc4x(enc4x)\n        hr4x = self.conv_enc4x(paddle.concat((hr4x, enc4x), axis=1))\n\n        lr4x = F.interpolate(\n            lr8x, scale_factor=2, mode='bilinear', align_corners=False)\n        hr4x = self.conv_hr4x(paddle.concat((hr4x, lr4x, img4x), axis=1))\n\n        hr2x = F.interpolate(\n            hr4x, scale_factor=2, mode='bilinear', align_corners=False)\n        hr2x = self.conv_hr2x(paddle.concat((hr2x, enc2x), axis=1))\n\n        pred_detail = None\n        if self.training:\n            hr = F.interpolate(\n                hr2x, scale_factor=2, mode='bilinear', align_corners=False)\n            hr = self.conv_hr(paddle.concat((hr, img), axis=1))\n            pred_detail = F.sigmoid(hr)\n\n        return pred_detail, hr2x\n\n\nclass LRBranch(nn.Layer):\n    \"\"\"\n    Low Resolution Branch of MODNet\n    \"\"\"\n    def __init__(self, backbone_channels: int):\n        super().__init__()\n        self.se_block = SEBlock(backbone_channels[4], reduction=4)\n        self.conv_lr16x = Conv2dIBNormRelu(\n            backbone_channels[4], backbone_channels[3], 5, stride=1, padding=2)\n        self.conv_lr8x = Conv2dIBNormRelu(\n            backbone_channels[3], backbone_channels[2], 5, stride=1, padding=2)\n        self.conv_lr = Conv2dIBNormRelu(\n            backbone_channels[2],\n            1,\n            3,\n            stride=2,\n            padding=1,\n            with_ibn=False,\n            with_relu=False)\n\n    def forward(self, feat_list: list):\n        enc2x, enc4x, enc32x = feat_list[0], feat_list[1], feat_list[4]\n\n        enc32x = self.se_block(enc32x)\n        lr16x = F.interpolate(\n            enc32x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr16x = self.conv_lr16x(lr16x)\n        lr8x = F.interpolate(\n            lr16x, scale_factor=2, 
mode='bilinear', align_corners=False)\n        lr8x = self.conv_lr8x(lr8x)\n\n        pred_semantic = None\n        if self.training:\n            lr = self.conv_lr(lr8x)\n            pred_semantic = F.sigmoid(lr)\n\n        return pred_semantic, lr8x, [enc2x, enc4x]\n\n\nclass IBNorm(nn.Layer):\n    \"\"\"\n    Combine Instance Norm and Batch Norm into One Layer\n    \"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n        self.bnorm_channels = in_channels // 2\n        self.inorm_channels = in_channels - self.bnorm_channels\n\n        self.bnorm = nn.BatchNorm2D(self.bnorm_channels)\n        self.inorm = nn.InstanceNorm2D(self.inorm_channels)\n\n    def forward(self, x):\n        bn_x = self.bnorm(x[:, :self.bnorm_channels, :, :])\n        in_x = self.inorm(x[:, self.bnorm_channels:, :, :])\n\n        return paddle.concat((bn_x, in_x), 1)\n\n\nclass Conv2dIBNormRelu(nn.Layer):\n    \"\"\"\n    Convolution + IBNorm + Relu\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 padding: int = 0,\n                 dilation:int = 1,\n                 groups: int = 1,\n                 bias_attr: paddle.ParamAttr = None,\n                 with_ibn: bool = True,\n                 with_relu: bool = True):\n\n        super().__init__()\n\n        layers = [\n            nn.Conv2D(\n                in_channels,\n                out_channels,\n                kernel_size,\n                stride=stride,\n                padding=padding,\n                dilation=dilation,\n                groups=groups,\n                bias_attr=bias_attr)\n        ]\n\n        if with_ibn:\n            layers.append(IBNorm(out_channels))\n\n        if with_relu:\n            layers.append(nn.ReLU())\n\n        self.layers = nn.Sequential(*layers)\n\n    def forward(self, x: paddle.Tensor):\n        return 
self.layers(x)\n\n\nclass SEBlock(nn.Layer):\n    \"\"\"\n    SE Block Proposed in https://arxiv.org/pdf/1709.01507.pdf\n    \"\"\"\n\n    def __init__(self, num_channels: int, reduction: int = 1):\n        super().__init__()\n        self.pool = nn.AdaptiveAvgPool2D(1)\n        self.conv = nn.Sequential(\n            nn.Conv2D(\n                num_channels,\n                int(num_channels // reduction),\n                1,\n                bias_attr=False), nn.ReLU(),\n            nn.Conv2D(\n                int(num_channels // reduction),\n                num_channels,\n                1,\n                bias_attr=False), nn.Sigmoid())\n\n    def forward(self, x: paddle.Tensor):\n        w = self.pool(x)\n        w = self.conv(w)\n        return w * x\n\n\nclass GaussianBlurLayer(nn.Layer):\n    \"\"\" Apply Gaussian blur to a 4D tensor.\n    This layer takes a 4D tensor of {N, C, H, W} as input.\n    The Gaussian blur is applied to each of the C channels separately.\n    \"\"\"\n\n    def __init__(self, channels: int, kernel_size: int):\n        \"\"\"\n        Args:\n            channels (int): Number of channels of the input tensor\n            kernel_size (int): Size of the kernel used in blurring\n        \"\"\"\n\n        super(GaussianBlurLayer, self).__init__()\n        self.channels = channels\n        self.kernel_size = kernel_size\n        assert self.kernel_size % 2 != 0\n\n        self.op = nn.Sequential(\n            nn.Pad2D(int(self.kernel_size / 2), mode='reflect'),\n            nn.Conv2D(\n                channels,\n                channels,\n                self.kernel_size,\n                stride=1,\n                padding=0,\n                bias_attr=False,\n                groups=channels))\n\n        self._init_kernel()\n        self.op[1].weight.stop_gradient = True\n\n    def forward(self, x: paddle.Tensor):\n        \"\"\"\n        Args:\n            x (paddle.Tensor): input 4D tensor\n        Returns:\n            paddle.Tensor: Blurred 
version of the input\n        \"\"\"\n\n        if len(x.shape) != 4:\n            raise ValueError(\n                \"'GaussianBlurLayer' requires a 4D tensor as input\")\n        elif x.shape[1] != self.channels:\n            raise ValueError(\n                \"In 'GaussianBlurLayer', the required channel ({0}) is \"\n                \"not the same as input ({1})\".format(self.channels, x.shape[1]))\n\n        return self.op(x)\n\n    def _init_kernel(self):\n        sigma = 0.3 * ((self.kernel_size - 1) * 0.5 - 1) + 0.8\n\n        n = np.zeros((self.kernel_size, self.kernel_size))\n        i = int(self.kernel_size / 2)\n        n[i, i] = 1\n        kernel = scipy.ndimage.gaussian_filter(n, sigma)\n        kernel = kernel.astype('float32')\n        kernel = kernel[np.newaxis, np.newaxis, :, :]\n        paddle.assign(kernel, self.op[1].weight)"
  },
  {
    "path": "modules/image/matting/modnet_hrnet18_matting/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport random\nimport base64\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddleseg.transforms import functional\nfrom PIL import Image\n\n\nclass Compose:\n    \"\"\"\n    Do transformation on input data with corresponding pre-processing and augmentation operations.\n    The shape of input data to all operations is [height, width, channels].\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if 'trans_info' not in data:\n            data['trans_info'] = []\n        for op in self.transforms:\n            data = op(data)\n            if data is None:\n                return None\n\n        data['img'] = np.transpose(data['img'], (2, 0, 1))\n        for key in data.get('gt_fields', []):\n            if len(data[key].shape) == 2:\n                continue\n            data[key] = np.transpose(data[key], (2, 0, 1))\n\n        return data\n\n\nclass LoadImages:\n    \"\"\"\n    Read images from image path.\n\n    Args:\n        to_rgb (bool, optional): If converting image to RGB color 
space. Default: True.\n    \"\"\"\n    def __init__(self, to_rgb: bool = True):\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if isinstance(data['img'], str):\n            data['img'] = cv2.imread(data['img'])\n\n        for key in data.get('gt_fields', []):\n            if isinstance(data[key], str):\n                data[key] = cv2.imread(data[key], cv2.IMREAD_UNCHANGED)\n            # if alpha or trimap has 3 channels, keep only one.\n            if key in ['alpha', 'trimap']:\n                if len(data[key].shape) > 2:\n                    data[key] = data[key][:, :, 0]\n\n        if self.to_rgb:\n            data['img'] = cv2.cvtColor(data['img'], cv2.COLOR_BGR2RGB)\n            for key in data.get('gt_fields', []):\n                if len(data[key].shape) == 2:\n                    continue\n                data[key] = cv2.cvtColor(data[key], cv2.COLOR_BGR2RGB)\n\n        return data\n\n\nclass ResizeByShort:\n    \"\"\"\n    Resize the short side of an image to a given size, and then scale the other side proportionally.\n\n    Args:\n        short_size (int): The target size of the short side.\n    \"\"\"\n\n    def __init__(self, short_size: int = 512):\n        self.short_size = short_size\n\n    def __call__(self, data: dict) -> dict:\n\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n        data['img'] = functional.resize_short(data['img'], self.short_size)\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize_short(data[key], self.short_size)\n        return data\n\n\nclass ResizeToIntMult:\n    \"\"\"\n    Resize to an integer multiple, e.g. 
32.\n    \"\"\"\n\n    def __init__(self, mult_int: int = 32):\n        self.mult_int = mult_int\n\n    def __call__(self, data: dict) -> dict:\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n\n        h, w = data['img'].shape[0:2]\n        rw = w - w % self.mult_int\n        rh = h - h % self.mult_int\n        data['img'] = functional.resize(data['img'], (rw, rh))\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize(data[key], (rw, rh))\n\n        return data\n\n\nclass Normalize:\n    \"\"\"\n    Normalize an image.\n\n    Args:\n        mean (list, optional): The mean value of a data set. Default: [0.5, 0.5, 0.5].\n        std (list, optional): The standard deviation of a data set. Default: [0.5, 0.5, 0.5].\n\n    Raises:\n        ValueError: When mean/std is not list or any value in std is 0.\n    \"\"\"\n\n    def __init__(self, mean: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5), std: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5)):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, (list, tuple))\n                and isinstance(self.std, (list, tuple))):\n            raise ValueError(\n                \"{}: input type is invalid. 
It should be list or tuple\".format(\n                    self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, data: dict) -> dict:\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        data['img'] = functional.normalize(data['img'], mean, std)\n        if 'fg' in data.get('gt_fields', []):\n            data['fg'] = functional.normalize(data['fg'], mean, std)\n        if 'bg' in data.get('gt_fields', []):\n            data['bg'] = functional.normalize(data['bg'], mean, std)\n\n        return data\n\n\ndef reverse_transform(alpha: paddle.Tensor, trans_info: List[str]):\n    \"\"\"Recover the prediction to the original shape.\"\"\"\n    for item in trans_info[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            alpha = F.interpolate(alpha, [h, w], mode='bilinear')\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            alpha = alpha[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return alpha\n\n\ndef save_alpha_pred(alpha: np.ndarray, trimap: Union[np.ndarray, str] = None):\n    \"\"\"\n    Apply the trimap constraint to an alpha prediction with values in [0, 255] and shape [h, w].\n    \"\"\"\n    if isinstance(trimap, str):\n        trimap = cv2.imread(trimap, 0)\n\n    if trimap is not None:\n        alpha[trimap == 0] = 0\n        alpha[trimap == 255] = 255\n    alpha = alpha.astype('uint8')\n    return alpha\n\n\ndef cv2_to_base64(image: np.ndarray):\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.png', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = 
np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/README.md",
    "content": "# modnet_mobilenetv2_matting\n\n|模型名称|modnet_mobilenetv2_matting|\n| :--- | :---: | \n|类别|图像-抠图|\n|网络|modnet_mobilenetv2|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|38MB|\n|指标|SAD112.73|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/> \n    <img src=\"https://user-images.githubusercontent.com/35907364/144574092-d0dd08f3-309b-4a7d-84d5-8b94604431a1.png\" width = \"337\" height = \"505\" hspace='10'/> \n    </p>\n\n- ### 模型介绍\n\n  - Matting（精细化分割/影像去背/抠图）是指借由计算前景的颜色和透明度，将前景从影像中撷取出来的技术，可用于替换背景、影像合成、视觉特效，在电影工业中被广泛地使用。影像中的每个像素会有代表其前景透明度的值，称作阿法值（Alpha），一张影像中所有阿法值的集合称作阿法遮罩（Alpha Matte），将影像被遮罩所涵盖的部分取出即可完成前景的分离。modnet_mobilenetv2_matting可生成抠图结果。\n\n\n  \n  - 更多详情请参考：[modnet_mobilenetv2_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n  \n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install modnet_mobilenetv2_matting\n      ```\n      \n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n    \n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run modnet_mobilenetv2_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_mobilenetv2_matting\")\n\n      result = model.predict([\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def 
predict(self, \n                    image_list, \n                    trimap_list, \n                    visualization, \n                    save_path):\n      ```\n\n        - 人像matting预测API，用于将输入图片中的人像分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)):图片输入路径或者BGR格式numpy数据。\n            - trimap_list(list(str | numpy.ndarray)):trimap输入路径或者灰度图单通道格式图片。默认为None。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"modnet_mobilenetv2_matting_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署人像matting在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m modnet_mobilenetv2_matting\n      ```\n\n    - 这样就完成了一个人像matting在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_mobilenetv2_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/README_en.md",
"content": "# modnet_mobilenetv2_matting\n\n|Module Name|modnet_mobilenetv2_matting|\n| :--- | :---: | \n|Category|Image Matting|\n|Network|modnet_mobilenetv2|\n|Dataset|Baidu self-built dataset|\n|Support Fine-tuning|No|\n|Module Size|38MB|\n|Data Indicators|SAD112.73|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/> \n    <img src=\"https://user-images.githubusercontent.com/35907364/144574092-d0dd08f3-309b-4a7d-84d5-8b94604431a1.png\" width = \"337\" height = \"505\" hspace='10'/> \n    </p>\n\n- ### Module Introduction\n\n  - Matting is the technique of extracting the foreground from an image by calculating its color and transparency. It is widely used in the film industry for background replacement, image composition, and visual effects. Each pixel in the image has a value that represents its foreground transparency, called Alpha. The set of all Alpha values in an image is called the Alpha Matte. The part of the image covered by the matte can be extracted to complete foreground separation.\n\n  - For more information, please refer to: [modnet_mobilenetv2_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n  \n\n## II. 
Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install modnet_mobilenetv2_matting\n      ```\n      \n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n    \n## III. Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run modnet_mobilenetv2_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_mobilenetv2_matting\")\n\n      result = model.predict(image_list=[\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self, \n                    image_list, \n                    trimap_list, \n                    visualization, \n                    save_path):\n      ```\n\n        - Prediction API for matting.\n\n        - **Parameter**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data, ndarray.shape is in the format \\[H, W, C\\]，BGR.\n            - trimap_list(list(str | numpy.ndarray)): Trimap path or trimap data, ndarray.shape is in the format \\[H, W]，gray. 
Default is None.\n            - visualization (bool): Whether to save the recognition results as picture files, default is False.\n            - save_path (str): Save path of images, \"modnet_mobilenetv2_matting_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray)): The list of model results.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of matting.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m modnet_mobilenetv2_matting\n      ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_mobilenetv2_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path = str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n    ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/mobilenetv2.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\n\nimport numpy as np\nimport paddle\nfrom paddle import ParamAttr\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, BatchNorm, Linear, Dropout\nfrom paddle.nn import AdaptiveAvgPool2D, MaxPool2D, AvgPool2D\n\nfrom paddleseg import utils\nfrom paddleseg.cvlibs import manager\n\n\n__all__ = [\"MobileNetV2\"]\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n    def __init__(self,\n                 num_channels: int,\n                 filter_size: int,\n                 num_filters: int,\n                 stride: int,\n                 padding: int,\n                 num_groups: int=1,\n                 name: str = None,\n                 use_cudnn: bool = True):\n        super(ConvBNLayer, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=filter_size,\n            stride=stride,\n            padding=padding,\n            groups=num_groups,\n            weight_attr=ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n\n        self._batch_norm = BatchNorm(\n            num_filters,\n            param_attr=ParamAttr(name=name + \"_bn_scale\"),\n            bias_attr=ParamAttr(name=name + \"_bn_offset\"),\n            moving_mean_name=name + 
\"_bn_mean\",\n            moving_variance_name=name + \"_bn_variance\")\n\n    def forward(self, inputs: paddle.Tensor, if_act: bool = True) -> paddle.Tensor:\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        if if_act:\n            y = F.relu6(y)\n        return y\n\n\nclass InvertedResidualUnit(nn.Layer):\n    \"\"\"Inverted residual block\"\"\"\n    def __init__(self, num_channels: int, num_in_filter: int, num_filters: int, stride: int,\n                 filter_size: int, padding: int, expansion_factor: int, name: str):\n        super(InvertedResidualUnit, self).__init__()\n        num_expfilter = int(round(num_in_filter * expansion_factor))\n        self._expand_conv = ConvBNLayer(\n            num_channels=num_channels,\n            num_filters=num_expfilter,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_expand\")\n\n        self._bottleneck_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_expfilter,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            num_groups=num_expfilter,\n            use_cudnn=False,\n            name=name + \"_dwise\")\n\n        self._linear_conv = ConvBNLayer(\n            num_channels=num_expfilter,\n            num_filters=num_filters,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            num_groups=1,\n            name=name + \"_linear\")\n\n    def forward(self, inputs: paddle.Tensor, ifshortcut: bool) -> paddle.Tensor:\n        y = self._expand_conv(inputs, if_act=True)\n        y = self._bottleneck_conv(y, if_act=True)\n        y = self._linear_conv(y, if_act=False)\n        if ifshortcut:\n            y = paddle.add(inputs, y)\n        return y\n\n\nclass InvresiBlocks(nn.Layer):\n    def __init__(self, in_c: int, t: int, c: int, n: int, s: int, name: str):\n        super(InvresiBlocks, 
self).__init__()\n\n        self._first_block = InvertedResidualUnit(\n            num_channels=in_c,\n            num_in_filter=in_c,\n            num_filters=c,\n            stride=s,\n            filter_size=3,\n            padding=1,\n            expansion_factor=t,\n            name=name + \"_1\")\n\n        self._block_list = []\n        for i in range(1, n):\n            block = self.add_sublayer(\n                name + \"_\" + str(i + 1),\n                sublayer=InvertedResidualUnit(\n                    num_channels=c,\n                    num_in_filter=c,\n                    num_filters=c,\n                    stride=1,\n                    filter_size=3,\n                    padding=1,\n                    expansion_factor=t,\n                    name=name + \"_\" + str(i + 1)))\n            self._block_list.append(block)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self._first_block(inputs, ifshortcut=False)\n        for block in self._block_list:\n            y = block(y, ifshortcut=True)\n        return y\n\n\nclass MobileNet(nn.Layer):\n    \"\"\"Network of MobileNet\"\"\"\n    def __init__(self,\n                 input_channels: int = 3,\n                 scale: float = 1.0,\n                 pretrained: str = None,\n                 prefix_name: str = \"\"):\n        super(MobileNet, self).__init__()\n        self.scale = scale\n\n        bottleneck_params_list = [\n            (1, 16, 1, 1),\n            (6, 24, 2, 2),\n            (6, 32, 3, 2),\n            (6, 64, 4, 2),\n            (6, 96, 3, 1),\n            (6, 160, 3, 2),\n            (6, 320, 1, 1),\n        ]\n\n        self.conv1 = ConvBNLayer(\n            num_channels=input_channels,\n            num_filters=int(32 * scale),\n            filter_size=3,\n            stride=2,\n            padding=1,\n            name=prefix_name + \"conv1_1\")\n\n        self.block_list = []\n        i = 1\n        in_c = int(32 * scale)\n        for layer_setting 
in bottleneck_params_list:\n            t, c, n, s = layer_setting\n            i += 1\n            block = self.add_sublayer(\n                prefix_name + \"conv\" + str(i),\n                sublayer=InvresiBlocks(\n                    in_c=in_c,\n                    t=t,\n                    c=int(c * scale),\n                    n=n,\n                    s=s,\n                    name=prefix_name + \"conv\" + str(i)))\n            self.block_list.append(block)\n            in_c = int(c * scale)\n\n        self.out_c = int(1280 * scale) if scale > 1.0 else 1280\n        self.conv9 = ConvBNLayer(\n            num_channels=in_c,\n            num_filters=self.out_c,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            name=prefix_name + \"conv9\")\n\n        self.feat_channels = [int(i * scale) for i in [16, 24, 32, 96, 1280]]\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        feat_list = []\n        y = self.conv1(inputs, if_act=True)\n\n        block_index = 0\n        for block in self.block_list:\n            y = block(y)\n            if block_index in [0, 1, 2, 4]:\n                feat_list.append(y)\n            block_index += 1\n        y = self.conv9(y, if_act=True)\n        feat_list.append(y)\n        return feat_list\n\n\ndef MobileNetV2(**kwargs):\n    model = MobileNet(scale=1.0, **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/module.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport time\nimport argparse\nfrom typing import Callable, Union, List, Tuple\n\nimport numpy as np\nimport cv2\nimport scipy\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom modnet_mobilenetv2_matting.mobilenetv2 import MobileNetV2\nimport modnet_mobilenetv2_matting.processor as P\n\n\n@moduleinfo(\n    name=\"modnet_mobilenetv2_matting\",\n    type=\"CV\",\n    author=\"paddlepaddle\",\n    summary=\"modnet_mobilenetv2_matting is a matting model\",\n    version=\"1.0.0\"\n)\nclass MODNetMobilenetV2(nn.Layer):\n    \"\"\"\n    The MODNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhanghan Ke, et al. \"Is a Green Screen Really Necessary for Real-Time Portrait Matting?\"\n    (https://arxiv.org/pdf/2011.11961.pdf).\n\n    Args:\n        hr_channels(int, optional): The channels of the high resolution branch. Default: 32.\n        pretrained(str, optional): The path of the pretrained model. 
Default: None.\n\n    \"\"\"\n\n    def __init__(self, hr_channels: int = 32, pretrained=None):\n        super(MODNetMobilenetV2, self).__init__()\n\n        self.backbone = MobileNetV2()\n        self.pretrained = pretrained\n\n        self.head = MODNetHead(\n            hr_channels=hr_channels, backbone_channels=self.backbone.feat_channels)\n        self.blurer = GaussianBlurLayer(1, 3)\n        self.transforms = P.Compose([P.LoadImages(), P.ResizeByShort(), P.ResizeToIntMult(), P.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'modnet-mobilenetv2.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def preprocess(self, img: Union[str, np.ndarray], transforms: Callable, trimap: Union[str, np.ndarray] = None):\n        data = {}\n        data['img'] = img\n        if trimap is not None:\n            data['trimap'] = trimap\n            data['gt_fields'] = ['trimap']\n        data['trans_info'] = []\n        data = self.transforms(data)\n        data['img'] = paddle.to_tensor(data['img'])\n        data['img'] = data['img'].unsqueeze(0)\n        if trimap is not None:\n            data['trimap'] = paddle.to_tensor(data['trimap'])\n            data['trimap'] = data['trimap'].unsqueeze((0, 1))\n\n        return data\n\n    def forward(self, inputs: dict):\n        x = inputs['img']\n        feat_list = self.backbone(x)\n        y = self.head(inputs=inputs, feat_list=feat_list)\n        return y\n\n    def predict(self, image_list: list, trimap_list: list = None, visualization: bool = False, save_path: str = \"modnet_mobilenetv2_matting_output\"):\n        self.eval()\n        result = []\n        with 
paddle.no_grad():\n            for i, im_path in enumerate(image_list):\n                trimap = trimap_list[i] if trimap_list is not None else None\n                data = self.preprocess(img=im_path, transforms=self.transforms, trimap=trimap)\n                alpha_pred = self.forward(data)\n                alpha_pred = P.reverse_transform(alpha_pred, data['trans_info'])\n                alpha_pred = (alpha_pred.numpy()).squeeze()\n                alpha_pred = (alpha_pred * 255).astype('uint8')\n                alpha_pred = P.save_alpha_pred(alpha_pred, trimap)\n                result.append(alpha_pred)\n                if visualization:\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    cv2.imwrite(image_save_path, alpha_pred)\n\n        return result\n    \n    @serving\n    def serving_method(self, images: list, trimaps:list = None, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n        if trimaps is not None:\n            trimap_decoder = [cv2.cvtColor(P.base64_to_cv2(trimap), cv2.COLOR_BGR2GRAY) for trimap in trimaps]\n        else:\n            trimap_decoder = None\n        \n        outputs = self.predict(image_list=images_decode, trimap_list= trimap_decoder, **kwargs)\n        serving_data = [P.cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        
self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        if args.trimap_path is not None:\n            trimap_list = [args.trimap_path]\n        else:\n            trimap_list = None\n\n        results = self.predict(image_list=[args.input_path], trimap_list=trimap_list, save_path=args.output_dir, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default=\"modnet_mobilenetv2_matting_output\", help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--trimap_path', type=str, default=None, help=\"path to trimap.\")\n\n\nclass MODNetHead(nn.Layer):\n    \"\"\"\n    Segmentation head.\n    \"\"\"\n    def __init__(self, hr_channels: int, backbone_channels: list):\n        super().__init__()\n\n        self.lr_branch = LRBranch(backbone_channels)\n        self.hr_branch = HRBranch(hr_channels, backbone_channels)\n        self.f_branch = FusionBranch(hr_channels, backbone_channels)\n\n    def forward(self, inputs: dict, feat_list: list):\n        pred_semantic, lr8x, 
[enc2x, enc4x] = self.lr_branch(feat_list)\n        pred_detail, hr2x = self.hr_branch(inputs['img'], enc2x, enc4x, lr8x)\n        pred_matte = self.f_branch(inputs['img'], lr8x, hr2x)\n\n        if self.training:\n            logit_dict = {\n                'semantic': pred_semantic,\n                'detail': pred_detail,\n                'matte': pred_matte\n            }\n            return logit_dict\n        else:\n            return pred_matte\n\n\n\nclass FusionBranch(nn.Layer):\n    def __init__(self, hr_channels: int, enc_channels: int):\n        super().__init__()\n        self.conv_lr4x = Conv2dIBNormRelu(\n            enc_channels[2], hr_channels, 5, stride=1, padding=2)\n\n        self.conv_f2x = Conv2dIBNormRelu(\n            2 * hr_channels, hr_channels, 3, stride=1, padding=1)\n        self.conv_f = nn.Sequential(\n            Conv2dIBNormRelu(\n                hr_channels + 3, int(hr_channels / 2), 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                int(hr_channels / 2),\n                1,\n                1,\n                stride=1,\n                padding=0,\n                with_ibn=False,\n                with_relu=False))\n\n    def forward(self, img: paddle.Tensor, lr8x: paddle.Tensor, hr2x: paddle.Tensor):\n        lr4x = F.interpolate(\n            lr8x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr4x = self.conv_lr4x(lr4x)\n        lr2x = F.interpolate(\n            lr4x, scale_factor=2, mode='bilinear', align_corners=False)\n\n        f2x = self.conv_f2x(paddle.concat((lr2x, hr2x), axis=1))\n        f = F.interpolate(\n            f2x, scale_factor=2, mode='bilinear', align_corners=False)\n        f = self.conv_f(paddle.concat((f, img), axis=1))\n        pred_matte = F.sigmoid(f)\n\n        return pred_matte\n\n\nclass HRBranch(nn.Layer):\n    \"\"\"\n    High Resolution Branch of MODNet\n    \"\"\"\n\n    def __init__(self, hr_channels: int, enc_channels:int):\n        super().__init__()\n\n  
      self.tohr_enc2x = Conv2dIBNormRelu(\n            enc_channels[0], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc2x = Conv2dIBNormRelu(\n            hr_channels + 3, hr_channels, 3, stride=2, padding=1)\n\n        self.tohr_enc4x = Conv2dIBNormRelu(\n            enc_channels[1], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc4x = Conv2dIBNormRelu(\n            2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1)\n\n        self.conv_hr4x = nn.Sequential(\n            Conv2dIBNormRelu(\n                2 * hr_channels + enc_channels[2] + 3,\n                2 * hr_channels,\n                3,\n                stride=1,\n                padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr2x = nn.Sequential(\n            Conv2dIBNormRelu(\n                2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                2 * hr_channels, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr = nn.Sequential(\n            Conv2dIBNormRelu(\n                hr_channels + 3, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(\n                hr_channels,\n                1,\n                1,\n                stride=1,\n                padding=0,\n                with_ibn=False,\n                with_relu=False))\n\n    def forward(self, img: paddle.Tensor, enc2x: paddle.Tensor, enc4x: paddle.Tensor, lr8x: paddle.Tensor):\n        img2x = F.interpolate(\n            img, scale_factor=1 / 2, mode='bilinear', align_corners=False)\n        img4x = F.interpolate(\n            img, scale_factor=1 / 4, mode='bilinear', 
align_corners=False)\n\n        enc2x = self.tohr_enc2x(enc2x)\n        hr4x = self.conv_enc2x(paddle.concat((img2x, enc2x), axis=1))\n\n        enc4x = self.tohr_enc4x(enc4x)\n        hr4x = self.conv_enc4x(paddle.concat((hr4x, enc4x), axis=1))\n\n        lr4x = F.interpolate(\n            lr8x, scale_factor=2, mode='bilinear', align_corners=False)\n        hr4x = self.conv_hr4x(paddle.concat((hr4x, lr4x, img4x), axis=1))\n\n        hr2x = F.interpolate(\n            hr4x, scale_factor=2, mode='bilinear', align_corners=False)\n        hr2x = self.conv_hr2x(paddle.concat((hr2x, enc2x), axis=1))\n\n        pred_detail = None\n        if self.training:\n            hr = F.interpolate(\n                hr2x, scale_factor=2, mode='bilinear', align_corners=False)\n            hr = self.conv_hr(paddle.concat((hr, img), axis=1))\n            pred_detail = F.sigmoid(hr)\n\n        return pred_detail, hr2x\n\n\nclass LRBranch(nn.Layer):\n    \"\"\"\n    Low Resolution Branch of MODNet\n    \"\"\"\n    def __init__(self, backbone_channels: int):\n        super().__init__()\n        self.se_block = SEBlock(backbone_channels[4], reduction=4)\n        self.conv_lr16x = Conv2dIBNormRelu(\n            backbone_channels[4], backbone_channels[3], 5, stride=1, padding=2)\n        self.conv_lr8x = Conv2dIBNormRelu(\n            backbone_channels[3], backbone_channels[2], 5, stride=1, padding=2)\n        self.conv_lr = Conv2dIBNormRelu(\n            backbone_channels[2],\n            1,\n            3,\n            stride=2,\n            padding=1,\n            with_ibn=False,\n            with_relu=False)\n\n    def forward(self, feat_list: list):\n        enc2x, enc4x, enc32x = feat_list[0], feat_list[1], feat_list[4]\n\n        enc32x = self.se_block(enc32x)\n        lr16x = F.interpolate(\n            enc32x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr16x = self.conv_lr16x(lr16x)\n        lr8x = F.interpolate(\n            lr16x, scale_factor=2, 
mode='bilinear', align_corners=False)\n        lr8x = self.conv_lr8x(lr8x)\n\n        pred_semantic = None\n        if self.training:\n            lr = self.conv_lr(lr8x)\n            pred_semantic = F.sigmoid(lr)\n\n        return pred_semantic, lr8x, [enc2x, enc4x]\n\n\nclass IBNorm(nn.Layer):\n    \"\"\"\n    Combine Instance Norm and Batch Norm into One Layer\n    \"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n        self.bnorm_channels = in_channels // 2\n        self.inorm_channels = in_channels - self.bnorm_channels\n\n        self.bnorm = nn.BatchNorm2D(self.bnorm_channels)\n        self.inorm = nn.InstanceNorm2D(self.inorm_channels)\n\n    def forward(self, x):\n        bn_x = self.bnorm(x[:, :self.bnorm_channels, :, :])\n        in_x = self.inorm(x[:, self.bnorm_channels:, :, :])\n\n        return paddle.concat((bn_x, in_x), 1)\n\n\nclass Conv2dIBNormRelu(nn.Layer):\n    \"\"\"\n    Convolution + IBNorm + Relu\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 padding: int = 0,\n                 dilation:int = 1,\n                 groups: int = 1,\n                 bias_attr: paddle.ParamAttr = None,\n                 with_ibn: bool = True,\n                 with_relu: bool = True):\n\n        super().__init__()\n\n        layers = [\n            nn.Conv2D(\n                in_channels,\n                out_channels,\n                kernel_size,\n                stride=stride,\n                padding=padding,\n                dilation=dilation,\n                groups=groups,\n                bias_attr=bias_attr)\n        ]\n\n        if with_ibn:\n            layers.append(IBNorm(out_channels))\n\n        if with_relu:\n            layers.append(nn.ReLU())\n\n        self.layers = nn.Sequential(*layers)\n\n    def forward(self, x: paddle.Tensor):\n        return 
self.layers(x)\n\n\nclass SEBlock(nn.Layer):\n    \"\"\"\n    SE Block Proposed in https://arxiv.org/pdf/1709.01507.pdf\n    \"\"\"\n\n    def __init__(self, num_channels: int, reduction:int = 1):\n        super().__init__()\n        self.pool = nn.AdaptiveAvgPool2D(1)\n        self.conv = nn.Sequential(\n            nn.Conv2D(\n                num_channels,\n                int(num_channels // reduction),\n                1,\n                bias_attr=False), nn.ReLU(),\n            nn.Conv2D(\n                int(num_channels // reduction),\n                num_channels,\n                1,\n                bias_attr=False), nn.Sigmoid())\n\n    def forward(self, x: paddle.Tensor):\n        w = self.pool(x)\n        w = self.conv(w)\n        return w * x\n\n\nclass GaussianBlurLayer(nn.Layer):\n    \"\"\" Add Gaussian Blur to a 4D tensors\n    This layer takes a 4D tensor of {N, C, H, W} as input.\n    The Gaussian blur will be performed in given channel number (C) splitly.\n    \"\"\"\n\n    def __init__(self, channels: int, kernel_size: int):\n        \"\"\"\n        Args:\n            channels (int): Channel for input tensor\n            kernel_size (int): Size of the kernel used in blurring\n        \"\"\"\n\n        super(GaussianBlurLayer, self).__init__()\n        self.channels = channels\n        self.kernel_size = kernel_size\n        assert self.kernel_size % 2 != 0\n\n        self.op = nn.Sequential(\n            nn.Pad2D(int(self.kernel_size / 2), mode='reflect'),\n            nn.Conv2D(\n                channels,\n                channels,\n                self.kernel_size,\n                stride=1,\n                padding=0,\n                bias_attr=False,\n                groups=channels))\n\n        self._init_kernel()\n        self.op[1].weight.stop_gradient = True\n\n    def forward(self, x: paddle.Tensor):\n        \"\"\"\n        Args:\n            x (paddle.Tensor): input 4D tensor\n        Returns:\n            paddle.Tensor: Blurred 
version of the input\n        \"\"\"\n\n        if len(x.shape) != 4:\n            raise ValueError(\n                \"'GaussianBlurLayer' requires a 4D tensor as input\")\n        if x.shape[1] != self.channels:\n            raise ValueError(\n                \"In 'GaussianBlurLayer', the required channel ({0}) is \"\n                \"not the same as input ({1})\".format(self.channels, x.shape[1]))\n\n        return self.op(x)\n\n    def _init_kernel(self):\n        sigma = 0.3 * ((self.kernel_size - 1) * 0.5 - 1) + 0.8\n\n        n = np.zeros((self.kernel_size, self.kernel_size))\n        i = int(self.kernel_size / 2)\n        n[i, i] = 1\n        kernel = scipy.ndimage.gaussian_filter(n, sigma)\n        kernel = kernel.astype('float32')\n        kernel = kernel[np.newaxis, np.newaxis, :, :]\n        paddle.assign(kernel, self.op[1].weight)"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport random\nimport base64\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddleseg.transforms import functional\nfrom PIL import Image\n\n\nclass Compose:\n    \"\"\"\n    Do transformation on input data with corresponding pre-processing and augmentation operations.\n    The shape of input data to all operations is [height, width, channels].\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if 'trans_info' not in data:\n            data['trans_info'] = []\n        for op in self.transforms:\n            data = op(data)\n            if data is None:\n                return None\n\n        data['img'] = np.transpose(data['img'], (2, 0, 1))\n        for key in data.get('gt_fields', []):\n            if len(data[key].shape) == 2:\n                continue\n            data[key] = np.transpose(data[key], (2, 0, 1))\n\n        return data\n\n\nclass LoadImages:\n    \"\"\"\n    Read images from image path.\n\n    Args:\n        to_rgb (bool, optional): If converting image to RGB color 
space. Default: True.\n    \"\"\"\n    def __init__(self, to_rgb: bool = True):\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if isinstance(data['img'], str):\n            data['img'] = cv2.imread(data['img'])\n\n        for key in data.get('gt_fields', []):\n            if isinstance(data[key], str):\n                data[key] = cv2.imread(data[key], cv2.IMREAD_UNCHANGED)\n            # if alpha and trimap have 3 channels, extract one.\n            if key in ['alpha', 'trimap']:\n                if len(data[key].shape) > 2:\n                    data[key] = data[key][:, :, 0]\n\n        if self.to_rgb:\n            data['img'] = cv2.cvtColor(data['img'], cv2.COLOR_BGR2RGB)\n            for key in data.get('gt_fields', []):\n                if len(data[key].shape) == 2:\n                    continue\n                data[key] = cv2.cvtColor(data[key], cv2.COLOR_BGR2RGB)\n\n        return data\n\n\nclass ResizeByShort:\n    \"\"\"\n    Resize the short side of an image to given size, and then scale the other side proportionally.\n\n    Args:\n        short_size (int): The target size of short side.\n    \"\"\"\n\n    def __init__(self, short_size: int = 512):\n        self.short_size = short_size\n\n    def __call__(self, data: dict) -> dict:\n\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n        data['img'] = functional.resize_short(data['img'], self.short_size)\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize_short(data[key], self.short_size)\n        return data\n\n\nclass ResizeToIntMult:\n    \"\"\"\n    Resize to some integer multiple, e.g. 
32.\n    \"\"\"\n\n    def __init__(self, mult_int: int = 32):\n        self.mult_int = mult_int\n\n    def __call__(self, data: dict) -> dict:\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n\n        h, w = data['img'].shape[0:2]\n        rw = w - w % self.mult_int\n        rh = h - h % self.mult_int\n        data['img'] = functional.resize(data['img'], (rw, rh))\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize(data[key], (rw, rh))\n\n        return data\n\n\nclass Normalize:\n    \"\"\"\n    Normalize an image.\n\n    Args:\n        mean (list, optional): The mean value of a data set. Default: [0.5, 0.5, 0.5].\n        std (list, optional): The standard deviation of a data set. Default: [0.5, 0.5, 0.5].\n\n    Raises:\n        ValueError: When mean/std is not list or any value in std is 0.\n    \"\"\"\n\n    def __init__(self, mean: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5), std: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5)):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, (list, tuple))\n                and isinstance(self.std, (list, tuple))):\n            raise ValueError(\n                \"{}: input type is invalid. 
It should be list or tuple\".format(\n                    self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, data: dict) -> dict:\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        data['img'] = functional.normalize(data['img'], mean, std)\n        if 'fg' in data.get('gt_fields', []):\n            data['fg'] = functional.normalize(data['fg'], mean, std)\n        if 'bg' in data.get('gt_fields', []):\n            data['bg'] = functional.normalize(data['bg'], mean, std)\n\n        return data\n\n\ndef reverse_transform(alpha: paddle.Tensor, trans_info: List[Tuple]):\n    \"\"\"Recover the prediction to the original shape.\"\"\"\n    for item in trans_info[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            alpha = F.interpolate(alpha, [h, w], mode='bilinear')\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            alpha = alpha[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return alpha\n\n\ndef save_alpha_pred(alpha: np.ndarray, trimap: np.ndarray = None):\n    \"\"\"\n    The value of alpha is in range [0, 1], shape should be [h, w].\n    \"\"\"\n    if trimap is not None:\n        if isinstance(trimap, str):\n            trimap = cv2.imread(trimap, 0)\n        alpha[trimap == 0] = 0\n        alpha[trimap == 255] = 255\n    alpha = alpha.astype('uint8')\n    return alpha\n\n\ndef cv2_to_base64(image: np.ndarray):\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.png', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = 
np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data"
  },
  {
    "path": "modules/image/matting/modnet_mobilenetv2_matting/requirements.py",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/README.md",
    "content": "# modnet_resnet50vd_matting\n\n|模型名称|modnet_resnet50vd_matting|\n| :--- | :---: |\n|类别|图像-抠图|\n|网络|modnet_resnet50vd|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|535MB|\n|指标|SAD112.73|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/>\n    <img src=\"https://user-images.githubusercontent.com/35907364/144779164-47146d3a-58c9-4a38-b968-3530aa9a0137.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - Matting（精细化分割/影像去背/抠图）是指借由计算前景的颜色和透明度，将前景从影像中撷取出来的技术，可用于替换背景、影像合成、视觉特效，在电影工业中被广泛地使用。影像中的每个像素会有代表其前景透明度的值，称作阿法值（Alpha），一张影像中所有阿法值的集合称作阿法遮罩（Alpha Matte），将影像被遮罩所涵盖的部分取出即可完成前景的分离。modnet_resnet50vd_matting可生成抠图结果。\n\n\n\n  - 更多详情请参考：[modnet_resnet50vd_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install modnet_resnet50vd_matting\n      ```\n\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run modnet_resnet50vd_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_resnet50vd_matting\")\n\n      result = model.predict([\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                
    image_list,\n                    trimap_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - 人像matting预测API，用于将输入图片中的人像分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)):图片输入路径或者BGR格式numpy数据。\n            - trimap_list(list(str | numpy.ndarray)):trimap输入路径或者灰度图单通道格式图片。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"modnet_resnet50vd_matting_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果：\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署人像matting在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m modnet_resnet50vd_matting\n      ```\n\n    - 这样就完成了一个人像matting在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_resnet50vd_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path =str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n      ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/README_en.md",
"content": "# modnet_resnet50vd_matting\n\n|Module Name|modnet_resnet50vd_matting|\n| :--- | :---: |\n|Category|Image Matting|\n|Network|modnet_resnet50vd|\n|Dataset|Baidu self-built dataset|\n|Support Fine-tuning|No|\n|Module Size|535MB|\n|Data Indicators|SAD104.14|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/144574288-28671577-8d5d-4b20-adb9-fe737015c841.jpg\" width = \"337\" height = \"505\" hspace='10'/>\n    <img src=\"https://user-images.githubusercontent.com/35907364/144779164-47146d3a-58c9-4a38-b968-3530aa9a0137.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - Matting is the technique of extracting the foreground from an image by calculating its color and transparency. It is widely used in the film industry for background replacement, image composition, and visual effects. Each pixel in the image has a value that represents its foreground transparency, called Alpha. The set of all Alpha values in an image is called the Alpha Matte. The part of the image covered by the mask can be extracted to complete foreground separation.\n\n\n\n  - For more information, please refer to: [modnet_resnet50vd_matting](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.3/contrib/Matting)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install modnet_resnet50vd_matting\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run modnet_resnet50vd_matting --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"modnet_resnet50vd_matting\")\n\n      result = model.predict([\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                    image_list,\n                    trimap_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - Prediction API for matting.\n\n        - **Parameter**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data, ndarray.shape is in the format \\[H, W, C\\], BGR.\n            - trimap_list(list(str | numpy.ndarray)): Trimap path or trimap data, ndarray.shape is in the format \\[H, W\\], Gray. Default is None.\n            - visualization (bool): Whether to save the recognition results as picture files, default is False.\n            - save_path (str): Save path of images, \"modnet_resnet50vd_matting_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray))：The list of model results.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of matting.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m modnet_resnet50vd_matting\n      ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import time\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/modnet_resnet50vd_matting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    for image in r.json()[\"results\"]['data']:\n        data = base64_to_cv2(image)\n        image_path =str(time.time()) + \".png\"\n        cv2.imwrite(image_path, data)\n      ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/module.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport os\nimport time\nfrom typing import Callable\nfrom typing import List\nfrom typing import Union\n\nimport cv2\nimport modnet_resnet50vd_matting.processor as P\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport scipy\nfrom modnet_resnet50vd_matting.resnet import ResNet50_vd\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"modnet_resnet50vd_matting\",\n            type=\"CV/matting\",\n            author=\"paddlepaddle\",\n            summary=\"modnet_resnet50vd_matting is a matting model\",\n            version=\"1.0.0\")\nclass MODNetResNet50Vd(nn.Layer):\n    \"\"\"\n    The MODNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhanghan Ke, et al. \"Is a Green Screen Really Necessary for Real-Time Portrait Matting?\"\n    (https://arxiv.org/pdf/2011.11961.pdf).\n\n    Args:\n        hr_channels(int, optional): The channels of the high-resolution branch. Default: 32.\n        pretrained(str, optional): The path of the pretrained model. 
Default: None.\n    \"\"\"\n\n    def __init__(self, hr_channels: int = 32, pretrained=None):\n        super(MODNetResNet50Vd, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        self.pretrained = pretrained\n\n        self.head = MODNetHead(hr_channels=hr_channels, backbone_channels=self.backbone.feat_channels)\n        self.blurer = GaussianBlurLayer(1, 3)\n        self.transforms = P.Compose([P.LoadImages(), P.ResizeByShort(), P.ResizeToIntMult(), P.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'modnet-resnet50_vd.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def preprocess(self, img: Union[str, np.ndarray], transforms: Callable, trimap: Union[str, np.ndarray] = None):\n        data = {}\n        data['img'] = img\n        if trimap is not None:\n            data['trimap'] = trimap\n            data['gt_fields'] = ['trimap']\n        data['trans_info'] = []\n        data = transforms(data)\n        data['img'] = paddle.to_tensor(data['img'])\n        data['img'] = data['img'].unsqueeze(0)\n        if trimap is not None:\n            data['trimap'] = paddle.to_tensor(data['trimap'])\n            data['trimap'] = data['trimap'].unsqueeze((0, 1))\n\n        return data\n\n    def forward(self, inputs: dict):\n        x = inputs['img']\n        feat_list = self.backbone(x)\n        y = self.head(inputs=inputs, feat_list=feat_list)\n        return y\n\n    def predict(self,\n                image_list: list,\n                trimap_list: list = None,\n                visualization: bool = False,\n                save_path: str = \"modnet_resnet50vd_matting_output\"):\n        self.eval()\n        result = []\n        
with paddle.no_grad():\n            for i, im_path in enumerate(image_list):\n                trimap = trimap_list[i] if trimap_list is not None else None\n                data = self.preprocess(img=im_path, transforms=self.transforms, trimap=trimap)\n                alpha_pred = self.forward(data)\n                alpha_pred = P.reverse_transform(alpha_pred, data['trans_info'])\n                alpha_pred = (alpha_pred.numpy()).squeeze()\n                alpha_pred = (alpha_pred * 255).astype('uint8')\n                alpha_pred = P.save_alpha_pred(alpha_pred, trimap)\n                result.append(alpha_pred)\n                if visualization:\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    cv2.imwrite(image_save_path, alpha_pred)\n\n        return result\n\n    @serving\n    def serving_method(self, images: list, trimaps: list = None, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [P.base64_to_cv2(image) for image in images]\n        if trimaps is not None:\n            trimap_decoder = [cv2.cvtColor(P.base64_to_cv2(trimap), cv2.COLOR_BGR2GRAY) for trimap in trimaps]\n        else:\n            trimap_decoder = None\n\n        outputs = self.predict(image_list=images_decode, trimap_list=trimap_decoder, **kwargs)\n        serving_data = [P.cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              
usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        if args.trimap_path is not None:\n            trimap_list = [args.trimap_path]\n        else:\n            trimap_list = None\n\n        results = self.predict(image_list=[args.input_path],\n                               trimap_list=trimap_list,\n                               save_path=args.output_dir,\n                               visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default=\"modnet_resnet50vd_matting_output\",\n                                           help=\"The directory to save output images.\")\n        # Note: argparse's bare `type=bool` treats any non-empty string as True,\n        # so parse the flag text explicitly.\n        self.arg_config_group.add_argument('--visualization',\n                                           type=lambda s: s.lower() in ('true', '1', 'yes'),\n                                           default=True,\n                                           help=\"Whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--trimap_path', type=str, default=None, help=\"path to trimap.\")\n\n\nclass MODNetHead(nn.Layer):\n    \"\"\"\n    Segmentation head.\n    \"\"\"\n\n    
def __init__(self, hr_channels: int, backbone_channels: int):\n        super().__init__()\n\n        self.lr_branch = LRBranch(backbone_channels)\n        self.hr_branch = HRBranch(hr_channels, backbone_channels)\n        self.f_branch = FusionBranch(hr_channels, backbone_channels)\n\n    def forward(self, inputs: paddle.Tensor, feat_list: list) -> paddle.Tensor:\n        pred_semantic, lr8x, [enc2x, enc4x] = self.lr_branch(feat_list)\n        pred_detail, hr2x = self.hr_branch(inputs['img'], enc2x, enc4x, lr8x)\n        pred_matte = self.f_branch(inputs['img'], lr8x, hr2x)\n        return pred_matte\n\n\nclass FusionBranch(nn.Layer):\n\n    def __init__(self, hr_channels: int, enc_channels: int):\n        super().__init__()\n        self.conv_lr4x = Conv2dIBNormRelu(enc_channels[2], hr_channels, 5, stride=1, padding=2)\n\n        self.conv_f2x = Conv2dIBNormRelu(2 * hr_channels, hr_channels, 3, stride=1, padding=1)\n        self.conv_f = nn.Sequential(\n            Conv2dIBNormRelu(hr_channels + 3, int(hr_channels / 2), 3, stride=1, padding=1),\n            Conv2dIBNormRelu(int(hr_channels / 2), 1, 1, stride=1, padding=0, with_ibn=False, with_relu=False))\n\n    def forward(self, img: paddle.Tensor, lr8x: paddle.Tensor, hr2x: paddle.Tensor) -> paddle.Tensor:\n        lr4x = F.interpolate(lr8x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr4x = self.conv_lr4x(lr4x)\n        lr2x = F.interpolate(lr4x, scale_factor=2, mode='bilinear', align_corners=False)\n\n        f2x = self.conv_f2x(paddle.concat((lr2x, hr2x), axis=1))\n        f = F.interpolate(f2x, scale_factor=2, mode='bilinear', align_corners=False)\n        f = self.conv_f(paddle.concat((f, img), axis=1))\n        pred_matte = F.sigmoid(f)\n\n        return pred_matte\n\n\nclass HRBranch(nn.Layer):\n    \"\"\"\n    High Resolution Branch of MODNet\n    \"\"\"\n\n    def __init__(self, hr_channels: int, enc_channels: int):\n        super().__init__()\n\n        self.tohr_enc2x = 
Conv2dIBNormRelu(enc_channels[0], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc2x = Conv2dIBNormRelu(hr_channels + 3, hr_channels, 3, stride=2, padding=1)\n\n        self.tohr_enc4x = Conv2dIBNormRelu(enc_channels[1], hr_channels, 1, stride=1, padding=0)\n        self.conv_enc4x = Conv2dIBNormRelu(2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1)\n\n        self.conv_hr4x = nn.Sequential(\n            Conv2dIBNormRelu(2 * hr_channels + enc_channels[2] + 3, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(2 * hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr2x = nn.Sequential(Conv2dIBNormRelu(2 * hr_channels, 2 * hr_channels, 3, stride=1, padding=1),\n                                       Conv2dIBNormRelu(2 * hr_channels, hr_channels, 3, stride=1, padding=1),\n                                       Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1),\n                                       Conv2dIBNormRelu(hr_channels, hr_channels, 3, stride=1, padding=1))\n\n        self.conv_hr = nn.Sequential(\n            Conv2dIBNormRelu(hr_channels + 3, hr_channels, 3, stride=1, padding=1),\n            Conv2dIBNormRelu(hr_channels, 1, 1, stride=1, padding=0, with_ibn=False, with_relu=False))\n\n    def forward(self, img: paddle.Tensor, enc2x: paddle.Tensor, enc4x: paddle.Tensor,\n                lr8x: paddle.Tensor) -> paddle.Tensor:\n        img2x = F.interpolate(img, scale_factor=1 / 2, mode='bilinear', align_corners=False)\n        img4x = F.interpolate(img, scale_factor=1 / 4, mode='bilinear', align_corners=False)\n\n        enc2x = self.tohr_enc2x(enc2x)\n        hr4x = self.conv_enc2x(paddle.concat((img2x, enc2x), axis=1))\n\n        enc4x = self.tohr_enc4x(enc4x)\n        hr4x = self.conv_enc4x(paddle.concat((hr4x, enc4x), axis=1))\n\n        lr4x = F.interpolate(lr8x, scale_factor=2, 
mode='bilinear', align_corners=False)\n        hr4x = self.conv_hr4x(paddle.concat((hr4x, lr4x, img4x), axis=1))\n\n        hr2x = F.interpolate(hr4x, scale_factor=2, mode='bilinear', align_corners=False)\n        hr2x = self.conv_hr2x(paddle.concat((hr2x, enc2x), axis=1))\n        pred_detail = None\n        return pred_detail, hr2x\n\n\nclass LRBranch(nn.Layer):\n    \"\"\"\n    Low Resolution Branch of MODNet\n    \"\"\"\n\n    def __init__(self, backbone_channels: int):\n        super().__init__()\n        self.se_block = SEBlock(backbone_channels[4], reduction=4)\n        self.conv_lr16x = Conv2dIBNormRelu(backbone_channels[4], backbone_channels[3], 5, stride=1, padding=2)\n        self.conv_lr8x = Conv2dIBNormRelu(backbone_channels[3], backbone_channels[2], 5, stride=1, padding=2)\n        self.conv_lr = Conv2dIBNormRelu(backbone_channels[2],\n                                        1,\n                                        3,\n                                        stride=2,\n                                        padding=1,\n                                        with_ibn=False,\n                                        with_relu=False)\n\n    def forward(self, feat_list: list) -> List[paddle.Tensor]:\n        enc2x, enc4x, enc32x = feat_list[0], feat_list[1], feat_list[4]\n\n        enc32x = self.se_block(enc32x)\n        lr16x = F.interpolate(enc32x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr16x = self.conv_lr16x(lr16x)\n        lr8x = F.interpolate(lr16x, scale_factor=2, mode='bilinear', align_corners=False)\n        lr8x = self.conv_lr8x(lr8x)\n\n        pred_semantic = None\n        if self.training:\n            lr = self.conv_lr(lr8x)\n            pred_semantic = F.sigmoid(lr)\n\n        return pred_semantic, lr8x, [enc2x, enc4x]\n\n\nclass IBNorm(nn.Layer):\n    \"\"\"\n    Combine Instance Norm and Batch Norm into One Layer\n    \"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n        
self.bnorm_channels = in_channels // 2\n        self.inorm_channels = in_channels - self.bnorm_channels\n\n        self.bnorm = nn.BatchNorm2D(self.bnorm_channels)\n        self.inorm = nn.InstanceNorm2D(self.inorm_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        bn_x = self.bnorm(x[:, :self.bnorm_channels, :, :])\n        in_x = self.inorm(x[:, self.bnorm_channels:, :, :])\n\n        return paddle.concat((bn_x, in_x), 1)\n\n\nclass Conv2dIBNormRelu(nn.Layer):\n    \"\"\"\n    Convolution + IBNorm + Relu\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 padding: int = 0,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 bias_attr: paddle.ParamAttr = None,\n                 with_ibn: bool = True,\n                 with_relu: bool = True):\n\n        super().__init__()\n\n        layers = [\n            nn.Conv2D(in_channels,\n                      out_channels,\n                      kernel_size,\n                      stride=stride,\n                      padding=padding,\n                      dilation=dilation,\n                      groups=groups,\n                      bias_attr=bias_attr)\n        ]\n\n        if with_ibn:\n            layers.append(IBNorm(out_channels))\n\n        if with_relu:\n            layers.append(nn.ReLU())\n\n        self.layers = nn.Sequential(*layers)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        return self.layers(x)\n\n\nclass SEBlock(nn.Layer):\n    \"\"\"\n    SE Block Proposed in https://arxiv.org/pdf/1709.01507.pdf\n    \"\"\"\n\n    def __init__(self, num_channels: int, reduction: int = 1):\n        super().__init__()\n        self.pool = nn.AdaptiveAvgPool2D(1)\n        self.conv = nn.Sequential(nn.Conv2D(num_channels, int(num_channels // reduction), 1,\n                              
              bias_attr=False), nn.ReLU(),\n                                  nn.Conv2D(int(num_channels // reduction), num_channels, 1, bias_attr=False),\n                                  nn.Sigmoid())\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        w = self.pool(x)\n        w = self.conv(w)\n        return w * x\n\n\nclass GaussianBlurLayer(nn.Layer):\n    \"\"\" Add Gaussian Blur to a 4D tensors\n    This layer takes a 4D tensor of {N, C, H, W} as input.\n    The Gaussian blur will be performed in given channel number (C) splitly.\n    \"\"\"\n\n    def __init__(self, channels: int, kernel_size: int):\n        \"\"\"\n        Args:\n            channels (int): Channel for input tensor\n            kernel_size (int): Size of the kernel used in blurring\n        \"\"\"\n\n        super(GaussianBlurLayer, self).__init__()\n        self.channels = channels\n        self.kernel_size = kernel_size\n        assert self.kernel_size % 2 != 0\n\n        self.op = nn.Sequential(\n            nn.Pad2D(int(self.kernel_size / 2), mode='reflect'),\n            nn.Conv2D(channels, channels, self.kernel_size, stride=1, padding=0, bias_attr=False, groups=channels))\n\n        self._init_kernel()\n        self.op[1].weight.stop_gradient = True\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        \"\"\"\n        Args:\n            x (paddle.Tensor): input 4D tensor\n        Returns:\n            paddle.Tensor: Blurred version of the input\n        \"\"\"\n\n        if not len(list(x.shape)) == 4:\n            print('\\'GaussianBlurLayer\\' requires a 4D tensor as input\\n')\n            exit()\n        elif not x.shape[1] == self.channels:\n            print('In \\'GaussianBlurLayer\\', the required channel ({0}) is'\n                  'not the same as input ({1})\\n'.format(self.channels, x.shape[1]))\n            exit()\n\n        return self.op(x)\n\n    def _init_kernel(self):\n        sigma = 0.3 * ((self.kernel_size - 1) * 0.5 - 1) + 
0.8\n\n        n = np.zeros((self.kernel_size, self.kernel_size))\n        i = int(self.kernel_size / 2)\n        n[i, i] = 1\n        kernel = scipy.ndimage.gaussian_filter(n, sigma)\n        kernel = kernel.astype('float32')\n        kernel = kernel[np.newaxis, np.newaxis, :, :]\n        paddle.assign(kernel, self.op[1].weight)\n"
  },
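The `GaussianBlurLayer` above builds its fixed depthwise-convolution weight by filtering a unit impulse with `scipy.ndimage.gaussian_filter`, with sigma derived from the kernel size via OpenCV's rule `0.3 * ((k - 1) * 0.5 - 1) + 0.8`. As a minimal NumPy-only sketch of the same construction (a closed-form separable Gaussian, equivalent up to boundary truncation; the function name is ours, not part of the module):

```python
import numpy as np


def gaussian_kernel(kernel_size: int) -> np.ndarray:
    """Normalized 2-D Gaussian kernel mirroring GaussianBlurLayer._init_kernel.

    sigma follows OpenCV's default rule for an odd kernel size; the 2-D kernel
    is the outer product of a 1-D Gaussian with itself, renormalized to sum to 1.
    """
    assert kernel_size % 2 == 1, "kernel size must be odd"
    sigma = 0.3 * ((kernel_size - 1) * 0.5 - 1) + 0.8
    ax = np.arange(kernel_size) - kernel_size // 2  # e.g. [-1, 0, 1] for k=3
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    kernel = np.outer(g, g)
    return (kernel / kernel.sum()).astype('float32')
```

The module instantiates `GaussianBlurLayer(1, 3)`, so a single-channel 3x3 kernel of this form, assigned to a grouped `Conv2D` with frozen gradients, reproduces the blur.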
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport base64\nfrom typing import Callable\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom paddleseg.transforms import functional\n\n\nclass Compose:\n    \"\"\"\n    Do transformation on input data with corresponding pre-processing and augmentation operations.\n    The shape of input data to all operations is [height, width, channels].\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if 'trans_info' not in data:\n            data['trans_info'] = []\n        for op in self.transforms:\n            data = op(data)\n            if data is None:\n                return None\n\n        data['img'] = np.transpose(data['img'], (2, 0, 1))\n        for key in data.get('gt_fields', []):\n            if len(data[key].shape) == 2:\n                continue\n            data[key] = np.transpose(data[key], (2, 0, 1))\n\n        return data\n\n\nclass LoadImages:\n    \"\"\"\n    Read images from image path.\n\n    Args:\n        to_rgb (bool, optional): If converting 
image to RGB color space. Default: True.\n    \"\"\"\n\n    def __init__(self, to_rgb: bool = True):\n        self.to_rgb = to_rgb\n\n    def __call__(self, data: dict) -> dict:\n\n        if isinstance(data['img'], str):\n            data['img'] = cv2.imread(data['img'])\n\n        for key in data.get('gt_fields', []):\n            if isinstance(data[key], str):\n                data[key] = cv2.imread(data[key], cv2.IMREAD_UNCHANGED)\n            # if alpha or trimap has 3 channels, extract one.\n            if key in ['alpha', 'trimap']:\n                if len(data[key].shape) > 2:\n                    data[key] = data[key][:, :, 0]\n\n        if self.to_rgb:\n            data['img'] = cv2.cvtColor(data['img'], cv2.COLOR_BGR2RGB)\n            for key in data.get('gt_fields', []):\n                if len(data[key].shape) == 2:\n                    continue\n                data[key] = cv2.cvtColor(data[key], cv2.COLOR_BGR2RGB)\n\n        return data\n\n\nclass ResizeByShort:\n    \"\"\"\n    Resize the short side of an image to a given size, and then scale the other side proportionally.\n\n    Args:\n        short_size (int): The target size of the short side.\n    \"\"\"\n\n    def __init__(self, short_size: int = 512):\n        self.short_size = short_size\n\n    def __call__(self, data: dict) -> dict:\n\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n        data['img'] = functional.resize_short(data['img'], self.short_size)\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize_short(data[key], self.short_size)\n        return data\n\n\nclass ResizeToIntMult:\n    \"\"\"\n    Resize the image down to an integer multiple of a given stride, e.g. 
32.\n    \"\"\"\n\n    def __init__(self, mult_int: int = 32):\n        self.mult_int = mult_int\n\n    def __call__(self, data: dict) -> dict:\n        data['trans_info'].append(('resize', data['img'].shape[0:2]))\n\n        h, w = data['img'].shape[0:2]\n        rw = w - w % self.mult_int\n        rh = h - h % self.mult_int\n        data['img'] = functional.resize(data['img'], (rw, rh))\n        for key in data.get('gt_fields', []):\n            data[key] = functional.resize(data[key], (rw, rh))\n\n        return data\n\n\nclass Normalize:\n    \"\"\"\n    Normalize an image.\n\n    Args:\n        mean (list, optional): The mean value of a data set. Default: [0.5, 0.5, 0.5].\n        std (list, optional): The standard deviation of a data set. Default: [0.5, 0.5, 0.5].\n\n    Raises:\n        ValueError: When mean/std is not list or any value in std is 0.\n    \"\"\"\n\n    def __init__(self,\n                 mean: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5),\n                 std: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5)):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, (list, tuple)) and isinstance(self.std, (list, tuple))):\n            raise ValueError(\"{}: input type is invalid. 
It should be list or tuple\".format(self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, data: dict) -> dict:\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        data['img'] = functional.normalize(data['img'], mean, std)\n        if 'fg' in data.get('gt_fields', []):\n            data['fg'] = functional.normalize(data['fg'], mean, std)\n        if 'bg' in data.get('gt_fields', []):\n            data['bg'] = functional.normalize(data['bg'], mean, std)\n\n        return data\n\n\ndef reverse_transform(alpha: paddle.Tensor, trans_info: List[tuple]):\n    \"\"\"Recover the prediction to its original shape.\"\"\"\n    for item in trans_info[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            alpha = F.interpolate(alpha, [h, w], mode='bilinear')\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            alpha = alpha[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return alpha\n\n\ndef save_alpha_pred(alpha: np.ndarray, trimap: np.ndarray = None):\n    \"\"\"\n    Refine the alpha prediction with an optional trimap.\n    The value of alpha is in the range [0, 255]; its shape should be [h, w].\n    \"\"\"\n    if isinstance(trimap, str):\n        trimap = cv2.imread(trimap, 0)\n    if trimap is not None:\n        alpha[trimap == 0] = 0\n        alpha[trimap == 255] = 255\n    alpha = alpha.astype('uint8')\n    return alpha\n\n\ndef cv2_to_base64(image: np.ndarray):\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.png', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str):\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    
data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n"
  },
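The preprocessing pipeline in processor.py first scales the short side to 512 (`ResizeByShort`) and then snaps both dimensions down to a multiple of 32 (`ResizeToIntMult`) so the backbone's five stride-2 stages divide the input evenly; `reverse_transform` later undoes these resizes in reverse order. The dimension arithmetic can be sketched in isolation (helper names are ours, and the rounding in `resize_short_side` approximates paddleseg's behavior):

```python
def resize_short_side(h: int, w: int, short_size: int = 512):
    """Scale (h, w) so the short side equals short_size, keeping aspect ratio."""
    scale = short_size / min(h, w)
    return round(h * scale), round(w * scale)


def snap_to_mult(h: int, w: int, mult: int = 32):
    """Round a (height, width) pair down to the nearest multiple of `mult`,
    mirroring what ResizeToIntMult does before the image enters the network."""
    return h - h % mult, w - w % mult
```

For a 1080x1920 frame this chain yields 512x910 after the short-side resize and 512x896 after snapping, which is what the network actually consumes; `reverse_transform` interpolates the predicted alpha back to 1080x1920.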
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/requirements.txt",
    "content": "paddleseg>=2.3.0\n"
  },
  {
    "path": "modules/image/matting/modnet_resnet50vd_matting/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddleseg.models import layers\n\n__all__ = [\"ResNet50_vd\"]\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        kernel_size: int,\n        stride: int = 1,\n        dilation: int = 1,\n        groups: int = 1,\n        is_vd_mode: bool = False,\n        act: str = None,\n    ):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = nn.Conv2D(in_channels=in_channels,\n                               out_channels=out_channels,\n                               kernel_size=kernel_size,\n                               stride=stride,\n                               padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n                               dilation=dilation,\n                               groups=groups,\n                               bias_attr=False)\n\n        self._batch_norm = layers.SyncBatchNorm(out_channels)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = 
self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu')\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(in_channels=out_channels,\n                                 out_channels=out_channels,\n                                 kernel_size=3,\n                                 stride=stride,\n                                 act='relu',\n                                 dilation=dilation)\n        self.conv2 = ConvBNLayer(in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None)\n\n        if not shortcut:\n            self.short = ConvBNLayer(in_channels=in_channels,\n                                     out_channels=out_channels * 4,\n                                     kernel_size=1,\n                                     stride=1,\n                                     is_vd_mode=False if if_first or stride == 1 else True)\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n\n        ####################################################################\n        # If given dilation rate > 1, using corresponding padding.\n        # The performance drops down without the follow padding.\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n        
#####################################################################\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    \"\"\"Basic residual block\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, stride: int, shortcut: bool = True, if_first: bool = False):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = ConvBNLayer(in_channels=in_channels,\n                                 out_channels=out_channels,\n                                 kernel_size=3,\n                                 stride=stride,\n                                 act='relu')\n        self.conv1 = ConvBNLayer(in_channels=out_channels, out_channels=out_channels, kernel_size=3, act=None)\n\n        if not shortcut:\n            self.short = ConvBNLayer(in_channels=in_channels,\n                                     out_channels=out_channels,\n                                     kernel_size=1,\n                                     stride=1,\n                                     is_vd_mode=False if if_first else True)\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to Tong He, et al. 
\"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    \"\"\"\n\n    def __init__(self,\n                 input_channels: int = 3,\n                 layers: int = 50,\n                 output_stride: int = 32,\n                 multi_grid: tuple = (1, 1, 1),\n                 pretrained: str = None):\n        super(ResNet_vd, self).__init__()\n\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024] if layers >= 50 else [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters] if layers >= 50 else num_filters\n        self.feat_channels = [64] + self.feat_channels\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(in_channels=input_channels, out_channels=32, kernel_size=3, stride=2, act='relu')\n        self.conv1_2 = ConvBNLayer(in_channels=32, out_channels=32, kernel_size=3, stride=1, act='relu')\n        self.conv1_3 = ConvBNLayer(in_channels=32, out_channels=64, kernel_size=3, stride=1, act='relu')\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n\n        # 
self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = dilation_dict[block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At the stage 4, expand the the dilation_rate if given multi_grid\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(in_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                                        out_channels=num_filters[block],\n                                        stride=2 if i == 0 and block != 0 and dilation_rate == 1 else 1,\n                                        shortcut=shortcut,\n                                        if_first=block == i == 0,\n                                        dilation=dilation_rate))\n\n                    block_list.append(bottleneck_block)\n                    
shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(in_channels=num_channels[block] if i == 0 else num_filters[block],\n                                   out_channels=num_filters[block],\n                                   stride=2 if i == 0 and block != 0 else 1,\n                                   shortcut=shortcut,\n                                   if_first=block == i == 0))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        feat_list = []\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        feat_list.append(y)\n\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model\n"
  },
  {
    "path": "modules/image/object_detection/README.md",
    "content": "## **更好的用户体验，建议参考WEB端官方文档 -> [【目标检测】](https://www.paddlepaddle.org.cn/hublist)**\n\n\n\n### 目标检测\n\n目标检测任务的目标是给定一张图像或是一个视频帧，让计算机找出其中所有目标的位置，并给出每个目标的具体类别。对于计算机而言，能够“看到”的是图像被编码之后的数字，但很难理解图像或是视频帧中出现了人或是物体这样的高层语义概念，也就更加难以定位目标出现在图像中哪个区域。\n\n- 精选推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [YOLOv3](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_coco2017&en_category=ObjectDetection) | 实现精度相比原作者**提高5.9个绝对百分点**，性能极致优化。 |\n| [行人检测](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_pedestrian&en_category=ObjectDetection) | 百度自研模型，海量私有数据集训练，可以应用于智能视频监控，人体行为分析，客流统计系统，智能交通等领域 |\n| [车辆检测](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_vehicles&en_category=ObjectDetection) | 百度自研模型，支持car (汽车)，truck (卡车)，bus (公交车)，motorbike (摩托车)，tricycle (三轮车)等车型的识别 |\n"
  },
  {
    "path": "modules/image/object_detection/README_en.md",
    "content": "## **For better user experience, refer to the Web official document -> [Object Detection](https://www.paddlepaddle.org.cn/hublist)**\n\n### Object Detection\n\nThe goal of an object detection task is to allow a computer to find the locations of all objects when an image or a video frame is given. In addition, the specific category of each object is specified. For the computer, what it \"sees\" are encoded digits. It is difficult for the computer to understand the high-level semantic concept - whether it is a person or an object in the image or video frame. It is even more difficult to locate where the object exists in the image.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [YOLOv3](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_coco2017&en_category=ObjectDetection) | The implementation accuracy is 5.9 absolute percentage points higher than the original author's, and the performance is highly optimized. |\n| [Pedestrian detection](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_pedestrian&en_category=ObjectDetection) | Baidu's self-developed model. It is trained on a massive private dataset and can be applied to intelligent video surveillance, human behavior analysis, passenger flow statistics systems, intelligent transportation and other fields. |\n| [Vehicle detection](https://www.paddlepaddle.org.cn/hubdetail?name=yolov3_darknet53_vehicles&en_category=ObjectDetection) | Baidu's self-developed model. It supports recognition of car, truck, bus, motorbike, tricycle and other vehicle types. |\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/README.md",
    "content": "# faster_rcnn_resnet50_coco2017\n\n|模型名称|faster_rcnn_resnet50_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|faster_rcnn|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|131MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n\n- ### 模型介绍\n\n  - Faster_RCNN是两阶段目标检测器，对图像生成候选区域、提取特征、判别特征类别并修正候选框位置。Faster_RCNN整体网络可以分为4部分，一是ResNet-50作为基础卷积层，二是区域生成网络，三是RoI Align，四是检测层。Faster_RCNN是在MS-COCO数据集上预训练的模型。目前仅提供预测功能。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run faster_rcnn_resnet50_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"faster_rcnn_resnet50_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n
                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m faster_rcnn_resnet50_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/faster_rcnn_resnet50_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n
    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.1.0\n\n  初始发布\n\n* 1.1.1\n\n  修复numpy数据读取问题\n\n* 1.2.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_coco2017==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/README_en.md",
    "content": "# faster_rcnn_resnet50_coco2017\n\n|Module Name|faster_rcnn_resnet50_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|faster_rcnn|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|131MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n\n- ### Module Introduction\n\n  - Faster_RCNN is a two-stage detector, it consists of feature extraction, proposal, classification and refinement processes. This module is trained on COCO2017 dataset, and can be used for object detection.\n\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run faster_rcnn_resnet50_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"faster_rcnn_resnet50_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detect positions of all objects in image\n\n    - **Parameters**\n\n      - paths (list[str]): image path;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold；<br/>\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** choose one parameter to provide data from paths and images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server 
Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m faster_rcnn_resnet50_coco2017\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/faster_rcnn_resnet50_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.1.0\n\n  First release\n\n* 1.1.1\n\n  Fix the problem of reading numpy\n\n* 1.2.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_coco2017==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['test_reader']\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (dict): key contains 'image' and 'im_info', the corresponding values is:\n            image (numpy.ndarray): the image to be fed into network\n            im_info (numpy.ndarray): the info about the preprocessed.\n    \"\"\"\n    img_list = list()\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im = im.astype(np.float32, copy=False)\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        target_size = 800\n        max_size = 1333\n\n        shape = im.shape\n        # im_shape holds the original shape of image.\n        im_shape = np.array([shape[0], shape[1], 1.0]).astype('float32')\n        im_size_min = np.min(shape[0:2])\n        im_size_max = np.max(shape[0:2])\n        im_scale = float(target_size) / float(im_size_min)\n        if np.round(im_scale * im_size_max) > max_size:\n            im_scale = float(max_size) / float(im_size_max)\n\n        resize_w = np.round(im_scale * float(shape[1]))\n        resize_h = 
np.round(im_scale * float(shape[0]))\n        # im_info holds the resize info of image.\n        im_info = np.array([resize_h, resize_w, im_scale]).astype('float32')\n\n        im = cv2.resize(\n            im,\n            None,\n            None,\n            fx=im_scale,\n            fy=im_scale,\n            interpolation=cv2.INTER_LINEAR)\n\n        # HWC --> CHW\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n        yield {'image': im, 'im_info': im_info, 'im_shape': im_shape}\n\n\ndef padding_minibatch(batch_data, coarsest_stride=0, use_padded_im_info=True):\n    max_shape_org = np.array(\n        [data['image'].shape for data in batch_data]).max(axis=0)\n    if coarsest_stride > 0:\n        max_shape = np.zeros((3)).astype('int32')\n        max_shape[1] = int(\n            np.ceil(max_shape_org[1] / coarsest_stride) * coarsest_stride)\n        max_shape[2] = int(\n            np.ceil(max_shape_org[2] / coarsest_stride) * coarsest_stride)\n    else:\n        max_shape = max_shape_org.astype('int32')\n\n    padding_image = list()\n    padding_info = list()\n    padding_shape = list()\n\n    for data in batch_data:\n        im_c, im_h, im_w = data['image'].shape\n        # image\n        padding_im = np.zeros((im_c, max_shape[1], max_shape[2]),\n                              dtype=np.float32)\n        padding_im[:, 0:im_h, 0:im_w] = data['image']\n        padding_image.append(padding_im)\n        # im_info\n        data['im_info'][\n            0] = max_shape[1] if use_padded_im_info else max_shape_org[1]\n        data['im_info'][\n            1] = max_shape[2] if use_padded_im_info else max_shape_org[2]\n        padding_info.append(data['im_info'])\n        padding_shape.append(data['im_shape'])\n\n    padding_image = np.array(padding_image).astype('float32')\n    padding_info = np.array(padding_info).astype('float32')\n    padding_shape = np.array(padding_shape).astype('float32')\n    return padding_image, padding_info, 
padding_shape\n"
  },
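The resize rule in `data_feed.test_reader` above scales the short side of the image toward `target_size=800` while capping the long side at `max_size=1333`. A minimal sketch of that arithmetic, isolated into a standalone helper (the function name `compute_im_scale` is chosen here for illustration and is not part of the module):

```python
import numpy as np


def compute_im_scale(height, width, target_size=800, max_size=1333):
    # Scale so the short side reaches target_size ...
    im_size_min = min(height, width)
    im_size_max = max(height, width)
    im_scale = float(target_size) / float(im_size_min)
    # ... unless that would push the long side past max_size,
    # in which case scale the long side to exactly max_size.
    if np.round(im_scale * im_size_max) > max_size:
        im_scale = float(max_size) / float(im_size_max)
    return im_scale
```

For a 600x800 image the short-side rule wins (scale 800/600); for a 600x1200 image the long-side cap takes over (scale 1333/1200), matching the branches in `test_reader`.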
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/label_file.txt",
    "content": "background\nperson\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport ast\nimport argparse\nfrom math import ceil\n\nimport paddle\nimport numpy as np\nimport paddle.static\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.utils.parser import txt_parser\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import test_reader, padding_minibatch\n\n@moduleinfo(\n    name=\"faster_rcnn_resnet50_coco2017\",\n    version=\"1.2.0\",\n    type=\"cv/object_detection\",\n    summary=\n    \"Baidu's Faster R-CNN model for object detection with backbone ResNet50, trained with dataset COCO2017\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass FasterRCNNResNet50:\n    def __init__(self):\n        # default pretrained model, Faster-RCNN with backbone ResNet50, shape of input tensor is [3, 800, 1333]\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"faster_rcnn_resnet50_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            
gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         use_gpu=False,\n                         batch_size=1,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n        paths = paths if paths else list()\n\n        all_images = list()\n        for yield_return in test_reader(paths, images):\n            all_images.append(yield_return)\n\n        images_num = len(all_images)\n        loop_num = ceil(images_num / batch_size)\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except:\n                    pass\n\n            padding_image, padding_info, padding_shape = padding_minibatch(\n                batch_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n\n            
feed_list = [\n                padding_image, padding_info, padding_shape\n            ]\n\n            input_names = predictor.get_input_names()\n            \n            for i, input_name in enumerate(input_names):\n                data = np.asarray(feed_list[i], dtype=np.float32)\n                handle = predictor.get_input_handle(input_name)\n                handle.copy_from_cpu(data)\n            \n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(\n                paths=paths,\n                images=images,\n                data_out=output_handle,\n                score_thresh=score_thresh,\n                label_names=self.label_names,\n                output_dir=output_dir,\n                handle_id=handle_id,\n                visualization=visualization)\n            res += output\n        return res\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, default=None, help=\"input data\")\n\n        self.arg_input_group.add_argument(\n            '--input_file',\n            type=str,\n            default=None,\n            help=\"file contain input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif 
args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s does not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\n                        \"File %s does not exist.\" % image_path)\n        return self.object_detection(\n            paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
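Both `_set_config` and `object_detection` in `module.py` decide whether a GPU predictor can be used by probing `CUDA_VISIBLE_DEVICES`. That check can be sketched as a standalone function (the name `gpu_available_from_env` is hypothetical, introduced only for this sketch):

```python
import os


def gpu_available_from_env(env=None):
    # module.py treats the GPU as usable only when CUDA_VISIBLE_DEVICES is set
    # and its first character parses as an integer (i.e. a device id like "0").
    env = os.environ if env is None else env
    try:
        places = env["CUDA_VISIBLE_DEVICES"]
        int(places[0])
        return True
    except (KeyError, IndexError, ValueError):
        return False
```

An unset or empty variable, or a value such as `-1` whose first character is not a digit, falls through to the CPU path, which is why the README insists the variable be set before serving with GPU.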
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = [\n    'base64_to_cv2',\n    'load_label_info',\n    'postprocess',\n]\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, 
textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): the path of images.\n        images (list(numpy.ndarray)):  list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the score threshold below which boxes are discarded.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 
'image_numpy_{}'.format(\n                        (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"faster_rcnn_resnet50_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n    
    self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/README.md",
    "content": "# faster_rcnn_resnet50_fpn_coco2017\n\n|模型名称|faster_rcnn_resnet50_fpn_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|faster_rcnn|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|161MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - Faster_RCNN是两阶段目标检测器，对图像生成候选区域、提取特征、判别特征类别并修正候选框位置。Faster_RCNN整体网络可以分为4个部分，一是ResNet-50作为基础卷积层，二是区域生成网络，三是Rol Align，四是检测层。Faster_RCNN是在MS-COCO数据集上预训练的模型。目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run faster_rcnn_resnet50_fpn_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"faster_rcnn_resnet50_fpn_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection((paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         
score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m faster_rcnn_resnet50_fpn_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/faster_rcnn_resnet50_fpn_coco2017\"\n    r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/README_en.md",
    "content": "# faster_rcnn_resnet50_fpn_coco2017\n\n|Module Name|faster_rcnn_resnet50_fpn_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|faster_rcnn|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|161MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - Faster_RCNN is a two-stage detector, it consists of feature extraction, proposal, classification and refinement processes. This module is trained on COCO2017 dataset, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run faster_rcnn_resnet50_fpn_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"faster_rcnn_resnet50_fpn_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image paths;\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\_thresh (float): confidence threshold;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide data through either paths or images, not both\n\n    - **Return**\n\n      - res (list\[dict\]): results\n        - data (list): detection results, each element in the list is a dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\_path (str, optional): output path for saving results\n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to the specified path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m faster_rcnn_resnet50_fpn_coco2017\n    ```\n\n  - The object detection service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/faster_rcnn_resnet50_fpn_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fix the problem of reading numpy\n\n* 1.1.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['test_reader']\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (dict): key contains 'image', 'im_info', 'im_shape', the corresponding values is:\n            image (numpy.ndarray): the image to be fed into network\n            im_info (numpy.ndarray): the info about the preprocessed.\n            im_shape (numpy.ndarray): the shape of image.\n    \"\"\"\n    img_list = list()\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im = im.astype(np.float32, copy=False)\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        target_size = 800\n        max_size = 1333\n\n        shape = im.shape\n        # im_shape holds the original shape of image.\n        im_shape = np.array([shape[0], shape[1], 1.0]).astype('float32')\n        im_size_min = np.min(shape[0:2])\n        im_size_max = np.max(shape[0:2])\n        im_scale = float(target_size) / float(im_size_min)\n        if np.round(im_scale * im_size_max) > max_size:\n            im_scale = float(max_size) / float(im_size_max)\n\n        resize_w = 
np.round(im_scale * float(shape[1]))\n        resize_h = np.round(im_scale * float(shape[0]))\n        # im_info holds the resize info of image.\n        im_info = np.array([resize_h, resize_w, im_scale]).astype('float32')\n\n        im = cv2.resize(\n            im,\n            None,\n            None,\n            fx=im_scale,\n            fy=im_scale,\n            interpolation=cv2.INTER_LINEAR)\n\n        # HWC --> CHW\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n        yield {'image': im, 'im_info': im_info, 'im_shape': im_shape}\n\n\ndef padding_minibatch(batch_data, coarsest_stride=0, use_padded_im_info=True):\n    max_shape_org = np.array(\n        [data['image'].shape for data in batch_data]).max(axis=0)\n    if coarsest_stride > 0:\n        max_shape = np.zeros((3)).astype('int32')\n        max_shape[1] = int(\n            np.ceil(max_shape_org[1] / coarsest_stride) * coarsest_stride)\n        max_shape[2] = int(\n            np.ceil(max_shape_org[2] / coarsest_stride) * coarsest_stride)\n    else:\n        max_shape = max_shape_org.astype('int32')\n\n    padding_image = list()\n    padding_info = list()\n    padding_shape = list()\n\n    for data in batch_data:\n        im_c, im_h, im_w = data['image'].shape\n        # image\n        padding_im = np.zeros((im_c, max_shape[1], max_shape[2]),\n                              dtype=np.float32)\n        padding_im[:, 0:im_h, 0:im_w] = data['image']\n        padding_image.append(padding_im)\n        # im_info\n        data['im_info'][\n            0] = max_shape[1] if use_padded_im_info else max_shape_org[1]\n        data['im_info'][\n            1] = max_shape[2] if use_padded_im_info else max_shape_org[2]\n        padding_info.append(data['im_info'])\n        padding_shape.append(data['im_shape'])\n\n    padding_image = np.array(padding_image).astype('float32')\n    padding_info = np.array(padding_info).astype('float32')\n    padding_shape = 
np.array(padding_shape).astype('float32')\n    return padding_image, padding_info, padding_shape\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/label_file.txt",
    "content": "background\nperson\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport ast\nimport argparse\nfrom math import ceil\n\nimport paddle\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.utils.parser import txt_parser\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import test_reader, padding_minibatch\n\n\n@moduleinfo(\n    name=\"faster_rcnn_resnet50_fpn_coco2017\",\n    version=\"1.1.0\",\n    type=\"cv/object_detection\",\n    summary=\n    \"Baidu's Faster-RCNN model for object detection, whose backbone is ResNet50, processed with Feature Pyramid Networks\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass FasterRCNNResNet50RPN:\n    def __init__(self):\n        # default pretrained model, Faster-RCNN with backbone ResNet50, shape of input tensor is [3, 800, 1333]\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"faster_rcnn_resnet50_fpn_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, 
params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         use_gpu=False,\n                         batch_size=1,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): image data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save the output images or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n\n        all_images = list()\n        for yield_data in test_reader(paths, images):\n            all_images.append(yield_data)\n\n        images_num = len(all_images)\n        loop_num = ceil(images_num / batch_size)\n        res = []\n\n        for iter_id in range(loop_num):\n            batch_data = []\n            handle_id = iter_id * batch_size\n\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_images[handle_id + image_id])\n                except:\n                    pass\n\n            padding_image, padding_info, padding_shape = padding_minibatch(\n                batch_data, coarsest_stride=32, use_padded_im_info=True)\n            feed_list = [\n                
padding_image, padding_info, padding_shape\n            ]\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n\n            input_names = predictor.get_input_names()\n\n            for i, input_name in enumerate(input_names):\n                data = np.asarray(feed_list[i], dtype=np.float32)\n                handle = predictor.get_input_handle(input_name)\n                handle.copy_from_cpu(data)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(\n                paths=paths,\n                images=images,\n                data_out=output_handle,\n                score_thresh=score_thresh,\n                label_names=self.label_names,\n                output_dir=output_dir,\n                handle_id=handle_id,\n                visualization=visualization)\n            res += output\n        return res\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, default=None, help=\"input data\")\n\n        self.arg_input_group.add_argument(\n            '--input_file',\n            type=str,\n            default=None,\n            help=\"file containing input data\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_path:\n            input_data = [args.input_path]\n        elif args.input_file:\n            if not os.path.exists(args.input_file):\n                raise RuntimeError(\"File %s does not exist.\" % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        return input_data\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {}\".format(self.name),\n            prog=\"hub run {}\".format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        input_data = self.check_input_data(args)\n        if len(input_data) == 0:\n            self.parser.print_help()\n            exit(1)\n        else:\n            for image_path in input_data:\n                if not os.path.exists(image_path):\n                    raise RuntimeError(\n                        \"File %s does not exist.\" % image_path)\n        return self.object_detection(\n            paths=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = [\n    'base64_to_cv2',\n    'load_label_info',\n    'postprocess',\n]\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height 
= draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): the path of images.\n        images (list(numpy.ndarray)):  list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the low limit of bounding box.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of vehicles detecion. 
keys include 'data', 'save_path', the corresponding values are:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding values are:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    # Bind defaults so these names exist even when neither branch below assigns them.\n    unhandled_paths = None\n    unhandled_paths_num = 0\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 
'image_numpy_{}'.format(\n                        (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"faster_rcnn_resnet50_fpn_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 
800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/README.md",
    "content": "# faster_rcnn_resnet50_fpn_venus\n\n|模型名称|faster_rcnn_resnet50_fpn_venus|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|faster_rcnn|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|317MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - Faster_RCNN是两阶段目标检测器，对图像生成候选区域、提取特征、判别特征类别并修正候选框位置。Faster_RCNN整体网络可以分为4个部分，一是ResNet-50作为基础卷积层，二是区域生成网络，三是Rol Align，四是检测层。该PaddleHub Module是由800+tag,170w图片，1000w+检测框训练的大规模通用检测模型，在8个数据集上MAP平均提升2.06%，iou=0.5的准确率平均提升1.78%。对比于其他通用检测模型，使用该Module进行finetune，可以更快收敛，达到较优效果。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_venus\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、API\n\n  - ```python\n    def context(num_classes=81,\n                trainable=True,\n                pretrained=True,\n                phase='train')\n    ```\n\n    - 提取特征，用于迁移学习。\n\n    - **参数**\n      - num\\_classes (int): 类别数；<br/>\n      - trainable (bool): 参数是否可训练；<br/>\n      - pretrained (bool): 是否加载预训练模型；<br/>\n      - get\\_prediction (bool): 可选值为 'train'/'predict'，'train' 用于训练，'predict' 用于预测。\n\n    - **返回**\n      - inputs (dict): 模型的输入，相应的取值为：\n        当phase为'train'时，包含：\n          - image (Variable): 图像变量\n          - im\\_size (Variable): 图像的尺寸\n          - im\\_info (Variable): 图像缩放信息\n          - gt\\_class (Variable): 检测框类别\n          - gt\\_box (Variable): 检测框坐标\n          - is\\_crowd (Variable): 单个框内是否包含多个物体\n        当 phase 为 'predict'时，包含：\n          - image (Variable): 图像变量\n          - im\\_size (Variable): 图像的尺寸\n          - im\\_info (Variable): 图像缩放信息\n      - outputs (dict): 模型的输出，响应的取值为：\n        当 phase 为 
'train'时，包含：\n          - head_features (Variable): 所提取的特征\n          - rpn\\_cls\\_loss (Variable): 检测框分类损失\n          - rpn\\_reg\\_loss (Variable): 检测框回归损失\n          - generate\\_proposal\\_labels (Variable): 候选框标签\n        当 phase 为 'predict'时，包含：\n          - head_features (Variable): 所提取的特征\n          - rois (Variable): 提取的roi\n          - bbox\\_out (Variable): 预测结果\n      - context\\_prog (Program): 用于迁移学习的 Program\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 保存模型的目录名称； <br/>\n      - model\\_filename: 模型文件名称，默认为\\_\\_model\\_\\_； <br/>\n      - params\\_filename: 参数文件名称，默认为\\_\\_params\\_\\_(仅当`combined`为True时生效)；<br/>\n      - combined: 是否将参数保存到统一的一个文件中。\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_venus==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/README_en.md",
    "content": "# faster_rcnn_resnet50_fpn_venus\n\n|Module Name|faster_rcnn_resnet50_fpn_venus|\n| :--- | :---: |\n|Category|object detection|\n|Network|faster_rcnn|\n|Dataset|Baidu Detection Dataset|\n|Fine-tuning supported or not|Yes|\n|Module Size|317MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Module Introduction\n\n  - Faster_RCNN is a two-stage detector, it consists of feature extraction, proposal, classification and refinement processes. This module is trained on Baidu Detection Dataset, which contains 170w pictures and 1000w+ boxes, and improve the accuracy on 8 test datasets with average 2.06%. Besides, this module supports to fine-tune model, and can achieve faster convergence and better performance.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_venus\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、API\n\n  - ```python\n    def context(num_classes=81,\n                trainable=True,\n                pretrained=True,\n                phase='train')\n    ```\n\n    - Extract features, and do transfer learning\n\n    - **Parameters**\n      - num\\_classes (int): number of classes；<br/>\n      - trainable (bool): whether parameters trainable or not；<br/>\n      - pretrained (bool): whether load pretrained model or not\n      - get\\_prediction (bool): optional, 'train' or 'predict'，'train' is used for training，'predict' used for prediction.\n\n    - **Return**\n      - inputs 
(dict): model inputs, a dict:\n        if phase is 'train', keys are:\n          - image (Variable): image variable\n          - im\\_size (Variable): image size\n          - im\\_info (Variable): image information\n          - gt\\_class (Variable): box class\n          - gt\\_box (Variable): box coordinates\n          - is\\_crowd (Variable): whether the box contains multiple objects\n        if phase is 'predict', keys are:\n          - image (Variable): image variable\n          - im\\_size (Variable): image size\n          - im\\_info (Variable): image information\n      - outputs (dict): model outputs, a dict:\n        if phase is 'train', keys are:\n          - head_features (Variable): features extracted\n          - rpn\\_cls\\_loss (Variable): box classification loss\n          - rpn\\_reg\\_loss (Variable): box regression loss\n          - generate\\_proposal\\_labels (Variable): proposal labels\n        if phase is 'predict', keys are:\n          - head_features (Variable): features extracted\n          - rois (Variable): RoIs extracted\n          - bbox\\_out (Variable): prediction results\n      - context\\_prog (Program): program for transfer learning\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - Save the model to a specified path\n\n    - **Parameters**\n\n      - dirname: output directory for saving the model\n      - model\\_filename: filename for saving the model\n      - params\\_filename: filename for saving the parameters\n      - combined: whether to save the parameters into one file\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install faster_rcnn_resnet50_fpn_venus==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/bbox_assigner.py",
    "content": "class BBoxAssigner(object):\n    # __op__ = fluid.layers.generate_proposal_labels\n    def __init__(self,\n                 batch_size_per_im=512,\n                 fg_fraction=.25,\n                 fg_thresh=.5,\n                 bg_thresh_hi=.5,\n                 bg_thresh_lo=0.,\n                 bbox_reg_weights=[0.1, 0.1, 0.2, 0.2],\n                 class_nums=81,\n                 shuffle_before_sample=True):\n        super(BBoxAssigner, self).__init__()\n        self.batch_size_per_im = batch_size_per_im\n        self.fg_fraction = fg_fraction\n        self.fg_thresh = fg_thresh\n        self.bg_thresh_hi = bg_thresh_hi\n        self.bg_thresh_lo = bg_thresh_lo\n        self.bbox_reg_weights = bbox_reg_weights\n        self.class_nums = class_nums\n        self.use_random = shuffle_before_sample\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/bbox_head.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom collections import OrderedDict\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.initializer import Normal, Xavier\nfrom paddle.fluid.regularizer import L2Decay\nfrom paddle.fluid.initializer import MSRA\n\n\nclass MultiClassNMS(object):\n    # __op__ = fluid.layers.multiclass_nms\n    def __init__(self,\n                 score_threshold=.05,\n                 nms_top_k=-1,\n                 keep_top_k=100,\n                 nms_threshold=.5,\n                 normalized=False,\n                 nms_eta=1.0,\n                 background_label=0):\n        super(MultiClassNMS, self).__init__()\n        self.score_threshold = score_threshold\n        self.nms_top_k = nms_top_k\n        self.keep_top_k = keep_top_k\n        self.nms_threshold = nms_threshold\n        self.normalized = normalized\n        self.nms_eta = nms_eta\n        self.background_label = background_label\n\n\nclass SmoothL1Loss(object):\n    '''\n    Smooth L1 loss\n    Args:\n        sigma (float): hyper param in smooth l1 loss\n    '''\n\n    def __init__(self, sigma=1.0):\n        super(SmoothL1Loss, self).__init__()\n        self.sigma = sigma\n\n    def __call__(self, x, y, inside_weight=None, outside_weight=None):\n        return fluid.layers.smooth_l1(\n            x, y, inside_weight=inside_weight, outside_weight=outside_weight, sigma=self.sigma)\n\n\nclass BoxCoder(object):\n    def __init__(self, prior_box_var=[0.1, 0.1, 0.2, 0.2], code_type='decode_center_size', box_normalized=False,\n                 axis=1):\n        super(BoxCoder, self).__init__()\n        self.prior_box_var = prior_box_var\n        self.code_type = code_type\n        self.box_normalized = box_normalized\n        self.axis = axis\n\n\nclass TwoFCHead(object):\n    \"\"\"\n    RCNN head with two Fully Connected layers\n\n    
Args:\n        mlp_dim (int): num of filters for the fc layers\n    \"\"\"\n\n    def __init__(self, mlp_dim=1024):\n        super(TwoFCHead, self).__init__()\n        self.mlp_dim = mlp_dim\n\n    def __call__(self, roi_feat):\n        fan = roi_feat.shape[1] * roi_feat.shape[2] * roi_feat.shape[3]\n\n        fc6 = fluid.layers.fc(\n            input=roi_feat,\n            size=self.mlp_dim,\n            act='relu',\n            name='fc6',\n            param_attr=ParamAttr(name='fc6_w', initializer=Xavier(fan_out=fan)),\n            bias_attr=ParamAttr(name='fc6_b', learning_rate=2., regularizer=L2Decay(0.)))\n        head_feat = fluid.layers.fc(\n            input=fc6,\n            size=self.mlp_dim,\n            act='relu',\n            name='fc7',\n            param_attr=ParamAttr(name='fc7_w', initializer=Xavier()),\n            bias_attr=ParamAttr(name='fc7_b', learning_rate=2., regularizer=L2Decay(0.)))\n\n        return head_feat\n\n\nclass BBoxHead(object):\n    \"\"\"\n    RCNN bbox head\n\n    Args:\n        head (object): the head module instance, e.g., `ResNetC5`, `TwoFCHead`\n        box_coder (object): `BoxCoder` instance\n        nms (object): `MultiClassNMS` instance\n        num_classes: number of output classes\n    \"\"\"\n    __inject__ = ['head', 'box_coder', 'nms', 'bbox_loss']\n    __shared__ = ['num_classes']\n\n    def __init__(self, head, box_coder=BoxCoder(), nms=MultiClassNMS(), bbox_loss=SmoothL1Loss(), num_classes=81):\n        super(BBoxHead, self).__init__()\n        self.head = head\n        self.num_classes = num_classes\n        self.box_coder = box_coder\n        self.nms = nms\n        self.bbox_loss = bbox_loss\n        self.head_feat = None\n\n    def get_head_feat(self, input=None):\n        \"\"\"\n        Get the bbox head feature map.\n        \"\"\"\n\n        if input is not None:\n            feat = self.head(input)\n            if isinstance(feat, OrderedDict):\n                feat = list(feat.values())[0]\n         
   self.head_feat = feat\n        return self.head_feat\n\n    def _get_output(self, roi_feat):\n        \"\"\"\n        Get bbox head output.\n\n        Args:\n            roi_feat (Variable): RoI feature from RoIExtractor.\n\n        Returns:\n            cls_score(Variable): Output of rpn head with shape of\n                [N, num_anchors, H, W].\n            bbox_pred(Variable): Output of rpn head with shape of\n                [N, num_anchors * 4, H, W].\n        \"\"\"\n        head_feat = self.get_head_feat(roi_feat)\n        # when ResNetC5 output a single feature map\n        if not isinstance(self.head, TwoFCHead):\n            head_feat = fluid.layers.pool2d(head_feat, pool_type='avg', global_pooling=True)\n        cls_score = fluid.layers.fc(\n            input=head_feat,\n            size=self.num_classes,\n            act=None,\n            name='cls_score',\n            param_attr=ParamAttr(name='cls_score_w', initializer=Normal(loc=0.0, scale=0.01)),\n            bias_attr=ParamAttr(name='cls_score_b', learning_rate=2., regularizer=L2Decay(0.)))\n        bbox_pred = fluid.layers.fc(\n            input=head_feat,\n            size=4 * self.num_classes,\n            act=None,\n            name='bbox_pred',\n            param_attr=ParamAttr(name='bbox_pred_w', initializer=Normal(loc=0.0, scale=0.001)),\n            bias_attr=ParamAttr(name='bbox_pred_b', learning_rate=2., regularizer=L2Decay(0.)))\n        return cls_score, bbox_pred\n\n    def get_loss(self, roi_feat, labels_int32, bbox_targets, bbox_inside_weights, bbox_outside_weights):\n        \"\"\"\n        Get bbox_head loss.\n\n        Args:\n            roi_feat (Variable): RoI feature from RoIExtractor.\n            labels_int32(Variable): Class label of a RoI with shape [P, 1].\n                P is the number of RoI.\n            bbox_targets(Variable): Box label of a RoI with shape\n                [P, 4 * class_nums].\n            bbox_inside_weights(Variable): Indicates whether a box 
should\n                contribute to loss. Same shape as bbox_targets.\n            bbox_outside_weights(Variable): Indicates whether a box should\n                contribute to loss. Same shape as bbox_targets.\n\n        Return:\n            Type: Dict\n                loss_cls(Variable): bbox_head loss.\n                loss_bbox(Variable): bbox_head loss.\n        \"\"\"\n\n        cls_score, bbox_pred = self._get_output(roi_feat)\n\n        labels_int64 = fluid.layers.cast(x=labels_int32, dtype='int64')\n        labels_int64.stop_gradient = True\n        loss_cls = fluid.layers.softmax_with_cross_entropy(\n            logits=cls_score, label=labels_int64, numeric_stable_mode=True)\n        loss_cls = fluid.layers.reduce_mean(loss_cls)\n        loss_bbox = self.bbox_loss(\n            x=bbox_pred, y=bbox_targets, inside_weight=bbox_inside_weights, outside_weight=bbox_outside_weights)\n        loss_bbox = fluid.layers.reduce_mean(loss_bbox)\n        return {'loss_cls': loss_cls, 'loss_bbox': loss_bbox}\n\n    def get_prediction(self, roi_feat, rois, im_info, im_shape, return_box_score=False):\n        \"\"\"\n        Get prediction bounding box in test stage.\n\n        Args:\n            roi_feat (Variable): RoI feature from RoIExtractor.\n            rois (Variable): Output of generate_proposals in rpn head.\n            im_info (Variable): A 2-D LoDTensor with shape [B, 3]. B is the\n                number of input images, each element consists of im_height,\n                im_width, im_scale.\n            im_shape (Variable): Actual shape of original image with shape\n                [B, 3]. B is the number of images, each element consists of\n                original_height, original_width, 1\n\n        Returns:\n            pred_result(Variable): Prediction result with shape [N, 6]. 
Each\n                row has 6 values: [label, confidence, xmin, ymin, xmax, ymax].\n                N is the total number of prediction.\n        \"\"\"\n        cls_score, bbox_pred = self._get_output(roi_feat)\n\n        im_scale = fluid.layers.slice(im_info, [1], starts=[2], ends=[3])\n        im_scale = fluid.layers.sequence_expand(im_scale, rois)\n        boxes = rois / im_scale\n        cls_prob = fluid.layers.softmax(cls_score, use_cudnn=False)\n        bbox_pred = fluid.layers.reshape(bbox_pred, (-1, self.num_classes, 4))\n        # self.box_coder\n        decoded_box = fluid.layers.box_coder(\n            prior_box=boxes,\n            target_box=bbox_pred,\n            prior_box_var=self.box_coder.prior_box_var,\n            code_type=self.box_coder.code_type,\n            box_normalized=self.box_coder.box_normalized,\n            axis=self.box_coder.axis)\n        cliped_box = fluid.layers.box_clip(input=decoded_box, im_info=im_shape)\n        if return_box_score:\n            return {'bbox': cliped_box, 'score': cls_prob}\n        # self.nms\n        pred_result = fluid.layers.multiclass_nms(\n            bboxes=cliped_box,\n            scores=cls_prob,\n            score_threshold=self.nms.score_threshold,\n            nms_top_k=self.nms.nms_top_k,\n            keep_top_k=self.nms.keep_top_k,\n            nms_threshold=self.nms.nms_threshold,\n            normalized=self.nms.normalized,\n            nms_eta=self.nms.nms_eta,\n            background_label=self.nms.background_label)\n        return pred_result\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageEnhance\nfrom paddle import fluid\n\n__all__ = ['test_reader']\n\n\ndef test_reader(paths=None, images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (dict): key contains 'image', 'im_info', 'im_shape', the corresponding values is:\n            image (numpy.ndarray): the image to be fed into network\n            im_info (numpy.ndarray): the info about the preprocessed.\n            im_shape (numpy.ndarray): the shape of image.\n    \"\"\"\n    img_list = list()\n    if paths:\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n        im = im.astype(np.float32, copy=False)\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        target_size = 800\n        max_size = 1333\n\n        shape = im.shape\n        # im_shape holds the original shape of image.\n        im_shape = np.array([shape[0], shape[1], 1.0]).astype('float32')\n        im_size_min = np.min(shape[0:2])\n        im_size_max = np.max(shape[0:2])\n        im_scale = float(target_size) / float(im_size_min)\n        if np.round(im_scale * im_size_max) > 
max_size:\n            im_scale = float(max_size) / float(im_size_max)\n\n        resize_w = np.round(im_scale * float(shape[1]))\n        resize_h = np.round(im_scale * float(shape[0]))\n        # im_info holds the resize info of image.\n        im_info = np.array([resize_h, resize_w, im_scale]).astype('float32')\n\n        im = cv2.resize(im, None, None, fx=im_scale, fy=im_scale, interpolation=cv2.INTER_LINEAR)\n\n        # HWC --> CHW\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n        yield {'image': im, 'im_info': im_info, 'im_shape': im_shape}\n\n\ndef padding_minibatch(batch_data, coarsest_stride=0, use_padded_im_info=True):\n    max_shape_org = np.array([data['image'].shape for data in batch_data]).max(axis=0)\n    if coarsest_stride > 0:\n        max_shape = np.zeros((3)).astype('int32')\n        max_shape[1] = int(np.ceil(max_shape_org[1] / coarsest_stride) * coarsest_stride)\n        max_shape[2] = int(np.ceil(max_shape_org[2] / coarsest_stride) * coarsest_stride)\n    else:\n        max_shape = max_shape_org.astype('int32')\n\n    padding_image = list()\n    padding_info = list()\n    padding_shape = list()\n\n    for data in batch_data:\n        im_c, im_h, im_w = data['image'].shape\n        # image\n        padding_im = np.zeros((im_c, max_shape[1], max_shape[2]), dtype=np.float32)\n        padding_im[:, 0:im_h, 0:im_w] = data['image']\n        padding_image.append(padding_im)\n        # im_info\n        data['im_info'][0] = max_shape[1] if use_padded_im_info else max_shape_org[1]\n        data['im_info'][1] = max_shape[2] if use_padded_im_info else max_shape_org[2]\n        padding_info.append(data['im_info'])\n        padding_shape.append(data['im_shape'])\n\n    padding_image = np.array(padding_image).astype('float32')\n    padding_info = np.array(padding_info).astype('float32')\n    padding_shape = np.array(padding_shape).astype('float32')\n    return padding_image, padding_info, padding_shape\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/fpn.py",
    "content": "# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport copy\nfrom collections import OrderedDict\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.initializer import Xavier\nfrom paddle.fluid.regularizer import L2Decay\n\n__all__ = ['ConvNorm', 'FPN']\n\n\ndef ConvNorm(input,\n             num_filters,\n             filter_size,\n             stride=1,\n             groups=1,\n             norm_decay=0.,\n             norm_type='affine_channel',\n             norm_groups=32,\n             dilation=1,\n             lr_scale=1,\n             freeze_norm=False,\n             act=None,\n             norm_name=None,\n             initializer=None,\n             name=None):\n    fan = num_filters\n    conv = fluid.layers.conv2d(\n        input=input,\n        num_filters=num_filters,\n        filter_size=filter_size,\n        stride=stride,\n        padding=((filter_size - 1) // 2) * dilation,\n        dilation=dilation,\n        groups=groups,\n        act=None,\n        param_attr=ParamAttr(name=name + \"_weights\", initializer=initializer, learning_rate=lr_scale),\n        bias_attr=False,\n        name=name + '.conv2d.output.1')\n\n    norm_lr = 0. 
if freeze_norm else 1.\n    pattr = ParamAttr(name=norm_name + '_scale', learning_rate=norm_lr * lr_scale, regularizer=L2Decay(norm_decay))\n    battr = ParamAttr(name=norm_name + '_offset', learning_rate=norm_lr * lr_scale, regularizer=L2Decay(norm_decay))\n\n    if norm_type in ['bn', 'sync_bn']:\n        global_stats = True if freeze_norm else False\n        out = fluid.layers.batch_norm(\n            input=conv,\n            act=act,\n            name=norm_name + '.output.1',\n            param_attr=pattr,\n            bias_attr=battr,\n            moving_mean_name=norm_name + '_mean',\n            moving_variance_name=norm_name + '_variance',\n            use_global_stats=global_stats)\n        scale = fluid.framework._get_var(pattr.name)\n        bias = fluid.framework._get_var(battr.name)\n    elif norm_type == 'gn':\n        out = fluid.layers.group_norm(\n            input=conv, act=act, name=norm_name + '.output.1', groups=norm_groups, param_attr=pattr, bias_attr=battr)\n        scale = fluid.framework._get_var(pattr.name)\n        bias = fluid.framework._get_var(battr.name)\n    elif norm_type == 'affine_channel':\n        scale = fluid.layers.create_parameter(\n            shape=[conv.shape[1]], dtype=conv.dtype, attr=pattr, default_initializer=fluid.initializer.Constant(1.))\n        bias = fluid.layers.create_parameter(\n            shape=[conv.shape[1]], dtype=conv.dtype, attr=battr, default_initializer=fluid.initializer.Constant(0.))\n        out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias, act=act)\n    if freeze_norm:\n        scale.stop_gradient = True\n        bias.stop_gradient = True\n    return out\n\n\nclass FPN(object):\n    \"\"\"\n    Feature Pyramid Network, see https://arxiv.org/abs/1612.03144\n\n    Args:\n        num_chan (int): number of feature channels\n        min_level (int): lowest level of the backbone feature map to use\n        max_level (int): highest level of the backbone feature map to use\n        
spatial_scale (list): feature map scaling factor\n        has_extra_convs (bool): whether to add extra convolutions in higher levels\n        norm_type (str|None): normalization type, 'bn'/'sync_bn'/'affine_channel'\n    \"\"\"\n    __shared__ = ['norm_type', 'freeze_norm']\n\n    def __init__(self,\n                 num_chan=256,\n                 min_level=2,\n                 max_level=6,\n                 spatial_scale=[1. / 32., 1. / 16., 1. / 8., 1. / 4.],\n                 has_extra_convs=False,\n                 norm_type=None,\n                 freeze_norm=False):\n        self.freeze_norm = freeze_norm\n        self.num_chan = num_chan\n        self.min_level = min_level\n        self.max_level = max_level\n        self.spatial_scale = spatial_scale\n        self.has_extra_convs = has_extra_convs\n        self.norm_type = norm_type\n\n    def _add_topdown_lateral(self, body_name, body_input, upper_output):\n        lateral_name = 'fpn_inner_' + body_name + '_lateral'\n        topdown_name = 'fpn_topdown_' + body_name\n        fan = body_input.shape[1]\n        if self.norm_type:\n            initializer = Xavier(fan_out=fan)\n            lateral = ConvNorm(\n                body_input,\n                self.num_chan,\n                1,\n                initializer=initializer,\n                norm_type=self.norm_type,\n                freeze_norm=self.freeze_norm,\n                name=lateral_name,\n                norm_name=lateral_name)\n        else:\n            lateral = fluid.layers.conv2d(\n                body_input,\n                self.num_chan,\n                1,\n                param_attr=ParamAttr(name=lateral_name + \"_w\", initializer=Xavier(fan_out=fan)),\n                bias_attr=ParamAttr(name=lateral_name + \"_b\", learning_rate=2., regularizer=L2Decay(0.)),\n                name=lateral_name)\n        topdown = fluid.layers.resize_nearest(upper_output, scale=2., name=topdown_name)\n        return lateral + topdown\n\n    def 
get_output(self, body_dict):\n        \"\"\"\n        Add FPN onto the backbone.\n\n        Args:\n            body_dict(OrderedDict): Dictionary of variables, where each element is\n                an output of the backbone.\n\n        Returns:\n            fpn_dict(OrderedDict): A dictionary mapping names to the FPN outputs.\n            spatial_scale(list): A list of multiplicative spatial scale factors.\n        \"\"\"\n        spatial_scale = copy.deepcopy(self.spatial_scale)\n        body_name_list = list(body_dict.keys())[::-1]\n        num_backbone_stages = len(body_name_list)\n        self.fpn_inner_output = [[] for _ in range(num_backbone_stages)]\n        fpn_inner_name = 'fpn_inner_' + body_name_list[0]\n        body_input = body_dict[body_name_list[0]]\n        fan = body_input.shape[1]\n        if self.norm_type:\n            initializer = Xavier(fan_out=fan)\n            self.fpn_inner_output[0] = ConvNorm(\n                body_input,\n                self.num_chan,\n                1,\n                initializer=initializer,\n                norm_type=self.norm_type,\n                freeze_norm=self.freeze_norm,\n                name=fpn_inner_name,\n                norm_name=fpn_inner_name)\n        else:\n            self.fpn_inner_output[0] = fluid.layers.conv2d(\n                body_input,\n                self.num_chan,\n                1,\n                param_attr=ParamAttr(name=fpn_inner_name + \"_w\", initializer=Xavier(fan_out=fan)),\n                bias_attr=ParamAttr(name=fpn_inner_name + \"_b\", learning_rate=2., regularizer=L2Decay(0.)),\n                name=fpn_inner_name)\n        for i in range(1, num_backbone_stages):\n            body_name = body_name_list[i]\n            body_input = body_dict[body_name]\n            top_output = self.fpn_inner_output[i - 1]\n            fpn_inner_single = self._add_topdown_lateral(body_name, body_input, top_output)\n            self.fpn_inner_output[i] = 
fpn_inner_single\n        fpn_dict = {}\n        fpn_name_list = []\n        for i in range(num_backbone_stages):\n            fpn_name = 'fpn_' + body_name_list[i]\n            fan = self.fpn_inner_output[i].shape[1] * 3 * 3\n            if self.norm_type:\n                initializer = Xavier(fan_out=fan)\n                fpn_output = ConvNorm(\n                    self.fpn_inner_output[i],\n                    self.num_chan,\n                    3,\n                    initializer=initializer,\n                    norm_type=self.norm_type,\n                    freeze_norm=self.freeze_norm,\n                    name=fpn_name,\n                    norm_name=fpn_name)\n            else:\n                fpn_output = fluid.layers.conv2d(\n                    self.fpn_inner_output[i],\n                    self.num_chan,\n                    filter_size=3,\n                    padding=1,\n                    param_attr=ParamAttr(name=fpn_name + \"_w\", initializer=Xavier(fan_out=fan)),\n                    bias_attr=ParamAttr(name=fpn_name + \"_b\", learning_rate=2., regularizer=L2Decay(0.)),\n                    name=fpn_name)\n            fpn_dict[fpn_name] = fpn_output\n            fpn_name_list.append(fpn_name)\n        if not self.has_extra_convs and self.max_level - self.min_level == len(spatial_scale):\n            body_top_name = fpn_name_list[0]\n            body_top_extension = fluid.layers.pool2d(\n                fpn_dict[body_top_name], 1, 'max', pool_stride=2, name=body_top_name + '_subsampled_2x')\n            fpn_dict[body_top_name + '_subsampled_2x'] = body_top_extension\n            fpn_name_list.insert(0, body_top_name + '_subsampled_2x')\n            spatial_scale.insert(0, spatial_scale[0] * 0.5)\n        # Coarser FPN levels introduced for RetinaNet\n        highest_backbone_level = self.min_level + len(spatial_scale) - 1\n        if self.has_extra_convs and self.max_level > highest_backbone_level:\n            fpn_blob = 
body_dict[body_name_list[0]]\n            for i in range(highest_backbone_level + 1, self.max_level + 1):\n                fpn_blob_in = fpn_blob\n                fpn_name = 'fpn_' + str(i)\n                if i > highest_backbone_level + 1:\n                    fpn_blob_in = fluid.layers.relu(fpn_blob)\n                fan = fpn_blob_in.shape[1] * 3 * 3\n                fpn_blob = fluid.layers.conv2d(\n                    input=fpn_blob_in,\n                    num_filters=self.num_chan,\n                    filter_size=3,\n                    stride=2,\n                    padding=1,\n                    param_attr=ParamAttr(name=fpn_name + \"_w\", initializer=Xavier(fan_out=fan)),\n                    bias_attr=ParamAttr(name=fpn_name + \"_b\", learning_rate=2., regularizer=L2Decay(0.)),\n                    name=fpn_name)\n                fpn_dict[fpn_name] = fpn_blob\n                fpn_name_list.insert(0, fpn_name)\n                spatial_scale.insert(0, spatial_scale[0] * 0.5)\n        res_dict = OrderedDict([(k, fpn_dict[k]) for k in fpn_name_list])\n        return res_dict, spatial_scale\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport ast\nimport argparse\nfrom collections import OrderedDict\nfrom functools import partial\nfrom math import ceil\n\nimport numpy as np\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.io.parser import txt_parser\nfrom paddlehub.common.paddle_helper import add_vars_prefix\n\nfrom faster_rcnn_resnet50_fpn_venus.processor import load_label_info, postprocess, base64_to_cv2\nfrom faster_rcnn_resnet50_fpn_venus.data_feed import test_reader, padding_minibatch\nfrom faster_rcnn_resnet50_fpn_venus.fpn import FPN\nfrom faster_rcnn_resnet50_fpn_venus.resnet import ResNet\nfrom faster_rcnn_resnet50_fpn_venus.rpn_head import AnchorGenerator, RPNTargetAssign, GenerateProposals, FPNRPNHead\nfrom faster_rcnn_resnet50_fpn_venus.bbox_head import MultiClassNMS, BBoxHead, TwoFCHead\nfrom faster_rcnn_resnet50_fpn_venus.bbox_assigner import BBoxAssigner\nfrom faster_rcnn_resnet50_fpn_venus.roi_extractor import FPNRoIAlign\n\n\n@moduleinfo(\n    name=\"faster_rcnn_resnet50_fpn_venus\",\n    version=\"1.0.0\",\n    type=\"cv/object_detection\",\n    summary=\n    \"Baidu's Faster-RCNN model for object detection, whose backbone is ResNet50, processed with Feature Pyramid Networks\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass FasterRCNNResNet50RPN(hub.Module):\n    def _initialize(self):\n        # default pretrained model, Faster-RCNN with backbone ResNet50, shape of input tensor is [3, 800, 1333]\n        self.default_pretrained_model_path = os.path.join(self.directory, \"faster_rcnn_resnet50_fpn_model\")\n\n    def context(self, num_classes=708, trainable=True, pretrained=True, phase='train'):\n        \"\"\"\n        
Distill the head features so as to perform transfer learning.\n\n        Args:\n            num_classes (int): number of categories for the detection head.\n            trainable (bool): whether to set parameters trainable.\n            pretrained (bool): whether to load the default pretrained model.\n            phase (str): optional choices are 'train' and 'predict'.\n\n        Returns:\n             inputs (dict): the input variables.\n             outputs (dict): the output variables.\n             context_prog (Program): the program to execute transfer learning.\n        \"\"\"\n        context_prog = fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            with fluid.unique_name.guard():\n                image = fluid.layers.data(name='image', shape=[-1, 3, -1, -1], dtype='float32')\n                # backbone\n                backbone = ResNet(norm_type='affine_channel', depth=50, feature_maps=[2, 3, 4, 5], freeze_at=2)\n                body_feats = backbone(image)\n                # fpn\n                fpn = FPN(max_level=6, min_level=2, num_chan=256, spatial_scale=[0.03125, 0.0625, 0.125, 0.25])\n                var_prefix = '@HUB_{}@'.format(self.name)\n                im_info = fluid.layers.data(name='im_info', shape=[3], dtype='float32', lod_level=0)\n                im_shape = fluid.layers.data(name='im_shape', shape=[3], dtype='float32', lod_level=0)\n                body_feat_names = list(body_feats.keys())\n                body_feats, spatial_scale = fpn.get_output(body_feats)\n                # rpn_head: RPNHead\n                rpn_head = self.rpn_head()\n                rois = rpn_head.get_proposals(body_feats, im_info, mode=phase)\n                # train\n                if phase == 'train':\n                    gt_bbox = fluid.layers.data(name='gt_bbox', shape=[4], dtype='float32', lod_level=1)\n                    is_crowd = fluid.layers.data(name='is_crowd', shape=[1], 
dtype='int32', lod_level=1)\n                    gt_class = fluid.layers.data(name='gt_class', shape=[1], dtype='int32', lod_level=1)\n                    rpn_loss = rpn_head.get_loss(im_info, gt_bbox, is_crowd)\n                    # bbox_assigner: BBoxAssigner\n                    bbox_assigner = self.bbox_assigner(num_classes)\n                    outs = fluid.layers.generate_proposal_labels(\n                        rpn_rois=rois,\n                        gt_classes=gt_class,\n                        is_crowd=is_crowd,\n                        gt_boxes=gt_bbox,\n                        im_info=im_info,\n                        batch_size_per_im=bbox_assigner.batch_size_per_im,\n                        fg_fraction=bbox_assigner.fg_fraction,\n                        fg_thresh=bbox_assigner.fg_thresh,\n                        bg_thresh_hi=bbox_assigner.bg_thresh_hi,\n                        bg_thresh_lo=bbox_assigner.bg_thresh_lo,\n                        bbox_reg_weights=bbox_assigner.bbox_reg_weights,\n                        class_nums=bbox_assigner.class_nums,\n                        use_random=bbox_assigner.use_random)\n                    rois = outs[0]\n\n                roi_extractor = self.roi_extractor()\n                roi_feat = roi_extractor(head_inputs=body_feats, rois=rois, spatial_scale=spatial_scale)\n                # head_feat\n                bbox_head = self.bbox_head(num_classes)\n                head_feat = bbox_head.head(roi_feat)\n                if isinstance(head_feat, OrderedDict):\n                    head_feat = list(head_feat.values())[0]\n                if phase == 'train':\n                    inputs = {\n                        'image': var_prefix + image.name,\n                        'im_info': var_prefix + im_info.name,\n                        'im_shape': var_prefix + im_shape.name,\n                        'gt_class': var_prefix + gt_class.name,\n                        'gt_bbox': var_prefix + gt_bbox.name,\n               
         'is_crowd': var_prefix + is_crowd.name\n                    }\n                    outputs = {\n                        'head_features': var_prefix + head_feat.name,\n                        'rpn_cls_loss': var_prefix + rpn_loss['rpn_cls_loss'].name,\n                        'rpn_reg_loss': var_prefix + rpn_loss['rpn_reg_loss'].name,\n                        'generate_proposal_labels': [var_prefix + var.name for var in outs]\n                    }\n                elif phase == 'predict':\n                    pred = bbox_head.get_prediction(roi_feat, rois, im_info, im_shape)\n                    inputs = {\n                        'image': var_prefix + image.name,\n                        'im_info': var_prefix + im_info.name,\n                        'im_shape': var_prefix + im_shape.name\n                    }\n                    outputs = {\n                        'head_features': var_prefix + head_feat.name,\n                        'rois': var_prefix + rois.name,\n                        'bbox_out': var_prefix + pred.name\n                    }\n                add_vars_prefix(context_prog, var_prefix)\n                add_vars_prefix(startup_program, var_prefix)\n\n                global_vars = context_prog.global_block().vars\n                inputs = {key: global_vars[value] for key, value in inputs.items()}\n                outputs = {\n                    key: global_vars[value] if not isinstance(value, list) else [global_vars[var] for var in value]\n                    for key, value in outputs.items()\n                }\n\n                for param in context_prog.global_block().iter_parameters():\n                    param.trainable = trainable\n\n                place = fluid.CPUPlace()\n                exe = fluid.Executor(place)\n                exe.run(startup_program)\n                if pretrained:\n\n                    def _if_exist(var):\n                        if num_classes != 81:\n                            if 'bbox_pred' in 
var.name or 'cls_score' in var.name:\n                                return False\n                        return os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                    fluid.io.load_vars(exe, self.default_pretrained_model_path, predicate=_if_exist)\n                return inputs, outputs, context_prog\n\n    def rpn_head(self):\n        return FPNRPNHead(\n            anchor_generator=AnchorGenerator(\n                anchor_sizes=[32, 64, 128, 256, 512],\n                aspect_ratios=[0.5, 1.0, 2.0],\n                stride=[16.0, 16.0],\n                variance=[1.0, 1.0, 1.0, 1.0]),\n            rpn_target_assign=RPNTargetAssign(\n                rpn_batch_size_per_im=256,\n                rpn_fg_fraction=0.5,\n                rpn_negative_overlap=0.3,\n                rpn_positive_overlap=0.7,\n                rpn_straddle_thresh=0.0),\n            train_proposal=GenerateProposals(min_size=0.0, nms_thresh=0.7, post_nms_top_n=2000, pre_nms_top_n=2000),\n            test_proposal=GenerateProposals(min_size=0.0, nms_thresh=0.7, post_nms_top_n=1000, pre_nms_top_n=1000),\n            anchor_start_size=32,\n            num_chan=256,\n            min_level=2,\n            max_level=6)\n\n    def roi_extractor(self):\n        return FPNRoIAlign(\n            canconical_level=4, canonical_size=224, max_level=5, min_level=2, box_resolution=7, sampling_ratio=2)\n\n    def bbox_head(self, num_classes):\n        return BBoxHead(\n            head=TwoFCHead(mlp_dim=1024),\n            nms=MultiClassNMS(keep_top_k=100, nms_threshold=0.5, score_threshold=0.05),\n            num_classes=num_classes)\n\n    def bbox_assigner(self, num_classes):\n        return BBoxAssigner(\n            batch_size_per_im=512,\n            bbox_reg_weights=[0.1, 0.1, 0.2, 0.2],\n            bg_thresh_hi=0.5,\n            bg_thresh_lo=0.0,\n            fg_fraction=0.25,\n            fg_thresh=0.5,\n            class_nums=num_classes)\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/name_adapter.py",
"content": "# coding=utf-8\n\n\nclass NameAdapter(object):\n    \"\"\"Fix the backbone's variable names to match the pretrained weights\"\"\"\n\n    def __init__(self, model):\n        super(NameAdapter, self).__init__()\n        self.model = model\n\n    @property\n    def model_type(self):\n        return getattr(self.model, '_model_type', '')\n\n    @property\n    def variant(self):\n        return getattr(self.model, 'variant', '')\n\n    def fix_conv_norm_name(self, name):\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n        # the naming rule is the same as for the pretrained weights\n        if self.model_type == 'SEResNeXt':\n            bn_name = name + \"_bn\"\n        return bn_name\n\n    def fix_shortcut_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            name = 'conv' + name + '_prj'\n        return name\n\n    def fix_bottleneck_name(self, name):\n        if self.model_type == 'SEResNeXt':\n            conv_name1 = 'conv' + name + '_x1'\n            conv_name2 = 'conv' + name + '_x2'\n            conv_name3 = 'conv' + name + '_x3'\n            shortcut_name = name\n        else:\n            conv_name1 = name + \"_branch2a\"\n            conv_name2 = name + \"_branch2b\"\n            conv_name3 = name + \"_branch2c\"\n            shortcut_name = name + \"_branch1\"\n        return conv_name1, conv_name2, conv_name3, shortcut_name\n\n    def fix_layer_warp_name(self, stage_num, count, i):\n        name = 'res' + str(stage_num)\n        if count > 10 and stage_num == 4:\n            if i == 0:\n                conv_name = name + \"a\"\n            else:\n                conv_name = name + \"b\" + str(i)\n        else:\n            conv_name = name + chr(ord(\"a\") + i)\n        if self.model_type == 'SEResNeXt':\n            conv_name = str(stage_num + 2) + '_' + str(i + 1)\n        return conv_name\n\n    def fix_c1_stage_name(self):\n        return \"res_conv1\" if 
self.model_type == 'ResNeXt' else \"conv1\"\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/nonlocal_helper.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\nnonlocal_params = {\n    \"use_zero_init_conv\": False,\n    \"conv_init_std\": 0.01,\n    \"no_bias\": True,\n    \"use_maxpool\": False,\n    \"use_softmax\": True,\n    \"use_bn\": False,\n    \"use_scale\": True,  # vital for model performance\n    \"use_affine\": False,\n    \"bn_momentum\": 0.9,\n    \"bn_epsilon\": 1.0000001e-5,\n    \"bn_init_gamma\": 0.9,\n    \"weight_decay_bn\": 1.e-4,\n}\n\n\ndef space_nonlocal(input, dim_in, dim_out, prefix, dim_inner, max_pool_stride=2):\n    cur = input\n    theta = fluid.layers.conv2d(input = cur, num_filters = dim_inner, \\\n                                filter_size = [1, 1], stride = [1, 1], \\\n                                padding = [0, 0], \\\n                                param_attr=ParamAttr(name = prefix + '_theta' + \"_w\", \\\n                                    initializer = fluid.initializer.Normal(loc = 0.0,\n                                    scale = nonlocal_params[\"conv_init_std\"])), \\\n                                bias_attr = ParamAttr(name = prefix + '_theta' + \"_b\", \\\n                                    initializer = fluid.initializer.Constant(value = 0.)) \\\n                                        if not nonlocal_params[\"no_bias\"] else False, \\\n                                name = prefix + '_theta')\n    theta_shape = theta.shape\n    theta_shape_op = fluid.layers.shape(theta)\n    theta_shape_op.stop_gradient = True\n\n    if nonlocal_params[\"use_maxpool\"]:\n        max_pool = fluid.layers.pool2d(input = cur, \\\n                                        pool_size = [max_pool_stride, max_pool_stride], \\\n                                        pool_type = 'max', \\\n                                        pool_stride = 
[max_pool_stride, max_pool_stride], \\\n                                        pool_padding = [0, 0], \\\n                                        name = prefix + '_pool')\n    else:\n        max_pool = cur\n\n    phi = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                             filter_size = [1, 1], stride = [1, 1], \\\n                             padding = [0, 0], \\\n                             param_attr = ParamAttr(name = prefix + '_phi' + \"_w\", \\\n                                 initializer = fluid.initializer.Normal(loc = 0.0,\n                                 scale = nonlocal_params[\"conv_init_std\"])), \\\n                             bias_attr = ParamAttr(name = prefix + '_phi' + \"_b\", \\\n                                 initializer = fluid.initializer.Constant(value = 0.)) \\\n                                      if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                             name = prefix + '_phi')\n    phi_shape = phi.shape\n\n    g = fluid.layers.conv2d(input = max_pool, num_filters = dim_inner, \\\n                 filter_size = [1, 1], stride = [1, 1], \\\n                 padding = [0, 0], \\\n                 param_attr = ParamAttr(name = prefix + '_g' + \"_w\", \\\n                     initializer = fluid.initializer.Normal(loc = 0.0, scale = nonlocal_params[\"conv_init_std\"])), \\\n                 bias_attr = ParamAttr(name = prefix + '_g' + \"_b\", \\\n                     initializer = fluid.initializer.Constant(value = 0.)) if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                 name = prefix + '_g')\n    g_shape = g.shape\n    # we have to use explicit batch size (to support arbitrary spacetime size)\n    # e.g. 
(8, 1024, 4, 14, 14) => (8, 1024, 784)\n    theta = fluid.layers.reshape(theta, shape=(0, 0, -1))\n    theta = fluid.layers.transpose(theta, [0, 2, 1])\n    phi = fluid.layers.reshape(phi, [0, 0, -1])\n    theta_phi = fluid.layers.matmul(theta, phi, name=prefix + '_affinity')\n    g = fluid.layers.reshape(g, [0, 0, -1])\n\n    if nonlocal_params[\"use_softmax\"]:\n        if nonlocal_params[\"use_scale\"]:\n            theta_phi_sc = fluid.layers.scale(theta_phi, scale=dim_inner**-.5)\n        else:\n            theta_phi_sc = theta_phi\n        p = fluid.layers.softmax(theta_phi_sc, name=prefix + '_affinity' + '_prob')\n    else:\n        # the non-softmax normalization of the reference implementation is not supported\n        raise NotImplementedError(\"space_nonlocal with use_softmax=False is not implemented\")\n\n    # note g's axis[2] corresponds to p's axis[2]\n    # e.g. g(8, 1024, 784_2) * p(8, 784_1, 784_2) => (8, 1024, 784_1)\n    p = fluid.layers.transpose(p, [0, 2, 1])\n    t = fluid.layers.matmul(g, p, name=prefix + '_y')\n\n    # reshape back\n    # e.g. (8, 1024, 784) => (8, 1024, 4, 14, 14)\n    t_shape = t.shape\n    t_re = fluid.layers.reshape(t, shape=list(theta_shape), actual_shape=theta_shape_op)\n    blob_out = t_re\n    blob_out = fluid.layers.conv2d(input = blob_out, num_filters = dim_out, \\\n                                  filter_size = [1, 1], stride = [1, 1], padding = [0, 0], \\\n                                  param_attr = ParamAttr(name = prefix + '_out' + \"_w\", \\\n                                      initializer = fluid.initializer.Constant(value = 0.) 
\\\n                                        if nonlocal_params[\"use_zero_init_conv\"] \\\n                                        else fluid.initializer.Normal(loc = 0.0,\n                                            scale = nonlocal_params[\"conv_init_std\"])), \\\n                                  bias_attr = ParamAttr(name = prefix + '_out' + \"_b\", \\\n                                          initializer = fluid.initializer.Constant(value = 0.)) \\\n                                           if (nonlocal_params[\"no_bias\"] == 0) else False, \\\n                                  name = prefix + '_out')\n    blob_out_shape = blob_out.shape\n\n    if nonlocal_params[\"use_bn\"]:\n        bn_name = prefix + \"_bn\"\n        blob_out = fluid.layers.batch_norm(blob_out, \\\n                      # is_test = test_mode, \\\n                      momentum = nonlocal_params[\"bn_momentum\"], \\\n                      epsilon = nonlocal_params[\"bn_epsilon\"], \\\n                      name = bn_name, \\\n                      param_attr = ParamAttr(name = bn_name + \"_s\", \\\n                      initializer = fluid.initializer.Constant(value = nonlocal_params[\"bn_init_gamma\"]), \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      bias_attr = ParamAttr(name = bn_name + \"_b\", \\\n                      regularizer = fluid.regularizer.L2Decay(nonlocal_params[\"weight_decay_bn\"])), \\\n                      moving_mean_name = bn_name + \"_rm\", \\\n                      moving_variance_name = bn_name + \"_riv\") # add bn\n\n    if nonlocal_params[\"use_affine\"]:\n        affine_scale = fluid.layers.create_parameter(\\\n                       shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                       attr=ParamAttr(name=prefix + '_affine' + '_s'), \\\n                       default_initializer = fluid.initializer.Constant(value = 1.))\n        affine_bias = 
fluid.layers.create_parameter(\\\n                      shape=[blob_out_shape[1]], dtype = blob_out.dtype, \\\n                      attr=ParamAttr(name=prefix + '_affine' + '_b'), \\\n                      default_initializer = fluid.initializer.Constant(value = 0.))\n        blob_out = fluid.layers.affine_channel(blob_out, scale = affine_scale, \\\n                      bias = affine_bias, name = prefix + '_affine')   # add affine\n\n    return blob_out\n\n\ndef add_space_nonlocal(input, dim_in, dim_out, prefix, dim_inner):\n    '''\n    add_space_nonlocal:\n        Non-local Neural Networks: see https://arxiv.org/abs/1711.07971\n    '''\n    conv = space_nonlocal(input, dim_in, dim_out, prefix, dim_inner)\n    output = fluid.layers.elementwise_add(input, conv, name=prefix + '_sum')\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/processor.py",
"content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = [\n    'base64_to_cv2',\n    'load_label_info',\n    'postprocess',\n]\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.frombuffer replaces the deprecated np.fromstring for binary input\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top), fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = 
get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return xmin, ymin, xmax, ymax\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by fluid.Executor.run\n\n    Args:\n        paths (list[str]): the paths of the images.\n        images (list(numpy.ndarray)): list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the score threshold below which bounding boxes are discarded.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod_tensor = data_out[0]\n    lod = lod_tensor.lod[0]\n    results = lod_tensor.as_ndarray()\n\n    if handle_id < len(paths):\n        unhandled_paths = paths[handle_id:]\n        unhandled_paths_num = len(unhandled_paths)\n    else:\n        unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n       
         continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = confidence\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/resnet.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nfrom collections import OrderedDict\nfrom numbers import Integral\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.framework import Variable\nfrom paddle.fluid.regularizer import L2Decay\nfrom paddle.fluid.initializer import Constant\n\nfrom .nonlocal_helper import add_space_nonlocal\nfrom .name_adapter import NameAdapter\n\n__all__ = ['ResNet', 'ResNetC5']\n\n\nclass ResNet(object):\n    \"\"\"\n    Residual Network, see https://arxiv.org/abs/1512.03385\n    Args:\n        depth (int): ResNet depth, should be 34, 50.\n        freeze_at (int): freeze the backbone at which stage\n        norm_type (str): normalization type, 'bn'/'sync_bn'/'affine_channel'\n        freeze_norm (bool): freeze normalization layers\n        norm_decay (float): weight decay for normalization layer weights\n        variant (str): ResNet variant, supports 'a', 'b', 'c', 'd' currently\n        feature_maps (list): index of stages whose feature maps are returned\n        dcn_v2_stages (list): index of stages who select deformable conv v2\n        nonlocal_stages (list): index of stages who select nonlocal networks\n    \"\"\"\n    __shared__ = ['norm_type', 'freeze_norm', 'weight_prefix_name']\n\n    def __init__(self,\n                 depth=50,\n                 freeze_at=0,\n                 norm_type='sync_bn',\n                 freeze_norm=False,\n                 norm_decay=0.,\n                 variant='b',\n                 feature_maps=[3, 4, 5],\n                 dcn_v2_stages=[],\n                 weight_prefix_name='',\n                 nonlocal_stages=[],\n                 get_prediction=False,\n                 class_dim=1000):\n        super(ResNet, self).__init__()\n\n        if isinstance(feature_maps, Integral):\n            feature_maps = [feature_maps]\n\n        assert 
depth in [34, 50], \\\n            \"depth {} not in [34, 50]\".format(depth)\n        assert variant in ['a', 'b', 'c', 'd'], \"invalid ResNet variant\"\n        assert 0 <= freeze_at <= 4, \"freeze_at should be 0, 1, 2, 3 or 4\"\n        assert len(feature_maps) > 0, \"need one or more feature maps\"\n        assert norm_type in ['bn', 'sync_bn', 'affine_channel']\n        assert not (len(nonlocal_stages) > 0 and depth < 50), \\\n                    \"non-local is not supported for resnet18 or resnet34\"\n\n        self.depth = depth\n        self.freeze_at = freeze_at\n        self.norm_type = norm_type\n        self.norm_decay = norm_decay\n        self.freeze_norm = freeze_norm\n        self.variant = variant\n        self._model_type = 'ResNet'\n        self.feature_maps = feature_maps\n        self.dcn_v2_stages = dcn_v2_stages\n        self.depth_cfg = {\n            34: ([3, 4, 6, 3], self.basicblock),\n            50: ([3, 4, 6, 3], self.bottleneck),\n        }\n        self.stage_filters = [64, 128, 256, 512]\n        self._c1_out_chan_num = 64\n        self.na = NameAdapter(self)\n        self.prefix_name = weight_prefix_name\n\n        self.nonlocal_stages = nonlocal_stages\n        self.nonlocal_mod_cfg = {\n            50: 2,\n            101: 5,\n            152: 8,\n            200: 12,\n        }\n        self.get_prediction = get_prediction\n        self.class_dim = class_dim\n\n    def _conv_offset(self, input, filter_size, stride, padding, act=None, name=None):\n        out_channel = filter_size * filter_size * 3\n        out = fluid.layers.conv2d(\n            input,\n            num_filters=out_channel,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            param_attr=ParamAttr(initializer=Constant(0.0), name=name + \".w_0\"),\n            bias_attr=ParamAttr(initializer=Constant(0.0), name=name + \".b_0\"),\n            act=act,\n            name=name)\n        return out\n\n    def _conv_norm(self, input, 
num_filters, filter_size, stride=1, groups=1, act=None, name=None, dcn_v2=False):\n        _name = self.prefix_name + name if self.prefix_name != '' else name\n        if not dcn_v2:\n            conv = fluid.layers.conv2d(\n                input=input,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                act=None,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + '.conv2d.output.1')\n        else:\n            # select deformable conv\"\n            offset_mask = self._conv_offset(\n                input=input,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                act=None,\n                name=_name + \"_conv_offset\")\n            offset_channel = filter_size**2 * 2\n            mask_channel = filter_size**2\n            offset, mask = fluid.layers.split(input=offset_mask, num_or_sections=[offset_channel, mask_channel], dim=1)\n            mask = fluid.layers.sigmoid(mask)\n            conv = fluid.layers.deformable_conv(\n                input=input,\n                offset=offset,\n                mask=mask,\n                num_filters=num_filters,\n                filter_size=filter_size,\n                stride=stride,\n                padding=(filter_size - 1) // 2,\n                groups=groups,\n                deformable_groups=1,\n                im2col_step=1,\n                param_attr=ParamAttr(name=_name + \"_weights\"),\n                bias_attr=False,\n                name=_name + \".conv2d.output.1\")\n\n        bn_name = self.na.fix_conv_norm_name(name)\n        bn_name = self.prefix_name + bn_name if self.prefix_name != '' else bn_name\n\n        norm_lr = 0. 
if self.freeze_norm else 1.\n        norm_decay = self.norm_decay\n        pattr = ParamAttr(name=bn_name + '_scale', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n        battr = ParamAttr(name=bn_name + '_offset', learning_rate=norm_lr, regularizer=L2Decay(norm_decay))\n\n        if self.norm_type in ['bn', 'sync_bn']:\n            global_stats = True if self.freeze_norm else False\n            out = fluid.layers.batch_norm(\n                input=conv,\n                act=act,\n                name=bn_name + '.output.1',\n                param_attr=pattr,\n                bias_attr=battr,\n                moving_mean_name=bn_name + '_mean',\n                moving_variance_name=bn_name + '_variance',\n                use_global_stats=global_stats)\n            scale = fluid.framework._get_var(pattr.name)\n            bias = fluid.framework._get_var(battr.name)\n        elif self.norm_type == 'affine_channel':\n            scale = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=pattr, default_initializer=fluid.initializer.Constant(1.))\n            bias = fluid.layers.create_parameter(\n                shape=[conv.shape[1]], dtype=conv.dtype, attr=battr, default_initializer=fluid.initializer.Constant(0.))\n            out = fluid.layers.affine_channel(x=conv, scale=scale, bias=bias, act=act)\n        if self.freeze_norm:\n            scale.stop_gradient = True\n            bias.stop_gradient = True\n        return out\n\n    def _shortcut(self, input, ch_out, stride, is_first, name):\n        max_pooling_in_short_cut = self.variant == 'd'\n        ch_in = input.shape[1]\n        # the naming rule is same as pretrained weight\n        name = self.na.fix_shortcut_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if ch_in != ch_out or stride != 1 or (self.depth < 50 and is_first):\n            if std_senet:\n                if is_first:\n                    return 
self._conv_norm(input, ch_out, 1, stride, name=name)\n                else:\n                    return self._conv_norm(input, ch_out, 3, stride, name=name)\n            if max_pooling_in_short_cut and not is_first:\n                input = fluid.layers.pool2d(\n                    input=input, pool_size=2, pool_stride=2, pool_padding=0, ceil_mode=True, pool_type='avg')\n                return self._conv_norm(input, ch_out, 1, 1, name=name)\n            return self._conv_norm(input, ch_out, 1, stride, name=name)\n        else:\n            return input\n\n    def bottleneck(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        if self.variant == 'a':\n            stride1, stride2 = stride, 1\n        else:\n            stride1, stride2 = 1, stride\n\n        # ResNeXt\n        groups = getattr(self, 'groups', 1)\n        group_width = getattr(self, 'group_width', -1)\n        if groups == 1:\n            expand = 4\n        elif (groups * group_width) == 256:\n            expand = 1\n        else:  # FIXME hard code for now, handles 32x4d, 64x4d and 32x8d\n            num_filters = num_filters // 2\n            expand = 2\n\n        conv_name1, conv_name2, conv_name3, \\\n            shortcut_name = self.na.fix_bottleneck_name(name)\n        std_senet = getattr(self, 'std_senet', False)\n        if std_senet:\n            conv_def = [[int(num_filters / 2), 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n        else:\n            conv_def = [[num_filters, 1, stride1, 'relu', 1, conv_name1],\n                        [num_filters, 3, stride2, 'relu', groups, conv_name2],\n                        [num_filters * expand, 1, 1, None, 1, conv_name3]]\n\n        residual = input\n        for i, (c, k, s, act, g, _name) in enumerate(conv_def):\n            residual = self._conv_norm(\n                
input=residual,\n                num_filters=c,\n                filter_size=k,\n                stride=s,\n                act=act,\n                groups=g,\n                name=_name,\n                dcn_v2=(i == 1 and dcn_v2))\n        short = self._shortcut(input, num_filters * expand, stride, is_first=is_first, name=shortcut_name)\n        # Squeeze-and-Excitation\n        if callable(getattr(self, '_squeeze_excitation', None)):\n            residual = self._squeeze_excitation(input=residual, num_channels=num_filters, name='fc' + name)\n        return fluid.layers.elementwise_add(x=short, y=residual, act='relu', name=name + \".add.output.5\")\n\n    def basicblock(self, input, num_filters, stride, is_first, name, dcn_v2=False):\n        assert dcn_v2 is False, \"Not implemented yet.\"\n        conv0 = self._conv_norm(\n            input=input, num_filters=num_filters, filter_size=3, act='relu', stride=stride, name=name + \"_branch2a\")\n        conv1 = self._conv_norm(input=conv0, num_filters=num_filters, filter_size=3, act=None, name=name + \"_branch2b\")\n        short = self._shortcut(input, num_filters, stride, is_first, name=name + \"_branch1\")\n        return fluid.layers.elementwise_add(x=short, y=conv1, act='relu')\n\n    def layer_warp(self, input, stage_num):\n        \"\"\"\n        Args:\n            input (Variable): input variable.\n            stage_num (int): the stage number, should be 2, 3, 4, 5\n\n        Returns:\n            The last variable in endpoint-th stage.\n        \"\"\"\n        assert stage_num in [2, 3, 4, 5]\n\n        stages, block_func = self.depth_cfg[self.depth]\n        count = stages[stage_num - 2]\n\n        ch_out = self.stage_filters[stage_num - 2]\n        is_first = False if stage_num != 2 else True\n        dcn_v2 = True if stage_num in self.dcn_v2_stages else False\n\n        nonlocal_mod = 1000\n        if stage_num in self.nonlocal_stages:\n            nonlocal_mod = self.nonlocal_mod_cfg[self.depth] if 
stage_num == 4 else 2\n\n        # Make the layer name and parameter name consistent\n        # with ImageNet pre-trained model\n        conv = input\n        for i in range(count):\n            conv_name = self.na.fix_layer_warp_name(stage_num, count, i)\n            if self.depth < 50:\n                is_first = True if i == 0 and stage_num == 2 else False\n            conv = block_func(\n                input=conv,\n                num_filters=ch_out,\n                stride=2 if i == 0 and stage_num != 2 else 1,\n                is_first=is_first,\n                name=conv_name,\n                dcn_v2=dcn_v2)\n\n            # add non local model\n            dim_in = conv.shape[1]\n            nonlocal_name = \"nonlocal_conv{}\".format(stage_num)\n            if i % nonlocal_mod == nonlocal_mod - 1:\n                conv = add_space_nonlocal(conv, dim_in, dim_in, nonlocal_name + '_{}'.format(i), int(dim_in / 2))\n        return conv\n\n    def c1_stage(self, input):\n        out_chan = self._c1_out_chan_num\n\n        conv1_name = self.na.fix_c1_stage_name()\n\n        if self.variant in ['c', 'd']:\n            conv_def = [\n                [out_chan // 2, 3, 2, \"conv1_1\"],\n                [out_chan // 2, 3, 1, \"conv1_2\"],\n                [out_chan, 3, 1, \"conv1_3\"],\n            ]\n        else:\n            conv_def = [[out_chan, 7, 2, conv1_name]]\n\n        for (c, k, s, _name) in conv_def:\n            input = self._conv_norm(input=input, num_filters=c, filter_size=k, stride=s, act='relu', name=_name)\n\n        output = fluid.layers.pool2d(input=input, pool_size=3, pool_stride=2, pool_padding=1, pool_type='max')\n        return output\n\n    def __call__(self, input):\n        assert isinstance(input, Variable)\n        assert not (set(self.feature_maps) - set([2, 3, 4, 5])), \\\n            \"feature maps {} not in [2, 3, 4, 5]\".format(self.feature_maps)\n\n        res_endpoints = []\n\n        res = input\n        feature_maps = 
self.feature_maps\n        severed_head = getattr(self, 'severed_head', False)\n        if not severed_head:\n            res = self.c1_stage(res)\n            feature_maps = range(2, max(self.feature_maps) + 1)\n\n        for i in feature_maps:\n            res = self.layer_warp(res, i)\n            if i in self.feature_maps:\n                res_endpoints.append(res)\n            if self.freeze_at >= i:\n                res.stop_gradient = True\n        if self.get_prediction:\n            pool = fluid.layers.pool2d(input=res, pool_type='avg', global_pooling=True)\n            stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)\n\n            out = fluid.layers.fc(\n                input=pool,\n                size=self.class_dim,\n                param_attr=fluid.param_attr.ParamAttr(initializer=fluid.initializer.Uniform(-stdv, stdv)))\n            out = fluid.layers.softmax(out)\n            return out\n        return OrderedDict(\n            [('res{}_sum'.format(self.feature_maps[idx]), feat) for idx, feat in enumerate(res_endpoints)])\n\n\nclass ResNetC5(ResNet):\n    def __init__(self,\n                 depth=50,\n                 freeze_at=2,\n                 norm_type='affine_channel',\n                 freeze_norm=True,\n                 norm_decay=0.,\n                 variant='b',\n                 feature_maps=[5],\n                 weight_prefix_name=''):\n        super(ResNetC5, self).__init__(depth, freeze_at, norm_type, freeze_norm, norm_decay, variant, feature_maps)\n        self.severed_head = True\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/roi_extractor.py",
    "content": "# coding=utf-8\nimport paddle.fluid as fluid\n\n__all__ = ['FPNRoIAlign']\n\n\nclass FPNRoIAlign(object):\n    \"\"\"\n    RoI align pooling for FPN feature maps\n    Args:\n        sampling_ratio (int): number of sampling points\n        min_level (int): lowest level of FPN layer\n        max_level (int): highest level of FPN layer\n        canconical_level (int): the canconical FPN feature map level\n        canonical_size (int): the canconical FPN feature map size\n        box_resolution (int): box resolution\n        mask_resolution (int): mask roi resolution\n    \"\"\"\n\n    def __init__(self,\n                 sampling_ratio=0,\n                 min_level=2,\n                 max_level=5,\n                 canconical_level=4,\n                 canonical_size=224,\n                 box_resolution=7,\n                 mask_resolution=14):\n        super(FPNRoIAlign, self).__init__()\n        self.sampling_ratio = sampling_ratio\n        self.min_level = min_level\n        self.max_level = max_level\n        self.canconical_level = canconical_level\n        self.canonical_size = canonical_size\n        self.box_resolution = box_resolution\n        self.mask_resolution = mask_resolution\n\n    def __call__(self, head_inputs, rois, spatial_scale, is_mask=False):\n        \"\"\"\n        Adopt RoI align onto several level of feature maps to get RoI features.\n        Distribute RoIs to different levels by area and get a list of RoI\n        features by distributed RoIs and their corresponding feature maps.\n\n        Returns:\n            roi_feat(Variable): RoI features with shape of [M, C, R, R],\n                where M is the number of RoIs and R is RoI resolution\n\n        \"\"\"\n        k_min = self.min_level\n        k_max = self.max_level\n        num_roi_lvls = k_max - k_min + 1\n        name_list = list(head_inputs.keys())\n        input_name_list = name_list[-num_roi_lvls:]\n        spatial_scale = spatial_scale[-num_roi_lvls:]\n      
  rois_dist, restore_index = fluid.layers.distribute_fpn_proposals(rois, k_min, k_max, self.canconical_level,\n                                                                         self.canonical_size)\n        # rois_dist is in ascend order\n        roi_out_list = []\n        resolution = is_mask and self.mask_resolution or self.box_resolution\n        for lvl in range(num_roi_lvls):\n            name_index = num_roi_lvls - lvl - 1\n            rois_input = rois_dist[lvl]\n            head_input = head_inputs[input_name_list[name_index]]\n            sc = spatial_scale[name_index]\n            roi_out = fluid.layers.roi_align(\n                input=head_input,\n                rois=rois_input,\n                pooled_height=resolution,\n                pooled_width=resolution,\n                spatial_scale=sc,\n                sampling_ratio=self.sampling_ratio)\n            roi_out_list.append(roi_out)\n        roi_feat_shuffle = fluid.layers.concat(roi_out_list)\n        roi_feat_ = fluid.layers.gather(roi_feat_shuffle, restore_index)\n        roi_feat = fluid.layers.lod_reset(roi_feat_, rois)\n\n        return roi_feat\n"
  },
  {
    "path": "modules/image/object_detection/faster_rcnn_resnet50_fpn_venus/rpn_head.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.initializer import Normal\nfrom paddle.fluid.regularizer import L2Decay\n\n__all__ = ['AnchorGenerator', 'RPNTargetAssign', 'GenerateProposals', 'RPNHead', 'FPNRPNHead']\n\n\nclass AnchorGenerator(object):\n    # __op__ = fluid.layers.anchor_generator\n    def __init__(self,\n                 stride=[16.0, 16.0],\n                 anchor_sizes=[32, 64, 128, 256, 512],\n                 aspect_ratios=[0.5, 1., 2.],\n                 variance=[1., 1., 1., 1.]):\n        super(AnchorGenerator, self).__init__()\n        self.anchor_sizes = anchor_sizes\n        self.aspect_ratios = aspect_ratios\n        self.variance = variance\n        self.stride = stride\n\n\nclass RPNTargetAssign(object):\n    # __op__ = fluid.layers.rpn_target_assign\n    def __init__(self,\n                 rpn_batch_size_per_im=256,\n                 rpn_straddle_thresh=0.,\n                 rpn_fg_fraction=0.5,\n                 rpn_positive_overlap=0.7,\n                 rpn_negative_overlap=0.3,\n                 use_random=True):\n        super(RPNTargetAssign, self).__init__()\n        self.rpn_batch_size_per_im = rpn_batch_size_per_im\n        self.rpn_straddle_thresh = rpn_straddle_thresh\n        self.rpn_fg_fraction = rpn_fg_fraction\n        self.rpn_positive_overlap = rpn_positive_overlap\n        self.rpn_negative_overlap = rpn_negative_overlap\n        self.use_random = use_random\n\n\nclass GenerateProposals(object):\n    # __op__ = fluid.layers.generate_proposals\n    def __init__(self, pre_nms_top_n=6000, post_nms_top_n=1000, nms_thresh=.5, min_size=.1, eta=1.):\n        super(GenerateProposals, self).__init__()\n        self.pre_nms_top_n = pre_nms_top_n\n        self.post_nms_top_n = post_nms_top_n\n        self.nms_thresh = nms_thresh\n        
self.min_size = min_size\n        self.eta = eta\n\n\nclass RPNHead(object):\n    \"\"\"\n    RPN Head\n\n    Args:\n        anchor_generator (object): `AnchorGenerator` instance\n        rpn_target_assign (object): `RPNTargetAssign` instance\n        train_proposal (object): `GenerateProposals` instance for training\n        test_proposal (object): `GenerateProposals` instance for testing\n        num_classes (int): number of classes in rpn output\n    \"\"\"\n    __inject__ = ['anchor_generator', 'rpn_target_assign', 'train_proposal', 'test_proposal']\n\n    def __init__(self, anchor_generator, rpn_target_assign, train_proposal, test_proposal, num_classes=1):\n        super(RPNHead, self).__init__()\n        self.anchor_generator = anchor_generator\n        self.rpn_target_assign = rpn_target_assign\n        self.train_proposal = train_proposal\n        self.test_proposal = test_proposal\n        self.num_classes = num_classes\n\n    def _get_output(self, input):\n        \"\"\"\n        Get anchor and RPN head output.\n\n        Args:\n            input(Variable): feature map from backbone with shape of [N, C, H, W]\n\n        Returns:\n            rpn_cls_score(Variable): Output of rpn head with shape of [N, num_anchors, H, W].\n            rpn_bbox_pred(Variable): Output of rpn head with shape of [N, num_anchors * 4, H, W].\n        \"\"\"\n        dim_out = input.shape[1]\n        rpn_conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=dim_out,\n            filter_size=3,\n            stride=1,\n            padding=1,\n            act='relu',\n            name='conv_rpn',\n            param_attr=ParamAttr(name=\"conv_rpn_w\", initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=\"conv_rpn_b\", learning_rate=2., regularizer=L2Decay(0.)))\n        # Generate anchors self.anchor_generator\n        self.anchor, self.anchor_var = fluid.layers.anchor_generator(\n            input=rpn_conv,\n            
anchor_sizes=self.anchor_generator.anchor_sizes,\n            aspect_ratios=self.anchor_generator.aspect_ratios,\n            variance=self.anchor_generator.variance,\n            stride=self.anchor_generator.stride)\n\n        num_anchor = self.anchor.shape[2]\n        # Proposal classification scores\n        self.rpn_cls_score = fluid.layers.conv2d(\n            rpn_conv,\n            num_filters=num_anchor * self.num_classes,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            act=None,\n            name='rpn_cls_score',\n            param_attr=ParamAttr(name=\"rpn_cls_logits_w\", initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=\"rpn_cls_logits_b\", learning_rate=2., regularizer=L2Decay(0.)))\n        # Proposal bbox regression deltas\n        self.rpn_bbox_pred = fluid.layers.conv2d(\n            rpn_conv,\n            num_filters=4 * num_anchor,\n            filter_size=1,\n            stride=1,\n            padding=0,\n            act=None,\n            name='rpn_bbox_pred',\n            param_attr=ParamAttr(name=\"rpn_bbox_pred_w\", initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=\"rpn_bbox_pred_b\", learning_rate=2., regularizer=L2Decay(0.)))\n        return self.rpn_cls_score, self.rpn_bbox_pred\n\n    def get_proposals(self, body_feats, im_info, mode='train'):\n        \"\"\"\n        Get proposals according to the output of backbone.\n\n        Args:\n            body_feats (dict): The dictionary of feature maps from backbone.\n            im_info(Variable): The information of image with shape [N, 3] in\n                format (height, width, scale).\n            mode (str): 'train' or 'test', which selects the proposal\n                config to use.\n\n        Returns:\n            rpn_rois(Variable): Output proposals with shape of (rois_num, 4).\n        \"\"\"\n        # In RPN Heads, only the last feature map of backbone is used.\n        # And 
body_feat_names[-1] represents the last level name of backbone.\n        body_feat = list(body_feats.values())[-1]\n        rpn_cls_score, rpn_bbox_pred = self._get_output(body_feat)\n\n        if self.num_classes == 1:\n            rpn_cls_prob = fluid.layers.sigmoid(rpn_cls_score, name='rpn_cls_prob')\n        else:\n            rpn_cls_score = fluid.layers.transpose(rpn_cls_score, perm=[0, 2, 3, 1])\n            rpn_cls_score = fluid.layers.reshape(rpn_cls_score, shape=(0, 0, 0, -1, self.num_classes))\n            rpn_cls_prob_tmp = fluid.layers.softmax(rpn_cls_score, use_cudnn=False, name='rpn_cls_prob')\n            rpn_cls_prob_slice = fluid.layers.slice(rpn_cls_prob_tmp, axes=[4], starts=[1], ends=[self.num_classes])\n            rpn_cls_prob, _ = fluid.layers.topk(rpn_cls_prob_slice, 1)\n            rpn_cls_prob = fluid.layers.reshape(rpn_cls_prob, shape=(0, 0, 0, -1))\n            rpn_cls_prob = fluid.layers.transpose(rpn_cls_prob, perm=[0, 3, 1, 2])\n        prop_op = self.train_proposal if mode == 'train' else self.test_proposal\n        # prop_op\n        rpn_rois, rpn_roi_probs = fluid.layers.generate_proposals(\n            scores=rpn_cls_prob,\n            bbox_deltas=rpn_bbox_pred,\n            im_info=im_info,\n            anchors=self.anchor,\n            variances=self.anchor_var,\n            pre_nms_top_n=prop_op.pre_nms_top_n,\n            post_nms_top_n=prop_op.post_nms_top_n,\n            nms_thresh=prop_op.nms_thresh,\n            min_size=prop_op.min_size,\n            eta=prop_op.eta)\n        return rpn_rois\n\n    def _transform_input(self, rpn_cls_score, rpn_bbox_pred, anchor, anchor_var):\n        rpn_cls_score = fluid.layers.transpose(rpn_cls_score, perm=[0, 2, 3, 1])\n        rpn_bbox_pred = fluid.layers.transpose(rpn_bbox_pred, perm=[0, 2, 3, 1])\n        anchor = fluid.layers.reshape(anchor, shape=(-1, 4))\n        anchor_var = fluid.layers.reshape(anchor_var, shape=(-1, 4))\n        rpn_cls_score = 
fluid.layers.reshape(x=rpn_cls_score, shape=(0, -1, self.num_classes))\n        rpn_bbox_pred = fluid.layers.reshape(x=rpn_bbox_pred, shape=(0, -1, 4))\n        return rpn_cls_score, rpn_bbox_pred, anchor, anchor_var\n\n    def _get_loss_input(self):\n        for attr in ['rpn_cls_score', 'rpn_bbox_pred', 'anchor', 'anchor_var']:\n            if getattr(self, attr, None) is None:\n                raise ValueError(\"self.{} should not be None, call RPNHead.get_proposals first\".format(attr))\n        return self._transform_input(self.rpn_cls_score, self.rpn_bbox_pred, self.anchor, self.anchor_var)\n\n    def get_loss(self, im_info, gt_box, is_crowd, gt_label=None):\n        \"\"\"\n        Sample proposals and calculate RPN loss.\n\n        Args:\n            im_info(Variable): The information of image with shape [N, 3] in\n                format (height, width, scale).\n            gt_box(Variable): The ground-truth bounding boxes with shape [M, 4].\n                M is the number of ground-truth boxes.\n            is_crowd(Variable): Indicates whether each ground-truth box is a crowd,\n                with shape [M, 1]. M is the number of ground-truth boxes.\n\n        Returns:\n            Type: dict\n                rpn_cls_loss(Variable): RPN classification loss.\n                rpn_bbox_loss(Variable): RPN bounding box regression loss.\n\n        \"\"\"\n        rpn_cls, rpn_bbox, anchor, anchor_var = self._get_loss_input()\n        if self.num_classes == 1:\n            # self.rpn_target_assign\n            score_pred, loc_pred, score_tgt, loc_tgt, bbox_weight = \\\n                fluid.layers.rpn_target_assign(\n                    bbox_pred=rpn_bbox,\n                    cls_logits=rpn_cls,\n                    anchor_box=anchor,\n                    anchor_var=anchor_var,\n                    gt_boxes=gt_box,\n                    is_crowd=is_crowd,\n                    im_info=im_info,\n                    rpn_batch_size_per_im=self.rpn_target_assign.rpn_batch_size_per_im,\n                    rpn_straddle_thresh=self.rpn_target_assign.rpn_straddle_thresh,\n                    rpn_fg_fraction=self.rpn_target_assign.rpn_fg_fraction,\n                    rpn_positive_overlap=self.rpn_target_assign.rpn_positive_overlap,\n                    rpn_negative_overlap=self.rpn_target_assign.rpn_negative_overlap,\n                    use_random=self.rpn_target_assign.use_random)\n            score_tgt = fluid.layers.cast(x=score_tgt, dtype='float32')\n            score_tgt.stop_gradient = True\n            rpn_cls_loss = fluid.layers.sigmoid_cross_entropy_with_logits(x=score_pred, label=score_tgt)\n        else:\n            score_pred, loc_pred, score_tgt, loc_tgt, bbox_weight = \\\n                self.rpn_target_assign(\n                    bbox_pred=rpn_bbox,\n                    cls_logits=rpn_cls,\n                    anchor_box=anchor,\n                    anchor_var=anchor_var,\n                    gt_boxes=gt_box,\n                    gt_labels=gt_label,\n                    is_crowd=is_crowd,\n                    num_classes=self.num_classes,\n                    
im_info=im_info)\n            labels_int64 = fluid.layers.cast(x=score_tgt, dtype='int64')\n            labels_int64.stop_gradient = True\n            rpn_cls_loss = fluid.layers.softmax_with_cross_entropy(\n                logits=score_pred, label=labels_int64, numeric_stable_mode=True)\n\n        rpn_cls_loss = fluid.layers.reduce_mean(rpn_cls_loss, name='loss_rpn_cls')\n\n        loc_tgt = fluid.layers.cast(x=loc_tgt, dtype='float32')\n        loc_tgt.stop_gradient = True\n        rpn_reg_loss = fluid.layers.smooth_l1(\n            x=loc_pred, y=loc_tgt, sigma=3.0, inside_weight=bbox_weight, outside_weight=bbox_weight)\n        rpn_reg_loss = fluid.layers.reduce_sum(rpn_reg_loss, name='loss_rpn_bbox')\n        score_shape = fluid.layers.shape(score_tgt)\n        score_shape = fluid.layers.cast(x=score_shape, dtype='float32')\n        norm = fluid.layers.reduce_prod(score_shape)\n        norm.stop_gradient = True\n        rpn_reg_loss = rpn_reg_loss / norm\n        return {'rpn_cls_loss': rpn_cls_loss, 'rpn_reg_loss': rpn_reg_loss}\n\n\nclass FPNRPNHead(RPNHead):\n    \"\"\"\n    RPN Head that supports FPN input\n\n    Args:\n        anchor_generator (object): `AnchorGenerator` instance\n        rpn_target_assign (object): `RPNTargetAssign` instance\n        train_proposal (object): `GenerateProposals` instance for training\n        test_proposal (object): `GenerateProposals` instance for testing\n        anchor_start_size (int): size of anchor at the first scale\n        num_chan (int): number of FPN output channels\n        min_level (int): lowest level of FPN output\n        max_level (int): highest level of FPN output\n        num_classes (int): number of classes in rpn output\n    \"\"\"\n\n    def __init__(self,\n                 anchor_generator,\n                 rpn_target_assign,\n                 train_proposal,\n                 test_proposal,\n                 anchor_start_size=32,\n                 num_chan=256,\n                 min_level=2,\n      
           max_level=6,\n                 num_classes=1):\n        super(FPNRPNHead, self).__init__(anchor_generator, rpn_target_assign, train_proposal, test_proposal)\n        self.anchor_start_size = anchor_start_size\n        self.num_chan = num_chan\n        self.min_level = min_level\n        self.max_level = max_level\n        self.num_classes = num_classes\n\n        self.fpn_rpn_list = []\n        self.anchors_list = []\n        self.anchor_var_list = []\n\n    def _get_output(self, input, feat_lvl):\n        \"\"\"\n        Get anchor and FPN RPN head output at one level.\n\n        Args:\n            input(Variable): Body feature from backbone.\n            feat_lvl(int): Indicate the level of rpn output corresponding\n                to the level of feature map.\n\n        Return:\n            rpn_cls_score(Variable): Output of one level of fpn rpn head with\n                shape of [N, num_anchors, H, W].\n            rpn_bbox_pred(Variable): Output of one level of fpn rpn head with\n                shape of [N, num_anchors * 4, H, W].\n        \"\"\"\n        slvl = str(feat_lvl)\n        conv_name = 'conv_rpn_fpn' + slvl\n        cls_name = 'rpn_cls_logits_fpn' + slvl\n        bbox_name = 'rpn_bbox_pred_fpn' + slvl\n        conv_share_name = 'conv_rpn_fpn' + str(self.min_level)\n        cls_share_name = 'rpn_cls_logits_fpn' + str(self.min_level)\n        bbox_share_name = 'rpn_bbox_pred_fpn' + str(self.min_level)\n\n        num_anchors = len(self.anchor_generator.aspect_ratios)\n        conv_rpn_fpn = fluid.layers.conv2d(\n            input=input,\n            num_filters=self.num_chan,\n            filter_size=3,\n            padding=1,\n            act='relu',\n            name=conv_name,\n            param_attr=ParamAttr(name=conv_share_name + '_w', initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=conv_share_name + '_b', learning_rate=2., regularizer=L2Decay(0.)))\n\n        # self.anchor_generator\n        
self.anchors, self.anchor_var = fluid.layers.anchor_generator(\n            input=conv_rpn_fpn,\n            anchor_sizes=(self.anchor_start_size * 2.**(feat_lvl - self.min_level), ),\n            stride=(2.**feat_lvl, 2.**feat_lvl),\n            aspect_ratios=self.anchor_generator.aspect_ratios,\n            variance=self.anchor_generator.variance)\n\n        cls_num_filters = num_anchors * self.num_classes\n        self.rpn_cls_score = fluid.layers.conv2d(\n            input=conv_rpn_fpn,\n            num_filters=cls_num_filters,\n            filter_size=1,\n            act=None,\n            name=cls_name,\n            param_attr=ParamAttr(name=cls_share_name + '_w', initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=cls_share_name + '_b', learning_rate=2., regularizer=L2Decay(0.)))\n        self.rpn_bbox_pred = fluid.layers.conv2d(\n            input=conv_rpn_fpn,\n            num_filters=num_anchors * 4,\n            filter_size=1,\n            act=None,\n            name=bbox_name,\n            param_attr=ParamAttr(name=bbox_share_name + '_w', initializer=Normal(loc=0., scale=0.01)),\n            bias_attr=ParamAttr(name=bbox_share_name + '_b', learning_rate=2., regularizer=L2Decay(0.)))\n        return self.rpn_cls_score, self.rpn_bbox_pred\n\n    def _get_single_proposals(self, body_feat, im_info, feat_lvl, mode='train'):\n        \"\"\"\n        Get proposals in one level according to the output of fpn rpn head\n\n        Args:\n            body_feat(Variable): the feature map from the backbone.\n            im_info(Variable): The information of image with shape [N, 3] with\n                format (height, width, scale).\n            feat_lvl(int): Indicate the level of proposals corresponding to\n                the feature maps.\n\n        Returns:\n            rpn_rois_fpn(Variable): Output proposals with shape of (rois_num, 4).\n            rpn_roi_probs_fpn(Variable): Scores of proposals with\n                shape of (rois_num, 
1).\n        \"\"\"\n\n        rpn_cls_score_fpn, rpn_bbox_pred_fpn = self._get_output(body_feat, feat_lvl)\n\n        prop_op = self.train_proposal if mode == 'train' else self.test_proposal\n        if self.num_classes == 1:\n            rpn_cls_prob_fpn = fluid.layers.sigmoid(rpn_cls_score_fpn, name='rpn_cls_prob_fpn' + str(feat_lvl))\n        else:\n            rpn_cls_score_fpn = fluid.layers.transpose(rpn_cls_score_fpn, perm=[0, 2, 3, 1])\n            rpn_cls_score_fpn = fluid.layers.reshape(rpn_cls_score_fpn, shape=(0, 0, 0, -1, self.num_classes))\n            rpn_cls_prob_fpn = fluid.layers.softmax(\n                rpn_cls_score_fpn, use_cudnn=False, name='rpn_cls_prob_fpn' + str(feat_lvl))\n            rpn_cls_prob_fpn = fluid.layers.slice(rpn_cls_prob_fpn, axes=[4], starts=[1], ends=[self.num_classes])\n            rpn_cls_prob_fpn, _ = fluid.layers.topk(rpn_cls_prob_fpn, 1)\n            rpn_cls_prob_fpn = fluid.layers.reshape(rpn_cls_prob_fpn, shape=(0, 0, 0, -1))\n            rpn_cls_prob_fpn = fluid.layers.transpose(rpn_cls_prob_fpn, perm=[0, 3, 1, 2])\n        # prop_op\n        rpn_rois_fpn, rpn_roi_prob_fpn = fluid.layers.generate_proposals(\n            scores=rpn_cls_prob_fpn,\n            bbox_deltas=rpn_bbox_pred_fpn,\n            im_info=im_info,\n            anchors=self.anchors,\n            variances=self.anchor_var,\n            pre_nms_top_n=prop_op.pre_nms_top_n,\n            post_nms_top_n=prop_op.post_nms_top_n,\n            nms_thresh=prop_op.nms_thresh,\n            min_size=prop_op.min_size,\n            eta=prop_op.eta)\n        return rpn_rois_fpn, rpn_roi_prob_fpn\n\n    def get_proposals(self, fpn_feats, im_info, mode='train'):\n        \"\"\"\n        Get proposals in multiple levels according to the output of fpn\n        rpn head\n\n        Args:\n            fpn_feats(dict): A dictionary represents the output feature map\n                of FPN with their name.\n            im_info(Variable): The information of image with 
shape [N, 3] with\n                format (height, width, scale).\n\n        Return:\n            rois_list(Variable): Output proposals in shape of [rois_num, 4]\n        \"\"\"\n        rois_list = []\n        roi_probs_list = []\n        fpn_feat_names = list(fpn_feats.keys())\n        for lvl in range(self.min_level, self.max_level + 1):\n            fpn_feat_name = fpn_feat_names[self.max_level - lvl]\n            fpn_feat = fpn_feats[fpn_feat_name]\n            rois_fpn, roi_probs_fpn = self._get_single_proposals(fpn_feat, im_info, lvl, mode)\n            self.fpn_rpn_list.append((self.rpn_cls_score, self.rpn_bbox_pred))\n            rois_list.append(rois_fpn)\n            roi_probs_list.append(roi_probs_fpn)\n            self.anchors_list.append(self.anchors)\n            self.anchor_var_list.append(self.anchor_var)\n        prop_op = self.train_proposal if mode == 'train' else self.test_proposal\n        post_nms_top_n = prop_op.post_nms_top_n\n        rois_collect = fluid.layers.collect_fpn_proposals(\n            rois_list, roi_probs_list, self.min_level, self.max_level, post_nms_top_n, name='collect')\n        return rois_collect\n\n    def _get_loss_input(self):\n        rpn_clses = []\n        rpn_bboxes = []\n        anchors = []\n        anchor_vars = []\n        for i in range(len(self.fpn_rpn_list)):\n            single_input = self._transform_input(self.fpn_rpn_list[i][0], self.fpn_rpn_list[i][1], self.anchors_list[i],\n                                                 self.anchor_var_list[i])\n            rpn_clses.append(single_input[0])\n            rpn_bboxes.append(single_input[1])\n            anchors.append(single_input[2])\n            anchor_vars.append(single_input[3])\n\n        rpn_cls = fluid.layers.concat(rpn_clses, axis=1)\n        rpn_bbox = fluid.layers.concat(rpn_bboxes, axis=1)\n        anchors = fluid.layers.concat(anchors)\n        anchor_var = fluid.layers.concat(anchor_vars)\n        return rpn_cls, rpn_bbox, anchors, 
anchor_var\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/README.md",
    "content": "# ssd_mobilenet_v1_pascal\n\n|模型名称|ssd_mobilenet_v1_pascal|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|SSD|\n|数据集|PASCAL VOC|\n|是否支持Fine-tuning|否|\n|模型大小|24MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - Single Shot MultiBox Detector (SSD) 是一种单阶段的目标检测器。与两阶段的检测方法不同，单阶段目标检测并不进行区域推荐，而是直接从特征图回归出目标的边界框和分类概率。SSD 运用了这种单阶段检测的思想，并且对其进行改进：在不同尺度的特征图上检测对应尺度的目标。该PaddleHub Module的基网络为MobileNet-v1模型，在Pascal数据集上预训练得到，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ssd_mobilenet_v1_pascal\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ssd_mobilenet_v1_pascal --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"ssd_mobilenet_v1_pascal\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         
score_thresh=0.5,\n                         visualization=True,\n                         )\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result; <br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ssd_mobilenet_v1_pascal\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_mobilenet_v1_pascal\"\n    r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.2\n\n  修复numpy数据读取问题\n\n* 1.1.3\n\n  移除 fluid api\n\n* 1.2.0\n\n  修复推理模型无法导出的问题\n\n  - ```shell\n    $ hub install ssd_mobilenet_v1_pascal==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/README_en.md",
    "content": "# ssd_mobilenet_v1_pascal\n\n|Module Name|ssd_mobilenet_v1_pascal|\n| :--- | :---: |\n|Category|object detection|\n|Network|SSD|\n|Dataset|PASCAL VOC|\n|Fine-tuning supported or not|No|\n|Module Size|24MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131504887-d024c7e5-fc09-4d6b-92b8-4d0c965949d0.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - Single Shot MultiBox Detector (SSD) is a one-stage detector. Different from two-stage detectors, SSD frames object detection as a regression problem to spatially separated bounding boxes and associated class probabilities. This module is based on MobileNet-v1, trained on the Pascal dataset, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ssd_mobilenet_v1_pascal\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run ssd_mobilenet_v1_pascal --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = 
hub.Module(name=\"ssd_mobilenet_v1_pascal\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True,\n                         )\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image path;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold for detection results;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** choose either paths or images to provide data\n\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to the specified path\n\n    - 
**Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m ssd_mobilenet_v1_pascal\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_mobilenet_v1_pascal\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.2\n\n  Fix the problem of reading numpy\n\n* 1.1.3\n\n  Remove fluid api\n\n* 1.2.0\n\n  Fix bug of save_inference_model\n\n  - ```shell\n    $ hub install ssd_mobilenet_v1_pascal==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/config.yml",
    "content": "MobileNet:\n  norm_decay: 0.\n  conv_group_scale: 1\n  conv_learning_rate: 0.1\n  extra_block_filters: [[256, 512], [128, 256], [128, 256], [64, 128]]\n  with_extra_blocks: True\n\nSSDOutputDecoder:\n  background_label: 0\n  keep_top_k: 200\n  nms_eta: 1.0\n  nms_threshold: 0.45\n  nms_top_k: 400\n  score_threshold: 0.01\n\nMultiBoxHead:\n  aspect_ratios: [[2.], [2., 3.], [2., 3.], [2., 3.], [2., 3.], [2., 3.]]\n  base_size: 300\n  flip: True\n  max_ratio: 90\n  max_sizes: [[], 150.0, 195.0, 240.0, 285.0, 300.0]\n  min_ratio: 20\n  min_sizes: [60.0, 105.0, 150.0, 195.0, 240.0, 285.0]\n  offset: 0.5\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nimport random\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\n\nclass DecodeImage(object):\n    def __init__(self, to_rgb=True, with_mixup=False):\n        \"\"\" Transform the image data to numpy format.\n\n        Args:\n            to_rgb (bool): whether to convert BGR to RGB\n            with_mixup (bool): whether or not to mixup image and gt_bbox/gt_score\n        \"\"\"\n        self.to_rgb = to_rgb\n        self.with_mixup = with_mixup\n\n    def __call__(self, im):\n        if self.to_rgb:\n            im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        return im\n\n\nclass ResizeImage(object):\n    def __init__(self,\n                 target_size=0,\n                 max_size=0,\n                 interp=cv2.INTER_LINEAR,\n                 use_cv2=True):\n        \"\"\"\n        Rescale image to the specified target size, capped at max_size\n        if max_size != 0.\n        If target_size is a list, a scale is randomly selected as the\n        target size.\n\n        Args:\n            target_size (int|list): the target size of image's short side,\n                multi-scale training is adopted when type is list.\n            max_size (int): the max size of image\n            interp (int): the interpolation method\n            use_cv2 (bool): use the cv2 interpolation method or use PIL\n                interpolation method\n        \"\"\"\n        self.max_size = int(max_size)\n        self.interp = int(interp)\n        self.use_cv2 = use_cv2\n        self.target_size = target_size\n\n    def __call__(self, im):\n        if not isinstance(im, np.ndarray):\n            raise TypeError(\"{}: image type is not numpy.\".format(self))\n        if len(im.shape) != 3:\n            raise ValueError('{}: image is not 3-dimensional.'.format(self))\n        im_shape 
= im.shape\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if isinstance(self.target_size, list):\n            # Case for multi-scale training\n            selected_size = random.choice(self.target_size)\n        else:\n            selected_size = self.target_size\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('{}: min size of image is 0'.format(self))\n        if self.max_size != 0:\n            im_scale = float(selected_size) / float(im_size_min)\n            # Prevent the biggest axis from being more than max_size\n            if np.round(im_scale * im_size_max) > self.max_size:\n                im_scale = float(self.max_size) / float(im_size_max)\n            im_scale_x = im_scale\n            im_scale_y = im_scale\n\n            resize_w = im_scale_x * float(im_shape[1])\n            resize_h = im_scale_y * float(im_shape[0])\n            im_info = [resize_h, resize_w, im_scale]\n        else:\n            im_scale_x = float(selected_size) / float(im_shape[1])\n            im_scale_y = float(selected_size) / float(im_shape[0])\n\n            resize_w = selected_size\n            resize_h = selected_size\n\n        if self.use_cv2:\n            im = cv2.resize(\n                im,\n                None,\n                None,\n                fx=im_scale_x,\n                fy=im_scale_y,\n                interpolation=self.interp)\n        else:\n            if self.max_size != 0:\n                raise TypeError(\n                    'If you set max_size to cap the maximum size of image,'\n                    'please set use_cv2 to True to resize the image.')\n            im = im.astype('uint8')\n            im = Image.fromarray(im)\n            im = im.resize((int(resize_w), int(resize_h)), self.interp)\n            im = np.array(im)\n\n        return im\n\n\nclass NormalizeImage(object):\n    def __init__(self,\n                 mean=[0.485, 0.456, 0.406],\n                 
std=[1, 1, 1],\n                 is_scale=True,\n                 is_channel_first=True):\n        \"\"\"\n        Args:\n            mean (list): the pixel mean\n            std (list): the pixel standard deviation\n        \"\"\"\n        self.mean = mean\n        self.std = std\n        self.is_scale = is_scale\n        self.is_channel_first = is_channel_first\n\n    def __call__(self, im):\n        \"\"\"Normalize the image.\n\n        Operators:\n            1.(optional) Scale the image to [0,1]\n            2. Subtract mean from each pixel, then divide by std\n        \"\"\"\n        im = im.astype(np.float32, copy=False)\n        if self.is_channel_first:\n            mean = np.array(self.mean)[:, np.newaxis, np.newaxis]\n            std = np.array(self.std)[:, np.newaxis, np.newaxis]\n        else:\n            mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n            std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        if self.is_scale:\n            im = im / 255.0\n        im -= mean\n        im /= std\n        return im\n\n\nclass Permute(object):\n    def __init__(self, to_bgr=True, channel_first=True):\n        \"\"\"\n        Change the channel order.\n\n        Args:\n            to_bgr (bool): whether to convert RGB to BGR\n            channel_first (bool): whether to transpose to channel-first layout\n        \"\"\"\n        self.to_bgr = to_bgr\n        self.channel_first = channel_first\n\n    def __call__(self, im):\n        if self.channel_first:\n            im = np.swapaxes(im, 1, 2)\n            im = np.swapaxes(im, 1, 0)\n        if self.to_bgr:\n            im = im[[2, 1, 0], :, :]\n        return im\n\n\ndef reader(paths=[],\n           images=None,\n           decode_image=DecodeImage(to_rgb=True, with_mixup=False),\n           resize_image=ResizeImage(\n               target_size=512, interp=1, max_size=0, use_cv2=False),\n           permute_image=Permute(to_bgr=False),\n           normalize_image=NormalizeImage(\n               mean=[104, 117, 
123], std=[1, 1, 1], is_scale=False)):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n        decode_image (class object): instance of <class 'DecodeImage' object>\n        resize_image (class object): instance of <class 'ResizeImage' object>\n        permute_image (class object): instance of <class 'Permute' object>\n        normalize_image (class object): instance of <class 'NormalizeImage' object>\n    \"\"\"\n    img_list = []\n    if paths is not None:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    decode_image = DecodeImage(to_rgb=True, with_mixup=False)\n    resize_image = ResizeImage(\n        target_size=300, interp=1, max_size=0, use_cv2=False)\n    permute_image = Permute()\n    normalize_image = NormalizeImage(\n        mean=[127.5, 127.5, 127.5],\n        std=[127.502231, 127.502231, 127.502231],\n        is_scale=False)\n\n    for img in img_list:\n        preprocessed_img = decode_image(img)\n        preprocessed_img = resize_image(preprocessed_img)\n        preprocessed_img = permute_image(preprocessed_img)\n        preprocessed_img = normalize_image(preprocessed_img)\n        yield [preprocessed_img]\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/label_file.txt",
    "content": "background\naeroplane\nbicycle\nbird\nboat\nbottle\nbus\ncar\ncat\nchair\ncow\ndiningtable\ndog\nhorse\nmotorbike\nperson\npottedplant\nsheep\nsofa\ntrain\ntvmonitor\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport argparse\nimport ast\nimport os\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nimport yaml\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import load_label_info\nfrom .processor import postprocess\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"ssd_mobilenet_v1_pascal\",\n            version=\"1.2.0\",\n            type=\"cv/object_detection\",\n            summary=\"SSD with backbone MobileNet_V1, trained with dataset Pascal VOC.\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass SSDMobileNetv1:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"ssd_mobilenet_v1_model\", \"model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self.model_config = None\n        self._set_config()\n\n    def _set_config(self):\n        # predictor config setting.\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            
gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n        # model config setting.\n        if not self.model_config:\n            with open(os.path.join(self.directory, 'config.yml')) as fp:\n                self.model_config = yaml.load(fp.read(), Loader=yaml.FullLoader)\n\n        self.multi_box_head_config = self.model_config['MultiBoxHead']\n        self.output_decoder_config = self.model_config['SSDOutputDecoder']\n        self.mobilenet_config = self.model_config['MobileNet']\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         data=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of detection, 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        if data and 'image' in data:\n            paths += data['image']\n\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = 
predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(paths=[args.input_path],\n                                        batch_size=args.batch_size,\n                                        use_gpu=args.use_gpu,\n                                        output_dir=args.output_dir,\n                                        visualization=args.visualization,\n                                        score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch 
size.\")\n        self.arg_input_group.add_argument('--score_thresh',\n                                          type=ast.literal_eval,\n                                          default=0.5,\n                                          help=\"threshold for object detection.\")\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if img.format == 'PNG':\n        ext = '.png'\n    elif img.format == 'JPEG':\n        ext = '.jpg'\n    elif img.format == 'BMP':\n        ext = '.bmp'\n    else:\n        if img.mode == \"RGB\" or img.mode == \"L\":\n            ext = \".jpg\"\n        elif img.mode == \"RGBA\" or img.mode == \"P\":\n            ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(xy=(left, top - (textsize_height + 5), left + textsize_width + 10, 
top),\n                           fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): the paths of images.\n        images (list(numpy.ndarray)): list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the score threshold below which bounding boxes are filtered out.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 
'image_numpy_{}'.format((handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            bbox[0] = bbox[0] * org_img_width\n            bbox[1] = bbox[1] * org_img_height\n            bbox[2] = bbox[2] * org_img_width\n            bbox[3] = bbox[3] * org_img_height\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/ssd_mobilenet_v1_pascal/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ssd_mobilenet_v1_pascal\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        
self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/README.md",
    "content": "# ssd_vgg16_300_coco2017\n\n|模型名称|ssd_vgg16_300_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|SSD|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|139MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - Single Shot MultiBox Detector (SSD) 是一种单阶段的目标检测器。与两阶段的检测方法不同，单阶段目标检测并不进行区域推荐，而是直接从特征图回归出目标的边界框和分类概率。SSD 运用了这种单阶段检测的思想，并且对其进行改进：在不同尺度的特征图上检测对应尺度的目标。该PaddleHub Module的基网络为VGG16模型，在COCO2017数据集上预训练得到，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ssd_vgg16_300_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ssd_vgg16_300_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"ssd_vgg16_300_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\[str\]): 图片的路径； <br/>\n      - images (list\[numpy.ndarray\]): 图片数据，ndarray.shape 为 \[H, W, C\]，BGR格式； <br/>\n      - batch\_size (int): batch 的大小；<br/>\n      - use\_gpu (bool): 是否使用 GPU；<br/>\n      - output\_dir (str): 图片的保存路径，默认设为 detection\_result；<br/>\n      - score\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。  \n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n    - **返回**\n\n      - res (list\[dict\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ssd_vgg16_300_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_vgg16_300_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布  \n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install ssd_vgg16_300_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/README_en.md",
    "content": "# ssd_vgg16_300_coco2017\n\n|Module Name|ssd_vgg16_300_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|SSD|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|139MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - Single Shot MultiBox Detector (SSD) is a one-stage detector. Unlike two-stage detectors, SSD frames object detection as a regression problem to spatially separated bounding boxes and associated class probabilities. This module is based on VGG16, trained on the COCO2017 dataset, and can be used for object detection.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ssd_vgg16_300_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run ssd_vgg16_300_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"ssd_vgg16_300_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, which detects the positions of all objects in an image.\n\n    - **Parameters**\n\n      - paths (list[str]): image paths;\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\_thresh (float): confidence threshold;\n      - visualization (bool): whether to save the results as picture files;\n\n      **NOTE:** use either paths or images to provide the input data\n\n    - **Return**\n\n      - res (list\[dict\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m ssd_vgg16_300_coco2017\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_vgg16_300_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release  \n\n* 1.0.2\n\n  Fix the problem of reading numpy data\n\n* 1.1.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install ssd_vgg16_300_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/config.yml",
    "content": "SSDOutputDecoder:\n  background_label: 0\n  keep_top_k: 200\n  nms_eta: 1.0\n  nms_threshold: 0.45\n  nms_top_k: 400\n  score_threshold: 0.01\n\nMultiBoxHead:\n  base_size: 300\n  aspect_ratios: [[2.], [2., 3.], [2., 3.], [2., 3.], [2.], [2.]]\n  min_ratio: 15\n  max_ratio: 90\n  min_sizes: [30.0, 60.0, 111.0, 162.0, 213.0, 264.0]\n  max_sizes: [60.0, 111.0, 162.0, 213.0, 264.0, 315.0]\n  steps: [8, 16, 32, 64, 100, 300]\n  offset: 0.5\n  flip: true\n  kernel_size: 3\n  pad: 1\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nimport random\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\n\nclass DecodeImage(object):\n    def __init__(self, to_rgb=True, with_mixup=False):\n        \"\"\" Transform the image data to numpy format.\n\n        Args:\n            to_rgb (bool): whether to convert BGR to RGB\n            with_mixup (bool): whether or not to mixup image and gt_bbbox/gt_score\n        \"\"\"\n        self.to_rgb = to_rgb\n        self.with_mixup = with_mixup\n\n    def __call__(self, im):\n        if self.to_rgb:\n            im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        return im\n\n\nclass ResizeImage(object):\n    def __init__(self, target_size=0, max_size=0, interp=cv2.INTER_LINEAR, use_cv2=True):\n        \"\"\"\n        Rescale image to the specified target size, and capped at max_size\n        if max_size != 0.\n        If target_size is list, selected a scale randomly as the specified\n        target size.\n\n        Args:\n            target_size (int|list): the target size of image's short side,\n                multi-scale training is adopted when type is list.\n            max_size (int): the max size of image\n            interp (int): the interpolation method\n            use_cv2 (bool): use the cv2 interpolation method or use PIL\n                interpolation method\n        \"\"\"\n        self.max_size = int(max_size)\n        self.interp = int(interp)\n        self.use_cv2 = use_cv2\n        self.target_size = target_size\n\n    def __call__(self, im):\n        if not isinstance(im, np.ndarray):\n            raise TypeError(\"{}: image type is not numpy.\".format(self))\n        if len(im.shape) != 3:\n            raise ValueError('{}: image is not 3-dimensional.'.format(self))\n        im_shape = im.shape\n        im_size_min = np.min(im_shape[0:2])\n        
im_size_max = np.max(im_shape[0:2])\n        if isinstance(self.target_size, list):\n            # Case for multi-scale training\n            selected_size = random.choice(self.target_size)\n        else:\n            selected_size = self.target_size\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('{}: min size of image is 0'.format(self))\n        if self.max_size != 0:\n            im_scale = float(selected_size) / float(im_size_min)\n            # Prevent the biggest axis from being more than max_size\n            if np.round(im_scale * im_size_max) > self.max_size:\n                im_scale = float(self.max_size) / float(im_size_max)\n            im_scale_x = im_scale\n            im_scale_y = im_scale\n\n            resize_w = im_scale_x * float(im_shape[1])\n            resize_h = im_scale_y * float(im_shape[0])\n            im_info = [resize_h, resize_w, im_scale]\n        else:\n            im_scale_x = float(selected_size) / float(im_shape[1])\n            im_scale_y = float(selected_size) / float(im_shape[0])\n\n            resize_w = selected_size\n            resize_h = selected_size\n\n        if self.use_cv2:\n            im = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=self.interp)\n        else:\n            if self.max_size != 0:\n                raise TypeError('If you set max_size to cap the maximum size of image,'\n                                'please set use_cv2 to True to resize the image.')\n            im = im.astype('uint8')\n            im = Image.fromarray(im)\n            im = im.resize((int(resize_w), int(resize_h)), self.interp)\n            im = np.array(im)\n\n        return im\n\n\nclass NormalizeImage(object):\n    def __init__(self, mean=[0.485, 0.456, 0.406], std=[1, 1, 1], is_scale=True, is_channel_first=True):\n        \"\"\"\n        Args:\n            mean (list): the pixel mean\n            std (list): the pixel variance\n        \"\"\"\n        self.mean = mean\n       
 self.std = std\n        self.is_scale = is_scale\n        self.is_channel_first = is_channel_first\n\n    def __call__(self, im):\n        \"\"\"Normalize the image.\n\n        Operators:\n            1.(optional) Scale the image to [0,1]\n            2. Each pixel minus mean and is divided by std\n        \"\"\"\n        im = im.astype(np.float32, copy=False)\n        if self.is_channel_first:\n            mean = np.array(self.mean)[:, np.newaxis, np.newaxis]\n            std = np.array(self.std)[:, np.newaxis, np.newaxis]\n        else:\n            mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n            std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        if self.is_scale:\n            im = im / 255.0\n        im -= mean\n        im /= std\n        return im\n\n\nclass Permute(object):\n    def __init__(self, to_bgr=True, channel_first=True):\n        \"\"\"\n        Change the channel.\n\n        Args:\n            to_bgr (bool): confirm whether to convert RGB to BGR\n            channel_first (bool): confirm whether to change channel\n        \"\"\"\n        self.to_bgr = to_bgr\n        self.channel_first = channel_first\n\n    def __call__(self, im):\n        if self.channel_first:\n            im = np.swapaxes(im, 1, 2)\n            im = np.swapaxes(im, 1, 0)\n        if self.to_bgr:\n            im = im[[2, 1, 0], :, :]\n        return im\n\n\ndef reader(paths=[],\n           images=None,\n           decode_image=DecodeImage(to_rgb=True, with_mixup=False),\n           resize_image=ResizeImage(target_size=512, interp=1, max_size=0, use_cv2=False),\n           permute_image=Permute(to_bgr=False),\n           normalize_image=NormalizeImage(mean=[104, 117, 123], std=[1, 1, 1], is_scale=False)):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n        decode_image (class object): instance of <class 'DecodeImage' 
object>\n        resize_image (class object): instance of <class 'ResizeImage' object>\n        permute_image (class object): instance of <class 'Permute' object>\n        normalize_image (class object): instance of <class 'NormalizeImage' object>\n    \"\"\"\n    img_list = []\n    if paths is not None:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    resize_image = ResizeImage(target_size=300, interp=1, max_size=0, use_cv2=False)\n\n    for img in img_list:\n        preprocessed_img = decode_image(img)\n        preprocessed_img = resize_image(preprocessed_img)\n        preprocessed_img = permute_image(preprocessed_img)\n        preprocessed_img = normalize_image(preprocessed_img)\n        yield [preprocessed_img]\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/label_file.txt",
    "content": "background\nperson\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport yaml\nimport paddle\nimport numpy as np\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"ssd_vgg16_300_coco2017\",\n    version=\"1.1.0\",\n    type=\"cv/object_detection\",\n    summary=\"SSD with backbone VGG16, trained with dataset COCO.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass SSDVGG16:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"ssd_vgg16_300_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self.model_config = None\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n        # model config setting.\n        if not self.model_config:\n           
 with open(os.path.join(self.directory, 'config.yml')) as fp:\n                self.model_config = yaml.load(fp.read(), Loader=yaml.FullLoader)\n\n        self.multi_box_head_config = self.model_config['MultiBoxHead']\n        self.output_decoder_config = self.model_config['SSDOutputDecoder']\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): image data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use GPU.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save the output images or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of COCO2017 detection. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n     
                            visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir',\n            type=str,\n            default='detection_result',\n            help=\"The directory to save output 
images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch size.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"threshold for object detection.\")\n"
  },
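`_set_config` in the `module.py` above decides between the CPU and GPU predictor by probing the `CUDA_VISIBLE_DEVICES` environment variable. A minimal sketch of that probe in isolation (the helper name `gpu_available` is ours, not the module's; the module uses a bare `except`, narrowed here to the exceptions the probe can actually raise):

```python
import os

def gpu_available():
    # Mirrors the probe in _set_config: CUDA_VISIBLE_DEVICES must be set and
    # its first character must parse as an integer device id.
    try:
        _places = os.environ["CUDA_VISIBLE_DEVICES"]
        int(_places[0])
        return True
    except (KeyError, IndexError, ValueError):
        return False

os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
print(gpu_available())  # True
```

Note that a value such as `"-1"` (the conventional way to hide all GPUs) fails the `int` check on its first character `"-"` and correctly falls back to CPU.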
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"\n    Get the save image name from the source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if img.format == 'PNG':\n        ext = '.png'\n    elif img.format == 'JPEG':\n        ext = '.jpg'\n    elif img.format == 'BMP':\n        ext = '.bmp'\n    else:\n        if img.mode == \"RGB\" or img.mode == \"L\":\n            ext = \".jpg\"\n        elif img.mode == \"RGBA\" or img.mode == \"P\":\n            ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top), fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if 
os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): the paths of images.\n        images (list(numpy.ndarray)): list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the lower bound on detection confidence.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    if handle_id < len(paths):\n        unhandled_paths = paths[handle_id:]\n        unhandled_paths_num = len(unhandled_paths)\n    else:\n        unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            
category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            bbox[0] = bbox[0] * org_img_width\n            bbox[1] = bbox[1] * org_img_height\n            bbox[2] = bbox[2] * org_img_width\n            bbox[3] = bbox[3] * org_img_height\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
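`postprocess` in the `processor.py` above splits the detector's flat output back into per-image groups using LoD offsets: the row block `results[lod[i]:lod[i + 1]]` holds the detections for image `i`. A toy illustration of that slicing with made-up numbers (no predictor involved):

```python
import numpy as np

# Stand-in for the predictor output: detections for 3 images concatenated
# along axis 0, plus the cumulative row offsets that delimit each image.
results = np.arange(12, dtype=np.float32).reshape(6, 2)
lod = [0, 2, 5, 6]  # image 0 -> rows 0:2, image 1 -> rows 2:5, image 2 -> rows 5:6

per_image = [results[lod[i]:lod[i + 1]] for i in range(len(lod) - 1)]
print([len(r) for r in per_image])  # [2, 3, 1]
```

This is why the loop in `postprocess` runs `range(len(lod) - 1)` times: `lod` has one more entry than there are images.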
  {
    "path": "modules/image/object_detection/ssd_vgg16_300_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ssd_vgg16_300_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        
self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/README.md",
    "content": "# ssd_vgg16_512_coco2017\n\n|模型名称|ssd_vgg16_512_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|SSD|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|139MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - Single Shot MultiBox Detector (SSD) 是一种单阶段的目标检测器。与两阶段的检测方法不同，单阶段目标检测并不进行区域推荐，而是直接从特征图回归出目标的边界框和分类概率。SSD 运用了这种单阶段检测的思想，并且对其进行改进：在不同尺度的特征图上检测对应尺度的目标。该PaddleHub Module的基网络为VGG16模型，在COCO2017数据集上训练得到，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ssd_vgg16_512_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ssd_vgg16_512_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"ssd_vgg16_512_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n            
             visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。  \n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ssd_vgg16_512_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_vgg16_512_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    
print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布  \n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n   移除 fluid api\n\n  - ```shell\n    $ hub install ssd_vgg16_512_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/README_en.md",
    "content": "# ssd_vgg16_512_coco2017\n\n|Module Name|ssd_vgg16_512_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|SSD|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|139MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - Single Shot MultiBox Detector (SSD) is a one-stage detector. Different from two-stage detectors, SSD frames object detection as a regression problem to spatially separated bounding boxes and associated class probabilities. This module is based on VGG16, trained on the COCO2017 dataset, and can be used for object detection.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install ssd_vgg16_512_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run ssd_vgg16_512_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = 
hub.Module(name=\"ssd_vgg16_512_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the position of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image paths;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold;<br/>\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** choose either paths or images to provide the input data\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - 
dirname: model save path\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m ssd_vgg16_512_coco2017\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ssd_vgg16_512_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release  \n\n* 1.0.2\n\n  Fix the problem of reading numpy arrays\n\n* 1.1.0\n\n  Remove the fluid APIs\n\n  - ```shell\n    $ hub install ssd_vgg16_512_coco2017==1.1.0\n    ```\n"
  },
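The serving example in the README above base64-encodes the JPEG bytes before POSTing them, and `base64_to_cv2` on the server reverses the process with `cv2.imdecode`. The encode/decode halves of that round trip can be checked without OpenCV (pure-numpy stand-in bytes, no real image involved):

```python
import base64
import numpy as np

raw = np.arange(8, dtype=np.uint8).tobytes()    # stand-in for cv2.imencode output
b64 = base64.b64encode(raw).decode('utf8')      # what the client puts in the JSON payload
# what base64_to_cv2 does before cv2.imdecode:
decoded = np.frombuffer(base64.b64decode(b64.encode('utf8')), dtype=np.uint8)
print(decoded.tolist())  # [0, 1, 2, 3, 4, 5, 6, 7]
```

Base64 exists here because raw JPEG bytes are not valid JSON string content; the encoding roughly adds one third to the payload size.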
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/config.yml",
    "content": "SSDOutputDecoder:\n  background_label: 0\n  keep_top_k: 200\n  nms_eta: 1.0\n  nms_threshold: 0.45\n  nms_top_k: 400\n  score_threshold: 0.01\n\nMultiBoxHead:\n  base_size: 512\n  aspect_ratios: [[2.], [2., 3.], [2., 3.], [2., 3.], [2., 3.], [2.], [2.]]\n  min_ratio: 15\n  max_ratio: 90\n  min_sizes: [20.0, 51.0, 133.0, 215.0, 296.0, 378.0, 460.0]\n  max_sizes: [51.0, 133.0, 215.0, 296.0, 378.0, 460.0, 542.0]\n  steps: [8, 16, 32, 64, 128, 256, 512]\n  offset: 0.5\n  flip: true\n  kernel_size: 3\n  pad: 1\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\nimport random\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['reader']\n\n\nclass DecodeImage(object):\n    def __init__(self, to_rgb=True, with_mixup=False):\n        \"\"\" Transform the image data to numpy format.\n\n        Args:\n            to_rgb (bool): whether to convert BGR to RGB\n            with_mixup (bool): whether or not to mixup image and gt_bbox/gt_score\n        \"\"\"\n        self.to_rgb = to_rgb\n        self.with_mixup = with_mixup\n\n    def __call__(self, im):\n        if self.to_rgb:\n            im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        return im\n\n\nclass ResizeImage(object):\n    def __init__(self,\n                 target_size=0,\n                 max_size=0,\n                 interp=cv2.INTER_LINEAR,\n                 use_cv2=True):\n        \"\"\"\n        Rescale the image to the specified target size, capped at max_size\n        if max_size != 0.\n        If target_size is a list, a scale is randomly selected as the\n        target size.\n\n        Args:\n            target_size (int|list): the target size of image's short side,\n                multi-scale training is adopted when type is list.\n            max_size (int): the max size of image\n            interp (int): the interpolation method\n            use_cv2 (bool): use the cv2 interpolation method or use PIL\n                interpolation method\n        \"\"\"\n        self.max_size = int(max_size)\n        self.interp = int(interp)\n        self.use_cv2 = use_cv2\n        self.target_size = target_size\n\n    def __call__(self, im):\n        if not isinstance(im, np.ndarray):\n            raise TypeError(\"{}: image type is not numpy.\".format(self))\n        if len(im.shape) != 3:\n            raise ValueError('{}: image is not 3-dimensional.'.format(self))\n        im_shape 
= im.shape\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if isinstance(self.target_size, list):\n            # Case for multi-scale training\n            selected_size = random.choice(self.target_size)\n        else:\n            selected_size = self.target_size\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('{}: min size of image is 0'.format(self))\n        if self.max_size != 0:\n            im_scale = float(selected_size) / float(im_size_min)\n            # Prevent the biggest axis from being more than max_size\n            if np.round(im_scale * im_size_max) > self.max_size:\n                im_scale = float(self.max_size) / float(im_size_max)\n            im_scale_x = im_scale\n            im_scale_y = im_scale\n\n            resize_w = im_scale_x * float(im_shape[1])\n            resize_h = im_scale_y * float(im_shape[0])\n            im_info = [resize_h, resize_w, im_scale]\n        else:\n            im_scale_x = float(selected_size) / float(im_shape[1])\n            im_scale_y = float(selected_size) / float(im_shape[0])\n\n            resize_w = selected_size\n            resize_h = selected_size\n\n        if self.use_cv2:\n            im = cv2.resize(\n                im,\n                None,\n                None,\n                fx=im_scale_x,\n                fy=im_scale_y,\n                interpolation=self.interp)\n        else:\n            if self.max_size != 0:\n                raise TypeError(\n                    'If you set max_size to cap the maximum size of image,'\n                    'please set use_cv2 to True to resize the image.')\n            im = im.astype('uint8')\n            im = Image.fromarray(im)\n            im = im.resize((int(resize_w), int(resize_h)), self.interp)\n            im = np.array(im)\n\n        return im\n\n\nclass NormalizeImage(object):\n    def __init__(self,\n                 mean=[0.485, 0.456, 0.406],\n                 
std=[1, 1, 1],\n                 is_scale=True,\n                 is_channel_first=True):\n        \"\"\"\n        Args:\n            mean (list): the pixel mean\n            std (list): the pixel standard deviation\n        \"\"\"\n        self.mean = mean\n        self.std = std\n        self.is_scale = is_scale\n        self.is_channel_first = is_channel_first\n\n    def __call__(self, im):\n        \"\"\"Normalize the image.\n\n        Operators:\n            1. (optional) Scale the image to [0, 1]\n            2. Subtract the mean from each pixel and divide by the std\n        \"\"\"\n        im = im.astype(np.float32, copy=False)\n        if self.is_channel_first:\n            mean = np.array(self.mean)[:, np.newaxis, np.newaxis]\n            std = np.array(self.std)[:, np.newaxis, np.newaxis]\n        else:\n            mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n            std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        if self.is_scale:\n            im = im / 255.0\n        im -= mean\n        im /= std\n        return im\n\n\nclass Permute(object):\n    def __init__(self, to_bgr=True, channel_first=True):\n        \"\"\"\n        Change the channel order.\n\n        Args:\n            to_bgr (bool): whether to convert RGB to BGR\n            channel_first (bool): whether to transpose the image from HWC to CHW\n        \"\"\"\n        self.to_bgr = to_bgr\n        self.channel_first = channel_first\n\n    def __call__(self, im):\n        if self.channel_first:\n            im = np.swapaxes(im, 1, 2)\n            im = np.swapaxes(im, 1, 0)\n        if self.to_bgr:\n            im = im[[2, 1, 0], :, :]\n        return im\n\n\ndef reader(paths=[],\n           images=None,\n           decode_image=DecodeImage(to_rgb=True, with_mixup=False),\n           resize_image=ResizeImage(\n               target_size=512, interp=1, max_size=0, use_cv2=False),\n           permute_image=Permute(to_bgr=False),\n           normalize_image=NormalizeImage(\n               mean=[104, 117, 
123], std=[1, 1, 1], is_scale=False)):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n        decode_image (class object): instance of <class 'DecodeImage' object>\n        resize_image (class object): instance of <class 'ResizeImage' object>\n        permute_image (class object): instance of <class 'Permute' object>\n        normalize_image (class object): instance of <class 'NormalizeImage' object>\n    \"\"\"\n    img_list = []\n    if paths is not None:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for img in img_list:\n        preprocessed_img = decode_image(img)\n        preprocessed_img = resize_image(preprocessed_img)\n        preprocessed_img = permute_image(preprocessed_img)\n        preprocessed_img = normalize_image(preprocessed_img)\n        yield [preprocessed_img]\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/label_file.txt",
    "content": "background\nperson\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport yaml\nimport paddle\nimport numpy as np\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"ssd_vgg16_512_coco2017\",\n    version=\"1.1.0\",\n    type=\"cv/object_detection\",\n    summary=\"SSD with backbone VGG16, trained with dataset COCO.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass SSDVGG16_512:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"ssd_vgg16_512_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self.model_config = None\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n        # model config setting.\n        if not self.model_config:\n       
     with open(os.path.join(self.directory, 'config.yml')) as fp:\n                self.model_config = yaml.load(fp.read(), Loader=yaml.FullLoader)\n\n        self.multi_box_head_config = self.model_config['MultiBoxHead']\n        self.output_decoder_config = self.model_config['SSDOutputDecoder']\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of COCO2017 detection. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                
                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir',\n            type=str,\n            default='detection_result',\n            help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch size.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"threshold for object detection.\")\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated; frombuffer reads the bytes without copying\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if img.format == 'PNG':\n        ext = '.png'\n    elif img.format == 'JPEG':\n        ext = '.jpg'\n    elif img.format == 'BMP':\n        ext = '.bmp'\n    else:\n        if img.mode == \"RGB\" or img.mode == \"L\":\n            ext = \".jpg\"\n        elif img.mode == \"RGBA\" or img.mode == \"P\":\n            ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                
xy=(left, top - (textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    Postprocess the lod_tensor produced by Executor.run.\n\n    Args:\n        paths (list[str]): the path of images.\n        images (list(numpy.ndarray)): list of images, shape of each is [H, W, C].\n        data_out (lod_tensor): data produced by executor.run.\n        score_thresh (float): the lower confidence limit for keeping a bounding box.\n        label_names (list[str]): label names.\n        output_dir (str): output directory.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): whether to save as images.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = []\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n            output_i['path'] = org_img_path\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 
'image_numpy_{}'.format(\n                        (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            bbox[0] = bbox[0] * org_img_width\n            bbox[1] = bbox[1] * org_img_height\n            bbox[2] = bbox[2] * org_img_width\n            bbox[3] = bbox[3] * org_img_height\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/ssd_vgg16_512_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ssd_vgg16_512_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        
self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(200 < left < 800)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            cv2.error,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/README.md",
    "content": "# yolov3_darknet53_coco2017\n\n|模型名称|yolov3_darknet53_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|239MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - YOLOv3是由Joseph Redmon和Ali Farhadi提出的单阶段检测器, 该检测器与达到同样精度的传统目标检测方法相比，推断速度能达到接近两倍。 YOLOv3将输入图像划分格子，并对每个格子预测bounding box。YOLOv3的loss函数由三部分组成：Location误差，Confidence误差和分类误差。该PaddleHub Module预训练数据集为COCO2017，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_darknet53_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_darknet53_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_darknet53_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         
score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数任选其一提供输入数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_darknet53_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 
打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.1\n\n  修复numpy数据读取问题\n\n* 1.2.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install yolov3_darknet53_coco2017==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/README_en.md",
    "content": "# yolov3_darknet53_coco2017\n\n|Module Name|yolov3_darknet53_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|239MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on COCO2017, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_darknet53_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_darknet53_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_darknet53_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image path;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold;<br/>\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide the input data via either paths or images, not both\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server 
Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_coco2017\n    ```\n\n  - The service API is now deployed; the default port is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_darknet53_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.1\n\n  Fix the problem of reading numpy data\n\n* 1.2.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install yolov3_darknet53_coco2017==1.2.0\n    ```\n"
  },
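The serving examples in the README above send a JSON body of base64-encoded JPEG bytes and the server decodes each entry back. A minimal stdlib-only sketch of that payload round-trip, using placeholder bytes in place of a real `cv2.imencode` result (`fake_jpeg_bytes` and `bytes_to_base64` are ours, purely illustrative):

```python
import base64
import json


def bytes_to_base64(data: bytes) -> str:
    # Mirrors cv2_to_base64 from the README: encoded-image bytes -> utf8 base64 string.
    return base64.b64encode(data).decode('utf8')


# Placeholder for the bytes that cv2.imencode('.jpg', image)[1] would produce.
fake_jpeg_bytes = b'\xff\xd8\xff\xe0FAKEJPEG'

# The serving endpoint expects a JSON body of the form {"images": [<base64 string>, ...]}.
payload = json.dumps({'images': [bytes_to_base64(fake_jpeg_bytes)]})

# On the server side, each entry is base64-decoded back to the original bytes
# (cf. base64_to_cv2 in processor.py).
decoded = base64.b64decode(json.loads(payload)['images'][0])
assert decoded == fake_jpeg_bytes
```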
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # permute\n        im = np.swapaxes(im, 
1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
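The `reader` above resizes every image to 608x608 with independent x/y scale factors (so aspect ratio is not preserved) and then applies ImageNet mean/std normalization to pixels scaled into [0, 1]. A pure-Python sketch of that per-pixel arithmetic, with the target size and constants copied from data_feed.py (no cv2/numpy; the example input sizes and pixel values are ours):

```python
TARGET_SIZE = 608
MEAN = [0.485, 0.456, 0.406]  # RGB channel means used in data_feed.py
STD = [0.229, 0.224, 0.225]   # RGB channel stds used in data_feed.py


def scale_factors(height, width, target=TARGET_SIZE):
    # cv2.resize is called with fx/fy computed separately, so the image is
    # stretched to target x target regardless of its aspect ratio.
    return target / float(width), target / float(height)


def normalize_pixel(bgr):
    # BGR -> RGB (cv2.cvtColor), then /255, subtract mean, divide by std per channel.
    r, g, b = bgr[2], bgr[1], bgr[0]
    return [((c / 255.0) - m) / s for c, m, s in zip((r, g, b), MEAN, STD)]


fx, fy = scale_factors(416, 832)        # a hypothetical 416x832 input
# fx == 608/832, fy == 608/416
mid = normalize_pixel((128, 128, 128))  # mid-gray lands near zero after normalization
```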
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/label_file.txt",
    "content": "person\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport paddle\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"yolov3_darknet53_coco2017\",\n    version=\"1.2.0\",\n    type=\"CV/object_detection\",\n    summary=\"Baidu's YOLOv3 model for object detection, with backbone DarkNet53, trained with dataset coco2017.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3DarkNet53Coco2017:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"yolov3_darknet53_model\", \"model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def object_detection(self,\n                         
paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. keys include 'data', 'save_path', the corresponding value is:\n                data (list[dict]): the result of object detection, keys of each dict include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 1])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} 
module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='detection_result', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=ast.literal_eval, default=False, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh', type=ast.literal_eval, default=0.5, help=\"threshold for object detection.\")\n"
  },
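`_set_config` in module.py decides whether to build a GPU predictor by checking that `CUDA_VISIBLE_DEVICES` is set and that its first character parses as an int, so an unset variable, an empty string, or `-1` all fall back to CPU. A standalone sketch of that heuristic (the function name and the injectable `env` parameter are ours, for testability):

```python
import os


def cuda_devices_usable(env=os.environ):
    # Mirrors the try/except in _set_config: GPU is enabled only when
    # CUDA_VISIBLE_DEVICES exists and its first character parses as an int.
    try:
        places = env["CUDA_VISIBLE_DEVICES"]
        int(places[0])
        return True
    except Exception:  # KeyError (unset), IndexError (''), ValueError ('-1', etc.)
        return False


assert cuda_devices_usable({"CUDA_VISIBLE_DEVICES": "0,1"}) is True
assert cuda_devices_usable({"CUDA_VISIBLE_DEVICES": "-1"}) is False  # '-' is not a digit
assert cuda_devices_usable({}) is False
```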
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/processor.py",
"content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top), fill=(255, 255, 
255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by the predictor\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): confidence threshold for keeping a detection.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (list[dict]): the result of object detection, keys of each dict include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    assert type(paths) is list, \"type(paths) is not list.\"\n    if handle_id < len(paths):\n        unhandled_paths = paths[handle_id:]\n        unhandled_paths_num = len(unhandled_paths)\n    else:\n        unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < 
score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
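processor.py clamps every predicted box to the image rectangle with `clip_bbox` before reporting coordinates. Below is a verbatim copy of that clamping logic with a couple of checks appended (the example image size and box coordinates are ours):

```python
def clip_bbox(bbox, img_width, img_height):
    # Clamp (xmin, ymin, xmax, ymax) into [0, width] x [0, height], as in processor.py.
    xmin = max(min(bbox[0], img_width), 0.)
    ymin = max(min(bbox[1], img_height), 0.)
    xmax = max(min(bbox[2], img_width), 0.)
    ymax = max(min(bbox[3], img_height), 0.)
    return float(xmin), float(ymin), float(xmax), float(ymax)


# A box hanging off the top-left and right edges of a 640x480 image gets clamped:
assert clip_bbox((-10., -5., 700., 400.), 640, 480) == (0., 0., 640., 400.)
# A box fully inside the image is unchanged:
assert clip_bbox((10., 20., 30., 40.), 640, 480) == (10., 20., 30., 40.)
```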
  {
    "path": "modules/image/object_detection/yolov3_darknet53_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_darknet53_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        
self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(2500 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(3500 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/README.md",
"content": "# yolov3_darknet53_pedestrian\n\n|模型名称|yolov3_darknet53_pedestrian|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|百度自建大规模行人数据集|\n|是否支持Fine-tuning|否|\n|模型大小|238MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131492636-714c697c-3275-4c8c-a83a-cf971a91ba98.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - 行人检测是计算机视觉技术中的目标检测问题，用于判断图像中是否存在行人并给予精确定位，定位结果用矩形框表示。行人检测技术有很强的使用价值，它可以与行人跟踪、行人重识别等技术结合，应用于汽车无人驾驶系统、智能视频监控、人体行为分析、客流统计系统、智能交通等领域。yolov3_darknet53_pedestrian Module的网络为YOLOv3, 其中backbone为DarkNet53, 采用百度自建大规模行人数据集训练得到，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_darknet53_pedestrian\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_darknet53_pedestrian --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现行人检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    pedestrian_detector = hub.Module(name=\"yolov3_darknet53_pedestrian\")\n    result = pedestrian_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = pedestrian_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         
output_dir='yolov3_pedestrian_detect_output',\n                         score_thresh=0.2,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有行人的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；<br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 yolov3\\_pedestrian\\_detect\\_output；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。<br/>\n\n      **NOTE:** paths和images两个参数选择其一提供数据即可\n\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个行人检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_pedestrian\n    ```\n\n  - 这样就完成了一个行人检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = 
\"http://127.0.0.1:8866/predict/yolov3_darknet53_pedestrian\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.0.3\n\n  移除 fluid api\n\n* 1.1.0\n\n  修复推理模型无法导出的问题\n\n  - ```shell\n    $ hub install yolov3_darknet53_pedestrian==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/README_en.md",
    "content": "# yolov3_darknet53_pedestrian\n\n|Module Name|yolov3_darknet53_pedestrian|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|Baidu Pedestrian Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|238MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131492636-714c697c-3275-4c8c-a83a-cf971a91ba98.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on Baidu Pedestrian Dataset, and can be used for pedestrian detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_darknet53_pedestrian\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_darknet53_pedestrian --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    pedestrian_detector = hub.Module(name=\"yolov3_darknet53_pedestrian\")\n    
result = pedestrian_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = pedestrian_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='yolov3_pedestrian_detect_output',\n                         score_thresh=0.2,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all pedestrians in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image paths;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): batch size;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of output images;\n      - score\\_thresh (float): confidence threshold;\n      - visualization (bool): whether to save the results as picture files;\n\n      **NOTE:** provide input data through exactly one of paths and images\n\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is a dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n
## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_pedestrian\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_darknet53_pedestrian\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.2\n\n  Fix the problem of reading numpy\n\n* 1.0.3\n\n  Remove fluid api\n\n* 1.1.0\n\n  Fix a bug in save_inference_model\n\n  - ```shell\n    $ hub install yolov3_darknet53_pedestrian==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(\n            im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # 
permute\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/label_file.txt",
    "content": "pedestrian\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport argparse\nimport ast\nimport os\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import load_label_info\nfrom .processor import postprocess\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"yolov3_darknet53_pedestrian\",\n            version=\"1.1.0\",\n            type=\"CV/object_detection\",\n            summary=\"Baidu's YOLOv3 model for pedestrian detection, with backbone DarkNet53.\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3DarkNet53Pedestrian:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"yolov3_darknet53_pedestrian_model\", \"model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            
gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='yolov3_pedestrian_detect_output',\n                         score_thresh=0.2,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detecion.\n\n        Returns:\n            res (list[dict]): The result of pedestrian detecion. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 1])))\n\n            predictor.run()\n            output_names = 
predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(paths=[args.input_path],\n                                        batch_size=args.batch_size,\n                                        use_gpu=args.use_gpu,\n                                        output_dir=args.output_dir,\n                                        visualization=args.visualization,\n                                        score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='yolov3_pedestrian_detect_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--batch_size', type=ast.literal_eval, default=1, 
help=\"batch size.\")\n        self.arg_input_group.add_argument('--score_thresh',\n                                          type=ast.literal_eval,\n                                          default=0.2,\n                                          help=\"threshold for object detection.\")\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top),\n                  
fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        score_thresh (float): the confidence threshold of bounding boxes.\n        label_names (list[str]): label names.\n        output_dir (str): The path to store output images.\n        handle_id (int): The number of images that have been handled.\n        visualization (bool): Whether to save image or not.\n\n    Returns:\n        res (list[dict]): The result of pedestrian detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                
org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_pedestrian/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/15310014bf794c87a1e3b289d904ecae122aafe8c8fe47fd98634e79a8e4012f'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_darknet53_pedestrian\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('yolov3_pedestrian_detect_output')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'pedestrian')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'pedestrian')\n        self.assertTrue(confidence > 0.5)\n        
self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'pedestrian')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/README.md",
    "content": "# yolov3_darknet53_vehicles\n\n|模型名称|yolov3_darknet53_vehicles|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|百度自建大规模车辆数据集|\n|是否支持Fine-tuning|否|\n|模型大小|238MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131529643-70ee93fc-c9f3-40df-a981-901074683beb.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - 车辆检测是城市交通监控中非常重要并且具有挑战性的任务，该任务的难度在于对复杂场景中相对较小的车辆进行精准地定位和分类。该 PaddleHub Module 的网络为 YOLOv3, 其中 backbone 为 DarkNet53，采用百度自建大规模车辆数据集训练得到，支持car (汽车)、truck (卡车)、bus (公交车)、motorbike (摩托车)、tricycle (三轮车)等车型的识别。目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_darknet53_vehicles\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_darknet53_vehicles --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现车辆检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    vehicles_detector = hub.Module(name=\"yolov3_darknet53_vehicles\")\n    result = vehicles_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = vehicles_detector.object_detection((paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='yolov3_vehicles_detect_output',\n      
                   score_thresh=0.2,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有车辆的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 yolov3\\_vehicles\\_detect\\_output；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个车辆检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_vehicles\n    ```\n\n  - 这样就完成了一个车辆检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_darknet53_vehicles\"\n    r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.0.3\n\n  移除 fluid api\n\n* 1.1.0\n\n  修复推理模型无法导出的问题\n\n  - ```shell\n    $ hub install yolov3_darknet53_vehicles==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/README_en.md",
    "content": "# yolov3_darknet53_vehicles\n\n|Module Name|yolov3_darknet53_vehicles|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|Baidu Vehicle Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|238MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131529643-70ee93fc-c9f3-40df-a981-901074683beb.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on Baidu Vehicle Dataset, and can be used for vehicle detection.\n\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_darknet53_vehicles\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_darknet53_vehicles --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    vehicles_detector = hub.Module(name=\"yolov3_darknet53_vehicles\")\n    result = 
vehicles_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = vehicles_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='yolov3_vehicles_detect_output',\n                         score_thresh=0.2,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all vehicles in an image.\n\n    - **Parameters**\n\n      - paths (list\\[str\\]): image paths;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format \\[H, W, C\\], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide the input data via either paths or images, not both\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results (only present when visualization=True)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## 
IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m yolov3_darknet53_vehicles\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_darknet53_vehicles\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.2\n\n  Fix the problem of reading numpy data\n\n* 1.0.3\n\n  Remove fluid api\n\n* 1.1.0\n\n  Fix the bug that the inference model could not be exported\n\n  - ```shell\n    $ hub install yolov3_darknet53_vehicles==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(\n            im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # 
permute\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/label_file.txt",
    "content": "car\ntruck\nbus\nmotorbike\ntricycle\ncarplate\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport argparse\nimport ast\nimport os\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import load_label_info\nfrom .processor import postprocess\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"yolov3_darknet53_vehicles\",\n            version=\"1.1.0\",\n            type=\"CV/object_detection\",\n            summary=\"Baidu's YOLOv3 model for vehicles detection, with backbone DarkNet53.\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3DarkNet53Vehicles:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"yolov3_darknet53_vehicles_model\", \"model\")\n        self.label_names = load_label_info(os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _get_device_id(self, places):\n        try:\n            places = os.environ[places]\n            id = int(places)\n        except:\n            id = -1\n        return id\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n\n        # create default cpu predictor\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        # create predictors using various types of devices\n\n        # npu\n        npu_id = self._get_device_id(\"FLAGS_selected_npus\")\n        if npu_id != 
-1:\n            # use npu\n            npu_config = Config(model, params)\n            npu_config.disable_glog_info()\n            npu_config.enable_npu(device_id=npu_id)\n            self.npu_predictor = create_predictor(npu_config)\n\n        # gpu\n        gpu_id = self._get_device_id(\"CUDA_VISIBLE_DEVICES\")\n        if gpu_id != -1:\n            # use gpu\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=gpu_id)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n        # xpu\n        xpu_id = self._get_device_id(\"XPU_VISIBLE_DEVICES\")\n        if xpu_id != -1:\n            # use xpu\n            xpu_config = Config(model, params)\n            xpu_config.disable_glog_info()\n            xpu_config.enable_xpu(100)\n            self.xpu_predictor = create_predictor(xpu_config)\n\n    def object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='yolov3_vehicles_detect_output',\n                         score_thresh=0.2,\n                         visualization=True,\n                         use_device=None):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): image data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): score threshold for object detection.\n            use_device (str): use cpu, gpu, xpu or npu, overwrites use_gpu flag.\n\n        Returns:\n            res (list[dict]): The result of vehicle detection. 
keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n\n        # real predictor to use\n        if use_device is not None:\n            if use_device == \"cpu\":\n                predictor = self.cpu_predictor\n            elif use_device == \"xpu\":\n                predictor = self.xpu_predictor\n            elif use_device == \"npu\":\n                predictor = self.npu_predictor\n            elif use_device == \"gpu\":\n                predictor = self.gpu_predictor\n            else:\n                raise Exception(\"Unsupported device: \" + use_device)\n        else:\n            # use_device is not set, therefore follow use_gpu\n            if use_gpu:\n                predictor = self.gpu_predictor\n            else:\n                predictor = self.cpu_predictor\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            input_names = predictor.get_input_names()\n            image_data = np.array(list(feed_data[:, 
0]))\n            image_size_data = np.array(list(feed_data[:, 1]))\n\n            image_tensor = predictor.get_input_handle(input_names[0])\n            image_tensor.reshape(image_data.shape)\n            image_tensor.copy_from_cpu(image_data.copy())\n\n            image_size_tensor = predictor.get_input_handle(input_names[1])\n            image_size_tensor.reshape(image_size_data.shape)\n            image_size_tensor.copy_from_cpu(image_size_data.copy())\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(paths=[args.input_path],\n                                        batch_size=args.batch_size,\n                                        use_gpu=args.use_gpu,\n                                        output_dir=args.output_dir,\n                                        visualization=args.visualization,\n                                        score_thresh=args.score_thresh,\n                                        use_device=args.use_device)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='yolov3_vehicles_detect_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--use_device',\n                                           choices=[\"cpu\", \"gpu\", \"xpu\", \"npu\"],\n                                           help=\"use cpu, gpu, xpu or npu. 
overwrites use_gpu flag.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n        self.arg_input_group.add_argument('--score_thresh',\n                                          type=ast.literal_eval,\n                                          default=0.2,\n                                          help=\"threshold for object detecion.\")\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top), fill=(255, 255, 
255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by Executor.run\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): image data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): the lower confidence bound for keeping a detection box.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of vehicle detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    results = data_out.copy_to_cpu()\n    lod = data_out.lod()[0]\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                
org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_vehicles/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/036990d3d8654d789c2138492155d9dd95dba2a2fc8e410ab059eea42b330f59'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_darknet53_vehicles\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('yolov3_vehicles_detect_output')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'car')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(2000 < left < 4000)\n        self.assertTrue(4000 < right < 6000)\n        self.assertTrue(1000 < top < 3000)\n        self.assertTrue(2000 < bottom < 5000)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'car')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(2000 < 
left < 4000)\n        self.assertTrue(4000 < right < 6000)\n        self.assertTrue(1000 < top < 3000)\n        self.assertTrue(2000 < bottom < 5000)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'car')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(2000 < left < 4000)\n        self.assertTrue(4000 < right < 6000)\n        self.assertTrue(1000 < top < 3000)\n        self.assertTrue(2000 < bottom < 5000)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/README.md",
    "content": "# yolov3_darknet53_venus\n\n|模型名称|yolov3_darknet53_venus|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|501MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - YOLOv3是由Joseph Redmon和Ali Farhadi提出的单阶段检测器, 该检测器与达到同样精度的传统目标检测方法相比，推断速度能达到接近两倍。 YOLOv3将输入图像划分格子，并对每个格子预测bounding box。YOLOv3的loss函数由三部分组成：Location误差，Confidence误差和分类误差。该PaddleHub Module是由800+tag,170w图片，1000w+检测框训练的大规模通用检测模型，在8个数据集上MAP平均提升5.36%，iou=0.5的准确率提升4.53%。对比于其他通用检测模型，使用该Module进行finetune，可以更快收敛，达到较优效果。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_darknet53_venus\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、API\n\n  - ```python\n    def context(trainable=True,\n                pretrained=True,\n                get_prediction=False)\n    ```\n\n    - 提取特征，用于迁移学习。\n\n    - **参数**\n\n      - trainable(bool): 参数是否可训练；<br/>\n      - pretrained (bool): 是否加载预训练模型；<br/>\n      - get\\_prediction (bool): 是否执行预测。\n\n    - **返回**\n      - inputs (dict): 模型的输入，keys 包括 'image', 'im\\_size'，相应的取值为：\n        - image (Variable): 图像变量\n        - im\\_size (Variable): 图片的尺寸\n      - outputs (dict): 模型的输出。如果 get\\_prediction 为 False，输出 'head\\_features'、'body\\_features'，否则输出 'bbox\\_out'\n      - context\\_prog (Program): 用于迁移学习的 Program\n\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 存在模型的目录名称； <br/>\n      - model\\_filename: 
模型文件名称，默认为\\_\\_model\\_\\_； <br/>\n      - params\\_filename: 参数文件名称，默认为\\_\\_params\\_\\_(仅当`combined`为True时生效)；<br/>\n      - combined: 是否将参数保存到统一的一个文件中。\n\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install yolov3_darknet53_venus==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/README_en.md",
    "content": "# yolov3_darknet53_venus\n\n|Module Name|yolov3_darknet53_venus|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|Baidu Detection Dataset|\n|Fine-tuning supported or not|Yes|\n|Module Size|501MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on Baidu Vehicle Dataset which consists of 170w pictures and 1000w+ boxes, improve the accuracy on 8 test datasets for average 5.36%, and can be used for vehicle detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_darknet53_venus\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、API\n\n  - ```python\n    def context(trainable=True,\n                pretrained=True,\n                get_prediction=False)\n    ```\n\n    - Extract features, and do transfer learning\n\n    - **Parameters**\n\n      - trainable(bool): whether parameters trainable or not\n      - pretrained (bool): whether load pretrained model or not\n      - get\\_prediction (bool): whether perform prediction\n\n    - **Return**\n      - inputs (dict): inputs, a dict, include two keys: \"image\" and \"im\\_size\"\n        - image (Variable): image variable\n        - im\\_size (Variable): image size\n      - outputs (dict): model output\n      - 
program for transfer learning\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         score_thresh=0.5,\n                         visualization=True,\n                         output_dir='detection_result')\n    ```\n\n    - Detection API, detects positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image path;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - score\\_thresh (float): confidence threshold;<br/>\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n\n    - **Return**\n\n      - res (list\\[dict\\]): detection results; each element in the list is a dict corresponding to one input image\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname,\n                             model_filename=None,\n                             params_filename=None,\n                             combined=True)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: output dir for 
saving model\n      - model\\_filename: filename for saving model\n      - params\\_filename: filename for saving parameters\n      - combined: whether save parameters into one file\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n  - ```shell\n    $ hub install yolov3_darknet53_venus==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/darknet.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport six\nimport math\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.regularizer import L2Decay\n\n__all__ = ['DarkNet']\n\n\nclass DarkNet(object):\n    \"\"\"DarkNet, see https://pjreddie.com/darknet/yolo/\n\n    Args:\n        depth (int): network depth, currently only darknet 53 is supported\n        norm_type (str): normalization type, 'bn' and 'sync_bn' are supported\n        norm_decay (float): weight decay for normalization layer weights\n        get_prediction (bool): whether to get prediction\n        class_dim (int): number of class while classification\n    \"\"\"\n\n    def __init__(self,\n                 depth=53,\n                 norm_type='sync_bn',\n                 norm_decay=0.,\n                 weight_prefix_name='',\n                 get_prediction=False,\n                 class_dim=1000):\n        assert depth in [53], \"unsupported depth value\"\n        self.depth = depth\n        self.norm_type = norm_type\n        self.norm_decay = norm_decay\n        self.depth_cfg = {53: ([1, 2, 8, 8, 4], self.basicblock)}\n        self.prefix_name = weight_prefix_name\n        self.class_dim = class_dim\n        self.get_prediction = get_prediction\n\n    def _conv_norm(self, input, ch_out, filter_size, stride, padding, act='leaky', name=None):\n        conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=ch_out,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            act=None,\n            param_attr=ParamAttr(name=name + \".conv.weights\"),\n            bias_attr=False)\n\n        bn_name = name + \".bn\"\n        bn_param_attr = ParamAttr(regularizer=L2Decay(float(self.norm_decay)), name=bn_name + '.scale')\n        bn_bias_attr = 
ParamAttr(regularizer=L2Decay(float(self.norm_decay)), name=bn_name + '.offset')\n\n        out = fluid.layers.batch_norm(\n            input=conv,\n            act=None,\n            param_attr=bn_param_attr,\n            bias_attr=bn_bias_attr,\n            moving_mean_name=bn_name + '.mean',\n            moving_variance_name=bn_name + '.var')\n\n        # leaky relu here has `alpha` as 0.1, can not be set by\n        # `act` param in fluid.layers.batch_norm above.\n        if act == 'leaky':\n            out = fluid.layers.leaky_relu(x=out, alpha=0.1)\n\n        return out\n\n    def _downsample(self, input, ch_out, filter_size=3, stride=2, padding=1, name=None):\n        return self._conv_norm(input, ch_out=ch_out, filter_size=filter_size, stride=stride, padding=padding, name=name)\n\n    def basicblock(self, input, ch_out, name=None):\n        conv1 = self._conv_norm(input, ch_out=ch_out, filter_size=1, stride=1, padding=0, name=name + \".0\")\n        conv2 = self._conv_norm(conv1, ch_out=ch_out * 2, filter_size=3, stride=1, padding=1, name=name + \".1\")\n        out = fluid.layers.elementwise_add(x=input, y=conv2, act=None)\n        return out\n\n    def layer_warp(self, block_func, input, ch_out, count, name=None):\n        out = block_func(input, ch_out=ch_out, name='{}.0'.format(name))\n        for j in six.moves.xrange(1, count):\n            out = block_func(out, ch_out=ch_out, name='{}.{}'.format(name, j))\n        return out\n\n    def __call__(self, input):\n        \"\"\"\n        Get the backbone of DarkNet, that is output for the 5 stages.\n        \"\"\"\n        stages, block_func = self.depth_cfg[self.depth]\n        stages = stages[0:5]\n        conv = self._conv_norm(\n            input=input, ch_out=32, filter_size=3, stride=1, padding=1, name=self.prefix_name + \"yolo_input\")\n        downsample_ = self._downsample(\n            input=conv, ch_out=conv.shape[1] * 2, name=self.prefix_name + \"yolo_input.downsample\")\n        blocks = []\n 
       for i, stage in enumerate(stages):\n            block = self.layer_warp(\n                block_func=block_func,\n                input=downsample_,\n                ch_out=32 * 2**i,\n                count=stage,\n                name=self.prefix_name + \"stage.{}\".format(i))\n            blocks.append(block)\n            if i < len(stages) - 1:  # do not downsample in the last stage\n                downsample_ = self._downsample(\n                    input=block, ch_out=block.shape[1] * 2, name=self.prefix_name + \"stage.{}.downsample\".format(i))\n        if self.get_prediction:\n            pool = fluid.layers.pool2d(input=block, pool_type='avg', global_pooling=True)\n            stdv = 1.0 / math.sqrt(pool.shape[1] * 1.0)\n            out = fluid.layers.fc(\n                input=pool,\n                size=self.class_dim,\n                param_attr=ParamAttr(initializer=fluid.initializer.Uniform(-stdv, stdv), name='fc_weights'),\n                bias_attr=ParamAttr(name='fc_offset'))\n            out = fluid.layers.softmax(out)\n            return out\n        else:\n            return blocks\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # permute\n        im = np.swapaxes(im, 
1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport numpy as np\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nfrom paddlehub.common.paddle_helper import add_vars_prefix\n\nfrom yolov3_darknet53_venus.darknet import DarkNet\nfrom yolov3_darknet53_venus.processor import load_label_info, postprocess, base64_to_cv2\nfrom yolov3_darknet53_venus.data_feed import reader\nfrom yolov3_darknet53_venus.yolo_head import MultiClassNMS, YOLOv3Head\n\n\n@moduleinfo(\n    name=\"yolov3_darknet53_venus\",\n    version=\"1.0.0\",\n    type=\"CV/object_detection\",\n    summary=\"Baidu's YOLOv3 model for object detection, with backbone DarkNet53, trained with Baidu self-built dataset.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3DarkNet53Venus(hub.Module):\n    def _initialize(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"yolov3_darknet53_model\")\n\n    def context(self, trainable=True, pretrained=True, get_prediction=False):\n        \"\"\"\n        Distill the Head Features, so as to perform transfer learning.\n\n        Args:\n            trainable (bool): whether to set parameters trainable.\n            pretrained (bool): whether to load default pretrained model.\n            get_prediction (bool): whether to get prediction.\n\n        Returns:\n             inputs(dict): the input variables.\n             outputs(dict): the output variables.\n             context_prog (Program): the program to execute transfer learning.\n        \"\"\"\n        context_prog = fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(context_prog, startup_program):\n            with fluid.unique_name.guard():\n          
      # image\n                image = fluid.layers.data(name='image', shape=[3, 608, 608], dtype='float32')\n                # backbone\n                backbone = DarkNet(norm_type='bn', norm_decay=0., depth=53)\n                # body_feats\n                body_feats = backbone(image)\n                # im_size\n                im_size = fluid.layers.data(name='im_size', shape=[2], dtype='int32')\n                # yolo_head\n                yolo_head = YOLOv3Head(num_classes=708)\n                # head_features\n                head_features, body_features = yolo_head._get_outputs(body_feats, is_train=trainable)\n\n                place = fluid.CPUPlace()\n                exe = fluid.Executor(place)\n                exe.run(fluid.default_startup_program())\n\n                # var_prefix\n                var_prefix = '@HUB_{}@'.format(self.name)\n                # name of inputs\n                inputs = {'image': var_prefix + image.name, 'im_size': var_prefix + im_size.name}\n                # name of outputs\n                if get_prediction:\n                    bbox_out = yolo_head.get_prediction(head_features, im_size)\n                    outputs = {'bbox_out': [var_prefix + bbox_out.name]}\n                else:\n                    outputs = {\n                        'head_features': [var_prefix + var.name for var in head_features],\n                        'body_features': [var_prefix + var.name for var in body_features]\n                    }\n                # add_vars_prefix\n                add_vars_prefix(context_prog, var_prefix)\n                add_vars_prefix(fluid.default_startup_program(), var_prefix)\n                # inputs\n                inputs = {key: context_prog.global_block().vars[value] for key, value in inputs.items()}\n                # outputs\n                outputs = {\n                    key: [context_prog.global_block().vars[varname] for varname in value]\n                    for key, value in outputs.items()\n        
        }\n                # trainable\n                for param in context_prog.global_block().iter_parameters():\n                    param.trainable = trainable\n                # pretrained\n                if pretrained:\n\n                    def _if_exist(var):\n                        return os.path.exists(os.path.join(self.default_pretrained_model_path, var.name))\n\n                    fluid.io.load_vars(exe, self.default_pretrained_model_path, predicate=_if_exist)\n                else:\n                    exe.run(startup_program)\n\n                return inputs, outputs, context_prog\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data['top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top), (left, top)], width=2, fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - (textsize_height + 5), left + textsize_width + 10, top), fill=(255, 255, 
255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return xmin, ymin, xmax, ymax\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths, images, data_out, score_thresh, label_names, output_dir, handle_id, visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by fluid.Executor.run\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        batch_size (int): batch size.\n        use_gpu (bool): Whether to use gpu.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): the low limit of bounding box.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of vehicles detecion. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod_tensor = data_out[0]\n    lod = lod_tensor.lod[0]\n    results = lod_tensor.as_ndarray()\n\n    check_dir(output_dir)\n\n    assert type(paths) is list, \"type(paths) is not list.\"\n    if handle_id < len(paths):\n        unhandled_paths = paths[handle_id:]\n        unhandled_paths_num = len(unhandled_paths)\n    else:\n        unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(org_img, output_dir, 'image_numpy_{}'.format((handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                
continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = confidence\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_darknet53_venus/yolo_head.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom collections import OrderedDict\n\nfrom paddle import fluid\nfrom paddle.fluid.param_attr import ParamAttr\nfrom paddle.fluid.regularizer import L2Decay\n\n__all__ = ['MultiClassNMS', 'YOLOv3Head']\n\n\nclass MultiClassNMS(object):\n    # __op__ = fluid.layers.multiclass_nms\n    def __init__(self, background_label, keep_top_k, nms_threshold, nms_top_k, normalized, score_threshold):\n        super(MultiClassNMS, self).__init__()\n        self.background_label = background_label\n        self.keep_top_k = keep_top_k\n        self.nms_threshold = nms_threshold\n        self.nms_top_k = nms_top_k\n        self.normalized = normalized\n        self.score_threshold = score_threshold\n\n\nclass YOLOv3Head(object):\n    \"\"\"Head block for YOLOv3 network\n\n    Args:\n        norm_decay (float): weight decay for normalization layer weights\n        num_classes (int): number of output classes\n        ignore_thresh (float): threshold to ignore confidence loss\n        label_smooth (bool): whether to use label smoothing\n        anchors (list): anchors\n        anchor_masks (list): anchor masks\n        nms (object): an instance of `MultiClassNMS`\n    \"\"\"\n\n    def __init__(self,\n                 norm_decay=0.,\n                 num_classes=80,\n                 ignore_thresh=0.7,\n                 label_smooth=True,\n                 anchors=[[10, 13], [16, 30], [33, 23], [30, 61], [62, 45], [59, 119], [116, 90], [156, 198],\n                          [373, 326]],\n                 anchor_masks=[[6, 7, 8], [3, 4, 5], [0, 1, 2]],\n                 nms=MultiClassNMS(\n                     background_label=-1,\n                     keep_top_k=100,\n                     nms_threshold=0.45,\n                     nms_top_k=1000,\n                     normalized=True,\n                     score_threshold=0.01),\n                 
weight_prefix_name=''):\n        self.norm_decay = norm_decay\n        self.num_classes = num_classes\n        self.ignore_thresh = ignore_thresh\n        self.label_smooth = label_smooth\n        self.anchor_masks = anchor_masks\n        self._parse_anchors(anchors)\n        self.nms = nms\n        self.prefix_name = weight_prefix_name\n\n    def _conv_bn(self, input, ch_out, filter_size, stride, padding, act='leaky', is_test=True, name=None):\n        conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=ch_out,\n            filter_size=filter_size,\n            stride=stride,\n            padding=padding,\n            act=None,\n            param_attr=ParamAttr(name=name + \".conv.weights\"),\n            bias_attr=False)\n\n        bn_name = name + \".bn\"\n        bn_param_attr = ParamAttr(regularizer=L2Decay(self.norm_decay), name=bn_name + '.scale')\n        bn_bias_attr = ParamAttr(regularizer=L2Decay(self.norm_decay), name=bn_name + '.offset')\n        out = fluid.layers.batch_norm(\n            input=conv,\n            act=None,\n            is_test=is_test,\n            param_attr=bn_param_attr,\n            bias_attr=bn_bias_attr,\n            moving_mean_name=bn_name + '.mean',\n            moving_variance_name=bn_name + '.var')\n\n        if act == 'leaky':\n            out = fluid.layers.leaky_relu(x=out, alpha=0.1)\n        return out\n\n    def _detection_block(self, input, channel, is_test=True, name=None):\n        assert channel % 2 == 0, \\\n            \"channel {} cannot be divided by 2 in detection block {}\" \\\n            .format(channel, name)\n\n        conv = input\n        for j in range(2):\n            conv = self._conv_bn(\n                conv, channel, filter_size=1, stride=1, padding=0, is_test=is_test, name='{}.{}.0'.format(name, j))\n            conv = self._conv_bn(\n                conv, channel * 2, filter_size=3, stride=1, padding=1, is_test=is_test, name='{}.{}.1'.format(name, j))\n        route = 
self._conv_bn(\n            conv, channel, filter_size=1, stride=1, padding=0, is_test=is_test, name='{}.2'.format(name))\n        tip = self._conv_bn(\n            route, channel * 2, filter_size=3, stride=1, padding=1, is_test=is_test, name='{}.tip'.format(name))\n        return route, tip\n\n    def _upsample(self, input, scale=2, name=None):\n        out = fluid.layers.resize_nearest(input=input, scale=float(scale), name=name)\n        return out\n\n    def _parse_anchors(self, anchors):\n        \"\"\"\n        Check ANCHORS/ANCHOR_MASKS in config and parse mask_anchors\n\n        \"\"\"\n        self.anchors = []\n        self.mask_anchors = []\n\n        assert len(anchors) > 0, \"ANCHORS not set.\"\n        assert len(self.anchor_masks) > 0, \"ANCHOR_MASKS not set.\"\n\n        for anchor in anchors:\n            assert len(anchor) == 2, \"anchor {} len should be 2\".format(anchor)\n            self.anchors.extend(anchor)\n\n        anchor_num = len(anchors)\n        for masks in self.anchor_masks:\n            self.mask_anchors.append([])\n            for mask in masks:\n                assert mask < anchor_num, \"anchor mask index overflow\"\n                self.mask_anchors[-1].extend(anchors[mask])\n\n    def _get_outputs(self, input, is_train=True):\n        \"\"\"\n        Get YOLOv3 head output\n\n        Args:\n            input (list): List of Variables, output of backbone stages\n            is_train (bool): whether in train or test mode\n\n        Returns:\n            outputs (list): Variables of each output layer\n        \"\"\"\n\n        outputs = []\n\n        # get last out_layer_num blocks in reverse order\n        out_layer_num = len(self.anchor_masks)\n        if isinstance(input, OrderedDict):\n            blocks = list(input.values())[-1:-out_layer_num - 1:-1]\n        else:\n            blocks = input[-1:-out_layer_num - 1:-1]\n        route = None\n        for i, block in enumerate(blocks):\n            if i > 0:  # perform concat 
in first 2 detection_block\n                block = fluid.layers.concat(input=[route, block], axis=1)\n            route, tip = self._detection_block(\n                block, channel=512 // (2**i), is_test=(not is_train), name=self.prefix_name + \"yolo_block.{}\".format(i))\n\n            # out channel number = mask_num * (5 + class_num)\n            num_filters = len(self.anchor_masks[i]) * (self.num_classes + 5)\n            block_out = fluid.layers.conv2d(\n                input=tip,\n                num_filters=num_filters,\n                filter_size=1,\n                stride=1,\n                padding=0,\n                act=None,\n                param_attr=ParamAttr(name=self.prefix_name + \"yolo_output.{}.conv.weights\".format(i)),\n                bias_attr=ParamAttr(\n                    regularizer=L2Decay(0.), name=self.prefix_name + \"yolo_output.{}.conv.bias\".format(i)))\n            outputs.append(block_out)\n\n            if i < len(blocks) - 1:\n                # do not perform upsample in the last detection_block\n                route = self._conv_bn(\n                    input=route,\n                    ch_out=256 // (2**i),\n                    filter_size=1,\n                    stride=1,\n                    padding=0,\n                    is_test=(not is_train),\n                    name=self.prefix_name + \"yolo_transition.{}\".format(i))\n                # upsample\n                route = self._upsample(route)\n\n        return outputs, blocks\n\n    def get_prediction(self, outputs, im_size):\n        \"\"\"\n        Get prediction result of YOLOv3 network\n\n        Args:\n            outputs (list): list of Variables, return from _get_outputs\n            im_size (Variable): Variable of size([h, w]) of each image\n\n        Returns:\n            pred (Variable): The prediction result after non-max suppress.\n\n        \"\"\"\n        boxes = []\n        scores = []\n        downsample = 32\n        for i, output in 
enumerate(outputs):\n            box, score = fluid.layers.yolo_box(\n                x=output,\n                img_size=im_size,\n                anchors=self.mask_anchors[i],\n                class_num=self.num_classes,\n                conf_thresh=self.nms.score_threshold,\n                downsample_ratio=downsample,\n                name=self.prefix_name + \"yolo_box\" + str(i))\n            boxes.append(box)\n            scores.append(fluid.layers.transpose(score, perm=[0, 2, 1]))\n\n            downsample //= 2\n\n        yolo_boxes = fluid.layers.concat(boxes, axis=1)\n        yolo_scores = fluid.layers.concat(scores, axis=2)\n        pred = fluid.layers.multiclass_nms(\n            bboxes=yolo_boxes,\n            scores=yolo_scores,\n            score_threshold=self.nms.score_threshold,\n            nms_top_k=self.nms.nms_top_k,\n            keep_top_k=self.nms.keep_top_k,\n            nms_threshold=self.nms.nms_threshold,\n            background_label=self.nms.background_label,\n            normalized=self.nms.normalized,\n            name=\"multiclass_nms\")\n        return pred\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/README.md",
    "content": "# yolov3_mobilenet_v1_coco2017\n\n|模型名称|yolov3_mobilenet_v1_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|96MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n\n- ### 模型介绍\n\n  - YOLOv3是由Joseph Redmon和Ali Farhadi提出的单阶段检测器, 该检测器与达到同样精度的传统目标检测方法相比，推断速度能达到接近两倍。YOLOv3将输入图像划分格子，并对每个格子预测bounding box。YOLOv3的loss函数由三部分组成：Location误差，Confidence误差和分类误差。该PaddleHub Module预训练数据集为COCO2017，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_mobilenet_v1_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_mobilenet_v1_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_mobilenet_v1_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection((paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                  
       score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_mobilenet_v1_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_mobilenet_v1_coco2017\"\n    r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install yolov3_mobilenet_v1_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/README_en.md",
    "content": "# yolov3_mobilenet_v1_coco2017\n\n|Module Name|yolov3_mobilenet_v1_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|96MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on COCO2017, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_mobilenet_v1_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_mobilenet_v1_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_mobilenet_v1_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list[str]): image path;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide data with either paths or images, not both\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server 
Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m yolov3_mobilenet_v1_coco2017\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_mobilenet_v1_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.2\n\n  Fix the problem of reading numpy data\n\n* 1.1.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install yolov3_mobilenet_v1_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(\n            im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # 
permute\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/label_file.txt",
    "content": "person\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport paddle\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"yolov3_mobilenet_v1_coco2017\",\n    version=\"1.1.0\",\n    type=\"CV/object_detection\",\n    summary=\n    \"Baidu's YOLOv3 model for object detection with backbone MobileNet_V1, trained with dataset COCO2017.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3MobileNetV1Coco2017:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"yolov3_mobilenet_v1_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def 
object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, 
but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 1])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} 
module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir',\n            type=str,\n            default='detection_result',\n            help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch 
size.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"threshold for object detection.\")\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/processor.py",
"content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - 
(textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by the predictor\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): confidence threshold for keeping a bounding box.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of object detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 'image_numpy_{}'.format(\n                     
   (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_mobilenet_v1_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_mobilenet_v1_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n       
 self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/README.md",
"content": "# yolov3_resnet34_coco2017\n\n|模型名称|yolov3_resnet34_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|164MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - YOLOv3是由Joseph Redmon和Ali Farhadi提出的单阶段检测器, 该检测器与达到同样精度的传统目标检测方法相比，推断速度能达到接近两倍。 YOLOv3将输入图像划分格子，并对每个格子预测bounding box。YOLOv3的loss函数由三部分组成：Location误差，Confidence误差和分类误差。该PaddleHub Module预训练数据集为COCO2017，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_resnet34_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_resnet34_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_resnet34_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         
score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_resnet34_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_resnet34_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 
打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布  \n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install yolov3_resnet34_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/README_en.md",
    "content": "# yolov3_resnet34_coco2017\n\n|Module Name|yolov3_resnet34_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|164MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on COCO2017, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_resnet34_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_resnet34_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_resnet34_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list\\[str\\]): image paths;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\\_thresh (float): confidence threshold;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide input data through either paths or images\n\n    - **Return**\n\n      - res (list\\[dict\\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save model to specific path\n\n    - **Parameters**\n\n      - dirname: model save path\n\n\n## IV.Server 
Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m yolov3_resnet34_coco2017\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_resnet34_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release  \n\n* 1.0.2\n\n  Fix the problem of reading numpy\n\n* 1.1.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install yolov3_resnet34_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(\n            im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # 
permute\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/label_file.txt",
    "content": "person\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport paddle\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"yolov3_resnet34_coco2017\",\n    version=\"1.1.0\",\n    type=\"CV/object_detection\",\n    summary=\n    \"Baidu's YOLOv3 model for object detection with backbone ResNet34, trained with dataset coco2017.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3ResNet34Coco2017:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"yolov3_resnet34_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def 
object_detection(self,\n                         paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, 
but environment variable CUDA_VISIBLE_DEVICES was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            feed_data = np.array(feed_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 1])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} 
module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir',\n            type=str,\n            default='detection_result',\n            help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch 
size.\")\n        self.arg_input_group.add_argument(\n            '--score_thresh',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"threshold for object detecion.\")\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - 
(textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    postprocess the lod_tensor produced by the predictor\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out (lod_tensor): data output of predictor.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): score threshold for filtering bounding boxes.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 'image_numpy_{}'.format(\n                     
   (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet34_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_resnet34_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        
self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/README.md",
    "content": "# yolov3_resnet50_vd_coco2017\n\n|模型名称|yolov3_resnet50_vd_coco2017|\n| :--- | :---: |\n|类别|图像 - 目标检测|\n|网络|YOLOv3|\n|数据集|COCO2017|\n|是否支持Fine-tuning|否|\n|模型大小|178MB|\n|最新更新日期|2021-03-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### 模型介绍\n\n  - YOLOv3是由Joseph Redmon和Ali Farhadi提出的单阶段检测器, 该检测器与达到同样精度的传统目标检测方法相比，推断速度能达到接近两倍。 YOLOv3将输入图像划分格子，并对每个格子预测bounding box。YOLOv3的loss函数由三部分组成：Location误差，Confidence误差和分类误差。该PaddleHub Module预训练数据集为COCO2017，目前仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install yolov3_resnet50_vd_coco2017\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run yolov3_resnet50_vd_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现目标检测模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_resnet50_vd_coco2017\")\n    result = object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection((paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                       
  score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - 预测API，检测输入图片中的所有目标的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - batch\\_size (int): batch 的大小；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；<br/>\n      - score\\_thresh (float): 识别置信度的阈值；<br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n      **NOTE:** paths和images两个参数选择其一进行提供数据\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为:\n        - data (list): 检测结果，list的每一个元素为 dict，各字段为:\n          - confidence (float): 识别的置信度\n          - label (str): 标签\n          - left (int): 边界框的左上角x坐标\n          - top (int): 边界框的左上角y坐标\n          - right (int): 边界框的右下角x坐标\n          - bottom (int): 边界框的右下角y坐标\n        - save\\_path (str, optional): 识别结果的保存路径 (仅当visualization=True时存在)\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      - dirname: 模型保存路径 <br/>\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m yolov3_resnet50_vd_coco2017\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_resnet50_vd_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 
打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.2\n\n  修复numpy数据读取问题\n\n* 1.1.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install yolov3_resnet50_vd_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/README_en.md",
    "content": "# yolov3_resnet50_vd_coco2017\n\n|Module Name|yolov3_resnet50_vd_coco2017|\n| :--- | :---: |\n|Category|object detection|\n|Network|YOLOv3|\n|Dataset|COCO2017|\n|Fine-tuning supported or not|No|\n|Module Size|178MB|\n|Latest update date|2021-03-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n     <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/131506781-b4ecb77b-5ab1-4795-88da-5f547f7f7f9c.jpg\"   width='50%' hspace='10'/>\n     <br />\n     </p>\n\n- ### Module Introduction\n\n  - YOLOv3 is a one-stage detector proposed by Joseph Redmon and Ali Farhadi, which can reach comparable accuracy but twice as fast as traditional methods. This module is based on YOLOv3, trained on COCO2017, and can be used for object detection.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.6.2  \n\n  - paddlehub >= 1.6.0  | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install yolov3_resnet50_vd_coco2017\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run yolov3_resnet50_vd_coco2017 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    object_detector = hub.Module(name=\"yolov3_resnet50_vd_coco2017\")\n    result = 
object_detector.object_detection(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = object_detector.object_detection(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def object_detection(paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True)\n    ```\n\n    - Detection API, detects the positions of all objects in an image\n\n    - **Parameters**\n\n      - paths (list\[str\]): image paths;\n      - images (list\[numpy.ndarray\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - batch_size (int): the size of batch;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - output_dir (str): save path of images;\n      - score\_thresh (float): confidence threshold;\n      - visualization (bool): Whether to save the results as picture files;\n\n      **NOTE:** provide input data with either paths or images; only one of the two parameters is needed\n\n    - **Return**\n\n      - res (list\[dict\]): results\n        - data (list): detection results, each element in the list is dict\n          - confidence (float): the confidence of the result\n          - label (str): label\n          - left (int): the upper left corner x coordinate of the detection box\n          - top (int): the upper left corner y coordinate of the detection box\n          - right (int): the lower right corner x coordinate of the detection box\n          - bottom (int): the lower right corner y coordinate of the detection box\n        - save\_path (str, optional): output path for saving results\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n    - Save the model to a specific path\n\n    - **Parameters**\n\n      - dirname: save model path\n\n\n## IV.Server 
Deployment\n\n- PaddleHub Serving can deploy an online service of object detection.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m yolov3_resnet50_vd_coco2017\n    ```\n\n  - The object detection service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, use the following lines of code to send a prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/yolov3_resnet50_vd_coco2017\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.2\n\n  Fix the problem of reading numpy arrays\n\n* 1.1.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install yolov3_resnet50_vd_coco2017==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/data_feed.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport os\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(paths=[], images=None):\n    \"\"\"\n    data generator\n\n    Args:\n        paths (list[str]): paths to images.\n        images (list(numpy.ndarray)): data of images, shape of each is [H, W, C]\n\n    Yield:\n        res (list): preprocessed image and the size of original image.\n    \"\"\"\n    img_list = []\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        for img_path in paths:\n            assert os.path.isfile(\n                img_path), \"The {} isn't a valid file path.\".format(img_path)\n            img = cv2.imread(img_path).astype('float32')\n            img_list.append(img)\n    if images is not None:\n        for img in images:\n            img_list.append(img)\n\n    for im in img_list:\n        # im_size\n        im_shape = im.shape\n        im_size = np.array([im_shape[0], im_shape[1]], dtype=np.int32)\n\n        # decode image\n        im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        # resize image\n        target_size = 608\n        im_size_min = np.min(im_shape[0:2])\n        im_size_max = np.max(im_shape[0:2])\n        if float(im_size_min) == 0:\n            raise ZeroDivisionError('min size of image is 0')\n\n        im_scale_x = float(target_size) / float(im_shape[1])\n        im_scale_y = float(target_size) / float(im_shape[0])\n        im = cv2.resize(\n            im, None, None, fx=im_scale_x, fy=im_scale_y, interpolation=2)\n\n        # normalize image\n        mean = [0.485, 0.456, 0.406]\n        std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        mean = np.array(mean)[np.newaxis, np.newaxis, :]\n        std = np.array(std)[np.newaxis, np.newaxis, :]\n        im = im / 255.0\n        im -= mean\n        im /= std\n\n        # 
permute\n        im = np.swapaxes(im, 1, 2)\n        im = np.swapaxes(im, 1, 0)\n\n        yield [im, im_size]\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/label_file.txt",
    "content": "person\nbicycle\ncar\nmotorcycle\nairplane\nbus\ntrain\ntruck\nboat\ntraffic light\nfire hydrant\nstop sign\nparking meter\nbench\nbird\ncat\ndog\nhorse\nsheep\ncow\nelephant\nbear\nzebra\ngiraffe\nbackpack\numbrella\nhandbag\ntie\nsuitcase\nfrisbee\nskis\nsnowboard\nsports ball\nkite\nbaseball bat\nbaseball glove\nskateboard\nsurfboard\ntennis racket\nbottle\nwine glass\ncup\nfork\nknife\nspoon\nbowl\nbanana\napple\nsandwich\norange\nbroccoli\ncarrot\nhot dog\npizza\ndonut\ncake\nchair\ncouch\npotted plant\nbed\ndining table\ntoilet\ntv\nlaptop\nmouse\nremote\nkeyboard\ncell phone\nmicrowave\noven\ntoaster\nsink\nrefrigerator\nbook\nclock\nvase\nscissors\nteddy bear\nhair drier\ntoothbrush\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\n\nimport ast\nimport argparse\nimport os\nfrom functools import partial\n\nimport paddle\nimport numpy as np\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import load_label_info, postprocess, base64_to_cv2\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"yolov3_resnet50_vd_coco2017\",\n    version=\"1.1.0\",\n    type=\"CV/object_detection\",\n    summary=\n    \"Baidu's YOLOv3 model for object detection with backbone ResNet50, trained with dataset coco2017.\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\")\nclass YOLOv3ResNet50Coco2017:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"yolov3_resnet50_model\", \"model\")\n        self.label_names = load_label_info(\n            os.path.join(self.directory, \"label_file.txt\"))\n        self._set_config()\n \n    def _set_config(self):\n        \"\"\"\n        predictor config setting.\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def object_detection(self,\n                     
    paths=None,\n                         images=None,\n                         batch_size=1,\n                         use_gpu=False,\n                         output_dir='detection_result',\n                         score_thresh=0.5,\n                         visualization=True):\n        \"\"\"API of Object Detection.\n\n        Args:\n            paths (list[str]): The paths of images.\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save image or not.\n            score_thresh (float): threshold for object detection.\n\n        Returns:\n            res (list[dict]): The result of coco2017 detection. keys include 'data', 'save_path', the corresponding value is:\n                data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                    left (float): The X coordinate of the upper left corner of the bounding box;\n                    top (float): The Y coordinate of the upper left corner of the bounding box;\n                    right (float): The X coordinate of the lower right corner of the bounding box;\n                    bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                    label (str): The label of detection result;\n                    confidence (float): The confidence of detection result.\n                save_path (str, optional): The path to save output images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Attempt to use GPU for prediction, but environment variable CUDA_VISIBLE_DEVICES 
was not set correctly.\"\n                )\n\n        paths = paths if paths else list()\n        data_reader = partial(reader, paths, images)\n        batch_reader = paddle.batch(data_reader, batch_size=batch_size)\n        res = []\n        for iter_id, feed_data in enumerate(batch_reader()):\n            # each item is a ragged [image, im_size] pair, so force object dtype\n            feed_data = np.array(feed_data, dtype=object)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 0])))\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(np.array(list(feed_data[:, 1])))\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            output = postprocess(paths=paths,\n                                 images=images,\n                                 data_out=output_handle,\n                                 score_thresh=score_thresh,\n                                 label_names=self.label_names,\n                                 output_dir=output_dir,\n                                 handle_id=iter_id * batch_size,\n                                 visualization=visualization)\n            res.extend(output)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.object_detection(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run 
{}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.object_detection(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization,\n            score_thresh=args.score_thresh)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir',\n            type=str,\n            default='detection_result',\n            help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=ast.literal_eval,\n            default=1,\n            help=\"batch size.\")\n        self.arg_input_group.add_argument(\n         
   '--score_thresh',\n            type=ast.literal_eval,\n            default=0.5,\n            help=\"threshold for object detection.\")\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/processor.py",
    "content": "# coding=utf-8\nimport base64\nimport os\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\n__all__ = ['base64_to_cv2', 'load_label_info', 'postprocess']\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    # np.fromstring is deprecated for binary data; frombuffer is the supported equivalent\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(img, output_dir, image_path):\n    \"\"\"Get save image name from source image path.\n    \"\"\"\n    image_name = os.path.split(image_path)[-1]\n    name, ext = os.path.splitext(image_name)\n\n    if ext == '':\n        if img.format == 'PNG':\n            ext = '.png'\n        elif img.format == 'JPEG':\n            ext = '.jpg'\n        elif img.format == 'BMP':\n            ext = '.bmp'\n        else:\n            if img.mode == \"RGB\" or img.mode == \"L\":\n                ext = \".jpg\"\n            elif img.mode == \"RGBA\" or img.mode == \"P\":\n                ext = '.png'\n\n    return os.path.join(output_dir, \"{}\".format(name)) + ext\n\n\ndef draw_bounding_box_on_image(image_path, data_list, save_dir):\n    image = Image.open(image_path)\n    draw = ImageDraw.Draw(image)\n    for data in data_list:\n        left, right, top, bottom = data['left'], data['right'], data[\n            'top'], data['bottom']\n        # draw bbox\n        draw.line([(left, top), (left, bottom), (right, bottom), (right, top),\n                   (left, top)],\n                  width=2,\n                  fill='red')\n        # draw label\n        if image.mode == 'RGB':\n            text = data['label'] + \": %.2f%%\" % (100 * data['confidence'])\n            textsize_width, textsize_height = draw.textsize(text=text)\n            draw.rectangle(\n                xy=(left, top - 
(textsize_height + 5),\n                    left + textsize_width + 10, top),\n                fill=(255, 255, 255))\n            draw.text(xy=(left, top - 15), text=text, fill=(0, 0, 0))\n\n    save_name = get_save_image_name(image, save_dir, image_path)\n    if os.path.exists(save_name):\n        os.remove(save_name)\n\n    image.save(save_name)\n    return save_name\n\n\ndef clip_bbox(bbox, img_width, img_height):\n    xmin = max(min(bbox[0], img_width), 0.)\n    ymin = max(min(bbox[1], img_height), 0.)\n    xmax = max(min(bbox[2], img_width), 0.)\n    ymax = max(min(bbox[3], img_height), 0.)\n    return float(xmin), float(ymin), float(xmax), float(ymax)\n\n\ndef load_label_info(file_path):\n    with open(file_path, 'r') as fr:\n        text = fr.readlines()\n        label_names = []\n        for info in text:\n            label_names.append(info.strip())\n        return label_names\n\n\ndef postprocess(paths,\n                images,\n                data_out,\n                score_thresh,\n                label_names,\n                output_dir,\n                handle_id,\n                visualization=True):\n    \"\"\"\n    postprocess the output tensor produced by the predictor\n\n    Args:\n        paths (list[str]): The paths of images.\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        data_out: output tensor handle of the predictor.\n        output_dir (str): The path to store output images.\n        visualization (bool): Whether to save image or not.\n        score_thresh (float): confidence threshold below which bounding boxes are discarded.\n        label_names (list[str]): label names.\n        handle_id (int): The number of images that have been handled.\n\n    Returns:\n        res (list[dict]): The result of detection. 
keys include 'data', 'save_path', the corresponding value is:\n            data (dict): the result of object detection, keys include 'left', 'top', 'right', 'bottom', 'label', 'confidence', the corresponding value is:\n                left (float): The X coordinate of the upper left corner of the bounding box;\n                top (float): The Y coordinate of the upper left corner of the bounding box;\n                right (float): The X coordinate of the lower right corner of the bounding box;\n                bottom (float): The Y coordinate of the lower right corner of the bounding box;\n                label (str): The label of detection result;\n                confidence (float): The confidence of detection result.\n            save_path (str): The path to save output images.\n    \"\"\"\n    lod = data_out.lod()[0]\n    results = data_out.copy_to_cpu()\n\n    check_dir(output_dir)\n\n    if paths:\n        assert type(paths) is list, \"type(paths) is not list.\"\n        if handle_id < len(paths):\n            unhandled_paths = paths[handle_id:]\n            unhandled_paths_num = len(unhandled_paths)\n        else:\n            unhandled_paths_num = 0\n    if images is not None:\n        if handle_id < len(images):\n            unhandled_paths = None\n            unhandled_paths_num = len(images) - handle_id\n        else:\n            unhandled_paths_num = 0\n\n\n    output = list()\n    for index in range(len(lod) - 1):\n        output_i = {'data': []}\n        if unhandled_paths and index < unhandled_paths_num:\n            org_img_path = unhandled_paths[index]\n            org_img = Image.open(org_img_path)\n        else:\n            org_img = images[index - unhandled_paths_num]\n            org_img = org_img.astype(np.uint8)\n            org_img = Image.fromarray(org_img[:, :, ::-1])\n            if visualization:\n                org_img_path = get_save_image_name(\n                    org_img, output_dir, 'image_numpy_{}'.format(\n                   
     (handle_id + index)))\n                org_img.save(org_img_path)\n        org_img_height = org_img.height\n        org_img_width = org_img.width\n        result_i = results[lod[index]:lod[index + 1]]\n        for row in result_i:\n            if len(row) != 6:\n                continue\n            if row[1] < score_thresh:\n                continue\n            category_id = int(row[0])\n            confidence = row[1]\n            bbox = row[2:]\n            dt = {}\n            dt['label'] = label_names[category_id]\n            dt['confidence'] = float(confidence)\n            dt['left'], dt['top'], dt['right'], dt['bottom'] = clip_bbox(\n                bbox, org_img_width, org_img_height)\n            output_i['data'].append(dt)\n\n        output.append(output_i)\n        if visualization:\n            output_i['save_path'] = draw_bounding_box_on_image(\n                org_img_path, output_i['data'], output_dir)\n\n    return output\n"
  },
  {
    "path": "modules/image/object_detection/yolov3_resnet50_vd_coco2017/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/68313e182f5e4ad9907e69dac9ece8fc50840d7ffbd24fa88396f009958f969a'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"yolov3_resnet50_vd_coco2017\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_object_detection1(self):\n        results = self.module.object_detection(\n            paths=['tests/test.jpg']\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection2(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')]\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        
self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection3(self):\n        results = self.module.object_detection(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False\n        )\n        bbox = results[0]['data'][0]\n        label = bbox['label']\n        confidence = bbox['confidence']\n        left = bbox['left']\n        right = bbox['right']\n        top = bbox['top']\n        bottom = bbox['bottom']\n\n        self.assertEqual(label, 'cat')\n        self.assertTrue(confidence > 0.5)\n        self.assertTrue(0 < left < 1000)\n        self.assertTrue(1000 < right < 3500)\n        self.assertTrue(500 < top < 1500)\n        self.assertTrue(1000 < bottom < 4500)\n\n    def test_object_detection4(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.object_detection,\n            paths=['no.jpg']\n        )\n\n    def test_object_detection5(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.object_detection,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()"
  },
  {
    "path": "modules/image/semantic_segmentation/Extract_Line_Draft/Readme.md",
    "content": "# Extract_Line_Draft\n\n|模型名称|Extract_Line_Draft|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|-|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|259MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/1c30757e069541a18dc89b92f0750983b77ad762560849afa0170046672e57a3\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://ai-studio-static-online.cdn.bcebos.com/7ef00637e5974be2847317053f8abe97236cec75fba14f77be2c095529a1eeb3\" width = \"337\" height = \"505\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - 提取线稿（Extract_Line_Draft），该模型可自动根据彩色图生成线稿图。该PaddleHub Module支持API预测及命令行预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2、安装\n\n    - ```shell\n      $ hub install Extract_Line_Draft\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run Extract_Line_Draft --input_path \"testImage\" --use_gpu True\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    Extract_Line_Draft_test = hub.Module(name=\"Extract_Line_Draft\")\n\n    test_img = \"testImage.png\"\n\n    # execute predict\n    Extract_Line_Draft_test.ExtractLine(test_img, use_gpu=True)\n    ```\n  \n  - ### 3、API\n\n    ```python\n    def ExtractLine(image, use_gpu=False)\n    ```\n\n    - 预测API，用于从彩色图中提取线稿。\n\n    - **参数**\n\n      * image(str): 待处理的图片路径\n      * use_gpu (bool): 是否使用 GPU\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  ```shell\n  $ hub install Extract_Line_Draft==1.1.0\n  ```"
  },
  {
    "path": "modules/image/semantic_segmentation/Extract_Line_Draft/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/Extract_Line_Draft/function.py",
    "content": "import numpy as np\nimport cv2\nfrom scipy import ndimage\n\n\ndef get_normal_map(img):\n    img = img.astype(np.float32)\n    img = img / 255.0\n    img = - img + 1\n    img[img < 0] = 0\n    img[img > 1] = 1\n    return img\n\n\ndef get_gray_map(img):\n    gray = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_BGR2GRAY)\n    highPass = gray.astype(np.float32)\n    highPass = highPass / 255.0\n    highPass = 1 - highPass\n    highPass = highPass[None]\n    return highPass.transpose((1, 2, 0))\n\n\ndef get_light_map(img):\n    gray = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_BGR2GRAY)\n    blur = cv2.GaussianBlur(gray, (0, 0), 3)\n    highPass = gray.astype(int) - blur.astype(int)\n    highPass = highPass.astype(np.float32)\n    highPass = highPass / 128.0\n    highPass = highPass[None]\n    return highPass.transpose((1, 2, 0))\n\n\ndef get_light_map_single(img):\n    gray = img\n    gray = gray[None]\n    gray = gray.transpose((1, 2, 0))\n    blur = cv2.GaussianBlur(gray, (0, 0), 3)\n    gray = gray.reshape((gray.shape[0], gray.shape[1]))\n    highPass = gray.astype(int) - blur.astype(int)\n    highPass = highPass.astype(np.float32)\n    highPass = highPass / 128.0\n    return highPass\n\n\ndef get_light_map_drawer(img):\n    gray = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_BGR2GRAY)\n    blur = cv2.GaussianBlur(gray, (0, 0), 3)\n    highPass = gray.astype(int) - blur.astype(int) + 255\n    highPass[highPass < 0] = 0\n    highPass[highPass > 255] = 255\n    highPass = highPass.astype(np.float32)\n    highPass = highPass / 255.0\n    highPass = 1 - highPass\n    highPass = highPass[None]\n    return highPass.transpose((1, 2, 0))\n\n\ndef get_light_map_drawer2(img):\n    ret = img.copy()\n    ret = ret.astype(np.float32)\n    ret[:, :, 0] = get_light_map_drawer3(img[:, :, 0])\n    ret[:, :, 1] = get_light_map_drawer3(img[:, :, 1])\n    ret[:, :, 2] = get_light_map_drawer3(img[:, :, 2])\n    ret = np.amax(ret, 2)\n    return ret\n\n\ndef 
get_light_map_drawer3(img):\n    gray = img\n    blur = cv2.blur(gray, ksize=(5, 5))\n    highPass = gray.astype(int) - blur.astype(int) + 255\n    highPass[highPass < 0] = 0\n    highPass[highPass > 255] = 255\n    highPass = highPass.astype(np.float32)\n    highPass = highPass / 255.0\n    highPass = 1 - highPass\n    return highPass\n\n\ndef normalize_pic(img):\n    img = img / np.max(img)\n    return img\n\n\ndef superlize_pic(img):\n    img = img * 2.33333\n    img[img > 1] = 1\n    return img\n\n\ndef mask_pic(img, mask):\n    mask_mat = mask\n    mask_mat = mask_mat.astype(np.float32)\n    mask_mat = cv2.GaussianBlur(mask_mat, (0, 0), 1)\n    mask_mat = mask_mat / np.max(mask_mat)\n    mask_mat = mask_mat * 255\n    mask_mat[mask_mat < 255] = 0\n    mask_mat = mask_mat.astype(np.uint8)\n    mask_mat = cv2.GaussianBlur(mask_mat, (0, 0), 3)\n    mask_mat = get_gray_map(mask_mat)\n    mask_mat = normalize_pic(mask_mat)\n    mask_mat = resize_img_512(mask_mat)\n    super_from = np.multiply(img, mask_mat)\n    return super_from\n\n\ndef resize_img_512(img):\n    zeros = np.zeros((512, 512, img.shape[2]), dtype=np.float32)\n    zeros[:img.shape[0], :img.shape[1]] = img\n    return zeros\n\n\ndef resize_img_512_3d(img):\n    zeros = np.zeros((1, 3, 512, 512), dtype=np.float32)\n    zeros[0, 0: img.shape[0], 0: img.shape[1], 0: img.shape[2]] = img\n    return zeros.transpose((1, 2, 3, 0))\n\n\ndef denoise_mat(img, i):\n    return ndimage.median_filter(img, i)\n\n\ndef show_active_img_and_save_denoise(img, path):\n    mat = img.astype(np.float32)\n    mat = - mat + 1\n    mat = mat * 255.0\n    mat[mat < 0] = 0\n    mat[mat > 255] = 255\n    mat = mat.astype(np.uint8)\n    mat = ndimage.median_filter(mat, 1)\n    cv2.imwrite(path, mat)\n    return\n\n\ndef show_active_img(name, img):\n    mat = img.astype(np.float32)\n    mat = - mat + 1\n    mat = mat * 255.0\n    mat[mat < 0] = 0\n    mat[mat > 255] = 255\n    mat = mat.astype(np.uint8)\n    cv2.imshow(name, mat)\n 
   return\n\n\ndef get_active_img(img):\n    mat = img.astype(np.float32)\n    mat = - mat + 1\n    mat = mat * 255.0\n    mat[mat < 0] = 0\n    mat[mat > 255] = 255\n    mat = mat.astype(np.uint8)\n    return mat\n\n\ndef get_active_img_fil(img):\n    mat = img.astype(np.float32)\n    mat[mat < 0.18] = 0\n    mat = - mat + 1\n    mat = mat * 255.0\n    mat[mat < 0] = 0\n    mat[mat > 255] = 255\n    mat = mat.astype(np.uint8)\n    return mat\n\n\ndef show_double_active_img(name, img):\n    mat = img.astype(np.float32)\n    mat = mat * 128.0\n    mat = mat + 127.0\n    mat[mat < 0] = 0\n    mat[mat > 255] = 255\n    cv2.imshow(name, mat.astype(np.uint8))\n    return\n\n\ndef debug_pic_helper():\n    for index in range(1130):\n        gray_path = 'data\\\\gray\\\\' + str(index) + '.jpg'\n        color_path = 'data\\\\color\\\\' + str(index) + '.jpg'\n\n        mat_color = cv2.imread(color_path)\n        mat_color = get_light_map(mat_color)\n        mat_color = normalize_pic(mat_color)\n        mat_color = resize_img_512(mat_color)\n        show_double_active_img('mat_color', mat_color)\n\n        mat_gray = cv2.imread(gray_path)\n        mat_gray = get_gray_map(mat_gray)\n        mat_gray = normalize_pic(mat_gray)\n        mat_gray = resize_img_512(mat_gray)\n        show_active_img('mat_gray', mat_gray)\n\n        cv2.waitKey(1000)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Extract_Line_Draft/module.py",
    "content": "import argparse\nimport ast\nimport os\nimport cv2\nfrom pathlib import Path\n\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import runnable, moduleinfo\nimport numpy as np\nfrom .function import get_light_map_single, normalize_pic, resize_img_512_3d, show_active_img_and_save_denoise\n\n\n@moduleinfo(\n    name=\"Extract_Line_Draft\",\n    version=\"1.1.0\",\n    type=\"cv/segmentation\",\n    summary=\"Import the color picture and generate the line draft of the picture\",\n    author=\"彭兆帅，郑博培\",\n    author_email=\"1084667371@qq.com，2733821739@qq.com\")\nclass ExtractLineDraft:\n    def __init__(self):\n        \"\"\"\n        Initialize with the necessary elements\n        \"\"\"\n        # 加载模型路径\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"assets\", \"infer_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        self.model_file_path = self.default_pretrained_model_path\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.switch_ir_optim(True)\n        cpu_config.enable_memory_optim()\n        cpu_config.switch_use_feed_fetch_ops(False)\n        cpu_config.switch_specify_input_names(True)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.switch_ir_optim(True)\n            
gpu_config.enable_memory_optim()\n            gpu_config.switch_use_feed_fetch_ops(False)\n            gpu_config.switch_specify_input_names(True)\n            gpu_config.enable_use_gpu(100, 0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    # 模型预测函数\n    def predict(self, input_datas):\n        outputs = []\n        # 遍历输入数据进行预测\n        for input_data in input_datas:\n            inputs = input_data.copy()\n            self.input_handle.copy_from_cpu(inputs)\n            self.predictor.run()\n            output = self.output_handle.copy_to_cpu()\n            outputs.append(output)\n\n        # 预测结果合并\n        outputs = np.concatenate(outputs, 0)\n\n        # 返回预测结果\n        return outputs\n\n    def ExtractLine(self, image, use_gpu=False):\n        \"\"\"\n        Extract the line draft of an image\n\n        Args:\n             image (str): image path\n             use_gpu (bool): Whether to use GPU\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        from_mat = cv2.imread(image)\n        width = float(from_mat.shape[1])\n        height = float(from_mat.shape[0])\n        new_width = 0\n        new_height = 0\n        if (width > height):\n            from_mat = cv2.resize(\n                from_mat, (512, int(512 / width * height)), interpolation=cv2.INTER_AREA)\n            new_width = 512\n            new_height = int(512 / width * height)\n        else:\n            from_mat = cv2.resize(\n                from_mat, (int(512 / height * width), 512), interpolation=cv2.INTER_AREA)\n            new_width = int(512 / height * width)\n            new_height = 512\n\n        from_mat = from_mat.transpose((2, 0, 1))\n        light_map = np.zeros(from_mat.shape, dtype=np.float32)\n        for channel in range(3):\n            light_map[channel] = get_light_map_single(from_mat[channel])\n        light_map = normalize_pic(light_map)\n        light_map = resize_img_512_3d(light_map)\n        light_map = light_map.astype('float32')\n\n        # 获取模型的输入输出\n        if use_gpu:\n            self.predictor = self.gpu_predictor\n        else:\n            self.predictor = self.cpu_predictor\n\n        self.input_names = self.predictor.get_input_names()\n        self.output_names = self.predictor.get_output_names()\n        self.input_handle = self.predictor.get_input_handle(\n            self.input_names[0])\n        self.output_handle = self.predictor.get_output_handle(\n            self.output_names[0])\n        line_mat = self.predict(np.expand_dims(\n            light_map, axis=0).astype('float32'))\n        # 去除 batch 维度 (512, 512, 3)\n        line_mat = line_mat.transpose((3, 1, 2, 0))[0]\n        # 裁剪 (512, 384, 3)\n        line_mat = line_mat[0:int(new_height), 0:int(new_width), :]\n        line_mat = np.amax(line_mat, 2)\n        # 保存图片\n        if Path('./output/').exists():\n            
show_active_img_and_save_denoise(\n                line_mat, './output/' + 'output.png')\n        else:\n            os.makedirs('./output/')\n            show_active_img_and_save_denoise(\n                line_mat, './output/' + 'output.png')\n        print('图片已经完成')\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except RuntimeError:\n            self.parser.print_help()\n            return None\n\n        use_gpu = args.use_gpu\n        self.ExtractLine(image=input_data, use_gpu=use_gpu)\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--image',\n            type=str,\n            default=None,\n            help=\"path to the input image\")\n        self.arg_input_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether to use GPU\")\n\n    def check_input_data(self, args):\n        if args.image:\n            if not os.path.exists(args.image):\n                raise RuntimeError(\"Path %s does not exist.\" % args.image)\n        path = \"{}\".format(args.image)\n        return 
path\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Extract_Line_Draft/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://ai-studio-static-online.cdn.bcebos.com/1c30757e069541a18dc89b92f0750983b77ad762560849afa0170046672e57a3'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"Extract_Line_Draft\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('output')\n\n    def test_ExtractLine1(self):\n        self.module.ExtractLine(\n            image='tests/test.jpg',\n            use_gpu=False\n        )\n        self.assertTrue(os.path.exists('output/output.png'))\n\n    def test_ExtractLine2(self):\n        self.module.ExtractLine(\n            image='tests/test.jpg',\n            use_gpu=True\n        )\n        self.assertTrue(os.path.exists('output/output.png'))\n\n    def test_ExtractLine3(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.ExtractLine,\n            image='no.jpg'\n        )\n\n    def test_ExtractLine4(self):\n        self.assertRaises(\n            TypeError,\n            self.module.ExtractLine,\n            image=['tests/test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ExtremeC3_Portrait_Segmentation/README.md",
    "content": "# ExtremeC3_Portrait_Segmentation\r\n\r\n|模型名称|ExtremeC3_Portrait_Segmentation|\r\n| :--- | :---: |\r\n|类别|图像-图像分割|\r\n|网络|ExtremeC3|\r\n|数据集|EG1800, Baidu fashion dataset|\r\n|是否支持Fine-tuning|否|\r\n|模型大小|0.038MB|\r\n|指标|-|\r\n|最新更新日期|2021-02-26|\r\n\r\n## 一、模型基本信息\r\n\r\n- ### 应用效果展示\r\n\r\n    - 样例结果示例：\r\n        <p align=\"center\">\r\n        <img src=\"https://ai-studio-static-online.cdn.bcebos.com/1261398a98e24184852bdaff5a4e1dbd7739430f59fb47e8b84e3a2cfb976107\"  hspace='10'/> <br />\r\n        </p>\r\n\r\n\r\n- ### 模型介绍\r\n    * 基于 ExtremeC3 模型实现的轻量化人像分割模型\r\n\r\n    * 更多详情请参考： [ExtremeC3_Portrait_Segmentation](https://github.com/clovaai/ext_portrait_segmentation) 项目\r\n\r\n## 二、安装\r\n\r\n- ### 1、环境依赖\r\n    - paddlepaddle >= 2.0.0  \r\n\r\n    - paddlehub >= 2.0.0\r\n\r\n- ### 2、安装\r\n\r\n    - ```shell\r\n      $ hub install ExtremeC3_Portrait_Segmentation\r\n      ```\r\n      \r\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\r\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\r\n\r\n\r\n## 三、模型API预测\r\n\r\n- ### 1、预测代码示例\r\n\r\n    ```python\r\n    import cv2\r\n    import paddlehub as hub\r\n\r\n    model = hub.Module(name='ExtremeC3_Portrait_Segmentation')\r\n\r\n    result = model.Segmentation(\r\n        images=[cv2.imread('/PATH/TO/IMAGE')],\r\n        paths=None,\r\n        batch_size=1,\r\n        output_dir='output',\r\n        visualization=False)\r\n    ```\r\n\r\n- ### 2、API\r\n\r\n    ```python\r\n    def Segmentation(\r\n        images=None,\r\n        paths=None,\r\n        batch_size=1,\r\n        output_dir='output',\r\n        visualization=False):\r\n    ```\r\n    - 人像分割 API\r\n\r\n    - **参数**\r\n        * images (list[np.ndarray]) : 输入图像数据列表（BGR）\r\n        * paths (list[str]) : 输入图像路径列表\r\n        * batch_size (int) : 数据批大小\r\n        * output_dir (str) : 可视化图像输出目录\r\n        * visualization (bool) : 是否可视化\r\n\r\n    - **返回**\r\n        * results (list[dict{\"mask\":np.ndarray,\"result\":np.ndarray}]): 输出图像数据列表\r\n\r\n## 四、更新历史\r\n\r\n* 1.0.0\r\n\r\n  初始发布\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ExtremeC3_Portrait_Segmentation/README_en.md",
    "content": "# ExtremeC3_Portrait_Segmentation\n\n|Module Name|ExtremeC3_Portrait_Segmentation|\n| :--- | :---: |\n|Category|image segmentation|\n|Network |ExtremeC3|\n|Dataset|EG1800, Baidu fashion dataset|\n|Fine-tuning supported or not|No|\n|Module Size|0.038MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n    - Sample results:\n        <p align=\"center\">\n        <img src=\"https://ai-studio-static-online.cdn.bcebos.com/1261398a98e24184852bdaff5a4e1dbd7739430f59fb47e8b84e3a2cfb976107\"  hspace='10'/> <br />\n        </p>\n\n\n- ### Module Introduction\n    * ExtremeC3_Portrait_Segmentation is a lightweight module based on ExtremeC3 to achieve portrait segmentation.\n\n    * For more information, please refer to: [ExtremeC3_Portrait_Segmentation](https://github.com/clovaai/ext_portrait_segmentation).\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0  \n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ExtremeC3_Portrait_Segmentation\n      ```\n      \n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III. Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='ExtremeC3_Portrait_Segmentation')\n\n    result = model.Segmentation(\n        images=[cv2.imread('/PATH/TO/IMAGE')],\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False)\n    ```\n\n- ### 2、API\n\n    ```python\n    def Segmentation(\n        images=None,\n        paths=None,\n        batch_size=1,\n        output_dir='output',\n        visualization=False):\n    ```\n    - Prediction API, used for portrait segmentation.\n\n    - **Parameters**\n        * images (list[np.ndarray]) : image data, ndarray.shape is in the format [H, W, C], BGR;\n        * paths (list[str]) : image path;\n        * batch_size (int) : batch size;\n        * output_dir (str) : save path of images, 'output' by default;\n        * visualization (bool) : whether to save the segmentation results as picture files.\n    - **Return**\n        * results (list[dict{\"mask\":np.ndarray,\"result\":np.ndarray}]): list of segmentation results.\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ExtremeC3_Portrait_Segmentation/module.py",
    "content": "import os\r\nimport cv2\r\nimport paddle\r\nimport numpy as np\r\n\r\nfrom paddle.nn import Layer\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"ExtremeC3_Portrait_Segmentation\",  # 模型名称\r\n    type=\"CV/semantic_segmentation\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"ExtremeC3_Portrait_Segmentation\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass ExtremeC3_Portrait_Segmentation(Layer):\r\n    # 初始化函数\r\n    def __init__(self, name=None, directory=None):\r\n        super(ExtremeC3_Portrait_Segmentation, self).__init__()\r\n        # 设置模型路径\r\n        self.model_path = os.path.join(self.directory, \"ExtremeC3\")\r\n\r\n        # 加载模型\r\n        self.model = paddle.jit.load(self.model_path)\r\n        self.model.eval()\r\n\r\n        # 均值方差\r\n        self.mean = [107.304565, 115.69884, 132.35703]\r\n        self.std = [63.97182, 65.1337, 68.29726]\r\n\r\n    # 读取数据函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 预处理函数\r\n    def preprocess(self, datas, batch_size):\r\n        input_datas = []\r\n        for img in datas:\r\n            # 缩放\r\n            h, w = img.shape[:2]\r\n            img = cv2.resize(img, (224, 224))\r\n\r\n            # 格式转换\r\n            img = img.astype(np.float32)\r\n\r\n            # 归一化\r\n            for j in range(3):\r\n                img[:, :, j] -= self.mean[j]\r\n            for j in range(3):\r\n                img[:, :, j] /= self.std[j]\r\n            img /= 
255.\r\n\r\n            # 格式转换\r\n            img = img.transpose((2, 0, 1))\r\n            img = img[np.newaxis, ...]\r\n\r\n            input_datas.append(img)\r\n\r\n        # 数据切分\r\n        input_datas = np.concatenate(input_datas, 0)\r\n        split_num = len(datas) // batch_size + 1 if len(datas) % batch_size != 0 else len(datas) // batch_size\r\n        input_datas = np.array_split(input_datas, split_num)\r\n        return input_datas\r\n\r\n    # 后处理函数\r\n    def postprocess(self, outputs, datas, output_dir, visualization):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        # 拼接输出\r\n        outputs = paddle.concat(outputs)\r\n\r\n        results = []\r\n        for output, img, i in zip(outputs, datas, range(len(datas))):\r\n            # 计算MASK\r\n            mask = (output[0] > 0).numpy().astype('float32')\r\n\r\n            # 缩放\r\n            h, w = img.shape[:2]\r\n            mask = cv2.resize(mask, (w, h))\r\n\r\n            # 计算输出图像\r\n            result = (img * mask[..., np.newaxis] + (1 - mask[..., np.newaxis]) * 255).astype(np.uint8)\r\n\r\n            # 格式还原\r\n            mask = (mask * 255).astype(np.uint8)\r\n\r\n            # 可视化\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_mask_%d.png' % i), mask)\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), result)\r\n\r\n            results.append({'mask': mask, 'result': result})\r\n\r\n        return results\r\n\r\n    # 人像分割函数\r\n    def Segmentation(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\r\n        # 加载数据\r\n        datas = self.load_datas(paths, images)\r\n\r\n        # 获取输入数据\r\n        input_datas = self.preprocess(datas, batch_size)\r\n\r\n        # 模型预测\r\n        outputs = [self.model(paddle.to_tensor(input_data)) for input_data in input_datas]\r\n\r\n        # 结果后处理\r\n        results = self.postprocess(outputs, datas, output_dir, visualization)\r\n\r\n        # 返回结果\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/README.md",
    "content": "# FCN_HRNet_W18_Face_Seg\r\n\r\n|模型名称|FCN_HRNet_W18_Face_Seg|\r\n| :--- | :---: |\r\n|类别|图像 - 图像分割|\r\n|网络|FCN_HRNet_W18|\r\n|数据集|-|\r\n|是否支持Fine-tuning|否|\r\n|模型大小|56MB|\r\n|最新更新日期|2021-02-26|\r\n|数据指标|-|\r\n\r\n\r\n## 一、模型基本信息\r\n\r\n- ### 应用效果展示\r\n  - 样例结果示例：\r\n    <p align=\"center\">\r\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/88155299a7534f1084f8467a4d6db7871dc4729627d3471c9129d316dc4ff9bc\"  width='70%' hspace='10'/> <br />\r\n    </p>\r\n\r\n\r\n- ### 模型介绍\r\n\r\n  - 基于 FCN_HRNet_W18模型实现的人像分割模型。\r\n\r\n\r\n## 二、安装\r\n\r\n- ### 1、环境依赖  \r\n\r\n  - paddlepaddle >= 2.0.0  \r\n\r\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)  \r\n\r\n- ### 2、安装\r\n\r\n  - ```shell\r\n    $ hub install FCN_HRNet_W18_Face_Seg\r\n    ```\r\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\r\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\r\n\r\n## 三、模型API预测\r\n\r\n- ### 1、预测代码示例\r\n\r\n  - ```python\r\n    import paddlehub as hub\r\n    import cv2\r\n\r\n    model = hub.Module(name=\"FCN_HRNet_W18_Face_Seg\")\r\n    result = model.Segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\r\n    # or\r\n    # result = model.Segmentation(paths=['/PATH/TO/IMAGE'])\r\n    ```\r\n\r\n- ### 2、API\r\n\r\n  - ```python\r\n    def Segmentation(images=None,\r\n                    paths=None,\r\n                    batch_size=1,\r\n                    output_dir='output',\r\n                    visualization=False):\r\n    ```\r\n\r\n    - 人像分割API。\r\n\r\n    - **参数**\r\n\r\n      - images (list\[numpy.ndarray\]): 图片数据，ndarray.shape 为 \[H, W, C\]；<br/>\r\n      - paths (list\[str\]): 输入图像路径；<br/>\r\n      - batch_size (int) : batch大小；<br/>\r\n      - output\_dir (str): 图片的保存路径，默认设为 output；<br/>\r\n      - visualization (bool) : 是否将结果保存为图片文件；<br/>\r\n\r\n      **NOTE:** paths和images两个参数选择其一提供数据即可\r\n\r\n    - **返回**\r\n      - res (list\[numpy.ndarray\]): 输出图像数据，ndarray.shape 为 \[H, W, C\]\r\n\r\n\r\n\r\n\r\n## 四、更新历史\r\n\r\n* 1.0.0\r\n\r\n  初始发布\r\n\r\n  - ```shell\r\n    $ hub install FCN_HRNet_W18_Face_Seg==1.0.0\r\n    ```\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/README_en.md",
    "content": "# FCN_HRNet_W18_Face_Seg\n\n|Module Name|FCN_HRNet_W18_Face_Seg|\n| :--- | :---: |\n|Category|image segmentation|\n|Network|FCN_HRNet_W18|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|56MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/88155299a7534f1084f8467a4d6db7871dc4729627d3471c9129d316dc4ff9bc\"  width='70%' hspace='10'/> <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - This module is based on FCN_HRNet_W18 model, and can be used to segment face region.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.0   | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install FCN_HRNet_W18_Face_Seg\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"FCN_HRNet_W18_Face_Seg\")\n    result = model.Segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    # or\n    # result = model.Segmentation(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def Segmentation(images=None,\n                    paths=None,\n                    batch_size=1,\n                    output_dir='output',\n                    visualization=False):\n    ```\n\n    - Face segmentation API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, 
C], BGR;\n      - paths (list[str]): image path;\n      - batch_size (int): the size of batch;\n      - output_dir (str): save path of images;\n      - visualization (bool): whether to save the results as picture files;\n\n      **NOTE:** provide input data through either paths or images, not both\n\n    - **Return**\n      - res (list\[numpy.ndarray\]): result list, ndarray.shape is \[H, W, C\]\n\n\n\n\n## IV.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install FCN_HRNet_W18_Face_Seg==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/model/__init__.py",
    "content": "from .hrnet import HRNet_W18\nfrom .fcn import FCN\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/model/fcn.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .layers import ConvBNReLU\n\n\nclass FCN(nn.Layer):\n    \"\"\"\n    A simple implementation for FCN based on PaddlePaddle.\n\n    The original article refers to\n    Evan Shelhamer, et, al. \"Fully Convolutional Networks for Semantic Segmentation\"\n    (https://arxiv.org/abs/1411.4038).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone (paddle.nn.Layer): Backbone networks.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. 
Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes,\n                 backbone,\n                 backbone_indices=(-1, ),\n                 channels=None,\n                 align_corners=False,\n                 pretrained=None):\n        super(FCN, self).__init__()\n\n        self.backbone = backbone\n        backbone_channels = [backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = FCNHead(num_classes, backbone_indices, backbone_channels, channels)\n\n        self.align_corners = align_corners\n        self.pretrained = pretrained\n\n    def forward(self, x):\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, x.shape[2:], mode='bilinear', align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass FCNHead(nn.Layer):\n    \"\"\"\n    A simple implementation for FCNHead based on PaddlePaddle.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        backbone_channels (tuple, optional): The numbers of channels of the features indexed by backbone_indices.\n            Default: (270, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n    \"\"\"\n\n    def __init__(self, num_classes, backbone_indices=(-1, ), backbone_channels=(270, ), channels=None):\n        super(FCNHead, self).__init__()\n\n        self.num_classes = num_classes\n        self.backbone_indices = backbone_indices\n        if channels is None:\n            channels = backbone_channels[0]\n\n        self.conv_1 = ConvBNReLU(\n            in_channels=backbone_channels[0], out_channels=channels, kernel_size=1, padding='same', stride=1)\n        self.cls = nn.Conv2D(in_channels=channels, out_channels=self.num_classes, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, feat_list):\n        logit_list = []\n        x = feat_list[self.backbone_indices[0]]\n        x = self.conv_1(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/model/hrnet.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .layers import ConvBNReLU, ConvBN\n\n__all__ = [\n    \"HRNet_W18_Small_V1\", \"HRNet_W18_Small_V2\", \"HRNet_W18\", \"HRNet_W30\", \"HRNet_W32\", \"HRNet_W40\", \"HRNet_W44\",\n    \"HRNet_W48\", \"HRNet_W60\", \"HRNet_W64\"\n]\n\n\nclass HRNet(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et, al. \"HRNet：Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        pretrained (str): The path of pretrained model.\n        stage1_num_modules (int): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list): Number of blocks per module for stage1. Default [4].\n        stage1_num_channels (list): Number of channels per branch for stage1. Default [64].\n        stage2_num_modules (int): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list): Number of blocks per module for stage2. Default [4, 4]\n        stage2_num_channels (list): Number of channels per branch for stage2. Default [18, 36].\n        stage3_num_modules (int): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list): Number of blocks per module for stage3. 
Default [4, 4, 4]\n        stage3_num_channels (list): Number of channels per branch for stage3. Default [18, 36, 72].\n        stage4_num_modules (int): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list): Number of blocks per module for stage4. Default [4, 4, 4, 4]\n        stage4_num_channels (list): Number of channels per branch for stage4. Default [18, 36, 72. 144].\n        has_se (bool): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 pretrained=None,\n                 stage1_num_modules=1,\n                 stage1_num_blocks=[4],\n                 stage1_num_channels=[64],\n                 stage2_num_modules=1,\n                 stage2_num_blocks=[4, 4],\n                 stage2_num_channels=[18, 36],\n                 stage3_num_modules=4,\n                 stage3_num_blocks=[4, 4, 4],\n                 stage3_num_channels=[18, 36, 72],\n                 stage4_num_modules=3,\n                 stage4_num_blocks=[4, 4, 4, 4],\n                 stage4_num_channels=[18, 36, 72, 144],\n                 has_se=False,\n                 align_corners=False):\n        super(HRNet, self).__init__()\n        self.pretrained = pretrained\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        self.stage4_num_modules = stage4_num_modules\n    
    self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            num_modules=self.stage4_num_modules,\n            
num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x):\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        x0_h, x0_w = st4[0].shape[2:]\n        x1 = F.interpolate(st4[1], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels, num_filters, num_blocks, has_se=False, name=None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x):\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels, out_channels, name=None):\n        super(TransitionLayer, self).__init__()\n\n      
  num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x):\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks, in_channels, out_channels, has_se=False, name=None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = 
self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x):\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, has_se, stride=1, downsample=False, name=None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = ConvBN(\n                in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x):\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = 
self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self, num_channels, num_filters, stride=1, has_se=False, downsample=False, name=None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x):\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels, num_filters, reduction_ratio, name=None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, 
weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x):\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels,\n                 num_modules,\n                 num_blocks,\n                 num_filters,\n                 has_se=False,\n                 multi_scale_output=True,\n                 name=None,\n                 align_corners=False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                       
 num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x):\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels,\n                 num_blocks,\n                 num_filters,\n                 has_se=False,\n                 multi_scale_output=True,\n                 name=None,\n                 align_corners=False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def forward(self, x):\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self, in_channels, out_channels, multi_scale_output=True, name=None, align_corners=False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        ConvBN(\n                
            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x):\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = residual.shape[-2:]\n      
      for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n\n\ndef HRNet_W18_Small_V1(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[1],\n        stage1_num_channels=[32],\n        stage2_num_modules=1,\n        stage2_num_blocks=[2, 2],\n        stage2_num_channels=[16, 32],\n        stage3_num_modules=1,\n        stage3_num_blocks=[2, 2, 2],\n        stage3_num_channels=[16, 32, 64],\n        stage4_num_modules=1,\n        stage4_num_blocks=[2, 2, 2, 2],\n        stage4_num_channels=[16, 32, 64, 128],\n        **kwargs)\n    return model\n\n\ndef HRNet_W18_Small_V2(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[2],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[2, 2],\n        stage2_num_channels=[18, 36],\n        stage3_num_modules=1,\n        stage3_num_blocks=[2, 2, 2],\n        stage3_num_channels=[18, 36, 72],\n        stage4_num_modules=1,\n        stage4_num_blocks=[2, 2, 2, 2],\n        stage4_num_channels=[18, 36, 72, 144],\n        **kwargs)\n    return model\n\n\ndef HRNet_W18(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n  
      stage2_num_channels=[18, 36],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[18, 36, 72],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[18, 36, 72, 144],\n        **kwargs)\n    return model\n\n\ndef HRNet_W30(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[30, 60],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[30, 60, 120],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[30, 60, 120, 240],\n        **kwargs)\n    return model\n\n\ndef HRNet_W32(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[32, 64],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[32, 64, 128],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[32, 64, 128, 256],\n        **kwargs)\n    return model\n\n\ndef HRNet_W40(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[40, 80],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[40, 80, 160],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[40, 80, 160, 320],\n        **kwargs)\n    return model\n\n\ndef HRNet_W44(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        
stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[44, 88],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[44, 88, 176],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[44, 88, 176, 352],\n        **kwargs)\n    return model\n\n\ndef HRNet_W48(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[48, 96],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[48, 96, 192],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[48, 96, 192, 384],\n        **kwargs)\n    return model\n\n\ndef HRNet_W60(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[60, 120],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[60, 120, 240],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[60, 120, 240, 480],\n        **kwargs)\n    return model\n\n\ndef HRNet_W64(**kwargs):\n    model = HRNet(\n        stage1_num_modules=1,\n        stage1_num_blocks=[4],\n        stage1_num_channels=[64],\n        stage2_num_modules=1,\n        stage2_num_blocks=[4, 4],\n        stage2_num_channels=[64, 128],\n        stage3_num_modules=4,\n        stage3_num_blocks=[4, 4, 4],\n        stage3_num_channels=[64, 128, 256],\n        stage4_num_modules=3,\n        stage4_num_blocks=[4, 4, 4, 4],\n        stage4_num_channels=[64, 128, 256, 512],\n        **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/model/layers.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNReLU(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, padding='same', **kwargs):\n        super().__init__()\n\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x):\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    def __init__(self, in_channels, out_channels, kernel_size, padding='same', **kwargs):\n        super().__init__()\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x):\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/FCN_HRNet_W18_Face_Seg/module.py",
    "content": "import os\r\nimport cv2\r\nimport paddle\r\nimport paddle.nn as nn\r\nimport numpy as np\r\nfrom FCN_HRNet_W18_Face_Seg.model import FCN, HRNet_W18\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"FCN_HRNet_W18_Face_Seg\",  # 模型名称\r\n    type=\"CV\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"FCN_HRNet_W18_Face_Seg\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass FCN_HRNet_W18_Face_Seg(nn.Layer):\r\n    def __init__(self):\r\n        super(FCN_HRNet_W18_Face_Seg, self).__init__()\r\n        # 加载分割模型\r\n        self.seg = FCN(num_classes=2, backbone=HRNet_W18())\r\n\r\n        # 加载模型参数\r\n        state_dict = paddle.load(os.path.join(self.directory, 'seg_model_384.pdparams'))\r\n        self.seg.set_state_dict(state_dict)\r\n\r\n        # 设置模型为评估模式\r\n        self.seg.eval()\r\n\r\n    # 读取数据函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 数据预处理函数\r\n    @staticmethod\r\n    def preprocess(images, batch_size):\r\n        input_datas = []\r\n\r\n        for image in images:\r\n            # 图像缩放\r\n            image = cv2.resize(image, (384, 384), interpolation=cv2.INTER_AREA)\r\n\r\n            # 数据格式转换\r\n            image = (image / 255.)[np.newaxis, :, :, :]\r\n            image = np.transpose(image, (0, 3, 1, 2)).astype(np.float32)\r\n\r\n            input_datas.append(image)\r\n\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        # 数据切分\r\n        datas_num = input_datas.shape[0]\r\n  
      split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        return input_datas\r\n\r\n    # 结果归一化函数\r\n    @staticmethod\r\n    def normPRED(d):\r\n        ma = np.max(d)\r\n        mi = np.min(d)\r\n\r\n        dn = (d - mi) / (ma - mi)\r\n\r\n        return dn\r\n\r\n    # 结果后处理函数\r\n    def postprocess(self, outputs, datas, output_dir, visualization):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        results = []\r\n\r\n        for output, image, i in zip(outputs, datas, range(len(datas))):\r\n            # 计算MASK\r\n            pred = self.normPRED(output[1])\r\n\r\n            # 图像缩放\r\n            h, w = image.shape[:2]\r\n            mask = cv2.resize(pred, (w, h))\r\n            mask[mask < 0.5] = 0.\r\n            mask[mask > 0.55] = 1.\r\n\r\n            # 计算输出图像\r\n            face = (image * mask[..., np.newaxis] + (1 - mask[..., np.newaxis]) * 255).astype(np.uint8)\r\n\r\n            # 格式还原\r\n            mask = (mask * 255).astype(np.uint8)\r\n\r\n            # 可视化结果保存\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_mask_%d.png' % i), mask)\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), face)\r\n\r\n            results.append({'mask': mask, 'face': face})\r\n\r\n        return results\r\n\r\n    # 模型预测函数\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n\r\n        for data in input_datas:\r\n            # 转换数据为Tensor\r\n            data = paddle.to_tensor(data)\r\n\r\n            # 模型前向计算\r\n            logits = self.seg(data)\r\n\r\n            outputs.append(logits[0].numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    def Segmentation(self, images=None, paths=None, batch_size=1, 
output_dir='output', visualization=False):\r\n        # 获取输入数据\r\n        datas = self.load_datas(paths, images)\r\n\r\n        # 数据预处理\r\n        input_datas = self.preprocess(datas, batch_size)\r\n\r\n        # 模型预测\r\n        outputs = self.predict(input_datas)\r\n\r\n        # 结果后处理\r\n        results = self.postprocess(outputs, datas, output_dir, visualization)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Pneumonia_CT_LKM_PP/README.md",
    "content": "\n# Pneumonia_CT_LKM_PP\n\n|模型名称|Pneumonia_CT_LKM_PP|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|-|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|35M|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n    - 肺炎CT影像分析模型（Pneumonia-CT-LKM-PP）可以高效地完成对患者CT影像的病灶检测识别、病灶轮廓勾画，通过一定的后处理代码，可以分析输出肺部病灶的数量、体积、病灶占比等全套定量指标。值得强调的是，该系统采用的深度学习算法模型充分训练了所收集到的高分辨率和低分辨率的CT影像数据，能极好地适应不同等级CT影像设备采集的检查数据，有望为医疗资源受限和医疗水平偏低的基层医院提供有效的肺炎辅助诊断工具。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install Pneumonia_CT_LKM_PP==1.0.0\n      ```\n      \n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    pneumonia = hub.Module(name=\"Pneumonia_CT_LKM_PP\")\n\n    input_only_lesion_np_path = \"/PATH/TO/ONLY_LESION_NP\"\n    input_both_lesion_np_path = \"/PATH/TO/LESION_NP\"\n    input_both_lung_np_path = \"/PATH/TO/LUNG_NP\"\n\n    # set input dict\n    input_dict = {\"image_np_path\": [\n                                    [input_only_lesion_np_path],\n                                    [input_both_lesion_np_path, input_both_lung_np_path],\n                                    ]}\n\n    # execute predict and print the result\n    results = pneumonia.segmentation(data=input_dict)\n    for result in results:\n        print(result)\n\n    ```\n   \n\n- ### 2、API\n\n    ```python\n    def segmentation(data)\n    ```\n\n    - 预测API，用于肺炎CT影像分析。\n\n    - **参数**\n\n        * data (dict): key，str类型，\"image_np_path\"；value，list类型，每个元素为list类型，[用于病灶分析的影像numpy数组(文件后缀名.npy)路径, 用于肺部分割的影像numpy数组路径]，如果仅进行病灶分析不进行肺部分割，可以省略用于肺部分割的影像numpy数组路径\n       \n\n    - **返回**\n\n        * result  (list\\[dict\\]): 每个元素为对应输入的预测结果。每个预测结果为dict类型：预测结果有以下字段：\n            
* input_lesion_np_path: 存放用于病灶分析的numpy数组路径；\n            * output_lesion_np: 存放病灶分析结果，numpy数组；\n            * input_lung_np_path: 存放用于肺部分割的numpy数组路径（仅当对应输入包含肺部影像numpy时存在该字段）\n            * output_lung_np: 存放肺部分割结果，numpy数组（仅当对应输入包含肺部影像numpy时存在该字段）\n\n\n## 四、更新历史\n\n* 1.0.0\n\n    初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Pneumonia_CT_LKM_PP/README_en.md",
    "content": "# Pneumonia_CT_LKM_PP\n\n|Module Name|Pneumonia_CT_LKM_PP|\n| :--- | :---: | \n|Category|Image segmentation|\n|Network |-|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|35M|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information \n\n\n- ### Module Introduction\n\n    - The pneumonia CT analysis model (Pneumonia-CT-LKM-PP) can efficiently detect and outline lesions in patients' CT images. With some post-processing code, quantitative indicators such as the number, volume, and proportion of lung lesions can be computed. The model was fully trained on both high-resolution and low-resolution CT image data, so it adapts well to examination data collected by CT imaging equipment of different grades. \n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install Pneumonia_CT_LKM_PP==1.0.0\n      ```\n      \n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import paddlehub as hub\n\n    pneumonia = hub.Module(name=\"Pneumonia_CT_LKM_PP\")\n\n    input_only_lesion_np_path = \"/PATH/TO/ONLY_LESION_NP\"\n    input_both_lesion_np_path = \"/PATH/TO/LESION_NP\"\n    input_both_lung_np_path = \"/PATH/TO/LUNG_NP\"\n\n    # set input dict\n    input_dict = {\"image_np_path\": [\n                                    [input_only_lesion_np_path],\n                                    [input_both_lesion_np_path, input_both_lung_np_path],\n                                    ]}\n\n    # execute predict and print the result\n    results = pneumonia.segmentation(data=input_dict)\n    for result in results:\n        print(result)\n\n    ```\n   \n\n- ### 2、API\n\n    - ```python\n      def segmentation(data)\n      ```\n\n        - Prediction API, used for CT analysis of pneumonia.\n\n        - **Parameter**\n\n            * data (dict): key is \"image_np_path\"; value is a list whose elements are lists of .npy file paths: [path of the image numpy array for lesion analysis, path of the image numpy array for lung segmentation]. The lung segmentation path can be omitted if only lesion analysis is needed. \n            \n\n        - **Return**\n\n            * result  (list\\[dict\\]): the list of prediction results, where each element is a dict with the following fields: \n                * input_lesion_np_path: path of the numpy array used for lesion analysis.\n                * output_lesion_np: lesion analysis result (numpy array).\n                * input_lung_np_path: path of the numpy array used for lung segmentation (only present when the corresponding input contains a lung array).\n                * output_lung_np: lung segmentation result (numpy array; only present when the corresponding input contains a lung array).\n\n\n## IV. Release Note\n\n* 1.0.0\n\n    First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Pneumonia_CT_LKM_PP_lung/README.md",
    "content": "\n# Pneumonia_CT_LKM_PP_lung\n\n|模型名称|Pneumonia_CT_LKM_PP_lung|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|-|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|35M|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n\n- ### 模型介绍\n\n    - 肺炎CT影像分析模型（Pneumonia-CT-LKM-PP）可以高效地完成对患者CT影像的病灶检测识别、病灶轮廓勾画，通过一定的后处理代码，可以分析输出肺部病灶的数量、体积、病灶占比等全套定量指标。值得强调的是，该系统采用的深度学习算法模型充分训练了所收集到的高分辨率和低分辨率的CT影像数据，能极好地适应不同等级CT影像设备采集的检查数据，有望为医疗资源受限和医疗水平偏低的基层医院提供有效的肺炎辅助诊断工具。(此module为Pneumonia_CT_LKM_PP的子module。)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install Pneumonia_CT_LKM_PP_lung==1.0.0\n      ```\n      \n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    pneumonia = hub.Module(name=\"Pneumonia_CT_LKM_PP_lung\")\n\n    input_only_lesion_np_path = \"/PATH/TO/ONLY_LESION_NP\"\n    input_both_lesion_np_path = \"/PATH/TO/LESION_NP\"\n    input_both_lung_np_path = \"/PATH/TO/LUNG_NP\"\n\n    # set input dict\n    input_dict = {\"image_np_path\": [\n                                    [input_only_lesion_np_path],\n                                    [input_both_lesion_np_path, input_both_lung_np_path],\n                                    ]}\n\n    # execute predict and print the result\n    results = pneumonia.segmentation(data=input_dict)\n    for result in results:\n        print(result)\n\n    ```\n   \n\n- ### 2、API\n\n    ```python\n    def segmentation(data)\n    ```\n\n    - 预测API，用于肺炎CT影像分析。\n\n    - **参数**\n\n        * data (dict): key，str类型，\"image_np_path\"；value，list类型，每个元素为list类型，[用于病灶分析的影像numpy数组(文件后缀名.npy)路径, 用于肺部分割的影像numpy数组路径]，如果仅进行病灶分析不进行肺部分割，可以省略用于肺部分割的影像numpy数组路径\n       \n\n    - **返回**\n\n        * result  
(list\\[dict\\]): 每个元素为对应输入的预测结果。每个预测结果为dict类型：预测结果有以下字段：\n            * input_lesion_np_path: 存放用于病灶分析的numpy数组路径；\n            * output_lesion_np: 存放病灶分析结果，numpy数组；\n            * input_lung_np_path: 存放用于肺部分割的numpy数组路径（仅当对应输入包含肺部影像numpy时存在该字段）\n            * output_lung_np: 存放肺部分割结果，numpy数组（仅当对应输入包含肺部影像numpy时存在该字段）\n\n\n## 四、更新历史\n\n* 1.0.0\n\n    初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/Pneumonia_CT_LKM_PP_lung/README_en.md",
    "content": "# Pneumonia_CT_LKM_PP_lung\n\n|Module Name|Pneumonia_CT_LKM_PP_lung|\n| :--- | :---: | \n|Category|Image segmentation|\n|Network |-|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|35M|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information \n\n\n- ### Module Introduction\n\n    - The pneumonia CT analysis model (Pneumonia-CT-LKM-PP) can efficiently detect and outline lesions in patients' CT images. With some post-processing code, quantitative indicators such as the number, volume, and proportion of lung lesions can be computed. The model was fully trained on both high-resolution and low-resolution CT image data, so it adapts well to examination data collected by CT imaging equipment of different grades. (This module is a submodule of Pneumonia_CT_LKM_PP.)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install Pneumonia_CT_LKM_PP_lung==1.0.0\n      ```\n      \n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import paddlehub as hub\n\n    pneumonia = hub.Module(name=\"Pneumonia_CT_LKM_PP_lung\")\n\n    input_only_lesion_np_path = \"/PATH/TO/ONLY_LESION_NP\"\n    input_both_lesion_np_path = \"/PATH/TO/LESION_NP\"\n    input_both_lung_np_path = \"/PATH/TO/LUNG_NP\"\n\n    # set input dict\n    input_dict = {\"image_np_path\": [\n                                    [input_only_lesion_np_path],\n                                    [input_both_lesion_np_path, input_both_lung_np_path],\n                                    ]}\n\n    # execute predict and print the result\n    results = pneumonia.segmentation(data=input_dict)\n    for result in results:\n        print(result)\n\n    ```\n   \n\n- ### 2、API\n\n  - ```python\n    def segmentation(data)\n    ```\n\n    - Prediction API, used for CT analysis of pneumonia.\n\n    - **Parameter**\n\n        * data (dict): Key is \"image_np_path\"; value is a list whose elements are lists of .npy file paths: [path of the image numpy array for lesion analysis, path of the image numpy array for lung segmentation]. The lung segmentation path can be omitted if only lesion analysis is needed. \n        \n\n    - **Return**\n\n        * result  (list\\[dict\\]): The list of prediction results, where each element is a dict with the following fields: \n            * input_lesion_np_path: Path of the numpy array used for lesion analysis.\n            * output_lesion_np: Lesion analysis result (numpy array).\n            * input_lung_np_path: Path of the numpy array used for lung segmentation (only present when the corresponding input contains a lung array).\n            * output_lung_np: Lung segmentation result (numpy array; only present when the corresponding input contains a lung array).\n\n\n## IV. Release Note\n\n* 1.0.0\n\n    First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【图像分割】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 图像分割\n\n图像语义分割顾名思义是将图像像素按照表达的语义含义的不同进行分组/分割，图像语义是指对图像内容的理解，例如，能够描绘出什么物体在哪里做了什么事情等，分割是指对图片中的每个像素点进行标注，标注属于哪一类别。近年来用在无人车驾驶技术中分割街景来避让行人和车辆、医疗影像分析中辅助诊断等。\n\n- 精选模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [人像分割](https://www.paddlepaddle.org.cn/hubdetail?name=deeplabv3p_xception65_humanseg&en_category=ImageSegmentation) | 百度**自建数据集**训练，人像分割效果卓越。                 |\n| [人体解析](https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation) | CVPR2019 LIP挑战赛中**满贯三冠王**。人体解析任务必选。     |\n| [肺炎CT影像分析](https://www.paddlepaddle.org.cn/hubdetail?name=Pneumonia_CT_LKM_PP&en_category=ImageSegmentation) | 助力连心医疗开源**业界首个**肺炎CT影像分析模型 |\n"
  },
  {
    "path": "modules/image/semantic_segmentation/README_en.md",
    "content": "## **For a better user experience, refer to the official web documentation -> [Image Segmentation](https://www.paddlepaddle.org.cn/hublist)**\n\n### Image Segmentation\n\nImage semantic segmentation, as the name implies, groups/segments image pixels according to their semantic meaning. Image semantics refers to the understanding of image content, for example, describing which object is doing what and where, while segmentation refers to labeling each pixel in an image with the category it belongs to. In recent years, it has been used to segment street scenes so that autonomous vehicles can avoid pedestrians and other vehicles, and to assist diagnosis in medical image analysis.\n\n- Selected Models\n\n  | Model Name                                                   | Model Introduction                                           |\n  | ------------------------------------------------------------ | ------------------------------------------------------------ |\n  | [Human Image Segmentation](https://www.paddlepaddle.org.cn/hubdetail?name=deeplabv3p_xception65_humanseg&en_category=ImageSegmentation) | Trained on a Baidu self-built dataset, with excellent human image segmentation results. |\n  | [Human Body Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=ace2p&en_category=ImageSegmentation) | Triple crown winner of the CVPR2019 LIP Challenge. A must-have for human body parsing tasks. |\n  | [Pneumonia CT Imaging Analysis](https://www.paddlepaddle.org.cn/hubdetail?name=Pneumonia_CT_LKM_PP&en_category=ImageSegmentation) | Helped LinkingMed open-source the industry's first pneumonia CT imaging analysis model. |\n"
  },
  {
    "path": "modules/image/semantic_segmentation/SINet_Portrait_Segmentation/README.md",
    "content": "## 概述\r\n* 基于 SINet 模型实现的轻量化人像分割模型\r\n* 模型具体规格如下：\r\n    |model|SINet|\r\n    |----|----|\r\n    |Param|0.087 M|\r\n    |Flop|0.064 G|\r\n\r\n* 模型参数转换至 [ext_portrait_segmentation](https://github.com/clovaai/ext_portrait_segmentation) 项目\r\n* 感谢 [ext_portrait_segmentation](https://github.com/clovaai/ext_portrait_segmentation) 项目提供的开源代码和模型\r\n\r\n## 效果展示\r\n![](https://ai-studio-static-online.cdn.bcebos.com/264c689d024942f3817bc9b290dea18812ba88e43d89457e977cd811988f0b44)\r\n\r\n## API\r\n```python\r\ndef Segmentation(\r\n    images=None,\r\n    paths=None,\r\n    batch_size=1,\r\n    output_dir='output',\r\n    visualization=False):\r\n```\r\n人像分割 API\r\n\r\n**参数**\r\n* images (list[np.ndarray]) : 输入图像数据列表（BGR）\r\n* paths (list[str]) : 输入图像路径列表\r\n* batch_size (int) : 数据批大小\r\n* output_dir (str) : 可视化图像输出目录\r\n* visualization (bool) : 是否可视化\r\n\r\n**返回**\r\n* results (list[dict{\"mask\":np.ndarray,\"result\":np.ndarray}]): 输出图像数据列表\r\n\r\n**代码示例**\r\n```python\r\nimport cv2\r\nimport paddlehub as hub\r\n\r\nmodel = hub.Module(name='SINet_Portrait_Segmentation')\r\n\r\nresult = model.Segmentation(\r\n    images=[cv2.imread('/PATH/TO/IMAGE')],\r\n    paths=None,\r\n    batch_size=1,\r\n    output_dir='output',\r\n    visualization=False)\r\n```\r\n\r\n## 查看代码\r\nhttps://github.com/clovaai/ext_portrait_segmentation\r\n\r\n## 依赖\r\npaddlepaddle >= 2.0.0rc0  \r\npaddlehub >= 2.0.0b1\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/SINet_Portrait_Segmentation/module.py",
    "content": "import os\r\nimport cv2\r\nimport paddle\r\nimport numpy as np\r\n\r\nfrom paddle.nn import Layer\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"SINet_Portrait_Segmentation\",  # 模型名称\r\n    type=\"CV/semantic_segmentation\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"SINet_Portrait_Segmentation\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass SINet_Portrait_Segmentation(Layer):\r\n    # 初始化函数\r\n    def __init__(self, name=None, directory=None):\r\n        super(SINet_Portrait_Segmentation, self).__init__()\r\n        # 设置模型路径\r\n        self.model_path = os.path.join(self.directory, \"SINet\")\r\n\r\n        # 加载模型\r\n        self.model = paddle.jit.load(self.model_path)\r\n        self.model.eval()\r\n\r\n        # 均值方差\r\n        self.mean = [107.304565, 115.69884, 132.35703]\r\n        self.std = [63.97182, 65.1337, 68.29726]\r\n\r\n    # 读取数据函数\r\n    @staticmethod\r\n    def load_datas(paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 预处理函数\r\n    def preprocess(self, datas, batch_size):\r\n        input_datas = []\r\n        for img in datas:\r\n            # 缩放\r\n            h, w = img.shape[:2]\r\n            img = cv2.resize(img, (224, 224))\r\n\r\n            # 格式转换\r\n            img = img.astype(np.float32)\r\n\r\n            # 归一化\r\n            for j in range(3):\r\n                img[:, :, j] -= self.mean[j]\r\n            for j in range(3):\r\n                img[:, :, j] /= self.std[j]\r\n            img /= 255.\r\n\r\n            
# 格式转换\r\n            img = img.transpose((2, 0, 1))\r\n            img = img[np.newaxis, ...]\r\n\r\n            input_datas.append(img)\r\n\r\n        # 数据切分\r\n        input_datas = np.concatenate(input_datas, 0)\r\n        split_num = len(datas) // batch_size + 1 if len(datas) % batch_size != 0 else len(datas) // batch_size\r\n        input_datas = np.array_split(input_datas, split_num)\r\n        return input_datas\r\n\r\n    # 后处理函数\r\n    def postprocess(self, outputs, datas, output_dir, visualization):\r\n        # 检查输出目录\r\n        if visualization:\r\n            if not os.path.exists(output_dir):\r\n                os.mkdir(output_dir)\r\n\r\n        # 拼接输出\r\n        outputs = paddle.concat(outputs)\r\n\r\n        results = []\r\n        for output, img, i in zip(outputs, datas, range(len(datas))):\r\n            # 计算MASK\r\n            mask = 1 - (output[0] > 0).numpy().astype('float32')\r\n\r\n            # 缩放\r\n            h, w = img.shape[:2]\r\n            mask = cv2.resize(mask, (w, h))\r\n\r\n            # 计算输出图像\r\n            result = (img * mask[..., np.newaxis] + (1 - mask[..., np.newaxis]) * 255).astype(np.uint8)\r\n\r\n            # 格式还原\r\n            mask = (mask * 255).astype(np.uint8)\r\n\r\n            # 可视化\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_mask_%d.png' % i), mask)\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), result)\r\n\r\n            results.append({'mask': mask, 'result': result})\r\n\r\n        return results\r\n\r\n    # 人像分割函数\r\n    def Segmentation(self, images=None, paths=None, batch_size=1, output_dir='output', visualization=False):\r\n        # 加载数据处理器\r\n        datas = self.load_datas(paths, images)\r\n\r\n        # 获取输入数据\r\n        input_datas = self.preprocess(datas, batch_size)\r\n\r\n        # 模型预测\r\n        outputs = [self.model(paddle.to_tensor(input_data)) for input_data in input_datas]\r\n\r\n        # 结果后处理\r\n        results = self.postprocess(outputs, datas, output_dir, visualization)\r\n\r\n        # 返回结果\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Net/README.md",
    "content": "# U2Net\r\n\r\n|模型名称|U2Net|\r\n| :--- | :---: |\r\n|类别|图像-图像分割|\r\n|网络|U^2Net|\r\n|数据集|-|\r\n|是否支持Fine-tuning|否|\r\n|模型大小|254MB|\r\n|指标|-|\r\n|最新更新日期|2021-02-26|\r\n\r\n\r\n\r\n## 一、模型基本信息\r\n\r\n- ### 应用效果展示\r\n\r\n    - 效果展示\r\n\r\n     <p align=\"center\">\r\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/4d77bc3a05cf48bba6f67b797978f4cdf10f38288b9645d59393dd85cef58eff\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://ai-studio-static-online.cdn.bcebos.com/11c9eba8de6d4316b672f10b285245061821f0a744e441f3b80c223881256ca0\" width = \"450\" height = \"300\" hspace='10'/>\r\n    </p>\r\n\r\n- ### 模型介绍\r\n    * ![](http://latex.codecogs.com/svg.latex?U^2Net)的网络结构如下图，其类似于编码-解码(Encoder-Decoder)结构的 U-Net，每个 stage 由新提出的 RSU模块(residual U-block) 组成. 例如，En_1 即为基于 RSU 构建的\r\n    *  - 更多详情请参考：[U2Net](https://github.com/xuebinqin/U-2-Net)\r\n\r\n    ![](https://ai-studio-static-online.cdn.bcebos.com/999d37b4ffdd49dc9e3315b7cec7b2c6918fdd57c8594ced9dded758a497913d)\r\n\r\n## 二、安装\r\n\r\n- ### 1、环境依赖\r\n    - paddlepaddle >= 2.0.0  \r\n    - paddlehub >= 2.0.0\r\n\r\n- ### 2、安装\r\n    - ```shell\r\n      $ hub install U2Net\r\n      ```\r\n\r\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\r\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\r\n\r\n## 三、模型API预测\r\n- ### 1、预测代码示例\r\n\r\n    ```python\r\n    import cv2\r\n    import paddlehub as hub\r\n\r\n    model = hub.Module(name='U2Net')\r\n\r\n    result = model.Segmentation(\r\n        images=[cv2.imread('/PATH/TO/IMAGE')],\r\n        paths=None,\r\n        batch_size=1,\r\n        input_size=320,\r\n        output_dir='output',\r\n        visualization=True)\r\n    ```\r\n - ### 2、API\r\n\r\n    ```python\r\n    def Segmentation(\r\n            images=None,\r\n            paths=None,\r\n            batch_size=1,\r\n            
input_size=320,\r\n            output_dir='output',\r\n            visualization=False):\r\n    ```\r\n    - 图像前景背景分割 API\r\n\r\n    -   **参数**\r\n        * images (list[np.ndarray]) : 输入图像数据列表（BGR）\r\n        * paths (list[str]) : 输入图像路径列表\r\n        * batch_size (int) : 数据批大小\r\n        * input_size (int) : 输入图像大小\r\n        * output_dir (str) : 可视化图像输出目录\r\n        * visualization (bool) : 是否可视化\r\n\r\n    -   **返回**\r\n        * results (list[np.ndarray]): 输出图像数据列表\r\n\r\n## 四、更新历史\r\n\r\n* 1.0.0\r\n\r\n  初始发布\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Net/README_en.md",
    "content": "# U2Net\n\n|Module Name |U2Net|\n| :--- | :---: |\n|Category |Image segmentation|\n|Network |U^2Net|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size |254MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n  - Sample results:\n\n     <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/4d77bc3a05cf48bba6f67b797978f4cdf10f38288b9645d59393dd85cef58eff\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://ai-studio-static-online.cdn.bcebos.com/11c9eba8de6d4316b672f10b285245061821f0a744e441f3b80c223881256ca0\" width = \"450\" height = \"300\" hspace='10'/>\n    </p>\n\n\n- ### Module Introduction\n\n    - Network architecture:\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/999d37b4ffdd49dc9e3315b7cec7b2c6918fdd57c8594ced9dded758a497913d\" hspace='10'/> <br />\n      </p>\n\n    - For more information, please refer to: [U2Net](https://github.com/xuebinqin/U-2-Net)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0  \n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n    - ```shell\n      $ hub install U2Net\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md) \n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='U2Net')\n\n    result = model.Segmentation(\n        images=[cv2.imread('/PATH/TO/IMAGE')],\n        paths=None,\n        batch_size=1,\n        input_size=320,\n        output_dir='output',\n        visualization=True)\n    ```\n - ### 2、API\n\n    ```python\n    def Segmentation(\n            images=None,\n            paths=None,\n            batch_size=1,\n            input_size=320,\n            output_dir='output',\n            visualization=False):\n    ```\n    - Prediction API, obtaining segmentation result.\n\n    - **Parameter**\n        * images (list[np.ndarray]) : Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list[str]) : Image path.\n        * batch_size (int) : Batch size.\n        * input_size (int) : Input image size, default is 320.\n        * output_dir (str) : Save path of images, 'output' by default.\n        * visualization (bool) : Whether to save the results as picture files.\n\n    - **Return**\n        * results (list[np.ndarray]): The list of segmentation results.\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Net/module.py",
    "content": "import os\r\nimport paddle\r\nimport paddle.nn as nn\r\nimport numpy as np\r\nfrom U2Net.u2net import U2NET\r\nfrom U2Net.processor import Processor\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"U2Net\",  # 模型名称\r\n    type=\"CV\",  # 模型类型\r\n    author=\"jm12138\",  # 作者名称\r\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\r\n    summary=\"U2Net\",  # 模型介绍\r\n    version=\"1.0.0\"  # 版本号\r\n)\r\nclass U2Net(nn.Layer):\r\n    def __init__(self):\r\n        super(U2Net, self).__init__()\r\n        self.model = U2NET(3, 1)\r\n        state_dict = paddle.load(os.path.join(self.directory, 'u2net.pdparams'))\r\n        self.model.set_dict(state_dict)\r\n        self.model.eval()\r\n\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n        for data in input_datas:\r\n            data = paddle.to_tensor(data, 'float32')\r\n            d1, d2, d3, d4, d5, d6, d7 = self.model(data)\r\n            outputs.append(d1.numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    def Segmentation(self,\r\n                     images=None,\r\n                     paths=None,\r\n                     batch_size=1,\r\n                     input_size=320,\r\n                     output_dir='output',\r\n                     visualization=False):\r\n\r\n        # 初始化数据处理器\r\n        processor = Processor(paths, images, batch_size, input_size)\r\n\r\n        # 模型预测\r\n        outputs = self.predict(processor.input_datas)\r\n\r\n        # 预测结果后处理\r\n        results = processor.postprocess(outputs, visualization=visualization, output_dir=output_dir)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Net/processor.py",
    "content": "import os\r\nimport cv2\r\nimport numpy as np\r\n\r\n__all__ = ['Processor']\r\n\r\n\r\nclass Processor():\r\n    def __init__(self, paths, images, batch_size, input_size):\r\n        # 图像列表\r\n        self.imgs = self.load_datas(paths, images)\r\n\r\n        # 输入数据\r\n        self.input_datas = self.preprocess(self.imgs, batch_size, input_size)\r\n\r\n    # 读取数据函数\r\n    def load_datas(self, paths, images):\r\n        datas = []\r\n\r\n        # 读取数据列表\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # 返回数据列表\r\n        return datas\r\n\r\n    # 预处理\r\n    def preprocess(self, imgs, batch_size=1, input_size=320):\r\n        input_datas = []\r\n        for image in imgs:\r\n            image = cv2.resize(image, (input_size, input_size))\r\n            tmpImg = np.zeros((image.shape[0], image.shape[1], 3))\r\n            image = image / np.max(image)\r\n\r\n            tmpImg[:, :, 0] = (image[:, :, 0] - 0.485) / 0.229\r\n            tmpImg[:, :, 1] = (image[:, :, 1] - 0.456) / 0.224\r\n            tmpImg[:, :, 2] = (image[:, :, 2] - 0.406) / 0.225\r\n\r\n            # convert BGR to RGB\r\n            tmpImg = tmpImg.transpose((2, 0, 1))\r\n            tmpImg = tmpImg[np.newaxis, :, :, :]\r\n            input_datas.append(tmpImg)\r\n\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        datas_num = input_datas.shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        return input_datas\r\n\r\n    def normPRED(self, d):\r\n        ma = np.max(d)\r\n        mi = np.min(d)\r\n\r\n        dn = (d - mi) / (ma - 
mi)\r\n\r\n        return dn\r\n\r\n    # 后处理\r\n    def postprocess(self, outputs, visualization=False, output_dir='output'):\r\n        results = []\r\n        if visualization and not os.path.exists(output_dir):\r\n            os.mkdir(output_dir)\r\n\r\n        for i, image in enumerate(self.imgs):\r\n            # normalization\r\n            pred = outputs[i, 0, :, :]\r\n\r\n            pred = self.normPRED(pred)\r\n\r\n            # convert torch tensor to numpy array\r\n            h, w = image.shape[:2]\r\n            mask = cv2.resize(pred, (w, h))\r\n\r\n            output_img = (image * mask[..., np.newaxis] + (1 - mask[..., np.newaxis]) * 255).astype(np.uint8)\r\n\r\n            mask = (mask * 255).astype(np.uint8)\r\n\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_mask_%d.png' % i), mask)\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), output_img)\r\n\r\n            results.append({'mask': mask, 'front': output_img})\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Net/u2net.py",
    "content": "import paddle\r\nimport paddle.nn as nn\r\nimport paddle.nn.functional as F\r\n\r\n__all__ = ['U2NETP', 'U2NET']\r\n\r\n\r\nclass REBNCONV(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=3, dirate=1):\r\n        super(REBNCONV, self).__init__()\r\n\r\n        self.conv_s1 = nn.Conv2D(in_ch, out_ch, 3, padding=1 * dirate, dilation=1 * dirate)\r\n        self.bn_s1 = nn.BatchNorm2D(out_ch)\r\n        self.relu_s1 = nn.ReLU()\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        xout = self.relu_s1(self.bn_s1(self.conv_s1(hx)))\r\n\r\n        return xout\r\n\r\n\r\n## upsample tensor 'src' to have the same spatial size with tensor 'tar'\r\ndef _upsample_like(src, tar):\r\n\r\n    src = F.upsample(src, size=tar.shape[2:], mode='bilinear')\r\n\r\n    return src\r\n\r\n\r\n### RSU-7 ###\r\nclass RSU7(nn.Layer):  #UNet07DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU7, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool5 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv7 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv6d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        
self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n        hx = self.pool5(hx5)\r\n\r\n        hx6 = self.rebnconv6(hx)\r\n\r\n        hx7 = self.rebnconv7(hx6)\r\n\r\n        hx6d = self.rebnconv6d(paddle.concat((hx7, hx6), 1))\r\n        hx6dup = _upsample_like(hx6d, hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6dup, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-6 ###\r\nclass RSU6(nn.Layer):  #UNet06DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU6, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n     
   self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n\r\n        hx6 = self.rebnconv6(hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-5 ###\r\nclass RSU5(nn.Layer):  #UNet05DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU5, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, 
dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n\r\n        hx5 = self.rebnconv5(hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4 ###\r\nclass RSU4(nn.Layer):  #UNet04DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4F ###\r\nclass RSU4F(nn.Layer):  #UNet04FRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4F, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=4)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=8)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=4)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=2)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        
hx1 = self.rebnconv1(hxin)\r\n        hx2 = self.rebnconv2(hx1)\r\n        hx3 = self.rebnconv3(hx2)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3d, hx2), 1))\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2d, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n##### U^2-Net ####\r\nclass U2NET(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NET, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 32, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 32, 128)\r\n        self.pool23 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(128, 64, 256)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(256, 128, 512)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(512, 256, 512)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(512, 256, 512)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(1024, 256, 512)\r\n        self.stage4d = RSU4(1024, 128, 256)\r\n        self.stage3d = RSU5(512, 64, 128)\r\n        self.stage2d = RSU6(256, 32, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(128, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(256, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        
hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #-------------------- decoder --------------------\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n\r\n\r\n### U^2-Net small ###\r\nclass U2NETP(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NETP, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 16, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 16, 64)\r\n        self.pool23 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(64, 16, 64)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(64, 16, 64)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(64, 16, 64)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(64, 16, 64)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(128, 16, 64)\r\n        self.stage4d = RSU4(128, 16, 64)\r\n        self.stage3d = RSU5(128, 16, 64)\r\n        self.stage2d = RSU6(128, 16, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #decoder\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = 
self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Netp/README.md",
    "content": "# U2Netp\r\n\r\n|模型名称|U2Netp|\r\n| :--- | :---: |\r\n|类别|图像-图像分割|\r\n|网络|U^2Net|\r\n|数据集|-|\r\n|是否支持Fine-tuning|否|\r\n|模型大小|6.7MB|\r\n|指标|-|\r\n|最新更新日期|2021-02-26|\r\n\r\n\r\n\r\n## 一、模型基本信息\r\n\r\n- ### 应用效果展示\r\n\r\n    - 样例结果示例：\r\n        <p align=\"center\">\r\n        <img src=\"https://ai-studio-static-online.cdn.bcebos.com/4d77bc3a05cf48bba6f67b797978f4cdf10f38288b9645d59393dd85cef58eff\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://ai-studio-static-online.cdn.bcebos.com/11c9eba8de6d4316b672f10b285245061821f0a744e441f3b80c223881256ca0\" width = \"450\" height = \"300\" hspace='10'/>\r\n        </p>\r\n\r\n\r\n- ### 模型介绍\r\n\r\n    * U2Netp 的网络结构如下图所示，类似于编码-解码(Encoder-Decoder)结构的 U-Net，每个 stage 由新提出的 RSU 模块(residual U-block)组成，例如 En_1 即为基于 RSU 构建的。U2Netp 是 U2Net 的小型化版本\r\n\r\n    ![](https://ai-studio-static-online.cdn.bcebos.com/999d37b4ffdd49dc9e3315b7cec7b2c6918fdd57c8594ced9dded758a497913d)\r\n\r\n    * 更多详情请参考：[U2Net](https://github.com/xuebinqin/U-2-Net)\r\n\r\n\r\n## 二、安装\r\n\r\n- ### 1、环境依赖\r\n    - paddlepaddle >= 2.0.0\r\n\r\n    - paddlehub >= 2.0.0\r\n\r\n- ### 2、安装\r\n    - ```shell\r\n      $ hub install U2Netp\r\n      ```\r\n\r\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\r\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\r\n\r\n## 三、模型API预测\r\n- ### 1、预测代码示例\r\n\r\n    ```python\r\n    import cv2\r\n    import paddlehub as hub\r\n\r\n    model = hub.Module(name='U2Netp')\r\n\r\n    result = model.Segmentation(\r\n        images=[cv2.imread('/PATH/TO/IMAGE')],\r\n        paths=None,\r\n        batch_size=1,\r\n        input_size=320,\r\n        output_dir='output',\r\n        visualization=True)\r\n    ```\r\n - ### 2、API\r\n\r\n    ```python\r\n    def Segmentation(\r\n            images=None,\r\n            paths=None,\r\n            batch_size=1,\r\n            
input_size=320,\r\n            output_dir='output',\r\n            visualization=False):\r\n    ```\r\n    - 图像前景背景分割 API\r\n\r\n    -   **参数**\r\n        * images (list[np.ndarray]) : 输入图像数据列表（BGR）\r\n        * paths (list[str]) : 输入图像路径列表\r\n        * batch_size (int) : 数据批大小\r\n        * input_size (int) : 输入图像大小\r\n        * output_dir (str) : 可视化图像输出目录\r\n        * visualization (bool) : 是否可视化\r\n\r\n    -   **返回**\r\n        * results (list[dict]): 输出结果列表，每个元素为字典，包含 'mask'（单通道分割掩膜，uint8）与 'front'（前景合成图像，BGR）两个键\r\n\r\n## 四、更新历史\r\n\r\n* 1.0.0\r\n\r\n  初始发布\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Netp/README_en.md",
    "content": "# U2Netp\n\n|Module Name |U2Netp|\n| :--- | :---: |\n|Category |Image segmentation|\n|Network |U^2Net|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size |6.7MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n  - Sample results:\n\n        <p align=\"center\">\n        <img src=\"https://ai-studio-static-online.cdn.bcebos.com/4d77bc3a05cf48bba6f67b797978f4cdf10f38288b9645d59393dd85cef58eff\" width = \"450\" height = \"300\" hspace='10'/> <img src=\"https://ai-studio-static-online.cdn.bcebos.com/11c9eba8de6d4316b672f10b285245061821f0a744e441f3b80c223881256ca0\" width = \"450\" height = \"300\" hspace='10'/>\n        </p>\n\n\n- ### Module Introduction\n\n    - Network architecture:\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/999d37b4ffdd49dc9e3315b7cec7b2c6918fdd57c8594ced9dded758a497913d\" hspace='10'/> <br />\n      </p>\n\n    - For more information, please refer to: [U2Net](https://github.com/xuebinqin/U-2-Net)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0  \n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n    - ```shell\n      $ hub install U2Netp\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md) \n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    model = hub.Module(name='U2Netp')\n\n    result = model.Segmentation(\n        images=[cv2.imread('/PATH/TO/IMAGE')],\n        paths=None,\n        batch_size=1,\n        input_size=320,\n        output_dir='output',\n        visualization=True)\n    ```\n - ### 2、API\n\n    ```python\n    def Segmentation(\n            images=None,\n            paths=None,\n            batch_size=1,\n            input_size=320,\n            output_dir='output',\n            visualization=False):\n    ```\n    - Prediction API, obtaining the segmentation result.\n\n    - **Parameter**\n        * images (list[np.ndarray]) : Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list[str]) : Image path.\n        * batch_size (int) : Batch size.\n        * input_size (int) : Input image size, default is 320.\n        * output_dir (str) : Save path of images, 'output' by default.\n        * visualization (bool) : Whether to save the results as picture files.\n\n    - **Return**\n        * results (list[dict]): Segmentation results. Each element is a dict with two keys: 'mask' (single-channel segmentation mask, uint8) and 'front' (foreground composited on a white background, BGR).\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Netp/module.py",
    "content": "import os\r\nimport paddle\r\nimport paddle.nn as nn\r\nimport numpy as np\r\nfrom U2Netp.u2net import U2NETP\r\nfrom U2Netp.processor import Processor\r\nfrom paddlehub.module.module import moduleinfo\r\n\r\n\r\n@moduleinfo(\r\n    name=\"U2Netp\",  # module name\r\n    type=\"CV\",  # module type\r\n    author=\"jm12138\",  # author\r\n    author_email=\"jm12138@qq.com\",  # author email\r\n    summary=\"U2Netp\",  # module summary\r\n    version=\"1.0.0\"  # version\r\n)\r\nclass U2Netp(nn.Layer):\r\n    def __init__(self):\r\n        super(U2Netp, self).__init__()\r\n        self.model = U2NETP(3, 1)\r\n        state_dict = paddle.load(os.path.join(self.directory, 'u2netp.pdparams'))\r\n        self.model.set_dict(state_dict)\r\n        self.model.eval()\r\n\r\n    def predict(self, input_datas):\r\n        outputs = []\r\n        for data in input_datas:\r\n            data = paddle.to_tensor(data, 'float32')\r\n            # d0 is the fused side output; d1-d6 are the per-stage side outputs\r\n            d0, d1, d2, d3, d4, d5, d6 = self.model(data)\r\n            outputs.append(d0.numpy())\r\n\r\n        outputs = np.concatenate(outputs, 0)\r\n\r\n        return outputs\r\n\r\n    def Segmentation(self,\r\n                     images=None,\r\n                     paths=None,\r\n                     batch_size=1,\r\n                     input_size=320,\r\n                     output_dir='output',\r\n                     visualization=False):\r\n\r\n        # initialize the data processor\r\n        processor = Processor(paths, images, batch_size, input_size)\r\n\r\n        # model inference\r\n        outputs = self.predict(processor.input_datas)\r\n\r\n        # post-process the predictions\r\n        results = processor.postprocess(outputs, visualization=visualization, output_dir=output_dir)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Netp/processor.py",
    "content": "import os\r\nimport cv2\r\nimport numpy as np\r\n\r\n__all__ = ['Processor']\r\n\r\n\r\nclass Processor():\r\n    def __init__(self, paths, images, batch_size, input_size):\r\n        # list of input images\r\n        self.imgs = self.load_datas(paths, images)\r\n\r\n        # preprocessed input batches\r\n        self.input_datas = self.preprocess(self.imgs, batch_size, input_size)\r\n\r\n    # load the input images\r\n    def load_datas(self, paths, images):\r\n        datas = []\r\n\r\n        # read images from file paths\r\n        if paths is not None:\r\n            for im_path in paths:\r\n                assert os.path.isfile(im_path), \"{} is not a valid file path.\".format(im_path)\r\n                im = cv2.imread(im_path)\r\n                datas.append(im)\r\n\r\n        if images is not None:\r\n            datas = images\r\n\r\n        # return the image list\r\n        return datas\r\n\r\n    # preprocessing\r\n    def preprocess(self, imgs, batch_size=1, input_size=320):\r\n        input_datas = []\r\n        for image in imgs:\r\n            image = cv2.resize(image, (input_size, input_size))\r\n            tmpImg = np.zeros((image.shape[0], image.shape[1], 3))\r\n            image = image / np.max(image)\r\n\r\n            tmpImg[:, :, 0] = (image[:, :, 0] - 0.485) / 0.229\r\n            tmpImg[:, :, 1] = (image[:, :, 1] - 0.456) / 0.224\r\n            tmpImg[:, :, 2] = (image[:, :, 2] - 0.406) / 0.225\r\n\r\n            # HWC -> CHW\r\n            tmpImg = tmpImg.transpose((2, 0, 1))\r\n            tmpImg = tmpImg[np.newaxis, :, :, :]\r\n            input_datas.append(tmpImg)\r\n\r\n        input_datas = np.concatenate(input_datas, 0)\r\n\r\n        datas_num = input_datas.shape[0]\r\n        split_num = datas_num // batch_size + 1 if datas_num % batch_size != 0 else datas_num // batch_size\r\n\r\n        input_datas = np.array_split(input_datas, split_num)\r\n\r\n        return input_datas\r\n\r\n    def normPRED(self, d):\r\n        ma = np.max(d)\r\n        mi = np.min(d)\r\n\r\n        dn = (d - mi) / (ma - 
mi)\r\n\r\n        return dn\r\n\r\n    # postprocessing\r\n    def postprocess(self, outputs, visualization=False, output_dir='output'):\r\n        results = []\r\n        if visualization and not os.path.exists(output_dir):\r\n            os.mkdir(output_dir)\r\n\r\n        for i, image in enumerate(self.imgs):\r\n            # normalize the prediction to [0, 1]\r\n            pred = outputs[i, 0, :, :]\r\n\r\n            pred = self.normPRED(pred)\r\n\r\n            # resize the mask back to the original image size\r\n            h, w = image.shape[:2]\r\n            mask = cv2.resize(pred, (w, h))\r\n\r\n            output_img = (image * mask[..., np.newaxis] + (1 - mask[..., np.newaxis]) * 255).astype(np.uint8)\r\n\r\n            mask = (mask * 255).astype(np.uint8)\r\n\r\n            if visualization:\r\n                cv2.imwrite(os.path.join(output_dir, 'result_mask_%d.png' % i), mask)\r\n                cv2.imwrite(os.path.join(output_dir, 'result_%d.png' % i), output_img)\r\n\r\n            results.append({'mask': mask, 'front': output_img})\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/U2Netp/u2net.py",
    "content": "import paddle\r\nimport paddle.nn as nn\r\nimport paddle.nn.functional as F\r\n\r\n__all__ = ['U2NETP', 'U2NET']\r\n\r\n\r\nclass REBNCONV(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=3, dirate=1):\r\n        super(REBNCONV, self).__init__()\r\n\r\n        self.conv_s1 = nn.Conv2D(in_ch, out_ch, 3, padding=1 * dirate, dilation=1 * dirate)\r\n        self.bn_s1 = nn.BatchNorm2D(out_ch)\r\n        self.relu_s1 = nn.ReLU()\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        xout = self.relu_s1(self.bn_s1(self.conv_s1(hx)))\r\n\r\n        return xout\r\n\r\n\r\n## upsample tensor 'src' to the same spatial size as tensor 'tar'\r\ndef _upsample_like(src, tar):\r\n\r\n    src = F.upsample(src, size=tar.shape[2:], mode='bilinear')\r\n\r\n    return src\r\n\r\n\r\n### RSU-7 ###\r\nclass RSU7(nn.Layer):  #UNet07DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU7, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool5 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv7 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv6d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        
self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n        hx = self.pool5(hx5)\r\n\r\n        hx6 = self.rebnconv6(hx)\r\n\r\n        hx7 = self.rebnconv7(hx6)\r\n\r\n        hx6d = self.rebnconv6d(paddle.concat((hx7, hx6), 1))\r\n        hx6dup = _upsample_like(hx6d, hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6dup, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-6 ###\r\nclass RSU6(nn.Layer):  #UNet06DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU6, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n     
   self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool4 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv6 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv5d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n        hx = self.pool4(hx4)\r\n\r\n        hx5 = self.rebnconv5(hx)\r\n\r\n        hx6 = self.rebnconv6(hx5)\r\n\r\n        hx5d = self.rebnconv5d(paddle.concat((hx6, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-5 ###\r\nclass RSU5(nn.Layer):  #UNet05DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU5, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, 
dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool3 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv5 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv4d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n        hx = self.pool3(hx3)\r\n\r\n        hx4 = self.rebnconv4(hx)\r\n\r\n        hx5 = self.rebnconv5(hx4)\r\n\r\n        hx4d = self.rebnconv4d(paddle.concat((hx5, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4 ###\r\nclass RSU4(nn.Layer):  #UNet04DRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.pool1 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n        self.pool2 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=1)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=1)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        hx1 = self.rebnconv1(hxin)\r\n        hx = self.pool1(hx1)\r\n\r\n        hx2 = self.rebnconv2(hx)\r\n        hx = self.pool2(hx2)\r\n\r\n        hx3 = self.rebnconv3(hx)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n### RSU-4F ###\r\nclass RSU4F(nn.Layer):  #UNet04FRES(nn.Layer):\r\n    def __init__(self, in_ch=3, mid_ch=12, out_ch=3):\r\n        super(RSU4F, self).__init__()\r\n\r\n        self.rebnconvin = REBNCONV(in_ch, out_ch, dirate=1)\r\n\r\n        self.rebnconv1 = REBNCONV(out_ch, mid_ch, dirate=1)\r\n        self.rebnconv2 = REBNCONV(mid_ch, mid_ch, dirate=2)\r\n        self.rebnconv3 = REBNCONV(mid_ch, mid_ch, dirate=4)\r\n\r\n        self.rebnconv4 = REBNCONV(mid_ch, mid_ch, dirate=8)\r\n\r\n        self.rebnconv3d = REBNCONV(mid_ch * 2, mid_ch, dirate=4)\r\n        self.rebnconv2d = REBNCONV(mid_ch * 2, mid_ch, dirate=2)\r\n        self.rebnconv1d = REBNCONV(mid_ch * 2, out_ch, dirate=1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        hxin = self.rebnconvin(hx)\r\n\r\n        
hx1 = self.rebnconv1(hxin)\r\n        hx2 = self.rebnconv2(hx1)\r\n        hx3 = self.rebnconv3(hx2)\r\n\r\n        hx4 = self.rebnconv4(hx3)\r\n\r\n        hx3d = self.rebnconv3d(paddle.concat((hx4, hx3), 1))\r\n        hx2d = self.rebnconv2d(paddle.concat((hx3d, hx2), 1))\r\n        hx1d = self.rebnconv1d(paddle.concat((hx2d, hx1), 1))\r\n\r\n        return hx1d + hxin\r\n\r\n\r\n##### U^2-Net ####\r\nclass U2NET(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NET, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 32, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 32, 128)\r\n        self.pool23 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(128, 64, 256)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(256, 128, 512)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(512, 256, 512)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(512, 256, 512)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(1024, 256, 512)\r\n        self.stage4d = RSU4(1024, 128, 256)\r\n        self.stage3d = RSU5(512, 64, 128)\r\n        self.stage2d = RSU6(256, 32, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(128, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(256, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(512, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        
hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #-------------------- decoder --------------------\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n\r\n\r\n### U^2-Net small ###\r\nclass U2NETP(nn.Layer):\r\n    def __init__(self, in_ch=3, out_ch=1):\r\n        super(U2NETP, self).__init__()\r\n\r\n        self.stage1 = RSU7(in_ch, 16, 64)\r\n        self.pool12 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage2 = RSU6(64, 16, 64)\r\n        self.pool23 = 
nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage3 = RSU5(64, 16, 64)\r\n        self.pool34 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage4 = RSU4(64, 16, 64)\r\n        self.pool45 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage5 = RSU4F(64, 16, 64)\r\n        self.pool56 = nn.MaxPool2D(2, stride=2, ceil_mode=True)\r\n\r\n        self.stage6 = RSU4F(64, 16, 64)\r\n\r\n        # decoder\r\n        self.stage5d = RSU4F(128, 16, 64)\r\n        self.stage4d = RSU4(128, 16, 64)\r\n        self.stage3d = RSU5(128, 16, 64)\r\n        self.stage2d = RSU6(128, 16, 64)\r\n        self.stage1d = RSU7(128, 16, 64)\r\n\r\n        self.side1 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side2 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side3 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side4 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side5 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n        self.side6 = nn.Conv2D(64, out_ch, 3, padding=1)\r\n\r\n        self.outconv = nn.Conv2D(6, out_ch, 1)\r\n\r\n    def forward(self, x):\r\n\r\n        hx = x\r\n\r\n        #stage 1\r\n        hx1 = self.stage1(hx)\r\n        hx = self.pool12(hx1)\r\n\r\n        #stage 2\r\n        hx2 = self.stage2(hx)\r\n        hx = self.pool23(hx2)\r\n\r\n        #stage 3\r\n        hx3 = self.stage3(hx)\r\n        hx = self.pool34(hx3)\r\n\r\n        #stage 4\r\n        hx4 = self.stage4(hx)\r\n        hx = self.pool45(hx4)\r\n\r\n        #stage 5\r\n        hx5 = self.stage5(hx)\r\n        hx = self.pool56(hx5)\r\n\r\n        #stage 6\r\n        hx6 = self.stage6(hx)\r\n        hx6up = _upsample_like(hx6, hx5)\r\n\r\n        #decoder\r\n        hx5d = self.stage5d(paddle.concat((hx6up, hx5), 1))\r\n        hx5dup = _upsample_like(hx5d, hx4)\r\n\r\n        hx4d = self.stage4d(paddle.concat((hx5dup, hx4), 1))\r\n        hx4dup = _upsample_like(hx4d, hx3)\r\n\r\n        hx3d = 
self.stage3d(paddle.concat((hx4dup, hx3), 1))\r\n        hx3dup = _upsample_like(hx3d, hx2)\r\n\r\n        hx2d = self.stage2d(paddle.concat((hx3dup, hx2), 1))\r\n        hx2dup = _upsample_like(hx2d, hx1)\r\n\r\n        hx1d = self.stage1d(paddle.concat((hx2dup, hx1), 1))\r\n\r\n        #side output\r\n        d1 = self.side1(hx1d)\r\n\r\n        d2 = self.side2(hx2d)\r\n        d2 = _upsample_like(d2, d1)\r\n\r\n        d3 = self.side3(hx3d)\r\n        d3 = _upsample_like(d3, d1)\r\n\r\n        d4 = self.side4(hx4d)\r\n        d4 = _upsample_like(d4, d1)\r\n\r\n        d5 = self.side5(hx5d)\r\n        d5 = _upsample_like(d5, d1)\r\n\r\n        d6 = self.side6(hx6)\r\n        d6 = _upsample_like(d6, d1)\r\n\r\n        d0 = self.outconv(paddle.concat((d1, d2, d3, d4, d5, d6), 1))\r\n\r\n        return F.sigmoid(d0), F.sigmoid(d1), F.sigmoid(d2), F.sigmoid(d3), F.sigmoid(d4), F.sigmoid(d5), F.sigmoid(d6)\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/WatermeterSegmentation/README.md",
    "content": "# WatermeterSegmentation\r\n\r\n类别：图像 - 图像分割<br>\r\n网络：DeepLabV3p<br>\r\n数据集：水表数字表盘分割数据集\r\n\r\n# 模型概述\r\n水表数字表盘分割模型（WatermeterSegmentation），可自动分割并裁剪出水表上的数字表盘区域。该PaddleHub Module支持API预测及命令行预测。\r\n\r\n# 选择模型版本进行安装\r\n~~~\r\n$ hub install WatermeterSegmentation==1.0.0\r\n~~~\r\n\r\n# 在线体验\r\n[AI Studio快速体验](https://aistudio.baidu.com/aistudio/projectdetail/1643214)\r\n\r\n# 命令行预测示例\r\n~~~\r\n$ hub run WatermeterSegmentation --image 1.png --use_gpu True\r\n~~~\r\n\r\n# Module API说明\r\n## def cutPic(picUrl)\r\n水表数字表盘分割预测接口，输入一张水表图像，分割并裁剪出其中记录数字的表盘区域\r\n### 参数\r\n- picUrl(str): 待检测的图片路径\r\n\r\n# 代码示例\r\n\r\n## API调用\r\n~~~\r\nimport cv2\r\nimport paddlehub as hub\r\n\r\nseg = hub.Module(name='WatermeterSegmentation')\r\nres = seg.cutPic(picUrl=\"1.png\")\r\n~~~\r\n\r\n## 命令行调用\r\n~~~\r\n$ hub run WatermeterSegmentation --image 1.png --use_gpu True\r\n~~~\r\n\r\n# 效果展示\r\n\r\n## 原图\r\n<img src=\"/docs/imgs/Readme_Related/ImageSeg_WaterInput.png\">\r\n\r\n## 输出结果\r\n<img src=\"/docs/imgs/Readme_Related/ImageSeg_WaterOutput.png\">\r\n\r\n# 贡献者\r\n郑博培、彭兆帅\r\n\r\n# 依赖\r\npaddlepaddle >= 2.0.0<br>\r\npaddlehub >= 2.0.0\r\n"
  },
  {
    "path": "modules/image/semantic_segmentation/WatermeterSegmentation/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/WatermeterSegmentation/assets/model.yml",
    "content": "Model: DeepLabv3p\nTransforms:\n- Resize:\n    interp: LINEAR\n    target_size: 512\n- Normalize:\n    max_val:\n    - 255.0\n    - 255.0\n    - 255.0\n    mean:\n    - 0.5\n    - 0.5\n    - 0.5\n    min_val:\n    - 0\n    - 0\n    - 0\n    std:\n    - 0.5\n    - 0.5\n    - 0.5\nTransformsMode: BGR\n_Attributes:\n  eval_metrics:\n    miou: 0.8284633456567256\n  fixed_input_shape: null\n  labels:\n  - _background_\n  - number\n  model_type: segmenter\n  num_classes: 2\n_ModelInputsOutputs:\n  test_inputs:\n  - - image\n    - image\n  test_outputs:\n  - - pred\n    - unsqueeze2_0.tmp_0\n  - - logit\n    - softmax_0.tmp_0\n_init_params:\n  aspp_with_sep_conv: true\n  backbone: MobileNetV2_x1.0\n  class_weight: null\n  decoder_use_sep_conv: true\n  enable_decoder: true\n  encoder_with_aspp: true\n  ignore_index: 255\n  input_channel: 3\n  num_classes: 2\n  output_stride: 16\n  pooling_crop_size: null\n  use_bce_loss: false\n  use_dice_loss: false\ncompleted_epochs: 0\nstatus: Infer\nversion: 1.3.7\n"
  },
  {
    "path": "modules/image/semantic_segmentation/WatermeterSegmentation/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nfrom math import *\nimport time, math, re\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n'''Rotate the image and crop the rotated rectangle'''\n\n\ndef rotate(\n        img,  # input image\n        pt1,\n        pt2,\n        pt3,\n        pt4,\n        imgOutSrc):\n    withRect = math.sqrt((pt4[0] - pt1[0])**2 + (pt4[1] - pt1[1])**2)  # width of the rectangle\n    heightRect = math.sqrt((pt1[0] - pt2[0])**2 + (pt1[1] - pt2[1])**2)  # height of the rectangle\n    angle = acos((pt4[0] - pt1[0]) / withRect) * (180 / math.pi)  # rotation angle of the rectangle\n\n    if withRect > heightRect:\n        if pt4[1] > pt1[1]:\n            pass  # rotate clockwise\n        else:\n            angle = -angle  # rotate counterclockwise\n\n    else:\n        angle = 90 - angle  # rotate counterclockwise\n\n    height = img.shape[0]  # original image height\n    width = img.shape[1]  # original image width\n    rotateMat = cv2.getRotationMatrix2D((width / 2, height / 2), angle, 1)  # rotate the image by angle\n    heightNew = int(width * fabs(sin(radians(angle))) + height * fabs(cos(radians(angle))))\n    widthNew = int(height * fabs(sin(radians(angle))) + width * fabs(cos(radians(angle))))\n\n    rotateMat[0, 2] += (widthNew - width) / 2\n    rotateMat[1, 2] += (heightNew - 
height) / 2\n    imgRotation = cv2.warpAffine(img, rotateMat, (widthNew, heightNew), borderValue=(255, 255, 255))\n    # cv2.imwrite(\"imgRotation.jpg\", imgRotation)\n\n    # 旋转后图像的四点坐标\n    [[pt1[0]], [pt1[1]]] = np.dot(rotateMat, np.array([[pt1[0]], [pt1[1]], [1]]))\n    [[pt3[0]], [pt3[1]]] = np.dot(rotateMat, np.array([[pt3[0]], [pt3[1]], [1]]))\n    [[pt2[0]], [pt2[1]]] = np.dot(rotateMat, np.array([[pt2[0]], [pt2[1]], [1]]))\n    [[pt4[0]], [pt4[1]]] = np.dot(rotateMat, np.array([[pt4[0]], [pt4[1]], [1]]))\n\n    # 处理反转的情况\n    if pt2[1] > pt4[1]:\n        pt2[1], pt4[1] = pt4[1], pt2[1]\n    if pt1[0] > pt3[0]:\n        pt1[0], pt3[0] = pt3[0], pt1[0]\n\n    imgOut = imgRotation[int(pt2[1]):int(pt4[1]), int(pt1[0]):int(pt3[0])]\n    cv2.imwrite(imgOutSrc, imgOut)  # 裁减得到的旋转矩形框\n\n\n@moduleinfo(\n    name='WatermeterSegmentation',\n    type='CV/semantic_segmentatio',\n    author='郑博培、彭兆帅',\n    author_email='2733821739@qq.com',\n    summary='Digital dial segmentation of water meter',\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n   
 def cutPic(self, picUrl):\n        # seg = hub.Module(name='WatermeterSegmentation')\n        image_name = picUrl\n        im = cv2.imread(image_name)\n        result = self.predict(images=[im])\n\n        # 将多边形polygon转矩形\n        contours, hier = cv2.findContours(result[0]['label_map'], cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        print(type(contours[0]))\n        n = 0\n        m = 0\n        for index, contour in enumerate(contours):\n            if len(contour) > n:\n                n = len(contour)\n                m = index\n\n        image = cv2.imread(image_name)\n        # 获取最小的矩形\n        rect = cv2.minAreaRect(contours[m])\n        box = np.int0(cv2.boxPoints(rect))\n\n        # 获取到矩形的四个点\n        tmp = cv2.drawContours(image, [box], 0, (0, 0, 255), 3)\n        imgOutSrc = 'result.jpg'\n        rotate(image, box[0], box[1], box[2], box[3], imgOutSrc)\n        res = []\n        res.append(imgOutSrc)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                # result_new = dict()\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = np.asscalar(value)\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            
result[index][key] = np.asscalar(value)\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', type=bool, default=False, help=\"whether use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n"
  },
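The `predict` method above walks the input list in fixed-size batches, computing the batch count with `np.ceil` and letting the final batch come up short. That batching pattern can be sketched without any framework dependency (the `iter_batches` name is ours, purely illustrative, not part of the module's API):

```python
import math

def iter_batches(items, batch_size):
    # Number of batches, rounding up so a trailing partial batch is kept.
    loop_num = math.ceil(len(items) / batch_size)
    for iter_id in range(loop_num):
        handle_id = iter_id * batch_size
        # Slicing never raises IndexError, so no try/except is needed here.
        yield items[handle_id:handle_id + batch_size]

batches = list(iter_batches([1, 2, 3, 4, 5], batch_size=2))
# batches == [[1, 2], [3, 4], [5]]
```

Slicing makes the inner `try/except IndexError` of the original loop unnecessary, while preserving the same grouping.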
  {
    "path": "modules/image/semantic_segmentation/WatermeterSegmentation/serving_client_demo.py",
    "content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # get the base64-encoded images\n    img1 = cv2_to_base64(cv2.imread(\"IMAGE_PATH1\"))\n    img2 = cv2_to_base64(cv2.imread(\"IMAGE_PATH2\"))\n    data = {'images': [img1, img2]}\n    # specify the content type\n    headers = {\"Content-type\": \"application/json\"}\n    # send the HTTP request\n    url = \"http://127.0.0.1:8866/predict/WatermeterSegmentation\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print the prediction results\n    print(r.json()[\"results\"])\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/README.md",
    "content": "# ace2p\n\n|模型名称|ace2p|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ACE2P|\n|数据集|LIP|\n|是否支持Fine-tuning|否|\n|模型大小|259MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ace2p_network.jpg\" hspace='10'/> <br />\n      </p>\n\n  - 调色板\n\n      <p align=\"left\">\n      <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ace2p_palette.jpg\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130913765-c9572c77-c6bf-46ec-9653-04ff356b4b85.png\" width = \"337\" height = \"505\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - 人体解析(Human Parsing)是细粒度的语义分割任务，其旨在识别像素级别的人类图像的组成部分（例如，身体部位和服装）。ACE2P通过融合底层特征，全局上下文信息和边缘细节，端到端地训练学习人体解析任务。该结构针对Intersection over Union指标进行针对性的优化学习，提升准确率。以ACE2P单人人体解析网络为基础的解决方案在CVPR2019第三届LIP挑战赛中赢得了全部三个人体解析任务的第一名。该PaddleHub Module采用ResNet101作为骨干网络，接受输入图片大小为473x473x3。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install ace2p\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run ace2p --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    human_parser = hub.Module(name=\"ace2p\")\n    result = human_parser.segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    ```\n  \n  - ### 3、API\n\n    ```python\n    def segmentation(images=None,\n                  
  paths=None,\n                    batch_size=1,\n                    use_gpu=False,\n                    output_dir='ace2p_output',\n                    visualization=False):\n    ```\n\n    - 预测API，用于图像分割得到人体解析。\n\n    - **参数**\n\n      * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * batch\\_size (int): batch 的大小；\n      * use\\_gpu (bool): 是否使用 GPU；\n      * output\\_dir (str): 保存处理结果的文件目录；\n      * visualization (bool): 是否将识别结果保存为图片文件。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有'path', 'data'，相应的取值为：\n          * path (str): 原输入图片的路径；\n          * data (numpy.ndarray): 图像分割得到的结果，shape 为`H * W`，元素的取值为0-19，表示每个像素的分类结果，映射顺序与下面的调色板相同。\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个人体解析的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  \n    ```shell\n     $ hub serving start -m ace2p\n    ```\n\n    - 这样就完成了一个人体解析服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n      ```python\n      import requests\n      import json\n      import cv2\n      import base64\n\n      import numpy as np\n\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tostring()).decode('utf8')\n\n\n      def base64_to_cv2(b64str):\n          data = base64.b64decode(b64str.encode('utf8'))\n          data = np.fromstring(data, np.uint8)\n          data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n          return data\n\n\n      # 发送HTTP请求\n      data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/ace2p\"\n      r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n\n      # 打印预测结果\n      print(base64_to_cv2(r.json()[\"results\"][0]['data']))\n      ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  适配paddlehub2.0版本\n\n* 1.2.0\n\n  移除 Fluid API\n\n  ```shell\n  $ hub install ace2p==1.2.0\n  ```"
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/README_en.md",
    "content": "# ace2p\n\n|Module Name|ace2p|\n| :--- | :---: | \n|Category|Image segmentation|\n|Network|ACE2P|\n|Dataset|LIP|\n|Fine-tuning supported or not|No|\n|Module Size|259MB|\n|Data indicators|-|\n|Latest update date |2021-02-26|\n\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n  - Network architecture:\n      <p align=\"center\">\n      <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ace2p_network.jpg\" hspace='10'/> <br />\n      </p>\n\n  - Color palette\n\n      <p align=\"left\">\n      <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ace2p_palette.jpg\" hspace='10'/> <br />\n      </p>\n\n  - Sample results:\n      <p align=\"center\">\n      <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130913765-c9572c77-c6bf-46ec-9653-04ff356b4b85.png\" width = \"337\" height = \"505\" hspace='10'/>\n      </p>\n\n- ### Module Introduction\n\n  - Human Parsing is a fine-grained semantic segmentation task that aims to identify the components (for example, body parts and clothing) of a human image at the pixel level.  The PaddleHub Module uses ResNet101 as the backbone network, and accepts input image sizes of 473x473x3.\n\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ace2p\n      ```\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    - ```shell\n      $ hub run ace2p --input_path \"/PATH/TO/IMAGE\"\n      ```\n\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    human_parser = hub.Module(name=\"ace2p\")\n    result = human_parser.segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    ```\n  \n- ### 3、API\n\n  - ```python\n    def segmentation(images=None,\n                    paths=None,\n                    batch_size=1,\n                    use_gpu=False,\n                    output_dir='ace2p_output',\n                    visualization=False):\n    ```\n\n    - Prediction API, used for human parsing.\n\n    - **Parameter**\n\n        * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\\[str\\]): Image path.\n        * batch\\_size (int): Batch size.\n        * use\\_gpu (bool): Use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n        * output\\_dir (str): Save path of output, default is 'ace2p_output'.\n        * visualization (bool): Whether to save the recognition results as picture files.\n\n    - **Return**\n\n        * res (list\\[dict\\]): The list of recognition results, where each element is dict and each field is: \n            * save\\_path (str, optional): Save path of the result.\n            * data (numpy.ndarray): The result of portrait segmentation. \n\n\n  - ```python\n    def save_inference_model(dirname)\n    ```\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n      * dirname: Model save path.\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of human parsing.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  \n    - ```shell\n      $ hub serving start -m ace2p\n      ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n\n    - ```python\n      import requests\n      import json\n      import cv2\n      import base64\n\n      import numpy as np\n\n\n      def cv2_to_base64(image):\n          data = cv2.imencode('.jpg', image)[1]\n          return base64.b64encode(data.tostring()).decode('utf8')\n\n\n      def base64_to_cv2(b64str):\n          data = base64.b64decode(b64str.encode('utf8'))\n          data = np.fromstring(data, np.uint8)\n          data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n          return data\n\n\n      # Send an HTTP request\n      data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n      headers = {\"Content-type\": \"application/json\"}\n      url = \"http://127.0.0.1:8866/predict/ace2p\"\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n      # print prediction results\n      print(base64_to_cv2(r.json()[\"results\"][0]['data']))\n      ```\n\n\n## V. Update History\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Adapt to paddlehub 2.0\n\n* 1.2.0\n\n  Remove Fluid API\n\n  ```shell\n  $ hub install ace2p==1.2.0\n  ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\nfrom .processor import get_affine_transform\n\n__all__ = ['reader']\n\n\ndef _box2cs(box, aspect_ratio):\n    x, y, w, h = box[:4]\n    return _xywh2cs(x, y, w, h, aspect_ratio)\n\n\ndef _xywh2cs(x, y, w, h, aspect_ratio, pixel_std=200):\n    center = np.zeros((2), dtype=np.float32)\n    center[0] = x + w * 0.5\n    center[1] = y + h * 0.5\n    if w > aspect_ratio * h:\n        h = w * 1.0 / aspect_ratio\n    elif w < aspect_ratio * h:\n        w = h * aspect_ratio\n    scale = np.array([w * 1.0 / pixel_std, h * 1.0 / pixel_std], dtype=np.float32)\n    return center, scale\n\n\ndef preprocess(org_im, scale, rotation):\n    image = org_im.copy()\n    image_height, image_width, _ = image.shape\n\n    aspect_ratio = scale[1] * 1.0 / scale[0]\n    image_center, image_scale = _box2cs([0, 0, image_width - 1, image_height - 1], aspect_ratio)\n\n    trans = get_affine_transform(image_center, image_scale, rotation, scale)\n    image = cv2.warpAffine(\n        image,\n        trans, (int(scale[1]), int(scale[0])),\n        flags=cv2.INTER_LINEAR,\n        borderMode=cv2.BORDER_CONSTANT,\n        borderValue=(0, 0, 0))\n\n    img_mean = np.array([0.406, 0.456, 0.485]).reshape((1, 1, 3))\n    img_std = np.array([0.225, 0.224, 0.229]).reshape((1, 1, 3))\n    image = image.astype(np.float32)\n    image = (image / 255.0 - img_mean) / img_std\n    image = image.transpose(2, 0, 1).astype(np.float32)\n\n    image_info = {\n        'image_center': image_center,\n        'image_height': image_height,\n        'image_width': image_width,\n        'image_scale': image_scale,\n        'rotation': rotation,\n        'scale': scale\n    }\n\n    return image, image_info\n\n\ndef reader(images, paths, scale, rotation):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        
paths (list[str]): paths to images.\n        scale (tuple): size of preprocessed image.\n        rotation (int): rotation angle, used for obtaining affine matrix in preprocess.\n\n    Yield:\n        element (collections.OrderedDict): info of original image and preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path)\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}.jpg'.format(round(time.time(), 6) * 1e6)\n            component.append(each)\n\n    for element in component:\n        element['image'], element['image_info'] = preprocess(element['org_im'], scale, rotation)\n        yield element\n"
  },
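The `_xywh2cs` helper in data_feed.py expands the input box so its aspect ratio matches the network's 473x473 input before the affine transform is computed; the scale is expressed in units of `pixel_std` (200) pixels. A standalone sketch of that center/scale computation (the `xywh_to_center_scale` name is ours, mirroring the helper above):

```python
import numpy as np

def xywh_to_center_scale(x, y, w, h, aspect_ratio, pixel_std=200):
    # Box center.
    center = np.array([x + w * 0.5, y + h * 0.5], dtype=np.float32)
    # Grow the shorter side so that w / h == aspect_ratio.
    if w > aspect_ratio * h:
        h = w / aspect_ratio
    elif w < aspect_ratio * h:
        w = h * aspect_ratio
    # Scale is measured in units of pixel_std pixels.
    scale = np.array([w / pixel_std, h / pixel_std], dtype=np.float32)
    return center, scale

center, scale = xywh_to_center_scale(0, 0, 100, 50, aspect_ratio=1.0)
# center == [50., 25.], scale == [0.5, 0.5]  (h grown from 50 to 100)
```

Because the box is only ever grown, the transformed crop always contains the whole original box.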
  {
    "path": "modules/image/semantic_segmentation/ace2p/label_list.txt",
    "content": "background\nHat\nHair\nGlove\nSunglasses\nUpperClothes\nDress\nCoat\nSocks\nPants\nJumpsuits\nScarf\nSkirt\nFace\nLeft-arm\nRight-arm\nLeft-leg\nRight-leg\nLeft-shoe\nRight-shoe\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport ast\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config, create_predictor\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .processor import get_palette, postprocess, base64_to_cv2, cv2_to_base64\nfrom .data_feed import reader\n\n\n@moduleinfo(\n    name=\"ace2p\",\n    type=\"CV/semantic-segmentation\",\n    author=\"baidu-idl\",\n    author_email=\"\",\n    summary=\"ACE2P is an image segmentation model for human parsing solution.\",\n    version=\"1.2.0\")\nclass ACE2P:\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, \"ace2p_human_parsing\", \"model\")\n        # label list\n        label_list_file = os.path.join(self.directory, 'label_list.txt')\n        with open(label_list_file, \"r\") as file:\n            content = file.read()\n        self.label_list = content.split(\"\\n\")\n        # palette used in postprocess\n        self.palette = get_palette(len(self.label_list))\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n     
       self.gpu_predictor = create_predictor(gpu_config)\n\n    def segmentation(self,\n                     images=None,\n                     paths=None,\n                     data=None,\n                     batch_size=1,\n                     use_gpu=False,\n                     output_dir='ace2p_output',\n                     visualization=False):\n        \"\"\"\n        API for human parsing.\n\n        Args:\n            images (list[numpy.ndarray]): images data, shape of each is [H, W, C], color space is BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The path to store output images.\n            visualization (bool): Whether to save output images or not.\n\n        Returns:\n            res (list[dict]): The result of human parsing and original path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use the GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data and 'image' in data:\n            if paths is None:\n                paths = []\n            paths += data['image']\n\n        # get all data\n        all_data = []\n        scale = (473, 473)  # size of preprocessed image.\n        rotation = 0  # rotation angle, used for obtaining affine matrix in preprocess.\n        for yield_data in reader(images, paths, scale, rotation):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.astype('float32'))\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(\n                    data_out=output_handle.copy_to_cpu()[i],\n                    org_im=batch_data[i]['org_im'],\n                    org_im_path=batch_data[i]['org_im_path'],\n                    image_info=batch_data[i]['image_info'],\n                    output_dir=output_dir,\n                    visualization=visualization,\n                    palette=self.palette)\n                res.append(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.segmentation(images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.segmentation(\n            paths=[args.input_path],\n            batch_size=args.batch_size,\n            use_gpu=args.use_gpu,\n            output_dir=args.output_dir,\n            visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='ace2p_output', help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=ast.literal_eval, default=False, help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
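`_set_config` in the ace2p module decides whether to build a GPU predictor by probing `CUDA_VISIBLE_DEVICES`: the variable must exist and its first character must parse as a digit, so `""` (hidden devices), `-1`, and an unset variable all fall back to CPU. The detection idiom can be isolated as follows (a sketch; the `gpu_available_from_env` name is ours):

```python
import os

def gpu_available_from_env():
    # A GPU predictor is only built when CUDA_VISIBLE_DEVICES names at least
    # one device, e.g. "0" or "0,1"; "", "-1", and unset all mean CPU-only.
    try:
        places = os.environ["CUDA_VISIBLE_DEVICES"]
        int(places[0])  # IndexError if empty, ValueError if not a digit
        return True
    except (KeyError, IndexError, ValueError):
        return False

os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
use_gpu = gpu_available_from_env()  # True: device "0" is visible
```

Catching the three specific exceptions keeps the same behavior as the module's bare `except` while not swallowing unrelated errors.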
  {
    "path": "modules/image/semantic_segmentation/ace2p/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport time\n\nimport base64\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'get_palette', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef check_dir(dir_path):\n    \"\"\"\n    Create directory to save processed image.\n\n    Args:\n        dir_path (str): directory path to save images.\n    \"\"\"\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of orginal image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n\n\ndef get_direction(src_point, rot_rad):\n    sn, cs = np.sin(rot_rad), np.cos(rot_rad)\n    src_result = [0, 0]\n    src_result[0] = src_point[0] * cs - src_point[1] * sn\n    src_result[1] = src_point[0] * sn + src_point[1] * cs\n    return src_result\n\n\ndef get_3rd_point(a, b):\n    direct = a - b\n    return b + np.array([-direct[1], direct[0]], dtype=np.float32)\n\n\ndef get_affine_transform(center, scale, rot, output_size, shift=np.array([0, 0], 
dtype=np.float32), inv=0):\n    if not isinstance(scale, np.ndarray) and not isinstance(scale, list) and not isinstance(scale, tuple):\n        print(scale)\n        scale = np.array([scale, scale])\n\n    scale_tmp = scale * 200.0\n    src_w = scale_tmp[0]\n    dst_w = output_size[1]\n    dst_h = output_size[0]\n    rot_rad = np.pi * rot / 180\n    src_direction = get_direction([0, src_w * -0.5], rot_rad)\n    dst_direction = np.array([0, (dst_w - 1) * -0.5], np.float32)\n    src = np.zeros((3, 2), dtype=np.float32)\n    dst = np.zeros((3, 2), dtype=np.float32)\n    src[0, :] = center + scale_tmp * shift\n    src[1, :] = center + src_direction + scale_tmp * shift\n    dst[0, :] = [(dst_w - 1) * 0.5, (dst_h - 1) * 0.5]\n    dst[1, :] = np.array([(dst_w - 1) * 0.5, (dst_h - 1) * 0.5]) + dst_direction\n    src[2:, :] = get_3rd_point(src[0, :], src[1, :])\n    dst[2:, :] = get_3rd_point(dst[0, :], dst[1, :])\n\n    if inv:\n        trans = cv2.getAffineTransform(np.float32(dst), np.float32(src))\n    else:\n        trans = cv2.getAffineTransform(np.float32(src), np.float32(dst))\n    return trans\n\n\ndef transform_logits(logits, center, scale, width, height, input_size):\n    trans = get_affine_transform(center, scale, 0, input_size, inv=1)\n    channel = logits.shape[2]\n    target_logits = []\n    for i in range(channel):\n        target_logit = cv2.warpAffine(\n            logits[:, :, i],\n            trans, (int(width), int(height)),\n            flags=cv2.INTER_LINEAR,\n            borderMode=cv2.BORDER_CONSTANT,\n            borderValue=(0))\n        target_logits.append(target_logit)\n    target_logits = np.stack(target_logits, axis=2)\n    return target_logits\n\n\ndef get_palette(num_cls):\n    \"\"\"\n    Returns the color map for visualizing the segmentation mask.\n\n    Args:\n        num_cls: Number of classes\n\n    Returns:\n        The color map\n    \"\"\"\n    n = num_cls\n    palette = [0] * (n * 3)\n    for j in range(0, n):\n        lab = j\n    
    palette[j * 3 + 0] = 0\n        palette[j * 3 + 1] = 0\n        palette[j * 3 + 2] = 0\n        i = 0\n        while lab:\n            palette[j * 3 + 0] |= (((lab >> 0) & 1) << (7 - i))\n            palette[j * 3 + 1] |= (((lab >> 1) & 1) << (7 - i))\n            palette[j * 3 + 2] |= (((lab >> 2) & 1) << (7 - i))\n            i += 1\n            lab >>= 3\n    return palette\n\n\ndef postprocess(data_out, org_im, org_im_path, image_info, output_dir, visualization, palette):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of neural network.\n        org_im (numpy.ndarray): orginal image.\n        org_im_path (str): path of original image.\n        image_info (dict): info about the preprocessed image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        palette (list): The palette to draw.\n\n    Returns:\n        res (list[dict]): keys contain 'path', 'data', the corresponding value is:\n            path (str): The path of original image.\n            data (numpy.ndarray): The postprocessed image data, only the alpha channel.\n    \"\"\"\n    result = dict()\n    result['path'] = org_im_path\n\n    image_center = image_info['image_center']\n    image_scale = image_info['image_scale']\n    image_width = image_info['image_width']\n    image_height = image_info['image_height']\n    scale = image_info['scale']\n\n    data_out = np.squeeze(data_out)\n    data_out = np.transpose(data_out, [1, 2, 0])\n    logits_result = transform_logits(data_out, image_center, image_scale, image_width, image_height, scale)\n    parsing = np.argmax(logits_result, axis=2)\n    parsing_im = np.asarray(parsing, dtype=np.uint8)\n    result['data'] = parsing_im\n\n    if visualization:\n        check_dir(output_dir)\n        save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n        parsing_im = 
Image.fromarray(parsing_im)\n        parsing_im.putpalette(palette)\n        parsing_im.save(save_im_path)\n\n    return result\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ace2p/test.py",
"content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/pg_WCHWSdT8/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYyNDM2ODI4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        fourcc = cv2.VideoWriter_fourcc('M', 'J', 'P', 'G')\n        img = cv2.imread('tests/test.jpg')\n        # VideoWriter expects frameSize as (width, height)\n        video = cv2.VideoWriter('tests/test.avi', fourcc,\n                                20.0, (img.shape[1], img.shape[0]))\n        for i in range(40):\n            video.write(img)\n        video.release()\n        cls.module = hub.Module(name=\"ace2p\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('ace2p_output')\n\n    def test_segmentation1(self):\n        results = self.module.segmentation(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation2(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation3(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n 
   def test_segmentation4(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.segmentation,\n            paths=['no.jpg']\n        )\n\n    def test_segmentation6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.segmentation,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_cityscapes/README.md",
    "content": "# ann_resnet50_cityscapes\n\n|模型名称|ann_resnet50_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ann_resnet50vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|228MB|\n|指标|-|\n|最新更新日期|2022-03-22|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ann](https://arxiv.org/pdf/1908.07678.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ann_resnet50_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ann_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ann_resnet50_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ann_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ann_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n         
       model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ann_resnet50_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ann_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_cityscapes/README_en.md",
    "content": "# ann_resnet50_cityscapes\n\n|Module Name|ann_resnet50_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ann_resnet50vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|228MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ann](https://arxiv.org/pdf/1908.07678.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ann_resnet50_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ann_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ann_resnet50_cityscapes model to fine-tune on datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data augmentation module defines many data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ann_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, load the provided default parameters.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ann_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ann_resnet50_cityscapes\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ann_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_cityscapes/layers.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"nn.SyncBatchNorm has no CPU kernel, so fall back to nn.BatchNorm2D in a CPU environment.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None):\n        return paddle.add(x, y, name)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_cityscapes/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom ann_resnet50_cityscapes.resnet import ResNet50_vd\nimport ann_resnet50_cityscapes.layers as layers\n\n@moduleinfo(\n    name=\"ann_resnet50_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ANNResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass ANN(nn.Layer):\n    \"\"\"\n    The ANN implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhen, Zhu, et al. 
\"Asymmetric Non-local Neural Networks for Semantic Segmentation\"\n    (https://arxiv.org/pdf/1908.07678.pdf).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Two values in the tuple indicate the indices of output of backbone.\n        key_value_channels (int, optional): The key and value channels of self-attention map in both AFNB and APNB modules.\n            Default: 256.\n        inter_channels (int, optional): Both input and output channels of APNB modules. Default: 512.\n        psp_size (tuple, optional): The out size of pooled feature maps. Default: (1, 3, 6, 8).\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. 
Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 key_value_channels: int = 256,\n                 inter_channels: int = 512,\n                 psp_size: Tuple[int] = (1, 3, 6, 8),\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(ANN, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        backbone_channels = [\n            self.backbone.feat_channels[i] for i in backbone_indices\n        ]\n\n        self.head = ANNHead(num_classes, backbone_indices, backbone_channels,\n                            key_value_channels, inter_channels, psp_size)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n            \n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n    \n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\n\nclass ANNHead(nn.Layer):\n    \"\"\"\n    The ANNHead implementation.\n\n    It mainly consists of AFNB and APNB modules.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices 
(tuple): Two values in the tuple indicate the indices of output of backbone.\n            The first index will be taken as low-level features; the second one will be\n            taken as high-level features in AFNB module. Usually backbone consists of four\n            downsampling stages, such as ResNet, and returns an output of each stage. If it is (2, 3),\n            it means taking feature map of the third stage and the fourth stage in backbone.\n        backbone_channels (tuple): The same length with \"backbone_indices\". It indicates the channels of corresponding index.\n        key_value_channels (int): The key and value channels of self-attention map in both AFNB and APNB modules.\n        inter_channels (int): Both input and output channels of APNB modules.\n        psp_size (tuple): The out size of pooled feature maps.\n        enable_auxiliary_loss (bool, optional): A bool value that indicates whether to add auxiliary loss. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int],\n                 backbone_channels: Tuple[int],\n                 key_value_channels: int,\n                 inter_channels: int,\n                 psp_size: Tuple[int],\n                 enable_auxiliary_loss: bool = False):\n        super().__init__()\n\n        low_in_channels = backbone_channels[0]\n        high_in_channels = backbone_channels[1]\n\n        self.fusion = AFNB(\n            low_in_channels=low_in_channels,\n            high_in_channels=high_in_channels,\n            out_channels=high_in_channels,\n            key_channels=key_value_channels,\n            value_channels=key_value_channels,\n            dropout_prob=0.05,\n            repeat_sizes=([1]),\n            psp_size=psp_size)\n\n        self.context = nn.Sequential(\n            layers.ConvBNReLU(\n                in_channels=high_in_channels,\n                out_channels=inter_channels,\n                kernel_size=3,\n     
           padding=1),\n            APNB(\n                in_channels=inter_channels,\n                out_channels=inter_channels,\n                key_channels=key_value_channels,\n                value_channels=key_value_channels,\n                dropout_prob=0.05,\n                repeat_sizes=([1]),\n                psp_size=psp_size))\n\n        self.cls = nn.Conv2D(\n            in_channels=inter_channels, out_channels=num_classes, kernel_size=1)\n        self.auxlayer = layers.AuxLayer(\n            in_channels=low_in_channels,\n            inter_channels=low_in_channels // 2,\n            out_channels=num_classes,\n            dropout_prob=0.05)\n\n        self.backbone_indices = backbone_indices\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        low_level_x = feat_list[self.backbone_indices[0]]\n        high_level_x = feat_list[self.backbone_indices[1]]\n        x = self.fusion(low_level_x, high_level_x)\n        x = self.context(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n\n        if self.enable_auxiliary_loss:\n            auxiliary_logit = self.auxlayer(low_level_x)\n            logit_list.append(auxiliary_logit)\n\n        return logit_list\n\n\nclass AFNB(nn.Layer):\n    \"\"\"\n    Asymmetric Fusion Non-local Block.\n\n    Args:\n        low_in_channels (int): Low-level-feature channels.\n        high_in_channels (int): High-level-feature channels.\n        out_channels (int): Out channels of AFNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        dropout_prob (float): The dropout rate of output.\n        repeat_sizes (tuple, optional): The number of AFNB modules. Default: ([1]).\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 low_in_channels: int,\n                 high_in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 dropout_prob: float,\n                 repeat_sizes: Tuple[int] = ([1]),\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.psp_size = psp_size\n        self.stages = nn.LayerList([\n            SelfAttentionBlock_AFNB(low_in_channels, high_in_channels,\n                                    key_channels, value_channels, out_channels,\n                                    size) for size in repeat_sizes\n        ])\n        self.conv_bn = layers.ConvBN(\n            in_channels=out_channels + high_in_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n    def forward(self, low_feats: paddle.Tensor, high_feats: paddle.Tensor) -> paddle.Tensor:\n        priors = [stage(low_feats, high_feats) for stage in self.stages]\n        context = priors[0]\n        for i in range(1, len(priors)):\n            context += priors[i]\n\n        output = self.conv_bn(paddle.concat([context, high_feats], axis=1))\n        output = self.dropout(output)\n\n        return output\n\n\nclass APNB(nn.Layer):\n    \"\"\"\n    Asymmetric Pyramid Non-local Block.\n\n    Args:\n        in_channels (int): The input channels of APNB module.\n        out_channels (int): Out channels of APNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        dropout_prob (float): The dropout rate of output.\n        repeat_sizes (tuple, optional): The number of APNB modules. Default: ([1]).\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 dropout_prob: float,\n                 repeat_sizes: Tuple[int] = ([1]),\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.psp_size = psp_size\n        self.stages = nn.LayerList([\n            SelfAttentionBlock_APNB(in_channels, out_channels, key_channels,\n                                    value_channels, size)\n            for size in repeat_sizes\n        ])\n        self.conv_bn = layers.ConvBNReLU(\n            in_channels=in_channels * 2,\n            out_channels=out_channels,\n            kernel_size=1)\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        priors = [stage(x) for stage in self.stages]\n        context = priors[0]\n        for i in range(1, len(priors)):\n            context += priors[i]\n\n        output = self.conv_bn(paddle.concat([context, x], axis=1))\n        output = self.dropout(output)\n\n        return output\n\n\ndef _pp_module(x: paddle.Tensor, psp_size: List[int]) -> paddle.Tensor:\n    n, c, h, w = x.shape\n    priors = []\n    for size in psp_size:\n        feat = F.adaptive_avg_pool2d(x, size)\n        feat = paddle.reshape(feat, shape=(0, c, -1))\n        priors.append(feat)\n    center = paddle.concat(priors, axis=-1)\n    return center\n\n\nclass SelfAttentionBlock_AFNB(nn.Layer):\n    \"\"\"\n    Self-Attention Block for AFNB module.\n\n    Args:\n        low_in_channels (int): Low-level-feature channels.\n        high_in_channels (int): High-level-feature channels.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        out_channels (int, optional): Out channels of AFNB module. 
Default: None.\n        scale (int, optional): Pooling size. Default: 1.\n        psp_size (tuple, optional): The out size of pooled feature maps. Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 low_in_channels: int,\n                 high_in_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 out_channels: int = None,\n                 scale: int = 1,\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.scale = scale\n        self.in_channels = low_in_channels\n        self.out_channels = out_channels\n        self.key_channels = key_channels\n        self.value_channels = value_channels\n        if out_channels is None:\n            self.out_channels = high_in_channels\n        self.pool = nn.MaxPool2D(scale)\n        self.f_key = layers.ConvBNReLU(\n            in_channels=low_in_channels,\n            out_channels=key_channels,\n            kernel_size=1)\n        self.f_query = layers.ConvBNReLU(\n            in_channels=high_in_channels,\n            out_channels=key_channels,\n            kernel_size=1)\n        self.f_value = nn.Conv2D(\n            in_channels=low_in_channels,\n            out_channels=value_channels,\n            kernel_size=1)\n\n        self.W = nn.Conv2D(\n            in_channels=value_channels,\n            out_channels=self.out_channels,\n            kernel_size=1)\n\n        self.psp_size = psp_size\n\n    def forward(self, low_feats: paddle.Tensor, high_feats: paddle.Tensor) -> paddle.Tensor:\n        batch_size, _, h, w = high_feats.shape\n\n        value = self.f_value(low_feats)\n        value = _pp_module(value, self.psp_size)\n        value = paddle.transpose(value, (0, 2, 1))\n\n        query = self.f_query(high_feats)\n        query = paddle.reshape(query, shape=(0, self.key_channels, -1))\n        query = paddle.transpose(query, perm=(0, 2, 1))\n\n        key = self.f_key(low_feats)\n 
       key = _pp_module(key, self.psp_size)\n\n        sim_map = paddle.matmul(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, perm=(0, 2, 1))\n        hf_shape = paddle.shape(high_feats)\n        context = paddle.reshape(\n            context, shape=[0, self.value_channels, hf_shape[2], hf_shape[3]])\n\n        context = self.W(context)\n\n        return context\n\n\nclass SelfAttentionBlock_APNB(nn.Layer):\n    \"\"\"\n    Self-Attention Block for APNB module.\n\n    Args:\n        in_channels (int): The input channels of APNB module.\n        out_channels (int): The out channels of APNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        scale (int, optional): Pooling size. Default: 1.\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 scale: int = 1,\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.scale = scale\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.key_channels = key_channels\n        self.value_channels = value_channels\n        self.pool = nn.MaxPool2D(scale)\n        self.f_key = layers.ConvBNReLU(\n            in_channels=self.in_channels,\n            out_channels=self.key_channels,\n            kernel_size=1)\n        self.f_query = self.f_key\n        self.f_value = nn.Conv2D(\n            in_channels=self.in_channels,\n            out_channels=self.value_channels,\n            kernel_size=1)\n        self.W = nn.Conv2D(\n            in_channels=self.value_channels,\n            out_channels=self.out_channels,\n            kernel_size=1)\n\n        self.psp_size = psp_size\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        batch_size, _, h, w = x.shape\n        if self.scale > 1:\n            x = self.pool(x)\n\n        value = self.f_value(x)\n        value = _pp_module(value, self.psp_size)\n        value = paddle.transpose(value, perm=(0, 2, 1))\n\n        query = self.f_query(x)\n        query = paddle.reshape(query, shape=(0, self.key_channels, -1))\n        query = paddle.transpose(query, perm=(0, 2, 1))\n\n        key = self.f_key(x)\n        key = _pp_module(key, self.psp_size)\n\n        sim_map = paddle.matmul(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, perm=(0, 2, 1))\n\n        x_shape = paddle.shape(x)\n        context = paddle.reshape(\n            context, 
shape=[0, self.value_channels, x_shape[2], x_shape[3]])\n        context = self.W(context)\n\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_cityscapes/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union, List, Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ann_resnet50_cityscapes.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1,\" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False,\n            data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap 
layer for quantization training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = 
self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 
128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n         
           dilation_rate = dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i 
== 0 else num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_voc/README.md",
    "content": "# ann_resnet50_voc\n\n|模型名称|ann_resnet50_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ann_resnet50vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|228MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ann](https://arxiv.org/pdf/1908.07678.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ann_resnet50_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ann_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ann_resnet50_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n    
    - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ann_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ann_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            
```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ann_resnet50_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ann_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_voc/README_en.md",
    "content": "# ann_resnet50_voc\n\n|Module Name|ann_resnet50_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ann_resnet50vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|228MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ann](https://arxiv.org/pdf/1908.07678.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ann_resnet50_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ann_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ann_resnet50_voc model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: This data augmentation module defines many preprocessing methods for segmentation data. Users can replace the preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset from the network and decompresses it to the `$HOME/.paddlehub/dataset` directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ann_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Path of the self-trained model; if it is None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When fine-tuning is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ann_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the prediction results as image files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ann_resnet50_voc\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ann_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union, List, Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass 
ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None) -> paddle.Tensor:\n        return paddle.add(x, y, name)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_voc/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom ann_resnet50_voc.resnet import ResNet50_vd\nimport ann_resnet50_voc.layers as layers\n\n@moduleinfo(\n    name=\"ann_resnet50_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ANNResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass ANN(nn.Layer):\n    \"\"\"\n    The ANN implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhen, Zhu, et al. \"Asymmetric Non-local Neural Networks for Semantic Segmentation\"\n    (https://arxiv.org/pdf/1908.07678.pdf).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Two values in the tuple indicate the indices of output of backbone.\n        key_value_channels (int, optional): The key and value channels of self-attention map in both AFNB and APNB modules.\n            Default: 256.\n        inter_channels (int, optional): Both input and output channels of APNB modules. 
Default: 512.\n        psp_size (tuple, optional): The out size of pooled feature maps. Default: (1, 3, 6, 8).\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. Default: True.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 key_value_channels: int = 256,\n                 inter_channels: int = 512,\n                 psp_size: Tuple[int] = (1, 3, 6, 8),\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(ANN, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        backbone_channels = [\n            self.backbone.feat_channels[i] for i in backbone_indices\n        ]\n\n        self.head = ANNHead(num_classes, backbone_indices, backbone_channels,\n                            key_value_channels, inter_channels, psp_size)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n            \n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n    \n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = 
self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\n\nclass ANNHead(nn.Layer):\n    \"\"\"\n    The ANNHead implementation.\n\n    It mainly consists of AFNB and APNB modules.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): Two values in the tuple indicate the indices of output of backbone.\n            The first index will be taken as low-level features; the second one will be\n            taken as high-level features in AFNB module. Usually backbone consists of four\n            downsampling stage, such as ResNet, and return an output of each stage. If it is (2, 3),\n            it means taking feature map of the third stage and the fourth stage in backbone.\n        backbone_channels (tuple): The same length with \"backbone_indices\". It indicates the channels of corresponding index.\n        key_value_channels (int): The key and value channels of self-attention map in both AFNB and APNB modules.\n        inter_channels (int): Both input and output channels of APNB modules.\n        psp_size (tuple): The out size of pooled feature maps.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int],\n                 backbone_channels: Tuple[int],\n                 key_value_channels: int,\n                 inter_channels: int,\n                 psp_size: Tuple[int],\n                 enable_auxiliary_loss: bool = False):\n        super().__init__()\n\n        low_in_channels = backbone_channels[0]\n        high_in_channels = backbone_channels[1]\n\n        self.fusion = AFNB(\n            low_in_channels=low_in_channels,\n            high_in_channels=high_in_channels,\n            out_channels=high_in_channels,\n            key_channels=key_value_channels,\n            value_channels=key_value_channels,\n            dropout_prob=0.05,\n            repeat_sizes=([1]),\n            psp_size=psp_size)\n\n        self.context = nn.Sequential(\n            layers.ConvBNReLU(\n                in_channels=high_in_channels,\n                out_channels=inter_channels,\n                kernel_size=3,\n                padding=1),\n            APNB(\n                in_channels=inter_channels,\n                out_channels=inter_channels,\n                key_channels=key_value_channels,\n                value_channels=key_value_channels,\n                dropout_prob=0.05,\n                repeat_sizes=([1]),\n                psp_size=psp_size))\n\n        self.cls = nn.Conv2D(\n            in_channels=inter_channels, out_channels=num_classes, kernel_size=1)\n        self.auxlayer = layers.AuxLayer(\n            in_channels=low_in_channels,\n            inter_channels=low_in_channels // 2,\n            out_channels=num_classes,\n            dropout_prob=0.05)\n\n        self.backbone_indices = backbone_indices\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        low_level_x = feat_list[self.backbone_indices[0]]\n  
      high_level_x = feat_list[self.backbone_indices[1]]\n        x = self.fusion(low_level_x, high_level_x)\n        x = self.context(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n\n        if self.enable_auxiliary_loss:\n            auxiliary_logit = self.auxlayer(low_level_x)\n            logit_list.append(auxiliary_logit)\n\n        return logit_list\n\n\nclass AFNB(nn.Layer):\n    \"\"\"\n    Asymmetric Fusion Non-local Block.\n\n    Args:\n        low_in_channels (int): Low-level-feature channels.\n        high_in_channels (int): High-level-feature channels.\n        out_channels (int): Out channels of AFNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        dropout_prob (float): The dropout rate of output.\n        repeat_sizes (tuple, optional): The number of AFNB modules. Default: ([1]).\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 low_in_channels: int,\n                 high_in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 dropout_prob: float,\n                 repeat_sizes: Tuple[int] = ([1]),\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.psp_size = psp_size\n        self.stages = nn.LayerList([\n            SelfAttentionBlock_AFNB(low_in_channels, high_in_channels,\n                                    key_channels, value_channels, out_channels,\n                                    size) for size in repeat_sizes\n        ])\n        self.conv_bn = layers.ConvBN(\n            in_channels=out_channels + high_in_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n    def forward(self, low_feats: List[paddle.Tensor], high_feats: List[paddle.Tensor]) -> paddle.Tensor:\n        priors = [stage(low_feats, high_feats) for stage in self.stages]\n        context = priors[0]\n        for i in range(1, len(priors)):\n            context += priors[i]\n\n        output = self.conv_bn(paddle.concat([context, high_feats], axis=1))\n        output = self.dropout(output)\n\n        return output\n\n\nclass APNB(nn.Layer):\n    \"\"\"\n    Asymmetric Pyramid Non-local Block.\n\n    Args:\n        in_channels (int): The input channels of APNB module.\n        out_channels (int): Out channels of APNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        dropout_prob (float): The dropout rate of output.\n        repeat_sizes (tuple, optional): The number of AFNB modules. Default: ([1]).\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 dropout_prob: float,\n                 repeat_sizes: Tuple[int] = ([1]),\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.psp_size = psp_size\n        self.stages = nn.LayerList([\n            SelfAttentionBlock_APNB(in_channels, out_channels, key_channels,\n                                    value_channels, size)\n            for size in repeat_sizes\n        ])\n        self.conv_bn = layers.ConvBNReLU(\n            in_channels=in_channels * 2,\n            out_channels=out_channels,\n            kernel_size=1)\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        priors = [stage(x) for stage in self.stages]\n        context = priors[0]\n        for i in range(1, len(priors)):\n            context += priors[i]\n\n        output = self.conv_bn(paddle.concat([context, x], axis=1))\n        output = self.dropout(output)\n\n        return output\n\n\ndef _pp_module(x: paddle.Tensor, psp_size: List[int]) -> paddle.Tensor:\n    n, c, h, w = x.shape\n    priors = []\n    for size in psp_size:\n        feat = F.adaptive_avg_pool2d(x, size)\n        feat = paddle.reshape(feat, shape=(0, c, -1))\n        priors.append(feat)\n    center = paddle.concat(priors, axis=-1)\n    return center\n\n\nclass SelfAttentionBlock_AFNB(nn.Layer):\n    \"\"\"\n    Self-Attention Block for AFNB module.\n\n    Args:\n        low_in_channels (int): Low-level-feature channels.\n        high_in_channels (int): High-level-feature channels.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        out_channels (int, optional): Out channels of AFNB module. 
Default: None.\n        scale (int, optional): Pooling size. Default: 1.\n        psp_size (tuple, optional): The out size of pooled feature maps. Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 low_in_channels: int,\n                 high_in_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 out_channels: int = None,\n                 scale: int = 1,\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.scale = scale\n        self.in_channels = low_in_channels\n        self.out_channels = out_channels\n        self.key_channels = key_channels\n        self.value_channels = value_channels\n        if out_channels is None:\n            self.out_channels = high_in_channels\n        self.pool = nn.MaxPool2D(scale)\n        self.f_key = layers.ConvBNReLU(\n            in_channels=low_in_channels,\n            out_channels=key_channels,\n            kernel_size=1)\n        self.f_query = layers.ConvBNReLU(\n            in_channels=high_in_channels,\n            out_channels=key_channels,\n            kernel_size=1)\n        self.f_value = nn.Conv2D(\n            in_channels=low_in_channels,\n            out_channels=value_channels,\n            kernel_size=1)\n\n        self.W = nn.Conv2D(\n            in_channels=value_channels,\n            out_channels=self.out_channels,\n            kernel_size=1)\n\n        self.psp_size = psp_size\n\n    def forward(self, low_feats: paddle.Tensor, high_feats: paddle.Tensor) -> paddle.Tensor:\n        batch_size, _, h, w = high_feats.shape\n\n        value = self.f_value(low_feats)\n        value = _pp_module(value, self.psp_size)\n        value = paddle.transpose(value, (0, 2, 1))\n\n        query = self.f_query(high_feats)\n        query = paddle.reshape(query, shape=(0, self.key_channels, -1))\n        query = paddle.transpose(query, perm=(0, 2, 1))\n\n        key = self.f_key(low_feats)\n 
       key = _pp_module(key, self.psp_size)\n\n        sim_map = paddle.matmul(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, perm=(0, 2, 1))\n        hf_shape = paddle.shape(high_feats)\n        context = paddle.reshape(\n            context, shape=[0, self.value_channels, hf_shape[2], hf_shape[3]])\n\n        context = self.W(context)\n\n        return context\n\n\nclass SelfAttentionBlock_APNB(nn.Layer):\n    \"\"\"\n    Self-Attention Block for APNB module.\n\n    Args:\n        in_channels (int): The input channels of APNB module.\n        out_channels (int): The out channels of APNB module.\n        key_channels (int): The key channels in self-attention block.\n        value_channels (int): The value channels in self-attention block.\n        scale (int, optional): Pooling size. Default: 1.\n        psp_size (tuple, optional): The out size of pooled feature maps. 
Default: (1, 3, 6, 8).\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 key_channels: int,\n                 value_channels: int,\n                 scale: int = 1,\n                 psp_size: Tuple[int] = (1, 3, 6, 8)):\n        super().__init__()\n\n        self.scale = scale\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.key_channels = key_channels\n        self.value_channels = value_channels\n        self.pool = nn.MaxPool2D(scale)\n        self.f_key = layers.ConvBNReLU(\n            in_channels=self.in_channels,\n            out_channels=self.key_channels,\n            kernel_size=1)\n        self.f_query = self.f_key\n        self.f_value = nn.Conv2D(\n            in_channels=self.in_channels,\n            out_channels=self.value_channels,\n            kernel_size=1)\n        self.W = nn.Conv2D(\n            in_channels=self.value_channels,\n            out_channels=self.out_channels,\n            kernel_size=1)\n\n        self.psp_size = psp_size\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        batch_size, _, h, w = x.shape\n        if self.scale > 1:\n            x = self.pool(x)\n\n        value = self.f_value(x)\n        value = _pp_module(value, self.psp_size)\n        value = paddle.transpose(value, perm=(0, 2, 1))\n\n        query = self.f_query(x)\n        query = paddle.reshape(query, shape=(0, self.key_channels, -1))\n        query = paddle.transpose(query, perm=(0, 2, 1))\n\n        key = self.f_key(x)\n        key = _pp_module(key, self.psp_size)\n\n        sim_map = paddle.matmul(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, perm=(0, 2, 1))\n\n        x_shape = paddle.shape(x)\n        context = paddle.reshape(\n            context, 
shape=[0, self.value_channels, x_shape[2], x_shape[3]])\n        context = self.W(context)\n\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ann_resnet50_voc/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union, List, Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ann_resnet50_voc.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int  = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1,\" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False,\n            data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap 
layer for quantization training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = 
self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int]=(1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 
256]\n        num_filters = [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n              
      dilation_rate = dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 
else num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/README.md",
    "content": "# bisenet_lane_segmentation\n\n|模型名称|bisenet_lane_segmentation|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|bisenet|\n|数据集|TuSimple|\n|是否支持Fine-tuning|否|\n|模型大小|9.7MB|\n|指标|ACC96.09%|\n|最新更新日期|2021-12-03|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例(左为原图，右为效果图)：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/146115316-e9ed4220-8470-432f-b3f1-549d2bcdc845.jpg\" /> \n    <img src=\"https://user-images.githubusercontent.com/35907364/146115396-a7d19290-6117-4831-bc35-4b14ae8f90bc.png\" /> \n    </p>\n\n- ### 模型介绍\n\n  - 车道线分割是自动驾驶算法的一个范畴，可以用来辅助进行车辆定位和进行决策，早期已有基于传统图像处理的车道线检测方法，但是随着技术的演进，车道线检测任务所应对的场景越来越多样化，目前更多的方式是寻求在语义上对车道线存在位置的检测。bisenet_lane_segmentation是一个轻量化车道线分割模型。\n\n  - 更多详情请参考：[bisenet_lane_segmentation](https://github.com/PaddlePaddle/PaddleSeg)\n  \n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n    \n    - Python >= 3.7+\n\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install bisenet_lane_segmentation\n      ```\n      \n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n    \n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run bisenet_lane_segmentation --input_path \"/PATH/TO/IMAGE\"\n    ```\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"bisenet_lane_segmentation\")\n      result = model.predict(image_list=[\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self, \n                    image_list, \n                    visualization, \n                    save_path):\n      ```\n\n        - 
车道线分割预测API，用于将输入图片中的车道线分割出来。\n\n        - 参数\n\n            - image_list (list(str | numpy.ndarray)): 图片输入路径或者BGR格式numpy数据。\n            - visualization (bool): 是否进行可视化，默认为False。\n            - save_path (str): 当visualization为True时，保存图片的路径，默认为\"bisenet_lane_segmentation_output\"。\n\n        - 返回\n\n            - result (list(numpy.ndarray))：模型分割结果。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署车道线分割在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m bisenet_lane_segmentation\n      ```\n\n    - 这样就完成了一个车道线分割在线服务API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果：\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/bisenet_lane_segmentation\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    #print(r.json())\n    mask = base64_to_cv2(r.json()[\"results\"]['data'][0])\n    print(mask)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/README_en.md",
    "content": "# bisenet_lane_segmentation\n\n|Module Name|bisenet_lane_segmentation|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|bisenet|\n|Dataset|TuSimple|\n|Support Fine-tuning|No|\n|Module Size|9.7MB|\n|Data Indicators|ACC96.09%|\n|Latest update date|2021-12-03|\n\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/146115316-e9ed4220-8470-432f-b3f1-549d2bcdc845.jpg\" /> \n    <img src=\"https://user-images.githubusercontent.com/35907364/146115396-a7d19290-6117-4831-bc35-4b14ae8f90bc.png\" /> \n    </p>\n\n- ### Module Introduction\n\n  - Lane segmentation is a category of automatic driving algorithms, which can be used to assist vehicle positioning and decision-making. In the early days, there were lane detection methods based on traditional image processing, but with the evolution of technology, the scenes that lane detection tasks deal with More and more diversified, and more methods are currently seeking to detect the location of lane semantically. bisenet_lane_segmentation is a lightweight model for lane segmentation.\n\n\n  \n  - For more information, please refer to: [bisenet_lane_segmentation](https://github.com/PaddlePaddle/PaddleSeg)\n  \n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.2.0\n\n    - paddlehub >= 2.1.0\n\n    - paddleseg >= 2.3.0\n    \n    - Python >= 3.7+\n\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install bisenet_lane_segmentation\n      ```\n      \n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n    \n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run bisenet_lane_segmentation --input_path \"/PATH/TO/IMAGE\"\n    ```\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n    - ```python\n      import paddlehub as hub\n      import cv2\n\n      model = hub.Module(name=\"bisenet_lane_segmentation\")\n      result = model.predict(image_list=[\"/PATH/TO/IMAGE\"])\n      print(result)\n      ```\n- ### 3、API\n\n    - ```python\n        def predict(self,\n                    image_list,\n                    visualization,\n                    save_path):\n      ```\n\n        - Prediction API for lane segmentation.\n\n        - **Parameters**\n\n            - image_list (list(str | numpy.ndarray)): Image path or image data; ndarray.shape is in the format \\[H, W, C\\], BGR.\n            - visualization (bool): Whether to save the recognition results as picture files, default is False.\n            - save_path (str): Save path of images, \"bisenet_lane_segmentation_output\" by default.\n\n        - **Return**\n\n            - result (list(numpy.ndarray)): The list of model results.\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of lane segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n    - ```shell\n      $ hub serving start -m bisenet_lane_segmentation\n      ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n    ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/bisenet_lane_segmentation\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    #print(r.json())\n    mask = base64_to_cv2(r.json()[\"results\"]['data'][0])\n    print(mask)\n    ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/lane_processor/get_lane_coords.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# this code is based on\n# https://github.com/ZJULearning/resa/blob/main/datasets/tusimple.py\n\nimport cv2\nimport numpy as np\n\n\nclass LaneProcessor:\n    def __init__(self,\n                 num_classes=2,\n                 ori_shape=(720, 1280),\n                 cut_height=0,\n                 y_pixel_gap=10,\n                 points_nums=56,\n                 thresh=0.6,\n                 smooth=True):\n        super(LaneProcessor, self).__init__()\n        self.num_classes = num_classes\n        self.ori_shape = ori_shape\n        self.cut_height = cut_height\n        self.y_pixel_gap = y_pixel_gap\n        self.points_nums = points_nums\n        self.thresh = thresh\n        self.smooth = smooth\n\n    def get_lane_coords(self, seg_pred):\n        lane_coords_list = []\n        for batch in range(len(seg_pred)):\n            seg = seg_pred[batch]\n            lane_coords = self.heatmap2coords(seg)\n            for i in range(len(lane_coords)):\n                lane_coords[i] = sorted(\n                    lane_coords[i], key=lambda pair: pair[1])\n            lane_coords_list.append(lane_coords)\n        return lane_coords_list\n\n    def process_gap(self, coordinate):\n        if any(x > 0 for x in coordinate):\n            start = [i for i, x in enumerate(coordinate) if x > 0][0]\n            end = [\n                i 
for i, x in reversed(list(enumerate(coordinate))) if x > 0\n            ][0]\n            lane = coordinate[start:end + 1]\n            # The line segment is not continuous\n            if any(x < 0 for x in lane):\n                gap_start = [\n                    i for i, x in enumerate(lane[:-1])\n                    if x > 0 and lane[i + 1] < 0\n                ]\n                gap_end = [\n                    i + 1 for i, x in enumerate(lane[:-1])\n                    if x < 0 and lane[i + 1] > 0\n                ]\n                gap_id = [i for i, x in enumerate(lane) if x < 0]\n                if len(gap_start) == 0 or len(gap_end) == 0:\n                    return coordinate\n                for id in gap_id:\n                    for i in range(len(gap_start)):\n                        if i >= len(gap_end):\n                            return coordinate\n                        if id > gap_start[i] and id < gap_end[i]:\n                            gap_width = float(gap_end[i] - gap_start[i])\n                            # line interpolation\n                            lane[id] = int((id - gap_start[i]) / gap_width *\n                                           lane[gap_end[i]] +\n                                           (gap_end[i] - id) / gap_width *\n                                           lane[gap_start[i]])\n                if not all(x > 0 for x in lane):\n                    print(\"Gaps still exist!\")\n                coordinate[start:end + 1] = lane\n        return coordinate\n\n    def get_coords(self, heat_map):\n        dst_height = self.ori_shape[0] - self.cut_height\n        coords = np.zeros(self.points_nums)\n        coords[:] = -2\n        pointCount = 0\n        for i in range(self.points_nums):\n            y_coord = dst_height - 10 - i * self.y_pixel_gap\n            y = int(y_coord / dst_height * heat_map.shape[0])\n            if y < 0:\n                break\n            prob_line = heat_map[y, :]\n            x = 
np.argmax(prob_line)\n            prob = prob_line[x]\n            if prob > self.thresh:\n                coords[i] = int(x / heat_map.shape[1] * self.ori_shape[1])\n                pointCount = pointCount + 1\n        if pointCount < 2:\n            coords[:] = -2\n        self.process_gap(coords)\n        return coords\n\n    def fix_outliers(self, coords):\n        data = [x for i, x in enumerate(coords) if x > 0]\n        index = [i for i, x in enumerate(coords) if x > 0]\n        if len(data) == 0:\n            return coords\n        diff = []\n        is_outlier = False\n        n = 1\n        x_gap = abs((data[-1] - data[0]) / (1.0 * (len(data) - 1)))\n        for idx, dt in enumerate(data):\n            if is_outlier == False:\n                t = idx - 1\n                n = 1\n            if idx == 0:\n                diff.append(0)\n            else:\n                diff.append(abs(data[idx] - data[t]))\n                if abs(data[idx] - data[t]) > n * (x_gap * 1.5):\n                    n = n + 1\n                    is_outlier = True\n                    ind = index[idx]\n                    coords[ind] = -1\n                else:\n                    is_outlier = False\n\n    def heatmap2coords(self, seg_pred):\n        coordinates = []\n        for i in range(self.num_classes - 1):\n            heat_map = seg_pred[i + 1]\n            if self.smooth:\n                heat_map = cv2.blur(\n                    heat_map, (9, 9), borderType=cv2.BORDER_REPLICATE)\n            coords = self.get_coords(heat_map)\n            indexes = [i for i, x in enumerate(coords) if x > 0]\n            if not indexes:\n                continue\n            self.add_coords(coordinates, coords)\n\n        if len(coordinates) == 0:\n            coords = np.zeros(self.points_nums)\n            self.add_coords(coordinates, coords)\n        return coordinates\n\n    def add_coords(self, coordinates, coords):\n        sub_lanes = []\n        for j in 
range(self.points_nums):\n            y_lane = self.ori_shape[0] - 10 - j * self.y_pixel_gap\n            x_lane = coords[j] if coords[j] > 0 else -2\n            sub_lanes.append([x_lane, y_lane])\n        coordinates.append(sub_lanes)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/lane_processor/lane.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# this code is from https://github.com/TuSimple/tusimple-benchmark/blob/master/evaluate/lane.py\n\nimport json as json\nimport numpy as np\nfrom sklearn.linear_model import LinearRegression\n\n\nclass LaneEval(object):\n    lr = LinearRegression()\n    pixel_thresh = 20\n    pt_thresh = 0.85\n\n    @staticmethod\n    def get_angle(xs, y_samples):\n        xs, ys = xs[xs >= 0], y_samples[xs >= 0]\n        if len(xs) > 1:\n            LaneEval.lr.fit(ys[:, None], xs)\n            k = LaneEval.lr.coef_[0]\n            theta = np.arctan(k)\n        else:\n            theta = 0\n        return theta\n\n    @staticmethod\n    def line_accuracy(pred, gt, thresh):\n        pred = np.array([p if p >= 0 else -100 for p in pred])\n        gt = np.array([g if g >= 0 else -100 for g in gt])\n        return np.sum(np.where(np.abs(pred - gt) < thresh, 1., 0.)) / len(gt)\n\n    @staticmethod\n    def bench(pred, gt, y_samples, running_time):\n        if any(len(p) != len(y_samples) for p in pred):\n            raise Exception('Format of lanes error.')\n        if running_time > 200 or len(gt) + 2 < len(pred):\n            return 0., 0., 1.\n        angles = [\n            LaneEval.get_angle(np.array(x_gts), np.array(y_samples))\n            for x_gts in gt\n        ]\n        threshs = [LaneEval.pixel_thresh / np.cos(angle) for angle in angles]\n   
     line_accs = []\n        fp, fn = 0., 0.\n        matched = 0.\n        for x_gts, thresh in zip(gt, threshs):\n            accs = [\n                LaneEval.line_accuracy(\n                    np.array(x_preds), np.array(x_gts), thresh)\n                for x_preds in pred\n            ]\n            max_acc = np.max(accs) if len(accs) > 0 else 0.\n            if max_acc < LaneEval.pt_thresh:\n                fn += 1\n            else:\n                matched += 1\n            line_accs.append(max_acc)\n        fp = len(pred) - matched\n        if len(gt) > 4 and fn > 0:\n            fn -= 1\n        s = sum(line_accs)\n        if len(gt) > 4:\n            s -= min(line_accs)\n        return s / max(min(4.0, len(gt)),\n                       1.), fp / len(pred) if len(pred) > 0 else 0., fn / max(\n                           min(len(gt), 4.), 1.)\n\n    @staticmethod\n    def bench_one_submit(pred_file, gt_file):\n        try:\n            json_pred = [\n                json.loads(line) for line in open(pred_file).readlines()\n            ]\n        except BaseException as e:\n            raise Exception('Fail to load json file of the prediction.')\n        json_gt = [json.loads(line) for line in open(gt_file).readlines()]\n        if len(json_gt) != len(json_pred):\n            raise Exception(\n                'We do not get the predictions of all the test tasks')\n        gts = {l['raw_file']: l for l in json_gt}\n        accuracy, fp, fn = 0., 0., 0.\n        for pred in json_pred:\n            if 'raw_file' not in pred or 'lanes' not in pred or 'run_time' not in pred:\n                raise Exception(\n                    'raw_file or lanes or run_time not in some predictions.')\n            raw_file = pred['raw_file']\n            pred_lanes = pred['lanes']\n            run_time = pred['run_time']\n            if raw_file not in gts:\n                raise Exception(\n                    'Some raw_file from your predictions do not exist in the test 
tasks.'\n                )\n            gt = gts[raw_file]\n            gt_lanes = gt['lanes']\n            y_samples = gt['h_samples']\n            try:\n                a, p, n = LaneEval.bench(pred_lanes, gt_lanes, y_samples,\n                                         run_time)\n            except BaseException as e:\n                raise Exception('Format of lanes error.')\n            accuracy += a\n            fp += p\n            fn += n\n        num = len(gts)\n        # the first return parameter is the default ranking parameter\n        return json.dumps([{\n            'name': 'Accuracy',\n            'value': accuracy / num,\n            'order': 'desc'\n        }, {\n            'name': 'FP',\n            'value': fp / num,\n            'order': 'asc'\n        }, {\n            'name': 'FN',\n            'value': fn / num,\n            'order': 'asc'\n        }]), accuracy / num, fp / num, fn / num\n\n\nif __name__ == '__main__':\n    import sys\n\n    try:\n        if len(sys.argv) != 3:\n            raise Exception('Invalid input arguments')\n        print(LaneEval.bench_one_submit(sys.argv[1], sys.argv[2]))\n    except Exception as e:\n        # Python 3 exceptions have no `.message` attribute; use str(e) instead\n        print(e)\n        sys.exit(str(e))\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/lane_processor/tusimple_processor.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport cv2\nimport json\nimport paddle.nn as nn\n\nfrom .lane import LaneEval\nfrom .get_lane_coords import LaneProcessor\n\n\ndef mkdir(path):\n    sub_dir = os.path.dirname(path)\n    if not os.path.exists(sub_dir):\n        os.makedirs(sub_dir)\n\n\nclass TusimpleProcessor:\n    def __init__(self,\n                 num_classes=2,\n                 ori_shape=(720, 1280),\n                 cut_height=0,\n                 thresh=0.6,\n                 test_gt_json=None,\n                 save_dir='output/'):\n        super(TusimpleProcessor, self).__init__()\n        self.num_classes = num_classes\n        self.dump_to_json = []\n        self.save_dir = save_dir\n        self.test_gt_json = test_gt_json\n        self.color_map = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0),\n                          (255, 0, 255), (0, 255, 125), (50, 100, 50),\n                          (100, 50, 100)]\n        self.laneProcessor = LaneProcessor(\n            num_classes=self.num_classes,\n            ori_shape=ori_shape,\n            cut_height=cut_height,\n            y_pixel_gap=10,\n            points_nums=56,\n            thresh=thresh,\n            smooth=True)\n\n    def dump_data_to_json(self,\n                          output,\n                          im_path,\n                          run_time=0,\n               
           is_dump_json=True,\n                          is_view=False):\n        seg_pred = output[0]\n        seg_pred = nn.functional.softmax(seg_pred, axis=1)\n        seg_pred = seg_pred.numpy()\n        lane_coords_list = self.laneProcessor.get_lane_coords(seg_pred)\n\n        for batch in range(len(seg_pred)):\n            lane_coords = lane_coords_list[batch]\n            path_list = im_path[batch].split(\"/\")\n            if is_dump_json:\n                json_pred = {}\n                json_pred['lanes'] = []\n                json_pred['run_time'] = run_time * 1000\n                json_pred['h_sample'] = []\n\n                json_pred['raw_file'] = os.path.join(*path_list[-4:])\n                for l in lane_coords:\n                    if len(l) == 0:\n                        continue\n                    json_pred['lanes'].append([])\n                    for (x, y) in l:\n                        json_pred['lanes'][-1].append(int(x))\n                for (x, y) in lane_coords[0]:\n                    json_pred['h_sample'].append(y)\n                self.dump_to_json.append(json.dumps(json_pred))\n\n            if is_view:\n                img = cv2.imread(im_path[batch])\n                if is_dump_json:\n                    img_name = '_'.join([x for x in path_list[-4:]])\n                    sub_dir = 'visual_eval'\n                else:\n                    img_name = os.path.basename(im_path[batch])\n                    sub_dir = 'visual_points'\n                saved_path = os.path.join(self.save_dir, sub_dir, img_name)\n                self.draw(img, lane_coords, saved_path)\n\n    def predict(self, output, im_path):\n        self.dump_data_to_json(\n            output, [im_path], is_dump_json=False, is_view=True)\n\n    def bench_one_submit(self):\n        output_file = os.path.join(self.save_dir, 'pred.json')\n        if output_file is not None:\n            mkdir(output_file)\n        with open(output_file, \"w+\") as f:\n            for line 
in self.dump_to_json:\n                print(line, end=\"\\n\", file=f)\n\n        eval_rst, acc, fp, fn = LaneEval.bench_one_submit(\n            output_file, self.test_gt_json)\n        self.dump_to_json = []\n        return acc, fp, fn, eval_rst\n\n    def draw(self, img, coords, file_path=None):\n        for i, coord in enumerate(coords):\n            for x, y in coord:\n                if x <= 0 or y <= 0:\n                    continue\n                cv2.circle(img, (int(x), int(y)), 4,\n                           self.color_map[i % self.num_classes], 2)\n\n        if file_path is not None:\n            mkdir(file_path)\n            cv2.imwrite(file_path, img)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\nimport argparse\nimport os\nfrom typing import Union, List, Tuple\n\nimport cv2\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo, runnable, serving\nimport paddleseg.transforms as T\nfrom paddleseg.utils import logger, progbar, visualize\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nimport paddleseg.utils as utils\nfrom paddleseg.models import layers\nfrom paddleseg.models import BiSeNetV2\n\nfrom bisenet_lane_segmentation.processor import Crop, reverse_transform, cv2_to_base64, base64_to_cv2\nfrom bisenet_lane_segmentation.lane_processor.tusimple_processor import TusimpleProcessor\n\n\n@moduleinfo(\n    name=\"bisenet_lane_segmentation\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"BiSeNetLane is a lane segmentation model.\",\n    version=\"1.0.0\")\nclass BiSeNetLane(nn.Layer):\n    \"\"\"\n    BiSeNetLane uses BiSeNetV2 to perform lane segmentation.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        lambd (float, optional): A factor for controlling the size of semantic branch channels. Default: 0.25.\n        pretrained (str, optional): The path or url of pretrained model. 
Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 7,\n                 lambd: float = 0.25,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(BiSeNetLane, self).__init__()\n\n        self.net = BiSeNetV2(\n            num_classes=num_classes,\n            lambd=lambd,\n            align_corners=align_corners,\n            pretrained=None)\n\n        self.transforms = [Crop(up_h_off=160), T.Resize([640, 368]), T.Normalize()]\n        self.cut_height = 160\n        self.postprocessor = TusimpleProcessor(num_classes=7, cut_height=160,)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n            \n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        logit_list = self.net(x)\n        return logit_list\n    \n    def predict(self, image_list: list, visualization: bool = False, save_path: str = \"bisenet_lane_segmentation_output\") -> List[np.ndarray]:\n        self.eval()\n        result = []\n        with paddle.no_grad():\n            for i, im in enumerate(image_list):\n                if isinstance(im, str):\n                    im = cv2.imread(im)\n\n                ori_shape = im.shape[:2]\n                for op in self.transforms:\n                    outputs = op(im)\n                    im = outputs[0]\n\n                im = np.transpose(im, (2, 0, 1))\n                im = im[np.newaxis, ...]\n                im = paddle.to_tensor(im)\n                logit = self.forward(im)[0]\n                pred = reverse_transform(logit, ori_shape, self.transforms, mode='bilinear')\n       
         pred = paddle.argmax(pred, axis=1, keepdim=True, dtype='int32')\n                pred = paddle.squeeze(pred[0])\n                pred = pred.numpy().astype('uint8')\n                if visualization:\n                    color_map = visualize.get_color_map_list(256)\n                    pred_mask = visualize.get_pseudo_color_map(pred, color_map)\n                    if not os.path.exists(save_path):\n                        os.makedirs(save_path)\n                    img_name = str(time.time()) + '.png'\n                    image_save_path = os.path.join(save_path, img_name)\n                    pred_mask.save(image_save_path)\n                result.append(pred)\n        return result\n\n    @serving\n    def serving_method(self, images: str, **kwargs) -> dict:\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        outputs = self.predict(image_list=images_decode, **kwargs)\n        serving_data = [cv2_to_base64(outputs[i]) for i in range(len(outputs))]\n        results = {'data': serving_data}\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs: list) -> List[np.ndarray]:\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n\n        results = self.predict(image_list=[args.input_path], save_path=args.output_dir, visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default=\"bisenet_lane_segmentation_output\", help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument(\n            '--visualization',\n            # argparse's type=bool treats any non-empty string (including \"False\") as True\n            type=lambda s: str(s).lower() in ('true', 't', 'yes', '1'),\n            default=True,\n            help=\"Whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"Path to the input image.\")\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenet_lane_segmentation/processor.py",
    "content": "import base64\nimport collections.abc\nfrom itertools import combinations\nfrom typing import Union, List, Tuple, Callable\n\nimport numpy as np\nimport cv2\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef get_reverse_list(ori_shape: list, transforms: Callable) -> list:\n    \"\"\"\n    get reverse list of transform.\n\n    Args:\n        ori_shape (list): Origin shape of image.\n        transforms (list): List of transform.\n\n    Returns:\n        list: List of tuple, there are two format:\n            ('resize', (h, w)) The image shape before resize,\n            ('padding', (h, w)) The image shape before padding.\n    \"\"\"\n    reverse_list = []\n    h, w = ori_shape[0], ori_shape[1]\n    for op in transforms:\n        if op.__class__.__name__ in ['Resize']:\n            reverse_list.append(('resize', (h, w)))\n            h, w = op.target_size[0], op.target_size[1]\n        if op.__class__.__name__ in ['Crop']:\n            reverse_list.append(('crop', (op.up_h_off, op.down_h_off),\n                                 (op.left_w_off, op.right_w_off)))\n            h = h - op.up_h_off\n            h = h - op.down_h_off\n            w = w - op.left_w_off\n            w = w - op.right_w_off\n        if op.__class__.__name__ in ['ResizeByLong']:\n            reverse_list.append(('resize', (h, w)))\n            long_edge = max(h, w)\n            short_edge = min(h, w)\n            short_edge = int(round(short_edge * op.long_size / long_edge))\n            long_edge = op.long_size\n            if h > w:\n                h = long_edge\n                w = short_edge\n            else:\n                w = long_edge\n                h = short_edge\n        if op.__class__.__name__ in ['ResizeByShort']:\n            reverse_list.append(('resize', (h, w)))\n            long_edge = max(h, w)\n            short_edge = min(h, w)\n            long_edge = int(round(long_edge * op.short_size / short_edge))\n            short_edge = op.short_size\n      
      if h > w:\n                h = long_edge\n                w = short_edge\n            else:\n                w = long_edge\n                h = short_edge\n        if op.__class__.__name__ in ['Padding']:\n            reverse_list.append(('padding', (h, w)))\n            w, h = op.target_size[0], op.target_size[1]\n        if op.__class__.__name__ in ['PaddingByAspectRatio']:\n            reverse_list.append(('padding', (h, w)))\n            ratio = w / h\n            if ratio == op.aspect_ratio:\n                pass\n            elif ratio > op.aspect_ratio:\n                h = int(w / op.aspect_ratio)\n            else:\n                w = int(h * op.aspect_ratio)\n        if op.__class__.__name__ in ['LimitLong']:\n            long_edge = max(h, w)\n            short_edge = min(h, w)\n            if ((op.max_long is not None) and (long_edge > op.max_long)):\n                reverse_list.append(('resize', (h, w)))\n                long_edge = op.max_long\n                short_edge = int(round(short_edge * op.max_long / long_edge))\n            elif ((op.min_long is not None) and (long_edge < op.min_long)):\n                reverse_list.append(('resize', (h, w)))\n                long_edge = op.min_long\n                short_edge = int(round(short_edge * op.min_long / long_edge))\n            if h > w:\n                h = long_edge\n                w = short_edge\n            else:\n                w = long_edge\n                h = short_edge\n    return reverse_list\n\n\ndef reverse_transform(pred: paddle.Tensor, ori_shape: list, transforms: Callable, mode: str = 'nearest') -> paddle.Tensor:\n    \"\"\"recover pred to origin shape\"\"\"\n    reverse_list = get_reverse_list(ori_shape, transforms)\n    for item in reverse_list[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            # if paddle.get_device() == 'cpu':\n            #     pred = paddle.cast(pred, 'uint8')\n            #     pred = F.interpolate(pred, 
(h, w), mode=mode)\n            #     pred = paddle.cast(pred, 'int32')\n            # else:\n            pred = F.interpolate(pred, (h, w), mode=mode)\n        elif item[0] == 'crop':\n            up_h_off, down_h_off = item[1][0], item[1][1]\n            left_w_off, right_w_off = item[2][0], item[2][1]\n            pred = F.pad(\n                pred, [left_w_off, right_w_off, up_h_off, down_h_off],\n                value=0,\n                mode='constant',\n                data_format=\"NCHW\")\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            pred = pred[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return pred\n\n\nclass Crop:\n    \"\"\"\n    Crop an image from its four sides.\n\n    Args:\n        up_h_off (int, optional): The height to crop from the top edge. Default: 0.\n        down_h_off (int, optional): The height to crop from the bottom edge. Default: 0.\n        left_w_off (int, optional): The width to crop from the left edge. Default: 0.\n        right_w_off (int, optional): The width to crop from the right edge. 
Default: 0.\n    \"\"\"\n\n    def __init__(self, up_h_off: int = 0, down_h_off: int = 0, left_w_off: int = 0, right_w_off: int = 0):\n        self.up_h_off = up_h_off\n        self.down_h_off = down_h_off\n        self.left_w_off = left_w_off\n        self.right_w_off = right_w_off\n\n    def __call__(self, im: np.ndarray, label: np.ndarray = None) -> Tuple[np.ndarray]:\n        if self.up_h_off < 0 or self.down_h_off < 0 or self.left_w_off < 0 or self.right_w_off < 0:\n            raise Exception(\n                \"up_h_off, down_h_off, left_w_off and right_w_off must be greater than or equal to zero\"\n            )\n\n        if self.up_h_off > 0 and self.up_h_off < im.shape[0]:\n            im = im[self.up_h_off:, :, :]\n            if label is not None:\n                label = label[self.up_h_off:, :]\n\n        if self.down_h_off > 0 and self.down_h_off < im.shape[0]:\n            im = im[:-self.down_h_off, :, :]\n            if label is not None:\n                label = label[:-self.down_h_off, :]\n\n        if self.left_w_off > 0 and self.left_w_off < im.shape[1]:\n            im = im[:, self.left_w_off:, :]\n            if label is not None:\n                label = label[:, self.left_w_off:]\n\n        if self.right_w_off > 0 and self.right_w_off < im.shape[1]:\n            im = im[:, :-self.right_w_off, :]\n            if label is not None:\n                label = label[:, :-self.right_w_off]\n\n        if label is None:\n            return (im, )\n        else:\n            return (im, label)\n\n\ndef cv2_to_base64(image: np.ndarray) -> str:\n    \"\"\"\n    Convert data from BGR to base64 format.\n    \"\"\"\n    data = cv2.imencode('.png', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str) -> np.ndarray:\n    \"\"\"\n    Convert data from base64 to BGR format.\n    \"\"\"\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, 
cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenetv2_cityscapes/README.md",
    "content": "# PaddleHub Image Segmentation\n\n## Model Prediction\n\nTo run prediction with the provided pretrained model, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='bisenetv2_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n\n## How to Start Fine-tuning\n\nThis example shows how to fine-tune the pretrained model with PaddleHub and run prediction.\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune the bisenetv2_cityscapes model on datasets such as OpticDiscSeg.\n\n## Code Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the data preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module provides a rich set of preprocessing operations for image segmentation data; replace them according to your needs.\n\n### Step2: Download and load the dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: the data preprocessing pipeline.\n* `mode`: the dataset split; one of `train`, `test`, `val`. Default: `train`.\n\nSee [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py) for the dataset preparation code. `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n### Step3: Load the pretrained model\n\n```python\nmodel = hub.Module(name='bisenetv2_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: the name of the pretrained model.\n* `num_classes`: the number of segmentation classes.\n* `pretrained`: path to your own trained weights; if None, the default pretrained parameters are loaded.\n\n### Step4: Choose the optimization strategy and run configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization strategy\n\nPaddle 2.0 provides a variety of optimizers, such as `SGD`, `Adam`, and `Adamax`. For `Adam`:\n\n* `learning_rate`: the global learning rate.\n* `parameters`: the model parameters to optimize.\n\n#### Run configuration\n`Trainer` controls the fine-tuning process through the following parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use the GPU, default False;\n* `use_vdl`: whether to use VisualDL to visualize training;\n* `checkpoint_dir`: the directory for saving model parameters;\n* `compare_metrics`: the metric used to select the best model;\n\n`trainer.train` controls the training loop through the following parameters:\n\n* `train_dataset`: the training dataset;\n* `epochs`: the number of training epochs;\n* `batch_size`: the training batch size; when using a GPU, adjust it to the available memory;\n* `num_workers`: the number of workers, default 0;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: the logging interval, in training batches;\n* `save_interval`: the checkpoint-saving interval, in training epochs.\n\n## Model Prediction\n\nWhen fine-tuning finishes, the model that performs best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nWe use that model for prediction. The predict.py script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='bisenetv2_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nOnce the arguments are set, run `python predict.py`.\n**Args**\n* `images`: image paths or images in BGR format;\n* `visualization`: whether to visualize the results, default True;\n* `save_path`: the directory for saving results, default 'seg_result'.\n\n**NOTE:** For prediction, the module, checkpoint_dir, and dataset must match the ones used for fine-tuning.\n\n## Serving\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the start command:\n\n```shell\n$ hub serving start -m bisenetv2_cityscapes\n```\n\nThis deploys the image segmentation API service; the default port is 8866.\n\n**NOTE:** When predicting on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise no setup is needed.\n\n### Step2: Send a prediction request\n\nWith the server configured, the following lines send a prediction request and fetch the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# Send the HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/bisenetv2_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### Source Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenetv2_cityscapes/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu' or os.environ.get('PADDLESEG_EXPORT_STAGE'):\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: 
paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvReLUPool(nn.Layer):\n    \"\"\"Basic conv relu pool layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int):\n        super().__init__()\n        self.conv = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1, dilation=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = F.relu(x)\n        x = F.max_pool2d(x, kernel_size=2, stride=2)\n        return x\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Basic separable conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass DepthwiseConvBN(nn.Layer):\n    \"\"\"Basic depthwise conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n     
   in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self, in_channels: int, inter_channels: int, out_channels: int, dropout_prob: float = 0.1):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=in_channels, out_channels=inter_channels, kernel_size=3, padding=1)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(in_channels=inter_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/bisenetv2_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport bisenetv2_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"bisenetv2_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"BiSeNetV2 is a segmentation model trained on Cityscapes.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass BiSeNetV2(nn.Layer):\n    \"\"\"\n    The BiSeNet V2 implementation based on PaddlePaddle.\n\n    The original article refers to\n    Yu, Changqian, et al. \"BiSeNet V2: Bilateral Network with Guided Aggregation for Real-time Semantic Segmentation\"\n    (https://arxiv.org/abs/2004.02147)\n\n    Args:\n        num_classes (int): The unique number of target classes, default is 19.\n        lambd (float, optional): A factor for controlling the size of semantic branch channels. Default: 0.25.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. 
Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self, num_classes: int = 19, lambd: float = 0.25, align_corners: bool = False, pretrained: str = None):\n        super(BiSeNetV2, self).__init__()\n\n        C1, C2, C3 = 64, 64, 128\n        db_channels = (C1, C2, C3)\n        C1, C3, C4, C5 = int(C1 * lambd), int(C3 * lambd), 64, 128\n        sb_channels = (C1, C3, C4, C5)\n        mid_channels = 128\n\n        self.db = DetailBranch(db_channels)\n        self.sb = SemanticBranch(sb_channels)\n\n        self.bga = BGA(mid_channels, align_corners)\n        self.aux_head1 = SegHead(C1, C1, num_classes)\n        self.aux_head2 = SegHead(C3, C3, num_classes)\n        self.aux_head3 = SegHead(C4, C4, num_classes)\n        self.aux_head4 = SegHead(C5, C5, num_classes)\n        self.head = SegHead(mid_channels, mid_channels, num_classes)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'bisenet_model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        dfm = self.db(x)\n        feat1, feat2, feat3, feat4, sfm = self.sb(x)\n        logit = self.head(self.bga(dfm, sfm))\n\n        if not self.training:\n            logit_list = [logit]\n        else:\n            logit1 = self.aux_head1(feat1)\n            logit2 = self.aux_head2(feat2)\n            logit3 = self.aux_head3(feat3)\n            
logit4 = self.aux_head4(feat4)\n            logit_list = [logit, logit1, logit2, logit3, logit4]\n\n        logit_list = [\n            F.interpolate(logit, paddle.shape(x)[2:], mode='bilinear', align_corners=self.align_corners)\n            for logit in logit_list\n        ]\n\n        return logit_list\n\n\nclass StemBlock(nn.Layer):\n    def __init__(self, in_dim: int, out_dim: int):\n        super(StemBlock, self).__init__()\n\n        self.conv = layers.ConvBNReLU(in_dim, out_dim, 3, stride=2)\n\n        self.left = nn.Sequential(\n            layers.ConvBNReLU(out_dim, out_dim // 2, 1), layers.ConvBNReLU(out_dim // 2, out_dim, 3, stride=2))\n\n        self.right = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n\n        self.fuse = layers.ConvBNReLU(out_dim * 2, out_dim, 3)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        left = self.left(x)\n        right = self.right(x)\n        concat = paddle.concat([left, right], axis=1)\n        return self.fuse(concat)\n\n\nclass ContextEmbeddingBlock(nn.Layer):\n    def __init__(self, in_dim: int, out_dim: int):\n        super(ContextEmbeddingBlock, self).__init__()\n\n        self.gap = nn.AdaptiveAvgPool2D(1)\n        self.bn = layers.SyncBatchNorm(in_dim)\n\n        self.conv_1x1 = layers.ConvBNReLU(in_dim, out_dim, 1)\n        self.conv_3x3 = nn.Conv2D(out_dim, out_dim, 3, 1, 1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        gap = self.gap(x)\n        bn = self.bn(gap)\n        conv1 = self.conv_1x1(bn) + x\n        return self.conv_3x3(conv1)\n\n\nclass GatherAndExpansionLayer1(nn.Layer):\n    \"\"\"Gather And Expansion Layer with stride 1\"\"\"\n\n    def __init__(self, in_dim: int, out_dim: int, expand: int):\n        super().__init__()\n\n        expand_dim = expand * in_dim\n\n        self.conv = nn.Sequential(\n            layers.ConvBNReLU(in_dim, in_dim, 3), layers.DepthwiseConvBN(in_dim, expand_dim, 3),\n            
layers.ConvBN(expand_dim, out_dim, 1))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        return F.relu(self.conv(x) + x)\n\n\nclass GatherAndExpansionLayer2(nn.Layer):\n    \"\"\"Gather And Expansion Layer with stride 2\"\"\"\n\n    def __init__(self, in_dim: int, out_dim: int, expand: int):\n        super().__init__()\n\n        expand_dim = expand * in_dim\n\n        self.branch_1 = nn.Sequential(\n            layers.ConvBNReLU(in_dim, in_dim, 3), layers.DepthwiseConvBN(in_dim, expand_dim, 3, stride=2),\n            layers.DepthwiseConvBN(expand_dim, expand_dim, 3), layers.ConvBN(expand_dim, out_dim, 1))\n\n        self.branch_2 = nn.Sequential(\n            layers.DepthwiseConvBN(in_dim, in_dim, 3, stride=2), layers.ConvBN(in_dim, out_dim, 1))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        return F.relu(self.branch_1(x) + self.branch_2(x))\n\n\nclass DetailBranch(nn.Layer):\n    \"\"\"The detail branch of BiSeNet, which has wide channels but shallow layers.\"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n\n        C1, C2, C3 = in_channels\n\n        self.convs = nn.Sequential(\n            # stage 1\n            layers.ConvBNReLU(3, C1, 3, stride=2),\n            layers.ConvBNReLU(C1, C1, 3),\n            # stage 2\n            layers.ConvBNReLU(C1, C2, 3, stride=2),\n            layers.ConvBNReLU(C2, C2, 3),\n            layers.ConvBNReLU(C2, C2, 3),\n            # stage 3\n            layers.ConvBNReLU(C2, C3, 3, stride=2),\n            layers.ConvBNReLU(C3, C3, 3),\n            layers.ConvBNReLU(C3, C3, 3),\n        )\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        return self.convs(x)\n\n\nclass SemanticBranch(nn.Layer):\n    \"\"\"The semantic branch of BiSeNet, which has narrow channels but deep layers.\"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n        C1, C3, C4, C5 = in_channels\n\n        self.stem = StemBlock(3, C1)\n\n 
       self.stage3 = nn.Sequential(GatherAndExpansionLayer2(C1, C3, 6), GatherAndExpansionLayer1(C3, C3, 6))\n\n        self.stage4 = nn.Sequential(GatherAndExpansionLayer2(C3, C4, 6), GatherAndExpansionLayer1(C4, C4, 6))\n\n        self.stage5_4 = nn.Sequential(\n            GatherAndExpansionLayer2(C4, C5, 6), GatherAndExpansionLayer1(C5, C5, 6), GatherAndExpansionLayer1(\n                C5, C5, 6), GatherAndExpansionLayer1(C5, C5, 6))\n\n        self.ce = ContextEmbeddingBlock(C5, C5)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        stage2 = self.stem(x)\n        stage3 = self.stage3(stage2)\n        stage4 = self.stage4(stage3)\n        stage5_4 = self.stage5_4(stage4)\n        fm = self.ce(stage5_4)\n        return stage2, stage3, stage4, stage5_4, fm\n\n\nclass BGA(nn.Layer):\n    \"\"\"The Bilateral Guided Aggregation Layer, used to fuse the semantic features and spatial features.\"\"\"\n\n    def __init__(self, out_dim: int, align_corners: bool):\n        super().__init__()\n\n        self.align_corners = align_corners\n\n        self.db_branch_keep = nn.Sequential(layers.DepthwiseConvBN(out_dim, out_dim, 3), nn.Conv2D(out_dim, out_dim, 1))\n\n        self.db_branch_down = nn.Sequential(\n            layers.ConvBN(out_dim, out_dim, 3, stride=2), nn.AvgPool2D(kernel_size=3, stride=2, padding=1))\n\n        self.sb_branch_keep = nn.Sequential(\n            layers.DepthwiseConvBN(out_dim, out_dim, 3), nn.Conv2D(out_dim, out_dim, 1),\n            layers.Activation(act='sigmoid'))\n\n        self.sb_branch_up = layers.ConvBN(out_dim, out_dim, 3)\n\n        self.conv = layers.ConvBN(out_dim, out_dim, 3)\n\n    def forward(self, dfm: paddle.Tensor, sfm: paddle.Tensor) -> paddle.Tensor:\n        db_feat_keep = self.db_branch_keep(dfm)\n        db_feat_down = self.db_branch_down(dfm)\n        sb_feat_keep = self.sb_branch_keep(sfm)\n\n        sb_feat_up = self.sb_branch_up(sfm)\n        sb_feat_up = F.interpolate(\n            sb_feat_up, 
paddle.shape(db_feat_keep)[2:], mode='bilinear', align_corners=self.align_corners)\n\n        sb_feat_up = F.sigmoid(sb_feat_up)\n        db_feat = db_feat_keep * sb_feat_up\n\n        sb_feat = db_feat_down * sb_feat_keep\n        sb_feat = F.interpolate(sb_feat, paddle.shape(db_feat)[2:], mode='bilinear', align_corners=self.align_corners)\n\n        return self.conv(db_feat + sb_feat)\n\n\nclass SegHead(nn.Layer):\n    def __init__(self, in_dim: int, mid_dim: int, num_classes: int):\n        super().__init__()\n\n        self.conv_3x3 = nn.Sequential(layers.ConvBNReLU(in_dim, mid_dim, 3), nn.Dropout(0.1))\n\n        self.conv_1x1 = nn.Conv2D(mid_dim, num_classes, 1, 1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv1 = self.conv_3x3(x)\n        conv2 = self.conv_1x1(conv1)\n        return conv2\n"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_cityscapes/README.md",
    "content": "# danet_resnet50_cityscapes\n\n|模型名称|danet_resnet50_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|danet_resnet50vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|272MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[danet](https://arxiv.org/pdf/1809.02983.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install danet_resnet50_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='danet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用danet_resnet50_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='danet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。`predict.py`脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='danet_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n     
           model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m danet_resnet50_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/danet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_cityscapes/README_en.md",
    "content": "# danet_resnet50_cityscapes\n\n|Module Name|danet_resnet50_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|danet_resnet50vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|272MB|\n|Data indicators|-|\n|Latest update date|2022-03-21|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [danet](https://arxiv.org/pdf/1809.02983.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install danet_resnet50_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='danet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the danet_resnet50_cityscapes model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data augmentation module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='danet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```            \n\n    -  Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='danet_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m danet_resnet50_cityscapes\n          ```\n\n    - The image segmentation service API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/danet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_cityscapes/layers.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\n\n\n"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_cityscapes/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom danet_resnet50_cityscapes.resnet import ResNet50_vd\nimport danet_resnet50_cityscapes.layers as L\n\n\n@moduleinfo(\n    name=\"danet_resnet50_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"DANetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass DANet(nn.Layer):\n    \"\"\"\n    The DANet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Fu, Jun, et al. \"Dual Attention Network for Scene Segmentation\"\n    (https://arxiv.org/pdf/1809.02983.pdf)\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): The values in the tuple indicate the indices of\n            output of backbone.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  
Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(DANet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = DAHead(num_classes=num_classes, in_channels=in_channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feats = self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        if not self.training:\n            logit_list = [logit_list[0]]\n\n        logit_list = [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners,\n                align_mode=1) for logit in logit_list\n        ]\n        return logit_list\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n\nclass DAHead(nn.Layer):\n    \"\"\"\n    The Dual attention head.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        in_channels 
(tuple): The number of input channels.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: Tuple[int]):\n        super().__init__()\n        in_channels = in_channels[-1]\n        inter_channels = in_channels // 4\n\n        self.channel_conv = L.ConvBNReLU(in_channels, inter_channels, 3)\n        self.position_conv = L.ConvBNReLU(in_channels, inter_channels, 3)\n        self.pam = PAM(inter_channels)\n        self.cam = CAM(inter_channels)\n        self.conv1 = L.ConvBNReLU(inter_channels, inter_channels, 3)\n        self.conv2 = L.ConvBNReLU(inter_channels, inter_channels, 3)\n\n        self.aux_head = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(in_channels, num_classes, 1))\n\n        self.aux_head_pam = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n        self.aux_head_cam = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n        self.cls_head = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        feats = feat_list[-1]\n        channel_feats = self.channel_conv(feats)\n        channel_feats = self.cam(channel_feats)\n        channel_feats = self.conv1(channel_feats)\n\n        position_feats = self.position_conv(feats)\n        position_feats = self.pam(position_feats)\n        position_feats = self.conv2(position_feats)\n\n        feats_sum = position_feats + channel_feats\n        logit = self.cls_head(feats_sum)\n\n        if not self.training:\n            return [logit]\n\n        cam_logit = self.aux_head_cam(channel_feats)\n        pam_logit = self.aux_head_pam(position_feats)\n        aux_logit = self.aux_head(feats)\n        return [logit, cam_logit, pam_logit, aux_logit]\n\n\nclass PAM(nn.Layer):\n    \"\"\"Position attention module.\"\"\"\n\n    def __init__(self, in_channels: int):\n        super().__init__()\n 
       mid_channels = in_channels // 8\n        self.mid_channels = mid_channels\n        self.in_channels = in_channels\n\n        self.query_conv = nn.Conv2D(in_channels, mid_channels, 1, 1)\n        self.key_conv = nn.Conv2D(in_channels, mid_channels, 1, 1)\n        self.value_conv = nn.Conv2D(in_channels, in_channels, 1, 1)\n\n        self.gamma = self.create_parameter(\n            shape=[1],\n            dtype='float32',\n            default_initializer=nn.initializer.Constant(0))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x_shape = paddle.shape(x)\n\n        # query: n, h * w, c1\n        query = self.query_conv(x)\n        query = paddle.reshape(query, (0, self.mid_channels, -1))\n        query = paddle.transpose(query, (0, 2, 1))\n\n        # key: n, c1, h * w\n        key = self.key_conv(x)\n        key = paddle.reshape(key, (0, self.mid_channels, -1))\n\n        # sim: n, h * w, h * w\n        sim = paddle.bmm(query, key)\n        sim = F.softmax(sim, axis=-1)\n\n        value = self.value_conv(x)\n        value = paddle.reshape(value, (0, self.in_channels, -1))\n        sim = paddle.transpose(sim, (0, 2, 1))\n\n        # feat: from (n, c2, h * w) -> (n, c2, h, w)\n        feat = paddle.bmm(value, sim)\n        feat = paddle.reshape(feat,\n                              (0, self.in_channels, x_shape[2], x_shape[3]))\n\n        out = self.gamma * feat + x\n        return out\n\n\nclass CAM(nn.Layer):\n    \"\"\"Channel attention module.\"\"\"\n\n    def __init__(self, channels: int):\n        super().__init__()\n\n        self.channels = channels\n        self.gamma = self.create_parameter(\n            shape=[1],\n            dtype='float32',\n            default_initializer=nn.initializer.Constant(0))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x_shape = paddle.shape(x)\n        # query: n, c, h * w\n        query = paddle.reshape(x, (0, self.channels, -1))\n        # key: n, h * w, c\n        key = 
paddle.reshape(x, (0, self.channels, -1))\n        key = paddle.transpose(key, (0, 2, 1))\n\n        # sim: n, c, c\n        sim = paddle.bmm(query, key)\n        # The danet author claims that this can avoid gradient divergence\n        sim = paddle.max(\n            sim, axis=-1, keepdim=True).tile([1, 1, self.channels]) - sim\n        sim = F.softmax(sim, axis=-1)\n\n        # feat: from (n, c, h * w) to (n, c, h, w)\n        value = paddle.reshape(x, (0, self.channels, -1))\n        feat = paddle.bmm(sim, value)\n        feat = paddle.reshape(feat, (0, self.channels, x_shape[2], x_shape[3]))\n\n        out = self.gamma * feat + x\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_cityscapes/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union, List, Tuple\n\nimport paddle.nn as nn\nimport ann_resnet50_voc.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1,\" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            
data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization 
training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if 
self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters 
= [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = 
dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At the stage 4, expand the the dilation_rate if given multi_grid\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else 
num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_voc/README.md",
    "content": "# danet_resnet50_voc\n\n|模型名称|danet_resnet50_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|danet_resnet50vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|273MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[danet](https://arxiv.org/pdf/1809.02983.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install danet_resnet50_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='danet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用danet_resnet50_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='danet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='danet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                
model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m danet_resnet50_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/danet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_voc/README_en.md",
    "content": "# danet_resnet50_voc\n\n|Module Name|danet_resnet50_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|danet_resnet50vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|273MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [danet](https://arxiv.org/pdf/1809.02983.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install danet_resnet50_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='danet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the danet_resnet50_voc model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it into the `$HOME/.paddlehub/dataset` directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='danet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: The path of your self-trained checkpoint; if it is None, the provided pretrained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    -  Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='danet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m danet_resnet50_voc\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/danet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
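The base64 helpers used by the serving client above can be exercised without a running server or OpenCV. A minimal sketch using only the standard library (the JPEG encode/decode steps are omitted, and `payload` is an arbitrary stand-in for an encoded image buffer) shows the round trip that the `images` field of the request relies on:

```python
import base64

def bytes_to_base64(data: bytes) -> str:
    # Same encoding step the client applies to the encoded-JPEG buffer.
    return base64.b64encode(data).decode('utf8')

def base64_to_bytes(b64str: str) -> bytes:
    # Inverse step performed by base64_to_cv2 before cv2.imdecode.
    return base64.b64decode(b64str.encode('utf8'))

payload = b'\xff\xd8\xff\xe0' + b'fake-jpeg-bytes'
assert base64_to_bytes(bytes_to_base64(payload)) == payload
```

The same round trip holds for real `cv2.imencode` output; only the byte source changes.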
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass Add(nn.Layer):\n    \"\"\"Wrapper of paddle.add kept as a layer so quantization passes can hook it.\n    resnet.py in this module relies on layers.Add.\"\"\"\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None) -> paddle.Tensor:\n        return paddle.add(x, y, name)\n"
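The `Activation` wrapper above resolves a lowercase name to a layer class by lowercasing the attribute names of `paddle.nn.layer.activation` and mapping them back to the original names. The same lookup can be sketched without paddle; the toy `acts` module and its two callables below are stand-ins for the real activation classes, not Paddle APIs:

```python
import math
import types

# Stand-in for paddle.nn.layer.activation: a module whose public attributes
# are activation "classes" (here, factories returning plain callables).
acts = types.ModuleType('acts')
acts.ReLU = lambda: (lambda x: max(x, 0.0))
acts.Sigmoid = lambda: (lambda x: 1.0 / (1.0 + math.exp(-x)))

# Lowercase the public names, then map them back to the original names.
upper_names = [n for n in vars(acts) if not n.startswith('_')]
act_dict = dict(zip([n.lower() for n in upper_names], upper_names))

def make_activation(name):
    # getattr-based instantiation, as in the Activation wrapper.
    if name not in act_dict:
        raise KeyError("{} does not exist in {}".format(name, sorted(act_dict)))
    return getattr(acts, act_dict[name])()
```

`make_activation('relu')` and `make_activation('sigmoid')` succeed, while an unknown name raises the same `KeyError` the wrapper documents.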
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_voc/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom danet_resnet50_voc.resnet import ResNet50_vd\nimport danet_resnet50_voc.layers as L\n\n\n@moduleinfo(\n    name=\"danet_resnet50_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"DeepLabV3PResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass DANet(nn.Layer):\n    \"\"\"\n    The DANet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Fu, jun, et al. \"Dual Attention Network for Scene Segmentation\"\n    (https://arxiv.org/pdf/1809.02983.pdf)\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone (Paddle.nn.Layer): A backbone network.\n        backbone_indices (tuple): The values in the tuple indicate the indices of\n            output of backbone.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  
Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(DANet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = DAHead(num_classes=num_classes, in_channels=in_channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n        \n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feats = self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        if not self.training:\n            logit_list = [logit_list[0]]\n\n        logit_list = [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners,\n                align_mode=1) for logit in logit_list\n        ]\n        return logit_list\n    \n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n\n\nclass DAHead(nn.Layer):\n    \"\"\"\n    The Dual attention head.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n       
 in_channels (tuple): The number of input channels.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: Tuple[int]):\n        super().__init__()\n        in_channels = in_channels[-1]\n        inter_channels = in_channels // 4\n\n        self.channel_conv = L.ConvBNReLU(in_channels, inter_channels, 3)\n        self.position_conv = L.ConvBNReLU(in_channels, inter_channels, 3)\n        self.pam = PAM(inter_channels)\n        self.cam = CAM(inter_channels)\n        self.conv1 = L.ConvBNReLU(inter_channels, inter_channels, 3)\n        self.conv2 = L.ConvBNReLU(inter_channels, inter_channels, 3)\n\n        self.aux_head = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(in_channels, num_classes, 1))\n\n        self.aux_head_pam = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n        self.aux_head_cam = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n        self.cls_head = nn.Sequential(\n            nn.Dropout2D(0.1), nn.Conv2D(inter_channels, num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        feats = feat_list[-1]\n        channel_feats = self.channel_conv(feats)\n        channel_feats = self.cam(channel_feats)\n        channel_feats = self.conv1(channel_feats)\n\n        position_feats = self.position_conv(feats)\n        position_feats = self.pam(position_feats)\n        position_feats = self.conv2(position_feats)\n\n        feats_sum = position_feats + channel_feats\n        logit = self.cls_head(feats_sum)\n\n        if not self.training:\n            return [logit]\n\n        cam_logit = self.aux_head_cam(channel_feats)\n        pam_logit = self.aux_head_pam(position_feats)\n        aux_logit = self.aux_head(feats)\n        return [logit, cam_logit, pam_logit, aux_logit]\n\n\nclass PAM(nn.Layer):\n    \"\"\"Position attention module.\"\"\"\n\n    def __init__(self, in_channels: int):\n        
super().__init__()\n        mid_channels = in_channels // 8\n        self.mid_channels = mid_channels\n        self.in_channels = in_channels\n\n        self.query_conv = nn.Conv2D(in_channels, mid_channels, 1, 1)\n        self.key_conv = nn.Conv2D(in_channels, mid_channels, 1, 1)\n        self.value_conv = nn.Conv2D(in_channels, in_channels, 1, 1)\n\n        self.gamma = self.create_parameter(\n            shape=[1],\n            dtype='float32',\n            default_initializer=nn.initializer.Constant(0))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x_shape = paddle.shape(x)\n\n        # query: n, h * w, c1\n        query = self.query_conv(x)\n        query = paddle.reshape(query, (0, self.mid_channels, -1))\n        query = paddle.transpose(query, (0, 2, 1))\n\n        # key: n, c1, h * w\n        key = self.key_conv(x)\n        key = paddle.reshape(key, (0, self.mid_channels, -1))\n\n        # sim: n, h * w, h * w\n        sim = paddle.bmm(query, key)\n        sim = F.softmax(sim, axis=-1)\n\n        value = self.value_conv(x)\n        value = paddle.reshape(value, (0, self.in_channels, -1))\n        sim = paddle.transpose(sim, (0, 2, 1))\n\n        # feat: from (n, c2, h * w) -> (n, c2, h, w)\n        feat = paddle.bmm(value, sim)\n        feat = paddle.reshape(feat,\n                              (0, self.in_channels, x_shape[2], x_shape[3]))\n\n        out = self.gamma * feat + x\n        return out\n\n\nclass CAM(nn.Layer):\n    \"\"\"Channel attention module.\"\"\"\n\n    def __init__(self, channels: int):\n        super().__init__()\n\n        self.channels = channels\n        self.gamma = self.create_parameter(\n            shape=[1],\n            dtype='float32',\n            default_initializer=nn.initializer.Constant(0))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x_shape = paddle.shape(x)\n        # query: n, c, h * w\n        query = paddle.reshape(x, (0, self.channels, -1))\n        # key: n, h * 
w, c\n        key = paddle.reshape(x, (0, self.channels, -1))\n        key = paddle.transpose(key, (0, 2, 1))\n\n        # sim: n, c, c\n        sim = paddle.bmm(query, key)\n        # The danet author claims that this can avoid gradient divergence\n        sim = paddle.max(\n            sim, axis=-1, keepdim=True).tile([1, 1, self.channels]) - sim\n        sim = F.softmax(sim, axis=-1)\n\n        # feat: from (n, c, h * w) to (n, c, h, w)\n        value = paddle.reshape(x, (0, self.channels, -1))\n        feat = paddle.bmm(sim, value)\n        feat = paddle.reshape(feat, (0, self.channels, x_shape[2], x_shape[3]))\n\n        out = self.gamma * feat + x\n        return out\n\n\n\n\n\n"
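The tensor bookkeeping inside `PAM` is easiest to check with plain NumPy. The sketch below is an assumption-laden stand-in: the 1x1 query/key/value convolutions are replaced by identity, and `gamma` is a scalar argument rather than a learned parameter, but the reshape/transpose/matmul pipeline and the softmax over spatial positions mirror the module's forward pass:

```python
import numpy as np

def position_attention(x: np.ndarray, gamma: float = 0.0) -> np.ndarray:
    # x: (n, c, h, w). In PAM, query/key/value first pass through 1x1 convs;
    # here they are plain reshaped views of x.
    n, c, h, w = x.shape
    query = x.reshape(n, c, h * w).transpose(0, 2, 1)  # (n, hw, c)
    key = x.reshape(n, c, h * w)                       # (n, c, hw)
    sim = query @ key                                  # (n, hw, hw)
    sim = np.exp(sim - sim.max(axis=-1, keepdims=True))
    sim = sim / sim.sum(axis=-1, keepdims=True)        # softmax over positions
    value = x.reshape(n, c, h * w)                     # (n, c, hw)
    feat = value @ sim.transpose(0, 2, 1)              # (n, c, hw)
    feat = feat.reshape(n, c, h, w)
    return gamma * feat + x                            # residual; gamma starts at 0
```

With `gamma` initialized to 0, as in the module, the block is exactly the identity at the start of training; the output shape always matches the input.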
  },
  {
    "path": "modules/image/semantic_segmentation/danet_resnet50_voc/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union, List, Tuple\n\nimport paddle.nn as nn\nimport ann_resnet50_voc.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1,\" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            
data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization 
training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if 
self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters 
= [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = 
dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else 
num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_cityscapes/README.md",
"content": "# PaddleHub 图像分割\n\n## 模型预测\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='deeplabv3p_resnet50_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用deeplabv3p_resnet50_cityscapes模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n
数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='deeplabv3p_resnet50_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nfrom paddlehub.finetune.trainer import Trainer\n\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n
`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: worker的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='deeplabv3p_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n\n**Args**\n* `images`: 原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n
## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m deeplabv3p_resnet50_cityscapes\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果：\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/deeplabv3p_resnet50_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False)\n\n      
  self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = 
self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = 
self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    
def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n       
     out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_cityscapes/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom deeplabv3p_resnet50_cityscapes.resnet import ResNet50_vd\nimport deeplabv3p_resnet50_cityscapes.layers as L\n\n\n@moduleinfo(\n    name=\"deeplabv3p_resnet50_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"DeepLabV3PResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass DeepLabV3PResnet50(nn.Layer):\n    \"\"\"\n    The DeepLabV3PResnet50 implementation based on PaddlePaddle.\n\n    The original article refers to\n     Liang-Chieh Chen, et, al. 
\"Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation\"\n     (https://arxiv.org/abs/1802.02611)\n\n    Args:\n        num_classes (int): the unique number of target classes.\n        backbone_indices (tuple): two values in the tuple indicate the indices of output of backbone.\n            the first index will be taken as a low-level feature in Decoder component;\n            the second one will be taken as input of ASPP component.\n            Usually backbone consists of four downsampling stage, and return an output of\n            each stage, so we set default (0, 3), which means taking feature map of the first\n            stage in backbone as low-level feature used in Decoder, and feature map of the fourth\n            stage as input of ASPP.\n        aspp_ratios (tuple): the dilation rate using in ASSP module.\n            if output_stride=16, aspp_ratios should be set as (1, 6, 12, 18).\n            if output_stride=8, aspp_ratios is (1, 12, 24, 36).\n        aspp_out_channels (int): the output channels of ASPP module.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str): the path of pretrained model. 
Default to None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (0, 3),\n                 aspp_ratios: Tuple[int] = (1, 12, 24, 36),\n                 aspp_out_channels: int = 256,\n                 align_corners=False,\n                 pretrained: str = None):\n        super(DeepLabV3PResnet50, self).__init__()\n        self.backbone = ResNet50_vd()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = DeepLabV3PHead(num_classes, backbone_indices, backbone_channels, aspp_ratios, aspp_out_channels,\n                                   align_corners)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, x.shape[2:], mode='bilinear', align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass DeepLabV3PHead(nn.Layer):\n    \"\"\"\n    The DeepLabV3PHead implementation based on PaddlePaddle.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): Two values in the tuple indicate the indices of output of backbone.\n            the first index will be taken as a low-level feature in Decoder 
component;\n            the second one will be taken as input of ASPP component.\n            Usually backbone consists of four downsampling stage, and return an output of\n            each stage. If we set it as (0, 3), it means taking feature map of the first\n            stage in backbone as low-level feature used in Decoder, and feature map of the fourth\n            stage as input of ASPP.\n        backbone_channels (tuple): The same length with \"backbone_indices\". It indicates the channels of corresponding index.\n        aspp_ratios (tuple): The dilation rates using in ASSP module.\n        aspp_out_channels (int): The output channels of ASPP module.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, num_classes: int, backbone_indices: Tuple[paddle.Tensor],\n                 backbone_channels: Tuple[paddle.Tensor], aspp_ratios: Tuple[float], aspp_out_channels: int,\n                 align_corners: bool):\n        super().__init__()\n\n        self.aspp = L.ASPPModule(\n            aspp_ratios, backbone_channels[1], aspp_out_channels, align_corners, use_sep_conv=True, image_pooling=True)\n        self.decoder = Decoder(num_classes, backbone_channels[0], align_corners)\n        self.backbone_indices = backbone_indices\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        low_level_feat = feat_list[self.backbone_indices[0]]\n        x = feat_list[self.backbone_indices[1]]\n        x = self.aspp(x)\n        logit = self.decoder(x, low_level_feat)\n        logit_list.append(logit)\n        return logit_list\n\n\nclass Decoder(nn.Layer):\n    \"\"\"\n    Decoder module of DeepLabV3P model\n\n    Args:\n        num_classes (int): The number of classes.\n        in_channels (int): The number of input channels in decoder 
module.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: int, align_corners: bool):\n        super(Decoder, self).__init__()\n\n        self.conv_bn_relu1 = L.ConvBNReLU(in_channels=in_channels, out_channels=48, kernel_size=1)\n\n        self.conv_bn_relu2 = L.SeparableConvBNReLU(in_channels=304, out_channels=256, kernel_size=3, padding=1)\n        self.conv_bn_relu3 = L.SeparableConvBNReLU(in_channels=256, out_channels=256, kernel_size=3, padding=1)\n        self.conv = nn.Conv2D(in_channels=256, out_channels=num_classes, kernel_size=1)\n\n        self.align_corners = align_corners\n\n    def forward(self, x: paddle.Tensor, low_level_feat: paddle.Tensor) -> paddle.Tensor:\n        low_level_feat = self.conv_bn_relu1(low_level_feat)\n        x = F.interpolate(x, low_level_feat.shape[2:], mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([x, low_level_feat], axis=1)\n        x = self.conv_bn_relu2(x)\n        x = self.conv_bn_relu3(x)\n        x = self.conv(x)\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_cityscapes/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union, List, Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport deeplabv3p_resnet50_cityscapes.layers as L\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels, kernel_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet50_vd(nn.Layer):\n    def __init__(self, multi_grid: Tuple[int] = (1, 2, 4)):\n        super(ResNet50_vd, self).__init__()\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3, out_channels=32, kernel_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32, out_channels=32, kernel_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32, out_channels=64, kernel_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n
        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0 and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_voc/README.md",
"content": "# PaddleHub 图像分割\n\n## 模型预测\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='deeplabv3p_resnet50_voc')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n\n## 如何开始Fine-tune\n\n本示例将展示如何使用PaddleHub对预训练模型进行Fine-tune并完成预测任务。\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用deeplabv3p_resnet50_voc模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n
数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='deeplabv3p_resnet50_voc', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nfrom paddlehub.finetune.trainer import Trainer\n\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n
`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: worker的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='deeplabv3p_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n\n**Args**\n* `images`: 原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n
## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m deeplabv3p_resnet50_voc\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果：\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/deeplabv3p_resnet50_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False)\n\n      
  self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = 
self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = 
self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # Look up the activation class on nn.layer.activation directly;\n                # the previous eval referenced an un-imported `activation` name.\n                self.act_func = getattr(nn.layer.activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, 
act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, 
kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_voc/module.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom deeplabv3p_resnet50_voc.resnet import ResNet50_vd\nimport deeplabv3p_resnet50_voc.layers as L\n\n\n@moduleinfo(\n    name=\"deeplabv3p_resnet50_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"DeepLabV3PResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass DeepLabV3PResnet50(nn.Layer):\n    \"\"\"\n    The DeepLabV3PResnet50 implementation based on PaddlePaddle.\n\n    The original article refers to\n     Liang-Chieh Chen, et, al. 
\"Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation\"\n     (https://arxiv.org/abs/1802.02611)\n\n    Args:\n        num_classes (int): the unique number of target classes.\n        backbone_indices (tuple): two values in the tuple indicate the indices of output of backbone.\n            the first index will be taken as a low-level feature in Decoder component;\n            the second one will be taken as input of ASPP component.\n            Usually backbone consists of four downsampling stage, and return an output of\n            each stage, so we set default (0, 3), which means taking feature map of the first\n            stage in backbone as low-level feature used in Decoder, and feature map of the fourth\n            stage as input of ASPP.\n        aspp_ratios (tuple): the dilation rate using in ASSP module.\n            if output_stride=16, aspp_ratios should be set as (1, 6, 12, 18).\n            if output_stride=8, aspp_ratios is (1, 12, 24, 36).\n        aspp_out_channels (int): the output channels of ASPP module.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str): the path of pretrained model. 
Default to None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (0, 3),\n                 aspp_ratios: Tuple[int] = (1, 12, 24, 36),\n                 aspp_out_channels: int = 256,\n                 align_corners=False,\n                 pretrained: str = None):\n        super(DeepLabV3PResnet50, self).__init__()\n        self.backbone = ResNet50_vd()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = DeepLabV3PHead(num_classes, backbone_indices, backbone_channels, aspp_ratios, aspp_out_channels,\n                                   align_corners)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'deeplabv3p_model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, x.shape[2:], mode='bilinear', align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass DeepLabV3PHead(nn.Layer):\n    \"\"\"\n    The DeepLabV3PHead implementation based on PaddlePaddle.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): Two values in the tuple indicate the indices of output of backbone.\n            the first index will be taken as a low-level feature 
in Decoder component;\n            the second one will be taken as input of ASPP component.\n            Usually backbone consists of four downsampling stage, and return an output of\n            each stage. If we set it as (0, 3), it means taking feature map of the first\n            stage in backbone as low-level feature used in Decoder, and feature map of the fourth\n            stage as input of ASPP.\n        backbone_channels (tuple): The same length with \"backbone_indices\". It indicates the channels of corresponding index.\n        aspp_ratios (tuple): The dilation rates using in ASSP module.\n        aspp_out_channels (int): The output channels of ASPP module.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, num_classes: int, backbone_indices: Tuple[paddle.Tensor],\n                 backbone_channels: Tuple[paddle.Tensor], aspp_ratios: Tuple[float], aspp_out_channels: int,\n                 align_corners: bool):\n        super().__init__()\n\n        self.aspp = L.ASPPModule(\n            aspp_ratios, backbone_channels[1], aspp_out_channels, align_corners, use_sep_conv=True, image_pooling=True)\n        self.decoder = Decoder(num_classes, backbone_channels[0], align_corners)\n        self.backbone_indices = backbone_indices\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        low_level_feat = feat_list[self.backbone_indices[0]]\n        x = feat_list[self.backbone_indices[1]]\n        x = self.aspp(x)\n        logit = self.decoder(x, low_level_feat)\n        logit_list.append(logit)\n        return logit_list\n\n\nclass Decoder(nn.Layer):\n    \"\"\"\n    Decoder module of DeepLabV3P model\n\n    Args:\n        num_classes (int): The number of classes.\n        in_channels (int): The number of input channels in 
decoder module.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: int, align_corners: bool):\n        super(Decoder, self).__init__()\n\n        self.conv_bn_relu1 = L.ConvBNReLU(in_channels=in_channels, out_channels=48, kernel_size=1)\n\n        self.conv_bn_relu2 = L.SeparableConvBNReLU(in_channels=304, out_channels=256, kernel_size=3, padding=1)\n        self.conv_bn_relu3 = L.SeparableConvBNReLU(in_channels=256, out_channels=256, kernel_size=3, padding=1)\n        self.conv = nn.Conv2D(in_channels=256, out_channels=num_classes, kernel_size=1)\n\n        self.align_corners = align_corners\n\n    def forward(self, x: paddle.Tensor, low_level_feat: paddle.Tensor) -> paddle.Tensor:\n        low_level_feat = self.conv_bn_relu1(low_level_feat)\n        x = F.interpolate(x, low_level_feat.shape[2:], mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([x, low_level_feat], axis=1)\n        x = self.conv_bn_relu2(x)\n        x = self.conv_bn_relu3(x)\n        x = self.conv(x)\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_resnet50_voc/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport deeplabv3p_resnet50_voc.layers as L\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels, kernel_size=3, act=None, name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if 
self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        # paddle 2.x removed the fused act argument of elementwise_add,\n        # so add and apply the activation explicitly\n        y = F.relu(paddle.add(x=short, y=conv1))\n\n        return y\n\n\nclass ResNet50_vd(nn.Layer):\n    def __init__(self, multi_grid: tuple = (1, 2, 4)):\n        super(ResNet50_vd, self).__init__()\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3, out_channels=32, kernel_size=3, stride=2, act='relu', name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32, out_channels=32, kernel_size=3, stride=1, act='relu', name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32, out_channels=64, kernel_size=3, stride=1, act='relu', name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0 and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n  
                      dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/README.md",
    "content": "# deeplabv3p_xception65_humanseg\n\n|模型名称|deeplabv3p_xception65_humanseg|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|deeplabv3p|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|162MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130913256-41056b21-1c3d-4ee2-b481-969c94754609.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n  - DeepLabv3+使用百度自建数据集进行训练，可用于人像分割，支持任意大小的图片输入。\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/deeplabv3plus.png\" hspace='10'/> <br />\n</p>\n\n- 更多详情请参考：[deeplabv3p](https://github.com/PaddlePaddle/PaddleSeg)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install deeplabv3p_xception65_humanseg\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1.命令行预测\n\n  ```shell\n  hub run deeplabv3p_xception65_humanseg --input_path \"/PATH/TO/IMAGE\"\n  ```\n\n\n\n- ### 2.预测代码示例\n\n  ```python\n  import paddlehub as hub\n  import cv2\n\n  human_seg = hub.Module(name=\"deeplabv3p_xception65_humanseg\")\n  result = human_seg.segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n  ```\n\n- ### 3.API\n\n    ```python\n    def segmentation(images=None,\n                     paths=None,\n                     batch_size=1,\n                     use_gpu=False,\n                     visualization=False,\n                     output_dir='humanseg_output')\n    ```\n\n    - 预测API，用于人像分割。\n\n    - **参数**\n\n      * 
images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * paths (list\\[str\\]): 图片的路径；\n      * batch\\_size (int): batch 的大小；\n      * use\\_gpu (bool): 是否使用 GPU；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n      * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n      * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n      * data (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-255 (0为全透明，255为不透明)，也即取值越大的像素点越可能为人体，取值越小的像素点越可能为背景。\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n    - 将模型保存到指定路径。\n\n    - **参数**\n\n      * dirname: 模型保存路径\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个人像分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n    $ hub serving start -m deeplabv3p_xception65_humanseg\n    ```\n\n    - 这样就完成了一个人像分割的服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n  \n  \n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n  \n  \n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n  \n    org_im = cv2.imread(\"/PATH/TO/IMAGE\")\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/deeplabv3p_xception65_humanseg\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))# 保存图片\n    mask =cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n    rgba = np.concatenate((org_im, 
np.expand_dims(mask, axis=2)), axis=2)\n    cv2.imwrite(\"segment_human_server.png\", rgba)\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n   初始发布\n\n* 1.1.0\n\n   提升预测性能\n\n* 1.1.1\n\n   修复预测后处理图像数据超过[0,255]范围\n\n* 1.2.0\n\n   移除 fluid api\n\n  - ```shell\n    $ hub install deeplabv3p_xception65_humanseg==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/README_en.md",
    "content": "# deeplabv3p_xception65_humanseg\n\n|Module Name |deeplabv3p_xception65_humanseg|\n| :--- | :---: |\n|Category|Image segmentation|\n|Network|deeplabv3p|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|162MB|\n|Data indicators |-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130913256-41056b21-1c3d-4ee2-b481-969c94754609.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n  - The DeepLabv3+ model is trained on a Baidu self-built dataset and can be used for portrait segmentation.\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img/deeplabv3plus.png\" hspace='10'/> <br />\n</p>\n\n- For more information, please refer to: [deeplabv3p](https://github.com/PaddlePaddle/PaddleSeg)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install deeplabv3p_xception65_humanseg\n      ```\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III. Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    hub run deeplabv3p_xception65_humanseg --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    human_seg = hub.Module(name=\"deeplabv3p_xception65_humanseg\")\n    result = human_seg.segmentation(images=[cv2.imread('/PATH/TO/IMAGE')])\n    ```\n\n- ### 3、API\n\n    - ```python\n      def segmentation(images=None,\n                       paths=None,\n                       batch_size=1,\n                       use_gpu=False,\n                       visualization=False,\n                       output_dir='humanseg_output')\n      ```\n\n      - Prediction API, generating the segmentation result.\n\n      - **Parameters**\n        * images (list\\[numpy.ndarray\\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\\[str\\]): Image path.\n        * batch\\_size (int): Batch size.\n        * use\\_gpu (bool): Use GPU or not. **Set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU.**\n        * visualization (bool): Whether to save the recognition results as picture files.\n        * output\\_dir (str): Save path of images.\n\n      - **Return**\n\n          * res (list\\[dict\\]): The list of recognition results, where each element is a dict with the following fields:\n              * save\\_path (str, optional): Save path of the result.\n              * data (numpy.ndarray): The result of portrait segmentation.\n\n    - ```python\n      def save_inference_model(dirname)\n      ```\n\n      - Save the model to the specified path.\n\n      - **Parameters**\n        * dirname: Model save path.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service for human segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n      - ```shell\n        $ hub serving start -m deeplabv3p_xception65_humanseg\n        ```\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    import numpy as np\n\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    org_im = cv2.imread(\"/PATH/TO/IMAGE\")\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/deeplabv3p_xception65_humanseg\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    mask = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n    rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n    cv2.imwrite(\"segment_human_server.png\", rgba)\n    ```\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Improve prediction performance\n\n* 1.1.1\n\n  Fix the bug of image value out of range\n\n* 1.2.0\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install deeplabv3p_xception65_humanseg==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader']\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            # use an integer microsecond timestamp as a pseudo path\n            each['org_im_path'] = 'ndarray_time={}'.format(int(time.time() * 1e6))\n            each['org_im'] = im\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.resize(img, (513, 513)).astype(np.float32)\n        img -= np.array([104.008, 116.669, 122.675])\n        img /= np.array([1.0, 1.0, 1.0])\n        img = img.transpose((2, 0, 1))\n        element['image'] = img\n        yield element\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/module.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nimport paddle\nfrom .data_feed import reader\nfrom .processor import base64_to_cv2\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"deeplabv3p_xception65_humanseg\",\n            type=\"CV/semantic_segmentation\",\n            author=\"baidu-vis\",\n            author_email=\"\",\n            summary=\"DeepLabv3+ is a semantic segmentation model.\",\n            version=\"1.2.0\")\nclass DeeplabV3pXception65HumanSeg:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"deeplabv3p_xception65_humanseg_model\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path+'.pdmodel'\n        params = self.default_pretrained_model_path+'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def segmentation(self,\n                     images=None,\n                     paths=None,\n             
        data=None,\n                     batch_size=1,\n                     use_gpu=False,\n                     visualization=False,\n                     output_dir='humanseg_output'):\n        \"\"\"\n        API for human segmentation.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            data (dict): key is 'image', the corresponding value is the path to image.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. (Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n        if data and 'image' in data:\n            if paths is None:\n                paths = list()\n            paths += data['image']\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n            output_data = output_handle.copy_to_cpu()\n            output = np.expand_dims(output_data, axis=1)\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(data_out=output[i],\n                                  org_im=batch_data[i]['org_im'],\n                                  org_im_shape=batch_data[i]['org_im_shape'],\n                                  org_im_path=batch_data[i]['org_im_path'],\n                                  output_dir=output_dir,\n                                  visualization=visualization)\n                res.append(out)\n        
return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.segmentation(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.segmentation(paths=[args.input_path],\n                                    batch_size=args.batch_size,\n                                    use_gpu=args.use_gpu,\n                                    output_dir=args.output_dir,\n                                    visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        
self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='humanseg_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nif __name__ == \"__main__\":\n    m = DeeplabV3pXception65HumanSeg()\n    import cv2\n    img = cv2.imread('./meditation.jpg')\n    res = m.segmentation(images=[img])\n    print(res[0]['data'])\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/processor.py",
    "content": "# coding=utf-8\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport time\n\nimport base64\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization, thresh=120):\n    \"\"\"\n    Postprocess output of network, one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape of original image.\n        org_im_path (str): path of original image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        thresh (float): threshold.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for logit in data_out:\n        logit = logit[1] * 255\n        logit = cv2.resize(logit, (org_im_shape[1], org_im_shape[0]))\n        logit -= thresh\n        logit[logit < 0] = 0\n        logit = 255 * logit / (255 - thresh)\n        rgba = np.concatenate((org_im, np.expand_dims(logit, axis=2)), axis=2)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, rgba)\n            result['save_path'] = save_im_path\n        result['data'] = rgba[:, :, 3]\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/semantic_segmentation/deeplabv3p_xception65_humanseg/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport numpy as np\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/pg_WCHWSdT8/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYyNDM2ODI4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"deeplabv3p_xception65_humanseg\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('humanseg_output')\n\n    def test_segmentation1(self):\n        results = self.module.segmentation(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation2(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation3(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segmentation4(self):\n        results = self.module.segmentation(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False\n        )\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def 
test_segmentation5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.segmentation,\n            paths=['no.jpg']\n        )\n\n    def test_segmentation6(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.segmentation,\n            images=['test.jpg']\n        )\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fastscnn_cityscapes/README.md",
    "content": "# PaddleHub Image Segmentation\n\n## Model Prediction\n\nTo make predictions with the provided pretrained model, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fastscnn_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## How to Start Fine-tuning\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune the fastscnn_cityscapes model on datasets such as OpticDiscSeg.\n\n## Code Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the data preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module provides a rich set of preprocessing methods for image segmentation data; users can substitute the preprocessing they need.\n\n### Step2: Download and use the dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: the data preprocessing method.\n* `mode`: the data mode; one of `train`, `test`, `val`. Default is `train`.\n\nThe dataset preparation code can be found in [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n### Step3: Load the pretrained model\n\n```python\nmodel = hub.Module(name='fastscnn_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: the name of the pretrained model.\n* `num_classes`: the number of segmentation classes.\n* `pretrained`: whether to load your own trained model; if None, the default pretrained parameters are loaded.\n\n### Step4: Choose the optimization strategy and run configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization strategy\n\nPaddle 2.0 provides a variety of optimizers, such as `SGD`, `Adam`, and `Adamax`. For `Adam`:\n\n* `learning_rate`: the global learning rate.\n* `parameters`: the model parameters to optimize.\n\n#### Run configuration\n`Trainer` mainly controls the fine-tuning process and takes the following configurable parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use GPU, default False;\n* `use_vdl`: whether to use VisualDL to visualize the training process;\n* `checkpoint_dir`: the directory to save model parameters;\n* `compare_metrics`: the metric used to select the best model;\n\n`trainer.train` mainly controls the training process itself and takes the following configurable parameters:\n\n* `train_dataset`: the dataset used for training;\n* `epochs`: the number of training epochs;\n* `batch_size`: the training batch size; when using GPU, adjust it according to the available memory;\n* `num_workers`: the number of workers, default 0;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: the logging interval, in number of training batches.\n* `save_interval`: the model saving interval, in number of training epochs.\n\n## Model Prediction\n\nAfter fine-tuning, the model that performed best on the validation set is saved in `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nWe use this model for prediction. The predict.py script is as follows:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fastscnn_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nAfter the parameters are configured correctly, run `python predict.py`.\n**Args**\n* `images`: the original image path or a BGR-format image;\n* `visualization`: whether to visualize the result, default True;\n* `save_path`: the path to save results, default 'seg_result'.\n\n**NOTE:** When predicting, the module, checkpoint_dir, and dataset must be the same as those used for fine-tuning.\n\n## Serving Deployment\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the startup command:\n\n```shell\n$ hub serving start -m fastscnn_cityscapes\n```\n\nThis deploys the image segmentation API service; the default port is 8866.\n\n**NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n### Step2: Send a prediction request\n\nWith the server configured, the following lines of code send a prediction request and obtain the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# Send an HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/fastscnn_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### View the Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fastscnn_cityscapes/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu' or os.environ.get('PADDLESEG_EXPORT_STAGE'):\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n 
   def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvReLUPool(nn.Layer):\n    \"\"\"Basic conv relu pool layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int):\n        super().__init__()\n        self.conv = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1, dilation=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = F.relu(x)\n        x = F.max_pool2d(x, kernel_size=2, stride=2)\n        return x\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Basic separable conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass DepthwiseConvBN(nn.Layer):\n    \"\"\"Basic depthwise conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self, in_channels: int, inter_channels: int, out_channels: int, dropout_prob: float = 0.1):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=in_channels, out_channels=inter_channels, kernel_size=3, padding=1)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(in_channels=inter_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # look up the activation class by name instead of using eval\n                self.act_func = getattr(nn.layer.activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass PPModule(nn.Layer):\n    \"\"\"\n    Pyramid pooling module originally in PSPNet.\n\n    Args:\n        in_channels (int): The number of input channels to the pyramid pooling module.\n        out_channels (int): The number of output channels after the pyramid pooling module.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. Default: (1, 2, 3, 6).\n        dim_reduction (bool, optional): A bool value represents if reducing dimension after pooling. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, bin_sizes: Tuple, dim_reduction: bool, align_corners: bool):\n        super().__init__()\n\n        self.bin_sizes = bin_sizes\n\n        inter_channels = in_channels\n        if dim_reduction:\n            inter_channels = in_channels // len(bin_sizes)\n\n        # we use dimension reduction after pooling mentioned in the original implementation.\n        self.stages = nn.LayerList([self._make_stage(in_channels, inter_channels, size) for size in bin_sizes])\n\n        self.conv_bn_relu2 = ConvBNReLU(\n            in_channels=in_channels + inter_channels * len(bin_sizes),\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1)\n\n        self.align_corners = align_corners\n\n    def _make_stage(self, in_channels: int, out_channels: int, size: int):\n        \"\"\"\n        Create one pooling layer.\n\n        In our implementation, we adopt the same dimension reduction as the original paper, which might be\n        slightly different from other implementations.\n\n        After pooling, the channels are reduced to 1/len(bin_sizes) immediately, while some other implementations\n        keep the number of channels the same.\n\n        Args:\n            in_channels (int): The number of input channels to the pyramid pooling module.\n            out_channels (int): The number of output channels of the pooling stage.\n            size (int): The out size of the pooled layer.\n\n        Returns:\n            conv (nn.Sequential): An adaptive average pooling layer followed by a 1x1 ConvBNReLU.\n        \"\"\"\n\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = ConvBNReLU(in_channels=in_channels, out_channels=out_channels, kernel_size=1)\n\n        return nn.Sequential(prior, conv)\n\n    def forward(self, input: paddle.Tensor) -> paddle.Tensor:\n        cat_layers = []\n        for stage in self.stages:\n            x = stage(input)\n            x = F.interpolate(x, paddle.shape(input)[2:], mode='bilinear', align_corners=self.align_corners)\n            cat_layers.append(x)\n        cat_layers = [input] + cat_layers[::-1]\n        cat = paddle.concat(cat_layers, axis=1)\n        out = self.conv_bn_relu2(cat)\n\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fastscnn_cityscapes/module.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import Callable, Union, Tuple\n\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport paddle\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport fastscnn_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"fastscnn_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"fastscnn_cityscapes is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass FastSCNN(nn.Layer):\n    \"\"\"\n    The FastSCNN implementation based on PaddlePaddle.\n    As mentioned in the original paper, FastSCNN is a real-time segmentation algorithm (123.5fps)\n    even for high resolution images (1024x2048).\n    The original article refers to\n    Poudel, Rudra PK, et al. \"Fast-SCNN: Fast Semantic Segmentation Network\"\n    (https://arxiv.org/pdf/1902.04502.pdf).\n    Args:\n        num_classes (int): The unique number of target classes, default is 19.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. 
Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self, num_classes: int = 19, align_corners: bool = False, pretrained: str = None):\n\n        super(FastSCNN, self).__init__()\n\n        self.learning_to_downsample = LearningToDownsample(32, 48, 64)\n        self.global_feature_extractor = GlobalFeatureExtractor(\n            in_channels=64,\n            block_channels=[64, 96, 128],\n            out_channels=128,\n            expansion=6,\n            num_blocks=[3, 3, 3],\n            align_corners=True)\n        self.feature_fusion = FeatureFusionModule(64, 128, 128, align_corners)\n        self.classifier = Classifier(128, num_classes)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'fastscnn_model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        logit_list = []\n        input_size = paddle.shape(x)[2:]\n        higher_res_features = self.learning_to_downsample(x)\n        x = self.global_feature_extractor(higher_res_features)\n        x = self.feature_fusion(higher_res_features, x)\n        logit = self.classifier(x)\n        logit = F.interpolate(logit, input_size, mode='bilinear', align_corners=self.align_corners)\n        logit_list.append(logit)\n\n        return logit_list\n\n\nclass LearningToDownsample(nn.Layer):\n    \"\"\"\n    Learning to downsample module.\n    This 
module consists of three downsampling blocks (one conv and two separable convs).\n    Args:\n        dw_channels1 (int, optional): The input channels of the first sep conv. Default: 32.\n        dw_channels2 (int, optional): The input channels of the second sep conv. Default: 48.\n        out_channels (int, optional): The output channels of LearningToDownsample module. Default: 64.\n    \"\"\"\n\n    def __init__(self, dw_channels1: int = 32, dw_channels2: int = 48, out_channels: int = 64):\n        super(LearningToDownsample, self).__init__()\n\n        self.conv_bn_relu = layers.ConvBNReLU(in_channels=3, out_channels=dw_channels1, kernel_size=3, stride=2)\n        self.dsconv_bn_relu1 = layers.SeparableConvBNReLU(\n            in_channels=dw_channels1, out_channels=dw_channels2, kernel_size=3, stride=2, padding=1)\n        self.dsconv_bn_relu2 = layers.SeparableConvBNReLU(\n            in_channels=dw_channels2, out_channels=out_channels, kernel_size=3, stride=2, padding=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dsconv_bn_relu1(x)\n        x = self.dsconv_bn_relu2(x)\n        return x\n\n\nclass GlobalFeatureExtractor(nn.Layer):\n    \"\"\"\n    Global feature extractor module.\n    This module consists of three InvertedBottleneck blocks (like the inverted residual introduced by MobileNetV2) and\n    a PPModule (introduced by PSPNet).\n    Args:\n        in_channels (int): The number of input channels to the module.\n        block_channels (tuple): A tuple representing the output channels of each bottleneck block.\n        out_channels (int): The number of output channels of the module.\n        expansion (int): The expansion factor in bottleneck.\n        num_blocks (tuple): The number of times each bottleneck block is repeated.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, in_channels: int, block_channels: Tuple[int], out_channels: int, expansion: int, num_blocks: Tuple[int],\n                 align_corners: bool):\n        super(GlobalFeatureExtractor, self).__init__()\n\n        self.bottleneck1 = self._make_layer(InvertedBottleneck, in_channels, block_channels[0], num_blocks[0],\n                                            expansion, 2)\n        self.bottleneck2 = self._make_layer(InvertedBottleneck, block_channels[0], block_channels[1], num_blocks[1],\n                                            expansion, 2)\n        self.bottleneck3 = self._make_layer(InvertedBottleneck, block_channels[1], block_channels[2], num_blocks[2],\n                                            expansion, 1)\n\n        self.ppm = layers.PPModule(\n            block_channels[2], out_channels, bin_sizes=(1, 2, 3, 6), dim_reduction=True, align_corners=align_corners)\n\n    def _make_layer(self,\n                    block: Callable,\n                    in_channels: int,\n                    out_channels: int,\n                    blocks: int,\n                    expansion: int = 6,\n                    stride: int = 1):\n        layers = []\n        layers.append(block(in_channels, out_channels, expansion, stride))\n        for _ in range(1, blocks):\n            layers.append(block(out_channels, out_channels, expansion, 1))\n        return nn.Sequential(*layers)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.bottleneck1(x)\n        x = self.bottleneck2(x)\n        x = self.bottleneck3(x)\n        x = self.ppm(x)\n        return x\n\n\nclass InvertedBottleneck(nn.Layer):\n    \"\"\"\n    Single inverted bottleneck implementation.\n    Args:\n        in_channels (int): The number of input channels to bottleneck block.\n        out_channels (int): The number of output channels of bottleneck block.\n        expansion (int, optional): The expansion factor in bottleneck. Default: 6.\n        stride (int, optional): The stride used in the depth-wise conv. Default: 2.\n    \"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, expansion: int = 6, stride: int = 2):\n        super().__init__()\n\n        self.use_shortcut = stride == 1 and in_channels == out_channels\n\n        expand_channels = in_channels * expansion\n        self.block = nn.Sequential(\n            # pw\n            layers.ConvBNReLU(in_channels=in_channels, out_channels=expand_channels, kernel_size=1, bias_attr=False),\n            # dw\n            layers.ConvBNReLU(\n                in_channels=expand_channels,\n                out_channels=expand_channels,\n                kernel_size=3,\n                stride=stride,\n                padding=1,\n                groups=expand_channels,\n                bias_attr=False),\n            # pw-linear\n            layers.ConvBN(in_channels=expand_channels, out_channels=out_channels, kernel_size=1, bias_attr=False))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = self.block(x)\n        if self.use_shortcut:\n            out = x + out\n        return out\n\n\nclass FeatureFusionModule(nn.Layer):\n    \"\"\"\n    Feature Fusion Module implementation.\n    This module fuses the high-resolution feature and the low-resolution feature.\n    Args:\n        high_in_channels (int): The channels of high-resolution feature (output of LearningToDownsample).\n        low_in_channels (int): The channels of low-resolution feature (output of GlobalFeatureExtractor).\n        out_channels (int): The output channels of this module.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, high_in_channels: int, low_in_channels: int, out_channels: int, align_corners: bool):\n        super().__init__()\n\n        # Only depth-wise conv\n        self.dwconv = layers.ConvBNReLU(\n            in_channels=low_in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1,\n            groups=128,\n            bias_attr=False)\n\n        self.conv_low_res = layers.ConvBN(out_channels, out_channels, 1)\n        self.conv_high_res = layers.ConvBN(high_in_channels, out_channels, 1)\n        self.align_corners = align_corners\n\n    def forward(self, high_res_input: paddle.Tensor, low_res_input: paddle.Tensor) -> paddle.Tensor:\n        low_res_input = F.interpolate(\n            low_res_input, paddle.shape(high_res_input)[2:], mode='bilinear', align_corners=self.align_corners)\n        low_res_input = self.dwconv(low_res_input)\n        low_res_input = self.conv_low_res(low_res_input)\n        high_res_input = self.conv_high_res(high_res_input)\n        x = high_res_input + low_res_input\n\n        return F.relu(x)\n\n\nclass Classifier(nn.Layer):\n    \"\"\"\n    The Classifier module implementation.\n    This module consists of two separable convs and one conv.\n    Args:\n        input_channels (int): The input channels to this module.\n        num_classes (int): The unique number of target classes.\n    \"\"\"\n\n    def __init__(self, input_channels: int, num_classes: int):\n        super().__init__()\n\n        self.dsconv1 = layers.SeparableConvBNReLU(\n            in_channels=input_channels, out_channels=input_channels, kernel_size=3, padding=1)\n\n        self.dsconv2 = layers.SeparableConvBNReLU(\n            in_channels=input_channels, out_channels=input_channels, kernel_size=3, padding=1)\n\n        self.conv = nn.Conv2D(in_channels=input_channels, out_channels=num_classes, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # dropout_prob\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.dsconv1(x)\n        x = self.dsconv2(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_cityscapes/README.md",
"content": "# PaddleHub Image Segmentation\n\n## Model Prediction\n\n\nTo make predictions with the pretrained model we provide, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw18_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## How to Start Fine-tuning\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune the fcn_hrnetw18_cityscapes model on datasets such as OpticDiscSeg.\n\n## Code Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the data preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module provides a rich set of preprocessing operations for image segmentation data; replace them with the preprocessing you need.\n\n### Step2: Download and load the dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: the data preprocessing pipeline.\n* `mode`: the dataset split, one of `train`, `test` and `val`; defaults to `train`.\n\nThe dataset preparation code can be found in [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to the `$HOME/.paddlehub/dataset` directory.\n\n### Step3: Load the pretrained model\n\n```python\nmodel = hub.Module(name='fcn_hrnetw18_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: the name of the pretrained model.\n* `num_classes`: the number of segmentation classes.\n* `pretrained`: the path of your own trained model; if None, the default pretrained parameters are loaded.\n\n### Step4: Choose the optimization strategy and runtime configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization strategy\n\nPaddle 2.0 provides a variety of optimizers, such as `SGD`, `Adam` and `Adamax`. For `Adam`:\n\n* `learning_rate`: the global learning rate.\n* `parameters`: the model parameters to optimize.\n\n#### Runtime configuration\n`Trainer` controls the fine-tuning process and accepts the following parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use the GPU, False by default;\n* `use_vdl`: whether to use VisualDL to visualize the training process;\n* `checkpoint_dir`: the directory where model parameters are saved;\n* `compare_metrics`: the metric used to select the best model;\n\n`trainer.train` controls the training loop and accepts the following parameters:\n\n* `train_dataset`: the dataset used for training;\n* `epochs`: the number of training epochs;\n* `batch_size`: the training batch size; when using a GPU, adjust it to the available memory;\n* `num_workers`: the number of worker processes, 0 by default;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: the logging interval, in training steps;\n* `save_interval`: the model saving interval, in epochs.\n\n## Model Prediction\n\nAfter fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nWe use that model for prediction. The predict.py script is as follows:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw18_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nWith the parameters configured, run `python predict.py`.\n**Args**\n* `images`: original image paths or BGR images;\n* `visualization`: whether to visualize the results, True by default;\n* `save_path`: the directory where results are saved, 'seg_result' by default.\n\n**NOTE:** The module, checkpoint_dir and dataset used for prediction must be the same as those used for fine-tuning.\n\n## Serving Deployment\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the start command:\n\n```shell\n$ hub serving start -m fcn_hrnetw18_cityscapes\n```\n\nThis deploys the image segmentation service API, on port 8866 by default.\n\n**NOTE:** To predict on a GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n### Step2: Send a prediction request\n\nWith the server configured, the few lines below send a prediction request and retrieve the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# Send the HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/fcn_hrnetw18_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### View the Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_cityscapes/hrnet.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport fcn_hrnetw18_cityscapes.layers as L\n\n\nclass HRNet_W18(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et al. \"HRNet: Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4,).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64,).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (18, 36).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default (18, 36, 72).\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default (4, 4, 4, 4).\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default (18, 36, 72, 144).\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: Tuple[int] = (4, ),\n                 stage1_num_channels: Tuple[int] = (64, ),\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: Tuple[int] = (4, 4),\n                 stage2_num_channels: Tuple[int] = (18, 36),\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: Tuple[int] = (4, 4, 4),\n                 stage3_num_channels: Tuple[int] = (18, 36, 72),\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: Tuple[int] = (4, 4, 4, 4),\n                 stage4_num_channels: Tuple[int] = (18, 36, 72, 144),\n                 has_se: bool = False,\n                 align_corners: bool = False):\n        super(HRNet_W18, self).__init__()\n\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        
self.stage4_num_modules = stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = L.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = L.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            
num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        x0_h, x0_w = st4[0].shape[2:]\n        x1 = F.interpolate(st4[1], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass 
TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        L.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    L.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in 
range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = L.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBN(\n                
in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = 
self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: int, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n          
          \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> 
paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        L.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = 
out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = residual.shape[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_cityscapes/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In a CPU environment nn.SyncBatchNorm has no kernel, so nn.BatchNorm2D is used instead.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, 
padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x 
= self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, which means identity transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exist_one = Activation(\"not_exist_one\")\n        # KeyError: \"not_exist_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(nn.layer.activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current 
{}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: Tuple[int],\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, 
out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom fcn_hrnetw18_cityscapes.hrnet import HRNet_W18\nimport fcn_hrnetw18_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"fcn_hrnetw18_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"FCN with an HRNet-W18 backbone for semantic segmentation, pretrained on Cityscapes.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass FCN(nn.Layer):\n    \"\"\"\n    A simple implementation for FCN based on PaddlePaddle.\n\n    The original article refers to\n    Evan Shelhamer, et al. \"Fully Convolutional Networks for Semantic Segmentation\"\n    (https://arxiv.org/abs/1411.4038).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. 
Default: None.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 channels: int = None,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(FCN, self).__init__()\n\n        self.backbone = HRNet_W18()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = FCNHead(num_classes, backbone_indices, backbone_channels, channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, paddle.shape(x)[2:], mode='bilinear', align_corners=self.align_corners)\n            for logit in logit_list\n        ]\n\n\nclass FCNHead(nn.Layer):\n    \"\"\"\n    A simple implementation for FCNHead based on PaddlePaddle\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        
backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        backbone_channels (tuple): The values of backbone channels.\n            Default: (270, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 backbone_channels: Tuple[int] = (270, ),\n                 channels: int = None):\n        super(FCNHead, self).__init__()\n\n        self.num_classes = num_classes\n        self.backbone_indices = backbone_indices\n        if channels is None:\n            channels = backbone_channels[0]\n\n        self.conv_1 = layers.ConvBNReLU(\n            in_channels=backbone_channels[0], out_channels=channels, kernel_size=1, padding='same', stride=1)\n        self.cls = nn.Conv2D(in_channels=channels, out_channels=self.num_classes, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[0]]\n        x = self.conv_1(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_voc/README.md",
    "content": "# PaddleHub Image Segmentation\n\n\n## Model Prediction\n\n\nTo run prediction with the provided pretrained model, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw18_voc')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## How to Start Fine-tuning\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to start fine-tuning the fcn_hrnetw18_voc model on datasets such as OpticDiscSeg.\n\n## Code Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the data preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module provides a rich set of preprocessing methods for image segmentation data; replace them with the ones your task needs.\n\n### Step2: Download and load the dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n```\n* `transform`: the data preprocessing method.\n* `mode`: the dataset split, one of `train`, `test`, `val`. Default: `train`.\n\nThe dataset preparation code is in [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n### Step3: Load the pretrained model\n\n```python\nmodel = hub.Module(name='fcn_hrnetw18_voc', num_classes=2, pretrained=None)\n```\n* `name`: the name of the pretrained model.\n* `num_classes`: the number of segmentation classes.\n* `pretrained`: the path of your own trained weights; if None, the default pretrained parameters are loaded.\n\n### Step4: Choose the optimization strategy and runtime configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization strategy\n\nPaddle 2.0 provides a variety of optimizers, such as `SGD`, `Adam` and `Adamax`. For `Adam`:\n\n* `learning_rate`: the global learning rate.\n* `parameters`: the model parameters to optimize.\n\n#### Runtime configuration\n`Trainer` controls the fine-tuning run and takes the following parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use the GPU, default False;\n* `use_vdl`: whether to use VisualDL to visualize training;\n* `checkpoint_dir`: the directory where model parameters are saved;\n* `compare_metrics`: the metric used to select the best model.\n\n`trainer.train` controls the training loop and takes the following parameters:\n\n* `train_dataset`: the dataset used for training;\n* `epochs`: the number of training epochs;\n* `batch_size`: the training batch size; adjust it to the available GPU memory when training on GPU;\n* `num_workers`: the number of workers, default 0;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: the logging interval, in number of training batches;\n* `save_interval`: the model saving interval, in number of training epochs.\n\n## Model Prediction\n\nAfter fine-tuning, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nWe use that model for prediction. The predict.py script is as follows:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw18_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nOnce the parameters are configured, run `python predict.py`.\n**Args**\n* `images`: the image path or a BGR image;\n* `visualization`: whether to visualize the result, default True;\n* `save_path`: the path where results are saved, default 'seg_result'.\n\n**NOTE:** For prediction, the module, checkpoint_dir and dataset must be the same as those used for fine-tuning.\n\n## Service Deployment\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the start command:\n\n```shell\n$ hub serving start -m fcn_hrnetw18_voc\n```\n\nThis deploys an image segmentation API service, on port 8866 by default.\n\n**NOTE:** When predicting on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n### Step2: Send a prediction request\n\nWith the server configured, the following lines of code send a prediction request and fetch the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# Send the HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/fcn_hrnetw18_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### Source Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_voc/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport fcn_hrnetw18_voc.layers as L\n\n\nclass HRNet_W18(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et al. \"HRNet: Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4, ).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64, ).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (18, 36).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default (18, 36, 72).\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default (4, 4, 4, 4).\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default (18, 36, 72, 144).\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: Tuple[int] = (4, ),\n                 stage1_num_channels: Tuple[int] = (64, ),\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: Tuple[int] = (4, 4),\n                 stage2_num_channels: Tuple[int] = (18, 36),\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: Tuple[int] = (4, 4, 4),\n                 stage3_num_channels: Tuple[int] = (18, 36, 72),\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: Tuple[int] = (4, 4, 4, 4),\n                 stage4_num_channels: Tuple[int] = (18, 36, 72, 144),\n                 has_se: bool = False,\n                 align_corners: bool = False):\n        super(HRNet_W18, self).__init__()\n\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        
self.stage4_num_modules = stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = L.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = L.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            
num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        x0_h, x0_w = st4[0].shape[2:]\n        x1 = F.interpolate(st4[1], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass 
TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        L.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    L.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in 
range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = L.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBN(\n                
in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = 
self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: int, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n          
          \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: str = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> 
paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        L.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = 
out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = residual.shape[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_voc/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In a CPU environment nn.SyncBatchNorm has no kernel, so nn.BatchNorm2D is used instead.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, 
padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x 
= self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current 
{}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: Tuple[int],\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, 
out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw18_voc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom fcn_hrnetw18_voc.hrnet import HRNet_W18\nimport fcn_hrnetw18_voc.layers as layers\n\n\n@moduleinfo(\n    name=\"fcn_hrnetw18_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Fcn_hrnetw18 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass FCN(nn.Layer):\n    \"\"\"\n    A simple implementation for FCN based on PaddlePaddle.\n\n    The original article refers to\n    Evan Shelhamer, et, al. \"Fully Convolutional Networks for Semantic Segmentation\"\n    (https://arxiv.org/abs/1411.4038).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. 
Default: None.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 channels: int = None,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(FCN, self).__init__()\n\n        self.backbone = HRNet_W18()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = FCNHead(num_classes, backbone_indices, backbone_channels, channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, paddle.shape(x)[2:], mode='bilinear', align_corners=self.align_corners)\n            for logit in logit_list\n        ]\n\n\nclass FCNHead(nn.Layer):\n    \"\"\"\n    A simple implementation for FCNHead based on PaddlePaddle\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        
backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        backbone_channels (tuple): The values of backbone channels.\n            Default: (270, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n        pretrained (str, optional): The path of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 backbone_channels: Tuple[int] = (270, ),\n                 channels: int = None):\n        super(FCNHead, self).__init__()\n\n        self.num_classes = num_classes\n        self.backbone_indices = backbone_indices\n        if channels is None:\n            channels = backbone_channels[0]\n\n        self.conv_1 = layers.ConvBNReLU(\n            in_channels=backbone_channels[0], out_channels=channels, kernel_size=1, padding='same', stride=1)\n        self.cls = nn.Conv2D(in_channels=channels, out_channels=self.num_classes, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, feat_list: nn.Layer) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[0]]\n        x = self.conv_1(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_cityscapes/README.md",
    "content": "# PaddleHub 图像分割\n\n## 模型预测\n\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw48_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用fcn_hrnetw48_cityscapes模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform， mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='fcn_hrnetw48_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* 
`use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw48_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n**Args**\n* `images`:原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m fcn_hrnetw48_cityscapes\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = 
\"http://127.0.0.1:8866/predict/fcn_hrnetw48_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
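Step4 of the README schedules the learning rate with `paddle.optimizer.lr.PolynomialDecay`. A plain-Python sketch of the documented decay formula with the README's hyperparameters (this assumes the default `cycle=False` behaviour and `poly_lr` is our own helper; verify against your Paddle version):

```python
def poly_lr(step: int, base_lr: float = 0.01, decay_steps: int = 1000,
            power: float = 0.9, end_lr: float = 0.0001) -> float:
    """Polynomial decay: interpolate from base_lr down to end_lr over decay_steps."""
    frac = min(step, decay_steps) / decay_steps
    return (base_lr - end_lr) * (1 - frac) ** power + end_lr

# The rate falls from base_lr toward end_lr and is clamped there afterwards.
for s in (0, 250, 500, 750, 1000):
    print(s, round(poly_lr(s), 6))
```

With `power=0.9` the curve is slightly convex, dropping a bit faster early in training than a linear schedule would.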
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_cityscapes/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport fcn_hrnetw48_cityscapes.layers as layers\n\n\nclass HRNet_W48(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n    The original article refers to\n    Jingdong Wang, et, al. \"HRNet：Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n    Args:\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (48, 96).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default [48, 96, 192].\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default [4, 4, 4, 4].\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default [48, 96, 192, 384].\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: List[int] = [4],\n                 stage1_num_channels: List[int] = [64],\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: List[int] = [4, 4],\n                 stage2_num_channels: List[int] = [48, 96],\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: List[int] = [4, 4, 4],\n                 stage3_num_channels: List[int] = [48, 96, 192],\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: List[int] = [4, 4, 4, 4],\n                 stage4_num_channels: List[int] = [48, 96, 192, 384],\n                 has_se: bool = False,\n                 align_corners: bool = False):\n        super(HRNet_W48, self).__init__()\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        self.stage4_num_modules = 
stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = layers.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = layers.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            
num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        size = paddle.shape(st4[0])[2:]\n        x1 = F.interpolate(st4[1], size, mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], size, mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], size, mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass 
TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name: str = None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        layers.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    layers.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        
for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = layers.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = layers.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBN(\n        
        in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = layers.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n          
  residual = self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: float, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = 
self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def 
forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        layers.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                layers.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n              
              pre_num_filters = out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                layers.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = paddle.shape(residual)[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
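`HRNet_W48.forward` upsamples the three lower-resolution stage-4 branches to the highest-resolution feature map and concatenates along the channel axis, which is why `feat_channels` in `__init__` is the sum of `stage4_num_channels`. A small sketch of that bookkeeping using the default W48 widths (the stride arithmetic assumes the two stride-2 stem convolutions shown above):

```python
stage4_num_channels = [48, 96, 192, 384]   # default branch widths for HRNet_W48

# Branch 0 sits at stride 4 (two stride-2 stem convs); each transition layer
# adds a branch at half the previous resolution.
branch_strides = [4 * 2 ** i for i in range(len(stage4_num_channels))]

# forward() interpolates branches 1..3 up to branch 0's spatial size and
# concatenates on axis=1, so the backbone output width is simply the sum.
feat_channels = sum(stage4_num_channels)
print(branch_strides, feat_channels)  # [4, 8, 16, 32] 720
```

This matches `self.feat_channels = [sum(stage4_num_channels)]` in the constructor, and is the 720-channel input the FCN head consumes.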
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n          
  bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, 
padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x 
= self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, meaning the identity transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exist_one = Activation(\"not_exist_one\")\n        # KeyError: \"not_exist_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current 
{}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable convolutions in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: Tuple[int],\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, 
out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom fcn_hrnetw48_cityscapes.hrnet import HRNet_W48\nimport fcn_hrnetw48_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"fcn_hrnetw48_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Fcn_hrnetw48 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass FCN(nn.Layer):\n    \"\"\"\n    A simple implementation for FCN based on PaddlePaddle.\n\n    The original article refers to\n    Evan Shelhamer, et, al. \"Fully Convolutional Networks for Semantic Segmentation\"\n    (https://arxiv.org/abs/1411.4038).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. 
Default: None.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 channels: int = None,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(FCN, self).__init__()\n\n        self.backbone = HRNet_W48()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = FCNHead(num_classes, backbone_indices, backbone_channels, channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, paddle.shape(x)[2:], mode='bilinear', align_corners=self.align_corners)\n            for logit in logit_list\n        ]\n\n\nclass FCNHead(nn.Layer):\n    \"\"\"\n    A simple implementation for FCNHead based on PaddlePaddle\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        
backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        backbone_channels (tuple): The values of backbone channels.\n            Default: (270, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n        pretrained (str, optional): The path of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 backbone_channels: Tuple[int] = (270, ),\n                 channels: int = None):\n        super(FCNHead, self).__init__()\n\n        self.num_classes = num_classes\n        self.backbone_indices = backbone_indices\n        if channels is None:\n            channels = backbone_channels[0]\n\n        self.conv_1 = layers.ConvBNReLU(\n            in_channels=backbone_channels[0], out_channels=channels, kernel_size=1, padding='same', stride=1)\n        self.cls = nn.Conv2D(in_channels=channels, out_channels=self.num_classes, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, feat_list: nn.Layer) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[0]]\n        x = self.conv_1(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_voc/README.md",
    "content": "# PaddleHub Image Segmentation\n\n## Model Prediction\n\n\nTo run prediction with the pretrained model we provide, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw48_voc')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## How to Start Fine-tuning\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to start fine-tuning the fcn_hrnetw48_voc model on datasets such as OpticDiscSeg.\n\n## Code Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the Data Preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module defines a rich set of preprocessing operations for image segmentation data; replace them with whatever preprocessing you need.\n\n### Step2: Download and Use the Dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: the data preprocessing pipeline.\n* `mode`: the dataset split; one of `train`, `test`, `val`. Default: `train`.\n\nThe dataset preparation code can be found in [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n### Step3: Load the Pretrained Model\n\n```python\nmodel = hub.Module(name='fcn_hrnetw48_voc', num_classes=2, pretrained=None)\n```\n* `name`: the name of the pretrained model.\n* `num_classes`: the number of segmentation classes.\n* `pretrained`: whether to load your own trained parameters; if None, the provided default parameters are loaded.\n\n### Step4: Choose the Optimization Strategy and Runtime Configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization Strategy\n\nPaddle 2.0 provides a variety of optimizers, such as `SGD`, `Adam`, and `Adamax`. For `Adam`:\n\n* `learning_rate`: the global learning rate.\n* `parameters`: the model parameters to optimize.\n\n#### Runtime Configuration\n`Trainer` controls the fine-tuning process and accepts the following parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use the GPU, default False;\n* `use_vdl`: 
whether to use VisualDL to visualize the training process;\n* `checkpoint_dir`: the directory in which to save model parameters;\n* `compare_metrics`: the metric used to select the best model;\n\n`trainer.train` controls the actual training process and accepts the following parameters:\n\n* `train_dataset`: the dataset used for training;\n* `epochs`: the number of training epochs;\n* `batch_size`: the training batch size; when using a GPU, adjust batch_size to fit the hardware;\n* `num_workers`: the number of workers, default 0;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: the logging interval, in number of training batches.\n* `save_interval`: the model saving interval, in number of training epochs.\n\n## Model Prediction\n\nWhen fine-tuning is complete, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nWe use this model to run prediction. The predict.py script is as follows:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='fcn_hrnetw48_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nOnce the parameters are configured correctly, run the script with `python predict.py`.\n**Args**\n* `images`: paths of the original images, or images in BGR format;\n* `visualization`: whether to visualize the results, default True;\n* `save_path`: the path for saving results, default 'seg_result'.\n\n**NOTE:** At prediction time, the selected module, checkpoint_dir, and dataset must match those used for fine-tuning.\n\n## Serving Deployment\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the start command:\n\n```shell\n$ hub serving start -m fcn_hrnetw48_voc\n```\n\nThis deploys an image segmentation service API, listening on port 8866 by default.\n\n**NOTE:** When predicting on a GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it is not needed.\n\n### Step2: Send a Prediction Request\n\nWith the server configured, the following few lines of code send a prediction request and retrieve the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# Send the HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/fcn_hrnetw48_voc\"\nr = 
requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### View the Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_voc/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport fcn_hrnetw48_voc.layers as layers\n\n\nclass HRNet_W48(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n    The original article refers to\n    Jingdong Wang, et al. \"HRNet: Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n    Args:\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default [4].\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default [64].\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default [4, 4].\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default [48, 96].\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default [4, 4, 4].\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default [48, 96, 192].\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default [4, 4, 4, 4].\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default [48, 96, 192, 384].\n        has_se (bool, optional): Whether to use the Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: List[int] = [4],\n                 stage1_num_channels: List[int] = [64],\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: List[int] = [4, 4],\n                 stage2_num_channels: List[int] = [48, 96],\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: List[int] = [4, 4, 4],\n                 stage3_num_channels: List[int] = [48, 96, 192],\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: List[int] = [4, 4, 4, 4],\n                 stage4_num_channels: List[int] = [48, 96, 192, 384],\n                 has_se=False,\n                 align_corners=False):\n        super(HRNet_W48, self).__init__()\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        self.stage4_num_modules = 
stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = layers.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = layers.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            
num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        size = paddle.shape(st4[0])[2:]\n        x1 = F.interpolate(st4[1], size, mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], size, mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], size, mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass 
TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name: str = None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        layers.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    layers.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        
for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = layers.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = layers.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBN(\n        
        in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = layers.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = layers.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = layers.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n          
  residual = self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: float, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = 
self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def 
forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        layers.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                layers.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n              
              pre_num_filters = out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                layers.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = paddle.shape(residual)[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            
dilation=dilation,\n            groups=groups,\n            bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = 
self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n  
  def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n              
  raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: Tuple[int],\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 
1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/fcn_hrnetw48_voc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom fcn_hrnetw48_voc.hrnet import HRNet_W48\nimport fcn_hrnetw48_voc.layers as layers\n\n\n@moduleinfo(\n    name=\"fcn_hrnetw48_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Fcn_hrnetw48 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass FCN(nn.Layer):\n    \"\"\"\n    A simple implementation for FCN based on PaddlePaddle.\n\n    The original article refers to\n    Evan Shelhamer, et, al. \"Fully Convolutional Networks for Semantic Segmentation\"\n    (https://arxiv.org/abs/1411.4038).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. 
Default: None.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 channels: int = None,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(FCN, self).__init__()\n\n        self.backbone = HRNet_W48()\n        backbone_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n\n        self.head = FCNHead(num_classes, backbone_indices, backbone_channels, channels)\n\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(logit, paddle.shape(x)[2:], mode='bilinear', align_corners=self.align_corners)\n            for logit in logit_list\n        ]\n\n\nclass FCNHead(nn.Layer):\n    \"\"\"\n    A simple implementation for FCNHead based on PaddlePaddle\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        
backbone_indices (tuple, optional): The values in the tuple indicate the indices of output of backbone.\n            Default: (-1, ).\n        backbone_channels (tuple): The values of backbone channels.\n            Default: (270, ).\n        channels (int, optional): The channels between conv layer and the last layer of FCNHead.\n            If None, it will be the number of channels of input features. Default: None.\n        pretrained (str, optional): The path of pretrained model. Default: None\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int,\n                 backbone_indices: Tuple[int] = (-1, ),\n                 backbone_channels: Tuple[int] = (270, ),\n                 channels: int = None):\n        super(FCNHead, self).__init__()\n\n        self.num_classes = num_classes\n        self.backbone_indices = backbone_indices\n        if channels is None:\n            channels = backbone_channels[0]\n\n        self.conv_1 = layers.ConvBNReLU(\n            in_channels=backbone_channels[0], out_channels=channels, kernel_size=1, padding='same', stride=1)\n        self.cls = nn.Conv2D(in_channels=channels, out_channels=self.num_classes, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, feat_list: nn.Layer) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[0]]\n        x = self.conv_1(x)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/README.md",
    "content": "# ginet_resnet101vd_ade20k\n\n|模型名称|ginet_resnet101vd_ade20k|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ginet_resnet101vd|\n|数据集|ADE20K|\n|是否支持Fine-tuning|是|\n|模型大小|287MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145947107-b6f87161-d824-4c21-b01d-594ad03e56de.jpg\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145960027-343246bf-ce8b-456b-85c6-042c1f4477bd.png\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet101vd_ade20k\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_ade20k')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet101vd_ade20k模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - 
Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_ade20k', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_ade20k', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n 
           ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet101vd_ade20k\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_ade20k\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/README_en.md",
    "content": "# ginet_resnet101vd_ade20k\n\n|Module Name|ginet_resnet101vd_ade20k|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ginet_resnet101vd|\n|Dataset|ADE20K|\n|Fine-tuning supported or not|Yes|\n|Module Size|287MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145947107-b6f87161-d824-4c21-b01d-594ad03e56de.jpg\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145960027-343246bf-ce8b-456b-85c6-042c1f4477bd.png\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet101vd_ade20k\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_ade20k')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ginet_resnet101vd_ade20k model to fine-tune on datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: This data augmentation module defines lots of preprocessing methods for image segmentation data. Users can replace the preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. Default is `train`.\n\n                * For dataset preparation, please refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it into the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_ade20k', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_ade20k', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet101vd_ade20k\n          ```\n\n    - The segmentation service API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_ade20k\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict:\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddleseg.utils import utils\nfrom paddleseg.models import layers\n\nfrom ginet_resnet101vd_ade20k.resnet import ResNet101_vd\n\n\n@moduleinfo(\n    name=\"ginet_resnet101vd_ade20k\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GINetResnet101 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass GINetResNet101(nn.Layer):\n    \"\"\"\n    The GINetResNet101 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of output of backbone.\n        enable_auxiliary_loss (bool, optional): Whether to add an auxiliary loss. If true, an auxiliary loss head\n            is applied to the third backbone feature (c3). Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in base_forward. Default: True.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 150,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet101, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n\n        self.backbone = ResNet101_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(\n                1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            
checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [\n            F.interpolate(\n                logit, (h, w),\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(\n            shape=self.inp.shape,\n            dtype=str(self.inp.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(\n            nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(\n            nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(\n            in_channels,\n            inter_channels,\n            3,\n       
     padding=1,\n            bias_attr=False,\n            stride=1)\n\n        self.gloru = GlobalReasonUnit(\n            in_channels=inter_channels,\n            num_state=256,\n            num_node=84,\n            nclass=nclass)\n        self.conv6 = nn.Sequential(\n            nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(\n            in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(\n            in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(\n            num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp: paddle.Tensor) -> List[paddle.Tensor]:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = 
paddle.bmm(B, x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(\n            V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(\n            shape=self.gamma_vis.shape,\n            dtype=str(self.gamma_vis.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(\n            shape=self.gamma_word.shape,\n            dtype=str(self.gamma_word.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias=False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            
num_node,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv1D(\n            num_state,\n            num_state,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n            bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channle_in = in_dim\n        self.query_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, 
-1,\n                                                                Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1,\n                                                              Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_ade20k/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ginet_resnet101vd_ade20k.layers as L\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            act=None,\n            name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        # paddle.elementwise_add(..., act='relu') is a 1.x API; in paddle 2.x\n        # the fused activation argument is gone, so add then apply relu.\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet101_vd(nn.Layer):\n    def __init__(self,\n                 multi_grid: tuple = (1, 2, 4)):\n        super(ResNet101_vd, self).__init__()\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[\n                    block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0\n                                    and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/README.md",
    "content": "# ginet_resnet101vd_cityscapes\n\n|模型名称|ginet_resnet101vd_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ginet_resnet101vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|286MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145943452-6e3a8cce-b17c-417e-80ad-d47e1dd5e00c.png\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145943518-b608d555-1ddb-4100-b399-b6f777658caf.png\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet101vd_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet101vd_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - 
Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], 
visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet101vd_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/README_en.md",
    "content": "# ginet_resnet101vd_cityscapes\n\n|Module Name|ginet_resnet101vd_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ginet_resnet101vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|286MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145943452-6e3a8cce-b17c-417e-80ad-d47e1dd5e00c.png\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145954768-026af5a4-30a5-43f3-abe3-c0ea821e895a.png\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet101vd_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ginet_resnet101vd_cityscapes model to fine-tune on datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - The `segmentation_transforms` module defines a rich set of preprocessing methods for image segmentation data. Users can replace them according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset from the network and decompresses it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `num_classes`: the number of segmentation classes.\n                - `pretrained`: Whether to load your own trained model; if None, the provided pretrained parameters are loaded.\n\n        - Step4: Optimization strategy and run configuration\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen at Fine-tune time. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet101vd_cityscapes\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddleseg.utils import utils\nfrom paddleseg.models import layers\n\nfrom ginet_resnet101vd_cityscapes.resnet import ResNet101_vd\n\n\n@moduleinfo(\n    name=\"ginet_resnet101vd_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GINetResnet101 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass GINetResNet101(nn.Layer):\n    \"\"\"\n    The GINetResNet101 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of output of backbone.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss.\n            If true, auxiliary loss will be added after LearningToDownsample module. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in the base forward. Default: True.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet101, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n        self.jpu = jpu\n\n        self.backbone = ResNet101_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(\n                1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            
checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [\n            F.interpolate(\n                logit, (h, w),\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(\n            shape=self.inp.shape,\n            dtype=str(self.inp.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(\n            nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(\n            nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(\n            in_channels,\n            inter_channels,\n            3,\n       
     padding=1,\n            bias_attr=False,\n            stride=1)\n\n        self.gloru = GlobalReasonUnit(\n            in_channels=inter_channels,\n            num_state=256,\n            num_node=84,\n            nclass=nclass)\n        self.conv6 = nn.Sequential(\n            nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(\n            in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(\n            in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(\n            num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp: paddle.Tensor) -> paddle.Tensor:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = paddle.bmm(B, 
x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(\n            V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(\n            shape=self.gamma_vis.shape,\n            dtype=str(self.gamma_vis.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(\n            shape=self.gamma_word.shape,\n            dtype=str(self.gamma_word.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias=False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            num_node,\n            
kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv1D(\n            num_state,\n            num_state,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n            bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channle_in = in_dim\n        self.query_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, -1,\n                    
                                            Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1,\n                                                              Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_cityscapes/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ginet_resnet101vd_cityscapes.layers as L\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            act=None,\n            name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet101_vd(nn.Layer):\n    def __init__(self,\n                 multi_grid: tuple = (1, 2, 4)):\n        super(ResNet101_vd, self).__init__()\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[\n                    block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block]\n                      
  if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0\n                                    and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/README.md",
    "content": "# ginet_resnet101vd_voc\n\n|模型名称|ginet_resnet101vd_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ginet_resnet101vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|286MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145925887-bf9e62d3-8c6d-43c2-8062-6cb6ba59ec0e.jpg\" width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145925692-badb21d1-10e7-4a5d-82f5-1177d10a7681.png\" width = \"420\" height = \"505\" />\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet101vd_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet101vd_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按需替换为自己需要的预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。`predict.py`脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune时所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet101vd_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/README_en.md",
    "content": "# ginet_resnet101vd_voc\n\n|Module Name|ginet_resnet101vd_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ginet_resnet101vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|286MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145925887-bf9e62d3-8c6d-43c2-8062-6cb6ba59ec0e.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145925692-badb21d1-10e7-4a5d-82f5-1177d10a7681.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet101vd_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet101vd_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ginet_resnet101vd_voc model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data augmentation module defines many preprocessing methods for image segmentation data. Users can replace these methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode; the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, please refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet101vd_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet101vd_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet101vd_voc\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet101vd_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddleseg.utils import utils\nfrom paddleseg.models import layers\n\nfrom ginet_resnet101vd_voc.resnet import ResNet101_vd\n\n\n@moduleinfo(\n    name=\"ginet_resnet101vd_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GINetResnet101 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass GINetResNet101(nn.Layer):\n    \"\"\"\n    The GINetResNet101 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of the backbone outputs.\n        enable_auxiliary_loss (bool, optional): Whether to add an auxiliary loss branch. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in the base forward. Default: True.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet101, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n\n        self.backbone = ResNet101_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(\n                1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            
checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [\n            F.interpolate(\n                logit, (h, w),\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(\n            shape=self.inp.shape,\n            dtype=str(self.inp.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(\n            nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(\n            nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(\n            in_channels,\n            inter_channels,\n            3,\n       
     padding=1,\n            bias_attr=False,\n            stride=1)\n\n        self.gloru = GlobalReasonUnit(\n            in_channels=inter_channels,\n            num_state=256,\n            num_node=84,\n            nclass=nclass)\n        self.conv6 = nn.Sequential(\n            nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(\n            in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(\n            in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(\n            num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp: paddle.Tensor) -> List[paddle.Tensor]:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = 
paddle.bmm(B, x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(\n            V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(\n            shape=self.gamma_vis.shape,\n            dtype=str(self.gamma_vis.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(\n            shape=self.gamma_word.shape,\n            dtype=str(self.gamma_word.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias=False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            
num_node,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv1D(\n            num_state,\n            num_state,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n            bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channle_in = in_dim\n        self.query_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, 
-1,\n                                                                Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1,\n                                                              Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet101vd_voc/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ginet_resnet101vd_voc.layers as L\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            act=None,\n            name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet101_vd(nn.Layer):\n    def __init__(self,\n                 multi_grid: tuple = (1, 2, 4)):\n        super(ResNet101_vd, self).__init__()\n        depth = [3, 4, 23, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[\n                    block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block]\n                        if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0\n                                    and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/README.md",
    "content": "# ginet_resnet50vd_ade20k\n\n|模型名称|ginet_resnet50vd_ade20k|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ginet_resnet50vd|\n|数据集|ADE20K|\n|是否支持Fine-tuning|是|\n|模型大小|214MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145947107-b6f87161-d824-4c21-b01d-594ad03e56de.jpg\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145947270-1b8e0671-c5d3-4b61-b99e-0af27ccd9096.png\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet50vd_ade20k\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_ade20k')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet50vd_ade20k模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 
下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_ade20k', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。`predict.py`脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_ade20k', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n          
  ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet50vd_ade20k\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可发送预测请求并获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_ade20k\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/README_en.md",
    "content": "# ginet_resnet50vd_ade20k\n\n|Module Name|ginet_resnet50vd_ade20k|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ginet_resnet50vd|\n|Dataset|ADE20K|\n|Fine-tuning supported or not|Yes|\n|Module Size|214MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145947107-b6f87161-d824-4c21-b01d-594ad03e56de.jpg\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145947270-1b8e0671-c5d3-4b61-b99e-0af27ccd9096.png\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet50vd_ade20k\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_ade20k')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ginet_resnet50vd_ade20k model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_ade20k', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_ade20k', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet50vd_ade20k\n          ```\n\n    - The segmentation service API is now deployed, and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_ade20k\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # Instantiate the activation layer by name; getattr avoids eval.\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of the feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddleseg.utils import utils\nfrom paddleseg.models import layers\n\nfrom ginet_resnet50vd_ade20k.resnet import ResNet50_vd\n\n\n@moduleinfo(\n    name=\"ginet_resnet50vd_ade20k\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GINetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass GINetResNet50(nn.Layer):\n    \"\"\"\n    The GINetResNet50 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of output of backbone.\n        enable_auxiliary_loss (bool, optional): Whether to add an auxiliary loss.\n            If True, an auxiliary head is attached to the third backbone stage. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of the feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in the base forward. Default: True.\n        pretrained (str, optional): The path or URL of the pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 150,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet50, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n        self.jpu = jpu\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(\n                1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            
checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [\n            F.interpolate(\n                logit, (h, w),\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(\n            shape=self.inp.shape,\n            dtype=str(self.inp.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(\n            nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(\n            nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(\n            in_channels,\n            inter_channels,\n            3,\n       
     padding=1,\n            bias_attr=False,\n            stride=1)\n\n        self.gloru = GlobalReasonUnit(\n            in_channels=inter_channels,\n            num_state=256,\n            num_node=84,\n            nclass=nclass)\n        self.conv6 = nn.Sequential(\n            nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(\n            in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(\n            in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(\n            num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp:paddle.Tensor) -> List[paddle.Tensor]:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = 
paddle.bmm(B, x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(\n            V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(\n            shape=self.gamma_vis.shape,\n            dtype=str(self.gamma_vis.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(\n            shape=self.gamma_word.shape,\n            dtype=str(self.gamma_word.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias: bool = False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            
num_node,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv1D(\n            num_state,\n            num_state,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n            bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature.\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channel_in = in_dim\n        self.query_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, 
-1,\n                                                                Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1,\n                                                              Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_ade20k/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ginet_resnet50vd_ade20k.layers as L\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            act=None,\n            name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        # paddle.elementwise_add with an `act` argument is a removed 1.x API;\n        # use paddle.add followed by F.relu, matching BottleneckBlock.\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet50_vd(nn.Layer):\n    def __init__(self,\n                 multi_grid: tuple = (1, 2, 4)):\n        super(ResNet50_vd, self).__init__()\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[\n                    block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block]\n                        
if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0\n                                    and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/README.md",
    "content": "# ginet_resnet50vd_cityscapes\n\n|模型名称|ginet_resnet50vd_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|ginet_resnet50vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|214MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145943452-6e3a8cce-b17c-417e-80ad-d47e1dd5e00c.png\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145943518-b608d555-1ddb-4100-b399-b6f777658caf.png\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet50vd_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet50vd_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - 
Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型参数，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], 
visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet50vd_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/README_en.md",
    "content": "# ginet_resnet50vd_cityscapes\n\n|Module Name|ginet_resnet50vd_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|ginet_resnet50vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|214MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145943452-6e3a8cce-b17c-417e-80ad-d47e1dd5e00c.png\"  hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145943518-b608d555-1ddb-4100-b399-b6f777658caf.png\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet50vd_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the ginet_resnet50vd_cityscapes model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load self-trained model parameters; if None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: image path or BGR ndarray with shape [H, W, C].\n                * `visualization`: whether to save the segmentation results as image files.\n                * `save_path`: save path of the results, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet50vd_cityscapes\n          ```\n\n    - The serving API is now deployed, with default port 8866.\n\n    - **NOTE:** If you want to use GPU for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release"
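The serving client above packs images into a JSON body as base64 strings. A minimal stdlib-only sketch of that encode/decode round trip (raw bytes stand in for the JPEG buffer that `cv2.imencode` would produce; no cv2 or PaddleHub required):

```python
import base64
import json


def bytes_to_base64(data: bytes) -> str:
    # mirrors cv2_to_base64: raw bytes -> utf8 base64 string for the JSON body
    return base64.b64encode(data).decode('utf8')


def base64_to_bytes(b64str: str) -> bytes:
    # mirrors base64_to_cv2 up to the image-decoding step
    return base64.b64decode(b64str.encode('utf8'))


# a fake buffer standing in for cv2.imencode('.jpg', image)[1].tobytes()
raw = b'\xff\xd8\xff\xe0 fake jpeg bytes'
payload = json.dumps({'images': [bytes_to_base64(raw)]})

# the server decodes the same field before running prediction
decoded = base64_to_bytes(json.loads(payload)['images'][0])
assert decoded == raw
```

Base64 is used because JSON cannot carry raw binary data; the real client additionally JPEG-compresses the image before encoding.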
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            kernel_size: int,\n            stride: int = 1,\n            dilation: int = 1,\n            groups: int = 1,\n            is_vd_mode: bool = False,\n            act: str = None,\n            name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(\n            kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            
bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if 
self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n           
      out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n  
      lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # look the activation class up by name instead of eval-ing a string\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. 
Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool= False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n            
    mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
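The `Activation` wrapper above builds a lowercase-name to class-name dict over `paddle.nn.layer.activation` and instantiates the matching class. A framework-free sketch of the same dispatch pattern (the `ReLU`/`Sigmoid` stand-ins below are hypothetical scalar versions, not Paddle's layers):

```python
import math


class ReLU:
    def __call__(self, x):
        return x if x > 0 else 0


class Sigmoid:
    def __call__(self, x):
        return 1.0 / (1.0 + math.exp(-x))


# the same mapping trick as Activation: lowercase names -> original class names
_classes = {'ReLU': ReLU, 'Sigmoid': Sigmoid}
lower_act_names = [name.lower() for name in _classes]
act_dict = dict(zip(lower_act_names, _classes))


def make_activation(act=None):
    if act is None:
        return lambda x: x  # identity, like Activation(act=None)
    if act not in act_dict:
        raise KeyError('{} does not exist in the current {}'.format(act, act_dict.keys()))
    return _classes[act_dict[act]]()  # instantiate the resolved class


relu = make_activation('relu')
assert relu(-3) == 0 and relu(5) == 5
```

As in the original, an unknown name raises `KeyError` listing the valid options, and `None` means the identity transformation.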
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddleseg.utils import utils\nfrom paddleseg.models import layers\n\nfrom ginet_resnet50vd_cityscapes.resnet import ResNet50_vd\n\n\n@moduleinfo(\n    name=\"ginet_resnet50vd_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"GINetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass GINetResNet50(nn.Layer):\n    \"\"\"\n    The GINetResNet50 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of output of backbone.\n        enable_auxiliary_loss (bool, optional): Whether to add an auxiliary loss branch. If True, an auxiliary\n            head is applied to the third backbone stage. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in the base forward. Default: True.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet50, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(\n                1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            
checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [\n            F.interpolate(\n                logit, (h, w),\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(\n            shape=self.inp.shape,\n            dtype=str(self.inp.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(\n            nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(\n            nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(\n            in_channels,\n            inter_channels,\n            3,\n            
padding=1,\n            bias_attr=False,\n            stride=1)\n\n        self.gloru = GlobalReasonUnit(\n            in_channels=inter_channels,\n            num_state=256,\n            num_node=84,\n            nclass=nclass)\n        self.conv6 = nn.Sequential(\n            nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(\n            in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(\n            in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(\n            num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp: paddle.Tensor) -> paddle.Tensor:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = paddle.bmm(B, 
x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(\n            V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(\n            shape=self.gamma_vis.shape,\n            dtype=str(self.gamma_vis.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(\n            shape=self.gamma_word.shape,\n            dtype=str(self.gamma_word.numpy().dtype),\n            default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias: bool = False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            num_node,\n    
        kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv1D(\n            num_state,\n            num_state,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n            bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channle_in = in_dim\n        self.query_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(\n            in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, -1,\n          
                                                      Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1,\n                                                              Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/requirements.txt",
    "content": "paddleseg >= 2.3.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_cityscapes/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport ginet_resnet50vd_cityscapes.layers as L\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            act=None,\n            name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = 
self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        # paddle.add + F.relu replaces the legacy fused elementwise_add(act='relu')\n        y = paddle.add(x=short, y=conv1)\n        y = F.relu(y)\n\n        return y\n\n\nclass ResNet50_vd(nn.Layer):\n    def __init__(self,\n                 multi_grid: tuple = (1, 2, 4)):\n        super(ResNet50_vd, self).__init__()\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[\n                    block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(\n                        in_channels=num_channels[block]\n                        
if i == 0 else num_filters[block] * 4,\n                        out_channels=num_filters[block],\n                        stride=2 if i == 0 and block != 0\n                                    and dilation_rate == 1 else 1,\n                        shortcut=shortcut,\n                        if_first=block == i == 0,\n                        name=conv_name,\n                        dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list"
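The stage loop in `ResNet50_vd.__init__` above derives each bottleneck's dilation rate from `dilation_dict` and, in the last stage, the `multi_grid` multipliers. A standalone sketch of that schedule (pure Python, no Paddle needed):

```python
def dilation_rates(depth=(3, 4, 6, 3), multi_grid=(1, 2, 4)):
    """Per-block dilation rates, mirroring the loop in ResNet50_vd.__init__."""
    dilation_dict = {2: 2, 3: 4}  # stage index -> base dilation
    rates = []
    for block in range(len(depth)):
        block_rates = []
        for i in range(depth[block]):
            rate = dilation_dict.get(block, 1)
            if block == 3:  # the last stage additionally applies multi_grid
                rate *= multi_grid[i]
            block_rates.append(rate)
        rates.append(block_rates)
    return rates


# stages 3 and 4 trade stride for dilation, keeping the output stride at 8
assert dilation_rates() == [[1, 1, 1], [1, 1, 1, 1], [2, 2, 2, 2, 2, 2], [4, 8, 16]]
```

This matches the stride rule in the loop: a block only downsamples (`stride=2`) when its dilation rate is 1, so the two dilated stages preserve spatial resolution.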
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/README.md",
    "content": "# ginet_resnet50vd_voc\n\n|模型名称|ginet_resnet50vd_voc|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|ginet_resnet50vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|214MB|\n|指标|-|\n|最新更新日期|2021-12-14|\n\n## 一、模型基本信息\n\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145925887-bf9e62d3-8c6d-43c2-8062-6cb6ba59ec0e.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145925692-badb21d1-10e7-4a5d-82f5-1177d10a7681.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[ginet](https://arxiv.org/pdf/2009.06160)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install ginet_resnet50vd_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)  \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ginet_resnet50vd_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `num_classes`: 分割任务的类别数。\n                - `pretrained`: 自己训练的模型参数路径，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。`predict.py`脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")
\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m ginet_resnet50vd_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/README_en.md",
    "content": "# ginet_resnet50vd_voc\n\n|Module Name|ginet_resnet50vd_voc|\n| :--- | :---: |\n|Category|Image Segmentation|\n|Network|ginet_resnet50vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|214MB|\n|Data indicators|-|\n|Latest update date|2021-12-14|\n\n## I. Basic Information\n\n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/145925887-bf9e62d3-8c6d-43c2-8062-6cb6ba59ec0e.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/145925692-badb21d1-10e7-4a5d-82f5-1177d10a7681.png\"  width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [ginet](https://arxiv.org/pdf/2009.06160)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install ginet_resnet50vd_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='ginet_resnet50vd_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2、Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start fine-tuning the ginet_resnet50vd_voc model on datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data augmentation module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. Default is `train`.
\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='ginet_resnet50vd_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `num_classes`: number of segmentation classes.\n                - `pretrained`: path of the self-trained model parameters; if it is None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:
\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='ginet_resnet50vd_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m ginet_resnet50vd_voc\n          ```\n\n    - The serving API is now deployed, and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/ginet_resnet50vd_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import AvgPool2D\nfrom paddle.nn import Conv2D\nfrom paddle.nn.layer import activation\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(in_channels=in_channels,\n                            out_channels=out_channels,\n                            kernel_size=kernel_size,\n                            stride=stride,\n                            padding=(kernel_size - 1) // 2 if dilation 
== 1 else 0,\n                            dilation=dilation,\n                            groups=groups,\n                            bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(in_channels=in_channels,\n                                 out_channels=out_channels,\n                                 kernel_size=1,\n                                 act='relu',\n                                 name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(in_channels=out_channels,\n                                 out_channels=out_channels,\n                                 kernel_size=3,\n                                 stride=stride,\n                                 act='relu',\n                                 dilation=dilation,\n                                 name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(in_channels=out_channels,\n                                 out_channels=out_channels * 4,\n                                 kernel_size=1,\n                                 act=None,\n                                 name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(in_channels=in_channels,\n        
                             out_channels=out_channels * 4,\n                                     kernel_size=1,\n                                     stride=1,\n                                     is_vd_mode=False if if_first or stride == 1 else True,\n                                     name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(in_channels,\n                                     out_channels=in_channels,\n                                     kernel_size=kernel_size,\n                                     padding=padding,\n                                     groups=in_channels,\n                                     **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, 
padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"
\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # Look up the activation class by name instead of using eval.\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(in_channels=in_channels,\n                              out_channels=out_channels,\n                              kernel_size=1 if ratio == 1 else 3,\n                              dilation=ratio,\n                              padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)
\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                                                 ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import List\nfrom typing import Tuple\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom ginet_resnet50vd_voc.resnet import ResNet50_vd\nfrom paddle import nn\nfrom paddleseg.models import layers\n\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\nfrom paddlehub.module.module import moduleinfo\n\n\n@moduleinfo(name=\"ginet_resnet50vd_voc\",\n            type=\"CV/semantic_segmentation\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"GINetResnet50 is a segmentation model.\",\n            version=\"1.0.0\",\n            meta=ImageSegmentationModule)\nclass GINetResNet50(nn.Layer):\n    \"\"\"\n    The GINetResNet50 implementation based on PaddlePaddle.\n    The original article refers to\n    Wu, Tianyi, Yu Lu, Yu Zhu, Chuang Zhang, Ming Wu, Zhanyu Ma, and Guodong Guo. \"GINet: Graph interaction network for scene parsing.\" In European Conference on Computer Vision, pp. 34-51. 
Springer, Cham, 2020.\n    (https://arxiv.org/pdf/2009.06160).\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Values in the tuple indicate the indices of output of backbone.\n        enable_auxiliary_loss (bool, optional): A bool value that indicates whether to add auxiliary loss.\n            If true, an auxiliary loss head is attached to the backbone features. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: True.\n        jpu (bool, optional): Whether to use the JPU unit in the base forward. Default: True.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (0, 1, 2, 3),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = True,\n                 jpu: bool = True,\n                 pretrained: str = None):\n        super(GINetResNet50, self).__init__()\n        self.nclass = num_classes\n        self.aux = enable_auxiliary_loss\n        self.jpu = jpu\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        self.jpu = layers.JPU([512, 1024, 2048], width=512) if jpu else None\n        self.head = GIHead(in_channels=2048, nclass=num_classes)\n\n        if self.aux:\n            self.auxlayer = layers.AuxLayer(1024, 1024 // 4, num_classes, bias_attr=False)\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = 
os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def base_forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        c1, c2, c3, c4 = [feat_list[i] for i in self.backbone_indices]\n\n        if self.jpu:\n            return self.jpu(c1, c2, c3, c4)\n        else:\n            return c1, c2, c3, c4\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        _, _, h, w = x.shape\n        _, _, c3, c4 = self.base_forward(x)\n\n        logit_list = []\n        x, _ = self.head(c4)\n        logit_list.append(x)\n\n        if self.aux:\n            auxout = self.auxlayer(c3)\n\n            logit_list.append(auxout)\n\n        return [F.interpolate(logit, (h, w), mode='bilinear', align_corners=self.align_corners) for logit in logit_list]\n\n\nclass GIHead(nn.Layer):\n    \"\"\"The Graph Interaction Network head.\"\"\"\n\n    def __init__(self, in_channels: int, nclass: int):\n        super().__init__()\n        self.nclass = nclass\n        inter_channels = in_channels // 4\n        self.inp = paddle.zeros(shape=(nclass, 300), dtype='float32')\n        self.inp = paddle.create_parameter(shape=self.inp.shape,\n                                           dtype=str(self.inp.numpy().dtype),\n                                           default_initializer=paddle.nn.initializer.Assign(self.inp))\n\n        self.fc1 = nn.Sequential(nn.Linear(300, 128), nn.BatchNorm1D(128), nn.ReLU())\n        self.fc2 = nn.Sequential(nn.Linear(128, 256), nn.BatchNorm1D(256), nn.ReLU())\n        self.conv5 = layers.ConvBNReLU(in_channels, inter_channels, 3, padding=1, bias_attr=False, stride=1)\n\n        self.gloru = GlobalReasonUnit(in_channels=inter_channels, num_state=256, 
num_node=84, nclass=nclass)\n        self.conv6 = nn.Sequential(nn.Dropout(0.1), nn.Conv2D(inter_channels, nclass, 1))\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        B, C, H, W = x.shape\n        inp = self.inp.detach()\n\n        inp = self.fc1(inp)\n        inp = self.fc2(inp).unsqueeze(axis=0).transpose((0, 2, 1))\\\n                           .expand((B, 256, self.nclass))\n\n        out = self.conv5(x)\n\n        out, se_out = self.gloru(out, inp)\n        out = self.conv6(out)\n        return out, se_out\n\n\nclass GlobalReasonUnit(nn.Layer):\n    \"\"\"\n        The original paper refers to:\n            Chen, Yunpeng, et al. \"Graph-Based Global Reasoning Networks\" (https://arxiv.org/abs/1811.12814)\n    \"\"\"\n\n    def __init__(self, in_channels: int, num_state: int = 256, num_node: int = 84, nclass: int = 59):\n        super().__init__()\n        self.num_state = num_state\n        self.conv_theta = nn.Conv2D(in_channels, num_node, kernel_size=1, stride=1, padding=0)\n        self.conv_phi = nn.Conv2D(in_channels, num_state, kernel_size=1, stride=1, padding=0)\n        self.graph = GraphLayer(num_state, num_node, nclass)\n        self.extend_dim = nn.Conv2D(num_state, in_channels, kernel_size=1, bias_attr=False)\n\n        self.bn = layers.SyncBatchNorm(in_channels)\n\n    def forward(self, x: paddle.Tensor, inp: paddle.Tensor) -> List[paddle.Tensor]:\n        B = self.conv_theta(x)\n        sizeB = B.shape\n        B = B.reshape((sizeB[0], sizeB[1], -1))\n\n        sizex = x.shape\n        x_reduce = self.conv_phi(x)\n        x_reduce = x_reduce.reshape((sizex[0], -1, sizex[2] * sizex[3]))\\\n                           .transpose((0, 2, 1))\n\n        V = paddle.bmm(B, x_reduce).transpose((0, 2, 1))\n        V = paddle.divide(V, paddle.to_tensor([sizex[2] * sizex[3]], dtype='float32'))\n\n        class_node, new_V = self.graph(inp, V)\n        D = B.reshape((sizeB[0], -1, sizeB[2] * sizeB[3])).transpose((0, 2, 1))\n        
Y = paddle.bmm(D, new_V.transpose((0, 2, 1)))\n        Y = Y.transpose((0, 2, 1)).reshape((sizex[0], self.num_state, \\\n                                            sizex[2], -1))\n        Y = self.extend_dim(Y)\n        Y = self.bn(Y)\n        out = Y + x\n\n        return out, class_node\n\n\nclass GraphLayer(nn.Layer):\n\n    def __init__(self, num_state: int, num_node: int, num_class: int):\n        super().__init__()\n        self.vis_gcn = GCN(num_state, num_node)\n        self.word_gcn = GCN(num_state, num_class)\n        self.transfer = GraphTransfer(num_state)\n        self.gamma_vis = paddle.zeros([num_node])\n        self.gamma_word = paddle.zeros([num_class])\n        self.gamma_vis = paddle.create_parameter(shape=self.gamma_vis.shape,\n                                                 dtype=str(self.gamma_vis.numpy().dtype),\n                                                 default_initializer=paddle.nn.initializer.Assign(self.gamma_vis))\n        self.gamma_word = paddle.create_parameter(shape=self.gamma_word.shape,\n                                                  dtype=str(self.gamma_word.numpy().dtype),\n                                                  default_initializer=paddle.nn.initializer.Assign(self.gamma_word))\n\n    def forward(self, inp: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        inp = self.word_gcn(inp)\n        new_V = self.vis_gcn(vis_node)\n        class_node, vis_node = self.transfer(inp, new_V)\n\n        class_node = self.gamma_word * inp + class_node\n        new_V = self.gamma_vis * vis_node + new_V\n        return class_node, new_V\n\n\nclass GCN(nn.Layer):\n\n    def __init__(self, num_state: int = 128, num_node: int = 64, bias: bool = False):\n        super().__init__()\n        self.conv1 = nn.Conv1D(\n            num_node,\n            num_node,\n            kernel_size=1,\n            padding=0,\n            stride=1,\n            groups=1,\n        )\n        self.relu = nn.ReLU()\n        
self.conv2 = nn.Conv1D(num_state, num_state, kernel_size=1, padding=0, stride=1, groups=1, bias_attr=bias)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        h = self.conv1(x.transpose((0, 2, 1))).transpose((0, 2, 1))\n        h = h + x\n        h = self.relu(h)\n        h = self.conv2(h)\n        return h\n\n\nclass GraphTransfer(nn.Layer):\n    \"\"\"Transfer vis graph to class node, transfer class node to vis feature\"\"\"\n\n    def __init__(self, in_dim: int):\n        super().__init__()\n        self.channle_in = in_dim\n        self.query_conv = nn.Conv1D(in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.key_conv = nn.Conv1D(in_channels=in_dim, out_channels=in_dim // 2, kernel_size=1)\n        self.value_conv_vis = nn.Conv1D(in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.value_conv_word = nn.Conv1D(in_channels=in_dim, out_channels=in_dim, kernel_size=1)\n        self.softmax_vis = nn.Softmax(axis=-1)\n        self.softmax_word = nn.Softmax(axis=-2)\n\n    def forward(self, word: paddle.Tensor, vis_node: paddle.Tensor) -> List[paddle.Tensor]:\n        m_batchsize, C, Nc = word.shape\n        m_batchsize, C, Nn = vis_node.shape\n\n        proj_query = self.query_conv(word).reshape((m_batchsize, -1, Nc))\\\n                                          .transpose((0, 2, 1))\n        proj_key = self.key_conv(vis_node).reshape((m_batchsize, -1, Nn))\n\n        energy = paddle.bmm(proj_query, proj_key)\n        attention_vis = self.softmax_vis(energy).transpose((0, 2, 1))\n        attention_word = self.softmax_word(energy)\n\n        proj_value_vis = self.value_conv_vis(vis_node).reshape((m_batchsize, -1, Nn))\n        proj_value_word = self.value_conv_word(word).reshape((m_batchsize, -1, Nc))\n\n        class_out = paddle.bmm(proj_value_vis, attention_vis)\n        node_out = paddle.bmm(proj_value_word, attention_word)\n        return class_out, node_out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/requirements.txt",
    "content": "paddleseg>=2.3.0"
  },
  {
    "path": "modules/image/semantic_segmentation/ginet_resnet50vd_voc/resnet.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ginet_resnet50vd_voc.layers as L\nimport paddle\nimport paddle.nn as nn\n\n\nclass BasicBlock(nn.Layer):\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n        self.stride = stride\n        self.conv0 = L.ConvBNLayer(in_channels=in_channels,\n                                   out_channels=out_channels,\n                                   kernel_size=3,\n                                   stride=stride,\n                                   act='relu',\n                                   name=name + \"_branch2a\")\n        self.conv1 = L.ConvBNLayer(in_channels=out_channels,\n                                   out_channels=out_channels,\n                                   kernel_size=3,\n                                   act=None,\n                                   name=name + \"_branch2b\")\n\n        if not shortcut:\n            self.short = L.ConvBNLayer(in_channels=in_channels,\n                                       out_channels=out_channels,\n                                       kernel_size=1,\n                                       stride=1,\n                                   
    is_vd_mode=False if if_first else True,\n                                       name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        # paddle.elementwise_add(..., act=...) is a fluid-era API removed in Paddle 2.x\n        y = paddle.add(x=short, y=conv1)\n        y = nn.functional.relu(y)\n\n        return y\n\n\nclass ResNet50_vd(nn.Layer):\n\n    def __init__(self, multi_grid: tuple = (1, 2, 4)):\n        super(ResNet50_vd, self).__init__()\n        depth = [3, 4, 6, 3]\n        num_channels = [64, 256, 512, 1024]\n        num_filters = [64, 128, 256, 512]\n        self.feat_channels = [c * 4 for c in num_filters]\n        dilation_dict = {2: 2, 3: 4}\n        self.conv1_1 = L.ConvBNLayer(in_channels=3,\n                                     out_channels=32,\n                                     kernel_size=3,\n                                     stride=2,\n                                     act='relu',\n                                     name=\"conv1_1\")\n        self.conv1_2 = L.ConvBNLayer(in_channels=32,\n                                     out_channels=32,\n                                     kernel_size=3,\n                                     stride=1,\n                                     act='relu',\n                                     name=\"conv1_2\")\n        self.conv1_3 = L.ConvBNLayer(in_channels=32,\n                                     out_channels=64,\n                                     kernel_size=3,\n                                     stride=1,\n                                     act='relu',\n                                     name=\"conv1_3\")\n        self.pool2d_max = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.stage_list = []\n\n        for block in range(len(depth)):\n            shortcut = False\n            block_list = []\n
            for i in range(depth[block]):\n                conv_name = \"res\" + str(block + 2) + chr(97 + i)\n                dilation_rate = dilation_dict[block] if dilation_dict and block in dilation_dict else 1\n                if block == 3:\n                    dilation_rate = dilation_rate * multi_grid[i]\n                bottleneck_block = self.add_sublayer(\n                    'bb_%d_%d' % (block, i),\n                    L.BottleneckBlock(in_channels=num_channels[block] if i == 0 else num_filters[block] * 4,\n                                      out_channels=num_filters[block],\n                                      stride=2 if i == 0 and block != 0 and dilation_rate == 1 else 1,\n                                      shortcut=shortcut,\n                                      if_first=block == i == 0,\n                                      name=conv_name,\n                                      dilation=dilation_rate))\n                block_list.append(bottleneck_block)\n                shortcut = True\n            self.stage_list.append(block_list)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        y = self.pool2d_max(y)\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n        return feat_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/hardnet_cityscapes/README.md",
    "content": "# PaddleHub 图像分割\n\n## 模型预测\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='hardnet_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用hardnet_cityscapes模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='hardnet_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n* `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n
* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='hardnet_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n**Args**\n* `images`: 原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m hardnet_cityscapes\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/hardnet_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\n
mask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/hardnet_cityscapes/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu' or os.environ.get('PADDLESEG_EXPORT_STAGE'):\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: 
paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvReLUPool(nn.Layer):\n    \"\"\"Basic conv relu pool layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int):\n        super().__init__()\n        self.conv = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1, dilation=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = F.relu(x)\n        # F.pool2d is a fluid-era API; use the Paddle 2.x functional max pooling\n        x = F.max_pool2d(x, kernel_size=2, stride=2)\n        return x\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Basic separable conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass DepthwiseConvBN(nn.Layer):\n    \"\"\"Basic depthwise conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n       
 in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self, in_channels: int, inter_channels: int, out_channels: int, dropout_prob: float = 0.1):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=in_channels, out_channels=inter_channels, kernel_size=3, padding=1)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(in_channels=inter_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/hardnet_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import Union, Tuple, List\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport hardnet_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"hardnet_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"HarDNet is a segmentation model trained on Cityscapes.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass HarDNet(nn.Layer):\n    \"\"\"\n    [Real Time] The FC-HarDNet 70 implementation based on PaddlePaddle.\n    The original article refers to\n        Chao, Ping, et al. \"HarDNet: A Low Memory Traffic Network\"\n        (https://arxiv.org/pdf/1909.00948.pdf)\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        stem_channels (tuple|list, optional): The number of channels before the encoder. Default: (16, 24, 32, 48).\n        ch_list (tuple|list, optional): The number of channels at each block in the encoder. Default: (64, 96, 160, 224, 320).\n        grmul (float, optional): The channel multiplying factor in HarDBlock, which is m in the paper. 
Default: 1.7.\n        gr (tuple|list, optional): The growth rate in each HarDBlock, which is k in the paper. Default: (10, 16, 18, 24, 32).\n        n_layers (tuple|list, optional): The number of layers in each HarDBlock. Default: (4, 4, 8, 8, 8).\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 stem_channels: Tuple[int] = (16, 24, 32, 48),\n                 ch_list: Tuple[int] = (64, 96, 160, 224, 320),\n                 grmul: float = 1.7,\n                 gr: Tuple[int] = (10, 16, 18, 24, 32),\n                 n_layers: Tuple[int] = (4, 4, 8, 8, 8),\n                 align_corners: bool = False,\n                 pretrained: str = None):\n\n        super(HarDNet, self).__init__()\n        self.align_corners = align_corners\n        self.pretrained = pretrained\n        encoder_blks_num = len(n_layers)\n        decoder_blks_num = encoder_blks_num - 1\n        encoder_in_channels = stem_channels[3]\n\n        self.stem = nn.Sequential(\n            layers.ConvBNReLU(3, stem_channels[0], kernel_size=3, bias_attr=False),\n            layers.ConvBNReLU(stem_channels[0], stem_channels[1], kernel_size=3, bias_attr=False),\n            layers.ConvBNReLU(stem_channels[1], stem_channels[2], kernel_size=3, stride=2, bias_attr=False),\n            layers.ConvBNReLU(stem_channels[2], stem_channels[3], kernel_size=3, bias_attr=False))\n\n        self.encoder = Encoder(encoder_blks_num, encoder_in_channels, ch_list, gr, grmul, n_layers)\n\n        skip_connection_channels = self.encoder.get_skip_channels()\n        decoder_in_channels = self.encoder.get_out_channels()\n\n        self.decoder = Decoder(decoder_blks_num, 
decoder_in_channels, skip_connection_channels, gr, grmul, n_layers,\n                               align_corners)\n\n        self.cls_head = nn.Conv2D(in_channels=self.decoder.get_out_channels(), out_channels=num_classes, kernel_size=1)\n\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        input_shape = paddle.shape(x)[2:]\n        x = self.stem(x)\n        x, skip_connections = self.encoder(x)\n        x = self.decoder(x, skip_connections)\n        logit = self.cls_head(x)\n        logit = F.interpolate(logit, size=input_shape, mode=\"bilinear\", align_corners=self.align_corners)\n        return [logit]\n\n\nclass Encoder(nn.Layer):\n    \"\"\"The Encoder implementation of FC-HardDNet 70.\n\n    Args:\n        n_blocks (int): The number of blocks in the Encoder module.\n        in_channels (int): The number of input channels.\n        ch_list (tuple|list): The number of channels at each block in the encoder.\n        grmul (float): The channel multiplying factor in HarDBlock, which is m in the paper.\n        gr (tuple|list): The growth rate in each HarDBlock, which is k in the paper.\n        n_layers (tuple|list): The number of layers in each HarDBlock.\n    \"\"\"\n\n    def __init__(self, n_blocks: int, in_channels: int, ch_list: List[int], gr: List[int], grmul: float,\n                 n_layers: List[int]):\n        super().__init__()\n        
self.skip_connection_channels = []\n        self.shortcut_layers = []\n        self.blks = nn.LayerList()\n        ch = in_channels\n        for i in range(n_blocks):\n            blk = HarDBlock(ch, gr[i], grmul, n_layers[i])\n            ch = blk.get_out_ch()\n            self.skip_connection_channels.append(ch)\n            self.blks.append(blk)\n            if i < n_blocks - 1:\n                self.shortcut_layers.append(len(self.blks) - 1)\n            self.blks.append(layers.ConvBNReLU(ch, ch_list[i], kernel_size=1, bias_attr=False))\n\n            ch = ch_list[i]\n            if i < n_blocks - 1:\n                self.blks.append(nn.AvgPool2D(kernel_size=2, stride=2))\n        self.out_channels = ch\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        skip_connections = []\n        for i in range(len(self.blks)):\n            x = self.blks[i](x)\n            if i in self.shortcut_layers:\n                skip_connections.append(x)\n        return x, skip_connections\n\n    def get_skip_channels(self):\n        return self.skip_connection_channels\n\n    def get_out_channels(self):\n        return self.out_channels\n\n\nclass Decoder(nn.Layer):\n    \"\"\"The Decoder implementation of FC-HardDNet 70.\n\n    Args:\n        n_blocks (int): The number of blocks in the Encoder module.\n        in_channels (int): The number of input channels.\n        skip_connection_channels (tuple|list): The channels of shortcut layers in encoder.\n        grmul (float): The channel multiplying factor in HarDBlock, which is m in the paper.\n        gr (tuple|list): The growth rate in each HarDBlock, which is k in the paper.\n        n_layers (tuple|list): The number of layers in each HarDBlock.\n    \"\"\"\n\n    def __init__(self,\n                 n_blocks: int,\n                 in_channels: int,\n                 skip_connection_channels: List[paddle.Tensor],\n                 gr: List[int],\n                 grmul: float,\n                 n_layers: 
List[int],\n                 align_corners: bool = False):\n        super().__init__()\n        prev_block_channels = in_channels\n        self.n_blocks = n_blocks\n        self.dense_blocks_up = nn.LayerList()\n        self.conv1x1_up = nn.LayerList()\n\n        for i in range(n_blocks - 1, -1, -1):\n            cur_channels_count = prev_block_channels + skip_connection_channels[i]\n            conv1x1 = layers.ConvBNReLU(cur_channels_count, cur_channels_count // 2, kernel_size=1, bias_attr=False)\n            blk = HarDBlock(base_channels=cur_channels_count // 2, growth_rate=gr[i], grmul=grmul, n_layers=n_layers[i])\n\n            self.conv1x1_up.append(conv1x1)\n            self.dense_blocks_up.append(blk)\n\n            prev_block_channels = blk.get_out_ch()\n\n        self.out_channels = prev_block_channels\n        self.align_corners = align_corners\n\n    def forward(self, x: paddle.Tensor, skip_connections: List[paddle.Tensor]) -> paddle.Tensor:\n        for i in range(self.n_blocks):\n            skip = skip_connections.pop()\n            x = F.interpolate(x, size=paddle.shape(skip)[2:], mode=\"bilinear\", align_corners=self.align_corners)\n            x = paddle.concat([x, skip], axis=1)\n            x = self.conv1x1_up[i](x)\n            x = self.dense_blocks_up[i](x)\n        return x\n\n    def get_out_channels(self):\n        return self.out_channels\n\n\nclass HarDBlock(nn.Layer):\n    \"\"\"The HarDBlock implementation\n\n    Args:\n        base_channels (int): The base channels.\n        growth_rate (tuple|list): The growth rate.\n        grmul (float): The channel multiplying factor.\n        n_layers (tuple|list): The number of layers.\n        keepBase (bool, optional): A bool value indicates whether concatenating the first layer. 
Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 base_channels: int,\n                 growth_rate: List[int],\n                 grmul: float,\n                 n_layers: List[int],\n                 keepBase: bool = False):\n        super().__init__()\n        self.keepBase = keepBase\n        self.links = []\n        layers_ = []\n        self.out_channels = 0\n        for i in range(n_layers):\n            outch, inch, link = get_link(i + 1, base_channels, growth_rate, grmul)\n\n            self.links.append(link)\n            layers_.append(layers.ConvBNReLU(inch, outch, kernel_size=3, bias_attr=False))\n            if (i % 2 == 0) or (i == n_layers - 1):\n                self.out_channels += outch\n        self.layers = nn.LayerList(layers_)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        layers_ = [x]\n        for layer in range(len(self.layers)):\n            link = self.links[layer]\n            tin = []\n            for i in link:\n                tin.append(layers_[i])\n            if len(tin) > 1:\n                x = paddle.concat(tin, axis=1)\n            else:\n                x = tin[0]\n            out = self.layers[layer](x)\n            layers_.append(out)\n\n        t = len(layers_)\n        out_ = []\n        for i in range(t):\n            if (i == 0 and self.keepBase) or \\\n                (i == t - 1) or (i % 2 == 1):\n                out_.append(layers_[i])\n        out = paddle.concat(out_, 1)\n\n        return out\n\n    def get_out_ch(self):\n        return self.out_channels\n\n\ndef get_link(layer: int, base_ch: int, growth_rate: List[int], grmul: float) -> Tuple:\n    if layer == 0:\n        return base_ch, 0, []\n    out_channels = growth_rate\n    link = []\n    for i in range(10):\n        dv = 2**i\n        if layer % dv == 0:\n            k = layer - dv\n            link.insert(0, k)\n            if i > 0:\n                out_channels *= grmul\n    out_channels = int(int(out_channels + 
1) / 2) * 2\n    in_channels = 0\n    for i in link:\n        ch, _, _ = get_link(i, base_ch, growth_rate, grmul)\n        in_channels += ch\n    return out_channels, in_channels, link\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/README.md",
    "content": "# humanseg_lite\n\n|模型名称|humanseg_lite|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|shufflenet|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|541k|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130916087-7d537ad9-bbc8-4bce-9382-8eb132b35532.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n- ### 模型介绍\n\n    - HumanSeg_lite是在ShuffleNetV2网络结构的基础上进行优化，进一步减小了网络规模，网络大小只有541K，量化后只有187K， 适用于手机自拍人像分割，且能在移动端进行实时分割。\n\n    - 更多详情请参考：[humanseg_lite](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install humanseg_lite\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n    ```\n    hub run humanseg_lite --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n    - 图片分割及视频分割代码示例：\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_lite')\n    im = cv2.imread('/PATH/TO/IMAGE')\n    #visualization=True可以用于查看人像分割图片效果，可设置为False提升运行速度。\n    res = human_seg.segment(images=[im],visualization=True)\n    print(res[0]['data'])\n    human_seg.video_segment('/PATH/TO/VIDEO')\n    ```\n    - 视频流预测代码示例：\n\n    ```python\n    import cv2\n    import numpy as np\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_lite')\n    cap_video = cv2.VideoCapture('\\PATH\\TO\\VIDEO')\n    fps = cap_video.get(cv2.CAP_PROP_FPS)\n    
save_path = 'humanseg_lite_video.avi'\n    width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n    height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n    cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n    prev_gray = None\n    prev_cfd = None\n    while cap_video.isOpened():\n        ret, frame_org = cap_video.read()\n        if ret:\n            [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=cap_video.get(1), prev_gray=prev_gray, prev_cfd=prev_cfd)\n            img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n            bg_im = np.ones_like(img_matting) * 255\n            comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n            cap_out.write(comb)\n        else:\n            break\n\n    cap_video.release()\n    cap_out.release()\n\n    ```\n\n- ### 3、API\n\n    ```python\n    def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_lite_output')\n    ```\n\n    - 预测API，用于人像分割。\n\n    - **参数**\n\n        * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n        * paths (list\\[str\\]): 图片的路径；\n        * batch\\_size (int): batch 的大小；\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * visualization (bool): 是否将识别结果保存为图片文件；\n        * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n        * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n            * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n            * data (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-255 (0为全透明，255为不透明)，也即取值越大的像素点越可能为人体，取值越小的像素点越可能为背景。\n\n\n    ```python\n    def video_stream_segment(self,\n                            frame_org,\n                            
frame_id,\n                            prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n    ```\n\n    -  预测API，用于逐帧对视频人像分割。\n\n    - **参数**\n\n        * frame_org (numpy.ndarray): 单帧图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n        * frame_id (int): 当前帧的编号；\n        * prev_gray (numpy.ndarray): 前一帧输入网络图像的灰度图；\n        * prev_cfd (numpy.ndarray): 前一帧光流追踪图和预测结果融合图\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n\n\n    -  **返回**\n\n        * img_matting (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-1 (0为全透明，1为不透明)。\n        * cur_gray (numpy.ndarray): 当前帧输入网络图像的灰度图；\n        * optflow_map (numpy.ndarray): 当前帧光流追踪图和预测结果融合图\n\n\n    ```python\n    def video_segment(self,\n                      video_path=None,\n                      use_gpu=False,\n                      save_dir='humanseg_lite_video_result'):\n    ```\n\n    -  预测API，用于视频人像分割。\n\n    - **参数**\n\n        * video\\_path (str): 待分割视频路径。若为None，则从本地摄像头获取视频，并弹出窗口显示在线分割结果。\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * save\\_dir (str): 视频保存路径，仅在video\\_path不为None时启用，保存离线视频处理结果。\n\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n    -  将模型保存到指定路径。\n\n    - **参数**\n        * dirname: 模型保存路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个人像分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    ```shell\n    $ hub serving start -m humanseg_lite\n    ```\n\n    - 这样就完成了一个人像分割的服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def 
base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/humanseg_lite\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        # 保存图片\n        mask = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n        rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n        cv2.imwrite(\"segment_human_lite.png\", rgba)\n        ```\n\n- ### Gradio APP 支持\n\n    从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/humanseg_lite 在浏览器中访问 humanseg_lite 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n    初始发布\n\n* 1.1.0\n\n    新增视频人像分割接口\n\n    新增视频流人像分割接口\n\n* 1.1.1\n\n    修复cudnn 8.0.4版本下的显存泄露问题\n\n* 1.2.0\n\n    移除 Fluid API\n\n* 1.3.0\n\n    添加 Gradio APP 支持\n\n    ```shell\n    $ hub install humanseg_lite==1.3.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/README_en.md",
    "content": "# humanseg_lite\n\n|Module Name |humanseg_lite|\n| :--- | :---: |\n|Category |Image segmentation|\n|Network|shufflenet|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|541k|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130916087-7d537ad9-bbc8-4bce-9382-8eb132b35532.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - HumanSeg_lite is an optimized, slimmed-down variant of the ShuffleNetV2 network. The model size is only 541K (187K after quantization), making it suitable for selfie portrait segmentation and real-time segmentation on mobile devices.\n\n    - For more information, please refer to: [humanseg_lite](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install humanseg_lite\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    - ```shell\n      hub run humanseg_lite --input_path \"/PATH/TO/IMAGE\"\n      ```\n\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n    - Image segmentation and video segmentation example:\n      - ```python\n        import cv2\n        import paddlehub as hub\n\n        human_seg = hub.Module(name='humanseg_lite')\n        im = cv2.imread('/PATH/TO/IMAGE')\n        res = human_seg.segment(images=[im], visualization=True)\n        print(res[0]['data'])\n        human_seg.video_segment('/PATH/TO/VIDEO')\n        ```\n    - Video prediction example:\n\n      - ```python\n        import cv2\n        import numpy as np\n        import paddlehub as hub\n\n        human_seg = hub.Module(name='humanseg_lite')\n        cap_video = cv2.VideoCapture('/PATH/TO/VIDEO')\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n        save_path = 'humanseg_lite_video.avi'\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n        prev_gray = None\n        prev_cfd = None\n        while cap_video.isOpened():\n            ret, frame_org = cap_video.read()\n            if ret:\n                [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=int(cap_video.get(cv2.CAP_PROP_POS_FRAMES)), prev_gray=prev_gray, prev_cfd=prev_cfd)\n                img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                bg_im = np.ones_like(img_matting) * 255\n                comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                cap_out.write(comb)\n            else:\n                break\n\n        cap_video.release()\n       
 cap_out.release()\n\n        ```\n\n- ### 3、API\n\n    - ```python\n      def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_lite_output')\n      ```\n\n        - Prediction API, generating segmentation result.\n\n        - **Parameter**\n\n            * images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR.\n            * paths (list\\[str\\]): image path.\n            * batch\\_size (int): batch size.\n            * use\\_gpu (bool): use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n            * visualization (bool): Whether to save the results as picture files.\n            * output\\_dir (str): save path of images, humanseg_lite_output by default.\n\n        - **Return**\n\n            * res (list\\[dict\\]): The list of recognition results, where each element is dict and each field is:\n                * save\\_path (str, optional): Save path of the result.\n                * data (numpy.ndarray): The result of portrait segmentation.\n\n    - ```python\n      def video_stream_segment(self,\n                            frame_org,\n                            frame_id,\n                            prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n      ```\n        -  Prediction API, used to segment video portraits frame by frame.\n\n        - **Parameter**\n\n            * frame_org (numpy.ndarray): single frame for prediction，ndarray.shape is in the format [H, W, C], BGR.\n            * frame_id (int): The number of the current frame.\n            * prev_gray (numpy.ndarray): Grayscale image of the previous network input.\n            * prev_cfd (numpy.ndarray): The fusion image from optical flow and the prediction result from previous frame.\n            * use\\_gpu (bool): use GPU or 
not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n\n\n        - **Return**\n\n            * img_matting (numpy.ndarray): The result of portrait segmentation.\n            * cur_gray (numpy.ndarray): Grayscale image of the current network input.\n            * optflow_map (numpy.ndarray): The fusion image from optical flow and the prediction result from current frame.\n\n\n    - ```python\n      def video_segment(self,\n                        video_path=None,\n                        use_gpu=False,\n                        save_dir='humanseg_lite_video_result'):\n      ```\n\n        -  Prediction API to produce video segmentation result.\n\n        - **Parameter**\n\n            * video\_path (str): Video path for segmentation. If None, the video will be obtained from the local camera, and a window will display the online segmentation result.\n            * use\_gpu (bool): use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n            * save\_dir (str): save path of video.\n\n\n    -  ```python\n       def save_inference_model(dirname)\n       ```\n\n\n        - Save the model to the specified path.\n\n        - **Parameters**\n\n            * dirname: Model save path.\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service for human segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          hub serving start -m humanseg_lite\n          ```\n\n    - The serving API is now deployed, and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        -   ```python\n            import requests\n            import json\n            import base64\n\n            import cv2\n            import numpy as np\n\n            def cv2_to_base64(image):\n                data = cv2.imencode('.jpg', image)[1]\n                return base64.b64encode(data.tobytes()).decode('utf8')\n\n            def base64_to_cv2(b64str):\n                data = base64.b64decode(b64str.encode('utf8'))\n                data = np.frombuffer(data, np.uint8)\n                data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n                return data\n\n            # Send an HTTP request\n            org_im = cv2.imread('/PATH/TO/IMAGE')\n            data = {'images':[cv2_to_base64(org_im)]}\n            headers = {\"Content-type\": \"application/json\"}\n            url = \"http://127.0.0.1:8866/predict/humanseg_lite\"\n            r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n            # Save the result image\n            mask = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n            rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n            cv2.imwrite(\"segment_human_lite.png\", rgba)\n            ```\n\n- ### Gradio APP support\n\n    Starting with PaddleHub 2.3.1, the Gradio APP for humanseg_lite can be accessed in the browser using the 
link http://127.0.0.1:8866/gradio/humanseg_lite.\n\n## V. Release Note\n\n* 1.0.0\n\n    First release\n\n* 1.1.0\n\n    Added video portrait segmentation interface\n\n    Added video stream portrait segmentation interface\n\n* 1.1.1\n\n    Fix memory leak on cuDNN 8.0.4\n\n* 1.2.0\n\n    Remove Fluid API\n\n* 1.3.0\n\n    Add Gradio APP support.\n\n    ```shell\n    $ hub install humanseg_lite==1.3.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/data_feed.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader', 'preprocess_v']\n\n\ndef preprocess_v(img, w, h):\n    img = cv2.resize(img, (w, h), interpolation=cv2.INTER_LINEAR).astype(np.float32)\n    img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img = img.transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert isinstance(images, list), \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.resize(img, (192, 192)).astype(np.float32)\n        img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img = img.transpose((2, 0, 1)) / 255\n        img -= img_mean\n        img /= img_std\n        element['image'] = img\n        yield element\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport os.path as osp\n\nimport cv2\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import preprocess_v\nfrom .data_feed import reader\nfrom .optimal import postprocess_v\nfrom .optimal import threshold_mask\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"humanseg_lite\",\n            type=\"CV/semantic_segmentation\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"humanseg_lite is a semantic segmentation model.\",\n            version=\"1.3.0\")\nclass ShufflenetHumanSeg:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"humanseg_lite_inference\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + 
'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n\n            if paddle.get_cudnn_version() == 8004:\n                gpu_config.delete_pass('conv_elementwise_add_act_fuse_pass')\n                gpu_config.delete_pass('conv_elementwise_add2_act_fuse_pass')\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def segment(self,\n                images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_lite_output'):\n        \"\"\"\n        API for human segmentation.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. 
(Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[1])\n            output = output_handle.copy_to_cpu()\n\n            output = np.expand_dims(output[:, 1, :, :], axis=1)\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(data_out=output[i],\n                                  org_im=batch_data[i]['org_im'],\n                                  org_im_shape=batch_data[i]['org_im_shape'],\n           
                       org_im_path=batch_data[i]['org_im_path'],\n                                  output_dir=output_dir,\n                                  visualization=visualization)\n                res.append(out)\n        return res\n\n    def video_stream_segment(self, frame_org, frame_id, prev_gray, prev_cfd, use_gpu=False):\n        \"\"\"\n        API for human video segmentation.\n\n        Args:\n           frame_org (numpy.ndarray): frame data, shape of each is [H, W, C], the color space is BGR.\n           frame_id (int): index of the frame to be decoded.\n           prev_gray (numpy.ndarray): gray scale image of last frame, shape of each is [H, W]\n           prev_cfd (numpy.ndarray): fusion image from optical flow image and segment result, shape of each is [H, W]\n           use_gpu (bool): Whether to use gpu.\n\n        Returns:\n            img_matting (numpy.ndarray): data of segmentation mask.\n            cur_gray (numpy.ndarray): gray scale image of current frame, shape of each is [H, W]\n            optflow_map (numpy.ndarray): optical flow image of current frame, shape of each is [H, W]\n\n        \"\"\"\n        resize_h = 192\n        resize_w = 192\n        is_init = True\n        width = int(frame_org.shape[0])\n        height = int(frame_org.shape[1])\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        frame = preprocess_v(frame_org, resize_w, resize_h)\n\n        predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n        input_names = predictor.get_input_names()\n        input_handle = predictor.get_input_handle(input_names[0])\n        input_handle.copy_from_cpu(frame.copy()[None, ...])\n        predictor.run()\n        output_names = predictor.get_output_names()\n        output_handle = predictor.get_output_handle(output_names[1])\n        score_map = output_handle.copy_to_cpu()\n\n        frame = np.transpose(frame, axes=[1, 2, 0])\n        score_map = 
np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n        cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n        cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n        score_map = 255 * score_map[:, :, 1]\n        if frame_id == 1:\n            # First frame: initialize the fusion map directly from the score map.\n            prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n            prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n        else:\n            # Subsequent frames: fuse with the previous frame via optical flow instead of re-initializing.\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, False)\n\n        optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n        optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n        img_matting = cv2.resize(optflow_map, (height, width), interpolation=cv2.INTER_LINEAR)\n\n        return [img_matting, cur_gray, optflow_map]\n\n    def video_segment(self, video_path=None, use_gpu=False, save_dir='humanseg_lite_video_result'):\n        \"\"\"\n        API for human video segmentation.\n\n        Args:\n           video_path (str): The path of the video to be processed. If video_path is None, the video will be\n           captured from your camera.\n           use_gpu (bool): Whether to use gpu.\n           save_dir (str): The path to store output video.\n\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
\"\n                                   \"If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\")\n\n        resize_h = 192\n        resize_w = 192\n        if not video_path:\n            cap_video = cv2.VideoCapture(0)\n        else:\n            cap_video = cv2.VideoCapture(video_path)\n\n        if not cap_video.isOpened():\n            raise IOError(\"Error opening video stream or file: \"\n                          \"please check whether --video_path {} exists \"\n                          \"or the camera is working.\".format(video_path))\n\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n        prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n        is_init = True\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n\n        if video_path is not None:\n            print('Please wait while the video is being processed...')\n            if not osp.exists(save_dir):\n                os.makedirs(save_dir)\n            save_path = osp.join(save_dir, 'result' + '.avi')\n            cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), interpolation=cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = 
np.ones_like(img_matting) * 255\n                    comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cap_out.write(comb)\n                else:\n                    break\n            cap_video.release()\n            cap_out.release()\n        else:\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = 
np.ones_like(img_matting) * 255\n                    comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cv2.imshow('HumanSegmentation', comb)\n                    if cv2.waitKey(1) & 0xFF == ord('q'):\n                        break\n                else:\n                    break\n            cap_video.release()\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.segment(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.segment(paths=[args.input_path],\n                               batch_size=args.batch_size,\n                               use_gpu=args.use_gpu,\n                               output_dir=args.output_dir,\n                               visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='humanseg_lite_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='humanseg_lite_model',\n                                           help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        
self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.segment(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='humanseg_lite')\n        return interface\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/optimal.py",
    "content": "# -*- coding:utf-8 -*\nimport numpy as np\n\n\ndef human_seg_tracking(pre_gray, cur_gray, prev_cfd, dl_weights, disflow):\n    \"\"\"计算光流跟踪匹配点和光流图\n    输入参数:\n        pre_gray: 上一帧灰度图\n        cur_gray: 当前帧灰度图\n        prev_cfd: 上一帧光流图\n        dl_weights: 融合权重图\n        disflow: 光流数据结构\n    返回值:\n        is_track: 光流点跟踪二值图，即是否具有光流点匹配\n        track_cfd: 光流跟踪图\n    \"\"\"\n    check_thres = 8\n    h, w = pre_gray.shape[:2]\n    track_cfd = np.zeros_like(prev_cfd)\n    is_track = np.zeros_like(pre_gray)\n    flow_fw = disflow.calc(pre_gray, cur_gray, None)\n    flow_bw = disflow.calc(cur_gray, pre_gray, None)\n    flow_fw = np.round(flow_fw).astype(np.int)\n    flow_bw = np.round(flow_bw).astype(np.int)\n    y_list = np.array(range(h))\n    x_list = np.array(range(w))\n    yv, xv = np.meshgrid(y_list, x_list)\n    yv, xv = yv.T, xv.T\n    cur_x = xv + flow_fw[:, :, 0]\n    cur_y = yv + flow_fw[:, :, 1]\n\n    # 超出边界不跟踪\n    not_track = (cur_x < 0) + (cur_x >= w) + (cur_y < 0) + (cur_y >= h)\n    flow_bw[~not_track] = flow_bw[cur_y[~not_track], cur_x[~not_track]]\n    not_track += (np.square(flow_fw[:, :, 0] + flow_bw[:, :, 0]) +\n                  np.square(flow_fw[:, :, 1] + flow_bw[:, :, 1])) >= check_thres\n    track_cfd[cur_y[~not_track], cur_x[~not_track]] = prev_cfd[~not_track]\n\n    is_track[cur_y[~not_track], cur_x[~not_track]] = 1\n\n    not_flow = np.all(np.abs(flow_fw) == 0, axis=-1) * np.all(np.abs(flow_bw) == 0, axis=-1)\n    dl_weights[cur_y[not_flow], cur_x[not_flow]] = 0.05\n    return track_cfd, is_track, dl_weights\n\n\ndef human_seg_track_fuse(track_cfd, dl_cfd, dl_weights, is_track):\n    \"\"\"光流追踪图和人像分割结构融合\n    输入参数:\n        track_cfd: 光流追踪图\n        dl_cfd: 当前帧分割结果\n        dl_weights: 融合权重图\n        is_track: 光流点匹配二值图\n    返回\n        cur_cfd: 光流跟踪图和人像分割结果融合图\n    \"\"\"\n    fusion_cfd = dl_cfd.copy()\n    is_track = is_track.astype(np.bool)\n    fusion_cfd[is_track] = dl_weights[is_track] * dl_cfd[is_track] + (1 - 
dl_weights[is_track]) * track_cfd[is_track]\n    # 确定区域\n    index_certain = ((dl_cfd > 0.9) + (dl_cfd < 0.1)) * is_track\n    index_less01 = (dl_weights < 0.1) * index_certain\n    fusion_cfd[index_less01] = 0.3 * dl_cfd[index_less01] + 0.7 * track_cfd[index_less01]\n    index_larger09 = (dl_weights >= 0.1) * index_certain\n    fusion_cfd[index_larger09] = 0.4 * dl_cfd[index_larger09] + 0.6 * track_cfd[index_larger09]\n    return fusion_cfd\n\n\ndef threshold_mask(img, thresh_bg, thresh_fg):\n    dst = (img / 255.0 - thresh_bg) / (thresh_fg - thresh_bg)\n    dst[np.where(dst > 1)] = 1\n    dst[np.where(dst < 0)] = 0\n    return dst.astype(np.float32)\n\n\ndef postprocess_v(cur_gray, scoremap, prev_gray, pre_cfd, disflow, is_init):\n    \"\"\"光流优化\n    Args:\n        cur_gray : 当前帧灰度图\n        pre_gray : 前一帧灰度图\n        pre_cfd  ：前一帧融合结果\n        scoremap : 当前帧分割结果\n        difflow  : 光流\n        is_init : 是否第一帧\n    Returns:\n        fusion_cfd : 光流追踪图和预测结果融合图\n    \"\"\"\n    h, w = scoremap.shape\n    cur_cfd = scoremap.copy()\n\n    if is_init:\n        if h <= 64 or w <= 64:\n            disflow.setFinestScale(1)\n        elif h <= 160 or w <= 160:\n            disflow.setFinestScale(2)\n        else:\n            disflow.setFinestScale(3)\n        fusion_cfd = cur_cfd\n    else:\n        weights = np.ones((h, w), np.float32) * 0.3\n        track_cfd, is_track, weights = human_seg_tracking(prev_gray, cur_gray, pre_cfd, weights, disflow)\n        fusion_cfd = human_seg_track_fuse(track_cfd, cur_cfd, weights, is_track)\n\n    return fusion_cfd\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape pf original image.\n        org_im_path (list): path of riginal image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for logit in data_out:\n        logit = (logit * 255).astype(np.uint8)\n        logit = cv2.resize(logit, (org_im_shape[1], org_im_shape[0]))\n        rgba = np.concatenate((org_im, np.expand_dims(logit, axis=2)), axis=2)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, rgba)\n            result['save_path'] = save_im_path\n            result['data'] = logit\n        else:\n            result['data'] = logit\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from source image 
path.\n    \"\"\"\n    # name prefix of orginal image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_lite/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/pg_WCHWSdT8/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYyNDM2ODI4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        fourcc = cv2.VideoWriter_fourcc('M', 'J', 'P', 'G')\n        img = cv2.imread('tests/test.jpg')\n        video = cv2.VideoWriter('tests/test.avi', fourcc, 20.0, tuple(img.shape[:2]))\n        for i in range(40):\n            video.write(img)\n        video.release()\n        cls.module = hub.Module(name=\"humanseg_lite\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('humanseg_lite_output')\n        shutil.rmtree('humanseg_lite_video_result')\n\n    def test_segment1(self):\n        results = self.module.segment(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment2(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment3(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment4(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=True, 
visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment5(self):\n        self.assertRaises(AssertionError, self.module.segment, paths=['no.jpg'])\n\n    def test_segment6(self):\n        self.assertRaises(AttributeError, self.module.segment, images=['test.jpg'])\n\n    def test_video_stream_segment1(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                                              prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_stream_segment2(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                         
                     prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_segment1(self):\n        self.module.video_segment(video_path=\"tests/test.avi\", use_gpu=False, save_dir='humanseg_lite_video_result')\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/README.md",
    "content": "# humanseg_mobile\n\n|模型名称|humanseg_mobile|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|hrnet|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|5.8MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130914325-3795e241-b611-46a1-aa70-ffc47326c86a.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - HumanSeg-mobile采用了HRNet_w18_small_v1的网络结构，模型大小只有5.8M， 适用于移动端或服务端CPU的前置摄像头场景。\n\n    - 更多详情请参考：[humanseg_mobile](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install humanseg_mobile\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n    ```\n    hub run humanseg_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n    - 图片分割及视频分割代码示例：\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_mobile')\n    im = cv2.imread('/PATH/TO/IMAGE')\n    #visualization=True可以用于查看人像分割图片效果，可设置为False提升运行速度。\n    res = human_seg.segment(images=[im],visualization=True)\n    print(res[0]['data'])\n    human_seg.video_segment('/PATH/TO/VIDEO')\n    ```\n    - 视频流预测代码示例：\n\n    ```python\n    import cv2\n    import numpy as np\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_mobile')\n    cap_video = cv2.VideoCapture('\\PATH\\TO\\VIDEO')\n    fps = cap_video.get(cv2.CAP_PROP_FPS)\n    
save_path = 'humanseg_mobile_video.avi'\n    width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n    height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n    cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n    prev_gray = None\n    prev_cfd = None\n    while cap_video.isOpened():\n        ret, frame_org = cap_video.read()\n        if ret:\n            [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=cap_video.get(1), prev_gray=prev_gray, prev_cfd=prev_cfd)\n            img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n            bg_im = np.ones_like(img_matting) * 255\n            comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n            cap_out.write(comb)\n        else:\n            break\n\n    cap_video.release()\n    cap_out.release()\n\n    ```\n\n- ### 3、API\n\n    ```python\n    def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_mobile_output')\n    ```\n\n    - 预测API，用于人像分割。\n\n    - **参数**\n\n        * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n        * paths (list\\[str\\]): 图片的路径；\n        * batch\\_size (int): batch 的大小；\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * visualization (bool): 是否将识别结果保存为图片文件；\n        * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n        * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n            * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n            * data (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-255 (0为全透明，255为不透明)，也即取值越大的像素点越可能为人体，取值越小的像素点越可能为背景。\n\n\n    ```python\n    def video_stream_segment(self,\n                            frame_org,\n                            
frame_id,\n                            prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n    ```\n\n    -  预测API，用于逐帧对视频人像分割。\n\n    - **参数**\n\n        * frame_org (numpy.ndarray): 单帧图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n        * frame_id (int): 当前帧的编号；\n        * prev_gray (numpy.ndarray): 前一帧输入网络图像的灰度图；\n        * prev_cfd (numpy.ndarray): 前一帧光流追踪图和预测结果融合图\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n\n\n    -  **返回**\n\n        * img_matting (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-1 (0为全透明，1为不透明)。\n        * cur_gray (numpy.ndarray): 当前帧输入网络图像的灰度图；\n        * optflow_map (numpy.ndarray): 当前帧光流追踪图和预测结果融合图\n\n\n    ```python\n    def video_segment(self,\n                      video_path=None,\n                      use_gpu=False,\n                      save_dir='humanseg_mobile_video_result'):\n    ```\n\n    -  预测API，用于视频人像分割。\n\n    - **参数**\n\n        * video\\_path (str): 待分割视频路径。若为None，则从本地摄像头获取视频，并弹出窗口显示在线分割结果。\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * save\\_dir (str): 视频保存路径，仅在video\\_path不为None时启用，保存离线视频处理结果。\n\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n    -  将模型保存到指定路径。\n\n    - **参数**\n        * dirname: 模型保存路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个人像分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    ```shell\n    $ hub serving start -m humanseg_mobile\n    ```\n\n    - 这样就完成了一个人像分割的服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tostring()).decode('utf8')\n    def base64_to_cv2(b64str):\n        data = 
base64.b64decode(b64str.encode('utf8'))\n        data = np.fromstring(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {'images':[cv2_to_base64(org_im)]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/humanseg_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存图片\n    mask =cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n    rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n    cv2.imwrite(\"segment_human_mobile.png\", rgba)\n    ```\n\n- ### Gradio APP 支持\n\n    从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/humanseg_mobile 在浏览器中访问 humanseg_mobile 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n    初始发布\n\n* 1.1.0\n\n    新增视频人像分割接口\n\n    新增视频流人像分割接口\n\n* 1.1.1\n\n    修复cudnn为8.0.4显存泄露问题\n\n* 1.2.0\n\n    移除 Fluid API\n\n* 1.3.0\n\n    添加 Gradio APP 支持\n\n    ```shell\n    $ hub install humanseg_mobile == 1.3.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/README_en.md",
    "content": "# humanseg_mobile\n\n|Module Name |humanseg_mobile|\n| :--- | :---: |\n|Category |Image segmentation|\n|Network|hrnet|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|5.8M|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130914325-3795e241-b611-46a1-aa70-ffc47326c86a.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - HumanSeg_mobile is based on HRNet_w18_small_v1 network. The network size is only 5.8M. It is suitable for selfie portrait segmentation and can be segmented in real time on the mobile terminal.\n\n    - For more information, please refer to:[humanseg_mobile](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install humanseg_mobile\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    - ```\n      hub run humanseg_mobile --input_path \"/PATH/TO/IMAGE\"\n\n      ```\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n    - Image segmentation and video segmentation example：\n        ```python\n        import cv2\n        import paddlehub as hub\n\n        human_seg = hub.Module(name='humanseg_mobile')\n        im = cv2.imread('/PATH/TO/IMAGE')\n        res = human_seg.segment(images=[im],visualization=True)\n        print(res[0]['data'])\n        human_seg.video_segment('/PATH/TO/VIDEO')\n        ```\n    - Video prediction example:\n\n        ```python\n        import cv2\n        import numpy as np\n        import paddlehub as hub\n\n        human_seg = hub.Module('humanseg_mobile')\n        cap_video = cv2.VideoCapture('\\PATH\\TO\\VIDEO')\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n        save_path = 'humanseg_mobile_video.avi'\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n        prev_gray = None\n        prev_cfd = None\n        while cap_video.isOpened():\n            ret, frame_org = cap_video.read()\n            if ret:\n                [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=cap_video.get(1), prev_gray=prev_gray, prev_cfd=prev_cfd)\n                img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                bg_im = np.ones_like(img_matting) * 255\n                comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                cap_out.write(comb)\n            else:\n                break\n\n        
cap_video.release()\n        cap_out.release()\n\n        ```\n\n- ### 3、API\n\n    ```python\n    def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_mobile_output')\n    ```\n\n    - Prediction API, generating segmentation result.\n\n    - **Parameter**\n\n        * images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\\[str\\]): image path.\n        * batch\\_size (int): batch size.\n        * use\\_gpu (bool): use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n        * visualization (bool): Whether to save the results as picture files.\n        * output\\_dir (str): save path of images, humanseg_mobile_output by default.\n\n    - **Return**\n\n        * res (list\\[dict\\]): The list of recognition results, where each element is dict and each field is:\n            * save\\_path (str, optional): Save path of the result.\n            * data (numpy.ndarray): The result of portrait segmentation.\n\n    ```python\n    def video_stream_segment(self,\n                            frame_org,\n                            frame_id,\n                            prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n    ```\n\n    -  Prediction API, used to segment video portraits frame by frame.\n\n    - **Parameter**\n\n        * frame_org (numpy.ndarray): single frame for prediction，ndarray.shape is in the format [H, W, C], BGR.\n        * frame_id (int): The number of the current frame.\n        * prev_gray (numpy.ndarray): Grayscale image of the previous network input.\n        * prev_cfd (numpy.ndarray): The fusion image from optical flow and the prediction result from previous frame.\n        * use\\_gpu (bool): Use GPU or not. 
**set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n\n\n    - **Return**\n\n        * img_matting (numpy.ndarray): The result of portrait segmentation.\n        * cur_gray (numpy.ndarray): Grayscale image of the current network input.\n        * optflow_map (numpy.ndarray): The fusion image from optical flow and the prediction result from current frame.\n\n\n    ```python\n    def video_segment(self,\n                      video_path=None,\n                      use_gpu=False,\n                      save_dir='humanseg_mobile_video_result'):\n    ```\n\n    -  Prediction API to produce video segmentation result.\n\n    - **Parameter**\n\n        * video\\_path (str): Video path for segmentation。If None, the video will be obtained from the local camera, and a window will display the online segmentation result.\n        * use\\_gpu (bool): Use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n        * save\\_dir (str): save path of video.\n\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service of for human segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        -  ```shell\n           $ hub serving start -m humanseg_mobile\n           ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tostring()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.fromstring(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/humanseg_mobile\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        mask =cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n        rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n        cv2.imwrite(\"segment_human_mobile.png\", rgba)\n        ```\n\n- ### Gradio APP support\n   Starting with PaddleHub 2.3.1, the Gradio APP for humanseg_mobile is supported to be accessed in the browser using the link http://127.0.0.1:8866/gradio/humanseg_mobile.\n\n## V. 
Release Note\n\n- 1.0.0\n\n    First release\n\n- 1.1.0\n\n    Added video portrait split interface\n\n    Added video stream portrait segmentation interface\n\n* 1.1.1\n\n    Fix the video memory leakage problem of on cudnn 8.0.4\n\n* 1.2.0\n\n    Remove Fluid API\n\n* 1.3.0\n\n    Add Gradio APP support.\n\n    ```shell\n    $ hub install humanseg_mobile == 1.3.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/data_feed.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader', 'preprocess_v']\n\n\ndef preprocess_v(img, w, h):\n    img = cv2.resize(img, (w, h), cv2.INTER_LINEAR).astype(np.float32)\n    img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img = img.transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            #print(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.resize(img, (192, 192)).astype(np.float32)\n        img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img = img.transpose((2, 0, 1)) / 255\n        img -= img_mean\n        img /= img_std\n    
    element['image'] = img\n        yield element\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport os.path as osp\n\nimport cv2\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import preprocess_v\nfrom .data_feed import reader\nfrom .optimal import postprocess_v\nfrom .optimal import threshold_mask\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"humanseg_mobile\",\n            type=\"CV/semantic_segmentation\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"HRNet_w18_samll_v1 is a semantic segmentation model.\",\n            version=\"1.3.0\")\nclass HRNetw18samllv1humanseg:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"humanseg_mobile_inference\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = 
self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n\n            if paddle.get_cudnn_version() == 8004:\n                gpu_config.delete_pass('conv_elementwise_add_act_fuse_pass')\n                gpu_config.delete_pass('conv_elementwise_add2_act_fuse_pass')\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def segment(self,\n                images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_mobile_output'):\n        \"\"\"\n        API for human segmentation.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. 
(Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. \"\n                                   \"If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\")\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[1])\n            output = output_handle.copy_to_cpu()\n\n            output = np.expand_dims(output[:, 1, :, :], axis=1)\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(data_out=output[i],\n                                  org_im=batch_data[i]['org_im'],\n                                  org_im_shape=batch_data[i]['org_im_shape'],\n                                  org_im_path=batch_data[i]['org_im_path'],\n                                  output_dir=output_dir,\n                                  visualization=visualization)\n                res.append(out)\n        return res\n\n    def video_stream_segment(self, frame_org, frame_id, prev_gray, prev_cfd, use_gpu=False):\n        \"\"\"\n        API for human video segmentation.\n\n        Args:\n           frame_org (numpy.ndarray): frame data, shape of each is [H, W, C], the color space is BGR.\n           frame_id (int): index of the frame to be decoded.\n           prev_gray (numpy.ndarray): gray scale image of last frame, shape of each is [H, W]\n           prev_cfd (numpy.ndarray): fusion image from optical flow image and segment result, shape of each is [H, W]\n           use_gpu (bool): Whether to use gpu.\n\n        Returns:\n            img_matting (numpy.ndarray): data of segmentation mask.\n            cur_gray (numpy.ndarray): gray scale image of current frame, shape of each is [H, W]\n            optflow_map (numpy.ndarray): optical flow image of current frame, shape of each is [H, W]\n\n        \"\"\"\n        resize_h = 192\n        resize_w = 192\n        width = int(frame_org.shape[1])\n        height = int(frame_org.shape[0])\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        frame = preprocess_v(frame_org, resize_w, resize_h)\n\n        predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n        input_names = predictor.get_input_names()\n        input_handle = predictor.get_input_handle(input_names[0])\n        input_handle.copy_from_cpu(frame.copy()[None, ...])\n        predictor.run()\n        output_names = predictor.get_output_names()\n        output_handle = predictor.get_output_handle(output_names[1])\n        score_map = output_handle.copy_to_cpu()\n\n        frame = np.transpose(frame, axes=[1, 2, 0])\n        score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n        cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n        cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n        score_map = 255 * score_map[:, :, 1]\n        if frame_id == 1:\n            prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n            prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, True)\n        else:\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, False)\n        optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n        optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n        img_matting = cv2.resize(optflow_map, (width, height), interpolation=cv2.INTER_LINEAR)\n        return [img_matting, cur_gray, optflow_map]\n\n    def video_segment(self, video_path=None, use_gpu=False, save_dir='humanseg_mobile_video_result'):\n        \"\"\"\n        API for human video segmentation.\n\n        Args:\n           video_path (str): The path of the video to be processed. If video_path is None, the video\n           will be captured from your camera.\n           use_gpu (bool): Whether to use gpu.\n           save_dir (str): The path to store output video.\n\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
\"\n                                   \"If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\")\n\n        resize_h = 192\n        resize_w = 192\n        if not video_path:\n            cap_video = cv2.VideoCapture(0)\n        else:\n            cap_video = cv2.VideoCapture(video_path)\n        if not cap_video.isOpened():\n            raise IOError(\"Error opening video stream or file. \"\n                          \"Please check that --video_path ({}) exists \"\n                          \"or that the camera is working.\".format(video_path))\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n        prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n        is_init = True\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n        if video_path is not None:\n            print('Please wait. Processing...')\n            if not osp.exists(save_dir):\n                os.makedirs(save_dir)\n            save_path = osp.join(save_dir, 'result.avi')\n            cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    is_init = False\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), interpolation=cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = np.ones_like(img_matting) * 255\n                    comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cap_out.write(comb)\n                else:\n                    break\n            cap_video.release()\n            cap_out.release()\n        else:\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    is_init = False\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), interpolation=cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = np.ones_like(img_matting) * 255\n                    comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cv2.imshow('HumanSegmentation', comb)\n                    if cv2.waitKey(1) & 0xFF == ord('q'):\n                        break\n                else:\n                    break\n            cap_video.release()\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.segment(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.segment(paths=[args.input_path],\n                               batch_size=args.batch_size,\n                               use_gpu=args.use_gpu,\n                               output_dir=args.output_dir,\n                               visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='humanseg_mobile_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='humanseg_mobile_model',\n                                           help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        
self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.segment(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='humanseg_mobile')\n        return interface\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/optimal.py",
    "content": "# -*- coding:utf-8 -*-\nimport numpy as np\n\n\ndef human_seg_tracking(pre_gray, cur_gray, prev_cfd, dl_weights, disflow):\n    \"\"\"计算光流跟踪匹配点和光流图\n    输入参数:\n        pre_gray: 上一帧灰度图\n        cur_gray: 当前帧灰度图\n        prev_cfd: 上一帧光流图\n        dl_weights: 融合权重图\n        disflow: 光流数据结构\n    返回值:\n        is_track: 光流点跟踪二值图，即是否具有光流点匹配\n        track_cfd: 光流跟踪图\n    \"\"\"\n    check_thres = 8\n    h, w = pre_gray.shape[:2]\n    track_cfd = np.zeros_like(prev_cfd)\n    is_track = np.zeros_like(pre_gray)\n    flow_fw = disflow.calc(pre_gray, cur_gray, None)\n    flow_bw = disflow.calc(cur_gray, pre_gray, None)\n    flow_fw = np.round(flow_fw).astype(np.int)\n    flow_bw = np.round(flow_bw).astype(np.int)\n    y_list = np.array(range(h))\n    x_list = np.array(range(w))\n    yv, xv = np.meshgrid(y_list, x_list)\n    yv, xv = yv.T, xv.T\n    cur_x = xv + flow_fw[:, :, 0]\n    cur_y = yv + flow_fw[:, :, 1]\n\n    # 超出边界不跟踪\n    not_track = (cur_x < 0) + (cur_x >= w) + (cur_y < 0) + (cur_y >= h)\n    flow_bw[~not_track] = flow_bw[cur_y[~not_track], cur_x[~not_track]]\n    not_track += (np.square(flow_fw[:, :, 0] + flow_bw[:, :, 0]) +\n                  np.square(flow_fw[:, :, 1] + flow_bw[:, :, 1])) >= check_thres\n    track_cfd[cur_y[~not_track], cur_x[~not_track]] = prev_cfd[~not_track]\n\n    is_track[cur_y[~not_track], cur_x[~not_track]] = 1\n\n    not_flow = np.all(np.abs(flow_fw) == 0, axis=-1) * np.all(np.abs(flow_bw) == 0, axis=-1)\n    dl_weights[cur_y[not_flow], cur_x[not_flow]] = 0.05\n    return track_cfd, is_track, dl_weights\n\n\ndef human_seg_track_fuse(track_cfd, dl_cfd, dl_weights, is_track):\n    \"\"\"光流追踪图和人像分割结构融合\n    输入参数:\n        track_cfd: 光流追踪图\n        dl_cfd: 当前帧分割结果\n        dl_weights: 融合权重图\n        is_track: 光流点匹配二值图\n    返回\n        cur_cfd: 光流跟踪图和人像分割结果融合图\n    \"\"\"\n    fusion_cfd = dl_cfd.copy()\n    is_track = is_track.astype(np.bool)\n    fusion_cfd[is_track] = dl_weights[is_track] * dl_cfd[is_track] + (1 - 
dl_weights[is_track]) * track_cfd[is_track]\n    # 确定区域\n    index_certain = ((dl_cfd > 0.9) + (dl_cfd < 0.1)) * is_track\n    index_less01 = (dl_weights < 0.1) * index_certain\n    fusion_cfd[index_less01] = 0.3 * dl_cfd[index_less01] + 0.7 * track_cfd[index_less01]\n    index_larger09 = (dl_weights >= 0.1) * index_certain\n    fusion_cfd[index_larger09] = 0.4 * dl_cfd[index_larger09] + 0.6 * track_cfd[index_larger09]\n    return fusion_cfd\n\n\ndef threshold_mask(img, thresh_bg, thresh_fg):\n    dst = (img / 255.0 - thresh_bg) / (thresh_fg - thresh_bg)\n    dst[np.where(dst > 1)] = 1\n    dst[np.where(dst < 0)] = 0\n    return dst.astype(np.float32)\n\n\ndef postprocess_v(cur_gray, scoremap, prev_gray, pre_cfd, disflow, is_init):\n    \"\"\"光流优化\n    Args:\n        cur_gray : 当前帧灰度图\n        pre_gray : 前一帧灰度图\n        pre_cfd  ：前一帧融合结果\n        scoremap : 当前帧分割结果\n        difflow  : 光流\n        is_init : 是否第一帧\n    Returns:\n        fusion_cfd : 光流追踪图和预测结果融合图\n    \"\"\"\n    h, w = scoremap.shape\n    cur_cfd = scoremap.copy()\n\n    if is_init:\n        if h <= 64 or w <= 64:\n            disflow.setFinestScale(1)\n        elif h <= 160 or w <= 160:\n            disflow.setFinestScale(2)\n        else:\n            disflow.setFinestScale(3)\n        fusion_cfd = cur_cfd\n    else:\n        weights = np.ones((h, w), np.float32) * 0.3\n        track_cfd, is_track, weights = human_seg_tracking(prev_gray, cur_gray, pre_cfd, weights, disflow)\n        fusion_cfd = human_seg_track_fuse(track_cfd, cur_cfd, weights, is_track)\n\n    return fusion_cfd\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization, thresh=120):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape pf original image.\n        org_im_path (list): path of riginal image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n        thresh (float): threshold.\n\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for logit in data_out:\n        logit = (logit * 255).astype(np.uint8)\n        logit = cv2.resize(logit, (org_im_shape[1], org_im_shape[0]))\n        rgba = np.concatenate((org_im, np.expand_dims(logit, axis=2)), axis=2)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, rgba)\n            result['save_path'] = save_im_path\n            result['data'] = logit\n        else:\n            result['data'] = logit\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    
\"\"\"\n    Get save image name from source image path.\n    \"\"\"\n    # name prefix of orginal image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_mobile/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/pg_WCHWSdT8/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYyNDM2ODI4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        fourcc = cv2.VideoWriter_fourcc('M', 'J', 'P', 'G')\n        img = cv2.imread('tests/test.jpg')\n        video = cv2.VideoWriter('tests/test.avi', fourcc, 20.0, tuple(img.shape[:2]))\n        for i in range(40):\n            video.write(img)\n        video.release()\n        cls.module = hub.Module(name=\"humanseg_mobile\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('humanseg_mobile_output')\n        shutil.rmtree('humanseg_mobile_video_result')\n\n    def test_segment1(self):\n        results = self.module.segment(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment2(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment3(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment4(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=True, 
visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment5(self):\n        self.assertRaises(AssertionError, self.module.segment, paths=['no.jpg'])\n\n    def test_segment6(self):\n        self.assertRaises(AttributeError, self.module.segment, images=['test.jpg'])\n\n    def test_video_stream_segment1(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                                              prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_stream_segment2(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                         
                     prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_segment1(self):\n        self.module.video_segment(video_path=\"tests/test.avi\", use_gpu=False)\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/README.md",
    "content": "# humanseg_server\n\n|模型名称|humanseg_server|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|hrnet|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|159MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130915531-bd4b2294-47e4-47e1-b9d3-3c1fa8b90f8f.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - HumanSeg-server使用百度自建数据集进行训练，可用于人像分割，支持任意大小的图片输入。\n\n    - 更多详情请参考：[humanseg_server](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >=2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install humanseg_server\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n    ```\n    hub run humanseg_server --input_path \"/PATH/TO/IMAGE\"\n    ```\n- ### 2、预测代码示例\n\n    - 图片分割及视频分割代码示例：\n\n    ```python\n    import cv2\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_server')\n    im = cv2.imread('/PATH/TO/IMAGE')\n    #visualization=True可以用于查看人像分割图片效果，可设置为False提升运行速度。\n    res = human_seg.segment(images=[im],visualization=True)\n    print(res[0]['data'])\n    human_seg.video_segment('/PATH/TO/VIDEO')\n    ```\n    - 视频流预测代码示例：\n\n    ```python\n    import cv2\n    import numpy as np\n    import paddlehub as hub\n\n    human_seg = hub.Module(name='humanseg_server')\n    cap_video = cv2.VideoCapture('\\PATH\\TO\\VIDEO')\n    fps = cap_video.get(cv2.CAP_PROP_FPS)\n    save_path = 
'humanseg_server_video.avi'\n    width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n    height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n    cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n    prev_gray = None\n    prev_cfd = None\n    while cap_video.isOpened():\n        ret, frame_org = cap_video.read()\n        if ret:\n            [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=cap_video.get(1), prev_gray=prev_gray, prev_cfd=prev_cfd)\n            img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n            bg_im = np.ones_like(img_matting) * 255\n            comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n            cap_out.write(comb)\n        else:\n            break\n\n    cap_video.release()\n    cap_out.release()\n\n    ```\n\n- ### 3、API\n\n    ```python\n    def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_server_output')\n    ```\n\n    - 预测API，用于人像分割。\n\n    - **参数**\n\n        * images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n        * paths (list\\[str\\]): 图片的路径；\n        * batch\\_size (int): batch 的大小；\n        * use\\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * visualization (bool): 是否将识别结果保存为图片文件；\n        * output\\_dir (str): 图片的保存路径。\n\n    - **返回**\n\n        * res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 'save\\_path', 'data'，对应的取值为：\n            * save\\_path (str, optional): 可视化图片的保存路径（仅当visualization=True时存在）；\n            * data (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-255 (0为全透明，255为不透明)，也即取值越大的像素点越可能为人体，取值越小的像素点越可能为背景。\n\n\n    ```python\n    def video_stream_segment(self,\n                            frame_org,\n                            frame_id,\n    
                        prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n    ```\n\n    -  预测API，用于逐帧对视频人像分割。\n\n    - **参数**\n\n        * frame_org (numpy.ndarray): 单帧图片数据，ndarray.shape 为 \[H, W, C\]，BGR格式；\n        * frame_id (int): 当前帧的编号；\n        * prev_gray (numpy.ndarray): 前一帧输入网络图像的灰度图；\n        * prev_cfd (numpy.ndarray): 前一帧光流追踪图和预测结果融合图；\n        * use\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n\n\n    -  **返回**\n\n        * img_matting (numpy.ndarray): 人像分割结果，仅包含Alpha通道，取值为0-1 (0为全透明，1为不透明)；\n        * cur_gray (numpy.ndarray): 当前帧输入网络图像的灰度图；\n        * optflow_map (numpy.ndarray): 当前帧光流追踪图和预测结果融合图。\n\n\n    ```python\n    def video_segment(self,\n                      video_path=None,\n                      use_gpu=False,\n                      save_dir='humanseg_server_video_result'):\n    ```\n\n    -  预测API，用于视频人像分割。\n\n    - **参数**\n\n        * video\_path (str): 待分割视频路径。若为None，则从本地摄像头获取视频，并弹出窗口显示在线分割结果。\n        * use\_gpu (bool): 是否使用 GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置；\n        * save\_dir (str): 视频保存路径，仅在video\_path不为None时启用，保存离线视频处理结果。\n\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n    -  将模型保存到指定路径。\n\n    - **参数**\n        * dirname: 模型保存路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个人像分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    ```shell\n    $ hub serving start -m humanseg_server\n    ```\n\n    - 这样就完成了一个人像分割的服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端之后，通过以下几行代码即可发送预测请求，获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n 
           data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/humanseg_server\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        # 保存图片\n        mask = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n        rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n        cv2.imwrite(\"segment_human_server.png\", rgba)\n        ```\n\n- ### Gradio APP 支持\n\n    从 PaddleHub 2.3.1 开始，支持使用链接 http://127.0.0.1:8866/gradio/humanseg_server 在浏览器中访问 humanseg_server 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n    初始发布\n\n* 1.1.0\n\n    提升预测性能\n\n* 1.1.1\n\n    修复预测后处理图像数据超过\[0, 255\]范围的问题\n\n* 1.2.0\n\n    新增视频人像分割接口\n\n    新增视频流人像分割处理接口\n\n* 1.2.1\n\n    修复cudnn版本为8.0.4时的显存泄露问题\n\n* 1.3.0\n\n    移除 Fluid API\n\n* 1.4.0\n\n    添加 Gradio APP 支持\n\n    ```shell\n    $ hub install humanseg_server == 1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/README_en.md",
    "content": "# humanseg_server\n\n|Module Name|humanseg_server|\n| :--- | :---: |\n|Category|Image segmentation|\n|Network|hrnet|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning supported or not|No|\n|Module Size|159MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information\n\n- ### Application Effect Display\n\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/130913092-312a5f37-842e-4fd0-8db4-5f853fd8419f.jpg\" width = \"337\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/130915531-bd4b2294-47e4-47e1-b9d3-3c1fa8b90f8f.png\" width = \"337\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - The HumanSeg-server model is trained on a Baidu self-built dataset and can be used for portrait segmentation; it supports input images of arbitrary size.\n\n    - For more information, please refer to: [humanseg_server](https://github.com/PaddlePaddle/PaddleSeg/tree/release/2.2/contrib/HumanSeg)\n\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install humanseg_server\n      ```\n\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III. 
Module API Prediction\n\n- ### 1、Command line Prediction\n\n    - ```shell\n        hub run humanseg_server --input_path \"/PATH/TO/IMAGE\"\n        ```\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_en/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n    - Image segmentation and video segmentation example:\n        ```python\n        import cv2\n        import paddlehub as hub\n\n        human_seg = hub.Module(name='humanseg_server')\n        im = cv2.imread('/PATH/TO/IMAGE')\n        res = human_seg.segment(images=[im], visualization=True)\n        print(res[0]['data'])\n        human_seg.video_segment('/PATH/TO/VIDEO')\n        ```\n    - Video prediction example:\n\n        ```python\n        import cv2\n        import numpy as np\n        import paddlehub as hub\n\n        human_seg = hub.Module(name='humanseg_server')\n        cap_video = cv2.VideoCapture('/PATH/TO/VIDEO')\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n        save_path = 'humanseg_server_video.avi'\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n        prev_gray = None\n        prev_cfd = None\n        while cap_video.isOpened():\n            ret, frame_org = cap_video.read()\n            if ret:\n                [img_matting, prev_gray, prev_cfd] = human_seg.video_stream_segment(frame_org=frame_org, frame_id=cap_video.get(1), prev_gray=prev_gray, prev_cfd=prev_cfd)\n                img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                bg_im = np.ones_like(img_matting) * 255\n                comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                cap_out.write(comb)\n            else:\n                break\n\n        
cap_video.release()\n        cap_out.release()\n\n        ```\n\n- ### 3、API\n\n    ```python\n    def segment(images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_server_output')\n    ```\n\n    - Prediction API, generating the segmentation result.\n\n    - **Parameter**\n\n        * images (list\[numpy.ndarray\]): Image data, ndarray.shape is in the format [H, W, C], BGR.\n        * paths (list\[str\]): Image path.\n        * batch\_size (int): Batch size.\n        * use\_gpu (bool): Use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n        * visualization (bool): Whether to save the results as picture files.\n        * output\_dir (str): Save path of images, humanseg_server_output by default.\n\n    - **Return**\n\n        * res (list\[dict\]): The list of recognition results, where each element is a dict with the following fields:\n            * save\_path (str, optional): Save path of the result.\n            * data (numpy.ndarray): The result of portrait segmentation.\n\n    ```python\n    def video_stream_segment(self,\n                            frame_org,\n                            frame_id,\n                            prev_gray,\n                            prev_cfd,\n                            use_gpu=False):\n    ```\n\n    -  Prediction API, used to segment video portraits frame by frame.\n\n    - **Parameter**\n\n        * frame_org (numpy.ndarray): Single frame for prediction, ndarray.shape is in the format [H, W, C], BGR.\n        * frame_id (int): The number of the current frame.\n        * prev_gray (numpy.ndarray): Grayscale image of the previous network input.\n        * prev_cfd (numpy.ndarray): The fusion image from optical flow and the prediction result from the previous frame.\n        * use\_gpu (bool): Use GPU or not. 
**set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n\n\n    - **Return**\n\n        * img_matting (numpy.ndarray): The result of portrait segmentation.\n        * cur_gray (numpy.ndarray): Grayscale image of the current network input.\n        * optflow_map (numpy.ndarray): The fusion image from optical flow and the prediction result from the current frame.\n\n\n    ```python\n    def video_segment(self,\n                      video_path=None,\n                      use_gpu=False,\n                      save_dir='humanseg_server_video_result'):\n    ```\n\n    -  Prediction API, producing the video segmentation result.\n\n    - **Parameter**\n\n        * video\_path (str): Video path for segmentation. If None, the video will be obtained from the local camera, and a window will display the online segmentation result.\n        * use\_gpu (bool): Use GPU or not. **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n        * save\_dir (str): Save path of the video.\n\n\n    ```python\n    def save_inference_model(dirname)\n    ```\n\n\n    - Save the model to the specified path.\n\n    - **Parameters**\n\n      * dirname: Model save path.\n\n\n\n## IV. 
Server Deployment\n\n- PaddleHub Serving can deploy an online service for human segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m humanseg_server\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n     -  ```python\n        import requests\n        import json\n        import base64\n\n        import cv2\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # Send an HTTP request\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/humanseg_server\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n        mask = cv2.cvtColor(base64_to_cv2(r.json()[\"results\"][0]['data']), cv2.COLOR_BGR2GRAY)\n        rgba = np.concatenate((org_im, np.expand_dims(mask, axis=2)), axis=2)\n        cv2.imwrite(\"segment_human_server.png\", rgba)\n        ```\n\n- ### Gradio APP support\n   Starting with PaddleHub 2.3.1, the Gradio APP for humanseg_server can be accessed in the browser at http://127.0.0.1:8866/gradio/humanseg_server.\n\n## V. 
Release Note\n\n* 1.0.0\n\n    First release\n\n* 1.1.0\n\n    Improved prediction performance\n\n* 1.1.1\n\n    Fixed post-processed image data exceeding the \[0, 255\] range\n\n* 1.2.0\n\n    Added the video portrait segmentation interface\n\n    Added the video stream portrait segmentation interface\n\n* 1.2.1\n\n    Fixed a GPU memory leak on cuDNN 8.0.4\n\n* 1.3.0\n\n    Removed the Fluid API\n\n* 1.4.0\n\n    Added Gradio APP support\n\n    ```shell\n    $ hub install humanseg_server == 1.4.0\n    ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/data_feed.py",
    "content": "# coding=utf-8\nimport os\nimport time\nfrom collections import OrderedDict\n\nimport cv2\nimport numpy as np\n\n__all__ = ['reader', 'preprocess_v']\n\n\ndef preprocess_v(img, w, h):\n    img = cv2.resize(img, (w, h), cv2.INTER_LINEAR).astype(np.float32)\n    img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n    img = img.transpose((2, 0, 1)) / 255\n    img -= img_mean\n    img /= img_std\n    return img\n\n\ndef reader(images=None, paths=None):\n    \"\"\"\n    Preprocess to yield image.\n\n    Args:\n        images (list(numpy.ndarray)): images data, shape of each is [H, W, C]\n        paths (list[str]): paths to images.\n\n    Yield:\n        each (collections.OrderedDict): info of original image, preprocessed image.\n    \"\"\"\n    component = list()\n    if paths:\n        for im_path in paths:\n            each = OrderedDict()\n            assert os.path.isfile(im_path), \"The {} isn't a valid file path.\".format(im_path)\n            im = cv2.imread(im_path).astype('float32')\n            each['org_im'] = im\n            each['org_im_path'] = im_path\n            each['org_im_shape'] = im.shape\n            component.append(each)\n    if images is not None:\n        assert type(images) is list, \"images should be a list.\"\n        for im in images:\n            each = OrderedDict()\n            each['org_im'] = im\n            each['org_im_path'] = 'ndarray_time={}'.format(round(time.time(), 6) * 1e6)\n            each['org_im_shape'] = im.shape\n            component.append(each)\n\n    for element in component:\n        img = element['org_im'].copy()\n        img = cv2.resize(img, (513, 513)).astype(np.float32)\n        img_mean = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img_std = np.array([0.5, 0.5, 0.5]).reshape((3, 1, 1))\n        img = img.transpose((2, 0, 1)) / 255\n        img -= img_mean\n        img /= img_std\n        element['image'] = img\n        
yield element\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport os.path as osp\n\nimport cv2\nimport numpy as np\nimport paddle.jit\nimport paddle.static\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .data_feed import preprocess_v\nfrom .data_feed import reader\nfrom .optimal import postprocess_v\nfrom .optimal import threshold_mask\nfrom .processor import base64_to_cv2\nfrom .processor import check_dir\nfrom .processor import cv2_to_base64\nfrom .processor import postprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"humanseg_server\",\n            type=\"CV/semantic_segmentation\",\n            author=\"baidu-vis\",\n            author_email=\"\",\n            summary=\"DeepLabv3+ is a semantic segmentation model.\",\n            version=\"1.4.0\")\nclass DeeplabV3pXception65HumanSeg:\n\n    def __init__(self):\n        self.default_pretrained_model_path = os.path.join(self.directory, \"humanseg_server_inference\", \"model\")\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + 
'.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=1000, device_id=0)\n\n            if paddle.get_cudnn_version() == 8004:\n                gpu_config.delete_pass('conv_elementwise_add_act_fuse_pass')\n                gpu_config.delete_pass('conv_elementwise_add2_act_fuse_pass')\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def segment(self,\n                images=None,\n                paths=None,\n                batch_size=1,\n                use_gpu=False,\n                visualization=False,\n                output_dir='humanseg_server_output'):\n        \"\"\"\n        API for human segmentation.\n\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\n            paths (list[str]): The paths of images.\n            batch_size (int): batch size.\n            use_gpu (bool): Whether to use gpu.\n            visualization (bool): Whether to save image or not.\n            output_dir (str): The path to store output images.\n\n        Returns:\n            res (list[dict]): each element in the list is a dict, the keys and values are:\n                save_path (str, optional): the path to save images. 
(Exists only if visualization is True)\n                data (numpy.ndarray): data of post processed image.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        # compatibility with older versions\n\n        all_data = list()\n        for yield_data in reader(images, paths):\n            all_data.append(yield_data)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n        res = list()\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except:\n                    pass\n            # feed batch image\n            batch_image = np.array([data['image'] for data in batch_data])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(batch_image.copy())\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[1])\n            output = output_handle.copy_to_cpu()\n\n            output = np.expand_dims(output[:, 1, :, :], axis=1)\n            # postprocess one by one\n            for i in range(len(batch_data)):\n                out = postprocess(data_out=output[i],\n                                  org_im=batch_data[i]['org_im'],\n                                  
org_im_shape=batch_data[i]['org_im_shape'],\n                                  org_im_path=batch_data[i]['org_im_path'],\n                                  output_dir=output_dir,\n                                  visualization=visualization)\n                res.append(out)\n        return res\n\n    def video_stream_segment(self, frame_org, frame_id, prev_gray, prev_cfd, use_gpu=False):\n        \"\"\"\n        API for human video segmentation.\n\n        Args:\n           frame_org (numpy.ndarray): frame data, shape of each is [H, W, C], the color space is BGR.\n           frame_id (int): index of the frame to be decoded.\n           prev_gray (numpy.ndarray): gray scale image of last frame, shape of each is [H, W]\n           prev_cfd (numpy.ndarray): fusion image from optical flow image and segment result, shape of each is [H, W]\n           use_gpu (bool): Whether to use gpu.\n\n        Returns:\n            img_matting (numpy.ndarray): data of segmentation mask.\n            cur_gray (numpy.ndarray): gray scale image of current frame, shape of each is [H, W]\n            optflow_map (numpy.ndarray): optical flow image of current frame, shape of each is [H, W]\n\n        \"\"\"\n        resize_h = 512\n        resize_w = 512\n        is_init = True\n        width = int(frame_org.shape[0])\n        height = int(frame_org.shape[1])\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        frame = preprocess_v(frame_org, resize_w, resize_h)\n\n        predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n        input_names = predictor.get_input_names()\n        input_handle = predictor.get_input_handle(input_names[0])\n        input_handle.copy_from_cpu(frame.copy()[None, ...])\n        predictor.run()\n        output_names = predictor.get_output_names()\n        output_handle = predictor.get_output_handle(output_names[1])\n        score_map = output_handle.copy_to_cpu()\n\n        frame = np.transpose(frame, 
axes=[1, 2, 0])\n        score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n        cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n        cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n        score_map = 255 * score_map[:, :, 1]\n        if frame_id == 1:\n            prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n            prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n        else:\n            optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n        optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n        optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n        img_matting = cv2.resize(optflow_map, (height, width), cv2.INTER_LINEAR)\n        return [img_matting, cur_gray, optflow_map]\n\n    def video_segment(self, video_path=None, use_gpu=False, save_dir='humanseg_server_video_result'):\n        resize_h = 512\n        resize_w = 512\n        if not video_path:\n            cap_video = cv2.VideoCapture(0)\n        else:\n            cap_video = cv2.VideoCapture(video_path)\n        if not cap_video.isOpened():\n            raise IOError(\"Error opening video stream or file, \"\n                          \"--video_path whether existing: {}\"\n                          \" or camera whether working\".format(video_path))\n        width = int(cap_video.get(cv2.CAP_PROP_FRAME_WIDTH))\n        height = int(cap_video.get(cv2.CAP_PROP_FRAME_HEIGHT))\n        disflow = cv2.DISOpticalFlow_create(cv2.DISOPTICAL_FLOW_PRESET_ULTRAFAST)\n        prev_gray = np.zeros((resize_h, resize_w), np.uint8)\n        prev_cfd = np.zeros((resize_h, resize_w), np.float32)\n        is_init = True\n        fps = cap_video.get(cv2.CAP_PROP_FPS)\n        if video_path is not None:\n            print('Please wait. 
It is computing......')\n            if not osp.exists(save_dir):\n                os.makedirs(save_dir)\n            save_path = osp.join(save_dir, 'result' + '.avi')\n            cap_out = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'), fps, (width, height))\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = np.ones_like(img_matting) * 
255\n                    comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cap_out.write(comb)\n                else:\n                    break\n            cap_video.release()\n            cap_out.release()\n        else:\n            while cap_video.isOpened():\n                ret, frame_org = cap_video.read()\n                if ret:\n                    frame = preprocess_v(frame_org, resize_w, resize_h)\n\n                    predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                    input_names = predictor.get_input_names()\n                    input_handle = predictor.get_input_handle(input_names[0])\n                    input_handle.copy_from_cpu(frame.copy()[None, ...])\n                    predictor.run()\n                    output_names = predictor.get_output_names()\n                    output_handle = predictor.get_output_handle(output_names[1])\n                    score_map = output_handle.copy_to_cpu()\n\n                    frame = np.transpose(frame, axes=[1, 2, 0])\n                    score_map = np.transpose(np.squeeze(score_map, 0), axes=[1, 2, 0])\n                    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)\n                    cur_gray = cv2.resize(cur_gray, (resize_w, resize_h))\n                    score_map = 255 * score_map[:, :, 1]\n                    optflow_map = postprocess_v(cur_gray, score_map, prev_gray, prev_cfd, disflow, is_init)\n                    prev_gray = cur_gray.copy()\n                    prev_cfd = optflow_map.copy()\n                    optflow_map = cv2.GaussianBlur(optflow_map, (3, 3), 0)\n                    optflow_map = threshold_mask(optflow_map, thresh_bg=0.2, thresh_fg=0.8)\n                    img_matting = cv2.resize(optflow_map, (width, height), cv2.INTER_LINEAR)\n                    img_matting = np.repeat(img_matting[:, :, np.newaxis], 3, axis=2)\n                    bg_im = np.ones_like(img_matting) * 255\n       
             comb = (img_matting * frame_org + (1 - img_matting) * bg_im).astype(np.uint8)\n                    cv2.imshow('HumanSegmentation', comb)\n                    if cv2.waitKey(1) & 0xFF == ord('q'):\n                        break\n                else:\n                    break\n            cap_video.release()\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.segment(images=images_decode, **kwargs)\n        results = [{'data': cv2_to_base64(result['data'])} for result in results]\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.segment(paths=[args.input_path],\n                               batch_size=args.batch_size,\n                               use_gpu=args.use_gpu,\n                               output_dir=args.output_dir,\n                               visualization=args.visualization)\n        if args.save_dir is not None:\n            check_dir(args.save_dir)\n            self.save_inference_model(args.save_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='humanseg_server_output',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--save_dir',\n                                           type=str,\n                                           default='humanseg_server_model',\n                                           help=\"The directory to save model.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        
self.arg_config_group.add_argument('--batch_size', type=ast.literal_eval, default=1, help=\"batch size.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n        import tempfile\n        import os\n        from PIL import Image\n\n        def inference(image, use_gpu=False):\n            with tempfile.TemporaryDirectory() as temp_dir:\n                self.segment(paths=[image], use_gpu=use_gpu, visualization=True, output_dir=temp_dir)\n                return Image.open(os.path.join(temp_dir, os.listdir(temp_dir)[0]))\n\n        interface = gr.Interface(\n            inference,\n            [gr.inputs.Image(type=\"filepath\"), gr.Checkbox(label='use_gpu')],\n            gr.outputs.Image(type=\"ndarray\"),\n            title='humanseg_server')\n        return interface\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/optimal.py",
    "content": "# -*- coding:utf-8 -*-\nimport numpy as np\n\n\ndef human_seg_tracking(pre_gray, cur_gray, prev_cfd, dl_weights, disflow):\n    \"\"\"计算光流跟踪匹配点和光流图\n    输入参数:\n        pre_gray: 上一帧灰度图\n        cur_gray: 当前帧灰度图\n        prev_cfd: 上一帧光流图\n        dl_weights: 融合权重图\n        disflow: 光流数据结构\n    返回值:\n        is_track: 光流点跟踪二值图，即是否具有光流点匹配\n        track_cfd: 光流跟踪图\n    \"\"\"\n    check_thres = 8\n    h, w = pre_gray.shape[:2]\n    track_cfd = np.zeros_like(prev_cfd)\n    is_track = np.zeros_like(pre_gray)\n    flow_fw = disflow.calc(pre_gray, cur_gray, None)\n    flow_bw = disflow.calc(cur_gray, pre_gray, None)\n    flow_fw = np.round(flow_fw).astype(np.int)\n    flow_bw = np.round(flow_bw).astype(np.int)\n    y_list = np.array(range(h))\n    x_list = np.array(range(w))\n    yv, xv = np.meshgrid(y_list, x_list)\n    yv, xv = yv.T, xv.T\n    cur_x = xv + flow_fw[:, :, 0]\n    cur_y = yv + flow_fw[:, :, 1]\n\n    # 超出边界不跟踪\n    not_track = (cur_x < 0) + (cur_x >= w) + (cur_y < 0) + (cur_y >= h)\n    flow_bw[~not_track] = flow_bw[cur_y[~not_track], cur_x[~not_track]]\n    not_track += (np.square(flow_fw[:, :, 0] + flow_bw[:, :, 0]) +\n                  np.square(flow_fw[:, :, 1] + flow_bw[:, :, 1])) >= check_thres\n    track_cfd[cur_y[~not_track], cur_x[~not_track]] = prev_cfd[~not_track]\n\n    is_track[cur_y[~not_track], cur_x[~not_track]] = 1\n\n    not_flow = np.all(np.abs(flow_fw) == 0, axis=-1) * np.all(np.abs(flow_bw) == 0, axis=-1)\n    dl_weights[cur_y[not_flow], cur_x[not_flow]] = 0.05\n    return track_cfd, is_track, dl_weights\n\n\ndef human_seg_track_fuse(track_cfd, dl_cfd, dl_weights, is_track):\n    \"\"\"光流追踪图和人像分割结构融合\n    输入参数:\n        track_cfd: 光流追踪图\n        dl_cfd: 当前帧分割结果\n        dl_weights: 融合权重图\n        is_track: 光流点匹配二值图\n    返回\n        cur_cfd: 光流跟踪图和人像分割结果融合图\n    \"\"\"\n    fusion_cfd = dl_cfd.copy()\n    is_track = is_track.astype(np.bool)\n    fusion_cfd[is_track] = dl_weights[is_track] * dl_cfd[is_track] + (1 - 
dl_weights[is_track]) * track_cfd[is_track]\n    # high-confidence regions\n    index_certain = ((dl_cfd > 0.9) + (dl_cfd < 0.1)) * is_track\n    index_less01 = (dl_weights < 0.1) * index_certain\n    fusion_cfd[index_less01] = 0.3 * dl_cfd[index_less01] + 0.7 * track_cfd[index_less01]\n    index_larger09 = (dl_weights >= 0.1) * index_certain\n    fusion_cfd[index_larger09] = 0.4 * dl_cfd[index_larger09] + 0.6 * track_cfd[index_larger09]\n    return fusion_cfd\n\n\ndef threshold_mask(img, thresh_bg, thresh_fg):\n    dst = (img / 255.0 - thresh_bg) / (thresh_fg - thresh_bg)\n    dst[np.where(dst > 1)] = 1\n    dst[np.where(dst < 0)] = 0\n    return dst.astype(np.float32)\n\n\ndef postprocess_v(cur_gray, scoremap, prev_gray, pre_cfd, disflow, is_init):\n    \"\"\"Optical-flow based refinement.\n    Args:\n        cur_gray : grayscale image of the current frame\n        scoremap : segmentation result of the current frame\n        prev_gray : grayscale image of the previous frame\n        pre_cfd : fused result of the previous frame\n        disflow : DIS optical-flow estimator\n        is_init : whether this is the first frame\n    Returns:\n        fusion_cfd : fusion of the optical-flow tracking map and the prediction\n    \"\"\"\n    h, w = scoremap.shape\n    cur_cfd = scoremap.copy()\n\n    if is_init:\n        if h <= 64 or w <= 64:\n            disflow.setFinestScale(1)\n        elif h <= 160 or w <= 160:\n            disflow.setFinestScale(2)\n        else:\n            disflow.setFinestScale(3)\n        fusion_cfd = cur_cfd\n    else:\n        weights = np.ones((h, w), np.float32) * 0.3\n        track_cfd, is_track, weights = human_seg_tracking(prev_gray, cur_gray, pre_cfd, weights, disflow)\n        fusion_cfd = human_seg_track_fuse(track_cfd, cur_cfd, weights, is_track)\n\n    return fusion_cfd\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport base64\nimport os\nimport time\n\nimport cv2\nimport numpy as np\n\n__all__ = ['cv2_to_base64', 'base64_to_cv2', 'postprocess']\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef postprocess(data_out, org_im, org_im_shape, org_im_path, output_dir, visualization):\n    \"\"\"\n    Postprocess output of network. one image at a time.\n\n    Args:\n        data_out (numpy.ndarray): output of network.\n        org_im (numpy.ndarray): original image.\n        org_im_shape (list): shape pf original image.\n        org_im_path (list): path of riginal image.\n        output_dir (str): output directory to store image.\n        visualization (bool): whether to save image or not.\n    Returns:\n        result (dict): The data of processed image.\n    \"\"\"\n    result = dict()\n    for logit in data_out:\n        logit = (logit * 255).astype(np.uint8)\n        logit = cv2.resize(logit, (org_im_shape[1], org_im_shape[0]))\n        rgba = np.concatenate((org_im, np.expand_dims(logit, axis=2)), axis=2)\n\n        if visualization:\n            check_dir(output_dir)\n            save_im_path = get_save_image_name(org_im, org_im_path, output_dir)\n            cv2.imwrite(save_im_path, rgba)\n            result['save_path'] = save_im_path\n            result['data'] = rgba[:, :, 3]\n        else:\n            result['data'] = rgba[:, :, 3]\n    return result\n\n\ndef check_dir(dir_path):\n    if not os.path.exists(dir_path):\n        os.makedirs(dir_path)\n    elif os.path.isfile(dir_path):\n        os.remove(dir_path)\n        os.makedirs(dir_path)\n\n\ndef get_save_image_name(org_im, org_im_path, output_dir):\n    \"\"\"\n    Get save image name from 
source image path.\n    \"\"\"\n    # name prefix of original image\n    org_im_name = os.path.split(org_im_path)[-1]\n    im_prefix = os.path.splitext(org_im_name)[0]\n    ext = '.png'\n    # save image path\n    save_im_path = os.path.join(output_dir, im_prefix + ext)\n    if os.path.exists(save_im_path):\n        save_im_path = os.path.join(output_dir, im_prefix + 'time={}'.format(int(time.time())) + ext)\n\n    return save_im_path\n"
  },
  {
    "path": "modules/image/semantic_segmentation/humanseg_server/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/pg_WCHWSdT8/download?ixid=MnwxMjA3fDB8MXxhbGx8fHx8fHx8fHwxNjYyNDM2ODI4&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        fourcc = cv2.VideoWriter_fourcc('M', 'J', 'P', 'G')\n        img = cv2.imread('tests/test.jpg')\n        video = cv2.VideoWriter('tests/test.avi', fourcc, 20.0, tuple(img.shape[:2]))\n        for i in range(40):\n            video.write(img)\n        video.release()\n        cls.module = hub.Module(name=\"humanseg_server\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('humanseg_server_output')\n        shutil.rmtree('humanseg_server_video_result')\n\n    def test_segment1(self):\n        results = self.module.segment(paths=['tests/test.jpg'], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment2(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment3(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=False, visualization=True)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment4(self):\n        results = self.module.segment(images=[cv2.imread('tests/test.jpg')], use_gpu=True, 
visualization=False)\n        self.assertIsInstance(results[0]['data'], np.ndarray)\n\n    def test_segment5(self):\n        self.assertRaises(AssertionError, self.module.segment, paths=['no.jpg'])\n\n    def test_segment6(self):\n        self.assertRaises(AttributeError, self.module.segment, images=['test.jpg'])\n\n    def test_video_stream_segment1(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                                              prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=False)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_stream_segment2(self):\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=1,\n                                                         
                     prev_gray=None,\n                                                                              prev_cfd=None,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n        img_matting, cur_gray, optflow_map = self.module.video_stream_segment(frame_org=cv2.imread('tests/test.jpg'),\n                                                                              frame_id=2,\n                                                                              prev_gray=cur_gray,\n                                                                              prev_cfd=optflow_map,\n                                                                              use_gpu=True)\n        self.assertIsInstance(img_matting, np.ndarray)\n        self.assertIsInstance(cur_gray, np.ndarray)\n        self.assertIsInstance(optflow_map, np.ndarray)\n\n    def test_video_segment1(self):\n        self.module.video_segment(video_path=\"tests/test.avi\", use_gpu=False)\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_cityscapes/README.md",
    "content": "# isanet_resnet50_cityscapes\n\n|模型名称|isanet_resnet50_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|isanet_resnet50vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|217MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[isanet](https://arxiv.org/abs/1907.12273)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install isanet_resnet50_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='isanet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用isanet_resnet50_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='isanet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 指定加载的模型参数路径，若为None，则加载默认的预训练参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='isanet_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module、checkpoint_dir、dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m isanet_resnet50_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/isanet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_cityscapes/README_en.md",
    "content": "# isanet_resnet50_cityscapes\n\n|Module Name|isanet_resnet50_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|isanet_resnet50vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|217MB|\n|Data indicators|-|\n|Latest update date|2022-03-21|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [isanet](https://arxiv.org/abs/1907.12273)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install isanet_resnet50_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='isanet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the isanet_resnet50_cityscapes model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='isanet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: path of the self-trained checkpoint to load; if it is None, the provided pre-trained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the verification set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='isanet_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m isanet_resnet50_cityscapes\n          ```\n\n    - The serving API is now deployed; the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/isanet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None):\n        return paddle.add(x, y, name)\n\nclass 
AttentionBlock(nn.Layer):\n    \"\"\"General self-attention block/non-local block.\n\n    The original article refers to https://arxiv.org/abs/1706.03762.\n\n    Args:\n        key_in_channels (int): Input channels of key feature.\n        query_in_channels (int): Input channels of query feature.\n        channels (int): Output channels of key/query transform.\n        out_channels (int): Output channels.\n        share_key_query (bool): Whether to share the projection weight between key\n            and query projection.\n        query_downsample (nn.Layer): Query downsample module.\n        key_downsample (nn.Layer): Key downsample module.\n        key_query_num_convs (int): Number of convs for key/query projection.\n        value_out_num_convs (int): Number of convs for value projection.\n        key_query_norm (bool): Whether to use BN for key/query projection.\n        value_out_norm (bool): Whether to use BN for value projection.\n        matmul_norm (bool): Whether to normalize the attention map by the sqrt of\n            channels.\n        with_out (bool): Whether to use the out projection.\n    \"\"\"\n\n    def __init__(self, key_in_channels, query_in_channels, channels,\n                 out_channels, share_key_query, query_downsample,\n                 key_downsample, key_query_num_convs, value_out_num_convs,\n                 key_query_norm, value_out_norm, matmul_norm, with_out):\n        super(AttentionBlock, self).__init__()\n        if share_key_query:\n            assert key_in_channels == query_in_channels\n        self.with_out = with_out\n        self.key_in_channels = key_in_channels\n        self.query_in_channels = query_in_channels\n        self.out_channels = out_channels\n        self.channels = channels\n        self.share_key_query = share_key_query\n        self.key_project = self.build_project(\n            key_in_channels,\n            channels,\n            num_convs=key_query_num_convs,\n            use_conv_module=key_query_norm)\n        if 
share_key_query:\n            self.query_project = self.key_project\n        else:\n            self.query_project = self.build_project(\n                query_in_channels,\n                channels,\n                num_convs=key_query_num_convs,\n                use_conv_module=key_query_norm)\n\n        self.value_project = self.build_project(\n            key_in_channels,\n            channels if self.with_out else out_channels,\n            num_convs=value_out_num_convs,\n            use_conv_module=value_out_norm)\n\n        if self.with_out:\n            self.out_project = self.build_project(\n                channels,\n                out_channels,\n                num_convs=value_out_num_convs,\n                use_conv_module=value_out_norm)\n        else:\n            self.out_project = None\n\n        self.query_downsample = query_downsample\n        self.key_downsample = key_downsample\n        self.matmul_norm = matmul_norm\n\n    def build_project(self, in_channels: int , channels: int, num_convs: int, use_conv_module: bool):\n        if use_conv_module:\n            convs = [\n                ConvBNReLU(\n                    in_channels=in_channels,\n                    out_channels=channels,\n                    kernel_size=1,\n                    bias_attr=False)\n            ]\n            for _ in range(num_convs - 1):\n                convs.append(\n                    ConvBNReLU(\n                        in_channels=channels,\n                        out_channels=channels,\n                        kernel_size=1,\n                        bias_attr=False))\n        else:\n            convs = [nn.Conv2D(in_channels, channels, 1)]\n            for _ in range(num_convs - 1):\n                convs.append(nn.Conv2D(channels, channels, 1))\n\n        if len(convs) > 1:\n            convs = nn.Sequential(*convs)\n        else:\n            convs = convs[0]\n        return convs\n\n    def forward(self, query_feats: paddle.Tensor, key_feats: 
paddle.Tensor) -> paddle.Tensor:\n        query_shape = paddle.shape(query_feats)\n        query = self.query_project(query_feats)\n        if self.query_downsample is not None:\n            query = self.query_downsample(query)\n        query = query.flatten(2).transpose([0, 2, 1])\n\n        key = self.key_project(key_feats)\n        value = self.value_project(key_feats)\n\n        if self.key_downsample is not None:\n            key = self.key_downsample(key)\n            value = self.key_downsample(value)\n\n        key = key.flatten(2)\n        value = value.flatten(2).transpose([0, 2, 1])\n        sim_map = paddle.matmul(query, key)\n        if self.matmul_norm:\n            sim_map = (self.channels**-0.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, [0, 2, 1])\n\n        context = paddle.reshape(\n            context, [0, self.out_channels, query_shape[2], query_shape[3]])\n\n        if self.out_project is not None:\n            context = self.out_project(context)\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_cityscapes/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom isanet_resnet50_cityscapes.resnet import ResNet50_vd\nimport isanet_resnet50_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"isanet_resnet50_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ISANetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass ISANet(nn.Layer):\n    \"\"\"Interlaced Sparse Self-Attention for Semantic Segmentation.\n\n    The original article refers to Lang Huang, et al. 
\"Interlaced Sparse Self-Attention for Semantic Segmentation\"\n    (https://arxiv.org/abs/1907.12273).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): The values in the tuple indicate the indices of output of backbone.\n        isa_channels (int): The channels of ISA Module.\n        down_factor (tuple): Divide the height and width dimension to (Ph, PW) groups.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 isa_channels: int = 256,\n                 down_factor: Tuple[int] = (8, 8),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(ISANet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = ISAHead(num_classes, in_channels, isa_channels, down_factor,\n                            enable_auxiliary_loss)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n         
   self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feats = self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        logit_list = [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners,\n                align_mode=1) for logit in logit_list\n        ]\n\n        return logit_list\n\n\nclass ISAHead(nn.Layer):\n    \"\"\"\n    The ISAHead.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        in_channels (tuple): The number of input channels.\n        isa_channels (int): The channels of ISA Module.\n        down_factor (tuple): Divide the height and width dimension to (Ph, PW) groups.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. 
Default: True.\n    \"\"\"\n\n    def __init__(self, \n                 num_classes: int, \n                 in_channels: int, \n                 isa_channels: int, \n                 down_factor: Tuple[int],\n                 enable_auxiliary_loss: bool):\n        super(ISAHead, self).__init__()\n        self.in_channels = in_channels[-1]\n        inter_channels = self.in_channels // 4\n        self.inter_channels = inter_channels\n        self.down_factor = down_factor\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n        self.in_conv = layers.ConvBNReLU(\n            self.in_channels, inter_channels, 3, bias_attr=False)\n        self.global_relation = SelfAttentionBlock(inter_channels, isa_channels)\n        self.local_relation = SelfAttentionBlock(inter_channels, isa_channels)\n        self.out_conv = layers.ConvBNReLU(\n            inter_channels * 2, inter_channels, 1, bias_attr=False)\n        self.cls = nn.Sequential(\n            nn.Dropout2D(p=0.1), nn.Conv2D(inter_channels, num_classes, 1))\n        self.aux = nn.Sequential(\n            layers.ConvBNReLU(\n                in_channels=1024,\n                out_channels=256,\n                kernel_size=3,\n                bias_attr=False), nn.Dropout2D(p=0.1),\n            nn.Conv2D(256, num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        C3, C4 = feat_list\n        x = self.in_conv(C4)\n        x_shape = paddle.shape(x)\n        P_h, P_w = self.down_factor\n        Q_h, Q_w = paddle.ceil(x_shape[2] / P_h).astype('int32'), paddle.ceil(\n            x_shape[3] / P_w).astype('int32')\n        pad_h, pad_w = (Q_h * P_h - x_shape[2]).astype('int32'), (\n                Q_w * P_w - x_shape[3]).astype('int32')\n        if pad_h > 0 or pad_w > 0:\n            padding = paddle.concat([\n                pad_w // 2, pad_w - pad_w // 2, pad_h // 2, pad_h - pad_h // 2\n            ],\n                axis=0)\n            feat = F.pad(x, 
padding)\n        else:\n            feat = x\n\n        feat = feat.reshape([0, x_shape[1], Q_h, P_h, Q_w, P_w])\n        feat = feat.transpose([0, 3, 5, 1, 2,\n                               4]).reshape([-1, self.inter_channels, Q_h, Q_w])\n        feat = self.global_relation(feat)\n\n        feat = feat.reshape([x_shape[0], P_h, P_w, x_shape[1], Q_h, Q_w])\n        feat = feat.transpose([0, 4, 5, 3, 1,\n                               2]).reshape([-1, self.inter_channels, P_h, P_w])\n        feat = self.local_relation(feat)\n\n        feat = feat.reshape([x_shape[0], Q_h, Q_w, x_shape[1], P_h, P_w])\n        feat = feat.transpose([0, 3, 1, 4, 2, 5]).reshape(\n            [0, self.inter_channels, P_h * Q_h, P_w * Q_w])\n        if pad_h > 0 or pad_w > 0:\n            feat = paddle.slice(\n                feat,\n                axes=[2, 3],\n                starts=[pad_h // 2, pad_w // 2],\n                ends=[pad_h // 2 + x_shape[2], pad_w // 2 + x_shape[3]])\n\n        feat = self.out_conv(paddle.concat([feat, x], axis=1))\n        output = self.cls(feat)\n\n        if self.enable_auxiliary_loss:\n            auxout = self.aux(C3)\n            return [output, auxout]\n        else:\n            return [output]\n\n\nclass SelfAttentionBlock(layers.AttentionBlock):\n    \"\"\"General self-attention block/non-local block.\n\n       Args:\n            in_channels (int): Input channels of key/query feature.\n            channels (int): Output channels of key/query transform.\n    \"\"\"\n\n    def __init__(self, in_channels: int, channels: int):\n        super(SelfAttentionBlock, self).__init__(\n            key_in_channels=in_channels,\n            query_in_channels=in_channels,\n            channels=channels,\n            out_channels=in_channels,\n            share_key_query=False,\n            query_downsample=None,\n            key_downsample=None,\n            key_query_num_convs=2,\n            key_query_norm=True,\n            value_out_num_convs=1,\n        
    value_out_norm=False,\n            matmul_norm=True,\n            with_out=False)\n\n        self.output_project = self.build_project(\n            in_channels, in_channels, num_convs=1, use_conv_module=True)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        context = super(SelfAttentionBlock, self).forward(x, x)\n        return self.output_project(context)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_cityscapes/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport isanet_resnet50_cityscapes.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1,\" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            
data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization 
training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if 
self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters 
= [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = 
dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At the stage 4, expand the the dilation_rate if given multi_grid\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else 
num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_voc/README.md",
    "content": "# isanet_resnet50_voc\n\n|模型名称|isanet_resnet50_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|isanet_resnet50vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|217MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[isanet](https://arxiv.org/abs/1907.12273)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install isanet_resnet50_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='isanet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用isanet_resnet50_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='isanet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='isanet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                
model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m isanet_resnet50_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果：\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/isanet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_voc/README_en.md",
    "content": "# isanet_resnet50_voc\n\n|Module Name|isanet_resnet50_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|isanet_resnet50vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|217MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [isanet](https://arxiv.org/abs/1907.12273)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install isanet_resnet50_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='isanet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the isanet_resnet50_voc model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='isanet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: path of the self-trained model; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='isanet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m isanet_resnet50_voc\n          ```\n\n    - The image segmentation service API is now deployed and the default port number is 8866.\n\n    - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/isanet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_voc/layers.py",
"content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In a CPU environment nn.SyncBatchNorm has no kernel, so use nn.BatchNorm2D instead.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None):\n        return paddle.add(x, y, name)\n\nclass 
AttentionBlock(nn.Layer):\n    \"\"\"General self-attention block/non-local block.\n\n    The original article refers to https://arxiv.org/abs/1706.03762.\n    Args:\n        key_in_channels (int): Input channels of key feature.\n        query_in_channels (int): Input channels of query feature.\n        channels (int): Output channels of key/query transform.\n        out_channels (int): Output channels.\n        share_key_query (bool): Whether to share projection weights between key\n            and query projection.\n        query_downsample (nn.Module): Query downsample module.\n        key_downsample (nn.Module): Key downsample module.\n        key_query_num_convs (int): Number of convs for key/query projection.\n        value_out_num_convs (int): Number of convs for value projection.\n        key_query_norm (bool): Whether to use BN for key/query projection.\n        value_out_norm (bool): Whether to use BN for value projection.\n        matmul_norm (bool): Whether to normalize the attention map with sqrt of\n            channels.\n        with_out (bool): Whether to use out projection.\n    \"\"\"\n\n    def __init__(self, key_in_channels, query_in_channels, channels,\n                 out_channels, share_key_query, query_downsample,\n                 key_downsample, key_query_num_convs, value_out_num_convs,\n                 key_query_norm, value_out_norm, matmul_norm, with_out):\n        super(AttentionBlock, self).__init__()\n        if share_key_query:\n            assert key_in_channels == query_in_channels\n        self.with_out = with_out\n        self.key_in_channels = key_in_channels\n        self.query_in_channels = query_in_channels\n        self.out_channels = out_channels\n        self.channels = channels\n        self.share_key_query = share_key_query\n        self.key_project = self.build_project(\n            key_in_channels,\n            channels,\n            num_convs=key_query_num_convs,\n            use_conv_module=key_query_norm)\n        if 
share_key_query:\n            self.query_project = self.key_project\n        else:\n            self.query_project = self.build_project(\n                query_in_channels,\n                channels,\n                num_convs=key_query_num_convs,\n                use_conv_module=key_query_norm)\n\n        self.value_project = self.build_project(\n            key_in_channels,\n            channels if self.with_out else out_channels,\n            num_convs=value_out_num_convs,\n            use_conv_module=value_out_norm)\n\n        if self.with_out:\n            self.out_project = self.build_project(\n                channels,\n                out_channels,\n                num_convs=value_out_num_convs,\n                use_conv_module=value_out_norm)\n        else:\n            self.out_project = None\n\n        self.query_downsample = query_downsample\n        self.key_downsample = key_downsample\n        self.matmul_norm = matmul_norm\n\n    def build_project(self, in_channels: int, channels: int, num_convs: int, use_conv_module: bool):\n        if use_conv_module:\n            convs = [\n                ConvBNReLU(\n                    in_channels=in_channels,\n                    out_channels=channels,\n                    kernel_size=1,\n                    bias_attr=False)\n            ]\n            for _ in range(num_convs - 1):\n                convs.append(\n                    ConvBNReLU(\n                        in_channels=channels,\n                        out_channels=channels,\n                        kernel_size=1,\n                        bias_attr=False))\n        else:\n            convs = [nn.Conv2D(in_channels, channels, 1)]\n            for _ in range(num_convs - 1):\n                convs.append(nn.Conv2D(channels, channels, 1))\n\n        if len(convs) > 1:\n            convs = nn.Sequential(*convs)\n        else:\n            convs = convs[0]\n        return convs\n\n    def forward(self, query_feats: paddle.Tensor, key_feats: 
paddle.Tensor) -> paddle.Tensor:\n        query_shape = paddle.shape(query_feats)\n        query = self.query_project(query_feats)\n        if self.query_downsample is not None:\n            query = self.query_downsample(query)\n        query = query.flatten(2).transpose([0, 2, 1])\n\n        key = self.key_project(key_feats)\n        value = self.value_project(key_feats)\n\n        if self.key_downsample is not None:\n            key = self.key_downsample(key)\n            value = self.key_downsample(value)\n\n        key = key.flatten(2)\n        value = value.flatten(2).transpose([0, 2, 1])\n        sim_map = paddle.matmul(query, key)\n        if self.matmul_norm:\n            sim_map = (self.channels**-0.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        context = paddle.matmul(sim_map, value)\n        context = paddle.transpose(context, [0, 2, 1])\n\n        context = paddle.reshape(\n            context, [0, self.out_channels, query_shape[2], query_shape[3]])\n\n        if self.out_project is not None:\n            context = self.out_project(context)\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_voc/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom isanet_resnet50_voc.resnet import ResNet50_vd\nimport isanet_resnet50_voc.layers as layers\n\n\n@moduleinfo(\n    name=\"isanet_resnet50_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"ISANetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass ISANet(nn.Layer):\n    \"\"\"Interlaced Sparse Self-Attention for Semantic Segmentation.\n\n    The original article refers to Lang Huang, et al. 
\"Interlaced Sparse Self-Attention for Semantic Segmentation\"\n    (https://arxiv.org/abs/1907.12273).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): The values in the tuple indicate the indices of output of backbone.\n        isa_channels (int): The channels of ISA Module.\n        down_factor (tuple): Divide the height and width dimension to (Ph, PW) groups.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 isa_channels: int = 256,\n                 down_factor: Tuple[int] = (8, 8),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(ISANet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = ISAHead(num_classes, in_channels, isa_channels, down_factor,\n                            enable_auxiliary_loss)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n         
   self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        feats = self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        logit_list = [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners,\n                align_mode=1) for logit in logit_list\n        ]\n\n        return logit_list\n\n\nclass ISAHead(nn.Layer):\n    \"\"\"\n    The ISAHead.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        in_channels (tuple): The number of input channels.\n        isa_channels (int): The channels of ISA Module.\n        down_factor (tuple): Divide the height and width dimension to (Ph, PW) groups.\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. 
Default: True.\n    \"\"\"\n\n    def __init__(self, \n                 num_classes: int, \n                 in_channels: Tuple[int], \n                 isa_channels: int, \n                 down_factor: Tuple[int],\n                 enable_auxiliary_loss: bool):\n        super(ISAHead, self).__init__()\n        self.in_channels = in_channels[-1]\n        inter_channels = self.in_channels // 4\n        self.inter_channels = inter_channels\n        self.down_factor = down_factor\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n        self.in_conv = layers.ConvBNReLU(\n            self.in_channels, inter_channels, 3, bias_attr=False)\n        self.global_relation = SelfAttentionBlock(inter_channels, isa_channels)\n        self.local_relation = SelfAttentionBlock(inter_channels, isa_channels)\n        self.out_conv = layers.ConvBNReLU(\n            inter_channels * 2, inter_channels, 1, bias_attr=False)\n        self.cls = nn.Sequential(\n            nn.Dropout2D(p=0.1), nn.Conv2D(inter_channels, num_classes, 1))\n        self.aux = nn.Sequential(\n            layers.ConvBNReLU(\n                in_channels=1024,\n                out_channels=256,\n                kernel_size=3,\n                bias_attr=False), nn.Dropout2D(p=0.1),\n            nn.Conv2D(256, num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        C3, C4 = feat_list\n        x = self.in_conv(C4)\n        x_shape = paddle.shape(x)\n        P_h, P_w = self.down_factor\n        Q_h, Q_w = paddle.ceil(x_shape[2] / P_h).astype('int32'), paddle.ceil(\n            x_shape[3] / P_w).astype('int32')\n        pad_h, pad_w = (Q_h * P_h - x_shape[2]).astype('int32'), (\n                Q_w * P_w - x_shape[3]).astype('int32')\n        if pad_h > 0 or pad_w > 0:\n            padding = paddle.concat([\n                pad_w // 2, pad_w - pad_w // 2, pad_h // 2, pad_h - pad_h // 2\n            ],\n                axis=0)\n            feat = F.pad(x, 
padding)\n        else:\n            feat = x\n\n        feat = feat.reshape([0, x_shape[1], Q_h, P_h, Q_w, P_w])\n        feat = feat.transpose([0, 3, 5, 1, 2,\n                               4]).reshape([-1, self.inter_channels, Q_h, Q_w])\n        feat = self.global_relation(feat)\n\n        feat = feat.reshape([x_shape[0], P_h, P_w, x_shape[1], Q_h, Q_w])\n        feat = feat.transpose([0, 4, 5, 3, 1,\n                               2]).reshape([-1, self.inter_channels, P_h, P_w])\n        feat = self.local_relation(feat)\n\n        feat = feat.reshape([x_shape[0], Q_h, Q_w, x_shape[1], P_h, P_w])\n        feat = feat.transpose([0, 3, 1, 4, 2, 5]).reshape(\n            [0, self.inter_channels, P_h * Q_h, P_w * Q_w])\n        if pad_h > 0 or pad_w > 0:\n            feat = paddle.slice(\n                feat,\n                axes=[2, 3],\n                starts=[pad_h // 2, pad_w // 2],\n                ends=[pad_h // 2 + x_shape[2], pad_w // 2 + x_shape[3]])\n\n        feat = self.out_conv(paddle.concat([feat, x], axis=1))\n        output = self.cls(feat)\n\n        if self.enable_auxiliary_loss:\n            auxout = self.aux(C3)\n            return [output, auxout]\n        else:\n            return [output]\n\n\nclass SelfAttentionBlock(layers.AttentionBlock):\n    \"\"\"General self-attention block/non-local block.\n\n       Args:\n            in_channels (int): Input channels of key/query feature.\n            channels (int): Output channels of key/query transform.\n    \"\"\"\n\n    def __init__(self, in_channels, channels):\n        super(SelfAttentionBlock, self).__init__(\n            key_in_channels=in_channels,\n            query_in_channels=in_channels,\n            channels=channels,\n            out_channels=in_channels,\n            share_key_query=False,\n            query_downsample=None,\n            key_downsample=None,\n            key_query_num_convs=2,\n            key_query_norm=True,\n            value_out_num_convs=1,\n            
value_out_norm=False,\n            matmul_norm=True,\n            with_out=False)\n\n        self.output_project = self.build_project(\n            in_channels, in_channels, num_convs=1, use_conv_module=True)\n\n    def forward(self, x):\n        context = super(SelfAttentionBlock, self).forward(x, x)\n        return self.output_project(context)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/isanet_resnet50_voc/resnet.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List, Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nimport isanet_resnet50_voc.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1, \" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            
data_format=data_format)\n\n        self._batch_norm = layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization 
training\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if 
self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters 
= [64, 128, 256, 512]\n\n        # for channels of four returned stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = 
dilation_dict[\n                        block] if dilation_dict and block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else 
num_filters[block],\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/README.md",
    "content": "# lseg\n\n|模型名称|lseg|\n| :--- | :---: |\n|类别|图像-图像分割|\n|网络|LSeg|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|1.63GB|\n|指标|-|\n|最新更新日期|2022-09-22|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n  - 网络结构：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/5617725d3c5640c2b24c27294437d73c83c63f78498e40b5ab2e94d01128c70c\" hspace='10'/> <br />\n      </p>\n\n  - 样例结果示例：\n      <p align=\"center\">\n      <img src=\"https://ai-studio-static-online.cdn.bcebos.com/2168a1e6270c40e896dfc74f2127e964ee8a8c7164aa41e3afafe1657d1e2bba\" hspace='10'/>\n      </p>\n\n- ### 模型介绍\n\n  - 文本驱动的图像语义分割模型（Language-driven Semantic Segmentation），即通过文本控制模型的分割类别实现指定类别的图像语义分割算法。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0  \n\n- ### 2.安装\n\n    - ```shell\n      $ hub install lseg\n      ```\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n  - ### 1、命令行预测\n\n    ```shell\n    $ hub run lseg \\\n        --input_path \"/PATH/TO/IMAGE\" \\\n        --labels \"Category 1\" \"Category 2\" \"Category n\" \\\n        --output_dir \"lseg_output\"\n    ```\n\n  - ### 2、预测代码示例\n\n    ```python\n    import paddlehub as hub\n    import cv2\n\n    module = hub.Module(name=\"lseg\")\n    result = module.segment(\n        image=cv2.imread('/PATH/TO/IMAGE'),\n        labels=[\"Category 1\", \"Category 2\", \"Category n\"],\n        visualization=True,\n        output_dir='lseg_output'\n    )\n    ```\n\n  - ### 3、API\n\n    ```python\n    def segment(\n        image: Union[str, numpy.ndarray],\n        labels: Union[str, List[str]],\n        visualization: bool = False,\n        output_dir: str = 'lseg_output'\n    ) -> Dict[str, Union[numpy.ndarray, Dict[str, numpy.ndarray]]]\n    ```\n\n    - 语义分割 API\n\n    - 
**参数**\n\n      * image (Union\\[str, numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      * labels (Union\\[str, List\\[str\\]\\]): 类别文本标签；\n      * visualization (bool): 是否将识别结果保存为图片文件；\n      * output\\_dir (str): 保存处理结果的文件目录。\n\n    - **返回**\n\n      * res (Dict\\[str, Union\\[numpy.ndarray, Dict\\[str, numpy.ndarray\\]\\]\\]): 识别结果的字典，字典中包含如下元素：\n        * gray (numpy.ndarray): 灰度分割结果 (GRAY)；\n        * color (numpy.ndarray): 伪彩色图分割结果 (BGR)；\n        * mix (numpy.ndarray): 叠加原图和伪彩色图的分割结果 (BGR)；\n        * classes (Dict\\[str, numpy.ndarray\\]): 各个类别标签的分割抠图结果 (BGRA)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个文本驱动的语义分割的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n    ```shell\n     $ hub serving start -m lseg\n    ```\n\n    - 这样就完成了一个文本驱动的语义分割服务化API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n    import base64\n\n    import cv2\n    import numpy as np\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    # 发送HTTP请求\n    org_im = cv2.imread('/PATH/TO/IMAGE')\n    data = {\n        'image': cv2_to_base64(org_im),\n        'labels': [\"Category 1\", \"Category 2\", \"Category n\"]\n    }\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/lseg\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 结果转换\n    results = r.json()['results']\n    results = {\n        'gray': base64_to_cv2(results['gray']),\n        'color': base64_to_cv2(results['color']),\n        'mix': base64_to_cv2(results['mix']),\n        'classes': {\n            k: base64_to_cv2(v) for k, v in 
results['classes'].items()\n        }\n    }\n\n    # 保存输出\n    cv2.imwrite('mix.jpg', results['mix'])\n    ```\n\n## 五、参考资料\n\n* 论文：[Language-driven Semantic Segmentation](https://arxiv.org/abs/2201.03546)\n\n* 官方实现：[isl-org/lang-seg](https://github.com/isl-org/lang-seg)\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install lseg==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/models/__init__.py",
    "content": "from .lseg import LSeg\n\n__all__ = ['LSeg']\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/models/clip.py",
    "content": "import paddle\nimport paddle.nn as nn\nfrom paddlenlp.transformers.clip.modeling import TextTransformer\n\n\nclass CLIPText(nn.Layer):\n\n    def __init__(self,\n                 max_text_length: int = 77,\n                 vocab_size: int = 49408,\n                 text_embed_dim: int = 512,\n                 text_heads: int = 8,\n                 text_layers: int = 12,\n                 text_hidden_act: str = \"quick_gelu\",\n                 projection_dim: int = 512):\n        super().__init__()\n\n        self.text_model = TextTransformer(context_length=max_text_length,\n                                          transformer_width=text_embed_dim,\n                                          transformer_heads=text_heads,\n                                          transformer_layers=text_layers,\n                                          vocab_size=vocab_size,\n                                          activation=text_hidden_act,\n                                          normalize_before=True)\n\n        self.text_projection = paddle.create_parameter((text_embed_dim, projection_dim), paddle.get_default_dtype())\n\n    def get_text_features(\n        self,\n        input_ids,\n        attention_mask=None,\n        position_ids=None,\n        output_attentions=False,\n        output_hidden_states=False,\n        return_dict=False,\n    ):\n        text_outputs = self.text_model(input_ids=input_ids,\n                                       position_ids=position_ids,\n                                       attention_mask=attention_mask,\n                                       output_attentions=output_attentions,\n                                       output_hidden_states=output_hidden_states,\n                                       return_dict=return_dict)\n        pooled_output = text_outputs[1]\n        text_features = paddle.matmul(pooled_output, self.text_projection)\n        return text_features\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/models/lseg.py",
    "content": "import paddle.nn as nn\n\nfrom .clip import CLIPText\nfrom .scratch import Scratch\nfrom .vit import ViT\n\n\nclass LSeg(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        self.clip = CLIPText()\n        self.vit = ViT()\n        self.scratch = Scratch()\n\n    def forward(self, images, texts):\n        layer_1, layer_2, layer_3, layer_4 = self.vit.forward(images)\n        text_features = self.clip.get_text_features(texts)\n        return self.scratch.forward(layer_1, layer_2, layer_3, layer_4, text_features)\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/models/scratch.py",
    "content": "import numpy as np\nimport paddle\nimport paddle.nn as nn\n\n\nclass Interpolate(nn.Layer):\n    \"\"\"Interpolation module.\"\"\"\n\n    def __init__(self, scale_factor, mode, align_corners=False):\n        \"\"\"Init.\n\n        Args:\n            scale_factor (float): scaling\n            mode (str): interpolation mode\n        \"\"\"\n        super(Interpolate, self).__init__()\n\n        self.interp = nn.functional.interpolate\n        self.scale_factor = scale_factor\n        self.mode = mode\n        self.align_corners = align_corners\n\n    def forward(self, x):\n        \"\"\"Forward pass.\n\n        Args:\n            x (tensor): input\n\n        Returns:\n            tensor: interpolated data\n        \"\"\"\n\n        x = self.interp(\n            x,\n            scale_factor=self.scale_factor,\n            mode=self.mode,\n            align_corners=self.align_corners,\n        )\n\n        return x\n\n\nclass ResidualConvUnit(nn.Layer):\n    \"\"\"Residual convolution module.\"\"\"\n\n    def __init__(self, features):\n        \"\"\"Init.\n\n        Args:\n            features (int): number of features\n        \"\"\"\n        super().__init__()\n\n        self.conv1 = nn.Conv2D(features, features, kernel_size=3, stride=1, padding=1)\n\n        self.conv2 = nn.Conv2D(features, features, kernel_size=3, stride=1, padding=1)\n\n        # NOTE: paddle.nn.ReLU takes no `inplace` argument\n        self.relu = nn.ReLU()\n\n    def forward(self, x):\n        \"\"\"Forward pass.\n\n        Args:\n            x (tensor): input\n\n        Returns:\n            tensor: output\n        \"\"\"\n        out = self.relu(x)\n        out = self.conv1(out)\n        out = self.relu(out)\n        out = self.conv2(out)\n\n        return out + x\n\n\nclass FeatureFusionBlock(nn.Layer):\n    \"\"\"Feature fusion block.\"\"\"\n\n    def __init__(self, features):\n        \"\"\"Init.\n\n        Args:\n            features (int): number of features\n        \"\"\"\n        super(FeatureFusionBlock, 
self).__init__()\n\n        self.resConfUnit1 = ResidualConvUnit(features)\n        self.resConfUnit2 = ResidualConvUnit(features)\n\n    def forward(self, *xs):\n        \"\"\"Forward pass.\n\n        Returns:\n            tensor: output\n        \"\"\"\n        output = xs[0]\n\n        if len(xs) == 2:\n            output += self.resConfUnit1(xs[1])\n\n        output = self.resConfUnit2(output)\n\n        output = nn.functional.interpolate(output, scale_factor=2, mode=\"bilinear\", align_corners=True)\n\n        return output\n\n\nclass ResidualConvUnit_custom(nn.Layer):\n    \"\"\"Residual convolution module.\"\"\"\n\n    def __init__(self, features, activation, bn):\n        \"\"\"Init.\n\n        Args:\n            features (int): number of features\n        \"\"\"\n        super().__init__()\n\n        self.bn = bn\n\n        self.groups = 1\n\n        self.conv1 = nn.Conv2D(\n            features,\n            features,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            bias_attr=not self.bn,\n            groups=self.groups,\n        )\n\n        self.conv2 = nn.Conv2D(\n            features,\n            features,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            bias_attr=not self.bn,\n            groups=self.groups,\n        )\n\n        if self.bn == True:\n            self.bn1 = nn.BatchNorm2D(features)\n            self.bn2 = nn.BatchNorm2D(features)\n\n        self.activation = activation\n\n    def forward(self, x):\n        \"\"\"Forward pass.\n\n        Args:\n            x (tensor): input\n\n        Returns:\n            tensor: output\n        \"\"\"\n\n        out = self.activation(x)\n        out = self.conv1(out)\n        if self.bn == True:\n            out = self.bn1(out)\n\n        out = self.activation(out)\n        out = self.conv2(out)\n        if self.bn == True:\n            out = self.bn2(out)\n\n        if self.groups > 1:\n            out = 
self.conv_merge(out)\n\n        return out + x\n\n\nclass FeatureFusionBlock_custom(nn.Layer):\n    \"\"\"Feature fusion block.\"\"\"\n\n    def __init__(\n            self,\n            features,\n            activation=nn.ReLU(),\n            deconv=False,\n            bn=False,\n            expand=False,\n            align_corners=True,\n    ):\n        \"\"\"Init.\n\n        Args:\n            features (int): number of features\n        \"\"\"\n        super(FeatureFusionBlock_custom, self).__init__()\n\n        self.deconv = deconv\n        self.align_corners = align_corners\n\n        self.groups = 1\n\n        self.expand = expand\n        out_features = features\n        if self.expand == True:\n            out_features = features // 2\n\n        self.out_conv = nn.Conv2D(\n            features,\n            out_features,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            bias_attr=True,\n            groups=1,\n        )\n\n        self.resConfUnit1 = ResidualConvUnit_custom(features, activation, bn)\n        self.resConfUnit2 = ResidualConvUnit_custom(features, activation, bn)\n\n    def forward(self, *xs):\n        \"\"\"Forward pass.\n\n        Returns:\n            tensor: output\n        \"\"\"\n        output = xs[0]\n\n        if len(xs) == 2:\n            res = self.resConfUnit1(xs[1])\n            output += res\n\n        output = self.resConfUnit2(output)\n\n        output = nn.functional.interpolate(output, scale_factor=2, mode=\"bilinear\", align_corners=self.align_corners)\n\n        output = self.out_conv(output)\n\n        return output\n\n\nclass Scratch(nn.Layer):\n\n    def __init__(self, in_channels=[256, 512, 1024, 1024], out_channels=256):\n        super().__init__()\n        self.out_c = 512\n        self.logit_scale = paddle.to_tensor(np.exp(np.log([1 / 0.07])))\n        self.layer1_rn = nn.Conv2D(\n            in_channels[0],\n            out_channels,\n            kernel_size=3,\n            
stride=1,\n            padding=1,\n            bias_attr=False,\n            groups=1,\n        )\n        self.layer2_rn = nn.Conv2D(\n            in_channels[1],\n            out_channels,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            bias_attr=False,\n            groups=1,\n        )\n        self.layer3_rn = nn.Conv2D(\n            in_channels[2],\n            out_channels,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            bias_attr=False,\n            groups=1,\n        )\n        self.layer4_rn = nn.Conv2D(\n            in_channels[3],\n            out_channels,\n            kernel_size=3,\n            stride=1,\n            padding=1,\n            bias_attr=False,\n            groups=1,\n        )\n\n        self.refinenet1 = FeatureFusionBlock_custom(out_channels, bn=True)\n        self.refinenet2 = FeatureFusionBlock_custom(out_channels, bn=True)\n        self.refinenet3 = FeatureFusionBlock_custom(out_channels, bn=True)\n        self.refinenet4 = FeatureFusionBlock_custom(out_channels, bn=True)\n\n        self.head1 = nn.Conv2D(out_channels, self.out_c, kernel_size=1)\n\n        self.output_conv = nn.Sequential(Interpolate(scale_factor=2, mode=\"bilinear\", align_corners=True))\n\n    def forward(self, layer_1, layer_2, layer_3, layer_4, text_features):\n\n        layer_1_rn = self.layer1_rn(layer_1)\n        layer_2_rn = self.layer2_rn(layer_2)\n        layer_3_rn = self.layer3_rn(layer_3)\n        layer_4_rn = self.layer4_rn(layer_4)\n\n        path_4 = self.refinenet4(layer_4_rn)\n        path_3 = self.refinenet3(path_4, layer_3_rn)\n        path_2 = self.refinenet2(path_3, layer_2_rn)\n        path_1 = self.refinenet1(path_2, layer_1_rn)\n\n        image_features = self.head1(path_1)\n\n        imshape = image_features.shape\n        image_features = image_features.transpose((0, 2, 3, 1)).reshape((-1, self.out_c))\n\n        # normalized features\n        image_features = 
image_features / image_features.norm(axis=-1, keepdim=True)\n        text_features = text_features / text_features.norm(axis=-1, keepdim=True)\n\n        logits_per_image = self.logit_scale * image_features @ text_features.t()\n\n        out = logits_per_image.reshape((imshape[0], imshape[2], imshape[3], -1)).transpose((0, 3, 1, 2))\n\n        out = self.output_conv(out)\n\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/models/vit.py",
    "content": "import math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddleclas.ppcls.arch.backbone.model_zoo.vision_transformer import VisionTransformer\n\n\nclass Slice(nn.Layer):\n\n    def __init__(self, start_index=1):\n        super(Slice, self).__init__()\n        self.start_index = start_index\n\n    def forward(self, x):\n        return x[:, self.start_index:]\n\n\nclass AddReadout(nn.Layer):\n\n    def __init__(self, start_index=1):\n        super(AddReadout, self).__init__()\n        self.start_index = start_index\n\n    def forward(self, x):\n        if self.start_index == 2:\n            readout = (x[:, 0] + x[:, 1]) / 2\n        else:\n            readout = x[:, 0]\n        return x[:, self.start_index:] + readout.unsqueeze(1)\n\n\nclass Transpose(nn.Layer):\n\n    def __init__(self, dim0, dim1):\n        super(Transpose, self).__init__()\n        self.dim0 = dim0\n        self.dim1 = dim1\n\n    def forward(self, x):\n        prems = list(range(x.dim()))\n        prems[self.dim0], prems[self.dim1] = prems[self.dim1], prems[self.dim0]\n        x = x.transpose(prems)\n        return x\n\n\nclass Unflatten(nn.Layer):\n\n    def __init__(self, start_axis, shape):\n        super(Unflatten, self).__init__()\n        self.start_axis = start_axis\n        self.shape = shape\n\n    def forward(self, x):\n        return paddle.reshape(x, x.shape[:self.start_axis] + [self.shape])\n\n\nclass ProjectReadout(nn.Layer):\n\n    def __init__(self, in_features, start_index=1):\n        super(ProjectReadout, self).__init__()\n        self.start_index = start_index\n\n        self.project = nn.Sequential(nn.Linear(2 * in_features, in_features), nn.GELU())\n\n    def forward(self, x):\n        readout = x[:, 0].unsqueeze(1).expand_as(x[:, self.start_index:])\n        features = paddle.concat((x[:, self.start_index:], readout), -1)\n\n        return self.project(features)\n\n\nclass ViT(VisionTransformer):\n\n    def 
__init__(self,\n                 img_size=384,\n                 patch_size=16,\n                 in_chans=3,\n                 class_num=1000,\n                 embed_dim=1024,\n                 depth=24,\n                 num_heads=16,\n                 mlp_ratio=4,\n                 qkv_bias=True,\n                 qk_scale=None,\n                 drop_rate=0,\n                 attn_drop_rate=0,\n                 drop_path_rate=0,\n                 norm_layer='nn.LayerNorm',\n                 epsilon=1e-6,\n                 **kwargs):\n        super().__init__(img_size, patch_size, in_chans, class_num, embed_dim, depth, num_heads, mlp_ratio, qkv_bias,\n                         qk_scale, drop_rate, attn_drop_rate, drop_path_rate, norm_layer, epsilon, **kwargs)\n        self.patch_size = patch_size\n        self.start_index = 1\n        features = [256, 512, 1024, 1024]\n        readout_oper = [ProjectReadout(embed_dim, self.start_index) for out_feat in features]\n        self.act_postprocess1 = nn.Sequential(\n            readout_oper[0],\n            Transpose(1, 2),\n            Unflatten(2, [img_size // 16, img_size // 16]),\n            nn.Conv2D(\n                in_channels=embed_dim,\n                out_channels=features[0],\n                kernel_size=1,\n                stride=1,\n                padding=0,\n            ),\n            nn.Conv2DTranspose(\n                in_channels=features[0],\n                out_channels=features[0],\n                kernel_size=4,\n                stride=4,\n                padding=0,\n                dilation=1,\n                groups=1,\n            ),\n        )\n\n        self.act_postprocess2 = nn.Sequential(\n            readout_oper[1],\n            Transpose(1, 2),\n            Unflatten(2, [img_size // 16, img_size // 16]),\n            nn.Conv2D(\n                in_channels=embed_dim,\n                out_channels=features[1],\n                kernel_size=1,\n                stride=1,\n                
padding=0,\n            ),\n            nn.Conv2DTranspose(\n                in_channels=features[1],\n                out_channels=features[1],\n                kernel_size=2,\n                stride=2,\n                padding=0,\n                dilation=1,\n                groups=1,\n            ),\n        )\n\n        self.act_postprocess3 = nn.Sequential(\n            readout_oper[2],\n            Transpose(1, 2),\n            Unflatten(2, [img_size // 16, img_size // 16]),\n            nn.Conv2D(\n                in_channels=embed_dim,\n                out_channels=features[2],\n                kernel_size=1,\n                stride=1,\n                padding=0,\n            ),\n        )\n\n        self.act_postprocess4 = nn.Sequential(\n            readout_oper[3],\n            Transpose(1, 2),\n            Unflatten(2, [img_size // 16, img_size // 16]),\n            nn.Conv2D(\n                in_channels=embed_dim,\n                out_channels=features[3],\n                kernel_size=1,\n                stride=1,\n                padding=0,\n            ),\n            nn.Conv2D(\n                in_channels=features[3],\n                out_channels=features[3],\n                kernel_size=3,\n                stride=2,\n                padding=1,\n            ),\n        )\n\n        self.norm = nn.Identity()\n        self.head = nn.Identity()\n\n    def _resize_pos_embed(self, posemb, gs_h, gs_w):\n        posemb_tok, posemb_grid = (\n            posemb[:, :self.start_index],\n            posemb[0, self.start_index:],\n        )\n\n        gs_old = int(math.sqrt(len(posemb_grid)))\n\n        posemb_grid = posemb_grid.reshape((1, gs_old, gs_old, -1)).transpose((0, 3, 1, 2))\n        posemb_grid = F.interpolate(posemb_grid, size=(gs_h, gs_w), mode=\"bilinear\")\n        posemb_grid = posemb_grid.transpose((0, 2, 3, 1)).reshape((1, gs_h * gs_w, -1))\n\n        posemb = paddle.concat([posemb_tok, posemb_grid], axis=1)\n\n        return posemb\n\n    
def forward(self, x):\n        b, c, h, w = x.shape\n\n        pos_embed = self._resize_pos_embed(self.pos_embed, h // self.patch_size, w // self.patch_size)\n        x = self.patch_embed.proj(x).flatten(2).transpose((0, 2, 1))\n\n        cls_tokens = self.cls_token.expand((b, -1, -1))\n        x = paddle.concat((cls_tokens, x), axis=1)\n\n        x = x + pos_embed\n        x = self.pos_drop(x)\n\n        outputs = []\n        for index, blk in enumerate(self.blocks):\n            x = blk(x)\n            if index in [5, 11, 17, 23]:\n                outputs.append(x)\n\n        layer_1 = self.act_postprocess1[0:2](outputs[0])\n        layer_2 = self.act_postprocess2[0:2](outputs[1])\n        layer_3 = self.act_postprocess3[0:2](outputs[2])\n        layer_4 = self.act_postprocess4[0:2](outputs[3])\n\n        shape = (-1, 1024, h // self.patch_size, w // self.patch_size)\n        layer_1 = layer_1.reshape(shape)\n        layer_2 = layer_2.reshape(shape)\n        layer_3 = layer_3.reshape(shape)\n        layer_4 = layer_4.reshape(shape)\n\n        layer_1 = self.act_postprocess1[3:len(self.act_postprocess1)](layer_1)\n        layer_2 = self.act_postprocess2[3:len(self.act_postprocess2)](layer_2)\n        layer_3 = self.act_postprocess3[3:len(self.act_postprocess3)](layer_3)\n        layer_4 = self.act_postprocess4[3:len(self.act_postprocess4)](layer_4)\n\n        return layer_1, layer_2, layer_3, layer_4\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom typing import Dict\nfrom typing import List\nfrom typing import Union\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as transforms\nfrom paddlenlp.transformers.clip.tokenizer import CLIPTokenizer\n\nimport paddlehub as hub\nfrom . import models\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name='lseg',\n    version='1.0.0',\n    type=\"CV/semantic_segmentation\",\n    author=\"\",\n    author_email=\"\",\n    summary=\"Language-driven Semantic Segmentation.\",\n)\nclass LSeg(models.LSeg):\n\n    def __init__(self):\n        super(LSeg, self).__init__()\n        self.default_pretrained_model_path = os.path.join(self.directory, 'ckpts', 'LSeg.pdparams')\n        state_dict = paddle.load(self.default_pretrained_model_path)\n        self.set_state_dict(state_dict)\n        self.eval()\n        self.transforms = transforms.Compose([\n            transforms.ToTensor(),\n            transforms.Normalize([0.5, 0.5, 0.5], [0.5, 0.5, 0.5]),\n        ])\n        self.tokenizer = CLIPTokenizer.from_pretrained('openai/clip-vit-base-patch32')\n\n        self.language_recognition = hub.Module(name='baidu_language_recognition')\n        self.translate = hub.Module(name='baidu_translate')\n\n    @staticmethod\n    def get_colormap(n):\n        assert n <= 256, \"num_class should be less than 256.\"\n\n        pallete = [0] * (256 * 3)\n\n        for j in range(0, n):\n            lab = j\n            pallete[j * 3 + 0] = 0\n            
pallete[j * 3 + 1] = 0\n            pallete[j * 3 + 2] = 0\n            i = 0\n            while (lab > 0):\n                pallete[j * 3 + 0] |= (((lab >> 0) & 1) << (7 - i))\n                pallete[j * 3 + 1] |= (((lab >> 1) & 1) << (7 - i))\n                pallete[j * 3 + 2] |= (((lab >> 2) & 1) << (7 - i))\n                i = i + 1\n                lab >>= 3\n\n        return np.asarray(pallete, dtype=np.uint8).reshape(256, 1, 3)\n\n    def segment(self,\n                image: Union[str, np.ndarray],\n                labels: Union[str, List[str]],\n                visualization: bool = False,\n                output_dir: str = 'lseg_output') -> Dict[str, Union[np.ndarray, Dict[str, np.ndarray]]]:\n        if isinstance(image, str):\n            image = cv2.imread(image)\n        elif isinstance(image, np.ndarray):\n            image = image\n        else:\n            raise Exception(\"image should be a str / np.ndarray\")\n\n        if isinstance(labels, str):\n            labels = [labels, 'other']\n            print('\"other\" category label is automatically added because the length of labels is equal to 1')\n            print('new labels: ', labels)\n        elif isinstance(labels, list):\n            if len(labels) == 1:\n                labels.append('other')\n                print('\"other\" category label is automatically added because the length of labels is equal to 1')\n                print('new labels: ', labels)\n            elif len(labels) == 0:\n                raise Exception(\"labels should not be empty.\")\n        else:\n            raise Exception(\"labels should be a str or list.\")\n\n        class_num = len(labels)\n\n        labels_ = list(set(labels))\n        labels_.sort(key=labels.index)\n        labels = labels_\n\n        input_labels = []\n        for label in labels:\n            from_lang = self.language_recognition.recognize(query=label)\n            if from_lang != 'en':\n                label = 
self.translate.translate(query=label, from_lang=from_lang, to_lang='en')\n            input_labels.append(label)\n\n        labels_dict = {k: v for k, v in zip(input_labels, labels)}\n\n        input_labels_ = list(set(input_labels))\n        input_labels_.sort(key=input_labels.index)\n        input_labels = input_labels_\n\n        labels = []\n        for input_label in input_labels:\n            labels.append(labels_dict[input_label])\n\n        if len(labels) < class_num:\n            print('remove the same labels...')\n            print('new labels: ', labels)\n\n        h, w = image.shape[:2]\n        image = image[:-(h % 32) if h % 32 else None, :-(w % 32) if w % 32 else None]\n        images = self.transforms(cv2.cvtColor(image, cv2.COLOR_BGR2RGB)).unsqueeze(0)\n        texts = self.tokenizer(input_labels, padding=True, return_tensors=\"pd\")['input_ids']\n\n        with paddle.no_grad():\n            results = self.forward(images, texts)\n            results = paddle.argmax(results, 1).cast(paddle.uint8)\n            gray_seg = results.numpy()[0]\n\n        colormap = self.get_colormap(len(labels))\n        color_seg = cv2.applyColorMap(gray_seg, colormap)\n        mix_seg = cv2.addWeighted(image, 0.5, color_seg, 0.5, 0.0)\n\n        classes_seg = {}\n        for i, label in enumerate(labels):\n            mask = ((gray_seg == i).astype('uint8') * 255)[..., None]\n            classes_seg[label] = np.concatenate([image, mask], 2)\n\n        if visualization:\n            save_dir = os.path.join(output_dir, str(int(time.time())))\n            if not os.path.isdir(save_dir):\n                os.makedirs(save_dir)\n            for label, dst in classes_seg.items():\n                cv2.imwrite(os.path.join(save_dir, '%s.png' % label), dst)\n\n        return {'gray': gray_seg, 'color': color_seg, 'mix': mix_seg, 'classes': classes_seg}\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser 
= argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.parser.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.parser.add_argument('--labels', type=str, nargs='+', help=\"segmentation labels.\")\n        self.parser.add_argument('--output_dir',\n                                 type=str,\n                                 default='lseg_output',\n                                 help=\"The directory to save output images.\")\n        args = self.parser.parse_args(argvs)\n        self.segment(image=args.input_path, labels=args.labels, visualization=True, output_dir=args.output_dir)\n        return 'segmentation results are saved in %s' % args.output_dir\n\n    @serving\n    def serving_method(self, image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        image = base64_to_cv2(image)\n        results = self.segment(image=image, **kwargs)\n\n        return {\n            'gray': cv2_to_base64(results['gray']),\n            'color': cv2_to_base64(results['color']),\n            'mix': cv2_to_base64(results['mix']),\n            'classes': {k: cv2_to_base64(v)\n                        for k, v in results['classes'].items()}\n        }\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/requirements.txt",
    "content": "paddleclas>=2.4.0\npaddlenlp>=2.4.0\nftfy\nregex\n"
  },
  {
    "path": "modules/image/semantic_segmentation/lseg/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport numpy as np\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/mJaD10XeD7w/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8M3x8Y2F0fGVufDB8fHx8MTY2MzczNDc3Mw&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"lseg\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('lseg_output')\n\n    def test_segment1(self):\n        results = self.module.segment(image='tests/test.jpg', labels=['other', 'cat'], visualization=False)\n\n        self.assertIsInstance(results['mix'], np.ndarray)\n        self.assertIsInstance(results['color'], np.ndarray)\n        self.assertIsInstance(results['gray'], np.ndarray)\n        self.assertIsInstance(results['classes']['other'], np.ndarray)\n        self.assertIsInstance(results['classes']['cat'], np.ndarray)\n\n    def test_segment2(self):\n        results = self.module.segment(image=cv2.imread('tests/test.jpg'), labels=['other', 'cat'], visualization=True)\n\n        self.assertIsInstance(results['mix'], np.ndarray)\n        self.assertIsInstance(results['color'], np.ndarray)\n        self.assertIsInstance(results['gray'], np.ndarray)\n        self.assertIsInstance(results['classes']['other'], np.ndarray)\n        self.assertIsInstance(results['classes']['cat'], np.ndarray)\n\n    def test_segment3(self):\n        results = self.module.segment(image=cv2.imread('tests/test.jpg'), labels=['其他', '猫'], visualization=False)\n\n        
self.assertIsInstance(results['mix'], np.ndarray)\n        self.assertIsInstance(results['color'], np.ndarray)\n        self.assertIsInstance(results['gray'], np.ndarray)\n        self.assertIsInstance(results['classes']['其他'], np.ndarray)\n        self.assertIsInstance(results['classes']['猫'], np.ndarray)\n\n    def test_segment4(self):\n        self.assertRaises(Exception, self.module.segment, image=['tests/test.jpg'], labels=['other', 'cat'])\n\n    def test_segment5(self):\n        self.assertRaises(AttributeError, self.module.segment, image='no.jpg', labels=['other', 'cat'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_cityscapes/README.md",
    "content": "# PaddleHub 图像分割\n\n## 模型预测\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用ocrnet_hrnetw18_cityscapes模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform， mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='ocrnet_hrnetw18_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 
是否使用GPU，默认为False；\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: workers的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔，单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n\n**Args**\n* `images`: 原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m ocrnet_hrnetw18_cityscapes\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下数行代码即可发送预测请求，获取预测结果：\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ocrnet_hrnetw18_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_cityscapes/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport ocrnet_hrnetw18_cityscapes.layers as L\n\n\nclass HRNet_W18(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et, al. \"HRNet：Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (18, 36).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default (18, 36, 72).\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default (4, 4, 4, 4).\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default (18, 36, 72, 144).\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: Tuple[int] = (4, ),\n                 stage1_num_channels: Tuple[int] = (64, ),\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: Tuple[int] = (4, 4),\n                 stage2_num_channels: Tuple[int] = (18, 36),\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: Tuple[int] = (4, 4, 4),\n                 stage3_num_channels: Tuple[int] = (18, 36, 72),\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: Tuple[int] = (4, 4, 4, 4),\n                 stage4_num_channels: Tuple[int] = (18, 36, 72, 144),\n                 has_se: bool = False,\n                 align_corners: bool = False):\n        super(HRNet_W18, self).__init__()\n\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = stage3_num_channels\n        
self.stage4_num_modules = stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = L.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = L.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            num_channels=self.stage4_num_channels,\n            
num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        x0_h, x0_w = st4[0].shape[2:]\n        x1 = F.interpolate(st4[1], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = block_func(conv)\n        return conv\n\n\nclass 
TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        L.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    L.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        self.basic_block_list = []\n\n        for i in 
range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = L.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBN(\n                
in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n            residual = 
self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: int, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = self.add_sublayer(\n          
          \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> 
paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        L.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = 
out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = residual.shape[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_cityscapes/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            
dilation=dilation,\n            groups=groups,\n            bias_attr=False)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = 
self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n  
  def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n              
  raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: Tuple[int],\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 
1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_cityscapes/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import List\n\nimport paddle\nimport numpy as np\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport ocrnet_hrnetw18_cityscapes.layers as L\nfrom ocrnet_hrnetw18_cityscapes.hrnet import HRNet_W18\n\n\n@moduleinfo(\n    name=\"ocrnet_hrnetw18_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"OCRNetHRNetW18 is a segmentation model pretrained on Cityscapes.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass OCRNetHRNetW18(nn.Layer):\n    \"\"\"\n    The OCRNet implementation based on PaddlePaddle.\n    The original article refers to\n        Yuan, Yuhui, et al. 
\"Object-Contextual Representations for Semantic Segmentation\"\n        (https://arxiv.org/pdf/1909.11065.pdf)\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (list): A list indicates the indices of output of backbone.\n            It can be either one or two values, if two values, the first index will be taken as\n            a deep-supervision feature in auxiliary layer; the second one will be taken as\n            input of pixel representation. If one value, it is taken by both above.\n        ocr_mid_channels (int, optional): The number of middle channels in OCRHead. Default: 512.\n        ocr_key_channels (int, optional): The number of key channels in ObjectAttentionBlock. Default: 256.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. 
Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: List[int] = [0],\n                 ocr_mid_channels: int = 512,\n                 ocr_key_channels: int = 256,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(OCRNetHRNetW18, self).__init__()\n        self.backbone = HRNet_W18()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = OCRHead(\n            num_classes=num_classes,\n            in_channels=in_channels,\n            ocr_mid_channels=ocr_mid_channels,\n            ocr_key_channels=ocr_key_channels)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: np.ndarray) -> np.ndarray:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feats = self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        logit_list = [\n            F.interpolate(logit, x.shape[2:], mode='bilinear', align_corners=self.align_corners) for logit in logit_list\n        ]\n        return logit_list\n\n\nclass OCRHead(nn.Layer):\n    \"\"\"\n    The Object contextual representation head.\n    Args:\n        num_classes(int): The unique number of target classes.\n        in_channels(tuple): The number of input channels.\n        ocr_mid_channels(int, 
optional): The number of middle channels in OCRHead. Default: 512.\n        ocr_key_channels(int, optional): The number of key channels in ObjectAttentionBlock. Default: 256.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: int, ocr_mid_channels: int = 512, ocr_key_channels: int = 256):\n        super().__init__()\n\n        self.num_classes = num_classes\n        self.spatial_gather = SpatialGatherBlock()\n        self.spatial_ocr = SpatialOCRModule(ocr_mid_channels, ocr_key_channels, ocr_mid_channels)\n\n        self.indices = [-2, -1] if len(in_channels) > 1 else [-1, -1]\n\n        self.conv3x3_ocr = L.ConvBNReLU(in_channels[self.indices[1]], ocr_mid_channels, 3, padding=1)\n        self.cls_head = nn.Conv2D(ocr_mid_channels, self.num_classes, 1)\n        self.aux_head = nn.Sequential(\n            L.ConvBNReLU(in_channels[self.indices[0]], in_channels[self.indices[0]], 1),\n            nn.Conv2D(in_channels[self.indices[0]], self.num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> paddle.Tensor:\n        feat_shallow, feat_deep = feat_list[self.indices[0]], feat_list[self.indices[1]]\n\n        soft_regions = self.aux_head(feat_shallow)\n        pixels = self.conv3x3_ocr(feat_deep)\n\n        object_regions = self.spatial_gather(pixels, soft_regions)\n        ocr = self.spatial_ocr(pixels, object_regions)\n\n        logit = self.cls_head(ocr)\n        return [logit, soft_regions]\n\n\nclass SpatialGatherBlock(nn.Layer):\n    \"\"\"Aggregation layer to compute the pixel-region representation.\"\"\"\n\n    def forward(self, pixels: paddle.Tensor, regions: paddle.Tensor) -> paddle.Tensor:\n        n, c, h, w = pixels.shape\n        _, k, _, _ = regions.shape\n\n        # pixels: from (n, c, h, w) to (n, h*w, c)\n        pixels = paddle.reshape(pixels, (n, c, h * w))\n        pixels = paddle.transpose(pixels, [0, 2, 1])\n\n        # regions: from (n, k, h, w) to (n, k, h*w)\n        regions = paddle.reshape(regions, (n, 
k, h * w))\n        regions = F.softmax(regions, axis=2)\n\n        # feats: from (n, k, c) to (n, c, k, 1)\n        feats = paddle.bmm(regions, pixels)\n        feats = paddle.transpose(feats, [0, 2, 1])\n        feats = paddle.unsqueeze(feats, axis=-1)\n\n        return feats\n\n\nclass SpatialOCRModule(nn.Layer):\n    \"\"\"Aggregate the global object representation to update the representation for each pixel.\"\"\"\n\n    def __init__(self, in_channels: int, key_channels: int, out_channels: int, dropout_rate: float = 0.1):\n        super().__init__()\n\n        self.attention_block = ObjectAttentionBlock(in_channels, key_channels)\n        self.conv1x1 = nn.Sequential(L.ConvBNReLU(2 * in_channels, out_channels, 1), nn.Dropout2D(dropout_rate))\n\n    def forward(self, pixels: paddle.Tensor, regions: paddle.Tensor) -> paddle.Tensor:\n        context = self.attention_block(pixels, regions)\n        feats = paddle.concat([context, pixels], axis=1)\n        feats = self.conv1x1(feats)\n\n        return feats\n\n\nclass ObjectAttentionBlock(nn.Layer):\n    \"\"\"A self-attention module.\"\"\"\n\n    def __init__(self, in_channels: int, key_channels: int):\n        super().__init__()\n\n        self.in_channels = in_channels\n        self.key_channels = key_channels\n\n        self.f_pixel = nn.Sequential(\n            L.ConvBNReLU(in_channels, key_channels, 1), L.ConvBNReLU(key_channels, key_channels, 1))\n\n        self.f_object = nn.Sequential(\n            L.ConvBNReLU(in_channels, key_channels, 1), L.ConvBNReLU(key_channels, key_channels, 1))\n\n        self.f_down = L.ConvBNReLU(in_channels, key_channels, 1)\n\n        self.f_up = L.ConvBNReLU(key_channels, in_channels, 1)\n\n    def forward(self, x: paddle.Tensor, proxy: paddle.Tensor) -> paddle.Tensor:\n        n, _, h, w = x.shape\n\n        # query : from (n, c1, h1, w1) to (n, h1*w1, key_channels)\n        query = self.f_pixel(x)\n        query = paddle.reshape(query, (n, self.key_channels, -1))\n        
query = paddle.transpose(query, [0, 2, 1])\n\n        # key : from (n, c2, h2, w2) to (n, key_channels, h2*w2)\n        key = self.f_object(proxy)\n        key = paddle.reshape(key, (n, self.key_channels, -1))\n\n        # value : from (n, c2, h2, w2) to (n, h2*w2, key_channels)\n        value = self.f_down(proxy)\n        value = paddle.reshape(value, (n, self.key_channels, -1))\n        value = paddle.transpose(value, [0, 2, 1])\n\n        # sim_map (n, h1*w1, h2*w2)\n        sim_map = paddle.bmm(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        # context from (n, h1*w1, key_channels) to (n , out_channels, h1, w1)\n        context = paddle.bmm(sim_map, value)\n        context = paddle.transpose(context, [0, 2, 1])\n        context = paddle.reshape(context, (n, self.key_channels, h, w))\n        context = self.f_up(context)\n\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_voc/README.md",
    "content": "# PaddleHub Image Segmentation\n\n## Model Prediction\n\nTo run prediction with the pretrained model we provide, use the following script:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n\n\n## How to Start Fine-tuning\n\nThis example shows how to fine-tune a pretrained model with PaddleHub and run prediction with it.\n\nAfter installing PaddlePaddle and PaddleHub, run `python train.py` to fine-tune the ocrnet_hrnetw18_voc model on datasets such as OpticDiscSeg.\n\n## Steps\n\nFine-tuning with the PaddleHub Fine-tune API takes four steps.\n\n### Step1: Define the data preprocessing\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\nThe `segmentation_transforms` module provides a rich set of preprocessing operations for image segmentation data; replace them with whatever your task requires.\n\n### Step2: Download and load the dataset\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform, mode='train')\n\n```\n* `transform`: the data preprocessing pipeline.\n* `mode`: dataset split; one of `train`, `test`, `val`. Default: `train`.\n\nThe dataset preparation code is in [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` automatically downloads the dataset and extracts it to `$HOME/.paddlehub/dataset`.\n\n### Step3: Load the pretrained model\n\n```python\nmodel = hub.Module(name='ocrnet_hrnetw18_voc', num_classes=2, pretrained=None)\n```\n* `name`: name of the pretrained model.\n* `num_classes`: number of segmentation classes.\n* `pretrained`: path to your own trained weights; if None, the default pretrained parameters are loaded.\n\n### Step4: Choose the optimization strategy and runtime configuration\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### Optimization strategy\n\nPaddle 2.0 provides a choice of optimizers, such as `SGD`, `Adam`, and `Adamax`. For `Adam`:\n\n* `learning_rate`: global learning rate.\n* `parameters`: model parameters to optimize.\n\n#### Runtime configuration\n`Trainer` controls the fine-tuning process and accepts the following parameters:\n\n* `model`: the model to optimize;\n* `optimizer`: the optimizer;\n* `use_gpu`: whether to use the GPU. Default: False;\n* `use_vdl`: whether to visualize training with VisualDL;\n* `checkpoint_dir`: directory in which to save model parameters;\n* `compare_metrics`: metric used to select the best model;\n\n`trainer.train` controls the training loop and accepts the following parameters:\n\n* `train_dataset`: the training dataset;\n* `epochs`: number of training epochs;\n* `batch_size`: training batch size; when using a GPU, adjust it to fit the available memory;\n* `num_workers`: number of workers. Default: 0;\n* `eval_dataset`: the validation dataset;\n* `log_interval`: logging interval, measured in training steps.\n* `save_interval`: checkpoint-saving interval, measured in epochs.\n\n## Model Prediction\n\nWhen fine-tuning finishes, the model that performed best on the validation set is saved under `${CHECKPOINT_DIR}/best_model`, where `${CHECKPOINT_DIR}` is the checkpoint directory chosen for fine-tuning.\n\nUse that model for prediction. A predict.py script looks like this:\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='ocrnet_hrnetw18_voc', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\nOnce the parameters are configured, run `python predict.py`.\n**Args**\n* `images`: image paths or BGR image arrays;\n* `visualization`: whether to visualize the result. Default: True;\n* `save_path`: directory in which to save results. Default: 'seg_result'.\n\n**NOTE:** The module, checkpoint_dir, and dataset used for prediction must match the ones used for fine-tuning.\n\n## Serving Deployment\n\nPaddleHub Serving can deploy an online image segmentation service.\n\n### Step1: Start PaddleHub Serving\n\nRun the start command:\n\n```shell\n$ hub serving start -m ocrnet_hrnetw18_voc\n```\n\nThis deploys the image segmentation API service; the default port is 8866.\n\n**NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise no setting is needed.\n\n### Step2: Send a prediction request\n\nWith the server configured, the few lines of code below send a prediction request and retrieve the result:\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# send the HTTP request\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = {'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ocrnet_hrnetw18_voc\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### Source Code\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### Dependencies\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_voc/hrnet.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nimport ocrnet_hrnetw18_voc.layers as L\n\n\nclass HRNet_W18(nn.Layer):\n    \"\"\"\n    The HRNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Jingdong Wang, et, al. \"HRNet：Deep High-Resolution Representation Learning for Visual Recognition\"\n    (https://arxiv.org/pdf/1908.07919.pdf).\n\n    Args:\n        pretrained (str, optional): The path of pretrained model.\n        stage1_num_modules (int, optional): Number of modules for stage1. Default 1.\n        stage1_num_blocks (list, optional): Number of blocks per module for stage1. Default (4).\n        stage1_num_channels (list, optional): Number of channels per branch for stage1. Default (64).\n        stage2_num_modules (int, optional): Number of modules for stage2. Default 1.\n        stage2_num_blocks (list, optional): Number of blocks per module for stage2. Default (4, 4).\n        stage2_num_channels (list, optional): Number of channels per branch for stage2. Default (18, 36).\n        stage3_num_modules (int, optional): Number of modules for stage3. Default 4.\n        stage3_num_blocks (list, optional): Number of blocks per module for stage3. Default (4, 4, 4).\n        stage3_num_channels (list, optional): Number of channels per branch for stage3. 
Default (18, 36, 72).\n        stage4_num_modules (int, optional): Number of modules for stage4. Default 3.\n        stage4_num_blocks (list, optional): Number of blocks per module for stage4. Default (4, 4, 4, 4).\n        stage4_num_channels (list, optional): Number of channels per branch for stage4. Default (18, 36, 72, 144).\n        has_se (bool, optional): Whether to use Squeeze-and-Excitation module. Default False.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 pretrained: str = None,\n                 stage1_num_modules: int = 1,\n                 stage1_num_blocks: tuple = (4, ),\n                 stage1_num_channels: tuple = (64, ),\n                 stage2_num_modules: int = 1,\n                 stage2_num_blocks: tuple = (4, 4),\n                 stage2_num_channels: tuple = (18, 36),\n                 stage3_num_modules: int = 4,\n                 stage3_num_blocks: tuple = (4, 4, 4),\n                 stage3_num_channels: tuple = (18, 36, 72),\n                 stage4_num_modules: int = 3,\n                 stage4_num_blocks: tuple = (4, 4, 4, 4),\n                 stage4_num_channels: tuple = (18, 36, 72, 144),\n                 has_se: bool = False,\n                 align_corners: bool = False):\n        super(HRNet_W18, self).__init__()\n        self.pretrained = pretrained\n        self.stage1_num_modules = stage1_num_modules\n        self.stage1_num_blocks = stage1_num_blocks\n        self.stage1_num_channels = stage1_num_channels\n        self.stage2_num_modules = stage2_num_modules\n        self.stage2_num_blocks = stage2_num_blocks\n        self.stage2_num_channels = stage2_num_channels\n        self.stage3_num_modules = stage3_num_modules\n        self.stage3_num_blocks = stage3_num_blocks\n        self.stage3_num_channels = 
stage3_num_channels\n        self.stage4_num_modules = stage4_num_modules\n        self.stage4_num_blocks = stage4_num_blocks\n        self.stage4_num_channels = stage4_num_channels\n        self.has_se = has_se\n        self.align_corners = align_corners\n        self.feat_channels = [sum(stage4_num_channels)]\n\n        self.conv_layer1_1 = L.ConvBNReLU(\n            in_channels=3, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.conv_layer1_2 = L.ConvBNReLU(\n            in_channels=64, out_channels=64, kernel_size=3, stride=2, padding='same', bias_attr=False)\n\n        self.la1 = Layer1(\n            num_channels=64,\n            num_blocks=self.stage1_num_blocks[0],\n            num_filters=self.stage1_num_channels[0],\n            has_se=has_se,\n            name=\"layer2\")\n\n        self.tr1 = TransitionLayer(\n            in_channels=[self.stage1_num_channels[0] * 4], out_channels=self.stage2_num_channels, name=\"tr1\")\n\n        self.st2 = Stage(\n            num_channels=self.stage2_num_channels,\n            num_modules=self.stage2_num_modules,\n            num_blocks=self.stage2_num_blocks,\n            num_filters=self.stage2_num_channels,\n            has_se=self.has_se,\n            name=\"st2\",\n            align_corners=align_corners)\n\n        self.tr2 = TransitionLayer(\n            in_channels=self.stage2_num_channels, out_channels=self.stage3_num_channels, name=\"tr2\")\n        self.st3 = Stage(\n            num_channels=self.stage3_num_channels,\n            num_modules=self.stage3_num_modules,\n            num_blocks=self.stage3_num_blocks,\n            num_filters=self.stage3_num_channels,\n            has_se=self.has_se,\n            name=\"st3\",\n            align_corners=align_corners)\n\n        self.tr3 = TransitionLayer(\n            in_channels=self.stage3_num_channels, out_channels=self.stage4_num_channels, name=\"tr3\")\n        self.st4 = Stage(\n            
num_channels=self.stage4_num_channels,\n            num_modules=self.stage4_num_modules,\n            num_blocks=self.stage4_num_blocks,\n            num_filters=self.stage4_num_channels,\n            has_se=self.has_se,\n            name=\"st4\",\n            align_corners=align_corners)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv1 = self.conv_layer1_1(x)\n        conv2 = self.conv_layer1_2(conv1)\n\n        la1 = self.la1(conv2)\n\n        tr1 = self.tr1([la1])\n        st2 = self.st2(tr1)\n\n        tr2 = self.tr2(st2)\n        st3 = self.st3(tr2)\n\n        tr3 = self.tr3(st3)\n        st4 = self.st4(tr3)\n\n        x0_h, x0_w = st4[0].shape[2:]\n        x1 = F.interpolate(st4[1], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x2 = F.interpolate(st4[2], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x3 = F.interpolate(st4[3], (x0_h, x0_w), mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([st4[0], x1, x2, x3], axis=1)\n\n        return [x]\n\n\nclass Layer1(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, num_blocks: int, has_se: bool = False, name: str = None):\n        super(Layer1, self).__init__()\n\n        self.bottleneck_block_list = []\n\n        for i in range(num_blocks):\n            bottleneck_block = self.add_sublayer(\n                \"bb_{}_{}\".format(name, i + 1),\n                BottleneckBlock(\n                    num_channels=num_channels if i == 0 else num_filters * 4,\n                    num_filters=num_filters,\n                    has_se=has_se,\n                    stride=1,\n                    downsample=True if i == 0 else False,\n                    name=name + '_' + str(i + 1)))\n            self.bottleneck_block_list.append(bottleneck_block)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        conv = x\n        for block_func in self.bottleneck_block_list:\n            conv = 
block_func(conv)\n        return conv\n\n\nclass TransitionLayer(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, name=None):\n        super(TransitionLayer, self).__init__()\n\n        num_in = len(in_channels)\n        num_out = len(out_channels)\n        self.conv_bn_func_list = []\n        for i in range(num_out):\n            residual = None\n            if i < num_in:\n                if in_channels[i] != out_channels[i]:\n                    residual = self.add_sublayer(\n                        \"transition_{}_layer_{}\".format(name, i + 1),\n                        L.ConvBNReLU(\n                            in_channels=in_channels[i],\n                            out_channels=out_channels[i],\n                            kernel_size=3,\n                            padding='same',\n                            bias_attr=False))\n            else:\n                residual = self.add_sublayer(\n                    \"transition_{}_layer_{}\".format(name, i + 1),\n                    L.ConvBNReLU(\n                        in_channels=in_channels[-1],\n                        out_channels=out_channels[i],\n                        kernel_size=3,\n                        stride=2,\n                        padding='same',\n                        bias_attr=False))\n            self.conv_bn_func_list.append(residual)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, conv_bn_func in enumerate(self.conv_bn_func_list):\n            if conv_bn_func is None:\n                outs.append(x[idx])\n            else:\n                if idx < len(x):\n                    outs.append(conv_bn_func(x[idx]))\n                else:\n                    outs.append(conv_bn_func(x[-1]))\n        return outs\n\n\nclass Branches(nn.Layer):\n    def __init__(self, num_blocks: int, in_channels: int, out_channels: int, has_se: bool = False, name: str = None):\n        super(Branches, self).__init__()\n\n        
self.basic_block_list = []\n\n        for i in range(len(out_channels)):\n            self.basic_block_list.append([])\n            for j in range(num_blocks[i]):\n                in_ch = in_channels[i] if j == 0 else out_channels[i]\n                basic_block_func = self.add_sublayer(\n                    \"bb_{}_branch_layer_{}_{}\".format(name, i + 1, j + 1),\n                    BasicBlock(\n                        num_channels=in_ch,\n                        num_filters=out_channels[i],\n                        has_se=has_se,\n                        name=name + '_branch_layer_' + str(i + 1) + '_' + str(j + 1)))\n                self.basic_block_list[i].append(basic_block_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        for idx, input in enumerate(x):\n            conv = input\n            for basic_block_func in self.basic_block_list[idx]:\n                conv = basic_block_func(conv)\n            outs.append(conv)\n        return outs\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 has_se: bool,\n                 stride: int = 1,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        self.conv2 = L.ConvBNReLU(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n\n        self.conv3 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = 
L.ConvBN(\n                in_channels=num_channels, out_channels=num_filters * 4, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(\n                num_channels=num_filters * 4, num_filters=num_filters * 4, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n        conv3 = self.conv3(conv2)\n\n        if self.downsample:\n            residual = self.conv_down(x)\n\n        if self.has_se:\n            conv3 = self.se(conv3)\n\n        y = conv3 + residual\n        y = F.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_filters: int,\n                 stride: int = 1,\n                 has_se: bool = False,\n                 downsample: bool = False,\n                 name: str = None):\n        super(BasicBlock, self).__init__()\n\n        self.has_se = has_se\n        self.downsample = downsample\n\n        self.conv1 = L.ConvBNReLU(\n            in_channels=num_channels,\n            out_channels=num_filters,\n            kernel_size=3,\n            stride=stride,\n            padding='same',\n            bias_attr=False)\n        self.conv2 = L.ConvBN(\n            in_channels=num_filters, out_channels=num_filters, kernel_size=3, padding='same', bias_attr=False)\n\n        if self.downsample:\n            self.conv_down = L.ConvBNReLU(\n                in_channels=num_channels, out_channels=num_filters, kernel_size=1, padding='same', bias_attr=False)\n\n        if self.has_se:\n            self.se = SELayer(num_channels=num_filters, num_filters=num_filters, reduction_ratio=16, name=name + '_fc')\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        residual = x\n        conv1 = self.conv1(x)\n        conv2 = self.conv2(conv1)\n\n        if self.downsample:\n      
      residual = self.conv_down(x)\n\n        if self.has_se:\n            conv2 = self.se(conv2)\n\n        y = conv2 + residual\n        y = F.relu(y)\n        return y\n\n\nclass SELayer(nn.Layer):\n    def __init__(self, num_channels: int, num_filters: int, reduction_ratio: int, name: str = None):\n        super(SELayer, self).__init__()\n\n        self.pool2d_gap = nn.AdaptiveAvgPool2D(1)\n\n        self._num_channels = num_channels\n\n        med_ch = int(num_channels / reduction_ratio)\n        stdv = 1.0 / math.sqrt(num_channels * 1.0)\n        self.squeeze = nn.Linear(\n            num_channels, med_ch, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n        stdv = 1.0 / math.sqrt(med_ch * 1.0)\n        self.excitation = nn.Linear(\n            med_ch, num_filters, weight_attr=paddle.ParamAttr(initializer=nn.initializer.Uniform(-stdv, stdv)))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        pool = self.pool2d_gap(x)\n        pool = paddle.reshape(pool, shape=[-1, self._num_channels])\n        squeeze = self.squeeze(pool)\n        squeeze = F.relu(squeeze)\n        excitation = self.excitation(squeeze)\n        excitation = F.sigmoid(excitation)\n        excitation = paddle.reshape(excitation, shape=[-1, self._num_channels, 1, 1])\n        out = x * excitation\n        return out\n\n\nclass Stage(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_modules: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(Stage, self).__init__()\n\n        self._num_modules = num_modules\n\n        self.stage_func_list = []\n        for i in range(num_modules):\n            if i == num_modules - 1 and not multi_scale_output:\n                stage_func = 
self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        multi_scale_output=False,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n            else:\n                stage_func = self.add_sublayer(\n                    \"stage_{}_{}\".format(name, i + 1),\n                    HighResolutionModule(\n                        num_channels=num_channels,\n                        num_blocks=num_blocks,\n                        num_filters=num_filters,\n                        has_se=has_se,\n                        name=name + '_' + str(i + 1),\n                        align_corners=align_corners))\n\n            self.stage_func_list.append(stage_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = x\n        for idx in range(self._num_modules):\n            out = self.stage_func_list[idx](out)\n        return out\n\n\nclass HighResolutionModule(nn.Layer):\n    def __init__(self,\n                 num_channels: int,\n                 num_blocks: int,\n                 num_filters: int,\n                 has_se: bool = False,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: str = False):\n        super(HighResolutionModule, self).__init__()\n\n        self.branches_func = Branches(\n            num_blocks=num_blocks, in_channels=num_channels, out_channels=num_filters, has_se=has_se, name=name)\n\n        self.fuse_func = FuseLayers(\n            in_channels=num_filters,\n            out_channels=num_filters,\n            multi_scale_output=multi_scale_output,\n            name=name,\n            align_corners=align_corners)\n\n    def 
forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out = self.branches_func(x)\n        out = self.fuse_func(out)\n        return out\n\n\nclass FuseLayers(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 multi_scale_output: bool = True,\n                 name: str = None,\n                 align_corners: bool = False):\n        super(FuseLayers, self).__init__()\n\n        self._actual_ch = len(in_channels) if multi_scale_output else 1\n        self._in_channels = in_channels\n        self.align_corners = align_corners\n\n        self.residual_func_list = []\n        for i in range(self._actual_ch):\n            for j in range(len(in_channels)):\n                if j > i:\n                    residual_func = self.add_sublayer(\n                        \"residual_{}_layer_{}_{}\".format(name, i + 1, j + 1),\n                        L.ConvBN(\n                            in_channels=in_channels[j],\n                            out_channels=out_channels[i],\n                            kernel_size=1,\n                            padding='same',\n                            bias_attr=False))\n                    self.residual_func_list.append(residual_func)\n                elif j < i:\n                    pre_num_filters = in_channels[j]\n                    for k in range(i - j):\n                        if k == i - j - 1:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBN(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[i],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                        
    pre_num_filters = out_channels[i]\n                        else:\n                            residual_func = self.add_sublayer(\n                                \"residual_{}_layer_{}_{}_{}\".format(name, i + 1, j + 1, k + 1),\n                                L.ConvBNReLU(\n                                    in_channels=pre_num_filters,\n                                    out_channels=out_channels[j],\n                                    kernel_size=3,\n                                    stride=2,\n                                    padding='same',\n                                    bias_attr=False))\n                            pre_num_filters = out_channels[j]\n                        self.residual_func_list.append(residual_func)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outs = []\n        residual_func_idx = 0\n        for i in range(self._actual_ch):\n            residual = x[i]\n            residual_shape = residual.shape[-2:]\n            for j in range(len(self._in_channels)):\n                if j > i:\n                    y = self.residual_func_list[residual_func_idx](x[j])\n                    residual_func_idx += 1\n\n                    y = F.interpolate(y, residual_shape, mode='bilinear', align_corners=self.align_corners)\n                    residual = residual + y\n                elif j < i:\n                    y = x[j]\n                    for k in range(i - j):\n                        y = self.residual_func_list[residual_func_idx](y)\n                        residual_func_idx += 1\n\n                    residual = residual + y\n\n            residual = F.relu(residual)\n            outs.append(residual)\n\n        return outs\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_voc/layers.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNLayer(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 name: str = None):\n        super(ConvBNLayer, self).__init__()\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = AvgPool2D(kernel_size=2, stride=2, padding=0, ceil_mode=True)\n        self._conv = Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 if dilation == 1 else 0,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False)\n\n      
  self._batch_norm = SyncBatchNorm(out_channels)\n        self._act_op = Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    \"\"\"Residual bottleneck block\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 dilation: int = 1,\n                 name: str = None):\n        super(BottleneckBlock, self).__init__()\n\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1, act='relu', name=name + \"_branch2a\")\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            name=name + \"_branch2b\")\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels, out_channels=out_channels * 4, kernel_size=1, act=None, name=name + \"_branch2c\")\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                name=name + \"_branch1\")\n\n        self.shortcut = shortcut\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        if self.dilation > 1:\n            padding = self.dilation\n            y = F.pad(y, [padding, padding, padding, padding])\n\n        conv1 = 
self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = paddle.add(x=short, y=conv2)\n        y = F.relu(y)\n        return y\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = 
self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. Default: None, means identical transformation.\n\n    Returns:\n        A callable object of Activation.\n\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n\n    Examples:\n\n        from paddleseg.models.common.activation import Activation\n\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                # Instantiate the activation layer by name, e.g. nn.layer.activation.ReLU()\n                self.act_func = getattr(nn.layer.activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, 
act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self, aspp_ratios, in_channels, out_channels, align_corners, use_sep_conv=False, image_pooling=False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = 
ConvBNReLU(in_channels=out_channels * out_size, out_channels=out_channels, kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(y, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(img_avg, x.shape[2:], mode='bilinear', align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/ocrnet_hrnetw18_voc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import List\n\nimport paddle\nimport numpy as np\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport ocrnet_hrnetw18_voc.layers as L\nfrom ocrnet_hrnetw18_voc.hrnet import HRNet_W18\n\n\n@moduleinfo(\n    name=\"ocrnet_hrnetw18_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"OCRNetHRNetW18 is a segmentation model pretrained by pascal voc.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass OCRNetHRNetW18(nn.Layer):\n    \"\"\"\n    The OCRNet implementation based on PaddlePaddle.\n    The original article refers to\n        Yuan, Yuhui, et al. \"Object-Contextual Representations for Semantic Segmentation\"\n        (https://arxiv.org/pdf/1909.11065.pdf)\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (list): A list indicates the indices of output of backbone.\n            It can be either one or two values, if two values, the first index will be taken as\n            a deep-supervision feature in auxiliary layer; the second one will be taken as\n            input of pixel representation. 
If one value, it is taken by both above.\n        ocr_mid_channels (int, optional): The number of middle channels in OCRHead. Default: 512.\n        ocr_key_channels (int, optional): The number of key channels in ObjectAttentionBlock. Default: 256.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: List[int] = [0],\n                 ocr_mid_channels: int = 512,\n                 ocr_key_channels: int = 256,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(OCRNetHRNetW18, self).__init__()\n        self.backbone = HRNet_W18()\n        self.backbone_indices = backbone_indices\n        in_channels = [self.backbone.feat_channels[i] for i in backbone_indices]\n        self.head = OCRHead(\n            num_classes=num_classes,\n            in_channels=in_channels,\n            ocr_mid_channels=ocr_mid_channels,\n            ocr_key_channels=ocr_key_channels)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'ocrnet_hrnetw18.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: np.ndarray) -> np.ndarray:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feats = 
self.backbone(x)\n        feats = [feats[i] for i in self.backbone_indices]\n        logit_list = self.head(feats)\n        logit_list = [\n            F.interpolate(logit, x.shape[2:], mode='bilinear', align_corners=self.align_corners) for logit in logit_list\n        ]\n        return logit_list\n\n\nclass OCRHead(nn.Layer):\n    \"\"\"\n    The Object contextual representation head.\n    Args:\n        num_classes(int): The unique number of target classes.\n        in_channels(tuple): The number of input channels.\n        ocr_mid_channels(int, optional): The number of middle channels in OCRHead. Default: 512.\n        ocr_key_channels(int, optional): The number of key channels in ObjectAttentionBlock. Default: 256.\n    \"\"\"\n\n    def __init__(self, num_classes: int, in_channels: int, ocr_mid_channels: int = 512, ocr_key_channels: int = 256):\n        super().__init__()\n\n        self.num_classes = num_classes\n        self.spatial_gather = SpatialGatherBlock()\n        self.spatial_ocr = SpatialOCRModule(ocr_mid_channels, ocr_key_channels, ocr_mid_channels)\n\n        self.indices = [-2, -1] if len(in_channels) > 1 else [-1, -1]\n\n        self.conv3x3_ocr = L.ConvBNReLU(in_channels[self.indices[1]], ocr_mid_channels, 3, padding=1)\n        self.cls_head = nn.Conv2D(ocr_mid_channels, self.num_classes, 1)\n        self.aux_head = nn.Sequential(\n            L.ConvBNReLU(in_channels[self.indices[0]], in_channels[self.indices[0]], 1),\n            nn.Conv2D(in_channels[self.indices[0]], self.num_classes, 1))\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> paddle.Tensor:\n        feat_shallow, feat_deep = feat_list[self.indices[0]], feat_list[self.indices[1]]\n\n        soft_regions = self.aux_head(feat_shallow)\n        pixels = self.conv3x3_ocr(feat_deep)\n\n        object_regions = self.spatial_gather(pixels, soft_regions)\n        ocr = self.spatial_ocr(pixels, object_regions)\n\n        logit = self.cls_head(ocr)\n        return [logit, 
soft_regions]\n\n\nclass SpatialGatherBlock(nn.Layer):\n    \"\"\"Aggregation layer to compute the pixel-region representation.\"\"\"\n\n    def forward(self, pixels: paddle.Tensor, regions: paddle.Tensor) -> paddle.Tensor:\n        n, c, h, w = pixels.shape\n        _, k, _, _ = regions.shape\n\n        # pixels: from (n, c, h, w) to (n, h*w, c)\n        pixels = paddle.reshape(pixels, (n, c, h * w))\n        pixels = paddle.transpose(pixels, [0, 2, 1])\n\n        # regions: from (n, k, h, w) to (n, k, h*w)\n        regions = paddle.reshape(regions, (n, k, h * w))\n        regions = F.softmax(regions, axis=2)\n\n        # feats: from (n, k, c) to (n, c, k, 1)\n        feats = paddle.bmm(regions, pixels)\n        feats = paddle.transpose(feats, [0, 2, 1])\n        feats = paddle.unsqueeze(feats, axis=-1)\n\n        return feats\n\n\nclass SpatialOCRModule(nn.Layer):\n    \"\"\"Aggregate the global object representation to update the representation for each pixel.\"\"\"\n\n    def __init__(self, in_channels: int, key_channels: int, out_channels: int, dropout_rate: float = 0.1):\n        super().__init__()\n\n        self.attention_block = ObjectAttentionBlock(in_channels, key_channels)\n        self.conv1x1 = nn.Sequential(L.ConvBNReLU(2 * in_channels, out_channels, 1), nn.Dropout2D(dropout_rate))\n\n    def forward(self, pixels: paddle.Tensor, regions: paddle.Tensor) -> paddle.Tensor:\n        context = self.attention_block(pixels, regions)\n        feats = paddle.concat([context, pixels], axis=1)\n        feats = self.conv1x1(feats)\n\n        return feats\n\n\nclass ObjectAttentionBlock(nn.Layer):\n    \"\"\"A self-attention module.\"\"\"\n\n    def __init__(self, in_channels: int, key_channels: int):\n        super().__init__()\n\n        self.in_channels = in_channels\n        self.key_channels = key_channels\n\n        self.f_pixel = nn.Sequential(\n            L.ConvBNReLU(in_channels, key_channels, 1), L.ConvBNReLU(key_channels, key_channels, 1))\n\n        
self.f_object = nn.Sequential(\n            L.ConvBNReLU(in_channels, key_channels, 1), L.ConvBNReLU(key_channels, key_channels, 1))\n\n        self.f_down = L.ConvBNReLU(in_channels, key_channels, 1)\n\n        self.f_up = L.ConvBNReLU(key_channels, in_channels, 1)\n\n    def forward(self, x: paddle.Tensor, proxy: paddle.Tensor) -> paddle.Tensor:\n        n, _, h, w = x.shape\n\n        # query : from (n, c1, h1, w1) to (n, h1*w1, key_channels)\n        query = self.f_pixel(x)\n        query = paddle.reshape(query, (n, self.key_channels, -1))\n        query = paddle.transpose(query, [0, 2, 1])\n\n        # key : from (n, c2, h2, w2) to (n, key_channels, h2*w2)\n        key = self.f_object(proxy)\n        key = paddle.reshape(key, (n, self.key_channels, -1))\n\n        # value : from (n, c2, h2, w2) to (n, h2*w2, key_channels)\n        value = self.f_down(proxy)\n        value = paddle.reshape(value, (n, self.key_channels, -1))\n        value = paddle.transpose(value, [0, 2, 1])\n\n        # sim_map (n, h1*w1, h2*w2)\n        sim_map = paddle.bmm(query, key)\n        sim_map = (self.key_channels**-.5) * sim_map\n        sim_map = F.softmax(sim_map, axis=-1)\n\n        # context from (n, h1*w1, key_channels) to (n , out_channels, h1, w1)\n        context = paddle.bmm(sim_map, value)\n        context = paddle.transpose(context, [0, 2, 1])\n        context = paddle.reshape(context, (n, self.key_channels, h, w))\n        context = self.f_up(context)\n\n        return context\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_cityscapes/README.md",
    "content": "# pspnet_resnet50_cityscapes\n\n|模型名称|pspnet_resnet50_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|pspnet_resnet50vd|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|390MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[pspnet](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install pspnet_resnet50_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='pspnet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用pspnet_resnet50_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), 
Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='pspnet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `load_checkpoint`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='pspnet_resnet50_cityscapes', 
pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m pspnet_resnet50_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/pspnet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_cityscapes/README_en.md",
    "content": "# pspnet_resnet50_cityscapes\n\n|Module Name|pspnet_resnet50_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|pspnet_resnet50vd|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|390MB|\n|Data indicators|-|\n|Latest update date|2022-03-21|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [pspnet](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install pspnet_resnet50_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='pspnet_resnet50_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the pspnet_resnet50_cityscapes model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='pspnet_resnet50_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n   \n    - Model prediction\n    \n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='pspnet_resnet50_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m pspnet_resnet50_cityscapes\n          ```\n\n    - The segmentation service API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/pspnet_resnet50_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_cityscapes/layers.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None) -> paddle.Tensor:\n        return paddle.add(x, y, 
name)\n\n\nclass PPModule(nn.Layer):\n    \"\"\"\n    Pyramid pooling module originally in PSPNet.\n\n    Args:\n        in_channels (int): The number of input channels to the pyramid pooling module.\n        out_channels (int): The number of output channels after the pyramid pooling module.\n        bin_sizes (tuple): The output sizes of the pooled feature maps, e.g. (1, 2, 3, 6).\n        dim_reduction (bool): Whether to reduce the channel dimension after pooling.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 bin_sizes: Tuple[int],\n                 dim_reduction: bool,\n                 align_corners: bool):\n        super().__init__()\n\n        self.bin_sizes = bin_sizes\n\n        inter_channels = in_channels\n        if dim_reduction:\n            inter_channels = in_channels // len(bin_sizes)\n\n        # We use dimension reduction after pooling, as mentioned in the original implementation.\n        self.stages = nn.LayerList([\n            self._make_stage(in_channels, inter_channels, size)\n            for size in bin_sizes\n        ])\n\n        self.conv_bn_relu2 = ConvBNReLU(\n            in_channels=in_channels + inter_channels * len(bin_sizes),\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1)\n\n        self.align_corners = align_corners\n\n    def _make_stage(self, in_channels: int, out_channels: int, size: int):\n        \"\"\"\n        Create one pooling layer.\n\n        In our implementation, we adopt the same dimension reduction as the original paper, which might be\n        slightly different from other implementations.\n\n        After pooling, the channels are reduced to 1/len(bin_sizes) immediately, while 
some other implementations\n        keep the channels the same.\n\n        Args:\n            in_channels (int): The number of input channels to the pooling stage.\n            out_channels (int): The number of output channels of the pooling stage.\n            size (int): The output size of the pooled layer.\n\n        Returns:\n            nn.Sequential: An adaptive average pooling layer followed by a 1x1 ConvBNReLU.\n        \"\"\"\n\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = ConvBNReLU(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1)\n\n        return nn.Sequential(prior, conv)\n\n    def forward(self, input: paddle.Tensor) -> paddle.Tensor:\n        cat_layers = []\n        for stage in self.stages:\n            x = stage(input)\n            x = F.interpolate(\n                x,\n                paddle.shape(input)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            cat_layers.append(x)\n        cat_layers = [input] + cat_layers[::-1]\n        cat = paddle.concat(cat_layers, axis=1)\n        out = self.conv_bn_relu2(cat)\n\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_cityscapes/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom pspnet_resnet50_cityscapes.resnet import ResNet50_vd\nimport pspnet_resnet50_cityscapes.layers as layers\n\n@moduleinfo(\n    name=\"pspnet_resnet50_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"PSPNetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass PSPNet(nn.Layer):\n    \"\"\"\n    The PSPNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhao, Hengshuang, et al. \"Pyramid scene parsing network\"\n    (https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Two values in the tuple indicate the indices of output of backbone.\n        pp_out_channels (int, optional): The output channels after Pyramid Pooling Module. 
Default: 1024.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. Default: (1,2,3,6).\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. Default: True.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 pp_out_channels: int = 1024,\n                 bin_sizes: Tuple[int] = (1, 2, 3, 6),\n                 enable_auxiliary_loss: bool = True,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(PSPNet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        backbone_channels = [\n            self.backbone.feat_channels[i] for i in backbone_indices\n        ]\n\n        self.head = PSPNetHead(num_classes, backbone_indices, backbone_channels,\n                               pp_out_channels, bin_sizes,\n                               enable_auxiliary_loss, align_corners)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> 
List[paddle.Tensor]:\n        feat_list = self.backbone(x)\n        logit_list = self.head(feat_list)\n        return [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass PSPNetHead(nn.Layer):\n    \"\"\"\n    The PSPNetHead implementation.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): Two values in the tuple indicate the indices of output of backbone.\n            The first index will be taken as a deep-supervision feature in the auxiliary layer;\n            the second one will be taken as the input of the Pyramid Pooling Module (PPModule).\n            Usually the backbone consists of four downsampling stages and returns an output of\n            each stage. If it is set to (2, 3) for ResNet, the feature map of the third\n            stage (res4b22) is fed to the auxiliary layer, and the feature map of the fourth stage (res5c) is the input of PPModule.\n        backbone_channels (tuple): Has the same length as \"backbone_indices\", and indicates the channels of the corresponding indices.\n        pp_out_channels (int): The output channels after the Pyramid Pooling Module.\n        bin_sizes (tuple): The output sizes of the pooled feature maps.\n        enable_auxiliary_loss (bool, optional): Whether to add the auxiliary loss. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 
769x769.\n    \"\"\"\n\n    def __init__(self, num_classes, backbone_indices, backbone_channels,\n                 pp_out_channels, bin_sizes, enable_auxiliary_loss,\n                 align_corners):\n\n        super().__init__()\n\n        self.backbone_indices = backbone_indices\n\n        self.psp_module = layers.PPModule(\n            in_channels=backbone_channels[1],\n            out_channels=pp_out_channels,\n            bin_sizes=bin_sizes,\n            dim_reduction=True,\n            align_corners=align_corners)\n\n        self.dropout = nn.Dropout(p=0.1)  # dropout_prob\n\n        self.conv = nn.Conv2D(\n            in_channels=pp_out_channels,\n            out_channels=num_classes,\n            kernel_size=1)\n\n        if enable_auxiliary_loss:\n            self.auxlayer = layers.AuxLayer(\n                in_channels=backbone_channels[0],\n                inter_channels=backbone_channels[0] // 4,\n                out_channels=num_classes)\n\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[1]]\n        x = self.psp_module(x)\n        x = self.dropout(x)\n        logit = self.conv(x)\n        logit_list.append(logit)\n\n        if self.enable_auxiliary_loss:\n            auxiliary_feat = feat_list[self.backbone_indices[0]]\n            auxiliary_logit = self.auxlayer(auxiliary_feat)\n            logit_list.append(auxiliary_logit)\n\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_cityscapes/resnet.py",
"content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List, Tuple\n\nimport paddle\nimport paddle.nn as nn\n\nimport pspnet_resnet50_cityscapes.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1, \" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            data_format=data_format)\n\n        self._batch_norm = 
layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization training\n        self.add = layers.Add()\n        self.relu 
= layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            
short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        # for channels of four returned 
stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = dilation_dict[\n                        block] if dilation_dict and 
block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block],\n                            out_channels=num_filters[block],\n  
                          stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_voc/README.md",
    "content": "# pspnet_resnet50_voc\n\n|模型名称|pspnet_resnet50_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|pspnet_resnet50vd|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|390MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[pspnet](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install pspnet_resnet50_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='pspnet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用pspnet_resnet50_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n           
 - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='pspnet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='pspnet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = 
cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`: 原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m pspnet_resnet50_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/pspnet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_voc/README_en.md",
    "content": "# pspnet_resnet50_voc\n\n|Module Name|pspnet_resnet50_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|pspnet_resnet50vd|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|370MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [pspnet](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install pspnet_resnet50_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='pspnet_resnet50_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the pspnet_resnet50_voc model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, please refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='pspnet_resnet50_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `num_classes`: the number of segmentation classes.\n                - `pretrained`: path of the self-trained model; if it is None, the provided pretrained parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When fine-tuning is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='pspnet_resnet50_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the segmentation results as image files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m pspnet_resnet50_voc\n          ```\n\n    - The serving API is now deployed, with the default port number 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/pspnet_resnet50_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_voc/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In a CPU environment, nn.SyncBatchNorm has no kernel, so nn.BatchNorm2D is used instead.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name: str = None):\n        return paddle.add(x, y, name)\n\nclass 
PPModule(nn.Layer):\n    \"\"\"\n    Pyramid pooling module originally used in PSPNet.\n\n    Args:\n        in_channels (int): The number of input channels to the pyramid pooling module.\n        out_channels (int): The number of output channels after the pyramid pooling module.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. Default: (1, 2, 3, 6).\n        dim_reduction (bool, optional): A bool value indicating whether to reduce dimension after pooling. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, bin_sizes: tuple, dim_reduction: bool,\n                 align_corners: bool):\n        super().__init__()\n\n        self.bin_sizes = bin_sizes\n\n        inter_channels = in_channels\n        if dim_reduction:\n            inter_channels = in_channels // len(bin_sizes)\n\n        # We use dimension reduction after pooling, as mentioned in the original implementation.\n        self.stages = nn.LayerList([\n            self._make_stage(in_channels, inter_channels, size)\n            for size in bin_sizes\n        ])\n\n        self.conv_bn_relu2 = ConvBNReLU(\n            in_channels=in_channels + inter_channels * len(bin_sizes),\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1)\n\n        self.align_corners = align_corners\n\n    def _make_stage(self, in_channels: int, out_channels: int, size: int):\n        \"\"\"\n        Create one pooling layer.\n\n        In our implementation, we adopt the same dimension reduction as the original paper, which might be\n        slightly different from other implementations.\n\n        After pooling, the channels are reduced to 1/len(bin_sizes) immediately, while some other implementations\n        keep the channels the same.\n\n        Args:\n            in_channels (int): The number of input channels to the pooling layer.\n            out_channels (int): The number of output channels of the pooling layer.\n            size (int): The out size of the pooled layer.\n\n        Returns:\n            nn.Sequential: An adaptive average pooling layer followed by a 1x1 ConvBNReLU layer.\n        \"\"\"\n\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = ConvBNReLU(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1)\n\n        return nn.Sequential(prior, conv)\n\n    def forward(self, input: paddle.Tensor) -> paddle.Tensor:\n        cat_layers = []\n        for stage in self.stages:\n            x = stage(input)\n            x = F.interpolate(\n                x,\n                paddle.shape(input)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            cat_layers.append(x)\n        cat_layers = [input] + cat_layers[::-1]\n        cat = paddle.concat(cat_layers, axis=1)\n        out = self.conv_bn_relu2(cat)\n\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_voc/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom pspnet_resnet50_voc.resnet import ResNet50_vd\nimport pspnet_resnet50_voc.layers as layers\n\n@moduleinfo(\n    name=\"pspnet_resnet50_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"PSPNetResnet50 is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass PSPNet(nn.Layer):\n    \"\"\"\n    The PSPNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Zhao, Hengshuang, et al. \"Pyramid scene parsing network\"\n    (https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple, optional): Two values in the tuple indicate the indices of output of backbone.\n        pp_out_channels (int, optional): The output channels after Pyramid Pooling Module. Default: 1024.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. 
Default: (1,2,3,6).\n        enable_auxiliary_loss (bool, optional): A bool value indicates whether adding auxiliary loss. Default: True.\n        align_corners (bool, optional): An argument of F.interpolate. It should be set to False when the feature size is even,\n            e.g. 1024x512, otherwise it is True, e.g. 769x769. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 21,\n                 backbone_indices: Tuple[int] = (2, 3),\n                 pp_out_channels: int = 1024,\n                 bin_sizes: Tuple[int] = (1, 2, 3, 6),\n                 enable_auxiliary_loss: bool  = True,\n                 align_corners: bool = False,\n                 pretrained: str = None):\n        super(PSPNet, self).__init__()\n\n        self.backbone = ResNet50_vd()\n        backbone_channels = [\n            self.backbone.feat_channels[i] for i in backbone_indices\n        ]\n\n        self.head = PSPNetHead(num_classes, backbone_indices, backbone_channels,\n                               pp_out_channels, bin_sizes,\n                               enable_auxiliary_loss, align_corners)\n        self.align_corners = align_corners\n        self.transforms = T.Compose([T.Normalize()])\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feat_list = self.backbone(x)\n        logit_list = 
self.head(feat_list)\n        return [\n            F.interpolate(\n                logit,\n                paddle.shape(x)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners) for logit in logit_list\n        ]\n\n\nclass PSPNetHead(nn.Layer):\n    \"\"\"\n    The PSPNetHead implementation.\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        backbone_indices (tuple): Two values in the tuple indicate the indices of output of backbone.\n            The first index will be taken as a deep-supervision feature in auxiliary layer;\n            the second one will be taken as input of Pyramid Pooling Module (PPModule).\n            Usually the backbone consists of four downsampling stages and returns an output of\n            each stage. If we set it as (2, 3) in ResNet, that means taking the feature map of the third\n            stage (res4b22) in the backbone, and the feature map of the fourth stage (res5c) as input of PPModule.\n        backbone_channels (tuple): The same length as \"backbone_indices\". It indicates the channels of corresponding index.\n        pp_out_channels (int): The output channels after Pyramid Pooling Module.\n        bin_sizes (tuple): The out size of pooled feature maps.\n        enable_auxiliary_loss (bool, optional): A bool value indicating whether to add auxiliary loss. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 
769x769.\n    \"\"\"\n\n    def __init__(self, num_classes, backbone_indices, backbone_channels,\n                 pp_out_channels, bin_sizes, enable_auxiliary_loss,\n                 align_corners):\n\n        super().__init__()\n\n        self.backbone_indices = backbone_indices\n\n        self.psp_module = layers.PPModule(\n            in_channels=backbone_channels[1],\n            out_channels=pp_out_channels,\n            bin_sizes=bin_sizes,\n            dim_reduction=True,\n            align_corners=align_corners)\n\n        self.dropout = nn.Dropout(p=0.1)  # dropout_prob\n\n        self.conv = nn.Conv2D(\n            in_channels=pp_out_channels,\n            out_channels=num_classes,\n            kernel_size=1)\n\n        if enable_auxiliary_loss:\n            self.auxlayer = layers.AuxLayer(\n                in_channels=backbone_channels[0],\n                inter_channels=backbone_channels[0] // 4,\n                out_channels=num_classes)\n\n        self.enable_auxiliary_loss = enable_auxiliary_loss\n\n    def forward(self, feat_list: List[paddle.Tensor]) -> List[paddle.Tensor]:\n        logit_list = []\n        x = feat_list[self.backbone_indices[1]]\n        x = self.psp_module(x)\n        x = self.dropout(x)\n        logit = self.conv(x)\n        logit_list.append(logit)\n\n        if self.enable_auxiliary_loss:\n            auxiliary_feat = feat_list[self.backbone_indices[0]]\n            auxiliary_logit = self.auxlayer(auxiliary_feat)\n            logit_list.append(auxiliary_logit)\n\n        return logit_list\n"
  },
  {
    "path": "modules/image/semantic_segmentation/pspnet_resnet50_voc/resnet.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Tuple\n\nimport paddle\nimport paddle.nn as nn\nimport pspnet_resnet50_voc.layers as layers\n\n\nclass ConvBNLayer(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 stride: int = 1,\n                 dilation: int = 1,\n                 groups: int = 1,\n                 is_vd_mode: bool = False,\n                 act: str = None,\n                 data_format: str = 'NCHW'):\n        super(ConvBNLayer, self).__init__()\n        if dilation != 1 and kernel_size != 3:\n            raise RuntimeError(\"When the dilation isn't 1, \" \\\n                               \"the kernel_size should be 3.\")\n\n        self.is_vd_mode = is_vd_mode\n        self._pool2d_avg = nn.AvgPool2D(\n            kernel_size=2,\n            stride=2,\n            padding=0,\n            ceil_mode=True,\n            data_format=data_format)\n        self._conv = nn.Conv2D(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            stride=stride,\n            padding=(kernel_size - 1) // 2 \\\n                if dilation == 1 else dilation,\n            dilation=dilation,\n            groups=groups,\n            bias_attr=False,\n            data_format=data_format)\n\n        self._batch_norm = 
layers.SyncBatchNorm(\n            out_channels, data_format=data_format)\n        self._act_op = layers.Activation(act=act)\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        if self.is_vd_mode:\n            inputs = self._pool2d_avg(inputs)\n        y = self._conv(inputs)\n        y = self._batch_norm(y)\n        y = self._act_op(y)\n\n        return y\n\n\nclass BottleneckBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 shortcut: bool = True,\n                 if_first: bool  = False,\n                 dilation: int = 1,\n                 data_format: str = 'NCHW'):\n        super(BottleneckBlock, self).__init__()\n\n        self.data_format = data_format\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=1,\n            act='relu',\n            data_format=data_format)\n\n        self.dilation = dilation\n\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            act='relu',\n            dilation=dilation,\n            data_format=data_format)\n        self.conv2 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels * 4,\n            kernel_size=1,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels * 4,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        # NOTE: Use the wrap layer for quantization training\n        self.add = layers.Add()\n        self.relu 
= layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n        conv2 = self.conv2(conv1)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            short = self.short(inputs)\n\n        y = self.add(short, conv2)\n        y = self.relu(y)\n        return y\n\n\nclass BasicBlock(nn.Layer):\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 stride: int,\n                 dilation: int = 1,\n                 shortcut: bool = True,\n                 if_first: bool = False,\n                 data_format: str = 'NCHW'):\n        super(BasicBlock, self).__init__()\n        self.conv0 = ConvBNLayer(\n            in_channels=in_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            stride=stride,\n            dilation=dilation,\n            act='relu',\n            data_format=data_format)\n        self.conv1 = ConvBNLayer(\n            in_channels=out_channels,\n            out_channels=out_channels,\n            kernel_size=3,\n            dilation=dilation,\n            act=None,\n            data_format=data_format)\n\n        if not shortcut:\n            self.short = ConvBNLayer(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1,\n                stride=1,\n                is_vd_mode=False if if_first or stride == 1 else True,\n                data_format=data_format)\n\n        self.shortcut = shortcut\n        self.dilation = dilation\n        self.data_format = data_format\n        self.add = layers.Add()\n        self.relu = layers.Activation(act=\"relu\")\n\n    def forward(self, inputs: paddle.Tensor) -> paddle.Tensor:\n        y = self.conv0(inputs)\n        conv1 = self.conv1(y)\n\n        if self.shortcut:\n            short = inputs\n        else:\n            
short = self.short(inputs)\n        y = self.add(short, conv1)\n        y = self.relu(y)\n\n        return y\n\n\nclass ResNet_vd(nn.Layer):\n    \"\"\"\n    The ResNet_vd implementation based on PaddlePaddle.\n\n    The original article refers to\n    Tong He, et al. \"Bag of Tricks for Image Classification with Convolutional Neural Networks\"\n    (https://arxiv.org/pdf/1812.01187.pdf).\n\n    Args:\n        layers (int, optional): The layers of ResNet_vd. The supported layers are (18, 34, 50, 101, 152, 200). Default: 50.\n        output_stride (int, optional): The stride of output features compared to input images. It is 8 or 16. Default: 8.\n        multi_grid (tuple|list, optional): The grid of stage4. Default: (1, 1, 1).\n        pretrained (str, optional): The path of pretrained model.\n\n    \"\"\"\n\n    def __init__(self,\n                 layers: int = 50,\n                 output_stride: int = 8,\n                 multi_grid: Tuple[int] = (1, 1, 1),\n                 pretrained: str = None,\n                 data_format: str = 'NCHW'):\n        super(ResNet_vd, self).__init__()\n\n        self.data_format = data_format\n        self.conv1_logit = None  # for gscnn shape stream\n        self.layers = layers\n        supported_layers = [18, 34, 50, 101, 152, 200]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but input layer is {}\".format(\n                supported_layers, layers)\n\n        if layers == 18:\n            depth = [2, 2, 2, 2]\n        elif layers == 34 or layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        elif layers == 200:\n            depth = [3, 12, 48, 3]\n        num_channels = [64, 256, 512, 1024\n                        ] if layers >= 50 else [64, 64, 128, 256]\n        num_filters = [64, 128, 256, 512]\n\n        # for channels of four returned 
stages\n        self.feat_channels = [c * 4 for c in num_filters\n                              ] if layers >= 50 else num_filters\n\n        dilation_dict = None\n        if output_stride == 8:\n            dilation_dict = {2: 2, 3: 4}\n        elif output_stride == 16:\n            dilation_dict = {3: 2}\n\n        self.conv1_1 = ConvBNLayer(\n            in_channels=3,\n            out_channels=32,\n            kernel_size=3,\n            stride=2,\n            act='relu',\n            data_format=data_format)\n        self.conv1_2 = ConvBNLayer(\n            in_channels=32,\n            out_channels=32,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.conv1_3 = ConvBNLayer(\n            in_channels=32,\n            out_channels=64,\n            kernel_size=3,\n            stride=1,\n            act='relu',\n            data_format=data_format)\n        self.pool2d_max = nn.MaxPool2D(\n            kernel_size=3, stride=2, padding=1, data_format=data_format)\n\n        # self.block_list = []\n        self.stage_list = []\n        if layers >= 50:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    if layers in [101, 152] and block == 2:\n                        if i == 0:\n                            conv_name = \"res\" + str(block + 2) + \"a\"\n                        else:\n                            conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                    else:\n                        conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                    ###############################################################################\n                    # Add dilation rate for some segmentation tasks, if dilation_dict is not None.\n                    dilation_rate = dilation_dict[\n                        block] if dilation_dict and 
block in dilation_dict else 1\n\n                    # Actually block here is 'stage', and i is 'block' in 'stage'\n                    # At stage 4, expand the dilation_rate if multi_grid is given\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n                    ###############################################################################\n\n                    bottleneck_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BottleneckBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block] * 4,\n                            out_channels=num_filters[block],\n                            stride=2 if i == 0 and block != 0\n                                        and dilation_rate == 1 else 1,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            dilation=dilation_rate,\n                            data_format=data_format))\n\n                    block_list.append(bottleneck_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n        else:\n            for block in range(len(depth)):\n                shortcut = False\n                block_list = []\n                for i in range(depth[block]):\n                    dilation_rate = dilation_dict[block] \\\n                        if dilation_dict and block in dilation_dict else 1\n                    if block == 3:\n                        dilation_rate = dilation_rate * multi_grid[i]\n\n                    basic_block = self.add_sublayer(\n                        'bb_%d_%d' % (block, i),\n                        BasicBlock(\n                            in_channels=num_channels[block]\n                            if i == 0 else num_filters[block],\n                            out_channels=num_filters[block],\n  
                          stride=2 if i == 0 and block != 0 \\\n                                        and dilation_rate == 1 else 1,\n                            dilation=dilation_rate,\n                            shortcut=shortcut,\n                            if_first=block == i == 0,\n                            data_format=data_format))\n                    block_list.append(basic_block)\n                    shortcut = True\n                self.stage_list.append(block_list)\n\n        self.pretrained = pretrained\n\n    def forward(self, inputs: paddle.Tensor) -> List[paddle.Tensor]:\n        y = self.conv1_1(inputs)\n        y = self.conv1_2(y)\n        y = self.conv1_3(y)\n        self.conv1_logit = y.clone()\n        y = self.pool2d_max(y)\n\n        # A feature list saves the output feature map of each stage.\n        feat_list = []\n        for stage in self.stage_list:\n            for block in stage:\n                y = block(y)\n            feat_list.append(y)\n\n        return feat_list\n\n\ndef ResNet50_vd(**args):\n    model = ResNet_vd(layers=50, **args)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_cityscapes/README.md",
    "content": "# stdc1_seg_cityscapes\n\n|模型名称|stdc1_seg_cityscapes|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|stdc1_seg|\n|数据集|Cityscapes|\n|是否支持Fine-tuning|是|\n|模型大小|67MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[stdc](https://arxiv.org/abs/2104.13188)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install stdc1_seg_cityscapes\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='stdc1_seg_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用stdc1_seg_cityscapes模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 
数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='stdc1_seg_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='stdc1_seg_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m stdc1_seg_cityscapes\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/stdc1_seg_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_cityscapes/README_en.md",
    "content": "# stdc1_seg_cityscapes\n\n|Module Name|stdc1_seg_cityscapes|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|stdc1_seg|\n|Dataset|Cityscapes|\n|Fine-tuning supported or not|Yes|\n|Module Size|67MB|\n|Data indicators|-|\n|Latest update date|2022-03-21|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212111-df341f2a-e994-45d7-92d6-2288d666079c.png\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212188-2db40b29-2943-47ce-9ad2-36a6fb85ba3e.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [pspnet](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install stdc1_seg_cityscapes\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='stdc1_seg_cityscapes')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the stdc1_seg_cityscapes model to fine-tune datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms`: The data enhancement module defines lots of data preprocessing methods. Users can replace the data preprocessing methods according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * For dataset preparation, please refer to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='stdc1_seg_cityscapes', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: Whether to load the self-trained model; if it is None, load the provided default parameters.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n\n    - Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='stdc1_seg_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m stdc1_seg_cityscapes\n          ```\n\n    - The servitization API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = 
{'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/stdc1_seg_cityscapes\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_cityscapes/layers.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rate using in ASSP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n        use_sep_conv (bool, optional): If using separable conv in ASPP module. Default: False.\n        image_pooling (bool, optional): If augmented with image-level features. Default: False\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name=None) -> paddle.Tensor:\n        return paddle.add(x, y, name)\n\nclass 
PPModule(nn.Layer):\n    \"\"\"\n    Pyramid pooling module originally in PSPNet.\n\n    Args:\n        in_channels (int): The number of input channels to pyramid pooling module.\n        out_channels (int): The number of output channels after pyramid pooling module.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. Default: (1, 2, 3, 6).\n        dim_reduction (bool, optional): A bool value that represents whether to reduce dimension after pooling. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self, \n                 in_channels: int, \n                 out_channels: int, \n                 bin_sizes: tuple, \n                 dim_reduction: bool,\n                 align_corners: bool):\n        super().__init__()\n\n        self.bin_sizes = bin_sizes\n\n        inter_channels = in_channels\n        if dim_reduction:\n            inter_channels = in_channels // len(bin_sizes)\n\n        # we use dimension reduction after pooling mentioned in original implementation.\n        self.stages = nn.LayerList([\n            self._make_stage(in_channels, inter_channels, size)\n            for size in bin_sizes\n        ])\n\n        self.conv_bn_relu2 = ConvBNReLU(\n            in_channels=in_channels + inter_channels * len(bin_sizes),\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1)\n\n        self.align_corners = align_corners\n\n    def _make_stage(self, in_channels: int, out_channels: int, size: int):\n        \"\"\"\n        Create one pooling layer.\n\n        In our implementation, we adopt the same dimension reduction as the original paper, which might be\n        slightly different from other implementations.\n\n        After pooling, the channels are reduced to 1/len(bin_sizes) immediately, while some other implementations\n        keep the channels the same.\n\n        Args:\n            in_channels (int): The number of input channels to pyramid pooling module.\n            out_channels (int): The number of output channels to pyramid pooling module.\n            size (int): The out size of the pooled layer.\n\n        Returns:\n            nn.Sequential: A pooling layer of AdaptiveAvgPool2D followed by ConvBNReLU.\n        \"\"\"\n\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = ConvBNReLU(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1)\n\n        return nn.Sequential(prior, conv)\n\n    def forward(self, input: paddle.Tensor) -> paddle.Tensor:\n        cat_layers = []\n        for stage in self.stages:\n            x = stage(input)\n            x = F.interpolate(\n                x,\n                paddle.shape(input)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            cat_layers.append(x)\n        cat_layers = [input] + cat_layers[::-1]\n        cat = paddle.concat(cat_layers, axis=1)\n        out = self.conv_bn_relu2(cat)\n\n        return out\n"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_cityscapes/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom stdc1_seg_cityscapes.stdcnet import STDC1\nimport stdc1_seg_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"stdc1_seg_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"STDCSeg is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass STDCSeg(nn.Layer):\n    \"\"\"\n    The STDCSeg implementation based on PaddlePaddle.\n\n    The original article refers to Meituan\n    Fan, Mingyuan, et al. \"Rethinking BiSeNet For Real-time Semantic Segmentation.\"\n    (https://arxiv.org/abs/2104.13188)\n\n    Args:\n        num_classes(int,optional): The unique number of target classes.\n        use_boundary_8(bool,non-optional): Whether to use detail loss. it should be True accroding to paper for best metric. 
Default: True.\n        Actually, if you want to use _boundary_2/_boundary_4/_boundary_16, you should append the corresponding number of DetailAggregateLoss loss functions. It should work properly.\n        use_conv_last(bool, optional): Determine ContextPath's inplanes variable according to whether to use the backbone's last conv. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 use_boundary_2: bool = False,\n                 use_boundary_4: bool = False,\n                 use_boundary_8: bool = True,\n                 use_boundary_16: bool = False,\n                 use_conv_last: bool = False,\n                 pretrained: str = None):\n        super(STDCSeg, self).__init__()\n\n        self.use_boundary_2 = use_boundary_2\n        self.use_boundary_4 = use_boundary_4\n        self.use_boundary_8 = use_boundary_8\n        self.use_boundary_16 = use_boundary_16\n        self.cp = ContextPath(STDC1(), use_conv_last=use_conv_last)\n        self.ffm = FeatureFusionModule(384, 256)\n        self.conv_out = SegHead(256, 256, num_classes)\n        self.conv_out8 = SegHead(128, 64, num_classes)\n        self.conv_out16 = SegHead(128, 64, num_classes)\n        self.conv_out_sp16 = SegHead(512, 64, 1)\n        self.conv_out_sp8 = SegHead(256, 64, 1)\n        self.conv_out_sp4 = SegHead(64, 64, 1)\n        self.conv_out_sp2 = SegHead(32, 64, 1)\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: 
Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        x_hw = paddle.shape(x)[2:]\n        feat_res2, feat_res4, feat_res8, _, feat_cp8, feat_cp16 = self.cp(x)\n\n        logit_list = []\n        if self.training:\n            feat_fuse = self.ffm(feat_res8, feat_cp8)\n            feat_out = self.conv_out(feat_fuse)\n            feat_out8 = self.conv_out8(feat_cp8)\n            feat_out16 = self.conv_out16(feat_cp16)\n\n            logit_list = [feat_out, feat_out8, feat_out16]\n            logit_list = [\n                F.interpolate(x, x_hw, mode='bilinear', align_corners=True)\n                for x in logit_list\n            ]\n\n            if self.use_boundary_2:\n                feat_out_sp2 = self.conv_out_sp2(feat_res2)\n                logit_list.append(feat_out_sp2)\n            if self.use_boundary_4:\n                feat_out_sp4 = self.conv_out_sp4(feat_res4)\n                logit_list.append(feat_out_sp4)\n            if self.use_boundary_8:\n                feat_out_sp8 = self.conv_out_sp8(feat_res8)\n                logit_list.append(feat_out_sp8)\n        else:\n            feat_fuse = self.ffm(feat_res8, feat_cp8)\n            feat_out = self.conv_out(feat_fuse)\n            feat_out = F.interpolate(\n                feat_out, x_hw, mode='bilinear', align_corners=True)\n            logit_list = [feat_out]\n\n        return logit_list\n\n\nclass SegHead(nn.Layer):\n    def __init__(self, in_chan: int, mid_chan: int, n_classes: int):\n        super(SegHead, self).__init__()\n        self.conv = layers.ConvBNReLU(\n            in_chan, mid_chan, kernel_size=3, stride=1, padding=1)\n        self.conv_out = nn.Conv2D(\n            mid_chan, n_classes, kernel_size=1, bias_attr=None)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = self.conv_out(x)\n        return x\n\n\nclass 
AttentionRefinementModule(nn.Layer):\n    def __init__(self, in_chan: int, out_chan: int):\n        super(AttentionRefinementModule, self).__init__()\n        self.conv = layers.ConvBNReLU(\n            in_chan, out_chan, kernel_size=3, stride=1, padding=1)\n        self.conv_atten = nn.Conv2D(\n            out_chan, out_chan, kernel_size=1, bias_attr=None)\n        self.bn_atten = nn.BatchNorm2D(out_chan)\n        self.sigmoid_atten = nn.Sigmoid()\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feat = self.conv(x)\n        atten = F.adaptive_avg_pool2d(feat, 1)\n        atten = self.conv_atten(atten)\n        atten = self.bn_atten(atten)\n        atten = self.sigmoid_atten(atten)\n        out = paddle.multiply(feat, atten)\n        return out\n\n\nclass ContextPath(nn.Layer):\n    def __init__(self, backbone, use_conv_last: bool = False):\n        super(ContextPath, self).__init__()\n        self.backbone = backbone\n        self.arm16 = AttentionRefinementModule(512, 128)\n        # The backbone's last stage outputs 1024 channels whether or not\n        # conv_last is used, so inplanes is 1024 in both cases.\n        inplanes = 1024\n        self.arm32 = AttentionRefinementModule(inplanes, 128)\n        self.conv_head32 = layers.ConvBNReLU(\n            128, 128, kernel_size=3, stride=1, padding=1)\n        self.conv_head16 = layers.ConvBNReLU(\n            128, 128, kernel_size=3, stride=1, padding=1)\n        self.conv_avg = layers.ConvBNReLU(\n            inplanes, 128, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x: paddle.Tensor) -> Tuple[paddle.Tensor, ...]:\n        feat2, feat4, feat8, feat16, feat32 = self.backbone(x)\n\n        feat8_hw = paddle.shape(feat8)[2:]\n        feat16_hw = paddle.shape(feat16)[2:]\n        feat32_hw = paddle.shape(feat32)[2:]\n\n        avg = F.adaptive_avg_pool2d(feat32, 1)\n        avg = self.conv_avg(avg)\n        avg_up = F.interpolate(avg, feat32_hw, mode='nearest')\n\n        feat32_arm = self.arm32(feat32)\n        feat32_sum = feat32_arm + avg_up\n        feat32_up = 
F.interpolate(feat32_sum, feat16_hw, mode='nearest')\n        feat32_up = self.conv_head32(feat32_up)\n\n        feat16_arm = self.arm16(feat16)\n        feat16_sum = feat16_arm + feat32_up\n        feat16_up = F.interpolate(feat16_sum, feat8_hw, mode='nearest')\n        feat16_up = self.conv_head16(feat16_up)\n\n        return feat2, feat4, feat8, feat16, feat16_up, feat32_up  # x8, x16\n\n\nclass FeatureFusionModule(nn.Layer):\n    def __init__(self, in_chan: int, out_chan: int):\n        super(FeatureFusionModule, self).__init__()\n        self.convblk = layers.ConvBNReLU(\n            in_chan, out_chan, kernel_size=1, stride=1, padding=0)\n        self.conv1 = nn.Conv2D(\n            out_chan,\n            out_chan // 4,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            bias_attr=None)\n        self.conv2 = nn.Conv2D(\n            out_chan // 4,\n            out_chan,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            bias_attr=None)\n        self.relu = nn.ReLU()\n        self.sigmoid = nn.Sigmoid()\n\n    def forward(self, fsp: paddle.Tensor, fcp: paddle.Tensor) -> paddle.Tensor:\n        fcat = paddle.concat([fsp, fcp], axis=1)\n        feat = self.convblk(fcat)\n        atten = F.adaptive_avg_pool2d(feat, 1)\n        atten = self.conv1(atten)\n        atten = self.relu(atten)\n        atten = self.conv2(atten)\n        atten = self.sigmoid(atten)\n        feat_atten = paddle.multiply(feat, atten)\n        feat_out = feat_atten + feat\n        return feat_out"
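The channel-attention fusion in `FeatureFusionModule.forward` can be sketched in plain NumPy to make the shapes concrete. This is an illustrative sketch only: the 1x1 convolutions are written as channel matmuls, batch norm is omitted, and `fuse`, `w_blk`, `w1`, `w2` are hypothetical names, not part of the module.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fuse(fsp, fcp, w_blk, w1, w2):
    """Channel-attention fusion sketch over NCHW arrays (1x1 convs as matmuls)."""
    fcat = np.concatenate([fsp, fcp], axis=1)                        # concat along channels
    # a 1x1 conv is a per-pixel matmul over the channel axis
    feat = np.maximum(np.einsum('oc,nchw->nohw', w_blk, fcat), 0.0)  # convblk (BN omitted) + relu
    atten = feat.mean(axis=(2, 3), keepdims=True)                    # global avg pool -> N,C,1,1
    atten = np.maximum(np.einsum('oc,nchw->nohw', w1, atten), 0.0)   # conv1 + relu
    atten = sigmoid(np.einsum('oc,nchw->nohw', w2, atten))           # conv2 + sigmoid
    return feat * atten + feat                                       # gated residual

rng = np.random.default_rng(0)
fsp = rng.standard_normal((1, 256, 8, 8))   # spatial-path features (x8)
fcp = rng.standard_normal((1, 128, 8, 8))   # context-path features (x8)
out = fuse(fsp, fcp,
           w_blk=rng.standard_normal((256, 384)) * 0.05,
           w1=rng.standard_normal((64, 256)) * 0.05,
           w2=rng.standard_normal((256, 64)) * 0.05)
print(out.shape)  # (1, 256, 8, 8)
```

The 384 input channels match `FeatureFusionModule(384, 256)` above: 256 spatial-path channels concatenated with 128 context-path channels.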
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_cityscapes/stdcnet.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union, List, Tuple\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\nimport stdc1_seg_cityscapes.layers as L\n\n__all__ = [\"STDC1\", \"STDC2\"]\n\n\nclass STDCNet(nn.Layer):\n    \"\"\"\n    The STDCNet implementation based on PaddlePaddle.\n\n    The original article refers to Meituan\n    Fan, Mingyuan, et al. \"Rethinking BiSeNet For Real-time Semantic Segmentation.\"\n    (https://arxiv.org/abs/2104.13188)\n\n    Args:\n        base(int, optional): Base channels. Default: 64.\n        layers(list, optional): List of layer numbers. It determines the number of STDC blocks in stages 3/4/5 of STDCNet. Default: [4, 5, 3].\n        block_num(int, optional): Number of conv branches inside one STDC block. Default: 4.\n        type(str, optional): Feature fusion method, \"cat\" or \"add\". Default: \"cat\".\n        num_classes(int, optional): Class number for image classification. Default: 1000.\n        dropout(float, optional): Dropout ratio, applied if > 0. Default: 0.20.\n        use_conv_last(bool, optional): Whether to use the last ConvBNReLU layer. 
Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 base: int = 64,\n                 layers: List[int] = [4, 5, 3],\n                 block_num: int = 4,\n                 type: str = \"cat\",\n                 num_classes: int = 1000,\n                 dropout: float = 0.20,\n                 use_conv_last: bool = False):\n        super(STDCNet, self).__init__()\n        if type == \"cat\":\n            block = CatBottleneck\n        elif type == \"add\":\n            block = AddBottleneck\n        else:\n            raise NotImplementedError(\n                \"fusion type should be 'cat' or 'add', but got {}.\".format(type))\n        self.use_conv_last = use_conv_last\n        self.features = self._make_layers(base, layers, block_num, block)\n        self.conv_last = ConvBNRelu(base * 16, max(1024, base * 16), 1, 1)\n\n        if (layers == [4, 5, 3]):  #stdc1446\n            self.x2 = nn.Sequential(self.features[:1])\n            self.x4 = nn.Sequential(self.features[1:2])\n            self.x8 = nn.Sequential(self.features[2:6])\n            self.x16 = nn.Sequential(self.features[6:11])\n            self.x32 = nn.Sequential(self.features[11:])\n        elif (layers == [2, 2, 2]):  #stdc813\n            self.x2 = nn.Sequential(self.features[:1])\n            self.x4 = nn.Sequential(self.features[1:2])\n            self.x8 = nn.Sequential(self.features[2:4])\n            self.x16 = nn.Sequential(self.features[4:6])\n            self.x32 = nn.Sequential(self.features[6:])\n        else:\n            raise NotImplementedError(\n                \"model with layers:{} is not implemented!\".format(layers))\n\n    def forward(self, x: paddle.Tensor) -> Tuple[paddle.Tensor, ...]:\n        \"\"\"\n        Forward function for feature extraction.\n        \"\"\"\n        feat2 = self.x2(x)\n        feat4 = self.x4(feat2)\n        feat8 = self.x8(feat4)\n        feat16 = self.x16(feat8)\n        feat32 = self.x32(feat16)\n        if self.use_conv_last:\n            feat32 = self.conv_last(feat32)\n        return feat2, feat4, feat8, 
feat16, feat32\n\n    def _make_layers(self, base, layers, block_num, block):\n        features = []\n        features += [ConvBNRelu(3, base // 2, 3, 2)]\n        features += [ConvBNRelu(base // 2, base, 3, 2)]\n\n        for i, layer in enumerate(layers):\n            for j in range(layer):\n                if i == 0 and j == 0:\n                    features.append(block(base, base * 4, block_num, 2))\n                elif j == 0:\n                    features.append(\n                        block(base * int(math.pow(2, i + 1)),\n                              base * int(math.pow(2, i + 2)), block_num, 2))\n                else:\n                    features.append(\n                        block(base * int(math.pow(2, i + 2)),\n                              base * int(math.pow(2, i + 2)), block_num, 1))\n\n        return nn.Sequential(*features)\n\n\nclass ConvBNRelu(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, kernel: int = 3, stride: int = 1):\n        super(ConvBNRelu, self).__init__()\n        self.conv = nn.Conv2D(\n            in_planes,\n            out_planes,\n            kernel_size=kernel,\n            stride=stride,\n            padding=kernel // 2,\n            bias_attr=False)\n        self.bn = L.SyncBatchNorm(out_planes, data_format='NCHW')\n        self.relu = nn.ReLU()\n\n    def forward(self, x):\n        out = self.relu(self.bn(self.conv(x)))\n        return out\n\n\nclass AddBottleneck(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, block_num: int = 3, stride: int = 1):\n        super(AddBottleneck, self).__init__()\n        assert block_num > 1, \"block number should be larger than 1.\"\n        self.conv_list = nn.LayerList()\n        self.stride = stride\n        if stride == 2:\n            self.avd_layer = nn.Sequential(\n                nn.Conv2D(\n                    out_planes // 2,\n                    out_planes // 2,\n                    kernel_size=3,\n                    stride=2,\n     
               padding=1,\n                    groups=out_planes // 2,\n                    bias_attr=False),\n                nn.BatchNorm2D(out_planes // 2),\n            )\n            self.skip = nn.Sequential(\n                nn.Conv2D(\n                    in_planes,\n                    in_planes,\n                    kernel_size=3,\n                    stride=2,\n                    padding=1,\n                    groups=in_planes,\n                    bias_attr=False),\n                nn.BatchNorm2D(in_planes),\n                nn.Conv2D(\n                    in_planes, out_planes, kernel_size=1, bias_attr=False),\n                nn.BatchNorm2D(out_planes),\n            )\n            stride = 1\n\n        for idx in range(block_num):\n            if idx == 0:\n                self.conv_list.append(\n                    ConvBNRelu(in_planes, out_planes // 2, kernel=1))\n            elif idx == 1 and block_num == 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 2, stride=stride))\n            elif idx == 1 and block_num > 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 4, stride=stride))\n            elif idx < block_num - 1:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx + 1))))\n            else:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx))))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out_list = []\n        out = x\n        for idx, conv in enumerate(self.conv_list):\n            if idx == 0 and self.stride == 2:\n                out = self.avd_layer(conv(out))\n            else:\n                out = conv(out)\n            out_list.append(out)\n        if 
self.stride == 2:\n            x = self.skip(x)\n        return paddle.concat(out_list, axis=1) + x\n\n\nclass CatBottleneck(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, block_num: int = 3, stride: int = 1):\n        super(CatBottleneck, self).__init__()\n        assert block_num > 1, \"block number should be larger than 1.\"\n        self.conv_list = nn.LayerList()\n        self.stride = stride\n        if stride == 2:\n            self.avd_layer = nn.Sequential(\n                nn.Conv2D(\n                    out_planes // 2,\n                    out_planes // 2,\n                    kernel_size=3,\n                    stride=2,\n                    padding=1,\n                    groups=out_planes // 2,\n                    bias_attr=False),\n                nn.BatchNorm2D(out_planes // 2),\n            )\n            self.skip = nn.AvgPool2D(kernel_size=3, stride=2, padding=1)\n            stride = 1\n\n        for idx in range(block_num):\n            if idx == 0:\n                self.conv_list.append(\n                    ConvBNRelu(in_planes, out_planes // 2, kernel=1))\n            elif idx == 1 and block_num == 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 2, stride=stride))\n            elif idx == 1 and block_num > 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 4, stride=stride))\n            elif idx < block_num - 1:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx + 1))))\n            else:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx))))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out_list = []\n        out1 = self.conv_list[0](x)\n  
      for idx, conv in enumerate(self.conv_list[1:]):\n            if idx == 0:\n                if self.stride == 2:\n                    out = conv(self.avd_layer(out1))\n                else:\n                    out = conv(out1)\n            else:\n                out = conv(out)\n            out_list.append(out)\n\n        if self.stride == 2:\n            out1 = self.skip(out1)\n        out_list.insert(0, out1)\n        out = paddle.concat(out_list, axis=1)\n        return out\n\n\ndef STDC2(**kwargs):\n    model = STDCNet(base=64, layers=[4, 5, 3], **kwargs)\n    return model\n\ndef STDC1(**kwargs):\n    model = STDCNet(base=64, layers=[2, 2, 2], **kwargs)\n    return model"
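To make the channel bookkeeping in `_make_layers` and `CatBottleneck` auditable, it can be replayed in a few lines of plain Python. Both helpers below are hypothetical, for illustration only; they mirror the arithmetic in the source above, and also check that the per-branch widths inside one STDC block concatenate back to exactly `out_planes`.

```python
def stdcnet_plan(base=64, layers=(2, 2, 2)):
    """Mirror STDCNet._make_layers: (in_ch, out_ch, stride) for every block in order."""
    plan = [(3, base // 2, 2), (base // 2, base, 2)]  # the two stem ConvBNRelu layers
    for i, n_blocks in enumerate(layers):
        for j in range(n_blocks):
            if i == 0 and j == 0:
                plan.append((base, base * 4, 2))                            # first block of stage 3
            elif j == 0:
                plan.append((base * 2 ** (i + 1), base * 2 ** (i + 2), 2))  # downsampling block
            else:
                plan.append((base * 2 ** (i + 2), base * 2 ** (i + 2), 1))  # width-preserving block
    return plan

def stdc_branch_widths(out_planes=256, block_num=4):
    """Mirror the conv_list output widths inside one STDC block (CatBottleneck)."""
    return [out_planes // 2 ** (idx + 1) if idx < block_num - 1 else out_planes // 2 ** idx
            for idx in range(block_num)]

plan = stdcnet_plan()                   # STDC1 layout (layers=[2, 2, 2])
print(plan[-1])                         # (1024, 1024, 1): the last stage keeps 1024 channels
print(sum(stdc_branch_widths(256, 4)))  # 256: branch widths concat back to out_planes
```

The 1024-channel final stage is why `ContextPath` in `module.py` uses `inplanes = 1024`, and the halving branch widths (128 + 64 + 32 + 32 = 256) are what `paddle.concat` stitches back together in `CatBottleneck.forward`.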
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_voc/README.md",
    "content": "# stdc1_seg_voc\n\n|模型名称|stdc1_seg_voc|\n| :--- | :---: | \n|类别|图像-图像分割|\n|网络|stdc1_seg|\n|数据集|PascalVOC2012|\n|是否支持Fine-tuning|是|\n|模型大小|67MB|\n|指标|-|\n|最新更新日期|2022-03-21|\n\n## 一、模型基本信息\n  \n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### 模型介绍\n\n    - 本示例将展示如何使用PaddleHub对预训练模型进行finetune并完成预测任务。\n    - 更多详情请参考：[stdc](https://arxiv.org/abs/2104.13188)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    - ```shell\n      $ hub install stdc1_seg_voc\n      ```\n\n    -  如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n    | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)   \n\n\n## 三、模型API预测\n\n- ### 1.预测代码示例\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='stdc1_seg_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.如何开始Fine-tune\n\n    - 在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用stdc1_seg_voc模型对OpticDiscSeg数据集进行Fine-tune。 `train.py`内容如下：\n\n    - 代码步骤\n\n        - Step1: 定义数据预处理方式\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - `segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n        - Step2: 下载数据集并使用\n 
           - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                - `transforms`: 数据预处理方式。\n                - `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n                - 数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n        - Step3: 加载预训练模型\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='stdc1_seg_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: 选择预训练模型的名字。\n                - `pretrained`: 自己训练的模型参数路径，若为None，则加载提供的模型默认参数。\n\n        - Step4: 选择优化策略和运行配置\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    - 模型预测\n\n        - 当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。我们使用该模型来进行预测。predict.py脚本如下：\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='stdc1_seg_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - 
参数配置正确后，请执行脚本`python predict.py`。\n\n            - **Args**\n                * `images`:原始图像路径或BGR格式图片；\n                * `visualization`: 是否可视化，默认为True；\n                * `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n                **NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线图像分割服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m stdc1_seg_voc\n      ```\n\n    - 这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        # 发送HTTP请求\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n        headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/stdc1_seg_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
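Step4 中使用的 `PolynomialDecay` 学习率调度，其衰减规律可以用几行纯 Python 示意（仅为说明用的示意实现，假设 `cycle=False`，`polynomial_decay` 是示意函数名，并非 Paddle 源码）：

```python
# 示意实现：PolynomialDecay（cycle=False）的学习率公式，参数与上文 Step4 保持一致
def polynomial_decay(step, learning_rate=0.01, decay_steps=1000, power=0.9, end_lr=0.0001):
    t = min(step, decay_steps) / decay_steps
    return (learning_rate - end_lr) * (1 - t) ** power + end_lr

print(polynomial_decay(0))     # ≈ 0.01，初始学习率
print(polynomial_decay(1000))  # 0.0001，衰减到 end_lr
```

可见学习率从 0.01 单调衰减，在第 1000 步（`decay_steps`）后保持为 `end_lr`。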
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_voc/README_en.md",
    "content": "# stdc1_seg_voc\n\n|Module Name|stdc1_seg_voc|\n| :--- | :---: | \n|Category|Image Segmentation|\n|Network|stdc1_seg|\n|Dataset|PascalVOC2012|\n|Fine-tuning supported or not|Yes|\n|Module Size|370MB|\n|Data indicators|-|\n|Latest update date|2022-03-22|\n\n## I. Basic Information \n  \n- ### Application Effect Display\n    - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/35907364/159212097-443a5a65-2f2e-4126-9c07-d7c3c220e55f.jpg\"  width = \"420\" height = \"505\" hspace='10'/> <img src=\"https://user-images.githubusercontent.com/35907364/159212375-52e123af-4699-4c25-8f50-4240bbb714b4.png\" width = \"420\" height = \"505\" hspace='10'/>\n    </p>\n\n- ### Module Introduction\n\n    - We will show how to use PaddleHub to finetune the pre-trained model and complete the prediction.\n    - For more information, please refer to: [stdc](https://arxiv.org/abs/2104.13188)\n\n## II. Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install stdc1_seg_voc\n      ```\n\n    - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n\n## III. 
Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    - ```python\n      import cv2\n      import paddle\n      import paddlehub as hub\n\n      if __name__ == '__main__':\n          model = hub.Module(name='stdc1_seg_voc')\n          img = cv2.imread(\"/PATH/TO/IMAGE\")\n          result = model.predict(images=[img], visualization=True)\n      ```\n\n- ### 2.Fine-tune and Encapsulation\n\n    - After completing the installation of PaddlePaddle and PaddleHub, you can start using the stdc1_seg_voc model to fine-tune on datasets such as OpticDiscSeg.\n\n    - Steps:\n\n         - Step1: Define the data preprocessing method\n\n            - ```python\n              from paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\n              transform = Compose([Resize(target_size=(512, 512)), Normalize()])\n              ```\n\n            - The `segmentation_transforms` data augmentation module defines many preprocessing methods for segmentation data. Users can replace them according to their needs.\n\n         - Step2: Download the dataset\n\n            - ```python\n              from paddlehub.datasets import OpticDiscSeg\n\n              train_reader = OpticDiscSeg(transform, mode='train')\n              ```\n                * `transforms`: data preprocessing methods.\n\n                * `mode`: Select the data mode, the options are `train`, `test`, `val`. 
Default is `train`.\n\n                * Dataset preparation can be referred to [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py). `hub.datasets.OpticDiscSeg()` will automatically download the dataset from the network and decompress it to the `$HOME/.paddlehub/dataset` directory under the user directory.\n\n        - Step3: Load the pre-trained model\n\n            - ```python\n              import paddlehub as hub\n\n              model = hub.Module(name='stdc1_seg_voc', num_classes=2, pretrained=None)\n              ```\n                - `name`: model name.\n                - `pretrained`: The path of self-trained model parameters; if it is None, the provided default parameters are loaded.\n\n        - Step4: Optimization strategy\n\n            - ```python\n              import paddle\n              from paddlehub.finetune.trainer import Trainer\n\n              scheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\n              optimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\n              trainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_seg', use_gpu=True)\n              trainer.train(train_reader, epochs=10, batch_size=4, log_interval=10, save_interval=4)\n              ```\n             \n    -  Model prediction\n\n        - When Fine-tune is completed, the model with the best performance on the validation set will be saved in the `${CHECKPOINT_DIR}/best_model` directory. We use this model to make predictions. 
The `predict.py` script is as follows:\n\n            ```python\n            import paddle\n            import cv2\n            import paddlehub as hub\n\n            if __name__ == '__main__':\n                model = hub.Module(name='stdc1_seg_voc', pretrained='/PATH/TO/CHECKPOINT')\n                img = cv2.imread(\"/PATH/TO/IMAGE\")\n                model.predict(images=[img], visualization=True)\n            ```\n\n            - **Args**\n                * `images`: Image path or ndarray data with format [H, W, C], BGR.\n                * `visualization`: Whether to save the recognition results as picture files.\n                * `save_path`: Save path of the result, default is 'seg_result'.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of image segmentation.\n\n- ### Step 1: Start PaddleHub Serving\n\n    - Run the startup command:\n\n        - ```shell\n          $ hub serving start -m stdc1_seg_voc\n          ```\n\n    - The serving API is now deployed and the default port number is 8866.\n\n    - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n    - With a configured server, use the following lines of code to send the prediction request and obtain the result:\n\n        ```python\n        import requests\n        import json\n        import cv2\n        import base64\n\n        import numpy as np\n\n        def cv2_to_base64(image):\n            data = cv2.imencode('.jpg', image)[1]\n            return base64.b64encode(data.tobytes()).decode('utf8')\n\n        def base64_to_cv2(b64str):\n            data = base64.b64decode(b64str.encode('utf8'))\n            data = np.frombuffer(data, np.uint8)\n            data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n            return data\n\n        org_im = cv2.imread('/PATH/TO/IMAGE')\n        data = {'images':[cv2_to_base64(org_im)]}\n    
    headers = {\"Content-type\": \"application/json\"}\n        url = \"http://127.0.0.1:8866/predict/stdc1_seg_voc\"\n        r = requests.post(url=url, headers=headers, data=json.dumps(data))\n        mask = base64_to_cv2(r.json()[\"results\"][0])\n        ```\n\n## V. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_voc/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn.layer import activation\nfrom paddle.nn import Conv2D, AvgPool2D\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In a CPU environment nn.SyncBatchNorm has no kernel, so use nn.BatchNorm2D instead.\"\"\"\n    if paddle.get_device() == 'cpu':\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Depthwise Separable Convolution.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(SeparableConvBNReLU, self).__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.pointwise_conv = ConvBNReLU(\n            in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.pointwise_conv(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn 
layer\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBN, self).__init__()\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 kernel_size: int,\n                 padding: str = 'same',\n                 **kwargs: dict):\n        super(ConvBNReLU, self).__init__()\n\n        self._conv = Conv2D(\n            in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, which means identity transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exist_one = Activation(\"not_exist_one\")\n        # KeyError: \"not_exist_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = getattr(activation, act_name)()\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(\n                    act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n\n\nclass ASPPModule(nn.Layer):\n    \"\"\"\n    Atrous Spatial Pyramid Pooling.\n    Args:\n        aspp_ratios (tuple): The dilation rates used in the ASPP module.\n        in_channels (int): The number of input channels.\n        out_channels (int): The number of output channels.\n        align_corners (bool): An 
argument of F.interpolate. It should be set to False when the output feature size is even,\n            e.g. 1024x512; otherwise True, e.g. 769x769.\n        use_sep_conv (bool, optional): Whether to use separable conv in the ASPP module. Default: False.\n        image_pooling (bool, optional): Whether to augment with image-level features. Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 aspp_ratios: tuple,\n                 in_channels: int,\n                 out_channels: int,\n                 align_corners: bool,\n                 use_sep_conv: bool = False,\n                 image_pooling: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n        self.aspp_blocks = nn.LayerList()\n\n        for ratio in aspp_ratios:\n            if use_sep_conv and ratio > 1:\n                conv_func = SeparableConvBNReLU\n            else:\n                conv_func = ConvBNReLU\n\n            block = conv_func(\n                in_channels=in_channels,\n                out_channels=out_channels,\n                kernel_size=1 if ratio == 1 else 3,\n                dilation=ratio,\n                padding=0 if ratio == 1 else ratio)\n            self.aspp_blocks.append(block)\n\n        out_size = len(self.aspp_blocks)\n\n        if image_pooling:\n            self.global_avg_pool = nn.Sequential(\n                nn.AdaptiveAvgPool2D(output_size=(1, 1)),\n                ConvBNReLU(in_channels, out_channels, kernel_size=1, bias_attr=False))\n            out_size += 1\n        self.image_pooling = image_pooling\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=out_channels * out_size,\n            out_channels=out_channels,\n            kernel_size=1)\n\n        self.dropout = nn.Dropout(p=0.1)  # drop rate\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        outputs = []\n        for block in self.aspp_blocks:\n            y = block(x)\n            y = F.interpolate(\n                
y,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(y)\n\n        if self.image_pooling:\n            img_avg = self.global_avg_pool(x)\n            img_avg = F.interpolate(\n                img_avg,\n                x.shape[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            outputs.append(img_avg)\n\n        x = paddle.concat(outputs, axis=1)\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels (int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 inter_channels: int,\n                 out_channels: int,\n                 dropout_prob: float = 0.1,\n                 **kwargs):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(\n            in_channels=in_channels,\n            out_channels=inter_channels,\n            kernel_size=3,\n            padding=1,\n            **kwargs)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(\n            in_channels=inter_channels,\n            out_channels=out_channels,\n            kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Add(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x: paddle.Tensor, y: paddle.Tensor, name=None) -> paddle.Tensor:\n        return paddle.add(x, y, name)\n\nclass 
PPModule(nn.Layer):\n    \"\"\"\n    Pyramid pooling module originally in PSPNet.\n\n    Args:\n        in_channels (int): The number of input channels to pyramid pooling module.\n        out_channels (int): The number of output channels after pyramid pooling module.\n        bin_sizes (tuple, optional): The out size of pooled feature maps. Default: (1, 2, 3, 6).\n        dim_reduction (bool, optional): Whether to reduce the dimension after pooling. Default: True.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output feature size\n            is even, e.g. 1024x512; otherwise True, e.g. 769x769.\n    \"\"\"\n\n    def __init__(self,\n                 in_channels: int,\n                 out_channels: int,\n                 bin_sizes: tuple,\n                 dim_reduction: bool,\n                 align_corners: bool):\n        super().__init__()\n\n        self.bin_sizes = bin_sizes\n\n        inter_channels = in_channels\n        if dim_reduction:\n            inter_channels = in_channels // len(bin_sizes)\n\n        # We use the dimension reduction after pooling mentioned in the original implementation.\n        self.stages = nn.LayerList([\n            self._make_stage(in_channels, inter_channels, size)\n            for size in bin_sizes\n        ])\n\n        self.conv_bn_relu2 = ConvBNReLU(\n            in_channels=in_channels + inter_channels * len(bin_sizes),\n            out_channels=out_channels,\n            kernel_size=3,\n            padding=1)\n\n        self.align_corners = align_corners\n\n    def _make_stage(self, in_channels: int, out_channels: int, size: int):\n        \"\"\"\n        Create one pooling layer.\n\n        In our implementation, we adopt the same dimension reduction as the original paper, which might be\n        slightly different from other implementations.\n\n        After pooling, the channels are reduced to 1/len(bin_sizes) immediately, while some other 
implementations\n        keep the channels the same.\n\n        Args:\n            in_channels (int): The number of input channels to pyramid pooling module.\n            out_channels (int): The number of output channels of the pooling stage.\n            size (int): The out size of the pooled layer.\n\n        Returns:\n            nn.Sequential: A pooling stage consisting of AdaptiveAvgPool2D followed by ConvBNReLU.\n        \"\"\"\n\n        prior = nn.AdaptiveAvgPool2D(output_size=(size, size))\n        conv = ConvBNReLU(\n            in_channels=in_channels, out_channels=out_channels, kernel_size=1)\n\n        return nn.Sequential(prior, conv)\n\n    def forward(self, input: paddle.Tensor) -> paddle.Tensor:\n        cat_layers = []\n        for stage in self.stages:\n            x = stage(input)\n            x = F.interpolate(\n                x,\n                paddle.shape(input)[2:],\n                mode='bilinear',\n                align_corners=self.align_corners)\n            cat_layers.append(x)\n        cat_layers = [input] + cat_layers[::-1]\n        cat = paddle.concat(cat_layers, axis=1)\n        out = self.conv_bn_relu2(cat)\n\n        return out"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_voc/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nfrom stdc1_seg_voc.stdcnet import STDC1\nimport stdc1_seg_voc.layers as layers\n\n\n@moduleinfo(\n    name=\"stdc1_seg_voc\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"STDCSeg is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass STDCSeg(nn.Layer):\n    \"\"\"\n    The STDCSeg implementation based on PaddlePaddle.\n\n    The original article refers to Meituan\n    Fan, Mingyuan, et al. \"Rethinking BiSeNet For Real-time Semantic Segmentation.\"\n    (https://arxiv.org/abs/2104.13188)\n\n    Args:\n        num_classes(int,optional): The unique number of target classes.\n        use_boundary_8(bool,non-optional): Whether to use detail loss. it should be True accroding to paper for best metric. 
Default: True.\n            To use use_boundary_2/use_boundary_4/use_boundary_16 as well, append the corresponding number of loss functions to DetailAggregateLoss; it should work properly.\n        use_conv_last (bool, optional): Determines ContextPath's inplanes according to whether the backbone's last conv is used. Default: False.\n        pretrained (str, optional): The path or url of pretrained model. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 use_boundary_2: bool = False,\n                 use_boundary_4: bool = False,\n                 use_boundary_8: bool = True,\n                 use_boundary_16: bool = False,\n                 use_conv_last: bool = False,\n                 pretrained: str = None):\n        super(STDCSeg, self).__init__()\n\n        self.use_boundary_2 = use_boundary_2\n        self.use_boundary_4 = use_boundary_4\n        self.use_boundary_8 = use_boundary_8\n        self.use_boundary_16 = use_boundary_16\n        self.cp = ContextPath(STDC1(), use_conv_last=use_conv_last)\n        self.ffm = FeatureFusionModule(384, 256)\n        self.conv_out = SegHead(256, 256, num_classes)\n        self.conv_out8 = SegHead(128, 64, num_classes)\n        self.conv_out16 = SegHead(128, 64, num_classes)\n        self.conv_out_sp16 = SegHead(512, 64, 1)\n        self.conv_out_sp8 = SegHead(256, 64, 1)\n        self.conv_out_sp4 = SegHead(64, 64, 1)\n        self.conv_out_sp2 = SegHead(32, 64, 1)\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: 
Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        x_hw = paddle.shape(x)[2:]\n        feat_res2, feat_res4, feat_res8, _, feat_cp8, feat_cp16 = self.cp(x)\n\n        logit_list = []\n        if self.training:\n            feat_fuse = self.ffm(feat_res8, feat_cp8)\n            feat_out = self.conv_out(feat_fuse)\n            feat_out8 = self.conv_out8(feat_cp8)\n            feat_out16 = self.conv_out16(feat_cp16)\n\n            logit_list = [feat_out, feat_out8, feat_out16]\n            logit_list = [\n                F.interpolate(x, x_hw, mode='bilinear', align_corners=True)\n                for x in logit_list\n            ]\n\n            if self.use_boundary_2:\n                feat_out_sp2 = self.conv_out_sp2(feat_res2)\n                logit_list.append(feat_out_sp2)\n            if self.use_boundary_4:\n                feat_out_sp4 = self.conv_out_sp4(feat_res4)\n                logit_list.append(feat_out_sp4)\n            if self.use_boundary_8:\n                feat_out_sp8 = self.conv_out_sp8(feat_res8)\n                logit_list.append(feat_out_sp8)\n        else:\n            feat_fuse = self.ffm(feat_res8, feat_cp8)\n            feat_out = self.conv_out(feat_fuse)\n            feat_out = F.interpolate(\n                feat_out, x_hw, mode='bilinear', align_corners=True)\n            logit_list = [feat_out]\n\n        return logit_list\n\n\nclass SegHead(nn.Layer):\n    def __init__(self, in_chan: int, mid_chan: int, n_classes: int):\n        super(SegHead, self).__init__()\n        self.conv = layers.ConvBNReLU(\n            in_chan, mid_chan, kernel_size=3, stride=1, padding=1)\n        self.conv_out = nn.Conv2D(\n            mid_chan, n_classes, kernel_size=1, bias_attr=None)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = self.conv_out(x)\n        return x\n\n\nclass 
AttentionRefinementModule(nn.Layer):\n    def __init__(self, in_chan: int, out_chan: int):\n        super(AttentionRefinementModule, self).__init__()\n        self.conv = layers.ConvBNReLU(\n            in_chan, out_chan, kernel_size=3, stride=1, padding=1)\n        self.conv_atten = nn.Conv2D(\n            out_chan, out_chan, kernel_size=1, bias_attr=None)\n        self.bn_atten = nn.BatchNorm2D(out_chan)\n        self.sigmoid_atten = nn.Sigmoid()\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        feat = self.conv(x)\n        atten = F.adaptive_avg_pool2d(feat, 1)\n        atten = self.conv_atten(atten)\n        atten = self.bn_atten(atten)\n        atten = self.sigmoid_atten(atten)\n        out = paddle.multiply(feat, atten)\n        return out\n\n\nclass ContextPath(nn.Layer):\n    def __init__(self, backbone, use_conv_last: bool = False):\n        super(ContextPath, self).__init__()\n        self.backbone = backbone\n        self.arm16 = AttentionRefinementModule(512, 128)\n        # For STDC1, feat32 has 1024 channels whether or not the backbone's last conv is used.\n        inplanes = 1024\n        self.arm32 = AttentionRefinementModule(inplanes, 128)\n        self.conv_head32 = layers.ConvBNReLU(\n            128, 128, kernel_size=3, stride=1, padding=1)\n        self.conv_head16 = layers.ConvBNReLU(\n            128, 128, kernel_size=3, stride=1, padding=1)\n        self.conv_avg = layers.ConvBNReLU(\n            inplanes, 128, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x: paddle.Tensor) -> Tuple[paddle.Tensor, ...]:\n        feat2, feat4, feat8, feat16, feat32 = self.backbone(x)\n\n        feat8_hw = paddle.shape(feat8)[2:]\n        feat16_hw = paddle.shape(feat16)[2:]\n        feat32_hw = paddle.shape(feat32)[2:]\n\n        avg = F.adaptive_avg_pool2d(feat32, 1)\n        avg = self.conv_avg(avg)\n        avg_up = F.interpolate(avg, feat32_hw, mode='nearest')\n\n        feat32_arm = self.arm32(feat32)\n        feat32_sum = feat32_arm + avg_up\n        feat32_up = 
F.interpolate(feat32_sum, feat16_hw, mode='nearest')\n        feat32_up = self.conv_head32(feat32_up)\n\n        feat16_arm = self.arm16(feat16)\n        feat16_sum = feat16_arm + feat32_up\n        feat16_up = F.interpolate(feat16_sum, feat8_hw, mode='nearest')\n        feat16_up = self.conv_head16(feat16_up)\n\n        return feat2, feat4, feat8, feat16, feat16_up, feat32_up  # x8, x16\n\n\nclass FeatureFusionModule(nn.Layer):\n    def __init__(self, in_chan: int, out_chan: int):\n        super(FeatureFusionModule, self).__init__()\n        self.convblk = layers.ConvBNReLU(\n            in_chan, out_chan, kernel_size=1, stride=1, padding=0)\n        self.conv1 = nn.Conv2D(\n            out_chan,\n            out_chan // 4,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            bias_attr=None)\n        self.conv2 = nn.Conv2D(\n            out_chan // 4,\n            out_chan,\n            kernel_size=1,\n            stride=1,\n            padding=0,\n            bias_attr=None)\n        self.relu = nn.ReLU()\n        self.sigmoid = nn.Sigmoid()\n\n    def forward(self, fsp: paddle.Tensor, fcp: paddle.Tensor) -> paddle.Tensor:\n        fcat = paddle.concat([fsp, fcp], axis=1)\n        feat = self.convblk(fcat)\n        atten = F.adaptive_avg_pool2d(feat, 1)\n        atten = self.conv1(atten)\n        atten = self.relu(atten)\n        atten = self.conv2(atten)\n        atten = self.sigmoid(atten)\n        feat_atten = paddle.multiply(feat, atten)\n        feat_out = feat_atten + feat\n        return feat_out"
  },
  {
    "path": "modules/image/semantic_segmentation/stdc1_seg_voc/stdcnet.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\nimport stdc1_seg_voc.layers as L\n\n__all__ = [\"STDC1\", \"STDC2\"]\n\n\nclass STDCNet(nn.Layer):\n    \"\"\"\n    The STDCNet implementation based on PaddlePaddle.\n\n    The original article refers to Meituan\n    Fan, Mingyuan, et al. \"Rethinking BiSeNet For Real-time Semantic Segmentation.\"\n    (https://arxiv.org/abs/2104.13188)\n\n    Args:\n        base(int, optional): base channels. Default: 64.\n        layers(list, optional): layers numbers list. It determines STDC block numbers of STDCNet's stage3\\4\\5. Defualt: [4, 5, 3].\n        block_num(int,optional): block_num of features block. Default: 4.\n        type(str,optional): feature fusion method \"cat\"/\"add\". Default: \"cat\".\n        num_classes(int, optional): class number for image classification. Default: 1000.\n        dropout(float,optional): dropout ratio. if >0,use dropout ratio.  Default: 0.20.\n        use_conv_last(bool,optional): whether to use the last ConvBNReLU layer . 
Default: False.\n    \"\"\"\n\n    def __init__(self,\n                 base: int = 64,\n                 layers: List[int] = [4, 5, 3],\n                 block_num: int = 4,\n                 type: str = \"cat\",\n                 num_classes: int = 1000,\n                 dropout: float = 0.20,\n                 use_conv_last: bool = False):\n        super(STDCNet, self).__init__()\n        if type == \"cat\":\n            block = CatBottleneck\n        elif type == \"add\":\n            block = AddBottleneck\n        else:\n            raise ValueError(\n                \"type should be 'cat' or 'add', but got {}\".format(type))\n        self.use_conv_last = use_conv_last\n        self.features = self._make_layers(base, layers, block_num, block)\n        self.conv_last = ConvBNRelu(base * 16, max(1024, base * 16), 1, 1)\n\n        if (layers == [4, 5, 3]):  # stdc1446\n            self.x2 = nn.Sequential(self.features[:1])\n            self.x4 = nn.Sequential(self.features[1:2])\n            self.x8 = nn.Sequential(self.features[2:6])\n            self.x16 = nn.Sequential(self.features[6:11])\n            self.x32 = nn.Sequential(self.features[11:])\n        elif (layers == [2, 2, 2]):  # stdc813\n            self.x2 = nn.Sequential(self.features[:1])\n            self.x4 = nn.Sequential(self.features[1:2])\n            self.x8 = nn.Sequential(self.features[2:4])\n            self.x16 = nn.Sequential(self.features[4:6])\n            self.x32 = nn.Sequential(self.features[6:])\n        else:\n            raise NotImplementedError(\n                \"model with layers:{} is not implemented!\".format(layers))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        \"\"\"\n        Forward function for feature extraction.\n        \"\"\"\n        feat2 = self.x2(x)\n        feat4 = self.x4(feat2)\n        feat8 = self.x8(feat4)\n        feat16 = self.x16(feat8)\n        feat32 = self.x32(feat16)\n        if self.use_conv_last:\n            feat32 = self.conv_last(feat32)\n        return feat2, feat4, feat8, 
feat16, feat32\n\n    def _make_layers(self, base, layers, block_num, block):\n        features = []\n        features += [ConvBNRelu(3, base // 2, 3, 2)]\n        features += [ConvBNRelu(base // 2, base, 3, 2)]\n\n        for i, layer in enumerate(layers):\n            for j in range(layer):\n                if i == 0 and j == 0:\n                    features.append(block(base, base * 4, block_num, 2))\n                elif j == 0:\n                    features.append(\n                        block(base * int(math.pow(2, i + 1)),\n                              base * int(math.pow(2, i + 2)), block_num, 2))\n                else:\n                    features.append(\n                        block(base * int(math.pow(2, i + 2)),\n                              base * int(math.pow(2, i + 2)), block_num, 1))\n\n        return nn.Sequential(*features)\n\n\nclass ConvBNRelu(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, kernel: int = 3, stride: int = 1):\n        super(ConvBNRelu, self).__init__()\n        self.conv = nn.Conv2D(\n            in_planes,\n            out_planes,\n            kernel_size=kernel,\n            stride=stride,\n            padding=kernel // 2,\n            bias_attr=False)\n        self.bn = L.SyncBatchNorm(out_planes, data_format='NCHW')\n        self.relu = nn.ReLU()\n\n    def forward(self, x):\n        out = self.relu(self.bn(self.conv(x)))\n        return out\n\n\nclass AddBottleneck(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, block_num: int = 3, stride: int = 1):\n        super(AddBottleneck, self).__init__()\n        assert block_num > 1, \"block number should be larger than 1.\"\n        self.conv_list = nn.LayerList()\n        self.stride = stride\n        if stride == 2:\n            self.avd_layer = nn.Sequential(\n                nn.Conv2D(\n                    out_planes // 2,\n                    out_planes // 2,\n                    kernel_size=3,\n                    stride=2,\n     
               padding=1,\n                    groups=out_planes // 2,\n                    bias_attr=False),\n                nn.BatchNorm2D(out_planes // 2),\n            )\n            self.skip = nn.Sequential(\n                nn.Conv2D(\n                    in_planes,\n                    in_planes,\n                    kernel_size=3,\n                    stride=2,\n                    padding=1,\n                    groups=in_planes,\n                    bias_attr=False),\n                nn.BatchNorm2D(in_planes),\n                nn.Conv2D(\n                    in_planes, out_planes, kernel_size=1, bias_attr=False),\n                nn.BatchNorm2D(out_planes),\n            )\n            stride = 1\n\n        for idx in range(block_num):\n            if idx == 0:\n                self.conv_list.append(\n                    ConvBNRelu(in_planes, out_planes // 2, kernel=1))\n            elif idx == 1 and block_num == 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 2, stride=stride))\n            elif idx == 1 and block_num > 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 4, stride=stride))\n            elif idx < block_num - 1:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx + 1))))\n            else:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx))))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out_list = []\n        out = x\n        for idx, conv in enumerate(self.conv_list):\n            if idx == 0 and self.stride == 2:\n                out = self.avd_layer(conv(out))\n            else:\n                out = conv(out)\n            out_list.append(out)\n        if 
self.stride == 2:\n            x = self.skip(x)\n        return paddle.concat(out_list, axis=1) + x\n\n\nclass CatBottleneck(nn.Layer):\n    def __init__(self, in_planes: int, out_planes: int, block_num: int = 3, stride: int = 1):\n        super(CatBottleneck, self).__init__()\n        assert block_num > 1, \"block number should be larger than 1.\"\n        self.conv_list = nn.LayerList()\n        self.stride = stride\n        if stride == 2:\n            self.avd_layer = nn.Sequential(\n                nn.Conv2D(\n                    out_planes // 2,\n                    out_planes // 2,\n                    kernel_size=3,\n                    stride=2,\n                    padding=1,\n                    groups=out_planes // 2,\n                    bias_attr=False),\n                nn.BatchNorm2D(out_planes // 2),\n            )\n            self.skip = nn.AvgPool2D(kernel_size=3, stride=2, padding=1)\n            stride = 1\n\n        for idx in range(block_num):\n            if idx == 0:\n                self.conv_list.append(\n                    ConvBNRelu(in_planes, out_planes // 2, kernel=1))\n            elif idx == 1 and block_num == 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 2, stride=stride))\n            elif idx == 1 and block_num > 2:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // 2, out_planes // 4, stride=stride))\n            elif idx < block_num - 1:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx + 1))))\n            else:\n                self.conv_list.append(\n                    ConvBNRelu(out_planes // int(math.pow(2, idx)),\n                               out_planes // int(math.pow(2, idx))))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        out_list = []\n        out1 = self.conv_list[0](x)\n  
      for idx, conv in enumerate(self.conv_list[1:]):\n            if idx == 0:\n                if self.stride == 2:\n                    out = conv(self.avd_layer(out1))\n                else:\n                    out = conv(out1)\n            else:\n                out = conv(out)\n            out_list.append(out)\n\n        if self.stride == 2:\n            out1 = self.skip(out1)\n        out_list.insert(0, out1)\n        out = paddle.concat(out_list, axis=1)\n        return out\n\n\ndef STDC2(**kwargs):\n    model = STDCNet(base=64, layers=[4, 5, 3], **kwargs)\n    return model\n\ndef STDC1(**kwargs):\n    model = STDCNet(base=64, layers=[2, 2, 2], **kwargs)\n    return model"
  },
  {
    "path": "modules/image/semantic_segmentation/unet_cityscapes/README.md",
    "content": "# PaddleHub 图像分割\n\n## 模型预测\n\n若想使用我们提供的预训练模型进行预测，可使用如下脚本：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='unet_cityscapes')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n## 如何开始Fine-tune\n\n在完成安装PaddlePaddle与PaddleHub后，通过执行`python train.py`即可开始使用unet_cityscapes模型对OpticDiscSeg等数据集进行Fine-tune。\n\n## 代码步骤\n\n使用PaddleHub Fine-tune API进行Fine-tune可以分为4个步骤。\n\n### Step1: 定义数据预处理方式\n```python\nfrom paddlehub.vision.segmentation_transforms import Compose, Resize, Normalize\n\ntransform = Compose([Resize(target_size=(512, 512)), Normalize()])\n```\n\n`segmentation_transforms` 数据增强模块定义了丰富的针对图像分割数据的预处理方式，用户可按照需求替换自己需要的数据预处理方式。\n\n### Step2: 下载数据集并使用\n```python\nfrom paddlehub.datasets import OpticDiscSeg\n\ntrain_reader = OpticDiscSeg(transform， mode='train')\n\n```\n* `transform`: 数据预处理方式。\n* `mode`: 选择数据模式，可选项有 `train`, `test`, `val`, 默认为`train`。\n\n数据集的准备代码可以参考 [opticdiscseg.py](../../paddlehub/datasets/opticdiscseg.py)。`hub.datasets.OpticDiscSeg()`会自动从网络下载数据集并解压到用户目录下`$HOME/.paddlehub/dataset`目录。\n\n### Step3: 加载预训练模型\n\n```python\nmodel = hub.Module(name='unet_cityscapes', num_classes=2, pretrained=None)\n```\n* `name`: 选择预训练模型的名字。\n* `num_classes`: 分割模型的类别数目。\n* `pretrained`: 是否加载自己训练的模型，若为None，则加载提供的模型默认参数。\n\n### Step4: 选择优化策略和运行配置\n\n```python\nscheduler = paddle.optimizer.lr.PolynomialDecay(learning_rate=0.01, decay_steps=1000, power=0.9,  end_lr=0.0001)\noptimizer = paddle.optimizer.Adam(learning_rate=scheduler, parameters=model.parameters())\ntrainer = Trainer(model, optimizer, checkpoint_dir='test_ckpt_img_ocr', use_gpu=True)\n```\n\n#### 优化策略\n\nPaddle2.0rc提供了多种优化器选择，如`SGD`, `Adam`, `Adamax`等，详细参见[策略](https://www.paddlepaddle.org.cn/documentation/docs/zh/2.0-rc/api/paddle/optimizer/optimizer/Optimizer_cn.html)。\n\n其中`Adam`:\n\n* `learning_rate`: 全局学习率。\n*  `parameters`: 待优化模型参数。\n\n#### 运行配置\n`Trainer` 
主要控制Fine-tune的训练，包含以下可控制的参数:\n\n* `model`: 被优化模型；\n* `optimizer`: 优化器选择；\n* `use_gpu`: 是否使用gpu，默认为False;\n* `use_vdl`: 是否使用vdl可视化训练过程；\n* `checkpoint_dir`: 保存模型参数的地址；\n* `compare_metrics`: 保存最优模型的衡量指标；\n\n`trainer.train` 主要控制具体的训练过程，包含以下可控制的参数：\n\n* `train_dataset`: 训练时所用的数据集；\n* `epochs`: 训练轮数；\n* `batch_size`: 训练的批大小，如果使用GPU，请根据实际情况调整batch_size；\n* `num_workers`: works的数量，默认为0；\n* `eval_dataset`: 验证集；\n* `log_interval`: 打印日志的间隔， 单位为执行批训练的次数。\n* `save_interval`: 保存模型的间隔频次，单位为执行训练的轮数。\n\n## 模型预测\n\n当完成Fine-tune后，Fine-tune过程在验证集上表现最优的模型会被保存在`${CHECKPOINT_DIR}/best_model`目录下，其中`${CHECKPOINT_DIR}`目录为Fine-tune时所选择的保存checkpoint的目录。\n\n我们使用该模型来进行预测。predict.py脚本如下：\n\n```python\nimport paddle\nimport cv2\nimport paddlehub as hub\n\nif __name__ == '__main__':\n    model = hub.Module(name='unet_cityscapes', pretrained='/PATH/TO/CHECKPOINT')\n    img = cv2.imread(\"/PATH/TO/IMAGE\")\n    model.predict(images=[img], visualization=True)\n```\n\n参数配置正确后，请执行脚本`python predict.py`。\n**Args**\n* `images`:原始图像路径或BGR格式图片；\n* `visualization`: 是否可视化，默认为True；\n* `save_path`: 保存结果的路径，默认保存路径为'seg_result'。\n\n**NOTE:** 进行预测时，所选择的module，checkpoint_dir，dataset必须和Fine-tune所用的一样。\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线图像分割服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m unet_cityscapes\n```\n\n这样就完成了一个图像分割服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\nimport cv2\nimport base64\n\nimport numpy as np\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tostring()).decode('utf8')\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n# 发送HTTP请求\norg_im = cv2.imread('/PATH/TO/IMAGE')\ndata = 
{'images':[cv2_to_base64(org_im)]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/unet_cityscapes\"\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nmask = base64_to_cv2(r.json()[\"results\"][0])\n```\n\n### 查看代码\n\nhttps://github.com/PaddlePaddle/PaddleSeg\n\n### 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n"
  },
  {
    "path": "modules/image/semantic_segmentation/unet_cityscapes/layers.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef SyncBatchNorm(*args, **kwargs):\n    \"\"\"In cpu environment nn.SyncBatchNorm does not have kernel so use nn.BatchNorm2D instead\"\"\"\n    if paddle.get_device() == 'cpu' or os.environ.get('PADDLESEG_EXPORT_STAGE'):\n        return nn.BatchNorm2D(*args, **kwargs)\n    else:\n        return nn.SyncBatchNorm(*args, **kwargs)\n\n\nclass ConvBNReLU(nn.Layer):\n    \"\"\"Basic conv bn relu layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        x = F.relu(x)\n        return x\n\n\nclass ConvBN(nn.Layer):\n    \"\"\"Basic conv bn layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self._conv = nn.Conv2D(in_channels, out_channels, kernel_size, padding=padding, **kwargs)\n        self._batch_norm = SyncBatchNorm(out_channels)\n\n    def forward(self, x: 
paddle.Tensor) -> paddle.Tensor:\n        x = self._conv(x)\n        x = self._batch_norm(x)\n        return x\n\n\nclass ConvReLUPool(nn.Layer):\n    \"\"\"Basic conv relu pool layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int):\n        super().__init__()\n        self.conv = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1, dilation=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv(x)\n        x = F.relu(x)\n        x = F.max_pool2d(x, kernel_size=2, stride=2)\n        return x\n\n\nclass SeparableConvBNReLU(nn.Layer):\n    \"\"\"Basic separable Convolution layer.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=in_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n        self.piontwise_conv = ConvBNReLU(in_channels, out_channels, kernel_size=1, groups=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        x = self.piontwise_conv(x)\n        return x\n\n\nclass DepthwiseConvBN(nn.Layer):\n    \"\"\"Depthwise Convolution.\"\"\"\n\n    def __init__(self, in_channels: int, out_channels: int, kernel_size: int, padding: str = 'same', **kwargs):\n        super().__init__()\n        self.depthwise_conv = ConvBN(\n            in_channels,\n            out_channels=out_channels,\n            kernel_size=kernel_size,\n            padding=padding,\n            groups=in_channels,\n            **kwargs)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.depthwise_conv(x)\n        return x\n\n\nclass AuxLayer(nn.Layer):\n    \"\"\"\n    The auxiliary layer implementation for auxiliary loss.\n\n    Args:\n        in_channels 
(int): The number of input channels.\n        inter_channels (int): The intermediate channels.\n        out_channels (int): The number of output channels, and usually it is num_classes.\n        dropout_prob (float, optional): The drop rate. Default: 0.1.\n    \"\"\"\n\n    def __init__(self, in_channels: int, inter_channels: int, out_channels: int, dropout_prob: float = 0.1):\n        super().__init__()\n\n        self.conv_bn_relu = ConvBNReLU(in_channels=in_channels, out_channels=inter_channels, kernel_size=3, padding=1)\n\n        self.dropout = nn.Dropout(p=dropout_prob)\n\n        self.conv = nn.Conv2D(in_channels=inter_channels, out_channels=out_channels, kernel_size=1)\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        x = self.conv_bn_relu(x)\n        x = self.dropout(x)\n        x = self.conv(x)\n        return x\n\n\nclass Activation(nn.Layer):\n    \"\"\"\n    The wrapper of activations.\n    Args:\n        act (str, optional): The activation name in lowercase. It must be one of ['elu', 'gelu',\n            'hardshrink', 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid',\n            'softmax', 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax',\n            'hsigmoid']. 
Default: None, means identical transformation.\n    Returns:\n        A callable object of Activation.\n    Raises:\n        KeyError: When parameter `act` is not in the optional range.\n    Examples:\n        from paddleseg.models.common.activation import Activation\n        relu = Activation(\"relu\")\n        print(relu)\n        # <class 'paddle.nn.layer.activation.ReLU'>\n        sigmoid = Activation(\"sigmoid\")\n        print(sigmoid)\n        # <class 'paddle.nn.layer.activation.Sigmoid'>\n        not_exit_one = Activation(\"not_exit_one\")\n        # KeyError: \"not_exit_one does not exist in the current dict_keys(['elu', 'gelu', 'hardshrink',\n        # 'tanh', 'hardtanh', 'prelu', 'relu', 'relu6', 'selu', 'leakyrelu', 'sigmoid', 'softmax',\n        # 'softplus', 'softshrink', 'softsign', 'tanhshrink', 'logsigmoid', 'logsoftmax', 'hsigmoid'])\"\n    \"\"\"\n\n    def __init__(self, act: str = None):\n        super(Activation, self).__init__()\n\n        self._act = act\n        upper_act_names = nn.layer.activation.__dict__.keys()\n        lower_act_names = [act.lower() for act in upper_act_names]\n        act_dict = dict(zip(lower_act_names, upper_act_names))\n\n        if act is not None:\n            if act in act_dict.keys():\n                act_name = act_dict[act]\n                self.act_func = eval(\"nn.layer.activation.{}()\".format(act_name))\n            else:\n                raise KeyError(\"{} does not exist in the current {}\".format(act, act_dict.keys()))\n\n    def forward(self, x: paddle.Tensor) -> paddle.Tensor:\n        if self._act is not None:\n            return self.act_func(x)\n        else:\n            return x\n"
  },
  {
    "path": "modules/image/semantic_segmentation/unet_cityscapes/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Union, List, Tuple\n\nimport paddle\nfrom paddle import nn\nimport paddle.nn.functional as F\nimport numpy as np\nfrom paddlehub.module.module import moduleinfo\nimport paddlehub.vision.segmentation_transforms as T\nfrom paddlehub.module.cv_module import ImageSegmentationModule\n\nimport unet_cityscapes.layers as layers\n\n\n@moduleinfo(\n    name=\"unet_cityscapes\",\n    type=\"CV/semantic_segmentation\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    summary=\"Unet is a segmentation model.\",\n    version=\"1.0.0\",\n    meta=ImageSegmentationModule)\nclass UNet(nn.Layer):\n    \"\"\"\n    The UNet implementation based on PaddlePaddle.\n\n    The original article refers to\n    Olaf Ronneberger, et, al. \"U-Net: Convolutional Networks for Biomedical Image Segmentation\"\n    (https://arxiv.org/abs/1505.04597).\n\n    Args:\n        num_classes (int): The unique number of target classes.\n        align_corners (bool): An argument of F.interpolate. It should be set to False when the output size of feature\n            is even, e.g. 1024x512, otherwise it is True, e.g. 769x769.  Default: False.\n        use_deconv (bool, optional): A bool value indicates whether using deconvolution in upsampling.\n            If False, use resize_bilinear. 
Default: False.\n        pretrained (str, optional): The path or url of pretrained model for fine tuning. Default: None.\n    \"\"\"\n\n    def __init__(self,\n                 num_classes: int = 19,\n                 align_corners: bool = False,\n                 use_deconv: bool = False,\n                 pretrained: str = None):\n        super(UNet, self).__init__()\n\n        self.encode = Encoder()\n        self.decode = Decoder(align_corners, use_deconv=use_deconv)\n        self.cls = self.conv = nn.Conv2D(in_channels=64, out_channels=num_classes, kernel_size=3, stride=1, padding=1)\n\n        self.transforms = T.Compose([T.Normalize()])\n\n        if pretrained is not None:\n            model_dict = paddle.load(pretrained)\n            self.set_dict(model_dict)\n            print(\"load custom parameters success\")\n\n        else:\n            checkpoint = os.path.join(self.directory, 'model.pdparams')\n            model_dict = paddle.load(checkpoint)\n            self.set_dict(model_dict)\n            print(\"load pretrained parameters success\")\n\n    def transform(self, img: Union[np.ndarray, str]) -> Union[np.ndarray, str]:\n        return self.transforms(img)\n\n    def forward(self, x: paddle.Tensor) -> List[paddle.Tensor]:\n        logit_list = []\n        x, short_cuts = self.encode(x)\n        x = self.decode(x, short_cuts)\n        logit = self.cls(x)\n        logit_list.append(logit)\n        return logit_list\n\n\nclass Encoder(nn.Layer):\n    def __init__(self):\n        super().__init__()\n\n        self.double_conv = nn.Sequential(layers.ConvBNReLU(3, 64, 3), layers.ConvBNReLU(64, 64, 3))\n        down_channels = [[64, 128], [128, 256], [256, 512], [512, 512]]\n        self.down_sample_list = nn.LayerList([self.down_sampling(channel[0], channel[1]) for channel in down_channels])\n\n    def down_sampling(self, in_channels: int, out_channels: int) -> nn.Layer:\n        modules = []\n        modules.append(nn.MaxPool2D(kernel_size=2, 
stride=2))\n        modules.append(layers.ConvBNReLU(in_channels, out_channels, 3))\n        modules.append(layers.ConvBNReLU(out_channels, out_channels, 3))\n        return nn.Sequential(*modules)\n\n    def forward(self, x: paddle.Tensor) -> Tuple:\n        short_cuts = []\n        x = self.double_conv(x)\n        for down_sample in self.down_sample_list:\n            short_cuts.append(x)\n            x = down_sample(x)\n        return x, short_cuts\n\n\nclass Decoder(nn.Layer):\n    def __init__(self, align_corners: bool, use_deconv: bool = False):\n        super().__init__()\n\n        up_channels = [[512, 256], [256, 128], [128, 64], [64, 64]]\n        self.up_sample_list = nn.LayerList(\n            [UpSampling(channel[0], channel[1], align_corners, use_deconv) for channel in up_channels])\n\n    def forward(self, x: paddle.Tensor, short_cuts: List) -> paddle.Tensor:\n        for i in range(len(short_cuts)):\n            x = self.up_sample_list[i](x, short_cuts[-(i + 1)])\n        return x\n\n\nclass UpSampling(nn.Layer):\n    def __init__(self, in_channels: int, out_channels: int, align_corners: bool, use_deconv: bool = False):\n        super().__init__()\n\n        self.align_corners = align_corners\n\n        self.use_deconv = use_deconv\n        if self.use_deconv:\n            self.deconv = nn.Conv2DTranspose(in_channels, out_channels // 2, kernel_size=2, stride=2, padding=0)\n            in_channels = in_channels + out_channels // 2\n        else:\n            in_channels *= 2\n\n        self.double_conv = nn.Sequential(\n            layers.ConvBNReLU(in_channels, out_channels, 3), layers.ConvBNReLU(out_channels, out_channels, 3))\n\n    def forward(self, x: paddle.Tensor, short_cut: paddle.Tensor) -> paddle.Tensor:\n        if self.use_deconv:\n            x = self.deconv(x)\n        else:\n            x = F.interpolate(x, paddle.shape(short_cut)[2:], mode='bilinear', align_corners=self.align_corners)\n        x = paddle.concat([x, short_cut], 
axis=1)\n        x = self.double_conv(x)\n        return x\n"
  },
  {
    "path": "modules/image/text_recognition/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【文字识别】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 文字识别\n文字识别（OCR）是计算机视觉重要任务之一，主要用于图像中文本信息的提取，具有重要的产业实践意义。\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [超轻量-中英文OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition) | 业界开源最小，8.1M超轻量中英文识别模型。支持中英文识别；支持倾斜、竖排等多种方向文字识别，**强力推荐** |\n| [高精度-中英文OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition) | 业界开源效果最好，155M高精度中英文识别模型。支持中英文识别；支持倾斜、竖排等多种方向文字识别，**强力推荐** |\n| [德语-超轻量OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=german_ocr_db_crnn_mobile&en_category=TextRecognition) | 德语OCR识别，超轻量|\n| [法语-超轻量OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=french_ocr_db_crnn_mobile&en_category=TextRecognition) | 法语OCR识别，超轻量|\n| [日语-超轻量OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=japan_ocr_db_crnn_mobile&en_category=TextRecognition) | 日语OCR识别，超轻量|\n| [韩语-超轻量OCR文字识别](https://www.paddlepaddle.org.cn/hubdetail?name=korean_ocr_db_crnn_mobile&en_category=TextRecognition) | 韩语OCR识别，超轻量|\n"
  },
  {
    "path": "modules/image/text_recognition/README_en.md",
    "content": "## **For better user experience, refer to the web official document -> [OCR](https://www.paddlepaddle.org.cn/hublist)**\n\n### OCR\n\nOptical Character Recognition  (OCR) is one of the important tasks of computer vision, mainly used for the extraction of text information from images. It has the important industrial practice significance.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Ultra Lightweight - Chinese \\& English OCR](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition) | It has the industry's smallest open source, with 8.1M ultra-lightweight Chinese and English recognition model. Support Chinese and English recognition. In addition, it supports tilt, vertical and other multi-directional OCR. It is highly recommended. |\n| [High Precision - Chinese \\& English OCR](https://www.paddlepaddle.org.cn/hubdetail?name=chinese_ocr_db_crnn_mobile&en_category=TextRecognition) | It has the industry's best open source effect, with 155M high-precision Chinese and English recognition model. Support Chinese and English recognition. In addition, it supports tilt, vertical and other multi-directional OCR. It is highly recommended. 
|\n| [German - Ultra lightweight OCR](https://www.paddlepaddle.org.cn/hubdetail?name=german_ocr_db_crnn_mobile&en_category=TextRecognition) | German OCR, ultra-lightweight                                |\n| [French - Ultra Lightweight OCR](https://www.paddlepaddle.org.cn/hubdetail?name=french_ocr_db_crnn_mobile&en_category=TextRecognition) | French OCR, ultra lightweight                                |\n| [Japanese - Ultra Lightweight  OCR](https://www.paddlepaddle.org.cn/hubdetail?name=japan_ocr_db_crnn_mobile&en_category=TextRecognition) | Japanese OCR, ultra lightweight                              |\n| [Korean - Ultra Lightweight OCR](https://www.paddlepaddle.org.cn/hubdetail?name=korean_ocr_db_crnn_mobile&en_category=TextRecognition) | Korean OCR, ultra lightweight                                |\n"
  },
  {
    "path": "modules/image/text_recognition/Vehicle_License_Plate_Recognition/README.md",
    "content": "# Vehicle_License_Plate_Recognition\n\n|模型名称|Vehicle_License_Plate_Recognition|\n| :--- | :---: |\n|类别|图像 - 文字识别|\n|网络|-|\n|数据集|CCPD|\n|是否支持Fine-tuning|否|\n|模型大小|111MB|\n|最新更新日期|2021-03-22|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/35a3dab32ac948549de41afba7b51a5770d3f872d60b437d891f359a5cef8052\"  width = \"450\" height = \"300\" hspace='10'/> <br />\n    </p>\n\n\n- ### 模型介绍\n\n  - Vehicle_License_Plate_Recognition是一个基于CCPD数据集训练的车牌识别模型，能够检测出图像中车牌位置并识别其中的车牌文字信息。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.4\n\n  - paddleocr >= 2.0.2  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install Vehicle_License_Plate_Recognition\n    ```\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"Vehicle_License_Plate_Recognition\")\n    result = model.plate_recognition(images=[cv2.imread('/PATH/TO/IMAGE')])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def plate_recognition(images)\n    ```\n\n    - 车牌识别 API。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]；<br/>\n\n\n    - **返回**\n      - results(list(dict{'license', 'bbox'})): 识别到的车牌信息列表，包含车牌的位置坐标和车牌号码\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线车牌识别服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m Vehicle_License_Plate_Recognition\n    ```\n\n  - 这样就完成了一个车牌识别的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tostring()).decode('utf8')\n\n    # 发送HTTP请求\n    data = 
{'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/Vehicle_License_Plate_Recognition\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install Vehicle_License_Plate_Recognition==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/Vehicle_License_Plate_Recognition/README_en.md",
    "content": "# Vehicle_License_Plate_Recognition\n\n|Module Name|Vehicle_License_Plate_Recognition|\n| :--- | :---: |\n|Category|text recognition|\n|Network|-|\n|Dataset|CCPD|\n|Fine-tuning supported or not|No|\n|Module Size|111MB|\n|Latest update date|2021-03-22|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://ai-studio-static-online.cdn.bcebos.com/35a3dab32ac948549de41afba7b51a5770d3f872d60b437d891f359a5cef8052\"  width = \"450\" height = \"300\" hspace='10'/> <br />\n    </p>\n\n\n- ### Module Introduction\n\n  - Vehicle_License_Plate_Recognition is a module for licence plate recognition, trained on CCPD dataset. This model can detect the position of licence plate and recognize the contents.\n\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 2.0.0  \n\n  - paddlehub >= 2.0.4\n\n  - paddleocr >= 2.0.2  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install Vehicle_License_Plate_Recognition\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    model = hub.Module(name=\"Vehicle_License_Plate_Recognition\")\n    result = model.plate_recognition(images=[cv2.imread('/PATH/TO/IMAGE')])\n    ```\n\n- ### 2、API\n\n  - ```python\n    def plate_recognition(images)\n    ```\n\n    - Prediction API.\n\n    - **Parameters**\n\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format [H, W, C], BGR;\n\n\n    - **Return**\n      - results(list(dict{'license', 'bbox'})): The list of recognition results, where each element is 
dict.\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text recognition.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m Vehicle_License_Plate_Recognition\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it is not required.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/Vehicle_License_Plate_Recognition\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install Vehicle_License_Plate_Recognition==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/Vehicle_License_Plate_Recognition/module.py",
    "content": "import os\nimport cv2\nimport base64\nimport numpy as np\nimport paddle.nn as nn\nfrom paddleocr import PaddleOCR\nfrom paddlehub.module.module import moduleinfo, serving\n\n\n@moduleinfo(\n    name=\"Vehicle_License_Plate_Recognition\",\n    type=\"CV/text_recognition\",\n    author=\"jm12138\",\n    author_email=\"\",\n    summary=\"Vehicle_License_Plate_Recognition\",\n    version=\"1.0.0\")\nclass Vehicle_License_Plate_Recognition(nn.Layer):\n    def __init__(self):\n        super(Vehicle_License_Plate_Recognition, self).__init__()\n        self.vlpr = PaddleOCR(\n            det_model_dir=os.path.join(self.directory, 'det_vlpr'),\n            rec_model_dir=os.path.join(self.directory, 'ch_ppocr_server_v2.0_rec_infer'))\n\n    @staticmethod\n    def base64_to_cv2(b64str):\n        data = base64.b64decode(b64str.encode('utf8'))\n        data = np.frombuffer(data, np.uint8)\n        data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n        return data\n\n    def plate_recognition(self, images=None):\n        assert isinstance(images, (list, str, np.ndarray))\n        results = []\n\n        if isinstance(images, list):\n            for item in images:\n                for bbox, text in self.vlpr.ocr(item):\n                    results.append({'license': text[0], 'bbox': bbox})\n\n        elif isinstance(images, (str, np.ndarray)):\n            for bbox, text in self.vlpr.ocr(images):\n                results.append({'license': text[0], 'bbox': bbox})\n\n        return results\n\n    @serving\n    def serving_method(self, images):\n        if isinstance(images, list):\n            images_decode = [self.base64_to_cv2(image) for image in images]\n        elif isinstance(images, str):\n            images_decode = self.base64_to_cv2(images)\n\n        return self.plate_recognition(images_decode)\n"
  },
  {
    "path": "modules/image/text_recognition/arabic_ocr_db_crnn_mobile/README.md",
    "content": "# arabic_ocr_db_crnn_mobile\n\n|模型名称|arabic_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - arabic_ocr_db_crnn_mobile Module用于识别图片当中的阿拉伯文字，包括阿拉伯文、波斯文、维吾尔文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的阿拉伯文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别阿拉伯文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install arabic_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run arabic_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run arabic_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"arabic_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = 
ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造ArabicOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - 
```shell\n    $ hub serving start -m arabic_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果：\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/arabic_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install arabic_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/arabic_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/arabic_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"arabic_ocr_db_crnn_mobile\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass ArabicOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"arabic\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when ``paths`` is not set.\n            paths (list[str]): The paths of images. Used when ``images`` is not set.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): dictionary of input shapes, e.g. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): the ONNX opset version used for export.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
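The `recognize_text` API above accepts input through either `images` or `paths`, but not both; mixing the two raises a `TypeError` inside the underlying module. A minimal pure-Python sketch of that contract (`resolve_inputs` is a hypothetical helper, not part of the module; the real implementation decodes each path with `cv2.imread`):

```python
def resolve_inputs(images=None, paths=None):
    """Enforce the images-XOR-paths contract used by recognize_text."""
    images = images or []
    paths = paths or []
    if images and not paths:
        # already-decoded ndarrays ([H, W, C]) are used as-is
        return list(images)
    if paths and not images:
        # the real module loads each file with cv2.imread here
        return list(paths)
    raise TypeError("The input data is inconsistent with expectations.")
```

Passing both sources (or neither) fails fast rather than silently preferring one of them.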
  {
    "path": "modules/image/text_recognition/arabic_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3/README.md",
    "content": "# ch_pp-ocrv3\n\n|模型名称|ch_pp-ocrv3|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+SVTR_LCNet|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|13M|\n|最新更新日期|2022-05-11|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - [OCR文字识别场景在线体验](https://www.paddlepaddle.org.cn/hub/scene/ocr)\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/22424850/167818854-96811631-d40c-4d07-9aae-b78d4514c917.jpg\"  width = \"600\" hspace='10'/> <br />\n</p>\n\n- ### 模型介绍\n\n  - PP-OCR是PaddleOCR自研的实用的超轻量OCR系统。在实现前沿算法的基础上，考虑精度与速度的平衡，进行模型瘦身和深度优化，使其尽可能满足产业落地需求。该系统包含文本检测和文本识别两个阶段，其中文本检测算法选用DB，文本识别算法选用CRNN，并在检测和识别模块之间添加文本方向分类器，以应对不同方向的文本识别。当前模块为PP-OCRv3，在PP-OCRv2的基础上，针对检测模型和识别模型，进行了共计9个方面的升级，进一步提升了模型效果。\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.5/doc/ppocrv3_framework.png\" width=\"800\" hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[PP-OCRv3](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.5/doc/doc_ch/PP-OCRv3_introduction.md)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.2\n\n  - paddlehub >= 2.2   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ch_pp-ocrv3\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ch_pp-ocrv3 --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"ch_pp-ocrv3\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = 
ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - 构造用于文字识别的模块\n\n    - **参数**\n\n      - text_detector_module(str): 文字检测PaddleHub Module名字，如设置为None，则默认使用[ch_pp-ocrv3_det Module](../ch_pp-ocrv3_det/)。其作用为检测图片当中的文本。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.6,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9,\n                       det_db_unclip_ratio=1.5,\n                       det_db_score_mode=\"fast\"):\n    ```\n\n    - 预测API，检测并识别输入图片中的所有中文文本。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - text\\_thresh (float): 识别中文文本置信度的阈值；\n      - angle_classification_thresh(float): 文本角度分类置信度的阈值\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - det\\_db\\_unclip\\_ratio (float): 设置检测框的大小；\n      - det\\_db\\_score\\_mode (str): 设置检测得分计算方式，“fast” / “slow”\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n          如果无识别结果则data为\\[\\]\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ 
hub serving start -m ch_pp-ocrv3\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ch_pp-ocrv3\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio App 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/ch_pp-ocrv3 在浏览器中访问 ch_pp-ocrv3 的 Gradio App。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n* 1.2.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install ch_pp-ocrv3==1.2.0\n    ```\n"
  },
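The serving example in the README above transmits images as base64 strings inside a JSON body. A standard-library-only sketch of that payload construction (`make_payload` is a hypothetical name; the README's version encodes a JPEG buffer produced by `cv2.imencode`):

```python
import base64
import json


def make_payload(image_bytes_list):
    """Build the JSON body expected by PaddleHub Serving:
    {'images': [<base64-encoded image>, ...]}."""
    return json.dumps({'images': [base64.b64encode(b).decode('utf8') for b in image_bytes_list]})
```

`requests.post(url, headers={"Content-type": "application/json"}, data=make_payload([...]))` then mirrors the README's request.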
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3/character.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport string\n\nimport numpy as np\n\n\nclass CharacterOps(object):\n    \"\"\" Convert between text-label and text-index\n    Args:\n        config: config from yaml file\n    \"\"\"\n\n    def __init__(self, config):\n        self.character_type = config['character_type']\n        self.max_text_len = config['max_text_length']\n        if self.character_type == \"en\":\n            self.character_str = \"0123456789abcdefghijklmnopqrstuvwxyz\"\n            dict_character = list(self.character_str)\n        # use the custom dictionary\n        elif self.character_type == \"ch\":\n            character_dict_path = config['character_dict_path']\n            add_space = False\n            if 'use_space_char' in config:\n                add_space = config['use_space_char']\n            self.character_str = []\n            with open(character_dict_path, \"rb\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    line = line.decode('utf-8').strip(\"\\n\").strip(\"\\r\\n\")\n                    self.character_str.append(line)\n            if add_space:\n                self.character_str.append(\" \")\n            dict_character = list(self.character_str)\n        elif self.character_type == \"en_sensitive\":\n            # same with ASTER setting (use 94 char).\n            
self.character_str = string.printable[:-6]\n            dict_character = list(self.character_str)\n        else:\n            raise NotImplementedError(\"character_type %s is not supported\" % self.character_type)\n        self.beg_str = \"sos\"\n        self.end_str = \"eos\"\n\n        dict_character = self.add_special_char(dict_character)\n        self.dict = {}\n        for i, char in enumerate(dict_character):\n            self.dict[char] = i\n        self.character = dict_character\n\n    def add_special_char(self, dict_character):\n        dict_character = ['blank'] + dict_character\n        return dict_character\n\n    def encode(self, text):\n        \"\"\"convert text-label into text-index.\n        input:\n            text: text labels of each image. [batch_size]\n\n        output:\n            text: concatenated text index for CTCLoss.\n                    [sum(text_lengths)] = [text_index_0 + text_index_1 + ... + text_index_(n - 1)]\n        \"\"\"\n        if self.character_type == \"en\":\n            text = text.lower()\n\n        text_list = []\n        for char in text:\n            if char not in self.dict:\n                continue\n            text_list.append(self.dict[char])\n        text = np.array(text_list)\n        return text\n\n    def decode(self, text_index, text_prob=None, is_remove_duplicate=False):\n        \"\"\" convert text-index into text-label. 
\"\"\"\n        result_list = []\n        ignored_tokens = self.get_ignored_tokens()\n        batch_size = len(text_index)\n        for batch_idx in range(batch_size):\n            selection = np.ones(len(text_index[batch_idx]), dtype=bool)\n            if is_remove_duplicate:\n                selection[1:] = text_index[batch_idx][1:] != text_index[batch_idx][:-1]\n            for ignored_token in ignored_tokens:\n                selection &= text_index[batch_idx] != ignored_token\n            char_list = [self.character[text_id] for text_id in text_index[batch_idx][selection]]\n            if text_prob is not None:\n                conf_list = text_prob[batch_idx][selection]\n            else:\n                conf_list = [1] * len(selection)\n            if len(conf_list) == 0:\n                conf_list = [0]\n\n            text = ''.join(char_list)\n            result_list.append((text, np.mean(conf_list).tolist()))\n        return result_list\n\n    def get_char_num(self):\n        return len(self.character)\n\n    def get_beg_end_flag_idx(self, beg_or_end):\n        if self.loss_type == \"attention\":\n            if beg_or_end == \"beg\":\n                idx = np.array(self.dict[self.beg_str])\n            elif beg_or_end == \"end\":\n                idx = np.array(self.dict[self.end_str])\n            else:\n                assert False, \"Unsupport type %s in get_beg_end_flag_idx\"\\\n                    % beg_or_end\n            return idx\n        else:\n            err = \"error in get_beg_end_flag_idx when using the loss %s\"\\\n                % (self.loss_type)\n            assert False, err\n\n    def get_ignored_tokens(self):\n        return [0]  # for ctc blank\n\n\ndef cal_predicts_accuracy(char_ops, preds, preds_lod, labels, labels_lod, is_remove_duplicate=False):\n    \"\"\"\n    Calculate prediction accuracy\n    Args:\n        char_ops: CharacterOps\n        preds: preds result,text index\n        preds_lod: lod tensor of preds\n        
labels: label of input image, text index\n        labels_lod: lod tensor of label\n        is_remove_duplicate: Whether to remove duplicate characters.\n                                 The default is False\n    Return:\n        acc: The accuracy of test set\n        acc_num: The correct number of samples predicted\n        img_num: The total sample number of the test set\n    \"\"\"\n    acc_num = 0\n    img_num = 0\n    for ino in range(len(labels_lod) - 1):\n        beg_no = preds_lod[ino]\n        end_no = preds_lod[ino + 1]\n        preds_text = preds[beg_no:end_no].reshape(-1)\n        preds_text = char_ops.decode(preds_text, is_remove_duplicate=is_remove_duplicate)\n\n        beg_no = labels_lod[ino]\n        end_no = labels_lod[ino + 1]\n        labels_text = labels[beg_no:end_no].reshape(-1)\n        labels_text = char_ops.decode(labels_text, is_remove_duplicate=is_remove_duplicate)\n        img_num += 1\n\n        if preds_text == labels_text:\n            acc_num += 1\n    acc = acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef cal_predicts_accuracy_srn(char_ops, preds, labels, max_text_len, is_debug=False):\n    acc_num = 0\n    img_num = 0\n\n    char_num = char_ops.get_char_num()\n\n    total_len = preds.shape[0]\n    img_num = int(total_len / max_text_len)\n    for i in range(img_num):\n        cur_label = []\n        cur_pred = []\n        for j in range(max_text_len):\n            if labels[j + i * max_text_len] != int(char_num - 1):  #0\n                cur_label.append(labels[j + i * max_text_len][0])\n            else:\n                break\n\n        for j in range(max_text_len + 1):\n            if j < len(cur_label) and preds[j + i * max_text_len][0] != cur_label[j]:\n                break\n            elif j == len(cur_label) and j == max_text_len:\n                acc_num += 1\n                break\n            elif j == len(cur_label) and preds[j + i * max_text_len][0] == int(char_num - 1):\n                acc_num += 1\n                break\n    acc = 
acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef convert_rec_attention_infer_res(preds):\n    img_num = preds.shape[0]\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        end_pos = np.where(preds[ino, :] == 1)[0]\n        if len(end_pos) <= 1:\n            text_list = preds[ino, 1:]\n        else:\n            text_list = preds[ino, 1:end_pos[1]]\n        target_lod.append(target_lod[ino] + len(text_list))\n        convert_ids = convert_ids + list(text_list)\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n\n\ndef convert_rec_label_to_lod(ori_labels):\n    img_num = len(ori_labels)\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        target_lod.append(target_lod[ino] + len(ori_labels[ino]))\n        convert_ids = convert_ids + list(ori_labels[ino])\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n"
  },
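`CharacterOps.decode` above performs greedy CTC decoding: collapse consecutive duplicate indices, then drop the blank token (index 0 after `add_special_char`). A minimal NumPy sketch of the same post-processing (`ctc_greedy_decode` is a hypothetical name, not part of the module):

```python
import numpy as np


def ctc_greedy_decode(text_index, characters, blank=0):
    """Collapse repeats, then drop blanks -- the is_remove_duplicate=True path."""
    results = []
    for seq in text_index:
        seq = np.asarray(seq)
        keep = np.ones(len(seq), dtype=bool)
        keep[1:] = seq[1:] != seq[:-1]  # drop consecutive duplicates
        keep &= seq != blank            # drop CTC blanks
        results.append(''.join(characters[i] for i in seq[keep]))
    return results
```

The order matters: duplicates are removed before blanks, so `a a blank a` decodes to `aa`, not `a`.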
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport copy\nimport math\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.inference as paddle_infer\nfrom PIL import Image\n\nimport paddlehub as hub\nfrom .character import CharacterOps\nfrom .utils import base64_to_cv2\nfrom .utils import draw_ocr\nfrom .utils import get_image_ext\nfrom .utils import sorted_boxes\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.utils import logger\n\n\n@moduleinfo(\n    name=\"ch_pp-ocrv3\",\n    version=\"1.2.0\",\n    summary=\"The module can recognize the chinese texts in an image. Firstly, it will detect the text box positions \\\n        based on the differentiable_binarization_chn module. Then it classifies the text angle and recognizes the chinese texts. 
\",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChPPOCRv3:\n\n    def __init__(self, text_detector_module=None, enable_mkldnn=False):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.character_dict_path = os.path.join(self.directory, 'assets', 'ppocr_keys_v1.txt')\n        char_ops_params = {\n            'character_type': 'ch',\n            'character_dict_path': self.character_dict_path,\n            'loss_type': 'ctc',\n            'max_text_length': 25,\n            'use_space_char': True\n        }\n        self.char_ops = CharacterOps(char_ops_params)\n        self.rec_image_shape = [3, 48, 320]\n        self._text_detector_module = text_detector_module\n        self.font_file = os.path.join(self.directory, 'assets', 'simfang.ttf')\n        self.enable_mkldnn = enable_mkldnn\n\n        self.rec_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'ppocrv3_rec')\n        self.cls_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'ppocr_cls')\n        self.rec_predictor, self.rec_input_tensor, self.rec_output_tensors = self._set_config(\n            self.rec_pretrained_model_path)\n        self.cls_predictor, self.cls_input_tensor, self.cls_output_tensors = self._set_config(\n            self.cls_pretrained_model_path)\n\n    def _set_config(self, pretrained_model_path):\n        \"\"\"\n        predictor config path\n        \"\"\"\n        model_file_path = pretrained_model_path + '.pdmodel'\n        params_file_path = pretrained_model_path + '.pdiparams'\n\n        config = paddle_infer.Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            
config.disable_gpu()\n            if self.enable_mkldnn:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n\n        predictor = paddle_infer.create_predictor(config)\n\n        input_names = predictor.get_input_names()\n        input_handle = predictor.get_input_handle(input_names[0])\n        output_names = predictor.get_output_names()\n        output_handles = []\n        for output_name in output_names:\n            output_handle = predictor.get_output_handle(output_name)\n            output_handles.append(output_handle)\n\n        return predictor, input_handle, output_handles\n\n    @property\n    def text_detector_module(self):\n        \"\"\"\n        text detect module\n        \"\"\"\n        if not self._text_detector_module:\n            self._text_detector_module = hub.Module(name='ch_pp-ocrv3_det',\n                                                    enable_mkldnn=self.enable_mkldnn,\n                                                    version='1.1.0')\n        return self._text_detector_module\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img is None:\n                logger.info(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        return images\n\n    def get_rotate_crop_image(self, img, points):\n        '''\n        img_height, img_width = img.shape[0:2]\n        left = int(np.min(points[:, 0]))\n        right = int(np.max(points[:, 0]))\n        top = int(np.min(points[:, 1]))\n        bottom = int(np.max(points[:, 
1]))\n        img_crop = img[top:bottom, left:right, :].copy()\n        points[:, 0] = points[:, 0] - left\n        points[:, 1] = points[:, 1] - top\n        '''\n        img_crop_width = int(max(np.linalg.norm(points[0] - points[1]), np.linalg.norm(points[2] - points[3])))\n        img_crop_height = int(max(np.linalg.norm(points[0] - points[3]), np.linalg.norm(points[1] - points[2])))\n        pts_std = np.float32([[0, 0], [img_crop_width, 0], [img_crop_width, img_crop_height], [0, img_crop_height]])\n        M = cv2.getPerspectiveTransform(points, pts_std)\n        dst_img = cv2.warpPerspective(img,\n                                      M, (img_crop_width, img_crop_height),\n                                      borderMode=cv2.BORDER_REPLICATE,\n                                      flags=cv2.INTER_CUBIC)\n        dst_img_height, dst_img_width = dst_img.shape[0:2]\n        if dst_img_height * 1.0 / dst_img_width >= 1.5:\n            dst_img = np.rot90(dst_img)\n        return dst_img\n\n    def resize_norm_img_rec(self, img, max_wh_ratio):\n        imgC, imgH, imgW = self.rec_image_shape\n        assert imgC == img.shape[2]\n        imgW = int((imgH * max_wh_ratio))\n        h, w = img.shape[:2]\n        ratio = w / float(h)\n        if math.ceil(imgH * ratio) > imgW:\n            resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def resize_norm_img_cls(self, img):\n        cls_image_shape = [3, 48, 192]\n        imgC, imgH, imgW = cls_image_shape\n        h = img.shape[0]\n        w = img.shape[1]\n        ratio = w / float(h)\n    
    if math.ceil(imgH * ratio) > imgW:\n            resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        if cls_image_shape[0] == 1:\n            resized_image = resized_image / 255\n            resized_image = resized_image[np.newaxis, :]\n        else:\n            resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def recognize_text(self,\n                       images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.6,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9,\n                       det_db_unclip_ratio=1.5,\n                       det_db_score_mode=\"fast\"):\n        \"\"\"\n        Get the Chinese texts in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. Used when ``paths`` is not set.\n            paths (list[str]): The paths of images. 
Used when ``images`` is not set.\n            use_gpu (bool): Whether to use gpu.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n            box_thresh(float): the threshold of the detected text box's confidence\n            text_thresh(float): the threshold of the Chinese text recognition confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n            det_db_unclip_ratio(float): unclip ratio for post processing in DB detection.\n            det_db_score_mode(str): method to calculate the final detection score, one of 'fast' (using the box) and 'slow' (using the polygon).\n        Returns:\n            res (list): The result of Chinese texts and save path of images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        self.use_gpu = use_gpu\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is no image to be predicted. 
Please check the input data.\"\n\n        detection_results = self.text_detector_module.detect_text(images=predicted_data,\n                                                                  use_gpu=self.use_gpu,\n                                                                  box_thresh=box_thresh,\n                                                                  det_db_unclip_ratio=det_db_unclip_ratio,\n                                                                  det_db_score_mode=det_db_score_mode)\n\n        boxes = [np.array(item['data']).astype(np.float32) for item in detection_results]\n        all_results = []\n        for index, img_boxes in enumerate(boxes):\n            original_image = predicted_data[index].copy()\n            result = {'save_path': ''}\n            if img_boxes.size == 0:\n                result['data'] = []\n            else:\n                img_crop_list = []\n                boxes = sorted_boxes(img_boxes)\n                for num_box in range(len(boxes)):\n                    tmp_box = copy.deepcopy(boxes[num_box])\n                    img_crop = self.get_rotate_crop_image(original_image, tmp_box)\n                    img_crop_list.append(img_crop)\n                img_crop_list, angle_list = self._classify_text(img_crop_list,\n                                                                angle_classification_thresh=angle_classification_thresh)\n                rec_results = self._recognize_text(img_crop_list)\n\n                # if the recognized text confidence score is lower than text_thresh, then drop it\n                rec_res_final = []\n                for index, res in enumerate(rec_results):\n                    text, score = res\n                    if score >= text_thresh:\n                        rec_res_final.append({\n                            'text': text,\n                            'confidence': float(score),\n                            'text_box_position': boxes[index].astype(np.int64).tolist()\n   
                     })\n                result['data'] = rec_res_final\n\n                if visualization and result['data']:\n                    result['save_path'] = self.save_result_image(original_image, boxes, rec_results, output_dir,\n                                                                 text_thresh)\n            all_results.append(result)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    def save_result_image(\n        self,\n        original_image,\n        detection_boxes,\n        rec_results,\n        output_dir='ocr_result',\n        text_thresh=0.5,\n    ):\n        image = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n        txts = [item[0] for item in rec_results]\n        scores = [item[1] for item in rec_results]\n        draw_img = draw_ocr(image,\n                            detection_boxes,\n                            txts,\n                            scores,\n                            font_file=self.font_file,\n                            draw_txt=True,\n                            drop_score=text_thresh)\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        ext = get_image_ext(original_image)\n        saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n        save_file_path = os.path.join(output_dir, saved_name)\n        cv2.imwrite(save_file_path, draw_img[:, :, ::-1])\n        return save_file_path\n\n    def _classify_text(self, image_list, angle_classification_thresh=0.9):\n        img_list = copy.deepcopy(image_list)\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            
width_list.append(img.shape[1] / float(img.shape[0]))\n        # Sorting can speed up the cls process\n        indices = np.argsort(np.array(width_list))\n\n        cls_res = [['', 0.0]] * img_num\n        batch_num = 6\n        for beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            max_wh_ratio = 0\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_cls(img_list[indices[ino]])\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n            norm_img_batch = np.concatenate(norm_img_batch)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.cls_input_tensor.copy_from_cpu(norm_img_batch)\n            self.cls_predictor.run()\n\n            prob_out = self.cls_output_tensors[0].copy_to_cpu()\n            ## post process\n            label_list = ['0', '180']\n            pred_idxs = prob_out.argmax(axis=1)\n            cls_result = [(label_list[idx], prob_out[i, idx]) for i, idx in enumerate(pred_idxs)]\n            for rno in range(len(cls_result)):\n                label, score = cls_result[rno]\n                cls_res[indices[beg_img_no + rno]] = [label, score]\n                if '180' in label and score > angle_classification_thresh:\n                    img_list[indices[beg_img_no + rno]] = cv2.rotate(img_list[indices[beg_img_no + rno]], 1)\n        return img_list, cls_res\n\n    def _recognize_text(self, img_list):\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            width_list.append(img.shape[1] / float(img.shape[0]))\n        # 
Sorting can speed up the recognition process\n        indices = np.argsort(np.array(width_list))\n\n        rec_res = [['', 0.0]] * img_num\n        batch_num = 6\n        for beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            imgC, imgH, imgW = self.rec_image_shape\n            max_wh_ratio = imgW / imgH\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_rec(img_list[indices[ino]], max_wh_ratio)\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n\n            norm_img_batch = np.concatenate(norm_img_batch, axis=0)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.rec_input_tensor.copy_from_cpu(norm_img_batch)\n            self.rec_predictor.run()\n\n            ##\n            outputs = []\n            for output_tensor in self.rec_output_tensors:\n                output = output_tensor.copy_to_cpu()\n                outputs.append(output)\n            if len(outputs) != 1:\n                preds = outputs\n            else:\n                preds = outputs[0]\n            if isinstance(preds, tuple) or isinstance(preds, list):\n                preds = preds[-1]\n            if isinstance(preds, paddle.Tensor):\n                preds = preds.numpy()\n            preds_idx = preds.argmax(axis=2)\n            preds_prob = preds.max(axis=2)\n            rec_result = self.char_ops.decode(preds_idx, preds_prob, is_remove_duplicate=True)\n            for rno in range(len(rec_result)):\n                rec_res[indices[beg_img_no + rno]] = rec_result[rno]\n\n        return rec_res\n\n    @runnable\n    def run_cmd(self, argvs):\n        
\"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.recognize_text(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      det_db_unclip_ratio=args.det_db_unclip_ratio,\n                                      det_db_score_mode=args.det_db_score_mode,\n                                      visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='ocr_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           
type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--det_db_unclip_ratio',\n                                           type=float,\n                                           default=1.5,\n                                           help=\"unclip ratio for post processing in DB detection.\")\n        self.arg_config_group.add_argument(\n            '--det_db_score_mode',\n            type=str,\n            default=\"fast\",\n            help=\"method to calc the final det score, one of fast(using box) and slow(using poly).\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"path to input image\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(image, use_gpu=False, box_thresh=0.5, text_thresh=0.5, angle_classification_thresh=0.9):\n            return self.recognize_text(paths=[image],\n                                       use_gpu=use_gpu,\n                                       output_dir=None,\n                                       visualization=False,\n                                       box_thresh=box_thresh,\n                                       text_thresh=text_thresh,\n                                       angle_classification_thresh=angle_classification_thresh)\n\n        return gr.Interface(inference, [\n            gr.Image(type='filepath'),\n            gr.Checkbox(),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.5, step=0.01)\n        ], [gr.JSON(label='results')],\n                            title='ch_pp-ocrv3',\n                            allow_flagging=False)\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ch_pp-ocrv3\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('ocr_result')\n\n    def test_recognize_text1(self):\n        results = self.module.recognize_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.9509768486022949,\n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9943074584007263,\n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text2(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.9509768486022949,\n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 
0.9943074584007263,\n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text3(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.9509768486022949,\n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9943074584007263,\n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text4(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.9509768486022949,\n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9943074584007263,\n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text5(self):\n        self.assertRaises(AttributeError, self.module.recognize_text, images=['tests/test.jpg'])\n\n    def test_recognize_text6(self):\n        self.assertRaises(AssertionError, self.module.recognize_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/model.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/_text_detector_module.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model/_text_detector_module.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3/utils.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport math\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\nfrom PIL import ImageFont\n\n\ndef draw_ocr(image, boxes, txts, scores, font_file, draw_txt=True, drop_score=0.5):\n    \"\"\"\n    Visualize the results of OCR detection and recognition\n    args:\n        image(Image|array): RGB image\n        boxes(list): boxes with shape(N, 4, 2)\n        txts(list): the texts\n        scores(list): txxs corresponding scores\n        draw_txt(bool): whether draw text or not\n        drop_score(float): only scores greater than drop_threshold will be visualized\n    return(array):\n        the visualized img\n    \"\"\"\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score or math.isnan(score):\n            continue\n        box = np.reshape(np.array(box), [-1, 1, 2]).astype(np.int64)\n        image = cv2.polylines(np.array(image), [box], True, (255, 0, 0), 2)\n\n    if draw_txt:\n        img = np.array(resize_img(image, input_size=600))\n        txt_img = text_visual(txts, scores, font_file, img_h=img.shape[0], img_w=600, threshold=drop_score)\n      
  img = np.concatenate([np.array(img), np.array(txt_img)], axis=1)\n        return img\n    return image\n\n\ndef text_visual(texts, scores, font_file, img_h=400, img_w=600, threshold=0.):\n    \"\"\"\n    create new blank img and draw txt on it\n    args:\n        texts(list): the text will be draw\n        scores(list|None): corresponding score of each txt\n        img_h(int): the height of blank img\n        img_w(int): the width of blank img\n    return(array):\n    \"\"\"\n    if scores is not None:\n        assert len(texts) == len(scores), \"The number of txts and corresponding scores must match\"\n\n    def create_blank_img():\n        blank_img = np.ones(shape=[img_h, img_w], dtype=np.uint8) * 255\n        blank_img[:, img_w - 1:] = 0\n        blank_img = Image.fromarray(blank_img).convert(\"RGB\")\n        draw_txt = ImageDraw.Draw(blank_img)\n        return blank_img, draw_txt\n\n    blank_img, draw_txt = create_blank_img()\n\n    font_size = 20\n    txt_color = (0, 0, 0)\n    font = ImageFont.truetype(font_file, font_size, encoding=\"utf-8\")\n\n    gap = font_size + 5\n    txt_img_list = []\n    count, index = 1, 0\n    for idx, txt in enumerate(texts):\n        index += 1\n        if scores[idx] < threshold or math.isnan(scores[idx]):\n            index -= 1\n            continue\n        first_line = True\n        while str_count(txt) >= img_w // font_size - 4:\n            tmp = txt\n            txt = tmp[:img_w // font_size - 4]\n            if first_line:\n                new_txt = str(index) + ': ' + txt\n                first_line = False\n            else:\n                new_txt = '    ' + txt\n            draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n            txt = tmp[img_w // font_size - 4:]\n            if count >= img_h // gap - 1:\n                txt_img_list.append(np.array(blank_img))\n                blank_img, draw_txt = create_blank_img()\n                count = 0\n            count += 1\n        if 
first_line:\n            new_txt = str(index) + ': ' + txt + '   ' + '%.3f' % (scores[idx])\n        else:\n            new_txt = \"  \" + txt + \"  \" + '%.3f' % (scores[idx])\n        draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n        # whether add new blank img or not\n        if count >= img_h // gap - 1 and idx + 1 < len(texts):\n            txt_img_list.append(np.array(blank_img))\n            blank_img, draw_txt = create_blank_img()\n            count = 0\n        count += 1\n    txt_img_list.append(np.array(blank_img))\n    if len(txt_img_list) == 1:\n        blank_img = np.array(txt_img_list[0])\n    else:\n        blank_img = np.concatenate(txt_img_list, axis=1)\n    return np.array(blank_img)\n\n\ndef str_count(s):\n    \"\"\"\n    Count the number of Chinese characters,\n    a single English character and a single number\n    equal to half the length of Chinese characters.\n    args:\n        s(string): the input of string\n    return(int):\n        the number of Chinese characters\n    \"\"\"\n    import string\n    count_zh = count_pu = 0\n    s_len = len(s)\n    en_dg_count = 0\n    for c in s:\n        if c in string.ascii_letters or c.isdigit() or c.isspace():\n            en_dg_count += 1\n        elif c.isalpha():\n            count_zh += 1\n        else:\n            count_pu += 1\n    return s_len - math.ceil(en_dg_count / 2)\n\n\ndef resize_img(img, input_size=600):\n    img = np.array(img)\n    im_shape = img.shape\n    im_size_min = np.min(im_shape[0:2])\n    im_size_max = np.max(im_shape[0:2])\n    im_scale = float(input_size) / float(im_size_max)\n    im = cv2.resize(img, None, None, fx=im_scale, fy=im_scale)\n    return im\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef sorted_boxes(dt_boxes):\n    \"\"\"\n    Sort text boxes in order from top to bottom, left to right\n    args:\n        dt_boxes(array):detected text boxes with shape [4, 2]\n    
return:\n        sorted boxes(array) with shape [4, 2]\n    \"\"\"\n    num_boxes = dt_boxes.shape[0]\n    sorted_boxes = sorted(dt_boxes, key=lambda x: (x[0][1], x[0][0]))\n    _boxes = list(sorted_boxes)\n\n    for i in range(num_boxes - 1):\n        if abs(_boxes[i + 1][0][1] - _boxes[i][0][1]) < 10 and \\\n                (_boxes[i + 1][0][0] < _boxes[i][0][0]):\n            tmp = _boxes[i]\n            _boxes[i] = _boxes[i + 1]\n            _boxes[i + 1] = tmp\n    return _boxes\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3_det/README.md",
    "content": "# ch_pp-ocrv3_det\n\n|模型名称|ch_pp-ocrv3_det|\n| :--- | :---: |\n|类别|图像-文字检测|\n|网络|Differentiable Binarization|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|3.7MB|\n|最新更新日期|2022-05-11|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/22424850/167821705-f38496ef-daae-4de1-9363-3df20424f525.jpg\" width=\"500\" alt=\"package\" >\n\n</p>\n\n- ### 模型介绍\n\n  - DB（Differentiable Binarization）是一种基于分割的文本检测算法。此类算法可以更好地处理弯曲等不规则形状文本，因此检测效果往往会更好。但其后处理步骤中将分割结果转化为检测框的流程复杂，耗时严重。DB将二值化阈值加入训练中学习，可以获得更准确的检测边界，从而简化后处理流程。该Module是PP-OCRv3的检测模型，对PP-OCRv2中的CML（Collaborative Mutual Learning) 协同互学习文本检测蒸馏策略进行了升级。\n\n<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.5/doc/ppocrv3_framework.png\" width=\"800\" hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[PP-OCRv3](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.5/doc/doc_ch/PP-OCRv3_introduction.md)\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.2  \n\n  - paddlehub >= 2.2   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ch_pp-ocrv3_det\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ch_pp-ocrv3_det --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    text_detector = hub.Module(name=\"ch_pp-ocrv3_det\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = text_detector.detect_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result 
=text_detector.detect_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(enable_mkldnn=False)\n    ```\n\n    - 构造检测模块的对象\n\n    - **参数**\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n\n  - ```python\n    def detect_text(images=[],\n                    paths=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    visualization=False,\n                    box_thresh=0.6,\n                    det_db_unclip_ratio=1.5,\n                    det_db_score_mode=\"fast\")\n    ```\n\n    - 预测API，检测输入图片中的所有中文文本的位置。\n\n    - **参数**\n\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - paths (list\\[str\\]): 图片的路径；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；\n      - det\\_db\\_unclip\\_ratio: 设置检测框的大小；\n      - det\\_db\\_score\\_mode: 设置检测得分计算方式，“fast” / “slow”\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list): 检测文本框结果，文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n        - save_path (str): 识别结果的保存路径, 如不保存图片则save_path为''\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ch_pp-ocrv3_det\n    ```\n\n  - 这样就完成了一个文字检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n 
   headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ch_pp-ocrv3_det\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install ch_pp-ocrv3_det==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3_det/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport base64\nimport os\nimport time\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nimport paddle.inference as paddle_infer\nfrom PIL import Image\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.utils import logger\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name=\"ch_pp-ocrv3_det\",\n    version=\"1.1.0\",\n    summary=\n    \"The module 
aims to detect chinese text position in the image, which is based on differentiable_binarization algorithm.\",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChPPOCRv3Det:\n\n    def __init__(self, enable_mkldnn=False):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, 'inference_model', 'ppocrv3_det')\n        self.enable_mkldnn = enable_mkldnn\n\n        self._set_config()\n\n    def check_requirements(self):\n        try:\n            import shapely, pyclipper\n        except:\n            raise ImportError(\n                'This module requires the shapely, pyclipper tools. The running environment does not meet the requirements. Please install the two packages.'\n            )\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model_file_path = self.pretrained_model_path + '.pdmodel'\n        params_file_path = self.pretrained_model_path + '.pdiparams'\n\n        config = paddle_infer.Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(6)\n            if self.enable_mkldnn:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # use zero copy\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n        self.predictor = paddle_infer.create_predictor(config)\n        input_names = 
self.predictor.get_input_names()\n        self.input_tensor = self.predictor.get_input_handle(input_names[0])\n        output_names = self.predictor.get_output_names()\n        self.output_tensors = []\n        for output_name in output_names:\n            output_tensor = self.predictor.get_output_handle(output_name)\n            self.output_tensors.append(output_tensor)\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img is None:\n                logger.info(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        return images\n\n    def order_points_clockwise(self, pts):\n        rect = np.zeros((4, 2), dtype=\"float32\")\n        s = pts.sum(axis=1)\n        rect[0] = pts[np.argmin(s)]\n        rect[2] = pts[np.argmax(s)]\n        diff = np.diff(pts, axis=1)\n        rect[1] = pts[np.argmin(diff)]\n        rect[3] = pts[np.argmax(diff)]\n        return rect\n\n    def clip_det_res(self, points, img_height, img_width):\n        for pno in range(points.shape[0]):\n            points[pno, 0] = int(min(max(points[pno, 0], 0), img_width - 1))\n            points[pno, 1] = int(min(max(points[pno, 1], 0), img_height - 1))\n        return points\n\n    def filter_tag_det_res(self, dt_boxes, image_shape):\n        img_height, img_width = image_shape[0:2]\n        dt_boxes_new = []\n        for box in dt_boxes:\n            box = self.order_points_clockwise(box)\n            box = self.clip_det_res(box, img_height, img_width)\n            rect_width = int(np.linalg.norm(box[0] - box[1]))\n            rect_height = int(np.linalg.norm(box[0] - box[3]))\n            if rect_width <= 3 or rect_height <= 3:\n                continue\n            dt_boxes_new.append(box)\n        dt_boxes = np.array(dt_boxes_new)\n        return 
dt_boxes\n\n    def filter_tag_det_res_only_clip(self, dt_boxes, image_shape):\n        img_height, img_width = image_shape[0:2]\n        dt_boxes_new = []\n        for box in dt_boxes:\n            box = self.clip_det_res(box, img_height, img_width)\n            dt_boxes_new.append(box)\n        dt_boxes = np.array(dt_boxes_new)\n        return dt_boxes\n\n    def detect_text(self,\n                    images=[],\n                    paths=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    visualization=False,\n                    box_thresh=0.6,\n                    det_db_unclip_ratio=1.5,\n                    det_db_score_mode=\"fast\"):\n        \"\"\"\n        Get the text box in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. If images not paths\n            paths (list[str]): The paths of images. If paths not images\n            use_gpu (bool): Whether to use gpu. Default false.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n            box_thresh(float): the threshold of the detected text box's confidence\n            det_db_unclip_ratio(float): unclip ratio for post processing in DB detection.\n            det_db_score_mode(str): method to calc the final det score, one of fast(using box) and slow(using poly).\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        self.check_requirements()\n\n        from .processor import DBProcessTest, DBPostProcess, draw_boxes, get_image_ext\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is no image to predict. Please check the input data.\"\n\n        preprocessor = DBProcessTest(params={'max_side_len': 960})\n        postprocessor = DBPostProcess(\n            params={\n                'thresh': 0.3,\n                'box_thresh': box_thresh,\n                'max_candidates': 1000,\n                'unclip_ratio': det_db_unclip_ratio,\n                'det_db_score_mode': det_db_score_mode,\n            })\n\n        all_imgs = []\n        all_ratios = []\n        all_results = []\n        for original_image in predicted_data:\n            ori_im = original_image.copy()\n            im, ratio_list = preprocessor(original_image)\n            res = {'save_path': ''}\n            if im is None:\n                res['data'] = []\n\n            else:\n                im = im.copy()\n                self.input_tensor.copy_from_cpu(im)\n                self.predictor.run()\n\n                outputs = []\n                for output_tensor in self.output_tensors:\n                    output = output_tensor.copy_to_cpu()\n                    outputs.append(output)\n\n                outs_dict = {}\n                outs_dict['maps'] = outputs[0]\n\n                dt_boxes_list = postprocessor(outs_dict, [ratio_list])\n                boxes = self.filter_tag_det_res(dt_boxes_list[0], original_image.shape)\n                res['data'] = boxes.astype(np.int64).tolist()\n                all_imgs.append(im)\n
                all_ratios.append(ratio_list)\n                if visualization:\n                    img = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n                    draw_img = draw_boxes(img, boxes)\n                    draw_img = np.array(draw_img)\n                    if not os.path.exists(output_dir):\n                        os.makedirs(output_dir)\n                    ext = get_image_ext(original_image)\n                    saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n                    cv2.imwrite(os.path.join(output_dir, saved_name), draw_img[:, :, ::-1])\n                    res['save_path'] = os.path.join(output_dir, saved_name)\n\n            all_results.append(res)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.detect_text(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.detect_text(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   det_db_unclip_ratio=args.det_db_unclip_ratio,\n                                   det_db_score_mode=args.det_db_score_mode,\n                                   visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument('--det_db_unclip_ratio',\n                                           type=float,\n                                           default=1.5,\n                                           help=\"unclip ratio for post processing in DB detection.\")\n        self.arg_config_group.add_argument(\n            '--det_db_score_mode',\n            type=str,\n            default=\"fast\",\n            help=\"method to calculate the final detection score, one of fast (using box) and slow (using poly).\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"path to the input image\")\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3_det/processor.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport sys\n\nimport cv2\nimport numpy as np\nimport pyclipper\nfrom PIL import ImageDraw\nfrom shapely.geometry import Polygon\n\n\nclass DBProcessTest(object):\n    \"\"\"\n    DB pre-process for Test mode\n    \"\"\"\n\n    def __init__(self, params):\n        super(DBProcessTest, self).__init__()\n        self.resize_type = 0\n        if 'test_image_shape' in params:\n            self.image_shape = params['test_image_shape']\n            self.resize_type = 1\n        if 'max_side_len' in params:\n            self.max_side_len = params['max_side_len']\n        else:\n            self.max_side_len = 2400\n\n    def resize_image_type0(self, img):\n        \"\"\"\n        resize image to a size multiple of 32 which is required by the network\n        args:\n            img(array): array with shape [h, w, c]\n        return(tuple):\n            img, (ratio_h, ratio_w)\n        \"\"\"\n        limit_side_len = self.max_side_len\n        h, w, _ = img.shape\n\n        # limit the max side\n        if max(h, w) > limit_side_len:\n            if h > w:\n                ratio = float(limit_side_len) / h\n            else:\n                ratio = float(limit_side_len) / w\n        else:\n            ratio 
= 1.\n        resize_h = int(h * ratio)\n        resize_w = int(w * ratio)\n\n        resize_h = max(int(round(resize_h / 32) * 32), 32)\n        resize_w = max(int(round(resize_w / 32) * 32), 32)\n\n        try:\n            if int(resize_w) <= 0 or int(resize_h) <= 0:\n                return None, (None, None)\n            img = cv2.resize(img, (int(resize_w), int(resize_h)))\n        except cv2.error as e:\n            print('cv2.resize to ({}, {}) failed: {}'.format(resize_w, resize_h, e))\n            sys.exit(0)\n        ratio_h = resize_h / float(h)\n        ratio_w = resize_w / float(w)\n        return img, [ratio_h, ratio_w]\n\n    def resize_image_type1(self, im):\n        resize_h, resize_w = self.image_shape\n        ori_h, ori_w = im.shape[:2]  # (h, w, c)\n        im = cv2.resize(im, (int(resize_w), int(resize_h)))\n        ratio_h = float(resize_h) / ori_h\n        ratio_w = float(resize_w) / ori_w\n        return im, (ratio_h, ratio_w)\n\n    def normalize(self, im):\n        img_mean = [0.485, 0.456, 0.406]\n        img_std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        im = im / 255\n        im[:, :, 0] -= img_mean[0]\n        im[:, :, 1] -= img_mean[1]\n        im[:, :, 2] -= img_mean[2]\n        im[:, :, 0] /= img_std[0]\n        im[:, :, 1] /= img_std[1]\n        im[:, :, 2] /= img_std[2]\n        channel_swap = (2, 0, 1)\n        im = im.transpose(channel_swap)\n        return im\n\n    def __call__(self, im):\n        src_h, src_w, _ = im.shape\n        if self.resize_type == 0:\n            im, (ratio_h, ratio_w) = self.resize_image_type0(im)\n        else:\n            im, (ratio_h, ratio_w) = self.resize_image_type1(im)\n        im = self.normalize(im)\n        im = im[np.newaxis, :]\n        return [im, (src_h, src_w, ratio_h, ratio_w)]\n\n\nclass DBPostProcess(object):\n    \"\"\"\n    The post process for Differentiable Binarization (DB).\n    \"\"\"\n\n    def __init__(self, params):\n        self.thresh = params['thresh']\n        self.box_thresh = 
params['box_thresh']\n        self.max_candidates = params['max_candidates']\n        self.unclip_ratio = params['unclip_ratio']\n        self.score_mode = params['det_db_score_mode']\n        self.min_size = 3\n        self.dilation_kernel = None\n\n    def boxes_from_bitmap(self, pred, _bitmap, dest_width, dest_height):\n        '''\n        _bitmap: single map with shape (1, H, W),\n                whose values are binarized as {0, 1}\n        '''\n\n        bitmap = _bitmap\n        height, width = bitmap.shape\n\n        outs = cv2.findContours((bitmap * 255).astype(np.uint8), cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)\n        if len(outs) == 3:\n            img, contours, _ = outs[0], outs[1], outs[2]\n        elif len(outs) == 2:\n            contours, _ = outs[0], outs[1]\n\n        num_contours = min(len(contours), self.max_candidates)\n\n        boxes = []\n        scores = []\n        for index in range(num_contours):\n            contour = contours[index]\n            points, sside = self.get_mini_boxes(contour)\n            if sside < self.min_size:\n                continue\n            points = np.array(points)\n            if self.score_mode == \"fast\":\n                score = self.box_score_fast(pred, points.reshape(-1, 2))\n            else:\n                score = self.box_score_slow(pred, contour)\n            if self.box_thresh > score:\n                continue\n\n            box = self.unclip(points).reshape(-1, 1, 2)\n            box, sside = self.get_mini_boxes(box)\n            if sside < self.min_size + 2:\n                continue\n            box = np.array(box)\n\n            box[:, 0] = np.clip(np.round(box[:, 0] / width * dest_width), 0, dest_width)\n            box[:, 1] = np.clip(np.round(box[:, 1] / height * dest_height), 0, dest_height)\n            boxes.append(box.astype(np.int64))\n            scores.append(score)\n        return np.array(boxes, dtype=np.int64), scores\n\n    def unclip(self, box):\n        unclip_ratio = 
self.unclip_ratio\n        poly = Polygon(box)\n        distance = poly.area * unclip_ratio / poly.length\n        offset = pyclipper.PyclipperOffset()\n        offset.AddPath(box, pyclipper.JT_ROUND, pyclipper.ET_CLOSEDPOLYGON)\n        expanded = np.array(offset.Execute(distance))\n        return expanded\n\n    def get_mini_boxes(self, contour):\n        bounding_box = cv2.minAreaRect(contour)\n        points = sorted(list(cv2.boxPoints(bounding_box)), key=lambda x: x[0])\n\n        index_1, index_2, index_3, index_4 = 0, 1, 2, 3\n        if points[1][1] > points[0][1]:\n            index_1 = 0\n            index_4 = 1\n        else:\n            index_1 = 1\n            index_4 = 0\n        if points[3][1] > points[2][1]:\n            index_2 = 2\n            index_3 = 3\n        else:\n            index_2 = 3\n            index_3 = 2\n\n        box = [points[index_1], points[index_2], points[index_3], points[index_4]]\n        return box, min(bounding_box[1])\n\n    def box_score_fast(self, bitmap, _box):\n        '''\n        box_score_fast: use the mean score inside the bounding box as the box score\n        '''\n        h, w = bitmap.shape[:2]\n        box = _box.copy()\n        xmin = np.clip(np.floor(box[:, 0].min()).astype(np.int64), 0, w - 1)\n        xmax = np.clip(np.ceil(box[:, 0].max()).astype(np.int64), 0, w - 1)\n        ymin = np.clip(np.floor(box[:, 1].min()).astype(np.int64), 0, h - 1)\n        ymax = np.clip(np.ceil(box[:, 1].max()).astype(np.int64), 0, h - 1)\n\n        mask = np.zeros((ymax - ymin + 1, xmax - xmin + 1), dtype=np.uint8)\n        box[:, 0] = box[:, 0] - xmin\n        box[:, 1] = box[:, 1] - ymin\n        cv2.fillPoly(mask, box.reshape(1, -1, 2).astype(np.int64), 1)\n        return cv2.mean(bitmap[ymin:ymax + 1, xmin:xmax + 1], mask)[0]\n\n    def box_score_slow(self, bitmap, contour):\n        '''\n        box_score_slow: use the mean score inside the polygon as the box score\n        '''\n        h, w = bitmap.shape[:2]\n        contour = contour.copy()\n        contour = np.reshape(contour, (-1, 2))\n\n        xmin = np.clip(np.min(contour[:, 0]), 0, w - 1)\n        xmax = np.clip(np.max(contour[:, 0]), 0, w - 1)\n        ymin = np.clip(np.min(contour[:, 1]), 0, h - 1)\n        ymax = np.clip(np.max(contour[:, 1]), 0, h - 1)\n\n        mask = np.zeros((ymax - ymin + 1, xmax - xmin + 1), dtype=np.uint8)\n\n        contour[:, 0] = contour[:, 0] - xmin\n        contour[:, 1] = contour[:, 1] - ymin\n\n        cv2.fillPoly(mask, contour.reshape(1, -1, 2).astype(np.int64), 1)\n        return cv2.mean(bitmap[ymin:ymax + 1, xmin:xmax + 1], mask)[0]\n\n    def __call__(self, outs_dict, shape_list):\n        pred = outs_dict['maps']\n\n        pred = pred[:, 0, :, :]\n        segmentation = pred > self.thresh\n\n        boxes_batch = []\n        for batch_index in range(pred.shape[0]):\n            src_h, src_w, ratio_h, ratio_w = shape_list[batch_index]\n\n            mask = segmentation[batch_index]\n            tmp_boxes, tmp_scores = self.boxes_from_bitmap(pred[batch_index], mask, src_w, src_h)\n\n            boxes_batch.append(tmp_boxes)\n        return boxes_batch\n\n\ndef draw_boxes(image, boxes, scores=None, drop_score=0.5):\n    img = image.copy()\n    draw = ImageDraw.Draw(img)\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score:\n            continue\n        draw.line([(box[0][0], box[0][1]), (box[1][0], box[1][1])], fill='red')\n        draw.line([(box[1][0], box[1][1]), (box[2][0], box[2][1])], fill='red')\n        draw.line([(box[2][0], box[2][1]), (box[3][0], box[3][1])], fill='red')\n        draw.line([(box[3][0], box[3][1]), (box[0][0], box[0][1])], fill='red')\n        draw.line([(box[0][0] - 1, box[0][1] + 1), (box[1][0] - 1, box[1][1] + 1)], fill='red')\n        draw.line([(box[1][0] - 1, box[1][1] + 1), (box[2][0] - 1, box[2][1] + 1)], fill='red')\n        draw.line([(box[2][0] - 1, box[2][1] + 1), (box[3][0] - 1, box[3][1] + 
1)], fill='red')\n        draw.line([(box[3][0] - 1, box[3][1] + 1), (box[0][0] - 1, box[0][1] + 1)], fill='red')\n    return img\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3_det/requirements.txt",
    "content": "shapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/ch_pp-ocrv3_det/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"ch_pp-ocrv3_det\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_detect_text1(self):\n        results = self.module.detect_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[261, 202], [376, 202], [376, 239], [261, 239]], [[283, 162], [352, 162], [352, 202], [283, 202]]])\n\n    def test_detect_text2(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[261, 202], [376, 202], [376, 239], [261, 239]], [[283, 162], [352, 162], [352, 202], [283, 202]]])\n\n    def test_detect_text3(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[261, 202], [376, 202], [376, 239], [261, 239]], [[283, 162], 
[352, 162], [352, 202], [283, 202]]])\n\n    def test_detect_text4(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[261, 202], [376, 202], [376, 239], [261, 239]], [[283, 162], [352, 162], [352, 202], [283, 202]]])\n\n    def test_detect_text5(self):\n        self.assertRaises(AttributeError, self.module.detect_text, images=['tests/test.jpg'])\n\n    def test_detect_text6(self):\n        self.assertRaises(AssertionError, self.module.detect_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_cht_ocr_db_crnn_mobile/README.md",
    "content": "# chinese_cht_ocr_db_crnn_mobile\n\n|模型名称|chinese_cht_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - chinese_cht_ocr_db_crnn_mobile Module用于识别图片当中的繁体中文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的繁体中文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别繁体中文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese_cht_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run chinese_cht_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run chinese_cht_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"chinese_cht_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = 
ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造ChineseChtOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - 
```shell\n    $ hub serving start -m chinese_cht_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_cht_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install chinese_cht_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_cht_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/chinese_cht_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"chinese_cht_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass ChineseChtOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"chinese_cht\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is 
[H, W, C], in BGR format. Provide either images or paths, not both.\n            paths (list[str]): The paths of images. Provide either paths or images, not both.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): A map from input name to input shape, e.g. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): The ONNX operator set version to export with.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_cht_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/README.md",
    "content": "# chinese_ocr_db_crnn_mobile\n\n|模型名称|chinese_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+RCNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|16M|\n|最新更新日期|2021-05-31|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - [OCR文字识别场景在线体验](https://www.paddlepaddle.org.cn/hub/scene/ocr)\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133097562-d8c9abd1-6c70-4d93-809f-fa4735764836.png\"  width = \"600\" hspace='10'/> <br />\n</p>\n\n- ### 模型介绍\n\n  - chinese_ocr_db_crnn_mobile Module用于识别图片当中的汉字。其基于[chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/)检测得到的文本框，识别文本框中的中文文字,再对检测文本框进行角度分类。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。该Module是一个超轻量级中文OCR模型，支持直接预测。\n\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133098254-7c642826-d6d7-4dd0-986e-371622337867.png\" width = \"300\" height = \"450\"  hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.7.2  \n\n  - paddlehub >= 1.6.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **该Module依赖于第三方库shapely和pyclipper，使用该Module之前，请先安装shapely和pyclipper。**  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run chinese_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 
通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"chinese_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - 构造ChineseOCRDBCRNN对象\n\n    - **参数**\n\n      - text_detector_module(str): 文字检测PaddleHub Module名字，如设置为None，则默认使用[chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/)。其作用为检测图片当中的文本。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n\n  - ```python\n    def recognize_text(images=[],\n                        paths=[],\n                        use_gpu=False,\n                        output_dir='ocr_result',\n                        visualization=False,\n                        box_thresh=0.5,\n                        text_thresh=0.5,\n                        angle_classification_thresh=0.9)\n    ```\n\n    - 预测API，检测输入图片中的所有中文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - text\\_thresh (float): 识别中文文本置信度的阈值；\n      - angle_classification_thresh(float): 文本角度分类置信度的阈值\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n  
如果无识别结果则data为\\[\\]\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m chinese_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio App 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/chinese_ocr_db_crnn_mobile 在浏览器中访问 chinese_ocr_db_crnn_mobile 的 Gradio App。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复使用在线服务调用模型失败问题\n\n* 1.0.2\n\n  支持mkldnn加速CPU计算\n\n* 1.1.0\n\n  使用超轻量级的三阶段模型（文本框检测-角度分类-文字识别）识别图片文字。\n\n* 1.1.1\n\n  支持文本中空格识别。\n\n* 1.1.2\n\n  修复只能检出30字段问题。\n\n* 1.1.3\n\n  移除 fluid api\n\n* 1.2.0\n\n  适配 PaddleHub 2.x，添加 Gradio App\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_mobile==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/README_en.md",
    "content": "# chinese_ocr_db_crnn_mobile\n\n|     Module Name      |  chinese_ocr_db_crnn_mobile  |\n|  :------------------ | :------------: |\n|       Category       | image-text_recognition |\n|         Network      |     Differentiable Binarization+RCNN     |\n|         Dataset      | icdar2015 |\n| Fine-tuning supported or not |      No       |\n|     Module Size      |       16M       |\n| Latest update date   |   2021-02-26   |\n|   Data indicators    |       -        |\n\n\n## I. Basic Information of Module\n\n- ### Application Effect Display\n  - [Online experience in OCR text recognition scenarios](https://www.paddlepaddle.org.cn/hub/scene/ocr)\n  - Example result:\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133097562-d8c9abd1-6c70-4d93-809f-fa4735764836.png\"  width = \"600\" hspace='10'/> <br />\n</p>\n\n- ### Module Introduction\n\n  - chinese_ocr_db_crnn_mobile Module is used to identify Chinese characters in pictures. Get the text box after using [chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/), identify the Chinese characters in the text box, and then do angle classification to the detection text box. CRNN(Convolutional Recurrent Neural Network) is adopted as the final recognition algorithm. This Module is an ultra-lightweight Chinese OCR model that supports direct prediction.\n\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133098254-7c642826-d6d7-4dd0-986e-371622337867.png\" width = \"300\" height = \"450\"  hspace='10'/> <br />\n</p>\n\n  - For more information, please refer to:[An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## II. 
Installation\n\n- ### 1、Environment Dependencies  \n\n  - paddlepaddle >= 1.7.2  \n\n  - paddlehub >= 1.6.0   | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **This Module relies on the third-party libraries shapely and pyclipper. Please install shapely and pyclipper before using this Module.**  \n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_mobile\n    ```\n  - If you have problems during installation, please refer to: [windows_quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [linux_quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [mac_quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## III. Module API and Prediction\n\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run chinese_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command line instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"chinese_ocr_db_crnn_mobile\", enable_mkldnn=True)       # MKLDNN acceleration is only available on CPU\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - Construct the ChineseOCRDBCRNN object\n\n    - **Parameter**\n\n      - text_detector_module(str): PaddleHub Module name for text detection; uses the [chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/) by default if set to None. 
Its function is to detect the text in the picture.\n      - enable_mkldnn(bool): Whether to enable MKLDNN to accelerate CPU computing. This parameter is valid only when running on CPU. The default is False.\n\n\n  - ```python\n    def recognize_text(images=[],\n                        paths=[],\n                        use_gpu=False,\n                        output_dir='ocr_result',\n                        visualization=False,\n                        box_thresh=0.5,\n                        text_thresh=0.5,\n                        angle_classification_thresh=0.9)\n    ```\n\n    - Prediction API, recognizing all the Chinese text in the input image.\n\n    - **Parameter**\n\n      - paths (list\\[str\\]): image paths;\n      - images (list\\[numpy.ndarray\\]): image data, ndarray.shape is in the format \\[H, W, C\\], BGR;\n      - use\\_gpu (bool): whether to use GPU. **If GPU is used, set the CUDA_VISIBLE_DEVICES environment variable first;**\n      - box\\_thresh (float): the confidence threshold of text box detection;\n      - text\\_thresh (float): the confidence threshold of Chinese text recognition;\n      - angle_classification_thresh (float): the confidence threshold of text angle classification;\n      - visualization (bool): whether to save the recognition results as picture files;\n      - output\\_dir (str): path to save the image, ocr\\_result by default.\n\n    - **Return**\n\n      - res (list\\[dict\\]): the list of recognition results, where each element is a dict with the following fields:\n        - data (list\\[dict\\]): recognition result; each element in the list is a dict with the following fields:\n          - text (str): the recognized text\n          - confidence (float): the confidence of the result\n          - text_box_position (list): the pixel coordinates of the text box in the original picture, a 4*2 matrix, representing the coordinates of the lower-left, lower-right, upper-right and upper-left vertices of the text box in turn;\n          data is 
\\[\\] if there's no result\n        - save_path (str, optional): path to save the result; save_path is '' if no image is saved.\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online text recognition service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m chinese_ocr_db_crnn_mobile\n    ```\n\n  - This deploys the text recognition API service, with 8866 as the default port number.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n\n- ### Step 2: Send a prediction request\n\n  - After configuring the server, the following lines of code can be used to send a prediction request and obtain the prediction result:\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction result\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio APP support\n   Starting with PaddleHub 2.3.1, the Gradio APP of chinese_ocr_db_crnn_mobile can be accessed in the browser via the link http://127.0.0.1:8866/gradio/chinese_ocr_db_crnn_mobile.\n\n## V. 
Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fixed a failure when invoking the model via the online service\n\n* 1.0.2\n\n  Supports MKLDNN to speed up CPU computing\n\n* 1.1.0\n\n  An ultra-lightweight three-stage model (text box detection - angle classification - text recognition) is used to identify text in images.\n\n* 1.1.1\n\n  Supports recognition of spaces in text.\n\n* 1.1.2\n\n  Fixed an issue where only 30 fields could be detected.\n\n* 1.1.3\n\n  Remove fluid API\n\n* 1.2.0\n\n  Support PaddleHub 2.x version. Add Gradio APP support.\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_mobile==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/character.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport string\n\nimport numpy as np\n\n\nclass CharacterOps(object):\n    \"\"\" Convert between text-label and text-index\n    Args:\n        config: config from yaml file\n    \"\"\"\n\n    def __init__(self, config):\n        self.character_type = config['character_type']\n        self.loss_type = config['loss_type']\n        self.max_text_len = config['max_text_length']\n        if self.character_type == \"en\":\n            self.character_str = \"0123456789abcdefghijklmnopqrstuvwxyz\"\n            dict_character = list(self.character_str)\n        # use the custom dictionary\n        elif self.character_type == \"ch\":\n            character_dict_path = config['character_dict_path']\n            add_space = False\n            if 'use_space_char' in config:\n                add_space = config['use_space_char']\n            self.character_str = \"\"\n            with open(character_dict_path, \"rb\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    line = line.decode('utf-8').strip(\"\\n\").strip(\"\\r\\n\")\n                    self.character_str += line\n            if add_space:\n                self.character_str += \" \"\n            dict_character = list(self.character_str)\n        elif self.character_type == \"en_sensitive\":\n            # same with ASTER setting 
(use 94 char).\n            self.character_str = string.printable[:-6]\n            dict_character = list(self.character_str)\n        else:\n            self.character_str = None\n        assert self.character_str is not None, \\\n            \"Unsupported character type: {}\".format(self.character_type)\n        self.beg_str = \"sos\"\n        self.end_str = \"eos\"\n        # add start and end str for attention\n        if self.loss_type == \"attention\":\n            dict_character = [self.beg_str, self.end_str] + dict_character\n        elif self.loss_type == \"srn\":\n            dict_character = dict_character + [self.beg_str, self.end_str]\n        # create char dict\n        self.dict = {}\n        for i, char in enumerate(dict_character):\n            self.dict[char] = i\n        self.character = dict_character\n\n    def encode(self, text):\n        \"\"\"convert text-label into text-index.\n        input:\n            text: text labels of each image. [batch_size]\n\n        output:\n            text: concatenated text index for CTCLoss.\n                    [sum(text_lengths)] = [text_index_0 + text_index_1 + ... + text_index_(n - 1)]\n            length: length of each text. [batch_size]\n        \"\"\"\n        if self.character_type == \"en\":\n            text = text.lower()\n\n        text_list = []\n        for char in text:\n            if char not in self.dict:\n                continue\n            text_list.append(self.dict[char])\n        text = np.array(text_list)\n        return text\n\n    def decode(self, text_index, is_remove_duplicate=False):\n        \"\"\" convert text-index into text-label. 
\"\"\"\n        char_list = []\n        char_num = self.get_char_num()\n\n        if self.loss_type == \"attention\":\n            beg_idx = self.get_beg_end_flag_idx(\"beg\")\n            end_idx = self.get_beg_end_flag_idx(\"end\")\n            ignored_tokens = [beg_idx, end_idx]\n        else:\n            ignored_tokens = [char_num]\n\n        for idx in range(len(text_index)):\n            if text_index[idx] in ignored_tokens:\n                continue\n            if is_remove_duplicate:\n                if idx > 0 and text_index[idx - 1] == text_index[idx]:\n                    continue\n            char_list.append(self.character[int(text_index[idx])])\n        text = ''.join(char_list)\n        return text\n\n    def get_char_num(self):\n        return len(self.character)\n\n    def get_beg_end_flag_idx(self, beg_or_end):\n        if self.loss_type == \"attention\":\n            if beg_or_end == \"beg\":\n                idx = np.array(self.dict[self.beg_str])\n            elif beg_or_end == \"end\":\n                idx = np.array(self.dict[self.end_str])\n            else:\n                assert False, \"Unsupport type %s in get_beg_end_flag_idx\"\\\n                    % beg_or_end\n            return idx\n        else:\n            err = \"error in get_beg_end_flag_idx when using the loss %s\"\\\n                % (self.loss_type)\n            assert False, err\n\n\ndef cal_predicts_accuracy(char_ops, preds, preds_lod, labels, labels_lod, is_remove_duplicate=False):\n    \"\"\"\n    Calculate prediction accuracy\n    Args:\n        char_ops: CharacterOps\n        preds: preds result,text index\n        preds_lod: lod tensor of preds\n        labels: label of input image, text index\n        labels_lod:  lod tensor of label\n        is_remove_duplicate: Whether to remove duplicate characters,\n                                 The default is False\n    Return:\n        acc: The accuracy of test set\n        acc_num: The correct number of samples 
predicted\n        img_num: The total sample number of the test set\n    \"\"\"\n    acc_num = 0\n    img_num = 0\n    for ino in range(len(labels_lod) - 1):\n        beg_no = preds_lod[ino]\n        end_no = preds_lod[ino + 1]\n        preds_text = preds[beg_no:end_no].reshape(-1)\n        preds_text = char_ops.decode(preds_text, is_remove_duplicate)\n\n        beg_no = labels_lod[ino]\n        end_no = labels_lod[ino + 1]\n        labels_text = labels[beg_no:end_no].reshape(-1)\n        labels_text = char_ops.decode(labels_text, is_remove_duplicate)\n        img_num += 1\n\n        if preds_text == labels_text:\n            acc_num += 1\n    acc = acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef cal_predicts_accuracy_srn(char_ops, preds, labels, max_text_len, is_debug=False):\n    acc_num = 0\n    img_num = 0\n\n    char_num = char_ops.get_char_num()\n\n    total_len = preds.shape[0]\n    img_num = int(total_len / max_text_len)\n    for i in range(img_num):\n        cur_label = []\n        cur_pred = []\n        for j in range(max_text_len):\n            if labels[j + i * max_text_len] != int(char_num - 1):  #0\n                cur_label.append(labels[j + i * max_text_len][0])\n            else:\n                break\n\n        for j in range(max_text_len + 1):\n            if j < len(cur_label) and preds[j + i * max_text_len][0] != cur_label[j]:\n                break\n            elif j == len(cur_label) and j == max_text_len:\n                acc_num += 1\n                break\n            elif j == len(cur_label) and preds[j + i * max_text_len][0] == int(char_num - 1):\n                acc_num += 1\n                break\n    acc = acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef convert_rec_attention_infer_res(preds):\n    img_num = preds.shape[0]\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        end_pos = np.where(preds[ino, :] == 1)[0]\n        if len(end_pos) <= 1:\n            
text_list = preds[ino, 1:]\n        else:\n            text_list = preds[ino, 1:end_pos[1]]\n        target_lod.append(target_lod[ino] + len(text_list))\n        convert_ids = convert_ids + list(text_list)\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n\n\ndef convert_rec_label_to_lod(ori_labels):\n    img_num = len(ori_labels)\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        target_lod.append(target_lod[ino] + len(ori_labels[ino]))\n        convert_ids = convert_ids + list(ori_labels[ino])\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/module.py",
    "content": "import argparse\nimport ast\nimport copy\nimport math\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom PIL import Image\n\nimport paddlehub as hub\nfrom .character import CharacterOps\nfrom .utils import base64_to_cv2\nfrom .utils import draw_ocr\nfrom .utils import get_image_ext\nfrom .utils import sorted_boxes\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese_ocr_db_crnn_mobile\",\n    version=\"1.2.0\",\n    summary=\"The module can recognize the chinese texts in an image. Firstly, it will detect the text box positions \\\n        based on the differentiable_binarization_chn module. Then it classifies the text angle and recognizes the chinese texts. \",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChineseOCRDBCRNN:\n\n    def __init__(self, text_detector_module=None, enable_mkldnn=False):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.character_dict_path = os.path.join(self.directory, 'assets', 'ppocr_keys_v1.txt')\n        char_ops_params = {\n            'character_type': 'ch',\n            'character_dict_path': self.character_dict_path,\n            'loss_type': 'ctc',\n            'max_text_length': 25,\n            'use_space_char': True\n        }\n        self.char_ops = CharacterOps(char_ops_params)\n        self.rec_image_shape = [3, 32, 320]\n        self._text_detector_module = text_detector_module\n        self.font_file = os.path.join(self.directory, 'assets', 'simfang.ttf')\n        self.enable_mkldnn = enable_mkldnn\n\n        self.rec_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'character_rec', 'model')\n        
self.cls_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'angle_cls', 'model')\n        self.rec_predictor, self.rec_input_tensor, self.rec_output_tensors = self._set_config(\n            self.rec_pretrained_model_path)\n        self.cls_predictor, self.cls_input_tensor, self.cls_output_tensors = self._set_config(\n            self.cls_pretrained_model_path)\n\n    def _set_config(self, pretrained_model_path):\n        \"\"\"\n        predictor config path\n        \"\"\"\n        model_file_path = pretrained_model_path + '.pdmodel'\n        params_file_path = pretrained_model_path + '.pdiparams'\n\n        config = Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n            if self.enable_mkldnn:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n\n        predictor = create_predictor(config)\n\n        input_names = predictor.get_input_names()\n        input_tensor = predictor.get_input_handle(input_names[0])\n        output_names = predictor.get_output_names()\n        output_tensors = []\n        for output_name in output_names:\n            output_tensor = predictor.get_output_handle(output_name)\n            output_tensors.append(output_tensor)\n\n        return predictor, input_tensor, output_tensors\n\n    @property\n    def text_detector_module(self):\n        \"\"\"\n        text detect module\n        \"\"\"\n        if not self._text_detector_module:\n            
self._text_detector_module = hub.Module(name='chinese_text_detection_db_mobile',\n                                                    enable_mkldnn=self.enable_mkldnn)\n        return self._text_detector_module\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img is None:\n                logger.info(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        return images\n\n    def get_rotate_crop_image(self, img, points):\n        '''\n        img_height, img_width = img.shape[0:2]\n        left = int(np.min(points[:, 0]))\n        right = int(np.max(points[:, 0]))\n        top = int(np.min(points[:, 1]))\n        bottom = int(np.max(points[:, 1]))\n        img_crop = img[top:bottom, left:right, :].copy()\n        points[:, 0] = points[:, 0] - left\n        points[:, 1] = points[:, 1] - top\n        '''\n        img_crop_width = int(max(np.linalg.norm(points[0] - points[1]), np.linalg.norm(points[2] - points[3])))\n        img_crop_height = int(max(np.linalg.norm(points[0] - points[3]), np.linalg.norm(points[1] - points[2])))\n        pts_std = np.float32([[0, 0], [img_crop_width, 0], [img_crop_width, img_crop_height], [0, img_crop_height]])\n        M = cv2.getPerspectiveTransform(points, pts_std)\n        dst_img = cv2.warpPerspective(img,\n                                      M, (img_crop_width, img_crop_height),\n                                      borderMode=cv2.BORDER_REPLICATE,\n                                      flags=cv2.INTER_CUBIC)\n        dst_img_height, dst_img_width = dst_img.shape[0:2]\n        if dst_img_height * 1.0 / dst_img_width >= 1.5:\n            dst_img = np.rot90(dst_img)\n        return dst_img\n\n    def resize_norm_img_rec(self, img, max_wh_ratio):\n        imgC, imgH, imgW = 
self.rec_image_shape\n        assert imgC == img.shape[2]\n        imgW = int((32 * max_wh_ratio))\n        h, w = img.shape[:2]\n        ratio = w / float(h)\n        if math.ceil(imgH * ratio) > imgW:\n            resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def resize_norm_img_cls(self, img):\n        cls_image_shape = [3, 48, 192]\n        imgC, imgH, imgW = cls_image_shape\n        h = img.shape[0]\n        w = img.shape[1]\n        ratio = w / float(h)\n        if math.ceil(imgH * ratio) > imgW:\n            resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        if cls_image_shape[0] == 1:\n            resized_image = resized_image / 255\n            resized_image = resized_image[np.newaxis, :]\n        else:\n            resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def recognize_text(self,\n                       images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9):\n        \"\"\"\n     
   Get the Chinese texts in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C], BGR format; provide either images or paths\n            paths (list[str]): the paths of images; provide either paths or images\n            use_gpu (bool): whether to use GPU.\n            output_dir (str): the directory to store output images.\n            visualization (bool): whether to save the result images or not.\n            box_thresh (float): the confidence threshold of detected text boxes\n            text_thresh (float): the confidence threshold of Chinese text recognition\n            angle_classification_thresh (float): the confidence threshold of the angle classification\n\n        Returns:\n            res (list): the recognized Chinese texts and the save paths of the result images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        self.use_gpu = use_gpu\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is not any image to be predicted. 
Please check the input data.\"\n\n        detection_results = self.text_detector_module.detect_text(images=predicted_data,\n                                                                  use_gpu=self.use_gpu,\n                                                                  box_thresh=box_thresh)\n\n        boxes = [np.array(item['data']).astype(np.float32) for item in detection_results]\n        all_results = []\n        for index, img_boxes in enumerate(boxes):\n            original_image = predicted_data[index].copy()\n            result = {'save_path': ''}\n            if img_boxes.size == 0:\n                result['data'] = []\n            else:\n                img_crop_list = []\n                boxes = sorted_boxes(img_boxes)\n                for num_box in range(len(boxes)):\n                    tmp_box = copy.deepcopy(boxes[num_box])\n                    img_crop = self.get_rotate_crop_image(original_image, tmp_box)\n                    img_crop_list.append(img_crop)\n                img_crop_list, angle_list = self._classify_text(img_crop_list,\n                                                                angle_classification_thresh=angle_classification_thresh)\n                rec_results = self._recognize_text(img_crop_list)\n\n                # if the recognized text confidence score is lower than text_thresh, then drop it\n                rec_res_final = []\n                for index, res in enumerate(rec_results):\n                    text, score = res\n                    if score >= text_thresh:\n                        rec_res_final.append({\n                            'text': text,\n                            'confidence': float(score),\n                            'text_box_position': boxes[index].astype(np.int64).tolist()\n                        })\n                result['data'] = rec_res_final\n\n                if visualization and result['data']:\n                    result['save_path'] = self.save_result_image(original_image, 
boxes, rec_results, output_dir,\n                                                                 text_thresh)\n            all_results.append(result)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    def save_result_image(\n        self,\n        original_image,\n        detection_boxes,\n        rec_results,\n        output_dir='ocr_result',\n        text_thresh=0.5,\n    ):\n        image = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n        txts = [item[0] for item in rec_results]\n        scores = [item[1] for item in rec_results]\n        draw_img = draw_ocr(image,\n                            detection_boxes,\n                            txts,\n                            scores,\n                            font_file=self.font_file,\n                            draw_txt=True,\n                            drop_score=text_thresh)\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        ext = get_image_ext(original_image)\n        saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n        save_file_path = os.path.join(output_dir, saved_name)\n        cv2.imwrite(save_file_path, draw_img[:, :, ::-1])\n        return save_file_path\n\n    def _classify_text(self, image_list, angle_classification_thresh=0.9):\n        img_list = copy.deepcopy(image_list)\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            width_list.append(img.shape[1] / float(img.shape[0]))\n        # Sorting can speed up the cls process\n        indices = np.argsort(np.array(width_list))\n\n        cls_res = [['', 0.0]] * img_num\n        batch_num = 30\n        for 
beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            max_wh_ratio = 0\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_cls(img_list[indices[ino]])\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n            norm_img_batch = np.concatenate(norm_img_batch)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.cls_input_tensor.copy_from_cpu(norm_img_batch)\n            self.cls_predictor.run()\n\n            prob_out = self.cls_output_tensors[0].copy_to_cpu()\n            label_out = self.cls_output_tensors[1].copy_to_cpu()\n            if len(label_out.shape) != 1:\n                prob_out, label_out = label_out, prob_out\n            label_list = ['0', '180']\n            for rno in range(len(label_out)):\n                label_idx = label_out[rno]\n                score = prob_out[rno][label_idx]\n                label = label_list[label_idx]\n                cls_res[indices[beg_img_no + rno]] = [label, score]\n                if '180' in label and score > angle_classification_thresh:\n                    img_list[indices[beg_img_no + rno]] = cv2.rotate(img_list[indices[beg_img_no + rno]], 1)\n        return img_list, cls_res\n\n    def _recognize_text(self, img_list):\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            width_list.append(img.shape[1] / float(img.shape[0]))\n        # Sorting can speed up the recognition process\n        indices = np.argsort(np.array(width_list))\n\n        rec_res = [['', 0.0]] * img_num\n        
batch_num = 30\n        for beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            max_wh_ratio = 0\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_rec(img_list[indices[ino]], max_wh_ratio)\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n\n            norm_img_batch = np.concatenate(norm_img_batch, axis=0)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.rec_input_tensor.copy_from_cpu(norm_img_batch)\n            self.rec_predictor.run()\n\n            rec_idx_batch = self.rec_output_tensors[0].copy_to_cpu()\n            rec_idx_lod = self.rec_output_tensors[0].lod()[0]\n            predict_batch = self.rec_output_tensors[1].copy_to_cpu()\n            predict_lod = self.rec_output_tensors[1].lod()[0]\n            for rno in range(len(rec_idx_lod) - 1):\n                beg = rec_idx_lod[rno]\n                end = rec_idx_lod[rno + 1]\n                rec_idx_tmp = rec_idx_batch[beg:end, 0]\n                preds_text = self.char_ops.decode(rec_idx_tmp)\n                beg = predict_lod[rno]\n                end = predict_lod[rno + 1]\n                probs = predict_batch[beg:end, :]\n                ind = np.argmax(probs, axis=1)\n                blank = probs.shape[1]\n                valid_ind = np.where(ind != (blank - 1))[0]\n                if len(valid_ind) == 0:\n                    continue\n                score = np.mean(probs[valid_ind, ind[valid_ind]])\n                # rec_res.append([preds_text, score])\n                rec_res[indices[beg_img_no + rno]] = [preds_text, score]\n\n        return rec_res\n\n    def 
save_inference_model(self, dirname):\n        detector_dir = os.path.join(dirname, 'text_detector')\n        classifier_dir = os.path.join(dirname, 'angle_classifier')\n        recognizer_dir = os.path.join(dirname, 'text_recognizer')\n\n        self._save_detector_model(detector_dir)\n        self._save_classifier_model(classifier_dir)\n        self._save_recognizer_model(recognizer_dir)\n        logger.info(\"The inference model has been saved in the path {}\".format(os.path.realpath(dirname)))\n\n    def _save_detector_model(self, dirname):\n        self.text_detector_module.save_inference_model(dirname)\n\n    def _save_recognizer_model(self, dirname):\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        program, feeded_var_names, target_vars = paddle.static.load_inference_model(self.rec_pretrained_model_path,\n                                                                                    executor=exe)\n        global_block = program.global_block()\n        feed_vars = [global_block.var(item) for item in feeded_var_names]\n        paddle.static.save_inference_model(dirname,\n                                           feed_vars=feed_vars,\n                                           fetch_vars=target_vars,\n                                           executor=exe,\n                                           program=program)\n\n    def _save_classifier_model(self, dirname):\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        program, feeded_var_names, target_vars = paddle.static.load_inference_model(self.cls_pretrained_model_path,\n                                                                                    executor=exe)\n        global_block = program.global_block()\n        feed_vars = [global_block.var(item) for item in feeded_var_names]\n        paddle.static.save_inference_model(dirname,\n                                           feed_vars=feed_vars,\n                     
                      fetch_vars=target_vars,\n                                           executor=exe,\n                                           program=program)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.recognize_text(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='ocr_result',\n                                           help=\"The directory to save output images.\")\n        
self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"path to the input image\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(image, use_gpu=False, box_thresh=0.5, text_thresh=0.5, angle_classification_thresh=0.9):\n            return self.recognize_text(paths=[image],\n                                       use_gpu=use_gpu,\n                                       output_dir=None,\n                                       visualization=False,\n                                       box_thresh=box_thresh,\n                                       text_thresh=text_thresh,\n                                       angle_classification_thresh=angle_classification_thresh)\n\n        return gr.Interface(inference, [\n            gr.Image(type='filepath'),\n            gr.Checkbox(),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.9, step=0.01)\n        ], [gr.JSON(label='results')],\n                            title='chinese_ocr_db_crnn_mobile',\n                            allow_flagging=False)\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"chinese_ocr_db_crnn_mobile\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('ocr_result')\n\n    def test_recognize_text1(self):\n        results = self.module.recognize_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE',\n            'confidence': 0.9329867362976074,\n            'text_box_position': [[282, 163], [351, 163], [351, 200], [282, 200]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9966865181922913,\n            'text_box_position': [[259, 201], [376, 199], [376, 238], [259, 240]]\n        }])\n\n    def test_recognize_text2(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE',\n            'confidence': 0.9329867362976074,\n            'text_box_position': [[282, 163], [351, 163], [351, 200], [282, 200]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 
0.9966865181922913,\n            'text_box_position': [[259, 201], [376, 199], [376, 238], [259, 240]]\n        }])\n\n    def test_recognize_text3(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE',\n            'confidence': 0.9329867362976074,\n            'text_box_position': [[282, 163], [351, 163], [351, 200], [282, 200]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9966865181922913,\n            'text_box_position': [[259, 201], [376, 199], [376, 238], [259, 240]]\n        }])\n\n    def test_recognize_text4(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE',\n            'confidence': 0.9329867362976074,\n            'text_box_position': [[282, 163], [351, 163], [351, 200], [282, 200]]\n        }, {\n            'text': 'THANKS',\n            'confidence': 0.9966865181922913,\n            'text_box_position': [[259, 201], [376, 199], [376, 238], [259, 240]]\n        }])\n\n    def test_recognize_text5(self):\n        self.assertRaises(AttributeError, self.module.recognize_text, images=['tests/test.jpg'])\n\n    def test_recognize_text6(self):\n        self.assertRaises(AssertionError, self.module.recognize_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/angle_classifier.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/angle_classifier.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/text_detector.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model/text_detector.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/text_recognizer.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/text_recognizer.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_mobile/utils.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport math\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\nfrom PIL import ImageFont\n\n\ndef draw_ocr(image, boxes, txts, scores, font_file, draw_txt=True, drop_score=0.5):\n    \"\"\"\n    Visualize the results of OCR detection and recognition\n    args:\n        image(Image|array): RGB image\n        boxes(list): boxes with shape(N, 4, 2)\n        txts(list): the texts\n        scores(list): txxs corresponding scores\n        draw_txt(bool): whether draw text or not\n        drop_score(float): only scores greater than drop_threshold will be visualized\n    return(array):\n        the visualized img\n    \"\"\"\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score or math.isnan(score):\n            continue\n        box = np.reshape(np.array(box), [-1, 1, 2]).astype(np.int64)\n        image = cv2.polylines(np.array(image), [box], True, (255, 0, 0), 2)\n\n    if draw_txt:\n        img = np.array(resize_img(image, input_size=600))\n        txt_img = text_visual(txts, scores, font_file, img_h=img.shape[0], img_w=600, threshold=drop_score)\n        img = np.concatenate([np.array(img), np.array(txt_img)], axis=1)\n        return img\n    return image\n\n\ndef text_visual(texts, scores, font_file, img_h=400, img_w=600, threshold=0.):\n    \"\"\"\n    create new blank img and draw txt on it\n    args:\n        texts(list): the text will be draw\n        scores(list|None): corresponding score of each txt\n        img_h(int): the height of blank img\n        img_w(int): the width of blank img\n    return(array):\n    \"\"\"\n    if scores is not None:\n        assert len(texts) == len(scores), \"The number of txts and corresponding scores must match\"\n\n    def create_blank_img():\n       
 blank_img = np.ones(shape=[img_h, img_w], dtype=np.int8) * 255\n        blank_img[:, img_w - 1:] = 0\n        blank_img = Image.fromarray(blank_img).convert(\"RGB\")\n        draw_txt = ImageDraw.Draw(blank_img)\n        return blank_img, draw_txt\n\n    blank_img, draw_txt = create_blank_img()\n\n    font_size = 20\n    txt_color = (0, 0, 0)\n    font = ImageFont.truetype(font_file, font_size, encoding=\"utf-8\")\n\n    gap = font_size + 5\n    txt_img_list = []\n    count, index = 1, 0\n    for idx, txt in enumerate(texts):\n        index += 1\n        if scores[idx] < threshold or math.isnan(scores[idx]):\n            index -= 1\n            continue\n        first_line = True\n        while str_count(txt) >= img_w // font_size - 4:\n            tmp = txt\n            txt = tmp[:img_w // font_size - 4]\n            if first_line:\n                new_txt = str(index) + ': ' + txt\n                first_line = False\n            else:\n                new_txt = '    ' + txt\n            draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n            txt = tmp[img_w // font_size - 4:]\n            if count >= img_h // gap - 1:\n                txt_img_list.append(np.array(blank_img))\n                blank_img, draw_txt = create_blank_img()\n                count = 0\n            count += 1\n        if first_line:\n            new_txt = str(index) + ': ' + txt + '   ' + '%.3f' % (scores[idx])\n        else:\n            new_txt = \"  \" + txt + \"  \" + '%.3f' % (scores[idx])\n        draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n        # whether add new blank img or not\n        if count >= img_h // gap - 1 and idx + 1 < len(texts):\n            txt_img_list.append(np.array(blank_img))\n            blank_img, draw_txt = create_blank_img()\n            count = 0\n        count += 1\n    txt_img_list.append(np.array(blank_img))\n    if len(txt_img_list) == 1:\n        blank_img = np.array(txt_img_list[0])\n    else:\n        
blank_img = np.concatenate(txt_img_list, axis=1)\n    return np.array(blank_img)\n\n\ndef str_count(s):\n    \"\"\"\n    Count the number of Chinese characters,\n    a single English character and a single number\n    equal to half the length of Chinese characters.\n    args:\n        s(string): the input of string\n    return(int):\n        the number of Chinese characters\n    \"\"\"\n    import string\n    count_zh = count_pu = 0\n    s_len = len(s)\n    en_dg_count = 0\n    for c in s:\n        if c in string.ascii_letters or c.isdigit() or c.isspace():\n            en_dg_count += 1\n        elif c.isalpha():\n            count_zh += 1\n        else:\n            count_pu += 1\n    return s_len - math.ceil(en_dg_count / 2)\n\n\ndef resize_img(img, input_size=600):\n    img = np.array(img)\n    im_shape = img.shape\n    im_size_min = np.min(im_shape[0:2])\n    im_size_max = np.max(im_shape[0:2])\n    im_scale = float(input_size) / float(im_size_max)\n    im = cv2.resize(img, None, None, fx=im_scale, fy=im_scale)\n    return im\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef sorted_boxes(dt_boxes):\n    \"\"\"\n    Sort text boxes in order from top to bottom, left to right\n    args:\n        dt_boxes(array):detected text boxes with shape [4, 2]\n    return:\n        sorted boxes(array) with shape [4, 2]\n    \"\"\"\n    num_boxes = dt_boxes.shape[0]\n    sorted_boxes = sorted(dt_boxes, key=lambda x: (x[0][1], x[0][0]))\n    _boxes = list(sorted_boxes)\n\n    for i in range(num_boxes - 1):\n        if abs(_boxes[i + 1][0][1] - _boxes[i][0][1]) < 10 and \\\n                (_boxes[i + 1][0][0] < _boxes[i][0][0]):\n            tmp = _boxes[i]\n            _boxes[i] = _boxes[i + 1]\n            _boxes[i + 1] = tmp\n    return _boxes\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, 
cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/README.md",
    "content": "# chinese_ocr_db_crnn_server\n\n|模型名称|chinese_ocr_db_crnn_server|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+RCNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|116MB|\n|最新更新日期|2021-05-31|\n|数据指标|mAP@0.98|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - [OCR文字识别场景在线体验](https://www.paddlepaddle.org.cn/hub/scene/ocr)\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133097562-d8c9abd1-6c70-4d93-809f-fa4735764836.png\"  width = \"600\" hspace='10'/> <br />\n</p>\n\n- ### 模型介绍\n\n  - chinese_ocr_db_crnn_server Module用于识别图片当中的汉字。其基于[chinese_text_detection_db_server Module](../chinese_text_detection_db_server/)检测得到的文本框，识别文本框中的中文文字。识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积循环神经网络。该Module是一个通用的OCR模型，支持直接预测。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133098254-7c642826-d6d7-4dd0-986e-371622337867.png\" width = \"300\" height = \"450\"  hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.7.2  \n\n  - paddlehub >= 1.6.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **该Module依赖于第三方库shapely和pyclipper，使用该Module之前，请先安装shapely和pyclipper。**  \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_server\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run chinese_ocr_db_crnn_server --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 
[PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"chinese_ocr_db_crnn_server\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - 构造ChineseOCRDBCRNNServer对象\n\n    - **参数**\n\n      - text_detector_module(str): 文字检测PaddleHub Module名字，如设置为None，则默认使用[chinese_text_detection_db_server Module](../chinese_text_detection_db_server/)。其作用为检测图片当中的文本。<br/>\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9)\n    ```\n\n    - 预测API，检测输入图片中的所有中文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - box\\_thresh (float): 检测文本框置信度的阈值； <br/>\n      - text\\_thresh (float): 识别中文文本置信度的阈值； <br/>\n      - angle_classification_thresh(float): 文本角度分类置信度的阈值 <br/>\n      - visualization (bool): 是否将识别结果保存为图片文件； <br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 
文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n      如果无识别结果则data为\\[\\]\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m chinese_ocr_db_crnn_server\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_ocr_db_crnn_server\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### Gradio App 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/chinese_ocr_db_crnn_server 在浏览器中访问 chinese_ocr_db_crnn_server 的 Gradio App。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  支持mkldnn加速CPU计算\n\n* 1.1.0\n\n  使用三阶段模型（文本框检测-角度分类-文字识别）识别图片文字。\n\n* 1.1.1\n\n  支持文本中空格识别。\n\n* 1.1.2\n\n  修复检出字段无法超过30个问题。\n\n* 1.1.3\n\n  移除 fluid api\n\n* 1.2.0\n\n  添加 Gradio APP\n\n  - ```shell\n    $ hub install chinese_ocr_db_crnn_server==1.2.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/character.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport string\n\nimport numpy as np\n\n\nclass CharacterOps(object):\n    \"\"\" Convert between text-label and text-index\n    Args:\n        config: config from yaml file\n    \"\"\"\n\n    def __init__(self, config):\n        self.character_type = config['character_type']\n        self.loss_type = config['loss_type']\n        self.max_text_len = config['max_text_length']\n        if self.character_type == \"en\":\n            self.character_str = \"0123456789abcdefghijklmnopqrstuvwxyz\"\n            dict_character = list(self.character_str)\n        # use the custom dictionary\n        elif self.character_type == \"ch\":\n            character_dict_path = config['character_dict_path']\n            add_space = False\n            if 'use_space_char' in config:\n                add_space = config['use_space_char']\n            self.character_str = \"\"\n            with open(character_dict_path, \"rb\") as fin:\n                lines = fin.readlines()\n                for line in lines:\n                    line = line.decode('utf-8').strip(\"\\n\").strip(\"\\r\\n\")\n                    self.character_str += line\n            if add_space:\n                self.character_str += \" \"\n            dict_character = list(self.character_str)\n        elif self.character_type == \"en_sensitive\":\n            # same with ASTER setting 
(use 94 char).\n            self.character_str = string.printable[:-6]\n            dict_character = list(self.character_str)\n        else:\n            self.character_str = None\n        assert self.character_str is not None, \\\n            \"Unsupported type of character: {}\".format(self.character_type)\n        self.beg_str = \"sos\"\n        self.end_str = \"eos\"\n        # add start and end str for attention\n        if self.loss_type == \"attention\":\n            dict_character = [self.beg_str, self.end_str] + dict_character\n        elif self.loss_type == \"srn\":\n            dict_character = dict_character + [self.beg_str, self.end_str]\n        # create char dict\n        self.dict = {}\n        for i, char in enumerate(dict_character):\n            self.dict[char] = i\n        self.character = dict_character\n\n    def encode(self, text):\n        \"\"\"convert text-label into text-index.\n        input:\n            text: text labels of each image. [batch_size]\n\n        output:\n            text: concatenated text index for CTCLoss.\n                    [sum(text_lengths)] = [text_index_0 + text_index_1 + ... + text_index_(n - 1)]\n        \"\"\"\n        if self.character_type == \"en\":\n            text = text.lower()\n\n        text_list = []\n        for char in text:\n            if char not in self.dict:\n                continue\n            text_list.append(self.dict[char])\n        text = np.array(text_list)\n        return text\n\n    def decode(self, text_index, is_remove_duplicate=False):\n        \"\"\" convert text-index into text-label. 
\"\"\"\n        char_list = []\n        char_num = self.get_char_num()\n\n        if self.loss_type == \"attention\":\n            beg_idx = self.get_beg_end_flag_idx(\"beg\")\n            end_idx = self.get_beg_end_flag_idx(\"end\")\n            ignored_tokens = [beg_idx, end_idx]\n        else:\n            ignored_tokens = [char_num]\n\n        for idx in range(len(text_index)):\n            if text_index[idx] in ignored_tokens:\n                continue\n            if is_remove_duplicate:\n                if idx > 0 and text_index[idx - 1] == text_index[idx]:\n                    continue\n            char_list.append(self.character[int(text_index[idx])])\n        text = ''.join(char_list)\n        return text\n\n    def get_char_num(self):\n        return len(self.character)\n\n    def get_beg_end_flag_idx(self, beg_or_end):\n        if self.loss_type == \"attention\":\n            if beg_or_end == \"beg\":\n                idx = np.array(self.dict[self.beg_str])\n            elif beg_or_end == \"end\":\n                idx = np.array(self.dict[self.end_str])\n            else:\n                assert False, \"Unsupport type %s in get_beg_end_flag_idx\"\\\n                    % beg_or_end\n            return idx\n        else:\n            err = \"error in get_beg_end_flag_idx when using the loss %s\"\\\n                % (self.loss_type)\n            assert False, err\n\n\ndef cal_predicts_accuracy(char_ops, preds, preds_lod, labels, labels_lod, is_remove_duplicate=False):\n    \"\"\"\n    Calculate prediction accuracy\n    Args:\n        char_ops: CharacterOps\n        preds: preds result,text index\n        preds_lod: lod tensor of preds\n        labels: label of input image, text index\n        labels_lod:  lod tensor of label\n        is_remove_duplicate: Whether to remove duplicate characters,\n                                 The default is False\n    Return:\n        acc: The accuracy of test set\n        acc_num: The correct number of samples 
predicted\n        img_num: The total sample number of the test set\n    \"\"\"\n    acc_num = 0\n    img_num = 0\n    for ino in range(len(labels_lod) - 1):\n        beg_no = preds_lod[ino]\n        end_no = preds_lod[ino + 1]\n        preds_text = preds[beg_no:end_no].reshape(-1)\n        preds_text = char_ops.decode(preds_text, is_remove_duplicate)\n\n        beg_no = labels_lod[ino]\n        end_no = labels_lod[ino + 1]\n        labels_text = labels[beg_no:end_no].reshape(-1)\n        labels_text = char_ops.decode(labels_text, is_remove_duplicate)\n        img_num += 1\n\n        if preds_text == labels_text:\n            acc_num += 1\n    acc = acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef cal_predicts_accuracy_srn(char_ops, preds, labels, max_text_len, is_debug=False):\n    acc_num = 0\n    img_num = 0\n\n    char_num = char_ops.get_char_num()\n\n    total_len = preds.shape[0]\n    img_num = int(total_len / max_text_len)\n    for i in range(img_num):\n        cur_label = []\n        cur_pred = []\n        for j in range(max_text_len):\n            if labels[j + i * max_text_len] != int(char_num - 1):  #0\n                cur_label.append(labels[j + i * max_text_len][0])\n            else:\n                break\n\n        for j in range(max_text_len + 1):\n            if j < len(cur_label) and preds[j + i * max_text_len][0] != cur_label[j]:\n                break\n            elif j == len(cur_label) and j == max_text_len:\n                acc_num += 1\n                break\n            elif j == len(cur_label) and preds[j + i * max_text_len][0] == int(char_num - 1):\n                acc_num += 1\n                break\n    acc = acc_num * 1.0 / img_num\n    return acc, acc_num, img_num\n\n\ndef convert_rec_attention_infer_res(preds):\n    img_num = preds.shape[0]\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        end_pos = np.where(preds[ino, :] == 1)[0]\n        if len(end_pos) <= 1:\n            
text_list = preds[ino, 1:]\n        else:\n            text_list = preds[ino, 1:end_pos[1]]\n        target_lod.append(target_lod[ino] + len(text_list))\n        convert_ids = convert_ids + list(text_list)\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n\n\ndef convert_rec_label_to_lod(ori_labels):\n    img_num = len(ori_labels)\n    target_lod = [0]\n    convert_ids = []\n    for ino in range(img_num):\n        target_lod.append(target_lod[ino] + len(ori_labels[ino]))\n        convert_ids = convert_ids + list(ori_labels[ino])\n    convert_ids = np.array(convert_ids)\n    convert_ids = convert_ids.reshape((-1, 1))\n    return convert_ids, target_lod\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport copy\nimport math\nimport os\nimport time\n\nimport cv2\nimport numpy as np\nimport paddle\nfrom chinese_ocr_db_crnn_server.character import CharacterOps\nfrom chinese_ocr_db_crnn_server.utils import base64_to_cv2\nfrom chinese_ocr_db_crnn_server.utils import draw_ocr\nfrom chinese_ocr_db_crnn_server.utils import get_image_ext\nfrom chinese_ocr_db_crnn_server.utils import sorted_boxes\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom PIL import Image\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese_ocr_db_crnn_server\",\n    version=\"1.2.0\",\n    summary=\n    \"The module can recognize the chinese texts in an image. Firstly, it will detect the text box positions based on the differentiable_binarization_chn module. Then it recognizes the chinese texts. 
\",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChineseOCRDBCRNNServer:\n\n    def __init__(self, text_detector_module=None, enable_mkldnn=False):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.character_dict_path = os.path.join(self.directory, 'assets', 'ppocr_keys_v1.txt')\n        char_ops_params = {\n            'character_type': 'ch',\n            'character_dict_path': self.character_dict_path,\n            'loss_type': 'ctc',\n            'max_text_length': 25,\n            'use_space_char': True\n        }\n        self.char_ops = CharacterOps(char_ops_params)\n        self.rec_image_shape = [3, 32, 320]\n        self._text_detector_module = text_detector_module\n        self.font_file = os.path.join(self.directory, 'assets', 'simfang.ttf')\n        self.enable_mkldnn = enable_mkldnn\n\n        self.rec_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'character_rec', 'model')\n        self.cls_pretrained_model_path = os.path.join(self.directory, 'inference_model', 'angle_cls', 'model')\n        self.rec_predictor, self.rec_input_tensor, self.rec_output_tensors = self._set_config(\n            self.rec_pretrained_model_path)\n        self.cls_predictor, self.cls_input_tensor, self.cls_output_tensors = self._set_config(\n            self.cls_pretrained_model_path)\n\n    def _set_config(self, pretrained_model_path):\n        \"\"\"\n        Build the inference config for the given pretrained model path and create the predictor.\n        \"\"\"\n        model_file_path = pretrained_model_path + '.pdmodel'\n        params_file_path = pretrained_model_path + '.pdiparams'\n\n        config = Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except (KeyError, IndexError, ValueError):\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n            if self.enable_mkldnn:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n\n        predictor = create_predictor(config)\n\n        input_names = predictor.get_input_names()\n        input_tensor = predictor.get_input_handle(input_names[0])\n        output_names = predictor.get_output_names()\n        output_tensors = []\n        for output_name in output_names:\n            output_tensor = predictor.get_output_handle(output_name)\n            output_tensors.append(output_tensor)\n\n        return predictor, input_tensor, output_tensors\n\n    @property\n    def text_detector_module(self):\n        \"\"\"\n        The lazily-initialized text detection module.\n        \"\"\"\n        if not self._text_detector_module:\n            self._text_detector_module = hub.Module(name='chinese_text_detection_db_server',\n                                                    enable_mkldnn=self.enable_mkldnn)\n        return self._text_detector_module\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img is None:\n                logger.info(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        return images\n\n    def get_rotate_crop_image(self, img, points):\n        '''\n        img_height, img_width = img.shape[0:2]\n        left = int(np.min(points[:, 0]))\n        right = int(np.max(points[:, 0]))\n        top = int(np.min(points[:, 1]))\n        bottom = int(np.max(points[:, 1]))\n        img_crop = img[top:bottom, left:right, 
:].copy()\n        points[:, 0] = points[:, 0] - left\n        points[:, 1] = points[:, 1] - top\n        '''\n        img_crop_width = int(max(np.linalg.norm(points[0] - points[1]), np.linalg.norm(points[2] - points[3])))\n        img_crop_height = int(max(np.linalg.norm(points[0] - points[3]), np.linalg.norm(points[1] - points[2])))\n        pts_std = np.float32([[0, 0], [img_crop_width, 0], [img_crop_width, img_crop_height], [0, img_crop_height]])\n        M = cv2.getPerspectiveTransform(points, pts_std)\n        dst_img = cv2.warpPerspective(img,\n                                      M, (img_crop_width, img_crop_height),\n                                      borderMode=cv2.BORDER_REPLICATE,\n                                      flags=cv2.INTER_CUBIC)\n        dst_img_height, dst_img_width = dst_img.shape[0:2]\n        if dst_img_height * 1.0 / dst_img_width >= 1.5:\n            dst_img = np.rot90(dst_img)\n        return dst_img\n\n    def resize_norm_img_rec(self, img, max_wh_ratio):\n        imgC, imgH, imgW = self.rec_image_shape\n        assert imgC == img.shape[2]\n        imgW = int((32 * max_wh_ratio))\n        h, w = img.shape[:2]\n        ratio = w / float(h)\n        if math.ceil(imgH * ratio) > imgW:\n            resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def resize_norm_img_cls(self, img):\n        cls_image_shape = [3, 48, 192]\n        imgC, imgH, imgW = cls_image_shape\n        h = img.shape[0]\n        w = img.shape[1]\n        ratio = w / float(h)\n        if math.ceil(imgH * ratio) > imgW:\n            
resized_w = imgW\n        else:\n            resized_w = int(math.ceil(imgH * ratio))\n        resized_image = cv2.resize(img, (resized_w, imgH))\n        resized_image = resized_image.astype('float32')\n        if cls_image_shape[0] == 1:\n            resized_image = resized_image / 255\n            resized_image = resized_image[np.newaxis, :]\n        else:\n            resized_image = resized_image.transpose((2, 0, 1)) / 255\n        resized_image -= 0.5\n        resized_image /= 0.5\n        padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n        padding_im[:, :, 0:resized_w] = resized_image\n        return padding_im\n\n    def recognize_text(self,\n                       images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9):\n        \"\"\"\n        Get the Chinese texts in the predicted images.\n\n        Args:\n            images (list[numpy.ndarray]): Image data, each with shape [H, W, C] in BGR order. Ignored if paths is set.\n            paths (list[str]): Paths of the images. Ignored if images is set.\n            use_gpu (bool): Whether to use GPU; requires the CUDA_VISIBLE_DEVICES environment variable to be set.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the visualized results as images.\n            box_thresh (float): Confidence threshold for detected text boxes.\n            text_thresh (float): Confidence threshold for recognized Chinese texts.\n            angle_classification_thresh (float): Confidence threshold for the angle classification.\n\n        Returns:\n            res (list): The recognized texts and, if visualization is enabled, the save paths of the result images.\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except (KeyError, IndexError, ValueError):\n                raise RuntimeError(\n                    \"Environment variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        self.use_gpu = use_gpu\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is not any image to be predicted. 
Please check the input data.\"\n\n        detection_results = self.text_detector_module.detect_text(images=predicted_data,\n                                                                  use_gpu=self.use_gpu,\n                                                                  box_thresh=box_thresh)\n\n        boxes = [np.array(item['data']).astype(np.float32) for item in detection_results]\n        all_results = []\n        for index, img_boxes in enumerate(boxes):\n            original_image = predicted_data[index].copy()\n            result = {'save_path': ''}\n            if img_boxes.size == 0:\n                result['data'] = []\n            else:\n                img_crop_list = []\n                boxes = sorted_boxes(img_boxes)\n                for num_box in range(len(boxes)):\n                    tmp_box = copy.deepcopy(boxes[num_box])\n                    img_crop = self.get_rotate_crop_image(original_image, tmp_box)\n                    img_crop_list.append(img_crop)\n                img_crop_list, angle_list = self._classify_text(img_crop_list,\n                                                                angle_classification_thresh=angle_classification_thresh)\n                rec_results = self._recognize_text(img_crop_list)\n\n                # if the recognized text confidence score is lower than text_thresh, then drop it\n                rec_res_final = []\n                for index, res in enumerate(rec_results):\n                    text, score = res\n                    if score >= text_thresh:\n                        rec_res_final.append({\n                            'text': text,\n                            'confidence': float(score),\n                            'text_box_position': boxes[index].astype(np.int64).tolist()\n                        })\n                result['data'] = rec_res_final\n\n                if visualization and result['data']:\n                    result['save_path'] = self.save_result_image(original_image, 
boxes, rec_results, output_dir,\n                                                                 text_thresh)\n            all_results.append(result)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    def save_result_image(\n        self,\n        original_image,\n        detection_boxes,\n        rec_results,\n        output_dir='ocr_result',\n        text_thresh=0.5,\n    ):\n        image = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n        txts = [item[0] for item in rec_results]\n        scores = [item[1] for item in rec_results]\n        draw_img = draw_ocr(image,\n                            detection_boxes,\n                            txts,\n                            scores,\n                            font_file=self.font_file,\n                            draw_txt=True,\n                            drop_score=text_thresh)\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir)\n        ext = get_image_ext(original_image)\n        saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n        save_file_path = os.path.join(output_dir, saved_name)\n        cv2.imwrite(save_file_path, draw_img[:, :, ::-1])\n        return save_file_path\n\n    def _classify_text(self, image_list, angle_classification_thresh=0.9):\n        img_list = copy.deepcopy(image_list)\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            width_list.append(img.shape[1] / float(img.shape[0]))\n        # Sorting can speed up the cls process\n        indices = np.argsort(np.array(width_list))\n\n        cls_res = [['', 0.0]] * img_num\n        batch_num = 30\n        for 
beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            max_wh_ratio = 0\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_cls(img_list[indices[ino]])\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n            norm_img_batch = np.concatenate(norm_img_batch)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.cls_input_tensor.copy_from_cpu(norm_img_batch)\n            self.cls_predictor.run()\n\n            prob_out = self.cls_output_tensors[0].copy_to_cpu()\n            label_out = self.cls_output_tensors[1].copy_to_cpu()\n            if len(label_out.shape) != 1:\n                prob_out, label_out = label_out, prob_out\n            label_list = ['0', '180']\n            for rno in range(len(label_out)):\n                label_idx = label_out[rno]\n                score = prob_out[rno][label_idx]\n                label = label_list[label_idx]\n                cls_res[indices[beg_img_no + rno]] = [label, score]\n                if '180' in label and score > angle_classification_thresh:\n                    img_list[indices[beg_img_no + rno]] = cv2.rotate(img_list[indices[beg_img_no + rno]], 1)\n        return img_list, cls_res\n\n    def _recognize_text(self, img_list):\n        img_num = len(img_list)\n        # Calculate the aspect ratio of all text bars\n        width_list = []\n        for img in img_list:\n            width_list.append(img.shape[1] / float(img.shape[0]))\n        # Sorting can speed up the recognition process\n        indices = np.argsort(np.array(width_list))\n\n        rec_res = [['', 0.0]] * img_num\n        
batch_num = 30\n        for beg_img_no in range(0, img_num, batch_num):\n            end_img_no = min(img_num, beg_img_no + batch_num)\n            norm_img_batch = []\n            max_wh_ratio = 0\n            for ino in range(beg_img_no, end_img_no):\n                h, w = img_list[indices[ino]].shape[0:2]\n                wh_ratio = w * 1.0 / h\n                max_wh_ratio = max(max_wh_ratio, wh_ratio)\n            for ino in range(beg_img_no, end_img_no):\n                norm_img = self.resize_norm_img_rec(img_list[indices[ino]], max_wh_ratio)\n                norm_img = norm_img[np.newaxis, :]\n                norm_img_batch.append(norm_img)\n\n            norm_img_batch = np.concatenate(norm_img_batch, axis=0)\n            norm_img_batch = norm_img_batch.copy()\n\n            self.rec_input_tensor.copy_from_cpu(norm_img_batch)\n            self.rec_predictor.run()\n\n            rec_idx_batch = self.rec_output_tensors[0].copy_to_cpu()\n            rec_idx_lod = self.rec_output_tensors[0].lod()[0]\n            predict_batch = self.rec_output_tensors[1].copy_to_cpu()\n            predict_lod = self.rec_output_tensors[1].lod()[0]\n            for rno in range(len(rec_idx_lod) - 1):\n                beg = rec_idx_lod[rno]\n                end = rec_idx_lod[rno + 1]\n                rec_idx_tmp = rec_idx_batch[beg:end, 0]\n                preds_text = self.char_ops.decode(rec_idx_tmp)\n                beg = predict_lod[rno]\n                end = predict_lod[rno + 1]\n                probs = predict_batch[beg:end, :]\n                ind = np.argmax(probs, axis=1)\n                blank = probs.shape[1]\n                valid_ind = np.where(ind != (blank - 1))[0]\n                if len(valid_ind) == 0:\n                    continue\n                score = np.mean(probs[valid_ind, ind[valid_ind]])\n                # rec_res.append([preds_text, score])\n                rec_res[indices[beg_img_no + rno]] = [preds_text, score]\n\n        return rec_res\n\n    def 
save_inference_model(self, dirname):\n        detector_dir = os.path.join(dirname, 'text_detector')\n        classifier_dir = os.path.join(dirname, 'angle_classifier')\n        recognizer_dir = os.path.join(dirname, 'text_recognizer')\n\n        self._save_detector_model(detector_dir)\n        self._save_classifier_model(classifier_dir)\n        self._save_recognizer_model(recognizer_dir)\n        logger.info(\"The inference model has been saved in the path {}\".format(os.path.realpath(dirname)))\n\n    def _save_detector_model(self, dirname):\n        self.text_detector_module.save_inference_model(dirname)\n\n    def _save_recognizer_model(self, dirname):\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        program, feeded_var_names, target_vars = paddle.static.load_inference_model(self.rec_pretrained_model_path,\n                                                                                    executor=exe)\n        global_block = program.global_block()\n        feed_vars = [global_block.var(item) for item in feeded_var_names]\n        paddle.static.save_inference_model(dirname,\n                                           feed_vars=feed_vars,\n                                           fetch_vars=target_vars,\n                                           executor=exe,\n                                           program=program)\n\n    def _save_classifier_model(self, dirname):\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        program, feeded_var_names, target_vars = paddle.static.load_inference_model(self.cls_pretrained_model_path,\n                                                                                    executor=exe)\n        global_block = program.global_block()\n        feed_vars = [global_block.var(item) for item in feeded_var_names]\n        paddle.static.save_inference_model(dirname,\n                                           feed_vars=feed_vars,\n                     
                      fetch_vars=target_vars,\n                                           executor=exe,\n                                           program=program)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.recognize_text(paths=[args.input_path],\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir,\n                                      visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='ocr_result',\n                                           help=\"The directory to save output images.\")\n        
self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"path to the input image\")\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(image, use_gpu=False, box_thresh=0.5, text_thresh=0.5, angle_classification_thresh=0.9):\n            return self.recognize_text(paths=[image],\n                                       use_gpu=use_gpu,\n                                       output_dir=None,\n                                       visualization=False,\n                                       box_thresh=box_thresh,\n                                       text_thresh=text_thresh,\n                                       angle_classification_thresh=angle_classification_thresh)\n\n        return gr.Interface(inference, [\n            gr.Image(type='filepath'),\n            gr.Checkbox(),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.5, step=0.01),\n            gr.Slider(0, 1.0, 0.9, step=0.01)\n        ], [gr.JSON(label='results')],\n                            title='chinese_ocr_db_crnn_server',\n                            allow_flagging=False)\n"
  },
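The CTC post-processing at the end of `_recognize_text` above — argmax per timestep, drop the blank class, and average the surviving probabilities as the confidence — can be sketched in isolation. This is a minimal numpy sketch; the probability matrix and the helper name `ctc_greedy_score` are invented for illustration:

```python
import numpy as np

def ctc_greedy_score(probs):
    """Greedy CTC confidence: argmax each timestep, ignore the blank
    class (last column), and average the surviving probabilities."""
    ind = np.argmax(probs, axis=1)      # best class per timestep
    blank = probs.shape[1] - 1          # blank is the last index
    valid = np.where(ind != blank)[0]   # timesteps that emit a character
    if len(valid) == 0:
        return None, 0.0
    score = float(np.mean(probs[valid, ind[valid]]))
    return ind[valid], score

# Toy example: 4 timesteps, 3 classes (2 characters + blank).
probs = np.array([
    [0.9, 0.05, 0.05],   # char 0
    [0.1, 0.1, 0.8],     # blank, ignored
    [0.2, 0.7, 0.1],     # char 1
    [0.05, 0.05, 0.9],   # blank, ignored
])
indices, score = ctc_greedy_score(probs)
print(indices.tolist(), round(score, 2))  # [0, 1] 0.8
```

The module additionally maps the surviving indices to characters via `CharacterOps.decode` and drops results whose mean confidence falls below `text_thresh`.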
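`resize_norm_img_rec` above resizes each crop to a fixed height while keeping its aspect ratio, scales pixels to [-1, 1], and right-pads the width to `32 * max_wh_ratio`. A numpy-only sketch of the same normalization, where a nearest-neighbour index sampling stands in for `cv2.resize` and the name `normalize_for_rec` is hypothetical:

```python
import numpy as np

def normalize_for_rec(img, imgH=32, max_wh_ratio=10.0):
    """Resize an HxWx3 uint8 crop to height imgH (aspect preserved),
    scale pixels to [-1, 1], and right-pad the width to 32*max_wh_ratio,
    mirroring resize_norm_img_rec."""
    imgW = int(32 * max_wh_ratio)
    h, w = img.shape[:2]
    resized_w = min(imgW, int(np.ceil(imgH * w / float(h))))
    # Nearest-neighbour resize via index sampling (no cv2 dependency).
    ys = np.arange(imgH) * h // imgH
    xs = np.arange(resized_w) * w // resized_w
    resized = img[ys][:, xs].astype('float32')
    resized = resized.transpose((2, 0, 1)) / 255  # HWC -> CHW, [0, 1]
    resized = (resized - 0.5) / 0.5               # [0, 1] -> [-1, 1]
    padded = np.zeros((3, imgH, imgW), dtype=np.float32)
    padded[:, :, :resized_w] = resized            # zero-pad on the right
    return padded

crop = np.full((16, 64, 3), 255, dtype=np.uint8)  # white 16x64 crop
out = normalize_for_rec(crop, max_wh_ratio=8.0)
print(out.shape)  # (3, 32, 256): white region is 1.0, padding stays 0.0
```

Zero-padding on the right is what lets crops of different widths share one batch tensor once they are grouped by aspect ratio.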
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"chinese_ocr_db_crnn_server\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('ocr_result')\n\n    def test_recognize_text1(self):\n        results = self.module.recognize_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.944110095500946,\n            'text_box_position': [[281, 159], [359, 159], [359, 202], [281, 202]]\n        }, {\n            'text': 'THANKS.',\n            'confidence': 0.9850907325744629,\n            'text_box_position': [[258, 199], [382, 199], [382, 240], [258, 240]]\n        }])\n\n    def test_recognize_text2(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.944110095500946,\n            'text_box_position': [[281, 159], [359, 159], [359, 202], [281, 202]]\n        }, {\n            'text': 'THANKS.',\n            'confidence': 
0.9850907325744629,\n            'text_box_position': [[258, 199], [382, 199], [382, 240], [258, 240]]\n        }])\n\n    def test_recognize_text3(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.944110095500946,\n            'text_box_position': [[281, 159], [359, 159], [359, 202], [281, 202]]\n        }, {\n            'text': 'THANKS.',\n            'confidence': 0.9850907325744629,\n            'text_box_position': [[258, 199], [382, 199], [382, 240], [258, 240]]\n        }])\n\n    def test_recognize_text4(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(results[0]['data'], [{\n            'text': 'GIVE.',\n            'confidence': 0.944110095500946,\n            'text_box_position': [[281, 159], [359, 159], [359, 202], [281, 202]]\n        }, {\n            'text': 'THANKS.',\n            'confidence': 0.9850907325744629,\n            'text_box_position': [[258, 199], [382, 199], [382, 240], [258, 240]]\n        }])\n\n    def test_recognize_text5(self):\n        self.assertRaises(AttributeError, self.module.recognize_text, images=['tests/test.jpg'])\n\n    def test_recognize_text6(self):\n        self.assertRaises(AssertionError, self.module.recognize_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model/angle_classifier.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/angle_classifier.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/text_detector.pdmodel'))\n        
self.assertTrue(os.path.exists('./inference/model/text_detector.pdiparams'))\n\n        self.assertTrue(os.path.exists('./inference/model/text_recognizer.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model/text_recognizer.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
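Both `_classify_text` and `_recognize_text` above use the same batching trick: sort the crops by aspect ratio so similarly shaped images land in the same batch, run batches of up to 30, then scatter results back to the original order through `indices`. A small sketch of that pattern, with a stand-in `predict_batch` callable in place of the real predictor:

```python
import numpy as np

def batched_in_ratio_order(images, predict_batch, batch_num=30):
    """Process images in batches sorted by aspect ratio (w/h), then
    scatter the per-image results back to the original order."""
    ratios = [img.shape[1] / float(img.shape[0]) for img in images]
    indices = np.argsort(np.array(ratios))  # similar ratios share a batch
    results = [None] * len(images)
    for beg in range(0, len(images), batch_num):
        end = min(len(images), beg + batch_num)
        batch = [images[indices[i]] for i in range(beg, end)]
        for offset, out in enumerate(predict_batch(batch)):
            results[indices[beg + offset]] = out  # undo the sort
    return results

# Stand-in predictor: report each image's width.
widths = [80, 20, 50]
imgs = [np.zeros((10, w, 3), dtype=np.uint8) for w in widths]
out = batched_in_ratio_order(imgs, lambda b: [im.shape[1] for im in b], batch_num=2)
print(out)  # [80, 20, 50] — original order is preserved
```

Sorting by aspect ratio keeps the zero-padding per batch small, which is why the module's comments note that it "can speed up the cls/recognition process".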
  {
    "path": "modules/image/text_recognition/chinese_ocr_db_crnn_server/utils.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport base64\nimport math\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nfrom PIL import ImageDraw\nfrom PIL import ImageFont\n\n\ndef draw_ocr(image, boxes, txts, scores, font_file, draw_txt=True, drop_score=0.5):\n    \"\"\"\n    Visualize the results of OCR detection and recognition\n    args:\n        image(Image|array): RGB image\n        boxes(list): boxes with shape(N, 4, 2)\n        txts(list): the texts\n        scores(list): the corresponding scores of txts\n        font_file(str): path to the font file used to draw texts\n        draw_txt(bool): whether to draw the texts or not\n        drop_score(float): only scores greater than drop_score will be visualized\n    return(array):\n        the visualized img\n    \"\"\"\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score or math.isnan(score):\n            continue\n        box = np.reshape(np.array(box), [-1, 1, 2]).astype(np.int64)\n        image = cv2.polylines(np.array(image), [box], True, (255, 0, 0), 2)\n\n    if draw_txt:\n        img = np.array(resize_img(image, input_size=600))\n        txt_img = text_visual(txts, scores, font_file, img_h=img.shape[0], img_w=600, threshold=drop_score)\n        img = np.concatenate([np.array(img), np.array(txt_img)], axis=1)\n        return img\n    return image\n\n\ndef text_visual(texts, scores, font_file, img_h=400, img_w=600, threshold=0.):\n    \"\"\"\n    create a new blank img and draw txt on it\n    args:\n        texts(list): the texts to be drawn\n        scores(list|None): corresponding score of each txt\n        font_file(str): path to the font file used to draw texts\n        img_h(int): the height of the blank img\n        img_w(int): the width of the blank img\n        threshold(float): texts with scores below this value are skipped\n    return(array):\n        the text panel as an image array\n    \"\"\"\n    if scores is not None:\n        assert len(texts) == len(scores), \"The number of txts and corresponding scores must match\"\n\n    def create_blank_img():\n        blank_img = np.ones(shape=[img_h, img_w], dtype=np.uint8) * 255\n        blank_img[:, img_w - 1:] = 0\n        blank_img = Image.fromarray(blank_img).convert(\"RGB\")\n        draw_txt = ImageDraw.Draw(blank_img)\n        return blank_img, draw_txt\n\n    blank_img, draw_txt = create_blank_img()\n\n    font_size = 20\n    txt_color = (0, 0, 0)\n    font = ImageFont.truetype(font_file, font_size, encoding=\"utf-8\")\n\n    gap = font_size + 5\n    txt_img_list = []\n    count, index = 1, 0\n    for idx, txt in enumerate(texts):\n        index += 1\n        if scores[idx] < threshold or math.isnan(scores[idx]):\n            index -= 1\n            continue\n        first_line = True\n        while str_count(txt) >= img_w // font_size - 4:\n            tmp = txt\n            txt = tmp[:img_w // font_size - 4]\n            if first_line:\n                new_txt = str(index) + ': ' + txt\n                first_line = False\n            else:\n                new_txt = '    ' + txt\n            draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n            txt = tmp[img_w // font_size - 4:]\n            if count >= img_h // gap - 1:\n                txt_img_list.append(np.array(blank_img))\n                blank_img, draw_txt = create_blank_img()\n                count = 0\n            count += 1\n        if first_line:\n            new_txt = str(index) + ': ' + txt + '   ' + '%.3f' % (scores[idx])\n        else:\n            new_txt = \"  \" + txt + \"  \" + '%.3f' % (scores[idx])\n        draw_txt.text((0, gap * count), new_txt, txt_color, font=font)\n        # whether to start a new blank img or not\n        if count >= img_h // gap - 1 and idx + 1 < len(texts):\n            txt_img_list.append(np.array(blank_img))\n            blank_img, draw_txt = create_blank_img()\n            count = 0\n        count += 1\n    txt_img_list.append(np.array(blank_img))\n    if len(txt_img_list) == 1:\n        blank_img = np.array(txt_img_list[0])\n    else:\n        blank_img = np.concatenate(txt_img_list, axis=1)\n    return np.array(blank_img)\n\n\ndef str_count(s):\n    \"\"\"\n    Count the display width of a string in Chinese-character units:\n    a single English letter, digit or space counts as half a Chinese\n    character.\n    args:\n        s(string): the input string\n    return(int):\n        the display width in Chinese-character units\n    \"\"\"\n    import string\n    count_zh = count_pu = 0\n    s_len = len(s)\n    en_dg_count = 0\n    for c in s:\n        if c in string.ascii_letters or c.isdigit() or c.isspace():\n            en_dg_count += 1\n        elif c.isalpha():\n            count_zh += 1\n        else:\n            count_pu += 1\n    return s_len - math.ceil(en_dg_count / 2)\n\n\ndef resize_img(img, input_size=600):\n    img = np.array(img)\n    im_shape = img.shape\n    im_size_min = np.min(im_shape[0:2])\n    im_size_max = np.max(im_shape[0:2])\n    im_scale = float(input_size) / float(im_size_max)\n    im = cv2.resize(img, None, None, fx=im_scale, fy=im_scale)\n    return im\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef sorted_boxes(dt_boxes):\n    \"\"\"\n    Sort text boxes in order from top to bottom, left to right\n    args:\n        dt_boxes(array): detected text boxes with shape (N, 4, 2)\n    return:\n        sorted boxes(list) of (4, 2) arrays\n    \"\"\"\n    num_boxes = dt_boxes.shape[0]\n    _boxes = sorted(dt_boxes, key=lambda x: (x[0][1], x[0][0]))\n\n    for i in range(num_boxes - 1):\n        if abs(_boxes[i + 1][0][1] - _boxes[i][0][1]) < 10 and \\\n                (_boxes[i + 1][0][0] < _boxes[i][0][0]):\n            tmp = _boxes[i]\n            _boxes[i] = _boxes[i + 1]\n            _boxes[i + 1] = tmp\n    return _boxes\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, 
cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n"
  },
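The reading-order sort in `sorted_boxes` above — sort by the top-left corner's y then x, then swap neighbours whose top edges are within 10 px of each other — can be exercised with plain numpy arrays. The coordinates below are invented; note that the single swap pass matches the upstream heuristic and is not a full stable sort:

```python
import numpy as np

def sort_reading_order(dt_boxes):
    """Sort quadrilateral boxes top-to-bottom, then left-to-right for
    boxes whose top edges lie within 10 px of each other (same line)."""
    _boxes = sorted(dt_boxes, key=lambda box: (box[0][1], box[0][0]))
    for i in range(len(_boxes) - 1):
        # Neighbours on roughly the same line are reordered by x.
        if abs(_boxes[i + 1][0][1] - _boxes[i][0][1]) < 10 and \
                _boxes[i + 1][0][0] < _boxes[i][0][0]:
            _boxes[i], _boxes[i + 1] = _boxes[i + 1], _boxes[i]
    return _boxes

# Three boxes: two on one line (given out of order), one below.
boxes = [
    np.array([[200, 12], [260, 12], [260, 40], [200, 40]]),  # line 1, right
    np.array([[10, 100], [80, 100], [80, 130], [10, 130]]),  # line 2
    np.array([[20, 15], [90, 15], [90, 42], [20, 42]]),      # line 1, left
]
ordered = sort_reading_order(boxes)
print([b[0].tolist() for b in ordered])  # [[20, 15], [200, 12], [10, 100]]
```

The module applies this ordering before cropping so that the recognized texts come back in natural reading order.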
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/README.md",
    "content": "# chinese_text_detection_db_mobile\n\n|模型名称|chinese_text_detection_db_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|2.6MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133101419-2d175dc4-2274-404d-b9f3-802f398a6ce4.jpg\" width=\"500\" alt=\"package\" >\n\n</p>\n\n- ### 模型介绍\n\n  - DB（Differentiable Binarization）是一种基于分割的文本检测算法。此类算法可以更好地处理弯曲等不规则形状文本，因此检测效果往往会更好。但其后处理步骤中将分割结果转化为检测框的流程复杂，耗时严重。DB将二值化阈值加入训练中学习，可以获得更准确的检测边界，从而简化后处理流程。该Module是一个超轻量级文本检测模型，支持直接预测。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133101635-7fb142d3-9056-44da-8201-d931727d3977.png\" width=\"800\" hspace='10'/> <br />\n</p>\n\n更多详情参考：[Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.7.2  \n\n  - paddlehub >= 1.6.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **该Module依赖于第三方库shapely和pyclipper，使用该Module之前，请先安装shapely和pyclipper。**\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese_text_detection_db_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run chinese_text_detection_db_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    text_detector = 
hub.Module(name=\"chinese_text_detection_db_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = text_detector.detect_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = text_detector.detect_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(enable_mkldnn=False)\n    ```\n\n    - 构造ChineseTextDetectionDB对象\n\n    - **参数**\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n\n  - ```python\n    def detect_text(paths=[],\n                    images=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    box_thresh=0.5,\n                    visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有中文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list): 检测文本框结果，文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n        - save_path (str): 识别结果的保存路径, 如不保存图片则save_path为''\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m chinese_text_detection_db_mobile\n    ```\n\n  - 这样就完成了一个文字检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = 
{'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_text_detection_db_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复使用在线服务调用模型失败问题\n\n* 1.0.2\n\n  支持mkldnn加速CPU计算\n\n* 1.0.3\n\n  增加更多预训练数据，更新预训练参数\n\n* 1.0.4\n\n  使用超轻量级的三阶段模型（文本框检测-角度分类-文字识别）识别图片文字。\n\n* 1.0.5\n\n  移除 fluid api\n\n* 1.1.0\n\n  适配 PaddleHub 2.x 版本\n\n  - ```shell\n    $ hub install chinese_text_detection_db_mobile==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport base64\nimport os\nimport time\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom PIL import Image\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.log import logger\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.fromstring(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name=\"chinese_text_detection_db_mobile\",\n    version=\"1.1.0\",\n    summary=\n    \"The module aims to detect chinese text position in the image, which is based on differentiable_binarization algorithm.\",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChineseTextDetectionDB:\n\n    def __init__(self, enable_mkldnn=False):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, 'inference_model', 'model')\n        self.enable_mkldnn = enable_mkldnn\n\n        self._set_config()\n\n    def check_requirements(self):\n        
try:\n            import shapely, pyclipper\n        except ImportError:\n            raise ImportError(\n                'This module requires the shapely and pyclipper packages. Please install them before use.'\n            )\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model_file_path = self.pretrained_model_path + '.pdmodel'\n        params_file_path = self.pretrained_model_path + '.pdiparams'\n\n        config = Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except Exception:\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n            config.set_cpu_math_library_num_threads(6)\n            if self.enable_mkldnn:\n                # cache 10 different shapes for mkldnn to avoid memory leak\n                config.set_mkldnn_cache_capacity(10)\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # use zero copy\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n        self.predictor = create_predictor(config)\n        input_names = self.predictor.get_input_names()\n        self.input_tensor = self.predictor.get_input_handle(input_names[0])\n        output_names = self.predictor.get_output_names()\n        self.output_tensors = []\n        for output_name in output_names:\n            output_tensor = self.predictor.get_output_handle(output_name)\n            self.output_tensors.append(output_tensor)\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img 
is None:\n                logger.info(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        return images\n\n    def order_points_clockwise(self, pts):\n        \"\"\"\n        reference from: https://github.com/jrosebr1/imutils/blob/master/imutils/perspective.py\n        # sort the points based on their x-coordinates\n        \"\"\"\n        xSorted = pts[np.argsort(pts[:, 0]), :]\n\n        # grab the left-most and right-most points from the sorted\n        # x-roodinate points\n        leftMost = xSorted[:2, :]\n        rightMost = xSorted[2:, :]\n\n        # now, sort the left-most coordinates according to their\n        # y-coordinates so we can grab the top-left and bottom-left\n        # points, respectively\n        leftMost = leftMost[np.argsort(leftMost[:, 1]), :]\n        (tl, bl) = leftMost\n\n        rightMost = rightMost[np.argsort(rightMost[:, 1]), :]\n        (tr, br) = rightMost\n\n        rect = np.array([tl, tr, br, bl], dtype=\"float32\")\n        return rect\n\n    def clip_det_res(self, points, img_height, img_width):\n        for pno in range(points.shape[0]):\n            points[pno, 0] = int(min(max(points[pno, 0], 0), img_width - 1))\n            points[pno, 1] = int(min(max(points[pno, 1], 0), img_height - 1))\n        return points\n\n    def filter_tag_det_res(self, dt_boxes, image_shape):\n        img_height, img_width = image_shape[0:2]\n        dt_boxes_new = []\n        for box in dt_boxes:\n            box = self.order_points_clockwise(box)\n            box = self.clip_det_res(box, img_height, img_width)\n            rect_width = int(np.linalg.norm(box[0] - box[1]))\n            rect_height = int(np.linalg.norm(box[0] - box[3]))\n            if rect_width <= 3 or rect_height <= 3:\n                continue\n            dt_boxes_new.append(box)\n        dt_boxes = np.array(dt_boxes_new)\n        return dt_boxes\n\n    def filter_tag_det_res_only_clip(self, dt_boxes, 
image_shape):\n        img_height, img_width = image_shape[0:2]\n        dt_boxes_new = []\n        for box in dt_boxes:\n            box = self.clip_det_res(box, img_height, img_width)\n            dt_boxes_new.append(box)\n        dt_boxes = np.array(dt_boxes_new)\n        return dt_boxes\n\n    def detect_text(self,\n                    images=[],\n                    paths=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    visualization=False,\n                    box_thresh=0.5):\n        \"\"\"\n        Get the text box in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. If images not paths\n            paths (list[str]): The paths of images. If paths not images\n            use_gpu (bool): Whether to use gpu. Default false.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n            box_thresh(float): the threshold of the detected text box's confidence\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        self.check_requirements()\n\n        from chinese_text_detection_db_mobile.processor import DBProcessTest, DBPostProcess, draw_boxes, get_image_ext\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is not any image to be predicted. Please check the input data.\"\n\n        preprocessor = DBProcessTest(params={'max_side_len': 960})\n        postprocessor = DBPostProcess(params={\n            'thresh': 0.3,\n            'box_thresh': box_thresh,\n            'max_candidates': 1000,\n            'unclip_ratio': 1.6\n        })\n\n        all_imgs = []\n        all_ratios = []\n        all_results = []\n        for original_image in predicted_data:\n            ori_im = original_image.copy()\n            im, ratio_list = preprocessor(original_image)\n            res = {'save_path': ''}\n            if im is None:\n                res['data'] = []\n\n            else:\n                im = im.copy()\n                self.input_tensor.copy_from_cpu(im)\n                self.predictor.run()\n\n                outputs = []\n                for output_tensor in self.output_tensors:\n                    output = output_tensor.copy_to_cpu()\n                    outputs.append(output)\n\n                outs_dict = {}\n                outs_dict['maps'] = outputs[0]\n\n                # data_out = self.output_tensors[0].copy_to_cpu()\n                dt_boxes_list = postprocessor(outs_dict, [ratio_list])\n                dt_boxes = dt_boxes_list[0]\n                boxes = self.filter_tag_det_res(dt_boxes_list[0], original_image.shape)\n                res['data'] = boxes.astype(np.int64).tolist()\n\n                all_imgs.append(im)\n                
all_ratios.append(ratio_list)\n                if visualization:\n                    img = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n                    draw_img = draw_boxes(img, boxes)\n                    draw_img = np.array(draw_img)\n                    if not os.path.exists(output_dir):\n                        os.makedirs(output_dir)\n                    ext = get_image_ext(original_image)\n                    saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n                    cv2.imwrite(os.path.join(output_dir, saved_name), draw_img[:, :, ::-1])\n                    res['save_path'] = os.path.join(output_dir, saved_name)\n\n            all_results.append(res)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.detect_text(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.detect_text(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"diretory to image\")\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/processor.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport sys\n\nimport cv2\nimport numpy as np\nimport pyclipper\nfrom PIL import ImageDraw\nfrom shapely.geometry import Polygon\n\n\nclass DBProcessTest(object):\n    \"\"\"\n    DB pre-process for Test mode\n    \"\"\"\n\n    def __init__(self, params):\n        super(DBProcessTest, self).__init__()\n        self.resize_type = 0\n        if 'test_image_shape' in params:\n            self.image_shape = params['test_image_shape']\n            # print(self.image_shape)\n            self.resize_type = 1\n        if 'max_side_len' in params:\n            self.max_side_len = params['max_side_len']\n        else:\n            self.max_side_len = 2400\n\n    def resize_image_type0(self, im):\n        \"\"\"\n        resize image to a size multiple of 32 which is required by the network\n        args:\n            img(array): array with shape [h, w, c]\n        return(tuple):\n            img, (ratio_h, ratio_w)\n        \"\"\"\n        max_side_len = self.max_side_len\n        h, w, _ = im.shape\n\n        resize_w = w\n        resize_h = h\n\n        # limit the max side\n        if max(resize_h, resize_w) > max_side_len:\n            if resize_h > resize_w:\n                ratio = float(max_side_len) / resize_h\n            else:\n                ratio = float(max_side_len) / resize_w\n        else:\n            ratio = 1.\n        resize_h = int(resize_h * ratio)\n        resize_w = int(resize_w * ratio)\n        if resize_h % 32 == 0:\n            resize_h = resize_h\n        elif resize_h // 32 <= 1:\n            resize_h = 32\n        else:\n            resize_h = (resize_h // 32 - 1) * 32\n        if resize_w % 32 == 0:\n            resize_w = resize_w\n        elif resize_w // 32 <= 1:\n            resize_w = 32\n        else:\n            resize_w = (resize_w // 32 - 1) * 32\n        try:\n            if int(resize_w) <= 0 or 
int(resize_h) <= 0:\n                return None, (None, None)\n            im = cv2.resize(im, (int(resize_w), int(resize_h)))\n        except:\n            print(im.shape, resize_w, resize_h)\n            sys.exit(0)\n        ratio_h = resize_h / float(h)\n        ratio_w = resize_w / float(w)\n        return im, (ratio_h, ratio_w)\n\n    def resize_image_type1(self, im):\n        resize_h, resize_w = self.image_shape\n        ori_h, ori_w = im.shape[:2]  # (h, w, c)\n        im = cv2.resize(im, (int(resize_w), int(resize_h)))\n        ratio_h = float(resize_h) / ori_h\n        ratio_w = float(resize_w) / ori_w\n        return im, (ratio_h, ratio_w)\n\n    def normalize(self, im):\n        img_mean = [0.485, 0.456, 0.406]\n        img_std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        im = im / 255\n        im[:, :, 0] -= img_mean[0]\n        im[:, :, 1] -= img_mean[1]\n        im[:, :, 2] -= img_mean[2]\n        im[:, :, 0] /= img_std[0]\n        im[:, :, 1] /= img_std[1]\n        im[:, :, 2] /= img_std[2]\n        channel_swap = (2, 0, 1)\n        im = im.transpose(channel_swap)\n        return im\n\n    def __call__(self, im):\n        if self.resize_type == 0:\n            im, (ratio_h, ratio_w) = self.resize_image_type0(im)\n        else:\n            im, (ratio_h, ratio_w) = self.resize_image_type1(im)\n        im = self.normalize(im)\n        im = im[np.newaxis, :]\n        return [im, (ratio_h, ratio_w)]\n\n\nclass DBPostProcess(object):\n    \"\"\"\n    The post process for Differentiable Binarization (DB).\n    \"\"\"\n\n    def __init__(self, params):\n        self.thresh = params['thresh']\n        self.box_thresh = params['box_thresh']\n        self.max_candidates = params['max_candidates']\n        self.unclip_ratio = params['unclip_ratio']\n        self.min_size = 3\n        self.dilation_kernel = np.array([[1, 1], [1, 1]])\n\n    def boxes_from_bitmap(self, pred, _bitmap, dest_width, dest_height):\n        '''\n   
     _bitmap: single map with shape (1, H, W),\n                whose values are binarized as {0, 1}\n        '''\n\n        bitmap = _bitmap\n        height, width = bitmap.shape\n\n        outs = cv2.findContours((bitmap * 255).astype(np.uint8), cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)\n        if len(outs) == 3:\n            img, contours, _ = outs[0], outs[1], outs[2]\n        elif len(outs) == 2:\n            contours, _ = outs[0], outs[1]\n\n        num_contours = min(len(contours), self.max_candidates)\n        boxes = np.zeros((num_contours, 4, 2), dtype=np.int64)\n        scores = np.zeros((num_contours, ), dtype=np.float32)\n\n        for index in range(num_contours):\n            contour = contours[index]\n            points, sside = self.get_mini_boxes(contour)\n            if sside < self.min_size:\n                continue\n            points = np.array(points)\n            score = self.box_score_fast(pred, points.reshape(-1, 2))\n            if self.box_thresh > score:\n                continue\n\n            box = self.unclip(points).reshape(-1, 1, 2)\n            box, sside = self.get_mini_boxes(box)\n            if sside < self.min_size + 2:\n                continue\n            box = np.array(box)\n            if not isinstance(dest_width, int):\n                dest_width = dest_width.item()\n                dest_height = dest_height.item()\n\n            box[:, 0] = np.clip(np.round(box[:, 0] / width * dest_width), 0, dest_width)\n            box[:, 1] = np.clip(np.round(box[:, 1] / height * dest_height), 0, dest_height)\n            boxes[index, :, :] = box.astype(np.int64)\n            scores[index] = score\n        return boxes, scores\n\n    def unclip(self, box):\n        unclip_ratio = self.unclip_ratio\n        poly = Polygon(box)\n        distance = poly.area * unclip_ratio / poly.length\n        offset = pyclipper.PyclipperOffset()\n        offset.AddPath(box, pyclipper.JT_ROUND, pyclipper.ET_CLOSEDPOLYGON)\n        expanded = 
np.array(offset.Execute(distance))\n        return expanded\n\n    def get_mini_boxes(self, contour):\n        bounding_box = cv2.minAreaRect(contour)\n        points = sorted(list(cv2.boxPoints(bounding_box)), key=lambda x: x[0])\n\n        index_1, index_2, index_3, index_4 = 0, 1, 2, 3\n        if points[1][1] > points[0][1]:\n            index_1 = 0\n            index_4 = 1\n        else:\n            index_1 = 1\n            index_4 = 0\n        if points[3][1] > points[2][1]:\n            index_2 = 2\n            index_3 = 3\n        else:\n            index_2 = 3\n            index_3 = 2\n\n        box = [points[index_1], points[index_2], points[index_3], points[index_4]]\n        return box, min(bounding_box[1])\n\n    def box_score_fast(self, bitmap, _box):\n        h, w = bitmap.shape[:2]\n        box = _box.copy()\n        xmin = np.clip(np.floor(box[:, 0].min()).astype(np.int64), 0, w - 1)\n        xmax = np.clip(np.ceil(box[:, 0].max()).astype(np.int64), 0, w - 1)\n        ymin = np.clip(np.floor(box[:, 1].min()).astype(np.int64), 0, h - 1)\n        ymax = np.clip(np.ceil(box[:, 1].max()).astype(np.int64), 0, h - 1)\n\n        mask = np.zeros((ymax - ymin + 1, xmax - xmin + 1), dtype=np.uint8)\n        box[:, 0] = box[:, 0] - xmin\n        box[:, 1] = box[:, 1] - ymin\n        cv2.fillPoly(mask, box.reshape(1, -1, 2).astype(np.int64), 1)\n        return cv2.mean(bitmap[ymin:ymax + 1, xmin:xmax + 1], mask)[0]\n\n    def __call__(self, outs_dict, ratio_list):\n        pred = outs_dict['maps']\n\n        pred = pred[:, 0, :, :]\n        segmentation = pred > self.thresh\n\n        boxes_batch = []\n        for batch_index in range(pred.shape[0]):\n            height, width = pred.shape[-2:]\n\n            mask = cv2.dilate(np.array(segmentation[batch_index]).astype(np.uint8), self.dilation_kernel)\n            tmp_boxes, tmp_scores = self.boxes_from_bitmap(pred[batch_index], mask, width, height)\n\n            boxes = []\n            for k in 
range(len(tmp_boxes)):\n                if tmp_scores[k] > self.box_thresh:\n                    boxes.append(tmp_boxes[k])\n            if len(boxes) > 0:\n                boxes = np.array(boxes)\n\n                ratio_h, ratio_w = ratio_list[batch_index]\n                boxes[:, :, 0] = boxes[:, :, 0] / ratio_w\n                boxes[:, :, 1] = boxes[:, :, 1] / ratio_h\n\n            boxes_batch.append(boxes)\n        return boxes_batch\n\n\ndef draw_boxes(image, boxes, scores=None, drop_score=0.5):\n    img = image.copy()\n    draw = ImageDraw.Draw(img)\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score:\n            continue\n        draw.line([(box[0][0], box[0][1]), (box[1][0], box[1][1])], fill='red')\n        draw.line([(box[1][0], box[1][1]), (box[2][0], box[2][1])], fill='red')\n        draw.line([(box[2][0], box[2][1]), (box[3][0], box[3][1])], fill='red')\n        draw.line([(box[3][0], box[3][1]), (box[0][0], box[0][1])], fill='red')\n        draw.line([(box[0][0] - 1, box[0][1] + 1), (box[1][0] - 1, box[1][1] + 1)], fill='red')\n        draw.line([(box[1][0] - 1, box[1][1] + 1), (box[2][0] - 1, box[2][1] + 1)], fill='red')\n        draw.line([(box[2][0] - 1, box[2][1] + 1), (box[3][0] - 1, box[3][1] + 1)], fill='red')\n        draw.line([(box[3][0] - 1, box[3][1] + 1), (box[0][0] - 1, box[0][1] + 1)], fill='red')\n    return img\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/requirements.txt",
    "content": "shapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_mobile/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"chinese_text_detection_db_mobile\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_detect_text1(self):\n        results = self.module.detect_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[259, 201], [376, 199], [376, 238], [259, 240]], [[282, 163], [351, 163], [351, 200], [282, 200]]])\n\n    def test_detect_text2(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[259, 201], [376, 199], [376, 238], [259, 240]], [[282, 163], [351, 163], [351, 200], [282, 200]]])\n\n    def test_detect_text3(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[259, 201], [376, 199], [376, 238], [259, 240]], 
[[282, 163], [351, 163], [351, 200], [282, 200]]])\n\n    def test_detect_text4(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[259, 201], [376, 199], [376, 238], [259, 240]], [[282, 163], [351, 163], [351, 200], [282, 200]]])\n\n    def test_detect_text5(self):\n        self.assertRaises(AttributeError, self.module.detect_text, images=['tests/test.jpg'])\n\n    def test_detect_text6(self):\n        self.assertRaises(AssertionError, self.module.detect_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/README.md",
    "content": "# chinese_text_detection_db_server\n\n|模型名称|chinese_text_detection_db_server|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|47MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133101419-2d175dc4-2274-404d-b9f3-802f398a6ce4.jpg\" width=\"500\" alt=\"package\" >\n</p>\n\n- ### 模型介绍\n\n  - DB（Differentiable Binarization）是一种基于分割的文本检测算法。此类算法可以更好地处理弯曲等不规则形状文本，因此检测效果往往会更好。但其后处理步骤中将分割结果转化为检测框的流程复杂，耗时严重。DB将二值化阈值加入训练中学习，可以获得更准确的检测边界，从而简化后处理流程。该Module是一个超轻量级文本检测模型，支持直接预测。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133101635-7fb142d3-9056-44da-8201-d931727d3977.png\" width=\"800\" hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：[Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.7.2  \n\n  - paddlehub >= 1.6.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **该Module依赖于第三方库shapely和pyclipper，使用该Module之前，请先安装shapely和pyclipper。**\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese_text_detection_db_server\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run chinese_text_detection_db_server --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    text_detector = 
hub.Module(name=\"chinese_text_detection_db_server\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = text_detector.detect_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = text_detector.detect_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    __init__(enable_mkldnn=False)\n    ```\n    - 构造ChineseTextDetectionDBServer对象\n\n    - **参数**\n\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n  - ```python\n    def detect_text(paths=[],\n                    images=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    box_thresh=0.5,\n                    visualization=False)\n    ```\n    - 预测API，检测输入图片中的所有中文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - visualization (bool): 是否将识别结果保存为图片文件；\n      - output\\_dir (str): 图片的保存路径，默认设为 detection\\_result；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list): 检测文本框结果，文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n        - save_path (str): 识别结果的保存路径, 如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m chinese_text_detection_db_server\n    ```\n\n  - 这样就完成了一个文字检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = 
{'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/chinese_text_detection_db_server\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  支持mkldnn加速CPU计算\n\n* 1.0.2\n\n  增加更多预训练数据，更新预训练参数\n\n* 1.0.3\n\n  移除 fluid api\n\n* 1.1.0\n\n  适配 PaddleHub 2.x 版本\n\n  - ```shell\n    $ hub install chinese_text_detection_db_server==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/module.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport base64\nimport math\nimport os\nimport time\nfrom io import BytesIO\n\nimport cv2\nimport numpy as np\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom PIL import Image\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.log import logger\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    if data is None:\n        buf = BytesIO()\n        image_decode = base64.b64decode(b64str.encode('utf8'))\n        image = BytesIO(image_decode)\n        im = Image.open(image)\n        rgb = im.convert('RGB')\n        rgb.save(buf, 'jpeg')\n        buf.seek(0)\n        image_bytes = buf.read()\n        data_base64 = str(base64.b64encode(image_bytes), encoding=\"utf-8\")\n        image_decode = base64.b64decode(data_base64)\n        img_array = np.frombuffer(image_decode, np.uint8)\n        data = cv2.imdecode(img_array, cv2.IMREAD_COLOR)\n    return data\n\n\n@moduleinfo(\n    name=\"chinese_text_detection_db_server\",\n    version=\"1.1.0\",\n    summary=\n    \"The module aims to detect Chinese text position in the image, which is based on differentiable_binarization algorithm.\",\n    author=\"paddle-dev\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"cv/text_recognition\")\nclass ChineseTextDetectionDBServer:\n\n    def __init__(self, enable_mkldnn=False):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, 'inference_model', 'model')\n        self.enable_mkldnn = enable_mkldnn\n\n        self._set_config()\n\n    def 
check_requirements(self):\n        try:\n            import pyclipper  # noqa: F401\n            import shapely  # noqa: F401\n        except ImportError:\n            raise ImportError(\n                'This module requires the shapely and pyclipper packages. Please install them before using this module.'\n            )\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model_file_path = self.pretrained_model_path + '.pdmodel'\n        params_file_path = self.pretrained_model_path + '.pdiparams'\n\n        config = Config(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except (KeyError, IndexError, ValueError):\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n            if self.enable_mkldnn:\n                config.enable_mkldnn()\n\n        config.disable_glog_info()\n\n        # use zero copy\n        config.delete_pass(\"conv_transpose_eltwiseadd_bn_fuse_pass\")\n        config.switch_use_feed_fetch_ops(False)\n        self.predictor = create_predictor(config)\n        input_names = self.predictor.get_input_names()\n        self.input_tensor = self.predictor.get_input_handle(input_names[0])\n        output_names = self.predictor.get_output_names()\n        self.output_tensors = []\n        for output_name in output_names:\n            output_tensor = self.predictor.get_output_handle(output_name)\n            self.output_tensors.append(output_tensor)\n\n    def read_images(self, paths=[]):\n        images = []\n        for img_path in paths:\n            assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n            img = cv2.imread(img_path)\n            if img is None:\n                logger.warning(\"error in loading image:{}\".format(img_path))\n                continue\n            images.append(img)\n        
return images\n\n    def filter_tag_det_res(self, dt_boxes, image_shape):\n        dt_boxes_new = []\n        for box in dt_boxes:\n            box = self.order_points_clockwise(box)\n            rect_width = int(np.linalg.norm(box[0] - box[1]))\n            rect_height = int(np.linalg.norm(box[0] - box[3]))\n            if rect_width <= 10 or rect_height <= 10:\n                continue\n            dt_boxes_new.append(box)\n        dt_boxes = np.array(dt_boxes_new)\n        return dt_boxes\n\n    def order_points_clockwise(self, pts):\n        \"\"\"\n        reference from: https://github.com/jrosebr1/imutils/blob/master/imutils/perspective.py\n        # sort the points based on their x-coordinates\n        \"\"\"\n        xSorted = pts[np.argsort(pts[:, 0]), :]\n\n        # grab the left-most and right-most points from the sorted\n        # x-coordinate points\n        leftMost = xSorted[:2, :]\n        rightMost = xSorted[2:, :]\n\n        # now, sort the left-most coordinates according to their\n        # y-coordinates so we can grab the top-left and bottom-left\n        # points, respectively\n        leftMost = leftMost[np.argsort(leftMost[:, 1]), :]\n        (tl, bl) = leftMost\n\n        rightMost = rightMost[np.argsort(rightMost[:, 1]), :]\n        (tr, br) = rightMost\n\n        rect = np.array([tl, tr, br, bl], dtype=\"float32\")\n        return rect\n\n    def detect_text(self,\n                    images=[],\n                    paths=[],\n                    use_gpu=False,\n                    output_dir='detection_result',\n                    
visualization=False,\n                    box_thresh=0.5):\n        \"\"\"\n        Get the text box in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. Used when paths is not set.\n            paths (list[str]): The paths of images. Used when images is not set.\n            use_gpu (bool): Whether to use gpu. Default False.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n            box_thresh(float): the threshold of the detected text box's confidence\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        self.check_requirements()\n\n        from chinese_text_detection_db_server.processor import DBPreProcess, DBPostProcess, draw_boxes, get_image_ext\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except (KeyError, IndexError, ValueError):\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES via export CUDA_VISIBLE_DEVICES=cuda_device_id.\"\n                )\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = self.read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is not any image to be predicted. 
Please check the input data.\"\n\n        preprocessor = DBPreProcess()\n        postprocessor = DBPostProcess(box_thresh)\n\n        all_imgs = []\n        all_ratios = []\n        all_results = []\n        for original_image in predicted_data:\n            im, ratio_list = preprocessor(original_image)\n            res = {'save_path': ''}\n            if im is None:\n                res['data'] = []\n\n            else:\n                im = im.copy()\n                starttime = time.time()\n                self.input_tensor.copy_from_cpu(im)\n                self.predictor.run()\n                data_out = self.output_tensors[0].copy_to_cpu()\n                dt_boxes_list = postprocessor(data_out, [ratio_list])\n                boxes = self.filter_tag_det_res(dt_boxes_list[0], original_image.shape)\n                res['data'] = boxes.astype(np.int64).tolist()\n\n                all_imgs.append(im)\n                all_ratios.append(ratio_list)\n                if visualization:\n                    img = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n                    draw_img = draw_boxes(img, boxes)\n                    draw_img = np.array(draw_img)\n                    if not os.path.exists(output_dir):\n                        os.makedirs(output_dir)\n                    ext = get_image_ext(original_image)\n                    saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n                    cv2.imwrite(os.path.join(output_dir, saved_name), draw_img[:, :, ::-1])\n                    res['save_path'] = os.path.join(output_dir, saved_name)\n\n            all_results.append(res)\n\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.detect_text(images=images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, 
argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.detect_text(paths=[args.input_path],\n                                   use_gpu=args.use_gpu,\n                                   output_dir=args.output_dir,\n                                   visualization=args.visualization)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='detection_result',\n                                           help=\"The directory to save output images.\")\n        self.arg_config_group.add_argument('--visualization',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to save output as 
images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, default=None, help=\"path to the input image\")\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/processor.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport cv2\nimport numpy as np\nimport pyclipper\nfrom PIL import ImageDraw\nfrom shapely.geometry import Polygon\n\n\nclass DBPreProcess(object):\n\n    def __init__(self, max_side_len=960):\n        self.max_side_len = max_side_len\n\n    def resize_image_type(self, im):\n        \"\"\"\n        resize image to a size multiple of 32 which is required by the network\n        \"\"\"\n        h, w, _ = im.shape\n\n        resize_w = w\n        resize_h = h\n\n        # limit the max side\n        if max(resize_h, resize_w) > self.max_side_len:\n            if resize_h > resize_w:\n                ratio = float(self.max_side_len) / resize_h\n            else:\n                ratio = float(self.max_side_len) / resize_w\n        else:\n            ratio = 1.\n        resize_h = int(resize_h * ratio)\n        resize_w = int(resize_w * ratio)\n        if resize_h % 32 != 0:\n            resize_h = 32 if resize_h // 32 <= 1 else (resize_h // 32 - 1) * 32\n        if resize_w % 32 != 0:\n            resize_w = 32 if resize_w // 32 <= 1 else (resize_w // 32 - 1) * 32\n        if int(resize_w) <= 0 or int(resize_h) <= 0:\n            return None, (None, None)\n        try:\n            im = cv2.resize(im, (int(resize_w), int(resize_h)))\n        except cv2.error:\n            raise ValueError('Failed to resize image of shape {} to ({}, {}).'.format(im.shape, resize_w, resize_h))\n        ratio_h = resize_h / float(h)\n        ratio_w = resize_w / float(w)\n        return im, (ratio_h, ratio_w)\n\n    def normalize(self, im):\n        img_mean = [0.485, 0.456, 0.406]\n        img_std = [0.229, 0.224, 0.225]\n        im = im.astype(np.float32, copy=False)\n        im = im / 255\n        im -= img_mean\n        im /= 
img_std\n        channel_swap = (2, 0, 1)\n        im = im.transpose(channel_swap)\n        return im\n\n    def __call__(self, im):\n        im, (ratio_h, ratio_w) = self.resize_image_type(im)\n        if im is None:\n            return [None, (ratio_h, ratio_w)]\n        im = self.normalize(im)\n        im = im[np.newaxis, :]\n        return [im, (ratio_h, ratio_w)]\n\n\nclass DBPostProcess(object):\n    \"\"\"\n    The post process for Differentiable Binarization (DB).\n    \"\"\"\n\n    def __init__(self, thresh=0.3, box_thresh=0.5, max_candidates=1000):\n        self.thresh = thresh\n        self.box_thresh = box_thresh\n        self.max_candidates = max_candidates\n        self.min_size = 3\n\n    def boxes_from_bitmap(self, pred, _bitmap, dest_width, dest_height):\n        '''\n        _bitmap: single map with shape (1, H, W),\n                whose values are binarized as {0, 1}\n        '''\n\n        bitmap = _bitmap\n        height, width = bitmap.shape\n\n        outs = cv2.findContours((bitmap * 255).astype(np.uint8), cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)\n        if len(outs) == 3:\n            img, contours, _ = outs[0], outs[1], outs[2]\n        elif len(outs) == 2:\n            contours, _ = outs[0], outs[1]\n\n        num_contours = min(len(contours), self.max_candidates)\n        boxes = np.zeros((num_contours, 4, 2), dtype=np.int64)\n        scores = np.zeros((num_contours, ), dtype=np.float32)\n\n        for index in range(num_contours):\n            contour = contours[index]\n            points, sside = self.get_mini_boxes(contour)\n            if sside < self.min_size:\n                continue\n            points = np.array(points)\n            score = self.box_score_fast(pred, points.reshape(-1, 2))\n            if self.box_thresh > score:\n                continue\n\n            box = self.unclip(points).reshape(-1, 1, 2)\n            box, sside = self.get_mini_boxes(box)\n            if sside < self.min_size + 2:\n                continue\n            box = np.array(box)\n            if not 
isinstance(dest_width, int):\n                dest_width = dest_width.item()\n                dest_height = dest_height.item()\n\n            box[:, 0] = np.clip(np.round(box[:, 0] / width * dest_width), 0, dest_width)\n            box[:, 1] = np.clip(np.round(box[:, 1] / height * dest_height), 0, dest_height)\n            boxes[index, :, :] = box.astype(np.int64)\n            scores[index] = score\n        return boxes, scores\n\n    def unclip(self, box, unclip_ratio=2.0):\n        poly = Polygon(box)\n        distance = poly.area * unclip_ratio / poly.length\n        offset = pyclipper.PyclipperOffset()\n        offset.AddPath(box, pyclipper.JT_ROUND, pyclipper.ET_CLOSEDPOLYGON)\n        expanded = np.array(offset.Execute(distance))\n        return expanded\n\n    def get_mini_boxes(self, contour):\n        bounding_box = cv2.minAreaRect(contour)\n        points = sorted(list(cv2.boxPoints(bounding_box)), key=lambda x: x[0])\n\n        index_1, index_2, index_3, index_4 = 0, 1, 2, 3\n        if points[1][1] > points[0][1]:\n            index_1 = 0\n            index_4 = 1\n        else:\n            index_1 = 1\n            index_4 = 0\n        if points[3][1] > points[2][1]:\n            index_2 = 2\n            index_3 = 3\n        else:\n            index_2 = 3\n            index_3 = 2\n\n        box = [points[index_1], points[index_2], points[index_3], points[index_4]]\n        return box, min(bounding_box[1])\n\n    def box_score_fast(self, bitmap, _box):\n        h, w = bitmap.shape[:2]\n        box = _box.copy()\n        xmin = np.clip(np.floor(box[:, 0].min()).astype(np.int64), 0, w - 1)\n        xmax = np.clip(np.ceil(box[:, 0].max()).astype(np.int64), 0, w - 1)\n        ymin = np.clip(np.floor(box[:, 1].min()).astype(np.int64), 0, h - 1)\n        ymax = np.clip(np.ceil(box[:, 1].max()).astype(np.int64), 0, h - 1)\n\n        mask = np.zeros((ymax - ymin + 1, xmax - xmin + 1), dtype=np.uint8)\n        box[:, 0] = box[:, 0] - xmin\n        box[:, 1] = 
box[:, 1] - ymin\n        cv2.fillPoly(mask, box.reshape(1, -1, 2).astype(np.int64), 1)\n        return cv2.mean(bitmap[ymin:ymax + 1, xmin:xmax + 1], mask)[0]\n\n    def __call__(self, predictions, ratio_list):\n        pred = predictions[:, 0, :, :]\n        segmentation = pred > self.thresh\n\n        boxes_batch = []\n        for batch_index in range(pred.shape[0]):\n            height, width = pred.shape[-2:]\n            tmp_boxes, tmp_scores = self.boxes_from_bitmap(pred[batch_index], segmentation[batch_index], width, height)\n\n            boxes = []\n            for k in range(len(tmp_boxes)):\n                if tmp_scores[k] > self.box_thresh:\n                    boxes.append(tmp_boxes[k])\n            if len(boxes) > 0:\n                boxes = np.array(boxes)\n\n                ratio_h, ratio_w = ratio_list[batch_index]\n                boxes[:, :, 0] = boxes[:, :, 0] / ratio_w\n                boxes[:, :, 1] = boxes[:, :, 1] / ratio_h\n\n            boxes_batch.append(boxes)\n        return boxes_batch\n\n\ndef draw_boxes(image, boxes, scores=None, drop_score=0.5):\n    img = image.copy()\n    draw = ImageDraw.Draw(img)\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score:\n            continue\n        draw.line([(box[0][0], box[0][1]), (box[1][0], box[1][1])], fill='red')\n        draw.line([(box[1][0], box[1][1]), (box[2][0], box[2][1])], fill='red')\n        draw.line([(box[2][0], box[2][1]), (box[3][0], box[3][1])], fill='red')\n        draw.line([(box[3][0], box[3][1]), (box[0][0], box[0][1])], fill='red')\n        draw.line([(box[0][0] - 1, box[0][1] + 1), (box[1][0] - 1, box[1][1] + 1)], fill='red')\n        draw.line([(box[1][0] - 1, box[1][1] + 1), (box[2][0] - 1, box[2][1] + 1)], fill='red')\n        draw.line([(box[2][0] - 1, box[2][1] + 1), (box[3][0] - 1, box[3][1] + 1)], fill='red')\n        draw.line([(box[3][0] - 1, box[3][1] + 1), (box[0][0] - 1, 
box[0][1] + 1)], fill='red')\n    return img\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/requirements.txt",
    "content": "shapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/chinese_text_detection_db_server/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"chinese_text_detection_db_server\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('inference')\n        shutil.rmtree('detection_result')\n\n    def test_detect_text1(self):\n        results = self.module.detect_text(\n            paths=['tests/test.jpg'],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[258, 199], [382, 199], [382, 240], [258, 240]], [[281, 159], [359, 159], [359, 202], [281, 202]]])\n\n    def test_detect_text2(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[258, 199], [382, 199], [382, 240], [258, 240]], [[281, 159], [359, 159], [359, 202], [281, 202]]])\n\n    def test_detect_text3(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=True,\n            visualization=False,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[258, 199], [382, 199], [382, 240], [258, 240]], 
[[281, 159], [359, 159], [359, 202], [281, 202]]])\n\n    def test_detect_text4(self):\n        results = self.module.detect_text(\n            images=[cv2.imread('tests/test.jpg')],\n            use_gpu=False,\n            visualization=True,\n        )\n        self.assertEqual(\n            results[0]['data'],\n            [[[258, 199], [382, 199], [382, 240], [258, 240]], [[281, 159], [359, 159], [359, 202], [281, 202]]])\n\n    def test_detect_text5(self):\n        self.assertRaises(AttributeError, self.module.detect_text, images=['tests/test.jpg'])\n\n    def test_detect_text6(self):\n        self.assertRaises(AssertionError, self.module.detect_text, paths=['no.jpg'])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/cyrillic_ocr_db_crnn_mobile/README.md",
    "content": "# cyrillic_ocr_db_crnn_mobile\n\n|模型名称|cyrillic_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - cyrillic_ocr_db_crnn_mobile Module用于识别图片当中的斯拉夫文，包括俄罗斯文、塞尔维亚文、白俄罗斯文、保加利亚文、乌克兰文、蒙古文、阿迪赫文、阿瓦尔文、达尔瓦文、因古什文、拉克文、莱兹甘文、塔巴萨兰文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的斯拉夫文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别斯拉夫文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install cyrillic_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run cyrillic_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run cyrillic_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"cyrillic_ocr_db_crnn_mobile\", enable_mkldnn=True)       # 
mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造CyrillicOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中所有文本的位置并识别文本内容。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 
第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m cyrillic_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/cyrillic_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install cyrillic_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/cyrillic_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/cyrillic_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"cyrillic_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass CyrillicOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"cyrillic\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, 
C]. Used when paths is not set.\n            paths (list[str]): The paths of images. Used when images is not set.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save image or not.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict: dictionary ``{ input_name: input_value }``, e.g. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): ONNX operator set version. Default 10.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/cyrillic_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/devanagari_ocr_db_crnn_mobile/README.md",
    "content": "# devanagari_ocr_db_crnn_mobile\n\n|模型名称|devanagari_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - devanagari_ocr_db_crnn_mobile Module用于识别图片当中的梵文，包括印地文、马拉地文、尼泊尔文、比尔哈文、迈蒂利文、昂加文、孟加拉文、摩揭陀文、那格浦尔文、尼瓦尔文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的梵文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别梵文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install devanagari_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run devanagari_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run devanagari_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"devanagari_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    
result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造DevanagariOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 
运行启动命令：\n  - ```shell\n    $ hub serving start -m devanagari_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/devanagari_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install devanagari_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/devanagari_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/devanagari_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"devanagari_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass DevanagariOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"devanagari\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, 
W, C], in BGR format. Provide either images or paths.\n            paths (list[str]): The paths of images. Provide either paths or images.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The shape of each input tensor, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version to use for export.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/devanagari_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/french_ocr_db_crnn_mobile/README.md",
    "content": "# french_ocr_db_crnn_mobile\n\n|模型名称|french_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - french_ocr_db_crnn_mobile Module用于识别图片当中的法文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的法文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别法文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install french_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run french_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run french_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"french_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # 
or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造FrechOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m french_ocr_db_crnn_mobile\n    
```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/french_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  优化模型\n  - ```shell\n    $ hub install french_ocr_db_crnn_mobile==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/french_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/french_ocr_db_crnn_mobile/module.py",
"content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"french_ocr_db_crnn_mobile\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass FrenchOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"fr\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Provide either images or paths.\n            paths (list[str]): The paths of images. Provide either paths or images.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The shape of each input tensor, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version to use for export.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/french_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/german_ocr_db_crnn_mobile/README.md",
    "content": "# german_ocr_db_crnn_mobile\n\n|模型名称|german_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|3.8MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/133761772-8c47f25f-0d95-45b4-8075-867dbbd14c86.jpg\"  width=\"80%\" hspace='10'/> <br />\n    </p>\n\n- ### 模型介绍\n\n  - german_ocr_db_crnn_mobile Module用于识别图片当中的德文。其基于chinese_text_detection_db_mobile检测得到的文本框，继续识别文本框中的德文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别德文的轻量级OCR模型，支持直接预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install german_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run german_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"german_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - 构造GenmanOCRDBCRNNMobile对象\n\n    - **参数**\n\n      - 
text_detector_module(str): 文字检测PaddleHub Module名字，如设置为None，则默认使用[chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/)。其作用为检测图片当中的文本。<br/>\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9)\n    ```\n\n    - 预测API，检测输入图片中的所有德文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - box\\_thresh (float): 检测文本框置信度的阈值； <br/>\n      - text\\_thresh (float): 识别德文文本置信度的阈值； <br/>\n      - angle_classification_thresh(float): 文本角度分类置信度的阈值 <br/>\n      - visualization (bool): 是否将识别结果保存为图片文件； <br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n      如果无识别结果则data为\\[\\]\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m german_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def 
cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/german_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  优化模型\n  - ```shell\n    $ hub install german_ocr_db_crnn_mobile==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/german_ocr_db_crnn_mobile/README_en.md",
"content": "# german_ocr_db_crnn_mobile\n\n|Module Name|german_ocr_db_crnn_mobile|\n| :--- | :---: |\n|Category|text recognition|\n|Network|Differentiable Binarization+CRNN|\n|Dataset|icdar2015 dataset|\n|Fine-tuning supported or not|No|\n|Module Size|3.8MB|\n|Latest update date|2021-02-26|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results:\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/133761772-8c47f25f-0d95-45b4-8075-867dbbd14c86.jpg\"  width=\"80%\" hspace='10'/> <br />\n    </p>\n\n- ### Module Introduction\n  - german_ocr_db_crnn_mobile Module is used to recognize German characters in pictures. It first obtains the text boxes detected by [chinese_text_detection_db_mobile Module](), then recognizes the German characters in these text boxes and performs angle classification on them. CRNN (Convolutional Recurrent Neural Network) is adopted as the final recognition algorithm. This Module is an ultra-lightweight German OCR model that supports direct prediction.\n\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0   | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **This Module relies on the third-party libraries, shapely and pyclipper. 
Please install shapely and pyclipper before using this Module.**\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install german_ocr_db_crnn_mobile\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run german_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"german_ocr_db_crnn_mobile\", enable_mkldnn=True)       # MKLDNN acceleration is only available on CPU\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n    - Construct the GermanOCRDBCRNNMobile object\n    - **Parameters**\n      - text_detector_module(str): Name of the text detection PaddleHub Module; if set to None, [chinese_text_detection_db_mobile Module]() will be used by default. It serves to detect the text in the picture.\n      - enable_mkldnn(bool): Whether to enable MKLDNN for CPU computing acceleration. This parameter takes effect only when running on CPU. 
The default is False.\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9)\n    ```\n\n    - Prediction API, detecting the position of all German text in the input image.\n\n    - **Parameter**\n      - paths (list[str]): image path\n      - images (list[numpy.ndarray]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - box_thresh (float): The confidence threshold for text box detection;\n      - text_thresh (float): The confidence threshold for German text recognition;\n      - angle_classification_thresh(float): The confidence threshold for text angle classification\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n    - **Return**\n      - res (list[dict]): The list of recognition results, where each element is a dict and each field is:\n        - data (list[dict]): recognition results, each element in the list is a dict and each field is:\n          - text(str): Recognized texts\n          - confidence(float): The confidence of the results\n          - text_box_position(list): The pixel coordinates of the text box in the original picture, a 4*2 matrix representing the coordinates of the lower left, lower right, upper right and upper left vertices of the text box in turn; data is [] if there's no result\n        - save_path (str, optional): Save path of the result, save_path is '' if no image is saved.\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text recognition.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - 
Run the startup command:\n  - ```shell\n    $ hub serving start -m german_ocr_db_crnn_mobile\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/german_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install german_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/german_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/german_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"german_ocr_db_crnn_mobile\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass GermanOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"german\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Provide either images or paths.\n            paths (list[str]): The paths of images. Provide either paths or images.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The shape of each input tensor, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version to use for export.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/german_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/japan_ocr_db_crnn_mobile/README.md",
    "content": "# japan_ocr_db_crnn_mobile\n\n|模型名称|japan_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|模型大小|8MB|\n|最新更新日期|2021-04-15|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/133761650-91f24c1e-f437-47b1-8cfb-a074e7150ff5.jpg\" width='80%' hspace='10'/> <br />\n    </p>\n\n- ### 模型介绍\n\n  - japan_ocr_db_crnn_mobile Module用于识别图片当中的日文。其基于chinese_text_detection_db_mobile检测得到的文本框，继续识别文本框中的日文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别日文的轻量级OCR模型，支持直接预测。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install japan_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run japan_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"japan_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n\n    - 构造JapanOCRDBCRNNMobile对象\n\n    - **参数**\n\n      - 
text_detector_module(str): 文字检测PaddleHub Module名字，如设置为None，则默认使用[chinese_text_detection_db_mobile Module](../chinese_text_detection_db_mobile/)。其作用为检测图片当中的文本。<br/>\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9)\n    ```\n\n    - 预测API，检测输入图片中的所有日文文本的位置。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径； <br/>\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式； <br/>\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量** <br/>\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result； <br/>\n      - box\\_thresh (float): 检测文本框置信度的阈值； <br/>\n      - text\\_thresh (float): 识别日文文本置信度的阈值； <br/>\n      - angle_classification_thresh(float): 文本角度分类置信度的阈值 <br/>\n      - visualization (bool): 是否将识别结果保存为图片文件。\n\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标\n      如果无识别结果则data为\\[\\]\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m japan_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个目标检测的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    
def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/japan_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  优化模型\n  - ```shell\n    $ hub install japan_ocr_db_crnn_mobile==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/japan_ocr_db_crnn_mobile/README_en.md",
    "content": "# japan_ocr_db_crnn_mobile\n\n|Module Name|japan_ocr_db_crnn_mobile|\n| :--- | :---: |\n|Category|text recognition|\n|Network|Differentiable Binarization+CRNN|\n|Dataset|icdar2015Dataset|\n|Fine-tuning supported or not|No|\n|Module Size|8MB|\n|Latest update date|2021-04-15|\n|Data indicators|-|\n\n\n## I.Basic Information\n\n- ### Application Effect Display\n  - Sample results：\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/133761650-91f24c1e-f437-47b1-8cfb-a074e7150ff5.jpg\" width='80%' hspace='10'/> <br />\n    </p>\n\n- ### Module Introduction\n\n  - japan_ocr_db_crnn_mobile Module is used to identify Japanese characters in pictures. It first obtains the text box detected by [chinese_text_detection_db_mobile Module](), then identifies the Japanese characters and carries out angle classification to these text boxes. CRNN(Convolutional Recurrent Neural Network) is adopted as the final recognition algorithm. This Module is an ultra-lightweight Japanese OCR model that supports direct prediction.\n## II.Installation\n\n- ### 1、Environmental Dependence  \n\n  - paddlepaddle >= 1.8.0  \n\n  - paddlehub >= 1.8.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n  - shapely\n\n  - pyclipper\n\n  - ```shell\n    $ pip install shapely pyclipper\n    ```\n  - **This Module relies on the third-party libraries, shapely and pyclipper. 
Please install shapely and pyclipper before using this Module.**\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install japan_ocr_db_crnn_mobile\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run japan_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    ```\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command Line Instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"japan_ocr_db_crnn_mobile\", enable_mkldnn=True)    # MKLDNN acceleration is only available on CPU\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(text_detector_module=None, enable_mkldnn=False)\n    ```\n    - Construct the JapanOCRDBCRNNMobile object\n    - **Parameters**\n      - text_detector_module(str): Name of text detection module in PaddleHub Module, if set to None, [chinese_text_detection_db_mobile Module]() will be used by default. It serves to detect the text in the picture.\n      - enable_mkldnn(bool): Whether to enable MKLDNN for CPU computing acceleration. This parameter is valid only when the CPU is running. 
The default is False.\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       use_gpu=False,\n                       output_dir='ocr_result',\n                       visualization=False,\n                       box_thresh=0.5,\n                       text_thresh=0.5,\n                       angle_classification_thresh=0.9)\n    ```\n\n    - Prediction API, detecting the position of all Japanese text in the input image.\n    - **Parameter**\n      - paths (list[str]): image path\n      - images (list[numpy.ndarray]): image data, ndarray.shape is in the format [H, W, C], BGR;\n      - use_gpu (bool): use GPU or not; **set the CUDA_VISIBLE_DEVICES environment variable first if you are using GPU**\n      - box_thresh (float): The confidence threshold for text box detection;\n      - text_thresh (float): The confidence threshold for Japanese text recognition;\n      - angle_classification_thresh(float): The confidence threshold for text angle classification\n      - visualization (bool): Whether to save the results as picture files;\n      - output_dir (str): save path of images;\n    - **Return**\n      - res (list[dict]): The list of recognition results, where each element is dict and each field is:\n        - data (list[dict]): recognition results, each element in the list is dict and each field is:\n          - text(str): Recognized texts\n          - confidence(float): The confidence of the results\n          - text_box_position(list): The pixel coordinates of the text box in the original picture, a 4*2 matrix representing the coordinates of the lower left, lower right, upper right and upper left vertices of the text box in turn, data is [] if there's no result\n        - save_path (str, optional): Save path of the result, save_path is '' if no image is saved.\n\n\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text recognition.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - 
Run the startup command:\n  - ```shell\n    $ hub serving start -m japan_ocr_db_crnn_mobile\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # Send an HTTP request\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/japan_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction results\n    print(r.json()[\"results\"])\n    ```\n\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  - ```shell\n    $ hub install japan_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/japan_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/japan_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"japan_ocr_db_crnn_mobile\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass JapanOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"japan\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when paths is not provided.\n            paths (list[str]): The paths of images. Used when images is not provided.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The input shape dictionary, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version used for export. Default is 10.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/japan_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/kannada_ocr_db_crnn_mobile/README.md",
    "content": "# kannada_ocr_db_crnn_mobile\n\n|模型名称|kannada_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - kannada_ocr_db_crnn_mobile Module用于识别图片当中的卡纳达文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的卡纳达文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别卡纳达文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install kannada_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run kannada_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run kannada_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"kannada_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = 
ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造KannadaOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - 
```shell\n    $ hub serving start -m kannada_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/kannada_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install kannada_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/kannada_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/kannada_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"kannada_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass KannadaOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"ka\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when paths is not provided.\n            paths (list[str]): The paths of images. Used when images is not provided.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The input shape dictionary, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version used for export. Default is 10.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/kannada_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/korean_ocr_db_crnn_mobile/README.md",
    "content": "# korean_ocr_db_crnn_mobile\n\n|模型名称|korean_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - korean_ocr_db_crnn_mobile Module用于识别图片当中的韩文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的韩文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别韩文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install french_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run korean_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run korean_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"korean_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # 
or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造KoreanOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个目标检测的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m korean_ocr_db_crnn_mobile\n    
```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/korean_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  优化模型\n  - ```shell\n    $ hub install korean_ocr_db_crnn_mobile==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/korean_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/korean_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"korean_ocr_db_crnn_mobile\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass KoreanOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"korean\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when paths is not provided.\n            paths (list[str]): The paths of images. Used when images is not provided.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): The input shape dictionary, e.g. ``{'x': [-1, 3, -1, -1]}``.\n            opset_version(int): The ONNX opset version used for export. Default is 10.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/korean_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/latin_ocr_db_crnn_mobile/README.md",
    "content": "# latin_ocr_db_crnn_mobile\n\n\n|模型名称|latin_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - latin_ocr_db_crnn_mobile Module用于识别图片当中的拉丁文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的拉丁文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别拉丁文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install latin_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run latin_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run latin_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"latin_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n 
   # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造LatinOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m latin_ocr_db_crnn_mobile\n    ```\n\n  
- 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/latin_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install latin_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/latin_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/latin_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"latin_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass LatinOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"latin\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when paths is not set (the two arguments are mutually exclusive).\n            paths (list[str]): The paths of images. Used when images is not set.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): dictionary ``{input_name: input_shape}``, eg. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): the ONNX opset version of the exported model.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/latin_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/README.md",
    "content": "# multi_languages_ocr_db_crnn\n\n|模型名称|multi_languages_ocr_db_crnn|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-11-24|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133097562-d8c9abd1-6c70-4d93-809f-fa4735764836.png\"  width = \"600\" hspace='10'/> <br />\n</p>\n\n- ### 模型介绍\n\n  - multi_languages_ocr_db_crnn Module用于识别图片当中的文字。其基于PaddleOCR模块，检测得到文本框，识别文本框中的文字，再对检测文本框进行角度分类。最终检测算法采用DB(Differentiable Binarization)，而识别文字算法则采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。\n    该Module不仅提供了通用场景下的中英文模型，也提供了[80个语言](#语种缩写)的小语种模型。\n\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133098254-7c642826-d6d7-4dd0-986e-371622337867.png\" width = \"300\" height = \"450\"  hspace='10'/> <br />\n</p>\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install multi_languages_ocr_db_crnn\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run multi_languages_ocr_db_crnn --input_path \"/PATH/TO/IMAGE\"\n    $ hub run multi_languages_ocr_db_crnn --input_path \"/PATH/TO/IMAGE\" --lang \"ch\"  --det True --rec True --use_angle_cls True  --box_thresh 0.7 
--angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"multi_languages_ocr_db_crnn\", lang='en', enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n  - multi_languages_ocr_db_crnn目前支持80个语种，可以通过修改lang参数进行切换，对于英文模型，指定lang=en，具体支持的[语种](#语种缩写)可查看表格。\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 lang=\"ch\",\n                 det=True, rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造MultiLangOCR对象\n\n    - **参数**\n      - lang(str): 多语言模型选择。默认为中文模型，即lang=\"ch\"。\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中的所有文本的位置和识别文本结果。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 
dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4*2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m multi_languages_ocr_db_crnn\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/multi_languages_ocr_db_crnn\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n<a name=\"语种缩写\"></a>\n## 五、支持语种及缩写\n\n| 语种 | 描述 | 缩写 | | 语种 | 描述 | 缩写 |\n| --- | --- | --- | ---|--- | --- | --- |\n|中文|chinese and english|ch| |保加利亚文|Bulgarian |bg|\n|英文|english|en| |乌克兰文|Ukrainian|uk|\n|法文|french|fr| |白俄罗斯文|Belarusian|be|\n|德文|german|german| |泰卢固文|Telugu |te|\n|日文|japan|japan| | 阿巴扎文 | Abaza        | abq  |\n|韩文|korean|korean| |泰米尔文|Tamil |ta|\n|中文繁体|chinese traditional |chinese_cht| |南非荷兰文 |Afrikaans |af|\n|意大利文| Italian |it| |阿塞拜疆文 |Azerbaijani    |az|\n|西班牙文|Spanish |es| |波斯尼亚文|Bosnian|bs|\n|葡萄牙文| Portuguese|pt| |捷克文|Czech|cs|\n|俄罗斯文|Russian|ru| |威尔士文 |Welsh |cy|\n|阿拉伯文|Arabic|ar| |丹麦文 
|Danish|da|\n|印地文|Hindi|hi| |爱沙尼亚文 |Estonian |et|\n|维吾尔|Uyghur|ug| |爱尔兰文 |Irish |ga|\n|波斯文|Persian|fa| |克罗地亚文|Croatian |hr|\n|乌尔都文|Urdu|ur| |匈牙利文|Hungarian |hu|\n|塞尔维亚文（latin)| Serbian(latin) |rs_latin| |印尼文|Indonesian|id|\n|欧西坦文|Occitan |oc| |冰岛文 |Icelandic|is|\n|马拉地文|Marathi|mr| |库尔德文 |Kurdish|ku|\n|尼泊尔文|Nepali|ne| |立陶宛文|Lithuanian |lt|\n|塞尔维亚文（cyrillic)|Serbian(cyrillic)|rs_cyrillic| |拉脱维亚文 |Latvian |lv|\n|毛利文|Maori|mi| | 达尔瓦文|Dargwa |dar|\n|马来文 |Malay|ms| | 因古什文|Ingush |inh|\n|马耳他文 |Maltese |mt| | 拉克文|Lak |lbe|\n|荷兰文 |Dutch |nl| | 莱兹甘文|Lezghian |lez|\n|挪威文 |Norwegian |no| |塔巴萨兰文 |Tabassaran |tab|\n|波兰文|Polish |pl| | 比尔哈文|Bihari |bh|\n| 罗马尼亚文|Romanian |ro| | 迈蒂利文|Maithili |mai|\n| 斯洛伐克文|Slovak |sk| | 昂加文|Angika |ang|\n| 斯洛文尼亚文|Slovenian |sl| | 博杰普尔文|Bhojpuri |bho|\n| 阿尔巴尼亚文|Albanian |sq| | 摩揭陀文 |Magahi |mah|\n| 瑞典文|Swedish |sv| | 那格浦尔文|Nagpur |sck|\n| 西瓦希里文|Swahili |sw| | 尼瓦尔文|Newari |new|\n| 塔加洛文|Tagalog |tl| | 果阿孔卡尼文 |Goan Konkani|gom|\n| 土耳其文|Turkish |tr| | 沙特阿拉伯文|Saudi Arabia|sa|\n| 乌兹别克文|Uzbek |uz| | 阿瓦尔文|Avar |ava|\n| 越南文|Vietnamese |vi| | | | |\n| 蒙古文|Mongolian |mn| | 阿迪赫文|Adyghe |ady|\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install multi_languages_ocr_db_crnn==1.1.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/module.py",
    "content": "import argparse\nimport os\nimport ast\n\nimport paddle\nimport paddle.static\nimport paddle2onnx\nimport paddle2onnx as p2o\nfrom paddleocr import PaddleOCR\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\nfrom .utils import read_images, save_result_image, mkdir\n\n\n@moduleinfo(\n    name=\"multi_languages_ocr_db_crnn\",\n    version=\"1.1.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass MultiLangOCR:\n    def __init__(self,\n                 lang=\"ch\",\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            lang(str): the selection of languages\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.lang = lang\n        self.logger = get_logger()\n        self.det = det\n        self.rec = rec\n        self.use_angle_cls = use_angle_cls\n        self.engine = PaddleOCR(\n            lang=lang,\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            det_db_box_thresh=box_thresh,\n            
cls_thresh=angle_classification_thresh)\n        self.det_model_dir = self.engine.text_detector.args.det_model_dir\n        self.rec_model_dir = self.engine.text_detector.args.rec_model_dir\n        self.cls_model_dir = self.engine.text_detector.args.cls_model_dir\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. Used when paths is not set (the two arguments are mutually exclusive).\n            paths (list[str]): The paths of images. Used when images is not set.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the recognition results as images.\n        Returns:\n            res (list): The result of text detection box and save path of images.\n        \"\"\"\n\n        if images != [] and isinstance(images, list) and paths == []:\n            predicted_data = images\n        elif images == [] and isinstance(paths, list) and paths != []:\n            predicted_data = read_images(paths)\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        assert predicted_data != [], \"There is no image to be predicted. 
Please check the input data.\"\n        all_results = []\n        for img in predicted_data:\n            result = {'save_path': ''}\n            if img is None:\n                result['data'] = []\n                all_results.append(result)\n                continue\n            original_image = img.copy()\n            rec_results = self.engine.ocr(img, det=self.det, rec=self.rec, cls=self.use_angle_cls)\n            rec_res_final = []\n            for line in rec_results:\n                if self.det and self.rec:\n                    boxes = line[0]\n                    text, score = line[1]\n                    rec_res_final.append({'text': text, 'confidence': float(score), 'text_box_position': boxes})\n                elif self.det and not self.rec:\n                    boxes = line\n                    rec_res_final.append({'text_box_position': boxes})\n                else:\n                    if self.use_angle_cls and not self.rec:\n                        orientation, score = line\n                        rec_res_final.append({'orientation': orientation, 'score': float(score)})\n                    else:\n                        text, score = line\n                        rec_res_final.append({'text': text, 'confidence': float(score)})\n\n            result['data'] = rec_res_final\n            if visualization and result['data']:\n                result['save_path'] = save_result_image(original_image, rec_results, output_dir, self.directory,\n                                                        self.lang, self.det, self.rec, self.logger)\n\n            all_results.append(result)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n       
 Run as a command.\n        \"\"\"\n        parser = self.arg_parser()\n        args = parser.parse_args(argvs)\n        if args.lang is not None:\n            self.lang = args.lang\n        self.det = args.det\n        self.rec = args.rec\n        self.use_angle_cls = args.use_angle_cls\n        self.engine = PaddleOCR(\n            lang=self.lang,\n            det=args.det,\n            rec=args.rec,\n            use_angle_cls=args.use_angle_cls,\n            enable_mkldnn=args.enable_mkldnn,\n            use_gpu=args.use_gpu,\n            det_db_box_thresh=args.box_thresh,\n            cls_thresh=args.angle_classification_thresh)\n        results = self.recognize_text(\n            paths=[args.input_path], output_dir=args.output_dir, visualization=args.visualization)\n        return results\n\n    def arg_parser(self):\n        parser = argparse.ArgumentParser(\n            description=\"Run the %s module.\" % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        parser.add_argument('--input_path', type=str, default=None, help=\"path to the input image. 
Required.\", required=True)\n        parser.add_argument('--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU or not\")\n        parser.add_argument('--output_dir', type=str, default='ocr_result', help=\"The directory to save output images.\")\n        parser.add_argument(\n            '--visualization', type=ast.literal_eval, default=False, help=\"whether to save output as images.\")\n        parser.add_argument('--lang', type=str, default=None, help=\"the selection of languages\")\n        parser.add_argument('--det', type=ast.literal_eval, default=True, help=\"whether to use the text detector or not\")\n        parser.add_argument('--rec', type=ast.literal_eval, default=True, help=\"whether to use the text recognizer or not\")\n        parser.add_argument(\n            '--use_angle_cls', type=ast.literal_eval, default=False, help=\"whether to use the text orientation classifier or not\")\n        parser.add_argument('--enable_mkldnn', type=ast.literal_eval, default=False, help=\"whether to use mkldnn or not\")\n        parser.add_argument(\n            \"--box_thresh\", type=float, default=0.6, help=\"set the threshold of the detected text box's confidence\")\n        parser.add_argument(\n            \"--angle_classification_thresh\",\n            type=float,\n            default=0.9,\n            help=\"set the threshold of the angle classification confidence\")\n\n        return parser\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict: dictionary ``{ input_name: input_value }, eg. 
{'x': [-1, 3, -1, -1]}``\n            opset_version(int): the ONNX opset version of the exported model.\n        '''\n        v0, v1, v2 = paddle2onnx.__version__.split('.')\n        if int(v0) == 0 and int(v1) < 9:\n            raise ImportError(\"paddle2onnx>=0.9.0 is required\")\n\n        if input_shape_dict is not None and not isinstance(input_shape_dict, dict):\n            raise Exception(\"input_shape_dict should be dict, eg. {'x': [-1, 3, -1, -1]}.\")\n\n        if opset_version <= 9:\n            raise Exception(\"opset_version <= 9 is not supported, please try with a higher opset_version >= 10.\")\n\n        path_dict = {\"det\": self.det_model_dir, \"rec\": self.rec_model_dir, \"cls\": self.cls_model_dir}\n        for (key, path) in path_dict.items():\n            save_file = os.path.join(dirname, '{}_{}.onnx'.format(self.name, key))\n\n            exe = paddle.static.Executor(paddle.CPUPlace())\n            [program, feed_var_names, fetch_vars] = paddle.static.load_inference_model(\n                    os.path.join(path, 'inference'), exe)\n\n            onnx_proto = p2o.run_convert(program, input_shape_dict=input_shape_dict, opset_version=opset_version)\n            mkdir(save_file)\n            with open(save_file, \"wb\") as f:\n                f.write(onnx_proto.SerializeToString())\n"
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport cv2\nimport requests\nimport paddlehub as hub\n\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n    @classmethod\n    def setUpClass(cls) -> None:\n        img_url = 'https://unsplash.com/photos/KTzZVDjUsXw/download?ixid=MnwxMjA3fDB8MXxzZWFyY2h8MzM3fHx0ZXh0fGVufDB8fHx8MTY2MzUxMTExMQ&force=true&w=640'\n        if not os.path.exists('tests'):\n            os.makedirs('tests')\n        response = requests.get(img_url)\n        assert response.status_code == 200, 'Network Error.'\n        with open('tests/test.jpg', 'wb') as f:\n            f.write(response.content)\n        cls.module = hub.Module(name=\"multi_languages_ocr_db_crnn\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('tests')\n        shutil.rmtree('onnx')\n        shutil.rmtree('ocr_result')\n\n    def test_recognize_text1(self):\n        results = self.module.recognize_text(\n            paths=['tests/test.jpg'],\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [\n        {\n            'text': 'GIVE.', 'confidence': 0.9509806632995605, \n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, \n        {\n            'text': 'THANKS', 'confidence': 0.9943129420280457, \n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text2(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=False,\n        )\n        self.assertEqual(results[0]['data'], [\n        {\n            'text': 'GIVE.', 'confidence': 0.9509806632995605, \n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, \n        {\n            'text': 'THANKS', 'confidence': 0.9943129420280457, \n            'text_box_position': [[261, 202], 
[376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text3(self):\n        results = self.module.recognize_text(\n            images=[cv2.imread('tests/test.jpg')],\n            visualization=True,\n        )\n        self.assertEqual(results[0]['data'], [\n        {\n            'text': 'GIVE.', 'confidence': 0.9509806632995605, \n            'text_box_position': [[283, 162], [352, 162], [352, 202], [283, 202]]\n        }, \n        {\n            'text': 'THANKS', 'confidence': 0.9943129420280457, \n            'text_box_position': [[261, 202], [376, 202], [376, 239], [261, 239]]\n        }])\n\n    def test_recognize_text4(self):\n        self.assertRaises(\n            AttributeError,\n            self.module.recognize_text,\n            images=['tests/test.jpg']\n        )\n\n    def test_recognize_text5(self):\n        self.assertRaises(\n            AssertionError,\n            self.module.recognize_text,\n            paths=['no.jpg']\n        )\n\n    def test_export_onnx_model(self):\n        self.module.export_onnx_model(dirname='onnx', input_shape_dict=None, opset_version=10)\n        self.assertTrue(os.path.isfile('onnx/multi_languages_ocr_db_crnn_cls.onnx'))\n        self.assertTrue(os.path.isfile('onnx/multi_languages_ocr_db_crnn_det.onnx'))\n        self.assertTrue(os.path.isfile('onnx/multi_languages_ocr_db_crnn_rec.onnx'))\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_recognition/multi_languages_ocr_db_crnn/utils.py",
    "content": "import os\nimport time\n\nimport cv2\nimport numpy as np\nfrom PIL import Image, ImageDraw\n\nfrom paddleocr import draw_ocr\n\n\ndef save_result_image(original_image,\n                      rec_results,\n                      output_dir='ocr_result',\n                      directory=None,\n                      lang='ch',\n                      det=True,\n                      rec=True,\n                      logger=None):\n    image = Image.fromarray(cv2.cvtColor(original_image, cv2.COLOR_BGR2RGB))\n    if det and rec:\n        boxes = [line[0] for line in rec_results]\n        txts = [line[1][0] for line in rec_results]\n        scores = [line[1][1] for line in rec_results]\n        fonts_lang = 'fonts/simfang.ttf'\n        lang_fonts = {\n            'korean': 'korean',\n            'fr': 'french',\n            'german': 'german',\n            'hi': 'hindi',\n            'ne': 'nepali',\n            'fa': 'persian',\n            'es': 'spanish',\n            'ta': 'tamil',\n            'te': 'telugu',\n            'ur': 'urdu',\n            'ug': 'uyghur',\n        }\n        if lang in lang_fonts.keys():\n            fonts_lang = 'fonts/' + lang_fonts[lang] + '.ttf'\n        font_file = os.path.join(directory, 'assets', fonts_lang)\n        im_show = draw_ocr(image, boxes, txts, scores, font_path=font_file)\n    elif det and not rec:\n        boxes = rec_results\n        im_show = draw_boxes(image, boxes)\n        im_show = np.array(im_show)\n    else:\n        logger.warning(\"Visualization is only supported when detection is enabled; cls-only or rec-only results cannot be visualized.\")\n        return \"\"\n\n    if not os.path.exists(output_dir):\n        os.makedirs(output_dir)\n\n    ext = get_image_ext(original_image)\n    saved_name = 'ndarray_{}{}'.format(time.time(), ext)\n    save_file_path = os.path.join(output_dir, saved_name)\n    im_show = Image.fromarray(im_show)\n    im_show.save(save_file_path)\n    return save_file_path\n\n\ndef read_images(paths=[]):\n    images = []\n    for img_path in 
paths:\n        assert os.path.isfile(img_path), \"The {} isn't a valid file.\".format(img_path)\n        img = cv2.imread(img_path)\n        if img is None:\n            continue\n        images.append(img)\n    return images\n\n\ndef draw_boxes(image, boxes, scores=None, drop_score=0.5):\n    img = image.copy()\n    draw = ImageDraw.Draw(img)\n    if scores is None:\n        scores = [1] * len(boxes)\n    for (box, score) in zip(boxes, scores):\n        if score < drop_score:\n            continue\n        draw.line([(box[0][0], box[0][1]), (box[1][0], box[1][1])], fill='red')\n        draw.line([(box[1][0], box[1][1]), (box[2][0], box[2][1])], fill='red')\n        draw.line([(box[2][0], box[2][1]), (box[3][0], box[3][1])], fill='red')\n        draw.line([(box[3][0], box[3][1]), (box[0][0], box[0][1])], fill='red')\n        draw.line([(box[0][0] - 1, box[0][1] + 1), (box[1][0] - 1, box[1][1] + 1)], fill='red')\n        draw.line([(box[1][0] - 1, box[1][1] + 1), (box[2][0] - 1, box[2][1] + 1)], fill='red')\n        draw.line([(box[2][0] - 1, box[2][1] + 1), (box[3][0] - 1, box[3][1] + 1)], fill='red')\n        draw.line([(box[3][0] - 1, box[3][1] + 1), (box[0][0] - 1, box[0][1] + 1)], fill='red')\n    return img\n\n\ndef get_image_ext(image):\n    if image.shape[2] == 4:\n        return \".png\"\n    return \".jpg\"\n\n\ndef mkdir(path):\n    sub_dir = os.path.dirname(path)\n    if not os.path.exists(sub_dir):\n        os.makedirs(sub_dir)\n"
  },
  {
    "path": "modules/image/text_recognition/tamil_ocr_db_crnn_mobile/README.md",
    "content": "# tamil_ocr_db_crnn_mobile\n\n|模型名称|tamil_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - tamil_ocr_db_crnn_mobile Module用于识别图片当中的泰米尔文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的泰米尔文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别泰米尔文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install tamil_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run tamil_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run tamil_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"tamil_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n    # 
or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造TamilOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中所有文本的位置，并识别文本内容。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4×2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m tamil_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/tamil_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install tamil_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/tamil_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/tamil_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"tamil_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass TamilOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"ta\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when `paths` is not provided.\n            paths (list[str]): The paths of images. Used when `images` is not provided.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the visualized result as an image.\n        Returns:\n            res (list): The result of text detection boxes and the save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): dictionary of input shapes, e.g. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): ONNX operator set version.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/tamil_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_recognition/telugu_ocr_db_crnn_mobile/README.md",
    "content": "# telugu_ocr_db_crnn_mobile\n\n|模型名称|telugu_ocr_db_crnn_mobile|\n| :--- | :---: |\n|类别|图像-文字识别|\n|网络|Differentiable Binarization+CRNN|\n|数据集|icdar2015数据集|\n|是否支持Fine-tuning|否|\n|最新更新日期|2021-12-2|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - telugu_ocr_db_crnn_mobile Module用于识别图片当中的泰卢固文。其基于multi_languages_ocr_db_crnn检测得到的文本框，继续识别文本框中的泰卢固文文字。最终识别文字算法采用CRNN（Convolutional Recurrent Neural Network）即卷积递归神经网络。其是DCNN和RNN的组合，专门用于识别图像中的序列式对象。与CTC loss配合使用，进行文字识别，可以直接从文本词级或行级的标注中学习，不需要详细的字符级的标注。该Module是一个识别泰卢固文的轻量级OCR模型，支持直接预测。\n\n  - 更多详情参考：\n    - [Real-time Scene Text Detection with Differentiable Binarization](https://arxiv.org/pdf/1911.08947.pdf)\n    - [An end-to-end trainable neural network for image-based sequence recognition and its application to scene text recognition](https://arxiv.org/pdf/1507.05717.pdf)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.0.2  \n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install telugu_ocr_db_crnn_mobile\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run telugu_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\"\n    $ hub run telugu_ocr_db_crnn_mobile --input_path \"/PATH/TO/IMAGE\" --det True --rec True --use_angle_cls True  --box_thresh 0.7 --angle_classification_thresh 0.8 --visualization True\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    import cv2\n\n    ocr = hub.Module(name=\"telugu_ocr_db_crnn_mobile\", enable_mkldnn=True)       # mkldnn加速仅在CPU下有效\n    result = ocr.recognize_text(images=[cv2.imread('/PATH/TO/IMAGE')])\n\n   
 # or\n    # result = ocr.recognize_text(paths=['/PATH/TO/IMAGE'])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,  \n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9)\n    ```\n\n    - 构造TeluguOCRDBCRNNMobile对象\n\n    - **参数**\n      - det(bool): 是否开启文字检测。默认为True。\n      - rec(bool): 是否开启文字识别。默认为True。\n      - use_angle_cls(bool): 是否开启方向分类, 用于设置使用方向分类器识别180度旋转文字。默认为False。\n      - enable_mkldnn(bool): 是否开启mkldnn加速CPU计算。该参数仅在CPU运行下设置有效。默认为False。\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**\n      - box\\_thresh (float): 检测文本框置信度的阈值；\n      - angle_classification_thresh(float): 文本方向分类置信度的阈值\n\n\n  - ```python\n    def recognize_text(images=[],\n                       paths=[],\n                       output_dir='ocr_result',\n                       visualization=False)\n    ```\n\n    - 预测API，检测输入图片中所有文本的位置，并识别文本内容。\n\n    - **参数**\n\n      - paths (list\\[str\\]): 图片的路径；\n      - images (list\\[numpy.ndarray\\]): 图片数据，ndarray.shape 为 \\[H, W, C\\]，BGR格式；\n      - output\\_dir (str): 图片的保存路径，默认设为 ocr\\_result；\n      - visualization (bool): 是否将识别结果保存为图片文件, 仅有检测开启时有效, 默认为False；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，各字段为：\n        - data (list\\[dict\\]): 识别文本结果，列表中每一个元素为 dict，各字段为：\n          - text(str): 识别得到的文本\n          - confidence(float): 识别文本结果置信度\n          - text_box_position(list): 文本框在原图中的像素坐标，4×2的矩阵，依次表示文本框左下、右下、右上、左上顶点的坐标，如果无识别结果则data为\\[\\]\n          - orientation(str): 分类的方向，仅在只有方向分类开启时输出\n          - score(float): 分类的得分，仅在只有方向分类开启时输出\n        - save_path (str, optional): 识别结果的保存路径，如不保存图片则save_path为''\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文字识别的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m telugu_ocr_db_crnn_mobile\n    ```\n\n  - 这样就完成了一个文字识别的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n\n    def cv2_to_base64(image):\n        data = cv2.imencode('.jpg', image)[1]\n        return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'images':[cv2_to_base64(cv2.imread(\"/PATH/TO/IMAGE\"))]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/telugu_ocr_db_crnn_mobile\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  - ```shell\n    $ hub install telugu_ocr_db_crnn_mobile==1.0.0\n    ```\n"
  },
  {
    "path": "modules/image/text_recognition/telugu_ocr_db_crnn_mobile/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_recognition/telugu_ocr_db_crnn_mobile/module.py",
    "content": "import paddlehub as hub\nfrom paddleocr.ppocr.utils.logging import get_logger\nfrom paddleocr.tools.infer.utility import base64_to_cv2\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\n@moduleinfo(\n    name=\"telugu_ocr_db_crnn_mobile\",\n    version=\"1.0.0\",\n    summary=\"ocr service\",\n    author=\"PaddlePaddle\",\n    type=\"cv/text_recognition\")\nclass TeluguOCRDBCRNNMobile:\n    def __init__(self,\n                 det=True,\n                 rec=True,\n                 use_angle_cls=False,\n                 enable_mkldnn=False,\n                 use_gpu=False,\n                 box_thresh=0.6,\n                 angle_classification_thresh=0.9):\n        \"\"\"\n        initialize with the necessary elements\n        Args:\n            det(bool): Whether to use text detector.\n            rec(bool): Whether to use text recognizer.\n            use_angle_cls(bool): Whether to use text orientation classifier.\n            enable_mkldnn(bool): Whether to enable mkldnn.\n            use_gpu (bool): Whether to use gpu.\n            box_thresh(float): the threshold of the detected text box's confidence\n            angle_classification_thresh(float): the threshold of the angle classification confidence\n        \"\"\"\n        self.logger = get_logger()\n        self.model = hub.Module(\n            name=\"multi_languages_ocr_db_crnn\",\n            lang=\"te\",\n            det=det,\n            rec=rec,\n            use_angle_cls=use_angle_cls,\n            enable_mkldnn=enable_mkldnn,\n            use_gpu=use_gpu,\n            box_thresh=box_thresh,\n            angle_classification_thresh=angle_classification_thresh)\n        self.model.name = self.name\n\n    def recognize_text(self, images=[], paths=[], output_dir='ocr_result', visualization=False):\n        \"\"\"\n        Get the text in the predicted images.\n        Args:\n            images (list(numpy.ndarray)): images data, shape of each is [H, W, C]. 
Used when `paths` is not provided.\n            paths (list[str]): The paths of images. Used when `images` is not provided.\n            output_dir (str): The directory to store output images.\n            visualization (bool): Whether to save the visualized result as an image.\n        Returns:\n            res (list): The result of text detection boxes and the save path of images.\n        \"\"\"\n        all_results = self.model.recognize_text(\n            images=images, paths=paths, output_dir=output_dir, visualization=visualization)\n        return all_results\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.recognize_text(images_decode, **kwargs)\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        results = self.model.run_cmd(argvs)\n        return results\n\n    def export_onnx_model(self, dirname: str, input_shape_dict=None, opset_version=10):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            input_shape_dict(dict): dictionary of input shapes, e.g. ``{'x': [-1, 3, -1, -1]}``\n            opset_version(int): ONNX operator set version.\n        '''\n        self.model.export_onnx_model(dirname=dirname, input_shape_dict=input_shape_dict, opset_version=opset_version)\n"
  },
  {
    "path": "modules/image/text_recognition/telugu_ocr_db_crnn_mobile/requirements.txt",
    "content": "paddleocr>=2.3.0.2\npaddle2onnx>=0.9.0\nshapely\npyclipper\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/README.md",
    "content": "# disco_diffusion_clip_rn101\n\n|模型名称|disco_diffusion_clip_rn101|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|dd+clip ResNet101|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|2.9GB|\n|最新更新日期|2022-08-02|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184836003-96d82c9e-b43e-4eb5-a237-bb22972d1848.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184836085-beef3562-f89d-405e-ae04-2d3cd8499df8.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\ndisco_diffusion_clip_rn101 是一个文图生成模型，可以通过输入一段文字来生成符合该句子语义的图像。该模型由两部分组成，一部分是扩散模型，是一种生成模型，可以从噪声输入中重建出原始图像。另一部分是多模态预训练模型（CLIP）, 可以将文本和图像表示在同一个特征空间，相近语义的文本和图像在该特征空间里距离会更相近。在该文图生成模型中，扩散模型负责从初始噪声或者指定初始图像中来生成目标图像，CLIP负责引导生成图像的语义和输入的文本的语义尽可能接近，随着扩散模型在CLIP的引导下不断的迭代生成新图像，最终能够生成文本所描述内容的图像。该模块中使用的CLIP模型结构为ResNet101。\n\n更多详情请参考论文：[Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) 以及 [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install disco_diffusion_clip_rn101\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run disco_diffusion_clip_rn101 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a 
tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_rn101_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_rn101\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # 生成图像, 默认会在disco_diffusion_clip_rn101_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_rn101_out/')  \n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_rn101_out-result.png')\n    # 展示所有的中间结果\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn101_out-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_rn101_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\"。prompt的构造可以参考[网站](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#)。\n      - style(Optional[str]): 指定绘画的风格，如'watercolor','Chinese painting'等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如Greg 
Rutkowski、krenz，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"disco_diffusion_clip_rn101_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象，包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_rn101\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果。返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_rn101\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_rn101_out-result.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn101_out-result.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install disco_diffusion_clip_rn101==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/README_en.md",
    "content": "# disco_diffusion_clip_rn101\n\n|Module Name|disco_diffusion_clip_rn101|\n| :--- | :---: |\n|Category|text to image|\n|Network|dd+clip ResNet101|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|2.9GB|\n|Latest update date|2022-08-02|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n  - Prompt \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - Output image\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184836003-96d82c9e-b43e-4eb5-a237-bb22972d1848.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - Generating process\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184836085-beef3562-f89d-405e-ae04-2d3cd8499df8.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### Module Introduction\n\ndisco_diffusion_clip_rn101 is a text-to-image generation model that can generate images that match the semantics of the sentence you prompt. The model consists of two parts, one is the diffusion model, which is a generative model that reconstructs the original image from the noisy input. The other part is the multimodal pre-training model (CLIP), which can represent text and images in the same feature space, and text and images with similar semantics will be closer in this feature space. In the text image generation model, the diffusion model is responsible for generating the target image from the initial noise or the specified initial image, and CLIP is responsible for guiding the generated image to be as close as possible to the semantics of the input text. Diffusion model under the guidance of CLIP iteratively generates new images, eventually generating images of what the text describes. 
The CLIP model used in this module is ResNet101.\n\nFor more details, please refer to [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) and [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install disco_diffusion_clip_rn101\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction  \n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run disco_diffusion_clip_rn101 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_rn101_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_rn101\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # Output images will be saved in disco_diffusion_clip_rn101_out directory.\n    # The returned da is a DocumentArray object, which contains all immediate and final results\n    # You can manipulate the DocumentArray object to do post-processing and save images\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_rn101_out/')  \n    # Save final result image to a file\n    
da[0].save_uri_to_file('disco_diffusion_clip_rn101_out-result.png')\n    # Show all immediate results\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn101_out-result.gif')\n    ```\n\n- ### 3.API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_rn101_out'):\n    ```\n\n    - Image generating API, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt that conforms to the format \"content\" + \"artist/style\", such as \"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as Greg Rutkowski or krenz; the image will be generated in the chosen artist's style. If not provided, style is totally up to your prompt; see these [artist studies](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/) for reference.\n      - width_height(Optional[List[int]]): The width and height of output images; both should be multiples of 64. The larger the size, the longer the computation time.\n      - seed(Optional[int]): Random seed, different seeds result in different output images.\n      - output_dir(Optional[str]): Output directory, default is \"disco_diffusion_clip_rn101_out\".\n\n\n    - **Return**\n      - da(DocumentArray): DocumentArray object, including `n_batches` Documents, each document keeps all immediate results during generation; please refer to the [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text-to-image.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_rn101\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_rn101\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_clip_rn101_out-result.png')\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn101_out-result.gif')\n    ```\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install disco_diffusion_clip_rn101==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). It is copied here for use with guided diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    #k = k.con\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        #assert str(attn_mask.dtype) == 'VarType.FP32' and attn_mask.ndim == 3\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # a single Tensor, not a tuple\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n         
   self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n                                            output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n\n        x = x + self.positional_embedding\n\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = 
x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = self.encode_text(text)\n\n        # normalized features\n        image_features = image_features / image_features.norm(dim=-1, keepdim=True)\n        text_features = text_features / text_features.norm(dim=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['RN50', 'RN101', 'VIT32']\n\nURL = {\n    'RN50': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN50.pdparams'),\n    'RN101': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN101.pdparams'),\n    'VIT32': os.path.join(os.path.dirname(__file__), 'pre_trained', 'ViT-B-32.pdparams')\n}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = 
paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n            raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='RN101'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'RN101': build_rn101_model, 'VIT32': build_vit_model, 'RN50': build_rn50_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    model.load_dict(sd)\n    model.eval()\n    return model\n\n\ndef build_vit_model():\n\n    model = CLIP(embed_dim=512,\n                 image_resolution=224,\n                 vision_layers=12,\n                 vision_width=768,\n                 vision_patch_size=32,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n\n\ndef build_rn101_model():\n    model = CLIP(\n        embed_dim=512,\n        image_resolution=224,\n        vision_layers=(3, 4, 23, 3),\n        vision_width=64,\n        vision_patch_size=0,  #Not used in resnet\n        context_length=77,\n        vocab_size=49408,\n        transformer_width=512,\n        transformer_heads=8,\n        transformer_layers=12)\n    return model\n\n\ndef build_rn50_model():\n    model = CLIP(embed_dim=1024,\n                 image_resolution=224,\n                 vision_layers=(3, 4, 6, 3),\n                 vision_width=64,\n                 vision_patch_size=None,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport disco_diffusion_clip_rn101.clip as clip\nimport disco_diffusion_clip_rn101.resize_right as resize_right\nimport paddle\nfrom disco_diffusion_clip_rn101.reverse_diffusion import create\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"disco_diffusion_clip_rn101\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass DiscoDiffusionClip:\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       init_image: Optional[str] = None,\n                       width_height: Optional[List[int]] = [1280, 768],\n                       skip_steps: Optional[int] = 0,\n                       steps: Optional[int] = 250,\n                       cut_ic_pow: Optional[int] = 1,\n                       init_scale: Optional[int] = 1000,\n                       clip_guidance_scale: Optional[int] = 
5000,\n                       tv_scale: Optional[int] = 0,\n                       range_scale: Optional[int] = 0,\n                       sat_scale: Optional[int] = 0,\n                       cutn_batches: Optional[int] = 4,\n                       diffusion_sampling_mode: Optional[str] = 'ddim',\n                       perlin_init: Optional[bool] = False,\n                       perlin_mode: Optional[str] = 'mixed',\n                       seed: Optional[int] = None,\n                       eta: Optional[float] = 0.8,\n                       clamp_grad: Optional[bool] = True,\n                       clamp_max: Optional[float] = 0.05,\n                       randomize_class: Optional[bool] = True,\n                       clip_denoised: Optional[bool] = False,\n                       fuzzy_prompt: Optional[bool] = False,\n                       rand_mag: Optional[float] = 0.05,\n                       cut_overview: Optional[str] = '[12]*400+[4]*600',\n                       cut_innercut: Optional[str] = '[4]*400+[12]*600',\n                       cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n                       display_rate: Optional[int] = 10,\n                       n_batches: Optional[int] = 1,\n                       batch_size: Optional[int] = 1,\n                       batch_name: Optional[str] = '',\n                       use_gpu: Optional[bool] = True,\n                       output_dir: Optional[str] = 'disco_diffusion_clip_rn101_out'):\n        \"\"\"\n        Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings, if specified, style will be used to construct prompts.\n        :param artist: Artist style, if specified, style will be used to construct prompts.\n        :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. 
The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n        :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n        :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n        :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n        :param tv_scale: Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n        :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n        :param sat_scale: Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n        :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. 
However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n        :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n        :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  
Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n        :param perlin_mode: Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n        :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n        :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\n        :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. 
If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n        :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n        :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n        :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n        :param cut_overview: The schedule of overview cuts\n        :param cut_innercut: The schedule of inner cuts\n        :param cut_icgray_p: The schedule of the fraction of inner cuts that are converted to grayscale. Grayscale cuts emphasize structure and luminance over color, which can help overall coherence.\n        :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n        :param n_batches: This variable sets the number of still images you want DD to create.  
If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n        :param batch_name: The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artwork being overwritten by other users, please use a unique name.\n        :param use_gpu: Whether to use GPU or not.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n        elif isinstance(text_prompts, list):\n            text_prompts[0] = text_prompts[0].rstrip(',.，。')\n            if style is not None:\n                text_prompts[0] += \",{}\".format(style)\n            if artist is not None:\n                text_prompts[0] += \",{},trending on artstation\".format(artist)\n\n        return create(text_prompts=text_prompts,\n                      init_image=init_image,\n                      width_height=width_height,\n                      skip_steps=skip_steps,\n                      steps=steps,\n            
          cut_ic_pow=cut_ic_pow,\n                      init_scale=init_scale,\n                      clip_guidance_scale=clip_guidance_scale,\n                      tv_scale=tv_scale,\n                      range_scale=range_scale,\n                      sat_scale=sat_scale,\n                      cutn_batches=cutn_batches,\n                      diffusion_sampling_mode=diffusion_sampling_mode,\n                      perlin_init=perlin_init,\n                      perlin_mode=perlin_mode,\n                      seed=seed,\n                      eta=eta,\n                      clamp_grad=clamp_grad,\n                      clamp_max=clamp_max,\n                      randomize_class=randomize_class,\n                      clip_denoised=clip_denoised,\n                      fuzzy_prompt=fuzzy_prompt,\n                      rand_mag=rand_mag,\n                      cut_overview=cut_overview,\n                      cut_innercut=cut_innercut,\n                      cut_icgray_p=cut_icgray_p,\n                      display_rate=display_rate,\n                      n_batches=n_batches,\n                      batch_size=batch_size,\n                      batch_name=batch_name,\n                      clip_models=['RN101'],\n                      output_dir=output_dir)\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = 
self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      init_image=args.init_image,\n                                      width_height=args.width_height,\n                                      skip_steps=args.skip_steps,\n                                      steps=args.steps,\n                                      cut_ic_pow=args.cut_ic_pow,\n                                      init_scale=args.init_scale,\n                                      clip_guidance_scale=args.clip_guidance_scale,\n                                      tv_scale=args.tv_scale,\n                                      range_scale=args.range_scale,\n                                      sat_scale=args.sat_scale,\n                                      cutn_batches=args.cutn_batches,\n                                      diffusion_sampling_mode=args.diffusion_sampling_mode,\n                                      perlin_init=args.perlin_init,\n                                      perlin_mode=args.perlin_mode,\n                                      seed=args.seed,\n                                      eta=args.eta,\n                                      clamp_grad=args.clamp_grad,\n                                      clamp_max=args.clamp_max,\n                                      randomize_class=args.randomize_class,\n                                      clip_denoised=args.clip_denoised,\n                                      
fuzzy_prompt=args.fuzzy_prompt,\n                                      rand_mag=args.rand_mag,\n                                      cut_overview=args.cut_overview,\n                                      cut_innercut=args.cut_innercut,\n                                      cut_icgray_p=args.cut_icgray_p,\n                                      display_rate=args.display_rate,\n                                      n_batches=args.n_batches,\n                                      batch_size=args.batch_size,\n                                      batch_name=args.batch_name,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--skip_steps',\n            type=int,\n            default=0,\n            help=\n            'Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15%% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50%% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine-tuning of the texture'\n        )\n        self.arg_input_group.add_argument(\n            '--steps',\n            type=int,\n            default=250,\n            help=\n            \"When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  
Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cut_ic_pow',\n            type=int,\n            default=1,\n            help=\n            \"This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\"\n        )\n        self.arg_input_group.add_argument(\n            '--init_scale',\n            type=int,\n            default=1000,\n            help=\n            \"This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clip_guidance_scale',\n            type=int,\n            default=5000,\n            help=\n            \"CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50%% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. 
Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\"\n        )\n        self.arg_input_group.add_argument(\n            '--tv_scale',\n            type=int,\n            default=0,\n            help=\n            \"Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\"\n        )\n        self.arg_input_group.add_argument(\n            '--range_scale',\n            type=int,\n            default=0,\n            help=\n            \"Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\"\n        )\n        self.arg_input_group.add_argument(\n            '--sat_scale',\n            type=int,\n            default=0,\n            help=\n            \"Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cutn_batches',\n            type=int,\n            default=4,\n            help=\n            \"Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  
Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\"\n        )\n        self.arg_input_group.add_argument(\n            '--diffusion_sampling_mode',\n            type=str,\n            default='ddim',\n            help=\n            \"Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_init',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_mode',\n            type=str,\n            default='mixed',\n            help=\n            \"sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  
If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\"\n        )\n        self.arg_input_group.add_argument(\n            '--eta',\n            type=float,\n            default=0.8,\n            help=\n            \"eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_grad',\n            type=ast.literal_eval,\n            default=True,\n            help=\n            \"As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_max',\n            type=float,\n            default=0.05,\n            help=\n            \"Sets the value of the clamp_grad limitation. 
Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\"\n        )\n        self.arg_input_group.add_argument('--randomize_class', type=ast.literal_eval, default=True, help=\"Random class.\")\n        self.arg_input_group.add_argument('--clip_denoised', type=ast.literal_eval, default=False, help=\"Clip denoised.\")\n        self.arg_input_group.add_argument(\n            '--fuzzy_prompt',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\"\n        )\n        self.arg_input_group.add_argument(\n            '--rand_mag',\n            type=float,\n            default=0.5,\n            help=\"Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\")\n        self.arg_input_group.add_argument('--cut_overview',\n                                          type=str,\n                                          default='[12]*400+[4]*600',\n                                          help=\"The schedule of overview cuts\")\n        self.arg_input_group.add_argument('--cut_innercut',\n                                          type=str,\n                                          default='[4]*400+[12]*600',\n                                          help=\"The schedule of inner cuts\")\n        self.arg_input_group.add_argument(\n            '--cut_icgray_p',\n            type=str,\n            default='[0.2]*400+[0]*600',\n            help=\n            \"The schedule of the fraction of inner cuts that are converted to grayscale. Grayscale cuts emphasize structure and luminance over color, which can help overall coherence.\"\n        )\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\n            \"During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\"\n        )\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"Whether to use GPU or not.\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='disco_diffusion_clip_rn101_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.'\n        )\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument(\n            '--init_image',\n            type=str,\n            default=None,\n            help=\n            \"Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. 
See skip_steps above for further discussion.\"\n        )\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[1280, 768],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--n_batches',\n            type=int,\n            default=1,\n            help=\n            \"This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\"\n        )\n        self.arg_input_group.add_argument('--batch_size', type=int, default=1, help=\"Batch size.\")\n        self.arg_input_group.add_argument(\n            '--batch_name',\n            type=str,\n            default='',\n            help=\n            'The name of the batch; the batch id will be named \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.'\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/requirements.txt",
    "content": "numpy\npaddle_lpips==0.1.2\nftfy\ndocarray>=0.13.29\npyyaml\nregex\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/resize_right/README.md",
    "content": "# ResizeRight (Paddle)\nA fully differentiable resize function implemented in Paddle.\nThis module is based on [assafshocher/ResizeRight](https://github.com/assafshocher/ResizeRight).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/resize_right/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/resize_right/interp_methods.py",
    "content": "from math import pi\n\ntry:\n    import paddle\nexcept ImportError:\n    paddle = None\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or Paddle, but neither was found\")\n\n\ndef set_framework_dependencies(x):\n    if type(x) is numpy.ndarray:\n        to_dtype = lambda a: a\n        fw = numpy\n    else:\n        to_dtype = lambda a: paddle.cast(a, x.dtype)\n        fw = paddle\n    # eps = fw.finfo(fw.float32).eps\n    eps = paddle.to_tensor(np.finfo(np.float32).eps)\n    return fw, to_dtype, eps\n\n\ndef support_sz(sz):\n\n    def wrapper(f):\n        f.support_sz = sz\n        return f\n\n    return wrapper\n\n\n@support_sz(4)\ndef cubic(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    absx = fw.abs(x)\n    absx2 = absx**2\n    absx3 = absx**3\n    return ((1.5 * absx3 - 2.5 * absx2 + 1.) * to_dtype(absx <= 1.) +\n            (-0.5 * absx3 + 2.5 * absx2 - 4. * absx + 2.) * to_dtype((1. < absx) & (absx <= 2.)))\n\n\n@support_sz(4)\ndef lanczos2(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 2) + eps) / ((pi**2 * x**2 / 2) + eps)) * to_dtype(abs(x) < 2))\n\n\n@support_sz(6)\ndef lanczos3(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 3) + eps) / ((pi**2 * x**2 / 3) + eps)) * to_dtype(abs(x) < 3))\n\n\n@support_sz(2)\ndef linear(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return ((x + 1) * to_dtype((-1 <= x) & (x < 0)) + (1 - x) * to_dtype((0 <= x) & (x <= 1)))\n\n\n@support_sz(1)\ndef box(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return to_dtype((-1 <= x) & (x < 0)) + to_dtype((0 <= x) & (x <= 1))\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/resize_right/resize_right.py",
    "content": "import warnings\nfrom fractions import Fraction\nfrom math import ceil\nfrom typing import Tuple\n\nimport disco_diffusion_clip_rn101.resize_right.interp_methods as interp_methods\n\n\nclass NoneClass:\n    pass\n\n\ntry:\n    import paddle\n    from paddle import nn\n    nnModuleWrapped = nn.Layer\nexcept ImportError:\n    warnings.warn('No Paddle found, will work only with Numpy')\n    paddle = None\n    nnModuleWrapped = NoneClass\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    warnings.warn('No Numpy found, will work only with Paddle')\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or Paddle, but neither was found\")\n\n\ndef resize(input,\n           scale_factors=None,\n           out_shape=None,\n           interp_method=interp_methods.cubic,\n           support_sz=None,\n           antialiasing=True,\n           by_convs=False,\n           scale_tolerance=None,\n           max_numerator=10,\n           pad_mode='constant'):\n    # get properties of the input tensor\n    in_shape, n_dims = input.shape, input.ndim\n\n    # fw stands for framework that can be either numpy or paddle,\n    # determined by the input type\n    fw = numpy if type(input) is numpy.ndarray else paddle\n    eps = np.finfo(np.float32).eps if fw == numpy else paddle.to_tensor(np.finfo(np.float32).eps)\n    device = input.place if fw is paddle else None\n\n    # set missing scale factors or output shape, one according to the other,\n    # scream if both missing. this is also where all the default policies\n    # take place. 
also handling the by_convs attribute carefully.\n    scale_factors, out_shape, by_convs = set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs,\n                                                              scale_tolerance, max_numerator, eps, fw)\n\n    # sort indices of dimensions according to scale of each dimension.\n    # since we are going dim by dim this is efficient\n    sorted_filtered_dims_and_scales = [(dim, scale_factors[dim], by_convs[dim], in_shape[dim], out_shape[dim])\n                                       for dim in sorted(range(n_dims), key=lambda ind: scale_factors[ind])\n                                       if scale_factors[dim] != 1.]\n    # unless support size is specified by the user, it is an attribute\n    # of the interpolation method\n    if support_sz is None:\n        support_sz = interp_method.support_sz\n\n    # output begins identical to input and changes with each iteration\n    output = input\n\n    # iterate over dims\n    for (dim, scale_factor, dim_by_convs, in_sz, out_sz) in sorted_filtered_dims_and_scales:\n        # STEP 1- PROJECTED GRID: The non-integer locations of the projection\n        # of output pixel locations to the input tensor\n        projected_grid = get_projected_grid(in_sz, out_sz, scale_factor, fw, dim_by_convs, device)\n\n        # STEP 1.5: ANTIALIASING- If antialiasing is taking place, we modify\n        # the window size and the interpolation method (see inside function)\n        cur_interp_method, cur_support_sz = apply_antialiasing_if_needed(interp_method, support_sz, scale_factor,\n                                                                         antialiasing)\n\n        # STEP 2- FIELDS OF VIEW: for each output pixels, map the input pixels\n        # that influence it. 
Also calculate needed padding and update grid\n        # accordingly\n        field_of_view = get_field_of_view(projected_grid, cur_support_sz, fw, eps, device)\n\n        # STEP 2.5- CALCULATE PAD AND UPDATE: according to the field of view,\n        # the input should be padded to handle the boundaries, coordinates\n        # should be updated. actual padding only occurs when weights are\n        # applied (step 4). if using by_convs for this dim, then we need to\n        # calc right and left boundaries for each filter instead.\n        pad_sz, projected_grid, field_of_view = calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor,\n                                                            dim_by_convs, fw, device)\n        # STEP 3- CALCULATE WEIGHTS: Match a set of weights to the pixels in\n        # the field of view for each output pixel\n        weights = get_weights(cur_interp_method, projected_grid, field_of_view)\n\n        # STEP 4- APPLY WEIGHTS: Each output pixel is calculated by multiplying\n        # its set of weights with the pixel values in its field of view.\n        # We now multiply the fields of view with their matching weights.\n        # We do this by tensor multiplication and broadcasting.\n        # if by_convs is true for this dim, then we do this action by\n        # convolutions. this is equivalent but faster.\n        if not dim_by_convs:\n            output = apply_weights(output, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw)\n        else:\n            output = apply_convs(output, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw)\n    return output\n\n\ndef get_projected_grid(in_sz, out_sz, scale_factor, fw, by_convs, device=None):\n    # we start by having the output coordinates which are just integer locations\n    # in the special case when using by_convs, we only need two cycles of grid\n    # points. 
the first and last.\n    grid_sz = out_sz if not by_convs else scale_factor.numerator\n    out_coordinates = fw_arange(grid_sz, fw, device)\n\n    # This is projecting the output pixel locations in 1d to the input tensor,\n    # as non-integer locations.\n    # the following formula is derived in the paper\n    # \"From Discrete to Continuous Convolutions\" by Shocher et al.\n    return (out_coordinates / float(scale_factor) + (in_sz - 1) / 2 - (out_sz - 1) / (2 * float(scale_factor)))\n\n\ndef get_field_of_view(projected_grid, cur_support_sz, fw, eps, device):\n    # for each output pixel, map which input pixels influence it, in 1d.\n    # we start by calculating the leftmost neighbor, using half of the window\n    # size (eps is for when boundary is exact int)\n    left_boundaries = fw_ceil(projected_grid - cur_support_sz / 2 - eps, fw)\n\n    # then we simply take all the pixel centers in the field by counting\n    # window size pixels from the left boundary\n    ordinal_numbers = fw_arange(ceil(cur_support_sz - eps), fw, device)\n    return left_boundaries[:, None] + ordinal_numbers\n\n\ndef calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor, dim_by_convs, fw, device):\n    if not dim_by_convs:\n        # determine padding according to neighbor coords out of bound.\n        # this is a generalized notion of padding, when pad<0 it means crop\n        pad_sz = [-field_of_view[0, 0].item(), field_of_view[-1, -1].item() - in_sz + 1]\n\n        # since input image will be changed by padding, coordinates of both\n        # field_of_view and projected_grid need to be updated\n        field_of_view += pad_sz[0]\n        projected_grid += pad_sz[0]\n\n    else:\n        # only used for by_convs, to calc the boundaries of each filter the\n        # number of distinct convolutions is the numerator of the scale factor\n        num_convs, stride = scale_factor.numerator, scale_factor.denominator\n\n        # calculate left and right boundaries for each 
conv. left can also be\n        # negative, right can be bigger than in_sz. such cases imply padding if\n        # needed. however if both are in-bounds, it means we need to crop,\n        # practically apply the conv only on part of the image.\n        left_pads = -field_of_view[:, 0]\n\n        # next calc is tricky, explanation by rows:\n        # 1) counting output pixels between the first position of each filter\n        #    to the right boundary of the input\n        # 2) dividing it by number of filters to count how many 'jumps'\n        #    each filter does\n        # 3) multiplying by the stride gives us the distance over the input\n        #    coords done by all these jumps for each filter\n        # 4) to this distance we add the right boundary of the filter when\n        #    placed in its leftmost position. so now we get the right boundary\n        #    of that filter in input coord.\n        # 5) the padding size needed is obtained by subtracting the rightmost\n        #    input coordinate. if the result is positive padding is needed. if\n        #    negative then negative padding means shaving off pixel columns.\n        right_pads = (((out_sz - fw_arange(num_convs, fw, device) - 1)  # (1)\n                       // num_convs)  # (2)\n                      * stride  # (3)\n                      + field_of_view[:, -1]  # (4)\n                      - in_sz + 1)  # (5)\n\n        # in the by_convs case pad_sz is a list of left-right pairs. 
one per\n        # each filter\n\n        pad_sz = list(zip(left_pads, right_pads))\n\n    return pad_sz, projected_grid, field_of_view\n\n\ndef get_weights(interp_method, projected_grid, field_of_view):\n    # the set of weights for each output pixel is the result of the chosen\n    # interpolation method applied to the distances between projected grid\n    # locations and the pixel-centers in the field of view (distances are\n    # directed, can be positive or negative)\n    weights = interp_method(projected_grid[:, None] - field_of_view)\n\n    # we now carefully normalize the weights to sum to 1 per each output pixel\n    sum_weights = weights.sum(1, keepdim=True)\n    sum_weights[sum_weights == 0] = 1\n    return weights / sum_weights\n\n\ndef apply_weights(input, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the first one.\n    # so we transpose and will transpose back after multiplying\n    tmp_input = fw_swapaxes(input, dim, 0, fw)\n\n    # apply padding\n    tmp_input = fw_pad(tmp_input, fw, pad_sz, pad_mode)\n\n    # field_of_view is a tensor of order 2: for each output (1d location\n    # along cur dim)- a list of 1d neighbors locations.\n    # note that this whole operation is applied to each dim separately,\n    # this is why it is all in 1d.\n    # neighbors = tmp_input[field_of_view] is a tensor of order image_dims+1:\n    # for each output pixel (this time indicated in all dims), these are the\n    # values of the neighbors in the 1d field of view. note that we only\n    # consider neighbors along the current dim, but such set exists for every\n    # multi-dim location, hence the final tensor order is image_dims+1.\n    paddle.device.cuda.empty_cache()\n    neighbors = tmp_input[field_of_view]\n\n    # weights is an order 2 tensor: for each output location along 1d- a list\n    # of weights matching the field of view. 
we augment it with ones, for\n    # broadcasting, so that when multiplied with some tensor the weights affect\n    # only its first dim.\n    tmp_weights = fw.reshape(weights, (*weights.shape, *[1] * (n_dims - 1)))\n\n    # now we simply multiply the weights with the neighbors, and then sum\n    # along the field of view, to get a single value per out pixel\n    tmp_output = (neighbors * tmp_weights).sum(1)\n    # we transpose back the resized dim to its original position\n    return fw_swapaxes(tmp_output, 0, dim, fw)\n\n\ndef apply_convs(input, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the last one.\n    # so we transpose and will transpose back after multiplying\n    input = fw_swapaxes(input, dim, -1, fw)\n\n    # the stride for all convs is the denominator of the scale factor\n    stride, num_convs = scale_factor.denominator, scale_factor.numerator\n\n    # prepare an empty tensor for the output\n    tmp_out_shape = list(input.shape)\n    tmp_out_shape[-1] = out_sz\n    # note: paddle tensors expose .place rather than .device\n    tmp_output = fw_empty(tuple(tmp_out_shape), fw, input.place if fw is paddle else None)\n\n    # iterate over the conv operations. we have as many as the numerator\n    # of the scale-factor. for each we need boundaries and a filter.\n    for conv_ind, (pad_sz, filt) in enumerate(zip(pad_sz, weights)):\n        # apply padding (we pad last dim, padding can be negative)\n        pad_dim = input.ndim - 1\n        tmp_input = fw_pad(input, fw, pad_sz, pad_mode, dim=pad_dim)\n\n        # apply convolution over last dim. 
store in the output tensor with\n        # positional strides so that when the loop is complete conv results are\n        # interleaved\n        tmp_output[..., conv_ind::num_convs] = fw_conv(tmp_input, filt, stride)\n\n    return fw_swapaxes(tmp_output, -1, dim, fw)\n\n\ndef set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs, scale_tolerance, max_numerator, eps, fw):\n    # eventually we must have both scale-factors and out-sizes for all in/out\n    # dims. however, we support many possible partial arguments\n    if scale_factors is None and out_shape is None:\n        raise ValueError(\"either scale_factors or out_shape should be \"\n                         \"provided\")\n    if out_shape is not None:\n        # if out_shape has less dims than in_shape, we by default resize the\n        # first dims for numpy and last dims for paddle\n        out_shape = (list(out_shape) +\n                     list(in_shape[len(out_shape):]) if fw is numpy else list(in_shape[:-len(out_shape)]) +\n                     list(out_shape))\n        if scale_factors is None:\n            # if no scale given, we calculate it as the out to in ratio\n            # (not recommended)\n            scale_factors = [out_sz / in_sz for out_sz, in_sz in zip(out_shape, in_shape)]\n    if scale_factors is not None:\n        # by default, if a single number is given as scale, we assume resizing\n        # two dims (most common are images with 2 spatial dims)\n        scale_factors = (scale_factors if isinstance(scale_factors, (list, tuple)) else [scale_factors, scale_factors])\n        # if less scale_factors than in_shape dims, we by default resize the\n        # first dims for numpy and last dims for paddle\n        scale_factors = (list(scale_factors) + [1] * (len(in_shape) - len(scale_factors)) if fw is numpy else [1] *\n                         (len(in_shape) - len(scale_factors)) + list(scale_factors))\n        if out_shape is None:\n            # when no out_shape given, it is 
calculated by multiplying the\n            # scale by the in_shape (not recommended)\n            out_shape = [ceil(scale_factor * in_sz) for scale_factor, in_sz in zip(scale_factors, in_shape)]\n        # next part intentionally after out_shape determined for stability\n        # we fix by_convs to be a list of truth values in case it is not\n        if not isinstance(by_convs, (list, tuple)):\n            by_convs = [by_convs] * len(out_shape)\n\n        # next loop fixes the scale for each dim to be either frac or float.\n        # this is determined by by_convs and by tolerance for scale accuracy.\n        for ind, (sf, dim_by_convs) in enumerate(zip(scale_factors, by_convs)):\n            # first we fractionalize\n            if dim_by_convs:\n                frac = Fraction(1 / sf).limit_denominator(max_numerator)\n                frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)\n\n            # if accuracy is within tolerance scale will be frac. if not, then\n            # it will be float and the by_convs attr will be set false for\n            # this dim\n            if scale_tolerance is None:\n                scale_tolerance = eps\n            if dim_by_convs and abs(frac - sf) < scale_tolerance:\n                scale_factors[ind] = frac\n            else:\n                scale_factors[ind] = float(sf)\n                by_convs[ind] = False\n\n        return scale_factors, out_shape, by_convs\n\n\ndef apply_antialiasing_if_needed(interp_method, support_sz, scale_factor, antialiasing):\n    # antialiasing is \"stretching\" the field of view according to the scale\n    # factor (only for downscaling). this is low-pass filtering. 
this\n    # requires modifying both the interpolation (stretching the 1d\n    # function and multiplying by the scale-factor) and the window size.\n    scale_factor = float(scale_factor)\n    if scale_factor >= 1.0 or not antialiasing:\n        return interp_method, support_sz\n    cur_interp_method = (lambda arg: scale_factor * interp_method(scale_factor * arg))\n    cur_support_sz = support_sz / scale_factor\n    return cur_interp_method, cur_support_sz\n\n\ndef fw_ceil(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.ceil(x))\n    else:\n        return paddle.cast(x.ceil(), dtype='int64')\n\n\ndef fw_floor(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.floor(x))\n    else:\n        return paddle.cast(x.floor(), dtype='int64')\n\n\ndef fw_cat(x, fw):\n    if fw is numpy:\n        return fw.concatenate(x)\n    else:\n        return fw.concat(x)\n\n\ndef fw_swapaxes(x, ax_1, ax_2, fw):\n    if fw is numpy:\n        return fw.swapaxes(x, ax_1, ax_2)\n    else:\n        if ax_1 == -1:\n            ax_1 = len(x.shape) - 1\n        if ax_2 == -1:\n            ax_2 = len(x.shape) - 1\n        perm0 = list(range(len(x.shape)))\n        temp = ax_1\n        perm0[temp] = ax_2\n        perm0[ax_2] = temp\n        return fw.transpose(x, perm0)\n\n\ndef fw_pad(x, fw, pad_sz, pad_mode, dim=0):\n    if pad_sz == (0, 0):\n        return x\n    if fw is numpy:\n        pad_vec = [(0, 0)] * x.ndim\n        pad_vec[dim] = pad_sz\n        return fw.pad(x, pad_width=pad_vec, mode=pad_mode)\n    else:\n        if x.ndim < 3:\n            x = x[None, None, ...]\n\n        pad_vec = [0] * ((x.ndim - 2) * 2)\n        pad_vec[0:2] = pad_sz\n        return fw_swapaxes(fw.nn.functional.pad(fw_swapaxes(x, dim, -1, fw), pad=pad_vec, mode=pad_mode), dim, -1, fw)\n\n\ndef fw_conv(input, filter, stride):\n    # we want to apply 1d conv to any nd array. the way to do it is to reshape\n    # the input to a 4D tensor. 
first two dims are singletons, 3rd dim stores\n    # all the spatial dims that we are not convolving along now. then we can\n    # apply conv2d with a 1xK filter. This convolves the same way all the other\n    # dims stored in the 3d dim. like depthwise conv over these.\n    # TODO: numpy support\n    # paddle reshape takes a shape list and has no Tensor.view\n    reshaped_input = input.reshape([1, 1, -1, input.shape[-1]])\n    reshaped_output = paddle.nn.functional.conv2d(reshaped_input, filter.reshape([1, 1, 1, -1]), stride=(1, stride))\n    return reshaped_output.reshape([*input.shape[:-1], -1])\n\n\ndef fw_arange(upper_bound, fw, device):\n    if fw is numpy:\n        return fw.arange(upper_bound)\n    else:\n        return fw.arange(upper_bound)\n\n\ndef fw_empty(shape, fw, device):\n    if fw is numpy:\n        return fw.empty(shape)\n    else:\n        return fw.empty(shape=shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/README.md",
    "content": "# Diffusion model (Paddle)\nThis module implements a diffusion model which accepts a text prompt and outputs images semantically close to the text. The code is rewritten in Paddle and mainly refers to two projects: [jina-ai/discoart](https://github.com/jina-ai/discoart) and [openai/guided-diffusion](https://github.com/openai/guided-diffusion). Thanks for their wonderful work.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/__init__.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/__init__.py\n'''\nimport os\nimport warnings\n\nos.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'\n\n__all__ = ['create']\n\nimport sys\n\n__resources_path__ = os.path.join(\n    os.path.dirname(sys.modules.get(__package__).__file__ if __package__ in sys.modules else __file__),\n    'resources',\n)\n\nimport gc\n\n# check if GPU is available\nimport paddle\n\n# download and load models, this will take some time on the first load\n\nfrom .helper import load_all_models, load_diffusion_model, load_clip_models\n\nmodel_config, secondary_model = load_all_models('512x512_diffusion_uncond_finetune_008100', use_secondary_model=True)\n\nfrom typing import TYPE_CHECKING, overload, List, Optional\n\nif TYPE_CHECKING:\n    from docarray import DocumentArray, Document\n\n_clip_models_cache = {}\n\n# begin_create_overload\n\n\n@overload\ndef create(text_prompts: Optional[List[str]] = [\n    'A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.',\n    'yellow color scheme',\n],\n           init_image: Optional[str] = None,\n           width_height: Optional[List[int]] = [1280, 768],\n           skip_steps: Optional[int] = 10,\n           steps: Optional[int] = 250,\n           cut_ic_pow: Optional[int] = 1,\n           init_scale: Optional[int] = 1000,\n           clip_guidance_scale: Optional[int] = 5000,\n           tv_scale: Optional[int] = 0,\n           range_scale: Optional[int] = 150,\n           sat_scale: Optional[int] = 0,\n           cutn_batches: Optional[int] = 4,\n           diffusion_model: Optional[str] = '512x512_diffusion_uncond_finetune_008100',\n           use_secondary_model: Optional[bool] = True,\n           diffusion_sampling_mode: Optional[str] = 'ddim',\n           perlin_init: Optional[bool] = False,\n           perlin_mode: Optional[str] = 'mixed',\n           seed: 
Optional[int] = None,\n           eta: Optional[float] = 0.8,\n           clamp_grad: Optional[bool] = True,\n           clamp_max: Optional[float] = 0.05,\n           randomize_class: Optional[bool] = True,\n           clip_denoised: Optional[bool] = False,\n           fuzzy_prompt: Optional[bool] = False,\n           rand_mag: Optional[float] = 0.05,\n           cut_overview: Optional[str] = '[12]*400+[4]*600',\n           cut_innercut: Optional[str] = '[4]*400+[12]*600',\n           cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n           display_rate: Optional[int] = 10,\n           n_batches: Optional[int] = 4,\n           batch_size: Optional[int] = 1,\n           batch_name: Optional[str] = '',\n           clip_models: Optional[list] = ['ViTB32', 'ViTB16', 'RN50'],\n           output_dir: Optional[str] = 'discoart_output') -> 'DocumentArray':\n    \"\"\"\n    Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n    :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.\n    :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n    :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n    :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n    :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n    :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   
Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n    :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n    :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n    :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n    :param sat_scale: Saturation scale. Optional, set to zero to turn off.  
If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n    :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n    :param diffusion_model: Diffusion model of choice.\n    :param use_secondary_model: Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  
I suggest you experiment with this.\n    :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n    :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n    :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n    :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  
This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n    :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\n    :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n    :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n    :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n    :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n    :param cut_overview: The schedule of overview cuts\n    :param cut_innercut: The schedule of inner cuts\n    :param cut_icgray_p: The schedule of the probability that an inner cut is converted to grayscale before CLIP evaluation. Grayscale cuts encourage CLIP to focus on shape and composition rather than color, which can improve structure at some cost to color fidelity.\n    :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n    :param n_batches: This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n    :param batch_name: The name of the batch, the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overridden by other users, please use a unique name.\n    :param clip_models: CLIP Model selectors. ViTB32, ViTB16, ViTL14, RN101, RN50, RN50x4, RN50x16, RN50x64. These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.  You can mix in multiple models as well for different results.  
However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash. The rough order of speed/mem usage is (smallest/fastest to largest/slowest): ViTB32, RN50, RN101, ViTB16, RN50x4, RN50x16, RN50x64, ViTL14. For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\n# end_create_overload\n\n\n@overload\ndef create(init_document: 'Document') -> 'DocumentArray':\n    \"\"\"\n    Create an artwork using a DocArray ``Document`` object as initial state.\n    :param init_document: its ``.tags`` will be used as parameters, ``.uri`` (if present) will be used as init image.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\ndef create(**kwargs) -> 'DocumentArray':\n    from .config import load_config\n    from .runner import do_run\n\n    if 'init_document' in kwargs:\n        d = kwargs['init_document']\n        _kwargs = d.tags\n        if not _kwargs:\n            warnings.warn('init_document has no .tags, fallback to default config')\n        if d.uri:\n            _kwargs['init_image'] = kwargs['init_document'].uri\n        else:\n            warnings.warn('init_document has no .uri, fallback to no init image')\n        kwargs.pop('init_document')\n        if kwargs:\n            warnings.warn('init_document has .tags and .uri, but kwargs are also present, will override .tags')\n            _kwargs.update(kwargs)\n        _args = load_config(user_config=_kwargs)\n    else:\n        _args = load_config(user_config=kwargs)\n\n    model, diffusion = load_diffusion_model(model_config, _args.diffusion_model, steps=_args.steps)\n\n    clip_models = load_clip_models(enabled=_args.clip_models, clip_models=_clip_models_cache)\n\n    gc.collect()\n    paddle.device.cuda.empty_cache()\n    try:\n        return do_run(_args, (model, diffusion, clip_models, secondary_model))\n    
except KeyboardInterrupt:\n        pass\n"
  },
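The cut-count and clip_guidance_scale arithmetic described in the `create()` docstring above can be sketched in a few lines. This is an illustrative sketch, not part of the module's API; `total_cuts_per_timestep` and `scaled_clip_guidance` are hypothetical helper names:

```python
def total_cuts_per_timestep(scheduled_cuts: int, cutn_batches: int) -> int:
    # (scheduled cuts) x (cutn_batches) = (total cuts per timestep);
    # memory cost stays at `scheduled_cuts`, render time grows ~linearly.
    return scheduled_cuts * cutn_batches


def scaled_clip_guidance(base_cgs: float, base_dims, new_dims) -> float:
    # clip_guidance_scale roughly scales with total pixel count:
    # going from 512x512 to 512x768 (+50% area) suggests 5000 -> 7500.
    base_area = base_dims[0] * base_dims[1]
    new_area = new_dims[0] * new_dims[1]
    return base_cgs * new_area / base_area


print(total_cuts_per_timestep(16, 4))                       # 64
print(scaled_clip_guidance(5000, (512, 512), (512, 768)))   # 7500.0
```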
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/config.py",
"content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/config.py\n'''\nimport copy\nimport random\nimport warnings\nfrom types import SimpleNamespace\nfrom typing import Dict\n\nimport yaml\nfrom yaml import Loader\n\nfrom . import __resources_path__\n\nwith open(f'{__resources_path__}/default.yml') as ymlfile:\n    default_args = yaml.load(ymlfile, Loader=Loader)\n\n\ndef load_config(user_config: Dict, ):\n    cfg = copy.deepcopy(default_args)\n\n    # warn on unknown keys *before* merging, otherwise they are already in cfg\n    for k in user_config.keys():\n        if k not in cfg:\n            warnings.warn(f'unknown argument {k}, ignored')\n\n    if user_config:\n        cfg.update(**{k: v for k, v in user_config.items() if k in cfg})\n\n    for k, v in cfg.items():\n        if k in ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches',\n                 'cutn_batches') and isinstance(v, float):\n            cfg[k] = int(v)\n        if k == 'width_height':\n            cfg[k] = [int(vv) for vv in v]\n\n    cfg.update(**{\n        'seed': cfg['seed'] or random.randint(0, 2**32),\n    })\n\n    if cfg['batch_name']:\n        da_name = f'{__package__}-{cfg[\"batch_name\"]}-{cfg[\"seed\"]}'\n    else:\n        da_name = f'{__package__}-{cfg[\"seed\"]}'\n        warnings.warn('you did not set `batch_name`, set it to have a unique session ID')\n\n    cfg.update(**{'name_docarray': da_name})\n\n    print_args_table(cfg)\n\n    return SimpleNamespace(**cfg)\n\n\ndef print_args_table(cfg):\n    from rich.table import Table\n    from rich import box\n    from rich.console import Console\n\n    console = Console()\n\n    param_str = Table(\n        title=cfg['name_docarray'],\n        box=box.ROUNDED,\n        highlight=True,\n        title_justify='left',\n    )\n    param_str.add_column('Argument', justify='right')\n    param_str.add_column('Value', justify='left')\n\n    for k, v in sorted(cfg.items()):\n        value = str(v)\n\n        if not default_args.get(k, None) == v:\n            value = f'[b]{value}[/]'\n\n        param_str.add_row(k, 
value)\n\n    console.print(param_str)\n"
  },
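The merging rules in config.py above (warn on unknown keys, coerce YAML floats to ints for integer knobs, fill a random seed) can be exercised in isolation. A minimal, self-contained sketch, with an inline `default_args` table standing in for the package's default.yml:

```python
import copy
import random
import warnings

# Stand-in for the defaults the real module loads from default.yml.
default_args = {'steps': 250, 'seed': None, 'batch_name': None}


def load_config(user_config):
    cfg = copy.deepcopy(default_args)
    # warn on unknown keys before merging, so they are genuinely ignored
    for k in user_config:
        if k not in cfg:
            warnings.warn(f'unknown argument {k}, ignored')
    cfg.update({k: v for k, v in user_config.items() if k in default_args})
    # integer knobs may arrive as floats (e.g. from YAML); coerce them
    for k, v in cfg.items():
        if k in ('steps', 'seed') and isinstance(v, float):
            cfg[k] = int(v)
    # a missing seed is replaced by a random one, reported for reproducibility
    cfg['seed'] = cfg['seed'] or random.randint(0, 2**32)
    return cfg


cfg = load_config({'steps': 300.0})
print(cfg['steps'])  # 300, coerced from float to int
```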
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/helper.py",
"content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/helper.py\n'''\nimport hashlib\nimport logging\nimport os\nimport subprocess\nimport sys\nfrom os.path import expanduser\nfrom pathlib import Path\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\n\nimport paddle\n\n\ndef _get_logger():\n    logger = logging.getLogger(__package__)\n    logger.setLevel(\"INFO\")\n    ch = logging.StreamHandler()\n    ch.setLevel(\"INFO\")\n    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')\n    ch.setFormatter(formatter)\n    logger.addHandler(ch)\n    return logger\n\n\nlogger = _get_logger()\n\n\ndef load_clip_models(enabled: List[str], clip_models: Dict[str, Any] = {}):\n\n    import disco_diffusion_clip_rn101.clip.clip as clip\n    from disco_diffusion_clip_rn101.clip.clip import build_model, tokenize, transform\n\n    # load enabled models\n    for k in enabled:\n        if k not in clip_models:\n            clip_models[k] = build_model(name=k)\n            clip_models[k].eval()\n            for parameter in clip_models[k].parameters():\n                parameter.stop_gradient = True\n\n    # drop not-enabled models to save memory; iterate over a copy of the\n    # keys since popping while iterating the dict itself raises RuntimeError\n    for k in list(clip_models):\n        if k not in enabled:\n            clip_models.pop(k)\n\n    return list(clip_models.values())\n\n\ndef load_all_models(diffusion_model, use_secondary_model):\n    from .model.script_util import (\n        model_and_diffusion_defaults, )\n\n    model_config = model_and_diffusion_defaults()\n\n    if diffusion_model == '512x512_diffusion_uncond_finetune_008100':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, 
it is taken care of later.\n            'image_size': 512,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n    elif diffusion_model == '256x256_diffusion_uncond':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 256,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n\n    secondary_model = None\n    if use_secondary_model:\n        from .model.sec_diff import SecondaryDiffusionImageNet2\n        secondary_model = SecondaryDiffusionImageNet2()\n        model_dict = paddle.load(\n            os.path.join(os.path.dirname(__file__), 'pre_trained', 'secondary_model_imagenet_2.pdparams'))\n        secondary_model.set_state_dict(model_dict)\n        secondary_model.eval()\n        for parameter in secondary_model.parameters():\n            parameter.stop_gradient = True\n\n    return model_config, secondary_model\n\n\ndef load_diffusion_model(model_config, diffusion_model, steps):\n    from .model.script_util import (\n        create_model_and_diffusion, )\n\n    timestep_respacing = f'ddim{steps}'\n    diffusion_steps = (1000 // steps) * steps if steps < 1000 else steps\n    model_config.update({\n        'timestep_respacing': timestep_respacing,\n        
'diffusion_steps': diffusion_steps,\n    })\n\n    model, diffusion = create_model_and_diffusion(**model_config)\n    model.set_state_dict(\n        paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', f'{diffusion_model}.pdparams')))\n    model.eval()\n    for name, param in model.named_parameters():\n        param.stop_gradient = True\n\n    return model, diffusion\n\n\ndef parse_prompt(prompt):\n    if prompt.startswith('http://') or prompt.startswith('https://'):\n        vals = prompt.rsplit(':', 2)\n        vals = [vals[0] + ':' + vals[1], *vals[2:]]\n    else:\n        vals = prompt.rsplit(':', 1)\n    vals = vals + ['', '1'][len(vals):]\n    return vals[0], float(vals[1])\n"
  },
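helper.py's `parse_prompt` splits a `"prompt:weight"` string while keeping the `://` colon of URL prompts intact. Copied standalone below and run against a few illustrative inputs (the example prompts and URL are made up):

```python
def parse_prompt(prompt):
    # URLs contain a colon after the scheme, so split off at most the last
    # colon and re-join the scheme part; plain text splits on the last colon.
    if prompt.startswith('http://') or prompt.startswith('https://'):
        vals = prompt.rsplit(':', 2)
        vals = [vals[0] + ':' + vals[1], *vals[2:]]
    else:
        vals = prompt.rsplit(':', 1)
    # pad with the default weight '1' when no weight was given
    vals = vals + ['', '1'][len(vals):]
    return vals[0], float(vals[1])


print(parse_prompt('a castle on a hill:2'))          # ('a castle on a hill', 2.0)
print(parse_prompt('https://example.com/cat.png'))   # ('https://example.com/cat.png', 1.0)
```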
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/__init__.py",
    "content": "\"\"\"\nCodebase for \"Improved Denoising Diffusion Probabilistic Models\" implemented by Paddle.\n\"\"\"\n"
  },
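The gaussian_diffusion.py module that follows defines the named beta schedules (linear from Ho et al., and the cosine schedule via `betas_for_alpha_bar`). A NumPy-only sketch of that logic, runnable without Paddle:

```python
import math

import numpy as np


def get_named_beta_schedule(schedule_name, num_diffusion_timesteps):
    if schedule_name == "linear":
        # Ho et al.'s linear schedule, rescaled for any number of steps.
        scale = 1000 / num_diffusion_timesteps
        return np.linspace(scale * 0.0001, scale * 0.02,
                           num_diffusion_timesteps, dtype=np.float64)
    elif schedule_name == "cosine":
        # Discretize the cosine alpha-bar curve; clamp betas at 0.999
        # to avoid singularities near the end of the chain.
        alpha_bar = lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2) ** 2
        betas = []
        for i in range(num_diffusion_timesteps):
            t1 = i / num_diffusion_timesteps
            t2 = (i + 1) / num_diffusion_timesteps
            betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), 0.999))
        return np.array(betas)
    raise NotImplementedError(f"unknown beta schedule: {schedule_name}")


betas = get_named_beta_schedule("linear", 1000)
print(betas[0], betas[-1])  # 0.0001 0.02
```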
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/gaussian_diffusion.py",
    "content": "\"\"\"\nDiffusion model implemented by Paddle.\nThis code is rewritten based on Pytorch version of of Ho et al's diffusion models:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py\n\"\"\"\nimport enum\nimport math\n\nimport numpy as np\nimport paddle\n\nfrom .losses import discretized_gaussian_log_likelihood\nfrom .losses import normal_kl\nfrom .nn import mean_flat\n\n\ndef get_named_beta_schedule(schedule_name, num_diffusion_timesteps):\n    \"\"\"\n    Get a pre-defined beta schedule for the given name.\n\n    The beta schedule library consists of beta schedules which remain similar\n    in the limit of num_diffusion_timesteps.\n    Beta schedules may be added, but should not be removed or changed once\n    they are committed to maintain backwards compatibility.\n    \"\"\"\n    if schedule_name == \"linear\":\n        # Linear schedule from Ho et al, extended to work for any number of\n        # diffusion steps.\n        scale = 1000 / num_diffusion_timesteps\n        beta_start = scale * 0.0001\n        beta_end = scale * 0.02\n        return np.linspace(beta_start, beta_end, num_diffusion_timesteps, dtype=np.float64)\n    elif schedule_name == \"cosine\":\n        return betas_for_alpha_bar(\n            num_diffusion_timesteps,\n            lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2)**2,\n        )\n    else:\n        raise NotImplementedError(f\"unknown beta schedule: {schedule_name}\")\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function,\n    which defines the cumulative product of (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that\n         
             part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas)\n\n\nclass ModelMeanType(enum.Enum):\n    \"\"\"\n    Which type of output the model predicts.\n    \"\"\"\n\n    PREVIOUS_X = enum.auto()  # the model predicts x_{t-1}\n    START_X = enum.auto()  # the model predicts x_0\n    EPSILON = enum.auto()  # the model predicts epsilon\n\n\nclass ModelVarType(enum.Enum):\n    \"\"\"\n    What is used as the model's output variance.\n\n    The LEARNED_RANGE option has been added to allow the model to predict\n    values between FIXED_SMALL and FIXED_LARGE, making its job easier.\n    \"\"\"\n\n    LEARNED = enum.auto()\n    FIXED_SMALL = enum.auto()\n    FIXED_LARGE = enum.auto()\n    LEARNED_RANGE = enum.auto()\n\n\nclass LossType(enum.Enum):\n    MSE = enum.auto()  # use raw MSE loss (and KL when learning variances)\n    RESCALED_MSE = (enum.auto())  # use raw MSE loss (with RESCALED_KL when learning variances)\n    KL = enum.auto()  # use the variational lower-bound\n    RESCALED_KL = enum.auto()  # like KL, but rescale to estimate the full VLB\n\n    def is_vb(self):\n        return self == LossType.KL or self == LossType.RESCALED_KL\n\n\nclass GaussianDiffusion:\n    \"\"\"\n    Utilities for training and sampling diffusion models.\n\n    Ported directly from here, and then adapted over time to further experimentation.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42\n\n    :param betas: a 1-D numpy array of betas for each diffusion timestep,\n                  starting at T and going to 1.\n    :param model_mean_type: 
a ModelMeanType determining what the model outputs.\n    :param model_var_type: a ModelVarType determining how variance is output.\n    :param loss_type: a LossType determining the loss function to use.\n    :param rescale_timesteps: if True, pass floating point timesteps into the\n                              model so that they are always scaled like in the\n                              original paper (0 to 1000).\n    \"\"\"\n\n    def __init__(\n        self,\n        *,\n        betas,\n        model_mean_type,\n        model_var_type,\n        loss_type,\n        rescale_timesteps=False,\n    ):\n        self.model_mean_type = model_mean_type\n        self.model_var_type = model_var_type\n        self.loss_type = loss_type\n        self.rescale_timesteps = rescale_timesteps\n\n        # Use float64 for accuracy.\n        betas = np.array(betas, dtype=np.float64)\n        self.betas = betas\n        assert len(betas.shape) == 1, \"betas must be 1-D\"\n        assert (betas > 0).all() and (betas <= 1).all()\n\n        self.num_timesteps = int(betas.shape[0])\n\n        alphas = 1.0 - betas\n        self.alphas_cumprod = np.cumprod(alphas, axis=0)\n        self.alphas_cumprod_prev = np.append(1.0, self.alphas_cumprod[:-1])\n        self.alphas_cumprod_next = np.append(self.alphas_cumprod[1:], 0.0)\n        assert self.alphas_cumprod_prev.shape == (self.num_timesteps, )\n\n        # calculations for diffusion q(x_t | x_{t-1}) and others\n        self.sqrt_alphas_cumprod = np.sqrt(self.alphas_cumprod)\n        self.sqrt_one_minus_alphas_cumprod = np.sqrt(1.0 - self.alphas_cumprod)\n        self.log_one_minus_alphas_cumprod = np.log(1.0 - self.alphas_cumprod)\n        self.sqrt_recip_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod)\n        self.sqrt_recipm1_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod - 1)\n\n        # calculations for posterior q(x_{t-1} | x_t, x_0)\n        self.posterior_variance = (betas * (1.0 - self.alphas_cumprod_prev) / (1.0 - 
self.alphas_cumprod))\n        # log calculation clipped because the posterior variance is 0 at the\n        # beginning of the diffusion chain.\n        self.posterior_log_variance_clipped = np.log(np.append(self.posterior_variance[1], self.posterior_variance[1:]))\n        self.posterior_mean_coef1 = (betas * np.sqrt(self.alphas_cumprod_prev) / (1.0 - self.alphas_cumprod))\n        self.posterior_mean_coef2 = ((1.0 - self.alphas_cumprod_prev) * np.sqrt(alphas) / (1.0 - self.alphas_cumprod))\n\n    def q_mean_variance(self, x_start, t):\n        \"\"\"\n        Get the distribution q(x_t | x_0).\n\n        :param x_start: the [N x C x ...] tensor of noiseless inputs.\n        :param t: the number of diffusion steps (minus 1). Here, 0 means one step.\n        :return: A tuple (mean, variance, log_variance), all of x_start's shape.\n        \"\"\"\n        mean = (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start)\n        variance = _extract_into_tensor(1.0 - self.alphas_cumprod, t, x_start.shape)\n        log_variance = _extract_into_tensor(self.log_one_minus_alphas_cumprod, t, x_start.shape)\n        return mean, variance, log_variance\n\n    def q_sample(self, x_start, t, noise=None):\n        \"\"\"\n        Diffuse the data for a given number of diffusion steps.\n\n        In other words, sample from q(x_t | x_0).\n\n        :param x_start: the initial data batch.\n        :param t: the number of diffusion steps (minus 1). 
Here, 0 means one step.\n        :param noise: if specified, the split-out normal noise.\n        :return: A noisy version of x_start.\n        \"\"\"\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        assert noise.shape == x_start.shape\n        return (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +\n                _extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)\n\n    def q_posterior_mean_variance(self, x_start, x_t, t):\n        \"\"\"\n        Compute the mean and variance of the diffusion posterior:\n\n            q(x_{t-1} | x_t, x_0)\n\n        \"\"\"\n        assert x_start.shape == x_t.shape\n        posterior_mean = (_extract_into_tensor(self.posterior_mean_coef1, t, x_t.shape) * x_start +\n                          _extract_into_tensor(self.posterior_mean_coef2, t, x_t.shape) * x_t)\n        posterior_variance = _extract_into_tensor(self.posterior_variance, t, x_t.shape)\n        posterior_log_variance_clipped = _extract_into_tensor(self.posterior_log_variance_clipped, t, x_t.shape)\n        assert (posterior_mean.shape[0] == posterior_variance.shape[0] == posterior_log_variance_clipped.shape[0] ==\n                x_start.shape[0])\n        return posterior_mean, posterior_variance, posterior_log_variance_clipped\n\n    def p_mean_variance(self, model, x, t, clip_denoised=True, denoised_fn=None, model_kwargs=None):\n        \"\"\"\n        Apply the model to get p(x_{t-1} | x_t), as well as a prediction of\n        the initial x, x_0.\n\n        :param model: the model, which takes a signal and a batch of timesteps\n                      as input.\n        :param x: the [N x C x ...] 
tensor at time t.\n        :param t: a 1-D Tensor of timesteps.\n        :param clip_denoised: if True, clip the denoised signal into [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample. Applies before\n            clip_denoised.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :return: a dict with the following keys:\n                 - 'mean': the model mean output.\n                 - 'variance': the model variance output.\n                 - 'log_variance': the log of 'variance'.\n                 - 'pred_xstart': the prediction for x_0.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n\n        B, C = x.shape[:2]\n        assert t.shape == [B]\n        model_output = model(x, self._scale_timesteps(t), **model_kwargs)\n\n        if self.model_var_type in [ModelVarType.LEARNED, ModelVarType.LEARNED_RANGE]:\n            assert model_output.shape == [B, C * 2, *x.shape[2:]]\n            model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n            if self.model_var_type == ModelVarType.LEARNED:\n                model_log_variance = model_var_values\n                model_variance = paddle.exp(model_log_variance)\n            else:\n                min_log = _extract_into_tensor(self.posterior_log_variance_clipped, t, x.shape)\n                max_log = _extract_into_tensor(np.log(self.betas), t, x.shape)\n                # The model_var_values is [-1, 1] for [min_var, max_var].\n                frac = (model_var_values + 1) / 2\n                model_log_variance = frac * max_log + (1 - frac) * min_log\n                model_variance = paddle.exp(model_log_variance)\n        else:\n            model_variance, model_log_variance = {\n                # for fixedlarge, we set the initial (log-)variance like so\n        
        # to get a better decoder log likelihood.\n                ModelVarType.FIXED_LARGE: (\n                    np.append(self.posterior_variance[1], self.betas[1:]),\n                    np.log(np.append(self.posterior_variance[1], self.betas[1:])),\n                ),\n                ModelVarType.FIXED_SMALL: (\n                    self.posterior_variance,\n                    self.posterior_log_variance_clipped,\n                ),\n            }[self.model_var_type]\n            model_variance = _extract_into_tensor(model_variance, t, x.shape)\n            model_log_variance = _extract_into_tensor(model_log_variance, t, x.shape)\n\n        def process_xstart(x):\n            if denoised_fn is not None:\n                x = denoised_fn(x)\n            if clip_denoised:\n                return x.clamp(-1, 1)\n            return x\n\n        if self.model_mean_type == ModelMeanType.PREVIOUS_X:\n            pred_xstart = process_xstart(self._predict_xstart_from_xprev(x_t=x, t=t, xprev=model_output))\n            model_mean = model_output\n        elif self.model_mean_type in [ModelMeanType.START_X, ModelMeanType.EPSILON]:\n            if self.model_mean_type == ModelMeanType.START_X:\n                pred_xstart = process_xstart(model_output)\n            else:\n                pred_xstart = process_xstart(self._predict_xstart_from_eps(x_t=x, t=t, eps=model_output))\n            model_mean, _, _ = self.q_posterior_mean_variance(x_start=pred_xstart, x_t=x, t=t)\n        else:\n            raise NotImplementedError(self.model_mean_type)\n\n        assert (model_mean.shape == model_log_variance.shape == pred_xstart.shape == x.shape)\n        return {\n            \"mean\": model_mean,\n            \"variance\": model_variance,\n            \"log_variance\": model_log_variance,\n            \"pred_xstart\": pred_xstart,\n        }\n\n    def _predict_xstart_from_eps(self, x_t, t, eps):\n        assert x_t.shape == eps.shape\n        return 
(_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape) * eps)\n\n    def _predict_xstart_from_xprev(self, x_t, t, xprev):\n        assert x_t.shape == xprev.shape\n        return (  # (xprev - coef2*x_t) / coef1\n            _extract_into_tensor(1.0 / self.posterior_mean_coef1, t, x_t.shape) * xprev -\n            _extract_into_tensor(self.posterior_mean_coef2 / self.posterior_mean_coef1, t, x_t.shape) * x_t)\n\n    def _predict_eps_from_xstart(self, x_t, t, pred_xstart):\n        return (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                pred_xstart) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape)\n\n    def _scale_timesteps(self, t):\n        if self.rescale_timesteps:\n            return paddle.cast((t), 'float32') * (1000.0 / self.num_timesteps)\n        return t\n\n    def condition_mean(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_mean_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. 
In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, t, p_mean_var, **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_score(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def condition_score_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, t, p_mean_var, 
**model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def p_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def p_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean_with_grad(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"].detach()}\n\n    def p_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model.\n\n        :param model: the model module.\n        :param shape: the shape of the samples, (N, C, H, W).\n        :param noise: if specified, the noise from the encoder to sample.\n                      Should be of the same shape as `shape`.\n        :param clip_denoised: if True, clip x_start predictions to [-1, 1].\n        :param denoised_fn: if not None, a function which applies 
to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param device: if specified, the device to create the samples on.\n                       If not specified, use a model parameter's device.\n        :param progress: if True, show a tqdm progress bar.\n        :return: a non-differentiable batch of samples.\n        \"\"\"\n        final = None\n        for sample in self.p_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def p_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model and yield intermediate samples from\n        each timestep of diffusion.\n\n        Arguments are the same as p_sample_loop().\n        Returns a generator over dicts, where each dict is the return value of\n        p_sample().\n        \"\"\"\n        if device is None:\n     
       device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            sample_fn = self.p_sample_with_grad if cond_fn_with_grad else self.p_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def ddim_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n      
  if cond_fn is not None:\n            out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"]}\n\n    def ddim_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        if cond_fn is not None:\n            out 
= self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        out[\"pred_xstart\"] = out[\"pred_xstart\"].detach()\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"].detach()}\n\n    def ddim_reverse_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t+1} from the model using DDIM reverse ODE.\n        \"\"\"\n        assert eta == 0.0, \"Reverse ODE only for deterministic path\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n      
  eps = (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x.shape) * x -\n               out[\"pred_xstart\"]) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x.shape)\n        alpha_bar_next = _extract_into_tensor(self.alphas_cumprod_next, t, x.shape)\n\n        # Equation 12. reversed\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_next) + paddle.sqrt(1 - alpha_bar_next) * eps)\n\n        return {\"sample\": mean_pred, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def ddim_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model using DDIM.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.ddim_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                eta=eta,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def ddim_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        
skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Use DDIM to sample from the model and yield intermediate samples from\n        each timestep of DDIM.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        # if device is None:\n        #     device = next(model.parameters()).device\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0])\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(\n                    low=0,\n                    high=model.num_classes,\n                    shape=model_kwargs['y'].shape,\n                )\n            sample_fn = self.ddim_sample_with_grad if cond_fn_with_grad else self.ddim_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                eta=eta,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def plms_sample(\n        self,\n        model,\n        
x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        cond_fn_with_grad=False,\n        order=2,\n        old_out=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample().\n        \"\"\"\n        if not isinstance(order, int) or not 1 <= order <= 4:\n            raise ValueError('order is invalid (should be int from 1-4).')\n\n        def get_model_output(x, t):\n            with paddle.set_grad_enabled(cond_fn_with_grad and cond_fn is not None):\n                if cond_fn_with_grad:\n                    # Paddle tensors have no requires_grad_(); use stop_gradient instead.\n                    x = x.detach()\n                    x.stop_gradient = False\n                out_orig = self.p_mean_variance(\n                    model,\n                    x,\n                    t,\n                    clip_denoised=clip_denoised,\n                    denoised_fn=denoised_fn,\n                    model_kwargs=model_kwargs,\n                )\n                if cond_fn is not None:\n                    if cond_fn_with_grad:\n                        out = self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                        x = x.detach()\n                    else:\n                        out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                else:\n                    out = out_orig\n\n            # Usually our model outputs epsilon, but we re-derive it\n            # in case we used x_start or x_prev prediction.\n            eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n            return eps, out, out_orig\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        eps, out, out_orig = get_model_output(x, t)\n\n        if order > 1 and old_out is None:\n            # Pseudo Improved Euler\n            old_eps = [eps]\n            mean_pred = 
out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps\n            eps_2, _, _ = get_model_output(mean_pred, t - 1)\n            eps_prime = (eps + eps_2) / 2\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n        else:\n            # Pseudo Linear Multistep (Adams-Bashforth)\n            old_eps = old_out[\"old_eps\"]\n            old_eps.append(eps)\n            cur_order = min(order, len(old_eps))\n            if cur_order == 1:\n                eps_prime = old_eps[-1]\n            elif cur_order == 2:\n                eps_prime = (3 * old_eps[-1] - old_eps[-2]) / 2\n            elif cur_order == 3:\n                eps_prime = (23 * old_eps[-1] - 16 * old_eps[-2] + 5 * old_eps[-3]) / 12\n            elif cur_order == 4:\n                eps_prime = (55 * old_eps[-1] - 59 * old_eps[-2] + 37 * old_eps[-3] - 9 * old_eps[-4]) / 24\n            else:\n                raise RuntimeError('cur_order is invalid.')\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n\n        if len(old_eps) >= order:\n            old_eps.pop(0)\n\n        nonzero_mask = paddle.cast((t != 0), 'float32').reshape([-1, *([1] * (len(x.shape) - 1))])\n        sample = mean_pred * nonzero_mask + out[\"pred_xstart\"] * (1 - nonzero_mask)\n\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"], \"old_eps\": old_eps}\n\n    def plms_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        
cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Generate samples from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.plms_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def plms_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Use PLMS to sample from the model and yield intermediate samples from each\n        timestep of PLMS.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], 
dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        old_out = None\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            out = self.plms_sample(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n                old_out=old_out,\n            )\n            yield out\n            old_out = out\n            img = out[\"sample\"]\n\n    def _vb_terms_bpd(self, model, x_start, x_t, t, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Get a term for the variational lower-bound.\n\n        The resulting units are bits (rather than nats, as one might expect).\n        This allows for comparison to other papers.\n\n        :return: a dict with the following keys:\n                 - 'output': a shape [N] tensor of NLLs or KLs.\n                 - 'pred_xstart': the x_0 predictions.\n        \"\"\"\n        true_mean, _, true_log_variance_clipped = self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)\n        out = self.p_mean_variance(model, x_t, t, clip_denoised=clip_denoised, model_kwargs=model_kwargs)\n        kl = normal_kl(true_mean, true_log_variance_clipped, out[\"mean\"], out[\"log_variance\"])\n        kl = mean_flat(kl) / np.log(2.0)\n\n        decoder_nll = -discretized_gaussian_log_likelihood(\n            x_start, 
means=out[\"mean\"], log_scales=0.5 * out[\"log_variance\"])\n        assert decoder_nll.shape == x_start.shape\n        decoder_nll = mean_flat(decoder_nll) / np.log(2.0)\n\n        # At the first timestep return the decoder NLL,\n        # otherwise return KL(q(x_{t-1}|x_t,x_0) || p(x_{t-1}|x_t))\n        output = paddle.where((t == 0), decoder_nll, kl)\n        return {\"output\": output, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def training_losses(self, model, x_start, t, model_kwargs=None, noise=None):\n        \"\"\"\n        Compute training losses for a single timestep.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param t: a batch of timestep indices.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param noise: if specified, the specific Gaussian noise to try to remove.\n        :return: a dict with the key \"loss\" containing a tensor of shape [N].\n                 Some mean or variance settings may also have other keys.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        x_t = self.q_sample(x_start, t, noise=noise)\n\n        terms = {}\n\n        if self.loss_type == LossType.KL or self.loss_type == LossType.RESCALED_KL:\n            terms[\"loss\"] = self._vb_terms_bpd(\n                model=model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t,\n                clip_denoised=False,\n                model_kwargs=model_kwargs,\n            )[\"output\"]\n            if self.loss_type == LossType.RESCALED_KL:\n                terms[\"loss\"] *= self.num_timesteps\n        elif self.loss_type == LossType.MSE or self.loss_type == LossType.RESCALED_MSE:\n     
       model_output = model(x_t, self._scale_timesteps(t), **model_kwargs)\n\n            if self.model_var_type in [\n                    ModelVarType.LEARNED,\n                    ModelVarType.LEARNED_RANGE,\n            ]:\n                B, C = x_t.shape[:2]\n                assert model_output.shape == [B, C * 2, *x_t.shape[2:]]\n                model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n                # Learn the variance using the variational bound, but don't let\n                # it affect our mean prediction.\n                frozen_out = paddle.concat([model_output.detach(), model_var_values], axis=1)\n                terms[\"vb\"] = self._vb_terms_bpd(\n                    model=lambda *args, r=frozen_out: r,\n                    x_start=x_start,\n                    x_t=x_t,\n                    t=t,\n                    clip_denoised=False,\n                )[\"output\"]\n                if self.loss_type == LossType.RESCALED_MSE:\n                    # Divide by 1000 for equivalence with initial implementation.\n                    # Without a factor of 1/1000, the VB term hurts the MSE term.\n                    terms[\"vb\"] *= self.num_timesteps / 1000.0\n\n            target = {\n                ModelMeanType.PREVIOUS_X: self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)[0],\n                ModelMeanType.START_X: x_start,\n                ModelMeanType.EPSILON: noise,\n            }[self.model_mean_type]\n            assert model_output.shape == target.shape == x_start.shape\n            terms[\"mse\"] = mean_flat((target - model_output)**2)\n            if \"vb\" in terms:\n                terms[\"loss\"] = terms[\"mse\"] + terms[\"vb\"]\n            else:\n                terms[\"loss\"] = terms[\"mse\"]\n        else:\n            raise NotImplementedError(self.loss_type)\n\n        return terms\n\n    def _prior_bpd(self, x_start):\n        \"\"\"\n        Get the prior KL term for the variational 
lower-bound, measured in\n        bits-per-dim.\n\n        This term can't be optimized, as it only depends on the encoder.\n\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :return: a batch of [N] KL values (in bits), one per batch element.\n        \"\"\"\n        batch_size = x_start.shape[0]\n        t = paddle.to_tensor([self.num_timesteps - 1] * batch_size, place=x_start.place)\n        qt_mean, _, qt_log_variance = self.q_mean_variance(x_start, t)\n        kl_prior = normal_kl(mean1=qt_mean, logvar1=qt_log_variance, mean2=0.0, logvar2=0.0)\n        return mean_flat(kl_prior) / np.log(2.0)\n\n    def calc_bpd_loop(self, model, x_start, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Compute the entire variational lower-bound, measured in bits-per-dim,\n        as well as other related quantities.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param clip_denoised: if True, clip denoised samples.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n\n        :return: a dict containing the following keys:\n                 - total_bpd: the total variational lower-bound, per batch element.\n                 - prior_bpd: the prior term in the lower-bound.\n                 - vb: an [N x T] tensor of terms in the lower-bound.\n                 - xstart_mse: an [N x T] tensor of x_0 MSEs for each timestep.\n                 - mse: an [N x T] tensor of epsilon MSEs for each timestep.\n        \"\"\"\n        device = x_start.place\n        batch_size = x_start.shape[0]\n\n        vb = []\n        xstart_mse = []\n        mse = []\n        for t in list(range(self.num_timesteps))[::-1]:\n            t_batch = paddle.to_tensor([t] * batch_size, place=device)\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n            x_t = self.q_sample(x_start=x_start, t=t_batch, noise=noise)\n            # Calculate VLB term at the current timestep\n            # with paddle.no_grad():\n            out = self._vb_terms_bpd(\n                model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t_batch,\n                clip_denoised=clip_denoised,\n                model_kwargs=model_kwargs,\n            )\n            vb.append(out[\"output\"])\n            xstart_mse.append(mean_flat((out[\"pred_xstart\"] - x_start)**2))\n            eps = self._predict_eps_from_xstart(x_t, t_batch, out[\"pred_xstart\"])\n            mse.append(mean_flat((eps - noise)**2))\n\n        vb = paddle.stack(vb, axis=1)\n        xstart_mse = paddle.stack(xstart_mse, axis=1)\n        mse = paddle.stack(mse, axis=1)\n\n        prior_bpd = self._prior_bpd(x_start)\n        total_bpd = vb.sum(axis=1) + prior_bpd\n        return {\n            \"total_bpd\": total_bpd,\n            \"prior_bpd\": prior_bpd,\n            \"vb\": vb,\n            \"xstart_mse\": xstart_mse,\n            \"mse\": mse,\n        }\n\n\ndef 
_extract_into_tensor(arr, timesteps, broadcast_shape):\n    \"\"\"\n    Extract values from a 1-D numpy array for a batch of indices.\n\n    :param arr: the 1-D numpy array.\n    :param timesteps: a tensor of indices into the array to extract.\n    :param broadcast_shape: a larger shape of K dimensions with the batch\n                            dimension equal to the length of timesteps.\n    :return: a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n    \"\"\"\n    res = paddle.to_tensor(arr, place=timesteps.place)[timesteps]\n    while len(res.shape) < len(broadcast_shape):\n        res = res[..., None]\n    return res.expand(broadcast_shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/losses.py",
    "content": "\"\"\"\nHelpers for various likelihood-based losses implemented by Paddle. These are ported from the original\nHo et al. diffusion models codebase:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/utils.py\n\"\"\"\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef normal_kl(mean1, logvar1, mean2, logvar2):\n    \"\"\"\n    Compute the KL divergence between two gaussians.\n\n    Shapes are automatically broadcasted, so batches can be compared to\n    scalars, among other use cases.\n    \"\"\"\n    tensor = None\n    for obj in (mean1, logvar1, mean2, logvar2):\n        if isinstance(obj, paddle.Tensor):\n            tensor = obj\n            break\n    assert tensor is not None, \"at least one argument must be a Tensor\"\n\n    # Force variances to be Tensors. Broadcasting helps convert scalars to\n    # Tensors, but it does not work for th.exp().\n    logvar1, logvar2 = [x if isinstance(x, paddle.Tensor) else paddle.to_tensor(x) for x in (logvar1, logvar2)]\n\n    return 0.5 * (-1.0 + logvar2 - logvar1 + paddle.exp(logvar1 - logvar2) +\n                  ((mean1 - mean2)**2) * paddle.exp(-logvar2))\n\n\ndef approx_standard_normal_cdf(x):\n    \"\"\"\n    A fast approximation of the cumulative distribution function of the\n    standard normal.\n    \"\"\"\n    return 0.5 * (1.0 + paddle.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef discretized_gaussian_log_likelihood(x, *, means, log_scales):\n    \"\"\"\n    Compute the log-likelihood of a Gaussian distribution discretizing to a\n    given image.\n\n    :param x: the target images. 
It is assumed that this was uint8 values,\n              rescaled to the range [-1, 1].\n    :param means: the Gaussian mean Tensor.\n    :param log_scales: the Gaussian log stddev Tensor.\n    :return: a tensor like x of log probabilities (in nats).\n    \"\"\"\n    assert x.shape == means.shape == log_scales.shape\n    centered_x = x - means\n    inv_stdv = paddle.exp(-log_scales)\n    plus_in = inv_stdv * (centered_x + 1.0 / 255.0)\n    cdf_plus = approx_standard_normal_cdf(plus_in)\n    min_in = inv_stdv * (centered_x - 1.0 / 255.0)\n    cdf_min = approx_standard_normal_cdf(min_in)\n    log_cdf_plus = paddle.log(cdf_plus.clip(min=1e-12))\n    log_one_minus_cdf_min = paddle.log((1.0 - cdf_min).clip(min=1e-12))\n    cdf_delta = cdf_plus - cdf_min\n    log_probs = paddle.where(\n        x < -0.999,\n        log_cdf_plus,\n        paddle.where(x > 0.999, log_one_minus_cdf_min, paddle.log(cdf_delta.clip(min=1e-12))),\n    )\n    assert log_probs.shape == x.shape\n    return log_probs\n\n\ndef spherical_dist_loss(x, y):\n    x = F.normalize(x, axis=-1)\n    y = F.normalize(y, axis=-1)\n    return (x - y).norm(axis=-1).divide(paddle.to_tensor(2.0)).asin().pow(2).multiply(paddle.to_tensor(2.0))\n\n\ndef tv_loss(input):\n    \"\"\"L2 total variation loss, as in Mahendran et al.\"\"\"\n    input = F.pad(input, (0, 1, 0, 1), 'replicate')\n    x_diff = input[..., :-1, 1:] - input[..., :-1, :-1]\n    y_diff = input[..., 1:, :-1] - input[..., :-1, :-1]\n    return (x_diff**2 + y_diff**2).mean([1, 2, 3])\n\n\ndef range_loss(input):\n    return (input - input.clip(-1, 1)).pow(2).mean([1, 2, 3])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/make_cutouts.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/make_cutouts.py\n'''\nimport math\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_clip_rn101.resize_right.resize_right import resize\nfrom paddle.nn import functional as F\n\nfrom . import transforms as T\n\nskip_augs = False  # @param{type: 'boolean'}\n\n\ndef sinc(x):\n    return paddle.where(x != 0, paddle.sin(math.pi * x) / (math.pi * x), x.new_ones([]))\n\n\ndef lanczos(x, a):\n    cond = paddle.logical_and(-a < x, x < a)\n    out = paddle.where(cond, sinc(x) * sinc(x / a), x.new_zeros([]))\n    return out / out.sum()\n\n\ndef ramp(ratio, width):\n    n = math.ceil(width / ratio + 1)\n    out = paddle.empty([n])\n    cur = 0\n    for i in range(out.shape[0]):\n        out[i] = cur\n        cur += ratio\n    return paddle.concat([-out[1:].flip([0]), out])[1:-1]\n\n\nclass MakeCutouts(nn.Layer):\n\n    def __init__(self, cut_size, cutn, skip_augs=False):\n        super().__init__()\n        self.cut_size = cut_size\n        self.cutn = cutn\n        self.skip_augs = skip_augs\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomPerspective(distortion_scale=0.4, p=0.7),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.15),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        input = T.Pad(input.shape[2] // 4, fill=0)(input)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n\n        cutouts = []\n        for ch in range(self.cutn):\n    
        if ch > self.cutn - self.cutn // 4:\n                cutout = input.clone()\n            else:\n                size = int(max_size *\n                           paddle.zeros(1, ).normal_(mean=0.8, std=0.3).clip(float(self.cut_size / max_size), 1.0))\n                offsetx = paddle.randint(0, abs(sideX - size + 1), ())\n                offsety = paddle.randint(0, abs(sideY - size + 1), ())\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n\n            if not self.skip_augs:\n                cutout = self.augs(cutout)\n            cutouts.append(resample(cutout, (self.cut_size, self.cut_size)))\n            del cutout\n\n        cutouts = paddle.concat(cutouts, axis=0)\n        return cutouts\n\n\nclass MakeCutoutsDango(nn.Layer):\n\n    def __init__(self, cut_size, Overview=4, InnerCrop=0, IC_Size_Pow=0.5, IC_Grey_P=0.2):\n        super().__init__()\n        self.cut_size = cut_size\n        self.Overview = Overview\n        self.InnerCrop = InnerCrop\n        self.IC_Size_Pow = IC_Size_Pow\n        self.IC_Grey_P = IC_Grey_P\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(\n                degrees=10,\n                translate=(0.05, 0.05),\n                interpolation=T.InterpolationMode.BILINEAR,\n            ),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.1),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        cutouts = []\n        gray = T.Grayscale(3)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n        min_size = min(sideX, sideY, self.cut_size)\n        output_shape = [1, 3, self.cut_size, self.cut_size]\n        pad_input = F.pad(\n      
      input,\n            (\n                (sideY - max_size) // 2,\n                (sideY - max_size) // 2,\n                (sideX - max_size) // 2,\n                (sideX - max_size) // 2,\n            ),\n            **padargs,\n        )\n        cutout = resize(pad_input, out_shape=output_shape)\n\n        if self.Overview > 0:\n            if self.Overview <= 4:\n                if self.Overview >= 1:\n                    cutouts.append(cutout)\n                if self.Overview >= 2:\n                    cutouts.append(gray(cutout))\n                if self.Overview >= 3:\n                    cutouts.append(cutout[:, :, :, ::-1])\n                if self.Overview == 4:\n                    cutouts.append(gray(cutout[:, :, :, ::-1]))\n            else:\n                cutout = resize(pad_input, out_shape=output_shape)\n                for _ in range(self.Overview):\n                    cutouts.append(cutout)\n\n        if self.InnerCrop > 0:\n            for i in range(self.InnerCrop):\n                size = int(paddle.rand([1])**self.IC_Size_Pow * (max_size - min_size) + min_size)\n                offsetx = paddle.randint(0, sideX - size + 1)\n                offsety = paddle.randint(0, sideY - size + 1)\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n                if i <= int(self.IC_Grey_P * self.InnerCrop):\n                    cutout = gray(cutout)\n                cutout = resize(cutout, out_shape=output_shape)\n                cutouts.append(cutout)\n\n        cutouts = paddle.concat(cutouts)\n        if skip_augs is not True:\n            cutouts = self.augs(cutouts)\n        return cutouts\n\n\ndef resample(input, size, align_corners=True):\n    n, c, h, w = input.shape\n    dh, dw = size\n\n    input = input.reshape([n * c, 1, h, w])\n\n    if dh < h:\n        kernel_h = lanczos(ramp(dh / h, 2), 2).to(input.device, input.dtype)\n        pad_h = (kernel_h.shape[0] - 1) // 2\n        input = F.pad(input, 
(0, 0, pad_h, pad_h), 'reflect')\n        input = F.conv2d(input, kernel_h[None, None, :, None])\n\n    if dw < w:\n        kernel_w = lanczos(ramp(dw / w, 2), 2).to(input.device, input.dtype)\n        pad_w = (kernel_w.shape[0] - 1) // 2\n        input = F.pad(input, (pad_w, pad_w, 0, 0), 'reflect')\n        input = F.conv2d(input, kernel_w[None, None, None, :])\n\n    input = input.reshape([n, c, h, w])\n    return F.interpolate(input, size, mode='bicubic', align_corners=align_corners)\n\n\npadargs = {}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/nn.py",
    "content": "\"\"\"\nVarious utilities for neural networks implemented by Paddle. This code is rewritten based on:\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/nn.py\n\"\"\"\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\n\nclass SiLU(nn.Layer):\n\n    def forward(self, x):\n        return x * nn.functional.sigmoid(x)\n\n\nclass GroupNorm32(nn.GroupNorm):\n\n    def forward(self, x):\n        return super().forward(x)\n\n\ndef conv_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D convolution module.\n    \"\"\"\n    if dims == 1:\n        return nn.Conv1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.Conv2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.Conv3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef linear(*args, **kwargs):\n    \"\"\"\n    Create a linear module.\n    \"\"\"\n    return nn.Linear(*args, **kwargs)\n\n\ndef avg_pool_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D average pooling module.\n    \"\"\"\n    if dims == 1:\n        return nn.AvgPool1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.AvgPool2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.AvgPool3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef update_ema(target_params, source_params, rate=0.99):\n    \"\"\"\n    Update target parameters to be closer to those of source parameters using\n    an exponential moving average.\n\n    :param target_params: the target parameter sequence.\n    :param source_params: the source parameter sequence.\n    :param rate: the EMA rate (closer to 1 means slower).\n    \"\"\"\n    for targ, src in zip(target_params, source_params):\n        targ.detach().mul_(rate).add_(src, alpha=1 - rate)\n\n\ndef zero_module(module):\n    \"\"\"\n    Zero out the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().zero_()\n    
return module\n\n\ndef scale_module(module, scale):\n    \"\"\"\n    Scale the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().mul_(scale)\n    return module\n\n\ndef mean_flat(tensor):\n    \"\"\"\n    Take the mean over all non-batch dimensions.\n    \"\"\"\n    return tensor.mean(axis=list(range(1, len(tensor.shape))))\n\n\ndef normalization(channels):\n    \"\"\"\n    Make a standard normalization layer.\n\n    :param channels: number of input channels.\n    :return: an nn.Module for normalization.\n    \"\"\"\n    return GroupNorm32(32, channels)\n\n\ndef timestep_embedding(timesteps, dim, max_period=10000):\n    \"\"\"\n    Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param dim: the dimension of the output.\n    :param max_period: controls the minimum frequency of the embeddings.\n    :return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    half = dim // 2\n    freqs = paddle.exp(-math.log(max_period) * paddle.arange(start=0, end=half, dtype=paddle.float32) / half)\n    args = paddle.cast(timesteps[:, None], 'float32') * freqs[None]\n    embedding = paddle.concat([paddle.cos(args), paddle.sin(args)], axis=-1)\n    if dim % 2:\n        embedding = paddle.concat([embedding, paddle.zeros_like(embedding[:, :1])], axis=-1)\n    return embedding\n\n\ndef checkpoint(func, inputs, params, flag):\n    \"\"\"\n    This function is disabled. And now just forward.\n    \"\"\"\n    return func(*inputs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/perlin_noises.py",
    "content": "'''\nPerlin noise implementation by Paddle.\nThis code is rewritten based on:\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/perlin_noises.py\n'''\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as TF\nfrom PIL import Image\nfrom PIL import ImageOps\n\n\ndef interp(t):\n    return 3 * t**2 - 2 * t**3\n\n\ndef perlin(width, height, scale=10):\n    gx, gy = paddle.randn([2, width + 1, height + 1, 1, 1])\n    xs = paddle.linspace(0, 1, scale + 1)[:-1, None]\n    ys = paddle.linspace(0, 1, scale + 1)[None, :-1]\n    wx = 1 - interp(xs)\n    wy = 1 - interp(ys)\n    dots = 0\n    dots += wx * wy * (gx[:-1, :-1] * xs + gy[:-1, :-1] * ys)\n    dots += (1 - wx) * wy * (-gx[1:, :-1] * (1 - xs) + gy[1:, :-1] * ys)\n    dots += wx * (1 - wy) * (gx[:-1, 1:] * xs - gy[:-1, 1:] * (1 - ys))\n    dots += (1 - wx) * (1 - wy) * (-gx[1:, 1:] * (1 - xs) - gy[1:, 1:] * (1 - ys))\n    return dots.transpose([0, 2, 1, 3]).reshape([width * scale, height * scale])\n\n\ndef perlin_ms(octaves, width, height, grayscale):\n    out_array = [0.5] if grayscale else [0.5, 0.5, 0.5]\n    # out_array = [0.0] if grayscale else [0.0, 0.0, 0.0]\n    for i in range(1 if grayscale else 3):\n        scale = 2**len(octaves)\n        oct_width = width\n        oct_height = height\n        for oct in octaves:\n            p = perlin(oct_width, oct_height, scale)\n            out_array[i] += p * oct\n            scale //= 2\n            oct_width *= 2\n            oct_height *= 2\n    return paddle.concat(out_array)\n\n\ndef create_perlin_noise(octaves, width, height, grayscale, side_y, side_x):\n    out = perlin_ms(octaves, width, height, grayscale)\n    if grayscale:\n        out = TF.resize(size=(side_y, side_x), img=out.numpy())\n        out = np.uint8(out)\n        out = Image.fromarray(out).convert('RGB')\n    else:\n        out = out.reshape([-1, 3, out.shape[0] // 3, out.shape[1]])\n        out = out.squeeze().transpose([1, 2, 0]).numpy()\n        out = 
TF.resize(size=(side_y, side_x), img=out)\n        out = out.clip(0, 1) * 255\n        out = np.uint8(out)\n        out = Image.fromarray(out)\n\n    out = ImageOps.autocontrast(out)\n    return out\n\n\ndef regen_perlin(perlin_mode, side_y, side_x, batch_size):\n    if perlin_mode == 'color':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n    elif perlin_mode == 'gray':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n    else:\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n\n    init = (TF.to_tensor(init).add(TF.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n    del init2\n    return init.expand([batch_size, -1, -1, -1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/respace.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/respace.py\n'''\nimport numpy as np\nimport paddle\n\nfrom .gaussian_diffusion import GaussianDiffusion\n\n\ndef space_timesteps(num_timesteps, section_counts):\n    \"\"\"\n    Create a list of timesteps to use from an original diffusion process,\n    given the number of timesteps we want to take from equally-sized portions\n    of the original process.\n\n    For example, if there's 300 timesteps and the section counts are [10,15,20]\n    then the first 100 timesteps are strided to be 10 timesteps, the second 100\n    are strided to be 15 timesteps, and the final 100 are strided to be 20.\n\n    If the stride is a string starting with \"ddim\", then the fixed striding\n    from the DDIM paper is used, and only one section is allowed.\n\n    :param num_timesteps: the number of diffusion steps in the original\n                          process to divide up.\n    :param section_counts: either a list of numbers, or a string containing\n                           comma-separated numbers, indicating the step count\n                           per section. 
As a special case, use \"ddimN\" where N\n                           is a number of steps to use the striding from the\n                           DDIM paper.\n    :return: a set of diffusion steps from the original process to use.\n    \"\"\"\n    if isinstance(section_counts, str):\n        if section_counts.startswith(\"ddim\"):\n            desired_count = int(section_counts[len(\"ddim\"):])\n            for i in range(1, num_timesteps):\n                if len(range(0, num_timesteps, i)) == desired_count:\n                    return set(range(0, num_timesteps, i))\n            raise ValueError(f\"cannot create exactly {num_timesteps} steps with an integer stride\")\n        section_counts = [int(x) for x in section_counts.split(\",\")]\n    size_per = num_timesteps // len(section_counts)\n    extra = num_timesteps % len(section_counts)\n    start_idx = 0\n    all_steps = []\n    for i, section_count in enumerate(section_counts):\n        size = size_per + (1 if i < extra else 0)\n        if size < section_count:\n            raise ValueError(f\"cannot divide section of {size} steps into {section_count}\")\n        if section_count <= 1:\n            frac_stride = 1\n        else:\n            frac_stride = (size - 1) / (section_count - 1)\n        cur_idx = 0.0\n        taken_steps = []\n        for _ in range(section_count):\n            taken_steps.append(start_idx + round(cur_idx))\n            cur_idx += frac_stride\n        all_steps += taken_steps\n        start_idx += size\n    return set(all_steps)\n\n\nclass SpacedDiffusion(GaussianDiffusion):\n    \"\"\"\n    A diffusion process which can skip steps in a base diffusion process.\n\n    :param use_timesteps: a collection (sequence or set) of timesteps from the\n                          original diffusion process to retain.\n    :param kwargs: the kwargs to create the base diffusion process.\n    \"\"\"\n\n    def __init__(self, use_timesteps, **kwargs):\n        self.use_timesteps = 
set(use_timesteps)\n        self.timestep_map = []\n        self.original_num_steps = len(kwargs[\"betas\"])\n\n        base_diffusion = GaussianDiffusion(**kwargs)  # pylint: disable=missing-kwoa\n        last_alpha_cumprod = 1.0\n        new_betas = []\n        for i, alpha_cumprod in enumerate(base_diffusion.alphas_cumprod):\n            if i in self.use_timesteps:\n                new_betas.append(1 - alpha_cumprod / last_alpha_cumprod)\n                last_alpha_cumprod = alpha_cumprod\n                self.timestep_map.append(i)\n        kwargs[\"betas\"] = np.array(new_betas)\n        super().__init__(**kwargs)\n\n    def p_mean_variance(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().p_mean_variance(self._wrap_model(model), *args, **kwargs)\n\n    def training_losses(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().training_losses(self._wrap_model(model), *args, **kwargs)\n\n    def condition_mean(self, cond_fn, *args, **kwargs):\n        return super().condition_mean(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def condition_score(self, cond_fn, *args, **kwargs):\n        return super().condition_score(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def _wrap_model(self, model):\n        if isinstance(model, _WrappedModel):\n            return model\n        return _WrappedModel(model, self.timestep_map, self.rescale_timesteps, self.original_num_steps)\n\n    def _scale_timesteps(self, t):\n        # Scaling is done by the wrapped model.\n        return t\n\n\nclass _WrappedModel:\n\n    def __init__(self, model, timestep_map, rescale_timesteps, original_num_steps):\n        self.model = model\n        self.timestep_map = timestep_map\n        self.rescale_timesteps = rescale_timesteps\n        self.original_num_steps = original_num_steps\n\n    def __call__(self, x, ts, **kwargs):\n        map_tensor = paddle.to_tensor(self.timestep_map, place=ts.place, 
dtype=ts.dtype)\n        new_ts = map_tensor[ts]\n        if self.rescale_timesteps:\n            new_ts = paddle.cast(new_ts, 'float32') * (1000.0 / self.original_num_steps)\n        return self.model(x, new_ts, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/script_util.py",
    "content": "'''\nThis code is based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/script_util.py\n'''\nimport argparse\nimport inspect\n\nfrom . import gaussian_diffusion as gd\nfrom .respace import space_timesteps\nfrom .respace import SpacedDiffusion\nfrom .unet import EncoderUNetModel\nfrom .unet import SuperResModel\nfrom .unet import UNetModel\n\nNUM_CLASSES = 1000\n\n\ndef diffusion_defaults():\n    \"\"\"\n    Defaults for image and classifier training.\n    \"\"\"\n    return dict(\n        learn_sigma=False,\n        diffusion_steps=1000,\n        noise_schedule=\"linear\",\n        timestep_respacing=\"\",\n        use_kl=False,\n        predict_xstart=False,\n        rescale_timesteps=False,\n        rescale_learned_sigmas=False,\n    )\n\n\ndef model_and_diffusion_defaults():\n    \"\"\"\n    Defaults for image training.\n    \"\"\"\n    res = dict(\n        image_size=64,\n        num_channels=128,\n        num_res_blocks=2,\n        num_heads=4,\n        num_heads_upsample=-1,\n        num_head_channels=-1,\n        attention_resolutions=\"16,8\",\n        channel_mult=\"\",\n        dropout=0.0,\n        class_cond=False,\n        use_checkpoint=False,\n        use_scale_shift_norm=True,\n        resblock_updown=False,\n        use_fp16=False,\n        use_new_attention_order=False,\n    )\n    res.update(diffusion_defaults())\n    return res\n\n\ndef create_model_and_diffusion(\n    image_size,\n    class_cond,\n    learn_sigma,\n    num_channels,\n    num_res_blocks,\n    channel_mult,\n    num_heads,\n    num_head_channels,\n    num_heads_upsample,\n    attention_resolutions,\n    dropout,\n    diffusion_steps,\n    noise_schedule,\n    timestep_respacing,\n    use_kl,\n    predict_xstart,\n    rescale_timesteps,\n    rescale_learned_sigmas,\n    use_checkpoint,\n    use_scale_shift_norm,\n    resblock_updown,\n    use_fp16,\n    use_new_attention_order,\n):\n    model = create_model(\n        image_size,\n        
num_channels,\n        num_res_blocks,\n        channel_mult=channel_mult,\n        learn_sigma=learn_sigma,\n        class_cond=class_cond,\n        use_checkpoint=use_checkpoint,\n        attention_resolutions=attention_resolutions,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        dropout=dropout,\n        resblock_updown=resblock_updown,\n        use_fp16=use_fp16,\n        use_new_attention_order=use_new_attention_order,\n    )\n    diffusion = create_gaussian_diffusion(\n        steps=diffusion_steps,\n        learn_sigma=learn_sigma,\n        noise_schedule=noise_schedule,\n        use_kl=use_kl,\n        predict_xstart=predict_xstart,\n        rescale_timesteps=rescale_timesteps,\n        rescale_learned_sigmas=rescale_learned_sigmas,\n        timestep_respacing=timestep_respacing,\n    )\n    return model, diffusion\n\n\ndef create_model(\n    image_size,\n    num_channels,\n    num_res_blocks,\n    channel_mult=\"\",\n    learn_sigma=False,\n    class_cond=False,\n    use_checkpoint=False,\n    attention_resolutions=\"16\",\n    num_heads=1,\n    num_head_channels=-1,\n    num_heads_upsample=-1,\n    use_scale_shift_norm=False,\n    dropout=0,\n    resblock_updown=False,\n    use_fp16=False,\n    use_new_attention_order=False,\n):\n    if channel_mult == \"\":\n        if image_size == 512:\n            channel_mult = (0.5, 1, 1, 2, 2, 4, 4)\n        elif image_size == 256:\n            channel_mult = (1, 1, 2, 2, 4, 4)\n        elif image_size == 128:\n            channel_mult = (1, 1, 2, 3, 4)\n        elif image_size == 64:\n            channel_mult = (1, 2, 3, 4)\n        else:\n            raise ValueError(f\"unsupported image size: {image_size}\")\n    else:\n        channel_mult = tuple(int(ch_mult) for ch_mult in channel_mult.split(\",\"))\n\n    attention_ds = []\n    for res in 
attention_resolutions.split(\",\"):\n        attention_ds.append(image_size // int(res))\n\n    return UNetModel(\n        image_size=image_size,\n        in_channels=3,\n        model_channels=num_channels,\n        out_channels=(3 if not learn_sigma else 6),\n        num_res_blocks=num_res_blocks,\n        attention_resolutions=tuple(attention_ds),\n        dropout=dropout,\n        channel_mult=channel_mult,\n        num_classes=(NUM_CLASSES if class_cond else None),\n        use_checkpoint=use_checkpoint,\n        use_fp16=use_fp16,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        resblock_updown=resblock_updown,\n        use_new_attention_order=use_new_attention_order,\n    )\n\n\ndef create_gaussian_diffusion(\n    *,\n    steps=1000,\n    learn_sigma=False,\n    sigma_small=False,\n    noise_schedule=\"linear\",\n    use_kl=False,\n    predict_xstart=False,\n    rescale_timesteps=False,\n    rescale_learned_sigmas=False,\n    timestep_respacing=\"\",\n):\n    betas = gd.get_named_beta_schedule(noise_schedule, steps)\n    if use_kl:\n        loss_type = gd.LossType.RESCALED_KL\n    elif rescale_learned_sigmas:\n        loss_type = gd.LossType.RESCALED_MSE\n    else:\n        loss_type = gd.LossType.MSE\n    if not timestep_respacing:\n        timestep_respacing = [steps]\n    return SpacedDiffusion(\n        use_timesteps=space_timesteps(steps, timestep_respacing),\n        betas=betas,\n        model_mean_type=(gd.ModelMeanType.EPSILON if not predict_xstart else gd.ModelMeanType.START_X),\n        model_var_type=((gd.ModelVarType.FIXED_LARGE if not sigma_small else gd.ModelVarType.FIXED_SMALL)\n                        if not learn_sigma else gd.ModelVarType.LEARNED_RANGE),\n        loss_type=loss_type,\n        rescale_timesteps=rescale_timesteps,\n    )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/sec_diff.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/sec_diff.py\n'''\nimport math\nfrom dataclasses import dataclass\nfrom functools import partial\n\nimport paddle\nimport paddle.nn as nn\n\n\n@dataclass\nclass DiffusionOutput:\n    v: paddle.Tensor\n    pred: paddle.Tensor\n    eps: paddle.Tensor\n\n\nclass SkipBlock(nn.Layer):\n\n    def __init__(self, main, skip=None):\n        super().__init__()\n        self.main = nn.Sequential(*main)\n        self.skip = skip if skip else nn.Identity()\n\n    def forward(self, input):\n        return paddle.concat([self.main(input), self.skip(input)], axis=1)\n\n\ndef append_dims(x, n):\n    return x[(Ellipsis, *(None, ) * (n - x.ndim))]\n\n\ndef expand_to_planes(x, shape):\n    return paddle.tile(append_dims(x, len(shape)), [1, 1, *shape[2:]])\n\n\ndef alpha_sigma_to_t(alpha, sigma):\n    return paddle.atan2(sigma, alpha) * 2 / math.pi\n\n\ndef t_to_alpha_sigma(t):\n    return paddle.cos(t * math.pi / 2), paddle.sin(t * math.pi / 2)\n\n\nclass SecondaryDiffusionImageNet2(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        c = 64  # The base channel count\n        cs = [c, c * 2, c * 2, c * 4, c * 4, c * 8]\n\n        self.timestep_embed = FourierFeatures(1, 16)\n        self.down = nn.AvgPool2D(2)\n        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)\n\n        self.net = nn.Sequential(\n            ConvBlock(3 + 16, cs[0]),\n            ConvBlock(cs[0], cs[0]),\n            SkipBlock([\n                self.down,\n                ConvBlock(cs[0], cs[1]),\n                ConvBlock(cs[1], cs[1]),\n                SkipBlock([\n                    self.down,\n                    ConvBlock(cs[1], cs[2]),\n                    ConvBlock(cs[2], cs[2]),\n                    SkipBlock([\n                        self.down,\n                        ConvBlock(cs[2], cs[3]),\n                        
ConvBlock(cs[3], cs[3]),\n                        SkipBlock([\n                            self.down,\n                            ConvBlock(cs[3], cs[4]),\n                            ConvBlock(cs[4], cs[4]),\n                            SkipBlock([\n                                self.down,\n                                ConvBlock(cs[4], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[4]),\n                                self.up,\n                            ]),\n                            ConvBlock(cs[4] * 2, cs[4]),\n                            ConvBlock(cs[4], cs[3]),\n                            self.up,\n                        ]),\n                        ConvBlock(cs[3] * 2, cs[3]),\n                        ConvBlock(cs[3], cs[2]),\n                        self.up,\n                    ]),\n                    ConvBlock(cs[2] * 2, cs[2]),\n                    ConvBlock(cs[2], cs[1]),\n                    self.up,\n                ]),\n                ConvBlock(cs[1] * 2, cs[1]),\n                ConvBlock(cs[1], cs[0]),\n                self.up,\n            ]),\n            ConvBlock(cs[0] * 2, cs[0]),\n            nn.Conv2D(cs[0], 3, 3, padding=1),\n        )\n\n    def forward(self, input, t):\n        timestep_embed = expand_to_planes(self.timestep_embed(t[:, None]), input.shape)\n        v = self.net(paddle.concat([input, timestep_embed], axis=1))\n        alphas, sigmas = map(partial(append_dims, n=v.ndim), t_to_alpha_sigma(t))\n        pred = input * alphas - v * sigmas\n        eps = input * sigmas + v * alphas\n        return DiffusionOutput(v, pred, eps)\n\n\nclass FourierFeatures(nn.Layer):\n\n    def __init__(self, in_features, out_features, std=1.0):\n        super().__init__()\n        assert out_features % 2 == 0\n        # self.weight = nn.Parameter(paddle.randn([out_features // 2, in_features]) * std)\n      
  self.weight = paddle.create_parameter([out_features // 2, in_features],\n                                              dtype='float32',\n                                              default_initializer=nn.initializer.Normal(mean=0.0, std=std))\n\n    def forward(self, input):\n        f = 2 * math.pi * input @ self.weight.T\n        return paddle.concat([f.cos(), f.sin()], axis=-1)\n\n\nclass ConvBlock(nn.Sequential):\n\n    def __init__(self, c_in, c_out):\n        super().__init__(\n            nn.Conv2D(c_in, c_out, 3, padding=1),\n            nn.ReLU(),\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/transforms.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/pytorch/vision/blob/main/torchvision/transforms/transforms.py\n'''\nimport math\nimport numbers\nimport warnings\nfrom enum import Enum\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn.functional import grid_sample\nfrom paddle.vision import transforms as T\n\n\nclass Normalize(nn.Layer):\n\n    def __init__(self, mean, std):\n        super(Normalize, self).__init__()\n        self.mean = paddle.to_tensor(mean)\n        self.std = paddle.to_tensor(std)\n\n    def forward(self, tensor: Tensor):\n        dtype = tensor.dtype\n        mean = paddle.to_tensor(self.mean, dtype=dtype)\n        std = paddle.to_tensor(self.std, dtype=dtype)\n        mean = mean.reshape([1, -1, 1, 1])\n        std = std.reshape([1, -1, 1, 1])\n        result = tensor.subtract(mean).divide(std)\n        return result\n\n\nclass InterpolationMode(Enum):\n    \"\"\"Interpolation modes\n    Available interpolation methods are ``nearest``, ``bilinear``, ``bicubic``, ``box``, ``hamming``, and ``lanczos``.\n    \"\"\"\n\n    NEAREST = \"nearest\"\n    BILINEAR = \"bilinear\"\n    BICUBIC = \"bicubic\"\n    # For PIL compatibility\n    BOX = \"box\"\n    HAMMING = \"hamming\"\n    LANCZOS = \"lanczos\"\n\n\nclass Grayscale(nn.Layer):\n\n    def __init__(self, num_output_channels):\n        super(Grayscale, self).__init__()\n        self.num_output_channels = num_output_channels\n\n    def forward(self, x):\n        output = (0.2989 * x[:, 0:1, :, :] + 0.587 * x[:, 1:2, :, :] + 0.114 * x[:, 2:3, :, :])\n        if self.num_output_channels == 3:\n            return output.expand(x.shape)\n\n        return output\n\n\nclass Lambda(nn.Layer):\n\n    def __init__(self, 
func):\n        super(Lambda, self).__init__()\n        self.transform = func\n\n    def forward(self, x):\n        return self.transform(x)\n\n\nclass RandomGrayscale(nn.Layer):\n\n    def __init__(self, p):\n        super(RandomGrayscale, self).__init__()\n        self.prob = p\n        self.transform = Grayscale(3)\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return self.transform(x)\n        else:\n            return x\n\n\nclass RandomHorizontalFlip(nn.Layer):\n\n    def __init__(self, prob):\n        super(RandomHorizontalFlip, self).__init__()\n        self.prob = prob\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return x[:, :, :, ::-1]\n        else:\n            return x\n\n\ndef _blend(img1: Tensor, img2: Tensor, ratio: float) -> Tensor:\n    ratio = float(ratio)\n    bound = 1.0\n    return (ratio * img1 + (1.0 - ratio) * img2).clip(0, bound)\n\n\ndef trunc_div(a, b):\n    ipt = paddle.divide(a, b)\n    sign_ipt = paddle.sign(ipt)\n    abs_ipt = paddle.abs(ipt)\n    abs_ipt = paddle.floor(abs_ipt)\n    out = paddle.multiply(sign_ipt, abs_ipt)\n    return out\n\n\ndef fmod(a, b):\n    return a - trunc_div(a, b) * b\n\n\ndef _rgb2hsv(img: Tensor) -> Tensor:\n    r, g, b = img.unbind(axis=-3)\n\n    # Implementation is based on https://github.com/python-pillow/Pillow/blob/4174d4267616897df3746d315d5a2d0f82c656ee/\n    # src/libImaging/Convert.c#L330\n    maxc = paddle.max(img, axis=-3)\n    minc = paddle.min(img, axis=-3)\n\n    # The algorithm erases S and H channel where `maxc = minc`. 
This avoids NaN\n    # from happening in the results, because\n    #   + S channel has division by `maxc`, which is zero only if `maxc = minc`\n    #   + H channel has division by `(maxc - minc)`.\n    #\n    # Instead of overwriting NaN afterwards, we just prevent it from occurring so\n    # we don't need to deal with it in case we save the NaN in a buffer in\n    # backprop, if it is ever supported, but it doesn't hurt to do so.\n    eqc = maxc == minc\n\n    cr = maxc - minc\n    # Since `eqc => cr = 0`, replacing the denominator with 1 when `eqc` holds is fine.\n    ones = paddle.ones_like(maxc)\n    s = cr / paddle.where(eqc, ones, maxc)\n    # Note that `eqc => maxc = minc = r = g = b`. So the following calculation\n    # of `h` would reduce to `bc - gc + 2 + rc - bc + 4 + rc - bc = 6` so it\n    # would not matter what values `rc`, `gc`, and `bc` have here, and thus\n    # replacing the denominator with 1 when `eqc` holds is fine.\n    cr_divisor = paddle.where(eqc, ones, cr)\n    rc = (maxc - r) / cr_divisor\n    gc = (maxc - g) / cr_divisor\n    bc = (maxc - b) / cr_divisor\n\n    hr = (maxc == r).cast('float32') * (bc - gc)\n    hg = ((maxc == g) & (maxc != r)).cast('float32') * (2.0 + rc - bc)\n    hb = ((maxc != g) & (maxc != r)).cast('float32') * (4.0 + gc - rc)\n    h = hr + hg + hb\n    h = fmod((h / 6.0 + 1.0), paddle.to_tensor(1.0))\n    return paddle.stack((h, s, maxc), axis=-3)\n\n\ndef _hsv2rgb(img: Tensor) -> Tensor:\n    h, s, v = img.unbind(axis=-3)\n    i = paddle.floor(h * 6.0)\n    f = (h * 6.0) - i\n    i = i.cast(dtype='int32')\n\n    p = paddle.clip((v * (1.0 - s)), 0.0, 1.0)\n    q = paddle.clip((v * (1.0 - s * f)), 0.0, 1.0)\n    t = paddle.clip((v * (1.0 - s * (1.0 - f))), 0.0, 1.0)\n    i = i % 6\n\n    mask = i.unsqueeze(axis=-3) == paddle.arange(6).reshape([-1, 1, 1])\n\n    a1 = paddle.stack((v, q, p, p, t, v), axis=-3)\n    a2 = paddle.stack((t, v, v, q, p, p), axis=-3)\n    a3 = paddle.stack((p, p, t, v, v, q), axis=-3)\n    a4 = paddle.stack((a1, 
a2, a3), axis=-4)\n\n    return paddle.einsum(\"...ijk, ...xijk -> ...xjk\", mask.cast(dtype=img.dtype), a4)\n\n\ndef adjust_brightness(img: Tensor, brightness_factor: float) -> Tensor:\n    if brightness_factor < 0:\n        raise ValueError(f\"brightness_factor ({brightness_factor}) is not non-negative.\")\n\n    return _blend(img, paddle.zeros_like(img), brightness_factor)\n\n\ndef adjust_contrast(img: Tensor, contrast_factor: float) -> Tensor:\n    if contrast_factor < 0:\n        raise ValueError(f\"contrast_factor ({contrast_factor}) is not non-negative.\")\n\n    c = img.shape[1]\n\n    if c == 3:\n        output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n        mean = paddle.mean(output, axis=(-3, -2, -1), keepdim=True)\n\n    else:\n        mean = paddle.mean(img, axis=(-3, -2, -1), keepdim=True)\n\n    return _blend(img, mean, contrast_factor)\n\n\ndef adjust_hue(img: Tensor, hue_factor: float) -> Tensor:\n    if not (-0.5 <= hue_factor <= 0.5):\n        raise ValueError(f\"hue_factor ({hue_factor}) is not in [-0.5, 0.5].\")\n\n    img = _rgb2hsv(img)\n    h, s, v = img.unbind(axis=-3)\n    h = fmod(h + hue_factor, paddle.to_tensor(1.0))\n    img = paddle.stack((h, s, v), axis=-3)\n    img_hue_adj = _hsv2rgb(img)\n    return img_hue_adj\n\n\ndef adjust_saturation(img: Tensor, saturation_factor: float) -> Tensor:\n    if saturation_factor < 0:\n        raise ValueError(f\"saturation_factor ({saturation_factor}) is not non-negative.\")\n\n    output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n\n    return _blend(img, output, saturation_factor)\n\n\nclass ColorJitter(nn.Layer):\n\n    def __init__(self, brightness=0, contrast=0, saturation=0, hue=0):\n        super(ColorJitter, self).__init__()\n        self.brightness = self._check_input(brightness, \"brightness\")\n        self.contrast = self._check_input(contrast, \"contrast\")\n        self.saturation = 
self._check_input(saturation, \"saturation\")\n        self.hue = self._check_input(hue, \"hue\", center=0, bound=(-0.5, 0.5), clip_first_on_zero=False)\n\n    def _check_input(self, value, name, center=1, bound=(0, float(\"inf\")), clip_first_on_zero=True):\n        if isinstance(value, numbers.Number):\n            if value < 0:\n                raise ValueError(f\"If {name} is a single number, it must be non-negative.\")\n            value = [center - float(value), center + float(value)]\n            if clip_first_on_zero:\n                value[0] = max(value[0], 0.0)\n        elif isinstance(value, (tuple, list)) and len(value) == 2:\n            if not bound[0] <= value[0] <= value[1] <= bound[1]:\n                raise ValueError(f\"{name} values should be between {bound}\")\n        else:\n            raise TypeError(f\"{name} should be a single number or a list/tuple with length 2.\")\n\n        # if value is 0 or (1., 1.) for brightness/contrast/saturation\n        # or (0., 0.) for hue, do nothing\n        if value[0] == value[1] == center:\n            value = None\n        return value\n\n    @staticmethod\n    def get_params(\n        brightness: Optional[List[float]],\n        contrast: Optional[List[float]],\n        saturation: Optional[List[float]],\n        hue: Optional[List[float]],\n    ) -> Tuple[Tensor, Optional[float], Optional[float], Optional[float], Optional[float]]:\n        \"\"\"Get the parameters for the randomized transform to be applied on image.\n\n        Args:\n            brightness (tuple of float (min, max), optional): The range from which the brightness_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            contrast (tuple of float (min, max), optional): The range from which the contrast_factor is chosen\n                uniformly. 
Pass None to turn off the transformation.\n            saturation (tuple of float (min, max), optional): The range from which the saturation_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            hue (tuple of float (min, max), optional): The range from which the hue_factor is chosen uniformly.\n                Pass None to turn off the transformation.\n\n        Returns:\n            tuple: The parameters used to apply the randomized transform\n            along with their random order.\n        \"\"\"\n        fn_idx = paddle.randperm(4)\n\n        b = None if brightness is None else paddle.empty([1]).uniform_(brightness[0], brightness[1])\n        c = None if contrast is None else paddle.empty([1]).uniform_(contrast[0], contrast[1])\n        s = None if saturation is None else paddle.empty([1]).uniform_(saturation[0], saturation[1])\n        h = None if hue is None else paddle.empty([1]).uniform_(hue[0], hue[1])\n\n        return fn_idx, b, c, s, h\n\n    def forward(self, img):\n        \"\"\"\n        Args:\n            img (PIL Image or Tensor): Input image.\n\n        Returns:\n            PIL Image or Tensor: Color jittered image.\n        \"\"\"\n        fn_idx, brightness_factor, contrast_factor, saturation_factor, hue_factor = self.get_params(\n            self.brightness, self.contrast, self.saturation, self.hue)\n\n        for fn_id in fn_idx:\n            if fn_id == 0 and brightness_factor is not None:\n                img = adjust_brightness(img, brightness_factor)\n            elif fn_id == 1 and contrast_factor is not None:\n                img = adjust_contrast(img, contrast_factor)\n            elif fn_id == 2 and saturation_factor is not None:\n                img = adjust_saturation(img, saturation_factor)\n            elif fn_id == 3 and hue_factor is not None:\n                img = adjust_hue(img, hue_factor)\n\n        return img\n\n    def __repr__(self) -> str:\n        s = 
(f\"{self.__class__.__name__}(\"\n             f\"brightness={self.brightness}\"\n             f\", contrast={self.contrast}\"\n             f\", saturation={self.saturation}\"\n             f\", hue={self.hue})\")\n        return s\n\n\ndef _apply_grid_transform(img: Tensor, grid: Tensor, mode: str, fill: Optional[List[float]]) -> Tensor:\n\n    if img.shape[0] > 1:\n        # Apply same grid to a batch of images\n        grid = grid.expand([img.shape[0], grid.shape[1], grid.shape[2], grid.shape[3]])\n\n    # Append a dummy mask for customized fill colors, should be faster than grid_sample() twice\n    if fill is not None:\n        dummy = paddle.ones((img.shape[0], 1, img.shape[2], img.shape[3]), dtype=img.dtype)\n        img = paddle.concat((img, dummy), axis=1)\n\n    img = grid_sample(img, grid, mode=mode, padding_mode=\"zeros\", align_corners=False)\n\n    # Fill with required color\n    if fill is not None:\n        mask = img[:, -1:, :, :]  # N * 1 * H * W\n        img = img[:, :-1, :, :]  # N * C * H * W\n        mask = mask.expand_as(img)\n        len_fill = len(fill) if isinstance(fill, (tuple, list)) else 1\n        fill_img = paddle.to_tensor(fill, dtype=img.dtype).reshape([1, len_fill, 1, 1]).expand_as(img)\n        if mode == \"nearest\":\n            mask = mask < 0.5\n            img[mask] = fill_img[mask]\n        else:  # 'bilinear'\n            img = img * mask + (1.0 - mask) * fill_img\n    return img\n\n\ndef _gen_affine_grid(\n    theta: Tensor,\n    w: int,\n    h: int,\n    ow: int,\n    oh: int,\n) -> Tensor:\n    # https://github.com/pytorch/pytorch/blob/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/aten/src/ATen/native/\n    # AffineGridGenerator.cpp#L18\n    # Difference with AffineGridGenerator is that:\n    # 1) we normalize grid values after applying theta\n    # 2) we can normalize by other image size, such that it covers \"extend\" option like in PIL.Image.rotate\n\n    d = 0.5\n    base_grid = paddle.empty([1, oh, ow, 3], 
dtype=theta.dtype)\n    x_grid = paddle.linspace(-ow * 0.5 + d, ow * 0.5 + d - 1, num=ow)\n    base_grid[..., 0] = (x_grid)\n    y_grid = paddle.linspace(-oh * 0.5 + d, oh * 0.5 + d - 1, num=oh).unsqueeze_(-1)\n    base_grid[..., 1] = (y_grid)\n    base_grid[..., 2] = 1.0\n    rescaled_theta = theta.transpose([0, 2, 1]) / paddle.to_tensor([0.5 * w, 0.5 * h], dtype=theta.dtype)\n    output_grid = base_grid.reshape([1, oh * ow, 3]).bmm(rescaled_theta)\n    return output_grid.reshape([1, oh, ow, 2])\n\n\ndef affine_impl(img: Tensor,\n                matrix: List[float],\n                interpolation: str = \"nearest\",\n                fill: Optional[List[float]] = None) -> Tensor:\n    theta = paddle.to_tensor(matrix, dtype=img.dtype).reshape([1, 2, 3])\n    shape = img.shape\n    # grid will be generated on the same device as theta and img\n    grid = _gen_affine_grid(theta, w=shape[-1], h=shape[-2], ow=shape[-1], oh=shape[-2])\n    return _apply_grid_transform(img, grid, interpolation, fill=fill)\n\n\ndef _get_inverse_affine_matrix(center: List[float],\n                               angle: float,\n                               translate: List[float],\n                               scale: float,\n                               shear: List[float],\n                               inverted: bool = True) -> List[float]:\n    # Helper method to compute inverse matrix for affine transformation\n\n    # Pillow requires inverse affine transformation matrix:\n    # Affine matrix is : M = T * C * RotateScaleShear * C^-1\n    #\n    # where T is translation matrix: [1, 0, tx | 0, 1, ty | 0, 0, 1]\n    #       C is translation matrix to keep center: [1, 0, cx | 0, 1, cy | 0, 0, 1]\n    #       RotateScaleShear is rotation with scale and shear matrix\n    #\n    #       RotateScaleShear(a, s, (sx, sy)) =\n    #       = R(a) * S(s) * SHy(sy) * SHx(sx)\n    #       = [ s*cos(a - sy)/cos(sy), s*(-cos(a - sy)*tan(sx)/cos(sy) - sin(a)), 0 ]\n    #         [ s*sin(a + sy)/cos(sy), 
s*(-sin(a - sy)*tan(sx)/cos(sy) + cos(a)), 0 ]\n    #         [ 0                    , 0                                      , 1 ]\n    # where R is a rotation matrix, S is a scaling matrix, and SHx and SHy are the shears:\n    # SHx(s) = [1, -tan(s)] and SHy(s) = [1      , 0]\n    #          [0, 1      ]              [-tan(s), 1]\n    #\n    # Thus, the inverse is M^-1 = C * RotateScaleShear^-1 * C^-1 * T^-1\n\n    rot = math.radians(angle)\n    sx = math.radians(shear[0])\n    sy = math.radians(shear[1])\n\n    cx, cy = center\n    tx, ty = translate\n\n    # RSS without scaling\n    a = math.cos(rot - sy) / math.cos(sy)\n    b = -math.cos(rot - sy) * math.tan(sx) / math.cos(sy) - math.sin(rot)\n    c = math.sin(rot - sy) / math.cos(sy)\n    d = -math.sin(rot - sy) * math.tan(sx) / math.cos(sy) + math.cos(rot)\n\n    if inverted:\n        # Inverted rotation matrix with scale and shear\n        # det([[a, b], [c, d]]) == 1, since det(rotation) = 1 and det(shear) = 1\n        matrix = [d, -b, 0.0, -c, a, 0.0]\n        matrix = [x / scale for x in matrix]\n        # Apply inverse of translation and of center translation: RSS^-1 * C^-1 * T^-1\n        matrix[2] += matrix[0] * (-cx - tx) + matrix[1] * (-cy - ty)\n        matrix[5] += matrix[3] * (-cx - tx) + matrix[4] * (-cy - ty)\n        # Apply center translation: C * RSS^-1 * C^-1 * T^-1\n        matrix[2] += cx\n        matrix[5] += cy\n    else:\n        matrix = [a, b, 0.0, c, d, 0.0]\n        matrix = [x * scale for x in matrix]\n        # Apply inverse of center translation: RSS * C^-1\n        matrix[2] += matrix[0] * (-cx) + matrix[1] * (-cy)\n        matrix[5] += matrix[3] * (-cx) + matrix[4] * (-cy)\n        # Apply translation and center : T * C * RSS * C^-1\n        matrix[2] += cx + tx\n        matrix[5] += cy + ty\n\n    return matrix\n\n\ndef affine(\n    img: Tensor,\n    angle: float,\n    translate: List[int],\n    scale: float,\n    shear: List[float],\n    interpolation: InterpolationMode = 
InterpolationMode.NEAREST,\n    fill: Optional[List[float]] = None,\n    resample: Optional[int] = None,\n    fillcolor: Optional[List[float]] = None,\n    center: Optional[List[int]] = None,\n) -> Tensor:\n    \"\"\"Apply affine transformation on the image keeping image center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... means an arbitrary number of leading dimensions.\n\n    Args:\n        img (PIL Image or Tensor): image to transform.\n        angle (number): rotation angle in degrees between -180 and 180, clockwise direction.\n        translate (sequence of integers): horizontal and vertical translations (post-rotation translation)\n        scale (float): overall scale\n        shear (float or sequence): shear angle value in degrees between -180 to 180, clockwise direction.\n            If a sequence is specified, the first value corresponds to a shear parallel to the x axis, while\n            the second value corresponds to a shear parallel to the y axis.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. Please use InterpolationMode enum.\n        fill (sequence or number, optional): Pixel fill value for the area outside the transformed\n            image. If given a number, the value is used for all bands respectively.\n\n            .. note::\n                In torchscript mode single int/float value is not supported, please use a sequence\n                of length 1: ``[value, ]``.\n        fillcolor (sequence or number, optional):\n            .. 
warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation. Origin is the upper left corner.\n            Default is the center of the image.\n\n    Returns:\n        PIL Image or Tensor: Transformed image.\n    \"\"\"\n\n    # Backward compatibility with integer value\n    if isinstance(interpolation, int):\n        warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                      \"Please use InterpolationMode enum.\")\n        interpolation = _interpolation_modes_from_int(interpolation)\n\n    if fillcolor is not None:\n        warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                      \"Please use 'fill' instead.\")\n        fill = fillcolor\n\n    if not isinstance(angle, (int, float)):\n        raise TypeError(\"Argument angle should be int or float\")\n\n    if not isinstance(translate, (list, tuple)):\n        raise TypeError(\"Argument translate should be a sequence\")\n\n    if len(translate) != 2:\n        raise ValueError(\"Argument translate should be a sequence of length 2\")\n\n    if scale <= 0.0:\n        raise ValueError(\"Argument scale should be positive\")\n\n    if not isinstance(shear, (numbers.Number, (list, tuple))):\n        raise TypeError(\"Shear should be either a single value or a sequence of two values\")\n\n    if not isinstance(interpolation, InterpolationMode):\n        raise TypeError(\"Argument interpolation should be a InterpolationMode\")\n\n    if isinstance(angle, int):\n        angle = float(angle)\n\n    if isinstance(translate, tuple):\n        translate = list(translate)\n\n    if isinstance(shear, numbers.Number):\n        shear = [shear, 0.0]\n\n    if isinstance(shear, tuple):\n        shear = list(shear)\n\n    if len(shear) == 1:\n        shear = [shear[0], shear[0]]\n\n    if len(shear) != 2:\n        raise ValueError(f\"Shear should be a sequence containing two values. 
Got {shear}\")\n\n    if center is not None and not isinstance(center, (list, tuple)):\n        raise TypeError(\"Argument center should be a sequence\")\n    center_f = [0.0, 0.0]\n    if center is not None:\n        # Image is expected to have [..., H, W] shape, so take the spatial dims from the end.\n        height, width = img.shape[-2], img.shape[-1]\n        # Center values should be in pixel coordinates but translated such that (0, 0) corresponds to image center.\n        center_f = [1.0 * (c - s * 0.5) for c, s in zip(center, [width, height])]\n\n    translate_f = [1.0 * t for t in translate]\n    matrix = _get_inverse_affine_matrix(center_f, angle, translate_f, scale, shear)\n    return affine_impl(img, matrix=matrix, interpolation=interpolation.value, fill=fill)\n\n\ndef _interpolation_modes_from_int(i: int) -> InterpolationMode:\n    inverse_modes_mapping = {\n        0: InterpolationMode.NEAREST,\n        2: InterpolationMode.BILINEAR,\n        3: InterpolationMode.BICUBIC,\n        4: InterpolationMode.BOX,\n        5: InterpolationMode.HAMMING,\n        1: InterpolationMode.LANCZOS,\n    }\n    return inverse_modes_mapping[i]\n\n\ndef _check_sequence_input(x, name, req_sizes):\n    msg = req_sizes[0] if len(req_sizes) < 2 else \" or \".join([str(s) for s in req_sizes])\n    if not isinstance(x, Sequence):\n        raise TypeError(f\"{name} should be a sequence of length {msg}.\")\n    if len(x) not in req_sizes:\n        raise ValueError(f\"{name} should be a sequence of length {msg}.\")\n\n\ndef _setup_angle(x, name, req_sizes=(2, )):\n    if isinstance(x, numbers.Number):\n        if x < 0:\n            raise ValueError(f\"If {name} is a single number, it must be non-negative.\")\n        x = [-x, x]\n    else:\n        _check_sequence_input(x, name, req_sizes)\n\n    return [float(d) for d in x]\n\n\nclass RandomAffine(nn.Layer):\n    \"\"\"Random affine transformation of the image keeping center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... 
means an arbitrary number of leading dimensions.\n\n    Args:\n        degrees (sequence or number): Range of degrees to select from.\n            If degrees is a number instead of sequence like (min, max), the range of degrees\n            will be (-degrees, +degrees). Set to 0 to deactivate rotations.\n        translate (tuple, optional): tuple of maximum absolute fraction for horizontal\n            and vertical translations. For example translate=(a, b), then horizontal shift\n            is randomly sampled in the range -img_width * a < dx < img_width * a and vertical shift is\n            randomly sampled in the range -img_height * b < dy < img_height * b. Will not translate by default.\n        scale (tuple, optional): scaling factor interval, e.g (a, b), then scale is\n            randomly sampled from the range a <= scale <= b. Will keep original scale by default.\n        shear (sequence or number, optional): Range of degrees to select from.\n            If shear is a number, a shear parallel to the x axis in the range (-shear, +shear)\n            will be applied. Else if shear is a sequence of 2 values a shear parallel to the x axis in the\n            range (shear[0], shear[1]) will be applied. Else if shear is a sequence of 4 values,\n            a x-axis shear in (shear[0], shear[1]) and y-axis shear in (shear[2], shear[3]) will be applied.\n            Will not apply shear by default.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. 
Please use InterpolationMode enum.\n        fill (sequence or number): Pixel fill value for the area outside the transformed\n            image. Default is ``0``. If given a number, the value is used for all bands respectively.\n        fillcolor (sequence or number, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation, (x, y). Origin is the upper left corner.\n            Default is the center of the image.\n\n    .. _filters: https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters\n\n    \"\"\"\n\n    def __init__(\n        self,\n        degrees,\n        translate=None,\n        scale=None,\n        shear=None,\n        interpolation=InterpolationMode.NEAREST,\n        fill=0,\n        fillcolor=None,\n        resample=None,\n        center=None,\n    ):\n        super(RandomAffine, self).__init__()\n        if resample is not None:\n            warnings.warn(\"The parameter 'resample' is deprecated since 0.12 and will be removed in 0.14. \"\n                          \"Please use 'interpolation' instead.\")\n            interpolation = _interpolation_modes_from_int(resample)\n\n        # Backward compatibility with integer value\n        if isinstance(interpolation, int):\n            warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                          \"Please use InterpolationMode enum.\")\n            interpolation = _interpolation_modes_from_int(interpolation)\n\n        if fillcolor is not None:\n            warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                          \"Please use 'fill' instead.\")\n            fill = fillcolor\n\n        self.degrees = _setup_angle(degrees, name=\"degrees\", req_sizes=(2, ))\n\n        if translate is not None:\n            _check_sequence_input(translate, \"translate\", req_sizes=(2, ))\n            for t in translate:\n                if not (0.0 <= t <= 1.0):\n                    raise ValueError(\"translation values should be between 0 and 1\")\n        self.translate = translate\n\n        if scale is not None:\n            _check_sequence_input(scale, \"scale\", req_sizes=(2, ))\n            for s in scale:\n                if s <= 0:\n                    raise ValueError(\"scale values should be positive\")\n        self.scale = scale\n\n        if shear is not None:\n            self.shear = _setup_angle(shear, name=\"shear\", req_sizes=(2, 4))\n        else:\n            self.shear = shear\n\n        self.resample = self.interpolation = interpolation\n\n        if fill is None:\n            fill = 0\n        elif not isinstance(fill, (Sequence, numbers.Number)):\n            raise TypeError(\"Fill should be either a sequence or a number.\")\n\n        self.fillcolor = self.fill = fill\n\n        if center is not None:\n            _check_sequence_input(center, \"center\", req_sizes=(2, ))\n\n        self.center = center\n\n    @staticmethod\n    def get_params(\n        degrees: List[float],\n        translate: Optional[List[float]],\n        scale_ranges: Optional[List[float]],\n        shears: Optional[List[float]],\n        img_size: List[int],\n    ) -> Tuple[float, Tuple[int, int], float, Tuple[float, float]]:\n        \"\"\"Get parameters for affine transformation\n\n        Returns:\n            params to be passed to the affine transformation\n        \"\"\"\n        angle = float(paddle.empty([1]).uniform_(float(degrees[0]), float(degrees[1])))\n        if translate is not None:\n            max_dx = float(translate[0] * img_size[0])\n            
max_dy = float(translate[1] * img_size[1])\n            tx = int(float(paddle.empty([1]).uniform_(-max_dx, max_dx)))\n            ty = int(float(paddle.empty([1]).uniform_(-max_dy, max_dy)))\n            translations = (tx, ty)\n        else:\n            translations = (0, 0)\n\n        if scale_ranges is not None:\n            scale = float(paddle.empty([1]).uniform_(scale_ranges[0], scale_ranges[1]))\n        else:\n            scale = 1.0\n\n        shear_x = shear_y = 0.0\n        if shears is not None:\n            shear_x = float(paddle.empty([1]).uniform_(shears[0], shears[1]))\n            if len(shears) == 4:\n                shear_y = float(paddle.empty([1]).uniform_(shears[2], shears[3]))\n\n        shear = (shear_x, shear_y)\n\n        return angle, translations, scale, shear\n\n    def forward(self, img):\n        fill = self.fill\n        channels, height, width = img.shape[1], img.shape[2], img.shape[3]\n        if isinstance(fill, (int, float)):\n            fill = [float(fill)] * channels\n        else:\n            fill = [float(f) for f in fill]\n\n        img_size = [width, height]  # flip for keeping BC on get_params call\n\n        ret = self.get_params(self.degrees, self.translate, self.scale, self.shear, img_size)\n\n        return affine(img, *ret, interpolation=self.interpolation, fill=fill, center=self.center)\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(degrees={self.degrees}\"\n        s += f\", translate={self.translate}\" if self.translate is not None else \"\"\n        s += f\", scale={self.scale}\" if self.scale is not None else \"\"\n        s += f\", shear={self.shear}\" if self.shear is not None else \"\"\n        s += f\", interpolation={self.interpolation.value}\" if self.interpolation != InterpolationMode.NEAREST else \"\"\n        s += f\", fill={self.fill}\" if self.fill != 0 else \"\"\n        s += f\", center={self.center}\" if self.center is not None else \"\"\n        s += \")\"\n\n        
return s\n"
  },
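The `get_params` static method in the transforms file above samples one value per degree of freedom (rotation angle, pixel translation, scale factor, shear) from the user-supplied ranges. A minimal, framework-free sketch of the same sampling logic, using the stdlib `random` module in place of `paddle.empty([1]).uniform_()` (the function name here is illustrative, not part of the source):

```python
import random


def sample_affine_params(degrees, translate, scale_ranges, shears, img_size):
    """Mirror of RandomAffine.get_params using the stdlib RNG.
    img_size is (width, height), matching the flipped order in forward()."""
    angle = random.uniform(degrees[0], degrees[1])

    if translate is not None:
        # translate gives the max shift as a fraction of image size per axis
        max_dx = translate[0] * img_size[0]
        max_dy = translate[1] * img_size[1]
        translations = (int(random.uniform(-max_dx, max_dx)),
                        int(random.uniform(-max_dy, max_dy)))
    else:
        translations = (0, 0)

    scale = random.uniform(*scale_ranges) if scale_ranges is not None else 1.0

    shear_x = shear_y = 0.0
    if shears is not None:
        shear_x = random.uniform(shears[0], shears[1])
        if len(shears) == 4:  # a 4-tuple also bounds the y-axis shear
            shear_y = random.uniform(shears[2], shears[3])

    return angle, translations, scale, (shear_x, shear_y)
```

With `degrees=[-10, 10]`, `translate=[0.1, 0.1]` and a 200x100 image, the sampled translation is at most ±20 px horizontally and ±10 px vertically, exactly as in the paddle version.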
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/model/unet.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/unet.py\n'''\nimport math\nfrom abc import abstractmethod\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .nn import avg_pool_nd\nfrom .nn import checkpoint\nfrom .nn import conv_nd\nfrom .nn import linear\nfrom .nn import normalization\nfrom .nn import SiLU\nfrom .nn import timestep_embedding\nfrom .nn import zero_module\n\n\nclass AttentionPool2d(nn.Layer):\n    \"\"\"\n    Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py\n    \"\"\"\n\n    def __init__(\n        self,\n        spacial_dim: int,\n        embed_dim: int,\n        num_heads_channels: int,\n        output_dim: int = None,\n    ):\n        super().__init__()\n        # self.positional_embedding = nn.Parameter(\n        #     th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5\n        # )\n        positional_embedding = self.create_parameter(\n            shape=[embed_dim, spacial_dim**2 + 1],\n            default_initializer=nn.initializer.Assign(\n                paddle.randn([embed_dim, spacial_dim**2 + 1]) / embed_dim**0.5))\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)\n        self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)\n        self.num_heads = embed_dim // num_heads_channels\n        self.attention = QKVAttention(self.num_heads)\n\n    def forward(self, x):\n        b, c, *_spatial = x.shape\n        # x = x.reshape(b, c, -1)  # NC(HW)\n        x = paddle.reshape(x, [b, c, -1])\n        x = paddle.concat([x.mean(axis=-1, keepdim=True), x], axis=-1)  # NC(HW+1)\n        x = x + paddle.cast(self.positional_embedding[None, :, :], x.dtype)  # NC(HW+1)\n        x = self.qkv_proj(x)\n        x = self.attention(x)\n        x = self.c_proj(x)\n        return x[:, :, 0]\n\n\nclass TimestepBlock(nn.Layer):\n    \"\"\"\n    Any module where forward() takes timestep embeddings as a second argument.\n 
   \"\"\"\n\n    @abstractmethod\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the module to `x` given `emb` timestep embeddings.\n        \"\"\"\n\n\nclass TimestepEmbedSequential(nn.Sequential, TimestepBlock):\n    \"\"\"\n    A sequential module that passes timestep embeddings to the children that\n    support it as an extra input.\n    \"\"\"\n\n    def forward(self, x, emb):\n        for layer in self:\n            if isinstance(layer, TimestepBlock):\n                x = layer(x, emb)\n            else:\n                x = layer(x)\n        return x\n\n\nclass Upsample(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        if use_conv:\n            self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=1)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.dims == 3:\n            x = F.interpolate(x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode=\"nearest\")\n        else:\n            x = F.interpolate(x, scale_factor=2, mode=\"nearest\")\n        if self.use_conv:\n            x = self.conv(x)\n        return x\n\n\nclass Downsample(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        stride = 2 if dims != 3 else (1, 2, 2)\n        if use_conv:\n            self.op = conv_nd(dims, self.channels, self.out_channels, 3, stride=stride, padding=1)\n        else:\n            assert self.channels == self.out_channels\n            self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        return self.op(x)\n\n\nclass ResBlock(TimestepBlock):\n    \"\"\"\n    A residual block that can optionally change the number of channels.\n\n    :param channels: the number of input channels.\n    :param emb_channels: the number of timestep embedding channels.\n    :param dropout: the rate of dropout.\n    :param out_channels: if specified, the number of out channels.\n    :param use_conv: if True and out_channels is specified, use a spatial\n        convolution instead of a smaller 1x1 convolution to change the\n        channels in the skip connection.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param use_checkpoint: if True, use gradient checkpointing on this module.\n    :param up: if True, use this block for upsampling.\n    :param down: if True, use this block for downsampling.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        emb_channels,\n        dropout,\n        out_channels=None,\n        use_conv=False,\n        use_scale_shift_norm=False,\n        dims=2,\n        use_checkpoint=False,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        self.emb_channels = emb_channels\n        self.dropout = dropout\n        
self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_checkpoint = use_checkpoint\n        self.use_scale_shift_norm = use_scale_shift_norm\n\n        self.in_layers = nn.Sequential(\n            normalization(channels),\n            SiLU(),\n            conv_nd(dims, channels, self.out_channels, 3, padding=1),\n        )\n\n        self.updown = up or down\n\n        if up:\n            self.h_upd = Upsample(channels, False, dims)\n            self.x_upd = Upsample(channels, False, dims)\n        elif down:\n            self.h_upd = Downsample(channels, False, dims)\n            self.x_upd = Downsample(channels, False, dims)\n        else:\n            self.h_upd = self.x_upd = nn.Identity()\n\n        self.emb_layers = nn.Sequential(\n            SiLU(),\n            linear(\n                emb_channels,\n                2 * self.out_channels if use_scale_shift_norm else self.out_channels,\n            ),\n        )\n        self.out_layers = nn.Sequential(\n            normalization(self.out_channels),\n            SiLU(),\n            nn.Dropout(p=dropout),\n            zero_module(conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)),\n        )\n\n        if self.out_channels == channels:\n            self.skip_connection = nn.Identity()\n        elif use_conv:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 3, padding=1)\n        else:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)\n\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the block to a Tensor, conditioned on a timestep embedding.\n\n        :param x: an [N x C x ...] Tensor of features.\n        :param emb: an [N x emb_channels] Tensor of timestep embeddings.\n        :return: an [N x C x ...] 
Tensor of outputs.\n        \"\"\"\n        return checkpoint(self._forward, (x, emb), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x, emb):\n        if self.updown:\n            in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]\n            h = in_rest(x)\n            h = self.h_upd(h)\n            x = self.x_upd(x)\n            h = in_conv(h)\n        else:\n            h = self.in_layers(x)\n        emb_out = self.emb_layers(emb)\n        emb_out = paddle.cast(emb_out, h.dtype)\n        while len(emb_out.shape) < len(h.shape):\n            emb_out = emb_out[..., None]\n        if self.use_scale_shift_norm:\n            out_norm, out_rest = self.out_layers[0], self.out_layers[1:]\n            scale, shift = paddle.chunk(emb_out, 2, axis=1)\n            h = out_norm(h) * (1 + scale) + shift\n            h = out_rest(h)\n        else:\n            h = h + emb_out\n            h = self.out_layers(h)\n        return self.skip_connection(x) + h\n\n\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=-1,\n        use_checkpoint=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels == -1:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n        self.use_checkpoint = use_checkpoint\n        self.norm = normalization(channels)\n        self.qkv = 
conv_nd(1, channels, channels * 3, 1)\n        if use_new_attention_order:\n            # split qkv before split heads\n            self.attention = QKVAttention(self.num_heads)\n        else:\n            # split heads before split qkv\n            self.attention = QKVAttentionLegacy(self.num_heads)\n\n        self.proj_out = zero_module(conv_nd(1, channels, channels, 1))\n\n    def forward(self, x):\n        return checkpoint(self._forward, (x, ), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x):\n        b, c, *spatial = x.shape\n        # x = x.reshape(b, c, -1)\n        x = paddle.reshape(x, [b, c, -1])\n        qkv = self.qkv(self.norm(x))\n        h = self.attention(qkv)\n        h = self.proj_out(h)\n        # return (x + h).reshape(b, c, *spatial)\n        return paddle.reshape(x + h, [b, c, *spatial])\n\n\ndef count_flops_attn(model, _x, y):\n    \"\"\"\n    A counter for the `thop` package to count the operations in an\n    attention operation.\n    Meant to be used like:\n        macs, params = thop.profile(\n            model,\n            inputs=(inputs, timestamps),\n            custom_ops={QKVAttention: QKVAttention.count_flops},\n        )\n    \"\"\"\n    b, c, *spatial = y[0].shape\n    num_spatial = int(np.prod(spatial))\n    # We perform two matmuls with the same number of ops.\n    # The first computes the weight matrix, the second computes\n    # the combination of the value vectors.\n    matmul_ops = 2 * b * (num_spatial**2) * c\n    model.total_ops += paddle.to_tensor([matmul_ops], dtype='float64')\n\n\nclass QKVAttentionLegacy(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention. 
Matches legacy QKVAttention + input/output heads shaping\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)\n        q, k, v = paddle.reshape(qkv, [bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass QKVAttention(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention and splits in a different order.\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.chunk(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        # `.view` is a torch-ism; paddle tensors use paddle.reshape with a shape list\n        weight = paddle.einsum(\n            \"bct,bcs->bts\",\n            paddle.reshape(q * scale, [bs * self.n_heads, ch, length]),\n            paddle.reshape(k * scale, [bs * self.n_heads, ch, length]),\n        )  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, paddle.reshape(v, [bs * self.n_heads, ch, length]))\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass UNetModel(nn.Layer):\n    \"\"\"\n    The full UNet model with attention and timestep embedding.\n\n    :param in_channels: channels in the input Tensor.\n    :param model_channels: base channel count for the model.\n    :param out_channels: channels in the output Tensor.\n    :param num_res_blocks: number of residual blocks per downsample.\n    :param attention_resolutions: a collection of downsample rates at which\n        attention will take place. 
May be a set, list, or tuple.\n        For example, if this contains 4, then at 4x downsampling, attention\n        will be used.\n    :param dropout: the dropout probability.\n    :param channel_mult: channel multiplier for each level of the UNet.\n    :param conv_resample: if True, use learned convolutions for upsampling and\n        downsampling.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param num_classes: if specified (as an int), then this model will be\n        class-conditional with `num_classes` classes.\n    :param use_checkpoint: use gradient checkpointing to reduce memory usage.\n    :param num_heads: the number of attention heads in each attention layer.\n    :param num_heads_channels: if specified, ignore num_heads and instead use\n                               a fixed channel width per attention head.\n    :param num_heads_upsample: works with num_heads to set a different number\n                               of heads for upsampling. Deprecated.\n    :param use_scale_shift_norm: use a FiLM-like conditioning mechanism.\n    :param resblock_updown: use residual blocks for up/downsampling.\n    :param use_new_attention_order: use a different attention pattern for potentially\n                                    increased efficiency.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        num_classes=None,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        
self.image_size = image_size\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.num_classes = num_classes\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        if self.num_classes is not None:\n            self.label_emb = nn.Embedding(num_classes, time_embed_dim)\n\n        ch = input_ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n           
                 ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n                self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n          
      dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n\n        self.output_blocks = nn.LayerList([])\n        for level, mult in list(enumerate(channel_mult))[::-1]:\n            for i in range(num_res_blocks + 1):\n                ich = input_block_chans.pop()\n                layers = [\n                    ResBlock(\n                        ch + ich,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(model_channels * mult),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(model_channels * mult)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads_upsample,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                if level and i == num_res_blocks:\n                    out_ch = ch\n                    layers.append(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            up=True,\n                        ) if resblock_updown else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch))\n                    ds 
//= 2\n                self.output_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n\n        self.out = nn.Sequential(\n            normalization(ch),\n            SiLU(),\n            zero_module(conv_nd(dims, input_ch, out_channels, 3, padding=1)),\n        )\n\n    def forward(self, x, timesteps, y=None):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :param y: an [N] Tensor of labels, if class-conditional.\n        :return: an [N x C x ...] Tensor of outputs.\n        \"\"\"\n        assert (y is not None) == (self.num_classes\n                                   is not None), \"must specify y if and only if the model is class-conditional\"\n\n        hs = []\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n        if self.num_classes is not None:\n            # paddle Tensor.shape is a list, so compare against a list, not a tuple\n            assert list(y.shape) == [x.shape[0]]\n            emb = emb + self.label_emb(y)\n\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            hs.append(h)\n        h = self.middle_block(h, emb)\n        for module in self.output_blocks:\n            h = paddle.concat([h, hs.pop()], axis=1)\n            h = module(h, emb)\n        # h = paddle.cast(h, x.dtype)\n        return self.out(h)\n\n\nclass SuperResModel(UNetModel):\n    \"\"\"\n    A UNetModel that performs super-resolution.\n\n    Expects an extra kwarg `low_res` to condition on a low-resolution image.\n    \"\"\"\n\n    def __init__(self, image_size, in_channels, *args, **kwargs):\n        super().__init__(image_size, in_channels * 2, *args, **kwargs)\n\n    def forward(self, x, timesteps, low_res=None, **kwargs):\n        _, _, new_height, new_width = x.shape\n        upsampled = F.interpolate(low_res, (new_height, new_width), mode=\"bilinear\")\n        x = paddle.concat([x, 
upsampled], axis=1)\n        return super().forward(x, timesteps, **kwargs)\n\n\nclass EncoderUNetModel(nn.Layer):\n    \"\"\"\n    The half UNet model with attention and timestep embedding.\n\n    For usage, see UNet.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n        pool=\"adaptive\",\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        
ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n               
 self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n        self.pool = pool\n        if pool == \"adaptive\":\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                nn.AdaptiveAvgPool2D((1, 1)),\n                zero_module(conv_nd(dims, ch, out_channels, 1)),\n                nn.Flatten(),\n            )\n        elif pool == \"attention\":\n            assert num_head_channels != -1\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                AttentionPool2d((image_size // ds), ch, num_head_channels, out_channels),\n            )\n        elif pool == \"spatial\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                nn.ReLU(),\n                nn.Linear(2048, self.out_channels),\n            )\n        elif pool == \"spatial_v2\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                normalization(2048),\n                SiLU(),\n                nn.Linear(2048, self.out_channels),\n            
)\n        else:\n            raise NotImplementedError(f\"Unexpected {pool} pooling\")\n\n    def forward(self, x, timesteps):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :return: an [N x K] Tensor of outputs.\n        \"\"\"\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n\n        results = []\n        # h = x.type(self.dtype)\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            if self.pool.startswith(\"spatial\"):\n                # results.append(h.type(x.dtype).mean(axis=(2, 3)))\n                results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n        h = self.middle_block(h, emb)\n        if self.pool.startswith(\"spatial\"):\n            results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n            h = paddle.concat(results, axis=-1)\n            return self.out(h)\n        else:\n            # h = h.type(x.dtype)\n            h = paddle.cast(h, x.dtype)\n            return self.out(h)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/resources/default.yml",
    "content": "text_prompts:\n  - A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\n\ninit_image:\n\nwidth_height: [ 1280, 768]\n\nskip_steps: 10\nsteps: 250\n\ncut_ic_pow: 1\ninit_scale: 1000\nclip_guidance_scale: 5000\n\ntv_scale: 0\nrange_scale: 150\nsat_scale: 0\ncutn_batches: 4\n\ndiffusion_model: 512x512_diffusion_uncond_finetune_008100\nuse_secondary_model: True\ndiffusion_sampling_mode: ddim\n\nperlin_init: False\nperlin_mode: mixed\nseed: 445467575\neta: 0.8\nclamp_grad: True\nclamp_max: 0.05\n\nrandomize_class: True\nclip_denoised: False\nfuzzy_prompt: False\nrand_mag: 0.05\n\ncut_overview: \"[12]*400+[4]*600\"\ncut_innercut: \"[4]*400+[12]*600\"\ncut_icgray_p: \"[0.2]*400+[0]*600\"\n\ndisplay_rate: 10\nn_batches: 1\nbatch_size: 1\nbatch_name: ''\nclip_models:\n  - VIT\n  - RN50\n  - RN101\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/resources/docstrings.yml",
    "content": "text_prompts: |\n  Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n  Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments.\n  Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\ninit_image: |\n  Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here.\n  If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\nwidth_height: |\n  Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n\nskip_steps: |\n  Consider the chart shown here.  
Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.\n  As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.\n  The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.\n  If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.\n  Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.\n  Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image.\n  However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n\nsteps: |\n  When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.\n  Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.\n  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n\ncut_ic_pow: |\n  This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ninit_scale: |\n  This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\nclip_guidance_scale: |\n  CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS.\n  Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500.\n  Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\ntv_scale: |\n  Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\nrange_scale: |\n  Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n\nsat_scale: |\n  Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\ncutn_batches: |\n  Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.\n  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage.\n  At the default settings, DD is scheduled to do 16 cuts per timestep.  
If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep.\n  However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.\n  So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n\ndiffusion_model: Diffusion_model of choice.\n\nuse_secondary_model: |\n  Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  I suggest you experiment with this.\n\ndiffusion_sampling_mode: |\n  Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n\nperlin_init: |\n  Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps).\n  Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n\nperlin_mode: |\n  Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\nseed: |\n  Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar.\n  After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\neta: |\n  eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. 
As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results.\n  The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\nclamp_grad: |\n  As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\nclamp_max: |\n  Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n\nrandomize_class:\nclip_denoised: False\nfuzzy_prompt: |\n  Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\nrand_mag: |\n  Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n\ncut_overview: The schedule of overview cuts\ncut_innercut: The schedule of inner cuts\ncut_icgray_p: The schedule for the portion of inner cuts that are converted to grayscale before being evaluated by CLIP. Grayscale cuts guide structure and contrast rather than color, which can help overall composition.\n\ndisplay_rate: |\n  During a diffusion run, you can monitor the progress of each image being created with this variable.  
If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\nn_batches: |\n  This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\nbatch_name: |\n  The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\nclip_models: |\n  CLIP Model selectors. ViT-B/32, ViT-B/16, ViT-L/14, RN101, RN50, RN50x4, RN50x16, RN50x64.\n  These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.\n  You can mix in multiple models as well for different results.  However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.\n  The rough order of speed/mem usage is (smallest/fastest to largest/slowest):\n  ViT-B/32\n  RN50\n  RN101\n  ViT-B/16\n  RN50x4\n  RN50x16\n  RN50x64\n  ViT-L/14\n  For RN50x64 & ViT-L/14 you may need to use fewer cuts, depending on your VRAM.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn101/reverse_diffusion/runner.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/runner.py\n'''\nimport gc\nimport os\nimport random\nfrom threading import Thread\n\nimport disco_diffusion_clip_rn101.clip.clip as clip\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport paddle_lpips as lpips\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom ipywidgets import Output\nfrom PIL import Image\n\nfrom .helper import logger\nfrom .helper import parse_prompt\nfrom .model.losses import range_loss\nfrom .model.losses import spherical_dist_loss\nfrom .model.losses import tv_loss\nfrom .model.make_cutouts import MakeCutoutsDango\nfrom .model.sec_diff import alpha_sigma_to_t\nfrom .model.sec_diff import SecondaryDiffusionImageNet2\nfrom .model.transforms import Normalize\n\n\ndef do_run(args, models) -> 'DocumentArray':\n    logger.info('preparing models...')\n    model, diffusion, clip_models, secondary_model = models\n    normalize = Normalize(\n        mean=[0.48145466, 0.4578275, 0.40821073],\n        std=[0.26862954, 0.26130258, 0.27577711],\n    )\n    lpips_model = lpips.LPIPS(net='vgg')\n    for parameter in lpips_model.parameters():\n        parameter.stop_gradient = True\n    side_x = (args.width_height[0] // 64) * 64\n    side_y = (args.width_height[1] // 64) * 64\n    cut_overview = eval(args.cut_overview)\n    cut_innercut = eval(args.cut_innercut)\n    cut_icgray_p = eval(args.cut_icgray_p)\n\n    from .model.perlin_noises import create_perlin_noise, regen_perlin\n\n    seed = args.seed\n\n    skip_steps = args.skip_steps\n\n    loss_values = []\n\n    if seed is not None:\n        np.random.seed(seed)\n        random.seed(seed)\n        paddle.seed(seed)\n\n    model_stats = []\n    for clip_model in clip_models:\n        model_stat = {\n            'clip_model': None,\n            'target_embeds': [],\n            
'make_cutouts': None,\n            'weights': [],\n        }\n        model_stat['clip_model'] = clip_model\n\n        if isinstance(args.text_prompts, str):\n            args.text_prompts = [args.text_prompts]\n\n        for prompt in args.text_prompts:\n            txt, weight = parse_prompt(prompt)\n            txt = clip_model.encode_text(clip.tokenize(prompt))\n            if args.fuzzy_prompt:\n                for i in range(25):\n                    model_stat['target_embeds'].append((txt + paddle.randn(txt.shape) * args.rand_mag).clip(0, 1))\n                    model_stat['weights'].append(weight)\n            else:\n                model_stat['target_embeds'].append(txt)\n                model_stat['weights'].append(weight)\n\n        model_stat['target_embeds'] = paddle.concat(model_stat['target_embeds'])\n        model_stat['weights'] = paddle.to_tensor(model_stat['weights'])\n        if model_stat['weights'].sum().abs() < 1e-3:\n            raise RuntimeError('The weights must not sum to 0.')\n        model_stat['weights'] /= model_stat['weights'].sum().abs()\n        model_stats.append(model_stat)\n\n    init = None\n    if args.init_image:\n        d = Document(uri=args.init_image).load_uri_to_image_tensor(side_x, side_y)\n        init = T.to_tensor(d.tensor).unsqueeze(0) * 2 - 1\n\n    if args.perlin_init:\n        if args.perlin_mode == 'color':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n        elif args.perlin_mode == 'gray':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        else:\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = 
create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        init = (T.to_tensor(init).add(T.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n        del init2\n\n    cur_t = None\n\n    def cond_fn(x, t, y=None):\n        x_is_NaN = False\n        n = x.shape[0]\n        if secondary_model:\n            alpha = paddle.to_tensor(diffusion.sqrt_alphas_cumprod[cur_t], dtype='float32')\n            sigma = paddle.to_tensor(diffusion.sqrt_one_minus_alphas_cumprod[cur_t], dtype='float32')\n            cosine_t = alpha_sigma_to_t(alpha, sigma)\n            x = paddle.to_tensor(x.detach(), dtype='float32')\n            x.stop_gradient = False\n            cosine_t = paddle.tile(paddle.to_tensor(cosine_t.detach().cpu().numpy()), [n])\n            cosine_t.stop_gradient = False\n            out = secondary_model(x, cosine_t).pred\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        else:\n            t = paddle.ones([n], dtype='int64') * cur_t\n            out = diffusion.p_mean_variance(model, x, t, clip_denoised=False, model_kwargs={'y': y})\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out['pred_xstart'] * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        for model_stat in model_stats:\n            for i in range(args.cutn_batches):\n                t_int = (int(t.item()) + 1)  # errors on last step without +1, need to find source\n                # when using SLIP Base model the dimensions need to be hard coded to avoid AttributeError: 'VisionTransformer' object has no attribute 'input_resolution'\n                try:\n                    
input_resolution = model_stat['clip_model'].visual.input_resolution\n                except AttributeError:\n                    input_resolution = 224\n\n                cuts = MakeCutoutsDango(\n                    input_resolution,\n                    Overview=cut_overview[1000 - t_int],\n                    InnerCrop=cut_innercut[1000 - t_int],\n                    IC_Size_Pow=args.cut_ic_pow,\n                    IC_Grey_P=cut_icgray_p[1000 - t_int],\n                )\n                clip_in = normalize(cuts(x_in.add(paddle.to_tensor(1.0)).divide(paddle.to_tensor(2.0))))\n                image_embeds = (model_stat['clip_model'].encode_image(clip_in))\n\n                dists = spherical_dist_loss(\n                    image_embeds.unsqueeze(1),\n                    model_stat['target_embeds'].unsqueeze(0),\n                )\n\n                dists = dists.reshape([\n                    cut_overview[1000 - t_int] + cut_innercut[1000 - t_int],\n                    n,\n                    -1,\n                ])\n                losses = dists.multiply(model_stat['weights']).sum(2).mean(0)\n                loss_values.append(losses.sum().item())  # log loss, probably shouldn't do per cutn_batch\n\n                x_in_grad += (paddle.grad(losses.sum() * args.clip_guidance_scale, x_in)[0] / args.cutn_batches)\n        tv_losses = tv_loss(x_in)\n        range_losses = range_loss(x_in)\n        sat_losses = paddle.abs(x_in - x_in.clip(min=-1, max=1)).mean()\n        loss = (tv_losses.sum() * args.tv_scale + range_losses.sum() * args.range_scale +\n                sat_losses.sum() * args.sat_scale)\n        if init is not None and args.init_scale:\n            init_losses = lpips_model(x_in, init)\n            loss = loss + init_losses.sum() * args.init_scale\n        x_in_grad += paddle.grad(loss, x_in)[0]\n        if not paddle.isnan(x_in_grad).any():\n            grad = -paddle.grad(x_in_d, x, x_in_grad)[0]\n        else:\n            x_is_NaN = True\n            grad = 
paddle.zeros_like(x)\n        if args.clamp_grad and not x_is_NaN:\n            magnitude = grad.square().mean().sqrt()\n            return (grad * magnitude.clip(max=args.clamp_max) / magnitude)\n        return grad\n\n    if args.diffusion_sampling_mode == 'ddim':\n        sample_fn = diffusion.ddim_sample_loop_progressive\n    else:\n        sample_fn = diffusion.plms_sample_loop_progressive\n\n    logger.info('creating artwork...')\n\n    image_display = Output()\n    da_batches = DocumentArray()\n\n    for _nb in range(args.n_batches):\n        display.clear_output(wait=True)\n        display.display(args.name_docarray, image_display)\n        gc.collect()\n        paddle.device.cuda.empty_cache()\n\n        d = Document(tags=vars(args))\n        da_batches.append(d)\n\n        cur_t = diffusion.num_timesteps - skip_steps - 1\n\n        if args.perlin_init:\n            init = regen_perlin(args.perlin_mode, side_y, side_x, args.batch_size)\n\n        if args.diffusion_sampling_mode == 'ddim':\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                eta=args.eta,\n            )\n        else:\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                order=2,\n            )\n\n        threads = []\n        for j, sample in enumerate(samples):\n            cur_t 
-= 1\n            with image_display:\n                if j % args.display_rate == 0 or cur_t == -1:\n                    for _, image in enumerate(sample['pred_xstart']):\n                        image = (image + 1) / 2\n                        image = image.clip(0, 1).squeeze().transpose([1, 2, 0]).numpy() * 255\n                        image = np.uint8(image)\n                        image = Image.fromarray(image)\n\n                        image.save(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb)))\n                        c = Document(tags={'cur_t': cur_t})\n                        c.load_pil_image_to_datauri(image)\n                        d.chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(display.Image(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb))))\n                        d.chunks.plot_image_sprites(os.path.join(args.output_dir,\n                                                                 f'{args.name_docarray}-progress-{_nb}.png'),\n                                                    show_index=True)\n                        t = Thread(\n                            target=_silent_push,\n                            args=(\n                                da_batches,\n                                args.name_docarray,\n                            ),\n                        )\n                        threads.append(t)\n                        t.start()\n\n                    if cur_t == -1:\n                        d.load_pil_image_to_datauri(image)\n\n        for t in threads:\n            t.join()\n    display.clear_output(wait=True)\n    logger.info(f'done! 
{args.name_docarray}')\n    da_batches.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    return da_batches\n\n\ndef _silent_push(da_batches: DocumentArray, name: str) -> None:\n    try:\n        da_batches.push(name)\n    except Exception as ex:\n        logger.debug(f'push failed: {ex}')\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/README.md",
    "content": "# disco_diffusion_clip_rn50\n\n|模型名称|disco_diffusion_clip_rn50|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|dd+clip ResNet50|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|2.8GB|\n|最新更新日期|2022-08-02|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184826628-7a716163-3439-489b-b5f5-0104b6a107de.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184826692-7959337f-8144-46d5-affb-362ad023420c.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\ndisco_diffusion_clip_rn50 是一个文图生成模型，可以通过输入一段文字来生成符合该句子语义的图像。该模型由两部分组成，一部分是扩散模型，是一种生成模型，可以从噪声输入中重建出原始图像。另一部分是多模态预训练模型（CLIP）, 可以将文本和图像表示在同一个特征空间，相近语义的文本和图像在该特征空间里距离会更相近。在该文图生成模型中，扩散模型负责从初始噪声或者指定初始图像中来生成目标图像，CLIP负责引导生成图像的语义和输入的文本的语义尽可能接近，随着扩散模型在CLIP的引导下不断的迭代生成新图像，最终能够生成文本所描述内容的图像。该模块中使用的CLIP模型结构为ResNet50。\n\n更多详情请参考论文：[Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) 以及 [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install disco_diffusion_clip_rn50\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run disco_diffusion_clip_rn50 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a 
tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_rn50_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_rn50\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # 生成图像, 默认会在disco_diffusion_clip_rn50_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_rn50_out/')  \n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_rn50_out-result.png')\n    # 展示所有的中间结果\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn50_out-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_rn50_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\"。prompt的构造可以参考[网站](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#)。\n      - style(Optional[str]): 指定绘画的风格，如'watercolor','Chinese painting'等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如Greg 
Rutkowski、krenz，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"disco_diffusion_clip_rn50_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象，包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_rn50\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果。返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_rn50\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_rn50_out-result.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn50_out-result.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install disco_diffusion_clip_rn50==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/README_en.md",
    "content": "# disco_diffusion_clip_rn50\n\n|Module Name|disco_diffusion_clip_rn50|\n| :--- | :---: |\n|Category|text to image|\n|Network|dd+clip ResNet50|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|2.8GB|\n|Latest update date|2022-08-02|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n  - Prompt \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - Output image\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184826628-7a716163-3439-489b-b5f5-0104b6a107de.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - Generating process\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/184826692-7959337f-8144-46d5-affb-362ad023420c.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### Module Introduction\n\ndisco_diffusion_clip_rn50 is a text-to-image generation model that can generate images that match the semantics of the sentence you prompt. The model consists of two parts, one is the diffusion model, which is a generative model that reconstructs the original image from the noisy input. The other part is the multimodal pre-training model (CLIP), which can represent text and images in the same feature space, and text and images with similar semantics will be closer in this feature space. In the text image generation model, the diffusion model is responsible for generating the target image from the initial noise or the specified initial image, and CLIP is responsible for guiding the generated image to be as close as possible to the semantics of the input text. Diffusion model under the guidance of CLIP iteratively generates new images, eventually generating images of what the text describes. 
The CLIP model used in this module is ResNet50.\n\nFor more details, please refer to [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) and [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install disco_diffusion_clip_rn50\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction  \n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run disco_diffusion_clip_rn50 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_rn50_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_rn50\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # Output images will be saved in the disco_diffusion_clip_rn50_out directory.\n    # The returned da is a DocumentArray object, which contains all intermediate and final results.\n    # You can manipulate the DocumentArray object to do post-processing and save images.\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_rn50_out/')\n    # Save final result image to a file\n    
da[0].save_uri_to_file('disco_diffusion_clip_rn50_out-result.png')\n    # Show all intermediate results\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn50_out-result.gif')\n    ```\n\n- ### 3.API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_rn50_out'):\n    ```\n\n    - Image generation API, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt that conforms to the format \"content\" + \"artist/style\", such as \"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, the style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as Greg Rutkowski or krenz; the output image will be rendered in the specified artist's style. If not provided, the style is totally up to your prompt. For a survey of artist styles, see [this page](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/).\n      - width_height(Optional[List[int]]): The width and height of output images; both should be multiples of 64. 
The larger the image, the longer the computation time.\n      - seed(Optional[int]): Random seed; different seeds result in different output images.\n      - output_dir(Optional[str]): Output directory, default is \"disco_diffusion_clip_rn50_out\".\n\n\n    - **Return**\n      - da(DocumentArray): DocumentArray object, including `n_batches` Documents, where each Document keeps all intermediate results during generation. Please refer to the [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online text-to-image service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_rn50\n    ```\n\n  - The text-to-image service API is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If you use GPU for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, use the following lines of code to send a prediction request and obtain the result. The deserialized response is the DocumentArray type described in the API above, and you can manipulate it exactly as with the generate_image interface.\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_rn50\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_clip_rn50_out-result.png')\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_rn50_out-result.gif')\n    ```\n\n## V.Release 
Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install disco_diffusion_clip_rn50==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). A copy is vendored here for use with guided diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    #k = k.con\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        #assert str(attn_mask.dtype) == 'VarType.FP32' and attn_mask.ndim == 3\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not a tuple here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n         
   self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n                                            output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        # print(x.shape)\n\n        x = x + self.positional_embedding\n        #print(x.shape)\n\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = 
x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = self.encode_text(text)\n\n        # normalized features\n        image_features = image_features / image_features.norm(dim=-1, keepdim=True)\n        text_features = text_features / text_features.norm(dim=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['RN50', 'RN101', 'VIT32']\n\nURL = {\n    'RN50': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN50.pdparams'),\n    'RN101': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN101.pdparams'),\n    'VIT32': os.path.join(os.path.dirname(__file__), 'pre_trained', 'ViT-B-32.pdparams')\n}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = 
paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n            raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='RN50'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'RN101': build_rn101_model, 'VIT32': build_vit_model, 'RN50': build_rn50_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    model.load_dict(sd)\n    model.eval()\n    return model\n\n\ndef build_vit_model():\n    model = CLIP(embed_dim=512,\n                 image_resolution=224,\n                 vision_layers=12,\n                 vision_width=768,\n                 vision_patch_size=32,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n\n\ndef build_rn101_model():\n    model = CLIP(\n        embed_dim=512,\n        image_resolution=224,\n        vision_layers=(3, 4, 23, 3),\n        vision_width=64,\n        vision_patch_size=0,  # Not used in ResNet\n        context_length=77,\n        vocab_size=49408,\n        transformer_width=512,\n        transformer_heads=8,\n        transformer_layers=12)\n    return model\n\n\ndef build_rn50_model():\n    model = CLIP(embed_dim=1024,\n                 image_resolution=224,\n                 vision_layers=(3, 4, 6, 3),\n                 vision_width=64,\n                 vision_patch_size=None,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\nfrom typing import Union\n\nimport disco_diffusion_clip_rn50.clip as clip\nimport disco_diffusion_clip_rn50.resize_right as resize_right\nimport paddle\nfrom disco_diffusion_clip_rn50.reverse_diffusion import create\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"disco_diffusion_clip_rn50\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass DiscoDiffusionClip:\n\n    def generate_image(self,\n                       text_prompts: Union[str, List[str]],\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       init_image: Optional[str] = None,\n                       width_height: Optional[List[int]] = [1280, 768],\n                       skip_steps: Optional[int] = 0,\n                       steps: Optional[int] = 250,\n                       cut_ic_pow: Optional[int] = 1,\n                       init_scale: Optional[int] = 1000,\n                       clip_guidance_scale: Optional[int] 
= 5000,\n                       tv_scale: Optional[int] = 0,\n                       range_scale: Optional[int] = 0,\n                       sat_scale: Optional[int] = 0,\n                       cutn_batches: Optional[int] = 4,\n                       diffusion_sampling_mode: Optional[str] = 'ddim',\n                       perlin_init: Optional[bool] = False,\n                       perlin_mode: Optional[str] = 'mixed',\n                       seed: Optional[int] = None,\n                       eta: Optional[float] = 0.8,\n                       clamp_grad: Optional[bool] = True,\n                       clamp_max: Optional[float] = 0.05,\n                       randomize_class: Optional[bool] = True,\n                       clip_denoised: Optional[bool] = False,\n                       fuzzy_prompt: Optional[bool] = False,\n                       rand_mag: Optional[float] = 0.05,\n                       cut_overview: Optional[str] = '[12]*400+[4]*600',\n                       cut_innercut: Optional[str] = '[4]*400+[12]*600',\n                       cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n                       display_rate: Optional[int] = 10,\n                       n_batches: Optional[int] = 1,\n                       batch_size: Optional[int] = 1,\n                       batch_name: Optional[str] = '',\n                       use_gpu: Optional[bool] = True,\n                       output_dir: Optional[str] = 'disco_diffusion_clip_rn50_out'):\n        \"\"\"\n        Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings; if specified, the style will be used to construct prompts.\n        :param artist: Artist style; if specified, the artist will be used to construct prompts.\n        :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps below for further discussion.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. 
The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n        :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n        :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n        :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n        :param tv_scale: Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n        :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n        :param sat_scale: Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n        :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. 
However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n        :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n        :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  
Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n        :param perlin_mode: Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n        :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n        :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\n        :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. 
If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n        :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n        :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n        :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n        :param cut_overview: The schedule of overview cuts\n        :param cut_innercut: The schedule of inner cuts\n        :param cut_icgray_p: The schedule for the proportion of inner cuts that are converted to grayscale before being scored by CLIP.  Desaturating a share of the cuts early in the run encourages the model to focus on structure and composition rather than color.\n        :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n        :param n_batches: This variable sets the number of still images you want DD to create.  
If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n        :param batch_name: The name of the batch; the batch id will be named \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\n        :param use_gpu: Whether to use GPU or not.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if use_gpu:\n            _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n            if _places:\n                paddle.device.set_device(\"gpu:{}\".format(0))\n            else:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n        elif isinstance(text_prompts, list):\n            text_prompts[0] = text_prompts[0].rstrip(',.，。')\n            if style is not None:\n                text_prompts[0] += \",{}\".format(style)\n            if artist is not None:\n                text_prompts[0] += \",{},trending on artstation\".format(artist)\n\n        return create(text_prompts=text_prompts,\n                      init_image=init_image,\n                      width_height=width_height,\n                      skip_steps=skip_steps,\n                      steps=steps,\n            
          cut_ic_pow=cut_ic_pow,\n                      init_scale=init_scale,\n                      clip_guidance_scale=clip_guidance_scale,\n                      tv_scale=tv_scale,\n                      range_scale=range_scale,\n                      sat_scale=sat_scale,\n                      cutn_batches=cutn_batches,\n                      diffusion_sampling_mode=diffusion_sampling_mode,\n                      perlin_init=perlin_init,\n                      perlin_mode=perlin_mode,\n                      seed=seed,\n                      eta=eta,\n                      clamp_grad=clamp_grad,\n                      clamp_max=clamp_max,\n                      randomize_class=randomize_class,\n                      clip_denoised=clip_denoised,\n                      fuzzy_prompt=fuzzy_prompt,\n                      rand_mag=rand_mag,\n                      cut_overview=cut_overview,\n                      cut_innercut=cut_innercut,\n                      cut_icgray_p=cut_icgray_p,\n                      display_rate=display_rate,\n                      n_batches=n_batches,\n                      batch_size=batch_size,\n                      batch_name=batch_name,\n                      clip_models=['RN50'],\n                      output_dir=output_dir)\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = 
self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      init_image=args.init_image,\n                                      width_height=args.width_height,\n                                      skip_steps=args.skip_steps,\n                                      steps=args.steps,\n                                      cut_ic_pow=args.cut_ic_pow,\n                                      init_scale=args.init_scale,\n                                      clip_guidance_scale=args.clip_guidance_scale,\n                                      tv_scale=args.tv_scale,\n                                      range_scale=args.range_scale,\n                                      sat_scale=args.sat_scale,\n                                      cutn_batches=args.cutn_batches,\n                                      diffusion_sampling_mode=args.diffusion_sampling_mode,\n                                      perlin_init=args.perlin_init,\n                                      perlin_mode=args.perlin_mode,\n                                      seed=args.seed,\n                                      eta=args.eta,\n                                      clamp_grad=args.clamp_grad,\n                                      clamp_max=args.clamp_max,\n                                      randomize_class=args.randomize_class,\n                                      clip_denoised=args.clip_denoised,\n                                      
fuzzy_prompt=args.fuzzy_prompt,\n                                      rand_mag=args.rand_mag,\n                                      cut_overview=args.cut_overview,\n                                      cut_innercut=args.cut_innercut,\n                                      cut_icgray_p=args.cut_icgray_p,\n                                      display_rate=args.display_rate,\n                                      n_batches=args.n_batches,\n                                      batch_size=args.batch_size,\n                                      batch_name=args.batch_name,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--skip_steps',\n            type=int,\n            default=0,\n            help=\n            'Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15%% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50%% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture'\n        )\n        self.arg_input_group.add_argument(\n            '--steps',\n            type=int,\n            default=250,\n            help=\n            \"When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  
Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cut_ic_pow',\n            type=int,\n            default=1,\n            help=\n            \"This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\"\n        )\n        self.arg_input_group.add_argument(\n            '--init_scale',\n            type=int,\n            default=1000,\n            help=\n            \"This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clip_guidance_scale',\n            type=int,\n            default=5000,\n            help=\n            \"CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50%% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. 
Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\"\n        )\n        self.arg_input_group.add_argument(\n            '--tv_scale',\n            type=int,\n            default=0,\n            help=\n            \"Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\"\n        )\n        self.arg_input_group.add_argument(\n            '--range_scale',\n            type=int,\n            default=0,\n            help=\n            \"Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\"\n        )\n        self.arg_input_group.add_argument(\n            '--sat_scale',\n            type=int,\n            default=0,\n            help=\n            \"Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cutn_batches',\n            type=int,\n            default=4,\n            help=\n            \"Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  
Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\"\n        )\n        self.arg_input_group.add_argument(\n            '--diffusion_sampling_mode',\n            type=str,\n            default='ddim',\n            help=\n            \"Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_init',\n            type=bool,\n            default=False,\n            help=\n            \"Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_mode',\n            type=str,\n            default='mixed',\n            help=\n            \"sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  
If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\"\n        )\n        self.arg_input_group.add_argument(\n            '--eta',\n            type=float,\n            default=0.8,\n            help=\n            \"eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_grad',\n            type=bool,\n            default=True,\n            help=\n            \"As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_max',\n            type=float,\n            default=0.05,\n            help=\n            \"Sets the value of the clamp_grad limitation. 
Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\"\n        )\n        self.arg_input_group.add_argument('--randomize_class', type=bool, default=True, help=\"Random class.\")\n        self.arg_input_group.add_argument('--clip_denoised', type=bool, default=False, help=\"Clip denoised.\")\n        self.arg_input_group.add_argument(\n            '--fuzzy_prompt',\n            type=bool,\n            default=False,\n            help=\n            \"Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\"\n        )\n        self.arg_input_group.add_argument(\n            '--rand_mag',\n            type=float,\n            default=0.5,\n            help=\"Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\")\n        self.arg_input_group.add_argument('--cut_overview',\n                                          type=str,\n                                          default='[12]*400+[4]*600',\n                                          help=\"The schedule of overview cuts\")\n        self.arg_input_group.add_argument('--cut_innercut',\n                                          type=str,\n                                          default='[4]*400+[12]*600',\n                                          help=\"The schedule of inner cuts\")\n        self.arg_input_group.add_argument(\n            '--cut_icgray_p',\n            type=str,\n            default='[0.2]*400+[0]*600',\n            help=\n            \"The schedule of the portion of inner cuts that are converted to grayscale.\"\n        )\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\n            \"During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\"\n        )\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='disco_diffusion_clip_rn50_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.'\n        )\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument(\n            '--init_image',\n            type=str,\n            default=None,\n            help=\n            \"Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. 
See skip_steps above for further discussion.\"\n        )\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[1280, 768],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--n_batches',\n            type=int,\n            default=1,\n            help=\n            \"This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\"\n        )\n        self.arg_input_group.add_argument('--batch_size', type=int, default=1, help=\"Batch size.\")\n        self.arg_input_group.add_argument(\n            '--batch_name',\n            type=str,\n            default='',\n            help=\n            'The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.'\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/requirements.txt",
    "content": "numpy\npaddle_lpips==0.1.2\nftfy\ndocarray>=0.13.29\npyyaml\nregex\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/resize_right/README.md",
    "content": "# ResizeRight (Paddle)\nFully differentiable resize function implemented by Paddle.\nThis module is based on [assafshocher/ResizeRight](https://github.com/assafshocher/ResizeRight).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/resize_right/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/resize_right/interp_methods.py",
    "content": "from math import pi\n\ntry:\n    import paddle\nexcept ImportError:\n    paddle = None\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or PyTorch but both not found\")\n\n\ndef set_framework_dependencies(x):\n    if type(x) is numpy.ndarray:\n        to_dtype = lambda a: a\n        fw = numpy\n    else:\n        to_dtype = lambda a: paddle.cast(a, x.dtype)\n        fw = paddle\n    # eps = fw.finfo(fw.float32).eps\n    eps = paddle.to_tensor(np.finfo(np.float32).eps)\n    return fw, to_dtype, eps\n\n\ndef support_sz(sz):\n\n    def wrapper(f):\n        f.support_sz = sz\n        return f\n\n    return wrapper\n\n\n@support_sz(4)\ndef cubic(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    absx = fw.abs(x)\n    absx2 = absx**2\n    absx3 = absx**3\n    return ((1.5 * absx3 - 2.5 * absx2 + 1.) * to_dtype(absx <= 1.) +\n            (-0.5 * absx3 + 2.5 * absx2 - 4. * absx + 2.) * to_dtype((1. < absx) & (absx <= 2.)))\n\n\n@support_sz(4)\ndef lanczos2(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 2) + eps) / ((pi**2 * x**2 / 2) + eps)) * to_dtype(abs(x) < 2))\n\n\n@support_sz(6)\ndef lanczos3(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 3) + eps) / ((pi**2 * x**2 / 3) + eps)) * to_dtype(abs(x) < 3))\n\n\n@support_sz(2)\ndef linear(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return ((x + 1) * to_dtype((-1 <= x) & (x < 0)) + (1 - x) * to_dtype((0 <= x) & (x <= 1)))\n\n\n@support_sz(1)\ndef box(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return to_dtype((-1 <= x) & (x < 0)) + to_dtype((0 <= x) & (x <= 1))\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/resize_right/resize_right.py",
    "content": "import warnings\nfrom fractions import Fraction\nfrom math import ceil\nfrom typing import Tuple\n\nimport disco_diffusion_clip_rn50.resize_right.interp_methods as interp_methods\n\n\nclass NoneClass:\n    pass\n\n\ntry:\n    import paddle\n    from paddle import nn\n    nnModuleWrapped = nn.Layer\nexcept ImportError:\n    warnings.warn('No PyTorch found, will work only with Numpy')\n    paddle = None\n    nnModuleWrapped = NoneClass\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    warnings.warn('No Numpy found, will work only with PyTorch')\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or PyTorch but both not found\")\n\n\ndef resize(input,\n           scale_factors=None,\n           out_shape=None,\n           interp_method=interp_methods.cubic,\n           support_sz=None,\n           antialiasing=True,\n           by_convs=False,\n           scale_tolerance=None,\n           max_numerator=10,\n           pad_mode='constant'):\n    # get properties of the input tensor\n    in_shape, n_dims = input.shape, input.ndim\n\n    # fw stands for framework that can be either numpy or paddle,\n    # determined by the input type\n    fw = numpy if type(input) is numpy.ndarray else paddle\n    eps = np.finfo(np.float32).eps if fw == numpy else paddle.to_tensor(np.finfo(np.float32).eps)\n    device = input.place if fw is paddle else None\n\n    # set missing scale factors or output shapem one according to another,\n    # scream if both missing. this is also where all the defults policies\n    # take place. 
also handling the by_convs attribute carefully.\n    scale_factors, out_shape, by_convs = set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs,\n                                                              scale_tolerance, max_numerator, eps, fw)\n\n    # sort indices of dimensions according to scale of each dimension.\n    # since we are going dim by dim this is efficient\n    sorted_filtered_dims_and_scales = [(dim, scale_factors[dim], by_convs[dim], in_shape[dim], out_shape[dim])\n                                       for dim in sorted(range(n_dims), key=lambda ind: scale_factors[ind])\n                                       if scale_factors[dim] != 1.]\n    # unless support size is specified by the user, it is an attribute\n    # of the interpolation method\n    if support_sz is None:\n        support_sz = interp_method.support_sz\n\n    # output begins identical to input and changes with each iteration\n    output = input\n\n    # iterate over dims\n    for (dim, scale_factor, dim_by_convs, in_sz, out_sz) in sorted_filtered_dims_and_scales:\n        # STEP 1- PROJECTED GRID: The non-integer locations of the projection\n        # of output pixel locations to the input tensor\n        projected_grid = get_projected_grid(in_sz, out_sz, scale_factor, fw, dim_by_convs, device)\n\n        # STEP 1.5: ANTIALIASING- If antialiasing is taking place, we modify\n        # the window size and the interpolation method (see inside function)\n        cur_interp_method, cur_support_sz = apply_antialiasing_if_needed(interp_method, support_sz, scale_factor,\n                                                                         antialiasing)\n\n        # STEP 2- FIELDS OF VIEW: for each output pixels, map the input pixels\n        # that influence it. 
Also calculate needed padding and update grid\n        # accordingly\n        field_of_view = get_field_of_view(projected_grid, cur_support_sz, fw, eps, device)\n\n        # STEP 2.5- CALCULATE PAD AND UPDATE: according to the field of view,\n        # the input should be padded to handle the boundaries, coordinates\n        # should be updated. actual padding only occurs when weights are\n        # applied (step 4). if using by_convs for this dim, then we need to\n        # calc right and left boundaries for each filter instead.\n        pad_sz, projected_grid, field_of_view = calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor,\n                                                            dim_by_convs, fw, device)\n        # STEP 3- CALCULATE WEIGHTS: Match a set of weights to the pixels in\n        # the field of view for each output pixel\n        weights = get_weights(cur_interp_method, projected_grid, field_of_view)\n\n        # STEP 4- APPLY WEIGHTS: Each output pixel is calculated by multiplying\n        # its set of weights with the pixel values in its field of view.\n        # We now multiply the fields of view with their matching weights.\n        # We do this by tensor multiplication and broadcasting.\n        # if by_convs is true for this dim, then we do this action by\n        # convolutions. this is equivalent but faster.\n        if not dim_by_convs:\n            output = apply_weights(output, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw)\n        else:\n            output = apply_convs(output, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw)\n    return output\n\n\ndef get_projected_grid(in_sz, out_sz, scale_factor, fw, by_convs, device=None):\n    # we start by having the output coordinates which are just integer locations\n    # in the special case when using by_convs, we only need two cycles of grid\n    # points. 
the first and last.\n    grid_sz = out_sz if not by_convs else scale_factor.numerator\n    out_coordinates = fw_arange(grid_sz, fw, device)\n\n    # This is projecting the output pixel locations in 1d to the input tensor,\n    # as non-integer locations.\n    # the following formula is derived in the paper\n    # \"From Discrete to Continuous Convolutions\" by Shocher et al.\n    return (out_coordinates / float(scale_factor) + (in_sz - 1) / 2 - (out_sz - 1) / (2 * float(scale_factor)))\n\n\ndef get_field_of_view(projected_grid, cur_support_sz, fw, eps, device):\n    # for each output pixel, map which input pixels influence it, in 1d.\n    # we start by calculating the leftmost neighbor, using half of the window\n    # size (eps is for when boundary is exact int)\n    left_boundaries = fw_ceil(projected_grid - cur_support_sz / 2 - eps, fw)\n\n    # then we simply take all the pixel centers in the field by counting\n    # window size pixels from the left boundary\n    ordinal_numbers = fw_arange(ceil(cur_support_sz - eps), fw, device)\n    return left_boundaries[:, None] + ordinal_numbers\n\n\ndef calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor, dim_by_convs, fw, device):\n    if not dim_by_convs:\n        # determine padding according to neighbor coords out of bound.\n        # this is a generalized notion of padding, when pad<0 it means crop\n        pad_sz = [-field_of_view[0, 0].item(), field_of_view[-1, -1].item() - in_sz + 1]\n\n        # since input image will be changed by padding, coordinates of both\n        # field_of_view and projected_grid need to be updated\n        field_of_view += pad_sz[0]\n        projected_grid += pad_sz[0]\n\n    else:\n        # only used for by_convs, to calc the boundaries of each filter the\n        # number of distinct convolutions is the numerator of the scale factor\n        num_convs, stride = scale_factor.numerator, scale_factor.denominator\n\n        # calculate left and right boundaries for each 
conv. left can also be\n        # negative, right can be bigger than in_sz. such cases imply padding if\n        # needed. however if both are in-bounds, it means we need to crop,\n        # practically apply the conv only on part of the image.\n        left_pads = -field_of_view[:, 0]\n\n        # next calc is tricky, explanation by rows:\n        # 1) counting output pixels between the first position of each filter\n        #    to the right boundary of the input\n        # 2) dividing it by number of filters to count how many 'jumps'\n        #    each filter does\n        # 3) multiplying by the stride gives us the distance over the input\n        #    coords done by all these jumps for each filter\n        # 4) to this distance we add the right boundary of the filter when\n        #    placed in its leftmost position. so now we get the right boundary\n        #    of that filter in input coord.\n        # 5) the padding size needed is obtained by subtracting the rightmost\n        #    input coordinate. if the result is positive padding is needed. if\n        #    negative then negative padding means shaving off pixel columns.\n        right_pads = (((out_sz - fw_arange(num_convs, fw, device) - 1)  # (1)\n                       // num_convs)  # (2)\n                      * stride  # (3)\n                      + field_of_view[:, -1]  # (4)\n                      - in_sz + 1)  # (5)\n\n        # in the by_convs case pad_sz is a list of left-right pairs. 
one per\n        # each filter\n\n        pad_sz = list(zip(left_pads, right_pads))\n\n    return pad_sz, projected_grid, field_of_view\n\n\ndef get_weights(interp_method, projected_grid, field_of_view):\n    # the set of weights per each output pixels is the result of the chosen\n    # interpolation method applied to the distances between projected grid\n    # locations and the pixel-centers in the field of view (distances are\n    # directed, can be positive or negative)\n    weights = interp_method(projected_grid[:, None] - field_of_view)\n\n    # we now carefully normalize the weights to sum to 1 per each output pixel\n    sum_weights = weights.sum(1, keepdim=True)\n    sum_weights[sum_weights == 0] = 1\n    return weights / sum_weights\n\n\ndef apply_weights(input, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the first one.\n    # so we transpose and will transpose back after multiplying\n    tmp_input = fw_swapaxes(input, dim, 0, fw)\n\n    # apply padding\n    tmp_input = fw_pad(tmp_input, fw, pad_sz, pad_mode)\n\n    # field_of_view is a tensor of order 2: for each output (1d location\n    # along cur dim)- a list of 1d neighbors locations.\n    # note that this whole operations is applied to each dim separately,\n    # this is why it is all in 1d.\n    # neighbors = tmp_input[field_of_view] is a tensor of order image_dims+1:\n    # for each output pixel (this time indicated in all dims), these are the\n    # values of the neighbors in the 1d field of view. note that we only\n    # consider neighbors along the current dim, but such set exists for every\n    # multi-dim location, hence the final tensor order is image_dims+1.\n    paddle.device.cuda.empty_cache()\n    neighbors = tmp_input[field_of_view]\n\n    # weights is an order 2 tensor: for each output location along 1d- a list\n    # of weights matching the field of view. 
we augment it with ones, for\n    # broadcasting, so that when it multiplies some tensor the weights affect\n    # only its first dim.\n    tmp_weights = fw.reshape(weights, (*weights.shape, *[1] * (n_dims - 1)))\n\n    # now we simply multiply the weights with the neighbors, and then sum\n    # along the field of view, to get a single value per out pixel\n    tmp_output = (neighbors * tmp_weights).sum(1)\n    # we transpose back the resized dim to its original position\n    return fw_swapaxes(tmp_output, 0, dim, fw)\n\n\ndef apply_convs(input, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the last one.\n    # so we transpose and will transpose back after multiplying\n    input = fw_swapaxes(input, dim, -1, fw)\n\n    # the stride for all convs is the denominator of the scale factor\n    stride, num_convs = scale_factor.denominator, scale_factor.numerator\n\n    # prepare an empty tensor for the output\n    tmp_out_shape = list(input.shape)\n    tmp_out_shape[-1] = out_sz\n    tmp_output = fw_empty(tuple(tmp_out_shape), fw, input.place)\n\n    # iterate over the conv operations. we have as many as the numerator\n    # of the scale-factor. for each we need boundaries and a filter.\n    for conv_ind, (pad_sz, filt) in enumerate(zip(pad_sz, weights)):\n        # apply padding (we pad last dim, padding can be negative)\n        pad_dim = input.ndim - 1\n        tmp_input = fw_pad(input, fw, pad_sz, pad_mode, dim=pad_dim)\n\n        # apply convolution over last dim. 
store in the output tensor with\n        # positional strides so that when the loop is complete conv results are\n        # interleaved\n        tmp_output[..., conv_ind::num_convs] = fw_conv(tmp_input, filt, stride)\n\n    return fw_swapaxes(tmp_output, -1, dim, fw)\n\n\ndef set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs, scale_tolerance, max_numerator, eps, fw):\n    # eventually we must have both scale-factors and out-sizes for all in/out\n    # dims. however, we support many possible partial arguments\n    if scale_factors is None and out_shape is None:\n        raise ValueError(\"either scale_factors or out_shape should be \"\n                         \"provided\")\n    if out_shape is not None:\n        # if out_shape has fewer dims than in_shape, we default to resizing the\n        # first dims for numpy and last dims for paddle\n        out_shape = (list(out_shape) +\n                     list(in_shape[len(out_shape):]) if fw is numpy else list(in_shape[:-len(out_shape)]) +\n                     list(out_shape))\n        if scale_factors is None:\n            # if no scale given, we calculate it as the out to in ratio\n            # (not recommended)\n            scale_factors = [out_sz / in_sz for out_sz, in_sz in zip(out_shape, in_shape)]\n    if scale_factors is not None:\n        # by default, if a single number is given as scale, we assume resizing\n        # two dims (most common are images with 2 spatial dims)\n        scale_factors = (scale_factors if isinstance(scale_factors, (list, tuple)) else [scale_factors, scale_factors])\n        # if fewer scale_factors than in_shape dims, we default to resizing the\n        # first dims for numpy and last dims for paddle\n        scale_factors = (list(scale_factors) + [1] * (len(in_shape) - len(scale_factors)) if fw is numpy else [1] *\n                         (len(in_shape) - len(scale_factors)) + list(scale_factors))\n        if out_shape is None:\n            # when no out_shape given, it is 
calculated by multiplying the\n            # scale by the in_shape (not recommended)\n            out_shape = [ceil(scale_factor * in_sz) for scale_factor, in_sz in zip(scale_factors, in_shape)]\n        # next part intentionally after out_shape determined for stability\n        # we fix by_convs to be a list of truth values in case it is not\n        if not isinstance(by_convs, (list, tuple)):\n            by_convs = [by_convs] * len(out_shape)\n\n        # next loop fixes the scale for each dim to be either frac or float.\n        # this is determined by by_convs and by tolerance for scale accuracy.\n        for ind, (sf, dim_by_convs) in enumerate(zip(scale_factors, by_convs)):\n            # first we convert the scale to a fraction\n            if dim_by_convs:\n                frac = Fraction(1 / sf).limit_denominator(max_numerator)\n                frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)\n\n            # if accuracy is within tolerance scale will be frac. if not, then\n            # it will be float and the by_convs attr will be set false for\n            # this dim\n            if scale_tolerance is None:\n                scale_tolerance = eps\n            if dim_by_convs and abs(frac - sf) < scale_tolerance:\n                scale_factors[ind] = frac\n            else:\n                scale_factors[ind] = float(sf)\n                by_convs[ind] = False\n\n        return scale_factors, out_shape, by_convs\n\n\ndef apply_antialiasing_if_needed(interp_method, support_sz, scale_factor, antialiasing):\n    # antialiasing is \"stretching\" the field of view according to the scale\n    # factor (only for downscaling). this is low-pass filtering. 
this\n    # requires modifying both the interpolation (stretching the 1d\n    # function and multiplying by the scale-factor) and the window size.\n    scale_factor = float(scale_factor)\n    if scale_factor >= 1.0 or not antialiasing:\n        return interp_method, support_sz\n    cur_interp_method = (lambda arg: scale_factor * interp_method(scale_factor * arg))\n    cur_support_sz = support_sz / scale_factor\n    return cur_interp_method, cur_support_sz\n\n\ndef fw_ceil(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.ceil(x))\n    else:\n        return paddle.cast(x.ceil(), dtype='int64')\n\n\ndef fw_floor(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.floor(x))\n    else:\n        return paddle.cast(x.floor(), dtype='int64')\n\n\ndef fw_cat(x, fw):\n    if fw is numpy:\n        return fw.concatenate(x)\n    else:\n        return fw.concat(x)\n\n\ndef fw_swapaxes(x, ax_1, ax_2, fw):\n    if fw is numpy:\n        return fw.swapaxes(x, ax_1, ax_2)\n    else:\n        if ax_1 == -1:\n            ax_1 = len(x.shape) - 1\n        if ax_2 == -1:\n            ax_2 = len(x.shape) - 1\n        perm0 = list(range(len(x.shape)))\n        temp = ax_1\n        perm0[temp] = ax_2\n        perm0[ax_2] = temp\n        return fw.transpose(x, perm0)\n\n\ndef fw_pad(x, fw, pad_sz, pad_mode, dim=0):\n    if pad_sz == (0, 0):\n        return x\n    if fw is numpy:\n        pad_vec = [(0, 0)] * x.ndim\n        pad_vec[dim] = pad_sz\n        return fw.pad(x, pad_width=pad_vec, mode=pad_mode)\n    else:\n        if x.ndim < 3:\n            x = x[None, None, ...]\n\n        pad_vec = [0] * ((x.ndim - 2) * 2)\n        pad_vec[0:2] = pad_sz\n        return fw_swapaxes(fw.nn.functional.pad(fw_swapaxes(x, dim, -1, fw), pad=pad_vec, mode=pad_mode), dim, -1, fw)\n\n\ndef fw_conv(input, filter, stride):\n    # we want to apply 1d conv to any nd array. the way to do it is to reshape\n    # the input to a 4D tensor. 
first two dims are singletons, 3rd dim stores\n    # all the spatial dims that we are not convolving along now. then we can\n    # apply conv2d with a 1xK filter. This convolves the same way all the other\n    # dims stored in the 3rd dim. like depthwise conv over these.\n    # TODO: numpy support\n    reshaped_input = input.reshape(1, 1, -1, input.shape[-1])\n    reshaped_output = paddle.nn.functional.conv2d(reshaped_input, filter.view(1, 1, 1, -1), stride=(1, stride))\n    return reshaped_output.reshape(*input.shape[:-1], -1)\n\n\ndef fw_arange(upper_bound, fw, device):\n    if fw is numpy:\n        return fw.arange(upper_bound)\n    else:\n        return fw.arange(upper_bound)\n\n\ndef fw_empty(shape, fw, device):\n    if fw is numpy:\n        return fw.empty(shape)\n    else:\n        return fw.empty(shape=shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/README.md",
    "content": "# Diffusion model (Paddle)\nThis module implements a diffusion model that accepts a text prompt and outputs images semantically close to the text. The code is rewritten in Paddle and mainly refers to two projects: [jina-ai/discoart](https://github.com/jina-ai/discoart) and [openai/guided-diffusion](https://github.com/openai/guided-diffusion). Thanks for their wonderful work.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/__init__.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/__init__.py\n'''\nimport os\nimport warnings\n\nos.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'\n\n__all__ = ['create']\n\nimport sys\n\n__resources_path__ = os.path.join(\n    os.path.dirname(sys.modules.get(__package__).__file__ if __package__ in sys.modules else __file__),\n    'resources',\n)\n\nimport gc\n\n# check if GPU is available\nimport paddle\n\n# download and load models, this will take some time on the first load\n\nfrom .helper import load_all_models, load_diffusion_model, load_clip_models\n\nmodel_config, secondary_model = load_all_models('512x512_diffusion_uncond_finetune_008100', use_secondary_model=True)\n\nfrom typing import TYPE_CHECKING, overload, List, Optional\n\nif TYPE_CHECKING:\n    from docarray import DocumentArray, Document\n\n_clip_models_cache = {}\n\n# begin_create_overload\n\n\n@overload\ndef create(text_prompts: Optional[List[str]] = [\n    'A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.',\n    'yellow color scheme',\n],\n           init_image: Optional[str] = None,\n           width_height: Optional[List[int]] = [1280, 768],\n           skip_steps: Optional[int] = 10,\n           steps: Optional[int] = 250,\n           cut_ic_pow: Optional[int] = 1,\n           init_scale: Optional[int] = 1000,\n           clip_guidance_scale: Optional[int] = 5000,\n           tv_scale: Optional[int] = 0,\n           range_scale: Optional[int] = 150,\n           sat_scale: Optional[int] = 0,\n           cutn_batches: Optional[int] = 4,\n           diffusion_model: Optional[str] = '512x512_diffusion_uncond_finetune_008100',\n           use_secondary_model: Optional[bool] = True,\n           diffusion_sampling_mode: Optional[str] = 'ddim',\n           perlin_init: Optional[bool] = False,\n           perlin_mode: Optional[str] = 'mixed',\n           seed: 
Optional[int] = None,\n           eta: Optional[float] = 0.8,\n           clamp_grad: Optional[bool] = True,\n           clamp_max: Optional[float] = 0.05,\n           randomize_class: Optional[bool] = True,\n           clip_denoised: Optional[bool] = False,\n           fuzzy_prompt: Optional[bool] = False,\n           rand_mag: Optional[float] = 0.05,\n           cut_overview: Optional[str] = '[12]*400+[4]*600',\n           cut_innercut: Optional[str] = '[4]*400+[12]*600',\n           cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n           display_rate: Optional[int] = 10,\n           n_batches: Optional[int] = 4,\n           batch_size: Optional[int] = 1,\n           batch_name: Optional[str] = '',\n           clip_models: Optional[list] = ['ViTB32', 'ViTB16', 'RN50'],\n           output_dir: Optional[str] = 'discoart_output') -> 'DocumentArray':\n    \"\"\"\n    Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n    :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.\n    :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n    :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n    :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n    :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n    :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   
Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n    :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n    :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n    :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n    :param sat_scale: Saturation scale. Optional, set to zero to turn off.  
If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n    :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n    :param diffusion_model: Diffusion_model of choice.\n    :param use_secondary_model: Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  
I suggest you experiment with this.\n    :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n    :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n    :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n    :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  
This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n    :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n    :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n    :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n    :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n    :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n    :param cut_overview: The schedule of overview cuts\n    :param cut_innercut: The schedule of inner cuts\n    :param cut_icgray_p: This sets the size of the border used for inner cuts.  
High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n    :param n_batches: This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n    :param batch_name: The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\n    :param clip_models: CLIP Model selectors. ViTB32, ViTB16, ViTL14, RN101, RN50, RN50x4, RN50x16, RN50x64. These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.  You can mix in multiple models as well for different results.  
However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.The rough order of speed/mem usage is (smallest/fastest to largest/slowest):VitB32RN50RN101VitB16RN50x4RN50x16RN50x64ViTL14For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\n# end_create_overload\n\n\n@overload\ndef create(init_document: 'Document') -> 'DocumentArray':\n    \"\"\"\n    Create an artwork using a DocArray ``Document`` object as initial state.\n    :param init_document: its ``.tags`` will be used as parameters, ``.uri`` (if present) will be used as init image.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\ndef create(**kwargs) -> 'DocumentArray':\n    from .config import load_config\n    from .runner import do_run\n\n    if 'init_document' in kwargs:\n        d = kwargs['init_document']\n        _kwargs = d.tags\n        if not _kwargs:\n            warnings.warn('init_document has no .tags, fallback to default config')\n        if d.uri:\n            _kwargs['init_image'] = kwargs['init_document'].uri\n        else:\n            warnings.warn('init_document has no .uri, fallback to no init image')\n        kwargs.pop('init_document')\n        if kwargs:\n            warnings.warn('init_document has .tags and .uri, but kwargs are also present, will override .tags')\n            _kwargs.update(kwargs)\n        _args = load_config(user_config=_kwargs)\n    else:\n        _args = load_config(user_config=kwargs)\n\n    model, diffusion = load_diffusion_model(model_config, _args.diffusion_model, steps=_args.steps)\n\n    clip_models = load_clip_models(enabled=_args.clip_models, clip_models=_clip_models_cache)\n\n    gc.collect()\n    paddle.device.cuda.empty_cache()\n    try:\n        return do_run(_args, (model, diffusion, clip_models, secondary_model))\n    
except KeyboardInterrupt:\n        pass\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/config.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/config.py\n'''\nimport copy\nimport random\nimport warnings\nfrom types import SimpleNamespace\nfrom typing import Dict\n\nimport yaml\nfrom yaml import Loader\n\nfrom . import __resources_path__\n\nwith open(f'{__resources_path__}/default.yml') as ymlfile:\n    default_args = yaml.load(ymlfile, Loader=Loader)\n\n\ndef load_config(user_config: Dict):\n    cfg = copy.deepcopy(default_args)\n\n    if user_config:\n        cfg.update(**user_config)\n\n    # compare against the defaults, not cfg: cfg already contains the user keys\n    # after the update above, so checking `k not in cfg` would never warn\n    for k in user_config.keys():\n        if k not in default_args:\n            warnings.warn(f'unknown argument {k}, ignored')\n\n    for k, v in cfg.items():\n        if k in ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches',\n                 'cutn_batches') and isinstance(v, float):\n            cfg[k] = int(v)\n        if k == 'width_height':\n            cfg[k] = [int(vv) for vv in v]\n\n    cfg.update(**{\n        'seed': cfg['seed'] or random.randint(0, 2**32),\n    })\n\n    if cfg['batch_name']:\n        da_name = f'{__package__}-{cfg[\"batch_name\"]}-{cfg[\"seed\"]}'\n    else:\n        da_name = f'{__package__}-{cfg[\"seed\"]}'\n        warnings.warn('you did not set `batch_name`, set it to have a unique session ID')\n\n    cfg.update(**{'name_docarray': da_name})\n\n    print_args_table(cfg)\n\n    return SimpleNamespace(**cfg)\n\n\ndef print_args_table(cfg):\n    from rich.table import Table\n    from rich import box\n    from rich.console import Console\n\n    console = Console()\n\n    param_str = Table(\n        title=cfg['name_docarray'],\n        box=box.ROUNDED,\n        highlight=True,\n        title_justify='left',\n    )\n    param_str.add_column('Argument', justify='right')\n    param_str.add_column('Value', justify='left')\n\n    for k, v in sorted(cfg.items()):\n        value = str(v)\n\n        if default_args.get(k, None) != v:\n            value = f'[b]{value}[/]'\n\n        param_str.add_row(k, 
value)\n\n    console.print(param_str)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/helper.py",
    "content": "'''\nThis code is rewritten in Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/helper.py\n'''\nimport hashlib\nimport logging\nimport os\nimport subprocess\nimport sys\nfrom os.path import expanduser\nfrom pathlib import Path\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\n\nimport paddle\n\n\ndef _get_logger():\n    logger = logging.getLogger(__package__)\n    logger.setLevel(\"INFO\")\n    ch = logging.StreamHandler()\n    ch.setLevel(\"INFO\")\n    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')\n    ch.setFormatter(formatter)\n    logger.addHandler(ch)\n    return logger\n\n\nlogger = _get_logger()\n\n\ndef load_clip_models(enabled: List[str], clip_models: Dict[str, Any] = {}):\n\n    import disco_diffusion_clip_rn50.clip.clip as clip\n    from disco_diffusion_clip_rn50.clip.clip import build_model, tokenize, transform\n\n    # load enabled models\n    for k in enabled:\n        if k not in clip_models:\n            clip_models[k] = build_model(name=k)\n            clip_models[k].eval()\n            for parameter in clip_models[k].parameters():\n                parameter.stop_gradient = True\n\n    # drop models that are not enabled to save memory; iterate over a copy of\n    # the keys, since popping while iterating the dict itself raises RuntimeError\n    for k in list(clip_models):\n        if k not in enabled:\n            clip_models.pop(k)\n\n    return list(clip_models.values())\n\n\ndef load_all_models(diffusion_model, use_secondary_model):\n    from .model.script_util import (\n        model_and_diffusion_defaults, )\n\n    model_config = model_and_diffusion_defaults()\n\n    if diffusion_model == '512x512_diffusion_uncond_finetune_008100':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it 
is taken care of later.\n            'image_size': 512,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n    elif diffusion_model == '256x256_diffusion_uncond':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 256,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n\n    secondary_model = None\n    if use_secondary_model:\n        from .model.sec_diff import SecondaryDiffusionImageNet2\n        secondary_model = SecondaryDiffusionImageNet2()\n        model_dict = paddle.load(\n            os.path.join(os.path.dirname(__file__), 'pre_trained', 'secondary_model_imagenet_2.pdparams'))\n        secondary_model.set_state_dict(model_dict)\n        secondary_model.eval()\n        for parameter in secondary_model.parameters():\n            parameter.stop_gradient = True\n\n    return model_config, secondary_model\n\n\ndef load_diffusion_model(model_config, diffusion_model, steps):\n    from .model.script_util import (\n        create_model_and_diffusion, )\n\n    timestep_respacing = f'ddim{steps}'\n    diffusion_steps = (1000 // steps) * steps if steps < 1000 else steps\n    model_config.update({\n        'timestep_respacing': timestep_respacing,\n        
'diffusion_steps': diffusion_steps,\n    })\n\n    model, diffusion = create_model_and_diffusion(**model_config)\n    model.set_state_dict(\n        paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', f'{diffusion_model}.pdparams')))\n    model.eval()\n    for name, param in model.named_parameters():\n        param.stop_gradient = True\n\n    return model, diffusion\n\n\ndef parse_prompt(prompt):\n    if prompt.startswith('http://') or prompt.startswith('https://'):\n        vals = prompt.rsplit(':', 2)\n        vals = [vals[0] + ':' + vals[1], *vals[2:]]\n    else:\n        vals = prompt.rsplit(':', 1)\n    vals = vals + ['', '1'][len(vals):]\n    return vals[0], float(vals[1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/__init__.py",
    "content": "\"\"\"\nCodebase for \"Improved Denoising Diffusion Probabilistic Models\" implemented by Paddle.\n\"\"\"\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/gaussian_diffusion.py",
    "content": "\"\"\"\nDiffusion model implemented in Paddle.\nThis code is rewritten based on the PyTorch version of Ho et al's diffusion models:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py\n\"\"\"\nimport enum\nimport math\n\nimport numpy as np\nimport paddle\n\nfrom .losses import discretized_gaussian_log_likelihood\nfrom .losses import normal_kl\nfrom .nn import mean_flat\n\n\ndef get_named_beta_schedule(schedule_name, num_diffusion_timesteps):\n    \"\"\"\n    Get a pre-defined beta schedule for the given name.\n\n    The beta schedule library consists of beta schedules which remain similar\n    in the limit of num_diffusion_timesteps.\n    Beta schedules may be added, but should not be removed or changed once\n    they are committed to maintain backwards compatibility.\n    \"\"\"\n    if schedule_name == \"linear\":\n        # Linear schedule from Ho et al, extended to work for any number of\n        # diffusion steps.\n        scale = 1000 / num_diffusion_timesteps\n        beta_start = scale * 0.0001\n        beta_end = scale * 0.02\n        return np.linspace(beta_start, beta_end, num_diffusion_timesteps, dtype=np.float64)\n    elif schedule_name == \"cosine\":\n        return betas_for_alpha_bar(\n            num_diffusion_timesteps,\n            lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2)**2,\n        )\n    else:\n        raise NotImplementedError(f\"unknown beta schedule: {schedule_name}\")\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function,\n    which defines the cumulative product of (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that\n         
             part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas)\n\n\nclass ModelMeanType(enum.Enum):\n    \"\"\"\n    Which type of output the model predicts.\n    \"\"\"\n\n    PREVIOUS_X = enum.auto()  # the model predicts x_{t-1}\n    START_X = enum.auto()  # the model predicts x_0\n    EPSILON = enum.auto()  # the model predicts epsilon\n\n\nclass ModelVarType(enum.Enum):\n    \"\"\"\n    What is used as the model's output variance.\n\n    The LEARNED_RANGE option has been added to allow the model to predict\n    values between FIXED_SMALL and FIXED_LARGE, making its job easier.\n    \"\"\"\n\n    LEARNED = enum.auto()\n    FIXED_SMALL = enum.auto()\n    FIXED_LARGE = enum.auto()\n    LEARNED_RANGE = enum.auto()\n\n\nclass LossType(enum.Enum):\n    MSE = enum.auto()  # use raw MSE loss (and KL when learning variances)\n    RESCALED_MSE = (enum.auto())  # use raw MSE loss (with RESCALED_KL when learning variances)\n    KL = enum.auto()  # use the variational lower-bound\n    RESCALED_KL = enum.auto()  # like KL, but rescale to estimate the full VLB\n\n    def is_vb(self):\n        return self == LossType.KL or self == LossType.RESCALED_KL\n\n\nclass GaussianDiffusion:\n    \"\"\"\n    Utilities for training and sampling diffusion models.\n\n    Ported directly from here, and then adapted over time to further experimentation.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42\n\n    :param betas: a 1-D numpy array of betas for each diffusion timestep,\n                  starting at T and going to 1.\n    :param model_mean_type: 
a ModelMeanType determining what the model outputs.\n    :param model_var_type: a ModelVarType determining how variance is output.\n    :param loss_type: a LossType determining the loss function to use.\n    :param rescale_timesteps: if True, pass floating point timesteps into the\n                              model so that they are always scaled like in the\n                              original paper (0 to 1000).\n    \"\"\"\n\n    def __init__(\n        self,\n        *,\n        betas,\n        model_mean_type,\n        model_var_type,\n        loss_type,\n        rescale_timesteps=False,\n    ):\n        self.model_mean_type = model_mean_type\n        self.model_var_type = model_var_type\n        self.loss_type = loss_type\n        self.rescale_timesteps = rescale_timesteps\n\n        # Use float64 for accuracy.\n        betas = np.array(betas, dtype=np.float64)\n        self.betas = betas\n        assert len(betas.shape) == 1, \"betas must be 1-D\"\n        assert (betas > 0).all() and (betas <= 1).all()\n\n        self.num_timesteps = int(betas.shape[0])\n\n        alphas = 1.0 - betas\n        self.alphas_cumprod = np.cumprod(alphas, axis=0)\n        self.alphas_cumprod_prev = np.append(1.0, self.alphas_cumprod[:-1])\n        self.alphas_cumprod_next = np.append(self.alphas_cumprod[1:], 0.0)\n        assert self.alphas_cumprod_prev.shape == (self.num_timesteps, )\n\n        # calculations for diffusion q(x_t | x_{t-1}) and others\n        self.sqrt_alphas_cumprod = np.sqrt(self.alphas_cumprod)\n        self.sqrt_one_minus_alphas_cumprod = np.sqrt(1.0 - self.alphas_cumprod)\n        self.log_one_minus_alphas_cumprod = np.log(1.0 - self.alphas_cumprod)\n        self.sqrt_recip_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod)\n        self.sqrt_recipm1_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod - 1)\n\n        # calculations for posterior q(x_{t-1} | x_t, x_0)\n        self.posterior_variance = (betas * (1.0 - self.alphas_cumprod_prev) / (1.0 - 
self.alphas_cumprod))\n        # log calculation clipped because the posterior variance is 0 at the\n        # beginning of the diffusion chain.\n        self.posterior_log_variance_clipped = np.log(np.append(self.posterior_variance[1], self.posterior_variance[1:]))\n        self.posterior_mean_coef1 = (betas * np.sqrt(self.alphas_cumprod_prev) / (1.0 - self.alphas_cumprod))\n        self.posterior_mean_coef2 = ((1.0 - self.alphas_cumprod_prev) * np.sqrt(alphas) / (1.0 - self.alphas_cumprod))\n\n    def q_mean_variance(self, x_start, t):\n        \"\"\"\n        Get the distribution q(x_t | x_0).\n\n        :param x_start: the [N x C x ...] tensor of noiseless inputs.\n        :param t: the number of diffusion steps (minus 1). Here, 0 means one step.\n        :return: A tuple (mean, variance, log_variance), all of x_start's shape.\n        \"\"\"\n        mean = (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start)\n        variance = _extract_into_tensor(1.0 - self.alphas_cumprod, t, x_start.shape)\n        log_variance = _extract_into_tensor(self.log_one_minus_alphas_cumprod, t, x_start.shape)\n        return mean, variance, log_variance\n\n    def q_sample(self, x_start, t, noise=None):\n        \"\"\"\n        Diffuse the data for a given number of diffusion steps.\n\n        In other words, sample from q(x_t | x_0).\n\n        :param x_start: the initial data batch.\n        :param t: the number of diffusion steps (minus 1). 
Here, 0 means one step.\n        :param noise: if specified, the split-out normal noise.\n        :return: A noisy version of x_start.\n        \"\"\"\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        assert noise.shape == x_start.shape\n        return (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +\n                _extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)\n\n    def q_posterior_mean_variance(self, x_start, x_t, t):\n        \"\"\"\n        Compute the mean and variance of the diffusion posterior:\n\n            q(x_{t-1} | x_t, x_0)\n\n        \"\"\"\n        assert x_start.shape == x_t.shape\n        posterior_mean = (_extract_into_tensor(self.posterior_mean_coef1, t, x_t.shape) * x_start +\n                          _extract_into_tensor(self.posterior_mean_coef2, t, x_t.shape) * x_t)\n        posterior_variance = _extract_into_tensor(self.posterior_variance, t, x_t.shape)\n        posterior_log_variance_clipped = _extract_into_tensor(self.posterior_log_variance_clipped, t, x_t.shape)\n        assert (posterior_mean.shape[0] == posterior_variance.shape[0] == posterior_log_variance_clipped.shape[0] ==\n                x_start.shape[0])\n        return posterior_mean, posterior_variance, posterior_log_variance_clipped\n\n    def p_mean_variance(self, model, x, t, clip_denoised=True, denoised_fn=None, model_kwargs=None):\n        \"\"\"\n        Apply the model to get p(x_{t-1} | x_t), as well as a prediction of\n        the initial x, x_0.\n\n        :param model: the model, which takes a signal and a batch of timesteps\n                      as input.\n        :param x: the [N x C x ...] 
tensor at time t.\n        :param t: a 1-D Tensor of timesteps.\n        :param clip_denoised: if True, clip the denoised signal into [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample. Applies before\n            clip_denoised.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :return: a dict with the following keys:\n                 - 'mean': the model mean output.\n                 - 'variance': the model variance output.\n                 - 'log_variance': the log of 'variance'.\n                 - 'pred_xstart': the prediction for x_0.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n\n        B, C = x.shape[:2]\n        assert t.shape == [B]\n        model_output = model(x, self._scale_timesteps(t), **model_kwargs)\n\n        if self.model_var_type in [ModelVarType.LEARNED, ModelVarType.LEARNED_RANGE]:\n            assert model_output.shape == [B, C * 2, *x.shape[2:]]\n            model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n            if self.model_var_type == ModelVarType.LEARNED:\n                model_log_variance = model_var_values\n                model_variance = paddle.exp(model_log_variance)\n            else:\n                min_log = _extract_into_tensor(self.posterior_log_variance_clipped, t, x.shape)\n                max_log = _extract_into_tensor(np.log(self.betas), t, x.shape)\n                # The model_var_values is [-1, 1] for [min_var, max_var].\n                frac = (model_var_values + 1) / 2\n                model_log_variance = frac * max_log + (1 - frac) * min_log\n                model_variance = paddle.exp(model_log_variance)\n        else:\n            model_variance, model_log_variance = {\n                # for fixedlarge, we set the initial (log-)variance like so\n        
        # to get a better decoder log likelihood.\n                ModelVarType.FIXED_LARGE: (\n                    np.append(self.posterior_variance[1], self.betas[1:]),\n                    np.log(np.append(self.posterior_variance[1], self.betas[1:])),\n                ),\n                ModelVarType.FIXED_SMALL: (\n                    self.posterior_variance,\n                    self.posterior_log_variance_clipped,\n                ),\n            }[self.model_var_type]\n            model_variance = _extract_into_tensor(model_variance, t, x.shape)\n            model_log_variance = _extract_into_tensor(model_log_variance, t, x.shape)\n\n        def process_xstart(x):\n            if denoised_fn is not None:\n                x = denoised_fn(x)\n            if clip_denoised:\n                # paddle.Tensor has clip(), not torch's clamp().\n                return x.clip(-1, 1)\n            return x\n\n        if self.model_mean_type == ModelMeanType.PREVIOUS_X:\n            pred_xstart = process_xstart(self._predict_xstart_from_xprev(x_t=x, t=t, xprev=model_output))\n            model_mean = model_output\n        elif self.model_mean_type in [ModelMeanType.START_X, ModelMeanType.EPSILON]:\n            if self.model_mean_type == ModelMeanType.START_X:\n                pred_xstart = process_xstart(model_output)\n            else:\n                pred_xstart = process_xstart(self._predict_xstart_from_eps(x_t=x, t=t, eps=model_output))\n            model_mean, _, _ = self.q_posterior_mean_variance(x_start=pred_xstart, x_t=x, t=t)\n        else:\n            raise NotImplementedError(self.model_mean_type)\n\n        assert (model_mean.shape == model_log_variance.shape == pred_xstart.shape == x.shape)\n        return {\n            \"mean\": model_mean,\n            \"variance\": model_variance,\n            \"log_variance\": model_log_variance,\n            \"pred_xstart\": pred_xstart,\n        }\n\n    def _predict_xstart_from_eps(self, x_t, t, eps):\n        assert x_t.shape == eps.shape\n        return 
(_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape) * eps)\n\n    def _predict_xstart_from_xprev(self, x_t, t, xprev):\n        assert x_t.shape == xprev.shape\n        return (  # (xprev - coef2*x_t) / coef1\n            _extract_into_tensor(1.0 / self.posterior_mean_coef1, t, x_t.shape) * xprev -\n            _extract_into_tensor(self.posterior_mean_coef2 / self.posterior_mean_coef1, t, x_t.shape) * x_t)\n\n    def _predict_eps_from_xstart(self, x_t, t, pred_xstart):\n        return (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                pred_xstart) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape)\n\n    def _scale_timesteps(self, t):\n        if self.rescale_timesteps:\n            return paddle.cast((t), 'float32') * (1000.0 / self.num_timesteps)\n        return t\n\n    def condition_mean(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_mean_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. 
In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, t, p_mean_var, **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_score(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def condition_score_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, t, p_mean_var, 
**model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def p_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def p_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean_with_grad(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"].detach()}\n\n    def p_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model.\n\n        :param model: the model module.\n        :param shape: the shape of the samples, (N, C, H, W).\n        :param noise: if specified, the noise from the encoder to sample.\n                      Should be of the same shape as `shape`.\n        :param clip_denoised: if True, clip x_start predictions to [-1, 1].\n        :param denoised_fn: if not None, a function which applies 
to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param device: if specified, the device to create the samples on.\n                       If not specified, use a model parameter's device.\n        :param progress: if True, show a tqdm progress bar.\n        :return: a non-differentiable batch of samples.\n        \"\"\"\n        final = None\n        for sample in self.p_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def p_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model and yield intermediate samples from\n        each timestep of diffusion.\n\n        Arguments are the same as p_sample_loop().\n        Returns a generator over dicts, where each dict is the return value of\n        p_sample().\n        \"\"\"\n        if device is None:\n     
       device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            sample_fn = self.p_sample_with_grad if cond_fn_with_grad else self.p_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def ddim_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n      
  if cond_fn is not None:\n            out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"]}\n\n    def ddim_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        if cond_fn is not None:\n            out 
= self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        out[\"pred_xstart\"] = out[\"pred_xstart\"].detach()\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"].detach()}\n\n    def ddim_reverse_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t+1} from the model using DDIM reverse ODE.\n        \"\"\"\n        assert eta == 0.0, \"Reverse ODE only for deterministic path\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n      
  eps = (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x.shape) * x -\n               out[\"pred_xstart\"]) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x.shape)\n        alpha_bar_next = _extract_into_tensor(self.alphas_cumprod_next, t, x.shape)\n\n        # Equation 12. reversed\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_next) + paddle.sqrt(1 - alpha_bar_next) * eps)\n\n        return {\"sample\": mean_pred, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def ddim_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model using DDIM.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.ddim_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                eta=eta,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def ddim_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        
skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Use DDIM to sample from the model and yield intermediate samples from\n        each timestep of DDIM.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        # if device is None:\n        #     device = next(model.parameters()).device\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0])\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(\n                    low=0,\n                    high=model.num_classes,\n                    shape=model_kwargs['y'].shape,\n                )\n            sample_fn = self.ddim_sample_with_grad if cond_fn_with_grad else self.ddim_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                eta=eta,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def plms_sample(\n        self,\n        model,\n        
x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        cond_fn_with_grad=False,\n        order=2,\n        old_out=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample().\n        \"\"\"\n        if order not in (1, 2, 3, 4):\n            raise ValueError('order is invalid (should be an int from 1 to 4).')\n\n        def get_model_output(x, t):\n            with paddle.set_grad_enabled(cond_fn_with_grad and cond_fn is not None):\n                if cond_fn_with_grad:\n                    # Paddle tensors have no requires_grad_(); detach and\n                    # clear stop_gradient to enable gradients w.r.t. x.\n                    x = x.detach()\n                    x.stop_gradient = False\n                out_orig = self.p_mean_variance(\n                    model,\n                    x,\n                    t,\n                    clip_denoised=clip_denoised,\n                    denoised_fn=denoised_fn,\n                    model_kwargs=model_kwargs,\n                )\n                if cond_fn is not None:\n                    if cond_fn_with_grad:\n                        out = self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                        x = x.detach()\n                    else:\n                        out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                else:\n                    out = out_orig\n\n            # Usually our model outputs epsilon, but we re-derive it\n            # in case we used x_start or x_prev prediction.\n            eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n            return eps, out, out_orig\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        eps, out, out_orig = get_model_output(x, t)\n\n        if order > 1 and old_out is None:\n            # Pseudo Improved Euler\n            old_eps = [eps]\n            mean_pred = 
out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps\n            eps_2, _, _ = get_model_output(mean_pred, t - 1)\n            eps_prime = (eps + eps_2) / 2\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n        else:\n            # Pseudo Linear Multistep (Adams-Bashforth)\n            old_eps = old_out[\"old_eps\"]\n            old_eps.append(eps)\n            cur_order = min(order, len(old_eps))\n            if cur_order == 1:\n                eps_prime = old_eps[-1]\n            elif cur_order == 2:\n                eps_prime = (3 * old_eps[-1] - old_eps[-2]) / 2\n            elif cur_order == 3:\n                eps_prime = (23 * old_eps[-1] - 16 * old_eps[-2] + 5 * old_eps[-3]) / 12\n            elif cur_order == 4:\n                eps_prime = (55 * old_eps[-1] - 59 * old_eps[-2] + 37 * old_eps[-3] - 9 * old_eps[-4]) / 24\n            else:\n                raise RuntimeError('cur_order is invalid.')\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n\n        if len(old_eps) >= order:\n            old_eps.pop(0)\n\n        nonzero_mask = paddle.cast((t != 0), 'float32').reshape([-1, *([1] * (len(x.shape) - 1))])\n        sample = mean_pred * nonzero_mask + out[\"pred_xstart\"] * (1 - nonzero_mask)\n\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"], \"old_eps\": old_eps}\n\n    def plms_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        
cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Generate samples from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.plms_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def plms_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Use PLMS to sample from the model and yield intermediate samples from each\n        timestep of PLMS.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], 
dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        old_out = None\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            out = self.plms_sample(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n                old_out=old_out,\n            )\n            yield out\n            old_out = out\n            img = out[\"sample\"]\n\n    def _vb_terms_bpd(self, model, x_start, x_t, t, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Get a term for the variational lower-bound.\n\n        The resulting units are bits (rather than nats, as one might expect).\n        This allows for comparison to other papers.\n\n        :return: a dict with the following keys:\n                 - 'output': a shape [N] tensor of NLLs or KLs.\n                 - 'pred_xstart': the x_0 predictions.\n        \"\"\"\n        true_mean, _, true_log_variance_clipped = self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)\n        out = self.p_mean_variance(model, x_t, t, clip_denoised=clip_denoised, model_kwargs=model_kwargs)\n        kl = normal_kl(true_mean, true_log_variance_clipped, out[\"mean\"], out[\"log_variance\"])\n        kl = mean_flat(kl) / np.log(2.0)\n\n        decoder_nll = -discretized_gaussian_log_likelihood(\n            x_start, 
means=out[\"mean\"], log_scales=0.5 * out[\"log_variance\"])\n        assert decoder_nll.shape == x_start.shape\n        decoder_nll = mean_flat(decoder_nll) / np.log(2.0)\n\n        # At the first timestep return the decoder NLL,\n        # otherwise return KL(q(x_{t-1}|x_t,x_0) || p(x_{t-1}|x_t))\n        output = paddle.where((t == 0), decoder_nll, kl)\n        return {\"output\": output, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def training_losses(self, model, x_start, t, model_kwargs=None, noise=None):\n        \"\"\"\n        Compute training losses for a single timestep.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param t: a batch of timestep indices.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param noise: if specified, the specific Gaussian noise to try to remove.\n        :return: a dict with the key \"loss\" containing a tensor of shape [N].\n                 Some mean or variance settings may also have other keys.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        x_t = self.q_sample(x_start, t, noise=noise)\n\n        terms = {}\n\n        if self.loss_type == LossType.KL or self.loss_type == LossType.RESCALED_KL:\n            terms[\"loss\"] = self._vb_terms_bpd(\n                model=model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t,\n                clip_denoised=False,\n                model_kwargs=model_kwargs,\n            )[\"output\"]\n            if self.loss_type == LossType.RESCALED_KL:\n                terms[\"loss\"] *= self.num_timesteps\n        elif self.loss_type == LossType.MSE or self.loss_type == LossType.RESCALED_MSE:\n     
       model_output = model(x_t, self._scale_timesteps(t), **model_kwargs)\n\n            if self.model_var_type in [\n                    ModelVarType.LEARNED,\n                    ModelVarType.LEARNED_RANGE,\n            ]:\n                B, C = x_t.shape[:2]\n                # paddle.Tensor.shape is a list, so compare against a list.\n                assert model_output.shape == [B, C * 2, *x_t.shape[2:]]\n                model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n                # Learn the variance using the variational bound, but don't let\n                # it affect our mean prediction.\n                frozen_out = paddle.concat([model_output.detach(), model_var_values], axis=1)\n                terms[\"vb\"] = self._vb_terms_bpd(\n                    model=lambda *args, r=frozen_out: r,\n                    x_start=x_start,\n                    x_t=x_t,\n                    t=t,\n                    clip_denoised=False,\n                )[\"output\"]\n                if self.loss_type == LossType.RESCALED_MSE:\n                    # Divide by 1000 for equivalence with initial implementation.\n                    # Without a factor of 1/1000, the VB term hurts the MSE term.\n                    terms[\"vb\"] *= self.num_timesteps / 1000.0\n\n            target = {\n                ModelMeanType.PREVIOUS_X: self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)[0],\n                ModelMeanType.START_X: x_start,\n                ModelMeanType.EPSILON: noise,\n            }[self.model_mean_type]\n            assert model_output.shape == target.shape == x_start.shape\n            terms[\"mse\"] = mean_flat((target - model_output)**2)\n            if \"vb\" in terms:\n                terms[\"loss\"] = terms[\"mse\"] + terms[\"vb\"]\n            else:\n                terms[\"loss\"] = terms[\"mse\"]\n        else:\n            raise NotImplementedError(self.loss_type)\n\n        return terms\n\n    def _prior_bpd(self, x_start):\n        \"\"\"\n        Get the prior KL term for the variational 
lower-bound, measured in\n        bits-per-dim.\n\n        This term can't be optimized, as it only depends on the encoder.\n\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :return: a batch of [N] KL values (in bits), one per batch element.\n        \"\"\"\n        batch_size = x_start.shape[0]\n        t = paddle.to_tensor([self.num_timesteps - 1] * batch_size, place=x_start.place)\n        qt_mean, _, qt_log_variance = self.q_mean_variance(x_start, t)\n        kl_prior = normal_kl(mean1=qt_mean, logvar1=qt_log_variance, mean2=0.0, logvar2=0.0)\n        return mean_flat(kl_prior) / np.log(2.0)\n\n    def calc_bpd_loop(self, model, x_start, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Compute the entire variational lower-bound, measured in bits-per-dim,\n        as well as other related quantities.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param clip_denoised: if True, clip denoised samples.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n\n        :return: a dict containing the following keys:\n                 - total_bpd: the total variational lower-bound, per batch element.\n                 - prior_bpd: the prior term in the lower-bound.\n                 - vb: an [N x T] tensor of terms in the lower-bound.\n                 - xstart_mse: an [N x T] tensor of x_0 MSEs for each timestep.\n                 - mse: an [N x T] tensor of epsilon MSEs for each timestep.\n        \"\"\"\n        device = x_start.place\n        batch_size = x_start.shape[0]\n\n        vb = []\n        xstart_mse = []\n        mse = []\n        for t in list(range(self.num_timesteps))[::-1]:\n            t_batch = paddle.to_tensor([t] * batch_size, place=device)\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n            x_t = self.q_sample(x_start=x_start, t=t_batch, noise=noise)\n            # Calculate VLB term at the current timestep\n            # with paddle.no_grad():\n            out = self._vb_terms_bpd(\n                model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t_batch,\n                clip_denoised=clip_denoised,\n                model_kwargs=model_kwargs,\n            )\n            vb.append(out[\"output\"])\n            xstart_mse.append(mean_flat((out[\"pred_xstart\"] - x_start)**2))\n            eps = self._predict_eps_from_xstart(x_t, t_batch, out[\"pred_xstart\"])\n            mse.append(mean_flat((eps - noise)**2))\n\n        vb = paddle.stack(vb, axis=1)\n        xstart_mse = paddle.stack(xstart_mse, axis=1)\n        mse = paddle.stack(mse, axis=1)\n\n        prior_bpd = self._prior_bpd(x_start)\n        total_bpd = vb.sum(axis=1) + prior_bpd\n        return {\n            \"total_bpd\": total_bpd,\n            \"prior_bpd\": prior_bpd,\n            \"vb\": vb,\n            \"xstart_mse\": xstart_mse,\n            \"mse\": mse,\n        }\n\n\ndef 
_extract_into_tensor(arr, timesteps, broadcast_shape):\n    \"\"\"\n    Extract values from a 1-D numpy array for a batch of indices.\n\n    :param arr: the 1-D numpy array.\n    :param timesteps: a tensor of indices into the array to extract.\n    :param broadcast_shape: a larger shape of K dimensions with the batch\n                            dimension equal to the length of timesteps.\n    :return: a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n    \"\"\"\n    res = paddle.to_tensor(arr, place=timesteps.place)[timesteps]\n    while len(res.shape) < len(broadcast_shape):\n        res = res[..., None]\n    return res.expand(broadcast_shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/losses.py",
    "content": "\"\"\"\nHelpers for various likelihood-based losses implemented by Paddle. These are ported from the original\nHo et al. diffusion models codebase:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/utils.py\n\"\"\"\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef normal_kl(mean1, logvar1, mean2, logvar2):\n    \"\"\"\n    Compute the KL divergence between two gaussians.\n\n    Shapes are automatically broadcasted, so batches can be compared to\n    scalars, among other use cases.\n    \"\"\"\n    tensor = None\n    for obj in (mean1, logvar1, mean2, logvar2):\n        if isinstance(obj, paddle.Tensor):\n            tensor = obj\n            break\n    assert tensor is not None, \"at least one argument must be a Tensor\"\n\n    # Force variances to be Tensors. Broadcasting helps convert scalars to\n    # Tensors, but it does not work for th.exp().\n    logvar1, logvar2 = [x if isinstance(x, paddle.Tensor) else paddle.to_tensor(x) for x in (logvar1, logvar2)]\n\n    return 0.5 * (-1.0 + logvar2 - logvar1 + paddle.exp(logvar1 - logvar2) +\n                  ((mean1 - mean2)**2) * paddle.exp(-logvar2))\n\n\ndef approx_standard_normal_cdf(x):\n    \"\"\"\n    A fast approximation of the cumulative distribution function of the\n    standard normal.\n    \"\"\"\n    return 0.5 * (1.0 + paddle.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef discretized_gaussian_log_likelihood(x, *, means, log_scales):\n    \"\"\"\n    Compute the log-likelihood of a Gaussian distribution discretizing to a\n    given image.\n\n    :param x: the target images. 
It is assumed that this was uint8 values,\n              rescaled to the range [-1, 1].\n    :param means: the Gaussian mean Tensor.\n    :param log_scales: the Gaussian log stddev Tensor.\n    :return: a tensor like x of log probabilities (in nats).\n    \"\"\"\n    assert x.shape == means.shape == log_scales.shape\n    centered_x = x - means\n    inv_stdv = paddle.exp(-log_scales)\n    plus_in = inv_stdv * (centered_x + 1.0 / 255.0)\n    cdf_plus = approx_standard_normal_cdf(plus_in)\n    min_in = inv_stdv * (centered_x - 1.0 / 255.0)\n    cdf_min = approx_standard_normal_cdf(min_in)\n    log_cdf_plus = paddle.log(cdf_plus.clip(min=1e-12))\n    log_one_minus_cdf_min = paddle.log((1.0 - cdf_min).clip(min=1e-12))\n    cdf_delta = cdf_plus - cdf_min\n    log_probs = paddle.where(\n        x < -0.999,\n        log_cdf_plus,\n        paddle.where(x > 0.999, log_one_minus_cdf_min, paddle.log(cdf_delta.clip(min=1e-12))),\n    )\n    assert log_probs.shape == x.shape\n    return log_probs\n\n\ndef spherical_dist_loss(x, y):\n    x = F.normalize(x, axis=-1)\n    y = F.normalize(y, axis=-1)\n    return (x - y).norm(axis=-1).divide(paddle.to_tensor(2.0)).asin().pow(2).multiply(paddle.to_tensor(2.0))\n\n\ndef tv_loss(input):\n    \"\"\"L2 total variation loss, as in Mahendran et al.\"\"\"\n    input = F.pad(input, (0, 1, 0, 1), 'replicate')\n    x_diff = input[..., :-1, 1:] - input[..., :-1, :-1]\n    y_diff = input[..., 1:, :-1] - input[..., :-1, :-1]\n    return (x_diff**2 + y_diff**2).mean([1, 2, 3])\n\n\ndef range_loss(input):\n    return (input - input.clip(-1, 1)).pow(2).mean([1, 2, 3])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/make_cutouts.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/make_cutouts.py\n'''\nimport math\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_clip_rn50.resize_right.resize_right import resize\nfrom paddle.nn import functional as F\n\nfrom . import transforms as T\n\nskip_augs = False  # @param{type: 'boolean'}\n\n\ndef sinc(x):\n    # paddle.Tensor has no new_ones(); use ones_like for the x == 0 branch.\n    return paddle.where(x != 0, paddle.sin(math.pi * x) / (math.pi * x), paddle.ones_like(x))\n\n\ndef lanczos(x, a):\n    cond = paddle.logical_and(-a < x, x < a)\n    # paddle.Tensor has no new_zeros(); use zeros_like outside the window.\n    out = paddle.where(cond, sinc(x) * sinc(x / a), paddle.zeros_like(x))\n    return out / out.sum()\n\n\ndef ramp(ratio, width):\n    n = math.ceil(width / ratio + 1)\n    out = paddle.empty([n])\n    cur = 0\n    for i in range(out.shape[0]):\n        out[i] = cur\n        cur += ratio\n    return paddle.concat([-out[1:].flip([0]), out])[1:-1]\n\n\nclass MakeCutouts(nn.Layer):\n\n    def __init__(self, cut_size, cutn, skip_augs=False):\n        super().__init__()\n        self.cut_size = cut_size\n        self.cutn = cutn\n        self.skip_augs = skip_augs\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomPerspective(distortion_scale=0.4, p=0.7),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.15),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        input = T.Pad(input.shape[2] // 4, fill=0)(input)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n\n        cutouts = []\n        for ch in range(self.cutn):\n     
       if ch > self.cutn - self.cutn // 4:\n                cutout = input.clone()\n            else:\n                size = int(max_size *\n                           paddle.zeros(1, ).normal_(mean=0.8, std=0.3).clip(float(self.cut_size / max_size), 1.0))\n                offsetx = paddle.randint(0, abs(sideX - size + 1), ())\n                offsety = paddle.randint(0, abs(sideY - size + 1), ())\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n\n            if not self.skip_augs:\n                cutout = self.augs(cutout)\n            cutouts.append(resample(cutout, (self.cut_size, self.cut_size)))\n            del cutout\n\n        cutouts = paddle.concat(cutouts, axis=0)\n        return cutouts\n\n\nclass MakeCutoutsDango(nn.Layer):\n\n    def __init__(self, cut_size, Overview=4, InnerCrop=0, IC_Size_Pow=0.5, IC_Grey_P=0.2):\n        super().__init__()\n        self.cut_size = cut_size\n        self.Overview = Overview\n        self.InnerCrop = InnerCrop\n        self.IC_Size_Pow = IC_Size_Pow\n        self.IC_Grey_P = IC_Grey_P\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(\n                degrees=10,\n                translate=(0.05, 0.05),\n                interpolation=T.InterpolationMode.BILINEAR,\n            ),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.1),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        cutouts = []\n        gray = T.Grayscale(3)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n        min_size = min(sideX, sideY, self.cut_size)\n        output_shape = [1, 3, self.cut_size, self.cut_size]\n        pad_input = F.pad(\n       
     input,\n            (\n                (sideY - max_size) // 2,\n                (sideY - max_size) // 2,\n                (sideX - max_size) // 2,\n                (sideX - max_size) // 2,\n            ),\n            **padargs,\n        )\n        cutout = resize(pad_input, out_shape=output_shape)\n\n        if self.Overview > 0:\n            if self.Overview <= 4:\n                if self.Overview >= 1:\n                    cutouts.append(cutout)\n                if self.Overview >= 2:\n                    cutouts.append(gray(cutout))\n                if self.Overview >= 3:\n                    cutouts.append(cutout[:, :, :, ::-1])\n                if self.Overview == 4:\n                    cutouts.append(gray(cutout[:, :, :, ::-1]))\n            else:\n                cutout = resize(pad_input, out_shape=output_shape)\n                for _ in range(self.Overview):\n                    cutouts.append(cutout)\n\n        if self.InnerCrop > 0:\n            for i in range(self.InnerCrop):\n                size = int(paddle.rand([1])**self.IC_Size_Pow * (max_size - min_size) + min_size)\n                offsetx = paddle.randint(0, sideX - size + 1)\n                offsety = paddle.randint(0, sideY - size + 1)\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n                if i <= int(self.IC_Grey_P * self.InnerCrop):\n                    cutout = gray(cutout)\n                cutout = resize(cutout, out_shape=output_shape)\n                cutouts.append(cutout)\n\n        cutouts = paddle.concat(cutouts)\n        if not skip_augs:\n            cutouts = self.augs(cutouts)\n        return cutouts\n\n\ndef resample(input, size, align_corners=True):\n    n, c, h, w = input.shape\n    dh, dw = size\n\n    input = input.reshape([n * c, 1, h, w])\n\n    if dh < h:\n        # .to(device, dtype) is a torch idiom; cast to the input dtype instead.\n        kernel_h = lanczos(ramp(dh / h, 2), 2).astype(input.dtype)\n        pad_h = (kernel_h.shape[0] - 1) // 2\n        input = F.pad(input, 
(0, 0, pad_h, pad_h), 'reflect')\n        input = F.conv2d(input, kernel_h[None, None, :, None])\n\n    if dw < w:\n        # .to(device, dtype) is a torch idiom; cast to the input dtype instead.\n        kernel_w = lanczos(ramp(dw / w, 2), 2).astype(input.dtype)\n        pad_w = (kernel_w.shape[0] - 1) // 2\n        input = F.pad(input, (pad_w, pad_w, 0, 0), 'reflect')\n        input = F.conv2d(input, kernel_w[None, None, None, :])\n\n    input = input.reshape([n, c, h, w])\n    return F.interpolate(input, size, mode='bicubic', align_corners=align_corners)\n\n\npadargs = {}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/nn.py",
    "content": "\"\"\"\nVarious utilities for neural networks implemented by Paddle. This code is rewritten based on:\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/nn.py\n\"\"\"\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\n\nclass SiLU(nn.Layer):\n\n    def forward(self, x):\n        return x * nn.functional.sigmoid(x)\n\n\nclass GroupNorm32(nn.GroupNorm):\n\n    def forward(self, x):\n        return super().forward(x)\n\n\ndef conv_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D convolution module.\n    \"\"\"\n    if dims == 1:\n        return nn.Conv1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.Conv2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.Conv3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef linear(*args, **kwargs):\n    \"\"\"\n    Create a linear module.\n    \"\"\"\n    return nn.Linear(*args, **kwargs)\n\n\ndef avg_pool_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D average pooling module.\n    \"\"\"\n    if dims == 1:\n        return nn.AvgPool1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.AvgPool2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.AvgPool3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef update_ema(target_params, source_params, rate=0.99):\n    \"\"\"\n    Update target parameters to be closer to those of source parameters using\n    an exponential moving average.\n\n    :param target_params: the target parameter sequence.\n    :param source_params: the source parameter sequence.\n    :param rate: the EMA rate (closer to 1 means slower).\n    \"\"\"\n    for targ, src in zip(target_params, source_params):\n        targ.detach().mul_(rate).add_(src, alpha=1 - rate)\n\n\ndef zero_module(module):\n    \"\"\"\n    Zero out the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().zero_()\n    
return module\n\n\ndef scale_module(module, scale):\n    \"\"\"\n    Scale the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().mul_(scale)\n    return module\n\n\ndef mean_flat(tensor):\n    \"\"\"\n    Take the mean over all non-batch dimensions.\n    \"\"\"\n    return tensor.mean(axis=list(range(1, len(tensor.shape))))\n\n\ndef normalization(channels):\n    \"\"\"\n    Make a standard normalization layer.\n\n    :param channels: number of input channels.\n    :return: an nn.Module for normalization.\n    \"\"\"\n    return GroupNorm32(32, channels)\n\n\ndef timestep_embedding(timesteps, dim, max_period=10000):\n    \"\"\"\n    Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param dim: the dimension of the output.\n    :param max_period: controls the minimum frequency of the embeddings.\n    :return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    half = dim // 2\n    freqs = paddle.exp(-math.log(max_period) * paddle.arange(start=0, end=half, dtype=paddle.float32) / half)\n    args = paddle.cast(timesteps[:, None], 'float32') * freqs[None]\n    embedding = paddle.concat([paddle.cos(args), paddle.sin(args)], axis=-1)\n    if dim % 2:\n        embedding = paddle.concat([embedding, paddle.zeros_like(embedding[:, :1])], axis=-1)\n    return embedding\n\n\ndef checkpoint(func, inputs, params, flag):\n    \"\"\"\n    Gradient checkpointing is disabled in this port; this simply runs a\n    normal forward pass of func.\n    \"\"\"\n    return func(*inputs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/perlin_noises.py",
    "content": "'''\nPerlin noise implementation by Paddle.\nThis code is rewritten based on:\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/perlin_noises.py\n'''\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as TF\nfrom PIL import Image\nfrom PIL import ImageOps\n\n\ndef interp(t):\n    return 3 * t**2 - 2 * t**3\n\n\ndef perlin(width, height, scale=10):\n    gx, gy = paddle.randn([2, width + 1, height + 1, 1, 1])\n    xs = paddle.linspace(0, 1, scale + 1)[:-1, None]\n    ys = paddle.linspace(0, 1, scale + 1)[None, :-1]\n    wx = 1 - interp(xs)\n    wy = 1 - interp(ys)\n    dots = 0\n    dots += wx * wy * (gx[:-1, :-1] * xs + gy[:-1, :-1] * ys)\n    dots += (1 - wx) * wy * (-gx[1:, :-1] * (1 - xs) + gy[1:, :-1] * ys)\n    dots += wx * (1 - wy) * (gx[:-1, 1:] * xs - gy[:-1, 1:] * (1 - ys))\n    dots += (1 - wx) * (1 - wy) * (-gx[1:, 1:] * (1 - xs) - gy[1:, 1:] * (1 - ys))\n    return dots.transpose([0, 2, 1, 3]).reshape([width * scale, height * scale])\n\n\ndef perlin_ms(octaves, width, height, grayscale):\n    out_array = [0.5] if grayscale else [0.5, 0.5, 0.5]\n    # out_array = [0.0] if grayscale else [0.0, 0.0, 0.0]\n    for i in range(1 if grayscale else 3):\n        scale = 2**len(octaves)\n        oct_width = width\n        oct_height = height\n        for oct in octaves:\n            p = perlin(oct_width, oct_height, scale)\n            out_array[i] += p * oct\n            scale //= 2\n            oct_width *= 2\n            oct_height *= 2\n    return paddle.concat(out_array)\n\n\ndef create_perlin_noise(octaves, width, height, grayscale, side_y, side_x):\n    out = perlin_ms(octaves, width, height, grayscale)\n    if grayscale:\n        out = TF.resize(size=(side_y, side_x), img=out.numpy())\n        out = np.uint8(out)\n        out = Image.fromarray(out).convert('RGB')\n    else:\n        out = out.reshape([-1, 3, out.shape[0] // 3, out.shape[1]])\n        out = out.squeeze().transpose([1, 2, 0]).numpy()\n        out = 
TF.resize(size=(side_y, side_x), img=out)\n        out = out.clip(0, 1) * 255\n        out = np.uint8(out)\n        out = Image.fromarray(out)\n\n    out = ImageOps.autocontrast(out)\n    return out\n\n\ndef regen_perlin(perlin_mode, side_y, side_x, batch_size):\n    if perlin_mode == 'color':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n    elif perlin_mode == 'gray':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n    else:\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n\n    init = (TF.to_tensor(init).add(TF.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n    del init2\n    return init.expand([batch_size, -1, -1, -1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/respace.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/respace.py\n'''\nimport numpy as np\nimport paddle\n\nfrom .gaussian_diffusion import GaussianDiffusion\n\n\ndef space_timesteps(num_timesteps, section_counts):\n    \"\"\"\n    Create a list of timesteps to use from an original diffusion process,\n    given the number of timesteps we want to take from equally-sized portions\n    of the original process.\n\n    For example, if there's 300 timesteps and the section counts are [10,15,20]\n    then the first 100 timesteps are strided to be 10 timesteps, the second 100\n    are strided to be 15 timesteps, and the final 100 are strided to be 20.\n\n    If the stride is a string starting with \"ddim\", then the fixed striding\n    from the DDIM paper is used, and only one section is allowed.\n\n    :param num_timesteps: the number of diffusion steps in the original\n                          process to divide up.\n    :param section_counts: either a list of numbers, or a string containing\n                           comma-separated numbers, indicating the step count\n                           per section. 
As a special case, use \"ddimN\" where N\n                           is a number of steps to use the striding from the\n                           DDIM paper.\n    :return: a set of diffusion steps from the original process to use.\n    \"\"\"\n    if isinstance(section_counts, str):\n        if section_counts.startswith(\"ddim\"):\n            desired_count = int(section_counts[len(\"ddim\"):])\n            for i in range(1, num_timesteps):\n                if len(range(0, num_timesteps, i)) == desired_count:\n                    return set(range(0, num_timesteps, i))\n            raise ValueError(f\"cannot create exactly {desired_count} steps with an integer stride\")\n        section_counts = [int(x) for x in section_counts.split(\",\")]\n    size_per = num_timesteps // len(section_counts)\n    extra = num_timesteps % len(section_counts)\n    start_idx = 0\n    all_steps = []\n    for i, section_count in enumerate(section_counts):\n        size = size_per + (1 if i < extra else 0)\n        if size < section_count:\n            raise ValueError(f\"cannot divide section of {size} steps into {section_count}\")\n        if section_count <= 1:\n            frac_stride = 1\n        else:\n            frac_stride = (size - 1) / (section_count - 1)\n        cur_idx = 0.0\n        taken_steps = []\n        for _ in range(section_count):\n            taken_steps.append(start_idx + round(cur_idx))\n            cur_idx += frac_stride\n        all_steps += taken_steps\n        start_idx += size\n    return set(all_steps)\n\n\nclass SpacedDiffusion(GaussianDiffusion):\n    \"\"\"\n    A diffusion process which can skip steps in a base diffusion process.\n\n    :param use_timesteps: a collection (sequence or set) of timesteps from the\n                          original diffusion process to retain.\n    :param kwargs: the kwargs to create the base diffusion process.\n    \"\"\"\n\n    def __init__(self, use_timesteps, **kwargs):\n        self.use_timesteps = 
set(use_timesteps)\n        self.timestep_map = []\n        self.original_num_steps = len(kwargs[\"betas\"])\n\n        base_diffusion = GaussianDiffusion(**kwargs)  # pylint: disable=missing-kwoa\n        last_alpha_cumprod = 1.0\n        new_betas = []\n        for i, alpha_cumprod in enumerate(base_diffusion.alphas_cumprod):\n            if i in self.use_timesteps:\n                new_betas.append(1 - alpha_cumprod / last_alpha_cumprod)\n                last_alpha_cumprod = alpha_cumprod\n                self.timestep_map.append(i)\n        kwargs[\"betas\"] = np.array(new_betas)\n        super().__init__(**kwargs)\n\n    def p_mean_variance(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().p_mean_variance(self._wrap_model(model), *args, **kwargs)\n\n    def training_losses(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().training_losses(self._wrap_model(model), *args, **kwargs)\n\n    def condition_mean(self, cond_fn, *args, **kwargs):\n        return super().condition_mean(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def condition_score(self, cond_fn, *args, **kwargs):\n        return super().condition_score(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def _wrap_model(self, model):\n        if isinstance(model, _WrappedModel):\n            return model\n        return _WrappedModel(model, self.timestep_map, self.rescale_timesteps, self.original_num_steps)\n\n    def _scale_timesteps(self, t):\n        # Scaling is done by the wrapped model.\n        return t\n\n\nclass _WrappedModel:\n\n    def __init__(self, model, timestep_map, rescale_timesteps, original_num_steps):\n        self.model = model\n        self.timestep_map = timestep_map\n        self.rescale_timesteps = rescale_timesteps\n        self.original_num_steps = original_num_steps\n\n    def __call__(self, x, ts, **kwargs):\n        map_tensor = paddle.to_tensor(self.timestep_map, place=ts.place, 
dtype=ts.dtype)\n        new_ts = map_tensor[ts]\n        if self.rescale_timesteps:\n            new_ts = paddle.cast(new_ts, 'float32') * (1000.0 / self.original_num_steps)\n        return self.model(x, new_ts, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/script_util.py",
    "content": "'''\nThis code is based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/script_util.py\n'''\nimport argparse\nimport inspect\n\nfrom . import gaussian_diffusion as gd\nfrom .respace import space_timesteps\nfrom .respace import SpacedDiffusion\nfrom .unet import EncoderUNetModel\nfrom .unet import SuperResModel\nfrom .unet import UNetModel\n\nNUM_CLASSES = 1000\n\n\ndef diffusion_defaults():\n    \"\"\"\n    Defaults for image and classifier training.\n    \"\"\"\n    return dict(\n        learn_sigma=False,\n        diffusion_steps=1000,\n        noise_schedule=\"linear\",\n        timestep_respacing=\"\",\n        use_kl=False,\n        predict_xstart=False,\n        rescale_timesteps=False,\n        rescale_learned_sigmas=False,\n    )\n\n\ndef model_and_diffusion_defaults():\n    \"\"\"\n    Defaults for image training.\n    \"\"\"\n    res = dict(\n        image_size=64,\n        num_channels=128,\n        num_res_blocks=2,\n        num_heads=4,\n        num_heads_upsample=-1,\n        num_head_channels=-1,\n        attention_resolutions=\"16,8\",\n        channel_mult=\"\",\n        dropout=0.0,\n        class_cond=False,\n        use_checkpoint=False,\n        use_scale_shift_norm=True,\n        resblock_updown=False,\n        use_fp16=False,\n        use_new_attention_order=False,\n    )\n    res.update(diffusion_defaults())\n    return res\n\n\ndef create_model_and_diffusion(\n    image_size,\n    class_cond,\n    learn_sigma,\n    num_channels,\n    num_res_blocks,\n    channel_mult,\n    num_heads,\n    num_head_channels,\n    num_heads_upsample,\n    attention_resolutions,\n    dropout,\n    diffusion_steps,\n    noise_schedule,\n    timestep_respacing,\n    use_kl,\n    predict_xstart,\n    rescale_timesteps,\n    rescale_learned_sigmas,\n    use_checkpoint,\n    use_scale_shift_norm,\n    resblock_updown,\n    use_fp16,\n    use_new_attention_order,\n):\n    model = create_model(\n        image_size,\n        
num_channels,\n        num_res_blocks,\n        channel_mult=channel_mult,\n        learn_sigma=learn_sigma,\n        class_cond=class_cond,\n        use_checkpoint=use_checkpoint,\n        attention_resolutions=attention_resolutions,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        dropout=dropout,\n        resblock_updown=resblock_updown,\n        use_fp16=use_fp16,\n        use_new_attention_order=use_new_attention_order,\n    )\n    diffusion = create_gaussian_diffusion(\n        steps=diffusion_steps,\n        learn_sigma=learn_sigma,\n        noise_schedule=noise_schedule,\n        use_kl=use_kl,\n        predict_xstart=predict_xstart,\n        rescale_timesteps=rescale_timesteps,\n        rescale_learned_sigmas=rescale_learned_sigmas,\n        timestep_respacing=timestep_respacing,\n    )\n    return model, diffusion\n\n\ndef create_model(\n    image_size,\n    num_channels,\n    num_res_blocks,\n    channel_mult=\"\",\n    learn_sigma=False,\n    class_cond=False,\n    use_checkpoint=False,\n    attention_resolutions=\"16\",\n    num_heads=1,\n    num_head_channels=-1,\n    num_heads_upsample=-1,\n    use_scale_shift_norm=False,\n    dropout=0,\n    resblock_updown=False,\n    use_fp16=False,\n    use_new_attention_order=False,\n):\n    if channel_mult == \"\":\n        if image_size == 512:\n            channel_mult = (0.5, 1, 1, 2, 2, 4, 4)\n        elif image_size == 256:\n            channel_mult = (1, 1, 2, 2, 4, 4)\n        elif image_size == 128:\n            channel_mult = (1, 1, 2, 3, 4)\n        elif image_size == 64:\n            channel_mult = (1, 2, 3, 4)\n        else:\n            raise ValueError(f\"unsupported image size: {image_size}\")\n    else:\n        channel_mult = tuple(int(ch_mult) for ch_mult in channel_mult.split(\",\"))\n\n    attention_ds = []\n    for res in 
attention_resolutions.split(\",\"):\n        attention_ds.append(image_size // int(res))\n\n    return UNetModel(\n        image_size=image_size,\n        in_channels=3,\n        model_channels=num_channels,\n        out_channels=(3 if not learn_sigma else 6),\n        num_res_blocks=num_res_blocks,\n        attention_resolutions=tuple(attention_ds),\n        dropout=dropout,\n        channel_mult=channel_mult,\n        num_classes=(NUM_CLASSES if class_cond else None),\n        use_checkpoint=use_checkpoint,\n        use_fp16=use_fp16,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        resblock_updown=resblock_updown,\n        use_new_attention_order=use_new_attention_order,\n    )\n\n\ndef create_gaussian_diffusion(\n    *,\n    steps=1000,\n    learn_sigma=False,\n    sigma_small=False,\n    noise_schedule=\"linear\",\n    use_kl=False,\n    predict_xstart=False,\n    rescale_timesteps=False,\n    rescale_learned_sigmas=False,\n    timestep_respacing=\"\",\n):\n    betas = gd.get_named_beta_schedule(noise_schedule, steps)\n    if use_kl:\n        loss_type = gd.LossType.RESCALED_KL\n    elif rescale_learned_sigmas:\n        loss_type = gd.LossType.RESCALED_MSE\n    else:\n        loss_type = gd.LossType.MSE\n    if not timestep_respacing:\n        timestep_respacing = [steps]\n    return SpacedDiffusion(\n        use_timesteps=space_timesteps(steps, timestep_respacing),\n        betas=betas,\n        model_mean_type=(gd.ModelMeanType.EPSILON if not predict_xstart else gd.ModelMeanType.START_X),\n        model_var_type=((gd.ModelVarType.FIXED_LARGE if not sigma_small else gd.ModelVarType.FIXED_SMALL)\n                        if not learn_sigma else gd.ModelVarType.LEARNED_RANGE),\n        loss_type=loss_type,\n        rescale_timesteps=rescale_timesteps,\n    )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/sec_diff.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/sec_diff.py\n'''\nimport math\nfrom dataclasses import dataclass\nfrom functools import partial\n\nimport paddle\nimport paddle.nn as nn\n\n\n@dataclass\nclass DiffusionOutput:\n    v: paddle.Tensor\n    pred: paddle.Tensor\n    eps: paddle.Tensor\n\n\nclass SkipBlock(nn.Layer):\n\n    def __init__(self, main, skip=None):\n        super().__init__()\n        self.main = nn.Sequential(*main)\n        self.skip = skip if skip else nn.Identity()\n\n    def forward(self, input):\n        return paddle.concat([self.main(input), self.skip(input)], axis=1)\n\n\ndef append_dims(x, n):\n    return x[(Ellipsis, *(None, ) * (n - x.ndim))]\n\n\ndef expand_to_planes(x, shape):\n    return paddle.tile(append_dims(x, len(shape)), [1, 1, *shape[2:]])\n\n\ndef alpha_sigma_to_t(alpha, sigma):\n    return paddle.atan2(sigma, alpha) * 2 / math.pi\n\n\ndef t_to_alpha_sigma(t):\n    return paddle.cos(t * math.pi / 2), paddle.sin(t * math.pi / 2)\n\n\nclass SecondaryDiffusionImageNet2(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        c = 64  # The base channel count\n        cs = [c, c * 2, c * 2, c * 4, c * 4, c * 8]\n\n        self.timestep_embed = FourierFeatures(1, 16)\n        self.down = nn.AvgPool2D(2)\n        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)\n\n        self.net = nn.Sequential(\n            ConvBlock(3 + 16, cs[0]),\n            ConvBlock(cs[0], cs[0]),\n            SkipBlock([\n                self.down,\n                ConvBlock(cs[0], cs[1]),\n                ConvBlock(cs[1], cs[1]),\n                SkipBlock([\n                    self.down,\n                    ConvBlock(cs[1], cs[2]),\n                    ConvBlock(cs[2], cs[2]),\n                    SkipBlock([\n                        self.down,\n                        ConvBlock(cs[2], cs[3]),\n                        
ConvBlock(cs[3], cs[3]),\n                        SkipBlock([\n                            self.down,\n                            ConvBlock(cs[3], cs[4]),\n                            ConvBlock(cs[4], cs[4]),\n                            SkipBlock([\n                                self.down,\n                                ConvBlock(cs[4], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[4]),\n                                self.up,\n                            ]),\n                            ConvBlock(cs[4] * 2, cs[4]),\n                            ConvBlock(cs[4], cs[3]),\n                            self.up,\n                        ]),\n                        ConvBlock(cs[3] * 2, cs[3]),\n                        ConvBlock(cs[3], cs[2]),\n                        self.up,\n                    ]),\n                    ConvBlock(cs[2] * 2, cs[2]),\n                    ConvBlock(cs[2], cs[1]),\n                    self.up,\n                ]),\n                ConvBlock(cs[1] * 2, cs[1]),\n                ConvBlock(cs[1], cs[0]),\n                self.up,\n            ]),\n            ConvBlock(cs[0] * 2, cs[0]),\n            nn.Conv2D(cs[0], 3, 3, padding=1),\n        )\n\n    def forward(self, input, t):\n        timestep_embed = expand_to_planes(self.timestep_embed(t[:, None]), input.shape)\n        v = self.net(paddle.concat([input, timestep_embed], axis=1))\n        alphas, sigmas = map(partial(append_dims, n=v.ndim), t_to_alpha_sigma(t))\n        pred = input * alphas - v * sigmas\n        eps = input * sigmas + v * alphas\n        return DiffusionOutput(v, pred, eps)\n\n\nclass FourierFeatures(nn.Layer):\n\n    def __init__(self, in_features, out_features, std=1.0):\n        super().__init__()\n        assert out_features % 2 == 0\n        # self.weight = nn.Parameter(paddle.randn([out_features // 2, in_features]) * std)\n      
  self.weight = paddle.create_parameter([out_features // 2, in_features],\n                                              dtype='float32',\n                                              default_initializer=nn.initializer.Normal(mean=0.0, std=std))\n\n    def forward(self, input):\n        f = 2 * math.pi * input @ self.weight.T\n        return paddle.concat([f.cos(), f.sin()], axis=-1)\n\n\nclass ConvBlock(nn.Sequential):\n\n    def __init__(self, c_in, c_out):\n        super().__init__(\n            nn.Conv2D(c_in, c_out, 3, padding=1),\n            nn.ReLU(),\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/transforms.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/pytorch/vision/blob/main/torchvision/transforms/transforms.py\n'''\nimport math\nimport numbers\nimport warnings\nfrom enum import Enum\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn.functional import grid_sample\nfrom paddle.vision import transforms as T\n\n\nclass Normalize(nn.Layer):\n\n    def __init__(self, mean, std):\n        super(Normalize, self).__init__()\n        self.mean = paddle.to_tensor(mean)\n        self.std = paddle.to_tensor(std)\n\n    def forward(self, tensor: Tensor):\n        dtype = tensor.dtype\n        mean = paddle.to_tensor(self.mean, dtype=dtype)\n        std = paddle.to_tensor(self.std, dtype=dtype)\n        mean = mean.reshape([1, -1, 1, 1])\n        std = std.reshape([1, -1, 1, 1])\n        result = tensor.subtract(mean).divide(std)\n        return result\n\n\nclass InterpolationMode(Enum):\n    \"\"\"Interpolation modes\n    Available interpolation methods are ``nearest``, ``bilinear``, ``bicubic``, ``box``, ``hamming``, and ``lanczos``.\n    \"\"\"\n\n    NEAREST = \"nearest\"\n    BILINEAR = \"bilinear\"\n    BICUBIC = \"bicubic\"\n    # For PIL compatibility\n    BOX = \"box\"\n    HAMMING = \"hamming\"\n    LANCZOS = \"lanczos\"\n\n\nclass Grayscale(nn.Layer):\n\n    def __init__(self, num_output_channels):\n        super(Grayscale, self).__init__()\n        self.num_output_channels = num_output_channels\n\n    def forward(self, x):\n        output = (0.2989 * x[:, 0:1, :, :] + 0.587 * x[:, 1:2, :, :] + 0.114 * x[:, 2:3, :, :])\n        if self.num_output_channels == 3:\n            return output.expand(x.shape)\n\n        return output\n\n\nclass Lambda(nn.Layer):\n\n    def __init__(self, 
func):\n        super(Lambda, self).__init__()\n        self.transform = func\n\n    def forward(self, x):\n        return self.transform(x)\n\n\nclass RandomGrayscale(nn.Layer):\n\n    def __init__(self, p):\n        super(RandomGrayscale, self).__init__()\n        self.prob = p\n        self.transform = Grayscale(3)\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return self.transform(x)\n        else:\n            return x\n\n\nclass RandomHorizontalFlip(nn.Layer):\n\n    def __init__(self, prob):\n        super(RandomHorizontalFlip, self).__init__()\n        self.prob = prob\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return x[:, :, :, ::-1]\n        else:\n            return x\n\n\ndef _blend(img1: Tensor, img2: Tensor, ratio: float) -> Tensor:\n    ratio = float(ratio)\n    bound = 1.0\n    return (ratio * img1 + (1.0 - ratio) * img2).clip(0, bound)\n\n\ndef trunc_div(a, b):\n    ipt = paddle.divide(a, b)\n    sign_ipt = paddle.sign(ipt)\n    abs_ipt = paddle.abs(ipt)\n    abs_ipt = paddle.floor(abs_ipt)\n    out = paddle.multiply(sign_ipt, abs_ipt)\n    return out\n\n\ndef fmod(a, b):\n    return a - trunc_div(a, b) * b\n\n\ndef _rgb2hsv(img: Tensor) -> Tensor:\n    r, g, b = img.unbind(axis=-3)\n\n    # Implementation is based on https://github.com/python-pillow/Pillow/blob/4174d4267616897df3746d315d5a2d0f82c656ee/\n    # src/libImaging/Convert.c#L330\n    maxc = paddle.max(img, axis=-3)\n    minc = paddle.min(img, axis=-3)\n\n    # The algorithm erases S and H channel where `maxc = minc`. 
This avoids NaN\n    # from happening in the results, because\n    #   + S channel has division by `maxc`, which is zero only if `maxc = minc`\n    #   + H channel has division by `(maxc - minc)`.\n    #\n    # Instead of overwriting NaN afterwards, we just prevent it from occurring so\n    # we don't need to deal with it in case we save the NaN in a buffer in\n    # backprop, if it is ever supported, but it doesn't hurt to do so.\n    eqc = maxc == minc\n\n    cr = maxc - minc\n    # Since `eqc => cr = 0`, replacing denominator with 1 when `eqc` is true is fine.\n    ones = paddle.ones_like(maxc)\n    s = cr / paddle.where(eqc, ones, maxc)\n    # Note that `eqc => maxc = minc = r = g = b`. So the following calculation\n    # of `h` would reduce to `bc - gc + 2 + rc - bc + 4 + rc - bc = 6` so it\n    # would not matter what values `rc`, `gc`, and `bc` have here, and thus\n    # replacing denominator with 1 when `eqc` is true is fine.\n    cr_divisor = paddle.where(eqc, ones, cr)\n    rc = (maxc - r) / cr_divisor\n    gc = (maxc - g) / cr_divisor\n    bc = (maxc - b) / cr_divisor\n\n    hr = (maxc == r).cast('float32') * (bc - gc)\n    hg = ((maxc == g) & (maxc != r)).cast('float32') * (2.0 + rc - bc)\n    hb = ((maxc != g) & (maxc != r)).cast('float32') * (4.0 + gc - rc)\n    h = hr + hg + hb\n    h = fmod((h / 6.0 + 1.0), paddle.to_tensor(1.0))\n    return paddle.stack((h, s, maxc), axis=-3)\n\n\ndef _hsv2rgb(img: Tensor) -> Tensor:\n    h, s, v = img.unbind(axis=-3)\n    i = paddle.floor(h * 6.0)\n    f = (h * 6.0) - i\n    i = i.cast(dtype='int32')\n\n    p = paddle.clip((v * (1.0 - s)), 0.0, 1.0)\n    q = paddle.clip((v * (1.0 - s * f)), 0.0, 1.0)\n    t = paddle.clip((v * (1.0 - s * (1.0 - f))), 0.0, 1.0)\n    i = i % 6\n\n    mask = i.unsqueeze(axis=-3) == paddle.arange(6).reshape([-1, 1, 1])\n\n    a1 = paddle.stack((v, q, p, p, t, v), axis=-3)\n    a2 = paddle.stack((t, v, v, q, p, p), axis=-3)\n    a3 = paddle.stack((p, p, t, v, v, q), axis=-3)\n    a4 = paddle.stack((a1, 
a2, a3), axis=-4)\n\n    return paddle.einsum(\"...ijk, ...xijk -> ...xjk\", mask.cast(dtype=img.dtype), a4)\n\n\ndef adjust_brightness(img: Tensor, brightness_factor: float) -> Tensor:\n    if brightness_factor < 0:\n        raise ValueError(f\"brightness_factor ({brightness_factor}) is not non-negative.\")\n\n    return _blend(img, paddle.zeros_like(img), brightness_factor)\n\n\ndef adjust_contrast(img: Tensor, contrast_factor: float) -> Tensor:\n    if contrast_factor < 0:\n        raise ValueError(f\"contrast_factor ({contrast_factor}) is not non-negative.\")\n\n    c = img.shape[1]\n\n    if c == 3:\n        output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n        mean = paddle.mean(output, axis=(-3, -2, -1), keepdim=True)\n\n    else:\n        mean = paddle.mean(img, axis=(-3, -2, -1), keepdim=True)\n\n    return _blend(img, mean, contrast_factor)\n\n\ndef adjust_hue(img: Tensor, hue_factor: float) -> Tensor:\n    if not (-0.5 <= hue_factor <= 0.5):\n        raise ValueError(f\"hue_factor ({hue_factor}) is not in [-0.5, 0.5].\")\n\n    img = _rgb2hsv(img)\n    h, s, v = img.unbind(axis=-3)\n    h = fmod(h + hue_factor, paddle.to_tensor(1.0))\n    img = paddle.stack((h, s, v), axis=-3)\n    img_hue_adj = _hsv2rgb(img)\n    return img_hue_adj\n\n\ndef adjust_saturation(img: Tensor, saturation_factor: float) -> Tensor:\n    if saturation_factor < 0:\n        raise ValueError(f\"saturation_factor ({saturation_factor}) is not non-negative.\")\n\n    output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n\n    return _blend(img, output, saturation_factor)\n\n\nclass ColorJitter(nn.Layer):\n\n    def __init__(self, brightness=0, contrast=0, saturation=0, hue=0):\n        super(ColorJitter, self).__init__()\n        self.brightness = self._check_input(brightness, \"brightness\")\n        self.contrast = self._check_input(contrast, \"contrast\")\n        self.saturation = 
self._check_input(saturation, \"saturation\")\n        self.hue = self._check_input(hue, \"hue\", center=0, bound=(-0.5, 0.5), clip_first_on_zero=False)\n\n    def _check_input(self, value, name, center=1, bound=(0, float(\"inf\")), clip_first_on_zero=True):\n        if isinstance(value, numbers.Number):\n            if value < 0:\n                raise ValueError(f\"If {name} is a single number, it must be non-negative.\")\n            value = [center - float(value), center + float(value)]\n            if clip_first_on_zero:\n                value[0] = max(value[0], 0.0)\n        elif isinstance(value, (tuple, list)) and len(value) == 2:\n            if not bound[0] <= value[0] <= value[1] <= bound[1]:\n                raise ValueError(f\"{name} values should be between {bound}\")\n        else:\n            raise TypeError(f\"{name} should be a single number or a list/tuple with length 2.\")\n\n        # if value is 0 or (1., 1.) for brightness/contrast/saturation\n        # or (0., 0.) for hue, do nothing\n        if value[0] == value[1] == center:\n            value = None\n        return value\n\n    @staticmethod\n    def get_params(\n        brightness: Optional[List[float]],\n        contrast: Optional[List[float]],\n        saturation: Optional[List[float]],\n        hue: Optional[List[float]],\n    ) -> Tuple[Tensor, Optional[float], Optional[float], Optional[float], Optional[float]]:\n        \"\"\"Get the parameters for the randomized transform to be applied on image.\n\n        Args:\n            brightness (tuple of float (min, max), optional): The range from which the brightness_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            contrast (tuple of float (min, max), optional): The range from which the contrast_factor is chosen\n                uniformly. 
Pass None to turn off the transformation.\n            saturation (tuple of float (min, max), optional): The range from which the saturation_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            hue (tuple of float (min, max), optional): The range from which the hue_factor is chosen uniformly.\n                Pass None to turn off the transformation.\n\n        Returns:\n            tuple: The parameters used to apply the randomized transform\n            along with their random order.\n        \"\"\"\n        fn_idx = paddle.randperm(4)\n\n        b = None if brightness is None else paddle.empty([1]).uniform_(brightness[0], brightness[1])\n        c = None if contrast is None else paddle.empty([1]).uniform_(contrast[0], contrast[1])\n        s = None if saturation is None else paddle.empty([1]).uniform_(saturation[0], saturation[1])\n        h = None if hue is None else paddle.empty([1]).uniform_(hue[0], hue[1])\n\n        return fn_idx, b, c, s, h\n\n    def forward(self, img):\n        \"\"\"\n        Args:\n            img (PIL Image or Tensor): Input image.\n\n        Returns:\n            PIL Image or Tensor: Color jittered image.\n        \"\"\"\n        fn_idx, brightness_factor, contrast_factor, saturation_factor, hue_factor = self.get_params(\n            self.brightness, self.contrast, self.saturation, self.hue)\n\n        for fn_id in fn_idx:\n            if fn_id == 0 and brightness_factor is not None:\n                img = adjust_brightness(img, brightness_factor)\n            elif fn_id == 1 and contrast_factor is not None:\n                img = adjust_contrast(img, contrast_factor)\n            elif fn_id == 2 and saturation_factor is not None:\n                img = adjust_saturation(img, saturation_factor)\n            elif fn_id == 3 and hue_factor is not None:\n                img = adjust_hue(img, hue_factor)\n\n        return img\n\n    def __repr__(self) -> str:\n        s = 
(f\"{self.__class__.__name__}(\"\n             f\"brightness={self.brightness}\"\n             f\", contrast={self.contrast}\"\n             f\", saturation={self.saturation}\"\n             f\", hue={self.hue})\")\n        return s\n\n\ndef _apply_grid_transform(img: Tensor, grid: Tensor, mode: str, fill: Optional[List[float]]) -> Tensor:\n\n    if img.shape[0] > 1:\n        # Apply same grid to a batch of images\n        grid = grid.expand([img.shape[0], grid.shape[1], grid.shape[2], grid.shape[3]])\n\n    # Append a dummy mask for customized fill colors, should be faster than grid_sample() twice\n    if fill is not None:\n        dummy = paddle.ones((img.shape[0], 1, img.shape[2], img.shape[3]), dtype=img.dtype)\n        img = paddle.concat((img, dummy), axis=1)\n\n    img = grid_sample(img, grid, mode=mode, padding_mode=\"zeros\", align_corners=False)\n\n    # Fill with required color\n    if fill is not None:\n        mask = img[:, -1:, :, :]  # N * 1 * H * W\n        img = img[:, :-1, :, :]  # N * C * H * W\n        mask = mask.expand_as(img)\n        len_fill = len(fill) if isinstance(fill, (tuple, list)) else 1\n        fill_img = paddle.to_tensor(fill, dtype=img.dtype).reshape([1, len_fill, 1, 1]).expand_as(img)\n        if mode == \"nearest\":\n            mask = mask < 0.5\n            img[mask] = fill_img[mask]\n        else:  # 'bilinear'\n            img = img * mask + (1.0 - mask) * fill_img\n    return img\n\n\ndef _gen_affine_grid(\n    theta: Tensor,\n    w: int,\n    h: int,\n    ow: int,\n    oh: int,\n) -> Tensor:\n    # https://github.com/pytorch/pytorch/blob/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/aten/src/ATen/native/\n    # AffineGridGenerator.cpp#L18\n    # Difference with AffineGridGenerator is that:\n    # 1) we normalize grid values after applying theta\n    # 2) we can normalize by other image size, such that it covers \"extend\" option like in PIL.Image.rotate\n\n    d = 0.5\n    base_grid = paddle.empty([1, oh, ow, 3], 
dtype=theta.dtype)\n    x_grid = paddle.linspace(-ow * 0.5 + d, ow * 0.5 + d - 1, num=ow)\n    base_grid[..., 0] = (x_grid)\n    y_grid = paddle.linspace(-oh * 0.5 + d, oh * 0.5 + d - 1, num=oh).unsqueeze_(-1)\n    base_grid[..., 1] = (y_grid)\n    base_grid[..., 2] = 1.0\n    rescaled_theta = theta.transpose([0, 2, 1]) / paddle.to_tensor([0.5 * w, 0.5 * h], dtype=theta.dtype)\n    output_grid = base_grid.reshape([1, oh * ow, 3]).bmm(rescaled_theta)\n    return output_grid.reshape([1, oh, ow, 2])\n\n\ndef affine_impl(img: Tensor,\n                matrix: List[float],\n                interpolation: str = \"nearest\",\n                fill: Optional[List[float]] = None) -> Tensor:\n    theta = paddle.to_tensor(matrix, dtype=img.dtype).reshape([1, 2, 3])\n    shape = img.shape\n    # grid will be generated on the same device as theta and img\n    grid = _gen_affine_grid(theta, w=shape[-1], h=shape[-2], ow=shape[-1], oh=shape[-2])\n    return _apply_grid_transform(img, grid, interpolation, fill=fill)\n\n\ndef _get_inverse_affine_matrix(center: List[float],\n                               angle: float,\n                               translate: List[float],\n                               scale: float,\n                               shear: List[float],\n                               inverted: bool = True) -> List[float]:\n    # Helper method to compute inverse matrix for affine transformation\n\n    # Pillow requires inverse affine transformation matrix:\n    # Affine matrix is : M = T * C * RotateScaleShear * C^-1\n    #\n    # where T is translation matrix: [1, 0, tx | 0, 1, ty | 0, 0, 1]\n    #       C is translation matrix to keep center: [1, 0, cx | 0, 1, cy | 0, 0, 1]\n    #       RotateScaleShear is rotation with scale and shear matrix\n    #\n    #       RotateScaleShear(a, s, (sx, sy)) =\n    #       = R(a) * S(s) * SHy(sy) * SHx(sx)\n    #       = [ s*cos(a - sy)/cos(sy), s*(-cos(a - sy)*tan(sx)/cos(sy) - sin(a)), 0 ]\n    #         [ s*sin(a + sy)/cos(sy), 
s*(-sin(a - sy)*tan(sx)/cos(sy) + cos(a)), 0 ]\n    #         [ 0                    , 0                                      , 1 ]\n    # where R is a rotation matrix, S is a scaling matrix, and SHx and SHy are the shears:\n    # SHx(s) = [1, -tan(s)] and SHy(s) = [1      , 0]\n    #          [0, 1      ]              [-tan(s), 1]\n    #\n    # Thus, the inverse is M^-1 = C * RotateScaleShear^-1 * C^-1 * T^-1\n\n    rot = math.radians(angle)\n    sx = math.radians(shear[0])\n    sy = math.radians(shear[1])\n\n    cx, cy = center\n    tx, ty = translate\n\n    # RSS without scaling\n    a = math.cos(rot - sy) / math.cos(sy)\n    b = -math.cos(rot - sy) * math.tan(sx) / math.cos(sy) - math.sin(rot)\n    c = math.sin(rot - sy) / math.cos(sy)\n    d = -math.sin(rot - sy) * math.tan(sx) / math.cos(sy) + math.cos(rot)\n\n    if inverted:\n        # Inverted rotation matrix with scale and shear\n        # det([[a, b], [c, d]]) == 1, since det(rotation) = 1 and det(shear) = 1\n        matrix = [d, -b, 0.0, -c, a, 0.0]\n        matrix = [x / scale for x in matrix]\n        # Apply inverse of translation and of center translation: RSS^-1 * C^-1 * T^-1\n        matrix[2] += matrix[0] * (-cx - tx) + matrix[1] * (-cy - ty)\n        matrix[5] += matrix[3] * (-cx - tx) + matrix[4] * (-cy - ty)\n        # Apply center translation: C * RSS^-1 * C^-1 * T^-1\n        matrix[2] += cx\n        matrix[5] += cy\n    else:\n        matrix = [a, b, 0.0, c, d, 0.0]\n        matrix = [x * scale for x in matrix]\n        # Apply inverse of center translation: RSS * C^-1\n        matrix[2] += matrix[0] * (-cx) + matrix[1] * (-cy)\n        matrix[5] += matrix[3] * (-cx) + matrix[4] * (-cy)\n        # Apply translation and center : T * C * RSS * C^-1\n        matrix[2] += cx + tx\n        matrix[5] += cy + ty\n\n    return matrix\n\n\ndef affine(\n    img: Tensor,\n    angle: float,\n    translate: List[int],\n    scale: float,\n    shear: List[float],\n    interpolation: InterpolationMode = 
InterpolationMode.NEAREST,\n    fill: Optional[List[float]] = None,\n    resample: Optional[int] = None,\n    fillcolor: Optional[List[float]] = None,\n    center: Optional[List[int]] = None,\n) -> Tensor:\n    \"\"\"Apply affine transformation on the image keeping image center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... means an arbitrary number of leading dimensions.\n\n    Args:\n        img (PIL Image or Tensor): image to transform.\n        angle (number): rotation angle in degrees between -180 and 180, clockwise direction.\n        translate (sequence of integers): horizontal and vertical translations (post-rotation translation)\n        scale (float): overall scale\n        shear (float or sequence): shear angle value in degrees between -180 to 180, clockwise direction.\n            If a sequence is specified, the first value corresponds to a shear parallel to the x axis, while\n            the second value corresponds to a shear parallel to the y axis.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. Please use InterpolationMode enum.\n        fill (sequence or number, optional): Pixel fill value for the area outside the transformed\n            image. If given a number, the value is used for all bands respectively.\n\n            .. note::\n                In torchscript mode single int/float value is not supported, please use a sequence\n                of length 1: ``[value, ]``.\n        fillcolor (sequence or number, optional):\n            .. 
warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation. Origin is the upper left corner.\n            Default is the center of the image.\n\n    Returns:\n        PIL Image or Tensor: Transformed image.\n    \"\"\"\n\n    # Backward compatibility with integer value\n    if isinstance(interpolation, int):\n        warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                      \"Please use InterpolationMode enum.\")\n        interpolation = _interpolation_modes_from_int(interpolation)\n\n    if fillcolor is not None:\n        warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                      \"Please use 'fill' instead.\")\n        fill = fillcolor\n\n    if not isinstance(angle, (int, float)):\n        raise TypeError(\"Argument angle should be int or float\")\n\n    if not isinstance(translate, (list, tuple)):\n        raise TypeError(\"Argument translate should be a sequence\")\n\n    if len(translate) != 2:\n        raise ValueError(\"Argument translate should be a sequence of length 2\")\n\n    if scale <= 0.0:\n        raise ValueError(\"Argument scale should be positive\")\n\n    if not isinstance(shear, (numbers.Number, (list, tuple))):\n        raise TypeError(\"Shear should be either a single value or a sequence of two values\")\n\n    if not isinstance(interpolation, InterpolationMode):\n        raise TypeError(\"Argument interpolation should be a InterpolationMode\")\n\n    if isinstance(angle, int):\n        angle = float(angle)\n\n    if isinstance(translate, tuple):\n        translate = list(translate)\n\n    if isinstance(shear, numbers.Number):\n        shear = [shear, 0.0]\n\n    if isinstance(shear, tuple):\n        shear = list(shear)\n\n    if len(shear) == 1:\n        shear = [shear[0], shear[0]]\n\n    if len(shear) != 2:\n        raise ValueError(f\"Shear should be a sequence containing two values. 
Got {shear}\")\n\n    if center is not None and not isinstance(center, (list, tuple)):\n        raise TypeError(\"Argument center should be a sequence\")\n    center_f = [0.0, 0.0]\n    if center is not None:\n        height, width = img.shape[-2], img.shape[-1]  # [..., H, W]: negative indices also handle batched inputs\n        # Center values should be in pixel coordinates but translated such that (0, 0) corresponds to image center.\n        center_f = [1.0 * (c - s * 0.5) for c, s in zip(center, [width, height])]\n\n    translate_f = [1.0 * t for t in translate]\n    matrix = _get_inverse_affine_matrix(center_f, angle, translate_f, scale, shear)\n    return affine_impl(img, matrix=matrix, interpolation=interpolation.value, fill=fill)\n\n\ndef _interpolation_modes_from_int(i: int) -> InterpolationMode:\n    inverse_modes_mapping = {\n        0: InterpolationMode.NEAREST,\n        1: InterpolationMode.LANCZOS,\n        2: InterpolationMode.BILINEAR,\n        3: InterpolationMode.BICUBIC,\n        4: InterpolationMode.BOX,\n        5: InterpolationMode.HAMMING,\n    }\n    return inverse_modes_mapping[i]\n\n\ndef _check_sequence_input(x, name, req_sizes):\n    msg = req_sizes[0] if len(req_sizes) < 2 else \" or \".join([str(s) for s in req_sizes])\n    if not isinstance(x, Sequence):\n        raise TypeError(f\"{name} should be a sequence of length {msg}.\")\n    if len(x) not in req_sizes:\n        raise ValueError(f\"{name} should be a sequence of length {msg}.\")\n\n\ndef _setup_angle(x, name, req_sizes=(2, )):\n    if isinstance(x, numbers.Number):\n        if x < 0:\n            raise ValueError(f\"If {name} is a single number, it must be positive.\")\n        x = [-x, x]\n    else:\n        _check_sequence_input(x, name, req_sizes)\n\n    return [float(d) for d in x]\n\n\nclass RandomAffine(nn.Layer):\n    \"\"\"Random affine transformation of the image keeping center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... 
means an arbitrary number of leading dimensions.\n\n    Args:\n        degrees (sequence or number): Range of degrees to select from.\n            If degrees is a number instead of sequence like (min, max), the range of degrees\n            will be (-degrees, +degrees). Set to 0 to deactivate rotations.\n        translate (tuple, optional): tuple of maximum absolute fraction for horizontal\n            and vertical translations. For example translate=(a, b), then horizontal shift\n            is randomly sampled in the range -img_width * a < dx < img_width * a and vertical shift is\n            randomly sampled in the range -img_height * b < dy < img_height * b. Will not translate by default.\n        scale (tuple, optional): scaling factor interval, e.g (a, b), then scale is\n            randomly sampled from the range a <= scale <= b. Will keep original scale by default.\n        shear (sequence or number, optional): Range of degrees to select from.\n            If shear is a number, a shear parallel to the x axis in the range (-shear, +shear)\n            will be applied. Else if shear is a sequence of 2 values a shear parallel to the x axis in the\n            range (shear[0], shear[1]) will be applied. Else if shear is a sequence of 4 values,\n            a x-axis shear in (shear[0], shear[1]) and y-axis shear in (shear[2], shear[3]) will be applied.\n            Will not apply shear by default.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. 
Please use InterpolationMode enum.\n        fill (sequence or number): Pixel fill value for the area outside the transformed\n            image. Default is ``0``. If given a number, the value is used for all bands respectively.\n        fillcolor (sequence or number, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation, (x, y). Origin is the upper left corner.\n            Default is the center of the image.\n\n    .. _filters: https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters\n\n    \"\"\"\n\n    def __init__(\n        self,\n        degrees,\n        translate=None,\n        scale=None,\n        shear=None,\n        interpolation=InterpolationMode.NEAREST,\n        fill=0,\n        fillcolor=None,\n        resample=None,\n        center=None,\n    ):\n        super(RandomAffine, self).__init__()\n        if resample is not None:\n            warnings.warn(\"The parameter 'resample' is deprecated since 0.12 and will be removed in 0.14. \"\n                          \"Please use 'interpolation' instead.\")\n            interpolation = _interpolation_modes_from_int(resample)\n\n        # Backward compatibility with integer value\n        if isinstance(interpolation, int):\n            warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                          \"Please use InterpolationMode enum.\")\n            interpolation = _interpolation_modes_from_int(interpolation)\n\n        if fillcolor is not None:\n            warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                          \"Please use 'fill' instead.\")\n            fill = fillcolor\n\n        self.degrees = _setup_angle(degrees, name=\"degrees\", req_sizes=(2, ))\n\n        if translate is not None:\n            _check_sequence_input(translate, \"translate\", req_sizes=(2, ))\n            for t in translate:\n                if not (0.0 <= t <= 1.0):\n                    raise ValueError(\"translation values should be between 0 and 1\")\n        self.translate = translate\n\n        if scale is not None:\n            _check_sequence_input(scale, \"scale\", req_sizes=(2, ))\n            for s in scale:\n                if s <= 0:\n                    raise ValueError(\"scale values should be positive\")\n        self.scale = scale\n\n        if shear is not None:\n            self.shear = _setup_angle(shear, name=\"shear\", req_sizes=(2, 4))\n        else:\n            self.shear = shear\n\n        self.resample = self.interpolation = interpolation\n\n        if fill is None:\n            fill = 0\n        elif not isinstance(fill, (Sequence, numbers.Number)):\n            raise TypeError(\"Fill should be either a sequence or a number.\")\n\n        self.fillcolor = self.fill = fill\n\n        if center is not None:\n            _check_sequence_input(center, \"center\", req_sizes=(2, ))\n\n        self.center = center\n\n    @staticmethod\n    def get_params(\n        degrees: List[float],\n        translate: Optional[List[float]],\n        scale_ranges: Optional[List[float]],\n        shears: Optional[List[float]],\n        img_size: List[int],\n    ) -> Tuple[float, Tuple[int, int], float, Tuple[float, float]]:\n        \"\"\"Get parameters for affine transformation\n\n        Returns:\n            params to be passed to the affine transformation\n        \"\"\"\n        angle = float(paddle.empty([1]).uniform_(float(degrees[0]), float(degrees[1])))\n        if translate is not None:\n            max_dx = float(translate[0] * img_size[0])\n            
max_dy = float(translate[1] * img_size[1])\n            tx = int(float(paddle.empty([1]).uniform_(-max_dx, max_dx)))\n            ty = int(float(paddle.empty([1]).uniform_(-max_dy, max_dy)))\n            translations = (tx, ty)\n        else:\n            translations = (0, 0)\n\n        if scale_ranges is not None:\n            scale = float(paddle.empty([1]).uniform_(scale_ranges[0], scale_ranges[1]))\n        else:\n            scale = 1.0\n\n        shear_x = shear_y = 0.0\n        if shears is not None:\n            shear_x = float(paddle.empty([1]).uniform_(shears[0], shears[1]))\n            if len(shears) == 4:\n                shear_y = float(paddle.empty([1]).uniform_(shears[2], shears[3]))\n\n        shear = (shear_x, shear_y)\n\n        return angle, translations, scale, shear\n\n    def forward(self, img):\n        fill = self.fill\n        channels, height, width = img.shape[1], img.shape[2], img.shape[3]\n        if isinstance(fill, (int, float)):\n            fill = [float(fill)] * channels\n        else:\n            fill = [float(f) for f in fill]\n\n        img_size = [width, height]  # flip for keeping BC on get_params call\n\n        ret = self.get_params(self.degrees, self.translate, self.scale, self.shear, img_size)\n\n        return affine(img, *ret, interpolation=self.interpolation, fill=fill, center=self.center)\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(degrees={self.degrees}\"\n        s += f\", translate={self.translate}\" if self.translate is not None else \"\"\n        s += f\", scale={self.scale}\" if self.scale is not None else \"\"\n        s += f\", shear={self.shear}\" if self.shear is not None else \"\"\n        s += f\", interpolation={self.interpolation.value}\" if self.interpolation != InterpolationMode.NEAREST else \"\"\n        s += f\", fill={self.fill}\" if self.fill != 0 else \"\"\n        s += f\", center={self.center}\" if self.center is not None else \"\"\n        s += \")\"\n\n        
return s\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/model/unet.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/unet.py\n'''\nimport math\nfrom abc import abstractmethod\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .nn import avg_pool_nd\nfrom .nn import checkpoint\nfrom .nn import conv_nd\nfrom .nn import linear\nfrom .nn import normalization\nfrom .nn import SiLU\nfrom .nn import timestep_embedding\nfrom .nn import zero_module\n\n\nclass AttentionPool2d(nn.Layer):\n    \"\"\"\n    Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py\n    \"\"\"\n\n    def __init__(\n        self,\n        spacial_dim: int,\n        embed_dim: int,\n        num_heads_channels: int,\n        output_dim: int = None,\n    ):\n        super().__init__()\n        # self.positional_embedding = nn.Parameter(\n        #     th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5\n        # )\n        # paddle.randn takes a shape list, and create_parameter takes a shape plus an\n        # initializer rather than a tensor, so the torch idiom above is ported as:\n        positional_embedding = self.create_parameter(\n            [embed_dim, spacial_dim**2 + 1],\n            default_initializer=nn.initializer.Assign(paddle.randn([embed_dim, spacial_dim**2 + 1]) / embed_dim**0.5))\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)\n        self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)\n        self.num_heads = embed_dim // num_heads_channels\n        self.attention = QKVAttention(self.num_heads)\n\n    def forward(self, x):\n        b, c, *_spatial = x.shape\n        # x = x.reshape(b, c, -1)  # NC(HW)\n        x = paddle.reshape(x, [b, c, -1])\n        x = paddle.concat([x.mean(axis=-1, keepdim=True), x], axis=-1)  # NC(HW+1)\n        x = x + paddle.cast(self.positional_embedding[None, :, :], x.dtype)  # NC(HW+1)\n        x = self.qkv_proj(x)\n        x = self.attention(x)\n        x = self.c_proj(x)\n        return x[:, :, 0]\n\n\nclass TimestepBlock(nn.Layer):\n    \"\"\"\n    Any module where forward() takes timestep embeddings as a second argument.\n 
   \"\"\"\n\n    @abstractmethod\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the module to `x` given `emb` timestep embeddings.\n        \"\"\"\n\n\nclass TimestepEmbedSequential(nn.Sequential, TimestepBlock):\n    \"\"\"\n    A sequential module that passes timestep embeddings to the children that\n    support it as an extra input.\n    \"\"\"\n\n    def forward(self, x, emb):\n        for layer in self:\n            if isinstance(layer, TimestepBlock):\n                x = layer(x, emb)\n            else:\n                x = layer(x)\n        return x\n\n\nclass Upsample(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        if use_conv:\n            self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=1)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.dims == 3:\n            x = F.interpolate(x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode=\"nearest\")\n        else:\n            x = F.interpolate(x, scale_factor=2, mode=\"nearest\")\n        if self.use_conv:\n            x = self.conv(x)\n        return x\n\n\nclass Downsample(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        stride = 2 if dims != 3 else (1, 2, 2)\n        if use_conv:\n            self.op = conv_nd(dims, self.channels, self.out_channels, 3, stride=stride, padding=1)\n        else:\n            assert self.channels == self.out_channels\n            self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        return self.op(x)\n\n\nclass ResBlock(TimestepBlock):\n    \"\"\"\n    A residual block that can optionally change the number of channels.\n\n    :param channels: the number of input channels.\n    :param emb_channels: the number of timestep embedding channels.\n    :param dropout: the rate of dropout.\n    :param out_channels: if specified, the number of out channels.\n    :param use_conv: if True and out_channels is specified, use a spatial\n        convolution instead of a smaller 1x1 convolution to change the\n        channels in the skip connection.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param use_checkpoint: if True, use gradient checkpointing on this module.\n    :param up: if True, use this block for upsampling.\n    :param down: if True, use this block for downsampling.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        emb_channels,\n        dropout,\n        out_channels=None,\n        use_conv=False,\n        use_scale_shift_norm=False,\n        dims=2,\n        use_checkpoint=False,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        self.emb_channels = emb_channels\n        self.dropout = dropout\n        
self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_checkpoint = use_checkpoint\n        self.use_scale_shift_norm = use_scale_shift_norm\n\n        self.in_layers = nn.Sequential(\n            normalization(channels),\n            SiLU(),\n            conv_nd(dims, channels, self.out_channels, 3, padding=1),\n        )\n\n        self.updown = up or down\n\n        if up:\n            self.h_upd = Upsample(channels, False, dims)\n            self.x_upd = Upsample(channels, False, dims)\n        elif down:\n            self.h_upd = Downsample(channels, False, dims)\n            self.x_upd = Downsample(channels, False, dims)\n        else:\n            self.h_upd = self.x_upd = nn.Identity()\n\n        self.emb_layers = nn.Sequential(\n            SiLU(),\n            linear(\n                emb_channels,\n                2 * self.out_channels if use_scale_shift_norm else self.out_channels,\n            ),\n        )\n        self.out_layers = nn.Sequential(\n            normalization(self.out_channels),\n            SiLU(),\n            nn.Dropout(p=dropout),\n            zero_module(conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)),\n        )\n\n        if self.out_channels == channels:\n            self.skip_connection = nn.Identity()\n        elif use_conv:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 3, padding=1)\n        else:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)\n\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the block to a Tensor, conditioned on a timestep embedding.\n\n        :param x: an [N x C x ...] Tensor of features.\n        :param emb: an [N x emb_channels] Tensor of timestep embeddings.\n        :return: an [N x C x ...] 
Tensor of outputs.\n        \"\"\"\n        return checkpoint(self._forward, (x, emb), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x, emb):\n        if self.updown:\n            in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]\n            h = in_rest(x)\n            h = self.h_upd(h)\n            x = self.x_upd(x)\n            h = in_conv(h)\n        else:\n            h = self.in_layers(x)\n        emb_out = self.emb_layers(emb)\n        emb_out = paddle.cast(emb_out, h.dtype)\n        while len(emb_out.shape) < len(h.shape):\n            emb_out = emb_out[..., None]\n        if self.use_scale_shift_norm:\n            out_norm, out_rest = self.out_layers[0], self.out_layers[1:]\n            scale, shift = paddle.chunk(emb_out, 2, axis=1)\n            h = out_norm(h) * (1 + scale) + shift\n            h = out_rest(h)\n        else:\n            h = h + emb_out\n            h = self.out_layers(h)\n        return self.skip_connection(x) + h\n\n\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=-1,\n        use_checkpoint=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels == -1:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n        self.use_checkpoint = use_checkpoint\n        self.norm = normalization(channels)\n        self.qkv = 
conv_nd(1, channels, channels * 3, 1)\n        if use_new_attention_order:\n            # split qkv before split heads\n            self.attention = QKVAttention(self.num_heads)\n        else:\n            # split heads before split qkv\n            self.attention = QKVAttentionLegacy(self.num_heads)\n\n        self.proj_out = zero_module(conv_nd(1, channels, channels, 1))\n\n    def forward(self, x):\n        return checkpoint(self._forward, (x, ), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x):\n        b, c, *spatial = x.shape\n        # x = x.reshape(b, c, -1)\n        x = paddle.reshape(x, [b, c, -1])\n        qkv = self.qkv(self.norm(x))\n        h = self.attention(qkv)\n        h = self.proj_out(h)\n        # return (x + h).reshape(b, c, *spatial)\n        return paddle.reshape(x + h, [b, c, *spatial])\n\n\ndef count_flops_attn(model, _x, y):\n    \"\"\"\n    A counter for the `thop` package to count the operations in an\n    attention operation.\n    Meant to be used like:\n        macs, params = thop.profile(\n            model,\n            inputs=(inputs, timestamps),\n            custom_ops={QKVAttention: QKVAttention.count_flops},\n        )\n    \"\"\"\n    b, c, *spatial = y[0].shape\n    num_spatial = int(np.prod(spatial))\n    # We perform two matmuls with the same number of ops.\n    # The first computes the weight matrix, the second computes\n    # the combination of the value vectors.\n    matmul_ops = 2 * b * (num_spatial**2) * c\n    model.total_ops += paddle.to_tensor([matmul_ops], dtype='float64')\n\n\nclass QKVAttentionLegacy(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention. 
Matches legacy QKVAttention + input/output heads shaping\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)\n        q, k, v = paddle.reshape(qkv, [bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass QKVAttention(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention and splits in a different order.\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.chunk(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\n            \"bct,bcs->bts\",\n            (q * 
scale).reshape([bs * self.n_heads, ch, length]),\n            (k * scale).reshape([bs * self.n_heads, ch, length]),\n        )  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v.reshape([bs * self.n_heads, ch, length]))\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass UNetModel(nn.Layer):\n    \"\"\"\n    The full UNet model with attention and timestep embedding.\n\n    :param in_channels: channels in the input Tensor.\n    :param model_channels: base channel count for the model.\n    :param out_channels: channels in the output Tensor.\n    :param num_res_blocks: number of residual blocks per downsample.\n    :param attention_resolutions: a collection of downsample rates at which\n        attention will take place. 
May be a set, list, or tuple.\n        For example, if this contains 4, then at 4x downsampling, attention\n        will be used.\n    :param dropout: the dropout probability.\n    :param channel_mult: channel multiplier for each level of the UNet.\n    :param conv_resample: if True, use learned convolutions for upsampling and\n        downsampling.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param num_classes: if specified (as an int), then this model will be\n        class-conditional with `num_classes` classes.\n    :param use_checkpoint: use gradient checkpointing to reduce memory usage.\n    :param num_heads: the number of attention heads in each attention layer.\n    :param num_heads_channels: if specified, ignore num_heads and instead use\n                               a fixed channel width per attention head.\n    :param num_heads_upsample: works with num_heads to set a different number\n                               of heads for upsampling. Deprecated.\n    :param use_scale_shift_norm: use a FiLM-like conditioning mechanism.\n    :param resblock_updown: use residual blocks for up/downsampling.\n    :param use_new_attention_order: use a different attention pattern for potentially\n                                    increased efficiency.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        num_classes=None,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        
self.image_size = image_size\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.num_classes = num_classes\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        if self.num_classes is not None:\n            self.label_emb = nn.Embedding(num_classes, time_embed_dim)\n\n        ch = input_ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n           
                 ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n                self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n          
      dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n\n        self.output_blocks = nn.LayerList([])\n        for level, mult in list(enumerate(channel_mult))[::-1]:\n            for i in range(num_res_blocks + 1):\n                ich = input_block_chans.pop()\n                layers = [\n                    ResBlock(\n                        ch + ich,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(model_channels * mult),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(model_channels * mult)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads_upsample,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                if level and i == num_res_blocks:\n                    out_ch = ch\n                    layers.append(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            up=True,\n                        ) if resblock_updown else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch))\n                    ds 
//= 2\n                self.output_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n\n        self.out = nn.Sequential(\n            normalization(ch),\n            SiLU(),\n            zero_module(conv_nd(dims, input_ch, out_channels, 3, padding=1)),\n        )\n\n    def forward(self, x, timesteps, y=None):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :param y: an [N] Tensor of labels, if class-conditional.\n        :return: an [N x C x ...] Tensor of outputs.\n        \"\"\"\n        assert (y is not None) == (self.num_classes\n                                   is not None), \"must specify y if and only if the model is class-conditional\"\n\n        hs = []\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n        if self.num_classes is not None:\n            # paddle.Tensor.shape is a Python list, so compare against a list\n            assert y.shape == [x.shape[0]]\n            emb = emb + self.label_emb(y)\n\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            hs.append(h)\n        h = self.middle_block(h, emb)\n        for module in self.output_blocks:\n            h = paddle.concat([h, hs.pop()], axis=1)\n            h = module(h, emb)\n        # h = paddle.cast(h, x.dtype)\n        return self.out(h)\n\n\nclass SuperResModel(UNetModel):\n    \"\"\"\n    A UNetModel that performs super-resolution.\n\n    Expects an extra kwarg `low_res` to condition on a low-resolution image.\n    \"\"\"\n\n    def __init__(self, image_size, in_channels, *args, **kwargs):\n        super().__init__(image_size, in_channels * 2, *args, **kwargs)\n\n    def forward(self, x, timesteps, low_res=None, **kwargs):\n        _, _, new_height, new_width = x.shape\n        upsampled = F.interpolate(low_res, (new_height, new_width), mode=\"bilinear\")\n        x = paddle.concat([x, 
upsampled], axis=1)\n        return super().forward(x, timesteps, **kwargs)\n\n\nclass EncoderUNetModel(nn.Layer):\n    \"\"\"\n    The half UNet model with attention and timestep embedding.\n\n    For usage, see UNet.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n        pool=\"adaptive\",\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        
ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n               
 self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n        self.pool = pool\n        if pool == \"adaptive\":\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                nn.AdaptiveAvgPool2D((1, 1)),\n                zero_module(conv_nd(dims, ch, out_channels, 1)),\n                nn.Flatten(),\n            )\n        elif pool == \"attention\":\n            assert num_head_channels != -1\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                AttentionPool2d((image_size // ds), ch, num_head_channels, out_channels),\n            )\n        elif pool == \"spatial\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                nn.ReLU(),\n                nn.Linear(2048, self.out_channels),\n            )\n        elif pool == \"spatial_v2\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                normalization(2048),\n                SiLU(),\n                nn.Linear(2048, self.out_channels),\n            
)\n        else:\n            raise NotImplementedError(f\"Unexpected {pool} pooling\")\n\n    def forward(self, x, timesteps):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :return: an [N x K] Tensor of outputs.\n        \"\"\"\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n\n        results = []\n        # h = x.type(self.dtype)\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            if self.pool.startswith(\"spatial\"):\n                # results.append(h.type(x.dtype).mean(axis=(2, 3)))\n                results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n        h = self.middle_block(h, emb)\n        if self.pool.startswith(\"spatial\"):\n            results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n            h = paddle.concat(results, axis=-1)\n            return self.out(h)\n        else:\n            # h = h.type(x.dtype)\n            h = paddle.cast(h, x.dtype)\n            return self.out(h)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/resources/default.yml",
    "content": "text_prompts:\n  - A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\n\ninit_image:\n\nwidth_height: [ 1280, 768]\n\nskip_steps: 10\nsteps: 250\n\ncut_ic_pow: 1\ninit_scale: 1000\nclip_guidance_scale: 5000\n\ntv_scale: 0\nrange_scale: 150\nsat_scale: 0\ncutn_batches: 4\n\ndiffusion_model: 512x512_diffusion_uncond_finetune_008100\nuse_secondary_model: True\ndiffusion_sampling_mode: ddim\n\nperlin_init: False\nperlin_mode: mixed\nseed: 445467575\neta: 0.8\nclamp_grad: True\nclamp_max: 0.05\n\nrandomize_class: True\nclip_denoised: False\nfuzzy_prompt: False\nrand_mag: 0.05\n\ncut_overview: \"[12]*400+[4]*600\"\ncut_innercut: \"[4]*400+[12]*600\"\ncut_icgray_p: \"[0.2]*400+[0]*600\"\n\ndisplay_rate: 10\nn_batches: 1\nbatch_size: 1\nbatch_name: ''\nclip_models:\n  - VIT\n  - RN50\n  - RN101\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/resources/docstrings.yml",
    "content": "text_prompts: |\n  Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n  Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments.\n  Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\ninit_image: |\n  Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here.\n  If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\nwidth_height: |\n  Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n\nskip_steps: |\n  Consider the chart shown here.  
Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.\n  As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.\n  The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.\n  If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.\n  Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.\n  Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image.\n  However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n\nsteps: |\n  When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.\n  Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.\n  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n\ncut_ic_pow: |\n  This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ninit_scale: |\n  This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\nclip_guidance_scale: |\n  CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS.\n  Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500.\n  Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\ntv_scale: |\n  Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\nrange_scale: |\n  Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n\nsat_scale: |\n  Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\ncutn_batches: |\n  Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.\n  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage.\n  At the default settings, DD is scheduled to do 16 cuts per timestep.  
If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep.\n  However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.\n  So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n\ndiffusion_model: Diffusion_model of choice.\n\nuse_secondary_model: |\n  Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  I suggest you experiment with this.\n\ndiffusion_sampling_mode: |\n  Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n\nperlin_init: |\n  Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps).\n  Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n\nperlin_mode: |\n  sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\nseed: |\n  Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar.\n  After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\neta: |\n  eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. 
As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results.\n  The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\nclamp_grad: |\n  As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\nclamp_max: |\n  Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n\nrandomize_class:\nclip_denoised: False\nfuzzy_prompt: |\n  Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\nrand_mag: |\n  Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n\ncut_overview: The schedule of overview cuts\ncut_innercut: The schedule of inner cuts\ncut_icgray_p: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ndisplay_rate: |\n  During a diffusion run, you can monitor the progress of each image being created with this variable.  
If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\nn_batches: |\n  This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\nbatch_name: |\n  The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\nclip_models: |\n  CLIP Model selectors. ViT-B/32, ViT-B/16, ViT-L/14, RN101, RN50, RN50x4, RN50x16, RN50x64.\n  These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.\n  You can mix in multiple models as well for different results.  However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.\n  The rough order of speed/mem usage is (smallest/fastest to largest/slowest):\n  ViT-B/32\n  RN50\n  RN101\n  ViT-B/16\n  RN50x4\n  RN50x16\n  RN50x64\n  ViT-L/14\n  For RN50x64 & ViT-L/14 you may need to use fewer cuts, depending on your VRAM.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_rn50/reverse_diffusion/runner.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/runner.py\n'''\nimport gc\nimport os\nimport random\nfrom threading import Thread\n\nimport disco_diffusion_clip_rn50.clip.clip as clip\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport paddle_lpips as lpips\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom ipywidgets import Output\nfrom PIL import Image\n\nfrom .helper import logger\nfrom .helper import parse_prompt\nfrom .model.losses import range_loss\nfrom .model.losses import spherical_dist_loss\nfrom .model.losses import tv_loss\nfrom .model.make_cutouts import MakeCutoutsDango\nfrom .model.sec_diff import alpha_sigma_to_t\nfrom .model.sec_diff import SecondaryDiffusionImageNet2\nfrom .model.transforms import Normalize\n\n\ndef do_run(args, models) -> 'DocumentArray':\n    logger.info('preparing models...')\n    model, diffusion, clip_models, secondary_model = models\n    normalize = Normalize(\n        mean=[0.48145466, 0.4578275, 0.40821073],\n        std=[0.26862954, 0.26130258, 0.27577711],\n    )\n    lpips_model = lpips.LPIPS(net='vgg')\n    for parameter in lpips_model.parameters():\n        parameter.stop_gradient = True\n    side_x = (args.width_height[0] // 64) * 64\n    side_y = (args.width_height[1] // 64) * 64\n    cut_overview = eval(args.cut_overview)\n    cut_innercut = eval(args.cut_innercut)\n    cut_icgray_p = eval(args.cut_icgray_p)\n\n    from .model.perlin_noises import create_perlin_noise, regen_perlin\n\n    seed = args.seed\n\n    skip_steps = args.skip_steps\n\n    loss_values = []\n\n    if seed is not None:\n        np.random.seed(seed)\n        random.seed(seed)\n        paddle.seed(seed)\n\n    model_stats = []\n    for clip_model in clip_models:\n        model_stat = {\n            'clip_model': None,\n            'target_embeds': [],\n            'make_cutouts': 
None,\n            'weights': [],\n        }\n        model_stat['clip_model'] = clip_model\n\n        if isinstance(args.text_prompts, str):\n            args.text_prompts = [args.text_prompts]\n\n        for prompt in args.text_prompts:\n            txt, weight = parse_prompt(prompt)\n            # tokenize the parsed text so any ':weight' suffix is not fed to CLIP\n            txt = clip_model.encode_text(clip.tokenize(txt))\n            if args.fuzzy_prompt:\n                for i in range(25):\n                    model_stat['target_embeds'].append((txt + paddle.randn(txt.shape) * args.rand_mag).clip(0, 1))\n                    model_stat['weights'].append(weight)\n            else:\n                model_stat['target_embeds'].append(txt)\n                model_stat['weights'].append(weight)\n\n        model_stat['target_embeds'] = paddle.concat(model_stat['target_embeds'])\n        model_stat['weights'] = paddle.to_tensor(model_stat['weights'])\n        if model_stat['weights'].sum().abs() < 1e-3:\n            raise RuntimeError('The weights must not sum to 0.')\n        model_stat['weights'] /= model_stat['weights'].sum().abs()\n        model_stats.append(model_stat)\n\n    init = None\n    if args.init_image:\n        d = Document(uri=args.init_image).load_uri_to_image_tensor(side_x, side_y)\n        init = T.to_tensor(d.tensor).unsqueeze(0) * 2 - 1\n\n    if args.perlin_init:\n        if args.perlin_mode == 'color':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n        elif args.perlin_mode == 'gray':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        else:\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i 
* 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        init = (T.to_tensor(init).add(T.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n        del init2\n\n    cur_t = None\n\n    def cond_fn(x, t, y=None):\n        x_is_NaN = False\n        n = x.shape[0]\n        if secondary_model:\n            alpha = paddle.to_tensor(diffusion.sqrt_alphas_cumprod[cur_t], dtype='float32')\n            sigma = paddle.to_tensor(diffusion.sqrt_one_minus_alphas_cumprod[cur_t], dtype='float32')\n            cosine_t = alpha_sigma_to_t(alpha, sigma)\n            x = paddle.to_tensor(x.detach(), dtype='float32')\n            x.stop_gradient = False\n            cosine_t = paddle.tile(paddle.to_tensor(cosine_t.detach().cpu().numpy()), [n])\n            cosine_t.stop_gradient = False\n            out = secondary_model(x, cosine_t).pred\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        else:\n            t = paddle.ones([n], dtype='int64') * cur_t\n            out = diffusion.p_mean_variance(model, x, t, clip_denoised=False, model_kwargs={'y': y})\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out['pred_xstart'] * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        for model_stat in model_stats:\n            for i in range(args.cutn_batches):\n                t_int = (int(t.item()) + 1)  # errors on last step without +1, need to find source\n                # when using SLIP Base model the dimensions need to be hard coded to avoid AttributeError: 'VisionTransformer' object has no attribute 'input_resolution'\n                try:\n                    input_resolution = 
model_stat['clip_model'].visual.input_resolution\n                except AttributeError:\n                    input_resolution = 224\n\n                cuts = MakeCutoutsDango(\n                    input_resolution,\n                    Overview=cut_overview[1000 - t_int],\n                    InnerCrop=cut_innercut[1000 - t_int],\n                    IC_Size_Pow=args.cut_ic_pow,\n                    IC_Grey_P=cut_icgray_p[1000 - t_int],\n                )\n                clip_in = normalize(cuts(x_in.add(paddle.to_tensor(1.0)).divide(paddle.to_tensor(2.0))))\n                image_embeds = (model_stat['clip_model'].encode_image(clip_in))\n\n                dists = spherical_dist_loss(\n                    image_embeds.unsqueeze(1),\n                    model_stat['target_embeds'].unsqueeze(0),\n                )\n\n                dists = dists.reshape([\n                    cut_overview[1000 - t_int] + cut_innercut[1000 - t_int],\n                    n,\n                    -1,\n                ])\n                losses = dists.multiply(model_stat['weights']).sum(2).mean(0)\n                loss_values.append(losses.sum().item())  # log loss, probably shouldn't do per cutn_batch\n\n                x_in_grad += (paddle.grad(losses.sum() * args.clip_guidance_scale, x_in)[0] / args.cutn_batches)\n        tv_losses = tv_loss(x_in)\n        range_losses = range_loss(x_in)\n        sat_losses = paddle.abs(x_in - x_in.clip(min=-1, max=1)).mean()\n        loss = (tv_losses.sum() * args.tv_scale + range_losses.sum() * args.range_scale +\n                sat_losses.sum() * args.sat_scale)\n        if init is not None and args.init_scale:\n            init_losses = lpips_model(x_in, init)\n            loss = loss + init_losses.sum() * args.init_scale\n        x_in_grad += paddle.grad(loss, x_in)[0]\n        if not paddle.isnan(x_in_grad).any():\n            grad = -paddle.grad(x_in_d, x, x_in_grad)[0]\n        else:\n            x_is_NaN = True\n            grad = paddle.zeros_like(x)\n 
       if args.clamp_grad and not x_is_NaN:\n            magnitude = grad.square().mean().sqrt()\n            return (grad * magnitude.clip(max=args.clamp_max) / magnitude)\n        return grad\n\n    if args.diffusion_sampling_mode == 'ddim':\n        sample_fn = diffusion.ddim_sample_loop_progressive\n    else:\n        sample_fn = diffusion.plms_sample_loop_progressive\n\n    logger.info('creating artwork...')\n\n    image_display = Output()\n    da_batches = DocumentArray()\n\n    for _nb in range(args.n_batches):\n        display.clear_output(wait=True)\n        display.display(args.name_docarray, image_display)\n        gc.collect()\n        paddle.device.cuda.empty_cache()\n\n        d = Document(tags=vars(args))\n        da_batches.append(d)\n\n        cur_t = diffusion.num_timesteps - skip_steps - 1\n\n        if args.perlin_init:\n            init = regen_perlin(args.perlin_mode, side_y, side_x, args.batch_size)\n\n        if args.diffusion_sampling_mode == 'ddim':\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                eta=args.eta,\n            )\n        else:\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                order=2,\n            )\n\n        threads = []\n        for j, sample in enumerate(samples):\n            cur_t -= 1\n            with 
image_display:\n                if j % args.display_rate == 0 or cur_t == -1:\n                    for _, image in enumerate(sample['pred_xstart']):\n                        image = (image + 1) / 2\n                        image = image.clip(0, 1).squeeze().transpose([1, 2, 0]).numpy() * 255\n                        image = np.uint8(image)\n                        image = Image.fromarray(image)\n\n                        image.save(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb)))\n                        c = Document(tags={'cur_t': cur_t})\n                        c.load_pil_image_to_datauri(image)\n                        d.chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(display.Image(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb))))\n                        d.chunks.plot_image_sprites(os.path.join(args.output_dir,\n                                                                 f'{args.name_docarray}-progress-{_nb}.png'),\n                                                    show_index=True)\n                        t = Thread(\n                            target=_silent_push,\n                            args=(\n                                da_batches,\n                                args.name_docarray,\n                            ),\n                        )\n                        threads.append(t)\n                        t.start()\n\n                    if cur_t == -1:\n                        d.load_pil_image_to_datauri(image)\n\n        for t in threads:\n            t.join()\n    display.clear_output(wait=True)\n    logger.info(f'done! {args.name_docarray}')\n    da_batches.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    return da_batches\n\n\ndef _silent_push(da_batches: DocumentArray, name: str) -> None:\n    try:\n        da_batches.push(name)\n    except Exception as ex:\n        logger.debug(f'push failed: {ex}')\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/README.md",
    "content": "# disco_diffusion_clip_vitb32\n\n|模型名称|disco_diffusion_clip_vitb32|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|dd+clip ViTB32|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|3.1GB|\n|最新更新日期|2022-08-02|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/182298446-7feb530b-62cc-4e3f-a693-249ec8383daa.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/182298453-9a8a8336-66e6-4adb-a46f-7a0fa211b467.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\ndisco_diffusion_clip_vitb32 是一个文图生成模型，可以通过输入一段文字来生成符合该句子语义的图像。该模型由两部分组成，一部分是扩散模型，是一种生成模型，可以从噪声输入中重建出原始图像。另一部分是多模态预训练模型（CLIP）, 可以将文本和图像表示在同一个特征空间，相近语义的文本和图像在该特征空间里距离会更相近。在该文图生成模型中，扩散模型负责从初始噪声或者指定初始图像中来生成目标图像，CLIP负责引导生成图像的语义和输入的文本的语义尽可能接近，随着扩散模型在CLIP的引导下不断的迭代生成新图像，最终能够生成文本所描述内容的图像。该模块中使用的CLIP模型结构为ViTB32。\n\n更多详情请参考论文：[Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) 以及 [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install disco_diffusion_clip_vitb32\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run disco_diffusion_clip_vitb32 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a 
tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_vitb32_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_vitb32\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # 生成图像, 默认会在disco_diffusion_clip_vitb32_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_vitb32_out/')  \n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_vitb32_out-result.png')\n    # 展示所有的中间结果\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_vitb32_out-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_vitb32_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\"。prompt的构造可以参考[网站](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#)。\n      - style(Optional[str]): 指定绘画的风格，如'watercolor','Chinese painting'等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如Greg 
Rutkowski、krenz，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"disco_diffusion_clip_vitb32_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象，包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_vitb32\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求并获取预测结果。返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_vitb32\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_clip_vitb32_out-result.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_clip_vitb32_out-result.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install disco_diffusion_clip_vitb32 == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/README_en.md",
    "content": "# disco_diffusion_clip_vitb32\n\n|Module Name|disco_diffusion_clip_vitb32|\n| :--- | :---: |\n|Category|text to image|\n|Network|dd+clip ViTB32|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|3.1GB|\n|Latest update date|2022-08-02|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n  - Prompt \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n\n  - Output image\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/182298446-7feb530b-62cc-4e3f-a693-249ec8383daa.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - Generating process\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/182298453-9a8a8336-66e6-4adb-a46f-7a0fa211b467.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### Module Introduction\n\ndisco_diffusion_clip_vitb32 is a text-to-image generation model that can generate images matching the semantics of the sentence you prompt. The model consists of two parts. One is the diffusion model, a generative model that reconstructs the original image from noisy input. The other is the multimodal pre-training model (CLIP), which represents text and images in the same feature space, where text and images with similar semantics are closer together. In this text-to-image model, the diffusion model is responsible for generating the target image from the initial noise or a specified initial image, while CLIP guides the generated image to be as close as possible to the semantics of the input text. As the diffusion model iteratively generates new images under the guidance of CLIP, it eventually produces an image of what the text describes. The CLIP model used in this module is ViTB32.\n\nFor more details, please refer to [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) and [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install disco_diffusion_clip_vitb32\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction\n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run disco_diffusion_clip_vitb32 --text_prompts \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" --output_dir disco_diffusion_clip_vitb32_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_clip_vitb32\")\n    text_prompts = [\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"]\n    # Output images will be saved in the disco_diffusion_clip_vitb32_out directory.\n    # The returned da is a DocumentArray object, which contains all intermediate and final results\n    # You can manipulate the DocumentArray object to do post-processing and save images\n    da = module.generate_image(text_prompts=text_prompts, output_dir='./disco_diffusion_clip_vitb32_out/')\n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_clip_vitb32_out-result.png')\n    # Show all intermediate results\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_vitb32_out-result.gif')\n    ```\n\n- ### 3.API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_clip_vitb32_out'):\n    ```\n\n    - Image generating API, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt that conforms to the format \"content\" + \"artist/style\", such as \"a beautiful painting of Chinese architecture, by krenz, sunny, super wide angle, artstation.\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as Greg Rutkowski or krenz; the output imitates the style of the specified artist. If not provided, style is totally up to your prompt. For more artist styles, refer to [website](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/).\n      - width_height(Optional[List[int]]): The width and height of output images; both should be multiples of 64. The larger the size, the longer the computation time.\n      - seed(Optional[int]): Random seed; different seeds result in different output images.\n      - output_dir(Optional[str]): Output directory, default is \"disco_diffusion_clip_vitb32_out\".\n\n\n    - **Return**\n      - da(DocumentArray): a DocumentArray object including `n_batches` Documents, each of which keeps all intermediate results during generation. Please refer to the [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online text-to-image service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m disco_diffusion_clip_vitb32\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_clip_vitb32\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_clip_vitb32_out-result.png')\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_clip_vitb32_out-result.gif')\n    ```\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install disco_diffusion_clip_vitb32 == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation repo is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). We copied it here for use in guided diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    #k = k.con\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        #assert str(attn_mask.dtype) == 'VarType.FP32' and attn_mask.ndim == 3\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        max_len, batch_size, emb_dim = x.shape\n        head_dim = self.head_dim\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not tuble here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n         
   self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n                                            output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = 
x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n\n        # take features from the eot embedding (the eot token is the highest id in each sequence)\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = self.encode_text(text)\n\n        # normalized features (note: paddle.Tensor.norm takes axis=, not dim=)\n        image_features = image_features / image_features.norm(axis=-1, keepdim=True)\n        text_features = text_features / text_features.norm(axis=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['RN50', 'RN101', 'VIT32']\n\nURL = {\n    'RN50': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN50.pdparams'),\n    'RN101': os.path.join(os.path.dirname(__file__), 'pre_trained', 'RN101.pdparams'),\n    'VIT32': os.path.join(os.path.dirname(__file__), 'pre_trained', 'ViT-B-32.pdparams')\n}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = 
paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n            raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='VIT32'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'RN101': build_rn101_model, 'VIT32': build_vit_model, 'RN50': build_rn50_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    model.load_dict(sd)\n    model.eval()\n    return model\n\n\ndef build_vit_model():\n\n    model = CLIP(embed_dim=512,\n                 image_resolution=224,\n                 vision_layers=12,\n                 vision_width=768,\n                 vision_patch_size=32,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n\n\ndef build_rn101_model():\n    model = CLIP(\n        embed_dim=512,\n        image_resolution=224,\n        vision_layers=(3, 4, 23, 3),\n        vision_width=64,\n        vision_patch_size=0,  #Not used in resnet\n        context_length=77,\n        vocab_size=49408,\n        transformer_width=512,\n        transformer_heads=8,\n        transformer_layers=12)\n    return model\n\n\ndef build_rn50_model():\n    model = CLIP(embed_dim=1024,\n                 image_resolution=224,\n                 vision_layers=(3, 4, 6, 3),\n                 vision_width=64,\n                 vision_patch_size=None,\n                 context_length=77,\n                 vocab_size=49408,\n                 transformer_width=512,\n                 transformer_heads=8,\n                 transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport paddle\n\nimport disco_diffusion_clip_vitb32.clip as clip\nimport disco_diffusion_clip_vitb32.resize_right as resize_right\nimport paddlehub as hub\nfrom disco_diffusion_clip_vitb32.reverse_diffusion import create\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"disco_diffusion_clip_vitb32\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass DiscoDiffusionClip:\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       init_image: Optional[str] = None,\n                       width_height: Optional[List[int]] = [1280, 768],\n                       skip_steps: Optional[int] = 0,\n                       steps: Optional[int] = 250,\n                       cut_ic_pow: Optional[int] = 1,\n                       init_scale: Optional[int] = 1000,\n                       clip_guidance_scale: 
Optional[int] = 5000,\n                       tv_scale: Optional[int] = 0,\n                       range_scale: Optional[int] = 0,\n                       sat_scale: Optional[int] = 0,\n                       cutn_batches: Optional[int] = 4,\n                       diffusion_sampling_mode: Optional[str] = 'ddim',\n                       perlin_init: Optional[bool] = False,\n                       perlin_mode: Optional[str] = 'mixed',\n                       seed: Optional[int] = None,\n                       eta: Optional[float] = 0.8,\n                       clamp_grad: Optional[bool] = True,\n                       clamp_max: Optional[float] = 0.05,\n                       randomize_class: Optional[bool] = True,\n                       clip_denoised: Optional[bool] = False,\n                       fuzzy_prompt: Optional[bool] = False,\n                       rand_mag: Optional[float] = 0.05,\n                       cut_overview: Optional[str] = '[12]*400+[4]*600',\n                       cut_innercut: Optional[str] = '[4]*400+[12]*600',\n                       cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n                       display_rate: Optional[int] = 10,\n                       n_batches: Optional[int] = 1,\n                       batch_size: Optional[int] = 1,\n                       batch_name: Optional[str] = '',\n                       use_gpu: Optional[bool] = True,\n                       output_dir: Optional[str] = 'disco_diffusion_clip_vitb32_out'):\n        \"\"\"\n        Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings, if specified, style will be used to construct prompts.\n        :param artist: Artist style, if specified, style will be used to construct prompts.\n        :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. 
The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n        :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n        :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n        :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n        :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n        :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n        :param sat_scale: Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n        :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. 
However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n        :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n        :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  
Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n        :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n        :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n        :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n        :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. 
If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n        :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n        :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n        :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n        :param cut_overview: The schedule of overview cuts\n        :param cut_innercut: The schedule of inner cuts\n        :param cut_icgray_p: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n        :param n_batches: This variable sets the number of still images you want DD to create.  
If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n        :param batch_name: The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\n        :param use_gpu: whether to use GPU or not.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n        elif isinstance(text_prompts, list):\n            text_prompts[0] = text_prompts[0].rstrip(',.，。')\n            if style is not None:\n                text_prompts[0] += \",{}\".format(style)\n            if artist is not None:\n                text_prompts[0] += \",{},trending on artstation\".format(artist)\n\n        return create(text_prompts=text_prompts,\n                      init_image=init_image,\n                      width_height=width_height,\n                      skip_steps=skip_steps,\n                      steps=steps,\n            
          cut_ic_pow=cut_ic_pow,\n                      init_scale=init_scale,\n                      clip_guidance_scale=clip_guidance_scale,\n                      tv_scale=tv_scale,\n                      range_scale=range_scale,\n                      sat_scale=sat_scale,\n                      cutn_batches=cutn_batches,\n                      diffusion_sampling_mode=diffusion_sampling_mode,\n                      perlin_init=perlin_init,\n                      perlin_mode=perlin_mode,\n                      seed=seed,\n                      eta=eta,\n                      clamp_grad=clamp_grad,\n                      clamp_max=clamp_max,\n                      randomize_class=randomize_class,\n                      clip_denoised=clip_denoised,\n                      fuzzy_prompt=fuzzy_prompt,\n                      rand_mag=rand_mag,\n                      cut_overview=cut_overview,\n                      cut_innercut=cut_innercut,\n                      cut_icgray_p=cut_icgray_p,\n                      display_rate=display_rate,\n                      n_batches=n_batches,\n                      batch_size=batch_size,\n                      batch_name=batch_name,\n                      clip_models=['VIT32'],\n                      output_dir=output_dir)\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = 
self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      init_image=args.init_image,\n                                      width_height=args.width_height,\n                                      skip_steps=args.skip_steps,\n                                      steps=args.steps,\n                                      cut_ic_pow=args.cut_ic_pow,\n                                      init_scale=args.init_scale,\n                                      clip_guidance_scale=args.clip_guidance_scale,\n                                      tv_scale=args.tv_scale,\n                                      range_scale=args.range_scale,\n                                      sat_scale=args.sat_scale,\n                                      cutn_batches=args.cutn_batches,\n                                      diffusion_sampling_mode=args.diffusion_sampling_mode,\n                                      perlin_init=args.perlin_init,\n                                      perlin_mode=args.perlin_mode,\n                                      seed=args.seed,\n                                      eta=args.eta,\n                                      clamp_grad=args.clamp_grad,\n                                      clamp_max=args.clamp_max,\n                                      randomize_class=args.randomize_class,\n                                      clip_denoised=args.clip_denoised,\n                                      
fuzzy_prompt=args.fuzzy_prompt,\n                                      rand_mag=args.rand_mag,\n                                      cut_overview=args.cut_overview,\n                                      cut_innercut=args.cut_innercut,\n                                      cut_icgray_p=args.cut_icgray_p,\n                                      display_rate=args.display_rate,\n                                      n_batches=args.n_batches,\n                                      batch_size=args.batch_size,\n                                      batch_name=args.batch_name,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--skip_steps',\n            type=int,\n            default=0,\n            help=\n            'Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15%% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50%% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine-tuning of the texture'\n        )\n        self.arg_input_group.add_argument(\n            '--steps',\n            type=int,\n            default=250,\n            help=\n            \"When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  
Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cut_ic_pow',\n            type=int,\n            default=1,\n            help=\n            \"This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\"\n        )\n        self.arg_input_group.add_argument(\n            '--init_scale',\n            type=int,\n            default=1000,\n            help=\n            \"This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clip_guidance_scale',\n            type=int,\n            default=5000,\n            help=\n            \"CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50%% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. 
Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\"\n        )\n        self.arg_input_group.add_argument(\n            '--tv_scale',\n            type=int,\n            default=0,\n            help=\n            \"Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\"\n        )\n        self.arg_input_group.add_argument(\n            '--range_scale',\n            type=int,\n            default=0,\n            help=\n            \"Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\"\n        )\n        self.arg_input_group.add_argument(\n            '--sat_scale',\n            type=int,\n            default=0,\n            help=\n            \"Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cutn_batches',\n            type=int,\n            default=4,\n            help=\n            \"Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  
Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\"\n        )\n        self.arg_input_group.add_argument(\n            '--diffusion_sampling_mode',\n            type=str,\n            default='ddim',\n            help=\n            \"Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_init',\n            type=bool,\n            default=False,\n            help=\n            \"Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_mode',\n            type=str,\n            default='mixed',\n            help=\n            \"Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  
If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\"\n        )\n        self.arg_input_group.add_argument(\n            '--eta',\n            type=float,\n            default=0.8,\n            help=\n            \"eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_grad',\n            type=bool,\n            default=True,\n            help=\n            \"As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_max',\n            type=float,\n            default=0.05,\n            help=\n            \"Sets the value of the clamp_grad limitation. 
Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\"\n        )\n        self.arg_input_group.add_argument('--randomize_class', type=bool, default=True, help=\"Controls whether the imagenet class is randomly changed each iteration.\")\n        self.arg_input_group.add_argument('--clip_denoised', type=bool, default=False, help=\"Determines whether CLIP discriminates a noisy or denoised image.\")\n        self.arg_input_group.add_argument(\n            '--fuzzy_prompt',\n            type=bool,\n            default=False,\n            help=\n            \"Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\"\n        )\n        self.arg_input_group.add_argument(\n            '--rand_mag',\n            type=float,\n            default=0.5,\n            help=\"Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\")\n        self.arg_input_group.add_argument('--cut_overview',\n                                          type=str,\n                                          default='[12]*400+[4]*600',\n                                          help=\"The schedule of overview cuts\")\n        self.arg_input_group.add_argument('--cut_innercut',\n                                          type=str,\n                                          default='[4]*400+[12]*600',\n                                          help=\"The schedule of inner cuts\")\n        self.arg_input_group.add_argument(\n            '--cut_icgray_p',\n            type=str,\n            default='[0.2]*400+[0]*600',\n            help=\n            \"This sets the schedule for the percentage of inner cuts that are rendered in grayscale. Grayscale cuts can help the AI focus on shapes and edges rather than color, which may improve structure in the final image.\"\n        )\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\n            \"During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\"\n        )\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='disco_diffusion_clip_vitb32_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.'\n        )\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings. If specified, it will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style. If specified, it will be used to construct prompts.')\n        self.arg_input_group.add_argument(\n            '--init_image',\n            type=str,\n            default=None,\n            help=\n            \"Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~50%% of total steps to retain the character of the init. 
See skip_steps above for further discussion.\"\n        )\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[1280, 768],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--n_batches',\n            type=int,\n            default=1,\n            help=\n            \"This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\"\n        )\n        self.arg_input_group.add_argument('--batch_size', type=int, default=1, help=\"Batch size.\")\n        self.arg_input_group.add_argument(\n            '--batch_name',\n            type=str,\n            default='',\n            help=\n            'The name of the batch, the batch id will be named as \"reverse_diffusion-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.'\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/requirements.txt",
    "content": "numpy\npaddle_lpips==0.1.2\nftfy\ndocarray>=0.13.29\npyyaml\nregex\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/resize_right/README.md",
    "content": "# ResizeRight (Paddle)\nFully differentiable resize function implemented by Paddle.\nThis module is based on [assafshocher/ResizeRight](https://github.com/assafshocher/ResizeRight).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/resize_right/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/resize_right/interp_methods.py",
    "content": "from math import pi\n\ntry:\n    import paddle\nexcept ImportError:\n    paddle = None\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or Paddle, but neither was found\")\n\n\ndef set_framework_dependencies(x):\n    if type(x) is numpy.ndarray:\n        to_dtype = lambda a: a\n        fw = numpy\n    else:\n        to_dtype = lambda a: paddle.cast(a, x.dtype)\n        fw = paddle\n    # eps = fw.finfo(fw.float32).eps\n    eps = paddle.to_tensor(np.finfo(np.float32).eps)\n    return fw, to_dtype, eps\n\n\ndef support_sz(sz):\n\n    def wrapper(f):\n        f.support_sz = sz\n        return f\n\n    return wrapper\n\n\n@support_sz(4)\ndef cubic(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    absx = fw.abs(x)\n    absx2 = absx**2\n    absx3 = absx**3\n    return ((1.5 * absx3 - 2.5 * absx2 + 1.) * to_dtype(absx <= 1.) +\n            (-0.5 * absx3 + 2.5 * absx2 - 4. * absx + 2.) * to_dtype((1. < absx) & (absx <= 2.)))\n\n\n@support_sz(4)\ndef lanczos2(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 2) + eps) / ((pi**2 * x**2 / 2) + eps)) * to_dtype(abs(x) < 2))\n\n\n@support_sz(6)\ndef lanczos3(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 3) + eps) / ((pi**2 * x**2 / 3) + eps)) * to_dtype(abs(x) < 3))\n\n\n@support_sz(2)\ndef linear(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return ((x + 1) * to_dtype((-1 <= x) & (x < 0)) + (1 - x) * to_dtype((0 <= x) & (x <= 1)))\n\n\n@support_sz(1)\ndef box(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return to_dtype((-1 <= x) & (x < 0)) + to_dtype((0 <= x) & (x <= 1))\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/resize_right/resize_right.py",
    "content": "import warnings\nfrom fractions import Fraction\nfrom math import ceil\nfrom typing import Tuple\n\nimport disco_diffusion_clip_vitb32.resize_right.interp_methods as interp_methods\n\n\nclass NoneClass:\n    pass\n\n\ntry:\n    import paddle\n    from paddle import nn\n    nnModuleWrapped = nn.Layer\nexcept ImportError:\n    warnings.warn('No Paddle found, will work only with Numpy')\n    paddle = None\n    nnModuleWrapped = NoneClass\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    warnings.warn('No Numpy found, will work only with Paddle')\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or Paddle, but neither was found\")\n\n\ndef resize(input,\n           scale_factors=None,\n           out_shape=None,\n           interp_method=interp_methods.cubic,\n           support_sz=None,\n           antialiasing=True,\n           by_convs=False,\n           scale_tolerance=None,\n           max_numerator=10,\n           pad_mode='constant'):\n    # get properties of the input tensor\n    in_shape, n_dims = input.shape, input.ndim\n\n    # fw stands for framework that can be either numpy or paddle,\n    # determined by the input type\n    fw = numpy if type(input) is numpy.ndarray else paddle\n    eps = np.finfo(np.float32).eps if fw == numpy else paddle.to_tensor(np.finfo(np.float32).eps)\n    device = input.place if fw is paddle else None\n\n    # set missing scale factors or output shape, one according to the other,\n    # scream if both missing. this is also where all the default policies\n    # take place. 
also handling the by_convs attribute carefully.\n    scale_factors, out_shape, by_convs = set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs,\n                                                              scale_tolerance, max_numerator, eps, fw)\n\n    # sort indices of dimensions according to scale of each dimension.\n    # since we are going dim by dim this is efficient\n    sorted_filtered_dims_and_scales = [(dim, scale_factors[dim], by_convs[dim], in_shape[dim], out_shape[dim])\n                                       for dim in sorted(range(n_dims), key=lambda ind: scale_factors[ind])\n                                       if scale_factors[dim] != 1.]\n    # unless support size is specified by the user, it is an attribute\n    # of the interpolation method\n    if support_sz is None:\n        support_sz = interp_method.support_sz\n\n    # output begins identical to input and changes with each iteration\n    output = input\n\n    # iterate over dims\n    for (dim, scale_factor, dim_by_convs, in_sz, out_sz) in sorted_filtered_dims_and_scales:\n        # STEP 1- PROJECTED GRID: The non-integer locations of the projection\n        # of output pixel locations to the input tensor\n        projected_grid = get_projected_grid(in_sz, out_sz, scale_factor, fw, dim_by_convs, device)\n\n        # STEP 1.5: ANTIALIASING- If antialiasing is taking place, we modify\n        # the window size and the interpolation method (see inside function)\n        cur_interp_method, cur_support_sz = apply_antialiasing_if_needed(interp_method, support_sz, scale_factor,\n                                                                         antialiasing)\n\n        # STEP 2- FIELDS OF VIEW: for each output pixel, map the input pixels\n        # that influence it. 
Also calculate needed padding and update grid\n        # accordingly\n        field_of_view = get_field_of_view(projected_grid, cur_support_sz, fw, eps, device)\n\n        # STEP 2.5- CALCULATE PAD AND UPDATE: according to the field of view,\n        # the input should be padded to handle the boundaries, coordinates\n        # should be updated. actual padding only occurs when weights are\n        # applied (step 4). if using by_convs for this dim, then we need to\n        # calc right and left boundaries for each filter instead.\n        pad_sz, projected_grid, field_of_view = calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor,\n                                                            dim_by_convs, fw, device)\n        # STEP 3- CALCULATE WEIGHTS: Match a set of weights to the pixels in\n        # the field of view for each output pixel\n        weights = get_weights(cur_interp_method, projected_grid, field_of_view)\n\n        # STEP 4- APPLY WEIGHTS: Each output pixel is calculated by multiplying\n        # its set of weights with the pixel values in its field of view.\n        # We now multiply the fields of view with their matching weights.\n        # We do this by tensor multiplication and broadcasting.\n        # if by_convs is true for this dim, then we do this action by\n        # convolutions. this is equivalent but faster.\n        if not dim_by_convs:\n            output = apply_weights(output, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw)\n        else:\n            output = apply_convs(output, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw)\n    return output\n\n\ndef get_projected_grid(in_sz, out_sz, scale_factor, fw, by_convs, device=None):\n    # we start by having the output coordinates which are just integer locations\n    # in the special case when using by_convs, we only need two cycles of grid\n    # points. 
the first and last.\n    grid_sz = out_sz if not by_convs else scale_factor.numerator\n    out_coordinates = fw_arange(grid_sz, fw, device)\n\n    # This is projecting the output pixel locations in 1d to the input tensor,\n    # as non-integer locations.\n    # the following formula is derived in the paper\n    # \"From Discrete to Continuous Convolutions\" by Shocher et al.\n    return (out_coordinates / float(scale_factor) + (in_sz - 1) / 2 - (out_sz - 1) / (2 * float(scale_factor)))\n\n\ndef get_field_of_view(projected_grid, cur_support_sz, fw, eps, device):\n    # for each output pixel, map which input pixels influence it, in 1d.\n    # we start by calculating the leftmost neighbor, using half of the window\n    # size (eps is for when boundary is exact int)\n    left_boundaries = fw_ceil(projected_grid - cur_support_sz / 2 - eps, fw)\n\n    # then we simply take all the pixel centers in the field by counting\n    # window size pixels from the left boundary\n    ordinal_numbers = fw_arange(ceil(cur_support_sz - eps), fw, device)\n    return left_boundaries[:, None] + ordinal_numbers\n\n\ndef calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor, dim_by_convs, fw, device):\n    if not dim_by_convs:\n        # determine padding according to neighbor coords out of bound.\n        # this is a generalized notion of padding, when pad<0 it means crop\n        pad_sz = [-field_of_view[0, 0].item(), field_of_view[-1, -1].item() - in_sz + 1]\n\n        # since input image will be changed by padding, coordinates of both\n        # field_of_view and projected_grid need to be updated\n        field_of_view += pad_sz[0]\n        projected_grid += pad_sz[0]\n\n    else:\n        # only used for by_convs, to calc the boundaries of each filter. the\n        # number of distinct convolutions is the numerator of the scale factor\n        num_convs, stride = scale_factor.numerator, scale_factor.denominator\n\n        # calculate left and right boundaries for each 
conv. left can also be\n        # negative, right can be bigger than in_sz. such cases imply padding if\n        # needed. however if both are in-bounds, it means we need to crop,\n        # practically apply the conv only on part of the image.\n        left_pads = -field_of_view[:, 0]\n\n        # next calc is tricky, explanation by rows:\n        # 1) counting output pixels between the first position of each filter\n        #    to the right boundary of the input\n        # 2) dividing it by number of filters to count how many 'jumps'\n        #    each filter does\n        # 3) multiplying by the stride gives us the distance over the input\n        #    coords done by all these jumps for each filter\n        # 4) to this distance we add the right boundary of the filter when\n        #    placed in its leftmost position. so now we get the right boundary\n        #    of that filter in input coord.\n        # 5) the padding size needed is obtained by subtracting the rightmost\n        #    input coordinate. if the result is positive padding is needed. if\n        #    negative then negative padding means shaving off pixel columns.\n        right_pads = (((out_sz - fw_arange(num_convs, fw, device) - 1)  # (1)\n                       // num_convs)  # (2)\n                      * stride  # (3)\n                      + field_of_view[:, -1]  # (4)\n                      - in_sz + 1)  # (5)\n\n        # in the by_convs case pad_sz is a list of left-right pairs. 
one\n        # per filter\n\n        pad_sz = list(zip(left_pads, right_pads))\n\n    return pad_sz, projected_grid, field_of_view\n\n\ndef get_weights(interp_method, projected_grid, field_of_view):\n    # the set of weights per each output pixel is the result of the chosen\n    # interpolation method applied to the distances between projected grid\n    # locations and the pixel-centers in the field of view (distances are\n    # directed, can be positive or negative)\n    weights = interp_method(projected_grid[:, None] - field_of_view)\n\n    # we now carefully normalize the weights to sum to 1 per each output pixel\n    sum_weights = weights.sum(1, keepdim=True)\n    sum_weights[sum_weights == 0] = 1\n    return weights / sum_weights\n\n\ndef apply_weights(input, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the first one.\n    # so we transpose and will transpose back after multiplying\n    tmp_input = fw_swapaxes(input, dim, 0, fw)\n\n    # apply padding\n    tmp_input = fw_pad(tmp_input, fw, pad_sz, pad_mode)\n\n    # field_of_view is a tensor of order 2: for each output (1d location\n    # along cur dim)- a list of 1d neighbor locations.\n    # note that this whole operation is applied to each dim separately,\n    # this is why it is all in 1d.\n    # neighbors = tmp_input[field_of_view] is a tensor of order image_dims+1:\n    # for each output pixel (this time indicated in all dims), these are the\n    # values of the neighbors in the 1d field of view. note that we only\n    # consider neighbors along the current dim, but such set exists for every\n    # multi-dim location, hence the final tensor order is image_dims+1.\n    paddle.device.cuda.empty_cache()\n    neighbors = tmp_input[field_of_view]\n\n    # weights is an order 2 tensor: for each output location along 1d- a list\n    # of weights matching the field of view. 
we augment it with ones, for\n    # broadcasting, so that when it multiplies some tensor the weights affect\n    # only its first dim.\n    tmp_weights = fw.reshape(weights, (*weights.shape, *[1] * (n_dims - 1)))\n\n    # now we simply multiply the weights with the neighbors, and then sum\n    # along the field of view, to get a single value per out pixel\n    tmp_output = (neighbors * tmp_weights).sum(1)\n    # we transpose back the resized dim to its original position\n    return fw_swapaxes(tmp_output, 0, dim, fw)\n\n\ndef apply_convs(input, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the last one.\n    # so we transpose and will transpose back after multiplying\n    input = fw_swapaxes(input, dim, -1, fw)\n\n    # the stride for all convs is the denominator of the scale factor\n    stride, num_convs = scale_factor.denominator, scale_factor.numerator\n\n    # prepare an empty tensor for the output\n    tmp_out_shape = list(input.shape)\n    tmp_out_shape[-1] = out_sz\n    tmp_output = fw_empty(tuple(tmp_out_shape), fw, input.device)\n\n    # iterate over the conv operations. we have as many as the numerator\n    # of the scale-factor. for each we need boundaries and a filter.\n    for conv_ind, (pad_sz, filt) in enumerate(zip(pad_sz, weights)):\n        # apply padding (we pad last dim, padding can be negative)\n        pad_dim = input.ndim - 1\n        tmp_input = fw_pad(input, fw, pad_sz, pad_mode, dim=pad_dim)\n\n        # apply convolution over last dim. 
store in the output tensor with\n        # positional strides so that when the loop is complete conv results are\n        # interleaved\n        tmp_output[..., conv_ind::num_convs] = fw_conv(tmp_input, filt, stride)\n\n    return fw_swapaxes(tmp_output, -1, dim, fw)\n\n\ndef set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs, scale_tolerance, max_numerator, eps, fw):\n    # eventually we must have both scale-factors and out-sizes for all in/out\n    # dims. however, we support many possible partial arguments\n    if scale_factors is None and out_shape is None:\n        raise ValueError(\"either scale_factors or out_shape should be \"\n                         \"provided\")\n    if out_shape is not None:\n        # if out_shape has fewer dims than in_shape, we by default resize the\n        # first dims for numpy and last dims for paddle\n        out_shape = (list(out_shape) +\n                     list(in_shape[len(out_shape):]) if fw is numpy else list(in_shape[:-len(out_shape)]) +\n                     list(out_shape))\n        if scale_factors is None:\n            # if no scale given, we calculate it as the out to in ratio\n            # (not recommended)\n            scale_factors = [out_sz / in_sz for out_sz, in_sz in zip(out_shape, in_shape)]\n    if scale_factors is not None:\n        # by default, if a single number is given as scale, we assume resizing\n        # two dims (most common are images with 2 spatial dims)\n        scale_factors = (scale_factors if isinstance(scale_factors, (list, tuple)) else [scale_factors, scale_factors])\n        # if fewer scale_factors than in_shape dims, we by default resize the\n        # first dims for numpy and last dims for paddle\n        scale_factors = (list(scale_factors) + [1] * (len(in_shape) - len(scale_factors)) if fw is numpy else [1] *\n                         (len(in_shape) - len(scale_factors)) + list(scale_factors))\n        if out_shape is None:\n            # when no out_shape given, it is 
calculated by multiplying the\n            # scale by the in_shape (not recommended)\n            out_shape = [ceil(scale_factor * in_sz) for scale_factor, in_sz in zip(scale_factors, in_shape)]\n        # next part intentionally after out_shape determined for stability\n        # we fix by_convs to be a list of truth values in case it is not\n        if not isinstance(by_convs, (list, tuple)):\n            by_convs = [by_convs] * len(out_shape)\n\n        # next loop fixes the scale for each dim to be either frac or float.\n        # this is determined by by_convs and by tolerance for scale accuracy.\n        for ind, (sf, dim_by_convs) in enumerate(zip(scale_factors, by_convs)):\n            # first we fractionalize\n            if dim_by_convs:\n                frac = Fraction(1 / sf).limit_denominator(max_numerator)\n                frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)\n\n            # if accuracy is within tolerance scale will be frac. if not, then\n            # it will be float and the by_convs attr will be set false for\n            # this dim\n            if scale_tolerance is None:\n                scale_tolerance = eps\n            if dim_by_convs and abs(frac - sf) < scale_tolerance:\n                scale_factors[ind] = frac\n            else:\n                scale_factors[ind] = float(sf)\n                by_convs[ind] = False\n\n        return scale_factors, out_shape, by_convs\n\n\ndef apply_antialiasing_if_needed(interp_method, support_sz, scale_factor, antialiasing):\n    # antialiasing is \"stretching\" the field of view according to the scale\n    # factor (only for downscaling). this is low-pass filtering. 
this\n    # requires modifying both the interpolation (stretching the 1d\n    # function and multiplying by the scale-factor) and the window size.\n    scale_factor = float(scale_factor)\n    if scale_factor >= 1.0 or not antialiasing:\n        return interp_method, support_sz\n    cur_interp_method = (lambda arg: scale_factor * interp_method(scale_factor * arg))\n    cur_support_sz = support_sz / scale_factor\n    return cur_interp_method, cur_support_sz\n\n\ndef fw_ceil(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.ceil(x))\n    else:\n        return paddle.cast(x.ceil(), dtype='int64')\n\n\ndef fw_floor(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.floor(x))\n    else:\n        return paddle.cast(x.floor(), dtype='int64')\n\n\ndef fw_cat(x, fw):\n    if fw is numpy:\n        return fw.concatenate(x)\n    else:\n        return fw.concat(x)\n\n\ndef fw_swapaxes(x, ax_1, ax_2, fw):\n    if fw is numpy:\n        return fw.swapaxes(x, ax_1, ax_2)\n    else:\n        if ax_1 == -1:\n            ax_1 = len(x.shape) - 1\n        if ax_2 == -1:\n            ax_2 = len(x.shape) - 1\n        perm0 = list(range(len(x.shape)))\n        temp = ax_1\n        perm0[temp] = ax_2\n        perm0[ax_2] = temp\n        return fw.transpose(x, perm0)\n\n\ndef fw_pad(x, fw, pad_sz, pad_mode, dim=0):\n    if pad_sz == (0, 0):\n        return x\n    if fw is numpy:\n        pad_vec = [(0, 0)] * x.ndim\n        pad_vec[dim] = pad_sz\n        return fw.pad(x, pad_width=pad_vec, mode=pad_mode)\n    else:\n        if x.ndim < 3:\n            x = x[None, None, ...]\n\n        pad_vec = [0] * ((x.ndim - 2) * 2)\n        pad_vec[0:2] = pad_sz\n        return fw_swapaxes(fw.nn.functional.pad(fw_swapaxes(x, dim, -1, fw), pad=pad_vec, mode=pad_mode), dim, -1, fw)\n\n\ndef fw_conv(input, filter, stride):\n    # we want to apply 1d conv to any nd array. the way to do it is to reshape\n    # the input to a 4D tensor. 
first two dims are singletons, 3rd dim stores\n    # all the spatial dims that we are not convolving along now. then we can\n    # apply conv2d with a 1xK filter. This convolves the same way all the other\n    # dims stored in the 3rd dim. like depthwise conv over these.\n    # TODO: numpy support\n    reshaped_input = input.reshape(1, 1, -1, input.shape[-1])\n    reshaped_output = paddle.nn.functional.conv2d(reshaped_input, filter.view(1, 1, 1, -1), stride=(1, stride))\n    return reshaped_output.reshape(*input.shape[:-1], -1)\n\n\ndef fw_arange(upper_bound, fw, device):\n    if fw is numpy:\n        return fw.arange(upper_bound)\n    else:\n        return fw.arange(upper_bound)\n\n\ndef fw_empty(shape, fw, device):\n    if fw is numpy:\n        return fw.empty(shape)\n    else:\n        return fw.empty(shape=shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/README.md",
    "content": "# Diffusion model (Paddle)\nThis module implements a diffusion model which accepts a text prompt and outputs images semantically close to the text. The code is rewritten in Paddle and mainly refers to two projects: [jina-ai/discoart](https://github.com/jina-ai/discoart) and [openai/guided-diffusion](https://github.com/openai/guided-diffusion). Thanks for their wonderful work.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/__init__.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/__init__.py\n'''\nimport os\nimport warnings\n\nos.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'\n\n__all__ = ['create']\n\nimport sys\n\n__resources_path__ = os.path.join(\n    os.path.dirname(sys.modules.get(__package__).__file__ if __package__ in sys.modules else __file__),\n    'resources',\n)\n\nimport gc\n\n# check if GPU is available\nimport paddle\n\n# download and load models, this will take some time on the first load\n\nfrom .helper import load_all_models, load_diffusion_model, load_clip_models\n\nmodel_config, secondary_model = load_all_models('512x512_diffusion_uncond_finetune_008100', use_secondary_model=True)\n\nfrom typing import TYPE_CHECKING, overload, List, Optional\n\nif TYPE_CHECKING:\n    from docarray import DocumentArray, Document\n\n_clip_models_cache = {}\n\n# begin_create_overload\n\n\n@overload\ndef create(text_prompts: Optional[List[str]] = [\n    'A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.',\n    'yellow color scheme',\n],\n           init_image: Optional[str] = None,\n           width_height: Optional[List[int]] = [1280, 768],\n           skip_steps: Optional[int] = 10,\n           steps: Optional[int] = 250,\n           cut_ic_pow: Optional[int] = 1,\n           init_scale: Optional[int] = 1000,\n           clip_guidance_scale: Optional[int] = 5000,\n           tv_scale: Optional[int] = 0,\n           range_scale: Optional[int] = 150,\n           sat_scale: Optional[int] = 0,\n           cutn_batches: Optional[int] = 4,\n           diffusion_model: Optional[str] = '512x512_diffusion_uncond_finetune_008100',\n           use_secondary_model: Optional[bool] = True,\n           diffusion_sampling_mode: Optional[str] = 'ddim',\n           perlin_init: Optional[bool] = False,\n           perlin_mode: Optional[str] = 'mixed',\n           seed: 
Optional[int] = None,\n           eta: Optional[float] = 0.8,\n           clamp_grad: Optional[bool] = True,\n           clamp_max: Optional[float] = 0.05,\n           randomize_class: Optional[bool] = True,\n           clip_denoised: Optional[bool] = False,\n           fuzzy_prompt: Optional[bool] = False,\n           rand_mag: Optional[float] = 0.05,\n           cut_overview: Optional[str] = '[12]*400+[4]*600',\n           cut_innercut: Optional[str] = '[4]*400+[12]*600',\n           cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n           display_rate: Optional[int] = 10,\n           n_batches: Optional[int] = 4,\n           batch_size: Optional[int] = 1,\n           batch_name: Optional[str] = '',\n           clip_models: Optional[list] = ['ViTB32', 'ViTB16', 'RN50'],\n           output_dir: Optional[str] = 'discoart_output') -> 'DocumentArray':\n    \"\"\"\n    Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n    :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.\n    :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n    :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n    :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n    :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n    :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   
Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n    :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n    :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n    :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n    :param sat_scale: Saturation scale. Optional, set to zero to turn off.  
If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n    :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n    :param diffusion_model: Diffusion_model of choice.\n    :param use_secondary_model: Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  
I suggest you experiment with this.\n    :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n    :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n    :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n    :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  
This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n    :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n    :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n    :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n    :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n    :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n    :param cut_overview: The schedule of overview cuts\n    :param cut_innercut: The schedule of inner cuts\n    :param cut_icgray_p: This sets the size of the border used for inner cuts.  
High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n    :param n_batches: This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n    :param batch_name: The name of the batch, the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overridden by other users, please use a unique name.\n    :param clip_models: CLIP Model selectors. ViTB32, ViTB16, ViTL14, RN101, RN50, RN50x4, RN50x16, RN50x64. These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.  You can mix in multiple models as well for different results.  
However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash. The rough order of speed/mem usage is (smallest/fastest to largest/slowest): ViTB32, RN50, RN101, ViTB16, RN50x4, RN50x16, RN50x64, ViTL14. For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\n# end_create_overload\n\n\n@overload\ndef create(init_document: 'Document') -> 'DocumentArray':\n    \"\"\"\n    Create an artwork using a DocArray ``Document`` object as initial state.\n    :param init_document: its ``.tags`` will be used as parameters, ``.uri`` (if present) will be used as init image.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\ndef create(**kwargs) -> 'DocumentArray':\n    from .config import load_config\n    from .runner import do_run\n\n    if 'init_document' in kwargs:\n        d = kwargs['init_document']\n        _kwargs = d.tags\n        if not _kwargs:\n            warnings.warn('init_document has no .tags, fallback to default config')\n        if d.uri:\n            _kwargs['init_image'] = kwargs['init_document'].uri\n        else:\n            warnings.warn('init_document has no .uri, fallback to no init image')\n        kwargs.pop('init_document')\n        if kwargs:\n            warnings.warn('init_document has .tags and .uri, but kwargs are also present, will override .tags')\n            _kwargs.update(kwargs)\n        _args = load_config(user_config=_kwargs)\n    else:\n        _args = load_config(user_config=kwargs)\n\n    model, diffusion = load_diffusion_model(model_config, _args.diffusion_model, steps=_args.steps)\n\n    clip_models = load_clip_models(enabled=_args.clip_models, clip_models=_clip_models_cache)\n\n    gc.collect()\n    paddle.device.cuda.empty_cache()\n    try:\n        return do_run(_args, (model, diffusion, clip_models, secondary_model))\n    
except KeyboardInterrupt:\n        pass\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/config.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/config.py\n'''\nimport copy\nimport random\nimport warnings\nfrom types import SimpleNamespace\nfrom typing import Dict\n\nimport yaml\nfrom yaml import Loader\n\nfrom . import __resources_path__\n\nwith open(f'{__resources_path__}/default.yml') as ymlfile:\n    default_args = yaml.load(ymlfile, Loader=Loader)\n\n\ndef load_config(user_config: Dict, ):\n    cfg = copy.deepcopy(default_args)\n\n    if user_config:\n        # check against the defaults *before* merging; after `cfg.update` every\n        # user key would already be in `cfg`, so the warning could never fire\n        for k in user_config.keys():\n            if k not in default_args:\n                warnings.warn(f'unknown argument {k}, ignored')\n        cfg.update(**{k: v for k, v in user_config.items() if k in default_args})\n\n    for k, v in cfg.items():\n        if k in ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches',\n                 'cutn_batches') and isinstance(v, float):\n            cfg[k] = int(v)\n        if k == 'width_height':\n            cfg[k] = [int(vv) for vv in v]\n\n    cfg.update(**{\n        'seed': cfg['seed'] or random.randint(0, 2**32),\n    })\n\n    if cfg['batch_name']:\n        da_name = f'{__package__}-{cfg[\"batch_name\"]}-{cfg[\"seed\"]}'\n    else:\n        da_name = f'{__package__}-{cfg[\"seed\"]}'\n        warnings.warn('you did not set `batch_name`; set it to get a unique session ID')\n\n    cfg.update(**{'name_docarray': da_name})\n\n    print_args_table(cfg)\n\n    return SimpleNamespace(**cfg)\n\n\ndef print_args_table(cfg):\n    from rich.table import Table\n    from rich import box\n    from rich.console import Console\n\n    console = Console()\n\n    param_str = Table(\n        title=cfg['name_docarray'],\n        box=box.ROUNDED,\n        highlight=True,\n        title_justify='left',\n    )\n    param_str.add_column('Argument', justify='right')\n    param_str.add_column('Value', justify='left')\n\n    for k, v in sorted(cfg.items()):\n        value = str(v)\n\n        if default_args.get(k, None) != v:\n            value = f'[b]{value}[/]'\n\n        param_str.add_row(k, 
value)\n\n    console.print(param_str)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/helper.py",
    "content": "'''\nThis code is rewritten in Paddle, based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/helper.py\n'''\nimport hashlib\nimport logging\nimport os\nimport subprocess\nimport sys\nfrom os.path import expanduser\nfrom pathlib import Path\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\n\nimport paddle\n\n\ndef _get_logger():\n    logger = logging.getLogger(__package__)\n    logger.setLevel(\"INFO\")\n    ch = logging.StreamHandler()\n    ch.setLevel(\"INFO\")\n    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')\n    ch.setFormatter(formatter)\n    logger.addHandler(ch)\n    return logger\n\n\nlogger = _get_logger()\n\n\ndef load_clip_models(enabled: List[str], clip_models: Dict[str, Any] = {}):\n\n    import disco_diffusion_clip_vitb32.clip.clip as clip\n    from disco_diffusion_clip_vitb32.clip.clip import build_model, tokenize, transform\n\n    # load enabled models\n    for k in enabled:\n        if k not in clip_models:\n            clip_models[k] = build_model(name=k)\n            clip_models[k].eval()\n            for parameter in clip_models[k].parameters():\n                parameter.stop_gradient = True\n\n    # drop models that are not enabled to save memory; iterate over a copy of\n    # the keys so the dict is not mutated while being iterated\n    for k in list(clip_models):\n        if k not in enabled:\n            clip_models.pop(k)\n\n    return list(clip_models.values())\n\n\ndef load_all_models(diffusion_model, use_secondary_model):\n    from .model.script_util import (\n        model_and_diffusion_defaults, )\n\n    model_config = model_and_diffusion_defaults()\n\n    if diffusion_model == '512x512_diffusion_uncond_finetune_008100':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit 
this, it is taken care of later.\n            'image_size': 512,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n    elif diffusion_model == '256x256_diffusion_uncond':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 256,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n\n    secondary_model = None\n    if use_secondary_model:\n        from .model.sec_diff import SecondaryDiffusionImageNet2\n        secondary_model = SecondaryDiffusionImageNet2()\n        model_dict = paddle.load(\n            os.path.join(os.path.dirname(__file__), 'pre_trained', 'secondary_model_imagenet_2.pdparams'))\n        secondary_model.set_state_dict(model_dict)\n        secondary_model.eval()\n        for parameter in secondary_model.parameters():\n            parameter.stop_gradient = True\n\n    return model_config, secondary_model\n\n\ndef load_diffusion_model(model_config, diffusion_model, steps):\n    from .model.script_util import (\n        create_model_and_diffusion, )\n\n    timestep_respacing = f'ddim{steps}'\n    diffusion_steps = (1000 // steps) * steps if steps < 1000 else steps\n    model_config.update({\n        'timestep_respacing': timestep_respacing,\n  
      'diffusion_steps': diffusion_steps,\n    })\n\n    model, diffusion = create_model_and_diffusion(**model_config)\n    model.set_state_dict(\n        paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', f'{diffusion_model}.pdparams')))\n    model.eval()\n    for name, param in model.named_parameters():\n        param.stop_gradient = True\n\n    return model, diffusion\n\n\ndef parse_prompt(prompt):\n    if prompt.startswith('http://') or prompt.startswith('https://'):\n        vals = prompt.rsplit(':', 2)\n        vals = [vals[0] + ':' + vals[1], *vals[2:]]\n    else:\n        vals = prompt.rsplit(':', 1)\n    vals = vals + ['', '1'][len(vals):]\n    return vals[0], float(vals[1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/__init__.py",
    "content": "\"\"\"\nCodebase for \"Improved Denoising Diffusion Probabilistic Models\" implemented by Paddle.\n\"\"\"\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/gaussian_diffusion.py",
    "content": "\"\"\"\nDiffusion model implemented by Paddle.\nThis code is rewritten based on the PyTorch version of Ho et al's diffusion models:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py\n\"\"\"\nimport enum\nimport math\n\nimport numpy as np\nimport paddle\n\nfrom .losses import discretized_gaussian_log_likelihood\nfrom .losses import normal_kl\nfrom .nn import mean_flat\n\n\ndef get_named_beta_schedule(schedule_name, num_diffusion_timesteps):\n    \"\"\"\n    Get a pre-defined beta schedule for the given name.\n\n    The beta schedule library consists of beta schedules which remain similar\n    in the limit of num_diffusion_timesteps.\n    Beta schedules may be added, but should not be removed or changed once\n    they are committed to maintain backwards compatibility.\n    \"\"\"\n    if schedule_name == \"linear\":\n        # Linear schedule from Ho et al, extended to work for any number of\n        # diffusion steps.\n        scale = 1000 / num_diffusion_timesteps\n        beta_start = scale * 0.0001\n        beta_end = scale * 0.02\n        return np.linspace(beta_start, beta_end, num_diffusion_timesteps, dtype=np.float64)\n    elif schedule_name == \"cosine\":\n        return betas_for_alpha_bar(\n            num_diffusion_timesteps,\n            lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2)**2,\n        )\n    else:\n        raise NotImplementedError(f\"unknown beta schedule: {schedule_name}\")\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function,\n    which defines the cumulative product of (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that\n         
             part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas)\n\n\nclass ModelMeanType(enum.Enum):\n    \"\"\"\n    Which type of output the model predicts.\n    \"\"\"\n\n    PREVIOUS_X = enum.auto()  # the model predicts x_{t-1}\n    START_X = enum.auto()  # the model predicts x_0\n    EPSILON = enum.auto()  # the model predicts epsilon\n\n\nclass ModelVarType(enum.Enum):\n    \"\"\"\n    What is used as the model's output variance.\n\n    The LEARNED_RANGE option has been added to allow the model to predict\n    values between FIXED_SMALL and FIXED_LARGE, making its job easier.\n    \"\"\"\n\n    LEARNED = enum.auto()\n    FIXED_SMALL = enum.auto()\n    FIXED_LARGE = enum.auto()\n    LEARNED_RANGE = enum.auto()\n\n\nclass LossType(enum.Enum):\n    MSE = enum.auto()  # use raw MSE loss (and KL when learning variances)\n    RESCALED_MSE = (enum.auto())  # use raw MSE loss (with RESCALED_KL when learning variances)\n    KL = enum.auto()  # use the variational lower-bound\n    RESCALED_KL = enum.auto()  # like KL, but rescale to estimate the full VLB\n\n    def is_vb(self):\n        return self == LossType.KL or self == LossType.RESCALED_KL\n\n\nclass GaussianDiffusion:\n    \"\"\"\n    Utilities for training and sampling diffusion models.\n\n    Ported directly from here, and then adapted over time to further experimentation.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42\n\n    :param betas: a 1-D numpy array of betas for each diffusion timestep,\n                  starting at T and going to 1.\n    :param model_mean_type: 
a ModelMeanType determining what the model outputs.\n    :param model_var_type: a ModelVarType determining how variance is output.\n    :param loss_type: a LossType determining the loss function to use.\n    :param rescale_timesteps: if True, pass floating point timesteps into the\n                              model so that they are always scaled like in the\n                              original paper (0 to 1000).\n    \"\"\"\n\n    def __init__(\n        self,\n        *,\n        betas,\n        model_mean_type,\n        model_var_type,\n        loss_type,\n        rescale_timesteps=False,\n    ):\n        self.model_mean_type = model_mean_type\n        self.model_var_type = model_var_type\n        self.loss_type = loss_type\n        self.rescale_timesteps = rescale_timesteps\n\n        # Use float64 for accuracy.\n        betas = np.array(betas, dtype=np.float64)\n        self.betas = betas\n        assert len(betas.shape) == 1, \"betas must be 1-D\"\n        assert (betas > 0).all() and (betas <= 1).all()\n\n        self.num_timesteps = int(betas.shape[0])\n\n        alphas = 1.0 - betas\n        self.alphas_cumprod = np.cumprod(alphas, axis=0)\n        self.alphas_cumprod_prev = np.append(1.0, self.alphas_cumprod[:-1])\n        self.alphas_cumprod_next = np.append(self.alphas_cumprod[1:], 0.0)\n        assert self.alphas_cumprod_prev.shape == (self.num_timesteps, )\n\n        # calculations for diffusion q(x_t | x_{t-1}) and others\n        self.sqrt_alphas_cumprod = np.sqrt(self.alphas_cumprod)\n        self.sqrt_one_minus_alphas_cumprod = np.sqrt(1.0 - self.alphas_cumprod)\n        self.log_one_minus_alphas_cumprod = np.log(1.0 - self.alphas_cumprod)\n        self.sqrt_recip_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod)\n        self.sqrt_recipm1_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod - 1)\n\n        # calculations for posterior q(x_{t-1} | x_t, x_0)\n        self.posterior_variance = (betas * (1.0 - self.alphas_cumprod_prev) / (1.0 - 
self.alphas_cumprod))\n        # log calculation clipped because the posterior variance is 0 at the\n        # beginning of the diffusion chain.\n        self.posterior_log_variance_clipped = np.log(np.append(self.posterior_variance[1], self.posterior_variance[1:]))\n        self.posterior_mean_coef1 = (betas * np.sqrt(self.alphas_cumprod_prev) / (1.0 - self.alphas_cumprod))\n        self.posterior_mean_coef2 = ((1.0 - self.alphas_cumprod_prev) * np.sqrt(alphas) / (1.0 - self.alphas_cumprod))\n\n    def q_mean_variance(self, x_start, t):\n        \"\"\"\n        Get the distribution q(x_t | x_0).\n\n        :param x_start: the [N x C x ...] tensor of noiseless inputs.\n        :param t: the number of diffusion steps (minus 1). Here, 0 means one step.\n        :return: A tuple (mean, variance, log_variance), all of x_start's shape.\n        \"\"\"\n        mean = (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start)\n        variance = _extract_into_tensor(1.0 - self.alphas_cumprod, t, x_start.shape)\n        log_variance = _extract_into_tensor(self.log_one_minus_alphas_cumprod, t, x_start.shape)\n        return mean, variance, log_variance\n\n    def q_sample(self, x_start, t, noise=None):\n        \"\"\"\n        Diffuse the data for a given number of diffusion steps.\n\n        In other words, sample from q(x_t | x_0).\n\n        :param x_start: the initial data batch.\n        :param t: the number of diffusion steps (minus 1). 
Here, 0 means one step.\n        :param noise: if specified, the split-out normal noise.\n        :return: A noisy version of x_start.\n        \"\"\"\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        assert noise.shape == x_start.shape\n        return (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +\n                _extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)\n\n    def q_posterior_mean_variance(self, x_start, x_t, t):\n        \"\"\"\n        Compute the mean and variance of the diffusion posterior:\n\n            q(x_{t-1} | x_t, x_0)\n\n        \"\"\"\n        assert x_start.shape == x_t.shape\n        posterior_mean = (_extract_into_tensor(self.posterior_mean_coef1, t, x_t.shape) * x_start +\n                          _extract_into_tensor(self.posterior_mean_coef2, t, x_t.shape) * x_t)\n        posterior_variance = _extract_into_tensor(self.posterior_variance, t, x_t.shape)\n        posterior_log_variance_clipped = _extract_into_tensor(self.posterior_log_variance_clipped, t, x_t.shape)\n        assert (posterior_mean.shape[0] == posterior_variance.shape[0] == posterior_log_variance_clipped.shape[0] ==\n                x_start.shape[0])\n        return posterior_mean, posterior_variance, posterior_log_variance_clipped\n\n    def p_mean_variance(self, model, x, t, clip_denoised=True, denoised_fn=None, model_kwargs=None):\n        \"\"\"\n        Apply the model to get p(x_{t-1} | x_t), as well as a prediction of\n        the initial x, x_0.\n\n        :param model: the model, which takes a signal and a batch of timesteps\n                      as input.\n        :param x: the [N x C x ...] 
tensor at time t.\n        :param t: a 1-D Tensor of timesteps.\n        :param clip_denoised: if True, clip the denoised signal into [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample. Applies before\n            clip_denoised.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :return: a dict with the following keys:\n                 - 'mean': the model mean output.\n                 - 'variance': the model variance output.\n                 - 'log_variance': the log of 'variance'.\n                 - 'pred_xstart': the prediction for x_0.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n\n        B, C = x.shape[:2]\n        assert t.shape == [B]\n        model_output = model(x, self._scale_timesteps(t), **model_kwargs)\n\n        if self.model_var_type in [ModelVarType.LEARNED, ModelVarType.LEARNED_RANGE]:\n            assert model_output.shape == [B, C * 2, *x.shape[2:]]\n            model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n            if self.model_var_type == ModelVarType.LEARNED:\n                model_log_variance = model_var_values\n                model_variance = paddle.exp(model_log_variance)\n            else:\n                min_log = _extract_into_tensor(self.posterior_log_variance_clipped, t, x.shape)\n                max_log = _extract_into_tensor(np.log(self.betas), t, x.shape)\n                # The model_var_values is [-1, 1] for [min_var, max_var].\n                frac = (model_var_values + 1) / 2\n                model_log_variance = frac * max_log + (1 - frac) * min_log\n                model_variance = paddle.exp(model_log_variance)\n        else:\n            model_variance, model_log_variance = {\n                # for fixedlarge, we set the initial (log-)variance like so\n        
        # to get a better decoder log likelihood.\n                ModelVarType.FIXED_LARGE: (\n                    np.append(self.posterior_variance[1], self.betas[1:]),\n                    np.log(np.append(self.posterior_variance[1], self.betas[1:])),\n                ),\n                ModelVarType.FIXED_SMALL: (\n                    self.posterior_variance,\n                    self.posterior_log_variance_clipped,\n                ),\n            }[self.model_var_type]\n            model_variance = _extract_into_tensor(model_variance, t, x.shape)\n            model_log_variance = _extract_into_tensor(model_log_variance, t, x.shape)\n\n        def process_xstart(x):\n            if denoised_fn is not None:\n                x = denoised_fn(x)\n            if clip_denoised:\n                return x.clamp(-1, 1)\n            return x\n\n        if self.model_mean_type == ModelMeanType.PREVIOUS_X:\n            pred_xstart = process_xstart(self._predict_xstart_from_xprev(x_t=x, t=t, xprev=model_output))\n            model_mean = model_output\n        elif self.model_mean_type in [ModelMeanType.START_X, ModelMeanType.EPSILON]:\n            if self.model_mean_type == ModelMeanType.START_X:\n                pred_xstart = process_xstart(model_output)\n            else:\n                pred_xstart = process_xstart(self._predict_xstart_from_eps(x_t=x, t=t, eps=model_output))\n            model_mean, _, _ = self.q_posterior_mean_variance(x_start=pred_xstart, x_t=x, t=t)\n        else:\n            raise NotImplementedError(self.model_mean_type)\n\n        assert (model_mean.shape == model_log_variance.shape == pred_xstart.shape == x.shape)\n        return {\n            \"mean\": model_mean,\n            \"variance\": model_variance,\n            \"log_variance\": model_log_variance,\n            \"pred_xstart\": pred_xstart,\n        }\n\n    def _predict_xstart_from_eps(self, x_t, t, eps):\n        assert x_t.shape == eps.shape\n        return 
(_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape) * eps)\n\n    def _predict_xstart_from_xprev(self, x_t, t, xprev):\n        assert x_t.shape == xprev.shape\n        return (  # (xprev - coef2*x_t) / coef1\n            _extract_into_tensor(1.0 / self.posterior_mean_coef1, t, x_t.shape) * xprev -\n            _extract_into_tensor(self.posterior_mean_coef2 / self.posterior_mean_coef1, t, x_t.shape) * x_t)\n\n    def _predict_eps_from_xstart(self, x_t, t, pred_xstart):\n        return (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                pred_xstart) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape)\n\n    def _scale_timesteps(self, t):\n        if self.rescale_timesteps:\n            return paddle.cast((t), 'float32') * (1000.0 / self.num_timesteps)\n        return t\n\n    def condition_mean(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_mean_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. 
In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, t, p_mean_var, **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_score(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def condition_score_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, t, p_mean_var, 
**model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def p_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def p_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean_with_grad(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"].detach()}\n\n    def p_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model.\n\n        :param model: the model module.\n        :param shape: the shape of the samples, (N, C, H, W).\n        :param noise: if specified, the noise from the encoder to sample.\n                      Should be of the same shape as `shape`.\n        :param clip_denoised: if True, clip x_start predictions to [-1, 1].\n        :param denoised_fn: if not None, a function which applies 
to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param device: if specified, the device to create the samples on.\n                       If not specified, use a model parameter's device.\n        :param progress: if True, show a tqdm progress bar.\n        :return: a non-differentiable batch of samples.\n        \"\"\"\n        final = None\n        for sample in self.p_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def p_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model and yield intermediate samples from\n        each timestep of diffusion.\n\n        Arguments are the same as p_sample_loop().\n        Returns a generator over dicts, where each dict is the return value of\n        p_sample().\n        \"\"\"\n        if device is None:\n     
       device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            sample_fn = self.p_sample_with_grad if cond_fn_with_grad else self.p_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def ddim_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n      
  if cond_fn is not None:\n            out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"]}\n\n    def ddim_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        if cond_fn is not None:\n            out 
= self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        out[\"pred_xstart\"] = out[\"pred_xstart\"].detach()\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"].detach()}\n\n    def ddim_reverse_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t+1} from the model using DDIM reverse ODE.\n        \"\"\"\n        assert eta == 0.0, \"Reverse ODE only for deterministic path\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n      
  eps = (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x.shape) * x -\n               out[\"pred_xstart\"]) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x.shape)\n        alpha_bar_next = _extract_into_tensor(self.alphas_cumprod_next, t, x.shape)\n\n        # Equation 12. reversed\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_next) + paddle.sqrt(1 - alpha_bar_next) * eps)\n\n        return {\"sample\": mean_pred, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def ddim_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model using DDIM.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.ddim_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                eta=eta,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def ddim_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        
skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Use DDIM to sample from the model and yield intermediate samples from\n        each timestep of DDIM.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        # if device is None:\n        #     device = next(model.parameters()).device\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0])\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(\n                    low=0,\n                    high=model.num_classes,\n                    shape=model_kwargs['y'].shape,\n                )\n            sample_fn = self.ddim_sample_with_grad if cond_fn_with_grad else self.ddim_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                eta=eta,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def plms_sample(\n        self,\n        model,\n        
x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        cond_fn_with_grad=False,\n        order=2,\n        old_out=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample().\n        \"\"\"\n        if not isinstance(order, int) or not 1 <= order <= 4:\n            raise ValueError('order is invalid (should be int from 1-4).')\n\n        def get_model_output(x, t):\n            with paddle.set_grad_enabled(cond_fn_with_grad and cond_fn is not None):\n                if cond_fn_with_grad:\n                    # Paddle has no requires_grad_(); detach, then re-enable gradients.\n                    x = x.detach()\n                    x.stop_gradient = False\n                out_orig = self.p_mean_variance(\n                    model,\n                    x,\n                    t,\n                    clip_denoised=clip_denoised,\n                    denoised_fn=denoised_fn,\n                    model_kwargs=model_kwargs,\n                )\n                if cond_fn is not None:\n                    if cond_fn_with_grad:\n                        out = self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                        x = x.detach()\n                    else:\n                        out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                else:\n                    out = out_orig\n\n            # Usually our model outputs epsilon, but we re-derive it\n            # in case we used x_start or x_prev prediction.\n            eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n            return eps, out, out_orig\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        eps, out, out_orig = get_model_output(x, t)\n\n        if order > 1 and old_out is None:\n            # Pseudo Improved Euler\n            old_eps = [eps]\n            mean_pred = 
out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps\n            eps_2, _, _ = get_model_output(mean_pred, t - 1)\n            eps_prime = (eps + eps_2) / 2\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n        else:\n            # Pseudo Linear Multistep (Adams-Bashforth)\n            old_eps = old_out[\"old_eps\"]\n            old_eps.append(eps)\n            cur_order = min(order, len(old_eps))\n            if cur_order == 1:\n                eps_prime = old_eps[-1]\n            elif cur_order == 2:\n                eps_prime = (3 * old_eps[-1] - old_eps[-2]) / 2\n            elif cur_order == 3:\n                eps_prime = (23 * old_eps[-1] - 16 * old_eps[-2] + 5 * old_eps[-3]) / 12\n            elif cur_order == 4:\n                eps_prime = (55 * old_eps[-1] - 59 * old_eps[-2] + 37 * old_eps[-3] - 9 * old_eps[-4]) / 24\n            else:\n                raise RuntimeError('cur_order is invalid.')\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n\n        if len(old_eps) >= order:\n            old_eps.pop(0)\n\n        nonzero_mask = paddle.cast((t != 0), 'float32').reshape([-1, *([1] * (len(x.shape) - 1))])\n        sample = mean_pred * nonzero_mask + out[\"pred_xstart\"] * (1 - nonzero_mask)\n\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"], \"old_eps\": old_eps}\n\n    def plms_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        
cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Generate samples from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.plms_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def plms_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Use PLMS to sample from the model and yield intermediate samples from each\n        timestep of PLMS.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], 
dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        old_out = None\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            out = self.plms_sample(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n                old_out=old_out,\n            )\n            yield out\n            old_out = out\n            img = out[\"sample\"]\n\n    def _vb_terms_bpd(self, model, x_start, x_t, t, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Get a term for the variational lower-bound.\n\n        The resulting units are bits (rather than nats, as one might expect).\n        This allows for comparison to other papers.\n\n        :return: a dict with the following keys:\n                 - 'output': a shape [N] tensor of NLLs or KLs.\n                 - 'pred_xstart': the x_0 predictions.\n        \"\"\"\n        true_mean, _, true_log_variance_clipped = self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)\n        out = self.p_mean_variance(model, x_t, t, clip_denoised=clip_denoised, model_kwargs=model_kwargs)\n        kl = normal_kl(true_mean, true_log_variance_clipped, out[\"mean\"], out[\"log_variance\"])\n        kl = mean_flat(kl) / np.log(2.0)\n\n        decoder_nll = -discretized_gaussian_log_likelihood(\n            x_start, 
means=out[\"mean\"], log_scales=0.5 * out[\"log_variance\"])\n        assert decoder_nll.shape == x_start.shape\n        decoder_nll = mean_flat(decoder_nll) / np.log(2.0)\n\n        # At the first timestep return the decoder NLL,\n        # otherwise return KL(q(x_{t-1}|x_t,x_0) || p(x_{t-1}|x_t))\n        output = paddle.where((t == 0), decoder_nll, kl)\n        return {\"output\": output, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def training_losses(self, model, x_start, t, model_kwargs=None, noise=None):\n        \"\"\"\n        Compute training losses for a single timestep.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param t: a batch of timestep indices.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param noise: if specified, the specific Gaussian noise to try to remove.\n        :return: a dict with the key \"loss\" containing a tensor of shape [N].\n                 Some mean or variance settings may also have other keys.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        x_t = self.q_sample(x_start, t, noise=noise)\n\n        terms = {}\n\n        if self.loss_type == LossType.KL or self.loss_type == LossType.RESCALED_KL:\n            terms[\"loss\"] = self._vb_terms_bpd(\n                model=model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t,\n                clip_denoised=False,\n                model_kwargs=model_kwargs,\n            )[\"output\"]\n            if self.loss_type == LossType.RESCALED_KL:\n                terms[\"loss\"] *= self.num_timesteps\n        elif self.loss_type == LossType.MSE or self.loss_type == LossType.RESCALED_MSE:\n     
       model_output = model(x_t, self._scale_timesteps(t), **model_kwargs)\n\n            if self.model_var_type in [\n                    ModelVarType.LEARNED,\n                    ModelVarType.LEARNED_RANGE,\n            ]:\n                B, C = x_t.shape[:2]\n                assert model_output.shape == [B, C * 2, *x_t.shape[2:]]\n                model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n                # Learn the variance using the variational bound, but don't let\n                # it affect our mean prediction.\n                frozen_out = paddle.concat([model_output.detach(), model_var_values], axis=1)\n                terms[\"vb\"] = self._vb_terms_bpd(\n                    model=lambda *args, r=frozen_out: r,\n                    x_start=x_start,\n                    x_t=x_t,\n                    t=t,\n                    clip_denoised=False,\n                )[\"output\"]\n                if self.loss_type == LossType.RESCALED_MSE:\n                    # Divide by 1000 for equivalence with initial implementation.\n                    # Without a factor of 1/1000, the VB term hurts the MSE term.\n                    terms[\"vb\"] *= self.num_timesteps / 1000.0\n\n            target = {\n                ModelMeanType.PREVIOUS_X: self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)[0],\n                ModelMeanType.START_X: x_start,\n                ModelMeanType.EPSILON: noise,\n            }[self.model_mean_type]\n            assert model_output.shape == target.shape == x_start.shape\n            terms[\"mse\"] = mean_flat((target - model_output)**2)\n            if \"vb\" in terms:\n                terms[\"loss\"] = terms[\"mse\"] + terms[\"vb\"]\n            else:\n                terms[\"loss\"] = terms[\"mse\"]\n        else:\n            raise NotImplementedError(self.loss_type)\n\n        return terms\n\n    def _prior_bpd(self, x_start):\n        \"\"\"\n        Get the prior KL term for the variational 
lower-bound, measured in\n        bits-per-dim.\n\n        This term can't be optimized, as it only depends on the encoder.\n\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :return: a batch of [N] KL values (in bits), one per batch element.\n        \"\"\"\n        batch_size = x_start.shape[0]\n        t = paddle.to_tensor([self.num_timesteps - 1] * batch_size, place=x_start.place)\n        qt_mean, _, qt_log_variance = self.q_mean_variance(x_start, t)\n        kl_prior = normal_kl(mean1=qt_mean, logvar1=qt_log_variance, mean2=0.0, logvar2=0.0)\n        return mean_flat(kl_prior) / np.log(2.0)\n\n    def calc_bpd_loop(self, model, x_start, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Compute the entire variational lower-bound, measured in bits-per-dim,\n        as well as other related quantities.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param clip_denoised: if True, clip denoised samples.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n\n        :return: a dict containing the following keys:\n                 - total_bpd: the total variational lower-bound, per batch element.\n                 - prior_bpd: the prior term in the lower-bound.\n                 - vb: an [N x T] tensor of terms in the lower-bound.\n                 - xstart_mse: an [N x T] tensor of x_0 MSEs for each timestep.\n                 - mse: an [N x T] tensor of epsilon MSEs for each timestep.\n        \"\"\"\n        device = x_start.place\n        batch_size = x_start.shape[0]\n\n        vb = []\n        xstart_mse = []\n        mse = []\n        for t in list(range(self.num_timesteps))[::-1]:\n            t_batch = paddle.to_tensor([t] * batch_size, place=device)\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n            x_t = self.q_sample(x_start=x_start, t=t_batch, noise=noise)\n            # Calculate VLB term at the current timestep\n            # with paddle.no_grad():\n            out = self._vb_terms_bpd(\n                model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t_batch,\n                clip_denoised=clip_denoised,\n                model_kwargs=model_kwargs,\n            )\n            vb.append(out[\"output\"])\n            xstart_mse.append(mean_flat((out[\"pred_xstart\"] - x_start)**2))\n            eps = self._predict_eps_from_xstart(x_t, t_batch, out[\"pred_xstart\"])\n            mse.append(mean_flat((eps - noise)**2))\n\n        vb = paddle.stack(vb, axis=1)\n        xstart_mse = paddle.stack(xstart_mse, axis=1)\n        mse = paddle.stack(mse, axis=1)\n\n        prior_bpd = self._prior_bpd(x_start)\n        total_bpd = vb.sum(axis=1) + prior_bpd\n        return {\n            \"total_bpd\": total_bpd,\n            \"prior_bpd\": prior_bpd,\n            \"vb\": vb,\n            \"xstart_mse\": xstart_mse,\n            \"mse\": mse,\n        }\n\n\ndef 
_extract_into_tensor(arr, timesteps, broadcast_shape):\n    \"\"\"\n    Extract values from a 1-D numpy array for a batch of indices.\n\n    :param arr: the 1-D numpy array.\n    :param timesteps: a tensor of indices into the array to extract.\n    :param broadcast_shape: a larger shape of K dimensions with the batch\n                            dimension equal to the length of timesteps.\n    :return: a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n    \"\"\"\n    res = paddle.to_tensor(arr, place=timesteps.place)[timesteps]\n    while len(res.shape) < len(broadcast_shape):\n        res = res[..., None]\n    return res.expand(broadcast_shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/losses.py",
    "content": "\"\"\"\nHelpers for various likelihood-based losses implemented by Paddle. These are ported from the original\nHo et al. diffusion models codebase:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/utils.py\n\"\"\"\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef normal_kl(mean1, logvar1, mean2, logvar2):\n    \"\"\"\n    Compute the KL divergence between two gaussians.\n\n    Shapes are automatically broadcasted, so batches can be compared to\n    scalars, among other use cases.\n    \"\"\"\n    tensor = None\n    for obj in (mean1, logvar1, mean2, logvar2):\n        if isinstance(obj, paddle.Tensor):\n            tensor = obj\n            break\n    assert tensor is not None, \"at least one argument must be a Tensor\"\n\n    # Force variances to be Tensors. Broadcasting helps convert scalars to\n    # Tensors, but it does not work for th.exp().\n    logvar1, logvar2 = [x if isinstance(x, paddle.Tensor) else paddle.to_tensor(x) for x in (logvar1, logvar2)]\n\n    return 0.5 * (-1.0 + logvar2 - logvar1 + paddle.exp(logvar1 - logvar2) +\n                  ((mean1 - mean2)**2) * paddle.exp(-logvar2))\n\n\ndef approx_standard_normal_cdf(x):\n    \"\"\"\n    A fast approximation of the cumulative distribution function of the\n    standard normal.\n    \"\"\"\n    return 0.5 * (1.0 + paddle.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef discretized_gaussian_log_likelihood(x, *, means, log_scales):\n    \"\"\"\n    Compute the log-likelihood of a Gaussian distribution discretizing to a\n    given image.\n\n    :param x: the target images. 
It is assumed that this was uint8 values,\n              rescaled to the range [-1, 1].\n    :param means: the Gaussian mean Tensor.\n    :param log_scales: the Gaussian log stddev Tensor.\n    :return: a tensor like x of log probabilities (in nats).\n    \"\"\"\n    assert x.shape == means.shape == log_scales.shape\n    centered_x = x - means\n    inv_stdv = paddle.exp(-log_scales)\n    plus_in = inv_stdv * (centered_x + 1.0 / 255.0)\n    cdf_plus = approx_standard_normal_cdf(plus_in)\n    min_in = inv_stdv * (centered_x - 1.0 / 255.0)\n    cdf_min = approx_standard_normal_cdf(min_in)\n    log_cdf_plus = paddle.log(cdf_plus.clip(min=1e-12))\n    log_one_minus_cdf_min = paddle.log((1.0 - cdf_min).clip(min=1e-12))\n    cdf_delta = cdf_plus - cdf_min\n    log_probs = paddle.where(\n        x < -0.999,\n        log_cdf_plus,\n        paddle.where(x > 0.999, log_one_minus_cdf_min, paddle.log(cdf_delta.clip(min=1e-12))),\n    )\n    assert log_probs.shape == x.shape\n    return log_probs\n\n\ndef spherical_dist_loss(x, y):\n    x = F.normalize(x, axis=-1)\n    y = F.normalize(y, axis=-1)\n    return (x - y).norm(axis=-1).divide(paddle.to_tensor(2.0)).asin().pow(2).multiply(paddle.to_tensor(2.0))\n\n\ndef tv_loss(input):\n    \"\"\"L2 total variation loss, as in Mahendran et al.\"\"\"\n    input = F.pad(input, (0, 1, 0, 1), 'replicate')\n    x_diff = input[..., :-1, 1:] - input[..., :-1, :-1]\n    y_diff = input[..., 1:, :-1] - input[..., :-1, :-1]\n    return (x_diff**2 + y_diff**2).mean([1, 2, 3])\n\n\ndef range_loss(input):\n    return (input - input.clip(-1, 1)).pow(2).mean([1, 2, 3])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/make_cutouts.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/make_cutouts.py\n'''\nimport math\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_clip_vitb32.resize_right.resize_right import resize\nfrom paddle.nn import functional as F\n\nfrom . import transforms as T\n\nskip_augs = False  # @param{type: 'boolean'}\n\n\ndef sinc(x):\n    return paddle.where(x != 0, paddle.sin(math.pi * x) / (math.pi * x), paddle.ones_like(x))\n\n\ndef lanczos(x, a):\n    cond = paddle.logical_and(-a < x, x < a)\n    out = paddle.where(cond, sinc(x) * sinc(x / a), paddle.zeros_like(x))\n    return out / out.sum()\n\n\ndef ramp(ratio, width):\n    n = math.ceil(width / ratio + 1)\n    out = paddle.empty([n])\n    cur = 0\n    for i in range(out.shape[0]):\n        out[i] = cur\n        cur += ratio\n    return paddle.concat([-out[1:].flip([0]), out])[1:-1]\n\n\nclass MakeCutouts(nn.Layer):\n\n    def __init__(self, cut_size, cutn, skip_augs=False):\n        super().__init__()\n        self.cut_size = cut_size\n        self.cutn = cutn\n        self.skip_augs = skip_augs\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomPerspective(distortion_scale=0.4, p=0.7),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.15),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        input = T.Pad(input.shape[2] // 4, fill=0)(input)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n\n        cutouts = []\n        for ch in range(self.cutn):\n   
         if ch > self.cutn - self.cutn // 4:\n                cutout = input.clone()\n            else:\n                size = int(max_size *\n                           paddle.normal(mean=0.8, std=0.3, shape=[1]).clip(float(self.cut_size / max_size), 1.0))\n                offsetx = paddle.randint(0, abs(sideX - size + 1), ())\n                offsety = paddle.randint(0, abs(sideY - size + 1), ())\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n\n            if not self.skip_augs:\n                cutout = self.augs(cutout)\n            cutouts.append(resample(cutout, (self.cut_size, self.cut_size)))\n            del cutout\n\n        cutouts = paddle.concat(cutouts, axis=0)\n        return cutouts\n\n\nclass MakeCutoutsDango(nn.Layer):\n\n    def __init__(self, cut_size, Overview=4, InnerCrop=0, IC_Size_Pow=0.5, IC_Grey_P=0.2):\n        super().__init__()\n        self.cut_size = cut_size\n        self.Overview = Overview\n        self.InnerCrop = InnerCrop\n        self.IC_Size_Pow = IC_Size_Pow\n        self.IC_Grey_P = IC_Grey_P\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(\n                degrees=10,\n                translate=(0.05, 0.05),\n                interpolation=T.InterpolationMode.BILINEAR,\n            ),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.1),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        cutouts = []\n        gray = T.Grayscale(3)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n        min_size = min(sideX, sideY, self.cut_size)\n        output_shape = [1, 3, self.cut_size, self.cut_size]\n        pad_input = F.pad(\n     
       input,\n            (\n                (sideY - max_size) // 2,\n                (sideY - max_size) // 2,\n                (sideX - max_size) // 2,\n                (sideX - max_size) // 2,\n            ),\n            **padargs,\n        )\n        cutout = resize(pad_input, out_shape=output_shape)\n\n        if self.Overview > 0:\n            if self.Overview <= 4:\n                if self.Overview >= 1:\n                    cutouts.append(cutout)\n                if self.Overview >= 2:\n                    cutouts.append(gray(cutout))\n                if self.Overview >= 3:\n                    cutouts.append(cutout[:, :, :, ::-1])\n                if self.Overview == 4:\n                    cutouts.append(gray(cutout[:, :, :, ::-1]))\n            else:\n                cutout = resize(pad_input, out_shape=output_shape)\n                for _ in range(self.Overview):\n                    cutouts.append(cutout)\n\n        if self.InnerCrop > 0:\n            for i in range(self.InnerCrop):\n                size = int(paddle.rand([1])**self.IC_Size_Pow * (max_size - min_size) + min_size)\n                offsetx = paddle.randint(0, sideX - size + 1)\n                offsety = paddle.randint(0, sideY - size + 1)\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n                if i <= int(self.IC_Grey_P * self.InnerCrop):\n                    cutout = gray(cutout)\n                cutout = resize(cutout, out_shape=output_shape)\n                cutouts.append(cutout)\n\n        cutouts = paddle.concat(cutouts)\n        if skip_augs is not True:\n            cutouts = self.augs(cutouts)\n        return cutouts\n\n\ndef resample(input, size, align_corners=True):\n    n, c, h, w = input.shape\n    dh, dw = size\n\n    input = input.reshape([n * c, 1, h, w])\n\n    if dh < h:\n        kernel_h = lanczos(ramp(dh / h, 2), 2).astype(input.dtype)\n        pad_h = (kernel_h.shape[0] - 1) // 2\n        input = F.pad(input, 
(0, 0, pad_h, pad_h), 'reflect')\n        input = F.conv2d(input, kernel_h[None, None, :, None])\n\n    if dw < w:\n        kernel_w = lanczos(ramp(dw / w, 2), 2).astype(input.dtype)\n        pad_w = (kernel_w.shape[0] - 1) // 2\n        input = F.pad(input, (pad_w, pad_w, 0, 0), 'reflect')\n        input = F.conv2d(input, kernel_w[None, None, None, :])\n\n    input = input.reshape([n, c, h, w])\n    return F.interpolate(input, size, mode='bicubic', align_corners=align_corners)\n\n\npadargs = {}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/nn.py",
    "content": "\"\"\"\nVarious utilities for neural networks implemented in Paddle. This code is rewritten based on:\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/nn.py\n\"\"\"\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\n\nclass SiLU(nn.Layer):\n\n    def forward(self, x):\n        return x * nn.functional.sigmoid(x)\n\n\nclass GroupNorm32(nn.GroupNorm):\n\n    def forward(self, x):\n        return super().forward(x)\n\n\ndef conv_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D convolution module.\n    \"\"\"\n    if dims == 1:\n        return nn.Conv1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.Conv2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.Conv3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef linear(*args, **kwargs):\n    \"\"\"\n    Create a linear module.\n    \"\"\"\n    return nn.Linear(*args, **kwargs)\n\n\ndef avg_pool_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D average pooling module.\n    \"\"\"\n    if dims == 1:\n        return nn.AvgPool1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.AvgPool2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.AvgPool3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef update_ema(target_params, source_params, rate=0.99):\n    \"\"\"\n    Update target parameters to be closer to those of source parameters using\n    an exponential moving average.\n\n    :param target_params: the target parameter sequence.\n    :param source_params: the source parameter sequence.\n    :param rate: the EMA rate (closer to 1 means slower).\n    \"\"\"\n    for targ, src in zip(target_params, source_params):\n        # Paddle has no in-place `mul_`/`add_(..., alpha=...)`; update the target in place via set_value.\n        targ.set_value(targ * rate + src * (1 - rate))\n\n\ndef zero_module(module):\n    \"\"\"\n    Zero out the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().zero_()\n    
return module\n\n\ndef scale_module(module, scale):\n    \"\"\"\n    Scale the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().mul_(scale)\n    return module\n\n\ndef mean_flat(tensor):\n    \"\"\"\n    Take the mean over all non-batch dimensions.\n    \"\"\"\n    return tensor.mean(axis=list(range(1, len(tensor.shape))))\n\n\ndef normalization(channels):\n    \"\"\"\n    Make a standard normalization layer.\n\n    :param channels: number of input channels.\n    :return: an nn.Layer for normalization.\n    \"\"\"\n    return GroupNorm32(32, channels)\n\n\ndef timestep_embedding(timesteps, dim, max_period=10000):\n    \"\"\"\n    Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param dim: the dimension of the output.\n    :param max_period: controls the minimum frequency of the embeddings.\n    :return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    half = dim // 2\n    freqs = paddle.exp(-math.log(max_period) * paddle.arange(start=0, end=half, dtype=paddle.float32) / half)\n    args = paddle.cast(timesteps[:, None], 'float32') * freqs[None]\n    embedding = paddle.concat([paddle.cos(args), paddle.sin(args)], axis=-1)\n    if dim % 2:\n        embedding = paddle.concat([embedding, paddle.zeros_like(embedding[:, :1])], axis=-1)\n    return embedding\n\n\ndef checkpoint(func, inputs, params, flag):\n    \"\"\"\n    Gradient checkpointing is disabled here; this simply calls func directly.\n    \"\"\"\n    return func(*inputs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/perlin_noises.py",
    "content": "'''\nPerlin noise implementation by Paddle.\nThis code is rewritten based on:\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/perlin_noises.py\n'''\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as TF\nfrom PIL import Image\nfrom PIL import ImageOps\n\n\ndef interp(t):\n    return 3 * t**2 - 2 * t**3\n\n\ndef perlin(width, height, scale=10):\n    gx, gy = paddle.randn([2, width + 1, height + 1, 1, 1])\n    xs = paddle.linspace(0, 1, scale + 1)[:-1, None]\n    ys = paddle.linspace(0, 1, scale + 1)[None, :-1]\n    wx = 1 - interp(xs)\n    wy = 1 - interp(ys)\n    dots = 0\n    dots += wx * wy * (gx[:-1, :-1] * xs + gy[:-1, :-1] * ys)\n    dots += (1 - wx) * wy * (-gx[1:, :-1] * (1 - xs) + gy[1:, :-1] * ys)\n    dots += wx * (1 - wy) * (gx[:-1, 1:] * xs - gy[:-1, 1:] * (1 - ys))\n    dots += (1 - wx) * (1 - wy) * (-gx[1:, 1:] * (1 - xs) - gy[1:, 1:] * (1 - ys))\n    return dots.transpose([0, 2, 1, 3]).reshape([width * scale, height * scale])\n\n\ndef perlin_ms(octaves, width, height, grayscale):\n    out_array = [0.5] if grayscale else [0.5, 0.5, 0.5]\n    # out_array = [0.0] if grayscale else [0.0, 0.0, 0.0]\n    for i in range(1 if grayscale else 3):\n        scale = 2**len(octaves)\n        oct_width = width\n        oct_height = height\n        for oct in octaves:\n            p = perlin(oct_width, oct_height, scale)\n            out_array[i] += p * oct\n            scale //= 2\n            oct_width *= 2\n            oct_height *= 2\n    return paddle.concat(out_array)\n\n\ndef create_perlin_noise(octaves, width, height, grayscale, side_y, side_x):\n    out = perlin_ms(octaves, width, height, grayscale)\n    if grayscale:\n        out = TF.resize(size=(side_y, side_x), img=out.numpy())\n        out = np.uint8(out)\n        out = Image.fromarray(out).convert('RGB')\n    else:\n        out = out.reshape([-1, 3, out.shape[0] // 3, out.shape[1]])\n        out = out.squeeze().transpose([1, 2, 0]).numpy()\n        out = 
TF.resize(size=(side_y, side_x), img=out)\n        out = out.clip(0, 1) * 255\n        out = np.uint8(out)\n        out = Image.fromarray(out)\n\n    out = ImageOps.autocontrast(out)\n    return out\n\n\ndef regen_perlin(perlin_mode, side_y, side_x, batch_size):\n    if perlin_mode == 'color':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n    elif perlin_mode == 'gray':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n    else:\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n\n    init = (TF.to_tensor(init).add(TF.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n    del init2\n    return init.expand([batch_size, -1, -1, -1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/respace.py",
    "content": "'''\nThis code is rewritten in Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/respace.py\n'''\nimport numpy as np\nimport paddle\n\nfrom .gaussian_diffusion import GaussianDiffusion\n\n\ndef space_timesteps(num_timesteps, section_counts):\n    \"\"\"\n    Create a list of timesteps to use from an original diffusion process,\n    given the number of timesteps we want to take from equally-sized portions\n    of the original process.\n\n    For example, if there are 300 timesteps and the section counts are [10,15,20]\n    then the first 100 timesteps are strided to be 10 timesteps, the second 100\n    are strided to be 15 timesteps, and the final 100 are strided to be 20.\n\n    If the stride is a string starting with \"ddim\", then the fixed striding\n    from the DDIM paper is used, and only one section is allowed.\n\n    :param num_timesteps: the number of diffusion steps in the original\n                          process to divide up.\n    :param section_counts: either a list of numbers, or a string containing\n                           comma-separated numbers, indicating the step count\n                           per section. 
As a special case, use \"ddimN\" where N\n                           is a number of steps to use the striding from the\n                           DDIM paper.\n    :return: a set of diffusion steps from the original process to use.\n    \"\"\"\n    if isinstance(section_counts, str):\n        if section_counts.startswith(\"ddim\"):\n            desired_count = int(section_counts[len(\"ddim\"):])\n            for i in range(1, num_timesteps):\n                if len(range(0, num_timesteps, i)) == desired_count:\n                    return set(range(0, num_timesteps, i))\n            raise ValueError(f\"cannot create exactly {desired_count} steps with an integer stride\")\n        section_counts = [int(x) for x in section_counts.split(\",\")]\n    size_per = num_timesteps // len(section_counts)\n    extra = num_timesteps % len(section_counts)\n    start_idx = 0\n    all_steps = []\n    for i, section_count in enumerate(section_counts):\n        size = size_per + (1 if i < extra else 0)\n        if size < section_count:\n            raise ValueError(f\"cannot divide section of {size} steps into {section_count}\")\n        if section_count <= 1:\n            frac_stride = 1\n        else:\n            frac_stride = (size - 1) / (section_count - 1)\n        cur_idx = 0.0\n        taken_steps = []\n        for _ in range(section_count):\n            taken_steps.append(start_idx + round(cur_idx))\n            cur_idx += frac_stride\n        all_steps += taken_steps\n        start_idx += size\n    return set(all_steps)\n\n\nclass SpacedDiffusion(GaussianDiffusion):\n    \"\"\"\n    A diffusion process which can skip steps in a base diffusion process.\n\n    :param use_timesteps: a collection (sequence or set) of timesteps from the\n                          original diffusion process to retain.\n    :param kwargs: the kwargs to create the base diffusion process.\n    \"\"\"\n\n    def __init__(self, use_timesteps, **kwargs):\n        self.use_timesteps = 
set(use_timesteps)\n        self.timestep_map = []\n        self.original_num_steps = len(kwargs[\"betas\"])\n\n        base_diffusion = GaussianDiffusion(**kwargs)  # pylint: disable=missing-kwoa\n        last_alpha_cumprod = 1.0\n        new_betas = []\n        for i, alpha_cumprod in enumerate(base_diffusion.alphas_cumprod):\n            if i in self.use_timesteps:\n                new_betas.append(1 - alpha_cumprod / last_alpha_cumprod)\n                last_alpha_cumprod = alpha_cumprod\n                self.timestep_map.append(i)\n        kwargs[\"betas\"] = np.array(new_betas)\n        super().__init__(**kwargs)\n\n    def p_mean_variance(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().p_mean_variance(self._wrap_model(model), *args, **kwargs)\n\n    def training_losses(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().training_losses(self._wrap_model(model), *args, **kwargs)\n\n    def condition_mean(self, cond_fn, *args, **kwargs):\n        return super().condition_mean(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def condition_score(self, cond_fn, *args, **kwargs):\n        return super().condition_score(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def _wrap_model(self, model):\n        if isinstance(model, _WrappedModel):\n            return model\n        return _WrappedModel(model, self.timestep_map, self.rescale_timesteps, self.original_num_steps)\n\n    def _scale_timesteps(self, t):\n        # Scaling is done by the wrapped model.\n        return t\n\n\nclass _WrappedModel:\n\n    def __init__(self, model, timestep_map, rescale_timesteps, original_num_steps):\n        self.model = model\n        self.timestep_map = timestep_map\n        self.rescale_timesteps = rescale_timesteps\n        self.original_num_steps = original_num_steps\n\n    def __call__(self, x, ts, **kwargs):\n        map_tensor = paddle.to_tensor(self.timestep_map, place=ts.place, 
dtype=ts.dtype)\n        new_ts = map_tensor[ts]\n        if self.rescale_timesteps:\n            new_ts = paddle.cast(new_ts, 'float32') * (1000.0 / self.original_num_steps)\n        return self.model(x, new_ts, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/script_util.py",
    "content": "'''\nThis code is based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/script_util.py\n'''\nimport argparse\nimport inspect\n\nfrom . import gaussian_diffusion as gd\nfrom .respace import space_timesteps\nfrom .respace import SpacedDiffusion\nfrom .unet import EncoderUNetModel\nfrom .unet import SuperResModel\nfrom .unet import UNetModel\n\nNUM_CLASSES = 1000\n\n\ndef diffusion_defaults():\n    \"\"\"\n    Defaults for image and classifier training.\n    \"\"\"\n    return dict(\n        learn_sigma=False,\n        diffusion_steps=1000,\n        noise_schedule=\"linear\",\n        timestep_respacing=\"\",\n        use_kl=False,\n        predict_xstart=False,\n        rescale_timesteps=False,\n        rescale_learned_sigmas=False,\n    )\n\n\ndef model_and_diffusion_defaults():\n    \"\"\"\n    Defaults for image training.\n    \"\"\"\n    res = dict(\n        image_size=64,\n        num_channels=128,\n        num_res_blocks=2,\n        num_heads=4,\n        num_heads_upsample=-1,\n        num_head_channels=-1,\n        attention_resolutions=\"16,8\",\n        channel_mult=\"\",\n        dropout=0.0,\n        class_cond=False,\n        use_checkpoint=False,\n        use_scale_shift_norm=True,\n        resblock_updown=False,\n        use_fp16=False,\n        use_new_attention_order=False,\n    )\n    res.update(diffusion_defaults())\n    return res\n\n\ndef create_model_and_diffusion(\n    image_size,\n    class_cond,\n    learn_sigma,\n    num_channels,\n    num_res_blocks,\n    channel_mult,\n    num_heads,\n    num_head_channels,\n    num_heads_upsample,\n    attention_resolutions,\n    dropout,\n    diffusion_steps,\n    noise_schedule,\n    timestep_respacing,\n    use_kl,\n    predict_xstart,\n    rescale_timesteps,\n    rescale_learned_sigmas,\n    use_checkpoint,\n    use_scale_shift_norm,\n    resblock_updown,\n    use_fp16,\n    use_new_attention_order,\n):\n    model = create_model(\n        image_size,\n        
num_channels,\n        num_res_blocks,\n        channel_mult=channel_mult,\n        learn_sigma=learn_sigma,\n        class_cond=class_cond,\n        use_checkpoint=use_checkpoint,\n        attention_resolutions=attention_resolutions,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        dropout=dropout,\n        resblock_updown=resblock_updown,\n        use_fp16=use_fp16,\n        use_new_attention_order=use_new_attention_order,\n    )\n    diffusion = create_gaussian_diffusion(\n        steps=diffusion_steps,\n        learn_sigma=learn_sigma,\n        noise_schedule=noise_schedule,\n        use_kl=use_kl,\n        predict_xstart=predict_xstart,\n        rescale_timesteps=rescale_timesteps,\n        rescale_learned_sigmas=rescale_learned_sigmas,\n        timestep_respacing=timestep_respacing,\n    )\n    return model, diffusion\n\n\ndef create_model(\n    image_size,\n    num_channels,\n    num_res_blocks,\n    channel_mult=\"\",\n    learn_sigma=False,\n    class_cond=False,\n    use_checkpoint=False,\n    attention_resolutions=\"16\",\n    num_heads=1,\n    num_head_channels=-1,\n    num_heads_upsample=-1,\n    use_scale_shift_norm=False,\n    dropout=0,\n    resblock_updown=False,\n    use_fp16=False,\n    use_new_attention_order=False,\n):\n    if channel_mult == \"\":\n        if image_size == 512:\n            channel_mult = (0.5, 1, 1, 2, 2, 4, 4)\n        elif image_size == 256:\n            channel_mult = (1, 1, 2, 2, 4, 4)\n        elif image_size == 128:\n            channel_mult = (1, 1, 2, 3, 4)\n        elif image_size == 64:\n            channel_mult = (1, 2, 3, 4)\n        else:\n            raise ValueError(f\"unsupported image size: {image_size}\")\n    else:\n        channel_mult = tuple(int(ch_mult) for ch_mult in channel_mult.split(\",\"))\n\n    attention_ds = []\n    for res in 
attention_resolutions.split(\",\"):\n        attention_ds.append(image_size // int(res))\n\n    return UNetModel(\n        image_size=image_size,\n        in_channels=3,\n        model_channels=num_channels,\n        out_channels=(3 if not learn_sigma else 6),\n        num_res_blocks=num_res_blocks,\n        attention_resolutions=tuple(attention_ds),\n        dropout=dropout,\n        channel_mult=channel_mult,\n        num_classes=(NUM_CLASSES if class_cond else None),\n        use_checkpoint=use_checkpoint,\n        use_fp16=use_fp16,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        resblock_updown=resblock_updown,\n        use_new_attention_order=use_new_attention_order,\n    )\n\n\ndef create_gaussian_diffusion(\n    *,\n    steps=1000,\n    learn_sigma=False,\n    sigma_small=False,\n    noise_schedule=\"linear\",\n    use_kl=False,\n    predict_xstart=False,\n    rescale_timesteps=False,\n    rescale_learned_sigmas=False,\n    timestep_respacing=\"\",\n):\n    betas = gd.get_named_beta_schedule(noise_schedule, steps)\n    if use_kl:\n        loss_type = gd.LossType.RESCALED_KL\n    elif rescale_learned_sigmas:\n        loss_type = gd.LossType.RESCALED_MSE\n    else:\n        loss_type = gd.LossType.MSE\n    if not timestep_respacing:\n        timestep_respacing = [steps]\n    return SpacedDiffusion(\n        use_timesteps=space_timesteps(steps, timestep_respacing),\n        betas=betas,\n        model_mean_type=(gd.ModelMeanType.EPSILON if not predict_xstart else gd.ModelMeanType.START_X),\n        model_var_type=((gd.ModelVarType.FIXED_LARGE if not sigma_small else gd.ModelVarType.FIXED_SMALL)\n                        if not learn_sigma else gd.ModelVarType.LEARNED_RANGE),\n        loss_type=loss_type,\n        rescale_timesteps=rescale_timesteps,\n    )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/sec_diff.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/sec_diff.py\n'''\nimport math\nfrom dataclasses import dataclass\nfrom functools import partial\n\nimport paddle\nimport paddle.nn as nn\n\n\n@dataclass\nclass DiffusionOutput:\n    v: paddle.Tensor\n    pred: paddle.Tensor\n    eps: paddle.Tensor\n\n\nclass SkipBlock(nn.Layer):\n\n    def __init__(self, main, skip=None):\n        super().__init__()\n        self.main = nn.Sequential(*main)\n        self.skip = skip if skip else nn.Identity()\n\n    def forward(self, input):\n        return paddle.concat([self.main(input), self.skip(input)], axis=1)\n\n\ndef append_dims(x, n):\n    return x[(Ellipsis, *(None, ) * (n - x.ndim))]\n\n\ndef expand_to_planes(x, shape):\n    return paddle.tile(append_dims(x, len(shape)), [1, 1, *shape[2:]])\n\n\ndef alpha_sigma_to_t(alpha, sigma):\n    return paddle.atan2(sigma, alpha) * 2 / math.pi\n\n\ndef t_to_alpha_sigma(t):\n    return paddle.cos(t * math.pi / 2), paddle.sin(t * math.pi / 2)\n\n\nclass SecondaryDiffusionImageNet2(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        c = 64  # The base channel count\n        cs = [c, c * 2, c * 2, c * 4, c * 4, c * 8]\n\n        self.timestep_embed = FourierFeatures(1, 16)\n        self.down = nn.AvgPool2D(2)\n        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)\n\n        self.net = nn.Sequential(\n            ConvBlock(3 + 16, cs[0]),\n            ConvBlock(cs[0], cs[0]),\n            SkipBlock([\n                self.down,\n                ConvBlock(cs[0], cs[1]),\n                ConvBlock(cs[1], cs[1]),\n                SkipBlock([\n                    self.down,\n                    ConvBlock(cs[1], cs[2]),\n                    ConvBlock(cs[2], cs[2]),\n                    SkipBlock([\n                        self.down,\n                        ConvBlock(cs[2], cs[3]),\n                        
ConvBlock(cs[3], cs[3]),\n                        SkipBlock([\n                            self.down,\n                            ConvBlock(cs[3], cs[4]),\n                            ConvBlock(cs[4], cs[4]),\n                            SkipBlock([\n                                self.down,\n                                ConvBlock(cs[4], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[4]),\n                                self.up,\n                            ]),\n                            ConvBlock(cs[4] * 2, cs[4]),\n                            ConvBlock(cs[4], cs[3]),\n                            self.up,\n                        ]),\n                        ConvBlock(cs[3] * 2, cs[3]),\n                        ConvBlock(cs[3], cs[2]),\n                        self.up,\n                    ]),\n                    ConvBlock(cs[2] * 2, cs[2]),\n                    ConvBlock(cs[2], cs[1]),\n                    self.up,\n                ]),\n                ConvBlock(cs[1] * 2, cs[1]),\n                ConvBlock(cs[1], cs[0]),\n                self.up,\n            ]),\n            ConvBlock(cs[0] * 2, cs[0]),\n            nn.Conv2D(cs[0], 3, 3, padding=1),\n        )\n\n    def forward(self, input, t):\n        timestep_embed = expand_to_planes(self.timestep_embed(t[:, None]), input.shape)\n        v = self.net(paddle.concat([input, timestep_embed], axis=1))\n        alphas, sigmas = map(partial(append_dims, n=v.ndim), t_to_alpha_sigma(t))\n        pred = input * alphas - v * sigmas\n        eps = input * sigmas + v * alphas\n        return DiffusionOutput(v, pred, eps)\n\n\nclass FourierFeatures(nn.Layer):\n\n    def __init__(self, in_features, out_features, std=1.0):\n        super().__init__()\n        assert out_features % 2 == 0\n        # self.weight = nn.Parameter(paddle.randn([out_features // 2, in_features]) * std)\n      
  self.weight = paddle.create_parameter([out_features // 2, in_features],\n                                              dtype='float32',\n                                              default_initializer=nn.initializer.Normal(mean=0.0, std=std))\n\n    def forward(self, input):\n        f = 2 * math.pi * input @ self.weight.T\n        return paddle.concat([f.cos(), f.sin()], axis=-1)\n\n\nclass ConvBlock(nn.Sequential):\n\n    def __init__(self, c_in, c_out):\n        super().__init__(\n            nn.Conv2D(c_in, c_out, 3, padding=1),\n            nn.ReLU(),\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/transforms.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/pytorch/vision/blob/main/torchvision/transforms/transforms.py\n'''\nimport math\nimport numbers\nimport warnings\nfrom enum import Enum\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn.functional import grid_sample\nfrom paddle.vision import transforms as T\n\n\nclass Normalize(nn.Layer):\n\n    def __init__(self, mean, std):\n        super(Normalize, self).__init__()\n        self.mean = paddle.to_tensor(mean)\n        self.std = paddle.to_tensor(std)\n\n    def forward(self, tensor: Tensor):\n        dtype = tensor.dtype\n        mean = paddle.to_tensor(self.mean, dtype=dtype)\n        std = paddle.to_tensor(self.std, dtype=dtype)\n        mean = mean.reshape([1, -1, 1, 1])\n        std = std.reshape([1, -1, 1, 1])\n        result = tensor.subtract(mean).divide(std)\n        return result\n\n\nclass InterpolationMode(Enum):\n    \"\"\"Interpolation modes\n    Available interpolation methods are ``nearest``, ``bilinear``, ``bicubic``, ``box``, ``hamming``, and ``lanczos``.\n    \"\"\"\n\n    NEAREST = \"nearest\"\n    BILINEAR = \"bilinear\"\n    BICUBIC = \"bicubic\"\n    # For PIL compatibility\n    BOX = \"box\"\n    HAMMING = \"hamming\"\n    LANCZOS = \"lanczos\"\n\n\nclass Grayscale(nn.Layer):\n\n    def __init__(self, num_output_channels):\n        super(Grayscale, self).__init__()\n        self.num_output_channels = num_output_channels\n\n    def forward(self, x):\n        output = (0.2989 * x[:, 0:1, :, :] + 0.587 * x[:, 1:2, :, :] + 0.114 * x[:, 2:3, :, :])\n        if self.num_output_channels == 3:\n            return output.expand(x.shape)\n\n        return output\n\n\nclass Lambda(nn.Layer):\n\n    def __init__(self, 
func):\n        super(Lambda, self).__init__()\n        self.transform = func\n\n    def forward(self, x):\n        return self.transform(x)\n\n\nclass RandomGrayscale(nn.Layer):\n\n    def __init__(self, p):\n        super(RandomGrayscale, self).__init__()\n        self.prob = p\n        self.transform = Grayscale(3)\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return self.transform(x)\n        else:\n            return x\n\n\nclass RandomHorizontalFlip(nn.Layer):\n\n    def __init__(self, prob):\n        super(RandomHorizontalFlip, self).__init__()\n        self.prob = prob\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return x[:, :, :, ::-1]\n        else:\n            return x\n\n\ndef _blend(img1: Tensor, img2: Tensor, ratio: float) -> Tensor:\n    ratio = float(ratio)\n    bound = 1.0\n    return (ratio * img1 + (1.0 - ratio) * img2).clip(0, bound)\n\n\ndef trunc_div(a, b):\n    ipt = paddle.divide(a, b)\n    sign_ipt = paddle.sign(ipt)\n    abs_ipt = paddle.abs(ipt)\n    abs_ipt = paddle.floor(abs_ipt)\n    out = paddle.multiply(sign_ipt, abs_ipt)\n    return out\n\n\ndef fmod(a, b):\n    return a - trunc_div(a, b) * b\n\n\ndef _rgb2hsv(img: Tensor) -> Tensor:\n    r, g, b = img.unbind(axis=-3)\n\n    # Implementation is based on https://github.com/python-pillow/Pillow/blob/4174d4267616897df3746d315d5a2d0f82c656ee/\n    # src/libImaging/Convert.c#L330\n    maxc = paddle.max(img, axis=-3)\n    minc = paddle.min(img, axis=-3)\n\n    # The algorithm erases S and H channel where `maxc = minc`. 
This avoids NaN\n    # from happening in the results, because\n    #   + S channel has division by `maxc`, which is zero only if `maxc = minc`\n    #   + H channel has division by `(maxc - minc)`.\n    #\n    # Instead of overwriting NaN afterwards, we just prevent it from occurring so\n    # we don't need to deal with it in case we save the NaN in a buffer in\n    # backprop, if it is ever supported, but it doesn't hurt to do so.\n    eqc = maxc == minc\n\n    cr = maxc - minc\n    # Since `eqc => cr = 0`, replacing the denominator with 1 where `eqc` is true is fine.\n    ones = paddle.ones_like(maxc)\n    s = cr / paddle.where(eqc, ones, maxc)\n    # Note that `eqc => maxc = minc = r = g = b`. So the following calculation\n    # of `h` would reduce to `bc - gc + 2 + rc - bc + 4 + rc - bc = 6` so it\n    # would not matter what values `rc`, `gc`, and `bc` have here, and thus\n    # replacing the denominator with 1 where `eqc` is true is fine.\n    cr_divisor = paddle.where(eqc, ones, cr)\n    rc = (maxc - r) / cr_divisor\n    gc = (maxc - g) / cr_divisor\n    bc = (maxc - b) / cr_divisor\n\n    hr = (maxc == r).cast('float32') * (bc - gc)\n    hg = ((maxc == g) & (maxc != r)).cast('float32') * (2.0 + rc - bc)\n    hb = ((maxc != g) & (maxc != r)).cast('float32') * (4.0 + gc - rc)\n    h = hr + hg + hb\n    h = fmod((h / 6.0 + 1.0), paddle.to_tensor(1.0))\n    return paddle.stack((h, s, maxc), axis=-3)\n\n\ndef _hsv2rgb(img: Tensor) -> Tensor:\n    h, s, v = img.unbind(axis=-3)\n    i = paddle.floor(h * 6.0)\n    f = (h * 6.0) - i\n    i = i.cast(dtype='int32')\n\n    p = paddle.clip((v * (1.0 - s)), 0.0, 1.0)\n    q = paddle.clip((v * (1.0 - s * f)), 0.0, 1.0)\n    t = paddle.clip((v * (1.0 - s * (1.0 - f))), 0.0, 1.0)\n    i = i % 6\n\n    mask = i.unsqueeze(axis=-3) == paddle.arange(6).reshape([-1, 1, 1])\n\n    a1 = paddle.stack((v, q, p, p, t, v), axis=-3)\n    a2 = paddle.stack((t, v, v, q, p, p), axis=-3)\n    a3 = paddle.stack((p, p, t, v, v, q), axis=-3)\n    a4 = paddle.stack((a1, 
a2, a3), axis=-4)\n\n    return paddle.einsum(\"...ijk, ...xijk -> ...xjk\", mask.cast(dtype=img.dtype), a4)\n\n\ndef adjust_brightness(img: Tensor, brightness_factor: float) -> Tensor:\n    if brightness_factor < 0:\n        raise ValueError(f\"brightness_factor ({brightness_factor}) is not non-negative.\")\n\n    return _blend(img, paddle.zeros_like(img), brightness_factor)\n\n\ndef adjust_contrast(img: Tensor, contrast_factor: float) -> Tensor:\n    if contrast_factor < 0:\n        raise ValueError(f\"contrast_factor ({contrast_factor}) is not non-negative.\")\n\n    c = img.shape[1]\n\n    if c == 3:\n        output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n        mean = paddle.mean(output, axis=(-3, -2, -1), keepdim=True)\n\n    else:\n        mean = paddle.mean(img, axis=(-3, -2, -1), keepdim=True)\n\n    return _blend(img, mean, contrast_factor)\n\n\ndef adjust_hue(img: Tensor, hue_factor: float) -> Tensor:\n    if not (-0.5 <= hue_factor <= 0.5):\n        raise ValueError(f\"hue_factor ({hue_factor}) is not in [-0.5, 0.5].\")\n\n    img = _rgb2hsv(img)\n    h, s, v = img.unbind(axis=-3)\n    h = fmod(h + hue_factor, paddle.to_tensor(1.0))\n    img = paddle.stack((h, s, v), axis=-3)\n    img_hue_adj = _hsv2rgb(img)\n    return img_hue_adj\n\n\ndef adjust_saturation(img: Tensor, saturation_factor: float) -> Tensor:\n    if saturation_factor < 0:\n        raise ValueError(f\"saturation_factor ({saturation_factor}) is not non-negative.\")\n\n    output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n\n    return _blend(img, output, saturation_factor)\n\n\nclass ColorJitter(nn.Layer):\n\n    def __init__(self, brightness=0, contrast=0, saturation=0, hue=0):\n        super(ColorJitter, self).__init__()\n        self.brightness = self._check_input(brightness, \"brightness\")\n        self.contrast = self._check_input(contrast, \"contrast\")\n        self.saturation = 
self._check_input(saturation, \"saturation\")\n        self.hue = self._check_input(hue, \"hue\", center=0, bound=(-0.5, 0.5), clip_first_on_zero=False)\n\n    def _check_input(self, value, name, center=1, bound=(0, float(\"inf\")), clip_first_on_zero=True):\n        if isinstance(value, numbers.Number):\n            if value < 0:\n                raise ValueError(f\"If {name} is a single number, it must be non negative.\")\n            value = [center - float(value), center + float(value)]\n            if clip_first_on_zero:\n                value[0] = max(value[0], 0.0)\n        elif isinstance(value, (tuple, list)) and len(value) == 2:\n            if not bound[0] <= value[0] <= value[1] <= bound[1]:\n                raise ValueError(f\"{name} values should be between {bound}\")\n        else:\n            raise TypeError(f\"{name} should be a single number or a list/tuple with length 2.\")\n\n        # if value is 0 or (1., 1.) for brightness/contrast/saturation\n        # or (0., 0.) for hue, do nothing\n        if value[0] == value[1] == center:\n            value = None\n        return value\n\n    @staticmethod\n    def get_params(\n        brightness: Optional[List[float]],\n        contrast: Optional[List[float]],\n        saturation: Optional[List[float]],\n        hue: Optional[List[float]],\n    ) -> Tuple[Tensor, Optional[float], Optional[float], Optional[float], Optional[float]]:\n        \"\"\"Get the parameters for the randomized transform to be applied on image.\n\n        Args:\n            brightness (tuple of float (min, max), optional): The range from which the brightness_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            contrast (tuple of float (min, max), optional): The range from which the contrast_factor is chosen\n                uniformly. 
Pass None to turn off the transformation.\n            saturation (tuple of float (min, max), optional): The range from which the saturation_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            hue (tuple of float (min, max), optional): The range from which the hue_factor is chosen uniformly.\n                Pass None to turn off the transformation.\n\n        Returns:\n            tuple: The parameters used to apply the randomized transform\n            along with their random order.\n        \"\"\"\n        fn_idx = paddle.randperm(4)\n\n        b = None if brightness is None else paddle.empty([1]).uniform_(brightness[0], brightness[1])\n        c = None if contrast is None else paddle.empty([1]).uniform_(contrast[0], contrast[1])\n        s = None if saturation is None else paddle.empty([1]).uniform_(saturation[0], saturation[1])\n        h = None if hue is None else paddle.empty([1]).uniform_(hue[0], hue[1])\n\n        return fn_idx, b, c, s, h\n\n    def forward(self, img):\n        \"\"\"\n        Args:\n            img (PIL Image or Tensor): Input image.\n\n        Returns:\n            PIL Image or Tensor: Color jittered image.\n        \"\"\"\n        fn_idx, brightness_factor, contrast_factor, saturation_factor, hue_factor = self.get_params(\n            self.brightness, self.contrast, self.saturation, self.hue)\n\n        for fn_id in fn_idx:\n            if fn_id == 0 and brightness_factor is not None:\n                img = adjust_brightness(img, brightness_factor)\n            elif fn_id == 1 and contrast_factor is not None:\n                img = adjust_contrast(img, contrast_factor)\n            elif fn_id == 2 and saturation_factor is not None:\n                img = adjust_saturation(img, saturation_factor)\n            elif fn_id == 3 and hue_factor is not None:\n                img = adjust_hue(img, hue_factor)\n\n        return img\n\n    def __repr__(self) -> str:\n        s = 
(f\"{self.__class__.__name__}(\"\n             f\"brightness={self.brightness}\"\n             f\", contrast={self.contrast}\"\n             f\", saturation={self.saturation}\"\n             f\", hue={self.hue})\")\n        return s\n\n\ndef _apply_grid_transform(img: Tensor, grid: Tensor, mode: str, fill: Optional[List[float]]) -> Tensor:\n\n    if img.shape[0] > 1:\n        # Apply same grid to a batch of images\n        grid = grid.expand([img.shape[0], grid.shape[1], grid.shape[2], grid.shape[3]])\n\n    # Append a dummy mask for customized fill colors, should be faster than grid_sample() twice\n    if fill is not None:\n        dummy = paddle.ones((img.shape[0], 1, img.shape[2], img.shape[3]), dtype=img.dtype)\n        img = paddle.concat((img, dummy), axis=1)\n\n    img = grid_sample(img, grid, mode=mode, padding_mode=\"zeros\", align_corners=False)\n\n    # Fill with required color\n    if fill is not None:\n        mask = img[:, -1:, :, :]  # N * 1 * H * W\n        img = img[:, :-1, :, :]  # N * C * H * W\n        mask = mask.expand_as(img)\n        len_fill = len(fill) if isinstance(fill, (tuple, list)) else 1\n        fill_img = paddle.to_tensor(fill, dtype=img.dtype).reshape([1, len_fill, 1, 1]).expand_as(img)\n        if mode == \"nearest\":\n            mask = mask < 0.5\n            img[mask] = fill_img[mask]\n        else:  # 'bilinear'\n            img = img * mask + (1.0 - mask) * fill_img\n    return img\n\n\ndef _gen_affine_grid(\n    theta: Tensor,\n    w: int,\n    h: int,\n    ow: int,\n    oh: int,\n) -> Tensor:\n    # https://github.com/pytorch/pytorch/blob/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/aten/src/ATen/native/\n    # AffineGridGenerator.cpp#L18\n    # Difference with AffineGridGenerator is that:\n    # 1) we normalize grid values after applying theta\n    # 2) we can normalize by other image size, such that it covers \"extend\" option like in PIL.Image.rotate\n\n    d = 0.5\n    base_grid = paddle.empty([1, oh, ow, 3], 
dtype=theta.dtype)\n    x_grid = paddle.linspace(-ow * 0.5 + d, ow * 0.5 + d - 1, num=ow)\n    base_grid[..., 0] = x_grid\n    y_grid = paddle.linspace(-oh * 0.5 + d, oh * 0.5 + d - 1, num=oh).unsqueeze_(-1)\n    base_grid[..., 1] = y_grid\n    base_grid[..., 2] = 1.0\n    rescaled_theta = theta.transpose([0, 2, 1]) / paddle.to_tensor([0.5 * w, 0.5 * h], dtype=theta.dtype)\n    output_grid = base_grid.reshape([1, oh * ow, 3]).bmm(rescaled_theta)\n    return output_grid.reshape([1, oh, ow, 2])\n\n\ndef affine_impl(img: Tensor,\n                matrix: List[float],\n                interpolation: str = \"nearest\",\n                fill: Optional[List[float]] = None) -> Tensor:\n    theta = paddle.to_tensor(matrix, dtype=img.dtype).reshape([1, 2, 3])\n    shape = img.shape\n    # grid will be generated on the same device as theta and img\n    grid = _gen_affine_grid(theta, w=shape[-1], h=shape[-2], ow=shape[-1], oh=shape[-2])\n    return _apply_grid_transform(img, grid, interpolation, fill=fill)\n\n\ndef _get_inverse_affine_matrix(center: List[float],\n                               angle: float,\n                               translate: List[float],\n                               scale: float,\n                               shear: List[float],\n                               inverted: bool = True) -> List[float]:\n    # Helper method to compute inverse matrix for affine transformation\n\n    # Pillow requires inverse affine transformation matrix:\n    # Affine matrix is : M = T * C * RotateScaleShear * C^-1\n    #\n    # where T is translation matrix: [1, 0, tx | 0, 1, ty | 0, 0, 1]\n    #       C is translation matrix to keep center: [1, 0, cx | 0, 1, cy | 0, 0, 1]\n    #       RotateScaleShear is rotation with scale and shear matrix\n    #\n    #       RotateScaleShear(a, s, (sx, sy)) =\n    #       = R(a) * S(s) * SHy(sy) * SHx(sx)\n    #       = [ s*cos(a - sy)/cos(sy), s*(-cos(a - sy)*tan(sx)/cos(sy) - sin(a)), 0 ]\n    #         [ s*sin(a - sy)/cos(sy), 
s*(-sin(a - sy)*tan(sx)/cos(sy) + cos(a)), 0 ]\n    #         [ 0                    , 0                                      , 1 ]\n    # where R is a rotation matrix, S is a scaling matrix, and SHx and SHy are the shears:\n    # SHx(s) = [1, -tan(s)] and SHy(s) = [1      , 0]\n    #          [0, 1      ]              [-tan(s), 1]\n    #\n    # Thus, the inverse is M^-1 = C * RotateScaleShear^-1 * C^-1 * T^-1\n\n    rot = math.radians(angle)\n    sx = math.radians(shear[0])\n    sy = math.radians(shear[1])\n\n    cx, cy = center\n    tx, ty = translate\n\n    # RSS without scaling\n    a = math.cos(rot - sy) / math.cos(sy)\n    b = -math.cos(rot - sy) * math.tan(sx) / math.cos(sy) - math.sin(rot)\n    c = math.sin(rot - sy) / math.cos(sy)\n    d = -math.sin(rot - sy) * math.tan(sx) / math.cos(sy) + math.cos(rot)\n\n    if inverted:\n        # Inverted rotation matrix with scale and shear\n        # det([[a, b], [c, d]]) == 1, since det(rotation) = 1 and det(shear) = 1\n        matrix = [d, -b, 0.0, -c, a, 0.0]\n        matrix = [x / scale for x in matrix]\n        # Apply inverse of translation and of center translation: RSS^-1 * C^-1 * T^-1\n        matrix[2] += matrix[0] * (-cx - tx) + matrix[1] * (-cy - ty)\n        matrix[5] += matrix[3] * (-cx - tx) + matrix[4] * (-cy - ty)\n        # Apply center translation: C * RSS^-1 * C^-1 * T^-1\n        matrix[2] += cx\n        matrix[5] += cy\n    else:\n        matrix = [a, b, 0.0, c, d, 0.0]\n        matrix = [x * scale for x in matrix]\n        # Apply inverse of center translation: RSS * C^-1\n        matrix[2] += matrix[0] * (-cx) + matrix[1] * (-cy)\n        matrix[5] += matrix[3] * (-cx) + matrix[4] * (-cy)\n        # Apply translation and center : T * C * RSS * C^-1\n        matrix[2] += cx + tx\n        matrix[5] += cy + ty\n\n    return matrix\n\n\ndef affine(\n    img: Tensor,\n    angle: float,\n    translate: List[int],\n    scale: float,\n    shear: List[float],\n    interpolation: InterpolationMode = 
InterpolationMode.NEAREST,\n    fill: Optional[List[float]] = None,\n    resample: Optional[int] = None,\n    fillcolor: Optional[List[float]] = None,\n    center: Optional[List[int]] = None,\n) -> Tensor:\n    \"\"\"Apply affine transformation on the image keeping image center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... means an arbitrary number of leading dimensions.\n\n    Args:\n        img (PIL Image or Tensor): image to transform.\n        angle (number): rotation angle in degrees between -180 and 180, clockwise direction.\n        translate (sequence of integers): horizontal and vertical translations (post-rotation translation)\n        scale (float): overall scale\n        shear (float or sequence): shear angle value in degrees between -180 to 180, clockwise direction.\n            If a sequence is specified, the first value corresponds to a shear parallel to the x axis, while\n            the second value corresponds to a shear parallel to the y axis.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. Please use InterpolationMode enum.\n        fill (sequence or number, optional): Pixel fill value for the area outside the transformed\n            image. If given a number, the value is used for all bands respectively.\n\n            .. note::\n                In torchscript mode single int/float value is not supported, please use a sequence\n                of length 1: ``[value, ]``.\n        fillcolor (sequence or number, optional):\n            .. 
warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation. Origin is the upper left corner.\n            Default is the center of the image.\n\n    Returns:\n        PIL Image or Tensor: Transformed image.\n    \"\"\"\n\n    # Backward compatibility with integer value\n    if isinstance(interpolation, int):\n        warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                      \"Please use InterpolationMode enum.\")\n        interpolation = _interpolation_modes_from_int(interpolation)\n\n    if fillcolor is not None:\n        warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                      \"Please use 'fill' instead.\")\n        fill = fillcolor\n\n    if not isinstance(angle, (int, float)):\n        raise TypeError(\"Argument angle should be int or float\")\n\n    if not isinstance(translate, (list, tuple)):\n        raise TypeError(\"Argument translate should be a sequence\")\n\n    if len(translate) != 2:\n        raise ValueError(\"Argument translate should be a sequence of length 2\")\n\n    if scale <= 0.0:\n        raise ValueError(\"Argument scale should be positive\")\n\n    if not isinstance(shear, (numbers.Number, (list, tuple))):\n        raise TypeError(\"Shear should be either a single value or a sequence of two values\")\n\n    if not isinstance(interpolation, InterpolationMode):\n        raise TypeError(\"Argument interpolation should be a InterpolationMode\")\n\n    if isinstance(angle, int):\n        angle = float(angle)\n\n    if isinstance(translate, tuple):\n        translate = list(translate)\n\n    if isinstance(shear, numbers.Number):\n        shear = [shear, 0.0]\n\n    if isinstance(shear, tuple):\n        shear = list(shear)\n\n    if len(shear) == 1:\n        shear = [shear[0], shear[0]]\n\n    if len(shear) != 2:\n        raise ValueError(f\"Shear should be a sequence containing two values. 
Got {shear}\")\n\n    if center is not None and not isinstance(center, (list, tuple)):\n        raise TypeError(\"Argument center should be a sequence\")\n    center_f = [0.0, 0.0]\n    if center is not None:\n        # Images here are NCHW, so height and width are the last two dims.\n        height, width = img.shape[-2], img.shape[-1]\n        # Center values should be in pixel coordinates but translated such that (0, 0) corresponds to image center.\n        center_f = [1.0 * (c - s * 0.5) for c, s in zip(center, [width, height])]\n\n    translate_f = [1.0 * t for t in translate]\n    matrix = _get_inverse_affine_matrix(center_f, angle, translate_f, scale, shear)\n    return affine_impl(img, matrix=matrix, interpolation=interpolation.value, fill=fill)\n\n\ndef _interpolation_modes_from_int(i: int) -> InterpolationMode:\n    inverse_modes_mapping = {\n        0: InterpolationMode.NEAREST,\n        2: InterpolationMode.BILINEAR,\n        3: InterpolationMode.BICUBIC,\n        4: InterpolationMode.BOX,\n        5: InterpolationMode.HAMMING,\n        1: InterpolationMode.LANCZOS,\n    }\n    return inverse_modes_mapping[i]\n\n\ndef _check_sequence_input(x, name, req_sizes):\n    msg = req_sizes[0] if len(req_sizes) < 2 else \" or \".join([str(s) for s in req_sizes])\n    if not isinstance(x, Sequence):\n        raise TypeError(f\"{name} should be a sequence of length {msg}.\")\n    if len(x) not in req_sizes:\n        raise ValueError(f\"{name} should be a sequence of length {msg}.\")\n\n\ndef _setup_angle(x, name, req_sizes=(2, )):\n    if isinstance(x, numbers.Number):\n        if x < 0:\n            raise ValueError(f\"If {name} is a single number, it must be positive.\")\n        x = [-x, x]\n    else:\n        _check_sequence_input(x, name, req_sizes)\n\n    return [float(d) for d in x]\n\n\nclass RandomAffine(nn.Layer):\n    \"\"\"Random affine transformation of the image keeping center invariant.\n    If the image is a paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... 
means an arbitrary number of leading dimensions.\n\n    Args:\n        degrees (sequence or number): Range of degrees to select from.\n            If degrees is a number instead of sequence like (min, max), the range of degrees\n            will be (-degrees, +degrees). Set to 0 to deactivate rotations.\n        translate (tuple, optional): tuple of maximum absolute fraction for horizontal\n            and vertical translations. For example translate=(a, b), then horizontal shift\n            is randomly sampled in the range -img_width * a < dx < img_width * a and vertical shift is\n            randomly sampled in the range -img_height * b < dy < img_height * b. Will not translate by default.\n        scale (tuple, optional): scaling factor interval, e.g. (a, b), then scale is\n            randomly sampled from the range a <= scale <= b. Will keep original scale by default.\n        shear (sequence or number, optional): Range of degrees to select from.\n            If shear is a number, a shear parallel to the x axis in the range (-shear, +shear)\n            will be applied. Else if shear is a sequence of 2 values, a shear parallel to the x axis in the\n            range (shear[0], shear[1]) will be applied. Else if shear is a sequence of 4 values,\n            an x-axis shear in (shear[0], shear[1]) and a y-axis shear in (shear[2], shear[3]) will be applied.\n            Will not apply shear by default.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. 
Please use InterpolationMode enum.\n        fill (sequence or number): Pixel fill value for the area outside the transformed\n            image. Default is ``0``. If given a number, the value is used for all bands respectively.\n        fillcolor (sequence or number, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation, (x, y). Origin is the upper left corner.\n            Default is the center of the image.\n\n    .. _filters: https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters\n\n    \"\"\"\n\n    def __init__(\n        self,\n        degrees,\n        translate=None,\n        scale=None,\n        shear=None,\n        interpolation=InterpolationMode.NEAREST,\n        fill=0,\n        fillcolor=None,\n        resample=None,\n        center=None,\n    ):\n        super(RandomAffine, self).__init__()\n        if resample is not None:\n            warnings.warn(\"The parameter 'resample' is deprecated since 0.12 and will be removed in 0.14. \"\n                          \"Please use 'interpolation' instead.\")\n            interpolation = _interpolation_modes_from_int(resample)\n\n        # Backward compatibility with integer value\n        if isinstance(interpolation, int):\n            warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                          \"Please use InterpolationMode enum.\")\n            interpolation = _interpolation_modes_from_int(interpolation)\n\n        if fillcolor is not None:\n            warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                          \"Please use 'fill' instead.\")\n            fill = fillcolor\n\n        self.degrees = _setup_angle(degrees, name=\"degrees\", req_sizes=(2, ))\n\n        if translate is not None:\n            _check_sequence_input(translate, \"translate\", req_sizes=(2, ))\n            for t in translate:\n                if not (0.0 <= t <= 1.0):\n                    raise ValueError(\"translation values should be between 0 and 1\")\n        self.translate = translate\n\n        if scale is not None:\n            _check_sequence_input(scale, \"scale\", req_sizes=(2, ))\n            for s in scale:\n                if s <= 0:\n                    raise ValueError(\"scale values should be positive\")\n        self.scale = scale\n\n        if shear is not None:\n            self.shear = _setup_angle(shear, name=\"shear\", req_sizes=(2, 4))\n        else:\n            self.shear = shear\n\n        self.resample = self.interpolation = interpolation\n\n        if fill is None:\n            fill = 0\n        elif not isinstance(fill, (Sequence, numbers.Number)):\n            raise TypeError(\"Fill should be either a sequence or a number.\")\n\n        self.fillcolor = self.fill = fill\n\n        if center is not None:\n            _check_sequence_input(center, \"center\", req_sizes=(2, ))\n\n        self.center = center\n\n    @staticmethod\n    def get_params(\n        degrees: List[float],\n        translate: Optional[List[float]],\n        scale_ranges: Optional[List[float]],\n        shears: Optional[List[float]],\n        img_size: List[int],\n    ) -> Tuple[float, Tuple[int, int], float, Tuple[float, float]]:\n        \"\"\"Get parameters for affine transformation\n\n        Returns:\n            params to be passed to the affine transformation\n        \"\"\"\n        angle = float(paddle.empty([1]).uniform_(float(degrees[0]), float(degrees[1])))\n        if translate is not None:\n            max_dx = float(translate[0] * img_size[0])\n            
max_dy = float(translate[1] * img_size[1])\n            tx = int(float(paddle.empty([1]).uniform_(-max_dx, max_dx)))\n            ty = int(float(paddle.empty([1]).uniform_(-max_dy, max_dy)))\n            translations = (tx, ty)\n        else:\n            translations = (0, 0)\n\n        if scale_ranges is not None:\n            scale = float(paddle.empty([1]).uniform_(scale_ranges[0], scale_ranges[1]))\n        else:\n            scale = 1.0\n\n        shear_x = shear_y = 0.0\n        if shears is not None:\n            shear_x = float(paddle.empty([1]).uniform_(shears[0], shears[1]))\n            if len(shears) == 4:\n                shear_y = float(paddle.empty([1]).uniform_(shears[2], shears[3]))\n\n        shear = (shear_x, shear_y)\n\n        return angle, translations, scale, shear\n\n    def forward(self, img):\n        fill = self.fill\n        channels, height, width = img.shape[1], img.shape[2], img.shape[3]\n        if isinstance(fill, (int, float)):\n            fill = [float(fill)] * channels\n        else:\n            fill = [float(f) for f in fill]\n\n        img_size = [width, height]  # flip for keeping BC on get_params call\n\n        ret = self.get_params(self.degrees, self.translate, self.scale, self.shear, img_size)\n\n        return affine(img, *ret, interpolation=self.interpolation, fill=fill, center=self.center)\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(degrees={self.degrees}\"\n        s += f\", translate={self.translate}\" if self.translate is not None else \"\"\n        s += f\", scale={self.scale}\" if self.scale is not None else \"\"\n        s += f\", shear={self.shear}\" if self.shear is not None else \"\"\n        s += f\", interpolation={self.interpolation.value}\" if self.interpolation != InterpolationMode.NEAREST else \"\"\n        s += f\", fill={self.fill}\" if self.fill != 0 else \"\"\n        s += f\", center={self.center}\" if self.center is not None else \"\"\n        s += \")\"\n\n        
return s\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/model/unet.py",
"content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/unet.py\n'''\nimport math\nfrom abc import abstractmethod\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .nn import avg_pool_nd\nfrom .nn import checkpoint\nfrom .nn import conv_nd\nfrom .nn import linear\nfrom .nn import normalization\nfrom .nn import SiLU\nfrom .nn import timestep_embedding\nfrom .nn import zero_module\n\n\nclass AttentionPool2d(nn.Layer):\n    \"\"\"\n    Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py\n    \"\"\"\n\n    def __init__(\n        self,\n        spacial_dim: int,\n        embed_dim: int,\n        num_heads_channels: int,\n        output_dim: int = None,\n    ):\n        super().__init__()\n        # self.positional_embedding = nn.Parameter(\n        #     th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5\n        # )\n        # create_parameter expects a shape; initialize from the same scaled\n        # random normal tensor as the torch original above.\n        positional_embedding = self.create_parameter(\n            shape=[embed_dim, spacial_dim**2 + 1],\n            default_initializer=nn.initializer.Assign(paddle.randn([embed_dim, spacial_dim**2 + 1]) / embed_dim**0.5))\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)\n        self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)\n        self.num_heads = embed_dim // num_heads_channels\n        self.attention = QKVAttention(self.num_heads)\n\n    def forward(self, x):\n        b, c, *_spatial = x.shape\n        # x = x.reshape(b, c, -1)  # NC(HW)\n        x = paddle.reshape(x, [b, c, -1])\n        x = paddle.concat([x.mean(axis=-1, keepdim=True), x], axis=-1)  # NC(HW+1)\n        x = x + paddle.cast(self.positional_embedding[None, :, :], x.dtype)  # NC(HW+1)\n        x = self.qkv_proj(x)\n        x = self.attention(x)\n        x = self.c_proj(x)\n        return x[:, :, 0]\n\n\nclass TimestepBlock(nn.Layer):\n    \"\"\"\n    Any module where forward() takes timestep embeddings as a second argument.\n
   \"\"\"\n\n    @abstractmethod\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the module to `x` given `emb` timestep embeddings.\n        \"\"\"\n\n\nclass TimestepEmbedSequential(nn.Sequential, TimestepBlock):\n    \"\"\"\n    A sequential module that passes timestep embeddings to the children that\n    support it as an extra input.\n    \"\"\"\n\n    def forward(self, x, emb):\n        for layer in self:\n            if isinstance(layer, TimestepBlock):\n                x = layer(x, emb)\n            else:\n                x = layer(x)\n        return x\n\n\nclass Upsample(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        if use_conv:\n            self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=1)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.dims == 3:\n            x = F.interpolate(x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode=\"nearest\")\n        else:\n            x = F.interpolate(x, scale_factor=2, mode=\"nearest\")\n        if self.use_conv:\n            x = self.conv(x)\n        return x\n\n\nclass Downsample(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        stride = 2 if dims != 3 else (1, 2, 2)\n        if use_conv:\n            self.op = conv_nd(dims, self.channels, self.out_channels, 3, stride=stride, padding=1)\n        else:\n            assert self.channels == self.out_channels\n            self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        return self.op(x)\n\n\nclass ResBlock(TimestepBlock):\n    \"\"\"\n    A residual block that can optionally change the number of channels.\n\n    :param channels: the number of input channels.\n    :param emb_channels: the number of timestep embedding channels.\n    :param dropout: the rate of dropout.\n    :param out_channels: if specified, the number of out channels.\n    :param use_conv: if True and out_channels is specified, use a spatial\n        convolution instead of a smaller 1x1 convolution to change the\n        channels in the skip connection.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param use_checkpoint: if True, use gradient checkpointing on this module.\n    :param up: if True, use this block for upsampling.\n    :param down: if True, use this block for downsampling.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        emb_channels,\n        dropout,\n        out_channels=None,\n        use_conv=False,\n        use_scale_shift_norm=False,\n        dims=2,\n        use_checkpoint=False,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        self.emb_channels = emb_channels\n        self.dropout = dropout\n        
self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_checkpoint = use_checkpoint\n        self.use_scale_shift_norm = use_scale_shift_norm\n\n        self.in_layers = nn.Sequential(\n            normalization(channels),\n            SiLU(),\n            conv_nd(dims, channels, self.out_channels, 3, padding=1),\n        )\n\n        self.updown = up or down\n\n        if up:\n            self.h_upd = Upsample(channels, False, dims)\n            self.x_upd = Upsample(channels, False, dims)\n        elif down:\n            self.h_upd = Downsample(channels, False, dims)\n            self.x_upd = Downsample(channels, False, dims)\n        else:\n            self.h_upd = self.x_upd = nn.Identity()\n\n        self.emb_layers = nn.Sequential(\n            SiLU(),\n            linear(\n                emb_channels,\n                2 * self.out_channels if use_scale_shift_norm else self.out_channels,\n            ),\n        )\n        self.out_layers = nn.Sequential(\n            normalization(self.out_channels),\n            SiLU(),\n            nn.Dropout(p=dropout),\n            zero_module(conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)),\n        )\n\n        if self.out_channels == channels:\n            self.skip_connection = nn.Identity()\n        elif use_conv:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 3, padding=1)\n        else:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)\n\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the block to a Tensor, conditioned on a timestep embedding.\n\n        :param x: an [N x C x ...] Tensor of features.\n        :param emb: an [N x emb_channels] Tensor of timestep embeddings.\n        :return: an [N x C x ...] 
Tensor of outputs.\n        \"\"\"\n        return checkpoint(self._forward, (x, emb), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x, emb):\n        if self.updown:\n            in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]\n            h = in_rest(x)\n            h = self.h_upd(h)\n            x = self.x_upd(x)\n            h = in_conv(h)\n        else:\n            h = self.in_layers(x)\n        emb_out = self.emb_layers(emb)\n        emb_out = paddle.cast(emb_out, h.dtype)\n        while len(emb_out.shape) < len(h.shape):\n            emb_out = emb_out[..., None]\n        if self.use_scale_shift_norm:\n            out_norm, out_rest = self.out_layers[0], self.out_layers[1:]\n            scale, shift = paddle.chunk(emb_out, 2, axis=1)\n            h = out_norm(h) * (1 + scale) + shift\n            h = out_rest(h)\n        else:\n            h = h + emb_out\n            h = self.out_layers(h)\n        return self.skip_connection(x) + h\n\n\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=-1,\n        use_checkpoint=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels == -1:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n        self.use_checkpoint = use_checkpoint\n        self.norm = normalization(channels)\n        self.qkv = 
conv_nd(1, channels, channels * 3, 1)\n        if use_new_attention_order:\n            # split qkv before split heads\n            self.attention = QKVAttention(self.num_heads)\n        else:\n            # split heads before split qkv\n            self.attention = QKVAttentionLegacy(self.num_heads)\n\n        self.proj_out = zero_module(conv_nd(1, channels, channels, 1))\n\n    def forward(self, x):\n        return checkpoint(self._forward, (x, ), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x):\n        b, c, *spatial = x.shape\n        # x = x.reshape(b, c, -1)\n        x = paddle.reshape(x, [b, c, -1])\n        qkv = self.qkv(self.norm(x))\n        h = self.attention(qkv)\n        h = self.proj_out(h)\n        # return (x + h).reshape(b, c, *spatial)\n        return paddle.reshape(x + h, [b, c, *spatial])\n\n\ndef count_flops_attn(model, _x, y):\n    \"\"\"\n    A counter for the `thop` package to count the operations in an\n    attention operation.\n    Meant to be used like:\n        macs, params = thop.profile(\n            model,\n            inputs=(inputs, timestamps),\n            custom_ops={QKVAttention: QKVAttention.count_flops},\n        )\n    \"\"\"\n    b, c, *spatial = y[0].shape\n    num_spatial = int(np.prod(spatial))\n    # We perform two matmuls with the same number of ops.\n    # The first computes the weight matrix, the second computes\n    # the combination of the value vectors.\n    matmul_ops = 2 * b * (num_spatial**2) * c\n    model.total_ops += paddle.to_tensor([matmul_ops], dtype='float64')\n\n\nclass QKVAttentionLegacy(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention. 
Matches legacy QKVAttention + input/output heads shaping\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)\n        q, k, v = paddle.reshape(qkv, [bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass QKVAttention(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention and splits in a different order.\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.chunk(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\n            \"bct,bcs->bts\",\n            (q * 
scale).reshape([bs * self.n_heads, ch, length]),\n            (k * scale).reshape([bs * self.n_heads, ch, length]),\n        )  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v.reshape([bs * self.n_heads, ch, length]))\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass UNetModel(nn.Layer):\n    \"\"\"\n    The full UNet model with attention and timestep embedding.\n\n    :param in_channels: channels in the input Tensor.\n    :param model_channels: base channel count for the model.\n    :param out_channels: channels in the output Tensor.\n    :param num_res_blocks: number of residual blocks per downsample.\n    :param attention_resolutions: a collection of downsample rates at which\n        attention will take place. 
May be a set, list, or tuple.\n        For example, if this contains 4, then at 4x downsampling, attention\n        will be used.\n    :param dropout: the dropout probability.\n    :param channel_mult: channel multiplier for each level of the UNet.\n    :param conv_resample: if True, use learned convolutions for upsampling and\n        downsampling.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param num_classes: if specified (as an int), then this model will be\n        class-conditional with `num_classes` classes.\n    :param use_checkpoint: use gradient checkpointing to reduce memory usage.\n    :param num_heads: the number of attention heads in each attention layer.\n    :param num_head_channels: if specified, ignore num_heads and instead use\n                               a fixed channel width per attention head.\n    :param num_heads_upsample: works with num_heads to set a different number\n                               of heads for upsampling. Deprecated.\n    :param use_scale_shift_norm: use a FiLM-like conditioning mechanism.\n    :param resblock_updown: use residual blocks for up/downsampling.\n    :param use_new_attention_order: use a different attention pattern for potentially\n                                    increased efficiency.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        num_classes=None,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        
self.image_size = image_size\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.num_classes = num_classes\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        if self.num_classes is not None:\n            self.label_emb = nn.Embedding(num_classes, time_embed_dim)\n\n        ch = input_ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n           
                 ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n                self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n          
      dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n\n        self.output_blocks = nn.LayerList([])\n        for level, mult in list(enumerate(channel_mult))[::-1]:\n            for i in range(num_res_blocks + 1):\n                ich = input_block_chans.pop()\n                layers = [\n                    ResBlock(\n                        ch + ich,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(model_channels * mult),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(model_channels * mult)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads_upsample,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                if level and i == num_res_blocks:\n                    out_ch = ch\n                    layers.append(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            up=True,\n                        ) if resblock_updown else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch))\n                    ds 
//= 2\n                self.output_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n\n        self.out = nn.Sequential(\n            normalization(ch),\n            SiLU(),\n            zero_module(conv_nd(dims, input_ch, out_channels, 3, padding=1)),\n        )\n\n    def forward(self, x, timesteps, y=None):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :param y: an [N] Tensor of labels, if class-conditional.\n        :return: an [N x C x ...] Tensor of outputs.\n        \"\"\"\n        assert (y is not None) == (self.num_classes\n                                   is not None), \"must specify y if and only if the model is class-conditional\"\n\n        hs = []\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n        if self.num_classes is not None:\n            assert y.shape == [x.shape[0]]\n            emb = emb + self.label_emb(y)\n\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            hs.append(h)\n        h = self.middle_block(h, emb)\n        for module in self.output_blocks:\n            h = paddle.concat([h, hs.pop()], axis=1)\n            h = module(h, emb)\n        # h = paddle.cast(h, x.dtype)\n        return self.out(h)\n\n\nclass SuperResModel(UNetModel):\n    \"\"\"\n    A UNetModel that performs super-resolution.\n\n    Expects an extra kwarg `low_res` to condition on a low-resolution image.\n    \"\"\"\n\n    def __init__(self, image_size, in_channels, *args, **kwargs):\n        super().__init__(image_size, in_channels * 2, *args, **kwargs)\n\n    def forward(self, x, timesteps, low_res=None, **kwargs):\n        _, _, new_height, new_width = x.shape\n        upsampled = F.interpolate(low_res, (new_height, new_width), mode=\"bilinear\")\n        x = paddle.concat([x, 
upsampled], axis=1)\n        return super().forward(x, timesteps, **kwargs)\n\n\nclass EncoderUNetModel(nn.Layer):\n    \"\"\"\n    The half UNet model with attention and timestep embedding.\n\n    For usage, see UNet.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n        pool=\"adaptive\",\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        
ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n               
 self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n        self.pool = pool\n        if pool == \"adaptive\":\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                nn.AdaptiveAvgPool2D((1, 1)),\n                zero_module(conv_nd(dims, ch, out_channels, 1)),\n                nn.Flatten(),\n            )\n        elif pool == \"attention\":\n            assert num_head_channels != -1\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                AttentionPool2d((image_size // ds), ch, num_head_channels, out_channels),\n            )\n        elif pool == \"spatial\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                nn.ReLU(),\n                nn.Linear(2048, self.out_channels),\n            )\n        elif pool == \"spatial_v2\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                normalization(2048),\n                SiLU(),\n                nn.Linear(2048, self.out_channels),\n            
)\n        else:\n            raise NotImplementedError(f\"Unexpected {pool} pooling\")\n\n    def forward(self, x, timesteps):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :return: an [N x K] Tensor of outputs.\n        \"\"\"\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n\n        results = []\n        # h = x.type(self.dtype)\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            if self.pool.startswith(\"spatial\"):\n                # results.append(h.type(x.dtype).mean(axis=(2, 3)))\n                results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n        h = self.middle_block(h, emb)\n        if self.pool.startswith(\"spatial\"):\n            results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n            h = paddle.concat(results, axis=-1)\n            return self.out(h)\n        else:\n            # h = h.type(x.dtype)\n            h = paddle.cast(h, x.dtype)\n            return self.out(h)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/resources/default.yml",
    "content": "text_prompts:\n  - A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\n\ninit_image:\n\nwidth_height: [ 1280, 768]\n\nskip_steps: 10\nsteps: 250\n\ncut_ic_pow: 1\ninit_scale: 1000\nclip_guidance_scale: 5000\n\ntv_scale: 0\nrange_scale: 150\nsat_scale: 0\ncutn_batches: 4\n\ndiffusion_model: 512x512_diffusion_uncond_finetune_008100\nuse_secondary_model: True\ndiffusion_sampling_mode: ddim\n\nperlin_init: False\nperlin_mode: mixed\nseed: 445467575\neta: 0.8\nclamp_grad: True\nclamp_max: 0.05\n\nrandomize_class: True\nclip_denoised: False\nfuzzy_prompt: False\nrand_mag: 0.05\n\ncut_overview: \"[12]*400+[4]*600\"\ncut_innercut: \"[4]*400+[12]*600\"\ncut_icgray_p: \"[0.2]*400+[0]*600\"\n\ndisplay_rate: 10\nn_batches: 1\nbatch_size: 1\nbatch_name: ''\nclip_models:\n  - VIT\n  - RN50\n  - RN101\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/resources/docstrings.yml",
    "content": "text_prompts: |\n  Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n  Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments.\n  Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\ninit_image: |\n  Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here.\n  If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\nwidth_height: |\n  Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n\nskip_steps: |\n  Consider the chart shown here.  
Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.\n  As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.\n  The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.\n  If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.\n  Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.\n  Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image.\n  However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n\nsteps: |\n  When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.\n  Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.\n  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n\ncut_ic_pow: |\n  This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ninit_scale: |\n  This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\nclip_guidance_scale: |\n  CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS.\n  Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500.\n  Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\ntv_scale: |\n  Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\nrange_scale: |\n  Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n\nsat_scale: |\n  Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\ncutn_batches: |\n  Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.\n  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage.\n  At the default settings, DD is scheduled to do 16 cuts per timestep.  
If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep.\n  However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.\n  So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n\ndiffusion_model: Diffusion_model of choice.\n\nuse_secondary_model: |\n  Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  I suggest you experiment with this.\n\ndiffusion_sampling_mode: |\n  Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n\nperlin_init: |\n  Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps).\n  Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n\nperlin_mode: |\n  Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\nseed: |\n  Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar.\n  After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\neta: |\n  eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. 
As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results.\n  The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\nclamp_grad: |\n  As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\nclamp_max: |\n  Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n\nrandomize_class:\nclip_denoised: False\nfuzzy_prompt: |\n  Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\nrand_mag: |\n  Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n\ncut_overview: The schedule of overview cuts\ncut_innercut: The schedule of inner cuts\ncut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\ncut_icgray_p: The schedule for the portion of inner cuts that are rendered in grayscale (passed to the cutout maker as IC_Grey_P).\n\ndisplay_rate: |\n  During a diffusion run, you can monitor the progress of each image being created with this variable.  
If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\nn_batches: |\n  This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\nbatch_name: |\n  The name of the batch; the batch id will be \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\nclip_models: |\n  CLIP Model selectors. ViT-B/32, ViT-B/16, ViT-L/14, RN101, RN50, RN50x4, RN50x16, RN50x64.\n  These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.\n  You can mix in multiple models as well for different results.  However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.\n  The rough order of speed/mem usage is (smallest/fastest to largest/slowest):\n  ViT-B/32\n  RN50\n  RN101\n  ViT-B/16\n  RN50x4\n  RN50x16\n  RN50x64\n  ViT-L/14\n  For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n"
  },
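The cut and guidance arithmetic described in the parameter notes above — "(scheduled cuts) x (cutn_batches) = (total cuts per timestep)" and scaling clip_guidance_scale with image area — can be sketched as plain Python (the helper names are illustrative, not part of DD or discoart):

```python
def total_cuts_per_timestep(scheduled_cuts: int, cutn_batches: int) -> int:
    """(scheduled cuts) x (cutn_batches) = (total cuts per timestep).

    Memory use stays at `scheduled_cuts` cuts per batch; render time grows
    roughly linearly with `cutn_batches`, since batches run sequentially.
    """
    return scheduled_cuts * cutn_batches


def rescale_guidance(scale: float, old_wh: tuple, new_wh: tuple) -> float:
    """Scale clip_guidance_scale in proportion to the change in pixel area,
    per the rule of thumb in the notes (512x512 -> 512x768 turns 5000 into 7500)."""
    return scale * (new_wh[0] * new_wh[1]) / (old_wh[0] * old_wh[1])


print(total_cuts_per_timestep(16, 4))                  # 64 cuts, in 4 batches of 16
print(rescale_guidance(5000, (512, 512), (512, 768)))  # 7500.0
```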
  {
    "path": "modules/image/text_to_image/disco_diffusion_clip_vitb32/reverse_diffusion/runner.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/runner.py\n'''\nimport gc\nimport os\nimport random\nfrom threading import Thread\n\nimport disco_diffusion_clip_vitb32.clip.clip as clip\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport paddle_lpips as lpips\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom ipywidgets import Output\nfrom PIL import Image\n\nfrom .helper import logger\nfrom .helper import parse_prompt\nfrom .model.losses import range_loss\nfrom .model.losses import spherical_dist_loss\nfrom .model.losses import tv_loss\nfrom .model.make_cutouts import MakeCutoutsDango\nfrom .model.sec_diff import alpha_sigma_to_t\nfrom .model.sec_diff import SecondaryDiffusionImageNet2\nfrom .model.transforms import Normalize\n\n\ndef do_run(args, models) -> 'DocumentArray':\n    logger.info('preparing models...')\n    model, diffusion, clip_models, secondary_model = models\n    normalize = Normalize(\n        mean=[0.48145466, 0.4578275, 0.40821073],\n        std=[0.26862954, 0.26130258, 0.27577711],\n    )\n    lpips_model = lpips.LPIPS(net='vgg')\n    for parameter in lpips_model.parameters():\n        parameter.stop_gradient = True\n    side_x = (args.width_height[0] // 64) * 64\n    side_y = (args.width_height[1] // 64) * 64\n    cut_overview = eval(args.cut_overview)\n    cut_innercut = eval(args.cut_innercut)\n    cut_icgray_p = eval(args.cut_icgray_p)\n\n    from .model.perlin_noises import create_perlin_noise, regen_perlin\n\n    seed = args.seed\n\n    skip_steps = args.skip_steps\n\n    loss_values = []\n\n    if seed is not None:\n        np.random.seed(seed)\n        random.seed(seed)\n        paddle.seed(seed)\n\n    model_stats = []\n    for clip_model in clip_models:\n        model_stat = {\n            'clip_model': None,\n            'target_embeds': [],\n            
'make_cutouts': None,\n            'weights': [],\n        }\n        model_stat['clip_model'] = clip_model\n\n        if isinstance(args.text_prompts, str):\n            args.text_prompts = [args.text_prompts]\n\n        for prompt in args.text_prompts:\n            txt, weight = parse_prompt(prompt)\n            txt = clip_model.encode_text(clip.tokenize(prompt))\n            if args.fuzzy_prompt:\n                for i in range(25):\n                    model_stat['target_embeds'].append((txt + paddle.randn(txt.shape) * args.rand_mag).clip(0, 1))\n                    model_stat['weights'].append(weight)\n            else:\n                model_stat['target_embeds'].append(txt)\n                model_stat['weights'].append(weight)\n\n        model_stat['target_embeds'] = paddle.concat(model_stat['target_embeds'])\n        model_stat['weights'] = paddle.to_tensor(model_stat['weights'])\n        if model_stat['weights'].sum().abs() < 1e-3:\n            raise RuntimeError('The weights must not sum to 0.')\n        model_stat['weights'] /= model_stat['weights'].sum().abs()\n        model_stats.append(model_stat)\n\n    init = None\n    if args.init_image:\n        d = Document(uri=args.init_image).load_uri_to_image_tensor(side_x, side_y)\n        init = T.to_tensor(d.tensor).unsqueeze(0) * 2 - 1\n\n    if args.perlin_init:\n        if args.perlin_mode == 'color':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n        elif args.perlin_mode == 'gray':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        else:\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = 
create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        init = (T.to_tensor(init).add(T.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n        del init2\n\n    cur_t = None\n\n    def cond_fn(x, t, y=None):\n        x_is_NaN = False\n        n = x.shape[0]\n        if secondary_model:\n            alpha = paddle.to_tensor(diffusion.sqrt_alphas_cumprod[cur_t], dtype='float32')\n            sigma = paddle.to_tensor(diffusion.sqrt_one_minus_alphas_cumprod[cur_t], dtype='float32')\n            cosine_t = alpha_sigma_to_t(alpha, sigma)\n            x = paddle.to_tensor(x.detach(), dtype='float32')\n            x.stop_gradient = False\n            cosine_t = paddle.tile(paddle.to_tensor(cosine_t.detach().cpu().numpy()), [n])\n            cosine_t.stop_gradient = False\n            out = secondary_model(x, cosine_t).pred\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        else:\n            t = paddle.ones([n], dtype='int64') * cur_t\n            out = diffusion.p_mean_variance(model, x, t, clip_denoised=False, model_kwargs={'y': y})\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out['pred_xstart'] * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        for model_stat in model_stats:\n            for i in range(args.cutn_batches):\n                t_int = (int(t.item()) + 1)  # errors on last step without +1, need to find source\n                # when using SLIP Base model the dimensions need to be hard coded to avoid AttributeError: 'VisionTransformer' object has no attribute 'input_resolution'\n                try:\n                    
input_resolution = model_stat['clip_model'].visual.input_resolution\n                except:\n                    input_resolution = 224\n\n                cuts = MakeCutoutsDango(\n                    input_resolution,\n                    Overview=cut_overview[1000 - t_int],\n                    InnerCrop=cut_innercut[1000 - t_int],\n                    IC_Size_Pow=args.cut_ic_pow,\n                    IC_Grey_P=cut_icgray_p[1000 - t_int],\n                )\n                clip_in = normalize(cuts(x_in.add(paddle.to_tensor(1.0)).divide(paddle.to_tensor(2.0))))\n                image_embeds = (model_stat['clip_model'].encode_image(clip_in))\n\n                dists = spherical_dist_loss(\n                    image_embeds.unsqueeze(1),\n                    model_stat['target_embeds'].unsqueeze(0),\n                )\n\n                dists = dists.reshape([\n                    cut_overview[1000 - t_int] + cut_innercut[1000 - t_int],\n                    n,\n                    -1,\n                ])\n                losses = dists.multiply(model_stat['weights']).sum(2).mean(0)\n                loss_values.append(losses.sum().item())  # log loss, probably shouldn't do per cutn_batch\n\n                x_in_grad += (paddle.grad(losses.sum() * args.clip_guidance_scale, x_in)[0] / args.cutn_batches)\n        tv_losses = tv_loss(x_in)\n        range_losses = range_loss(x_in)\n        sat_losses = paddle.abs(x_in - x_in.clip(min=-1, max=1)).mean()\n        loss = (tv_losses.sum() * args.tv_scale + range_losses.sum() * args.range_scale +\n                sat_losses.sum() * args.sat_scale)\n        if init is not None and args.init_scale:\n            init_losses = lpips_model(x_in, init)\n            loss = loss + init_losses.sum() * args.init_scale\n        x_in_grad += paddle.grad(loss, x_in)[0]\n        if not paddle.isnan(x_in_grad).any():\n            grad = -paddle.grad(x_in_d, x, x_in_grad)[0]\n        else:\n            x_is_NaN = True\n            grad = 
paddle.zeros_like(x)\n        if args.clamp_grad and not x_is_NaN:\n            magnitude = grad.square().mean().sqrt()\n            return (grad * magnitude.clip(max=args.clamp_max) / magnitude)\n        return grad\n\n    if args.diffusion_sampling_mode == 'ddim':\n        sample_fn = diffusion.ddim_sample_loop_progressive\n    else:\n        sample_fn = diffusion.plms_sample_loop_progressive\n\n    logger.info('creating artwork...')\n\n    image_display = Output()\n    da_batches = DocumentArray()\n\n    for _nb in range(args.n_batches):\n        display.clear_output(wait=True)\n        display.display(args.name_docarray, image_display)\n        gc.collect()\n        paddle.device.cuda.empty_cache()\n\n        d = Document(tags=vars(args))\n        da_batches.append(d)\n\n        cur_t = diffusion.num_timesteps - skip_steps - 1\n\n        if args.perlin_init:\n            init = regen_perlin(args.perlin_mode, side_y, side_x, args.batch_size)\n\n        if args.diffusion_sampling_mode == 'ddim':\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                eta=args.eta,\n            )\n        else:\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                order=2,\n            )\n\n        threads = []\n        for j, sample in enumerate(samples):\n            cur_t 
-= 1\n            with image_display:\n                if j % args.display_rate == 0 or cur_t == -1:\n                    for _, image in enumerate(sample['pred_xstart']):\n                        image = (image + 1) / 2\n                        image = image.clip(0, 1).squeeze().transpose([1, 2, 0]).numpy() * 255\n                        image = np.uint8(image)\n                        image = Image.fromarray(image)\n\n                        image.save(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb)))\n                        c = Document(tags={'cur_t': cur_t})\n                        c.load_pil_image_to_datauri(image)\n                        d.chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(display.Image(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb))))\n                        d.chunks.plot_image_sprites(os.path.join(args.output_dir,\n                                                                 f'{args.name_docarray}-progress-{_nb}.png'),\n                                                    show_index=True)\n                        t = Thread(\n                            target=_silent_push,\n                            args=(\n                                da_batches,\n                                args.name_docarray,\n                            ),\n                        )\n                        threads.append(t)\n                        t.start()\n\n                    if cur_t == -1:\n                        d.load_pil_image_to_datauri(image)\n\n        for t in threads:\n            t.join()\n    display.clear_output(wait=True)\n    logger.info(f'done! 
{args.name_docarray}')\n    da_batches.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    return da_batches\n\n\ndef _silent_push(da_batches: DocumentArray, name: str) -> None:\n    try:\n        da_batches.push(name)\n    except Exception as ex:\n        logger.debug(f'push failed: {ex}')\n"
  },
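The CLIP guidance in `cond_fn` above hinges on `spherical_dist_loss`, imported from `model/losses.py`. Assuming it follows the standard Disco Diffusion formulation (as in the Jina-ai/discoart code this runner was ported from), it is the squared great-circle distance between L2-normalized embeddings; a minimal NumPy sketch of the math:

```python
import numpy as np


def spherical_dist_loss(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Squared geodesic distance between L2-normalized embedding rows.

    For unit vectors, ||x - y|| = 2 * sin(theta / 2) where theta is the angle
    between them, so 4 * arcsin(||x - y|| / 2)**2 equals theta**2.
    """
    x = x / np.linalg.norm(x, axis=-1, keepdims=True)
    y = y / np.linalg.norm(y, axis=-1, keepdims=True)
    d = np.linalg.norm(x - y, axis=-1)
    return 4 * np.arcsin(d / 2) ** 2


# Orthogonal embeddings are a quarter turn apart: loss == (pi/2)**2 ~= 2.4674.
a = np.array([[1.0, 0.0]])
b = np.array([[0.0, 1.0]])
print(spherical_dist_loss(a, b))
```

In the runner this loss is computed between the cutout image embeddings and the prompt embeddings, weighted per prompt, and differentiated to steer each denoising step.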
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/README.md",
    "content": "# disco_diffusion_cnclip_vitb16\n\n|模型名称|disco_diffusion_cnclip_vitb16|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|dd+cnclip ViTB16|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|2.9GB|\n|最新更新日期|2022-08-02|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n   - 输入文本 \"在宁静的风景中画一幅美丽的建筑画，由Arthur Adams在artstation上所作\"\n\n   - 输出图像\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/184838000-566f0548-f0f5-4df6-a4af-16bb70220137.png\"  width = \"80%\" hspace='10'/>\n   <br />\n\n   - 生成过程\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/184837520-f30f21e1-5310-4925-a8d2-946337795010.gif\"  width = \"80%\" hspace='10'/>\n   <br />\n\n### 模型介绍\n\ndisco_diffusion_cnclip_vitb16 是一个文图生成模型，可以通过输入一段文字来生成符合该句子语义的图像。该模型由两部分组成，一部分是扩散模型，是一种生成模型，可以从噪声输入中重建出原始图像。另一部分是多模态预训练模型（CLIP）, 可以将文本和图像表示在同一个特征空间，相近语义的文本和图像在该特征空间里距离会更相近。在该文图生成模型中，扩散模型负责从初始噪声或者指定初始图像中来生成目标图像，CLIP负责引导生成图像的语义和输入的文本的语义尽可能接近，随着扩散模型在CLIP的引导下不断的迭代生成新图像，最终能够生成文本所描述内容的图像。该模块中使用的CLIP模型结构为ViTB16。\n\n更多详情请参考论文：[Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) 以及 [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install disco_diffusion_cnclip_vitb16\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run disco_diffusion_cnclip_vitb16 --text_prompts \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作。\" --output_dir disco_diffusion_cnclip_vitb16_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = 
hub.Module(name=\"disco_diffusion_cnclip_vitb16\")\n    text_prompts = [\"孤舟蓑笠翁，独钓寒江雪。\"]\n    # 生成图像, 默认会在disco_diffusion_cnclip_vitb16_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    da = module.generate_image(text_prompts=text_prompts, artist='齐白石', output_dir='./disco_diffusion_cnclip_vitb16_out/')  \n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_cnclip_vitb16_out-result.png')\n    # 展示所有的中间结果\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_cnclip_vitb16_out-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_cnclip_vitb16_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作\"。\n      - style(Optional[str]): 指定绘画的风格，如水墨画、油画、水彩画等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如齐白石、Greg Rutkowski，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"disco_diffusion_cnclip_vitb16_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象， 包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- 
PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m disco_diffusion_cnclip_vitb16\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果。返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，返回后对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': '孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_cnclip_vitb16\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_cnclip_vitb16_out-result.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_cnclip_vitb16_out-result.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install disco_diffusion_cnclip_vitb16 == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/README_en.md",
    "content": "# disco_diffusion_cnclip_vitb16\n\n|Module Name|disco_diffusion_cnclip_vitb16|\n| :--- | :---: |\n|Category|text to image|\n|Network|dd+cnclip ViTB16|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|2.9GB|\n|Latest update date|2022-08-02|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n   - Prompt \"在宁静的风景中画一幅美丽的建筑画，由Arthur Adams在artstation上所作\"\n\n   - Output image\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/184838000-566f0548-f0f5-4df6-a4af-16bb70220137.png\"  width = \"80%\" hspace='10'/>\n   <br />\n\n   - Generating process\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/184837520-f30f21e1-5310-4925-a8d2-946337795010.gif\"  width = \"80%\" hspace='10'/>\n   <br />\n\n### Module Introduction\n\ndisco_diffusion_cnclip_vitb16 is a text-to-image generation model that can generate images that match the semantics of the sentence you prompt. The model consists of two parts, one is the diffusion model, which is a generative model that reconstructs the original image from the noisy input. The other part is the multimodal pre-training model (CLIP), which can represent text and images in the same feature space, and text and images with similar semantics will be closer in this feature space. In the text image generation model, the diffusion model is responsible for generating the target image from the initial noise or the specified initial image, and CLIP is responsible for guiding the generated image to be as close as possible to the semantics of the input text. Diffusion model under the guidance of CLIP iteratively generates new images, eventually generating images of what the text describes. 
The CLIP model used in this module is ViTB16.\n\nFor more details, please refer to [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233) and [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install disco_diffusion_cnclip_vitb16\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction  \n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run disco_diffusion_cnclip_vitb16 --text_prompts \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作。\" --output_dir disco_diffusion_cnclip_vitb16_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_cnclip_vitb16\")\n    text_prompts = [\"孤舟蓑笠翁，独钓寒江雪。\"]\n    # Output images will be saved in disco_diffusion_cnclip_vitb16_out directory.\n    # The returned da is a DocumentArray object, which contains all intermediate and final results\n    # You can manipulate the DocumentArray object to do post-processing and save images\n    da = module.generate_image(text_prompts=text_prompts, artist='齐白石', output_dir='./disco_diffusion_cnclip_vitb16_out/')  \n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_cnclip_vitb16_out-result.png')\n    # Show all intermediate results\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    
da[0].chunks.save_gif('disco_diffusion_cnclip_vitb16_out-result.gif')\n    ```\n\n- ### 3.API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_cnclip_vitb16_out'):\n    ```\n\n    - Image generating API, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt that conforms to the format \"content\" + \"artist/style\", such as \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as 齐白石 or Greg Rutkowski; the image will be generated in the style of the chosen artist's works. If not provided, style is totally up to your [prompt](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/).\n      - width_height(Optional[List[int]]): The width and height of output images; both should be multiples of 64. 
The larger the size, the longer the computation time.\n      - seed(Optional[int]): Random seed, different seeds result in different output images.\n      - output_dir(Optional[str]): Output directory, default is \"disco_diffusion_cnclip_vitb16_out\".\n\n\n    - **Return**\n      - da(DocumentArray): DocumentArray object, including `n_batches` Documents, each of which keeps all intermediate results during generation; please refer to [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text-to-image.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m disco_diffusion_cnclip_vitb16\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': '孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_cnclip_vitb16\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_cnclip_vitb16_out-result.png')\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_cnclip_vitb16_out-result.gif')\n    ```\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install 
disco_diffusion_cnclip_vitb16 == 1.0.0\n  ```\n"
  },
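The README notes that `width_height` values should be multiples of 64; internally the runner simply floors each dimension (`side_x = (args.width_height[0] // 64) * 64`), so off-multiple sizes are silently shrunk. A tiny sketch of that behavior (`effective_side` is an illustrative helper, not part of the module):

```python
def effective_side(v: int) -> int:
    """Mirror the runner's rounding: each dimension is floored to a multiple of 64."""
    return (v // 64) * 64


for requested in (1280, 768, 700, 1000):
    print(requested, '->', effective_side(requested))
# 700 shrinks to 640 and 1000 to 960, so pass exact multiples of 64
# to get the output resolution you actually asked for.
```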
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/README.md",
    "content": "# Chinese-CLIP (Paddle)\nChinese-CLIP implemented by Paddle.\nThis module is based on [billjie1/Chinese-CLIP](https://github.com/billjie1/Chinese-CLIP).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/__init__.py",
    "content": "from .bert_tokenizer import FullTokenizer\n\n_tokenizer = FullTokenizer()\nfrom .utils import tokenize, create_model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/bert_tokenizer.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Tokenization classes.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport os\nimport re\nimport unicodedata\nfrom functools import lru_cache\n\nimport six\n\n\n@lru_cache()\ndef default_vocab():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"vocab.txt\")\n\n\ndef validate_case_matches_checkpoint(do_lower_case, init_checkpoint):\n    \"\"\"Checks whether the casing config is consistent with the checkpoint name.\"\"\"\n\n    # The casing has to be passed in by the user and there is no explicit check\n    # as to whether it matches the checkpoint. 
The casing information probably\n    # should have been stored in the bert_config.json file, but it's not, so\n    # we have to heuristically detect it to validate.\n\n    if not init_checkpoint:\n        return\n\n    m = re.match(\"^.*?([A-Za-z0-9_-]+)/bert_model.ckpt\", init_checkpoint)\n    if m is None:\n        return\n\n    model_name = m.group(1)\n\n    lower_models = [\n        \"uncased_L-24_H-1024_A-16\", \"uncased_L-12_H-768_A-12\", \"multilingual_L-12_H-768_A-12\", \"chinese_L-12_H-768_A-12\"\n    ]\n\n    cased_models = [\"cased_L-12_H-768_A-12\", \"cased_L-24_H-1024_A-16\", \"multi_cased_L-12_H-768_A-12\"]\n\n    is_bad_config = False\n    if model_name in lower_models and not do_lower_case:\n        is_bad_config = True\n        actual_flag = \"False\"\n        case_name = \"lowercased\"\n        opposite_flag = \"True\"\n\n    if model_name in cased_models and do_lower_case:\n        is_bad_config = True\n        actual_flag = \"True\"\n        case_name = \"cased\"\n        opposite_flag = \"False\"\n\n    if is_bad_config:\n        raise ValueError(\"You passed in `--do_lower_case=%s` with `--init_checkpoint=%s`. \"\n                         \"However, `%s` seems to be a %s model, so you \"\n                         \"should pass in `--do_lower_case=%s` so that the fine-tuning matches \"\n                         \"how the model was pre-trained. 
If this error is wrong, please \"\n                         \"just comment out this check.\" %\n                         (actual_flag, init_checkpoint, model_name, case_name, opposite_flag))\n\n\ndef convert_to_unicode(text):\n    \"\"\"Converts `text` to Unicode (if it's not already), assuming utf-8 input.\"\"\"\n    if six.PY3:\n        if isinstance(text, str):\n            return text\n        elif isinstance(text, bytes):\n            return text.decode(\"utf-8\", \"ignore\")\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    elif six.PY2:\n        if isinstance(text, str):\n            return text.decode(\"utf-8\", \"ignore\")\n        elif isinstance(text, unicode):\n            return text\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    else:\n        raise ValueError(\"Not running on Python 2 or Python 3?\")\n\n\ndef printable_text(text):\n    \"\"\"Returns text encoded in a way suitable for print or `tf.logging`.\"\"\"\n\n    # These functions want `str` for both Python 2 and Python 3, but in one case\n    # it's a Unicode string and in the other it's a byte string.\n    if six.PY3:\n        if isinstance(text, str):\n            return text\n        elif isinstance(text, bytes):\n            return text.decode(\"utf-8\", \"ignore\")\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    elif six.PY2:\n        if isinstance(text, str):\n            return text\n        elif isinstance(text, unicode):\n            return text.encode(\"utf-8\")\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    else:\n        raise ValueError(\"Not running on Python 2 or Python 3?\")\n\n\ndef load_vocab(vocab_file):\n    \"\"\"Loads a vocabulary file into a dictionary.\"\"\"\n    vocab = collections.OrderedDict()\n    index = 0\n    # explicit utf-8 avoids UnicodeDecodeError on platforms with a non-utf-8 default encoding\n    with open(vocab_file, \"r\", encoding=\"utf-8\") as reader:\n        while True:\n 
           token = convert_to_unicode(reader.readline())\n            if not token:\n                break\n            token = token.strip()\n            vocab[token] = index\n            index += 1\n    return vocab\n\n\ndef convert_by_vocab(vocab, items):\n    \"\"\"Converts a sequence of [tokens|ids] using the vocab.\"\"\"\n    output = []\n    for item in items:\n        output.append(vocab[item])\n    return output\n\n\ndef convert_tokens_to_ids(vocab, tokens):\n    return convert_by_vocab(vocab, tokens)\n\n\ndef convert_ids_to_tokens(inv_vocab, ids):\n    return convert_by_vocab(inv_vocab, ids)\n\n\ndef whitespace_tokenize(text):\n    \"\"\"Runs basic whitespace cleaning and splitting on a piece of text.\"\"\"\n    text = text.strip()\n    if not text:\n        return []\n    tokens = text.split()\n    return tokens\n\n\nclass FullTokenizer(object):\n    \"\"\"Runs end-to-end tokenization.\"\"\"\n\n    def __init__(self, vocab_file=default_vocab(), do_lower_case=True):\n        self.vocab = load_vocab(vocab_file)\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n        self.basic_tokenizer = BasicTokenizer(do_lower_case=do_lower_case)\n        self.wordpiece_tokenizer = WordpieceTokenizer(vocab=self.vocab)\n\n    def tokenize(self, text):\n        split_tokens = []\n        for token in self.basic_tokenizer.tokenize(text):\n            for sub_token in self.wordpiece_tokenizer.tokenize(token):\n                split_tokens.append(sub_token)\n\n        return split_tokens\n\n    def convert_tokens_to_ids(self, tokens):\n        return convert_by_vocab(self.vocab, tokens)\n\n    def convert_ids_to_tokens(self, ids):\n        return convert_by_vocab(self.inv_vocab, ids)\n\n    @staticmethod\n    def convert_tokens_to_string(tokens, clean_up_tokenization_spaces=True):\n        \"\"\" Converts a sequence of tokens (string) into a single string. 
\"\"\"\n\n        def clean_up_tokenization(out_string):\n            \"\"\" Clean up a list of simple English tokenization artifacts\n            like spaces before punctuations and abreviated forms.\n            \"\"\"\n            out_string = (out_string.replace(\" .\", \".\").replace(\" ?\", \"?\").replace(\" !\", \"!\").replace(\n                \" ,\",\n                \",\").replace(\" ' \",\n                             \"'\").replace(\" n't\",\n                                          \"n't\").replace(\" 'm\",\n                                                         \"'m\").replace(\" 's\",\n                                                                       \"'s\").replace(\" 've\",\n                                                                                     \"'ve\").replace(\" 're\", \"'re\"))\n            return out_string\n\n        text = ' '.join(tokens).replace(' ##', '').strip()\n        if clean_up_tokenization_spaces:\n            clean_text = clean_up_tokenization(text)\n            return clean_text\n        else:\n            return text\n\n    def vocab_size(self):\n        return len(self.vocab)\n\n\nclass BasicTokenizer(object):\n    \"\"\"Runs basic tokenization (punctuation splitting, lower casing, etc.).\"\"\"\n\n    def __init__(self, do_lower_case=True):\n        \"\"\"Constructs a BasicTokenizer.\n\n        Args:\n          do_lower_case: Whether to lower case the input.\n        \"\"\"\n        self.do_lower_case = do_lower_case\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text.\"\"\"\n        text = convert_to_unicode(text)\n        text = self._clean_text(text)\n\n        # This was added on November 1st, 2018 for the multilingual and Chinese\n        # models. 
This is also applied to the English models now, but it doesn't\n        # matter since the English models were not trained on any Chinese data\n        # and generally don't have any Chinese data in them (there are Chinese\n        # characters in the vocabulary because Wikipedia does have some Chinese\n        # words in the English Wikipedia.).\n        text = self._tokenize_chinese_chars(text)\n\n        orig_tokens = whitespace_tokenize(text)\n        split_tokens = []\n        for token in orig_tokens:\n            if self.do_lower_case:\n                token = token.lower()\n                token = self._run_strip_accents(token)\n            split_tokens.extend(self._run_split_on_punc(token))\n\n        output_tokens = whitespace_tokenize(\" \".join(split_tokens))\n        return output_tokens\n\n    def _run_strip_accents(self, text):\n        \"\"\"Strips accents from a piece of text.\"\"\"\n        text = unicodedata.normalize(\"NFD\", text)\n        output = []\n        for char in text:\n            cat = unicodedata.category(char)\n            if cat == \"Mn\":\n                continue\n            output.append(char)\n        return \"\".join(output)\n\n    def _run_split_on_punc(self, text):\n        \"\"\"Splits punctuation on a piece of text.\"\"\"\n        chars = list(text)\n        i = 0\n        start_new_word = True\n        output = []\n        while i < len(chars):\n            char = chars[i]\n            if _is_punctuation(char):\n                output.append([char])\n                start_new_word = True\n            else:\n                if start_new_word:\n                    output.append([])\n                start_new_word = False\n                output[-1].append(char)\n            i += 1\n\n        return [\"\".join(x) for x in output]\n\n    def _tokenize_chinese_chars(self, text):\n        \"\"\"Adds whitespace around any CJK character.\"\"\"\n        output = []\n        for char in text:\n            cp = ord(char)\n         
   if self._is_chinese_char(cp):\n                output.append(\" \")\n                output.append(char)\n                output.append(\" \")\n            else:\n                output.append(char)\n        return \"\".join(output)\n\n    def _is_chinese_char(self, cp):\n        \"\"\"Checks whether CP is the codepoint of a CJK character.\"\"\"\n        # This defines a \"chinese character\" as anything in the CJK Unicode block:\n        #   https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block)\n        #\n        # Note that the CJK Unicode block is NOT all Japanese and Korean characters,\n        # despite its name. The modern Korean Hangul alphabet is a different block,\n        # as is Japanese Hiragana and Katakana. Those alphabets are used to write\n        # space-separated words, so they are not treated specially and are handled\n        # like all of the other languages.\n        if ((cp >= 0x4E00 and cp <= 0x9FFF) or  #\n            (cp >= 0x3400 and cp <= 0x4DBF) or  #\n            (cp >= 0x20000 and cp <= 0x2A6DF) or  #\n            (cp >= 0x2A700 and cp <= 0x2B73F) or  #\n            (cp >= 0x2B740 and cp <= 0x2B81F) or  #\n            (cp >= 0x2B820 and cp <= 0x2CEAF) or (cp >= 0xF900 and cp <= 0xFAFF) or  #\n            (cp >= 0x2F800 and cp <= 0x2FA1F)):  #\n            return True\n\n        return False\n\n    def _clean_text(self, text):\n        \"\"\"Performs invalid character removal and whitespace cleanup on text.\"\"\"\n        output = []\n        for char in text:\n            cp = ord(char)\n            if cp == 0 or cp == 0xfffd or _is_control(char):\n                continue\n            if _is_whitespace(char):\n                output.append(\" \")\n            else:\n                output.append(char)\n        return \"\".join(output)\n\n\nclass WordpieceTokenizer(object):\n    \"\"\"Runs WordPiece tokenization.\"\"\"\n\n    def __init__(self, vocab, unk_token=\"[UNK]\", max_input_chars_per_word=200):\n        
self.vocab = vocab\n        self.unk_token = unk_token\n        self.max_input_chars_per_word = max_input_chars_per_word\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text into its word pieces.\n\n        This uses a greedy longest-match-first algorithm to perform tokenization\n        using the given vocabulary.\n\n        For example:\n          input = \"unaffable\"\n          output = [\"un\", \"##aff\", \"##able\"]\n\n        Args:\n          text: A single token or whitespace separated tokens. This should have\n            already been passed through `BasicTokenizer`.\n\n        Returns:\n          A list of wordpiece tokens.\n        \"\"\"\n\n        text = convert_to_unicode(text)\n\n        output_tokens = []\n        for token in whitespace_tokenize(text):\n            chars = list(token)\n            if len(chars) > self.max_input_chars_per_word:\n                output_tokens.append(self.unk_token)\n                continue\n\n            is_bad = False\n            start = 0\n            sub_tokens = []\n            while start < len(chars):\n                end = len(chars)\n                cur_substr = None\n                while start < end:\n                    substr = \"\".join(chars[start:end])\n                    if start > 0:\n                        substr = \"##\" + substr\n                    if substr in self.vocab:\n                        cur_substr = substr\n                        break\n                    end -= 1\n                if cur_substr is None:\n                    is_bad = True\n                    break\n                sub_tokens.append(cur_substr)\n                start = end\n\n            if is_bad:\n                output_tokens.append(self.unk_token)\n            else:\n                output_tokens.extend(sub_tokens)\n        return output_tokens\n\n\ndef _is_whitespace(char):\n    \"\"\"Checks whether `chars` is a whitespace character.\"\"\"\n    # \\t, \\n, and \\r are technically control 
characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == \" \" or char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return True\n    cat = unicodedata.category(char)\n    if cat == \"Zs\":\n        return True\n    return False\n\n\ndef _is_control(char):\n    \"\"\"Checks whether `chars` is a control character.\"\"\"\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return False\n    cat = unicodedata.category(char)\n    if cat in (\"Cc\", \"Cf\"):\n        return True\n    return False\n\n\ndef _is_punctuation(char):\n    \"\"\"Checks whether `chars` is a punctuation character.\"\"\"\n    cp = ord(char)\n    # We treat all non-letter/number ASCII as punctuation.\n    # Characters such as \"^\", \"$\", and \"`\" are not in the Unicode\n    # Punctuation class but we treat them as punctuation anyways, for\n    # consistency.\n    if ((cp >= 33 and cp <= 47) or (cp >= 58 and cp <= 64) or (cp >= 91 and cp <= 96) or (cp >= 123 and cp <= 126)):\n        return True\n    cat = unicodedata.category(char)\n    if cat.startswith(\"P\"):\n        return True\n    return False\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/configuration_bert.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.\n# Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\" BERT model configuration \"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport logging\n\nlogger = logging.getLogger(__name__)\n\n\nclass BertConfig(object):\n    r\"\"\"\n        :class:`~transformers.BertConfig` is the configuration class to store the configuration of a\n        `BertModel`.\n\n\n        Arguments:\n            vocab_size_or_config_json_file: Vocabulary size of `inputs_ids` in `BertModel`.\n            hidden_size: Size of the encoder layers and the pooler layer.\n            num_hidden_layers: Number of hidden layers in the Transformer encoder.\n            num_attention_heads: Number of attention heads for each attention layer in\n                the Transformer encoder.\n            intermediate_size: The size of the \"intermediate\" (i.e., feed-forward)\n                layer in the Transformer encoder.\n            hidden_act: The non-linear activation function (function or string) in the\n                encoder and pooler. 
If string, \"gelu\", \"relu\", \"swish\" and \"gelu_new\" are supported.\n            hidden_dropout_prob: The dropout probabilitiy for all fully connected\n                layers in the embeddings, encoder, and pooler.\n            attention_probs_dropout_prob: The dropout ratio for the attention\n                probabilities.\n            max_position_embeddings: The maximum sequence length that this model might\n                ever be used with. Typically set this to something large just in case\n                (e.g., 512 or 1024 or 2048).\n            type_vocab_size: The vocabulary size of the `token_type_ids` passed into\n                `BertModel`.\n            initializer_range: The sttdev of the truncated_normal_initializer for\n                initializing all weight matrices.\n            layer_norm_eps: The epsilon used by LayerNorm.\n    \"\"\"\n\n    def __init__(self,\n                 vocab_size_or_config_json_file=30522,\n                 hidden_size=768,\n                 num_hidden_layers=12,\n                 num_attention_heads=12,\n                 intermediate_size=3072,\n                 hidden_act=\"gelu\",\n                 hidden_dropout_prob=0.1,\n                 attention_probs_dropout_prob=0.1,\n                 max_position_embeddings=512,\n                 type_vocab_size=2,\n                 initializer_range=0.02,\n                 layer_norm_eps=1e-12,\n                 output_attentions=False,\n                 output_hidden_states=False):\n        self.vocab_size = vocab_size_or_config_json_file\n        self.hidden_size = hidden_size\n        self.num_hidden_layers = num_hidden_layers\n        self.num_attention_heads = num_attention_heads\n        self.hidden_act = hidden_act\n        self.intermediate_size = intermediate_size\n        self.hidden_dropout_prob = hidden_dropout_prob\n        self.attention_probs_dropout_prob = attention_probs_dropout_prob\n        self.max_position_embeddings = max_position_embeddings\n    
    self.type_vocab_size = type_vocab_size\n        self.initializer_range = initializer_range\n        self.layer_norm_eps = layer_norm_eps\n        self.output_attentions = output_attentions\n        self.output_hidden_states = output_hidden_states\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model.py",
    "content": "from collections import OrderedDict\nfrom typing import Tuple\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\nfrom disco_diffusion_cnclip_vitb16.cn_clip.clip import _tokenizer\nfrom disco_diffusion_cnclip_vitb16.cn_clip.clip.configuration_bert import BertConfig\nfrom disco_diffusion_cnclip_vitb16.cn_clip.clip.modeling_bert import BertModel\nfrom paddle import nn\nfrom paddle.nn import MultiHeadAttention\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else nn.Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU(inplace=True)\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            # downsampling layer is prepended with an avgpool, and the subsequent convolution has stride 1\n            self.downsample = nn.Sequential(\n                OrderedDict([(\"-1\", nn.AvgPool2D(stride)),\n                             (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                             (\"1\", nn.BatchNorm2D(planes * self.expansion))]))\n\n    def forward(self, x: paddle.Tensor):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = 
self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x: paddle.Tensor):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask: paddle.Tensor = None):\n        super().__init__()\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential(*[(\"c_fc\", nn.Linear(d_model, d_model * 4)), (\n            \"gelu\", QuickGELU()), (\"c_proj\", nn.Linear(d_model * 4, d_model))])\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x: paddle.Tensor):\n        # cast the mask to the activation dtype; paddle tensors carry their own placement,\n        # so no torch-style .to(device=...) is needed\n        self.attn_mask = paddle.cast(self.attn_mask, x.dtype) if self.attn_mask is not None else None\n        return self.attn(x, x, x, attn_mask=self.attn_mask)\n\n    def forward(self, x: paddle.Tensor):\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask: paddle.Tensor = None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x: paddle.Tensor):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n          
                     kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        scale = width**-0.5\n        # self.class_embedding = nn.Parameter(scale * paddle.randn(width))\n        class_embedding = self.create_parameter([width])\n        self.add_parameter(\"class_embedding\", class_embedding)\n        # self.positional_embedding = nn.Parameter(scale * paddle.randn([(input_resolution // patch_size) ** 2 + 1, width)])\n        positional_embedding = self.create_parameter([(input_resolution // patch_size)**2 + 1, width])\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        # self.proj = nn.Parameter(scale * paddle.randn([width, output_dim]))\n        proj = self.create_parameter([width, output_dim])\n        self.add_parameter(\"proj\", proj)\n\n    def forward(self, x: paddle.Tensor):\n        x = self.conv1(x)  # shape = [*, width, grid, grid]\n        x = x.reshape([x.shape[0], x.shape[1], -1])  # shape = [*, width, grid ** 2]\n        x = x.transpose([0, 2, 1])  # shape = [*, grid ** 2, width]\n        x = paddle.concat([self.class_embedding + paddle.zeros([x.shape[0], 1, x.shape[-1]], dtype=x.dtype), x],\n                          axis=1)  # shape = [*, grid ** 2 + 1, width]\n        x = x + paddle.cast(self.positional_embedding, x.dtype)\n        x = self.ln_pre(x)\n\n        x = self.transformer(x)\n\n        x = self.ln_post(x[:, 0, :])\n\n        if self.proj is not None:\n            x = x @ self.proj\n\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n        self,\n        embed_dim: int,\n        # vision\n        image_resolution: int,\n        vision_layers: Union[Tuple[int, int, int, int], int],\n        vision_width: int,\n        vision_patch_size: int,\n        # text\n  
      vocab_size: int,\n        text_attention_probs_dropout_prob: float,\n        text_hidden_act: str,\n        text_hidden_dropout_prob: float,\n        text_hidden_size: int,\n        text_initializer_range: float,\n        text_intermediate_size: int,\n        text_max_position_embeddings: int,\n        text_num_attention_heads: int,\n        text_num_hidden_layers: int,\n        text_type_vocab_size: int,\n        tokenizer=_tokenizer,\n    ):\n        super().__init__()\n\n        vision_heads = vision_width // 64\n        self.visual = VisualTransformer(input_resolution=image_resolution,\n                                        patch_size=vision_patch_size,\n                                        width=vision_width,\n                                        layers=vision_layers,\n                                        heads=vision_heads,\n                                        output_dim=embed_dim)\n\n        self.bert_config = BertConfig(\n            vocab_size_or_config_json_file=vocab_size,\n            hidden_size=text_hidden_size,\n            num_hidden_layers=text_num_hidden_layers,\n            num_attention_heads=text_num_attention_heads,\n            intermediate_size=text_intermediate_size,\n            hidden_act=text_hidden_act,\n            hidden_dropout_prob=text_hidden_dropout_prob,\n            attention_probs_dropout_prob=text_attention_probs_dropout_prob,\n            max_position_embeddings=text_max_position_embeddings,\n            type_vocab_size=text_type_vocab_size,\n            initializer_range=text_initializer_range,\n            layer_norm_eps=1e-12,\n        )\n        self.bert = BertModel(self.bert_config)\n\n        text_projection = self.create_parameter([text_hidden_size, embed_dim])\n        self.add_parameter(\"text_projection\", text_projection)\n        logit_scale = self.create_parameter([1])\n        self.add_parameter(\"logit_scale\", logit_scale)\n\n        self.tokenizer = tokenizer\n\n    @property\n    def 
dtype(self):\n        return self.visual.conv1.weight.dtype\n\n    def encode_image(self, image):\n        return self.visual(image.cast(self.dtype))\n\n    def encode_text(self, text):\n        pad_index = self.tokenizer.vocab['[PAD]']\n\n        attn_mask = text.not_equal(paddle.to_tensor(pad_index)).cast(self.dtype)\n\n        x = self.bert(text, attention_mask=attn_mask)[0].cast(self.dtype)  # [batch_size, seq_length, hidden_size]\n        return x[:, 0, :] @ self.text_projection\n\n    def forward(self, image, text):\n        assert image is not None or text is not None, \"text and image cannot both be None!\"\n\n        if image is None:\n            return self.encode_text(text)\n        elif text is None:\n            return self.encode_image(image)\n        image_features = self.encode_image(image)\n        text_features = self.encode_text(text)\n\n        image_features = image_features / image_features.norm(axis=-1, keepdim=True)\n        text_features = text_features / text_features.norm(axis=-1, keepdim=True)\n\n        return image_features, text_features, self.logit_scale.exp()\n\n    def get_similarity(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = self.encode_text(text)\n\n        # normalized features\n        image_features = image_features / image_features.norm(axis=1, keepdim=True)\n        text_features = text_features / text_features.norm(axis=1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = logit_scale * image_features @ text_features.t()\n        logits_per_text = logits_per_image.t()\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model_configs/RoBERTa-wwm-ext-base-chinese.json",
    "content": "{\n    \"vocab_size\": 21128,\n    \"text_attention_probs_dropout_prob\": 0.1,\n    \"text_hidden_act\": \"gelu\",\n    \"text_hidden_dropout_prob\": 0.1,\n    \"text_hidden_size\": 768,\n    \"text_initializer_range\": 0.02,\n    \"text_intermediate_size\": 3072,\n    \"text_max_position_embeddings\": 512,\n    \"text_num_attention_heads\": 12,\n    \"text_num_hidden_layers\": 12,\n    \"text_type_vocab_size\": 2\n}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model_configs/RoBERTa-wwm-ext-large-chinese.json",
    "content": "{\n    \"vocab_size\": 21128,\n    \"text_attention_probs_dropout_prob\": 0.1,\n    \"text_hidden_act\": \"gelu\",\n    \"text_hidden_dropout_prob\": 0.1,\n    \"text_hidden_size\": 1024,\n    \"text_initializer_range\": 0.02,\n    \"text_intermediate_size\": 4096,\n    \"text_max_position_embeddings\": 512,\n    \"text_num_attention_heads\": 16,\n    \"text_num_hidden_layers\": 24,\n    \"text_type_vocab_size\": 2\n}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model_configs/ViT-B-16.json",
    "content": "{\n    \"embed_dim\": 512,\n    \"image_resolution\": 224,\n    \"vision_layers\": 12,\n    \"vision_width\": 768,\n    \"vision_patch_size\": 16\n}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model_configs/ViT-B-32.json",
    "content": "{\n    \"embed_dim\": 512,\n    \"image_resolution\": 224,\n    \"vision_layers\": 12,\n    \"vision_width\": 768,\n    \"vision_patch_size\": 32\n}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/model_configs/ViT-L-14.json",
    "content": "{\n    \"embed_dim\": 768,\n    \"image_resolution\": 224,\n    \"vision_layers\": 24,\n    \"vision_width\": 1024,\n    \"vision_patch_size\": 14\n}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/modeling_bert.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.\n# Copyright (c) 2018, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"PyTorch BERT model. \"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport json\nimport logging\nimport math\nimport os\nimport sys\nfrom io import open\n\nimport paddle\nfrom paddle import nn\n\nfrom .configuration_bert import BertConfig\n\nlogger = logging.getLogger(__name__)\n\n\ndef gelu(x):\n    \"\"\" Original Implementation of the gelu activation function in Google Bert repo when initially created.\n        For information: OpenAI GPT's gelu is slightly different (and gives slightly different results):\n        0.5 * x * (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3))))\n        Also see https://arxiv.org/abs/1606.08415\n    \"\"\"\n    return x * 0.5 * (1.0 + paddle.erf(x / math.sqrt(2.0)))\n\n\ndef gelu_new(x):\n    \"\"\" Implementation of the gelu activation function currently in Google Bert repo (identical to OpenAI GPT).\n        Also see https://arxiv.org/abs/1606.08415\n    \"\"\"\n    return 0.5 * x * (1 + paddle.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef swish(x):\n    return x * paddle.nn.functional.sigmoid(x)\n\n\nACT2FN = {\"gelu\": gelu, 
\"relu\": paddle.nn.functional.relu, \"swish\": swish, \"gelu_new\": gelu_new}\n\nBertLayerNorm = paddle.nn.LayerNorm\n\n\nclass BertEmbeddings(nn.Layer):\n    \"\"\"Construct the embeddings from word, position and token_type embeddings.\n    \"\"\"\n\n    def __init__(self, config):\n        super(BertEmbeddings, self).__init__()\n        self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size)  #, padding_idx=0)\n        self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)\n        self.token_type_embeddings = nn.Embedding(config.type_vocab_size, config.hidden_size)\n\n        # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load\n        # any TensorFlow checkpoint file\n        self.LayerNorm = BertLayerNorm(config.hidden_size, epsilon=config.layer_norm_eps)\n        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n\n    def forward(self, input_ids, token_type_ids=None, position_ids=None):\n        seq_length = input_ids.shape[1]\n        if position_ids is None:\n            position_ids = paddle.arange(seq_length, dtype='int64')\n            position_ids = position_ids.unsqueeze(0).expand_as(input_ids)\n        if token_type_ids is None:\n            token_type_ids = paddle.zeros_like(input_ids)\n\n        words_embeddings = self.word_embeddings(input_ids)\n        position_embeddings = self.position_embeddings(position_ids)\n\n        token_type_embeddings = self.token_type_embeddings(token_type_ids)\n\n        embeddings = words_embeddings + position_embeddings + token_type_embeddings\n        embeddings = self.LayerNorm(embeddings)\n\n        embeddings = self.dropout(embeddings)\n\n        return embeddings\n\n\nclass BertSelfAttention(nn.Layer):\n\n    def __init__(self, config):\n        super(BertSelfAttention, self).__init__()\n        if config.hidden_size % config.num_attention_heads != 0:\n            raise ValueError(\"The hidden size (%d) 
is not a multiple of the number of attention \"\n                             \"heads (%d)\" % (config.hidden_size, config.num_attention_heads))\n        self.output_attentions = config.output_attentions\n\n        self.num_attention_heads = config.num_attention_heads\n        self.attention_head_size = int(config.hidden_size / config.num_attention_heads)\n        self.all_head_size = self.num_attention_heads * self.attention_head_size\n\n        self.query = nn.Linear(config.hidden_size, self.all_head_size)\n        self.key = nn.Linear(config.hidden_size, self.all_head_size)\n        self.value = nn.Linear(config.hidden_size, self.all_head_size)\n\n        self.dropout = nn.Dropout(config.attention_probs_dropout_prob)\n\n    def transpose_for_scores(self, x):\n        new_x_shape = x.shape[:-1] + [self.num_attention_heads, self.attention_head_size]\n        x = x.reshape(new_x_shape)\n        return x.transpose([0, 2, 1, 3])\n\n    def forward(self, hidden_states, attention_mask=None, head_mask=None):\n        mixed_query_layer = self.query(hidden_states)\n        mixed_key_layer = self.key(hidden_states)\n        mixed_value_layer = self.value(hidden_states)\n\n        query_layer = self.transpose_for_scores(mixed_query_layer)\n        key_layer = self.transpose_for_scores(mixed_key_layer)\n        value_layer = self.transpose_for_scores(mixed_value_layer)\n\n        # Take the dot product between \"query\" and \"key\" to get the raw attention scores.\n        attention_scores = paddle.matmul(query_layer, key_layer.transpose([0, 1, 3, 2]))\n        attention_scores = attention_scores / math.sqrt(self.attention_head_size)\n        if attention_mask is not None:\n            # Apply the attention mask (precomputed for all layers in BertModel forward() function)\n            attention_scores = attention_scores + attention_mask\n\n        # Normalize the attention scores to probabilities.\n        attention_probs = nn.Softmax(axis=-1)(attention_scores)\n\n        
# This is actually dropping out entire tokens to attend to, which might\n        # seem a bit unusual, but is taken from the original Transformer paper.\n        attention_probs = self.dropout(attention_probs)\n\n        # Mask heads if we want to\n        if head_mask is not None:\n            attention_probs = attention_probs * head_mask\n\n        context_layer = paddle.matmul(attention_probs, value_layer)\n\n        context_layer = context_layer.transpose([0, 2, 1, 3])\n        new_context_layer_shape = context_layer.shape[:-2] + [self.all_head_size]\n        context_layer = context_layer.reshape(new_context_layer_shape)\n\n        outputs = (context_layer, attention_probs) if self.output_attentions else (context_layer, )\n        return outputs\n\n\nclass BertSelfOutput(nn.Layer):\n\n    def __init__(self, config):\n        super(BertSelfOutput, self).__init__()\n        self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n        self.LayerNorm = BertLayerNorm(config.hidden_size, epsilon=config.layer_norm_eps)\n        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n\n    def forward(self, hidden_states, input_tensor):\n        hidden_states = self.dense(hidden_states)\n        hidden_states = self.dropout(hidden_states)\n        hidden_states = self.LayerNorm(hidden_states + input_tensor)\n        return hidden_states\n\n\nclass BertAttention(nn.Layer):\n\n    def __init__(self, config):\n        super(BertAttention, self).__init__()\n        self.self = BertSelfAttention(config)\n        self.output = BertSelfOutput(config)\n        self.pruned_heads = set()\n\n    def forward(self, input_tensor, attention_mask=None, head_mask=None):\n        self_outputs = self.self(input_tensor, attention_mask, head_mask)\n        attention_output = self.output(self_outputs[0], input_tensor)\n        outputs = (attention_output, ) + self_outputs[1:]  # add attentions if we output them\n        return outputs\n\n\nclass BertIntermediate(nn.Layer):\n\n    
def __init__(self, config):\n        super(BertIntermediate, self).__init__()\n        self.dense = nn.Linear(config.hidden_size, config.intermediate_size)\n        if isinstance(config.hidden_act, str) or (sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode)):\n            self.intermediate_act_fn = ACT2FN[config.hidden_act]\n        else:\n            self.intermediate_act_fn = config.hidden_act\n\n    def forward(self, hidden_states):\n        hidden_states = self.dense(hidden_states)\n        hidden_states = self.intermediate_act_fn(hidden_states)\n        return hidden_states\n\n\nclass BertOutput(nn.Layer):\n\n    def __init__(self, config):\n        super(BertOutput, self).__init__()\n        self.dense = nn.Linear(config.intermediate_size, config.hidden_size)\n        self.LayerNorm = BertLayerNorm(config.hidden_size, epsilon=config.layer_norm_eps)\n        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n\n    def forward(self, hidden_states, input_tensor):\n        hidden_states = self.dense(hidden_states)\n        hidden_states = self.dropout(hidden_states)\n        hidden_states = self.LayerNorm(hidden_states + input_tensor)\n        return hidden_states\n\n\nclass BertLayer(nn.Layer):\n\n    def __init__(self, config):\n        super(BertLayer, self).__init__()\n        self.attention = BertAttention(config)\n        self.intermediate = BertIntermediate(config)\n        self.output = BertOutput(config)\n\n    def forward(self, hidden_states, attention_mask=None, head_mask=None):\n        attention_outputs = self.attention(hidden_states, attention_mask, head_mask)\n        attention_output = attention_outputs[0]\n        intermediate_output = self.intermediate(attention_output)\n        layer_output = self.output(intermediate_output, attention_output)\n        outputs = (layer_output, ) + attention_outputs[1:]  # add attentions if we output them\n        return outputs\n\n\nclass BertEncoder(nn.Layer):\n\n    def __init__(self, 
config):\n        super(BertEncoder, self).__init__()\n        self.output_attentions = config.output_attentions\n        self.output_hidden_states = config.output_hidden_states\n        self.layer = nn.LayerList([BertLayer(config) for _ in range(config.num_hidden_layers)])\n\n    def forward(self, hidden_states, attention_mask=None, head_mask=None):\n        all_hidden_states = ()\n        all_attentions = ()\n        for i, layer_module in enumerate(self.layer):\n            if self.output_hidden_states:\n                all_hidden_states = all_hidden_states + (hidden_states, )\n\n            layer_outputs = layer_module(hidden_states, attention_mask, head_mask[i])\n            hidden_states = layer_outputs[0]\n\n            if self.output_attentions:\n                all_attentions = all_attentions + (layer_outputs[1], )\n        # Add last layer\n        if self.output_hidden_states:\n            all_hidden_states = all_hidden_states + (hidden_states, )\n\n        outputs = (hidden_states, )\n        if self.output_hidden_states:\n            outputs = outputs + (all_hidden_states, )\n        if self.output_attentions:\n            outputs = outputs + (all_attentions, )\n        return outputs  # last-layer hidden state, (all hidden states), (all attentions)\n\n\nclass BertPooler(nn.Layer):\n\n    def __init__(self, config):\n        super(BertPooler, self).__init__()\n        self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n        self.activation = nn.Tanh()\n\n    def forward(self, hidden_states):\n        # We \"pool\" the model by simply taking the hidden state corresponding\n        # to the first token.\n        first_token_tensor = hidden_states[:, 0]\n        pooled_output = self.dense(first_token_tensor)\n        pooled_output = self.activation(pooled_output)\n        return pooled_output\n\n\nclass BertPredictionHeadTransform(nn.Layer):\n\n    def __init__(self, config):\n        super(BertPredictionHeadTransform, self).__init__()\n    
    self.dense = nn.Linear(config.hidden_size, config.hidden_size)\n        if isinstance(config.hidden_act, str) or (sys.version_info[0] == 2 and isinstance(config.hidden_act, unicode)):\n            self.transform_act_fn = ACT2FN[config.hidden_act]\n        else:\n            self.transform_act_fn = config.hidden_act\n        self.LayerNorm = BertLayerNorm(config.hidden_size, epsilon=config.layer_norm_eps)\n\n    def forward(self, hidden_states):\n        hidden_states = self.dense(hidden_states)\n        hidden_states = self.transform_act_fn(hidden_states)\n        hidden_states = self.LayerNorm(hidden_states)\n        return hidden_states\n\n\nclass BertLMPredictionHead(nn.Layer):\n\n    def __init__(self, config):\n        super(BertLMPredictionHead, self).__init__()\n        self.transform = BertPredictionHeadTransform(config)\n\n        # The output weights are the same as the input embeddings, but there is\n        # an output-only bias for each token.\n        self.decoder = nn.Linear(config.hidden_size, config.vocab_size, bias_attr=False)\n\n        # Zero-initialized bias parameter (paddle.nn has no nn.Parameter; use Layer.create_parameter).\n        self.bias = self.create_parameter(shape=[config.vocab_size], is_bias=True, default_initializer=nn.initializer.Constant(0.0))\n\n    def forward(self, hidden_states):\n        hidden_states = self.transform(hidden_states)\n        hidden_states = self.decoder(hidden_states) + self.bias\n        return hidden_states\n\n\nclass BertOnlyMLMHead(nn.Layer):\n\n    def __init__(self, config):\n        super(BertOnlyMLMHead, self).__init__()\n        self.predictions = BertLMPredictionHead(config)\n\n    def forward(self, sequence_output):\n        prediction_scores = self.predictions(sequence_output)\n        return prediction_scores\n\n\nclass BertOnlyNSPHead(nn.Layer):\n\n    def __init__(self, config):\n        super(BertOnlyNSPHead, self).__init__()\n        self.seq_relationship = nn.Linear(config.hidden_size, 2)\n\n    def forward(self, pooled_output):\n        seq_relationship_score = self.seq_relationship(pooled_output)\n        return seq_relationship_score\n\n\nclass BertPreTrainingHeads(nn.Layer):\n\n    def __init__(self, config):\n        super(BertPreTrainingHeads, self).__init__()\n        self.predictions = BertLMPredictionHead(config)\n        self.seq_relationship = nn.Linear(config.hidden_size, 2)\n\n    def forward(self, sequence_output, pooled_output):\n        prediction_scores = self.predictions(sequence_output)\n        seq_relationship_score = self.seq_relationship(pooled_output)\n        return prediction_scores, seq_relationship_score\n\n\nclass BertPreTrainedModel(nn.Layer):\n    config_class = BertConfig\n    base_model_prefix = \"bert\"\n\n    def __init__(self, config):\n        super(BertPreTrainedModel, self).__init__()\n        self.config = config\n\n\nclass BertModel(BertPreTrainedModel):\n    r\"\"\"\n    Outputs: `Tuple` comprising various elements depending on the configuration (config) and inputs:\n        **last_hidden_state**: ``paddle.Tensor`` of shape ``(batch_size, sequence_length, hidden_size)``\n            Sequence of hidden-states at the output of the last layer of the model.\n        **pooler_output**: ``paddle.Tensor`` of shape ``(batch_size, hidden_size)``\n            Last layer hidden-state of the first token of the sequence (classification token)\n            further processed by a Linear layer and a Tanh activation function. The Linear\n            layer weights are trained from the next sentence prediction (classification)\n            objective during Bert pretraining. This output is usually *not* a good summary\n            of the semantic content of the input, you're often better off averaging or pooling\n            the sequence of hidden-states for the whole input sequence.\n        **hidden_states**: (`optional`, returned when ``config.output_hidden_states=True``)\n            list of ``paddle.Tensor`` (one for the output of each layer + the output of the embeddings)\n            of shape ``(batch_size, sequence_length, hidden_size)``:\n            Hidden-states of the model at the output of each layer plus the initial embedding outputs.\n        **attentions**: (`optional`, returned when ``config.output_attentions=True``)\n            list of ``paddle.Tensor`` (one for each layer) of shape ``(batch_size, num_heads, sequence_length, sequence_length)``:\n            Attention weights after the attention softmax, used to compute the weighted average in the self-attention heads.\n\n    Examples::\n\n        tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')\n        model = BertModel.from_pretrained('bert-base-uncased')\n        input_ids = paddle.to_tensor(tokenizer.encode(\"Hello, my dog is cute\")).unsqueeze(0)  # Batch size 1\n        outputs = model(input_ids)\n        last_hidden_states = outputs[0]  # The last hidden-state is the first element of the output tuple\n\n    \"\"\"\n\n    def __init__(self, config):\n        super(BertModel, self).__init__(config)\n\n        self.embeddings = BertEmbeddings(config)\n        self.encoder = BertEncoder(config)\n        self.pooler = BertPooler(config)\n\n    def forward(self, input_ids, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None):\n        if attention_mask is None:\n            attention_mask = paddle.ones_like(input_ids)\n        if token_type_ids is None:\n            token_type_ids = paddle.zeros_like(input_ids)\n\n        # We create a 3D attention mask from a 2D tensor mask.\n        # Sizes are [batch_size, 1, 1, 
to_seq_length]\n        # So we can broadcast to [batch_size, num_heads, from_seq_length, to_seq_length]\n        # this attention mask is simpler than the triangular masking of causal attention\n        # used in OpenAI GPT, we just need to prepare the broadcast dimension here.\n        extended_attention_mask = attention_mask.unsqueeze(1).unsqueeze(2)\n\n        # Since attention_mask is 1.0 for positions we want to attend and 0.0 for\n        # masked positions, this operation will create a tensor which is 0.0 for\n        # positions we want to attend and -10000.0 for masked positions.\n        # Since we are adding it to the raw scores before the softmax, this is\n        # effectively the same as removing these entirely.\n        extended_attention_mask = extended_attention_mask.cast(dtype=self.parameters()[0].dtype)  # fp16 compatibility\n        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0\n\n        # Prepare head mask if needed\n        # 1.0 in head_mask indicates we keep the head\n        # attention_probs has shape bsz x n_heads x N x N\n        # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]\n        # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]\n        if head_mask is not None:\n            if head_mask.ndim == 1:\n                head_mask = head_mask.unsqueeze(0).unsqueeze(0).unsqueeze(-1).unsqueeze(-1)\n                head_mask = head_mask.expand([self.config.num_hidden_layers, -1, -1, -1, -1])\n            elif head_mask.ndim == 2:\n                head_mask = head_mask.unsqueeze(1).unsqueeze(-1).unsqueeze(\n                    -1)  # We can specify head_mask for each layer\n            head_mask = head_mask.cast(dtype=self.parameters()[0].dtype)  # switch to float if needed + fp16 compatibility\n        else:\n            head_mask = [None] * self.config.num_hidden_layers\n\n        embedding_output = self.embeddings(input_ids, 
position_ids=position_ids, token_type_ids=token_type_ids)\n\n        encoder_outputs = self.encoder(embedding_output, extended_attention_mask, head_mask=head_mask)\n\n        sequence_output = encoder_outputs[0]\n        pooled_output = self.pooler(sequence_output)\n\n        outputs = (\n            sequence_output,\n            pooled_output,\n        ) + encoder_outputs[1:]  # add hidden_states and attentions if they are here\n        return outputs  # sequence_output, pooled_output, (hidden_states), (attentions)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/cn_clip/clip/utils.py",
    "content": "# Code modified from https://github.com/openai/CLIP\nimport json\nimport os\nfrom pathlib import Path\nfrom typing import List\nfrom typing import Union\n\nimport paddle\nfrom disco_diffusion_cnclip_vitb16.cn_clip.clip import _tokenizer\nfrom disco_diffusion_cnclip_vitb16.cn_clip.clip.model import CLIP\nfrom tqdm import tqdm\n\n__all__ = [\"tokenize\", \"create_model\", \"available_models\"]\n\n_MODEL_INFO = {\"ViTB16\": {\"struct\": \"ViT-B-16@RoBERTa-wwm-ext-base-chinese\", \"input_resolution\": 224}}\n\n\ndef available_models() -> List[str]:\n    \"\"\"Returns the names of available CLIP models\"\"\"\n    return list(_MODEL_INFO.keys())\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 64):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n    context_length : int\n        The context length to use; all baseline models use 24 as the context length\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    all_tokens = []\n    for text in texts:\n        all_tokens.append([_tokenizer.vocab['[CLS]']] +\n                          _tokenizer.convert_tokens_to_ids(_tokenizer.tokenize(text))[:context_length - 2] +\n                          [_tokenizer.vocab['[SEP]']])\n\n    result = paddle.zeros([len(all_tokens), context_length], dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        assert len(tokens) <= context_length\n        result[i, :len(tokens)] = paddle.to_tensor(tokens, dtype='int64')\n\n    return result\n\n\ndef create_model(name):\n    checkpoint = paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', '{}.pdparams'.format(name)))\n    model_name = _MODEL_INFO[name]['struct']\n    
vision_model, text_model = model_name.split('@')\n    # Initialize the model.\n    vision_model_config_file = Path(__file__).parent / f\"model_configs/{vision_model.replace('/', '-')}.json\"\n    print('Loading vision model config from', vision_model_config_file)\n    assert os.path.exists(vision_model_config_file)\n\n    text_model_config_file = Path(__file__).parent / f\"model_configs/{text_model.replace('/', '-')}.json\"\n    print('Loading text model config from', text_model_config_file)\n    assert os.path.exists(text_model_config_file)\n\n    with open(vision_model_config_file, 'r') as fv, open(text_model_config_file, 'r') as ft:\n        model_info = json.load(fv)\n        for k, v in json.load(ft).items():\n            model_info[k] = v\n\n    model = CLIP(**model_info)\n    model.set_state_dict(checkpoint)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport paddle\nfrom disco_diffusion_cnclip_vitb16 import resize_right\nfrom disco_diffusion_cnclip_vitb16.reverse_diffusion import create\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"disco_diffusion_cnclip_vitb16\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass DiscoDiffusionClip:\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       init_image: Optional[str] = None,\n                       width_height: Optional[List[int]] = [1280, 768],\n                       skip_steps: Optional[int] = 0,\n                       steps: Optional[int] = 250,\n                       cut_ic_pow: Optional[int] = 1,\n                       init_scale: Optional[int] = 1000,\n                       clip_guidance_scale: Optional[int] = 5000,\n                       tv_scale: 
Optional[int] = 0,\n                       range_scale: Optional[int] = 0,\n                       sat_scale: Optional[int] = 0,\n                       cutn_batches: Optional[int] = 4,\n                       diffusion_sampling_mode: Optional[str] = 'ddim',\n                       perlin_init: Optional[bool] = False,\n                       perlin_mode: Optional[str] = 'mixed',\n                       seed: Optional[int] = None,\n                       eta: Optional[float] = 0.8,\n                       clamp_grad: Optional[bool] = True,\n                       clamp_max: Optional[float] = 0.05,\n                       randomize_class: Optional[bool] = True,\n                       clip_denoised: Optional[bool] = False,\n                       fuzzy_prompt: Optional[bool] = False,\n                       rand_mag: Optional[float] = 0.05,\n                       cut_overview: Optional[str] = '[12]*400+[4]*600',\n                       cut_innercut: Optional[str] = '[4]*400+[12]*600',\n                       cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n                       display_rate: Optional[int] = 10,\n                       n_batches: Optional[int] = 1,\n                       batch_size: Optional[int] = 1,\n                       batch_name: Optional[str] = '',\n                       use_gpu: Optional[bool] = True,\n                       output_dir: Optional[str] = 'disco_diffusion_cnclip_vitb16_out'):\n        \"\"\"\n        Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings, if specified, style will be used to construct prompts.\n        :param artist: Artist style, if specified, style will be used to construct prompts.\n        :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. 
The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n        :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n        :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n        :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n        :param tv_scale: Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n        :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n        :param sat_scale: Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n        :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. 
However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n        :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n        :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  
Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n        :param perlin_mode: Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n        :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n        :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\n        :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. 
If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n        :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n        :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n        :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n        :param cut_overview: The schedule of overview cuts\n        :param cut_innercut: The schedule of inner cuts\n        :param cut_icgray_p: The schedule for the portion of inner cuts that are converted to grayscale before being evaluated.  Grayscale cuts guide overall structure and tone rather than color; the default schedule phases them out as the diffusion progresses.\n        :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n        :param n_batches: This variable sets the number of still images you want DD to create.  
If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n        :param batch_name: The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\n        :param use_gpu: Whether to use GPU.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES to the cuda device id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \"，{}\".format(style)\n            if artist is not None:\n                text_prompts += \"，由{}所作\".format(artist)\n        elif isinstance(text_prompts, list):\n            text_prompts[0] = text_prompts[0].rstrip(',.，。')\n            if style is not None:\n                text_prompts[0] += \"，{}\".format(style)\n            if artist is not None:\n                text_prompts[0] += \"，由{}所作\".format(artist)\n\n        return create(text_prompts=text_prompts,\n                      init_image=init_image,\n                      width_height=width_height,\n                      skip_steps=skip_steps,\n                      steps=steps,\n                      cut_ic_pow=cut_ic_pow,\n      
                init_scale=init_scale,\n                      clip_guidance_scale=clip_guidance_scale,\n                      tv_scale=tv_scale,\n                      range_scale=range_scale,\n                      sat_scale=sat_scale,\n                      cutn_batches=cutn_batches,\n                      diffusion_sampling_mode=diffusion_sampling_mode,\n                      perlin_init=perlin_init,\n                      perlin_mode=perlin_mode,\n                      seed=seed,\n                      eta=eta,\n                      clamp_grad=clamp_grad,\n                      clamp_max=clamp_max,\n                      randomize_class=randomize_class,\n                      clip_denoised=clip_denoised,\n                      fuzzy_prompt=fuzzy_prompt,\n                      rand_mag=rand_mag,\n                      cut_overview=cut_overview,\n                      cut_innercut=cut_innercut,\n                      cut_icgray_p=cut_icgray_p,\n                      display_rate=display_rate,\n                      n_batches=n_batches,\n                      batch_size=batch_size,\n                      batch_name=batch_name,\n                      clip_models=['ViTB16'],\n                      output_dir=output_dir)\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", 
description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      init_image=args.init_image,\n                                      width_height=args.width_height,\n                                      skip_steps=args.skip_steps,\n                                      steps=args.steps,\n                                      cut_ic_pow=args.cut_ic_pow,\n                                      init_scale=args.init_scale,\n                                      clip_guidance_scale=args.clip_guidance_scale,\n                                      tv_scale=args.tv_scale,\n                                      range_scale=args.range_scale,\n                                      sat_scale=args.sat_scale,\n                                      cutn_batches=args.cutn_batches,\n                                      diffusion_sampling_mode=args.diffusion_sampling_mode,\n                                      perlin_init=args.perlin_init,\n                                      perlin_mode=args.perlin_mode,\n                                      seed=args.seed,\n                                      eta=args.eta,\n                                      clamp_grad=args.clamp_grad,\n                                      clamp_max=args.clamp_max,\n                                      randomize_class=args.randomize_class,\n                                      clip_denoised=args.clip_denoised,\n                                      fuzzy_prompt=args.fuzzy_prompt,\n                          
             rand_mag=args.rand_mag,\n                                      cut_overview=args.cut_overview,\n                                      cut_innercut=args.cut_innercut,\n                                      cut_icgray_p=args.cut_icgray_p,\n                                      display_rate=args.display_rate,\n                                      n_batches=args.n_batches,\n                                      batch_size=args.batch_size,\n                                      batch_name=args.batch_name,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--skip_steps',\n            type=int,\n            default=0,\n            help=\n            'Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15%% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50%% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just fine-tune the texture'\n        )\n        self.arg_input_group.add_argument(\n            '--steps',\n            type=int,\n            default=250,\n            help=\n            \"When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  
Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cut_ic_pow',\n            type=int,\n            default=1,\n            help=\n            \"This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\"\n        )\n        self.arg_input_group.add_argument(\n            '--init_scale',\n            type=int,\n            default=1000,\n            help=\n            \"This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clip_guidance_scale',\n            type=int,\n            default=5000,\n            help=\n            \"CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50%% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. 
Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\"\n        )\n        self.arg_input_group.add_argument(\n            '--tv_scale',\n            type=int,\n            default=0,\n            help=\n            \"Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\"\n        )\n        self.arg_input_group.add_argument(\n            '--range_scale',\n            type=int,\n            default=0,\n            help=\n            \"Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\"\n        )\n        self.arg_input_group.add_argument(\n            '--sat_scale',\n            type=int,\n            default=0,\n            help=\n            \"Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cutn_batches',\n            type=int,\n            default=4,\n            help=\n            \"Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  
Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\"\n        )\n        self.arg_input_group.add_argument(\n            '--diffusion_sampling_mode',\n            type=str,\n            default='ddim',\n            help=\n            \"Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_init',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_mode',\n            type=str,\n            default='mixed',\n            help=\n            \"Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  
If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\"\n        )\n        self.arg_input_group.add_argument(\n            '--eta',\n            type=float,\n            default=0.8,\n            help=\n            \"eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_grad',\n            type=ast.literal_eval,\n            default=True,\n            help=\n            \"As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_max',\n            type=float,\n            default=0.05,\n            help=\n            \"Sets the value of the clamp_grad limitation. 
Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\"\n        )\n        self.arg_input_group.add_argument('--randomize_class', type=ast.literal_eval, default=True, help=\"Whether to randomize the class label used by the class-conditional diffusion model at each iteration.\")\n        self.arg_input_group.add_argument('--clip_denoised', type=ast.literal_eval, default=False, help=\"Whether to clip the denoised model output to the valid image range at each step.\")\n        self.arg_input_group.add_argument(\n            '--fuzzy_prompt',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\"\n        )\n        self.arg_input_group.add_argument(\n            '--rand_mag',\n            type=float,\n            default=0.5,\n            help=\"Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\")\n        self.arg_input_group.add_argument('--cut_overview',\n                                          type=str,\n                                          default='[12]*400+[4]*600',\n                                          help=\"The schedule of overview cuts\")\n        self.arg_input_group.add_argument('--cut_innercut',\n                                          type=str,\n                                          default='[4]*400+[12]*600',\n                                          help=\"The schedule of inner cuts\")\n        self.arg_input_group.add_argument(\n            '--cut_icgray_p',\n            type=str,\n            default='[0.2]*400+[0]*600',\n            help=\n            \"The schedule for the portion of inner cuts that are converted to grayscale before being evaluated.  Grayscale cuts guide overall structure and tone rather than color; the default schedule phases them out as the diffusion progresses.\"\n        )\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\n            \"During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\"\n        )\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"Whether to use GPU.\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='disco_diffusion_cnclip_vitb16',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--text_prompts', type=str)\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings. If specified, the style will be used to construct the prompt.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                           
               help='Artist style. If specified, the artist will be used to construct the prompt.')\n        self.arg_input_group.add_argument(\n            '--init_image',\n            type=str,\n            default=None,\n            help=\n            \"Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~50%% of total steps to retain the character of the init. See skip_steps above for further discussion.\"\n        )\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[1280, 768],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--n_batches',\n            type=int,\n            default=1,\n            help=\n            \"This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\"\n        )\n        self.arg_input_group.add_argument('--batch_size', type=int, default=1, help=\"Batch size.\")\n        self.arg_input_group.add_argument(\n            '--batch_name',\n            type=str,\n            default='',\n            help=\n            'The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.'\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/requirements.txt",
    "content": "numpy\npaddle_lpips==0.1.2\nftfy\ndocarray>=0.13.29\npyyaml\nregex\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/resize_right/README.md",
    "content": "# ResizeRight (Paddle)\nA fully differentiable resize function implemented in Paddle.\nThis module is based on [assafshocher/ResizeRight](https://github.com/assafshocher/ResizeRight).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/resize_right/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/resize_right/interp_methods.py",
    "content": "from math import pi\n\ntry:\n    import paddle\nexcept ImportError:\n    paddle = None\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or Paddle but found neither\")\n\n\ndef set_framework_dependencies(x):\n    if type(x) is numpy.ndarray:\n        to_dtype = lambda a: a\n        fw = numpy\n    else:\n        to_dtype = lambda a: paddle.cast(a, x.dtype)\n        fw = paddle\n    # eps = fw.finfo(fw.float32).eps\n    eps = paddle.to_tensor(np.finfo(np.float32).eps)\n    return fw, to_dtype, eps\n\n\ndef support_sz(sz):\n\n    def wrapper(f):\n        f.support_sz = sz\n        return f\n\n    return wrapper\n\n\n@support_sz(4)\ndef cubic(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    absx = fw.abs(x)\n    absx2 = absx**2\n    absx3 = absx**3\n    return ((1.5 * absx3 - 2.5 * absx2 + 1.) * to_dtype(absx <= 1.) +\n            (-0.5 * absx3 + 2.5 * absx2 - 4. * absx + 2.) * to_dtype((1. < absx) & (absx <= 2.)))\n\n\n@support_sz(4)\ndef lanczos2(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 2) + eps) / ((pi**2 * x**2 / 2) + eps)) * to_dtype(abs(x) < 2))\n\n\n@support_sz(6)\ndef lanczos3(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 3) + eps) / ((pi**2 * x**2 / 3) + eps)) * to_dtype(abs(x) < 3))\n\n\n@support_sz(2)\ndef linear(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return ((x + 1) * to_dtype((-1 <= x) & (x < 0)) + (1 - x) * to_dtype((0 <= x) & (x <= 1)))\n\n\n@support_sz(1)\ndef box(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return to_dtype((-1 <= x) & (x < 0)) + to_dtype((0 <= x) & (x <= 1))\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/resize_right/resize_right.py",
    "content": "import warnings\nfrom fractions import Fraction\nfrom math import ceil\nfrom typing import Tuple\n\nimport disco_diffusion_cnclip_vitb16.resize_right.interp_methods as interp_methods\n\n\nclass NoneClass:\n    pass\n\n\ntry:\n    import paddle\n    from paddle import nn\n    nnModuleWrapped = nn.Layer\nexcept ImportError:\n    warnings.warn('No PyTorch found, will work only with Numpy')\n    paddle = None\n    nnModuleWrapped = NoneClass\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    warnings.warn('No Numpy found, will work only with PyTorch')\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or PyTorch but both not found\")\n\n\ndef resize(input,\n           scale_factors=None,\n           out_shape=None,\n           interp_method=interp_methods.cubic,\n           support_sz=None,\n           antialiasing=True,\n           by_convs=False,\n           scale_tolerance=None,\n           max_numerator=10,\n           pad_mode='constant'):\n    # get properties of the input tensor\n    in_shape, n_dims = input.shape, input.ndim\n\n    # fw stands for framework that can be either numpy or paddle,\n    # determined by the input type\n    fw = numpy if type(input) is numpy.ndarray else paddle\n    eps = np.finfo(np.float32).eps if fw == numpy else paddle.to_tensor(np.finfo(np.float32).eps)\n    device = input.place if fw is paddle else None\n\n    # set missing scale factors or output shapem one according to another,\n    # scream if both missing. this is also where all the defults policies\n    # take place. 
also handling the by_convs attribute carefully.\n    scale_factors, out_shape, by_convs = set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs,\n                                                              scale_tolerance, max_numerator, eps, fw)\n\n    # sort indices of dimensions according to scale of each dimension.\n    # since we are going dim by dim this is efficient\n    sorted_filtered_dims_and_scales = [(dim, scale_factors[dim], by_convs[dim], in_shape[dim], out_shape[dim])\n                                       for dim in sorted(range(n_dims), key=lambda ind: scale_factors[ind])\n                                       if scale_factors[dim] != 1.]\n    # unless support size is specified by the user, it is an attribute\n    # of the interpolation method\n    if support_sz is None:\n        support_sz = interp_method.support_sz\n\n    # output begins identical to input and changes with each iteration\n    output = input\n\n    # iterate over dims\n    for (dim, scale_factor, dim_by_convs, in_sz, out_sz) in sorted_filtered_dims_and_scales:\n        # STEP 1- PROJECTED GRID: The non-integer locations of the projection\n        # of output pixel locations to the input tensor\n        projected_grid = get_projected_grid(in_sz, out_sz, scale_factor, fw, dim_by_convs, device)\n\n        # STEP 1.5: ANTIALIASING- If antialiasing is taking place, we modify\n        # the window size and the interpolation method (see inside function)\n        cur_interp_method, cur_support_sz = apply_antialiasing_if_needed(interp_method, support_sz, scale_factor,\n                                                                         antialiasing)\n\n        # STEP 2- FIELDS OF VIEW: for each output pixel, map the input pixels\n        # that influence it. 
Also calculate needed padding and update grid\n        # accordingly\n        field_of_view = get_field_of_view(projected_grid, cur_support_sz, fw, eps, device)\n\n        # STEP 2.5- CALCULATE PAD AND UPDATE: according to the field of view,\n        # the input should be padded to handle the boundaries, coordinates\n        # should be updated. actual padding only occurs when weights are\n        # applied (step 4). if using by_convs for this dim, then we need to\n        # calc right and left boundaries for each filter instead.\n        pad_sz, projected_grid, field_of_view = calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor,\n                                                            dim_by_convs, fw, device)\n        # STEP 3- CALCULATE WEIGHTS: Match a set of weights to the pixels in\n        # the field of view for each output pixel\n        weights = get_weights(cur_interp_method, projected_grid, field_of_view)\n\n        # STEP 4- APPLY WEIGHTS: Each output pixel is calculated by multiplying\n        # its set of weights with the pixel values in its field of view.\n        # We now multiply the fields of view with their matching weights.\n        # We do this by tensor multiplication and broadcasting.\n        # if by_convs is true for this dim, then we do this action by\n        # convolutions. this is equivalent but faster.\n        if not dim_by_convs:\n            output = apply_weights(output, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw)\n        else:\n            output = apply_convs(output, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw)\n    return output\n\n\ndef get_projected_grid(in_sz, out_sz, scale_factor, fw, by_convs, device=None):\n    # we start by having the output coordinates which are just integer locations\n    # in the special case when using by_convs, we only need two cycles of grid\n    # points. 
the first and last.\n    grid_sz = out_sz if not by_convs else scale_factor.numerator\n    out_coordinates = fw_arange(grid_sz, fw, device)\n\n    # This is projecting the output pixel locations in 1d to the input tensor,\n    # as non-integer locations.\n    # the following formula is derived in the paper\n    # \"From Discrete to Continuous Convolutions\" by Shocher et al.\n    return (out_coordinates / float(scale_factor) + (in_sz - 1) / 2 - (out_sz - 1) / (2 * float(scale_factor)))\n\n\ndef get_field_of_view(projected_grid, cur_support_sz, fw, eps, device):\n    # for each output pixel, map which input pixels influence it, in 1d.\n    # we start by calculating the leftmost neighbor, using half of the window\n    # size (eps is for when boundary is exact int)\n    left_boundaries = fw_ceil(projected_grid - cur_support_sz / 2 - eps, fw)\n\n    # then we simply take all the pixel centers in the field by counting\n    # window size pixels from the left boundary\n    ordinal_numbers = fw_arange(ceil(cur_support_sz - eps), fw, device)\n    return left_boundaries[:, None] + ordinal_numbers\n\n\ndef calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor, dim_by_convs, fw, device):\n    if not dim_by_convs:\n        # determine padding according to neighbor coords out of bound.\n        # this is a generalized notion of padding, when pad<0 it means crop\n        pad_sz = [-field_of_view[0, 0].item(), field_of_view[-1, -1].item() - in_sz + 1]\n\n        # since input image will be changed by padding, coordinates of both\n        # field_of_view and projected_grid need to be updated\n        field_of_view += pad_sz[0]\n        projected_grid += pad_sz[0]\n\n    else:\n        # only used for by_convs, to calc the boundaries of each filter. the\n        # number of distinct convolutions is the numerator of the scale factor\n        num_convs, stride = scale_factor.numerator, scale_factor.denominator\n\n        # calculate left and right boundaries for each 
conv. left can also be\n        # negative, and right can be bigger than in_sz. such cases imply padding if\n        # needed. however if both are in-bounds, it means we need to crop,\n        # practically apply the conv only on part of the image.\n        left_pads = -field_of_view[:, 0]\n\n        # next calc is tricky, explanation by rows:\n        # 1) counting output pixels between the first position of each filter\n        #    to the right boundary of the input\n        # 2) dividing it by number of filters to count how many 'jumps'\n        #    each filter does\n        # 3) multiplying by the stride gives us the distance over the input\n        #    coords done by all these jumps for each filter\n        # 4) to this distance we add the right boundary of the filter when\n        #    placed in its leftmost position. so now we get the right boundary\n        #    of that filter in input coord.\n        # 5) the padding size needed is obtained by subtracting the rightmost\n        #    input coordinate. if the result is positive padding is needed. if\n        #    negative then negative padding means shaving off pixel columns.\n        right_pads = (((out_sz - fw_arange(num_convs, fw, device) - 1)  # (1)\n                       // num_convs)  # (2)\n                      * stride  # (3)\n                      + field_of_view[:, -1]  # (4)\n                      - in_sz + 1)  # (5)\n\n        # in the by_convs case pad_sz is a list of left-right pairs. 
one pair per\n        # filter\n\n        pad_sz = list(zip(left_pads, right_pads))\n\n    return pad_sz, projected_grid, field_of_view\n\n\ndef get_weights(interp_method, projected_grid, field_of_view):\n    # the set of weights for each output pixel is the result of the chosen\n    # interpolation method applied to the distances between projected grid\n    # locations and the pixel-centers in the field of view (distances are\n    # directed, can be positive or negative)\n    weights = interp_method(projected_grid[:, None] - field_of_view)\n\n    # we now carefully normalize the weights to sum to 1 for each output pixel\n    sum_weights = weights.sum(1, keepdim=True)\n    sum_weights[sum_weights == 0] = 1\n    return weights / sum_weights\n\n\ndef apply_weights(input, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the first one.\n    # so we transpose and will transpose back after multiplying\n    tmp_input = fw_swapaxes(input, dim, 0, fw)\n\n    # apply padding\n    tmp_input = fw_pad(tmp_input, fw, pad_sz, pad_mode)\n\n    # field_of_view is a tensor of order 2: for each output (1d location\n    # along cur dim)- a list of 1d neighbor locations.\n    # note that this whole operation is applied to each dim separately,\n    # this is why it is all in 1d.\n    # neighbors = tmp_input[field_of_view] is a tensor of order image_dims+1:\n    # for each output pixel (this time indicated in all dims), these are the\n    # values of the neighbors in the 1d field of view. note that we only\n    # consider neighbors along the current dim, but such set exists for every\n    # multi-dim location, hence the final tensor order is image_dims+1.\n    # free cached GPU memory before the large gather below (paddle only)\n    if fw is paddle and paddle.is_compiled_with_cuda():\n        paddle.device.cuda.empty_cache()\n    neighbors = tmp_input[field_of_view]\n\n    # weights is an order 2 tensor: for each output location along 1d- a list\n    # of weights matching the field of view. 
we augment it with ones, for\n    # broadcasting, so that when it multiplies some tensor the weights affect\n    # only its first dim.\n    tmp_weights = fw.reshape(weights, (*weights.shape, *[1] * (n_dims - 1)))\n\n    # now we simply multiply the weights with the neighbors, and then sum\n    # along the field of view, to get a single value per out pixel\n    tmp_output = (neighbors * tmp_weights).sum(1)\n    # we transpose back the resized dim to its original position\n    return fw_swapaxes(tmp_output, 0, dim, fw)\n\n\ndef apply_convs(input, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the last one.\n    # so we transpose and will transpose back after multiplying\n    input = fw_swapaxes(input, dim, -1, fw)\n\n    # the stride for all convs is the denominator of the scale factor\n    stride, num_convs = scale_factor.denominator, scale_factor.numerator\n\n    # prepare an empty tensor for the output\n    tmp_out_shape = list(input.shape)\n    tmp_out_shape[-1] = out_sz\n    # paddle tensors expose .place (not .device); numpy arrays have neither\n    tmp_output = fw_empty(tuple(tmp_out_shape), fw, input.place if fw is paddle else None)\n\n    # iterate over the conv operations. we have as many as the numerator\n    # of the scale-factor. for each we need boundaries and a filter.\n    for conv_ind, (pad_sz, filt) in enumerate(zip(pad_sz, weights)):\n        # apply padding (we pad last dim, padding can be negative)\n        pad_dim = input.ndim - 1\n        tmp_input = fw_pad(input, fw, pad_sz, pad_mode, dim=pad_dim)\n\n        # apply convolution over last dim. 
store in the output tensor with\n        # positional strides so that when the loop is complete conv results are\n        # interleaved\n        tmp_output[..., conv_ind::num_convs] = fw_conv(tmp_input, filt, stride)\n\n    return fw_swapaxes(tmp_output, -1, dim, fw)\n\n\ndef set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs, scale_tolerance, max_numerator, eps, fw):\n    # eventually we must have both scale-factors and out-sizes for all in/out\n    # dims. however, we support many possible partial arguments\n    if scale_factors is None and out_shape is None:\n        raise ValueError(\"either scale_factors or out_shape should be \"\n                         \"provided\")\n    if out_shape is not None:\n        # if out_shape has fewer dims than in_shape, we resize the\n        # first dims for numpy and the last dims for paddle by default\n        out_shape = (list(out_shape) +\n                     list(in_shape[len(out_shape):]) if fw is numpy else list(in_shape[:-len(out_shape)]) +\n                     list(out_shape))\n        if scale_factors is None:\n            # if no scale given, we calculate it as the out to in ratio\n            # (not recommended)\n            scale_factors = [out_sz / in_sz for out_sz, in_sz in zip(out_shape, in_shape)]\n    if scale_factors is not None:\n        # by default, if a single number is given as scale, we assume resizing\n        # two dims (most common are images with 2 spatial dims)\n        scale_factors = (scale_factors if isinstance(scale_factors, (list, tuple)) else [scale_factors, scale_factors])\n        # if fewer scale_factors than in_shape dims, we resize the\n        # first dims for numpy and the last dims for paddle by default\n        scale_factors = (list(scale_factors) + [1] * (len(in_shape) - len(scale_factors)) if fw is numpy else [1] *\n                         (len(in_shape) - len(scale_factors)) + list(scale_factors))\n        if out_shape is None:\n            # when no out_shape given, it is 
calculated by multiplying the\n            # scale by the in_shape (not recommended)\n            out_shape = [ceil(scale_factor * in_sz) for scale_factor, in_sz in zip(scale_factors, in_shape)]\n        # next part intentionally after out_shape determined for stability\n        # we fix by_convs to be a list of truth values in case it is not\n        if not isinstance(by_convs, (list, tuple)):\n            by_convs = [by_convs] * len(out_shape)\n\n        # next loop fixes the scale for each dim to be either frac or float.\n        # this is determined by by_convs and by tolerance for scale accuracy.\n        for ind, (sf, dim_by_convs) in enumerate(zip(scale_factors, by_convs)):\n            # first we convert the scale to a fraction\n            if dim_by_convs:\n                frac = Fraction(1 / sf).limit_denominator(max_numerator)\n                frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)\n\n            # if accuracy is within tolerance scale will be frac. if not, then\n            # it will be float and the by_convs attr will be set false for\n            # this dim\n            if scale_tolerance is None:\n                scale_tolerance = eps\n            if dim_by_convs and abs(frac - sf) < scale_tolerance:\n                scale_factors[ind] = frac\n            else:\n                scale_factors[ind] = float(sf)\n                by_convs[ind] = False\n\n        return scale_factors, out_shape, by_convs\n\n\ndef apply_antialiasing_if_needed(interp_method, support_sz, scale_factor, antialiasing):\n    # antialiasing is \"stretching\" the field of view according to the scale\n    # factor (only for downscaling). this is low-pass filtering. 
this\n    # requires modifying both the interpolation (stretching the 1d\n    # function and multiplying by the scale-factor) and the window size.\n    scale_factor = float(scale_factor)\n    if scale_factor >= 1.0 or not antialiasing:\n        return interp_method, support_sz\n    cur_interp_method = (lambda arg: scale_factor * interp_method(scale_factor * arg))\n    cur_support_sz = support_sz / scale_factor\n    return cur_interp_method, cur_support_sz\n\n\ndef fw_ceil(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.ceil(x))\n    else:\n        return paddle.cast(x.ceil(), dtype='int64')\n\n\ndef fw_floor(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.floor(x))\n    else:\n        return paddle.cast(x.floor(), dtype='int64')\n\n\ndef fw_cat(x, fw):\n    if fw is numpy:\n        return fw.concatenate(x)\n    else:\n        return fw.concat(x)\n\n\ndef fw_swapaxes(x, ax_1, ax_2, fw):\n    if fw is numpy:\n        return fw.swapaxes(x, ax_1, ax_2)\n    else:\n        if ax_1 == -1:\n            ax_1 = len(x.shape) - 1\n        if ax_2 == -1:\n            ax_2 = len(x.shape) - 1\n        perm0 = list(range(len(x.shape)))\n        temp = ax_1\n        perm0[temp] = ax_2\n        perm0[ax_2] = temp\n        return fw.transpose(x, perm0)\n\n\ndef fw_pad(x, fw, pad_sz, pad_mode, dim=0):\n    if pad_sz == (0, 0):\n        return x\n    if fw is numpy:\n        pad_vec = [(0, 0)] * x.ndim\n        pad_vec[dim] = pad_sz\n        return fw.pad(x, pad_width=pad_vec, mode=pad_mode)\n    else:\n        if x.ndim < 3:\n            x = x[None, None, ...]\n\n        pad_vec = [0] * ((x.ndim - 2) * 2)\n        pad_vec[0:2] = pad_sz\n        return fw_swapaxes(fw.nn.functional.pad(fw_swapaxes(x, dim, -1, fw), pad=pad_vec, mode=pad_mode), dim, -1, fw)\n\n\ndef fw_conv(input, filter, stride):\n    # we want to apply 1d conv to any nd array. the way to do it is to reshape\n    # the input to a 4D tensor. 
first two dims are singletons, 3rd dim stores\n    # all the spatial dims that we are not convolving along now. then we can\n    # apply conv2d with a 1xK filter. This convolves all the other\n    # dims stored in the 3rd dim in the same way, like a depthwise conv over them.\n    # TODO: numpy support\n    reshaped_input = input.reshape([1, 1, -1, input.shape[-1]])\n    # use reshape (paddle tensors have no .view) to make the filter 1x1x1xK\n    reshaped_output = paddle.nn.functional.conv2d(reshaped_input, filter.reshape([1, 1, 1, -1]), stride=(1, stride))\n    return reshaped_output.reshape([*input.shape[:-1], -1])\n\n\ndef fw_arange(upper_bound, fw, device):\n    # device arg is kept for API symmetry; paddle allocates on the current device\n    if fw is numpy:\n        return fw.arange(upper_bound)\n    else:\n        return fw.arange(upper_bound)\n\n\ndef fw_empty(shape, fw, device):\n    if fw is numpy:\n        return fw.empty(shape)\n    else:\n        return fw.empty(shape=shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/README.md",
    "content": "# Diffusion model (Paddle)\nThis module implements diffusion model which accepts a text prompt and outputs images semantically close to the text. The code is rewritten by Paddle, and mainly refer to two projects:  jina-ai/discoart[https://github.com/jina-ai/discoart] and openai/guided-diffusion[https://github.com/openai/guided-diffusion]. Thanks for their wonderful work.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/__init__.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/__init__.py\n'''\nimport os\nimport warnings\n\nos.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'\n\n__all__ = ['create']\n\nimport sys\n\n__resources_path__ = os.path.join(\n    os.path.dirname(sys.modules.get(__package__).__file__ if __package__ in sys.modules else __file__),\n    'resources',\n)\n\nimport gc\n\n# check if GPU is available\nimport paddle\n\n# download and load models, this will take some time on the first load\n\nfrom .helper import load_all_models, load_diffusion_model, load_clip_models\n\nmodel_config, secondary_model = load_all_models('512x512_diffusion_uncond_finetune_008100', use_secondary_model=True)\n\nfrom typing import TYPE_CHECKING, overload, List, Optional\n\nif TYPE_CHECKING:\n    from docarray import DocumentArray, Document\n\n_clip_models_cache = {}\n\n# begin_create_overload\n\n\n@overload\ndef create(text_prompts: Optional[List[str]] = [\n    'A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.',\n    'yellow color scheme',\n],\n           init_image: Optional[str] = None,\n           width_height: Optional[List[int]] = [1280, 768],\n           skip_steps: Optional[int] = 10,\n           steps: Optional[int] = 250,\n           cut_ic_pow: Optional[int] = 1,\n           init_scale: Optional[int] = 1000,\n           clip_guidance_scale: Optional[int] = 5000,\n           tv_scale: Optional[int] = 0,\n           range_scale: Optional[int] = 150,\n           sat_scale: Optional[int] = 0,\n           cutn_batches: Optional[int] = 4,\n           diffusion_model: Optional[str] = '512x512_diffusion_uncond_finetune_008100',\n           use_secondary_model: Optional[bool] = True,\n           diffusion_sampling_mode: Optional[str] = 'ddim',\n           perlin_init: Optional[bool] = False,\n           perlin_mode: Optional[str] = 'mixed',\n           seed: 
Optional[int] = None,\n           eta: Optional[float] = 0.8,\n           clamp_grad: Optional[bool] = True,\n           clamp_max: Optional[float] = 0.05,\n           randomize_class: Optional[bool] = True,\n           clip_denoised: Optional[bool] = False,\n           fuzzy_prompt: Optional[bool] = False,\n           rand_mag: Optional[float] = 0.05,\n           cut_overview: Optional[str] = '[12]*400+[4]*600',\n           cut_innercut: Optional[str] = '[4]*400+[12]*600',\n           cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n           display_rate: Optional[int] = 10,\n           n_batches: Optional[int] = 4,\n           batch_size: Optional[int] = 1,\n           batch_name: Optional[str] = '',\n           clip_models: Optional[list] = ['ViTB32', 'ViTB16', 'RN50'],\n           output_dir: Optional[str] = 'discoart_output') -> 'DocumentArray':\n    \"\"\"\n    Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n    :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.\n    :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n    :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n    :param skip_steps: Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps. As DD moves along the curve, noise levels (and thus the amount an image changes per step) decline, and image coherence from one step to the next increases. The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times. If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily. Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems. Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine-tuning of the texture.\n    :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step. Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n    :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   
Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n    :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n    :param tv_scale: Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n    :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n    :param sat_scale: Saturation scale. Optional, set to zero to turn off.  
If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n    :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n    :param diffusion_model: Diffusion model of choice.\n    :param use_secondary_model: Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  
I suggest you experiment with this.\n    :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n    :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n    :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n    :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  
This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n    :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n    :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n    :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n    :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n    :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n    :param cut_overview: The schedule of overview cuts\n    :param cut_innercut: The schedule of inner cuts\n    :param cut_icgray_p: This sets the size of the border used for inner cuts.  
High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n    :param n_batches: This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n    :param batch_name: The name of the batch, the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks be overridden by other users, please use a unique name.\n    :param clip_models: CLIP Model selectors. ViTB32, ViTB16, ViTL14, RN101, RN50, RN50x4, RN50x16, RN50x64.These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.  You can mix in multiple models as well for different results.  
However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.The rough order of speed/mem usage is (smallest/fastest to largest/slowest):VitB32RN50RN101VitB16RN50x4RN50x16RN50x64ViTL14For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\n# end_create_overload\n\n\n@overload\ndef create(init_document: 'Document') -> 'DocumentArray':\n    \"\"\"\n    Create an artwork using a DocArray ``Document`` object as initial state.\n    :param init_document: its ``.tags`` will be used as parameters, ``.uri`` (if present) will be used as init image.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\ndef create(**kwargs) -> 'DocumentArray':\n    from .config import load_config\n    from .runner import do_run\n\n    if 'init_document' in kwargs:\n        d = kwargs['init_document']\n        _kwargs = d.tags\n        if not _kwargs:\n            warnings.warn('init_document has no .tags, fallback to default config')\n        if d.uri:\n            _kwargs['init_image'] = kwargs['init_document'].uri\n        else:\n            warnings.warn('init_document has no .uri, fallback to no init image')\n        kwargs.pop('init_document')\n        if kwargs:\n            warnings.warn('init_document has .tags and .uri, but kwargs are also present, will override .tags')\n            _kwargs.update(kwargs)\n        _args = load_config(user_config=_kwargs)\n    else:\n        _args = load_config(user_config=kwargs)\n\n    model, diffusion = load_diffusion_model(model_config, _args.diffusion_model, steps=_args.steps)\n\n    clip_models = load_clip_models(enabled=_args.clip_models, clip_models=_clip_models_cache)\n\n    gc.collect()\n    paddle.device.cuda.empty_cache()\n    try:\n        return do_run(_args, (model, diffusion, clip_models, secondary_model))\n    
except KeyboardInterrupt:\n        pass\n"
  },
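The cut-budget arithmetic described in the `cutn_batches` docstring above ((scheduled cuts) x (cutn_batches) = total cuts per timestep, with memory bounded by a single batch) can be sketched in a few lines. `cut_budget` and its field names are hypothetical illustrations, not part of the module API:

```python
def cut_budget(scheduled_cuts: int, cutn_batches: int) -> dict:
    """Illustrate the cutn_batches trade-off from the create() docstring."""
    return {
        'total_cuts': scheduled_cuts * cutn_batches,  # quality benefit per timestep
        'cuts_in_memory': scheduled_cuts,             # peak memory tracks one batch only
        'relative_render_time': cutn_batches,         # batches are evaluated sequentially
    }

# Default schedule: 16 cuts, one batch per timestep.
default = cut_budget(16, 1)
# cutn_batches=4: 64 cuts total, still only 16 held in memory, ~4x slower.
boosted = cut_budget(16, 4)
```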
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/config.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/config.py\n'''\nimport copy\nimport random\nimport warnings\nfrom types import SimpleNamespace\nfrom typing import Dict\n\nimport yaml\nfrom yaml import Loader\n\nfrom . import __resources_path__\n\nwith open(f'{__resources_path__}/default.yml') as ymlfile:\n    default_args = yaml.load(ymlfile, Loader=Loader)\n\n\ndef load_config(user_config: Dict, ):\n    cfg = copy.deepcopy(default_args)\n\n    if user_config:\n        cfg.update(**user_config)\n\n    for k in user_config.keys():\n        if k not in cfg:\n            warnings.warn(f'unknown argument {k}, ignored')\n\n    for k, v in cfg.items():\n        if k in ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches',\n                 'cutn_batches') and isinstance(v, float):\n            cfg[k] = int(v)\n        if k == 'width_height':\n            cfg[k] = [int(vv) for vv in v]\n\n    cfg.update(**{\n        'seed': cfg['seed'] or random.randint(0, 2**32),\n    })\n\n    if cfg['batch_name']:\n        da_name = f'{__package__}-{cfg[\"batch_name\"]}-{cfg[\"seed\"]}'\n    else:\n        da_name = f'{__package__}-{cfg[\"seed\"]}'\n        warnings.warn('you did not set `batch_name`, set it to have unique session ID')\n\n    cfg.update(**{'name_docarray': da_name})\n\n    print_args_table(cfg)\n\n    return SimpleNamespace(**cfg)\n\n\ndef print_args_table(cfg):\n    from rich.table import Table\n    from rich import box\n    from rich.console import Console\n\n    console = Console()\n\n    param_str = Table(\n        title=cfg['name_docarray'],\n        box=box.ROUNDED,\n        highlight=True,\n        title_justify='left',\n    )\n    param_str.add_column('Argument', justify='right')\n    param_str.add_column('Value', justify='left')\n\n    for k, v in sorted(cfg.items()):\n        value = str(v)\n\n        if not default_args.get(k, None) == v:\n            value = f'[b]{value}[/]'\n\n        param_str.add_row(k, 
value)\n\n    console.print(param_str)\n"
  },
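A minimal, self-contained sketch of what `load_config` above does: merge user keys over the defaults, warn on unknown keys, coerce YAML floats to ints for count-like arguments, and fall back to a random seed. `DEFAULTS` here is a hypothetical stand-in for the module's default.yml, and `load_config_sketch` is not the real function:

```python
import copy
import random
import warnings
from types import SimpleNamespace

# Hypothetical stand-in for default.yml (values are illustrative only).
DEFAULTS = {'steps': 250.0, 'seed': None, 'batch_name': '', 'width_height': [1280, 768]}
_INT_KEYS = ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches', 'cutn_batches')


def load_config_sketch(user_config: dict) -> SimpleNamespace:
    cfg = copy.deepcopy(DEFAULTS)
    for k in user_config:
        # warn before merging, otherwise every user key would already be in cfg
        if k not in DEFAULTS:
            warnings.warn(f'unknown argument {k}, ignored')
    cfg.update(**user_config)
    for k, v in cfg.items():
        if k in _INT_KEYS and isinstance(v, float):
            cfg[k] = int(v)  # YAML numbers like 250.0 become ints
        if k == 'width_height':
            cfg[k] = [int(vv) for vv in v]
    cfg['seed'] = cfg['seed'] or random.randint(0, 2**32)
    return SimpleNamespace(**cfg)
```

Returning a `SimpleNamespace` mirrors the real helper, so downstream code can read `args.steps` instead of `cfg['steps']`.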
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/helper.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/helper.py\n'''\nimport hashlib\nimport logging\nimport os\nimport subprocess\nimport sys\nfrom os.path import expanduser\nfrom pathlib import Path\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\n\nimport paddle\n\n\ndef _get_logger():\n    logger = logging.getLogger(__package__)\n    _log_level = os.environ.get('DISCOART_LOG_LEVEL', 'INFO')\n    logger.setLevel(_log_level)\n    ch = logging.StreamHandler()\n    ch.setLevel(_log_level)\n    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')\n    ch.setFormatter(formatter)\n    logger.addHandler(ch)\n    return logger\n\n\nlogger = _get_logger()\n\n\ndef load_clip_models(enabled: List[str], clip_models: Dict[str, Any] = {}):\n\n    import disco_diffusion_cnclip_vitb16.cn_clip as cn_clip\n    from disco_diffusion_cnclip_vitb16.cn_clip.clip.utils import create_model\n\n    # load enabled models\n    for k in enabled:\n        if k not in clip_models:\n            clip_models[k] = create_model(name=k)\n            clip_models[k].eval()\n            for parameter in clip_models[k].parameters():\n                parameter.stop_gradient = True\n\n    # disable not enabled models to save memory\n    for k in clip_models:\n        if k not in enabled:\n            clip_models.pop(k)\n\n    return list(clip_models.values())\n\n\ndef load_all_models(diffusion_model, use_secondary_model):\n    from .model.script_util import (\n        model_and_diffusion_defaults, )\n\n    model_config = model_and_diffusion_defaults()\n\n    if diffusion_model == '512x512_diffusion_uncond_finetune_008100':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n 
           'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 512,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n    elif diffusion_model == '256x256_diffusion_uncond':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 256,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n\n    secondary_model = None\n    if use_secondary_model:\n        from .model.sec_diff import SecondaryDiffusionImageNet2\n        secondary_model = SecondaryDiffusionImageNet2()\n        model_dict = paddle.load(\n            os.path.join(os.path.dirname(__file__), 'pre_trained', 'secondary_model_imagenet_2.pdparams'))\n        secondary_model.set_state_dict(model_dict)\n        secondary_model.eval()\n        for parameter in secondary_model.parameters():\n            parameter.stop_gradient = True\n\n    return model_config, secondary_model\n\n\ndef load_diffusion_model(model_config, diffusion_model, steps):\n    from .model.script_util import (\n        create_model_and_diffusion, )\n\n    timestep_respacing = f'ddim{steps}'\n    diffusion_steps = (1000 // steps) * steps if steps < 1000 else steps\n    
model_config.update({\n        'timestep_respacing': timestep_respacing,\n        'diffusion_steps': diffusion_steps,\n    })\n\n    model, diffusion = create_model_and_diffusion(**model_config)\n    model.set_state_dict(\n        paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', f'{diffusion_model}.pdparams')))\n    model.eval()\n    for name, param in model.named_parameters():\n        param.stop_gradient = True\n\n    return model, diffusion\n\n\ndef parse_prompt(prompt):\n    if prompt.startswith('http://') or prompt.startswith('https://'):\n        vals = prompt.rsplit(':', 2)\n        vals = [vals[0] + ':' + vals[1], *vals[2:]]\n    else:\n        vals = prompt.rsplit(':', 1)\n    vals = vals + ['', '1'][len(vals):]\n    return vals[0], float(vals[1])\n"
  },
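The `parse_prompt` helper above splits an optional `:weight` suffix off a prompt while keeping the colon of `http(s)://` URLs intact, defaulting the weight to 1. Its logic can be exercised stand-alone (copied here verbatim so the example is self-contained):

```python
def parse_prompt(prompt):
    # URLs contain a ':' in the scheme, so split off at most the trailing weight
    if prompt.startswith('http://') or prompt.startswith('https://'):
        vals = prompt.rsplit(':', 2)
        vals = [vals[0] + ':' + vals[1], *vals[2:]]
    else:
        vals = prompt.rsplit(':', 1)
    # pad a missing weight with the default '1'
    vals = vals + ['', '1'][len(vals):]
    return vals[0], float(vals[1])

# Plain text, weighted text, and a weighted URL all parse cleanly:
assert parse_prompt('a castle') == ('a castle', 1.0)
assert parse_prompt('a castle:2') == ('a castle', 2.0)
assert parse_prompt('https://example.com/img.png:0.5') == ('https://example.com/img.png', 0.5)
```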
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/__init__.py",
    "content": "\"\"\"\nCodebase for \"Improved Denoising Diffusion Probabilistic Models\" implemented by Paddle.\n\"\"\"\n"
  },
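gaussian_diffusion.py below builds its noise schedules via `get_named_beta_schedule`. A minimal reproduction of the linear branch shows how the Ho et al. endpoints (1e-4 to 0.02 at 1000 steps) are rescaled for arbitrary step counts; `linear_betas` is a hypothetical name for this sketch:

```python
import numpy as np

def linear_betas(num_diffusion_timesteps: int) -> np.ndarray:
    # Mirrors the "linear" branch of get_named_beta_schedule: scale the
    # Ho et al. endpoints so the schedule stays comparable at any step count.
    scale = 1000 / num_diffusion_timesteps
    return np.linspace(scale * 0.0001, scale * 0.02, num_diffusion_timesteps, dtype=np.float64)

betas = linear_betas(1000)
assert betas[0] == 0.0001 and betas[-1] == 0.02

# alpha_bar (the cumulative product of 1 - beta) decays towards 0, exactly
# what the GaussianDiffusion constructor stores as alphas_cumprod.
alpha_bar = np.cumprod(1.0 - betas)
assert alpha_bar[-1] < 1e-4
```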
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/gaussian_diffusion.py",
    "content": "\"\"\"\nDiffusion model implemented by Paddle.\nThis code is rewritten based on Pytorch version of of Ho et al's diffusion models:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py\n\"\"\"\nimport enum\nimport math\n\nimport numpy as np\nimport paddle\n\nfrom .losses import discretized_gaussian_log_likelihood\nfrom .losses import normal_kl\nfrom .nn import mean_flat\n\n\ndef get_named_beta_schedule(schedule_name, num_diffusion_timesteps):\n    \"\"\"\n    Get a pre-defined beta schedule for the given name.\n\n    The beta schedule library consists of beta schedules which remain similar\n    in the limit of num_diffusion_timesteps.\n    Beta schedules may be added, but should not be removed or changed once\n    they are committed to maintain backwards compatibility.\n    \"\"\"\n    if schedule_name == \"linear\":\n        # Linear schedule from Ho et al, extended to work for any number of\n        # diffusion steps.\n        scale = 1000 / num_diffusion_timesteps\n        beta_start = scale * 0.0001\n        beta_end = scale * 0.02\n        return np.linspace(beta_start, beta_end, num_diffusion_timesteps, dtype=np.float64)\n    elif schedule_name == \"cosine\":\n        return betas_for_alpha_bar(\n            num_diffusion_timesteps,\n            lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2)**2,\n        )\n    else:\n        raise NotImplementedError(f\"unknown beta schedule: {schedule_name}\")\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function,\n    which defines the cumulative product of (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that\n         
             part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas)\n\n\nclass ModelMeanType(enum.Enum):\n    \"\"\"\n    Which type of output the model predicts.\n    \"\"\"\n\n    PREVIOUS_X = enum.auto()  # the model predicts x_{t-1}\n    START_X = enum.auto()  # the model predicts x_0\n    EPSILON = enum.auto()  # the model predicts epsilon\n\n\nclass ModelVarType(enum.Enum):\n    \"\"\"\n    What is used as the model's output variance.\n\n    The LEARNED_RANGE option has been added to allow the model to predict\n    values between FIXED_SMALL and FIXED_LARGE, making its job easier.\n    \"\"\"\n\n    LEARNED = enum.auto()\n    FIXED_SMALL = enum.auto()\n    FIXED_LARGE = enum.auto()\n    LEARNED_RANGE = enum.auto()\n\n\nclass LossType(enum.Enum):\n    MSE = enum.auto()  # use raw MSE loss (and KL when learning variances)\n    RESCALED_MSE = (enum.auto())  # use raw MSE loss (with RESCALED_KL when learning variances)\n    KL = enum.auto()  # use the variational lower-bound\n    RESCALED_KL = enum.auto()  # like KL, but rescale to estimate the full VLB\n\n    def is_vb(self):\n        return self == LossType.KL or self == LossType.RESCALED_KL\n\n\nclass GaussianDiffusion:\n    \"\"\"\n    Utilities for training and sampling diffusion models.\n\n    Ported directly from here, and then adapted over time to further experimentation.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42\n\n    :param betas: a 1-D numpy array of betas for each diffusion timestep,\n                  starting at T and going to 1.\n    :param model_mean_type: 
a ModelMeanType determining what the model outputs.\n    :param model_var_type: a ModelVarType determining how variance is output.\n    :param loss_type: a LossType determining the loss function to use.\n    :param rescale_timesteps: if True, pass floating point timesteps into the\n                              model so that they are always scaled like in the\n                              original paper (0 to 1000).\n    \"\"\"\n\n    def __init__(\n        self,\n        *,\n        betas,\n        model_mean_type,\n        model_var_type,\n        loss_type,\n        rescale_timesteps=False,\n    ):\n        self.model_mean_type = model_mean_type\n        self.model_var_type = model_var_type\n        self.loss_type = loss_type\n        self.rescale_timesteps = rescale_timesteps\n\n        # Use float64 for accuracy.\n        betas = np.array(betas, dtype=np.float64)\n        self.betas = betas\n        assert len(betas.shape) == 1, \"betas must be 1-D\"\n        assert (betas > 0).all() and (betas <= 1).all()\n\n        self.num_timesteps = int(betas.shape[0])\n\n        alphas = 1.0 - betas\n        self.alphas_cumprod = np.cumprod(alphas, axis=0)\n        self.alphas_cumprod_prev = np.append(1.0, self.alphas_cumprod[:-1])\n        self.alphas_cumprod_next = np.append(self.alphas_cumprod[1:], 0.0)\n        assert self.alphas_cumprod_prev.shape == (self.num_timesteps, )\n\n        # calculations for diffusion q(x_t | x_{t-1}) and others\n        self.sqrt_alphas_cumprod = np.sqrt(self.alphas_cumprod)\n        self.sqrt_one_minus_alphas_cumprod = np.sqrt(1.0 - self.alphas_cumprod)\n        self.log_one_minus_alphas_cumprod = np.log(1.0 - self.alphas_cumprod)\n        self.sqrt_recip_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod)\n        self.sqrt_recipm1_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod - 1)\n\n        # calculations for posterior q(x_{t-1} | x_t, x_0)\n        self.posterior_variance = (betas * (1.0 - self.alphas_cumprod_prev) / (1.0 - 
self.alphas_cumprod))\n        # log calculation clipped because the posterior variance is 0 at the\n        # beginning of the diffusion chain.\n        self.posterior_log_variance_clipped = np.log(np.append(self.posterior_variance[1], self.posterior_variance[1:]))\n        self.posterior_mean_coef1 = (betas * np.sqrt(self.alphas_cumprod_prev) / (1.0 - self.alphas_cumprod))\n        self.posterior_mean_coef2 = ((1.0 - self.alphas_cumprod_prev) * np.sqrt(alphas) / (1.0 - self.alphas_cumprod))\n\n    def q_mean_variance(self, x_start, t):\n        \"\"\"\n        Get the distribution q(x_t | x_0).\n\n        :param x_start: the [N x C x ...] tensor of noiseless inputs.\n        :param t: the number of diffusion steps (minus 1). Here, 0 means one step.\n        :return: A tuple (mean, variance, log_variance), all of x_start's shape.\n        \"\"\"\n        mean = (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start)\n        variance = _extract_into_tensor(1.0 - self.alphas_cumprod, t, x_start.shape)\n        log_variance = _extract_into_tensor(self.log_one_minus_alphas_cumprod, t, x_start.shape)\n        return mean, variance, log_variance\n\n    def q_sample(self, x_start, t, noise=None):\n        \"\"\"\n        Diffuse the data for a given number of diffusion steps.\n\n        In other words, sample from q(x_t | x_0).\n\n        :param x_start: the initial data batch.\n        :param t: the number of diffusion steps (minus 1). 
Here, 0 means one step.\n        :param noise: if specified, the split-out normal noise.\n        :return: A noisy version of x_start.\n        \"\"\"\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        assert noise.shape == x_start.shape\n        return (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +\n                _extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)\n\n    def q_posterior_mean_variance(self, x_start, x_t, t):\n        \"\"\"\n        Compute the mean and variance of the diffusion posterior:\n\n            q(x_{t-1} | x_t, x_0)\n\n        \"\"\"\n        assert x_start.shape == x_t.shape\n        posterior_mean = (_extract_into_tensor(self.posterior_mean_coef1, t, x_t.shape) * x_start +\n                          _extract_into_tensor(self.posterior_mean_coef2, t, x_t.shape) * x_t)\n        posterior_variance = _extract_into_tensor(self.posterior_variance, t, x_t.shape)\n        posterior_log_variance_clipped = _extract_into_tensor(self.posterior_log_variance_clipped, t, x_t.shape)\n        assert (posterior_mean.shape[0] == posterior_variance.shape[0] == posterior_log_variance_clipped.shape[0] ==\n                x_start.shape[0])\n        return posterior_mean, posterior_variance, posterior_log_variance_clipped\n\n    def p_mean_variance(self, model, x, t, clip_denoised=True, denoised_fn=None, model_kwargs=None):\n        \"\"\"\n        Apply the model to get p(x_{t-1} | x_t), as well as a prediction of\n        the initial x, x_0.\n\n        :param model: the model, which takes a signal and a batch of timesteps\n                      as input.\n        :param x: the [N x C x ...] 
tensor at time t.\n        :param t: a 1-D Tensor of timesteps.\n        :param clip_denoised: if True, clip the denoised signal into [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample. Applies before\n            clip_denoised.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :return: a dict with the following keys:\n                 - 'mean': the model mean output.\n                 - 'variance': the model variance output.\n                 - 'log_variance': the log of 'variance'.\n                 - 'pred_xstart': the prediction for x_0.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n\n        B, C = x.shape[:2]\n        assert t.shape == [B]\n        model_output = model(x, self._scale_timesteps(t), **model_kwargs)\n\n        if self.model_var_type in [ModelVarType.LEARNED, ModelVarType.LEARNED_RANGE]:\n            assert model_output.shape == [B, C * 2, *x.shape[2:]]\n            model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n            if self.model_var_type == ModelVarType.LEARNED:\n                model_log_variance = model_var_values\n                model_variance = paddle.exp(model_log_variance)\n            else:\n                min_log = _extract_into_tensor(self.posterior_log_variance_clipped, t, x.shape)\n                max_log = _extract_into_tensor(np.log(self.betas), t, x.shape)\n                # The model_var_values is [-1, 1] for [min_var, max_var].\n                frac = (model_var_values + 1) / 2\n                model_log_variance = frac * max_log + (1 - frac) * min_log\n                model_variance = paddle.exp(model_log_variance)\n        else:\n            model_variance, model_log_variance = {\n                # for fixedlarge, we set the initial (log-)variance like so\n        
        # to get a better decoder log likelihood.\n                ModelVarType.FIXED_LARGE: (\n                    np.append(self.posterior_variance[1], self.betas[1:]),\n                    np.log(np.append(self.posterior_variance[1], self.betas[1:])),\n                ),\n                ModelVarType.FIXED_SMALL: (\n                    self.posterior_variance,\n                    self.posterior_log_variance_clipped,\n                ),\n            }[self.model_var_type]\n            model_variance = _extract_into_tensor(model_variance, t, x.shape)\n            model_log_variance = _extract_into_tensor(model_log_variance, t, x.shape)\n\n        def process_xstart(x):\n            if denoised_fn is not None:\n                x = denoised_fn(x)\n            if clip_denoised:\n                return x.clamp(-1, 1)\n            return x\n\n        if self.model_mean_type == ModelMeanType.PREVIOUS_X:\n            pred_xstart = process_xstart(self._predict_xstart_from_xprev(x_t=x, t=t, xprev=model_output))\n            model_mean = model_output\n        elif self.model_mean_type in [ModelMeanType.START_X, ModelMeanType.EPSILON]:\n            if self.model_mean_type == ModelMeanType.START_X:\n                pred_xstart = process_xstart(model_output)\n            else:\n                pred_xstart = process_xstart(self._predict_xstart_from_eps(x_t=x, t=t, eps=model_output))\n            model_mean, _, _ = self.q_posterior_mean_variance(x_start=pred_xstart, x_t=x, t=t)\n        else:\n            raise NotImplementedError(self.model_mean_type)\n\n        assert (model_mean.shape == model_log_variance.shape == pred_xstart.shape == x.shape)\n        return {\n            \"mean\": model_mean,\n            \"variance\": model_variance,\n            \"log_variance\": model_log_variance,\n            \"pred_xstart\": pred_xstart,\n        }\n\n    def _predict_xstart_from_eps(self, x_t, t, eps):\n        assert x_t.shape == eps.shape\n        return 
(_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape) * eps)\n\n    def _predict_xstart_from_xprev(self, x_t, t, xprev):\n        assert x_t.shape == xprev.shape\n        return (  # (xprev - coef2*x_t) / coef1\n            _extract_into_tensor(1.0 / self.posterior_mean_coef1, t, x_t.shape) * xprev -\n            _extract_into_tensor(self.posterior_mean_coef2 / self.posterior_mean_coef1, t, x_t.shape) * x_t)\n\n    def _predict_eps_from_xstart(self, x_t, t, pred_xstart):\n        return (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                pred_xstart) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape)\n\n    def _scale_timesteps(self, t):\n        if self.rescale_timesteps:\n            return paddle.cast((t), 'float32') * (1000.0 / self.num_timesteps)\n        return t\n\n    def condition_mean(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_mean_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. 
In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, t, p_mean_var, **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_score(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def condition_score_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, t, p_mean_var, 
**model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def p_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def p_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean_with_grad(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"].detach()}\n\n    def p_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model.\n\n        :param model: the model module.\n        :param shape: the shape of the samples, (N, C, H, W).\n        :param noise: if specified, the noise from the encoder to sample.\n                      Should be of the same shape as `shape`.\n        :param clip_denoised: if True, clip x_start predictions to [-1, 1].\n        :param denoised_fn: if not None, a function which applies 
to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param device: if specified, the device to create the samples on.\n                       If not specified, use a model parameter's device.\n        :param progress: if True, show a tqdm progress bar.\n        :return: a non-differentiable batch of samples.\n        \"\"\"\n        final = None\n        for sample in self.p_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def p_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model and yield intermediate samples from\n        each timestep of diffusion.\n\n        Arguments are the same as p_sample_loop().\n        Returns a generator over dicts, where each dict is the return value of\n        p_sample().\n        \"\"\"\n        if device is None:\n     
       device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            sample_fn = self.p_sample_with_grad if cond_fn_with_grad else self.p_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def ddim_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n      
  if cond_fn is not None:\n            out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"]}\n\n    def ddim_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        if cond_fn is not None:\n            out 
= self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        out[\"pred_xstart\"] = out[\"pred_xstart\"].detach()\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"].detach()}\n\n    def ddim_reverse_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t+1} from the model using DDIM reverse ODE.\n        \"\"\"\n        assert eta == 0.0, \"Reverse ODE only for deterministic path\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n      
  eps = (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x.shape) * x -\n               out[\"pred_xstart\"]) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x.shape)\n        alpha_bar_next = _extract_into_tensor(self.alphas_cumprod_next, t, x.shape)\n\n        # Equation 12. reversed\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_next) + paddle.sqrt(1 - alpha_bar_next) * eps)\n\n        return {\"sample\": mean_pred, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def ddim_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model using DDIM.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.ddim_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                eta=eta,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def ddim_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        
skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Use DDIM to sample from the model and yield intermediate samples from\n        each timestep of DDIM.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        # if device is None:\n        #     device = next(model.parameters()).device\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0])\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(\n                    low=0,\n                    high=model.num_classes,\n                    shape=model_kwargs['y'].shape,\n                )\n            sample_fn = self.ddim_sample_with_grad if cond_fn_with_grad else self.ddim_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                eta=eta,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def plms_sample(\n        self,\n        model,\n        
x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        cond_fn_with_grad=False,\n        order=2,\n        old_out=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample().\n        \"\"\"\n        if order not in (1, 2, 3, 4):\n            raise ValueError('order is invalid (should be int from 1-4).')\n\n        def get_model_output(x, t):\n            with paddle.set_grad_enabled(cond_fn_with_grad and cond_fn is not None):\n                if cond_fn_with_grad:\n                    # Paddle analogue of torch's x.detach().requires_grad_()\n                    x = x.detach()\n                    x.stop_gradient = False\n                out_orig = self.p_mean_variance(\n                    model,\n                    x,\n                    t,\n                    clip_denoised=clip_denoised,\n                    denoised_fn=denoised_fn,\n                    model_kwargs=model_kwargs,\n                )\n                if cond_fn is not None:\n                    if cond_fn_with_grad:\n                        out = self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                        x = x.detach()\n                    else:\n                        out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                else:\n                    out = out_orig\n\n            # Usually our model outputs epsilon, but we re-derive it\n            # in case we used x_start or x_prev prediction.\n            eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n            return eps, out, out_orig\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        eps, out, out_orig = get_model_output(x, t)\n\n        if order > 1 and old_out is None:\n            # Pseudo Improved Euler\n            old_eps = [eps]\n            mean_pred = 
out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps\n            eps_2, _, _ = get_model_output(mean_pred, t - 1)\n            eps_prime = (eps + eps_2) / 2\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n        else:\n            # Pseudo Linear Multistep (Adams-Bashforth)\n            old_eps = old_out[\"old_eps\"]\n            old_eps.append(eps)\n            cur_order = min(order, len(old_eps))\n            if cur_order == 1:\n                eps_prime = old_eps[-1]\n            elif cur_order == 2:\n                eps_prime = (3 * old_eps[-1] - old_eps[-2]) / 2\n            elif cur_order == 3:\n                eps_prime = (23 * old_eps[-1] - 16 * old_eps[-2] + 5 * old_eps[-3]) / 12\n            elif cur_order == 4:\n                eps_prime = (55 * old_eps[-1] - 59 * old_eps[-2] + 37 * old_eps[-3] - 9 * old_eps[-4]) / 24\n            else:\n                raise RuntimeError('cur_order is invalid.')\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n\n        if len(old_eps) >= order:\n            old_eps.pop(0)\n\n        nonzero_mask = paddle.cast((t != 0), 'float32').reshape([-1, *([1] * (len(x.shape) - 1))])\n        sample = mean_pred * nonzero_mask + out[\"pred_xstart\"] * (1 - nonzero_mask)\n\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"], \"old_eps\": old_eps}\n\n    def plms_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        
cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Generate samples from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.plms_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def plms_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Use PLMS to sample from the model and yield intermediate samples from each\n        timestep of PLMS.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], 
dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        old_out = None\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            out = self.plms_sample(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n                old_out=old_out,\n            )\n            yield out\n            old_out = out\n            img = out[\"sample\"]\n\n    def _vb_terms_bpd(self, model, x_start, x_t, t, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Get a term for the variational lower-bound.\n\n        The resulting units are bits (rather than nats, as one might expect).\n        This allows for comparison to other papers.\n\n        :return: a dict with the following keys:\n                 - 'output': a shape [N] tensor of NLLs or KLs.\n                 - 'pred_xstart': the x_0 predictions.\n        \"\"\"\n        true_mean, _, true_log_variance_clipped = self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)\n        out = self.p_mean_variance(model, x_t, t, clip_denoised=clip_denoised, model_kwargs=model_kwargs)\n        kl = normal_kl(true_mean, true_log_variance_clipped, out[\"mean\"], out[\"log_variance\"])\n        kl = mean_flat(kl) / np.log(2.0)\n\n        decoder_nll = -discretized_gaussian_log_likelihood(\n            x_start, 
means=out[\"mean\"], log_scales=0.5 * out[\"log_variance\"])\n        assert decoder_nll.shape == x_start.shape\n        decoder_nll = mean_flat(decoder_nll) / np.log(2.0)\n\n        # At the first timestep return the decoder NLL,\n        # otherwise return KL(q(x_{t-1}|x_t,x_0) || p(x_{t-1}|x_t))\n        output = paddle.where((t == 0), decoder_nll, kl)\n        return {\"output\": output, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def training_losses(self, model, x_start, t, model_kwargs=None, noise=None):\n        \"\"\"\n        Compute training losses for a single timestep.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param t: a batch of timestep indices.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param noise: if specified, the specific Gaussian noise to try to remove.\n        :return: a dict with the key \"loss\" containing a tensor of shape [N].\n                 Some mean or variance settings may also have other keys.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        x_t = self.q_sample(x_start, t, noise=noise)\n\n        terms = {}\n\n        if self.loss_type == LossType.KL or self.loss_type == LossType.RESCALED_KL:\n            terms[\"loss\"] = self._vb_terms_bpd(\n                model=model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t,\n                clip_denoised=False,\n                model_kwargs=model_kwargs,\n            )[\"output\"]\n            if self.loss_type == LossType.RESCALED_KL:\n                terms[\"loss\"] *= self.num_timesteps\n        elif self.loss_type == LossType.MSE or self.loss_type == LossType.RESCALED_MSE:\n     
       model_output = model(x_t, self._scale_timesteps(t), **model_kwargs)\n\n            if self.model_var_type in [\n                    ModelVarType.LEARNED,\n                    ModelVarType.LEARNED_RANGE,\n            ]:\n                B, C = x_t.shape[:2]\n                # Paddle's Tensor.shape is a list, so compare against a list, not a tuple.\n                assert model_output.shape == [B, C * 2, *x_t.shape[2:]]\n                model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n                # Learn the variance using the variational bound, but don't let\n                # it affect our mean prediction.\n                frozen_out = paddle.concat([model_output.detach(), model_var_values], axis=1)\n                terms[\"vb\"] = self._vb_terms_bpd(\n                    model=lambda *args, r=frozen_out: r,\n                    x_start=x_start,\n                    x_t=x_t,\n                    t=t,\n                    clip_denoised=False,\n                )[\"output\"]\n                if self.loss_type == LossType.RESCALED_MSE:\n                    # Divide by 1000 for equivalence with initial implementation.\n                    # Without a factor of 1/1000, the VB term hurts the MSE term.\n                    terms[\"vb\"] *= self.num_timesteps / 1000.0\n\n            target = {\n                ModelMeanType.PREVIOUS_X: self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)[0],\n                ModelMeanType.START_X: x_start,\n                ModelMeanType.EPSILON: noise,\n            }[self.model_mean_type]\n            assert model_output.shape == target.shape == x_start.shape\n            terms[\"mse\"] = mean_flat((target - model_output)**2)\n            if \"vb\" in terms:\n                terms[\"loss\"] = terms[\"mse\"] + terms[\"vb\"]\n            else:\n                terms[\"loss\"] = terms[\"mse\"]\n        else:\n            raise NotImplementedError(self.loss_type)\n\n        return terms\n\n    def _prior_bpd(self, x_start):\n        \"\"\"\n        Get the prior KL term for the variational 
lower-bound, measured in\n        bits-per-dim.\n\n        This term can't be optimized, as it only depends on the encoder.\n\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :return: a batch of [N] KL values (in bits), one per batch element.\n        \"\"\"\n        batch_size = x_start.shape[0]\n        t = paddle.to_tensor([self.num_timesteps - 1] * batch_size, place=x_start.place)\n        qt_mean, _, qt_log_variance = self.q_mean_variance(x_start, t)\n        kl_prior = normal_kl(mean1=qt_mean, logvar1=qt_log_variance, mean2=0.0, logvar2=0.0)\n        return mean_flat(kl_prior) / np.log(2.0)\n\n    def calc_bpd_loop(self, model, x_start, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Compute the entire variational lower-bound, measured in bits-per-dim,\n        as well as other related quantities.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param clip_denoised: if True, clip denoised samples.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n\n        :return: a dict containing the following keys:\n                 - total_bpd: the total variational lower-bound, per batch element.\n                 - prior_bpd: the prior term in the lower-bound.\n                 - vb: an [N x T] tensor of terms in the lower-bound.\n                 - xstart_mse: an [N x T] tensor of x_0 MSEs for each timestep.\n                 - mse: an [N x T] tensor of epsilon MSEs for each timestep.\n        \"\"\"\n        device = x_start.place\n        batch_size = x_start.shape[0]\n\n        vb = []\n        xstart_mse = []\n        mse = []\n        for t in list(range(self.num_timesteps))[::-1]:\n            t_batch = paddle.to_tensor([t] * batch_size, place=device)\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n            x_t = self.q_sample(x_start=x_start, t=t_batch, noise=noise)\n            # Calculate VLB term at the current timestep\n            # with paddle.no_grad():\n            out = self._vb_terms_bpd(\n                model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t_batch,\n                clip_denoised=clip_denoised,\n                model_kwargs=model_kwargs,\n            )\n            vb.append(out[\"output\"])\n            xstart_mse.append(mean_flat((out[\"pred_xstart\"] - x_start)**2))\n            eps = self._predict_eps_from_xstart(x_t, t_batch, out[\"pred_xstart\"])\n            mse.append(mean_flat((eps - noise)**2))\n\n        vb = paddle.stack(vb, axis=1)\n        xstart_mse = paddle.stack(xstart_mse, axis=1)\n        mse = paddle.stack(mse, axis=1)\n\n        prior_bpd = self._prior_bpd(x_start)\n        total_bpd = vb.sum(axis=1) + prior_bpd\n        return {\n            \"total_bpd\": total_bpd,\n            \"prior_bpd\": prior_bpd,\n            \"vb\": vb,\n            \"xstart_mse\": xstart_mse,\n            \"mse\": mse,\n        }\n\n\ndef 
_extract_into_tensor(arr, timesteps, broadcast_shape):\n    \"\"\"\n    Extract values from a 1-D numpy array for a batch of indices.\n\n    :param arr: the 1-D numpy array.\n    :param timesteps: a tensor of indices into the array to extract.\n    :param broadcast_shape: a larger shape of K dimensions with the batch\n                            dimension equal to the length of timesteps.\n    :return: a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n    \"\"\"\n    res = paddle.to_tensor(arr, place=timesteps.place)[timesteps]\n    while len(res.shape) < len(broadcast_shape):\n        res = res[..., None]\n    return res.expand(broadcast_shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/losses.py",
    "content": "\"\"\"\nHelpers for various likelihood-based losses implemented by Paddle. These are ported from the original\nHo et al. diffusion models codebase:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/utils.py\n\"\"\"\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef normal_kl(mean1, logvar1, mean2, logvar2):\n    \"\"\"\n    Compute the KL divergence between two gaussians.\n\n    Shapes are automatically broadcasted, so batches can be compared to\n    scalars, among other use cases.\n    \"\"\"\n    tensor = None\n    for obj in (mean1, logvar1, mean2, logvar2):\n        if isinstance(obj, paddle.Tensor):\n            tensor = obj\n            break\n    assert tensor is not None, \"at least one argument must be a Tensor\"\n\n    # Force variances to be Tensors. Broadcasting helps convert scalars to\n    # Tensors, but it does not work for th.exp().\n    logvar1, logvar2 = [x if isinstance(x, paddle.Tensor) else paddle.to_tensor(x) for x in (logvar1, logvar2)]\n\n    return 0.5 * (-1.0 + logvar2 - logvar1 + paddle.exp(logvar1 - logvar2) +\n                  ((mean1 - mean2)**2) * paddle.exp(-logvar2))\n\n\ndef approx_standard_normal_cdf(x):\n    \"\"\"\n    A fast approximation of the cumulative distribution function of the\n    standard normal.\n    \"\"\"\n    return 0.5 * (1.0 + paddle.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef discretized_gaussian_log_likelihood(x, *, means, log_scales):\n    \"\"\"\n    Compute the log-likelihood of a Gaussian distribution discretizing to a\n    given image.\n\n    :param x: the target images. 
It is assumed that this was uint8 values,\n              rescaled to the range [-1, 1].\n    :param means: the Gaussian mean Tensor.\n    :param log_scales: the Gaussian log stddev Tensor.\n    :return: a tensor like x of log probabilities (in nats).\n    \"\"\"\n    assert x.shape == means.shape == log_scales.shape\n    centered_x = x - means\n    inv_stdv = paddle.exp(-log_scales)\n    plus_in = inv_stdv * (centered_x + 1.0 / 255.0)\n    cdf_plus = approx_standard_normal_cdf(plus_in)\n    min_in = inv_stdv * (centered_x - 1.0 / 255.0)\n    cdf_min = approx_standard_normal_cdf(min_in)\n    log_cdf_plus = paddle.log(cdf_plus.clip(min=1e-12))\n    log_one_minus_cdf_min = paddle.log((1.0 - cdf_min).clip(min=1e-12))\n    cdf_delta = cdf_plus - cdf_min\n    log_probs = paddle.where(\n        x < -0.999,\n        log_cdf_plus,\n        paddle.where(x > 0.999, log_one_minus_cdf_min, paddle.log(cdf_delta.clip(min=1e-12))),\n    )\n    assert log_probs.shape == x.shape\n    return log_probs\n\n\ndef spherical_dist_loss(x, y):\n    x = F.normalize(x, axis=-1)\n    y = F.normalize(y, axis=-1)\n    return (x - y).norm(axis=-1).divide(paddle.to_tensor(2.0)).asin().pow(2).multiply(paddle.to_tensor(2.0))\n\n\ndef tv_loss(input):\n    \"\"\"L2 total variation loss, as in Mahendran et al.\"\"\"\n    input = F.pad(input, (0, 1, 0, 1), 'replicate')\n    x_diff = input[..., :-1, 1:] - input[..., :-1, :-1]\n    y_diff = input[..., 1:, :-1] - input[..., :-1, :-1]\n    return (x_diff**2 + y_diff**2).mean([1, 2, 3])\n\n\ndef range_loss(input):\n    return (input - input.clip(-1, 1)).pow(2).mean([1, 2, 3])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/make_cutouts.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/make_cutouts.py\n'''\nimport math\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_cnclip_vitb16.resize_right.resize_right import resize\nfrom paddle.nn import functional as F\n\nfrom . import transforms as T\n\nskip_augs = False  # @param{type: 'boolean'}\n\n\ndef sinc(x):\n    return paddle.where(x != 0, paddle.sin(math.pi * x) / (math.pi * x), x.new_ones([]))\n\n\ndef lanczos(x, a):\n    cond = paddle.logical_and(-a < x, x < a)\n    out = paddle.where(cond, sinc(x) * sinc(x / a), x.new_zeros([]))\n    return out / out.sum()\n\n\ndef ramp(ratio, width):\n    n = math.ceil(width / ratio + 1)\n    out = paddle.empty([n])\n    cur = 0\n    for i in range(out.shape[0]):\n        out[i] = cur\n        cur += ratio\n    return paddle.concat([-out[1:].flip([0]), out])[1:-1]\n\n\nclass MakeCutouts(nn.Layer):\n\n    def __init__(self, cut_size, cutn, skip_augs=False):\n        super().__init__()\n        self.cut_size = cut_size\n        self.cutn = cutn\n        self.skip_augs = skip_augs\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomPerspective(distortion_scale=0.4, p=0.7),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.15),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        input = T.Pad(input.shape[2] // 4, fill=0)(input)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n\n        cutouts = []\n        for ch in range(self.cutn):\n 
           if ch > self.cutn - self.cutn // 4:\n                cutout = input.clone()\n            else:\n                size = int(max_size *\n                           paddle.zeros(1, ).normal_(mean=0.8, std=0.3).clip(float(self.cut_size / max_size), 1.0))\n                offsetx = paddle.randint(0, abs(sideX - size + 1), ())\n                offsety = paddle.randint(0, abs(sideY - size + 1), ())\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n\n            if not self.skip_augs:\n                cutout = self.augs(cutout)\n            cutouts.append(resample(cutout, (self.cut_size, self.cut_size)))\n            del cutout\n\n        cutouts = paddle.concat(cutouts, axis=0)\n        return cutouts\n\n\nclass MakeCutoutsDango(nn.Layer):\n\n    def __init__(self, cut_size, Overview=4, InnerCrop=0, IC_Size_Pow=0.5, IC_Grey_P=0.2):\n        super().__init__()\n        self.cut_size = cut_size\n        self.Overview = Overview\n        self.InnerCrop = InnerCrop\n        self.IC_Size_Pow = IC_Size_Pow\n        self.IC_Grey_P = IC_Grey_P\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(\n                degrees=10,\n                translate=(0.05, 0.05),\n                interpolation=T.InterpolationMode.BILINEAR,\n            ),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.1),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        cutouts = []\n        gray = T.Grayscale(3)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n        min_size = min(sideX, sideY, self.cut_size)\n        output_shape = [1, 3, self.cut_size, self.cut_size]\n        pad_input = F.pad(\n   
         input,\n            (\n                (sideY - max_size) // 2,\n                (sideY - max_size) // 2,\n                (sideX - max_size) // 2,\n                (sideX - max_size) // 2,\n            ),\n            **padargs,\n        )\n        cutout = resize(pad_input, out_shape=output_shape)\n\n        if self.Overview > 0:\n            if self.Overview <= 4:\n                if self.Overview >= 1:\n                    cutouts.append(cutout)\n                if self.Overview >= 2:\n                    cutouts.append(gray(cutout))\n                if self.Overview >= 3:\n                    cutouts.append(cutout[:, :, :, ::-1])\n                if self.Overview == 4:\n                    cutouts.append(gray(cutout[:, :, :, ::-1]))\n            else:\n                cutout = resize(pad_input, out_shape=output_shape)\n                for _ in range(self.Overview):\n                    cutouts.append(cutout)\n\n        if self.InnerCrop > 0:\n            for i in range(self.InnerCrop):\n                size = int(paddle.rand([1])**self.IC_Size_Pow * (max_size - min_size) + min_size)\n                offsetx = paddle.randint(0, sideX - size + 1)\n                offsety = paddle.randint(0, sideY - size + 1)\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n                if i <= int(self.IC_Grey_P * self.InnerCrop):\n                    cutout = gray(cutout)\n                cutout = resize(cutout, out_shape=output_shape)\n                cutouts.append(cutout)\n\n        cutouts = paddle.concat(cutouts)\n        if skip_augs is not True:\n            cutouts = self.augs(cutouts)\n        return cutouts\n\n\ndef resample(input, size, align_corners=True):\n    n, c, h, w = input.shape\n    dh, dw = size\n\n    input = input.reshape([n * c, 1, h, w])\n\n    if dh < h:\n        kernel_h = lanczos(ramp(dh / h, 2), 2).to(input.device, input.dtype)\n        pad_h = (kernel_h.shape[0] - 1) // 2\n        input = 
F.pad(input, (0, 0, pad_h, pad_h), 'reflect')\n        input = F.conv2d(input, kernel_h[None, None, :, None])\n\n    if dw < w:\n        kernel_w = lanczos(ramp(dw / w, 2), 2).to(input.device, input.dtype)\n        pad_w = (kernel_w.shape[0] - 1) // 2\n        input = F.pad(input, (pad_w, pad_w, 0, 0), 'reflect')\n        input = F.conv2d(input, kernel_w[None, None, None, :])\n\n    # The Lanczos convolutions shrink the spatial dims, so read them back from\n    # the tensor itself instead of reusing the original h and w.\n    input = input.reshape([n, c, input.shape[2], input.shape[3]])\n    return F.interpolate(input, size, mode='bicubic', align_corners=align_corners)\n\n\npadargs = {}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/nn.py",
    "content": "\"\"\"\nVarious utilities for neural networks implemented by Paddle. This code is rewritten based on:\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/nn.py\n\"\"\"\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\n\nclass SiLU(nn.Layer):\n\n    def forward(self, x):\n        return x * nn.functional.sigmoid(x)\n\n\nclass GroupNorm32(nn.GroupNorm):\n\n    def forward(self, x):\n        return super().forward(x)\n\n\ndef conv_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D convolution module.\n    \"\"\"\n    if dims == 1:\n        return nn.Conv1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.Conv2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.Conv3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef linear(*args, **kwargs):\n    \"\"\"\n    Create a linear module.\n    \"\"\"\n    return nn.Linear(*args, **kwargs)\n\n\ndef avg_pool_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D average pooling module.\n    \"\"\"\n    if dims == 1:\n        return nn.AvgPool1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.AvgPool2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.AvgPool3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef update_ema(target_params, source_params, rate=0.99):\n    \"\"\"\n    Update target parameters to be closer to those of source parameters using\n    an exponential moving average.\n\n    :param target_params: the target parameter sequence.\n    :param source_params: the source parameter sequence.\n    :param rate: the EMA rate (closer to 1 means slower).\n    \"\"\"\n    for targ, src in zip(target_params, source_params):\n        targ.detach().mul_(rate).add_(src, alpha=1 - rate)\n\n\ndef zero_module(module):\n    \"\"\"\n    Zero out the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().zero_()\n    
return module\n\n\ndef scale_module(module, scale):\n    \"\"\"\n    Scale the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().mul_(scale)\n    return module\n\n\ndef mean_flat(tensor):\n    \"\"\"\n    Take the mean over all non-batch dimensions.\n    \"\"\"\n    return tensor.mean(axis=list(range(1, len(tensor.shape))))\n\n\ndef normalization(channels):\n    \"\"\"\n    Make a standard normalization layer.\n\n    :param channels: number of input channels.\n    :return: an nn.Module for normalization.\n    \"\"\"\n    return GroupNorm32(32, channels)\n\n\ndef timestep_embedding(timesteps, dim, max_period=10000):\n    \"\"\"\n    Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param dim: the dimension of the output.\n    :param max_period: controls the minimum frequency of the embeddings.\n    :return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    half = dim // 2\n    freqs = paddle.exp(-math.log(max_period) * paddle.arange(start=0, end=half, dtype=paddle.float32) / half)\n    args = paddle.cast(timesteps[:, None], 'float32') * freqs[None]\n    embedding = paddle.concat([paddle.cos(args), paddle.sin(args)], axis=-1)\n    if dim % 2:\n        embedding = paddle.concat([embedding, paddle.zeros_like(embedding[:, :1])], axis=-1)\n    return embedding\n\n\ndef checkpoint(func, inputs, params, flag):\n    \"\"\"\n    This function is disabled. And now just forward.\n    \"\"\"\n    return func(*inputs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/perlin_noises.py",
    "content": "'''\nPerlin noise implementation by Paddle.\nThis code is rewritten based on:\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/perlin_noises.py\n'''\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as TF\nfrom PIL import Image\nfrom PIL import ImageOps\n\n\ndef interp(t):\n    return 3 * t**2 - 2 * t**3\n\n\ndef perlin(width, height, scale=10):\n    gx, gy = paddle.randn([2, width + 1, height + 1, 1, 1])\n    xs = paddle.linspace(0, 1, scale + 1)[:-1, None]\n    ys = paddle.linspace(0, 1, scale + 1)[None, :-1]\n    wx = 1 - interp(xs)\n    wy = 1 - interp(ys)\n    dots = 0\n    dots += wx * wy * (gx[:-1, :-1] * xs + gy[:-1, :-1] * ys)\n    dots += (1 - wx) * wy * (-gx[1:, :-1] * (1 - xs) + gy[1:, :-1] * ys)\n    dots += wx * (1 - wy) * (gx[:-1, 1:] * xs - gy[:-1, 1:] * (1 - ys))\n    dots += (1 - wx) * (1 - wy) * (-gx[1:, 1:] * (1 - xs) - gy[1:, 1:] * (1 - ys))\n    return dots.transpose([0, 2, 1, 3]).reshape([width * scale, height * scale])\n\n\ndef perlin_ms(octaves, width, height, grayscale):\n    out_array = [0.5] if grayscale else [0.5, 0.5, 0.5]\n    # out_array = [0.0] if grayscale else [0.0, 0.0, 0.0]\n    for i in range(1 if grayscale else 3):\n        scale = 2**len(octaves)\n        oct_width = width\n        oct_height = height\n        for oct in octaves:\n            p = perlin(oct_width, oct_height, scale)\n            out_array[i] += p * oct\n            scale //= 2\n            oct_width *= 2\n            oct_height *= 2\n    return paddle.concat(out_array)\n\n\ndef create_perlin_noise(octaves, width, height, grayscale, side_y, side_x):\n    out = perlin_ms(octaves, width, height, grayscale)\n    if grayscale:\n        out = TF.resize(size=(side_y, side_x), img=out.numpy())\n        out = np.uint8(out)\n        out = Image.fromarray(out).convert('RGB')\n    else:\n        out = out.reshape([-1, 3, out.shape[0] // 3, out.shape[1]])\n        out = out.squeeze().transpose([1, 2, 0]).numpy()\n        out = 
TF.resize(size=(side_y, side_x), img=out)\n        out = out.clip(0, 1) * 255\n        out = np.uint8(out)\n        out = Image.fromarray(out)\n\n    out = ImageOps.autocontrast(out)\n    return out\n\n\ndef regen_perlin(perlin_mode, side_y, side_x, batch_size):\n    if perlin_mode == 'color':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n    elif perlin_mode == 'gray':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n    else:\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n\n    init = (TF.to_tensor(init).add(TF.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n    del init2\n    return init.expand([batch_size, -1, -1, -1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/respace.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/respace.py\n'''\nimport numpy as np\nimport paddle\n\nfrom .gaussian_diffusion import GaussianDiffusion\n\n\ndef space_timesteps(num_timesteps, section_counts):\n    \"\"\"\n    Create a list of timesteps to use from an original diffusion process,\n    given the number of timesteps we want to take from equally-sized portions\n    of the original process.\n\n    For example, if there's 300 timesteps and the section counts are [10,15,20]\n    then the first 100 timesteps are strided to be 10 timesteps, the second 100\n    are strided to be 15 timesteps, and the final 100 are strided to be 20.\n\n    If the stride is a string starting with \"ddim\", then the fixed striding\n    from the DDIM paper is used, and only one section is allowed.\n\n    :param num_timesteps: the number of diffusion steps in the original\n                          process to divide up.\n    :param section_counts: either a list of numbers, or a string containing\n                           comma-separated numbers, indicating the step count\n                           per section. 
As a special case, use \"ddimN\" where N\n                           is a number of steps to use the striding from the\n                           DDIM paper.\n    :return: a set of diffusion steps from the original process to use.\n    \"\"\"\n    if isinstance(section_counts, str):\n        if section_counts.startswith(\"ddim\"):\n            desired_count = int(section_counts[len(\"ddim\"):])\n            for i in range(1, num_timesteps):\n                if len(range(0, num_timesteps, i)) == desired_count:\n                    return set(range(0, num_timesteps, i))\n            raise ValueError(f\"cannot create exactly {desired_count} steps with an integer stride\")\n        section_counts = [int(x) for x in section_counts.split(\",\")]\n    size_per = num_timesteps // len(section_counts)\n    extra = num_timesteps % len(section_counts)\n    start_idx = 0\n    all_steps = []\n    for i, section_count in enumerate(section_counts):\n        size = size_per + (1 if i < extra else 0)\n        if size < section_count:\n            raise ValueError(f\"cannot divide section of {size} steps into {section_count}\")\n        if section_count <= 1:\n            frac_stride = 1\n        else:\n            frac_stride = (size - 1) / (section_count - 1)\n        cur_idx = 0.0\n        taken_steps = []\n        for _ in range(section_count):\n            taken_steps.append(start_idx + round(cur_idx))\n            cur_idx += frac_stride\n        all_steps += taken_steps\n        start_idx += size\n    return set(all_steps)\n\n\nclass SpacedDiffusion(GaussianDiffusion):\n    \"\"\"\n    A diffusion process which can skip steps in a base diffusion process.\n\n    :param use_timesteps: a collection (sequence or set) of timesteps from the\n                          original diffusion process to retain.\n    :param kwargs: the kwargs to create the base diffusion process.\n    \"\"\"\n\n    def __init__(self, use_timesteps, **kwargs):\n        self.use_timesteps = 
set(use_timesteps)\n        self.timestep_map = []\n        self.original_num_steps = len(kwargs[\"betas\"])\n\n        base_diffusion = GaussianDiffusion(**kwargs)  # pylint: disable=missing-kwoa\n        last_alpha_cumprod = 1.0\n        new_betas = []\n        for i, alpha_cumprod in enumerate(base_diffusion.alphas_cumprod):\n            if i in self.use_timesteps:\n                new_betas.append(1 - alpha_cumprod / last_alpha_cumprod)\n                last_alpha_cumprod = alpha_cumprod\n                self.timestep_map.append(i)\n        kwargs[\"betas\"] = np.array(new_betas)\n        super().__init__(**kwargs)\n\n    def p_mean_variance(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().p_mean_variance(self._wrap_model(model), *args, **kwargs)\n\n    def training_losses(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().training_losses(self._wrap_model(model), *args, **kwargs)\n\n    def condition_mean(self, cond_fn, *args, **kwargs):\n        return super().condition_mean(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def condition_score(self, cond_fn, *args, **kwargs):\n        return super().condition_score(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def _wrap_model(self, model):\n        if isinstance(model, _WrappedModel):\n            return model\n        return _WrappedModel(model, self.timestep_map, self.rescale_timesteps, self.original_num_steps)\n\n    def _scale_timesteps(self, t):\n        # Scaling is done by the wrapped model.\n        return t\n\n\nclass _WrappedModel:\n\n    def __init__(self, model, timestep_map, rescale_timesteps, original_num_steps):\n        self.model = model\n        self.timestep_map = timestep_map\n        self.rescale_timesteps = rescale_timesteps\n        self.original_num_steps = original_num_steps\n\n    def __call__(self, x, ts, **kwargs):\n        map_tensor = paddle.to_tensor(self.timestep_map, place=ts.place, 
dtype=ts.dtype)\n        new_ts = map_tensor[ts]\n        if self.rescale_timesteps:\n            new_ts = paddle.cast(new_ts, 'float32') * (1000.0 / self.original_num_steps)\n        return self.model(x, new_ts, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/script_util.py",
    "content": "'''\nThis code is based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/script_util.py\n'''\nimport argparse\nimport inspect\n\nfrom . import gaussian_diffusion as gd\nfrom .respace import space_timesteps\nfrom .respace import SpacedDiffusion\nfrom .unet import EncoderUNetModel\nfrom .unet import SuperResModel\nfrom .unet import UNetModel\n\nNUM_CLASSES = 1000\n\n\ndef diffusion_defaults():\n    \"\"\"\n    Defaults for image and classifier training.\n    \"\"\"\n    return dict(\n        learn_sigma=False,\n        diffusion_steps=1000,\n        noise_schedule=\"linear\",\n        timestep_respacing=\"\",\n        use_kl=False,\n        predict_xstart=False,\n        rescale_timesteps=False,\n        rescale_learned_sigmas=False,\n    )\n\n\ndef model_and_diffusion_defaults():\n    \"\"\"\n    Defaults for image training.\n    \"\"\"\n    res = dict(\n        image_size=64,\n        num_channels=128,\n        num_res_blocks=2,\n        num_heads=4,\n        num_heads_upsample=-1,\n        num_head_channels=-1,\n        attention_resolutions=\"16,8\",\n        channel_mult=\"\",\n        dropout=0.0,\n        class_cond=False,\n        use_checkpoint=False,\n        use_scale_shift_norm=True,\n        resblock_updown=False,\n        use_fp16=False,\n        use_new_attention_order=False,\n    )\n    res.update(diffusion_defaults())\n    return res\n\n\ndef create_model_and_diffusion(\n    image_size,\n    class_cond,\n    learn_sigma,\n    num_channels,\n    num_res_blocks,\n    channel_mult,\n    num_heads,\n    num_head_channels,\n    num_heads_upsample,\n    attention_resolutions,\n    dropout,\n    diffusion_steps,\n    noise_schedule,\n    timestep_respacing,\n    use_kl,\n    predict_xstart,\n    rescale_timesteps,\n    rescale_learned_sigmas,\n    use_checkpoint,\n    use_scale_shift_norm,\n    resblock_updown,\n    use_fp16,\n    use_new_attention_order,\n):\n    model = create_model(\n        image_size,\n        
num_channels,\n        num_res_blocks,\n        channel_mult=channel_mult,\n        learn_sigma=learn_sigma,\n        class_cond=class_cond,\n        use_checkpoint=use_checkpoint,\n        attention_resolutions=attention_resolutions,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        dropout=dropout,\n        resblock_updown=resblock_updown,\n        use_fp16=use_fp16,\n        use_new_attention_order=use_new_attention_order,\n    )\n    diffusion = create_gaussian_diffusion(\n        steps=diffusion_steps,\n        learn_sigma=learn_sigma,\n        noise_schedule=noise_schedule,\n        use_kl=use_kl,\n        predict_xstart=predict_xstart,\n        rescale_timesteps=rescale_timesteps,\n        rescale_learned_sigmas=rescale_learned_sigmas,\n        timestep_respacing=timestep_respacing,\n    )\n    return model, diffusion\n\n\ndef create_model(\n    image_size,\n    num_channels,\n    num_res_blocks,\n    channel_mult=\"\",\n    learn_sigma=False,\n    class_cond=False,\n    use_checkpoint=False,\n    attention_resolutions=\"16\",\n    num_heads=1,\n    num_head_channels=-1,\n    num_heads_upsample=-1,\n    use_scale_shift_norm=False,\n    dropout=0,\n    resblock_updown=False,\n    use_fp16=False,\n    use_new_attention_order=False,\n):\n    if channel_mult == \"\":\n        if image_size == 512:\n            channel_mult = (0.5, 1, 1, 2, 2, 4, 4)\n        elif image_size == 256:\n            channel_mult = (1, 1, 2, 2, 4, 4)\n        elif image_size == 128:\n            channel_mult = (1, 1, 2, 3, 4)\n        elif image_size == 64:\n            channel_mult = (1, 2, 3, 4)\n        else:\n            raise ValueError(f\"unsupported image size: {image_size}\")\n    else:\n        channel_mult = tuple(int(ch_mult) for ch_mult in channel_mult.split(\",\"))\n\n    attention_ds = []\n    for res in 
attention_resolutions.split(\",\"):\n        attention_ds.append(image_size // int(res))\n\n    return UNetModel(\n        image_size=image_size,\n        in_channels=3,\n        model_channels=num_channels,\n        out_channels=(3 if not learn_sigma else 6),\n        num_res_blocks=num_res_blocks,\n        attention_resolutions=tuple(attention_ds),\n        dropout=dropout,\n        channel_mult=channel_mult,\n        num_classes=(NUM_CLASSES if class_cond else None),\n        use_checkpoint=use_checkpoint,\n        use_fp16=use_fp16,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        resblock_updown=resblock_updown,\n        use_new_attention_order=use_new_attention_order,\n    )\n\n\ndef create_gaussian_diffusion(\n    *,\n    steps=1000,\n    learn_sigma=False,\n    sigma_small=False,\n    noise_schedule=\"linear\",\n    use_kl=False,\n    predict_xstart=False,\n    rescale_timesteps=False,\n    rescale_learned_sigmas=False,\n    timestep_respacing=\"\",\n):\n    betas = gd.get_named_beta_schedule(noise_schedule, steps)\n    if use_kl:\n        loss_type = gd.LossType.RESCALED_KL\n    elif rescale_learned_sigmas:\n        loss_type = gd.LossType.RESCALED_MSE\n    else:\n        loss_type = gd.LossType.MSE\n    if not timestep_respacing:\n        timestep_respacing = [steps]\n    return SpacedDiffusion(\n        use_timesteps=space_timesteps(steps, timestep_respacing),\n        betas=betas,\n        model_mean_type=(gd.ModelMeanType.EPSILON if not predict_xstart else gd.ModelMeanType.START_X),\n        model_var_type=((gd.ModelVarType.FIXED_LARGE if not sigma_small else gd.ModelVarType.FIXED_SMALL)\n                        if not learn_sigma else gd.ModelVarType.LEARNED_RANGE),\n        loss_type=loss_type,\n        rescale_timesteps=rescale_timesteps,\n    )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/sec_diff.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/sec_diff.py\n'''\nimport math\nfrom dataclasses import dataclass\nfrom functools import partial\n\nimport paddle\nimport paddle.nn as nn\n\n\n@dataclass\nclass DiffusionOutput:\n    v: paddle.Tensor\n    pred: paddle.Tensor\n    eps: paddle.Tensor\n\n\nclass SkipBlock(nn.Layer):\n\n    def __init__(self, main, skip=None):\n        super().__init__()\n        self.main = nn.Sequential(*main)\n        self.skip = skip if skip else nn.Identity()\n\n    def forward(self, input):\n        return paddle.concat([self.main(input), self.skip(input)], axis=1)\n\n\ndef append_dims(x, n):\n    return x[(Ellipsis, *(None, ) * (n - x.ndim))]\n\n\ndef expand_to_planes(x, shape):\n    return paddle.tile(append_dims(x, len(shape)), [1, 1, *shape[2:]])\n\n\ndef alpha_sigma_to_t(alpha, sigma):\n    return paddle.atan2(sigma, alpha) * 2 / math.pi\n\n\ndef t_to_alpha_sigma(t):\n    return paddle.cos(t * math.pi / 2), paddle.sin(t * math.pi / 2)\n\n\nclass SecondaryDiffusionImageNet2(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        c = 64  # The base channel count\n        cs = [c, c * 2, c * 2, c * 4, c * 4, c * 8]\n\n        self.timestep_embed = FourierFeatures(1, 16)\n        self.down = nn.AvgPool2D(2)\n        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)\n\n        self.net = nn.Sequential(\n            ConvBlock(3 + 16, cs[0]),\n            ConvBlock(cs[0], cs[0]),\n            SkipBlock([\n                self.down,\n                ConvBlock(cs[0], cs[1]),\n                ConvBlock(cs[1], cs[1]),\n                SkipBlock([\n                    self.down,\n                    ConvBlock(cs[1], cs[2]),\n                    ConvBlock(cs[2], cs[2]),\n                    SkipBlock([\n                        self.down,\n                        ConvBlock(cs[2], cs[3]),\n                        
ConvBlock(cs[3], cs[3]),\n                        SkipBlock([\n                            self.down,\n                            ConvBlock(cs[3], cs[4]),\n                            ConvBlock(cs[4], cs[4]),\n                            SkipBlock([\n                                self.down,\n                                ConvBlock(cs[4], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[4]),\n                                self.up,\n                            ]),\n                            ConvBlock(cs[4] * 2, cs[4]),\n                            ConvBlock(cs[4], cs[3]),\n                            self.up,\n                        ]),\n                        ConvBlock(cs[3] * 2, cs[3]),\n                        ConvBlock(cs[3], cs[2]),\n                        self.up,\n                    ]),\n                    ConvBlock(cs[2] * 2, cs[2]),\n                    ConvBlock(cs[2], cs[1]),\n                    self.up,\n                ]),\n                ConvBlock(cs[1] * 2, cs[1]),\n                ConvBlock(cs[1], cs[0]),\n                self.up,\n            ]),\n            ConvBlock(cs[0] * 2, cs[0]),\n            nn.Conv2D(cs[0], 3, 3, padding=1),\n        )\n\n    def forward(self, input, t):\n        timestep_embed = expand_to_planes(self.timestep_embed(t[:, None]), input.shape)\n        v = self.net(paddle.concat([input, timestep_embed], axis=1))\n        alphas, sigmas = map(partial(append_dims, n=v.ndim), t_to_alpha_sigma(t))\n        pred = input * alphas - v * sigmas\n        eps = input * sigmas + v * alphas\n        return DiffusionOutput(v, pred, eps)\n\n\nclass FourierFeatures(nn.Layer):\n\n    def __init__(self, in_features, out_features, std=1.0):\n        super().__init__()\n        assert out_features % 2 == 0\n        # self.weight = nn.Parameter(paddle.randn([out_features // 2, in_features]) * std)\n      
  self.weight = paddle.create_parameter([out_features // 2, in_features],\n                                              dtype='float32',\n                                              default_initializer=nn.initializer.Normal(mean=0.0, std=std))\n\n    def forward(self, input):\n        f = 2 * math.pi * input @ self.weight.T\n        return paddle.concat([f.cos(), f.sin()], axis=-1)\n\n\nclass ConvBlock(nn.Sequential):\n\n    def __init__(self, c_in, c_out):\n        super().__init__(\n            nn.Conv2D(c_in, c_out, 3, padding=1),\n            nn.ReLU(),\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/transforms.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/pytorch/vision/blob/main/torchvision/transforms/transforms.py\n'''\nimport math\nimport numbers\nimport warnings\nfrom enum import Enum\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn.functional import grid_sample\nfrom paddle.vision import transforms as T\n\n\nclass Normalize(nn.Layer):\n\n    def __init__(self, mean, std):\n        super(Normalize, self).__init__()\n        self.mean = paddle.to_tensor(mean)\n        self.std = paddle.to_tensor(std)\n\n    def forward(self, tensor: Tensor):\n        dtype = tensor.dtype\n        mean = paddle.to_tensor(self.mean, dtype=dtype)\n        std = paddle.to_tensor(self.std, dtype=dtype)\n        mean = mean.reshape([1, -1, 1, 1])\n        std = std.reshape([1, -1, 1, 1])\n        result = tensor.subtract(mean).divide(std)\n        return result\n\n\nclass InterpolationMode(Enum):\n    \"\"\"Interpolation modes\n    Available interpolation methods are ``nearest``, ``bilinear``, ``bicubic``, ``box``, ``hamming``, and ``lanczos``.\n    \"\"\"\n\n    NEAREST = \"nearest\"\n    BILINEAR = \"bilinear\"\n    BICUBIC = \"bicubic\"\n    # For PIL compatibility\n    BOX = \"box\"\n    HAMMING = \"hamming\"\n    LANCZOS = \"lanczos\"\n\n\nclass Grayscale(nn.Layer):\n\n    def __init__(self, num_output_channels):\n        super(Grayscale, self).__init__()\n        self.num_output_channels = num_output_channels\n\n    def forward(self, x):\n        output = (0.2989 * x[:, 0:1, :, :] + 0.587 * x[:, 1:2, :, :] + 0.114 * x[:, 2:3, :, :])\n        if self.num_output_channels == 3:\n            return output.expand(x.shape)\n\n        return output\n\n\nclass Lambda(nn.Layer):\n\n    def __init__(self, 
func):\n        super(Lambda, self).__init__()\n        self.transform = func\n\n    def forward(self, x):\n        return self.transform(x)\n\n\nclass RandomGrayscale(nn.Layer):\n\n    def __init__(self, p):\n        super(RandomGrayscale, self).__init__()\n        self.prob = p\n        self.transform = Grayscale(3)\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return self.transform(x)\n        else:\n            return x\n\n\nclass RandomHorizontalFlip(nn.Layer):\n\n    def __init__(self, prob):\n        super(RandomHorizontalFlip, self).__init__()\n        self.prob = prob\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return x[:, :, :, ::-1]\n        else:\n            return x\n\n\ndef _blend(img1: Tensor, img2: Tensor, ratio: float) -> Tensor:\n    ratio = float(ratio)\n    bound = 1.0\n    return (ratio * img1 + (1.0 - ratio) * img2).clip(0, bound)\n\n\ndef trunc_div(a, b):\n    ipt = paddle.divide(a, b)\n    sign_ipt = paddle.sign(ipt)\n    abs_ipt = paddle.abs(ipt)\n    abs_ipt = paddle.floor(abs_ipt)\n    out = paddle.multiply(sign_ipt, abs_ipt)\n    return out\n\n\ndef fmod(a, b):\n    return a - trunc_div(a, b) * b\n\n\ndef _rgb2hsv(img: Tensor) -> Tensor:\n    r, g, b = img.unbind(axis=-3)\n\n    # Implementation is based on https://github.com/python-pillow/Pillow/blob/4174d4267616897df3746d315d5a2d0f82c656ee/\n    # src/libImaging/Convert.c#L330\n    maxc = paddle.max(img, axis=-3)\n    minc = paddle.min(img, axis=-3)\n\n    # The algorithm erases S and H channel where `maxc = minc`. 
This avoids NaN\n    # from happening in the results, because\n    #   + S channel has division by `maxc`, which is zero only if `maxc = minc`\n    #   + H channel has division by `(maxc - minc)`.\n    #\n    # Instead of overwriting NaN afterwards, we just prevent it from occurring so\n    # we don't need to deal with it in case we save the NaN in a buffer in\n    # backprop, if it is ever supported, but it doesn't hurt to do so.\n    eqc = maxc == minc\n\n    cr = maxc - minc\n    # Since `eqc => cr = 0`, replacing the denominator with 1 where `eqc` holds is fine.\n    ones = paddle.ones_like(maxc)\n    s = cr / paddle.where(eqc, ones, maxc)\n    # Note that `eqc => maxc = minc = r = g = b`. So the following calculation\n    # of `h` would reduce to `bc - gc + 2 + rc - bc + 4 + rc - bc = 6` so it\n    # would not matter what values `rc`, `gc`, and `bc` have here, and thus\n    # replacing the denominator with 1 where `eqc` holds is fine.\n    cr_divisor = paddle.where(eqc, ones, cr)\n    rc = (maxc - r) / cr_divisor\n    gc = (maxc - g) / cr_divisor\n    bc = (maxc - b) / cr_divisor\n\n    hr = (maxc == r).cast('float32') * (bc - gc)\n    hg = ((maxc == g) & (maxc != r)).cast('float32') * (2.0 + rc - bc)\n    hb = ((maxc != g) & (maxc != r)).cast('float32') * (4.0 + gc - rc)\n    h = hr + hg + hb\n    h = fmod((h / 6.0 + 1.0), paddle.to_tensor(1.0))\n    return paddle.stack((h, s, maxc), axis=-3)\n\n\ndef _hsv2rgb(img: Tensor) -> Tensor:\n    h, s, v = img.unbind(axis=-3)\n    i = paddle.floor(h * 6.0)\n    f = (h * 6.0) - i\n    i = i.cast(dtype='int32')\n\n    p = paddle.clip((v * (1.0 - s)), 0.0, 1.0)\n    q = paddle.clip((v * (1.0 - s * f)), 0.0, 1.0)\n    t = paddle.clip((v * (1.0 - s * (1.0 - f))), 0.0, 1.0)\n    i = i % 6\n\n    mask = i.unsqueeze(axis=-3) == paddle.arange(6).reshape([-1, 1, 1])\n\n    a1 = paddle.stack((v, q, p, p, t, v), axis=-3)\n    a2 = paddle.stack((t, v, v, q, p, p), axis=-3)\n    a3 = paddle.stack((p, p, t, v, v, q), axis=-3)\n    a4 = paddle.stack((a1, 
a2, a3), axis=-4)\n\n    return paddle.einsum(\"...ijk, ...xijk -> ...xjk\", mask.cast(dtype=img.dtype), a4)\n\n\ndef adjust_brightness(img: Tensor, brightness_factor: float) -> Tensor:\n    if brightness_factor < 0:\n        raise ValueError(f\"brightness_factor ({brightness_factor}) is not non-negative.\")\n\n    return _blend(img, paddle.zeros_like(img), brightness_factor)\n\n\ndef adjust_contrast(img: Tensor, contrast_factor: float) -> Tensor:\n    if contrast_factor < 0:\n        raise ValueError(f\"contrast_factor ({contrast_factor}) is not non-negative.\")\n\n    c = img.shape[1]\n\n    if c == 3:\n        output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n        mean = paddle.mean(output, axis=(-3, -2, -1), keepdim=True)\n\n    else:\n        mean = paddle.mean(img, axis=(-3, -2, -1), keepdim=True)\n\n    return _blend(img, mean, contrast_factor)\n\n\ndef adjust_hue(img: Tensor, hue_factor: float) -> Tensor:\n    if not (-0.5 <= hue_factor <= 0.5):\n        raise ValueError(f\"hue_factor ({hue_factor}) is not in [-0.5, 0.5].\")\n\n    img = _rgb2hsv(img)\n    h, s, v = img.unbind(axis=-3)\n    h = fmod(h + hue_factor, paddle.to_tensor(1.0))\n    img = paddle.stack((h, s, v), axis=-3)\n    img_hue_adj = _hsv2rgb(img)\n    return img_hue_adj\n\n\ndef adjust_saturation(img: Tensor, saturation_factor: float) -> Tensor:\n    if saturation_factor < 0:\n        raise ValueError(f\"saturation_factor ({saturation_factor}) is not non-negative.\")\n\n    output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n\n    return _blend(img, output, saturation_factor)\n\n\nclass ColorJitter(nn.Layer):\n\n    def __init__(self, brightness=0, contrast=0, saturation=0, hue=0):\n        super(ColorJitter, self).__init__()\n        self.brightness = self._check_input(brightness, \"brightness\")\n        self.contrast = self._check_input(contrast, \"contrast\")\n        self.saturation = 
self._check_input(saturation, \"saturation\")\n        self.hue = self._check_input(hue, \"hue\", center=0, bound=(-0.5, 0.5), clip_first_on_zero=False)\n\n    def _check_input(self, value, name, center=1, bound=(0, float(\"inf\")), clip_first_on_zero=True):\n        if isinstance(value, numbers.Number):\n            if value < 0:\n                raise ValueError(f\"If {name} is a single number, it must be non-negative.\")\n            value = [center - float(value), center + float(value)]\n            if clip_first_on_zero:\n                value[0] = max(value[0], 0.0)\n        elif isinstance(value, (tuple, list)) and len(value) == 2:\n            if not bound[0] <= value[0] <= value[1] <= bound[1]:\n                raise ValueError(f\"{name} values should be between {bound}\")\n        else:\n            raise TypeError(f\"{name} should be a single number or a list/tuple with length 2.\")\n\n        # if value is 0 or (1., 1.) for brightness/contrast/saturation\n        # or (0., 0.) for hue, do nothing\n        if value[0] == value[1] == center:\n            value = None\n        return value\n\n    @staticmethod\n    def get_params(\n        brightness: Optional[List[float]],\n        contrast: Optional[List[float]],\n        saturation: Optional[List[float]],\n        hue: Optional[List[float]],\n    ) -> Tuple[Tensor, Optional[float], Optional[float], Optional[float], Optional[float]]:\n        \"\"\"Get the parameters for the randomized transform to be applied on image.\n\n        Args:\n            brightness (tuple of float (min, max), optional): The range from which the brightness_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            contrast (tuple of float (min, max), optional): The range from which the contrast_factor is chosen\n                uniformly. 
Pass None to turn off the transformation.\n            saturation (tuple of float (min, max), optional): The range from which the saturation_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            hue (tuple of float (min, max), optional): The range from which the hue_factor is chosen uniformly.\n                Pass None to turn off the transformation.\n\n        Returns:\n            tuple: The parameters used to apply the randomized transform\n            along with their random order.\n        \"\"\"\n        fn_idx = paddle.randperm(4)\n\n        b = None if brightness is None else paddle.empty([1]).uniform_(brightness[0], brightness[1])\n        c = None if contrast is None else paddle.empty([1]).uniform_(contrast[0], contrast[1])\n        s = None if saturation is None else paddle.empty([1]).uniform_(saturation[0], saturation[1])\n        h = None if hue is None else paddle.empty([1]).uniform_(hue[0], hue[1])\n\n        return fn_idx, b, c, s, h\n\n    def forward(self, img):\n        \"\"\"\n        Args:\n            img (PIL Image or Tensor): Input image.\n\n        Returns:\n            PIL Image or Tensor: Color jittered image.\n        \"\"\"\n        fn_idx, brightness_factor, contrast_factor, saturation_factor, hue_factor = self.get_params(\n            self.brightness, self.contrast, self.saturation, self.hue)\n\n        for fn_id in fn_idx:\n            if fn_id == 0 and brightness_factor is not None:\n                img = adjust_brightness(img, brightness_factor)\n            elif fn_id == 1 and contrast_factor is not None:\n                img = adjust_contrast(img, contrast_factor)\n            elif fn_id == 2 and saturation_factor is not None:\n                img = adjust_saturation(img, saturation_factor)\n            elif fn_id == 3 and hue_factor is not None:\n                img = adjust_hue(img, hue_factor)\n\n        return img\n\n    def __repr__(self) -> str:\n        s = 
(f\"{self.__class__.__name__}(\"\n             f\"brightness={self.brightness}\"\n             f\", contrast={self.contrast}\"\n             f\", saturation={self.saturation}\"\n             f\", hue={self.hue})\")\n        return s\n\n\ndef _apply_grid_transform(img: Tensor, grid: Tensor, mode: str, fill: Optional[List[float]]) -> Tensor:\n\n    if img.shape[0] > 1:\n        # Apply same grid to a batch of images\n        grid = grid.expand([img.shape[0], grid.shape[1], grid.shape[2], grid.shape[3]])\n\n    # Append a dummy mask for customized fill colors, should be faster than grid_sample() twice\n    if fill is not None:\n        dummy = paddle.ones((img.shape[0], 1, img.shape[2], img.shape[3]), dtype=img.dtype)\n        img = paddle.concat((img, dummy), axis=1)\n\n    img = grid_sample(img, grid, mode=mode, padding_mode=\"zeros\", align_corners=False)\n\n    # Fill with required color\n    if fill is not None:\n        mask = img[:, -1:, :, :]  # N * 1 * H * W\n        img = img[:, :-1, :, :]  # N * C * H * W\n        mask = mask.expand_as(img)\n        len_fill = len(fill) if isinstance(fill, (tuple, list)) else 1\n        fill_img = paddle.to_tensor(fill, dtype=img.dtype).reshape([1, len_fill, 1, 1]).expand_as(img)\n        if mode == \"nearest\":\n            mask = mask < 0.5\n            img[mask] = fill_img[mask]\n        else:  # 'bilinear'\n            img = img * mask + (1.0 - mask) * fill_img\n    return img\n\n\ndef _gen_affine_grid(\n    theta: Tensor,\n    w: int,\n    h: int,\n    ow: int,\n    oh: int,\n) -> Tensor:\n    # https://github.com/pytorch/pytorch/blob/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/aten/src/ATen/native/\n    # AffineGridGenerator.cpp#L18\n    # Difference with AffineGridGenerator is that:\n    # 1) we normalize grid values after applying theta\n    # 2) we can normalize by other image size, such that it covers \"extend\" option like in PIL.Image.rotate\n\n    d = 0.5\n    base_grid = paddle.empty([1, oh, ow, 3], 
dtype=theta.dtype)\n    x_grid = paddle.linspace(-ow * 0.5 + d, ow * 0.5 + d - 1, num=ow)\n    base_grid[..., 0] = (x_grid)\n    y_grid = paddle.linspace(-oh * 0.5 + d, oh * 0.5 + d - 1, num=oh).unsqueeze_(-1)\n    base_grid[..., 1] = (y_grid)\n    base_grid[..., 2] = 1.0\n    rescaled_theta = theta.transpose([0, 2, 1]) / paddle.to_tensor([0.5 * w, 0.5 * h], dtype=theta.dtype)\n    output_grid = base_grid.reshape([1, oh * ow, 3]).bmm(rescaled_theta)\n    return output_grid.reshape([1, oh, ow, 2])\n\n\ndef affine_impl(img: Tensor,\n                matrix: List[float],\n                interpolation: str = \"nearest\",\n                fill: Optional[List[float]] = None) -> Tensor:\n    theta = paddle.to_tensor(matrix, dtype=img.dtype).reshape([1, 2, 3])\n    shape = img.shape\n    # grid will be generated on the same device as theta and img\n    grid = _gen_affine_grid(theta, w=shape[-1], h=shape[-2], ow=shape[-1], oh=shape[-2])\n    return _apply_grid_transform(img, grid, interpolation, fill=fill)\n\n\ndef _get_inverse_affine_matrix(center: List[float],\n                               angle: float,\n                               translate: List[float],\n                               scale: float,\n                               shear: List[float],\n                               inverted: bool = True) -> List[float]:\n    # Helper method to compute inverse matrix for affine transformation\n\n    # Pillow requires inverse affine transformation matrix:\n    # Affine matrix is : M = T * C * RotateScaleShear * C^-1\n    #\n    # where T is translation matrix: [1, 0, tx | 0, 1, ty | 0, 0, 1]\n    #       C is translation matrix to keep center: [1, 0, cx | 0, 1, cy | 0, 0, 1]\n    #       RotateScaleShear is rotation with scale and shear matrix\n    #\n    #       RotateScaleShear(a, s, (sx, sy)) =\n    #       = R(a) * S(s) * SHy(sy) * SHx(sx)\n    #       = [ s*cos(a - sy)/cos(sy), s*(-cos(a - sy)*tan(sx)/cos(sy) - sin(a)), 0 ]\n    #         [ s*sin(a + sy)/cos(sy), 
s*(-sin(a - sy)*tan(sx)/cos(sy) + cos(a)), 0 ]\n    #         [ 0                    , 0                                      , 1 ]\n    # where R is a rotation matrix, S is a scaling matrix, and SHx and SHy are the shears:\n    # SHx(s) = [1, -tan(s)] and SHy(s) = [1      , 0]\n    #          [0, 1      ]              [-tan(s), 1]\n    #\n    # Thus, the inverse is M^-1 = C * RotateScaleShear^-1 * C^-1 * T^-1\n\n    rot = math.radians(angle)\n    sx = math.radians(shear[0])\n    sy = math.radians(shear[1])\n\n    cx, cy = center\n    tx, ty = translate\n\n    # RSS without scaling\n    a = math.cos(rot - sy) / math.cos(sy)\n    b = -math.cos(rot - sy) * math.tan(sx) / math.cos(sy) - math.sin(rot)\n    c = math.sin(rot - sy) / math.cos(sy)\n    d = -math.sin(rot - sy) * math.tan(sx) / math.cos(sy) + math.cos(rot)\n\n    if inverted:\n        # Inverted rotation matrix with scale and shear\n        # det([[a, b], [c, d]]) == 1, since det(rotation) = 1 and det(shear) = 1\n        matrix = [d, -b, 0.0, -c, a, 0.0]\n        matrix = [x / scale for x in matrix]\n        # Apply inverse of translation and of center translation: RSS^-1 * C^-1 * T^-1\n        matrix[2] += matrix[0] * (-cx - tx) + matrix[1] * (-cy - ty)\n        matrix[5] += matrix[3] * (-cx - tx) + matrix[4] * (-cy - ty)\n        # Apply center translation: C * RSS^-1 * C^-1 * T^-1\n        matrix[2] += cx\n        matrix[5] += cy\n    else:\n        matrix = [a, b, 0.0, c, d, 0.0]\n        matrix = [x * scale for x in matrix]\n        # Apply inverse of center translation: RSS * C^-1\n        matrix[2] += matrix[0] * (-cx) + matrix[1] * (-cy)\n        matrix[5] += matrix[3] * (-cx) + matrix[4] * (-cy)\n        # Apply translation and center : T * C * RSS * C^-1\n        matrix[2] += cx + tx\n        matrix[5] += cy + ty\n\n    return matrix\n\n\ndef affine(\n    img: Tensor,\n    angle: float,\n    translate: List[int],\n    scale: float,\n    shear: List[float],\n    interpolation: InterpolationMode = 
InterpolationMode.NEAREST,\n    fill: Optional[List[float]] = None,\n    resample: Optional[int] = None,\n    fillcolor: Optional[List[float]] = None,\n    center: Optional[List[int]] = None,\n) -> Tensor:\n    \"\"\"Apply affine transformation on the image keeping image center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... means an arbitrary number of leading dimensions.\n\n    Args:\n        img (PIL Image or Tensor): image to transform.\n        angle (number): rotation angle in degrees between -180 and 180, clockwise direction.\n        translate (sequence of integers): horizontal and vertical translations (post-rotation translation)\n        scale (float): overall scale\n        shear (float or sequence): shear angle value in degrees between -180 to 180, clockwise direction.\n            If a sequence is specified, the first value corresponds to a shear parallel to the x axis, while\n            the second value corresponds to a shear parallel to the y axis.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. Please use InterpolationMode enum.\n        fill (sequence or number, optional): Pixel fill value for the area outside the transformed\n            image. If given a number, the value is used for all bands respectively.\n\n            .. note::\n                In torchscript mode single int/float value is not supported, please use a sequence\n                of length 1: ``[value, ]``.\n        fillcolor (sequence or number, optional):\n            .. 
warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation. Origin is the upper left corner.\n            Default is the center of the image.\n\n    Returns:\n        PIL Image or Tensor: Transformed image.\n    \"\"\"\n\n    # Backward compatibility with integer value\n    if isinstance(interpolation, int):\n        warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                      \"Please use InterpolationMode enum.\")\n        interpolation = _interpolation_modes_from_int(interpolation)\n\n    if fillcolor is not None:\n        warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                      \"Please use 'fill' instead.\")\n        fill = fillcolor\n\n    if not isinstance(angle, (int, float)):\n        raise TypeError(\"Argument angle should be int or float\")\n\n    if not isinstance(translate, (list, tuple)):\n        raise TypeError(\"Argument translate should be a sequence\")\n\n    if len(translate) != 2:\n        raise ValueError(\"Argument translate should be a sequence of length 2\")\n\n    if scale <= 0.0:\n        raise ValueError(\"Argument scale should be positive\")\n\n    if not isinstance(shear, (numbers.Number, (list, tuple))):\n        raise TypeError(\"Shear should be either a single value or a sequence of two values\")\n\n    if not isinstance(interpolation, InterpolationMode):\n        raise TypeError(\"Argument interpolation should be a InterpolationMode\")\n\n    if isinstance(angle, int):\n        angle = float(angle)\n\n    if isinstance(translate, tuple):\n        translate = list(translate)\n\n    if isinstance(shear, numbers.Number):\n        shear = [shear, 0.0]\n\n    if isinstance(shear, tuple):\n        shear = list(shear)\n\n    if len(shear) == 1:\n        shear = [shear[0], shear[0]]\n\n    if len(shear) != 2:\n        raise ValueError(f\"Shear should be a sequence containing two values. 
Got {shear}\")\n\n    if center is not None and not isinstance(center, (list, tuple)):\n        raise TypeError(\"Argument center should be a sequence\")\n    center_f = [0.0, 0.0]\n    if center is not None:\n        _, height, width = img.shape[0], img.shape[1], img.shape[2]\n        # Center values should be in pixel coordinates but translated such that (0, 0) corresponds to image center.\n        center_f = [1.0 * (c - s * 0.5) for c, s in zip(center, [width, height])]\n\n    translate_f = [1.0 * t for t in translate]\n    matrix = _get_inverse_affine_matrix(center_f, angle, translate_f, scale, shear)\n    return affine_impl(img, matrix=matrix, interpolation=interpolation.value, fill=fill)\n\n\ndef _interpolation_modes_from_int(i: int) -> InterpolationMode:\n    inverse_modes_mapping = {\n        0: InterpolationMode.NEAREST,\n        2: InterpolationMode.BILINEAR,\n        3: InterpolationMode.BICUBIC,\n        4: InterpolationMode.BOX,\n        5: InterpolationMode.HAMMING,\n        1: InterpolationMode.LANCZOS,\n    }\n    return inverse_modes_mapping[i]\n\n\ndef _check_sequence_input(x, name, req_sizes):\n    msg = req_sizes[0] if len(req_sizes) < 2 else \" or \".join([str(s) for s in req_sizes])\n    if not isinstance(x, Sequence):\n        raise TypeError(f\"{name} should be a sequence of length {msg}.\")\n    if len(x) not in req_sizes:\n        raise ValueError(f\"{name} should be sequence of length {msg}.\")\n\n\ndef _setup_angle(x, name, req_sizes=(2, )):\n    if isinstance(x, numbers.Number):\n        if x < 0:\n            raise ValueError(f\"If {name} is a single number, it must be positive.\")\n        x = [-x, x]\n    else:\n        _check_sequence_input(x, name, req_sizes)\n\n    return [float(d) for d in x]\n\n\nclass RandomAffine(nn.Layer):\n    \"\"\"Random affine transformation of the image keeping center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... 
means an arbitrary number of leading dimensions.\n\n    Args:\n        degrees (sequence or number): Range of degrees to select from.\n            If degrees is a number instead of sequence like (min, max), the range of degrees\n            will be (-degrees, +degrees). Set to 0 to deactivate rotations.\n        translate (tuple, optional): tuple of maximum absolute fraction for horizontal\n            and vertical translations. For example translate=(a, b), then horizontal shift\n            is randomly sampled in the range -img_width * a < dx < img_width * a and vertical shift is\n            randomly sampled in the range -img_height * b < dy < img_height * b. Will not translate by default.\n        scale (tuple, optional): scaling factor interval, e.g (a, b), then scale is\n            randomly sampled from the range a <= scale <= b. Will keep original scale by default.\n        shear (sequence or number, optional): Range of degrees to select from.\n            If shear is a number, a shear parallel to the x axis in the range (-shear, +shear)\n            will be applied. Else if shear is a sequence of 2 values a shear parallel to the x axis in the\n            range (shear[0], shear[1]) will be applied. Else if shear is a sequence of 4 values,\n            a x-axis shear in (shear[0], shear[1]) and y-axis shear in (shear[2], shear[3]) will be applied.\n            Will not apply shear by default.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. 
Please use InterpolationMode enum.\n        fill (sequence or number): Pixel fill value for the area outside the transformed\n            image. Default is ``0``. If given a number, the value is used for all bands respectively.\n        fillcolor (sequence or number, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation, (x, y). Origin is the upper left corner.\n            Default is the center of the image.\n\n    .. _filters: https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters\n\n    \"\"\"\n\n    def __init__(\n        self,\n        degrees,\n        translate=None,\n        scale=None,\n        shear=None,\n        interpolation=InterpolationMode.NEAREST,\n        fill=0,\n        fillcolor=None,\n        resample=None,\n        center=None,\n    ):\n        super(RandomAffine, self).__init__()\n        if resample is not None:\n            warnings.warn(\"The parameter 'resample' is deprecated since 0.12 and will be removed in 0.14. \"\n                          \"Please use 'interpolation' instead.\")\n            interpolation = _interpolation_modes_from_int(resample)\n\n        # Backward compatibility with integer value\n        if isinstance(interpolation, int):\n            warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                          \"Please use InterpolationMode enum.\")\n            interpolation = _interpolation_modes_from_int(interpolation)\n\n        if fillcolor is not None:\n            warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                          \"Please use 'fill' instead.\")\n            fill = fillcolor\n\n        self.degrees = _setup_angle(degrees, name=\"degrees\", req_sizes=(2, ))\n\n        if translate is not None:\n            _check_sequence_input(translate, \"translate\", req_sizes=(2, ))\n            for t in translate:\n                if not (0.0 <= t <= 1.0):\n                    raise ValueError(\"translation values should be between 0 and 1\")\n        self.translate = translate\n\n        if scale is not None:\n            _check_sequence_input(scale, \"scale\", req_sizes=(2, ))\n            for s in scale:\n                if s <= 0:\n                    raise ValueError(\"scale values should be positive\")\n        self.scale = scale\n\n        if shear is not None:\n            self.shear = _setup_angle(shear, name=\"shear\", req_sizes=(2, 4))\n        else:\n            self.shear = shear\n\n        self.resample = self.interpolation = interpolation\n\n        if fill is None:\n            fill = 0\n        elif not isinstance(fill, (Sequence, numbers.Number)):\n            raise TypeError(\"Fill should be either a sequence or a number.\")\n\n        self.fillcolor = self.fill = fill\n\n        if center is not None:\n            _check_sequence_input(center, \"center\", req_sizes=(2, ))\n\n        self.center = center\n\n    @staticmethod\n    def get_params(\n        degrees: List[float],\n        translate: Optional[List[float]],\n        scale_ranges: Optional[List[float]],\n        shears: Optional[List[float]],\n        img_size: List[int],\n    ) -> Tuple[float, Tuple[int, int], float, Tuple[float, float]]:\n        \"\"\"Get parameters for affine transformation\n\n        Returns:\n            params to be passed to the affine transformation\n        \"\"\"\n        angle = float(paddle.empty([1]).uniform_(float(degrees[0]), float(degrees[1])))\n        if translate is not None:\n            max_dx = float(translate[0] * img_size[0])\n            
max_dy = float(translate[1] * img_size[1])\n            tx = int(float(paddle.empty([1]).uniform_(-max_dx, max_dx)))\n            ty = int(float(paddle.empty([1]).uniform_(-max_dy, max_dy)))\n            translations = (tx, ty)\n        else:\n            translations = (0, 0)\n\n        if scale_ranges is not None:\n            scale = float(paddle.empty([1]).uniform_(scale_ranges[0], scale_ranges[1]))\n        else:\n            scale = 1.0\n\n        shear_x = shear_y = 0.0\n        if shears is not None:\n            shear_x = float(paddle.empty([1]).uniform_(shears[0], shears[1]))\n            if len(shears) == 4:\n                shear_y = float(paddle.empty([1]).uniform_(shears[2], shears[3]))\n\n        shear = (shear_x, shear_y)\n\n        return angle, translations, scale, shear\n\n    def forward(self, img):\n        fill = self.fill\n        channels, height, width = img.shape[1], img.shape[2], img.shape[3]\n        if isinstance(fill, (int, float)):\n            fill = [float(fill)] * channels\n        else:\n            fill = [float(f) for f in fill]\n\n        img_size = [width, height]  # flip for keeping BC on get_params call\n\n        ret = self.get_params(self.degrees, self.translate, self.scale, self.shear, img_size)\n\n        return affine(img, *ret, interpolation=self.interpolation, fill=fill, center=self.center)\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(degrees={self.degrees}\"\n        s += f\", translate={self.translate}\" if self.translate is not None else \"\"\n        s += f\", scale={self.scale}\" if self.scale is not None else \"\"\n        s += f\", shear={self.shear}\" if self.shear is not None else \"\"\n        s += f\", interpolation={self.interpolation.value}\" if self.interpolation != InterpolationMode.NEAREST else \"\"\n        s += f\", fill={self.fill}\" if self.fill != 0 else \"\"\n        s += f\", center={self.center}\" if self.center is not None else \"\"\n        s += \")\"\n\n        
return s\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/model/unet.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/unet.py\n'''\nimport math\nfrom abc import abstractmethod\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .nn import avg_pool_nd\nfrom .nn import checkpoint\nfrom .nn import conv_nd\nfrom .nn import linear\nfrom .nn import normalization\nfrom .nn import SiLU\nfrom .nn import timestep_embedding\nfrom .nn import zero_module\n\n\nclass AttentionPool2d(nn.Layer):\n    \"\"\"\n    Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py\n    \"\"\"\n\n    def __init__(\n        self,\n        spacial_dim: int,\n        embed_dim: int,\n        num_heads_channels: int,\n        output_dim: int = None,\n    ):\n        super().__init__()\n        # self.positional_embedding = nn.Parameter(\n        #     th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5\n        # )\n        positional_embedding = self.create_parameter(\n            shape=[embed_dim, spacial_dim**2 + 1],\n            default_initializer=nn.initializer.Assign(paddle.randn([embed_dim, spacial_dim**2 + 1]) / embed_dim**0.5))\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)\n        self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)\n        self.num_heads = embed_dim // num_heads_channels\n        self.attention = QKVAttention(self.num_heads)\n\n    def forward(self, x):\n        b, c, *_spatial = x.shape\n        # x = x.reshape(b, c, -1)  # NC(HW)\n        x = paddle.reshape(x, [b, c, -1])\n        x = paddle.concat([x.mean(axis=-1, keepdim=True), x], axis=-1)  # NC(HW+1)\n        x = x + paddle.cast(self.positional_embedding[None, :, :], x.dtype)  # NC(HW+1)\n        x = self.qkv_proj(x)\n        x = self.attention(x)\n        x = self.c_proj(x)\n        return x[:, :, 0]\n\n\nclass TimestepBlock(nn.Layer):\n    \"\"\"\n    Any module where forward() takes timestep embeddings as a second argument.\n 
   \"\"\"\n\n    @abstractmethod\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the module to `x` given `emb` timestep embeddings.\n        \"\"\"\n\n\nclass TimestepEmbedSequential(nn.Sequential, TimestepBlock):\n    \"\"\"\n    A sequential module that passes timestep embeddings to the children that\n    support it as an extra input.\n    \"\"\"\n\n    def forward(self, x, emb):\n        for layer in self:\n            if isinstance(layer, TimestepBlock):\n                x = layer(x, emb)\n            else:\n                x = layer(x)\n        return x\n\n\nclass Upsample(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        if use_conv:\n            self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=1)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.dims == 3:\n            x = F.interpolate(x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode=\"nearest\")\n        else:\n            x = F.interpolate(x, scale_factor=2, mode=\"nearest\")\n        if self.use_conv:\n            x = self.conv(x)\n        return x\n\n\nclass Downsample(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        stride = 2 if dims != 3 else (1, 2, 2)\n        if use_conv:\n            self.op = conv_nd(dims, self.channels, self.out_channels, 3, stride=stride, padding=1)\n        else:\n            assert self.channels == self.out_channels\n            self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        return self.op(x)\n\n\nclass ResBlock(TimestepBlock):\n    \"\"\"\n    A residual block that can optionally change the number of channels.\n\n    :param channels: the number of input channels.\n    :param emb_channels: the number of timestep embedding channels.\n    :param dropout: the rate of dropout.\n    :param out_channels: if specified, the number of out channels.\n    :param use_conv: if True and out_channels is specified, use a spatial\n        convolution instead of a smaller 1x1 convolution to change the\n        channels in the skip connection.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param use_checkpoint: if True, use gradient checkpointing on this module.\n    :param up: if True, use this block for upsampling.\n    :param down: if True, use this block for downsampling.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        emb_channels,\n        dropout,\n        out_channels=None,\n        use_conv=False,\n        use_scale_shift_norm=False,\n        dims=2,\n        use_checkpoint=False,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        self.emb_channels = emb_channels\n        self.dropout = dropout\n        
self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_checkpoint = use_checkpoint\n        self.use_scale_shift_norm = use_scale_shift_norm\n\n        self.in_layers = nn.Sequential(\n            normalization(channels),\n            SiLU(),\n            conv_nd(dims, channels, self.out_channels, 3, padding=1),\n        )\n\n        self.updown = up or down\n\n        if up:\n            self.h_upd = Upsample(channels, False, dims)\n            self.x_upd = Upsample(channels, False, dims)\n        elif down:\n            self.h_upd = Downsample(channels, False, dims)\n            self.x_upd = Downsample(channels, False, dims)\n        else:\n            self.h_upd = self.x_upd = nn.Identity()\n\n        self.emb_layers = nn.Sequential(\n            SiLU(),\n            linear(\n                emb_channels,\n                2 * self.out_channels if use_scale_shift_norm else self.out_channels,\n            ),\n        )\n        self.out_layers = nn.Sequential(\n            normalization(self.out_channels),\n            SiLU(),\n            nn.Dropout(p=dropout),\n            zero_module(conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)),\n        )\n\n        if self.out_channels == channels:\n            self.skip_connection = nn.Identity()\n        elif use_conv:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 3, padding=1)\n        else:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)\n\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the block to a Tensor, conditioned on a timestep embedding.\n\n        :param x: an [N x C x ...] Tensor of features.\n        :param emb: an [N x emb_channels] Tensor of timestep embeddings.\n        :return: an [N x C x ...] 
Tensor of outputs.\n        \"\"\"\n        return checkpoint(self._forward, (x, emb), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x, emb):\n        if self.updown:\n            in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]\n            h = in_rest(x)\n            h = self.h_upd(h)\n            x = self.x_upd(x)\n            h = in_conv(h)\n        else:\n            h = self.in_layers(x)\n        emb_out = self.emb_layers(emb)\n        emb_out = paddle.cast(emb_out, h.dtype)\n        while len(emb_out.shape) < len(h.shape):\n            emb_out = emb_out[..., None]\n        if self.use_scale_shift_norm:\n            out_norm, out_rest = self.out_layers[0], self.out_layers[1:]\n            scale, shift = paddle.chunk(emb_out, 2, axis=1)\n            h = out_norm(h) * (1 + scale) + shift\n            h = out_rest(h)\n        else:\n            h = h + emb_out\n            h = self.out_layers(h)\n        return self.skip_connection(x) + h\n\n\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=-1,\n        use_checkpoint=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels == -1:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n        self.use_checkpoint = use_checkpoint\n        self.norm = normalization(channels)\n        self.qkv = 
conv_nd(1, channels, channels * 3, 1)\n        if use_new_attention_order:\n            # split qkv before split heads\n            self.attention = QKVAttention(self.num_heads)\n        else:\n            # split heads before split qkv\n            self.attention = QKVAttentionLegacy(self.num_heads)\n\n        self.proj_out = zero_module(conv_nd(1, channels, channels, 1))\n\n    def forward(self, x):\n        return checkpoint(self._forward, (x, ), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x):\n        b, c, *spatial = x.shape\n        # x = x.reshape(b, c, -1)\n        x = paddle.reshape(x, [b, c, -1])\n        qkv = self.qkv(self.norm(x))\n        h = self.attention(qkv)\n        h = self.proj_out(h)\n        # return (x + h).reshape(b, c, *spatial)\n        return paddle.reshape(x + h, [b, c, *spatial])\n\n\ndef count_flops_attn(model, _x, y):\n    \"\"\"\n    A counter for the `thop` package to count the operations in an\n    attention operation.\n    Meant to be used like:\n        macs, params = thop.profile(\n            model,\n            inputs=(inputs, timestamps),\n            custom_ops={QKVAttention: QKVAttention.count_flops},\n        )\n    \"\"\"\n    b, c, *spatial = y[0].shape\n    num_spatial = int(np.prod(spatial))\n    # We perform two matmuls with the same number of ops.\n    # The first computes the weight matrix, the second computes\n    # the combination of the value vectors.\n    matmul_ops = 2 * b * (num_spatial**2) * c\n    model.total_ops += paddle.to_tensor([matmul_ops], dtype='float64')\n\n\nclass QKVAttentionLegacy(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention. 
Matches legacy QKVAttention + input/output heads shaping\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)\n        q, k, v = paddle.reshape(qkv, [bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass QKVAttention(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention and splits in a different order.\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.chunk(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        # Paddle tensors have no torch-style .view(); use paddle.reshape with a shape list.\n        weight = paddle.einsum(\n            \"bct,bcs->bts\",\n            paddle.reshape(q * scale, [bs * self.n_heads, ch, length]),\n            paddle.reshape(k * scale, [bs * self.n_heads, ch, length]),\n        )  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, paddle.reshape(v, [bs * self.n_heads, ch, length]))\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass UNetModel(nn.Layer):\n    \"\"\"\n    The full UNet model with attention and timestep embedding.\n\n    :param in_channels: channels in the input Tensor.\n    :param model_channels: base channel count for the model.\n    :param out_channels: channels in the output Tensor.\n    :param num_res_blocks: number of residual blocks per downsample.\n    :param attention_resolutions: a collection of downsample rates at which\n        attention will take place. 
May be a set, list, or tuple.\n        For example, if this contains 4, then at 4x downsampling, attention\n        will be used.\n    :param dropout: the dropout probability.\n    :param channel_mult: channel multiplier for each level of the UNet.\n    :param conv_resample: if True, use learned convolutions for upsampling and\n        downsampling.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param num_classes: if specified (as an int), then this model will be\n        class-conditional with `num_classes` classes.\n    :param use_checkpoint: use gradient checkpointing to reduce memory usage.\n    :param num_heads: the number of attention heads in each attention layer.\n    :param num_heads_channels: if specified, ignore num_heads and instead use\n                               a fixed channel width per attention head.\n    :param num_heads_upsample: works with num_heads to set a different number\n                               of heads for upsampling. Deprecated.\n    :param use_scale_shift_norm: use a FiLM-like conditioning mechanism.\n    :param resblock_updown: use residual blocks for up/downsampling.\n    :param use_new_attention_order: use a different attention pattern for potentially\n                                    increased efficiency.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        num_classes=None,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        
self.image_size = image_size\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.num_classes = num_classes\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        if self.num_classes is not None:\n            self.label_emb = nn.Embedding(num_classes, time_embed_dim)\n\n        ch = input_ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n           
                 ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n                self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n          
      dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n\n        self.output_blocks = nn.LayerList([])\n        for level, mult in list(enumerate(channel_mult))[::-1]:\n            for i in range(num_res_blocks + 1):\n                ich = input_block_chans.pop()\n                layers = [\n                    ResBlock(\n                        ch + ich,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(model_channels * mult),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(model_channels * mult)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads_upsample,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                if level and i == num_res_blocks:\n                    out_ch = ch\n                    layers.append(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            up=True,\n                        ) if resblock_updown else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch))\n                    ds 
//= 2\n                self.output_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n\n        self.out = nn.Sequential(\n            normalization(ch),\n            SiLU(),\n            zero_module(conv_nd(dims, input_ch, out_channels, 3, padding=1)),\n        )\n\n    def forward(self, x, timesteps, y=None):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :param y: an [N] Tensor of labels, if class-conditional.\n        :return: an [N x C x ...] Tensor of outputs.\n        \"\"\"\n        assert (y is not None) == (self.num_classes\n                                   is not None), \"must specify y if and only if the model is class-conditional\"\n\n        hs = []\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n        if self.num_classes is not None:\n            # Paddle Tensor.shape is a list, so compare against a list rather than a tuple.\n            assert y.shape == [x.shape[0]]\n            emb = emb + self.label_emb(y)\n\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            hs.append(h)\n        h = self.middle_block(h, emb)\n        for module in self.output_blocks:\n            h = paddle.concat([h, hs.pop()], axis=1)\n            h = module(h, emb)\n        # h = paddle.cast(h, x.dtype)\n        return self.out(h)\n\n\nclass SuperResModel(UNetModel):\n    \"\"\"\n    A UNetModel that performs super-resolution.\n\n    Expects an extra kwarg `low_res` to condition on a low-resolution image.\n    \"\"\"\n\n    def __init__(self, image_size, in_channels, *args, **kwargs):\n        super().__init__(image_size, in_channels * 2, *args, **kwargs)\n\n    def forward(self, x, timesteps, low_res=None, **kwargs):\n        _, _, new_height, new_width = x.shape\n        upsampled = F.interpolate(low_res, (new_height, new_width), mode=\"bilinear\")\n        x = paddle.concat([x, 
upsampled], axis=1)\n        return super().forward(x, timesteps, **kwargs)\n\n\nclass EncoderUNetModel(nn.Layer):\n    \"\"\"\n    The half UNet model with attention and timestep embedding.\n\n    For usage, see UNet.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n        pool=\"adaptive\",\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        
ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n               
 self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n        self.pool = pool\n        if pool == \"adaptive\":\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                nn.AdaptiveAvgPool2D((1, 1)),\n                zero_module(conv_nd(dims, ch, out_channels, 1)),\n                nn.Flatten(),\n            )\n        elif pool == \"attention\":\n            assert num_head_channels != -1\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                AttentionPool2d((image_size // ds), ch, num_head_channels, out_channels),\n            )\n        elif pool == \"spatial\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                nn.ReLU(),\n                nn.Linear(2048, self.out_channels),\n            )\n        elif pool == \"spatial_v2\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                normalization(2048),\n                SiLU(),\n                nn.Linear(2048, self.out_channels),\n            
)\n        else:\n            raise NotImplementedError(f\"Unexpected {pool} pooling\")\n\n    def forward(self, x, timesteps):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :return: an [N x K] Tensor of outputs.\n        \"\"\"\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n\n        results = []\n        # h = x.type(self.dtype)\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            if self.pool.startswith(\"spatial\"):\n                # results.append(h.type(x.dtype).mean(axis=(2, 3)))\n                results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n        h = self.middle_block(h, emb)\n        if self.pool.startswith(\"spatial\"):\n            results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n            h = paddle.concat(results, axis=-1)\n            return self.out(h)\n        else:\n            # h = h.type(x.dtype)\n            h = paddle.cast(h, x.dtype)\n            return self.out(h)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/resources/default.yml",
    "content": "text_prompts:\n  - greg rutkowski和thomas kinkade在artstation上的一幅美丽的画，一个独特的灯塔，照耀着它的光穿过喧嚣的血海。\n\ninit_image:\nwidth_height: [ 1280, 768]\n\nskip_steps: 10\nsteps: 250\n\ncut_ic_pow: 1\ninit_scale: 1000\nclip_guidance_scale: 5000\n\ntv_scale: 0\nrange_scale: 150\nsat_scale: 0\ncutn_batches: 4\n\ndiffusion_model: 512x512_diffusion_uncond_finetune_008100\nuse_secondary_model: True\ndiffusion_sampling_mode: ddim\n\nperlin_init: False\nperlin_mode: mixed\nseed: 445467575\neta: 0.8\nclamp_grad: True\nclamp_max: 0.05\n\nrandomize_class: True\nclip_denoised: False\nfuzzy_prompt: False\nrand_mag: 0.05\n\ncut_overview: \"[12]*400+[4]*600\"\ncut_innercut: \"[4]*400+[12]*600\"\ncut_icgray_p: \"[0.2]*400+[0]*600\"\n\ndisplay_rate: 10\nn_batches: 1\nbatch_size: 1\nbatch_name: ''\nclip_models:\n  - ViTB16\noutput_dir: \"./\"\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/resources/docstrings.yml",
    "content": "text_prompts: |\n  Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n  Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments.\n  Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\ninit_image: |\n  Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here.\n  If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\nwidth_height: |\n  Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n\nskip_steps: |\n  Consider the chart shown here.  
Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.\n  As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.\n  The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.\n  If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.\n  Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.\n  Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image.\n  However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n\nsteps: |\n  When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.\n  Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.\n  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n\ncut_ic_pow: |\n  This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ninit_scale: |\n  This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\nclip_guidance_scale: |\n  CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS.\n  Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500.\n  Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\ntv_scale: |\n  Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\nrange_scale: |\n  Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n\nsat_scale: |\n  Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\ncutn_batches: |\n  Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.\n  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage.\n  At the default settings, DD is scheduled to do 16 cuts per timestep.  
If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep.\n  However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.\n  So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n\ndiffusion_model: Diffusion_model of choice.\n\nuse_secondary_model: |\n  Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  I suggest you experiment with this.\n\ndiffusion_sampling_mode: |\n  Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n\nperlin_init: |\n  Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps).\n  Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\n\nperlin_mode: |\n  Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\nseed: |\n  Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar.\n  After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\neta: |\n  eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. 
As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results.\n  The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\nclamp_grad: |\n  As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\nclamp_max: |\n  Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n\nrandomize_class:\nclip_denoised: False\nfuzzy_prompt: |\n  Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\nrand_mag: |\n  Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n\ncut_overview: The schedule of overview cuts\ncut_innercut: The schedule of inner cuts\ncut_icgray_p: The schedule of grayscale inner cuts, i.e. the portion of inner cuts that are converted to grayscale before CLIP evaluation.\n\ndisplay_rate: |\n  During a diffusion run, you can monitor the progress of each image being created with this variable.  
If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\nn_batches: |\n  This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\nbatch_name: |\n  The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To prevent your artworks from being overwritten by other users, please use a unique name.\nclip_models: |\n  CLIP Model selectors. ViT-B/32, ViT-B/16, ViT-L/14, RN101, RN50, RN50x4, RN50x16, RN50x64.\n  These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.\n  You can mix in multiple models as well for different results.  However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.\n  The rough order of speed/mem usage is (smallest/fastest to largest/slowest):\n  ViT-B/32\n  RN50\n  RN101\n  ViT-B/16\n  RN50x4\n  RN50x16\n  RN50x64\n  ViT-L/14\n  For RN50x64 & ViT-L/14 you may need to use fewer cuts, depending on your VRAM.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_cnclip_vitb16/reverse_diffusion/runner.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/runner.py\n'''\nimport gc\nimport os\nimport random\nfrom threading import Thread\n\nimport disco_diffusion_cnclip_vitb16.cn_clip.clip as clip\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport paddle_lpips as lpips\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom ipywidgets import Output\nfrom PIL import Image\n\nfrom .helper import logger\nfrom .helper import parse_prompt\nfrom .model.losses import range_loss\nfrom .model.losses import spherical_dist_loss\nfrom .model.losses import tv_loss\nfrom .model.make_cutouts import MakeCutoutsDango\nfrom .model.sec_diff import alpha_sigma_to_t\nfrom .model.sec_diff import SecondaryDiffusionImageNet2\nfrom .model.transforms import Normalize\n\n\ndef do_run(args, models) -> 'DocumentArray':\n    logger.info('preparing models...')\n    model, diffusion, clip_models, secondary_model = models\n    normalize = Normalize(\n        mean=[0.48145466, 0.4578275, 0.40821073],\n        std=[0.26862954, 0.26130258, 0.27577711],\n    )\n    lpips_model = lpips.LPIPS(net='vgg')\n    for parameter in lpips_model.parameters():\n        parameter.stop_gradient = True\n    side_x = (args.width_height[0] // 64) * 64\n    side_y = (args.width_height[1] // 64) * 64\n    cut_overview = eval(args.cut_overview)\n    cut_innercut = eval(args.cut_innercut)\n    cut_icgray_p = eval(args.cut_icgray_p)\n\n    from .model.perlin_noises import create_perlin_noise, regen_perlin\n\n    seed = args.seed\n\n    skip_steps = args.skip_steps\n\n    loss_values = []\n\n    if seed is not None:\n        np.random.seed(seed)\n        random.seed(seed)\n        paddle.seed(seed)\n\n    model_stats = []\n    for clip_model in clip_models:\n        model_stat = {\n            'clip_model': None,\n            'target_embeds': [],\n            
'make_cutouts': None,\n            'weights': [],\n        }\n        model_stat['clip_model'] = clip_model\n\n        if isinstance(args.text_prompts, str):\n            args.text_prompts = [args.text_prompts]\n\n        for prompt in args.text_prompts:\n            txt, weight = parse_prompt(prompt)\n            # tokenize the parsed prompt text (parse_prompt has already stripped the \":weight\" suffix)\n            txt = clip_model.encode_text(clip.tokenize(txt))\n            if args.fuzzy_prompt:\n                for i in range(25):\n                    model_stat['target_embeds'].append((txt + paddle.randn(txt.shape) * args.rand_mag).clip(0, 1))\n                    model_stat['weights'].append(weight)\n            else:\n                model_stat['target_embeds'].append(txt)\n                model_stat['weights'].append(weight)\n\n        model_stat['target_embeds'] = paddle.concat(model_stat['target_embeds'])\n        model_stat['weights'] = paddle.to_tensor(model_stat['weights'])\n        if model_stat['weights'].sum().abs() < 1e-3:\n            raise RuntimeError('The weights must not sum to 0.')\n        model_stat['weights'] /= model_stat['weights'].sum().abs()\n        model_stats.append(model_stat)\n\n    init = None\n    if args.init_image:\n        d = Document(uri=args.init_image).load_uri_to_image_tensor(side_x, side_y)\n        init = T.to_tensor(d.tensor).unsqueeze(0) * 2 - 1\n\n    if args.perlin_init:\n        if args.perlin_mode == 'color':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n        elif args.perlin_mode == 'gray':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        else:\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = 
create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        init = (T.to_tensor(init).add(T.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n        del init2\n\n    cur_t = None\n\n    def cond_fn(x, t, y=None):\n        x_is_NaN = False\n        n = x.shape[0]\n        if secondary_model:\n            alpha = paddle.to_tensor(diffusion.sqrt_alphas_cumprod[cur_t], dtype='float32')\n            sigma = paddle.to_tensor(diffusion.sqrt_one_minus_alphas_cumprod[cur_t], dtype='float32')\n            cosine_t = alpha_sigma_to_t(alpha, sigma)\n            x = paddle.to_tensor(x.detach(), dtype='float32')\n            x.stop_gradient = False\n            cosine_t = paddle.tile(paddle.to_tensor(cosine_t.detach().cpu().numpy()), [n])\n            cosine_t.stop_gradient = False\n            out = secondary_model(x, cosine_t).pred\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        else:\n            t = paddle.ones([n], dtype='int64') * cur_t\n            out = diffusion.p_mean_variance(model, x, t, clip_denoised=False, model_kwargs={'y': y})\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out['pred_xstart'] * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        for model_stat in model_stats:\n            for i in range(args.cutn_batches):\n                t_int = (int(t.item()) + 1)  # errors on last step without +1, need to find source\n                # when using SLIP Base model the dimensions need to be hard coded to avoid AttributeError: 'VisionTransformer' object has no attribute 'input_resolution'\n                try:\n                    
input_resolution = model_stat['clip_model'].visual.input_resolution\n                except:\n                    input_resolution = 224\n\n                cuts = MakeCutoutsDango(\n                    input_resolution,\n                    Overview=cut_overview[1000 - t_int],\n                    InnerCrop=cut_innercut[1000 - t_int],\n                    IC_Size_Pow=args.cut_ic_pow,\n                    IC_Grey_P=cut_icgray_p[1000 - t_int],\n                )\n                clip_in = normalize(cuts(x_in.add(paddle.to_tensor(1.0)).divide(paddle.to_tensor(2.0))))\n                image_embeds = (model_stat['clip_model'].encode_image(clip_in))\n\n                dists = spherical_dist_loss(\n                    image_embeds.unsqueeze(1),\n                    model_stat['target_embeds'].unsqueeze(0),\n                )\n\n                dists = dists.reshape([\n                    cut_overview[1000 - t_int] + cut_innercut[1000 - t_int],\n                    n,\n                    -1,\n                ])\n                losses = dists.multiply(model_stat['weights']).sum(2).mean(0)\n                loss_values.append(losses.sum().item())  # log loss, probably shouldn't do per cutn_batch\n\n                x_in_grad += (paddle.grad(losses.sum() * args.clip_guidance_scale, x_in)[0] / args.cutn_batches)\n        tv_losses = tv_loss(x_in)\n        range_losses = range_loss(x_in)\n        sat_losses = paddle.abs(x_in - x_in.clip(min=-1, max=1)).mean()\n        loss = (tv_losses.sum() * args.tv_scale + range_losses.sum() * args.range_scale +\n                sat_losses.sum() * args.sat_scale)\n        if init is not None and args.init_scale:\n            init_losses = lpips_model(x_in, init)\n            loss = loss + init_losses.sum() * args.init_scale\n        x_in_grad += paddle.grad(loss, x_in)[0]\n        if not paddle.isnan(x_in_grad).any():\n            grad = -paddle.grad(x_in_d, x, x_in_grad)[0]\n        else:\n            x_is_NaN = True\n            grad = 
paddle.zeros_like(x)\n        if args.clamp_grad and not x_is_NaN:\n            magnitude = grad.square().mean().sqrt()\n            return (grad * magnitude.clip(max=args.clamp_max) / magnitude)\n        return grad\n\n    if args.diffusion_sampling_mode == 'ddim':\n        sample_fn = diffusion.ddim_sample_loop_progressive\n    else:\n        sample_fn = diffusion.plms_sample_loop_progressive\n\n    logger.info('creating artwork...')\n\n    image_display = Output()\n    da_batches = DocumentArray()\n\n    for _nb in range(args.n_batches):\n        display.clear_output(wait=True)\n        display.display(args.name_docarray, image_display)\n        gc.collect()\n        paddle.device.cuda.empty_cache()\n\n        d = Document(tags=vars(args))\n        da_batches.append(d)\n\n        cur_t = diffusion.num_timesteps - skip_steps - 1\n\n        if args.perlin_init:\n            init = regen_perlin(args.perlin_mode, side_y, side_x, args.batch_size)\n\n        if args.diffusion_sampling_mode == 'ddim':\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                eta=args.eta,\n            )\n        else:\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                order=2,\n            )\n\n        threads = []\n        for j, sample in enumerate(samples):\n            cur_t 
-= 1\n            with image_display:\n                if j % args.display_rate == 0 or cur_t == -1:\n                    for _, image in enumerate(sample['pred_xstart']):\n                        image = (image + 1) / 2\n                        image = image.clip(0, 1).squeeze().transpose([1, 2, 0]).numpy() * 255\n                        image = np.uint8(image)\n                        image = Image.fromarray(image)\n\n                        image.save(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb)))\n                        c = Document(tags={'cur_t': cur_t})\n                        c.load_pil_image_to_datauri(image)\n                        d.chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(display.Image(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb))))\n                        d.chunks.plot_image_sprites(os.path.join(args.output_dir,\n                                                                 f'{args.name_docarray}-progress-{_nb}.png'),\n                                                    show_index=True)\n                        t = Thread(\n                            target=_silent_push,\n                            args=(\n                                da_batches,\n                                args.name_docarray,\n                            ),\n                        )\n                        threads.append(t)\n                        t.start()\n\n                    if cur_t == -1:\n                        d.load_pil_image_to_datauri(image)\n\n        for t in threads:\n            t.join()\n    display.clear_output(wait=True)\n    logger.info(f'done! 
{args.name_docarray}')\n    da_batches.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    return da_batches\n\n\ndef _silent_push(da_batches: DocumentArray, name: str) -> None:\n    try:\n        da_batches.push(name)\n    except Exception as ex:\n        logger.debug(f'push failed: {ex}')\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/README.md",
    "content": "# disco_diffusion_ernievil_base\n\n|模型名称|disco_diffusion_ernievil_base|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|dd+ERNIE-ViL|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|2.9GB|\n|最新更新日期|2022-08-02|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n   - 输入文本 \"小桥流水人家\"\n\n   - 输出图像\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/183010362-ec76fa49-5170-462f-8fc8-4353fd648924.png\"  width = \"80%\" hspace='10'/>\n   <br />\n\n   - 生成过程\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/183010368-834e6388-411b-4e73-9bc4-a0ac97cb58d5.gif\"  width = \"80%\" hspace='10'/>\n   <br />\n\n\n### 模型介绍\n\ndisco_diffusion_ernievil_base 是一个文图生成模型，可以通过输入一段文字来生成符合该句子语义的图像。该模型由两部分组成，一部分是扩散模型，是一种生成模型，可以从噪声输入中重建出原始图像。另一部分是多模态预训练模型（ERNIE-ViL), 可以将文本和图像表示在同一个特征空间，相近语义的文本和图像在该特征空间里距离会更相近。在该文图生成模型中，扩散模型负责从初始噪声或者指定初始图像中来生成目标图像，ERNIE-ViL负责引导生成图像的语义和输入的文本的语义尽可能接近，随着扩散模型在ERNIE-ViL的引导下不断的迭代生成新图像，最终能够生成文本所描述内容的图像。该模块中使用的模型为ERNIE-ViL，由ERNIE 3.0+ViT构成。\n\n更多详情请参考论文：[Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install disco_diffusion_ernievil_base\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run disco_diffusion_ernievil_base --text_prompts \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作。\" --output_dir disco_diffusion_ernievil_base_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_ernievil_base\")\n    text_prompts = [\"孤舟蓑笠翁，独钓寒江雪。\"]\n    # 生成图像, 
默认会在disco_diffusion_ernievil_base_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    da = module.generate_image(text_prompts=text_prompts, artist='齐白石', output_dir='./disco_diffusion_ernievil_base_out/')  \n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_ernievil_base_out-result.png')\n    # 展示所有的中间结果\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_ernievil_base_out-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_ernievil_base_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作\"。\n      - style(Optional[str]): 指定绘画的风格，如水墨画、油画、水彩画等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如齐白石、Greg Rutkowski，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"disco_diffusion_ernievil_base_out\"。\n\n\n    - **返回**\n      - ra(DocumentArray): DocumentArray对象， 包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ 
hub serving start -m disco_diffusion_ernievil_base\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果，返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，返回后对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': '孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_ernievil_base\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 手动将最终生成的图像保存到指定路径\n    da[0].save_uri_to_file('disco_diffusion_ernievil_base_out-result.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks.save_gif('disco_diffusion_ernievil_base_out-result.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install disco_diffusion_ernievil_base == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/README_en.md",
"content": "# disco_diffusion_ernievil_base\n\n|Module Name|disco_diffusion_ernievil_base|\n| :--- | :---: |\n|Category|text to image|\n|Network|dd+ERNIE-ViL|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|2.9GB|\n|Latest update date|2022-08-02|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n   - Prompt \"小桥流水人家\"\n\n   - Output image\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/183010362-ec76fa49-5170-462f-8fc8-4353fd648924.png\"  width = \"80%\" hspace='10'/>\n   <br />\n\n   - Generating process\n   <p align=\"center\">\n     <img src=\"https://user-images.githubusercontent.com/22424850/183010368-834e6388-411b-4e73-9bc4-a0ac97cb58d5.gif\"  width = \"80%\" hspace='10'/>\n   <br />\n\n\n### Module Introduction\n\ndisco_diffusion_ernievil_base is a text-to-image generation model that can generate images matching the semantics of the sentence you prompt. The model consists of two parts: one is the diffusion model, a generative model that reconstructs the original image from noisy input. The other part is the multimodal pre-training model (ERNIE-ViL), which can represent text and images in the same feature space, where text and images with similar semantics lie closer together. In this text-to-image model, the diffusion model is responsible for generating the target image from the initial noise or the specified initial image, and ERNIE-ViL is responsible for guiding the generated image to be as close as possible to the semantics of the input text. The diffusion model iterates under the guidance of ERNIE-ViL, eventually generating an image of what the text describes. 
The model used in this module is ERNIE-ViL, consisting of ERNIE 3.0+ViT.\n\nFor more details, please refer to [Diffusion Models Beat GANs on Image Synthesis](https://arxiv.org/abs/2105.05233)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install disco_diffusion_ernievil_base\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction  \n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run disco_diffusion_ernievil_base --text_prompts \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作。\" --output_dir disco_diffusion_ernievil_base_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"disco_diffusion_ernievil_base\")\n    text_prompts = [\"孤舟蓑笠翁，独钓寒江雪。\"]\n    # Output images will be saved in disco_diffusion_ernievil_base_out directory.\n    # The returned da is a DocumentArray object, which contains all intermediate and final results\n    # You can manipulate the DocumentArray object to do post-processing and save images\n    da = module.generate_image(text_prompts=text_prompts, artist='齐白石', output_dir='./disco_diffusion_ernievil_base_out/')  \n    # Save final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_ernievil_base_out-result.png')\n    # Show all intermediate results\n    da[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_ernievil_base_out-result.gif')\n    ```\n\n- ### 3.API\n\n  - 
```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [1280, 768],\n            seed: Optional[int] = None,\n            output_dir: Optional[str] = 'disco_diffusion_ernievil_base_out'):\n    ```\n\n    - Image generation API, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt that conforms to the format \"content\" + \"artist/style\", such as \"孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, the style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as 齐白石 or Greg Rutkowski; the image style will follow the works of the artist you choose. If not provided, the style is totally up to your [prompt](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/).\n      - width_height(Optional[List[int]]): The width and height of output images, which should preferably be multiples of 64. 
The larger the size, the longer the computation time.\n      - seed(Optional[int]): Random seed; different seeds result in different output images.\n      - output_dir(Optional[str]): Output directory, default is \"disco_diffusion_ernievil_base_out\".\n\n\n    - **Return**\n      - da(DocumentArray): DocumentArray object, including `n_batches` Documents. Each Document keeps all intermediate results during generation; please refer to [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text-to-image.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m disco_diffusion_ernievil_base\n    ```\n\n  - The serving API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': '孤舟蓑笠翁，独钓寒江雪。风格如齐白石所作'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/disco_diffusion_ernievil_base\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save the final result image to a file\n    da[0].save_uri_to_file('disco_diffusion_ernievil_base_out-result.png')\n    # Save the generating process as a gif\n    da[0].chunks.save_gif('disco_diffusion_ernievil_base_out-result.gif')\n    ```\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install 
disco_diffusion_ernievil_base==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport paddle\nfrom disco_diffusion_ernievil_base import resize_right\nfrom disco_diffusion_ernievil_base.reverse_diffusion import create\nfrom disco_diffusion_ernievil_base.vit_b_16x import ernievil2\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"disco_diffusion_ernievil_base\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass DiscoDiffusionClip:\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       init_image: Optional[str] = None,\n                       width_height: Optional[List[int]] = [1280, 768],\n                       skip_steps: Optional[int] = 0,\n                       steps: Optional[int] = 250,\n                       cut_ic_pow: Optional[int] = 1,\n                       init_scale: Optional[int] = 1000,\n                       clip_guidance_scale: 
Optional[int] = 5000,\n                       tv_scale: Optional[int] = 0,\n                       range_scale: Optional[int] = 0,\n                       sat_scale: Optional[int] = 0,\n                       cutn_batches: Optional[int] = 4,\n                       diffusion_sampling_mode: Optional[str] = 'ddim',\n                       perlin_init: Optional[bool] = False,\n                       perlin_mode: Optional[str] = 'mixed',\n                       seed: Optional[int] = None,\n                       eta: Optional[float] = 0.8,\n                       clamp_grad: Optional[bool] = True,\n                       clamp_max: Optional[float] = 0.05,\n                       randomize_class: Optional[bool] = True,\n                       clip_denoised: Optional[bool] = False,\n                       fuzzy_prompt: Optional[bool] = False,\n                       rand_mag: Optional[float] = 0.05,\n                       cut_overview: Optional[str] = '[12]*400+[4]*600',\n                       cut_innercut: Optional[str] = '[4]*400+[12]*600',\n                       cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n                       display_rate: Optional[int] = 10,\n                       n_batches: Optional[int] = 1,\n                       batch_size: Optional[int] = 1,\n                       batch_name: Optional[str] = '',\n                       use_gpu: Optional[bool] = True,\n                       output_dir: Optional[str] = 'disco_diffusion_ernievil_base_out'):\n        \"\"\"\n        Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n\n        :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n        :param style: Image style, such as oil paintings, if specified, style will be used to construct prompts.\n        :param artist: Artist style, if specified, style will be used to construct prompts.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. 
The noise levels in the first few steps are very high, so images change dramatically in early steps.As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n        :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n        :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n        :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n        :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n        :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n        :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n        :param sat_scale: Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n        :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. 
However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n        :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n        :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  
Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n        :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n        :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n        :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n        :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. 
If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n        :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n        :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n        :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n        :param cut_overview: The schedule of overview cuts.\n        :param cut_innercut: The schedule of inner cuts.\n        :param cut_icgray_p: The schedule of the proportion of inner cuts that are converted to grayscale, e.g. the default '[0.2]*400+[0]*600' makes 20% of inner cuts grayscale for the first 400 steps and none afterwards.\n        :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n        :param n_batches: This variable sets the number of still images you want DD to create.  
If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n        :param batch_name: The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\n        :param use_gpu: Whether to use GPU.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if use_gpu:\n            _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n            if _places:\n                paddle.device.set_device(\"gpu:{}\".format(0))\n            else:\n                raise RuntimeError(\n                    \"Environment variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES to the cuda device id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \"，{}\".format(style)\n            if artist is not None:\n                text_prompts += \"，由{}所作\".format(artist)\n        elif isinstance(text_prompts, list):\n            text_prompts[0] = text_prompts[0].rstrip(',.，。')\n            if style is not None:\n                text_prompts[0] += \"，{}\".format(style)\n            if artist is not None:\n                text_prompts[0] += \"，由{}所作\".format(artist)\n\n        return create(text_prompts=text_prompts,\n                      init_image=init_image,\n                      width_height=width_height,\n                      skip_steps=skip_steps,\n                      steps=steps,\n                      cut_ic_pow=cut_ic_pow,\n      
                init_scale=init_scale,\n                      clip_guidance_scale=clip_guidance_scale,\n                      tv_scale=tv_scale,\n                      range_scale=range_scale,\n                      sat_scale=sat_scale,\n                      cutn_batches=cutn_batches,\n                      diffusion_sampling_mode=diffusion_sampling_mode,\n                      perlin_init=perlin_init,\n                      perlin_mode=perlin_mode,\n                      seed=seed,\n                      eta=eta,\n                      clamp_grad=clamp_grad,\n                      clamp_max=clamp_max,\n                      randomize_class=randomize_class,\n                      clip_denoised=clip_denoised,\n                      fuzzy_prompt=fuzzy_prompt,\n                      rand_mag=rand_mag,\n                      cut_overview=cut_overview,\n                      cut_innercut=cut_innercut,\n                      cut_icgray_p=cut_icgray_p,\n                      display_rate=display_rate,\n                      n_batches=n_batches,\n                      batch_size=batch_size,\n                      batch_name=batch_name,\n                      clip_models=['vit_b_16x'],\n                      output_dir=output_dir)\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", 
description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      init_image=args.init_image,\n                                      width_height=args.width_height,\n                                      skip_steps=args.skip_steps,\n                                      steps=args.steps,\n                                      cut_ic_pow=args.cut_ic_pow,\n                                      init_scale=args.init_scale,\n                                      clip_guidance_scale=args.clip_guidance_scale,\n                                      tv_scale=args.tv_scale,\n                                      range_scale=args.range_scale,\n                                      sat_scale=args.sat_scale,\n                                      cutn_batches=args.cutn_batches,\n                                      diffusion_sampling_mode=args.diffusion_sampling_mode,\n                                      perlin_init=args.perlin_init,\n                                      perlin_mode=args.perlin_mode,\n                                      seed=args.seed,\n                                      eta=args.eta,\n                                      clamp_grad=args.clamp_grad,\n                                      clamp_max=args.clamp_max,\n                                      randomize_class=args.randomize_class,\n                                      clip_denoised=args.clip_denoised,\n                                      fuzzy_prompt=args.fuzzy_prompt,\n                          
            rand_mag=args.rand_mag,\n                                      cut_overview=args.cut_overview,\n                                      cut_innercut=args.cut_innercut,\n                                      cut_icgray_p=args.cut_icgray_p,\n                                      display_rate=args.display_rate,\n                                      n_batches=args.n_batches,\n                                      batch_size=args.batch_size,\n                                      batch_name=args.batch_name,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--skip_steps',\n            type=int,\n            default=0,\n            help=\n            'Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.The first few steps of denoising are often so dramatic that some steps (maybe 10-15%% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.Lastly, if using an init_image, you will need to skip ~50%% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture'\n        )\n        self.arg_input_group.add_argument(\n            '--steps',\n            type=int,\n            default=250,\n            help=\n            \"When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  
Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cut_ic_pow',\n            type=int,\n            default=1,\n            help=\n            \"This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\"\n        )\n        self.arg_input_group.add_argument(\n            '--init_scale',\n            type=int,\n            default=1000,\n            help=\n            \"This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clip_guidance_scale',\n            type=int,\n            default=5000,\n            help=\n            \"CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. 
Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\"\n        )\n        self.arg_input_group.add_argument(\n            '--tv_scale',\n            type=int,\n            default=0,\n            help=\n            \"Total variation denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\"\n        )\n        self.arg_input_group.add_argument(\n            '--range_scale',\n            type=int,\n            default=0,\n            help=\n            \"Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\"\n        )\n        self.arg_input_group.add_argument(\n            '--sat_scale',\n            type=int,\n            default=0,\n            help=\n            \"Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\"\n        )\n        self.arg_input_group.add_argument(\n            '--cutn_batches',\n            type=int,\n            default=4,\n            help=\n            \"Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  
Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image. So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\"\n        )\n        self.arg_input_group.add_argument(\n            '--diffusion_sampling_mode',\n            type=str,\n            default='ddim',\n            help=\n            \"Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_init',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together does make a very colorful rainbow effect, which can be used creatively.\"\n        )\n        self.arg_input_group.add_argument(\n            '--perlin_mode',\n            type=str,\n            default='mixed',\n            help=\n            \"Sets the type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  
If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\"\n        )\n        self.arg_input_group.add_argument(\n            '--eta',\n            type=float,\n            default=0.8,\n            help=\n            \"eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on the image, so you’ll need to experiment to see how this affects your projects.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_grad',\n            type=ast.literal_eval,\n            default=True,\n            help=\n            \"As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\"\n        )\n        self.arg_input_group.add_argument(\n            '--clamp_max',\n            type=float,\n            default=0.05,\n            help=\n            \"Sets the value of the clamp_grad limitation. 
Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\"\n        )\n        self.arg_input_group.add_argument('--randomize_class', type=ast.literal_eval, default=True, help=\"Random class.\")\n        self.arg_input_group.add_argument('--clip_denoised', type=ast.literal_eval, default=False, help=\"Clip denoised.\")\n        self.arg_input_group.add_argument(\n            '--fuzzy_prompt',\n            type=ast.literal_eval,\n            default=False,\n            help=\n            \"Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\"\n        )\n        self.arg_input_group.add_argument(\n            '--rand_mag',\n            type=float,\n            default=0.5,\n            help=\"Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\")\n        self.arg_input_group.add_argument('--cut_overview',\n                                          type=str,\n                                          default='[12]*400+[4]*600',\n                                          help=\"The schedule of overview cuts\")\n        self.arg_input_group.add_argument('--cut_innercut',\n                                          type=str,\n                                          default='[4]*400+[12]*600',\n                                          help=\"The schedule of inner cuts\")\n        self.arg_input_group.add_argument(\n            '--cut_icgray_p',\n            type=str,\n            default='[0.2]*400+[0]*600',\n            help=\n            \"This sets the schedule for the portion of inner cuts that are converted to grayscale before being evaluated, which can help CLIP focus on shape and structure rather than color.\"\n        )\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\n            \"During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\"\n        )\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to use GPU or not\")\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='disco_diffusion_ernievil_base_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--text_prompts', type=str)\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings; if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                       
                   help='Artist style; if specified, artist will be used to construct prompts.')\n        self.arg_input_group.add_argument(\n            '--init_image',\n            type=str,\n            default=None,\n            help=\n            \"Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50%% of total steps to retain the character of the init. See skip_steps above for further discussion.\"\n        )\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[1280, 768],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--n_batches',\n            type=int,\n            default=1,\n            help=\n            \"This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\"\n        )\n        self.arg_input_group.add_argument('--batch_size', type=int, default=1, help=\"Batch size.\")\n        self.arg_input_group.add_argument(\n            '--batch_name',\n            type=str,\n            default='',\n            help=\n            'The name of the batch, the batch id will be named as \"discoart-[batch_name]-seed\". 
To avoid your artworks being overridden by other users, please use a unique name.'\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/requirements.txt",
    "content": "numpy\npaddle_lpips==0.1.2\nftfy\ndocarray>=0.13.29\npyyaml\nregex\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/resize_right/README.md",
    "content": "# ResizeRight (Paddle)\nFully differentiable resize function implemented by Paddle.\nThis module is based on [assafshocher/ResizeRight](https://github.com/assafshocher/ResizeRight).\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/resize_right/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/resize_right/interp_methods.py",
    "content": "from math import pi\n\ntry:\n    import paddle\nexcept ImportError:\n    paddle = None\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or PyTorch but both not found\")\n\n\ndef set_framework_dependencies(x):\n    if type(x) is numpy.ndarray:\n        to_dtype = lambda a: a\n        fw = numpy\n    else:\n        to_dtype = lambda a: paddle.cast(a, x.dtype)\n        fw = paddle\n    # eps = fw.finfo(fw.float32).eps\n    eps = paddle.to_tensor(np.finfo(np.float32).eps)\n    return fw, to_dtype, eps\n\n\ndef support_sz(sz):\n\n    def wrapper(f):\n        f.support_sz = sz\n        return f\n\n    return wrapper\n\n\n@support_sz(4)\ndef cubic(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    absx = fw.abs(x)\n    absx2 = absx**2\n    absx3 = absx**3\n    return ((1.5 * absx3 - 2.5 * absx2 + 1.) * to_dtype(absx <= 1.) +\n            (-0.5 * absx3 + 2.5 * absx2 - 4. * absx + 2.) * to_dtype((1. < absx) & (absx <= 2.)))\n\n\n@support_sz(4)\ndef lanczos2(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 2) + eps) / ((pi**2 * x**2 / 2) + eps)) * to_dtype(abs(x) < 2))\n\n\n@support_sz(6)\ndef lanczos3(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return (((fw.sin(pi * x) * fw.sin(pi * x / 3) + eps) / ((pi**2 * x**2 / 3) + eps)) * to_dtype(abs(x) < 3))\n\n\n@support_sz(2)\ndef linear(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return ((x + 1) * to_dtype((-1 <= x) & (x < 0)) + (1 - x) * to_dtype((0 <= x) & (x <= 1)))\n\n\n@support_sz(1)\ndef box(x):\n    fw, to_dtype, eps = set_framework_dependencies(x)\n    return to_dtype((-1 <= x) & (x < 0)) + to_dtype((0 <= x) & (x <= 1))\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/resize_right/resize_right.py",
    "content": "import warnings\nfrom fractions import Fraction\nfrom math import ceil\nfrom typing import Tuple\n\nimport disco_diffusion_ernievil_base.resize_right.interp_methods as interp_methods\n\n\nclass NoneClass:\n    pass\n\n\ntry:\n    import paddle\n    from paddle import nn\n    nnModuleWrapped = nn.Layer\nexcept ImportError:\n    warnings.warn('No PyTorch found, will work only with Numpy')\n    paddle = None\n    nnModuleWrapped = NoneClass\n\ntry:\n    import numpy\n    import numpy as np\nexcept ImportError:\n    warnings.warn('No Numpy found, will work only with PyTorch')\n    numpy = None\n\nif numpy is None and paddle is None:\n    raise ImportError(\"Must have either Numpy or PyTorch but both not found\")\n\n\ndef resize(input,\n           scale_factors=None,\n           out_shape=None,\n           interp_method=interp_methods.cubic,\n           support_sz=None,\n           antialiasing=True,\n           by_convs=False,\n           scale_tolerance=None,\n           max_numerator=10,\n           pad_mode='constant'):\n    # get properties of the input tensor\n    in_shape, n_dims = input.shape, input.ndim\n\n    # fw stands for framework that can be either numpy or paddle,\n    # determined by the input type\n    fw = numpy if type(input) is numpy.ndarray else paddle\n    eps = np.finfo(np.float32).eps if fw == numpy else paddle.to_tensor(np.finfo(np.float32).eps)\n    device = input.place if fw is paddle else None\n\n    # set missing scale factors or output shapem one according to another,\n    # scream if both missing. this is also where all the defults policies\n    # take place. 
also handling the by_convs attribute carefully.\n    scale_factors, out_shape, by_convs = set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs,\n                                                              scale_tolerance, max_numerator, eps, fw)\n\n    # sort indices of dimensions according to scale of each dimension.\n    # since we are going dim by dim this is efficient\n    sorted_filtered_dims_and_scales = [(dim, scale_factors[dim], by_convs[dim], in_shape[dim], out_shape[dim])\n                                       for dim in sorted(range(n_dims), key=lambda ind: scale_factors[ind])\n                                       if scale_factors[dim] != 1.]\n    # unless support size is specified by the user, it is an attribute\n    # of the interpolation method\n    if support_sz is None:\n        support_sz = interp_method.support_sz\n\n    # output begins identical to input and changes with each iteration\n    output = input\n\n    # iterate over dims\n    for (dim, scale_factor, dim_by_convs, in_sz, out_sz) in sorted_filtered_dims_and_scales:\n        # STEP 1- PROJECTED GRID: The non-integer locations of the projection\n        # of output pixel locations to the input tensor\n        projected_grid = get_projected_grid(in_sz, out_sz, scale_factor, fw, dim_by_convs, device)\n\n        # STEP 1.5: ANTIALIASING- If antialiasing is taking place, we modify\n        # the window size and the interpolation method (see inside function)\n        cur_interp_method, cur_support_sz = apply_antialiasing_if_needed(interp_method, support_sz, scale_factor,\n                                                                         antialiasing)\n\n        # STEP 2- FIELDS OF VIEW: for each output pixels, map the input pixels\n        # that influence it. 
Also calculate needed padding and update grid\n        # accordingly\n        field_of_view = get_field_of_view(projected_grid, cur_support_sz, fw, eps, device)\n\n        # STEP 2.5- CALCULATE PAD AND UPDATE: according to the field of view,\n        # the input should be padded to handle the boundaries, coordinates\n        # should be updated. actual padding only occurs when weights are\n        # applied (step 4). if using by_convs for this dim, then we need to\n        # calc right and left boundaries for each filter instead.\n        pad_sz, projected_grid, field_of_view = calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor,\n                                                            dim_by_convs, fw, device)\n        # STEP 3- CALCULATE WEIGHTS: Match a set of weights to the pixels in\n        # the field of view for each output pixel\n        weights = get_weights(cur_interp_method, projected_grid, field_of_view)\n\n        # STEP 4- APPLY WEIGHTS: Each output pixel is calculated by multiplying\n        # its set of weights with the pixel values in its field of view.\n        # We now multiply the fields of view with their matching weights.\n        # We do this by tensor multiplication and broadcasting.\n        # if by_convs is true for this dim, then we do this action by\n        # convolutions. this is equivalent but faster.\n        if not dim_by_convs:\n            output = apply_weights(output, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw)\n        else:\n            output = apply_convs(output, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw)\n    return output\n\n\ndef get_projected_grid(in_sz, out_sz, scale_factor, fw, by_convs, device=None):\n    # we start by having the output coordinates which are just integer locations\n    # in the special case when using by_convs, we only need two cycles of grid\n    # points. 
the first and last.\n    grid_sz = out_sz if not by_convs else scale_factor.numerator\n    out_coordinates = fw_arange(grid_sz, fw, device)\n\n    # This is projecting the output pixel locations in 1d to the input tensor,\n    # as non-integer locations.\n    # the following formula is derived in the paper\n    # \"From Discrete to Continuous Convolutions\" by Shocher et al.\n    return (out_coordinates / float(scale_factor) + (in_sz - 1) / 2 - (out_sz - 1) / (2 * float(scale_factor)))\n\n\ndef get_field_of_view(projected_grid, cur_support_sz, fw, eps, device):\n    # for each output pixel, map which input pixels influence it, in 1d.\n    # we start by calculating the leftmost neighbor, using half of the window\n    # size (eps is for when boundary is exact int)\n    left_boundaries = fw_ceil(projected_grid - cur_support_sz / 2 - eps, fw)\n\n    # then we simply take all the pixel centers in the field by counting\n    # window size pixels from the left boundary\n    ordinal_numbers = fw_arange(ceil(cur_support_sz - eps), fw, device)\n    return left_boundaries[:, None] + ordinal_numbers\n\n\ndef calc_pad_sz(in_sz, out_sz, field_of_view, projected_grid, scale_factor, dim_by_convs, fw, device):\n    if not dim_by_convs:\n        # determine padding according to neighbor coords out of bound.\n        # this is a generalized notion of padding, when pad<0 it means crop\n        pad_sz = [-field_of_view[0, 0].item(), field_of_view[-1, -1].item() - in_sz + 1]\n\n        # since input image will be changed by padding, coordinates of both\n        # field_of_view and projected_grid need to be updated\n        field_of_view += pad_sz[0]\n        projected_grid += pad_sz[0]\n\n    else:\n        # only used for by_convs, to calc the boundaries of each filter the\n        # number of distinct convolutions is the numerator of the scale factor\n        num_convs, stride = scale_factor.numerator, scale_factor.denominator\n\n        # calculate left and right boundaries for each 
conv. left can also be\n        # negative, right can be bigger than in_sz. such cases imply padding if\n        # needed. however if both are in-bounds, it means we need to crop,\n        # practically apply the conv only on part of the image.\n        left_pads = -field_of_view[:, 0]\n\n        # next calc is tricky, explanation by rows:\n        # 1) counting output pixels between the first position of each filter\n        #    to the right boundary of the input\n        # 2) dividing it by number of filters to count how many 'jumps'\n        #    each filter does\n        # 3) multiplying by the stride gives us the distance over the input\n        #    coords done by all these jumps for each filter\n        # 4) to this distance we add the right boundary of the filter when\n        #    placed in its leftmost position. so now we get the right boundary\n        #    of that filter in input coord.\n        # 5) the padding size needed is obtained by subtracting the rightmost\n        #    input coordinate. if the result is positive padding is needed. if\n        #    negative then negative padding means shaving off pixel columns.\n        right_pads = (((out_sz - fw_arange(num_convs, fw, device) - 1)  # (1)\n                       // num_convs)  # (2)\n                      * stride  # (3)\n                      + field_of_view[:, -1]  # (4)\n                      - in_sz + 1)  # (5)\n\n        # in the by_convs case pad_sz is a list of left-right pairs. 
one per\n        # each filter\n\n        pad_sz = list(zip(left_pads, right_pads))\n\n    return pad_sz, projected_grid, field_of_view\n\n\ndef get_weights(interp_method, projected_grid, field_of_view):\n    # the set of weights per each output pixel is the result of the chosen\n    # interpolation method applied to the distances between projected grid\n    # locations and the pixel-centers in the field of view (distances are\n    # directed, can be positive or negative)\n    weights = interp_method(projected_grid[:, None] - field_of_view)\n\n    # we now carefully normalize the weights to sum to 1 per each output pixel\n    sum_weights = weights.sum(1, keepdim=True)\n    sum_weights[sum_weights == 0] = 1\n    return weights / sum_weights\n\n\ndef apply_weights(input, field_of_view, weights, dim, n_dims, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the first one.\n    # so we transpose and will transpose back after multiplying\n    tmp_input = fw_swapaxes(input, dim, 0, fw)\n\n    # apply padding\n    tmp_input = fw_pad(tmp_input, fw, pad_sz, pad_mode)\n\n    # field_of_view is a tensor of order 2: for each output (1d location\n    # along cur dim)- a list of 1d neighbors locations.\n    # note that this whole operation is applied to each dim separately,\n    # this is why it is all in 1d.\n    # neighbors = tmp_input[field_of_view] is a tensor of order image_dims+1:\n    # for each output pixel (this time indicated in all dims), these are the\n    # values of the neighbors in the 1d field of view. note that we only\n    # consider neighbors along the current dim, but such set exists for every\n    # multi-dim location, hence the final tensor order is image_dims+1.\n    if fw is paddle:\n        paddle.device.cuda.empty_cache()\n    neighbors = tmp_input[field_of_view]\n\n    # weights is an order 2 tensor: for each output location along 1d- a list\n    # of weights matching the field of view. 
we augment it with ones, for\n    # broadcasting, so that when it multiplies some tensor the weights affect\n    # only its first dim.\n    tmp_weights = fw.reshape(weights, (*weights.shape, *[1] * (n_dims - 1)))\n\n    # now we simply multiply the weights with the neighbors, and then sum\n    # along the field of view, to get a single value per out pixel\n    tmp_output = (neighbors * tmp_weights).sum(1)\n    # we transpose back the resized dim to its original position\n    return fw_swapaxes(tmp_output, 0, dim, fw)\n\n\ndef apply_convs(input, scale_factor, in_sz, out_sz, weights, dim, pad_sz, pad_mode, fw):\n    # for this operation we assume the resized dim is the last one.\n    # so we transpose and will transpose back after multiplying\n    input = fw_swapaxes(input, dim, -1, fw)\n\n    # the stride for all convs is the denominator of the scale factor\n    stride, num_convs = scale_factor.denominator, scale_factor.numerator\n\n    # prepare an empty tensor for the output\n    tmp_out_shape = list(input.shape)\n    tmp_out_shape[-1] = out_sz\n    tmp_output = fw_empty(tuple(tmp_out_shape), fw, input.place if fw is paddle else None)\n\n    # iterate over the conv operations. we have as many as the numerator\n    # of the scale-factor. for each we need boundaries and a filter.\n    for conv_ind, (pad_sz, filt) in enumerate(zip(pad_sz, weights)):\n        # apply padding (we pad last dim, padding can be negative)\n        pad_dim = input.ndim - 1\n        tmp_input = fw_pad(input, fw, pad_sz, pad_mode, dim=pad_dim)\n\n        # apply convolution over last dim. 
store in the output tensor with\n        # positional strides so that when the loop is complete conv results are\n        # interleaved\n        tmp_output[..., conv_ind::num_convs] = fw_conv(tmp_input, filt, stride)\n\n    return fw_swapaxes(tmp_output, -1, dim, fw)\n\n\ndef set_scale_and_out_sz(in_shape, out_shape, scale_factors, by_convs, scale_tolerance, max_numerator, eps, fw):\n    # eventually we must have both scale-factors and out-sizes for all in/out\n    # dims. however, we support many possible partial arguments\n    if scale_factors is None and out_shape is None:\n        raise ValueError(\"either scale_factors or out_shape should be \"\n                         \"provided\")\n    if out_shape is not None:\n        # if out_shape has less dims than in_shape, we by default resize the\n        # first dims for numpy and last dims for paddle\n        out_shape = (list(out_shape) +\n                     list(in_shape[len(out_shape):]) if fw is numpy else list(in_shape[:-len(out_shape)]) +\n                     list(out_shape))\n        if scale_factors is None:\n            # if no scale given, we calculate it as the out to in ratio\n            # (not recommended)\n            scale_factors = [out_sz / in_sz for out_sz, in_sz in zip(out_shape, in_shape)]\n    if scale_factors is not None:\n        # by default, if a single number is given as scale, we assume resizing\n        # two dims (most common are images with 2 spatial dims)\n        scale_factors = (scale_factors if isinstance(scale_factors, (list, tuple)) else [scale_factors, scale_factors])\n        # if less scale_factors than in_shape dims, we by default resize the\n        # first dims for numpy and last dims for paddle\n        scale_factors = (list(scale_factors) + [1] * (len(in_shape) - len(scale_factors)) if fw is numpy else [1] *\n                         (len(in_shape) - len(scale_factors)) + list(scale_factors))\n        if out_shape is None:\n            # when no out_shape given, it is 
calculated by multiplying the\n            # scale by the in_shape (not recommended)\n            out_shape = [ceil(scale_factor * in_sz) for scale_factor, in_sz in zip(scale_factors, in_shape)]\n        # next part intentionally after out_shape determined for stability\n        # we fix by_convs to be a list of truth values in case it is not\n        if not isinstance(by_convs, (list, tuple)):\n            by_convs = [by_convs] * len(out_shape)\n\n        # next loop fixes the scale for each dim to be either frac or float.\n        # this is determined by by_convs and by tolerance for scale accuracy.\n        for ind, (sf, dim_by_convs) in enumerate(zip(scale_factors, by_convs)):\n            # first we fractionalize\n            if dim_by_convs:\n                frac = Fraction(1 / sf).limit_denominator(max_numerator)\n                frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)\n\n            # if accuracy is within tolerance scale will be frac. if not, then\n            # it will be float and the by_convs attr will be set false for\n            # this dim\n            if scale_tolerance is None:\n                scale_tolerance = eps\n            if dim_by_convs and abs(frac - sf) < scale_tolerance:\n                scale_factors[ind] = frac\n            else:\n                scale_factors[ind] = float(sf)\n                by_convs[ind] = False\n\n        return scale_factors, out_shape, by_convs\n\n\ndef apply_antialiasing_if_needed(interp_method, support_sz, scale_factor, antialiasing):\n    # antialiasing is \"stretching\" the field of view according to the scale\n    # factor (only for downscaling). this is low-pass filtering. 
this\n    # requires modifying both the interpolation (stretching the 1d\n    # function and multiplying by the scale-factor) and the window size.\n    scale_factor = float(scale_factor)\n    if scale_factor >= 1.0 or not antialiasing:\n        return interp_method, support_sz\n    cur_interp_method = (lambda arg: scale_factor * interp_method(scale_factor * arg))\n    cur_support_sz = support_sz / scale_factor\n    return cur_interp_method, cur_support_sz\n\n\ndef fw_ceil(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.ceil(x))\n    else:\n        return paddle.cast(x.ceil(), dtype='int64')\n\n\ndef fw_floor(x, fw):\n    if fw is numpy:\n        return fw.int_(fw.floor(x))\n    else:\n        return paddle.cast(x.floor(), dtype='int64')\n\n\ndef fw_cat(x, fw):\n    if fw is numpy:\n        return fw.concatenate(x)\n    else:\n        return fw.concat(x)\n\n\ndef fw_swapaxes(x, ax_1, ax_2, fw):\n    if fw is numpy:\n        return fw.swapaxes(x, ax_1, ax_2)\n    else:\n        if ax_1 == -1:\n            ax_1 = len(x.shape) - 1\n        if ax_2 == -1:\n            ax_2 = len(x.shape) - 1\n        perm0 = list(range(len(x.shape)))\n        temp = ax_1\n        perm0[temp] = ax_2\n        perm0[ax_2] = temp\n        return fw.transpose(x, perm0)\n\n\ndef fw_pad(x, fw, pad_sz, pad_mode, dim=0):\n    if pad_sz == (0, 0):\n        return x\n    if fw is numpy:\n        pad_vec = [(0, 0)] * x.ndim\n        pad_vec[dim] = pad_sz\n        return fw.pad(x, pad_width=pad_vec, mode=pad_mode)\n    else:\n        if x.ndim < 3:\n            x = x[None, None, ...]\n\n        pad_vec = [0] * ((x.ndim - 2) * 2)\n        pad_vec[0:2] = pad_sz\n        return fw_swapaxes(fw.nn.functional.pad(fw_swapaxes(x, dim, -1, fw), pad=pad_vec, mode=pad_mode), dim, -1, fw)\n\n\ndef fw_conv(input, filter, stride):\n    # we want to apply 1d conv to any nd array. the way to do it is to reshape\n    # the input to a 4D tensor. 
first two dims are singletons, the 3rd dim stores\n    # all the spatial dims that we are not convolving along now. then we can\n    # apply conv2d with a 1xK filter. this convolves all the other dims stored\n    # in the 3rd dim in the same way, like a depthwise conv over them.\n    # TODO: numpy support\n    reshaped_input = input.reshape(1, 1, -1, input.shape[-1])\n    reshaped_output = paddle.nn.functional.conv2d(reshaped_input, filter.view(1, 1, 1, -1), stride=(1, stride))\n    return reshaped_output.reshape(*input.shape[:-1], -1)\n\n\ndef fw_arange(upper_bound, fw, device):\n    if fw is numpy:\n        return fw.arange(upper_bound)\n    else:\n        return fw.arange(upper_bound)\n\n\ndef fw_empty(shape, fw, device):\n    if fw is numpy:\n        return fw.empty(shape)\n    else:\n        return fw.empty(shape=shape)\n
  },
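The `by_convs` path in `set_scale_and_out_sz` above snaps each scale factor to a nearby rational via `fractions.Fraction.limit_denominator`, falling back to a plain float (and disabling the conv path for that dim) when the approximation is outside the tolerance. A minimal standalone sketch of just that step (the `snap_scale` name and default values are illustrative, not part of the module):

```python
from fractions import Fraction

def snap_scale(sf, max_numerator=10, tolerance=1e-3):
    # limit_denominator bounds the denominator of 1/sf, which in turn
    # bounds the numerator of sf after inverting the fraction
    frac = Fraction(1 / sf).limit_denominator(max_numerator)
    frac = Fraction(numerator=frac.denominator, denominator=frac.numerator)
    if abs(frac - sf) < tolerance:
        return frac       # close rational: eligible for conv-based resize
    return float(sf)      # otherwise fall back to the gather-based path

snap_scale(0.5)      # Fraction(1, 2)
snap_scale(2 / 3)    # Fraction(2, 3)
```

A scale like `0.317` with a small `max_numerator` has no accurate rational approximation, so it stays a float and the dim would be resized without convs.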
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/README.md",
    "content": "# Diffusion model (Paddle)\nThis module implements a diffusion model that accepts a text prompt and outputs images semantically close to the text. The code is rewritten in Paddle and mainly refers to two projects: [jina-ai/discoart](https://github.com/jina-ai/discoart) and [openai/guided-diffusion](https://github.com/openai/guided-diffusion). Thanks for their wonderful work.\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/__init__.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/__init__.py\n'''\nimport os\nimport warnings\n\nos.environ['KMP_DUPLICATE_LIB_OK'] = 'TRUE'\n\n__all__ = ['create']\n\nimport sys\n\n__resources_path__ = os.path.join(\n    os.path.dirname(sys.modules.get(__package__).__file__ if __package__ in sys.modules else __file__),\n    'resources',\n)\n\nimport gc\n\n# check if GPU is available\nimport paddle\n\n# download and load models, this will take some time on the first load\n\nfrom .helper import load_all_models, load_diffusion_model, load_clip_models\n\nmodel_config, secondary_model = load_all_models('512x512_diffusion_uncond_finetune_008100', use_secondary_model=True)\n\nfrom typing import TYPE_CHECKING, overload, List, Optional\n\nif TYPE_CHECKING:\n    from docarray import DocumentArray, Document\n\n_clip_models_cache = {}\n\n# begin_create_overload\n\n\n@overload\ndef create(text_prompts: Optional[List[str]] = [\n    'A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.',\n    'yellow color scheme',\n],\n           init_image: Optional[str] = None,\n           width_height: Optional[List[int]] = [1280, 768],\n           skip_steps: Optional[int] = 10,\n           steps: Optional[int] = 250,\n           cut_ic_pow: Optional[int] = 1,\n           init_scale: Optional[int] = 1000,\n           clip_guidance_scale: Optional[int] = 5000,\n           tv_scale: Optional[int] = 0,\n           range_scale: Optional[int] = 150,\n           sat_scale: Optional[int] = 0,\n           cutn_batches: Optional[int] = 4,\n           diffusion_model: Optional[str] = '512x512_diffusion_uncond_finetune_008100',\n           use_secondary_model: Optional[bool] = True,\n           diffusion_sampling_mode: Optional[str] = 'ddim',\n           perlin_init: Optional[bool] = False,\n           perlin_mode: Optional[str] = 'mixed',\n           seed: 
Optional[int] = None,\n           eta: Optional[float] = 0.8,\n           clamp_grad: Optional[bool] = True,\n           clamp_max: Optional[float] = 0.05,\n           randomize_class: Optional[bool] = True,\n           clip_denoised: Optional[bool] = False,\n           fuzzy_prompt: Optional[bool] = False,\n           rand_mag: Optional[float] = 0.05,\n           cut_overview: Optional[str] = '[12]*400+[4]*600',\n           cut_innercut: Optional[str] = '[4]*400+[12]*600',\n           cut_icgray_p: Optional[str] = '[0.2]*400+[0]*600',\n           display_rate: Optional[int] = 10,\n           n_batches: Optional[int] = 4,\n           batch_size: Optional[int] = 1,\n           batch_name: Optional[str] = '',\n           clip_models: Optional[list] = ['ViTB32', 'ViTB16', 'RN50'],\n           output_dir: Optional[str] = 'discoart_output') -> 'DocumentArray':\n    \"\"\"\n    Create Disco Diffusion artworks and save the result into a DocumentArray.\n\n    :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.\n    :param init_image: Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here. If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\n    :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n    :param skip_steps: Consider the chart shown here.  Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  
Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image. However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n    :param steps: When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n    :param cut_ic_pow: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   
Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param init_scale: This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\n    :param clip_guidance_scale: CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS. Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500. Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\n    :param tv_scale: Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\n    :param range_scale: Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n    :param sat_scale: Saturation scale. Optional, set to zero to turn off.  
If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\n    :param cutn_batches: Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage. At the default settings, DD is scheduled to do 16 cuts per timestep.  If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep. However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n    :param diffusion_model: Diffusion_model of choice.\n    :param use_secondary_model: Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  
I suggest you experiment with this.\n    :param diffusion_sampling_mode: Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n    :param perlin_init: Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps). Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n    :param perlin_mode: sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\n    :param seed: Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  
This is useful if you like a particular result and would like to run more iterations that will be similar. After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\n    :param eta: eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results. The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\n    :param clamp_grad: As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\n    :param clamp_max: Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n    :param fuzzy_prompt: Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\n    :param rand_mag: Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n    :param cut_overview: The schedule of overview cuts\n    :param cut_innercut: The schedule of inner cuts\n    :param cut_icgray_p: This sets the size of the border used for inner cuts.  
High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n    :param display_rate: During a diffusion run, you can monitor the progress of each image being created with this variable.  If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\n    :param n_batches: This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\n    :param batch_name: The name of the batch, the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks be overridden by other users, please use a unique name.\n    :param clip_models: CLIP Model selectors. ViTB32, ViTB16, ViTL14, RN101, RN50, RN50x4, RN50x16, RN50x64.These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.  You can mix in multiple models as well for different results.  
However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.The rough order of speed/mem usage is (smallest/fastest to largest/slowest):VitB32RN50RN101VitB16RN50x4RN50x16RN50x64ViTL14For RN50x64 & ViTL14 you may need to use fewer cuts, depending on your VRAM.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\n# end_create_overload\n\n\n@overload\ndef create(init_document: 'Document') -> 'DocumentArray':\n    \"\"\"\n    Create an artwork using a DocArray ``Document`` object as initial state.\n    :param init_document: its ``.tags`` will be used as parameters, ``.uri`` (if present) will be used as init image.\n    :return: a DocumentArray object that has `n_batches` Documents\n    \"\"\"\n\n\ndef create(**kwargs) -> 'DocumentArray':\n    from .config import load_config\n    from .runner import do_run\n\n    if 'init_document' in kwargs:\n        d = kwargs['init_document']\n        _kwargs = d.tags\n        if not _kwargs:\n            warnings.warn('init_document has no .tags, fallback to default config')\n        if d.uri:\n            _kwargs['init_image'] = kwargs['init_document'].uri\n        else:\n            warnings.warn('init_document has no .uri, fallback to no init image')\n        kwargs.pop('init_document')\n        if kwargs:\n            warnings.warn('init_document has .tags and .uri, but kwargs are also present, will override .tags')\n            _kwargs.update(kwargs)\n        _args = load_config(user_config=_kwargs)\n    else:\n        _args = load_config(user_config=kwargs)\n\n    model, diffusion = load_diffusion_model(model_config, _args.diffusion_model, steps=_args.steps)\n\n    clip_models = load_clip_models(enabled=_args.clip_models, clip_models=_clip_models_cache)\n\n    gc.collect()\n    paddle.device.cuda.empty_cache()\n    try:\n        return do_run(_args, (model, diffusion, clip_models, secondary_model))\n    
except KeyboardInterrupt:\n        pass\n"
  },
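The cut-schedule arguments documented above (`cut_overview`, `cut_innercut`, `cut_icgray_p`) are strings such as `'[12]*400+[4]*600'`. Assuming the usual Disco Diffusion convention, such a string is a Python list expression that evaluates to one value per timestep over a 1000-step schedule; a hypothetical helper to read it:

```python
def cuts_at_step(schedule: str, step: int) -> int:
    # The schedule string is a Python list expression: here, 12 overview
    # cuts for the first 400 steps, then 4 for the remaining 600.
    values = eval(schedule)  # trusted config string, not user input
    return values[step]

overview = '[12]*400+[4]*600'
cuts_at_step(overview, 0)    # 12
cuts_at_step(overview, 500)  # 4
```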
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/config.py",
    "content": "'''\nhttps://github.com/jina-ai/discoart/blob/main/discoart/config.py\n'''\nimport copy\nimport random\nimport warnings\nfrom types import SimpleNamespace\nfrom typing import Dict\n\nimport yaml\nfrom yaml import Loader\n\nfrom . import __resources_path__\n\nwith open(f'{__resources_path__}/default.yml') as ymlfile:\n    default_args = yaml.load(ymlfile, Loader=Loader)\n\n\ndef load_config(user_config: Dict, ):\n    cfg = copy.deepcopy(default_args)\n\n    if user_config:\n        cfg.update(**user_config)\n\n    for k in user_config.keys():\n        if k not in cfg:\n            warnings.warn(f'unknown argument {k}, ignored')\n\n    for k, v in cfg.items():\n        if k in ('batch_size', 'display_rate', 'seed', 'skip_steps', 'steps', 'n_batches',\n                 'cutn_batches') and isinstance(v, float):\n            cfg[k] = int(v)\n        if k == 'width_height':\n            cfg[k] = [int(vv) for vv in v]\n\n    cfg.update(**{\n        'seed': cfg['seed'] or random.randint(0, 2**32),\n    })\n\n    if cfg['batch_name']:\n        da_name = f'{__package__}-{cfg[\"batch_name\"]}-{cfg[\"seed\"]}'\n    else:\n        da_name = f'{__package__}-{cfg[\"seed\"]}'\n        warnings.warn('you did not set `batch_name`, set it to have unique session ID')\n\n    cfg.update(**{'name_docarray': da_name})\n\n    print_args_table(cfg)\n\n    return SimpleNamespace(**cfg)\n\n\ndef print_args_table(cfg):\n    from rich.table import Table\n    from rich import box\n    from rich.console import Console\n\n    console = Console()\n\n    param_str = Table(\n        title=cfg['name_docarray'],\n        box=box.ROUNDED,\n        highlight=True,\n        title_justify='left',\n    )\n    param_str.add_column('Argument', justify='right')\n    param_str.add_column('Value', justify='left')\n\n    for k, v in sorted(cfg.items()):\n        value = str(v)\n\n        if not default_args.get(k, None) == v:\n            value = f'[b]{value}[/]'\n\n        param_str.add_row(k, 
value)\n\n    console.print(param_str)\n"
  },
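`load_config` above merges the user dict over deep-copied defaults, warns about unknown keys, coerces float step counts to ints, and fills in a random seed when none is set. A self-contained sketch of that flow (`DEFAULTS` here is an illustrative subset, not the real `default.yml`):

```python
import copy
import random
import warnings

DEFAULTS = {'steps': 250, 'seed': None, 'batch_name': ''}  # illustrative subset

def load_config_sketch(user_config):
    # defaults are deep-copied so repeated calls never share state,
    # then user values override them
    cfg = copy.deepcopy(DEFAULTS)
    if user_config:
        cfg.update(**user_config)
    for k in user_config:
        if k not in DEFAULTS:
            warnings.warn(f'unknown argument {k}, ignored')
    # float step counts are coerced to int, and a missing seed is randomized
    if isinstance(cfg['steps'], float):
        cfg['steps'] = int(cfg['steps'])
    cfg['seed'] = cfg['seed'] or random.randint(0, 2**32)
    return cfg

cfg = load_config_sketch({'steps': 150.0})
```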
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/helper.py",
    "content": "'''\nThis code is rewritten in Paddle based on jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/helper.py\n'''\nimport hashlib\nimport logging\nimport os\nimport subprocess\nimport sys\nfrom os.path import expanduser\nfrom pathlib import Path\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\n\nimport paddle\n\n\ndef _get_logger():\n    logger = logging.getLogger(__package__)\n    _log_level = os.environ.get('DISCOART_LOG_LEVEL', 'INFO')\n    logger.setLevel(_log_level)\n    ch = logging.StreamHandler()\n    ch.setLevel(_log_level)\n    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')\n    ch.setFormatter(formatter)\n    logger.addHandler(ch)\n    return logger\n\n\nlogger = _get_logger()\n\n\ndef load_clip_models(enabled: List[str], clip_models: Dict[str, Any] = {}):\n\n    import disco_diffusion_ernievil_base.vit_b_16x.ernievil2 as ernievil2\n    from disco_diffusion_ernievil_base.vit_b_16x.ernievil2.utils.utils import build_model\n\n    # load enabled models\n    for k in enabled:\n        if k not in clip_models:\n            clip_models[k] = build_model(name=k)\n            clip_models[k].eval()\n            for parameter in clip_models[k].parameters():\n                parameter.stop_gradient = True\n\n    # drop models that are not enabled to save memory; iterate over a copy\n    # of the keys since the dict is mutated inside the loop\n    for k in list(clip_models):\n        if k not in enabled:\n            clip_models.pop(k)\n\n    return list(clip_models.values())\n\n\ndef load_all_models(diffusion_model, use_secondary_model):\n    from .model.script_util import (\n        model_and_diffusion_defaults, )\n\n    model_config = model_and_diffusion_defaults()\n\n    if diffusion_model == '512x512_diffusion_uncond_finetune_008100':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            
'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 512,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n    elif diffusion_model == '256x256_diffusion_uncond':\n        model_config.update({\n            'attention_resolutions': '32, 16, 8',\n            'class_cond': False,\n            'diffusion_steps': 1000,  # No need to edit this, it is taken care of later.\n            'rescale_timesteps': True,\n            'timestep_respacing': 250,  # No need to edit this, it is taken care of later.\n            'image_size': 256,\n            'learn_sigma': True,\n            'noise_schedule': 'linear',\n            'num_channels': 256,\n            'num_head_channels': 64,\n            'num_res_blocks': 2,\n            'resblock_updown': True,\n            'use_fp16': False,\n            'use_scale_shift_norm': True,\n        })\n\n    secondary_model = None\n    if use_secondary_model:\n        from .model.sec_diff import SecondaryDiffusionImageNet2\n        secondary_model = SecondaryDiffusionImageNet2()\n        model_dict = paddle.load(\n            os.path.join(os.path.dirname(__file__), 'pre_trained', 'secondary_model_imagenet_2.pdparams'))\n        secondary_model.set_state_dict(model_dict)\n        secondary_model.eval()\n        for parameter in secondary_model.parameters():\n            parameter.stop_gradient = True\n\n    return model_config, secondary_model\n\n\ndef load_diffusion_model(model_config, diffusion_model, steps):\n    from .model.script_util import (\n        create_model_and_diffusion, )\n\n    timestep_respacing = f'ddim{steps}'\n    diffusion_steps = (1000 // steps) * steps if steps < 1000 else 
steps\n    model_config.update({\n        'timestep_respacing': timestep_respacing,\n        'diffusion_steps': diffusion_steps,\n    })\n\n    model, diffusion = create_model_and_diffusion(**model_config)\n    model.set_state_dict(\n        paddle.load(os.path.join(os.path.dirname(__file__), 'pre_trained', f'{diffusion_model}.pdparams')))\n    model.eval()\n    for name, param in model.named_parameters():\n        param.stop_gradient = True\n\n    return model, diffusion\n\n\ndef parse_prompt(prompt):\n    if prompt.startswith('http://') or prompt.startswith('https://'):\n        vals = prompt.rsplit(':', 2)\n        vals = [vals[0] + ':' + vals[1], *vals[2:]]\n    else:\n        vals = prompt.rsplit(':', 1)\n    vals = vals + ['', '1'][len(vals):]\n    return vals[0], float(vals[1])\n"
  },
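`parse_prompt` above splits a `text:weight` prompt on the last colon while keeping the `http(s):` colon of URL prompts intact, defaulting the weight to 1. A standalone copy to illustrate the behavior:

```python
def parse_prompt(prompt):
    # URL prompts contain a colon in the scheme, so split off at most the
    # last colon and glue the scheme back together
    if prompt.startswith('http://') or prompt.startswith('https://'):
        vals = prompt.rsplit(':', 2)
        vals = [vals[0] + ':' + vals[1], *vals[2:]]
    else:
        vals = prompt.rsplit(':', 1)
    vals = vals + ['', '1'][len(vals):]  # pad with the default weight '1'
    return vals[0], float(vals[1])

parse_prompt('a lighthouse:2')               # ('a lighthouse', 2.0)
parse_prompt('https://example.com/cat.png')  # ('https://example.com/cat.png', 1.0)
```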
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/__init__.py",
    "content": "\"\"\"\nCodebase for \"Improved Denoising Diffusion Probabilistic Models\" implemented by Paddle.\n\"\"\"\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/gaussian_diffusion.py",
    "content": "\"\"\"\nDiffusion model implemented in Paddle.\nThis code is rewritten based on the PyTorch version of Ho et al's diffusion models:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py\n\"\"\"\nimport enum\nimport math\n\nimport numpy as np\nimport paddle\n\nfrom .losses import discretized_gaussian_log_likelihood\nfrom .losses import normal_kl\nfrom .nn import mean_flat\n\n\ndef get_named_beta_schedule(schedule_name, num_diffusion_timesteps):\n    \"\"\"\n    Get a pre-defined beta schedule for the given name.\n\n    The beta schedule library consists of beta schedules which remain similar\n    in the limit of num_diffusion_timesteps.\n    Beta schedules may be added, but should not be removed or changed once\n    they are committed to maintain backwards compatibility.\n    \"\"\"\n    if schedule_name == \"linear\":\n        # Linear schedule from Ho et al, extended to work for any number of\n        # diffusion steps.\n        scale = 1000 / num_diffusion_timesteps\n        beta_start = scale * 0.0001\n        beta_end = scale * 0.02\n        return np.linspace(beta_start, beta_end, num_diffusion_timesteps, dtype=np.float64)\n    elif schedule_name == \"cosine\":\n        return betas_for_alpha_bar(\n            num_diffusion_timesteps,\n            lambda t: math.cos((t + 0.008) / 1.008 * math.pi / 2)**2,\n        )\n    else:\n        raise NotImplementedError(f\"unknown beta schedule: {schedule_name}\")\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function,\n    which defines the cumulative product of (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that\n         
             part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas)\n\n\nclass ModelMeanType(enum.Enum):\n    \"\"\"\n    Which type of output the model predicts.\n    \"\"\"\n\n    PREVIOUS_X = enum.auto()  # the model predicts x_{t-1}\n    START_X = enum.auto()  # the model predicts x_0\n    EPSILON = enum.auto()  # the model predicts epsilon\n\n\nclass ModelVarType(enum.Enum):\n    \"\"\"\n    What is used as the model's output variance.\n\n    The LEARNED_RANGE option has been added to allow the model to predict\n    values between FIXED_SMALL and FIXED_LARGE, making its job easier.\n    \"\"\"\n\n    LEARNED = enum.auto()\n    FIXED_SMALL = enum.auto()\n    FIXED_LARGE = enum.auto()\n    LEARNED_RANGE = enum.auto()\n\n\nclass LossType(enum.Enum):\n    MSE = enum.auto()  # use raw MSE loss (and KL when learning variances)\n    RESCALED_MSE = (enum.auto())  # use raw MSE loss (with RESCALED_KL when learning variances)\n    KL = enum.auto()  # use the variational lower-bound\n    RESCALED_KL = enum.auto()  # like KL, but rescale to estimate the full VLB\n\n    def is_vb(self):\n        return self == LossType.KL or self == LossType.RESCALED_KL\n\n\nclass GaussianDiffusion:\n    \"\"\"\n    Utilities for training and sampling diffusion models.\n\n    Ported directly from here, and then adapted over time to further experimentation.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42\n\n    :param betas: a 1-D numpy array of betas for each diffusion timestep,\n                  starting at T and going to 1.\n    :param model_mean_type: 
a ModelMeanType determining what the model outputs.\n    :param model_var_type: a ModelVarType determining how variance is output.\n    :param loss_type: a LossType determining the loss function to use.\n    :param rescale_timesteps: if True, pass floating point timesteps into the\n                              model so that they are always scaled like in the\n                              original paper (0 to 1000).\n    \"\"\"\n\n    def __init__(\n        self,\n        *,\n        betas,\n        model_mean_type,\n        model_var_type,\n        loss_type,\n        rescale_timesteps=False,\n    ):\n        self.model_mean_type = model_mean_type\n        self.model_var_type = model_var_type\n        self.loss_type = loss_type\n        self.rescale_timesteps = rescale_timesteps\n\n        # Use float64 for accuracy.\n        betas = np.array(betas, dtype=np.float64)\n        self.betas = betas\n        assert len(betas.shape) == 1, \"betas must be 1-D\"\n        assert (betas > 0).all() and (betas <= 1).all()\n\n        self.num_timesteps = int(betas.shape[0])\n\n        alphas = 1.0 - betas\n        self.alphas_cumprod = np.cumprod(alphas, axis=0)\n        self.alphas_cumprod_prev = np.append(1.0, self.alphas_cumprod[:-1])\n        self.alphas_cumprod_next = np.append(self.alphas_cumprod[1:], 0.0)\n        assert self.alphas_cumprod_prev.shape == (self.num_timesteps, )\n\n        # calculations for diffusion q(x_t | x_{t-1}) and others\n        self.sqrt_alphas_cumprod = np.sqrt(self.alphas_cumprod)\n        self.sqrt_one_minus_alphas_cumprod = np.sqrt(1.0 - self.alphas_cumprod)\n        self.log_one_minus_alphas_cumprod = np.log(1.0 - self.alphas_cumprod)\n        self.sqrt_recip_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod)\n        self.sqrt_recipm1_alphas_cumprod = np.sqrt(1.0 / self.alphas_cumprod - 1)\n\n        # calculations for posterior q(x_{t-1} | x_t, x_0)\n        self.posterior_variance = (betas * (1.0 - self.alphas_cumprod_prev) / (1.0 - 
self.alphas_cumprod))\n        # log calculation clipped because the posterior variance is 0 at the\n        # beginning of the diffusion chain.\n        self.posterior_log_variance_clipped = np.log(np.append(self.posterior_variance[1], self.posterior_variance[1:]))\n        self.posterior_mean_coef1 = (betas * np.sqrt(self.alphas_cumprod_prev) / (1.0 - self.alphas_cumprod))\n        self.posterior_mean_coef2 = ((1.0 - self.alphas_cumprod_prev) * np.sqrt(alphas) / (1.0 - self.alphas_cumprod))\n\n    def q_mean_variance(self, x_start, t):\n        \"\"\"\n        Get the distribution q(x_t | x_0).\n\n        :param x_start: the [N x C x ...] tensor of noiseless inputs.\n        :param t: the number of diffusion steps (minus 1). Here, 0 means one step.\n        :return: A tuple (mean, variance, log_variance), all of x_start's shape.\n        \"\"\"\n        mean = (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start)\n        variance = _extract_into_tensor(1.0 - self.alphas_cumprod, t, x_start.shape)\n        log_variance = _extract_into_tensor(self.log_one_minus_alphas_cumprod, t, x_start.shape)\n        return mean, variance, log_variance\n\n    def q_sample(self, x_start, t, noise=None):\n        \"\"\"\n        Diffuse the data for a given number of diffusion steps.\n\n        In other words, sample from q(x_t | x_0).\n\n        :param x_start: the initial data batch.\n        :param t: the number of diffusion steps (minus 1). 
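q_mean_variance and q_sample rely on the closed-form identity x_t = sqrt(alphas_cumprod_t) * x_0 + sqrt(1 - alphas_cumprod_t) * noise. A minimal standalone NumPy sketch (hypothetical check, not part of this module) confirming that the two coefficients satisfy a^2 + b^2 = 1 for every t, so the forward process preserves unit variance for unit-variance data:

```python
import numpy as np

# Rebuild the linear schedule exactly as get_named_beta_schedule("linear", 1000) does.
T = 1000
scale = 1000 / T
betas = np.linspace(scale * 0.0001, scale * 0.02, T, dtype=np.float64)

alphas_cumprod = np.cumprod(1.0 - betas)
sqrt_ac = np.sqrt(alphas_cumprod)            # coefficient of x_0 in q_sample
sqrt_omac = np.sqrt(1.0 - alphas_cumprod)    # coefficient of the noise

# Betas are valid probabilities-of-corruption and the coefficients are
# a point on the unit circle at every timestep.
assert (betas > 0).all() and (betas <= 1).all()
assert np.allclose(sqrt_ac**2 + sqrt_omac**2, 1.0)
```

This is the same invariant the constructor's precomputed `sqrt_alphas_cumprod` and `sqrt_one_minus_alphas_cumprod` arrays encode.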
Here, 0 means one step.\n        :param noise: if specified, the split-out normal noise.\n        :return: A noisy version of x_start.\n        \"\"\"\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        assert noise.shape == x_start.shape\n        return (_extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +\n                _extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)\n\n    def q_posterior_mean_variance(self, x_start, x_t, t):\n        \"\"\"\n        Compute the mean and variance of the diffusion posterior:\n\n            q(x_{t-1} | x_t, x_0)\n\n        \"\"\"\n        assert x_start.shape == x_t.shape\n        posterior_mean = (_extract_into_tensor(self.posterior_mean_coef1, t, x_t.shape) * x_start +\n                          _extract_into_tensor(self.posterior_mean_coef2, t, x_t.shape) * x_t)\n        posterior_variance = _extract_into_tensor(self.posterior_variance, t, x_t.shape)\n        posterior_log_variance_clipped = _extract_into_tensor(self.posterior_log_variance_clipped, t, x_t.shape)\n        assert (posterior_mean.shape[0] == posterior_variance.shape[0] == posterior_log_variance_clipped.shape[0] ==\n                x_start.shape[0])\n        return posterior_mean, posterior_variance, posterior_log_variance_clipped\n\n    def p_mean_variance(self, model, x, t, clip_denoised=True, denoised_fn=None, model_kwargs=None):\n        \"\"\"\n        Apply the model to get p(x_{t-1} | x_t), as well as a prediction of\n        the initial x, x_0.\n\n        :param model: the model, which takes a signal and a batch of timesteps\n                      as input.\n        :param x: the [N x C x ...] 
tensor at time t.\n        :param t: a 1-D Tensor of timesteps.\n        :param clip_denoised: if True, clip the denoised signal into [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample. Applies before\n            clip_denoised.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :return: a dict with the following keys:\n                 - 'mean': the model mean output.\n                 - 'variance': the model variance output.\n                 - 'log_variance': the log of 'variance'.\n                 - 'pred_xstart': the prediction for x_0.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n\n        B, C = x.shape[:2]\n        assert t.shape == [B]\n        model_output = model(x, self._scale_timesteps(t), **model_kwargs)\n\n        if self.model_var_type in [ModelVarType.LEARNED, ModelVarType.LEARNED_RANGE]:\n            assert model_output.shape == [B, C * 2, *x.shape[2:]]\n            model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n            if self.model_var_type == ModelVarType.LEARNED:\n                model_log_variance = model_var_values\n                model_variance = paddle.exp(model_log_variance)\n            else:\n                min_log = _extract_into_tensor(self.posterior_log_variance_clipped, t, x.shape)\n                max_log = _extract_into_tensor(np.log(self.betas), t, x.shape)\n                # The model_var_values is [-1, 1] for [min_var, max_var].\n                frac = (model_var_values + 1) / 2\n                model_log_variance = frac * max_log + (1 - frac) * min_log\n                model_variance = paddle.exp(model_log_variance)\n        else:\n            model_variance, model_log_variance = {\n                # for fixedlarge, we set the initial (log-)variance like so\n        
        # to get a better decoder log likelihood.\n                ModelVarType.FIXED_LARGE: (\n                    np.append(self.posterior_variance[1], self.betas[1:]),\n                    np.log(np.append(self.posterior_variance[1], self.betas[1:])),\n                ),\n                ModelVarType.FIXED_SMALL: (\n                    self.posterior_variance,\n                    self.posterior_log_variance_clipped,\n                ),\n            }[self.model_var_type]\n            model_variance = _extract_into_tensor(model_variance, t, x.shape)\n            model_log_variance = _extract_into_tensor(model_log_variance, t, x.shape)\n\n        def process_xstart(x):\n            if denoised_fn is not None:\n                x = denoised_fn(x)\n            if clip_denoised:\n                return x.clamp(-1, 1)\n            return x\n\n        if self.model_mean_type == ModelMeanType.PREVIOUS_X:\n            pred_xstart = process_xstart(self._predict_xstart_from_xprev(x_t=x, t=t, xprev=model_output))\n            model_mean = model_output\n        elif self.model_mean_type in [ModelMeanType.START_X, ModelMeanType.EPSILON]:\n            if self.model_mean_type == ModelMeanType.START_X:\n                pred_xstart = process_xstart(model_output)\n            else:\n                pred_xstart = process_xstart(self._predict_xstart_from_eps(x_t=x, t=t, eps=model_output))\n            model_mean, _, _ = self.q_posterior_mean_variance(x_start=pred_xstart, x_t=x, t=t)\n        else:\n            raise NotImplementedError(self.model_mean_type)\n\n        assert (model_mean.shape == model_log_variance.shape == pred_xstart.shape == x.shape)\n        return {\n            \"mean\": model_mean,\n            \"variance\": model_variance,\n            \"log_variance\": model_log_variance,\n            \"pred_xstart\": pred_xstart,\n        }\n\n    def _predict_xstart_from_eps(self, x_t, t, eps):\n        assert x_t.shape == eps.shape\n        return 
(_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape) * eps)\n\n    def _predict_xstart_from_xprev(self, x_t, t, xprev):\n        assert x_t.shape == xprev.shape\n        return (  # (xprev - coef2*x_t) / coef1\n            _extract_into_tensor(1.0 / self.posterior_mean_coef1, t, x_t.shape) * xprev -\n            _extract_into_tensor(self.posterior_mean_coef2 / self.posterior_mean_coef1, t, x_t.shape) * x_t)\n\n    def _predict_eps_from_xstart(self, x_t, t, pred_xstart):\n        return (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x_t.shape) * x_t -\n                pred_xstart) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x_t.shape)\n\n    def _scale_timesteps(self, t):\n        if self.rescale_timesteps:\n            return paddle.cast((t), 'float32') * (1000.0 / self.num_timesteps)\n        return t\n\n    def condition_mean(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_mean_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute the mean for the previous step, given a function cond_fn that\n        computes the gradient of a conditional log probability with respect to\n        x. 
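_predict_xstart_from_eps and _predict_eps_from_xstart are algebraic inverses of the q_sample identity: given any two of (x_t, x_0, eps), the third is determined. A small standalone NumPy round-trip check (hypothetical sketch, mirroring the module's formulas but not importing it):

```python
import numpy as np

T = 1000
betas = np.linspace(0.0001, 0.02, T, dtype=np.float64)
alphas_cumprod = np.cumprod(1.0 - betas)

rng = np.random.default_rng(0)
t = 500
x0 = rng.standard_normal(4)
eps = rng.standard_normal(4)

# Forward (q_sample): x_t = sqrt(abar) * x_0 + sqrt(1 - abar) * eps
xt = np.sqrt(alphas_cumprod[t]) * x0 + np.sqrt(1 - alphas_cumprod[t]) * eps

# _predict_xstart_from_eps: recover x_0 from (x_t, eps)
sqrt_recip = np.sqrt(1.0 / alphas_cumprod[t])
sqrt_recipm1 = np.sqrt(1.0 / alphas_cumprod[t] - 1)
x0_hat = sqrt_recip * xt - sqrt_recipm1 * eps
assert np.allclose(x0_hat, x0)

# _predict_eps_from_xstart: recover eps from (x_t, x_0)
eps_hat = (sqrt_recip * xt - x0) / sqrt_recipm1
assert np.allclose(eps_hat, eps)
```

The round trip is exact up to floating-point error, which is why p_mean_variance can accept models that predict either x_0 or epsilon and convert freely between them.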
In particular, cond_fn computes grad(log(p(y|x))), and we want to\n        condition on y.\n\n        This uses the conditioning strategy from Sohl-Dickstein et al. (2015).\n        \"\"\"\n        gradient = cond_fn(x, t, p_mean_var, **model_kwargs)\n        new_mean = (paddle.cast((p_mean_var[\"mean\"]), 'float32') + p_mean_var[\"variance\"] * paddle.cast(\n            (gradient), 'float32'))\n        return new_mean\n\n    def condition_score(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, self._scale_timesteps(t), **model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def condition_score_with_grad(self, cond_fn, p_mean_var, x, t, model_kwargs=None):\n        \"\"\"\n        Compute what the p_mean_variance output would have been, should the\n        model's score function be conditioned by cond_fn.\n\n        See condition_mean() for details on cond_fn.\n\n        Unlike condition_mean(), this instead uses the conditioning strategy\n        from Song et al (2020).\n        \"\"\"\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n\n        eps = self._predict_eps_from_xstart(x, t, p_mean_var[\"pred_xstart\"])\n        eps = eps - (1 - alpha_bar).sqrt() * cond_fn(x, t, p_mean_var, 
**model_kwargs)\n\n        out = p_mean_var.copy()\n        out[\"pred_xstart\"] = self._predict_xstart_from_eps(x, t, eps)\n        out[\"mean\"], _, _ = self.q_posterior_mean_variance(x_start=out[\"pred_xstart\"], x_t=x, t=t)\n        return out\n\n    def p_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def p_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model at the given timestep.\n\n        :param model: the model to sample from.\n        :param x: the current tensor at x_{t-1}.\n        :param t: the value of t, starting at 0 for the first diffusion step.\n        :param clip_denoised: if True, clip the x_start prediction to [-1, 1].\n        :param denoised_fn: if not None, a function which applies to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
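The `nonzero_mask` trick in p_sample deserves a note: reshaping `(t != 0)` to `[-1, 1, 1, ...]` broadcasts a per-sample scalar over all channel/spatial dims, so batch elements at t == 0 receive their mean with no noise added. A standalone NumPy sketch of just that broadcast (hypothetical, not part of this module):

```python
import numpy as np

# Batch of 3 images, (N, C, H, W), with per-sample timesteps 0, 1, 2.
x = np.ones((3, 2, 4, 4))
t = np.array([0, 1, 2])

# Same reshape as p_sample: float mask of shape (N, 1, 1, 1).
nonzero_mask = (t != 0).astype('float32').reshape([-1] + [1] * (x.ndim - 1))
assert nonzero_mask.shape == (3, 1, 1, 1)

noise = np.random.default_rng(0).standard_normal(x.shape)
sample = x + nonzero_mask * noise

# Sample 0 (t == 0) is left untouched; the others are perturbed.
assert np.allclose(sample[0], x[0])
assert not np.allclose(sample[1], x[1])
```

The same mask appears verbatim in ddim_sample and plms_sample, for the same reason: the final denoising step must be deterministic.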
This can be used for conditioning.\n        :return: a dict containing the following keys:\n                 - 'sample': a random sample from the model.\n                 - 'pred_xstart': a prediction of x_0.\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        if cond_fn is not None:\n            out[\"mean\"] = self.condition_mean_with_grad(cond_fn, out, x, t, model_kwargs=model_kwargs)\n        sample = out[\"mean\"] + nonzero_mask * paddle.exp(0.5 * out[\"log_variance\"]) * noise\n        return {\"sample\": sample, \"pred_xstart\": out[\"pred_xstart\"].detach()}\n\n    def p_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model.\n\n        :param model: the model module.\n        :param shape: the shape of the samples, (N, C, H, W).\n        :param noise: if specified, the noise from the encoder to sample.\n                      Should be of the same shape as `shape`.\n        :param clip_denoised: if True, clip x_start predictions to [-1, 1].\n        :param denoised_fn: if not None, a function which applies 
to the\n            x_start prediction before it is used to sample.\n        :param cond_fn: if not None, this is a gradient function that acts\n                        similarly to the model.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param device: if specified, the device to create the samples on.\n                       If not specified, use a model parameter's device.\n        :param progress: if True, show a tqdm progress bar.\n        :return: a non-differentiable batch of samples.\n        \"\"\"\n        final = None\n        for sample in self.p_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def p_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model and yield intermediate samples from\n        each timestep of diffusion.\n\n        Arguments are the same as p_sample_loop().\n        Returns a generator over dicts, where each dict is the return value of\n        p_sample().\n        \"\"\"\n        if device is None:\n     
       device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            sample_fn = self.p_sample_with_grad if cond_fn_with_grad else self.p_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def ddim_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n      
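ddim_sample interpolates between a deterministic ODE step (eta = 0) and DDPM-like stochastic sampling (eta = 1) through its sigma term. A standalone NumPy sketch of that sigma expression (hypothetical check, same formula as the code):

```python
import numpy as np

T = 1000
betas = np.linspace(0.0001, 0.02, T, dtype=np.float64)
alphas_cumprod = np.cumprod(1.0 - betas)
alphas_cumprod_prev = np.append(1.0, alphas_cumprod[:-1])

t = 500
alpha_bar = alphas_cumprod[t]
alpha_bar_prev = alphas_cumprod_prev[t]

def ddim_sigma(eta):
    # Same expression as in ddim_sample (Eq. 16 of Song et al., 2020).
    return (eta * np.sqrt((1 - alpha_bar_prev) / (1 - alpha_bar))
            * np.sqrt(1 - alpha_bar / alpha_bar_prev))

assert ddim_sigma(0.0) == 0.0   # eta = 0: fully deterministic DDIM step
assert ddim_sigma(1.0) > 0.0    # eta = 1: noise is re-injected each step
```

With sigma = 0 the `nonzero_mask * sigma * noise` term vanishes, so repeated DDIM sampling from the same noise is reproducible; this is what makes `ddim_reverse_sample` (which asserts eta == 0) a valid deterministic encoder.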
  if cond_fn is not None:\n            out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"]}\n\n    def ddim_sample_with_grad(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using DDIM.\n\n        Same usage as p_sample().\n        \"\"\"\n        # with th.enable_grad():\n        # x = x.detach().requires_grad_()\n        x = x.detach()\n        # x.stop_gradient = False\n        out_orig = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        if cond_fn is not None:\n            out 
= self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n        else:\n            out = out_orig\n\n        out[\"pred_xstart\"] = out[\"pred_xstart\"].detach()\n\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n        eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        sigma = (eta * paddle.sqrt(\n            (1 - alpha_bar_prev) / (1 - alpha_bar)) * paddle.sqrt(1 - alpha_bar / alpha_bar_prev))\n        # Equation 12.\n        # noise = th.randn_like(x)\n        noise = paddle.randn(x.shape, x.dtype)\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) +\n                     paddle.sqrt(1 - alpha_bar_prev - sigma**2) * eps)\n        nonzero_mask = (paddle.cast((t != 0), 'float32').reshape([-1,\n                                                                  *([1] * (len(x.shape) - 1))]))  # no noise when t == 0\n        sample = mean_pred + nonzero_mask * sigma * noise\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"].detach()}\n\n    def ddim_reverse_sample(\n        self,\n        model,\n        x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        model_kwargs=None,\n        eta=0.0,\n    ):\n        \"\"\"\n        Sample x_{t+1} from the model using DDIM reverse ODE.\n        \"\"\"\n        assert eta == 0.0, \"Reverse ODE only for deterministic path\"\n        out = self.p_mean_variance(\n            model,\n            x,\n            t,\n            clip_denoised=clip_denoised,\n            denoised_fn=denoised_fn,\n            model_kwargs=model_kwargs,\n        )\n        # Usually our model outputs epsilon, but we re-derive it\n        # in case we used x_start or x_prev prediction.\n      
  eps = (_extract_into_tensor(self.sqrt_recip_alphas_cumprod, t, x.shape) * x -\n               out[\"pred_xstart\"]) / _extract_into_tensor(self.sqrt_recipm1_alphas_cumprod, t, x.shape)\n        alpha_bar_next = _extract_into_tensor(self.alphas_cumprod_next, t, x.shape)\n\n        # Equation 12. reversed\n        mean_pred = (out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_next) + paddle.sqrt(1 - alpha_bar_next) * eps)\n\n        return {\"sample\": mean_pred, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def ddim_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Generate samples from the model using DDIM.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.ddim_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                eta=eta,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def ddim_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        eta=0.0,\n        
skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n    ):\n        \"\"\"\n        Use DDIM to sample from the model and yield intermediate samples from\n        each timestep of DDIM.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        # if device is None:\n        #     device = next(model.parameters()).device\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0])\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(\n                    low=0,\n                    high=model.num_classes,\n                    shape=model_kwargs['y'].shape,\n                )\n            sample_fn = self.ddim_sample_with_grad if cond_fn_with_grad else self.ddim_sample\n            out = sample_fn(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                eta=eta,\n            )\n            yield out\n            img = out[\"sample\"]\n\n    def plms_sample(\n        self,\n        model,\n        
x,\n        t,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        cond_fn_with_grad=False,\n        order=2,\n        old_out=None,\n    ):\n        \"\"\"\n        Sample x_{t-1} from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample().\n        \"\"\"\n        if not int(order) or not 1 <= order <= 4:\n            raise ValueError('order is invalid (should be int from 1-4).')\n\n        def get_model_output(x, t):\n            with paddle.set_grad_enabled(cond_fn_with_grad and cond_fn is not None):\n                x = x.detach().requires_grad_() if cond_fn_with_grad else x\n                out_orig = self.p_mean_variance(\n                    model,\n                    x,\n                    t,\n                    clip_denoised=clip_denoised,\n                    denoised_fn=denoised_fn,\n                    model_kwargs=model_kwargs,\n                )\n                if cond_fn is not None:\n                    if cond_fn_with_grad:\n                        out = self.condition_score_with_grad(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                        x = x.detach()\n                    else:\n                        out = self.condition_score(cond_fn, out_orig, x, t, model_kwargs=model_kwargs)\n                else:\n                    out = out_orig\n\n            # Usually our model outputs epsilon, but we re-derive it\n            # in case we used x_start or x_prev prediction.\n            eps = self._predict_eps_from_xstart(x, t, out[\"pred_xstart\"])\n            return eps, out, out_orig\n\n        alpha_bar = _extract_into_tensor(self.alphas_cumprod, t, x.shape)\n        alpha_bar_prev = _extract_into_tensor(self.alphas_cumprod_prev, t, x.shape)\n        eps, out, out_orig = get_model_output(x, t)\n\n        if order > 1 and old_out is None:\n            # Pseudo Improved Euler\n            old_eps = [eps]\n            mean_pred = 
out[\"pred_xstart\"] * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps\n            eps_2, _, _ = get_model_output(mean_pred, t - 1)\n            eps_prime = (eps + eps_2) / 2\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n        else:\n            # Pseudo Linear Multistep (Adams-Bashforth)\n            old_eps = old_out[\"old_eps\"]\n            old_eps.append(eps)\n            cur_order = min(order, len(old_eps))\n            if cur_order == 1:\n                eps_prime = old_eps[-1]\n            elif cur_order == 2:\n                eps_prime = (3 * old_eps[-1] - old_eps[-2]) / 2\n            elif cur_order == 3:\n                eps_prime = (23 * old_eps[-1] - 16 * old_eps[-2] + 5 * old_eps[-3]) / 12\n            elif cur_order == 4:\n                eps_prime = (55 * old_eps[-1] - 59 * old_eps[-2] + 37 * old_eps[-3] - 9 * old_eps[-4]) / 24\n            else:\n                raise RuntimeError('cur_order is invalid.')\n            pred_prime = self._predict_xstart_from_eps(x, t, eps_prime)\n            mean_pred = pred_prime * paddle.sqrt(alpha_bar_prev) + paddle.sqrt(1 - alpha_bar_prev) * eps_prime\n\n        if len(old_eps) >= order:\n            old_eps.pop(0)\n\n        nonzero_mask = paddle.cast((t != 0), 'float32').reshape([-1, *([1] * (len(x.shape) - 1))])\n        sample = mean_pred * nonzero_mask + out[\"pred_xstart\"] * (1 - nonzero_mask)\n\n        return {\"sample\": sample, \"pred_xstart\": out_orig[\"pred_xstart\"], \"old_eps\": old_eps}\n\n    def plms_sample_loop(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        
cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Generate samples from the model using Pseudo Linear Multistep.\n\n        Same usage as p_sample_loop().\n        \"\"\"\n        final = None\n        for sample in self.plms_sample_loop_progressive(\n                model,\n                shape,\n                noise=noise,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                device=device,\n                progress=progress,\n                skip_timesteps=skip_timesteps,\n                init_image=init_image,\n                randomize_class=randomize_class,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n        ):\n            final = sample\n        return final[\"sample\"]\n\n    def plms_sample_loop_progressive(\n        self,\n        model,\n        shape,\n        noise=None,\n        clip_denoised=True,\n        denoised_fn=None,\n        cond_fn=None,\n        model_kwargs=None,\n        device=None,\n        progress=False,\n        skip_timesteps=0,\n        init_image=None,\n        randomize_class=False,\n        cond_fn_with_grad=False,\n        order=2,\n    ):\n        \"\"\"\n        Use PLMS to sample from the model and yield intermediate samples from each\n        timestep of PLMS.\n\n        Same usage as p_sample_loop_progressive().\n        \"\"\"\n        if device is None:\n            device = model.parameters()[0].place\n        assert isinstance(shape, (tuple, list))\n        if noise is not None:\n            img = noise\n        else:\n            img = paddle.randn(shape)\n\n        if skip_timesteps and init_image is None:\n            init_image = paddle.zeros_like(img)\n\n        indices = list(range(self.num_timesteps - skip_timesteps))[::-1]\n\n        if init_image is not None:\n            my_t = paddle.ones([shape[0]], 
dtype='int64') * indices[0]\n            img = self.q_sample(init_image, my_t, img)\n\n        if progress:\n            # Lazy import so that we don't depend on tqdm.\n            from tqdm.auto import tqdm\n\n            indices = tqdm(indices)\n\n        old_out = None\n\n        for i in indices:\n            t = paddle.to_tensor([i] * shape[0], place=device)\n            if randomize_class and 'y' in model_kwargs:\n                model_kwargs['y'] = paddle.randint(low=0, high=model.num_classes, shape=model_kwargs['y'].shape)\n            # with paddle.no_grad():\n            out = self.plms_sample(\n                model,\n                img,\n                t,\n                clip_denoised=clip_denoised,\n                denoised_fn=denoised_fn,\n                cond_fn=cond_fn,\n                model_kwargs=model_kwargs,\n                cond_fn_with_grad=cond_fn_with_grad,\n                order=order,\n                old_out=old_out,\n            )\n            yield out\n            old_out = out\n            img = out[\"sample\"]\n\n    def _vb_terms_bpd(self, model, x_start, x_t, t, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Get a term for the variational lower-bound.\n\n        The resulting units are bits (rather than nats, as one might expect).\n        This allows for comparison to other papers.\n\n        :return: a dict with the following keys:\n                 - 'output': a shape [N] tensor of NLLs or KLs.\n                 - 'pred_xstart': the x_0 predictions.\n        \"\"\"\n        true_mean, _, true_log_variance_clipped = self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)\n        out = self.p_mean_variance(model, x_t, t, clip_denoised=clip_denoised, model_kwargs=model_kwargs)\n        kl = normal_kl(true_mean, true_log_variance_clipped, out[\"mean\"], out[\"log_variance\"])\n        kl = mean_flat(kl) / np.log(2.0)\n\n        decoder_nll = -discretized_gaussian_log_likelihood(\n            x_start, 
means=out[\"mean\"], log_scales=0.5 * out[\"log_variance\"])\n        assert decoder_nll.shape == x_start.shape\n        decoder_nll = mean_flat(decoder_nll) / np.log(2.0)\n\n        # At the first timestep return the decoder NLL,\n        # otherwise return KL(q(x_{t-1}|x_t,x_0) || p(x_{t-1}|x_t))\n        output = paddle.where((t == 0), decoder_nll, kl)\n        return {\"output\": output, \"pred_xstart\": out[\"pred_xstart\"]}\n\n    def training_losses(self, model, x_start, t, model_kwargs=None, noise=None):\n        \"\"\"\n        Compute training losses for a single timestep.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param t: a batch of timestep indices.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. This can be used for conditioning.\n        :param noise: if specified, the specific Gaussian noise to try to remove.\n        :return: a dict with the key \"loss\" containing a tensor of shape [N].\n                 Some mean or variance settings may also have other keys.\n        \"\"\"\n        if model_kwargs is None:\n            model_kwargs = {}\n        if noise is None:\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n        x_t = self.q_sample(x_start, t, noise=noise)\n\n        terms = {}\n\n        if self.loss_type == LossType.KL or self.loss_type == LossType.RESCALED_KL:\n            terms[\"loss\"] = self._vb_terms_bpd(\n                model=model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t,\n                clip_denoised=False,\n                model_kwargs=model_kwargs,\n            )[\"output\"]\n            if self.loss_type == LossType.RESCALED_KL:\n                terms[\"loss\"] *= self.num_timesteps\n        elif self.loss_type == LossType.MSE or self.loss_type == LossType.RESCALED_MSE:\n     
       model_output = model(x_t, self._scale_timesteps(t), **model_kwargs)\n\n            if self.model_var_type in [\n                    ModelVarType.LEARNED,\n                    ModelVarType.LEARNED_RANGE,\n            ]:\n                B, C = x_t.shape[:2]\n                assert model_output.shape == [B, C * 2, *x_t.shape[2:]]\n                model_output, model_var_values = paddle.split(model_output, 2, axis=1)\n                # Learn the variance using the variational bound, but don't let\n                # it affect our mean prediction.\n                frozen_out = paddle.concat([model_output.detach(), model_var_values], axis=1)\n                terms[\"vb\"] = self._vb_terms_bpd(\n                    model=lambda *args, r=frozen_out: r,\n                    x_start=x_start,\n                    x_t=x_t,\n                    t=t,\n                    clip_denoised=False,\n                )[\"output\"]\n                if self.loss_type == LossType.RESCALED_MSE:\n                    # Divide by 1000 for equivalence with initial implementation.\n                    # Without a factor of 1/1000, the VB term hurts the MSE term.\n                    terms[\"vb\"] *= self.num_timesteps / 1000.0\n\n            target = {\n                ModelMeanType.PREVIOUS_X: self.q_posterior_mean_variance(x_start=x_start, x_t=x_t, t=t)[0],\n                ModelMeanType.START_X: x_start,\n                ModelMeanType.EPSILON: noise,\n            }[self.model_mean_type]\n            assert model_output.shape == target.shape == x_start.shape\n            terms[\"mse\"] = mean_flat((target - model_output)**2)\n            if \"vb\" in terms:\n                terms[\"loss\"] = terms[\"mse\"] + terms[\"vb\"]\n            else:\n                terms[\"loss\"] = terms[\"mse\"]\n        else:\n            raise NotImplementedError(self.loss_type)\n\n        return terms\n\n    def _prior_bpd(self, x_start):\n        \"\"\"\n        Get the prior KL term for the variational 
lower-bound, measured in\n        bits-per-dim.\n\n        This term can't be optimized, as it only depends on the encoder.\n\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :return: a batch of [N] KL values (in bits), one per batch element.\n        \"\"\"\n        batch_size = x_start.shape[0]\n        t = paddle.to_tensor([self.num_timesteps - 1] * batch_size, place=x_start.place)\n        qt_mean, _, qt_log_variance = self.q_mean_variance(x_start, t)\n        kl_prior = normal_kl(mean1=qt_mean, logvar1=qt_log_variance, mean2=0.0, logvar2=0.0)\n        return mean_flat(kl_prior) / np.log(2.0)\n\n    def calc_bpd_loop(self, model, x_start, clip_denoised=True, model_kwargs=None):\n        \"\"\"\n        Compute the entire variational lower-bound, measured in bits-per-dim,\n        as well as other related quantities.\n\n        :param model: the model to evaluate loss on.\n        :param x_start: the [N x C x ...] tensor of inputs.\n        :param clip_denoised: if True, clip denoised samples.\n        :param model_kwargs: if not None, a dict of extra keyword arguments to\n            pass to the model. 
This can be used for conditioning.\n\n        :return: a dict containing the following keys:\n                 - total_bpd: the total variational lower-bound, per batch element.\n                 - prior_bpd: the prior term in the lower-bound.\n                 - vb: an [N x T] tensor of terms in the lower-bound.\n                 - xstart_mse: an [N x T] tensor of x_0 MSEs for each timestep.\n                 - mse: an [N x T] tensor of epsilon MSEs for each timestep.\n        \"\"\"\n        device = x_start.place\n        batch_size = x_start.shape[0]\n\n        vb = []\n        xstart_mse = []\n        mse = []\n        for t in list(range(self.num_timesteps))[::-1]:\n            t_batch = paddle.to_tensor([t] * batch_size, place=device)\n            # noise = th.randn_like(x_start)\n            noise = paddle.randn(x_start.shape, x_start.dtype)\n            x_t = self.q_sample(x_start=x_start, t=t_batch, noise=noise)\n            # Calculate VLB term at the current timestep\n            # with paddle.no_grad():\n            out = self._vb_terms_bpd(\n                model,\n                x_start=x_start,\n                x_t=x_t,\n                t=t_batch,\n                clip_denoised=clip_denoised,\n                model_kwargs=model_kwargs,\n            )\n            vb.append(out[\"output\"])\n            xstart_mse.append(mean_flat((out[\"pred_xstart\"] - x_start)**2))\n            eps = self._predict_eps_from_xstart(x_t, t_batch, out[\"pred_xstart\"])\n            mse.append(mean_flat((eps - noise)**2))\n\n        vb = paddle.stack(vb, axis=1)\n        xstart_mse = paddle.stack(xstart_mse, axis=1)\n        mse = paddle.stack(mse, axis=1)\n\n        prior_bpd = self._prior_bpd(x_start)\n        total_bpd = vb.sum(axis=1) + prior_bpd\n        return {\n            \"total_bpd\": total_bpd,\n            \"prior_bpd\": prior_bpd,\n            \"vb\": vb,\n            \"xstart_mse\": xstart_mse,\n            \"mse\": mse,\n        }\n\n\ndef 
_extract_into_tensor(arr, timesteps, broadcast_shape):\n    \"\"\"\n    Extract values from a 1-D numpy array for a batch of indices.\n\n    :param arr: the 1-D numpy array.\n    :param timesteps: a tensor of indices into the array to extract.\n    :param broadcast_shape: a larger shape of K dimensions with the batch\n                            dimension equal to the length of timesteps.\n    :return: a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n    \"\"\"\n    res = paddle.to_tensor(arr, place=timesteps.place)[timesteps]\n    while len(res.shape) < len(broadcast_shape):\n        res = res[..., None]\n    return res.expand(broadcast_shape)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/losses.py",
    "content": "\"\"\"\nHelpers for various likelihood-based losses implemented by Paddle. These are ported from the original\nHo et al. diffusion models codebase:\nhttps://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/utils.py\n\"\"\"\nimport numpy as np\nimport paddle\nimport paddle.nn.functional as F\n\n\ndef normal_kl(mean1, logvar1, mean2, logvar2):\n    \"\"\"\n    Compute the KL divergence between two gaussians.\n\n    Shapes are automatically broadcasted, so batches can be compared to\n    scalars, among other use cases.\n    \"\"\"\n    tensor = None\n    for obj in (mean1, logvar1, mean2, logvar2):\n        if isinstance(obj, paddle.Tensor):\n            tensor = obj\n            break\n    assert tensor is not None, \"at least one argument must be a Tensor\"\n\n    # Force variances to be Tensors. Broadcasting helps convert scalars to\n    # Tensors, but it does not work for th.exp().\n    logvar1, logvar2 = [x if isinstance(x, paddle.Tensor) else paddle.to_tensor(x) for x in (logvar1, logvar2)]\n\n    return 0.5 * (-1.0 + logvar2 - logvar1 + paddle.exp(logvar1 - logvar2) +\n                  ((mean1 - mean2)**2) * paddle.exp(-logvar2))\n\n\ndef approx_standard_normal_cdf(x):\n    \"\"\"\n    A fast approximation of the cumulative distribution function of the\n    standard normal.\n    \"\"\"\n    return 0.5 * (1.0 + paddle.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * paddle.pow(x, 3))))\n\n\ndef discretized_gaussian_log_likelihood(x, *, means, log_scales):\n    \"\"\"\n    Compute the log-likelihood of a Gaussian distribution discretizing to a\n    given image.\n\n    :param x: the target images. 
It is assumed that these were uint8 values,\n              rescaled to the range [-1, 1].\n    :param means: the Gaussian mean Tensor.\n    :param log_scales: the Gaussian log stddev Tensor.\n    :return: a tensor like x of log probabilities (in nats).\n    \"\"\"\n    assert x.shape == means.shape == log_scales.shape\n    centered_x = x - means\n    inv_stdv = paddle.exp(-log_scales)\n    plus_in = inv_stdv * (centered_x + 1.0 / 255.0)\n    cdf_plus = approx_standard_normal_cdf(plus_in)\n    min_in = inv_stdv * (centered_x - 1.0 / 255.0)\n    cdf_min = approx_standard_normal_cdf(min_in)\n    log_cdf_plus = paddle.log(cdf_plus.clip(min=1e-12))\n    log_one_minus_cdf_min = paddle.log((1.0 - cdf_min).clip(min=1e-12))\n    cdf_delta = cdf_plus - cdf_min\n    log_probs = paddle.where(\n        x < -0.999,\n        log_cdf_plus,\n        paddle.where(x > 0.999, log_one_minus_cdf_min, paddle.log(cdf_delta.clip(min=1e-12))),\n    )\n    assert log_probs.shape == x.shape\n    return log_probs\n\n\ndef spherical_dist_loss(x, y):\n    x = F.normalize(x, axis=-1)\n    y = F.normalize(y, axis=-1)\n    return (x - y).norm(axis=-1).divide(paddle.to_tensor(2.0)).asin().pow(2).multiply(paddle.to_tensor(2.0))\n\n\ndef tv_loss(input):\n    \"\"\"L2 total variation loss, as in Mahendran et al.\"\"\"\n    input = F.pad(input, (0, 1, 0, 1), 'replicate')\n    x_diff = input[..., :-1, 1:] - input[..., :-1, :-1]\n    y_diff = input[..., 1:, :-1] - input[..., :-1, :-1]\n    return (x_diff**2 + y_diff**2).mean([1, 2, 3])\n\n\ndef range_loss(input):\n    return (input - input.clip(-1, 1)).pow(2).mean([1, 2, 3])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/make_cutouts.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/make_cutouts.py\n'''\nimport math\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_ernievil_base.resize_right.resize_right import resize\nfrom paddle.nn import functional as F\n\nfrom . import transforms as T\n\nskip_augs = False  # @param{type: 'boolean'}\n\n\ndef sinc(x):\n    return paddle.where(x != 0, paddle.sin(math.pi * x) / (math.pi * x), x.new_ones([]))\n\n\ndef lanczos(x, a):\n    cond = paddle.logical_and(-a < x, x < a)\n    out = paddle.where(cond, sinc(x) * sinc(x / a), x.new_zeros([]))\n    return out / out.sum()\n\n\ndef ramp(ratio, width):\n    n = math.ceil(width / ratio + 1)\n    out = paddle.empty([n])\n    cur = 0\n    for i in range(out.shape[0]):\n        out[i] = cur\n        cur += ratio\n    return paddle.concat([-out[1:].flip([0]), out])[1:-1]\n\n\nclass MakeCutouts(nn.Layer):\n\n    def __init__(self, cut_size, cutn, skip_augs=False):\n        super().__init__()\n        self.cut_size = cut_size\n        self.cutn = cutn\n        self.skip_augs = skip_augs\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(degrees=15, translate=(0.1, 0.1)),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomPerspective(distortion_scale=0.4, p=0.7),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.15),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        input = T.Pad(input.shape[2] // 4, fill=0)(input)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n\n        cutouts = []\n        for ch in range(self.cutn):\n 
           if ch > self.cutn - self.cutn // 4:\n                cutout = input.clone()\n            else:\n                size = int(max_size *\n                           paddle.normal(mean=0.8, std=0.3, shape=[1]).clip(float(self.cut_size / max_size), 1.0))\n                offsetx = paddle.randint(0, abs(sideX - size + 1), ())\n                offsety = paddle.randint(0, abs(sideY - size + 1), ())\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n\n            if not self.skip_augs:\n                cutout = self.augs(cutout)\n            cutouts.append(resample(cutout, (self.cut_size, self.cut_size)))\n            del cutout\n\n        cutouts = paddle.concat(cutouts, axis=0)\n        return cutouts\n\n\nclass MakeCutoutsDango(nn.Layer):\n\n    def __init__(self, cut_size, Overview=4, InnerCrop=0, IC_Size_Pow=0.5, IC_Grey_P=0.2):\n        super().__init__()\n        self.cut_size = cut_size\n        self.Overview = Overview\n        self.InnerCrop = InnerCrop\n        self.IC_Size_Pow = IC_Size_Pow\n        self.IC_Grey_P = IC_Grey_P\n        self.augs = nn.Sequential(*[\n            T.RandomHorizontalFlip(prob=0.5),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomAffine(\n                degrees=10,\n                translate=(0.05, 0.05),\n                interpolation=T.InterpolationMode.BILINEAR,\n            ),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.RandomGrayscale(p=0.1),\n            T.Lambda(lambda x: x + paddle.randn(x.shape) * 0.01),\n            T.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1),\n        ])\n\n    def forward(self, input):\n        cutouts = []\n        gray = T.Grayscale(3)\n        sideY, sideX = input.shape[2:4]\n        max_size = min(sideX, sideY)\n        min_size = min(sideX, sideY, self.cut_size)\n        output_shape = [1, 3, self.cut_size, self.cut_size]\n        pad_input = F.pad(\n   
          input,\n            (\n                (sideY - max_size) // 2,\n                (sideY - max_size) // 2,\n                (sideX - max_size) // 2,\n                (sideX - max_size) // 2,\n            ),\n            **padargs,\n        )\n        cutout = resize(pad_input, out_shape=output_shape)\n\n        if self.Overview > 0:\n            if self.Overview <= 4:\n                if self.Overview >= 1:\n                    cutouts.append(cutout)\n                if self.Overview >= 2:\n                    cutouts.append(gray(cutout))\n                if self.Overview >= 3:\n                    cutouts.append(cutout[:, :, :, ::-1])\n                if self.Overview == 4:\n                    cutouts.append(gray(cutout[:, :, :, ::-1]))\n            else:\n                cutout = resize(pad_input, out_shape=output_shape)\n                for _ in range(self.Overview):\n                    cutouts.append(cutout)\n\n        if self.InnerCrop > 0:\n            for i in range(self.InnerCrop):\n                size = int(paddle.rand([1])**self.IC_Size_Pow * (max_size - min_size) + min_size)\n                offsetx = paddle.randint(0, sideX - size + 1)\n                offsety = paddle.randint(0, sideY - size + 1)\n                cutout = input[:, :, offsety:offsety + size, offsetx:offsetx + size]\n                if i <= int(self.IC_Grey_P * self.InnerCrop):\n                    cutout = gray(cutout)\n                cutout = resize(cutout, out_shape=output_shape)\n                cutouts.append(cutout)\n\n        cutouts = paddle.concat(cutouts)\n        if not skip_augs:\n            cutouts = self.augs(cutouts)\n        return cutouts\n\n\ndef resample(input, size, align_corners=True):\n    n, c, h, w = input.shape\n    dh, dw = size\n\n    input = input.reshape([n * c, 1, h, w])\n\n    if dh < h:\n        kernel_h = lanczos(ramp(dh / h, 2), 2).astype(input.dtype)\n        pad_h = (kernel_h.shape[0] - 1) // 2\n        input = 
F.pad(input, (0, 0, pad_h, pad_h), 'reflect')\n        input = F.conv2d(input, kernel_h[None, None, :, None])\n\n    if dw < w:\n        kernel_w = lanczos(ramp(dw / w, 2), 2).astype(input.dtype)\n        pad_w = (kernel_w.shape[0] - 1) // 2\n        input = F.pad(input, (pad_w, pad_w, 0, 0), 'reflect')\n        input = F.conv2d(input, kernel_w[None, None, None, :])\n\n    input = input.reshape([n, c, h, w])\n    return F.interpolate(input, size, mode='bicubic', align_corners=align_corners)\n\n\npadargs = {}\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/nn.py",
    "content": "\"\"\"\nVarious utilities for neural networks implemented by Paddle. This code is rewritten based on:\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/nn.py\n\"\"\"\nimport math\n\nimport paddle\nimport paddle.nn as nn\n\n\nclass SiLU(nn.Layer):\n\n    def forward(self, x):\n        return x * nn.functional.sigmoid(x)\n\n\nclass GroupNorm32(nn.GroupNorm):\n\n    def forward(self, x):\n        return super().forward(x)\n\n\ndef conv_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D convolution module.\n    \"\"\"\n    if dims == 1:\n        return nn.Conv1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.Conv2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.Conv3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef linear(*args, **kwargs):\n    \"\"\"\n    Create a linear module.\n    \"\"\"\n    return nn.Linear(*args, **kwargs)\n\n\ndef avg_pool_nd(dims, *args, **kwargs):\n    \"\"\"\n    Create a 1D, 2D, or 3D average pooling module.\n    \"\"\"\n    if dims == 1:\n        return nn.AvgPool1D(*args, **kwargs)\n    elif dims == 2:\n        return nn.AvgPool2D(*args, **kwargs)\n    elif dims == 3:\n        return nn.AvgPool3D(*args, **kwargs)\n    raise ValueError(f\"unsupported dimensions: {dims}\")\n\n\ndef update_ema(target_params, source_params, rate=0.99):\n    \"\"\"\n    Update target parameters to be closer to those of source parameters using\n    an exponential moving average.\n\n    :param target_params: the target parameter sequence.\n    :param source_params: the source parameter sequence.\n    :param rate: the EMA rate (closer to 1 means slower).\n    \"\"\"\n    for targ, src in zip(target_params, source_params):\n        targ.detach().mul_(rate).add_(src, alpha=1 - rate)\n\n\ndef zero_module(module):\n    \"\"\"\n    Zero out the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().zero_()\n    
return module\n\n\ndef scale_module(module, scale):\n    \"\"\"\n    Scale the parameters of a module and return it.\n    \"\"\"\n    for p in module.parameters():\n        p.detach().scale_(scale)\n    return module\n\n\ndef mean_flat(tensor):\n    \"\"\"\n    Take the mean over all non-batch dimensions.\n    \"\"\"\n    return tensor.mean(axis=list(range(1, len(tensor.shape))))\n\n\ndef normalization(channels):\n    \"\"\"\n    Make a standard normalization layer.\n\n    :param channels: number of input channels.\n    :return: an nn.Module for normalization.\n    \"\"\"\n    return GroupNorm32(32, channels)\n\n\ndef timestep_embedding(timesteps, dim, max_period=10000):\n    \"\"\"\n    Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param dim: the dimension of the output.\n    :param max_period: controls the minimum frequency of the embeddings.\n    :return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    half = dim // 2\n    freqs = paddle.exp(-math.log(max_period) * paddle.arange(start=0, end=half, dtype=paddle.float32) / half)\n    args = paddle.cast(timesteps[:, None], 'float32') * freqs[None]\n    embedding = paddle.concat([paddle.cos(args), paddle.sin(args)], axis=-1)\n    if dim % 2:\n        embedding = paddle.concat([embedding, paddle.zeros_like(embedding[:, :1])], axis=-1)\n    return embedding\n\n\ndef checkpoint(func, inputs, params, flag):\n    \"\"\"\n    Gradient checkpointing is disabled in this port; this simply runs a normal forward pass.\n    \"\"\"\n    return func(*inputs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/perlin_noises.py",
    "content": "'''\nPerlin noise implementation by Paddle.\nThis code is rewritten based on:\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/perlin_noises.py\n'''\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as TF\nfrom PIL import Image\nfrom PIL import ImageOps\n\n\ndef interp(t):\n    return 3 * t**2 - 2 * t**3\n\n\ndef perlin(width, height, scale=10):\n    gx, gy = paddle.randn([2, width + 1, height + 1, 1, 1])\n    xs = paddle.linspace(0, 1, scale + 1)[:-1, None]\n    ys = paddle.linspace(0, 1, scale + 1)[None, :-1]\n    wx = 1 - interp(xs)\n    wy = 1 - interp(ys)\n    dots = 0\n    dots += wx * wy * (gx[:-1, :-1] * xs + gy[:-1, :-1] * ys)\n    dots += (1 - wx) * wy * (-gx[1:, :-1] * (1 - xs) + gy[1:, :-1] * ys)\n    dots += wx * (1 - wy) * (gx[:-1, 1:] * xs - gy[:-1, 1:] * (1 - ys))\n    dots += (1 - wx) * (1 - wy) * (-gx[1:, 1:] * (1 - xs) - gy[1:, 1:] * (1 - ys))\n    return dots.transpose([0, 2, 1, 3]).reshape([width * scale, height * scale])\n\n\ndef perlin_ms(octaves, width, height, grayscale):\n    out_array = [0.5] if grayscale else [0.5, 0.5, 0.5]\n    # out_array = [0.0] if grayscale else [0.0, 0.0, 0.0]\n    for i in range(1 if grayscale else 3):\n        scale = 2**len(octaves)\n        oct_width = width\n        oct_height = height\n        for oct in octaves:\n            p = perlin(oct_width, oct_height, scale)\n            out_array[i] += p * oct\n            scale //= 2\n            oct_width *= 2\n            oct_height *= 2\n    return paddle.concat(out_array)\n\n\ndef create_perlin_noise(octaves, width, height, grayscale, side_y, side_x):\n    out = perlin_ms(octaves, width, height, grayscale)\n    if grayscale:\n        out = TF.resize(size=(side_y, side_x), img=out.numpy())\n        out = np.uint8(out)\n        out = Image.fromarray(out).convert('RGB')\n    else:\n        out = out.reshape([-1, 3, out.shape[0] // 3, out.shape[1]])\n        out = out.squeeze().transpose([1, 2, 0]).numpy()\n        out = 
TF.resize(size=(side_y, side_x), img=out)\n        out = out.clip(0, 1) * 255\n        out = np.uint8(out)\n        out = Image.fromarray(out)\n\n    out = ImageOps.autocontrast(out)\n    return out\n\n\ndef regen_perlin(perlin_mode, side_y, side_x, batch_size):\n    if perlin_mode == 'color':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n    elif perlin_mode == 'gray':\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n    else:\n        init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n        init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n\n    init = (TF.to_tensor(init).add(TF.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n    del init2\n    return init.expand([batch_size, -1, -1, -1])\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/respace.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/respace.py\n'''\nimport numpy as np\nimport paddle\n\nfrom .gaussian_diffusion import GaussianDiffusion\n\n\ndef space_timesteps(num_timesteps, section_counts):\n    \"\"\"\n    Create a list of timesteps to use from an original diffusion process,\n    given the number of timesteps we want to take from equally-sized portions\n    of the original process.\n\n    For example, if there are 300 timesteps and the section counts are [10,15,20]\n    then the first 100 timesteps are strided to be 10 timesteps, the second 100\n    are strided to be 15 timesteps, and the final 100 are strided to be 20.\n\n    If the stride is a string starting with \"ddim\", then the fixed striding\n    from the DDIM paper is used, and only one section is allowed.\n\n    :param num_timesteps: the number of diffusion steps in the original\n                          process to divide up.\n    :param section_counts: either a list of numbers, or a string containing\n                           comma-separated numbers, indicating the step count\n                           per section. 
As a special case, use \"ddimN\" where N\n                           is a number of steps to use the striding from the\n                           DDIM paper.\n    :return: a set of diffusion steps from the original process to use.\n    \"\"\"\n    if isinstance(section_counts, str):\n        if section_counts.startswith(\"ddim\"):\n            desired_count = int(section_counts[len(\"ddim\"):])\n            for i in range(1, num_timesteps):\n                if len(range(0, num_timesteps, i)) == desired_count:\n                    return set(range(0, num_timesteps, i))\n            raise ValueError(f\"cannot create exactly {desired_count} steps with an integer stride\")\n        section_counts = [int(x) for x in section_counts.split(\",\")]\n    size_per = num_timesteps // len(section_counts)\n    extra = num_timesteps % len(section_counts)\n    start_idx = 0\n    all_steps = []\n    for i, section_count in enumerate(section_counts):\n        size = size_per + (1 if i < extra else 0)\n        if size < section_count:\n            raise ValueError(f\"cannot divide section of {size} steps into {section_count}\")\n        if section_count <= 1:\n            frac_stride = 1\n        else:\n            frac_stride = (size - 1) / (section_count - 1)\n        cur_idx = 0.0\n        taken_steps = []\n        for _ in range(section_count):\n            taken_steps.append(start_idx + round(cur_idx))\n            cur_idx += frac_stride\n        all_steps += taken_steps\n        start_idx += size\n    return set(all_steps)\n\n\nclass SpacedDiffusion(GaussianDiffusion):\n    \"\"\"\n    A diffusion process which can skip steps in a base diffusion process.\n\n    :param use_timesteps: a collection (sequence or set) of timesteps from the\n                          original diffusion process to retain.\n    :param kwargs: the kwargs to create the base diffusion process.\n    \"\"\"\n\n    def __init__(self, use_timesteps, **kwargs):\n        self.use_timesteps = 
set(use_timesteps)\n        self.timestep_map = []\n        self.original_num_steps = len(kwargs[\"betas\"])\n\n        base_diffusion = GaussianDiffusion(**kwargs)  # pylint: disable=missing-kwoa\n        last_alpha_cumprod = 1.0\n        new_betas = []\n        for i, alpha_cumprod in enumerate(base_diffusion.alphas_cumprod):\n            if i in self.use_timesteps:\n                new_betas.append(1 - alpha_cumprod / last_alpha_cumprod)\n                last_alpha_cumprod = alpha_cumprod\n                self.timestep_map.append(i)\n        kwargs[\"betas\"] = np.array(new_betas)\n        super().__init__(**kwargs)\n\n    def p_mean_variance(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().p_mean_variance(self._wrap_model(model), *args, **kwargs)\n\n    def training_losses(self, model, *args, **kwargs):  # pylint: disable=signature-differs\n        return super().training_losses(self._wrap_model(model), *args, **kwargs)\n\n    def condition_mean(self, cond_fn, *args, **kwargs):\n        return super().condition_mean(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def condition_score(self, cond_fn, *args, **kwargs):\n        return super().condition_score(self._wrap_model(cond_fn), *args, **kwargs)\n\n    def _wrap_model(self, model):\n        if isinstance(model, _WrappedModel):\n            return model\n        return _WrappedModel(model, self.timestep_map, self.rescale_timesteps, self.original_num_steps)\n\n    def _scale_timesteps(self, t):\n        # Scaling is done by the wrapped model.\n        return t\n\n\nclass _WrappedModel:\n\n    def __init__(self, model, timestep_map, rescale_timesteps, original_num_steps):\n        self.model = model\n        self.timestep_map = timestep_map\n        self.rescale_timesteps = rescale_timesteps\n        self.original_num_steps = original_num_steps\n\n    def __call__(self, x, ts, **kwargs):\n        map_tensor = paddle.to_tensor(self.timestep_map, place=ts.place, 
dtype=ts.dtype)\n        new_ts = map_tensor[ts]\n        if self.rescale_timesteps:\n            new_ts = paddle.cast(new_ts, 'float32') * (1000.0 / self.original_num_steps)\n        return self.model(x, new_ts, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/script_util.py",
    "content": "'''\nThis code is based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/script_util.py\n'''\nimport argparse\nimport inspect\n\nfrom . import gaussian_diffusion as gd\nfrom .respace import space_timesteps\nfrom .respace import SpacedDiffusion\nfrom .unet import EncoderUNetModel\nfrom .unet import SuperResModel\nfrom .unet import UNetModel\n\nNUM_CLASSES = 1000\n\n\ndef diffusion_defaults():\n    \"\"\"\n    Defaults for image and classifier training.\n    \"\"\"\n    return dict(\n        learn_sigma=False,\n        diffusion_steps=1000,\n        noise_schedule=\"linear\",\n        timestep_respacing=\"\",\n        use_kl=False,\n        predict_xstart=False,\n        rescale_timesteps=False,\n        rescale_learned_sigmas=False,\n    )\n\n\ndef model_and_diffusion_defaults():\n    \"\"\"\n    Defaults for image training.\n    \"\"\"\n    res = dict(\n        image_size=64,\n        num_channels=128,\n        num_res_blocks=2,\n        num_heads=4,\n        num_heads_upsample=-1,\n        num_head_channels=-1,\n        attention_resolutions=\"16,8\",\n        channel_mult=\"\",\n        dropout=0.0,\n        class_cond=False,\n        use_checkpoint=False,\n        use_scale_shift_norm=True,\n        resblock_updown=False,\n        use_fp16=False,\n        use_new_attention_order=False,\n    )\n    res.update(diffusion_defaults())\n    return res\n\n\ndef create_model_and_diffusion(\n    image_size,\n    class_cond,\n    learn_sigma,\n    num_channels,\n    num_res_blocks,\n    channel_mult,\n    num_heads,\n    num_head_channels,\n    num_heads_upsample,\n    attention_resolutions,\n    dropout,\n    diffusion_steps,\n    noise_schedule,\n    timestep_respacing,\n    use_kl,\n    predict_xstart,\n    rescale_timesteps,\n    rescale_learned_sigmas,\n    use_checkpoint,\n    use_scale_shift_norm,\n    resblock_updown,\n    use_fp16,\n    use_new_attention_order,\n):\n    model = create_model(\n        image_size,\n        
num_channels,\n        num_res_blocks,\n        channel_mult=channel_mult,\n        learn_sigma=learn_sigma,\n        class_cond=class_cond,\n        use_checkpoint=use_checkpoint,\n        attention_resolutions=attention_resolutions,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        dropout=dropout,\n        resblock_updown=resblock_updown,\n        use_fp16=use_fp16,\n        use_new_attention_order=use_new_attention_order,\n    )\n    diffusion = create_gaussian_diffusion(\n        steps=diffusion_steps,\n        learn_sigma=learn_sigma,\n        noise_schedule=noise_schedule,\n        use_kl=use_kl,\n        predict_xstart=predict_xstart,\n        rescale_timesteps=rescale_timesteps,\n        rescale_learned_sigmas=rescale_learned_sigmas,\n        timestep_respacing=timestep_respacing,\n    )\n    return model, diffusion\n\n\ndef create_model(\n    image_size,\n    num_channels,\n    num_res_blocks,\n    channel_mult=\"\",\n    learn_sigma=False,\n    class_cond=False,\n    use_checkpoint=False,\n    attention_resolutions=\"16\",\n    num_heads=1,\n    num_head_channels=-1,\n    num_heads_upsample=-1,\n    use_scale_shift_norm=False,\n    dropout=0,\n    resblock_updown=False,\n    use_fp16=False,\n    use_new_attention_order=False,\n):\n    if channel_mult == \"\":\n        if image_size == 512:\n            channel_mult = (0.5, 1, 1, 2, 2, 4, 4)\n        elif image_size == 256:\n            channel_mult = (1, 1, 2, 2, 4, 4)\n        elif image_size == 128:\n            channel_mult = (1, 1, 2, 3, 4)\n        elif image_size == 64:\n            channel_mult = (1, 2, 3, 4)\n        else:\n            raise ValueError(f\"unsupported image size: {image_size}\")\n    else:\n        channel_mult = tuple(int(ch_mult) for ch_mult in channel_mult.split(\",\"))\n\n    attention_ds = []\n    for res in 
attention_resolutions.split(\",\"):\n        attention_ds.append(image_size // int(res))\n\n    return UNetModel(\n        image_size=image_size,\n        in_channels=3,\n        model_channels=num_channels,\n        out_channels=(3 if not learn_sigma else 6),\n        num_res_blocks=num_res_blocks,\n        attention_resolutions=tuple(attention_ds),\n        dropout=dropout,\n        channel_mult=channel_mult,\n        num_classes=(NUM_CLASSES if class_cond else None),\n        use_checkpoint=use_checkpoint,\n        use_fp16=use_fp16,\n        num_heads=num_heads,\n        num_head_channels=num_head_channels,\n        num_heads_upsample=num_heads_upsample,\n        use_scale_shift_norm=use_scale_shift_norm,\n        resblock_updown=resblock_updown,\n        use_new_attention_order=use_new_attention_order,\n    )\n\n\ndef create_gaussian_diffusion(\n    *,\n    steps=1000,\n    learn_sigma=False,\n    sigma_small=False,\n    noise_schedule=\"linear\",\n    use_kl=False,\n    predict_xstart=False,\n    rescale_timesteps=False,\n    rescale_learned_sigmas=False,\n    timestep_respacing=\"\",\n):\n    betas = gd.get_named_beta_schedule(noise_schedule, steps)\n    if use_kl:\n        loss_type = gd.LossType.RESCALED_KL\n    elif rescale_learned_sigmas:\n        loss_type = gd.LossType.RESCALED_MSE\n    else:\n        loss_type = gd.LossType.MSE\n    if not timestep_respacing:\n        timestep_respacing = [steps]\n    return SpacedDiffusion(\n        use_timesteps=space_timesteps(steps, timestep_respacing),\n        betas=betas,\n        model_mean_type=(gd.ModelMeanType.EPSILON if not predict_xstart else gd.ModelMeanType.START_X),\n        model_var_type=((gd.ModelVarType.FIXED_LARGE if not sigma_small else gd.ModelVarType.FIXED_SMALL)\n                        if not learn_sigma else gd.ModelVarType.LEARNED_RANGE),\n        loss_type=loss_type,\n        rescale_timesteps=rescale_timesteps,\n    )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/sec_diff.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/jina-ai/discoart/blob/main/discoart/nn/sec_diff.py\n'''\nimport math\nfrom dataclasses import dataclass\nfrom functools import partial\n\nimport paddle\nimport paddle.nn as nn\n\n\n@dataclass\nclass DiffusionOutput:\n    v: paddle.Tensor\n    pred: paddle.Tensor\n    eps: paddle.Tensor\n\n\nclass SkipBlock(nn.Layer):\n\n    def __init__(self, main, skip=None):\n        super().__init__()\n        self.main = nn.Sequential(*main)\n        self.skip = skip if skip else nn.Identity()\n\n    def forward(self, input):\n        return paddle.concat([self.main(input), self.skip(input)], axis=1)\n\n\ndef append_dims(x, n):\n    return x[(Ellipsis, *(None, ) * (n - x.ndim))]\n\n\ndef expand_to_planes(x, shape):\n    return paddle.tile(append_dims(x, len(shape)), [1, 1, *shape[2:]])\n\n\ndef alpha_sigma_to_t(alpha, sigma):\n    return paddle.atan2(sigma, alpha) * 2 / math.pi\n\n\ndef t_to_alpha_sigma(t):\n    return paddle.cos(t * math.pi / 2), paddle.sin(t * math.pi / 2)\n\n\nclass SecondaryDiffusionImageNet2(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n        c = 64  # The base channel count\n        cs = [c, c * 2, c * 2, c * 4, c * 4, c * 8]\n\n        self.timestep_embed = FourierFeatures(1, 16)\n        self.down = nn.AvgPool2D(2)\n        self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)\n\n        self.net = nn.Sequential(\n            ConvBlock(3 + 16, cs[0]),\n            ConvBlock(cs[0], cs[0]),\n            SkipBlock([\n                self.down,\n                ConvBlock(cs[0], cs[1]),\n                ConvBlock(cs[1], cs[1]),\n                SkipBlock([\n                    self.down,\n                    ConvBlock(cs[1], cs[2]),\n                    ConvBlock(cs[2], cs[2]),\n                    SkipBlock([\n                        self.down,\n                        ConvBlock(cs[2], cs[3]),\n                        
ConvBlock(cs[3], cs[3]),\n                        SkipBlock([\n                            self.down,\n                            ConvBlock(cs[3], cs[4]),\n                            ConvBlock(cs[4], cs[4]),\n                            SkipBlock([\n                                self.down,\n                                ConvBlock(cs[4], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[5]),\n                                ConvBlock(cs[5], cs[4]),\n                                self.up,\n                            ]),\n                            ConvBlock(cs[4] * 2, cs[4]),\n                            ConvBlock(cs[4], cs[3]),\n                            self.up,\n                        ]),\n                        ConvBlock(cs[3] * 2, cs[3]),\n                        ConvBlock(cs[3], cs[2]),\n                        self.up,\n                    ]),\n                    ConvBlock(cs[2] * 2, cs[2]),\n                    ConvBlock(cs[2], cs[1]),\n                    self.up,\n                ]),\n                ConvBlock(cs[1] * 2, cs[1]),\n                ConvBlock(cs[1], cs[0]),\n                self.up,\n            ]),\n            ConvBlock(cs[0] * 2, cs[0]),\n            nn.Conv2D(cs[0], 3, 3, padding=1),\n        )\n\n    def forward(self, input, t):\n        timestep_embed = expand_to_planes(self.timestep_embed(t[:, None]), input.shape)\n        v = self.net(paddle.concat([input, timestep_embed], axis=1))\n        alphas, sigmas = map(partial(append_dims, n=v.ndim), t_to_alpha_sigma(t))\n        pred = input * alphas - v * sigmas\n        eps = input * sigmas + v * alphas\n        return DiffusionOutput(v, pred, eps)\n\n\nclass FourierFeatures(nn.Layer):\n\n    def __init__(self, in_features, out_features, std=1.0):\n        super().__init__()\n        assert out_features % 2 == 0\n        # self.weight = nn.Parameter(paddle.randn([out_features // 2, in_features]) * std)\n      
  self.weight = paddle.create_parameter([out_features // 2, in_features],\n                                              dtype='float32',\n                                              default_initializer=nn.initializer.Normal(mean=0.0, std=std))\n\n    def forward(self, input):\n        f = 2 * math.pi * input @ self.weight.T\n        return paddle.concat([f.cos(), f.sin()], axis=-1)\n\n\nclass ConvBlock(nn.Sequential):\n\n    def __init__(self, c_in, c_out):\n        super().__init__(\n            nn.Conv2D(c_in, c_out, 3, padding=1),\n            nn.ReLU(),\n        )\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/transforms.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/pytorch/vision/blob/main/torchvision/transforms/transforms.py\n'''\nimport math\nimport numbers\nimport warnings\nfrom enum import Enum\nfrom typing import Any\nfrom typing import Dict\nfrom typing import List\nfrom typing import Optional\nfrom typing import Sequence\nfrom typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn.functional import grid_sample\nfrom paddle.vision import transforms as T\n\n\nclass Normalize(nn.Layer):\n\n    def __init__(self, mean, std):\n        super(Normalize, self).__init__()\n        self.mean = paddle.to_tensor(mean)\n        self.std = paddle.to_tensor(std)\n\n    def forward(self, tensor: Tensor):\n        dtype = tensor.dtype\n        mean = paddle.to_tensor(self.mean, dtype=dtype)\n        std = paddle.to_tensor(self.std, dtype=dtype)\n        mean = mean.reshape([1, -1, 1, 1])\n        std = std.reshape([1, -1, 1, 1])\n        result = tensor.subtract(mean).divide(std)\n        return result\n\n\nclass InterpolationMode(Enum):\n    \"\"\"Interpolation modes\n    Available interpolation methods are ``nearest``, ``bilinear``, ``bicubic``, ``box``, ``hamming``, and ``lanczos``.\n    \"\"\"\n\n    NEAREST = \"nearest\"\n    BILINEAR = \"bilinear\"\n    BICUBIC = \"bicubic\"\n    # For PIL compatibility\n    BOX = \"box\"\n    HAMMING = \"hamming\"\n    LANCZOS = \"lanczos\"\n\n\nclass Grayscale(nn.Layer):\n\n    def __init__(self, num_output_channels):\n        super(Grayscale, self).__init__()\n        self.num_output_channels = num_output_channels\n\n    def forward(self, x):\n        output = (0.2989 * x[:, 0:1, :, :] + 0.587 * x[:, 1:2, :, :] + 0.114 * x[:, 2:3, :, :])\n        if self.num_output_channels == 3:\n            return output.expand(x.shape)\n\n        return output\n\n\nclass Lambda(nn.Layer):\n\n    def __init__(self, 
func):\n        super(Lambda, self).__init__()\n        self.transform = func\n\n    def forward(self, x):\n        return self.transform(x)\n\n\nclass RandomGrayscale(nn.Layer):\n\n    def __init__(self, p):\n        super(RandomGrayscale, self).__init__()\n        self.prob = p\n        self.transform = Grayscale(3)\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return self.transform(x)\n        else:\n            return x\n\n\nclass RandomHorizontalFlip(nn.Layer):\n\n    def __init__(self, prob):\n        super(RandomHorizontalFlip, self).__init__()\n        self.prob = prob\n\n    def forward(self, x):\n        if paddle.rand([1]) < self.prob:\n            return x[:, :, :, ::-1]\n        else:\n            return x\n\n\ndef _blend(img1: Tensor, img2: Tensor, ratio: float) -> Tensor:\n    ratio = float(ratio)\n    bound = 1.0\n    return (ratio * img1 + (1.0 - ratio) * img2).clip(0, bound)\n\n\ndef trunc_div(a, b):\n    ipt = paddle.divide(a, b)\n    sign_ipt = paddle.sign(ipt)\n    abs_ipt = paddle.abs(ipt)\n    abs_ipt = paddle.floor(abs_ipt)\n    out = paddle.multiply(sign_ipt, abs_ipt)\n    return out\n\n\ndef fmod(a, b):\n    return a - trunc_div(a, b) * b\n\n\ndef _rgb2hsv(img: Tensor) -> Tensor:\n    r, g, b = img.unbind(axis=-3)\n\n    # Implementation is based on https://github.com/python-pillow/Pillow/blob/4174d4267616897df3746d315d5a2d0f82c656ee/\n    # src/libImaging/Convert.c#L330\n    maxc = paddle.max(img, axis=-3)\n    minc = paddle.min(img, axis=-3)\n\n    # The algorithm erases S and H channel where `maxc = minc`. 
This avoids NaN\n    # from happening in the results, because\n    #   + S channel has division by `maxc`, which is zero only if `maxc = minc`\n    #   + H channel has division by `(maxc - minc)`.\n    #\n    # Instead of overwriting NaN afterwards, we just prevent it from occurring so\n    # we don't need to deal with it in case we save the NaN in a buffer in\n    # backprop, if it is ever supported, but it doesn't hurt to do so.\n    eqc = maxc == minc\n\n    cr = maxc - minc\n    # Since `eqc => cr = 0`, replacing the denominator with 1 where `eqc` holds is fine.\n    ones = paddle.ones_like(maxc)\n    s = cr / paddle.where(eqc, ones, maxc)\n    # Note that `eqc => maxc = minc = r = g = b`. So the following calculation\n    # of `h` would reduce to `bc - gc + 2 + rc - bc + 4 + rc - bc = 6` so it\n    # would not matter what values `rc`, `gc`, and `bc` have here, and thus\n    # replacing the denominator with 1 where `eqc` holds is fine.\n    cr_divisor = paddle.where(eqc, ones, cr)\n    rc = (maxc - r) / cr_divisor\n    gc = (maxc - g) / cr_divisor\n    bc = (maxc - b) / cr_divisor\n\n    hr = (maxc == r).cast('float32') * (bc - gc)\n    hg = ((maxc == g) & (maxc != r)).cast('float32') * (2.0 + rc - bc)\n    hb = ((maxc != g) & (maxc != r)).cast('float32') * (4.0 + gc - rc)\n    h = hr + hg + hb\n    h = fmod((h / 6.0 + 1.0), paddle.to_tensor(1.0))\n    return paddle.stack((h, s, maxc), axis=-3)\n\n\ndef _hsv2rgb(img: Tensor) -> Tensor:\n    h, s, v = img.unbind(axis=-3)\n    i = paddle.floor(h * 6.0)\n    f = (h * 6.0) - i\n    i = i.cast(dtype='int32')\n\n    p = paddle.clip((v * (1.0 - s)), 0.0, 1.0)\n    q = paddle.clip((v * (1.0 - s * f)), 0.0, 1.0)\n    t = paddle.clip((v * (1.0 - s * (1.0 - f))), 0.0, 1.0)\n    i = i % 6\n\n    mask = i.unsqueeze(axis=-3) == paddle.arange(6).reshape([-1, 1, 1])\n\n    a1 = paddle.stack((v, q, p, p, t, v), axis=-3)\n    a2 = paddle.stack((t, v, v, q, p, p), axis=-3)\n    a3 = paddle.stack((p, p, t, v, v, q), axis=-3)\n    a4 = paddle.stack((a1, 
a2, a3), axis=-4)\n\n    return paddle.einsum(\"...ijk, ...xijk -> ...xjk\", mask.cast(dtype=img.dtype), a4)\n\n\ndef adjust_brightness(img: Tensor, brightness_factor: float) -> Tensor:\n    if brightness_factor < 0:\n        raise ValueError(f\"brightness_factor ({brightness_factor}) is not non-negative.\")\n\n    return _blend(img, paddle.zeros_like(img), brightness_factor)\n\n\ndef adjust_contrast(img: Tensor, contrast_factor: float) -> Tensor:\n    if contrast_factor < 0:\n        raise ValueError(f\"contrast_factor ({contrast_factor}) is not non-negative.\")\n\n    c = img.shape[1]\n\n    if c == 3:\n        output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n        mean = paddle.mean(output, axis=(-3, -2, -1), keepdim=True)\n\n    else:\n        mean = paddle.mean(img, axis=(-3, -2, -1), keepdim=True)\n\n    return _blend(img, mean, contrast_factor)\n\n\ndef adjust_hue(img: Tensor, hue_factor: float) -> Tensor:\n    if not (-0.5 <= hue_factor <= 0.5):\n        raise ValueError(f\"hue_factor ({hue_factor}) is not in [-0.5, 0.5].\")\n\n    img = _rgb2hsv(img)\n    h, s, v = img.unbind(axis=-3)\n    h = fmod(h + hue_factor, paddle.to_tensor(1.0))\n    img = paddle.stack((h, s, v), axis=-3)\n    img_hue_adj = _hsv2rgb(img)\n    return img_hue_adj\n\n\ndef adjust_saturation(img: Tensor, saturation_factor: float) -> Tensor:\n    if saturation_factor < 0:\n        raise ValueError(f\"saturation_factor ({saturation_factor}) is not non-negative.\")\n\n    output = (0.2989 * img[:, 0:1, :, :] + 0.587 * img[:, 1:2, :, :] + 0.114 * img[:, 2:3, :, :])\n\n    return _blend(img, output, saturation_factor)\n\n\nclass ColorJitter(nn.Layer):\n\n    def __init__(self, brightness=0, contrast=0, saturation=0, hue=0):\n        super(ColorJitter, self).__init__()\n        self.brightness = self._check_input(brightness, \"brightness\")\n        self.contrast = self._check_input(contrast, \"contrast\")\n        self.saturation = 
self._check_input(saturation, \"saturation\")\n        self.hue = self._check_input(hue, \"hue\", center=0, bound=(-0.5, 0.5), clip_first_on_zero=False)\n\n    def _check_input(self, value, name, center=1, bound=(0, float(\"inf\")), clip_first_on_zero=True):\n        if isinstance(value, numbers.Number):\n            if value < 0:\n                raise ValueError(f\"If {name} is a single number, it must be non negative.\")\n            value = [center - float(value), center + float(value)]\n            if clip_first_on_zero:\n                value[0] = max(value[0], 0.0)\n        elif isinstance(value, (tuple, list)) and len(value) == 2:\n            if not bound[0] <= value[0] <= value[1] <= bound[1]:\n                raise ValueError(f\"{name} values should be between {bound}\")\n        else:\n            raise TypeError(f\"{name} should be a single number or a list/tuple with length 2.\")\n\n        # if value is 0 or (1., 1.) for brightness/contrast/saturation\n        # or (0., 0.) for hue, do nothing\n        if value[0] == value[1] == center:\n            value = None\n        return value\n\n    @staticmethod\n    def get_params(\n        brightness: Optional[List[float]],\n        contrast: Optional[List[float]],\n        saturation: Optional[List[float]],\n        hue: Optional[List[float]],\n    ) -> Tuple[Tensor, Optional[float], Optional[float], Optional[float], Optional[float]]:\n        \"\"\"Get the parameters for the randomized transform to be applied on image.\n\n        Args:\n            brightness (tuple of float (min, max), optional): The range from which the brightness_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            contrast (tuple of float (min, max), optional): The range from which the contrast_factor is chosen\n                uniformly. 
Pass None to turn off the transformation.\n            saturation (tuple of float (min, max), optional): The range from which the saturation_factor is chosen\n                uniformly. Pass None to turn off the transformation.\n            hue (tuple of float (min, max), optional): The range from which the hue_factor is chosen uniformly.\n                Pass None to turn off the transformation.\n\n        Returns:\n            tuple: The parameters used to apply the randomized transform\n            along with their random order.\n        \"\"\"\n        fn_idx = paddle.randperm(4)\n\n        b = None if brightness is None else paddle.empty([1]).uniform_(brightness[0], brightness[1])\n        c = None if contrast is None else paddle.empty([1]).uniform_(contrast[0], contrast[1])\n        s = None if saturation is None else paddle.empty([1]).uniform_(saturation[0], saturation[1])\n        h = None if hue is None else paddle.empty([1]).uniform_(hue[0], hue[1])\n\n        return fn_idx, b, c, s, h\n\n    def forward(self, img):\n        \"\"\"\n        Args:\n            img (PIL Image or Tensor): Input image.\n\n        Returns:\n            PIL Image or Tensor: Color jittered image.\n        \"\"\"\n        fn_idx, brightness_factor, contrast_factor, saturation_factor, hue_factor = self.get_params(\n            self.brightness, self.contrast, self.saturation, self.hue)\n\n        for fn_id in fn_idx:\n            if fn_id == 0 and brightness_factor is not None:\n                img = adjust_brightness(img, brightness_factor)\n            elif fn_id == 1 and contrast_factor is not None:\n                img = adjust_contrast(img, contrast_factor)\n            elif fn_id == 2 and saturation_factor is not None:\n                img = adjust_saturation(img, saturation_factor)\n            elif fn_id == 3 and hue_factor is not None:\n                img = adjust_hue(img, hue_factor)\n\n        return img\n\n    def __repr__(self) -> str:\n        s = 
(f\"{self.__class__.__name__}(\"\n             f\"brightness={self.brightness}\"\n             f\", contrast={self.contrast}\"\n             f\", saturation={self.saturation}\"\n             f\", hue={self.hue})\")\n        return s\n\n\ndef _apply_grid_transform(img: Tensor, grid: Tensor, mode: str, fill: Optional[List[float]]) -> Tensor:\n\n    if img.shape[0] > 1:\n        # Apply same grid to a batch of images\n        grid = grid.expand([img.shape[0], grid.shape[1], grid.shape[2], grid.shape[3]])\n\n    # Append a dummy mask for customized fill colors, should be faster than grid_sample() twice\n    if fill is not None:\n        dummy = paddle.ones((img.shape[0], 1, img.shape[2], img.shape[3]), dtype=img.dtype)\n        img = paddle.concat((img, dummy), axis=1)\n\n    img = grid_sample(img, grid, mode=mode, padding_mode=\"zeros\", align_corners=False)\n\n    # Fill with required color\n    if fill is not None:\n        mask = img[:, -1:, :, :]  # N * 1 * H * W\n        img = img[:, :-1, :, :]  # N * C * H * W\n        mask = mask.expand_as(img)\n        len_fill = len(fill) if isinstance(fill, (tuple, list)) else 1\n        fill_img = paddle.to_tensor(fill, dtype=img.dtype).reshape([1, len_fill, 1, 1]).expand_as(img)\n        if mode == \"nearest\":\n            mask = mask < 0.5\n            img[mask] = fill_img[mask]\n        else:  # 'bilinear'\n            img = img * mask + (1.0 - mask) * fill_img\n    return img\n\n\ndef _gen_affine_grid(\n    theta: Tensor,\n    w: int,\n    h: int,\n    ow: int,\n    oh: int,\n) -> Tensor:\n    # https://github.com/pytorch/pytorch/blob/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/aten/src/ATen/native/\n    # AffineGridGenerator.cpp#L18\n    # Difference with AffineGridGenerator is that:\n    # 1) we normalize grid values after applying theta\n    # 2) we can normalize by other image size, such that it covers \"extend\" option like in PIL.Image.rotate\n\n    d = 0.5\n    base_grid = paddle.empty([1, oh, ow, 3], 
dtype=theta.dtype)\n    x_grid = paddle.linspace(-ow * 0.5 + d, ow * 0.5 + d - 1, num=ow)\n    base_grid[..., 0] = (x_grid)\n    y_grid = paddle.linspace(-oh * 0.5 + d, oh * 0.5 + d - 1, num=oh).unsqueeze_(-1)\n    base_grid[..., 1] = (y_grid)\n    base_grid[..., 2] = 1.0\n    rescaled_theta = theta.transpose([0, 2, 1]) / paddle.to_tensor([0.5 * w, 0.5 * h], dtype=theta.dtype)\n    output_grid = base_grid.reshape([1, oh * ow, 3]).bmm(rescaled_theta)\n    return output_grid.reshape([1, oh, ow, 2])\n\n\ndef affine_impl(img: Tensor,\n                matrix: List[float],\n                interpolation: str = \"nearest\",\n                fill: Optional[List[float]] = None) -> Tensor:\n    theta = paddle.to_tensor(matrix, dtype=img.dtype).reshape([1, 2, 3])\n    shape = img.shape\n    # grid will be generated on the same device as theta and img\n    grid = _gen_affine_grid(theta, w=shape[-1], h=shape[-2], ow=shape[-1], oh=shape[-2])\n    return _apply_grid_transform(img, grid, interpolation, fill=fill)\n\n\ndef _get_inverse_affine_matrix(center: List[float],\n                               angle: float,\n                               translate: List[float],\n                               scale: float,\n                               shear: List[float],\n                               inverted: bool = True) -> List[float]:\n    # Helper method to compute inverse matrix for affine transformation\n\n    # Pillow requires inverse affine transformation matrix:\n    # Affine matrix is : M = T * C * RotateScaleShear * C^-1\n    #\n    # where T is translation matrix: [1, 0, tx | 0, 1, ty | 0, 0, 1]\n    #       C is translation matrix to keep center: [1, 0, cx | 0, 1, cy | 0, 0, 1]\n    #       RotateScaleShear is rotation with scale and shear matrix\n    #\n    #       RotateScaleShear(a, s, (sx, sy)) =\n    #       = R(a) * S(s) * SHy(sy) * SHx(sx)\n    #       = [ s*cos(a - sy)/cos(sy), s*(-cos(a - sy)*tan(sx)/cos(sy) - sin(a)), 0 ]\n    #         [ s*sin(a + sy)/cos(sy), 
s*(-sin(a - sy)*tan(sx)/cos(sy) + cos(a)), 0 ]\n    #         [ 0                    , 0                                      , 1 ]\n    # where R is a rotation matrix, S is a scaling matrix, and SHx and SHy are the shears:\n    # SHx(s) = [1, -tan(s)] and SHy(s) = [1      , 0]\n    #          [0, 1      ]              [-tan(s), 1]\n    #\n    # Thus, the inverse is M^-1 = C * RotateScaleShear^-1 * C^-1 * T^-1\n\n    rot = math.radians(angle)\n    sx = math.radians(shear[0])\n    sy = math.radians(shear[1])\n\n    cx, cy = center\n    tx, ty = translate\n\n    # RSS without scaling\n    a = math.cos(rot - sy) / math.cos(sy)\n    b = -math.cos(rot - sy) * math.tan(sx) / math.cos(sy) - math.sin(rot)\n    c = math.sin(rot - sy) / math.cos(sy)\n    d = -math.sin(rot - sy) * math.tan(sx) / math.cos(sy) + math.cos(rot)\n\n    if inverted:\n        # Inverted rotation matrix with scale and shear\n        # det([[a, b], [c, d]]) == 1, since det(rotation) = 1 and det(shear) = 1\n        matrix = [d, -b, 0.0, -c, a, 0.0]\n        matrix = [x / scale for x in matrix]\n        # Apply inverse of translation and of center translation: RSS^-1 * C^-1 * T^-1\n        matrix[2] += matrix[0] * (-cx - tx) + matrix[1] * (-cy - ty)\n        matrix[5] += matrix[3] * (-cx - tx) + matrix[4] * (-cy - ty)\n        # Apply center translation: C * RSS^-1 * C^-1 * T^-1\n        matrix[2] += cx\n        matrix[5] += cy\n    else:\n        matrix = [a, b, 0.0, c, d, 0.0]\n        matrix = [x * scale for x in matrix]\n        # Apply inverse of center translation: RSS * C^-1\n        matrix[2] += matrix[0] * (-cx) + matrix[1] * (-cy)\n        matrix[5] += matrix[3] * (-cx) + matrix[4] * (-cy)\n        # Apply translation and center : T * C * RSS * C^-1\n        matrix[2] += cx + tx\n        matrix[5] += cy + ty\n\n    return matrix\n\n\ndef affine(\n    img: Tensor,\n    angle: float,\n    translate: List[int],\n    scale: float,\n    shear: List[float],\n    interpolation: InterpolationMode = 
InterpolationMode.NEAREST,\n    fill: Optional[List[float]] = None,\n    resample: Optional[int] = None,\n    fillcolor: Optional[List[float]] = None,\n    center: Optional[List[int]] = None,\n) -> Tensor:\n    \"\"\"Apply affine transformation on the image keeping image center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... means an arbitrary number of leading dimensions.\n\n    Args:\n        img (PIL Image or Tensor): image to transform.\n        angle (number): rotation angle in degrees between -180 and 180, clockwise direction.\n        translate (sequence of integers): horizontal and vertical translations (post-rotation translation)\n        scale (float): overall scale\n        shear (float or sequence): shear angle value in degrees between -180 to 180, clockwise direction.\n            If a sequence is specified, the first value corresponds to a shear parallel to the x axis, while\n            the second value corresponds to a shear parallel to the y axis.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. Please use InterpolationMode enum.\n        fill (sequence or number, optional): Pixel fill value for the area outside the transformed\n            image. If given a number, the value is used for all bands respectively.\n\n            .. note::\n                In torchscript mode single int/float value is not supported, please use a sequence\n                of length 1: ``[value, ]``.\n        fillcolor (sequence or number, optional):\n            .. 
warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation. Origin is the upper left corner.\n            Default is the center of the image.\n\n    Returns:\n        PIL Image or Tensor: Transformed image.\n    \"\"\"\n\n    # Backward compatibility with integer value\n    if isinstance(interpolation, int):\n        warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                      \"Please use InterpolationMode enum.\")\n        interpolation = _interpolation_modes_from_int(interpolation)\n\n    if fillcolor is not None:\n        warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                      \"Please use 'fill' instead.\")\n        fill = fillcolor\n\n    if not isinstance(angle, (int, float)):\n        raise TypeError(\"Argument angle should be int or float\")\n\n    if not isinstance(translate, (list, tuple)):\n        raise TypeError(\"Argument translate should be a sequence\")\n\n    if len(translate) != 2:\n        raise ValueError(\"Argument translate should be a sequence of length 2\")\n\n    if scale <= 0.0:\n        raise ValueError(\"Argument scale should be positive\")\n\n    if not isinstance(shear, (numbers.Number, (list, tuple))):\n        raise TypeError(\"Shear should be either a single value or a sequence of two values\")\n\n    if not isinstance(interpolation, InterpolationMode):\n        raise TypeError(\"Argument interpolation should be a InterpolationMode\")\n\n    if isinstance(angle, int):\n        angle = float(angle)\n\n    if isinstance(translate, tuple):\n        translate = list(translate)\n\n    if isinstance(shear, numbers.Number):\n        shear = [shear, 0.0]\n\n    if isinstance(shear, tuple):\n        shear = list(shear)\n\n    if len(shear) == 1:\n        shear = [shear[0], shear[0]]\n\n    if len(shear) != 2:\n        raise ValueError(f\"Shear should be a sequence containing two values. 
Got {shear}\")\n\n    if center is not None and not isinstance(center, (list, tuple)):\n        raise TypeError(\"Argument center should be a sequence\")\n    center_f = [0.0, 0.0]\n    if center is not None:\n        _, height, width = img.shape[0], img.shape[1], img.shape[2]\n        # Center values should be in pixel coordinates but translated such that (0, 0) corresponds to image center.\n        center_f = [1.0 * (c - s * 0.5) for c, s in zip(center, [width, height])]\n\n    translate_f = [1.0 * t for t in translate]\n    matrix = _get_inverse_affine_matrix(center_f, angle, translate_f, scale, shear)\n    return affine_impl(img, matrix=matrix, interpolation=interpolation.value, fill=fill)\n\n\ndef _interpolation_modes_from_int(i: int) -> InterpolationMode:\n    inverse_modes_mapping = {\n        0: InterpolationMode.NEAREST,\n        2: InterpolationMode.BILINEAR,\n        3: InterpolationMode.BICUBIC,\n        4: InterpolationMode.BOX,\n        5: InterpolationMode.HAMMING,\n        1: InterpolationMode.LANCZOS,\n    }\n    return inverse_modes_mapping[i]\n\n\ndef _check_sequence_input(x, name, req_sizes):\n    msg = req_sizes[0] if len(req_sizes) < 2 else \" or \".join([str(s) for s in req_sizes])\n    if not isinstance(x, Sequence):\n        raise TypeError(f\"{name} should be a sequence of length {msg}.\")\n    if len(x) not in req_sizes:\n        raise ValueError(f\"{name} should be sequence of length {msg}.\")\n\n\ndef _setup_angle(x, name, req_sizes=(2, )):\n    if isinstance(x, numbers.Number):\n        if x < 0:\n            raise ValueError(f\"If {name} is a single number, it must be positive.\")\n        x = [-x, x]\n    else:\n        _check_sequence_input(x, name, req_sizes)\n\n    return [float(d) for d in x]\n\n\nclass RandomAffine(nn.Layer):\n    \"\"\"Random affine transformation of the image keeping center invariant.\n    If the image is paddle Tensor, it is expected\n    to have [..., H, W] shape, where ... 
means an arbitrary number of leading dimensions.\n\n    Args:\n        degrees (sequence or number): Range of degrees to select from.\n            If degrees is a number instead of sequence like (min, max), the range of degrees\n            will be (-degrees, +degrees). Set to 0 to deactivate rotations.\n        translate (tuple, optional): tuple of maximum absolute fraction for horizontal\n            and vertical translations. For example translate=(a, b), then horizontal shift\n            is randomly sampled in the range -img_width * a < dx < img_width * a and vertical shift is\n            randomly sampled in the range -img_height * b < dy < img_height * b. Will not translate by default.\n        scale (tuple, optional): scaling factor interval, e.g (a, b), then scale is\n            randomly sampled from the range a <= scale <= b. Will keep original scale by default.\n        shear (sequence or number, optional): Range of degrees to select from.\n            If shear is a number, a shear parallel to the x axis in the range (-shear, +shear)\n            will be applied. Else if shear is a sequence of 2 values a shear parallel to the x axis in the\n            range (shear[0], shear[1]) will be applied. Else if shear is a sequence of 4 values,\n            a x-axis shear in (shear[0], shear[1]) and y-axis shear in (shear[2], shear[3]) will be applied.\n            Will not apply shear by default.\n        interpolation (InterpolationMode): Desired interpolation enum defined by\n            :class:`torchvision.transforms.InterpolationMode`. Default is ``InterpolationMode.NEAREST``.\n            If input is Tensor, only ``InterpolationMode.NEAREST``, ``InterpolationMode.BILINEAR`` are supported.\n            For backward compatibility integer values (e.g. ``PIL.Image[.Resampling].NEAREST``) are still accepted,\n            but deprecated since 0.13 and will be removed in 0.15. 
Please use InterpolationMode enum.\n        fill (sequence or number): Pixel fill value for the area outside the transformed\n            image. Default is ``0``. If given a number, the value is used for all bands respectively.\n        fillcolor (sequence or number, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``fill`` instead.\n        resample (int, optional):\n            .. warning::\n                This parameter was deprecated in ``0.12`` and will be removed in ``0.14``. Please use ``interpolation``\n                instead.\n        center (sequence, optional): Optional center of rotation, (x, y). Origin is the upper left corner.\n            Default is the center of the image.\n\n    .. _filters: https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters\n\n    \"\"\"\n\n    def __init__(\n        self,\n        degrees,\n        translate=None,\n        scale=None,\n        shear=None,\n        interpolation=InterpolationMode.NEAREST,\n        fill=0,\n        fillcolor=None,\n        resample=None,\n        center=None,\n    ):\n        super(RandomAffine, self).__init__()\n        if resample is not None:\n            warnings.warn(\"The parameter 'resample' is deprecated since 0.12 and will be removed in 0.14. \"\n                          \"Please use 'interpolation' instead.\")\n            interpolation = _interpolation_modes_from_int(resample)\n\n        # Backward compatibility with integer value\n        if isinstance(interpolation, int):\n            warnings.warn(\"Argument 'interpolation' of type int is deprecated since 0.13 and will be removed in 0.15. \"\n                          \"Please use InterpolationMode enum.\")\n            interpolation = _interpolation_modes_from_int(interpolation)\n\n        if fillcolor is not None:\n            warnings.warn(\"The parameter 'fillcolor' is deprecated since 0.12 and will be removed in 0.14. 
\"\n                          \"Please use 'fill' instead.\")\n            fill = fillcolor\n\n        self.degrees = _setup_angle(degrees, name=\"degrees\", req_sizes=(2, ))\n\n        if translate is not None:\n            _check_sequence_input(translate, \"translate\", req_sizes=(2, ))\n            for t in translate:\n                if not (0.0 <= t <= 1.0):\n                    raise ValueError(\"translation values should be between 0 and 1\")\n        self.translate = translate\n\n        if scale is not None:\n            _check_sequence_input(scale, \"scale\", req_sizes=(2, ))\n            for s in scale:\n                if s <= 0:\n                    raise ValueError(\"scale values should be positive\")\n        self.scale = scale\n\n        if shear is not None:\n            self.shear = _setup_angle(shear, name=\"shear\", req_sizes=(2, 4))\n        else:\n            self.shear = shear\n\n        self.resample = self.interpolation = interpolation\n\n        if fill is None:\n            fill = 0\n        elif not isinstance(fill, (Sequence, numbers.Number)):\n            raise TypeError(\"Fill should be either a sequence or a number.\")\n\n        self.fillcolor = self.fill = fill\n\n        if center is not None:\n            _check_sequence_input(center, \"center\", req_sizes=(2, ))\n\n        self.center = center\n\n    @staticmethod\n    def get_params(\n        degrees: List[float],\n        translate: Optional[List[float]],\n        scale_ranges: Optional[List[float]],\n        shears: Optional[List[float]],\n        img_size: List[int],\n    ) -> Tuple[float, Tuple[int, int], float, Tuple[float, float]]:\n        \"\"\"Get parameters for affine transformation\n\n        Returns:\n            params to be passed to the affine transformation\n        \"\"\"\n        angle = float(paddle.empty([1]).uniform_(float(degrees[0]), float(degrees[1])))\n        if translate is not None:\n            max_dx = float(translate[0] * img_size[0])\n            
max_dy = float(translate[1] * img_size[1])\n            tx = int(float(paddle.empty([1]).uniform_(-max_dx, max_dx)))\n            ty = int(float(paddle.empty([1]).uniform_(-max_dy, max_dy)))\n            translations = (tx, ty)\n        else:\n            translations = (0, 0)\n\n        if scale_ranges is not None:\n            scale = float(paddle.empty([1]).uniform_(scale_ranges[0], scale_ranges[1]))\n        else:\n            scale = 1.0\n\n        shear_x = shear_y = 0.0\n        if shears is not None:\n            shear_x = float(paddle.empty([1]).uniform_(shears[0], shears[1]))\n            if len(shears) == 4:\n                shear_y = float(paddle.empty([1]).uniform_(shears[2], shears[3]))\n\n        shear = (shear_x, shear_y)\n\n        return angle, translations, scale, shear\n\n    def forward(self, img):\n        fill = self.fill\n        channels, height, width = img.shape[1], img.shape[2], img.shape[3]\n        if isinstance(fill, (int, float)):\n            fill = [float(fill)] * channels\n        else:\n            fill = [float(f) for f in fill]\n\n        img_size = [width, height]  # flip for keeping BC on get_params call\n\n        ret = self.get_params(self.degrees, self.translate, self.scale, self.shear, img_size)\n\n        return affine(img, *ret, interpolation=self.interpolation, fill=fill, center=self.center)\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(degrees={self.degrees}\"\n        s += f\", translate={self.translate}\" if self.translate is not None else \"\"\n        s += f\", scale={self.scale}\" if self.scale is not None else \"\"\n        s += f\", shear={self.shear}\" if self.shear is not None else \"\"\n        s += f\", interpolation={self.interpolation.value}\" if self.interpolation != InterpolationMode.NEAREST else \"\"\n        s += f\", fill={self.fill}\" if self.fill != 0 else \"\"\n        s += f\", center={self.center}\" if self.center is not None else \"\"\n        s += \")\"\n\n        
return s\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/model/unet.py",
    "content": "'''\nThis code is rewritten by Paddle based on\nhttps://github.com/openai/guided-diffusion/blob/main/guided_diffusion/unet.py\n'''\nimport math\nfrom abc import abstractmethod\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .nn import avg_pool_nd\nfrom .nn import checkpoint\nfrom .nn import conv_nd\nfrom .nn import linear\nfrom .nn import normalization\nfrom .nn import SiLU\nfrom .nn import timestep_embedding\nfrom .nn import zero_module\n\n\nclass AttentionPool2d(nn.Layer):\n    \"\"\"\n    Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py\n    \"\"\"\n\n    def __init__(\n        self,\n        spacial_dim: int,\n        embed_dim: int,\n        num_heads_channels: int,\n        output_dim: int = None,\n    ):\n        super().__init__()\n        # self.positional_embedding = nn.Parameter(\n        #     th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5\n        # )\n        # Layer.create_parameter expects a shape, not a tensor; apply the random init via Assign\n        positional_embedding = self.create_parameter(\n            shape=[embed_dim, spacial_dim**2 + 1],\n            default_initializer=nn.initializer.Assign(paddle.randn([embed_dim, spacial_dim**2 + 1]) / embed_dim**0.5))\n        self.add_parameter(\"positional_embedding\", positional_embedding)\n        self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)\n        self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)\n        self.num_heads = embed_dim // num_heads_channels\n        self.attention = QKVAttention(self.num_heads)\n\n    def forward(self, x):\n        b, c, *_spatial = x.shape\n        # x = x.reshape(b, c, -1)  # NC(HW)\n        x = paddle.reshape(x, [b, c, -1])\n        x = paddle.concat([x.mean(axis=-1, keepdim=True), x], axis=-1)  # NC(HW+1)\n        x = x + paddle.cast(self.positional_embedding[None, :, :], x.dtype)  # NC(HW+1)\n        x = self.qkv_proj(x)\n        x = self.attention(x)\n        x = self.c_proj(x)\n        return x[:, :, 0]\n\n\nclass TimestepBlock(nn.Layer):\n    \"\"\"\n    Any module where forward() takes timestep embeddings as a second argument.\n 
   \"\"\"\n\n    @abstractmethod\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the module to `x` given `emb` timestep embeddings.\n        \"\"\"\n\n\nclass TimestepEmbedSequential(nn.Sequential, TimestepBlock):\n    \"\"\"\n    A sequential module that passes timestep embeddings to the children that\n    support it as an extra input.\n    \"\"\"\n\n    def forward(self, x, emb):\n        for layer in self:\n            if isinstance(layer, TimestepBlock):\n                x = layer(x, emb)\n            else:\n                x = layer(x)\n        return x\n\n\nclass Upsample(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        if use_conv:\n            self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=1)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.dims == 3:\n            x = F.interpolate(x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode=\"nearest\")\n        else:\n            x = F.interpolate(x, scale_factor=2, mode=\"nearest\")\n        if self.use_conv:\n            x = self.conv(x)\n        return x\n\n\nclass Downsample(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs.\n    :param use_conv: a bool determining if a convolution is applied.\n    :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv, dims=2, out_channels=None):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.dims = dims\n        stride = 2 if dims != 3 else (1, 2, 2)\n        if use_conv:\n            self.op = conv_nd(dims, self.channels, self.out_channels, 3, stride=stride, padding=1)\n        else:\n            assert self.channels == self.out_channels\n            self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        return self.op(x)\n\n\nclass ResBlock(TimestepBlock):\n    \"\"\"\n    A residual block that can optionally change the number of channels.\n\n    :param channels: the number of input channels.\n    :param emb_channels: the number of timestep embedding channels.\n    :param dropout: the rate of dropout.\n    :param out_channels: if specified, the number of out channels.\n    :param use_conv: if True and out_channels is specified, use a spatial\n        convolution instead of a smaller 1x1 convolution to change the\n        channels in the skip connection.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param use_checkpoint: if True, use gradient checkpointing on this module.\n    :param up: if True, use this block for upsampling.\n    :param down: if True, use this block for downsampling.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        emb_channels,\n        dropout,\n        out_channels=None,\n        use_conv=False,\n        use_scale_shift_norm=False,\n        dims=2,\n        use_checkpoint=False,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        self.emb_channels = emb_channels\n        self.dropout = dropout\n        
self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_checkpoint = use_checkpoint\n        self.use_scale_shift_norm = use_scale_shift_norm\n\n        self.in_layers = nn.Sequential(\n            normalization(channels),\n            SiLU(),\n            conv_nd(dims, channels, self.out_channels, 3, padding=1),\n        )\n\n        self.updown = up or down\n\n        if up:\n            self.h_upd = Upsample(channels, False, dims)\n            self.x_upd = Upsample(channels, False, dims)\n        elif down:\n            self.h_upd = Downsample(channels, False, dims)\n            self.x_upd = Downsample(channels, False, dims)\n        else:\n            self.h_upd = self.x_upd = nn.Identity()\n\n        self.emb_layers = nn.Sequential(\n            SiLU(),\n            linear(\n                emb_channels,\n                2 * self.out_channels if use_scale_shift_norm else self.out_channels,\n            ),\n        )\n        self.out_layers = nn.Sequential(\n            normalization(self.out_channels),\n            SiLU(),\n            nn.Dropout(p=dropout),\n            zero_module(conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)),\n        )\n\n        if self.out_channels == channels:\n            self.skip_connection = nn.Identity()\n        elif use_conv:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 3, padding=1)\n        else:\n            self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)\n\n    def forward(self, x, emb):\n        \"\"\"\n        Apply the block to a Tensor, conditioned on a timestep embedding.\n\n        :param x: an [N x C x ...] Tensor of features.\n        :param emb: an [N x emb_channels] Tensor of timestep embeddings.\n        :return: an [N x C x ...] 
Tensor of outputs.\n        \"\"\"\n        return checkpoint(self._forward, (x, emb), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x, emb):\n        if self.updown:\n            in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]\n            h = in_rest(x)\n            h = self.h_upd(h)\n            x = self.x_upd(x)\n            h = in_conv(h)\n        else:\n            h = self.in_layers(x)\n        emb_out = self.emb_layers(emb)\n        emb_out = paddle.cast(emb_out, h.dtype)\n        while len(emb_out.shape) < len(h.shape):\n            emb_out = emb_out[..., None]\n        if self.use_scale_shift_norm:\n            out_norm, out_rest = self.out_layers[0], self.out_layers[1:]\n            scale, shift = paddle.chunk(emb_out, 2, axis=1)\n            h = out_norm(h) * (1 + scale) + shift\n            h = out_rest(h)\n        else:\n            h = h + emb_out\n            h = self.out_layers(h)\n        return self.skip_connection(x) + h\n\n\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=-1,\n        use_checkpoint=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels == -1:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n        self.use_checkpoint = use_checkpoint\n        self.norm = normalization(channels)\n        self.qkv = 
conv_nd(1, channels, channels * 3, 1)\n        if use_new_attention_order:\n            # split qkv before split heads\n            self.attention = QKVAttention(self.num_heads)\n        else:\n            # split heads before split qkv\n            self.attention = QKVAttentionLegacy(self.num_heads)\n\n        self.proj_out = zero_module(conv_nd(1, channels, channels, 1))\n\n    def forward(self, x):\n        return checkpoint(self._forward, (x, ), self.parameters(), self.use_checkpoint)\n\n    def _forward(self, x):\n        b, c, *spatial = x.shape\n        # x = x.reshape(b, c, -1)\n        x = paddle.reshape(x, [b, c, -1])\n        qkv = self.qkv(self.norm(x))\n        h = self.attention(qkv)\n        h = self.proj_out(h)\n        # return (x + h).reshape(b, c, *spatial)\n        return paddle.reshape(x + h, [b, c, *spatial])\n\n\ndef count_flops_attn(model, _x, y):\n    \"\"\"\n    A counter for the `thop` package to count the operations in an\n    attention operation.\n    Meant to be used like:\n        macs, params = thop.profile(\n            model,\n            inputs=(inputs, timestamps),\n            custom_ops={QKVAttention: QKVAttention.count_flops},\n        )\n    \"\"\"\n    b, c, *spatial = y[0].shape\n    num_spatial = int(np.prod(spatial))\n    # We perform two matmuls with the same number of ops.\n    # The first computes the weight matrix, the second computes\n    # the combination of the value vectors.\n    matmul_ops = 2 * b * (num_spatial**2) * c\n    model.total_ops += paddle.to_tensor([matmul_ops], dtype='float64')\n\n\nclass QKVAttentionLegacy(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention. 
Matches legacy QKVAttention + input/output heads shaping\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)\n        q, k, v = paddle.reshape(qkv, [bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass QKVAttention(nn.Layer):\n    \"\"\"\n    A module which performs QKV attention and splits in a different order.\n    \"\"\"\n\n    def __init__(self, n_heads):\n        super().__init__()\n        self.n_heads = n_heads\n\n    def forward(self, qkv):\n        \"\"\"\n        Apply QKV attention.\n\n        :param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.\n        :return: an [N x (H * C) x T] tensor after attention.\n        \"\"\"\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.chunk(3, axis=1)\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        # torch-style .view replaced with paddle.reshape (shape passed as a list)\n        weight = paddle.einsum(\n            \"bct,bcs->bts\",\n            paddle.reshape(q * scale, [bs * self.n_heads, ch, length]),\n            paddle.reshape(k * scale, [bs * self.n_heads, ch, length]),\n        )  # More stable with f16 than dividing afterwards\n        weight = paddle.cast(nn.functional.softmax(paddle.cast(weight, 'float32'), axis=-1), weight.dtype)\n        a = paddle.einsum(\"bts,bcs->bct\", weight, paddle.reshape(v, [bs * self.n_heads, ch, length]))\n        # return a.reshape(bs, -1, length)\n        return paddle.reshape(a, [bs, -1, length])\n\n    @staticmethod\n    def count_flops(model, _x, y):\n        return count_flops_attn(model, _x, y)\n\n\nclass UNetModel(nn.Layer):\n    \"\"\"\n    The full UNet model with attention and timestep embedding.\n\n    :param in_channels: channels in the input Tensor.\n    :param model_channels: base channel count for the model.\n    :param out_channels: channels in the output Tensor.\n    :param num_res_blocks: number of residual blocks per downsample.\n    :param attention_resolutions: a collection of downsample rates at which\n        attention will take place. 
May be a set, list, or tuple.\n        For example, if this contains 4, then at 4x downsampling, attention\n        will be used.\n    :param dropout: the dropout probability.\n    :param channel_mult: channel multiplier for each level of the UNet.\n    :param conv_resample: if True, use learned convolutions for upsampling and\n        downsampling.\n    :param dims: determines if the signal is 1D, 2D, or 3D.\n    :param num_classes: if specified (as an int), then this model will be\n        class-conditional with `num_classes` classes.\n    :param use_checkpoint: use gradient checkpointing to reduce memory usage.\n    :param num_heads: the number of attention heads in each attention layer.\n    :param num_heads_channels: if specified, ignore num_heads and instead use\n                               a fixed channel width per attention head.\n    :param num_heads_upsample: works with num_heads to set a different number\n                               of heads for upsampling. Deprecated.\n    :param use_scale_shift_norm: use a FiLM-like conditioning mechanism.\n    :param resblock_updown: use residual blocks for up/downsampling.\n    :param use_new_attention_order: use a different attention pattern for potentially\n                                    increased efficiency.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        num_classes=None,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        
self.image_size = image_size\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.num_classes = num_classes\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        if self.num_classes is not None:\n            self.label_emb = nn.Embedding(num_classes, time_embed_dim)\n\n        ch = input_ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n           
                 ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n                self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n          
      dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n\n        self.output_blocks = nn.LayerList([])\n        for level, mult in list(enumerate(channel_mult))[::-1]:\n            for i in range(num_res_blocks + 1):\n                ich = input_block_chans.pop()\n                layers = [\n                    ResBlock(\n                        ch + ich,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(model_channels * mult),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(model_channels * mult)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads_upsample,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                if level and i == num_res_blocks:\n                    out_ch = ch\n                    layers.append(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            up=True,\n                        ) if resblock_updown else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch))\n                    ds 
//= 2\n                self.output_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n\n        self.out = nn.Sequential(\n            normalization(ch),\n            SiLU(),\n            zero_module(conv_nd(dims, input_ch, out_channels, 3, padding=1)),\n        )\n\n    def forward(self, x, timesteps, y=None):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :param y: an [N] Tensor of labels, if class-conditional.\n        :return: an [N x C x ...] Tensor of outputs.\n        \"\"\"\n        assert (y is not None) == (self.num_classes\n                                   is not None), \"must specify y if and only if the model is class-conditional\"\n\n        hs = []\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n        if self.num_classes is not None:\n            assert y.shape == (x.shape[0], )\n            emb = emb + self.label_emb(y)\n\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            hs.append(h)\n        h = self.middle_block(h, emb)\n        for module in self.output_blocks:\n            h = paddle.concat([h, hs.pop()], axis=1)\n            h = module(h, emb)\n        # h = paddle.cast(h, x.dtype)\n        return self.out(h)\n\n\nclass SuperResModel(UNetModel):\n    \"\"\"\n    A UNetModel that performs super-resolution.\n\n    Expects an extra kwarg `low_res` to condition on a low-resolution image.\n    \"\"\"\n\n    def __init__(self, image_size, in_channels, *args, **kwargs):\n        super().__init__(image_size, in_channels * 2, *args, **kwargs)\n\n    def forward(self, x, timesteps, low_res=None, **kwargs):\n        _, _, new_height, new_width = x.shape\n        upsampled = F.interpolate(low_res, (new_height, new_width), mode=\"bilinear\")\n        x = paddle.concat([x, 
upsampled], axis=1)\n        return super().forward(x, timesteps, **kwargs)\n\n\nclass EncoderUNetModel(nn.Layer):\n    \"\"\"\n    The half UNet model with attention and timestep embedding.\n\n    For usage, see UNet.\n    \"\"\"\n\n    def __init__(\n        self,\n        image_size,\n        in_channels,\n        model_channels,\n        out_channels,\n        num_res_blocks,\n        attention_resolutions,\n        dropout=0,\n        channel_mult=(1, 2, 4, 8),\n        conv_resample=True,\n        dims=2,\n        use_checkpoint=False,\n        use_fp16=False,\n        num_heads=1,\n        num_head_channels=-1,\n        num_heads_upsample=-1,\n        use_scale_shift_norm=False,\n        resblock_updown=False,\n        use_new_attention_order=False,\n        pool=\"adaptive\",\n    ):\n        super().__init__()\n\n        if num_heads_upsample == -1:\n            num_heads_upsample = num_heads\n\n        self.in_channels = in_channels\n        self.model_channels = model_channels\n        self.out_channels = out_channels\n        self.num_res_blocks = num_res_blocks\n        self.attention_resolutions = attention_resolutions\n        self.dropout = dropout\n        self.channel_mult = channel_mult\n        self.conv_resample = conv_resample\n        self.use_checkpoint = use_checkpoint\n        self.dtype = paddle.float16 if use_fp16 else paddle.float32\n        self.num_heads = num_heads\n        self.num_head_channels = num_head_channels\n        self.num_heads_upsample = num_heads_upsample\n\n        time_embed_dim = model_channels * 4\n        self.time_embed = nn.Sequential(\n            linear(model_channels, time_embed_dim),\n            SiLU(),\n            linear(time_embed_dim, time_embed_dim),\n        )\n\n        ch = int(channel_mult[0] * model_channels)\n        self.input_blocks = nn.LayerList([TimestepEmbedSequential(conv_nd(dims, in_channels, ch, 3, padding=1))])\n        self._feature_size = ch\n        input_block_chans = [ch]\n        
ds = 1\n        for level, mult in enumerate(channel_mult):\n            for _ in range(num_res_blocks):\n                layers = [\n                    ResBlock(\n                        ch,\n                        time_embed_dim,\n                        dropout,\n                        out_channels=int(mult * model_channels),\n                        dims=dims,\n                        use_checkpoint=use_checkpoint,\n                        use_scale_shift_norm=use_scale_shift_norm,\n                    )\n                ]\n                ch = int(mult * model_channels)\n                if ds in attention_resolutions:\n                    layers.append(\n                        AttentionBlock(\n                            ch,\n                            use_checkpoint=use_checkpoint,\n                            num_heads=num_heads,\n                            num_head_channels=num_head_channels,\n                            use_new_attention_order=use_new_attention_order,\n                        ))\n                self.input_blocks.append(TimestepEmbedSequential(*layers))\n                self._feature_size += ch\n                input_block_chans.append(ch)\n            if level != len(channel_mult) - 1:\n                out_ch = ch\n                self.input_blocks.append(\n                    TimestepEmbedSequential(\n                        ResBlock(\n                            ch,\n                            time_embed_dim,\n                            dropout,\n                            out_channels=out_ch,\n                            dims=dims,\n                            use_checkpoint=use_checkpoint,\n                            use_scale_shift_norm=use_scale_shift_norm,\n                            down=True,\n                        ) if resblock_updown else Downsample(ch, conv_resample, dims=dims, out_channels=out_ch)))\n                ch = out_ch\n                input_block_chans.append(ch)\n                ds *= 2\n               
 self._feature_size += ch\n\n        self.middle_block = TimestepEmbedSequential(\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n            AttentionBlock(\n                ch,\n                use_checkpoint=use_checkpoint,\n                num_heads=num_heads,\n                num_head_channels=num_head_channels,\n                use_new_attention_order=use_new_attention_order,\n            ),\n            ResBlock(\n                ch,\n                time_embed_dim,\n                dropout,\n                dims=dims,\n                use_checkpoint=use_checkpoint,\n                use_scale_shift_norm=use_scale_shift_norm,\n            ),\n        )\n        self._feature_size += ch\n        self.pool = pool\n        if pool == \"adaptive\":\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                nn.AdaptiveAvgPool2D((1, 1)),\n                zero_module(conv_nd(dims, ch, out_channels, 1)),\n                nn.Flatten(),\n            )\n        elif pool == \"attention\":\n            assert num_head_channels != -1\n            self.out = nn.Sequential(\n                normalization(ch),\n                SiLU(),\n                AttentionPool2d((image_size // ds), ch, num_head_channels, out_channels),\n            )\n        elif pool == \"spatial\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                nn.ReLU(),\n                nn.Linear(2048, self.out_channels),\n            )\n        elif pool == \"spatial_v2\":\n            self.out = nn.Sequential(\n                nn.Linear(self._feature_size, 2048),\n                normalization(2048),\n                SiLU(),\n                nn.Linear(2048, self.out_channels),\n            
)\n        else:\n            raise NotImplementedError(f\"Unexpected {pool} pooling\")\n\n    def forward(self, x, timesteps):\n        \"\"\"\n        Apply the model to an input batch.\n\n        :param x: an [N x C x ...] Tensor of inputs.\n        :param timesteps: a 1-D batch of timesteps.\n        :return: an [N x K] Tensor of outputs.\n        \"\"\"\n        emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))\n\n        results = []\n        # h = x.type(self.dtype)\n        h = paddle.cast(x, self.dtype)\n        for module in self.input_blocks:\n            h = module(h, emb)\n            if self.pool.startswith(\"spatial\"):\n                # results.append(h.type(x.dtype).mean(axis=(2, 3)))\n                results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n        h = self.middle_block(h, emb)\n        if self.pool.startswith(\"spatial\"):\n            results.append(paddle.cast(h, x.dtype).mean(axis=(2, 3)))\n            h = paddle.concat(results, axis=-1)\n            return self.out(h)\n        else:\n            # h = h.type(x.dtype)\n            h = paddle.cast(h, x.dtype)\n            return self.out(h)\n"
  },
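The `UNetModel` constructor above threads a running channel count (`ch`) through the down path and pushes every block's width onto `input_block_chans`; the up path then pops that stack so each output block concatenates the matching skip tensor (`ResBlock(ch + ich, …)`). A minimal sketch of just that bookkeeping — plain Python, no Paddle; `unet_channel_plan` is a hypothetical helper, not part of the module — shows the stack is consumed exactly once:

```python
def unet_channel_plan(model_channels=64, channel_mult=(1, 2, 4, 8), num_res_blocks=2):
    # Mirrors only the channel arithmetic of UNetModel.__init__, not the layers.
    ch = int(channel_mult[0] * model_channels)
    input_block_chans = [ch]          # the stem conv's output is recorded first
    for level, mult in enumerate(channel_mult):
        for _ in range(num_res_blocks):
            ch = int(mult * model_channels)
            input_block_chans.append(ch)
        if level != len(channel_mult) - 1:
            input_block_chans.append(ch)   # downsample block keeps the width
    concat_channels = []
    for level, mult in list(enumerate(channel_mult))[::-1]:
        for _ in range(num_res_blocks + 1):
            ich = input_block_chans.pop()      # skip connection popped in LIFO order
            concat_channels.append(ch + ich)   # width fed into the up-path ResBlock
            ch = int(mult * model_channels)
    assert not input_block_chans  # every skip is consumed exactly once
    return concat_channels
```

With the defaults assumed here, the down path records 12 widths and the up path performs 12 concatenations, which is why `forward` can simply `hs.pop()` in its output loop.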
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/resources/default.yml",
    "content": "text_prompts:\n  - greg rutkowski和thomas kinkade在artstation上的一幅美丽的画，一个独特的灯塔，照耀着它的光穿过喧嚣的血海。\n\ninit_image:\nwidth_height: [ 1280, 768]\n\nskip_steps: 10\nsteps: 250\n\ncut_ic_pow: 1\ninit_scale: 1000\nclip_guidance_scale: 5000\n\ntv_scale: 0\nrange_scale: 150\nsat_scale: 0\ncutn_batches: 4\n\ndiffusion_model: 512x512_diffusion_uncond_finetune_008100\nuse_secondary_model: True\ndiffusion_sampling_mode: ddim\n\nperlin_init: False\nperlin_mode: mixed\nseed: 445467575\neta: 0.8\nclamp_grad: True\nclamp_max: 0.05\n\nrandomize_class: True\nclip_denoised: False\nfuzzy_prompt: False\nrand_mag: 0.05\n\ncut_overview: \"[12]*400+[4]*600\"\ncut_innercut: \"[4]*400+[12]*600\"\ncut_icgray_p: \"[0.2]*400+[0]*600\"\n\ndisplay_rate: 10\nn_batches: 1\nbatch_size: 1\nbatch_name: ''\nclip_models:\n  - ViTB16\noutput_dir: \"./\"\n"
  },
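`default.yml` above requests a 1280×768 canvas. As the docstrings note, each edge must be a multiple of 64 (the UNet halves and doubles the resolution repeatedly), and `runner.py` silently floors non-conforming sizes via `side_x = (args.width_height[0] // 64) * 64`. A small sketch of that snapping, with a hypothetical helper name:

```python
def snap_to_64(width, height):
    # Same arithmetic as side_x/side_y in runner.py: floor each edge
    # to the nearest multiple of 64 before diffusion starts.
    return (width // 64) * 64, (height // 64) * 64
```

So the default `[1280, 768]` passes through unchanged, while a request like 1000×700 would actually render at 960×640.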
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/resources/docstrings.yml",
    "content": "text_prompts: |\n  Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"\n  Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments.\n  Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\ninit_image: |\n  Recall that in the image sequence above, the first image shown is just noise.  If an init_image is provided, diffusion will replace the noise with the init_image as its starting state.  To use an init_image, upload the image to the Colab instance or your Google Drive, and enter the full image path here.\n  If using an init_image, you may need to increase skip_steps to ~ 50% of total steps to retain the character of the init. See skip_steps above for further discussion.\nwidth_height: |\n  Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n\nskip_steps: |\n  Consider the chart shown here.  
Noise scheduling (denoise strength) starts very high and progressively gets lower and lower as diffusion steps progress. The noise levels in the first few steps are very high, so images change dramatically in early steps.\n  As DD moves along the curve, noise levels (and thus the amount an image changes per step) declines, and image coherence from one step to the next increases.\n  The first few steps of denoising are often so dramatic that some steps (maybe 10-15% of total) can be skipped without affecting the final image. You can experiment with this as a way to cut render times.\n  If you skip too many steps, however, the remaining noise may not be high enough to generate new content, and thus may not have ‘time left’ to finish an image satisfactorily.\n  Also, depending on your other settings, you may need to skip steps to prevent CLIP from overshooting your goal, resulting in ‘blown out’ colors (hyper saturated, solid white, or solid black regions) or otherwise poor image quality.  Consider that the denoising process is at its strongest in the early steps, so skipping steps can sometimes mitigate other problems.\n  Lastly, if using an init_image, you will need to skip ~50% of the diffusion steps to retain the shapes in the original init image.\n  However, if you’re using an init_image, you can also adjust skip_steps up or down for creative reasons.  With low skip_steps you can get a result \"inspired by\" the init_image which will retain the colors and rough layout and shapes but look quite different. With high skip_steps you can preserve most of the init_image contents and just do fine tuning of the texture.\n\nsteps: |\n  When creating an image, the denoising curve is subdivided into steps for processing. Each step (or iteration) involves the AI looking at subsets of the image called ‘cuts’ and calculating the ‘direction’ the image should be guided to be more like the prompt. 
Then it adjusts the image with the help of the diffusion denoiser, and moves to the next step.\n  Increasing steps will provide more opportunities for the AI to adjust the image, and each adjustment will be smaller, and thus will yield a more precise, detailed image.  Increasing steps comes at the expense of longer render times.  Also, while increasing steps should generally increase image quality, there is a diminishing return on additional steps beyond 250 - 500 steps.  However, some intricate images can take 1000, 2000, or more steps.  It is really up to the user.\n  Just know that the render time is directly related to the number of steps, and many other parameters have a major impact on image quality, without costing additional time.\n\ncut_ic_pow: |\n  This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ninit_scale: |\n  This controls how strongly CLIP will try to match the init_image provided.  This is balanced against the clip_guidance_scale (CGS) above.  Too much init scale, and the image won’t change much during diffusion. Too much CGS and the init image will be lost.\nclip_guidance_scale: |\n  CGS is one of the most important parameters you will use. It tells DD how strongly you want CLIP to move toward your prompt each timestep.  Higher is generally better, but if CGS is too strong it will overshoot the goal and distort the image. So a happy medium is needed, and it takes experience to learn how to adjust CGS.\n  Note that this parameter generally scales with image dimensions. In other words, if you increase your total dimensions by 50% (e.g. 
a change from 512 x 512 to 512 x 768), then to maintain the same effect on the image, you’d want to increase clip_guidance_scale from 5000 to 7500.\n  Of the basic settings, clip_guidance_scale, steps and skip_steps are the most important contributors to image quality, so learn them well.\ntv_scale: |\n  Total variance denoising. Optional, set to zero to turn off. Controls ‘smoothness’ of final output. If used, tv_scale will try to smooth out your final image to reduce overall noise. If your image is too ‘crunchy’, increase tv_scale. TV denoising is good at preserving edges while smoothing away noise in flat regions.  See https://en.wikipedia.org/wiki/Total_variation_denoising\nrange_scale: |\n  Optional, set to zero to turn off.  Used for adjustment of color contrast.  Lower range_scale will increase contrast. Very low numbers create a reduced color palette, resulting in more vibrant or poster-like images. Higher range_scale will reduce contrast, for more muted images.\n\nsat_scale: |\n  Saturation scale. Optional, set to zero to turn off.  If used, sat_scale will help mitigate oversaturation. If your image is too saturated, increase sat_scale to reduce the saturation.\ncutn_batches: |\n  Each iteration, the AI cuts the image into smaller pieces known as cuts, and compares each cut to the prompt to decide how to guide the next diffusion step.  More cuts can generally lead to better images, since DD has more chances to fine-tune the image precision in each timestep.\n  Additional cuts are memory intensive, however, and if DD tries to evaluate too many cuts at once, it can run out of memory.  You can use cutn_batches to increase cuts per timestep without increasing memory usage.\n  At the default settings, DD is scheduled to do 16 cuts per timestep.  
If cutn_batches is set to 1, there will indeed only be 16 cuts total per timestep.\n  However, if cutn_batches is increased to 4, DD will do 64 cuts total in each timestep, divided into 4 sequential batches of 16 cuts each.  Because the cuts are being evaluated only 16 at a time, DD uses the memory required for only 16 cuts, but gives you the quality benefit of 64 cuts.  The tradeoff, of course, is that this will take ~4 times as long to render each image.\n  So, (scheduled cuts) x (cutn_batches) = (total cuts per timestep). Increasing cutn_batches will increase render times, however, as the work is being done sequentially.  DD’s default cut schedule is a good place to start, but the cut schedule can be adjusted in the Cutn Scheduling section, explained below.\n\ndiffusion_model: Diffusion_model of choice.\n\nuse_secondary_model: |\n  Option to use a secondary purpose-made diffusion model to clean up interim diffusion images for CLIP evaluation.    If this option is turned off, DD will use the regular (large) diffusion model.    Using the secondary model is faster - one user reported a 50% improvement in render speed! However, the secondary model is much smaller, and may reduce image quality and detail.  I suggest you experiment with this.\n\ndiffusion_sampling_mode: |\n  Two alternate diffusion denoising algorithms. ddim has been around longer, and is more established and tested.  plms is a newly added alternate method that promises good diffusion results in fewer steps, but has not been as fully tested and may have side effects. This new plms mode is actively being researched in the #settings-and-techniques channel in the DD Discord.\n\nperlin_init: |\n  Normally, DD will use an image filled with random noise as a starting point for the diffusion curve.  If perlin_init is selected, DD will instead use a Perlin noise model as an initial state.  
Perlin has very interesting characteristics, distinct from random noise, so it’s worth experimenting with this for your projects. Beyond perlin, you can, of course, generate your own noise images (such as with GIMP, etc) and use them as an init_image (without skipping steps).\n  Choosing perlin_init does not affect the actual diffusion process, just the starting point for the diffusion. Please note that selecting a perlin_init will replace and override any init_image you may have specified.  Further, because the 2D, 3D and video animation systems all rely on the init_image system, if you enable Perlin while using animation modes, the perlin_init will jump in front of any previous image or video input, and DD will NOT give you the expected sequence of coherent images. All of that said, using Perlin and animation modes together do make a very colorful rainbow effect, which can be used creatively.\n\nperlin_mode: |\n  sets type of Perlin noise: colored, gray, or a mix of both, giving you additional options for noise types. Experiment to see what these do in your projects.\nseed: |\n  Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.  This is useful if you like a particular result and would like to run more iterations that will be similar.\n  After each run, the actual seed value used will be reported in the parameters report, and can be reused if desired by entering seed # here.  If a specific numerical seed is used repeatedly, the resulting images will be quite similar but not identical.\neta: |\n  eta (greek letter η) is a diffusion model variable that mixes in a random amount of scaled noise into each timestep. 0 is no noise, 1.0 is more noise. 
As with most DD parameters, you can go below zero for eta, but it may give you unpredictable results.\n  The steps parameter has a close relationship with the eta parameter. If you set eta to 0, then you can get decent output with only 50-75 steps. Setting eta to 1.0 favors higher step counts, ideally around 250 and up. eta has a subtle, unpredictable effect on image, so you’ll need to experiment to see how this affects your projects.\nclamp_grad: |\n  As I understand it, clamp_grad is an internal limiter that stops DD from producing extreme results.  Try your images with and without clamp_grad. If the image changes drastically with clamp_grad turned off, it probably means your clip_guidance_scale is too high and should be reduced.\nclamp_max: |\n  Sets the value of the clamp_grad limitation. Default is 0.05, providing for smoother, more muted coloration in images, but setting higher values (0.15-0.3) can provide interesting contrast and vibrancy.\n\nrandomize_class:\nclip_denoised: False\nfuzzy_prompt: |\n  Controls whether to add multiple noisy prompts to the prompt losses. If True, can increase variability of image output. Experiment with this.\nrand_mag: |\n  Affects only the fuzzy_prompt.  Controls the magnitude of the random noise added by fuzzy_prompt.\n\ncut_overview: The schedule of overview cuts\ncut_innercut: The schedule of inner cuts\ncut_icgray_p: This sets the size of the border used for inner cuts.  High cut_ic_pow values have larger borders, and therefore the cuts themselves will be smaller and provide finer details.  If you have too many or too-small inner cuts, you may lose overall image coherency and/or it may cause an undesirable ‘mosaic’ effect.   Low cut_ic_pow values will allow the inner cuts to be larger, helping image coherency while still helping with some details.\n\ndisplay_rate: |\n  During a diffusion run, you can monitor the progress of each image being created with this variable.  
If display_rate is set to 50, DD will show you the in-progress image every 50 timesteps. Setting this to a lower value, like 5 or 10, is a good way to get an early peek at where your image is heading. If you don’t like the progression, just interrupt execution, change some settings, and re-run.  If you are planning a long, unmonitored batch, it’s better to set display_rate equal to steps, because displaying interim images does slow Colab down slightly.\nn_batches: |\n  This variable sets the number of still images you want DD to create.  If you are using an animation mode (see below for details) DD will ignore n_batches and create a single set of animated frames based on the animation settings.\nbatch_name: |\n  The name of the batch; the batch id will be named as \"discoart-[batch_name]-seed\". To avoid your artworks being overwritten by other users, please use a unique name.\nclip_models: |\n  CLIP Model selectors. ViT-B/32, ViT-B/16, ViT-L/14, RN101, RN50, RN50x4, RN50x16, RN50x64.\n  These various CLIP models are available for you to use during image generation.  Models have different styles or ‘flavors,’ so look around.\n  You can mix in multiple models as well for different results.  However, keep in mind that some models are extremely memory-hungry, and turning on additional models will take additional memory and may cause a crash.\n  The rough order of speed/mem usage is (smallest/fastest to largest/slowest):\n  ViT-B/32\n  RN50\n  RN101\n  ViT-B/16\n  RN50x4\n  RN50x16\n  RN50x64\n  ViT-L/14\n  For RN50x64 & ViT-L/14 you may need to use fewer cuts, depending on your VRAM.\n"
  },
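The `cut_overview`/`cut_innercut` strings in `default.yml` are Python list expressions of length 1000 — one entry per diffusion timestep — which `runner.py` turns into lists with `eval()` and indexes as `[1000 - t]` (timesteps count down). Per the `cutn_batches` docstring above, total cuts per timestep is the scheduled count times `cutn_batches`. A sketch using the default schedules:

```python
# The schedule strings from default.yml; runner.py eval()s them the same way.
cut_overview = eval("[12]*400+[4]*600")  # 12 overview cuts for the first 400 steps
cut_innercut = eval("[4]*400+[12]*600")  # emphasis shifts to inner cuts later on
cutn_batches = 4

t = 950  # an early, noisy step (t counts down from ~1000)
scheduled = cut_overview[1000 - t] + cut_innercut[1000 - t]  # cuts per batch
total_cuts = scheduled * cutn_batches  # evaluated in 4 sequential batches
```

Because the batches run sequentially, memory usage stays at one batch's worth of cuts while gradient quality reflects the full total — at the cost of roughly `cutn_batches`× render time, exactly as the docstring describes.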
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/reverse_diffusion/runner.py",
    "content": "'''\nThis code is rewritten by Paddle based on Jina-ai/discoart.\nhttps://github.com/jina-ai/discoart/blob/main/discoart/runner.py\n'''\nimport gc\nimport os\nimport random\nfrom threading import Thread\n\nimport numpy as np\nimport paddle\nimport paddle.vision.transforms as T\nimport paddle_lpips as lpips\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.utils.utils import tokenize\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom ipywidgets import Output\nfrom PIL import Image\n\nfrom .helper import logger\nfrom .helper import parse_prompt\nfrom .model.losses import range_loss\nfrom .model.losses import spherical_dist_loss\nfrom .model.losses import tv_loss\nfrom .model.make_cutouts import MakeCutoutsDango\nfrom .model.sec_diff import alpha_sigma_to_t\nfrom .model.sec_diff import SecondaryDiffusionImageNet2\nfrom .model.transforms import Normalize\n\n\ndef do_run(args, models) -> 'DocumentArray':\n    logger.info('preparing models...')\n    model, diffusion, clip_models, secondary_model = models\n    normalize = Normalize(\n        mean=[0.485, 0.456, 0.406],\n        std=[0.229, 0.224, 0.225],\n    )\n    lpips_model = lpips.LPIPS(net='vgg')\n    for parameter in lpips_model.parameters():\n        parameter.stop_gradient = True\n    side_x = (args.width_height[0] // 64) * 64\n    side_y = (args.width_height[1] // 64) * 64\n    cut_overview = eval(args.cut_overview)\n    cut_innercut = eval(args.cut_innercut)\n    cut_icgray_p = eval(args.cut_icgray_p)\n\n    from .model.perlin_noises import create_perlin_noise, regen_perlin\n\n    seed = args.seed\n\n    skip_steps = args.skip_steps\n\n    loss_values = []\n\n    if seed is not None:\n        np.random.seed(seed)\n        random.seed(seed)\n        paddle.seed(seed)\n\n    model_stats = []\n    for clip_model in clip_models:\n        model_stat = {\n            'clip_model': None,\n            'target_embeds': [],\n            
'make_cutouts': None,\n            'weights': [],\n        }\n        model_stat['clip_model'] = clip_model\n\n        if isinstance(args.text_prompts, str):\n            args.text_prompts = [args.text_prompts]\n\n        for prompt in args.text_prompts:\n            txt, weight = parse_prompt(prompt)\n            # encode the parsed text, not the raw prompt, so a \":weight\" suffix\n            # split off by parse_prompt is not fed to the text encoder\n            txt = clip_model.encode_text(tokenize(txt))\n            if args.fuzzy_prompt:\n                for i in range(25):\n                    model_stat['target_embeds'].append((txt + paddle.randn(txt.shape) * args.rand_mag).clip(0, 1))\n                    model_stat['weights'].append(weight)\n            else:\n                model_stat['target_embeds'].append(txt)\n                model_stat['weights'].append(weight)\n\n        model_stat['target_embeds'] = paddle.concat(model_stat['target_embeds'])\n        model_stat['weights'] = paddle.to_tensor(model_stat['weights'])\n        if model_stat['weights'].sum().abs() < 1e-3:\n            raise RuntimeError('The weights must not sum to 0.')\n        model_stat['weights'] /= model_stat['weights'].sum().abs()\n        model_stats.append(model_stat)\n\n    init = None\n    if args.init_image:\n        d = Document(uri=args.init_image).load_uri_to_image_tensor(side_x, side_y)\n        init = T.to_tensor(d.tensor).unsqueeze(0) * 2 - 1\n\n    if args.perlin_init:\n        if args.perlin_mode == 'color':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, False, side_y, side_x)\n        elif args.perlin_mode == 'gray':\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, True, side_y, side_x)\n            init2 = create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        else:\n            init = create_perlin_noise([1.5**-i * 0.5 for i in range(12)], 1, 1, False, side_y, side_x)\n            init2 = 
create_perlin_noise([1.5**-i * 0.5 for i in range(8)], 4, 4, True, side_y, side_x)\n        init = (T.to_tensor(init).add(T.to_tensor(init2)).divide(paddle.to_tensor(2.0)).unsqueeze(0) * 2 - 1)\n        del init2\n\n    cur_t = None\n\n    def cond_fn(x, t, y=None):\n        x_is_NaN = False\n        n = x.shape[0]\n        if secondary_model:\n            alpha = paddle.to_tensor(diffusion.sqrt_alphas_cumprod[cur_t], dtype='float32')\n            sigma = paddle.to_tensor(diffusion.sqrt_one_minus_alphas_cumprod[cur_t], dtype='float32')\n            cosine_t = alpha_sigma_to_t(alpha, sigma)\n            x = paddle.to_tensor(x.detach(), dtype='float32')\n            x.stop_gradient = False\n            cosine_t = paddle.tile(paddle.to_tensor(cosine_t.detach().cpu().numpy()), [n])\n            cosine_t.stop_gradient = False\n            out = secondary_model(x, cosine_t).pred\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        else:\n            t = paddle.ones([n], dtype='int64') * cur_t\n            out = diffusion.p_mean_variance(model, x, t, clip_denoised=False, model_kwargs={'y': y})\n            fac = diffusion.sqrt_one_minus_alphas_cumprod[cur_t]\n            x_in_d = out['pred_xstart'] * fac + x * (1 - fac)\n            x_in = x_in_d.detach()\n            x_in.stop_gradient = False\n            x_in_grad = paddle.zeros_like(x_in, dtype='float32')\n        for model_stat in model_stats:\n            for i in range(args.cutn_batches):\n                t_int = (int(t.item()) + 1)  # errors on last step without +1, need to find source\n                # when using SLIP Base model the dimensions need to be hard coded to avoid AttributeError: 'VisionTransformer' object has no attribute 'input_resolution'\n                try:\n                    
input_resolution = model_stat['clip_model'].visual.input_resolution\n                except AttributeError:\n                    input_resolution = 224\n\n                cuts = MakeCutoutsDango(\n                    input_resolution,\n                    Overview=cut_overview[1000 - t_int],\n                    InnerCrop=cut_innercut[1000 - t_int],\n                    IC_Size_Pow=args.cut_ic_pow,\n                    IC_Grey_P=cut_icgray_p[1000 - t_int],\n                )\n                clip_in = normalize(cuts(x_in.add(paddle.to_tensor(1.0)).divide(paddle.to_tensor(2.0))))\n                image_embeds = (model_stat['clip_model'].encode_image(clip_in))\n\n                dists = spherical_dist_loss(\n                    image_embeds.unsqueeze(1),\n                    model_stat['target_embeds'].unsqueeze(0),\n                )\n\n                dists = dists.reshape([\n                    cut_overview[1000 - t_int] + cut_innercut[1000 - t_int],\n                    n,\n                    -1,\n                ])\n                losses = dists.multiply(model_stat['weights']).sum(2).mean(0)\n                loss_values.append(losses.sum().item())  # log loss, probably shouldn't do per cutn_batch\n\n                x_in_grad += ((paddle.grad(losses.sum() * args.clip_guidance_scale, x_in)[0]) / args.cutn_batches)\n        tv_losses = tv_loss(x_in)\n        range_losses = range_loss(x_in)\n        sat_losses = paddle.abs(x_in - x_in.clip(min=-1, max=1)).mean()\n        loss = (tv_losses.sum() * args.tv_scale + range_losses.sum() * args.range_scale +\n                sat_losses.sum() * args.sat_scale)\n        if init is not None and args.init_scale:\n            init_losses = lpips_model(x_in, init)\n            loss = loss + init_losses.sum() * args.init_scale\n        x_in_grad += paddle.grad(loss, x_in)[0]\n        if not paddle.isnan(x_in_grad).any():\n            grad = -paddle.grad(x_in_d, x, x_in_grad)[0]\n        else:\n            x_is_NaN = True\n            grad = 
paddle.zeros_like(x)\n        if args.clamp_grad and not x_is_NaN:\n            magnitude = grad.square().mean().sqrt()\n            return (grad * magnitude.clip(max=args.clamp_max) / magnitude)\n        return grad\n\n    if args.diffusion_sampling_mode == 'ddim':\n        sample_fn = diffusion.ddim_sample_loop_progressive\n    else:\n        sample_fn = diffusion.plms_sample_loop_progressive\n\n    logger.info('creating artwork...')\n\n    image_display = Output()\n    da_batches = DocumentArray()\n\n    for _nb in range(args.n_batches):\n        display.clear_output(wait=True)\n        display.display(args.name_docarray, image_display)\n        gc.collect()\n        paddle.device.cuda.empty_cache()\n\n        d = Document(tags=vars(args))\n        da_batches.append(d)\n\n        cur_t = diffusion.num_timesteps - skip_steps - 1\n\n        if args.perlin_init:\n            init = regen_perlin(args.perlin_mode, side_y, side_x, args.batch_size)\n\n        if args.diffusion_sampling_mode == 'ddim':\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                eta=args.eta,\n            )\n        else:\n            samples = sample_fn(\n                model,\n                (args.batch_size, 3, side_y, side_x),\n                clip_denoised=args.clip_denoised,\n                model_kwargs={},\n                cond_fn=cond_fn,\n                progress=True,\n                skip_timesteps=skip_steps,\n                init_image=init,\n                randomize_class=args.randomize_class,\n                order=2,\n            )\n\n        threads = []\n        for j, sample in enumerate(samples):\n            cur_t 
-= 1\n            with image_display:\n                if j % args.display_rate == 0 or cur_t == -1:\n                    for _, image in enumerate(sample['pred_xstart']):\n                        image = (image + 1) / 2\n                        image = image.clip(0, 1).squeeze().transpose([1, 2, 0]).numpy() * 255\n                        image = np.uint8(image)\n                        image = Image.fromarray(image)\n\n                        image.save(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb)))\n                        c = Document(tags={'cur_t': cur_t})\n                        c.load_pil_image_to_datauri(image)\n                        d.chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(display.Image(os.path.join(args.output_dir, 'progress-{}.png'.format(_nb))))\n                        d.chunks.plot_image_sprites(os.path.join(args.output_dir,\n                                                                 f'{args.name_docarray}-progress-{_nb}.png'),\n                                                    show_index=True)\n                        t = Thread(\n                            target=_silent_push,\n                            args=(\n                                da_batches,\n                                args.name_docarray,\n                            ),\n                        )\n                        threads.append(t)\n                        t.start()\n\n                    if cur_t == -1:\n                        d.load_pil_image_to_datauri(image)\n\n        for t in threads:\n            t.join()\n    display.clear_output(wait=True)\n    logger.info(f'done! 
{args.name_docarray}')\n    da_batches.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    return da_batches\n\n\ndef _silent_push(da_batches: DocumentArray, name: str) -> None:\n    try:\n        da_batches.push(name)\n    except Exception as ex:\n        logger.debug(f'push failed: {ex}')\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/__init__.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = '2.0.0'  # Maybe dev is better\n\nfrom . import transformers\nfrom . import utils\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/clip_vision_transformer.py",
    "content": "# copyright (c) 2021 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle.nn import Layer\nfrom paddle.nn import Linear\nfrom paddle.nn.initializer import Constant\nfrom paddle.nn.initializer import Normal\nfrom paddle.nn.initializer import TruncatedNormal\n\n# from .base_transformer import QuickGELU\n\n__all__ = [\n    \"VisionTransformer\", \"ViT_small_patch16_224\", \"ViT_base_patch16_224\", \"ViT_base_patch16_384\",\n    \"ViT_base_patch32_224\", \"ViT_base_patch32_384\", \"ViT_large_patch16_224\", \"ViT_large_patch16_384\",\n    \"ViT_large_patch32_384\", \"ViT_huge_patch16_224\", \"ViT_huge_patch32_384\", \"ViT_large_patch14_224\"\n]\n\ntrunc_normal_ = TruncatedNormal(std=.02)\nzeros_ = Constant(value=0.)\nones_ = Constant(value=1.)\n\n\nclass QuickGELU(Layer):\n    \"\"\" GELU \"\"\"\n\n    def forward(self, x):\n        return x * F.sigmoid(1.702 * x)\n\n\ndef to_2tuple(x):\n    return tuple([x] * 2)\n\n\ndef drop_path(x, drop_prob=0., training=False):\n    \"\"\"Drop paths (Stochastic Depth) per sample (when applied in main path of residual blocks).\n    the original name is misleading as 'Drop Connect' is a different form of dropout in a separate paper...\n    See discussion: https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 ...\n    \"\"\"\n    if drop_prob == 0. 
or not training:\n        return x\n    keep_prob = paddle.to_tensor(1 - drop_prob)\n    shape = (paddle.shape(x)[0], ) + (1, ) * (x.ndim - 1)\n    random_tensor = keep_prob + paddle.rand(shape, dtype=x.dtype)\n    random_tensor = paddle.floor(random_tensor)  # binarize\n    output = x.divide(keep_prob) * random_tensor\n    return output\n\n\nclass DropPath(nn.Layer):\n    \"\"\"Drop paths (Stochastic Depth) per sample  (when applied in main path of residual blocks).\n    \"\"\"\n\n    def __init__(self, drop_prob=None):\n        super(DropPath, self).__init__()\n        self.drop_prob = drop_prob\n\n    def forward(self, x):\n        return drop_path(x, self.drop_prob, self.training)\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super(Identity, self).__init__()\n\n    def forward(self, input):\n        return input\n\n\nclass Mlp(nn.Layer):\n\n    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):\n        super().__init__()\n        out_features = out_features or in_features\n        hidden_features = hidden_features or in_features\n        self.fc1 = nn.Linear(in_features, hidden_features)\n        self.act = act_layer()\n        self.fc2 = nn.Linear(hidden_features, out_features)\n        self.drop = nn.Dropout(drop)\n\n    def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.drop(x)\n        x = self.fc2(x)\n        x = self.drop(x)\n        return x\n\n\nclass Attention(nn.Layer):\n\n    def __init__(self, dim, num_heads=8, qkv_bias=False, qk_scale=None, attn_drop=0., proj_drop=0.):\n        super().__init__()\n        self.num_heads = num_heads\n        head_dim = dim // num_heads\n        self.scale = qk_scale or head_dim**-0.5\n\n        self.qkv = nn.Linear(dim, dim * 3, bias_attr=qkv_bias)\n        self.attn_drop = nn.Dropout(attn_drop)\n        self.proj = nn.Linear(dim, dim)\n        self.proj_drop = nn.Dropout(proj_drop)\n\n    def 
forward(self, x):\n        # B= paddle.shape(x)[0]\n        N, C = x.shape[1:]\n        qkv = self.qkv(x).reshape((-1, N, 3, self.num_heads, C // self.num_heads)).transpose((2, 0, 3, 1, 4))\n        q, k, v = qkv[0], qkv[1], qkv[2]\n\n        attn = (q.matmul(k.transpose((0, 1, 3, 2)))) * self.scale\n        attn = nn.functional.softmax(attn, axis=-1)\n        attn = self.attn_drop(attn)\n\n        x = (attn.matmul(v)).transpose((0, 2, 1, 3)).reshape((-1, N, C))\n        x = self.proj(x)\n        x = self.proj_drop(x)\n        return x\n\n\nclass Block(nn.Layer):\n\n    def __init__(self,\n                 dim,\n                 num_heads,\n                 mlp_ratio=4.,\n                 qkv_bias=False,\n                 qk_scale=None,\n                 drop=0.,\n                 attn_drop=0.,\n                 drop_path=0.,\n                 act_layer=QuickGELU,\n                 norm_layer='nn.LayerNorm',\n                 epsilon=1e-5):\n        super().__init__()\n        self.norm1 = eval(norm_layer)(dim, epsilon=epsilon)\n        self.attn = Attention(dim,\n                              num_heads=num_heads,\n                              qkv_bias=qkv_bias,\n                              qk_scale=qk_scale,\n                              attn_drop=attn_drop,\n                              proj_drop=drop)\n        # NOTE: drop path for stochastic depth, we shall see if this is better than dropout here\n        self.drop_path = DropPath(drop_path) if drop_path > 0. 
else Identity()\n        self.norm2 = eval(norm_layer)(dim, epsilon=epsilon)\n        mlp_hidden_dim = int(dim * mlp_ratio)\n        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)\n\n    def forward(self, x):\n        x = x + self.drop_path(self.attn(self.norm1(x)))\n        x = x + self.drop_path(self.mlp(self.norm2(x)))\n        return x\n\n\nclass PatchEmbed(nn.Layer):\n    \"\"\" Image to Patch Embedding\n    \"\"\"\n\n    def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):\n        super().__init__()\n        img_size = to_2tuple(img_size)\n        patch_size = to_2tuple(patch_size)\n        num_patches = (img_size[1] // patch_size[1]) * \\\n            (img_size[0] // patch_size[0])\n        self.img_size = img_size\n        self.patch_size = patch_size\n        self.num_patches = num_patches\n\n        self.proj = nn.Conv2D(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size, bias_attr=False)\n\n    def forward(self, x):\n        B, C, H, W = x.shape\n        assert H == self.img_size[0] and W == self.img_size[1], \\\n            f\"Input image size ({H}*{W}) doesn't match model ({self.img_size[0]}*{self.img_size[1]}).\"\n\n        x = self.proj(x).flatten(2).transpose((0, 2, 1))\n        return x\n\n\nclass VisionTransformer(nn.Layer):\n    \"\"\" Vision Transformer with support for patch input\n    \"\"\"\n\n    def __init__(self,\n                 img_size=224,\n                 patch_size=16,\n                 in_chans=3,\n                 class_dim=0,\n                 embed_dim=768,\n                 depth=12,\n                 num_heads=12,\n                 mlp_ratio=4,\n                 qkv_bias=False,\n                 qk_scale=None,\n                 drop_rate=0.,\n                 attn_drop_rate=0.,\n                 drop_path_rate=0.,\n                 norm_layer='nn.LayerNorm',\n                 epsilon=1e-5,\n                 **args):\n        
super().__init__()\n        self.class_dim = class_dim\n\n        self.num_features = self.embed_dim = embed_dim\n\n        self.patch_embed = PatchEmbed(img_size=img_size, patch_size=patch_size, in_chans=in_chans, embed_dim=embed_dim)\n        num_patches = self.patch_embed.num_patches\n\n        scale = embed_dim**-0.5\n        self.class_embedding = self.create_parameter(shape=(1, 1, embed_dim), default_initializer=Normal(std=scale))\n        self.positional_embedding = self.create_parameter(shape=(1, num_patches + 1, embed_dim),\n                                                          default_initializer=Normal(std=scale))\n        self.add_parameter(\"positional_embedding\", self.positional_embedding)\n        self.add_parameter(\"class_embedding\", self.class_embedding)\n        self.pos_drop = nn.Dropout(p=drop_rate)\n\n        dpr = np.linspace(0, drop_path_rate, depth)\n\n        self.norm_pre = eval(norm_layer)(embed_dim, epsilon=epsilon)\n\n        self.blocks = nn.LayerList([\n            Block(dim=embed_dim,\n                  num_heads=num_heads,\n                  mlp_ratio=mlp_ratio,\n                  qkv_bias=qkv_bias,\n                  qk_scale=qk_scale,\n                  drop=drop_rate,\n                  attn_drop=attn_drop_rate,\n                  drop_path=dpr[i],\n                  norm_layer=norm_layer,\n                  epsilon=epsilon) for i in range(depth)\n        ])\n\n        self.norm_post = eval(norm_layer)(embed_dim, epsilon=epsilon)\n\n        ## Classifier head\n        #self.head = nn.Linear(embed_dim,\n        #                      class_dim) if class_dim > 0 else Identity()\n\n        trunc_normal_(self.positional_embedding)\n        trunc_normal_(self.class_embedding)\n        self.apply(self._init_weights)\n\n    def _init_weights(self, m):\n        if isinstance(m, nn.Linear):\n            trunc_normal_(m.weight)\n            if isinstance(m, nn.Linear) and m.bias is not None:\n                zeros_(m.bias)\n        
elif isinstance(m, nn.LayerNorm):\n            zeros_(m.bias)\n            ones_(m.weight)\n\n    def forward_features(self, x):\n        # B = x.shape[0]\n        B = paddle.shape(x)[0]\n        x = self.patch_embed(x)\n        class_embedding = self.class_embedding.expand((B, -1, -1))\n        x = paddle.concat((class_embedding, x), axis=1)\n        x = x + self.positional_embedding\n        x = self.pos_drop(x)\n        x = self.norm_pre(x)\n        for blk in self.blocks:\n            x = blk(x)\n\n        #x = self.norm_post(x[:, 0, :])\n        x = self.norm_post(x)\n        # x = self.classfy(x)\n        return x\n\n    def forward(self, x):\n        x = self.forward_features(x)\n        return x\n\n\ndef ViT_small_patch16_224(**kwargs):\n    model = VisionTransformer(patch_size=16,\n                              embed_dim=768,\n                              depth=8,\n                              num_heads=8,\n                              mlp_ratio=3,\n                              qk_scale=768**-0.5,\n                              **kwargs)\n    return model\n\n\ndef ViT_base_patch16_224(**kwargs):\n    model = VisionTransformer(patch_size=16,\n                              embed_dim=768,\n                              depth=12,\n                              num_heads=12,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_base_patch16_384(**kwargs):\n    model = VisionTransformer(img_size=384,\n                              patch_size=16,\n                              embed_dim=768,\n                              depth=12,\n                              num_heads=12,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_base_patch32_384(**kwargs):\n  
  model = VisionTransformer(img_size=384,\n                              patch_size=32,\n                              embed_dim=768,\n                              depth=12,\n                              num_heads=12,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_base_patch32_224(**kwargs):\n    model = VisionTransformer(img_size=224,\n                              patch_size=32,\n                              embed_dim=768,\n                              depth=12,\n                              num_heads=12,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_large_patch16_224(**kwargs):\n    model = VisionTransformer(patch_size=16,\n                              embed_dim=1024,\n                              depth=24,\n                              num_heads=16,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_large_patch16_384(**kwargs):\n    model = VisionTransformer(img_size=384,\n                              patch_size=16,\n                              embed_dim=1024,\n                              depth=24,\n                              num_heads=16,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_large_patch14_224(**kwargs):\n    model = VisionTransformer(patch_size=14,\n                              embed_dim=1024,\n                              depth=24,\n                              num_heads=16,\n                              
mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_large_patch32_384(**kwargs):\n    model = VisionTransformer(img_size=384,\n                              patch_size=32,\n                              embed_dim=1024,\n                              depth=24,\n                              num_heads=16,\n                              mlp_ratio=4,\n                              qkv_bias=True,\n                              epsilon=1e-6,\n                              **kwargs)\n    return model\n\n\ndef ViT_huge_patch16_224(**kwargs):\n    model = VisionTransformer(patch_size=16, embed_dim=1280, depth=32, num_heads=16, mlp_ratio=4, **kwargs)\n    return model\n\n\ndef ViT_huge_patch32_384(**kwargs):\n    model = VisionTransformer(img_size=384,\n                              patch_size=32,\n                              embed_dim=1280,\n                              depth=32,\n                              num_heads=16,\n                              mlp_ratio=4,\n                              **kwargs)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/droppath.py",
    "content": "# Copyright (c) 2021 PPViT Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nDroppath, reimplement from https://github.com/yueatsprograms/Stochastic_Depth\n\"\"\"\nimport paddle\nimport paddle.nn as nn\n\n\nclass DropPath(nn.Layer):\n    \"\"\"DropPath class\"\"\"\n\n    def __init__(self, drop_prob=None):\n        super(DropPath, self).__init__()\n        self.drop_prob = drop_prob\n\n    def drop_path(self, inputs):\n        \"\"\"drop path op\n        Args:\n            input: tensor with arbitrary shape\n            drop_prob: float number of drop path probability, default: 0.0\n            training: bool, if current mode is training, default: False\n        Returns:\n            output: output tensor after drop path\n        \"\"\"\n        # if prob is 0 or eval mode, return original input\n        if self.drop_prob == 0. 
or not self.training:\n            return inputs\n        keep_prob = 1 - self.drop_prob\n        keep_prob = paddle.to_tensor(keep_prob, dtype='float32')\n        shape = (inputs.shape[0], ) + (1, ) * (inputs.ndim - 1)  # shape=(N, 1, 1, 1)\n        random_tensor = keep_prob + paddle.rand(shape, dtype=inputs.dtype)\n        random_tensor = random_tensor.floor()  # mask\n        output = inputs.divide(keep_prob) * random_tensor  #divide is to keep same output expectation\n        return output\n\n    def forward(self, inputs):\n        return self.drop_path(inputs)\n\n\n#def main():\n#    tmp = paddle.to_tensor(np.random.rand(8, 16, 8, 8), dtype='float32')\n#    dp = DropPath(0.5)\n#    out = dp(tmp)\n#    print(out)\n#\n#if __name__ == \"__main__\":\n#    main()\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/efficientnet.py",
    "content": "import collections\nimport copy\nimport math\nimport os\nimport re\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import ParamAttr\nfrom paddle.nn import AdaptiveAvgPool2D\nfrom paddle.nn import AvgPool2D\nfrom paddle.nn import BatchNorm\nfrom paddle.nn import Conv2D\nfrom paddle.nn import Dropout\nfrom paddle.nn import Linear\nfrom paddle.nn import MaxPool2D\n\nMODEL_URLS = {\n    \"EfficientNetB0_small\":\n    \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB0_small_pretrained.pdparams\",\n    \"EfficientNetB0\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB0_pretrained.pdparams\",\n    \"EfficientNetB1\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB1_pretrained.pdparams\",\n    \"EfficientNetB2\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB2_pretrained.pdparams\",\n    \"EfficientNetB3\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB3_pretrained.pdparams\",\n    \"EfficientNetB4\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB4_pretrained.pdparams\",\n    \"EfficientNetB5\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB5_pretrained.pdparams\",\n    \"EfficientNetB6\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB6_pretrained.pdparams\",\n    \"EfficientNetB7\": \"https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/EfficientNetB7_pretrained.pdparams\",\n}\n\n__all__ = list(MODEL_URLS.keys())\n\nGlobalParams = collections.namedtuple('GlobalParams', [\n    'batch_norm_momentum',\n    'batch_norm_epsilon',\n    'dropout_rate',\n    'num_classes',\n    'width_coefficient',\n    'depth_coefficient',\n    'depth_divisor',\n    'min_depth',\n    'drop_connect_rate',\n])\n\nBlockArgs = collections.namedtuple(\n    'BlockArgs',\n    ['kernel_size', 'num_repeat', 'input_filters', 
'output_filters', 'expand_ratio', 'id_skip', 'stride', 'se_ratio'])\n\nGlobalParams.__new__.__defaults__ = (None, ) * len(GlobalParams._fields)\nBlockArgs.__new__.__defaults__ = (None, ) * len(BlockArgs._fields)\n\n\ndef load_dygraph_pretrain(model, path=None):\n    if not (os.path.isdir(path) or os.path.exists(path + '.pdparams')):\n        raise ValueError(\"Model pretrain path {} does not \"\n                         \"exist.\".format(path))\n    param_state_dict = paddle.load(path + \".pdparams\")\n    model.set_dict(param_state_dict)\n    return\n\n\ndef efficientnet_params(model_name):\n    \"\"\" Map EfficientNet model name to parameter coefficients. \"\"\"\n    params_dict = {\n        # Coefficients:   width,depth,resolution,dropout\n        'efficientnet-b0': (1.0, 1.0, 224, 0.2),\n        'efficientnet-b1': (1.0, 1.1, 240, 0.2),\n        'efficientnet-b2': (1.1, 1.2, 260, 0.3),\n        'efficientnet-b3': (1.2, 1.4, 300, 0.3),\n        'efficientnet-b4': (1.4, 1.8, 380, 0.4),\n        'efficientnet-b5': (1.6, 2.2, 456, 0.4),\n        'efficientnet-b6': (1.8, 2.6, 528, 0.5),\n        'efficientnet-b7': (2.0, 3.1, 600, 0.5),\n    }\n    return params_dict[model_name]\n\n\ndef efficientnet(width_coefficient=None, depth_coefficient=None, dropout_rate=0.2, drop_connect_rate=0.2):\n    \"\"\" Get block arguments according to parameter and coefficients. 
\"\"\"\n    blocks_args = [\n        'r1_k3_s11_e1_i32_o16_se0.25',\n        'r2_k3_s22_e6_i16_o24_se0.25',\n        'r2_k5_s22_e6_i24_o40_se0.25',\n        'r3_k3_s22_e6_i40_o80_se0.25',\n        'r3_k5_s11_e6_i80_o112_se0.25',\n        'r4_k5_s22_e6_i112_o192_se0.25',\n        'r1_k3_s11_e6_i192_o320_se0.25',\n    ]\n    blocks_args = BlockDecoder.decode(blocks_args)\n\n    global_params = GlobalParams(batch_norm_momentum=0.99,\n                                 batch_norm_epsilon=1e-3,\n                                 dropout_rate=dropout_rate,\n                                 drop_connect_rate=drop_connect_rate,\n                                 num_classes=1000,\n                                 width_coefficient=width_coefficient,\n                                 depth_coefficient=depth_coefficient,\n                                 depth_divisor=8,\n                                 min_depth=None)\n    return blocks_args, global_params\n\n\ndef get_model_params(model_name, override_params):\n    \"\"\" Get the block args and global params for a given model \"\"\"\n    if model_name.startswith('efficientnet'):\n        w, d, _, p = efficientnet_params(model_name)\n        blocks_args, global_params = efficientnet(width_coefficient=w, depth_coefficient=d, dropout_rate=p)\n    else:\n        raise NotImplementedError('model name is not pre-defined: %s' % model_name)\n    if override_params:\n        global_params = global_params._replace(**override_params)\n    return blocks_args, global_params\n\n\ndef round_filters(filters, global_params):\n    \"\"\" Calculate and round number of filters based on depth multiplier. 
\"\"\"\n    multiplier = global_params.width_coefficient\n    if not multiplier:\n        return filters\n    divisor = global_params.depth_divisor\n    min_depth = global_params.min_depth\n    filters *= multiplier\n    min_depth = min_depth or divisor\n    new_filters = max(min_depth, int(filters + divisor / 2) // divisor * divisor)\n    if new_filters < 0.9 * filters:  # prevent rounding by more than 10%\n        new_filters += divisor\n    return int(new_filters)\n\n\ndef round_repeats(repeats, global_params):\n    \"\"\" Round number of filters based on depth multiplier. \"\"\"\n    multiplier = global_params.depth_coefficient\n    if not multiplier:\n        return repeats\n    return int(math.ceil(multiplier * repeats))\n\n\nclass BlockDecoder(object):\n    \"\"\"\n    Block Decoder, straight from the official TensorFlow repository.\n    \"\"\"\n\n    @staticmethod\n    def _decode_block_string(block_string):\n        \"\"\" Gets a block through a string notation of arguments. \"\"\"\n        assert isinstance(block_string, str)\n\n        ops = block_string.split('_')\n        options = {}\n        for op in ops:\n            splits = re.split(r'(\\d.*)', op)\n            if len(splits) >= 2:\n                key, value = splits[:2]\n                options[key] = value\n\n        # Check stride\n        cond_1 = ('s' in options and len(options['s']) == 1)\n        cond_2 = ((len(options['s']) == 2) and (options['s'][0] == options['s'][1]))\n        assert (cond_1 or cond_2)\n\n        return BlockArgs(kernel_size=int(options['k']),\n                         num_repeat=int(options['r']),\n                         input_filters=int(options['i']),\n                         output_filters=int(options['o']),\n                         expand_ratio=int(options['e']),\n                         id_skip=('noskip' not in block_string),\n                         se_ratio=float(options['se']) if 'se' in options else None,\n                         
stride=[int(options['s'][0])])\n\n    @staticmethod\n    def _encode_block_string(block):\n        \"\"\"Encodes a block to a string.\"\"\"\n        args = [\n            'r%d' % block.num_repeat,\n            'k%d' % block.kernel_size,\n            's%d%d' % (block.stride[0], block.stride[0]),  # stride is stored as a single-element list\n            'e%s' % block.expand_ratio,\n            'i%d' % block.input_filters,\n            'o%d' % block.output_filters\n        ]\n        if block.se_ratio is not None and 0 < block.se_ratio <= 1:\n            args.append('se%s' % block.se_ratio)\n        if block.id_skip is False:\n            args.append('noskip')\n        return '_'.join(args)\n\n    @staticmethod\n    def decode(string_list):\n        \"\"\"\n        Decode a list of string notations to specify blocks in the network.\n        string_list: list of strings, each string is a notation of block\n        return\n            list of BlockArgs namedtuples of block args\n        \"\"\"\n        assert isinstance(string_list, list)\n        blocks_args = []\n        for block_string in string_list:\n            blocks_args.append(BlockDecoder._decode_block_string(block_string))\n        return blocks_args\n\n    @staticmethod\n    def encode(blocks_args):\n        \"\"\"\n        Encodes a list of BlockArgs to a list of strings.\n        :param blocks_args: a list of BlockArgs namedtuples of block args\n        :return: a list of strings, each string is a notation of block\n        \"\"\"\n        block_strings = []\n        for block in blocks_args:\n            block_strings.append(BlockDecoder._encode_block_string(block))\n        return block_strings\n\n\ndef initial_type(name, use_bias=False):\n    param_attr = ParamAttr(name=name + \"_weights\")\n    if use_bias:\n        bias_attr = ParamAttr(name=name + \"_offset\")\n    else:\n        bias_attr = False\n    return param_attr, bias_attr\n\n\ndef init_batch_norm_layer(name=\"batch_norm\"):\n    param_attr = ParamAttr(name=name + \"_scale\")\n    bias_attr = ParamAttr(name=name + 
\"_offset\")\n    return param_attr, bias_attr\n\n\ndef init_fc_layer(name=\"fc\"):\n    param_attr = ParamAttr(name=name + \"_weights\")\n    bias_attr = ParamAttr(name=name + \"_offset\")\n    return param_attr, bias_attr\n\n\ndef cal_padding(img_size, stride, filter_size, dilation=1):\n    \"\"\"Calculate padding size.\"\"\"\n    if img_size % stride == 0:\n        out_size = max(filter_size - stride, 0)\n    else:\n        out_size = max(filter_size - (img_size % stride), 0)\n    return out_size // 2, out_size - out_size // 2\n\n\n# Per-stage input resolutions. Note: \"b5\" deviates from the reference EfficientNet-B5\n# shapes ([456, 228, 228, 114, 57, 29, 29, 15]) and uses a 256 input resolution here.\ninp_shape = {\n    \"b0_small\": [224, 112, 112, 56, 28, 14, 14, 7],\n    \"b0\": [224, 112, 112, 56, 28, 14, 14, 7],\n    \"b1\": [240, 120, 120, 60, 30, 15, 15, 8],\n    \"b2\": [260, 130, 130, 65, 33, 17, 17, 9],\n    \"b3\": [300, 150, 150, 75, 38, 19, 19, 10],\n    \"b4\": [380, 190, 190, 95, 48, 24, 24, 12],\n    \"b5\": [256, 128, 128, 64, 32, 16, 16, 8],\n    \"b6\": [528, 264, 264, 132, 66, 33, 33, 17],\n    \"b7\": [600, 300, 300, 150, 75, 38, 38, 19]\n}\n\n\ndef _drop_connect(inputs, prob, is_test):\n    if is_test:\n        output = inputs\n    else:\n        keep_prob = 1.0 - prob\n        inputs_shape = paddle.shape(inputs)\n        random_tensor = keep_prob + paddle.rand(shape=[inputs_shape[0], 1, 1, 1])\n        binary_tensor = paddle.floor(random_tensor)\n        output = paddle.multiply(inputs, binary_tensor) / keep_prob\n    return output\n\n\nclass Conv2ds(nn.Layer):\n\n    def 
__init__(self,\n                 input_channels,\n                 output_channels,\n                 filter_size,\n                 stride=1,\n                 padding=0,\n                 groups=None,\n                 name=\"conv2d\",\n                 act=None,\n                 use_bias=False,\n                 padding_type=None,\n                 model_name=None,\n                 cur_stage=None):\n        super(Conv2ds, self).__init__()\n        assert act in [None, \"swish\", \"sigmoid\"]\n        self.act = act\n\n        param_attr, bias_attr = initial_type(name=name, use_bias=use_bias)\n\n        self.padding_type = padding_type\n        self.stride = stride\n        self.filter_size = filter_size\n\n        def get_padding(filter_size, stride=1, dilation=1):\n            padding = ((stride - 1) + dilation * (filter_size - 1)) // 2\n            return padding\n\n        inps = 1 if model_name is None and cur_stage is None else inp_shape[model_name][cur_stage]\n        self.need_crop = False\n        if padding_type == \"SAME\":\n            top_padding, bottom_padding = cal_padding(inps, stride, filter_size)\n            left_padding, right_padding = cal_padding(inps, stride, filter_size)\n            height_padding = bottom_padding\n            width_padding = right_padding\n            if top_padding != bottom_padding or left_padding != right_padding:\n                height_padding = top_padding + stride\n                width_padding = left_padding + stride\n                self.need_crop = True\n            padding = [height_padding, width_padding]\n        elif padding_type == \"VALID\":\n            height_padding = 0\n            width_padding = 0\n            padding = [height_padding, width_padding]\n        elif padding_type == \"DYNAMIC\":\n            padding = get_padding(filter_size, stride)\n        else:\n            padding = padding_type\n\n        groups = 1 if groups is None else groups\n        
self._conv = Conv2D(\n            input_channels,\n            output_channels,\n            filter_size,\n            groups=groups,\n            stride=stride,\n            padding=padding,\n            weight_attr=param_attr,\n            bias_attr=bias_attr)\n\n    def forward(self, inputs):\n        x = self._conv(inputs)\n        if self.act == \"swish\":\n            x = F.swish(x)\n        elif self.act == \"sigmoid\":\n            x = F.sigmoid(x)\n\n        if self.need_crop:\n            x = x[:, :, 1:, 1:]\n        return x\n\n\nclass ConvBNLayer(nn.Layer):\n\n    def __init__(self,\n                 input_channels,\n                 filter_size,\n                 output_channels,\n                 stride=1,\n                 num_groups=1,\n                 padding_type=\"SAME\",\n                 conv_act=None,\n                 bn_act=\"swish\",\n                 use_bn=True,\n                 use_bias=False,\n                 name=None,\n                 conv_name=None,\n                 bn_name=None,\n                 model_name=None,\n                 cur_stage=None):\n        super(ConvBNLayer, self).__init__()\n        self._conv = Conv2ds(input_channels=input_channels,\n                             output_channels=output_channels,\n                             filter_size=filter_size,\n                             stride=stride,\n                             groups=num_groups,\n                             act=conv_act,\n                             padding_type=padding_type,\n                             name=conv_name,\n                             use_bias=use_bias,\n                             model_name=model_name,\n                             cur_stage=cur_stage)\n        self.use_bn = use_bn\n        if use_bn:\n            bn_name = name + bn_name\n            param_attr, bias_attr = 
init_batch_norm_layer(bn_name)\n            self._bn = BatchNorm(num_channels=output_channels,\n                                 act=bn_act,\n                                 momentum=0.99,\n                                 epsilon=0.001,\n                                 moving_mean_name=bn_name + \"_mean\",\n                                 moving_variance_name=bn_name + \"_variance\",\n                                 param_attr=param_attr,\n                                 bias_attr=bias_attr)\n\n    def forward(self, inputs):\n        if self.use_bn:\n            x = self._conv(inputs)\n            x = self._bn(x)\n            return x\n        else:\n            return self._conv(inputs)\n\n\nclass ExpandConvNorm(nn.Layer):\n\n    def __init__(self, input_channels, block_args, padding_type, name=None, model_name=None, cur_stage=None):\n        super(ExpandConvNorm, self).__init__()\n\n        self.oup = block_args.input_filters * block_args.expand_ratio\n        self.expand_ratio = block_args.expand_ratio\n\n        if self.expand_ratio != 1:\n            self._conv = ConvBNLayer(input_channels,\n                                     1,\n                                     self.oup,\n                                     bn_act=None,\n                                     padding_type=padding_type,\n                                     name=name,\n                                     conv_name=name + \"_expand_conv\",\n                                     bn_name=\"_bn0\",\n                                     model_name=model_name,\n                                     cur_stage=cur_stage)\n\n    def forward(self, inputs):\n        if self.expand_ratio != 1:\n            return self._conv(inputs)\n        else:\n            return inputs\n\n\nclass DepthwiseConvNorm(nn.Layer):\n\n    def __init__(self, input_channels, block_args, padding_type, name=None, model_name=None, 
cur_stage=None):\n        super(DepthwiseConvNorm, self).__init__()\n\n        self.k = block_args.kernel_size\n        self.s = block_args.stride\n        if isinstance(self.s, list) or isinstance(self.s, tuple):\n            self.s = self.s[0]\n        oup = block_args.input_filters * block_args.expand_ratio\n        self._conv = ConvBNLayer(input_channels,\n                                 self.k,\n                                 oup,\n                                 self.s,\n                                 num_groups=input_channels,\n                                 bn_act=None,\n                                 padding_type=padding_type,\n                                 name=name,\n                                 conv_name=name + \"_depthwise_conv\",\n                                 bn_name=\"_bn1\",\n                                 model_name=model_name,\n                                 cur_stage=cur_stage)\n\n    def forward(self, inputs):\n        return self._conv(inputs)\n\n\nclass ProjectConvNorm(nn.Layer):\n\n    def __init__(self, input_channels, block_args, padding_type, name=None, model_name=None, cur_stage=None):\n        super(ProjectConvNorm, self).__init__()\n\n        final_oup = block_args.output_filters\n\n        self._conv = ConvBNLayer(input_channels,\n                                 1,\n                                 final_oup,\n                                 bn_act=None,\n                                 padding_type=padding_type,\n                                 name=name,\n                                 conv_name=name + \"_project_conv\",\n                                 bn_name=\"_bn2\",\n                                 model_name=model_name,\n                                 cur_stage=cur_stage)\n\n    def forward(self, inputs):\n        return self._conv(inputs)\n\n\nclass SEBlock(nn.Layer):\n\n    def __init__(self,\n                 input_channels,\n                 num_squeezed_channels,\n                 oup,\n  
               padding_type,\n                 name=None,\n                 model_name=None,\n                 cur_stage=None):\n        super(SEBlock, self).__init__()\n\n        self._pool = AdaptiveAvgPool2D(1)\n        self._conv1 = Conv2ds(input_channels,\n                              num_squeezed_channels,\n                              1,\n                              use_bias=True,\n                              padding_type=padding_type,\n                              act=\"swish\",\n                              name=name + \"_se_reduce\")\n\n        self._conv2 = Conv2ds(\n            num_squeezed_channels,\n            oup,\n            1,\n            # act=\"sigmoid\",\n            act=None,\n            use_bias=True,\n            padding_type=padding_type,\n            name=name + \"_se_expand\")\n\n    def forward(self, inputs):\n        x = self._pool(inputs)\n        x = self._conv1(x)\n        x = self._conv2(x)\n        out = paddle.multiply(inputs, F.sigmoid(x))\n        return out\n\n\nclass MbConvBlock(nn.Layer):\n\n    def __init__(self,\n                 input_channels,\n                 block_args,\n                 padding_type,\n                 use_se,\n                 name=None,\n                 drop_connect_rate=None,\n                 model_name=None,\n                 cur_stage=None):\n        super(MbConvBlock, self).__init__()\n\n        oup = block_args.input_filters * block_args.expand_ratio\n        self.block_args = block_args\n        self.has_se = use_se and (block_args.se_ratio is not None) and (0 < block_args.se_ratio <= 1)\n        self.id_skip = block_args.id_skip\n        self.expand_ratio = block_args.expand_ratio\n        self.drop_connect_rate = drop_connect_rate\n\n        if self.expand_ratio != 1:\n            self._ecn = ExpandConvNorm(input_channels,\n                                       block_args,\n                                       padding_type=padding_type,\n                                       
name=name,\n                                       model_name=model_name,\n                                       cur_stage=cur_stage)\n\n        self._dcn = DepthwiseConvNorm(input_channels * block_args.expand_ratio,\n                                      block_args,\n                                      padding_type=padding_type,\n                                      name=name,\n                                      model_name=model_name,\n                                      cur_stage=cur_stage)\n\n        if self.has_se:\n            num_squeezed_channels = max(1, int(block_args.input_filters * block_args.se_ratio))\n            self._se = SEBlock(input_channels * block_args.expand_ratio,\n                               num_squeezed_channels,\n                               oup,\n                               padding_type=padding_type,\n                               name=name,\n                               model_name=model_name,\n                               cur_stage=cur_stage)\n\n        self._pcn = ProjectConvNorm(input_channels * block_args.expand_ratio,\n                                    block_args,\n                                    padding_type=padding_type,\n                                    name=name,\n                                    model_name=model_name,\n                                    cur_stage=cur_stage)\n\n    def forward(self, inputs):\n        x = inputs\n        if self.expand_ratio != 1:\n            x = self._ecn(x)\n            x = F.swish(x)\n        x = self._dcn(x)\n        x = F.swish(x)\n        if self.has_se:\n            x = self._se(x)\n        x = self._pcn(x)\n        if self.id_skip and \\\n                self.block_args.stride == 1 and \\\n                self.block_args.input_filters == self.block_args.output_filters:\n            if self.drop_connect_rate:\n                x = _drop_connect(x, self.drop_connect_rate, not self.training)\n            x = paddle.add(x, inputs)\n        return x\n\n\nclass 
ConvStemNorm(nn.Layer):\n\n    def __init__(self, input_channels, padding_type, _global_params, name=None, model_name=None, cur_stage=None):\n        super(ConvStemNorm, self).__init__()\n\n        output_channels = round_filters(32, _global_params)\n\n        self._conv = ConvBNLayer(input_channels,\n                                 filter_size=3,\n                                 output_channels=output_channels,\n                                 stride=2,\n                                 bn_act=None,\n                                 padding_type=padding_type,\n                                 name=\"\",\n                                 conv_name=\"_conv_stem\",\n                                 bn_name=\"_bn0\",\n                                 model_name=model_name,\n                                 cur_stage=cur_stage)\n\n    def forward(self, inputs):\n        return self._conv(inputs)\n\n\nclass ExtractFeatures(nn.Layer):\n\n    def __init__(self, input_channels, _block_args, _global_params, padding_type, use_se, model_name=None):\n        super(ExtractFeatures, self).__init__()\n\n        self._global_params = _global_params\n\n        self._conv_stem = ConvStemNorm(input_channels,\n                                       padding_type=padding_type,\n                                       _global_params=_global_params,\n                                       model_name=model_name,\n                                       cur_stage=0)\n\n        self.block_args_copy = copy.deepcopy(_block_args)\n        idx = 0\n        block_size = 0\n        for block_arg in self.block_args_copy:\n            block_arg = block_arg._replace(input_filters=round_filters(block_arg.input_filters, _global_params),\n                                           output_filters=round_filters(block_arg.output_filters, _global_params),\n                                           num_repeat=round_repeats(block_arg.num_repeat, _global_params))\n            block_size += 1\n            for 
_ in range(block_arg.num_repeat - 1):\n                block_size += 1\n\n        self.conv_seq = []\n        cur_stage = 1\n        for block_args in _block_args:\n            block_args = block_args._replace(input_filters=round_filters(block_args.input_filters, _global_params),\n                                             output_filters=round_filters(block_args.output_filters, _global_params),\n                                             num_repeat=round_repeats(block_args.num_repeat, _global_params))\n            drop_connect_rate = self._global_params.drop_connect_rate\n            if drop_connect_rate:\n                drop_connect_rate *= float(idx) / block_size\n            _mc_block = self.add_sublayer(\n                \"_blocks.\" + str(idx) + \".\",\n                MbConvBlock(block_args.input_filters,\n                            block_args=block_args,\n                            padding_type=padding_type,\n                            use_se=use_se,\n                            name=\"_blocks.\" + str(idx) + \".\",\n                            drop_connect_rate=drop_connect_rate,\n                            model_name=model_name,\n                            cur_stage=cur_stage))\n            self.conv_seq.append(_mc_block)\n            idx += 1\n            if block_args.num_repeat > 1:\n                block_args = block_args._replace(input_filters=block_args.output_filters, stride=1)\n            for _ in range(block_args.num_repeat - 1):\n                drop_connect_rate = self._global_params.drop_connect_rate\n                if drop_connect_rate:\n                    drop_connect_rate *= float(idx) / block_size\n                _mc_block = self.add_sublayer(\n                    \"_blocks.\" + str(idx) + \".\",\n                    MbConvBlock(block_args.input_filters,\n                                block_args,\n                                padding_type=padding_type,\n                                use_se=use_se,\n                        
        name=\"_blocks.\" + str(idx) + \".\",\n                                drop_connect_rate=drop_connect_rate,\n                                model_name=model_name,\n                                cur_stage=cur_stage))\n                self.conv_seq.append(_mc_block)\n                idx += 1\n            cur_stage += 1\n\n    def forward(self, inputs):\n        x = self._conv_stem(inputs)\n        x = F.swish(x)\n        for _mc_block in self.conv_seq:\n            x = _mc_block(x)\n        return x\n\n\nclass EfficientNet(nn.Layer):\n\n    def __init__(self, name=\"b0\", padding_type=\"SAME\", override_params=None, use_se=True, class_num=768):\n        super(EfficientNet, self).__init__()\n\n        model_name = 'efficientnet-' + name\n        self.name = name\n        self._block_args, self._global_params = get_model_params(model_name, override_params)\n        self.padding_type = padding_type\n        self.use_se = use_se\n\n        self._ef = ExtractFeatures(3,\n                                   self._block_args,\n                                   self._global_params,\n                                   self.padding_type,\n                                   self.use_se,\n                                   model_name=self.name)\n\n        output_channels = round_filters(1280, self._global_params)\n\n        if name in (\"b0_small\", \"b0\", \"b1\"):\n            oup = 320\n        elif name == \"b2\":\n            oup = 352\n        elif name == \"b3\":\n            oup = 384\n        elif name == \"b4\":\n            oup = 448\n        elif name == \"b5\":\n            oup = 512\n        elif name == \"b6\":\n            oup = 576\n        elif name == \"b7\":\n            oup = 640\n        self._conv = ConvBNLayer(oup,\n                                 1,\n                                 output_channels,\n              
                   bn_act=\"swish\",\n                                 padding_type=self.padding_type,\n                                 name=\"\",\n                                 conv_name=\"_conv_head\",\n                                 bn_name=\"_bn1\",\n                                 model_name=self.name,\n                                 cur_stage=7)\n        self._pool = AdaptiveAvgPool2D(1)\n\n        if self._global_params.dropout_rate:\n            self._drop = Dropout(p=self._global_params.dropout_rate, mode=\"upscale_in_train\")\n\n        self._fc = Linear(output_channels,\n                          class_num,\n                          weight_attr=paddle.ParamAttr(name='image_trans_w'),\n                          bias_attr=paddle.ParamAttr(name='image_trans_b'))\n\n    def forward(self, inputs):\n        x = self._ef(inputs)\n        x = self._conv(x)\n        x = self._pool(x)\n        if self._global_params.dropout_rate:\n            x = self._drop(x)\n        x = paddle.squeeze(x, axis=[2, 3])\n        x = self._fc(x)\n        x = F.tanh(x)\n        return x\n\n\ndef _load_pretrained(pretrained, model, model_url, use_ssld=False):\n    if pretrained is False:\n        pass\n    elif isinstance(pretrained, str):\n        load_dygraph_pretrain(model, pretrained)\n    else:\n        raise RuntimeError(\"pretrained type is not available. 
Please use `string` type.\")\n\n\ndef EfficientNetB0_small(padding_type='DYNAMIC',\n                         override_params=None,\n                         use_se=False,\n                         pretrained=False,\n                         use_ssld=False,\n                         **kwargs):\n    model = EfficientNet(name='b0', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB0_small\"])\n    return model\n\n\ndef EfficientNetB0(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b0', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB0\"])\n    return model\n\n\ndef EfficientNetB1(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b1', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB1\"])\n    return model\n\n\ndef EfficientNetB2(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b2', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB2\"])\n    return model\n\n\ndef EfficientNetB3(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b3', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB3\"])\n    return model\n\n\ndef EfficientNetB4(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    
model = EfficientNet(name='b4', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB4\"])\n    return model\n\n\ndef EfficientNetB5(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b5', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB5\"])\n    return model\n\n\ndef EfficientNetB6(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b6', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB6\"])\n    return model\n\n\ndef EfficientNetB7(padding_type='SAME', override_params=None, use_se=True, pretrained=False, use_ssld=False, **kwargs):\n    model = EfficientNet(name='b7', padding_type=padding_type, override_params=override_params, use_se=use_se, **kwargs)\n    _load_pretrained(pretrained, model, MODEL_URLS[\"EfficientNetB7\"])\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/ernie2.py",
    "content": "# -*- coding: utf-8 -*-\n\"\"\"\nERNIE network architecture\n\"\"\"\nimport logging\nimport re\nimport time\n\nimport paddle\nfrom paddle import nn\nfrom paddle.nn import functional as F\n\nACT_DICT = {\n    'relu': nn.ReLU,\n    'gelu': nn.GELU,\n}\n\n\nclass ErnieModel(nn.Layer):\n    \"\"\" ernie model \"\"\"\n\n    def __init__(self, cfg, name=''):\n        \"\"\"\n        Fundamental pretrained Ernie model\n        \"\"\"\n        nn.Layer.__init__(self)\n        self.cfg = cfg\n        d_model = cfg['hidden_size']\n        d_emb = cfg.get('emb_size', cfg['hidden_size'])\n        d_vocab = cfg['vocab_size']\n        d_pos = cfg['max_position_embeddings']\n        if cfg.get('sent_type_vocab_size'):\n            d_sent = cfg['sent_type_vocab_size']\n        else:\n            d_sent = cfg.get('type_vocab_size', 2)\n        self.n_head = cfg['num_attention_heads']\n        self.return_additional_info = cfg.get('return_additional_info', False)\n        self.initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n\n        self.ln = _build_ln(d_model, name=append_name(name, 'pre_encoder'))\n        self.word_emb = nn.Embedding(d_vocab,\n                                     d_emb,\n                                     weight_attr=paddle.ParamAttr(name=append_name(name, 'word_embedding'),\n                                                                  initializer=self.initializer))\n        self.pos_emb = nn.Embedding(d_pos,\n                                    d_emb,\n                                    weight_attr=paddle.ParamAttr(name=append_name(name, 'pos_embedding'),\n                                                                 initializer=self.initializer))\n        # self.sent_emb = nn.Embedding(\n        #    d_sent,\n        #    d_emb,\n        #    weight_attr=paddle.ParamAttr(name=append_name(name, 'sent_embedding'), 
initializer=self.initializer))\n        self._use_sent_id = cfg.get('use_sent_id', True)\n        self._use_sent_id = False\n        if self._use_sent_id:\n            self.sent_emb = nn.Embedding(d_sent,\n                                         d_emb,\n                                         weight_attr=paddle.ParamAttr(name=append_name(name, 'sent_embedding'),\n                                                                      initializer=self.initializer))\n        self._use_task_id = cfg.get('use_task_id', False)\n        self._use_task_id = False\n        if self._use_task_id:\n            self._task_types = cfg.get('task_type_vocab_size', 3)\n            logging.info('using task_id, #task_types:{}'.format(self._task_types))\n            self.task_emb = nn.Embedding(self._task_types,\n                                         d_emb,\n                                         weight_attr=paddle.ParamAttr(name=append_name(name, 'task_embedding'),\n                                                                      initializer=self.initializer))\n\n        prob = cfg['hidden_dropout_prob']\n        self.dropout = nn.Dropout(p=prob)\n\n        self.encoder_stack = ErnieEncoderStack(cfg, append_name(name, 'encoder'))\n\n        if cfg.get('has_pooler', True):\n            self.pooler = _build_linear(cfg['hidden_size'], cfg['hidden_size'], append_name(name, 'pooled_fc'),\n                                        self.initializer)\n        else:\n            self.pooler = None\n\n        self.key_tag = None\n        self._checkpoints = []\n        self.train()\n\n    def get_checkpoints(self):\n        \"\"\"return checkpoints for recomputing\"\"\"\n        # recompute checkpoints\n        return self._checkpoints\n\n    # FIXME:remove this\n    def eval(self):\n        \"\"\" eval \"\"\"\n        if paddle.in_dynamic_mode():\n            super(ErnieModel, self).eval()\n        self.training = False\n        for l in self.sublayers():\n            l.training = 
False\n        return self\n\n    def train(self):\n        \"\"\" train \"\"\"\n        if paddle.in_dynamic_mode():\n            super(ErnieModel, self).train()\n        self.training = True\n        for l in self.sublayers():\n            l.training = True\n        return self\n\n    def forward(self,\n                src_ids,\n                sent_ids=None,\n                pos_ids=None,\n                input_mask=None,\n                task_ids=None,\n                attn_bias=None,\n                past_cache=None,\n                use_causal_mask=False):\n        \"\"\"\n        Args:\n            src_ids (`Variable` of shape `[batch_size, seq_len]`):\n                Indices of input sequence tokens in the vocabulary.\n            sent_ids (optional, `Variable` of shape `[batch_size, seq_len]`):\n                aka token_type_ids; segment token indices to indicate first and second portions of the inputs.\n                If None, all tokens are assumed to come from `segment_a`.\n            pos_ids (optional, `Variable` of shape `[batch_size, seq_len]`):\n                Indices of positions of each input sequence token in the position embeddings.\n            input_mask (optional, `Variable` of shape `[batch_size, seq_len]`):\n                Mask to avoid performing attention on the padding token indices of the encoder input.\n            task_ids (optional, `Variable` of shape `[batch_size, seq_len]`):\n                task type indices used by the pretraining tasks\n            attn_bias (optional, `Variable` of shape `[batch_size, seq_len, seq_len]` or False):\n                3D version of `input_mask`; if set, overrides `input_mask`; if set to False, no attention mask is applied.\n            past_cache (optional, tuple of two lists: cached key and cached value,\n                each a list of `Variable`s of shape `[batch_size, seq_len, hidden_size]`):\n                cached key/value tensors that will be concatenated to the generated key/value when performing self attention.\n    
            if set, `attn_bias` should not be None.\n\n        Returns:\n            pooled (`Variable` of shape `[batch_size, hidden_size]`):\n                output logits of pooler classifier\n            encoded(`Variable` of shape `[batch_size, seq_len, hidden_size]`):\n                output logits of transformer stack\n            info (Dictionary):\n                additional middle-level info, includes: all hidden states, k/v caches.\n        \"\"\"\n        assert len(src_ids.shape) == 2, 'expect src_ids.shape = [batch, sequence], got %s' % (repr(src_ids.shape))\n        assert attn_bias is not None if past_cache else True, 'if `past_cache` is specified, attn_bias must not be None'\n        d_seqlen = paddle.shape(src_ids)[1]\n        if pos_ids is None:\n            pos_ids = paddle.arange(0, d_seqlen, 1, dtype='int32').reshape([1, -1]).cast('int64')\n\n        if attn_bias is None:\n            if input_mask is None:\n                input_mask = paddle.cast(src_ids != 0, 'float32')\n            assert len(input_mask.shape) == 2\n            input_mask = input_mask.unsqueeze(-1)\n            # outer product of the padding mask yields a [batch, seq, seq] pairwise visibility matrix\n            attn_bias = input_mask.matmul(input_mask, transpose_y=True)\n            if use_causal_mask:\n                sequence = paddle.reshape(paddle.arange(0, d_seqlen, 1, dtype='float32') + 1., [1, 1, -1, 1])\n                # lower-triangular mask: position i may only attend to positions j <= i\n                causal_mask = (sequence.matmul(1. / sequence, transpose_y=True) >= 1.).cast('float32')\n                attn_bias *= causal_mask\n        else:\n            assert len(attn_bias.shape) == 3, 'expect attn_bias to be rank 3, got %r' % attn_bias.shape\n\n        attn_bias = (1. 
- attn_bias) * -10000.0\n        attn_bias = attn_bias.unsqueeze(1).tile([1, self.n_head, 1, 1])  # avoid broadcast =_=\n\n        if sent_ids is None:\n            sent_ids = paddle.zeros_like(src_ids)\n\n        src_embedded = self.word_emb(src_ids)\n        pos_embedded = self.pos_emb(pos_ids)\n        #         sent_embedded = self.sent_emb(sent_ids)\n        #         embedded = src_embedded + pos_embedded + sent_embedded\n        embedded = src_embedded + pos_embedded\n        if self._use_sent_id:\n            sent_embedded = self.sent_emb(sent_ids)\n            embedded = embedded + sent_embedded\n        if self._use_task_id:\n            task_embeded = self.task_emb(task_ids)\n            embedded = embedded + task_embeded\n\n        self._checkpoints.append(embedded.name)\n        embedded = self.dropout(self.ln(embedded))\n\n        (encoded, hidden_list, cache_list, checkpoint_name) = self.encoder_stack(embedded, attn_bias,\n                                                                               past_cache=past_cache, \\\n                                                                               key_tag=self.key_tag)\n\n        self._checkpoints.extend(checkpoint_name)\n        if self.pooler is not None:\n            pooled = F.tanh(self.pooler(encoded[:, 0, :]))\n        else:\n            pooled = None\n\n        additional_info = {\n            'hiddens': hidden_list,\n            'caches': cache_list,\n        }\n\n        if self.return_additional_info:\n            return pooled, encoded, additional_info\n        return pooled, encoded\n\n\nclass ErnieEncoderStack(nn.Layer):\n    \"\"\" ernie encoder stack \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieEncoderStack, self).__init__()\n        n_layers = cfg['num_hidden_layers']\n        self.block = nn.LayerList([ErnieBlock(cfg, append_name(name, 'layer_%d' % i)) for i in range(n_layers)])\n\n    def forward(self, inputs, attn_bias=None, past_cache=None, 
key_tag=None):\n        \"\"\" forward function \"\"\"\n        if past_cache is not None:\n            assert isinstance(\n                past_cache,\n                tuple), 'unknown type of `past_cache`, expect tuple or list. got %s' % repr(type(past_cache))\n            past_cache = list(zip(*past_cache))\n        else:\n            past_cache = [None] * len(self.block)\n        cache_list_k, cache_list_v, hidden_list = [], [], [inputs]\n        checkpoint_name = []\n\n        for b, p in zip(self.block, past_cache):\n            inputs, cache = b(inputs, attn_bias=attn_bias, past_cache=p, key_tag=key_tag)\n            cache_k, cache_v = cache\n            cache_list_k.append(cache_k)\n            cache_list_v.append(cache_v)\n            hidden_list.append(inputs)\n            checkpoint_name.append(inputs.name)\n\n        return [inputs, hidden_list, (cache_list_k, cache_list_v), checkpoint_name]\n\n\nclass ErnieBlock(nn.Layer):\n    \"\"\" ernie block class \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieBlock, self).__init__()\n        d_model = cfg['hidden_size']\n        self.attn = AttentionLayer(cfg, name=append_name(name, 'multi_head_att'))\n        self.ln1 = _build_ln(d_model, name=append_name(name, 'post_att'))\n        self.ffn = PositionWiseFeedForwardLayer(cfg, name=append_name(name, 'ffn'))\n        self.ln2 = _build_ln(d_model, name=append_name(name, 'post_ffn'))\n        prob = cfg.get('intermediate_dropout_prob', cfg['hidden_dropout_prob'])\n        self.dropout = nn.Dropout(p=prob)\n\n    def forward(self, inputs, attn_bias=None, past_cache=None, key_tag=None):\n        \"\"\" forward \"\"\"\n        attn_out, cache = self.attn(inputs, inputs, inputs, attn_bias, past_cache=past_cache,\n                                    key_tag=key_tag)  # self attention\n        attn_out = self.dropout(attn_out)\n        hidden = attn_out + inputs\n        hidden = self.ln1(hidden)  # dropout/ add/ norm\n\n        ffn_out = 
self.ffn(hidden)\n        ffn_out = self.dropout(ffn_out)\n        hidden = ffn_out + hidden\n        hidden = self.ln2(hidden)\n        return hidden, cache\n\n\nclass AttentionLayer(nn.Layer):\n    \"\"\" attention layer \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(AttentionLayer, self).__init__()\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        n_head = cfg['num_attention_heads']\n        # assert d_model % n_head == 0\n        d_model_q = cfg.get('query_hidden_size_per_head', d_model // n_head) * n_head\n        d_model_v = cfg.get('value_hidden_size_per_head', d_model // n_head) * n_head\n\n        self.n_head = n_head\n        self.d_key = d_model_q // n_head\n\n        self.q = _build_linear(d_model, d_model_q, append_name(name, 'query_fc'), initializer)\n        self.k = _build_linear(d_model, d_model_q, append_name(name, 'key_fc'), initializer)\n        self.v = _build_linear(d_model, d_model_v, append_name(name, 'value_fc'), initializer)\n        self.o = _build_linear(d_model_v, d_model, append_name(name, 'output_fc'), initializer)\n        self.layer_num = int(re.findall(r\"\\d+\", name)[0])\n        # self.dropout = nn.Dropout(p=cfg['attention_probs_dropout_prob'])\n        self.dropout_prob = cfg['attention_probs_dropout_prob']\n        self.dropout = nn.Dropout(p=self.dropout_prob)\n\n    def forward(self, queries, keys, values, attn_bias, past_cache, key_tag=None):\n        \"\"\" layer forward function \"\"\"\n        assert len(queries.shape) == len(keys.shape) == len(values.shape) == 3\n        # bsz, q_len, q_dim = queries.shape\n        # bsz, k_len, k_dim = keys.shape\n        # bsz, v_len, v_dim = values.shape\n        # assert k_len == v_len\n\n        q = self.q(queries)\n        k = self.k(keys)\n        v = self.v(values)\n\n        cache = (k, v)\n        if past_cache is not None:\n            cached_k, cached_v = past_cache\n           
 k = paddle.concat([cached_k, k], 1)\n            v = paddle.concat([cached_v, v], 1)\n\n        # [batch, head, seq, dim]\n        q = q.reshape([0, 0, self.n_head, q.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])\n        # [batch, head, seq, dim]\n        k = k.reshape([0, 0, self.n_head, k.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])\n        # [batch, head, seq, dim]\n        v = v.reshape([0, 0, self.n_head, v.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])\n        q = q.scale(self.d_key**-0.5)\n\n        score = q.matmul(k, transpose_y=True)\n\n        if attn_bias is not None:\n            score += attn_bias\n        score = F.softmax(score)\n        score = self.dropout(score)\n        out = score.matmul(v)\n\n        out = out.transpose([0, 2, 1, 3])\n        out = out.reshape([0, 0, out.shape[2] * out.shape[3]])\n        out = self.o(out)\n\n        return out, cache\n\n\nclass PositionWiseFeedForwardLayer(nn.Layer):\n    \"\"\" post wise feed forward layer \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(PositionWiseFeedForwardLayer, self).__init__()\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_ffn = cfg.get('intermediate_size', 4 * d_model)\n\n        self.act = ACT_DICT[cfg['hidden_act']]()\n        self.i = _build_linear(d_model, d_ffn, append_name(name, 'fc_0'), initializer)\n        self.o = _build_linear(d_ffn, d_model, append_name(name, 'fc_1'), initializer)\n        prob = cfg.get('intermediate_dropout_prob', 0.)\n        self.dropout = nn.Dropout(p=prob)\n\n    def forward(self, inputs):\n        \"\"\" forward \"\"\"\n        hidden = self.act(self.i(inputs))\n        hidden = self.dropout(hidden)\n        out = self.o(hidden)\n        return out\n\n\ndef _build_linear(n_in, n_out, name, init):\n    \"\"\"\n    \"\"\"\n    return nn.Linear(n_in,\n                     n_out,\n                     
weight_attr=paddle.ParamAttr(name='%s.w_0' % name if name is not None else None, initializer=init),\n                     bias_attr='%s.b_0' % name if name is not None else None)\n\n\ndef _build_ln(n_in, name):\n    \"\"\"\n    \"\"\"\n    return nn.LayerNorm(normalized_shape=n_in,\n                        weight_attr=paddle.ParamAttr(name='%s_layer_norm_scale' % name if name is not None else None,\n                                                     initializer=nn.initializer.Constant(1.)),\n                        bias_attr=paddle.ParamAttr(name='%s_layer_norm_bias' % name if name is not None else None,\n                                                   initializer=nn.initializer.Constant(0.)))\n\n\ndef append_name(name, postfix):\n    \"\"\" append name with postfix \"\"\"\n    if name is None:\n        ret = None\n    elif name == '':\n        ret = postfix\n    else:\n        ret = '%s_%s' % (name, postfix)\n    return ret\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/ernie_modeling.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport json\nimport logging\nimport math\n\nimport six\nif six.PY2:\n    from pathlib2 import Path\nelse:\n    from pathlib import Path\nimport numpy as np\nimport paddle as P\nfrom paddle import nn\nfrom paddle.nn import functional as F\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.file_utils import _fetch_from_remote, add_docstring\n\nlog = logging.getLogger(__name__)\n\nACT_DICT = {\n    'relu': nn.ReLU,\n    'gelu': nn.GELU,\n}\n\n\ndef _get_rel_pos_bias(seq_len, max_len=128, num_buckets=32, bidirectional=True, reset=True):\n    #max_len = 520\n    pos = np.array(range(seq_len))\n    rel_pos = pos[:, None] - pos[None, :]\n    ret = 0\n    n = -rel_pos\n    if bidirectional:\n        num_buckets //= 2\n        ret += (n < 0).astype('int32') * num_buckets  # mtf.to_int32(mtf.less(n, 0)) * num_buckets\n        n = np.abs(n)\n    else:\n        n = np.max(n, np.zeros_like(n))\n    # now n is in the range [0, inf)\n\n    # half of the buckets are for exact increments in positions\n    max_exact = num_buckets // 2\n    is_small = n < max_exact\n    # The other half of the buckets are for logarithmically bigger bins in positions up to max_distance\n    val_if_large = max_exact + (np.log(n.astype('float32') / max_exact) / math.log(max_len / max_exact) *\n                                (num_buckets - max_exact)).astype('int32')\n    tmp = np.full_like(val_if_large, num_buckets - 1)\n    val_if_large = np.where(val_if_large < tmp, val_if_large, tmp)\n\n    ret += np.where(is_small, n, val_if_large)\n    if reset:\n        num_buckets *= 2\n        ret[:, 0] = num_buckets\n        ret[0, :] = num_buckets // 2\n\n    return np.array(ret).reshape([seq_len, seq_len]).astype(\"int64\")\n\n\ndef _build_linear(n_in, n_out, name, init):\n    return nn.Linear(\n        n_in,\n        
n_out,\n        weight_attr=P.ParamAttr(name='%s.w_0' % name if name is not None else None, initializer=init),\n        bias_attr='%s.b_0' % name if name is not None else None,\n    )\n\n\ndef _build_ln(n_in, name):\n    return nn.LayerNorm(\n        normalized_shape=n_in,\n        weight_attr=P.ParamAttr(name='%s_layer_norm_scale' % name if name is not None else None,\n                                initializer=nn.initializer.Constant(1.)),\n        bias_attr=P.ParamAttr(name='%s_layer_norm_bias' % name if name is not None else None,\n                              initializer=nn.initializer.Constant(0.)),\n    )\n\n\ndef append_name(name, postfix):\n    if name is None:\n        ret = None\n    elif name == '':\n        ret = postfix\n    else:\n        ret = '%s_%s' % (name, postfix)\n    return ret\n\n\nclass AttentionLayer(nn.Layer):\n\n    def __init__(self, cfg, name=None):\n        super(AttentionLayer, self).__init__()\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        n_head = cfg['num_attention_heads']\n        assert d_model % n_head == 0\n        d_model_q = cfg.get('query_hidden_size_per_head', d_model // n_head) * n_head\n        d_model_v = cfg.get('value_hidden_size_per_head', d_model // n_head) * n_head\n        self.n_head = n_head\n        self.d_key = d_model_q // n_head\n        self.q = _build_linear(d_model, d_model_q, append_name(name, 'query_fc'), initializer)\n        self.k = _build_linear(d_model, d_model_q, append_name(name, 'key_fc'), initializer)\n        self.v = _build_linear(d_model, d_model_v, append_name(name, 'value_fc'), initializer)\n        self.o = _build_linear(d_model_v, d_model, append_name(name, 'output_fc'), initializer)\n        self.dropout = nn.Dropout(p=cfg['attention_probs_dropout_prob'])\n\n    def forward(self, queries, keys, values, attn_bias, past_cache):\n        assert len(queries.shape) == len(keys.shape) == len(values.shape) == 
3\n        #bsz, q_len, q_dim = queries.shape\n        #bsz, k_len, k_dim = keys.shape\n        #bsz, v_len, v_dim = values.shape\n        #assert k_len == v_len\n\n        q = self.q(queries)\n        k = self.k(keys)\n        v = self.v(values)\n\n        cache = (k, v)\n        if past_cache is not None:\n            cached_k, cached_v = past_cache\n            k = P.concat([cached_k, k], 1)\n            v = P.concat([cached_v, v], 1)\n\n        q = q.reshape([0, 0, self.n_head, q.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])  #[batch, head, seq, dim]\n        k = k.reshape([0, 0, self.n_head, k.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])  #[batch, head, seq, dim]\n        v = v.reshape([0, 0, self.n_head, v.shape[-1] // self.n_head]).transpose([0, 2, 1, 3])  #[batch, head, seq, dim]\n\n        q = q.scale(self.d_key**-0.5)\n        score = q.matmul(k, transpose_y=True)\n        if attn_bias is not None:\n            score += attn_bias\n        score = F.softmax(score)\n        score = self.dropout(score)\n\n        out = score.matmul(v).transpose([0, 2, 1, 3])\n        out = out.reshape([0, 0, out.shape[2] * out.shape[3]])\n        out = self.o(out)\n        return out, cache\n\n\nclass PositionwiseFeedForwardLayer(nn.Layer):\n\n    def __init__(self, cfg, name=None):\n        super(PositionwiseFeedForwardLayer, self).__init__()\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_ffn = cfg.get('intermediate_size', 4 * d_model)\n        self.act = ACT_DICT[cfg['hidden_act']]()\n        self.i = _build_linear(\n            d_model,\n            d_ffn,\n            append_name(name, 'fc_0'),\n            initializer,\n        )\n        self.o = _build_linear(d_ffn, d_model, append_name(name, 'fc_1'), initializer)\n        prob = cfg.get('intermediate_dropout_prob', 0.)\n        self.dropout = nn.Dropout(p=prob)\n\n    def forward(self, inputs):\n        hidden = 
self.act(self.i(inputs))\n        hidden = self.dropout(hidden)\n        out = self.o(hidden)\n        return out\n\n\nclass ErnieBlock(nn.Layer):\n\n    def __init__(self, cfg, name=None):\n        super(ErnieBlock, self).__init__()\n        d_model = cfg['hidden_size']\n        self.attn = AttentionLayer(cfg, name=append_name(name, 'multi_head_att'))\n        self.ln1 = _build_ln(d_model, name=append_name(name, 'post_att'))\n        self.ffn = PositionwiseFeedForwardLayer(cfg, name=append_name(name, 'ffn'))\n        self.ln2 = _build_ln(d_model, name=append_name(name, 'post_ffn'))\n        prob = cfg.get('intermediate_dropout_prob', cfg['hidden_dropout_prob'])\n        self.dropout = nn.Dropout(p=prob)\n\n    def forward(self, inputs, attn_bias=None, past_cache=None):\n        attn_out, cache = self.attn(inputs, inputs, inputs, attn_bias, past_cache=past_cache)  #self attn\n        attn_out = self.dropout(attn_out)\n        hidden = attn_out + inputs\n        hidden = self.ln1(hidden)  # dropout/ add/ norm\n\n        ffn_out = self.ffn(hidden)\n        ffn_out = self.dropout(ffn_out)\n        hidden = ffn_out + hidden\n        hidden = self.ln2(hidden)\n        return hidden, cache\n\n\nclass ErnieEncoderStack(nn.Layer):\n\n    def __init__(self, cfg, name=None):\n        super(ErnieEncoderStack, self).__init__()\n        n_layers = cfg['num_hidden_layers']\n        self.block = nn.LayerList([ErnieBlock(cfg, append_name(name, 'layer_%d' % i)) for i in range(n_layers)])\n\n    def forward(self, inputs, attn_bias=None, past_cache=None):\n        if past_cache is not None:\n            assert isinstance(\n                past_cache,\n                tuple), 'unknown type of `past_cache`, expect tuple or list. 
got %s' % repr(type(past_cache))\n            past_cache = list(zip(*past_cache))\n        else:\n            past_cache = [None] * len(self.block)\n        cache_list_k, cache_list_v, hidden_list = [], [], [inputs]\n\n        for b, p in zip(self.block, past_cache):\n            inputs, cache = b(inputs, attn_bias=attn_bias, past_cache=p)\n            cache_k, cache_v = cache\n            cache_list_k.append(cache_k)\n            cache_list_v.append(cache_v)\n            hidden_list.append(inputs)\n\n        return inputs, hidden_list, (cache_list_k, cache_list_v)\n\n\nclass PretrainedModel(object):\n    bce = 'https://ernie-github.cdn.bcebos.com/'\n    resource_map = {\n        'ernie-1.0': bce + 'model-ernie1.0.1.tar.gz',\n        'ernie-2.0-en': bce + 'model-ernie2.0-en.1.tar.gz',\n        'ernie-2.0-large-en': bce + 'model-ernie2.0-large-en.1.tar.gz',\n        'ernie-tiny': bce + 'model-ernie_tiny.1.tar.gz',\n        'ernie-gram-zh': bce + 'model-ernie-gram-zh.1.tar.gz',\n        'ernie-gram-en': bce + 'model-ernie-gram-en.1.tar.gz',\n    }\n\n    @classmethod\n    def from_pretrained(cls, pretrain_dir_or_url, force_download=False, **kwargs):\n        if not Path(pretrain_dir_or_url).exists() and str(pretrain_dir_or_url) in cls.resource_map:\n            url = cls.resource_map[str(pretrain_dir_or_url)]\n            log.info('get pretrain dir from %s' % url)\n            pretrain_dir = _fetch_from_remote(url, force_download)\n        else:\n            log.info('pretrain dir %s not in %s, read from local' % (pretrain_dir_or_url, repr(cls.resource_map)))\n            pretrain_dir = Path(pretrain_dir_or_url)\n\n        if not pretrain_dir.exists():\n            raise ValueError('pretrain dir not found: %s, optional: %s' % (pretrain_dir, cls.resource_map.keys()))\n        state_dict_path = pretrain_dir / 'saved_weights.pdparams'\n        config_path = pretrain_dir / 'ernie_config.json'\n\n        if not config_path.exists():\n            raise ValueError('config 
path not found: %s' % config_path)\n        name_prefix = kwargs.pop('name', None)\n        cfg_dict = dict(json.loads(config_path.open().read()), **kwargs)\n        model = cls(cfg_dict, name=name_prefix)\n\n        log.info('loading pretrained model from %s' % pretrain_dir)\n\n        #param_path = pretrain_dir / 'params'\n        #if os.path.exists(param_path):\n        #    raise NotImplementedError()\n        #    log.debug('load pretrained weight from program state')\n        #    F.io.load_program_state(param_path) #buggy in dygraph.gurad, push paddle to fix\n        if state_dict_path.exists():\n            m = P.load(str(state_dict_path))\n            for k, v in model.state_dict().items():\n                if k not in m:\n                    log.warn('param:%s not set in pretrained model, skip' % k)\n                    m[k] = v  # FIXME: no need to do this in the future\n            model.set_state_dict(m)\n        else:\n            raise ValueError('weight file not found in pretrain dir: %s' % pretrain_dir)\n        return model\n\n\nclass ErnieModel(nn.Layer, PretrainedModel):\n\n    def __init__(self, cfg, name=None):\n        \"\"\"\n        Fundamental pretrained Ernie model\n        \"\"\"\n        log.debug('init ErnieModel with config: %s' % repr(cfg))\n        nn.Layer.__init__(self)\n        d_model = cfg['hidden_size']\n        d_emb = cfg.get('emb_size', cfg['hidden_size'])\n        d_vocab = cfg['vocab_size']\n        d_pos = cfg['max_position_embeddings']\n        d_sent = cfg.get(\"sent_type_vocab_size\") or cfg['type_vocab_size']\n        self.d_rel_pos = cfg.get('rel_pos_size', None)\n        max_seq_len = cfg.get(\"max_seq_len\", 512)\n        self.n_head = cfg['num_attention_heads']\n        self.return_additional_info = cfg.get('return_additional_info', False)\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        if self.d_rel_pos:\n            self.rel_pos_bias = 
_get_rel_pos_bias(max_seq_len)\n\n        self.ln = _build_ln(d_model, name=append_name(name, 'pre_encoder'))\n        self.word_emb = nn.Embedding(d_vocab,\n                                     d_emb,\n                                     weight_attr=P.ParamAttr(name=append_name(name, 'word_embedding'),\n                                                             initializer=initializer))\n        self.pos_emb = nn.Embedding(d_pos,\n                                    d_emb,\n                                    weight_attr=P.ParamAttr(name=append_name(name, 'pos_embedding'),\n                                                            initializer=initializer))\n        self.sent_emb = nn.Embedding(d_sent,\n                                     d_emb,\n                                     weight_attr=P.ParamAttr(name=append_name(name, 'sent_embedding'),\n                                                             initializer=initializer))\n        if self.d_rel_pos:\n            self.rel_pos_bias_emb = nn.Embedding(self.d_rel_pos,\n                                                 self.n_head,\n                                                 weight_attr=P.ParamAttr(name=append_name(name, 'rel_pos_embedding'),\n                                                                         initializer=initializer))\n        prob = cfg['hidden_dropout_prob']\n        self.dropout = nn.Dropout(p=prob)\n\n        self.encoder_stack = ErnieEncoderStack(cfg, append_name(name, 'encoder'))\n        if cfg.get('has_pooler', True):\n            self.pooler = _build_linear(\n                cfg['hidden_size'],\n                cfg['hidden_size'],\n                append_name(name, 'pooled_fc'),\n                initializer,\n            )\n        else:\n            self.pooler = None\n        self.train()\n\n    #FIXME:remove this\n    def eval(self):\n        if P.in_dynamic_mode():\n            super(ErnieModel, self).eval()\n        self.training = False\n        for l in 
self.sublayers():\n            l.training = False\n        return self\n\n    def train(self):\n        if P.in_dynamic_mode():\n            super(ErnieModel, self).train()\n        self.training = True\n        for l in self.sublayers():\n            l.training = True\n        return self\n\n    def forward(self,\n                src_ids,\n                sent_ids=None,\n                pos_ids=None,\n                input_mask=None,\n                attn_bias=None,\n                past_cache=None,\n                use_causal_mask=False):\n        \"\"\"\n        Args:\n            src_ids (`Variable` of shape `[batch_size, seq_len]`):\n                Indices of input sequence tokens in the vocabulary.\n            sent_ids (optional, `Variable` of shape `[batch_size, seq_len]`):\n                aka token_type_ids, Segment token indices to indicate first and second portions of the inputs.\n                if None, assume all tokens come from `segment_a`\n            pos_ids(optional, `Variable` of shape `[batch_size, seq_len]`):\n                Indices of positions of each input sequence tokens in the position embeddings.\n            input_mask(optional `Variable` of shape `[batch_size, seq_len]`):\n                Mask to avoid performing attention on the padding token indices of the encoder input.\n            attn_bias(optional, `Variable` of shape `[batch_size, seq_len, seq_len] or False`):\n                3D version of `input_mask`, if set, overrides `input_mask`; if set not False, will not apply attention mask\n            past_cache(optional, tuple of two lists: cached key and cached value,\n                each is a list of `Variable`s of shape `[batch_size, seq_len, hidden_size]`):\n                cached key/value tensor that will be concated to generated key/value when performing self attention.\n                if set, `attn_bias` should not be None.\n        Returns:\n            pooled (`Variable` of shape `[batch_size, hidden_size]`):\n        
        output logits of pooler classifier\n            encoded(`Variable` of shape `[batch_size, seq_len, hidden_size]`):\n                output logits of transformer stack\n            info (Dictionary):\n                additional middle-level info, includes: all hidden states, k/v caches.\n        \"\"\"\n        assert len(src_ids.shape) == 2, 'expect src_ids.shape = [batch, sequence], got %s' % (repr(src_ids.shape))\n        assert attn_bias is not None if past_cache else True, 'if `past_cache` is specified, attn_bias should not be None'\n        d_seqlen = P.shape(src_ids)[1]\n        if pos_ids is None:\n            pos_ids = P.arange(0, d_seqlen, 1, dtype='int32').reshape([1, -1]).cast('int64')\n        if attn_bias is None:\n            if input_mask is None:\n                input_mask = P.cast(src_ids != 0, 'float32')\n            assert len(input_mask.shape) == 2\n            input_mask = input_mask.unsqueeze(-1)\n            # outer product of the padding mask yields a [batch, seq, seq] pairwise visibility matrix\n            attn_bias = input_mask.matmul(input_mask, transpose_y=True)\n            if use_causal_mask:\n                sequence = P.reshape(P.arange(0, d_seqlen, 1, dtype='float32') + 1., [1, 1, -1, 1])\n                # lower-triangular mask: position i may only attend to positions j <= i\n                causal_mask = (sequence.matmul(1. / sequence, transpose_y=True) >= 1.).cast('float32')\n                attn_bias *= causal_mask\n        else:\n            assert len(attn_bias.shape) == 3, 'expect attn_bias to be rank 3, got %r' % attn_bias.shape\n        attn_bias = (1. 
- attn_bias) * -10000.0\n        attn_bias = attn_bias.unsqueeze(1).tile([1, self.n_head, 1, 1])  # avoid broadcast =_=\n        attn_bias.stop_gradient = True\n        if sent_ids is None:\n            sent_ids = P.zeros_like(src_ids)\n        if self.d_rel_pos:\n            rel_pos_ids = self.rel_pos_bias[:d_seqlen, :d_seqlen]\n            rel_pos_ids = P.to_tensor(rel_pos_ids, dtype='int64')\n            rel_pos_bias = self.rel_pos_bias_emb(rel_pos_ids).transpose([2, 0, 1])\n            attn_bias += rel_pos_bias\n        src_embedded = self.word_emb(src_ids)\n        pos_embedded = self.pos_emb(pos_ids)\n        sent_embedded = self.sent_emb(sent_ids)\n        embedded = src_embedded + pos_embedded + sent_embedded\n\n        embedded = self.dropout(self.ln(embedded))\n\n        encoded, hidden_list, cache_list = self.encoder_stack(embedded, attn_bias, past_cache=past_cache)\n        if self.pooler is not None:\n            pooled = F.tanh(self.pooler(encoded[:, 0, :]))\n        else:\n            pooled = None\n\n        additional_info = {\n            'hiddens': hidden_list,\n            'caches': cache_list,\n        }\n\n        if self.return_additional_info:\n            return pooled, encoded, additional_info\n        return pooled, encoded\n\n\nclass ErnieModelForSequenceClassification(ErnieModel):\n    \"\"\"\n    Ernie Model for text classification or pointwise ranking tasks\n    \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieModelForSequenceClassification, self).__init__(cfg, name=name)\n\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        self.classifier = _build_linear(cfg['hidden_size'], cfg['num_labels'], append_name(name, 'cls'), initializer)\n\n        prob = cfg.get('classifier_dropout_prob', cfg['hidden_dropout_prob'])\n        self.dropout = nn.Dropout(p=prob)\n        self.train()\n\n    @add_docstring(ErnieModel.forward.__doc__)\n    def forward(self, *args, **kwargs):\n        
\"\"\"\n        Args:\n            labels (optional, `Variable` of shape [batch_size]):\n                ground truth label id for each sentence\n        Returns:\n            loss (`Variable` of shape []):\n                Cross entropy loss mean over batch\n                if labels not set, returns None\n            logits (`Variable` of shape [batch_size, num_labels]):\n                output logits of classifier\n        \"\"\"\n        labels = kwargs.pop('labels', None)\n        pooled, encoded = super(ErnieModelForSequenceClassification, self).forward(*args, **kwargs)\n        hidden = self.dropout(pooled)\n        logits = self.classifier(hidden)\n\n        if labels is not None:\n            if len(labels.shape) != 1:\n                labels = labels.squeeze()\n            loss = F.cross_entropy(logits, labels)\n        else:\n            loss = None\n        return loss, logits\n\n\nclass ErnieModelForTokenClassification(ErnieModel):\n    \"\"\"\n    Ernie Model for named entity recognition (NER) tasks\n    \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieModelForTokenClassification, self).__init__(cfg, name=name)\n\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        self.classifier = _build_linear(cfg['hidden_size'], cfg['num_labels'], append_name(name, 'cls'), initializer)\n\n        prob = cfg.get('classifier_dropout_prob', cfg['hidden_dropout_prob'])\n        self.dropout = nn.Dropout(p=prob)\n        self.train()\n\n    @add_docstring(ErnieModel.forward.__doc__)\n    def forward(self, *args, **kwargs):\n        \"\"\"\n        Args:\n            labels (optional, `Variable` of shape [batch_size, seq_len]):\n                ground truth label id for each token\n        Returns:\n            loss (`Variable` of shape []):\n                Cross entropy loss mean over batch and time, ignore positions where label == -100\n                if labels not set, returns None\n            logits (`Variable` of 
shape [batch_size, seq_len, num_labels]):\n                output logits of classifier\n            loss_weights (`Variable` of shape [batch_size, seq_len]):\n                weights of loss for each token.\n            ignore_index (int):\n                when label == `ignore_index`, this token will not contribute to loss\n        \"\"\"\n        ignore_index = kwargs.pop('ignore_index', -100)\n        labels = kwargs.pop('labels', None)\n        loss_weights = kwargs.pop('loss_weights', None)\n        pooled, encoded = super(ErnieModelForTokenClassification, self).forward(*args, **kwargs)\n        hidden = self.dropout(encoded)  # maybe not?\n        logits = self.classifier(hidden)\n\n        if labels is not None:\n            if len(labels.shape) != 2:\n                labels = labels.squeeze()\n            loss = F.cross_entropy(logits, labels, ignore_index=ignore_index, reduction='none')\n            if loss_weights is not None:\n                loss = loss * loss_weights\n            loss = loss.mean()\n        else:\n            loss = None\n        return loss, logits\n\n\nclass ErnieModelForQuestionAnswering(ErnieModel):\n    \"\"\"\n    Ernie model for reading comprehension tasks (SQuAD)\n    \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieModelForQuestionAnswering, self).__init__(cfg, name=name)\n\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        self.classifier = _build_linear(cfg['hidden_size'], 2, append_name(name, 'cls_mrc'), initializer)\n\n        prob = cfg.get('classifier_dropout_prob', cfg['hidden_dropout_prob'])\n        self.dropout = nn.Dropout(p=prob)\n        self.train()\n\n    @add_docstring(ErnieModel.forward.__doc__)\n    def forward(self, *args, **kwargs):\n        \"\"\"\n        Args:\n            start_pos (optional, `Variable` of shape [batch_size]):\n                token index of start of answer span in `context`\n            end_pos (optional, `Variable` of 
shape [batch_size]):\n                token index of end of answer span in `context`\n        Returns:\n            loss (`Variable` of shape []):\n                Cross entropy loss averaged over the start and end positions\n                if start_pos or end_pos is not set, returns None\n            start_logits (`Variable` of shape [batch_size, seq_len]):\n                output logits of start position, use argmax(start_logits) to get start index\n            end_logits (`Variable` of shape [batch_size, seq_len]):\n                output logits of end position, use argmax(end_logits) to get end index\n        \"\"\"\n\n        start_pos = kwargs.pop('start_pos', None)\n        end_pos = kwargs.pop('end_pos', None)\n        pooled, encoded = super(ErnieModelForQuestionAnswering, self).forward(*args, **kwargs)\n        encoded = self.dropout(encoded)\n        encoded = self.classifier(encoded)\n        start_logits, end_logits = P.unstack(encoded, axis=-1)\n        if start_pos is not None and end_pos is not None:\n            if len(start_pos.shape) != 1:\n                start_pos = start_pos.squeeze()\n            if len(end_pos.shape) != 1:\n                end_pos = end_pos.squeeze()\n            start_loss = F.cross_entropy(start_logits, start_pos)\n            end_loss = F.cross_entropy(end_logits, end_pos)\n            loss = (start_loss.mean() + end_loss.mean()) / 2.\n        else:\n            loss = None\n        return loss, start_logits, end_logits\n\n\nclass NSPHead(nn.Layer):\n\n    def __init__(self, cfg, name=None):\n        super(NSPHead, self).__init__()\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        self.nsp = _build_linear(cfg['hidden_size'], 2, append_name(name, 'nsp_fc'), initializer)\n\n    def forward(self, inputs, labels):\n        \"\"\"\n        Args:\n            inputs (`Variable` of shape [batch_size, hidden_size]):\n                
pooled output of the first token ([CLS]) from `ErnieModel`\n            labels (`Variable` of shape [batch_size]):\n                ground truth label id for `next sentence prediction`\n        Returns:\n            loss (`Variable` of shape []):\n                Cross entropy loss mean over batch\n        \"\"\"\n\n        logits = self.nsp(inputs)\n        loss = F.cross_entropy(logits, labels)\n        return loss\n\n\nclass ErnieModelForPretraining(ErnieModel):\n    \"\"\"\n    Ernie Model for Masked Language Model pretraining\n    \"\"\"\n\n    def __init__(self, cfg, name=None):\n        super(ErnieModelForPretraining, self).__init__(cfg, name=name)\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_vocab = cfg['vocab_size']\n\n        self.pooler_heads = nn.LayerList([NSPHead(cfg, name=name)])\n        self.mlm = _build_linear(\n            d_model,\n            d_model,\n            append_name(name, 'mask_lm_trans_fc'),\n            initializer,\n        )\n        self.act = ACT_DICT[cfg['hidden_act']]()\n        self.mlm_ln = _build_ln(d_model, name=append_name(name, 'mask_lm_trans'))\n        self.mlm_bias = P.create_parameter(\n            dtype='float32',\n            shape=[d_vocab],\n            attr=P.ParamAttr(name=append_name(name, 'mask_lm_out_fc.b_0'),\n                             initializer=nn.initializer.Constant(value=0.0)),\n            is_bias=True,\n        )\n        self.train()\n\n    @add_docstring(ErnieModel.forward.__doc__)\n    def forward(self, *args, **kwargs):\n        \"\"\"\n        Args:\n            nsp_labels (optional, `Variable` of shape 
[batch_size]):\n                labels for `next sentence prediction` tasks\n            mlm_pos (optional, `Variable` of shape [n_mask, 2]):\n                index of mask_id in `src_ids`, can be obtained from `fluid.layers.where(src_ids==mask_id)`\n            labels (optional, `Variable` of shape [n_mask]):\n                labels for `mask language model` tasks, the original token ids at the masked positions in `src_ids`\n        Returns:\n            loss (`Variable` of shape []):\n                total_loss of `next sentence prediction` and `masked language model`\n            mlm_loss (`Variable` of shape []):\n                loss for `masked language model` task\n            nsp_loss (`Variable` of shape []):\n                loss for `next sentence prediction` task\n        \"\"\"\n\n        mlm_labels = kwargs.pop('labels')\n        mlm_pos = kwargs.pop('mlm_pos')\n        nsp_labels = kwargs.pop('nsp_labels')\n        pooled, encoded = super(ErnieModelForPretraining, self).forward(*args, **kwargs)\n        if len(mlm_labels.shape) != 1:\n            mlm_labels = mlm_labels.squeeze()\n        if len(nsp_labels.shape) != 1:\n            nsp_labels = nsp_labels.squeeze()\n\n        nsp_loss = self.pooler_heads[0](pooled, nsp_labels)\n\n        encoded_2d = encoded.gather_nd(mlm_pos)\n        encoded_2d = self.act(self.mlm(encoded_2d))\n        encoded_2d = self.mlm_ln(encoded_2d)\n        logits_2d = encoded_2d.matmul(self.word_emb.weight, transpose_y=True) + self.mlm_bias\n        mlm_loss = F.cross_entropy(logits_2d, mlm_labels)\n        total_loss = mlm_loss + nsp_loss\n        return total_loss, mlm_loss, nsp_loss\n\n\nclass ErnieModelForGeneration(ErnieModel):\n    \"\"\"\n    Ernie Model for sequence to sequence generation.\n    \"\"\"\n    resource_map = {\n        'ernie-gen-base-en': ErnieModel.bce + 'model-ernie-gen-base-en.1.tar.gz',\n        'ernie-gen-large-en': ErnieModel.bce + 'model-ernie-gen-large-en.1.tar.gz',\n        
'ernie-gen-large-430g-en': ErnieModel.bce + 'model-ernie-gen-large-430g-en.1.tar.gz',\n        'ernie-1.0': ErnieModel.bce + 'model-ernie1.0.1.tar.gz',\n    }\n\n    def __init__(self, cfg, name=None):\n        cfg['return_additional_info'] = True\n        cfg['has_pooler'] = False\n        super(ErnieModelForGeneration, self).__init__(cfg, name=name)\n        initializer = nn.initializer.TruncatedNormal(std=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_vocab = cfg['vocab_size']\n\n        self.mlm = _build_linear(\n            d_model,\n            d_model,\n            append_name(name, 'mask_lm_trans_fc'),\n            initializer,\n        )\n        self.act = ACT_DICT[cfg['hidden_act']]()\n        self.mlm_ln = _build_ln(d_model, name=append_name(name, 'mask_lm_trans'))\n        self.mlm_bias = P.create_parameter(\n            dtype='float32',\n            shape=[d_vocab],\n            attr=P.ParamAttr(name=append_name(name, 'mask_lm_out_fc.b_0'),\n                             initializer=nn.initializer.Constant(value=0.0)),\n            is_bias=True,\n        )\n        self.train()\n\n    @add_docstring(ErnieModel.forward.__doc__)\n    def forward(self, *args, **kwargs):\n        \"\"\"\n        Args:\n            tgt_labels (`Variable` of shape [batch_size, seqlen] or [batch_size, seqlen, vocab_size]):\n                ground truth target sequence id (hard label) or distribution (soft label)\n            tgt_pos (`Variable` of shape [n_targets, 2]):\n                index of tgt_labels in `src_ids`, can be obtained from `fluid.layers.where(src_ids==mask_id)`\n            encode_only (bool):\n                if set, loss and logits are not computed and are returned as None\n        Returns:\n            loss (`Variable` of shape []):\n                cross entropy loss mean over every target label. if `encode_only`, returns None.\n            logits (`Variable` of shape [n_targets, vocab_size]):\n                logits for every target. 
if `encode_only`, returns None.\n            info (dict): see `ErnieModel`\n        \"\"\"\n        tgt_labels = kwargs.pop('tgt_labels', None)\n        tgt_pos = kwargs.pop('tgt_pos', None)\n        encode_only = kwargs.pop('encode_only', False)\n        _, encoded, info = ErnieModel.forward(self, *args, **kwargs)\n        if encode_only:\n            return None, None, info\n        if tgt_labels is None or tgt_pos is None:\n            encoded = self.act(self.mlm(encoded))\n            encoded = self.mlm_ln(encoded)\n            logits = encoded.matmul(self.word_emb.weight, transpose_y=True) + self.mlm_bias\n            output_ids = logits.cast('float32').argmax(-1)\n            return output_ids, logits, info\n        else:\n            encoded_2d = encoded.gather_nd(tgt_pos)\n            encoded_2d = self.act(self.mlm(encoded_2d))\n            encoded_2d = self.mlm_ln(encoded_2d)\n            logits_2d = encoded_2d.matmul(self.word_emb.weight, transpose_y=True) + self.mlm_bias\n            assert len(tgt_labels.shape) == 2, 'expect 2d label, got %r' % tgt_labels.shape\n\n            loss = F.cross_entropy(logits_2d, tgt_labels, soft_label=True)\n            return loss, logits_2d, info\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/ernie_tokenizer.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport logging\nimport os\nimport re\nimport sys\nimport tempfile\nfrom functools import partial\nfrom pathlib import Path\n\nimport six\nif six.PY2:\n    from pathlib2 import Path\nelse:\n    from pathlib import Path\n\nfrom tqdm import tqdm\nimport numpy as np\n\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.file_utils import _fetch_from_remote\nimport io\n\nopen = partial(io.open, encoding='utf8')\n\nlog = logging.getLogger(__name__)\n\n_max_input_chars_per_word = 100\n\n\ndef _wordpiece(token, vocab, unk_token, prefix='##', sentencepiece_prefix=''):\n    \"\"\" wordpiece: helloworld => [hello, ##world] \"\"\"\n    chars = list(token)\n    if len(chars) > _max_input_chars_per_word:\n        return [unk_token], [(0, len(chars))]\n\n    is_bad = False\n    start = 0\n    sub_tokens = []\n    sub_pos = []\n    while start < len(chars):\n        end = len(chars)\n        cur_substr = None\n        while start < end:\n            substr = \"\".join(chars[start:end])\n            if start == 0:\n                substr = sentencepiece_prefix + substr\n            if start > 0:\n                substr = prefix + substr\n            if substr in vocab:\n                cur_substr = substr\n                break\n            end -= 1\n        if cur_substr is None:\n            is_bad = True\n            break\n        sub_tokens.append(cur_substr)\n        sub_pos.append((start, end))\n        start = end\n    if is_bad:\n        return [unk_token], [(0, len(chars))]\n    else:\n        return sub_tokens, sub_pos\n\n\nclass ErnieTokenizer(object):\n    bce = 'https://ernie-github.cdn.bcebos.com/'\n    resource_map = {\n        'ernie-1.0': bce + 'model-ernie1.0.1.tar.gz',\n        'ernie-2.0-en': bce + 'model-ernie2.0-en.1.tar.gz',\n        'ernie-2.0-large-en': bce + 
'model-ernie2.0-large-en.1.tar.gz',\n        'ernie-tiny': bce + 'model-ernie_tiny.1.tar.gz',\n        'ernie-gen-base-en': bce + 'model-ernie-gen-base-en.1.tar.gz',\n        'ernie-gen-large-en': bce + 'model-ernie-gen-large-en.1.tar.gz',\n        'ernie-gram-zh': bce + 'model-ernie-gram-zh.1.tar.gz',\n        'ernie-gram-en': bce + 'model-ernie-gram-en.1.tar.gz',\n    }\n\n    @classmethod\n    def from_pretrained(cls, pretrain_dir_or_url, force_download=False, **kwargs):\n        if not Path(pretrain_dir_or_url).exists() and str(pretrain_dir_or_url) in cls.resource_map:\n            url = cls.resource_map[str(pretrain_dir_or_url)]\n            log.info('get pretrain dir from %s' % url)\n            pretrain_dir = _fetch_from_remote(url, force_download=force_download)\n        else:\n            log.info('pretrain dir %s not in %s, read from local' % (pretrain_dir_or_url, repr(cls.resource_map)))\n            pretrain_dir = Path(pretrain_dir_or_url)\n        if not pretrain_dir.exists():\n            raise ValueError('pretrain dir not found: %s, optional: %s' % (pretrain_dir, cls.resource_map.keys()))\n        vocab_path = pretrain_dir / 'vocab.txt'\n        if not vocab_path.exists():\n            raise ValueError('no vocab file in pretrain dir: %s' % pretrain_dir)\n        vocab_dict = {j.strip().split('\\t')[0]: i for i, j in enumerate(vocab_path.open(encoding='utf8').readlines())}\n        t = cls(vocab_dict, **kwargs)\n        return t\n\n    def __init__(self,\n                 vocab,\n                 unk_token='[UNK]',\n                 sep_token='[SEP]',\n                 cls_token='[CLS]',\n                 pad_token='[PAD]',\n                 mask_token='[MASK]',\n                 wordpiece_prefix='##',\n                 sentencepiece_prefix='',\n                 lower=True,\n                 encoding='utf8',\n                 special_token_list=[]):\n        if not isinstance(vocab, dict):\n            raise ValueError('expect `vocab` to be instance 
of dict, got %s' % type(vocab))\n        self.vocab = vocab\n        self.lower = lower\n        self.prefix = wordpiece_prefix\n        self.sentencepiece_prefix = sentencepiece_prefix\n        self.pad_id = self.vocab[pad_token]\n        self.cls_id = cls_token and self.vocab[cls_token]\n        self.sep_id = sep_token and self.vocab[sep_token]\n        self.unk_id = unk_token and self.vocab[unk_token]\n        self.mask_id = mask_token and self.vocab[mask_token]\n        self.unk_token = unk_token\n        special_tokens = {pad_token, cls_token, sep_token, unk_token, mask_token} | set(special_token_list)\n        pat_str = ''\n        for t in special_tokens:\n            if t is None:\n                continue\n            pat_str += '(%s)|' % re.escape(t)\n        pat_str += r'([a-zA-Z0-9]+|\\S)'\n        log.debug('regex: %s' % pat_str)\n        self.pat = re.compile(pat_str)\n        self.encoding = encoding\n\n    def tokenize(self, text):\n        if len(text) == 0:\n            return []\n        if six.PY3 and not isinstance(text, six.string_types):\n            text = text.decode(self.encoding)\n        if six.PY2 and isinstance(text, str):\n            text = text.decode(self.encoding)\n\n        res = []\n        for match in self.pat.finditer(text):\n            match_group = match.group(0)\n            if match.groups()[-1]:\n                if self.lower:\n                    match_group = match_group.lower()\n                words, _ = _wordpiece(match_group,\n                                      vocab=self.vocab,\n                                      unk_token=self.unk_token,\n                                      prefix=self.prefix,\n                                      sentencepiece_prefix=self.sentencepiece_prefix)\n            else:\n                words = [match_group]\n            res += words\n        return res\n\n    def convert_tokens_to_ids(self, tokens):\n        return [self.vocab.get(t, self.unk_id) for t in tokens]\n\n    def 
truncate(self, id1, id2, seqlen):\n        len1 = len(id1)\n        len2 = len(id2)\n        half = seqlen // 2\n        if len1 > len2:\n            len1_truncated, len2_truncated = max(half, seqlen - len2), min(half, len2)\n        else:\n            len1_truncated, len2_truncated = min(half, len1), max(half, seqlen - len1)\n        return id1[:len1_truncated], id2[:len2_truncated]\n\n    def build_for_ernie(self, text_id, pair_id=[]):\n        \"\"\"build sentence type id, add [CLS] [SEP]\"\"\"\n        text_id_type = np.zeros_like(text_id, dtype=np.int64)\n        ret_id = np.concatenate([[self.cls_id], text_id, [self.sep_id]], 0)\n        ret_id_type = np.concatenate([[0], text_id_type, [0]], 0)\n\n        if len(pair_id):\n            pair_id_type = np.ones_like(pair_id, dtype=np.int64)\n            ret_id = np.concatenate([ret_id, pair_id, [self.sep_id]], 0)\n            ret_id_type = np.concatenate([ret_id_type, pair_id_type, [1]], 0)\n        return ret_id, ret_id_type\n\n    def encode(self, text, pair=None, truncate_to=None):\n        text_id = np.array(self.convert_tokens_to_ids(self.tokenize(text)), dtype=np.int64)\n        text_id_type = np.zeros_like(text_id, dtype=np.int64)\n        if pair is not None:\n            pair_id = np.array(self.convert_tokens_to_ids(self.tokenize(pair)), dtype=np.int64)\n        else:\n            pair_id = []\n        if truncate_to is not None:\n            text_id, pair_id = self.truncate(text_id, [] if pair_id is None else pair_id, truncate_to)\n\n        ret_id, ret_id_type = self.build_for_ernie(text_id, pair_id)\n        return ret_id, ret_id_type\n\n\nclass ErnieTinyTokenizer(ErnieTokenizer):\n    bce = 'https://ernie-github.cdn.bcebos.com/'\n    resource_map = {'ernie-tiny': bce + 'model-ernie_tiny.1.tar.gz'}\n\n    @classmethod\n    def from_pretrained(cls, pretrain_dir_or_url, force_download=False, **kwargs):\n        if not Path(pretrain_dir_or_url).exists() and str(pretrain_dir_or_url) in 
cls.resource_map:\n            url = cls.resource_map[str(pretrain_dir_or_url)]\n            log.info('get pretrain dir from %s' % url)\n            pretrain_dir = _fetch_from_remote(url, force_download)\n        else:\n            log.info('pretrain dir %s not in %s, read from local' % (pretrain_dir_or_url, repr(cls.resource_map)))\n            pretrain_dir = Path(pretrain_dir_or_url)\n        if not pretrain_dir.exists():\n            raise ValueError('pretrain dir not found: %s' % pretrain_dir)\n        vocab_path = pretrain_dir / 'vocab.txt'\n        sp_model_path = pretrain_dir / 'subword/spm_cased_simp_sampled.model'\n\n        if not vocab_path.exists():\n            raise ValueError('no vocab file in pretrain dir: %s' % pretrain_dir)\n        vocab_dict = {j.strip().split('\\t')[0]: i for i, j in enumerate(vocab_path.open(encoding='utf8').readlines())}\n\n        t = cls(vocab_dict, sp_model_path, **kwargs)\n        return t\n\n    def __init__(self, vocab, sp_model_path, **kwargs):\n        super(ErnieTinyTokenizer, self).__init__(vocab, **kwargs)\n        import sentencepiece as spm\n        import jieba as jb\n        self.sp_model = spm.SentencePieceProcessor()\n        self.window_size = 5\n        self.sp_model.Load(sp_model_path)\n        self.jb = jb\n\n    def cut(self, sentence):\n        return self.jb.cut(sentence)\n\n    def tokenize(self, text):\n        if len(text) == 0:\n            return []\n        if not isinstance(text, six.string_types):\n            text = text.decode(self.encoding)\n        if self.lower:\n            text = text.lower()\n\n        res = []\n        for match in self.cut(text):\n            res += self.sp_model.EncodeAsPieces(match)\n        return res\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/file_utils.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport logging\nimport os\nimport time\nfrom pathlib import Path\n\nimport six\nfrom tqdm import tqdm\nif six.PY2:\n    from pathlib2 import Path\nelse:\n    from pathlib import Path\n\nlog = logging.getLogger(__name__)\n\n\ndef _fetch_from_remote(url, force_download=False, cached_dir='~/.paddle-ernie-cache'):\n    import hashlib, tempfile, requests, tarfile\n    sig = hashlib.md5(url.encode('utf8')).hexdigest()\n    cached_dir = Path(cached_dir).expanduser()\n    try:\n        cached_dir.mkdir()\n    except OSError:\n        pass\n    cached_dir_model = cached_dir / sig\n    from filelock import FileLock\n    with FileLock(str(cached_dir_model) + '.lock'):\n        donefile = cached_dir_model / 'done'\n        if (not force_download) and donefile.exists():\n            log.debug('%s cached in %s' % (url, cached_dir_model))\n            return cached_dir_model\n        cached_dir_model.mkdir(exist_ok=True)\n        tmpfile = cached_dir_model / 'tmp'\n        with tmpfile.open('wb') as f:\n            r = requests.get(url, stream=True)\n            total_len = int(r.headers.get('content-length'))\n            for chunk in tqdm(r.iter_content(chunk_size=1024),\n                              total=total_len // 1024,\n                              desc='downloading %s' % url,\n                              unit='KB'):\n                if chunk:\n                    f.write(chunk)\n                    f.flush()\n            log.debug('extracting... 
to %s' % tmpfile)\n            with tarfile.open(tmpfile.as_posix()) as tf:\n\n                def is_within_directory(directory, target):\n                    abs_directory = os.path.abspath(directory)\n                    abs_target = os.path.abspath(target)\n                    prefix = os.path.commonprefix([abs_directory, abs_target])\n                    return prefix == abs_directory\n\n                def safe_extract(tar, path=\".\", members=None, *, numeric_owner=False):\n                    # reject archive members that would escape the extraction dir\n                    for member in tar.getmembers():\n                        member_path = os.path.join(path, member.name)\n                        if not is_within_directory(path, member_path):\n                            raise Exception(\"Attempted Path Traversal in Tar File\")\n                    tar.extractall(path, members, numeric_owner=numeric_owner)\n\n                safe_extract(tf, path=str(cached_dir_model))\n            donefile.touch()\n        os.remove(tmpfile.as_posix())\n\n    return cached_dir_model\n\n\ndef add_docstring(doc):\n\n    def func(f):\n        f.__doc__ += ('\\n====== other docs from super class ======\\n%s' % doc)\n        return f\n\n    return func\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/multimodal.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\nclass MultiModalModel(nn.Layer):\n\n    def __init__(self, image_model=None, text_model=None, args=None):\n        super(MultiModalModel, self).__init__()\n        self.visual = image_model\n        self.text_model = text_model\n\n    def encode_text(self, input_ids, pos_ids=None):\n        pool_out, text_embedding = self.text_model(input_ids, pos_ids=pos_ids)\n        return pool_out\n\n    def encode_image(self, img_word):\n        img_embedding = self.visual(img_word)\n        return img_embedding[:, 0]\n\n    def forward(self, img_word=None, input_ids=None, pos_ids=None):\n        img_embedding = self.visual(img_word)\n        img_embedding = img_embedding[:, 0]\n        pool_out, text_embedding = self.text_model(input_ids, pos_ids=pos_ids)\n        return img_embedding, pool_out\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/paddle_vision_transformer.py",
    "content": "# Copyright (c) 2021 PPViT Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nImplement Transformer Class for ViT\n\"\"\"\nimport copy\n\nimport paddle\nimport paddle.nn as nn\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.droppath import DropPath\n\n\nclass Identity(nn.Layer):\n    \"\"\" Identity layer\n    The output of this layer is the input without any change.\n    Use this layer to avoid using 'if' condition in forward methods\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass PatchEmbedding(nn.Layer):\n    \"\"\"Patch Embedding and Position Embedding\n    Apply patch embedding and position embedding on input images.\n    Attributes:\n        patch_embedding: impl using a patch_size x patch_size Conv2D operation\n        position_embeddings: a parameter with len = n_patches + 1 (for cls_token)\n        cls_token: token inserted into the patch features for classification\n        dropout: dropout for embeddings\n    \"\"\"\n\n    def __init__(self, image_size=224, patch_size=16, in_channels=3, embed_dim=768, dropout=0.):\n        super().__init__()\n        n_patches = (image_size // patch_size) * (image_size // patch_size)\n\n        self.patch_embedding = nn.Conv2D(in_channels=in_channels,\n                                         out_channels=embed_dim,\n                                         
kernel_size=patch_size,\n                                         stride=patch_size)\n\n        self.position_embeddings = paddle.create_parameter(\n            shape=[1, n_patches + 1, embed_dim],\n            dtype='float32',\n            default_initializer=paddle.nn.initializer.TruncatedNormal(std=.02))\n\n        self.cls_token = paddle.create_parameter(shape=[1, 1, embed_dim],\n                                                 dtype='float32',\n                                                 default_initializer=paddle.nn.initializer.Constant(0))\n\n        self.dropout = nn.Dropout(dropout)\n\n    def forward(self, x):\n        cls_tokens = self.cls_token.expand((x.shape[0], -1, -1))\n        x = self.patch_embedding(x)\n        x = x.flatten(2)\n        x = x.transpose([0, 2, 1])\n        x = paddle.concat((cls_tokens, x), axis=1)\n\n        embeddings = x + self.position_embeddings  # tensor broadcast\n        embeddings = self.dropout(embeddings)\n        return embeddings\n\n\nclass Attention(nn.Layer):\n    \"\"\" Attention module\n    Attention module for ViT, here q, k, v are assumed the same.\n    The qkv mappings are stored as one single param.\n    Attributes:\n        num_heads: number of heads\n        attn_head_size: feature dim of single head\n        all_head_size: feature dim of all heads\n        qkv: a nn.Linear for q, k, v mapping\n        scales: 1 / sqrt(single_head_feature_dim)\n        out: projection of multi-head attention\n        attn_dropout: dropout for attention\n        proj_dropout: final dropout before output\n        softmax: softmax op for attention\n    \"\"\"\n\n    def __init__(self, embed_dim, num_heads, attn_head_size=None, qkv_bias=True, dropout=0., attention_dropout=0.):\n        super().__init__()\n\n        assert isinstance(embed_dim,\n                          int), (f\"Expected the type of `embed_dim` to be {int}, but received {type(embed_dim)}.\")\n        assert isinstance(num_heads,\n                          
int), (f\"Expected the type of `num_heads` to be {int}, but received {type(num_heads)}.\")\n\n        assert embed_dim > 0, (f\"Expected `embed_dim` to be greater than 0, but received {embed_dim}\")\n        assert num_heads > 0, (f\"Expected `num_heads` to be greater than 0, but received {num_heads}\")\n\n        self.embed_dim = embed_dim\n        self.num_heads = num_heads\n\n        if attn_head_size is not None:\n            assert isinstance(attn_head_size, int), (f\"Expected the type of `attn_head_size` to be {int}, \"\n                                                     f\"but received {type(attn_head_size)}.\")\n            assert attn_head_size > 0, f\"Expected `attn_head_size` to be greater than 0,\" \\\n                                       f\" but received {attn_head_size}.\"\n            self.attn_head_size = attn_head_size\n        else:\n            self.attn_head_size = embed_dim // num_heads\n            assert self.attn_head_size * num_heads == embed_dim, (\n                f\"`embed_dim` must be divisible by `num_heads`,\"\n                f\" but received embed_dim={embed_dim}, num_heads={num_heads}.\")\n\n        self.all_head_size = self.attn_head_size * num_heads\n\n        w_attr_1, b_attr_1 = self._init_weights()\n        self.qkv = nn.Linear(\n            embed_dim,\n            self.all_head_size * 3,  # weights for q, k, and v\n            weight_attr=w_attr_1,\n            bias_attr=b_attr_1 if qkv_bias else False)\n\n        self.scales = self.attn_head_size**-0.5\n\n        w_attr_2, b_attr_2 = self._init_weights()\n        self.out = nn.Linear(self.all_head_size, embed_dim, weight_attr=w_attr_2, bias_attr=b_attr_2)\n\n        self.attn_dropout = nn.Dropout(attention_dropout)\n        self.proj_dropout = nn.Dropout(dropout)\n        self.softmax = nn.Softmax(axis=-1)\n\n    def _init_weights(self):\n        weight_attr = paddle.ParamAttr(initializer=nn.initializer.TruncatedNormal(std=.02))\n        bias_attr = 
paddle.ParamAttr(initializer=nn.initializer.Constant(0.0))\n        return weight_attr, bias_attr\n\n    def transpose_multihead(self, x):\n        new_shape = x.shape[:-1] + [self.num_heads, self.attn_head_size]\n        x = x.reshape(new_shape)\n        x = x.transpose([0, 2, 1, 3])\n        return x\n\n    def forward(self, x):\n        qkv = self.qkv(x).chunk(3, axis=-1)\n        q, k, v = map(self.transpose_multihead, qkv)\n\n        attn = paddle.matmul(q, k, transpose_y=True)\n        attn = attn * self.scales\n        attn = self.softmax(attn)\n        attn = self.attn_dropout(attn)\n\n        z = paddle.matmul(attn, v)\n        z = z.transpose([0, 2, 1, 3])\n        new_shape = z.shape[:-2] + [self.all_head_size]\n        z = z.reshape(new_shape)\n        # reshape\n        z = self.out(z)\n        z = self.proj_dropout(z)\n        return z\n\n\nclass Mlp(nn.Layer):\n    \"\"\" MLP module\n    Impl using nn.Linear and activation is GELU, dropout is applied.\n    Ops: fc -> act -> dropout -> fc -> dropout\n    Attributes:\n        fc1: nn.Linear\n        fc2: nn.Linear\n        act: GELU\n        dropout1: dropout after fc1\n        dropout2: dropout after fc2\n    \"\"\"\n\n    def __init__(self, embed_dim, mlp_ratio, dropout=0.):\n        super().__init__()\n        w_attr_1, b_attr_1 = self._init_weights()\n        self.fc1 = nn.Linear(embed_dim, int(embed_dim * mlp_ratio), weight_attr=w_attr_1, bias_attr=b_attr_1)\n\n        w_attr_2, b_attr_2 = self._init_weights()\n        self.fc2 = nn.Linear(int(embed_dim * mlp_ratio), embed_dim, weight_attr=w_attr_2, bias_attr=b_attr_2)\n        self.act = nn.GELU()\n        self.dropout1 = nn.Dropout(dropout)\n        self.dropout2 = nn.Dropout(dropout)\n\n    def _init_weights(self):\n        weight_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.TruncatedNormal(std=0.2))\n        bias_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.Constant(0.0))\n        return weight_attr, bias_attr\n\n    
def forward(self, x):\n        x = self.fc1(x)\n        x = self.act(x)\n        x = self.dropout1(x)\n        x = self.fc2(x)\n        x = self.dropout2(x)\n        return x\n\n\nclass EncoderLayer(nn.Layer):\n    \"\"\"Encoder Layer\n    Encoder layer contains attention, norm, mlp and residual\n    Attributes:\n        hidden_size: transformer feature dim\n        attn_norm: nn.LayerNorm before attention\n        mlp_norm: nn.LayerNorm before mlp\n        mlp: mlp module\n        attn: attention module\n    \"\"\"\n\n    def __init__(self,\n                 embed_dim,\n                 num_heads,\n                 attn_head_size=None,\n                 qkv_bias=True,\n                 mlp_ratio=4.,\n                 dropout=0.,\n                 attention_dropout=0.,\n                 droppath=0.):\n        super().__init__()\n        w_attr_1, b_attr_1 = self._init_weights()\n        self.attn_norm = nn.LayerNorm(embed_dim, weight_attr=w_attr_1, bias_attr=b_attr_1, epsilon=1e-6)\n\n        self.attn = Attention(embed_dim, num_heads, attn_head_size, qkv_bias, dropout, attention_dropout)\n        self.drop_path = DropPath(droppath) if droppath > 0. 
else Identity()\n\n        w_attr_2, b_attr_2 = self._init_weights()\n        self.mlp_norm = nn.LayerNorm(embed_dim, weight_attr=w_attr_2, bias_attr=b_attr_2, epsilon=1e-6)\n\n        self.mlp = Mlp(embed_dim, mlp_ratio, dropout)\n\n    def _init_weights(self):\n        weight_attr = paddle.ParamAttr(initializer=nn.initializer.Constant(1.0))\n        bias_attr = paddle.ParamAttr(initializer=nn.initializer.Constant(0.0))\n        return weight_attr, bias_attr\n\n    def forward(self, x):\n        h = x\n        x = self.attn_norm(x)\n        x = self.attn(x)\n        x = self.drop_path(x)\n        x = x + h\n\n        h = x\n        x = self.mlp_norm(x)\n        x = self.mlp(x)\n        x = self.drop_path(x)\n        x = x + h\n\n        return x\n\n\nclass Encoder(nn.Layer):\n    \"\"\"Transformer encoder\n    The encoder contains a list of EncoderLayer, and a LayerNorm.\n    Attributes:\n        layers: nn.LayerList contains multiple EncoderLayers\n        encoder_norm: nn.LayerNorm which is applied after the last encoder layer\n    \"\"\"\n\n    def __init__(self,\n                 embed_dim,\n                 num_heads,\n                 depth,\n                 attn_head_size=None,\n                 qkv_bias=True,\n                 mlp_ratio=4.0,\n                 dropout=0.,\n                 attention_dropout=0.,\n                 droppath=0.):\n        super().__init__()\n        # stochastic depth decay\n        depth_decay = [x.item() for x in paddle.linspace(0, droppath, depth)]\n        layer_list = []\n        for i in range(depth):\n            encoder_layer = EncoderLayer(embed_dim,\n                                         num_heads,\n                                         attn_head_size=attn_head_size,\n                                         qkv_bias=qkv_bias,\n                                         mlp_ratio=mlp_ratio,\n                                         dropout=dropout,\n                                         
attention_dropout=attention_dropout,\n                                         droppath=depth_decay[i])\n            layer_list.append(copy.deepcopy(encoder_layer))\n        self.layers = nn.LayerList(layer_list)\n\n        w_attr_1, b_attr_1 = self._init_weights()\n        self.encoder_norm = nn.LayerNorm(embed_dim, weight_attr=w_attr_1, bias_attr=b_attr_1, epsilon=1e-6)\n\n    def _init_weights(self):\n        weight_attr = paddle.ParamAttr(initializer=nn.initializer.Constant(1.0))\n        bias_attr = paddle.ParamAttr(initializer=nn.initializer.Constant(0.0))\n        return weight_attr, bias_attr\n\n    def forward(self, x):\n        for layer in self.layers:\n            x = layer(x)\n        out = self.encoder_norm(x)\n        return out\n\n\nclass VisualTransformer(nn.Layer):\n    \"\"\"ViT transformer\n    ViT Transformer; the classifier is a single Linear layer for finetuning.\n    For training from scratch, a two-layer mlp should be used.\n    Classification is done using the cls_token.\n    Args:\n        image_size: int, input image size, default: 224\n        patch_size: int, patch size, default: 16\n        in_channels: int, input image channels, default: 3\n        num_classes: int, number of classes for classification, default: 768\n        embed_dim: int, embedding dimension (patch embed out dim), default: 768\n        depth: int, number of transformer blocks, default: 12\n        num_heads: int, number of attention heads, default: 12\n        mlp_ratio: float, ratio of mlp hidden dim to embed dim(mlp in dim), default: 4.0\n        qkv_bias: bool, If True, enable qkv(nn.Linear) layer with bias, default: True\n        dropout: float, dropout rate for linear layers, default: 0.\n        attention_dropout: float, dropout rate for attention layers, default: 0.\n        droppath: float, droppath rate for droppath layers, default: 0.\n    \"\"\"\n\n    def __init__(self,\n                 image_size=224,\n                 patch_size=16,\n                 
in_channels=3,\n                 num_classes=768,\n                 embed_dim=768,\n                 depth=12,\n                 num_heads=12,\n                 attn_head_size=None,\n                 mlp_ratio=4,\n                 qkv_bias=True,\n                 dropout=0.,\n                 attention_dropout=0.,\n                 droppath=0.,\n                 train_from_scratch=False):\n        super().__init__()\n        # create patch embedding with positional embedding\n        self.patch_embedding = PatchEmbedding(image_size, patch_size, in_channels, embed_dim, dropout)\n        # create multi-head self-attention layers\n        self.encoder = Encoder(embed_dim, num_heads, depth, attn_head_size, qkv_bias, mlp_ratio, dropout,\n                               attention_dropout, droppath)\n\n        # classifier head (for training from scratch)\n        if train_from_scratch:\n            w_attr_1, b_attr_1 = self._init_weights()\n            w_attr_2, b_attr_2 = self._init_weights()\n            self.classifier = nn.Sequential(\n                nn.Linear(embed_dim, embed_dim, weight_attr=w_attr_1, bias_attr=b_attr_1),\n                nn.ReLU(),\n                nn.Dropout(dropout),\n                nn.Linear(embed_dim, num_classes, weight_attr=w_attr_2, bias_attr=b_attr_2),\n                nn.Dropout(dropout),\n            )\n        else:\n            # classifier head (for finetuning)\n            w_attr_1, b_attr_1 = self._init_weights()\n            self.classifier = nn.Linear(embed_dim, num_classes, weight_attr=w_attr_1, bias_attr=b_attr_1)\n\n    def _init_weights(self):\n        weight_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.TruncatedNormal(std=.02))\n        bias_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.Constant(0.0))\n        return weight_attr, bias_attr\n\n    def forward(self, x):\n        x = self.patch_embedding(x)\n        x = self.encoder(x)\n        logits = self.classifier(x[:, 0])  # take only cls_token as 
classifier\n        return logits\n\n\ndef build_vit(config):\n    \"\"\"build vit model from config\"\"\"\n    model = VisualTransformer(image_size=config.DATA.IMAGE_SIZE,\n                              patch_size=config.MODEL.TRANS.PATCH_SIZE,\n                              in_channels=config.DATA.IMAGE_CHANNELS,\n                              num_classes=config.MODEL.NUM_CLASSES,\n                              embed_dim=config.MODEL.TRANS.EMBED_DIM,\n                              depth=config.MODEL.TRANS.DEPTH,\n                              num_heads=config.MODEL.TRANS.NUM_HEADS,\n                              attn_head_size=config.MODEL.TRANS.ATTN_HEAD_SIZE,\n                              mlp_ratio=config.MODEL.TRANS.MLP_RATIO,\n                              qkv_bias=config.MODEL.TRANS.QKV_BIAS,\n                              dropout=config.MODEL.DROPOUT,\n                              attention_dropout=config.MODEL.ATTENTION_DROPOUT,\n                              droppath=config.MODEL.DROPPATH,\n                              train_from_scratch=False)\n    return model\n\n\ndef ViT_large_patch16_384(**kwargs):\n    model = VisualTransformer(image_size=384,\n                              patch_size=16,\n                              in_channels=3,\n                              embed_dim=1024,\n                              depth=24,\n                              num_heads=16,\n                              attn_head_size=64,\n                              mlp_ratio=4.0,\n                              qkv_bias=True,\n                              dropout=0.1,\n                              attention_dropout=0.1,\n                              train_from_scratch=False)\n    return model\n\n\ndef ViT_large_patch16_224(**kwargs):\n    model = VisualTransformer(image_size=224,\n                              patch_size=16,\n                              in_channels=3,\n                              embed_dim=1024,\n                            
  depth=24,\n                              num_heads=16,\n                              attn_head_size=64,\n                              mlp_ratio=4.0,\n                              qkv_bias=True,\n                              dropout=0.1,\n                              attention_dropout=0.1,\n                              train_from_scratch=False)\n    return model\n\n\ndef ViT_base_patch16_224(**kwargs):\n    model = VisualTransformer(image_size=224,\n                              patch_size=16,\n                              in_channels=3,\n                              embed_dim=768,\n                              depth=12,\n                              num_heads=12,\n                              attn_head_size=64,\n                              mlp_ratio=4.0,\n                              qkv_bias=True,\n                              dropout=0,\n                              attention_dropout=0,\n                              train_from_scratch=False)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/transformers/resnet.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import print_function\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle.utils.download import get_weights_path_from_url\n\n__all__ = []\n\nmodel_urls = {\n    'resnet18': ('https://paddle-hapi.bj.bcebos.com/models/resnet18.pdparams', 'cf548f46534aa3560945be4b95cd11c4'),\n    'resnet34': ('https://paddle-hapi.bj.bcebos.com/models/resnet34.pdparams', '8d2275cf8706028345f78ac0e1d31969'),\n    'resnet50': ('https://paddle-hapi.bj.bcebos.com/models/resnet50.pdparams', 'ca6f485ee1ab0492d38f323885b0ad80'),\n    'resnet101': ('https://paddle-hapi.bj.bcebos.com/models/resnet101.pdparams', '02f35f034ca3858e1e54d4036443c92d'),\n    'resnet152': ('https://paddle-hapi.bj.bcebos.com/models/resnet152.pdparams', '7ad16a2f1e7333859ff986138630fd7a'),\n    'wide_resnet50_2':\n    ('https://paddle-hapi.bj.bcebos.com/models/wide_resnet50_2.pdparams', '0282f804d73debdab289bd9fea3fa6dc'),\n    'wide_resnet101_2':\n    ('https://paddle-hapi.bj.bcebos.com/models/wide_resnet101_2.pdparams', 'd4360a2d23657f059216f5d5a1a9ac93'),\n}\n\n\nclass BasicBlock(nn.Layer):\n    expansion = 1\n\n    def __init__(self,\n                 inplanes,\n                 planes,\n                 stride=1,\n                 downsample=None,\n                 groups=1,\n                 base_width=64,\n                 dilation=1,\n                 
norm_layer=None):\n        super(BasicBlock, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n\n        if dilation > 1:\n            raise NotImplementedError(\"Dilation > 1 not supported in BasicBlock\")\n\n        self.conv1 = nn.Conv2D(inplanes, planes, 3, padding=1, stride=stride, bias_attr=False)\n        self.bn1 = norm_layer(planes)\n        self.relu = nn.ReLU()\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = norm_layer(planes)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        identity = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n\n        return out\n\n\nclass BottleneckBlock(nn.Layer):\n\n    expansion = 4\n\n    def __init__(self,\n                 inplanes,\n                 planes,\n                 stride=1,\n                 downsample=None,\n                 groups=1,\n                 base_width=64,\n                 dilation=1,\n                 norm_layer=None):\n        super(BottleneckBlock, self).__init__()\n        if norm_layer is None:\n            norm_layer = nn.BatchNorm2D\n        width = int(planes * (base_width / 64.)) * groups\n\n        self.conv1 = nn.Conv2D(inplanes, width, 1, bias_attr=False)\n        self.bn1 = norm_layer(width)\n\n        self.conv2 = nn.Conv2D(width,\n                               width,\n                               3,\n                               padding=dilation,\n                               stride=stride,\n                               groups=groups,\n                               dilation=dilation,\n                               bias_attr=False)\n        self.bn2 = norm_layer(width)\n\n        
self.conv3 = nn.Conv2D(width, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = norm_layer(planes * self.expansion)\n        self.relu = nn.ReLU()\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        identity = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n        out = self.relu(out)\n\n        out = self.conv3(out)\n        out = self.bn3(out)\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n\n        return out\n\n\nclass ResNet(nn.Layer):\n    \"\"\"ResNet model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        Block (BasicBlock|BottleneckBlock): block module of model.\n        depth (int): layers of resnet, default: 50.\n        width (int): base width of resnet, default: 64.\n        num_classes (int): output dim of last fc layer. If num_classes <=0, last fc layer\n                            will not be defined. Default: 1000.\n        with_pool (bool): use pool before the last fc layer or not. Default: True.\n\n    Examples:\n        .. 
code-block:: python\n\n            import paddle\n            from paddle.vision.models import ResNet\n            from paddle.vision.models.resnet import BottleneckBlock, BasicBlock\n\n            resnet50 = ResNet(BottleneckBlock, 50)\n\n            wide_resnet50_2 = ResNet(BottleneckBlock, 50, width=64*2)\n\n            resnet18 = ResNet(BasicBlock, 18)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = resnet18(x)\n\n            print(out.shape)\n\n    \"\"\"\n\n    def __init__(self, block, depth=50, width=64, num_classes=1000, with_pool=True):\n        super(ResNet, self).__init__()\n        layer_cfg = {18: [2, 2, 2, 2], 34: [3, 4, 6, 3], 50: [3, 4, 6, 3], 101: [3, 4, 23, 3], 152: [3, 8, 36, 3]}\n        layers = layer_cfg[depth]\n        self.groups = 1\n        self.base_width = width\n        self.num_classes = num_classes\n        self.with_pool = with_pool\n        self._norm_layer = nn.BatchNorm2D\n\n        self.inplanes = 64\n        self.dilation = 1\n\n        self.conv1 = nn.Conv2D(3, self.inplanes, kernel_size=7, stride=2, padding=3, bias_attr=False)\n        self.bn1 = self._norm_layer(self.inplanes)\n        self.relu = nn.ReLU()\n        self.maxpool = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, 64, layers[0])\n        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)\n        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)\n        self.layer4 = self._make_layer(block, 512, layers[3], stride=2)\n        if with_pool:\n            self.avgpool = nn.AdaptiveAvgPool2D((1, 1))\n\n        if num_classes > 0:\n            self.fc = nn.Linear(512 * block.expansion, num_classes)\n\n    def _make_layer(self, block, planes, blocks, stride=1, dilate=False):\n        norm_layer = self._norm_layer\n        downsample = None\n        previous_dilation = self.dilation\n        if dilate:\n            self.dilation *= stride\n            stride = 1\n        if 
stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(\n                nn.Conv2D(self.inplanes, planes * block.expansion, 1, stride=stride, bias_attr=False),\n                norm_layer(planes * block.expansion),\n            )\n\n        layers = []\n        layers.append(\n            block(self.inplanes, planes, stride, downsample, self.groups, self.base_width, previous_dilation,\n                  norm_layer))\n        self.inplanes = planes * block.expansion\n        for _ in range(1, blocks):\n            layers.append(\n                block(self.inplanes, planes, groups=self.groups, base_width=self.base_width, norm_layer=norm_layer))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x = self.maxpool(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n\n        if self.with_pool:\n            x = self.avgpool(x)\n\n        if self.num_classes > 0:\n            x = paddle.flatten(x, 1)\n            x = self.fc(x)\n\n        return x\n\n\ndef _resnet(arch, Block, depth, pretrained, **kwargs):\n    model = ResNet(Block, depth, **kwargs)\n    if pretrained:\n        assert arch in model_urls, \"{} model does not have a pretrained model now, you should set pretrained=False\".format(\n            arch)\n        weight_path = get_weights_path_from_url(model_urls[arch][0], model_urls[arch][1])\n\n        param = paddle.load(weight_path)\n        model.set_dict(param)\n\n    return model\n\n\ndef resnet18(pretrained=False, **kwargs):\n    \"\"\"ResNet 18-layer model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. 
code-block:: python\n\n            import paddle\n            from paddle.vision.models import resnet18\n\n            # build model\n            model = resnet18()\n\n            # build model and load imagenet pretrained weight\n            # model = resnet18(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    return _resnet('resnet18', BasicBlock, 18, pretrained, **kwargs)\n\n\ndef resnet34(pretrained=False, **kwargs):\n    \"\"\"ResNet 34-layer model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. code-block:: python\n\n            import paddle\n            from paddle.vision.models import resnet34\n\n            # build model\n            model = resnet34()\n\n            # build model and load imagenet pretrained weight\n            # model = resnet34(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    return _resnet('resnet34', BasicBlock, 34, pretrained, **kwargs)\n\n\ndef resnet50(pretrained=False, **kwargs):\n    \"\"\"ResNet 50-layer model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. 
code-block:: python\n\n            import paddle\n            from paddle.vision.models import resnet50\n\n            # build model\n            model = resnet50()\n\n            # build model and load imagenet pretrained weight\n            # model = resnet50(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    return _resnet('resnet50', BottleneckBlock, 50, pretrained, **kwargs)\n\n\ndef resnet101(pretrained=False, **kwargs):\n    \"\"\"ResNet 101-layer model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. code-block:: python\n\n            import paddle\n            from paddle.vision.models import resnet101\n\n            # build model\n            model = resnet101()\n\n            # build model and load imagenet pretrained weight\n            # model = resnet101(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    return _resnet('resnet101', BottleneckBlock, 101, pretrained, **kwargs)\n\n\ndef resnet152(pretrained=False, **kwargs):\n    \"\"\"ResNet 152-layer model from\n    `\"Deep Residual Learning for Image Recognition\" <https://arxiv.org/pdf/1512.03385.pdf>`_\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. 
code-block:: python\n\n            import paddle\n            from paddle.vision.models import resnet152\n\n            # build model\n            model = resnet152()\n\n            # build model and load imagenet pretrained weight\n            # model = resnet152(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    return _resnet('resnet152', BottleneckBlock, 152, pretrained, **kwargs)\n\n\ndef wide_resnet50_2(pretrained=False, **kwargs):\n    \"\"\"Wide ResNet-50-2 model from\n    `\"Wide Residual Networks\" <https://arxiv.org/pdf/1605.07146.pdf>`_.\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. code-block:: python\n\n            import paddle\n            from paddle.vision.models import wide_resnet50_2\n\n            # build model\n            model = wide_resnet50_2()\n\n            # build model and load imagenet pretrained weight\n            # model = wide_resnet50_2(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    kwargs['width'] = 64 * 2\n    return _resnet('wide_resnet50_2', BottleneckBlock, 50, pretrained, **kwargs)\n\n\ndef wide_resnet101_2(pretrained=False, **kwargs):\n    \"\"\"Wide ResNet-101-2 model from\n    `\"Wide Residual Networks\" <https://arxiv.org/pdf/1605.07146.pdf>`_.\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n\n    Examples:\n        .. 
code-block:: python\n\n            import paddle\n            from paddle.vision.models import wide_resnet101_2\n\n            # build model\n            model = wide_resnet101_2()\n\n            # build model and load imagenet pretrained weight\n            # model = wide_resnet101_2(pretrained=True)\n\n            x = paddle.rand([1, 3, 224, 224])\n            out = model(x)\n\n            print(out.shape)\n    \"\"\"\n    kwargs['width'] = 64 * 2\n    return _resnet('wide_resnet101_2', BottleneckBlock, 101, pretrained, **kwargs)\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/utils/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/utils/tokenizer.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#         http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Tokenization classes.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport unicodedata\n\nimport six\n#import sentencepiece as sp\n\n\ndef convert_to_unicode(text):\n    \"\"\"Converts `text` to Unicode (if it's not already), assuming utf-8 input.\"\"\"\n    if six.PY3:\n        if isinstance(text, str):\n            return text\n        elif isinstance(text, bytes):\n            return text.decode(\"utf-8\", \"ignore\")\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    elif six.PY2:\n        if isinstance(text, str):\n            return text.decode(\"utf-8\", \"ignore\")\n        elif isinstance(text, unicode):\n            return text\n        else:\n            raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n    else:\n        raise ValueError(\"Not running on Python2 or Python 3?\")\n\n\ndef load_vocab(vocab_file):\n    \"\"\"Loads a vocabulary file into a dictionary.\"\"\"\n    vocab = collections.OrderedDict()\n    fin = open(vocab_file)\n    for num, line in enumerate(fin):\n        items = convert_to_unicode(line.strip()).split(\"\\t\")\n        if len(items) > 2:\n            break\n        token = items[0]\n        index = items[1] if len(items) == 2 
else num\n        token = token.strip()\n        vocab[token] = int(index)\n    return vocab\n\n\ndef convert_by_vocab(vocab, items):\n    \"\"\"Converts a sequence of [tokens|ids] using the vocab.\"\"\"\n    output = []\n    for item in items:\n        output.append(vocab[item])\n    return output\n\n\ndef convert_tokens_to_ids_include_unk(vocab, tokens, unk_token=\"[UNK]\"):\n    output = []\n    for token in tokens:\n        if token in vocab:\n            output.append(vocab[token])\n        else:\n            output.append(vocab[unk_token])\n    return output\n\n\ndef convert_tokens_to_ids(vocab, tokens):\n    return convert_by_vocab(vocab, tokens)\n\n\ndef convert_ids_to_tokens(inv_vocab, ids):\n    return convert_by_vocab(inv_vocab, ids)\n\n\ndef whitespace_tokenize(text):\n    \"\"\"Runs basic whitespace cleaning and splitting on a piece of text.\"\"\"\n    text = text.strip()\n    if not text:\n        return []\n    tokens = text.split()\n    return tokens\n\n\nclass FullTokenizer(object):\n    \"\"\"Runs end-to-end tokenization.\"\"\"\n\n    def __init__(self, vocab_file, do_lower_case=True):\n        self.vocab = load_vocab(vocab_file)\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n        self.basic_tokenizer = BasicTokenizer(do_lower_case=do_lower_case)\n        self.wordpiece_tokenizer = WordpieceTokenizer(vocab=self.vocab)\n\n    def tokenize(self, text):\n        split_tokens = []\n        for token in self.basic_tokenizer.tokenize(text):\n            for sub_token in self.wordpiece_tokenizer.tokenize(token):\n                split_tokens.append(sub_token)\n\n        return split_tokens\n\n    def convert_tokens_to_ids(self, tokens):\n        return convert_by_vocab(self.vocab, tokens)\n\n    def convert_ids_to_tokens(self, ids):\n        return convert_by_vocab(self.inv_vocab, ids)\n\n\nclass CharTokenizer(object):\n    \"\"\"Runs end-to-end tokenization.\"\"\"\n\n    def __init__(self, vocab_file, do_lower_case=True):\n        
self.vocab = load_vocab(vocab_file)\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n        self.tokenizer = WordpieceTokenizer(vocab=self.vocab)\n\n    def tokenize(self, text):\n        split_tokens = []\n        for token in text.lower().split(\" \"):\n            for sub_token in self.tokenizer.tokenize(token):\n                split_tokens.append(sub_token)\n        return split_tokens\n\n    def convert_tokens_to_ids(self, tokens):\n        return convert_by_vocab(self.vocab, tokens)\n\n    def convert_ids_to_tokens(self, ids):\n        return convert_by_vocab(self.inv_vocab, ids)\n\n\nclass BasicTokenizer(object):\n    \"\"\"Runs basic tokenization (punctuation splitting, lower casing, etc.).\"\"\"\n\n    def __init__(self, do_lower_case=True):\n        \"\"\"Constructs a BasicTokenizer.\n\n        Args:\n            do_lower_case: Whether to lower case the input.\n        \"\"\"\n        self.do_lower_case = do_lower_case\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text.\"\"\"\n        text = convert_to_unicode(text)\n        text = self._clean_text(text)\n\n        # This was added on November 1st, 2018 for the multilingual and Chinese\n        # models. 
This is also applied to the English models now, but it doesn't\n        # matter since the English models were not trained on any Chinese data\n        # and generally don't have any Chinese data in them (there are Chinese\n        # characters in the vocabulary because Wikipedia does have some Chinese\n        # words in the English Wikipedia.).\n        text = self._tokenize_chinese_chars(text)\n\n        orig_tokens = whitespace_tokenize(text)\n        split_tokens = []\n        for token in orig_tokens:\n            if self.do_lower_case:\n                token = token.lower()\n                token = self._run_strip_accents(token)\n            split_tokens.extend(self._run_split_on_punc(token))\n\n        output_tokens = whitespace_tokenize(\" \".join(split_tokens))\n        return output_tokens\n\n    def _run_strip_accents(self, text):\n        \"\"\"Strips accents from a piece of text.\"\"\"\n        text = unicodedata.normalize(\"NFD\", text)\n        output = []\n        for char in text:\n            cat = unicodedata.category(char)\n            if cat == \"Mn\":\n                continue\n            output.append(char)\n        return \"\".join(output)\n\n    def _run_split_on_punc(self, text):\n        \"\"\"Splits punctuation on a piece of text.\"\"\"\n        chars = list(text)\n        i = 0\n        start_new_word = True\n        output = []\n        while i < len(chars):\n            char = chars[i]\n            if _is_punctuation(char):\n                output.append([char])\n                start_new_word = True\n            else:\n                if start_new_word:\n                    output.append([])\n                start_new_word = False\n                output[-1].append(char)\n            i += 1\n\n        return [\"\".join(x) for x in output]\n\n    def _tokenize_chinese_chars(self, text):\n        \"\"\"Adds whitespace around any CJK character.\"\"\"\n        output = []\n        for char in text:\n            cp = ord(char)\n         
   if self._is_chinese_char(cp):\n                output.append(\" \")\n                output.append(char)\n                output.append(\" \")\n            else:\n                output.append(char)\n        return \"\".join(output)\n\n    def _is_chinese_char(self, cp):\n        \"\"\"Checks whether CP is the codepoint of a CJK character.\"\"\"\n        # This defines a \"chinese character\" as anything in the CJK Unicode block:\n        #     https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block)\n        #\n        # Note that the CJK Unicode block is NOT all Japanese and Korean characters,\n        # despite its name. The modern Korean Hangul alphabet is a different block,\n        # as is Japanese Hiragana and Katakana. Those alphabets are used to write\n        # space-separated words, so they are not treated specially and handled\n        # like the all of the other languages.\n        if ((cp >= 0x4E00 and cp <= 0x9FFF) or  #\n            (cp >= 0x3400 and cp <= 0x4DBF) or  #\n            (cp >= 0x20000 and cp <= 0x2A6DF) or  #\n            (cp >= 0x2A700 and cp <= 0x2B73F) or  #\n            (cp >= 0x2B740 and cp <= 0x2B81F) or  #\n            (cp >= 0x2B820 and cp <= 0x2CEAF) or (cp >= 0xF900 and cp <= 0xFAFF) or  #\n            (cp >= 0x2F800 and cp <= 0x2FA1F)):  #\n            return True\n\n        return False\n\n    def _clean_text(self, text):\n        \"\"\"Performs invalid character removal and whitespace cleanup on text.\"\"\"\n        output = []\n        for char in text:\n            cp = ord(char)\n            if cp == 0 or cp == 0xfffd or _is_control(char):\n                continue\n            if _is_whitespace(char):\n                output.append(\" \")\n            else:\n                output.append(char)\n        return \"\".join(output)\n\n\nclass WordpieceTokenizer(object):\n    \"\"\"Runs WordPiece tokenziation.\"\"\"\n\n    def __init__(self, vocab, unk_token=\"[UNK]\", max_input_chars_per_word=100):\n        
self.vocab = vocab\n        self.unk_token = unk_token\n        self.max_input_chars_per_word = max_input_chars_per_word\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text into its word pieces.\n\n        This uses a greedy longest-match-first algorithm to perform tokenization\n        using the given vocabulary.\n\n        For example:\n            input = \"unaffable\"\n            output = [\"un\", \"##aff\", \"##able\"]\n\n        Args:\n            text: A single token or whitespace separated tokens. This should have\n                already been passed through `BasicTokenizer`.\n\n        Returns:\n            A list of wordpiece tokens.\n        \"\"\"\n\n        text = convert_to_unicode(text)\n\n        output_tokens = []\n        for token in whitespace_tokenize(text):\n            chars = list(token)\n            if len(chars) > self.max_input_chars_per_word:\n                output_tokens.append(self.unk_token)\n                continue\n\n            is_bad = False\n            start = 0\n            sub_tokens = []\n            while start < len(chars):\n                end = len(chars)\n                cur_substr = None\n                while start < end:\n                    substr = \"\".join(chars[start:end])\n                    if start > 0:\n                        substr = \"##\" + substr\n                    if substr in self.vocab:\n                        cur_substr = substr\n                        break\n                    end -= 1\n                if cur_substr is None:\n                    is_bad = True\n                    break\n                sub_tokens.append(cur_substr)\n                start = end\n\n            if is_bad:\n                output_tokens.append(self.unk_token)\n            else:\n                output_tokens.extend(sub_tokens)\n        return output_tokens\n\n\ndef _is_whitespace(char):\n    \"\"\"Checks whether `char` is a whitespace character.\"\"\"\n    # \\t, \\n, and \\r are 
technically control characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == \" \" or char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return True\n    cat = unicodedata.category(char)\n    if cat == \"Zs\":\n        return True\n    return False\n\n\ndef _is_control(char):\n    \"\"\"Checks whether `char` is a control character.\"\"\"\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return False\n    cat = unicodedata.category(char)\n    if cat.startswith(\"C\"):\n        return True\n    return False\n\n\ndef _is_punctuation(char):\n    \"\"\"Checks whether `char` is a punctuation character.\"\"\"\n    cp = ord(char)\n    # We treat all non-letter/number ASCII as punctuation.\n    # Characters such as \"^\", \"$\", and \"`\" are not in the Unicode\n    # Punctuation class but we treat them as punctuation anyways, for\n    # consistency.\n    if ((cp >= 33 and cp <= 47) or (cp >= 58 and cp <= 64) or (cp >= 91 and cp <= 96) or (cp >= 123 and cp <= 126)):\n        return True\n    cat = unicodedata.category(char)\n    if cat.startswith(\"P\"):\n        return True\n    return False\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/ernievil2/utils/utils.py",
    "content": "import json\nimport os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.clip_vision_transformer import ViT_base_patch16_224\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.clip_vision_transformer import ViT_base_patch32_224\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.clip_vision_transformer import ViT_large_patch14_224\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.efficientnet import EfficientNetB5\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.ernie2 import ErnieModel\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.transformers.multimodal import MultiModalModel\nfrom disco_diffusion_ernievil_base.vit_b_16x.ernievil2.utils.tokenizer import FullTokenizer\n\n__all__ = ['tokenize', 'build_model']\n\nMODEL_NAMES = ['vit_b_16x']\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = FullTokenizer(vocab_file=os.path.join(os.path.dirname(__file__),\n                                                   '../../packages/ernie_base_3.0/vocab.txt'),\n                           do_lower_case=True)\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 64):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n    context_length : int\n        The context length to use; all baseline models use 24 as the context length\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    all_tokens = []\n    for text in texts:\n        all_tokens.append([_tokenizer.vocab['[CLS]']] +\n                      
    _tokenizer.convert_tokens_to_ids(_tokenizer.tokenize(text))[:context_length - 2] +\n                          [_tokenizer.vocab['[SEP]']])\n\n    result = paddle.zeros([len(all_tokens), context_length], dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        assert len(tokens) <= context_length\n        result[i, :len(tokens)] = paddle.to_tensor(tokens, dtype='int64')\n\n    return result\n\n\ndef build_model(name='vit_b_16x'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'vit_b_16x': build_vit_b_16x_model}\n    model = name2model[name]()\n    return model\n\n\ndef build_vit_b_16x_model():\n    # Define model\n    image_model = ViT_base_patch16_224()\n    with open(os.path.join(os.path.dirname(__file__),\n                           '../../packages/ernie_base_3.0/ernie_config.base.json')) as json_file:\n        config_dict = json.load(json_file)\n    text_model = ErnieModel(config_dict)\n    model = MultiModalModel(image_model, text_model)\n    checkpoint = paddle.load(os.path.join(os.path.dirname(__file__), '../../pre_trained/vit_b_16x.pdparams'))\n    model.set_state_dict(checkpoint)\n    model.eval()\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/packages/configs/vit_ernie_base.yaml",
    "content": "# The frequency to save trained models when training.\nsave_step: 500\n# The frequency to fetch and print output when training.\nprint_step: 10\n\n# The directory for saving model\nsave_model: \"checkpoints\"\n# The directory for saving inference model\ninference_model_dir: \"infer_model\"\n# Set seed for CE or debug\nrandom_seed: 1024\n\n# The data type of input ids.\ninput_dtype: \"int64\"\n\n# Device to use.\ndevice: \"gpu\"\n\n# TODO fix\n#batch_size: 2000\nbatch_size: 100\n\ninfer_batch_size: 1500\nshuffle_batch: False\n# Data shuffle only works when sort_type is pool or none\nshuffle: False\n# shuffle_seed must be set when shuffle is True and using multi-cards to train.\n# Otherwise, the number of batches cannot be guaranteed.\nshuffle_seed: 128\n\n# The number of epoches for training\nepoch: 50\n\n\n#learning_rate: 0.00005\nlearning_rate: 0.00003\n\n\nbeta1: 0.9\nbeta2: 0.997\neps: 1e-9\n# The parameters for learning rate scheduling.\nwarmup_steps: 1000\n\n# Dropout rates.\ndropout: 0.1\n\n\n# Mixed precision training\nuse_amp: True\nuse_pure_fp16: False\nscale_loss: 128.0\n\n# Maximum iteration for training.\nmax_iter: None\n\ndo_train: True\n\nmax_text_seqlen: 48\nvocab_file: \"./packages/ernie_base_3.0/vocab.txt\"\ntext_model_config: \"./packages/ernie_base_3.0/ernie_config.base.json\"\n\npad_token: 0\ncls_token: 1\nsep_token: 2\nmask_token: 3\nunk_token: 17963\n"
  },
  {
    "path": "modules/image/text_to_image/disco_diffusion_ernievil_base/vit_b_16x/packages/ernie_base_3.0/ernie_config.base.json",
    "content": "{\n    \"attention_probs_dropout_prob\": 0.1,\n    \"hidden_act\": \"gelu\",\n    \"hidden_dropout_prob\": 0.1,\n    \"hidden_size\": 768,\n    \"initializer_range\": 0.02,\n    \"max_position_embeddings\": 2048,\n    \"num_attention_heads\": 12,\n    \"num_hidden_layers\": 12,\n    \"sent_type_vocab_size\": 4,\n    \"task_type_vocab_size\": 3,\n    \"vocab_size\": 40000\n  }\n"
  },
  {
    "path": "modules/image/text_to_image/ernie_vilg/README.md",
    "content": "  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/187387422-f6c9ccab-7fda-416e-a24d-7d6084c46f67.jpg\"  width = \"80%\" hspace='10'/>\n\n# PaddleHub ERNIE-ViLG\n\n# 目录\n1. [模型基本信息](#一模型基本信息)\n2. [安装](#二安装)\n3. [模型API预测](#三模型api预测)\n4. [Prompt 指南](#四-prompt-指南)\n5. [服务部署](#五服务部署)\n6. [更新历史](#六更新历史)\n\n\n## 一、模型基本信息\n\n|模型名称|ernie_vilg|\n| :--- | :---: |\n|类别|图像-文图生成|\n|网络|ERNIE-ViLG|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|-|\n|最新更新日期|2022-10-14|\n|数据指标|-|\n\n### 应用效果展示\n\n  - 输入文本 \"戴眼镜的猫\"  风格 \"油画\"\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/187395109-8ab830bb-4559-41a2-97a1-0a45d217abd6.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n\n### 模型介绍\n\n文心ERNIE-ViLG参数规模达到100亿，是目前为止全球最大规模中文跨模态生成模型，在文本生成图像、图像描述等跨模态生成任务上效果全球领先，在图文生成领域MS-COCO、COCO-CN、AIC-ICC等数据集上取得最好效果。你可以输入一段文本描述以及生成风格，模型就会根据输入的内容自动创作出符合要求的图像。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_vilg\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n- ### 3. 
使用申请（可选）\n  - 请前往 [文心旸谷社区](https://wenxin.baidu.com/moduleApi/key) 申请使用本模型所需的 API key 和 Secret Key。\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    # 请设置 '--ak' 和 '--sk' 参数\n    # 或者设置 'WENXIN_AK' 和 'WENXIN_SK' 环境变量\n    # 更多细节参考下方 API 说明\n    $ hub run ernie_vilg --text_prompts \"宁静的小镇\" --style \"油画\" --output_dir ernie_vilg_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # 请设置 'ak' 和 'sk' 参数\n    # 或者设置 'WENXIN_AK' 和 'WENXIN_SK' 环境变量\n    # 更多细节参考下方 API 说明\n    module = hub.Module(name=\"ernie_vilg\")\n    text_prompts = [\"宁静的小镇\"]\n    images = module.generate_image(text_prompts=text_prompts, style='油画', output_dir='./ernie_vilg_out/')  \n    ```\n\n- ### 3、API\n\n  - ```python\n    def __init__(\n      ak: Optional[str] = None,\n      sk: Optional[str] = None\n    )\n    ```\n\n    - 初始化 API。\n\n    - **参数**\n\n      - ak(Optional[str]): 文心 API AK，默认为 None，即从环境变量 'WENXIN_AK' 中获取；\n      - sk(Optional[str]): 文心 API SK，默认为 None，即从环境变量 'WENXIN_SK' 中获取。\n\n  - ```python\n    def generate_image(\n      text_prompts:str,\n      style: Optional[str] = \"探索无限\",\n      topk: Optional[int] = 6,\n      output_dir: Optional[str] = 'ernievilg_output'\n    )\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。\n      - style(Optional[str]): 生成图像的风格，当前支持 古风、油画、水彩、卡通、二次元、浮世绘、蒸汽波艺术、\n        low poly、像素风格、概念艺术、未来主义、赛博朋克、写实风格、洛丽塔风格、巴洛克风格、超现实主义、探索无限。\n      - topk(Optional[int]): 保存前多少张图，最多保存6张。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"ernievilg_output\"。\n\n\n    - **返回**\n      - images(List(PIL.Image)): 返回生成的所有图像列表，PIL的Image格式。\n\n\n## 四、 Prompt 指南\n\n作者：佳祥 (LCL-Brew) & 单斌\n\n### Prompt公式\n\n「公式」= 图片主体，细节词，修饰词\n细节词可以任意组合，修饰词可以限定一种风格，也可以限定多种风格，遵循的基本原则是符合正常的中文语法逻辑即可。\n\n### 示例\n\n|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/174_蒙娜丽莎，赛博朋克，宝丽来，33毫米,蒸汽波艺术_000-1_7b4a78a.png\" alt=\"drawing\" width=\"300\"/>|\n| --- |\n| prompt：蒙娜丽莎，赛博朋克，宝丽来，33毫米,</br>蒸汽波艺术  
|\n\n\n|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/3_72d9343.png\" alt=\"drawing\" width=\"300\"/>|\n| --- |\n| prompt：火焰，凤凰，少女，未来感，高清，3d，</br>精致面容，cg感，古风，唯美，毛发细致，上半身立绘 |\n\n\n|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/4_e1f5cbb.png\" alt=\"drawing\" width=\"300\"/>|\n| --- |\n|  prompt：巨狼，飘雪，蓝色大片烟雾，毛发细致，</br>烟雾缭绕，高清，3d，cg感，侧面照  |\n\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/5_d380451.png\" alt=\"drawing\" width=\"400\"/> |\n| --- |\n|  <center>prompt：浮世绘日本科幻哑光绘画，概念艺术，</br>动漫风格神道寺禅园英雄动作序列，包豪斯</center> |\n\n### 修饰词\n\n好的修饰词可以让图片生成的效果更好，基于产业级知识增强的文心大模型，用户可以通过输入独特且特征明显的修饰词，来达到更高质量的图片生成。\n\n#### 1. 效果参考\n<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/1_3612449.jpg\" alt=\"drawing\" width=\"600\"/>\n\n**cg感**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/2_b72fd7a.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/image%20%281%29_8a6b56b.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n\n|  <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/6_2363c54.png\" alt=\"drawing\" width=\"300\"/>|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/7_a0910bf.png\" alt=\"drawing\" width=\"300\"/>|\n| --- | --- |\n\n\n**厚涂风格 / 厚涂版绘**\n\n\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/8_ea9d4f2.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/9_8defb0a.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n|  <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/10_328c202.png\" alt=\"drawing\" width=\"300\"/>|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/11_748702c.png\" alt=\"drawing\" width=\"300\"/>|\n| --- | --- |\n\n\n**古风**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/12_85ba92e.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/13_cec7db5.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n\n|  <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/14_3511a5d.png\" 
alt=\"drawing\" width=\"300\"/>|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/15_2443b20.png\" alt=\"drawing\" width=\"300\"/>|\n| --- | --- |\n\n\n**精致面容**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/16_c79ef20.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/17_9334d56.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n|  <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/18_4ba96b0.png\" alt=\"drawing\" width=\"300\"/>|<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/19_e627e62.png\" alt=\"drawing\" width=\"300\"/>|\n| --- | --- |\n\n\n**穆夏 / 穆夏风格**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/20_2cd8cfb.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/21_75b47a2.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**机械感 / 机械**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/22_c43e94f.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/23_3e85390.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**宫崎骏动画**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/24_02e9187.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/25_ecca869.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n\n**烟雾 / 烟雾缭绕**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/26_d2bf84c.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/27_c482d21.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**皮克斯动画**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/28_b15c2c3.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/29_1a87854.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**拟人化**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/30_f55ea2d.png\" alt=\"drawing\" width=\"300\"/> |<img 
src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/31_bc12eaa.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**剪纸叠加风格**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/32_60f30a6.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/33_c4020cc.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**色彩斑斓**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/34_16c64b5.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/35_4b439ff.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**城市印象 & 圆形轮廓**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/36_2ed177e.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/37_d00e2bc.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**上半身立绘 / 人物立绘**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/38_0ec9be4.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/39_e72f64a.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**电影质感**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/40_f90ee02.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/41_1d3da07.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**扁平化设计 / 扁平化**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/42_a6fe543.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/43_360f7d8.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**logo设计 / 简约logo设计**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/44_73b7e12.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/45_5d2b093.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**细节清晰**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/46_e9e50e1.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/47_396cba1.png\" 
alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n**毛发细致**\n\n| <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/48_55f90be.png\" alt=\"drawing\" width=\"300\"/> |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/49_4b86ed3.png\" alt=\"drawing\" width=\"300\"/> |\n| --- | --- |\n\n\n\n\n\n#### 2. 风格词参考\n\n\n复古未来主义风格 ->- 海滩兔风格 ->- 抽象技术风格 ->- 酸性精灵风格 ->- 古埃及风格 ->- 风帽风格 ->- 装饰艺术风格 ->- 极光风格 ->- 秋天风格 ->- 巴洛克风格 ->- 摩托车手风格 ->- 碎核风格 ->- 纸箱风格 ->- 未来主义风格 ->- 孟菲斯公司风格 ->- 立体主义风格 ->-赛博朋克风格 ->- 黑暗自然主义风格 ->- 表现主义风格 ->- 野兽派风格 ->- 鬼魂风格 ->- 嘻哈风格 ->- 嬉皮士风格 ->- 幻象之城风格 ->- 印象主义风格 ->- 卡瓦伊风格 ->- 美人鱼风格 ->- 极简主义风格 ->- 水井惠郎风格 ->- 苔藓风格 ->- 新浪潮风格 ->- 迷宫物语风格 ->- 仙女风格 ->- 粉彩朋克风格 ->- 照片写实风格 ->- 粉红公主风格 ->- 海盗风格 ->- 像素可爱风格 ->- 波普艺术风格 ->- 史前遗迹风格 ->- 迷幻风格 ->- 雨天风格 ->- 湿漉漉的风格 ->- 浮世绘风格 ->- 矢量心风格 ->- 维京人风格 ->- 女巫店风格 ->- 后印象主义 ->- 素人主义\n\n#### 3. 艺术词参考\n\n\n\n| &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;艺术类型&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; | &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;艺术家&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; | &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;常用艺术风格&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; |\n| --- | --- | --- |\n| <center>肖像画| \t<center>文森特·梵高| \t<center>印象主义|\n| <center>风景画\t| <center>尼古拉斯·罗伊里奇\t| <center>现实主义|\n| <center>风俗画\t| <center>皮埃尔-奥古斯特·雷诺阿| \t<center>浪漫主义|\n| <center>宗教绘画\t| <center>克劳德·莫内\t| <center>表现主义|\n| <center>抽象画| \t<center>彼得·孔查洛夫斯基\t| <center>后印象主义|\n| <center>都市风景画| \t<center>卡米尔·毕沙罗\t| <center>象征主义|\n| <center>素描与草图| \t<center>约翰·辛格·萨金特| \t<center>新艺术主义|\n| <center>静物| \t<center>伦勃朗| \t<center>巴洛克风格|\n| <center>裸体画| \t<center>马克·夏加尔| \t<center>抽象表现主义|\n| <center>插画| \t<center>巴勃罗·毕加索\t| <center>北欧文艺复兴|\n| | <center>古斯塔夫·多雷\t| <center>素人艺术，原始主义|\n| | <center>阿尔布雷特·丢勒\t| <center>立体主义|\n| | <center>鲍里斯·库斯妥基耶夫\t| <center>洛可可|\n| | <center>埃德加·德加| \t<center>色域绘画|\n| | |  
<center>波普艺术|\n| | | <center>文艺复兴开端|  \n| | | <center>文艺复兴全盛期| |\n|| |  <center>极简主义|\n| | | <center>矫饰主义，文艺复兴晚期|\n\n\n\n#### 4. 摄影词参考\n\n\n|<center>&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;可以加入到Prompt 中的摄影词&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;|&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;&nbsp; &nbsp; &nbsp; &nbsp; &nbsp;|\n| --- | --- |\n|<center>浅景深\t|<center>仰拍|\n|<center>负像\t|<center>动态模糊|\n|<center>微距\t|<center>高反差|\n|<center>双色版|\t<center>中心构图|\n|<center>角度\t|<center>逆光|\n|<center>三分法|\t<center>长曝光|\n|<center>抓拍\t|<center>禅宗摄影|\n|<center>软焦点|\t<center>抽象微距镜头|\n|<center>黑白|\t<center>暗色调|\n|<center>无镜反射|\t<center>长时间曝光|\n|<center>双色调|\t<center>框架，取景|\n|<center>颗粒图像||\n\n\n\n### 技巧提示\n\n1. 【作图规则】Prompt构建是文本符合逻辑的组合，有序且丰富的描述可以不断提升画面效果\n2. 【新手入门】不知如何输入Prompt？点击示例，体验文生图的魅力，参考教程，逐步进阶~\n3. 【风格生成】试试添加 “国潮”、“国风”等，感受中国风的魅力\n4. 【风格生成】试试混合两种代表性的风格，例如“赛博朋克，扁平化设计”、”皮克斯动画，赛博朋克”\n5. 【人像生成】添加“仙鹤、月亮、楼阁、小屋、街道、玫瑰、机械”，画面会更饱满\n6. 【人像生成】添加“精致面容、唯美、cg感、细节清晰“等，人物刻画会更细致\n7. 【风格生成】添加“扁平化风格，logo”等，可以设计出各类图标等，例如 “猫猫头像，扁平化风格”\n8. 【风格生成】指定颜色，或添加“烟雾缭绕”、“火焰”、“烟尘”、“花瓣”，生成画面的氛围感更饱满\n9. 【创意生成】发挥想象力，例如：“中西混搭”、“泰迪熊唱京剧”、“米老鼠吃火锅”\n10. 【风格生成】“水彩”，“水墨”与古诗组合，画面意境会有提升~\n11. 【风格生成】想要日系头像和拟人化动物？试试关键词“日系手绘”、“治愈风”\n12. 
【风格生成】添加“pixiv”，生成二次元或者动漫的画质更惊艳\n\n### 呼吁与准则\n\n利用AI技术生成图片的最终目的是要便捷地为人类创造美的作品，激发人的想象力和创作力。而技术在发展中，做不到十全十美，不能保证每次生成的图片都能够尽善尽美。因此呼吁所有用户，您想分享满意的AI图片时，请以正能量进行传播宣传！\n算法生成的图片难免会受到数据的影响，从而导致生成的图片是有数据偏见的。因此在分享AI生成图片到社交媒体之前，请谨慎评估当前的图片是不是含有：令人不适的、暴力的、色情的内容。如对以上的内容进行恶意传播，您将会承担相应的法律后果。\n\n\n\n<span id = \"related-work\">   </span>\n### 相关链接\n\n美学相关的词汇： https://aesthetics.fandom.com/wiki/List_of_Aesthetics\n\nDALL-E 2 的 Prompt 技巧资料：https://docs.google.com/document/d/11WlzjBT0xRpQhP9tFMtxzd0q6ANIdHPUBkMV-YB043U/edit\n\nDiscoDiffusion Prompt 技巧资料：https://docs.google.com/document/d/1l8s7uS2dGqjztYSjPpzlmXLjl5PM3IGkRWI3IiCuK7g/edit\n\n## 五、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_vilg\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\_VISIBLE\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果。\n\n  - ```python\n    import requests\n    import json\n    import base64\n    from io import BytesIO\n    from PIL import Image\n\n    # 发送HTTP请求\n    data = {'text_prompts': '巨大的白色城堡'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_vilg\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    for i, result in enumerate(r.json()[\"results\"]):\n      image = Image.open(BytesIO(base64.b64decode(result)))\n      image.save('result_{}.png'.format(i))\n    ```\n\n- ### gradio app 支持\n  从paddlehub 2.3.1开始支持使用链接 http://127.0.0.1:8866/gradio/ernie_vilg 在浏览器中访问ernie_vilg的gradio app。\n\n\n## 六、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  增加分辨率参数以及所支持的风格\n\n* 1.2.0\n\n  移除分辨率参数，移除默认 AK 和 SK\n\n* 1.3.0\n\n  新增对gradio app的支持\n\n  ```shell\n  $ hub install ernie_vilg==1.3.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/ernie_vilg/__init__.py",
    "content": ""
  },
  {
    "path": "modules/image/text_to_image/ernie_vilg/module.py",
    "content": "import argparse\nimport base64\nimport os\nimport time\nfrom io import BytesIO\nfrom typing import Optional\n\nimport gradio as gr\nimport numpy as np\nimport requests\nfrom PIL import Image\nfrom tqdm.auto import tqdm\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"ernie_vilg\",\n            version=\"1.3.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"baidu-nlp\",\n            author_email=\"paddle-dev@baidu.com\")\nclass ErnieVilG:\n\n    def __init__(self, ak=None, sk=None):\n        \"\"\"\n      :param ak: ak for applying token to request wenxin api.\n      :param sk: sk for applying token to request wenxin api.\n      \"\"\"\n        self.ak = ak\n        self.sk = sk\n        self.token_host = 'https://wenxin.baidu.com/younger/portal/api/oauth/token'\n        self.token = self._apply_token(self.ak, self.sk)\n\n    def _apply_token(self, ak, sk):\n        ak = ak if ak else os.getenv('WENXIN_AK')\n        sk = sk if sk else os.getenv('WENXIN_SK')\n        assert ak and sk, RuntimeError(\n            'Please go to the wenxin official website to apply for AK and SK and set the parameters “ak” and “sk” correctly, or set the environment variables “WENXIN_AK” and “WENXIN_SK”.'\n        )\n        response = requests.get(self.token_host,\n                                params={\n                                    'grant_type': 'client_credentials',\n                                    'client_id': ak,\n                                    'client_secret': sk\n                                })\n        if response:\n            res = response.json()\n            if res['code'] != 0:\n                print('Request access token error.')\n                raise RuntimeError(\"Request access token error.\")\n        else:\n            print('Request access token error.')\n            
raise RuntimeError(\"Request access token error.\")\n        return res['data']\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = \"探索无限\",\n                       topk: Optional[int] = 6,\n                       visualization: Optional[bool] = True,\n                       output_dir: Optional[str] = 'ernievilg_output'):\n        \"\"\"\n        Create image by text prompts using ErnieVilG model.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.\n        :param style: Image stype, currently supported 古风、油画、水彩、卡通、二次元、浮世绘、蒸汽波艺术、\n        low poly、像素风格、概念艺术、未来主义、赛博朋克、写实风格、洛丽塔风格、巴洛克风格、超现实主义、探索无限。\n        :param topk: Top k images to save.\n        :param visualization: Whether to save images or not.\n        :output_dir: Output directory\n        \"\"\"\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n        token = self.token\n        create_url = 'https://wenxin.baidu.com/younger/portal/api/rest/1.0/ernievilg/v1/txt2img?from=paddlehub'\n        get_url = 'https://wenxin.baidu.com/younger/portal/api/rest/1.0/ernievilg/v1/getImg?from=paddlehub'\n        if isinstance(text_prompts, str):\n            text_prompts = [text_prompts]\n        taskids = []\n        for text_prompt in text_prompts:\n            res = requests.post(create_url,\n                                headers={'Content-Type': 'application/x-www-form-urlencoded'},\n                                data={\n                                    'access_token': token,\n                                    \"text\": text_prompt,\n                                    \"style\": style\n                                })\n            res = res.json()\n            if res['code'] == 4001:\n                print('请求参数错误')\n                raise RuntimeError(\"请求参数错误\")\n            elif res['code'] == 4002:\n                
print('请求参数格式错误，请检查必传参数是否齐全，参数类型等')\n                raise RuntimeError(\"请求参数格式错误，请检查必传参数是否齐全，参数类型等\")\n            elif res['code'] == 4003:\n                print('请求参数中，图片风格不在可选范围内')\n                raise RuntimeError(\"请求参数中，图片风格不在可选范围内\")\n            elif res['code'] == 4004:\n                print('API服务内部错误，可能引起原因有请求超时、模型推理错误等')\n                raise RuntimeError(\"API服务内部错误，可能引起原因有请求超时、模型推理错误等\")\n            elif res['code'] == 100 or res['code'] == 110 or res['code'] == 111:\n                self.token = self._apply_token(self.ak, self.sk)\n                res = requests.post(create_url,\n                                    headers={'Content-Type': 'application/x-www-form-urlencoded'},\n                                    data={\n                                        'access_token': self.token,\n                                        \"text\": text_prompt,\n                                        \"style\": style\n                                    })\n                res = res.json()\n                if res['code'] != 0:\n                    print(\"Token失效重新请求后依然发生错误，请检查输入的参数\")\n                    raise RuntimeError(\"Token失效重新请求后依然发生错误，请检查输入的参数\")\n            if res['msg'] == 'success':\n                taskids.append(res['data'][\"taskId\"])\n            else:\n                print(res['msg'])\n                raise RuntimeError(res['msg'])\n\n        start_time = time.time()\n        process_bar = tqdm(total=100, unit='%')\n        results = {}\n        total_time = 60 * len(taskids)\n        while True:\n            end_time = time.time()\n            duration = end_time - start_time\n            progress_rate = int((duration) / total_time * 100)\n            if not taskids:\n                progress_rate = 100\n            if progress_rate > process_bar.n:\n                if progress_rate >= 100:\n                    if not taskids:\n                        increase_rate = 100 - process_bar.n\n                    else:\n                
        increase_rate = 0\n                else:\n                    increase_rate = progress_rate - process_bar.n\n            else:\n                increase_rate = 0\n            process_bar.update(increase_rate)\n            if duration < 30:\n                time.sleep(5)\n                continue\n            else:\n                time.sleep(10)\n            if not taskids:\n                break\n            has_done = []\n            for taskid in taskids:\n                res = requests.post(get_url,\n                                    headers={'Content-Type': 'application/x-www-form-urlencoded'},\n                                    data={\n                                        'access_token': token,\n                                        'taskId': taskid\n                                    })\n                res = res.json()\n                if res['code'] == 4001:\n                    print('请求参数错误')\n                    raise RuntimeError(\"请求参数错误\")\n                elif res['code'] == 4002:\n                    print('请求参数格式错误，请检查必传参数是否齐全，参数类型等')\n                    raise RuntimeError(\"请求参数格式错误，请检查必传参数是否齐全，参数类型等\")\n                elif res['code'] == 4003:\n                    print('请求参数中，图片风格不在可选范围内')\n                    raise RuntimeError(\"请求参数中，图片风格不在可选范围内\")\n                elif res['code'] == 4004:\n                    print('API服务内部错误，可能引起原因有请求超时、模型推理错误等')\n                    raise RuntimeError(\"API服务内部错误，可能引起原因有请求超时、模型推理错误等\")\n                elif res['code'] == 100 or res['code'] == 110 or res['code'] == 111:\n                    self.token = self._apply_token(self.ak, self.sk)\n                    res = requests.post(get_url,\n                                        headers={'Content-Type': 'application/x-www-form-urlencoded'},\n                                        data={\n                                            'access_token': self.token,\n                                            'taskId': taskid\n           
                             })\n                    res = res.json()\n                    if res['code'] != 0:\n                        print(\"Token失效重新请求后依然发生错误，请检查输入的参数\")\n                        raise RuntimeError(\"Token失效重新请求后依然发生错误，请检查输入的参数\")\n                if res['msg'] == 'success':\n                    if res['data']['status'] == 1:\n                        has_done.append(res['data']['taskId'])\n                    results[res['data']['text']] = {\n                        'imgUrls': res['data']['imgUrls'],\n                        'waiting': res['data']['waiting'],\n                        'taskId': res['data']['taskId']\n                    }\n                else:\n                    print(res['msg'])\n                    raise RuntimeError(res['msg'])\n            for taskid in has_done:\n                taskids.remove(taskid)\n        print('Saving Images...')\n        result_images = []\n        for text, data in results.items():\n            for idx, imgdata in enumerate(data['imgUrls']):\n                try:\n                    image = Image.open(BytesIO(requests.get(imgdata['image']).content))\n                except Exception as e:\n                    print('Download generated images error, retry one time')\n                    try:\n                        image = Image.open(BytesIO(requests.get(imgdata['image']).content))\n                    except Exception:\n                        raise RuntimeError('Download generated images failed.')\n                if visualization:\n                    image.save(os.path.join(output_dir, '{}_{}.png'.format(text, idx)))\n                result_images.append(image)\n                if idx + 1 >= topk:\n                    break\n        print('Done')\n        return result_images\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n          
                                    prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        self.ak = args.ak\n        self.sk = args.sk\n        self.token = self._apply_token(self.ak, self.sk)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      topk=args.topk,\n                                      visualization=args.visualization,\n                                      output_dir=args.output_dir)\n        return results\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results_base64encoded = []\n        results = self.generate_image(text_prompts=text_prompts, **kwargs)\n        for result in results:\n            buffered = BytesIO()\n            result.save(buffered, format=\"png\")\n            img_str = base64.b64encode(buffered.getvalue()).decode('utf-8')\n            results_base64encoded.append(img_str)\n        return results_base64encoded\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--text_prompts', type=str)\n        self.arg_input_group.add_argument('--style',\n                                          type=str,\n                                          default='探索无限',\n                                          choices=[\n                                              '古风', '油画', '水彩', '卡通', '二次元', '浮世绘', '蒸汽波艺术', 'low poly', '像素风格', '概念艺术',\n                                              '未来主义', '赛博朋克', '写实风格', '洛丽塔风格', 
'巴洛克风格', '超现实主义', '探索无限'\n                                          ],\n                                          help=\"绘画风格\")\n        self.arg_input_group.add_argument('--topk', type=int, default=6, help=\"选取保存前多少张图，最多10张\")\n        self.arg_input_group.add_argument('--ak', type=str, default=None, help=\"申请文心api使用token的ak\")\n        self.arg_input_group.add_argument('--sk', type=str, default=None, help=\"申请文心api使用token的sk\")\n        self.arg_input_group.add_argument('--visualization', type=lambda s: str(s).lower() in ('true', '1', 'yes'), default=True, help=\"是否保存生成的图片\")\n        self.arg_input_group.add_argument('--output_dir', type=str, default='ernievilg_output', help=\"保存生成图片的目录\")\n\n    def create_gradio_app(self):\n        '''\n        Create the gradio app for hub serving.\n        '''\n        import paddlehub as hub\n        language_translation_model = hub.Module(name='baidu_translate')\n        language_recognition_model = hub.Module(name='baidu_language_recognition')\n\n        style_list = [\n            '古风', '油画', '水彩', '卡通', '二次元', '浮世绘', '蒸汽波艺术', 'low poly', '像素风格', '概念艺术', '未来主义', '赛博朋克', '写实风格', '洛丽塔风格',\n            '巴洛克风格', '超现实主义', '探索无限'\n        ]\n\n        tips = {\n            \"en\": \"Tips: The input text will be translated into Chinese for generation\",\n            \"jp\": \"ヒント: 入力テキストは生成のために中国語に翻訳されます\",\n            \"kor\": \"힌트: 입력 텍스트는 생성을 위해 중국어로 번역됩니다\"\n        }\n\n        count = 0\n\n        def translate_language(text_prompts):\n            nonlocal count\n            try:\n                count += 1\n                tips_text = None\n                language_code = language_recognition_model.recognize(text_prompts)\n                if language_code != 'zh':\n                    text_prompts = language_translation_model.translate(text_prompts, language_code, 'zh')\n            except Exception as e:\n                error_text = str(e)\n                return {status_text: error_text, language_tips_text: gr.update(visible=False)}\n            if language_code in tips:\n              
  tips_text = tips[language_code]\n            else:\n                tips_text = tips['en']\n            if language_code == 'zh':\n                return {\n                    language_tips_text: gr.update(visible=False),\n                    translated_language: text_prompts,\n                    trigger_component: gr.update(value=count, visible=False)\n                }\n            else:\n                return {\n                    language_tips_text: gr.update(visible=True, value=tips_text),\n                    translated_language: text_prompts,\n                    trigger_component: gr.update(value=count, visible=False)\n                }\n\n        def inference(text_prompts, style_indx):\n            try:\n                style = style_list[style_indx]\n                results = self.generate_image(text_prompts=text_prompts, style=style, visualization=False)\n            except Exception as e:\n                error_text = str(e)\n                return {status_text: error_text, gallery: None}\n            return {status_text: 'Success', gallery: results[:6]}\n\n        title = \"ERNIE-ViLG\"\n\n        description = \"ERNIE-ViLG model, which supports text-to-image task.\"\n\n        css = \"\"\"\n                .gradio-container {\n                    font-family: 'IBM Plex Sans', sans-serif;\n                }\n                .gr-button {\n                    color: white;\n                    border-color: black;\n                    background: black;\n                }\n                input[type='range'] {\n                    accent-color: black;\n                }\n                .dark input[type='range'] {\n                    accent-color: #dfdfdf;\n                }\n                .container {\n                    max-width: 730px;\n                    margin: auto;\n                    padding-top: 1.5rem;\n                }\n                #gallery {\n                    min-height: 22rem;\n                    margin-bottom: 15px;\n 
                   margin-left: auto;\n                    margin-right: auto;\n                    border-bottom-right-radius: .5rem !important;\n                    border-bottom-left-radius: .5rem !important;\n                }\n                #gallery>div>.h-full {\n                    min-height: 20rem;\n                }\n                .details:hover {\n                    text-decoration: underline;\n                }\n                .gr-button {\n                    white-space: nowrap;\n                }\n                .gr-button:focus {\n                    border-color: rgb(147 197 253 / var(--tw-border-opacity));\n                    outline: none;\n                    box-shadow: var(--tw-ring-offset-shadow), var(--tw-ring-shadow), var(--tw-shadow, 0 0 #0000);\n                    --tw-border-opacity: 1;\n                    --tw-ring-offset-shadow: var(--tw-ring-inset) 0 0 0 var(--tw-ring-offset-width) var(--tw-ring-offset-color);\n                    --tw-ring-shadow: var(--tw-ring-inset) 0 0 0 calc(3px var(--tw-ring-offset-width)) var(--tw-ring-color);\n                    --tw-ring-color: rgb(191 219 254 / var(--tw-ring-opacity));\n                    --tw-ring-opacity: .5;\n                }\n                .footer {\n                    margin-bottom: 45px;\n                    margin-top: 35px;\n                    text-align: center;\n                    border-bottom: 1px solid #e5e5e5;\n                }\n                .footer>p {\n                    font-size: .8rem;\n                    display: inline-block;\n                    padding: 0 10px;\n                    transform: translateY(10px);\n                    background: white;\n                }\n                .dark .footer {\n                    border-color: #303030;\n                }\n                .dark .footer>p {\n                    background: #0b0f19;\n                }\n                .prompt h4{\n                    margin: 1.25em 0 .25em 0;\n              
      font-weight: bold;\n                    font-size: 115%;\n                }\n        \"\"\"\n\n        block = gr.Blocks(css=css)\n\n        examples = [\n            ['戴着眼镜的猫', '油画(Oil painting)'],\n            ['A cat with glasses', '油画(Oil painting)'],\n            ['眼鏡をかけた猫', '油画(Oil painting)'],\n            ['안경을 쓴 고양이', '油画(Oil painting)'],\n            ['日落时的城市天际线,史前遗迹风格', '油画(Oil painting)'],\n            ['一只猫坐在椅子上，戴着一副墨镜, low poly 风格', '卡通(Cartoon)'],\n            ['A cat sitting on a chair, wearing a pair of sunglasses, low poly style', '油画(Oil painting)'],\n            ['猫が椅子に座ってサングラスをかけている、low polyスタイル', '油画(Oil painting)'],\n            ['고양이 한 마리가 의자에 앉아 선글라스를 끼고 low poly 스타일을 하고 있다', '油画(Oil painting)'],\n            ['一只猫坐在椅子上，戴着一副墨镜,秋天风格', '探索无限(Explore infinity)'],\n            ['蒙娜丽莎，赛博朋克，宝丽来，33毫米,蒸汽波艺术', '探索无限(Explore infinity)'],\n            ['一只猫坐在椅子上，戴着一副墨镜,海盗风格', '探索无限(Explore infinity)'],\n            ['一条由闪电制成的令人敬畏的龙,概念艺术', '探索无限(Explore infinity)'],\n            ['An awesome dragon made of lightning, conceptual art', '油画(Oil painting)'],\n            ['稲妻で作られた畏敬の念を抱かせる竜、コンセプトアート', '油画(Oil painting)'],\n            ['번개로 만든 경외스러운 용, 개념 예술', '油画(Oil painting)'],\n            ['梵高猫头鹰,蒸汽波艺术', '探索无限(Explore infinity)'],\n            ['萨尔瓦多·达利描绘古代文明的超现实主义梦幻油画,写实风格', '探索无限(Explore infinity)'],\n            ['夕阳日落时，阳光落在云层上，海面波涛汹涌，风景，胶片感', '探索无限(Explore infinity)'],\n            ['Sunset, the sun falls on the clouds, the sea is rough, the scenery is filmy', '油画(Oil painting)'],\n            ['夕日が沈むと、雲の上に太陽の光が落ち、海面は波が荒く、風景、フィルム感', '油画(Oil painting)'],\n            ['석양이 질 때 햇빛이 구름 위에 떨어지고, 해수면의 파도가 용솟음치며, 풍경, 필름감', '油画(Oil painting)'],\n        ]\n\n        with block:\n            gr.HTML(\"\"\"\n                    <div style=\"text-align: center; max-width: 650px; margin: 0 auto;\">\n                    <div\n                        style=\"\n                        display: inline-flex;\n                        gap: 0.8rem;\n           
             font-size: 1.75rem;\n                        margin-bottom: 10px;\n                        margin-left: 220px;\n                        justify-content: center;\n                        \"\n                    >\n                    <a href=\"https://github.com/PaddlePaddle/PaddleHub\"><img src=\"https://user-images.githubusercontent.com/22424850/187387422-f6c9ccab-7fda-416e-a24d-7d6084c46f67.jpg\" alt=\"Paddlehub\" width=\"40%\"></a>\n                    </div>\n                    <div\n                        style=\"\n                        display: inline-flex;\n                        align-items: center;\n                        gap: 0.8rem;\n                        font-size: 1.75rem;\n                        margin-bottom: 10px;\n                        justify-content: center;\n                        \">\n                    <a href=\"https://github.com/PaddlePaddle/PaddleHub\"><h1 style=\"font-weight: 900; margin-bottom: 7px;\">\n                        ERNIE-ViLG Demo\n                    </h1></a>\n                    </div>\n                    <p style=\"margin-bottom: 10px; font-size: 94%\">\n                        ERNIE-ViLG is a state-of-the-art text-to-image model that generates\n                        images from Chinese text.\n                    </p>\n                    <a href=\"https://github.com/PaddlePaddle/PaddleHub\"><img src=\"https://user-images.githubusercontent.com/22424850/188184795-98605a22-9af2-4106-827b-e58548f8892f.png\" alt=\"star Paddlehub\" width=\"100%\"></a>\n                    </div>\n                \"\"\")\n            with gr.Group():\n                with gr.Box():\n                    with gr.Row().style(mobile_collapse=False, equal_height=True):\n                        text = gr.Textbox(\n                            label=\"Prompt\",\n                            show_label=False,\n                            max_lines=1,\n                            placeholder=\"Enter your prompt, multiple 
languages are supported now.\",\n                        ).style(\n                            border=(True, False, True, True),\n                            rounded=(True, False, False, True),\n                            container=False,\n                        )\n\n                        btn = gr.Button(\"Generate image\").style(\n                            margin=False,\n                            rounded=(False, True, True, False),\n                        )\n                language_tips_text = gr.Textbox(label=\"language tips\", show_label=False, visible=False, max_lines=1)\n                styles = gr.Dropdown(label=\"风格(style)\",\n                                     choices=[\n                                         '古风(Ancient Style)', '油画(Oil painting)', '水彩(Watercolor)', '卡通(Cartoon)',\n                                         '二次元(Anime)', '浮世绘(Ukiyoe)', '蒸汽波艺术(Vaporwave)', 'low poly',\n                                         '像素风格(Pixel Style)', '概念艺术(Conceptual Art)', '未来主义(Futurism)',\n                                         '赛博朋克(Cyberpunk)', '写实风格(Realistic style)', '洛丽塔风格(Lolita style)',\n                                         '巴洛克风格(Baroque style)', '超现实主义(Surrealism)', '探索无限(Explore infinity)'\n                                     ],\n                                     value='卡通(Cartoon)',\n                                     type=\"index\")\n                gallery = gr.Gallery(label=\"Generated images\", show_label=False, elem_id=\"gallery\").style(grid=[2, 3],\n                                                                                                          height=\"auto\")\n                status_text = gr.Textbox(label=\"处理状态(Process status)\", show_label=True, max_lines=1, interactive=False)\n                trigger_component = gr.Textbox(\n                    value=\"\", visible=False)  # This component is used to trigger the inference function.\n                translated_language = gr.Textbox(value=\"\", 
visible=False)\n\n                ex = gr.Examples(examples=examples,\n                                 fn=translate_language,\n                                 inputs=[text],\n                                 outputs=[language_tips_text, status_text, trigger_component, translated_language],\n                                 cache_examples=False)\n                ex.dataset.headers = [\"\"]\n\n                text.submit(translate_language,\n                            inputs=[text],\n                            outputs=[language_tips_text, status_text, trigger_component, translated_language])\n                btn.click(translate_language,\n                          inputs=[text],\n                          outputs=[language_tips_text, status_text, trigger_component, translated_language])\n                trigger_component.change(fn=inference,\n                                         inputs=[translated_language, styles],\n                                         outputs=[status_text, gallery])\n                gr.HTML(\"\"\"\n                        <div class=\"prompt\">\n                            <p><h4>Prompt公式</h4>\n                            <span> Prompt = 图片主体，细节词，修饰词 </span>\n                            关于各部分的构造方式和效果，可以参考<a href=\"https://github.com/PaddlePaddle/PaddleHub/blob/develop/modules/image/text_to_image/ernie_vilg/README.md#四-prompt-指南\" style=\"text-decoration: underline;\" target=\"_blank\">YouPromptMe指南</a>。\n                            更多的模型，请关注<a href=\"https://github.com/PaddlePaddle/PaddleHub\" style=\"text-decoration: underline;\" target=\"_blank\"> PaddleHub 官方Repo </a>， 如果你觉得不错，请star收藏吧。\n                            <p><svg xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" width=\"90\" height=\"20\"><style>a:hover #llink{fill:url(#b);stroke:#ccc}a:hover #rlink{fill:#4183c4}</style><linearGradient id=\"a\" x2=\"0\" y2=\"100%\"><stop offset=\"0\" stop-color=\"#fcfcfc\" stop-opacity=\"0\"/><stop 
offset=\"1\" stop-opacity=\".1\"/></linearGradient><linearGradient id=\"b\" x2=\"0\" y2=\"100%\"><stop offset=\"0\" stop-color=\"#ccc\" stop-opacity=\".1\"/><stop offset=\"1\" stop-opacity=\".1\"/></linearGradient><g stroke=\"#d5d5d5\"><rect stroke=\"none\" fill=\"#fcfcfc\" x=\"0.5\" y=\"0.5\" width=\"54\" height=\"19\" rx=\"2\"/><rect x=\"60.5\" y=\"0.5\" width=\"29\" height=\"19\" rx=\"2\" fill=\"#fafafa\"/><rect x=\"60\" y=\"7.5\" width=\"0.5\" height=\"5\" stroke=\"#fafafa\"/><path d=\"M60.5 6.5 l-3 3v1 l3 3\" stroke=\"d5d5d5\" fill=\"#fafafa\"/></g><image x=\"5\" y=\"3\" width=\"14\" height=\"14\" xlink:href=\"data:image/svg+xml;base64,PHN2ZyBmaWxsPSIjMTgxNzE3IiByb2xlPSJpbWciIHZpZXdCb3g9IjAgMCAyNCAyNCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48dGl0bGU+R2l0SHViPC90aXRsZT48cGF0aCBkPSJNMTIgLjI5N2MtNi42MyAwLTEyIDUuMzczLTEyIDEyIDAgNS4zMDMgMy40MzggOS44IDguMjA1IDExLjM4NS42LjExMy44Mi0uMjU4LjgyLS41NzcgMC0uMjg1LS4wMS0xLjA0LS4wMTUtMi4wNC0zLjMzOC43MjQtNC4wNDItMS42MS00LjA0Mi0xLjYxQzQuNDIyIDE4LjA3IDMuNjMzIDE3LjcgMy42MzMgMTcuN2MtMS4wODctLjc0NC4wODQtLjcyOS4wODQtLjcyOSAxLjIwNS4wODQgMS44MzggMS4yMzYgMS44MzggMS4yMzYgMS4wNyAxLjgzNSAyLjgwOSAxLjMwNSAzLjQ5NS45OTguMTA4LS43NzYuNDE3LTEuMzA1Ljc2LTEuNjA1LTIuNjY1LS4zLTUuNDY2LTEuMzMyLTUuNDY2LTUuOTMgMC0xLjMxLjQ2NS0yLjM4IDEuMjM1LTMuMjItLjEzNS0uMzAzLS41NC0xLjUyMy4xMDUtMy4xNzYgMCAwIDEuMDA1LS4zMjIgMy4zIDEuMjMuOTYtLjI2NyAxLjk4LS4zOTkgMy0uNDA1IDEuMDIuMDA2IDIuMDQuMTM4IDMgLjQwNSAyLjI4LTEuNTUyIDMuMjg1LTEuMjMgMy4yODUtMS4yMy42NDUgMS42NTMuMjQgMi44NzMuMTIgMy4xNzYuNzY1Ljg0IDEuMjMgMS45MSAxLjIzIDMuMjIgMCA0LjYxLTIuODA1IDUuNjI1LTUuNDc1IDUuOTIuNDIuMzYuODEgMS4wOTYuODEgMi4yMiAwIDEuNjA2LS4wMTUgMi44OTYtLjAxNSAzLjI4NiAwIC4zMTUuMjEuNjkuODI1LjU3QzIwLjU2NSAyMi4wOTIgMjQgMTcuNTkyIDI0IDEyLjI5N2MwLTYuNjI3LTUuMzczLTEyLTEyLTEyIi8+PC9zdmc+\"/><g aria-hidden=\"false\" fill=\"#333\" text-anchor=\"middle\" font-family=\"Helvetica Neue,Helvetica,Arial,sans-serif\" text-rendering=\"geometricPrecision\" font-weight=\"700\" font-size=\"110px\" line-height=\"14px\"><a 
target=\"_blank\" xlink:href=\"https://github.com/PaddlePaddle/PaddleHub\"><text aria-hidden=\"true\" x=\"355\" y=\"150\" fill=\"#fff\" transform=\"scale(.1)\" textLength=\"270\">Stars</text><text x=\"355\" y=\"140\" transform=\"scale(.1)\" textLength=\"270\">Stars</text><rect id=\"llink\" stroke=\"#d5d5d5\" fill=\"url(#a)\" x=\".5\" y=\".5\" width=\"54\" height=\"19\" rx=\"2\"/></a><a target=\"_blank\" xlink:href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\"><rect width=\"30\" x=\"60\" height=\"20\" fill=\"rgba(0,0,0,0)\"/><text aria-hidden=\"true\" x=\"745\" y=\"150\" fill=\"#fff\" transform=\"scale(.1)\" textLength=\"210\">8.4k</text><text id=\"rlink\" x=\"745\" y=\"140\" transform=\"scale(.1)\" textLength=\"210\">8.4k</text></a></g></svg></p>\n                            同时，可以在 <a href=\"https://aistudio.baidu.com/aistudio/projectdetail/4462918\" style=\"text-decoration: underline;\" target=\"_blank\"> aistudio </a> 上使用免费的GPU体验更多案例。\n                            </p>\n                    </div>\n                    <div class=\"prompt\">\n                            <p><h4>Prompt format</h4>\n                            <span> Prompt = object, details, description </span>\n                            For more details, please refer to <a href=\"https://github.com/PaddlePaddle/PaddleHub/blob/develop/modules/image/text_to_image/ernie_vilg/README.md#四-prompt-指南\" style=\"text-decoration: underline;\" target=\"_blank\">YouPromptMe Guide</a>.\n                            There are more interesting models in PaddleHub; if you think they're great, welcome to star <a href=\"https://github.com/PaddlePaddle/PaddleHub\" style=\"text-decoration: underline;\" target=\"_blank\"> PaddleHub</a>.\n                            <p><svg xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\" width=\"90\" height=\"20\"><style>a:hover #llink{fill:url(#b);stroke:#ccc}a:hover #rlink{fill:#4183c4}</style><linearGradient id=\"a\" x2=\"0\" 
y2=\"100%\"><stop offset=\"0\" stop-color=\"#fcfcfc\" stop-opacity=\"0\"/><stop offset=\"1\" stop-opacity=\".1\"/></linearGradient><linearGradient id=\"b\" x2=\"0\" y2=\"100%\"><stop offset=\"0\" stop-color=\"#ccc\" stop-opacity=\".1\"/><stop offset=\"1\" stop-opacity=\".1\"/></linearGradient><g stroke=\"#d5d5d5\"><rect stroke=\"none\" fill=\"#fcfcfc\" x=\"0.5\" y=\"0.5\" width=\"54\" height=\"19\" rx=\"2\"/><rect x=\"60.5\" y=\"0.5\" width=\"29\" height=\"19\" rx=\"2\" fill=\"#fafafa\"/><rect x=\"60\" y=\"7.5\" width=\"0.5\" height=\"5\" stroke=\"#fafafa\"/><path d=\"M60.5 6.5 l-3 3v1 l3 3\" stroke=\"d5d5d5\" fill=\"#fafafa\"/></g><image x=\"5\" y=\"3\" width=\"14\" height=\"14\" xlink:href=\"data:image/svg+xml;base64,PHN2ZyBmaWxsPSIjMTgxNzE3IiByb2xlPSJpbWciIHZpZXdCb3g9IjAgMCAyNCAyNCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48dGl0bGU+R2l0SHViPC90aXRsZT48cGF0aCBkPSJNMTIgLjI5N2MtNi42MyAwLTEyIDUuMzczLTEyIDEyIDAgNS4zMDMgMy40MzggOS44IDguMjA1IDExLjM4NS42LjExMy44Mi0uMjU4LjgyLS41NzcgMC0uMjg1LS4wMS0xLjA0LS4wMTUtMi4wNC0zLjMzOC43MjQtNC4wNDItMS42MS00LjA0Mi0xLjYxQzQuNDIyIDE4LjA3IDMuNjMzIDE3LjcgMy42MzMgMTcuN2MtMS4wODctLjc0NC4wODQtLjcyOS4wODQtLjcyOSAxLjIwNS4wODQgMS44MzggMS4yMzYgMS44MzggMS4yMzYgMS4wNyAxLjgzNSAyLjgwOSAxLjMwNSAzLjQ5NS45OTguMTA4LS43NzYuNDE3LTEuMzA1Ljc2LTEuNjA1LTIuNjY1LS4zLTUuNDY2LTEuMzMyLTUuNDY2LTUuOTMgMC0xLjMxLjQ2NS0yLjM4IDEuMjM1LTMuMjItLjEzNS0uMzAzLS41NC0xLjUyMy4xMDUtMy4xNzYgMCAwIDEuMDA1LS4zMjIgMy4zIDEuMjMuOTYtLjI2NyAxLjk4LS4zOTkgMy0uNDA1IDEuMDIuMDA2IDIuMDQuMTM4IDMgLjQwNSAyLjI4LTEuNTUyIDMuMjg1LTEuMjMgMy4yODUtMS4yMy42NDUgMS42NTMuMjQgMi44NzMuMTIgMy4xNzYuNzY1Ljg0IDEuMjMgMS45MSAxLjIzIDMuMjIgMCA0LjYxLTIuODA1IDUuNjI1LTUuNDc1IDUuOTIuNDIuMzYuODEgMS4wOTYuODEgMi4yMiAwIDEuNjA2LS4wMTUgMi44OTYtLjAxNSAzLjI4NiAwIC4zMTUuMjEuNjkuODI1LjU3QzIwLjU2NSAyMi4wOTIgMjQgMTcuNTkyIDI0IDEyLjI5N2MwLTYuNjI3LTUuMzczLTEyLTEyLTEyIi8+PC9zdmc+\"/><g aria-hidden=\"false\" fill=\"#333\" text-anchor=\"middle\" font-family=\"Helvetica Neue,Helvetica,Arial,sans-serif\" 
text-rendering=\"geometricPrecision\" font-weight=\"700\" font-size=\"110px\" line-height=\"14px\"><a target=\"_blank\" xlink:href=\"https://github.com/PaddlePaddle/PaddleHub\"><text aria-hidden=\"true\" x=\"355\" y=\"150\" fill=\"#fff\" transform=\"scale(.1)\" textLength=\"270\">Stars</text><text x=\"355\" y=\"140\" transform=\"scale(.1)\" textLength=\"270\">Stars</text><rect id=\"llink\" stroke=\"#d5d5d5\" fill=\"url(#a)\" x=\".5\" y=\".5\" width=\"54\" height=\"19\" rx=\"2\"/></a><a target=\"_blank\" xlink:href=\"https://github.com/PaddlePaddle/PaddleHub/stargazers\"><rect width=\"30\" x=\"60\" height=\"20\" fill=\"rgba(0,0,0,0)\"/><text aria-hidden=\"true\" x=\"745\" y=\"150\" fill=\"#fff\" transform=\"scale(.1)\" textLength=\"210\">8.4k</text><text id=\"rlink\" x=\"745\" y=\"140\" transform=\"scale(.1)\" textLength=\"210\">8.4k</text></a></g></svg></p>\n                            Besides, you can use free GPU resources in <a href=\"https://aistudio.baidu.com/aistudio/projectdetail/4462918\" style=\"text-decoration: underline;\" target=\"_blank\"> aistudio </a> to explore more examples and have fun.\n                            </p>\n                    </div>\n\n                \"\"\")\n                gr.Markdown(\"\"\"\n        在\"探索无限\"的风格模式下，画作的真实风格完全可以由你的prompt来决定。下面是一些参考案例:\n\n        In \"Explore infinity\" style mode, what the image looks like is totally up to your prompt. 
Below are some cases:\n\n        |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/174_蒙娜丽莎，赛博朋克，宝丽来，33毫米,蒸汽波艺术_000-1_7b4a78a.png\" alt=\"drawing\" width=\"300\"/>|\n        | --- |\n        | prompt：蒙娜丽莎，赛博朋克，宝丽来，33毫米,<br>蒸汽波艺术  |\n\n\n        |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/3_72d9343.png\" alt=\"drawing\" width=\"300\"/>|\n        | --- |\n        | prompt：火焰，凤凰，少女，未来感，高清，3d，<br>精致面容，cg感，古风，唯美，毛发细致，上半身立绘 |\n\n\n        |<img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/4_e1f5cbb.png\" alt=\"drawing\" width=\"300\"/>|\n        | --- |\n        |  prompt：巨狼，飘雪，蓝色大片烟雾，毛发细致，<br>烟雾缭绕，高清，3d，cg感，侧面照  |\n\n\n        | <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/5_d380451.png\" alt=\"drawing\" width=\"400\"/> |\n        | --- |\n        |  prompt：浮世绘日本科幻哑光绘画，概念艺术，<br>动漫风格神道寺禅园英雄动作序列，包豪斯|\n\n        <img src=\"https://bce.bdstatic.com/doc/AIDP/wenxin/1_3612449.jpg\" alt=\"drawing\" width=\"600\"/>\n\n        ### <u>[更多内容...](https://github.com/PaddlePaddle/PaddleHub/blob/develop/modules/image/text_to_image/ernie_vilg/README.md#四-prompt-指南)([Explore more...](https://github.com/PaddlePaddle/PaddleHub/blob/develop/modules/image/text_to_image/ernie_vilg/README.md#四-prompt-指南))</u>\n\n\n                    \"\"\")\n                gr.HTML('''\n                <div class=\"footer\">\n                            <p>Model by <a href=\"https://github.com/PaddlePaddle/PaddleHub\" style=\"text-decoration: underline;\" target=\"_blank\">PaddleHub</a> and <a href=\"https://wenxin.baidu.com\" style=\"text-decoration: underline;\" target=\"_blank\">文心大模型</a> - Gradio Demo by 🤗 Hugging Face\n                            </p>\n                </div>\n                ''')\n\n        return block\n"
  },
  {
    "path": "modules/image/text_to_image/ernie_vilg/requirements.txt",
    "content": "requests\ntqdm\n"
  },
  {
    "path": "modules/image/text_to_image/ernie_vilg/test.py",
    "content": "import shutil\nimport unittest\n\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        cls.module = hub.Module(name=\"ernie_vilg\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('ernievilg_output')\n\n    def test_generate_image(self):\n        self.module.generate_image(text_prompts=['戴眼镜的猫'],\n                                   style=\"像素风格\",\n                                   topk=6,\n                                   visualization=True,\n                                   output_dir='ernievilg_output')\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/LICENSE",
    "content": "Copyright (c) 2022 Robin Rombach and Patrick Esser and contributors\n\nCreativeML Open RAIL-M\ndated August 22, 2022\n\nSection I: PREAMBLE\n\nMultimodal generative models are being widely adopted and used, and have the potential to transform the way artists, among other individuals, conceive and benefit from AI or ML technologies as a tool for content creation.\n\nNotwithstanding the current and potential benefits that these artifacts can bring to society at large, there are also concerns about potential misuses of them, either due to their technical limitations or ethical considerations.\n\nIn short, this license strives for both the open and responsible downstream use of the accompanying model. When it comes to the open character, we took inspiration from open source permissive licenses regarding the grant of IP rights. Referring to the downstream responsible use, we added use-based restrictions not permitting the use of the Model in very specific scenarios, in order for the licensor to be able to enforce the license in case potential misuses of the Model may occur. At the same time, we strive to promote open and responsible research on generative models for art and content generation.\n\nEven though downstream derivative versions of the model could be released under different licensing terms, the latter will always have to include - at minimum - the same use-based restrictions as the ones in the original license (this license). We believe in the intersection between open and responsible AI development; thus, this License aims to strike a balance between both in order to enable responsible open-science in the field of AI.\n\nThis License governs the use of the model (and its derivatives) and is informed by the model card associated with the model.\n\nNOW THEREFORE, You and Licensor agree as follows:\n\n1. 
Definitions\n\n- \"License\" means the terms and conditions for use, reproduction, and Distribution as defined in this document.\n- \"Data\" means a collection of information and/or content extracted from the dataset used with the Model, including to train, pretrain, or otherwise evaluate the Model. The Data is not licensed under this License.\n- \"Output\" means the results of operating a Model as embodied in informational content resulting therefrom.\n- \"Model\" means any accompanying machine-learning based assemblies (including checkpoints), consisting of learnt weights, parameters (including optimizer states), corresponding to the model architecture as embodied in the Complementary Material, that have been trained or tuned, in whole or in part on the Data, using the Complementary Material.\n- \"Derivatives of the Model\" means all modifications to the Model, works based on the Model, or any other model which is created or initialized by transfer of patterns of the weights, parameters, activations or output of the Model, to the other model, in order to cause the other model to perform similarly to the Model, including - but not limited to - distillation methods entailing the use of intermediate data representations or methods based on the generation of synthetic data by the Model for training the other model.\n- \"Complementary Material\" means the accompanying source code and scripts used to define, run, load, benchmark or evaluate the Model, and used to prepare data for training or evaluation, if any. This includes any accompanying documentation, tutorials, examples, etc, if any.\n- \"Distribution\" means any transmission, reproduction, publication or other sharing of the Model or Derivatives of the Model to a third party, including providing the Model as a hosted service made available by electronic or other remote means - e.g. 
API-based or web access.\n- \"Licensor\" means the copyright owner or entity authorized by the copyright owner that is granting the License, including the persons or entities that may have rights in the Model and/or distributing the Model.\n- \"You\" (or \"Your\") means an individual or Legal Entity exercising permissions granted by this License and/or making use of the Model for whichever purpose and in any field of use, including usage of the Model in an end-use application - e.g. chatbot, translator, image generator.\n- \"Third Parties\" means individuals or legal entities that are not under common control with Licensor or You.\n- \"Contribution\" means any work of authorship, including the original version of the Model and any modifications or additions to that Model or Derivatives of the Model thereof, that is intentionally submitted to Licensor for inclusion in the Model by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, \"submitted\" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Model, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as \"Not a Contribution.\"\n- \"Contributor\" means Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Model.\n\nSection II: INTELLECTUAL PROPERTY RIGHTS\n\nBoth copyright and patent grants apply to the Model, Derivatives of the Model and Complementary Material. The Model and Derivatives of the Model are subject to additional terms as described in Section III.\n\n2. 
Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare, publicly display, publicly perform, sublicense, and distribute the Complementary Material, the Model, and Derivatives of the Model.\n3. Grant of Patent License. Subject to the terms and conditions of this License and where and as applicable, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this paragraph) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Model and the Complementary Material, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Model to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Model and/or Complementary Material or a Contribution incorporated within the Model and/or Complementary Material constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for the Model and/or Work shall terminate as of the date such litigation is asserted or filed.\n\nSection III: CONDITIONS OF USAGE, DISTRIBUTION AND REDISTRIBUTION\n\n4. Distribution and Redistribution. You may host for Third Party remote access purposes (e.g. software-as-a-service), reproduce and distribute copies of the Model or Derivatives of the Model thereof in any medium, with or without modifications, provided that You meet the following conditions:\nUse-based restrictions as referenced in paragraph 5 MUST be included as an enforceable provision by You in any type of legal agreement (e.g. 
a license) governing the use and/or distribution of the Model or Derivatives of the Model, and You shall give notice to subsequent users You Distribute to, that the Model or Derivatives of the Model are subject to paragraph 5. This provision does not apply to the use of Complementary Material.\nYou must give any Third Party recipients of the Model or Derivatives of the Model a copy of this License;\nYou must cause any modified files to carry prominent notices stating that You changed the files;\nYou must retain all copyright, patent, trademark, and attribution notices excluding those notices that do not pertain to any part of the Model, Derivatives of the Model.\nYou may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions - respecting paragraph 4.a. - for use, reproduction, or Distribution of Your modifications, or for any such Derivatives of the Model as a whole, provided Your use, reproduction, and Distribution of the Model otherwise complies with the conditions stated in this License.\n5. Use-based restrictions. The restrictions set forth in Attachment A are considered Use-based restrictions. Therefore You cannot use the Model and the Derivatives of the Model for the specified restricted uses. You may use the Model subject to this License, including only for lawful purposes and in accordance with the License. Use may include creating any content with, finetuning, updating, running, training, evaluating and/or reparametrizing the Model. You shall require all of Your users who use the Model or a Derivative of the Model to comply with the terms of this paragraph (paragraph 5).\n6. The Output You Generate. Except as set forth herein, Licensor claims no rights in the Output You generate using the Model. You are accountable for the Output you generate and its subsequent uses. No use of the output can contravene any provision as stated in the License.\n\nSection IV: OTHER PROVISIONS\n\n7. 
Updates and Runtime Restrictions. To the maximum extent permitted by law, Licensor reserves the right to restrict (remotely or otherwise) usage of the Model in violation of this License, update the Model through electronic means, or modify the Output of the Model based on updates. You shall undertake reasonable efforts to use the latest version of the Model.\n8. Trademarks and related. Nothing in this License permits You to make use of Licensors’ trademarks, trade names, logos or to otherwise suggest endorsement or misrepresent the relationship between the parties; and any rights not expressly granted herein are reserved by the Licensors.\n9. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Model and the Complementary Material (and each Contributor provides its Contributions) on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Model, Derivatives of the Model, and the Complementary Material and assume any risks associated with Your exercise of permissions under this License.\n10. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Model and the Complementary Material (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.\n11. Accepting Warranty or Additional Liability. While redistributing the Model, Derivatives of the Model and the Complementary Material thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.\n12. 
If any provision of this License is held to be invalid, illegal or unenforceable, the remaining provisions shall be unaffected thereby and remain valid as if such provision had not been set forth herein.\n\nEND OF TERMS AND CONDITIONS\n\n\n\n\nAttachment A\n\nUse Restrictions\n\nYou agree not to use the Model or Derivatives of the Model:\n- In any way that violates any applicable national, federal, state, local or international law or regulation;\n- For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;\n- To generate or disseminate verifiably false information and/or content with the purpose of harming others;\n- To generate or disseminate personal identifiable information that can be used to harm an individual;\n- To defame, disparage or otherwise harass others;\n- For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;\n- For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;\n- To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;\n- For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;\n- To provide medical advice and medical results interpretation;\n- To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. 
by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use)."
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/README.md",
    "content": "# stable_diffusion\n\n|模型名称|stable_diffusion|\n| :--- | :---: |\n|类别|多模态-文图生成|\n|网络|CLIP Text Encoder+UNet+VAD|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|4.0GB|\n|最新更新日期|2022-08-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\"\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/186873437-2e426acd-7656-4d37-9ee4-8cafa48f097f.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/186873216-d2a9761a-78b0-4f6a-97ec-919768f324f5.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\nStable Diffusion是一种潜在扩散模型(Latent Diffusion)， 属于生成类模型，这类模型通过对随机噪声进行一步步地迭代降噪并采样来获得感兴趣的图像，当前取得了令人惊艳的效果。相比于Disco Diffusion, Stable Diffusion通过在低纬度的潜在空间（lower dimensional latent space）而不是原像素空间来做迭代，极大地降低了内存和计算量的需求，并且在V100上一分钟之内即可以渲染出想要的图像，欢迎体验。\n\n更多详情请参考论文：[High-Resolution Image Synthesis with Latent Diffusion Models](https://arxiv.org/abs/2112.10752)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stable_diffusion\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run stable_diffusion --text_prompts \"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\" --output_dir stable_diffusion_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"stable_diffusion\")\n    text_prompts = [\"in the morning 
light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\"]\n    # 生成图像, 默认会在stable_diffusion_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    # 您可以设置batch_size一次生成多张\n    da = module.generate_image(text_prompts=text_prompts, batch_size=3, output_dir='./stable_diffusion_out/')  \n    # 展示所有的中间结果\n    da[0].chunks[-1].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks[-1].chunks.save_gif('stable_diffusion_out-merged-result.gif')\n    # da索引的是prompt, da[0].chunks索引的是该prompt下生成的第一张图，在batch_size不为1时能同时生成多张图\n    # 您也可以按照上述操作显示单张图，如第0张的生成过程\n    da[0].chunks[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_out-image-0-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [512, 512],\n            seed: Optional[int] = None,\n            batch_size: Optional[int] = 1,\n            output_dir: Optional[str] = 'stable_diffusion_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。通常比较有效的构造方式为 \"一段描述性的文字内容\" + \"指定艺术家的名字\"，如\"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\"。prompt的构造可以参考[网站](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#)。\n      - style(Optional[str]): 指定绘画的风格，如'watercolor','Chinese painting'等。当不指定时，风格完全由您所填写的prompt决定。\n      - artist(Optional[str]): 指定特定的艺术家，如Greg Rutkowsk、krenz，将会生成所指定艺术家的绘画风格。当不指定时，风格完全由您所填写的prompt决定。各种艺术家的风格可以参考[网站](https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/)。\n      - 
width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会由不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - batch_size(Optional[int]): 指定每个prompt一次生成的图像的数量。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"stable_diffusion_out\"。\n\n\n    - **返回**\n      - ra(DocumentArray): DocumentArray对象， 包含`batch_size`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m stable_diffusion\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果，返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，返回后对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stable_diffusion\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    r.json()[\"results\"]\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 保存结果图\n    da[0].save_uri_to_file('stable_diffusion_out.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_out.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install stable_diffusion == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/README_en.md",
    "content": "# stable_diffusion\n\n|Module Name|stable_diffusion|\n| :--- | :---: |\n|Category|text to image|\n|Network|CLIP Text Encoder+UNet+VAD|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|4.0GB|\n|Latest update date|2022-08-26|\n|Data indicators|-|\n\n## I.Basic Information\n\n### Application Effect Display\n\n  - Prompt \"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\"\n\n  - Output image\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/186873437-2e426acd-7656-4d37-9ee4-8cafa48f097f.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - Generating process\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/186873216-d2a9761a-78b0-4f6a-97ec-919768f324f5.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### Module Introduction\n\nStable Diffusion is a latent diffusion model (Latent Diffusion), which belongs to the generative model. This kind of model obtains the images by iteratively denoising  noise and sampling step by step, and currently has achieved amazing results. Compared with Disco Diffusion, Stable Diffusion iterates in a lower dimensional latent space instead of the original pixel space, which greatly reduces the memory and computational requirements. 
You can render the desired image within a minute on the V100, welcome to enjoy it in [aistudio](https://aistudio.baidu.com/aistudio/projectdetail/4512600).\n\nFor more details, please refer to [High-Resolution Image Synthesis with Latent Diffusion Models](https://arxiv.org/abs/2112.10752)\n\n## II.Installation\n\n- ### 1.Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [How to install PaddleHub](../../../../docs/docs_en/get_start/installation.rst)\n\n- ### 2.Installation\n\n  - ```shell\n    $ hub install stable_diffusion\n    ```\n  - In case of any problems during installation, please refer to:[Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md) | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)\n\n\n## III.Module API Prediction  \n\n- ### 1.Command line Prediction\n\n  - ```shell\n    $ hub run stable_diffusion --text_prompts \"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\" --output_dir stable_diffusion_out\n    ```\n\n- ### 2.Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"stable_diffusion\")\n    text_prompts = [\"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\"]\n    # Output images will be saved in stable_diffusion_out directory.\n    # The returned da is a DocumentArray object, which contains all immediate and final results\n    # You can manipulate the DocumentArray object to do post-processing and save images\n    # you can set batch_size parameter to generate number of batch_size images at one inference step.\n    da = module.generate_image(text_prompts=text_prompts, batch_size=3, output_dir='./stable_diffusion_out/')  \n    # Show all immediate results\n    da[0].chunks[-1].chunks.plot_image_sprites(skip_empty=True, 
show_index=True, keep_aspect_ratio=True)\n    # Save the generating process as a gif\n    da[0].chunks[-1].chunks.save_gif('stable_diffusion_out-merged-result.gif')\n    da[0].chunks[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_out-image-0-result.gif')\n    ```\n\n- ### 3.API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            style: Optional[str] = None,\n            artist: Optional[str] = None,\n            width_height: Optional[List[int]] = [512, 512],\n            seed: Optional[int] = None,\n            batch_size: Optional[int] = 1,\n            output_dir: Optional[str] = 'stable_diffusion_out'):\n    ```\n\n    - Image generating api, which generates an image corresponding to your prompt.\n\n    - **Parameters**\n\n      - text_prompts(str): Prompt, used to describe your image content. You can construct a prompt conforms to the format \"content\" + \"artist/style\", such as \"in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.\". For more details, you can refer to [website](https://docs.google.com/document/d/1XUT2G9LmkZataHFzmuOtRXnuWBfhvXDAo8DkS--8tec/edit#).\n      - style(Optional[str]): Image style, such as \"watercolor\" and \"Chinese painting\". If not provided, style is totally up to your prompt.\n      - artist(Optional[str]): Artist name, such as Greg Rutkowsk,krenz, image style is as whose works you choose. If not provided, style is totally up to your prompt.(https://weirdwonderfulai.art/resources/disco-diffusion-70-plus-artist-studies/).\n      - width_height(Optional[List[int]]): The width and height of output images, should be better multiples of 64. 
The larger size is, the longger computation time is.\n      - seed(Optional[int]): Random seed, different seeds result in different output images.\n      - batch_size(Optional[int]): Number of images generated for one inference step.\n      - output_dir(Optional[str]): Output directory, default is \"stable_diffusion_out\".\n\n\n    - **Return**\n      - ra(DocumentArray):  DocumentArray object， including `batch_size` Documents，each document keeps all immediate results during generation, please refer to [DocumentArray tutorial](https://docarray.jina.ai/fundamentals/documentarray/index.html) for more details.\n\n## IV.Server Deployment\n\n- PaddleHub Serving can deploy an online service of text-to-image.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command：\n  - ```shell\n    $ hub serving start -m stable_diffusion\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:**  If GPU is used for prediction, set CUDA_VISIBLE_DEVICES environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result.\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    # Send an HTTP request\n    data = {'text_prompts': 'in the morning light,Overlooking TOKYO city by greg rutkowski and thomas kinkade,Trending on artstation.'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stable_diffusion\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Get results\n    r.json()[\"results\"]\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save final result image to a file\n    da[0].save_uri_to_file('stable_diffusion_out.png')\n    # Save the generating process as a gif\n    
da[0].chunks[0].chunks.save_gif('stable_diffusion_out.gif')\n    ```\n\n## V.Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install stable_diffusion == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation repo is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). We use this repo here for text encoder in stable diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    #k = k.con\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        #assert str(attn_mask.dtype) == 'VarType.FP32' and attn_mask.ndim == 3\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        max_len, batch_size, emb_dim = x.shape\n        head_dim = self.head_dim\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not a tuple here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass TextTransformer(nn.Layer):\n\n    def __init__(self, context_length: int, vocab_size: int, transformer_width: int, transformer_heads: int,\n                 transformer_layers: int):\n        super().__init__()\n        self.context_length = context_length\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # 
mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def forward(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n            self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n               
                             output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = 
self.encode_text(text)\n\n        # normalized features\n        image_features = image_features / image_features.norm(dim=-1, keepdim=True)\n        text_features = text_features / text_features.norm(dim=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .model import TextTransformer\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['VITL14']\n\nURL = {'VITL14': os.path.join(os.path.dirname(__file__), 'pre_trained', 'vitl14_textencoder.pdparams')}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n          
  raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='VITL14'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'VITL14': build_vitl14_language_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    state_dict = model.state_dict()\n    for key, value in sd.items():\n        if key in state_dict:\n            state_dict[key] = value\n    model.load_dict(state_dict)\n    model.eval()\n    return model\n\n\ndef build_vitl14_language_model():\n    model = TextTransformer(context_length=77,\n                            vocab_size=49408,\n                            transformer_width=768,\n                            transformer_heads=12,\n                            transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/__init__.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = \"0.2.4\"\n\nfrom .models import AutoencoderKL, UNet2DConditionModel, UNet2DModel, VQModel\n\nfrom .schedulers import (DDIMScheduler, DDPMScheduler, KarrasVeScheduler, PNDMScheduler, SchedulerMixin,\n                         ScoreSdeVeScheduler, LMSDiscreteScheduler)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/configuration_utils.py",
    "content": "# coding=utf-8\n# Copyright 2022 The HuggingFace Inc. team.\n# Copyright (c) 2022, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\" ConfigMixinuration base class and utilities.\"\"\"\nimport functools\nimport inspect\nimport json\nimport os\nimport re\nfrom collections import OrderedDict\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Tuple\nfrom typing import Union\n\nfrom requests import HTTPError\n\nfrom paddlehub.common.logger import logger\n\nHUGGINGFACE_CO_RESOLVE_ENDPOINT = \"HUGGINGFACE_CO_RESOLVE_ENDPOINT\"\nDIFFUSERS_CACHE = \"./caches\"\n\n_re_configuration_file = re.compile(r\"config\\.(.*)\\.json\")\n\n\nclass ConfigMixin:\n    r\"\"\"\n    Base class for all configuration classes. 
Handles a few parameters common to all models' configurations as well as\n    methods for loading/downloading/saving configurations.\n\n    \"\"\"\n    config_name = \"model_config.json\"\n    ignore_for_config = []\n\n    def register_to_config(self, **kwargs):\n        if self.config_name is None:\n            raise NotImplementedError(f\"Make sure that {self.__class__} has defined a class name `config_name`\")\n        kwargs[\"_class_name\"] = self.__class__.__name__\n        kwargs[\"_diffusers_version\"] = \"0.0.1\"\n\n        for key, value in kwargs.items():\n            try:\n                setattr(self, key, value)\n            except AttributeError as err:\n                logger.error(f\"Can't set {key} with value {value} for {self}\")\n                raise err\n\n        if not hasattr(self, \"_internal_dict\"):\n            internal_dict = kwargs\n        else:\n            previous_dict = dict(self._internal_dict)\n            internal_dict = {**self._internal_dict, **kwargs}\n            logger.debug(f\"Updating config from {previous_dict} to {internal_dict}\")\n\n        self._internal_dict = FrozenDict(internal_dict)\n\n    def save_config(self, save_directory: Union[str, os.PathLike], push_to_hub: bool = False, **kwargs):\n        \"\"\"\n        Save a configuration object to the directory `save_directory`, so that it can be re-loaded using the\n        [`~ConfigMixin.from_config`] class method.\n\n        Args:\n            save_directory (`str` or `os.PathLike`):\n                Directory where the configuration JSON file will be saved (will be created if it does not exist).\n            kwargs:\n                Additional key word arguments passed along to the [`~utils.PushToHubMixin.push_to_hub`] method.\n        \"\"\"\n        if os.path.isfile(save_directory):\n            raise AssertionError(f\"Provided path ({save_directory}) should be a directory, not a file\")\n\n        os.makedirs(save_directory, exist_ok=True)\n\n        # If 
we save using the predefined names, we can load using `from_config`\n        output_config_file = os.path.join(save_directory, self.config_name)\n\n        self.to_json_file(output_config_file)\n        logger.info(f\"Configuration saved in {output_config_file}\")\n\n    @classmethod\n    def from_config(cls, pretrained_model_name_or_path: Union[str, os.PathLike], return_unused_kwargs=False, **kwargs):\n        config_dict = cls.get_config_dict(pretrained_model_name_or_path=pretrained_model_name_or_path, **kwargs)\n\n        init_dict, unused_kwargs = cls.extract_init_dict(config_dict, **kwargs)\n\n        model = cls(**init_dict)\n\n        if return_unused_kwargs:\n            return model, unused_kwargs\n        else:\n            return model\n\n    @classmethod\n    def get_config_dict(cls, pretrained_model_name_or_path: Union[str, os.PathLike],\n                        **kwargs) -> Tuple[Dict[str, Any], Dict[str, Any]]:\n        cache_dir = kwargs.pop(\"cache_dir\", DIFFUSERS_CACHE)\n        force_download = kwargs.pop(\"force_download\", False)\n        resume_download = kwargs.pop(\"resume_download\", False)\n        proxies = kwargs.pop(\"proxies\", None)\n        use_auth_token = kwargs.pop(\"use_auth_token\", None)\n        local_files_only = kwargs.pop(\"local_files_only\", False)\n        revision = kwargs.pop(\"revision\", None)\n        subfolder = kwargs.pop(\"subfolder\", None)\n\n        user_agent = {\"file_type\": \"config\"}\n\n        pretrained_model_name_or_path = str(pretrained_model_name_or_path)\n\n        if cls.config_name is None:\n            raise ValueError(\n                \"`self.config_name` is not defined. Note that one should not load a config from \"\n                \"`ConfigMixin`. 
Please make sure to define `config_name` in a class inheriting from `ConfigMixin`\")\n\n        if os.path.isfile(pretrained_model_name_or_path):\n            config_file = pretrained_model_name_or_path\n        elif os.path.isdir(pretrained_model_name_or_path):\n            if os.path.isfile(os.path.join(pretrained_model_name_or_path, cls.config_name)):\n                # Load from a PyTorch checkpoint\n                config_file = os.path.join(pretrained_model_name_or_path, cls.config_name)\n            elif subfolder is not None and os.path.isfile(\n                    os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)):\n                config_file = os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)\n            else:\n                raise EnvironmentError(\n                    f\"Error no file named {cls.config_name} found in directory {pretrained_model_name_or_path}.\")\n        else:\n            try:\n                # Load from URL or cache if already cached\n                from huggingface_hub import hf_hub_download\n                config_file = hf_hub_download(\n                    pretrained_model_name_or_path,\n                    filename=cls.config_name,\n                    cache_dir=cache_dir,\n                    force_download=force_download,\n                    proxies=proxies,\n                    resume_download=resume_download,\n                    local_files_only=local_files_only,\n                    use_auth_token=use_auth_token,\n                    user_agent=user_agent,\n                    subfolder=subfolder,\n                )\n\n            except HTTPError as err:\n                raise EnvironmentError(\"There was a specific connection error when trying to load\"\n                                       f\" {pretrained_model_name_or_path}:\\n{err}\")\n            except ValueError:\n                raise EnvironmentError(\n                    f\"We couldn't connect to 
'{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this model, couldn't find it\"\n                    f\" in the cached files and it looks like {pretrained_model_name_or_path} is not the path to a\"\n                    f\" directory containing a {cls.config_name} file.\\nCheckout your internet connection or see how to\"\n                    \" run the library in offline mode at\"\n                    \" 'https://huggingface.co/docs/diffusers/installation#offline-mode'.\")\n            except EnvironmentError:\n                raise EnvironmentError(\n                    f\"Can't load config for '{pretrained_model_name_or_path}'. If you were trying to load it from \"\n                    \"'https://huggingface.co/models', make sure you don't have a local directory with the same name. \"\n                    f\"Otherwise, make sure '{pretrained_model_name_or_path}' is the correct path to a directory \"\n                    f\"containing a {cls.config_name} file\")\n\n        try:\n            # Load config dict\n            config_dict = cls._dict_from_json_file(config_file)\n        except (json.JSONDecodeError, UnicodeDecodeError):\n            raise EnvironmentError(f\"It looks like the config file at '{config_file}' is not a valid JSON file.\")\n\n        return config_dict\n\n    @classmethod\n    def extract_init_dict(cls, config_dict, **kwargs):\n        expected_keys = set(dict(inspect.signature(cls.__init__).parameters).keys())\n        expected_keys.remove(\"self\")\n        # remove general kwargs if present in dict\n        if \"kwargs\" in expected_keys:\n            expected_keys.remove(\"kwargs\")\n        # remove keys to be ignored\n        if len(cls.ignore_for_config) > 0:\n            expected_keys = expected_keys - set(cls.ignore_for_config)\n        init_dict = {}\n        for key in expected_keys:\n            if key in kwargs:\n                # overwrite key\n                init_dict[key] = kwargs.pop(key)\n            elif key in config_dict:\n  
              # use value from config dict\n                init_dict[key] = config_dict.pop(key)\n\n        # dict.update() returns None, so merge in place and keep the leftovers explicitly\n        config_dict.update(kwargs)\n        unused_kwargs = config_dict\n\n        passed_keys = set(init_dict.keys())\n        if len(expected_keys - passed_keys) > 0:\n            logger.warning(\n                f\"{expected_keys - passed_keys} was not found in config. Values will be initialized to default values.\")\n\n        return init_dict, unused_kwargs\n\n    @classmethod\n    def _dict_from_json_file(cls, json_file: Union[str, os.PathLike]):\n        with open(json_file, \"r\", encoding=\"utf-8\") as reader:\n            text = reader.read()\n        return json.loads(text)\n\n    def __repr__(self):\n        return f\"{self.__class__.__name__} {self.to_json_string()}\"\n\n    @property\n    def config(self) -> Dict[str, Any]:\n        return self._internal_dict\n\n    def to_json_string(self) -> str:\n        \"\"\"\n        Serializes this instance to a JSON string.\n\n        Returns:\n            `str`: String containing all the attributes that make up this configuration instance in JSON format.\n        \"\"\"\n        config_dict = self._internal_dict if hasattr(self, \"_internal_dict\") else {}\n        return json.dumps(config_dict, indent=2, sort_keys=True) + \"\\n\"\n\n    def to_json_file(self, json_file_path: Union[str, os.PathLike]):\n        \"\"\"\n        Save this instance to a JSON file.\n\n        Args:\n            json_file_path (`str` or `os.PathLike`):\n                Path to the JSON file in which this configuration instance's parameters will be saved.\n        \"\"\"\n        with open(json_file_path, \"w\", encoding=\"utf-8\") as writer:\n            writer.write(self.to_json_string())\n\n\nclass FrozenDict(OrderedDict):\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n        for key, value in self.items():\n            setattr(self, key, value)\n\n        self.__frozen = True\n\n    def __delitem__(self, 
*args, **kwargs):\n        raise Exception(f\"You cannot use ``__delitem__`` on a {self.__class__.__name__} instance.\")\n\n    def setdefault(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``setdefault`` on a {self.__class__.__name__} instance.\")\n\n    def pop(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``pop`` on a {self.__class__.__name__} instance.\")\n\n    def update(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``update`` on a {self.__class__.__name__} instance.\")\n\n    def __setattr__(self, name, value):\n        if hasattr(self, \"__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setattr__`` on a {self.__class__.__name__} instance.\")\n        super().__setattr__(name, value)\n\n    def __setitem__(self, name, value):\n        if hasattr(self, \"__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setattr__`` on a {self.__class__.__name__} instance.\")\n        super().__setitem__(name, value)\n\n\ndef register_to_config(init):\n    \"\"\"\n    Decorator to apply on the init of classes inheriting from `ConfigMixin` so that all the arguments are automatically\n    sent to `self.register_for_config`. 
To ignore a specific argument accepted by the init but that shouldn't be\n    registered in the config, use the `ignore_for_config` class variable\n\n    Warning: Once decorated, all private arguments (beginning with an underscore) are trashed and not sent to the init!\n    \"\"\"\n\n    @functools.wraps(init)\n    def inner_init(self, *args, **kwargs):\n        # Ignore private kwargs in the init.\n        init_kwargs = {k: v for k, v in kwargs.items() if not k.startswith(\"_\")}\n        init(self, *args, **init_kwargs)\n        if not isinstance(self, ConfigMixin):\n            raise RuntimeError(\n                f\"`@register_for_config` was applied to {self.__class__.__name__} init method, but this class does \"\n                \"not inherit from `ConfigMixin`.\")\n\n        ignore = getattr(self, \"ignore_for_config\", [])\n        # Get positional arguments aligned with kwargs\n        new_kwargs = {}\n        signature = inspect.signature(init)\n        parameters = {\n            name: p.default\n            for i, (name, p) in enumerate(signature.parameters.items()) if i > 0 and name not in ignore\n        }\n        for arg, name in zip(args, parameters.keys()):\n            new_kwargs[name] = arg\n\n        # Then add all kwargs\n        new_kwargs.update({\n            k: init_kwargs.get(k, default)\n            for k, default in parameters.items() if k not in ignore and k not in new_kwargs\n        })\n        getattr(self, \"register_to_config\")(**new_kwargs)\n\n    return inner_init\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/README.md",
    "content": "# Models\n\n- Models: Neural network that models $p_\\theta(\\mathbf{x}_{t-1}|\\mathbf{x}_t)$ (see image below) and is trained end-to-end to denoise a noisy input to an image. Examples: UNet, Conditioned UNet, 3D UNet, Transformer UNet\n\n## API\n\nTODO(Suraj, Patrick)\n\n## Examples\n\nTODO(Suraj, Patrick)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .unet_2d import UNet2DModel\nfrom .unet_2d_condition import UNet2DConditionModel\nfrom .vae import AutoencoderKL\nfrom .vae import VQModel\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/attention.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom inspect import isfunction\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef finfo(dtype):\n    if dtype == paddle.float32:\n        return np.finfo(np.float32)\n    if dtype == paddle.float16:\n        return np.finfo(np.float16)\n    if dtype == paddle.float64:\n        return np.finfo(np.float64)\n\n\npaddle.finfo = finfo\n\n\nclass AttentionBlockNew(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other. 
Originally ported from here, but adapted\n    to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    Uses three q, k, v linear layers to compute attention\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_head_channels=None,\n        num_groups=32,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n\n        self.num_heads = channels // num_head_channels if num_head_channels is not None else 1\n        self.num_head_size = num_head_channels\n        self.group_norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n\n        # define q,k,v as linear layers\n        self.query = nn.Linear(channels, channels)\n        self.key = nn.Linear(channels, channels)\n        self.value = nn.Linear(channels, channels)\n\n        self.rescale_output_factor = rescale_output_factor\n        self.proj_attn = nn.Linear(channels, channels)\n\n    def transpose_for_scores(self, projection: paddle.Tensor) -> paddle.Tensor:\n        new_projection_shape = projection.shape[:-1] + [self.num_heads, -1]\n        # move heads to 2nd position (B, T, H * D) -> (B, T, H, D) -> (B, H, T, D)\n        new_projection = projection.reshape(new_projection_shape).transpose([0, 2, 1, 3])\n        return new_projection\n\n    def forward(self, hidden_states):\n        residual = hidden_states\n        batch, channel, height, width = hidden_states.shape\n\n        # norm\n        hidden_states = self.group_norm(hidden_states)\n\n        hidden_states = hidden_states.reshape([batch, channel, height * width]).transpose([0, 2, 1])\n\n        # proj to q, k, v\n        query_proj = self.query(hidden_states)\n        key_proj = self.key(hidden_states)\n        value_proj = self.value(hidden_states)\n\n        # transpose\n        query_states = self.transpose_for_scores(query_proj)\n        
key_states = self.transpose_for_scores(key_proj)\n        value_states = self.transpose_for_scores(value_proj)\n\n        # get scores\n        scale = 1 / math.sqrt(math.sqrt(self.channels / self.num_heads))\n        attention_scores = paddle.matmul(query_states * scale, key_states * scale, transpose_y=True)\n        attention_probs = F.softmax(attention_scores.astype(\"float32\"), axis=-1).astype(attention_scores.dtype)\n\n        # compute attention output\n        context_states = paddle.matmul(attention_probs, value_states)\n\n        context_states = context_states.transpose([0, 2, 1, 3])\n        new_context_states_shape = context_states.shape[:-2] + [\n            self.channels,\n        ]\n        context_states = context_states.reshape(new_context_states_shape)\n\n        # compute next hidden_states\n        hidden_states = self.proj_attn(context_states)\n        hidden_states = hidden_states.transpose([0, 2, 1]).reshape([batch, channel, height, width])\n\n        # res connect and rescale\n        hidden_states = (hidden_states + residual) / self.rescale_output_factor\n        return hidden_states\n\n    def set_weight(self, attn_layer):\n        self.group_norm.weight.set_value(attn_layer.norm.weight)\n        self.group_norm.bias.set_value(attn_layer.norm.bias)\n\n        if hasattr(attn_layer, \"q\"):\n            self.query.weight.set_value(attn_layer.q.weight[:, :, 0, 0])\n            self.key.weight.set_value(attn_layer.k.weight[:, :, 0, 0])\n            self.value.weight.set_value(attn_layer.v.weight[:, :, 0, 0])\n\n            self.query.bias.set_value(attn_layer.q.bias)\n            self.key.bias.set_value(attn_layer.k.bias)\n            self.value.bias.set_value(attn_layer.v.bias)\n\n            self.proj_attn.weight.set_value(attn_layer.proj_out.weight[:, :, 0, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj_out.bias)\n        elif hasattr(attn_layer, \"NIN_0\"):\n            
self.query.weight.set_value(attn_layer.NIN_0.W.t())\n            self.key.weight.set_value(attn_layer.NIN_1.W.t())\n            self.value.weight.set_value(attn_layer.NIN_2.W.t())\n\n            self.query.bias.set_value(attn_layer.NIN_0.b)\n            self.key.bias.set_value(attn_layer.NIN_1.b)\n            self.value.bias.set_value(attn_layer.NIN_2.b)\n\n            self.proj_attn.weight.set_value(attn_layer.NIN_3.W.t())\n            self.proj_attn.bias.set_value(attn_layer.NIN_3.b)\n\n            self.group_norm.weight.set_value(attn_layer.GroupNorm_0.weight)\n            self.group_norm.bias.set_value(attn_layer.GroupNorm_0.bias)\n        else:\n            qkv_weight = attn_layer.qkv.weight.reshape(\n                [self.num_heads, 3 * self.channels // self.num_heads, self.channels])\n            qkv_bias = attn_layer.qkv.bias.reshape([self.num_heads, 3 * self.channels // self.num_heads])\n\n            # paddle.Tensor.split expects the number of sub-tensors, not the chunk size,\n            # so split the fused qkv projection into 3 equal parts along axis 1\n            q_w, k_w, v_w = qkv_weight.split(3, axis=1)\n            q_b, k_b, v_b = qkv_bias.split(3, axis=1)\n\n            self.query.weight.set_value(q_w.reshape([-1, self.channels]))\n            self.key.weight.set_value(k_w.reshape([-1, self.channels]))\n            self.value.weight.set_value(v_w.reshape([-1, self.channels]))\n\n            self.query.bias.set_value(q_b.flatten())\n            self.key.bias.set_value(k_b.flatten())\n            self.value.bias.set_value(v_b.flatten())\n\n            self.proj_attn.weight.set_value(attn_layer.proj.weight[:, :, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj.bias)\n\n\nclass SpatialTransformer(nn.Layer):\n    \"\"\"\n    Transformer block for image-like data. First, project the input (aka embedding) and reshape to b, t, d. Then apply\n    standard transformer action. 
Finally, reshape to image\n    \"\"\"\n\n    def __init__(self, in_channels, n_heads, d_head, depth=1, dropout=0.0, context_dim=None):\n        super().__init__()\n        self.n_heads = n_heads\n        self.d_head = d_head\n        self.in_channels = in_channels\n        inner_dim = n_heads * d_head\n        self.norm = nn.GroupNorm(num_groups=32, num_channels=in_channels, epsilon=1e-6)\n\n        self.proj_in = nn.Conv2D(in_channels, inner_dim, kernel_size=1, stride=1, padding=0)\n\n        self.transformer_blocks = nn.LayerList([\n            BasicTransformerBlock(inner_dim, n_heads, d_head, dropout=dropout, context_dim=context_dim)\n            for d in range(depth)\n        ])\n\n        self.proj_out = nn.Conv2D(inner_dim, in_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, context=None):\n        # note: if no context is given, cross-attention defaults to self-attention\n        b, c, h, w = x.shape\n        x_in = x\n        x = self.norm(x)\n        x = self.proj_in(x)\n        x = x.transpose([0, 2, 3, 1]).reshape([b, h * w, c])\n        for block in self.transformer_blocks:\n            x = block(x, context=context)\n        x = x.reshape([b, h, w, c]).transpose([0, 3, 1, 2])\n        x = self.proj_out(x)\n        return x + x_in\n\n    def set_weight(self, layer):\n        self.norm = layer.norm\n        self.proj_in = layer.proj_in\n        self.transformer_blocks = layer.transformer_blocks\n        self.proj_out = layer.proj_out\n\n\nclass BasicTransformerBlock(nn.Layer):\n\n    def __init__(self, dim, n_heads, d_head, dropout=0.0, context_dim=None, gated_ff=True, checkpoint=True):\n        super().__init__()\n        self.attn1 = CrossAttention(query_dim=dim, heads=n_heads, dim_head=d_head,\n                                    dropout=dropout)  # is a self-attention\n        self.ff = FeedForward(dim, dropout=dropout, glu=gated_ff)\n        self.attn2 = CrossAttention(query_dim=dim,\n                                    
context_dim=context_dim,\n                                    heads=n_heads,\n                                    dim_head=d_head,\n                                    dropout=dropout)  # is self-attn if context is none\n        self.norm1 = nn.LayerNorm(dim)\n        self.norm2 = nn.LayerNorm(dim)\n        self.norm3 = nn.LayerNorm(dim)\n        self.checkpoint = checkpoint\n\n    def forward(self, x, context=None):\n        x = self.attn1(self.norm1(x)) + x\n        x = self.attn2(self.norm2(x), context=context) + x\n        x = self.ff(self.norm3(x)) + x\n        return x\n\n\nclass CrossAttention(nn.Layer):\n\n    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, dropout=0.0):\n        super().__init__()\n        inner_dim = dim_head * heads\n        context_dim = default(context_dim, query_dim)\n\n        self.scale = dim_head**-0.5\n        self.heads = heads\n\n        self.to_q = nn.Linear(query_dim, inner_dim, bias_attr=False)\n        self.to_k = nn.Linear(context_dim, inner_dim, bias_attr=False)\n        self.to_v = nn.Linear(context_dim, inner_dim, bias_attr=False)\n\n        self.to_out = nn.Sequential(nn.Linear(inner_dim, query_dim), nn.Dropout(dropout))\n\n    def reshape_heads_to_batch_dim(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size, seq_len, head_size, dim // head_size])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size * head_size, seq_len, dim // head_size])\n        return tensor\n\n    def reshape_batch_dim_to_heads(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size // head_size, head_size, seq_len, dim])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size // head_size, seq_len, dim * head_size])\n        return tensor\n\n    def forward(self, x, context=None, mask=None):\n        batch_size, 
sequence_length, dim = x.shape\n\n        h = self.heads\n\n        q = self.to_q(x)\n        context = default(context, x)\n        k = self.to_k(context)\n        v = self.to_v(context)\n\n        q = self.reshape_heads_to_batch_dim(q)\n        k = self.reshape_heads_to_batch_dim(k)\n        v = self.reshape_heads_to_batch_dim(v)\n\n        sim = paddle.einsum(\"b i d, b j d -> b i j\", q * self.scale, k)\n\n        if exists(mask):\n            mask = mask.reshape([batch_size, -1])\n            max_neg_value = -paddle.finfo(sim.dtype).max\n            # paddle has no Tensor.repeat or in-place masked_fill_; expand the mask with\n            # tile and apply it with paddle.where instead\n            mask = mask[:, None, :].tile([h, sim.shape[1], 1]).astype(\"bool\")\n            sim = paddle.where(mask, sim, paddle.full_like(sim, max_neg_value))\n\n        # attention, what we cannot get enough of\n        attn = F.softmax(sim, axis=-1)\n\n        out = paddle.einsum(\"b i j, b j d -> b i d\", attn, v)\n        out = self.reshape_batch_dim_to_heads(out)\n        return self.to_out(out)\n\n\nclass FeedForward(nn.Layer):\n\n    def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.0):\n        super().__init__()\n        inner_dim = int(dim * mult)\n        dim_out = default(dim_out, dim)\n        project_in = nn.Sequential(nn.Linear(dim, inner_dim), nn.GELU()) if not glu else GEGLU(dim, inner_dim)\n\n        self.net = nn.Sequential(project_in, nn.Dropout(dropout), nn.Linear(inner_dim, dim_out))\n\n    def forward(self, x):\n        return self.net(x)\n\n\n# feedforward\nclass GEGLU(nn.Layer):\n\n    def __init__(self, dim_in, dim_out):\n        super().__init__()\n        self.proj = nn.Linear(dim_in, dim_out * 2)\n\n    def forward(self, x):\n        x, gate = self.proj(x).chunk(2, axis=-1)\n        return x * F.gelu(gate)\n\n\n# TODO(Patrick) - remove once all weights have been converted -> not needed anymore then\nclass NIN(nn.Layer):\n\n    def __init__(self, in_dim, num_units, init_scale=0.1):\n        super().__init__()\n        self.W = self.create_parameter(shape=[in_dim, num_units], default_initializer=nn.initializer.Constant(0.))\n        
self.b = self.create_parameter(shape=[\n            num_units,\n        ],\n                                       is_bias=True,\n                                       default_initializer=nn.initializer.Constant(0.))\n\n\ndef exists(val):\n    return val is not None\n\n\ndef default(val, d):\n    if exists(val):\n        return val\n    return d() if isfunction(d) else d\n\n\n# the main attention block that is used for all models\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=None,\n        num_groups=32,\n        encoder_channels=None,\n        overwrite_qkv=False,\n        overwrite_linear=False,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels is None:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n\n        self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n        self.qkv = nn.Conv1D(channels, channels * 3, 1)\n        self.n_heads = self.num_heads\n        self.rescale_output_factor = rescale_output_factor\n\n        if encoder_channels is not None:\n            self.encoder_kv = nn.Conv1D(encoder_channels, channels * 2, 1)\n\n        self.proj = nn.Conv1D(channels, channels, 1)\n\n        self.overwrite_qkv = overwrite_qkv\n        self.overwrite_linear = overwrite_linear\n\n        if overwrite_qkv:\n         
   in_channels = channels\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.q = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.k = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.v = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.proj_out = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n        elif self.overwrite_linear:\n            num_groups = min(channels // 4, 32)\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.NIN_0 = NIN(channels, channels)\n            self.NIN_1 = NIN(channels, channels)\n            self.NIN_2 = NIN(channels, channels)\n            self.NIN_3 = NIN(channels, channels)\n\n            self.GroupNorm_0 = nn.GroupNorm(num_groups=num_groups, num_channels=channels, epsilon=1e-6)\n        else:\n            self.proj_out = nn.Conv1D(channels, channels, 1)\n            self.set_weights(self)\n\n        self.is_overwritten = False\n\n    def set_weights(self, layer):\n        if self.overwrite_qkv:\n            qkv_weight = paddle.concat([layer.q.weight, layer.k.weight, layer.v.weight], axis=0)[:, :, :, 0]\n            qkv_bias = paddle.concat([layer.q.bias, layer.k.bias, layer.v.bias], axis=0)\n\n            self.qkv.weight.set_value(qkv_weight)\n            self.qkv.bias.set_value(qkv_bias)\n\n            proj_out = nn.Conv1D(self.channels, self.channels, 1)\n            proj_out.weight.set_value(layer.proj_out.weight[:, :, :, 0])\n            proj_out.bias.set_value(layer.proj_out.bias)\n\n            self.proj = proj_out\n        elif self.overwrite_linear:\n            self.qkv.weight.set_value(\n                paddle.concat([self.NIN_0.W.t(), self.NIN_1.W.t(), self.NIN_2.W.t()], axis=0)[:, :, None])\n            
self.qkv.bias.set_value(paddle.concat([self.NIN_0.b, self.NIN_1.b, self.NIN_2.b], axis=0))\n\n            self.proj.weight.set_value(self.NIN_3.W.t()[:, :, None])\n            self.proj.bias.set_value(self.NIN_3.b)\n\n            self.norm.weight.set_value(self.GroupNorm_0.weight)\n            self.norm.bias.set_value(self.GroupNorm_0.bias)\n        else:\n            self.proj.weight.set_value(self.proj_out.weight)\n            self.proj.bias.set_value(self.proj_out.bias)\n\n    def forward(self, x, encoder_out=None):\n        if not self.is_overwritten and (self.overwrite_qkv or self.overwrite_linear):\n            self.set_weights(self)\n            self.is_overwritten = True\n\n        b, c, *spatial = x.shape\n        hid_states = self.norm(x).reshape([b, c, -1])\n\n        qkv = self.qkv(hid_states)\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        # paddle.Tensor.split expects the number of sub-tensors, so split into 3 chunks of size ch\n        q, k, v = qkv.reshape([bs * self.n_heads, ch * 3, length]).split(3, axis=1)\n\n        if encoder_out is not None:\n            encoder_kv = self.encoder_kv(encoder_out)\n            assert encoder_kv.shape[1] == self.n_heads * ch * 2\n            ek, ev = encoder_kv.reshape([bs * self.n_heads, ch * 2, -1]).split(2, axis=1)\n            k = paddle.concat([ek, k], axis=-1)\n            v = paddle.concat([ev, v], axis=-1)\n\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = F.softmax(weight.astype(\"float32\"), axis=-1).astype(weight.dtype)\n\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        h = a.reshape([bs, -1, length])\n\n        h = self.proj(h)\n        h = h.reshape([b, c, *spatial])\n\n        result = x + h\n\n        result = result / self.rescale_output_factor\n\n        return result\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/embeddings.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef get_timestep_embedding(timesteps,\n                           embedding_dim,\n                           flip_sin_to_cos=False,\n                           downscale_freq_shift=1,\n                           scale=1,\n                           max_period=10000):\n    \"\"\"\n    This matches the implementation in Denoising Diffusion Probabilistic Models: Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param embedding_dim: the dimension of the output. :param max_period: controls the minimum frequency of the\n    embeddings. 
:return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    assert len(timesteps.shape) == 1, \"Timesteps should be a 1d-array\"\n\n    half_dim = embedding_dim // 2\n    exponent = -math.log(max_period) * paddle.arange(start=0, end=half_dim, dtype=\"float32\")\n    exponent = exponent / (half_dim - downscale_freq_shift)\n\n    emb = paddle.exp(exponent)\n    emb = timesteps[:, None].astype(\"float32\") * emb[None, :]\n\n    # scale embeddings\n    emb = scale * emb\n\n    # concat sine and cosine embeddings\n    emb = paddle.concat([paddle.sin(emb), paddle.cos(emb)], axis=-1)\n\n    # flip sine and cosine embeddings\n    if flip_sin_to_cos:\n        emb = paddle.concat([emb[:, half_dim:], emb[:, :half_dim]], axis=-1)\n\n    # zero pad (paddle.concat takes a list of tensors)\n    if embedding_dim % 2 == 1:\n        emb = paddle.concat([emb, paddle.zeros([emb.shape[0], 1])], axis=-1)\n    return emb\n\n\nclass TimestepEmbedding(nn.Layer):\n\n    def __init__(self, channel, time_embed_dim, act_fn=\"silu\"):\n        super().__init__()\n\n        self.linear_1 = nn.Linear(channel, time_embed_dim)\n        self.act = None\n        if act_fn == \"silu\":\n            self.act = nn.Silu()\n        self.linear_2 = nn.Linear(time_embed_dim, time_embed_dim)\n\n    def forward(self, sample):\n        sample = self.linear_1(sample)\n\n        if self.act is not None:\n            sample = self.act(sample)\n\n        sample = self.linear_2(sample)\n        return sample\n\n\nclass Timesteps(nn.Layer):\n\n    def __init__(self, num_channels, flip_sin_to_cos, downscale_freq_shift):\n        super().__init__()\n        self.num_channels = num_channels\n        self.flip_sin_to_cos = flip_sin_to_cos\n        self.downscale_freq_shift = downscale_freq_shift\n\n    def forward(self, timesteps):\n        t_emb = get_timestep_embedding(\n            timesteps,\n            self.num_channels,\n            flip_sin_to_cos=self.flip_sin_to_cos,\n            downscale_freq_shift=self.downscale_freq_shift,\n        )\n 
       return t_emb\n\n\nclass GaussianFourierProjection(nn.Layer):\n    \"\"\"Gaussian Fourier embeddings for noise levels.\"\"\"\n\n    def __init__(self, embedding_size=256, scale=1.0):\n        super().__init__()\n        self.register_buffer(\"weight\", paddle.randn((embedding_size, )) * scale)\n\n        # to delete later\n        self.register_buffer(\"W\", paddle.randn((embedding_size, )) * scale)\n\n        self.weight = self.W\n\n    def forward(self, x):\n        x = paddle.log(x)\n        x_proj = x[:, None] * self.weight[None, :] * 2 * np.pi\n        out = paddle.concat([paddle.sin(x_proj), paddle.cos(x_proj)], axis=-1)\n        return out\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/resnet.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef pad_new(x, pad, mode=\"constant\", value=0):\n    new_pad = []\n    for _ in range(x.ndim * 2 - len(pad)):\n        new_pad.append(0)\n    ndim = list(range(x.ndim - 1, 0, -1))\n    axes_start = {}\n    for i, _pad in enumerate(pad):\n        if _pad < 0:\n            new_pad.append(0)\n            zhengshu, yushu = divmod(i, 2)\n            if yushu == 0:\n                axes_start[ndim[zhengshu]] = -_pad\n        else:\n            new_pad.append(_pad)\n\n    padded = paddle.nn.functional.pad(x, new_pad, mode=mode, value=value)\n    padded_shape = paddle.shape(padded)\n    axes = []\n    starts = []\n    ends = []\n    for k, v in axes_start.items():\n        axes.append(k)\n        starts.append(v)\n        ends.append(padded_shape[k])\n        assert v < padded_shape[k]\n\n    if axes:\n        return padded.slice(axes=axes, starts=starts, ends=ends)\n    else:\n        return padded\n\n\nclass Upsample2D(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, use_conv_transpose=False, out_channels=None, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_conv_transpose = use_conv_transpose\n        self.name = name\n\n        conv = None\n        if use_conv_transpose:\n            conv = nn.Conv2DTranspose(channels, self.out_channels, 4, 2, 1)\n        elif use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, padding=1)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.conv = conv\n        else:\n            self.Conv2d_0 = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv_transpose:\n            return self.conv(x)\n\n        x = F.interpolate(x, scale_factor=2.0, mode=\"nearest\")\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if self.use_conv:\n            if self.name == \"conv\":\n                x = self.conv(x)\n            else:\n                x = self.Conv2d_0(x)\n\n        return x\n\n\nclass Downsample2D(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, out_channels=None, padding=1, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.padding = padding\n        stride = 2\n        self.name = name\n\n        if use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, stride=stride, padding=padding)\n        else:\n            assert self.channels == self.out_channels\n            conv = nn.AvgPool2D(kernel_size=stride, stride=stride)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.Conv2d_0 = conv\n            self.conv = conv\n        elif name == \"Conv2d_0\":\n            self.conv = conv\n        else:\n            self.conv = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv and self.padding == 0:\n            pad = (0, 1, 0, 1)\n            x = pad_new(x, pad, mode=\"constant\", value=0)\n\n        assert x.shape[1] == self.channels\n        x = self.conv(x)\n\n        return x\n\n\nclass FirUpsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.use_conv = use_conv\n        self.fir_kernel = fir_kernel\n        self.out_channels = out_channels\n\n    def _upsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `upsample_2d()` followed by `Conv2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. 
The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n            C]`.\n        w: Weight tensor of the shape `[filterH, filterW, inChannels,\n            outChannels]`. Grouped convolution can be performed by `inChannels = x.shape[0] // numGroups`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n            (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]` or `[N, H * factor, W * factor, C]`, and same datatype as\n        `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n\n        # Setup filter kernel.\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * (gain * (factor**2))\n\n        if self.use_conv:\n            convH = w.shape[2]\n            convW = w.shape[3]\n            inC = w.shape[1]\n\n            p = (k.shape[0] - factor) - (convW - 1)\n\n            stride = (factor, factor)\n            # Determine data dimensions.\n            stride = [1, 1, factor, factor]\n            output_shape = ((x.shape[2] - 1) * factor + convH, (x.shape[3] - 1) * factor + convW)\n            output_padding = (\n                output_shape[0] - (x.shape[2] - 1) * stride[0] - convH,\n                output_shape[1] - (x.shape[3] - 1) * stride[1] - convW,\n            )\n            assert output_padding[0] >= 0 and output_padding[1] >= 0\n            inC = w.shape[1]\n            num_groups = x.shape[1] // inC\n\n            # Transpose 
weights.\n            w = paddle.reshape(w, (num_groups, -1, inC, convH, convW))\n            w = w[..., ::-1, ::-1].transpose([0, 2, 1, 3, 4])\n            w = paddle.reshape(w, (num_groups * inC, -1, convH, convW))\n\n            x = F.conv2d_transpose(x, w, stride=stride, output_padding=output_padding, padding=0)\n\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2 + factor - 1, p // 2 + 1))\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            h = self._upsample_2d(x, self.Conv2d_0.weight, k=self.fir_kernel)\n            h = h + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            h = self._upsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return h\n\n\nclass FirDownsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.fir_kernel = fir_kernel\n        self.use_conv = use_conv\n        self.out_channels = out_channels\n\n    def _downsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `Conv2d()` followed by `downsample_2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n            x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`. w: Weight tensor of the shape `[filterH,\n            filterW, inChannels, outChannels]`. 
Grouped convolution can be performed by `inChannels = x.shape[0] //\n            numGroups`. k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). The default is `[1] *\n            factor`, which corresponds to average pooling. factor: Integer downsampling factor (default: 2). gain:\n            Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n            Tensor of the shape `[N, C, H // factor, W // factor]` or `[N, H // factor, W // factor, C]`, and same\n            datatype as `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * gain\n\n        if self.use_conv:\n            _, _, convH, convW = w.shape\n            p = (k.shape[0] - factor) + (convW - 1)\n            s = [factor, factor]\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2, p // 2))\n            x = F.conv2d(x, w, stride=s, padding=0)\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            x = self._downsample_2d(x, w=self.Conv2d_0.weight, k=self.fir_kernel)\n            x = x + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            x = self._downsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return x\n\n\nclass ResnetBlock(nn.Layer):\n\n    def __init__(\n        self,\n        *,\n        in_channels,\n        out_channels=None,\n        conv_shortcut=False,\n        dropout=0.0,\n        temb_channels=512,\n        groups=32,\n        groups_out=None,\n        pre_norm=True,\n        eps=1e-6,\n        non_linearity=\"swish\",\n        time_embedding_norm=\"default\",\n        
kernel=None,\n        output_scale_factor=1.0,\n        use_nin_shortcut=None,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.pre_norm = pre_norm\n        self.pre_norm = True\n        self.in_channels = in_channels\n        out_channels = in_channels if out_channels is None else out_channels\n        self.out_channels = out_channels\n        self.use_conv_shortcut = conv_shortcut\n        self.time_embedding_norm = time_embedding_norm\n        self.up = up\n        self.down = down\n        self.output_scale_factor = output_scale_factor\n\n        if groups_out is None:\n            groups_out = groups\n\n        self.norm1 = nn.GroupNorm(num_groups=groups, num_channels=in_channels, epsilon=eps)\n\n        self.conv1 = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if temb_channels is not None:\n            self.time_emb_proj = nn.Linear(temb_channels, out_channels)\n        else:\n            self.time_emb_proj = None\n\n        self.norm2 = nn.GroupNorm(num_groups=groups_out, num_channels=out_channels, epsilon=eps)\n        self.dropout = nn.Dropout(dropout)\n        self.conv2 = nn.Conv2D(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if non_linearity == \"swish\":\n            self.nonlinearity = lambda x: F.silu(x)\n        elif non_linearity == \"mish\":\n            self.nonlinearity = Mish()\n        elif non_linearity == \"silu\":\n            self.nonlinearity = nn.Silu()\n\n        self.upsample = self.downsample = None\n        if self.up:\n            if kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.upsample = lambda x: upsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.upsample = partial(F.interpolate, scale_factor=2.0, mode=\"nearest\")\n            else:\n                self.upsample = Upsample2D(in_channels, use_conv=False)\n        elif self.down:\n            if 
kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.downsample = lambda x: downsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.downsample = partial(F.avg_pool2d, kernel_size=2, stride=2)\n            else:\n                self.downsample = Downsample2D(in_channels, use_conv=False, padding=1, name=\"op\")\n\n        self.use_nin_shortcut = self.in_channels != self.out_channels if use_nin_shortcut is None else use_nin_shortcut\n\n        self.conv_shortcut = None\n        if self.use_nin_shortcut:\n            self.conv_shortcut = nn.Conv2D(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, temb):\n        h = x\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm1(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        if self.upsample is not None:\n            x = self.upsample(x)\n            h = self.upsample(h)\n        elif self.downsample is not None:\n            x = self.downsample(x)\n            h = self.downsample(h)\n\n        h = self.conv1(h)\n\n        if temb is not None:\n            temb = self.time_emb_proj(self.nonlinearity(temb))[:, :, None, None]\n            h = h + temb\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm2(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        h = self.dropout(h)\n        h = self.conv2(h)\n\n        if self.conv_shortcut is not None:\n            x = self.conv_shortcut(x)\n\n        out = (x + h) / self.output_scale_factor\n\n        return out\n\n\nclass Mish(nn.Layer):\n\n    def forward(self, x):\n        return x * F.tanh(F.softplus(x))\n\n\ndef upsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Upsample a batch of 2D images with the given filter.\n\n    Accepts a batch of 2D images of 
the shape `[N, C, H, W]` or `[N, H, W, C]` and upsamples each image with the given\n    filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the specified\n    `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its shape is a\n    multiple of the upsampling factor.\n\n    Args:\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). The default is `[1] * factor`, which\n          corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2).\n        gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]`\n    \"\"\"\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * (gain * (factor**2))\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n\ndef downsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Downsample a batch of 2D images with the given filter.\n\n    Accepts a batch of 2D images of the shape `[N, C, H, W]` or `[N, H, W, C]` and downsamples each image with the\n    given filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the\n    specified `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its\n    shape is a multiple of the downsampling factor.\n\n    Args:\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). 
The default is `[1] * factor`, which\n          corresponds to average pooling.\n        factor: Integer downsampling factor (default: 2).\n        gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H // factor, W // factor]`\n    \"\"\"\n\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * gain\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n\ndef upfirdn2d_native(input, kernel, up=1, down=1, pad=(0, 0)):\n    up_x = up_y = up\n    down_x = down_y = down\n    pad_x0 = pad_y0 = pad[0]\n    pad_x1 = pad_y1 = pad[1]\n\n    _, channel, in_h, in_w = input.shape\n    input = input.reshape([-1, in_h, in_w, 1])\n\n    _, in_h, in_w, minor = input.shape\n    kernel_h, kernel_w = kernel.shape\n\n    out = input.reshape([-1, in_h, 1, in_w, 1, minor])\n    # zero-stuff the input so each pixel is followed by (up - 1) zeros\n    out = pad_new(out, [0, 0, 0, up_x - 1, 0, 0, 0, up_y - 1])\n    out = out.reshape([-1, in_h * up_y, in_w * up_x, minor])\n\n    out = pad_new(out, [0, 0, max(pad_x0, 0), max(pad_x1, 0), max(pad_y0, 0), max(pad_y1, 0)])\n    out = out[:, max(-pad_y0, 0):out.shape[1] - max(-pad_y1, 0), max(-pad_x0, 0):out.shape[2] - max(-pad_x1, 0), :]\n\n    out = out.transpose([0, 3, 1, 2])\n    out = out.reshape([-1, 1, in_h * up_y + pad_y0 + pad_y1, in_w * up_x + pad_x0 + pad_x1])\n    w = paddle.flip(kernel, [0, 1]).reshape([1, 1, kernel_h, kernel_w])\n    out = F.conv2d(out, w)\n    out = out.reshape(\n        [-1, minor, in_h * up_y + pad_y0 + pad_y1 - kernel_h + 1, in_w * up_x + pad_x0 + pad_x1 - kernel_w + 1])\n    out = out.transpose([0, 2, 3, 1])\n    out = out[:, ::down_y, ::down_x, :]\n\n    out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1\n    out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1\n\n    
return out.reshape([-1, channel, out_h, out_w])\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/unet_2d.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import GaussianFourierProjection\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass UNet2DModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=None,\n        in_channels=3,\n        out_channels=3,\n        center_input_sample=False,\n        time_embedding_type=\"positional\",\n        freq_shift=0,\n        flip_sin_to_cos=True,\n        down_block_types=(\"DownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\"),\n        up_block_types=(\"AttnUpBlock2D\", \"AttnUpBlock2D\", \"AttnUpBlock2D\", \"UpBlock2D\"),\n        block_out_channels=(224, 448, 672, 896),\n        layers_per_block=2,\n        mid_block_scale_factor=1,\n        downsample_padding=1,\n        act_fn=\"silu\",\n        attention_head_dim=8,\n        norm_num_groups=32,\n        norm_eps=1e-5,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n    
    # input\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        if time_embedding_type == \"fourier\":\n            self.time_proj = GaussianFourierProjection(embedding_size=block_out_channels[0], scale=16)\n            timestep_input_dim = 2 * block_out_channels[0]\n        elif time_embedding_type == \"positional\":\n            self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n            timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=attention_head_dim,\n     
       resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = norm_num_groups if norm_num_groups is not None else min(block_out_channels[0] // 4, 32)\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=num_groups_out,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, sample: paddle.Tensor, timestep: Union[paddle.Tensor, float, int]) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        skip_sample = sample\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n            if hasattr(downsample_block, \"skip_conv\"):\n                sample, res_samples, skip_sample = downsample_block(hidden_states=sample,\n                                                                    temb=emb,\n                                                                    skip_sample=skip_sample)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb)\n\n        # 5. up\n        skip_sample = None\n        for upsample_block in self.up_blocks:\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"skip_conv\"):\n                sample, skip_sample = upsample_block(sample, res_samples, emb, skip_sample)\n            else:\n                sample = upsample_block(sample, res_samples, emb)\n\n        # 6. 
post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        if skip_sample is not None:\n            sample += skip_sample\n\n        if self.config.time_embedding_type == \"fourier\":\n            timesteps = timesteps.reshape((sample.shape[0], *([1] * len(sample.shape[1:]))))\n            sample = sample / timesteps\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/unet_2d_condition.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2DCrossAttn\n\n\nclass UNet2DConditionModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=64,\n        in_channels=4,\n        out_channels=4,\n        center_input_sample=False,\n        flip_sin_to_cos=True,\n        freq_shift=0,\n        down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n        up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\"),\n        block_out_channels=(320, 640, 1280, 1280),\n        layers_per_block=2,\n        downsample_padding=1,\n        mid_block_scale_factor=1,\n        act_fn=\"silu\",\n        norm_num_groups=32,\n        norm_eps=1e-5,\n        cross_attention_dim=768,\n        attention_head_dim=8,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n        # input\n 
       self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n        timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2DCrossAttn(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attention_head_dim,\n            resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = 
reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=norm_num_groups,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(\n        self,\n        sample: paddle.Tensor,\n        timestep: Union[paddle.Tensor, float, int],\n        encoder_hidden_states: paddle.Tensor,\n    ) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n\n            if hasattr(downsample_block, \"attentions\") and downsample_block.attentions is not None:\n                sample, res_samples = downsample_block(hidden_states=sample,\n                                                       temb=emb,\n                                                       encoder_hidden_states=encoder_hidden_states)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb, encoder_hidden_states=encoder_hidden_states)\n\n        # 5. 
up\n        for upsample_block in self.up_blocks:\n\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"attentions\") and upsample_block.attentions is not None:\n                sample = upsample_block(\n                    hidden_states=sample,\n                    temb=emb,\n                    res_hidden_states_tuple=res_samples,\n                    encoder_hidden_states=encoder_hidden_states,\n                )\n            else:\n                sample = upsample_block(hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples)\n\n        # 6. post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/unet_blocks.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .attention import AttentionBlockNew\nfrom .attention import SpatialTransformer\nfrom .resnet import Downsample2D\nfrom .resnet import FirDownsample2D\nfrom .resnet import FirUpsample2D\nfrom .resnet import ResnetBlock\nfrom .resnet import Upsample2D\n\n\ndef get_down_block(\n    down_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    temb_channels,\n    add_downsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n    downsample_padding=None,\n):\n    down_block_type = down_block_type[7:] if down_block_type.startswith(\"UNetRes\") else down_block_type\n    if down_block_type == \"DownBlock2D\":\n        return DownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnDownBlock2D\":\n        return AttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n           
 add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"CrossAttnDownBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnDownBlock2D\")\n        return CrossAttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"SkipDownBlock2D\":\n        return SkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnSkipDownBlock2D\":\n        return AttnSkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"DownEncoderBlock2D\":\n        return DownEncoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n  
           out_channels=out_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    raise ValueError(f\"{down_block_type} does not exist.\")\n\n\ndef get_up_block(\n    up_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    prev_output_channel,\n    temb_channels,\n    add_upsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n):\n    up_block_type = up_block_type[7:] if up_block_type.startswith(\"UNetRes\") else up_block_type\n    if up_block_type == \"UpBlock2D\":\n        return UpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"CrossAttnUpBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnUpBlock2D\")\n        return CrossAttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"AttnUpBlock2D\":\n        return AttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n     
       resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"SkipUpBlock2D\":\n        return SkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"AttnSkipUpBlock2D\":\n        return AttnSkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"UpDecoderBlock2D\":\n        return UpDecoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    raise ValueError(f\"{up_block_type} does not exist.\")\n\n\nclass UNetMidBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        **kwargs,\n    ):\n        super().__init__()\n\n        
self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                AttentionBlockNew(\n                    in_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            if 
self.attention_type == \"default\":\n                hidden_states = attn(hidden_states)\n            else:\n                hidden_states = attn(hidden_states, encoder_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass UNetMidBlock2DCrossAttn(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        cross_attention_dim=1280,\n        **kwargs,\n    ):\n        super().__init__()\n\n        self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                SpatialTransformer(\n                    in_channels,\n                    attn_num_head_channels,\n                    in_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n            resnets.append(\n                
ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_hidden_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            hidden_states = attn(hidden_states, encoder_hidden_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass AttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    
temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            # the last resnet emits `out_channels` features, so downsample from there\n            self.downsamplers = nn.LayerList([\n                Downsample2D(out_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass CrossAttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        
resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            # the last resnet emits `out_channels` features, so downsample from there\n            self.downsamplers = nn.LayerList([\n                Downsample2D(out_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, 
hidden_states, temb=None, encoder_hidden_states=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                
Downsample2D(out_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        
self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            # the last resnet emits `out_channels` features, so downsample from there\n            self.downsamplers = nn.LayerList([\n                Downsample2D(out_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnDownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    
pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            # the last resnet emits `out_channels` features, so downsample from there\n            self.downsamplers = nn.LayerList([\n                Downsample2D(out_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            downsample_padding=1,\n            add_downsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        
self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            self.attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            
self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass SkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_downsample=True,\n            downsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    
time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass AttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        
out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attention_type=\"default\",\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, 
res_hidden_states_tuple, temb=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass CrossAttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        prev_output_channel: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n     
               time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, encoder_hidden_states=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 
32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None):\n        for resnet in self.resnets:\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: 
int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnUpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n    
    output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            
out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            upsample_padding=1,\n            add_upsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions.append(\n            AttentionBlockNew(\n                out_channels,\n                num_head_channels=attn_num_head_channels,\n                rescale_output_factor=output_scale_factor,\n                eps=resnet_eps,\n            ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n        
        in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        hidden_states = self.attentions[0](hidden_states)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n       
     skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample + skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n\n\nclass SkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_upsample=True,\n            upsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n                in_channels=out_channels,\n       
         out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n            skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample 
+ skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/models/vae.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass Encoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            down_block_types=(\"DownEncoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n            double_z=True,\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, stride=1, padding=1)\n\n        self.mid_block = None\n        self.down_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=self.layers_per_block,\n                in_channels=input_channel,\n                
out_channels=output_channel,\n                add_downsample=not is_final_block,\n                resnet_eps=1e-6,\n                downsample_padding=0,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[-1], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n\n        conv_out_channels = 2 * out_channels if double_z else out_channels\n        self.conv_out = nn.Conv2D(block_out_channels[-1], conv_out_channels, 3, padding=1)\n\n    def forward(self, x):\n        sample = x\n        sample = self.conv_in(sample)\n\n        # down\n        for down_block in self.down_blocks:\n            sample = down_block(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # post-process\n        sample = self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass Decoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            up_block_types=(\"UpDecoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[-1], kernel_size=3, stride=1, padding=1)\n\n 
       self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=self.layers_per_block + 1,\n                in_channels=prev_output_channel,\n                out_channels=output_channel,\n                prev_output_channel=None,\n                add_upsample=not is_final_block,\n                resnet_eps=1e-6,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, z):\n        sample = z\n        sample = self.conv_in(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # up\n        for up_block in self.up_blocks:\n            sample = up_block(sample)\n\n        # post-process\n        sample = 
self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass VectorQuantizer(nn.Layer):\n    \"\"\"\n    Improved version over VectorQuantizer, can be used as a drop-in replacement. Mostly avoids costly matrix\n    multiplications and allows for post-hoc remapping of indices.\n    \"\"\"\n\n    # NOTE: due to a bug the beta term was applied to the wrong term. for\n    # backwards compatibility we use the buggy version by default, but you can\n    # specify legacy=False to fix it.\n    def __init__(self, n_e, e_dim, beta, remap=None, unknown_index=\"random\", sane_index_shape=False, legacy=True):\n        super().__init__()\n        self.n_e = n_e\n        self.e_dim = e_dim\n        self.beta = beta\n        self.legacy = legacy\n\n        self.embedding = nn.Embedding(self.n_e, self.e_dim)\n        self.embedding.weight.data.uniform_(-1.0 / self.n_e, 1.0 / self.n_e)\n\n        self.remap = remap\n        if self.remap is not None:\n            self.register_buffer(\"used\", paddle.to_tensor(np.load(self.remap)))\n            self.re_embed = self.used.shape[0]\n            self.unknown_index = unknown_index  # \"random\" or \"extra\" or integer\n            if self.unknown_index == \"extra\":\n                self.unknown_index = self.re_embed\n                self.re_embed = self.re_embed + 1\n            print(f\"Remapping {self.n_e} indices to {self.re_embed} indices. 
\"\n                  f\"Using {self.unknown_index} for unknown indices.\")\n        else:\n            self.re_embed = n_e\n\n        self.sane_index_shape = sane_index_shape\n\n    def remap_to_used(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        match = (inds[:, :, None] == used[None, None, ...]).astype(\"int64\")\n        new = match.argmax(-1)\n        unknown = match.sum(2) < 1\n        if self.unknown_index == \"random\":\n            new[unknown] = paddle.randint(0, self.re_embed, shape=new[unknown].shape)\n        else:\n            new[unknown] = self.unknown_index\n        return new.reshape(ishape)\n\n    def unmap_to_all(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        if self.re_embed > self.used.shape[0]:  # extra token\n            inds[inds >= self.used.shape[0]] = 0  # simply set to zero\n        back = paddle.gather(used[None, :][inds.shape[0] * [0], :], inds, axis=1)\n        return back.reshape(ishape)\n\n    def forward(self, z):\n        # reshape z -> (batch, height, width, channel) and flatten\n        z = z.transpose([0, 2, 3, 1])\n        z_flattened = z.reshape([-1, self.e_dim])\n        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z\n\n        d = (paddle.sum(z_flattened**2, axis=1, keepdim=True) + paddle.sum(self.embedding.weight**2, axis=1) -\n             2 * paddle.einsum(\"bd,dn->bn\", z_flattened, self.embedding.weight.t()))\n\n        min_encoding_indices = paddle.argmin(d, axis=1)\n        z_q = self.embedding(min_encoding_indices).reshape(z.shape)\n        perplexity = None\n        min_encodings = None\n\n        # compute loss for embedding\n        if not self.legacy:\n            loss = self.beta * paddle.mean((z_q.detach() - z)**2) + paddle.mean((z_q - z.detach())**2)\n        else:\n         
   loss = paddle.mean((z_q.detach() - z)**2) + self.beta * paddle.mean((z_q - z.detach())**2)\n\n        # preserve gradients\n        z_q = z + (z_q - z).detach()\n\n        # reshape back to match original input shape\n        z_q = z_q.transpose([0, 3, 1, 2])\n\n        if self.remap is not None:\n            min_encoding_indices = min_encoding_indices.reshape([z.shape[0], -1])  # add batch axis\n            min_encoding_indices = self.remap_to_used(min_encoding_indices)\n            min_encoding_indices = min_encoding_indices.reshape([-1, 1])  # flatten\n\n        if self.sane_index_shape:\n            min_encoding_indices = min_encoding_indices.reshape([z_q.shape[0], z_q.shape[2], z_q.shape[3]])\n\n        return z_q, loss, (perplexity, min_encodings, min_encoding_indices)\n\n    def get_codebook_entry(self, indices, shape):\n        # shape specifying (batch, height, width, channel)\n        if self.remap is not None:\n            indices = indices.reshape([shape[0], -1])  # add batch axis\n            indices = self.unmap_to_all(indices)\n            indices = indices.flatten()  # flatten again\n\n        # get quantized latent vectors\n        z_q = self.embedding(indices)\n\n        if shape is not None:\n            z_q = z_q.reshape(shape)\n            # reshape back to match original input shape\n            z_q = z_q.transpose([0, 3, 1, 2])\n\n        return z_q\n\n\nclass DiagonalGaussianDistribution(object):\n\n    def __init__(self, parameters, deterministic=False):\n        self.parameters = parameters\n        self.mean, self.logvar = paddle.chunk(parameters, 2, axis=1)\n        self.logvar = paddle.clip(self.logvar, -30.0, 20.0)\n        self.deterministic = deterministic\n        self.std = paddle.exp(0.5 * self.logvar)\n        self.var = paddle.exp(self.logvar)\n        if self.deterministic:\n            self.var = self.std = paddle.zeros_like(self.mean)\n\n    def sample(self):\n        x = self.mean + self.std * 
paddle.randn(self.mean.shape)\n        return x\n\n    def kl(self, other=None):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        else:\n            if other is None:\n                return 0.5 * paddle.sum(paddle.pow(self.mean, 2) + self.var - 1.0 - self.logvar, axis=[1, 2, 3])\n            else:\n                return 0.5 * paddle.sum(\n                    paddle.pow(self.mean - other.mean, 2) / other.var + self.var / other.var - 1.0 - self.logvar +\n                    other.logvar,\n                    axis=[1, 2, 3],\n                )\n\n    def nll(self, sample, dims=[1, 2, 3]):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        logtwopi = np.log(2.0 * np.pi)\n        return 0.5 * paddle.sum(logtwopi + self.logvar + paddle.pow(sample - self.mean, 2) / self.var, axis=dims)\n\n    def mode(self):\n        return self.mean\n\n\nclass VQModel(ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", ),\n        up_block_types=(\"UpDecoderBlock2D\", ),\n        block_out_channels=(64, ),\n        layers_per_block=1,\n        act_fn=\"silu\",\n        latent_channels=3,\n        sample_size=32,\n        num_vq_embeddings=256,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n            double_z=False,\n        )\n\n        self.quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n        self.quantize = VectorQuantizer(num_vq_embeddings,\n                                        latent_channels,\n                                        beta=0.25,\n            
                            remap=None,\n                                        sane_index_shape=False)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n    def encode(self, x):\n        h = self.encoder(x)\n        h = self.quant_conv(h)\n        return h\n\n    def decode(self, h, force_not_quantize=False):\n        # also go through quantization layer\n        if not force_not_quantize:\n            quant, emb_loss, info = self.quantize(h)\n        else:\n            quant = h\n        quant = self.post_quant_conv(quant)\n        dec = self.decoder(quant)\n        return dec\n\n    def forward(self, sample):\n        x = sample\n        h = self.encode(x)\n        dec = self.decode(h)\n        return dec\n\n\nclass AutoencoderKL(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\"),\n        up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\"),\n        block_out_channels=(128, 256, 512, 512),\n        layers_per_block=2,\n        act_fn=\"silu\",\n        latent_channels=4,\n        sample_size=512,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            
act_fn=act_fn,\n            double_z=True,\n        )\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n        self.quant_conv = nn.Conv2D(2 * latent_channels, 2 * latent_channels, 1)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n    def encode(self, x):\n        h = self.encoder(x)\n        moments = self.quant_conv(h)\n        posterior = DiagonalGaussianDistribution(moments)\n        return posterior\n\n    def decode(self, z):\n        z = self.post_quant_conv(z)\n        dec = self.decoder(z)\n        return dec\n\n    def forward(self, sample, sample_posterior=False):\n        x = sample\n        posterior = self.encode(x)\n        if sample_posterior:\n            z = posterior.sample()\n        else:\n            z = posterior.mode()\n        dec = self.decode(z)\n        return dec\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/README.md",
    "content": "# Schedulers\n\n- Schedulers are the algorithms to use diffusion models in inference as well as for training. They include the noise schedules and define algorithm-specific diffusion steps.\n- Schedulers can be used interchangable between diffusion models in inference to find the preferred trade-off between speed and generation quality.\n- Schedulers are available in numpy, but can easily be transformed into Py\n\n## API\n\n- Schedulers should provide one or more `def step(...)` functions that should be called iteratively to unroll the diffusion loop during\nthe forward pass.\n- Schedulers should be framework-agnostic, but provide a simple functionality to convert the scheduler into a specific framework, such as PyTorch\nwith a `set_format(...)` method.\n\n## Examples\n\n- The DDPM scheduler was proposed in [Denoising Diffusion Probabilistic Models](https://arxiv.org/abs/2006.11239) and can be found in [scheduling_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddpm.py). An example of how to use this scheduler can be found in [pipeline_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddpm.py).\n- The DDIM scheduler was proposed in [Denoising Diffusion Implicit Models](https://arxiv.org/abs/2010.02502) and can be found in [scheduling_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddim.py). An example of how to use this scheduler can be found in [pipeline_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddim.py).\n- The PNDM scheduler was proposed in [Pseudo Numerical Methods for Diffusion Models on Manifolds](https://arxiv.org/abs/2202.09778) and can be found in [scheduling_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_pndm.py). 
An example of how to use this scheduler can be found in [pipeline_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py).\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .scheduling_ddim import DDIMScheduler\nfrom .scheduling_ddpm import DDPMScheduler\nfrom .scheduling_karras_ve import KarrasVeScheduler\nfrom .scheduling_lms_discrete import LMSDiscreteScheduler\nfrom .scheduling_pndm import PNDMScheduler\nfrom .scheduling_sde_ve import ScoreSdeVeScheduler\nfrom .scheduling_sde_vp import ScoreSdeVpScheduler\nfrom .scheduling_utils import SchedulerMixin\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_ddim.py",
    "content": "# Copyright 2022 Stanford University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This code is strongly influenced by https://github.com/pesser/pypaddle_diffusion\n# and https://github.com/hojonathanho/diffusion\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t\n    from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDIMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        clip_sample=True,\n        set_alpha_to_one=True,\n        tensor_format=\"pd\",\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} does is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        # At every step in ddim, we are looking into the previous alphas_cumprod\n        # For the final step, there is no previous 
alphas_cumprod because we are already at 0\n        # `set_alpha_to_one` decides whether we set this paratemer simply to one or\n        # whether we use the final alpha of the \"non-previous\" one.\n        self.final_alpha_cumprod = np.array(1.0) if set_alpha_to_one else self.alphas_cumprod[0]\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def _get_variance(self, timestep, prev_timestep):\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        variance = (beta_prod_t_prev / beta_prod_t) * (1 - alpha_prod_t / alpha_prod_t_prev)\n\n        return variance\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.timesteps += offset\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        eta: float = 0.0,\n        use_clipped_model_output: bool = False,\n        generator=None,\n    ):\n        # See formulas (12) and (16) of DDIM paper https://arxiv.org/pdf/2010.02502.pdf\n        # Ideally, read DDIM paper in-detail understanding\n\n        # Notation (<variable name> -> <name in paper>\n        # - pred_noise_t -> e_theta(x_t, t)\n        # - pred_original_sample -> f_theta(x_t, t) or x_0\n        # - std_dev_t -> sigma_t\n       
 # - eta -> η\n        # - pred_sample_direction -> \"direction pointing to x_t\"\n        # - pred_prev_sample -> \"x_t-1\"\n\n        # 1. get previous step value (=t-1)\n        prev_timestep = timestep - self.config.num_train_timesteps // self.num_inference_steps\n\n        # 2. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n\n        # 3. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n\n        # 4. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 5. compute variance: \"sigma_t(η)\" -> see formula (16)\n        # σ_t = sqrt((1 − α_t−1)/(1 − α_t)) * sqrt(1 − α_t/α_t−1)\n        variance = self._get_variance(timestep, prev_timestep)\n        std_dev_t = eta * variance**(0.5)\n\n        if use_clipped_model_output:\n            # the model_output is always re-derived from the clipped x_0 in Glide\n            model_output = (sample - alpha_prod_t**(0.5) * pred_original_sample) / beta_prod_t**(0.5)\n\n        # 6. compute \"direction pointing to x_t\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_sample_direction = (1 - alpha_prod_t_prev - std_dev_t**2)**(0.5) * model_output\n\n        # 7. 
compute x_t without \"random noise\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        prev_sample = alpha_prod_t_prev**(0.5) * pred_original_sample + pred_sample_direction\n\n        if eta > 0:\n            noise = paddle.randn(model_output.shape)\n            variance = self._get_variance(timestep, prev_timestep)**(0.5) * eta * noise\n\n            if not paddle.is_tensor(model_output):\n                variance = variance.numpy()\n\n            prev_sample = prev_sample + variance\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_ddpm.py",
    "content": "# Copyright 2022 UC Berkely Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDPMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        variance_type=\"fixed_small\",\n        clip_sample=True,\n        tensor_format=\"pd\",\n    ):\n\n        if trained_betas is not None:\n            self.betas = np.asarray(trained_betas)\n        elif beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n        self.one = np.array(1.0)\n\n        # setable values\n        
self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n        self.variance_type = variance_type\n\n    def set_timesteps(self, num_inference_steps):\n        num_inference_steps = min(self.config.num_train_timesteps, num_inference_steps)\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.set_format(tensor_format=self.tensor_format)\n\n    def _get_variance(self, t, predicted_variance=None, variance_type=None):\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n\n        # For t > 0, compute predicted variance βt (see formula (6) and (7) from https://arxiv.org/pdf/2006.11239.pdf)\n        # and sample from it to get previous sample\n        # x_{t-1} ~ N(pred_prev_sample, variance) == add variance to pred_sample\n        variance = (1 - alpha_prod_t_prev) / (1 - alpha_prod_t) * self.betas[t]\n\n        if variance_type is None:\n            variance_type = self.config.variance_type\n\n        # hacks - were probs added for training stability\n        if variance_type == \"fixed_small\":\n            variance = self.clip(variance, min_value=1e-20)\n        # for rl-diffuser https://arxiv.org/abs/2205.09991\n        elif variance_type == \"fixed_small_log\":\n            variance = self.log(self.clip(variance, min_value=1e-20))\n        elif variance_type == \"fixed_large\":\n            variance = self.betas[t]\n        elif variance_type == \"fixed_large_log\":\n            # Glide max_log\n            variance = self.log(self.betas[t])\n        elif variance_type == \"learned\":\n            return predicted_variance\n        elif 
variance_type == \"learned_range\":\n            min_log = variance\n            max_log = self.betas[t]\n            frac = (predicted_variance + 1) / 2\n            variance = frac * max_log + (1 - frac) * min_log\n\n        return variance\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        predict_epsilon=True,\n        generator=None,\n    ):\n        t = timestep\n\n        if model_output.shape[1] == sample.shape[1] * 2 and self.variance_type in [\"learned\", \"learned_range\"]:\n            model_output, predicted_variance = paddle.split(model_output, sample.shape[1], axis=1)\n        else:\n            predicted_variance = None\n\n        # 1. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # 2. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf\n        if predict_epsilon:\n            pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n        else:\n            pred_original_sample = model_output\n\n        # 3. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 4. Compute coefficients for pred_original_sample x_0 and current sample x_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_original_sample_coeff = (alpha_prod_t_prev**(0.5) * self.betas[t]) / beta_prod_t\n        current_sample_coeff = self.alphas[t]**(0.5) * beta_prod_t_prev / beta_prod_t\n\n        # 5. 
Compute predicted previous sample µ_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_prev_sample = pred_original_sample_coeff * pred_original_sample + current_sample_coeff * sample\n\n        # 6. Add noise\n        variance = 0\n        if t > 0:\n            noise = self.randn_like(model_output)\n            variance = (self._get_variance(t, predicted_variance=predicted_variance)**0.5) * noise\n\n        pred_prev_sample = pred_prev_sample + variance\n\n        return {\"prev_sample\": pred_prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_karras_ve.py",
    "content": "# Copyright 2022 NVIDIA and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass KarrasVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    Stochastic sampling from Karras et al. [1] tailored to the Variance-Expanding (VE) models [2]. Use Algorithm 2 and\n    the VE column of Table 1 from [1] for reference.\n\n    [1] Karras, Tero, et al. \"Elucidating the Design Space of Diffusion-Based Generative Models.\"\n    https://arxiv.org/abs/2206.00364 [2] Song, Yang, et al. \"Score-based generative modeling through stochastic\n    differential equations.\" https://arxiv.org/abs/2011.13456\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        sigma_min=0.02,\n        sigma_max=100,\n        s_noise=1.007,\n        s_churn=80,\n        s_min=0.05,\n        s_max=50,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        For more details on the parameters, see the original paper's Appendix E.: \"Elucidating the Design Space of\n        Diffusion-Based Generative Models.\" https://arxiv.org/abs/2206.00364. 
The grid search values used to find the\n        optimal {s_noise, s_churn, s_min, s_max} for a specific model are described in Table 5 of the paper.\n\n        Args:\n            sigma_min (`float`): minimum noise magnitude\n            sigma_max (`float`): maximum noise magnitude\n            s_noise (`float`): the amount of additional noise to counteract loss of detail during sampling.\n                A reasonable range is [1.000, 1.011].\n            s_churn (`float`): the parameter controlling the overall amount of stochasticity.\n                A reasonable range is [0, 100].\n            s_min (`float`): the start value of the sigma range where we add noise (enable stochasticity).\n                A reasonable range is [0, 10].\n            s_max (`float`): the end value of the sigma range where we add noise.\n                A reasonable range is [0.2, 80].\n        \"\"\"\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = None\n        self.schedule = None  # sigma(t_i)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.num_inference_steps)[::-1].copy()\n        self.schedule = [(self.sigma_max * (self.sigma_min**2 / self.sigma_max**2)**(i / (num_inference_steps - 1)))\n                         for i in self.timesteps]\n        self.schedule = np.array(self.schedule, dtype=np.float32)\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def add_noise_to_input(self, sample, sigma, generator=None):\n        \"\"\"\n        Explicit Langevin-like \"churn\" step of adding noise to the sample according to a factor gamma_i ≥ 0 to reach a\n        higher noise level sigma_hat = sigma_i + gamma_i*sigma_i.\n        \"\"\"\n        if self.s_min <= sigma <= self.s_max:\n            gamma = min(self.s_churn / 
self.num_inference_steps, 2**0.5 - 1)\n        else:\n            gamma = 0\n\n        # sample eps ~ N(0, S_noise^2 * I)\n        eps = self.s_noise * paddle.randn(sample.shape)\n        sigma_hat = sigma + gamma * sigma\n        sample_hat = sample + ((sigma_hat**2 - sigma**2)**0.5 * eps)\n\n        return sample_hat, sigma_hat\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_hat + sigma_hat * model_output\n        derivative = (sample_hat - pred_original_sample) / sigma_hat\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * derivative\n\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n        sample_prev: Union[paddle.Tensor, np.ndarray],\n        derivative: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_prev + sigma_prev * model_output\n        derivative_corr = (sample_prev - pred_original_sample) / sigma_prev\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * (0.5 * derivative + 0.5 * derivative_corr)\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative_corr}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        raise NotImplementedError()\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_lms_discrete.py",
    "content": "# Copyright 2022 Katherine Crowson and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom scipy import integrate\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass LMSDiscreteScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        Linear Multistep Scheduler for discrete beta schedules. 
Based on the original k-diffusion implementation by\n        Katherine Crowson:\n        https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181\n        \"\"\"\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.sigmas = ((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self.derivatives = []\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def get_lms_coefficient(self, order, t, current_order):\n        \"\"\"\n        Compute a linear multistep coefficient\n        \"\"\"\n\n        def lms_derivative(tau):\n            prod = 1.0\n            for k in range(order):\n                if current_order == k:\n                    continue\n                prod *= (tau - self.sigmas[t - k]) / (self.sigmas[t - current_order] - self.sigmas[t - k])\n            return prod\n\n        integrated_coeff = integrate.quad(lms_derivative, self.sigmas[t], self.sigmas[t + 1], epsrel=1e-4)[0]\n\n        return integrated_coeff\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.linspace(self.num_train_timesteps - 1, 0, num_inference_steps, dtype=float)\n\n        
low_idx = np.floor(self.timesteps).astype(int)\n        high_idx = np.ceil(self.timesteps).astype(int)\n        frac = np.mod(self.timesteps, 1.0)\n        sigmas = np.array(((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5)\n        sigmas = (1 - frac) * sigmas[low_idx] + frac * sigmas[high_idx]\n        self.sigmas = np.concatenate([sigmas, [0.0]])\n\n        self.derivatives = []\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        order: int = 4,\n    ):\n        sigma = self.sigmas[timestep]\n\n        # 1. compute predicted original sample (x_0) from sigma-scaled predicted noise\n        pred_original_sample = sample - sigma * model_output\n\n        # 2. Convert to an ODE derivative\n        derivative = (sample - pred_original_sample) / sigma\n        self.derivatives.append(derivative)\n        if len(self.derivatives) > order:\n            self.derivatives.pop(0)\n\n        # 3. Compute linear multistep coefficients\n        order = min(timestep + 1, order)\n        lms_coeffs = [self.get_lms_coefficient(order, timestep, curr_order) for curr_order in range(order)]\n\n        # 4. Compute previous sample based on the derivatives path\n        prev_sample = sample + sum(coeff * derivative\n                                   for coeff, derivative in zip(lms_coeffs, reversed(self.derivatives)))\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        alpha_prod = self.alphas_cumprod[timesteps]\n        alpha_prod = self.match_shape(alpha_prod, original_samples)\n\n        noisy_samples = (alpha_prod**0.5) * original_samples + ((1 - alpha_prod)**0.5) * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_pndm.py",
    "content": "# Copyright 2022 Zhejiang University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass PNDMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        tensor_format=\"pd\",\n        skip_prk_steps=False,\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.one = np.array(1.0)\n\n        # For now we only support F-PNDM, i.e. 
the runge-kutta method\n        # For more information on the algorithm please take a look at the paper: https://arxiv.org/pdf/2202.09778.pdf\n        # mainly at formula (9), (12), (13) and the Algorithm 2.\n        self.pndm_order = 4\n\n        # running values\n        self.cur_model_output = 0\n        self.counter = 0\n        self.cur_sample = None\n        self.ets = []\n\n        # setable values\n        self.num_inference_steps = None\n        self._timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self._offset = 0\n        self.prk_timesteps = None\n        self.plms_timesteps = None\n        self.timesteps = None\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self._timesteps = list(\n            range(0, self.config.num_train_timesteps, self.config.num_train_timesteps // num_inference_steps))\n        self._offset = offset\n        self._timesteps = [t + self._offset for t in self._timesteps]\n\n        if self.config.skip_prk_steps:\n            # for some models like stable diffusion the prk steps can/should be skipped to\n            # produce better results. 
When using PNDM with `self.config.skip_prk_steps` the implementation\n            # is based on crowsonkb's PLMS sampler implementation: https://github.com/CompVis/latent-diffusion/pull/51\n            self.prk_timesteps = []\n            self.plms_timesteps = list(reversed(self._timesteps[:-1] + self._timesteps[-2:-1] + self._timesteps[-1:]))\n        else:\n            prk_timesteps = np.array(self._timesteps[-self.pndm_order:]).repeat(2) + np.tile(\n                np.array([0, self.config.num_train_timesteps // num_inference_steps // 2]), self.pndm_order)\n            self.prk_timesteps = list(reversed(prk_timesteps[:-1].repeat(2)[1:-1]))\n            self.plms_timesteps = list(reversed(self._timesteps[:-3]))\n\n        self.timesteps = self.prk_timesteps + self.plms_timesteps\n\n        self.ets = []\n        self.counter = 0\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        if self.counter < len(self.prk_timesteps) and not self.config.skip_prk_steps:\n            return self.step_prk(model_output=model_output, timestep=timestep, sample=sample)\n        else:\n            return self.step_plms(model_output=model_output, timestep=timestep, sample=sample)\n\n    def step_prk(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the Runge-Kutta method. 
RK takes 4 forward passes to approximate the\n        solution to the differential equation.\n        \"\"\"\n        diff_to_prev = 0 if self.counter % 2 else self.config.num_train_timesteps // self.num_inference_steps // 2\n        prev_timestep = max(timestep - diff_to_prev, self.prk_timesteps[-1])\n        timestep = self.prk_timesteps[self.counter // 4 * 4]\n\n        if self.counter % 4 == 0:\n            self.cur_model_output += 1 / 6 * model_output\n            self.ets.append(model_output)\n            self.cur_sample = sample\n        elif (self.counter - 1) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 2) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 3) % 4 == 0:\n            model_output = self.cur_model_output + 1 / 6 * model_output\n            self.cur_model_output = 0\n\n        # cur_sample should not be `None`\n        cur_sample = self.cur_sample if self.cur_sample is not None else sample\n\n        prev_sample = self._get_prev_sample(cur_sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def step_plms(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the linear multi-step method. 
This needs only one forward pass per step, reusing the model outputs\n        stored from previous steps to approximate the solution.\n        \"\"\"\n        if not self.config.skip_prk_steps and len(self.ets) < 3:\n            raise ValueError(\n                f\"{self.__class__} can only be run AFTER scheduler has been run \"\n                \"in 'prk' mode for at least 12 iterations. \"\n                \"See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py \"\n                \"for more information.\")\n\n        prev_timestep = max(timestep - self.config.num_train_timesteps // self.num_inference_steps, 0)\n\n        if self.counter != 1:\n            self.ets.append(model_output)\n        else:\n            prev_timestep = timestep\n            timestep = timestep + self.config.num_train_timesteps // self.num_inference_steps\n\n        if len(self.ets) == 1 and self.counter == 0:\n            model_output = model_output\n            self.cur_sample = sample\n        elif len(self.ets) == 1 and self.counter == 1:\n            model_output = (model_output + self.ets[-1]) / 2\n            sample = self.cur_sample\n            self.cur_sample = None\n        elif len(self.ets) == 2:\n            model_output = (3 * self.ets[-1] - self.ets[-2]) / 2\n        elif len(self.ets) == 3:\n            model_output = (23 * self.ets[-1] - 16 * self.ets[-2] + 5 * self.ets[-3]) / 12\n        else:\n            model_output = (1 / 24) * (55 * self.ets[-1] - 59 * self.ets[-2] + 37 * self.ets[-3] - 9 * self.ets[-4])\n\n        prev_sample = self._get_prev_sample(sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def _get_prev_sample(self, sample, timestep, timestep_prev, model_output):\n        # See formula (9) of PNDM paper https://arxiv.org/pdf/2202.09778.pdf\n        # this function computes x_(t−δ) using the formula of (9)\n        # Note that x_t needs to be added to both sides of the equation\n\n   
     # Notation (<variable name> -> <name in paper>)\n        # alpha_prod_t -> α_t\n        # alpha_prod_t_prev -> α_(t−δ)\n        # beta_prod_t -> (1 - α_t)\n        # beta_prod_t_prev -> (1 - α_(t−δ))\n        # sample -> x_t\n        # model_output -> e_θ(x_t, t)\n        # prev_sample -> x_(t−δ)\n        alpha_prod_t = self.alphas_cumprod[timestep + 1 - self._offset]\n        alpha_prod_t_prev = self.alphas_cumprod[timestep_prev + 1 - self._offset]\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # corresponds to (α_(t−δ) - α_t) divided by\n        # denominator of x_t in formula (9) and plus 1\n        # Note: (α_(t−δ) - α_t) / (sqrt(α_t) * (sqrt(α_(t−δ)) + sqrt(α_t))) =\n        # sqrt(α_(t−δ)) / sqrt(α_t)\n        sample_coeff = (alpha_prod_t_prev / alpha_prod_t)**(0.5)\n\n        # corresponds to denominator of e_θ(x_t, t) in formula (9)\n        model_output_denom_coeff = alpha_prod_t * beta_prod_t_prev**(0.5) + (alpha_prod_t * beta_prod_t *\n                                                                             alpha_prod_t_prev)**(0.5)\n\n        # full formula (9)\n        prev_sample = (sample_coeff * sample -\n                       (alpha_prod_t_prev - alpha_prod_t) * model_output / model_output_denom_coeff)\n\n        return prev_sample\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_sde_ve.py",
    "content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pypaddle\n# TODO(Patrick, Anton, Suraj) - make scheduler framework indepedent and clean-up a bit\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    The variance exploding stochastic differential equation (SDE) scheduler.\n\n    :param snr: coefficient weighting the step from the model_output sample (from the network) to the random noise.\n    :param sigma_min: initial noise scale for sigma sequence in sampling procedure. The minimum sigma should mirror the\n            distribution of the data.\n    :param sigma_max: :param sampling_eps: the end value of sampling, where timesteps decrease progessively from 1 to\n    epsilon. :param correct_steps: number of correction steps performed on a produced sample. 
:param tensor_format:\n    \"np\" or \"pd\" for the expected format of samples passed to the Scheduler.\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=2000,\n        snr=0.15,\n        sigma_min=0.01,\n        sigma_max=1348,\n        sampling_eps=1e-5,\n        correct_steps=1,\n        tensor_format=\"pd\",\n    ):\n        # self.sigmas = None\n        # self.discrete_sigmas = None\n        #\n        # # setable values\n        # self.num_inference_steps = None\n        self.timesteps = None\n\n        self.set_sigmas(num_train_timesteps, sigma_min, sigma_max, sampling_eps)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, sampling_eps=None):\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.timesteps = np.linspace(1, sampling_eps, num_inference_steps)\n        elif tensor_format == \"pd\":\n            self.timesteps = paddle.linspace(1, sampling_eps, num_inference_steps)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_sigmas(self, num_inference_steps, sigma_min=None, sigma_max=None, sampling_eps=None):\n        sigma_min = sigma_min if sigma_min is not None else self.config.sigma_min\n        sigma_max = sigma_max if sigma_max is not None else self.config.sigma_max\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        if self.timesteps is None:\n            self.set_timesteps(num_inference_steps, sampling_eps)\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.discrete_sigmas = np.exp(np.linspace(np.log(sigma_min), np.log(sigma_max), 
num_inference_steps))\n            self.sigmas = np.array([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        elif tensor_format == \"pd\":\n            self.discrete_sigmas = paddle.exp(paddle.linspace(np.log(sigma_min), np.log(sigma_max),\n                                                              num_inference_steps))\n            self.sigmas = paddle.to_tensor([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def get_adjacent_sigma(self, timesteps, t):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.where(timesteps == 0, np.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n        elif tensor_format == \"pd\":\n            return paddle.where(timesteps == 0, paddle.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_seed(self, seed):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            np.random.seed(seed)\n        elif tensor_format == \"pd\":\n            paddle.seed(seed)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def step_pred(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Predict the sample at the previous timestep by reversing the SDE.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n        # TODO(Patrick) non-Paddle\n\n        timestep = timestep * paddle.ones(sample.shape[0])  # paddle.repeat_interleave(timestep, sample.shape[0])\n        timesteps = (timestep * (len(self.timesteps) - 
1)).astype(\"int64\")\n\n        sigma = self.discrete_sigmas[timesteps]\n        adjacent_sigma = self.get_adjacent_sigma(timesteps, timestep)\n        drift = self.zeros_like(sample)\n        diffusion = (sigma**2 - adjacent_sigma**2)**0.5\n\n        # equation 6 in the paper: the model_output modeled by the network is grad_x log pt(x)\n        # also equation 47 shows the analog from SDE models to ancestral sampling methods\n        drift = drift - diffusion[:, None, None, None]**2 * model_output\n\n        # equation 6: sample noise for the diffusion term\n        noise = self.randn_like(sample)\n        prev_sample_mean = sample - drift  # subtract because `dt` is a small negative timestep\n        # TODO is the variable diffusion the correct scaling term for the noise?\n        prev_sample = prev_sample_mean + diffusion[:, None, None, None] * noise  # add impact of diffusion field g\n\n        return {\"prev_sample\": prev_sample, \"prev_sample_mean\": prev_sample_mean}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Correct the predicted sample based on the model_output of the network. This is often run repeatedly\n        after making the prediction for the previous timestep.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n\n        # For small batch sizes, the paper \"suggest replacing norm(z) with sqrt(d), where d is the dim. 
of z\"\n        # sample noise for correction\n        noise = self.randn_like(sample)\n\n        # compute step size from the model_output, the noise, and the snr\n        grad_norm = self.norm(model_output)\n        noise_norm = self.norm(noise)\n        step_size = (self.config.snr * noise_norm / grad_norm)**2 * 2\n        step_size = step_size * paddle.ones(sample.shape[0])\n        # self.repeat_scalar(step_size, sample.shape[0])\n\n        # compute corrected sample: model_output term and noise term\n        prev_sample_mean = sample + step_size[:, None, None, None] * model_output\n        prev_sample = prev_sample_mean + ((step_size * 2)**0.5)[:, None, None, None] * noise\n\n        return {\"prev_sample\": prev_sample}\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_sde_vp.py",
    "content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework indepedent and clean-up a bit\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVpScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(self, num_train_timesteps=2000, beta_min=0.1, beta_max=20, sampling_eps=1e-3, tensor_format=\"np\"):\n\n        self.sigmas = None\n        self.discrete_sigmas = None\n        self.timesteps = None\n\n    def set_timesteps(self, num_inference_steps):\n        self.timesteps = paddle.linspace(1, self.config.sampling_eps, num_inference_steps)\n\n    def step_pred(self, score, x, t):\n        # TODO(Patrick) better comments + non-PyTorch\n        # postprocess model score\n        log_mean_coeff = (-0.25 * t**2 * (self.config.beta_max - self.config.beta_min) - 0.5 * t * self.config.beta_min)\n        std = paddle.sqrt(1.0 - paddle.exp(2.0 * log_mean_coeff))\n        score = -score / std[:, None, None, None]\n\n        # compute\n        dt = -1.0 / len(self.timesteps)\n\n        beta_t = self.config.beta_min + t * (self.config.beta_max - 
self.config.beta_min)\n        drift = -0.5 * beta_t[:, None, None, None] * x\n        diffusion = paddle.sqrt(beta_t)\n        drift = drift - diffusion[:, None, None, None]**2 * score\n        x_mean = x + drift * dt\n\n        # add noise\n        noise = self.randn_like(x)\n        x = x_mean + diffusion[:, None, None, None] * np.sqrt(-dt) * noise\n\n        return x, x_mean\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/diffusers/schedulers/scheduling_utils.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nSCHEDULER_CONFIG_NAME = \"scheduler_config.json\"\n\n\nclass SchedulerMixin:\n\n    config_name = SCHEDULER_CONFIG_NAME\n    ignore_for_config = [\"tensor_format\"]\n\n    def set_format(self, tensor_format=\"pd\"):\n        self.tensor_format = tensor_format\n        if tensor_format == \"pd\":\n            for key, value in vars(self).items():\n                if isinstance(value, np.ndarray):\n                    setattr(self, key, paddle.to_tensor(value))\n\n        return self\n\n    def clip(self, tensor, min_value=None, max_value=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.clip(tensor, min_value, max_value)\n        elif tensor_format == \"pd\":\n            return paddle.clip(tensor, min_value, max_value)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def log(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.log(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.log(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def match_shape(self, values: 
 Union[np.ndarray, paddle.Tensor], broadcast_array: Union[np.ndarray, paddle.Tensor]):\n        \"\"\"\n        Turns a 1-D array into an array or tensor with len(broadcast_array.shape) dims.\n\n        Args:\n            values: an array or tensor of values to extract.\n            broadcast_array: an array with a larger shape of K dimensions with the batch\n                dimension equal to the length of timesteps.\n        Returns:\n            a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n        \"\"\"\n\n        values = values.flatten()\n\n        while len(values.shape) < len(broadcast_array.shape):\n            values = values[..., None]\n\n        return values\n\n    def norm(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.linalg.norm(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.norm(tensor.reshape([tensor.shape[0], -1]), axis=-1).mean()\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def randn_like(self, tensor, generator=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.random.randn(*np.shape(tensor))\n        elif tensor_format == \"pd\":\n            # paddle has no randn_like; sample a new tensor with the same shape\n            return paddle.randn(tensor.shape)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def zeros_like(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.zeros_like(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.zeros_like(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport random\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport numpy as np\nimport paddle\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom PIL import Image\nfrom stable_diffusion.clip.clip.utils import build_model\nfrom stable_diffusion.clip.clip.utils import tokenize\nfrom stable_diffusion.diffusers import AutoencoderKL\nfrom stable_diffusion.diffusers import DDIMScheduler\nfrom stable_diffusion.diffusers import LMSDiscreteScheduler\nfrom stable_diffusion.diffusers import PNDMScheduler\nfrom stable_diffusion.diffusers import UNet2DConditionModel\nfrom tqdm.auto import tqdm\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"stable_diffusion\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass StableDiffusion:\n\n    def __init__(self):\n        self.vae = AutoencoderKL(in_channels=3,\n                                 out_channels=3,\n                                 
down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\",\n                                                   \"DownEncoderBlock2D\"),\n                                 up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\",\n                                                 \"UpDecoderBlock2D\"),\n                                 block_out_channels=(128, 256, 512, 512),\n                                 layers_per_block=2,\n                                 act_fn=\"silu\",\n                                 latent_channels=4,\n                                 sample_size=512)\n\n        self.unet = UNet2DConditionModel(sample_size=64,\n                                         in_channels=4,\n                                         out_channels=4,\n                                         center_input_sample=False,\n                                         flip_sin_to_cos=True,\n                                         freq_shift=0,\n                                         down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\",\n                                                           \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n                                         up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\",\n                                                         \"CrossAttnUpBlock2D\"),\n                                         block_out_channels=(320, 640, 1280, 1280),\n                                         layers_per_block=2,\n                                         downsample_padding=1,\n                                         mid_block_scale_factor=1,\n                                         act_fn=\"silu\",\n                                         norm_num_groups=32,\n                                         norm_eps=1e-5,\n                                         cross_attention_dim=768,\n                                         
attention_head_dim=8)\n\n        unet_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-unet.pdparams')\n        vae_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-vae.pdparams')\n        self.unet.set_dict(paddle.load(unet_path))\n        self.vae.set_dict(paddle.load(vae_path))\n        for parameter in self.unet.parameters():\n            parameter.stop_gradient = True\n        self.unet.eval()\n        for parameter in self.vae.parameters():\n            parameter.stop_gradient = True\n        self.vae.eval()\n\n        self.text_encoder = build_model()\n        for parameter in self.text_encoder.parameters():\n            parameter.stop_gradient = True\n        self.scheduler = PNDMScheduler(beta_start=0.00085,\n                                       beta_end=0.012,\n                                       beta_schedule=\"scaled_linear\",\n                                       num_train_timesteps=1000,\n                                       skip_prk_steps=True)\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       width_height: Optional[List[int]] = [512, 512],\n                       batch_size: Optional[int] = 1,\n                       num_inference_steps=50,\n                       guidance_scale=7.5,\n                       enable_fp16=False,\n                       seed=None,\n                       display_rate=5,\n                       use_gpu=True,\n                       output_dir: Optional[str] = 'stable_diffusion_out'):\n        \"\"\"\n        Create Stable Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. 
These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using SD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings; if specified, it will be used to construct prompts.\n        :param artist: Artist style; if specified, it will be used to construct prompts.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  
If you forget to use multiples of 64px in your dimensions, SD will adjust the dimensions of your image to make it so.\n        :param batch_size: This variable sets the number of still images you want SD to create for each prompt.\n        :param num_inference_steps: The number of inference steps.\n        :param guidance_scale: Increase the adherence to the conditional signal, which in this case is text, as well as overall sample quality.\n        :param enable_fp16: Whether to use float16.\n        :param seed: Random seed for reproducible results; if None, a random seed is used.\n        :param display_rate: Save and display intermediate results every `display_rate` steps.\n        :param use_gpu: whether to use gpu or not.\n        :param output_dir: Output directory.\n        :return: a DocumentArray object with one Document per text prompt.\n        \"\"\"\n        if seed is not None:\n            np.random.seed(seed)\n            random.seed(seed)\n            paddle.seed(seed)\n\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n            text_prompts = [text_prompts]\n        elif isinstance(text_prompts, list):\n            for i, prompt in enumerate(\n                    text_prompts):  # different from dd here, dd can have multiple prompts for one image with weight.\n                text_prompts[i] = prompt.rstrip(',.，。')\n                if style is not None:\n                    text_prompts[i] += \",{}\".format(style)\n                if artist is not None:\n                    text_prompts[i] += \",{},trending on artstation\".format(artist)\n\n        width, height = width_height\n        da_batches = DocumentArray()\n\n        for prompt in text_prompts:\n            d = Document(tags={'prompt': prompt})\n            da_batches.append(d)\n            for i in range(batch_size):\n                d.chunks.append(Document(tags={'prompt': prompt, 'image idx': i}))\n            d.chunks.append(Document(tags={'prompt': prompt, 'image idx': 'merged'}))\n            with paddle.amp.auto_cast(enable=enable_fp16, level='O2'):\n                prompts = [prompt] * batch_size\n                text_input = tokenize(prompts)\n                text_embeddings = self.text_encoder(text_input)\n                uncond_input = tokenize([\"\"] * batch_size)\n                uncond_embeddings = self.text_encoder(uncond_input)\n                text_embeddings = paddle.concat([uncond_embeddings, text_embeddings])\n\n     
           latents = paddle.randn((batch_size, self.unet.in_channels, height // 8, width // 8), )\n                if isinstance(self.scheduler, LMSDiscreteScheduler):\n                    latents = latents * self.scheduler.sigmas[0]\n\n                self.scheduler.set_timesteps(num_inference_steps)\n                for i, t in tqdm(enumerate(self.scheduler.timesteps)):\n                    # expand the latents if we are doing classifier-free guidance to avoid doing two forward passes.\n                    latent_model_input = paddle.concat([latents] * 2)\n\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        sigma = self.scheduler.sigmas[i]\n                        latent_model_input = latent_model_input / ((sigma**2 + 1)**0.5)\n\n                    # predict the noise residual\n                    noise_pred = self.unet(latent_model_input, t, encoder_hidden_states=text_embeddings)[\"sample\"]\n\n                    # perform guidance\n                    noise_pred_uncond, noise_pred_text = noise_pred.chunk(2)\n                    noise_pred = noise_pred_uncond + guidance_scale * (noise_pred_text - noise_pred_uncond)\n\n                    # compute the previous noisy sample x_t -> x_t-1\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        latents = self.scheduler.step(noise_pred, i, latents)[\"prev_sample\"]\n                    else:\n                        latents = self.scheduler.step(noise_pred, t, latents)[\"prev_sample\"]\n                    if i % display_rate == 0:\n                        # vae decode\n                        images = self.vae.decode(1 / 0.18215 * latents)\n                        images = (images / 2 + 0.5).clip(0, 1)\n                        merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                        merge_image = (merge_image * 255).round().astype(np.uint8)\n                        merge_image = 
Image.fromarray(merge_image)\n                        merge_image.save(os.path.join(output_dir, f'{prompt}-progress.png'))\n                        c = Document(tags={'step': i, 'prompt': prompt})\n                        c.load_pil_image_to_datauri(merge_image)\n                        d.chunks[-1].chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(merge_image)\n                        images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                        images = (images * 255).round().astype(np.uint8)\n                        for j in range(images.shape[0]):\n                            image = Image.fromarray(images[j])\n                            c = Document(tags={'step': i, 'prompt': prompt})\n                            c.load_pil_image_to_datauri(image)\n                            d.chunks[j].chunks.append(c)\n\n                # vae decode\n                images = self.vae.decode(1 / 0.18215 * latents)\n                images = (images / 2 + 0.5).clip(0, 1)\n                merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                merge_image = (merge_image * 255).round().astype(np.uint8)\n                merge_image = Image.fromarray(merge_image)\n                merge_image.save(os.path.join(output_dir, f'{prompt}-merge.png'))\n                display.clear_output(wait=True)\n                display.display(merge_image)\n                d.load_pil_image_to_datauri(merge_image)\n                d.chunks[-1].load_pil_image_to_datauri(merge_image)\n                images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                images = (images * 255).round().astype(np.uint8)\n                for j in range(images.shape[0]):\n                    image = Image.fromarray(images[j])\n                    image.save(os.path.join(output_dir, f'{prompt}-image-{j}.png'))\n                    d.chunks[j].load_pil_image_to_datauri(image)\n        return 
da_batches\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      width_height=args.width_height,\n                                      batch_size=args.batch_size,\n                                      num_inference_steps=args.num_inference_steps,\n                                      guidance_scale=args.guidance_scale,\n                                      enable_fp16=args.enable_fp16,\n                                      seed=args.seed,\n                                      display_rate=args.display_rate,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n      
  Add the command config options.\n        \"\"\"\n\n        self.arg_input_group.add_argument('--num_inference_steps',\n                                          type=int,\n                                          default=50,\n                                          help=\"The number of inference steps.\")\n\n        self.arg_input_group.add_argument(\n            '--guidance_scale',\n            type=float,\n            default=7.5,\n            help=\n            \"Increase the adherence to the conditional signal which in this case is text as well as overall sample quality.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\"During a diffusion run, you can monitor the progress of each image being created with this variable.\")\n\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--enable_fp16',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use float16 or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='stable_diffusion_out',\n                                           
help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like. The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element. E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide. If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using SD, to get a feel for how text gets translated into images by generative tools. 
These other apps use different technologies, but many of the same principles apply.'\n        )\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings. If specified, the style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style. If specified, the artist will be used to construct prompts.')\n\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[512, 512],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting. If you forget to use multiples of 64px in your dimensions, SD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"This variable sets the number of still images you want SD to create for each prompt.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion/requirements.txt",
    "content": "numpy\nftfy\nregex\ndocarray>=0.13.29\npyyaml\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/LICENSE",
    "content": "Copyright (c) 2022 Robin Rombach and Patrick Esser and contributors\n\nCreativeML Open RAIL-M\ndated August 22, 2022\n\nSection I: PREAMBLE\n\nMultimodal generative models are being widely adopted and used, and have the potential to transform the way artists, among other individuals, conceive and benefit from AI or ML technologies as a tool for content creation.\n\nNotwithstanding the current and potential benefits that these artifacts can bring to society at large, there are also concerns about potential misuses of them, either due to their technical limitations or ethical considerations.\n\nIn short, this license strives for both the open and responsible downstream use of the accompanying model. When it comes to the open character, we took inspiration from open source permissive licenses regarding the grant of IP rights. Referring to the downstream responsible use, we added use-based restrictions not permitting the use of the Model in very specific scenarios, in order for the licensor to be able to enforce the license in case potential misuses of the Model may occur. At the same time, we strive to promote open and responsible research on generative models for art and content generation.\n\nEven though downstream derivative versions of the model could be released under different licensing terms, the latter will always have to include - at minimum - the same use-based restrictions as the ones in the original license (this license). We believe in the intersection between open and responsible AI development; thus, this License aims to strike a balance between both in order to enable responsible open-science in the field of AI.\n\nThis License governs the use of the model (and its derivatives) and is informed by the model card associated with the model.\n\nNOW THEREFORE, You and Licensor agree as follows:\n\n1. 
Definitions\n\n- \"License\" means the terms and conditions for use, reproduction, and Distribution as defined in this document.\n- \"Data\" means a collection of information and/or content extracted from the dataset used with the Model, including to train, pretrain, or otherwise evaluate the Model. The Data is not licensed under this License.\n- \"Output\" means the results of operating a Model as embodied in informational content resulting therefrom.\n- \"Model\" means any accompanying machine-learning based assemblies (including checkpoints), consisting of learnt weights, parameters (including optimizer states), corresponding to the model architecture as embodied in the Complementary Material, that have been trained or tuned, in whole or in part on the Data, using the Complementary Material.\n- \"Derivatives of the Model\" means all modifications to the Model, works based on the Model, or any other model which is created or initialized by transfer of patterns of the weights, parameters, activations or output of the Model, to the other model, in order to cause the other model to perform similarly to the Model, including - but not limited to - distillation methods entailing the use of intermediate data representations or methods based on the generation of synthetic data by the Model for training the other model.\n- \"Complementary Material\" means the accompanying source code and scripts used to define, run, load, benchmark or evaluate the Model, and used to prepare data for training or evaluation, if any. This includes any accompanying documentation, tutorials, examples, etc, if any.\n- \"Distribution\" means any transmission, reproduction, publication or other sharing of the Model or Derivatives of the Model to a third party, including providing the Model as a hosted service made available by electronic or other remote means - e.g. 
API-based or web access.\n- \"Licensor\" means the copyright owner or entity authorized by the copyright owner that is granting the License, including the persons or entities that may have rights in the Model and/or distributing the Model.\n- \"You\" (or \"Your\") means an individual or Legal Entity exercising permissions granted by this License and/or making use of the Model for whichever purpose and in any field of use, including usage of the Model in an end-use application - e.g. chatbot, translator, image generator.\n- \"Third Parties\" means individuals or legal entities that are not under common control with Licensor or You.\n- \"Contribution\" means any work of authorship, including the original version of the Model and any modifications or additions to that Model or Derivatives of the Model thereof, that is intentionally submitted to Licensor for inclusion in the Model by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, \"submitted\" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Model, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as \"Not a Contribution.\"\n- \"Contributor\" means Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Model.\n\nSection II: INTELLECTUAL PROPERTY RIGHTS\n\nBoth copyright and patent grants apply to the Model, Derivatives of the Model and Complementary Material. The Model and Derivatives of the Model are subject to additional terms as described in Section III.\n\n2. 
Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare, publicly display, publicly perform, sublicense, and distribute the Complementary Material, the Model, and Derivatives of the Model.\n3. Grant of Patent License. Subject to the terms and conditions of this License and where and as applicable, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this paragraph) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Model and the Complementary Material, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Model to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Model and/or Complementary Material or a Contribution incorporated within the Model and/or Complementary Material constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for the Model and/or Work shall terminate as of the date such litigation is asserted or filed.\n\nSection III: CONDITIONS OF USAGE, DISTRIBUTION AND REDISTRIBUTION\n\n4. Distribution and Redistribution. You may host for Third Party remote access purposes (e.g. software-as-a-service), reproduce and distribute copies of the Model or Derivatives of the Model thereof in any medium, with or without modifications, provided that You meet the following conditions:\nUse-based restrictions as referenced in paragraph 5 MUST be included as an enforceable provision by You in any type of legal agreement (e.g. 
a license) governing the use and/or distribution of the Model or Derivatives of the Model, and You shall give notice to subsequent users You Distribute to, that the Model or Derivatives of the Model are subject to paragraph 5. This provision does not apply to the use of Complementary Material.\nYou must give any Third Party recipients of the Model or Derivatives of the Model a copy of this License;\nYou must cause any modified files to carry prominent notices stating that You changed the files;\nYou must retain all copyright, patent, trademark, and attribution notices excluding those notices that do not pertain to any part of the Model, Derivatives of the Model.\nYou may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions - respecting paragraph 4.a. - for use, reproduction, or Distribution of Your modifications, or for any such Derivatives of the Model as a whole, provided Your use, reproduction, and Distribution of the Model otherwise complies with the conditions stated in this License.\n5. Use-based restrictions. The restrictions set forth in Attachment A are considered Use-based restrictions. Therefore You cannot use the Model and the Derivatives of the Model for the specified restricted uses. You may use the Model subject to this License, including only for lawful purposes and in accordance with the License. Use may include creating any content with, finetuning, updating, running, training, evaluating and/or reparametrizing the Model. You shall require all of Your users who use the Model or a Derivative of the Model to comply with the terms of this paragraph (paragraph 5).\n6. The Output You Generate. Except as set forth herein, Licensor claims no rights in the Output You generate using the Model. You are accountable for the Output you generate and its subsequent uses. No use of the output can contravene any provision as stated in the License.\n\nSection IV: OTHER PROVISIONS\n\n7. 
Updates and Runtime Restrictions. To the maximum extent permitted by law, Licensor reserves the right to restrict (remotely or otherwise) usage of the Model in violation of this License, update the Model through electronic means, or modify the Output of the Model based on updates. You shall undertake reasonable efforts to use the latest version of the Model.\n8. Trademarks and related. Nothing in this License permits You to make use of Licensors’ trademarks, trade names, logos or to otherwise suggest endorsement or misrepresent the relationship between the parties; and any rights not expressly granted herein are reserved by the Licensors.\n9. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Model and the Complementary Material (and each Contributor provides its Contributions) on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Model, Derivatives of the Model, and the Complementary Material and assume any risks associated with Your exercise of permissions under this License.\n10. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Model and the Complementary Material (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.\n11. Accepting Warranty or Additional Liability. While redistributing the Model, Derivatives of the Model and the Complementary Material thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.\n12. 
If any provision of this License is held to be invalid, illegal or unenforceable, the remaining provisions shall be unaffected thereby and remain valid as if such provision had not been set forth herein.\n\nEND OF TERMS AND CONDITIONS\n\n\n\n\nAttachment A\n\nUse Restrictions\n\nYou agree not to use the Model or Derivatives of the Model:\n- In any way that violates any applicable national, federal, state, local or international law or regulation;\n- For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;\n- To generate or disseminate verifiably false information and/or content with the purpose of harming others;\n- To generate or disseminate personal identifiable information that can be used to harm an individual;\n- To defame, disparage or otherwise harass others;\n- For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;\n- For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;\n- To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;\n- For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;\n- To provide medical advice and medical results interpretation;\n- To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. 
by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use)."
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/README.md",
    "content": "# stable_diffusion_img2img\n\n|模型名称|stable_diffusion_img2img|\n| :--- | :---: |\n|类别|多模态-文图生成|\n|网络|CLIP Text Encoder+UNet+VAE|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|4.0GB|\n|最新更新日期|2022-08-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"A fantasy landscape, trending on artstation\"\n\n  - 输入初始图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192963415-4008af70-abb0-4734-9872-8363360138d8.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192963467-e78f5ca0-895a-47f2-8734-9dae1e7fd786.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192963443-7ca10651-16f4-4b39-9930-6022eee6fdd5.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\nStable Diffusion是一种潜在扩散模型(Latent Diffusion)，属于生成类模型，这类模型通过对随机噪声进行一步步地迭代降噪并采样来获得感兴趣的图像，当前取得了令人惊艳的效果。相比于Disco Diffusion, Stable Diffusion通过在低维度的潜在空间（lower dimensional latent space）而不是原像素空间来做迭代，极大地降低了内存和计算量的需求，并且在V100上一分钟之内即可以渲染出想要的图像，欢迎体验。该模块支持输入文本以及一个初始图像，对初始图像的内容进行改变。\n\n更多详情请参考论文：[High-Resolution Image Synthesis with Latent Diffusion Models](https://arxiv.org/abs/2112.10752)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stable_diffusion_img2img\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run stable_diffusion_img2img --text_prompts \"A fantasy landscape, trending on artstation\" --init_image /PATH/TO/IMAGE --output_dir stable_diffusion_img2img_out\n    
```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"stable_diffusion_img2img\")\n    text_prompts = [\"A fantasy landscape, trending on artstation\"]\n    # 生成图像, 默认会在stable_diffusion_img2img_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    # 您可以设置batch_size一次生成多张\n    da = module.generate_image(text_prompts=text_prompts, init_image='/PATH/TO/IMAGE', batch_size=2, output_dir='./stable_diffusion_img2img_out/')\n    # 展示所有的中间结果\n    da[0].chunks[-1].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks[-1].chunks.save_gif('stable_diffusion_img2img_out-merged-result.gif')\n    # da索引的是prompt, da[0].chunks索引的是该prompt下生成的第一张图，在batch_size不为1时能同时生成多张图\n    # 您也可以按照上述操作显示单张图，如第0张的生成过程\n    da[0].chunks[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_img2img-image-0-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            init_image,\n            strength: float = 0.8,\n            width_height: Optional[List[int]] = [512, 512],\n            seed: Optional[int] = None,\n            batch_size: Optional[int] = 1,\n            display_rate: Optional[int] = 5,\n            output_dir: Optional[str] = 'stable_diffusion_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。\n      - init_image(str|numpy.ndarray|PIL.Image): 输入的初始图像。\n      - strength(float): 控制添加到输入图像的噪声强度，取值范围0到1。越接近1.0，图像变化越大。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - batch_size(Optional[int]): 指定每个prompt一次生成的图像的数量。\n      - display_rate(Optional[int]): 
保存中间结果的频率，默认每5个step保存一次中间结果，如果不需要中间结果来让程序跑得更快，可以将这个值设大。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"stable_diffusion_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象，包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m stable_diffusion_img2img\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果。返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，返回后对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      # json.dumps 无法序列化bytes，需要解码为str\n      return base64.b64encode(data.tobytes()).decode('utf8')\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'A fantasy landscape, trending on artstation', 'init_image': cv2_to_base64(cv2.imread('/PATH/TO/IMAGE'))}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stable_diffusion_img2img\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果并反序列化\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 保存结果图\n    da[0].save_uri_to_file('stable_diffusion_img2img_out.png')\n    # 将生成过程保存为一个动态图gif\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_img2img_out.gif')\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install stable_diffusion_img2img == 1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation repo is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). We use this repo here for the text encoder in Stable Diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/clip/__init__.py",
    "content": "from .utils import *\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        max_len, batch_size, emb_dim = x.shape\n        head_dim = self.head_dim\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not a tuple here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass TextTransformer(nn.Layer):\n\n    def __init__(self, context_length: int, vocab_size: int, transformer_width: int, transformer_heads: int,\n                 transformer_layers: int):\n        super().__init__()\n        self.context_length = context_length\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # 
mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def forward(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n            self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n               
                             output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = 
self.encode_text(text)\n\n        # normalized features (Paddle's Tensor.norm takes `axis`, not the torch-style `dim`)\n        image_features = image_features / image_features.norm(axis=-1, keepdim=True)\n        text_features = text_features / text_features.norm(axis=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except ValueError:  # `first` does not occur in the rest of the word\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .model import TextTransformer\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['VITL14']\n\nURL = {'VITL14': os.path.join(os.path.dirname(__file__), 'pre_trained', 'vitl14_textencoder.pdparams')}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n          
  raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='VITL14'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'VITL14': build_vitl14_language_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    state_dict = model.state_dict()\n    for key, value in sd.items():\n        if key in state_dict:\n            state_dict[key] = value\n    model.load_dict(state_dict)\n    model.eval()\n    return model\n\n\ndef build_vitl14_language_model():\n    model = TextTransformer(context_length=77,\n                            vocab_size=49408,\n                            transformer_width=768,\n                            transformer_heads=12,\n                            transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/__init__.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = \"0.2.4\"\n\nfrom .models import AutoencoderKL, UNet2DConditionModel, UNet2DModel, VQModel\n\nfrom .schedulers import (DDIMScheduler, DDPMScheduler, KarrasVeScheduler, PNDMScheduler, SchedulerMixin,\n                         ScoreSdeVeScheduler, LMSDiscreteScheduler)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/configuration_utils.py",
    "content": "# coding=utf-8\n# Copyright 2022 The HuggingFace Inc. team.\n# Copyright (c) 2022, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\" ConfigMixinuration base class and utilities.\"\"\"\nimport functools\nimport inspect\nimport json\nimport os\nimport re\nfrom collections import OrderedDict\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Tuple\nfrom typing import Union\n\nfrom requests import HTTPError\n\nfrom paddlehub.common.logger import logger\n\nHUGGINGFACE_CO_RESOLVE_ENDPOINT = \"HUGGINGFACE_CO_RESOLVE_ENDPOINT\"\nDIFFUSERS_CACHE = \"./caches\"\n\n_re_configuration_file = re.compile(r\"config\\.(.*)\\.json\")\n\n\nclass ConfigMixin:\n    r\"\"\"\n    Base class for all configuration classes. 
Handles a few parameters common to all models' configurations as well as\n    methods for loading/downloading/saving configurations.\n\n    \"\"\"\n    config_name = \"model_config.json\"\n    ignore_for_config = []\n\n    def register_to_config(self, **kwargs):\n        if self.config_name is None:\n            raise NotImplementedError(f\"Make sure that {self.__class__} has defined a class name `config_name`\")\n        kwargs[\"_class_name\"] = self.__class__.__name__\n        kwargs[\"_diffusers_version\"] = \"0.0.1\"\n\n        for key, value in kwargs.items():\n            try:\n                setattr(self, key, value)\n            except AttributeError as err:\n                logger.error(f\"Can't set {key} with value {value} for {self}\")\n                raise err\n\n        if not hasattr(self, \"_internal_dict\"):\n            internal_dict = kwargs\n        else:\n            previous_dict = dict(self._internal_dict)\n            internal_dict = {**self._internal_dict, **kwargs}\n            logger.debug(f\"Updating config from {previous_dict} to {internal_dict}\")\n\n        self._internal_dict = FrozenDict(internal_dict)\n\n    def save_config(self, save_directory: Union[str, os.PathLike], push_to_hub: bool = False, **kwargs):\n        \"\"\"\n        Save a configuration object to the directory `save_directory`, so that it can be re-loaded using the\n        [`~ConfigMixin.from_config`] class method.\n\n        Args:\n            save_directory (`str` or `os.PathLike`):\n                Directory where the configuration JSON file will be saved (will be created if it does not exist).\n            kwargs:\n                Additional key word arguments passed along to the [`~utils.PushToHubMixin.push_to_hub`] method.\n        \"\"\"\n        if os.path.isfile(save_directory):\n            raise AssertionError(f\"Provided path ({save_directory}) should be a directory, not a file\")\n\n        os.makedirs(save_directory, exist_ok=True)\n\n        # If 
we save using the predefined names, we can load using `from_config`\n        output_config_file = os.path.join(save_directory, self.config_name)\n\n        self.to_json_file(output_config_file)\n        logger.info(f\"Configuration saved in {output_config_file}\")\n\n    @classmethod\n    def from_config(cls, pretrained_model_name_or_path: Union[str, os.PathLike], return_unused_kwargs=False, **kwargs):\n        config_dict = cls.get_config_dict(pretrained_model_name_or_path=pretrained_model_name_or_path, **kwargs)\n\n        init_dict, unused_kwargs = cls.extract_init_dict(config_dict, **kwargs)\n\n        model = cls(**init_dict)\n\n        if return_unused_kwargs:\n            return model, unused_kwargs\n        else:\n            return model\n\n    @classmethod\n    def get_config_dict(cls, pretrained_model_name_or_path: Union[str, os.PathLike],\n                        **kwargs) -> Tuple[Dict[str, Any], Dict[str, Any]]:\n        cache_dir = kwargs.pop(\"cache_dir\", DIFFUSERS_CACHE)\n        force_download = kwargs.pop(\"force_download\", False)\n        resume_download = kwargs.pop(\"resume_download\", False)\n        proxies = kwargs.pop(\"proxies\", None)\n        use_auth_token = kwargs.pop(\"use_auth_token\", None)\n        local_files_only = kwargs.pop(\"local_files_only\", False)\n        revision = kwargs.pop(\"revision\", None)\n        subfolder = kwargs.pop(\"subfolder\", None)\n\n        user_agent = {\"file_type\": \"config\"}\n\n        pretrained_model_name_or_path = str(pretrained_model_name_or_path)\n\n        if cls.config_name is None:\n            raise ValueError(\n                \"`self.config_name` is not defined. Note that one should not load a config from \"\n                \"`ConfigMixin`. 
Please make sure to define `config_name` in a class inheriting from `ConfigMixin`\")\n\n        if os.path.isfile(pretrained_model_name_or_path):\n            config_file = pretrained_model_name_or_path\n        elif os.path.isdir(pretrained_model_name_or_path):\n            if os.path.isfile(os.path.join(pretrained_model_name_or_path, cls.config_name)):\n                # Load from a PyTorch checkpoint\n                config_file = os.path.join(pretrained_model_name_or_path, cls.config_name)\n            elif subfolder is not None and os.path.isfile(\n                    os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)):\n                config_file = os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)\n            else:\n                raise EnvironmentError(\n                    f\"Error no file named {cls.config_name} found in directory {pretrained_model_name_or_path}.\")\n        else:\n            try:\n                # Load from URL or cache if already cached\n                from huggingface_hub import hf_hub_download\n                config_file = hf_hub_download(\n                    pretrained_model_name_or_path,\n                    filename=cls.config_name,\n                    cache_dir=cache_dir,\n                    force_download=force_download,\n                    proxies=proxies,\n                    resume_download=resume_download,\n                    local_files_only=local_files_only,\n                    use_auth_token=use_auth_token,\n                    user_agent=user_agent,\n                    subfolder=subfolder,\n                )\n\n            except HTTPError as err:\n                raise EnvironmentError(\"There was a specific connection error when trying to load\"\n                                       f\" {pretrained_model_name_or_path}:\\n{err}\")\n            except ValueError:\n                raise EnvironmentError(\n                    f\"We couldn't connect to 
'{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this model, couldn't find it\"\n                    f\" in the cached files and it looks like {pretrained_model_name_or_path} is not the path to a\"\n                    f\" directory containing a {cls.config_name} file.\\nCheck out your internet connection or see how to\"\n                    \" run the library in offline mode at\"\n                    \" 'https://huggingface.co/docs/diffusers/installation#offline-mode'.\")\n            except EnvironmentError:\n                raise EnvironmentError(\n                    f\"Can't load config for '{pretrained_model_name_or_path}'. If you were trying to load it from \"\n                    \"'https://huggingface.co/models', make sure you don't have a local directory with the same name. \"\n                    f\"Otherwise, make sure '{pretrained_model_name_or_path}' is the correct path to a directory \"\n                    f\"containing a {cls.config_name} file\")\n\n        try:\n            # Load config dict\n            config_dict = cls._dict_from_json_file(config_file)\n        except (json.JSONDecodeError, UnicodeDecodeError):\n            raise EnvironmentError(f\"It looks like the config file at '{config_file}' is not a valid JSON file.\")\n\n        return config_dict\n\n    @classmethod\n    def extract_init_dict(cls, config_dict, **kwargs):\n        expected_keys = set(dict(inspect.signature(cls.__init__).parameters).keys())\n        expected_keys.remove(\"self\")\n        # remove general kwargs if present in dict\n        if \"kwargs\" in expected_keys:\n            expected_keys.remove(\"kwargs\")\n        # remove keys to be ignored\n        if len(cls.ignore_for_config) > 0:\n            expected_keys = expected_keys - set(cls.ignore_for_config)\n        init_dict = {}\n        for key in expected_keys:\n            if key in kwargs:\n                # overwrite key\n                init_dict[key] = kwargs.pop(key)\n            elif key in config_dict:\n  
              # use value from config dict\n                init_dict[key] = config_dict.pop(key)\n\n        # dict.update returns None; merge the leftovers explicitly instead\n        unused_kwargs = {**config_dict, **kwargs}\n\n        passed_keys = set(init_dict.keys())\n        if len(expected_keys - passed_keys) > 0:\n            logger.warning(\n                f\"{expected_keys - passed_keys} was not found in config. Values will be initialized to default values.\")\n\n        return init_dict, unused_kwargs\n\n    @classmethod\n    def _dict_from_json_file(cls, json_file: Union[str, os.PathLike]):\n        with open(json_file, \"r\", encoding=\"utf-8\") as reader:\n            text = reader.read()\n        return json.loads(text)\n\n    def __repr__(self):\n        return f\"{self.__class__.__name__} {self.to_json_string()}\"\n\n    @property\n    def config(self) -> Dict[str, Any]:\n        return self._internal_dict\n\n    def to_json_string(self) -> str:\n        \"\"\"\n        Serializes this instance to a JSON string.\n\n        Returns:\n            `str`: String containing all the attributes that make up this configuration instance in JSON format.\n        \"\"\"\n        config_dict = self._internal_dict if hasattr(self, \"_internal_dict\") else {}\n        return json.dumps(config_dict, indent=2, sort_keys=True) + \"\\n\"\n\n    def to_json_file(self, json_file_path: Union[str, os.PathLike]):\n        \"\"\"\n        Save this instance to a JSON file.\n\n        Args:\n            json_file_path (`str` or `os.PathLike`):\n                Path to the JSON file in which this configuration instance's parameters will be saved.\n        \"\"\"\n        with open(json_file_path, \"w\", encoding=\"utf-8\") as writer:\n            writer.write(self.to_json_string())\n\n\nclass FrozenDict(OrderedDict):\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n        for key, value in self.items():\n            setattr(self, key, value)\n\n        self.__frozen = True\n\n    def __delitem__(self, 
*args, **kwargs):\n        raise Exception(f\"You cannot use ``__delitem__`` on a {self.__class__.__name__} instance.\")\n\n    def setdefault(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``setdefault`` on a {self.__class__.__name__} instance.\")\n\n    def pop(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``pop`` on a {self.__class__.__name__} instance.\")\n\n    def update(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``update`` on a {self.__class__.__name__} instance.\")\n\n    def __setattr__(self, name, value):\n        # `self.__frozen` set in __init__ is name-mangled to `_FrozenDict__frozen`,\n        # so check the mangled name or the guard never triggers\n        if hasattr(self, \"_FrozenDict__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setattr__`` on a {self.__class__.__name__} instance.\")\n        super().__setattr__(name, value)\n\n    def __setitem__(self, name, value):\n        if hasattr(self, \"_FrozenDict__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setitem__`` on a {self.__class__.__name__} instance.\")\n        super().__setitem__(name, value)\n\n\ndef register_to_config(init):\n    \"\"\"\n    Decorator to apply on the init of classes inheriting from `ConfigMixin` so that all the arguments are automatically\n    sent to `self.register_to_config`. 
To ignore a specific argument accepted by the init but that shouldn't be\n    registered in the config, use the `ignore_for_config` class variable\n\n    Warning: Once decorated, all private arguments (beginning with an underscore) are trashed and not sent to the init!\n    \"\"\"\n\n    @functools.wraps(init)\n    def inner_init(self, *args, **kwargs):\n        # Ignore private kwargs in the init.\n        init_kwargs = {k: v for k, v in kwargs.items() if not k.startswith(\"_\")}\n        init(self, *args, **init_kwargs)\n        if not isinstance(self, ConfigMixin):\n            raise RuntimeError(\n                f\"`@register_to_config` was applied to {self.__class__.__name__} init method, but this class does \"\n                \"not inherit from `ConfigMixin`.\")\n\n        ignore = getattr(self, \"ignore_for_config\", [])\n        # Get positional arguments aligned with kwargs\n        new_kwargs = {}\n        signature = inspect.signature(init)\n        parameters = {\n            name: p.default\n            for i, (name, p) in enumerate(signature.parameters.items()) if i > 0 and name not in ignore\n        }\n        for arg, name in zip(args, parameters.keys()):\n            new_kwargs[name] = arg\n\n        # Then add all kwargs\n        new_kwargs.update({\n            k: init_kwargs.get(k, default)\n            for k, default in parameters.items() if k not in ignore and k not in new_kwargs\n        })\n        getattr(self, \"register_to_config\")(**new_kwargs)\n\n    return inner_init\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/README.md",
    "content": "# Models\n\n- Models: Neural network that models $p_\\theta(\\mathbf{x}_{t-1}|\\mathbf{x}_t)$ (see image below) and is trained end-to-end to denoise a noisy input to an image. Examples: UNet, Conditioned UNet, 3D UNet, Transformer UNet\n\n## API\n\nTODO(Suraj, Patrick)\n\n## Examples\n\nTODO(Suraj, Patrick)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .unet_2d import UNet2DModel\nfrom .unet_2d_condition import UNet2DConditionModel\nfrom .vae import AutoencoderKL\nfrom .vae import VQModel\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/attention.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom inspect import isfunction\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef finfo(dtype):\n    if dtype == paddle.float32:\n        return np.finfo(np.float32)\n    if dtype == paddle.float16:\n        return np.finfo(np.float16)\n    if dtype == paddle.float64:\n        return np.finfo(np.float64)\n\n\npaddle.finfo = finfo\n\n\nclass AttentionBlockNew(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other. 
Originally ported from here, but adapted\n    to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    Uses three q, k, v linear layers to compute attention\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_head_channels=None,\n        num_groups=32,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n\n        self.num_heads = channels // num_head_channels if num_head_channels is not None else 1\n        self.num_head_size = num_head_channels\n        self.group_norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n\n        # define q,k,v as linear layers\n        self.query = nn.Linear(channels, channels)\n        self.key = nn.Linear(channels, channels)\n        self.value = nn.Linear(channels, channels)\n\n        self.rescale_output_factor = rescale_output_factor\n        self.proj_attn = nn.Linear(channels, channels)\n\n    def transpose_for_scores(self, projection: paddle.Tensor) -> paddle.Tensor:\n        new_projection_shape = projection.shape[:-1] + [self.num_heads, -1]\n        # move heads to 2nd position (B, T, H * D) -> (B, T, H, D) -> (B, H, T, D)\n        new_projection = projection.reshape(new_projection_shape).transpose([0, 2, 1, 3])\n        return new_projection\n\n    def forward(self, hidden_states):\n        residual = hidden_states\n        batch, channel, height, width = hidden_states.shape\n\n        # norm\n        hidden_states = self.group_norm(hidden_states)\n\n        hidden_states = hidden_states.reshape([batch, channel, height * width]).transpose([0, 2, 1])\n\n        # proj to q, k, v\n        query_proj = self.query(hidden_states)\n        key_proj = self.key(hidden_states)\n        value_proj = self.value(hidden_states)\n\n        # transpose\n        query_states = self.transpose_for_scores(query_proj)\n        
key_states = self.transpose_for_scores(key_proj)\n        value_states = self.transpose_for_scores(value_proj)\n\n        # get scores\n        scale = 1 / math.sqrt(math.sqrt(self.channels / self.num_heads))\n        attention_scores = paddle.matmul(query_states * scale, key_states * scale, transpose_y=True)\n        attention_probs = F.softmax(attention_scores.astype(\"float32\"), axis=-1).astype(attention_scores.dtype)\n\n        # compute attention output\n        context_states = paddle.matmul(attention_probs, value_states)\n\n        context_states = context_states.transpose([0, 2, 1, 3])\n        new_context_states_shape = context_states.shape[:-2] + [\n            self.channels,\n        ]\n        context_states = context_states.reshape(new_context_states_shape)\n\n        # compute next hidden_states\n        hidden_states = self.proj_attn(context_states)\n        hidden_states = hidden_states.transpose([0, 2, 1]).reshape([batch, channel, height, width])\n\n        # res connect and rescale\n        hidden_states = (hidden_states + residual) / self.rescale_output_factor\n        return hidden_states\n\n    def set_weight(self, attn_layer):\n        self.group_norm.weight.set_value(attn_layer.norm.weight)\n        self.group_norm.bias.set_value(attn_layer.norm.bias)\n\n        if hasattr(attn_layer, \"q\"):\n            self.query.weight.set_value(attn_layer.q.weight[:, :, 0, 0])\n            self.key.weight.set_value(attn_layer.k.weight[:, :, 0, 0])\n            self.value.weight.set_value(attn_layer.v.weight[:, :, 0, 0])\n\n            self.query.bias.set_value(attn_layer.q.bias)\n            self.key.bias.set_value(attn_layer.k.bias)\n            self.value.bias.set_value(attn_layer.v.bias)\n\n            self.proj_attn.weight.set_value(attn_layer.proj_out.weight[:, :, 0, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj_out.bias)\n        elif hasattr(attn_layer, \"NIN_0\"):\n            
self.query.weight.set_value(attn_layer.NIN_0.W.t())\n            self.key.weight.set_value(attn_layer.NIN_1.W.t())\n            self.value.weight.set_value(attn_layer.NIN_2.W.t())\n\n            self.query.bias.set_value(attn_layer.NIN_0.b)\n            self.key.bias.set_value(attn_layer.NIN_1.b)\n            self.value.bias.set_value(attn_layer.NIN_2.b)\n\n            self.proj_attn.weight.set_value(attn_layer.NIN_3.W.t())\n            self.proj_attn.bias.set_value(attn_layer.NIN_3.b)\n\n            self.group_norm.weight.set_value(attn_layer.GroupNorm_0.weight)\n            self.group_norm.bias.set_value(attn_layer.GroupNorm_0.bias)\n        else:\n            qkv_weight = attn_layer.qkv.weight.reshape(\n                [self.num_heads, 3 * self.channels // self.num_heads, self.channels])\n            qkv_bias = attn_layer.qkv.bias.reshape([self.num_heads, 3 * self.channels // self.num_heads])\n\n            q_w, k_w, v_w = qkv_weight.split(self.channels // self.num_heads, axis=1)\n            q_b, k_b, v_b = qkv_bias.split(self.channels // self.num_heads, axis=1)\n\n            self.query.weight.set_value(q_w.reshape([-1, self.channels]))\n            self.key.weight.set_value(k_w.reshape([-1, self.channels]))\n            self.value.weight.set_value(v_w.reshape([-1, self.channels]))\n\n            self.query.bias.set_value(q_b.flatten())\n            self.key.bias.set_value(k_b.flatten())\n            self.value.bias.set_value(v_b.flatten())\n\n            self.proj_attn.weight.set_value(attn_layer.proj.weight[:, :, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj.bias)\n\n\nclass SpatialTransformer(nn.Layer):\n    \"\"\"\n    Transformer block for image-like data. First, project the input (aka embedding) and reshape to b, t, d. Then apply\n    standard transformer action. 
Finally, reshape to image\n    \"\"\"\n\n    def __init__(self, in_channels, n_heads, d_head, depth=1, dropout=0.0, context_dim=None):\n        super().__init__()\n        self.n_heads = n_heads\n        self.d_head = d_head\n        self.in_channels = in_channels\n        inner_dim = n_heads * d_head\n        self.norm = nn.GroupNorm(num_groups=32, num_channels=in_channels, epsilon=1e-6)\n\n        self.proj_in = nn.Conv2D(in_channels, inner_dim, kernel_size=1, stride=1, padding=0)\n\n        self.transformer_blocks = nn.LayerList([\n            BasicTransformerBlock(inner_dim, n_heads, d_head, dropout=dropout, context_dim=context_dim)\n            for d in range(depth)\n        ])\n\n        self.proj_out = nn.Conv2D(inner_dim, in_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, context=None):\n        # note: if no context is given, cross-attention defaults to self-attention\n        b, c, h, w = x.shape\n        x_in = x\n        x = self.norm(x)\n        x = self.proj_in(x)\n        x = x.transpose([0, 2, 3, 1]).reshape([b, h * w, c])\n        for block in self.transformer_blocks:\n            x = block(x, context=context)\n        x = x.reshape([b, h, w, c]).transpose([0, 3, 1, 2])\n        x = self.proj_out(x)\n        return x + x_in\n\n    def set_weight(self, layer):\n        self.norm = layer.norm\n        self.proj_in = layer.proj_in\n        self.transformer_blocks = layer.transformer_blocks\n        self.proj_out = layer.proj_out\n\n\nclass BasicTransformerBlock(nn.Layer):\n\n    def __init__(self, dim, n_heads, d_head, dropout=0.0, context_dim=None, gated_ff=True, checkpoint=True):\n        super().__init__()\n        self.attn1 = CrossAttention(query_dim=dim, heads=n_heads, dim_head=d_head,\n                                    dropout=dropout)  # is a self-attention\n        self.ff = FeedForward(dim, dropout=dropout, glu=gated_ff)\n        self.attn2 = CrossAttention(query_dim=dim,\n                                    
context_dim=context_dim,\n                                    heads=n_heads,\n                                    dim_head=d_head,\n                                    dropout=dropout)  # is self-attn if context is none\n        self.norm1 = nn.LayerNorm(dim)\n        self.norm2 = nn.LayerNorm(dim)\n        self.norm3 = nn.LayerNorm(dim)\n        self.checkpoint = checkpoint\n\n    def forward(self, x, context=None):\n        x = self.attn1(self.norm1(x)) + x\n        x = self.attn2(self.norm2(x), context=context) + x\n        x = self.ff(self.norm3(x)) + x\n        return x\n\n\nclass CrossAttention(nn.Layer):\n\n    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, dropout=0.0):\n        super().__init__()\n        inner_dim = dim_head * heads\n        context_dim = default(context_dim, query_dim)\n\n        self.scale = dim_head**-0.5\n        self.heads = heads\n\n        self.to_q = nn.Linear(query_dim, inner_dim, bias_attr=False)\n        self.to_k = nn.Linear(context_dim, inner_dim, bias_attr=False)\n        self.to_v = nn.Linear(context_dim, inner_dim, bias_attr=False)\n\n        self.to_out = nn.Sequential(nn.Linear(inner_dim, query_dim), nn.Dropout(dropout))\n\n    def reshape_heads_to_batch_dim(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size, seq_len, head_size, dim // head_size])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size * head_size, seq_len, dim // head_size])\n        return tensor\n\n    def reshape_batch_dim_to_heads(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size // head_size, head_size, seq_len, dim])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size // head_size, seq_len, dim * head_size])\n        return tensor\n\n    def forward(self, x, context=None, mask=None):\n        batch_size, 
sequence_length, dim = x.shape\n\n        h = self.heads\n\n        q = self.to_q(x)\n        context = default(context, x)\n        k = self.to_k(context)\n        v = self.to_v(context)\n\n        q = self.reshape_heads_to_batch_dim(q)\n        k = self.reshape_heads_to_batch_dim(k)\n        v = self.reshape_heads_to_batch_dim(v)\n\n        sim = paddle.einsum(\"b i d, b j d -> b i j\", q * self.scale, k)\n\n        if exists(mask):\n            mask = mask.reshape([batch_size, -1]).astype(\"bool\")\n            max_neg_value = -paddle.finfo(sim.dtype).max\n            # broadcast the mask over heads: (B, 1, J) -> (B * H, 1, J)\n            mask = mask[:, None, :].tile([h, 1, 1])\n            # paddle tensors have no in-place ``masked_fill_``; use ``paddle.where`` instead\n            sim = paddle.where(mask, sim, paddle.full_like(sim, max_neg_value))\n\n        # attention, what we cannot get enough of\n        attn = F.softmax(sim, axis=-1)\n\n        out = paddle.einsum(\"b i j, b j d -> b i d\", attn, v)\n        out = self.reshape_batch_dim_to_heads(out)\n        return self.to_out(out)\n\n\nclass FeedForward(nn.Layer):\n\n    def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.0):\n        super().__init__()\n        inner_dim = int(dim * mult)\n        dim_out = default(dim_out, dim)\n        project_in = nn.Sequential(nn.Linear(dim, inner_dim), nn.GELU()) if not glu else GEGLU(dim, inner_dim)\n\n        self.net = nn.Sequential(project_in, nn.Dropout(dropout), nn.Linear(inner_dim, dim_out))\n\n    def forward(self, x):\n        return self.net(x)\n\n\n# feedforward\nclass GEGLU(nn.Layer):\n\n    def __init__(self, dim_in, dim_out):\n        super().__init__()\n        self.proj = nn.Linear(dim_in, dim_out * 2)\n\n    def forward(self, x):\n        x, gate = self.proj(x).chunk(2, axis=-1)\n        return x * F.gelu(gate)\n\n\n# TODO(Patrick) - remove once all weights have been converted -> not needed anymore then\nclass NIN(nn.Layer):\n\n    def __init__(self, in_dim, num_units, init_scale=0.1):\n        super().__init__()\n        self.W = self.create_parameter(shape=[in_dim, num_units], default_initializer=nn.initializer.Constant(0.))\n        
self.b = self.create_parameter(shape=[\n            num_units,\n        ],\n                                       is_bias=True,\n                                       default_initializer=nn.initializer.Constant(0.))\n\n\ndef exists(val):\n    return val is not None\n\n\ndef default(val, d):\n    if exists(val):\n        return val\n    return d() if isfunction(d) else d\n\n\n# the main attention block that is used for all models\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=None,\n        num_groups=32,\n        encoder_channels=None,\n        overwrite_qkv=False,\n        overwrite_linear=False,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels is None:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n\n        self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n        self.qkv = nn.Conv1D(channels, channels * 3, 1)\n        self.n_heads = self.num_heads\n        self.rescale_output_factor = rescale_output_factor\n\n        if encoder_channels is not None:\n            self.encoder_kv = nn.Conv1D(encoder_channels, channels * 2, 1)\n\n        self.proj = nn.Conv1D(channels, channels, 1)\n\n        self.overwrite_qkv = overwrite_qkv\n        self.overwrite_linear = overwrite_linear\n\n        if overwrite_qkv:\n         
   in_channels = channels\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.q = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.k = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.v = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.proj_out = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n        elif self.overwrite_linear:\n            num_groups = min(channels // 4, 32)\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.NIN_0 = NIN(channels, channels)\n            self.NIN_1 = NIN(channels, channels)\n            self.NIN_2 = NIN(channels, channels)\n            self.NIN_3 = NIN(channels, channels)\n\n            self.GroupNorm_0 = nn.GroupNorm(num_groups=num_groups, num_channels=channels, epsilon=1e-6)\n        else:\n            self.proj_out = nn.Conv1D(channels, channels, 1)\n            self.set_weights(self)\n\n        self.is_overwritten = False\n\n    def set_weights(self, layer):\n        if self.overwrite_qkv:\n            qkv_weight = paddle.concat([layer.q.weight, layer.k.weight, layer.v.weight], axis=0)[:, :, :, 0]\n            qkv_bias = paddle.concat([layer.q.bias, layer.k.bias, layer.v.bias], axis=0)\n\n            self.qkv.weight.set_value(qkv_weight)\n            self.qkv.bias.set_value(qkv_bias)\n\n            proj_out = nn.Conv1D(self.channels, self.channels, 1)\n            proj_out.weight.set_value(layer.proj_out.weight[:, :, :, 0])\n            proj_out.bias.set_value(layer.proj_out.bias)\n\n            self.proj = proj_out\n        elif self.overwrite_linear:\n            self.qkv.weight.set_value(\n                paddle.concat([self.NIN_0.W.t(), self.NIN_1.W.t(), self.NIN_2.W.t()], axis=0)[:, :, None])\n            
self.qkv.bias.set_value(paddle.concat([self.NIN_0.b, self.NIN_1.b, self.NIN_2.b], axis=0))\n\n            self.proj.weight.set_value(self.NIN_3.W.t()[:, :, None])\n            self.proj.bias.set_value(self.NIN_3.b)\n\n            self.norm.weight.set_value(self.GroupNorm_0.weight)\n            self.norm.bias.set_value(self.GroupNorm_0.bias)\n        else:\n            self.proj.weight.set_value(self.proj_out.weight)\n            self.proj.bias.set_value(self.proj_out.bias)\n\n    def forward(self, x, encoder_out=None):\n        if not self.is_overwritten and (self.overwrite_qkv or self.overwrite_linear):\n            self.set_weights(self)\n            self.is_overwritten = True\n\n        b, c, *spatial = x.shape\n        hid_states = self.norm(x).reshape([b, c, -1])\n\n        qkv = self.qkv(hid_states)\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.reshape([bs * self.n_heads, ch * 3, length]).split(ch, axis=1)\n\n        if encoder_out is not None:\n            encoder_kv = self.encoder_kv(encoder_out)\n            assert encoder_kv.shape[1] == self.n_heads * ch * 2\n            ek, ev = encoder_kv.reshape([bs * self.n_heads, ch * 2, -1]).split(ch, axis=1)\n            k = paddle.concat([ek, k], axis=-1)\n            v = paddle.concat([ev, v], axis=-1)\n\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = F.softmax(weight.astype(\"float32\"), axis=-1).astype(weight.dtype)\n\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        h = a.reshape([bs, -1, length])\n\n        h = self.proj(h)\n        h = h.reshape([b, c, *spatial])\n\n        result = x + h\n\n        result = result / self.rescale_output_factor\n\n        return result\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/embeddings.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef get_timestep_embedding(timesteps,\n                           embedding_dim,\n                           flip_sin_to_cos=False,\n                           downscale_freq_shift=1,\n                           scale=1,\n                           max_period=10000):\n    \"\"\"\n    This matches the implementation in Denoising Diffusion Probabilistic Models: Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param embedding_dim: the dimension of the output. :param max_period: controls the minimum frequency of the\n    embeddings. 
:return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    assert len(timesteps.shape) == 1, \"Timesteps should be a 1d-array\"\n\n    half_dim = embedding_dim // 2\n    exponent = -math.log(max_period) * paddle.arange(start=0, end=half_dim, dtype=\"float32\")\n    exponent = exponent / (half_dim - downscale_freq_shift)\n\n    emb = paddle.exp(exponent)\n    emb = timesteps[:, None].astype(\"float32\") * emb[None, :]\n\n    # scale embeddings\n    emb = scale * emb\n\n    # concat sine and cosine embeddings\n    emb = paddle.concat([paddle.sin(emb), paddle.cos(emb)], axis=-1)\n\n    # flip sine and cosine embeddings\n    if flip_sin_to_cos:\n        emb = paddle.concat([emb[:, half_dim:], emb[:, :half_dim]], axis=-1)\n\n    # zero pad\n    if embedding_dim % 2 == 1:\n        emb = paddle.concat([emb, paddle.zeros([emb.shape[0], 1])], axis=-1)\n    return emb\n\n\nclass TimestepEmbedding(nn.Layer):\n\n    def __init__(self, channel, time_embed_dim, act_fn=\"silu\"):\n        super().__init__()\n\n        self.linear_1 = nn.Linear(channel, time_embed_dim)\n        self.act = None\n        if act_fn == \"silu\":\n            self.act = nn.Silu()\n        self.linear_2 = nn.Linear(time_embed_dim, time_embed_dim)\n\n    def forward(self, sample):\n        sample = self.linear_1(sample)\n\n        if self.act is not None:\n            sample = self.act(sample)\n\n        sample = self.linear_2(sample)\n        return sample\n\n\nclass Timesteps(nn.Layer):\n\n    def __init__(self, num_channels, flip_sin_to_cos, downscale_freq_shift):\n        super().__init__()\n        self.num_channels = num_channels\n        self.flip_sin_to_cos = flip_sin_to_cos\n        self.downscale_freq_shift = downscale_freq_shift\n\n    def forward(self, timesteps):\n        t_emb = get_timestep_embedding(\n            timesteps,\n            self.num_channels,\n            flip_sin_to_cos=self.flip_sin_to_cos,\n            downscale_freq_shift=self.downscale_freq_shift,\n        )\n 
       return t_emb\n\n\nclass GaussianFourierProjection(nn.Layer):\n    \"\"\"Gaussian Fourier embeddings for noise levels.\"\"\"\n\n    def __init__(self, embedding_size=256, scale=1.0):\n        super().__init__()\n        self.register_buffer(\"weight\", paddle.randn((embedding_size, )) * scale)\n\n        # to delete later\n        self.register_buffer(\"W\", paddle.randn((embedding_size, )) * scale)\n\n        self.weight = self.W\n\n    def forward(self, x):\n        x = paddle.log(x)\n        x_proj = x[:, None] * self.weight[None, :] * 2 * np.pi\n        out = paddle.concat([paddle.sin(x_proj), paddle.cos(x_proj)], axis=-1)\n        return out\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/resnet.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef pad_new(x, pad, mode=\"constant\", value=0):\n    new_pad = []\n    for _ in range(x.ndim * 2 - len(pad)):\n        new_pad.append(0)\n    ndim = list(range(x.ndim - 1, 0, -1))\n    axes_start = {}\n    for i, _pad in enumerate(pad):\n        if _pad < 0:\n            new_pad.append(0)\n            zhengshu, yushu = divmod(i, 2)\n            if yushu == 0:\n                axes_start[ndim[zhengshu]] = -_pad\n        else:\n            new_pad.append(_pad)\n\n    padded = paddle.nn.functional.pad(x, new_pad, mode=mode, value=value)\n    padded_shape = paddle.shape(padded)\n    axes = []\n    starts = []\n    ends = []\n    for k, v in axes_start.items():\n        axes.append(k)\n        starts.append(v)\n        ends.append(padded_shape[k])\n        assert v < padded_shape[k]\n\n    if axes:\n        return padded.slice(axes=axes, starts=starts, ends=ends)\n    else:\n        return padded\n\n\nclass Upsample2D(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, use_conv_transpose=False, out_channels=None, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_conv_transpose = use_conv_transpose\n        self.name = name\n\n        conv = None\n        if use_conv_transpose:\n            conv = nn.Conv2DTranspose(channels, self.out_channels, 4, 2, 1)\n        elif use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, padding=1)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.conv = conv\n        else:\n            self.Conv2d_0 = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv_transpose:\n            return self.conv(x)\n\n        x = F.interpolate(x, scale_factor=2.0, mode=\"nearest\")\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if self.use_conv:\n            if self.name == \"conv\":\n                x = self.conv(x)\n            else:\n                x = self.Conv2d_0(x)\n\n        return x\n\n\nclass Downsample2D(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, out_channels=None, padding=1, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.padding = padding\n        stride = 2\n        self.name = name\n\n        if use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, stride=stride, padding=padding)\n        else:\n            assert self.channels == self.out_channels\n            conv = nn.AvgPool2D(kernel_size=stride, stride=stride)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.Conv2d_0 = conv\n            self.conv = conv\n        elif name == \"Conv2d_0\":\n            self.conv = conv\n        else:\n            self.conv = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv and self.padding == 0:\n            pad = (0, 1, 0, 1)\n            x = pad_new(x, pad, mode=\"constant\", value=0)\n\n        assert x.shape[1] == self.channels\n        x = self.conv(x)\n\n        return x\n\n\nclass FirUpsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.use_conv = use_conv\n        self.fir_kernel = fir_kernel\n        self.out_channels = out_channels\n\n    def _upsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `upsample_2d()` followed by `Conv2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. 
The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary\n        order.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n            C]`.\n        w: Weight tensor of the shape `[filterH, filterW, inChannels,\n            outChannels]`. Grouped convolution can be performed by `inChannels = x.shape[0] // numGroups`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n            (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2).\n        gain: Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]` or `[N, H * factor, W * factor, C]`, and same datatype as\n        `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n\n        # Setup filter kernel.\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * (gain * (factor**2))\n\n        if self.use_conv:\n            convH = w.shape[2]\n            convW = w.shape[3]\n            inC = w.shape[1]\n\n            p = (k.shape[0] - factor) - (convW - 1)\n\n            # Determine data dimensions.\n            stride = (factor, factor)\n            output_shape = ((x.shape[2] - 1) * factor + convH, (x.shape[3] - 1) * factor + convW)\n            output_padding = (\n                output_shape[0] - (x.shape[2] - 1) * stride[0] - convH,\n                output_shape[1] - (x.shape[3] - 1) * stride[1] - convW,\n            )\n            assert output_padding[0] >= 0 and output_padding[1] >= 0\n            num_groups = x.shape[1] // inC\n\n            # Transpose 
weights.\n            w = paddle.reshape(w, (num_groups, -1, inC, convH, convW))\n            w = w[..., ::-1, ::-1].transpose([0, 2, 1, 3, 4])\n            w = paddle.reshape(w, (num_groups * inC, -1, convH, convW))\n\n            x = F.conv2d_transpose(x, w, stride=stride, output_padding=output_padding, padding=0)\n\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2 + factor - 1, p // 2 + 1))\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            h = self._upsample_2d(x, self.Conv2d_0.weight, k=self.fir_kernel)\n            h = h + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            h = self._upsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return h\n\n\nclass FirDownsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.fir_kernel = fir_kernel\n        self.use_conv = use_conv\n        self.out_channels = out_channels\n\n    def _downsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `Conv2d()` followed by `downsample_2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n            x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`. w: Weight tensor of the shape `[filterH,\n            filterW, inChannels, outChannels]`. 
Grouped convolution can be performed by `inChannels = x.shape[1] //\n            numGroups`.\n            k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). The default is `[1] *\n            factor`, which corresponds to average pooling.\n            factor: Integer downsampling factor (default: 2).\n            gain: Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n            Tensor of the shape `[N, C, H // factor, W // factor]` or `[N, H // factor, W // factor, C]`, and same\n            datatype as `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * gain\n\n        if self.use_conv:\n            _, _, convH, convW = w.shape\n            p = (k.shape[0] - factor) + (convW - 1)\n            s = [factor, factor]\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2, p // 2))\n            x = F.conv2d(x, w, stride=s, padding=0)\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            x = self._downsample_2d(x, w=self.Conv2d_0.weight, k=self.fir_kernel)\n            x = x + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            x = self._downsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return x\n\n\nclass ResnetBlock(nn.Layer):\n\n    def __init__(\n        self,\n        *,\n        in_channels,\n        out_channels=None,\n        conv_shortcut=False,\n        dropout=0.0,\n        temb_channels=512,\n        groups=32,\n        groups_out=None,\n        pre_norm=True,\n        eps=1e-6,\n        non_linearity=\"swish\",\n        time_embedding_norm=\"default\",\n        
kernel=None,\n        output_scale_factor=1.0,\n        use_nin_shortcut=None,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        # pre-norm is always used in this port\n        self.pre_norm = True\n        self.in_channels = in_channels\n        out_channels = in_channels if out_channels is None else out_channels\n        self.out_channels = out_channels\n        self.use_conv_shortcut = conv_shortcut\n        self.time_embedding_norm = time_embedding_norm\n        self.up = up\n        self.down = down\n        self.output_scale_factor = output_scale_factor\n\n        if groups_out is None:\n            groups_out = groups\n\n        self.norm1 = nn.GroupNorm(num_groups=groups, num_channels=in_channels, epsilon=eps)\n\n        self.conv1 = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if temb_channels is not None:\n            self.time_emb_proj = nn.Linear(temb_channels, out_channels)\n        else:\n            self.time_emb_proj = None\n\n        self.norm2 = nn.GroupNorm(num_groups=groups_out, num_channels=out_channels, epsilon=eps)\n        self.dropout = nn.Dropout(dropout)\n        self.conv2 = nn.Conv2D(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if non_linearity == \"swish\":\n            self.nonlinearity = lambda x: F.silu(x)\n        elif non_linearity == \"mish\":\n            self.nonlinearity = Mish()\n        elif non_linearity == \"silu\":\n            self.nonlinearity = nn.Silu()\n\n        self.upsample = self.downsample = None\n        if self.up:\n            if kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.upsample = lambda x: upsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.upsample = partial(F.interpolate, scale_factor=2.0, mode=\"nearest\")\n            else:\n                self.upsample = Upsample2D(in_channels, use_conv=False)\n        elif self.down:\n            if 
kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.downsample = lambda x: downsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.downsample = partial(F.avg_pool2d, kernel_size=2, stride=2)\n            else:\n                self.downsample = Downsample2D(in_channels, use_conv=False, padding=1, name=\"op\")\n\n        self.use_nin_shortcut = self.in_channels != self.out_channels if use_nin_shortcut is None else use_nin_shortcut\n\n        self.conv_shortcut = None\n        if self.use_nin_shortcut:\n            self.conv_shortcut = nn.Conv2D(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, temb):\n        h = x\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm1(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        if self.upsample is not None:\n            x = self.upsample(x)\n            h = self.upsample(h)\n        elif self.downsample is not None:\n            x = self.downsample(x)\n            h = self.downsample(h)\n\n        h = self.conv1(h)\n\n        if temb is not None:\n            temb = self.time_emb_proj(self.nonlinearity(temb))[:, :, None, None]\n            h = h + temb\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm2(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        h = self.dropout(h)\n        h = self.conv2(h)\n\n        if self.conv_shortcut is not None:\n            x = self.conv_shortcut(x)\n\n        out = (x + h) / self.output_scale_factor\n\n        return out\n\n\nclass Mish(nn.Layer):\n\n    def forward(self, x):\n        return x * F.tanh(F.softplus(x))\n\n\ndef upsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Upsample2D a batch of 2D images with the given filter.\n\n    Accepts a batch of 2D images of the shape `[N, C, H, W]` or `[N, H, W, C]` and upsamples each image with the given\n    filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the specified\n    `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its shape is a\n    multiple of the upsampling factor.\n\n    Args:\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2).\n        gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]`\n    \"\"\"\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * (gain * (factor**2))\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n\ndef downsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Downsample2D a batch of 2D images with the given filter.\n\n    Accepts a batch of 2D images of the shape `[N, C, H, W]` or `[N, H, W, C]` and downsamples each image with the\n    given filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the\n    specified `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its\n    shape is a multiple of the downsampling factor.\n\n    Args:\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). 
The default is `[1] * factor`, which corresponds to average pooling.\n        factor: Integer downsampling factor (default: 2).\n        gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H // factor, W // factor]`\n    \"\"\"\n\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * gain\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n\ndef upfirdn2d_native(input, kernel, up=1, down=1, pad=(0, 0)):\n    up_x = up_y = up\n    down_x = down_y = down\n    pad_x0 = pad_y0 = pad[0]\n    pad_x1 = pad_y1 = pad[1]\n\n    _, channel, in_h, in_w = input.shape\n    input = input.reshape([-1, in_h, in_w, 1])\n\n    _, in_h, in_w, minor = input.shape\n    kernel_h, kernel_w = kernel.shape\n\n    out = input.reshape([-1, in_h, 1, in_w, 1, minor])\n    # interleave `up - 1` zeros between input pixels to upsample\n    out = pad_new(out, [0, 0, 0, up_x - 1, 0, 0, 0, up_y - 1])\n    out = out.reshape([-1, in_h * up_y, in_w * up_x, minor])\n\n    out = pad_new(out, [0, 0, max(pad_x0, 0), max(pad_x1, 0), max(pad_y0, 0), max(pad_y1, 0)])\n    out = out[:, max(-pad_y0, 0):out.shape[1] - max(-pad_y1, 0), max(-pad_x0, 0):out.shape[2] - max(-pad_x1, 0), :]\n\n    out = out.transpose([0, 3, 1, 2])\n    out = out.reshape([-1, 1, in_h * up_y + pad_y0 + pad_y1, in_w * up_x + pad_x0 + pad_x1])\n    w = paddle.flip(kernel, [0, 1]).reshape([1, 1, kernel_h, kernel_w])\n    out = F.conv2d(out, w)\n    out = out.reshape(\n        [-1, minor, in_h * up_y + pad_y0 + pad_y1 - kernel_h + 1, in_w * up_x + pad_x0 + pad_x1 - kernel_w + 1])\n    out = out.transpose([0, 2, 3, 1])\n    out = out[:, ::down_y, ::down_x, :]\n\n    out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1\n    out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1\n\n    
return out.reshape([-1, channel, out_h, out_w])\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/unet_2d.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import GaussianFourierProjection\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass UNet2DModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=None,\n        in_channels=3,\n        out_channels=3,\n        center_input_sample=False,\n        time_embedding_type=\"positional\",\n        freq_shift=0,\n        flip_sin_to_cos=True,\n        down_block_types=(\"DownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\"),\n        up_block_types=(\"AttnUpBlock2D\", \"AttnUpBlock2D\", \"AttnUpBlock2D\", \"UpBlock2D\"),\n        block_out_channels=(224, 448, 672, 896),\n        layers_per_block=2,\n        mid_block_scale_factor=1,\n        downsample_padding=1,\n        act_fn=\"silu\",\n        attention_head_dim=8,\n        norm_num_groups=32,\n        norm_eps=1e-5,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n    
    # input\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        if time_embedding_type == \"fourier\":\n            self.time_proj = GaussianFourierProjection(embedding_size=block_out_channels[0], scale=16)\n            timestep_input_dim = 2 * block_out_channels[0]\n        elif time_embedding_type == \"positional\":\n            self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n            timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=attention_head_dim,\n     
       resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = norm_num_groups if norm_num_groups is not None else min(block_out_channels[0] // 4, 32)\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=num_groups_out,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, sample: paddle.Tensor, timestep: Union[paddle.Tensor, float, int]) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        skip_sample = sample\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n            if hasattr(downsample_block, \"skip_conv\"):\n                sample, res_samples, skip_sample = downsample_block(hidden_states=sample,\n                                                                    temb=emb,\n                                                                    skip_sample=skip_sample)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb)\n\n        # 5. up\n        skip_sample = None\n        for upsample_block in self.up_blocks:\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"skip_conv\"):\n                sample, skip_sample = upsample_block(sample, res_samples, emb, skip_sample)\n            else:\n                sample = upsample_block(sample, res_samples, emb)\n\n        # 6. 
post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        if skip_sample is not None:\n            sample += skip_sample\n\n        if self.config.time_embedding_type == \"fourier\":\n            timesteps = timesteps.reshape((sample.shape[0], *([1] * len(sample.shape[1:]))))\n            sample = sample / timesteps\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/unet_2d_condition.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2DCrossAttn\n\n\nclass UNet2DConditionModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=64,\n        in_channels=4,\n        out_channels=4,\n        center_input_sample=False,\n        flip_sin_to_cos=True,\n        freq_shift=0,\n        down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n        up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\"),\n        block_out_channels=(320, 640, 1280, 1280),\n        layers_per_block=2,\n        downsample_padding=1,\n        mid_block_scale_factor=1,\n        act_fn=\"silu\",\n        norm_num_groups=32,\n        norm_eps=1e-5,\n        cross_attention_dim=768,\n        attention_head_dim=8,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n        # input\n 
       self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n        timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2DCrossAttn(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attention_head_dim,\n            resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = 
reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=norm_num_groups,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(\n        self,\n        sample: paddle.Tensor,\n        timestep: Union[paddle.Tensor, float, int],\n        encoder_hidden_states: paddle.Tensor,\n    ) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n\n            if hasattr(downsample_block, \"attentions\") and downsample_block.attentions is not None:\n                sample, res_samples = downsample_block(hidden_states=sample,\n                                                       temb=emb,\n                                                       encoder_hidden_states=encoder_hidden_states)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb, encoder_hidden_states=encoder_hidden_states)\n\n        # 5. 
up\n        for upsample_block in self.up_blocks:\n\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"attentions\") and upsample_block.attentions is not None:\n                sample = upsample_block(\n                    hidden_states=sample,\n                    temb=emb,\n                    res_hidden_states_tuple=res_samples,\n                    encoder_hidden_states=encoder_hidden_states,\n                )\n            else:\n                sample = upsample_block(hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples)\n\n        # 6. post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/unet_blocks.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .attention import AttentionBlockNew\nfrom .attention import SpatialTransformer\nfrom .resnet import Downsample2D\nfrom .resnet import FirDownsample2D\nfrom .resnet import FirUpsample2D\nfrom .resnet import ResnetBlock\nfrom .resnet import Upsample2D\n\n\ndef get_down_block(\n    down_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    temb_channels,\n    add_downsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n    downsample_padding=None,\n):\n    down_block_type = down_block_type[7:] if down_block_type.startswith(\"UNetRes\") else down_block_type\n    if down_block_type == \"DownBlock2D\":\n        return DownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnDownBlock2D\":\n        return AttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n           
 add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"CrossAttnDownBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnDownBlock2D\")\n        return CrossAttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"SkipDownBlock2D\":\n        return SkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnSkipDownBlock2D\":\n        return AttnSkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"DownEncoderBlock2D\":\n        return DownEncoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n  
          out_channels=out_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    raise ValueError(f\"{down_block_type} does not exist.\")\n\n\ndef get_up_block(\n    up_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    prev_output_channel,\n    temb_channels,\n    add_upsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n):\n    up_block_type = up_block_type[7:] if up_block_type.startswith(\"UNetRes\") else up_block_type\n    if up_block_type == \"UpBlock2D\":\n        return UpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"CrossAttnUpBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnUpBlock2D\")\n        return CrossAttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"AttnUpBlock2D\":\n        return AttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n     
       resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"SkipUpBlock2D\":\n        return SkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"AttnSkipUpBlock2D\":\n        return AttnSkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"UpDecoderBlock2D\":\n        return UpDecoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    raise ValueError(f\"{up_block_type} does not exist.\")\n\n\nclass UNetMidBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        **kwargs,\n    ):\n        super().__init__()\n\n        
self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                AttentionBlockNew(\n                    in_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            if 
self.attention_type == \"default\":\n                hidden_states = attn(hidden_states)\n            else:\n                hidden_states = attn(hidden_states, encoder_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass UNetMidBlock2DCrossAttn(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        cross_attention_dim=1280,\n        **kwargs,\n    ):\n        super().__init__()\n\n        self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                SpatialTransformer(\n                    in_channels,\n                    attn_num_head_channels,\n                    in_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n            resnets.append(\n                
ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_hidden_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            hidden_states = attn(hidden_states, encoder_hidden_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass AttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    
temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass CrossAttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        
resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, 
hidden_states, temb=None, encoder_hidden_states=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                
Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        
self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnDownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    
pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            downsample_padding=1,\n            add_downsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        
self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            self.attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            
self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass SkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_downsample=True,\n            downsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    
time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass AttnUpBlock2D(nn.Layer):\n    \"\"\"Up block of resnets interleaved with self-attention; each resnet consumes one popped skip connection.\"\"\"\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        
out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attention_type=\"default\",\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, 
res_hidden_states_tuple, temb=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass CrossAttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        prev_output_channel: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n     
               time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, encoder_hidden_states=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 
32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None):\n        for resnet in self.resnets:\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpDecoderBlock2D(nn.Layer):\n    \"\"\"Decoder up block: resnets without time embedding, followed by an optional Upsample2D.\"\"\"\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: 
int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnUpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n    
    output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            
out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            upsample_padding=1,\n            add_upsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions.append(\n            AttentionBlockNew(\n                out_channels,\n                num_head_channels=attn_num_head_channels,\n                rescale_output_factor=output_scale_factor,\n                eps=resnet_eps,\n            ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n        
        in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        hidden_states = self.attentions[0](hidden_states)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n       
     skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample + skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n\n\nclass SkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_upsample=True,\n            upsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n                in_channels=out_channels,\n       
         out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n            skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample 
+ skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/models/vae.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass Encoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            down_block_types=(\"DownEncoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n            double_z=True,\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, stride=1, padding=1)\n\n        self.mid_block = None\n        self.down_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=self.layers_per_block,\n                in_channels=input_channel,\n                
out_channels=output_channel,\n                add_downsample=not is_final_block,\n                resnet_eps=1e-6,\n                downsample_padding=0,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[-1], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n\n        conv_out_channels = 2 * out_channels if double_z else out_channels\n        self.conv_out = nn.Conv2D(block_out_channels[-1], conv_out_channels, 3, padding=1)\n\n    def forward(self, x):\n        sample = x\n        sample = self.conv_in(sample)\n\n        # down\n        for down_block in self.down_blocks:\n            sample = down_block(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # post-process\n        sample = self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass Decoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            up_block_types=(\"UpDecoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[-1], kernel_size=3, stride=1, padding=1)\n\n 
       self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=self.layers_per_block + 1,\n                in_channels=prev_output_channel,\n                out_channels=output_channel,\n                prev_output_channel=None,\n                add_upsample=not is_final_block,\n                resnet_eps=1e-6,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, z):\n        sample = z\n        sample = self.conv_in(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # up\n        for up_block in self.up_blocks:\n            sample = up_block(sample)\n\n        # post-process\n        sample = 
self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass VectorQuantizer(nn.Layer):\n    \"\"\"\n    Improved version over VectorQuantizer, can be used as a drop-in replacement. Mostly avoids costly matrix\n    multiplications and allows for post-hoc remapping of indices.\n    \"\"\"\n\n    # NOTE: due to a bug the beta term was applied to the wrong term. for\n    # backwards compatibility we use the buggy version by default, but you can\n    # specify legacy=False to fix it.\n    def __init__(self, n_e, e_dim, beta, remap=None, unknown_index=\"random\", sane_index_shape=False, legacy=True):\n        super().__init__()\n        self.n_e = n_e\n        self.e_dim = e_dim\n        self.beta = beta\n        self.legacy = legacy\n\n        self.embedding = nn.Embedding(self.n_e, self.e_dim)\n        self.embedding.weight.data.uniform_(-1.0 / self.n_e, 1.0 / self.n_e)\n\n        self.remap = remap\n        if self.remap is not None:\n            self.register_buffer(\"used\", paddle.to_tensor(np.load(self.remap)))\n            self.re_embed = self.used.shape[0]\n            self.unknown_index = unknown_index  # \"random\" or \"extra\" or integer\n            if self.unknown_index == \"extra\":\n                self.unknown_index = self.re_embed\n                self.re_embed = self.re_embed + 1\n            print(f\"Remapping {self.n_e} indices to {self.re_embed} indices. 
\"\n                  f\"Using {self.unknown_index} for unknown indices.\")\n        else:\n            self.re_embed = n_e\n\n        self.sane_index_shape = sane_index_shape\n\n    def remap_to_used(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        match = (inds[:, :, None] == used[None, None, ...]).astype(\"int64\")\n        new = match.argmax(-1)\n        unknown = match.sum(2) < 1\n        if self.unknown_index == \"random\":\n            new[unknown] = paddle.randint(0, self.re_embed, shape=new[unknown].shape)\n        else:\n            new[unknown] = self.unknown_index\n        return new.reshape(ishape)\n\n    def unmap_to_all(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        if self.re_embed > self.used.shape[0]:  # extra token\n            inds[inds >= self.used.shape[0]] = 0  # simply set to zero\n        back = paddle.gather(used[None, :][inds.shape[0] * [0], :], inds, axis=1)\n        return back.reshape(ishape)\n\n    def forward(self, z):\n        # reshape z -> (batch, height, width, channel) and flatten\n        z = z.transpose([0, 2, 3, 1])\n        z_flattened = z.reshape([-1, self.e_dim])\n        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z\n\n        d = (paddle.sum(z_flattened**2, axis=1, keepdim=True) + paddle.sum(self.embedding.weight**2, axis=1) -\n             2 * paddle.einsum(\"bd,dn->bn\", z_flattened, self.embedding.weight.t()))\n\n        min_encoding_indices = paddle.argmin(d, axis=1)\n        z_q = self.embedding(min_encoding_indices).reshape(z.shape)\n        perplexity = None\n        min_encodings = None\n\n        # compute loss for embedding\n        if not self.legacy:\n            loss = self.beta * paddle.mean((z_q.detach() - z)**2) + paddle.mean((z_q - z.detach())**2)\n        else:\n         
   loss = paddle.mean((z_q.detach() - z)**2) + self.beta * paddle.mean((z_q - z.detach())**2)\n\n        # preserve gradients\n        z_q = z + (z_q - z).detach()\n\n        # reshape back to match original input shape\n        z_q = z_q.transpose([0, 3, 1, 2])\n\n        if self.remap is not None:\n            min_encoding_indices = min_encoding_indices.reshape([z.shape[0], -1])  # add batch axis\n            min_encoding_indices = self.remap_to_used(min_encoding_indices)\n            min_encoding_indices = min_encoding_indices.reshape([-1, 1])  # flatten\n\n        if self.sane_index_shape:\n            min_encoding_indices = min_encoding_indices.reshape([z_q.shape[0], z_q.shape[2], z_q.shape[3]])\n\n        return z_q, loss, (perplexity, min_encodings, min_encoding_indices)\n\n    def get_codebook_entry(self, indices, shape):\n        # shape specifying (batch, height, width, channel)\n        if self.remap is not None:\n            indices = indices.reshape([shape[0], -1])  # add batch axis\n            indices = self.unmap_to_all(indices)\n            indices = indices.flatten()  # flatten again\n\n        # get quantized latent vectors\n        z_q = self.embedding(indices)\n\n        if shape is not None:\n            z_q = z_q.reshape(shape)\n            # reshape back to match original input shape\n            z_q = z_q.transpose([0, 3, 1, 2])\n\n        return z_q\n\n\nclass DiagonalGaussianDistribution(object):\n\n    def __init__(self, parameters, deterministic=False):\n        self.parameters = parameters\n        self.mean, self.logvar = paddle.chunk(parameters, 2, axis=1)\n        self.logvar = paddle.clip(self.logvar, -30.0, 20.0)\n        self.deterministic = deterministic\n        self.std = paddle.exp(0.5 * self.logvar)\n        self.var = paddle.exp(self.logvar)\n        if self.deterministic:\n            self.var = self.std = paddle.zeros_like(self.mean)\n\n    def sample(self):\n        x = self.mean + self.std * 
paddle.randn(self.mean.shape)\n        return x\n\n    def kl(self, other=None):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        else:\n            if other is None:\n                return 0.5 * paddle.sum(paddle.pow(self.mean, 2) + self.var - 1.0 - self.logvar, axis=[1, 2, 3])\n            else:\n                return 0.5 * paddle.sum(\n                    paddle.pow(self.mean - other.mean, 2) / other.var + self.var / other.var - 1.0 - self.logvar +\n                    other.logvar,\n                    axis=[1, 2, 3],\n                )\n\n    def nll(self, sample, dims=[1, 2, 3]):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        logtwopi = np.log(2.0 * np.pi)\n        return 0.5 * paddle.sum(logtwopi + self.logvar + paddle.pow(sample - self.mean, 2) / self.var, axis=dims)\n\n    def mode(self):\n        return self.mean\n\n\nclass VQModel(ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", ),\n        up_block_types=(\"UpDecoderBlock2D\", ),\n        block_out_channels=(64, ),\n        layers_per_block=1,\n        act_fn=\"silu\",\n        latent_channels=3,\n        sample_size=32,\n        num_vq_embeddings=256,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n            double_z=False,\n        )\n\n        self.quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n        self.quantize = VectorQuantizer(num_vq_embeddings,\n                                        latent_channels,\n                                        beta=0.25,\n            
                            remap=None,\n                                        sane_index_shape=False)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n    def encode(self, x):\n        h = self.encoder(x)\n        h = self.quant_conv(h)\n        return h\n\n    def decode(self, h, force_not_quantize=False):\n        # also go through quantization layer\n        if not force_not_quantize:\n            quant, emb_loss, info = self.quantize(h)\n        else:\n            quant = h\n        quant = self.post_quant_conv(quant)\n        dec = self.decoder(quant)\n        return dec\n\n    def forward(self, sample):\n        x = sample\n        h = self.encode(x)\n        dec = self.decode(h)\n        return dec\n\n\nclass AutoencoderKL(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\"),\n        up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\"),\n        block_out_channels=(128, 256, 512, 512),\n        layers_per_block=2,\n        act_fn=\"silu\",\n        latent_channels=4,\n        sample_size=512,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            
act_fn=act_fn,\n            double_z=True,\n        )\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n        self.quant_conv = nn.Conv2D(2 * latent_channels, 2 * latent_channels, 1)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n    def encode(self, x):\n        h = self.encoder(x)\n        moments = self.quant_conv(h)\n        posterior = DiagonalGaussianDistribution(moments)\n        return posterior\n\n    def decode(self, z):\n        z = self.post_quant_conv(z)\n        dec = self.decoder(z)\n        return dec\n\n    def forward(self, sample, sample_posterior=False):\n        x = sample\n        posterior = self.encode(x)\n        if sample_posterior:\n            z = posterior.sample()\n        else:\n            z = posterior.mode()\n        dec = self.decode(z)\n        return dec\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/README.md",
"content": "# Schedulers\n\n- Schedulers are the algorithms used to run diffusion models in inference as well as in training. They include the noise schedules and define algorithm-specific diffusion steps.\n- Schedulers can be used interchangeably between diffusion models in inference to find the preferred trade-off between speed and generation quality.\n- Schedulers are available in numpy, but can easily be transformed into PyTorch.\n\n## API\n\n- Schedulers should provide one or more `def step(...)` functions that should be called iteratively to unroll the diffusion loop during\nthe forward pass.\n- Schedulers should be framework-agnostic, but provide simple functionality to convert the scheduler into a specific framework, such as PyTorch\nwith a `set_format(...)` method.\n\n## Examples\n\n- The DDPM scheduler was proposed in [Denoising Diffusion Probabilistic Models](https://arxiv.org/abs/2006.11239) and can be found in [scheduling_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddpm.py). An example of how to use this scheduler can be found in [pipeline_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddpm.py).\n- The DDIM scheduler was proposed in [Denoising Diffusion Implicit Models](https://arxiv.org/abs/2010.02502) and can be found in [scheduling_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddim.py). An example of how to use this scheduler can be found in [pipeline_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddim.py).\n- The PNDM scheduler was proposed in [Pseudo Numerical Methods for Diffusion Models on Manifolds](https://arxiv.org/abs/2202.09778) and can be found in [scheduling_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_pndm.py). 
An example of how to use this scheduler can be found in [pipeline_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py).\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .scheduling_ddim import DDIMScheduler\nfrom .scheduling_ddpm import DDPMScheduler\nfrom .scheduling_karras_ve import KarrasVeScheduler\nfrom .scheduling_lms_discrete import LMSDiscreteScheduler\nfrom .scheduling_pndm import PNDMScheduler\nfrom .scheduling_sde_ve import ScoreSdeVeScheduler\nfrom .scheduling_sde_vp import ScoreSdeVpScheduler\nfrom .scheduling_utils import SchedulerMixin\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_ddim.py",
    "content": "# Copyright 2022 Stanford University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This code is strongly influenced by https://github.com/pesser/pypaddle_diffusion\n# and https://github.com/hojonathanho/diffusion\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of\n                      (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDIMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        clip_sample=True,\n        set_alpha_to_one=True,\n        tensor_format=\"pd\",\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        # At every step in ddim, we are looking into the previous alphas_cumprod\n        # For the final step, there is no previous 
alphas_cumprod because we are already at 0\n        # `set_alpha_to_one` decides whether we set this parameter simply to one or\n        # whether we use the final alpha of the \"non-previous\" one.\n        self.final_alpha_cumprod = np.array(1.0) if set_alpha_to_one else self.alphas_cumprod[0]\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def _get_variance(self, timestep, prev_timestep):\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        variance = (beta_prod_t_prev / beta_prod_t) * (1 - alpha_prod_t / alpha_prod_t_prev)\n\n        return variance\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.timesteps += offset\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        eta: float = 0.0,\n        use_clipped_model_output: bool = False,\n        generator=None,\n    ):\n        # See formulas (12) and (16) of DDIM paper https://arxiv.org/pdf/2010.02502.pdf\n        # Ideally, read the DDIM paper in detail for a full understanding\n\n        # Notation (<variable name> -> <name in paper>)\n        # - pred_noise_t -> e_theta(x_t, t)\n        # - pred_original_sample -> f_theta(x_t, t) or x_0\n        # - std_dev_t -> sigma_t\n       
 # - eta -> η\n        # - pred_sample_direction -> \"direction pointing to x_t\"\n        # - pred_prev_sample -> \"x_t-1\"\n\n        # 1. get previous step value (=t-1)\n        prev_timestep = timestep - self.config.num_train_timesteps // self.num_inference_steps\n\n        # 2. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n\n        # 3. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n\n        # 4. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 5. compute variance: \"sigma_t(η)\" -> see formula (16)\n        # σ_t = sqrt((1 − α_t−1)/(1 − α_t)) * sqrt(1 − α_t/α_t−1)\n        variance = self._get_variance(timestep, prev_timestep)\n        std_dev_t = eta * variance**(0.5)\n\n        if use_clipped_model_output:\n            # the model_output is always re-derived from the clipped x_0 in Glide\n            model_output = (sample - alpha_prod_t**(0.5) * pred_original_sample) / beta_prod_t**(0.5)\n\n        # 6. compute \"direction pointing to x_t\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_sample_direction = (1 - alpha_prod_t_prev - std_dev_t**2)**(0.5) * model_output\n\n        # 7. 
compute x_t without \"random noise\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        prev_sample = alpha_prod_t_prev**(0.5) * pred_original_sample + pred_sample_direction\n\n        if eta > 0:\n            noise = paddle.randn(model_output.shape)\n            variance = self._get_variance(timestep, prev_timestep)**(0.5) * eta * noise\n\n            if not paddle.is_tensor(model_output):\n                variance = variance.numpy()\n\n            prev_sample = prev_sample + variance\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_ddpm.py",
    "content": "# Copyright 2022 UC Berkely Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of\n                      (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDPMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        variance_type=\"fixed_small\",\n        clip_sample=True,\n        tensor_format=\"pd\",\n    ):\n\n        if trained_betas is not None:\n            self.betas = np.asarray(trained_betas)\n        elif beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n        self.one = np.array(1.0)\n\n        # setable values\n        
self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n        self.variance_type = variance_type\n\n    def set_timesteps(self, num_inference_steps):\n        num_inference_steps = min(self.config.num_train_timesteps, num_inference_steps)\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.set_format(tensor_format=self.tensor_format)\n\n    def _get_variance(self, t, predicted_variance=None, variance_type=None):\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n\n        # For t > 0, compute predicted variance βt (see formula (6) and (7) from https://arxiv.org/pdf/2006.11239.pdf)\n        # and sample from it to get previous sample\n        # x_{t-1} ~ N(pred_prev_sample, variance) == add variance to pred_sample\n        variance = (1 - alpha_prod_t_prev) / (1 - alpha_prod_t) * self.betas[t]\n\n        if variance_type is None:\n            variance_type = self.config.variance_type\n\n        # hacks - were probs added for training stability\n        if variance_type == \"fixed_small\":\n            variance = self.clip(variance, min_value=1e-20)\n        # for rl-diffuser https://arxiv.org/abs/2205.09991\n        elif variance_type == \"fixed_small_log\":\n            variance = self.log(self.clip(variance, min_value=1e-20))\n        elif variance_type == \"fixed_large\":\n            variance = self.betas[t]\n        elif variance_type == \"fixed_large_log\":\n            # Glide max_log\n            variance = self.log(self.betas[t])\n        elif variance_type == \"learned\":\n            return predicted_variance\n        elif 
variance_type == \"learned_range\":\n            min_log = variance\n            max_log = self.betas[t]\n            frac = (predicted_variance + 1) / 2\n            variance = frac * max_log + (1 - frac) * min_log\n\n        return variance\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        predict_epsilon=True,\n        generator=None,\n    ):\n        t = timestep\n\n        if model_output.shape[1] == sample.shape[1] * 2 and self.variance_type in [\"learned\", \"learned_range\"]:\n            model_output, predicted_variance = paddle.split(model_output, sample.shape[1], axis=1)\n        else:\n            predicted_variance = None\n\n        # 1. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # 2. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf\n        if predict_epsilon:\n            pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n        else:\n            pred_original_sample = model_output\n\n        # 3. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 4. Compute coefficients for pred_original_sample x_0 and current sample x_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_original_sample_coeff = (alpha_prod_t_prev**(0.5) * self.betas[t]) / beta_prod_t\n        current_sample_coeff = self.alphas[t]**(0.5) * beta_prod_t_prev / beta_prod_t\n\n        # 5. 
Compute predicted previous sample µ_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_prev_sample = pred_original_sample_coeff * pred_original_sample + current_sample_coeff * sample\n\n        # 6. Add noise\n        variance = 0\n        if t > 0:\n            noise = self.randn_like(model_output)\n            variance = (self._get_variance(t, predicted_variance=predicted_variance)**0.5) * noise\n\n        pred_prev_sample = pred_prev_sample + variance\n\n        return {\"prev_sample\": pred_prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_karras_ve.py",
    "content": "# Copyright 2022 NVIDIA and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass KarrasVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    Stochastic sampling from Karras et al. [1] tailored to the Variance-Expanding (VE) models [2]. Use Algorithm 2 and\n    the VE column of Table 1 from [1] for reference.\n\n    [1] Karras, Tero, et al. \"Elucidating the Design Space of Diffusion-Based Generative Models.\"\n    https://arxiv.org/abs/2206.00364 [2] Song, Yang, et al. \"Score-based generative modeling through stochastic\n    differential equations.\" https://arxiv.org/abs/2011.13456\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        sigma_min=0.02,\n        sigma_max=100,\n        s_noise=1.007,\n        s_churn=80,\n        s_min=0.05,\n        s_max=50,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        For more details on the parameters, see the original paper's Appendix E.: \"Elucidating the Design Space of\n        Diffusion-Based Generative Models.\" https://arxiv.org/abs/2206.00364. 
The grid search values used to find the\n        optimal {s_noise, s_churn, s_min, s_max} for a specific model are described in Table 5 of the paper.\n\n        Args:\n            sigma_min (`float`): minimum noise magnitude\n            sigma_max (`float`): maximum noise magnitude\n            s_noise (`float`): the amount of additional noise to counteract loss of detail during sampling.\n                A reasonable range is [1.000, 1.011].\n            s_churn (`float`): the parameter controlling the overall amount of stochasticity.\n                A reasonable range is [0, 100].\n            s_min (`float`): the start value of the sigma range where we add noise (enable stochasticity).\n                A reasonable range is [0, 10].\n            s_max (`float`): the end value of the sigma range where we add noise.\n                A reasonable range is [0.2, 80].\n        \"\"\"\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = None\n        self.schedule = None  # sigma(t_i)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.num_inference_steps)[::-1].copy()\n        self.schedule = [(self.sigma_max * (self.sigma_min**2 / self.sigma_max**2)**(i / (num_inference_steps - 1)))\n                         for i in self.timesteps]\n        self.schedule = np.array(self.schedule, dtype=np.float32)\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def add_noise_to_input(self, sample, sigma, generator=None):\n        \"\"\"\n        Explicit Langevin-like \"churn\" step of adding noise to the sample according to a factor gamma_i ≥ 0 to reach a\n        higher noise level sigma_hat = sigma_i + gamma_i*sigma_i.\n        \"\"\"\n        if self.s_min <= sigma <= self.s_max:\n            gamma = min(self.s_churn / 
self.num_inference_steps, 2**0.5 - 1)\n        else:\n            gamma = 0\n\n        # sample eps ~ N(0, S_noise^2 * I)\n        eps = self.s_noise * paddle.randn(sample.shape)\n        sigma_hat = sigma + gamma * sigma\n        sample_hat = sample + ((sigma_hat**2 - sigma**2)**0.5 * eps)\n\n        return sample_hat, sigma_hat\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_hat + sigma_hat * model_output\n        derivative = (sample_hat - pred_original_sample) / sigma_hat\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * derivative\n\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n        sample_prev: Union[paddle.Tensor, np.ndarray],\n        derivative: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_prev + sigma_prev * model_output\n        derivative_corr = (sample_prev - pred_original_sample) / sigma_prev\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * (0.5 * derivative + 0.5 * derivative_corr)\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative_corr}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        raise NotImplementedError()\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_lms_discrete.py",
    "content": "# Copyright 2022 Katherine Crowson and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom scipy import integrate\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass LMSDiscreteScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        Linear Multistep Scheduler for discrete beta schedules. 
Based on the original k-diffusion implementation by\n        Katherine Crowson:\n        https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181\n        \"\"\"\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.sigmas = ((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self.derivatives = []\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def get_lms_coefficient(self, order, t, current_order):\n        \"\"\"\n        Compute a linear multistep coefficient\n        \"\"\"\n\n        def lms_derivative(tau):\n            prod = 1.0\n            for k in range(order):\n                if current_order == k:\n                    continue\n                prod *= (tau - self.sigmas[t - k]) / (self.sigmas[t - current_order] - self.sigmas[t - k])\n            return prod\n\n        integrated_coeff = integrate.quad(lms_derivative, self.sigmas[t], self.sigmas[t + 1], epsrel=1e-4)[0]\n\n        return integrated_coeff\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.linspace(self.num_train_timesteps - 1, 0, num_inference_steps, dtype=float)\n\n        
low_idx = np.floor(self.timesteps).astype(int)\n        high_idx = np.ceil(self.timesteps).astype(int)\n        frac = np.mod(self.timesteps, 1.0)\n        sigmas = np.array(((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5)\n        sigmas = (1 - frac) * sigmas[low_idx] + frac * sigmas[high_idx]\n        self.sigmas = np.concatenate([sigmas, [0.0]])\n\n        self.derivatives = []\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        order: int = 4,\n    ):\n        sigma = self.sigmas[timestep]\n\n        # 1. compute predicted original sample (x_0) from sigma-scaled predicted noise\n        pred_original_sample = sample - sigma * model_output\n\n        # 2. Convert to an ODE derivative\n        derivative = (sample - pred_original_sample) / sigma\n        self.derivatives.append(derivative)\n        if len(self.derivatives) > order:\n            self.derivatives.pop(0)\n\n        # 3. Compute linear multistep coefficients\n        order = min(timestep + 1, order)\n        lms_coeffs = [self.get_lms_coefficient(order, timestep, curr_order) for curr_order in range(order)]\n\n        # 4. Compute previous sample based on the derivatives path\n        prev_sample = sample + sum(coeff * derivative\n                                   for coeff, derivative in zip(lms_coeffs, reversed(self.derivatives)))\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        alpha_prod = self.alphas_cumprod[timesteps]\n        alpha_prod = self.match_shape(alpha_prod, original_samples)\n\n        noisy_samples = (alpha_prod**0.5) * original_samples + ((1 - alpha_prod)**0.5) * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_pndm.py",
    "content": "# Copyright 2022 Zhejiang University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of\n                      (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass PNDMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        tensor_format=\"pd\",\n        skip_prk_steps=False,\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.one = np.array(1.0)\n\n        # For now we only support F-PNDM, i.e. 
the runge-kutta method\n        # For more information on the algorithm please take a look at the paper: https://arxiv.org/pdf/2202.09778.pdf\n        # mainly at formula (9), (12), (13) and the Algorithm 2.\n        self.pndm_order = 4\n\n        # running values\n        self.cur_model_output = 0\n        self.counter = 0\n        self.cur_sample = None\n        self.ets = []\n\n        # setable values\n        self.num_inference_steps = None\n        self._timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self._offset = 0\n        self.prk_timesteps = None\n        self.plms_timesteps = None\n        self.timesteps = None\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self._timesteps = list(\n            range(0, self.config.num_train_timesteps, self.config.num_train_timesteps // num_inference_steps))\n        self._offset = offset\n        self._timesteps = [t + self._offset for t in self._timesteps]\n\n        if self.config.skip_prk_steps:\n            # for some models like stable diffusion the prk steps can/should be skipped to\n            # produce better results. 
When using PNDM with `self.config.skip_prk_steps` the implementation\n            # is based on crowsonkb's PLMS sampler implementation: https://github.com/CompVis/latent-diffusion/pull/51\n            self.prk_timesteps = []\n            self.plms_timesteps = list(reversed(self._timesteps[:-1] + self._timesteps[-2:-1] + self._timesteps[-1:]))\n        else:\n            prk_timesteps = np.array(self._timesteps[-self.pndm_order:]).repeat(2) + np.tile(\n                np.array([0, self.config.num_train_timesteps // num_inference_steps // 2]), self.pndm_order)\n            self.prk_timesteps = list(reversed(prk_timesteps[:-1].repeat(2)[1:-1]))\n            self.plms_timesteps = list(reversed(self._timesteps[:-3]))\n\n        self.timesteps = self.prk_timesteps + self.plms_timesteps\n\n        self.ets = []\n        self.counter = 0\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        if self.counter < len(self.prk_timesteps) and not self.config.skip_prk_steps:\n            return self.step_prk(model_output=model_output, timestep=timestep, sample=sample)\n        else:\n            return self.step_plms(model_output=model_output, timestep=timestep, sample=sample)\n\n    def step_prk(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the Runge-Kutta method. 
RK takes 4 forward passes to approximate the\n        solution to the differential equation.\n        \"\"\"\n        diff_to_prev = 0 if self.counter % 2 else self.config.num_train_timesteps // self.num_inference_steps // 2\n        prev_timestep = max(timestep - diff_to_prev, self.prk_timesteps[-1])\n        timestep = self.prk_timesteps[self.counter // 4 * 4]\n\n        if self.counter % 4 == 0:\n            self.cur_model_output += 1 / 6 * model_output\n            self.ets.append(model_output)\n            self.cur_sample = sample\n        elif (self.counter - 1) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 2) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 3) % 4 == 0:\n            model_output = self.cur_model_output + 1 / 6 * model_output\n            self.cur_model_output = 0\n\n        # cur_sample should not be `None`\n        cur_sample = self.cur_sample if self.cur_sample is not None else sample\n\n        prev_sample = self._get_prev_sample(cur_sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def step_plms(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the linear multi-step method. 
It performs one forward pass multiple\n        times to approximate the solution.\n        \"\"\"\n        if not self.config.skip_prk_steps and len(self.ets) < 3:\n            raise ValueError(\n                f\"{self.__class__} can only be run AFTER scheduler has been run \"\n                \"in 'prk' mode for at least 12 iterations. \"\n                \"See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py \"\n                \"for more information.\")\n\n        prev_timestep = max(timestep - self.config.num_train_timesteps // self.num_inference_steps, 0)\n\n        if self.counter != 1:\n            self.ets.append(model_output)\n        else:\n            prev_timestep = timestep\n            timestep = timestep + self.config.num_train_timesteps // self.num_inference_steps\n\n        if len(self.ets) == 1 and self.counter == 0:\n            self.cur_sample = sample\n        elif len(self.ets) == 1 and self.counter == 1:\n            model_output = (model_output + self.ets[-1]) / 2\n            sample = self.cur_sample\n            self.cur_sample = None\n        elif len(self.ets) == 2:\n            model_output = (3 * self.ets[-1] - self.ets[-2]) / 2\n        elif len(self.ets) == 3:\n            model_output = (23 * self.ets[-1] - 16 * self.ets[-2] + 5 * self.ets[-3]) / 12\n        else:\n            model_output = (1 / 24) * (55 * self.ets[-1] - 59 * self.ets[-2] + 37 * self.ets[-3] - 9 * self.ets[-4])\n\n        prev_sample = self._get_prev_sample(sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def _get_prev_sample(self, sample, timestep, timestep_prev, model_output):\n        # See formula (9) of PNDM paper https://arxiv.org/pdf/2202.09778.pdf\n        # this function computes x_(t−δ) using the formula of (9)\n        # Note that x_t needs to be added to both sides of the equation\n\n        # Notation (<variable name> -> <name in paper>)\n        # alpha_prod_t -> α_t\n        # alpha_prod_t_prev -> α_(t−δ)\n        # beta_prod_t -> (1 - α_t)\n        # beta_prod_t_prev -> (1 - α_(t−δ))\n        # sample -> x_t\n        # model_output -> e_θ(x_t, t)\n        # prev_sample -> x_(t−δ)\n        alpha_prod_t = self.alphas_cumprod[timestep + 1 - self._offset]\n        alpha_prod_t_prev = self.alphas_cumprod[timestep_prev + 1 - self._offset]\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # corresponds to (α_(t−δ) - α_t) divided by\n        # denominator of x_t in formula (9) and plus 1\n        # Note: (α_(t−δ) - α_t) / (sqrt(α_t) * (sqrt(α_(t−δ)) + sqrt(α_t))) =\n        # sqrt(α_(t−δ)) / sqrt(α_t) - 1\n        sample_coeff = (alpha_prod_t_prev / alpha_prod_t)**(0.5)\n\n        # corresponds to denominator of e_θ(x_t, t) in formula (9)\n        model_output_denom_coeff = alpha_prod_t * beta_prod_t_prev**(0.5) + (alpha_prod_t * beta_prod_t *\n                                                                             alpha_prod_t_prev)**(0.5)\n\n        # full formula (9)\n        prev_sample = (sample_coeff * sample -\n                       (alpha_prod_t_prev - alpha_prod_t) * model_output / model_output_denom_coeff)\n\n        return prev_sample\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_sde_ve.py",
    "content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean-up a bit\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    The variance exploding stochastic differential equation (SDE) scheduler.\n\n    :param snr: coefficient weighting the step from the model_output sample (from the network) to the random noise.\n    :param sigma_min: initial noise scale for sigma sequence in sampling procedure. The minimum sigma should mirror the\n            distribution of the data.\n    :param sigma_max: maximum noise scale for the sigma sequence in the sampling procedure.\n    :param sampling_eps: the end value of sampling, where timesteps decrease progressively from 1 to\n    epsilon.\n    :param correct_steps: number of correction steps performed on a produced sample.\n    
:param tensor_format:\n    \"np\" or \"pd\" for the expected format of samples passed to the Scheduler.\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=2000,\n        snr=0.15,\n        sigma_min=0.01,\n        sigma_max=1348,\n        sampling_eps=1e-5,\n        correct_steps=1,\n        tensor_format=\"pd\",\n    ):\n        # self.sigmas = None\n        # self.discrete_sigmas = None\n        #\n        # # setable values\n        # self.num_inference_steps = None\n        self.timesteps = None\n\n        self.set_sigmas(num_train_timesteps, sigma_min, sigma_max, sampling_eps)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, sampling_eps=None):\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.timesteps = np.linspace(1, sampling_eps, num_inference_steps)\n        elif tensor_format == \"pd\":\n            self.timesteps = paddle.linspace(1, sampling_eps, num_inference_steps)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_sigmas(self, num_inference_steps, sigma_min=None, sigma_max=None, sampling_eps=None):\n        sigma_min = sigma_min if sigma_min is not None else self.config.sigma_min\n        sigma_max = sigma_max if sigma_max is not None else self.config.sigma_max\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        if self.timesteps is None:\n            self.set_timesteps(num_inference_steps, sampling_eps)\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.discrete_sigmas = np.exp(np.linspace(np.log(sigma_min), np.log(sigma_max), 
num_inference_steps))\n            self.sigmas = np.array([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        elif tensor_format == \"pd\":\n            self.discrete_sigmas = paddle.exp(paddle.linspace(np.log(sigma_min), np.log(sigma_max),\n                                                              num_inference_steps))\n            self.sigmas = paddle.to_tensor([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def get_adjacent_sigma(self, timesteps, t):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.where(timesteps == 0, np.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n        elif tensor_format == \"pd\":\n            return paddle.where(timesteps == 0, paddle.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_seed(self, seed):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            np.random.seed(seed)\n        elif tensor_format == \"pd\":\n            paddle.seed(seed)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def step_pred(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Predict the sample at the previous timestep by reversing the SDE.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n        # TODO(Patrick) non-Pypaddle\n\n        timestep = timestep * paddle.ones(sample.shape[0])  # paddle.repeat_interleave(timestep, sample.shape[0])\n        timesteps = (timestep * (len(self.timesteps) - 
1)).astype(\"int64\")\n\n        sigma = self.discrete_sigmas[timesteps]\n        adjacent_sigma = self.get_adjacent_sigma(timesteps, timestep)\n        drift = self.zeros_like(sample)\n        diffusion = (sigma**2 - adjacent_sigma**2)**0.5\n\n        # equation 6 in the paper: the model_output modeled by the network is grad_x log pt(x)\n        # also equation 47 shows the analog from SDE models to ancestral sampling methods\n        drift = drift - diffusion[:, None, None, None]**2 * model_output\n\n        #  equation 6: sample noise for the diffusion term of\n        noise = self.randn_like(sample)\n        prev_sample_mean = sample - drift  # subtract because `dt` is a small negative timestep\n        # TODO is the variable diffusion the correct scaling term for the noise?\n        prev_sample = prev_sample_mean + diffusion[:, None, None, None] * noise  # add impact of diffusion field g\n\n        return {\"prev_sample\": prev_sample, \"prev_sample_mean\": prev_sample_mean}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Correct the predicted sample based on the output model_output of the network. This is often run repeatedly\n        after making the prediction for the previous timestep.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n\n        # For small batch sizes, the paper \"suggest replacing norm(z) with sqrt(d), where d is the dim. 
of z\"\n        # sample noise for correction\n        noise = self.randn_like(sample)\n\n        # compute step size from the model_output, the noise, and the snr\n        grad_norm = self.norm(model_output)\n        noise_norm = self.norm(noise)\n        step_size = (self.config.snr * noise_norm / grad_norm)**2 * 2\n        step_size = step_size * paddle.ones(sample.shape[0])\n        # self.repeat_scalar(step_size, sample.shape[0])\n\n        # compute corrected sample: model_output term and noise term\n        prev_sample_mean = sample + step_size[:, None, None, None] * model_output\n        prev_sample = prev_sample_mean + ((step_size * 2)**0.5)[:, None, None, None] * noise\n\n        return {\"prev_sample\": prev_sample}\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_sde_vp.py",
    "content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean-up a bit\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVpScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(self, num_train_timesteps=2000, beta_min=0.1, beta_max=20, sampling_eps=1e-3, tensor_format=\"np\"):\n\n        self.sigmas = None\n        self.discrete_sigmas = None\n        self.timesteps = None\n\n    def set_timesteps(self, num_inference_steps):\n        self.timesteps = paddle.linspace(1, self.config.sampling_eps, num_inference_steps)\n\n    def step_pred(self, score, x, t):\n        # TODO(Patrick) better comments + non-PyTorch\n        # postprocess model score\n        log_mean_coeff = (-0.25 * t**2 * (self.config.beta_max - self.config.beta_min) - 0.5 * t * self.config.beta_min)\n        std = paddle.sqrt(1.0 - paddle.exp(2.0 * log_mean_coeff))\n        score = -score / std[:, None, None, None]\n\n        # compute\n        dt = -1.0 / len(self.timesteps)\n\n        beta_t = self.config.beta_min + t * (self.config.beta_max - 
self.config.beta_min)\n        drift = -0.5 * beta_t[:, None, None, None] * x\n        diffusion = paddle.sqrt(beta_t)\n        drift = drift - diffusion[:, None, None, None]**2 * score\n        x_mean = x + drift * dt\n\n        # add noise\n        noise = self.randn_like(x)\n        x = x_mean + diffusion[:, None, None, None] * np.sqrt(-dt) * noise\n\n        return x, x_mean\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/diffusers/schedulers/scheduling_utils.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nSCHEDULER_CONFIG_NAME = \"scheduler_config.json\"\n\n\nclass SchedulerMixin:\n\n    config_name = SCHEDULER_CONFIG_NAME\n    ignore_for_config = [\"tensor_format\"]\n\n    def set_format(self, tensor_format=\"pd\"):\n        self.tensor_format = tensor_format\n        if tensor_format == \"pd\":\n            for key, value in vars(self).items():\n                if isinstance(value, np.ndarray):\n                    setattr(self, key, paddle.to_tensor(value))\n\n        return self\n\n    def clip(self, tensor, min_value=None, max_value=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.clip(tensor, min_value, max_value)\n        elif tensor_format == \"pd\":\n            return paddle.clip(tensor, min_value, max_value)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def log(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.log(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.log(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def match_shape(self, values: 
Union[np.ndarray, paddle.Tensor], broadcast_array: Union[np.ndarray, paddle.Tensor]):\n        \"\"\"\n        Turns a 1-D array into an array or tensor with len(broadcast_array.shape) dims.\n\n        Args:\n            values: an array or tensor of values to extract.\n            broadcast_array: an array with a larger shape of K dimensions with the batch\n                dimension equal to the length of timesteps.\n        Returns:\n            a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n        \"\"\"\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        values = values.flatten()\n\n        while len(values.shape) < len(broadcast_array.shape):\n            values = values[..., None]\n\n        return values\n\n    def norm(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.linalg.norm(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.norm(tensor.reshape([tensor.shape[0], -1]), axis=-1).mean()\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def randn_like(self, tensor, generator=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            # np.random.randn takes dimensions as separate arguments, so unpack the shape tuple\n            return np.random.randn(*np.shape(tensor))\n        elif tensor_format == \"pd\":\n            # return paddle.randn_like(tensor)\n            return paddle.randn(tensor.shape)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def zeros_like(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.zeros_like(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.zeros_like(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport base64\nimport inspect\nimport os\nimport random\nimport sys\nfrom functools import partial\nfrom io import BytesIO\nfrom typing import List\nfrom typing import Optional\n\nimport numpy as np\nimport paddle\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom PIL import Image\nfrom stable_diffusion_img2img.clip.clip.utils import build_model\nfrom stable_diffusion_img2img.clip.clip.utils import tokenize\nfrom stable_diffusion_img2img.diffusers import AutoencoderKL\nfrom stable_diffusion_img2img.diffusers import DDIMScheduler\nfrom stable_diffusion_img2img.diffusers import LMSDiscreteScheduler\nfrom stable_diffusion_img2img.diffusers import PNDMScheduler\nfrom stable_diffusion_img2img.diffusers import UNet2DConditionModel\nfrom stable_diffusion_img2img.utils import preprocess\nfrom tqdm.auto import tqdm\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"stable_diffusion_img2img\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass StableDiffusionImg2Img:\n\n    def 
__init__(self):\n        self.vae = AutoencoderKL(in_channels=3,\n                                 out_channels=3,\n                                 down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\",\n                                                   \"DownEncoderBlock2D\"),\n                                 up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\",\n                                                 \"UpDecoderBlock2D\"),\n                                 block_out_channels=(128, 256, 512, 512),\n                                 layers_per_block=2,\n                                 act_fn=\"silu\",\n                                 latent_channels=4,\n                                 sample_size=512)\n\n        self.unet = UNet2DConditionModel(sample_size=64,\n                                         in_channels=4,\n                                         out_channels=4,\n                                         center_input_sample=False,\n                                         flip_sin_to_cos=True,\n                                         freq_shift=0,\n                                         down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\",\n                                                           \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n                                         up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\",\n                                                         \"CrossAttnUpBlock2D\"),\n                                         block_out_channels=(320, 640, 1280, 1280),\n                                         layers_per_block=2,\n                                         downsample_padding=1,\n                                         mid_block_scale_factor=1,\n                                         act_fn=\"silu\",\n                                         norm_num_groups=32,\n                          
               norm_eps=1e-5,\n                                         cross_attention_dim=768,\n                                         attention_head_dim=8)\n\n        vae_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-vae.pdparams')\n        unet_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-unet.pdparams')\n        self.unet.set_dict(paddle.load(unet_path))\n        self.vae.set_dict(paddle.load(vae_path))\n        for parameter in self.unet.parameters():\n            parameter.stop_gradient = True\n        self.vae.eval()\n        for parameter in self.vae.parameters():\n            parameter.stop_gradient = True\n        self.unet.eval()\n\n        self.text_encoder = build_model()\n        for parameter in self.text_encoder.parameters():\n            parameter.stop_gradient = True\n        self.scheduler = PNDMScheduler(beta_start=0.00085,\n                                       beta_end=0.012,\n                                       beta_schedule=\"scaled_linear\",\n                                       num_train_timesteps=1000,\n                                       skip_prk_steps=True)\n\n    def generate_image(self,\n                       text_prompts,\n                       init_image,\n                       strength: float = 0.8,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       batch_size: Optional[int] = 1,\n                       num_inference_steps=50,\n                       guidance_scale=7.5,\n                       enable_fp16=False,\n                       seed=None,\n                       eta=0.0,\n                       display_rate=5,\n                       use_gpu=True,\n                       output_dir: Optional[str] = 'stable_diffusion_img2img_out'):\n        \"\"\"\n        Create Stable Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, 
sentence, or string of words and phrases describing what the image should look like. The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe.\n        :param init_image: Initial image.\n        :param strength: Control the noise strength added to the initial image; the value is in the interval [0.0, 1.0]. The closer to 1, the bigger the change to the initial image.\n        :param style: Image style, such as oil paintings; if specified, it will be used to construct prompts.\n        :param artist: Artist style; if specified, it will be used to construct prompts.\n        :param batch_size: This variable sets the number of still images you want SD to create for each prompt.\n        :param num_inference_steps: The number of inference steps.\n        :param guidance_scale: Increase the adherence to the conditional signal, which in this case is text, as well as overall sample quality.\n        :param enable_fp16: Whether to use float16.\n        :param use_gpu: Whether to use GPU.\n        :param output_dir: Output directory.\n        :return: a DocumentArray object that has `n_batches` Documents\n        \"\"\"\n        if seed is not None:\n            np.random.seed(seed)\n            random.seed(seed)\n            paddle.seed(seed)\n\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n            text_prompts = [text_prompts]\n        elif isinstance(text_prompts, list):\n            for i, prompt in enumerate(\n                    text_prompts):  # different from dd here, dd can have multiple prompts for one image with weight.\n                text_prompts[i] = prompt.rstrip(',.，。')\n                if style is not None:\n                    text_prompts[i] += \",{}\".format(style)\n                if artist is not None:\n                    text_prompts[i] += \",{},trending on artstation\".format(artist)\n\n        if isinstance(init_image, str):\n            init_image = preprocess(Image.open(init_image))\n        else:\n            init_image = preprocess(init_image)\n\n        # set timesteps\n        accepts_offset = \"offset\" in set(inspect.signature(self.scheduler.set_timesteps).parameters.keys())\n        extra_set_kwargs = {}\n        offset = 0\n        if accepts_offset:\n            offset = 1\n            extra_set_kwargs[\"offset\"] = 1\n\n        self.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)\n\n        # encode the init image into latents and scale the latents\n        init_latents = self.vae.encode(init_image).sample()\n        init_latents = 0.18215 * init_latents\n\n        # expand init_latents for batch_size\n        init_latents = paddle.concat([init_latents] * batch_size)\n\n        # get the original timestep using 
init_timestep\n        init_timestep = int(num_inference_steps * strength) + offset\n        init_timestep = min(init_timestep, num_inference_steps)\n        if isinstance(self.scheduler, LMSDiscreteScheduler):\n            timesteps = paddle.to_tensor([num_inference_steps - init_timestep] * batch_size, dtype=\"int64\")\n        else:\n            timesteps = self.scheduler.timesteps[-init_timestep]\n            timesteps = paddle.to_tensor([timesteps] * batch_size, dtype=\"int64\")\n\n        # add noise to latents using the timesteps\n        noise = paddle.randn(init_latents.shape)\n        init_latents = self.scheduler.add_noise(init_latents, noise, timesteps)\n\n        # here `guidance_scale` is defined analog to the guidance weight `w` of equation (2)\n        # of the Imagen paper: https://arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1`\n        # corresponds to doing no classifier free guidance.\n        do_classifier_free_guidance = guidance_scale > 1.0\n\n        # prepare extra kwargs for the scheduler step, since not all schedulers have the same signature\n        # eta (η) is only used with the DDIMScheduler, it will be ignored for other schedulers.\n        # eta corresponds to η in DDIM paper: https://arxiv.org/abs/2010.02502\n        # and should be between [0, 1]\n        accepts_eta = \"eta\" in set(inspect.signature(self.scheduler.step).parameters.keys())\n        extra_step_kwargs = {}\n        if accepts_eta:\n            extra_step_kwargs[\"eta\"] = eta\n\n        da_batches = DocumentArray()\n\n        for prompt in text_prompts:\n            d = Document(tags={'prompt': prompt})\n            da_batches.append(d)\n            for i in range(batch_size):\n                d.chunks.append(Document(tags={'prompt': prompt, 'image idx': i}))\n            d.chunks.append(Document(tags={'prompt': prompt, 'image idx': 'merged'}))\n            with paddle.amp.auto_cast(enable=enable_fp16, level='O2'):\n                prompts = [prompt] * 
batch_size\n                text_input = tokenize(prompts)\n                text_embeddings = self.text_encoder(text_input)\n                if do_classifier_free_guidance:\n                    uncond_input = tokenize([\"\"] * batch_size)\n                    uncond_embeddings = self.text_encoder(uncond_input)\n                    text_embeddings = paddle.concat([uncond_embeddings, text_embeddings])\n\n                latents = init_latents\n\n                t_start = max(num_inference_steps - init_timestep + offset, 0)\n                for i, t in tqdm(enumerate(self.scheduler.timesteps[t_start:])):\n                    t_index = t_start + i\n                    # expand the latents if we are doing classifier-free guidance to avoid doing two forward passes.\n                    latent_model_input = (paddle.concat([latents] * 2) if do_classifier_free_guidance else latents)\n\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        sigma = self.scheduler.sigmas[t_index]\n                        latent_model_input = latent_model_input / ((sigma**2 + 1)**0.5)\n\n                    # predict the noise residual\n                    noise_pred = self.unet(latent_model_input, t, encoder_hidden_states=text_embeddings)[\"sample\"]\n\n                    # perform guidance\n                    if do_classifier_free_guidance:\n                        noise_pred_uncond, noise_pred_text = noise_pred.chunk(2)\n                        noise_pred = noise_pred_uncond + guidance_scale * (noise_pred_text - noise_pred_uncond)\n\n                    # compute the previous noisy sample x_t -> x_t-1\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        latents = self.scheduler.step(noise_pred, t_index, latents, **extra_step_kwargs)[\"prev_sample\"]\n                    else:\n                        latents = self.scheduler.step(noise_pred, t, latents, **extra_step_kwargs)[\"prev_sample\"]\n          
          if i % display_rate == 0:\n                        # vae decode\n                        images = self.vae.decode(1 / 0.18215 * latents)\n                        images = (images / 2 + 0.5).clip(0, 1)\n                        merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                        merge_image = (merge_image * 255).round().astype(np.uint8)\n                        merge_image = Image.fromarray(merge_image)\n                        merge_image.save(os.path.join(output_dir, f'{prompt}-progress.png'))\n                        c = Document(tags={'step': i, 'prompt': prompt})\n                        c.load_pil_image_to_datauri(merge_image)\n                        d.chunks[-1].chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(merge_image)\n                        images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                        images = (images * 255).round().astype(np.uint8)\n                        for j in range(images.shape[0]):\n                            image = Image.fromarray(images[j])\n                            c = Document(tags={'step': i, 'prompt': prompt})\n                            c.load_pil_image_to_datauri(image)\n                            d.chunks[j].chunks.append(c)\n\n                # vae decode\n                images = self.vae.decode(1 / 0.18215 * latents)\n                images = (images / 2 + 0.5).clip(0, 1)\n                merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                merge_image = (merge_image * 255).round().astype(np.uint8)\n                merge_image = Image.fromarray(merge_image)\n                merge_image.save(os.path.join(output_dir, f'{prompt}-merge.png'))\n                display.clear_output(wait=True)\n                display.display(merge_image)\n                d.load_pil_image_to_datauri(merge_image)\n                
d.chunks[-1].load_pil_image_to_datauri(merge_image)\n                images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                images = (images * 255).round().astype(np.uint8)\n                for j in range(images.shape[0]):\n                    image = Image.fromarray(images[j])\n                    image.save(os.path.join(output_dir, f'{prompt}-image-{j}.png'))\n                    d.chunks[j].load_pil_image_to_datauri(image)\n        return da_batches\n\n    @serving\n    def serving_method(self, text_prompts, init_image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        init_image = Image.open(BytesIO(base64.b64decode(init_image)))\n        results = self.generate_image(text_prompts=text_prompts, init_image=init_image, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      init_image=args.init_image,\n                                      strength=args.strength,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      batch_size=args.batch_size,\n                                      num_inference_steps=args.num_inference_steps,\n                                      guidance_scale=args.guidance_scale,\n                                      enable_fp16=args.enable_fp16,\n                                      seed=args.seed,\n                                      display_rate=args.display_rate,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_input_group.add_argument('--num_inference_steps',\n                                          type=int,\n                                          default=50,\n                                          help=\"The number of inference steps.\")\n\n        self.arg_input_group.add_argument(\n            '--guidance_scale',\n            type=float,\n            default=7.5,\n            help=\n            \"Increase the adherence to the conditional signal which in this case is text as well as overall sample quality.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n       
     help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\"During a diffusion run, you can monitor the progress of each image being created with this variable.\")\n\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--enable_fp16',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use float16 or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='stable_diffusion_img2img_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.'\n        )\n\n        self.arg_input_group.add_argument('--init_image', type=str, help='Initial image.')\n\n        self.arg_input_group.add_argument(\n            '--strength',\n            type=float,\n            help=\n            'Control the noise strength added to initial image, value is in the interval [0.0, 1.0]. The closer to 1, the bigger change to the initial image.'\n        )\n\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style, if specified, style will be used to construct prompts.')\n\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"This variable sets the number of still images you want SD to create for each prompt.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/requirements.txt",
    "content": "numpy\nftfy\nregex\ndocarray>=0.13.29\npyyaml\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_img2img/utils.py",
    "content": "import numpy as np\nimport paddle\nimport PIL\nfrom PIL import Image\n\n\ndef preprocess(image):\n    if isinstance(image, np.ndarray):\n        image = Image.fromarray(image)\n    w, h = image.size\n    w, h = map(lambda x: x - x % 32, (w, h))  # resize to integer multiple of 32\n    image = image.resize((w, h), resample=PIL.Image.LANCZOS)\n    image = np.array(image).astype(np.float32) / 255.0\n    image = image[None].transpose(0, 3, 1, 2)\n    image = paddle.to_tensor(image)\n    return 2.0 * image - 1.0\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/LICENSE",
    "content": "Copyright (c) 2022 Robin Rombach and Patrick Esser and contributors\n\nCreativeML Open RAIL-M\ndated August 22, 2022\n\nSection I: PREAMBLE\n\nMultimodal generative models are being widely adopted and used, and have the potential to transform the way artists, among other individuals, conceive and benefit from AI or ML technologies as a tool for content creation.\n\nNotwithstanding the current and potential benefits that these artifacts can bring to society at large, there are also concerns about potential misuses of them, either due to their technical limitations or ethical considerations.\n\nIn short, this license strives for both the open and responsible downstream use of the accompanying model. When it comes to the open character, we took inspiration from open source permissive licenses regarding the grant of IP rights. Referring to the downstream responsible use, we added use-based restrictions not permitting the use of the Model in very specific scenarios, in order for the licensor to be able to enforce the license in case potential misuses of the Model may occur. At the same time, we strive to promote open and responsible research on generative models for art and content generation.\n\nEven though downstream derivative versions of the model could be released under different licensing terms, the latter will always have to include - at minimum - the same use-based restrictions as the ones in the original license (this license). We believe in the intersection between open and responsible AI development; thus, this License aims to strike a balance between both in order to enable responsible open-science in the field of AI.\n\nThis License governs the use of the model (and its derivatives) and is informed by the model card associated with the model.\n\nNOW THEREFORE, You and Licensor agree as follows:\n\n1. 
Definitions\n\n- \"License\" means the terms and conditions for use, reproduction, and Distribution as defined in this document.\n- \"Data\" means a collection of information and/or content extracted from the dataset used with the Model, including to train, pretrain, or otherwise evaluate the Model. The Data is not licensed under this License.\n- \"Output\" means the results of operating a Model as embodied in informational content resulting therefrom.\n- \"Model\" means any accompanying machine-learning based assemblies (including checkpoints), consisting of learnt weights, parameters (including optimizer states), corresponding to the model architecture as embodied in the Complementary Material, that have been trained or tuned, in whole or in part on the Data, using the Complementary Material.\n- \"Derivatives of the Model\" means all modifications to the Model, works based on the Model, or any other model which is created or initialized by transfer of patterns of the weights, parameters, activations or output of the Model, to the other model, in order to cause the other model to perform similarly to the Model, including - but not limited to - distillation methods entailing the use of intermediate data representations or methods based on the generation of synthetic data by the Model for training the other model.\n- \"Complementary Material\" means the accompanying source code and scripts used to define, run, load, benchmark or evaluate the Model, and used to prepare data for training or evaluation, if any. This includes any accompanying documentation, tutorials, examples, etc, if any.\n- \"Distribution\" means any transmission, reproduction, publication or other sharing of the Model or Derivatives of the Model to a third party, including providing the Model as a hosted service made available by electronic or other remote means - e.g. 
API-based or web access.\n- \"Licensor\" means the copyright owner or entity authorized by the copyright owner that is granting the License, including the persons or entities that may have rights in the Model and/or distributing the Model.\n- \"You\" (or \"Your\") means an individual or Legal Entity exercising permissions granted by this License and/or making use of the Model for whichever purpose and in any field of use, including usage of the Model in an end-use application - e.g. chatbot, translator, image generator.\n- \"Third Parties\" means individuals or legal entities that are not under common control with Licensor or You.\n- \"Contribution\" means any work of authorship, including the original version of the Model and any modifications or additions to that Model or Derivatives of the Model thereof, that is intentionally submitted to Licensor for inclusion in the Model by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, \"submitted\" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Model, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as \"Not a Contribution.\"\n- \"Contributor\" means Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Model.\n\nSection II: INTELLECTUAL PROPERTY RIGHTS\n\nBoth copyright and patent grants apply to the Model, Derivatives of the Model and Complementary Material. The Model and Derivatives of the Model are subject to additional terms as described in Section III.\n\n2. 
Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare, publicly display, publicly perform, sublicense, and distribute the Complementary Material, the Model, and Derivatives of the Model.\n3. Grant of Patent License. Subject to the terms and conditions of this License and where and as applicable, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this paragraph) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Model and the Complementary Material, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Model to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Model and/or Complementary Material or a Contribution incorporated within the Model and/or Complementary Material constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for the Model and/or Work shall terminate as of the date such litigation is asserted or filed.\n\nSection III: CONDITIONS OF USAGE, DISTRIBUTION AND REDISTRIBUTION\n\n4. Distribution and Redistribution. You may host for Third Party remote access purposes (e.g. software-as-a-service), reproduce and distribute copies of the Model or Derivatives of the Model thereof in any medium, with or without modifications, provided that You meet the following conditions:\nUse-based restrictions as referenced in paragraph 5 MUST be included as an enforceable provision by You in any type of legal agreement (e.g. 
a license) governing the use and/or distribution of the Model or Derivatives of the Model, and You shall give notice to subsequent users You Distribute to, that the Model or Derivatives of the Model are subject to paragraph 5. This provision does not apply to the use of Complementary Material.\nYou must give any Third Party recipients of the Model or Derivatives of the Model a copy of this License;\nYou must cause any modified files to carry prominent notices stating that You changed the files;\nYou must retain all copyright, patent, trademark, and attribution notices excluding those notices that do not pertain to any part of the Model, Derivatives of the Model.\nYou may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions - respecting paragraph 4.a. - for use, reproduction, or Distribution of Your modifications, or for any such Derivatives of the Model as a whole, provided Your use, reproduction, and Distribution of the Model otherwise complies with the conditions stated in this License.\n5. Use-based restrictions. The restrictions set forth in Attachment A are considered Use-based restrictions. Therefore You cannot use the Model and the Derivatives of the Model for the specified restricted uses. You may use the Model subject to this License, including only for lawful purposes and in accordance with the License. Use may include creating any content with, finetuning, updating, running, training, evaluating and/or reparametrizing the Model. You shall require all of Your users who use the Model or a Derivative of the Model to comply with the terms of this paragraph (paragraph 5).\n6. The Output You Generate. Except as set forth herein, Licensor claims no rights in the Output You generate using the Model. You are accountable for the Output you generate and its subsequent uses. No use of the output can contravene any provision as stated in the License.\n\nSection IV: OTHER PROVISIONS\n\n7. 
Updates and Runtime Restrictions. To the maximum extent permitted by law, Licensor reserves the right to restrict (remotely or otherwise) usage of the Model in violation of this License, update the Model through electronic means, or modify the Output of the Model based on updates. You shall undertake reasonable efforts to use the latest version of the Model.\n8. Trademarks and related. Nothing in this License permits You to make use of Licensors’ trademarks, trade names, logos or to otherwise suggest endorsement or misrepresent the relationship between the parties; and any rights not expressly granted herein are reserved by the Licensors.\n9. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Model and the Complementary Material (and each Contributor provides its Contributions) on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Model, Derivatives of the Model, and the Complementary Material and assume any risks associated with Your exercise of permissions under this License.\n10. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Model and the Complementary Material (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.\n11. Accepting Warranty or Additional Liability. While redistributing the Model, Derivatives of the Model and the Complementary Material thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.\n12. 
If any provision of this License is held to be invalid, illegal or unenforceable, the remaining provisions shall be unaffected thereby and remain valid as if such provision had not been set forth herein.\n\nEND OF TERMS AND CONDITIONS\n\n\n\n\nAttachment A\n\nUse Restrictions\n\nYou agree not to use the Model or Derivatives of the Model:\n- In any way that violates any applicable national, federal, state, local or international law or regulation;\n- For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;\n- To generate or disseminate verifiably false information and/or content with the purpose of harming others;\n- To generate or disseminate personal identifiable information that can be used to harm an individual;\n- To defame, disparage or otherwise harass others;\n- For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;\n- For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;\n- To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;\n- For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;\n- To provide medical advice and medical results interpretation;\n- To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. 
by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use)."
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/README.md",
    "content": "# stable_diffusion_inpainting\n\n|模型名称|stable_diffusion_inpainting|\n| :--- | :---: |\n|类别|多模态-文图生成|\n|网络|CLIP Text Encoder+UNet+VAE|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|4.0GB|\n|最新更新日期|2022-08-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 应用效果展示\n\n  - 输入文本 \"a cat sitting on a bench\"\n\n  - 输入图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192967498-15458743-be08-4af0-b055-5bbe72c0b448.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 输入mask\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192967504-7bc17d7d-98f9-4595-b355-76280865a4ab.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 输出图像\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192966967-f7b12d1d-281e-415f-b38d-32715ab6bbb4.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - 生成过程\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/192966945-19875111-31cc-42dd-85e0-3842a8df70d3.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### 模型介绍\n\nStable Diffusion是一种潜在扩散模型(Latent Diffusion)，属于生成类模型，这类模型通过对随机噪声进行一步步地迭代降噪并采样来获得感兴趣的图像，当前取得了令人惊艳的效果。相比于Disco Diffusion, Stable Diffusion通过在低维度的潜在空间（lower dimensional latent space）而不是原像素空间来做迭代，极大地降低了内存和计算量的需求，并且在V100上一分钟之内即可以渲染出想要的图像，欢迎体验。该模块支持输入文本、一张初始图像以及一张掩码图像，对掩码部分的内容进行改变。\n\n更多详情请参考论文：[High-Resolution Image Synthesis with Latent Diffusion Models](https://arxiv.org/abs/2112.10752)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stable_diffusion_inpainting\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 
1、命令行预测\n\n  - ```shell\n    $ hub run stable_diffusion_inpainting --text_prompts \"a cat sitting on a bench\" --init_image /PATH/TO/IMAGE --mask_image /PATH/TO/IMAGE --output_dir stable_diffusion_inpainting_out\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"stable_diffusion_inpainting\")\n    text_prompts = [\"a cat sitting on a bench\"]\n    # 生成图像, 默认会在stable_diffusion_inpainting_out目录保存图像\n    # 返回的da是一个DocumentArray对象，保存了所有的结果，包括最终结果和迭代过程的中间结果\n    # 可以通过操作DocumentArray对象对生成的图像做后处理，保存或者分析\n    # 您可以设置batch_size一次生成多张\n    da = module.generate_image(text_prompts=text_prompts, init_image='/PATH/TO/IMAGE', mask_image='/PATH/TO/IMAGE', batch_size=2, output_dir='./stable_diffusion_inpainting_out/')  \n    # 展示所有的中间结果\n    da[0].chunks[-1].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # 将整个生成过程保存为一个动态图gif\n    da[0].chunks[-1].chunks.save_gif('stable_diffusion_inpainting_out-merged-result.gif')\n    # da索引的是prompt, da[0].chunks索引的是该prompt下生成的第一张图，在batch_size不为1时能同时生成多张图\n    # 您也可以按照上述操作显示单张图，如第0张的生成过程\n    da[0].chunks[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_inpainting-image-0-result.gif')\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            init_image,\n            mask_image,\n            strength: float = 0.8,\n            width_height: Optional[List[int]] = [512, 512],\n            seed: Optional[int] = None,\n            batch_size: Optional[int] = 1,\n            display_rate: Optional[int] = 5,\n            output_dir: Optional[str] = 'stable_diffusion_inpainting_out'):\n    ```\n\n    - 文图生成API，生成文本描述内容的图像。\n\n    - **参数**\n\n      - text_prompts(str): 输入的语句，描述想要生成的图像的内容。\n      - init_image(str|numpy.ndarray|PIL.Image): 输入的初始图像。\n      - mask_image(str|numpy.ndarray|PIL.Image): 输入的掩码图像。\n      - 
strength(float): 控制添加到输入图像的噪声强度，取值范围0到1。越接近1.0，图像变化越大。\n      - width_height(Optional[List[int]]): 指定最终输出图像的宽高，宽和高都需要是64的倍数，生成的图像越大，所需要的计算时间越长。\n      - seed(Optional[int]): 随机种子，由于输入默认是随机高斯噪声，设置不同的随机种子会有不同的初始输入，从而最终生成不同的结果，可以设置该参数来获得不同的输出图像。\n      - batch_size(Optional[int]): 指定每个prompt一次生成的图像的数量。\n      - display_rate(Optional[int]): 保存中间结果的频率，默认每5个step保存一次中间结果，如果不需要中间结果来让程序跑的更快，可以将这个值设大。\n      - output_dir(Optional[str]): 保存输出图像的目录，默认为\"stable_diffusion_inpainting_out\"。\n\n\n    - **返回**\n      - da(DocumentArray): DocumentArray对象， 包含`n_batches`个Documents，其中每个Document都保存了迭代过程的所有中间结果。详细可参考[DocumentArray使用文档](https://docarray.jina.ai/fundamentals/documentarray/index.html)。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文图生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m stable_diffusion_inpainting\n    ```\n\n  - 这样就完成了一个文图生成的在线服务API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果，返回的预测结果在反序列化后即是上述接口声明中说明的DocumentArray类型，返回后对结果的操作方式和使用generate_image接口完全相同。\n\n  - ```python\n    import requests\n    import json\n    import cv2\n    import base64\n    from docarray import DocumentArray\n\n    def cv2_to_base64(image):\n      data = cv2.imencode('.jpg', image)[1]\n      return base64.b64encode(data.tobytes())\n\n    # 发送HTTP请求\n    data = {'text_prompts': 'a cat sitting on a bench', 'init_image': cv2_to_base64(cv2.imread('/PATH/TO/IMAGE')),\n            'mask_image': cv2_to_base64(cv2.imread('/PATH/TO/IMAGE'))}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stable_diffusion_inpainting\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    r.json()[\"results\"]\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # 保存结果图\n    da[0].save_uri_to_file('stable_diffusion_inpainting_out.png')\n    # 将生成过程保存为一个动态图gif\n    
da[0].chunks[0].chunks.save_gif('stable_diffusion_inpainting_out.gif')\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install stable_diffusion_inpainting==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation repo is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). We use this repo here for text encoder in stable diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/clip/layers.py",
"content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not a tuple here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/clip/model.py",
"content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass TextTransformer(nn.Layer):\n\n    def __init__(self, context_length: int, vocab_size: int, transformer_width: int, transformer_heads: int,\n                 transformer_layers: int):\n        super().__init__()\n        self.context_length = context_length\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # 
mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def forward(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n            self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n               
                             output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = 
self.encode_text(text)\n\n        # normalized features (paddle.norm, since Paddle tensors take axis/keepdim rather than torch-style dim)\n        image_features = image_features / paddle.norm(image_features, p=2, axis=-1, keepdim=True)\n        text_features = text_features / paddle.norm(text_features, p=2, axis=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .model import TextTransformer\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['VITL14']\n\nURL = {'VITL14': os.path.join(os.path.dirname(__file__), 'pre_trained', 'vitl14_textencoder.pdparams')}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n          
  raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='VITL14'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'VITL14': build_vitl14_language_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    state_dict = model.state_dict()\n    for key, value in sd.items():\n        if key in state_dict:\n            state_dict[key] = value\n    model.load_dict(state_dict)\n    model.eval()\n    return model\n\n\ndef build_vitl14_language_model():\n    model = TextTransformer(context_length=77,\n                            vocab_size=49408,\n                            transformer_width=768,\n                            transformer_heads=12,\n                            transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/__init__.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = \"0.2.4\"\n\nfrom .models import AutoencoderKL, UNet2DConditionModel, UNet2DModel, VQModel\n\nfrom .schedulers import (DDIMScheduler, DDPMScheduler, KarrasVeScheduler, PNDMScheduler, SchedulerMixin,\n                         ScoreSdeVeScheduler, LMSDiscreteScheduler)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/configuration_utils.py",
"content": "# coding=utf-8\n# Copyright 2022 The HuggingFace Inc. team.\n# Copyright (c) 2022, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Configuration base class and utilities.\"\"\"\nimport functools\nimport inspect\nimport json\nimport os\nimport re\nfrom collections import OrderedDict\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Tuple\nfrom typing import Union\n\nfrom requests import HTTPError\n\nfrom paddlehub.common.logger import logger\n\nHUGGINGFACE_CO_RESOLVE_ENDPOINT = \"HUGGINGFACE_CO_RESOLVE_ENDPOINT\"\nDIFFUSERS_CACHE = \"./caches\"\n\n_re_configuration_file = re.compile(r\"config\\.(.*)\\.json\")\n\n\nclass ConfigMixin:\n    r\"\"\"\n    Base class for all configuration classes. 
Handles a few parameters common to all models' configurations as well as\n    methods for loading/downloading/saving configurations.\n\n    \"\"\"\n    config_name = \"model_config.json\"\n    ignore_for_config = []\n\n    def register_to_config(self, **kwargs):\n        if self.config_name is None:\n            raise NotImplementedError(f\"Make sure that {self.__class__} has defined a class name `config_name`\")\n        kwargs[\"_class_name\"] = self.__class__.__name__\n        kwargs[\"_diffusers_version\"] = \"0.0.1\"\n\n        for key, value in kwargs.items():\n            try:\n                setattr(self, key, value)\n            except AttributeError as err:\n                logger.error(f\"Can't set {key} with value {value} for {self}\")\n                raise err\n\n        if not hasattr(self, \"_internal_dict\"):\n            internal_dict = kwargs\n        else:\n            previous_dict = dict(self._internal_dict)\n            internal_dict = {**self._internal_dict, **kwargs}\n            logger.debug(f\"Updating config from {previous_dict} to {internal_dict}\")\n\n        self._internal_dict = FrozenDict(internal_dict)\n\n    def save_config(self, save_directory: Union[str, os.PathLike], push_to_hub: bool = False, **kwargs):\n        \"\"\"\n        Save a configuration object to the directory `save_directory`, so that it can be re-loaded using the\n        [`~ConfigMixin.from_config`] class method.\n\n        Args:\n            save_directory (`str` or `os.PathLike`):\n                Directory where the configuration JSON file will be saved (will be created if it does not exist).\n            kwargs:\n                Additional key word arguments passed along to the [`~utils.PushToHubMixin.push_to_hub`] method.\n        \"\"\"\n        if os.path.isfile(save_directory):\n            raise AssertionError(f\"Provided path ({save_directory}) should be a directory, not a file\")\n\n        os.makedirs(save_directory, exist_ok=True)\n\n        # If 
we save using the predefined names, we can load using `from_config`\n        output_config_file = os.path.join(save_directory, self.config_name)\n\n        self.to_json_file(output_config_file)\n        logger.info(f\"Configuration saved in {output_config_file}\")\n\n    @classmethod\n    def from_config(cls, pretrained_model_name_or_path: Union[str, os.PathLike], return_unused_kwargs=False, **kwargs):\n        config_dict = cls.get_config_dict(pretrained_model_name_or_path=pretrained_model_name_or_path, **kwargs)\n\n        init_dict, unused_kwargs = cls.extract_init_dict(config_dict, **kwargs)\n\n        model = cls(**init_dict)\n\n        if return_unused_kwargs:\n            return model, unused_kwargs\n        else:\n            return model\n\n    @classmethod\n    def get_config_dict(cls, pretrained_model_name_or_path: Union[str, os.PathLike],\n                        **kwargs) -> Tuple[Dict[str, Any], Dict[str, Any]]:\n        cache_dir = kwargs.pop(\"cache_dir\", DIFFUSERS_CACHE)\n        force_download = kwargs.pop(\"force_download\", False)\n        resume_download = kwargs.pop(\"resume_download\", False)\n        proxies = kwargs.pop(\"proxies\", None)\n        use_auth_token = kwargs.pop(\"use_auth_token\", None)\n        local_files_only = kwargs.pop(\"local_files_only\", False)\n        revision = kwargs.pop(\"revision\", None)\n        subfolder = kwargs.pop(\"subfolder\", None)\n\n        user_agent = {\"file_type\": \"config\"}\n\n        pretrained_model_name_or_path = str(pretrained_model_name_or_path)\n\n        if cls.config_name is None:\n            raise ValueError(\n                \"`self.config_name` is not defined. Note that one should not load a config from \"\n                \"`ConfigMixin`. 
Please make sure to define `config_name` in a class inheriting from `ConfigMixin`\")\n\n        if os.path.isfile(pretrained_model_name_or_path):\n            config_file = pretrained_model_name_or_path\n        elif os.path.isdir(pretrained_model_name_or_path):\n            if os.path.isfile(os.path.join(pretrained_model_name_or_path, cls.config_name)):\n                # Load from a PyTorch checkpoint\n                config_file = os.path.join(pretrained_model_name_or_path, cls.config_name)\n            elif subfolder is not None and os.path.isfile(\n                    os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)):\n                config_file = os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)\n            else:\n                raise EnvironmentError(\n                    f\"Error no file named {cls.config_name} found in directory {pretrained_model_name_or_path}.\")\n        else:\n            try:\n                # Load from URL or cache if already cached\n                from huggingface_hub import hf_hub_download\n                config_file = hf_hub_download(\n                    pretrained_model_name_or_path,\n                    filename=cls.config_name,\n                    cache_dir=cache_dir,\n                    force_download=force_download,\n                    proxies=proxies,\n                    resume_download=resume_download,\n                    local_files_only=local_files_only,\n                    use_auth_token=use_auth_token,\n                    user_agent=user_agent,\n                    subfolder=subfolder,\n                )\n\n            except HTTPError as err:\n                raise EnvironmentError(\"There was a specific connection error when trying to load\"\n                                       f\" {pretrained_model_name_or_path}:\\n{err}\")\n            except ValueError:\n                raise EnvironmentError(\n                    f\"We couldn't connect to 
'{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this model, couldn't find it\"\n                    f\" in the cached files and it looks like {pretrained_model_name_or_path} is not the path to a\"\n                    f\" directory containing a {cls.config_name} file.\\nCheck out your internet connection or see how to\"\n                    \" run the library in offline mode at\"\n                    \" 'https://huggingface.co/docs/diffusers/installation#offline-mode'.\")\n            except EnvironmentError:\n                raise EnvironmentError(\n                    f\"Can't load config for '{pretrained_model_name_or_path}'. If you were trying to load it from \"\n                    \"'https://huggingface.co/models', make sure you don't have a local directory with the same name. \"\n                    f\"Otherwise, make sure '{pretrained_model_name_or_path}' is the correct path to a directory \"\n                    f\"containing a {cls.config_name} file\")\n\n        try:\n            # Load config dict\n            config_dict = cls._dict_from_json_file(config_file)\n        except (json.JSONDecodeError, UnicodeDecodeError):\n            raise EnvironmentError(f\"It looks like the config file at '{config_file}' is not a valid JSON file.\")\n\n        return config_dict\n\n    @classmethod\n    def extract_init_dict(cls, config_dict, **kwargs):\n        expected_keys = set(dict(inspect.signature(cls.__init__).parameters).keys())\n        expected_keys.remove(\"self\")\n        # remove general kwargs if present in dict\n        if \"kwargs\" in expected_keys:\n            expected_keys.remove(\"kwargs\")\n        # remove keys to be ignored\n        if len(cls.ignore_for_config) > 0:\n            expected_keys = expected_keys - set(cls.ignore_for_config)\n        init_dict = {}\n        for key in expected_keys:\n            if key in kwargs:\n                # overwrite key\n                init_dict[key] = kwargs.pop(key)\n            elif key in config_dict:\n  
             # use value from config dict\n                init_dict[key] = config_dict.pop(key)\n\n        # `dict.update` returns None, so merge explicitly to keep the leftover entries\n        unused_kwargs = {**config_dict, **kwargs}\n\n        passed_keys = set(init_dict.keys())\n        if len(expected_keys - passed_keys) > 0:\n            logger.warning(\n                f\"{expected_keys - passed_keys} were not found in config. Values will be initialized to default values.\")\n\n        return init_dict, unused_kwargs\n\n    @classmethod\n    def _dict_from_json_file(cls, json_file: Union[str, os.PathLike]):\n        with open(json_file, \"r\", encoding=\"utf-8\") as reader:\n            text = reader.read()\n        return json.loads(text)\n\n    def __repr__(self):\n        return f\"{self.__class__.__name__} {self.to_json_string()}\"\n\n    @property\n    def config(self) -> Dict[str, Any]:\n        return self._internal_dict\n\n    def to_json_string(self) -> str:\n        \"\"\"\n        Serializes this instance to a JSON string.\n\n        Returns:\n            `str`: String containing all the attributes that make up this configuration instance in JSON format.\n        \"\"\"\n        config_dict = self._internal_dict if hasattr(self, \"_internal_dict\") else {}\n        return json.dumps(config_dict, indent=2, sort_keys=True) + \"\\n\"\n\n    def to_json_file(self, json_file_path: Union[str, os.PathLike]):\n        \"\"\"\n        Save this instance to a JSON file.\n\n        Args:\n            json_file_path (`str` or `os.PathLike`):\n                Path to the JSON file in which this configuration instance's parameters will be saved.\n        \"\"\"\n        with open(json_file_path, \"w\", encoding=\"utf-8\") as writer:\n            writer.write(self.to_json_string())\n\n\nclass FrozenDict(OrderedDict):\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n        for key, value in self.items():\n            setattr(self, key, value)\n\n        self.__frozen = True\n\n    def __delitem__(self, 
*args, **kwargs):\n        raise Exception(f\"You cannot use ``__delitem__`` on a {self.__class__.__name__} instance.\")\n\n    def setdefault(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``setdefault`` on a {self.__class__.__name__} instance.\")\n\n    def pop(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``pop`` on a {self.__class__.__name__} instance.\")\n\n    def update(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``update`` on a {self.__class__.__name__} instance.\")\n\n    def __setattr__(self, name, value):\n        if hasattr(self, \"__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setattr__`` on a {self.__class__.__name__} instance.\")\n        super().__setattr__(name, value)\n\n    def __setitem__(self, name, value):\n        if hasattr(self, \"__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setitem__`` on a {self.__class__.__name__} instance.\")\n        super().__setitem__(name, value)\n\n\ndef register_to_config(init):\n    \"\"\"\n    Decorator to apply on the init of classes inheriting from `ConfigMixin` so that all the arguments are automatically\n    sent to `self.register_to_config`. 
To ignore a specific argument accepted by the init but that shouldn't be\n    registered in the config, use the `ignore_for_config` class variable.\n\n    Warning: Once decorated, all private arguments (beginning with an underscore) are trashed and not sent to the init!\n    \"\"\"\n\n    @functools.wraps(init)\n    def inner_init(self, *args, **kwargs):\n        # Ignore private kwargs in the init.\n        init_kwargs = {k: v for k, v in kwargs.items() if not k.startswith(\"_\")}\n        init(self, *args, **init_kwargs)\n        if not isinstance(self, ConfigMixin):\n            raise RuntimeError(\n                f\"`@register_to_config` was applied to {self.__class__.__name__} init method, but this class does \"\n                \"not inherit from `ConfigMixin`.\")\n\n        ignore = getattr(self, \"ignore_for_config\", [])\n        # Get positional arguments aligned with kwargs\n        new_kwargs = {}\n        signature = inspect.signature(init)\n        parameters = {\n            name: p.default\n            for i, (name, p) in enumerate(signature.parameters.items()) if i > 0 and name not in ignore\n        }\n        for arg, name in zip(args, parameters.keys()):\n            new_kwargs[name] = arg\n\n        # Then add all kwargs\n        new_kwargs.update({\n            k: init_kwargs.get(k, default)\n            for k, default in parameters.items() if k not in ignore and k not in new_kwargs\n        })\n        getattr(self, \"register_to_config\")(**new_kwargs)\n\n    return inner_init\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/README.md",
    "content": "# Models\n\n- Models: Neural network that models $p_\\theta(\\mathbf{x}_{t-1}|\\mathbf{x}_t)$ (see image below) and is trained end-to-end to denoise a noisy input to an image. Examples: UNet, Conditioned UNet, 3D UNet, Transformer UNet\n\n## API\n\nTODO(Suraj, Patrick)\n\n## Examples\n\nTODO(Suraj, Patrick)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .unet_2d import UNet2DModel\nfrom .unet_2d_condition import UNet2DConditionModel\nfrom .vae import AutoencoderKL\nfrom .vae import VQModel\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/attention.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom inspect import isfunction\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef finfo(dtype):\n    if dtype == paddle.float32:\n        return np.finfo(np.float32)\n    if dtype == paddle.float16:\n        return np.finfo(np.float16)\n    if dtype == paddle.float64:\n        return np.finfo(np.float64)\n\n\npaddle.finfo = finfo\n\n\nclass AttentionBlockNew(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other. 
Originally ported from here, but adapted\n    to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    Uses three q, k, v linear layers to compute attention\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_head_channels=None,\n        num_groups=32,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n\n        self.num_heads = channels // num_head_channels if num_head_channels is not None else 1\n        self.num_head_size = num_head_channels\n        self.group_norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n\n        # define q,k,v as linear layers\n        self.query = nn.Linear(channels, channels)\n        self.key = nn.Linear(channels, channels)\n        self.value = nn.Linear(channels, channels)\n\n        self.rescale_output_factor = rescale_output_factor\n        self.proj_attn = nn.Linear(channels, channels)\n\n    def transpose_for_scores(self, projection: paddle.Tensor) -> paddle.Tensor:\n        new_projection_shape = projection.shape[:-1] + [self.num_heads, -1]\n        # move heads to 2nd position (B, T, H * D) -> (B, T, H, D) -> (B, H, T, D)\n        new_projection = projection.reshape(new_projection_shape).transpose([0, 2, 1, 3])\n        return new_projection\n\n    def forward(self, hidden_states):\n        residual = hidden_states\n        batch, channel, height, width = hidden_states.shape\n\n        # norm\n        hidden_states = self.group_norm(hidden_states)\n\n        hidden_states = hidden_states.reshape([batch, channel, height * width]).transpose([0, 2, 1])\n\n        # proj to q, k, v\n        query_proj = self.query(hidden_states)\n        key_proj = self.key(hidden_states)\n        value_proj = self.value(hidden_states)\n\n        # transpose\n        query_states = self.transpose_for_scores(query_proj)\n        
key_states = self.transpose_for_scores(key_proj)\n        value_states = self.transpose_for_scores(value_proj)\n\n        # get scores\n        scale = 1 / math.sqrt(math.sqrt(self.channels / self.num_heads))\n        attention_scores = paddle.matmul(query_states * scale, key_states * scale, transpose_y=True)\n        attention_probs = F.softmax(attention_scores.astype(\"float32\"), axis=-1).astype(attention_scores.dtype)\n\n        # compute attention output\n        context_states = paddle.matmul(attention_probs, value_states)\n\n        context_states = context_states.transpose([0, 2, 1, 3])\n        new_context_states_shape = context_states.shape[:-2] + [\n            self.channels,\n        ]\n        context_states = context_states.reshape(new_context_states_shape)\n\n        # compute next hidden_states\n        hidden_states = self.proj_attn(context_states)\n        hidden_states = hidden_states.transpose([0, 2, 1]).reshape([batch, channel, height, width])\n\n        # res connect and rescale\n        hidden_states = (hidden_states + residual) / self.rescale_output_factor\n        return hidden_states\n\n    def set_weight(self, attn_layer):\n        self.group_norm.weight.set_value(attn_layer.norm.weight)\n        self.group_norm.bias.set_value(attn_layer.norm.bias)\n\n        if hasattr(attn_layer, \"q\"):\n            self.query.weight.set_value(attn_layer.q.weight[:, :, 0, 0])\n            self.key.weight.set_value(attn_layer.k.weight[:, :, 0, 0])\n            self.value.weight.set_value(attn_layer.v.weight[:, :, 0, 0])\n\n            self.query.bias.set_value(attn_layer.q.bias)\n            self.key.bias.set_value(attn_layer.k.bias)\n            self.value.bias.set_value(attn_layer.v.bias)\n\n            self.proj_attn.weight.set_value(attn_layer.proj_out.weight[:, :, 0, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj_out.bias)\n        elif hasattr(attn_layer, \"NIN_0\"):\n            
self.query.weight.set_value(attn_layer.NIN_0.W.t())\n            self.key.weight.set_value(attn_layer.NIN_1.W.t())\n            self.value.weight.set_value(attn_layer.NIN_2.W.t())\n\n            self.query.bias.set_value(attn_layer.NIN_0.b)\n            self.key.bias.set_value(attn_layer.NIN_1.b)\n            self.value.bias.set_value(attn_layer.NIN_2.b)\n\n            self.proj_attn.weight.set_value(attn_layer.NIN_3.W.t())\n            self.proj_attn.bias.set_value(attn_layer.NIN_3.b)\n\n            self.group_norm.weight.set_value(attn_layer.GroupNorm_0.weight)\n            self.group_norm.bias.set_value(attn_layer.GroupNorm_0.bias)\n        else:\n            qkv_weight = attn_layer.qkv.weight.reshape(\n                [self.num_heads, 3 * self.channels // self.num_heads, self.channels])\n            qkv_bias = attn_layer.qkv.bias.reshape([self.num_heads, 3 * self.channels // self.num_heads])\n\n            q_w, k_w, v_w = qkv_weight.split(self.channels // self.num_heads, axis=1)\n            q_b, k_b, v_b = qkv_bias.split(self.channels // self.num_heads, axis=1)\n\n            self.query.weight.set_value(q_w.reshape([-1, self.channels]))\n            self.key.weight.set_value(k_w.reshape([-1, self.channels]))\n            self.value.weight.set_value(v_w.reshape([-1, self.channels]))\n\n            self.query.bias.set_value(q_b.flatten())\n            self.key.bias.set_value(k_b.flatten())\n            self.value.bias.set_value(v_b.flatten())\n\n            self.proj_attn.weight.set_value(attn_layer.proj.weight[:, :, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj.bias)\n\n\nclass SpatialTransformer(nn.Layer):\n    \"\"\"\n    Transformer block for image-like data. First, project the input (aka embedding) and reshape to b, t, d. Then apply\n    standard transformer action. 
Finally, reshape to image\n    \"\"\"\n\n    def __init__(self, in_channels, n_heads, d_head, depth=1, dropout=0.0, context_dim=None):\n        super().__init__()\n        self.n_heads = n_heads\n        self.d_head = d_head\n        self.in_channels = in_channels\n        inner_dim = n_heads * d_head\n        self.norm = nn.GroupNorm(num_groups=32, num_channels=in_channels, epsilon=1e-6)\n\n        self.proj_in = nn.Conv2D(in_channels, inner_dim, kernel_size=1, stride=1, padding=0)\n\n        self.transformer_blocks = nn.LayerList([\n            BasicTransformerBlock(inner_dim, n_heads, d_head, dropout=dropout, context_dim=context_dim)\n            for d in range(depth)\n        ])\n\n        self.proj_out = nn.Conv2D(inner_dim, in_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, context=None):\n        # note: if no context is given, cross-attention defaults to self-attention\n        b, c, h, w = x.shape\n        x_in = x\n        x = self.norm(x)\n        x = self.proj_in(x)\n        x = x.transpose([0, 2, 3, 1]).reshape([b, h * w, c])\n        for block in self.transformer_blocks:\n            x = block(x, context=context)\n        x = x.reshape([b, h, w, c]).transpose([0, 3, 1, 2])\n        x = self.proj_out(x)\n        return x + x_in\n\n    def set_weight(self, layer):\n        self.norm = layer.norm\n        self.proj_in = layer.proj_in\n        self.transformer_blocks = layer.transformer_blocks\n        self.proj_out = layer.proj_out\n\n\nclass BasicTransformerBlock(nn.Layer):\n\n    def __init__(self, dim, n_heads, d_head, dropout=0.0, context_dim=None, gated_ff=True, checkpoint=True):\n        super().__init__()\n        self.attn1 = CrossAttention(query_dim=dim, heads=n_heads, dim_head=d_head,\n                                    dropout=dropout)  # is a self-attention\n        self.ff = FeedForward(dim, dropout=dropout, glu=gated_ff)\n        self.attn2 = CrossAttention(query_dim=dim,\n                                    
context_dim=context_dim,\n                                    heads=n_heads,\n                                    dim_head=d_head,\n                                    dropout=dropout)  # is self-attn if context is none\n        self.norm1 = nn.LayerNorm(dim)\n        self.norm2 = nn.LayerNorm(dim)\n        self.norm3 = nn.LayerNorm(dim)\n        self.checkpoint = checkpoint\n\n    def forward(self, x, context=None):\n        x = self.attn1(self.norm1(x)) + x\n        x = self.attn2(self.norm2(x), context=context) + x\n        x = self.ff(self.norm3(x)) + x\n        return x\n\n\nclass CrossAttention(nn.Layer):\n\n    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, dropout=0.0):\n        super().__init__()\n        inner_dim = dim_head * heads\n        context_dim = default(context_dim, query_dim)\n\n        self.scale = dim_head**-0.5\n        self.heads = heads\n\n        self.to_q = nn.Linear(query_dim, inner_dim, bias_attr=False)\n        self.to_k = nn.Linear(context_dim, inner_dim, bias_attr=False)\n        self.to_v = nn.Linear(context_dim, inner_dim, bias_attr=False)\n\n        self.to_out = nn.Sequential(nn.Linear(inner_dim, query_dim), nn.Dropout(dropout))\n\n    def reshape_heads_to_batch_dim(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size, seq_len, head_size, dim // head_size])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size * head_size, seq_len, dim // head_size])\n        return tensor\n\n    def reshape_batch_dim_to_heads(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size // head_size, head_size, seq_len, dim])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size // head_size, seq_len, dim * head_size])\n        return tensor\n\n    def forward(self, x, context=None, mask=None):\n        batch_size, 
sequence_length, dim = x.shape\n\n        h = self.heads\n\n        q = self.to_q(x)\n        context = default(context, x)\n        k = self.to_k(context)\n        v = self.to_v(context)\n\n        q = self.reshape_heads_to_batch_dim(q)\n        k = self.reshape_heads_to_batch_dim(k)\n        v = self.reshape_heads_to_batch_dim(v)\n\n        sim = paddle.einsum(\"b i d, b j d -> b i j\", q * self.scale, k)\n\n        if exists(mask):\n            mask = mask.reshape([batch_size, -1])\n            max_neg_value = -paddle.finfo(sim.dtype).max\n            # broadcast the mask over heads; Paddle tensors have no `repeat`/`masked_fill_`,\n            # so use `tile` and `where` instead of the torch-style in-place fill\n            mask = paddle.tile(mask[:, None, :], [h, 1, 1]).astype(\"bool\")\n            sim = paddle.where(mask, sim, paddle.full_like(sim, max_neg_value))\n\n        # attention, what we cannot get enough of\n        attn = F.softmax(sim, axis=-1)\n\n        out = paddle.einsum(\"b i j, b j d -> b i d\", attn, v)\n        out = self.reshape_batch_dim_to_heads(out)\n        return self.to_out(out)\n\n\nclass FeedForward(nn.Layer):\n\n    def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.0):\n        super().__init__()\n        inner_dim = int(dim * mult)\n        dim_out = default(dim_out, dim)\n        project_in = nn.Sequential(nn.Linear(dim, inner_dim), nn.GELU()) if not glu else GEGLU(dim, inner_dim)\n\n        self.net = nn.Sequential(project_in, nn.Dropout(dropout), nn.Linear(inner_dim, dim_out))\n\n    def forward(self, x):\n        return self.net(x)\n\n\n# feedforward\nclass GEGLU(nn.Layer):\n\n    def __init__(self, dim_in, dim_out):\n        super().__init__()\n        self.proj = nn.Linear(dim_in, dim_out * 2)\n\n    def forward(self, x):\n        x, gate = self.proj(x).chunk(2, axis=-1)\n        return x * F.gelu(gate)\n\n\n# TODO(Patrick) - remove once all weights have been converted -> not needed anymore then\nclass NIN(nn.Layer):\n\n    def __init__(self, in_dim, num_units, init_scale=0.1):\n        super().__init__()\n        self.W = self.create_parameter(shape=[in_dim, num_units], default_initializer=nn.initializer.Constant(0.))\n        
self.b = self.create_parameter(shape=[\n            num_units,\n        ],\n                                       is_bias=True,\n                                       default_initializer=nn.initializer.Constant(0.))\n\n\ndef exists(val):\n    return val is not None\n\n\ndef default(val, d):\n    if exists(val):\n        return val\n    return d() if isfunction(d) else d\n\n\n# the main attention block that is used for all models\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=None,\n        num_groups=32,\n        encoder_channels=None,\n        overwrite_qkv=False,\n        overwrite_linear=False,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels is None:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n\n        self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n        self.qkv = nn.Conv1D(channels, channels * 3, 1)\n        self.n_heads = self.num_heads\n        self.rescale_output_factor = rescale_output_factor\n\n        if encoder_channels is not None:\n            self.encoder_kv = nn.Conv1D(encoder_channels, channels * 2, 1)\n\n        self.proj = nn.Conv1D(channels, channels, 1)\n\n        self.overwrite_qkv = overwrite_qkv\n        self.overwrite_linear = overwrite_linear\n\n        if overwrite_qkv:\n         
   in_channels = channels\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.q = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.k = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.v = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.proj_out = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n        elif self.overwrite_linear:\n            num_groups = min(channels // 4, 32)\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.NIN_0 = NIN(channels, channels)\n            self.NIN_1 = NIN(channels, channels)\n            self.NIN_2 = NIN(channels, channels)\n            self.NIN_3 = NIN(channels, channels)\n\n            self.GroupNorm_0 = nn.GroupNorm(num_groups=num_groups, num_channels=channels, epsilon=1e-6)\n        else:\n            self.proj_out = nn.Conv1D(channels, channels, 1)\n            self.set_weights(self)\n\n        self.is_overwritten = False\n\n    def set_weights(self, layer):\n        if self.overwrite_qkv:\n            qkv_weight = paddle.concat([layer.q.weight, layer.k.weight, layer.v.weight], axis=0)[:, :, :, 0]\n            qkv_bias = paddle.concat([layer.q.bias, layer.k.bias, layer.v.bias], axis=0)\n\n            self.qkv.weight.set_value(qkv_weight)\n            self.qkv.bias.set_value(qkv_bias)\n\n            proj_out = nn.Conv1D(self.channels, self.channels, 1)\n            proj_out.weight.set_value(layer.proj_out.weight[:, :, :, 0])\n            proj_out.bias.set_value(layer.proj_out.bias)\n\n            self.proj = proj_out\n        elif self.overwrite_linear:\n            self.qkv.weight.set_value(\n                paddle.concat([self.NIN_0.W.t(), self.NIN_1.W.t(), self.NIN_2.W.t()], axis=0)[:, :, None])\n            
self.qkv.bias.set_value(paddle.concat([self.NIN_0.b, self.NIN_1.b, self.NIN_2.b], axis=0))\n\n            self.proj.weight.set_value(self.NIN_3.W.t()[:, :, None])\n            self.proj.bias.set_value(self.NIN_3.b)\n\n            self.norm.weight.set_value(self.GroupNorm_0.weight)\n            self.norm.bias.set_value(self.GroupNorm_0.bias)\n        else:\n            self.proj.weight.set_value(self.proj_out.weight)\n            self.proj.bias.set_value(self.proj_out.bias)\n\n    def forward(self, x, encoder_out=None):\n        if not self.is_overwritten and (self.overwrite_qkv or self.overwrite_linear):\n            self.set_weights(self)\n            self.is_overwritten = True\n\n        b, c, *spatial = x.shape\n        hid_states = self.norm(x).reshape([b, c, -1])\n\n        qkv = self.qkv(hid_states)\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.reshape([bs * self.n_heads, ch * 3, length]).split(ch, axis=1)\n\n        if encoder_out is not None:\n            encoder_kv = self.encoder_kv(encoder_out)\n            assert encoder_kv.shape[1] == self.n_heads * ch * 2\n            ek, ev = encoder_kv.reshape([bs * self.n_heads, ch * 2, -1]).split(ch, axis=1)\n            k = paddle.concat([ek, k], axis=-1)\n            v = paddle.concat([ev, v], axis=-1)\n\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = F.softmax(weight.astype(\"float32\"), axis=-1).astype(weight.dtype)\n\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        h = a.reshape([bs, -1, length])\n\n        h = self.proj(h)\n        h = h.reshape([b, c, *spatial])\n\n        result = x + h\n\n        result = result / self.rescale_output_factor\n\n        return result\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/embeddings.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef get_timestep_embedding(timesteps,\n                           embedding_dim,\n                           flip_sin_to_cos=False,\n                           downscale_freq_shift=1,\n                           scale=1,\n                           max_period=10000):\n    \"\"\"\n    This matches the implementation in Denoising Diffusion Probabilistic Models: Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param embedding_dim: the dimension of the output. :param max_period: controls the minimum frequency of the\n    embeddings. 
:return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    assert len(timesteps.shape) == 1, \"Timesteps should be a 1d-array\"\n\n    half_dim = embedding_dim // 2\n    exponent = -math.log(max_period) * paddle.arange(start=0, end=half_dim, dtype=\"float32\")\n    exponent = exponent / (half_dim - downscale_freq_shift)\n\n    emb = paddle.exp(exponent)\n    emb = timesteps[:, None].astype(\"float32\") * emb[None, :]\n\n    # scale embeddings\n    emb = scale * emb\n\n    # concat sine and cosine embeddings\n    emb = paddle.concat([paddle.sin(emb), paddle.cos(emb)], axis=-1)\n\n    # flip sine and cosine embeddings\n    if flip_sin_to_cos:\n        emb = paddle.concat([emb[:, half_dim:], emb[:, :half_dim]], axis=-1)\n\n    # zero pad (paddle.concat expects a list of tensors)\n    if embedding_dim % 2 == 1:\n        emb = paddle.concat([emb, paddle.zeros([emb.shape[0], 1])], axis=-1)\n    return emb\n\n\nclass TimestepEmbedding(nn.Layer):\n\n    def __init__(self, channel, time_embed_dim, act_fn=\"silu\"):\n        super().__init__()\n\n        self.linear_1 = nn.Linear(channel, time_embed_dim)\n        self.act = None\n        if act_fn == \"silu\":\n            self.act = nn.Silu()\n        self.linear_2 = nn.Linear(time_embed_dim, time_embed_dim)\n\n    def forward(self, sample):\n        sample = self.linear_1(sample)\n\n        if self.act is not None:\n            sample = self.act(sample)\n\n        sample = self.linear_2(sample)\n        return sample\n\n\nclass Timesteps(nn.Layer):\n\n    def __init__(self, num_channels, flip_sin_to_cos, downscale_freq_shift):\n        super().__init__()\n        self.num_channels = num_channels\n        self.flip_sin_to_cos = flip_sin_to_cos\n        self.downscale_freq_shift = downscale_freq_shift\n\n    def forward(self, timesteps):\n        t_emb = get_timestep_embedding(\n            timesteps,\n            self.num_channels,\n            flip_sin_to_cos=self.flip_sin_to_cos,\n            downscale_freq_shift=self.downscale_freq_shift,\n        )\n 
       return t_emb\n\n\nclass GaussianFourierProjection(nn.Layer):\n    \"\"\"Gaussian Fourier embeddings for noise levels.\"\"\"\n\n    def __init__(self, embedding_size=256, scale=1.0):\n        super().__init__()\n        self.register_buffer(\"weight\", paddle.randn((embedding_size, )) * scale)\n\n        # to delete later\n        self.register_buffer(\"W\", paddle.randn((embedding_size, )) * scale)\n\n        self.weight = self.W\n\n    def forward(self, x):\n        x = paddle.log(x)\n        x_proj = x[:, None] * self.weight[None, :] * 2 * np.pi\n        out = paddle.concat([paddle.sin(x_proj), paddle.cos(x_proj)], axis=-1)\n        return out\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/resnet.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef pad_new(x, pad, mode=\"constant\", value=0):\n    new_pad = []\n    for _ in range(x.ndim * 2 - len(pad)):\n        new_pad.append(0)\n    ndim = list(range(x.ndim - 1, 0, -1))\n    axes_start = {}\n    for i, _pad in enumerate(pad):\n        if _pad < 0:\n            new_pad.append(0)\n            zhengshu, yushu = divmod(i, 2)\n            if yushu == 0:\n                axes_start[ndim[zhengshu]] = -_pad\n        else:\n            new_pad.append(_pad)\n\n    padded = paddle.nn.functional.pad(x, new_pad, mode=mode, value=value)\n    padded_shape = paddle.shape(padded)\n    axes = []\n    starts = []\n    ends = []\n    for k, v in axes_start.items():\n        axes.append(k)\n        starts.append(v)\n        ends.append(padded_shape[k])\n        assert v < padded_shape[k]\n\n    if axes:\n        return padded.slice(axes=axes, starts=starts, ends=ends)\n    else:\n        return padded\n\n\nclass Upsample2D(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, use_conv_transpose=False, out_channels=None, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_conv_transpose = use_conv_transpose\n        self.name = name\n\n        conv = None\n        if use_conv_transpose:\n            conv = nn.Conv2DTranspose(channels, self.out_channels, 4, 2, 1)\n        elif use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, padding=1)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.conv = conv\n        else:\n            self.Conv2d_0 = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv_transpose:\n            return self.conv(x)\n\n        x = F.interpolate(x, scale_factor=2.0, mode=\"nearest\")\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if self.use_conv:\n            if self.name == \"conv\":\n                x = self.conv(x)\n            else:\n                x = self.Conv2d_0(x)\n\n        return x\n\n\nclass Downsample2D(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, out_channels=None, padding=1, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.padding = padding\n        stride = 2\n        self.name = name\n\n        if use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, stride=stride, padding=padding)\n        else:\n            assert self.channels == self.out_channels\n            conv = nn.AvgPool2D(kernel_size=stride, stride=stride)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.Conv2d_0 = conv\n            self.conv = conv\n        elif name == \"Conv2d_0\":\n            self.conv = conv\n        else:\n            self.conv = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv and self.padding == 0:\n            pad = (0, 1, 0, 1)\n            x = pad_new(x, pad, mode=\"constant\", value=0)\n\n        assert x.shape[1] == self.channels\n        x = self.conv(x)\n\n        return x\n\n\nclass FirUpsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.use_conv = use_conv\n        self.fir_kernel = fir_kernel\n        self.out_channels = out_channels\n\n    def _upsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `upsample_2d()` followed by `Conv2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. 
The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary\n        order.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n            C]`.\n        w: Weight tensor of the shape `[filterH, filterW, inChannels,\n            outChannels]`. Grouped convolution can be performed by `inChannels = x.shape[0] // numGroups`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n            (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]` or `[N, H * factor, W * factor, C]`, and same datatype as\n        `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n\n        # Setup filter kernel.\n        if k is None:\n            k = [1] * factor\n\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * (gain * (factor**2))\n\n        if self.use_conv:\n            convH = w.shape[2]\n            convW = w.shape[3]\n            inC = w.shape[1]\n\n            p = (k.shape[0] - factor) - (convW - 1)\n\n            stride = (factor, factor)\n            # Determine data dimensions.\n            output_shape = ((x.shape[2] - 1) * factor + convH, (x.shape[3] - 1) * factor + convW)\n            output_padding = (\n                output_shape[0] - (x.shape[2] - 1) * stride[0] - convH,\n                output_shape[1] - (x.shape[3] - 1) * stride[1] - convW,\n            )\n            assert output_padding[0] >= 0 and output_padding[1] >= 0\n            num_groups = x.shape[1] // inC\n\n            # Transpose 
weights.\n            w = paddle.reshape(w, (num_groups, -1, inC, convH, convW))\n            w = w[..., ::-1, ::-1].transpose([0, 2, 1, 3, 4])\n            w = paddle.reshape(w, (num_groups * inC, -1, convH, convW))\n\n            x = F.conv2d_transpose(x, w, stride=stride, output_padding=output_padding, padding=0)\n\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2 + factor - 1, p // 2 + 1))\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            h = self._upsample_2d(x, self.Conv2d_0.weight, k=self.fir_kernel)\n            h = h + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            h = self._upsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return h\n\n\nclass FirDownsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.fir_kernel = fir_kernel\n        self.use_conv = use_conv\n        self.out_channels = out_channels\n\n    def _downsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `Conv2d()` followed by `downsample_2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n            x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`. w: Weight tensor of the shape `[filterH,\n            filterW, inChannels, outChannels]`. 
Grouped convolution can be performed by `inChannels = x.shape[0] //\n            numGroups`. k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). The default is `[1] *\n            factor`, which corresponds to average pooling. factor: Integer downsampling factor (default: 2). gain:\n            Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n            Tensor of the shape `[N, C, H // factor, W // factor]` or `[N, H // factor, W // factor, C]`, and same\n            datatype as `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * gain\n\n        if self.use_conv:\n            _, _, convH, convW = w.shape\n            p = (k.shape[0] - factor) + (convW - 1)\n            s = [factor, factor]\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2, p // 2))\n            x = F.conv2d(x, w, stride=s, padding=0)\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            x = self._downsample_2d(x, w=self.Conv2d_0.weight, k=self.fir_kernel)\n            x = x + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            x = self._downsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return x\n\n\nclass ResnetBlock(nn.Layer):\n\n    def __init__(\n        self,\n        *,\n        in_channels,\n        out_channels=None,\n        conv_shortcut=False,\n        dropout=0.0,\n        temb_channels=512,\n        groups=32,\n        groups_out=None,\n        pre_norm=True,\n        eps=1e-6,\n        non_linearity=\"swish\",\n        time_embedding_norm=\"default\",\n        
kernel=None,\n        output_scale_factor=1.0,\n        use_nin_shortcut=None,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.pre_norm = pre_norm\n        self.pre_norm = True\n        self.in_channels = in_channels\n        out_channels = in_channels if out_channels is None else out_channels\n        self.out_channels = out_channels\n        self.use_conv_shortcut = conv_shortcut\n        self.time_embedding_norm = time_embedding_norm\n        self.up = up\n        self.down = down\n        self.output_scale_factor = output_scale_factor\n\n        if groups_out is None:\n            groups_out = groups\n\n        self.norm1 = nn.GroupNorm(num_groups=groups, num_channels=in_channels, epsilon=eps)\n\n        self.conv1 = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if temb_channels is not None:\n            self.time_emb_proj = nn.Linear(temb_channels, out_channels)\n        else:\n            self.time_emb_proj = None\n\n        self.norm2 = nn.GroupNorm(num_groups=groups_out, num_channels=out_channels, epsilon=eps)\n        self.dropout = nn.Dropout(dropout)\n        self.conv2 = nn.Conv2D(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if non_linearity == \"swish\":\n            self.nonlinearity = lambda x: F.silu(x)\n        elif non_linearity == \"mish\":\n            self.nonlinearity = Mish()\n        elif non_linearity == \"silu\":\n            self.nonlinearity = nn.Silu()\n\n        self.upsample = self.downsample = None\n        if self.up:\n            if kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.upsample = lambda x: upsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.upsample = partial(F.interpolate, scale_factor=2.0, mode=\"nearest\")\n            else:\n                self.upsample = Upsample2D(in_channels, use_conv=False)\n        elif self.down:\n            if 
kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.downsample = lambda x: downsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.downsample = partial(F.avg_pool2d, kernel_size=2, stride=2)\n            else:\n                self.downsample = Downsample2D(in_channels, use_conv=False, padding=1, name=\"op\")\n\n        self.use_nin_shortcut = self.in_channels != self.out_channels if use_nin_shortcut is None else use_nin_shortcut\n\n        self.conv_shortcut = None\n        if self.use_nin_shortcut:\n            self.conv_shortcut = nn.Conv2D(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, temb):\n        h = x\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm1(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        if self.upsample is not None:\n            x = self.upsample(x)\n            h = self.upsample(h)\n        elif self.downsample is not None:\n            x = self.downsample(x)\n            h = self.downsample(h)\n\n        h = self.conv1(h)\n\n        if temb is not None:\n            temb = self.time_emb_proj(self.nonlinearity(temb))[:, :, None, None]\n            h = h + temb\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm2(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        h = self.dropout(h)\n        h = self.conv2(h)\n\n        if self.conv_shortcut is not None:\n            x = self.conv_shortcut(x)\n\n        out = (x + h) / self.output_scale_factor\n\n        return out\n\n\nclass Mish(nn.Layer):\n\n    def forward(self, x):\n        return x * F.tanh(F.softplus(x))\n\n\ndef upsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Upsample2D a batch of 2D images with the given filter.\n\n    Args:\n    Accepts a batch of 2D images of 
the shape `[N, C, H, W]` or `[N, H, W, C]` and upsamples each image with the given\n    filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the specified\n    `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its shape is a:\n    multiple of the upsampling factor.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]`\n    \"\"\"\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * (gain * (factor**2))\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n\ndef downsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Downsample2D a batch of 2D images with the given filter.\n\n    Args:\n    Accepts a batch of 2D images of the shape `[N, C, H, W]` or `[N, H, W, C]` and downsamples each image with the\n    given filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the\n    specified `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its\n    shape is a multiple of the downsampling factor.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). 
The default is `[1] * factor`, which corresponds to average pooling.\n        factor: Integer downsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H // factor, W // factor]`\n    \"\"\"\n\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * gain\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n\ndef upfirdn2d_native(input, kernel, up=1, down=1, pad=(0, 0)):\n    up_x = up_y = up\n    down_x = down_y = down\n    pad_x0 = pad_y0 = pad[0]\n    pad_x1 = pad_y1 = pad[1]\n\n    _, channel, in_h, in_w = input.shape\n    input = input.reshape([-1, in_h, in_w, 1])\n\n    _, in_h, in_w, minor = input.shape\n    kernel_h, kernel_w = kernel.shape\n\n    out = input.reshape([-1, in_h, 1, in_w, 1, minor])\n    # TODO\n    out = pad_new(out, [0, 0, 0, up_x - 1, 0, 0, 0, up_y - 1])\n    out = out.reshape([-1, in_h * up_y, in_w * up_x, minor])\n\n    out = pad_new(out, [0, 0, max(pad_x0, 0), max(pad_x1, 0), max(pad_y0, 0), max(pad_y1, 0)])\n    out = out[:, max(-pad_y0, 0):out.shape[1] - max(-pad_y1, 0), max(-pad_x0, 0):out.shape[2] - max(-pad_x1, 0), :, ]\n\n    out = out.transpose([0, 3, 1, 2])\n    out = out.reshape([-1, 1, in_h * up_y + pad_y0 + pad_y1, in_w * up_x + pad_x0 + pad_x1])\n    w = paddle.flip(kernel, [0, 1]).reshape([1, 1, kernel_h, kernel_w])\n    out = F.conv2d(out, w)\n    out = out.reshape(\n        [-1, minor, in_h * up_y + pad_y0 + pad_y1 - kernel_h + 1, in_w * up_x + pad_x0 + pad_x1 - kernel_w + 1])\n    out = out.transpose([0, 2, 3, 1])\n    out = out[:, ::down_y, ::down_x, :]\n\n    out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1\n    out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1\n\n    
return out.reshape([-1, channel, out_h, out_w])\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/unet_2d.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import GaussianFourierProjection\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass UNet2DModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=None,\n        in_channels=3,\n        out_channels=3,\n        center_input_sample=False,\n        time_embedding_type=\"positional\",\n        freq_shift=0,\n        flip_sin_to_cos=True,\n        down_block_types=(\"DownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\"),\n        up_block_types=(\"AttnUpBlock2D\", \"AttnUpBlock2D\", \"AttnUpBlock2D\", \"UpBlock2D\"),\n        block_out_channels=(224, 448, 672, 896),\n        layers_per_block=2,\n        mid_block_scale_factor=1,\n        downsample_padding=1,\n        act_fn=\"silu\",\n        attention_head_dim=8,\n        norm_num_groups=32,\n        norm_eps=1e-5,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n    
    # input\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        if time_embedding_type == \"fourier\":\n            self.time_proj = GaussianFourierProjection(embedding_size=block_out_channels[0], scale=16)\n            timestep_input_dim = 2 * block_out_channels[0]\n        elif time_embedding_type == \"positional\":\n            self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n            timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=attention_head_dim,\n     
       resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = norm_num_groups if norm_num_groups is not None else min(block_out_channels[0] // 4, 32)\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=num_groups_out,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, sample: paddle.Tensor, timestep: Union[paddle.Tensor, float, int]) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        skip_sample = sample\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n            if hasattr(downsample_block, \"skip_conv\"):\n                sample, res_samples, skip_sample = downsample_block(hidden_states=sample,\n                                                                    temb=emb,\n                                                                    skip_sample=skip_sample)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb)\n\n        # 5. up\n        skip_sample = None\n        for upsample_block in self.up_blocks:\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"skip_conv\"):\n                sample, skip_sample = upsample_block(sample, res_samples, emb, skip_sample)\n            else:\n                sample = upsample_block(sample, res_samples, emb)\n\n        # 6. 
post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        if skip_sample is not None:\n            sample += skip_sample\n\n        if self.config.time_embedding_type == \"fourier\":\n            timesteps = timesteps.reshape((sample.shape[0], *([1] * len(sample.shape[1:]))))\n            sample = sample / timesteps\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/unet_2d_condition.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2DCrossAttn\n\n\nclass UNet2DConditionModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=64,\n        in_channels=4,\n        out_channels=4,\n        center_input_sample=False,\n        flip_sin_to_cos=True,\n        freq_shift=0,\n        down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n        up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\"),\n        block_out_channels=(320, 640, 1280, 1280),\n        layers_per_block=2,\n        downsample_padding=1,\n        mid_block_scale_factor=1,\n        act_fn=\"silu\",\n        norm_num_groups=32,\n        norm_eps=1e-5,\n        cross_attention_dim=768,\n        attention_head_dim=8,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n        # input\n 
       self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n        timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2DCrossAttn(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attention_head_dim,\n            resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = 
reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=norm_num_groups,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(\n        self,\n        sample: paddle.Tensor,\n        timestep: Union[paddle.Tensor, float, int],\n        encoder_hidden_states: paddle.Tensor,\n    ) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n\n            if hasattr(downsample_block, \"attentions\") and downsample_block.attentions is not None:\n                sample, res_samples = downsample_block(hidden_states=sample,\n                                                       temb=emb,\n                                                       encoder_hidden_states=encoder_hidden_states)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb, encoder_hidden_states=encoder_hidden_states)\n\n        # 5. 
up\n        for upsample_block in self.up_blocks:\n\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"attentions\") and upsample_block.attentions is not None:\n                sample = upsample_block(\n                    hidden_states=sample,\n                    temb=emb,\n                    res_hidden_states_tuple=res_samples,\n                    encoder_hidden_states=encoder_hidden_states,\n                )\n            else:\n                sample = upsample_block(hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples)\n\n        # 6. post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/unet_blocks.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .attention import AttentionBlockNew\nfrom .attention import SpatialTransformer\nfrom .resnet import Downsample2D\nfrom .resnet import FirDownsample2D\nfrom .resnet import FirUpsample2D\nfrom .resnet import ResnetBlock\nfrom .resnet import Upsample2D\n\n\ndef get_down_block(\n    down_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    temb_channels,\n    add_downsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n    downsample_padding=None,\n):\n    down_block_type = down_block_type[7:] if down_block_type.startswith(\"UNetRes\") else down_block_type\n    if down_block_type == \"DownBlock2D\":\n        return DownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnDownBlock2D\":\n        return AttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n           
 add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"CrossAttnDownBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnDownBlock2D\")\n        return CrossAttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"SkipDownBlock2D\":\n        return SkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnSkipDownBlock2D\":\n        return AttnSkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"DownEncoderBlock2D\":\n        return DownEncoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n  
          out_channels=out_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    raise ValueError(f\"{down_block_type} does not exist.\")\n\n\ndef get_up_block(\n    up_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    prev_output_channel,\n    temb_channels,\n    add_upsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n):\n    up_block_type = up_block_type[7:] if up_block_type.startswith(\"UNetRes\") else up_block_type\n    if up_block_type == \"UpBlock2D\":\n        return UpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"CrossAttnUpBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnUpBlock2D\")\n        return CrossAttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"AttnUpBlock2D\":\n        return AttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n     
       resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"SkipUpBlock2D\":\n        return SkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"AttnSkipUpBlock2D\":\n        return AttnSkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"UpDecoderBlock2D\":\n        return UpDecoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    raise ValueError(f\"{up_block_type} does not exist.\")\n\n\nclass UNetMidBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        **kwargs,\n    ):\n        super().__init__()\n\n        
self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                AttentionBlockNew(\n                    in_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            if 
self.attention_type == \"default\":\n                hidden_states = attn(hidden_states)\n            else:\n                hidden_states = attn(hidden_states, encoder_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass UNetMidBlock2DCrossAttn(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        cross_attention_dim=1280,\n        **kwargs,\n    ):\n        super().__init__()\n\n        self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                SpatialTransformer(\n                    in_channels,\n                    attn_num_head_channels,\n                    in_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n            resnets.append(\n                
ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_hidden_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            hidden_states = attn(hidden_states, encoder_hidden_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass AttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    
temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass CrossAttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        
resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, 
hidden_states, temb=None, encoder_hidden_states=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                
Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        
self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnDownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    
pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            downsample_padding=1,\n            add_downsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        
self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            self.attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            
self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass SkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_downsample=True,\n            downsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    
time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass AttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        
out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attention_type=\"default\",\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, 
res_hidden_states_tuple, temb=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass CrossAttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        prev_output_channel: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n     
               time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, encoder_hidden_states=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 
32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None):\n        for resnet in self.resnets:\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: 
int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnUpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n    
    output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            
out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            upsample_padding=1,\n            add_upsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions.append(\n            AttentionBlockNew(\n                out_channels,\n                num_head_channels=attn_num_head_channels,\n                rescale_output_factor=output_scale_factor,\n                eps=resnet_eps,\n            ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n        
        in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          epsilon=resnet_eps)\n            self.act = nn.Silu()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        hidden_states = self.attentions[0](hidden_states)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n       
     skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample + skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n\n\nclass SkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_upsample=True,\n            upsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n                in_channels=out_channels,\n       
         out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          epsilon=resnet_eps)\n            self.act = nn.Silu()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n            skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample 
+ skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/models/vae.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass Encoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            down_block_types=(\"DownEncoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n            double_z=True,\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, stride=1, padding=1)\n\n        self.mid_block = None\n        self.down_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=self.layers_per_block,\n                in_channels=input_channel,\n                
out_channels=output_channel,\n                add_downsample=not is_final_block,\n                resnet_eps=1e-6,\n                downsample_padding=0,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[-1], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n\n        conv_out_channels = 2 * out_channels if double_z else out_channels\n        self.conv_out = nn.Conv2D(block_out_channels[-1], conv_out_channels, 3, padding=1)\n\n    def forward(self, x):\n        sample = x\n        sample = self.conv_in(sample)\n\n        # down\n        for down_block in self.down_blocks:\n            sample = down_block(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # post-process\n        sample = self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass Decoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            up_block_types=(\"UpDecoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[-1], kernel_size=3, stride=1, padding=1)\n\n 
       self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=self.layers_per_block + 1,\n                in_channels=prev_output_channel,\n                out_channels=output_channel,\n                prev_output_channel=None,\n                add_upsample=not is_final_block,\n                resnet_eps=1e-6,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, z):\n        sample = z\n        sample = self.conv_in(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # up\n        for up_block in self.up_blocks:\n            sample = up_block(sample)\n\n        # post-process\n        sample = 
self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass VectorQuantizer(nn.Layer):\n    \"\"\"\n    Improved version over VectorQuantizer, can be used as a drop-in replacement. Mostly avoids costly matrix\n    multiplications and allows for post-hoc remapping of indices.\n    \"\"\"\n\n    # NOTE: due to a bug the beta term was applied to the wrong term. for\n    # backwards compatibility we use the buggy version by default, but you can\n    # specify legacy=False to fix it.\n    def __init__(self, n_e, e_dim, beta, remap=None, unknown_index=\"random\", sane_index_shape=False, legacy=True):\n        super().__init__()\n        self.n_e = n_e\n        self.e_dim = e_dim\n        self.beta = beta\n        self.legacy = legacy\n\n        self.embedding = nn.Embedding(self.n_e, self.e_dim)\n        self.embedding.weight.set_value(paddle.uniform([self.n_e, self.e_dim], min=-1.0 / self.n_e, max=1.0 / self.n_e))\n\n        self.remap = remap\n        if self.remap is not None:\n            self.register_buffer(\"used\", paddle.to_tensor(np.load(self.remap)))\n            self.re_embed = self.used.shape[0]\n            self.unknown_index = unknown_index  # \"random\" or \"extra\" or integer\n            if self.unknown_index == \"extra\":\n                self.unknown_index = self.re_embed\n                self.re_embed = self.re_embed + 1\n            print(f\"Remapping {self.n_e} indices to {self.re_embed} indices. 
\"\n                  f\"Using {self.unknown_index} for unknown indices.\")\n        else:\n            self.re_embed = n_e\n\n        self.sane_index_shape = sane_index_shape\n\n    def remap_to_used(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        match = (inds[:, :, None] == used[None, None, ...]).astype(\"int64\")\n        new = match.argmax(-1)\n        unknown = match.sum(2) < 1\n        if self.unknown_index == \"random\":\n            new[unknown] = paddle.randint(0, self.re_embed, shape=new[unknown].shape)\n        else:\n            new[unknown] = self.unknown_index\n        return new.reshape(ishape)\n\n    def unmap_to_all(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        if self.re_embed > self.used.shape[0]:  # extra token\n            inds[inds >= self.used.shape[0]] = 0  # simply set to zero\n        back = paddle.gather(used[None, :][inds.shape[0] * [0], :], inds, axis=1)\n        return back.reshape(ishape)\n\n    def forward(self, z):\n        # reshape z -> (batch, height, width, channel) and flatten\n        z = z.transpose([0, 2, 3, 1])\n        z_flattened = z.reshape([-1, self.e_dim])\n        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z\n\n        d = (paddle.sum(z_flattened**2, axis=1, keepdim=True) + paddle.sum(self.embedding.weight**2, axis=1) -\n             2 * paddle.einsum(\"bd,dn->bn\", z_flattened, self.embedding.weight.t()))\n\n        min_encoding_indices = paddle.argmin(d, axis=1)\n        z_q = self.embedding(min_encoding_indices).reshape(z.shape)\n        perplexity = None\n        min_encodings = None\n\n        # compute loss for embedding\n        if not self.legacy:\n            loss = self.beta * paddle.mean((z_q.detach() - z)**2) + paddle.mean((z_q - z.detach())**2)\n        else:\n         
   loss = paddle.mean((z_q.detach() - z)**2) + self.beta * paddle.mean((z_q - z.detach())**2)\n\n        # preserve gradients\n        z_q = z + (z_q - z).detach()\n\n        # reshape back to match original input shape\n        z_q = z_q.transpose([0, 3, 1, 2])\n\n        if self.remap is not None:\n            min_encoding_indices = min_encoding_indices.reshape([z.shape[0], -1])  # add batch axis\n            min_encoding_indices = self.remap_to_used(min_encoding_indices)\n            min_encoding_indices = min_encoding_indices.reshape([-1, 1])  # flatten\n\n        if self.sane_index_shape:\n            min_encoding_indices = min_encoding_indices.reshape([z_q.shape[0], z_q.shape[2], z_q.shape[3]])\n\n        return z_q, loss, (perplexity, min_encodings, min_encoding_indices)\n\n    def get_codebook_entry(self, indices, shape):\n        # shape specifying (batch, height, width, channel)\n        if self.remap is not None:\n            indices = indices.reshape([shape[0], -1])  # add batch axis\n            indices = self.unmap_to_all(indices)\n            indices = indices.flatten()  # flatten again\n\n        # get quantized latent vectors\n        z_q = self.embedding(indices)\n\n        if shape is not None:\n            z_q = z_q.reshape(shape)\n            # reshape back to match original input shape\n            z_q = z_q.transpose([0, 3, 1, 2])\n\n        return z_q\n\n\nclass DiagonalGaussianDistribution(object):\n\n    def __init__(self, parameters, deterministic=False):\n        self.parameters = parameters\n        self.mean, self.logvar = paddle.chunk(parameters, 2, axis=1)\n        self.logvar = paddle.clip(self.logvar, -30.0, 20.0)\n        self.deterministic = deterministic\n        self.std = paddle.exp(0.5 * self.logvar)\n        self.var = paddle.exp(self.logvar)\n        if self.deterministic:\n            self.var = self.std = paddle.zeros_like(self.mean)\n\n    def sample(self):\n        x = self.mean + self.std * 
paddle.randn(self.mean.shape)\n        return x\n\n    def kl(self, other=None):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        else:\n            if other is None:\n                return 0.5 * paddle.sum(paddle.pow(self.mean, 2) + self.var - 1.0 - self.logvar, axis=[1, 2, 3])\n            else:\n                return 0.5 * paddle.sum(\n                    paddle.pow(self.mean - other.mean, 2) / other.var + self.var / other.var - 1.0 - self.logvar +\n                    other.logvar,\n                    axis=[1, 2, 3],\n                )\n\n    def nll(self, sample, dims=[1, 2, 3]):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        logtwopi = np.log(2.0 * np.pi)\n        return 0.5 * paddle.sum(logtwopi + self.logvar + paddle.pow(sample - self.mean, 2) / self.var, axis=dims)\n\n    def mode(self):\n        return self.mean\n\n\n# inherit nn.Layer (as AutoencoderKL below does) so sublayers are registered\nclass VQModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", ),\n        up_block_types=(\"UpDecoderBlock2D\", ),\n        block_out_channels=(64, ),\n        layers_per_block=1,\n        act_fn=\"silu\",\n        latent_channels=3,\n        sample_size=32,\n        num_vq_embeddings=256,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n            double_z=False,\n        )\n\n        self.quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n        self.quantize = VectorQuantizer(num_vq_embeddings,\n                                        latent_channels,\n                                        beta=0.25,\n            
                            remap=None,\n                                        sane_index_shape=False)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n    def encode(self, x):\n        h = self.encoder(x)\n        h = self.quant_conv(h)\n        return h\n\n    def decode(self, h, force_not_quantize=False):\n        # also go through quantization layer\n        if not force_not_quantize:\n            quant, emb_loss, info = self.quantize(h)\n        else:\n            quant = h\n        quant = self.post_quant_conv(quant)\n        dec = self.decoder(quant)\n        return dec\n\n    def forward(self, sample):\n        x = sample\n        h = self.encode(x)\n        dec = self.decode(h)\n        return dec\n\n\nclass AutoencoderKL(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\"),\n        up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\"),\n        block_out_channels=(128, 256, 512, 512),\n        layers_per_block=2,\n        act_fn=\"silu\",\n        latent_channels=4,\n        sample_size=512,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            
act_fn=act_fn,\n            double_z=True,\n        )\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n        self.quant_conv = nn.Conv2D(2 * latent_channels, 2 * latent_channels, 1)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n    def encode(self, x):\n        h = self.encoder(x)\n        moments = self.quant_conv(h)\n        posterior = DiagonalGaussianDistribution(moments)\n        return posterior\n\n    def decode(self, z):\n        z = self.post_quant_conv(z)\n        dec = self.decoder(z)\n        return dec\n\n    def forward(self, sample, sample_posterior=False):\n        x = sample\n        posterior = self.encode(x)\n        if sample_posterior:\n            z = posterior.sample()\n        else:\n            z = posterior.mode()\n        dec = self.decode(z)\n        return dec\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/README.md",
    "content": "# Schedulers\n\n- Schedulers are the algorithms used to run diffusion models in inference as well as in training. They include the noise schedules and define algorithm-specific diffusion steps.\n- Schedulers can be used interchangeably between diffusion models in inference to find the preferred trade-off between speed and generation quality.\n- Schedulers are available in numpy, but can easily be transformed into PyTorch.\n\n## API\n\n- Schedulers should provide one or more `def step(...)` functions that should be called iteratively to unroll the diffusion loop during\nthe forward pass.\n- Schedulers should be framework-agnostic, but provide simple functionality to convert the scheduler into a specific framework, such as PyTorch,\nwith a `set_format(...)` method.\n\n## Examples\n\n- The DDPM scheduler was proposed in [Denoising Diffusion Probabilistic Models](https://arxiv.org/abs/2006.11239) and can be found in [scheduling_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddpm.py). An example of how to use this scheduler can be found in [pipeline_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddpm.py).\n- The DDIM scheduler was proposed in [Denoising Diffusion Implicit Models](https://arxiv.org/abs/2010.02502) and can be found in [scheduling_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddim.py). An example of how to use this scheduler can be found in [pipeline_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddim.py).\n- The PNDM scheduler was proposed in [Pseudo Numerical Methods for Diffusion Models on Manifolds](https://arxiv.org/abs/2202.09778) and can be found in [scheduling_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_pndm.py). An example of how to use this scheduler can be found in [pipeline_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py).\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .scheduling_ddim import DDIMScheduler\nfrom .scheduling_ddpm import DDPMScheduler\nfrom .scheduling_karras_ve import KarrasVeScheduler\nfrom .scheduling_lms_discrete import LMSDiscreteScheduler\nfrom .scheduling_pndm import PNDMScheduler\nfrom .scheduling_sde_ve import ScoreSdeVeScheduler\nfrom .scheduling_sde_vp import ScoreSdeVpScheduler\nfrom .scheduling_utils import SchedulerMixin\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_ddim.py",
    "content": "# Copyright 2022 Stanford University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This code is strongly influenced by https://github.com/pesser/pytorch_diffusion\n# and https://github.com/hojonathanho/diffusion\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of (1-beta)\n                      up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDIMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        clip_sample=True,\n        set_alpha_to_one=True,\n        tensor_format=\"pd\",\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        # At every step in ddim, we are looking into the previous alphas_cumprod\n        # For the final step, there is no previous 
alphas_cumprod because we are already at 0\n        # `set_alpha_to_one` decides whether we set this parameter simply to one or\n        # whether we use the final alpha of the \"non-previous\" one.\n        self.final_alpha_cumprod = np.array(1.0) if set_alpha_to_one else self.alphas_cumprod[0]\n\n        # settable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def _get_variance(self, timestep, prev_timestep):\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        variance = (beta_prod_t_prev / beta_prod_t) * (1 - alpha_prod_t / alpha_prod_t_prev)\n\n        return variance\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.timesteps += offset\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        eta: float = 0.0,\n        use_clipped_model_output: bool = False,\n        generator=None,\n    ):\n        # See formulas (12) and (16) of DDIM paper https://arxiv.org/pdf/2010.02502.pdf\n        # Ideally, read the DDIM paper for an in-depth understanding\n\n        # Notation (<variable name> -> <name in paper>)\n        # - pred_noise_t -> e_theta(x_t, t)\n        # - pred_original_sample -> f_theta(x_t, t) or x_0\n        # - std_dev_t -> sigma_t\n        # - eta -> η\n        # - pred_sample_direction -> \"direction pointing to x_t\"\n        # - pred_prev_sample -> \"x_t-1\"\n\n        # 1. get previous step value (=t-1)\n        prev_timestep = timestep - self.config.num_train_timesteps // self.num_inference_steps\n\n        # 2. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n\n        # 3. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n\n        # 4. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 5. compute variance: \"sigma_t(η)\" -> see formula (16)\n        # σ_t = sqrt((1 − α_t−1)/(1 − α_t)) * sqrt(1 − α_t/α_t−1)\n        variance = self._get_variance(timestep, prev_timestep)\n        std_dev_t = eta * variance**(0.5)\n\n        if use_clipped_model_output:\n            # the model_output is always re-derived from the clipped x_0 in Glide\n            model_output = (sample - alpha_prod_t**(0.5) * pred_original_sample) / beta_prod_t**(0.5)\n\n        # 6. compute \"direction pointing to x_t\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_sample_direction = (1 - alpha_prod_t_prev - std_dev_t**2)**(0.5) * model_output\n\n        # 7. 
compute x_t without \"random noise\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        prev_sample = alpha_prod_t_prev**(0.5) * pred_original_sample + pred_sample_direction\n\n        if eta > 0:\n            noise = paddle.randn(model_output.shape)\n            variance = self._get_variance(timestep, prev_timestep)**(0.5) * eta * noise\n\n            if not paddle.is_tensor(model_output):\n                variance = variance.numpy()\n\n            prev_sample = prev_sample + variance\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_ddpm.py",
    "content": "# Copyright 2022 UC Berkeley Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce.\n    :param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of (1-beta)\n                      up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDPMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        variance_type=\"fixed_small\",\n        clip_sample=True,\n        tensor_format=\"pd\",\n    ):\n\n        if trained_betas is not None:\n            self.betas = np.asarray(trained_betas)\n        elif beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n        self.one = np.array(1.0)\n\n        # settable values\n        
self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n        self.variance_type = variance_type\n\n    def set_timesteps(self, num_inference_steps):\n        num_inference_steps = min(self.config.num_train_timesteps, num_inference_steps)\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.set_format(tensor_format=self.tensor_format)\n\n    def _get_variance(self, t, predicted_variance=None, variance_type=None):\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n\n        # For t > 0, compute predicted variance βt (see formula (6) and (7) from https://arxiv.org/pdf/2006.11239.pdf)\n        # and sample from it to get previous sample\n        # x_{t-1} ~ N(pred_prev_sample, variance) == add variance to pred_sample\n        variance = (1 - alpha_prod_t_prev) / (1 - alpha_prod_t) * self.betas[t]\n\n        if variance_type is None:\n            variance_type = self.config.variance_type\n\n        # hacks - were probs added for training stability\n        if variance_type == \"fixed_small\":\n            variance = self.clip(variance, min_value=1e-20)\n        # for rl-diffuser https://arxiv.org/abs/2205.09991\n        elif variance_type == \"fixed_small_log\":\n            variance = self.log(self.clip(variance, min_value=1e-20))\n        elif variance_type == \"fixed_large\":\n            variance = self.betas[t]\n        elif variance_type == \"fixed_large_log\":\n            # Glide max_log\n            variance = self.log(self.betas[t])\n        elif variance_type == \"learned\":\n            return predicted_variance\n        elif 
variance_type == \"learned_range\":\n            min_log = variance\n            max_log = self.betas[t]\n            frac = (predicted_variance + 1) / 2\n            variance = frac * max_log + (1 - frac) * min_log\n\n        return variance\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        predict_epsilon=True,\n        generator=None,\n    ):\n        t = timestep\n\n        if model_output.shape[1] == sample.shape[1] * 2 and self.variance_type in [\"learned\", \"learned_range\"]:\n            # paddle.split takes the number of sections (unlike torch.split's chunk size)\n            model_output, predicted_variance = paddle.split(model_output, 2, axis=1)\n        else:\n            predicted_variance = None\n\n        # 1. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # 2. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf\n        if predict_epsilon:\n            pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n        else:\n            pred_original_sample = model_output\n\n        # 3. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 4. Compute coefficients for pred_original_sample x_0 and current sample x_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_original_sample_coeff = (alpha_prod_t_prev**(0.5) * self.betas[t]) / beta_prod_t\n        current_sample_coeff = self.alphas[t]**(0.5) * beta_prod_t_prev / beta_prod_t\n\n        # 5. 
Compute predicted previous sample µ_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_prev_sample = pred_original_sample_coeff * pred_original_sample + current_sample_coeff * sample\n\n        # 6. Add noise\n        variance = 0\n        if t > 0:\n            noise = self.randn_like(model_output)\n            variance = (self._get_variance(t, predicted_variance=predicted_variance)**0.5) * noise\n\n        pred_prev_sample = pred_prev_sample + variance\n\n        return {\"prev_sample\": pred_prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_karras_ve.py",
    "content": "# Copyright 2022 NVIDIA and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass KarrasVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    Stochastic sampling from Karras et al. [1] tailored to the Variance-Expanding (VE) models [2]. Use Algorithm 2 and\n    the VE column of Table 1 from [1] for reference.\n\n    [1] Karras, Tero, et al. \"Elucidating the Design Space of Diffusion-Based Generative Models.\"\n    https://arxiv.org/abs/2206.00364 [2] Song, Yang, et al. \"Score-based generative modeling through stochastic\n    differential equations.\" https://arxiv.org/abs/2011.13456\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        sigma_min=0.02,\n        sigma_max=100,\n        s_noise=1.007,\n        s_churn=80,\n        s_min=0.05,\n        s_max=50,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        For more details on the parameters, see the original paper's Appendix E.: \"Elucidating the Design Space of\n        Diffusion-Based Generative Models.\" https://arxiv.org/abs/2206.00364. 
The grid search values used to find the\n        optimal {s_noise, s_churn, s_min, s_max} for a specific model are described in Table 5 of the paper.\n\n        Args:\n            sigma_min (`float`): minimum noise magnitude\n            sigma_max (`float`): maximum noise magnitude\n            s_noise (`float`): the amount of additional noise to counteract loss of detail during sampling.\n                A reasonable range is [1.000, 1.011].\n            s_churn (`float`): the parameter controlling the overall amount of stochasticity.\n                A reasonable range is [0, 100].\n            s_min (`float`): the start value of the sigma range where we add noise (enable stochasticity).\n                A reasonable range is [0, 10].\n            s_max (`float`): the end value of the sigma range where we add noise.\n                A reasonable range is [0.2, 80].\n        \"\"\"\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = None\n        self.schedule = None  # sigma(t_i)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.num_inference_steps)[::-1].copy()\n        self.schedule = [(self.sigma_max * (self.sigma_min**2 / self.sigma_max**2)**(i / (num_inference_steps - 1)))\n                         for i in self.timesteps]\n        self.schedule = np.array(self.schedule, dtype=np.float32)\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def add_noise_to_input(self, sample, sigma, generator=None):\n        \"\"\"\n        Explicit Langevin-like \"churn\" step of adding noise to the sample according to a factor gamma_i ≥ 0 to reach a\n        higher noise level sigma_hat = sigma_i + gamma_i*sigma_i.\n        \"\"\"\n        if self.s_min <= sigma <= self.s_max:\n            gamma = min(self.s_churn / 
self.num_inference_steps, 2**0.5 - 1)\n        else:\n            gamma = 0\n\n        # sample eps ~ N(0, S_noise^2 * I)\n        eps = self.s_noise * paddle.randn(sample.shape)\n        sigma_hat = sigma + gamma * sigma\n        sample_hat = sample + ((sigma_hat**2 - sigma**2)**0.5 * eps)\n\n        return sample_hat, sigma_hat\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_hat + sigma_hat * model_output\n        derivative = (sample_hat - pred_original_sample) / sigma_hat\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * derivative\n\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n        sample_prev: Union[paddle.Tensor, np.ndarray],\n        derivative: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_prev + sigma_prev * model_output\n        derivative_corr = (sample_prev - pred_original_sample) / sigma_prev\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * (0.5 * derivative + 0.5 * derivative_corr)\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative_corr}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        raise NotImplementedError()\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_lms_discrete.py",
    "content": "# Copyright 2022 Katherine Crowson and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom scipy import integrate\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass LMSDiscreteScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        Linear Multistep Scheduler for discrete beta schedules. 
Based on the original k-diffusion implementation by\n        Katherine Crowson:\n        https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181\n        \"\"\"\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        else:\n            raise NotImplementedError(f\"{beta_schedule} does is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.sigmas = ((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self.derivatives = []\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def get_lms_coefficient(self, order, t, current_order):\n        \"\"\"\n        Compute a linear multistep coefficient\n        \"\"\"\n\n        def lms_derivative(tau):\n            prod = 1.0\n            for k in range(order):\n                if current_order == k:\n                    continue\n                prod *= (tau - self.sigmas[t - k]) / (self.sigmas[t - current_order] - self.sigmas[t - k])\n            return prod\n\n        integrated_coeff = integrate.quad(lms_derivative, self.sigmas[t], self.sigmas[t + 1], epsrel=1e-4)[0]\n\n        return integrated_coeff\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.linspace(self.num_train_timesteps - 1, 0, num_inference_steps, dtype=float)\n\n        
low_idx = np.floor(self.timesteps).astype(int)\n        high_idx = np.ceil(self.timesteps).astype(int)\n        frac = np.mod(self.timesteps, 1.0)\n        sigmas = np.array(((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5)\n        sigmas = (1 - frac) * sigmas[low_idx] + frac * sigmas[high_idx]\n        self.sigmas = np.concatenate([sigmas, [0.0]])\n\n        self.derivatives = []\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        order: int = 4,\n    ):\n        sigma = self.sigmas[timestep]\n\n        # 1. compute predicted original sample (x_0) from sigma-scaled predicted noise\n        pred_original_sample = sample - sigma * model_output\n\n        # 2. Convert to an ODE derivative\n        derivative = (sample - pred_original_sample) / sigma\n        self.derivatives.append(derivative)\n        if len(self.derivatives) > order:\n            self.derivatives.pop(0)\n\n        # 3. Compute linear multistep coefficients\n        order = min(timestep + 1, order)\n        lms_coeffs = [self.get_lms_coefficient(order, timestep, curr_order) for curr_order in range(order)]\n\n        # 4. Compute previous sample based on the derivatives path\n        prev_sample = sample + sum(coeff * derivative\n                                   for coeff, derivative in zip(lms_coeffs, reversed(self.derivatives)))\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        alpha_prod = self.alphas_cumprod[timesteps]\n        alpha_prod = self.match_shape(alpha_prod, original_samples)\n\n        noisy_samples = (alpha_prod**0.5) * original_samples + ((1 - alpha_prod)**0.5) * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_pndm.py",
    "content": "# Copyright 2022 Zhejiang University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n\n    Note: alpha_bar is not a parameter here; it is defined below as the squared-cosine function of t in [0, 1]\n    whose value is the cumulative product of (1-beta) up to that part of the diffusion process.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass PNDMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        tensor_format=\"pd\",\n        skip_prk_steps=False,\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.one = np.array(1.0)\n\n        # For now we only support F-PNDM, i.e. 
the runge-kutta method\n        # For more information on the algorithm please take a look at the paper: https://arxiv.org/pdf/2202.09778.pdf\n        # mainly at formula (9), (12), (13) and the Algorithm 2.\n        self.pndm_order = 4\n\n        # running values\n        self.cur_model_output = 0\n        self.counter = 0\n        self.cur_sample = None\n        self.ets = []\n\n        # setable values\n        self.num_inference_steps = None\n        self._timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self._offset = 0\n        self.prk_timesteps = None\n        self.plms_timesteps = None\n        self.timesteps = None\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self._timesteps = list(\n            range(0, self.config.num_train_timesteps, self.config.num_train_timesteps // num_inference_steps))\n        self._offset = offset\n        self._timesteps = [t + self._offset for t in self._timesteps]\n\n        if self.config.skip_prk_steps:\n            # for some models like stable diffusion the prk steps can/should be skipped to\n            # produce better results. 
When using PNDM with `self.config.skip_prk_steps` the implementation\n            # is based on crowsonkb's PLMS sampler implementation: https://github.com/CompVis/latent-diffusion/pull/51\n            self.prk_timesteps = []\n            self.plms_timesteps = list(reversed(self._timesteps[:-1] + self._timesteps[-2:-1] + self._timesteps[-1:]))\n        else:\n            prk_timesteps = np.array(self._timesteps[-self.pndm_order:]).repeat(2) + np.tile(\n                np.array([0, self.config.num_train_timesteps // num_inference_steps // 2]), self.pndm_order)\n            self.prk_timesteps = list(reversed(prk_timesteps[:-1].repeat(2)[1:-1]))\n            self.plms_timesteps = list(reversed(self._timesteps[:-3]))\n\n        self.timesteps = self.prk_timesteps + self.plms_timesteps\n\n        self.ets = []\n        self.counter = 0\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        if self.counter < len(self.prk_timesteps) and not self.config.skip_prk_steps:\n            return self.step_prk(model_output=model_output, timestep=timestep, sample=sample)\n        else:\n            return self.step_plms(model_output=model_output, timestep=timestep, sample=sample)\n\n    def step_prk(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the Runge-Kutta method. 
RK takes 4 forward passes to approximate the\n        solution to the differential equation.\n        \"\"\"\n        diff_to_prev = 0 if self.counter % 2 else self.config.num_train_timesteps // self.num_inference_steps // 2\n        prev_timestep = max(timestep - diff_to_prev, self.prk_timesteps[-1])\n        timestep = self.prk_timesteps[self.counter // 4 * 4]\n\n        if self.counter % 4 == 0:\n            self.cur_model_output += 1 / 6 * model_output\n            self.ets.append(model_output)\n            self.cur_sample = sample\n        elif (self.counter - 1) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 2) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 3) % 4 == 0:\n            model_output = self.cur_model_output + 1 / 6 * model_output\n            self.cur_model_output = 0\n\n        # cur_sample should not be `None`\n        cur_sample = self.cur_sample if self.cur_sample is not None else sample\n\n        prev_sample = self._get_prev_sample(cur_sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def step_plms(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the linear multi-step method. 
This has one forward pass with multiple\n        times to approximate the solution.\n        \"\"\"\n        if not self.config.skip_prk_steps and len(self.ets) < 3:\n            raise ValueError(\n                f\"{self.__class__} can only be run AFTER scheduler has been run \"\n                \"in 'prk' mode for at least 12 iterations \"\n                \"See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py \"\n                \"for more information.\")\n\n        prev_timestep = max(timestep - self.config.num_train_timesteps // self.num_inference_steps, 0)\n\n        if self.counter != 1:\n            self.ets.append(model_output)\n        else:\n            prev_timestep = timestep\n            timestep = timestep + self.config.num_train_timesteps // self.num_inference_steps\n\n        if len(self.ets) == 1 and self.counter == 0:\n            model_output = model_output\n            self.cur_sample = sample\n        elif len(self.ets) == 1 and self.counter == 1:\n            model_output = (model_output + self.ets[-1]) / 2\n            sample = self.cur_sample\n            self.cur_sample = None\n        elif len(self.ets) == 2:\n            model_output = (3 * self.ets[-1] - self.ets[-2]) / 2\n        elif len(self.ets) == 3:\n            model_output = (23 * self.ets[-1] - 16 * self.ets[-2] + 5 * self.ets[-3]) / 12\n        else:\n            model_output = (1 / 24) * (55 * self.ets[-1] - 59 * self.ets[-2] + 37 * self.ets[-3] - 9 * self.ets[-4])\n\n        prev_sample = self._get_prev_sample(sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def _get_prev_sample(self, sample, timestep, timestep_prev, model_output):\n        # See formula (9) of PNDM paper https://arxiv.org/pdf/2202.09778.pdf\n        # this function computes x_(t−δ) using the formula of (9)\n        # Note that x_t needs to be added to both sides of the equation\n\n   
     # Notation (<variable name> -> <name in paper>\n        # alpha_prod_t -> α_t\n        # alpha_prod_t_prev -> α_(t−δ)\n        # beta_prod_t -> (1 - α_t)\n        # beta_prod_t_prev -> (1 - α_(t−δ))\n        # sample -> x_t\n        # model_output -> e_θ(x_t, t)\n        # prev_sample -> x_(t−δ)\n        alpha_prod_t = self.alphas_cumprod[timestep + 1 - self._offset]\n        alpha_prod_t_prev = self.alphas_cumprod[timestep_prev + 1 - self._offset]\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # corresponds to (α_(t−δ) - α_t) divided by\n        # denominator of x_t in formula (9) and plus 1\n        # Note: (α_(t−δ) - α_t) / (sqrt(α_t) * (sqrt(α_(t−δ)) + sqr(α_t))) =\n        # sqrt(α_(t−δ)) / sqrt(α_t))\n        sample_coeff = (alpha_prod_t_prev / alpha_prod_t)**(0.5)\n\n        # corresponds to denominator of e_θ(x_t, t) in formula (9)\n        model_output_denom_coeff = alpha_prod_t * beta_prod_t_prev**(0.5) + (alpha_prod_t * beta_prod_t *\n                                                                             alpha_prod_t_prev)**(0.5)\n\n        # full formula (9)\n        prev_sample = (sample_coeff * sample -\n                       (alpha_prod_t_prev - alpha_prod_t) * model_output / model_output_denom_coeff)\n\n        return prev_sample\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_sde_ve.py",
"content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean up a bit\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    The variance exploding stochastic differential equation (SDE) scheduler.\n\n    :param snr: coefficient weighting the step from the model_output sample (from the network) to the random noise.\n    :param sigma_min: initial noise scale for sigma sequence in sampling procedure. The minimum sigma should mirror the\n            distribution of the data.\n    :param sigma_max: maximum noise scale of the sigma sequence.\n    :param sampling_eps: the end value of sampling, where timesteps decrease progressively from 1 to epsilon.\n    :param correct_steps: number of correction steps performed on a produced sample.\n
:param tensor_format:\n    \"np\" or \"pd\" for the expected format of samples passed to the Scheduler.\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=2000,\n        snr=0.15,\n        sigma_min=0.01,\n        sigma_max=1348,\n        sampling_eps=1e-5,\n        correct_steps=1,\n        tensor_format=\"pd\",\n    ):\n        # self.sigmas = None\n        # self.discrete_sigmas = None\n        #\n        # # setable values\n        # self.num_inference_steps = None\n        self.timesteps = None\n\n        self.set_sigmas(num_train_timesteps, sigma_min, sigma_max, sampling_eps)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, sampling_eps=None):\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.timesteps = np.linspace(1, sampling_eps, num_inference_steps)\n        elif tensor_format == \"pd\":\n            self.timesteps = paddle.linspace(1, sampling_eps, num_inference_steps)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_sigmas(self, num_inference_steps, sigma_min=None, sigma_max=None, sampling_eps=None):\n        sigma_min = sigma_min if sigma_min is not None else self.config.sigma_min\n        sigma_max = sigma_max if sigma_max is not None else self.config.sigma_max\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        if self.timesteps is None:\n            self.set_timesteps(num_inference_steps, sampling_eps)\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.discrete_sigmas = np.exp(np.linspace(np.log(sigma_min), np.log(sigma_max), 
num_inference_steps))\n            self.sigmas = np.array([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        elif tensor_format == \"pd\":\n            self.discrete_sigmas = paddle.exp(paddle.linspace(np.log(sigma_min), np.log(sigma_max),\n                                                              num_inference_steps))\n            self.sigmas = paddle.to_tensor([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def get_adjacent_sigma(self, timesteps, t):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.where(timesteps == 0, np.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n        elif tensor_format == \"pd\":\n            return paddle.where(timesteps == 0, paddle.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_seed(self, seed):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            np.random.seed(seed)\n        elif tensor_format == \"pd\":\n            paddle.seed(seed)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def step_pred(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Predict the sample at the previous timestep by reversing the SDE.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n        # TODO(Patrick) non-Pypaddle\n\n        timestep = timestep * paddle.ones(sample.shape[0])  # paddle.repeat_interleave(timestep, sample.shape[0])\n        timesteps = (timestep * (len(self.timesteps) - 
1)).astype(\"int64\")\n\n        sigma = self.discrete_sigmas[timesteps]\n        adjacent_sigma = self.get_adjacent_sigma(timesteps, timestep)\n        drift = self.zeros_like(sample)\n        diffusion = (sigma**2 - adjacent_sigma**2)**0.5\n\n        # equation 6 in the paper: the model_output modeled by the network is grad_x log pt(x)\n        # also equation 47 shows the analog from SDE models to ancestral sampling methods\n        drift = drift - diffusion[:, None, None, None]**2 * model_output\n\n        #  equation 6: sample noise for the diffusion term of\n        noise = self.randn_like(sample)\n        prev_sample_mean = sample - drift  # subtract because `dt` is a small negative timestep\n        # TODO is the variable diffusion the correct scaling term for the noise?\n        prev_sample = prev_sample_mean + diffusion[:, None, None, None] * noise  # add impact of diffusion field g\n\n        return {\"prev_sample\": prev_sample, \"prev_sample_mean\": prev_sample_mean}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Correct the predicted sample based on the output model_output of the network. This is often run repeatedly\n        after making the prediction for the previous timestep.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n\n        # For small batch sizes, the paper \"suggest replacing norm(z) with sqrt(d), where d is the dim. 
of z\"\n        # sample noise for correction\n        noise = self.randn_like(sample)\n\n        # compute step size from the model_output, the noise, and the snr\n        grad_norm = self.norm(model_output)\n        noise_norm = self.norm(noise)\n        step_size = (self.config.snr * noise_norm / grad_norm)**2 * 2\n        step_size = step_size * paddle.ones(sample.shape[0])\n        # self.repeat_scalar(step_size, sample.shape[0])\n\n        # compute corrected sample: model_output term and noise term\n        prev_sample_mean = sample + step_size[:, None, None, None] * model_output\n        prev_sample = prev_sample_mean + ((step_size * 2)**0.5)[:, None, None, None] * noise\n\n        return {\"prev_sample\": prev_sample}\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_sde_vp.py",
"content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean up a bit\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVpScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(self, num_train_timesteps=2000, beta_min=0.1, beta_max=20, sampling_eps=1e-3, tensor_format=\"np\"):\n\n        self.sigmas = None\n        self.discrete_sigmas = None\n        self.timesteps = None\n\n    def set_timesteps(self, num_inference_steps):\n        self.timesteps = paddle.linspace(1, self.config.sampling_eps, num_inference_steps)\n\n    def step_pred(self, score, x, t):\n        # TODO(Patrick) better comments + non-PyTorch\n        # postprocess model score\n        log_mean_coeff = (-0.25 * t**2 * (self.config.beta_max - self.config.beta_min) - 0.5 * t * self.config.beta_min)\n        std = paddle.sqrt(1.0 - paddle.exp(2.0 * log_mean_coeff))\n        score = -score / std[:, None, None, None]\n\n        # compute\n        dt = -1.0 / len(self.timesteps)\n\n        beta_t = self.config.beta_min + t * (self.config.beta_max - 
self.config.beta_min)\n        drift = -0.5 * beta_t[:, None, None, None] * x\n        diffusion = paddle.sqrt(beta_t)\n        drift = drift - diffusion[:, None, None, None]**2 * score\n        x_mean = x + drift * dt\n\n        # add noise\n        noise = self.randn_like(x)\n        x = x_mean + diffusion[:, None, None, None] * np.sqrt(-dt) * noise\n\n        return x, x_mean\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/diffusers/schedulers/scheduling_utils.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nSCHEDULER_CONFIG_NAME = \"scheduler_config.json\"\n\n\nclass SchedulerMixin:\n\n    config_name = SCHEDULER_CONFIG_NAME\n    ignore_for_config = [\"tensor_format\"]\n\n    def set_format(self, tensor_format=\"pd\"):\n        self.tensor_format = tensor_format\n        if tensor_format == \"pd\":\n            for key, value in vars(self).items():\n                if isinstance(value, np.ndarray):\n                    setattr(self, key, paddle.to_tensor(value))\n\n        return self\n\n    def clip(self, tensor, min_value=None, max_value=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.clip(tensor, min_value, max_value)\n        elif tensor_format == \"pd\":\n            return paddle.clip(tensor, min_value, max_value)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def log(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.log(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.log(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def match_shape(self, values: 
Union[np.ndarray, paddle.Tensor], broadcast_array: Union[np.ndarray, paddle.Tensor]):\n        \"\"\"\n        Turns a 1-D array into an array or tensor with len(broadcast_array.shape) dims.\n\n        Args:\n            values: an array or tensor of values to extract.\n            broadcast_array: an array with a larger shape of K dimensions with the batch\n                dimension equal to the length of timesteps.\n        Returns:\n            a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n        \"\"\"\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        values = values.flatten()\n\n        while len(values.shape) < len(broadcast_array.shape):\n            values = values[..., None]\n\n        return values\n\n    def norm(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.linalg.norm(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.norm(tensor.reshape([tensor.shape[0], -1]), axis=-1).mean()\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def randn_like(self, tensor, generator=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            # np.random.randn expects the dimensions as separate arguments, so unpack the shape tuple\n            return np.random.randn(*np.shape(tensor))\n        elif tensor_format == \"pd\":\n            # return paddle.randn_like(tensor)\n            return paddle.randn(tensor.shape)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def zeros_like(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.zeros_like(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.zeros_like(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport base64\nimport inspect\nimport os\nimport random\nimport sys\nfrom functools import partial\nfrom io import BytesIO\nfrom typing import List\nfrom typing import Optional\n\nimport numpy as np\nimport paddle\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom PIL import Image\nfrom stable_diffusion_inpainting.clip.clip.utils import build_model\nfrom stable_diffusion_inpainting.clip.clip.utils import tokenize\nfrom stable_diffusion_inpainting.diffusers import AutoencoderKL\nfrom stable_diffusion_inpainting.diffusers import DDIMScheduler\nfrom stable_diffusion_inpainting.diffusers import LMSDiscreteScheduler\nfrom stable_diffusion_inpainting.diffusers import PNDMScheduler\nfrom stable_diffusion_inpainting.diffusers import UNet2DConditionModel\nfrom stable_diffusion_inpainting.utils import preprocess\nfrom stable_diffusion_inpainting.utils import preprocess_mask\nfrom tqdm.auto import tqdm\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"stable_diffusion_inpainting\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            
author_email=\"paddle-dev@baidu.com\")\nclass StableDiffusionInpainting:\n\n    def __init__(self):\n        self.vae = AutoencoderKL(in_channels=3,\n                                 out_channels=3,\n                                 down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\",\n                                                   \"DownEncoderBlock2D\"),\n                                 up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\",\n                                                 \"UpDecoderBlock2D\"),\n                                 block_out_channels=(128, 256, 512, 512),\n                                 layers_per_block=2,\n                                 act_fn=\"silu\",\n                                 latent_channels=4,\n                                 sample_size=512)\n\n        self.unet = UNet2DConditionModel(sample_size=64,\n                                         in_channels=4,\n                                         out_channels=4,\n                                         center_input_sample=False,\n                                         flip_sin_to_cos=True,\n                                         freq_shift=0,\n                                         down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\",\n                                                           \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n                                         up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\",\n                                                         \"CrossAttnUpBlock2D\"),\n                                         block_out_channels=(320, 640, 1280, 1280),\n                                         layers_per_block=2,\n                                         downsample_padding=1,\n                                         mid_block_scale_factor=1,\n                                         act_fn=\"silu\",\n    
                                     norm_num_groups=32,\n                                         norm_eps=1e-5,\n                                         cross_attention_dim=768,\n                                         attention_head_dim=8)\n\n        vae_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-vae.pdparams')\n        unet_path = os.path.join(self.directory, 'pre_trained', 'stable-diffusion-v1-4-unet.pdparams')\n        self.unet.set_dict(paddle.load(unet_path))\n        self.vae.set_dict(paddle.load(vae_path))\n        for parameter in self.unet.parameters():\n            parameter.stop_gradient = True\n        self.vae.eval()\n        for parameter in self.vae.parameters():\n            parameter.stop_gradient = True\n        self.unet.eval()\n\n        self.text_encoder = build_model()\n        for parameter in self.text_encoder.parameters():\n            parameter.stop_gradient = True\n        self.scheduler = PNDMScheduler(beta_start=0.00085,\n                                       beta_end=0.012,\n                                       beta_schedule=\"scaled_linear\",\n                                       num_train_timesteps=1000,\n                                       skip_prk_steps=True)\n\n    def generate_image(self,\n                       text_prompts,\n                       init_image,\n                       mask_image,\n                       strength: float = 0.8,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       batch_size: Optional[int] = 1,\n                       num_inference_steps=50,\n                       guidance_scale=7.5,\n                       enable_fp16=False,\n                       seed=None,\n                       eta=0.0,\n                       display_rate=5,\n                       use_gpu=True,\n                       output_dir: Optional[str] = 'stable_diffusion_inpainting_out'):\n        \"\"\"\n       
 Create Stable Diffusion inpainting artworks and save the results into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like. The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe.\n        :param init_image: Initial image.\n        :param mask_image: Mask image; white pixels are repainted, black pixels are kept.\n        :param strength: Controls the noise strength added to the initial image; the value lies in the interval [0.0, 1.0]. The closer to 1, the bigger the change to the initial image.\n        :param style: Image style, such as oil paintings; if specified, it will be used to construct the prompts.\n        :param artist: Artist style; if specified, it will be used to construct the prompts.\n        :param batch_size: The number of images to create for each prompt.\n        :param num_inference_steps: The number of inference steps.\n        :param guidance_scale: Increases the adherence to the conditional signal (here, the text) as well as the overall sample quality.\n        :param enable_fp16: Whether to use float16.\n        :param seed: Random seed used to initialize the diffusion; random by default.\n        :param eta: Corresponds to η in the DDIM paper (https://arxiv.org/abs/2010.02502); only used by the DDIMScheduler.\n        :param display_rate: Display and save the intermediate result every `display_rate` steps.\n        :param use_gpu: Whether to use the GPU.\n        :param output_dir: Output directory.\n        :return: a DocumentArray object with one Document per prompt\n        \"\"\"\n        if seed:\n            np.random.seed(seed)\n            random.seed(seed)\n            paddle.seed(seed)\n\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n            text_prompts = [text_prompts]\n        elif isinstance(text_prompts, list):\n            for i, prompt in enumerate(\n                    text_prompts):  # different from dd here, dd can have multiple prompts for one image with weight.\n                text_prompts[i] = prompt.rstrip(',.，。')\n                if style is not None:\n                    text_prompts[i] += \",{}\".format(style)\n                if artist is not None:\n                    text_prompts[i] += \",{},trending on artstation\".format(artist)\n\n        if isinstance(init_image, str):\n            init_image = preprocess(Image.open(init_image))\n        else:\n            init_image = preprocess(init_image)\n\n        if isinstance(mask_image, str):\n            mask_image = preprocess_mask(Image.open(mask_image))\n        else:\n            mask_image = preprocess_mask(mask_image)\n\n        # set timesteps\n        accepts_offset = \"offset\" in set(inspect.signature(self.scheduler.set_timesteps).parameters.keys())\n        extra_set_kwargs = {}\n        offset = 0\n        if accepts_offset:\n            offset = 1\n            extra_set_kwargs[\"offset\"] = 1\n\n        self.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)\n\n        # encode the init image into latents and scale the latents\n        init_latents = self.vae.encode(init_image).sample()\n        init_latents = 
0.18215 * init_latents\n\n        # prepare init_latents noise to latents\n        init_latents = paddle.concat([init_latents] * batch_size)\n        init_latents_orig = init_latents\n\n        mask = paddle.concat([mask_image] * batch_size)\n\n        # check sizes\n        if mask.shape != init_latents.shape:\n            raise ValueError(\"The mask and init_image should be the same size!\")\n\n        # get the original timestep using init_timestep\n        init_timestep = int(num_inference_steps * strength) + offset\n        init_timestep = min(init_timestep, num_inference_steps)\n        if isinstance(self.scheduler, LMSDiscreteScheduler):\n            timesteps = paddle.to_tensor([num_inference_steps - init_timestep] * batch_size, dtype=\"int64\")\n        else:\n            timesteps = self.scheduler.timesteps[-init_timestep]\n            timesteps = paddle.to_tensor([timesteps] * batch_size, dtype=\"int64\")\n\n        # add noise to latents using the timesteps\n        noise = paddle.randn(init_latents.shape)\n        init_latents = self.scheduler.add_noise(init_latents, noise, timesteps)\n\n        # here `guidance_scale` is defined analogously to the guidance weight `w` of equation (2)\n        # of the Imagen paper: https://arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1`\n        # corresponds to doing no classifier free guidance.\n        do_classifier_free_guidance = guidance_scale > 1.0\n\n        # prepare extra kwargs for the scheduler step, since not all schedulers have the same signature\n        # eta (η) is only used with the DDIMScheduler, it will be ignored for other schedulers.\n        # eta corresponds to η in DDIM paper: https://arxiv.org/abs/2010.02502\n        # and should be between [0, 1]\n        accepts_eta = \"eta\" in set(inspect.signature(self.scheduler.step).parameters.keys())\n        extra_step_kwargs = {}\n        if accepts_eta:\n            extra_step_kwargs[\"eta\"] = eta\n\n        da_batches = DocumentArray()\n\n        for prompt in text_prompts:\n            d = Document(tags={'prompt': prompt})\n            da_batches.append(d)\n            for i in range(batch_size):\n                d.chunks.append(Document(tags={'prompt': prompt, 'image idx': i}))\n            d.chunks.append(Document(tags={'prompt': prompt, 'image idx': 'merged'}))\n            with paddle.amp.auto_cast(enable=enable_fp16, level='O2'):\n                prompts = [prompt] * batch_size\n                text_input = tokenize(prompts)\n                text_embeddings = self.text_encoder(text_input)\n                if do_classifier_free_guidance:\n                    uncond_input = tokenize([\"\"] * batch_size)\n                    uncond_embeddings = self.text_encoder(uncond_input)\n                    text_embeddings = paddle.concat([uncond_embeddings, text_embeddings])\n\n                latents = init_latents\n\n                t_start = max(num_inference_steps - init_timestep + offset, 0)\n                for i, t in tqdm(enumerate(self.scheduler.timesteps[t_start:])):\n                    t_index = t_start + i\n                    # expand the latents if we are doing classifier-free guidance to avoid doing two forward passes.\n                    latent_model_input = (paddle.concat([latents] * 2) if 
do_classifier_free_guidance else latents)\n\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        sigma = self.scheduler.sigmas[t_index]\n                        latent_model_input = latent_model_input / ((sigma**2 + 1)**0.5)\n\n                    # predict the noise residual\n                    noise_pred = self.unet(latent_model_input, t, encoder_hidden_states=text_embeddings)[\"sample\"]\n\n                    # perform guidance\n                    if do_classifier_free_guidance:\n                        noise_pred_uncond, noise_pred_text = noise_pred.chunk(2)\n                        noise_pred = noise_pred_uncond + guidance_scale * (noise_pred_text - noise_pred_uncond)\n\n                    # compute the previous noisy sample x_t -> x_t-1\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        latents = self.scheduler.step(noise_pred, t_index, latents, **extra_step_kwargs)[\"prev_sample\"]\n                        # masking\n                        init_latents_proper = self.scheduler.add_noise(init_latents_orig, noise, t_index)\n                    else:\n                        latents = self.scheduler.step(noise_pred, t, latents, **extra_step_kwargs)[\"prev_sample\"]\n                        # masking\n                        init_latents_proper = self.scheduler.add_noise(init_latents_orig, noise, t)\n                    latents = (init_latents_proper * mask) + (latents * (1 - mask))\n                    if i % display_rate == 0:\n                        # vae decode\n                        images = self.vae.decode(1 / 0.18215 * latents)\n                        images = (images / 2 + 0.5).clip(0, 1)\n                        merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                        merge_image = (merge_image * 255).round().astype(np.uint8)\n                        merge_image = Image.fromarray(merge_image)\n                    
    merge_image.save(os.path.join(output_dir, f'{prompt}-progress.png'))\n                        c = Document(tags={'step': i, 'prompt': prompt})\n                        c.load_pil_image_to_datauri(merge_image)\n                        d.chunks[-1].chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(merge_image)\n                        images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                        images = (images * 255).round().astype(np.uint8)\n                        for j in range(images.shape[0]):\n                            image = Image.fromarray(images[j])\n                            c = Document(tags={'step': i, 'prompt': prompt})\n                            c.load_pil_image_to_datauri(image)\n                            d.chunks[j].chunks.append(c)\n\n                # vae decode\n                images = self.vae.decode(1 / 0.18215 * latents)\n                images = (images / 2 + 0.5).clip(0, 1)\n                merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                merge_image = (merge_image * 255).round().astype(np.uint8)\n                merge_image = Image.fromarray(merge_image)\n                merge_image.save(os.path.join(output_dir, f'{prompt}-merge.png'))\n                display.clear_output(wait=True)\n                display.display(merge_image)\n                d.load_pil_image_to_datauri(merge_image)\n                d.chunks[-1].load_pil_image_to_datauri(merge_image)\n                images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                images = (images * 255).round().astype(np.uint8)\n                for j in range(images.shape[0]):\n                    image = Image.fromarray(images[j])\n                    image.save(os.path.join(output_dir, f'{prompt}-image-{j}.png'))\n                    d.chunks[j].load_pil_image_to_datauri(image)\n        return da_batches\n\n    @serving\n    def 
serving_method(self, text_prompts, init_image, mask_image, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        init_image = Image.open(BytesIO(base64.b64decode(init_image)))\n        mask_image = Image.open(BytesIO(base64.b64decode(mask_image)))\n        results = self.generate_image(text_prompts=text_prompts, init_image=init_image, mask_image=mask_image,\n                                      **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      init_image=args.init_image,\n                                      mask_image=args.mask_image,\n                                      strength=args.strength,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      batch_size=args.batch_size,\n                                      num_inference_steps=args.num_inference_steps,\n                                      guidance_scale=args.guidance_scale,\n                                      enable_fp16=args.enable_fp16,\n                                      seed=args.seed,\n                                      display_rate=args.display_rate,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_input_group.add_argument('--num_inference_steps',\n                                          type=int,\n                                          default=50,\n                                          help=\"The number of inference steps.\")\n\n        self.arg_input_group.add_argument(\n            '--guidance_scale',\n            type=float,\n            default=7.5,\n            help=\n            \"Increase the adherence to the conditional signal which in this case is text as well as overall sample quality.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining 
the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\"During a diffusion run, display and save the intermediate result every display_rate steps.\")\n\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether to use GPU or not\")\n\n        self.arg_config_group.add_argument('--enable_fp16',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use float16 or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='stable_diffusion_inpainting_out',\n                                           help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. 
\"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\" Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using this module, to get a feel for how text gets translated into images by such tools.  These other apps use different technologies, but many of the same principles apply.'\n        )\n\n        self.arg_input_group.add_argument('--init_image', type=str, help='Path of the initial image.')\n\n        self.arg_input_group.add_argument('--mask_image', type=str, help='Path of the mask image; white pixels are repainted, black pixels are kept.')\n\n        self.arg_input_group.add_argument(\n            '--strength',\n            type=float,\n            default=0.8,\n            help=\n            'Controls the noise strength added to the initial image; the value lies in the interval [0.0, 1.0]. The closer to 1, the bigger the change to the initial image.'\n        )\n\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings; if specified, it will be used to construct the prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style; if specified, it will be used to construct the prompts.')\n\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"The number of images to create for each prompt.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/requirements.txt",
    "content": "numpy\nftfy\nregex\ndocarray>=0.13.29\npyyaml\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_inpainting/utils.py",
    "content": "import numpy as np\nimport paddle\nimport PIL\nfrom PIL import Image\n\n\ndef preprocess(image):\n    if isinstance(image, np.ndarray):\n        image = Image.fromarray(image)\n    w, h = image.size\n    w, h = map(lambda x: x - x % 32, (w, h))  # round down to an integer multiple of 32\n    image = image.resize((w, h), resample=PIL.Image.LANCZOS)\n    image = np.array(image).astype(np.float32) / 255.0\n    image = image[None].transpose(0, 3, 1, 2)  # HWC -> NCHW with a batch dimension\n    image = paddle.to_tensor(image)\n    return 2.0 * image - 1.0\n\n\ndef preprocess_mask(mask):\n    if isinstance(mask, np.ndarray):\n        mask = Image.fromarray(mask)\n    mask = mask.convert(\"L\")\n    w, h = mask.size\n    w, h = map(lambda x: x - x % 32, (w, h))  # round down to an integer multiple of 32\n    mask = mask.resize((w // 8, h // 8), resample=PIL.Image.NEAREST)  # match the latent resolution\n    mask = np.array(mask).astype(np.float32) / 255.0\n    mask = np.tile(mask, (4, 1, 1))  # broadcast the mask to the 4 latent channels\n    mask = mask[None]  # add the batch dimension\n    mask = 1 - mask  # repaint white, keep black\n    mask = paddle.to_tensor(mask)\n    return mask\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/LICENSE",
    "content": "Copyright (c) 2022 Robin Rombach and Patrick Esser and contributors\n\nCreativeML Open RAIL-M\ndated August 22, 2022\n\nSection I: PREAMBLE\n\nMultimodal generative models are being widely adopted and used, and have the potential to transform the way artists, among other individuals, conceive and benefit from AI or ML technologies as a tool for content creation.\n\nNotwithstanding the current and potential benefits that these artifacts can bring to society at large, there are also concerns about potential misuses of them, either due to their technical limitations or ethical considerations.\n\nIn short, this license strives for both the open and responsible downstream use of the accompanying model. When it comes to the open character, we took inspiration from open source permissive licenses regarding the grant of IP rights. Referring to the downstream responsible use, we added use-based restrictions not permitting the use of the Model in very specific scenarios, in order for the licensor to be able to enforce the license in case potential misuses of the Model may occur. At the same time, we strive to promote open and responsible research on generative models for art and content generation.\n\nEven though downstream derivative versions of the model could be released under different licensing terms, the latter will always have to include - at minimum - the same use-based restrictions as the ones in the original license (this license). We believe in the intersection between open and responsible AI development; thus, this License aims to strike a balance between both in order to enable responsible open-science in the field of AI.\n\nThis License governs the use of the model (and its derivatives) and is informed by the model card associated with the model.\n\nNOW THEREFORE, You and Licensor agree as follows:\n\n1. 
Definitions\n\n- \"License\" means the terms and conditions for use, reproduction, and Distribution as defined in this document.\n- \"Data\" means a collection of information and/or content extracted from the dataset used with the Model, including to train, pretrain, or otherwise evaluate the Model. The Data is not licensed under this License.\n- \"Output\" means the results of operating a Model as embodied in informational content resulting therefrom.\n- \"Model\" means any accompanying machine-learning based assemblies (including checkpoints), consisting of learnt weights, parameters (including optimizer states), corresponding to the model architecture as embodied in the Complementary Material, that have been trained or tuned, in whole or in part on the Data, using the Complementary Material.\n- \"Derivatives of the Model\" means all modifications to the Model, works based on the Model, or any other model which is created or initialized by transfer of patterns of the weights, parameters, activations or output of the Model, to the other model, in order to cause the other model to perform similarly to the Model, including - but not limited to - distillation methods entailing the use of intermediate data representations or methods based on the generation of synthetic data by the Model for training the other model.\n- \"Complementary Material\" means the accompanying source code and scripts used to define, run, load, benchmark or evaluate the Model, and used to prepare data for training or evaluation, if any. This includes any accompanying documentation, tutorials, examples, etc, if any.\n- \"Distribution\" means any transmission, reproduction, publication or other sharing of the Model or Derivatives of the Model to a third party, including providing the Model as a hosted service made available by electronic or other remote means - e.g. 
API-based or web access.\n- \"Licensor\" means the copyright owner or entity authorized by the copyright owner that is granting the License, including the persons or entities that may have rights in the Model and/or distributing the Model.\n- \"You\" (or \"Your\") means an individual or Legal Entity exercising permissions granted by this License and/or making use of the Model for whichever purpose and in any field of use, including usage of the Model in an end-use application - e.g. chatbot, translator, image generator.\n- \"Third Parties\" means individuals or legal entities that are not under common control with Licensor or You.\n- \"Contribution\" means any work of authorship, including the original version of the Model and any modifications or additions to that Model or Derivatives of the Model thereof, that is intentionally submitted to Licensor for inclusion in the Model by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, \"submitted\" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Model, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as \"Not a Contribution.\"\n- \"Contributor\" means Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Model.\n\nSection II: INTELLECTUAL PROPERTY RIGHTS\n\nBoth copyright and patent grants apply to the Model, Derivatives of the Model and Complementary Material. The Model and Derivatives of the Model are subject to additional terms as described in Section III.\n\n2. 
Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare, publicly display, publicly perform, sublicense, and distribute the Complementary Material, the Model, and Derivatives of the Model.\n3. Grant of Patent License. Subject to the terms and conditions of this License and where and as applicable, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this paragraph) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Model and the Complementary Material, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Model to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Model and/or Complementary Material or a Contribution incorporated within the Model and/or Complementary Material constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for the Model and/or Work shall terminate as of the date such litigation is asserted or filed.\n\nSection III: CONDITIONS OF USAGE, DISTRIBUTION AND REDISTRIBUTION\n\n4. Distribution and Redistribution. You may host for Third Party remote access purposes (e.g. software-as-a-service), reproduce and distribute copies of the Model or Derivatives of the Model thereof in any medium, with or without modifications, provided that You meet the following conditions:\nUse-based restrictions as referenced in paragraph 5 MUST be included as an enforceable provision by You in any type of legal agreement (e.g. 
a license) governing the use and/or distribution of the Model or Derivatives of the Model, and You shall give notice to subsequent users You Distribute to, that the Model or Derivatives of the Model are subject to paragraph 5. This provision does not apply to the use of Complementary Material.\nYou must give any Third Party recipients of the Model or Derivatives of the Model a copy of this License;\nYou must cause any modified files to carry prominent notices stating that You changed the files;\nYou must retain all copyright, patent, trademark, and attribution notices excluding those notices that do not pertain to any part of the Model, Derivatives of the Model.\nYou may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions - respecting paragraph 4.a. - for use, reproduction, or Distribution of Your modifications, or for any such Derivatives of the Model as a whole, provided Your use, reproduction, and Distribution of the Model otherwise complies with the conditions stated in this License.\n5. Use-based restrictions. The restrictions set forth in Attachment A are considered Use-based restrictions. Therefore You cannot use the Model and the Derivatives of the Model for the specified restricted uses. You may use the Model subject to this License, including only for lawful purposes and in accordance with the License. Use may include creating any content with, finetuning, updating, running, training, evaluating and/or reparametrizing the Model. You shall require all of Your users who use the Model or a Derivative of the Model to comply with the terms of this paragraph (paragraph 5).\n6. The Output You Generate. Except as set forth herein, Licensor claims no rights in the Output You generate using the Model. You are accountable for the Output you generate and its subsequent uses. No use of the output can contravene any provision as stated in the License.\n\nSection IV: OTHER PROVISIONS\n\n7. 
Updates and Runtime Restrictions. To the maximum extent permitted by law, Licensor reserves the right to restrict (remotely or otherwise) usage of the Model in violation of this License, update the Model through electronic means, or modify the Output of the Model based on updates. You shall undertake reasonable efforts to use the latest version of the Model.\n8. Trademarks and related. Nothing in this License permits You to make use of Licensors’ trademarks, trade names, logos or to otherwise suggest endorsement or misrepresent the relationship between the parties; and any rights not expressly granted herein are reserved by the Licensors.\n9. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Model and the Complementary Material (and each Contributor provides its Contributions) on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Model, Derivatives of the Model, and the Complementary Material and assume any risks associated with Your exercise of permissions under this License.\n10. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Model and the Complementary Material (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.\n11. Accepting Warranty or Additional Liability. While redistributing the Model, Derivatives of the Model and the Complementary Material thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.\n12. 
If any provision of this License is held to be invalid, illegal or unenforceable, the remaining provisions shall be unaffected thereby and remain valid as if such provision had not been set forth herein.\n\nEND OF TERMS AND CONDITIONS\n\n\n\n\nAttachment A\n\nUse Restrictions\n\nYou agree not to use the Model or Derivatives of the Model:\n- In any way that violates any applicable national, federal, state, local or international law or regulation;\n- For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;\n- To generate or disseminate verifiably false information and/or content with the purpose of harming others;\n- To generate or disseminate personal identifiable information that can be used to harm an individual;\n- To defame, disparage or otherwise harass others;\n- For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;\n- For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;\n- To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;\n- For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;\n- To provide medical advice and medical results interpretation;\n- To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. 
by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use)."
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/README.md",
    "content": "# stable_diffusion_waifu\n\n|Model Name|stable_diffusion_waifu|\n| :--- | :---: |\n|Category|Multimodal - Text-to-Image Generation|\n|Network|CLIP Text Encoder+UNet+VAE|\n|Dataset|-|\n|Fine-tuning Supported|No|\n|Module Size|4.0GB|\n|Latest Update Date|2022-10-17|\n|Data Metrics|-|\n\n## I. Basic Information\n\n### Application Effect Display\n\n  - Input text \"Goku\"\n\n  - Output image\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/196138387-577da86e-2910-4f7e-abe3-9ac927df7320.png\"  width = \"80%\" hspace='10'/>\n  <br />\n\n  - Generation process\n  <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/22424850/196138374-1b3c8df7-0a35-4216-ba4f-c2bd0a0826ff.gif\"  width = \"80%\" hspace='10'/>\n  <br />\n\n### Module Introduction\n\n
Stable Diffusion is a latent diffusion model, a kind of generative model that obtains the image of interest by iteratively denoising random noise step by step and sampling from the result; it currently achieves impressive effects. Compared with Disco Diffusion, Stable Diffusion performs the iterations in a lower-dimensional latent space rather than in the original pixel space, which greatly reduces the memory and compute requirements, and it can render the desired image within one minute on a V100. This module uses the pretrained weights of hakurei's [waifu-diffusion](https://huggingface.co/hakurei/waifu-diffusion) and can be used to generate anime-style cartoon characters.\n\n\nFor more details, please refer to the paper: [High-Resolution Image Synthesis with Latent Diffusion Models](https://arxiv.org/abs/2112.10752)\n\n
## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install stable_diffusion_waifu\n    ```\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n
## III. Module API Prediction\n\n- ### 1. Command Line Prediction\n\n  - ```shell\n    $ hub run stable_diffusion_waifu --text_prompts \"Goku\" --output_dir stable_diffusion_waifu_out\n    ```\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"stable_diffusion_waifu\")\n    text_prompts = [\"Goku\"]\n    # Generate images; by default they are saved in the stable_diffusion_waifu_out directory.\n    # The returned da is a DocumentArray object that stores all results, including the final results and the intermediate results of the iterations.\n    # You can post-process, save or analyze the generated images by operating on the DocumentArray object.\n    # Set batch_size to generate several images at once.\n    da = module.generate_image(text_prompts=text_prompts, batch_size=3, output_dir='./stable_diffusion_out/')\n    # Show all intermediate results.\n    da[0].chunks[-1].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    # Save the whole generation process as an animated gif.\n    da[0].chunks[-1].chunks.save_gif('stable_diffusion_waifu_out-merged-result.gif')\n    # da is indexed by prompt, and da[0].chunks indexes the images generated for that prompt; several images are generated when batch_size is not 1.\n    # You can also display a single image in the same way, e.g. the generation process of image 0.\n    da[0].chunks[0].chunks.plot_image_sprites(skip_empty=True, show_index=True, keep_aspect_ratio=True)\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_waifu_out-image-0-result.gif')\n    ```\n\n
- ### 3. API\n\n  - ```python\n    def generate_image(\n            text_prompts,\n            width_height: Optional[List[int]] = [512, 512],\n            seed: Optional[int] = None,\n            batch_size: Optional[int] = 1,\n            output_dir: Optional[str] = 'stable_diffusion_out'):\n    ```\n\n    - Text-to-image generation API that generates an image matching the content of the text description.\n\n    - **Parameters**\n\n      - text_prompts(str): The input sentence describing the content of the image to generate, e.g. the cartoon character Goku.\n      - width_height(Optional[List[int]]): Width and height of the final output image; both must be multiples of 64. The larger the image, the longer the computation takes.\n      - seed(Optional[int]): Random seed. Since the default input is random Gaussian noise, different seeds lead to different initial inputs and thus different final results; set this parameter to obtain different output images.\n      - batch_size(Optional[int]): Number of images generated at once for each prompt.\n      - output_dir(Optional[str]): Directory for saving the output images; defaults to \"stable_diffusion_out\".\n\n\n    - **Return**\n      - da(DocumentArray): A DocumentArray object containing `n_batches` Documents, each of which stores all intermediate results of the iteration process. For details, see the [DocumentArray documentation](https://docarray.jina.ai/fundamentals/documentarray/index.html).\n\n
## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online text-to-image generation service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m stable_diffusion_waifu\n    ```\n\n  - This deploys an online text-to-image service API; the default port is 8866.\n\n  - **NOTE:** To predict with a GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise there is no need to set it.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the following few lines of code send a prediction request and obtain the result. The deserialized response is the DocumentArray type described in the API section above and can be handled in exactly the same way as the return value of generate_image.\n\n  - ```python\n    import requests\n    import json\n    from docarray import DocumentArray\n\n    # Send the HTTP request.\n    data = {'text_prompts': 'Goku'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/stable_diffusion_waifu\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Deserialize the returned result.\n    da = DocumentArray.from_base64(r.json()[\"results\"])\n    # Save the result image.\n    da[0].save_uri_to_file('stable_diffusion_waifu_out.png')\n    # Save the generation process as an animated gif.\n    da[0].chunks[0].chunks.save_gif('stable_diffusion_waifu_out.gif')\n    ```\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n  ```shell\n  $ hub install stable_diffusion_waifu==1.0.0\n  ```\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/README.md",
    "content": "# OpenAI CLIP implemented in Paddle.\nThe original implementation repo is [ranchlai/clip.paddle](https://github.com/ranchlai/clip.paddle). We use this repo here for text encoder in stable diffusion.\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/clip/__init__.py",
    "content": "from .utils import *\r\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/clip/layers.py",
    "content": "from typing import Optional\n\nimport paddle\nimport paddle.nn as nn\nfrom paddle import Tensor\nfrom paddle.nn import functional as F\nfrom paddle.nn import Linear\n\n__all__ = ['ResidualAttentionBlock', 'AttentionPool2d', 'multi_head_attention_forward', 'MultiHeadAttention']\n\n\ndef multi_head_attention_forward(x: Tensor,\n                                 num_heads: int,\n                                 q_proj: Linear,\n                                 k_proj: Linear,\n                                 v_proj: Linear,\n                                 c_proj: Linear,\n                                 attn_mask: Optional[Tensor] = None):\n    max_len, batch_size, emb_dim = x.shape\n    head_dim = emb_dim // num_heads\n    scaling = float(head_dim)**-0.5\n    q = q_proj(x)  # L, N, E\n    k = k_proj(x)  # L, N, E\n    v = v_proj(x)  # L, N, E\n    #k = k.con\n    v = v.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    k = k.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n    q = q.reshape((-1, batch_size * num_heads, head_dim)).transpose((1, 0, 2))\n\n    q = q * scaling\n    qk = paddle.bmm(q, k.transpose((0, 2, 1)))\n    if attn_mask is not None:\n        if attn_mask.ndim == 2:\n            attn_mask.unsqueeze_(0)\n        #assert str(attn_mask.dtype) == 'VarType.FP32' and attn_mask.ndim == 3\n        assert attn_mask.shape[0] == 1 and attn_mask.shape[1] == max_len and attn_mask.shape[2] == max_len\n        qk += attn_mask\n\n    qk = paddle.nn.functional.softmax(qk, axis=-1)\n    atten = paddle.bmm(qk, v)\n    atten = atten.transpose((1, 0, 2))\n    atten = atten.reshape((max_len, batch_size, emb_dim))\n    atten = c_proj(atten)\n    return atten\n\n\nclass MultiHeadAttention(nn.Layer):  # without attention mask\n\n    def __init__(self, emb_dim: int, num_heads: int):\n        super().__init__()\n        self.q_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.k_proj = 
nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.v_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.c_proj = nn.Linear(emb_dim, emb_dim, bias_attr=True)\n        self.head_dim = emb_dim // num_heads\n        self.emb_dim = emb_dim\n        self.num_heads = num_heads\n        assert self.head_dim * num_heads == emb_dim, \"embed_dim must be divisible by num_heads\"\n        #self.scaling = float(self.head_dim) ** -0.5\n\n    def forward(self, x, attn_mask=None):  # x is in shape[max_len,batch_size,emb_dim]\n\n        atten = multi_head_attention_forward(x,\n                                             self.num_heads,\n                                             self.q_proj,\n                                             self.k_proj,\n                                             self.v_proj,\n                                             self.c_proj,\n                                             attn_mask=attn_mask)\n\n        return atten\n\n\nclass Identity(nn.Layer):\n\n    def __init__(self):\n        super().__init__()\n\n    def forward(self, x):\n        return x\n\n\nclass Bottleneck(nn.Layer):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1):\n        super().__init__()\n\n        # all conv layers have stride 1. 
an avgpool is performed after the second convolution when stride > 1\n        self.conv1 = nn.Conv2D(inplanes, planes, 1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(planes)\n\n        self.conv2 = nn.Conv2D(planes, planes, 3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(planes)\n\n        self.avgpool = nn.AvgPool2D(stride) if stride > 1 else Identity()\n\n        self.conv3 = nn.Conv2D(planes, planes * self.expansion, 1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(planes * self.expansion)\n\n        self.relu = nn.ReLU()\n        self.downsample = None\n        self.stride = stride\n\n        if stride > 1 or inplanes != planes * Bottleneck.expansion:\n            self.downsample = nn.Sequential(\n                (\"-1\", nn.AvgPool2D(stride)),\n                (\"0\", nn.Conv2D(inplanes, planes * self.expansion, 1, stride=1, bias_attr=False)),\n                (\"1\", nn.BatchNorm2D(planes * self.expansion)))\n\n    def forward(self, x):\n        identity = x\n\n        out = self.relu(self.bn1(self.conv1(x)))\n        out = self.relu(self.bn2(self.conv2(out)))\n        out = self.avgpool(out)\n        out = self.bn3(self.conv3(out))\n\n        if self.downsample is not None:\n            identity = self.downsample(x)\n\n        out += identity\n        out = self.relu(out)\n        return out\n\n\nclass AttentionPool2d(nn.Layer):\n\n    def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None):\n        super().__init__()\n\n        self.positional_embedding = paddle.create_parameter((spacial_dim**2 + 1, embed_dim), dtype='float32')\n\n        self.q_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.k_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.v_proj = nn.Linear(embed_dim, embed_dim, bias_attr=True)\n        self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim, bias_attr=True)\n        self.num_heads = num_heads\n\n        self.head_dim = 
embed_dim // num_heads\n        assert self.head_dim * num_heads == embed_dim, \"embed_dim must be divisible by num_heads\"\n\n    def forward(self, x):\n\n        x = x.reshape((x.shape[0], x.shape[1], x.shape[2] * x.shape[3])).transpose((2, 0, 1))  # NCHW -> (HW)NC\n        x = paddle.concat([paddle.mean(x, axis=0, keepdim=True), x], axis=0)\n        x = x + paddle.unsqueeze(self.positional_embedding, 1)\n        out = multi_head_attention_forward(x, self.num_heads, self.q_proj, self.k_proj, self.v_proj, self.c_proj)\n\n        return out[0]\n\n\nclass QuickGELU(nn.Layer):\n\n    def forward(self, x):\n        return x * paddle.nn.functional.sigmoid(1.702 * x)\n\n\nclass ResidualAttentionBlock(nn.Layer):\n\n
    def __init__(self, d_model: int, n_head: int, attn_mask=None):\n        super().__init__()\n\n        self.attn = MultiHeadAttention(d_model, n_head)\n        self.ln_1 = nn.LayerNorm(d_model)\n        self.mlp = nn.Sequential((\"c_fc\", nn.Linear(d_model, d_model * 4)), (\"gelu\", QuickGELU()),\n                                 (\"c_proj\", nn.Linear(d_model * 4, d_model)))\n        self.ln_2 = nn.LayerNorm(d_model)\n        self.attn_mask = attn_mask\n\n    def attention(self, x):\n        x = self.attn(x, self.attn_mask)\n        assert isinstance(x, paddle.Tensor)  # not a tuple here\n        return x\n\n    def forward(self, x):\n\n        x = x + self.attention(self.ln_1(x))\n        x = x + self.mlp(self.ln_2(x))\n        return x\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/clip/model.py",
    "content": "from typing import Tuple\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddle import nn\n\nfrom .layers import AttentionPool2d\nfrom .layers import Bottleneck\nfrom .layers import MultiHeadAttention\nfrom .layers import ResidualAttentionBlock\n\n\nclass ModifiedResNet(nn.Layer):\n    \"\"\"\n    A ResNet class that is similar to torchvision's but contains the following changes:\n    - There are now 3 \"stem\" convolutions as opposed to 1, with an average pool instead of a max pool.\n    - Performs anti-aliasing strided convolutions, where an avgpool is prepended to convolutions with stride > 1\n    - The final pooling layer is a QKV attention instead of an average pool\n    \"\"\"\n\n    def __init__(self, layers, output_dim, heads, input_resolution=224, width=64):\n        super().__init__()\n        self.output_dim = output_dim\n        self.input_resolution = input_resolution\n\n        # the 3-layer stem\n        self.conv1 = nn.Conv2D(3, width // 2, kernel_size=3, stride=2, padding=1, bias_attr=False)\n        self.bn1 = nn.BatchNorm2D(width // 2)\n        self.conv2 = nn.Conv2D(width // 2, width // 2, kernel_size=3, padding=1, bias_attr=False)\n        self.bn2 = nn.BatchNorm2D(width // 2)\n        self.conv3 = nn.Conv2D(width // 2, width, kernel_size=3, padding=1, bias_attr=False)\n        self.bn3 = nn.BatchNorm2D(width)\n        self.avgpool = nn.AvgPool2D(2)\n        self.relu = nn.ReLU()\n\n        # residual layers\n        self._inplanes = width  # this is a *mutable* variable used during construction\n        self.layer1 = self._make_layer(width, layers[0])\n        self.layer2 = self._make_layer(width * 2, layers[1], stride=2)\n        self.layer3 = self._make_layer(width * 4, layers[2], stride=2)\n        self.layer4 = self._make_layer(width * 8, layers[3], stride=2)\n\n        embed_dim = width * 32  # the ResNet feature dimension\n        self.attnpool = 
AttentionPool2d(input_resolution // 32, embed_dim, heads, output_dim)\n\n    def _make_layer(self, planes, blocks, stride=1):\n        layers = [Bottleneck(self._inplanes, planes, stride)]\n\n        self._inplanes = planes * Bottleneck.expansion\n        for _ in range(1, blocks):\n            layers.append(Bottleneck(self._inplanes, planes))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n\n        def stem(x):\n            for conv, bn in [(self.conv1, self.bn1), (self.conv2, self.bn2), (self.conv3, self.bn3)]:\n                x = self.relu(bn(conv(x)))\n            x = self.avgpool(x)\n            return x\n\n        #x = x.type(self.conv1.weight.dtype)\n        x = stem(x)\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n        x = self.attnpool(x)\n\n        return x\n\n\nclass Transformer(nn.Layer):\n\n    def __init__(self, width: int, layers: int, heads: int, attn_mask=None):\n        super().__init__()\n        self.width = width\n        self.layers = layers\n        self.resblocks = nn.Sequential(*[ResidualAttentionBlock(width, heads, attn_mask) for _ in range(layers)])\n\n    def forward(self, x):\n        return self.resblocks(x)\n\n\nclass VisualTransformer(nn.Layer):\n\n    def __init__(self, input_resolution: int, patch_size: int, width: int, layers: int, heads: int, output_dim: int):\n        super().__init__()\n        self.input_resolution = input_resolution\n        self.output_dim = output_dim\n        # used patch_size x patch_size, stride patch_size to do linear projection\n        self.conv1 = nn.Conv2D(in_channels=3,\n                               out_channels=width,\n                               kernel_size=patch_size,\n                               stride=patch_size,\n                               bias_attr=False)\n\n        # scale = width ** -0.5\n        self.class_embedding = paddle.create_parameter((width, ), 'float32')\n\n        
self.positional_embedding = paddle.create_parameter(((input_resolution // patch_size)**2 + 1, width), 'float32')\n\n        self.ln_pre = nn.LayerNorm(width)\n\n        self.transformer = Transformer(width, layers, heads)\n\n        self.ln_post = nn.LayerNorm(width)\n        self.proj = paddle.create_parameter((width, output_dim), 'float32')\n\n    def forward(self, x):\n\n        x = self.conv1(x)\n        x = x.reshape((x.shape[0], x.shape[1], -1))\n        x = x.transpose((0, 2, 1))\n        x = paddle.concat([self.class_embedding + paddle.zeros((x.shape[0], 1, x.shape[-1]), dtype=x.dtype), x], axis=1)\n\n        x = x + self.positional_embedding\n        x = self.ln_pre(x)\n        x = x.transpose((1, 0, 2))\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))\n        x = self.ln_post(x[:, 0, :])\n        if self.proj is not None:\n            x = paddle.matmul(x, self.proj)\n\n        return x\n\n\nclass TextTransformer(nn.Layer):\n\n    def __init__(self, context_length: int, vocab_size: int, transformer_width: int, transformer_heads: int,\n                 transformer_layers: int):\n        super().__init__()\n        self.context_length = context_length\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # 
mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def forward(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        return x\n\n\nclass CLIP(nn.Layer):\n\n    def __init__(\n            self,\n            embed_dim: int,\n            # vision\n            image_resolution: int,\n            vision_layers: Union[Tuple[int, int, int, int], int],\n            vision_width: int,\n            vision_patch_size: int,\n            # text\n            context_length: int,\n            vocab_size: int,\n            transformer_width: int,\n            transformer_heads: int,\n            transformer_layers: int):\n        super().__init__()\n\n        self.context_length = context_length\n        if isinstance(vision_layers, (tuple, list)):\n            vision_heads = vision_width * 32 // 64\n            self.visual = ModifiedResNet(layers=vision_layers,\n                                         output_dim=embed_dim,\n                                         heads=vision_heads,\n                                         input_resolution=image_resolution,\n                                         width=vision_width)\n        else:\n            vision_heads = vision_width // 64\n            self.visual = VisualTransformer(input_resolution=image_resolution,\n                                            patch_size=vision_patch_size,\n                                            width=vision_width,\n                                            layers=vision_layers,\n                                            heads=vision_heads,\n               
                             output_dim=embed_dim)\n\n        self.transformer = Transformer(width=transformer_width,\n                                       layers=transformer_layers,\n                                       heads=transformer_heads,\n                                       attn_mask=self.build_attention_mask())\n\n        self.vocab_size = vocab_size\n        self.token_embedding = nn.Embedding(vocab_size, transformer_width)\n        self.positional_embedding = paddle.create_parameter((self.context_length, transformer_width), 'float32')\n        self.ln_final = nn.LayerNorm(transformer_width)\n\n        self.text_projection = paddle.create_parameter((transformer_width, embed_dim), 'float32')\n        self.logit_scale = paddle.create_parameter((1, ), 'float32')\n\n    def build_attention_mask(self):\n        # lazily create causal attention mask, with full attention between the vision tokens\n        # mask = paddle.empty((self.context_length, self.context_length),dtype='float32')\n        # mask.fill_(float(\"-inf\"))\n        #mask.triu_(1)  # zero out the lower diagonal\n\n        mask = paddle.ones((self.context_length, self.context_length)) * float(\"-inf\")\n        mask = paddle.triu(mask, diagonal=1)\n\n        return mask\n\n    def encode_image(self, image):\n        return self.visual(image)\n\n    def encode_text(self, text):\n        x = self.token_embedding(text)  # [batch_size, n_ctx, d_model]\n        x = x + self.positional_embedding\n        x = x.transpose((1, 0, 2))  # NLD -> LND\n        x = self.transformer(x)\n        x = x.transpose((1, 0, 2))  # LND -> NLD\n        x = self.ln_final(x)\n        idx = text.numpy().argmax(-1)\n        idx = list(idx)\n        x = [x[i:i + 1, int(j), :] for i, j in enumerate(idx)]\n        x = paddle.concat(x, 0)\n        x = paddle.matmul(x, self.text_projection)\n        return x\n\n    def forward(self, image, text):\n        image_features = self.encode_image(image)\n        text_features = 
self.encode_text(text)\n\n        # normalized features (paddle.norm takes axis=/keepdim=, not torch-style dim=)\n        image_features = image_features / paddle.norm(image_features, p=2, axis=-1, keepdim=True)\n        text_features = text_features / paddle.norm(text_features, p=2, axis=-1, keepdim=True)\n\n        # cosine similarity as logits\n        logit_scale = self.logit_scale.exp()\n        logits_per_image = paddle.matmul(logit_scale * image_features, text_features.t())\n        logits_per_text = paddle.matmul(logit_scale * text_features, image_features.t())\n\n        # shape = [global_batch_size, global_batch_size]\n        return logits_per_image, logits_per_text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/clip/simple_tokenizer.py",
    "content": "import gzip\nimport html\nimport os\nfrom functools import lru_cache\n\nimport ftfy\nimport regex as re\n\n\n@lru_cache()\ndef default_bpe():\n    return os.path.join(os.path.dirname(os.path.abspath(__file__)), \"../assets/bpe_simple_vocab_16e6.txt.gz\")\n\n\n@lru_cache()\ndef bytes_to_unicode():\n    \"\"\"\n    Returns list of utf-8 byte and a corresponding list of unicode strings.\n    The reversible bpe codes work on unicode strings.\n    This means you need a large # of unicode characters in your vocab if you want to avoid UNKs.\n    When you're at something like a 10B token dataset you end up needing around 5K for decent coverage.\n    This is a signficant percentage of your normal, say, 32K bpe vocab.\n    To avoid that, we want lookup tables between utf-8 bytes and unicode strings.\n    And avoids mapping to whitespace/control characters the bpe code barfs on.\n    \"\"\"\n    bs = list(range(ord(\"!\"), ord(\"~\") + 1)) + list(range(ord(\"¡\"), ord(\"¬\") + 1)) + list(range(ord(\"®\"), ord(\"ÿ\") + 1))\n    cs = bs[:]\n    n = 0\n    for b in range(2**8):\n        if b not in bs:\n            bs.append(b)\n            cs.append(2**8 + n)\n            n += 1\n    cs = [chr(n) for n in cs]\n    return dict(zip(bs, cs))\n\n\ndef get_pairs(word):\n    \"\"\"Return set of symbol pairs in a word.\n    Word is represented as tuple of symbols (symbols being variable-length strings).\n    \"\"\"\n    pairs = set()\n    prev_char = word[0]\n    for char in word[1:]:\n        pairs.add((prev_char, char))\n        prev_char = char\n    return pairs\n\n\ndef basic_clean(text):\n    text = ftfy.fix_text(text)\n    text = html.unescape(html.unescape(text))\n    return text.strip()\n\n\ndef whitespace_clean(text):\n    text = re.sub(r'\\s+', ' ', text)\n    text = text.strip()\n    return text\n\n\nclass SimpleTokenizer(object):\n\n    def __init__(self, bpe_path: str = default_bpe()):\n        self.byte_encoder = bytes_to_unicode()\n        
self.byte_decoder = {v: k for k, v in self.byte_encoder.items()}\n        merges = gzip.open(bpe_path).read().decode(\"utf-8\").split('\\n')\n        merges = merges[1:49152 - 256 - 2 + 1]\n        merges = [tuple(merge.split()) for merge in merges]\n        vocab = list(bytes_to_unicode().values())\n        vocab = vocab + [v + '</w>' for v in vocab]\n        for merge in merges:\n            vocab.append(''.join(merge))\n        vocab.extend(['<|startoftext|>', '<|endoftext|>'])\n        self.encoder = dict(zip(vocab, range(len(vocab))))\n        self.decoder = {v: k for k, v in self.encoder.items()}\n        self.bpe_ranks = dict(zip(merges, range(len(merges))))\n        self.cache = {'<|startoftext|>': '<|startoftext|>', '<|endoftext|>': '<|endoftext|>'}\n        self.pat = re.compile(\n            r\"\"\"<\\|startoftext\\|>|<\\|endoftext\\|>|'s|'t|'re|'ve|'m|'ll|'d|[\\p{L}]+|[\\p{N}]|[^\\s\\p{L}\\p{N}]+\"\"\",\n            re.IGNORECASE)\n\n    def bpe(self, token):\n        if token in self.cache:\n            return self.cache[token]\n        word = tuple(token[:-1]) + (token[-1] + '</w>', )\n        pairs = get_pairs(word)\n\n        if not pairs:\n            return token + '</w>'\n\n        while True:\n            bigram = min(pairs, key=lambda pair: self.bpe_ranks.get(pair, float('inf')))\n            if bigram not in self.bpe_ranks:\n                break\n            first, second = bigram\n            new_word = []\n            i = 0\n            while i < len(word):\n                try:\n                    j = word.index(first, i)\n                    new_word.extend(word[i:j])\n                    i = j\n                except:\n                    new_word.extend(word[i:])\n                    break\n\n                if word[i] == first and i < len(word) - 1 and word[i + 1] == second:\n                    new_word.append(first + second)\n                    i += 2\n                else:\n                    new_word.append(word[i])\n            
        i += 1\n            new_word = tuple(new_word)\n            word = new_word\n            if len(word) == 1:\n                break\n            else:\n                pairs = get_pairs(word)\n        word = ' '.join(word)\n        self.cache[token] = word\n        return word\n\n    def encode(self, text):\n        bpe_tokens = []\n        text = whitespace_clean(basic_clean(text)).lower()\n        for token in re.findall(self.pat, text):\n            token = ''.join(self.byte_encoder[b] for b in token.encode('utf-8'))\n            bpe_tokens.extend(self.encoder[bpe_token] for bpe_token in self.bpe(token).split(' '))\n        return bpe_tokens\n\n    def decode(self, tokens):\n        text = ''.join([self.decoder[token] for token in tokens])\n        text = bytearray([self.byte_decoder[c] for c in text]).decode('utf-8', errors=\"replace\").replace('</w>', ' ')\n        return text\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/clip/clip/utils.py",
    "content": "import os\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.utils import download\nfrom paddle.vision.transforms import CenterCrop\nfrom paddle.vision.transforms import Compose\nfrom paddle.vision.transforms import Normalize\nfrom paddle.vision.transforms import Resize\nfrom paddle.vision.transforms import ToTensor\n\nfrom .model import CLIP\nfrom .model import TextTransformer\nfrom .simple_tokenizer import SimpleTokenizer\n\n__all__ = ['transform', 'tokenize', 'build_model']\n\nMODEL_NAMES = ['VITL14']\n\nURL = {'VITL14': os.path.join(os.path.dirname(__file__), 'pre_trained', 'vitl14_textencoder.pdparams')}\n\nMEAN, STD = (0.48145466, 0.4578275, 0.40821073), (0.26862954, 0.26130258, 0.27577711)\n_tokenizer = SimpleTokenizer()\n\ntransform = Compose([\n    Resize(224, interpolation='bicubic'),\n    CenterCrop(224), lambda image: image.convert('RGB'),\n    ToTensor(),\n    Normalize(mean=MEAN, std=STD), lambda t: t.unsqueeze_(0)\n])\n\n\ndef tokenize(texts: Union[str, List[str]], context_length: int = 77):\n    \"\"\"\n    Returns the tokenized representation of given input string(s)\n\n    Parameters\n    ----------\n    texts : Union[str, List[str]]\n        An input string or a list of input strings to tokenize\n\n    context_length : int\n        The context length to use; all CLIP models use 77 as the context length\n\n    Returns\n    -------\n    A two-dimensional tensor containing the resulting tokens, shape = [number of input strings, context_length]\n    \"\"\"\n    if isinstance(texts, str):\n        texts = [texts]\n\n    sot_token = _tokenizer.encoder[\"<|startoftext|>\"]\n    eot_token = _tokenizer.encoder[\"<|endoftext|>\"]\n    all_tokens = [[sot_token] + _tokenizer.encode(text) + [eot_token] for text in texts]\n    result = paddle.zeros((len(all_tokens), context_length), dtype='int64')\n\n    for i, tokens in enumerate(all_tokens):\n        if len(tokens) > context_length:\n          
  raise RuntimeError(f\"Input {texts[i]} is too long for context length {context_length}\")\n        result[i, :len(tokens)] = paddle.to_tensor(np.array(tokens), dtype='int64')\n\n    return result\n\n\ndef build_model(name='VITL14'):\n    assert name in MODEL_NAMES, f\"model name must be one of {MODEL_NAMES}\"\n    name2model = {'VITL14': build_vitl14_language_model}\n    model = name2model[name]()\n    weight = URL[name]\n    sd = paddle.load(weight)\n    state_dict = model.state_dict()\n    for key, value in sd.items():\n        if key in state_dict:\n            state_dict[key] = value\n    model.load_dict(state_dict)\n    model.eval()\n    return model\n\n\ndef build_vitl14_language_model():\n    model = TextTransformer(context_length=77,\n                            vocab_size=49408,\n                            transformer_width=768,\n                            transformer_heads=12,\n                            transformer_layers=12)\n    return model\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/__init__.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = \"0.2.4\"\n\nfrom .models import AutoencoderKL, UNet2DConditionModel, UNet2DModel, VQModel\n\nfrom .schedulers import (DDIMScheduler, DDPMScheduler, KarrasVeScheduler, PNDMScheduler, SchedulerMixin,\n                         ScoreSdeVeScheduler, LMSDiscreteScheduler)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/configuration_utils.py",
    "content": "# coding=utf-8\n# Copyright 2022 The HuggingFace Inc. team.\n# Copyright (c) 2022, NVIDIA CORPORATION.  All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\" ConfigMixinuration base class and utilities.\"\"\"\nimport functools\nimport inspect\nimport json\nimport os\nimport re\nfrom collections import OrderedDict\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Tuple\nfrom typing import Union\n\nfrom requests import HTTPError\n\nfrom paddlehub.common.logger import logger\n\nHUGGINGFACE_CO_RESOLVE_ENDPOINT = \"HUGGINGFACE_CO_RESOLVE_ENDPOINT\"\nDIFFUSERS_CACHE = \"./caches\"\n\n_re_configuration_file = re.compile(r\"config\\.(.*)\\.json\")\n\n\nclass ConfigMixin:\n    r\"\"\"\n    Base class for all configuration classes. 
Handles a few parameters common to all models' configurations as well as\n    methods for loading/downloading/saving configurations.\n\n    \"\"\"\n    config_name = \"model_config.json\"\n    ignore_for_config = []\n\n    def register_to_config(self, **kwargs):\n        if self.config_name is None:\n            raise NotImplementedError(f\"Make sure that {self.__class__} has defined a class name `config_name`\")\n        kwargs[\"_class_name\"] = self.__class__.__name__\n        kwargs[\"_diffusers_version\"] = \"0.0.1\"\n\n        for key, value in kwargs.items():\n            try:\n                setattr(self, key, value)\n            except AttributeError as err:\n                logger.error(f\"Can't set {key} with value {value} for {self}\")\n                raise err\n\n        if not hasattr(self, \"_internal_dict\"):\n            internal_dict = kwargs\n        else:\n            previous_dict = dict(self._internal_dict)\n            internal_dict = {**self._internal_dict, **kwargs}\n            logger.debug(f\"Updating config from {previous_dict} to {internal_dict}\")\n\n        self._internal_dict = FrozenDict(internal_dict)\n\n    def save_config(self, save_directory: Union[str, os.PathLike], push_to_hub: bool = False, **kwargs):\n        \"\"\"\n        Save a configuration object to the directory `save_directory`, so that it can be re-loaded using the\n        [`~ConfigMixin.from_config`] class method.\n\n        Args:\n            save_directory (`str` or `os.PathLike`):\n                Directory where the configuration JSON file will be saved (will be created if it does not exist).\n            kwargs:\n                Additional key word arguments passed along to the [`~utils.PushToHubMixin.push_to_hub`] method.\n        \"\"\"\n        if os.path.isfile(save_directory):\n            raise AssertionError(f\"Provided path ({save_directory}) should be a directory, not a file\")\n\n        os.makedirs(save_directory, exist_ok=True)\n\n        # If 
we save using the predefined names, we can load using `from_config`\n        output_config_file = os.path.join(save_directory, self.config_name)\n\n        self.to_json_file(output_config_file)\n        logger.info(f\"Configuration saved in {output_config_file}\")\n\n    @classmethod\n    def from_config(cls, pretrained_model_name_or_path: Union[str, os.PathLike], return_unused_kwargs=False, **kwargs):\n        config_dict = cls.get_config_dict(pretrained_model_name_or_path=pretrained_model_name_or_path, **kwargs)\n\n        init_dict, unused_kwargs = cls.extract_init_dict(config_dict, **kwargs)\n\n        model = cls(**init_dict)\n\n        if return_unused_kwargs:\n            return model, unused_kwargs\n        else:\n            return model\n\n    @classmethod\n    def get_config_dict(cls, pretrained_model_name_or_path: Union[str, os.PathLike],\n                        **kwargs) -> Dict[str, Any]:\n        cache_dir = kwargs.pop(\"cache_dir\", DIFFUSERS_CACHE)\n        force_download = kwargs.pop(\"force_download\", False)\n        resume_download = kwargs.pop(\"resume_download\", False)\n        proxies = kwargs.pop(\"proxies\", None)\n        use_auth_token = kwargs.pop(\"use_auth_token\", None)\n        local_files_only = kwargs.pop(\"local_files_only\", False)\n        revision = kwargs.pop(\"revision\", None)\n        subfolder = kwargs.pop(\"subfolder\", None)\n\n        user_agent = {\"file_type\": \"config\"}\n\n        pretrained_model_name_or_path = str(pretrained_model_name_or_path)\n\n        if cls.config_name is None:\n            raise ValueError(\n                \"`self.config_name` is not defined. Note that one should not load a config from \"\n                \"`ConfigMixin`. 
Please make sure to define `config_name` in a class inheriting from `ConfigMixin`\")\n\n        if os.path.isfile(pretrained_model_name_or_path):\n            config_file = pretrained_model_name_or_path\n        elif os.path.isdir(pretrained_model_name_or_path):\n            if os.path.isfile(os.path.join(pretrained_model_name_or_path, cls.config_name)):\n                # Load from a PyTorch checkpoint\n                config_file = os.path.join(pretrained_model_name_or_path, cls.config_name)\n            elif subfolder is not None and os.path.isfile(\n                    os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)):\n                config_file = os.path.join(pretrained_model_name_or_path, subfolder, cls.config_name)\n            else:\n                raise EnvironmentError(\n                    f\"Error no file named {cls.config_name} found in directory {pretrained_model_name_or_path}.\")\n        else:\n            try:\n                # Load from URL or cache if already cached\n                from huggingface_hub import hf_hub_download\n                config_file = hf_hub_download(\n                    pretrained_model_name_or_path,\n                    filename=cls.config_name,\n                    cache_dir=cache_dir,\n                    force_download=force_download,\n                    proxies=proxies,\n                    resume_download=resume_download,\n                    local_files_only=local_files_only,\n                    use_auth_token=use_auth_token,\n                    user_agent=user_agent,\n                    subfolder=subfolder,\n                )\n\n            except HTTPError as err:\n                raise EnvironmentError(\"There was a specific connection error when trying to load\"\n                                       f\" {pretrained_model_name_or_path}:\\n{err}\")\n            except ValueError:\n                raise EnvironmentError(\n                    f\"We couldn't connect to 
'{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this model, couldn't find it\"\n                    f\" in the cached files and it looks like {pretrained_model_name_or_path} is not the path to a\"\n                    f\" directory containing a {cls.config_name} file.\\nCheck your internet connection or see how to\"\n                    \" run the library in offline mode at\"\n                    \" 'https://huggingface.co/docs/diffusers/installation#offline-mode'.\")\n            except EnvironmentError:\n                raise EnvironmentError(\n                    f\"Can't load config for '{pretrained_model_name_or_path}'. If you were trying to load it from \"\n                    \"'https://huggingface.co/models', make sure you don't have a local directory with the same name. \"\n                    f\"Otherwise, make sure '{pretrained_model_name_or_path}' is the correct path to a directory \"\n                    f\"containing a {cls.config_name} file\")\n\n        try:\n            # Load config dict\n            config_dict = cls._dict_from_json_file(config_file)\n        except (json.JSONDecodeError, UnicodeDecodeError):\n            raise EnvironmentError(f\"It looks like the config file at '{config_file}' is not a valid JSON file.\")\n\n        return config_dict\n\n    @classmethod\n    def extract_init_dict(cls, config_dict, **kwargs):\n        expected_keys = set(dict(inspect.signature(cls.__init__).parameters).keys())\n        expected_keys.remove(\"self\")\n        # remove general kwargs if present in dict\n        if \"kwargs\" in expected_keys:\n            expected_keys.remove(\"kwargs\")\n        # remove keys to be ignored\n        if len(cls.ignore_for_config) > 0:\n            expected_keys = expected_keys - set(cls.ignore_for_config)\n        init_dict = {}\n        for key in expected_keys:\n            if key in kwargs:\n                # overwrite key\n                init_dict[key] = kwargs.pop(key)\n            elif key in config_dict:\n  
              # use value from config dict\n                init_dict[key] = config_dict.pop(key)\n\n        unused_kwargs = {**config_dict, **kwargs}\n\n        passed_keys = set(init_dict.keys())\n        if len(expected_keys - passed_keys) > 0:\n            logger.warning(\n                f\"{expected_keys - passed_keys} were not found in config. Values will be initialized to default values.\")\n\n        return init_dict, unused_kwargs\n\n    @classmethod\n    def _dict_from_json_file(cls, json_file: Union[str, os.PathLike]):\n        with open(json_file, \"r\", encoding=\"utf-8\") as reader:\n            text = reader.read()\n        return json.loads(text)\n\n    def __repr__(self):\n        return f\"{self.__class__.__name__} {self.to_json_string()}\"\n\n    @property\n    def config(self) -> Dict[str, Any]:\n        return self._internal_dict\n\n    def to_json_string(self) -> str:\n        \"\"\"\n        Serializes this instance to a JSON string.\n\n        Returns:\n            `str`: String containing all the attributes that make up this configuration instance in JSON format.\n        \"\"\"\n        config_dict = self._internal_dict if hasattr(self, \"_internal_dict\") else {}\n        return json.dumps(config_dict, indent=2, sort_keys=True) + \"\\n\"\n\n    def to_json_file(self, json_file_path: Union[str, os.PathLike]):\n        \"\"\"\n        Save this instance to a JSON file.\n\n        Args:\n            json_file_path (`str` or `os.PathLike`):\n                Path to the JSON file in which this configuration instance's parameters will be saved.\n        \"\"\"\n        with open(json_file_path, \"w\", encoding=\"utf-8\") as writer:\n            writer.write(self.to_json_string())\n\n\nclass FrozenDict(OrderedDict):\n\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n\n        for key, value in self.items():\n            setattr(self, key, value)\n\n        self.__frozen = True\n\n    def __delitem__(self, 
*args, **kwargs):\n        raise Exception(f\"You cannot use ``__delitem__`` on a {self.__class__.__name__} instance.\")\n\n    def setdefault(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``setdefault`` on a {self.__class__.__name__} instance.\")\n\n    def pop(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``pop`` on a {self.__class__.__name__} instance.\")\n\n    def update(self, *args, **kwargs):\n        raise Exception(f\"You cannot use ``update`` on a {self.__class__.__name__} instance.\")\n\n    def __setattr__(self, name, value):\n        if hasattr(self, \"_FrozenDict__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setattr__`` on a {self.__class__.__name__} instance.\")\n        super().__setattr__(name, value)\n\n    def __setitem__(self, name, value):\n        if hasattr(self, \"_FrozenDict__frozen\") and self.__frozen:\n            raise Exception(f\"You cannot use ``__setitem__`` on a {self.__class__.__name__} instance.\")\n        super().__setitem__(name, value)\n\n\ndef register_to_config(init):\n    \"\"\"\n    Decorator to apply on the init of classes inheriting from `ConfigMixin` so that all the arguments are automatically\n    sent to `self.register_to_config`. 
To ignore a specific argument accepted by the init but that shouldn't be\n    registered in the config, use the `ignore_for_config` class variable\n\n    Warning: Once decorated, all private arguments (beginning with an underscore) are trashed and not sent to the init!\n    \"\"\"\n\n    @functools.wraps(init)\n    def inner_init(self, *args, **kwargs):\n        # Ignore private kwargs in the init.\n        init_kwargs = {k: v for k, v in kwargs.items() if not k.startswith(\"_\")}\n        init(self, *args, **init_kwargs)\n        if not isinstance(self, ConfigMixin):\n            raise RuntimeError(\n                f\"`@register_for_config` was applied to {self.__class__.__name__} init method, but this class does \"\n                \"not inherit from `ConfigMixin`.\")\n\n        ignore = getattr(self, \"ignore_for_config\", [])\n        # Get positional arguments aligned with kwargs\n        new_kwargs = {}\n        signature = inspect.signature(init)\n        parameters = {\n            name: p.default\n            for i, (name, p) in enumerate(signature.parameters.items()) if i > 0 and name not in ignore\n        }\n        for arg, name in zip(args, parameters.keys()):\n            new_kwargs[name] = arg\n\n        # Then add all kwargs\n        new_kwargs.update({\n            k: init_kwargs.get(k, default)\n            for k, default in parameters.items() if k not in ignore and k not in new_kwargs\n        })\n        getattr(self, \"register_to_config\")(**new_kwargs)\n\n    return inner_init\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/README.md",
    "content": "# Models\n\n- Models: Neural network that models $p_\\theta(\\mathbf{x}_{t-1}|\\mathbf{x}_t)$ (see image below) and is trained end-to-end to denoise a noisy input to an image. Examples: UNet, Conditioned UNet, 3D UNet, Transformer UNet\n\n## API\n\nTODO(Suraj, Patrick)\n\n## Examples\n\nTODO(Suraj, Patrick)\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .unet_2d import UNet2DModel\nfrom .unet_2d_condition import UNet2DConditionModel\nfrom .vae import AutoencoderKL\nfrom .vae import VQModel\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/attention.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nfrom inspect import isfunction\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef finfo(dtype):\n    if dtype == paddle.float32:\n        return np.finfo(np.float32)\n    if dtype == paddle.float16:\n        return np.finfo(np.float16)\n    if dtype == paddle.float64:\n        return np.finfo(np.float64)\n\n\npaddle.finfo = finfo\n\n\nclass AttentionBlockNew(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other. 
Originally ported from here, but adapted\n    to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    Uses three q, k, v linear layers to compute attention\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_head_channels=None,\n        num_groups=32,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n\n        self.num_heads = channels // num_head_channels if num_head_channels is not None else 1\n        self.num_head_size = num_head_channels\n        self.group_norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n\n        # define q,k,v as linear layers\n        self.query = nn.Linear(channels, channels)\n        self.key = nn.Linear(channels, channels)\n        self.value = nn.Linear(channels, channels)\n\n        self.rescale_output_factor = rescale_output_factor\n        self.proj_attn = nn.Linear(channels, channels)\n\n    def transpose_for_scores(self, projection: paddle.Tensor) -> paddle.Tensor:\n        new_projection_shape = projection.shape[:-1] + [self.num_heads, -1]\n        # move heads to 2nd position (B, T, H * D) -> (B, T, H, D) -> (B, H, T, D)\n        new_projection = projection.reshape(new_projection_shape).transpose([0, 2, 1, 3])\n        return new_projection\n\n    def forward(self, hidden_states):\n        residual = hidden_states\n        batch, channel, height, width = hidden_states.shape\n\n        # norm\n        hidden_states = self.group_norm(hidden_states)\n\n        hidden_states = hidden_states.reshape([batch, channel, height * width]).transpose([0, 2, 1])\n\n        # proj to q, k, v\n        query_proj = self.query(hidden_states)\n        key_proj = self.key(hidden_states)\n        value_proj = self.value(hidden_states)\n\n        # transpose\n        query_states = self.transpose_for_scores(query_proj)\n        
key_states = self.transpose_for_scores(key_proj)\n        value_states = self.transpose_for_scores(value_proj)\n\n        # get scores\n        scale = 1 / math.sqrt(math.sqrt(self.channels / self.num_heads))\n        attention_scores = paddle.matmul(query_states * scale, key_states * scale, transpose_y=True)\n        attention_probs = F.softmax(attention_scores.astype(\"float32\"), axis=-1).astype(attention_scores.dtype)\n\n        # compute attention output\n        context_states = paddle.matmul(attention_probs, value_states)\n\n        context_states = context_states.transpose([0, 2, 1, 3])\n        new_context_states_shape = context_states.shape[:-2] + [\n            self.channels,\n        ]\n        context_states = context_states.reshape(new_context_states_shape)\n\n        # compute next hidden_states\n        hidden_states = self.proj_attn(context_states)\n        hidden_states = hidden_states.transpose([0, 2, 1]).reshape([batch, channel, height, width])\n\n        # res connect and rescale\n        hidden_states = (hidden_states + residual) / self.rescale_output_factor\n        return hidden_states\n\n    def set_weight(self, attn_layer):\n        self.group_norm.weight.set_value(attn_layer.norm.weight)\n        self.group_norm.bias.set_value(attn_layer.norm.bias)\n\n        if hasattr(attn_layer, \"q\"):\n            self.query.weight.set_value(attn_layer.q.weight[:, :, 0, 0])\n            self.key.weight.set_value(attn_layer.k.weight[:, :, 0, 0])\n            self.value.weight.set_value(attn_layer.v.weight[:, :, 0, 0])\n\n            self.query.bias.set_value(attn_layer.q.bias)\n            self.key.bias.set_value(attn_layer.k.bias)\n            self.value.bias.set_value(attn_layer.v.bias)\n\n            self.proj_attn.weight.set_value(attn_layer.proj_out.weight[:, :, 0, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj_out.bias)\n        elif hasattr(attn_layer, \"NIN_0\"):\n            
self.query.weight.set_value(attn_layer.NIN_0.W.t())\n            self.key.weight.set_value(attn_layer.NIN_1.W.t())\n            self.value.weight.set_value(attn_layer.NIN_2.W.t())\n\n            self.query.bias.set_value(attn_layer.NIN_0.b)\n            self.key.bias.set_value(attn_layer.NIN_1.b)\n            self.value.bias.set_value(attn_layer.NIN_2.b)\n\n            self.proj_attn.weight.set_value(attn_layer.NIN_3.W.t())\n            self.proj_attn.bias.set_value(attn_layer.NIN_3.b)\n\n            self.group_norm.weight.set_value(attn_layer.GroupNorm_0.weight)\n            self.group_norm.bias.set_value(attn_layer.GroupNorm_0.bias)\n        else:\n            qkv_weight = attn_layer.qkv.weight.reshape(\n                [self.num_heads, 3 * self.channels // self.num_heads, self.channels])\n            qkv_bias = attn_layer.qkv.bias.reshape([self.num_heads, 3 * self.channels // self.num_heads])\n\n            q_w, k_w, v_w = qkv_weight.split(self.channels // self.num_heads, axis=1)\n            q_b, k_b, v_b = qkv_bias.split(self.channels // self.num_heads, axis=1)\n\n            self.query.weight.set_value(q_w.reshape([-1, self.channels]))\n            self.key.weight.set_value(k_w.reshape([-1, self.channels]))\n            self.value.weight.set_value(v_w.reshape([-1, self.channels]))\n\n            self.query.bias.set_value(q_b.flatten())\n            self.key.bias.set_value(k_b.flatten())\n            self.value.bias.set_value(v_b.flatten())\n\n            self.proj_attn.weight.set_value(attn_layer.proj.weight[:, :, 0])\n            self.proj_attn.bias.set_value(attn_layer.proj.bias)\n\n\nclass SpatialTransformer(nn.Layer):\n    \"\"\"\n    Transformer block for image-like data. First, project the input (aka embedding) and reshape to b, t, d. Then apply\n    standard transformer action. 
Finally, reshape to image\n    \"\"\"\n\n    def __init__(self, in_channels, n_heads, d_head, depth=1, dropout=0.0, context_dim=None):\n        super().__init__()\n        self.n_heads = n_heads\n        self.d_head = d_head\n        self.in_channels = in_channels\n        inner_dim = n_heads * d_head\n        self.norm = nn.GroupNorm(num_groups=32, num_channels=in_channels, epsilon=1e-6)\n\n        self.proj_in = nn.Conv2D(in_channels, inner_dim, kernel_size=1, stride=1, padding=0)\n\n        self.transformer_blocks = nn.LayerList([\n            BasicTransformerBlock(inner_dim, n_heads, d_head, dropout=dropout, context_dim=context_dim)\n            for d in range(depth)\n        ])\n\n        self.proj_out = nn.Conv2D(inner_dim, in_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, context=None):\n        # note: if no context is given, cross-attention defaults to self-attention\n        b, c, h, w = x.shape\n        x_in = x\n        x = self.norm(x)\n        x = self.proj_in(x)\n        x = x.transpose([0, 2, 3, 1]).reshape([b, h * w, c])\n        for block in self.transformer_blocks:\n            x = block(x, context=context)\n        x = x.reshape([b, h, w, c]).transpose([0, 3, 1, 2])\n        x = self.proj_out(x)\n        return x + x_in\n\n    def set_weight(self, layer):\n        self.norm = layer.norm\n        self.proj_in = layer.proj_in\n        self.transformer_blocks = layer.transformer_blocks\n        self.proj_out = layer.proj_out\n\n\nclass BasicTransformerBlock(nn.Layer):\n\n    def __init__(self, dim, n_heads, d_head, dropout=0.0, context_dim=None, gated_ff=True, checkpoint=True):\n        super().__init__()\n        self.attn1 = CrossAttention(query_dim=dim, heads=n_heads, dim_head=d_head,\n                                    dropout=dropout)  # is a self-attention\n        self.ff = FeedForward(dim, dropout=dropout, glu=gated_ff)\n        self.attn2 = CrossAttention(query_dim=dim,\n                                    
context_dim=context_dim,\n                                    heads=n_heads,\n                                    dim_head=d_head,\n                                    dropout=dropout)  # is self-attn if context is none\n        self.norm1 = nn.LayerNorm(dim)\n        self.norm2 = nn.LayerNorm(dim)\n        self.norm3 = nn.LayerNorm(dim)\n        self.checkpoint = checkpoint\n\n    def forward(self, x, context=None):\n        x = self.attn1(self.norm1(x)) + x\n        x = self.attn2(self.norm2(x), context=context) + x\n        x = self.ff(self.norm3(x)) + x\n        return x\n\n\nclass CrossAttention(nn.Layer):\n\n    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, dropout=0.0):\n        super().__init__()\n        inner_dim = dim_head * heads\n        context_dim = default(context_dim, query_dim)\n\n        self.scale = dim_head**-0.5\n        self.heads = heads\n\n        self.to_q = nn.Linear(query_dim, inner_dim, bias_attr=False)\n        self.to_k = nn.Linear(context_dim, inner_dim, bias_attr=False)\n        self.to_v = nn.Linear(context_dim, inner_dim, bias_attr=False)\n\n        self.to_out = nn.Sequential(nn.Linear(inner_dim, query_dim), nn.Dropout(dropout))\n\n    def reshape_heads_to_batch_dim(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size, seq_len, head_size, dim // head_size])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size * head_size, seq_len, dim // head_size])\n        return tensor\n\n    def reshape_batch_dim_to_heads(self, tensor):\n        batch_size, seq_len, dim = tensor.shape\n        head_size = self.heads\n        tensor = tensor.reshape([batch_size // head_size, head_size, seq_len, dim])\n        tensor = tensor.transpose([0, 2, 1, 3]).reshape([batch_size // head_size, seq_len, dim * head_size])\n        return tensor\n\n    def forward(self, x, context=None, mask=None):\n        batch_size, 
sequence_length, dim = x.shape\n\n        h = self.heads\n\n        q = self.to_q(x)\n        context = default(context, x)\n        k = self.to_k(context)\n        v = self.to_v(context)\n\n        q = self.reshape_heads_to_batch_dim(q)\n        k = self.reshape_heads_to_batch_dim(k)\n        v = self.reshape_heads_to_batch_dim(v)\n\n        sim = paddle.einsum(\"b i d, b j d -> b i j\", q * self.scale, k)\n\n        if exists(mask):\n            mask = mask.reshape([batch_size, -1])\n            max_neg_value = -paddle.finfo(sim.dtype).max\n            mask = mask[:, None, :].repeat(h, 1, 1)\n            sim.masked_fill_(~mask, max_neg_value)\n\n        # attention, what we cannot get enough of\n        attn = F.softmax(sim, axis=-1)\n\n        out = paddle.einsum(\"b i j, b j d -> b i d\", attn, v)\n        out = self.reshape_batch_dim_to_heads(out)\n        return self.to_out(out)\n\n\nclass FeedForward(nn.Layer):\n\n    def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.0):\n        super().__init__()\n        inner_dim = int(dim * mult)\n        dim_out = default(dim_out, dim)\n        project_in = nn.Sequential(nn.Linear(dim, inner_dim), nn.GELU()) if not glu else GEGLU(dim, inner_dim)\n\n        self.net = nn.Sequential(project_in, nn.Dropout(dropout), nn.Linear(inner_dim, dim_out))\n\n    def forward(self, x):\n        return self.net(x)\n\n\n# feedforward\nclass GEGLU(nn.Layer):\n\n    def __init__(self, dim_in, dim_out):\n        super().__init__()\n        self.proj = nn.Linear(dim_in, dim_out * 2)\n\n    def forward(self, x):\n        x, gate = self.proj(x).chunk(2, axis=-1)\n        return x * F.gelu(gate)\n\n\n# TODO(Patrick) - remove once all weights have been converted -> not needed anymore then\nclass NIN(nn.Layer):\n\n    def __init__(self, in_dim, num_units, init_scale=0.1):\n        super().__init__()\n        self.W = self.create_parameter(shape=[in_dim, num_units], default_initializer=nn.initializer.Constant(0.))\n        
self.b = self.create_parameter(shape=[\n            num_units,\n        ],\n                                       is_bias=True,\n                                       default_initializer=nn.initializer.Constant(0.))\n\n\ndef exists(val):\n    return val is not None\n\n\ndef default(val, d):\n    if exists(val):\n        return val\n    return d() if isfunction(d) else d\n\n\n# the main attention block that is used for all models\nclass AttentionBlock(nn.Layer):\n    \"\"\"\n    An attention block that allows spatial positions to attend to each other.\n\n    Originally ported from here, but adapted to the N-d case.\n    https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.\n    \"\"\"\n\n    def __init__(\n        self,\n        channels,\n        num_heads=1,\n        num_head_channels=None,\n        num_groups=32,\n        encoder_channels=None,\n        overwrite_qkv=False,\n        overwrite_linear=False,\n        rescale_output_factor=1.0,\n        eps=1e-5,\n    ):\n        super().__init__()\n        self.channels = channels\n        if num_head_channels is None:\n            self.num_heads = num_heads\n        else:\n            assert (channels % num_head_channels == 0\n                    ), f\"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}\"\n            self.num_heads = channels // num_head_channels\n\n        self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=eps)\n        self.qkv = nn.Conv1D(channels, channels * 3, 1)\n        self.n_heads = self.num_heads\n        self.rescale_output_factor = rescale_output_factor\n\n        if encoder_channels is not None:\n            self.encoder_kv = nn.Conv1D(encoder_channels, channels * 2, 1)\n\n        self.proj = nn.Conv1D(channels, channels, 1)\n\n        self.overwrite_qkv = overwrite_qkv\n        self.overwrite_linear = overwrite_linear\n\n        if overwrite_qkv:\n         
   in_channels = channels\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.q = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.k = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.v = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n            self.proj_out = nn.Conv2D(in_channels, in_channels, kernel_size=1, stride=1, padding=0)\n        elif self.overwrite_linear:\n            num_groups = min(channels // 4, 32)\n            self.norm = nn.GroupNorm(num_channels=channels, num_groups=num_groups, epsilon=1e-6)\n            self.NIN_0 = NIN(channels, channels)\n            self.NIN_1 = NIN(channels, channels)\n            self.NIN_2 = NIN(channels, channels)\n            self.NIN_3 = NIN(channels, channels)\n\n            self.GroupNorm_0 = nn.GroupNorm(num_groups=num_groups, num_channels=channels, epsilon=1e-6)\n        else:\n            self.proj_out = nn.Conv1D(channels, channels, 1)\n            self.set_weights(self)\n\n        self.is_overwritten = False\n\n    def set_weights(self, layer):\n        if self.overwrite_qkv:\n            qkv_weight = paddle.concat([layer.q.weight, layer.k.weight, layer.v.weight], axis=0)[:, :, :, 0]\n            qkv_bias = paddle.concat([layer.q.bias, layer.k.bias, layer.v.bias], axis=0)\n\n            self.qkv.weight.set_value(qkv_weight)\n            self.qkv.bias.set_value(qkv_bias)\n\n            proj_out = nn.Conv1D(self.channels, self.channels, 1)\n            proj_out.weight.set_value(layer.proj_out.weight[:, :, :, 0])\n            proj_out.bias.set_value(layer.proj_out.bias)\n\n            self.proj = proj_out\n        elif self.overwrite_linear:\n            self.qkv.weight.set_value(\n                paddle.concat([self.NIN_0.W.t(), self.NIN_1.W.t(), self.NIN_2.W.t()], axis=0)[:, :, None])\n            
self.qkv.bias.set_value(paddle.concat([self.NIN_0.b, self.NIN_1.b, self.NIN_2.b], axis=0))\n\n            self.proj.weight.set_value(self.NIN_3.W.t()[:, :, None])\n            self.proj.bias.set_value(self.NIN_3.b)\n\n            self.norm.weight.set_value(self.GroupNorm_0.weight)\n            self.norm.bias.set_value(self.GroupNorm_0.bias)\n        else:\n            self.proj.weight.set_value(self.proj_out.weight)\n            self.proj.bias.set_value(self.proj_out.bias)\n\n    def forward(self, x, encoder_out=None):\n        if not self.is_overwritten and (self.overwrite_qkv or self.overwrite_linear):\n            self.set_weights(self)\n            self.is_overwritten = True\n\n        b, c, *spatial = x.shape\n        hid_states = self.norm(x).reshape([b, c, -1])\n\n        qkv = self.qkv(hid_states)\n        bs, width, length = qkv.shape\n        assert width % (3 * self.n_heads) == 0\n        ch = width // (3 * self.n_heads)\n        q, k, v = qkv.reshape([bs * self.n_heads, ch * 3, length]).split(ch, axis=1)\n\n        if encoder_out is not None:\n            encoder_kv = self.encoder_kv(encoder_out)\n            assert encoder_kv.shape[1] == self.n_heads * ch * 2\n            ek, ev = encoder_kv.reshape([bs * self.n_heads, ch * 2, -1]).split(ch, axis=1)\n            k = paddle.concat([ek, k], axis=-1)\n            v = paddle.concat([ev, v], axis=-1)\n\n        scale = 1 / math.sqrt(math.sqrt(ch))\n        weight = paddle.einsum(\"bct,bcs->bts\", q * scale, k * scale)  # More stable with f16 than dividing afterwards\n        weight = F.softmax(weight.astype(\"float32\"), axis=-1).astype(weight.dtype)\n\n        a = paddle.einsum(\"bts,bcs->bct\", weight, v)\n        h = a.reshape([bs, -1, length])\n\n        h = self.proj(h)\n        h = h.reshape([b, c, *spatial])\n\n        result = x + h\n\n        result = result / self.rescale_output_factor\n\n        return result\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/embeddings.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef get_timestep_embedding(timesteps,\n                           embedding_dim,\n                           flip_sin_to_cos=False,\n                           downscale_freq_shift=1,\n                           scale=1,\n                           max_period=10000):\n    \"\"\"\n    This matches the implementation in Denoising Diffusion Probabilistic Models: Create sinusoidal timestep embeddings.\n\n    :param timesteps: a 1-D Tensor of N indices, one per batch element.\n                      These may be fractional.\n    :param embedding_dim: the dimension of the output. :param max_period: controls the minimum frequency of the\n    embeddings. 
:return: an [N x dim] Tensor of positional embeddings.\n    \"\"\"\n    assert len(timesteps.shape) == 1, \"Timesteps should be a 1d-array\"\n\n    half_dim = embedding_dim // 2\n    exponent = -math.log(max_period) * paddle.arange(start=0, end=half_dim, dtype=\"float32\")\n    exponent = exponent / (half_dim - downscale_freq_shift)\n\n    emb = paddle.exp(exponent)\n    emb = timesteps[:, None].astype(\"float32\") * emb[None, :]\n\n    # scale embeddings\n    emb = scale * emb\n\n    # concat sine and cosine embeddings\n    emb = paddle.concat([paddle.sin(emb), paddle.cos(emb)], axis=-1)\n\n    # flip sine and cosine embeddings\n    if flip_sin_to_cos:\n        emb = paddle.concat([emb[:, half_dim:], emb[:, :half_dim]], axis=-1)\n\n    # zero pad: paddle.concat takes a list of tensors as its first argument\n    if embedding_dim % 2 == 1:\n        emb = paddle.concat([emb, paddle.zeros([emb.shape[0], 1])], axis=-1)\n    return emb\n\n\nclass TimestepEmbedding(nn.Layer):\n\n    def __init__(self, channel, time_embed_dim, act_fn=\"silu\"):\n        super().__init__()\n\n        self.linear_1 = nn.Linear(channel, time_embed_dim)\n        self.act = None\n        if act_fn == \"silu\":\n            self.act = nn.Silu()\n        self.linear_2 = nn.Linear(time_embed_dim, time_embed_dim)\n\n    def forward(self, sample):\n        sample = self.linear_1(sample)\n\n        if self.act is not None:\n            sample = self.act(sample)\n\n        sample = self.linear_2(sample)\n        return sample\n\n\nclass Timesteps(nn.Layer):\n\n    def __init__(self, num_channels, flip_sin_to_cos, downscale_freq_shift):\n        super().__init__()\n        self.num_channels = num_channels\n        self.flip_sin_to_cos = flip_sin_to_cos\n        self.downscale_freq_shift = downscale_freq_shift\n\n    def forward(self, timesteps):\n        t_emb = get_timestep_embedding(\n            timesteps,\n            self.num_channels,\n            flip_sin_to_cos=self.flip_sin_to_cos,\n            downscale_freq_shift=self.downscale_freq_shift,\n        )\n 
       return t_emb\n\n\nclass GaussianFourierProjection(nn.Layer):\n    \"\"\"Gaussian Fourier embeddings for noise levels.\"\"\"\n\n    def __init__(self, embedding_size=256, scale=1.0):\n        super().__init__()\n        self.register_buffer(\"weight\", paddle.randn((embedding_size, )) * scale)\n\n        # to delete later\n        self.register_buffer(\"W\", paddle.randn((embedding_size, )) * scale)\n\n        self.weight = self.W\n\n    def forward(self, x):\n        x = paddle.log(x)\n        x_proj = x[:, None] * self.weight[None, :] * 2 * np.pi\n        out = paddle.concat([paddle.sin(x_proj), paddle.cos(x_proj)], axis=-1)\n        return out\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/resnet.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom functools import partial\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef pad_new(x, pad, mode=\"constant\", value=0):\n    new_pad = []\n    for _ in range(x.ndim * 2 - len(pad)):\n        new_pad.append(0)\n    ndim = list(range(x.ndim - 1, 0, -1))\n    axes_start = {}\n    for i, _pad in enumerate(pad):\n        if _pad < 0:\n            new_pad.append(0)\n            zhengshu, yushu = divmod(i, 2)\n            if yushu == 0:\n                axes_start[ndim[zhengshu]] = -_pad\n        else:\n            new_pad.append(_pad)\n\n    padded = paddle.nn.functional.pad(x, new_pad, mode=mode, value=value)\n    padded_shape = paddle.shape(padded)\n    axes = []\n    starts = []\n    ends = []\n    for k, v in axes_start.items():\n        axes.append(k)\n        starts.append(v)\n        ends.append(padded_shape[k])\n        assert v < padded_shape[k]\n\n    if axes:\n        return padded.slice(axes=axes, starts=starts, ends=ends)\n    else:\n        return padded\n\n\nclass Upsample2D(nn.Layer):\n    \"\"\"\n    An upsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 upsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, use_conv_transpose=False, out_channels=None, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.use_conv_transpose = use_conv_transpose\n        self.name = name\n\n        conv = None\n        if use_conv_transpose:\n            conv = nn.Conv2DTranspose(channels, self.out_channels, 4, 2, 1)\n        elif use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, padding=1)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.conv = conv\n        else:\n            self.Conv2d_0 = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv_transpose:\n            return self.conv(x)\n\n        x = F.interpolate(x, scale_factor=2.0, mode=\"nearest\")\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if self.use_conv:\n            if self.name == \"conv\":\n                x = self.conv(x)\n            else:\n                x = self.Conv2d_0(x)\n\n        return x\n\n\nclass Downsample2D(nn.Layer):\n    \"\"\"\n    A downsampling layer with an optional convolution.\n\n    :param channels: channels in the inputs and outputs. :param use_conv: a bool determining if a convolution is\n    applied. :param dims: determines if the signal is 1D, 2D, or 3D. 
If 3D, then\n                 downsampling occurs in the inner-two dimensions.\n    \"\"\"\n\n    def __init__(self, channels, use_conv=False, out_channels=None, padding=1, name=\"conv\"):\n        super().__init__()\n        self.channels = channels\n        self.out_channels = out_channels or channels\n        self.use_conv = use_conv\n        self.padding = padding\n        stride = 2\n        self.name = name\n\n        if use_conv:\n            conv = nn.Conv2D(self.channels, self.out_channels, 3, stride=stride, padding=padding)\n        else:\n            assert self.channels == self.out_channels\n            conv = nn.AvgPool2D(kernel_size=stride, stride=stride)\n\n        # TODO(Suraj, Patrick) - clean up after weight dicts are correctly renamed\n        if name == \"conv\":\n            self.Conv2d_0 = conv\n            self.conv = conv\n        elif name == \"Conv2d_0\":\n            self.conv = conv\n        else:\n            self.conv = conv\n\n    def forward(self, x):\n        assert x.shape[1] == self.channels\n        if self.use_conv and self.padding == 0:\n            pad = (0, 1, 0, 1)\n            x = pad_new(x, pad, mode=\"constant\", value=0)\n\n        assert x.shape[1] == self.channels\n        x = self.conv(x)\n\n        return x\n\n\nclass FirUpsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.use_conv = use_conv\n        self.fir_kernel = fir_kernel\n        self.out_channels = out_channels\n\n    def _upsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `upsample_2d()` followed by `Conv2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. 
The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n            C]`.\n        w: Weight tensor of the shape `[filterH, filterW, inChannels,\n            outChannels]`. Grouped convolution can be performed by `inChannels = x.shape[0] // numGroups`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n            (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]` or `[N, H * factor, W * factor, C]`, and same datatype as\n        `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n\n        # Setup filter kernel.\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * (gain * (factor**2))\n\n        if self.use_conv:\n            convH = w.shape[2]\n            convW = w.shape[3]\n            inC = w.shape[1]\n\n            p = (k.shape[0] - factor) - (convW - 1)\n\n            stride = (factor, factor)\n            # Determine data dimensions.\n            stride = [1, 1, factor, factor]\n            output_shape = ((x.shape[2] - 1) * factor + convH, (x.shape[3] - 1) * factor + convW)\n            output_padding = (\n                output_shape[0] - (x.shape[2] - 1) * stride[0] - convH,\n                output_shape[1] - (x.shape[3] - 1) * stride[1] - convW,\n            )\n            assert output_padding[0] >= 0 and output_padding[1] >= 0\n            inC = w.shape[1]\n            num_groups = x.shape[1] // inC\n\n            # Transpose 
weights.\n            w = paddle.reshape(w, (num_groups, -1, inC, convH, convW))\n            w = w[..., ::-1, ::-1].transpose([0, 2, 1, 3, 4])\n            w = paddle.reshape(w, (num_groups * inC, -1, convH, convW))\n\n            x = F.conv2d_transpose(x, w, stride=stride, output_padding=output_padding, padding=0)\n\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2 + factor - 1, p // 2 + 1))\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            h = self._upsample_2d(x, self.Conv2d_0.weight, k=self.fir_kernel)\n            h = h + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            h = self._upsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return h\n\n\nclass FirDownsample2D(nn.Layer):\n\n    def __init__(self, channels=None, out_channels=None, use_conv=False, fir_kernel=(1, 3, 3, 1)):\n        super().__init__()\n        out_channels = out_channels if out_channels else channels\n        if use_conv:\n            self.Conv2d_0 = nn.Conv2D(channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.fir_kernel = fir_kernel\n        self.use_conv = use_conv\n        self.out_channels = out_channels\n\n    def _downsample_2d(self, x, w=None, k=None, factor=2, gain=1):\n        \"\"\"Fused `Conv2d()` followed by `downsample_2d()`.\n\n        Args:\n        Padding is performed only once at the beginning, not between the operations. The fused op is considerably more\n        efficient than performing the same calculation using standard TensorFlow ops. It supports gradients of arbitrary:\n        order.\n            x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W, C]`. w: Weight tensor of the shape `[filterH,\n            filterW, inChannels, outChannels]`. 
Grouped convolution can be performed by `inChannels = x.shape[0] //\n            numGroups`. k: FIR filter of the shape `[firH, firW]` or `[firN]` (separable). The default is `[1] *\n            factor`, which corresponds to average pooling. factor: Integer downsampling factor (default: 2). gain:\n            Scaling factor for signal magnitude (default: 1.0).\n\n        Returns:\n            Tensor of the shape `[N, C, H // factor, W // factor]` or `[N, H // factor, W // factor, C]`, and same\n            datatype as `x`.\n        \"\"\"\n\n        assert isinstance(factor, int) and factor >= 1\n        if k is None:\n            k = [1] * factor\n\n        # setup kernel\n        k = np.asarray(k, dtype=np.float32)\n        if k.ndim == 1:\n            k = np.outer(k, k)\n        k /= np.sum(k)\n\n        k = k * gain\n\n        if self.use_conv:\n            _, _, convH, convW = w.shape\n            p = (k.shape[0] - factor) + (convW - 1)\n            s = [factor, factor]\n            x = upfirdn2d_native(x, paddle.to_tensor(k), pad=((p + 1) // 2, p // 2))\n            x = F.conv2d(x, w, stride=s, padding=0)\n        else:\n            p = k.shape[0] - factor\n            x = upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n        return x\n\n    def forward(self, x):\n        if self.use_conv:\n            x = self._downsample_2d(x, w=self.Conv2d_0.weight, k=self.fir_kernel)\n            x = x + self.Conv2d_0.bias.reshape([1, -1, 1, 1])\n        else:\n            x = self._downsample_2d(x, k=self.fir_kernel, factor=2)\n\n        return x\n\n\nclass ResnetBlock(nn.Layer):\n\n    def __init__(\n        self,\n        *,\n        in_channels,\n        out_channels=None,\n        conv_shortcut=False,\n        dropout=0.0,\n        temb_channels=512,\n        groups=32,\n        groups_out=None,\n        pre_norm=True,\n        eps=1e-6,\n        non_linearity=\"swish\",\n        time_embedding_norm=\"default\",\n        
kernel=None,\n        output_scale_factor=1.0,\n        use_nin_shortcut=None,\n        up=False,\n        down=False,\n    ):\n        super().__init__()\n        self.pre_norm = pre_norm\n        self.pre_norm = True\n        self.in_channels = in_channels\n        out_channels = in_channels if out_channels is None else out_channels\n        self.out_channels = out_channels\n        self.use_conv_shortcut = conv_shortcut\n        self.time_embedding_norm = time_embedding_norm\n        self.up = up\n        self.down = down\n        self.output_scale_factor = output_scale_factor\n\n        if groups_out is None:\n            groups_out = groups\n\n        self.norm1 = nn.GroupNorm(num_groups=groups, num_channels=in_channels, epsilon=eps)\n\n        self.conv1 = nn.Conv2D(in_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if temb_channels is not None:\n            self.time_emb_proj = nn.Linear(temb_channels, out_channels)\n        else:\n            self.time_emb_proj = None\n\n        self.norm2 = nn.GroupNorm(num_groups=groups_out, num_channels=out_channels, epsilon=eps)\n        self.dropout = nn.Dropout(dropout)\n        self.conv2 = nn.Conv2D(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n        if non_linearity == \"swish\":\n            self.nonlinearity = lambda x: F.silu(x)\n        elif non_linearity == \"mish\":\n            self.nonlinearity = Mish()\n        elif non_linearity == \"silu\":\n            self.nonlinearity = nn.Silu()\n\n        self.upsample = self.downsample = None\n        if self.up:\n            if kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.upsample = lambda x: upsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.upsample = partial(F.interpolate, scale_factor=2.0, mode=\"nearest\")\n            else:\n                self.upsample = Upsample2D(in_channels, use_conv=False)\n        elif self.down:\n            if 
kernel == \"fir\":\n                fir_kernel = (1, 3, 3, 1)\n                self.downsample = lambda x: downsample_2d(x, k=fir_kernel)\n            elif kernel == \"sde_vp\":\n                self.downsample = partial(F.avg_pool2d, kernel_size=2, stride=2)\n            else:\n                self.downsample = Downsample2D(in_channels, use_conv=False, padding=1, name=\"op\")\n\n        self.use_nin_shortcut = self.in_channels != self.out_channels if use_nin_shortcut is None else use_nin_shortcut\n\n        self.conv_shortcut = None\n        if self.use_nin_shortcut:\n            self.conv_shortcut = nn.Conv2D(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n\n    def forward(self, x, temb, hey=False):\n        h = x\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm1(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        if self.upsample is not None:\n            x = self.upsample(x)\n            h = self.upsample(h)\n        elif self.downsample is not None:\n            x = self.downsample(x)\n            h = self.downsample(h)\n\n        h = self.conv1(h)\n\n        if temb is not None:\n            temb = self.time_emb_proj(self.nonlinearity(temb))[:, :, None, None]\n            h = h + temb\n\n        # make sure hidden states is in float32\n        # when running in half-precision\n        h = self.norm2(h.astype(\"float32\")).astype(h.dtype)\n        h = self.nonlinearity(h)\n\n        h = self.dropout(h)\n        h = self.conv2(h)\n\n        if self.conv_shortcut is not None:\n            x = self.conv_shortcut(x)\n\n        out = (x + h) / self.output_scale_factor\n\n        return out\n\n\nclass Mish(nn.Layer):\n\n    def forward(self, x):\n        return x * F.tanh(F.softplus(x))\n\n\ndef upsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Upsample2D a batch of 2D images with the given filter.\n\n    Args:\n    Accepts a batch of 2D images of 
the shape `[N, C, H, W]` or `[N, H, W, C]` and upsamples each image with the given\n    filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the specified\n    `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its shape is a:\n    multiple of the upsampling factor.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). The default is `[1] * factor`, which corresponds to nearest-neighbor upsampling.\n        factor: Integer upsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H * factor, W * factor]`\n    \"\"\"\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * (gain * (factor**2))\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), up=factor, pad=((p + 1) // 2 + factor - 1, p // 2))\n\n\ndef downsample_2d(x, k=None, factor=2, gain=1):\n    r\"\"\"Downsample2D a batch of 2D images with the given filter.\n\n    Args:\n    Accepts a batch of 2D images of the shape `[N, C, H, W]` or `[N, H, W, C]` and downsamples each image with the\n    given filter. The filter is normalized so that if the input pixels are constant, they will be scaled by the\n    specified `gain`. Pixels outside the image are assumed to be zero, and the filter is padded with zeros so that its\n    shape is a multiple of the downsampling factor.\n        x: Input tensor of the shape `[N, C, H, W]` or `[N, H, W,\n          C]`.\n        k: FIR filter of the shape `[firH, firW]` or `[firN]`\n          (separable). 
The default is `[1] * factor`, which corresponds to average pooling.\n        factor: Integer downsampling factor (default: 2). gain: Scaling factor for signal magnitude (default: 1.0).\n\n    Returns:\n        Tensor of the shape `[N, C, H // factor, W // factor]`\n    \"\"\"\n\n    assert isinstance(factor, int) and factor >= 1\n    if k is None:\n        k = [1] * factor\n\n    k = np.asarray(k, dtype=np.float32)\n    if k.ndim == 1:\n        k = np.outer(k, k)\n    k /= np.sum(k)\n\n    k = k * gain\n    p = k.shape[0] - factor\n    return upfirdn2d_native(x, paddle.to_tensor(k), down=factor, pad=((p + 1) // 2, p // 2))\n\n\ndef upfirdn2d_native(input, kernel, up=1, down=1, pad=(0, 0)):\n    up_x = up_y = up\n    down_x = down_y = down\n    pad_x0 = pad_y0 = pad[0]\n    pad_x1 = pad_y1 = pad[1]\n\n    _, channel, in_h, in_w = input.shape\n    input = input.reshape([-1, in_h, in_w, 1])\n\n    _, in_h, in_w, minor = input.shape\n    kernel_h, kernel_w = kernel.shape\n\n    out = input.reshape([-1, in_h, 1, in_w, 1, minor])\n    # TODO\n    out = pad_new(out, [0, 0, 0, up_x - 1, 0, 0, 0, up_y - 1])\n    out = out.reshape([-1, in_h * up_y, in_w * up_x, minor])\n\n    out = pad_new(out, [0, 0, max(pad_x0, 0), max(pad_x1, 0), max(pad_y0, 0), max(pad_y1, 0)])\n    out = out[:, max(-pad_y0, 0):out.shape[1] - max(-pad_y1, 0), max(-pad_x0, 0):out.shape[2] - max(-pad_x1, 0), :, ]\n\n    out = out.transpose([0, 3, 1, 2])\n    out = out.reshape([-1, 1, in_h * up_y + pad_y0 + pad_y1, in_w * up_x + pad_x0 + pad_x1])\n    w = paddle.flip(kernel, [0, 1]).reshape([1, 1, kernel_h, kernel_w])\n    out = F.conv2d(out, w)\n    out = out.reshape(\n        [-1, minor, in_h * up_y + pad_y0 + pad_y1 - kernel_h + 1, in_w * up_x + pad_x0 + pad_x1 - kernel_w + 1])\n    out = out.transpose([0, 2, 3, 1])\n    out = out[:, ::down_y, ::down_x, :]\n\n    out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1\n    out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1\n\n    
return out.reshape([-1, channel, out_h, out_w])\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/unet_2d.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import GaussianFourierProjection\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass UNet2DModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=None,\n        in_channels=3,\n        out_channels=3,\n        center_input_sample=False,\n        time_embedding_type=\"positional\",\n        freq_shift=0,\n        flip_sin_to_cos=True,\n        down_block_types=(\"DownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\", \"AttnDownBlock2D\"),\n        up_block_types=(\"AttnUpBlock2D\", \"AttnUpBlock2D\", \"AttnUpBlock2D\", \"UpBlock2D\"),\n        block_out_channels=(224, 448, 672, 896),\n        layers_per_block=2,\n        mid_block_scale_factor=1,\n        downsample_padding=1,\n        act_fn=\"silu\",\n        attention_head_dim=8,\n        norm_num_groups=32,\n        norm_eps=1e-5,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n    
    # input\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        if time_embedding_type == \"fourier\":\n            self.time_proj = GaussianFourierProjection(embedding_size=block_out_channels[0], scale=16)\n            timestep_input_dim = 2 * block_out_channels[0]\n        elif time_embedding_type == \"positional\":\n            self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n            timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=attention_head_dim,\n     
       resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = norm_num_groups if norm_num_groups is not None else min(block_out_channels[0] // 4, 32)\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=num_groups_out,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, sample: paddle.Tensor, timestep: Union[paddle.Tensor, float, int]) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        skip_sample = sample\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n            if hasattr(downsample_block, \"skip_conv\"):\n                sample, res_samples, skip_sample = downsample_block(hidden_states=sample,\n                                                                    temb=emb,\n                                                                    skip_sample=skip_sample)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb)\n\n        # 5. up\n        skip_sample = None\n        for upsample_block in self.up_blocks:\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"skip_conv\"):\n                sample, skip_sample = upsample_block(sample, res_samples, emb, skip_sample)\n            else:\n                sample = upsample_block(sample, res_samples, emb)\n\n        # 6. 
post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        if skip_sample is not None:\n            sample += skip_sample\n\n        if self.config.time_embedding_type == \"fourier\":\n            timesteps = timesteps.reshape((sample.shape[0], *([1] * len(sample.shape[1:]))))\n            sample = sample / timesteps\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/unet_2d_condition.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nfrom typing import Union\n\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .embeddings import TimestepEmbedding\nfrom .embeddings import Timesteps\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2DCrossAttn\n\n\nclass UNet2DConditionModel(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        sample_size=64,\n        in_channels=4,\n        out_channels=4,\n        center_input_sample=False,\n        flip_sin_to_cos=True,\n        freq_shift=0,\n        down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n        up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\"),\n        block_out_channels=(320, 640, 1280, 1280),\n        layers_per_block=2,\n        downsample_padding=1,\n        mid_block_scale_factor=1,\n        act_fn=\"silu\",\n        norm_num_groups=32,\n        norm_eps=1e-5,\n        cross_attention_dim=768,\n        attention_head_dim=8,\n    ):\n        super().__init__()\n\n        self.sample_size = sample_size\n        time_embed_dim = block_out_channels[0] * 4\n\n        # input\n 
       self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, padding=(1, 1))\n\n        # time\n        self.time_proj = Timesteps(block_out_channels[0], flip_sin_to_cos, freq_shift)\n        timestep_input_dim = block_out_channels[0]\n\n        self.time_embedding = TimestepEmbedding(timestep_input_dim, time_embed_dim)\n\n        self.down_blocks = nn.LayerList([])\n        self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=layers_per_block,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                temb_channels=time_embed_dim,\n                add_downsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n                downsample_padding=downsample_padding,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2DCrossAttn(\n            in_channels=block_out_channels[-1],\n            temb_channels=time_embed_dim,\n            resnet_eps=norm_eps,\n            resnet_act_fn=act_fn,\n            output_scale_factor=mid_block_scale_factor,\n            resnet_time_scale_shift=\"default\",\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attention_head_dim,\n            resnet_groups=norm_num_groups,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = 
reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n            input_channel = reversed_block_out_channels[min(i + 1, len(block_out_channels) - 1)]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=layers_per_block + 1,\n                in_channels=input_channel,\n                out_channels=output_channel,\n                prev_output_channel=prev_output_channel,\n                temb_channels=time_embed_dim,\n                add_upsample=not is_final_block,\n                resnet_eps=norm_eps,\n                resnet_act_fn=act_fn,\n                cross_attention_dim=cross_attention_dim,\n                attn_num_head_channels=attention_head_dim,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0],\n                                          num_groups=norm_num_groups,\n                                          epsilon=norm_eps)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(\n        self,\n        sample: paddle.Tensor,\n        timestep: Union[paddle.Tensor, float, int],\n        encoder_hidden_states: paddle.Tensor,\n    ) -> Dict[str, paddle.Tensor]:\n\n        # 0. center input if necessary\n        if self.config.center_input_sample:\n            sample = 2 * sample - 1.0\n\n        # 1. 
time\n        timesteps = timestep\n        if not paddle.is_tensor(timesteps):\n            timesteps = paddle.to_tensor([timesteps], dtype=\"int64\")\n        elif paddle.is_tensor(timesteps) and len(timesteps.shape) == 0:\n            timesteps = timesteps[None]\n\n        # broadcast to batch dimension\n        timesteps = paddle.broadcast_to(timesteps, [sample.shape[0]])\n\n        t_emb = self.time_proj(timesteps)\n        emb = self.time_embedding(t_emb)\n\n        # 2. pre-process\n        sample = self.conv_in(sample)\n\n        # 3. down\n        down_block_res_samples = (sample, )\n        for downsample_block in self.down_blocks:\n\n            if hasattr(downsample_block, \"attentions\") and downsample_block.attentions is not None:\n                sample, res_samples = downsample_block(hidden_states=sample,\n                                                       temb=emb,\n                                                       encoder_hidden_states=encoder_hidden_states)\n            else:\n                sample, res_samples = downsample_block(hidden_states=sample, temb=emb)\n\n            down_block_res_samples += res_samples\n\n        # 4. mid\n        sample = self.mid_block(sample, emb, encoder_hidden_states=encoder_hidden_states)\n\n        # 5. 
up\n        for upsample_block in self.up_blocks:\n\n            res_samples = down_block_res_samples[-len(upsample_block.resnets):]\n            down_block_res_samples = down_block_res_samples[:-len(upsample_block.resnets)]\n\n            if hasattr(upsample_block, \"attentions\") and upsample_block.attentions is not None:\n                sample = upsample_block(\n                    hidden_states=sample,\n                    temb=emb,\n                    res_hidden_states_tuple=res_samples,\n                    encoder_hidden_states=encoder_hidden_states,\n                )\n            else:\n                sample = upsample_block(hidden_states=sample, temb=emb, res_hidden_states_tuple=res_samples)\n\n        # 6. post-process\n        # make sure hidden states is in float32\n        # when running in half-precision\n        sample = self.conv_norm_out(sample.astype(\"float32\")).astype(sample.dtype)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        output = {\"sample\": sample}\n\n        return output\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/unet_blocks.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom .attention import AttentionBlockNew\nfrom .attention import SpatialTransformer\nfrom .resnet import Downsample2D\nfrom .resnet import FirDownsample2D\nfrom .resnet import FirUpsample2D\nfrom .resnet import ResnetBlock\nfrom .resnet import Upsample2D\n\n\ndef get_down_block(\n    down_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    temb_channels,\n    add_downsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n    downsample_padding=None,\n):\n    down_block_type = down_block_type[7:] if down_block_type.startswith(\"UNetRes\") else down_block_type\n    if down_block_type == \"DownBlock2D\":\n        return DownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnDownBlock2D\":\n        return AttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n           
 add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"CrossAttnDownBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnDownBlock2D\")\n        return CrossAttnDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"SkipDownBlock2D\":\n        return SkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    elif down_block_type == \"AttnSkipDownBlock2D\":\n        return AttnSkipDownBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            temb_channels=temb_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif down_block_type == \"DownEncoderBlock2D\":\n        return DownEncoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n  
          out_channels=out_channels,\n            add_downsample=add_downsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            downsample_padding=downsample_padding,\n        )\n    raise ValueError(f\"{down_block_type} does not exist.\")\n\n\ndef get_up_block(\n    up_block_type,\n    num_layers,\n    in_channels,\n    out_channels,\n    prev_output_channel,\n    temb_channels,\n    add_upsample,\n    resnet_eps,\n    resnet_act_fn,\n    attn_num_head_channels,\n    cross_attention_dim=None,\n):\n    up_block_type = up_block_type[7:] if up_block_type.startswith(\"UNetRes\") else up_block_type\n    if up_block_type == \"UpBlock2D\":\n        return UpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"CrossAttnUpBlock2D\":\n        if cross_attention_dim is None:\n            raise ValueError(\"cross_attention_dim must be specified for CrossAttnUpBlock2D\")\n        return CrossAttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            cross_attention_dim=cross_attention_dim,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"AttnUpBlock2D\":\n        return AttnUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n     
       resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"SkipUpBlock2D\":\n        return SkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    elif up_block_type == \"AttnSkipUpBlock2D\":\n        return AttnSkipUpBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            prev_output_channel=prev_output_channel,\n            temb_channels=temb_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n            attn_num_head_channels=attn_num_head_channels,\n        )\n    elif up_block_type == \"UpDecoderBlock2D\":\n        return UpDecoderBlock2D(\n            num_layers=num_layers,\n            in_channels=in_channels,\n            out_channels=out_channels,\n            add_upsample=add_upsample,\n            resnet_eps=resnet_eps,\n            resnet_act_fn=resnet_act_fn,\n        )\n    raise ValueError(f\"{up_block_type} does not exist.\")\n\n\nclass UNetMidBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        **kwargs,\n    ):\n        super().__init__()\n\n        
self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                AttentionBlockNew(\n                    in_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            if 
self.attention_type == \"default\":\n                hidden_states = attn(hidden_states)\n            else:\n                hidden_states = attn(hidden_states, encoder_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass UNetMidBlock2DCrossAttn(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        cross_attention_dim=1280,\n        **kwargs,\n    ):\n        super().__init__()\n\n        self.attention_type = attention_type\n        resnet_groups = resnet_groups if resnet_groups is not None else min(in_channels // 4, 32)\n\n        # there is always at least one resnet\n        resnets = [\n            ResnetBlock(\n                in_channels=in_channels,\n                out_channels=in_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=resnet_groups,\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n            )\n        ]\n        attentions = []\n\n        for _ in range(num_layers):\n            attentions.append(\n                SpatialTransformer(\n                    in_channels,\n                    attn_num_head_channels,\n                    in_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n            resnets.append(\n                
ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=in_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n    def forward(self, hidden_states, temb=None, encoder_hidden_states=None):\n        hidden_states = self.resnets[0](hidden_states, temb)\n        for attn, resnet in zip(self.attentions, self.resnets[1:]):\n            hidden_states = attn(hidden_states, encoder_hidden_states)\n            hidden_states = resnet(hidden_states, temb)\n\n        return hidden_states\n\n\nclass AttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    
temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass CrossAttnDownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        
resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_downsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, 
hidden_states, temb=None, encoder_hidden_states=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                
Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states, temb=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states\n\n\nclass DownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        
self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnDownEncoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_downsample=True,\n        downsample_padding=1,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    
pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_downsample:\n            self.downsamplers = nn.LayerList([\n                Downsample2D(in_channels,\n                             use_conv=True,\n                             out_channels=out_channels,\n                             padding=downsample_padding,\n                             name=\"op\")\n            ])\n        else:\n            self.downsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.downsamplers is not None:\n            for downsampler in self.downsamplers:\n                hidden_states = downsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            downsample_padding=1,\n            add_downsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        
self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            self.attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            
self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass SkipDownBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_downsample=True,\n            downsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            in_channels = in_channels if i == 0 else out_channels\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=in_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min(in_channels // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    
time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        if add_downsample:\n            self.resnet_down = ResnetBlock(\n                in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                down=True,\n                kernel=\"fir\",\n            )\n            self.downsamplers = nn.LayerList([FirDownsample2D(in_channels, out_channels=out_channels)])\n            self.skip_conv = nn.Conv2D(3, out_channels, kernel_size=(1, 1), stride=(1, 1))\n        else:\n            self.resnet_down = None\n            self.downsamplers = None\n            self.skip_conv = None\n\n    def forward(self, hidden_states, temb=None, skip_sample=None):\n        output_states = ()\n\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb)\n            output_states += (hidden_states, )\n\n        if self.downsamplers is not None:\n            hidden_states = self.resnet_down(hidden_states, temb)\n            for downsampler in self.downsamplers:\n                skip_sample = downsampler(skip_sample)\n\n            hidden_states = self.skip_conv(skip_sample) + hidden_states\n\n            output_states += (hidden_states, )\n\n        return hidden_states, output_states, skip_sample\n\n\nclass AttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        
out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attention_type=\"default\",\n        attn_num_head_channels=1,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, 
res_hidden_states_tuple, temb=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass CrossAttnUpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        prev_output_channel: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n        cross_attention_dim=1280,\n        attention_type=\"default\",\n        output_scale_factor=1.0,\n        downsample_padding=1,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n     
               time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                SpatialTransformer(\n                    out_channels,\n                    attn_num_head_channels,\n                    out_channels // attn_num_head_channels,\n                    depth=1,\n                    context_dim=cross_attention_dim,\n                ))\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, encoder_hidden_states=None):\n        for resnet, attn in zip(self.resnets, self.attentions):\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n            hidden_states = attn(hidden_states, context=encoder_hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        prev_output_channel: int,\n        out_channels: int,\n        temb_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 
32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None):\n        for resnet in self.resnets:\n\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass UpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: 
int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet in self.resnets:\n            hidden_states = resnet(hidden_states, temb=None)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnUpDecoderBlock2D(nn.Layer):\n\n    def __init__(\n        self,\n        in_channels: int,\n        out_channels: int,\n        dropout: float = 0.0,\n        num_layers: int = 1,\n        resnet_eps: float = 1e-6,\n        resnet_time_scale_shift: str = \"default\",\n        resnet_act_fn: str = \"swish\",\n        resnet_groups: int = 32,\n        resnet_pre_norm: bool = True,\n        attn_num_head_channels=1,\n    
    output_scale_factor=1.0,\n        add_upsample=True,\n    ):\n        super().__init__()\n        resnets = []\n        attentions = []\n\n        for i in range(num_layers):\n            input_channels = in_channels if i == 0 else out_channels\n\n            resnets.append(\n                ResnetBlock(\n                    in_channels=input_channels,\n                    out_channels=out_channels,\n                    temb_channels=None,\n                    eps=resnet_eps,\n                    groups=resnet_groups,\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n            attentions.append(\n                AttentionBlockNew(\n                    out_channels,\n                    num_head_channels=attn_num_head_channels,\n                    rescale_output_factor=output_scale_factor,\n                    eps=resnet_eps,\n                    num_groups=resnet_groups,\n                ))\n\n        self.attentions = nn.LayerList(attentions)\n        self.resnets = nn.LayerList(resnets)\n\n        if add_upsample:\n            self.upsamplers = nn.LayerList([Upsample2D(out_channels, use_conv=True, out_channels=out_channels)])\n        else:\n            self.upsamplers = None\n\n    def forward(self, hidden_states):\n        for resnet, attn in zip(self.resnets, self.attentions):\n            hidden_states = resnet(hidden_states, temb=None)\n            hidden_states = attn(hidden_states)\n\n        if self.upsamplers is not None:\n            for upsampler in self.upsamplers:\n                hidden_states = upsampler(hidden_states)\n\n        return hidden_states\n\n\nclass AttnSkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            
out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            attn_num_head_channels=1,\n            attention_type=\"default\",\n            output_scale_factor=np.sqrt(2.0),\n            upsample_padding=1,\n            add_upsample=True,\n    ):\n        super().__init__()\n        self.attentions = nn.LayerList([])\n        self.resnets = nn.LayerList([])\n\n        self.attention_type = attention_type\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.attentions.append(\n            AttentionBlockNew(\n                out_channels,\n                num_head_channels=attn_num_head_channels,\n                rescale_output_factor=output_scale_factor,\n                eps=resnet_eps,\n            ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n        
        in_channels=out_channels,\n                out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        hidden_states = self.attentions[0](hidden_states)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n       
     skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample + skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n\n\nclass SkipUpBlock2D(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels: int,\n            prev_output_channel: int,\n            out_channels: int,\n            temb_channels: int,\n            dropout: float = 0.0,\n            num_layers: int = 1,\n            resnet_eps: float = 1e-6,\n            resnet_time_scale_shift: str = \"default\",\n            resnet_act_fn: str = \"swish\",\n            resnet_pre_norm: bool = True,\n            output_scale_factor=np.sqrt(2.0),\n            add_upsample=True,\n            upsample_padding=1,\n    ):\n        super().__init__()\n        self.resnets = nn.LayerList([])\n\n        for i in range(num_layers):\n            res_skip_channels = in_channels if (i == num_layers - 1) else out_channels\n            resnet_in_channels = prev_output_channel if i == 0 else out_channels\n\n            self.resnets.append(\n                ResnetBlock(\n                    in_channels=resnet_in_channels + res_skip_channels,\n                    out_channels=out_channels,\n                    temb_channels=temb_channels,\n                    eps=resnet_eps,\n                    groups=min((resnet_in_channels + res_skip_channels) // 4, 32),\n                    groups_out=min(out_channels // 4, 32),\n                    dropout=dropout,\n                    time_embedding_norm=resnet_time_scale_shift,\n                    non_linearity=resnet_act_fn,\n                    output_scale_factor=output_scale_factor,\n                    pre_norm=resnet_pre_norm,\n                ))\n\n        self.upsampler = FirUpsample2D(in_channels, out_channels=out_channels)\n        if add_upsample:\n            self.resnet_up = ResnetBlock(\n                in_channels=out_channels,\n       
         out_channels=out_channels,\n                temb_channels=temb_channels,\n                eps=resnet_eps,\n                groups=min(out_channels // 4, 32),\n                groups_out=min(out_channels // 4, 32),\n                dropout=dropout,\n                time_embedding_norm=resnet_time_scale_shift,\n                non_linearity=resnet_act_fn,\n                output_scale_factor=output_scale_factor,\n                pre_norm=resnet_pre_norm,\n                use_nin_shortcut=True,\n                up=True,\n                kernel=\"fir\",\n            )\n            self.skip_conv = nn.Conv2D(out_channels, 3, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))\n            self.skip_norm = nn.GroupNorm(num_groups=min(out_channels // 4, 32),\n                                          num_channels=out_channels,\n                                          eps=resnet_eps,\n                                          affine=True)\n            self.act = nn.SiLU()\n        else:\n            self.resnet_up = None\n            self.skip_conv = None\n            self.skip_norm = None\n            self.act = None\n\n    def forward(self, hidden_states, res_hidden_states_tuple, temb=None, skip_sample=None):\n        for resnet in self.resnets:\n            # pop res hidden states\n            res_hidden_states = res_hidden_states_tuple[-1]\n            res_hidden_states_tuple = res_hidden_states_tuple[:-1]\n            hidden_states = paddle.concat([hidden_states, res_hidden_states], axis=1)\n\n            hidden_states = resnet(hidden_states, temb)\n\n        if skip_sample is not None:\n            skip_sample = self.upsampler(skip_sample)\n        else:\n            skip_sample = 0\n\n        if self.resnet_up is not None:\n            skip_sample_states = self.skip_norm(hidden_states)\n            skip_sample_states = self.act(skip_sample_states)\n            skip_sample_states = self.skip_conv(skip_sample_states)\n\n            skip_sample = skip_sample 
+ skip_sample_states\n\n            hidden_states = self.resnet_up(hidden_states, temb)\n\n        return hidden_states, skip_sample\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/models/vae.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .unet_blocks import get_down_block\nfrom .unet_blocks import get_up_block\nfrom .unet_blocks import UNetMidBlock2D\n\n\nclass Encoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            down_block_types=(\"DownEncoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n            double_z=True,\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[0], kernel_size=3, stride=1, padding=1)\n\n        self.mid_block = None\n        self.down_blocks = nn.LayerList([])\n\n        # down\n        output_channel = block_out_channels[0]\n        for i, down_block_type in enumerate(down_block_types):\n            input_channel = output_channel\n            output_channel = block_out_channels[i]\n            is_final_block = i == len(block_out_channels) - 1\n\n            down_block = get_down_block(\n                down_block_type,\n                num_layers=self.layers_per_block,\n                in_channels=input_channel,\n                
out_channels=output_channel,\n                add_downsample=not is_final_block,\n                resnet_eps=1e-6,\n                downsample_padding=0,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.down_blocks.append(down_block)\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[-1], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n\n        conv_out_channels = 2 * out_channels if double_z else out_channels\n        self.conv_out = nn.Conv2D(block_out_channels[-1], conv_out_channels, 3, padding=1)\n\n    def forward(self, x):\n        sample = x\n        sample = self.conv_in(sample)\n\n        # down\n        for down_block in self.down_blocks:\n            sample = down_block(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # post-process\n        sample = self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass Decoder(nn.Layer):\n\n    def __init__(\n            self,\n            in_channels=3,\n            out_channels=3,\n            up_block_types=(\"UpDecoderBlock2D\", ),\n            block_out_channels=(64, ),\n            layers_per_block=2,\n            act_fn=\"silu\",\n    ):\n        super().__init__()\n        self.layers_per_block = layers_per_block\n\n        self.conv_in = nn.Conv2D(in_channels, block_out_channels[-1], kernel_size=3, stride=1, padding=1)\n\n 
       self.mid_block = None\n        self.up_blocks = nn.LayerList([])\n\n        # mid\n        self.mid_block = UNetMidBlock2D(\n            in_channels=block_out_channels[-1],\n            resnet_eps=1e-6,\n            resnet_act_fn=act_fn,\n            output_scale_factor=1,\n            resnet_time_scale_shift=\"default\",\n            attn_num_head_channels=None,\n            resnet_groups=32,\n            temb_channels=None,\n        )\n\n        # up\n        reversed_block_out_channels = list(reversed(block_out_channels))\n        output_channel = reversed_block_out_channels[0]\n        for i, up_block_type in enumerate(up_block_types):\n            prev_output_channel = output_channel\n            output_channel = reversed_block_out_channels[i]\n\n            is_final_block = i == len(block_out_channels) - 1\n\n            up_block = get_up_block(\n                up_block_type,\n                num_layers=self.layers_per_block + 1,\n                in_channels=prev_output_channel,\n                out_channels=output_channel,\n                prev_output_channel=None,\n                add_upsample=not is_final_block,\n                resnet_eps=1e-6,\n                resnet_act_fn=act_fn,\n                attn_num_head_channels=None,\n                temb_channels=None,\n            )\n            self.up_blocks.append(up_block)\n            prev_output_channel = output_channel\n\n        # out\n        num_groups_out = 32\n        self.conv_norm_out = nn.GroupNorm(num_channels=block_out_channels[0], num_groups=num_groups_out, epsilon=1e-6)\n        self.conv_act = nn.Silu()\n        self.conv_out = nn.Conv2D(block_out_channels[0], out_channels, 3, padding=1)\n\n    def forward(self, z):\n        sample = z\n        sample = self.conv_in(sample)\n\n        # middle\n        sample = self.mid_block(sample)\n\n        # up\n        for up_block in self.up_blocks:\n            sample = up_block(sample)\n\n        # post-process\n        sample = 
self.conv_norm_out(sample)\n        sample = self.conv_act(sample)\n        sample = self.conv_out(sample)\n\n        return sample\n\n\nclass VectorQuantizer(nn.Layer):\n    \"\"\"\n    Improved version over VectorQuantizer, can be used as a drop-in replacement. Mostly avoids costly matrix\n    multiplications and allows for post-hoc remapping of indices.\n    \"\"\"\n\n    # NOTE: due to a bug the beta term was applied to the wrong term. for\n    # backwards compatibility we use the buggy version by default, but you can\n    # specify legacy=False to fix it.\n    def __init__(self, n_e, e_dim, beta, remap=None, unknown_index=\"random\", sane_index_shape=False, legacy=True):\n        super().__init__()\n        self.n_e = n_e\n        self.e_dim = e_dim\n        self.beta = beta\n        self.legacy = legacy\n\n        self.embedding = nn.Embedding(self.n_e, self.e_dim)\n        self.embedding.weight.data.uniform_(-1.0 / self.n_e, 1.0 / self.n_e)\n\n        self.remap = remap\n        if self.remap is not None:\n            self.register_buffer(\"used\", paddle.to_tensor(np.load(self.remap)))\n            self.re_embed = self.used.shape[0]\n            self.unknown_index = unknown_index  # \"random\" or \"extra\" or integer\n            if self.unknown_index == \"extra\":\n                self.unknown_index = self.re_embed\n                self.re_embed = self.re_embed + 1\n            print(f\"Remapping {self.n_e} indices to {self.re_embed} indices. 
\"\n                  f\"Using {self.unknown_index} for unknown indices.\")\n        else:\n            self.re_embed = n_e\n\n        self.sane_index_shape = sane_index_shape\n\n    def remap_to_used(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        match = (inds[:, :, None] == used[None, None, ...]).astype(\"int64\")\n        new = match.argmax(-1)\n        unknown = match.sum(2) < 1\n        if self.unknown_index == \"random\":\n            new[unknown] = paddle.randint(0, self.re_embed, shape=new[unknown].shape)\n        else:\n            new[unknown] = self.unknown_index\n        return new.reshape(ishape)\n\n    def unmap_to_all(self, inds):\n        ishape = inds.shape\n        assert len(ishape) > 1\n        inds = inds.reshape([ishape[0], -1])\n        used = self.used\n        if self.re_embed > self.used.shape[0]:  # extra token\n            inds[inds >= self.used.shape[0]] = 0  # simply set to zero\n        back = paddle.gather(used[None, :][inds.shape[0] * [0], :], inds, axis=1)\n        return back.reshape(ishape)\n\n    def forward(self, z):\n        # reshape z -> (batch, height, width, channel) and flatten\n        z = z.transpose([0, 2, 3, 1])\n        z_flattened = z.reshape([-1, self.e_dim])\n        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z\n\n        d = (paddle.sum(z_flattened**2, axis=1, keepdim=True) + paddle.sum(self.embedding.weight**2, axis=1) -\n             2 * paddle.einsum(\"bd,dn->bn\", z_flattened, self.embedding.weight.t()))\n\n        min_encoding_indices = paddle.argmin(d, axis=1)\n        z_q = self.embedding(min_encoding_indices).reshape(z.shape)\n        perplexity = None\n        min_encodings = None\n\n        # compute loss for embedding\n        if not self.legacy:\n            loss = self.beta * paddle.mean((z_q.detach() - z)**2) + paddle.mean((z_q - z.detach())**2)\n        else:\n         
   loss = paddle.mean((z_q.detach() - z)**2) + self.beta * paddle.mean((z_q - z.detach())**2)\n\n        # preserve gradients\n        z_q = z + (z_q - z).detach()\n\n        # reshape back to match original input shape\n        z_q = z_q.transpose([0, 3, 1, 2])\n\n        if self.remap is not None:\n            min_encoding_indices = min_encoding_indices.reshape([z.shape[0], -1])  # add batch axis\n            min_encoding_indices = self.remap_to_used(min_encoding_indices)\n            min_encoding_indices = min_encoding_indices.reshape([-1, 1])  # flatten\n\n        if self.sane_index_shape:\n            min_encoding_indices = min_encoding_indices.reshape([z_q.shape[0], z_q.shape[2], z_q.shape[3]])\n\n        return z_q, loss, (perplexity, min_encodings, min_encoding_indices)\n\n    def get_codebook_entry(self, indices, shape):\n        # shape specifying (batch, height, width, channel)\n        if self.remap is not None:\n            indices = indices.reshape([shape[0], -1])  # add batch axis\n            indices = self.unmap_to_all(indices)\n            indices = indices.flatten()  # flatten again\n\n        # get quantized latent vectors\n        z_q = self.embedding(indices)\n\n        if shape is not None:\n            z_q = z_q.reshape(shape)\n            # reshape back to match original input shape\n            z_q = z_q.transpose([0, 3, 1, 2])\n\n        return z_q\n\n\nclass DiagonalGaussianDistribution(object):\n\n    def __init__(self, parameters, deterministic=False):\n        self.parameters = parameters\n        self.mean, self.logvar = paddle.chunk(parameters, 2, axis=1)\n        self.logvar = paddle.clip(self.logvar, -30.0, 20.0)\n        self.deterministic = deterministic\n        self.std = paddle.exp(0.5 * self.logvar)\n        self.var = paddle.exp(self.logvar)\n        if self.deterministic:\n            self.var = self.std = paddle.zeros_like(self.mean)\n\n    def sample(self):\n        x = self.mean + self.std * 
paddle.randn(self.mean.shape)\n        return x\n\n    def kl(self, other=None):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        else:\n            if other is None:\n                return 0.5 * paddle.sum(paddle.pow(self.mean, 2) + self.var - 1.0 - self.logvar, axis=[1, 2, 3])\n            else:\n                return 0.5 * paddle.sum(\n                    paddle.pow(self.mean - other.mean, 2) / other.var + self.var / other.var - 1.0 - self.logvar +\n                    other.logvar,\n                    axis=[1, 2, 3],\n                )\n\n    def nll(self, sample, dims=[1, 2, 3]):\n        if self.deterministic:\n            return paddle.to_tensor([0.0])\n        logtwopi = np.log(2.0 * np.pi)\n        return 0.5 * paddle.sum(logtwopi + self.logvar + paddle.pow(sample - self.mean, 2) / self.var, axis=dims)\n\n    def mode(self):\n        return self.mean\n\n\nclass VQModel(ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", ),\n        up_block_types=(\"UpDecoderBlock2D\", ),\n        block_out_channels=(64, ),\n        layers_per_block=1,\n        act_fn=\"silu\",\n        latent_channels=3,\n        sample_size=32,\n        num_vq_embeddings=256,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n            double_z=False,\n        )\n\n        self.quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n        self.quantize = VectorQuantizer(num_vq_embeddings,\n                                        latent_channels,\n                                        beta=0.25,\n            
                            remap=None,\n                                        sane_index_shape=False)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n    def encode(self, x):\n        h = self.encoder(x)\n        h = self.quant_conv(h)\n        return h\n\n    def decode(self, h, force_not_quantize=False):\n        # also go through quantization layer\n        if not force_not_quantize:\n            quant, emb_loss, info = self.quantize(h)\n        else:\n            quant = h\n        quant = self.post_quant_conv(quant)\n        dec = self.decoder(quant)\n        return dec\n\n    def forward(self, sample):\n        x = sample\n        h = self.encode(x)\n        dec = self.decode(h)\n        return dec\n\n\nclass AutoencoderKL(nn.Layer, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        in_channels=3,\n        out_channels=3,\n        down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\"),\n        up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\"),\n        block_out_channels=(128, 256, 512, 512),\n        layers_per_block=2,\n        act_fn=\"silu\",\n        latent_channels=4,\n        sample_size=512,\n    ):\n        super().__init__()\n\n        # pass init params to Encoder\n        self.encoder = Encoder(\n            in_channels=in_channels,\n            out_channels=latent_channels,\n            down_block_types=down_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            
act_fn=act_fn,\n            double_z=True,\n        )\n\n        # pass init params to Decoder\n        self.decoder = Decoder(\n            in_channels=latent_channels,\n            out_channels=out_channels,\n            up_block_types=up_block_types,\n            block_out_channels=block_out_channels,\n            layers_per_block=layers_per_block,\n            act_fn=act_fn,\n        )\n\n        self.quant_conv = nn.Conv2D(2 * latent_channels, 2 * latent_channels, 1)\n        self.post_quant_conv = nn.Conv2D(latent_channels, latent_channels, 1)\n\n    def encode(self, x):\n        h = self.encoder(x)\n        moments = self.quant_conv(h)\n        posterior = DiagonalGaussianDistribution(moments)\n        return posterior\n\n    def decode(self, z):\n        z = self.post_quant_conv(z)\n        dec = self.decoder(z)\n        return dec\n\n    def forward(self, sample, sample_posterior=False):\n        x = sample\n        posterior = self.encode(x)\n        if sample_posterior:\n            z = posterior.sample()\n        else:\n            z = posterior.mode()\n        dec = self.decode(z)\n        return dec\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/README.md",
    "content": "# Schedulers\n\n- Schedulers are the algorithms to use diffusion models in inference as well as for training. They include the noise schedules and define algorithm-specific diffusion steps.\n- Schedulers can be used interchangable between diffusion models in inference to find the preferred trade-off between speed and generation quality.\n- Schedulers are available in numpy, but can easily be transformed into Py\n\n## API\n\n- Schedulers should provide one or more `def step(...)` functions that should be called iteratively to unroll the diffusion loop during\nthe forward pass.\n- Schedulers should be framework-agnostic, but provide a simple functionality to convert the scheduler into a specific framework, such as PyTorch\nwith a `set_format(...)` method.\n\n## Examples\n\n- The DDPM scheduler was proposed in [Denoising Diffusion Probabilistic Models](https://arxiv.org/abs/2006.11239) and can be found in [scheduling_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddpm.py). An example of how to use this scheduler can be found in [pipeline_ddpm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddpm.py).\n- The DDIM scheduler was proposed in [Denoising Diffusion Implicit Models](https://arxiv.org/abs/2010.02502) and can be found in [scheduling_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_ddim.py). An example of how to use this scheduler can be found in [pipeline_ddim.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_ddim.py).\n- The PNDM scheduler was proposed in [Pseudo Numerical Methods for Diffusion Models on Manifolds](https://arxiv.org/abs/2202.09778) and can be found in [scheduling_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_pndm.py). 
An example of how to use this scheduler can be found in [pipeline_pndm.py](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py).\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/__init__.py",
    "content": "# flake8: noqa\n# There's no way to ignore \"F401 '...' imported but unused\" warnings in this\n# module, but to preserve other warnings. So, don't check this module at all.\n# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom .scheduling_ddim import DDIMScheduler\nfrom .scheduling_ddpm import DDPMScheduler\nfrom .scheduling_karras_ve import KarrasVeScheduler\nfrom .scheduling_lms_discrete import LMSDiscreteScheduler\nfrom .scheduling_pndm import PNDMScheduler\nfrom .scheduling_sde_ve import ScoreSdeVeScheduler\nfrom .scheduling_sde_vp import ScoreSdeVpScheduler\nfrom .scheduling_utils import SchedulerMixin\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_ddim.py",
    "content": "# Copyright 2022 Stanford University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This code is strongly influenced by https://github.com/pesser/pypaddle_diffusion\n# and https://github.com/hojonathanho/diffusion\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t\n    from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDIMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        clip_sample=True,\n        set_alpha_to_one=True,\n        tensor_format=\"pd\",\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        # At every step in ddim, we are looking into the previous alphas_cumprod\n        # For the final step, there is no previous 
alphas_cumprod because we are already at 0\n        # `set_alpha_to_one` decides whether we set this parameter simply to one or\n        # whether we use the final alpha of the \"non-previous\" one.\n        self.final_alpha_cumprod = np.array(1.0) if set_alpha_to_one else self.alphas_cumprod[0]\n\n        # settable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def _get_variance(self, timestep, prev_timestep):\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        variance = (beta_prod_t_prev / beta_prod_t) * (1 - alpha_prod_t / alpha_prod_t_prev)\n\n        return variance\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.timesteps += offset\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        eta: float = 0.0,\n        use_clipped_model_output: bool = False,\n        generator=None,\n    ):\n        # See formulas (12) and (16) of DDIM paper https://arxiv.org/pdf/2010.02502.pdf\n        # Ideally, read the DDIM paper for an in-detail understanding\n\n        # Notation (<variable name> -> <name in paper>)\n        # - pred_noise_t -> e_theta(x_t, t)\n        # - pred_original_sample -> f_theta(x_t, t) or x_0\n        # - std_dev_t -> sigma_t\n       
 # - eta -> η\n        # - pred_sample_direction -> \"direction pointing to x_t\"\n        # - pred_prev_sample -> \"x_t-1\"\n\n        # 1. get previous step value (=t-1)\n        prev_timestep = timestep - self.config.num_train_timesteps // self.num_inference_steps\n\n        # 2. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[timestep]\n        alpha_prod_t_prev = self.alphas_cumprod[prev_timestep] if prev_timestep >= 0 else self.final_alpha_cumprod\n        beta_prod_t = 1 - alpha_prod_t\n\n        # 3. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n\n        # 4. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 5. compute variance: \"sigma_t(η)\" -> see formula (16)\n        # σ_t = sqrt((1 − α_t−1)/(1 − α_t)) * sqrt(1 − α_t/α_t−1)\n        variance = self._get_variance(timestep, prev_timestep)\n        std_dev_t = eta * variance**(0.5)\n\n        if use_clipped_model_output:\n            # the model_output is always re-derived from the clipped x_0 in Glide\n            model_output = (sample - alpha_prod_t**(0.5) * pred_original_sample) / beta_prod_t**(0.5)\n\n        # 6. compute \"direction pointing to x_t\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        pred_sample_direction = (1 - alpha_prod_t_prev - std_dev_t**2)**(0.5) * model_output\n\n        # 7. 
compute x_t without \"random noise\" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf\n        prev_sample = alpha_prod_t_prev**(0.5) * pred_original_sample + pred_sample_direction\n\n        if eta > 0:\n            noise = paddle.randn(model_output.shape)\n            variance = self._get_variance(timestep, prev_timestep)**(0.5) * eta * noise\n\n            if not paddle.is_tensor(model_output):\n                variance = variance.numpy()\n\n            prev_sample = prev_sample + variance\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_ddpm.py",
    "content": "# Copyright 2022 UC Berkely Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t\n    from 0 to 1 and\n                      produces the cumulative product of (1-beta) up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass DDPMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        variance_type=\"fixed_small\",\n        clip_sample=True,\n        tensor_format=\"pd\",\n    ):\n\n        if trained_betas is not None:\n            self.betas = np.asarray(trained_betas)\n        elif beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n        self.one = np.array(1.0)\n\n        # settable values\n        
self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n        self.variance_type = variance_type\n\n    def set_timesteps(self, num_inference_steps):\n        num_inference_steps = min(self.config.num_train_timesteps, num_inference_steps)\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.config.num_train_timesteps,\n                                   self.config.num_train_timesteps // self.num_inference_steps)[::-1].copy()\n        self.set_format(tensor_format=self.tensor_format)\n\n    def _get_variance(self, t, predicted_variance=None, variance_type=None):\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n\n        # For t > 0, compute predicted variance βt (see formula (6) and (7) from https://arxiv.org/pdf/2006.11239.pdf)\n        # and sample from it to get previous sample\n        # x_{t-1} ~ N(pred_prev_sample, variance) == add variance to pred_sample\n        variance = (1 - alpha_prod_t_prev) / (1 - alpha_prod_t) * self.betas[t]\n\n        if variance_type is None:\n            variance_type = self.config.variance_type\n\n        # hacks - were probably added for training stability\n        if variance_type == \"fixed_small\":\n            variance = self.clip(variance, min_value=1e-20)\n        # for rl-diffuser https://arxiv.org/abs/2205.09991\n        elif variance_type == \"fixed_small_log\":\n            variance = self.log(self.clip(variance, min_value=1e-20))\n        elif variance_type == \"fixed_large\":\n            variance = self.betas[t]\n        elif variance_type == \"fixed_large_log\":\n            # Glide max_log\n            variance = self.log(self.betas[t])\n        elif variance_type == \"learned\":\n            return predicted_variance\n        elif 
variance_type == \"learned_range\":\n            min_log = variance\n            max_log = self.betas[t]\n            frac = (predicted_variance + 1) / 2\n            variance = frac * max_log + (1 - frac) * min_log\n\n        return variance\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        predict_epsilon=True,\n        generator=None,\n    ):\n        t = timestep\n\n        if model_output.shape[1] == sample.shape[1] * 2 and self.variance_type in [\"learned\", \"learned_range\"]:\n            model_output, predicted_variance = paddle.split(model_output, sample.shape[1], axis=1)\n        else:\n            predicted_variance = None\n\n        # 1. compute alphas, betas\n        alpha_prod_t = self.alphas_cumprod[t]\n        alpha_prod_t_prev = self.alphas_cumprod[t - 1] if t > 0 else self.one\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # 2. compute predicted original sample from predicted noise also called\n        # \"predicted x_0\" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf\n        if predict_epsilon:\n            pred_original_sample = (sample - beta_prod_t**(0.5) * model_output) / alpha_prod_t**(0.5)\n        else:\n            pred_original_sample = model_output\n\n        # 3. Clip \"predicted x_0\"\n        if self.config.clip_sample:\n            pred_original_sample = self.clip(pred_original_sample, -1, 1)\n\n        # 4. Compute coefficients for pred_original_sample x_0 and current sample x_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_original_sample_coeff = (alpha_prod_t_prev**(0.5) * self.betas[t]) / beta_prod_t\n        current_sample_coeff = self.alphas[t]**(0.5) * beta_prod_t_prev / beta_prod_t\n\n        # 5. 
Compute predicted previous sample µ_t\n        # See formula (7) from https://arxiv.org/pdf/2006.11239.pdf\n        pred_prev_sample = pred_original_sample_coeff * pred_original_sample + current_sample_coeff * sample\n\n        # 6. Add noise\n        variance = 0\n        if t > 0:\n            noise = self.randn_like(model_output)\n            variance = (self._get_variance(t, predicted_variance=predicted_variance)**0.5) * noise\n\n        pred_prev_sample = pred_prev_sample + variance\n\n        return {\"prev_sample\": pred_prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_karras_ve.py",
    "content": "# Copyright 2022 NVIDIA and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass KarrasVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    Stochastic sampling from Karras et al. [1] tailored to the Variance-Exploding (VE) models [2]. Use Algorithm 2 and\n    the VE column of Table 1 from [1] for reference.\n\n    [1] Karras, Tero, et al. \"Elucidating the Design Space of Diffusion-Based Generative Models.\"\n    https://arxiv.org/abs/2206.00364\n    [2] Song, Yang, et al. \"Score-based generative modeling through stochastic\n    differential equations.\" https://arxiv.org/abs/2011.13456\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        sigma_min=0.02,\n        sigma_max=100,\n        s_noise=1.007,\n        s_churn=80,\n        s_min=0.05,\n        s_max=50,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        For more details on the parameters, see the original paper's Appendix E.: \"Elucidating the Design Space of\n        Diffusion-Based Generative Models.\" https://arxiv.org/abs/2206.00364. 
The grid search values used to find the\n        optimal {s_noise, s_churn, s_min, s_max} for a specific model are described in Table 5 of the paper.\n\n        Args:\n            sigma_min (`float`): minimum noise magnitude\n            sigma_max (`float`): maximum noise magnitude\n            s_noise (`float`): the amount of additional noise to counteract loss of detail during sampling.\n                A reasonable range is [1.000, 1.011].\n            s_churn (`float`): the parameter controlling the overall amount of stochasticity.\n                A reasonable range is [0, 100].\n            s_min (`float`): the start value of the sigma range where we add noise (enable stochasticity).\n                A reasonable range is [0, 10].\n            s_max (`float`): the end value of the sigma range where we add noise.\n                A reasonable range is [0.2, 80].\n        \"\"\"\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = None\n        self.schedule = None  # sigma(t_i)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.arange(0, self.num_inference_steps)[::-1].copy()\n        self.schedule = [(self.sigma_max * (self.sigma_min**2 / self.sigma_max**2)**(i / (num_inference_steps - 1)))\n                         for i in self.timesteps]\n        self.schedule = np.array(self.schedule, dtype=np.float32)\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def add_noise_to_input(self, sample, sigma, generator=None):\n        \"\"\"\n        Explicit Langevin-like \"churn\" step of adding noise to the sample according to a factor gamma_i ≥ 0 to reach a\n        higher noise level sigma_hat = sigma_i + gamma_i*sigma_i.\n        \"\"\"\n        if self.s_min <= sigma <= self.s_max:\n            gamma = min(self.s_churn / 
self.num_inference_steps, 2**0.5 - 1)\n        else:\n            gamma = 0\n\n        # sample eps ~ N(0, S_noise^2 * I)\n        eps = self.s_noise * paddle.randn(sample.shape)\n        sigma_hat = sigma + gamma * sigma\n        sample_hat = sample + ((sigma_hat**2 - sigma**2)**0.5 * eps)\n\n        return sample_hat, sigma_hat\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_hat + sigma_hat * model_output\n        derivative = (sample_hat - pred_original_sample) / sigma_hat\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * derivative\n\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sigma_hat: float,\n        sigma_prev: float,\n        sample_hat: Union[paddle.Tensor, np.ndarray],\n        sample_prev: Union[paddle.Tensor, np.ndarray],\n        derivative: Union[paddle.Tensor, np.ndarray],\n    ):\n        pred_original_sample = sample_prev + sigma_prev * model_output\n        derivative_corr = (sample_prev - pred_original_sample) / sigma_prev\n        sample_prev = sample_hat + (sigma_prev - sigma_hat) * (0.5 * derivative + 0.5 * derivative_corr)\n        return {\"prev_sample\": sample_prev, \"derivative\": derivative_corr}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        raise NotImplementedError()\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_lms_discrete.py",
    "content": "# Copyright 2022 Katherine Crowson and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom scipy import integrate\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass LMSDiscreteScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        trained_betas=None,\n        timestep_values=None,\n        tensor_format=\"pd\",\n    ):\n        \"\"\"\n        Linear Multistep Scheduler for discrete beta schedules. 
Based on the original k-diffusion implementation by\n        Katherine Crowson:\n        https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181\n        \"\"\"\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.sigmas = ((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5\n\n        # setable values\n        self.num_inference_steps = None\n        self.timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self.derivatives = []\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def get_lms_coefficient(self, order, t, current_order):\n        \"\"\"\n        Compute a linear multistep coefficient\n        \"\"\"\n\n        def lms_derivative(tau):\n            prod = 1.0\n            for k in range(order):\n                if current_order == k:\n                    continue\n                prod *= (tau - self.sigmas[t - k]) / (self.sigmas[t - current_order] - self.sigmas[t - k])\n            return prod\n\n        integrated_coeff = integrate.quad(lms_derivative, self.sigmas[t], self.sigmas[t + 1], epsrel=1e-4)[0]\n\n        return integrated_coeff\n\n    def set_timesteps(self, num_inference_steps):\n        self.num_inference_steps = num_inference_steps\n        self.timesteps = np.linspace(self.num_train_timesteps - 1, 0, num_inference_steps, dtype=float)\n\n        
low_idx = np.floor(self.timesteps).astype(int)\n        high_idx = np.ceil(self.timesteps).astype(int)\n        frac = np.mod(self.timesteps, 1.0)\n        sigmas = np.array(((1 - self.alphas_cumprod) / self.alphas_cumprod)**0.5)\n        sigmas = (1 - frac) * sigmas[low_idx] + frac * sigmas[high_idx]\n        self.sigmas = np.concatenate([sigmas, [0.0]])\n\n        self.derivatives = []\n\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        order: int = 4,\n    ):\n        sigma = self.sigmas[timestep]\n\n        # 1. compute predicted original sample (x_0) from sigma-scaled predicted noise\n        pred_original_sample = sample - sigma * model_output\n\n        # 2. Convert to an ODE derivative\n        derivative = (sample - pred_original_sample) / sigma\n        self.derivatives.append(derivative)\n        if len(self.derivatives) > order:\n            self.derivatives.pop(0)\n\n        # 3. Compute linear multistep coefficients\n        order = min(timestep + 1, order)\n        lms_coeffs = [self.get_lms_coefficient(order, timestep, curr_order) for curr_order in range(order)]\n\n        # 4. Compute previous sample based on the derivatives path\n        prev_sample = sample + sum(coeff * derivative\n                                   for coeff, derivative in zip(lms_coeffs, reversed(self.derivatives)))\n\n        return {\"prev_sample\": prev_sample}\n\n    def add_noise(self, original_samples, noise, timesteps):\n        alpha_prod = self.alphas_cumprod[timesteps]\n        alpha_prod = self.match_shape(alpha_prod, original_samples)\n\n        noisy_samples = (alpha_prod**0.5) * original_samples + ((1 - alpha_prod)**0.5) * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_pndm.py",
    "content": "# Copyright 2022 Zhejiang University Team and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/ermongroup/ddim\nimport math\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\ndef betas_for_alpha_bar(num_diffusion_timesteps, max_beta=0.999):\n    \"\"\"\n    Create a beta schedule that discretizes the given alpha_t_bar function, which defines the cumulative product of\n    (1-beta) over time from t = [0,1].\n\n    :param num_diffusion_timesteps: the number of betas to produce. 
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and produces the cumulative product of (1-beta)\n                      up to that part of the diffusion process.\n    :param max_beta: the maximum beta to use; use values lower than 1 to\n                     prevent singularities.\n    \"\"\"\n\n    def alpha_bar(time_step):\n        return math.cos((time_step + 0.008) / 1.008 * math.pi / 2)**2\n\n    betas = []\n    for i in range(num_diffusion_timesteps):\n        t1 = i / num_diffusion_timesteps\n        t2 = (i + 1) / num_diffusion_timesteps\n        betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))\n    return np.array(betas, dtype=np.float32)\n\n\nclass PNDMScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=1000,\n        beta_start=0.0001,\n        beta_end=0.02,\n        beta_schedule=\"linear\",\n        tensor_format=\"pd\",\n        skip_prk_steps=False,\n    ):\n\n        if beta_schedule == \"linear\":\n            self.betas = np.linspace(beta_start, beta_end, num_train_timesteps, dtype=np.float32)\n        elif beta_schedule == \"scaled_linear\":\n            # this schedule is very specific to the latent diffusion model.\n            self.betas = np.linspace(beta_start**0.5, beta_end**0.5, num_train_timesteps, dtype=np.float32)**2\n        elif beta_schedule == \"squaredcos_cap_v2\":\n            # Glide cosine schedule\n            self.betas = betas_for_alpha_bar(num_train_timesteps)\n        else:\n            raise NotImplementedError(f\"{beta_schedule} is not implemented for {self.__class__}\")\n\n        self.alphas = 1.0 - self.betas\n        self.alphas_cumprod = np.cumprod(self.alphas, axis=0)\n\n        self.one = np.array(1.0)\n\n        # For now we only support F-PNDM, i.e. 
the runge-kutta method\n        # For more information on the algorithm please take a look at the paper: https://arxiv.org/pdf/2202.09778.pdf\n        # mainly at formula (9), (12), (13) and the Algorithm 2.\n        self.pndm_order = 4\n\n        # running values\n        self.cur_model_output = 0\n        self.counter = 0\n        self.cur_sample = None\n        self.ets = []\n\n        # setable values\n        self.num_inference_steps = None\n        self._timesteps = np.arange(0, num_train_timesteps)[::-1].copy()\n        self._offset = 0\n        self.prk_timesteps = None\n        self.plms_timesteps = None\n        self.timesteps = None\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, offset=0):\n        self.num_inference_steps = num_inference_steps\n        self._timesteps = list(\n            range(0, self.config.num_train_timesteps, self.config.num_train_timesteps // num_inference_steps))\n        self._offset = offset\n        self._timesteps = [t + self._offset for t in self._timesteps]\n\n        if self.config.skip_prk_steps:\n            # for some models like stable diffusion the prk steps can/should be skipped to\n            # produce better results. 
When using PNDM with `self.config.skip_prk_steps` the implementation\n            # is based on crowsonkb's PLMS sampler implementation: https://github.com/CompVis/latent-diffusion/pull/51\n            self.prk_timesteps = []\n            self.plms_timesteps = list(reversed(self._timesteps[:-1] + self._timesteps[-2:-1] + self._timesteps[-1:]))\n        else:\n            prk_timesteps = np.array(self._timesteps[-self.pndm_order:]).repeat(2) + np.tile(\n                np.array([0, self.config.num_train_timesteps // num_inference_steps // 2]), self.pndm_order)\n            self.prk_timesteps = list(reversed(prk_timesteps[:-1].repeat(2)[1:-1]))\n            self.plms_timesteps = list(reversed(self._timesteps[:-3]))\n\n        self.timesteps = self.prk_timesteps + self.plms_timesteps\n\n        self.ets = []\n        self.counter = 0\n        self.set_format(tensor_format=self.tensor_format)\n\n    def step(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        if self.counter < len(self.prk_timesteps) and not self.config.skip_prk_steps:\n            return self.step_prk(model_output=model_output, timestep=timestep, sample=sample)\n        else:\n            return self.step_plms(model_output=model_output, timestep=timestep, sample=sample)\n\n    def step_prk(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the Runge-Kutta method. 
RK takes 4 forward passes to approximate the\n        solution to the differential equation.\n        \"\"\"\n        diff_to_prev = 0 if self.counter % 2 else self.config.num_train_timesteps // self.num_inference_steps // 2\n        prev_timestep = max(timestep - diff_to_prev, self.prk_timesteps[-1])\n        timestep = self.prk_timesteps[self.counter // 4 * 4]\n\n        if self.counter % 4 == 0:\n            self.cur_model_output += 1 / 6 * model_output\n            self.ets.append(model_output)\n            self.cur_sample = sample\n        elif (self.counter - 1) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 2) % 4 == 0:\n            self.cur_model_output += 1 / 3 * model_output\n        elif (self.counter - 3) % 4 == 0:\n            model_output = self.cur_model_output + 1 / 6 * model_output\n            self.cur_model_output = 0\n\n        # cur_sample should not be `None`\n        cur_sample = self.cur_sample if self.cur_sample is not None else sample\n\n        prev_sample = self._get_prev_sample(cur_sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def step_plms(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n    ):\n        \"\"\"\n        Step function propagating the sample with the linear multi-step method. 
It performs a single forward pass and combines it with\n        previous model outputs to approximate the solution.\n        \"\"\"\n        if not self.config.skip_prk_steps and len(self.ets) < 3:\n            raise ValueError(\n                f\"{self.__class__} can only be run AFTER scheduler has been run \"\n                \"in 'prk' mode for at least 12 iterations. \"\n                \"See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py \"\n                \"for more information.\")\n\n        prev_timestep = max(timestep - self.config.num_train_timesteps // self.num_inference_steps, 0)\n\n        if self.counter != 1:\n            self.ets.append(model_output)\n        else:\n            prev_timestep = timestep\n            timestep = timestep + self.config.num_train_timesteps // self.num_inference_steps\n\n        if len(self.ets) == 1 and self.counter == 0:\n            model_output = model_output\n            self.cur_sample = sample\n        elif len(self.ets) == 1 and self.counter == 1:\n            model_output = (model_output + self.ets[-1]) / 2\n            sample = self.cur_sample\n            self.cur_sample = None\n        elif len(self.ets) == 2:\n            model_output = (3 * self.ets[-1] - self.ets[-2]) / 2\n        elif len(self.ets) == 3:\n            model_output = (23 * self.ets[-1] - 16 * self.ets[-2] + 5 * self.ets[-3]) / 12\n        else:\n            model_output = (1 / 24) * (55 * self.ets[-1] - 59 * self.ets[-2] + 37 * self.ets[-3] - 9 * self.ets[-4])\n\n        prev_sample = self._get_prev_sample(sample, timestep, prev_timestep, model_output)\n        self.counter += 1\n\n        return {\"prev_sample\": prev_sample}\n\n    def _get_prev_sample(self, sample, timestep, timestep_prev, model_output):\n        # See formula (9) of PNDM paper https://arxiv.org/pdf/2202.09778.pdf\n        # this function computes x_(t−δ) using the formula of (9)\n        # Note that x_t needs to be added to both sides of the equation\n\n   
     # Notation (<variable name> -> <name in paper>\n        # alpha_prod_t -> α_t\n        # alpha_prod_t_prev -> α_(t−δ)\n        # beta_prod_t -> (1 - α_t)\n        # beta_prod_t_prev -> (1 - α_(t−δ))\n        # sample -> x_t\n        # model_output -> e_θ(x_t, t)\n        # prev_sample -> x_(t−δ)\n        alpha_prod_t = self.alphas_cumprod[timestep + 1 - self._offset]\n        alpha_prod_t_prev = self.alphas_cumprod[timestep_prev + 1 - self._offset]\n        beta_prod_t = 1 - alpha_prod_t\n        beta_prod_t_prev = 1 - alpha_prod_t_prev\n\n        # corresponds to (α_(t−δ) - α_t) divided by\n        # denominator of x_t in formula (9) and plus 1\n        # Note: (α_(t−δ) - α_t) / (sqrt(α_t) * (sqrt(α_(t−δ)) + sqr(α_t))) =\n        # sqrt(α_(t−δ)) / sqrt(α_t))\n        sample_coeff = (alpha_prod_t_prev / alpha_prod_t)**(0.5)\n\n        # corresponds to denominator of e_θ(x_t, t) in formula (9)\n        model_output_denom_coeff = alpha_prod_t * beta_prod_t_prev**(0.5) + (alpha_prod_t * beta_prod_t *\n                                                                             alpha_prod_t_prev)**(0.5)\n\n        # full formula (9)\n        prev_sample = (sample_coeff * sample -\n                       (alpha_prod_t_prev - alpha_prod_t) * model_output / model_output_denom_coeff)\n\n        return prev_sample\n\n    def add_noise(self, original_samples, noise, timesteps):\n        sqrt_alpha_prod = self.alphas_cumprod[timesteps]**0.5\n        sqrt_alpha_prod = self.match_shape(sqrt_alpha_prod, original_samples)\n        sqrt_one_minus_alpha_prod = (1 - self.alphas_cumprod[timesteps])**0.5\n        sqrt_one_minus_alpha_prod = self.match_shape(sqrt_one_minus_alpha_prod, original_samples)\n\n        noisy_samples = sqrt_alpha_prod * original_samples + sqrt_one_minus_alpha_prod * noise\n        return noisy_samples\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_sde_ve.py",
    "content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean-up a bit\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVeScheduler(SchedulerMixin, ConfigMixin):\n    \"\"\"\n    The variance exploding stochastic differential equation (SDE) scheduler.\n\n    :param snr: coefficient weighting the step from the model_output sample (from the network) to the random noise.\n    :param sigma_min: initial noise scale for sigma sequence in sampling procedure. The minimum sigma should mirror the\n            distribution of the data.\n    :param sigma_max: maximum noise scale for the sigma sequence in the sampling procedure.\n    :param sampling_eps: the end value of sampling, where timesteps decrease progressively from 1 to epsilon.\n    :param correct_steps: number of correction steps performed on a produced sample.\n    
:param tensor_format:\n    \"np\" or \"pd\" for the expected format of samples passed to the Scheduler.\n    \"\"\"\n\n    @register_to_config\n    def __init__(\n        self,\n        num_train_timesteps=2000,\n        snr=0.15,\n        sigma_min=0.01,\n        sigma_max=1348,\n        sampling_eps=1e-5,\n        correct_steps=1,\n        tensor_format=\"pd\",\n    ):\n        # self.sigmas = None\n        # self.discrete_sigmas = None\n        #\n        # # setable values\n        # self.num_inference_steps = None\n        self.timesteps = None\n\n        self.set_sigmas(num_train_timesteps, sigma_min, sigma_max, sampling_eps)\n\n        self.tensor_format = tensor_format\n        self.set_format(tensor_format=tensor_format)\n\n    def set_timesteps(self, num_inference_steps, sampling_eps=None):\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.timesteps = np.linspace(1, sampling_eps, num_inference_steps)\n        elif tensor_format == \"pd\":\n            self.timesteps = paddle.linspace(1, sampling_eps, num_inference_steps)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_sigmas(self, num_inference_steps, sigma_min=None, sigma_max=None, sampling_eps=None):\n        sigma_min = sigma_min if sigma_min is not None else self.config.sigma_min\n        sigma_max = sigma_max if sigma_max is not None else self.config.sigma_max\n        sampling_eps = sampling_eps if sampling_eps is not None else self.config.sampling_eps\n        if self.timesteps is None:\n            self.set_timesteps(num_inference_steps, sampling_eps)\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            self.discrete_sigmas = np.exp(np.linspace(np.log(sigma_min), np.log(sigma_max), 
num_inference_steps))\n            self.sigmas = np.array([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        elif tensor_format == \"pd\":\n            self.discrete_sigmas = paddle.exp(paddle.linspace(np.log(sigma_min), np.log(sigma_max),\n                                                              num_inference_steps))\n            self.sigmas = paddle.to_tensor([sigma_min * (sigma_max / sigma_min)**t for t in self.timesteps])\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def get_adjacent_sigma(self, timesteps, t):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.where(timesteps == 0, np.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n        elif tensor_format == \"pd\":\n            return paddle.where(timesteps == 0, paddle.zeros_like(t), self.discrete_sigmas[timesteps - 1])\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def set_seed(self, seed):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            np.random.seed(seed)\n        elif tensor_format == \"pd\":\n            paddle.seed(seed)\n        else:\n            raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def step_pred(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        timestep: int,\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Predict the sample at the previous timestep by reversing the SDE.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n        # TODO(Patrick) non-Pypaddle\n\n        timestep = timestep * paddle.ones(sample.shape[0])  # paddle.repeat_interleave(timestep, sample.shape[0])\n        timesteps = (timestep * (len(self.timesteps) - 
1)).astype(\"int64\")\n\n        sigma = self.discrete_sigmas[timesteps]\n        adjacent_sigma = self.get_adjacent_sigma(timesteps, timestep)\n        drift = self.zeros_like(sample)\n        diffusion = (sigma**2 - adjacent_sigma**2)**0.5\n\n        # equation 6 in the paper: the model_output modeled by the network is grad_x log pt(x)\n        # also equation 47 shows the analog from SDE models to ancestral sampling methods\n        drift = drift - diffusion[:, None, None, None]**2 * model_output\n\n        #  equation 6: sample noise for the diffusion term of\n        noise = self.randn_like(sample)\n        prev_sample_mean = sample - drift  # subtract because `dt` is a small negative timestep\n        # TODO is the variable diffusion the correct scaling term for the noise?\n        prev_sample = prev_sample_mean + diffusion[:, None, None, None] * noise  # add impact of diffusion field g\n\n        return {\"prev_sample\": prev_sample, \"prev_sample_mean\": prev_sample_mean}\n\n    def step_correct(\n        self,\n        model_output: Union[paddle.Tensor, np.ndarray],\n        sample: Union[paddle.Tensor, np.ndarray],\n        seed=None,\n    ):\n        \"\"\"\n        Correct the predicted sample based on the output model_output of the network. This is often run repeatedly\n        after making the prediction for the previous timestep.\n        \"\"\"\n        if seed is not None:\n            self.set_seed(seed)\n\n        # For small batch sizes, the paper \"suggest replacing norm(z) with sqrt(d), where d is the dim. 
of z\"\n        # sample noise for correction\n        noise = self.randn_like(sample)\n\n        # compute step size from the model_output, the noise, and the snr\n        grad_norm = self.norm(model_output)\n        noise_norm = self.norm(noise)\n        step_size = (self.config.snr * noise_norm / grad_norm)**2 * 2\n        step_size = step_size * paddle.ones(sample.shape[0])\n        # self.repeat_scalar(step_size, sample.shape[0])\n\n        # compute corrected sample: model_output term and noise term\n        prev_sample_mean = sample + step_size[:, None, None, None] * model_output\n        prev_sample = prev_sample_mean + ((step_size * 2)**0.5)[:, None, None, None] * noise\n\n        return {\"prev_sample\": prev_sample}\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_sde_vp.py",
"content": "# Copyright 2022 Google Brain and The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# DISCLAIMER: This file is strongly influenced by https://github.com/yang-song/score_sde_pytorch\n# TODO(Patrick, Anton, Suraj) - make scheduler framework independent and clean-up a bit\nimport numpy as np\nimport paddle\n\nfrom ..configuration_utils import ConfigMixin\nfrom ..configuration_utils import register_to_config\nfrom .scheduling_utils import SchedulerMixin\n\n\nclass ScoreSdeVpScheduler(SchedulerMixin, ConfigMixin):\n\n    @register_to_config\n    def __init__(self, num_train_timesteps=2000, beta_min=0.1, beta_max=20, sampling_eps=1e-3, tensor_format=\"np\"):\n\n        self.sigmas = None\n        self.discrete_sigmas = None\n        self.timesteps = None\n\n    def set_timesteps(self, num_inference_steps):\n        self.timesteps = paddle.linspace(1, self.config.sampling_eps, num_inference_steps)\n\n    def step_pred(self, score, x, t):\n        # TODO(Patrick) better comments + non-PyTorch\n        # postprocess model score\n        log_mean_coeff = (-0.25 * t**2 * (self.config.beta_max - self.config.beta_min) - 0.5 * t * self.config.beta_min)\n        std = paddle.sqrt(1.0 - paddle.exp(2.0 * log_mean_coeff))\n        score = -score / std[:, None, None, None]\n\n        # compute\n        dt = -1.0 / len(self.timesteps)\n\n        beta_t = self.config.beta_min + t * (self.config.beta_max - 
self.config.beta_min)\n        drift = -0.5 * beta_t[:, None, None, None] * x\n        diffusion = paddle.sqrt(beta_t)\n        drift = drift - diffusion[:, None, None, None]**2 * score\n        x_mean = x + drift * dt\n\n        # add noise\n        noise = self.randn_like(x)\n        x = x_mean + diffusion[:, None, None, None] * np.sqrt(-dt) * noise\n\n        return x, x_mean\n\n    def __len__(self):\n        return self.config.num_train_timesteps\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/diffusers/schedulers/scheduling_utils.py",
    "content": "# Copyright 2022 The HuggingFace Team. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Union\n\nimport numpy as np\nimport paddle\n\nSCHEDULER_CONFIG_NAME = \"scheduler_config.json\"\n\n\nclass SchedulerMixin:\n\n    config_name = SCHEDULER_CONFIG_NAME\n    ignore_for_config = [\"tensor_format\"]\n\n    def set_format(self, tensor_format=\"pd\"):\n        self.tensor_format = tensor_format\n        if tensor_format == \"pd\":\n            for key, value in vars(self).items():\n                if isinstance(value, np.ndarray):\n                    setattr(self, key, paddle.to_tensor(value))\n\n        return self\n\n    def clip(self, tensor, min_value=None, max_value=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.clip(tensor, min_value, max_value)\n        elif tensor_format == \"pd\":\n            return paddle.clip(tensor, min_value, max_value)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def log(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n\n        if tensor_format == \"np\":\n            return np.log(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.log(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def match_shape(self, values: 
Union[np.ndarray, paddle.Tensor], broadcast_array: Union[np.ndarray, paddle.Tensor]):\n        \"\"\"\n        Turns a 1-D array into an array or tensor with len(broadcast_array.shape) dims.\n\n        Args:\n            values: an array or tensor of values to extract.\n            broadcast_array: an array with a larger shape of K dimensions with the batch\n                dimension equal to the length of timesteps.\n        Returns:\n            a tensor of shape [batch_size, 1, ...] where the shape has K dims.\n        \"\"\"\n\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        values = values.flatten()\n\n        while len(values.shape) < len(broadcast_array.shape):\n            values = values[..., None]\n\n        return values\n\n    def norm(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.linalg.norm(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.norm(tensor.reshape([tensor.shape[0], -1]), axis=-1).mean()\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def randn_like(self, tensor, generator=None):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            # np.random.randn takes dimensions as separate int arguments, so unpack the shape tuple\n            return np.random.randn(*np.shape(tensor))\n        elif tensor_format == \"pd\":\n            # return paddle.randn_like(tensor)\n            return paddle.randn(tensor.shape)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n\n    def zeros_like(self, tensor):\n        tensor_format = getattr(self, \"tensor_format\", \"pd\")\n        if tensor_format == \"np\":\n            return np.zeros_like(tensor)\n        elif tensor_format == \"pd\":\n            return paddle.zeros_like(tensor)\n\n        raise ValueError(f\"`self.tensor_format`: {self.tensor_format} is not valid.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/module.py",
    "content": "# copyright (c) 2022 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport random\nimport sys\nfrom functools import partial\nfrom typing import List\nfrom typing import Optional\n\nimport numpy as np\nimport paddle\nfrom docarray import Document\nfrom docarray import DocumentArray\nfrom IPython import display\nfrom PIL import Image\nfrom stable_diffusion.clip.clip.utils import build_model\nfrom stable_diffusion.clip.clip.utils import tokenize\nfrom stable_diffusion.diffusers import AutoencoderKL\nfrom stable_diffusion.diffusers import DDIMScheduler\nfrom stable_diffusion.diffusers import LMSDiscreteScheduler\nfrom stable_diffusion.diffusers import PNDMScheduler\nfrom stable_diffusion.diffusers import UNet2DConditionModel\nfrom tqdm.auto import tqdm\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"stable_diffusion_waifu\",\n            version=\"1.0.0\",\n            type=\"image/text_to_image\",\n            summary=\"\",\n            author=\"paddlepaddle\",\n            author_email=\"paddle-dev@baidu.com\")\nclass StableDiffusion:\n\n    def __init__(self):\n        self.vae = AutoencoderKL(in_channels=3,\n                                 out_channels=3,\n                                 
down_block_types=(\"DownEncoderBlock2D\", \"DownEncoderBlock2D\", \"DownEncoderBlock2D\",\n                                                   \"DownEncoderBlock2D\"),\n                                 up_block_types=(\"UpDecoderBlock2D\", \"UpDecoderBlock2D\", \"UpDecoderBlock2D\",\n                                                 \"UpDecoderBlock2D\"),\n                                 block_out_channels=(128, 256, 512, 512),\n                                 layers_per_block=2,\n                                 act_fn=\"silu\",\n                                 latent_channels=4,\n                                 sample_size=512)\n\n        self.unet = UNet2DConditionModel(sample_size=64,\n                                         in_channels=4,\n                                         out_channels=4,\n                                         center_input_sample=False,\n                                         flip_sin_to_cos=True,\n                                         freq_shift=0,\n                                         down_block_types=(\"CrossAttnDownBlock2D\", \"CrossAttnDownBlock2D\",\n                                                           \"CrossAttnDownBlock2D\", \"DownBlock2D\"),\n                                         up_block_types=(\"UpBlock2D\", \"CrossAttnUpBlock2D\", \"CrossAttnUpBlock2D\",\n                                                         \"CrossAttnUpBlock2D\"),\n                                         block_out_channels=(320, 640, 1280, 1280),\n                                         layers_per_block=2,\n                                         downsample_padding=1,\n                                         mid_block_scale_factor=1,\n                                         act_fn=\"silu\",\n                                         norm_num_groups=32,\n                                         norm_eps=1e-5,\n                                         cross_attention_dim=768,\n                                         
attention_head_dim=8)\n\n        vae_path = os.path.join(self.directory, 'pre_trained', 'waifu-vae.pdparams')\n        unet_path = os.path.join(self.directory, 'pre_trained', 'waifu-unet.pdparams')\n        self.unet.set_dict(paddle.load(unet_path))\n        self.vae.set_dict(paddle.load(vae_path))\n        for parameter in self.unet.parameters():\n            parameter.stop_gradient = True\n        self.vae.eval()\n        for parameter in self.vae.parameters():\n            parameter.stop_gradient = True\n        self.unet.eval()\n\n        self.text_encoder = build_model()\n        for parameter in self.text_encoder.parameters():\n            parameter.stop_gradient = True\n        self.scheduler = PNDMScheduler(beta_start=0.00085,\n                                       beta_end=0.012,\n                                       beta_schedule=\"scaled_linear\",\n                                       num_train_timesteps=1000,\n                                       skip_prk_steps=True)\n\n    def generate_image(self,\n                       text_prompts,\n                       style: Optional[str] = None,\n                       artist: Optional[str] = None,\n                       width_height: Optional[List[int]] = [512, 512],\n                       batch_size: Optional[int] = 1,\n                       num_inference_steps=50,\n                       guidance_scale=7.5,\n                       enable_fp16=False,\n                       seed=None,\n                       display_rate=5,\n                       use_gpu=True,\n                       output_dir: Optional[str] = 'stable_diffusion_waifu_out'):\n        \"\"\"\n        Create Stable Diffusion artworks and save the result into a DocumentArray.\n\n        :param text_prompts: Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. 
These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  These other apps use different technologies, but many of the same principles apply.\n        :param style: Image style, such as oil paintings, if specified, style will be used to construct prompts.\n        :param artist: Artist style, if specified, style will be used to construct prompts.\n        :param width_height: Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  
If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\n        :param batch_size: This variable sets the number of still images you want SD to create for each prompt.\n        :param num_inference_steps: The number of inference steps.\n        :param guidance_scale: Increase the adherence to the conditional signal which in this case is text as well as overall sample quality.\n        :param enable_fp16: Whether to use float16.\n        :param use_gpu: whether to use gpu or not.\n        :param output_dir: Output directory.\n        :return: a DocumentArray object with one Document per prompt\n        \"\"\"\n        if seed is not None:\n            np.random.seed(seed)\n            random.seed(seed)\n            paddle.seed(seed)\n\n        if use_gpu:\n            try:\n                _places = os.environ.get(\"CUDA_VISIBLE_DEVICES\", None)\n                if _places:\n                    paddle.device.set_device(\"gpu:{}\".format(0))\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. 
If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n        else:\n            paddle.device.set_device(\"cpu\")\n        paddle.disable_static()\n\n        if not os.path.exists(output_dir):\n            os.makedirs(output_dir, exist_ok=True)\n\n        if isinstance(text_prompts, str):\n            text_prompts = text_prompts.rstrip(',.，。')\n            if style is not None:\n                text_prompts += \",{}\".format(style)\n            if artist is not None:\n                text_prompts += \",{},trending on artstation\".format(artist)\n            text_prompts = [text_prompts]\n        elif isinstance(text_prompts, list):\n            for i, prompt in enumerate(\n                    text_prompts):  # different from dd here, dd can have multiple prompts for one image with weight.\n                text_prompts[i] = prompt.rstrip(',.，。')\n                if style is not None:\n                    text_prompts[i] += \",{}\".format(style)\n                if artist is not None:\n                    text_prompts[i] += \",{},trending on artstation\".format(artist)\n\n        width, height = width_height\n        da_batches = DocumentArray()\n\n        for prompt in text_prompts:\n            d = Document(tags={'prompt': prompt})\n            da_batches.append(d)\n            for i in range(batch_size):\n                d.chunks.append(Document(tags={'prompt': prompt, 'image idx': i}))\n            d.chunks.append(Document(tags={'prompt': prompt, 'image idx': 'merged'}))\n            with paddle.amp.auto_cast(enable=enable_fp16, level='O2'):\n                prompts = [prompt] * batch_size\n                text_input = tokenize(prompts)\n                text_embeddings = self.text_encoder(text_input)\n                uncond_input = tokenize([\"\"] * batch_size)\n                uncond_embeddings = self.text_encoder(uncond_input)\n                text_embeddings = paddle.concat([uncond_embeddings, text_embeddings])\n\n     
           latents = paddle.randn((batch_size, self.unet.in_channels, height // 8, width // 8), )\n                if isinstance(self.scheduler, LMSDiscreteScheduler):\n                    latents = latents * self.scheduler.sigmas[0]\n\n                self.scheduler.set_timesteps(num_inference_steps)\n                for i, t in tqdm(enumerate(self.scheduler.timesteps)):\n                    # expand the latents if we are doing classifier-free guidance to avoid doing two forward passes.\n                    latent_model_input = paddle.concat([latents] * 2)\n\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        sigma = self.scheduler.sigmas[i]\n                        latent_model_input = latent_model_input / ((sigma**2 + 1)**0.5)\n\n                    # predict the noise residual\n                    noise_pred = self.unet(latent_model_input, t, encoder_hidden_states=text_embeddings)[\"sample\"]\n\n                    # perform guidance\n                    noise_pred_uncond, noise_pred_text = noise_pred.chunk(2)\n                    noise_pred = noise_pred_uncond + guidance_scale * (noise_pred_text - noise_pred_uncond)\n\n                    # compute the previous noisy sample x_t -> x_t-1\n                    if isinstance(self.scheduler, LMSDiscreteScheduler):\n                        latents = self.scheduler.step(noise_pred, i, latents)[\"prev_sample\"]\n                    else:\n                        latents = self.scheduler.step(noise_pred, t, latents)[\"prev_sample\"]\n                    if i % display_rate == 0:\n                        # vae decode\n                        images = self.vae.decode(1 / 0.18215 * latents)\n                        images = (images / 2 + 0.5).clip(0, 1)\n                        merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                        merge_image = (merge_image * 255).round().astype(np.uint8)\n                        merge_image = 
Image.fromarray(merge_image)\n                        merge_image.save(os.path.join(output_dir, f'{prompt[:10]}-progress.png'))\n                        c = Document(tags={'step': i, 'prompt': prompt})\n                        c.load_pil_image_to_datauri(merge_image)\n                        d.chunks[-1].chunks.append(c)\n                        display.clear_output(wait=True)\n                        display.display(merge_image)\n                        images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                        images = (images * 255).round().astype(np.uint8)\n                        for j in range(images.shape[0]):\n                            image = Image.fromarray(images[j])\n                            c = Document(tags={'step': i, 'prompt': prompt})\n                            c.load_pil_image_to_datauri(image)\n                            d.chunks[j].chunks.append(c)\n\n                # vae decode\n                images = self.vae.decode(1 / 0.18215 * latents)\n                images = (images / 2 + 0.5).clip(0, 1)\n                merge_image = images.cpu().transpose([2, 0, 3, 1]).flatten(1, 2).numpy()\n                merge_image = (merge_image * 255).round().astype(np.uint8)\n                merge_image = Image.fromarray(merge_image)\n                merge_image.save(os.path.join(output_dir, f'{prompt[:10]}-merge.png'))\n                display.clear_output(wait=True)\n                display.display(merge_image)\n                d.load_pil_image_to_datauri(merge_image)\n                d.chunks[-1].load_pil_image_to_datauri(merge_image)\n                images = images.cpu().transpose([0, 2, 3, 1]).numpy()\n                images = (images * 255).round().astype(np.uint8)\n                for j in range(images.shape[0]):\n                    image = Image.fromarray(images[j])\n                    image.save(os.path.join(output_dir, f'{prompt[:10]}-image-{j}.png'))\n                    d.chunks[j].load_pil_image_to_datauri(image)\n  
      return da_batches\n\n    @serving\n    def serving_method(self, text_prompts, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate_image(text_prompts=text_prompts, **kwargs).to_base64()\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.generate_image(text_prompts=args.text_prompts,\n                                      style=args.style,\n                                      artist=args.artist,\n                                      width_height=args.width_height,\n                                      batch_size=args.batch_size,\n                                      num_inference_steps=args.num_inference_steps,\n                                      guidance_scale=args.guidance_scale,\n                                      enable_fp16=args.enable_fp16,\n                                      seed=args.seed,\n                                      display_rate=args.display_rate,\n                                      use_gpu=args.use_gpu,\n                                      output_dir=args.output_dir)\n        return results\n\n    def add_module_config_arg(self):\n        
\"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_input_group.add_argument('--num_inference_steps',\n                                          type=int,\n                                          default=50,\n                                          help=\"The number of inference steps.\")\n\n        self.arg_input_group.add_argument(\n            '--guidance_scale',\n            type=float,\n            default=7.5,\n            help=\n            \"Increase the adherence to the conditional signal which in this case is text as well as overall sample quality.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--seed',\n            type=int,\n            default=None,\n            help=\n            \"Deep in the diffusion code, there is a random number ‘seed’ which is used as the basis for determining the initial state of the diffusion.  By default, this is random, but you can also specify your own seed.\"\n        )\n\n        self.arg_input_group.add_argument(\n            '--display_rate',\n            type=int,\n            default=10,\n            help=\"During a diffusion run, you can monitor the progress of each image being created with this variable.\")\n\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--enable_fp16',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use float16 or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='stable_diffusion_waifu_out',\n                                 
          help='Output directory.')\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--text_prompts',\n            type=str,\n            help=\n            'Phrase, sentence, or string of words and phrases describing what the image should look like.  The words will be analyzed by the AI and will guide the diffusion process toward the image(s) you describe. These can include commas and weights to adjust the relative importance of each element.  E.g. \"A beautiful painting of a singular lighthouse, shining its light across a tumultuous sea of blood by greg rutkowski and thomas kinkade, Trending on artstation.\"Notice that this prompt loosely follows a structure: [subject], [prepositional details], [setting], [meta modifiers and artist]; this is a good starting point for your experiments. Developing text prompts takes practice and experience, and is not the subject of this guide.  If you are a beginner to writing text prompts, a good place to start is on a simple AI art app like Night Cafe, starry ai or WOMBO prior to using DD, to get a feel for how text gets translated into images by GAN tools.  
These other apps use different technologies, but many of the same principles apply.'\n        )\n        self.arg_input_group.add_argument(\n            '--style',\n            type=str,\n            default=None,\n            help='Image style, such as oil paintings, if specified, style will be used to construct prompts.')\n        self.arg_input_group.add_argument('--artist',\n                                          type=str,\n                                          default=None,\n                                          help='Artist style, if specified, style will be used to construct prompts.')\n\n        self.arg_input_group.add_argument(\n            '--width_height',\n            type=ast.literal_eval,\n            default=[512, 512],\n            help=\n            \"Desired final image size, in pixels. You can have a square, wide, or tall image, but each edge length should be set to a multiple of 64px, and a minimum of 512px on the default CLIP model setting.  If you forget to use multiples of 64px in your dimensions, DD will adjust the dimensions of your image to make it so.\"\n        )\n        self.arg_input_group.add_argument(\n            '--batch_size',\n            type=int,\n            default=1,\n            help=\"This variable sets the number of still images you want SD to create for each prompt.\")\n"
  },
  {
    "path": "modules/image/text_to_image/stable_diffusion_waifu/requirements.txt",
"content": "numpy\nftfy\nregex\ndocarray>=0.13.29\npyyaml\ntqdm\nipywidgets\n"
  },
  {
    "path": "modules/text/README.md",
    "content": ""
  },
  {
    "path": "modules/text/embedding/README.md",
    "content": ""
  },
  {
    "path": "modules/text/embedding/fasttext_crawl_target_word-word_dim300_en/README.md",
    "content": "# fasttext_crawl_target_word-word_dim300_en\n|模型名称|fasttext_crawl_target_word-word_dim300_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|fasttext|\n|数据集|crawl|\n|是否支持Fine-tuning|否|\n|文件大小|1.19GB|\n|词表大小|2000002|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fasttext_crawl_target_word-word_dim300_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='fasttext_crawl_target_word-word_dim300_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n 
       word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m fasttext_crawl_target_word-word_dim300_en\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/fasttext_crawl_target_word-word_dim300_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install fasttext_crawl_target_word-word_dim300_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/fasttext_crawl_target_word-word_dim300_en/README_en.md",
    "content": "# fasttext_crawl_target_word-word_dim300_en\n|Module Name|fasttext_crawl_target_word-word_dim300_en|\n| :--- | :---: | \n|Category|Word Embedding|\n|Network|fasttext|\n|Dataset|crawl|\n|Fine-tuning supported|No|\n|Module Size|1.19GB|\n|Vocab Size|2,000,002|\n|Last update date|26 Feb, 2021|\n|Data Indicators|-|\n\n## I. Basic Information\n\n- ### Module Introduction\n\n    - PaddleHub provides several open source pretrained word embedding models. These embedding models are distinguished by the corpus, training methods and word embedding dimensions. For more informations, please refer to: [Summary of embedding models](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## II. Installation\n\n- ### 1. Environmental Dependence\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [PaddleHub Installation Guide](../../../../docs/docs_ch/get_start/installation_en.rst)\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install fasttext_crawl_target_word-word_dim300_en\n    ```\n\n  - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_ch/get_start/windows_quickstart_en.md) | [Linux_Quickstart](../../../../docs/docs_ch/get_start/linux_quickstart_en.md) | [Mac_Quickstart](../../../../docs/docs_ch/get_start/mac_quickstart_en.md)\n\n## III. Module API Prediction\n\n- ### 1. 
Prediction Code Example\n\n  - ```\n    import paddlehub as hub\n    embedding = hub.Module(name='fasttext_crawl_target_word-word_dim300_en')\n\n    # Get the embedding of the word\n    embedding.search(\"中国\")\n    # Calculate the cosine similarity of two word vectors\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # Calculate the inner product of two word vectors\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - Construct an embedding module object without parameters by default.\n\n    - **Parameters**\n      - `*args`： Arguments specified by the user.\n      - `**kwargs`：Keyword arguments specified by the user.\n\n    - More info[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - Return the embedding of one or multiple words. The input data type can be `str`, `List[str]` and `int`, represent word, multiple words and the embedding of specified word id accordingly. Word id is related to the model vocab, vocab can be obtained by the attribute of `vocab`.\n\n    - **参数**\n      - `words`： input words or word id.\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - Cosine similarity calculation. `word_a` and `word_b` should be in the voacb, or they will be replaced by `unknown_token`. \n\n    - **参数**\n      - `word_a`： input word a.\n      - `word_b`： input word b.\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - Inner product calculation. `word_a` and `word_b` should be in the voacb, or they will be replaced by `unknown_token`. 
\n\n    - **参数**\n      - `word_a`： input word a.\n      - `word_b`： input word b.\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - Get the path of the local vocab file.\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - Get the tokenizer of current model, it will return an instance of JiebaTokenizer, only supports the chinese embedding models currently.\n\n    - **参数**\n      - `*args`: Arguments specified by the user.\n      - `**kwargs`: Keyword arguments specified by the user.\n    \n    - For more information about the arguments, please refer to[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - For more information about the usage, please refer to[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online service of cosine similarity calculation.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n\n  - ```shell\n    $ hub serving start -m fasttext_crawl_target_word-word_dim300_en\n    ```\n\n  - The servitization API is now deployed and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set `CUDA_VISIBLE_DEVICES` environment variable before the service, otherwise it need not be set.\n\n- ### Step 2: Send a predictive request\n\n  - With a configured server, use the following lines of code to send the prediction request and obtain the result\n\n  - ```python\n    import requests\n    import json\n\n    # Specify the word pairs used to calculate the cosine similarity [[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    data = {\"data\": word_pairs}\n    # Send an HTTP request\n    url = \"http://127.0.0.1:8866/predict/fasttext_crawl_target_word-word_dim300_en\"\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Model optimization\n  - ```shell\n    $ hub install fasttext_crawl_target_word-word_dim300_en==1.0.1\n    ```"
  },
  {
    "path": "modules/text/embedding/fasttext_crawl_target_word-word_dim300_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/fasttext_crawl_target_word-word_dim300_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"fasttext_crawl_target_word-word_dim300_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"fasttext.crawl.target.word-word.dim300.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/fasttext_wiki-news_target_word-word_dim300_en/README.md",
    "content": "# fasttext_wiki-news_target_word-word_dim300_en\n|模型名称|fasttext_wiki-news_target_word-word_dim300_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|fasttext|\n|数据集|wiki-news|\n|是否支持Fine-tuning|否|\n|文件大小|541.63MB|\n|词表大小|999996|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fasttext_wiki-news_target_word-word_dim300_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='fasttext_wiki-news_target_word-word_dim300_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n 
   def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m fasttext_wiki-news_target_word-word_dim300_en\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/fasttext_wiki-news_target_word-word_dim300_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install fasttext_wiki-news_target_word-word_dim300_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/fasttext_wiki-news_target_word-word_dim300_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/fasttext_wiki-news_target_word-word_dim300_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"fasttext_wiki-news_target_word-word_dim300_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"fasttext.wiki-news.target.word-word.dim300.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim100_en/README.md",
    "content": "# glove_twitter_target_word-word_dim100_en\n|模型名称|glove_twitter_target_word-word_dim100_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|twitter|\n|是否支持Fine-tuning|否|\n|文件大小|431.08MB|\n|词表大小|1193516|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim100_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_twitter_target_word-word_dim100_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_twitter_target_word-word_dim100_en\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_twitter_target_word-word_dim100_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim100_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim100_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim100_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_twitter_target_word-word_dim100_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.twitter.target.word-word.dim100.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim200_en/README.md",
    "content": "# glove_twitter_target_word-word_dim200_en\n|模型名称|glove_twitter_target_word-word_dim200_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|twitter|\n|是否支持Fine-tuning|否|\n|文件大小|848.56MB|\n|词表大小|1193516|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim200_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_twitter_target_word-word_dim200_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_twitter_target_word-word_dim200_en\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_twitter_target_word-word_dim200_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim200_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim200_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim200_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_twitter_target_word-word_dim200_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.twitter.target.word-word.dim200.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim25_en/README.md",
    "content": "# glove_twitter_target_word-word_dim25_en\n|模型名称|glove_twitter_target_word-word_dim25_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|twitter|\n|是否支持Fine-tuning|否|\n|文件大小|116.92MB|\n|词表大小|1193516|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim25_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_twitter_target_word-word_dim25_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_twitter_target_word-word_dim25_en\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_twitter_target_word-word_dim25_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim25_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim25_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim25_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_twitter_target_word-word_dim25_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.twitter.target.word-word.dim25.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim50_en/README.md",
    "content": "# glove_twitter_target_word-word_dim50_en\n|模型名称|glove_twitter_target_word-word_dim50_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|twitter|\n|是否支持Fine-tuning|否|\n|文件大小|221.64MB|\n|词表大小|1193516|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim50_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_twitter_target_word-word_dim50_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_twitter_target_word-word_dim50_en\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"apple\", \"banana\"], [\"today\", \"tomorrow\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_twitter_target_word-word_dim50_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_twitter_target_word-word_dim50_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim50_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_twitter_target_word-word_dim50_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_twitter_target_word-word_dim50_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.twitter.target.word-word.dim50.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim100_en/README.md",
    "content": "# glove_wiki2014-gigaword_target_word-word_dim100_en\n|模型名称|glove_wiki2014-gigaword_target_word-word_dim100_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|wiki2014-gigaword|\n|是否支持Fine-tuning|否|\n|文件大小|143.30MB|\n|词表大小|400002|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim100_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_wiki2014-gigaword_target_word-word_dim100_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_wiki2014-gigaword_target_word-word_dim100_en\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"apple\", \"banana\"], [\"today\", \"tomorrow\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_wiki2014-gigaword_target_word-word_dim100_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim100_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim100_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim100_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_wiki2014-gigaword_target_word-word_dim100_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.wiki2014-gigaword.target.word-word.dim100.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim200_en/README.md",
    "content": "# glove_wiki2014-gigaword_target_word-word_dim200_en\n|模型名称|glove_wiki2014-gigaword_target_word-word_dim200_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|wiki2014-gigaword|\n|是否支持Fine-tuning|否|\n|文件大小|282.97MB|\n|词表大小|400002|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim200_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_wiki2014-gigaword_target_word-word_dim200_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_wiki2014-gigaword_target_word-word_dim200_en\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"apple\", \"banana\"], [\"today\", \"tomorrow\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_wiki2014-gigaword_target_word-word_dim200_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim200_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim200_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim200_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_wiki2014-gigaword_target_word-word_dim200_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.wiki2014-gigaword.target.word-word.dim200.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim300_en/README.md",
    "content": "# glove_wiki2014-gigaword_target_word-word_dim300_en\n|模型名称|glove_wiki2014-gigaword_target_word-word_dim300_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|wiki2014-gigaword|\n|是否支持Fine-tuning|否|\n|文件大小|422.83MB|\n|词表大小|400002|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim300_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_wiki2014-gigaword_target_word-word_dim300_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_wiki2014-gigaword_target_word-word_dim300_en\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"apple\", \"banana\"], [\"today\", \"tomorrow\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_wiki2014-gigaword_target_word-word_dim300_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim300_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim300_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim300_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_wiki2014-gigaword_target_word-word_dim300_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.wiki2014-gigaword.target.word-word.dim300.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim50_en/README.md",
    "content": "# glove_wiki2014-gigaword_target_word-word_dim50_en\n|模型名称|glove_wiki2014-gigaword_target_word-word_dim50_en|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|glove|\n|数据集|wiki2014-gigaword|\n|是否支持Fine-tuning|否|\n|文件大小|73.45MB|\n|词表大小|400002|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim50_en\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='glove_wiki2014-gigaword_target_word-word_dim50_en')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m glove_wiki2014-gigaword_target_word-word_dim50_en\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"apple\", \"banana\"], [\"today\", \"tomorrow\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/glove_wiki2014-gigaword_target_word-word_dim50_en\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install glove_wiki2014-gigaword_target_word-word_dim50_en==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim50_en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/glove_wiki2014-gigaword_target_word-word_dim50_en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"glove_wiki2014-gigaword_target_word-word_dim50_en\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"glove.wiki2014-gigaword.target.word-word.dim50.en\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-character_char1-1_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-character_char1-1_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|678.65MB|\n|词表大小|636200|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-1_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-character_char1-1_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例；当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-character_char1-1_dim300\n    ```\n\n  - 这样就完成了一个获取词向量余弦相似度的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-1_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-1_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-character_char1-1_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-character.char1-1.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-character_char1-2_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-character_char1-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|844.23MB|\n|词表大小|792631|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-character_char1-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-character_char1-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-character_char1-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-character.char1-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-4_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-character_char1-4_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-character_char1-4_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|1.16GB|\n|词表大小|1117461|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-4_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-character_char1-4_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-character_char1-4_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-character_char1-4_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-character_char1-4_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-4_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-character_char1-4_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-character_char1-4_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-character.char1-4.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|7.25GB|\n|词表大小|6967598|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-ngram_1-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-ngram.1-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|5.21GB|\n|词表大小|5000001|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-ngram_1-3_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-ngram.1-3.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|7.26GB|\n|词表大小|6968998|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-ngram_2-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-ngram.2-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordLR_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-wordLR_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-wordLR_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|1.32GB|\n|词表大小|1271031|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-wordLR_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-wordLR_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-wordLR_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-wordLR_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-wordLR_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordLR_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordLR_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-wordLR_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-wordLR.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordPosition_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-wordPosition_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-wordPosition_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.32MB|\n|词表大小|636038|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-wordPosition_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-wordPosition_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - 
`words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-wordPosition_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-wordPosition_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-wordPosition_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordPosition_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-wordPosition_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-wordPosition_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-wordPosition.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-word_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_context_word-word_dim300\n|模型名称|w2v_baidu_encyclopedia_context_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|677.74MB|\n|词表大小|635952|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_context_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - 
```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_context_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_context_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_context_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_context_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_context_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.context.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_bigram-char_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_bigram-char_dim300\n|模型名称|w2v_baidu_encyclopedia_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.29MB|\n|词表大小|635976|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-character_char1-1_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-character_char1-1_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.15MB|\n|词表大小|636038|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-1_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-character_char1-1_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-character_char1-1_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-1_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-1_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-character_char1-1_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-character.char1-1.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-character_char1-2_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-character_char1-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.30MB|\n|词表大小|636038|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-character_char1-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-character_char1-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-character_char1-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-character.char1-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
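The `cosine_sim` and `dot` methods documented in the README above are plain vector operations over the looked-up embeddings. A minimal pure-Python sketch of the math they compute, using small hypothetical vectors rather than the real 300-dim pretrained weights:

```python
import math

def dot(vec_a, vec_b):
    # Inner product of two equal-length vectors.
    return sum(x * y for x, y in zip(vec_a, vec_b))

def cosine_sim(vec_a, vec_b):
    # Cosine similarity: inner product normalized by the two vector norms.
    norm_a = math.sqrt(dot(vec_a, vec_a))
    norm_b = math.sqrt(dot(vec_b, vec_b))
    return dot(vec_a, vec_b) / (norm_a * norm_b)

# Hypothetical 3-dim vectors standing in for the real 300-dim embeddings.
vec_cn = [1.0, 2.0, 2.0]
vec_us = [2.0, 4.0, 4.0]   # same direction, so cosine similarity is 1.0
print(dot(vec_cn, vec_us))         # 18.0
print(cosine_sim(vec_cn, vec_us))  # 1.0
```

The real module first resolves each word to its vector (substituting `unknown_token` for OOV words, as the README notes) and then applies exactly these two formulas.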
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-character_char1-4_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-character_char1-4_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.51MB|\n|词表大小|636038|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-4_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-character_char1-4_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - 
**参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-character_char1-4_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-character_char1-4_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-character_char1-4_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-character_char1-4_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-character.char1-4.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
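The `search` API described in the READMEs accepts a single word, a list of words, or a vocab index. A toy sketch of that dispatch, using a hypothetical three-entry vocab and 2-dim vectors (the real module resolves against its ~636k-entry, 300-dim table and likewise falls back to the unknown token for OOV input):

```python
from typing import List, Union

# Hypothetical mini vocab / embedding table standing in for the real model's.
VOCAB = {"[UNK]": 0, "中国": 1, "美国": 2}
TABLE = [[0.0, 0.0], [0.5, 1.0], [0.4, 0.9]]

def search(words: Union[List[str], str, int]) -> List[List[float]]:
    if isinstance(words, int):
        # An int is treated as a vocab index, looked up directly.
        return [TABLE[words]]
    if isinstance(words, str):
        # A single word is handled as a one-element list.
        words = [words]
    # OOV words fall back to the unknown token, mirroring the documented behavior.
    return [TABLE[VOCAB.get(w, VOCAB["[UNK]"])] for w in words]

print(search("中国"))            # [[0.5, 1.0]]
print(search(["中国", "美国"]))  # [[0.5, 1.0], [0.4, 0.9]]
print(search(2))                 # [[0.4, 0.9]]
```

Words absent from `VOCAB` return the `[UNK]` row, which is why the READMEs caution that `cosine_sim` and `dot` silently operate on `unknown_token` vectors for OOV input.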
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.48MB|\n|词表大小|635977|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-ngram.1-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
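The serving examples in the READMEs all post a JSON body of the form `{"data": word_pairs}` to `/predict/<module_name>` on port 8866. A small helper sketching how that URL, header, and payload are assembled (the module name passed in below is just one example from this directory; the host and port are the defaults the READMEs assume):

```python
import json

def build_request(module_name, word_pairs):
    # PaddleHub Serving routes predictions under /predict/<module_name>;
    # 8866 is the default port mentioned in the READMEs.
    url = "http://127.0.0.1:8866/predict/" + module_name
    headers = {"Content-Type": "application/json"}
    # The word pairs are passed under the "data" key, one cosine_sim call per pair.
    body = json.dumps({"data": word_pairs})
    return url, headers, body

url, headers, body = build_request(
    "w2v_baidu_encyclopedia_target_word-ngram_1-2_dim300",
    [["中国", "美国"], ["今天", "明天"]])
print(url)
print(json.loads(body)["data"])
```

The returned triple plugs directly into `requests.post(url=url, headers=headers, data=body)` as shown in each README.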
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|671.27MB|\n|词表大小|628669|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-ngram_1-3_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-ngram.1-3.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|7.28GB|\n|词表大小|6969069|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-ngram_2-2_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-ngram.2-2.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordLR_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-wordLR_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-wordLR_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|678.22MB|\n|词表大小|635958|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-wordLR_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-wordLR_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-wordLR_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-wordLR_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-wordLR_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordLR_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordLR_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-wordLR_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-wordLR.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordPosition_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-wordPosition_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-wordPosition_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|679.32MB|\n|词表大小|636038|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-wordPosition_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-wordPosition_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 
需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-wordPosition_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-wordPosition_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-wordPosition_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordPosition_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-wordPosition_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-wordPosition_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-wordPosition.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-word_dim300/README.md",
    "content": "# w2v_baidu_encyclopedia_target_word-word_dim300\n|模型名称|w2v_baidu_encyclopedia_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|baidu_encyclopedia|\n|是否支持Fine-tuning|否|\n|文件大小|678.21MB|\n|词表大小|635965|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_baidu_encyclopedia_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - 
```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_baidu_encyclopedia_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_baidu_encyclopedia_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_baidu_encyclopedia_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_baidu_encyclopedia_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_baidu_encyclopedia_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.baidu_encyclopedia.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_bigram-char_dim300/README.md",
    "content": "# w2v_financial_target_bigram-char_dim300\n|模型名称|w2v_financial_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|financial|\n|是否支持Fine-tuning|否|\n|文件大小|499.52MB|\n|词表大小|467163|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_financial_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_financial_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_financial_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_financial_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_financial_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_financial_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.financial.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-bigram_dim300/README.md",
    "content": "# w2v_financial_target_word-bigram_dim300\n|模型名称|w2v_financial_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|financial|\n|是否支持Fine-tuning|否|\n|文件大小|499.54MB|\n|词表大小|467331|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_financial_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_financial_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_financial_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_financial_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_financial_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_financial_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.financial.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-char_dim300/README.md",
    "content": "# w2v_financial_target_word-char_dim300\n|模型名称|w2v_financial_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|financial|\n|是否支持Fine-tuning|否|\n|文件大小|499.17MB|\n|词表大小|467343|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_financial_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_financial_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: 
str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_financial_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_financial_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_financial_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_financial_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.financial.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-word_dim300/README.md",
    "content": "# w2v_financial_target_word-word_dim300\n|模型名称|w2v_financial_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|financial|\n|是否支持Fine-tuning|否|\n|文件大小|498.94MB|\n|词表大小|467324|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_financial_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_financial_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: 
str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_financial_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_financial_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_financial_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_financial_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_financial_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.financial.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_bigram-char_dim300/README.md",
    "content": "# w2v_literature_target_bigram-char_dim300\n|模型名称|w2v_literature_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|literature|\n|是否支持Fine-tuning|否|\n|文件大小|200.69MB|\n|词表大小|187975|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_literature_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_literature_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n    
    word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_literature_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_literature_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_literature_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_literature_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.literature.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-bigram_dim300/README.md",
    "content": "# w2v_literature_target_word-bigram_dim300\n|模型名称|w2v_literature_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|literature|\n|是否支持Fine-tuning|否|\n|文件大小|200.59MB|\n|词表大小|187962|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_literature_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_literature_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n    
    word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_literature_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_literature_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_literature_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_literature_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.literature.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-char_dim300/README.md",
    "content": "# w2v_literature_target_word-char_dim300\n|模型名称|w2v_literature_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|literature|\n|是否支持Fine-tuning|否|\n|文件大小|200.44MB|\n|词表大小|187980|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_literature_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_literature_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_literature_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_literature_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_literature_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_literature_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.literature.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-word_dim300/README.md",
    "content": "# w2v_literature_target_word-word_dim300\n|模型名称|w2v_literature_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|literature|\n|是否支持Fine-tuning|否|\n|文件大小|200.28MB|\n|词表大小|187961|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_literature_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_literature_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_literature_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_literature_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_literature_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_literature_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_literature_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.literature.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-char_dim300/README.md",
    "content": "# w2v_mixed-large_target_word-char_dim300\n|模型名称|w2v_mixed-large_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|mixed|\n|是否支持Fine-tuning|否|\n|文件大小|1.35GB|\n|词表大小|1292552|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_mixed-large_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_mixed-large_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多API详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### 第一步: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_mixed-large_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求并获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，Content-Type应指定为json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_mixed-large_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_mixed-large_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_mixed-large_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.mixed-large.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-word_dim300/README.md",
    "content": "# w2v_mixed-large_target_word-word_dim300\n|模型名称|w2v_mixed-large_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|mixed|\n|是否支持Fine-tuning|否|\n|文件大小|1.35GB|\n|词表大小|1292483|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_mixed-large_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_mixed-large_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_mixed-large_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_mixed-large_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_mixed-large_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_mixed-large_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_mixed-large_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.mixed-large.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_bigram-char_dim300/README.md",
    "content": "# w2v_people_daily_target_bigram-char_dim300\n|模型名称|w2v_people_daily_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|people_daily|\n|是否支持Fine-tuning|否|\n|文件大小|379.96MB|\n|词表大小|356055|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_people_daily_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_people_daily_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def 
cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_people_daily_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_people_daily_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_people_daily_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_people_daily_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.people_daily.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-bigram_dim300/README.md",
    "content": "# w2v_people_daily_target_word-bigram_dim300\n|模型名称|w2v_people_daily_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|people_daily|\n|是否支持Fine-tuning|否|\n|文件大小|379.68MB|\n|词表大小|355991|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_people_daily_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_people_daily_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_people_daily_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_people_daily_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_people_daily_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_people_daily_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.people_daily.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-char_dim300/README.md",
    "content": "# w2v_people_daily_target_word-char_dim300\n|模型名称|w2v_people_daily_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|people_daily|\n|是否支持Fine-tuning|否|\n|文件大小|379.45MB|\n|词表大小|355998|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_people_daily_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_people_daily_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n  
      word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_people_daily_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_people_daily_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_people_daily_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_people_daily_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.people_daily.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-word_dim300/README.md",
    "content": "# w2v_people_daily_target_word-word_dim300\n|模型名称|w2v_people_daily_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|people_daily|\n|是否支持Fine-tuning|否|\n|文件大小|378.93MB|\n|词表大小|355989|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_people_daily_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_people_daily_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n  
      word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_people_daily_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_people_daily_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_people_daily_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_people_daily_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_people_daily_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.people_daily.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-bigram_dim300/README.md",
    "content": "# w2v_sikuquanshu_target_word-bigram_dim300\n|模型名称|w2v_sikuquanshu_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sikuquanshu|\n|是否支持Fine-tuning|否|\n|文件大小|20.77MB|\n|词表大小|19529|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sikuquanshu_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sikuquanshu_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n 
       word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sikuquanshu_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sikuquanshu_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sikuquanshu_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sikuquanshu_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sikuquanshu.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-word_dim300/README.md",
    "content": "# w2v_sikuquanshu_target_word-word_dim300\n|模型名称|w2v_sikuquanshu_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sikuquanshu|\n|是否支持Fine-tuning|否|\n|文件大小|20.70MB|\n|词表大小|19529|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sikuquanshu_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sikuquanshu_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        
word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sikuquanshu_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sikuquanshu_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sikuquanshu_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sikuquanshu_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sikuquanshu_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sikuquanshu.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_bigram-char_dim300/README.md",
    "content": "# w2v_sogou_target_bigram-char_dim300\n|模型名称|w2v_sogou_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sogou|\n|是否支持Fine-tuning|否|\n|文件大小|389.81MB|\n|词表大小|365112|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sogou_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sogou_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n       
 word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sogou_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sogou_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sogou_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sogou_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sogou.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-bigram_dim300/README.md",
    "content": "# w2v_sogou_target_word-bigram_dim300\n|模型名称|w2v_sogou_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sogou|\n|是否支持Fine-tuning|否|\n|文件大小|388.66MB|\n|词表大小|364994|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sogou_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sogou_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n       
 word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sogou_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sogou_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sogou_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sogou_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sogou.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-char_dim300/README.md",
    "content": "# w2v_sogou_target_word-char_dim300\n|模型名称|w2v_sogou_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sogou|\n|是否支持Fine-tuning|否|\n|文件大小|389.89MB|\n|词表大小|365078|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sogou_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sogou_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sogou_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sogou_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sogou_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sogou_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sogou.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-word_dim300/README.md",
    "content": "# w2v_sogou_target_word-word_dim300\n|模型名称|w2v_sogou_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|sogou|\n|是否支持Fine-tuning|否|\n|文件大小|388.66MB|\n|词表大小|364992|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_sogou_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_sogou_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_sogou_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_sogou_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_sogou_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_sogou_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_sogou_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.sogou.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_bigram-char_dim300/README.md",
    "content": "# w2v_weibo_target_bigram-char_dim300\n|模型名称|w2v_weibo_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|weibo|\n|是否支持Fine-tuning|否|\n|文件大小|208.24MB|\n|词表大小|195199|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_weibo_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_weibo_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n       
 word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_weibo_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_weibo_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_weibo_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_weibo_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.weibo.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-bigram_dim300/README.md",
    "content": "# w2v_weibo_target_word-bigram_dim300\n|模型名称|w2v_weibo_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|weibo|\n|是否支持Fine-tuning|否|\n|文件大小|208.19MB|\n|词表大小|195204|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_weibo_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_weibo_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n       
 word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_weibo_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_weibo_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_weibo_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_weibo_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.weibo.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-char_dim300/README.md",
    "content": "# w2v_weibo_target_word-char_dim300\n|模型名称|w2v_weibo_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|weibo|\n|是否支持Fine-tuning|否|\n|文件大小|208.03MB|\n|词表大小|195204|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_weibo_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_weibo_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_weibo_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_weibo_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_weibo_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_weibo_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.weibo.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-word_dim300/README.md",
    "content": "# w2v_weibo_target_word-word_dim300\n|模型名称|w2v_weibo_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|weibo|\n|是否支持Fine-tuning|否|\n|文件大小|207.94MB|\n|词表大小|195204|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_weibo_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_weibo_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_weibo_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_weibo_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_weibo_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_weibo_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_weibo_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.weibo.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_bigram-char_dim300/README.md",
    "content": "# w2v_wiki_target_bigram-char_dim300\n|模型名称|w2v_wiki_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|wiki|\n|是否支持Fine-tuning|否|\n|文件大小|375.98MB|\n|词表大小|352274|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_wiki_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_wiki_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_wiki_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_wiki_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_wiki_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_wiki_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.wiki.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-bigram_dim300/README.md",
    "content": "# w2v_wiki_target_word-bigram_dim300\n|模型名称|w2v_wiki_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|wiki|\n|是否支持Fine-tuning|否|\n|文件大小|375.72MB|\n|词表大小|352219|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_wiki_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_wiki_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_wiki_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法的时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_wiki_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_wiki_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_wiki_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.wiki.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-char_dim300/README.md",
    "content": "# w2v_wiki_target_word-char_dim300\n|模型名称|w2v_wiki_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|wiki|\n|是否支持Fine-tuning|否|\n|文件大小|375.52MB|\n|词表大小|352223|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_wiki_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_wiki_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: 
str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_wiki_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_wiki_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_wiki_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_wiki_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.wiki.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-word_dim300/README.md",
    "content": "# w2v_wiki_target_word-word_dim300\n|模型名称|w2v_wiki_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|wiki|\n|是否支持Fine-tuning|否|\n|文件大小|374.95MB|\n|词表大小|352219|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_wiki_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_wiki_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        word_b: 
str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_wiki_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_wiki_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_wiki_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_wiki_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_wiki_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.wiki.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_bigram-char_dim300/README.md",
    "content": "# w2v_zhihu_target_bigram-char_dim300\n|模型名称|w2v_zhihu_target_bigram-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|zhihu|\n|是否支持Fine-tuning|否|\n|文件大小|277.35MB|\n|词表大小|259755|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_zhihu_target_bigram-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_zhihu_target_bigram-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n       
 word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_zhihu_target_bigram-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_zhihu_target_bigram-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_zhihu_target_bigram-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_bigram-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_bigram-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_zhihu_target_bigram-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.zhihu.target.bigram-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-bigram_dim300/README.md",
    "content": "# w2v_zhihu_target_word-bigram_dim300\n|模型名称|w2v_zhihu_target_word-bigram_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|zhihu|\n|是否支持Fine-tuning|否|\n|文件大小|277.53MB|\n|词表大小|259885|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_zhihu_target_word-bigram_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_zhihu_target_word-bigram_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n
        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_zhihu_target_word-bigram_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... ]]\n
    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_zhihu_target_word-bigram_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_zhihu_target_word-bigram_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-bigram_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-bigram_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_zhihu_target_word-bigram_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.zhihu.target.word-bigram.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-char_dim300/README.md",
    "content": "# w2v_zhihu_target_word-char_dim300\n|模型名称|w2v_zhihu_target_word-char_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|zhihu|\n|是否支持Fine-tuning|否|\n|文件大小|277.40MB|\n|词表大小|259940|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_zhihu_target_word-char_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_zhihu_target_word-char_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_zhihu_target_word-char_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_zhihu_target_word-char_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_zhihu_target_word-char_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-char_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-char_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_zhihu_target_word-char_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.zhihu.target.word-char.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-word_dim300/README.md",
    "content": "# w2v_zhihu_target_word-word_dim300\n|模型名称|w2v_zhihu_target_word-word_dim300|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|w2v|\n|数据集|zhihu|\n|是否支持Fine-tuning|否|\n|文件大小|276.98MB|\n|词表大小|259871|\n|最新更新日期|2021-04-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - PaddleHub提供多个开源的预训练Embedding模型。这些Embedding模型可根据不同语料、不同训练方式和不同的维度进行区分，关于模型的具体信息可参考PaddleNLP的文档：[Embedding模型汇总](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/docs/embeddings.md)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install w2v_zhihu_target_word-word_dim300\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    embedding = hub.Module(name='w2v_zhihu_target_word-word_dim300')\n\n    # 获取单词的embedding\n    embedding.search(\"中国\")\n    # 计算两个词向量的余弦相似度\n    embedding.cosine_sim(\"中国\", \"美国\")\n    # 计算两个词向量的内积\n    embedding.dot(\"中国\", \"美国\")\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        *args,\n        **kwargs\n    )\n    ```\n\n    - 创建一个Embedding Module对象，默认无需参数。\n\n    - **参数**\n      - `*args`： 用户额外指定的列表类型的参数。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n  - ```python\n    def search(\n        words: Union[List[str], str, int],\n    )\n    ```\n\n    - 获取一个或多个词的embedding。输入可以是`str`、`List[str]`和`int`类型，分别代表获取一个词，多个词和指定词编号的embedding，词的编号和模型的词典相关，词典可通过模型实例的`vocab`属性获取。\n\n    - **参数**\n      - `words`： 需要获取的词向量的词、词列表或者词编号。\n\n\n  - ```python\n    def cosine_sim(\n        word_a: str,\n        
word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的余弦相似度。需要注意的是`word_a`和`word_b`都需要是词典里的单词，否则将会被认为是OOV(Out-Of-Vocabulary)，同时被替换为`unknown_token`。\n\n    - **参数**\n      - `word_a`： 需要计算余弦相似度的单词a。\n      - `word_b`： 需要计算余弦相似度的单词b。\n\n\n  - ```python\n    def dot(\n        word_a: str,\n        word_b: str,\n    )\n    ```\n\n    - 计算两个词embedding的内积。对于输入单词同样需要注意OOV问题。\n\n    - **参数**\n      - `word_a`： 需要计算内积的单词a。\n      - `word_b`： 需要计算内积的单词b。\n\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取本地词表文件的路径信息。\n\n\n  - ```python\n    def get_tokenizer(*args, **kwargs)\n    ```\n\n    - 获取当前模型的tokenizer，返回一个JiebaTokenizer的实例，当前只支持中文embedding模型。\n\n    - **参数**\n      - `*args`： 额外传递的列表形式的参数。\n      - `**kwargs`： 额外传递的字典形式的参数。\n\n    - 关于额外参数的详情可参考[paddlenlp.data.tokenizer.JiebaTokenizer](https://github.com/PaddlePaddle/models/blob/release/2.0-beta/PaddleNLP/paddlenlp/data/tokenizer.py)\n\n  - 更多api详情和用法可参考[paddlenlp.embeddings](https://github.com/PaddlePaddle/models/tree/release/2.0-beta/PaddleNLP/paddlenlp/embeddings)\n\n\n## 四、部署服务\n\n- 通过PaddleHub Serving，可以部署一个在线获取两个词向量的余弦相似度的服务。\n\n- ### Step1: 启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m w2v_zhihu_target_word-word_dim300\n    ```\n\n  - 这样就完成了一个获取词向量的余弦相似度服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步: 发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于计算余弦相似度的单词对[[word_a, word_b], [word_a, word_b], ... 
]\n    word_pairs = [[\"中国\", \"美国\"], [\"今天\", \"明天\"]]\n    # 以key的方式指定word_pairs传入预测方法时的参数，此例中为\"data\"，对于每一对单词，调用cosine_sim进行余弦相似度的计算\n    data = {\"data\": word_pairs}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/w2v_zhihu_target_word-word_dim300\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  优化模型\n  - ```shell\n    $ hub install w2v_zhihu_target_word-word_dim300==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-word_dim300/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/embedding/w2v_zhihu_target_word-word_dim300/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlenlp.embeddings import TokenEmbedding\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import EmbeddingModule\n\n\n@moduleinfo(\n    name=\"w2v_zhihu_target_word-word_dim300\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=EmbeddingModule)\nclass Embedding(TokenEmbedding):\n    \"\"\"\n    Embedding model\n    \"\"\"\n    embedding_name = \"w2v.zhihu.target.word-word.dim300\"\n\n    def __init__(self, *args, **kwargs):\n        super(Embedding, self).__init__(embedding_name=self.embedding_name, *args, **kwargs)\n"
  },
  {
    "path": "modules/text/embedding/word2vec_skipgram/README.md",
    "content": "# word2vec_skipgram\n|模型名称|word2vec_skipgram|\n| :--- | :---: | \n|类别|文本-词嵌入|\n|网络|skip-gram|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|861MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - Word2vec是常用的词嵌入（word embedding）模型。该PaddleHub Module基于Skip-gram模型，在海量百度搜索数据集下预训练得到中文单词预训练词嵌入。其支持Fine-tune。Word2vec的预训练数据集的词汇表大小为1700249，word embedding维度为128。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.8.2\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install word2vec_skipgram\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API\n\n- ### 1、Finetune代码示例\n\n```python\nimport paddlehub as hub\n\n# Load word2vec pretrained model\nmodule = hub.Module(name=\"word2vec_skipgram\")\ninputs, outputs, program = module.context(trainable=True)\n\n# Must feed all the tensor of module need\nword_ids = inputs[\"text\"]\n\n# Use the pretrained word embeddings\nembedding = outputs[\"emb\"]\n```\n\n- ### 2、API\n\n  - ```python\n    context(trainable=False, max_seq_len=128, num_slots=1)\n    ```\n\n    - **参数**\n\n      - trainable(bool): trainable=True表示program中的参数在Fine-tune时需要微调，否则保持不变。\n      - max_seq_len(int): 模型使用的最大序列长度。\n      - num_slots(int): 输入到模型所需要的文本个数，如完成单句文本分类任务，则num_slots=1；完成pointwise文本匹配任务，则num_slots=2；完成pairtwise文本匹配任务，则num_slots=3；\n\n    - **返回**\n\n      - inputs(dict): program的输入变量\n      - outputs(dict): program的输出变量\n      - main_program(Program): 带有预训练参数的program\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n"
  },
  {
    "path": "modules/text/embedding/word2vec_skipgram/module.py",
    "content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport io\nimport os\n\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for line in f:\n            parts = line.strip().split(\"\\t\")\n            vocab[parts[0]] = int(parts[1])\n\n    return vocab\n\n\n@moduleinfo(\n    name=\"word2vec_skipgram\",\n    version=\"1.1.0\",\n    summary=\"Chinese word embedding based on the SkipGram.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass Word2vecSkipGram(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"model\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n\n    def context(self, trainable=False, max_seq_len=128, num_slots=1):\n        \"\"\"\n        Get the input ,output and program of the 
pretrained word2vec_skipgram\n\n        Args:\n             trainable(bool): whether to fine-tune the pretrained parameters of word2vec_skipgram or not.\n             max_seq_len (int): It will limit the total sequence returned so that it has a maximum length.\n             num_slots(int): The number of data inputs fed to the model, selected from the following options:\n\n                 - 1(default): only one data input is fed to the model, e.g. the module is used for a sentence classification task.\n                 - 2: two data inputs are fed to the model, e.g. the module is used for a text matching task (point-wise).\n                 - 3: three data inputs are fed to the model, e.g. the module is used for a text matching task (pair-wise).\n\n        Returns:\n             inputs(dict): the input variables of word2vec_skipgram (words)\n             outputs(dict): the output variables of input words (word embeddings)\n             main_program(Program): the main_program of word2vec_skipgram with pretrained parameters\n        \"\"\"\n        assert num_slots >= 1 and num_slots <= 3, \"num_slots must be 1, 2, or 3, but the input is %d\" % num_slots\n        main_program = fluid.Program()\n        startup_program = fluid.Program()\n        with fluid.program_guard(main_program, startup_program):\n            with fluid.unique_name.guard():\n\n                w_param_attrs = fluid.ParamAttr(\n                    name=\"embedding_0.w_0\",\n                    initializer=fluid.initializer.TruncatedNormal(scale=0.02),\n                    trainable=trainable)\n\n                text_1 = fluid.data(name='text', shape=[-1, max_seq_len], dtype='int64', lod_level=0)\n                emb_1 = fluid.embedding(\n                    input=text_1,\n                    is_sparse=True,\n                    size=[len(self.vocab), 128],\n                    padding_idx=len(self.vocab) - 1,\n                    dtype='float32',\n                    
param_attr=w_param_attrs)\n                emb_1_name = emb_1.name\n                data_list = [text_1]\n                emb_name_list = [emb_1_name]\n\n                if num_slots > 1:\n                    text_2 = fluid.data(name='text_2', shape=[-1, max_seq_len], dtype='int64', lod_level=0)\n                    emb_2 = fluid.embedding(\n                        input=text_2,\n                        is_sparse=True,\n                        size=[len(self.vocab), 128],\n                        padding_idx=len(self.vocab) - 1,\n                        dtype='float32',\n                        param_attr=w_param_attrs)\n                    emb_2_name = emb_2.name\n                    data_list.append(text_2)\n                    emb_name_list.append(emb_2_name)\n\n                if num_slots > 2:\n                    text_3 = fluid.data(name='text_3', shape=[-1, max_seq_len], dtype='int64', lod_level=0)\n                    emb_3 = fluid.embedding(\n                        input=text_3,\n                        is_sparse=True,\n                        size=[len(self.vocab), 128],\n                        padding_idx=len(self.vocab) - 1,\n                        dtype='float32',\n                        param_attr=w_param_attrs)\n                    emb_3_name = emb_3.name\n                    data_list.append(text_3)\n                    emb_name_list.append(emb_3_name)\n\n                variable_names = filter(lambda v: v not in ['text', 'text_2', 'text_3'],\n                                        list(main_program.global_block().vars.keys()))\n\n                prefix_name = \"@HUB_{}@\".format(self.name)\n                add_vars_prefix(program=main_program, prefix=prefix_name, vars=variable_names)\n                for param in main_program.global_block().iter_parameters():\n                    param.trainable = trainable\n\n                place = fluid.CPUPlace()\n                exe = fluid.Executor(place)\n\n                # load the pretrained model\n  
              def if_exist(var):\n                    return os.path.exists(os.path.join(self.pretrained_model_path, var.name))\n\n                fluid.io.load_vars(exe, self.pretrained_model_path, predicate=if_exist)\n\n                inputs = {}\n                outputs = {}\n                for index, data in enumerate(data_list):\n                    if index == 0:\n                        inputs['text'] = data\n                        outputs['emb'] = main_program.global_block().vars[prefix_name + emb_name_list[0]]\n                    else:\n                        inputs['text_%s' % (index + 1)] = data\n                        outputs['emb_%s' % (index + 1)] = main_program.global_block().vars[prefix_name +\n                                                                                           emb_name_list[index]]\n\n                return inputs, outputs, main_program\n\n    def get_vocab_path(self):\n        return self.vocab_path\n\n\nif __name__ == \"__main__\":\n    w2v = Word2vecSkipGram()\n    i, o, p = w2v.context(num_slots=3)\n    print(w2v.get_vocab_path())\n"
  },
  {
    "path": "modules/text/language_model/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【语言模型】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 语言模型\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [词嵌入模型](https://www.paddlepaddle.org.cn/hubdetail?name=word2vec_skipgram&en_category=SemanticModel) |在海量百度搜索数据集下预训练得到中文单词预训练词嵌入。其支持Fine-tune。Word2vec的预训练数据集的词汇表大小为1700249，word embedding维度为128。 |\n| [文本相似度](https://www.paddlepaddle.org.cn/hubdetail?name=simnet_bow&en_category=SemanticModel) |根据用户输入的两个文本，计算出文本相似度得分。 |\n| [ERNIE](https://www.paddlepaddle.org.cn/hubdetail?name=ERNIE&en_category=SemanticModel) |基于百科类、资讯类、论坛对话类数据等中文语料自研模型，其可用于文本分类、序列标注、阅读理解等任务。 |\n"
  },
  {
    "path": "modules/text/language_model/README_en.md",
    "content": "## **For better user experience, refer to the Web official document -> [Language Model](https://www.paddlepaddle.org.cn/hublist)**\n\n### Language Model\n\n- Recommended Model\n\n| Model Name                                                   | Module Introduction                                          |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Word embedding model](https://www.paddlepaddle.org.cn/hubdetail?name=word2vec_skipgram&en_category=SemanticModel) | Chinese word embeddings pre-trained on a massive Baidu search dataset. It supports Fine-tune. The vocabulary size of Word2vec's pre-training dataset is 1700249, and the word embedding dimension is 128. |\n| [Text similarity](https://www.paddlepaddle.org.cn/hubdetail?name=simnet_bow&en_category=SemanticModel) | Calculates a similarity score for the two texts entered by a user. |\n| [ERNIE](https://www.paddlepaddle.org.cn/hubdetail?name=ERNIE&en_category=SemanticModel) | A self-developed model pre-trained on Chinese corpora such as encyclopedia, news, and forum dialogue data; it can be used for tasks such as text classification, sequence labeling, and reading comprehension. |\n"
  },
  {
    "path": "modules/text/language_model/albert-base-v1/README.md",
    "content": "# albert-base-v1\n|模型名称|albert-base-v1|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-base-v1|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|90MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-base-v1\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-base-v1',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune 
api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-base-v1\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 
指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-base-v1\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-base-v1/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-base-v1/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-base-v1\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(pretrained_model_name_or_path='albert-base-v1',\n                                                                         num_classes=self.num_classes,\n                                                                         **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(pretrained_model_name_or_path='albert-base-v1',\n                                                                      num_classes=self.num_classes,\n                                                                      **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-base-v1', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-base-v1', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, 
self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                
num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n          
      correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-base-v1', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-base-v2/README.md",
    "content": "# albert-base-v2\n|模型名称|albert-base-v2|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-base-v2|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|90MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-base-v2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-base-v2',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune 
api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-base-v2\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 
指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-base-v2\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-base-v2/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-base-v2/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-base-v2\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(pretrained_model_name_or_path='albert-base-v2',\n                                                                         num_classes=self.num_classes,\n                                                                         **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(pretrained_model_name_or_path='albert-base-v2',\n                                                                      num_classes=self.num_classes,\n                                                                      **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-base-v2', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-base-v2', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, 
self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                
num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n          
      correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-base-v2', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-base/README.md",
    "content": "# albert-chinese-base\n|模型名称|albert-chinese-base|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-base|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|77MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-base\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-base',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`：序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-base\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n
    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-base\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-base/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-base/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-base\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-base', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-base', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n     
       logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n  
              _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n       
     return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-base', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-large/README.md",
    "content": "# albert-chinese-large\n|模型名称|albert-chinese-large|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-large|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|112MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-large\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-large',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`：序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-large\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n
    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-large\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-large/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-large/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-large\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-large', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-large', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n 
           logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), 
num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, 
pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-large', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-small/README.md",
    "content": "# albert-chinese-small\n|模型名称|albert-chinese-small|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-small|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|44MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-small\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-small',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-small\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-small\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-small/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-small/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-small\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-small', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-small', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n 
           logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), 
num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, 
pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-small', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-tiny/README.md",
    "content": "# albert-chinese-tiny\n|模型名称|albert-chinese-tiny|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-tiny|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|40MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-tiny\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-tiny',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-tiny\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-tiny\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-tiny/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-tiny/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-tiny\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-tiny', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-tiny', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-tiny', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-tiny', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n     
       logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n  
              _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n       
     return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-tiny', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-xlarge/README.md",
    "content": "# albert-chinese-xlarge\n|模型名称|albert-chinese-xlarge|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-xlarge|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|346MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-xlarge\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-xlarge',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-xlarge\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-xlarge\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-xlarge/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-xlarge/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-xlarge\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-xlarge', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-xlarge', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-xlarge', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-xlarge', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            
self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), 
num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n   
         sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-xlarge', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-xxlarge/README.md",
    "content": "# albert-chinese-xxlarge\n|模型名称|albert-chinese-xxlarge|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-chinese-xxlarge|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|1.3GB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-chinese-xxlarge\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-chinese-xxlarge',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-chinese-xxlarge\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import 
requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-chinese-xxlarge\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-chinese-xxlarge/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-chinese-xxlarge/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-chinese-xxlarge\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-xxlarge', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-chinese-xxlarge', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-xxlarge', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-chinese-xxlarge', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            
self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), 
num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n   
         sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-chinese-xxlarge', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v1/README.md",
    "content": "# albert-xxlarge-v1\n|模型名称|albert-xxlarge-v1|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-xxlarge-v1|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|1.3GB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-xxlarge-v1\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-xxlarge-v1',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub 
Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-xxlarge-v1\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 
指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-xxlarge-v1\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v1/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v1/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-xxlarge-v1\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-xxlarge-v1', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v1',\n                                                                      num_classes=self.num_classes,\n                                                                      **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v1', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v1', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and 
os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, 
seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n    
            return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v1', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v2/README.md",
    "content": "# albert-xxlarge-v2\n|模型名称|albert-xxlarge-v2|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|albert-xxlarge-v2|\n|数据集|-|\n|是否支持Fine-tuning|是|\n|模型大小|1.3GB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - ALBERT针对当前预训练模型参数量过大的问题，提出了以下改进方案：\n\n    - 嵌入向量参数化的因式分解。ALBERT对词嵌入参数进行了因式分解，先将单词映射到一个低维的词嵌入空间E，然后再将其映射到高维的隐藏空间H。\n\n    - 跨层参数共享。ALBERT共享了层之间的全部参数。\n\n更多详情请参考[ALBERT论文](https://arxiv.org/abs/1909.11942)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.2.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install albert-xxlarge-v2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='albert-xxlarge-v2',\n    version='1.0.0',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub 
Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\[label\_1, label\_2, …,\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\[\[token\_1, token\_2, …,\], \[token\_1, token\_2, …,\], …,\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\[\[sample\_a\_pooled\_feature, sample\_a\_seq\_feature\], \[sample\_b\_pooled\_feature, sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m albert-xxlarge-v2\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果。\n\n  - ```python\n    import requests\n    import json\n\n    # 
指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/albert-xxlarge-v2\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v2/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/albert-xxlarge-v2/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.albert.modeling import AlbertForSequenceClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertForTokenClassification\nfrom paddlenlp.transformers.albert.modeling import AlbertModel\nfrom paddlenlp.transformers.albert.tokenizer import AlbertTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(name=\"albert-xxlarge-v2\",\n            version=\"1.0.0\",\n            summary=\"\",\n            author=\"Baidu\",\n            author_email=\"\",\n            type=\"nlp/semantic_model\",\n            meta=TransformerModule)\nclass Albert(nn.Layer):\n    \"\"\"\n    ALBERT model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Albert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = 
num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AlbertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='albert-xxlarge-v2', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AlbertForTokenClassification.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v2',\n                                                                      num_classes=self.num_classes,\n                                                                      **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v2', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AlbertModel.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v2', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and 
os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, 
seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n    
            return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AlbertTokenizer.from_pretrained(pretrained_model_name_or_path='albert-xxlarge-v2', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-base-cased/README.md",
    "content": "```shell\n$ hub install bert-base-cased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-base-cased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-base-cased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果。\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-base-cased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n## 查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-base-cased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-base-cased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-base-cased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_cased_L-12_H-768_A-12, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-cased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] 
* 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-cased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return 
probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / 
title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-base-cased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-base-chinese/README.md",
    "content": "# bert-base-chinese\n|模型名称|bert-base-chinese|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|bert-base-chinese|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|681MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  width=750 height=280 hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install bert-base-chinese\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-base-chinese',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub 
Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\[label\_1, label\_2, …,\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\[\[token\_1, token\_2, …,\], \[token\_1, token\_2, …,\], …,\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\[\[sample\_a\_pooled\_feature, sample\_a\_seq\_feature\], \[sample\_b\_pooled\_feature, sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m bert-base-chinese\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果。\n\n  - ```python\n    import requests\n    import json\n\n    # 
指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/bert-base-chinese\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python 2的兼容问题\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图版本，接口有所变化\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`  \n  ```shell\n  $ hub install bert-base-chinese==2.0.2\n  ```\n"
  },
  {
    "path": "modules/text/language_model/bert-base-chinese/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-base-chinese/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-base-chinese\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_chinese_L-12_H-768_A-12, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    Bert model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-chinese', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-chinese', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-base-chinese', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-cased/README.md",
    "content": "```shell\n$ hub install bert-base-multilingual-cased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-base-multilingual-cased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-base-multilingual-cased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]}\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-base-multilingual-cased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-cased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-cased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-base-multilingual-cased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_multi_cased_L-12_H-768_A-12, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-cased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            
self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-cased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = 
self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n         
   title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(\n            pretrained_model_name_or_path='bert-base-multilingual-cased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-uncased/README.md",
    "content": "```shell\n$ hub install bert-base-multilingual-uncased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-base-multilingual-uncased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-base-multilingual-uncased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求并获取预测结果：\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]}\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-base-multilingual-uncased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-uncased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-base-multilingual-uncased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-base-multilingual-uncased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_multi_uncased_L-12_H-768_A-12, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-uncased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            
self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(\n                pretrained_model_name_or_path='bert-base-multilingual-uncased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = 
self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n         
   title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(\n            pretrained_model_name_or_path='bert-base-multilingual-uncased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-base-uncased/README.md",
    "content": "```shell\n$ hub install bert-base-uncased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-base-uncased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-base-uncased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]}\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-base-uncased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-base-uncased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-base-uncased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-base-uncased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_uncased_L-12_H-768_A-12, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-base-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-uncased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-base-uncased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-base-uncased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-large-cased/README.md",
    "content": "```shell\n$ hub install bert-large-cased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-large-cased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-large-cased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]}\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-large-cased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-large-cased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-large-cased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-large-cased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_cased_L-24_H-1024_A-16, 24-layer, 1024-hidden, 16-heads, 340M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-large-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-large-cased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-large-cased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-large-cased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-large-cased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/bert-large-uncased/README.md",
    "content": "```shell\n$ hub install bert-large-uncased==2.0.2\n```\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805)\n\n## API\n\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='bert-large-uncased',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m bert-large-uncased\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]}\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/bert-large-uncased\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/models/tree/develop/PaddleNLP/pretrain_langauge_models/BERT\n\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n"
  },
  {
    "path": "modules/text/language_model/bert-large-uncased/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/bert-large-uncased/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"bert-large-uncased\",\n    version=\"2.0.2\",\n    summary=\n    \"bert_uncased_L-24_H-1024_A-16, 24-layer, 1024-hidden, 16-heads, 340M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Bert(nn.Layer):\n    \"\"\"\n    BERT model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Bert, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-large-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-large-uncased', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-large-uncased', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-large-uncased', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-large-uncased', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm/README.md",
    "content": "# chinese-bert-wwm\n|模型名称|chinese-bert-wwm|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|chinese-bert-wwm|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|391MB|\n|最新更新日期|2021-03-16|\n|贡献者|[ymcui](https://github.com/ymcui)|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805), [Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese-bert-wwm\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='chinese-bert-wwm',\n    version='2.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 
任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m chinese_bert_wwm\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 
配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]}\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/chinese_bert_wwm\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n  ```shell\n  $ hub install chinese-bert-wwm==2.0.1\n  ```\n"
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese-bert-wwm\",\n    version=\"2.0.1\",\n    summary=\n    \"chinese-bert-wwm, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass BertWwm(nn.Layer):\n    \"\"\"\n    BertWwm model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(BertWwm, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-wwm-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-wwm-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-wwm-chinese', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-wwm-chinese', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-wwm-chinese', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm_ext/README.md",
    "content": "# chinese-bert-wwm-ext\n|模型名称|chinese-bert-wwm-ext|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|chinese-bert-wwm-ext|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|391MB|\n|最新更新日期|2021-03-16|\n|贡献者|[ymcui](https://github.com/ymcui)|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[BERT论文](https://arxiv.org/abs/1810.04805), [Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese-bert-wwm-ext\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='chinese-bert-wwm-ext',\n    version='2.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - 
`task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m chinese_bert_wwm_ext\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 
配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]}\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法的时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/chinese_bert_wwm_ext\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n  ```shell\n  $ hub install chinese-bert-wwm-ext==2.0.1\n  ```\n"
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm_ext/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/chinese_bert_wwm_ext/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.bert.modeling import BertForSequenceClassification, BertModel, BertForTokenClassification\nfrom paddlenlp.transformers.bert.tokenizer import BertTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese-bert-wwm-ext\",\n    version=\"2.0.1\",\n    summary=\n    \"chinese-bert-wwm-ext, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass BertWwm(nn.Layer):\n    \"\"\"\n    BertWwm model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(BertWwm, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = BertForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-wwm-ext-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = BertForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='bert-wwm-ext-chinese', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-wwm-ext-chinese', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = BertModel.from_pretrained(pretrained_model_name_or_path='bert-wwm-ext-chinese', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return BertTokenizer.from_pretrained(pretrained_model_name_or_path='bert-wwm-ext-chinese', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/chinese_electra_base/README.md",
    "content": "# chinese-electra-base\n|模型名称|chinese-electra-base|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ELECTRA|\n|数据集|中文维基+通用数据|\n|是否支持Fine-tuning|是|\n|模型大小|390MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/1a5578bfbe1ad629035f7ad1eb3d0bce?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-03-31T06%3A45%3A51Z%2F1800%2F%2F02b8749292f8ba1c606410d0e4e5dbabdf1d367d80da395887775d36424ac13e\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[ELECTRA论文](https://openreview.net/pdf?id=r1xMH1BtvB)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese-electra-base==2.0.2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='chinese-electra-base',\n    version='2.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        
**kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）。\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n    - `results`：list类型，不同任务类型的返回结果如下\n      - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n      - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n        data,\n        use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m chinese-electra-base\n    ```\n\n  - 
这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/chinese-electra-base\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.2\n\n  修复词嵌入模型预测的问题\n"
  },
  {
    "path": "modules/text/language_model/chinese_electra_base/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/chinese_electra_base/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.electra.modeling import ElectraForSequenceClassification, ElectraForTokenClassification, ElectraModel\nfrom paddlenlp.transformers.electra.tokenizer import ElectraTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese-electra-base\",\n    version=\"2.0.2\",\n    summary=\n    \"chinese-electra-base, 12-layer, 768-hidden, 12-heads, 102M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Electra(nn.Layer):\n    \"\"\"\n    Electra model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Electra, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ElectraForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='chinese-electra-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ElectraForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='chinese-electra-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='chinese-electra-base', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='chinese-electra-base', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(query_token_embedding.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(title_token_embedding.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            
title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            return result\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ElectraTokenizer.from_pretrained(pretrained_model_name_or_path='chinese-electra-base', *args, **kwargs)\n"
  },
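The text-matching branch in `module.py` above masks out padding, mean-pools the token embeddings of query and title, and feeds the concatenation [u, v, |u - v|] to a linear classifier. Below is a minimal numpy sketch of that pooling and feature construction; it is a standalone illustration under assumed shapes and names, not PaddleHub code:

```python
import numpy as np

def masked_mean_pool(token_embeddings, input_ids, pad_token_id=0):
    # Zero out padding positions, then average over valid tokens only,
    # mirroring the masked mean pooling in the text-matching branch.
    mask = (input_ids != pad_token_id).astype(token_embeddings.dtype)[:, :, None]
    summed = (token_embeddings * mask).sum(axis=1)
    counts = mask.sum(axis=1)
    return summed / counts

def matching_features(query_mean, title_mean):
    # Classifier input is [u, v, |u - v|], as in the module's forward pass.
    sub = np.abs(query_mean - title_mean)
    return np.concatenate([query_mean, title_mean, sub], axis=-1)

# Toy batch: 2 sequences, 4 tokens, hidden size 3; id 0 marks padding.
emb = np.ones((2, 4, 3))
ids = np.array([[5, 6, 0, 0], [7, 8, 9, 0]])
pooled = masked_mean_pool(emb, ids)          # shape (2, 3)
feats = matching_features(pooled, pooled)    # shape (2, 9)
```

Because every non-padding embedding here is 1, the pooled means are exactly 1 and the |u - v| block is all zeros when a sequence is matched against itself.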
  {
    "path": "modules/text/language_model/chinese_electra_small/README.md",
    "content": "# chinese-electra-small\n|模型名称|chinese-electra-small|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ELECTRA|\n|数据集|中文维基+通用数据|\n|是否支持Fine-tuning|是|\n|模型大小|47MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/1a5578bfbe1ad629035f7ad1eb3d0bce?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-03-31T06%3A45%3A51Z%2F1800%2F%2F02b8749292f8ba1c606410d0e4e5dbabdf1d367d80da395887775d36424ac13e\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[ELECTRA论文](https://openreview.net/pdf?id=r1xMH1BtvB)\n\n## 二、安装\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install chinese-electra-small==2.0.2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='chinese-electra-small',\n    version='2.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        
**kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）。\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n        data,\n        use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      -  `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m chinese-electra-small\n    ```\n\n  - 
这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/chinese-electra-small\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.2\n\n  修复词嵌入模型预测的问题\n"
  },
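The `suffix` parameter documented in the READMEs above switches the chunk-label format from prefix style ('B-PER') to suffix style, where labels end in '-B', '-I', '-E' or '-S' ('PER-B'). A tiny illustrative converter between the two schemes (a hypothetical helper for clarity, not part of PaddleHub or PaddleNLP):

```python
def to_suffix_style(label):
    # 'B-PER' -> 'PER-B'; tag-less labels such as 'O' pass through unchanged.
    if '-' not in label:
        return label
    tag, entity = label.split('-', 1)
    return '{}-{}'.format(entity, tag)

labels = ['B-PER', 'I-PER', 'O', 'B-LOC']
suffix_labels = [to_suffix_style(l) for l in labels]
# suffix_labels == ['PER-B', 'PER-I', 'O', 'LOC-B']
```

With `suffix=True`, the `label_map` values passed to the module's ChunkEvaluator are expected to already be in this suffix style.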
  {
    "path": "modules/text/language_model/chinese_electra_small/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/chinese_electra_small/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.electra.modeling import ElectraForSequenceClassification, ElectraForTokenClassification, ElectraModel\nfrom paddlenlp.transformers.electra.tokenizer import ElectraTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"chinese-electra-small\",\n    version=\"2.0.2\",\n    summary=\n    \"chinese-electra-small, 12-layer, 256-hidden, 4-heads, 12M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Electra(nn.Layer):\n    \"\"\"\n    Electra model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Electra, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ElectraForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='chinese-electra-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ElectraForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='chinese-electra-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='chinese-electra-small', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='chinese-electra-small', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc 
= self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(query_token_embedding.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(title_token_embedding.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            
title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            return result\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ElectraTokenizer.from_pretrained(pretrained_model_name_or_path='chinese-electra-small', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/electra_base/README.md",
    "content": "# electra-base\n|模型名称|electra-base|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ELECTRA|\n|数据集|英文维基百科|\n|是否支持Fine-tuning|是|\n|模型大小|630MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/1a5578bfbe1ad629035f7ad1eb3d0bce?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-03-31T06%3A45%3A51Z%2F1800%2F%2F02b8749292f8ba1c606410d0e4e5dbabdf1d367d80da395887775d36424ac13e\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[ELECTRA论文](https://openreview.net/pdf?id=r1xMH1BtvB)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n  $ hub install electra-base==1.0.2\n  ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='electra-base',\n    version='1.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n- ### 2、API\n  - ```python\n    def __init__(\n            task=None,\n            load_checkpoint=None,\n        
    label_map=None,\n            num_classes=2,\n            suffix=False,\n            **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）。\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n        data,\n        use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 
```shell\n    $ hub serving start -m electra-base\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/electra-base\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布，动态图版本模型，支持文本分类`seq-cls`和序列标注`token-cls`任务的fine-tune\n\n* 1.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 1.0.2\n\n  修复词嵌入模型预测的问题\n"
  },
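`get_embedding` in the READMEs above returns, per sample, a sentence-level pooled_feature plus token-level seq_feature. A common follow-up is to score two sentences by the cosine similarity of their pooled vectors; the sketch below assumes the pooled features have been converted to plain float vectors and is only an illustration of that post-processing step, not a PaddleHub API:

```python
import numpy as np

def cosine_similarity(u, v, eps=1e-8):
    # Cosine similarity between two pooled sentence vectors; eps guards
    # against division by zero for degenerate all-zero vectors.
    u = np.asarray(u, dtype=np.float64)
    v = np.asarray(v, dtype=np.float64)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + eps))

# Toy pooled features (hypothetical 3-dim vectors for illustration).
a = [1.0, 0.0, 1.0]
b = [1.0, 0.0, 1.0]
c = [0.0, 1.0, 0.0]
same = cosine_similarity(a, b)        # close to 1.0 for identical vectors
orthogonal = cosine_similarity(a, c)  # 0.0 for orthogonal vectors
```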
  {
    "path": "modules/text/language_model/electra_base/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/electra_base/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.electra.modeling import ElectraForSequenceClassification, ElectraForTokenClassification, ElectraModel\nfrom paddlenlp.transformers.electra.tokenizer import ElectraTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"electra-base\",\n    version=\"1.0.2\",\n    summary=\"electra-base, 12-layer, 768-hidden, 12-heads, 110M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Electra(nn.Layer):\n    \"\"\"\n    Electra model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Electra, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ElectraForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ElectraForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-base', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-base', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-base', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(query_token_embedding.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(title_token_embedding.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            
title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            return result\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ElectraTokenizer.from_pretrained(pretrained_model_name_or_path='electra-base', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/electra_large/README.md",
    "content": "# electra-large\n|模型名称|electra-large|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ELECTRA|\n|数据集|英文维基百科|\n|是否支持Fine-tuning|是|\n|模型大小|1.9GB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/1a5578bfbe1ad629035f7ad1eb3d0bce?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-03-31T06%3A45%3A51Z%2F1800%2F%2F02b8749292f8ba1c606410d0e4e5dbabdf1d367d80da395887775d36424ac13e\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[ELECTRA论文](https://openreview.net/pdf?id=r1xMH1BtvB)\n\n## 二、安装\n\n- ### 1、环境依赖\n  \n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install electra-large==1.0.2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='electra-large',\n    version='1.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 
创建Module对象（动态图组网版本）。\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n        data,\n        use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m electra-large\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 
如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/electra-large\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布，动态图版本模型，支持文本分类`seq-cls`和序列标注`token-cls`任务的fine-tune\n\n* 1.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 1.0.2\n\n  修复词嵌入模型预测的问题\n"
  },
  {
    "path": "modules/text/language_model/electra_large/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/electra_large/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.electra.modeling import ElectraForSequenceClassification, ElectraForTokenClassification, ElectraModel\nfrom paddlenlp.transformers.electra.tokenizer import ElectraTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"electra-large\",\n    version=\"1.0.2\",\n    summary=\"electra-large, 24-layer, 1024-hidden, 16-heads, 335M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Electra(nn.Layer):\n    \"\"\"\n    Electra model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Electra, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ElectraForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ElectraForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-large', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-large', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(query_token_embedding.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(title_token_embedding.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            
title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            return result\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ElectraTokenizer.from_pretrained(pretrained_model_name_or_path='electra-large', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/electra_small/README.md",
    "content": "# electra-small\n|模型名称|electra-small|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ELECTRA|\n|数据集|英文维基百科|\n|是否支持Fine-tuning|是|\n|模型大小|78MB|\n|最新更新日期|2022-02-08|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n<p align=\"center\">\n<img src=\"http://bj.bcebos.com/ibox-thumbnail98/1a5578bfbe1ad629035f7ad1eb3d0bce?authorization=bce-auth-v1%2Ffbe74140929444858491fbf2b6bc0935%2F2020-03-31T06%3A45%3A51Z%2F1800%2F%2F02b8749292f8ba1c606410d0e4e5dbabdf1d367d80da395887775d36424ac13e\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[ELECTRA论文](https://openreview.net/pdf?id=r1xMH1BtvB)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install electra-small==1.0.2\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='electra-small',\n    version='1.0.1',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 
创建Module对象（动态图组网版本）。\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n        data,\n        use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n  $ hub serving start -m electra-small\n  ```\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 
如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/electra-small\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n* 1.0.0\n\n  初始发布，动态图版本模型，支持文本分类`seq-cls`和序列标注`token-cls`任务的fine-tune\n\n* 1.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 1.0.2\n\n  修复词嵌入模型预测的问题\n"
  },
  {
    "path": "modules/text/language_model/electra_small/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/electra_small/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.electra.modeling import ElectraForSequenceClassification, ElectraForTokenClassification, ElectraModel\nfrom paddlenlp.transformers.electra.tokenizer import ElectraTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"electra-small\",\n    version=\"1.0.2\",\n    summary=\"electra-small, 12-layer, 256-hidden, 4-heads, 14M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Electra(nn.Layer):\n    \"\"\"\n    Electra model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Electra, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ElectraForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ElectraForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='electra-small', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-small', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ElectraModel.from_pretrained(pretrained_model_name_or_path='electra-small', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(query_token_embedding.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(title_token_embedding.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            
title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            return result\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ElectraTokenizer.from_pretrained(pretrained_model_name_or_path='electra-small', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/ernie/README.md",
    "content": "# ernie\n|模型名称|ernie|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|ernie-1.0|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|384MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\nErnie是百度提出的基于知识增强的持续学习语义理解模型，该模型将大数据预训练与多源丰富知识相结合，通过持续学习技术，不断吸收海量文本数据中词汇、结构、语义等方面的知识，实现模型效果不断进化。\n\n\n  - <a class=\"ant-btn large\" href=\"https://aistudio.baidu.com/aistudio/projectDetail/79380\" target=\"_blank\">AI Studio 快速体验</a>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png\" hspace='10'/> <br />\n    </p>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png\" hspace='10'/> <br />\n    </p>\n\n  - 更多详情请参考[ERNIE论文](https://arxiv.org/abs/1904.09223)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='ernie',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    
def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)、`token-cls`(序列标注任务)或`text-matching`(文本匹配任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\[label\_1, label\_2, …,\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\[\[token\_1, token\_2, …,\], \[token\_1, token\_2, …,\], …,\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\[\[sample\_a\_pooled\_feature, sample\_a\_seq\_feature\], \[sample\_b\_pooled\_feature, sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub 
Serving\n\n  - ```shell\n    $ hub serving start -m ernie\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/ernie\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复该PaddleHub Module在paddlepaddle1.4.0版本、CPU环境下运行错误的问题\n\n* 1.0.2\n\n  修复该PaddleHub Module在paddlepaddle1.5.0版本下的兼容问题\n\n* 1.1.0\n\n  ERNIE预训练时max_seq_len设置为512\n\n* 1.2.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图版本，接口有所变化\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n  ```shell\n  $ hub install ernie==2.0.2\n  ```\n"
  },
  {
    "path": "modules/text/language_model/ernie/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.ernie.modeling import ErnieModel, ErnieForSequenceClassification, ErnieForTokenClassification\nfrom paddlenlp.transformers.ernie.tokenizer import ErnieTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ernie\",\n    version=\"2.0.2\",\n    summary=\n    \"Baidu's ERNIE, Enhanced Representation through kNowledge IntEgration, max_seq_len=512 when pretrained. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass Ernie(nn.Layer):\n    \"\"\"\n    Ernie model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(Ernie, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ErnieForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-1.0', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ErnieForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-1.0', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-1.0', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n    
        self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-1.0', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': 
acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n    
        sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ErnieTokenizer.from_pretrained(pretrained_model_name_or_path='ernie-1.0', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/ernie_tiny/README.md",
    "content": "# ernie_tiny\n|模型名称|ernie_tiny|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|ernie_tiny|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|346MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\nErnie是百度提出的基于知识增强的持续学习语义理解模型，该模型将大数据预训练与多源丰富知识相结合，通过持续学习技术，不断吸收海量文本数据中词汇、结构、语义等方面的知识，实现模型效果不断进化。\n\n\n  - <a class=\"ant-btn large\" href=\"https://aistudio.baidu.com/aistudio/projectDetail/79380\" target=\"_blank\">AI Studio 快速体验</a>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png\" hspace='10'/> <br />\n    </p>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png\" hspace='10'/> <br />\n    </p>\n\n  - 更多详情请参考[ERNIE论文](https://arxiv.org/abs/1904.09223)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_tiny\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='ernie_tiny',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n\n- ### 
2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)、`token-cls`(序列标注任务)或`text-matching`(文本匹配任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\[label\_1, label\_2, …,\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\[\[token\_1, token\_2, …,\], \[token\_1, token\_2, …,\], …,\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\[\[sample\_a\_pooled\_feature, sample\_a\_seq\_feature\], \[sample\_b\_pooled\_feature, sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。  \n\n## 四、服务部署\n\n- PaddleHub 
Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m ernie_tiny\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/ernie_tiny\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python 2的兼容问题\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图版本，接口有所变化\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n  ```shell\n  $ hub install ernie_tiny==2.0.2\n  ```\n"
  },
  {
    "path": "modules/text/language_model/ernie_tiny/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/ernie_tiny/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.ernie.modeling import ErnieModel, ErnieForSequenceClassification, ErnieForTokenClassification\nfrom paddlenlp.transformers.ernie.tokenizer import ErnieTinyTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ernie_tiny\",\n    version=\"2.0.2\",\n    summary=\"Baidu's ERNIE-tiny, Enhanced Representation through kNowledge IntEgration, tiny version, max_seq_len=512\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass ErnieTiny(nn.Layer):\n    \"\"\"\n    Ernie model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(ErnieTiny, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n  
      if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ErnieForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-tiny', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ErnieForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-tiny', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-tiny', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-tiny', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def 
forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return 
token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def 
get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ErnieTinyTokenizer.from_pretrained(pretrained_model_name_or_path='ernie-tiny', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_base/README.md",
    "content": "# ernie_v2_eng_base\n|模型名称|ernie_v2_eng_base|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|ernie_v2_eng_base|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|1.3G|\n|最新更新日期|2021-06-28|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\nErnie是百度提出的基于知识增强的持续学习语义理解模型，该模型将大数据预训练与多源丰富知识相结合，通过持续学习技术，不断吸收海量文本数据中词汇、结构、语义等方面的知识，实现模型效果不断进化。\n\n  - <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie2.0_arch.png\" hspace='10'/> <br />\n    </p>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie2.0_model.png\" hspace='10'/> <br />\n    </p>\n\n  - 更多详情请参考[ERNIE论文](https://arxiv.org/abs/1907.12412)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_v2_eng_base\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='ernie_v2_eng_base',\n    version='2.0.3',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        
label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)、`token-cls`(序列标注任务)或`text-matching`(文本匹配任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\[label\_1, label\_2, …,\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\[\[token\_1, token\_2, …,\], \[token\_1, token\_2, …,\], …,\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\[\[sample\_a\_text\_a, sample\_a\_text\_b\], \[sample\_b\_text\_a, sample\_b\_text\_b\],…,\]，其中每个元素都是一个样例，每个样例可以包含text\_a与text\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\[\[sample\_a\_pooled\_feature, sample\_a\_seq\_feature\], \[sample\_b\_pooled\_feature, sample\_b\_seq\_feature\],…,\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\_feature与字粒度特征seq\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m ernie_v2_eng_base\n    
```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/ernie_v2_eng_base\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python 2的兼容问题\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图版本，接口有所变化\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.3\n\n  模型底座名称调整\n  ```shell\n  $ hub install ernie_v2_eng_base==2.0.3\n  ```\n"
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_base/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_base/module.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers.ernie.modeling import ErnieForSequenceClassification\nfrom paddlenlp.transformers.ernie.modeling import ErnieForTokenClassification\nfrom paddlenlp.transformers.ernie.modeling import ErnieModel\nfrom paddlenlp.transformers.ernie.tokenizer import ErnieTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ernie_v2_eng_base\",\n    version=\"2.0.3\",\n    summary=\n    \"Baidu's ERNIE 2.0, Enhanced Representation through kNowledge IntEgration, max_seq_len=512 when pretrained. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass ErnieV2(nn.Layer):\n    \"\"\"\n    Ernie model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(ErnieV2, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ErnieForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-2.0-base-en', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ErnieForTokenClassification.from_pretrained(pretrained_model_name_or_path='ernie-2.0-base-en',\n                                                                     num_classes=self.num_classes,\n                                                                     **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = 
ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-2.0-base-en', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-2.0-base-en', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n          
  if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * 
title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ErnieTokenizer.from_pretrained(pretrained_model_name_or_path='ernie-2.0-base-en', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_large/README.md",
    "content": "# ernie_v2_eng_large\n|模型名称|ernie_v2_eng_large|\n| :--- | :---: | \n|类别|文本-语义模型|\n|网络|ernie_v2_eng_large|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|1.3G|\n|最新更新日期|2021-03-16|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\nErnie是百度提出的基于知识增强的持续学习语义理解模型，该模型将大数据预训练与多源丰富知识相结合，通过持续学习技术，不断吸收海量文本数据中词汇、结构、语义等方面的知识，实现模型效果不断进化。\n\n  - <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie2.0_arch.png\" hspace='10'/> <br />\n    </p>\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie2.0_model.png\" hspace='10'/> <br />\n    </p>\n\n  - 更多详情请参考[ERNIE论文](https://arxiv.org/abs/1907.12412)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_tiny\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='ernie_v2_eng_large',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Lable: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        
label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m ernie_v2_eng_large\n    
```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/ernie_v2_eng_large\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python 2的兼容问题\n\n* 1.1.0\n\n  支持get_embedding与get_params_layer\n\n* 2.0.0\n\n  全面升级动态图版本，接口有所变化\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`  \n  ```shell\n  $ hub install ernie_v2_eng_large==2.0.2\n  ```\n"
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_large/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/ernie_v2_eng_large/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict\nimport os\nimport math\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\nfrom paddlenlp.transformers.ernie.modeling import ErnieModel, ErnieForSequenceClassification, ErnieForTokenClassification\nfrom paddlenlp.transformers.ernie.tokenizer import ErnieTokenizer\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"ernie_v2_eng_large\",\n    version=\"2.0.2\",\n    summary=\n    \"Baidu's ERNIE 2.0, Enhanced Representation through kNowledge IntEgration, max_seq_len=512 when predtrained. 
The module is executed as paddle.dygraph.\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule)\nclass ErnieV2(nn.Layer):\n    \"\"\"\n    Ernie model\n    \"\"\"\n\n    def __init__(\n            self,\n            task: str = None,\n            load_checkpoint: str = None,\n            label_map: Dict = None,\n            num_classes: int = 2,\n            suffix: bool = False,\n            **kwargs,\n    ):\n        super(ErnieV2, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = ErnieForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-2.0-large-en', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = ErnieForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='ernie-2.0-large-en', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())], suffix=suffix)\n        elif task == 'text-matching':\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-2.0-large-en', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = 
paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = ErnieModel.from_pretrained(pretrained_model_name_or_path='ernie-2.0-large-en', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = 
self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, 
axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return ErnieTokenizer.from_pretrained(pretrained_model_name_or_path='ernie-2.0-large-en', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/lda_news/README.md",
    "content": "# lda_news\n\n|模型名称|lda_news|\n| :--- | :---: | \n|类别|文本-主题模型|\n|网络|LDA|\n|数据集|百度自建新闻领域数据集|\n|是否支持Fine-tuning|否|\n|模型大小|19MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中LDA(Latent Dirichlet Allocation)算法是主题模型的一种。LDA根据对词的共现信息的分析，拟合出词-文档-主题的分布，从而将词、文本映射到一个语义空间中。\n\n    <p align=\"center\">\n    <img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/lda.png\" width=600 hspace='10'/> <br />\n    </p>\n\n    更多详情请参考[LDA论文](http://www.jmlr.org/papers/volume3/blei03a/blei03a.pdf)。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.8.2\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lda_news\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n``` python\nimport paddlehub as hub\n\nlda_news = hub.Module(name=\"lda_news\")\njsd, hd = lda_news.cal_doc_distance(doc_text1=\"今天的天气如何，适合出去游玩吗\", doc_text2=\"感觉今天的天气不错，可以出去玩一玩了\")\n# jsd = 0.003109, hd = 0.0573171\n\nlda_sim = lda_news.cal_query_doc_similarity(query='百度搜索引擎', document='百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。百度超过千亿的中文网页数据库，可以瞬间找到相关的搜索结果。')\n# LDA similarity = 0.06826\n\nresults = lda_news.cal_doc_keywords_similarity('百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。百度超过千亿的中文网页数据库，可以瞬间找到相关的搜索结果。')\n# [{'word': '百度', 'similarity': 0.12943492762349573},\n#  {'word': '信息', 'similarity': 0.06139783578769882},\n#  {'word': '找到', 'similarity': 0.055296603463188265},\n#  {'word': '搜索', 'similarity': 0.04270794098349327},\n#  {'word': '全球', 'similarity': 0.03773627056367886},\n#  {'word': '超过', 'similarity': 0.03478658388202199},\n#  {'word': '相关', 'similarity': 0.026295857219683725},\n#  {'word': '获取', 'similarity': 
0.021313585287833996},\n#  {'word': '中文', 'similarity': 0.020187103312009513},\n#  {'word': '搜索引擎', 'similarity': 0.007092890537169911}]\n\nresults = lda_news.infer_doc_topic_distribution(\"最近有学者新出了一篇论文，关于自然语言处理的，可厉害了\")\n# [{'topic id': 216, 'distribution': 0.5222222222222223},\n#  {'topic id': 1789, 'distribution': 0.18888888888888888},\n#  {'topic id': 98, 'distribution': 0.1111111111111111},\n#  {'topic id': 805, 'distribution': 0.044444444444444446},\n#  {'topic id': 56, 'distribution': 0.03333333333333333}, ...]\n\nkeywords = lda_news.show_topic_keywords(topic_id=216)\n# {'研究': 0.1753955534055716,\n#  '学术': 0.13158917246453747,\n#  '论文': 0.1178632702247961,\n#  '课题': 0.057840811145163484,\n#  '发表': 0.05614630212471184,\n#  '成果': 0.03587086607950555,\n#  '期刊': 0.030608728068521086,\n#  '科研': 0.0216061375112729,\n#  '学者': 0.017739360125774,\n#  '科学': 0.015553720885167896}\n\n```\n\n- ### 2、API\n\n  - ```python\n    cal_doc_distance(doc_text1, doc_text2)\n    ```\n\n    - 用于计算两个输入文档之间的距离，包括Jensen-Shannon divergence(JS散度)、Hellinger Distance(海林格距离)。\n\n    - **参数**\n\n      - doc_text1(str): 输入的第一个文档。\n      - doc_text2(str): 输入的第二个文档。   \n\n    - **返回**\n\n      - jsd(float): 两个文档之间的JS散度([Jensen-Shannon divergence](https://blog.csdn.net/FrankieHello/article/details/80614422?utm_source=copy))。\n      - hd(float): 两个文档之间的海林格距离([Hellinger Distance](http://blog.sina.com.cn/s/blog_85f1ffb70101e65d.html))。    \n\n  - ```python\n    cal_doc_keywords_similarity(document, top_k=10)\n    ```\n\n    - 用于查找输入文档的前k个关键词及对应的与原文档的相似度。\n\n    - **参数**\n\n      - document(str): 输入文档。\n      - top_k(int): 查找输入文档的前k个关键词。\n\n    - **返回**\n\n      - results(list): 包含每个关键词以及对应的与原文档的相似度。其中，list的基本元素为dict，dict的key为关键词，value为对应的与原文档的相似度。    \n\n  - ```python\n    cal_query_doc_similarity(query, document)\n    ```\n\n    - 用于计算短文档与长文档之间的相似度。\n\n    -  **参数**\n\n      - query(str): 输入的短文档。\n      - document(str): 输入的长文档。\n\n    -  **返回**\n\n      - lda_sim(float): 返回短文档与长文档之间的相似度。 \n\n  - 
```python\n    infer_doc_topic_distribution(document)\n    ```\n\n    - 用于推理出文档的主题分布。\n\n      - **参数**\n\n        - document(str): 输入文档。\n\n      - **返回**\n\n        - results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n  - ```python\n    show_topic_keywords(topic_id, k=10)\n    ```\n\n    - 用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n      - **参数**\n\n        - topic_id(int): 主题ID。\n        - k(int): 需要知道对应主题的前k个关键词。\n\n      - **返回**\n\n        - results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。         \n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复因为return的bug导致的NoneType错误\n\n* 1.0.2\n\n  修复由于Windows`gbk`编码导致的问题\n"
  },
  {
    "path": "modules/text/language_model/lda_news/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/lda_news/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/lda_news/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of documents in sparse format.\n           By default, it is sorted according to the topic probability\n           under the descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n\n            def take_elem(topic):\n                return topic.prob\n\n            topic_dist.sort(key=take_elem, reverse=True)\n            if topic_dist is None:\n                topic_dist = []\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/lda_news/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom lda_news.config import ModelConfig\nfrom lda_news.util import load_prototxt, fix_random_seed, rand_k\nfrom lda_news.model import TopicModel\nfrom lda_news.sampler import GibbsSampler, MHSampler\nfrom lda_news.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_news.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for 
sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/lda_news/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_news.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # 字典列表\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic in 
the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/lda_news/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom lda_news.inference_engine import InferenceEngine\nfrom lda_news.document import LDADoc, SLDADoc\nfrom lda_news.semantic_matching import SemanticMatching, WordAndDis\nfrom lda_news.tokenizer import LACTokenizer, SimpleTokenizer\nfrom lda_news.config import ModelType\nfrom lda_news.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"lda_news\",\n    version=\"1.0.2\",\n    summary=\n    \"This is a PaddleHub Module for LDA topic model in news dataset, where we can calculate doc distance, calculate the similarity between query and document, etc\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'news')\n        self.conf_file = 'lda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish initialization.\")\n\n    def cal_doc_distance(self, doc_text1, doc_text2):\n        \"\"\"\n        This interface calculates the distance between documents.\n\n        
Args:\n            doc_text1(str): the input document text 1.\n            doc_text2(str): the input document text 2.\n\n        Returns:\n            jsd(float): Jensen-Shannon Divergence distance of two documents.\n            hd(float): Hellinger Distance of two documents.\n        \"\"\"\n        doc1_tokens = self.__tokenizer.tokenize(doc_text1)\n        doc2_tokens = self.__tokenizer.tokenize(doc_text2)\n\n        # Document topic inference.\n        doc1, doc2 = LDADoc(), LDADoc()\n        self.__engine.infer(doc1_tokens, doc1)\n        self.__engine.infer(doc2_tokens, doc2)\n\n        # To calculate jsd, we need dense document topic distribution.\n        dense_dict1 = doc1.dense_topic_dist()\n        dense_dict2 = doc2.dense_topic_dist()\n        # Calculate the distance between distributions.\n        # The smaller the distance, the higher the document semantic similarity.\n        sm = SemanticMatching()\n        jsd = sm.jensen_shannon_divergence(dense_dict1, dense_dict2)\n        hd = sm.hellinger_distance(dense_dict1, dense_dict2)\n\n        return jsd, hd\n\n    def cal_doc_keywords_similarity(self, document, top_k=10):\n        \"\"\"\n        This interface can be used to find top k keywords of document.\n\n        Args:\n            document(str): the input document text.\n            top_k(int): top k keywords of this document.\n\n        Returns:\n            results(list): contains top_k keywords and their corresponding\n                           similarity compared to document.\n        \"\"\"\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        # Do topic inference on documents to obtain topic distribution.\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        items = []\n        words = set()\n        for word in d_tokens:\n            if word in words:\n                continue\n            words.add(word)\n            wd = WordAndDis()\n            
wd.word = word\n            sm = SemanticMatching()\n            wd.distance = sm.likelihood_based_similarity(\n                terms=[word], doc_topic_dist=doc_topic_dist, model=self.__engine.get_model())\n            items.append(wd)\n\n        def take_elem(word_dis):\n            return word_dis.distance\n\n        items.sort(key=take_elem, reverse=True)\n\n        results = []\n        size = len(items)\n        for i in range(top_k):\n            if i >= size:\n                break\n            results.append({\"word\": items[i].word, \"similarity\": items[i].distance})\n\n        return results\n\n    def cal_query_doc_similarity(self, query, document):\n        \"\"\"\n        This interface calculates the similarity between query and document.\n\n        Args:\n            query(str): the input query text.\n            document(str): the input document text.\n\n        Returns:\n            lda_sim(float): likelihood based similarity between query and document\n                            based on LDA.\n        \"\"\"\n        q_tokens = self.__tokenizer.tokenize(query)\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        sm = SemanticMatching()\n        lda_sim = sm.likelihood_based_similarity(q_tokens, doc_topic_dist, self.__engine.get_model())\n\n        return lda_sim\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            document(str): the input document text.\n\n        Returns:\n            results(list): returns the topic distribution of document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if tokens == []:\n            return []\n        results = []\n        doc = LDADoc()\n        self.__engine.infer(tokens, doc)\n        topics = doc.sparse_topic_dist()\n        
for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the top k keywords under a specific topic.\n\n        Args:\n            topic_id(int): the id of the topic to query.\n            k(int): top k keywords.\n\n        Returns:\n            results(dict): maps the topic's keywords to their corresponding\n                           probabilities.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n            return results\n"
  },
  {
    "path": "modules/text/language_model/lda_news/sampler.py",
    "content": "import numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_news.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_news.vose_alias import VoseAlias\nfrom lda_news.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float)\n                
self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: for long sentences the repeated multiplication drives the\n                # probability toward zero, so numerical accuracy may be lost.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/lda_news/semantic_matching.py",
    "content": "import numpy as np\n\nfrom lda_news.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        \"\"\"Calculate the cosine similarity between two vectors.\n        \"\"\"\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"Calculate the likelihood based similarity.\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n       
 dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n        jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
  {
    "path": "modules/text/language_model/lda_news/tokenizer.py",
    "content": "\"\"\"This file defines tokenizer class object.\n\"\"\"\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple version FMM(Forward Maximun Matching) word tokenizer. This tokenizer can only\n       be used in topic model demo, but not in real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenize result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n        \"\"\"Check 
whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Lowercase English words and keep only the words that are in the vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
  {
    "path": "modules/text/language_model/lda_news/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_news.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading LDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Returns an integer float number between [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Return time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
  {
    "path": "modules/text/language_model/lda_news/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/lda_news/vose_alias.py",
    "content": "import numpy as np\n\nfrom lda_news.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Arg:\n            distribution: the input distribution.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate samples from given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = int(rand())\n        return dart1 if dart2 > self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/README.md",
    "content": "## 模型概述\n\n主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中LDA(Latent Dirichlet Allocation)算法是主题模型的一种。LDA根据对词的共现信息的分析，拟合出词-文档-主题的分布，从而将词、文本映射到一个语义空间中。本Module基于的数据集为百度自建的小说领域数据集。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/lda.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[LDA论文](http://www.jmlr.org/papers/volume3/blei03a/blei03a.pdf)。\n\n注：该Module由第三方开发者DesmonDay贡献。\n\n## LDA模型 API 说明\n### cal_doc_distance(doc_text1, doc_text2)\n用于计算两个输入文档之间的距离，包括Jensen-Shannon divergence(JS散度)、Hellinger Distance(海林格距离)。\n\n**参数**\n\n- doc_text1(str): 输入的第一个文档。\n- doc_text2(str): 输入的第二个文档。\n\n**返回**\n\n- jsd(float): 两个文档之间的JS散度([Jensen-Shannon divergence](https://blog.csdn.net/FrankieHello/article/details/80614422?utm_source=copy))。\n- hd(float): 两个文档之间的海林格距离([Hellinger Distance](http://blog.sina.com.cn/s/blog_85f1ffb70101e65d.html))。\n\n### cal_doc_keywords_similarity(document, top_k=10)\n\n用于查找输入文档的前k个关键词及对应的与原文档的相似度。\n\n**参数**\n\n- document(str): 输入文档。\n- top_k(int): 查找输入文档的前k个关键词。\n\n**返回**\n\n- results(list): 包含每个关键词以及对应的与原文档的相似度。其中，list的基本元素为dict，dict的key为关键词，value为对应的与原文档的相似度。\n\n### cal_query_doc_similarity(query, document)\n\n用于计算短文档与长文档之间的相似度。\n\n**参数**\n\n- query(str): 输入的短文档。\n- document(str): 输入的长文档。\n\n**返回**\n\n- lda_sim(float): 返回短文档与长文档之间的相似度。\n\n### infer_doc_topic_distribution(document)\n\n用于推理出文档的主题分布。\n\n**参数**\n\n- document(str): 输入文档。\n\n**返回**\n\n- results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n### show_topic_keywords(topic_id, k=10)\n\n用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n**参数**\n\n- topic_id(int): 主题ID。\n- k(int): 需要知道对应主题的前k个关键词。\n\n**返回**\n\n- results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。\n\n### 代码示例\n\n这里展示部分API的使用示例。\n``` python\nimport paddlehub as hub\n\nlda_novel = hub.Module(name=\"lda_novel\")\njsd, hd = lda_novel.cal_doc_distance(doc_text1=\"老人幸福地看着自己的儿子，露出了欣慰的笑容。\", doc_text2=\"老奶奶看着自己的儿子，幸福地笑了。\")\n# jsd = 0.01292, hd = 0.11893\n\nlda_sim 
= lda_novel.cal_query_doc_similarity(query='亲孙女', document='老人激动地打量着面前的女孩，似乎找到了自己的亲孙女一般，双手止不住地颤抖着。')\n# LDA similarity = 0.0\n\nresults = lda_novel.cal_doc_keywords_similarity('百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。百度超过千亿的中文网页数据库，可以瞬间找到相关的搜索结果。')\n# [{'word': '信息', 'similarity': 0.014140977159719738},\n#  {'word': '找到', 'similarity': 0.012251022010382823},\n#  {'word': '搜索', 'similarity': 0.004262275169349261},\n#  {'word': '网页', 'similarity': 0.0026937499565468327},\n#  {'word': '百度', 'similarity': 0.0021199508577209015},\n#  {'word': '全球', 'similarity': 0.0010464078137351785},\n#  {'word': '中文', 'similarity': 0.0009866259107630141},\n#  {'word': '瞬间', 'similarity': 0.0009262589016537221},\n#  {'word': '超过', 'similarity': 0.0008362863020592123},\n#  {'word': '相关', 'similarity': 0.000793663877590302}]\n\nresults = lda_novel.infer_doc_topic_distribution(\"妈妈告诉女儿，今天爸爸过生日，放学后要早点回家一起庆祝\")\n# [{'topic id': 0, 'distribution': 0.7166666666666667},\n#  {'topic id': 64, 'distribution': 0.11666666666666667},\n#  {'topic id': 125, 'distribution': 0.020833333333333332},\n#  {'topic id': 131, 'distribution': 0.016666666666666666},\n#  {'topic id': 137, 'distribution': 0.016666666666666666}, ...]\n\nkeywords = lda_novel.show_topic_keywords(topic_id=0)\n# {'妈妈': 0.36114392028319225,\n#  '爸爸': 0.18456064543161096,\n#  '女儿': 0.03591842787260316,\n#  '孩子': 0.01567368390197123,\n#  '家里': 0.014277018999815379,\n#  '回家': 0.013514888275429099,\n#  '回来': 0.013275213681108526,\n#  '爸妈': 0.007931677222119656,\n#  '告诉': 0.006841933742906693,\n#  '父母': 0.00627464639375944}\n\n```\n\n## 查看代码\nhttps://github.com/baidu/Familia\n\n\n## 依赖\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复因为return的bug导致的NoneType错误\n\n* 1.0.2\n\n  修复由于Windows`gbk`编码导致的问题\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/lda_novel/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of documents in sparse format.\n           By default, it is sorted according to the topic probability\n           under the descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n\n            def take_elem(topic):\n                return topic.prob\n\n            topic_dist.sort(key=take_elem, reverse=True)\n            if topic_dist is None:\n                topic_dist = []\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.config import ModelConfig\nfrom lda_novel.util import load_prototxt, fix_random_seed, rand_k\nfrom lda_novel.model import TopicModel\nfrom lda_novel.sampler import GibbsSampler, MHSampler\nfrom lda_novel.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_novel.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            
for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # 字典列表\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic in 
the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.inference_engine import InferenceEngine\nfrom lda_novel.document import LDADoc, SLDADoc\nfrom lda_novel.semantic_matching import SemanticMatching, WordAndDis\nfrom lda_novel.tokenizer import LACTokenizer, SimpleTokenizer\nfrom lda_novel.config import ModelType\nfrom lda_novel.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"lda_novel\",\n    version=\"1.0.2\",\n    summary=\n    \"This is a PaddleHub Module for LDA topic model in novel dataset, where we can calculate doc distance, calculate the similarity between query and document, etc.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'novel')\n        self.conf_file = 'lda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish initialization.\")\n\n    def cal_doc_distance(self, doc_text1, doc_text2):\n        \"\"\"\n        This interface calculates the distance between 
documents.\n\n        Args:\n            doc_text1(str): the input document text 1.\n            doc_text2(str): the input document text 2.\n\n        Returns:\n            jsd(float): Jensen-Shannon Divergence distance of two documents.\n            hd(float): Hellinger Distance of two documents.\n        \"\"\"\n        doc1_tokens = self.__tokenizer.tokenize(doc_text1)\n        doc2_tokens = self.__tokenizer.tokenize(doc_text2)\n\n        # Document topic inference.\n        doc1, doc2 = LDADoc(), LDADoc()\n        self.__engine.infer(doc1_tokens, doc1)\n        self.__engine.infer(doc2_tokens, doc2)\n\n        # To calculate jsd, we need dense document topic distribution.\n        dense_dict1 = doc1.dense_topic_dist()\n        dense_dict2 = doc2.dense_topic_dist()\n        # Calculate the distance between distributions.\n        # The smaller the distance, the higher the document semantic similarity.\n        sm = SemanticMatching()\n        jsd = sm.jensen_shannon_divergence(dense_dict1, dense_dict2)\n        hd = sm.hellinger_distance(dense_dict1, dense_dict2)\n\n        return jsd, hd\n\n    def cal_doc_keywords_similarity(self, document, top_k=10):\n        \"\"\"\n        This interface can be used to find topk keywords of document.\n\n        Args:\n            document(str): the input document text.\n            top_k(int): top k keywords of this document.\n\n        Returns:\n            results(list): contains top_k keywords and their corresponding\n                           similarity compared to document.\n        \"\"\"\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        # Do topic inference on documents to obtain topic distribution.\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        items = []\n        words = set()\n        for word in d_tokens:\n            if word in words:\n                continue\n            words.add(word)\n            wd = 
WordAndDis()\n            wd.word = word\n            sm = SemanticMatching()\n            wd.distance = sm.likelihood_based_similarity(\n                terms=[word], doc_topic_dist=doc_topic_dist, model=self.__engine.get_model())\n            items.append(wd)\n\n        def take_elem(word_dis):\n            return word_dis.distance\n\n        items.sort(key=take_elem, reverse=True)\n\n        results = []\n        size = len(items)\n        for i in range(top_k):\n            if i >= size:\n                break\n            results.append({\"word\": items[i].word, \"similarity\": items[i].distance})\n\n        return results\n\n    def cal_query_doc_similarity(self, query, document):\n        \"\"\"\n        This interface calculates the similarity between query and document.\n\n        Args:\n            query(str): the input query text.\n            document(str): the input document text.\n\n        Returns:\n            lda_sim(float): likelihood based similarity between query and document\n                            based on LDA.\n        \"\"\"\n        q_tokens = self.__tokenizer.tokenize(query)\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        sm = SemanticMatching()\n        lda_sim = sm.likelihood_based_similarity(q_tokens, doc_topic_dist, self.__engine.get_model())\n\n        return lda_sim\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            document(str): the input document text.\n\n        Returns:\n            results(list): returns the topic distribution of document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if tokens == []:\n            return []\n        results = []\n        doc = LDADoc()\n        self.__engine.infer(tokens, doc)\n        topics = 
doc.sparse_topic_dist()\n        for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the k keywords under specific topic.\n\n        Args:\n            topic_id(int): topic information we want to know.\n            k(int): top k keywords.\n\n        Returns:\n            results(dict): contains specific topic's keywords and corresponding\n                           probability.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/sampler.py",
    "content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_novel.vose_alias import VoseAlias\nfrom lda_novel.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float)\n                
self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: if the sentence is long, multiplying many per-word probabilities\n                # can underflow and lose numerical accuracy.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
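The Metropolis-Hastings code in `sampler.py` above accepts or rejects a proposed topic with a branchless bitmask select (`mask = -(rejection < transition_prob)`). A minimal standalone sketch of that trick, using hypothetical topic ids and probabilities:

```python
def mh_select(old_topic, new_topic, rejection, transition_prob):
    """Branchless accept/reject as used by the MH sampler.

    -(True) == -1 has all bits set and -(False) == 0 has none, so the
    bitwise expression yields new_topic on acceptance, old_topic otherwise.
    """
    mask = -int(rejection < transition_prob)
    return (new_topic & mask) | (old_topic & ~mask)

# Accept: the rejection dart falls below the transition probability.
assert mh_select(3, 7, rejection=0.2, transition_prob=0.9) == 7
# Reject: the dart falls above it, so the old topic is kept.
assert mh_select(3, 7, rejection=0.95, transition_prob=0.9) == 3
```

The bitmask form avoids a branch in the inner sampling loop; a plain `if rejection < transition_prob` would be equivalent.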
  {
    "path": "modules/text/language_model/lda_novel/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n   
     jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
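The distance formulas in `semantic_matching.py` above can be exercised standalone. This sketch mirrors the module's JSD and Hellinger definitions (including the `EPS` floor), except that it clips copies rather than mutating its inputs the way `kullback_leibler_divergence` does:

```python
import numpy as np

EPS = 1e-06


def kl(d1, d2):
    d2 = np.maximum(d2, EPS)  # clip without mutating the caller's array
    return float(np.sum(d1 * np.log(d1 / d2)))


def jsd(d1, d2):
    """Jensen-Shannon divergence: KL against the mean, symmetrized."""
    d1, d2 = np.maximum(d1, EPS), np.maximum(d2, EPS)
    mean = 0.5 * (d1 + d2)
    return 0.5 * kl(d1, mean) + 0.5 * kl(d2, mean)


def hellinger(d1, d2):
    """Hellinger distance; the 1/sqrt(2) factor bounds it to [0, 1]."""
    return float(np.sqrt(np.sum((np.sqrt(d1) - np.sqrt(d2)) ** 2)) / np.sqrt(2))


p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.2, 0.7])
assert abs(jsd(p, q) - jsd(q, p)) < 1e-12  # JSD is symmetric by construction
assert jsd(p, p) < 1e-9                    # and zero for identical inputs
assert 0.0 <= hellinger(p, q) <= 1.0       # Hellinger is bounded in [0, 1]
```

The module's hard-coded `0.7071067812` is this `1 / sqrt(2)` normalization constant.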
  {
    "path": "modules/text/language_model/lda_novel/tokenizer.py",
"content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple forward maximum matching (FMM) word tokenizer. This tokenizer is only\n       suitable for topic model demos, not for real business application scenarios.\n\n       Note: this tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text` and return the tokenized result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase form of the character, or the original character\n           if it has no lowercase form.\n        \"\"\"\n        return c.lower()\n\n
class LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Change English words to lower case and keep only the words in the vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
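The non-English branch of `SimpleTokenizer.tokenize` above implements forward maximum matching: at each position, keep the longest vocabulary word starting there, then jump past it. A compact standalone sketch of the same idea, with a hypothetical vocabulary:

```python
def fmm_tokenize(text, vocab, max_word_len):
    """Forward maximum matching over `text` against `vocab`."""
    result, i = [], 0
    while i < len(text):
        found = ""
        for j in range(i, min(i + max_word_len, len(text))):
            cand = text[i:j + 1]
            if cand in vocab:
                found = cand  # remember the longest match so far
        if found:
            result.append(found)
            i += len(found)   # jump past the matched word
        else:
            i += 1            # no match: skip one character
    return result


vocab = {"topic", "model", "topicmodel"}
assert fmm_tokenize("topicmodelx", vocab, max_word_len=10) == ["topicmodel"]
assert fmm_tokenize("modeltopic", vocab, max_word_len=10) == ["model", "topic"]
```

`max_word_len` caps the lookahead, which is why the tokenizer tracks the longest vocab word at load time.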
  {
    "path": "modules/text/language_model/lda_novel/util.py",
"content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"Load the model configuration (YAML format) into `config`.\n\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig instance to fill in.\n    \"\"\"\n    logger.info(\"Loading LDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Return a random integer in [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Decorator that prints the time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
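`rand_k` above draws a uniform float in [0, 1) and scales it by `k`. A quick standalone sketch checking that the resulting integers stay in [0, k - 1] and that reseeding (as `fix_random_seed` does with the same default) replays an identical sequence:

```python
import numpy as np


def rand_k(k):
    """Uniform integer in [0, k - 1], mirroring util.rand_k."""
    return int(np.random.uniform(0, 1) * k)


np.random.seed(2147483647)            # same default seed as fix_random_seed
first = [rand_k(5) for _ in range(10)]
np.random.seed(2147483647)            # reseeding replays the same draws
second = [rand_k(5) for _ in range(10)]
assert first == second
assert all(0 <= d <= 4 for d in first)
```

This reseeding is why `InferenceEngine.infer` calls `fix_random_seed()` before sampling: it makes topic inference deterministic across runs.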
  {
    "path": "modules/text/language_model/lda_novel/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/lda_novel/vose_alias.py",
"content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_novel.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution.\n        Arg:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate samples from given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = rand()\n        # Accept bucket dart1 with probability prob[dart1], otherwise its alias.\n        return dart1 if dart2 < self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
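The `initialize` routine above is Vose's alias construction: O(n) setup, then O(1) sampling. A standalone sketch (hypothetical helper names) that verifies the defining invariant of the finished table — each bucket holds unit mass, split between its own index and its alias:

```python
import numpy as np


def build_alias(dist):
    """Vose's alias method: build the (prob, alias) table from a distribution."""
    n = dist.shape[0]
    prob = np.zeros(n)
    alias = np.zeros(n, dtype=np.int64)
    p = dist / dist.sum() * n  # scale so the average bucket mass is 1
    small = [i for i in range(n) if p[i] < 1.0]
    large = [i for i in range(n) if p[i] >= 1.0]
    while small and large:
        l, g = small.pop(), large.pop()
        prob[l] = p[l]
        alias[l] = g
        p[g] += p[l] - 1.0  # the large bucket donates mass to fill the small one
        (small if p[g] < 1.0 else large).append(g)
    for i in large + small:  # leftovers are (numerically) full buckets
        prob[i] = 1.0
    return prob, alias


def sample(prob, alias, rng):
    i = rng.integers(prob.shape[0])  # pick a bucket uniformly
    return i if rng.random() < prob[i] else alias[i]


# Reconstruct the normalized distribution from the table: bucket i
# contributes prob[i] to outcome i and (1 - prob[i]) to alias[i].
dist = np.array([0.1, 0.4, 0.3, 0.2])
prob, alias = build_alias(dist)
recon = np.zeros_like(dist)
for i in range(len(dist)):
    recon[i] += prob[i]
    recon[alias[i]] += 1.0 - prob[i]
assert np.allclose(recon / len(dist), dist / dist.sum())

rng = np.random.default_rng(0)
assert 0 <= sample(prob, alias, rng) < len(dist)
```

The reconstruction check is a cheap deterministic test that the table encodes exactly the input distribution, without relying on empirical sampling frequencies.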
  {
    "path": "modules/text/language_model/lda_webpage/README.md",
    "content": "# lda_webpage\n\n|模型名称|lda_webpage|\n| :--- | :---: | \n|类别|文本-主题模型|\n|网络|LDA|\n|数据集|百度自建网页领域数据集|\n|是否支持Fine-tuning|否|\n|模型大小|31MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中LDA(Latent Dirichlet Allocation)算法是主题模型的一种。LDA根据对词的共现信息的分析，拟合出词-文档-主题的分布，从而将词、文本映射到一个语义空间中。\n\n  <p align=\"center\">\n  <img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/lda.png\" hspace='10'/> <br />\n  </p>\n\n  更多详情请参考[LDA论文](http://www.jmlr.org/papers/volume3/blei03a/blei03a.pdf)。\n\n  注：该Module由第三方开发者DesmonDay贡献。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.8.2\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lda_webpage\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n``` python\nimport paddlehub as hub\n\nlda_webpage = hub.Module(name=\"lda_webpage\")\njsd, hd = lda_webpage.cal_doc_distance(doc_text1=\"百度的网页上有着各种新闻的推荐，内容丰富多彩。\", doc_text2=\"百度首页推荐着各种新闻，还提供了强大的搜索引擎功能。\")\n# jsd = 0.00249, hd = 0.0510\n\nresults = lda_webpage.cal_doc_keywords_similarity('百度首页推荐着各种新闻，还提供了强大的搜索引擎功能。')\n#  [{'word': '强大', 'similarity': 0.0838851256627093},\n#   {'word': '推荐', 'similarity': 0.06295345182499558},\n#   {'word': '新闻', 'similarity': 0.05894049247832139},\n#   {'word': '提供', 'similarity': 0.04179908620523299},\n#   {'word': '百度', 'similarity': 0.033778847361833536},\n#   {'word': '首页', 'similarity': 0.018429949496365026},\n#   {'word': '功能', 'similarity': 0.011409342579361237},\n#   {'word': '搜索引擎', 'similarity': 0.010392479335778413}]\n\nout = lda_webpage.cal_query_doc_similarity(query='百度搜索引擎', 
document='百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。百度超过千亿的中文网页数据库，可以瞬间找到相关的搜索结果。')\n# out = 0.0283\n\nresults = lda_webpage.infer_doc_topic_distribution(\"百度文库非常的好用，我们不仅在里面找到需要的文档，同时可以通过续费畅读精品文档。\")\n# [{'topic id': 3458, 'distribution': 0.5277777777777778},\n#  {'topic id': 1927, 'distribution': 0.17777777777777778},\n#  {'topic id': 1497, 'distribution': 0.05},\n#  {'topic id': 1901, 'distribution': 0.03333333333333333}...]\n\nkeywords = lda_webpage.show_topic_keywords(3458)\n# {'price': 0.10977647395316775,\n#  '文档': 0.06445075002937038,\n#  '财富值': 0.04012675135746289,\n#  '文库': 0.03953267826572788,\n#  'len': 0.038856163693739426,\n#  'tag': 0.03868762622172197,\n#  'current': 0.03728225157463761,\n#  'cut': 0.03448665775467454,\n#  '尺寸': 0.03250387028891812,\n#  '财富': 0.02902896727051734}\n\n```\n\n  - #### 查看代码\n  https://github.com/baidu/Familia\n\n- ### 2、API\n\n  - ```python\n    cal_doc_distance(doc_text1, doc_text2)\n    ```\n\n    - 用于计算两个输入文档之间的距离，包括Jensen-Shannon divergence(JS散度)、Hellinger Distance(海林格距离)。\n\n    - **参数**\n\n      - doc_text1(str): 输入的第一个文档。\n      - doc_text2(str): 输入的第二个文档。   \n\n    - **返回**\n\n      - jsd(float): 两个文档之间的JS散度([Jensen-Shannon divergence](https://blog.csdn.net/FrankieHello/article/details/80614422?utm_source=copy))。\n      - hd(float): 两个文档之间的海林格距离([Hellinger Distance](http://blog.sina.com.cn/s/blog_85f1ffb70101e65d.html))。    \n\n  - ```python\n    cal_doc_keywords_similarity(document, top_k=10)\n    ```\n\n    - 用于查找输入文档的前k个关键词及对应的与原文档的相似度。\n\n    - **参数**\n\n      - document(str): 输入文档。\n      - top_k(int): 查找输入文档的前k个关键词。\n\n    - **返回**\n\n      - results(list): 包含每个关键词以及对应的与原文档的相似度。其中，list的基本元素为dict，dict的key为关键词，value为对应的与原文档的相似度。    \n\n  - ```python\n    cal_query_doc_similarity(query, document)\n    ```\n\n    - 用于计算短文档与长文档之间的相似度。\n\n    -  **参数**\n\n      - query(str): 输入的短文档。\n      - document(str): 输入的长文档。\n\n    -  **返回**\n\n      - lda_sim(float): 返回短文档与长文档之间的相似度。 \n\n  - ```python\n    
infer_doc_topic_distribution(document)\n    ```\n\n    - 用于推理出文档的主题分布。\n\n      - **参数**\n\n        - document(str): 输入文档。\n\n      - **返回**\n\n        - results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n  - ```python\n    show_topic_keywords(topic_id, k=10)\n    ```\n\n    - 用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n      - **参数**\n\n        - topic_id(int): 主题ID。\n        - k(int): 需要知道对应主题的前k个关键词。\n\n      - **返回**\n\n        - results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。         \n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复因为return的bug导致的NoneType错误\n\n* 1.0.2\n\n  修复由于Windows`gbk`编码导致的问题\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/lda_webpage/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of documents in sparse format.\n           By default, it is sorted according to the topic probability\n           under the descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n\n            def take_elem(topic):\n                return topic.prob\n\n            topic_dist.sort(key=take_elem, reverse=True)\n            if topic_dist is None:\n                topic_dist = []\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.config import ModelConfig\nfrom lda_webpage.util import load_prototxt, fix_random_seed, rand_k\nfrom lda_webpage.model import TopicModel\nfrom lda_webpage.sampler import GibbsSampler, MHSampler\nfrom lda_webpage.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_webpage.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n 
           for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # 字典列表\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic 
in the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.inference_engine import InferenceEngine\nfrom lda_webpage.document import LDADoc\nfrom lda_webpage.semantic_matching import SemanticMatching, WordAndDis\nfrom lda_webpage.tokenizer import LACTokenizer, SimpleTokenizer\nfrom lda_webpage.config import ModelType\nfrom lda_webpage.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"lda_webpage\",\n    version=\"1.0.2\",\n    summary=\n    \"This is a PaddleHub Module for LDA topic model in webpage dataset, where we can calculate doc distance, calculate the similarity between query and document, etc.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'webpage')\n        self.conf_file = 'lda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish initialization.\")\n\n    def cal_doc_distance(self, doc_text1, doc_text2):\n        \"\"\"\n        This interface calculates the distance between 
documents.\n\n        Args:\n            doc_text1(str): the input document text 1.\n            doc_text2(str): the input document text 2.\n\n        Returns:\n            jsd(float): Jensen-Shannon Divergence distance of two documents.\n            hd(float): Hellinger Distance of two documents.\n        \"\"\"\n        doc1_tokens = self.__tokenizer.tokenize(doc_text1)\n        doc2_tokens = self.__tokenizer.tokenize(doc_text2)\n\n        # Document topic inference.\n        doc1, doc2 = LDADoc(), LDADoc()\n        self.__engine.infer(doc1_tokens, doc1)\n        self.__engine.infer(doc2_tokens, doc2)\n\n        # To calculate jsd, we need dense document topic distribution.\n        dense_dict1 = doc1.dense_topic_dist()\n        dense_dict2 = doc2.dense_topic_dist()\n        # Calculate the distance between distributions.\n        # The smaller the distance, the higher the document semantic similarity.\n        sm = SemanticMatching()\n        jsd = sm.jensen_shannon_divergence(dense_dict1, dense_dict2)\n        hd = sm.hellinger_distance(dense_dict1, dense_dict2)\n\n        return jsd, hd\n\n    def cal_doc_keywords_similarity(self, document, top_k=10):\n        \"\"\"\n        This interface can be used to find topk keywords of document.\n\n        Args:\n            document(str): the input document text.\n            top_k(int): top k keywords of this document.\n\n        Returns:\n            results(list): contains top_k keywords and their\n                     corresponding similarity compared to document.\n        \"\"\"\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        # Do topic inference on documents to obtain topic distribution.\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        items = []\n        words = set()\n        for word in d_tokens:\n            if word in words:\n                continue\n            words.add(word)\n            wd = 
WordAndDis()\n            wd.word = word\n            sm = SemanticMatching()\n            wd.distance = sm.likelihood_based_similarity(\n                terms=[word], doc_topic_dist=doc_topic_dist, model=self.__engine.get_model())\n            items.append(wd)\n\n        def take_elem(word_dis):\n            return word_dis.distance\n\n        items.sort(key=take_elem, reverse=True)\n\n        results = []\n        size = len(items)\n        for i in range(top_k):\n            if i >= size:\n                break\n            results.append({\"word\": items[i].word, \"similarity\": items[i].distance})\n\n        return results\n\n    def cal_query_doc_similarity(self, query, document):\n        \"\"\"\n        This interface calculates the similarity between query and document.\n\n        Args:\n            query(str): the input query text.\n            document(str): the input document text.\n\n        Returns:\n            lda_sim(float): likelihood based similarity between query and document based on LDA.\n        \"\"\"\n        q_tokens = self.__tokenizer.tokenize(query)\n        d_tokens = self.__tokenizer.tokenize(document)\n\n        doc = LDADoc()\n        self.__engine.infer(d_tokens, doc)\n        doc_topic_dist = doc.sparse_topic_dist()\n\n        sm = SemanticMatching()\n        lda_sim = sm.likelihood_based_similarity(q_tokens, doc_topic_dist, self.__engine.get_model())\n\n        return lda_sim\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            document(str): the input document text.\n\n        Returns:\n            results(list): returns the topic distribution of document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if tokens == []:\n            return []\n        results = []\n        doc = LDADoc()\n        self.__engine.infer(tokens, doc)\n        topics = doc.sparse_topic_dist()\n        for 
topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the top k keywords under a specific topic.\n\n        Args:\n            topic_id(int): the id of the topic to look up.\n            k(int): top k keywords.\n\n        Returns:\n            results(dict): contains the specified topic's keywords and\n                     their corresponding probabilities.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n            return results\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/sampler.py",
    "content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.document import LDADoc, SLDADoc, Token, Sentence\nfrom lda_webpage.vose_alias import VoseAlias\nfrom lda_webpage.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float)\n             
   self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: if the sentence is long, multiplying many small factors can\n                # underflow and lose precision.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n 
       jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812  # 1 / sqrt(2)\n        return result\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/tokenizer.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple version FMM(Forward Maximun Matching) word tokenizer. This tokenizer can only\n       be used in topic model demo, but not in real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenize result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n     
   \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Lowercase English words and keep only the words present in the vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
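The greedy longest-match loop in `SimpleTokenizer.tokenize` above can be sketched in isolation. Below is a minimal forward-maximum-matching pass over a toy vocabulary; the `fmm_tokenize` helper and its vocab are hypothetical, for illustration only, not part of the module:

```python
def fmm_tokenize(text, vocab, max_word_len):
    """Greedy longest-match (FMM) segmentation over a fixed vocab."""
    result = []
    i = 0
    while i < len(text):
        found = ""
        # Try every candidate span starting at i; keep the longest vocab hit.
        for j in range(i, min(i + max_word_len, len(text))):
            piece = text[i:j + 1]
            if piece in vocab:
                found = piece
        if found:
            result.append(found)
            i += len(found)
        else:
            i += 1  # character not covered by any vocab word: skip it
    return result

toy_vocab = {"deep", "learning", "deep learning"}
tokens = fmm_tokenize("deep learning rocks", toy_vocab, max_word_len=13)
```

As in the module's tokenizer, only words present in the vocab survive: the longest match "deep learning" wins over "deep", and "rocks" is dropped entirely.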
  {
    "path": "modules/text/language_model/lda_webpage/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading LDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Return a random integer in the range [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Decorator that prints the time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
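`rand_k` above maps a uniform float in [0, 1) onto an integer bucket. A self-contained sketch reimplementing `rand` and `rand_k` with NumPy (as the module does) to show the range they produce:

```python
import numpy as np

# Standalone reimplementation of util.rand / util.rand_k for illustration:
# a uniform float in [0, 1) scaled by k and truncated yields an integer in [0, k - 1].
def rand(min_=0, max_=1):
    return np.random.uniform(low=min_, high=max_)

def rand_k(k):
    return int(rand() * k)

np.random.seed(0)  # fixed seed so the draw sequence is reproducible
samples = [rand_k(5) for _ in range(1000)]
```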
  {
    "path": "modules/text/language_model/lda_webpage/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/lda_webpage/vose_alias.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom lda_webpage.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Arg:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate a sample from the given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        # Draw a uniform float in [0, 1); keep the chosen column with probability\n        # __prob[dart1], otherwise fall through to its alias.\n        dart2 = rand()\n        return dart1 if dart2 <= self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
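The alias table built by `VoseAlias.initialize` supports O(1) sampling: pick a column uniformly, then flip a biased coin between the column and its alias. A self-contained sketch of the same construction with an empirical frequency check; the `build_alias`/`sample` names are illustrative, not part of the module:

```python
import numpy as np

def build_alias(weights):
    """Build Vose alias tables (prob, alias) for an unnormalized distribution."""
    n = len(weights)
    prob = np.zeros(n)
    alias = np.zeros(n, dtype=np.int64)
    scaled = np.asarray(weights, dtype=float) * n / np.sum(weights)
    small = [i for i, v in enumerate(scaled) if v < 1.0]
    large = [i for i, v in enumerate(scaled) if v >= 1.0]
    while small and large:
        l, g = small.pop(), large.pop()
        prob[l] = scaled[l]           # probability of keeping column l
        alias[l] = g                  # otherwise redirect to g
        scaled[g] += scaled[l] - 1.0  # g donates mass to fill column l
        (small if scaled[g] < 1.0 else large).append(g)
    for i in large + small:           # leftovers are exactly-full columns
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng):
    i = rng.integers(len(prob))                        # uniform column
    return i if rng.random() < prob[i] else alias[i]   # biased coin on a float

rng = np.random.default_rng(0)
prob, alias = build_alias([1.0, 2.0, 7.0])
draws = np.array([sample(prob, alias, rng) for _ in range(20000)])
freq = np.bincount(draws, minlength=3) / 20000.0
```

The coin flip must compare a uniform *float* against `prob[i]`; truncating the draw to an integer would make the comparison degenerate and skew every sample toward the alias column.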
  {
    "path": "modules/text/language_model/rbt3/README.md",
    "content": "```shell\n$ hub install rbt3==2.0.2\n```\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[RoBERTa论文](https://arxiv.org/abs/1907.11692)、[Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## API\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='rbt3',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m rbt3\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]]\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/rbt3\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n## 查看代码\nhttps://github.com/ymcui/Chinese-BERT-wwm\n\n\n## 贡献者\n\n[ymcui](https://github.com/ymcui)\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.2\n\n  更新预训练模型调用方法\n"
  },
  {
    "path": "modules/text/language_model/rbt3/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/rbt3/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers import AutoModel\nfrom paddlenlp.transformers import AutoModelForSequenceClassification\nfrom paddlenlp.transformers import AutoModelForTokenClassification\nfrom paddlenlp.transformers import AutoTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"rbt3\",\n    version=\"2.0.2\",\n    summary=\"rbt3, 3-layer, 768-hidden, 12-heads, 38M parameters \",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Roberta(nn.Layer):\n    \"\"\"\n    RoBERTa model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Roberta, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 
'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AutoModelForSequenceClassification.from_pretrained(pretrained_model_name_or_path='hfl/rbt3',\n                                                                            num_classes=self.num_classes,\n                                                                            **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AutoModelForTokenClassification.from_pretrained(pretrained_model_name_or_path='hfl/rbt3',\n                                                                         num_classes=self.num_classes,\n                                                                         **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/rbt3', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/rbt3', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = 
task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n   
                 self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n            
    acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AutoTokenizer.from_pretrained(pretrained_model_name_or_path='hfl/rbt3', *args, **kwargs)\n"
  },
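The `text-matching` branch in the module above mean-pools token embeddings under the attention mask before classification. A minimal NumPy sketch of that masked mean-pooling step; shapes, the `masked_mean_pool` name, and `pad_id=0` are illustrative assumptions, not part of the module:

```python
import numpy as np

def masked_mean_pool(token_embeddings, input_ids, pad_id=0):
    """Average token embeddings over real tokens only, ignoring padding.

    token_embeddings: (batch, seq_len, hidden); input_ids: (batch, seq_len).
    Mirrors the query/title pooling in the text-matching branch.
    """
    mask = (input_ids != pad_id).astype(token_embeddings.dtype)[:, :, None]
    summed = (token_embeddings * mask).sum(axis=1)  # zero out pads, then sum
    counts = mask.sum(axis=1)                       # number of real tokens
    return summed / counts

emb = np.ones((1, 4, 2))
emb[0, 2:] = 5.0                # last two positions are padding noise
ids = np.array([[7, 8, 0, 0]])  # positions with id 0 are padding
pooled = masked_mean_pool(emb, ids)
```

Without the mask, the padding embeddings would leak into the sentence vector; with it, only the two real tokens contribute to the mean.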
  {
    "path": "modules/text/language_model/rbtl3/README.md",
    "content": "```shell\n$ hub install rbtl3==2.0.2\n```\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[RoBERTa论文](https://arxiv.org/abs/1907.11692)、[Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## API\n```python\ndef __init__(\n    task=None,\n    load_checkpoint=None,\n    label_map=None,\n    num_classes=2,\n    suffix=False,\n    **kwargs,\n)\n```\n\n创建Module对象（动态图组网版本）。\n\n**参数**\n\n* `task`： 任务名称，可为`seq-cls`(文本分类任务，原来的`sequence_classification`在未来会被弃用)或`token-cls`(序列标注任务)。\n* `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n* `label_map`：预测时的类别映射表。\n* `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n* `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n* `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n```python\ndef predict(\n    data,\n    max_seq_len=128,\n    batch_size=1,\n    use_gpu=False\n)\n```\n\n**参数**\n\n* `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n* `max_seq_len`：模型处理文本的最大长度\n* `batch_size`：模型批处理大小\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，不同任务类型的返回结果如下\n  * 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n  * 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n```python\ndef get_embedding(\n    data,\n    use_gpu=False\n)\n```\n\n用于获取输入文本的句子粒度特征与字粒度特征\n\n**参数**\n\n* `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n* `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n**返回**\n\n* `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, 
sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='rbtl3',\n    version='2.0.2',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/text_classification)\n- [序列标注](https://github.com/PaddlePaddle/PaddleHub/tree/release/v2.0.0-beta/demo/sequence_labeling)\n\n## 服务部署\n\nPaddleHub Serving可以部署一个在线获取预训练词向量的服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m rbtl3\n```\n\n这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 指定用于获取embedding的文本[[text_1], [text_2], ... 
]]\ntext = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n# 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n# 对应本地部署，则为module.get_embedding(data=text)\ndata = {\"data\": text}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/rbtl3\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n## 查看代码\nhttps://github.com/ymcui/Chinese-BERT-wwm\n\n\n## 贡献者\n\n[ymcui](https://github.com/ymcui)\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.1\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.2\n\n  更新预训练模型调用方法\n"
  },
  {
    "path": "modules/text/language_model/rbtl3/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/rbtl3/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers import AutoModel\nfrom paddlenlp.transformers import AutoModelForSequenceClassification\nfrom paddlenlp.transformers import AutoModelForTokenClassification\nfrom paddlenlp.transformers import AutoTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"rbtl3\",\n    version=\"2.0.2\",\n    summary=\"rbtl3, 3-layer, 1024-hidden, 16-heads, 61M parameters \",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Roberta(nn.Layer):\n    \"\"\"\n    RoBERTa model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Roberta, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 
'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AutoModelForSequenceClassification.from_pretrained(pretrained_model_name_or_path='hfl/rbtl3',\n                                                                            num_classes=self.num_classes,\n                                                                            **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AutoModelForTokenClassification.from_pretrained(pretrained_model_name_or_path='hfl/rbtl3',\n                                                                         num_classes=self.num_classes,\n                                                                         **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/rbtl3', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/rbtl3', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        
self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, 
num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = 
self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AutoTokenizer.from_pretrained(pretrained_model_name_or_path='hfl/rbtl3', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext/README.md",
    "content": "# roberta-wwm-ext\n|模型名称|roberta-wwm-ext|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|roberta-wwm-ext|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|391MB|\n|最新更新日期|2021-03-16|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[RoBERTa论文](https://arxiv.org/abs/1907.11692)、[Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install roberta-wwm-ext\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='roberta-wwm-ext',\n    version='2.0.3',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - 
`load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m roberta-wwm-ext\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n 
   import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/roberta-wwm-ext\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.3\n\n  更新预训练模型调用方法\n  ```shell\n  $ hub install roberta-wwm-ext==2.0.3\n  ```\n"
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers import AutoModel\nfrom paddlenlp.transformers import AutoModelForSequenceClassification\nfrom paddlenlp.transformers import AutoModelForTokenClassification\nfrom paddlenlp.transformers import AutoTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"roberta-wwm-ext\",\n    version=\"2.0.3\",\n    summary=\n    \"chinese-roberta-wwm-ext, 12-layer, 768-hidden, 12-heads, 110M parameters.  
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Roberta(nn.Layer):\n    \"\"\"\n    RoBERTa model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Roberta, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AutoModelForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='hfl/roberta-wwm-ext', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AutoModelForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='hfl/roberta-wwm-ext', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n            
self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n        
        acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = 
paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AutoTokenizer.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext-large/README.md",
    "content": "# roberta-wwm-ext-large\n|模型名称|roberta-wwm-ext-large|\n| :--- | :---: |\n|类别|文本-语义模型|\n|网络|roberta-wwm-ext-large|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|是|\n|模型大小|1.3GB|\n|最新更新日期|2021-03-16|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/bert_network.png\"  hspace='10'/> <br />\n</p>\n\n更多详情请参考[RoBERTa论文](https://arxiv.org/abs/1907.11692)、[Chinese-BERT-wwm技术报告](https://arxiv.org/abs/1906.08101)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install roberta-wwm-ext-large\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n```python\nimport paddlehub as hub\n\ndata = [\n    ['这个宾馆比较陈旧了，特价的房间也很一般。总体来说一般'],\n    ['怀着十分激动的心情放映，可是看着看着发现，在放映完毕后，出现一集米老鼠的动画片'],\n    ['作为老的四星酒店，房间依然很整洁，相当不错。机场接机服务很好，可以在车上办理入住手续，节省时间。'],\n]\nlabel_map = {0: 'negative', 1: 'positive'}\n\nmodel = hub.Module(\n    name='roberta-wwm-ext-large',\n    version='2.0.3',\n    task='seq-cls',\n    load_checkpoint='/path/to/parameters',\n    label_map=label_map)\nresults = model.predict(data, max_seq_len=50, batch_size=1, use_gpu=False)\nfor idx, text in enumerate(data):\n    print('Data: {} \\t Label: {}'.format(text, results[idx]))\n```\n\n详情可参考PaddleHub示例：\n- [文本分类](../../../../demo/text_classification)\n- [序列标注](../../../../demo/sequence_labeling)\n\n- ### 2、API\n\n  - ```python\n    def __init__(\n        task=None,\n        load_checkpoint=None,\n        label_map=None,\n        num_classes=2,\n        suffix=False,\n        **kwargs,\n    )\n    ```\n\n    - 创建Module对象（动态图组网版本）\n\n    - **参数**\n\n      - `task`： 
任务名称，可为`seq-cls`(文本分类任务)或`token-cls`(序列标注任务)。\n      - `load_checkpoint`：使用PaddleHub Fine-tune api训练保存的模型参数文件路径。\n      - `label_map`：预测时的类别映射表。\n      - `num_classes`：分类任务的类别数，如果指定了`label_map`，此参数可不传，默认2分类。\n      - `suffix`: 序列标注任务的标签格式，如果设定为`True`，标签以'-B', '-I', '-E' 或者 '-S'为结尾，此参数默认为`False`。\n      - `**kwargs`：用户额外指定的关键字字典类型的参数。\n\n  - ```python\n    def predict(\n        data,\n        max_seq_len=128,\n        batch_size=1,\n        use_gpu=False\n    )\n    ```\n\n    - **参数**\n\n      - `data`： 待预测数据，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。每个样例文本数量（1个或者2个）需和训练时保持一致。\n      - `max_seq_len`：模型处理文本的最大长度\n      - `batch_size`：模型批处理大小\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，不同任务类型的返回结果如下\n        - 文本分类：列表里包含每个句子的预测标签，格式为\\[label\\_1, label\\_2, …,\\]\n        - 序列标注：列表里包含每个句子每个token的预测标签，格式为\\[\\[token\\_1, token\\_2, …,\\], \\[token\\_1, token\\_2, …,\\], …,\\]\n\n  - ```python\n    def get_embedding(\n      data,\n      use_gpu=False\n    )\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - `data`：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - `use_gpu`：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。\n\n    - **返回**\n\n      - `results`：list类型，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。  \n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线获取预训练词向量。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m roberta-wwm-ext-large\n    ```\n\n  - 这样就完成了一个获取预训练词向量服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 
配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 指定用于获取embedding的文本[[text_1], [text_2], ... ]\n    text = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    # 以key的方式指定text传入预测方法时的参数，此例中为\"data\"\n    # 对应本地部署，则为module.get_embedding(data=text)\n    data = {\"data\": text}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/roberta-wwm-ext-large\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 2.0.0\n\n  全面升级动态图，接口有所变化。\n\n* 2.0.1\n\n  任务名称调整，增加序列标注任务`token-cls`\n\n* 2.0.2\n\n  增加文本匹配任务`text-matching`\n\n* 2.0.3\n\n  更新预训练模型调用方法\n  ```shell\n  $ hub install roberta-wwm-ext-large==2.0.3\n  ```\n"
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext-large/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/roberta-wwm-ext-large/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport math\nimport os\nfrom typing import Dict\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.metrics import ChunkEvaluator\nfrom paddlenlp.transformers import AutoModel\nfrom paddlenlp.transformers import AutoModelForSequenceClassification\nfrom paddlenlp.transformers import AutoModelForTokenClassification\nfrom paddlenlp.transformers import AutoTokenizer\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.nlp_module import TransformerModule\nfrom paddlehub.utils.log import logger\n\n\n@moduleinfo(\n    name=\"roberta-wwm-ext-large\",\n    version=\"2.0.3\",\n    summary=\n    \"chinese-roberta-wwm-ext-large, 24-layer, 1024-hidden, 16-heads, 340M parameters. 
The module is executed as paddle.dygraph.\",\n    author=\"ymcui\",\n    author_email=\"ymcui@ir.hit.edu.cn\",\n    type=\"nlp/semantic_model\",\n    meta=TransformerModule,\n)\nclass Roberta(nn.Layer):\n    \"\"\"\n    RoBERTa model\n    \"\"\"\n\n    def __init__(\n        self,\n        task: str = None,\n        load_checkpoint: str = None,\n        label_map: Dict = None,\n        num_classes: int = 2,\n        suffix: bool = False,\n        **kwargs,\n    ):\n        super(Roberta, self).__init__()\n        if label_map:\n            self.label_map = label_map\n            self.num_classes = len(label_map)\n        else:\n            self.num_classes = num_classes\n\n        if task == 'sequence_classification':\n            task = 'seq-cls'\n            logger.warning(\n                \"current task name 'sequence_classification' was renamed to 'seq-cls', \"\n                \"'sequence_classification' has been deprecated and will be removed in the future.\", )\n        if task == 'seq-cls':\n            self.model = AutoModelForSequenceClassification.from_pretrained(\n                pretrained_model_name_or_path='hfl/roberta-wwm-ext-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task == 'token-cls':\n            self.model = AutoModelForTokenClassification.from_pretrained(\n                pretrained_model_name_or_path='hfl/roberta-wwm-ext-large', num_classes=self.num_classes, **kwargs)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = ChunkEvaluator(label_list=[self.label_map[i] for i in sorted(self.label_map.keys())],\n                                         suffix=suffix)\n        elif task == 'text-matching':\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext-large', **kwargs)\n            self.dropout = paddle.nn.Dropout(0.1)\n   
         self.classifier = paddle.nn.Linear(self.model.config['hidden_size'] * 3, 2)\n            self.criterion = paddle.nn.loss.CrossEntropyLoss()\n            self.metric = paddle.metric.Accuracy()\n        elif task is None:\n            self.model = AutoModel.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext-large', **kwargs)\n        else:\n            raise RuntimeError(\"Unknown task {}, task should be one in {}\".format(task, self._tasks_supported))\n\n        self.task = task\n\n        if load_checkpoint is not None and os.path.isfile(load_checkpoint):\n            state_dict = paddle.load(load_checkpoint)\n            self.set_state_dict(state_dict)\n            logger.info('Loaded parameters from %s' % os.path.abspath(load_checkpoint))\n\n    def forward(self,\n                input_ids=None,\n                token_type_ids=None,\n                position_ids=None,\n                attention_mask=None,\n                query_input_ids=None,\n                query_token_type_ids=None,\n                query_position_ids=None,\n                query_attention_mask=None,\n                title_input_ids=None,\n                title_token_type_ids=None,\n                title_position_ids=None,\n                title_attention_mask=None,\n                seq_lengths=None,\n                labels=None):\n\n        if self.task != 'text-matching':\n            result = self.model(input_ids, token_type_ids, position_ids, attention_mask)\n        else:\n            query_result = self.model(query_input_ids, query_token_type_ids, query_position_ids, query_attention_mask)\n            title_result = self.model(title_input_ids, title_token_type_ids, title_position_ids, title_attention_mask)\n\n        if self.task == 'seq-cls':\n            logits = result\n            probs = F.softmax(logits, axis=1)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, 
labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        elif self.task == 'token-cls':\n            logits = result\n            token_level_probs = F.softmax(logits, axis=-1)\n            preds = token_level_probs.argmax(axis=-1)\n            if labels is not None:\n                loss = self.criterion(logits, labels.unsqueeze(-1))\n                num_infer_chunks, num_label_chunks, num_correct_chunks = \\\n                    self.metric.compute(None, seq_lengths, preds, labels)\n                self.metric.update(num_infer_chunks.numpy(), num_label_chunks.numpy(), num_correct_chunks.numpy())\n                _, _, f1_score = map(float, self.metric.accumulate())\n                return token_level_probs, loss, {'f1_score': f1_score}\n            return token_level_probs\n        elif self.task == 'text-matching':\n            query_token_embedding, _ = query_result\n            query_token_embedding = self.dropout(query_token_embedding)\n            query_attention_mask = paddle.unsqueeze(\n                (query_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            query_token_embedding = query_token_embedding * query_attention_mask\n            query_sum_embedding = paddle.sum(query_token_embedding, axis=1)\n            query_sum_mask = paddle.sum(query_attention_mask, axis=1)\n            query_mean = query_sum_embedding / query_sum_mask\n\n            title_token_embedding, _ = title_result\n            title_token_embedding = self.dropout(title_token_embedding)\n            title_attention_mask = paddle.unsqueeze(\n                (title_input_ids != self.model.pad_token_id).astype(self.model.pooler.dense.weight.dtype), axis=2)\n            title_token_embedding = title_token_embedding * title_attention_mask\n            title_sum_embedding = paddle.sum(title_token_embedding, axis=1)\n            title_sum_mask = 
paddle.sum(title_attention_mask, axis=1)\n            title_mean = title_sum_embedding / title_sum_mask\n\n            sub = paddle.abs(paddle.subtract(query_mean, title_mean))\n            projection = paddle.concat([query_mean, title_mean, sub], axis=-1)\n            logits = self.classifier(projection)\n            probs = F.softmax(logits)\n            if labels is not None:\n                loss = self.criterion(logits, labels)\n                correct = self.metric.compute(probs, labels)\n                acc = self.metric.update(correct)\n                return probs, loss, {'acc': acc}\n            return probs\n        else:\n            sequence_output, pooled_output = result\n            return sequence_output, pooled_output\n\n    @staticmethod\n    def get_tokenizer(*args, **kwargs):\n        \"\"\"\n        Gets the tokenizer that is customized for this module.\n        \"\"\"\n        return AutoTokenizer.from_pretrained(pretrained_model_name_or_path='hfl/roberta-wwm-ext-large', *args, **kwargs)\n"
  },
  {
    "path": "modules/text/language_model/simnet_bow/README.md",
    "content": "# simnet_bow\n|模型名称|simnet_bow|\n| :--- | :---: |\n|类别|文本-语义匹配|\n|网络|BOW|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|245MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 短文本语义匹配(SimilarityNet, SimNet)是一个计算短文本相似度的模型，可以根据用户输入的两个文本，计算出相似度得分。SimNet在百度各产品上广泛应用，适用于信息检索、新闻推荐、智能客服等多个应用场景，帮助企业解决语义匹配问题。该PaddleHub Module基于BOW网络结构，支持预测。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install simnet_bow\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run simnet_bow --text_1 \"这道题很难\" --text_2 \"这道题不简单\"\n    ```\n    - 通过命令行方式实现语义匹配模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    simnet_bow = hub.Module(name=\"simnet_bow\")\n\n    # Data to be predicted\n    test_text_1 = [\"这道题太难了\", \"这道题太难了\", \"这道题太难了\"]\n    test_text_2 = [\"这道题是上一年的考题\", \"这道题不简单\", \"这道题很有意思\"]\n\n    inputs = {\"text_1\": test_text_1, \"text_2\": test_text_2}\n    results = simnet_bow.similarity(data=inputs, batch_size=2)\n    print(results)\n\n    # [{'text_1': '这道题太难了', 'text_2': '这道题是上一年的考题', 'similarity': 0.689}, {'text_1': '这道题太难了', 'text_2': '这道题不简单', 'similarity': 0.855}, {'text_1': '这道题太难了', 'text_2': '这道题很有意思', 'similarity': 0.8166}]\n    ```\n\n- ### 3、 API\n\n  - ```python\n    similarity(texts=[], use_gpu=False, batch_size=1)\n    ```\n\n    - simnet_bow预测接口，计算两个句子的cosine相似度\n\n    - **参数**\n\n      - texts(list): 待预测数据，第一个元素(list)为第一顺序句子，第二个元素(list)为第二顺序句子，两个元素长度相同。\n        如texts=[[\"这道题太难了\", \"这道题太难了\", \"这道题太难了\"], [\"这道题是上一年的考题\", \"这道题不简单\", 
\"这道题很有意思\"]]。\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 待预测数据的cosine相似度\n\n  - ```python\n    get_vocab_path()\n    ```\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线语义匹配服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m simnet_bow\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading simnet_bow successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    test_text_1 = [\"这道题太难了\", \"这道题太难了\", \"这道题太难了\"]\n    test_text_2 = [\"这道题是上一年的考题\", \"这道题不简单\", \"这道题很有意思\"]\n\n    text = [test_text_1, test_text_2]\n\n    # 设置运行配置\n    # 对应本地预测simnet_bow.similarity(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为simnet_bow并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/simnet_bow\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install simnet_bow==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/language_model/simnet_bow/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/simnet_bow/assets/params.txt",
    "content": "@HUB_simnet_bow@emb\n@HUB_simnet_bow@tanh.b\n@HUB_simnet_bow@tanh.w\n"
  },
  {
    "path": "modules/text/language_model/simnet_bow/assets/vocab.txt",
    "content": "赫尔曼·黑塞\t1\nweifeng\t2\n苗山\t3\n棍子\t4\n水平角\t5\n粘米粉\t6\n电脑投影仪\t7\n中国光大国际有限公司\t8\n爱程旅游网\t9\n知字\t10\n亿亩\t11\n耳鼻喉科\t12\n卫生计生局\t13\n集水器\t14\n内管\t15\nLUXURY\t16\n废钢破碎机\t17\n潍坊市人民医院\t18\n思南公馆\t19\n复华\t20\n雅思考试网\t21\n陈旭麓\t22\nCasino\t23\n青龙寺\t24\n太厉害\t25\n倍思\t26\n林子濠\t27\n房证\t28\nCHAP\t29\n鹰潭市\t30\n预交\t31\n残垣\t32\n蓝牙4.0\t33\n何琼\t34\n柏丽湾\t35\nFndroid\t36\n别错过\t37\n双合\t38\n真田广之\t39\n上海外事办_上海市人民政府外事办公室\t40\n交友\t41\n新甘十九妹\t42\nPHP编程\t43\n一鸣\t44\nPP\t45\nEncrypted\t46\n幽城幻剑录\t47\n生鲜类\t48\ntoki\t49\n丽水湾\t50\n咏柳\t51\n申诉\t52\n城南旧事\t53\nWwW\t54\narXiv\t55\n广州东\t56\n机械盘\t57\nAPP支付\t58\n珑门\t59\n掌厅\t60\n音译版\t61\n图室\t62\n灵魂兽\t63\n一门\t64\n薪王\t65\n20143dmax\t66\n行政审批局\t67\n慢牛\t68\npk10五码\t69\nxgs\t70\nRecommendations\t71\n杰楷\t72\n北京三中院\t73\n沈阳七中\t74\n蝙蝠侠故事版\t75\n上海市地方志办公室\t76\n一立方\t77\nItem#\t78\n广化\t79\n切向\t80\n细节\t81\n哑语\t82\n逆地理编码\t83\n131集\t84\n死亡笔记吧\t85\n南海撞机事件\t86\n雅坤\t87\n引文\t88\n迷宫篇\t89\n宗汉街道\t90\n补钱\t91\n肆无忌惮\t92\nstroke\t93\nHaoZip\t94\n凤尾竹\t95\n7元\t96\n绿城广场\t97\n本帮菜\t98\n刘兴华\t99\n大波\t100\nbnd\t101\nMac/Linux\t102\n众加商贸网\t103\n柳州市工人医院\t104\n怪杰\t105\nwel\t106\n助行器\t107\n漂色\t108\n匀速圆周运动\t109\n防风衣\t110\n铁甲雄心\t111\nintegrating\t112\n云尊币\t113\n特使\t114\nSip\t115\n绝景\t116\n查重算\t117\n4750\t118\n柳江区\t119\n全集吉吉影音\t120\nA-level\t121\n新疆大厦\t122\n24栋\t123\n皇帝\t124\n墨斗鱼\t125\n晓译\t126\n航天科工集团\t127\n_板\t128\n冷僻\t129\n西岚\t130\n万科清\t131\n放荡的女人\t132\nライダ\t133\n10.13.4\t134\nAon\t135\n24槽\t136\n皮海洲\t137\n背德寝取美人若妻\t138\n车型\t139\n新城街道\t140\n纤薄\t141\n武理工\t142\n砸盘\t143\n拓者\t144\n中西\t145\n藏茶\t146\n华豫\t147\n清单\t148\n酱子\t149\n携程火车票预订\t150\n受损\t151\n华翔\t152\nv2.03\t153\n江语晨\t154\n720S\t155\n番号大全\t156\n中国篮协\t157\n胡伟\t158\nsata3接口\t159\n添加值\t160\n长桌\t161\n亚马逊联盟\t162\n李豁子\t163\n男同桌\t164\n郑大三附院\t165\nclw\t166\n张石高速\t167\n壹侨\t168\nB轮\t169\n双月殿\t170\n碳六烯酸\t171\nQGraphicsView\t172\n宋美修\t173\n万泽\t174\n杭州凤凰网\t175\n迷迷\t176\nqt4.8\t177\n干枝\t178\n中国矿大\t179\n滨江社区\t180\nlettering\t181\n法律职业资格考试\t182\n雨洪\t183\n千金不换\t184\nMusik\t185\n速战\t186\nVita\t187\n主合\t188\nkama\t189\nFUEL\t190\n具身\t191\n原上\t192\n妇幼医院\t
193\n卻\t194\n投林\t195\n阿玛尼迪迪\t196\n游族网络\t197\n杭锦之窗\t198\n爱的宣言\t199\n圆扇\t200\n陪审\t201\n萧云\t202\n孵化期\t203\n重庆市国土房管局\t204\n实\t205\nBDSM\t206\n紫裳\t207\n大蒜素\t208\n既济\t209\n诩\t210\n苏州工业园区外国语学校\t211\n静灵口服液\t212\n用电器\t213\ngabrielle\t214\n鼓浪屿之波\t215\n税务师考试\t216\nOP2\t217\n生食\t218\n2768\t219\n魔鬼城\t220\n898\t221\n萃雅\t222\n优迪\t223\n驿城\t224\n绍兴日报\t225\nガ\t226\n50000公里\t227\n漆酶\t228\npsw\t229\nrequest、response\t230\n01_\t231\n字距\t232\n无线数传\t233\n久盛\t234\n超清全集\t235\n粉土\t236\n麦语言\t237\n人教版小学四年级数学下册\t238\n首发集团\t239\n添加\t240\n蹦子\t241\n十件\t242\nZIMO\t243\n衍生品\t244\n摊开\t245\n资源区\t246\n好好说话\t247\n购表\t248\nifanr\t249\n姐妹情\t250\n5330\t251\n丹东银行\t252\n三国志赵云传\t253\n公休假\t254\n六分之五\t255\np6p\t256\nprintable\t257\n加料\t258\nondraw\t259\nv4.3.1\t260\n拳甲\t261\n第十六条\t262\n跑商\t263\n中山公园\t264\nRecycled\t265\nv3.1.0\t266\n1465\t267\n氰化钠\t268\n完了\t269\njapanese\t270\n音义\t271\nipodshuffle\t272\n爱斯基摩犬\t273\n瑞鹤\t274\n小贾\t275\n诱\t276\n在医\t277\n武汉设计工程学院\t278\n激战奇轮\t279\n基准点\t280\n食管炎\t281\n营造林\t282\n飞快\t283\nadf\t284\n迷镇\t285\n维特拉论坛_汽车之家论坛\t286\n中国电信四川公司\t287\n兴隆县\t288\n狂气\t289\n纤维化\t290\n孔刘\t291\n0557.cn\t292\n以太坊区块链\t293\n景公\t294\n末日豪劫\t295\nJPEG\t296\n溧水万达广场\t297\n大功率电机\t298\n_v1.2\t299\n性奴案\t300\n李狗子\t301\ntommy\t302\nLeon\t303\nNOD32\t304\n东方card\t305\nvagaa哇嘎\t306\n齐藤\t307\nByteArray\t308\n随机\t309\n滕县\t310\ngbq4.0\t311\n泰坦陨落\t312\n法瑞\t313\n镇江路\t314\n水队\t315\n大石街\t316\n中华人民共和国建设部\t317\n戴尔Vostro\t318\nbowie\t319\nwin7/win8\t320\nf541\t321\njpeg\t322\n尼克·弗瑞\t323\n鱼籽\t324\n蓬头垢面\t325\n秦人\t326\n裁审\t327\n卡内尔\t328\n学习雷锋好榜样\t329\n死胡同\t330\n仓泵\t331\n心底事\t332\n捷豹xj\t333\nagile\t334\n杭\t335\nentrust\t336\nV8.1\t337\nSC2\t338\n花展\t339\navl\t340\n君泽君\t341\n炽炎\t342\n荞麦花\t343\nrecords\t344\n零排\t345\n一个20尺\t346\n地洞\t347\n里斯\t348\n桃酥王\t349\n拉塞尔·威斯布鲁克\t350\n临湖\t351\nmedsci\t352\n黄艳\t353\n瞪\t354\n王可\t355\n碧桂园学校\t356\n纯种\t357\n东风热线\t358\n斯拉克\t359\nkendogrid\t360\n迪吧\t361\n财政局\t362\nman2015\t363\n刁蛮公主\t364\nrsc\t365\n统计信息网\t366\n共生\t367\n大众车\t368\n上铁\t369\n外显子\t370\n甘露寺\t371\n8书网\t372\nAlfa\t373\n路虎发现4\t374\n中国时间\t375\n有效
值\t376\nheartbeat\t377\n于浩\t378\n肿瘤\t379\n振兴村\t380\nXPrivacy\t381\n几碗\t382\nfetree\t383\n小狗钱钱\t384\n龙王镇\t385\npdn\t386\n人民大会堂\t387\n火影忍者剧场版\t388\nCurator\t389\n仕女画\t390\n小米2S\t391\nn站\t392\nrevealed\t393\n镶贴\t394\n无限歌谣季\t395\naa\t396\n铅坠\t397\n北京易经学院\t398\n冷饮机\t399\n朱兰\t400\n澳门金沙城中心\t401\n锡安\t402\nIgxe\t403\nMuseo\t404\n岳麓山\t405\n石蜡油\t406\n温斯莱特\t407\n0511\t408\n气功师\t409\n即时通讯云\t410\n剧变\t411\n190g\t412\n抚养费\t413\n自平衡\t414\n肉痣\t415\nhongyan\t416\n千美黛\t417\n物流网\t418\n养殖业\t419\n霸权主义\t420\n丙酮酸\t421\n环保厅\t422\nEvangelion\t423\n建行\t424\n稀人\t425\n样板段\t426\n马塔\t427\n旧巷\t428\n苏通卡\t429\n一英尺\t430\nvisio2007\t431\n市盈率\t432\n挖坑机\t433\n应用包\t434\n越境\t435\n针机\t436\nEeboard\t437\nplateau\t438\n动漫产业\t439\n大荆\t440\n网络文化节\t441\n伊莱克斯\t442\n棋局\t443\n斗鱼鱼翅\t444\n东校\t445\n标段\t446\n移动安全区\t447\n深眸\t448\n民视\t449\nwww.555zw.com\t450\ndiscuss\t451\n魅蓝NOTE\t452\nwilde\t453\n教学观\t454\n雅可比\t455\n安徽信息工程学院\t456\n吉西他滨\t457\n层楼\t458\n北环大道\t459\n楠竹\t460\n登机\t461\nc罩杯\t462\nQQ欢乐居\t463\n都市水乡\t464\n三冰\t465\n渔岛\t466\n2017.doc\t467\n港妹\t468\n卜案\t469\n塑石\t470\nneighbors\t471\n72位\t472\n清中期\t473\n番号.rar\t474\n返修机\t475\n艾尔之光\t476\n石城镇\t477\nQQ3\t478\nresiduals\t479\n邮件组\t480\nunravel\t481\n价格指数\t482\n比较\t483\n福尔摩斯探案\t484\n长沙市知识产权局\t485\n广播节目\t486\nLumix\t487\n小黑屋\t488\n案名\t489\n眼见为\t490\noctane\t491\n播州\t492\nbbw\t493\nWinRE\t494\n二氧化硫\t495\n鼓吹\t496\n4600万\t497\n城北客运站\t498\n断背山\t499\n伯努利方程\t500\npcbeta\t501\nOpenSceneGraph\t502\n西乡街道\t503\n海马助手\t504\n长岛县\t505\n米尔萨普\t506\n三国篇\t507\n朱棣\t508\nLorde\t509\nAndorid\t510\n二成\t511\n效率源\t512\n大话西游2经典版\t513\n生产单位\t514\n赛盒\t515\n活量\t516\n兩\t517\n鸿运当头\t518\n接闪杆\t519\n南国篇\t520\ntypeahead\t521\n皎月\t522\n润园\t523\n明星大侦探3\t524\nMonika\t525\n童可可\t526\n香筒\t527\nFemale\t528\n中国保温网百事通\t529\n长筒靴\t530\n格键\t531\n曹爱华\t532\n浙二\t533\n临保\t534\n李航\t535\n点解\t536\n胡力\t537\n迪士尼酒店\t538\n王楚然\t539\n金旅\t540\n铂族\t541\n魅族pro6\t542\n京九高铁\t543\n洪亮\t544\nV7000\t545\nojdbc14.jar\t546\n吉祥谣\t547\n愿力\t548\n赵堡\t549\n价料\t550\npend\t551\n903路\t552\n三菱重工\t553\n非凡匠心\t554\n赖布衣\t555\n91c仔\t556\nicxo\t557\n风湿病科\t5
58\nHOVER\t559\n幼幼\t560\n第十一根\t561\n盖提亚\t562\nTGP版\t563\nX201i\t564\n注册资本金\t565\n外语人才网\t566\n芥花籽油\t567\n海姆立克\t568\n节子\t569\n后宫文\t570\n864\t571\n杜老师\t572\njayz\t573\n温州医科大学\t574\n第二十六期\t575\n确切\t576\n勇者斗恶龙3\t577\n火影战记\t578\n喜迎\t579\n平泽唯\t580\n方钢\t581\n小西满里惠\t582\n出掉\t583\n科力\t584\n大无畏\t585\n沙树\t586\n尼泊尔\t587\n黄连素片\t588\n烟台晚报\t589\n平大\t590\n极速\t591\n一个今\t592\n鸿合电子\t593\n亥时\t594\nrtmp协议\t595\n南靖新闻网\t596\n海勃伦\t597\n韩亚银行\t598\n乡镇\t599\n卢卡斯\t600\n加装\t601\n阳台柜\t602\n维吾尔语\t603\n福相\t604\n康熙字典体\t605\nSaki\t606\n万明\t607\n世纪鼎利\t608\n漯河火车站\t609\n通关单\t610\n俄方\t611\n红锁\t612\n不打不相识\t613\nentrypoint\t614\n数百只\t615\n国安\t616\n萧山经济技术开发区\t617\n吉林大学计算机科学与技术学院\t618\n高铁时刻\t619\n力帆股份\t620\n鹰目户外媒体网\t621\n马穆鲁克\t622\n20150303\t623\n来话\t624\nJonny\t625\n自然地理\t626\n马伯布拉德皮特\t627\n大发明\t628\n晋代\t629\n大山雀\t630\n封藏\t631\nEtiquette\t632\n9间\t633\n口算题\t634\nAPP软件工作室\t635\n腾讯广告算法\t636\njce\t637\n95度\t638\n张燕生\t639\n儒风\t640\nDVD9\t641\n高尚全\t642\n天津中国银行\t643\n熊孩子们\t644\nヒュ\t645\nC919\t646\ntreemap\t647\n满额\t648\n梅林软件中心\t649\n宜贷网\t650\n第一年\t651\n胡莱三国2\t652\n骑马与砍杀\t653\n盛水\t654\n彼得·杰克逊\t655\n顾欣怡\t656\n温泉度假酒店\t657\n中文维基\t658\n摩登兄弟\t659\n观星\t660\n撸主\t661\nvalentino\t662\n国共产党\t663\n华为荣耀3c\t664\n中环码头\t665\n火线传奇\t666\n38秒\t667\n微分公式\t668\n气概\t669\n装甲车\t670\n候机\t671\ndummy\t672\nitertools\t673\n陕西省历史博物馆\t674\n记录单\t675\n朱永新\t676\n天宸\t677\nArcmap\t678\n第53集\t679\n浙江省人民政府\t680\nhdi\t681\n5.7.15\t682\n在美\t683\ncpio\t684\n超女\t685\n双汇集团\t686\n基层板\t687\nvmware10\t688\n子承\t689\n衡水\t690\n天天美食网\t691\n双道\t692\nneutron\t693\nxmeye\t694\nFlex2\t695\n无击珠\t696\n神武刀塔传奇\t697\n拉开\t698\n中国对外承包工程商会\t699\n独立集\t700\ncnf\t701\n随身wifi\t702\n厦深铁路\t703\n2017年国庆节\t704\nmacrophage\t705\n居民\t706\n中国公路学会\t707\ntfboy\t708\n文辉\t709\n约翰尼德普\t710\n8.8.1\t711\n深圳律协\t712\n由于\t713\n王路\t714\nsim3\t715\njei\t716\n惠及\t717\n2915\t718\n投递\t719\n使用期\t720\n孙春星\t721\n吴玉生\t722\n鱼竿\t723\n叫卖\t724\n小暑\t725\nrisks\t726\n转度盘\t727\n20150802\t728\n李春平\t729\n羊杂汤\t730\n人生路\t731\narbitrary\t732\n科中\t733\n颖\t734\n丁丑年\t735\n考古发现\t736\n何事秋风悲画扇\t737\nb2b\t738\n老一辈\t739\n题
跋\t740\n千课\t741\n排水法\t742\nemu8086\t743\n扒\t744\n40关\t745\nYelion\t746\n张恒久\t747\nLimitations\t748\n天台人才网\t749\n精神分析\t750\nseiko\t751\n中路\t752\ncaipiao\t753\n40小时\t754\n虹梅南路\t755\n招商创融\t756\n陈露露\t757\n年费\t758\n哺乳期\t759\nBLD\t760\n奇瑞a3\t761\n雷达\t762\n光黑\t763\n苏阳\t764\n保\t765\n狐疑\t766\n口唇疱疹\t767\n圣克鲁斯\t768\nDmall\t769\n解手\t770\n48周年\t771\n金象大药房\t772\nNaha\t773\n粪便\t774\n经度\t775\nSuperS\t776\n三百多\t777\n文昭\t778\n小标题\t779\nnewsmy\t780\nFavicon\t781\n肛塞\t782\n组诗\t783\nfsl\t784\ngee\t785\n撞衫\t786\nfstab\t787\n招商局大厦\t788\n北斗星\t789\nSerializer\t790\n瓦尔沙拉\t791\npreflight\t792\n1367\t793\n唐禹哲\t794\n散氏盘\t795\n王志杰\t796\n通利\t797\n北京)有限公司\t798\nmodelandview\t799\n维族\t800\n白话\t801\n标志307\t802\n窗洞\t803\n星景\t804\n横店兔旅行网\t805\nPororo\t806\n合起来\t807\nubuntu16\t808\n女优们\t809\nshelly\t810\nmactype\t811\n魏斯曼\t812\nWebstore\t813\n加量\t814\n襄阳市区\t815\n一个80\t816\nattrib\t817\n多种\t818\n180寸\t819\n27\t820\n5782\t821\n苦中作乐\t822\n1078\t823\nLIBRARY\t824\n王城\t825\n江苏省地方税务局\t826\n黑板漆\t827\n依止\t828\n梨纱\t829\n上海石库门\t830\n大数据技术专业\t831\n凤县\t832\nHHT\t833\n教练式\t834\n性爱椅\t835\n箱照\t836\n班晓雪\t837\n案主\t838\n陵园\t839\n南乐吧\t840\nspringboo\t841\n江汉油田\t842\n绿景虹湾\t843\nsarah\t844\nDebbie\t845\n6264\t846\n6\t847\n1531\t848\n美图美妆\t849\n留置案\t850\n挂毯\t851\nnumlock\t852\n天天兄弟\t853\n赠品\t854\nPRO6\t855\nV1.4\t856\n湖南省人民政府国有资产监督管理委员会\t857\n祁门红茶\t858\n紫金港\t859\n30公斤\t860\nupda\t861\ntook\t862\n惠比\t863\n农牧民\t864\n冲压件\t865\n钟爱宝\t866\n小跑\t867\nnetmap\t868\n女孩\t869\n锅铲\t870\n张晓天龙八部\t871\n交大\t872\n二叔\t873\n2357\t874\n培训室\t875\n几盒\t876\n起亚K5\t877\n80k\t878\n丹龙\t879\n营企业\t880\n清华大学附属小学\t881\nDigi\t882\n睫毛\t883\n福昕pdf\t884\n理正软件\t885\n从零开始的异世界\t886\n成熟期\t887\nPT柜\t888\n海博\t889\n道士那些年\t890\n光模块\t891\n培训者\t892\nredis连接池\t893\n共枕\t894\n龙昊雪\t895\n水南村\t896\nplayerunknown\t897\n95533\t898\ncontexts\t899\n数十万元\t900\nOracle12c\t901\n第一杆\t902\n迎客松\t903\n销货\t904\n培训稿\t905\n红蓝眼镜\t906\n180327\t907\n安蕾尔\t908\n水贝村\t909\n生化危机:复仇\t910\n机长\t911\nDy\t912\n6669\t913\n垂直分辨率\t914\n224\t915\nfm2017吧\t916\n虎爷\t917\n面量\t918\ninstall4j\t919\n东湖城\t920\n屁股
机\t921\n破烂王\t922\nXW\t923\n佳工网\t924\n街舞版\t925\n11亿元\t926\n厚朴酚\t927\n大酒店集团\t928\nAscent\t929\nenfant\t930\n黑客们\t931\n最终幻想零式HD\t932\n圣上\t933\n咸淡\t934\nFox\t935\n乐视tv\t936\n2plus\t937\n皇者\t938\n苏北地区\t939\n消暑\t940\n建筑总平面图\t941\n0.001mm\t942\n雨伞\t943\n建筑工程系\t944\n赫姆勒\t945\n粗制\t946\n佛经\t947\n袁姓\t948\n售后包\t949\npository\t950\n7zip\t951\nprincomp\t952\nSOI\t953\n五战\t954\n乐少\t955\n代理业\t956\n班牙语\t957\n296号\t958\nSnagIt\t959\n潘神的迷宫\t960\n涵江区\t961\n_线\t962\n前几日\t963\n寒香\t964\n悦米\t965\n蓝灯\t966\n检测师\t967\nIT认证\t968\n俞斌\t969\n遗迹\t970\n同盟\t971\n一知半解\t972\n不一样的美男子2\t973\n绿城桃源小镇\t974\nh100i\t975\n注册咨询师考试\t976\n西瓜节\t977\n预分区\t978\n六语\t979\n8CD\t980\n唐吉柯德\t981\n苏黄止咳胶囊\t982\n九洲集团\t983\n绝地求生测试服\t984\na1502\t985\n重庆省\t986\n竹竿舞\t987\n中兴华为\t988\n精神分析学\t989\n九五之尊\t990\n第五十九\t991\n专业技\t992\n苹果ID\t993\n九轩\t994\n网带式\t995\n反担保\t996\n福州大学阳光学院\t997\n4.1_\t998\n绵羊\t999\n陈宇飞\t1000\n莫西沙\t1001\n福州\t1002\nallergy\t1003\nac1900p\t1004\n大挂\t1005\n辣条\t1006\n缩醛\t1007\n弄死\t1008\n5平方米\t1009\n卫所\t1010\n相位谱\t1011\n赛特奥特莱斯\t1012\n哥谭第四季\t1013\n淄博新闻网\t1014\n58同城招聘网\t1015\n高颜\t1016\nus\t1017\n俄军\t1018\n熊哥\t1019\n0.1克\t1020\n河南工业贸易职业学院\t1021\n5000套\t1022\n李思捷\t1023\nXF\t1024\n企业社会责任中国网\t1025\nSubstitution\t1026\nAloe\t1027\nCPM\t1028\n支取\t1029\nPOLO衫\t1030\nm128fn\t1031\nlars\t1032\nintellij14\t1033\n一封\t1034\n20k\t1035\n坑人\t1036\n根音\t1037\nMountains\t1038\n卓易彩票\t1039\n塘\t1040\naddressing\t1041\n有机肥厂\t1042\ngtx1070ti\t1043\n分享器\t1044\n桩基静载试验\t1045\n省立\t1046\n38平米\t1047\n兖州区\t1048\npiliang\t1049\n浑厚\t1050\n金智媛\t1051\n小包\t1052\n施工证\t1053\n仙花\t1054\n400年前\t1055\n幺鸡\t1056\n绿色产业集聚区\t1057\n预告片\t1058\n样貌\t1059\n认准\t1060\n聚会的目的\t1061\n何云伟\t1062\nreald\t1063\n新达达\t1064\n外轮\t1065\nPASSWORD\t1066\n玄派网\t1067\n九顶\t1068\n孝南\t1069\n海楼\t1070\n5.99寸\t1071\n9月8日\t1072\n宠上天\t1073\n福州移动\t1074\n七师\t1075\n粤科\t1076\nmac安装盘\t1077\n500GB\t1078\nSwitchySharp\t1079\n察举制\t1080\nx200\t1081\n九八年\t1082\nGrubby\t1083\nlammps\t1084\n方盛\t1085\n软枣\t1086\n天下第一\t1087\n首次公开募股\t1088\n盐酸多奈哌齐片\t1089\n92.1\t1090\n胜博\t1091\n41005\t1092\n6699\t1093\n博友\t1094\
n无数个\t1095\nlp仿传奇单机版\t1096\n樱心美\t1097\n高桥村\t1098\n漫谈税\t1099\n加上单\t1100\n导零\t1101\n开思\t1102\n过大\t1103\n云英\t1104\n阳光少年乐园\t1105\n箭袋\t1106\n格林函数\t1107\nccm\t1108\n宕\t1109\n明日花\t1110\nAKKO\t1111\n拉尼娜\t1112\n必需\t1113\n396\t1114\n达标\t1115\n劳工\t1116\n减震\t1117\n点金胜\t1118\neddy\t1119\n富达\t1120\n东风航天城\t1121\nColdplay\t1122\n搬进\t1123\n赛亚\t1124\n配售\t1125\n留德\t1126\nN3N4N5\t1127\n神奇蜘蛛侠2\t1128\n赤壁路小学\t1129\nwampserver\t1130\n硅酸盐学报\t1131\n妃笑\t1132\n奥比中光\t1133\n400多家\t1134\namended\t1135\n事实劳动关系\t1136\n被囚者\t1137\n宣传照\t1138\n3046\t1139\n神谱\t1140\n三星W2014\t1141\n普明\t1142\n格拉纳达\t1143\n周延英\t1144\nrefspec\t1145\n分子生物学\t1146\n戏法\t1147\n亚视\t1148\nsuperstar\t1149\n法甲\t1150\n航位\t1151\n海格力斯\t1152\n第19卷\t1153\n黄道周\t1154\n人大常委会\t1155\n有脸\t1156\ndrawn\t1157\nZoho\t1158\n朱砂根\t1159\n龙之崛起\t1160\n一个星期\t1161\n免流软件\t1162\n鲁宾逊漂流记\t1163\n美家\t1164\n锻造\t1165\n南坪万达广场\t1166\nCROM\t1167\n王赫\t1168\n文明镇\t1169\n尊皇\t1170\n老车\t1171\n嗅\t1172\n全主\t1173\n奇瑞途胜\t1174\n性心理\t1175\n磷酸氢二铵\t1176\n名媛\t1177\n马科维茨\t1178\n剑网三奇遇\t1179\nscuba\t1180\n是非极性\t1181\n威布尔\t1182\n花灯\t1183\n小叔叔\t1184\n耐听\t1185\nTONG\t1186\n东海路\t1187\nB85\t1188\n奥田咲\t1189\n20170409\t1190\n接式\t1191\nMastercard\t1192\nNSNotificationCenter\t1193\n000美元\t1194\n三个年\t1195\n徐家汇\t1196\n重钢\t1197\n2018-01-25\t1198\n侧屏\t1199\n工作簿\t1200\n啦_\t1201\n品牌招商网\t1202\n灵剑\t1203\nResharper\t1204\n2060\t1205\n红米\t1206\npath=\t1207\n中山市市\t1208\n中央音乐学院附中\t1209\n工作委员会\t1210\nflashpaper\t1211\n刘学\t1212\n米高梅\t1213\n砷化镓\t1214\n33万元\t1215\n中国民航信息网络股份有限公司\t1216\n一道桥\t1217\n打回\t1218\n北海舰队\t1219\n请示\t1220\n野事\t1221\nLuigi\t1222\nSite\t1223\n纽伦堡\t1224\ntrm\t1225\nCASTLE\t1226\n如此美好\t1227\n石榴苗\t1228\n浅论天下网\t1229\n圆珠笔\t1230\n读书天\t1231\n马廷强\t1232\n液质\t1233\n李欣蕊\t1234\n粉碎\t1235\n223.104.176.117\t1236\nPorn\t1237\n水元素\t1238\n桃乃木\t1239\n百个\t1240\n洛丽全面战争\t1241\n德沃金\t1242\n豆奶粉\t1243\n绿茶叶\t1244\n维生素k\t1245\n竹椅\t1246\n红衣女\t1247\n食欲\t1248\n航天动力\t1249\n凌天乱清\t1250\n大姨\t1251\n中场\t1252\n马伊琍\t1253\nexamcoo\t1254\n收缴率\t1255\njsp+servlet\t1256\n牙病\t1257\nMV在线观\t1258\nAOB\t1259\n边塞\t1260\n梦幻启示录\t1261\n大连工业大
学\t1262\n福瑞迪\t1263\n开不了\t1264\n三忆\t1265\n酱紫\t1266\n小雷\t1267\napplocale\t1268\n秀峰\t1269\n税前扣除\t1270\n门诊部\t1271\n视网膜色素变性\t1272\n知假买假\t1273\nJni\t1274\nPVC管\t1275\ncollections\t1276\n拘泥\t1277\n毁灭之路\t1278\n真空采血管\t1279\n安极网\t1280\n鸿茅药酒吧\t1281\n亚萨\t1282\n姚安\t1283\n盛田昭夫\t1284\n查房\t1285\n铭万\t1286\nsupplicant\t1287\n毫摩尔\t1288\n链霉素\t1289\n高文\t1290\n见惯\t1291\n128万\t1292\n拳皇2002um\t1293\ncomicstudio\t1294\n20170426\t1295\n跑狗网\t1296\n海明码\t1297\nexp函数\t1298\nsavefig\t1299\n遨博\t1300\n徐荣祥\t1301\n色原\t1302\n小说版\t1303\n正奥网\t1304\n云吞\t1305\n大池\t1306\n贵州省林业厅\t1307\nbogus\t1308\nV4.7\t1309\n河南省社会保障局\t1310\n160m\t1311\nashes\t1312\n趴下\t1313\n小米圈\t1314\n吸附\t1315\n20170201\t1316\n百度网盘+迅雷+旋风\t1317\nyongfa\t1318\n枕戈待旦\t1319\nmessaging\t1320\n乐舞\t1321\nbookshelf\t1322\n王雷\t1323\n张学政\t1324\nbord\t1325\n大一轮\t1326\n3E\t1327\n千月\t1328\n红专路\t1329\n林江国\t1330\njiangsu\t1331\n妖姬与艳妓\t1332\n支付宝钱\t1333\n卓邦\t1334\n芝麻酱\t1335\n张章\t1336\ninsyde\t1337\n500公里\t1338\n档\t1339\n赦\t1340\n太阳的后裔\t1341\n非静态\t1342\n中南财大\t1343\n热博\t1344\n大秦帝国之裂变\t1345\n全屋向阳\t1346\nwin10照片查看器\t1347\nFlink\t1348\n10辆\t1349\n16周年\t1350\n金美辛\t1351\n武当山景区\t1352\n珠山镇\t1353\n一英镑\t1354\ntape\t1355\n明医\t1356\nForeign\t1357\n温姓\t1358\nChampaign\t1359\n标致408论坛\t1360\n爱猪网\t1361\n整数\t1362\nNorthern\t1363\n1280高清国英双语\t1364\nDatacolor\t1365\n江恩\t1366\n酸牛奶\t1367\n8型\t1368\n断点调试\t1369\n共有产权房\t1370\n600760\t1371\n幼升小_\t1372\n洋口镇\t1373\n我的发现\t1374\n通气会\t1375\n孤胆特工\t1376\n京东商城\t1377\n葫芦岛新闻网\t1378\n51_\t1379\n超雪卡贴\t1380\nroots\t1381\n改变自己\t1382\n比赛服\t1383\n无量光明佛教网\t1384\n灰枣\t1385\n风剑\t1386\n维也纳国家歌剧院\t1387\n眉县人民政府\t1388\nhna\t1389\nTb\t1390\n\\65279\t1391\n反腐倡\t1392\n战宝\t1393\n品析\t1394\npqmagic\t1395\nforeach\t1396\n内角\t1397\nSocketIO\t1398\n品\t1399\n残片\t1400\n图表类\t1401\nMacBook\t1402\n史雷\t1403\n蹿\t1404\n优速达\t1405\n邮沪\t1406\nsparc\t1407\n兴海\t1408\n乌米\t1409\n丹桂苑\t1410\n太公\t1411\n赵建平\t1412\nMQTT协议\t1413\nholidays\t1414\n以太网控制器\t1415\n施秉县\t1416\ng2000\t1417\n字符串首\t1418\n地瓜粉\t1419\n猎刀\t1420\n福田雷沃\t1421\n卡罗琳\t1422\n18.8\t1423\n壹分\t1424\n神漫画\t1425\n何堪\t1426\n仕舞妻\
t1427\n处分\t1428\n复飞\t1429\n马友友\t1430\n6.6.5\t1431\n炎术士\t1432\n塘头村\t1433\n尿点\t1434\n平安养老保险股份有限公司\t1435\nConnect\t1436\n庚烷\t1437\nWAIT\t1438\n财付通安全控件\t1439\nduring\t1440\n最简整数比\t1441\n外研社杯\t1442\n熊猫金银纪念币\t1443\n下去除\t1444\n江泽民传\t1445\n清艳\t1446\n操作系\t1447\n献金\t1448\n新农村商报网\t1449\n黄河勘测规划设计有限公司\t1450\nranger\t1451\n宽心\t1452\nsolas\t1453\n尼康af\t1454\nそら\t1455\nSVIP\t1456\n贵阳医学院\t1457\n第三批\t1458\n天津市五区县\t1459\n华融证券\t1460\n第33章\t1461\n9.3分\t1462\ncod8\t1463\n谭卓\t1464\n饱和蒸汽温度\t1465\n分液器\t1466\n赞微\t1467\n江苏电视台\t1468\n邻面\t1469\n100册\t1470\n964\t1471\n天津市市容和园林管理委员会\t1472\n外展\t1473\n力排众议\t1474\n280天\t1475\nAuckland\t1476\n扎啤\t1477\n斟酒\t1478\n60道\t1479\n鼓刹\t1480\n转瞬即逝\t1481\n无银\t1482\n衡子\t1483\nmoo\t1484\n不唱\t1485\n激流勇\t1486\n山西黄河新闻网\t1487\n特写图\t1488\n吉他谱_\t1489\n失水\t1490\nSuperfast\t1491\n大学科技园\t1492\n梨璐\t1493\nm4a1\t1494\n4812\t1495\nDarwin\t1496\n黑煞\t1497\n大恶人\t1498\n新生活方式\t1499\n麦卡锡\t1500\nprogrammers\t1501\n预算定额\t1502\n第42话\t1503\n有机氯\t1504\n72GFateGrandOrder\t1505\n古史\t1506\n乌江镇\t1507\n兵工\t1508\nCombinations\t1509\nLogical\t1510\n盛希泰\t1511\n知足\t1512\n子然\t1513\n训放\t1514\n125\t1515\n褙子\t1516\n001号\t1517\n庐江中学\t1518\n拾级而上\t1519\n孙子\t1520\n长安万达广场\t1521\n钢桁梁\t1522\n转印机\t1523\n联合站\t1524\nSSP\t1525\ncoco2dx\t1526\nquarantine\t1527\n富通天下\t1528\n争霸赛\t1529\n下一站别离\t1530\n雷克萨斯ES250\t1531\n户政\t1532\n青红\t1533\n种子的梦\t1534\n真柏\t1535\n氧化汞\t1536\n10993\t1537\n歷史\t1538\n刘思涵\t1539\n生源地助学贷款\t1540\nMisa\t1541\n宏杰\t1542\n2018年三月三\t1543\n里奇\t1544\n肉刀\t1545\n谷歌百度\t1546\n上苑\t1547\n修建\t1548\n对外担保\t1549\n省档案局\t1550\nxxg\t1551\n防尘帽\t1552\n成都轨道交通集团有限公司\t1553\n贴标\t1554\n开物\t1555\n大西南\t1556\n杭州绿城\t1557\nunlike\t1558\nnona\t1559\n天津美术馆\t1560\n深瞳\t1561\n不死\t1562\n欧西\t1563\n海棠社区\t1564\n中税协\t1565\n国际能源署\t1566\n保护好\t1567\n零序电流互感器\t1568\n副研究员\t1569\n初链\t1570\n开盘\t1571\n自定\t1572\n闪投\t1573\n不悦\t1574\n叶盘\t1575\n泰迪杯\t1576\nEnhancer\t1577\n导墙\t1578\n裤兜\t1579\n陆运\t1580\nWIN10\t1581\n江阴市人民法院\t1582\n镇宁\t1583\ntrough\t1584\n60型\t1585\n熏蒸机\t1586\n益智类\t1587\nIcons\t1588\n北京银泰中心\t1589\n人力资源管理师二级考试\t1590\n玫瑰花海\t1591\nfm2012吧
_\t1592\n爵士鼓\t1593\n从零开始的异世界生活吧\t1594\n北京海\t1595\n水果老虎机\t1596\nmetaq\t1597\nM228b\t1598\n马曳\t1599\nwebapck\t1600\n12台\t1601\n真漂亮\t1602\n0.78\t1603\n舒洁\t1604\n地图龟\t1605\n直通式\t1606\n至简\t1607\n瓦格斯\t1608\nELECTRONIC\t1609\n嘟\t1610\n发梢\t1611\n六臂\t1612\n东风悦达\t1613\n南阳新闻网\t1614\n爱普\t1615\nppt_筑龙网\t1616\n半个\t1617\n怪猎ol\t1618\n散沙\t1619\n15分钟\t1620\n中华人民共和国商业银行法\t1621\n返税\t1622\n一野\t1623\n围歼\t1624\nip加速器\t1625\n翻覆\t1626\nx3250\t1627\n夸克浏览器\t1628\n12c\t1629\n赔罪\t1630\nGOPATH\t1631\nXi\t1632\nFigaro\t1633\n二郎三郎\t1634\n劝服\t1635\n白凤\t1636\n法定程序\t1637\nTWD\t1638\n传奇类\t1639\n张可久\t1640\n北京大学物理学院\t1641\n木头\t1642\n_尼\t1643\n漆笔\t1644\nMDRT\t1645\nconclusion\t1646\n赌霸\t1647\n姜老师\t1648\n低层次\t1649\n一环路北\t1650\nflash格式\t1651\n学派网\t1652\n改造\t1653\n钩稽\t1654\nTFD\t1655\n涂上\t1656\n本诗\t1657\n大头网\t1658\n氢离子\t1659\n麦麦提敏\t1660\n排污许可证管理暂行规定\t1661\n第三个人\t1662\n中国设计网\t1663\n五年级数学下册期中测试卷\t1664\n成本核算\t1665\n债券利率\t1666\nV4.1.1\t1667\n桶珠\t1668\n申洲国际\t1669\nMongoDB数据库\t1670\n游侠客旅游网\t1671\ndnsmasq\t1672\n那端\t1673\n胡辛束\t1674\n日妹妹\t1675\n真光中学\t1676\n经济体\t1677\nCER\t1678\nmingw64\t1679\n日语等级考试\t1680\n5斤\t1681\n祈\t1682\n五马路\t1683\n管子钳\t1684\n海冰\t1685\n210000\t1686\n斗鱼鱼乐\t1687\nbois\t1688\nbeibei\t1689\n一休一\t1690\n竹屋\t1691\n四川成都新东方烹饪学校\t1692\n算命吧\t1693\nhmset\t1694\n12月11日\t1695\n升达\t1696\n梁箍筋\t1697\n木垒县\t1698\n健龙\t1699\n401号\t1700\n气话\t1701\n高等学\t1702\n安卓6.0\t1703\n拆卸式\t1704\n暴裂无声\t1705\n创伤性关节炎\t1706\n芋泥\t1707\n长安汽车股份有限公司\t1708\n服玩\t1709\nTASAKI\t1710\ncarver\t1711\n王春华\t1712\n小团\t1713\n终结版\t1714\n误码\t1715\nIFC\t1716\n严浩\t1717\n雪铁\t1718\n北京中医药大学远程教育学院\t1719\n55位\t1720\n京瓷2010\t1721\nHunan\t1722\n外部式\t1723\nШ\t1724\n八云\t1725\n德克士\t1726\n曼比季\t1727\n生化危机\t1728\n男高中生\t1729\n李夫人\t1730\n喜气洋洋猪八戒\t1731\n软组织\t1732\n丨史\t1733\n无如\t1734\n昌硕\t1735\nFRONT\t1736\nTrilogy\t1737\n飞儿\t1738\n屠宰\t1739\n信管网\t1740\n梁晓慧\t1741\n溢出\t1742\n澳门机场\t1743\n共面\t1744\nSB\t1745\n绘影\t1746\n和平小区\t1747\n湖滨南路\t1748\n新疆联通\t1749\n鬼新娘\t1750\n林子聪\t1751\n兵位\t1752\n包柱\t1753\nschooldays\t1754\n走秀\t1755\n幻读\t1756\n进园\t1757\nMathworks\t1758\n搜客\t1759\nUNet
bootin\t1760\n2019年底\t1761\n昭阳公园\t1762\n脱水性\t1763\n时间差函数\t1764\n绞杀\t1765\n诺维信\t1766\n第18天\t1767\n唐曾\t1768\n掌灯\t1769\nMiyako\t1770\n相迎\t1771\n特需门诊\t1772\nExcel学习网\t1773\n焦耳定律\t1774\n约瑟传说\t1775\n可敌\t1776\n移窗\t1777\n布克赛尔\t1778\n空山\t1779\n农业用水\t1780\n井草圣二\t1781\nCATCH\t1782\n三餐\t1783\n招鬼\t1784\n移表\t1785\n海淘|\t1786\n龙鱼\t1787\n品园\t1788\n專門\t1789\n六线\t1790\n收获机\t1791\n桐油\t1792\n美债\t1793\n顺风大酒店\t1794\n36P\t1795\n0.7元\t1796\nWeak\t1797\n上海爱福窝云技术有限公司\t1798\n根宝\t1799\n名车志Daily\t1800\nlicenses\t1801\n三十而立\t1802\n星武神诀\t1803\n特等座\t1804\n生气勃勃\t1805\n凌绝顶\t1806\n水宝\t1807\ncoil\t1808\nTelevision\t1809\n库存车\t1810\n斗鱼公会\t1811\n小榄车站\t1812\n深圳地铁5号线\t1813\n生命之歌\t1814\n金胶\t1815\n东直门外大街\t1816\n自然灾害\t1817\n家庭联产承包责任制\t1818\n山西路\t1819\n真人快打x\t1820\nRanks\t1821\n笑侃\t1822\n聪明绝顶\t1823\nwalter\t1824\n繁荣昌盛\t1825\nDP接口\t1826\n锄头\t1827\n孝昌县\t1828\n梦洁家纺\t1829\n124分钟\t1830\n流拍\t1831\n星时代\t1832\nX-T10\t1833\n工程规划许可证\t1834\n马俑\t1835\n魔穗\t1836\n迪瑞\t1837\n盐和避难所\t1838\n比熊犬\t1839\ntxt批量\t1840\n公益路\t1841\nlsit\t1842\n国寿嘉园\t1843\n倒台\t1844\nrpcs3\t1845\nSparse\t1846\nGML\t1847\n亿忆网\t1848\n浙商网\t1849\n诉讼双雄\t1850\nsuqqu\t1851\nSHA256\t1852\nxiaoxin\t1853\nwinedit\t1854\n停经\t1855\nGabrielle\t1856\n华洲城\t1857\n茶碱缓释片\t1858\n固精\t1859\n中国铁道科学研究院\t1860\n天津儿童医院\t1861\nlauren\t1862\n北京市委党校\t1863\n木制\t1864\n特鲁瓦\t1865\n好胜\t1866\n朝小诚\t1867\n康帕斯\t1868\n接物\t1869\nTutorials\t1870\n马家庄\t1871\n两个半\t1872\n中国港口协会\t1873\n印小天\t1874\n柯拉松\t1875\n谢利\t1876\n制动蹄\t1877\n芒康\t1878\n外资股比\t1879\n1705839\t1880\n异化\t1881\n光动\t1882\n瓶级\t1883\n古龙群侠传2\t1884\n封建\t1885\nxiangji\t1886\nmpeg4\t1887\ndeve\t1888\n药茶\t1889\n宙\t1890\nes文件浏览器\t1891\n市人力资源和社会保障局\t1892\n中国保险资产管理业协会\t1893\n炫色\t1894\n阆\t1895\n涤纶短纤\t1896\n襄阳房产网\t1897\n长跪\t1898\n大地幼儿园\t1899\n魔都\t1900\n尖扎县\t1901\nhg\t1902\nkan300\t1903\n0743\t1904\n客座率\t1905\nxjb\t1906\n3类\t1907\n畅畅\t1908\n移项\t1909\n降魔\t1910\n汉堡纸\t1911\n汉阴\t1912\nHDRip\t1913\n星井\t1914\n十口\t1915\n日夜撸影院\t1916\n王文京\t1917\n遭冷落\t1918\n葛根汤\t1919\n时崎狂三\t1920\n肉队\t1921\nMSDE2000\t1922\nUSNEWS\t1923\n乙酉\t1924\n贵州省纪委监察厅\t1925\n百事公司\t1926\n畸形
儿\t1927\n54度\t1928\n青要山\t1929\n东岛\t1930\n骨骺\t1931\n忘尘谷\t1932\n葱\t1933\nnode-sass\t1934\n易胜\t1935\n澳门大三巴\t1936\nFeel\t1937\n高校竞价网\t1938\n饭爷\t1939\n接骨\t1940\n飒飒\t1941\n大于号\t1942\n送情郎\t1943\n比才\t1944\n更可怕\t1945\n假面骑士EX-AID\t1946\n野采\t1947\n粮仓\t1948\n银翼杀手\t1949\n通话录音\t1950\n扫路\t1951\nInMotion\t1952\n水处理药剂\t1953\n郭永航\t1954\n腺体\t1955\n三墩北\t1956\n2em\t1957\n更衣\t1958\n卡斯帕罗夫\t1959\n范斯\t1960\n厦门银行\t1961\nwikipedia\t1962\nSOUND\t1963\n社区居民委员会\t1964\nSCE\t1965\n枪柄\t1966\n气垫cc\t1967\n莎木3\t1968\nEDK\t1969\n维c\t1970\nListControl\t1971\nD600\t1972\n巧家县\t1973\nKarma\t1974\n不优\t1975\n长江水\t1976\n陈莹\t1977\n脉冲函数\t1978\n海鑫科金\t1979\n毛遂自荐\t1980\n接上\t1981\n筒子骨\t1982\n新井梓\t1983\n房产小区\t1984\n网签合同\t1985\n包威尔\t1986\n植物大战僵尸吧_\t1987\n李路平\t1988\n1200\t1989\n2016年以后\t1990\nfinalize\t1991\n比特网\t1992\n蔡思贝\t1993\nWoo\t1994\n婚牢\t1995\n康诺\t1996\nPub\t1997\n一六年\t1998\n姓氏万变\t1999\n没早\t2000\n西湖银泰\t2001\n倒钱\t2002\n韵语\t2003\nTensile\t2004\n魅蓝metal吧\t2005\n张驰\t2006\n选育\t2007\n库尔\t2008\n逆心\t2009\nfgo2\t2010\norgasm\t2011\n浅洛洳雪\t2012\n芽接\t2013\n右派\t2014\n19.com\t2015\n格式工厂\t2016\n截胡\t2017\n粉管\t2018\nゞ\t2019\n真味\t2020\n60多家\t2021\n广东省质量技术监督局\t2022\n田小娥\t2023\n直边\t2024\nSQL触发器\t2025\n常备军\t2026\n不收\t2027\n甲壳修罗武神\t2028\nyqrc\t2029\n雪花牛肉\t2030\n承插型\t2031\n桃符\t2032\nCL00Y\t2033\n歌辞\t2034\n潮范儿\t2035\n无出\t2036\nkirkland\t2037\n对嘴\t2038\n精神焦虑症\t2039\nDNF地下城与\t2040\n结构域\t2041\n纪纲\t2042\n永城职业学院\t2043\ndefiner\t2044\n协理\t2045\n科技日报社\t2046\n弄开\t2047\n红山果\t2048\n任伯儒\t2049\n350集\t2050\n十二平均律\t2051\nexperimental\t2052\n接圆\t2053\n洁茹\t2054\n伟博\t2055\n北海艺术设计学院\t2056\n洛杉矶\t2057\n抗命\t2058\n变速杆\t2059\nChrysler\t2060\n铬铁矿\t2061\n巡视员\t2062\nYY4138高清影院-殇情影院\t2063\n利率表\t2064\n一路向前\t2065\n126家\t2066\n喝\t2067\n2018年度\t2068\n傲来\t2069\n鸡排饭\t2070\n心跳声\t2071\n映美针式打印机\t2072\n神界\t2073\n鱼线\t2074\n水瓶座男\t2075\n鸡鸣\t2076\n92个\t2077\n汽油滤清器\t2078\ndangling\t2079\n胎脂\t2080\nsigned\t2081\n大酒店\t2082\n神女\t2083\n叫板\t2084\n乌龙院\t2085\nconditioning\t2086\nfadeOut\t2087\n5dx\t2088\n十几款\t2089\n吉林石化\t2090\n帝景苑\t2091\n江阴市人民政府\t2092\n中华视听网\t2093\n五种\t2094\npeoples
oft\t2095\n4.5万元\t2096\n燕之屋\t2097\n山亭政府网\t2098\n湖北三峡\t2099\n1.0.0.1\t2100\nIC50\t2101\npowered\t2102\n2.4L\t2103\n纂修\t2104\n蓝风\t2105\nAPH\t2106\n吊环\t2107\n麦拉片\t2108\n华福\t2109\n改派\t2110\n娱乐机\t2111\n万世\t2112\n两束\t2113\n蓝领\t2114\n中国人民大学外国语学院\t2115\ndungeons\t2116\n崇左市\t2117\n28级\t2118\n阜阳\t2119\namoled\t2120\n国话\t2121\n宫体\t2122\n20170325\t2123\n巴塔\t2124\n条码打印软件\t2125\n河南省质量技术监督局\t2126\n30a\t2127\ndem\t2128\n老婆饼\t2129\n求书网\t2130\n清江镇\t2131\n尐家\t2132\n何干\t2133\nSceneSource\t2134\n侦查员\t2135\n中央团校\t2136\nM126nw\t2137\n少城\t2138\n器灵2\t2139\n李莹\t2140\n市粮食局\t2141\n哚哚\t2142\nMiStar\t2143\n心静\t2144\n多难\t2145\nWDS\t2146\n急报\t2147\n五笔\t2148\n林原惠美\t2149\n合众新能源\t2150\n梦幻西游方寸山\t2151\n人间王\t2152\n交友群\t2153\n灭鼠器\t2154\n马志强\t2155\n175元\t2156\nIWC\t2157\n广州智光电气股份有限公司\t2158\nJRS\t2159\n外经\t2160\n50070\t2161\n炅\t2162\n环氧化\t2163\n网络版\t2164\n水冷板\t2165\n外婆\t2166\n图享网\t2167\n唯德\t2168\n水象\t2169\n中行手机银行\t2170\nliuyueyue\t2171\n卡尔佩恩\t2172\n猛蚁\t2173\n湖南师范大学树达学院\t2174\nSQLMAP\t2175\n中源\t2176\n死或生:沙滩排球3\t2177\n酸葡萄\t2178\n讨巧\t2179\n错判\t2180\n单程证\t2181\nppt投影\t2182\n洪楼\t2183\n广东教育信息网\t2184\n腾讯分分彩计划\t2185\nJAX-WS\t2186\n坡头\t2187\n开朗\t2188\nPE\t2189\n陈嘉映\t2190\n田鸡\t2191\n生馆\t2192\nbaf\t2193\n温岭政府网\t2194\n10万份\t2195\n洪湖赤卫队\t2196\n春牛\t2197\n金凌\t2198\nethylene\t2199\n小鸡\t2200\n形变\t2201\nimproves\t2202\n7k7k小游戏\t2203\n调剂生\t2204\n准信\t2205\n菅野\t2206\n废狗\t2207\n大饱眼福\t2208\n外国文学\t2209\n外校\t2210\n公用版\t2211\n氧化钾\t2212\n红树莓\t2213\n爱都\t2214\nRTX\t2215\n2015年5月1日\t2216\ndhtml\t2217\n56万元\t2218\nslimfast\t2219\n上海市人民政府新闻办公室\t2220\n宝龙城\t2221\nznode\t2222\n中控考勤机\t2223\n杨喆\t2224\n镇西\t2225\n明净\t2226\n前碟\t2227\n淘股吧\t2228\n165厘米\t2229\n一次次\t2230\n龙门铣床\t2231\n升力系数\t2232\n昏昏欲睡\t2233\n5026\t2234\n第5届\t2235\n苏尼特\t2236\nSCN\t2237\n摄像仪\t2238\n第13位\t2239\nad滴剂\t2240\nrepair\t2241\n淮安新闻网\t2242\n250万元\t2243\n星夜\t2244\n非遗\t2245\n音乐书\t2246\n三轮吸粪车\t2247\n脲醛\t2248\n砌生\t2249\nspell\t2250\n御龙在天页游\t2251\n逃学威龙\t2252\n磁生电\t2253\n软玉溪\t2254\n乙丑日\t2255\n手机\t2256\nqwidget\t2257\n名号\t2258\n中铁二十四局\t2259\nstrlen\t2260\n小衣\t2261\n徐公子\t2262\n五口\t2263
\n_17素材网\t2264\nfanya\t2265\n参花\t2266\n肥皂\t2267\n350万\t2268\n国产动画片\t2269\n雁峰\t2270\nInformatics\t2271\n巨魔王\t2272\nimp\t2273\n怀远镇\t2274\n聚合函数\t2275\n盛淮南\t2276\n陈瑛\t2277\n罪魁祸首\t2278\n韩家英\t2279\n2mg\t2280\n谢塘镇\t2281\nmorgan\t2282\nTushy\t2283\n万1\t2284\nJian\t2285\n解答\t2286\n幸福一家人\t2287\n平昌冬奥\t2288\nFPC连接器\t2289\n陆军军医大学西南医院\t2290\nToy\t2291\nAssurance\t2292\n11米\t2293\n凡凡\t2294\n产片\t2295\n红会医院\t2296\n广东省公路管理局\t2297\n土木系\t2298\n谭笑\t2299\n106岁\t2300\n相棒\t2301\n随机整数\t2302\nvol.3\t2303\n4s店_列表网\t2304\n高淳县\t2305\nArizona\t2306\n1914\t2307\nDear\t2308\n14.3%\t2309\n三十种\t2310\n剥线钳\t2311\n欧足联\t2312\nDVR\t2313\n碘缺乏病\t2314\n长濑智\t2315\nseperate\t2316\n河山镇\t2317\nransac\t2318\n原封不动\t2319\n刻蚀\t2320\n妖族\t2321\n梦工坊\t2322\n长城电子\t2323\ndecode函数\t2324\ngives\t2325\n材质包\t2326\n赵州\t2327\n2018年3月19日\t2328\n桩号\t2329\n厦门华天涉外职业技术学院\t2330\n德军\t2331\n熊酷\t2332\n谈判者\t2333\nsituations\t2334\n方便面\t2335\nxcassets\t2336\n合肥市\t2337\n三星堆博物馆\t2338\n停用\t2339\n邮差包\t2340\n介\t2341\n徐海峰\t2342\n高晓牛\t2343\n疏肝\t2344\n迪卡凯恩\t2345\n就业观\t2346\ncon\t2347\n银龙股份\t2348\nProstate\t2349\nHAIR\t2350\n康乐村\t2351\n东方福利网\t2352\n成诗\t2353\napdl\t2354\n聘用制\t2355\ncDNA\t2356\n牛腿柱\t2357\n信息科学\t2358\n同程艺龙\t2359\n藏历新年\t2360\nSTREET\t2361\n觑\t2362\n王小姐\t2363\nCAD制图软件\t2364\n熏染\t2365\n友发\t2366\nmks\t2367\n柚安米\t2368\n草猫\t2369\nknitting\t2370\n轩辕剑\t2371\n互助盘\t2372\nOnlyTease\t2373\n维维豆奶\t2374\n我的歌声\t2375\nChampionships\t2376\n详述\t2377\nFine乐团\t2378\n十字\t2379\n中超\t2380\nEmbassy\t2381\n幻光戟\t2382\n上一层楼\t2383\n博丽灵梦\t2384\n控制系\t2385\n5.2\t2386\n图像块\t2387\n老少恋\t2388\n胡晓炼\t2389\n产权房\t2390\n适应证\t2391\n循环水真空泵\t2392\n谍影重重2\t2393\n木棒\t2394\n雷神岛\t2395\n去化率\t2396\n尼康D700\t2397\n皮肤性病科\t2398\n污区\t2399\n光固化\t2400\n风扇叶\t2401\neventlet\t2402\n靠手\t2403\n吓破胆\t2404\n自战\t2405\n代扣\t2406\n究\t2407\n完美陌生人\t2408\n洛城\t2409\nupdate5\t2410\n中性粒\t2411\n列表页\t2412\n杨越\t2413\n拳皇98终极之战OL\t2414\n1_\t2415\n家庭片\t2416\n真奇妙\t2417\n品尝\t2418\n新洲\t2419\n装裱\t2420\n尹雪艳\t2421\nKickers\t2422\n三狮\t2423\n建筑工程管理专业\t2424\n犬业\t2425\nMold\t2426\n争气\t2427\n阿铭\t2428\n孜然牛肉\t2429\n集奥\t2430\n租
花\t2431\n斩断\t2432\n慢性肝炎\t2433\n一对二\t2434\n泰国斗鱼\t2435\n阿斯特拉\t2436\njsdoc\t2437\n2.8g\t2438\n五幅\t2439\n停职\t2440\nm203dw\t2441\n上海电气集团股份有限公司\t2442\n平开窗\t2443\nEndNote\t2444\n版本号\t2445\n薏米仁\t2446\n张笑天\t2447\nGetting\t2448\n真光路\t2449\nbe句型\t2450\n大学子\t2451\nImaging\t2452\n利息税\t2453\n浙江省瑞安中学\t2454\n糠酸莫米松鼻喷雾剂\t2455\n马兰花\t2456\n紧逼\t2457\n烟雾报警器\t2458\n哈妮\t2459\n石竹\t2460\n张掖市政府网\t2461\n纷乱\t2462\n107G\t2463\n星外\t2464\n等\t2465\n疙疙瘩瘩\t2466\nNumLock\t2467\n十来年\t2468\nflyway\t2469\nn56\t2470\n张玉梅\t2471\n向荣\t2472\n侧弯\t2473\n例行\t2474\n毅力\t2475\n415国家安全教育日\t2476\nDAC\t2477\n寻\t2478\n旅游报\t2479\ntitles\t2480\ntdms\t2481\npolymers\t2482\ngoogl\t2483\n先行先试\t2484\n疯帽子\t2485\n胆识\t2486\nswingers\t2487\nDND\t2488\n神龙\t2489\n漏勺\t2490\n当当当当当当当\t2491\n15kw\t2492\n玖恩\t2493\n白热化\t2494\n肇俊哲\t2495\nKVA\t2496\n县人民检察院\t2497\n吐司面包\t2498\n颈动脉硬化\t2499\nxxoogif\t2500\n富怡\t2501\n按健\t2502\n注册咨询工程师考试\t2503\nAus\t2504\nphantomjs\t2505\n热力环流\t2506\n内蒙古质监局\t2507\n嫩豆腐\t2508\n独享\t2509\n热血警探\t2510\n萧熏儿\t2511\n更适宜\t2512\n夏杰\t2513\n服侍\t2514\n狂蟒之灾\t2515\n5升\t2516\n凤凰卫视\t2517\n宜安\t2518\n盗走\t2519\n皇城国际\t2520\n安德利\t2521\n126a\t2522\n苦瓜汤\t2523\n各种\t2524\n一票制\t2525\n糖尿病眼病\t2526\nWindows2008R2\t2527\n特纳\t2528\n给所有知道我名字的人\t2529\n张柏芝\t2530\n亿丰\t2531\n名鱼\t2532\n泰山站\t2533\n热榜\t2534\n邮编娱乐网\t2535\n1微米\t2536\npersonalized\t2537\nmaterialia\t2538\n油耗仪\t2539\n建筑工程技术专业\t2540\n迪马\t2541\n夏威夷竹\t2542\nGXS\t2543\n西欧\t2544\nbkk\t2545\n中国铁建股份有限公司\t2546\naccessed\t2547\n中骏\t2548\n空中飞人\t2549\n上牙\t2550\n美康生物\t2551\n首款\t2552\n汗布\t2553\n中报\t2554\n一起来\t2555\n吉林省图书馆\t2556\n仲村\t2557\n75亿元\t2558\n0109\t2559\n850g\t2560\n2018.04.08\t2561\n芳华路\t2562\n首节\t2563\njhlong\t2564\n高敏\t2565\n9关\t2566\n超过10年\t2567\n至诚\t2568\n假寐\t2569\n380分\t2570\n千愁\t2571\n听来\t2572\n献县\t2573\n不绝\t2574\n专用品\t2575\n战锤:末世鼠疫2\t2576\n中华人民共和国税收征收管理法实施细则\t2577\n别再买\t2578\n快捷快递\t2579\n固生堂\t2580\n20173\t2581\n新丝绸之路\t2582\n轻奢新主义\t2583\nWaikiki\t2584\n哈弗h4\t2585\n芜湖方特东方神画\t2586\n2700X\t2587\n王为念\t2588\n接力棒\t2589\n黑白素描\t2590\n太笨\t2591\n急性鼻窦炎\t2592\n江北国际机场\t2593\n成长率\t2594\n保定火车站\t2595\n
yq601\t2596\n恐怖惊魂夜\t2597\nwin2008_Windows\t2598\n交管123123\t2599\ngloss\t2600\nDAOKO\t2601\n土蜂\t2602\n朗播\t2603\n哏儿\t2604\n扣缴义务人\t2605\n频率\t2606\nGDCA\t2607\nMeteor\t2608\n热病\t2609\n林书杨惠妍\t2610\n整风\t2611\n储架\t2612\n99.90\t2613\n宝马红旗l5\t2614\n捞钱\t2615\n奥适宝\t2616\n近两周\t2617\nA51\t2618\nEXPORT\t2619\n3月26日\t2620\n雷普\t2621\n烤鸡炉\t2622\nAppender\t2623\n妻子们\t2624\n房路\t2625\n胃阴虚\t2626\n第十九\t2627\ndeloitte\t2628\n2017年清明\t2629\n中国水利学会\t2630\n张小青\t2631\n北苑\t2632\n袁登祥\t2633\npapertest\t2634\nBD高清\t2635\n国网浙江省电力公司\t2636\n两朵\t2637\n单级离心泵\t2638\nStylish\t2639\n北京晨报网\t2640\n抗剪强度\t2641\nrx200t\t2642\n每两周\t2643\n幸田来未\t2644\n烦\t2645\n/列\t2646\n往来账\t2647\nzypper\t2648\n黄庆\t2649\n锋芒\t2650\ntablo\t2651\n地热水\t2652\n石家庄机场\t2653\n艳旅\t2654\nno.9\t2655\nrand\t2656\n3DMGAME\t2657\n≡\t2658\n下期期中\t2659\n华远华中心\t2660\n百里夜刀\t2661\n黑龙江省公安交通管理局\t2662\n洋光\t2663\n川贝枇杷露\t2664\nTWS\t2665\nPPT2007\t2666\n百家\t2667\n康敏\t2668\n微整形\t2669\n遗骨\t2670\n死库水\t2671\n图典\t2672\n核蛋白\t2673\n辩析\t2674\nshij\t2675\n34页\t2676\n1080P/720P\t2677\n双屏\t2678\n东延\t2679\n会安\t2680\n客路\t2681\n大派\t2682\n老叶茶馆\t2683\n超程\t2684\n玻纤布\t2685\n熙菱信息\t2686\n圣经\t2687\n吉林省财政厅\t2688\n【多玩天刀攻略团\t2689\n阿姆斯特丹史基浦机场\t2690\n錾刻\t2691\n点军区\t2692\n二聚酸\t2693\n民事代理\t2694\nqq离线\t2695\nB16\t2696\n名侦探柯南普通话\t2697\n遮盖\t2698\n保税仓\t2699\n张志明\t2700\n明快\t2701\n佛兰德斯\t2702\n锡北镇\t2703\n断点处\t2704\n浙江省国资委\t2705\n第002章\t2706\n缺觉\t2707\n武哥\t2708\n和顺古镇\t2709\n卓航\t2710\n无归\t2711\nkerberos认证\t2712\n台湾政府\t2713\n修筑\t2714\n保卫处\t2715\n2015-2017年度\t2716\n人口\t2717\n宋本\t2718\n论文\t2719\nPics\t2720\n中国招生考试网\t2721\ncodestyle\t2722\n重招\t2723\n义务教育课程标准实验教科书\t2724\n顺治\t2725\n恒大滨江\t2726\n宏库\t2727\n千龙湖\t2728\n机械迷城\t2729\n老贼\t2730\n五丈原\t2731\n闻泰科技\t2732\nTrack\t2733\n蚂蚁花\t2734\n市供销社\t2735\n无害\t2736\n屁事\t2737\n北斗导航概念股\t2738\n陈铁军\t2739\n拆胎机\t2740\n酷狗音乐2018\t2741\nstrict\t2742\n九华山路\t2743\n揉摸\t2744\nanalogy\t2745\nECAM\t2746\nsublist\t2747\nnsfw\t2748\n甬派\t2749\n处男\t2750\n灵敏性\t2751\n安徽省国资委\t2752\n激起\t2753\n栈帧\t2754\n凡\t2755\nBT传奇\t2756\n琼山\t2757\n审车\t2758\n伪证罪\t2759\npleasures\t2760\n湖州市\t2761\n拉
格朗日乘数法\t2762\nOpenStack\t2763\n河北师大\t2764\n一万一\t2765\n稳压阀\t2766\n淫宗\t2767\n团体赛\t2768\nJW\t2769\nusf\t2770\noldmonk\t2771\ndao\t2772\n小李飞刀\t2773\n多组分\t2774\n日立建机\t2775\n赵蓉\t2776\n路片\t2777\n沃银\t2778\ngenome\t2779\n三角瓶\t2780\n菜籽\t2781\nQuantity\t2782\ncfw\t2783\n饱餐\t2784\n户外写真机\t2785\n氯化钴\t2786\n多劳多得\t2787\n130W\t2788\n9800万\t2789\n原生花\t2790\n阿文\t2791\n培哚普利片\t2792\n魅影传说\t2793\n致命的邂逅\t2794\n用户体验设计\t2795\n_荔枝网\t2796\n宁波口腔医院\t2797\n电阻屏\t2798\ntiantou\t2799\nstraw\t2800\n樱花通信\t2801\n陈旧性肺结核\t2802\n妇保\t2803\n古特雷斯\t2804\n唐伯虎冲上云霄\t2805\n秒退\t2806\n红山文化\t2807\nsantacruze_丁香通\t2808\n500平\t2809\n傲娇与偏见\t2810\n目标机\t2811\n形象\t2812\n王美华\t2813\n钢桶\t2814\n赵寒阳\t2815\nmoncler\t2816\n优锘科技\t2817\n雾化\t2818\n宋长贵\t2819\n天梯赛\t2820\n水盘\t2821\n位阶\t2822\n段光勋\t2823\n麦克斯韦方程\t2824\nHime\t2825\n沃尔沃s60l\t2826\n模本\t2827\n金会\t2828\n电监\t2829\n面体\t2830\n安防知识网\t2831\n轻巧\t2832\n207所\t2833\n中标\t2834\n几厘米\t2835\n酚酞\t2836\n九连山\t2837\n犬牙\t2838\nnote3吧_\t2839\n古乐\t2840\nscv\t2841\nErgonomic\t2842\nExcel散点图\t2843\n点滴\t2844\n王敏德\t2845\n天麻素注射液\t2846\nSway\t2847\nCPU主频\t2848\n刘亚楼\t2849\n张真\t2850\n内签\t2851\n英雄传说:碧之轨迹\t2852\n第四十章\t2853\n能型\t2854\n眼药膏\t2855\n群晖nas\t2856\nnvi\t2857\n势语\t2858\n机票酒店\t2859\n靶机\t2860\n伊思红参\t2861\n永康方岩\t2862\n魔杰\t2863\nV2.1\t2864\n彭坤\t2865\n银河魔装机神\t2866\n自爱\t2867\nzhm\t2868\n中国汽配网\t2869\n马河\t2870\n蔬菜类\t2871\nqtum\t2872\n神经计算棒\t2873\n大耳朵\t2874\n中国海事服务网\t2875\n演播室\t2876\ncs229\t2877\n札记\t2878\ndgm\t2879\n梁弄\t2880\n44位\t2881\n励步英语\t2882\n高蒙\t2883\nglibc-devel\t2884\nAP1000\t2885\n世界大学排名\t2886\n五场\t2887\nassassin\t2888\n邻苯\t2889\n大田环球\t2890\nMata\t2891\nqss\t2892\n精惟一\t2893\n同济大学医学院\t2894\n取定\t2895\n马思\t2896\nNormalization\t2897\n9月1\t2898\n577\t2899\n云南建投集团\t2900\nhydrocarbon\t2901\n多少厘米\t2902\n密押\t2903\n赵克志\t2904\n嬴荡\t2905\n彭鲁\t2906\nPhpStorm\t2907\n白虎\t2908\n上海国际医院\t2909\n黄鹤楼送孟浩然之广陵\t2910\n标致2008\t2911\n茱莉\t2912\n前进性\t2913\n愈多\t2914\n禅学\t2915\n加鲁鲁\t2916\n安德烈\t2917\n宣太后\t2918\n撒哈拉沙漠\t2919\nhkg\t2920\nworry\t2921\n阔叶黄檀\t2922\n虎鞭\t2923\n孙卓\t2924\n陈安可\t2925\n铜币\t2926\n定海政府网\t2927\n水污染控制工程\t2928\nC
ounts\t2929\nbingbian\t2930\n下司犬\t2931\n5棵\t2932\n绮罗\t2933\n三只猫\t2934\n后味\t2935\n苏晋\t2936\n逆光\t2937\n血轮眼\t2938\n南沙天后宫\t2939\n多元统计分析\t2940\nYY禁歌网\t2941\n咸丰通宝\t2942\n002292\t2943\nabl\t2944\ncrul\t2945\n2399元\t2946\n好易通\t2947\n京歌\t2948\n咖啡味\t2949\n构造法\t2950\n徽派\t2951\n四象限\t2952\nsofitel\t2953\n我的人生\t2954\nac9\t2955\n枣园镇\t2956\n侧重于\t2957\n数云\t2958\n丧偶\t2959\n博古\t2960\n富邦广场\t2961\n最高位\t2962\n新风路\t2963\nAxureRP7.0\t2964\nIllustrat\t2965\nsandea\t2966\nvina\t2967\n滠口\t2968\n20160627\t2969\nniit\t2970\n超温\t2971\n郭艾王石\t2972\nmov格式\t2973\n出了问题\t2974\n压舱石\t2975\n初婚\t2976\n邢帅教育\t2977\n整合素\t2978\n贝吉特\t2979\n宁波大学法学院\t2980\n鼻毛\t2981\n乡镇工会\t2982\nsetting\t2983\npronounce\t2984\n地热能\t2985\n卖得火\t2986\n尼康D90\t2987\n天志\t2988\n11k\t2989\n兴荣\t2990\n武夷山北站\t2991\n鸟窝\t2992\n梁画\t2993\n悬着\t2994\nbridal\t2995\n16辆\t2996\n50期\t2997\n九江经济技术开发区\t2998\n杀虫灯\t2999\n华融信托\t3000\n北京温泉\t3001\nMPa\t3002\n400km\t3003\n导磁率\t3004\n广州小区\t3005\n生产方式\t3006\n好好先生\t3007\n所言\t3008\ntushu\t3009\n卡图\t3010\n李志敏\t3011\n冒险岛吧\t3012\n圣斗士星矢剧场版\t3013\n艾滋病日\t3014\n小作家\t3015\n孤寒\t3016\n日工\t3017\n指染驱灵师\t3018\nRenfe\t3019\n试行稿\t3020\n立体音\t3021\n打错\t3022\n虚空恐惧吧\t3023\n拿来主义\t3024\n3.5_\t3025\n雨林古茶坊\t3026\n饿汉\t3027\n荣耀盒子pro\t3028\n下议院\t3029\n野田\t3030\n下马\t3031\n齐刘海\t3032\n凝舞\t3033\n博途v15\t3034\n漫友\t3035\n水稀\t3036\nProtocol\t3037\n翠\t3038\n铜盂镇\t3039\n鸟样\t3040\n12.5.1.165\t3041\nPWC\t3042\n微博号\t3043\n这个周六\t3044\n高清版\t3045\n买菜\t3046\n买链帮手\t3047\n匍匐\t3048\n大数据产业园\t3049\n学前儿童发展心理学\t3050\n裸跑\t3051\nArea-51\t3052\n受害人\t3053\n肟\t3054\n朴信惠\t3055\n生查子\t3056\nwhu\t3057\n天龙集团\t3058\n平衡二叉树\t3059\n1月20日\t3060\nOBB\t3061\n人心\t3062\n明志健致远\t3063\njieti\t3064\n凝砂\t3065\nTheano\t3066\n星湖科技\t3067\n豪斯医生\t3068\n致我们终将逝去的青春\t3069\n吹水机\t3070\nDR4\t3071\nSerialize\t3072\nBKL\t3073\n程敏\t3074\n大寿\t3075\n四磨汤口服液\t3076\n长年累月\t3077\n金口河区\t3078\ntoms\t3079\n非活动\t3080\n骆宏俊\t3081\n周晓光\t3082\n莫高窟\t3083\n深圳市人民医院\t3084\nLut\t3085\n欣华\t3086\n农视网\t3087\neWebEditor\t3088\n0.5ml\t3089\n600266\t3090\n飞行\t3091\n终曲\t3092\nfinereader12\t3093\n尘印\t3094\nTL-WR886N\t3095\n太白金
星\t3096\n果\t3097\nEXPLAIN\t3098\n内帐\t3099\n边数\t3100\nuCOS\t3101\n高媛媛\t3102\nV8.5\t3103\n门室\t3104\n聚合物\t3105\ninsights\t3106\n嗷大喵\t3107\n万能转换器\t3108\n天津古文化街\t3109\n遂昌县\t3110\nshure\t3111\n口袋妖怪火红\t3112\n安正\t3113\n斯塔姆\t3114\nAllway\t3115\n2017年8月30日\t3116\npxbibm\t3117\nYouthLA\t3118\n1条\t3119\n追逃\t3120\n正大集团\t3121\n法王\t3122\nECMS\t3123\n白扣\t3124\n汕头市\t3125\n中国儿童少年基金会\t3126\nRadioButton\t3127\n平移\t3128\n少淑\t3129\n侠盗猎车手GTA4\t3130\n动了心\t3131\n善被人欺\t3132\n腌鸭蛋\t3133\n神七\t3134\n脚手板\t3135\n沪科版\t3136\naig\t3137\n妹岛\t3138\n跑\t3139\n欧铂丽\t3140\n大联大控股\t3141\n2.torrent\t3142\n林区\t3143\n因式分解法\t3144\n疣猪\t3145\n北京师大\t3146\n博城\t3147\n丢三落四\t3148\n移动性\t3149\n美骑论\t3150\n飞行技术专业\t3151\n龙腾苑\t3152\n雨点儿\t3153\n易德龙\t3154\n奥斯\t3155\n深圳日海通讯技术股份有限公司\t3156\n绣花针\t3157\nSPDIF\t3158\n水户\t3159\n京津冀投资网\t3160\npercentile\t3161\n迷宫\t3162\nGalvanized\t3163\nMENTHOL\t3164\n波罗丁\t3165\n无理\t3166\nbaker\t3167\nSAB\t3168\n25回\t3169\n卫立煌\t3170\n硚口路\t3171\n小帅机器人\t3172\n第18届\t3173\n炫目\t3174\nporject\t3175\n大革命\t3176\n数转\t3177\n路书\t3178\n情操\t3179\n150年\t3180\n包谷\t3181\nnvida\t3182\n三色堇\t3183\n脸脸\t3184\n203mm\t3185\n打着\t3186\n存谢\t3187\n官渡之战\t3188\n机动\t3189\n皖西大裂谷\t3190\n粉霜\t3191\n耶子\t3192\n菜架\t3193\ndirgo\t3194\n土耳其烤肉\t3195\n毕春芳\t3196\n中国大舞台\t3197\n六千多万\t3198\n1440x2560\t3199\n第57期\t3200\n智能交换机\t3201\n电信级\t3202\naio\t3203\n全汇总\t3204\nive\t3205\nRiding\t3206\n江干九堡\t3207\n摆法\t3208\n朱生豪\t3209\n外交术\t3210\n陈志杰\t3211\n替身术\t3212\n跑起\t3213\ngenerous\t3214\n系统流\t3215\n国瑞地产\t3216\n胃肿瘤\t3217\n马航mh370\t3218\ntoro\t3219\n欧文斯\t3220\nM3U8格式\t3221\n中参\t3222\n结节性红斑\t3223\n郑阜\t3224\n投诚\t3225\n捷豹F-PACE\t3226\nRAID1\t3227\n闲趣\t3228\n两点半\t3229\nzypxx\t3230\n4K高清\t3231\n菲欧娜\t3232\n天包\t3233\n即便\t3234\n国内部\t3235\n孙丹菲\t3236\nstring型\t3237\nBy_WJSN\t3238\n十堰汽车网\t3239\n贲友林\t3240\n厶\t3241\n马芬\t3242\n估价表\t3243\n奔赴\t3244\n深港DJ总站\t3245\n4525\t3246\n牙杯\t3247\ngearvr\t3248\n棘齿城\t3249\n连云港市第一人民医院\t3250\n棉球\t3251\nOD\t3252\n标识列\t3253\nSpy\t3254\n三儿\t3255\n五道杠\t3256\n川外\t3257\n太空飞船\t3258\n厦门广电\t3259\n用费\t3260\nThong\t3261\n联盟链\t3262\nsol君吧\t3263\n前海湾\
t3264\n口袋妖怪黑白\t3265\n裸片\t3266\n完税证明\t3267\n王永辉\t3268\n东川路\t3269\n土样\t3270\n泡花碱\t3271\nherit\t3272\n画箱\t3273\n迪粉\t3274\n近二十年\t3275\nLONDON\t3276\n桌游\t3277\n黄胜\t3278\n公务员职务与职级并行制度\t3279\n石龙火车站\t3280\n下市\t3281\n发泡机\t3282\nSogou\t3283\n戴进\t3284\n知花梅莎\t3285\n63mm\t3286\n乐视x800\t3287\ndav\t3288\nbears\t3289\n湖南快乐阳光互动娱乐传媒有限公司\t3290\n杨广\t3291\n整治\t3292\n标机\t3293\n叙利亚自由军\t3294\n金诃\t3295\n伏尸\t3296\ndigdeep126\t3297\nja\t3298\n大章\t3299\n琼楼\t3300\n开户行联行号\t3301\n粮食局\t3302\n佰仟金融\t3303\nKYOCERA\t3304\n华美达酒店\t3305\n老年性阴道炎\t3306\n第五号\t3307\n中国国际经济贸易仲裁委员会\t3308\n交杯\t3309\nAlamofire\t3310\n中公教育湖北分校\t3311\n双组份\t3312\n魂师\t3313\n野良猫\t3314\nchock\t3315\n好地网\t3316\nPolicy\t3317\n泸州网\t3318\n救助\t3319\nSW\t3320\nc920\t3321\n0.5元\t3322\n时尚类\t3323\n3136\t3324\n丁俊\t3325\n小寺\t3326\n汗脚\t3327\n开心鬼放暑假\t3328\n泰加\t3329\n诲\t3330\n王孟\t3331\n血迹\t3332\n两万里\t3333\n后沙峪镇\t3334\n摩加迪沙\t3335\n再喝\t3336\nclusters\t3337\n胡萝卜汤\t3338\nMetabolism\t3339\nv2.4.4\t3340\n1.3m\t3341\ncd2\t3342\nTate\t3343\n长沙地铁7号线\t3344\n风骤\t3345\n幻影忍者\t3346\n师达\t3347\n指委\t3348\n楹\t3349\nFatty\t3350\nKaplan\t3351\n黄新宇\t3352\n28\t3353\n剑拔弩张\t3354\n柯基俱乐部\t3355\n边陲\t3356\n雨润广场\t3357\n弗朗明哥\t3358\n导论\t3359\n拿铁咖啡\t3360\n蜜意\t3361\n幻想言情小说-17k小说网\t3362\n金钟街\t3363\nUPUPOO\t3364\n4寸\t3365\n线组\t3366\nepplus\t3367\nm2eclipse\t3368\n黄金圣斗士\t3369\n26码\t3370\n迪亚士\t3371\n返奖\t3372\nbeside\t3373\n珠海艺术职业学院\t3374\n注册会计师证\t3375\nnongye\t3376\n滨海新闻网\t3377\n阻挠\t3378\n20161223\t3379\n胎铃\t3380\nExBot易科机器人实验室\t3381\n罗茜\t3382\n长春汽车工业高等专科学校\t3383\n850PRO\t3384\nBesame\t3385\n东帝汶\t3386\n央行\t3387\n4440\t3388\n_路考路\t3389\n杀手锏\t3390\n超萌\t3391\n作战\t3392\n外女\t3393\n县际\t3394\n病案室\t3395\n决赛\t3396\n宠物\t3397\neverest\t3398\n军购\t3399\n十四首\t3400\n新金融\t3401\n天尊\t3402\n抢镜\t3403\n王中秋\t3404\n木府\t3405\n留日\t3406\n木皮\t3407\n视奏\t3408\n大圆\t3409\n碳\t3410\n训诫文\t3411\n瓦机\t3412\n微马\t3413\n烽火战国\t3414\n不说再见\t3415\n碳酸氢钠\t3416\n刘家窑\t3417\nlung\t3418\n演唱家\t3419\nminipage\t3420\n变性\t3421\nPScs6\t3422\n安川伺服驱动器\t3423\n随大流\t3424\n客房\t3425\n胡志明市\t3426\n福建省公安厅交警总队\t3427\n垂直入\t3428\n勾股\t3429\n喷字\t3430\n博
游网\t3431\n无线电论坛\t3432\n行水\t3433\n湛江地区\t3434\n战神之路\t3435\n条机\t3436\n龙日一\t3437\n舒克\t3438\n艾伦沃克\t3439\n行政诉讼法\t3440\n高恩\t3441\nmockito\t3442\n55寸\t3443\n纷繁\t3444\nsignaling\t3445\n本塘\t3446\n落跑\t3447\nES文件管理器\t3448\n红阳能源\t3449\nfsmc\t3450\n解液\t3451\n13幅\t3452\n以爱之名\t3453\n端城\t3454\n百家讲坛国史通鉴\t3455\nMSP43016\t3456\nhttp接口功能测试\t3457\nATC\t3458\n虚空行者\t3459\n三路\t3460\nkanken\t3461\n微体\t3462\n陈玉婷\t3463\n青岛啤酒节\t3464\n万达电影院\t3465\n猎奇_风度男人网\t3466\n59&\t3467\ntail\t3468\n希尔微\t3469\nHurts\t3470\n第B01\t3471\n不爽\t3472\n动画界\t3473\naqara\t3474\n兴宁路\t3475\n邦邦岛\t3476\noned\t3477\nWhatis\t3478\n超细\t3479\n基佬们\t3480\n纣临\t3481\nsubsection\t3482\n灼痛\t3483\nKeeper\t3484\n凯利泰\t3485\n临济\t3486\n寺\t3487\n盐田网\t3488\n苏珊·桑塔格\t3489\nep01\t3490\n原理篇\t3491\n贪睡\t3492\n格列吡嗪\t3493\nBRL\t3494\n妙经\t3495\n逢坂大河\t3496\n墓王之王悬棺寺截图_墓王之王\t3497\nDTY\t3498\nantlr\t3499\nskew\t3500\n300001\t3501\n仔旭\t3502\n256位\t3503\n济南省\t3504\nvirtualbox虚拟机\t3505\n小段\t3506\nhs8145v\t3507\n动态树\t3508\n易受\t3509\nSpecials\t3510\n中田英寿\t3511\nTello\t3512\n色女\t3513\nMXM\t3514\n子轩\t3515\n髓鞘化\t3516\n芍药苷\t3517\n第14章\t3518\n劲客\t3519\n0.10\t3520\n天津奥数网\t3521\n陈南\t3522\n幼儿园教育指导纲要\t3523\n伯尔尼\t3524\n园林绿化专业\t3525\n横径\t3526\n计划表\t3527\n录井\t3528\n中铁七局集团有限公司\t3529\nsetter\t3530\n不远\t3531\n松松论坛\t3532\n成见\t3533\n鼓架\t3534\n江铃集团\t3535\n很抱歉\t3536\n有图有真\t3537\n旋钮式\t3538\n特里斯坦·汤普森\t3539\nPED\t3540\n独龙江\t3541\n微章\t3542\nTFS2013\t3543\n新农网\t3544\n三星kies\t3545\n华附\t3546\n弃标\t3547\n黄江\t3548\n平方项\t3549\n会计从业资格证考试\t3550\nBak\t3551\n1.4.0\t3552\njufd\t3553\n酒吧街\t3554\n麦金利\t3555\n吞\t3556\nMetallic\t3557\n百合吧\t3558\n荣耀8\t3559\nv2.0.10\t3560\n七颗\t3561\nVc\t3562\nxiaoy\t3563\n再读\t3564\n平安妖物语\t3565\n长冈\t3566\n恩雅\t3567\n育碧\t3568\nluck路客\t3569\nmu奇迹MU\t3570\n兰卡斯特\t3571\n古镇镇\t3572\n二十多个\t3573\nLYN\t3574\n三点钟\t3575\nRemoval\t3576\n宁波大学图书馆\t3577\n南京大学网络教育学院\t3578\n金鱼草\t3579\n长沙市交通运输局\t3580\n20170506\t3581\nTomb\t3582\n1秒后\t3583\n巡礼者\t3584\nicpms\t3585\nstr\t3586\n優質\t3587\n这个群\t3588\n杭州火车南站\t3589\nExynos4412\t3590\n出人头地\t3591\n150日\t3592\n王永杰\t3593\n梅子青\t3594\n通知音\t3595\n16.
8元\t3596\n错题本\t3597\nBrooks\t3598\nbr/\t3599\ncrt-conio-l1-1-0.dll\t3600\n集度\t3601\nproset\t3602\nine\t3603\n记谱\t3604\n义勇军进行曲\t3605\n层流\t3606\n美好愿景\t3607\n中华人民共和国水法\t3608\n丨艾肯家电网\t3609\n双良\t3610\nAngel_Kitty\t3611\n中昂地产\t3612\n抵押\t3613\nBeaches\t3614\n乐视盒子\t3615\nThread\t3616\n刑事诉讼法\t3617\n惊声尖叫\t3618\nGI\t3619\nMafly\t3620\nfindstr\t3621\n创业型\t3622\n硝酸镁\t3623\n福州屏东中学\t3624\n3d贴图网\t3625\n小产\t3626\nhttpurl\t3627\n通话\t3628\n2000w\t3629\nmcmc\t3630\n湖口\t3631\n浩月白雪\t3632\n清式\t3633\n李子雄\t3634\n友衰\t3635\n庄磊哥\t3636\n邰\t3637\n草书字体转换器\t3638\n卫栖梧\t3639\n沙特里亚尔\t3640\n翼虎路虎揽胜\t3641\nv1.0免费版\t3642\n可盈可乐\t3643\n男变\t3644\n北京市工商局\t3645\nPerspective\t3646\nforms\t3647\n妈妈\t3648\n一箭双雕\t3649\n塞西莉亚\t3650\n石墨坩埚\t3651\n暮年\t3652\n恒丰路\t3653\n16.2\t3654\n1挡\t3655\n120型\t3656\n颈枕\t3657\nMBI集团\t3658\n曝气沉砂池\t3659\namerica\t3660\nPeople\t3661\n浙富控股\t3662\n不骄不躁\t3663\n中远集团\t3664\n新能源科技有限公司\t3665\n级子\t3666\n居住房\t3667\n茂名日报\t3668\n江苏地区\t3669\nSp3\t3670\n考核期\t3671\n欧洲卡车模拟2\t3672\n谢蕾蕾\t3673\n保利东郡\t3674\n上海虹\t3675\n王韵壹\t3676\n富士见\t3677\n二手房按揭贷款\t3678\n藝\t3679\n872\t3680\n好睡\t3681\n恶神\t3682\n0则\t3683\n慢摇\t3684\n打虎上山\t3685\n模块机\t3686\ncad批量\t3687\nWORD格式\t3688\n一百亿\t3689\n尹食堂\t3690\n水台\t3691\n商装\t3692\n玩趣\t3693\n汉斯·季默\t3694\n消力池\t3695\n零件盒\t3696\n晓薇\t3697\n五条人\t3698\n爆轰\t3699\n1611\t3700\n虎威\t3701\n有资\t3702\nglow\t3703\n吐嘈\t3704\n13部\t3705\n午\t3706\n高密度影\t3707\n李依\t3708\n落第\t3709\n累人\t3710\n水怪\t3711\n凤鸣鸟\t3712\n杰西达邦\t3713\n白杨街道\t3714\n29.9\t3715\n加工性\t3716\n猩红公爵\t3717\n张雪迎\t3718\n有座\t3719\n营业站\t3720\n开票方\t3721\n低卡\t3722\n机器学习\t3723\n福建\t3724\n2000平方米\t3725\n果宝特攻\t3726\n6-8月\t3727\n公煲\t3728\n小米recovery\t3729\n心脑血管疾病\t3730\nyears\t3731\n有理数\t3732\nMCV\t3733\n干燥症\t3734\n圣剑联盟\t3735\n泽尼特\t3736\n巧乐兹\t3737\n平正\t3738\n乐亭\t3739\ndom树\t3740\n文丰\t3741\n杨美玲\t3742\n客会\t3743\n八卦阵\t3744\nresulting\t3745\n鸣冤\t3746\n全志科技\t3747\n临危不惧\t3748\nlvm\t3749\n杨科璋\t3750\nApple\t3751\nKitwarePublic\t3752\nWINDOWSXP\t3753\nInfusion\t3754\n免伤\t3755\n水车薪\t3756\n奶瓶消毒器\t3757\n女花\t3758\n楼书\t3759\n骚逼\t3760\n浪琴手表\t3761\n北京军事博物馆\t3762\n義母\t
3763\n小中\t3764\n16款\t3765\nAutomatically\t3766\n人衣\t3767\nRobotStudio\t3768\n东欧\t3769\nAnroid\t3770\n人面\t3771\n高反\t3772\n亚平\t3773\n司米\t3774\n800公斤\t3775\n上加\t3776\n帝位\t3777\n谢公屐\t3778\n地空导弹\t3779\n哭泣\t3780\n令咒\t3781\nfow\t3782\n永泰\t3783\n谏议\t3784\n厦门华厦学院\t3785\n夏明翰\t3786\n巴莱\t3787\nEssex\t3788\n中国工业电器网\t3789\n通报会\t3790\n安迪尔\t3791\n理想乡\t3792\ntrademark\t3793\nascll\t3794\n怒斩\t3795\n体育场馆\t3796\nServer\t3797\nYARiS\t3798\nbypass\t3799\n百加得\t3800\n韭菜炒鸡蛋\t3801\nvivoX7\t3802\nraysource\t3803\n合校\t3804\n逗哈\t3805\n千彩\t3806\n两旁\t3807\n新东方小学\t3808\n5181\t3809\nsll\t3810\n文殊院\t3811\n上卷\t3812\n十字军东征\t3813\n锻\t3814\n海洋科学学院\t3815\nnoobs\t3816\n1978年以来\t3817\n储气罐\t3818\n中华陶瓷网\t3819\n披星戴月\t3820\nGirl\t3821\n8460p\t3822\n周培源\t3823\n塞尔比\t3824\n同向\t3825\n尤里卡\t3826\n木林森股份有限公司\t3827\n王亚男\t3828\nMDX\t3829\n何军\t3830\n工作照\t3831\n红鬼\t3832\n七剑下天山\t3833\nipaddr\t3834\n松石\t3835\n克里希\t3836\n纽太特\t3837\n全盛时期\t3838\n入门机\t3839\n熄\t3840\nBrooke\t3841\n5.6.2\t3842\n伏羲山大峡谷\t3843\n漳州高新区\t3844\nOMRON\t3845\n武磊\t3846\n录音棒\t3847\nBROTHER\t3848\n车马费\t3849\n偶数页\t3850\n微观经济学\t3851\nopac\t3852\n20160513\t3853\nPiston\t3854\n彭禺厶\t3855\n青岛市环境保护局\t3856\nnigga\t3857\n九龙仓时代上城\t3858\n调整器\t3859\n冲锋陷阵\t3860\n郑州市城乡建设委员会\t3861\n1481\t3862\n权益型\t3863\n喜力士\t3864\nnintendo\t3865\n60平方\t3866\nDEEP\t3867\n10x\t3868\n铁建\t3869\n脚口\t3870\n无怨无悔\t3871\n地球物理学\t3872\n召唤师\t3873\n金山逍遥Xoyo.Com\t3874\nitmole\t3875\n尼罗河上的惨案\t3876\n万科金色家园\t3877\n莫焕晶\t3878\n东方航空物流\t3879\n恋爱的发现\t3880\nGE60\t3881\nse7en\t3882\n天津环湖医院\t3883\n炸鸡翅\t3884\n不可撤销\t3885\n苦心\t3886\n非企业\t3887\n不醉\t3888\n灰鹅\t3889\nbandizip\t3890\n七天\t3891\n格斗赛\t3892\n农膜\t3893\n儿女传奇\t3894\n上轨\t3895\n模柄\t3896\n31次\t3897\n苏州大学独墅湖校区\t3898\n4555\t3899\n恬\t3900\n江苏省财政厅\t3901\n爱茉莉\t3902\n水晶板\t3903\n干砌\t3904\n姜帆\t3905\n瞭望东方周刊\t3906\n中国信鸽信息网各地信鸽协会\t3907\n溶点\t3908\n检验检测机构资质认定管理办法\t3909\n车次表\t3910\n料酒\t3911\nxcin\t3912\n盖头\t3913\n彭丽媛\t3914\n王座\t3915\n高榕\t3916\n势场\t3917\n偶联反应\t3918\nMoschino\t3919\nchk\t3920\n不忍\t3921\n地球化学\t3922\n苏州高新\t3923\n车速表\t3924\n西宮\t3925\n咋回事\t3926\n克拉克\t3927\nTIP\t3928\
n油\t3929\n简比\t3930\n暴食症\t3931\n白体\t3932\n点亮\t3933\nv2.7.0\t3934\n半睡半醒\t3935\n湖南大学研究生院\t3936\n玉屏\t3937\n27g\t3938\n郎月婷\t3939\nGoat\t3940\nmate9\t3941\nBOX\t3942\naffect3d\t3943\nFeatures\t3944\npleasant\t3945\n云胡不喜\t3946\n方正字库\t3947\n陕西省财政厅\t3948\n七十二式性\t3949\nUnsigned\t3950\n搜狐微门户\t3951\n科学美国人\t3952\ncomplement\t3953\n鲜百合\t3954\nnfs\t3955\n古词\t3956\n纪家庙\t3957\n登陆舰\t3958\n王力军\t3959\n88路\t3960\n看准网\t3961\n项目群\t3962\n上海高速\t3963\n柞水县\t3964\n佐良娜\t3965\n鲱鱼罐头\t3966\nEPS模板\t3967\n弹片\t3968\n公章\t3969\n两片\t3970\n东周\t3971\n閱讀\t3972\n亚亚\t3973\n唐宁街\t3974\nGunicorn\t3975\n不可逾越\t3976\n笑一个\t3977\n夜勤病栋全集\t3978\ndevi\t3979\n2018.4.30\t3980\n不赖\t3981\n渔家\t3982\n线内\t3983\n三教九流\t3984\n大青\t3985\n情水\t3986\n防风罩\t3987\n杆体\t3988\n横道世之介\t3989\n合肥政务区\t3990\nGY6\t3991\n仙剑奇侠传五前传\t3992\n螺旋输送机\t3993\n武汉市发展和改革委员会\t3994\n贮\t3995\nAmazon亚马逊\t3996\n胃肠功能紊乱\t3997\n2转\t3998\n64k\t3999\n干仗\t4000\n布头\t4001\n官渡\t4002\n色相饱和度\t4003\n铝条\t4004\n三四片\t4005\nGIT服务器\t4006\n十里春风不如你\t4007\n战马\t4008\nt17\t4009\n防尘袋\t4010\n黄业\t4011\n悬梁刺股\t4012\n第38\t4013\n西瓜太郎\t4014\n南京海底世界\t4015\n隼龙\t4016\nDataGuard\t4017\n霍林河\t4018\n哭不出来\t4019\nfinish\t4020\n乃亚\t4021\n爆满\t4022\n偏振片\t4023\n黄帝内经\t4024\n7.31\t4025\nfart\t4026\n杂坛\t4027\n4节\t4028\nlaster\t4029\nmovin\t4030\n阿苏\t4031\nlns\t4032\nvirtuoso\t4033\n冉高鸣\t4034\n粉碎者\t4035\n11月14日\t4036\n插帧\t4037\n诉讼状\t4038\nshidi\t4039\n低奢\t4040\n斯芬克斯猫\t4041\n三招\t4042\n辣文肉文\t4043\n鸿蒙树\t4044\n绿化工程师\t4045\nunlink\t4046\n设法\t4047\n射频识别\t4048\n世界树迷宫\t4049\n氘代\t4050\n瑞鹤仙\t4051\n人祖\t4052\nlandmark\t4053\ncqq\t4054\n洪涛\t4055\n_体\t4056\n86平\t4057\n香港教育大学\t4058\n王婆\t4059\ncoins\t4060\n幻想三国志5\t4061\n连接线\t4062\n钱钟\t4063\nlna\t4064\n闹元宵\t4065\n东奥注册会计师\t4066\nHTT\t4067\nc++2013\t4068\ncoll\t4069\n酒泉市人民政府\t4070\n桥本爱实\t4071\n人篇\t4072\n宝泰隆\t4073\n支链淀粉\t4074\n真值表\t4075\n原友\t4076\nmaximo\t4077\nSimmons\t4078\n双卡双待单通\t4079\n沉淀\t4080\n治理层\t4081\n酷乐\t4082\n自是\t4083\n桑园镇\t4084\n战舰帝国\t4085\nzowie\t4086\n渐渐\t4087\n指挥长\t4088\n泛化\t4089\nbillb\t4090\n西塘古镇\t4091\n音书\t4092\n买办\t4093\n摩尼\t4094\n枪色\t4095\nTZ\t4096\n200层\t4097\
n长安cx70\t4098\n通识\t4099\n五体\t4100\nraid0\t4101\n高频变压器\t4102\n分析师\t4103\n蔫\t4104\n8分\t4105\n脚后跟\t4106\n中国输血协会\t4107\n波旁\t4108\n撸串儿\t4109\n贴体\t4110\n栾凯\t4111\n烧热\t4112\n鼻息\t4113\n反流性胃炎\t4114\naxios拦截器\t4115\n名绣\t4116\n长白山西坡\t4117\n总店\t4118\n湛江万达广场\t4119\n德城区\t4120\n上海整形美容医院\t4121\n中心机\t4122\n白胖子\t4123\n明泰\t4124\n天安门东\t4125\n听起来\t4126\n藏珑\t4127\n州\t4128\n净水器\t4129\nformatting\t4130\n综合_新闻中心\t4131\n古井贡酒\t4132\n搜救犬\t4133\nword表格行高_\t4134\nwafer\t4135\n碧儿\t4136\n8240\t4137\n全敏\t4138\n女文青\t4139\n三甲港\t4140\n急要\t4141\n燕山大学机械工程学院\t4142\n访朝\t4143\n世界联合学院\t4144\n蚀骨\t4145\n依托\t4146\n单用途商业预付卡管理办法\t4147\n朝阳镇\t4148\n占位性\t4149\n手刹\t4150\n$ref\t4151\nbhg\t4152\n陈钰\t4153\n染色体组\t4154\nexclusions\t4155\n包皮炎\t4156\nxbm\t4157\n马克思佩恩2\t4158\n50g\t4159\n511880\t4160\n癸二酸\t4161\n私募证券基金\t4162\n洛阳牡丹节\t4163\n关宁\t4164\n阶段性\t4165\ncher\t4166\n深圳市嘉立创科技发展有限公司\t4167\n新诺\t4168\n陶心瑶\t4169\njOOQ\t4170\n公屋\t4171\n1347841号段\t4172\n2017年4月25日\t4173\n黄褐斑\t4174\n大杂会\t4175\n田家良\t4176\nvnc4server\t4177\n高宽比\t4178\n微信公众号测试号\t4179\n红河州\t4180\n百度安全中心\t4181\n普车\t4182\n礼子\t4183\n荣威750\t4184\n郑州航空工业管理学院\t4185\n强国\t4186\n空页\t4187\nrubbish\t4188\n周城\t4189\n后胸\t4190\nGerrit\t4191\n偷天换\t4192\n刺激片\t4193\nイケナイコト\t4194\n力洛克\t4195\n电器类\t4196\n住在\t4197\n别碰\t4198\npatch\t4199\n快青\t4200\n发钗\t4201\n220ml\t4202\n星牌\t4203\nROWID\t4204\n规划书\t4205\nV3.7.0.0\t4206\n陶杰\t4207\n文三路\t4208\n南宫萧尘\t4209\nEEPROM\t4210\nTinyOS\t4211\n林夕丁俊晖\t4212\n超导\t4213\n鞋材\t4214\nMultipart\t4215\noperational\t4216\n老隆镇\t4217\n十则\t4218\n以退为\t4219\n容错率\t4220\n孤僻\t4221\nandrew\t4222\n九十三\t4223\n养胃粉\t4224\n中国民航大学\t4225\n相原\t4226\n内胎\t4227\n立交桥\t4228\nCX-5论\t4229\nsequences\t4230\n少年天子\t4231\nPING通\t4232\n椒江人才网\t4233\naed\t4234\n预热\t4235\n杭州职业技术学院\t4236\n人材机\t4237\n熊爸熊\t4238\n中华人民共和国农村土地承包法\t4239\nscoop\t4240\n中技\t4241\n穿梭\t4242\n中共西安市纪委监察局\t4243\n1颗\t4244\n管乐团\t4245\n递延\t4246\n寒冰3\t4247\nblueprint\t4248\n78万\t4249\n埃德加\t4250\n徐和谊\t4251\nHIFI播放器\t4252\n金士顿内存\t4253\n爱梁者说\t4254\n磁畴\t4255\n笋干\t4256\nCAD2014\t4257\n方燕\t4258\n邓鸿\t4259\n源起\t4260\n追寻爱的踪迹\t4261\n亮黑\t
4262\n地球村\t4263\n新业路\t4264\n恩格尔系数\t4265\n济南装修公司\t4266\n山东省淄博市\t4267\n博雅教育\t4268\n第4期\t4269\nNazi\t4270\n25g\t4271\n380v\t4272\n学院路街道\t4273\n楠哥\t4274\n263\t4275\n普济寺\t4276\n央视315晚会\t4277\n书票\t4278\n头孢克肟胶囊\t4279\n日光\t4280\n全家福\t4281\n赫塔菲\t4282\n老纪\t4283\nhp1136\t4284\n抽水泵\t4285\n永源\t4286\n嘉禾街\t4287\n2017年11月01日\t4288\n余\t4289\n冥斗士\t4290\n淘宝天猫货源\t4291\n青客时尚租房网\t4292\n地下情\t4293\nCAT6\t4294\n国师\t4295\n卖油翁\t4296\n中央经济工作会议\t4297\n体育路\t4298\n1773\t4299\n六防\t4300\n重庆市建委\t4301\n成果展\t4302\nCPC\t4303\nunpredictable\t4304\n去氧\t4305\n张志平\t4306\n免检\t4307\nwebpy\t4308\n新发\t4309\n惠山新城\t4310\n高效化\t4311\nf650\t4312\n1.4万亿\t4313\nnetlist\t4314\n817\t4315\nCCM\t4316\n米道\t4317\n两首\t4318\n怪罪\t4319\nGTX1070\t4320\n几天\t4321\n奥巴马政府\t4322\n普创\t4323\n许银川\t4324\n山东艺术学院\t4325\n画线稿\t4326\n湖南省发展和改革委员会\t4327\n偏偏喜欢你\t4328\n中山花园\t4329\n海汽集团\t4330\n苏州健雄职业技术学院\t4331\n马仔\t4332\nNEX-5N\t4333\n补益\t4334\nseesion\t4335\n春逝\t4336\n李鸣岩\t4337\nchad\t4338\n12起\t4339\n吴炯\t4340\n卷尺\t4341\n真三国无双7with猛将传\t4342\n秦某\t4343\n乌拉诺斯\t4344\n淤\t4345\n偷天\t4346\n神武3手游\t4347\n训诫\t4348\n后遗症\t4349\n电吹风\t4350\n起点小说\t4351\n里坊\t4352\nmongo数据库\t4353\n1.66\t4354\nguest\t4355\nui设计师\t4356\n急招\t4357\n长眼\t4358\nVC++2010\t4359\n翼讯\t4360\n辆\t4361\n哈狗\t4362\n人体工学\t4363\n徐庄村\t4364\nFuzzy\t4365\n漓\t4366\nLab_分析测试百科网\t4367\n尼康D5\t4368\n201611\t4369\n锦江宾馆\t4370\n参会者\t4371\n输入框placeholder\t4372\n抵免\t4373\ncompaq\t4374\n昂山素季\t4375\n安卓8.0\t4376\ndeaths\t4377\n高通量基因测序\t4378\n星盘\t4379\n六间房\t4380\n勤劳\t4381\n泰和县\t4382\n李益民\t4383\n速达3000\t4384\n500问\t4385\n脱颖\t4386\n张春燕\t4387\n__\t4388\n急忙\t4389\n英菲微微一笑很倾城\t4390\n企业类\t4391\n凯迪拉克xt5\t4392\n婢\t4393\n黄体生成素\t4394\n胡萝卜周\t4395\n2302\t4396\n开发学\t4397\n话匣子\t4398\n抛售\t4399\n2房\t4400\n任重\t4401\n思悟\t4402\nMX-5\t4403\nGeography\t4404\n52\t4405\ninkydoom\t4406\nhotdog\t4407\n揉搓\t4408\n5542\t4409\n地脚\t4410\n彩粉\t4411\n少儿版\t4412\n绫\t4413\n元胡止痛片\t4414\n被窝网\t4415\n14年后\t4416\n51层\t4417\nimal\t4418\n乱真\t4419\n海底蜃境\t4420\n牵引\t4421\n忧虑\t4422\n弘愿寺\t4423\n维鲁斯\t4424\nunittest\t4425\nOrg\t4426\n走眼\t4427\n满月\t4428\nmongodb数据库\
t4429\n破壁料理机\t4430\n钟绍京\t4431\n大话数据结构\t4432\n洪流\t4433\n师姐们\t4434\n盐城市第一人民医院\t4435\n尼娅\t4436\n通讯类\t4437\n哇哈\t4438\n冰冠堡垒\t4439\n我的祖国\t4440\nYO\t4441\n邢台东站\t4442\nyx\t4443\n桃坪羌寨\t4444\nLinear\t4445\n嘉兴火车站\t4446\n毒死蜱\t4447\nmindmapper\t4448\n涉腐\t4449\n关山大道\t4450\n撸管男\t4451\n欧蓝德论坛\t4452\n瓯江国际新城\t4453\n李伯祥\t4454\n重游\t4455\n玛格\t4456\n天河城\t4457\n鄂前旗\t4458\n婊\t4459\n阿莫西林克拉维酸钾干混悬剂\t4460\n御指\t4461\n华瑞\t4462\nEXISTS\t4463\nbeyonce\t4464\n花光\t4465\n热血\t4466\ntennis\t4467\n德勤网申\t4468\n替吉奥胶囊\t4469\ntablename\t4470\n陕西省国际信托股份有限公司\t4471\n柘林湖\t4472\nflippy\t4473\nram\t4474\n千和\t4475\nautolisp\t4476\ngears3\t4477\n敢达\t4478\n小萝莉\t4479\n75%\t4480\n金投信托\t4481\n测站\t4482\n唇露\t4483\n汉化补丁v2.0\t4484\n小撸\t4485\n封信\t4486\n保洁公司\t4487\n陈瑶湖\t4488\n日音\t4489\n辽师\t4490\nResearchers\t4491\n燕山\t4492\n北外\t4493\n1100千伏\t4494\n陈汉典\t4495\npuyopuyo\t4496\n中国移动香港日\t4497\n现售\t4498\n坐浴\t4499\n顾炎武\t4500\n东莞新能源科技有限公司\t4501\n企划书\t4502\n真知灼见\t4503\n镇委\t4504\n贵阳晚报数字报\t4505\n中共共产党\t4506\nI帧\t4507\n减龄\t4508\nconvenience\t4509\n高天尊\t4510\n五\t4511\nparks\t4512\n火俄\t4513\n曾凡\t4514\n深圳新闻网\t4515\ncrime\t4516\nSTUN\t4517\n97个\t4518\n火碱\t4519\nappid\t4520\n寒假工\t4521\n东流\t4522\n矿砂\t4523\n苦笋\t4524\nfable\t4525\n时效期\t4526\n金世佳\t4527\n方物\t4528\n藤原妹红\t4529\n平安财富\t4530\n霸都\t4531\nstrive\t4532\n御城\t4533\njpg\t4534\n熊健\t4535\nkoa\t4536\n行政村\t4537\n宁桓宇\t4538\n4日游\t4539\nprototyping\t4540\n丹斯\t4541\n居民宿\t4542\n禾山街道\t4543\n野蜂飞舞\t4544\nnia\t4545\n光散\t4546\ndianying\t4547\n耳壳\t4548\nRicardo\t4549\n因纽特人\t4550\n油车\t4551\n杨康\t4552\n乱石\t4553\neer\t4554\np12\t4555\n安妮宝贝\t4556\n半环\t4557\n巨慢\t4558\n推波\t4559\nctreectrl\t4560\n调解化\t4561\n平易近人\t4562\n海航控股\t4563\n康惠保\t4564\n珂罗\t4565\n淮安清浦区政府\t4566\n鸡毛掸子\t4567\n三升级\t4568\n如椽\t4569\n车辆违章查询_汽车违章查询_交通违章查询-违章服务网\t4570\n违法性\t4571\n65纳米\t4572\n写实\t4573\n月影\t4574\nthief\t4575\n控制欲\t4576\n驳斥\t4577\nSUV论坛_汽车之家论坛\t4578\n机动车道\t4579\n福州五区\t4580\nmarvin\t4581\n惰轮\t4582\nCloset\t4583\n黎城县\t4584\n机膜\t4585\n4399剑\t4586\n月季园\t4587\n月下美人来\t4588\n外焰\t4589\n保胆取石\t4590\nExc\t4591\n中文核心期刊要目总览\t4592\n空头\t4593\nartifact\t
4594\n新疆油田\t4595\n第一女\t4596\n美容医院\t4597\n论及\t4598\n湖南黄金\t4599\n与世长辞\t4600\n影节\t4601\nliba\t4602\n杨明\t4603\n新手包\t4604\n第一类\t4605\n美第奇\t4606\n大卫·戈尔\t4607\nLian\t4608\nconflict\t4609\n靈\t4610\n无影胶\t4611\n吸虫\t4612\n负债端\t4613\n多迪\t4614\nnvram\t4615\n天华院\t4616\n李子\t4617\n裙\t4618\nPND\t4619\n领英LinkedIn\t4620\n8.88\t4621\nprob\t4622\n西建大\t4623\n地球\t4624\n社会主义\t4625\n五天\t4626\n颜亮\t4627\nYouJizz\t4628\n小苹果\t4629\n交管12123吧\t4630\n包养\t4631\n青岛市公安局\t4632\n北京京东\t4633\n拳击手\t4634\n醉卧\t4635\n博雅特产网\t4636\n徐南平\t4637\n宝盒\t4638\n万代高达\t4639\n斯巴达300\t4640\nrepl\t4641\ntheresa\t4642\n山科大\t4643\n新人教部编版小学\t4644\nwir\t4645\n吴建中\t4646\ngeely\t4647\n三讲\t4648\n飒漫\t4649\nPrelude\t4650\n火龙果苗\t4651\nobject类\t4652\n4MATIC\t4653\n奥佳华\t4654\n內容\t4655\n味全\t4656\n生灵\t4657\n安德海\t4658\n少女与战车\t4659\n板结\t4660\n蓝剑\t4661\n涡扇\t4662\n热式气体质量流量计\t4663\n2盘\t4664\n焦散\t4665\n薄靳言\t4666\n简弘\t4667\n传送\t4668\n火锅味\t4669\n脾氨肽口服液\t4670\nmicheal\t4671\n开缴\t4672\n广生堂\t4673\n英语三级考试\t4674\n夜神安卓模拟器\t4675\nmodifying\t4676\n总装\t4677\nf-5080470\t4678\n私募公司\t4679\n┏\t4680\n继动阀\t4681\n医政处\t4682\n三九集团\t4683\n无尽大冒险\t4684\n纳帕海\t4685\n花火\t4686\n妈妈级\t4687\n宁波效实中学\t4688\n金王\t4689\n女唱\t4690\n腹法\t4691\nTre\t4692\n百式\t4693\ntfd\t4694\n翔鹰\t4695\n牙雕\t4696\nCITIZEN\t4697\n梭边鱼\t4698\nmerger\t4699\n海创\t4700\n银发\t4701\n1926\t4702\n李幺\t4703\n山河人间\t4704\n智代\t4705\n狐仙\t4706\nThz\t4707\ni74790\t4708\n军转\t4709\n毕业作品\t4710\n扑出\t4711\nSignatures\t4712\n1600元\t4713\n尊耀\t4714\n试压机\t4715\n定\t4716\n广东培训网\t4717\n样品池\t4718\n接壤\t4719\n紫星\t4720\n北瓜\t4721\nopenId\t4722\n桂林理工\t4723\n广西壮族自治区财政厅\t4724\n移动电话卡\t4725\nlube\t4726\n双三次\t4727\n数据寄存器\t4728\n存货监盘\t4729\n蔡文\t4730\n诺唯真喜悦号\t4731\n主回路\t4732\nDataX\t4733\nUGC\t4734\n盗梦空间\t4735\n武汉市\t4736\n贾静雯\t4737\n强娶\t4738\n6000多位\t4739\n椽子\t4740\nCollect\t4741\n凤城四路\t4742\n决赛季\t4743\nsleuth\t4744\n孟菲斯\t4745\n苏菲贾跃亭\t4746\n武汉长江\t4747\nARM技术论坛\t4748\nreactjs\t4749\n废纸板\t4750\n交互设计\t4751\n二级建\t4752\n红槐花\t4753\n大珰\t4754\n耒阳\t4755\n艾弗森\t4756\n四立\t4757\n羊倌\t4758\n胡忠雄\t4759\n石门街\t4760\n知言\t4761\n燃脂\t4762\n深圳电力\t4763\n杭州摇号服务中心\t4764\n
Later\t4765\noxc000007\t4766\n贵阳物流公司\t4767\n陈百祥\t4768\n日语考试\t4769\n将子\t4770\n月入\t4771\n抵押贷款利率\t4772\nwebgoat\t4773\n厨房柜\t4774\n叶蓉\t4775\n灵遁者\t4776\n此岸\t4777\n读博\t4778\nzec\t4779\n低年级\t4780\nPPP项目公司\t4781\n位分\t4782\n仙门\t4783\n2016年12月20日\t4784\n杭州地铁三期\t4785\n相向\t4786\n安徽省妇幼保健院\t4787\n西风颂\t4788\n祛除\t4789\nFew\t4790\n闲鱼网\t4791\n马伊利\t4792\nmodbus4j\t4793\n仿制药一致性评价\t4794\n学生性\t4795\n山东半岛\t4796\n卢比奥\t4797\n突厥人\t4798\nwrite函数\t4799\n1公里\t4800\n五短\t4801\n宁波市区\t4802\n虚无主义\t4803\n尚书坊\t4804\n观想\t4805\n附示\t4806\n光泽度仪\t4807\n新浪博客\t4808\n营养盐\t4809\n邓小平故居\t4810\nMann\t4811\n红米手机3S/3X_MIUI论坛\t4812\n酔\t4813\nUIView\t4814\n绒布\t4815\n无锡一中\t4816\n天梭力洛克\t4817\n飞燕\t4818\nnowait\t4819\n西沃\t4820\n山云\t4821\nzoj\t4822\n第八章\t4823\nSIF\t4824\nCognition\t4825\n8024\t4826\n拉拢\t4827\n离京\t4828\n咕果\t4829\n时间简史\t4830\npaddlepaddle\t4831\n停泊岛\t4832\n教课\t4833\n综合医院\t4834\nEricsson\t4835\n松拓\t4836\nquel\t4837\n四专\t4838\n三千里\t4839\n布依族\t4840\n绿钞\t4841\n不耻下问\t4842\n熊猫谷\t4843\n经久耐用\t4844\n工业厂房\t4845\n支撑式\t4846\n真的假\t4847\n超能\t4848\nQue\t4849\n九龙县\t4850\n三水区\t4851\nking\t4852\n双反相机\t4853\ngssapi\t4854\n4.4G\t4855\nchaft\t4856\n可编程序控制器\t4857\n器号\t4858\n老叶\t4859\nTextEdit\t4860\n非屏蔽\t4861\n絕對\t4862\n猪笼草\t4863\n杀破剑侠世界2\t4864\n名表\t4865\nt460\t4866\nZHO\t4867\n4170\t4868\n歌谣季\t4869\n西来镇\t4870\n铁板\t4871\n信息化部\t4872\ntransportfever\t4873\nqe\t4874\n跟\t4875\n二十个月\t4876\nAvene\t4877\n陈光\t4878\n结晶度\t4879\n斯蒂\t4880\n孙悦\t4881\n70一代\t4882\n刘立立\t4883\n正平\t4884\n课时间\t4885\n老寨\t4886\n放错\t4887\n松子仁\t4888\n方舟公园\t4889\n火烧面\t4890\n1043\t4891\nwhois\t4892\nYouTuber\t4893\nX7Plus\t4894\n可比性\t4895\n青秀区\t4896\nPHPEXCEL\t4897\n200米\t4898\n杭州公园\t4899\n赖艺\t4900\n制毒\t4901\n七田\t4902\n珩磨机\t4903\ncgo\t4904\nVBSE\t4905\nbumper\t4906\n0810\t4907\n凌云工业股份有限公司\t4908\n荣威i6\t4909\n应收票据贴现\t4910\nattic\t4911\n王朝\t4912\n女系\t4913\n纵横比\t4914\ncass9.1\t4915\n88亿\t4916\n69凌波\t4917\n没去\t4918\n帕尔哈提\t4919\n陡然\t4920\n一人之下第二季\t4921\n7921\t4922\n预应力筋\t4923\nkW\t4924\n岩棉复合板\t4925\n波力斯卡\t4926\n王丽华\t4927\noverrides\t4928\n3125\t4929\n大家说\t4930\n20160723\t4931
\n只有一个人\t4932\n虚拟地球\t4933\n兵神\t4934\n七八级\t4935\n燕盏\t4936\n直线度\t4937\n柳擎宇\t4938\nInject\t4939\n下分\t4940\nsubstantial\t4941\nindentation\t4942\n图腾\t4943\n真夏夜的银梦\t4944\n华三\t4945\n摩擦轮\t4946\n盐雾试验机\t4947\n8月份\t4948\nexif\t4949\nURLEncode\t4950\n控制表\t4951\n欧布奥特曼\t4952\n无限不循环小数\t4953\n第246章\t4954\n灰泥\t4955\n展商\t4956\n二支\t4957\n整倍体\t4958\nDreaming\t4959\n紫堇\t4960\n重庆三峡学院\t4961\n东方坦克大战\t4962\n中装\t4963\nRtsp\t4964\n放课后\t4965\n急性鼻炎\t4966\n代序\t4967\n新昌新闻网\t4968\n保健品\t4969\n底版\t4970\n二等分\t4971\nX-100\t4972\n湘西自治州人民政府\t4973\nフィ\t4974\n静候\t4975\nnccl\t4976\n天涯明月刀\t4977\n纯属巧合\t4978\n16孔\t4979\n政府采购信息网\t4980\nnomoreshow\t4981\nPCo\t4982\n中国登报网\t4983\n外汇牌价_汇率网\t4984\nhzp\t4985\n壹城\t4986\n240PS\t4987\n3分\t4988\n滕王阁序\t4989\n基本工资\t4990\n有期收益率\t4991\n无锡滨湖\t4992\n楼外\t4993\n解忧公主\t4994\n4J\t4995\n平衡性\t4996\n中金在线\t4997\n大幅面\t4998\n询问\t4999\nNeiman\t5000\n每股\t5001\n埃德森\t5002\n尽管如此\t5003\n以太坊矿机\t5004\n王贻芳\t5005\n侠盗猎魔2\t5006\n收购价格\t5007\n宋威\t5008\n宁波妇儿医院\t5009\n穆里尼奥\t5010\n同济大学外事办公室\t5011\n参天fx\t5012\n贩售\t5013\n义捐\t5014\n自杀小队\t5015\n家家有本难念的经\t5016\n彭氏\t5017\nwich\t5018\n高企云\t5019\n郭的秀\t5020\n中括号\t5021\n星品\t5022\n所爱\t5023\n超级跑跑\t5024\n罗汉鱼\t5025\n超过五年\t5026\n健德门\t5027\n在外\t5028\n20171022\t5029\n六师五家渠政府网\t5030\n外露\t5031\n南山中心区\t5032\n注塑级\t5033\n九种\t5034\n萧雨\t5035\n来福岛爆笑娱乐网\t5036\n金具\t5037\n易网校\t5038\nxlsb\t5039\n口头禅\t5040\nlalala\t5041\nMCE\t5042\n中国香薰网\t5043\n二会\t5044\n脑梗后遗症\t5045\n一念向北\t5046\n720P&1080P\t5047\n常州道\t5048\n样点\t5049\n福山正达\t5050\n华商名人堂\t5051\n嗨钱网\t5052\n陈健\t5053\n苦境\t5054\nGees\t5055\n14C\t5056\n普者\t5057\n昆明地铁3号线\t5058\nBRS\t5059\n淄川区人民政府\t5060\n筒音\t5061\nforward\t5062\n完颜洪烈\t5063\n主域控\t5064\n新希望六和股份有限公司\t5065\n甲酸铵\t5066\n昆明网易\t5067\nQzRc\t5068\n牛栏山二锅头\t5069\n王王\t5070\n汲取\t5071\nppx\t5072\n车载dvd\t5073\n音手\t5074\n方大同\t5075\n1.49\t5076\n邓论\t5077\n保险业务\t5078\n红黄蓝幼儿园\t5079\n3甲\t5080\n哭殿\t5081\n32种\t5082\nrewind\t5083\n601288\t5084\n汽配\t5085\n杨志峰\t5086\n革命精神\t5087\n家娃\t5088\n转轮\t5089\nU9508\t5090\n海沧台商投资区管委会\t5091\n修修\t5092\n武汉市江汉区人民政府\t5093\n蟹堡王\t5094\n死亡证明\t5095\n粉钻\t5096\n译馆\t5097\n乡城
\t5098\nixpand\t5099\nMara\t5100\n王铮\t5101\npme\t5102\n不死传说\t5103\n独吞\t5104\n骑马场\t5105\n二十四岁\t5106\n披头士\t5107\n47寸\t5108\n时秋\t5109\n石材展\t5110\n官气\t5111\n四师\t5112\n中国银行_电子银行\t5113\n龚剑\t5114\n南栅\t5115\n13年\t5116\n活水公园\t5117\n成都分公司\t5118\nBeans\t5119\nmm4\t5120\n南山书院\t5121\n中新社_北京分社\t5122\n长效化\t5123\n创富港\t5124\n蒋纬国\t5125\n满月宴\t5126\n5Law.cn\t5127\n砣\t5128\nRAZ\t5129\n王亮\t5130\n5月20日\t5131\nse535\t5132\n汉明\t5133\n洁碧冲牙器\t5134\n工作室\t5135\n喜帖吧\t5136\n伴虎\t5137\n王莽\t5138\nsoapui\t5139\n网信集团\t5140\n树山\t5141\nhal库\t5142\nArticle\t5143\n壹海城\t5144\n10颗\t5145\n欠账\t5146\n芝加哥\t5147\n嘀咕论坛\t5148\n叉烧\t5149\n缘何\t5150\n3330\t5151\n滑铁卢大学\t5152\n上海自助经办\t5153\n平均工资\t5154\njcl\t5155\n洞通\t5156\n盆底肌\t5157\n江苏省旅游局\t5158\n网段\t5159\n超声造影\t5160\nKMnO4\t5161\n华表\t5162\n担保品\t5163\nfwrite\t5164\n三港\t5165\n深圳港大医院\t5166\n王昕\t5167\n继受\t5168\n物源\t5169\nNavigation\t5170\nweb过滤器\t5171\n黄河第一湾\t5172\nrecurdyn\t5173\n双河\t5174\n坑点\t5175\n中二病\t5176\nreversed\t5177\n不能少\t5178\n旋臂\t5179\n新洋\t5180\n变形金刚1\t5181\nworkflowy\t5182\n13000\t5183\n超级花开半夏\t5184\n镀铬棒\t5185\n黏土\t5186\nprecise\t5187\nstudio2015\t5188\n红松路\t5189\n长垣\t5190\n西山公园\t5191\nfranxx\t5192\n高速过路费\t5193\n作业区\t5194\n何陋\t5195\n石壕吏\t5196\n中国水利水电第一工程局\t5197\n樱花大道\t5198\n天各一方\t5199\n_v1.0\t5200\n深圳市商业联合会\t5201\n砂布\t5202\n六局\t5203\n动机信\t5204\n国版\t5205\nACS510\t5206\n量产型\t5207\nparody\t5208\nbranch\t5209\nwdk\t5210\n不断网\t5211\nDuilib\t5212\n骑刃王\t5213\nfiel\t5214\n珍珠泉\t5215\nHTTPs\t5216\n欢迎语\t5217\n双离合变速箱\t5218\n4卫\t5219\n李瑶媛\t5220\n辰鬼\t5221\n倶楽部\t5222\n周棋洛\t5223\n会所\t5224\n大全\t5225\n杜海乌兰图雅\t5226\n石渠\t5227\n文宅\t5228\nparis\t5229\n话事人\t5230\n母排\t5231\n无锡地铁4号线\t5232\nget函数\t5233\nジェ\t5234\n自来也\t5235\nLegs\t5236\nkeeps\t5237\n缩水率\t5238\n免税版\t5239\n甘草锌颗粒\t5240\n5单\t5241\ndreamer\t5242\nDesignjet\t5243\nagri\t5244\nCAE\t5245\n软工\t5246\nkeying\t5247\n10^9\t5248\noslob\t5249\n圆度仪\t5250\n韩风\t5251\n参观者\t5252\n吉布森\t5253\n没去过\t5254\n2018年4月23\t5255\n真空干燥箱\t5256\n七国之乱\t5257\nBEJ48\t5258\n中银通\t5259\nUST\t5260\n第7辑\t5261\n麦迪科技\t5262\nMimya\t5263\n11:30\t5264\n迪升\t5265\n鲸吞\
t5266\n茼蒿\t5267\n螺旋线\t5268\nESC键\t5269\n我不犯人\t5270\n苹果MacBook\t5271\n格式刷\t5272\n中国安防展览网\t5273\n公费医疗\t5274\n诊断学\t5275\n4.6米\t5276\n斗转星移\t5277\n闻涛路\t5278\n澜泊湾\t5279\n酒客\t5280\nRewrite\t5281\nImmigration\t5282\n1573\t5283\n转路\t5284\n维甲酸\t5285\n岩田聪\t5286\nwifi局域网\t5287\nx1+x2\t5288\n摄入量\t5289\n乃木坂46吧\t5290\n中外法制网\t5291\n2700k\t5292\n终篇\t5293\n中水回用\t5294\n丁圣元\t5295\n追日\t5296\n张维娜\t5297\n集粹\t5298\n观察性\t5299\nStarship\t5300\n百田VOCALOID圈\t5301\nsander\t5302\nslm\t5303\n玖珑湾\t5304\n五防\t5305\n不爱了\t5306\nOTF\t5307\n批单\t5308\n樊\t5309\nAMDCPU\t5310\nkotori\t5311\n小老婆\t5312\n钢卷\t5313\n刘家峡水库\t5314\n西奇\t5315\njco\t5316\nbds\t5317\n震旦集团\t5318\n自吹自擂\t5319\n陈湘生\t5320\n李亚伟\t5321\naten\t5322\n上拉刷新\t5323\nehci\t5324\n校内外\t5325\n万余名\t5326\n艾利欧\t5327\nWAVES\t5328\n金马影帝\t5329\n王叔叔\t5330\n铁裆功\t5331\n450g\t5332\nfiddler4\t5333\n擒获\t5334\n置物架\t5335\n虹猫蓝兔\t5336\nvpcs\t5337\n一拖六\t5338\n美罗家园\t5339\n砂芯\t5340\n中欧商学院\t5341\n战损\t5342\n設計\t5343\n公益广告\t5344\n性爱好者\t5345\nphtml\t5346\n红利期\t5347\naozhou\t5348\n潍坊市人民政府办公室\t5349\n五庄观\t5350\n舒肤\t5351\n今年三月份\t5352\nid3\t5353\n燃油锅炉\t5354\nSalad\t5355\n2589\t5356\n邀请书\t5357\n四十首\t5358\n别加\t5359\n招标文\t5360\n华兹华斯\t5361\n青岛四方区\t5362\n南京邮电大学\t5363\n机械制造基础\t5364\n15件\t5365\nCarlos\t5366\n报告单\t5367\n果戈理\t5368\nBambino\t5369\n蠢蠢的死法\t5370\n40毫米\t5371\n川麻圈\t5372\n那氏\t5373\n聚氨酯面漆\t5374\n黄玮\t5375\n沈抚新城\t5376\n说话稿\t5377\n每桶\t5378\nlumispa\t5379\nDiffen\t5380\n朝阳\t5381\nv网\t5382\n童萌\t5383\n阈值化\t5384\n长治北\t5385\n口杯\t5386\n芷江侗族自治县人民政府\t5387\n自锁器\t5388\n浠\t5389\n摘登\t5390\n肺结核药\t5391\nstores\t5392\n加速度计\t5393\n必胜客宅急送\t5394\ncics\t5395\n货车\t5396\n主机位\t5397\n1156\t5398\n福特福睿斯\t5399\n烟台蓬莱机场\t5400\n展期\t5401\n二百多年\t5402\ncodol\t5403\n惠民小区\t5404\nSqlDbx\t5405\n上海市精神卫生中心\t5406\n华阳河\t5407\n晨兴\t5408\n中国人民大学国家发展与战略研究院\t5409\n三秒后\t5410\n洪山街道\t5411\n途锐\t5412\n箭头\t5413\nx^2+y^2+z^2\t5414\n广播电视局\t5415\n狠心\t5416\n肋间\t5417\n向心力\t5418\n古林镇\t5419\n特装版\t5420\n感情观\t5421\n矮\t5422\n看病难\t5423\nunrecoverable\t5424\nlok\t5425\n京沪高铁\t5426\n莫奈花园\t5427\n砂船\t5428\n用稿\t5429\n卡斯提尔\t5430\ntxc\t5431\n长城华冠\t5432
\n乡党委\t5433\n德川幕府\t5434\n本村\t5435\n药学系\t5436\n丰联\t5437\n中长期铁路网\t5438\n4KW\t5439\n傻缺\t5440\n以说\t5441\ns12\t5442\n征求意\t5443\n中山公园站\t5444\n北电中戏\t5445\n比开\t5446\n礼德财富\t5447\n陈丰\t5448\nEcco\t5449\n130_\t5450\n美型\t5451\n电休克\t5452\n零售通\t5453\nPromo\t5454\n填充物\t5455\n书人\t5456\n重庆西南大学\t5457\n锁仓\t5458\n物尽\t5459\n皇甫奇\t5460\n往昔\t5461\n120m\t5462\n刘成昆\t5463\nhomie\t5464\n第二例\t5465\n计财处\t5466\n北方网-新闻中心\t5467\n河西万达广场\t5468\nTear\t5469\n小盘\t5470\n升降序\t5471\n锚固\t5472\n跑不动\t5473\n成都办事处\t5474\n矿种\t5475\n蜡封\t5476\n三倍\t5477\n道主\t5478\n公义\t5479\ncdr格式\t5480\n添加式\t5481\n财政部\t5482\n江苏省交通技师学院\t5483\n远征军维京\t5484\nDepth\t5485\n徐芝纶\t5486\n石家庄地铁1号线\t5487\n永福寺\t5488\nairmech\t5489\n20处\t5490\niowait\t5491\n奇迹世界2\t5492\ncunt\t5493\n膨大素\t5494\n受纳\t5495\n嵌固\t5496\n太平洋地区\t5497\n弄潮\t5498\n幻想三国志1\t5499\n单源\t5500\n央视视频\t5501\n唐本忠\t5502\n镉\t5503\n帮人\t5504\n美团公司\t5505\n音乐特长生\t5506\n刀锯\t5507\n武基\t5508\n千吨\t5509\nPIAGET\t5510\n小五\t5511\n数据堂\t5512\n哥林多前书\t5513\n3年后\t5514\n2.X\t5515\n地堂\t5516\n承德露露\t5517\n浙江革命烈士纪念馆\t5518\n教师类\t5519\n刘洪彪\t5520\nThrone\t5521\n复方消化酶胶囊\t5522\n挥手\t5523\n游魂\t5524\n中华人民共和国公司法释义\t5525\n网游之天下无双\t5526\n仓多\t5527\n江南嘉捷\t5528\n2300亿\t5529\n中医科\t5530\n显影剂\t5531\nAFP\t5532\nunavailable\t5533\n篦\t5534\n费米能级\t5535\n回忆总想哭\t5536\n君子版\t5537\n洗井\t5538\n荒野大镖客2\t5539\n德顺\t5540\n卖座\t5541\nxstream\t5542\n荒村活寡:留守女人们的春天\t5543\n雷鸣科化\t5544\n文头\t5545\n拒接\t5546\n错嫁良缘\t5547\n东巴文\t5548\nMinecraft我的世界\t5549\n自体脂肪\t5550\n民政厅\t5551\n这会儿\t5552\n叩头\t5553\n洛邑\t5554\n石盘\t5555\n床单\t5556\n希罗达\t5557\n波状\t5558\n何静\t5559\n嘉铭桐城\t5560\n张大帅\t5561\n农村专业合作社\t5562\n阳光高考网\t5563\n明菜\t5564\n白日依山尽\t5565\n无期\t5566\n死亡人口\t5567\n毛细管电泳仪\t5568\n武装部\t5569\n刷取\t5570\n特比萘芬\t5571\n安吉万达广场\t5572\n航天信息金税盘\t5573\n太乐\t5574\n不衰\t5575\n防御\t5576\n张晓刚\t5577\n竖\t5578\n福建省卫计委\t5579\n刚体\t5580\n以津真天\t5581\n文一西路\t5582\n超频\t5583\n斫\t5584\n60hz\t5585\n开发有限公司\t5586\nStuck\t5587\n咋么\t5588\n这厢\t5589\n主桌\t5590\ngoroot\t5591\n单面焊\t5592\n闭水试验\t5593\n仿效\t5594\n先锋号\t5595\n九阳真经\t5596\nHALCON\t5597\n数字播放器\t5598\n斯大林格勒战役\t5599\n火车轮\t5600\nDistance\t5601\n山语海\t5602
\n群生\t5603\nHiddleston\t5604\n易用性\t5605\nネ\t5606\n文体局\t5607\n山地车架\t5608\nTimeBie\t5609\n秦观\t5610\n9平米\t5611\n不由得\t5612\n卡莉芙拉\t5613\nmysql触发器\t5614\n2.7.7\t5615\nndarry\t5616\n曲格列汀\t5617\n菁英教育\t5618\n畅沛\t5619\n扎眼\t5620\n保赐利\t5621\n信义玻璃\t5622\n沙塔斯\t5623\nDoesn\t5624\n恩布拉科\t5625\n方法学\t5626\n性生\t5627\n除以\t5628\n北京首创股份有限公司\t5629\nping通\t5630\n精尖\t5631\n安联\t5632\n法利赛人\t5633\n凿毛\t5634\n三绝\t5635\n广播电视学\t5636\n神吐槽\t5637\n深圳侦探公司\t5638\n牙包\t5639\n悦山\t5640\n炮子\t5641\n精锻科技\t5642\n青建集团股份公司\t5643\nlook\t5644\nopener\t5645\nmils\t5646\n马修斯\t5647\nFAST\t5648\n御宝\t5649\n山水之间\t5650\n湖滨银泰\t5651\n南非世界杯\t5652\n汶上\t5653\n卓展\t5654\n家装展\t5655\nbioss\t5656\n泽艺影城\t5657\n通州区政府网\t5658\n大广\t5659\nAnnex\t5660\n上海影讯\t5661\n田原皓\t5662\n碌碌无为\t5663\n雪白\t5664\nprison\t5665\n山西省纪委监察厅\t5666\n世界搏击中心\t5667\n水木清华\t5668\n居里夫人\t5669\n外设\t5670\n新征程\t5671\n立体雕刻机\t5672\n5:00\t5673\n英才学院\t5674\n组织文化\t5675\n红鼻子节\t5676\n卡特彼勒\t5677\n一测\t5678\n拓陆\t5679\n孕产期\t5680\n18集\t5681\nplotting\t5682\n上水器\t5683\n口袋银行\t5684\n刷帖\t5685\n收件夹\t5686\n菱智\t5687\n卖出去\t5688\n苏州相城区\t5689\nNGFW\t5690\n小色\t5691\n小脑斧\t5692\n人意\t5693\njdt\t5694\n摄影赛\t5695\n副队长\t5696\n五次方\t5697\n死性\t5698\nBringing\t5699\n瓶男\t5700\n松岛菜菜子\t5701\n云娜\t5702\n40幅\t5703\n空明\t5704\nk粉\t5705\n城头\t5706\nminisd\t5707\n荣威ei5\t5708\n中央民族大学\t5709\n务农\t5710\n田纪香\t5711\n扫黄\t5712\n心理卫生\t5713\n当好\t5714\n周至水街\t5715\n德清新闻网\t5716\nGus\t5717\nLSS\t5718\n本周日\t5719\n泡酒料\t5720\n加州花园\t5721\n修仙鬼吹灯\t5722\n百分九少年\t5723\n国嘉\t5724\n面积\t5725\n插扣\t5726\n名词性物主代词\t5727\n长房\t5728\n复颜抗皱\t5729\n压花机\t5730\n南海大沥\t5731\n陛\t5732\n国产机\t5733\n临沂市统计局\t5734\n杨向东\t5735\n林肯公园\t5736\n乾元\t5737\n小萝卜头\t5738\n引江济淮\t5739\n出分\t5740\n12层\t5741\n凹印\t5742\n400毫米\t5743\n程序源代码\t5744\n霞慕尼\t5745\n海迅\t5746\n彩陶\t5747\n自动存款机\t5748\ni7s\t5749\n江口村\t5750\n赝品\t5751\n阿扎\t5752\n公文式\t5753\nErrno\t5754\n05版\t5755\n江泽\t5756\n达内集团\t5757\n可说\t5758\n舒密尔\t5759\nCALVIN\t5760\n日月贝\t5761\n正雄\t5762\n宫赋\t5763\n牛班\t5764\n王船山\t5765\n自找\t5766\n炒麦芽\t5767\n区教育局\t5768\n永大电梯\t5769\nlGL\t5770\n梦枕貘\t5771\n机率\t5772\nahb\t5773\n6.8\t5774\n11速\t5775\n还款
券\t5776\n新疆机场\t5777\n粉肠\t5778\n角质层\t5779\n墨城\t5780\n日长\t5781\nBoLoLi\t5782\n十里桃花\t5783\n千余元\t5784\n过渡层\t5785\n136\t5786\n新希望锦悦楠庭\t5787\n20G\t5788\n踏勘\t5789\n吉吉影音_西瓜影音_影音先锋\t5790\n严\t5791\n清华大学马克思主义学院\t5792\n中岛\t5793\nCompose\t5794\n箱柜\t5795\n陕西省食品药品监督管理局\t5796\n酒神\t5797\n可交换债\t5798\nFight\t5799\n已經\t5800\n姚崇\t5801\n有才无\t5802\n糖液\t5803\n极品飞车10\t5804\n钟晨\t5805\n计算机控制技术\t5806\n簪子\t5807\n杨晓慧\t5808\n聊城大学\t5809\nP2000\t5810\n2370\t5811\nPrada\t5812\n选官\t5813\n北二路\t5814\n李莉莉\t5815\n中国共产党第十九次全国代表大会\t5816\n二滩\t5817\n私募机构\t5818\n14多\t5819\n淹溺\t5820\nNOOB\t5821\n站神\t5822\n手工壶\t5823\n轻仓\t5824\nKALI\t5825\n结肠息肉\t5826\n旋挖\t5827\n掉盘\t5828\n鹿鼎记2\t5829\n无锡百姓网\t5830\n250分\t5831\n520.1314\t5832\nworkflow\t5833\n姚宏\t5834\n直入\t5835\ngdpr\t5836\n统计图\t5837\n苏叶\t5838\nsmartart\t5839\n芦茨村\t5840\n众阳\t5841\n苏仨\t5842\n幼儿师范学校\t5843\n光阴的故事\t5844\nUB\t5845\n属地化\t5846\nav女\t5847\n16本\t5848\n1571\t5849\ncodo\t5850\n123@163.com\t5851\njnative\t5852\n9mm\t5853\n跨校\t5854\nhbase\t5855\n涌现\t5856\nEnterprise\t5857\n南京市\t5858\n我知道\t5859\n吹石玲奈\t5860\n实为\t5861\n一點\t5862\n莱芜市人民医院\t5863\n安全观\t5864\n空出\t5865\n画种\t5866\nCPVC\t5867\n张咏\t5868\n常熟零距离房产网\t5869\n康复科\t5870\n000671\t5871\n1626\t5872\n宁波政府法制信息网\t5873\n▃\t5874\nqweb\t5875\nTOOLS\t5876\n6032\t5877\n8.9分\t5878\n卡不住\t5879\n安庆二中\t5880\n顾屿森\t5881\n马应龙麝香痔疮膏\t5882\n地方政府融资平台公司\t5883\ndrones\t5884\n千秋万载\t5885\n爱新觉\t5886\n荒神罪蜀山传\t5887\n3367皇室战争\t5888\nlemonbit\t5889\n脓水\t5890\nlumia950\t5891\n佐料\t5892\n资产负债表\t5893\n四边形\t5894\n慧聪集团\t5895\nMulti\t5896\n古诗文\t5897\n秀城\t5898\n匚\t5899\n合价\t5900\nt440p\t5901\n绵阳市农业局\t5902\n龙蟠路\t5903\n兽药典\t5904\n迟疑\t5905\n6则\t5906\n顾烟\t5907\n核聚变\t5908\n素彩\t5909\n卖春\t5910\n2017-2035年\t5911\nimageset\t5912\n世上只有妈妈好\t5913\n把立\t5914\n查勘员\t5915\nbool\t5916\nConfluent\t5917\neas\t5918\n挨罚\t5919\nR11S\t5920\n泱\t5921\n电子手表\t5922\n王骏\t5923\n这样的话\t5924\n成员们\t5925\n薄利多销\t5926\n要债\t5927\n1553\t5928\n挂果\t5929\n即征\t5930\n收费群\t5931\n体能\t5932\n闽南\t5933\n医圣\t5934\n上海货运公司\t5935\n鬼门十三针\t5936\n科帕奇论坛\t5937\n秀目网\t5938\nUseit\t5939\n1001种\t5940\n苦轮\t5941\n喜马诺\
t5942\n巴啦啦小魔仙\t5943\nrecalbox\t5944\n仙境物语\t5945\n主性\t5946\ntaro\t5947\n百兆路由器\t5948\ne30\t5949\nLogrotate\t5950\n精喹禾灵\t5951\n委托理财\t5952\n蒸汽减压阀\t5953\n松鹤\t5954\na+1\t5955\n祺嫔\t5956\n码出\t5957\n美战\t5958\n一代枭雄\t5959\n临床家\t5960\n娇羞\t5961\n一3\t5962\n集成费\t5963\n遨\t5964\n鑫泉留学网\t5965\n云汉\t5966\nContent-type\t5967\n218.28\t5968\n圆螺母\t5969\n河南建业\t5970\n淘客\t5971\nBoundless\t5972\n手动型\t5973\n麦迪龚宇\t5974\n官方旗舰店\t5975\nbase64.js\t5976\n僵尸脱衣舞娘\t5977\n郭鑫\t5978\n454g\t5979\n58期\t5980\n唠唠\t5981\n加沃特\t5982\n乐动\t5983\n茶园\t5984\n校工\t5985\n舍长\t5986\n贱受\t5987\n跑批\t5988\nMbed\t5989\n92#\t5990\n锡\t5991\n冲浪者\t5992\n凌晨5点\t5993\n尾\t5994\n外税\t5995\n故意\t5996\n碗布\t5997\n大富豪2\t5998\n加钻\t5999\n王华平\t6000\n妖怪\t6001\n三百块\t6002\n非参公\t6003\niPhone6\t6004\n马兆瑞\t6005\n推心置腹\t6006\n太阳社\t6007\n王小刚\t6008\n比基尼秀\t6009\n街头霸王3\t6010\n叩富网\t6011\n西楚霸王\t6012\n夏园\t6013\n烫机\t6014\n阳光蓄电池\t6015\n呼延\t6016\n熊木杏里\t6017\nWindy\t6018\nchacha\t6019\n雅万高铁\t6020\n绍兴市财政局\t6021\n码迷\t6022\ncharity\t6023\n阿丽\t6024\n大嵛山岛\t6025\n汉诗\t6026\n1768年\t6027\n海淀区幼儿园\t6028\n杰座\t6029\n穿青人\t6030\n20180331\t6031\nkangaroo\t6032\nLuisaviaroma\t6033\n黒川\t6034\nAppleWatch\t6035\n万达影视\t6036\ncoordinator\t6037\n越秀保利爱特城\t6038\n烟机\t6039\nPgSQL\t6040\nswto\t6041\n孙雯\t6042\n搭建\t6043\n寿阳县\t6044\n香邑溪谷\t6045\n金系\t6046\nherf\t6047\n商丘市梁园区\t6048\n采办\t6049\n切身\t6050\n32层\t6051\n南宫羽香\t6052\n寒江雪\t6053\n乐展\t6054\n觅食\t6055\nVuex\t6056\nGmail邮箱\t6057\n柳州螺蛳粉\t6058\n壬子\t6059\n万宝\t6060\nVBE\t6061\n尊贵型\t6062\nmyeclipse2016\t6063\n泊里\t6064\n海马玩模拟器\t6065\n移民监\t6066\n米安\t6067\nBoost库\t6068\n10关\t6069\n管规\t6070\n310s\t6071\n淫心\t6072\nbr0\t6073\nrecipes\t6074\n勿忘我\t6075\n夏侯恩\t6076\n柔肤\t6077\n比利时国家队\t6078\n风载\t6079\nexcursion\t6080\n测量机\t6081\nwbs\t6082\n莎拉\t6083\n诞生记\t6084\n满秩\t6085\n600多年\t6086\n香港国际机场\t6087\n安徽石台县政府\t6088\n户态\t6089\nnone-eabi\t6090\nhystrix\t6091\nFlowable\t6092\n平墅\t6093\nWin7旗舰版\t6094\n化雨\t6095\n美林\t6096\nevm\t6097\n分布式代理\t6098\n警务处\t6099\n64.whl\t6100\nluca\t6101\n天宗\t6102\n宝武集团\t6103\nerg\t6104\nxDrive35i\t6105\n勒痕\t6106\n南通港闸区\t6107\n龟山\t6108\n向海龙\t6109
\n中压开关柜\t6110\n女真人\t6111\n格兰德\t6112\n尚佳\t6113\n石英晶体谐振器\t6114\nAstralWind\t6115\n奇景\t6116\n真空锅炉\t6117\n艾戈勒\t6118\nbutler\t6119\nCalories\t6120\n皮口港\t6121\n无神\t6122\n阿拉丁\t6123\n动漫网\t6124\nsmell\t6125\n人眼\t6126\n浙江广厦建设职业技术学院\t6127\n雨刮器\t6128\n4.2.2\t6129\nqizuang\t6130\n音乐银行\t6131\n高铁站-天气网\t6132\n苏教版6\t6133\n格律诗\t6134\n阿图什市\t6135\n拼装\t6136\n柴碧云\t6137\n加蓝\t6138\nhsb\t6139\nPIN\t6140\n金种子酒\t6141\n杨老三\t6142\n炮神\t6143\n相泽莉娜\t6144\n融合\t6145\n陈晓云\t6146\nPartition\t6147\n2018年2月份\t6148\nnumpy矩阵\t6149\nx6a\t6150\n苏州市公安局交通巡逻警察支队\t6151\n雪中情\t6152\ntariff\t6153\n胡记\t6154\n高耸入云\t6155\n宜宾学院\t6156\npppt\t6157\n支撑板\t6158\n肥穴\t6159\n声调\t6160\n超级机器人大战og\t6161\n微笑唇\t6162\nbatis\t6163\n合作银行\t6164\n事故\t6165\nMV\t6166\n杭州市红十字会医院\t6167\n平衡表\t6168\n蓓尔莫\t6169\n名中\t6170\n毛水\t6171\n1865\t6172\nqq微云\t6173\n建设类\t6174\n中华航空\t6175\n新东方雅思\t6176\n物联网工程\t6177\n黄庭\t6178\nClearly\t6179\n四十米\t6180\n理疗馆\t6181\n朴正允\t6182\n协战\t6183\nu23\t6184\n88.1\t6185\n头标\t6186\nuntag\t6187\n宝庆\t6188\n内分泌代谢科\t6189\n金星\t6190\n扇区\t6191\n摩根士丹利华鑫证券\t6192\n陈思诚\t6193\n还俗\t6194\n聚糖\t6195\n鱼肝\t6196\ninsert键\t6197\n普莱斯\t6198\n初级中学\t6199\n084\t6200\n126路\t6201\n刘建新\t6202\nadvancement\t6203\n新乡市城乡规划局\t6204\n120hz\t6205\n敬业度\t6206\n甘于\t6207\n临港地区\t6208\n元素萨\t6209\n泓森槐\t6210\n编配版\t6211\n龙亭公园\t6212\n性福人生2\t6213\nChroma\t6214\n王天\t6215\n托比\t6216\n1715\t6217\n雪男\t6218\nDisconf\t6219\n莱芜市国土资源局\t6220\n鲁格\t6221\nrobomaster\t6222\n醍醐\t6223\ncoordinated\t6224\n6}\t6225\nthrowing\t6226\n医学博士\t6227\n索佳\t6228\n恍若\t6229\n霹雳堂\t6230\n3.1233\t6231\n电磁感应\t6232\n读者\t6233\nmoven\t6234\nweather\t6235\n边界上\t6236\n扼腕叹息\t6237\n寻宝阁\t6238\n过肩摔\t6239\n85265001\t6240\n仙缘之城\t6241\n无锡蠡园\t6242\n神甲\t6243\n风雅颂\t6244\nBreathe\t6245\n巫刚\t6246\n卡通化\t6247\n新华传媒\t6248\n宁波便民网\t6249\nORMLite\t6250\n苏州科技城外国语学校\t6251\npushstate\t6252\n若离\t6253\n寒水\t6254\n椰浆\t6255\n项目库\t6256\n棋具\t6257\n空化\t6258\nIPIP\t6259\n宋世杰\t6260\n毛子\t6261\n千乘\t6262\nHing\t6263\nPreston\t6264\nTruffle\t6265\n45克\t6266\n抚远县\t6267\nimmuno\t6268\n蒋子刚\t6269\n城市品牌\t6270\n500万条\t6271\n限速版\t6272\n划不动\t6273\n火车
厢\t6274\n佛山西站\t6275\n魔字\t6276\n雷峰夕照\t6277\nhours\t6278\n自觉性\t6279\nKmplayer\t6280\n410c\t6281\n路线路\t6282\n西河\t6283\n三群\t6284\n穿条纹睡衣的男孩\t6285\n翠竹\t6286\nlibjson\t6287\n8008\t6288\n_艺能达人\t6289\nmechrevo\t6290\n美甲帮\t6291\n1.38G\t6292\n破破\t6293\n风吟鸟\t6294\nEntityManager\t6295\n跳水\t6296\n175cm\t6297\ntabBarItem\t6298\n中科院华南植物园\t6299\n天津农商银行\t6300\n畜牲\t6301\nmortal\t6302\n杨舍镇\t6303\n白雁\t6304\n希悦尔\t6305\n才亮\t6306\n磕磕碰碰\t6307\n5.42\t6308\n女贼\t6309\n素头\t6310\nSUV\t6311\ntags\t6312\n一号店\t6313\nū\t6314\n瑞声\t6315\nie10\t6316\n16:00\t6317\n防锈液\t6318\n狗爷城中村\t6319\n重庆市第一人民医院\t6320\n张清\t6321\n放眼世界\t6322\n严监生\t6323\n康新公路\t6324\nps\t6325\n山东省情网\t6326\n维多利亚的秘密\t6327\n英雄无悔\t6328\nDWT\t6329\n排比\t6330\n本兮\t6331\nph计\t6332\n情义\t6333\n继电保护\t6334\n关岛\t6335\n女王大学\t6336\n英灵殿\t6337\n斑马纹\t6338\ncoords\t6339\nTested\t6340\n朱绍侯\t6341\n简单学习网\t6342\n丽福健\t6343\nprox\t6344\n突触\t6345\n追放\t6346\n北京法源寺\t6347\n兖矿集团有限公司\t6348\n冰粉粉\t6349\n已死\t6350\n高桥\t6351\n中国国际问题研究院\t6352\n三辊闸\t6353\n赶帮\t6354\nYamaha\t6355\n轿顶\t6356\n地面\t6357\n淬透性\t6358\n生产关系\t6359\n瞳神\t6360\n电视圈\t6361\n农网_吉林农网\t6362\n朝鲜劳动党\t6363\n超级火山\t6364\n星色\t6365\n半成\t6366\n保长\t6367\n九江职业技术学院\t6368\n萎凋\t6369\nadams2013\t6370\n19所\t6371\n无架\t6372\n颞骨\t6373\n爱者\t6374\nCS2\t6375\n杂七杂八\t6376\n哌替啶\t6377\n养阴\t6378\n舰载机\t6379\n御宅\t6380\n台湾省\t6381\n恒隆广场\t6382\n滨海政府网\t6383\n卡勒\t6384\n一源\t6385\n吴彦祖\t6386\n北京法院\t6387\n吉首\t6388\n艾莫\t6389\n傅文佩\t6390\n水族\t6391\n达尔盖\t6392\n长眠\t6393\n农学类\t6394\n卡奥\t6395\n王君馨\t6396\n佳能2520i\t6397\n─\t6398\n安龙县\t6399\n土黄\t6400\n安全生产责任制\t6401\n电动扫地车\t6402\n杭州移动\t6403\nscrub\t6404\n20160913\t6405\n校刊\t6406\nJmeter\t6407\n统筹兼顾\t6408\n凉席\t6409\ngardening\t6410\n文昌中学\t6411\n个贷\t6412\nSEL\t6413\nKEEP\t6414\n三元乙丙\t6415\n7.15\t6416\nGLM\t6417\n紧张\t6418\n碱度\t6419\n中国核工业二三建设有限公司\t6420\n和邦生物\t6421\n昏婚\t6422\n师伟\t6423\nzip压缩文件\t6424\n综影\t6425\n33度\t6426\n菊花茶\t6427\n4036\t6428\n庞各庄\t6429\nUINavigation\t6430\n陌路\t6431\n一四\t6432\nE路网\t6433\n情难枕\t6434\n30多万元\t6435\n施法\t6436\n穹窿\t6437\n第38号\t6438\n违法行为\t6439\n玩死\t6440\n民众镇\t6441\n摩尔多瓦\t6442\n002121\t
6443\n绵阳南山中学\t6444\nthinkpadx1\t6445\n预制块\t6446\n5.18.0\t6447\n云顶娱乐\t6448\n3458\t6449\n注射剂\t6450\n智伯\t6451\nalicesoft吧\t6452\n2343\t6453\n质合\t6454\n长椅\t6455\n威尔彭蕾\t6456\n家神\t6457\n八马\t6458\n真好吃\t6459\n裴迪\t6460\n乌兹别克斯坦\t6461\np闪\t6462\n上下拉\t6463\n0207\t6464\n1.6.4\t6465\nCCC认证\t6466\n石类\t6467\n防霉剂\t6468\n郭川\t6469\n通读\t6470\n青见\t6471\n四方精创\t6472\nr134a\t6473\n5分种\t6474\n兵团日报\t6475\n直流式\t6476\nduplicates\t6477\nprime95\t6478\nveket\t6479\n4.5亿元\t6480\n优质版\t6481\n华联大厦\t6482\n500道\t6483\n孔小爽\t6484\n朝阳区教委\t6485\n集散地\t6486\n35句\t6487\n手酸\t6488\n2016-2025年\t6489\nΦ\t6490\nwin7/XP\t6491\n低筋粉\t6492\nhao360\t6493\n国际电影节\t6494\n烟斗鬼\t6495\n烈咬陆鲨\t6496\n狭长型\t6497\n财务报表附注\t6498\n1.6万\t6499\n暗欲\t6500\n白泥\t6501\n防病毒\t6502\n流浪\t6503\n妖女\t6504\n固体分散体\t6505\n谷粉\t6506\n2014.doc\t6507\n电音节\t6508\n弩哥\t6509\n多情\t6510\nDew\t6511\n独热编码\t6512\n风声鹤\t6513\n郝眉\t6514\nRuby\t6515\nchocolate\t6516\n亲水\t6517\nd2v\t6518\nshengyuan\t6519\n韩惠珍\t6520\nWR886N\t6521\n石泽研究所\t6522\nUpdater\t6523\n填埋\t6524\n来信\t6525\n第二十回\t6526\n驯龙记\t6527\n虐腹\t6528\nFreeSwitch\t6529\nsubunit\t6530\n但愿\t6531\n振奋\t6532\n你追我赶\t6533\n1247\t6534\n偷瞄\t6535\n涓子\t6536\n家犬\t6537\n端午节\t6538\n创和\t6539\nVs2010\t6540\n760p\t6541\n20160929\t6542\n加利西亚\t6543\nRolling\t6544\n行唐县\t6545\n4.7.3\t6546\n三六九\t6547\n大钟\t6548\n德通\t6549\n代通知金\t6550\nmac2018\t6551\n日媒\t6552\n考场\t6553\n孙磊\t6554\n何尝\t6555\n晋源区\t6556\nbroadcom\t6557\n爱惠浦\t6558\n三国之超级召唤系统\t6559\n哥本哈根减肥法\t6560\n名古屋\t6561\n幂级\t6562\n沉落\t6563\n摸头\t6564\n舍本逐末\t6565\nmotivation\t6566\n义务兵\t6567\nCoda\t6568\n阁夜\t6569\n蛋白胨\t6570\n震中杯\t6571\n稞麦\t6572\n不结转\t6573\nCoo\t6574\n国药集团\t6575\n太平猴魁\t6576\n贫血症\t6577\n桥式整流电路\t6578\n水产品\t6579\nCountries\t6580\nhifiman\t6581\n上海生物\t6582\ntradition\t6583\n金圣\t6584\n正稿\t6585\n雷音\t6586\n净化槽\t6587\n7780\t6588\nBorland\t6589\nQuarter\t6590\n剑仙诀\t6591\ncat5e\t6592\n龙婆\t6593\n郑泫\t6594\n伪\t6595\nReflector\t6596\n继续教育网\t6597\n猫女郎\t6598\n桐昆集团\t6599\nSergio\t6600\nShuffle\t6601\nTPO\t6602\n马赛克版\t6603\n艺派\t6604\nproviders\t6605\n面包柜\t6606\n降级\t6607\n网卡号\t6608\n特玩\t6609\n
侠隐\t6610\n宫泽理惠\t6611\n打胎药\t6612\nmilton\t6613\n观众\t6614\n事成\t6615\nweld\t6616\n私人侦探\t6617\n野三坡景区\t6618\n杰克逊\t6619\n20根\t6620\n胴体\t6621\n当前\t6622\n龙翼\t6623\ngmarket\t6624\n美缝\t6625\nWATER\t6626\n0.85%\t6627\n特发性肺纤维化\t6628\nnudity\t6629\n奥威尔\t6630\n后宅街道\t6631\n乐高教育\t6632\n明单\t6633\n滤盒\t6634\nofficetab\t6635\n太行英雄传\t6636\n李委明\t6637\n弹唱谱\t6638\n黛薇\t6639\n定心\t6640\n作文人\t6641\n华润大厦\t6642\nQThread\t6643\n恒远\t6644\n嘉兴市第一医院\t6645\n曹蕾\t6646\n活泼型\t6647\n绿地乐和城\t6648\n高效\t6649\n采样\t6650\n徐汇中学\t6651\n行车记录仪\t6652\n氪星人\t6653\n七喜控股\t6654\n每分\t6655\nHOLA\t6656\n辱母案\t6657\njif\t6658\n文史类\t6659\n投案\t6660\nconfigurations\t6661\n2010年后\t6662\nAlfresco\t6663\n缩图\t6664\n飞宇科技\t6665\n武陵山大裂谷\t6666\nb6\t6667\n33万\t6668\n情深入骨\t6669\n老二\t6670\n十二类\t6671\nBCD\t6672\n大渡口区\t6673\n新郑机场\t6674\n四格\t6675\ncodol吧_\t6676\n南极星\t6677\nUMG\t6678\nskinstore\t6679\n龙魂\t6680\ndims\t6681\nyoutube-dl\t6682\n商谈\t6683\nSwitchyOmega\t6684\n浙江博洋家具有限公司\t6685\n幽鬼\t6686\n事业编统考\t6687\n莱万特\t6688\n火山号\t6689\n金程教育\t6690\n纪晓岚\t6691\n六星网\t6692\n白安\t6693\n皇宫\t6694\npcsx2\t6695\n伏牛路\t6696\nonthe\t6697\n长治市郊区\t6698\n第87\t6699\nifree卡\t6700\n徐佳宁\t6701\n竹篱\t6702\n槽式\t6703\n先进个人\t6704\n油豆角\t6705\n矿体\t6706\n邵兵\t6707\n分家\t6708\nQueryDSL\t6709\n清炒\t6710\n卓锦城\t6711\n淘汰\t6712\n渝澳大桥\t6713\nPPt\t6714\n39亿元\t6715\n殿堂\t6716\n名帖\t6717\n以物抵债\t6718\n绅宝D20\t6719\n股骨干\t6720\n徒儿\t6721\n前列腺癌骨转移\t6722\n奇偶校验\t6723\nzero-51CTO\t6724\n辽宁省中医院\t6725\n通迅\t6726\nCtrl+Shift\t6727\n转列\t6728\n年产值\t6729\n无机磷\t6730\n德云社\t6731\n中央人民政府\t6732\n展览厅\t6733\n讨论会议\t6734\nJohn_杰\t6735\n中福会\t6736\n肾内科\t6737\nwawa\t6738\n传动箱\t6739\n刘胜\t6740\nformulation\t6741\n65平方\t6742\nB版\t6743\nmkdir\t6744\n样刊\t6745\n许家屯\t6746\n唐山师范学院\t6747\nMigos\t6748\n不热\t6749\n加工费\t6750\nSDP\t6751\ncaf\t6752\nincorrectly\t6753\nPLOT\t6754\n0712\t6755\n哑妹\t6756\n空法\t6757\n史家村\t6758\n吴阳\t6759\n最终幻想15\t6760\n祝君\t6761\n晾霸\t6762\n数智网\t6763\n划图\t6764\n洪战辉\t6765\n万喆\t6766\n剑芒\t6767\n口部\t6768\n长春市绿园区人民政府\t6769\nliushao\t6770\nBlanc\t6771\n美诺华\t6772\n151\t6773\n铁拐李\t6774\n热电公司\t6775\n600P\t6776\n消化液\t677
7\n天津机电职业技术学院\t6778\n谷腾环保网\t6779\n寿喜烧\t6780\n100多名\t6781\n华润三九医药股份有限公司\t6782\ninputtype\t6783\n雷明顿\t6784\n7.3奶\t6785\nlocalstorage\t6786\n阿里RocketMQ\t6787\nXorg\t6788\n成都政府网\t6789\n木作\t6790\n土陶\t6791\n银行信贷\t6792\n康县\t6793\n京广快递\t6794\n读取器\t6795\n互联网\t6796\n养生物\t6797\n护耳\t6798\nFCFE\t6799\n张文峰\t6800\n龙须树\t6801\n0.37\t6802\n一概\t6803\n勇者斗恶龙9\t6804\n冯延巳\t6805\nsin\t6806\n鞭尸\t6807\n2万余\t6808\n7月5日\t6809\n虚拟试衣\t6810\n萌宠乐园\t6811\n金海\t6812\n国学堂\t6813\n桑德拉\t6814\n登山\t6815\n郑州海洋馆\t6816\n中航光电\t6817\n锦绣良缘\t6818\n董扬\t6819\n转增\t6820\n酷6\t6821\noffice2010word\t6822\n最浪漫\t6823\n人类群星闪耀时\t6824\n口袋妖怪漆黑魅影\t6825\n第七组\t6826\n维斯塔\t6827\n八大处\t6828\n荣耀九\t6829\n戏命\t6830\n止痛药\t6831\n湖南都市职业学院\t6832\nbuy\t6833\n爱情自有天意\t6834\nware\t6835\n联通移动电信\t6836\n内蒙华电\t6837\n盐浴\t6838\n修房\t6839\n2018-02-05\t6840\nobstacles\t6841\n一手遮天\t6842\n测速器\t6843\ndump\t6844\n红辣椒\t6845\n黄人\t6846\nMum\t6847\n香港东\t6848\n扁人\t6849\n1395\t6850\n伏牛堂\t6851\n大本钟\t6852\n独眼小僧\t6853\n45路\t6854\nSchedules\t6855\n1.2.6\t6856\n松柏\t6857\n王前\t6858\n金桥社区\t6859\n600705\t6860\n娘们\t6861\n白马井镇\t6862\n第15卷\t6863\n吴佳妮\t6864\n日月光广场\t6865\nallway\t6866\n漂白性\t6867\n余威\t6868\n中环广场\t6869\n横梁式\t6870\n甲胎蛋白\t6871\n茑\t6872\n300688\t6873\n老金\t6874\nCMV\t6875\n国安居\t6876\n于越\t6877\n三关\t6878\n新石器烤肉\t6879\n抗震支\t6880\n泪流\t6881\naptana\t6882\ndjango2\t6883\n汇编\t6884\n宋兴柱\t6885\n新乡市牧野区\t6886\n盖棺定论\t6887\n23级\t6888\ne3\t6889\n书客吧小说网\t6890\n午间\t6891\n329国道\t6892\n乐记\t6893\n提振\t6894\n环氧乙烷\t6895\n150分钟\t6896\n宏包\t6897\n中底\t6898\ntopical\t6899\n03_\t6900\n置之死地而后生\t6901\n120毫米\t6902\n白银市\t6903\n名义\t6904\n4790k\t6905\n净月潭\t6906\n佛堂镇\t6907\n曾经我也想过一了百了\t6908\n再议\t6909\n深圳大学\t6910\n肉身\t6911\n30.0000元\t6912\n六十度\t6913\n天牛\t6914\nfx50j\t6915\n华岩镇\t6916\n赵鹏飞\t6917\n冶金工业\t6918\n淘粉吧返利网\t6919\n红魔馆\t6920\nPowerDesigner画ER图\t6921\n52亿\t6922\n国家工作人员在线学法\t6923\n标识导视系统\t6924\nfilevault\t6925\n双层股权结构\t6926\n一个40尺\t6927\n中央歌剧院\t6928\n贪食\t6929\n水蒸气\t6930\n我的英雄学院\t6931\n完美者\t6932\n遂平\t6933\n鹏瑞利\t6934\n湘君\t6935\n香菱学诗\t6936\n2万个\t6937\nCrash\t6938\n白江\t6939\n抚宁县\t6940\n晓枫\t6941\nbusuu\
t6942\n山阳县人民政府\t6943\n3000套\t6944\n杰西卡·琼斯\t6945\n环奈\t6946\nsimd\t6947\n捆绑器\t6948\n清规\t6949\nGV\t6950\n非域\t6951\n周梅森\t6952\n维修店\t6953\n哲学史\t6954\n德安\t6955\n市委宣传部\t6956\n流放\t6957\n扎卡\t6958\n温布尔登\t6959\n税控盘\t6960\n白雪公主的故事\t6961\n税局\t6962\n昂克帝霸\t6963\n华蓥市人民政府\t6964\n背奶\t6965\n14.9元\t6966\nconfig\t6967\n长沙博物馆\t6968\n私募基金网\t6969\n硬盘机\t6970\n12月23日\t6971\n归心\t6972\n子数\t6973\n画坊\t6974\n魔兽争霸|\t6975\n2015—2017年\t6976\n观色\t6977\n凤\t6978\ncows\t6979\n色素性\t6980\n张召忠\t6981\n青岛上合组织\t6982\n银桥网\t6983\n爵士\t6984\nDerivative\t6985\n刚度系数\t6986\n黑顶\t6987\nlumia\t6988\n全真\t6989\n恒信\t6990\n职业培训教育网\t6991\n宫崎彩\t6992\nC端\t6993\n潜伏案\t6994\n飞刀\t6995\n空气滤芯\t6996\n美装\t6997\nmscbsc\t6998\n大敌\t6999\n恒都\t7000\n越秀地产\t7001\n3.71\t7002\n安达卢西亚\t7003\n跟趾\t7004\nmcg\t7005\n汉诺塔\t7006\nZen\t7007\n海贼王四皇\t7008\n总氮\t7009\n邀请制\t7010\n周晨\t7011\n秦光荣\t7012\n2015—2017年度\t7013\n未知子\t7014\n百位\t7015\n纳什维尔\t7016\n重庆解放碑\t7017\n1星期\t7018\n妮海瑟薇\t7019\n漫花\t7020\n談\t7021\n蓝沢润\t7022\n普洱\t7023\n万联\t7024\n寻仙2\t7025\n月岚\t7026\n哲学家们\t7027\n中科信息\t7028\n四化同步\t7029\n粘着\t7030\n戊戌政变\t7031\n职教集团\t7032\n李子柒\t7033\nshoo女性网\t7034\n武汉二中\t7035\nQQ群机器人\t7036\n万力\t7037\n武汉市经济技术开发区\t7038\n大懒堂\t7039\n江苏\t7040\n军贸\t7041\n杭州萧山国际机场\t7042\n求思\t7043\n华中科技大学同济医学院附属协和医院\t7044\n正丁醛\t7045\naecs6\t7046\n暖泉古镇\t7047\nwmd\t7048\n导轨式\t7049\n布鲁雅尔\t7050\n兴润\t7051\n做好事不留名\t7052\n云室\t7053\n螺旋给料机\t7054\n20160820\t7055\n九溪十八涧\t7056\n取证\t7057\n电荷泵\t7058\n开利\t7059\n新诛仙\t7060\n6AT变速箱\t7061\n佛音\t7062\n苏州路\t7063\n17页\t7064\n胡琴\t7065\n豆油皮\t7066\n刺杀者\t7067\n西北师范大学附属中学\t7068\n瑞达精\t7069\nv73\t7070\n李绅\t7071\nRECARO\t7072\n无人便利店\t7073\nppsu\t7074\nmrt\t7075\n佐助\t7076\n五年级数学下册\t7077\n033期\t7078\n8515\t7079\n杨派\t7080\n郑璇\t7081\n小骑手\t7082\n商定\t7083\ntabbar\t7084\n快笑\t7085\n柔福帝姬\t7086\n账\t7087\n绿光\t7088\n畅玩6x\t7089\n跌幅\t7090\nutf-8编码\t7091\n验身\t7092\n4件套\t7093\n只管\t7094\n少女时代泰妍\t7095\n多愁善感\t7096\n剑网3剑网3\t7097\n包厢\t7098\n千源\t7099\nali\t7100\n吴狄\t7101\n直销百科网\t7102\n3600\t7103\n黑尾\t7104\n够呛\t7105\n喀山\t7106\nDeluxe\t7107\n4路\t7108\n大庆石油学院\t7109\n羊山新区\t7110\nshou\t7111\nv1.0.2\t711
2\n长春列表网\t7113\n因为爱\t7114\n语言学\t7115\n后段\t7116\n血管性痴呆\t7117\n伍贰君\t7118\n大和抚子\t7119\n保健食品\t7120\n90多度\t7121\n最裸\t7122\n中心广场\t7123\n郭英森\t7124\n街霸\t7125\n以撒的结合:重生\t7126\n重出江湖\t7127\n工位牌\t7128\n合户\t7129\n▇\t7130\nforemost\t7131\n何马\t7132\n深圳市贝腾科技有限公司\t7133\n华为meta\t7134\n体验馆\t7135\n魅族e3\t7136\n雅礼\t7137\n有钱有势\t7138\n李宇琪\t7139\n房车营地\t7140\n西北电力设计院\t7141\n干仓\t7142\n安徒恩\t7143\n超支化\t7144\n苏锡常\t7145\n张义\t7146\nBD-流放之路\t7147\n中国石油学会\t7148\n工区\t7149\n即将发生\t7150\n凌渡\t7151\n萧山19楼\t7152\n阿尔卡特\t7153\n醒酒器\t7154\n尚景\t7155\n2015年9月1日\t7156\nRelax\t7157\n入名\t7158\nJtable\t7159\n马利克\t7160\n粒化\t7161\n龙吟百美缘\t7162\nipad1\t7163\n胡小呆\t7164\n凯天\t7165\n仙剑梦幻\t7166\n鍒樻爧\t7167\n受欢迎\t7168\n片假名\t7169\n夹心饼干\t7170\nEXcel\t7171\n山篇\t7172\nsopc\t7173\n开杆\t7174\n防晒\t7175\n可研\t7176\n小飞侠彼得潘\t7177\n艾森克人格问卷\t7178\n20171027\t7179\ndeen\t7180\nCod\t7181\noutliers\t7182\n新标准大学英语\t7183\nl801\t7184\nフランクミュラ\t7185\n爆碎\t7186\nchosen\t7187\n火冒三丈\t7188\ngiovanni\t7189\n字人\t7190\n地下连续墙\t7191\n部洲\t7192\n36片\t7193\n胡桃\t7194\nbetter\t7195\n带单\t7196\n中国建筑书店\t7197\n福临门\t7198\n罗素·克劳\t7199\n肌张力\t7200\n柜架\t7201\nNodeMCU\t7202\n潮阳\t7203\n号器\t7204\n北京公园\t7205\n名山区\t7206\n十来个\t7207\n环球雅思学校\t7208\n一比一\t7209\nmailx\t7210\ni9\t7211\n按\t7212\n东城街道\t7213\n6月26日\t7214\n焚风\t7215\nrsag\t7216\n红烧鸡块\t7217\n野鸡\t7218\n博士生入学考试\t7219\n劝说\t7220\n百年灵手表\t7221\nX50\t7222\n汤杯\t7223\n人教版四年级数学下册\t7224\n大众帕萨特\t7225\n展鸿教育\t7226\ndaying\t7227\n疾控中心\t7228\n加盟展\t7229\n商业端\t7230\n杨堤\t7231\n婚宴\t7232\n徐洪浩\t7233\n入场费\t7234\n股权质押贷款\t7235\n仙果\t7236\n腋臭科\t7237\nforbes\t7238\n无暇\t7239\n南昌银行\t7240\n桃金娘\t7241\nFaber\t7242\nBowie\t7243\n互联网经济\t7244\n闵庄路\t7245\n违背\t7246\n山甲\t7247\n碎屑\t7248\n64厘米\t7249\n佟丽娅\t7250\nMorricone\t7251\n无息\t7252\nBabyface\t7253\n真理观\t7254\n思品\t7255\n消费者权益法\t7256\n有误\t7257\nVC\t7258\n立花宗茂\t7259\nbader\t7260\n535T\t7261\n如新\t7262\n几线\t7263\nResearchGate\t7264\n纷至沓来\t7265\n免疫比浊法\t7266\n魏渭\t7267\nquake\t7268\nBooting\t7269\n登建\t7270\n公摊\t7271\n淡季\t7272\n昆仑玉\t7273\n24:00\t7274\n南站\t7275\n强逼\t7276\n无名缘\t7277\n赢装网\t7278\n微权力\t7279\n上柜\t7280\n4.2
4\t7281\n白鹭郡\t7282\nhz7788.com\t7283\n山东二区\t7284\n黑贝\t7285\n20170617\t7286\n绯雪\t7287\nOneNET\t7288\n冷光\t7289\n天茂集团\t7290\n悬臂起重机\t7291\nthinkpadx270\t7292\n长路\t7293\n霰\t7294\n然并卵\t7295\n调薪\t7296\n表观密度\t7297\n热水澡\t7298\nVacancies\t7299\n七张\t7300\n华伟\t7301\n盘龙药业\t7302\n储安平\t7303\nsilkn脱毛仪\t7304\n4200万\t7305\n侍魂\t7306\n广州国际生物岛\t7307\n奥金顿\t7308\n鹿鼎记1\t7309\n县政府\t7310\n狂恋\t7311\n颍泉\t7312\n离职会\t7313\n两句\t7314\n杀猪菜\t7315\n谈美\t7316\n英国文化教育协会\t7317\ncoupling\t7318\n赵霞\t7319\n万德数据库\t7320\nMDI\t7321\n乙酰半胱氨酸\t7322\n舍宾\t7323\n徐子轩\t7324\n6.4.2\t7325\n波波夫\t7326\n朱克\t7327\n补偿链\t7328\n重拍\t7329\n反恐精英OL\t7330\n云谱\t7331\nQK\t7332\nutf-8格式\t7333\n动土\t7334\nlange\t7335\nエキサイト\t7336\nPANDAS\t7337\nHttpUrlConnection\t7338\n肉桂茶\t7339\nsigmod\t7340\n梁静茹\t7341\nNFPA\t7342\n虚拟机能\t7343\nSeafile\t7344\n加减乘除法\t7345\n鹿特丹\t7346\n审计业务约定书\t7347\n王老七\t7348\nYouwei\t7349\n家具展\t7350\nUNI\t7351\n瑜伽馆\t7352\n通天教主\t7353\n泰州市教育局\t7354\n100010\t7355\nlor\t7356\nadditional\t7357\nHackathon\t7358\nresu\t7359\nfangan\t7360\n肇庆中学\t7361\n网易模拟器\t7362\n小不点\t7363\n沃特世\t7364\n优先\t7365\n尿糖\t7366\n崩溃论\t7367\n神木市\t7368\nvolume\t7369\n手术器械\t7370\ncmd.exe\t7371\n山核桃\t7372\n记号笔\t7373\n开花\t7374\n黄宇\t7375\n畜\t7376\nESD\t7377\n水系统\t7378\nuniv\t7379\n中国梦之队快乐之舞\t7380\n小泽征尔\t7381\n俞凌雄\t7382\n小版\t7383\nshopex\t7384\n患友\t7385\n新華網\t7386\n6.5分\t7387\n捷克斯洛伐克\t7388\n齿条式\t7389\n袁宝成\t7390\n张广\t7391\n怪手\t7392\nBae\t7393\n香干\t7394\nslui\t7395\n文件柜\t7396\n贺言\t7397\nZSH\t7398\n魅友卡\t7399\n夔州\t7400\n物流基地\t7401\n点检仪\t7402\n莴笋\t7403\n丨奇舰\t7404\n成稿\t7405\n细菌病毒\t7406\n结算卡\t7407\n奥罗拉\t7408\n财经\t7409\n第七条\t7410\nstrongest\t7411\n万丽酒店\t7412\n16幢\t7413\n浙教版\t7414\nGoogleplay\t7415\nPurdue\t7416\npycache\t7417\n李良\t7418\n沿江大道\t7419\n永不服输\t7420\n统计证\t7421\n包玉婷\t7422\n怪物猎人X|\t7423\n阿尔比恩\t7424\n小鹏汽车\t7425\n成都农业银行\t7426\n魔神\t7427\n文澜\t7428\n出镜\t7429\n新六产\t7430\n子母床\t7431\nFSG\t7432\n曾多次\t7433\n插穴\t7434\n两界\t7435\n根号\t7436\nwin2008R2\t7437\n奥园地产\t7438\n港华燃气\t7439\n肇东\t7440\n官司\t7441\n小杜\t7442\nAuditor\t7443\n增加率\t7444\n浅表性胃炎\t7445\n大好\t7446\nFE\t7447\n纺纱机\t
7448\n两亿岁\t7449\n泰达政府网\t7450\n杜宇麒\t7451\n五行石\t7452\n叶先生\t7453\nsaborino\t7454\n债券型基金\t7455\nconvert\t7456\n缓冲器\t7457\n福信集团\t7458\n胆囊壁毛糙\t7459\n祷言\t7460\nTonya\t7461\n0.5秒\t7462\nPowerPoint\t7463\n炔烃\t7464\n皎洁\t7465\njul\t7466\n玉色\t7467\n介绍页\t7468\n男仙\t7469\n现代舞\t7470\n头衔\t7471\n安盛保险\t7472\n特防\t7473\n西气东输\t7474\n火烧石\t7475\n梅村镇\t7476\n英国工党\t7477\n涉险\t7478\nZOWIE\t7479\n金融经济\t7480\n1509\t7481\n备存\t7482\n费里尼\t7483\n血舞旋月刀\t7484\n绝地求生狙击\t7485\n层架\t7486\n菡\t7487\n安顺职业技术学院\t7488\n董宝珍\t7489\nexcei\t7490\n澳门新八佰伴\t7491\n反流性食管炎\t7492\n你的眼神\t7493\n一则\t7494\n水坑\t7495\n米糠油\t7496\n旧装\t7497\n李建义\t7498\n佳鑫诺\t7499\nsimscape\t7500\n大清相国\t7501\n安彤\t7502\n第654章\t7503\n奔驰GLA级\t7504\n52亿元\t7505\n黄冈\t7506\nPADI\t7507\n怀安县\t7508\n碎铁者\t7509\n36.2\t7510\nnba2k13\t7511\nXbox360\t7512\n中共海南省委党史研究室_海南省地方志办公室\t7513\n奥特朗热水器\t7514\nCADENCE\t7515\n汽配人网\t7516\n挺身而出\t7517\n会使人\t7518\nviewer\t7519\nfeh\t7520\n瓷砖美缝剂\t7521\n麦迪电气\t7522\n黑马河\t7523\n苦草\t7524\n拉夏\t7525\n六席\t7526\nwins10\t7527\n主治医\t7528\nAerospace\t7529\n海济\t7530\n合富辉煌\t7531\n月空\t7532\n方世玉\t7533\n补锌\t7534\n五个\t7535\n戴套\t7536\n旋涡泵\t7537\n北泽\t7538\n七音\t7539\n甩奶\t7540\n布洛克城\t7541\n麻阳县\t7542\nqiyu\t7543\nescape\t7544\nEV300\t7545\n怪兽们\t7546\n骨缝\t7547\n调试版\t7548\n打酱油\t7549\ncommand\t7550\n贡布里希\t7551\nets\t7552\n爆破员\t7553\n财经学院\t7554\n合肥瑶海区\t7555\n露馅\t7556\n霹雳奇侠传\t7557\nappendto\t7558\n上海中石化\t7559\n生长炉\t7560\nTraversal\t7561\n何宝荣\t7562\n钢框\t7563\n生物科学\t7564\n独行者\t7565\n天府大道南\t7566\n家训馆\t7567\nSpark函数\t7568\n达观\t7569\n20151009\t7570\nrsrp\t7571\n乒乒乓乓\t7572\n大通回族土族自治县\t7573\n云点\t7574\n梁间\t7575\n公租房|华律\t7576\n新天龙八部截图\t7577\n忙于\t7578\na350\t7579\n鲤竿\t7580\n一克\t7581\n法律思想史\t7582\nwhereas\t7583\n星河赵\t7584\n交集\t7585\n中控\t7586\n新概念英语2\t7587\n拳皇96\t7588\n全清\t7589\nHD1280Px720P\t7590\n车削\t7591\n江苏省小学\t7592\n李楠\t7593\nDisplays\t7594\nka\t7595\n铂睿\t7596\n吹战\t7597\n跋山涉水\t7598\n美驰\t7599\n423世界读书日\t7600\n真人CS\t7601\n鹧鸪天\t7602\n点降水\t7603\n普兴镇\t7604\n打响\t7605\nstationary\t7606\n桂聘\t7607\nShowcase\t7608\n字迹\t7609\n无纺布\t7610\n鸡粪\t7611\n海普瑞\t7612\n鄂温克族\t7613\n莆仙戏\t761
4\n突然袭击\t7615\n悲喜\t7616\n美孚速霸2000\t7617\n饿了么\t7618\nputao\t7619\n牛堡\t7620\ncellpadding=\t7621\n28种\t7622\n吉利汽车研究院\t7623\n求不满\t7624\nMako\t7625\n样样\t7626\n得子\t7627\nnetfits\t7628\n悟饭游戏厅\t7629\n藤子·F·不二雄\t7630\n草塔镇\t7631\ncoinbase\t7632\n肿\t7633\n金誉\t7634\n安吉房产网\t7635\n河北华夏幸福足球俱乐部\t7636\nnetscreen\t7637\n绿园区\t7638\n中福在线\t7639\n体膜\t7640\n产检\t7641\nAAM\t7642\nDSE\t7643\nCrystalDiskInfo\t7644\n16则\t7645\nhitch\t7646\n双眼\t7647\nxiang\t7648\n現在\t7649\n特工学院\t7650\n高铁商务座\t7651\n精美绝伦\t7652\n花园城\t7653\n16双\t7654\nl4158\t7655\n浓眉大眼\t7656\n途安社区\t7657\n我的英雄学院第二季\t7658\nendwith\t7659\n863\t7660\n酷我歌词\t7661\n另一种\t7662\n奶兔\t7663\ncockpit\t7664\n一贴\t7665\n手提式\t7666\n21级\t7667\nOklahoma\t7668\n回收\t7669\n地球帝国2\t7670\n泰安站\t7671\n凯恩之角_暗黑破坏\t7672\n老五\t7673\n心脏瓣膜置换术\t7674\n泰式奶茶\t7675\n雕栏玉砌\t7676\n玉楼春\t7677\n祷\t7678\n米库\t7679\n登\t7680\n元方\t7681\n青坊主\t7682\nWeiPHP\t7683\n思铭\t7684\n美多多\t7685\n维矩\t7686\n潍坊市区\t7687\n武汉市卫生和计划生育委员会\t7688\n摆阵\t7689\nfairygui\t7690\nStockholm\t7691\nusing\t7692\n12.7.3\t7693\n凹版\t7694\nPClady摩登学院\t7695\nenthusiastic\t7696\n大卖单\t7697\n喜福网\t7698\n待到\t7699\n团扇舞\t7700\n顶复\t7701\n西安电视台\t7702\n烟袋\t7703\n西镇\t7704\n光大安石\t7705\n芭拉旅游网\t7706\n谬\t7707\n雷洛\t7708\nNielsen\t7709\n古德里安\t7710\n菠萝蛋白酶\t7711\n购房俱乐部\t7712\n正清\t7713\n零冠词\t7714\n6MM\t7715\n河北一区\t7716\n广西防城港\t7717\n直臂\t7718\nPUNCH\t7719\nWLAN\t7720\ntomic\t7721\n素直\t7722\n艾派\t7723\n秦汉史\t7724\nleast\t7725\n半扎\t7726\nselenium2library\t7727\n8包\t7728\n王万良\t7729\ntuhooo\t7730\n脑血栓\t7731\n暗鬼\t7732\n沂蒙颂\t7733\n海南省人民政府办公厅\t7734\n单向可控硅\t7735\n先斩后奏\t7736\njcenter\t7737\n20160519\t7738\n保利地产\t7739\n宗教\t7740\nsalmon\t7741\n蓝可儿\t7742\nAlizee\t7743\n3dmax2015\t7744\n摩西网\t7745\n吃吃喝喝\t7746\n現役\t7747\n工具盘\t7748\n真的好喜欢\t7749\nmoniqi\t7750\nCover\t7751\n茅箭\t7752\nobs吧_\t7753\n氧指数\t7754\n38届\t7755\n鸡西百姓网\t7756\nSWIFT\t7757\n道亨\t7758\n科学决策\t7759\n英雄本色2\t7760\n妙药\t7761\n火灾自动报警系统设计规范\t7762\n临床药师网\t7763\n成都成华政府\t7764\n共同性\t7765\n淮坊市\t7766\n化法\t7767\n6局\t7768\n柳叶眉\t7769\n包类\t7770\n海口\t7771\n夹缝\t7772\n万夫\t7773\n赛力\t7774\nQL\t7775\n策略类\t7776\n厦门建发集
团\t7777\n辞旧迎新\t7778\n词意\t7779\nbelief\t7780\nHttps\t7781\n火主\t7782\n爱尔信会计网校\t7783\n关服\t7784\n北盘江大桥\t7785\n县市场监管局\t7786\n材料成型及控制工程专业\t7787\n源点\t7788\n梅泰诺\t7789\n1份\t7790\n奇花\t7791\n66.cn\t7792\nDFRobot\t7793\n足底按摩\t7794\n金弦\t7795\n2017年12月27日\t7796\n急性左心衰\t7797\n贵州消防\t7798\n对角巷\t7799\n护理剂\t7800\n巫峡\t7801\n英伟达\t7802\n前夜\t7803\n哪场\t7804\n叶根\t7805\n陪嫁\t7806\nPDK\t7807\n小井\t7808\nRadeon\t7809\n肥东\t7810\n南方电网公司\t7811\n漫画\t7812\n前视图\t7813\nf1.8\t7814\n预定义变量\t7815\n苦手\t7816\n三盆\t7817\n创世兵魂\t7818\nProviding\t7819\nhori\t7820\n正真\t7821\n臭作\t7822\n果园港\t7823\n痈\t7824\n痴情冢\t7825\n妙道\t7826\n雷霆崖\t7827\n战舰世\t7828\n上海康朗生物科技有限公司\t7829\n皮皮鲁\t7830\n蔓越莓干\t7831\n乌姆蒂蒂\t7832\n卡普空\t7833\n胡安娜\t7834\n_一起网\t7835\nmodel类\t7836\n不好\t7837\n祈望\t7838\n副机\t7839\n2016年6月20日\t7840\n纷美\t7841\n狮头鹅\t7842\n11万公里\t7843\nvolet\t7844\nImpacts\t7845\nmarge\t7846\n毅然\t7847\n渝新欧\t7848\n1805\t7849\n七年级\t7850\n技术交流\t7851\n西游网\t7852\nGCCD\t7853\n子月\t7854\nangualr2\t7855\n灵耀\t7856\n基土\t7857\n中国证券监督管理委员会\t7858\nhalls\t7859\n専属\t7860\n姿式\t7861\n强制清算\t7862\n水电池\t7863\nVC6\t7864\n以弱胜强\t7865\nmobx\t7866\n洋泾街道\t7867\n八婺\t7868\n华为p9plus\t7869\nWMA\t7870\nCoCo\t7871\n菜种\t7872\n节节草\t7873\n杨晔\t7874\n苦学\t7875\n昌江黎族自治县\t7876\n湖南大剧院\t7877\n保鲜\t7878\n德比鞋\t7879\n福建工程学院\t7880\n酶类\t7881\n36章\t7882\n某科学的超电磁炮\t7883\n交货单\t7884\n位置传感器\t7885\nLasers\t7886\n贺信\t7887\n陶板\t7888\n被轮\t7889\n沿海赛洛城\t7890\n吃光\t7891\n财运\t7892\n春怨\t7893\n孟氏\t7894\n终审判决\t7895\n美悦湾\t7896\n天一大\t7897\n瓦滋猎人\t7898\n房龄\t7899\nMakoto\t7900\n载流量\t7901\n库兹马\t7902\n工作经\t7903\n新能源号\t7904\n南京教育论坛\t7905\n郭采洁\t7906\n一瓣\t7907\n玉玉\t7908\n00020\t7909\n无坚不摧\t7910\n在望\t7911\n鼻烟壶\t7912\n维新\t7913\n肺鱼\t7914\npile\t7915\n老坛酸菜牛肉面\t7916\noutlook2017\t7917\n河中\t7918\n梦雨\t7919\nowl\t7920\n肉毒素\t7921\nDVD刻录机\t7922\n六十九\t7923\n二维码收款\t7924\nUIbutton\t7925\n深圳南山医院\t7926\n最好的我们\t7927\n星湖街328号\t7928\n兴东国际机场\t7929\n满不在乎\t7930\n第122章\t7931\n体温\t7932\n安洁科技\t7933\nDOSS\t7934\n前锋\t7935\n控制变量法\t7936\n大写元\t7937\n小炒肉\t7938\n康强网\t7939\n随手杯\t7940\n6.5.7\t7941\n刺眼\t7942\n二零一五年\t7943\n网络负载均衡\t7944\n八仙饭店\t7945
\n提讯\t7946\n接班\t7947\nキャットウォ\t7948\nCX70T\t7949\nahaschool\t7950\n科鲁兹论坛_汽车之家论坛\t7951\n汪玉凯\t7952\n克莱\t7953\n蒸散量\t7954\nWhe\t7955\n科力普商城\t7956\n京宝梵\t7957\n奔驰C\t7958\n锐普\t7959\n误餐费\t7960\nGolf\t7961\n0755-25526367\t7962\n标本馆\t7963\nmafengwo\t7964\nlabview2016\t7965\nfoxmial\t7966\n展室\t7967\n观音殿\t7968\n八荒斗神\t7969\n潘西\t7970\n人工降雨\t7971\n兄弟们\t7972\n耐压试验\t7973\n全港\t7974\n一棒\t7975\nbolb\t7976\n复信号\t7977\nClips\t7978\n车税\t7979\n13001\t7980\n罗马斗兽场\t7981\n这些个\t7982\n摩诃\t7983\n肃北县\t7984\n醋泡姜\t7985\n汉语言文学毕业论文\t7986\n突发奇想\t7987\n风林火山\t7988\n慎用\t7989\n监看\t7990\n伯德小姐\t7991\n弧段\t7992\npk场\t7993\n浙江大学医学院附属儿童医院\t7994\n赖着不走\t7995\n决死\t7996\n南极大陆\t7997\n赤木刚宪\t7998\n发色\t7999\nAIESEC\t8000\nclipping\t8001\n湍急\t8002\nmx6.0\t8003\n园林艺术\t8004\nbaby\t8005\n平衡度\t8006\n色轮\t8007\n杨德龙\t8008\n白麓城\t8009\n玛格丽\t8010\n露西\t8011\n迎泽\t8012\nRamen\t8013\n霍芬海姆\t8014\n两个礼拜\t8015\n新鲜度\t8016\n甲地\t8017\n幻想曹操传\t8018\n乳源瑶族自治县\t8019\nanjelica\t8020\n块链\t8021\n江苏省第二中医院\t8022\n秘书处\t8023\n思铂路虎揽胜\t8024\nBLOB\t8025\n绿创\t8026\n看样子\t8027\n声势\t8028\n金属加工\t8029\n七所\t8030\n偏度系数\t8031\n洛卡\t8032\n牛哥\t8033\n策划\t8034\nDiskGenius简体中文版\t8035\n瑞吉酒店\t8036\n管理项\t8037\n实境\t8038\n杨三角\t8039\n宠物包\t8040\nrep\t8041\ncrab\t8042\n被骗婚\t8043\n头层皮\t8044\n京源\t8045\n愣头青\t8046\n武川\t8047\nNexperia\t8048\n必得\t8049\n丰乳\t8050\n污染控制标准\t8051\nPHANTEKS\t8052\n双天刀\t8053\n内存容量\t8054\n汽油罐\t8055\n强夯\t8056\n爆菊\t8057\n天金\t8058\n僵死\t8059\n含恨\t8060\n高品格\t8061\n温儒敏\t8062\n锦色\t8063\nyapoo\t8064\n吴大师\t8065\n马利\t8066\nprograms\t8067\n宝安区小学\t8068\n飙酷车神2\t8069\n男儿\t8070\n且慢\t8071\n65寸\t8072\n怡心\t8073\n电容式电压互感器\t8074\n六害\t8075\njone\t8076\n王娜\t8077\n玫瑰醋\t8078\ncaob\t8079\n庞龙\t8080\n停掉\t8081\n20170321\t8082\n地笼网\t8083\n住院医疗险\t8084\n|批\t8085\nUS福利网\t8086\nlibgcc\t8087\nViVo\t8088\n魔酷阁\t8089\nKNEE\t8090\n横测\t8091\n子游\t8092\nAma\t8093\n门套\t8094\n风景园林网\t8095\nkindle\t8096\n创伤科\t8097\n邓禄普\t8098\n传祺GS5\t8099\n魔龙世界\t8100\n5月1日\t8101\n大众化\t8102\n渔网袜\t8103\n舜\t8104\n三昧\t8105\n余运算\t8106\n断陷\t8107\n油焖春笋\t8108\n第九期\t8109\n遣散费\t8110\n30帧\t8111\n戊二烯\t8112\n云OS\t8113\n零售额\t8114\n恒大地
产\t8115\n桌面窗口管理器\t8116\n太平街\t8117\n章草\t8118\n锚点处\t8119\n虐猫\t8120\nv2.2.5\t8121\n凯贾\t8122\n达菲林\t8123\n锡粉\t8124\n绝地求生刺激战场\t8125\n狼少年\t8126\n陈敏恒\t8127\nstinger\t8128\n农夫\t8129\n中检院\t8130\n515路\t8131\n花城汇\t8132\n磨砂板\t8133\n湖南省政府扶贫办\t8134\nwoc\t8135\nboss直聘\t8136\n介绍费\t8137\n2018-04-24\t8138\n亲贤北街\t8139\n高耸\t8140\n中國經濟網\t8141\n墨青城\t8142\n健胃消食片\t8143\n亲属关系\t8144\n三表\t8145\n援青\t8146\n揽胜运动\t8147\ndying\t8148\n网丝\t8149\n工作压力\t8150\n宜兴紫砂壶名人录\t8151\npropane\t8152\n注油机\t8153\n仙剑奇侠\t8154\n向前看\t8155\n疖\t8156\n動物\t8157\n泵阀\t8158\n近平\t8159\n300DPI\t8160\nSEA\t8161\n卡梅\t8162\n流星语\t8163\n全一册\t8164\n购用\t8165\n较小\t8166\nrequestmapping\t8167\nニコニコ\t8168\n春游\t8169\nVegas\t8170\nglasses\t8171\n闲居\t8172\nairserver\t8173\n陈文强\t8174\nidex\t8175\n罗红霉素片\t8176\n风玫瑰图\t8177\n女人语时尚网\t8178\n常州经开区\t8179\n40句\t8180\n卫斯理传奇\t8181\n浮床\t8182\n成都国际铁路港\t8183\nRemoteFX\t8184\niem\t8185\nMBB\t8186\na75\t8187\nps油漆桶\t8188\n3版\t8189\n冲洗器\t8190\n伊美\t8191\n外膜\t8192\n不可期\t8193\nv2.8.0\t8194\n牛黄上清片\t8195\n上水管\t8196\n一粥\t8197\n攸\t8198\n易学\t8199\n讨价还价\t8200\n苯甲醚\t8201\n乐途\t8202\n板房\t8203\n30M\t8204\n亚洲开发银行\t8205\n翩翩冷少俏佳人\t8206\n孤独星球\t8207\n海鸣\t8208\nFlashFXP\t8209\n祸兮福\t8210\n达米安\t8211\n流氓兔\t8212\n水头损失\t8213\n杨晓东\t8214\n北京公交一卡通\t8215\n108号\t8216\n色香味形\t8217\n齐宣王\t8218\n公里表\t8219\n分解符\t8220\n张永霞\t8221\nlsfhack\t8222\nKMST\t8223\n阿笨猫\t8224\n上古卷轴3\t8225\ncircles\t8226\n2844\t8227\n医生们\t8228\n甘肃医学院\t8229\nclips4sale\t8230\n蔡翔\t8231\n金田花园\t8232\n对称阵\t8233\n叶继欢\t8234\n拨片网\t8235\n百万辆\t8236\n商品房预售证\t8237\n恒产者\t8238\n斗破苍穹动漫\t8239\n贵港市覃塘区人民政府\t8240\nILCE-7RM2\t8241\n中山路\t8242\n3炎\t8243\n【战\t8244\ne5200\t8245\n明妃\t8246\nr19\t8247\nPDF谱\t8248\n埇桥\t8249\n花程式\t8250\n汉威电子\t8251\n赵公明\t8252\n苦口\t8253\n恋与\t8254\n成空\t8255\n京东方A\t8256\n言情小说\t8257\n趁热打铁\t8258\n联防\t8259\n大彩网\t8260\n1次方\t8261\n寻医\t8262\n磷酸酯\t8263\nBilateral\t8264\n二维矩阵\t8265\n明经\t8266\n诗者\t8267\n64\t8268\n三渲二\t8269\n光笔\t8270\n蘑菇酱\t8271\n仡佬族\t8272\n大明咒\t8273\n8ml\t8274\n孕育剂\t8275\n2017-04-21\t8276\n水器\t8277\n夏枯草膏\t8278\n唯物主义者\t8279\n宫雁\t8280\n100v\t8281\n190号\t8282\n阿克塞尔\t
8283\n昆明国旅\t8284\n农时\t8285\ngtk\t8286\n嫔\t8287\n平民窟\t8288\n8202\t8289\n山月\t8290\nLaguna\t8291\n日进斗金\t8292\n每元\t8293\nRDP\t8294\n很多块\t8295\nHS8145V\t8296\nGlo\t8297\naltiumdesigner\t8298\n平型关路\t8299\n救生\t8300\n绝地求生玩\t8301\n1ppm\t8302\n同步助手\t8303\n承兑\t8304\n路滑\t8305\n康维\t8306\n膨胀性\t8307\n只需\t8308\n魔法科高校\t8309\n多媒体箱\t8310\n胜利一中\t8311\nScheduler\t8312\n验证性\t8313\nloadrunner11\t8314\n金艺贞仙桃\t8315\nmingw32-make\t8316\ncoreldRAW\t8317\n补贴费\t8318\nguahao\t8319\n斗战狂潮\t8320\n大鳌\t8321\n飞腾\t8322\n英联邦\t8323\n梁稳根\t8324\n城东社区\t8325\nBOLL\t8326\n金恒德\t8327\n星月湾\t8328\n温湿度传感器\t8329\n观音\t8330\n普联TL\t8331\n臀疗\t8332\n阿宁\t8333\n75平\t8334\n江苏卫生人才\t8335\n费拉\t8336\n妖帝\t8337\nbeihai\t8338\n果农\t8339\n何洁\t8340\n五一天\t8341\n疏议\t8342\n围产\t8343\nUPS\t8344\n拓跋宏\t8345\n古界\t8346\n游侠\t8347\n3D肉蒲团\t8348\n螺岭\t8349\n爱医考试\t8350\n招风\t8351\nrarbg\t8352\n所思\t8353\n筛窦炎\t8354\nWin10|Windows\t8355\n杰奎琳\t8356\noman\t8357\n国债利率表\t8358\n嗤嗤\t8359\n静定结构\t8360\n班迪\t8361\nload_string\t8362\nSTK\t8363\n3万辆\t8364\nFateGrandOrder\t8365\n20161109\t8366\nlm324\t8367\n合肥市委\t8368\n20150822\t8369\nMIUI8稳定版\t8370\n2018年4月17\t8371\nskimage\t8372\n天津耀华中学\t8373\n1月19日\t8374\n猩球崛起3\t8375\n四责四诺\t8376\n副营\t8377\n泰山风景区\t8378\n暴尸\t8379\n闲者\t8380\ndbms\t8381\n德州扑克\t8382\n魏碑\t8383\n联想y720\t8384\n福家\t8385\n线速\t8386\ngnc\t8387\n北滨路\t8388\n火猫\t8389\n巫溪\t8390\n湖北省农村信用社联合社\t8391\n在记忆里\t8392\n安委办\t8393\n九间\t8394\nselenuim\t8395\nfetched\t8396\n庞蒂\t8397\n东元\t8398\n沃土\t8399\nbsb\t8400\n胆固醇血症\t8401\n新路子\t8402\n光明小区\t8403\nvard\t8404\n天庭\t8405\n私车\t8406\n深圳人民医院\t8407\n1962\t8408\n大纪元\t8409\n菲娅\t8410\n重庆市公安局交通管理局\t8411\n80只\t8412\n单数\t8413\n三态\t8414\n甘肃省国土资源厅\t8415\nYocto\t8416\n王霜\t8417\n夹线器\t8418\n花子\t8419\n首度\t8420\n张卫健\t8421\n带带\t8422\n儿童资源网\t8423\n平田\t8424\n郑君里\t8425\n吴文英\t8426\nPine\t8427\n泡泡网\t8428\n卓乐\t8429\nread/64\t8430\nnina\t8431\n呱呱泡蛙\t8432\n万邑通\t8433\n多媒体播放器\t8434\n中国节能\t8435\n云层\t8436\nRCO\t8437\n会员们\t8438\n黄桷坪\t8439\n陕西省环保厅\t8440\n108g\t8441\nRDO\t8442\nFMEA\t8443\ntumblr\t8444\n场道\t8445\nEaston\t8446\n手写版\t8447\nspringframework\t84
48\n传播权\t8449\n巴音郭楞蒙古自治州\t8450\n扶养费\t8451\n炉石传说天梯\t8452\n红楼梦中\t8453\n毛坯图\t8454\n定为\t8455\n龙小云\t8456\n上海男排\t8457\n广告有限公司\t8458\n刀剑乱舞\t8459\n父神\t8460\n绿豆芽\t8461\n叶丹\t8462\nEnscape\t8463\n)融资租赁有限公司\t8464\n被停职\t8465\n孟庙\t8466\n胆大心细\t8467\nease\t8468\n孙周兴\t8469\n黄城根小学\t8470\n统计法\t8471\n六碟\t8472\n藕断丝连\t8473\n广汽研究院\t8474\n地方政府\t8475\n浙江中国小商品城集团股份有限公司\t8476\n污渍\t8477\nv-for\t8478\n学步\t8479\nrtw\t8480\nperpetual\t8481\n弧型\t8482\nlinguistic\t8483\n胃溃疡\t8484\nSaks\t8485\n郑阳\t8486\n博汇\t8487\n乌鲁木齐职业大学\t8488\nADC0832\t8489\n桂阳\t8490\nPrediction\t8491\n几关\t8492\nJ-1\t8493\n福瑞达\t8494\n黄宝石\t8495\nggjucheng\t8496\n回到现实\t8497\n越人歌\t8498\n苗菜\t8499\n两千多只\t8500\n金_\t8501\n传祺GA6\t8502\n信命\t8503\nmodest\t8504\n道德观察\t8505\ng470\t8506\nLRC歌词\t8507\nDitto\t8508\nl351\t8509\nMess\t8510\nHEXO\t8511\n萝拉\t8512\nksoap\t8513\n真子公主\t8514\n家云\t8515\n2笔\t8516\n放映\t8517\n电锯惊魂4\t8518\n第03\t8519\n白夜行\t8520\n0804\t8521\n800万元\t8522\n恶魔复仇者\t8523\n白衣\t8524\n红巾\t8525\n2012伦敦奥运会\t8526\n中国社会科学网\t8527\n唐慧\t8528\n山水文园集团\t8529\n呼伦\t8530\n外劳\t8531\nJay\t8532\n自开\t8533\n雀羽\t8534\n岗\t8535\n二项式\t8536\n康华整形美容网\t8537\n北京现代音乐研修学院\t8538\n红外触摸屏\t8539\n核弹头\t8540\nv级\t8541\n如燕\t8542\n黑鱼\t8543\n嘉实\t8544\nfutian\t8545\n广州道\t8546\n浦东小学\t8547\nXFS\t8548\n有源低音炮\t8549\nSUGAR\t8550\n初语\t8551\n暨南大学珠海校区\t8552\n龟苓膏\t8553\n可比价\t8554\n真人快打XL\t8555\n上海菜\t8556\n帐套\t8557\n北邙山\t8558\n腾信股份\t8559\n股票收益率\t8560\n浮子流量计\t8561\n2950\t8562\n利伐沙班\t8563\n轨门\t8564\nГ\t8565\n再生产\t8566\n梦飞科技\t8567\n方廷皓\t8568\n科文\t8569\n列提纲\t8570\nOptions\t8571\n一对一_\t8572\n槐乡\t8573\nDOLL\t8574\nswear\t8575\n填隙\t8576\ninstitution\t8577\n新玩\t8578\n多一次\t8579\ndaxie\t8580\n我领\t8581\n佳兴\t8582\n廉航\t8583\n李熙\t8584\njyzhou\t8585\n基频\t8586\n新型农村合作医疗保险\t8587\n伟光\t8588\n一堂\t8589\nMPP\t8590\n海玥\t8591\n邓颖\t8592\n电脑族\t8593\n社会学系\t8594\n博美吧\t8595\n绞股蓝\t8596\n酒泉市\t8597\nwebpage\t8598\n上汽大众途昂\t8599\n自然吸气发动机\t8600\n地爆天星\t8601\nSUBARU\t8602\n中央圣马丁\t8603\n大正制药\t8604\n我是王的女儿\t8605\n宣城中学\t8606\n苏州地铁2号线\t8607\n惠州汽车站\t8608\n御景湾\t8609\n1293\t8610\n浑身\t8611\n同爱\t8612\n包治\t8613\n真假_二丫网\t8614\n麾\
t8615\nfaster\t8616\nk660e\t8617\nlua吧_\t8618\n茹茹\t8619\n神经官能症\t8620\nrelife\t8621\nkmv\t8622\n丰惠镇\t8623\nzx2\t8624\n优特钢\t8625\n20180127\t8626\n8所\t8627\n会议案\t8628\n尤为\t8629\n李建忠\t8630\n青面兽\t8631\n大丰\t8632\nxfun\t8633\n四川省政府\t8634\n遗址公园\t8635\n融科\t8636\n利于\t8637\n黄暴污\t8638\n仁恒河滨花园\t8639\n刘东明\t8640\n骨肽\t8641\n套间\t8642\n龙晶\t8643\n香叶\t8644\n南昌西\t8645\n手价\t8646\n国务院公报_中国政府网\t8647\nBrides\t8648\n此卡\t8649\nPhpstorm\t8650\n表格版\t8651\n太极拳\t8652\nQQ阅读\t8653\n联赛杯\t8654\n孤岛吧\t8655\n下锁\t8656\nDakota\t8657\n西尔维娅\t8658\n国食药监械\t8659\n暗影格斗2吧\t8660\n合生元奶粉\t8661\n纪念品\t8662\nIPM\t8663\n县纪委监委\t8664\n华广\t8665\n亚花梨\t8666\n同甘\t8667\n片面\t8668\n质量分\t8669\n129种\t8670\n2.5万公里\t8671\n自收自支\t8672\n639\t8673\nwein10\t8674\n水银柱\t8675\n荣威混沌剑神\t8676\n白裙子\t8677\n金岭镇\t8678\n方方格子\t8679\n海港开发区\t8680\nbreaker\t8681\n郑忠\t8682\n世华\t8683\nxinyang\t8684\n凹面\t8685\n帝国备份王\t8686\n卡卡号\t8687\n李玉兰\t8688\n消渴\t8689\nwww.51\t8690\ntomography\t8691\n理财学院\t8692\nmami\t8693\nsuanfa\t8694\n入党积极分子考察表\t8695\nGm\t8696\n采买\t8697\n雨龙\t8698\nhuami论坛\t8699\nPP管\t8700\n3秒内\t8701\n口琴谱\t8702\nInnodb\t8703\n小岛南\t8704\n拷贝\t8705\n唐浩明\t8706\n大脚骨\t8707\n4则\t8708\n给我\t8709\n西北有色金属研究院\t8710\n高科技\t8711\n中国照明\t8712\nchiphell\t8713\nin_书面语\t8714\njilu\t8715\n东京电影节\t8716\n中调\t8717\nHire\t8718\nVS\t8719\n建湖\t8720\n息息相关\t8721\n毒虫\t8722\n欢乐颂2\t8723\n装错\t8724\n只限\t8725\nPreliminary\t8726\n袋式过滤器\t8727\n喜乐街\t8728\naccess2010\t8729\n拐棍\t8730\n帝俊\t8731\nsata2\t8732\n晶质\t8733\n产业带\t8734\n外务省\t8735\ntimeless\t8736\n库伯\t8737\n遗传性耳聋\t8738\n中介机构\t8739\n编写函数\t8740\n翡丽甲第\t8741\n农业投入品\t8742\n远教\t8743\nimpressions\t8744\n生化反应\t8745\n毛泽民\t8746\n寻秦记\t8747\n手机触屏\t8748\n双效片\t8749\nH1S\t8750\n3个小时\t8751\ndoPDF\t8752\n手稿\t8753\n玛尼石\t8754\n多伦多猛龙\t8755\n右键\t8756\n军床\t8757\n薄透镜\t8758\n罗军\t8759\npotassium\t8760\n跳井\t8761\n听潮\t8762\n金石头\t8763\n陈述句\t8764\n地产类\t8765\nWinFrom\t8766\n数学家\t8767\n0xc0000135\t8768\n佰草世家\t8769\n美物\t8770\n燃油滤清器\t8771\n黛丝\t8772\nacacia\t8773\n绝命律师\t8774\n文锋\t8775\n许亚丽\t8776\nppy\t8777\n季昌明\t8778\nHDTVrip\t8779\n永基\t8780\nNSFC\t8781\n翻牆\t8782\n谦\
t8783\nFervent\t8784\nGoethe\t8785\n搏击者\t8786\n青山楼\t8787\n守护者联盟\t8788\n辰皇\t8789\nAL00A\t8790\n10000日元\t8791\n软籽石榴\t8792\n复联2\t8793\n通海县\t8794\n宣传片\t8795\n环球广贸\t8796\n1173\t8797\n学前教育史\t8798\n看涨\t8799\n桐城中学\t8800\n网上银行\t8801\n陈晶晶\t8802\n流水段\t8803\nModule5\t8804\n估号\t8805\n应勤\t8806\n安泰集团\t8807\n文慧\t8808\n讨鬼传\t8809\nhkt\t8810\n4.13.0\t8811\n五星体育\t8812\ncitizen\t8813\nく\t8814\n6000万\t8815\n河北医科大学第二医院\t8816\n三盒\t8817\n昂立\t8818\n毕胜\t8819\n服装学院\t8820\n石墨烯电池\t8821\n烤鸭炉\t8822\nlutens\t8823\n南开中学\t8824\n生津\t8825\nviod\t8826\n串型\t8827\n指点江山\t8828\n嘀嗒拼车\t8829\n从无到有\t8830\n参与率\t8831\n郑州医院\t8832\n倩碧黄油\t8833\n几滴\t8834\n海星宝\t8835\n北京建委\t8836\n亮碟\t8837\n赫胥黎\t8838\n美食梦物语\t8839\n满堂红\t8840\n芷江\t8841\n六日\t8842\n数十\t8843\n爱创课堂\t8844\n春意盎然\t8845\n孙训方\t8846\n902路\t8847\n何老师\t8848\n上海国\t8849\n上海邮政\t8850\nsint\t8851\nwindows域\t8852\n汤姆斯\t8853\n侠盗飞车\t8854\n打鸟\t8855\n难民\t8856\n空白点\t8857\n玲奈\t8858\n非法移民\t8859\n叶良辰\t8860\n南汉\t8861\n毒哥\t8862\nqq企业邮箱\t8863\nsla\t8864\n51cto\t8865\n组织块\t8866\n6006\t8867\n干枣\t8868\n圣斗\t8869\n无霜\t8870\n物业小区\t8871\n巴黎鲁贝\t8872\n水妖精\t8873\n穆熙妍\t8874\n刚架\t8875\n鞋企\t8876\n宾汉姆\t8877\n邹勇\t8878\n陈志辉\t8879\nMik\t8880\n好色\t8881\n上海金融报\t8882\n延吉\t8883\nexisted\t8884\n记分周期\t8885\nUniversity\t8886\n4.4.3\t8887\n乌克兰语\t8888\n养程度\t8889\n跨地\t8890\n珠江口\t8891\n2880x1800\t8892\n椎间盘突出\t8893\npacking\t8894\n牧场\t8895\n听好\t8896\n真明\t8897\n景行\t8898\n成雅高速\t8899\nomnipeek\t8900\n俄勒冈大学\t8901\n五系\t8902\noca\t8903\n参观证\t8904\n西湖大学\t8905\nIII代\t8906\n杨梅坑\t8907\nbankruptcy\t8908\n萤石社区\t8909\n为谁而炼金吧\t8910\n济宁学院\t8911\n铺上\t8912\n宋夏\t8913\n垫底\t8914\n岔道\t8915\nck7788电影网\t8916\n休斯顿\t8917\n刘倩\t8918\nesf\t8919\n肖明\t8920\nyiji\t8921\n南丰\t8922\n岩土工程师考试\t8923\n大锤哥\t8924\n挖媒\t8925\n武阿哥\t8926\n写字桌\t8927\n静夜\t8928\n轮作\t8929\n8.3%\t8930\n排列整\t8931\n深圳市环球易购电子商务有限公司\t8932\n青岛地铁3号线\t8933\n谷月轩\t8934\n虚胖\t8935\n云韵\t8936\n耶鲁\t8937\n梦劳动美\t8938\n爱下\t8939\nnak\t8940\n珏\t8941\n里番合集\t8942\n医部\t8943\n渐开\t8944\nPROTOCOL\t8945\n敬佩\t8946\n南京大学法学院\t8947\n翻釋\t8948\n10ul\t8949\n智慧新城\t8950\n赛鸽\t8951\n龙兴\t8952\nWikiquote\t8953\n糖版\
t8954\n金蝶\t8955\n彭山\t8956\nreat\t8957\n打蛋机\t8958\n图长\t8959\n实验诊断学\t8960\nmargiela\t8961\n徐州一中\t8962\n98级\t8963\n看管\t8964\n杜灿\t8965\n文件管理器\t8966\nscrubber\t8967\n脱氧核苷酸\t8968\nC85\t8969\n太原市国土资源局\t8970\n麦田怪圈\t8971\node\t8972\n魔兽世界猎人珍稀宠物图鉴\t8973\n82429106\t8974\nInconel\t8975\nstudio2010\t8976\n924次\t8977\nWES\t8978\n平安证券\t8979\n基本常识\t8980\n[综\t8981\n幽奈\t8982\n函评\t8983\n太极刀\t8984\n支塘镇\t8985\n步甲\t8986\n无彩限的怪灵世界\t8987\n热水锅炉\t8988\n氯吡脲\t8989\n播散\t8990\n坐坐\t8991\n撤诉\t8992\n二十三年\t8993\nsurfacepro3\t8994\n横眉冷对千夫指\t8995\n三星杯\t8996\nX12\t8997\n李珉宇\t8998\n田七粉\t8999\n04.18\t9000\n竹罐\t9001\n大聚会\t9002\n王恒\t9003\npesticide\t9004\n7.4分\t9005\njeffery\t9006\n学员们\t9007\n唱词\t9008\nrecovery双清\t9009\n_&#160\t9010\n常住\t9011\n芙蓉山庄\t9012\n20个小时\t9013\njcaptcha\t9014\n韩信天官赐福\t9015\n最适宜\t9016\n啫喱水\t9017\nEmp\t9018\n002236\t9019\n中海大厦\t9020\n明卓\t9021\nCc\t9022\n创智\t9023\nRUBY\t9024\n梦乃爱华\t9025\n深圳市金融办\t9026\n晶系\t9027\n清苑\t9028\nDatacenter\t9029\nesoogle\t9030\n郭丽萍\t9031\n圃\t9032\n聊城\t9033\n10例\t9034\nIphone6\t9035\n罗宋汤\t9036\nElementary\t9037\n盱眙龙虾节\t9038\n还有没有\t9039\n1997年\t9040\n中差\t9041\nnsnumber\t9042\n周仁\t9043\n氯虫苯甲酰胺\t9044\n昌平农业嘉年华\t9045\nflowmaster\t9046\n18日\t9047\n贺利氏\t9048\n柜中美人\t9049\n2017年7月3日\t9050\n科隆大教堂\t9051\n20i\t9052\n70亿元\t9053\n10分钟后\t9054\n凡瑶\t9055\n一逼\t9056\n江西省气象局\t9057\n11册\t9058\n普约尔\t9059\n神鹰帝国\t9060\n互投\t9061\nzx300\t9062\n脑男\t9063\n横岗街道\t9064\n华东交通大学\t9065\n小静\t9066\n虎掌\t9067\n5.1音乐网\t9068\n卷积核\t9069\n陈建军\t9070\n黄石市中心医院\t9071\n汉画像石\t9072\n鞣革\t9073\n星钻物语\t9074\n昆山花桥\t9075\n西部大峡谷\t9076\nBelle\t9077\nActors\t9078\nchar变量\t9079\npadavan\t9080\n人源\t9081\n通程\t9082\nlinner\t9083\n苍井优初\t9084\n兰德里\t9085\n方阵\t9086\n三达\t9087\nthumbzilla\t9088\n中频\t9089\n哈衣\t9090\n梨园镇\t9091\n国易斋\t9092\nintellijidea\t9093\n王洛勇\t9094\n喵玉殿论坛\t9095\n钳制\t9096\n梁俊\t9097\n海蛾号\t9098\n百合小说吧\t9099\n明里紬\t9100\nVer\t9101\n高参\t9102\n柯迪亚克\t9103\nG20峰会\t9104\n工业大学\t9105\n外环\t9106\n普本\t9107\n100句\t9108\n卡娜\t9109\n边区\t9110\n索权\t9111\n脐带间充质干细胞\t9112\n二联\t9113\n巴坎布\t9114\n牙斗\t9115\nprohibited\t9116\n邹忌\t9117\n傻吊\
t9118\n鱼龙混杂\t9119\n浮名\t9120\n铁东\t9121\n永发镇\t9122\n大众迷你仓\t9123\n食展\t9124\n80多家\t9125\n广州林\t9126\nmountains\t9127\n匪\t9128\n活载\t9129\n御乘\t9130\n合肥供水集团\t9131\n三星云\t9132\n黄巾起义\t9133\n气鼓\t9134\n1903\t9135\n直插\t9136\n横刀夺爱\t9137\n红外光谱仪\t9138\n树人大学\t9139\n稽留\t9140\n舞出我人生3\t9141\n传播性\t9142\n晴川\t9143\npsychopass\t9144\n永顺\t9145\n闺蜜们\t9146\n去年10月\t9147\n免面\t9148\nExterior\t9149\n轰天\t9150\n磁控溅射\t9151\n离体\t9152\nWinfrom\t9153\n1D\t9154\n水稳碎石\t9155\n金时代\t9156\n欲求\t9157\n季缃绮\t9158\n竞对\t9159\n腰椎\t9160\n同江\t9161\n茗苑\t9162\n血继限界\t9163\nORA-06512\t9164\n先后次序\t9165\n舞风\t9166\nJTextField\t9167\n烧入\t9168\naber\t9169\nacquired\t9170\nCyrus\t9171\nHashing\t9172\n二副\t9173\n贺州日报\t9174\nNGS\t9175\n地杆\t9176\n投资者们\t9177\n炮师\t9178\n领料单\t9179\n3.7.30\t9180\n寻侠英雄传\t9181\n混合型证券投资基金招募说明书\t9182\n郑燮\t9183\n伯父\t9184\n地藏菩萨\t9185\n笔型\t9186\n墨竹\t9187\n黄凡\t9188\n速录机\t9189\n周免\t9190\n大雪天\t9191\n小棕\t9192\n八通网\t9193\n止战\t9194\n桂花雨\t9195\n事人\t9196\n大梁\t9197\n中宾\t9198\n俭\t9199\n巨龙大道\t9200\n叉叉酷玩\t9201\n转换器\t9202\n铮铮\t9203\nKAFKA\t9204\n热拍\t9205\n贵州省住房和城乡建设厅\t9206\n杠\t9207\n双重曝光\t9208\n舞姿\t9209\n松鼠\t9210\n新新魔塔\t9211\nStony\t9212\n博亿堂\t9213\nZU\t9214\n18卷\t9215\n神武3手游佛门\t9216\n陈文\t9217\nHART\t9218\n丹田\t9219\n盘式制动器\t9220\nIDC国际资讯\t9221\n开胶\t9222\nY型\t9223\n20170528\t9224\n可丽饼\t9225\n湛山\t9226\n3.6.4\t9227\n破法者\t9228\n诸己\t9229\n血斧\t9230\n塔基\t9231\nIATF\t9232\nArkansas\t9233\n水瓢\t9234\n夸张\t9235\n朗迪\t9236\nin10\t9237\n2170\t9238\n条件反射\t9239\nMWC2018\t9240\n发件\t9241\nCSharp\t9242\n2017年11月13日\t9243\n转序\t9244\n四川职业技术学院\t9245\namigoOS\t9246\n镇海区\t9247\n议定书\t9248\nMJPEG\t9249\n维修类\t9250\n馁\t9251\n粘液\t9252\n初代奥特曼\t9253\n花潮\t9254\n周新\t9255\n贵州省交通规划勘察设计研究院股份有限公司\t9256\n八一冰川\t9257\nwebwork\t9258\n对线\t9259\n安庆论坛\t9260\n无口\t9261\n不接\t9262\n了事\t9263\n情谜睡美人\t9264\n拷机\t9265\n棠棣之华\t9266\n系膜\t9267\ninductive\t9268\n外语\t9269\n牛黄\t9270\n时政类\t9271\n海河州\t9272\nzuciwang\t9273\nXms\t9274\n广东省公安厅\t9275\n娄艺潇\t9276\n宝岗大道\t9277\n3032\t9278\n浙江省交通运输厅\t9279\n社会型\t9280\n提袋\t9281\n淋巴结影\t9282\n周末班\t9283\n芳漆\t9284\n薛斯\t9285\nsoildwork\t9286\n超导磁体\t9
287\nsegue\t9288\n青木原树海\t9289\n启明创投\t9290\nbicloud\t9291\n令人作呕\t9292\n外防\t9293\nESG\t9294\n安阳论坛\t9295\n抽样误差\t9296\n珂罗版\t9297\n飘花网\t9298\n会计从业\t9299\n拱墅\t9300\n安徽水利水电职业技术学院\t9301\nChampionship\t9302\nVivado\t9303\n耐思\t9304\nNissan\t9305\n0.17\t9306\n狼疮性肾炎\t9307\nCLOVER\t9308\n闪片\t9309\n败品秀\t9310\n新西凤酒商城\t9311\n失敗\t9312\n和解\t9313\nssl协议\t9314\n承\t9315\n十六周\t9316\n分型面\t9317\nqq表情大全\t9318\n油漆房\t9319\n财政所\t9320\n郝邵文\t9321\n沉积\t9322\n自启\t9323\n第一世\t9324\n大一块\t9325\n封疆大吏\t9326\n兵马俑\t9327\n华人小说网\t9328\ncoax\t9329\nJUY\t9330\n勇太\t9331\n摩洛哥\t9332\n门人\t9333\n和谐家园\t9334\nBranded\t9335\n郑州19楼\t9336\nsql-SQL\t9337\n申研\t9338\n灰印\t9339\n唐勇\t9340\ngrande\t9341\navg\t9342\n穿靴子的猫\t9343\nEffects\t9344\nPregnant\t9345\n各行\t9346\nHell\t9347\n脂老虎饼干\t9348\nappreciate\t9349\n聚氨酯硬泡\t9350\npullman\t9351\n鼾\t9352\n打掉\t9353\n铸铁管\t9354\n晶锐斗破苍穹\t9355\n潮粉\t9356\n花芽\t9357\n莱斯纳\t9358\n海兰信\t9359\n瓯江口\t9360\n奉化\t9361\n厦工\t9362\n松针\t9363\n死神来了4\t9364\n粮液\t9365\n开心超人联盟之星际危机\t9366\n彩信\t9367\n泰昌\t9368\n分校\t9369\n数控铣\t9370\n兴宁市\t9371\n书画集\t9372\n小平\t9373\n西红柿\t9374\n崔建远\t9375\n中山三院\t9376\n叠片机\t9377\n痂\t9378\n中国民生银行股份有限公司\t9379\n朱砂痣\t9380\n地球港\t9381\n黑芝麻丸\t9382\n掇刀\t9383\n长城m4\t9384\nMarseille\t9385\n马务\t9386\n万绿园\t9387\n第二屏\t9388\n襄公\t9389\n江阳区人民政府\t9390\n联芯\t9391\n票样\t9392\n晓东\t9393\n合众\t9394\n刘子\t9395\nMultilayer\t9396\n城市快速路\t9397\n空翼\t9398\n财预\t9399\n红素\t9400\n王烈\t9401\nA杖\t9402\n小青瓦\t9403\n准点\t9404\n巴卫\t9405\n幕后黑手\t9406\nix25\t9407\n五丈\t9408\nwatt\t9409\n十国千娇\t9410\n白手起家\t9411\n第五场\t9412\n嘉杰\t9413\n可控\t9414\n战就战\t9415\n蓝房网\t9416\n云南西双版纳州\t9417\n载客汽车\t9418\nx战警2\t9419\n油价\t9420\n魔棍\t9421\nMXGS\t9422\n无界\t9423\nSING\t9424\n持有人\t9425\n字书\t9426\n惠州市住房和城乡规划建设局\t9427\n空白名\t9428\n品牌街\t9429\n乐福鞋\t9430\n绣花鞋\t9431\nwxg\t9432\n神界2\t9433\n汉能移动能源控股集团有限公司\t9434\n大墨垂杨\t9435\n8000部\t9436\n李金铭\t9437\n1.0.5.0\t9438\n岸然\t9439\n人间体\t9440\n脾胃病\t9441\n苏州市政府\t9442\n众品\t9443\n小米5X\t9444\nPendant\t9445\n化脓性\t9446\n浪子心\t9447\n清华附中朝阳学校\t9448\n第3期\t9449\n偶遇\t9450\nita\t9451\nAssessments\t9452\n面包蟹\t9453\n武学\t9454\n弘法\t9455\n天黑\t
9456\n爱心基金\t9457\nU点\t9458\n宣传栏\t9459\n驭胜s350\t9460\n金利源\t9461\n肉馅\t9462\n中国人民大学马克思主义学院\t9463\n上海理工大学\t9464\ncodesys\t9465\n全域旅游示范区\t9466\n周恩来故居\t9467\n发酵粉\t9468\n95198\t9469\n碳酸盐\t9470\n大全网\t9471\n开锅\t9472\n电力系统稳态分析\t9473\n瞿颖\t9474\nejaculation\t9475\n新乡市住房和城乡建设委员会\t9476\n搏斗\t9477\n电脑安全模式\t9478\n甘发\t9479\n春妮\t9480\n1平方厘米\t9481\nJavaSc\t9482\nLund\t9483\n东方润园\t9484\n枸杞红枣\t9485\n第129期\t9486\n螺线管\t9487\n电子科技大学经济与管理学院\t9488\n5002\t9489\n一般而言\t9490\n百余种\t9491\nnanos\t9492\n破虏\t9493\n3209\t9494\n一滴滴\t9495\n船民\t9496\n多级\t9497\n绝地求生手游助手\t9498\n喝水量\t9499\n清水岩\t9500\n阴阳人\t9501\n永州市人民政府\t9502\nIPHONE7\t9503\n莫入\t9504\n宇辉\t9505\n唐高宗\t9506\n娃娃衫\t9507\npeers\t9508\nretinal\t9509\nMercedes-Benz\t9510\n_v1.2安卓\t9511\n世奥赛\t9512\n小空\t9513\n捕蚊\t9514\n毛鸡\t9515\n方盒子\t9516\n非空值\t9517\n中共邹城市委\t9518\ngrace\t9519\n刺剑\t9520\n天族\t9521\n长牛\t9522\n淫水\t9523\nShemale\t9524\n822路\t9525\n栗林里莉\t9526\naircraft\t9527\nAKKA\t9528\nQCT\t9529\n20151206\t9530\n365招聘网\t9531\n脱衣\t9532\n正解问答-正解网\t9533\ncex\t9534\ngraduation\t9535\n倍得\t9536\n花源谷\t9537\n两滴\t9538\nCurrency\t9539\n国家烟草局\t9540\nsubstratum\t9541\n丝花\t9542\nweb3js\t9543\n汽博\t9544\n暗盒\t9545\n美国恐怖故事\t9546\n解挂\t9547\n罚分\t9548\n赵骏\t9549\n铂爵\t9550\n5000亿\t9551\n图省事\t9552\n俩家\t9553\n坡屋顶\t9554\n白桦林\t9555\n同业竞争\t9556\n仓山镇\t9557\n11个月\t9558\n精灵龙\t9559\n2010a\t9560\n十几种\t9561\n贵航股份\t9562\n董方卓\t9563\n六一小学\t9564\n渭南站\t9565\n天鹅套索\t9566\n周乔\t9567\n来了吧\t9568\n语义网\t9569\nNick\t9570\nZheng\t9571\nipr\t9572\n虚无缥缈\t9573\n成县\t9574\n展项\t9575\n7.6亿\t9576\n天谕\t9577\nindexcodes\t9578\nkok\t9579\n真人剧\t9580\neverspace\t9581\n南阳古镇\t9582\n天福号\t9583\nSasa\t9584\nFirefox\t9585\n地方特产\t9586\n23:59:59\t9587\n定式\t9588\ntumble\t9589\n未成年\t9590\nCAJviewer\t9591\n内蒙古自治区人民检察院\t9592\ngn\t9593\n1-3号\t9594\n抗规\t9595\nGuid\t9596\n亲生子\t9597\n独孤靖瑶\t9598\n颓然\t9599\n学术节\t9600\n绝版\t9601\n欧阳奋强\t9602\nminist\t9603\n和实\t9604\n浙江财经学院\t9605\n卷取机\t9606\n焊接件\t9607\n到哪儿\t9608\nsubs\t9609\n机动战士敢达OL\t9610\n不臭\t9611\n任长霞\t9612\nGibco\t9613\nextjs\t9614\napplied\t9615\n上海公寓\t9616\n全城热恋\t9617\n布雷顿森林体系\t961
8\n琢美\t9619\n胜者倾城之恋\t9620\n投资咨询公司\t9621\n监控摄像头\t9622\nClaudia\t9623\nions\t9624\n/h1\t9625\nDISTINCT\t9626\n虚拟式\t9627\n真字\t9628\n乌兰布统草原\t9629\n牙齿矫正\t9630\nex3\t9631\n袁四爷\t9632\n专制主义\t9633\n横行\t9634\n三十周年\t9635\n网工\t9636\nSupper\t9637\n压差表\t9638\n柔肤水\t9639\nasylum\t9640\n清张\t9641\n暗链\t9642\n重印\t9643\n罩衣\t9644\njameshappy\t9645\nutility\t9646\nZUIKO\t9647\n德江县政府\t9648\n立方\t9649\n大竹海\t9650\n有眼无珠\t9651\n佳能ip2780\t9652\n娇儿\t9653\nsayonara\t9654\n不要哭\t9655\n硅灰石\t9656\n315网\t9657\n往上爬\t9658\n溶栓\t9659\n庸才\t9660\n孵化箱\t9661\nqc30\t9662\ndigo\t9663\n氨溴特罗\t9664\nmotion\t9665\n年羹尧\t9666\n对不起我爱你\t9667\n真武庙\t9668\nwww.i3done.com\t9669\n山青水秀\t9670\n北京昌平区绿海家园\t9671\nk码\t9672\n俊秀\t9673\nCOUNTIF函数用法\t9674\nicy\t9675\n采购商\t9676\n数据拟合函数\t9677\n旺源\t9678\n心源\t9679\n出峰\t9680\nMJRefresh\t9681\n甄芝灵芝西洋参茶\t9682\nDrawerLayout\t9683\nCICC\t9684\n沧州市运河区\t9685\n糖衣片\t9686\n赞刷\t9687\n路劲上海派\t9688\nSpark算子\t9689\n徐海燕\t9690\nbl小说\t9691\nad18\t9692\n金祥\t9693\n人势\t9694\n贤明\t9695\n干旱\t9696\n方盒\t9697\n爱的天堂\t9698\nSwiper中文网\t9699\n40.5码\t9700\n日用品\t9701\nswiperefreshlayout\t9702\n董静\t9703\nconan_lin\t9704\n晴朗\t9705\nboxing\t9706\n中国航信\t9707\n胡桃里\t9708\n电缆线路\t9709\n离心水泵\t9710\n中甲\t9711\n翡翠华府\t9712\n桃花源小区\t9713\nAdministration\t9714\nFrom\t9715\n600795\t9716\n汽车板\t9717\n浙商博物馆\t9718\n气隙\t9719\n_迅\t9720\n猫耳螨\t9721\nFreeBuf\t9722\n龙华区\t9723\nPsy\t9724\ncad图块\t9725\n六十载\t9726\n1934年\t9727\n穞\t9728\nvac\t9729\nPermutation\t9730\n「\t9731\nmeasure\t9732\n倒闭潮\t9733\n沈阳西塔\t9734\n桂花苑\t9735\n忿怒\t9736\n超线性\t9737\nmare\t9738\nSwitcher\t9739\ncaptive\t9740\n资源化\t9741\n刘泽\t9742\n大灾变\t9743\n速品\t9744\nValiant\t9745\n雷切\t9746\n海南省考试局\t9747\n54万\t9748\n徐俊\t9749\n荔城镇\t9750\n宿\t9751\n重庆银监局\t9752\n电瓷\t9753\n厮\t9754\n南丁格尔玫瑰图\t9755\n明杆\t9756\n病毒式传播\t9757\n无磁\t9758\n蛋花\t9759\n爱货网\t9760\n6月5日\t9761\n70周年\t9762\n轮宽\t9763\n老莫\t9764\n石英坩埚\t9765\n20140831\t9766\n&#46\t9767\n旁站\t9768\n五段\t9769\nEDPASS\t9770\n4.4_\t9771\n撕拉式\t9772\n玛里苟斯\t9773\n艾妮\t9774\n天山西路\t9775\nNano\t9776\n丰田埃尔法\t9777\n192.168.1.253\t9778\n朱迪\t9779\nPotassium\t9780\nWal
mart\t9781\nregionserver\t9782\nFeynman\t9783\n搜狐汽车\t9784\n见字如\t9785\nDebussy\t9786\n阿桑\t9787\n一键大跳\t9788\n手机页游_h5\t9789\nblacksquad\t9790\nmybtis\t9791\n关系数据库\t9792\n微跌\t9793\n转商\t9794\n港东\t9795\neasytouch\t9796\n东陶中国\t9797\n滨江院区\t9798\n欧龙\t9799\n不以己悲\t9800\n鞑靼人\t9801\n搭上\t9802\n检测认证\t9803\n腾彩\t9804\nCBC有色网\t9805\ngarten\t9806\n天安人寿\t9807\n南郑\t9808\nkobold\t9809\n一儿\t9810\n三月初八\t9811\n杨柳风\t9812\nDHCP服务器\t9813\n吕梁山\t9814\n逐影\t9815\nholika\t9816\n米璐璐\t9817\n蓝罐\t9818\n无凭\t9819\n子宫日记\t9820\n北斗六星网\t9821\n高呼\t9822\n永和路\t9823\n资产负债管理\t9824\n14公斤\t9825\nMaternity\t9826\n宜川县\t9827\n宣传周\t9828\n心旷神怡\t9829\nOWL\t9830\n继保商务网\t9831\n润恒\t9832\nDetox\t9833\n邻接表\t9834\n中国统一战线新闻网\t9835\n@click\t9836\nDimash\t9837\n飞鸿无痕\t9838\n环氧砂浆\t9839\n江西省人民医院\t9840\n婚情\t9841\n雍景\t9842\n摄影\t9843\n菌业\t9844\n附报\t9845\n健康饮食\t9846\nsamples\t9847\n稻\t9848\n风管式\t9849\nDeakin\t9850\nbp=1&rsv\t9851\n香草\t9852\nlcchuguo\t9853\n处理剂\t9854\n清明上河园\t9855\n刘哇勇\t9856\n漏扫\t9857\n第十八季\t9858\nHTTP代理\t9859\n两分钟内\t9860\n国家电投集团\t9861\nYume\t9862\nETCD\t9863\n垂直投影\t9864\n湖南建工集团\t9865\n古泉\t9866\n美图t8\t9867\nzentai\t9868\n萨提亚\t9869\n4408\t9870\n鞋库\t9871\n父类\t9872\nバック\t9873\n采摘季\t9874\n草莓论坛\t9875\n油梨\t9876\n寂月皎皎\t9877\n福步\t9878\nR型\t9879\n研习\t9880\n刑案\t9881\n127平米\t9882\nvrchat\t9883\n马来西亚站\t9884\nlistcontrol\t9885\n广州大剧院\t9886\n艾博\t9887\n星羽\t9888\n一拖再拖\t9889\n长征5号\t9890\n盲山\t9891\n瑞尔齿科\t9892\n【文\t9893\n远程监控\t9894\n公德\t9895\n颇有\t9896\n酸奶盒\t9897\n紫禁之巅\t9898\n超剧场\t9899\n125.la\t9900\n17寸\t9901\n广告衫\t9902\n华为mate7\t9903\nppt_果果\t9904\n花样跳绳\t9905\n国际贸易学\t9906\n三刻\t9907\n我的滑板鞋\t9908\ngaps\t9909\nSides\t9910\n情锁\t9911\n磨刀霍霍\t9912\n李玮锋\t9913\n星际争霸:重制版\t9914\nqq音速吧\t9915\n常见问\t9916\n光辉女郎\t9917\nIntervals\t9918\n虹桥路1号\t9919\nLOCAL\t9920\n杀阡陌\t9921\nF150\t9922\nオレ\t9923\n力神\t9924\n庙宇\t9925\n茶茶\t9926\n腊肠犬\t9927\nParty\t9928\n寓义\t9929\n明星大侦探2\t9930\n索福瑞\t9931\n枉\t9932\n贝乐学科英语\t9933\n中国地质科学院地质研究所\t9934\n祈使句\t9935\n表征\t9936\n垂盆草\t9937\n填充墙\t9938\n木地板\t9939\n东电微校\t9940\n大面镇\t9941\n避孕药\t9942\n折衷\t9943\n11排\t9944\n金羊毛\t9945\n百雀羚\t9946\n逆时\
t9947\n双翅\t9948\n0X\t9949\n周公解梦\t9950\n冠状动脉\t9951\n炎王\t9952\n2016.10\t9953\n修护液\t9954\n文二路\t9955\n美姑县\t9956\n儿媳\t9957\n忍者村大战吧\t9958\n应验\t9959\n法斗\t9960\n张郎\t9961\n武夷绿洲\t9962\n万足\t9963\nSTATEMENT\t9964\n连云港站\t9965\n合成机油\t9966\nPillar\t9967\n九曲黄河万里沙\t9968\n中华好诗词\t9969\n花为媒\t9970\n菏泽站\t9971\n简测\t9972\nepg\t9973\n正义观\t9974\n由是\t9975\n德稻\t9976\n石器\t9977\n四氟乙烯\t9978\n浮动框\t9979\n密保\t9980\n核身\t9981\n钙化\t9982\n付强\t9983\n超声仪\t9984\nTED\t9985\n单挑荒野\t9986\n希尔生\t9987\n碳氢\t9988\n山西人事考试网\t9989\n穿越火线手游\t9990\n本上\t9991\n北京市少年宫合唱团\t9992\n春莺啭\t9993\nNurse\t9994\n300ap\t9995\nVGtime\t9996\n720P/MP4\t9997\n岸基\t9998\n20160613\t9999\n广州APP开发公司\t10000\n异性\t10001\n中共四川省纪委\t10002\n方特东方神画\t10003\n中航工业集团\t10004\n龙梅子\t10005\n天猫精灵X1\t10006\n老马识途\t10007\n扣扣子\t10008\n煤田\t10009\n真空封口机\t10010\n观察点\t10011\n卡玛\t10012\nsting\t10013\n2651\t10014\n滨州医学院烟台附属医院\t10015\n卡卡网\t10016\n县交通局\t10017\n高速公路路\t10018\n排烟风机\t10019\n正负号\t10020\nTt\t10021\n正星\t10022\n2018届\t10023\n修真降魔录\t10024\nkp7gt\t10025\n中国雄安建设投资集团有限公司\t10026\n李淳风\t10027\nmarley\t10028\n手钩\t10029\n30_\t10030\n振膜\t10031\nBlueStacks\t10032\nwin0\t10033\n整期\t10034\n禹鼎\t10035\nDisable\t10036\n额叶\t10037\nPID号\t10038\n郑志勇\t10039\n全村\t10040\n千落冷剑相向_墓王之王悬棺寺\t10041\n十几米\t10042\ndfu\t10043\n净现值\t10044\n6.3亿\t10045\n山文\t10046\n风间由美\t10047\ndelivered\t10048\n璞玉\t10049\n安徽医科大学第四附属医院\t10050\n白青山脉\t10051\n主动式\t10052\n易安\t10053\n陶山镇\t10054\npuky\t10055\n中国中小城市网\t10056\n一币\t10057\navtaobao\t10058\n粗放\t10059\n龙章\t10060\n奇兽\t10061\nThird\t10062\n横向分布系数\t10063\n木月\t10064\n冷情\t10065\ncreo4\t10066\nLinux新闻_Linux公社\t10067\n废帝\t10068\n半夜12点\t10069\nweitzman\t10070\nOSS\t10071\n推下\t10072\n苏州湾\t10073\n通讯稿\t10074\n刷新上拉\t10075\n刘彪\t10076\n威刚万紫千红\t10077\n芦芽\t10078\n指甲花\t10079\n39款\t10080\n喷淋塔\t10081\n鹤鸣亭\t10082\n厦门思明\t10083\n王诗龄\t10084\n正文\t10085\n顿涅茨克矿工\t10086\n易燃品\t10087\nTSR\t10088\n氧分压\t10089\nvolleyball\t10090\n鞋报\t10091\n巅峰榜\t10092\n黄河滩\t10093\n进水管\t10094\n张春雷\t10095\n东善桥\t10096\n山雪\t10097\nHumanities\t10098\n单味\t10099\nfesco\t10100\ngeodatabase\t10101\n无害化\t10102\n_德\t10
103\nFP\t10104\n小黑盒\t10105\n食料\t10106\nshortcuts\t10107\n汉口花园\t10108\n摆头\t10109\n司马懿\t10110\n三角恒\t10111\n9028\t10112\n小清\t10113\n整木\t10114\n急救包\t10115\n20160512\t10116\n里中\t10117\n骏威\t10118\n20150726\t10119\n回答\t10120\n鼻梁\t10121\n地产网\t10122\nGranger\t10123\n上币\t10124\n凉山州人民政府\t10125\n民水\t10126\n浙江理工大学科技与艺术学院\t10127\n杨子珊\t10128\n3507\t10129\n几周年\t10130\n超声相控阵\t10131\n10宗\t10132\n蚌精\t10133\n延安火车站\t10134\nv4.1.8\t10135\n洪都拉斯\t10136\n宝武\t10137\nquenching\t10138\n凭证式国债\t10139\n貂裘\t10140\n故事学\t10141\n卫生学\t10142\n违禁品\t10143\nSCAN\t10144\n5.5下\t10145\n开箱照\t10146\n五州\t10147\n兵库县\t10148\nxiqu\t10149\nUC\t10150\n高一级\t10151\n踏查\t10152\n掘金\t10153\n造像\t10154\n卖亏\t10155\n真北斗无双\t10156\n中国航空工业集团公司\t10157\n金裕\t10158\n【夜\t10159\n46期\t10160\nFNAa\t10161\n食府\t10162\n月光鱼\t10163\n撒切尔夫人\t10164\n龙柏新村\t10165\n坞\t10166\n厦门双十中学\t10167\n拾肆\t10168\n王晓燕\t10169\n老毕\t10170\n生物质燃烧机\t10171\n华南师大附中\t10172\n泡泡鱼\t10173\n文件行\t10174\n踏访\t10175\n上海交通大学媒体与设计学院\t10176\n吴明隆\t10177\n棒\t10178\nonmouseout\t10179\n3440x1440\t10180\nKVO\t10181\n番号库\t10182\ndina\t10183\n迈克尔·乔丹\t10184\n重庆市教育考试院\t10185\n八年前\t10186\n小升初择校_深圳奥数网\t10187\n8.1.3\t10188\n熊伟\t10189\n安全监督信息网\t10190\n滑水\t10191\n无痕浏览\t10192\n费曼\t10193\n寇岛\t10194\n火妖\t10195\n星星之火\t10196\ndanny\t10197\n国强\t10198\n凿子\t10199\n陈泓辰\t10200\n精选集\t10201\n鹤壁政府网\t10202\nbootcamp\t10203\n康辉旅行社\t10204\n爵士乐与不插电新编12首\t10205\n贝克曼库尔特\t10206\n呼出\t10207\n沙坪坝火车站\t10208\n北锣鼓巷\t10209\n微交\t10210\n小米电视4c\t10211\n三中\t10212\n情陷野山村\t10213\nesim\t10214\n2304\t10215\n科朗\t10216\n糯米粉\t10217\n安坦\t10218\nFuller\t10219\nspringMVC\t10220\n防盗版\t10221\n海洋之心\t10222\n好版\t10223\n神丹\t10224\n人生哲理_小故事大全\t10225\n晋文公\t10226\n运筹帷幄\t10227\n东风Honda\t10228\n男卑\t10229\n阿尔勒\t10230\n跳出率\t10231\n潼\t10232\n王顾\t10233\n10339手游网\t10234\n枣矿集团\t10235\n希财网\t10236\nvalor\t10237\n几折\t10238\n安图生物\t10239\n2017年4月17日\t10240\n夜航船\t10241\n130000\t10242\n7.1.8\t10243\n拉尔森\t10244\n文酱\t10245\n县\t10246\n5.06\t10247\n十几\t10248\n字节\t10249\n四进四信\t10250\n湖北省宜昌市第一中学\t10251\n期初数\t10252\n外经贸\t10253\nDUOWAN\t10254\n急转让\t10255\n磁盘管理器\t102
56\n40分\t10257\n欧洲卡车模拟2mod\t10258\n丽珠集团\t10259\n奈飞\t10260\n网络视频监控软件\t10261\nCameron\t10262\nphobia\t10263\n沪闵路\t10264\n韩媒\t10265\n知行近思\t10266\n华阳街道\t10267\n石土\t10268\n三格\t10269\n风会\t10270\n制色\t10271\n不归\t10272\n针灸疗法\t10273\n夕阳西下\t10274\n观察室\t10275\n太奥广场\t10276\n规避\t10277\n林天龙\t10278\n瑞士\t10279\n薄厚\t10280\n7期\t10281\nPwC\t10282\n为戒\t10283\n和平里医院\t10284\n美容美\t10285\nclues\t10286\n苏航\t10287\n逆天\t10288\n南昌西站\t10289\n双顶\t10290\nサイト\t10291\n20170213\t10292\n童歌\t10293\n12月19日\t10294\n吴中大道\t10295\n印第安纳州\t10296\n芝心\t10297\n搜狐博客\t10298\n中少\t10299\nwwwbbb811com\t10300\n秘密文件\t10301\n民系\t10302\n湖南财经学院\t10303\n落健\t10304\n邮资机\t10305\n99美元\t10306\n上海自贸港\t10307\n介入治疗\t10308\ngrub.cfg\t10309\nApplying\t10310\nb套\t10311\n赵小彬\t10312\n无毛猫\t10313\n鲜花朵朵\t10314\nurl\t10315\nISO镜像\t10316\nccm+\t10317\n一个个\t10318\n地方税\t10319\n杏花\t10320\n2015年清明节\t10321\n穆棱市\t10322\n和气\t10323\n溱潼\t10324\nsuffered\t10325\n胡克定律\t10326\n财政部会计资格评价中心\t10327\n胡国强\t10328\n北京能源集团有限责任公司\t10329\ndjye\t10330\n扬州市教育局\t10331\n华义娟\t10332\n茸毛\t10333\n天弘中证\t10334\n爱不溯\t10335\n朱蕉\t10336\n252号\t10337\n宫颈癌疫苗\t10338\nchup\t10339\n策划案\t10340\n3500元\t10341\n电图\t10342\n长城杯\t10343\n重庆市南川区人民政府\t10344\nAnimations\t10345\n热塑\t10346\nasymmetric\t10347\n墙膜\t10348\n想你想你\t10349\n傅雷家书两则\t10350\n莞企\t10351\n单户\t10352\nJqgrid\t10353\n250欧\t10354\n一布\t10355\n南湖路\t10356\n第三方物流公司\t10357\n253号\t10358\nArchGo\t10359\n融创臻园\t10360\n眯\t10361\n粤语学习网\t10362\n压滤\t10363\n几里\t10364\n封管\t10365\n小说月报\t10366\n办实\t10367\n空客320\t10368\ndeletion\t10369\n创刊\t10370\n情欲\t10371\ndijkstra\t10372\nobsession\t10373\n桩芯\t10374\n网赚博客\t10375\nWear\t10376\n高毒\t10377\n上海市实验小学\t10378\nlighthouse\t10379\n金红石型钛白粉\t10380\nfru\t10381\nioredis\t10382\n傲斗凌云\t10383\n紫珍珠\t10384\n探阴山\t10385\n频道\t10386\n来源\t10387\n盛时\t10388\n清样\t10389\n圖片\t10390\n两道\t10391\nHills\t10392\n法定假期\t10393\n合兴\t10394\ncty\t10395\n持仓\t10396\nsilenceer\t10397\n蓝瞳\t10398\nVue中this.$router.push\t10399\n科技传媒网\t10400\n牛奶片\t10401\n国国\t10402\n麻醉师\t10403\n市自来水公司\t10404\n香澄\t10405\n附赠\t10406\n康城小区\t10407\n淘乐\t10408\n疏
肝理气\t10409\n师宗\t10410\n蒋斌\t10411\n入\t10412\n485口\t10413\n顺峰山\t10414\n吸塑\t10415\n三限\t10416\n清凉山\t10417\n九龙壁\t10418\n河西学院\t10419\n杨杰\t10420\n2.78\t10421\n第92期\t10422\n销方\t10423\n1.52G\t10424\n雅尔塔\t10425\nBella\t10426\n程潇毕福剑\t10427\n黄变\t10428\n套线\t10429\n物理层\t10430\nconcentration\t10431\n沧浪之水\t10432\n詹俊\t10433\n海安港\t10434\n缙绅\t10435\ntaishan\t10436\n免费法律咨询\t10437\n活期\t10438\nstarless\t10439\n往右\t10440\n机电设备有限公司\t10441\n雨神\t10442\n科印\t10443\nMineski\t10444\nbernard\t10445\n酷开电视\t10446\nsh文件\t10447\n重庆渝富资产经营管理集团有限公司\t10448\nHockey\t10449\nreb\t10450\nobjc\t10451\n白芷\t10452\n北庄\t10453\n上课铃\t10454\n汉宫秋\t10455\n柚子树\t10456\nTurismo\t10457\nmelty\t10458\n低空飞行\t10459\n统揽\t10460\n双杆\t10461\n哺乳假\t10462\n农业大学\t10463\n得饶人处且饶人\t10464\n金梭\t10465\n云淡风清\t10466\n350吨\t10467\n编配\t10468\n广西壮族自治区\t10469\n誠\t10470\n六道口\t10471\n曼卡龙\t10472\n夏加尔\t10473\n马关\t10474\n北境\t10475\n600418\t10476\n何其多\t10477\n马特乌斯\t10478\nDassault\t10479\n台湾师范大学\t10480\n1000斤\t10481\n小园\t10482\n大足区\t10483\n成虫\t10484\nNATAPP\t10485\n阴妻\t10486\n舌边\t10487\n史迪威\t10488\n3月19\t10489\n黑龙江中医药大学\t10490\nESP\t10491\n靖江论坛\t10492\n特恩布尔\t10493\n14bit\t10494\n天气\t10495\nMA5680T\t10496\n甲酸\t10497\n盆地\t10498\n杀牙\t10499\n迫不及待\t10500\n赵旭日\t10501\n董波\t10502\nMirrorLink\t10503\n甘洛县人民政府\t10504\n凡仙\t10505\n网易我的世界吧\t10506\n费舍尔\t10507\n训养\t10508\nTypeC\t10509\n秋涛北路\t10510\n超然台\t10511\n12招\t10512\n筋哥\t10513\nlongtap\t10514\n冲压模\t10515\nHandles\t10516\n上海海事局\t10517\ngrs\t10518\n邹文\t10519\n万豪金业\t10520\n立身\t10521\n长春楼盘网\t10522\nguy\t10523\n商业银行\t10524\n聚氨酯胶黏剂\t10525\n市界\t10526\n范玮琪\t10527\n沈丹萍\t10528\nc++2010\t10529\n米拉·乔沃维奇\t10530\n用花\t10531\n丁培飞\t10532\nM1216nfh\t10533\n炉炉\t10534\n3号\t10535\n玛机雅娜\t10536\n重庆西部物流园\t10537\n5310\t10538\n唐慧琳\t10539\n权益金\t10540\n中晟\t10541\n惠东\t10542\nBBC_腾讯\t10543\n风骚律师\t10544\n检查项\t10545\n小白盒\t10546\n爆发式\t10547\nledge\t10548\n百度推广\t10549\n塔影\t10550\n副市\t10551\n江苏教育出版社\t10552\n武进\t10553\n马培德\t10554\n黄勃\t10555\n心软\t10556\n抛开\t10557\nsoy\t10558\n翁同龢\t10559\n7.51\t10560\n西博城\t10561\n销项\t10562\n小原\t10563\n十殿阎罗\t10564\n
读取\t10565\n105L\t10566\n唐丽\t10567\nBAI\t10568\n马威\t10569\nrdquo\t10570\n废墨\t10571\n贰佰\t10572\n码牌\t10573\n梁加筋\t10574\n正部\t10575\n凝聚力\t10576\n造谣\t10577\nfopen函数\t10578\n繁昌\t10579\n光动力\t10580\nCamfrog\t10581\n孝德\t10582\n波普\t10583\n求道\t10584\n透明度\t10585\n迎驾贡酒\t10586\n新地\t10587\nDig\t10588\nanaly\t10589\n拨片\t10590\n苏哥\t10591\nwp7吧福\t10592\nTeamviewer12\t10593\n合成气\t10594\n健身舞\t10595\n王春梅\t10596\n决解\t10597\nStrength\t10598\n太短\t10599\n太仓港\t10600\n天蓝色\t10601\n东莱\t10602\n凹凸面\t10603\n2106\t10604\nlumia1520\t10605\n房盖\t10606\n500第一\t10607\n三安光电\t10608\n辽宁省环境保护厅\t10609\n成套设备\t10610\n天津音乐学院\t10611\nyeti\t10612\nMysterious\t10613\nspilt\t10614\n桥体\t10615\n地老虎\t10616\n盈余管理\t10617\n法兰尼\t10618\n性事\t10619\n黑妞\t10620\n姜珊\t10621\nterraria\t10622\n10箱\t10623\n500岁\t10624\nSerialization\t10625\n爬楼梯\t10626\n锦州\t10627\n可信\t10628\n一百美元\t10629\natoll\t10630\nQoo\t10631\n安庆日报\t10632\n昌景黄高铁\t10633\n金棋\t10634\n中国惊奇先生\t10635\n毕节市七星关区\t10636\n海淀实验小学\t10637\nparticle\t10638\nSolidworks2015\t10639\n楚云飞\t10640\n尚品宅配\t10641\np10p\t10642\n孝感学院\t10643\nFastcopy\t10644\n56首\t10645\n映射\t10646\nnpt\t10647\n宝马X4\t10648\n盼\t10649\n半把\t10650\n林甸\t10651\n普什\t10652\n高文秀\t10653\n宁致远\t10654\n工程部\t10655\n空翻\t10656\n大性\t10657\n盖帽\t10658\n黄河谣\t10659\nSymbian\t10660\n18ACG\t10661\n约德尔人\t10662\n巨难\t10663\n蝎男\t10664\n96个\t10665\n声律启蒙\t10666\n志鸟村\t10667\n北京老年医院\t10668\n建博会\t10669\n刘家良\t10670\n不巧\t10671\n日区\t10672\n钢圈\t10673\n胃肠病\t10674\nmx-5\t10675\nCarat\t10676\n虚焊\t10677\n0xbf\t10678\n猪芳芳\t10679\n海瑟薇\t10680\nprovincial\t10681\n大台\t10682\nzxc\t10683\n凝华\t10684\n国际金融报\t10685\nkx3552\t10686\n亚瑟·潘德拉贡\t10687\n井盖板\t10688\n竞\t10689\n塞勒\t10690\n美国购物网\t10691\n上海市肺科医院\t10692\ntpg\t10693\n弹簧测力计\t10694\n苏扶疏\t10695\nsqlstate\t10696\n坑村\t10697\n中厅\t10698\nK2HD\t10699\n南瑞继保\t10700\nillustrate\t10701\n王念\t10702\n揠苗助长\t10703\n退伍证\t10704\n宁波中医院\t10705\n弄清\t10706\n第210集\t10707\n九叔\t10708\n复审委\t10709\nSTEP7\t10710\nc型钢\t10711\n远程继续教育\t10712\n一袋\t10713\n美女上司\t10714\n三菱劲炫ASX\t10715\n中国电力建设集团\t10716\nradiation\t10717\nw64\t10718\n第几排\t107
19\n卢毅\t10720\n药化\t10721\nWord2vec\t10722\n魔兽之路\t10723\n中国地质环境监测院\t10724\nPurple\t10725\n512MB\t10726\n未来论\t10727\ngionee\t10728\n口外\t10729\n酷网\t10730\n滨莱高速\t10731\n光分路器\t10732\n海思麒麟\t10733\nContaining\t10734\n医见\t10735\n长条\t10736\n四封\t10737\n艰险\t10738\nAirways\t10739\n15载\t10740\nDisease\t10741\n秦超\t10742\n合集包\t10743\n雾中风景\t10744\nLaunchImage\t10745\n冬瓜\t10746\nThinkCMF5\t10747\nrea\t10748\n国家自然科学基金委员会管理科学部\t10749\n建桥\t10750\nF2.8\t10751\n就教\t10752\n北京西直门\t10753\n德塔\t10754\n六号位\t10755\n杨家将全传\t10756\n答非所问\t10757\nsullivan\t10758\n文景\t10759\n現場\t10760\n151个\t10761\n63分\t10762\n记事\t10763\nR级\t10764\n商显\t10765\n4G内存条\t10766\n定胜天\t10767\n林语莫沉\t10768\n伊夫\t10769\n油润\t10770\n宽高\t10771\nSub\t10772\n儿童公园\t10773\n家常饭\t10774\n大麦\t10775\n千里香\t10776\n史氏\t10777\n奇奇妙\t10778\n钱进\t10779\nDirectX\t10780\n宏信\t10781\n玫瑰果油\t10782\n压单\t10783\nscz\t10784\n水蜜\t10785\n糖醋汁\t10786\n瓦斯琪尔\t10787\n投标保证金\t10788\n聯盟\t10789\n中南置地\t10790\n团体险\t10791\n晟\t10792\n乐视1s\t10793\n仙境传说\t10794\n三只手\t10795\n磕磕绊绊\t10796\n证厅\t10797\n晓\t10798\n西屋\t10799\nSix\t10800\n35斤\t10801\n数字化仪\t10802\n东易日盛\t10803\n进取型\t10804\n单应性矩阵\t10805\n下山\t10806\n秦皇岛市人力资源和社会保障局\t10807\nDefaults\t10808\n王英杰\t10809\n摸鸡\t10810\n江西师大附中\t10811\n城市居住区规划设计规范\t10812\ndht\t10813\n于扬\t10814\n塞进\t10815\n祖贤\t10816\n第十八集\t10817\n三都澳\t10818\n曹安公路\t10819\n为爱\t10820\n上海瑞慈体检中心\t10821\n飓风营救1\t10822\n喀什地区\t10823\nAETOS\t10824\nMonopoly\t10825\n荣威绝世武神\t10826\nreddit\t10827\n啦啦啦德玛西亚\t10828\n王建祥\t10829\n沧海一粟\t10830\n赘婿\t10831\n远景路\t10832\nworkers\t10833\n武库\t10834\nflushall\t10835\n10000米\t10836\n参谋部\t10837\n63批\t10838\n银阁寺\t10839\n清软喵\t10840\n雷克萨斯rx\t10841\n七宗\t10842\n卡级\t10843\n劲板\t10844\n2.7.0\t10845\n自我教育\t10846\n空气质量\t10847\njex\t10848\n8ms\t10849\n暗月马戏团\t10850\n磁珠\t10851\n韩晓春\t10852\nBump\t10853\n大岭\t10854\n271路\t10855\n羊汤馆\t10856\n遭禁\t10857\n马进\t10858\n资料集\t10859\n压力\t10860\n低聚果糖\t10861\n王耀武\t10862\n交道\t10863\n扩管\t10864\nATN\t10865\nEVP\t10866\n面部脂溢性皮炎\t10867\n黄志玮\t10868\n夠\t10869\nNYU\t10870\n枫伶忆\t10871\n欧卡2\t10872\n衡水一中\t10873\n金牛星\t10874\n倾巢\t1087
5\n锦屏镇\t10876\nsm虐恋SM\t10877\n会生\t10878\n11月16日\t10879\nCIFAR\t10880\nFest\t10881\n控制台\t10882\n安东尼·布朗\t10883\n不可爱\t10884\n西条琉璃M罩杯\t10885\n展锐\t10886\n意守\t10887\n惠安县\t10888\nInterbrand\t10889\n4月21\t10890\n假盘\t10891\nAmes\t10892\n腹腔镜手术\t10893\n5月1日前\t10894\n蓬蒿\t10895\nProduce\t10896\n花腔\t10897\njtree\t10898\nDebugo\t10899\nES8\t10900\n同志群\t10901\nImagenet\t10902\n欧拉\t10903\n欧阳娜娜\t10904\n京华吴京\t10905\n服务\t10906\nchico\t10907\n利乐\t10908\n75万元\t10909\n演示片\t10910\n大南\t10911\n卡通字\t10912\n随案\t10913\n咯吱窝\t10914\n市场法\t10915\n神剑\t10916\n斯坦福\t10917\n九都路\t10918\n徒行者\t10919\n183&\t10920\n油盐\t10921\n6500\t10922\n白银帝国\t10923\n1556060\t10924\n名写\t10925\n汉匈全面战争\t10926\nCPL\t10927\n煮意\t10928\n水鹤\t10929\n齐奥塞斯库\t10930\n决断力\t10931\n跬步\t10932\n肿么破\t10933\n石智勇\t10934\n切沃\t10935\n旁系亲属\t10936\n回手\t10937\n108平\t10938\n第90\t10939\n玉匣记\t10940\n钝化剂\t10941\n职业学院\t10942\n2册\t10943\n烤\t10944\n氟康唑片\t10945\n电锯惊魂7\t10946\n3722\t10947\n凑热闹\t10948\nOPPOA57\t10949\n液化\t10950\nAliGenie\t10951\n张宝文\t10952\n赫者\t10953\nLeanCloud\t10954\nFiles\t10955\n庞博\t10956\ntxt&id\t10957\nExtrusion\t10958\njavas\t10959\n不婚族\t10960\n球笼\t10961\n珠光路\t10962\ndiffuse\t10963\n当仁不让\t10964\nWILD\t10965\n去海边\t10966\n报评\t10967\nMito\t10968\n宝殿\t10969\nTRI\t10970\n刊出\t10971\n伍氏\t10972\n在劫难逃\t10973\n细雨\t10974\n丑丑\t10975\ndl360\t10976\n大华会计师事务所\t10977\n超级武器\t10978\n龙剑飞\t10979\n合格证\t10980\n新刀妹\t10981\n中国网球协会\t10982\n生肖酒\t10983\ndhc\t10984\n木荷\t10985\n白蛋白\t10986\n大伙们\t10987\n情志\t10988\n增值税起征点\t10989\n礼裙\t10990\n523Li\t10991\n打虫\t10992\n升价\t10993\nEFFECT\t10994\nlog格式\t10995\n74期\t10996\n吴建明\t10997\n世纪农药网\t10998\n遗产地\t10999\nREADY\t11000\n浪漫主义\t11001\n就算\t11002\nreplicated\t11003\n黄店镇\t11004\n重返德军\t11005\nэ\t11006\n穿心莲内酯滴丸\t11007\n新疆维吾尔族自治区\t11008\nWebcams\t11009\n第72届\t11010\n处理量\t11011\n银月城\t11012\n简道\t11013\n若狭\t11014\n调运\t11015\n亚版\t11016\nocg\t11017\n濯水古镇\t11018\n硬好\t11019\nSudo\t11020\n小河直街\t11021\nRAM\t11022\n7500万\t11023\n铁\t11024\n道琼斯工业指数\t11025\nnod\t11026\n往复泵\t11027\nmua\t11028\n吃苦\t11029\n爱迪\t11030\n上班时间\t11031\n兴国路\t11032\n
タイトスカ\t11033\n汇业\t11034\n流溪\t11035\nTess\t11036\n信用网\t11037\nSear\t11038\n大小\t11039\n2061\t11040\n外科医\t11041\n40年代\t11042\n黑头\t11043\n还能不能\t11044\n神堂峪\t11045\neccel\t11046\nHLS\t11047\n大型化\t11048\n山东省农业科学院\t11049\n瞬移\t11050\n野蛮\t11051\nM230\t11052\n五十道\t11053\n铁龙\t11054\n保定职业技术学院\t11055\n二组\t11056\n张文军\t11057\n茵陈蒿\t11058\n陈教授\t11059\nGnuplot\t11060\n烟台富士康\t11061\n同学的名义\t11062\npva\t11063\n取色\t11064\n拿什么拯救你我的爱人\t11065\nApplications\t11066\nCarriage\t11067\n锅巴\t11068\nlovefou\t11069\n统考\t11070\n河南农业职业学院\t11071\n乔叶叶\t11072\n下式\t11073\n冤不冤\t11074\n玉泉区\t11075\nFF91\t11076\n多西他赛注射液\t11077\n传祺gm8\t11078\n难逃\t11079\n建业天筑\t11080\n随机血糖\t11081\n顺络\t11082\n5w30\t11083\nglamour\t11084\n鸟箱\t11085\n善成\t11086\n融创玖阙府\t11087\n红冠\t11088\n同贷书\t11089\n餐垫\t11090\ndnf剑豪\t11091\n青龙峡\t11092\njfreechart\t11093\n广东省科学技术协会\t11094\n姑姑\t11095\ntabview\t11096\n阜新\t11097\n俏美\t11098\n安徽城市管理职业学院\t11099\n中山市人民政府石岐区办事处\t11100\n宋丽\t11101\n宜信公司\t11102\n微单数码相机\t11103\n大麦青汁\t11104\n摩卡咖啡\t11105\n北旅\t11106\n永丰社区\t11107\n搜狗词库\t11108\n王红\t11109\nJRebel\t11110\n居间服务合同\t11111\n鞍山站\t11112\n茶文\t11113\nwin10下\t11114\n文艺版\t11115\n加特林\t11116\n细绳\t11117\n二十四点\t11118\n曹操\t11119\n观物\t11120\n天津汽车网\t11121\n齐性\t11122\n剪发\t11123\n0.6cm\t11124\n双星物语2\t11125\n马红俊\t11126\n张业遂\t11127\n追授\t11128\n王奕心\t11129\n供品\t11130\n上海烟草\t11131\n特准\t11132\n第37次\t11133\n悠米\t11134\n53页\t11135\n浩然\t11136\n交通大学\t11137\n回编\t11138\n钅\t11139\n深圳房产网\t11140\n小村庄\t11141\n江铃汽车股份有限公司\t11142\n三江口\t11143\n美羊羊\t11144\n滚筒洗衣机\t11145\n紫金矿业集团股份有限公司\t11146\n6098\t11147\n婴儿鞋\t11148\n防爆电动葫芦\t11149\n两生花\t11150\n小蓟\t11151\n芝加哥艺术学院\t11152\n此剧\t11153\n比特币资讯网\t11154\nbot\t11155\n理享\t11156\n各式\t11157\n审计\t11158\n霍轻\t11159\nforget\t11160\n西南交通大学远程与继续教育学院\t11161\n江门万达广场\t11162\n点阵\t11163\n褐煤\t11164\n牙期\t11165\ngabbana\t11166\n长塘镇\t11167\n肖根\t11168\n室性\t11169\n通州市\t11170\nat89c51\t11171\n汽油味\t11172\n赘皮\t11173\n广西投资集团\t11174\n4厘米\t11175\n1920X1080_\t11176\nChopard\t11177\nRSLogix5000\t11178\n宜华健康\t11179\n佛殿\t11180\n中港\t11181\n郑昊\t11182\n独头蒜\t11183\nDanny\t11184\n怒骂\t11185\
n小苏打水\t11186\n潘家园眼镜城\t11187\n移动互联网+\t11188\n李姐\t11189\n审议稿\t11190\n导火索\t11191\n谣传\t11192\n尼伯龙根之歌\t11193\n38套\t11194\n凤凰体育\t11195\n痞气\t11196\n送审\t11197\n木旺\t11198\ncreateprocess\t11199\n通知类\t11200\n财鱼\t11201\n2018-03-15\t11202\n跳起来\t11203\n100e\t11204\n长城宽带吧\t11205\n模拟农场17\t11206\nPRG\t11207\n英联邦运动会\t11208\n第一学段\t11209\n徒弟们\t11210\n天降贤淑男\t11211\n广播\t11212\n瑟兰迪尔\t11213\n车顶灯\t11214\n3158美容网\t11215\nstring,string\t11216\n咕噜\t11217\ngaming\t11218\n函谷关\t11219\n电子信息网\t11220\nMeizu\t11221\n5Ds\t11222\n北京汽车股份有限公司\t11223\n另一个世界\t11224\n传递者\t11225\nqd\t11226\n东滩湿地公园\t11227\n美游\t11228\n采集端\t11229\n镜片\t11230\nroyalty\t11231\ngss\t11232\n光瓶酒\t11233\n元洲\t11234\n嘉兴港区\t11235\n东方财\t11236\n核基地\t11237\n陈有西\t11238\n关锁\t11239\n92路\t11240\n富海\t11241\n玄甲\t11242\n广州公办小学\t11243\n平台币\t11244\n36厘米\t11245\n电子电路图\t11246\n九五\t11247\n万一\t11248\n不惑\t11249\n分装\t11250\n贝菲特\t11251\n宋老三\t11252\n高羊茅\t11253\nSCX-3401\t11254\nDecker\t11255\n文广新\t11256\n海越股份\t11257\n么么哒\t11258\nAPPSTORE\t11259\n15年内\t11260\n坐莲\t11261\n海南省商务厅\t11262\n亚宝药业\t11263\n力宇\t11264\nFrancisco\t11265\n威固\t11266\n弃婴\t11267\n莱顿\t11268\n安魂\t11269\n42030\t11270\n云计算服务平台\t11271\n王宪魁\t11272\n黑土地\t11273\n双子座男\t11274\n通过\t11275\n伏案\t11276\n安陵容\t11277\n纪委监委\t11278\n娜塔莉·波特曼\t11279\n月之民\t11280\n张量分析\t11281\n互相帮助\t11282\n阿托斯\t11283\nyapi\t11284\nCMPP\t11285\nsysconfig\t11286\n犯法\t11287\n郑州新区\t11288\naro\t11289\n咸宁路\t11290\n密林\t11291\nORA-12514\t11292\ntnm\t11293\n三十岁\t11294\n沈佺期\t11295\n冯晓强\t11296\n放放电影网\t11297\n海景公寓\t11298\nZW\t11299\n第二十六届\t11300\n难掩\t11301\n红山森林动物园\t11302\n/option\t11303\nfacet\t11304\n学分制\t11305\n六环\t11306\n博雅生物\t11307\n装饰线\t11308\n存钱罐\t11309\n电闪\t11310\n亡\t11311\n钢筋砼\t11312\n婚前试爱\t11313\n10km\t11314\nFlies\t11315\nCentos\t11316\n三杰\t11317\n哈罗电单车\t11318\n26.2\t11319\n缔造\t11320\n充分条件\t11321\n365nm\t11322\n江苏理工学院\t11323\nmytheresa\t11324\n3cd\t11325\nLUV\t11326\n2017年10月31日\t11327\n乔菲\t11328\n1793\t11329\n石油路\t11330\n主页君\t11331\n淮水安澜\t11332\n电子地\t11333\n几袋\t11334\n吃素\t11335\n注气\t11336\n美年\t11337\n15类\t11338\n265.com\t11339\n武汉
海尔\t11340\n鑫源\t11341\n固废处理\t11342\ncurr\t11343\n抨击\t11344\n祖巫\t11345\nJabber\t11346\n保家仙\t11347\n完户\t11348\n王恒屹\t11349\n调节剂\t11350\ntime_t\t11351\n美国注册公司\t11352\n康洪雷\t11353\n疼痛\t11354\n第十二季\t11355\n暗股\t11356\n派乐\t11357\n聚砜\t11358\n第三眼\t11359\n父子雄兵\t11360\n辰山植物园\t11361\n闽南日报\t11362\nregal\t11363\n中国流动科技馆\t11364\n北京理工大学\t11365\n倪景阳\t11366\n思量自难忘\t11367\n雅兰仕\t11368\naln\t11369\n三声炮\t11370\n小二_\t11371\n函数值\t11372\n舆论学\t11373\n骨化三醇\t11374\nJ.K.罗琳\t11375\n妹妹们\t11376\n我的太阳\t11377\n八歧大蛇\t11378\n烂苹果乐园\t11379\n樱桃萝卜\t11380\n江苏恩华药业股份有限公司\t11381\nNero8\t11382\n58级\t11383\n死魂灵\t11384\n雪姨\t11385\n争优\t11386\n覆叠轨\t11387\n叶世荣\t11388\n胡主席\t11389\n米氮平\t11390\n原花青素\t11391\ngce\t11392\n五得利\t11393\n兴华路\t11394\n澳门金沙娱乐场\t11395\n1966年\t11396\n东府\t11397\n纷舞妖姬\t11398\n360浏览器\t11399\n惨遭\t11400\n烟台新闻网\t11401\n阿诚de窝\t11402\n毫发\t11403\n毛舜筠\t11404\n中营\t11405\n快乐酷宝\t11406\n酸酱兔\t11407\n盛世长城\t11408\n帮助类\t11409\n乐观\t11410\n痴呆症\t11411\n鬼龙\t11412\nSR1000AT-ISE\t11413\n十六世纪\t11414\n小肠经\t11415\n香功\t11416\n香仁堂\t11417\n中科院物理所\t11418\nN40\t11419\n就位\t11420\n倒冲\t11421\n败落\t11422\n八九十\t11423\n自治区人民政府\t11424\nserv\t11425\nFESCO\t11426\n济宁广播电视台\t11427\nxcode5\t11428\n西北逍遥\t11429\n生工\t11430\n第十六回\t11431\n溪山温泉\t11432\n喜美\t11433\nlgs\t11434\n腰包\t11435\nATI\t11436\nFlight\t11437\n性爱\t11438\n蓝莓树\t11439\n春之歌\t11440\nfez\t11441\nLee\t11442\n一个字\t11443\n血魔\t11444\n墙壁式\t11445\n合并单元格\t11446\n中语\t11447\nf538\t11448\nChartered\t11449\n湖北省监察委员会\t11450\n濃厚\t11451\n伯母\t11452\n和平西路\t11453\n600008\t11454\nutf8_general_ci\t11455\n轩尼斯\t11456\n云南省科学技术厅\t11457\n170419\t11458\n好比\t11459\n忘年交\t11460\n好端端\t11461\n印度宗教\t11462\n5636\t11463\n攀枝花东区\t11464\nMicrobial\t11465\n医见钟情\t11466\n事业宫\t11467\nvivosmart\t11468\n点点\t11469\n京南\t11470\nopcache\t11471\n3厅\t11472\nKanye\t11473\n周渔\t11474\n大样\t11475\n陨石术\t11476\nZcash\t11477\n拖地机\t11478\n乳化剂\t11479\n立冬\t11480\n大观区\t11481\n北京山\t11482\nmjrefresh\t11483\n街道\t11484\n早晨5点\t11485\n第107期\t11486\n狰兽\t11487\n麒麟935\t11488\nつ\t11489\n永和街\t11490\n大利月\t11491\n叶螨\t11492\n断块\t11493\nmotors\t11494\n德安东尼\t11
495\n阿尔卑斯山\t11496\n试井\t11497\n亿秀网\t11498\n杨小\t11499\n疾险\t11500\n怨\t11501\n资溪\t11502\n万城镇\t11503\nTAMA\t11504\n冬天\t11505\n手旁\t11506\n爹利\t11507\nbuil\t11508\n金鸡山\t11509\n现贷\t11510\nample\t11511\n狮子猫\t11512\n电动绞肉机\t11513\n律师袍\t11514\n四川省交通运输厅\t11515\n002714\t11516\n速断\t11517\nPilot\t11518\n十虎\t11519\n屠宰场\t11520\n掐头\t11521\n大括号\t11522\n月牙泉\t11523\n铁一中\t11524\n有机体\t11525\n特变电工\t11526\n壮志高飞\t11527\n朱翊钧\t11528\n洁净\t11529\n岸\t11530\n1376\t11531\n鹅湖\t11532\n微影\t11533\n51秒\t11534\nSTUDY\t11535\n人保公司\t11536\n复制\t11537\n腐漫画网\t11538\nBD1024\t11539\n利群股份\t11540\nfalv\t11541\n休閒\t11542\ncs6破解版\t11543\n宁都\t11544\nclude\t11545\n进口药\t11546\n银河历险记\t11547\ncor\t11548\n牧羊\t11549\n经久\t11550\n聚光科技(杭州)股份有限公司\t11551\n光证\t11552\n张玉清\t11553\n加沙\t11554\n容积率\t11555\n企业形象片\t11556\n山东医院\t11557\n燕子窝\t11558\n纵然\t11559\n一大碗\t11560\n洁尔阴洗液\t11561\n前座\t11562\n王璇\t11563\n记忆枕\t11564\nXcode7\t11565\n前程无忧手机网\t11566\n风雹\t11567\n三十三种\t11568\nprenatal\t11569\nproceeds\t11570\n纸头\t11571\nUnity3d.com\t11572\n谢安\t11573\n劲炫ASX论坛\t11574\n锦晃星\t11575\n题解\t11576\n陶伟\t11577\n柯维\t11578\n海威华芯\t11579\n磁介质\t11580\n_飞华\t11581\n单亲爸爸\t11582\n碧血\t11583\n第66章\t11584\n国士无双黄飞鸿\t11585\n观致3论坛_汽车之家论坛\t11586\ndefi\t11587\nNewzoo\t11588\n无线充电器\t11589\n牛仔衣\t11590\n翻手\t11591\n虹螺山\t11592\njansson\t11593\n绍兴市人民政府办公室\t11594\n4型\t11595\n第25天\t11596\nKasumi\t11597\nusim卡\t11598\ntubulin\t11599\n10亿美元\t11600\n培乐多彩泥\t11601\n李奕\t11602\nSuburban\t11603\n普洱茶\t11604\n九围\t11605\n辰华\t11606\n鬼混\t11607\n约翰·肯尼迪\t11608\n南方凤凰台\t11609\n公示牌\t11610\n溶剂型\t11611\nrinex\t11612\n金彭电动车\t11613\n和君\t11614\n高庙\t11615\nresponding\t11616\nhama\t11617\n八遍\t11618\ngmc\t11619\n乌审旗人民政府\t11620\nCABLE\t11621\n育秧\t11622\nGeoff\t11623\npp卡\t11624\n体育管理专业\t11625\n舞队\t11626\n20152016\t11627\nHui\t11628\n一_39\t11629\n3.0m\t11630\n反倾\t11631\n海难\t11632\noab\t11633\n维氏硬度\t11634\n第聂伯河\t11635\nLable\t11636\n军事训练营\t11637\n易房网\t11638\n正火\t11639\n亦然\t11640\n新古惑仔之少年激斗篇\t11641\n南华期货股份有限公司\t11642\nBuilding\t11643\n卖地\t11644\n不宁\t11645\n西野カナ\t11646\nGill\t11647\n半张\t11648\n东陵区\t11649\n税发\
t11650\n音乐版\t11651\n3.9.4\t11652\n捷普\t11653\n150亩\t11654\n万源街\t11655\n问候语\t11656\n021\t11657\nModernsky\t11658\n夏末\t11659\n离婚以后\t11660\nfcl\t11661\n甩货\t11662\n刘思远\t11663\n丝点\t11664\n泰格\t11665\n中华工商网\t11666\n丹徒区\t11667\n奥迪s8\t11668\n湘语\t11669\n断粮\t11670\nLAB\t11671\nv7.1.1\t11672\n胖头鱼\t11673\n之江路\t11674\n陆虎\t11675\n小赌\t11676\n10星\t11677\n学龄\t11678\n陈程\t11679\ncombiner\t11680\n江豚\t11681\n茅台机场\t11682\n李大爷\t11683\n男版\t11684\n云起\t11685\n卡夫卡\t11686\n三量\t11687\nQQ名字网\t11688\n樟脑丸\t11689\n遇冷\t11690\n陕西博物馆\t11691\nweb认证\t11692\n第133章\t11693\nNSS\t11694\nBCM\t11695\n兴元\t11696\n骆琦\t11697\n天兴洲\t11698\n卡莱\t11699\n双沟酒\t11700\n马男\t11701\ntts\t11702\n20150202\t11703\n百惠\t11704\n西游记网\t11705\n阿雅克肖\t11706\n_捷\t11707\n记本\t11708\n8108\t11709\n左眼皮\t11710\n正奇\t11711\n中国信鸽协会\t11712\n代谢组学\t11713\n限项\t11714\n1.4.3\t11715\n地城\t11716\n几集\t11717\n儒学\t11718\n广州小学\t11719\n唇蜜\t11720\n快播播放器\t11721\n钢尺\t11722\ndeviantart\t11723\n明细分类账\t11724\nmhd\t11725\n美国务院\t11726\n4幅\t11727\n紫府仙缘\t11728\n5S\t11729\n超轻型\t11730\nntsc\t11731\n柳城\t11732\n3q\t11733\n风险投资\t11734\nfront-gl\t11735\n何总\t11736\n抗抑郁药\t11737\n要塞十字军东征2\t11738\n发皱\t11739\nmscs\t11740\n绿色食品网\t11741\n13厘米\t11742\n居转\t11743\n尖瓣\t11744\n叫号\t11745\n吸氧\t11746\n松源\t11747\n车损险\t11748\n_一路商机网\t11749\n津湾广场\t11750\n潇洒哥\t11751\n折叠电动车\t11752\n高温度\t11753\nbene\t11754\n评史\t11755\n悔海\t11756\nregulated\t11757\ndolls\t11758\nBrew\t11759\n醋坛\t11760\nsilo\t11761\n隐睾\t11762\n忆往昔峥嵘岁月稠\t11763\n战狼3\t11764\nTF家族\t11765\n郑思维\t11766\n吴京丁彦雨航\t11767\n恒康\t11768\nnous\t11769\n四柱八字算命\t11770\n色器\t11771\ndubble\t11772\n宗支\t11773\n斧头帮\t11774\n钢管杆\t11775\nbowling\t11776\ntfrecord\t11777\n清小\t11778\n北京青年政治学院\t11779\n江南路\t11780\naccelerating\t11781\nglyphicons\t11782\nLSB\t11783\n便当\t11784\n内华达\t11785\n汉嘉\t11786\n梆\t11787\n罗罗诺亚·索隆\t11788\n_多玩新闻中心\t11789\n田馥甄\t11790\nshihuc\t11791\n回车换行符\t11792\n科阳\t11793\nriding\t11794\n今晚八点\t11795\nz820\t11796\n电驴大全\t11797\nGenArts\t11798\ncharts\t11799\n当嫁\t11800\n有机宝石\t11801\n蛇精女\t11802\n武汉外校\t11803\n四川网\t11804\n李玲玉\t11805\n简理财\t11806\nk0\t11807
\n白黄\t11808\n南山景区\t11809\n阿里钉钉\t11810\n荆州站\t11811\nKTV\t11812\n苏泊尔电压力锅\t11813\n冰火两重天\t11814\n中国路桥工程有限责任公司\t11815\n五代机\t11816\n子欲\t11817\n凌文\t11818\n张公山\t11819\n大洋湾\t11820\n中国零售行业\t11821\nCodeIgniter\t11822\nRxJS\t11823\n秦始皇陵\t11824\n雀巢咖啡\t11825\n四象限法则\t11826\n牙模\t11827\n行政职业能力测试\t11828\n直博\t11829\n放装型\t11830\n长政办\t11831\n4月9日起\t11832\n修补器\t11833\n湘西\t11834\n几斗\t11835\ndialog\t11836\n2.18G\t11837\n嫂嫂\t11838\n千图网www.58\t11839\n并序\t11840\n新宙邦\t11841\n竹篾\t11842\n1830\t11843\n颍上县人民政府\t11844\n画地\t11845\n蛇皮果\t11846\nFundebug\t11847\n锁服\t11848\n苏锐\t11849\n204个\t11850\ngatech\t11851\n小卖部\t11852\nP115b\t11853\ntweed\t11854\n正犯\t11855\n小豆子\t11856\n高冠\t11857\n判令\t11858\n三联单\t11859\n深心\t11860\ndrawal\t11861\n以太坊原链\t11862\n就怕\t11863\n走账\t11864\n开和\t11865\n南宁酒店\t11866\n色达县\t11867\ntechniques\t11868\n瓶胆\t11869\n二级建造师公路\t11870\nBRZ\t11871\nperformer\t11872\n地方级\t11873\n太聪明\t11874\nExpenses\t11875\n钢的琴\t11876\n7件套\t11877\n吉米巴特勒\t11878\neWTP\t11879\n电魂\t11880\nOttawa\t11881\ntested\t11882\n鄂西北\t11883\n格式\t11884\noscar\t11885\n丁豪广场\t11886\n幻剑情缘\t11887\n汗蒸馆\t11888\n光动力治疗\t11889\n尉氏\t11890\n大清一统志\t11891\n拱形\t11892\nppd\t11893\n八分米\t11894\n一一年\t11895\n联编\t11896\noppa\t11897\n人鼠\t11898\n度调查\t11899\n40岁时\t11900\n偷桃\t11901\nVS2015\t11902\n移情\t11903\n超声刀\t11904\nMultifunction\t11905\n三国群英2\t11906\n志向\t11907\n菲利\t11908\nindirect\t11909\n第三回\t11910\n水荫路\t11911\n小学读书节\t11912\n矩形\t11913\n万通中心\t11914\noptisystem\t11915\n1.3米\t11916\n中币\t11917\n折刀\t11918\n渡桥\t11919\n收集\t11920\n合成营\t11921\n经济制裁\t11922\n载道\t11923\n有助于\t11924\n旧州镇\t11925\n非遗传人\t11926\nptp\t11927\n保险卡\t11928\nfifaol3\t11929\n兰博基尼盖拉多\t11930\n正规性\t11931\n合谷穴\t11932\n旱金莲\t11933\n节气阀\t11934\n木兰云雾山\t11935\n导热油锅炉\t11936\n梦幻恋舞\t11937\n球墨铸铁件\t11938\n高明\t11939\n狰狞\t11940\nCoverage\t11941\n支路\t11942\npu404\t11943\n妈\t11944\n恒压\t11945\n一步电子网\t11946\n车坛\t11947\n减产\t11948\n冠冕堂皇\t11949\n雷疯\t11950\n挂拍\t11951\n北京市经信委\t11952\n隔音\t11953\n尿崩症\t11954\n薪金宝\t11955\n郑丹\t11956\n体外诊断\t11957\n桑皮纸\t11958\n遵义路\t11959\n丰田陆地巡洋舰\t11960\n蒜蓉酱\t11961\n牌号\t11962\ndirect
draw\t11963\n改分\t11964\n龙邦快递\t11965\n威富通\t11966\n丸药\t11967\nproton\t11968\n手撕发票\t11969\n用位\t11970\n李佳璐\t11971\n恒大御景半岛\t11972\n中国再生资源回收利用协会\t11973\n衣褶\t11974\n青阳县\t11975\n竹节\t11976\n借读\t11977\n目标性\t11978\n第三十八期\t11979\n珍藏版\t11980\n研磨度\t11981\n极致版\t11982\n交足\t11983\n手榴弹\t11984\n四日游\t11985\n迪亚海魔\t11986\n周口店\t11987\n最后一案\t11988\n袁家军\t11989\n3A级景区\t11990\n玉城\t11991\n北西\t11992\n抽油烟\t11993\n扩散器\t11994\n大渡口\t11995\n绿树\t11996\n全基因组关联分析\t11997\n笑意\t11998\n云路由器\t11999\n性比价\t12000\n长楹天街购物中心\t12001\nReinforcement\t12002\nS2期\t12003\n克莱斯勒300C\t12004\n斯巴鲁\t12005\n邀请语\t12006\nPackages\t12007\nCVT\t12008\n→\t12009\nf388\t12010\n一万亿\t12011\n植树节\t12012\n调心滚子轴承\t12013\n临沭\t12014\n小蜂\t12015\n梁格法\t12016\n薇薇安\t12017\nDCC\t12018\n蓬溪县人民政府\t12019\n冬兵\t12020\n考研报录比_考研\t12021\nvpd\t12022\nhays\t12023\n3符\t12024\n测试剂\t12025\n魅海蓝\t12026\n华安财产保险股份有限公司\t12027\nWIFE\t12028\n方大\t12029\n但丁\t12030\n耽美虐文\t12031\n3454\t12032\n城市规划论坛\t12033\n巧巧\t12034\n2.5v6\t12035\n交通票\t12036\n土风舞\t12037\n007商务站\t12038\n秦时明月之诸子百家\t12039\n口器\t12040\n邻水县人民政府\t12041\n海南经济特区\t12042\n桦甸\t12043\n电信业务经营许可证\t12044\n营私舞弊\t12045\n心致\t12046\n落花有意\t12047\n2699\t12048\n绞车\t12049\n蹦蹦跳跳\t12050\n泰兴市\t12051\n模版\t12052\nNam\t12053\n环资\t12054\nVLC播放器\t12055\n官方人才网\t12056\n无限性\t12057\nJon\t12058\n五官科\t12059\n王金玉\t12060\n转世\t12061\nIT之家论坛\t12062\n肾阴\t12063\n秘果\t12064\n绵阳市人民政府\t12065\n1.0.0\t12066\n元気\t12067\n教客\t12068\n一念之间\t12069\n生活类\t12070\n通辑\t12071\n无缝拼接屏\t12072\nb7\t12073\nTubes\t12074\n真金板\t12075\n奉行\t12076\n润土\t12077\n猎豹安全浏览器\t12078\n制修订\t12079\nFoxy\t12080\n隔列\t12081\nspi\t12082\nHenderson\t12083\n逆变焊机\t12084\n石螺\t12085\n西南大学网络与继续教育学院\t12086\nexe4j\t12087\n北京研发中心\t12088\n天津理工大学中环信息学院\t12089\n恨爱\t12090\n空瓶记\t12091\n计划生育证\t12092\n兰坡\t12093\nZeng\t12094\n华岩\t12095\n安国\t12096\n前置\t12097\n高纳\t12098\n第三子\t12099\n星动亚洲\t12100\n亚冠\t12101\nOx\t12102\n南昌市人力资源和社会保障局\t12103\n鸡神\t12104\n爱彩\t12105\n3001\t12106\n8004\t12107\n壹点\t12108\n酉阳县\t12109\nwizard\t12110\n羟基自由基\t12111\n猫步\t12112\n6ml\t12113\n1294\t12114\n号费\t12115\n浙江教育报数字报\t12116\nxml文件\t1
2117\ngxx\t12118\n微信企业\t12119\n差池\t12120\nisatap\t12121\ncol-xs-*\t12122\n三秋桂子\t12123\nmicsoft\t12124\n山西省委组织部\t12125\nseeking\t12126\n底分型\t12127\n央音\t12128\n梦萍\t12129\nNewton\t12130\n投票表\t12131\n最终幻想13:雷霆归来\t12132\n决好\t12133\n到期后\t12134\nMisty\t12135\nPIXTA\t12136\n格桑梅朵\t12137\nGOSH\t12138\n集腋成裘\t12139\n不动产登记\t12140\n山东理工\t12141\n印迹\t12142\n变形缝\t12143\n艾艾软件园\t12144\nnVIDIA\t12145\n手动液压叉车\t12146\n音乐馆\t12147\nKiCad\t12148\n划得\t12149\n道德准则\t12150\nphotoshopcc2018\t12151\n李国平\t12152\n桌椅\t12153\n筋肉\t12154\n艾伦秀\t12155\n厨师长\t12156\n丰镇市\t12157\n上党梆子\t12158\n宇电\t12159\n中国电科院\t12160\n引来\t12161\n独立式\t12162\n5个\t12163\n7.1v2\t12164\n中国新闻\t12165\n亦庄生活网\t12166\nBiotropics\t12167\n67岁\t12168\n庭上\t12169\nt520\t12170\nbyte型\t12171\n玉溪一中\t12172\nfaruto\t12173\n中国安达\t12174\n多条\t12175\n流塘\t12176\n极限竞速7吧\t12177\n贪食之痕\t12178\n足下\t12179\nEnabling\t12180\n天津海昌极地海洋世界\t12181\nmore\t12182\n母金毛\t12183\n金玉满堂\t12184\n三星s6\t12185\n第15关\t12186\n纵杆\t12187\nyanzi\t12188\n海峡人才网\t12189\n26歳\t12190\n二级考试\t12191\n丁点\t12192\n通城\t12193\n土地抵押贷款\t12194\n绿博园\t12195\n文化站\t12196\n兴平市政府\t12197\n儿童装\t12198\n神都夜行录\t12199\ncoinegg\t12200\nyi\t12201\n600298\t12202\n诸葛\t12203\n9万\t12204\nlaragon\t12205\nfat32\t12206\n宜信\t12207\n易制毒化学品管理信息系统\t12208\n总参谋部\t12209\n曲家\t12210\nDucks\t12211\n45秒\t12212\n中国投行\t12213\n电镀锌\t12214\nv2.0.2\t12215\n杭州余杭政府网\t12216\n打针\t12217\nleela\t12218\n这个样\t12219\n创译\t12220\nFlashlight\t12221\n欢哥\t12222\n切迹\t12223\n夜趣\t12224\n村村\t12225\nmedication\t12226\n关于实行党风廉政建设责任制的规定\t12227\n解绑\t12228\nMISSING\t12229\n皮艇\t12230\nIPRdaily\t12231\n1kV\t12232\n非洲菊\t12233\n金皇冠\t12234\ninventor2\t12235\n24式太极拳\t12236\n9幅\t12237\n弹性挡圈\t12238\n哈弗H2论坛\t12239\n20180224\t12240\n曾庆红\t12241\n苏宁控股集团\t12242\n隔天\t12243\n长男\t12244\n贺铸\t12245\n带好头\t12246\n冷水塔\t12247\n太美\t12248\nVINTAGE\t12249\n2700x\t12250\n粒子群优化\t12251\n跃马\t12252\n色谱峰\t12253\n黄歆苑\t12254\n十二款\t12255\n地都镇\t12256\nTW\t12257\ntimetable\t12258\n达索\t12259\n6801\t12260\n张望\t12261\n屏山村\t12262\n2450m\t12263\nalphabounce\t12264\n49吨\t12265\n全脂\t12266\n陆瑶\t12267
\n伸梁\t12268\n王茜\t12269\n影像中国网\t12270\n百期\t12271\n爱壁纸\t12272\nm80\t12273\n202\t12274\n在于春\t12275\n工会法\t12276\n工夫\t12277\n中国短道速滑队\t12278\n凤凰山景区\t12279\n中古包\t12280\n圣光之誓\t12281\n美联物业\t12282\n李察\t12283\nfreed\t12284\n九霄奔云传\t12285\n萨拉米\t12286\n带载\t12287\n迪瑞克斯\t12288\n侄孙\t12289\n多元性\t12290\n好色千金\t12291\n滴滴答\t12292\n牛群\t12293\ncitic\t12294\n锥柄\t12295\n内坑镇\t12296\n第101期\t12297\n17段\t12298\n1.2.3_\t12299\n麻宫雅典娜\t12300\nX43\t12301\n暗黑破坏神2符文之语\t12302\n问鼎\t12303\n迎宾曲\t12304\n坤音四子\t12305\n【雅昌\t12306\n圆粒\t12307\n护苗\t12308\n沈阳晚报\t12309\n青岛版小学\t12310\n关咏荷\t12311\n百度高考\t12312\nnfdm\t12313\n兰释\t12314\n谢赫\t12315\n性爱大师\t12316\n性压抑\t12317\n&middot\t12318\n仰焊\t12319\n工程测量\t12320\n类黄酮\t12321\n2053\t12322\n金硕\t12323\nIQAir\t12324\n毕业设计.doc\t12325\n矮牵牛花\t12326\n405nm\t12327\n中转换\t12328\n协警\t12329\nSaudi\t12330\n饰\t12331\n中国劳动关系学院\t12332\n三池崇史\t12333\nSSAS\t12334\n金龙鱼\t12335\n技术人员\t12336\nzhishang\t12337\n婚姻法\t12338\n王会\t12339\n周密\t12340\n网大\t12341\n1.3.1\t12342\n美国联邦调查局\t12343\n冒险岛神\t12344\n蛋鸭\t12345\n库\t12346\n唐辉\t12347\nHD620\t12348\n搜狐科技\t12349\n甲基苯乙烯\t12350\n心伤\t12351\npcbaby\t12352\nwardrobe\t12353\n沃尔沃s40\t12354\n择天\t12355\n650亿\t12356\n财务类\t12357\n之间\t12358\n方正信产\t12359\n北京燃气集团\t12360\n牟定县\t12361\n油状物\t12362\n克卜勒\t12363\n谢云\t12364\n上个周末\t12365\n我妈妈\t12366\n统哥\t12367\n喻国明\t12368\n底盒\t12369\nvy\t12370\numl\t12371\n夜思\t12372\n金级\t12373\n肉丁\t12374\nrgb值\t12375\n门斗\t12376\n帝国理工\t12377\n帝集团\t12378\n野兽派\t12379\n太阳能逆变器\t12380\nsci\t12381\n炸线\t12382\n效实\t12383\n繁花似锦\t12384\n不流\t12385\n欧辰\t12386\n大门未知子\t12387\n健美生\t12388\n名数\t12389\n保障性\t12390\n不属\t12391\n64岁\t12392\n无损音乐下载|FLAC|WAV\t12393\n网拍\t12394\nn9200\t12395\nWin7/8.1\t12396\n大猫\t12397\n富邦股份\t12398\n三伏贴\t12399\n东梓关\t12400\nwim文件\t12401\n陈国邦\t12402\n沉管\t12403\n语域\t12404\n干制\t12405\n恋爱循环\t12406\n廊坊市人民医院\t12407\n七厘米\t12408\n别克_君越\t12409\n比特率\t12410\n卤汤\t12411\n私募债\t12412\namtlib\t12413\n螺旋霉素\t12414\n半\t12415\n南京医科大学\t12416\n二十所\t12417\n打顶\t12418\n腰椎盘\t12419\nemeditor\t12420\nWikitravel\t12421\n癸卯\t12422\n金头盔\t12423\n微微在线二维码生成器\t12424\n厌氧反应器
\t12425\n净资产收益率\t12426\n氢硅油\t12427\nw519l\t12428\n第一游\t12429\n东风菱智\t12430\n国贸大厦\t12431\n赛码网\t12432\n腾辉\t12433\nafter\t12434\n中华人民共和国企业所得税法实施条例\t12435\n冷艳\t12436\n养脾\t12437\n南通公交\t12438\np2p理财\t12439\n开长\t12440\n呼和浩特市公安局\t12441\n自动化\t12442\n寂照庵\t12443\n东吴人寿\t12444\n祥云县\t12445\nHBR\t12446\n60.0000元\t12447\n海口海关\t12448\n机械工程学院\t12449\n房务\t12450\n英东\t12451\n奥康国际\t12452\n华为mate10\t12453\n妞们\t12454\nwww.3dmgame.com\t12455\nrises\t12456\n启蒙\t12457\n3品\t12458\n滁州市第一人民医院\t12459\n金螳螂\t12460\n六味地黄胶囊\t12461\n0.6.4\t12462\n上海茂育公司\t12463\n郑州北\t12464\nEngage\t12465\n鹿谷制茶\t12466\n查尔\t12467\n纸质版\t12468\n赤\t12469\n宁波市高新区\t12470\n公驴\t12471\n至\t12472\n聊斋花弄月\t12473\n卫士通\t12474\n20171026\t12475\n插话\t12476\nassembler\t12477\n异族\t12478\n程序猿001\t12479\nfrance\t12480\nHaiyuKing\t12481\n蒙羞\t12482\n环保袋\t12483\n0829\t12484\nFi\t12485\n月光女神\t12486\n京畿\t12487\n4.5.2\t12488\n经营预算\t12489\naddicted\t12490\n桥头堡\t12491\n府邸\t12492\n普拉多2700\t12493\n异音\t12494\n第3节\t12495\nR13\t12496\n抗压\t12497\n眼罩\t12498\ndnf2018\t12499\n中梁\t12500\n中影\t12501\n山东省自然科学基金\t12502\n魏家凉皮\t12503\n海亮德文郡\t12504\n类似文\t12505\n/方\t12506\nintelli\t12507\nsooopu\t12508\n炉石传说冒险模式\t12509\n上海市普陀教育党建\t12510\n银监\t12511\n科士达\t12512\n四阶\t12513\nPS照\t12514\n粘结剂\t12515\n保健科\t12516\n吉田拓郎\t12517\n江西网络广播电视台\t12518\n水中央\t12519\nlaysha\t12520\nthrough\t12521\n2018.3.14\t12522\n军犬\t12523\n命运之链\t12524\n大生化\t12525\n花钿\t12526\n法治类\t12527\nsdm\t12528\n正路\t12529\n物管\t12530\n为人父母\t12531\n蓝魔\t12532\n坎昆\t12533\n流放之路3.2\t12534\nBellevue\t12535\n画里\t12536\n张墨\t12537\nclark\t12538\n耽美快穿文\t12539\n派克钢笔\t12540\n桶装水店\t12541\nUM\t12542\n证交所\t12543\n江景房\t12544\n康普顿\t12545\n迈克尔逊干涉仪\t12546\nQSC\t12547\n药药\t12548\nShooting\t12549\n不良人\t12550\n鑫广\t12551\n平身\t12552\n名代\t12553\n奥古斯\t12554\n缩径机\t12555\n三山岛\t12556\nES6学习笔记\t12557\nAimee\t12558\n常德日报多媒体\t12559\n建筑学专业\t12560\n_单县\t12561\nhre\t12562\n应激性\t12563\n爱子\t12564\n0.32\t12565\n屌丝迷途\t12566\n地生\t12567\n观景\t12568\n100立方米\t12569\nCCB\t12570\n君色\t12571\n永川工业园区\t12572\n鹅蛋\t12573\n旅游费\t12574\n栎树\t12575\nthanks\t12576\n肉骨粉\t1
2577\nFalcon\t12578\n文具类\t12579\nminitool\t12580\n误差棒_\t12581\n入藏\t12582\n法定节假日三薪\t12583\n播视\t12584\n乔治·华盛顿\t12585\n5月5日起\t12586\n神威\t12587\nBOSS们\t12588\n前屏\t12589\nfm2018\t12590\n誌\t12591\n防水涂料\t12592\ncnca\t12593\n泌尿结石\t12594\n21栋\t12595\n帧同步\t12596\n阿里妈妈MUX\t12597\n1.4mm\t12598\n1.5.6\t12599\ndouy\t12600\ntxt文件\t12601\n模棱两可\t12602\n淅川县\t12603\n曹立军\t12604\nraptor\t12605\n朝天区人民政府\t12606\n265G网页游戏开服表\t12607\n新模式\t12608\n江清清\t12609\n香港特别行政区政府\t12610\n77页\t12611\n超级本\t12612\n最佳女主\t12613\n铍\t12614\nMariadb\t12615\n40章\t12616\n积石山县\t12617\n气侯\t12618\n蹄髈\t12619\n建窑\t12620\n刑法修正案\t12621\noption键\t12622\n观察者\t12623\nGTX\t12624\n中体\t12625\n可做饭\t12626\n刘亦婷\t12627\n第五十七条\t12628\n阿里学院\t12629\nMacbookair\t12630\n文区\t12631\n甲骨文转换器\t12632\n湿型\t12633\n加拿大多伦多大学\t12634\nJava\t12635\n无人能敌\t12636\n马盖\t12637\n远景\t12638\n两端\t12639\n拓展性\t12640\n暗矛\t12641\nblanc\t12642\nServletRequest\t12643\n鞋包\t12644\n剖面\t12645\n网库\t12646\n第五篇\t12647\n贡柑\t12648\n谷文昌\t12649\n华夏航空\t12650\n10th\t12651\n国泰金业\t12652\n_许\t12653\n_马自达3\t12654\n发卡银行\t12655\n十忌\t12656\n经营商\t12657\nepsilon\t12658\n深圳宾馆\t12659\n沙滩车\t12660\n炮房\t12661\n华义\t12662\n北京地铁8号线\t12663\nupnp\t12664\n【友\t12665\n佳沃股份\t12666\n精解\t12667\nireader\t12668\n人工少女\t12669\n杭州网络电视台\t12670\n不明觉厉\t12671\n小马达\t12672\nFOUR\t12673\n300吨\t12674\n湄潭\t12675\n中央党校\t12676\nslapd\t12677\n木砖\t12678\n强解\t12679\n丰县政府\t12680\n凤凰行动\t12681\n大选\t12682\n未曾\t12683\n不可知\t12684\n聚醚多元醇\t12685\nstaring\t12686\nScientists\t12687\n52SAMSUNG\t12688\n125&\t12689\n欧力\t12690\n陶昕然\t12691\nwin7win8win10\t12692\n个展\t12693\n甘肃省住房和城乡建设厅\t12694\n落花网\t12695\n0322\t12696\n槐耳颗粒\t12697\n胸腔\t12698\n打击报复\t12699\n手掌心\t12700\n友情\t12701\n73集\t12702\n瓷妆\t12703\nCamtasiaStudio\t12704\n房地产联合信息网\t12705\n资阳市人民政府\t12706\n起薪\t12707\n洲际酒店集团\t12708\nKensington\t12709\n皇家\t12710\n男仆\t12711\n浸泡式\t12712\n十余种\t12713\n异能文\t12714\n延长县人民政府\t12715\n中华人民共和国教师法\t12716\nlibrarian\t12717\n330路\t12718\n6000亿\t12719\n苓\t12720\n英国广播公司\t12721\n推塔\t12722\n笑傲江湖\t12723\nHAD\t12724\n厘米秀\t12725\n丰田普锐斯\t12726\n七根\t12727\n预制型\
t12728\ninput_w3cschool\t12729\n环境日\t12730\n剑冢\t12731\nv2.3.4\t12732\n丐\t12733\n朗玛\t12734\n巡逻员\t12735\n拥军\t12736\n一小包\t12737\nmaui\t12738\n上腹\t12739\ncumshot\t12740\n邓石如\t12741\n认定\t12742\n利君沙\t12743\n向华胜\t12744\nwindouws\t12745\n肉棍\t12746\n青铜峡市\t12747\n莱切斯特\t12748\n江苏省医院协会\t12749\n无法抵挡\t12750\n成本费\t12751\n章子怡\t12752\n实质\t12753\n徐娜\t12754\n音版\t12755\n海螃蟹\t12756\n大佛山人才网\t12757\n笔数\t12758\n王利\t12759\n额济纳胡杨林\t12760\n布拉戈维申斯克\t12761\n爱国主义教育基地\t12762\n北洋\t12763\n罗建国\t12764\n红木家具网\t12765\ndelux\t12766\n300本\t12767\n深圳市科技创新委员会\t12768\n柳慧\t12769\n口镇\t12770\nQQ农场\t12771\nPBD\t12772\needc\t12773\n批改作业\t12774\n直营网\t12775\n大特\t12776\nyingwen\t12777\n一个层\t12778\nf369\t12779\nZOOM\t12780\nLeCun\t12781\nbootstrap4\t12782\nobjdump\t12783\n凡星\t12784\n中共中央八项规定\t12785\n硅谷\t12786\n深海狂鲨2\t12787\n正点率\t12788\n衍射\t12789\n第85集\t12790\n圣地\t12791\n公元前\t12792\n赫图阿拉城\t12793\n广西壮族自治区公安厅交通管理局\t12794\n莫慌\t12795\n华润物业\t12796\n周海\t12797\n进化中心岛\t12798\n6170\t12799\n特斯拉线圈\t12800\n宁德网\t12801\n好战\t12802\nv1.2.4\t12803\n中国广核集团\t12804\n憾\t12805\n海航大厦\t12806\n习近平系列讲话精神\t12807\n中国人寿养老保险股份有限公司\t12808\n差劲\t12809\n蜀国\t12810\nsika\t12811\nblbl\t12812\n当时\t12813\n清英\t12814\n占星术\t12815\n百度好链\t12816\n走出非洲\t12817\n南京西路1038号\t12818\n内场\t12819\nAddresses\t12820\nsatisfactory\t12821\n行星吞噬者\t12822\n吏\t12823\n和流\t12824\n挂科\t12825\nSr\t12826\n5.62\t12827\n李淳罡\t12828\n谎极落\t12829\ng2020\t12830\n巴武\t12831\n明城镇\t12832\n很早\t12833\n纳米比亚\t12834\n对越自卫反击战\t12835\n草田\t12836\n朝外大街\t12837\n新昌中学\t12838\n墙洞\t12839\nCodeFirst\t12840\n木芯板\t12841\n吴桥\t12842\n关员\t12843\n蒙牛优益\t12844\n妮\t12845\n兰庭\t12846\n马自达3星骋\t12847\n无缘无故\t12848\niPad3\t12849\nM1A2\t12850\nGTA5PC\t12851\n珠海高新区\t12852\n金晨\t12853\n长江江豚\t12854\nverbal\t12855\n天津时代奥城\t12856\ntabulate\t12857\n平垫\t12858\n7.75\t12859\n体格\t12860\n5.5.0\t12861\n小背\t12862\n八十七神仙卷\t12863\n美名\t12864\n往往\t12865\n莫明其妙\t12866\n上海科委\t12867\n1588\t12868\n牢狱\t12869\ngulf\t12870\n利发\t12871\n善舞\t12872\n缉凶\t12873\ngalvanized\t12874\n划入\t12875\nNodes\t12876\n物流箱\t12877\n跳槽\t12878\n穿行\t12879\n中华人民共和国劳动合同法\t1288
0\n_广联达\t12881\n无印\t12882\n碳纤维电地暖\t12883\nX9000E\t12884\n流纹\t12885\n从政\t12886\nnba2k9\t12887\n46章\t12888\n云筹\t12889\n诺希电池\t12890\nbpmf\t12891\nlz\t12892\ngt1030\t12893\n马田\t12894\n阿尔法策略\t12895\n超级小桀\t12896\n网易红彩\t12897\n红旗出版社\t12898\n梁祝文化公园\t12899\n精讲班\t12900\n仙剑四\t12901\n轻抚\t12902\nconst变量\t12903\n一叶一世界\t12904\n王梦\t12905\n无奇\t12906\n米拉玛沙漠\t12907\nvMix\t12908\n白童子\t12909\n北京酒店式公寓\t12910\n钼丝\t12911\n皮语音\t12912\nprojection\t12913\n00000000\t12914\n上清宫\t12915\ntall\t12916\n对腰\t12917\nfenglie\t12918\n小磨\t12919\n400500\t12920\n信邦制药\t12921\n200s\t12922\n12.31\t12923\n大光\t12924\n大国\t12925\n切入\t12926\n浅绛彩\t12927\n菖蒲\t12928\n钟小江\t12929\nLy\t12930\n80张\t12931\n通富微电\t12932\nqq空间日志\t12933\n三二\t12934\n高级会计学\t12935\nTP-Link路由器\t12936\n第一视频\t12937\n山西省粮食局\t12938\niVictor\t12939\n凤凰国际\t12940\n兴海路\t12941\n剃刀龟\t12942\n大飞机\t12943\n双拉条\t12944\n安亭新镇\t12945\n信利\t12946\n厂件\t12947\n转转验机\t12948\n大连万达集团股份有限公司\t12949\n籽\t12950\n资金成本率\t12951\ndep\t12952\n北京发改委\t12953\n这部\t12954\n膨润土防水毯\t12955\n吸收式制冷机\t12956\n衣品\t12957\n前值\t12958\n51部\t12959\nNoSql\t12960\n非磁性\t12961\n乌克兰格里夫纳\t12962\nnarrative\t12963\n教版\t12964\n鄂州市\t12965\n哪儿人\t12966\n宠文\t12967\nWOD\t12968\n第2_\t12969\n雷颐\t12970\n征袍\t12971\n417\t12972\n碱水\t12973\n大宁音乐广场\t12974\n欢迎\t12975\n最富\t12976\n大赞\t12977\n观星者\t12978\npandatv\t12979\n三身\t12980\nmagica\t12981\n二0二0年\t12982\n高位\t12983\n警企\t12984\n次新\t12985\naxis=0\t12986\n赣榆县\t12987\n一代代\t12988\n利息表\t12989\n智能屏\t12990\ncio\t12991\n[药理学\t12992\n6500K\t12993\n会谈心\t12994\n天文学家\t12995\nlockup\t12996\n第8版\t12997\n捷豹E-PACE\t12998\n卷带\t12999\n泼尼松片\t13000\n攻文\t13001\n海伦多兰\t13002\n可取\t13003\n白开水\t13004\n小儿咳嗽\t13005\n长梦\t13006\n宁夕\t13007\n东泰\t13008\n王博谷\t13009\n高铁票\t13010\n摘花\t13011\n李跃\t13012\n佐康\t13013\n杭州地铁6号线\t13014\nAEP\t13015\n330Li\t13016\n打杀\t13017\n长安CX70论坛\t13018\n三乐\t13019\n汉广\t13020\n位列\t13021\n怀愫\t13022\n且末县政府\t13023\n全国老龄办信息中心\t13024\n2017b\t13025\n300多种\t13026\n最终幻想11\t13027\n闲鱼影音工作室\t13028\n废宅\t13029\ngam\t13030\n奇正消痛贴膏\t13031\n成格\t13032\n2M\t13033\nfarcry3\t13034\n配单\t13035\n肿包\t13036
\n女体宴\t13037\nrosh\t13038\n新法\t13039\n摆臂\t13040\n罚款单\t13041\n肺活量\t13042\nxiao8\t13043\n鼓乐\t13044\nTreat\t13045\n萨基姆\t13046\n热敏纸\t13047\n六扇门\t13048\n稳压电路\t13049\n凸形\t13050\n光纤准直器\t13051\nORCID\t13052\nTMS320F28335\t13053\n访问类\t13054\noc\t13055\nPK战\t13056\n档案机\t13057\n蓟州\t13058\n打飞\t13059\nRussia\t13060\n尬舞\t13061\n7600元\t13062\n伊林\t13063\n电灯泡\t13064\n奇刃\t13065\n苏环\t13066\n5组\t13067\nVT虚拟化\t13068\njeremy\t13069\n监工\t13070\n橙翼\t13071\n陆军总医院\t13072\n福建省人民政府\t13073\n虎耳草\t13074\n炼药师\t13075\n信发集团\t13076\n龟板\t13077\n花流\t13078\n大宋传奇之赵匡胤\t13079\n冤家路\t13080\nThinkServer\t13081\n湿地公园\t13082\n重制版成\t13083\n堂堂正正\t13084\n留学生作业网\t13085\nR420\t13086\nFissler\t13087\n慢生活\t13088\n炎性\t13089\nCigarette\t13090\ndataloader\t13091\ndkt\t13092\n歌功颂德\t13093\n人称\t13094\n测试服\t13095\n大麻烦\t13096\nC++库\t13097\n绘制函数\t13098\n龙珠:超宇宙\t13099\n万能视频播放器\t13100\n元后\t13101\n前菜\t13102\n网络营销学院\t13103\n对讲门\t13104\nGJB9001C-2017\t13105\nTec\t13106\n60升\t13107\n马克笔\t13108\n袁志刚\t13109\n塞西尔\t13110\n会同村\t13111\n周丽\t13112\n痴男\t13113\n腾博会\t13114\nOlay\t13115\n女研究生\t13116\n科协\t13117\nqemu\t13118\n203\t13119\n结构动力学\t13120\n王品牛排\t13121\n艺高\t13122\n淘豆网\t13123\n欠揍\t13124\n克里斯·埃文斯\t13125\n谢敏\t13126\n刑律\t13127\n剑池\t13128\n几箱\t13129\nEOI\t13130\n夏冰冰\t13131\n1910年\t13132\n诚信认证\t13133\n有诚意\t13134\n雪伊\t13135\n国资公司\t13136\n20180311\t13137\n军神\t13138\n王思佳\t13139\nlpk\t13140\n20170705\t13141\n美利金融\t13142\n天明路\t13143\n焊字机\t13144\n红头发\t13145\n预产期\t13146\n生而为\t13147\n侯友宜\t13148\n缓降\t13149\nIdol\t13150\n不须\t13151\n摄像\t13152\n投名状\t13153\n成都图书馆\t13154\n被击落\t13155\n隐蔽工程\t13156\n6300元\t13157\ngp\t13158\n枣庄站\t13159\n萨马兰奇\t13160\n餐饮有限公司\t13161\nG型\t13162\n命悬一线\t13163\n5052\t13164\n雄兵连\t13165\n凶简\t13166\n麦乐\t13167\n沭阳人才网\t13168\n树兰医院\t13169\n文园\t13170\n罗杰杜\t13171\n陈江华\t13172\n保花保果\t13173\n名鸽\t13174\n真假老师\t13175\nDatasheet5\t13176\n莫言巨石强森\t13177\n巴巴托斯\t13178\n来将挡\t13179\n第一商业网\t13180\n叶强\t13181\n附和\t13182\n上海长城宽带\t13183\n情花\t13184\n玉台\t13185\n竖起\t13186\n版纸\t13187\n铳\t13188\n广东警官学院\t13189\n陈嘉\t13190\n电饭煲版\t13191\n青口贝\t13192\n奶水\t13193\n红旗路\t1
3194\nTiida\t13195\n暖暖环游世界\t13196\n变线\t13197\n六建\t13198\nADSL\t13199\n快看漫画吧\t13200\n愧\t13201\n红豆股份\t13202\n绿维\t13203\n青岛公交\t13204\n青春使命\t13205\n安柏\t13206\n异丙胺\t13207\n莴苣\t13208\n龙东湖城\t13209\n蝴蝶式\t13210\n争春\t13211\nandreas\t13212\n裱纸机\t13213\n慢性盆腔炎\t13214\n万相\t13215\n王琳琳\t13216\n泰坦尼克吧\t13217\n曲线救国\t13218\n0951\t13219\n会员部\t13220\n三炮\t13221\n巢蜜\t13222\nAvocado\t13223\n南国网\t13224\n班委会\t13225\n自由者\t13226\n李权\t13227\n诺娃\t13228\n假性宫缩\t13229\n公子闲\t13230\nYYYY\t13231\n木屋别墅\t13232\n王艺霖\t13233\n密钥版\t13234\n木甲板\t13235\nuntill\t13236\nespon\t13237\n海螺沟\t13238\n氯化钾缓释片\t13239\n宋晓磊\t13240\nuft\t13241\n文昌门\t13242\n一证通\t13243\nzoro\t13244\n云雀的心愿\t13245\n王炯\t13246\n红肉\t13247\n辣道鱼火锅\t13248\n还能否\t13249\n_诺达\t13250\n多选值\t13251\nfindContours\t13252\nscalability\t13253\n海南省财政厅\t13254\nAny\t13255\n三本大学\t13256\n大绵羊\t13257\n西安交通大学机械学院\t13258\ncompr\t13259\n琼海市\t13260\nchenyu\t13261\nSparkSQL\t13262\n西安交通大学医学院第一附属医院\t13263\n分时图\t13264\n蓝月亮(中国)有限公司\t13265\n戊戌日\t13266\n尖锐湿疣\t13267\n韩亚\t13268\n去不回头\t13269\n33亿\t13270\nNGK\t13271\n耽美啦小说网\t13272\nsheets\t13273\n德赛西威\t13274\n皱起\t13275\n红灯\t13276\n天泪传奇之凤凰无双\t13277\n心坎\t13278\n双引号\t13279\n10安\t13280\n身先士卒\t13281\nJonas\t13282\n升序\t13283\n幼性\t13284\n杨邱自媒体\t13285\n骨折\t13286\n裂弹\t13287\n德国国防军\t13288\n苍天帝陵\t13289\nsmtplib\t13290\n钙石\t13291\n湖州市区\t13292\n李莲英\t13293\n泌尿外科\t13294\n竹柳\t13295\n武宣\t13296\n帝企鹅日记\t13297\n高倍\t13298\n肝病\t13299\n黄淑芬\t13300\n周舒桐\t13301\n寿仙谷\t13302\n有一个人\t13303\n一起同过窗\t13304\n金塘\t13305\n门头房\t13306\n迪斯尼\t13307\n3x4\t13308\n安粮\t13309\n识字5\t13310\n高雄机场\t13311\nmcse\t13312\n高考版\t13313\n神头\t13314\n墨斗\t13315\nsort、sorted高级排序\t13316\n秋水共长天一色\t13317\n安庆师范学院\t13318\n景涛\t13319\n铁碳合金相图\t13320\n3d2014\t13321\n_韶\t13322\npine\t13323\n马岛战争\t13324\nyb\t13325\n侨兴\t13326\n操作书\t13327\n福勒\t13328\n红枣枸杞\t13329\npattaya\t13330\n璃\t13331\n五星街道\t13332\n1月25日\t13333\n温子仁\t13334\n德恩\t13335\n典籍\t13336\n电结\t13337\n爱快软路由\t13338\n9K9K\t13339\n黏土人\t13340\n赛鸽网\t13341\n烧窑\t13342\n刘大成\t13343\nDis\t13344\n淘宝空包网\t13345\n程七七\t13346\nwmsys\t13347\n希德尼娅\t13348\nyunxing\t1
3349\n傍身\t13350\n神户大学\t13351\n听写\t13352\n阿q正传\t13353\n罗吉\t13354\n温州市住建委\t13355\n高自考\t13356\n埃文斯\t13357\nWeb浏览器\t13358\n70分钟\t13359\n城楼上\t13360\n艾达\t13361\nScrap\t13362\n6.6亿元\t13363\n神海4\t13364\n大事件\t13365\nOpenResty\t13366\n讲讲\t13367\n隐痛\t13368\n51个\t13369\n郑州电信\t13370\ny3\t13371\n第七季\t13372\nOpenNI\t13373\nAIP\t13374\n未雨绸缪\t13375\n2008R2\t13376\niTrip\t13377\n分包方\t13378\n2017年11月20日\t13379\n简略\t13380\n一甲\t13381\n指南者\t13382\n京津\t13383\n爱爱医资源网\t13384\n弓网\t13385\nadditives\t13386\n旅游线\t13387\nHK\t13388\n曼联\t13389\n孙波\t13390\n伯恩山犬\t13391\nMOD汉化版\t13392\nBL\t13393\n紫云赋\t13394\n24格\t13395\n风光大嫁\t13396\nGuice\t13397\n帖木儿\t13398\n扫一扫\t13399\n亚麻色\t13400\n艺术界\t13401\nResty\t13402\n外围群\t13403\n耳目\t13404\n南方周末\t13405\n双胶\t13406\n火鸟\t13407\n66Law.cn\t13408\n一目\t13409\nfilled\t13410\n扬州火车站\t13411\n人毛\t13412\n雷达兵\t13413\nHYDRA\t13414\n潜伏\t13415\n增势\t13416\n频遭\t13417\n委托函\t13418\n反转人生\t13419\n竞技者\t13420\n别在\t13421\nTU\t13422\n洛克希德马丁\t13423\n配合比\t13424\n赵露思\t13425\n公益岗\t13426\n2003\t13427\n墨墨言情网\t13428\n邮政快递单号\t13429\n刘可乐\t13430\n微缩版\t13431\n拿手\t13432\n发簪\t13433\n17.0\t13434\n44页\t13435\n济阳\t13436\n千帆直播\t13437\n叶晟荣\t13438\n雷格斯\t13439\n柯南剧场版\t13440\n肌钙蛋白\t13441\n土拨鼠装修网\t13442\n上海交通大学致远学院\t13443\n极影\t13444\n评价指标体系\t13445\n打神\t13446\n屠刀\t13447\n血汗\t13448\n正十二面体\t13449\n亏死\t13450\n杨九郎\t13451\n物联网专业\t13452\n80s手机电影网\t13453\n坑距\t13454\n姚非拉\t13455\n这一条\t13456\ncjun\t13457\n丹东市人民政府\t13458\n加替\t13459\n编辑页\t13460\n刘飞飞\t13461\n自杀\t13462\n结印\t13463\n3544\t13464\n武田华恋\t13465\n隐藏行\t13466\n机织物\t13467\n济州\t13468\nwhatsup\t13469\n禁养区\t13470\n第141集\t13471\n老套\t13472\n西楼\t13473\nnuest\t13474\nCron\t13475\n寿县政府\t13476\nBunnyBlack3\t13477\nDM\t13478\n咽峡炎\t13479\n潮州日报\t13480\n节理\t13481\n熊出没之奇幻空间\t13482\n软核\t13483\n南平市教育局\t13484\n草莓鼻\t13485\nIncremental\t13486\n第七步\t13487\n料件\t13488\n梦享城\t13489\n正新轮胎\t13490\n0126\t13491\ncll\t13492\n风云录\t13493\n清华同衡\t13494\n紫砂锅\t13495\n升降窗\t13496\n御驾亲征\t13497\n价带\t13498\n麻仓优\t13499\narte\t13500\n茹志鹃\t13501\n阉割版\t13502\n陈晓红\t13503\n炎黄蜀黍\t13504\n材料学\t13505\n南京国际展览中心\t1350
6\nxvid\t13507\n涛岛\t13508\n命定\t13509\nFastStone\t13510\nGUNS\t13511\n几吧\t13512\n海康威视\t13513\n西安市新城区人民城府\t13514\n04.22\t13515\n医脉通\t13516\n洪塘\t13517\n夜罗刹\t13518\n仿若\t13519\n环球经贸网\t13520\n广度\t13521\n艾叶糍粑\t13522\nKings\t13523\n蒋光鼐\t13524\n肯尼亚\t13525\n血脑\t13526\ne融所\t13527\n田东教育网\t13528\nTBT\t13529\n上杭县人民政府\t13530\n而今迈步从头越\t13531\n鑫鑫\t13532\nCulture\t13533\n大亨\t13534\n波萝社\t13535\n随州职业技术学院\t13536\n金玉\t13537\n超银\t13538\n邹容\t13539\nxenapp\t13540\n时尚大师\t13541\n笃定\t13542\n新光村\t13543\n零5网\t13544\n找不见\t13545\n6500k\t13546\nlara\t13547\ntask\t13548\nSCIE\t13549\n埒口\t13550\n腋秀\t13551\n3.7\t13552\nwinding\t13553\n十二封信\t13554\n公示稿\t13555\n不佳\t13556\n20载\t13557\n魏琛\t13558\n1500ml\t13559\n乐拉面\t13560\n冷冷清清\t13561\njao\t13562\n18.5kw\t13563\n坐向\t13564\n反抗者\t13565\n双龙桥\t13566\n久草在线福利资源站_久草在线\t13567\n圣印\t13568\n芷若\t13569\n增长\t13570\n品品品\t13571\n放荡\t13572\n吴三省\t13573\n瓷房子\t13574\n典狱长\t13575\n梅丽尔\t13576\n店铺子\t13577\n浅野\t13578\nACO\t13579\n双倍\t13580\n陈乐\t13581\n海之子\t13582\n切割工\t13583\n乐高漫威\t13584\n胡彦斌\t13585\n魔界战记吧\t13586\nLabelShop\t13587\n海正辉瑞\t13588\ndoctoral\t13589\n航空头\t13590\n15节\t13591\n工人新村\t13592\n梶浦由记\t13593\n20150503\t13594\n广东开放大学\t13595\n11岁\t13596\n飞雕\t13597\n20161007\t13598\n几罐\t13599\n立信会计出版社\t13600\n玄虚\t13601\n孙少平\t13602\n工业设计史\t13603\nBenedict\t13604\n细胞裂解液\t13605\nsequoia\t13606\nv2ex.com\t13607\nMultisim10\t13608\n前30分钟\t13609\n新闻晚报\t13610\n想不到\t13611\n长潭水库\t13612\nk27\t13613\n干部任用条例\t13614\n狂人\t13615\n12.mail\t13616\n2803\t13617\n徘徊者\t13618\n八面体\t13619\n烽火\t13620\n逼样\t13621\nbetaflight\t13622\n砸砖\t13623\n编纂\t13624\n瀚森\t13625\nstudents\t13626\n虚幻4\t13627\n15月\t13628\n唐三镜\t13629\nDay5\t13630\n死亡实验\t13631\n大岗\t13632\n三男\t13633\n水形奥比岛\t13634\n朝阳大街\t13635\n民兵整组\t13636\nBag\t13637\n龙潭区\t13638\n折边机\t13639\n新鲜\t13640\n常熟农商行\t13641\n蓝袍\t13642\n浦和\t13643\n玉玲珑\t13644\n七十八\t13645\n标包\t13646\n泷川华音\t13647\nrdc\t13648\n2000期\t13649\n热重\t13650\nCurious\t13651\nJS库\t13652\njava语言\t13653\n浴火成诗\t13654\n向日葵花\t13655\n无影手\t13656\n灵虚\t13657\n编辑工具\t13658\n重庆电信\t13659\nWWE2018\t13660\n荣耀Q版\t13
661\niresearch\t13662\n7970\t13663\n沈乐平\t13664\n亦云\t13665\n狄俄尼索斯\t13666\n接送\t13667\n45mm\t13668\n182.140.245.40\t13669\n支持力\t13670\n疥疮\t13671\n0728房网\t13672\n华工新闻网\t13673\n金高路\t13674\n研途宝\t13675\n市通\t13676\n弗莱彻\t13677\njjjj\t13678\n纯纸\t13679\nPS软件\t13680\n上海居住证\t13681\n卫校\t13682\n化学家\t13683\n瓷胎\t13684\nlixiang\t13685\n咋办\t13686\ngalactic\t13687\n骨关节\t13688\n马晓峰\t13689\nGeoStudio\t13690\n倾国\t13691\ndxdydz\t13692\n齐家问问\t13693\n回春医疗保健操\t13694\nv6.6.0\t13695\nDepot\t13696\n会友\t13697\n湘绣\t13698\n严霜\t13699\n爱微游\t13700\n龙腾软件\t13701\n算一算\t13702\n金碧世纪花园\t13703\nCD-ROM\t13704\n福禄克\t13705\n配置型\t13706\n欢乐颂1\t13707\n欅坂46\t13708\n看光\t13709\n小松挖掘机\t13710\n山之石\t13711\n中华恐龙园\t13712\n李永波\t13713\n东峰\t13714\n钢桥\t13715\nvnpy\t13716\n米宅\t13717\n双排扣\t13718\n掏粪\t13719\n20180409\t13720\n中华人民共和国国家质量监督检验检疫总局\t13721\n万菱\t13722\n芒果娱乐网\t13723\nBurner\t13724\nosk\t13725\npopcorn\t13726\n简璎\t13727\n金秀儿\t13728\n管事\t13729\n短兵\t13730\n数值型\t13731\n坚队\t13732\n0707\t13733\n侠道\t13734\n送回\t13735\nDevExpress\t13736\n八点半\t13737\nviscosity\t13738\neShop\t13739\n原研药\t13740\n有问必\t13741\n歌厅\t13742\n光威复材\t13743\na5外包网\t13744\n杨六郎\t13745\n心蕊\t13746\nexists\t13747\ne40\t13748\n七彩色\t13749\n文摘\t13750\n无损探伤\t13751\n光罩\t13752\n拉用\t13753\n兄\t13754\n禹晟\t13755\n慧子\t13756\n后置\t13757\nLayouts\t13758\nhaode\t13759\n宁波博物馆\t13760\n武装部队\t13761\n几股\t13762\n2-1\t13763\nkjb\t13764\n壹方\t13765\nMS-DOS\t13766\n东华高级中学\t13767\n250亿元\t13768\n3.61\t13769\nporon\t13770\n镨\t13771\n市政务服务中心\t13772\n边站\t13773\n大学排名_高考网\t13774\n珠江夜游\t13775\n盖丽丽\t13776\n闪亮茗天\t13777\n耶罗\t13778\n江北大道\t13779\n关联方交易\t13780\n李严\t13781\n红石崖\t13782\n夏多\t13783\n中国移动通信集团广东有限公司\t13784\n227fdw\t13785\n振动器\t13786\n调税\t13787\n高祖本纪\t13788\n城市管理局\t13789\nMetadata\t13790\n乔迁\t13791\n青州在线\t13792\n三制\t13793\nreflow\t13794\n扼制\t13795\n国家节能中心\t13796\n实弹\t13797\n亚龙湾热带天堂森林公园\t13798\n一厘米\t13799\n深圳搬家公司\t13800\n高等教育机构\t13801\n亲子乐\t13802\nfatigue\t13803\nv5max\t13804\n华师大\t13805\n妥布霉素地塞米松眼膏\t13806\n1102\t13807\n生物药\t13808\n女频\t13809\n锐思\t13810\n县道\t13811\n发文规范\t13812\n2.5m\t13813\n气象万
千\t13814\n维生素b2\t13815\n陈旧\t13816\nlasik\t13817\n三整\t13818\n成都地铁6号线\t13819\n开源路\t13820\n音棉\t13821\n融侨悦城\t13822\n医用空气消毒机\t13823\n旺\t13824\n北京自然博物馆\t13825\n巡航导弹\t13826\n吮指\t13827\n电视版\t13828\n坡路\t13829\n自制\t13830\n0型\t13831\n1000X\t13832\nstarcraft\t13833\n因人而异\t13834\n霍先生\t13835\n成都市财政局\t13836\n滨湖城\t13837\n西湖十景\t13838\n诺曼底号\t13839\n封边\t13840\nEstablished\t13841\n萧远山\t13842\ngrenade\t13843\n医疗科技有限公司\t13844\n湖北医学会\t13845\n李光复\t13846\nLSI\t13847\npx4\t13848\nvaries\t13849\n淫贼\t13850\n耿斯汉\t13851\n博源\t13852\nkolor\t13853\n国家海洋局北海分局\t13854\n蹄花\t13855\n制香机\t13856\n2017年8月10日\t13857\n暂列金\t13858\n杭州市公共交通集团有限公司\t13859\n侦破\t13860\nNCE\t13861\n炎帝陵\t13862\n74条\t13863\ntkbSimplest\t13864\n250kw\t13865\n小丙\t13866\nore\t13867\n美肌\t13868\n小梅哥\t13869\nbvlgari\t13870\n雪琪\t13871\n建华区\t13872\n31mm\t13873\n360搜狗\t13874\n异丙醇\t13875\n意外事件\t13876\n门图\t13877\n中国卫通\t13878\n三最\t13879\n蓟县\t13880\n核泄漏\t13881\n洛娜\t13882\nsqueeze\t13883\ntradestation\t13884\nshaved\t13885\n天涯咫尺\t13886\n手写\t13887\n恥辱\t13888\n勇者传说\t13889\n经颅多普勒\t13890\nQuestions\t13891\n警世通言\t13892\n不积\t13893\n多恩布什\t13894\n流式\t13895\n金辉地产\t13896\n玉米烘干机\t13897\n碼\t13898\n201604\t13899\n工会费\t13900\n上股\t13901\n丸子大暴走独胆\t13902\n虹口体育场\t13903\n27代理\t13904\n中柱\t13905\n广汽传祺GM8\t13906\n源码寺\t13907\n5.2英寸\t13908\neGPU\t13909\ncuowu\t13910\n云轩\t13911\n补审\t13912\n樱桃采摘园\t13913\n大芒网\t13914\n中央网络安全和信息化领导小组\t13915\n塑料片\t13916\n小米耳机\t13917\nirvine\t13918\n╬\t13919\n队长小翼\t13920\n眼水\t13921\nm40\t13922\nspss卡方检验\t13923\n_顶红网\t13924\nlibssl.so.10\t13925\n哨点\t13926\n借债\t13927\n卡贴机\t13928\n40吨\t13929\n傻龙\t13930\n奥迪A3论坛_汽车之家论坛\t13931\n厦门集美区\t13932\njs-xlsx\t13933\n江苏雷利\t13934\n维美\t13935\n天富能源\t13936\n风骏\t13937\n火狐浏览器\t13938\n五户\t13939\n编上\t13940\nmicropython\t13941\n第二堆\t13942\nDATETIME\t13943\ndezhou\t13944\nKaspersky\t13945\nfever\t13946\n双处\t13947\n王卫国\t13948\n奥兰\t13949\nshaping\t13950\n正安县\t13951\n单元格里_\t13952\n1979\t13953\n点租\t13954\nTCS\t13955\n北京中西远大科技有限公司\t13956\n键盘侠网\t13957\n弈客\t13958\n刑事案件\t13959\newei_shopv2\t13960\nxin7\t13961\n泖港\t13962\n抗日片\t139
63\n建工集团\t13964\n自知\t13965\nliuyan\t13966\nLED照明\t13967\n绿豆粉\t13968\n南国早报数字报\t13969\n周浦万达广场\t13970\n同行人\t13971\n第二第三\t13972\n改装展\t13973\n牛津鞋\t13974\n反击战\t13975\n巷战\t13976\nFLOWERS\t13977\n中交第二航务工程局有限公司\t13978\nwin2000\t13979\n石家庄高铁站\t13980\n烈士\t13981\n清宫戏\t13982\n互联网协会\t13983\n吴昊宸\t13984\n同\t13985\n大发地产\t13986\nparalle\t13987\nRandom类\t13988\n哈尔滨工业大学航天学院\t13989\n2米\t13990\n小技巧\t13991\n林建\t13992\n维生素e乳\t13993\n难道说\t13994\n侠盗猎车手5/\t13995\nアイドル\t13996\n夹击\t13997\n郑耀\t13998\n华尔街日报\t13999\n月芽\t14000\n等值\t14001\n鸡胚\t14002\n九日游\t14003\ndyinglight\t14004\n衣俊卿\t14005\n无所顾忌\t14006\n迈阿密热火\t14007\n金纳多\t14008\nsmiling\t14009\n冒口\t14010\n45g\t14011\n孝里镇\t14012\nc++语言程序\t14013\n2008-2020年\t14014\n怅惘\t14015\n院士\t14016\n黑匣网\t14017\n延时函数\t14018\n瓦顶\t14019\n1.8T\t14020\n推倒\t14021\n星网视易\t14022\nnada\t14023\n百幅\t14024\n乾坤一掷\t14025\n绝无\t14026\nOpencart\t14027\n新伦理学\t14028\nyiyu\t14029\n哈尔滨工业大学研究生院\t14030\nyjizz\t14031\n柚柚控\t14032\n甘熙故居\t14033\n审计网\t14034\n焓熵\t14035\noffice2010\t14036\n张凤\t14037\n6222\t14038\n护法\t14039\n师大夜市\t14040\n苍洱\t14041\n鬼子们\t14042\n高顺\t14043\n挥发\t14044\nptfe\t14045\n毕业服\t14046\n热值\t14047\n张家口\t14048\n中国斗鱼\t14049\nberlin\t14050\nPSD分层\t14051\n搜狐旅游\t14052\nManners\t14053\nserver2003\t14054\n护角\t14055\n二流子\t14056\n何威\t14057\n张常宁\t14058\n查新\t14059\n明锐妖神记\t14060\nEVCARD\t14061\n龙珠超吧\t14062\n叶卡捷琳娜宫\t14063\n牛杂网\t14064\n横臂\t14065\n三一口语\t14066\nSockets\t14067\n鹿盔\t14068\ntext框\t14069\n所有子\t14070\n争食\t14071\n真币\t14072\n小客人\t14073\n山东黄金集团\t14074\n第三十六回\t14075\n八宿县\t14076\n船闸\t14077\n花鸟\t14078\n印地语\t14079\n中国新闻奖\t14080\n视频机\t14081\niPhone5/\t14082\n及方\t14083\n梨型\t14084\n黎锦\t14085\nHCNA\t14086\n今年底\t14087\n舞火\t14088\ntHe\t14089\n北京地铁新机场\t14090\n宿茂臻\t14091\n直播源\t14092\n七绝山\t14093\n曲调\t14094\n桃苑\t14095\n施密特\t14096\n北蔡镇\t14097\n大连市财政局\t14098\n海的女儿\t14099\n2016年春节\t14100\n滤纸\t14101\n锐酷\t14102\n文化局\t14103\n乐嘉丁泽仁\t14104\n凉月\t14105\n我的世界PE\t14106\n性与爱\t14107\ncaa\t14108\n石臼\t14109\n酸酸甜甜\t14110\nchunked\t14111\nx2.5\t14112\n傲雪\t14113\n启迪桑德环境资源股份有限公司\t14114\n南京领添信息技术有限公司\t14115\n杭州下城\t1
4116\n88.8\t14117\n城市热岛效应\t14118\n贵州省总工会\t14119\n柳鹏\t14120\n高卢战记\t14121\n起点站\t14122\n深闺人未识\t14123\nIllus\t14124\n亚心网\t14125\n赵本司马懿\t14126\n微博认证\t14127\n骂街\t14128\n动条\t14129\n血透机\t14130\nhzz\t14131\n听从\t14132\n阴极\t14133\n举手投足\t14134\n陕西省博物馆\t14135\n保种\t14136\n济南市国土资源局\t14137\nlun\t14138\n钦州报业网\t14139\n街舞工作室\t14140\n群众工作督导永远在路上\t14141\n黑蛋\t14142\n万人之上\t14143\n中国科学技术馆\t14144\n金杨\t14145\n教学型\t14146\n一到十\t14147\n出笼\t14148\n20180411\t14149\n湖北国税\t14150\n3xe\t14151\n小米Mix\t14152\n大冒险\t14153\n急性阑尾炎\t14154\nBeauty\t14155\n中国时尚品牌网\t14156\n新变\t14157\n_丁\t14158\n工勤\t14159\n上不了网\t14160\n九折\t14161\n鸡汁\t14162\n多次\t14163\n前列\t14164\n点菜机\t14165\n出片\t14166\n总路线\t14167\n55名\t14168\n爽片\t14169\n磨块\t14170\nDS1302\t14171\n白芒村\t14172\n均压环\t14173\n白喉\t14174\n老板椅\t14175\n学院派\t14176\n捣蛋鬼\t14177\n阿里阿里\t14178\nf368\t14179\n康托\t14180\n酸奶粉\t14181\n百度翻译器\t14182\n迈腾1.8t\t14183\n立顿\t14184\n彡彡九\t14185\n3.42\t14186\n中星6b\t14187\n异辛醇\t14188\nz3735f\t14189\n长潭\t14190\n夏期\t14191\n雅琴\t14192\n外嫁\t14193\n北京市朝阳区地方税务局\t14194\n江西电厂\t14195\n上海燃气\t14196\n新园路\t14197\n自体心理学\t14198\n下落\t14199\n王旭明\t14200\n面汤\t14201\nqq游览器\t14202\n还不能\t14203\n英魂之刃吧\t14204\nAVAST\t14205\n走读式\t14206\n30015\t14207\nawd\t14208\n清装\t14209\n碳酸钡\t14210\n腾讯云\t14211\nNSW\t14212\n法天下司法鉴定网\t14213\n淮安经济开发区\t14214\n石牌坊\t14215\n言情小说-恋上你\t14216\n实景\t14217\nNancy\t14218\n波场币\t14219\n糖尿病视网膜病变\t14220\nタベ\t14221\n登记证\t14222\n刘启\t14223\n市规划国土委\t14224\nStories\t14225\n金珍珠\t14226\n20161019\t14227\nMastercam\t14228\nhoudini吧_\t14229\n课文\t14230\n3.24\t14231\n木凳\t14232\ndestroy\t14233\n淮口\t14234\n减脂增肌\t14235\n汉化/RPG\t14236\n无痕阅盘\t14237\n糊糊\t14238\n几卷\t14239\n红莲螺岩\t14240\n小嶋\t14241\n破竹\t14242\n鞍山市中心医院\t14243\n2018年7月份\t14244\n低音\t14245\n无聊世界\t14246\n3轮\t14247\n十位\t14248\n卢璐\t14249\n华昌化工\t14250\n换模\t14251\n程亚军\t14252\n大明星们\t14253\nDoki\t14254\n钢铁\t14255\n裙长\t14256\n主胸\t14257\n董林\t14258\nleasing\t14259\n金属学\t14260\n金命\t14261\n发红\t14262\nJelena\t14263\n蒙娜丽莎\t14264\n谷登\t14265\n阿黄\t14266\n浙江科技学院\t14267\n蒹葭苍苍\t14268\n5门\t14269\n115分钟\t14270\n母上\t14271\n住院险\t14272\
nGB50204-2015\t14273\nWelcome\t14274\nwmm\t14275\n上海世博展览馆\t14276\ncrunch\t14277\n湖州市水利局\t14278\n求生之路\t14279\nsim卡\t14280\n润滑油泵\t14281\n迪亚斯\t14282\n0613\t14283\n明季\t14284\n可以攻玉\t14285\nbenq投影仪\t14286\n酰胺化\t14287\n西安航空学院\t14288\nunrelated\t14289\nグリモワ\t14290\n铜绿\t14291\nStroke\t14292\n十三宗\t14293\n教室\t14294\n希望号\t14295\n山西工程职业技术学院\t14296\n生料带\t14297\n太阳能蓄电池\t14298\n反应罐\t14299\nbacon\t14300\n硅谷公司\t14301\n苍天航路\t14302\n一技\t14303\n国兴\t14304\n一分为\t14305\n蚕\t14306\nEquality\t14307\nSpring+MyBatis\t14308\n洗衣机柜\t14309\n中考网_3773考试网\t14310\n华清学府城\t14311\n上肢\t14312\n陪着你\t14313\n泰州日报\t14314\nbsi\t14315\n镜妖\t14316\n香兰\t14317\n咸阳高新区\t14318\n汤口镇\t14319\n力大\t14320\n变活\t14321\n圣事\t14322\n新余学院\t14323\n钢件\t14324\n十二三岁\t14325\n8550u\t14326\n墨灯\t14327\n名贤\t14328\n旋元佑\t14329\n阳光总在风雨后\t14330\n血手\t14331\n清湖地铁站\t14332\ngcw\t14333\n曾先生\t14334\nmulti\t14335\n北京国家图书馆\t14336\n弹簧秤\t14337\n断肢\t14338\n实务\t14339\n为害\t14340\n怪坡\t14341\n郭志强\t14342\n重商主义\t14343\nTreeMap\t14344\n55万\t14345\n够级\t14346\nFio\t14347\n3月17\t14348\n2.88\t14349\n齐峰新材\t14350\n4.9米\t14351\n英林\t14352\nCARE\t14353\n樊雨\t14354\n窝牛\t14355\n琯溪蜜柚\t14356\nCarbide\t14357\n单照片\t14358\n领土\t14359\n招财猫鱼\t14360\n上海市静安区人民政府\t14361\n染血\t14362\nexe病毒\t14363\n琼脂\t14364\n詹姆士\t14365\n新浦\t14366\n筹谋\t14367\n迅雷侠\t14368\n冷光源\t14369\n欧拉定理\t14370\n有很多\t14371\n罗氏虾\t14372\n掌阅科技\t14373\n第九十六章\t14374\n簇\t14375\nLibraries\t14376\n设计业\t14377\n全智贤\t14378\n朱峰社区\t14379\n王冕\t14380\n轻量级\t14381\nearliest\t14382\n通俗文学\t14383\n陈建林\t14384\n845\t14385\n散味\t14386\ninner\t14387\ntaiyonghai\t14388\n毛瓷\t14389\n阿根廷比索\t14390\né\t14391\n棘手\t14392\n迎春\t14393\nEDGE浏览器\t14394\n发包方\t14395\n李超\t14396\n绝地求生亚服\t14397\nlang\t14398\n剑网三长歌\t14399\n湖北汽车工业学院\t14400\n海洋局\t14401\nLoss\t14402\n博士们\t14403\n李利\t14404\n说唐\t14405\nPP越狱助手\t14406\n第73集\t14407\n19_\t14408\n快播5.0\t14409\n荣耀荆轲\t14410\nflase\t14411\n中澳\t14412\nPYTHON\t14413\n区卫计局\t14414\n采埃孚\t14415\nordinator\t14416\n跳开\t14417\n野狼谷\t14418\n蜜袋鼯\t14419\n离不开你\t14420\nneo\t14421\n地角\t14422\nvc2015\t14423\n登场率\t14424\n输液室\t14425\n儿戏\t14426\n
陈川\t14427\nFighters\t14428\n拒稿\t14429\n偏光式\t14430\n扑克王\t14431\nSDH\t14432\n七喜\t14433\ndraft\t14434\nBoards\t14435\n上海外滩美术馆\t14436\n代书\t14437\n置顶吧网\t14438\n龙泽\t14439\n上海市虹口区政协\t14440\n昆明市统计局\t14441\n阿尔法小蛋\t14442\n南高\t14443\ngenesis\t14444\n六讲\t14445\n中国大厦\t14446\n梦者\t14447\nwen\t14448\n化煞\t14449\n自动化测试\t14450\n九江三中\t14451\n伤人案\t14452\n玉祁镇\t14453\n厦门中国银行\t14454\n自爆\t14455\nisdn\t14456\n过份\t14457\n肥妹\t14458\n新建区\t14459\n南方水泥有限公司\t14460\nminipcie\t14461\n阿胶枣\t14462\n26位\t14463\n麦克森\t14464\n嘉尔\t14465\n华大应用心理研究院\t14466\n韩秋雪\t14467\n招商街道\t14468\n扭头\t14469\n大丑闻\t14470\n品德与生活\t14471\n位在\t14472\n洪恩教育\t14473\n重庆农商行\t14474\n雪莲贴\t14475\nCAD块\t14476\n厚薄\t14477\n充气娃\t14478\n荣盛华府\t14479\nfranais\t14480\n台庆\t14481\n更富\t14482\n丽水市莲都政府\t14483\n低电压版\t14484\n猪扒\t14485\n秦惠王\t14486\n棋风\t14487\n康立\t14488\n城报\t14489\n112家\t14490\n顾不上\t14491\n1万一年\t14492\n空客a380\t14493\n生离死别\t14494\n全园\t14495\n居民户\t14496\n津巴布\t14497\n上海电台\t14498\n美丽的蝴蝶\t14499\n穆迪\t14500\n超级记忆法\t14501\n包子机\t14502\n煅烧\t14503\n万海\t14504\n奇速\t14505\n中日友好医院\t14506\nECA\t14507\n清风寨\t14508\n金杨新村街道\t14509\n佟梦实\t14510\n沢野美香\t14511\nChoose\t14512\n购汇\t14513\nhunter\t14514\n2式\t14515\n列排序\t14516\n湖南省人大常委会\t14517\n抹茶控\t14518\n审片\t14519\n黄半仙\t14520\n真爱的谎言之破冰者\t14521\n转基因玉米\t14522\n逛展\t14523\n使徒\t14524\n买提\t14525\nKSF\t14526\nfc吧_\t14527\n哈尔滨电信\t14528\n16平方\t14529\n柳叶\t14530\n海豚简笔画\t14531\n欲潮\t14532\n赤身\t14533\neasyBCD\t14534\nAllroad\t14535\n石梅\t14536\n纳尔逊\t14537\n叠级\t14538\n7650k\t14539\n平语\t14540\n小黄\t14541\n丁字沽\t14542\n植物体\t14543\n脑胀\t14544\n人造人\t14545\nHAT\t14546\n中村\t14547\n可投\t14548\n包边机\t14549\n二恶英\t14550\n久久久\t14551\n第11天\t14552\n祝桥镇\t14553\n张天翼\t14554\n明珠社区\t14555\n创造型\t14556\n低价股\t14557\n扣头\t14558\n武警学院\t14559\n太溪穴\t14560\n菜锅\t14561\n七贤\t14562\nDMR\t14563\n陈乔\t14564\n大提琴\t14565\n武广新城\t14566\n佳节\t14567\n枢关节\t14568\n七里桥\t14569\n莫恭明\t14570\n米脂中学\t14571\n虎门大桥\t14572\n青提\t14573\n梁正\t14574\n披头\t14575\n思科路由器\t14576\n昆仑通泰\t14577\ninterval\t14578\n33小说网\t14579\n美心\t14580\n勤\t14581\n备抵\t14582\n金言\t14583\n3d侠模型网\t14584\nオンラインストア\t14585\nT
ale\t14586\n总配电箱\t14587\n李群\t14588\nwilderness\t14589\n舒泰清\t14590\n干旱区\t14591\n后本\t14592\n十堰\t14593\n1800分\t14594\n暖气管\t14595\n审限\t14596\n财主\t14597\n多分类变量\t14598\n李小红\t14599\n后勤服务中心\t14600\n豆儿\t14601\n奥美定\t14602\nwebflux\t14603\n1.09G\t14604\n放贷\t14605\n奇游联机加速器\t14606\n任性\t14607\n毛勇\t14608\n天魔\t14609\n35天\t14610\n青竹湖\t14611\n艾比湖\t14612\n济南国际会展中心\t14613\n净浆\t14614\n编外\t14615\n吸毒者\t14616\n龙华西路\t14617\n枝干\t14618\nAir12\t14619\n120i\t14620\n埃托奥\t14621\n糖心\t14622\nEnquiry\t14623\nDL版\t14624\n定位板\t14625\n联机版\t14626\n实录\t14627\n龙林\t14628\n百_\t14629\n血环\t14630\n炎黄纵横\t14631\n纳米7\t14632\n新疆维吾尔自治区委员会\t14633\n二甲双胍片\t14634\n陶明伦\t14635\n玛妮\t14636\ntpee\t14637\nWLC\t14638\n0003\t14639\n百杰\t14640\njavaws\t14641\n完全竞争\t14642\n脚法\t14643\n20171103\t14644\nCovers\t14645\n肺结节病\t14646\n海底光缆\t14647\n传粉\t14648\n常青花园\t14649\n1-6年\t14650\n自得\t14651\n瑟银\t14652\n魏国\t14653\n花使者\t14654\nx射线\t14655\n黄菡\t14656\n减压器\t14657\n布拉格机场\t14658\nPluralEyes\t14659\n不息\t14660\n咸湿\t14661\n黑竹沟\t14662\n天堂之门\t14663\n第二十一课\t14664\n狂魔\t14665\n良性\t14666\nRehab\t14667\n王鉴\t14668\n吊运机\t14669\n白群晖\t14670\nXMR门罗币\t14671\n全麦粉\t14672\n狐朋狗友\t14673\n武汉_新闻中心\t14674\n南昌住房公积金管理中心\t14675\n6家\t14676\n桂源铺\t14677\n龟友网\t14678\n冷淡期\t14679\n资产证券化率\t14680\n3000w\t14681\n影响类\t14682\n180407\t14683\n明盛\t14684\nreflective\t14685\n示现\t14686\n_龙门镖局\t14687\n在途网\t14688\n看人生\t14689\n十九首\t14690\n慈山寺\t14691\n键码\t14692\n四月三日\t14693\n鳗\t14694\n北京化工大学理学院\t14695\n裕丰\t14696\n求帮\t14697\nRS232串口通信\t14698\n德州市\t14699\n8PIN\t14700\n对打的\t14701\npewdiepie\t14702\nrecruiting\t14703\n城村\t14704\n巴中_巴中政府网\t14705\n无间道2018\t14706\njingxiang\t14707\n松鹤楼\t14708\n强身术\t14709\n三半\t14710\n纪游\t14711\n苄基\t14712\n中国铁道建筑报\t14713\n画坛\t14714\n奥莱尔\t14715\n0011\t14716\n下叉\t14717\n袁家岗\t14718\nteststand\t14719\n梦幻古龙\t14720\n亚虎娱乐\t14721\npetit\t14722\nanisotropic\t14723\n排队论\t14724\n乐max2\t14725\n糊盒机\t14726\n苏高新集团\t14727\n苏通科技产业园\t14728\n为功\t14729\n瑞纳瑞虎\t14730\n第8届\t14731\n得益于\t14732\n窍日\t14733\n萧山\t14734\n200毫升\t14735\n武夷山市\t14736\nFlashGet\t14737\n有朝一日\t14738\n受\t14739\n谈婚论嫁\
t14740\n玉府\t14741\n中共八大\t14742\n被解救的姜戈\t14743\n石金钱龟\t14744\n民宗局\t14745\n2350305682\t14746\n宝龙城市广场\t14747\nSpeak\t14748\n自尽\t14749\n2606\t14750\n传播学院\t14751\n联合早报\t14752\n33uuxx\t14753\nひ\t14754\nminecraft\t14755\n四|\t14756\n兰新高铁\t14757\n词作\t14758\n红蚂蚁\t14759\n6月21日\t14760\nMailer\t14761\n61所\t14762\nPPAR\t14763\n肩饰\t14764\n轴线\t14765\nGboard\t14766\nGin\t14767\nPSP2000\t14768\n川普\t14769\n李修平\t14770\n零售界\t14771\n室速\t14772\n2015年前\t14773\n常石磊\t14774\n等你\t14775\n台州市区\t14776\nHungarian\t14777\nSEM优化网\t14778\n园田美樱\t14779\ncv2.imshow\t14780\nkd树\t14781\n比格\t14782\n0.9.2\t14783\n外圈\t14784\n晨尿\t14785\n远端\t14786\ntO\t14787\n第十四季\t14788\n易小川\t14789\n刘玉龙\t14790\na股\t14791\n喋血\t14792\n梳篦\t14793\n区人大\t14794\n6502\t14795\n20150301\t14796\n96张\t14797\n胸水\t14798\ntexas\t14799\n严紧\t14800\n六帝钱\t14801\n韦力\t14802\n狼主\t14803\n咪咕文化科技有限公司\t14804\n水煮肉片\t14805\n天山大街\t14806\n清河路\t14807\n神马专车\t14808\n佐仓绫音\t14809\n买电器\t14810\n陈卫平\t14811\n外阴\t14812\n隆昌\t14813\n战斗包\t14814\n中国药典\t14815\nPDF格式\t14816\nfsh\t14817\n陕人社\t14818\n宁波栎社机场\t14819\n李秀华\t14820\nnoip2017\t14821\n巧选\t14822\n邻城\t14823\n日照置业网\t14824\n米蕉\t14825\n大连市国土资源和房屋局\t14826\n曲曲\t14827\n投料\t14828\n联想笔记本电池\t14829\n4000米\t14830\n碾子山区\t14831\nTwilight\t14832\n1004\t14833\n发胀\t14834\n魅族MX5\t14835\nnordic\t14836\n联校\t14837\n谢中书书\t14838\n王牌师\t14839\n通鉴\t14840\n通联\t14841\n百度图表\t14842\n汕昆高速\t14843\n18春\t14844\n小针\t14845\n丧葬\t14846\n独尊\t14847\n广东工业大学\t14848\nqudong\t14849\n广夏\t14850\n樊城区政府\t14851\n角色歌\t14852\nSENSOR\t14853\n卡乐比麦片\t14854\n提升式\t14855\n体腔\t14856\nhtac\t14857\n水镀\t14858\n首问\t14859\n东丽湖\t14860\n高量\t14861\n好写\t14862\n龙旗\t14863\n淘宝天猫京东\t14864\n7挡\t14865\n10.0.6\t14866\n20151024\t14867\norcale数据库\t14868\n20160619\t14869\n强弱\t14870\n狼灵\t14871\n53名\t14872\n发飙\t14873\n虞氏\t14874\n伊贺\t14875\nabqus\t14876\n硅藻土\t14877\n三废\t14878\n荣威RX8\t14879\n围度\t14880\n张保仔\t14881\nTeclast\t14882\nFairchild\t14883\n正等\t14884\n上海市委\t14885\n几桌\t14886\nメリトクラシ\t14887\n大保国\t14888\n人和物\t14889\n网架\t14890\nshoebox\t14891\n世界之窗浏览器\t14892\n天下有道\t14893\nemma\t14894\n吉大三院\t1
4895\n银楼\t14896\n怜爱\t14897\nClaim\t14898\n鱿鱼\t14899\n擒拿\t14900\n两性\t14901\n北京现代朗动\t14902\n四城区\t14903\n戴河\t14904\n地区级\t14905\n老虎\t14906\n83521388\t14907\nafp\t14908\nDouble\t14909\niqueryable\t14910\n微单A9\t14911\n899集\t14912\nTaxation\t14913\n90平方\t14914\n吾身\t14915\nPioneer\t14916\n包埋盒\t14917\n回到过去\t14918\n重力坝\t14919\n占山为王\t14920\n尸气\t14921\n百步穿杨\t14922\n澳新\t14923\n20.00\t14924\n冰女\t14925\n升顶\t14926\n安远路\t14927\n永历\t14928\n海肠\t14929\n西安新城\t14930\n纪梵希\t14931\n赛题\t14932\n雷动\t14933\n喘鸣\t14934\n追梦网\t14935\n访问器\t14936\nFOF基金\t14937\nlogisim\t14938\n四念处\t14939\ne度\t14940\n哥特萝莉\t14941\n降糖药\t14942\n不导\t14943\n光子嫩肤仪\t14944\n折纸玫瑰花\t14945\n护栏板\t14946\nrpcs3吧\t14947\n炸掉\t14948\n主楼\t14949\n不稀奇\t14950\n5寸\t14951\nThey\t14952\n赵本\t14953\n8点钟\t14954\n博兴政府网\t14955\n红蓝\t14956\nyii2.0\t14957\nK2路\t14958\n证据不足\t14959\n程立\t14960\n子层\t14961\nErolenta\t14962\n热塑性塑料\t14963\n【RTZ\t14964\nmedicines\t14965\n118\t14966\n华亭镇\t14967\n13689280139\t14968\n东亭镇\t14969\n网络教育学院\t14970\n撸啊骐达\t14971\n成都地铁\t14972\ne350\t14973\n侠盗猎车手5\t14974\n竹子社\t14975\n李浩宇\t14976\n碧桂园山河城\t14977\n劳务成本\t14978\n11.29\t14979\n工作费\t14980\n多公里\t14981\nBT磁力链双语字幕迅雷百度云\t14982\n工艺品\t14983\n桥博\t14984\nMenhera\t14985\n二氯丙烷\t14986\n广网\t14987\nArlington\t14988\n众视\t14989\n77.7万辆\t14990\ntheother\t14991\n共和镇\t14992\n急流槽\t14993\n52pkFIFA\t14994\n龙城大街\t14995\n农业产业信息网\t14996\n相似处\t14997\n阿纳金\t14998\n威海站\t14999\n李太白\t15000\n神似\t15001\n安非他命\t15002\n顺德长鹿农庄\t15003\n道德情操论\t15004\n宋茜\t15005\nBusters\t15006\n大侠们\t15007\n比较型\t15008\n跟我学Shiro\t15009\n中国砂石协会\t15010\n纯银\t15011\n贵\t15012\n美赞成\t15013\nPrevention\t15014\n林羽\t15015\n亚联财\t15016\n人物表\t15017\n北京协和医学院公共卫生学院\t15018\nNONONO\t15019\n直辖市\t15020\n维保员\t15021\n<\t15022\n蚕豆病\t15023\n陶泥\t15024\n2018年3月28日\t15025\n加洛西\t15026\nhandsontable\t15027\nGPU版\t15028\ntext函数\t15029\nsolids\t15030\n卷头\t15031\n工作薄\t15032\n下河\t15033\n夏荷\t15034\n3d片源\t15035\n第49章\t15036\n离散型随机变量\t15037\n称得\t15038\nrbegin\t15039\n2017年3月11日\t15040\n冒险\t15041\n硬梆梆\t15042\n巨蛋\t15043\n智能网\t15044\n载客\t15045\n1.6%\t15046\nDelay\t15047\
n锡林郭勒\t15048\n水花\t15049\nKeithTt\t15050\n迈达斯\t15051\nSTAYREAL\t15052\nuina\t15053\n小序\t15054\n广东省人民政府国有资产监督管理委员会\t15055\n加步\t15056\n5.3.5\t15057\n丈毛\t15058\n62\t15059\n莱曼\t15060\n汽锅\t15061\n中河\t15062\n御制\t15063\n伤亡\t15064\n张琳\t15065\n沈梦\t15066\n鸣响\t15067\n元辰宫\t15068\n企业管理费\t15069\n微果酱\t15070\n坐管\t15071\n杨艳\t15072\nFCE\t15073\n6318\t15074\n咳出\t15075\nlinkz\t15076\n同母\t15077\nusb\t15078\n漳州火车站\t15079\n联想(北京)有限公司\t15080\n菩提道次第广论\t15081\nmemcached\t15082\ntelecom\t15083\n娘亲舅大\t15084\nSHADOW\t15085\nTRADE\t15086\n1-9月份\t15087\n廖敏\t15088\n埋件\t15089\n比速\t15090\n知食\t15091\n王龙\t15092\n电镐\t15093\nELLA\t15094\n房天下英国房产网\t15095\n从此\t15096\n杜丹\t15097\n麦可思\t15098\n金陵药业\t15099\nsacrifice\t15100\ntiny4412\t15101\n框选\t15102\n罗娟\t15103\n龙湾村\t15104\n申思\t15105\n任我行\t15106\n肾母细胞瘤\t15107\n深圳市投资控股有限公司\t15108\nfot\t15109\n1231v3\t15110\n12C\t15111\nkPa\t15112\n江西师范大学附属中学\t15113\n1.6\t15114\n毛利润率\t15115\n鸟兽\t15116\n白粉\t15117\n加持\t15118\n猛鬼屠房\t15119\n嘉宝果\t15120\n虚开\t15121\n阜宁\t15122\n3顿\t15123\n上阳宫\t15124\n穴\t15125\n对焊\t15126\n廉风\t15127\n流窜\t15128\n青海湖鸟岛\t15129\n佳能EOS\t15130\n苹果屏\t15131\ngalileo\t15132\nantennas\t15133\n5020\t15134\n看戏\t15135\n绝热\t15136\n苦于\t15137\n陈光明\t15138\n深圳音乐厅\t15139\nMSI季\t15140\nxiaochao\t15141\n龙婆培\t15142\nmach3\t15143\n浦发手机银行\t15144\n宇宙空间\t15145\n野果\t15146\nzhiren\t15147\nIm\t15148\n创刻\t15149\n婚礼服\t15150\n活人\t15151\n两级\t15152\n地产界\t15153\n大招\t15154\n奥飞动漫\t15155\n将计就计\t15156\n杨彦\t15157\n淑芬\t15158\n防弊屏\t15159\nsaler\t15160\niData\t15161\n金士杰\t15162\n郑重\t15163\nst段\t15164\n500cm\t15165\n徐英\t15166\n铁齿铜牙纪晓岚\t15167\n33IQ\t15168\n洗濯屋\t15169\nshoo\t15170\n半导体激光治疗仪\t15171\n多口\t15172\n2.4a\t15173\n文集\t15174\n61616\t15175\n第十四节\t15176\n2队\t15177\n招标投标法实施条例\t15178\n社保法\t15179\n位居\t15180\n眶隔\t15181\n毽\t15182\n中标题\t15183\nZhong\t15184\n当宁消防网\t15185\n叫绝\t15186\n奥山集团\t15187\n电堆\t15188\nfakepath\t15189\n两位\t15190\n下肢动脉硬化闭塞症\t15191\nlous\t15192\n台湾\t15193\n结果子\t15194\n梦幻麻将馆\t15195\n车辆工程学院\t15196\nstatements\t15197\n北京胸科医院\t15198\n塞纳里奥\t15199\n张磊\t15200\n焊锡机\t15201\n2018-03-10\t15202\
n上帝之窗\t15203\n联村\t15204\n11.1\t15205\nほ\t15206\nS1810\t15207\n124平米\t15208\n600倍\t15209\n塑料瓶\t15210\n屌炸\t15211\n科盟\t15212\n行级\t15213\n新乡市人民政府\t15214\ndozer\t15215\nrevolution\t15216\n石沉大海\t15217\n西安户县\t15218\n压型\t15219\n李义山\t15220\norderby\t15221\nファイタ\t15222\n新能源汽车网\t15223\n知青\t15224\n三星s9+\t15225\nhd700\t15226\nyouna\t15227\n凤丹\t15228\n招牌\t15229\nDOOM\t15230\n熊宝\t15231\n钻石山\t15232\n官僚制\t15233\nshell,sed\t15234\n1235\t15235\n127\t15236\n硬轨\t15237\n用友u8\t15238\n挖角\t15239\n东方威尼斯\t15240\n转发率\t15241\n上颌骨\t15242\n留鼻血\t15243\n美俄\t15244\nwww.a9vg.com\t15245\n房途\t15246\nsetsuei\t15247\n三科光电\t15248\n冲片\t15249\nxlsx格式\t15250\nv5.3\t15251\n爱华\t15252\n讯网\t15253\n发落\t15254\n淮\t15255\nPID控制\t15256\n澜山\t15257\nGmbh\t15258\n26件\t15259\nNVDIA\t15260\n宽甸县\t15261\n图形库\t15262\n开荤\t15263\n赫敏·格兰杰\t15264\n表观\t15265\nファン\t15266\n通红\t15267\n沪江高考资源网\t15268\ndrilling\t15269\n三国群英传-霸王之业\t15270\n3750G\t15271\nA1822\t15272\n小象\t15273\n郭先生\t15274\n8087\t15275\n0.8\t15276\n韩志强\t15277\n手摇泵\t15278\nClark\t15279\n比热血\t15280\n泛亚汽车技术中心有限公司\t15281\n勇者斗恶龙\t15282\nvag\t15283\n花荄镇\t15284\n邵仲衡\t15285\n窝点\t15286\ndoors\t15287\nballet\t15288\n行政事业性收费\t15289\nangry\t15290\n康字\t15291\n山楂糕\t15292\n迪士尼动画电影\t15293\nITtribe\t15294\n美丽人妻\t15295\n卸货\t15296\n朗格汉斯\t15297\n煌黑龙\t15298\n奸计\t15299\n首套房贷款利率表\t15300\n三份\t15301\n样品柜\t15302\n山城棒棒军\t15303\nhexagon\t15304\nJLINK\t15305\ne570c\t15306\n浙江大学宁波理工学院\t15307\n奶花\t15308\n安卓4.0\t15309\nbilibili直播姬\t15310\nuibe\t15311\n拈花\t15312\n黑灰\t15313\nk个\t15314\n实作\t15315\n腾讯微校\t15316\n拾起\t15317\nCubase5\t15318\n长藓\t15319\n邹阳\t15320\n极路由3\t15321\n列夫托尔斯泰\t15322\n收拾\t15323\n李献计历险记\t15324\n狮跑\t15325\n孕妇裙\t15326\n王升\t15327\naune\t15328\n自勉\t15329\n郑智化\t15330\n1000个\t15331\n思想道德修养\t15332\n艾本德\t15333\n常州市人民政府\t15334\n热吻\t15335\n佛国\t15336\nHN\t15337\n3dmgame.com\t15338\n7.3.3\t15339\n会计招聘网\t15340\nUIM卡\t15341\n徐军\t15342\n上海场\t15343\n进门\t15344\nsinglemessage\t15345\nm3s\t15346\ncisco\t15347\n定子\t15348\n四子\t15349\n成都商报|成都商报电\t15350\n静态混合器\t15351\nKONKA\t15352\n古装照\t15353\n任毅\t15354\n节能\t153
55\nLoull\t15356\nPyerlife\t15357\n小米无线鼠标\t15358\nmoviemaker\t15359\n不败之地\t15360\n肠子\t15361\n双美\t15362\n金沙萨\t15363\n759\t15364\n卫生巾\t15365\nbjl\t15366\n1W\t15367\n上海希尔顿酒店\t15368\n中医学\t15369\n9V\t15370\n地心引力\t15371\n错别\t15372\ne3d\t15373\n招贴网\t15374\n脓血\t15375\ngaode\t15376\n前区\t15377\n+子\t15378\n备忘录\t15379\nviu\t15380\n千分比\t15381\n申万宏源\t15382\n朱杰\t15383\n姜汝祥\t15384\n南京铁道职业技术学院\t15385\n6.3\t15386\n15CM\t15387\n交投集团\t15388\n美妆护肤辣妈帮\t15389\n瀚纳仕\t15390\noracle11g数据库\t15391\nParsons\t15392\n中国化学工程股份有限公司\t15393\n店率\t15394\nm430\t15395\n深白\t15396\n月经不来\t15397\n1帧\t15398\nDIVE\t15399\n沟通\t15400\n焦糖冬瓜\t15401\n纤维素醚\t15402\n═\t15403\n三巨\t15404\n烈士陵园\t15405\nSamba\t15406\n全新\t15407\n离子膜烧碱\t15408\n自然石\t15409\nLenses\t15410\n经开区\t15411\n区信访局\t15412\n幻兽王\t15413\nperm\t15414\n王府\t15415\n湾子\t15416\n汤沟镇\t15417\nJSON\t15418\nps卡\t15419\ndazhong\t15420\n花女\t15421\n皮层\t15422\n6.4分\t15423\nFeekr\t15424\nfences\t15425\ntransport\t15426\n毛俊杰\t15427\n灰度级\t15428\n袁牧\t15429\nnove\t15430\n起亚kx3\t15431\n成就感\t15432\n诛仙吧_\t15433\n斩服\t15434\n病毒攻击\t15435\n7.7\t15436\n东莞汽车站\t15437\n靠着\t15438\n金融债\t15439\n慈文传媒\t15440\n掠夺\t15441\n张择端\t15442\n網路\t15443\nlovebeauty\t15444\n谎言\t15445\n习近平关于青少年和共青团工作论述摘编\t15446\n粉穴\t15447\nstops\t15448\n赚网\t15449\n同行者\t15450\n青年湖\t15451\n峰谷平\t15452\n存销比\t15453\n北京大厦\t15454\n同义句\t15455\n气针\t15456\nencodings\t15457\n洋洋洒洒\t15458\n1.2.16\t15459\n怀化市\t15460\nb350m\t15461\n肖培\t15462\n金陵科技学院\t15463\n风雨哈佛路\t15464\n河间驴肉火烧\t15465\n水旗\t15466\n新番外\t15467\n注射用头孢曲松钠\t15468\n绿巨人浩克\t15469\n雪海\t15470\n读书人\t15471\n中铁电气化局\t15472\n东西湖区\t15473\n月牙湾\t15474\n讹传\t15475\n铁骨铮铮\t15476\n明治维新\t15477\n出刊\t15478\n恶魔之魂\t15479\ncollins\t15480\n随县网\t15481\n业精\t15482\n赵君\t15483\n佛山市住房公积金管理中心\t15484\n驱蚊贴\t15485\n20个\t15486\n2017年4月21日\t15487\n107g\t15488\n脚骨\t15489\n0375\t15490\n波斯王子5\t15491\n磁灶\t15492\n卡办卡\t15493\n美的公园天下\t15494\n95W\t15495\nTh\t15496\nVimium\t15497\n若妻\t15498\n金所炫\t15499\n天骄无双\t15500\n加仓\t15501\n陈道明\t15502\n舞\t15503\n两周左右\t15504\n美士\t15505\nSAGE\t15506\nrandom函数\t15507\n203路\t15508\n69\t15
509\n五宝镇\t15510\ngrbl\t15511\n31周\t15512\n2.0.0.0\t15513\njournals\t15514\n几点\t15515\n料架\t15516\n东鼎\t15517\n王源\t15518\ng1\t15519\n窦性心动过速\t15520\nFacilities\t15521\n鲁迅美术学院\t15522\n白文\t15523\n90届\t15524\n周晓东\t15525\n林红\t15526\n挑梁\t15527\n114NBA\t15528\n消耗战\t15529\n小丰\t15530\n学诗\t15531\n象兵\t15532\n沂蒙山小调\t15533\n上海宝信软件股份有限公司\t15534\n底鼓\t15535\nbiostar\t15536\n父辈\t15537\nclass类\t15538\n银行间市场\t15539\n1注\t15540\n懂情\t15541\n言不由衷\t15542\n小罗\t15543\nEaton\t15544\n半节\t15545\n误选\t15546\n驾驶员\t15547\n去哪儿玩\t15548\n小王爷\t15549\n换车\t15550\n新上市\t15551\n我们这一代\t15552\n店家\t15553\n大北窑\t15554\n进发\t15555\n通榆\t15556\n着落\t15557\n阿玛尼艺术公寓\t15558\n扩音机\t15559\n计分\t15560\n苏州工业园区一站式服务中心\t15561\ndropout\t15562\n核磁氢谱\t15563\nChin\t15564\n开心就好\t15565\n制宪\t15566\n红砖美术馆\t15567\n神风特攻队\t15568\n13.6.1\t15569\nGA-H61M-DS2\t15570\n芦席\t15571\nworl\t15572\n杂兴\t15573\n巴西国家队\t15574\n产业价值链\t15575\n随卦\t15576\n白村\t15577\n那个谁\t15578\n苍翼之刃\t15579\n非战\t15580\n巡回\t15581\n青岛海信\t15582\nScarborough\t15583\n东兴小区\t15584\n司法部\t15585\n交警队\t15586\n阿米尔·汗\t15587\n野山\t15588\nbaseurl\t15589\n新盖\t15590\n相空间\t15591\nDEF\t15592\n耳前瘘管\t15593\n后浇\t15594\n136号\t15595\n康涅狄格大学\t15596\n刘邦\t15597\n圣迭戈\t15598\n鹩哥\t15599\n整修\t15600\nKaraf\t15601\nM118w\t15602\n二锅头酒\t15603\n上地东里\t15604\nVacation\t15605\nOlympics\t15606\niPhone5s/\t15607\n东莞市人力资源局\t15608\nMSVCR110.dll\t15609\n裙女\t15610\n选号网\t15611\n江南晚报网\t15612\n空洞型\t15613\n马鬃岭\t15614\n七节\t15615\n萁\t15616\n火车北站\t15617\n乱花渐欲\t15618\n方欣\t15619\nproxifier\t15620\n天府五街\t15621\n福建明溪县政府\t15622\n十一夜\t15623\n清华大学第一附属医院\t15624\nxpt\t15625\nVHDL语言\t15626\n火歌\t15627\nLBS\t15628\nJava线程池\t15629\nOutput\t15630\nojdbc\t15631\n42张\t15632\n天理何在\t15633\n黑恶势力\t15634\n公益性岗位\t15635\n退店\t15636\n906路\t15637\n图形的旋转\t15638\n得意家居网\t15639\n杨宗纬\t15640\n2980\t15641\n惨祸\t15642\n余值\t15643\n第一百零九章\t15644\n红石公园\t15645\nREADER\t15646\n沦为\t15647\n镇海区人民政府\t15648\n惊沙\t15649\nSolvent\t15650\ncelery\t15651\n收藏价\t15652\n聚品\t15653\n见证取样\t15654\n2005-2017年\t15655\n周三\t15656\n家庭出身\t15657\n前奏\t15658\nSuede\t15659\n合成纤维\t15660\n奔向\t15661
\n芜湖市\t15662\n韬略\t15663\n十九岁\t15664\n纸样\t15665\n①\t15666\n渐出\t15667\n星卡\t15668\nchastity\t15669\n电子客票\t15670\nlangue\t15671\n105_\t15672\n博途\t15673\n网警\t15674\ne0\t15675\nshunde\t15676\n600392\t15677\n冷山\t15678\n三刀\t15679\n香皂\t15680\nPxCook\t15681\npdt\t15682\n123_\t15683\n湖北生物科技职业学院\t15684\n青叶\t15685\nkuaidi\t15686\n托奇\t15687\nbirds\t15688\n龙河路\t15689\n合肥科学岛\t15690\nmatl\t15691\n求索\t15692\n丰宁满族自治县\t15693\n间质瘤\t15694\n详注\t15695\n命令行\t15696\n荒地\t15697\n陈全国\t15698\n雪兔\t15699\n230个\t15700\n山南地区\t15701\n学完\t15702\n赤道仪\t15703\n倾颜\t15704\n红酥手\t15705\n陆静\t15706\nitt\t15707\n清房\t15708\n牙片机\t15709\n补光灯\t15710\n无糖可乐\t15711\n王家庄村\t15712\n乱飞\t15713\nEngagement\t15714\n郝立新\t15715\n怪医\t15716\n工工\t15717\nr星\t15718\n天体\t15719\nCree\t15720\n复式条形统计图\t15721\n梅岭小学\t15722\n爬行\t15723\n◥\t15724\nArchive\t15725\n阿字\t15726\n锦瑟沈梦辰\t15727\n中天微\t15728\n金光综艺馆\t15729\n锐派\t15730\nvncserver\t15731\n剑灵咒术师\t15732\n奥迪王牌特工\t15733\n湖南台\t15734\n20D\t15735\n0.1.6\t15736\n银皇后\t15737\n李海娜\t15738\n殷健灵\t15739\n甜象草\t15740\n肝素\t15741\n佛书\t15742\ncounts\t15743\n帖吧\t15744\n望都县\t15745\n园里\t15746\n配符\t15747\n叔叔们\t15748\n杭州公益中学\t15749\n教规\t15750\n瓯剧\t15751\n大芒果\t15752\nFramework4.0\t15753\nTruncate\t15754\n管服\t15755\n洪钟\t15756\n伞妖\t15757\n二十五载\t15758\n艾高\t15759\n拆装式\t15760\n当街\t15761\n引洮\t15762\n易经的奥秘\t15763\n12cm\t15764\nabb变频器\t15765\nseminar\t15766\nぜ\t15767\n2.4_\t15768\n丽江\t15769\n南京先锋书店\t15770\n5500\t15771\n会员名\t15772\n酷骑\t15773\n统计表\t15774\n前一月\t15775\n蓝盾\t15776\n_果果\t15777\n调度员\t15778\nDDL\t15779\naero\t15780\n寄寓\t15781\n锤子m1l\t15782\n海得\t15783\n李奇霖\t15784\n萍果\t15785\n思言\t15786\n端点\t15787\n年月日\t15788\n沙鲁克汗\t15789\napdu\t15790\n华晨华罗庚\t15791\ncory\t15792\n用友票据通\t15793\n解禁股\t15794\n美团旅行\t15795\n英雄联盟吧_\t15796\n鼎复\t15797\ninformix\t15798\nghz\t15799\nx230t\t15800\n破壁机\t15801\n导电率\t15802\n怡心园\t15803\n传热系数\t15804\n大方向\t15805\n恒温调奶器\t15806\n樊玲\t15807\n性爱床\t15808\n宁政发\t15809\nSwitzerland\t15810\n发货量\t15811\n外篇\t15812\n烧感\t15813\n土基回弹\t15814\n新居\t15815\nCOOKIES\t15816\n短期利率\t15817\nauthenticator\t15818\n酒瓶\t15819\n酸奶
\t15820\n阿部乃美久\t15821\n小六\t15822\n天会\t15823\n妇科千金片\t15824\n氧体\t15825\n拼单\t15826\n紧张度\t15827\n铁板饭\t15828\n20多次\t15829\n不紧不慢\t15830\n戏剧性\t15831\n南昌人才网\t15832\ndataframe\t15833\n烧酒\t15834\n横琴岛\t15835\n&#4\t15836\n张彻\t15837\n酒店类\t15838\n完美匹配\t15839\n桃花村\t15840\n卡号\t15841\n天堂之吻\t15842\n马雪云\t15843\n20170811\t15844\n第二十四\t15845\n赵一荻\t15846\n充充\t15847\n宏伟区\t15848\n中央级\t15849\n这下\t15850\n一下秒\t15851\n2680元\t15852\n七八年\t15853\n团泊新城\t15854\n网博\t15855\nBilling\t15856\nefm\t15857\n第6轮\t15858\n煦\t15859\nmmTrix\t15860\n酷讯\t15861\n菇\t15862\n医疗\t15863\nDDP\t15864\n青田石\t15865\n2017-12-29\t15866\nx5内核\t15867\n干机\t15868\n珠片\t15869\n华测\t15870\n7990\t15871\n刘浏\t15872\nHTF\t15873\ngtx770\t15874\n后继有人\t15875\n天气瓶\t15876\n华友钴业\t15877\n1觉\t15878\n新界泵业\t15879\n亿_\t15880\nzyw\t15881\n风风风\t15882\n黄刚\t15883\n2022年\t15884\n夹片\t15885\n检查员\t15886\n第11个\t15887\n750度\t15888\n常用量\t15889\n木浆\t15890\n波切利\t15891\ncircumstances\t15892\n军视\t15893\n25篇\t15894\n9针\t15895\n福州格致中学\t15896\niframe子\t15897\n次元街\t15898\n内错角\t15899\n盐酸多巴胺\t15900\n陈云峰\t15901\n70关\t15902\n精进电动\t15903\n善待你所在的单位\t15904\n中船防务\t15905\n彩片\t15906\n68757951\t15907\nforums\t15908\nOceanStor\t15909\n16.0.4\t15910\n奥特莱斯\t15911\n龙岩市\t15912\n贴库\t15913\n20140525\t15914\n权御天下\t15915\n壁橱\t15916\n美琪\t15917\n东方红广场\t15918\nfsk\t15919\nSort\t15920\n一毛钱\t15921\n河南科技\t15922\n疣\t15923\nPOR\t15924\ndeps\t15925\n等级测评师\t15926\n百乐笔\t15927\n妺妺\t15928\n终极套装\t15929\n网站关键词排名\t15930\n无远虑\t15931\n直线法\t15932\n哈密地区\t15933\n梨花谷\t15934\nUP主\t15935\n物流节\t15936\n拦水\t15937\n三百三百两百\t15938\n一个多月前\t15939\n第二十八集\t15940\n宠妻入骨\t15941\n振实\t15942\n江南岸\t15943\nserverguide\t15944\n里里\t15945\n层叠\t15946\n刘瑾\t15947\n粒子群优化算法\t15948\n出会\t15949\nsl410\t15950\n苹果8plus\t15951\n20瓦\t15952\n贵港政府网\t15953\n濠河神聊\t15954\n慧馨安\t15955\n键子\t15956\nyoox\t15957\n慢一点\t15958\n戴荃\t15959\ngopro4\t15960\n北京市规划委员会朝阳分局\t15961\n凯撒沙拉\t15962\n不要忘记\t15963\n雪纺\t15964\n液冷\t15965\n混元金斗\t15966\n百鬼夜行\t15967\n法制博览\t15968\n末日逆袭\t15969\n向阳路\t15970\n赵克\t15971\n曹仁超\t15972\nTheories\t15973\n榴莲味\t15974\n扣票\t15975\nsemanage\t
15976\n第127章\t15977\n一回路\t15978\n有始\t15979\nasc\t15980\n快吧我的世界盒子\t15981\n视联\t15982\n凤凰系统\t15983\n2018/4/2\t15984\n真心大冒险\t15985\n金鹰大厦\t15986\n非扫描版\t15987\n10节\t15988\nS2000\t15989\n蓝光盘\t15990\n博集\t15991\n周铭\t15992\n闲王\t15993\n编录\t15994\n欲奴第一季\t15995\n屠龙战记\t15996\n四强赛\t15997\n乡秀树\t15998\nKinabalu\t15999\n杜仲雄花\t16000\nJhuster\t16001\n天天天\t16002\n内罗毕\t16003\n绝恋\t16004\n花拳绣腿\t16005\n7\t16006\n张作霖\t16007\n欧债危机\t16008\n白蜡杆\t16009\n募资\t16010\n厦门市区\t16011\n评判\t16012\n世界卫生组织\t16013\nhorn\t16014\n神石\t16015\n挂号机\t16016\n灰蛊\t16017\n地中海俱乐部\t16018\n北京平\t16019\nguangdong\t16020\n49场\t16021\n侠侣\t16022\nSEAGATE\t16023\n中华人民共和国治安管理处罚法\t16024\n1069\t16025\n食屎\t16026\n乌布皇宫\t16027\n金华新闻网\t16028\n军绿\t16029\n武元衡\t16030\n_希财网\t16031\nlrm\t16032\n6700HQ\t16033\ndora\t16034\n构造函数\t16035\n帝国时代2征服者\t16036\n20171105\t16037\n直布罗陀海峡\t16038\n综治办\t16039\n长沙民政职业技术学院\t16040\n1872年\t16041\nphotoscan\t16042\n咬力\t16043\njstorm\t16044\n万万没想到\t16045\n材积\t16046\n木门锁\t16047\n中直股份\t16048\n排灌\t16049\n贝九\t16050\nhydrate\t16051\n宫内孕\t16052\n游坦之\t16053\n福利篇\t16054\n杏林子\t16055\n中国海诚工程科技股份有限公司\t16056\n品规\t16057\n国学智\t16058\n乐商店\t16059\nJPM\t16060\n大庆村\t16061\ndes算法\t16062\n89元\t16063\n云南省人力资源和社会保障厅\t16064\n羽毛球鞋\t16065\n头七\t16066\n中华人民共和国卫生部\t16067\n耍\t16068\nrebuild\t16069\n水晶泥\t16070\n美立方\t16071\n马应龙痔疮膏\t16072\n铭瑄科技\t16073\n迅雷仓\t16074\n列夫\t16075\n枸杞多糖\t16076\n信野\t16077\nJunction\t16078\nvue1\t16079\n希曼\t16080\n银桑\t16081\n恭城\t16082\n贝智康\t16083\nremarkable\t16084\n鸡西新闻网\t16085\n塑料壶\t16086\n是真的吗\t16087\npxcook\t16088\n名符其实\t16089\n新世界店\t16090\n镇党委\t16091\nsocks5代理服务器\t16092\n维度\t16093\n蟒蛇\t16094\nphpos\t16095\n2017年9月28日\t16096\n静坐\t16097\n小罗伯特\t16098\n水雾\t16099\n中共天津市委党校\t16100\n680k\t16101\n吸塑盒\t16102\n枯山水\t16103\nF调\t16104\n养生法\t16105\nDoughnut\t16106\n思想作风\t16107\nAMV\t16108\nmeinv\t16109\n来吧综合网\t16110\nBackbone\t16111\nsucha\t16112\n水天\t16113\n依山傍水\t16114\n慢性粒细胞白血病\t16115\n出梅\t16116\n西南联合产权交易所\t16117\n三瓣\t16118\n干度\t16119\nWireshark\t16120\n金山云\t16121\n新少林寺\t16122\n潮服\t16123\n凤凰自行车\t16124\n真衣\t16125\n无缝钢管\t16126
\n生态仪\t16127\n杨春霞\t16128\n中国体彩网\t16129\n气定神闲\t16130\n授意\t16131\n水榭花都\t16132\nvent\t16133\nmeshgrid\t16134\n2014财年\t16135\n玉帝\t16136\nmokee\t16137\n引客\t16138\ne14\t16139\n胶\t16140\n经典语录网\t16141\n3.49\t16142\n趸交\t16143\n本地通\t16144\n利兹\t16145\n苏州市水利局\t16146\niphone5s\t16147\n日杂\t16148\n逆天神妃\t16149\n春明\t16150\n陈彤云\t16151\nButerin\t16152\n终止妊娠\t16153\n崔万志\t16154\n火了火了火\t16155\n琉璃厂\t16156\n亚通\t16157\n可可味\t16158\n作用点\t16159\n技术系\t16160\n5.36\t16161\n黄瓜片\t16162\ncxcel\t16163\n恶搞\t16164\n中央一号文件\t16165\n一爽\t16166\n新浪科\t16167\n英雄们\t16168\n王叔\t16169\n徐云龙\t16170\n宣统三年\t16171\nhoaprox\t16172\n二十五个\t16173\n列车\t16174\n禽业\t16175\n1批\t16176\n功能性子宫出血\t16177\n数据结构\t16178\n第三针\t16179\n机牌\t16180\n电报群\t16181\nNGFF固态硬盘\t16182\n七百年\t16183\nQualifier\t16184\nsyso\t16185\n集集\t16186\n悲恸\t16187\ncheckbox\t16188\n宋丹\t16189\n神鬼寓言3\t16190\n児\t16191\n易途\t16192\n何挺\t16193\nsandy\t16194\n6182\t16195\n冯仑\t16196\n盐政\t16197\ndeeplearning4j\t16198\nA15\t16199\n雷莫\t16200\n经发局\t16201\n仲夏夜\t16202\n五年级数学\t16203\n太仓市政府\t16204\n百龄坛\t16205\n滛\t16206\nSLG\t16207\n忠旺断桥铝门窗\t16208\nZhiwei\t16209\n广州中原地产\t16210\nstorage/emulated/0\t16211\n红米酒\t16212\n炉石传说盒子_炉石传说卡组\t16213\n山东能源\t16214\n中塑\t16215\n淫羊藿苷\t16216\n嘉恒\t16217\np2psearcher\t16218\n股融易投资\t16219\n茂密\t16220\n华为云空间华为云空间\t16221\n这个世界\t16222\n天岛湖\t16223\n柳承敏\t16224\n52家\t16225\n厦门电信\t16226\n第2批\t16227\n系统篇\t16228\ni白金\t16229\n中微子\t16230\n吴子牛\t16231\n新刀\t16232\nliuconglin\t16233\n中国电机网\t16234\n轻钢龙骨\t16235\n康拓普\t16236\n8300h\t16237\n城北\t16238\n授業\t16239\ndotnet\t16240\n第二十五\t16241\n绿色化\t16242\n途经\t16243\n民主社会主义\t16244\n叶剑\t16245\n沈阳奥数网\t16246\n特化\t16247\n4059\t16248\n石军\t16249\n社会主\t16250\n理智\t16251\n米尔沃尔\t16252\n泡沫化\t16253\nstf\t16254\n1.37G\t16255\n奈子\t16256\nabcdefg\t16257\n猫鼠游戏\t16258\nBoyfriend\t16259\n天阵\t16260\n测集\t16261\n中国翻译\t16262\n三角镇\t16263\n便笺\t16264\n深圳装修公司\t16265\n活性炭包\t16266\n新视野英语教程\t16267\nbangumi\t16268\n王一博\t16269\nclassics\t16270\n光明楼\t16271\n47分\t16272\n软硬派\t16273\n代理词\t16274\n鲜茶\t16275\n天津东\t16276\n长藻\t16277\nPAE\t16278\n雄狮\t16279\nlgo\t16280\
n音域\t16281\n东西巷\t16282\nm0n0wall\t16283\n单科\t16284\n泖港镇\t16285\n断墨\t16286\n朗诗\t16287\n中非贸易研究中心\t16288\n几版\t16289\n刘旭东\t16290\ngtc\t16291\nmiley\t16292\n医务处\t16293\n朱重八\t16294\n深圳酒店\t16295\n第09章\t16296\n通威集团\t16297\n京雄城际\t16298\n医疗器\t16299\n龙卡\t16300\nbjt\t16301\n周公灵\t16302\n20150225\t16303\n病妻\t16304\n足趾\t16305\n土耳其航空\t16306\n秭归县人民政府\t16307\nv4.4\t16308\n宝玉\t16309\n疏而不漏\t16310\n3.7.2\t16311\n10T\t16312\n龙岩市财政局\t16313\n火红版\t16314\n画条\t16315\n要好\t16316\n气喘吁吁\t16317\n深深地\t16318\nQXDM\t16319\n围魏救\t16320\n伊思蜗牛\t16321\n西北师范大学\t16322\nHONEYWELL\t16323\n谢丽悦\t16324\n1万倍\t16325\nlobster\t16326\n股性\t16327\n九锡\t16328\n金翼\t16329\nWebGIS\t16330\n孙辉\t16331\n易大师\t16332\n1分钟后\t16333\n杨家成\t16334\n二〇一八年\t16335\n人民南路\t16336\nspringBoot\t16337\nbat批\t16338\n克己奉公\t16339\n磁盘缓存\t16340\n福安\t16341\n二十世纪\t16342\n北京松下\t16343\n四五个小时\t16344\nh文\t16345\n顶顶\t16346\n国家食品安全风险评估中心\t16347\n无向连通\t16348\n11栋\t16349\nholy\t16350\n马钫哲\t16351\nserver2008r2\t16352\n苦难辉煌\t16353\n国家天文台\t16354\n天天安卓\t16355\nblew\t16356\n三花\t16357\n比邻资讯网\t16358\n飞段\t16359\n欧美金融城\t16360\n玛莎百货\t16361\n下水文\t16362\n一大盒\t16363\nliber\t16364\nNMAP\t16365\n最热门\t16366\n刘永华\t16367\n棋盘\t16368\n通达信炒股\t16369\n夏七夕\t16370\ncorp\t16371\n天逸\t16372\n一苇以航\t16373\n友谊关\t16374\n水狼\t16375\n黑龙江省统计局\t16376\n玩鉴赏\t16377\n上海玉博生物科技有限公司\t16378\n大雁塔\t16379\na18\t16380\ntomato\t16381\n朴妮麦\t16382\n美洲猎弓\t16383\n鹤峰\t16384\n学历认证\t16385\n水华\t16386\n擒\t16387\n远心镜头\t16388\nSIGNAL\t16389\n蚊香社\t16390\n橙武\t16391\n圈机\t16392\n数限制\t16393\nGia\t16394\n引论\t16395\n罗茨流量计\t16396\nSelfie\t16397\n江南\t16398\nshee\t16399\n举起来\t16400\n探营\t16401\n吴钩\t16402\nrfc\t16403\n过去\t16404\ncocoajin\t16405\n罗燕\t16406\n西直门站\t16407\n休宁信息新闻网\t16408\n阿扎尔\t16409\nDOS批处理\t16410\n人机对话\t16411\n广深和谐号\t16412\n早孕反应\t16413\nRACE\t16414\n1.6.6\t16415\n上浆\t16416\n箱涵\t16417\n七项\t16418\n新闻史\t16419\n参数方程\t16420\nmac迅雷\t16421\nG104\t16422\n门票\t16423\nv-model\t16424\n入冬\t16425\n金江\t16426\n球笼式\t16427\nFlamenco\t16428\n着力\t16429\n广州南沙_南沙政府\t16430\n第29条\t16431\n麒麟山\t16432\nMTU\t16433\n如山倒\t16434\n参量\t16435\n形成分\t1
6436\n秋叶原之旅\t16437\n野狗\t16438\nprb\t16439\n甜城\t16440\n黄花梨\t16441\n诗卷\t16442\n行选\t16443\n木木文\t16444\n山艺\t16445\n一手车\t16446\n下策\t16447\n干租\t16448\n深港通\t16449\n牛排杯\t16450\n灵境胡同\t16451\npeare\t16452\nuker\t16453\n中国范儿\t16454\n爱优漫\t16455\n0816\t16456\n蹊径\t16457\nnong\t16458\n螯\t16459\n说说说\t16460\n新世界中国地产\t16461\nWine\t16462\n20160125\t16463\n向华吕布\t16464\n敏水\t16465\nv4.2.0.2\t16466\n四逆散\t16467\n单行\t16468\n文_\t16469\n九江日报数字报\t16470\n明河\t16471\n分理处\t16472\n借配\t16473\nMob\t16474\n伺服电机编码器\t16475\n总裁小说网\t16476\n钛精矿\t16477\n广州国际采购中心\t16478\n缓存优化\t16479\n卡赛\t16480\n城投集团\t16481\n市南\t16482\n极品飞车20\t16483\nF8\t16484\n擦地机器人\t16485\nharp\t16486\n益盟\t16487\n甲钴胺\t16488\n星角\t16489\n评出\t16490\n王正良\t16491\n大油\t16492\n腰力\t16493\nAnxiety\t16494\n崇文\t16495\n加气混凝土砌块\t16496\nbuss\t16497\n2018-01\t16498\n循环往复\t16499\n张扬\t16500\n各洲\t16501\n陈庄\t16502\n打中\t16503\n操作站\t16504\n彩虹岛物语\t16505\n6立方\t16506\n吏治\t16507\n金宸\t16508\n布鲁斯口琴\t16509\n企业谷\t16510\n如林\t16511\nvnx\t16512\n四席\t16513\n望见\t16514\n组件化\t16515\n带款\t16516\n郑秀妍\t16517\n期\t16518\n氦质谱检漏仪\t16519\n广州教育局\t16520\nsuper\t16521\n新村镇\t16522\n17元\t16523\n交委\t16524\n汽服\t16525\n我的野蛮王妃\t16526\n亿安科技\t16527\n阿石创\t16528\n平顶\t16529\n武魂2\t16530\n胜意\t16531\n技术展\t16532\n烟台市人民政府\t16533\n黑狼犬\t16534\n永兴街道\t16535\n骷髅兵\t16536\n心花路\t16537\n和田玉网\t16538\n大众捷达\t16539\n未来一周\t16540\nKey\t16541\nrowspan\t16542\n国家森林\t16543\n义眼\t16544\n从没\t16545\n高层建筑物\t16546\n罗技C920\t16547\n见与不见\t16548\n6501\t16549\n先锋影音av\t16550\n粘接剂\t16551\n杭海路\t16552\n慢点\t16553\n6.1\t16554\n学包\t16555\n李羿慧\t16556\n国际大都市\t16557\n街兔\t16558\n婴儿\t16559\n邱泽\t16560\n通货\t16561\n废气管\t16562\nsine\t16563\n┇\t16564\n报废率\t16565\n中柏\t16566\n18tv\t16567\n论谈\t16568\n90级\t16569\n二泉映月\t16570\nb48\t16571\n轮台县\t16572\nstolen\t16573\ncouncil\t16574\npdf417\t16575\n整函数\t16576\n西澳大利亚州\t16577\n道林纸\t16578\n157号\t16579\n旅行证\t16580\n侵占罪\t16581\nCApp\t16582\n康夫\t16583\nEgit\t16584\n微神\t16585\n银座购物广场\t16586\n注音输入法\t16587\n奖金\t16588\n胰头\t16589\n秦时明月之万里长城\t16590\nRanked\t16591\n1小时内\t16592\n鱼肉\t16593\n穷人区\t16594\nBlockchain\t16595\n
Tidy\t16596\n凹凸感\t16597\n三相五线\t16598\n99一年\t16599\n30余家\t16600\n长街宴\t16601\n惨不忍\t16602\n殖民者\t16603\n熹妃传华服\t16604\n天禧\t16605\n程漓\t16606\n7plus\t16607\n白霜\t16608\njuke\t16609\n课题组\t16610\n凹凸世界\t16611\n吡虫啉\t16612\n艳母\t16613\n3156\t16614\n2016-2030年\t16615\n加东\t16616\n后会无期\t16617\nyuv420\t16618\n天阴\t16619\n农服\t16620\n橙子皮\t16621\n红杏出墙记\t16622\n根治\t16623\n两周\t16624\n斯兰达人\t16625\n特攻\t16626\n过膝靴\t16627\n色沉\t16628\n河洛群侠传\t16629\n4399动画网\t16630\n600亩\t16631\n松辽\t16632\n流动率\t16633\n洛氏硬度计\t16634\n新北路\t16635\n哈国\t16636\n虹膜识别\t16637\nパパ\t16638\n9个月\t16639\n讴歌\t16640\n艺术设计学\t16641\n胬\t16642\n平川大辅\t16643\nSessionFactory\t16644\njdpaint\t16645\n孟加拉\t16646\n费心\t16647\n安溪\t16648\n明溪县\t16649\nBrazilian\t16650\nSystemUI\t16651\n猎刃\t16652\n立群\t16653\n肾癌\t16654\n广东省广告集团股份有限公司\t16655\ndecided\t16656\n车炉\t16657\n不慌不忙\t16658\n汉柏科技\t16659\nAKB48\t16660\npowercenter\t16661\n环太平洋:雷霆再起\t16662\n屈指可数\t16663\n图睿科技(深圳)有限公司\t16664\n人工髋关节\t16665\n十二大\t16666\n屠格涅夫\t16667\nBarely\t16668\nsi4463\t16669\nYPE\t16670\n直流屏\t16671\n石灰水\t16672\n评分者\t16673\n后者\t16674\n日本国家队\t16675\nrocks\t16676\n一丰\t16677\n汾湖经济开发区\t16678\n否认\t16679\n戚雅仙\t16680\nTL494\t16681\n荐文\t16682\n巴西市场\t16683\n公主级\t16684\nquantmod\t16685\n真空助力泵\t16686\n异度之刃2异刃\t16687\n杭州野生动物世界\t16688\n怀集\t16689\n乌菲齐美术馆\t16690\n波多野结衣\t16691\n羞落\t16692\n政审函\t16693\n亡命\t16694\n抹灰\t16695\n险滩\t16696\n格陵兰岛\t16697\ncython\t16698\nddos攻击\t16699\n钢铁侠1\t16700\n支配\t16701\n准时\t16702\n赋活\t16703\n樱桃谷\t16704\n10天后\t16705\n黑会\t16706\n高级职称\t16707\n林东\t16708\nshapely\t16709\n朱宸慧\t16710\n沃尔克\t16711\n不通\t16712\n检车\t16713\n马頔\t16714\n吴朝晖\t16715\nie内核\t16716\nLiveMusic\t16717\nmbd\t16718\n教师资格证考试网\t16719\n人管\t16720\n三岔河\t16721\n外存\t16722\nPoRO\t16723\n国际旅游岛\t16724\n凯登\t16725\n马明宇\t16726\n20161226\t16727\n说不出口\t16728\n盲视\t16729\n载入\t16730\nkilometers\t16731\n抚宁区\t16732\n通达\t16733\n审批\t16734\n单糖\t16735\n快餐厅\t16736\n排钻\t16737\n58分类网\t16738\nNULL值\t16739\n奥奇传说\t16740\n1000兆\t16741\n王天宇\t16742\n扩胸\t16743\n亿恩科技\t16744\n冯骥才\t16745\n成都建工集团\t16746\n乐文网\t16747\njeep指南者\t16748\n老大徒\t16749
\n白向群\t16750\n毫米汞柱\t16751\n月姬\t16752\n赣州市教育局\t16753\n阿曼苏尔\t16754\n普拉多论\t16755\n折痕\t16756\n北影\t16757\n第二人\t16758\n新学期\t16759\n渔场\t16760\n土气\t16761\nMyth\t16762\n纳氏囊肿\t16763\n9席\t16764\n鸡母\t16765\n监视器\t16766\n周欣\t16767\n环卫局\t16768\n上古之神\t16769\nlibapache2\t16770\n2dx\t16771\n3328\t16772\n965\t16773\n熬过\t16774\n认缴额\t16775\n大庙村\t16776\nCavalli\t16777\n保障房装修|一起网\t16778\n苷\t16779\n臀照\t16780\n电容量\t16781\n战旗直播_Live\t16782\n7.3%\t16783\nMusiland\t16784\n1N\t16785\n黄新德\t16786\n行乐\t16787\n泰华城\t16788\n马普尔\t16789\n单品\t16790\nZ17\t16791\n大众高尔夫6\t16792\n光标阅读机\t16793\n深圳坪山新区\t16794\nV1.7\t16795\nPyTorch\t16796\n蜂蜜酒\t16797\n奥西里斯神庙\t16798\n这麽\t16799\n浑江\t16800\n体名\t16801\n五里墩\t16802\ndnf2016\t16803\n中国安装信息网\t16804\n地牢\t16805\n星缘石\t16806\n针\t16807\n至臻\t16808\n大空\t16809\n贯入\t16810\n南淝河\t16811\n明年6月\t16812\n豹式坦克\t16813\nPOCO.CN\t16814\n苏慧\t16815\n利息计算器\t16816\ncells\t16817\n苍溪\t16818\nCosmos\t16819\n波尔卡\t16820\n统一战线\t16821\n昭平\t16822\nRDT\t16823\n|二\t16824\n实验版\t16825\nbluesky\t16826\nFilmes\t16827\n无可取代\t16828\n锻制\t16829\nCEMS\t16830\n封妖\t16831\n甲基丙烯酸缩水甘油酯\t16832\n不住在\t16833\n侨鑫集团有限公司\t16834\n融创中国控股有限公司\t16835\n小沐\t16836\n作品番\t16837\n中共一大\t16838\nPole\t16839\n河北省地理信息局\t16840\n血管紧张素转换酶\t16841\n两千亿\t16842\n第118期\t16843\n遮阳帘\t16844\n灵缇\t16845\n环丁砜\t16846\n美奈\t16847\n1.5匹\t16848\n99.8\t16849\n伊秀情感网\t16850\nAS3.0\t16851\n气帘\t16852\n雇员\t16853\n折行\t16854\n江西人民出版社\t16855\nSpelling\t16856\n181\t16857\n长生不老\t16858\n吉林市\t16859\n献映\t16860\nsnidel\t16861\n总体规划\t16862\n上海市医院\t16863\n耐高温\t16864\nSQLServer数据库\t16865\nnmd\t16866\nPOV\t16867\n声场\t16868\nCantonese\t16869\n独眼\t16870\n大庆职业学院\t16871\n孙皓\t16872\ngirlfriend\t16873\n前奏诗\t16874\n吉拉\t16875\n断头台\t16876\n详尽\t16877\n韩牛\t16878\n赢创\t16879\n西朗\t16880\n情缘\t16881\naniplex\t16882\nBOOT\t16883\n淫乱史\t16884\n永泰园\t16885\n0.9.0\t16886\n见人爱\t16887\n黄焖鸡\t16888\nsocketio\t16889\n11月15日\t16890\n顾卿言\t16891\n酒食\t16892\n易燃物\t16893\npeizhaochen\t16894\nOFweek3D打印网\t16895\ngl6\t16896\n洛氏\t16897\n2018Q1\t16898\n月坛街道\t16899\n达摩福娘\t16900\n廓尔喀\t16901\nsoliderworks\t16902\
n司徒\t16903\n三相四线\t16904\n新光集团\t16905\n军政府\t16906\n郎木寺\t16907\n戎马丹心-汉匈全面战争\t16908\n言\t16909\n100局\t16910\n金融广场\t16911\nDede\t16912\n南宁市公安局\t16913\nhasattr\t16914\n17号\t16915\n斯通\t16916\nsmg\t16917\nGUI\t16918\n‐\t16919\n同态\t16920\nNSR\t16921\n15千克\t16922\n采用\t16923\n小提琴谱\t16924\n鹰嘴\t16925\n燃燥\t16926\n钗\t16927\nthea\t16928\n朴新教育\t16929\n狠抓\t16930\n2cr13\t16931\nvj\t16932\nSpreadJS\t16933\n会计网\t16934\n超卡\t16935\nLexus\t16936\n王志成\t16937\n革故鼎新\t16938\n殷行路\t16939\n女角\t16940\n1.el6.x86\t16941\n86版\t16942\n李薇\t16943\n小票\t16944\nMTA\t16945\n纨绔子弟\t16946\n有模有样\t16947\n百分之二十五\t16948\n护师证\t16949\n雾森\t16950\n上海财经大学金融学院\t16951\n罗珊\t16952\n变频柜\t16953\n亡身\t16954\n朱荣斌\t16955\n帕西\t16956\n房地产贷款\t16957\n考辨\t16958\n中财讯\t16959\n超杀\t16960\n酥咔减脂饼干\t16961\n二十家\t16962\n南非共和国\t16963\n30公里\t16964\n千百撸2017\t16965\n海正\t16966\n京致\t16967\n展示型\t16968\n聊斋志异之孽欲狐仙\t16969\n夏热\t16970\n巨鲸\t16971\n无极灯\t16972\n枫蓝国际\t16973\n谱线\t16974\n玲珑女\t16975\n2克\t16976\n弥留之际\t16977\nElaine\t16978\n企税\t16979\n汉得信息\t16980\n太榆路\t16981\natx850\t16982\n一心一意\t16983\nCX-5\t16984\n两腮\t16985\n血亲\t16986\n环能科技\t16987\n能手\t16988\n怒怼小米黑鲨\t16989\n冥火\t16990\n木虱\t16991\n肛乳头瘤\t16992\n雷佳\t16993\nport-channel\t16994\n沙面\t16995\nJavaweb\t16996\n深海迷航吧\t16997\n122平米\t16998\n草拟\t16999\n宇西\t17000\n蛇队\t17001\nyihu\t17002\n劫数\t17003\n__一游网\t17004\n乔老爷\t17005\n杀机\t17006\n中国五冶\t17007\n双向认证\t17008\nStaffing\t17009\n江鲜\t17010\n帝王三国\t17011\n光伏发电站\t17012\n南开区政府\t17013\n中筋\t17014\n椭圆型\t17015\n花兰\t17016\nwww.qj023.com\t17017\n黑天鹅Coding\t17018\nsharpen\t17019\n20151118\t17020\n绞磨机\t17021\n放值\t17022\n现状图\t17023\n莉蒂\t17024\nAltered\t17025\n2.16.2\t17026\nSYNC3\t17027\n会计从业证\t17028\n執行\t17029\n侨福芳草地\t17030\n乐视pro\t17031\nmicrsoft\t17032\n势态\t17033\n装机大师\t17034\n腮红\t17035\ntaka\t17036\n神之战\t17037\n裂空座\t17038\n11月23日\t17039\n劳动奖\t17040\n瑧山府\t17041\n管理观察\t17042\n洛阳晚报\t17043\n东方明珠广播电视塔\t17044\n张金玲\t17045\n肖斯塔科维奇\t17046\nZebra\t17047\n镰仓市\t17048\nmodality\t17049\nlee0oo0\t17050\noxidase\t17051\nvmdk\t17052\nAdaptor\t17053\n凸轮分割器\t17054\n丑人\t17055\n摧花\t17056\n阿玛\
t17057\nVGA转换器\t17058\ncatz\t17059\nXeon\t17060\njlb\t17061\n討論區\t17062\n安溪一中\t17063\nclan\t17064\n薏仁\t17065\n西木野真姬\t17066\n成考\t17067\n雷波县\t17068\n一战\t17069\n好雨\t17070\n应当\t17071\n按下\t17072\n男人机\t17073\n胜诉\t17074\n‵\t17075\nevan\t17076\nBeautylish\t17077\n酷狗音乐播放器\t17078\n宇都宫\t17079\n四人帮\t17080\n佳合家美\t17081\n护臀\t17082\n腾讯代理\t17083\npudding\t17084\n白朗\t17085\n操作指南\t17086\n车用\t17087\n老寺\t17088\n满\t17089\n强化券\t17090\n365淘房\t17091\n李小文\t17092\n雷诺科雷傲\t17093\n鲁滨逊漂流记\t17094\n万航渡路\t17095\n阿克拉\t17096\n隐形防盗网\t17097\naspera\t17098\nKio\t17099\n小靈通號段\t17100\nX线片\t17101\ninput框\t17102\n林思意\t17103\n惹眼\t17104\n麒麟鞭\t17105\npcu\t17106\n众商\t17107\n109个\t17108\n高教处\t17109\n森川\t17110\n跪拜\t17111\n送到\t17112\n新西兰奥克兰大学\t17113\n碳罐电磁阀\t17114\n石门山\t17115\nDCIM\t17116\n反链\t17117\n刘辉\t17118\n称重机\t17119\n老班章\t17120\n非淋菌性尿道炎\t17121\nguize\t17122\n抑菌\t17123\nTAPE\t17124\n上药集团\t17125\n良苦\t17126\n原自\t17127\nWOMAN\t17128\n上装\t17129\n求交\t17130\n广州地铁7号线\t17131\n入列\t17132\n和谐路\t17133\n金考卷\t17134\n三更半夜\t17135\n持久化\t17136\nBroadband\t17137\n生态园区\t17138\n刘醒\t17139\n最省钱\t17140\nノ\t17141\n营养\t17142\nIbm\t17143\nconcentrate\t17144\n浴室\t17145\n露天煤业\t17146\n趣玩\t17147\ncf卡\t17148\n熊胆粉\t17149\n东方资产\t17150\n北京妇科医院\t17151\n光版\t17152\n全国家\t17153\n张雨生\t17154\n找房\t17155\nt440s\t17156\n通达oa\t17157\n长春市103中学\t17158\n中国梦想秀\t17159\n重庆行政学院\t17160\nOfficeScan\t17161\nTHERE\t17162\n东京迪士尼乐园\t17163\n566\t17164\n_煮酒论史\t17165\n轮廓\t17166\n1099\t17167\n小河镇\t17168\nHybrids\t17169\n魔域发布网\t17170\n太原市中级人民法院\t17171\n潜行者\t17172\n媒人\t17173\n90倍\t17174\n汤博乐\t17175\n顾惜\t17176\ncss代码\t17177\n配音版\t17178\n迁安市人民政府\t17179\n农保卡\t17180\n华龙一号\t17181\n永济市\t17182\n云南省财政厅\t17183\nTGOD\t17184\nBitesize\t17185\n胆碱\t17186\n一天一天\t17187\n九把\t17188\nPriorityQueue\t17189\n复刻表\t17190\n匈奴\t17191\n3Dmax\t17192\n撒椒\t17193\n胃痛\t17194\n四环路\t17195\n一九\t17196\n长沙磁浮\t17197\n吉兆\t17198\n十月一日\t17199\n环球卡\t17200\n逸名在线字典\t17201\n杨严\t17202\nNero\t17203\nMSG\t17204\n陕西人事考试网\t17205\n纤维布\t17206\n包袋\t17207\n打工仔\t17208\n协同\t17209\n报录\t17210\n元朗\t17211\n李玉洁\t17212\n拾伍\t17213\n角斗
士\t17214\n无人机遥控器\t17215\n1606\t17216\n八斗学院\t17217\n阜成路\t17218\n紫竹铃\t17219\n控制符\t17220\n东坡志林\t17221\n东花园\t17222\n享家\t17223\n铜锣湾时代广场\t17224\nPST\t17225\n暗含\t17226\n文职类\t17227\nTaiwan\t17228\ncaoliu1024_cl\t17229\nfoxy\t17230\n8plus\t17231\n_职\t17232\nOrganizations\t17233\n2018-01-02\t17234\n第四周\t17235\ncompetence\t17236\n唐苹\t17237\n秦长城\t17238\n虎彩\t17239\n大腹便便\t17240\n溶剂化\t17241\n链克\t17242\n创建型\t17243\nDecline\t17244\n细根\t17245\ndilidili\t17246\n双C\t17247\n【克苏鲁\t17248\n新海\t17249\n20160122\t17250\n60万公里\t17251\n大众房产网\t17252\n45ml\t17253\n地区号\t17254\n浴血\t17255\n车铃\t17256\n新园\t17257\n中国区\t17258\n无框眼镜\t17259\n查氏\t17260\n成蒲铁路\t17261\n千图网\t17262\n过半\t17263\n天杞园\t17264\n赛果\t17265\n五苓散\t17266\n按捺不住\t17267\ngdg\t17268\n0.97\t17269\nCOMEX黄金\t17270\n占有物\t17271\nstellaris\t17272\n大樟溪\t17273\n中国科学院北京基因组研究所\t17274\n暴乳\t17275\n双皮奶\t17276\n＄\t17277\n宝马3系论坛_汽车之家论坛\t17278\n乐字\t17279\n天文望远镜\t17280\nChemNet\t17281\npinpoint\t17282\n剿匪\t17283\n14季\t17284\n可溶性粉\t17285\n传统\t17286\n尼康\t17287\n善导\t17288\n海鸥\t17289\n第23关\t17290\n七讲\t17291\n北极星电力资料下载中心\t17292\n拉克丝\t17293\n磨剑\t17294\n尧存\t17295\n一百周年\t17296\nBlades\t17297\n秦山\t17298\n尘心\t17299\n应用研究\t17300\n0701\t17301\n福子\t17302\n和平东路\t17303\n留村\t17304\n十二道锋味\t17305\n劳务派遣经营许可证\t17306\nwdcdn\t17307\n万门\t17308\n百乐p500\t17309\n朱平\t17310\n四平百姓网\t17311\nlarger\t17312\n104家\t17313\n康美药业\t17314\n王学文\t17315\nusb无线网卡\t17316\n18902302605\t17317\n数字医疗网\t17318\n李东阳\t17319\n大连经济技术开发区\t17320\n珠江三角洲地区\t17321\n川贝炖雪梨\t17322\n桜田\t17323\n翁恺\t17324\nGarlic\t17325\n下拉栏\t17326\n世界环境日\t17327\nStyleMode中文网\t17328\n华为C8815\t17329\n官路弯弯\t17330\n烂熟\t17331\n裸连\t17332\n京津公路\t17333\n中山证券\t17334\nSWIN\t17335\n影雕\t17336\n金沙天街\t17337\n课标\t17338\n色人\t17339\n养虎为患\t17340\n神观\t17341\nUVa\t17342\n各类目\t17343\n彩云湖\t17344\n亿方\t17345\n猜解\t17346\n稳\t17347\n3第三章\t17348\n痛击\t17349\n难说\t17350\n苏州火车站\t17351\n24s\t17352\n单天\t17353\n大报告\t17354\n张庙街道\t17355\n烤酒\t17356\n蛋饺\t17357\nunknow\t17358\n张智霖\t17359\n探望权\t17360\n蒎烯\t17361\n显卡接口\t17362\n1304317\t17363\n直美\t17364\n张伟民\t17365\n杨克\t17366\n销售服\
t17367\nGPSUU\t17368\n流动\t17369\n招股说明书\t17370\n办刊\t17371\n干爸\t17372\n20160921\t17373\n18650\t17374\n华尔\t17375\n360手机助手\t17376\n茴笙\t17377\nyijing\t17378\n龙符\t17379\n陈小红\t17380\n灵岩\t17381\n2.2版\t17382\nTektronix\t17383\n还算\t17384\n砂糖橘\t17385\n佳能5D3\t17386\npai\t17387\n王鼎\t17388\n争议案\t17389\n_企\t17390\n甄志丙\t17391\n2^x\t17392\n小七孔\t17393\n节度使\t17394\nHuizhou\t17395\n撸狗\t17396\n胡毅\t17397\n升起\t17398\n唄\t17399\n连体裤\t17400\n说的话\t17401\n新静安区\t17402\n泉城\t17403\ncreature\t17404\ntaskkill\t17405\n服务页\t17406\n一河\t17407\nFolklore\t17408\nQuery\t17409\n8公分\t17410\n听觉型\t17411\n聚点\t17412\n华为Mate10\t17413\nSkool\t17414\n许都\t17415\nRedirecting\t17416\n方志敏\t17417\n西山小学\t17418\nbuttons\t17419\n寒士\t17420\n扳指\t17421\nMiami\t17422\n破衣\t17423\n魔术队\t17424\n博派\t17425\n铁建重工\t17426\n叶问1\t17427\n放归\t17428\n码库\t17429\n孕婴\t17430\n四大件\t17431\nkeypad\t17432\n这阵子\t17433\n加雷斯\t17434\n隐语\t17435\n满期\t17436\n潮州人才网\t17437\n专管员\t17438\n宝工商城\t17439\n大规模\t17440\n江西人事考试网\t17441\n重庆市\t17442\n替诺福韦\t17443\n爱普生R230\t17444\nCOOLPAD\t17445\n橘子洲\t17446\n109五\t17447\n众泰T500】众泰_众泰T500\t17448\n字印\t17449\n不动产登记网\t17450\n管型\t17451\n王者荣耀2018KPL\t17452\n速美超级家\t17453\n2十\t17454\n调皮\t17455\n程普\t17456\n王春林\t17457\n石洞\t17458\n名老\t17459\n实验室\t17460\n冰块\t17461\nю\t17462\n720p.BD高清\t17463\ncalendar\t17464\nyyyyMMdd\t17465\n潘天寿\t17466\n以色列理工学院\t17467\nHALEY168\t17468\n提笔\t17469\n油麦菜\t17470\n龙泉山\t17471\n二氧\t17472\n550分\t17473\ntomb\t17474\n如洗\t17475\n公園\t17476\n捷豹汽车\t17477\n控制价\t17478\n怀桂\t17479\nQGhappy\t17480\nReduce\t17481\n陈逸飞\t17482\n3edu\t17483\n撩乱\t17484\n壹佰\t17485\nThen\t17486\n灰岩\t17487\n迁移证\t17488\n飞鹤奶粉\t17489\n麟越\t17490\n古村\t17491\n正实\t17492\n国科\t17493\n香辣鸡翅\t17494\n第二场\t17495\n中唐\t17496\n坚瑞沃能\t17497\n雷电模拟器\t17498\n贵阳幼儿师范高等专科学校\t17499\n上海地铁5号线\t17500\nP1505\t17501\n拔子\t17502\n春城晚报\t17503\n金锁记\t17504\n谍\t17505\nroam\t17506\n天琥教育\t17507\n散文\t17508\n康哥\t17509\n卡兹格罗斯\t17510\n稳压泵\t17511\n摆线\t17512\nFRM考试\t17513\npscc2015\t17514\n凤弈\t17515\n一天之后\t17516\nm7615dna\t17517\n画幅\t17518\nsubnet\t17519\nTabBarItem\t17520\n绝地求生卡顿\t1752
1\n总\t17522\n安钧璨\t17523\n水巷\t17524\nopec\t17525\n子贡\t17526\n2012-2030年\t17527\n加酶洗衣粉\t17528\nhavea\t17529\n标新\t17530\n影带\t17531\nFOX\t17532\nonmouseover\t17533\n众神之战\t17534\n湖南省住房和城乡建设厅\t17535\nPrototyping\t17536\n美语\t17537\n下去\t17538\n虾条\t17539\ncopic\t17540\n沈海寅\t17541\nDogs\t17542\n33068\t17543\n人民论坛网\t17544\n林朝阳\t17545\n木门窗\t17546\nccfl\t17547\n对公账户网银\t17548\n赏金赛\t17549\n中共十九大\t17550\n6.1.0\t17551\n文颂娴\t17552\n光斑\t17553\nH股\t17554\n阿拉木图\t17555\n猜歌达人\t17556\nW1\t17557\n布龙路\t17558\ntsc\t17559\n百年灵\t17560\n本田\t17561\n世歌\t17562\n124路\t17563\n调经促孕丸\t17564\n唯独\t17565\nIP\t17566\n泡面\t17567\nBRAVO\t17568\n梅山\t17569\n乐悠\t17570\nshowdoc\t17571\n胡雪峰\t17572\n第四层\t17573\n规范\t17574\n华中科技大学学报\t17575\n搜狗游戏中心\t17576\n财\t17577\n马布里\t17578\n南京长途汽车东站\t17579\nSpit\t17580\n拉格朗日方程\t17581\n籍\t17582\n欲火\t17583\n欺瞒\t17584\n贞观之治\t17585\n黑胶唱片\t17586\n王国生\t17587\nhowever\t17588\n时尚周\t17589\n160厘米\t17590\nChalet\t17591\n百分之九十九\t17592\nNeusoft\t17593\n特雷西·麦克格雷迪\t17594\n杜家豪\t17595\n3.1级\t17596\n塑包\t17597\n解放北路\t17598\n云物\t17599\n滨特尔\t17600\nobjective-c\t17601\n泰瑞利亚\t17602\n8120\t17603\n猴岛论坛\t17604\n围档\t17605\n第三段\t17606\n2018-02-09\t17607\n王小鲁\t17608\n1行\t17609\nArranger\t17610\n印刻\t17611\ncad2009\t17612\nrelief\t17613\n速珂\t17614\n西凤\t17615\n人造石墨\t17616\n新大纲\t17617\nTY\t17618\n火影忍者online\t17619\nREGISTER\t17620\nmemery\t17621\n有机电\t17622\n中平\t17623\n大花蕙兰\t17624\nG65\t17625\nWePhone\t17626\n潘杰\t17627\n剁椒\t17628\n中国铝业公司\t17629\n半年\t17630\n20160103\t17631\n三水人才网\t17632\n择校费\t17633\n橙天嘉禾\t17634\njinshan\t17635\n鞍山道\t17636\nS8\t17637\n17日\t17638\n多久\t17639\n择日\t17640\n其境\t17641\n纳鲁\t17642\n菲凡\t17643\n奢靡\t17644\n伯恩斯\t17645\n俯身\t17646\n黄沙大道\t17647\n小绿\t17648\n兰迪·奥顿\t17649\n清洗店\t17650\n31秒\t17651\n练兵\t17652\n苏明\t17653\nvi模板\t17654\n立碑\t17655\n中国互联网新闻中心\t17656\n涵盖\t17657\n莫华伦\t17658\n二手车贷款\t17659\n八方资源网\t17660\n沙湾公园\t17661\nraid5\t17662\n合肥168中学\t17663\nJerusalem\t17664\n聂永真\t17665\nSNKRS\t17666\n国盛\t17667\n颉\t17668\n企业管理有限公司\t17669\n内涵\t17670\n染色体核型分析\t17671\n奥迪a6\t17672\n闵京勋\t17673\n埸\t17674\n小径湾\t17
675\n黄石公园\t17676\n黑白卷\t17677\n七个字\t17678\n转向灯\t17679\n组织类\t17680\nCES展\t17681\n汪静\t17682\nPVP\t17683\n私募基金管理公司\t17684\n刷机软件\t17685\n泰冯路\t17686\n3年内\t17687\n巧巧简笔画\t17688\n撕脱性骨折\t17689\n国士无双\t17690\n秦淮\t17691\n滚滚红尘\t17692\n第79集\t17693\n陶俑\t17694\nUltraBOOST\t17695\n三星m2071\t17696\n江泰\t17697\nOrlando\t17698\n博泰\t17699\n第134号\t17700\npluto\t17701\n师范学院\t17702\n吐出\t17703\n花画\t17704\n反函数\t17705\n渣男们\t17706\n胫骨骨折\t17707\n309号\t17708\n髓母细胞瘤\t17709\n研析\t17710\nmust\t17711\nlibssl\t17712\n站街女\t17713\n著作\t17714\n屏页\t17715\n采购单\t17716\n几瓦\t17717\nllt\t17718\n拉稀\t17719\n联泰\t17720\n针尖对麦芒\t17721\n2016-2021年\t17722\n广西交通投资集团有限公司\t17723\n妆炫\t17724\n裏路\t17725\n启动页\t17726\n麦乐鸡\t17727\n吴淞口国际邮轮港\t17728\nadt\t17729\n研制\t17730\nReverb\t17731\n八记\t17732\n十三卷\t17733\n中型车\t17734\nWesley\t17735\nfundamental\t17736\nSeafood\t17737\n民主革命\t17738\nDPM\t17739\n识途\t17740\n二用\t17741\nraw格式\t17742\n三节\t17743\nMacX\t17744\n对数函数\t17745\n串休\t17746\n潘海\t17747\n东三环\t17748\n非常星\t17749\n琅东\t17750\n单音\t17751\nlunamini2\t17752\nc君\t17753\n奇淫\t17754\n海心沙\t17755\nOV\t17756\n热水镇\t17757\n安徽文明网\t17758\n辣椒\t17759\n财神婆算命网\t17760\nv2018\t17761\nrage\t17762\n松鼠大战\t17763\n氨糖软骨素钙片\t17764\n高考学\t17765\nvrep\t17766\n克莱德\t17767\n阴谋家\t17768\nAC5300\t17769\n小淘\t17770\n高利\t17771\n螺纹胶\t17772\n不羡\t17773\nRenaissance\t17774\n林宏\t17775\nQwt\t17776\n七八天\t17777\n纳斯\t17778\n道县政府\t17779\n史蒂文·西格尔\t17780\n人心向背\t17781\n虹夕诺雅\t17782\n6542\t17783\n酸钠\t17784\n凶\t17785\n南京理工大学泰州科技学院\t17786\n实例\t17787\n常回家看看\t17788\nhp1022\t17789\n耶利哥\t17790\n三p\t17791\n十二金鸭\t17792\n代购们\t17793\n紧定\t17794\nnano7\t17795\nSIG\t17796\nmultiarray\t17797\n清虚\t17798\nhunks\t17799\n生成式\t17800\nEBD\t17801\n百度广告联盟\t17802\n广汽传祺gs8\t17803\n拳术\t17804\n东丰县\t17805\n花海公园\t17806\n繁丧\t17807\nlcl\t17808\n财务管理系统\t17809\n金融负债\t17810\nwineskin\t17811\n大陆法系\t17812\n小江\t17813\n300K\t17814\n石牌桥\t17815\n公交车上\t17816\n浙江广电集团\t17817\n1788\t17818\n乌头鱼\t17819\n画山\t17820\n华西都市报\t17821\n明晚\t17822\n丁肇\t17823\n四季兰\t17824\n航天长峰\t17825\n全系\t17826\n61位\t17827\nS120\t17828\nshibai\t17829\n刑讯室\t1783
0\n韩公主\t17831\n二元相图\t17832\n中山地铁\t17833\n9}\t17834\n真实赛车3\t17835\n平衡型\t17836\n概\t17837\nGUESS\t17838\nItem\t17839\nangularjs4\t17840\n国宅\t17841\n艳艳\t17842\nios10\t17843\n3130\t17844\n你不来我不老\t17845\n河南省洛阳市\t17846\n荣耀6A\t17847\n苕溪\t17848\n海南中学\t17849\n杜磊\t17850\n闪闪亮\t17851\n割手\t17852\nKuma\t17853\n李曙光\t17854\n11月24日\t17855\n391.01\t17856\n商丘师范学院\t17857\n鸳鸯湖\t17858\n红毒\t17859\n地台\t17860\n浑沌\t17861\n舌骨\t17862\n荣兴\t17863\n福建电大\t17864\n富临运业\t17865\n等号\t17866\nus福利网\t17867\n吉林电子信息职业技术学院\t17868\n苹果店\t17869\nnem\t17870\n七原罪\t17871\n洪山光谷\t17872\n铁盾\t17873\n古镇\t17874\n两支\t17875\nantonio\t17876\n四川中医药高等专科学校\t17877\nTotoiseSVN\t17878\n吴振华\t17879\n山田君\t17880\n斑节虾\t17881\n电键\t17882\nh323\t17883\n丽水市区\t17884\nsisi\t17885\nGMAT考试\t17886\n二十八个\t17887\nWatt\t17888\niphone模拟器\t17889\n中银国际\t17890\n豪血寺\t17891\n茧房\t17892\n以上\t17893\n代餐粉\t17894\n自力\t17895\n飞客茶馆旅行网\t17896\n空压机\t17897\n一二部\t17898\n新南\t17899\n股数\t17900\n兽娘\t17901\nChallenger\t17902\n安徽省\t17903\n曼秀雷敦\t17904\n脚心\t17905\n赌盘\t17906\nhaan\t17907\nTiac\t17908\nDiggerPlus\t17909\n防锈油\t17910\n工艺流\t17911\n2017财年\t17912\n椰子糖\t17913\nbaoding\t17914\n风起苍岚\t17915\n等等等等等\t17916\n三菱plc\t17917\n泰达国际心血管病医院\t17918\n精挑\t17919\n挥笔\t17920\n吃软饭\t17921\n靳双权\t17922\n赵星\t17923\n毕畅\t17924\n储蓄型\t17925\nSyn\t17926\n2011.10\t17927\n美说网\t17928\nhave\t17929\n乌拉那拉\t17930\n濯清涟而不妖\t17931\n共同贷款人\t17932\n渝建\t17933\nshogun\t17934\n2017.01\t17935\n上海市第二中学\t17936\ncooks\t17937\n标定板\t17938\npsk\t17939\n不情\t17940\n绣鞋\t17941\n台北市\t17942\nDaum\t17943\n脊髓\t17944\n侯德健\t17945\n芝麻街英语\t17946\ntimmy\t17947\nTransact\t17948\n陈彩霞\t17949\n大众朗逸\t17950\n未来馆\t17951\n误导性\t17952\n布林\t17953\n_K/3\t17954\nmib\t17955\n捷速\t17956\nhcm\t17957\n丁尼生\t17958\n盖茨\t17959\n淫妖虫\t17960\n阿德拉\t17961\n电务段\t17962\nalloy\t17963\n聚合网\t17964\nLQ-635K\t17965\n花坛\t17966\n小时代4\t17967\n烟海\t17968\n压电式传感器\t17969\n丹佛斯\t17970\n24小时以后\t17971\n紫红\t17972\n58学习网\t17973\nBattery\t17974\n局常委\t17975\nwin10绝地求生\t17976\n91p\t17977\n填用\t17978\nWatch-ZOL\t17979\n6英尺\t17980\n代标\t17981\n惠普公司\t17982\n跌落式\t17983\n推生\t17984\n鸡
病专业网\t17985\nTubeon\t17986\n新疆克拉玛依市\t17987\n卷皮\t17988\n区委统战部\t17989\n森淼\t17990\n穿线\t17991\nembrace\t17992\n金刺猬戏剧网\t17993\n主人们\t17994\nxt2\t17995\n商住公寓\t17996\n大手笔\t17997\n混沌剑神\t17998\n36o\t17999\n单告\t18000\n交通运输部海事局\t18001\nCOVER\t18002\n|2017\t18003\n7架\t18004\n兄弟级\t18005\n湖北省信访局\t18006\n哈军工\t18007\n南京国民政府\t18008\n1.9.10\t18009\n创业学\t18010\n建业地产\t18011\n仨月\t18012\nPConl\t18013\npng8\t18014\n红波\t18015\nWORD2007\t18016\n顶风\t18017\n尖刀\t18018\n电子工业出版社\t18019\n时趣\t18020\nAER\t18021\n资信评估\t18022\nフィギュア\t18023\n陈小虎\t18024\nMAPGIS\t18025\n优格集成灶\t18026\n生石灰\t18027\n熊证\t18028\n中国资产评估协会\t18029\n薪人\t18030\n金湖新闻网\t18031\n爱数\t18032\n仙客来\t18033\n晓庄\t18034\n21CN手机网\t18035\n黄龙300\t18036\n多枚\t18037\n咬花\t18038\n毛葱\t18039\n书旗吧小说网\t18040\n老处女\t18041\n恶狠狠\t18042\nAutumn\t18043\n中国国有银行\t18044\n杨程\t18045\n乳腺小叶增生\t18046\nvideoplayer\t18047\n该值\t18048\nMeaning\t18049\n昌\t18050\nMALDI\t18051\n祸水\t18052\n闹肚子\t18053\n法性\t18054\n万客隆\t18055\n博雅象棋\t18056\n百许\t18057\n陈浩\t18058\n04号\t18059\n6DVD\t18060\n锦城学院\t18061\n团贷网\t18062\nReligion\t18063\n警察局\t18064\n内邱\t18065\n穆加贝\t18066\nx9splus\t18067\n俩台\t18068\n电子变压器\t18069\n3.1亿\t18070\n迟思堂工作室\t18071\n死间\t18072\n丑女\t18073\nRimWorld\t18074\n蓝山县人民政府\t18075\n钟摆\t18076\n中国一重\t18077\n偶像派\t18078\n舒尔耳机\t18079\nShard\t18080\n惩罚性\t18081\n诈降\t18082\n中国承德_承德政府\t18083\n匠艺\t18084\n香蕉牛奶\t18085\n乔榛\t18086\n阴三儿\t18087\n黑摩\t18088\norigin9.1\t18089\nTripollar\t18090\n巫哲\t18091\n朱昌耀\t18092\n疫苗\t18093\n佛像\t18094\n宁波市人民政府\t18095\n宜兴市\t18096\n新一站保险网\t18097\n发邮\t18098\n天豹\t18099\n诚通集团\t18100\n433M\t18101\n北京出入境边防检查总站\t18102\n海航科技集团\t18103\n宁波教科网\t18104\n尖角煞\t18105\n尼勒克县\t18106\n新港\t18107\nmicrochip\t18108\n漓江景区\t18109\n须知道\t18110\n20150912\t18111\n征募\t18112\n摇滚乐\t18113\n练达\t18114\n大房\t18115\n68万\t18116\n善良的姑娘\t18117\n铰链\t18118\nMassive\t18119\nRides\t18120\n很气愤\t18121\n饱含\t18122\n娇养\t18123\n美食界\t18124\nhydra\t18125\n贝克汉堡\t18126\nghd\t18127\nGroupBy\t18128\nInsta360\t18129\n鬼棺\t18130\ncomprise\t18131\n灰身\t18132\n净额法\t18133\n再生棉\t18134\n鬼影\t18135\n李俊明\t18136\n有氧运动\t18137\n5骶\t1
8138\nSeat\t18139\n不碎\t18140\n默沙东\t18141\n镇静\t18142\n目录册\t18143\n税收政策\t18144\n鄂东南\t18145\n出烟\t18146\n捆绑式\t18147\n丽春院\t18148\n华东理工大学出版社\t18149\n剪叉式升降机\t18150\n专送\t18151\n冷媒\t18152\neMTC\t18153\n扩展券\t18154\n湖心岛\t18155\n雨燕\t18156\n40立方\t18157\n墨西哥人\t18158\n远见\t18159\n弘文\t18160\n包衣机\t18161\n3.3V\t18162\n尽收眼底\t18163\nhae\t18164\n网柜\t18165\n降灵\t18166\n染整\t18167\n0.67\t18168\n马驰\t18169\n中小学德育工作指南\t18170\n徐虎\t18171\n变革\t18172\n安装包\t18173\n英雄史\t18174\n治本\t18175\n作弄\t18176\n我人\t18177\n不含\t18178\n头结点\t18179\n金率\t18180\n供选\t18181\nvaule\t18182\n国际旅行卫生保健中心\t18183\nheap\t18184\npmml\t18185\nsaba\t18186\nA06\t18187\n填单\t18188\n传记\t18189\n交通流\t18190\nGov\t18191\n淇滨\t18192\n柏格森\t18193\ndnfpk吧\t18194\n晏子\t18195\n张荣华\t18196\n宇野昌\t18197\n老酒\t18198\nII卷\t18199\n上汽大通G10\t18200\ndok\t18201\n两朝\t18202\n辛基酚\t18203\njjw\t18204\npipenv\t18205\n美谷朱里\t18206\n螺杆式冷水机组\t18207\norg域名\t18208\n薏\t18209\n鱼草\t18210\nopenURL\t18211\nzzsi\t18212\nOmnigraffle\t18213\n注射模\t18214\n人民币卡\t18215\nIOU\t18216\n李有才\t18217\n科技视界\t18218\n宝安中医院\t18219\n六里桥长途汽车站\t18220\n傅仪\t18221\nce认证\t18222\n硫酸奎宁\t18223\n复方福尔可定口服溶液\t18224\nPOCO\t18225\n广而告之\t18226\nyouth\t18227\n扶他林\t18228\nmalware\t18229\nWWII\t18230\n歡\t18231\n矗\t18232\n2018-04-03\t18233\nfut\t18234\n肌红蛋白\t18235\n胜利东路\t18236\n三七市镇\t18237\n大出血\t18238\n千影\t18239\n本县\t18240\n川贝枇杷膏\t18241\n旧日\t18242\n80台\t18243\n安昌\t18244\n宋小川\t18245\n一簇\t18246\nHair\t18247\n剑魂\t18248\n四角亭\t18249\n我的我\t18250\n彭博\t18251\n舒压\t18252\n期末考试\t18253\n零基\t18254\nsatis\t18255\n资管产品\t18256\nled数码管\t18257\n铸人\t18258\nナイト\t18259\n俊朗\t18260\n如履薄冰\t18261\n月亮花\t18262\n套牢盘\t18263\n宋式\t18264\n承受力\t18265\n跳情\t18266\n冰箱\t18267\n早稻田大学\t18268\n副手\t18269\n双胞胎集团\t18270\nbutong\t18271\n油机\t18272\n吃法\t18273\n浙人社\t18274\n高级中学\t18275\n白话文\t18276\nslash\t18277\n有钱了\t18278\n一仆二主\t18279\n羊肠线\t18280\n团体票\t18281\nReverse\t18282\n克明面业\t18283\nHobbit\t18284\n王夫之\t18285\n免漆门\t18286\n摩托威\t18287\n日南\t18288\n优胜美地国家公园\t18289\ncsj\t18290\n二手手机转让\t18291\nXXXXXXXX\t18292\n务工\t18293\n超分\t18294\n青娱乐吧\t18295\n衣食父母\t18296\n同程\t1
8297\n云南电力\t18298\n泰龙\t18299\n函数篇\t18300\nSTM8\t18301\n180粒\t18302\n哲蚌寺\t18303\n义乌市中心医院\t18304\n群雄\t18305\n七里庄\t18306\nAliyun\t18307\n新视线\t18308\n超级星光大道\t18309\n尼采\t18310\n1.98\t18311\n126334\t18312\nvsCode\t18313\n49分\t18314\n冰糖峪\t18315\n国家能源投资集团有限责任公司\t18316\n移民者\t18317\nHutool\t18318\nMAISON\t18319\n33秒\t18320\n20150424\t18321\npulp\t18322\n太阳\t18323\n船头\t18324\n内屏\t18325\nschneider\t18326\n601155\t18327\nweb编辑器\t18328\n内酯豆腐\t18329\nelemental\t18330\n黄大仙庙\t18331\n宣传单\t18332\n洒满\t18333\n群创光电\t18334\nquot\t18335\n日新月异\t18336\n人脸考勤机\t18337\n2002G\t18338\nZhang\t18339\n裁决吧\t18340\n10升\t18341\n9091\t18342\n贺子玲\t18343\n上清所\t18344\n每隔一分钟\t18345\n40000公里\t18346\n蒂莫西\t18347\n八宝\t18348\n剧种\t18349\n代考\t18350\n试用期满\t18351\n52av\t18352\n江东英豪传\t18353\n源发行版\t18354\n碎叶城\t18355\n体罚\t18356\n乔家大院\t18357\n烟台蓬莱中通快递\t18358\n诺维\t18359\n翠屏湖\t18360\n师葭希\t18361\n事会\t18362\nc0000142\t18363\n仁里\t18364\nWRT\t18365\nopenmeetings\t18366\n地牛\t18367\nsslocal\t18368\n雪国\t18369\n大报恩寺\t18370\n5820k\t18371\n悉达多\t18372\n薛之\t18373\n双人\t18374\n5.49\t18375\n城市管理行政执法局\t18376\ngerman\t18377\n朱雷\t18378\n广州注册会计师协会\t18379\n多彩贵州\t18380\n保险丝盒\t18381\n萬\t18382\n余姚中学\t18383\n文涛\t18384\n子夏\t18385\n沿岸\t18386\n大武夷新闻网\t18387\n长久\t18388\nSteep\t18389\n新闻业\t18390\n腾讯文学\t18391\n隔世\t18392\n三色坊\t18393\n2728\t18394\n被枪决\t18395\n三角洲\t18396\n大宁地区\t18397\n寒武纪科技\t18398\nCmsEasy\t18399\n搭术\t18400\n酷音\t18401\nuc55net\t18402\n征兆\t18403\n日照大学城\t18404\n三岛奈津子\t18405\n自动档汽车\t18406\n捝\t18407\n王培\t18408\n霍布斯鲍姆\t18409\n冲压工\t18410\n网络工\t18411\n想吃掉\t18412\n袍哥\t18413\n光控\t18414\n静秋\t18415\nIMDb\t18416\n帕丽斯·希尔顿\t18417\nTKT\t18418\ntoncat\t18419\n荔枝fm\t18420\n痴\t18421\n批发店\t18422\n静态路由\t18423\n入盆\t18424\n天涯八卦\t18425\n艺术设计专业\t18426\nNixon\t18427\n云淡风轻\t18428\n中坦\t18429\n陆总\t18430\n徐薇\t18431\n化油器\t18432\n木南日菜\t18433\n雨虹\t18434\n燕岭\t18435\n火影忍者Online\t18436\n扎基寺\t18437\n雅洁\t18438\nFF15\t18439\n依从性\t18440\n电层\t18441\n政治性\t18442\n水屋\t18443\n晒太阳\t18444\nmq4\t18445\n玥玥\t18446\n静压\t18447\n刘国钧\t18448\nktx\t18449\nKnow\t18450\n土巴兔新闻中心\t18451\n建队\t184
52\nRE\t18453\n金鱼藻\t18454\n马钉\t18455\n双王\t18456\n九州\t18457\nenpass\t18458\nadele\t18459\n灌篮\t18460\n无奈\t18461\n阴文\t18462\n天正cad\t18463\n4112\t18464\n70KG\t18465\n瑞吉欧\t18466\n黑冰\t18467\n延安大学附属医院\t18468\n毒箭\t18469\n桃花\t18470\n陈雯\t18471\n鸭血粉丝汤\t18472\nGFS\t18473\n光电转换器\t18474\n黄角兰\t18475\n公共事务学院\t18476\nJTree\t18477\n铃声\t18478\n初评\t18479\n冰激凌粉\t18480\nEBP\t18481\n到时候\t18482\nvatti\t18483\n4820\t18484\n醉意\t18485\n海天味业\t18486\n联想小新\t18487\n撞邪\t18488\n十六条\t18489\n米脂\t18490\n19万\t18491\neminem\t18492\n名药\t18493\n智云云鹤2\t18494\n沸腾炉\t18495\n天了噜\t18496\n会址\t18497\n永仁县\t18498\n广州总领事馆\t18499\nconsuming\t18500\noes\t18501\n写真书\t18502\n金汉斯\t18503\nws15\t18504\nBACnet\t18505\n食\t18506\n该学\t18507\nSKE48\t18508\n7310\t18509\n古义\t18510\n泵站\t18511\n红鱼\t18512\n总控\t18513\n断流\t18514\n穿\t18515\n罗技g900\t18516\n官妓\t18517\nPearl\t18518\ncamille\t18519\n红翼\t18520\nnsobject\t18521\n阴儿房\t18522\n国际空港信息网\t18523\n深圳市华星光电技术有限公司\t18524\n怨女\t18525\n康熙来了\t18526\n面向对象程序设计\t18527\nTexStudio\t18528\n浙江毅腾\t18529\n相互关系\t18530\n具象化\t18531\n金恪集团\t18532\n冶炼\t18533\n缺货\t18534\n四边面\t18535\n阿瞳\t18536\nuntil\t18537\n影像科\t18538\n花蛇\t18539\n金边玫瑰\t18540\n台儿庄大战纪念馆\t18541\n牙膏\t18542\n奥地\t18543\n时间码\t18544\n光谷天地\t18545\n过塑机\t18546\nverse\t18547\n互传\t18548\n江夏镇\t18549\nzhuanji\t18550\n相馆\t18551\nrestclient\t18552\nmlib\t18553\n中国茶\t18554\n织梦网\t18555\n浪琴湾\t18556\n哈尔滨市卫生和计划生育委员会\t18557\nBOOL\t18558\njbl\t18559\n沉池\t18560\n黄丝\t18561\n玉莲\t18562\n埃里森\t18563\n电业局\t18564\n四德\t18565\n一成不变\t18566\n甘肃省科技厅\t18567\n西安市未央区\t18568\n心脏骤停\t18569\n义胆\t18570\n江泉实业\t18571\n工商管理硕士\t18572\n变蛋\t18573\n固\t18574\n侬晓得\t18575\n前进档\t18576\n陈家港\t18577\n连接方\t18578\n实施办法\t18579\n金属条\t18580\n软RAID\t18581\n西山岛\t18582\n我的家乡\t18583\nEazy\t18584\n山东法制报法院\t18585\n广东三九脑科医院\t18586\n反串\t18587\niRobot\t18588\n世毕盟\t18589\n常见\t18590\nfollower\t18591\n选机\t18592\n黄翡\t18593\n富阎\t18594\n毛坯\t18595\n23P\t18596\n大同村\t18597\n北京对外经济贸易大学\t18598\nPaperpass\t18599\n史书\t18600\n发现神行\t18601\n龙华寺\t18602\n120平米\t18603\n深圳大金\t18604\n7万亿\t18605\n深圳国税电子税务局\t18606\n白葡萄\t18607\n山东大学公
共卫生学院\t18608\n四创\t18609\n胡编乱造\t18610\n玩主论坛_无忌\t18611\nlatency\t18612\n伪造证据\t18613\n象山森林公园\t18614\nSurfaceFlinger\t18615\n九霄环佩\t18616\nbrowsers\t18617\n瓢\t18618\n陶喆\t18619\n银盆岭\t18620\n█\t18621\n子表\t18622\n第四十一章\t18623\n派克\t18624\n契数\t18625\n武陵\t18626\n花花草草\t18627\ntof\t18628\n何超欣\t18629\n锐丰\t18630\n自喷漆\t18631\n开颜\t18632\n兵工厂\t18633\n腾讯电脑管家论坛\t18634\n克莱门特\t18635\npcr仪\t18636\n蛋壳\t18637\n魅族pro7\t18638\n传道\t18639\n天津外国语大学附属外国语学校\t18640\n山东省\t18641\n百度口碑\t18642\n中国移动通信集团北京有限公司\t18643\nIgG\t18644\nLinux\t18645\n永久自行车\t18646\n换爱\t18647\n道恩·强森\t18648\n宠上\t18649\n沈阳网\t18650\n中国农业科学院\t18651\n龙生\t18652\n佳文\t18653\n裸心谷\t18654\n人间蒸发\t18655\nSK\t18656\n100万台\t18657\nRecipes\t18658\n婚然天成\t18659\n国仁\t18660\n征仪路\t18661\n十围\t18662\n小姨妈\t18663\n写轮\t18664\n台达电子\t18665\n冷害\t18666\nactual\t18667\n宝月\t18668\n驱动力\t18669\n暂别\t18670\n神算子\t18671\n曲水\t18672\n东方国际集团\t18673\n乔伊\t18674\n耳罩式\t18675\n齐创\t18676\n深漂\t18677\n学团\t18678\n嘎嘣脆\t18679\n10次方\t18680\n尾货\t18681\n五色石\t18682\n大锯\t18683\nHTML文本框\t18684\n欧股\t18685\n口力\t18686\n大开杀戒\t18687\n潘美\t18688\nDesignated\t18689\n下秋\t18690\nFaraday\t18691\n九招\t18692\n实量\t18693\n逃生绳\t18694\n国际珠宝网\t18695\ncvbs\t18696\n彩虹乐队\t18697\n绍兴市柯桥区市场监督管理局\t18698\nresources\t18699\nTCH\t18700\nhame\t18701\n中国搜地网\t18702\nKinect\t18703\nshouce\t18704\n梨洲街道\t18705\n中关村大厦\t18706\nVAL\t18707\n上划线\t18708\n2017年8月19日\t18709\n斗米\t18710\n末年\t18711\nfif\t18712\n中美贸易摩擦_\t18713\n舒敏\t18714\n香港台\t18715\n大白汽车\t18716\n站标\t18717\n托尔\t18718\n下辖\t18719\n统计\t18720\n南京依维柯\t18721\n邪恶漫画爱丽丝\t18722\n左手边\t18723\n斯里兰卡\t18724\n闪断\t18725\n金融服务有限公司\t18726\n火柴棒\t18727\n601128\t18728\nKNO3\t18729\n皂荚树\t18730\nam61\t18731\n20周年\t18732\n6路\t18733\ninserts\t18734\n景逸\t18735\n苏妙玲\t18736\n大楼\t18737\n绘画版\t18738\n浙江大学计算机科学与技术学院\t18739\nLester\t18740\nharbour\t18741\ninterrupt\t18742\n汉责\t18743\n光辉\t18744\n一帖\t18745\nTulip\t18746\n掌镜\t18747\nHEALTH\t18748\n六叔\t18749\n寒鸦\t18750\naudience\t18751\n阶梯状\t18752\n天威视讯\t18753\n出货量\t18754\n40磅\t18755\n稀粥\t18756\n统计师\t18757\n止痛\t18758\nmemcmp\t18759\nEmguCV\t18760\n国际象棋
\t18761\n0574\t18762\n淮南职业技术学院\t18763\n重庆市江津区人民政府\t18764\n吴石\t18765\n古今异义\t18766\n特许加盟展\t18767\n壁板\t18768\n火爆农化招商网\t18769\n恩宠\t18770\nbonelee\t18771\n第30类\t18772\ndtu\t18773\n诱惑者\t18774\n黑幕\t18775\n达标校\t18776\n喻义\t18777\nxlim\t18778\n安徽国防科技职业学院\t18779\n协管员\t18780\nBadger\t18781\nchunchill\t18782\n俊才招聘网\t18783\n编程之美\t18784\n氢氧根离子\t18785\n梅里达\t18786\n模拟器\t18787\n广州市广播电视大学\t18788\n温婉\t18789\nvyprvpn\t18790\n我国\t18791\n关西国际机场\t18792\n咸鱼干\t18793\n番薯学院\t18794\nhacking\t18795\n克里斯埃文斯\t18796\nReplica\t18797\n600平方\t18798\n3.3万\t18799\n前档\t18800\n我们家的男子汉\t18801\n重疾保险\t18802\n潘火街道\t18803\n淮海战役纪念馆\t18804\n明细\t18805\n2017年3月17日\t18806\n无爱\t18807\nWebserver\t18808\nedc\t18809\nSTRUCTURE\t18810\n引导式\t18811\n凉\t18812\nthings\t18813\n浇头\t18814\n无盘技术\t18815\nyurun\t18816\nspeedtree\t18817\n晒版机\t18818\n领衔\t18819\nFil\t18820\n900多\t18821\nbeanutils\t18822\n码农网\t18823\n劳务费\t18824\nxiaohua\t18825\n梦幻西游3\t18826\nCGV星\t18827\nnodemon\t18828\n庙会\t18829\nyoon\t18830\n散珠\t18831\n补丁版\t18832\n禅让\t18833\n伊织萌\t18834\n江淮骏铃\t18835\n出彩\t18836\n中央庭\t18837\n河铉雨\t18838\n象形\t18839\n私活\t18840\n巨如集团\t18841\nutf\t18842\n教鞭\t18843\n自由之丘\t18844\n周山神\t18845\nnifi\t18846\nrsd\t18847\n近三十年\t18848\n舆论观\t18849\n夏\t18850\nssss\t18851\n佛教界\t18852\n坐班\t18853\n矿石\t18854\n亮子\t18855\n信息中心\t18856\n标体\t18857\n云末加速器\t18858\n艾维娜\t18859\nxplay3s\t18860\n投机岛期货论坛\t18861\n疾速追杀\t18862\n表生\t18863\n聚色阁\t18864\n石杨路\t18865\n渝金所\t18866\n板砖\t18867\n杨晓云\t18868\n亲子节\t18869\ninit-method\t18870\n秀峰区\t18871\nStrongart\t18872\n王峥嵘\t18873\n天下粮田\t18874\n剑盾\t18875\n5分钟以上\t18876\n份儿\t18877\n光宝科技\t18878\n她说\t18879\n星宫莓\t18880\n5p\t18881\n威风堂堂\t18882\nbbs.0550.com\t18883\n柱斑\t18884\n春江晓景\t18885\n晚亭\t18886\nAstell\t18887\n2017年8月9日\t18888\n标准员\t18889\n阅卷\t18890\n皮蛋粥\t18891\n贵州省公务员局\t18892\n202&\t18893\n养料\t18894\n冰魄\t18895\nTechSpot\t18896\n培训宝\t18897\n测径仪\t18898\n环城西路\t18899\n4w\t18900\n1444\t18901\n化石群\t18902\n连墙\t18903\n4.1.0\t18904\n大连理工大学盘锦校区\t18905\n2640\t18906\n2016年5月13日\t18907\n热封机\t18908\n1500mm\t18909\n跨进\t18910\n江湾城\t18911\nreconfigu
re\t18912\n非商业\t18913\n穿孔板\t18914\n优势\t18915\n4399游戏盒\t18916\n童品\t18917\n2.03\t18918\nlikelihood\t18919\n工程计价\t18920\n不更\t18921\n杨贤江\t18922\n守拙\t18923\n温州医学院附属第一医院\t18924\n年庆\t18925\n早白\t18926\nhba\t18927\n李沁\t18928\n阴阳错\t18929\n浦桑尼克\t18930\n缪晓辉\t18931\n四周目\t18932\n锂枝晶\t18933\n悦来\t18934\n项目源码科帮网\t18935\nFactorio\t18936\nsway\t18937\nARCHITECTS\t18938\n邓文\t18939\n墨头\t18940\n130多\t18941\n梅花\t18942\n有话直说\t18943\n事企\t18944\n微视达人\t18945\n汽车4S店\t18946\n求指\t18947\n深度卷积神经网络\t18948\n万禾\t18949\n神爱之家吧_\t18950\n中国十九冶集团有限公司\t18951\n存现\t18952\n大臂\t18953\n马来西亚航空公司\t18954\n高空作业平台\t18955\n阿里旺旺\t18956\n自寻\t18957\n61&\t18958\n华太\t18959\n罗技G602\t18960\n磷酸三钙\t18961\n产业革命\t18962\n8关\t18963\njewelry\t18964\n相士\t18965\n大连东软\t18966\n一元三次方程\t18967\n李胜素\t18968\n49.9\t18969\n渗透式\t18970\n艮北新城\t18971\n全解\t18972\n土木工程网\t18973\n哈罗德\t18974\n使命召唤13\t18975\nv3.5.2\t18976\n展点\t18977\n渝府\t18978\n多玩剑灵\t18979\n流计\t18980\n36亿\t18981\nFrederick\t18982\n独联体\t18983\n总分类账\t18984\n病性\t18985\n国家新型工业化产业示范基地\t18986\nWin64\t18987\n托曼尼榴莲\t18988\n神志不清\t18989\n惩罚者\t18990\norganizing\t18991\n建施\t18992\n利国镇\t18993\n律考\t18994\n民丰县\t18995\n中国联通\t18996\n清流\t18997\n挥戈\t18998\n百度云迅雷\t18999\n忍者龙剑传3\t19000\n西安科技大学图书馆\t19001\n幸凡学习网\t19002\n百令\t19003\n料理\t19004\n书法鉴赏\t19005\npromoted\t19006\n桂平\t19007\n淘气爷孙\t19008\n易地\t19009\n战者\t19010\n芮城县人民政府\t19011\nBGR\t19012\n企业库\t19013\n厚街万达广场\t19014\n铬\t19015\n第五类\t19016\n兽形\t19017\n第1批\t19018\n七局\t19019\n专精\t19020\n美感\t19021\n街场\t19022\n木盘\t19023\n一彻\t19024\nclipbo\t19025\n隐形人\t19026\npositions\t19027\n无误\t19028\n港证\t19029\n社区公园\t19030\nfor&#160\t19031\n网易云邮箱\t19032\nfaw\t19033\nMikasa\t19034\nconverted\t19035\n延津县人民政府\t19036\n软土地区\t19037\n枝枝花\t19038\n漷县镇\t19039\n隐约可见\t19040\n锚筋\t19041\n707路\t19042\n党建办\t19043\n焦裕禄纪念馆\t19044\n天极产品库\t19045\n发怒\t19046\n精彩音乐汇\t19047\n广西云龙招标有限公司\t19048\n赵之心\t19049\n冯老师\t19050\n周未\t19051\n八达岭孔雀城\t19052\n六组\t19053\n源代\t19054\nMarcelo\t19055\nSVO\t19056\n铃谷\t19057\nWeare\t19058\nhanger\t19059\n金果园\t19060\nEU5\t19061\n137\t19062\n特首\t19063\n艾林\t19064\n无格式\t1906
5\n5000小时\t19066\n腾牛网\t19067\n中国新闻周刊\t19068\n金投保险网\t19069\n陈冠\t19070\n文斯\t19071\nwin95\t19072\n黄木岗\t19073\n孔隙率\t19074\n大学生活\t19075\n黑白菜\t19076\n苍耳\t19077\nhmp\t19078\n大于\t19079\n伏尔加格勒\t19080\n滚雪球\t19081\n屋久岛\t19082\n笑死\t19083\n信息公司\t19084\n第十位\t19085\n真央\t19086\n杭州新世纪外国语学校\t19087\n中国注册会计师审计准则\t19088\n8处\t19089\n老公公\t19090\n梅花路\t19091\n臨\t19092\n黄美玲\t19093\n炉石传说传说\t19094\nhormone\t19095\ngarfield\t19096\n冲激响应\t19097\nanalytica\t19098\n裙子剑网3\t19099\n什么价\t19100\n人气\t19101\n首擎\t19102\n魔导士\t19103\nlitchi\t19104\n十万个\t19105\nBlocked\t19106\nqt编程\t19107\n大伙房水库\t19108\n三脚架\t19109\n白欣欣\t19110\nVimeo\t19111\n金禾实业\t19112\n屠夫小姐\t19113\nrammstein\t19114\n丑娘娘\t19115\n选填\t19116\n流放之路预言机制\t19117\n棒槌\t19118\n2轮\t19119\nadept\t19120\n闺中\t19121\nSpectra\t19122\n不要\t19123\n坏男人\t19124\n报道员\t19125\n祥利\t19126\nKent\t19127\n和田白玉\t19128\n每隔10分钟\t19129\n聂鑫\t19130\nILI9341\t19131\n三元奶粉\t19132\n三棵树漆\t19133\n流利条\t19134\n唐诗联唱\t19135\n美国丽人\t19136\ncurly\t19137\n窠\t19138\n急性化脓性扁桃体炎\t19139\n美国驻上海总领事馆\t19140\n肖霞\t19141\nbiopython\t19142\n广东省消防\t19143\nTai\t19144\n断魂枪\t19145\n富通保险\t19146\n盖县\t19147\n1.8亿\t19148\n奉子\t19149\n火力全开\t19150\n宝刀未老\t19151\n大梵\t19152\n真岛浩\t19153\n新东方公寓\t19154\n书栈\t19155\n叶片式\t19156\nPersistence\t19157\n工学\t19158\nwww.3dm\t19159\n镐头\t19160\n生死关头\t19161\n逝去的爱\t19162\npossion\t19163\n639号\t19164\nmainactivity\t19165\n3389\t19166\n第12册\t19167\n成都求艺网\t19168\n一场空\t19169\n复刻版\t19170\n金马镇\t19171\n商末尾\t19172\nyy4080\t19173\n2.21\t19174\n葬魂\t19175\n_k73\t19176\n北京交通大学图书馆\t19177\n黄河口\t19178\n邮编库\t19179\n黄庄村\t19180\n湘人社\t19181\n中央税\t19182\n二七区人民政府\t19183\nICN\t19184\n卡米\t19185\n4批次\t19186\n船级\t19187\n吴江政府网\t19188\npid算法\t19189\n联想Y430P\t19190\n博士网\t19191\n20151106\t19192\n谱尼测试\t19193\n东方输入法\t19194\n一毫克\t19195\nCalculated\t19196\ntet\t19197\n9999元\t19198\n永嘉县\t19199\n835\t19200\n望京soho\t19201\n台客\t19202\n箱式炉\t19203\n一朵花\t19204\n第28号\t19205\n雅书阁\t19206\n汪潮涌\t19207\n何凡\t19208\n三证\t19209\n电讯\t19210\n沙海老兵\t19211\n控盘\t19212\n连续墙\t19213\n赵学\t19214\n篮球衣\t19215\n西方经济学\t19216\n中青旅山水酒店\t19217\n功率谱密
度\t19218\nappeared\t19219\n大车企\t19220\n扇动\t19221\n碳纤维复合材料\t19222\n训练赛\t19223\n58型\t19224\n板本\t19225\n洛神赋图\t19226\n80SMP4\t19227\n枫溪区\t19228\n牙贴\t19229\n双屿街道\t19230\n战蝎\t19231\n鲷哥\t19232\nSurgery\t19233\niPhone8/8plus\t19234\norchard\t19235\nフォ\t19236\n拼一拼\t19237\n杭白菊\t19238\nG4600\t19239\n京生\t19240\n扭杆\t19241\n很奇怪\t19242\nxing.org1^\t19243\n华润堂\t19244\n朴明秀\t19245\n潍坊市高新区\t19246\nマイ\t19247\n拓衡器\t19248\n刹车分泵\t19249\n0.03%\t19250\n陪玩\t19251\n微小微\t19252\n乙酰胺\t19253\n泰浩\t19254\n中投在线\t19255\n金立s9\t19256\n4132\t19257\n永阳\t19258\n穿裆\t19259\n脱衣舞\t19260\n磨削液\t19261\ngtj2018\t19262\nmiui9稳定版\t19263\n备汛\t19264\n筷\t19265\n赶集网\t19266\n美国众议院\t19267\n落落\t19268\nzabbix\t19269\n赵芳\t19270\n武汉国际博览中心\t19271\n王钰\t19272\nskill\t19273\n跳跃\t19274\n99bt\t19275\n新碶街道\t19276\n亲眼目睹\t19277\n长虹电视\t19278\n球部\t19279\n千次\t19280\nCCTV-5体育赛\t19281\nScopus\t19282\nvs刀剑神域\t19283\n外倾\t19284\n昆达里尼\t19285\n小豆丁\t19286\n脸盲\t19287\n炸街\t19288\n市统计局\t19289\n多列\t19290\n纯水岸\t19291\n卡诺图\t19292\ndoppler\t19293\nvaluation\t19294\n婚后生活\t19295\n兰考县委\t19296\nKVB\t19297\n婚联\t19298\n第三分\t19299\n丁香五月婷婷\t19300\n中国诗词大会3\t19301\n034\t19302\n异男\t19303\n黑点\t19304\nHollywood\t19305\n精怪\t19306\n科技世界网\t19307\n陈思宇\t19308\n。2\t19309\nBDP\t19310\n合成\t19311\n海棠木\t19312\n博迪\t19313\n压力表\t19314\n丧尽\t19315\n实名宝\t19316\n恐怖袭击案\t19317\n哈尔滨市公安局\t19318\n石首市\t19319\n9340\t19320\n航空队\t19321\n预授权\t19322\n陈思成\t19323\n868路\t19324\n分久必合\t19325\n汤氏\t19326\n宜春学院\t19327\n罗宾森广场\t19328\n湟源县\t19329\nLime\t19330\nhukey\t19331\n子载波\t19332\nSolidAngle\t19333\nwallet\t19334\n假发片\t19335\n5.0000元\t19336\n太客气\t19337\n中性灰磨皮\t19338\n唐韵\t19339\n宁围街道\t19340\n和家网\t19341\n100多岁\t19342\n华阳湖\t19343\nBLG\t19344\n印媒\t19345\nnodepad++\t19346\n周平\t19347\nVL\t19348\nhorse\t19349\n材料导报\t19350\n朝天椒\t19351\n老猪\t19352\n20mhz\t19353\n造境\t19354\n网银\t19355\n项氏\t19356\n大时\t19357\nqyer\t19358\n九联\t19359\n景德镇北站\t19360\n金洋\t19361\n9ku\t19362\ncassette\t19363\nvortex\t19364\nxixihuang\t19365\n精研\t19366\n老白汾\t19367\n延安大学\t19368\n私行\t19369\nMM私房照\t19370\n稻瘟病\t19371\n敢怒\t19372\n求部\t19373\
nActivities\t19374\nsimeji\t19375\n一根\t19376\n1618w\t19377\nsessionID\t19378\n子不语\t19379\n赤潮神族\t19380\nSWAN\t19381\n摘取\t19382\n常小兵\t19383\n森乐\t19384\n大货车\t19385\nBoxing\t19386\n走进撸猫成瘾少年的世界\t19387\n肾透明细胞癌\t19388\n系统之家ghost\t19389\n二群\t19390\n马亮\t19391\n矿院\t19392\n混合所有制经济\t19393\ncpucores\t19394\n液金\t19395\n声之形\t19396\nox\t19397\n非自然死亡日剧\t19398\n网易云音\t19399\ndatables\t19400\n超声骨密度仪\t19401\n宁波\t19402\n90年\t19403\n1640\t19404\ntekla\t19405\n亚光砖\t19406\nSapporo\t19407\nHarris\t19408\n社会地位\t19409\nfilesize\t19410\n商业银行法\t19411\n第124期\t19412\nzhige\t19413\n烈焰传奇\t19414\n发现者\t19415\n米尔豪斯\t19416\n中旅总社\t19417\n数十款\t19418\n河南工商局\t19419\n庆历\t19420\n微信双开\t19421\n沈政发\t19422\n东安动力\t19423\ncatholic\t19424\n英国保诚集团\t19425\nfm2006\t19426\n藏密\t19427\nLIKE\t19428\n7W\t19429\n95平方\t19430\n林美\t19431\n奴隷色\t19432\n简果\t19433\n手表带\t19434\nS7300\t19435\n兴业矿业\t19436\n太漂亮\t19437\n公集\t19438\n程响\t19439\n引擎\t19440\n宣传稿\t19441\n路虎捷豹\t19442\n火影博人传\t19443\n五彩湾\t19444\n效益\t19445\n苏畅\t19446\nCongratulations\t19447\n签诗\t19448\n西南财经大学金融学院\t19449\n个旧\t19450\n活成\t19451\n加氢反应器\t19452\n虹桥动车站\t19453\n犯罪片\t19454\nDamage\t19455\n五一银行\t19456\n康乐路\t19457\n光石\t19458\n黄其森\t19459\n衍生剧\t19460\n浙江树人学院\t19461\n陈叶翠\t19462\nCOLUMNS\t19463\nev3\t19464\n亿达\t19465\n惨\t19466\nE盘\t19467\nパ\t19468\n由乃\t19469\n长江文艺出版社\t19470\n负定\t19471\n体积小\t19472\n411\t19473\n室壁\t19474\n重庆市巴蜀中学\t19475\n题集\t19476\n2013年09月07日\t19477\n报批稿\t19478\nGameboy\t19479\n断裂\t19480\n李在镕\t19481\n米格尔\t19482\n惜惜\t19483\n23rd\t19484\nMime\t19485\n海南在线\t19486\n玩命\t19487\n钢铁业\t19488\n10万次\t19489\n康城\t19490\n皮下组织\t19491\n各省会\t19492\nwin7升win10\t19493\n2017年11月5日\t19494\n天外归云\t19495\nINSTITUTE\t19496\n贝美\t19497\n战神之怒\t19498\n热河路\t19499\nff\t19500\n小霸王\t19501\n闪动\t19502\n特币\t19503\nPPT123模板网\t19504\n黄燕\t19505\n粒\t19506\n王世坚\t19507\n上海托福\t19508\n抢购\t19509\n攀登者\t19510\n600823\t19511\n大英\t19512\n风疹\t19513\n成长性\t19514\n庞飞燕\t19515\n县管\t19516\n小额投资\t19517\n好莱坞_电影网\t19518\nE6430\t19519\n中国科学院软件研究所\t19520\n林笛儿\t19521\n西畴\t19522\nStatistics\t19523\n张凌\t19524\n粉底色\t19525\n旅游委
\t19526\n沙头\t19527\n新生儿呼吸窘迫综合征\t19528\n背面\t19529\nsingal\t19530\n维生素E软胶囊\t19531\n胎具\t19532\n亚伦大陆\t19533\n兆瓦\t19534\n福生\t19535\n天津国际学校\t19536\n布拉格之恋\t19537\nex360\t19538\n乐视2\t19539\n连江县\t19540\n中国农经信息网\t19541\n采矿机\t19542\n百度帖吧\t19543\n早教\t19544\n保变电气\t19545\nTIME_WAIT\t19546\n备付\t19547\n人才服务中心\t19548\n中山大学管理学院\t19549\ncasting\t19550\n韩网\t19551\nsupermap\t19552\nzhongzi\t19553\n初试\t19554\ndefined\t19555\n四川省发展和改革委员会\t19556\n万科花园\t19557\nhashset\t19558\nNordri\t19559\n蒸肉\t19560\noffload\t19561\n崔永\t19562\nVendor\t19563\n中赫国安\t19564\n入展\t19565\n同龄\t19566\nad13\t19567\n骑龙\t19568\n云宝黛西\t19569\n迷走\t19570\n病毒性感染\t19571\n生辰石\t19572\n発\t19573\n利物\t19574\n地方领导留言板\t19575\n冰上的尤里\t19576\n尼尔斯骑鹅旅行记\t19577\n外马路\t19578\n建筑容积率\t19579\nTough\t19580\n屏膜\t19581\n铁块\t19582\nqd8.com\t19583\n迪森股份\t19584\n曲速\t19585\n膝跳\t19586\n工藤优\t19587\n珲春市人民政府\t19588\nВ\t19589\n方凳\t19590\n焰舞\t19591\ne13\t19592\n柳林县\t19593\n量本利分析\t19594\n寻春\t19595\nx-3\t19596\n清平高速\t19597\nen\t19598\n诺德基金\t19599\n崖山海战\t19600\n喜获\t19601\nQQ表情大全\t19602\n其余\t19603\n现年\t19604\nwin7系统硬盘分区\t19605\n绝地求生优化\t19606\nmelamine\t19607\n决意\t19608\n双排键\t19609\n锅青天\t19610\n二十四孝图\t19611\n58名\t19612\n对比分\t19613\nLI-NING\t19614\n新兴区\t19615\nkeystore\t19616\n环人\t19617\n铜氨丝\t19618\n喋喋\t19619\nfxml\t19620\n秦时明月丽人心\t19621\n零才\t19622\n对接\t19623\n管管\t19624\n耐驰\t19625\nGrandma\t19626\n赖世雄美语从头学\t19627\ncurrency\t19628\n美呗网\t19629\n爱云\t19630\nseven\t19631\n许褚\t19632\n科雷嘉\t19633\nbd1280\t19634\n纯冰\t19635\n战尊\t19636\n超筋\t19637\n刘惠\t19638\n区委宣传部\t19639\n集合贴\t19640\nsouq\t19641\n歌手4\t19642\n三独\t19643\n销售成本\t19644\n西游记的故事\t19645\n农民工\t19646\n散曲\t19647\n恒温混水阀\t19648\nAdditives\t19649\n尧化门\t19650\n沙地\t19651\nmkt\t19652\n单稳态触发器\t19653\nGarbage\t19654\nv14\t19655\n大比特\t19656\n爽气\t19657\n贡献币\t19658\n603301\t19659\nCheckstyle\t19660\nAmericas\t19661\n古董商\t19662\n电窑\t19663\n经纬网\t19664\n油杆\t19665\n路通\t19666\n疯狂猜猜\t19667\n热斑\t19668\n历史学类\t19669\naelf\t19670\n5d虾仔\t19671\n呆若\t19672\n德治国\t19673\n20150523\t19674\n赵恩静\t19675\n英格兰国家队\t19676\nforus\t19677\n保丽龙\t19678\n
崂\t19679\n大族激光\t19680\n死侍\t19681\n电压传感器\t19682\n中国商品网\t19683\nTesla\t19684\n潍坊学院\t19685\n矿灯\t19686\n850亿\t19687\n绿区\t19688\n调用子组件\t19689\n折标系数\t19690\n闪击战\t19691\n国际化\t19692\nA7RM2\t19693\nrRNA\t19694\nCHAT\t19695\n烔炀镇\t19696\n多亮\t19697\n秘密俱乐部\t19698\n砍倒\t19699\nreen\t19700\n满井\t19701\n80073712\t19702\ncaches\t19703\nsamtools\t19704\n炒鱿鱼\t19705\nwebqq\t19706\n犀利哥\t19707\n冷汗\t19708\n王永红\t19709\n食尚\t19710\ntfrecords\t19711\n工作群\t19712\ngcf\t19713\n翻版\t19714\n唐岛湾\t19715\nGuides\t19716\n不能不\t19717\n卓越东部\t19718\n中央美术学院\t19719\n堇青石\t19720\n陈建国\t19721\n4V\t19722\n81平米\t19723\n独立游戏\t19724\n耐尔\t19725\n戴威\t19726\n林薇\t19727\n玩花\t19728\n频发\t19729\n微山\t19730\n版画\t19731\n提亲\t19732\n播放地址\t19733\n健儿\t19734\n问图\t19735\nCSSA\t19736\nxpel\t19737\n活灵活现\t19738\n胡阳\t19739\n朵蜜\t19740\nQueena\t19741\n2019年下半年\t19742\n3312\t19743\nDAL\t19744\ncant\t19745\n赤铁矿\t19746\n宿迁市\t19747\n忠仑公园\t19748\n喷雾器\t19749\n刘国峰\t19750\n阵雨\t19751\ncompare3\t19752\n飞机票\t19753\n2017-08-25\t19754\n1.14d\t19755\nSWIG\t19756\n意見\t19757\nyoucompleteme\t19758\n2000美元\t19759\n大伙儿\t19760\n亮忽暗\t19761\n中科院信息工程研究所\t19762\n差动放大器\t19763\n傲世致我们终将逝去的青春\t19764\n伊苏8pc\t19765\n融创城\t19766\n渗透系数\t19767\n魔刀\t19768\n三友\t19769\n小戏\t19770\n陈海涛\t19771\n深不可\t19772\n107个\t19773\n云浮市人民政府\t19774\n驱程\t19775\nnodpad\t19776\n中药药剂学\t19777\n珠江台\t19778\n新浪公司\t19779\n覃塘区\t19780\nshenghuo\t19781\n七品芝麻官\t19782\n2016年3月份\t19783\n惜别\t19784\n南宁地铁3号线\t19785\n即时性\t19786\n燕池\t19787\n排产\t19788\n购票\t19789\n三万块\t19790\n20厘米\t19791\n仪器交易网\t19792\n金凤镇\t19793\n旱情\t19794\n汤城\t19795\n540M\t19796\nyaml\t19797\n时纪\t19798\n红莲之王\t19799\n软交\t19800\n成明\t19801\n雁翅镇\t19802\ng6\t19803\n泉州市食品药品监督管理局\t19804\n安美露\t19805\n剖腹产\t19806\n亚门\t19807\n舰队co\t19808\n干压\t19809\nartorias\t19810\nam\t19811\n3552\t19812\n冰火岛\t19813\n戴昆\t19814\nxhekpon\t19815\n攻资\t19816\n框值\t19817\nt8s\t19818\n丙戊酸钠缓释片\t19819\n胎儿双顶径\t19820\n4月12\t19821\n华克金\t19822\n十文\t19823\n0.73\t19824\ncks\t19825\n林肯领航员\t19826\nバカ\t19827\n家访\t19828\n日精\t19829\nposgresql\t19830\n乐伐替尼\t19831\n世嘉万界天尊\t19832\n诚意药业\t19833\n半脸\
t19834\n豆农\t19835\n如皋市科技局\t19836\n庚戌\t19837\n波浪式\t19838\n完整页\t19839\n武汉大学化学与分子科学学院\t19840\n开一贴\t19841\n高比\t19842\n肾小囊\t19843\nv4.0.2\t19844\n万怡\t19845\n古达\t19846\n预录取\t19847\n9.18\t19848\nCult\t19849\n养父\t19850\n威佳\t19851\n多尔\t19852\nPascal\t19853\nwindws\t19854\n余文\t19855\n毕业设计论文网\t19856\n超模们\t19857\n右列\t19858\nfortify\t19859\n00000008\t19860\n兵员\t19861\n医科\t19862\n5百万\t19863\n裁判证\t19864\n175g\t19865\n解中\t19866\n长武\t19867\n恒信贵金属\t19868\n牛肉丸子\t19869\n刀剑神域虚空幻\t19870\n小篆\t19871\n普顿\t19872\n金溪\t19873\nmuddy\t19874\n铠装热电偶\t19875\n苏美尔人\t19876\n1000转\t19877\n16个小时\t19878\n部招\t19879\nember\t19880\n看行\t19881\n直\t19882\n中国进出口商品交易会展馆\t19883\n【文创\t19884\n大话西游手游\t19885\nelongation\t19886\n第A05\t19887\n玩闹\t19888\n双龙镇\t19889\n7.0_\t19890\njoal\t19891\n42070\t19892\nMultiselect\t19893\n重生西游之天篷妖尊\t19894\nXXL\t19895\nShould\t19896\nSkylines\t19897\nDCB\t19898\n纺锤\t19899\n名关\t19900\n流注\t19901\n网捷\t19902\nneighbor\t19903\n评茶员\t19904\nstreaming\t19905\nCBD万达广场\t19906\n法力\t19907\n蓝堡湾\t19908\n西交利物浦大学\t19909\n第十三套\t19910\n一级半\t19911\n进取\t19912\n10月13日\t19913\n钱\t19914\n氟\t19915\n生活态度\t19916\n温德姆\t19917\n低练度\t19918\n注册土木工程师\t19919\n充填机\t19920\n#号\t19921\n水罐\t19922\n羊肉卷\t19923\n南朗\t19924\n表嫂\t19925\nwishbbs\t19926\n西游星球大战\t19927\n亚美科技\t19928\n宠物蛋\t19929\n天圣制药\t19930\nping通服务器\t19931\n2.00千克\t19932\n漫漫人生路\t19933\n66law.cn\t19934\nsimadi\t19935\n10万台\t19936\n影印\t19937\n医学考试网\t19938\n东岳集团\t19939\n棒式\t19940\n标兵\t19941\n枭兽\t19942\n各异\t19943\n阿里小蜜\t19944\n鬼仔\t19945\n喷射机\t19946\n七里海\t19947\niDan\t19948\n独舞\t19949\n中国发改委\t19950\n千库网\t19951\n灌木丛\t19952\n立机\t19953\n330\t19954\n新兰8cj\t19955\n武汉热干面\t19956\n酩酊大醉\t19957\n遮丑\t19958\n蓝条\t19959\ngamer\t19960\n依然\t19961\nyy粉丝网\t19962\n55米\t19963\n强克\t19964\n脚麻\t19965\n建办\t19966\n玄玄\t19967\nwaltz\t19968\n瘫痪\t19969\n相字\t19970\n梅莉\t19971\n边检\t19972\n元数据库\t19973\n郑国光\t19974\n三弄\t19975\n垃圾费\t19976\n无畏先锋\t19977\n非车\t19978\n桁\t19979\nThingiverse\t19980\nフェラ\t19981\n富媒体\t19982\n中期\t19983\n房地产商\t19984\n排列\t19985\n折券\t19986\n贺尔碧格\t19987\nJensen\t19988\nspring-test\t19
989\n体育世界\t19990\n银湖网\t19991\n战败\t19992\n湖贝\t19993\n剑网三重置版\t19994\n做出来\t19995\nhental\t19996\n创世套\t19997\nlamada\t19998\n中华人民共和国节约能源法\t19999\n新相亲时代\t20000\n刘大伟\t20001\n涂销\t20002\n陈茵媺\t20003\nOpenid\t20004\n熹妃传十世情缘\t20005\n中图法\t20006\nabo文\t20007\n谢亚龙\t20008\n小虐\t20009\nbeach\t20010\n史稿\t20011\n厉少\t20012\n毛伟明\t20013\n巢式\t20014\n通解\t20015\n影侠网\t20016\n金子晴\t20017\n金蝶kis标准版\t20018\n沧浪区\t20019\n第40次\t20020\n如来\t20021\n美国中情局\t20022\nocl\t20023\n重庆市质量技术监督局\t20024\n翻墙\t20025\n花瓣雨\t20026\nlikey\t20027\n厦门大学\t20028\ntumors\t20029\n小吧\t20030\n2018年4月27号\t20031\n桃花芯木\t20032\n城关派出所\t20033\n勋鹿\t20034\ninnodb\t20035\n国石\t20036\nhistories\t20037\n中华汽车\t20038\n广告狂人\t20039\n票据质押\t20040\n易淡\t20041\n避雷器\t20042\n有钱人\t20043\n5宫\t20044\nLoadRunner\t20045\n抵扣分录\t20046\n无锡二院\t20047\n拍挡\t20048\nmds\t20049\n1934\t20050\n暗渡陈仓\t20051\nJfinal\t20052\npdf.doc\t20053\n高新医院\t20054\nwebb\t20055\nheian\t20056\n731部队\t20057\n绝地求生腰射\t20058\nNATO\t20059\n教考\t20060\n产品篇\t20061\n阴臀\t20062\n1985\t20063\n以色列\t20064\nfmod\t20065\nvtt\t20066\nkrsort\t20067\n棉签\t20068\n圆轴\t20069\n上心\t20070\n敌敌\t20071\n奸染\t20072\n证券市场基本法律法规\t20073\n魔犬\t20074\nSUM函数\t20075\n南京市物价局\t20076\n买卖房\t20077\n出风\t20078\n柳儿\t20079\naveeno\t20080\n肼\t20081\n特战队员\t20082\n艾西瓦娅·雷\t20083\n乌鲁木齐市\t20084\nB7\t20085\n卸扣\t20086\n英雄之刃\t20087\n王捷\t20088\n装饰盒\t20089\n苏州园林\t20090\nInternet_\t20091\n华媒\t20092\n艾佳\t20093\nbestv\t20094\n刷油漆\t20095\n充满信心\t20096\n瑞兽\t20097\n夏侯瑾轩\t20098\ntry{}catch\t20099\n铜雕\t20100\n丰田霸道2700\t20101\n机群\t20102\ngdgp3.chinaxinge.com/shuju2/201710/2017101520524387066.htm\t20103\n消费型\t20104\n叶全真\t20105\n砌筑工\t20106\n文山壮族苗族自治州\t20107\n中华人民共和国广东海事局\t20108\n外商独资企业\t20109\n南京大学出版社\t20110\n陈字\t20111\niphone7s\t20112\n32.768khz\t20113\n月亮门\t20114\nakg\t20115\n排架\t20116\n军方\t20117\n长安睿骋cc\t20118\n10几个\t20119\n老公天下第一\t20120\n韩林\t20121\n自负盈亏\t20122\n朱慈勉\t20123\n葉\t20124\n青春类\t20125\n固废网\t20126\n安徽省质监局\t20127\n新职\t20128\n流动注射分析仪\t20129\n佛山华英学校\t20130\n雅尼\t20131\n280\t20132\n斑竹\t20133\nSRX\t20134\n鸡犬\t20135\n360Shop\t20136\n吴海燕\t20137\n280
i\t20138\nDEDE模板\t20139\n福来恩\t20140\n地质勘探\t20141\n雨罩\t20142\nTitler\t20143\n1.6XE\t20144\n霸州市\t20145\n空区\t20146\n近岸\t20147\n融易通\t20148\n返利\t20149\n岁月文学网\t20150\n单证员\t20151\n彩带\t20152\n第二款\t20153\n中国科学院深圳先进技术研究院\t20154\n常州市第一中学\t20155\n天龙八部ol_17173天龙\t20156\nusable\t20157\n安天\t20158\ntokudb\t20159\niped\t20160\n涉恐\t20161\n春欲晚\t20162\n肯塔基\t20163\n毕业率\t20164\n选址\t20165\n歌词曲\t20166\nappleID\t20167\nPenny\t20168\n告辞\t20169\n金利\t20170\noffice2012\t20171\n十小时\t20172\n斯特恩\t20173\n窝窝\t20174\n产程\t20175\n上梅林\t20176\n榴水\t20177\nWatchdog\t20178\n弘康\t20179\nguanwang\t20180\n翠苑一区\t20181\n氧化碳\t20182\nTumi\t20183\ntribez\t20184\nOwen\t20185\n酷雅\t20186\n溶度积\t20187\n托县\t20188\n安远县政府\t20189\n000006\t20190\nIP代理\t20191\nResidual\t20192\n宝雅\t20193\n京藏高速\t20194\n樱桃味\t20195\npak\t20196\nevian\t20197\n热斗传说\t20198\n深圳翻译公司\t20199\n重传\t20200\n葆蝶\t20201\n挂架\t20202\ntrc\t20203\n36级\t20204\n癌栓\t20205\n毛边机\t20206\nㄉ\t20207\n信息社会\t20208\n0.25g\t20209\nkankan\t20210\n八件\t20211\n26万\t20212\n花间词\t20213\n大果\t20214\n4.5升\t20215\n名_祥安阁风水网\t20216\n杜隆坦\t20217\n水冷液\t20218\n430000\t20219\n解忧杂货店\t20220\n百枚\t20221\nfae\t20222\nPact\t20223\n玛曲\t20224\n水刺无纺布\t20225\n非洲灰鹦鹉\t20226\n日光城\t20227\ncryp\t20228\n起扣点\t20229\n应酬\t20230\n殊不知\t20231\n勒杜鹃\t20232\n舞技\t20233\n一秒内\t20234\n桂花公园\t20235\n邮筒\t20236\n穷途末路\t20237\n香水百合\t20238\n区场\t20239\n型线\t20240\nharmless\t20241\n256g\t20242\n闵孝琳\t20243\n金刚砂耐磨地坪\t20244\n德运奶粉\t20245\n唐辛子\t20246\nhbf\t20247\nタカスギコウ\t20248\n西华师范\t20249\n骚色\t20250\nhibernate.hbm2ddl.auto\t20251\n城镇土地使用税\t20252\n手势舞\t20253\n乐高机器人\t20254\n南京大学商学院\t20255\nJack\t20256\nCONAN\t20257\n非货币性资产\t20258\n20140801\t20259\n名优\t20260\n暇\t20261\nclassical\t20262\n黄教\t20263\n3DsMax2013主工具栏\t20264\n东湖风景区\t20265\n5892\t20266\n经贸局\t20267\n附加刑\t20268\n桃园小区\t20269\n500户\t20270\n永安行\t20271\n生活圈\t20272\n日语学习网\t20273\n接力出版社\t20274\n洛洛\t20275\n爱屋\t20276\n公平性\t20277\n海姆达尔\t20278\n上海斯信生物科技有限公司\t20279\n希沃学院\t20280\n八步\t20281\n令谷\t20282\ncisc\t20283\n最强战士之迷你特工队\t20284\n心太软\t20285\n御天\t20286\nMFS\t20287\nLili\t20288\n异乡好居网\t20289\n
康嘉\t20290\n基金从业人员资格考试\t20291\n王丽\t20292\n聚英\t20293\n中国美协\t20294\n管枕\t20295\n公交集团\t20296\nfidic\t20297\n冬眠者\t20298\n91卫\t20299\nsings\t20300\n解放j6\t20301\n职能制\t20302\nSaya\t20303\n户口页\t20304\n万斯鞋\t20305\n白枪\t20306\n仔角\t20307\n垃圾佬\t20308\n逆转裁判6\t20309\n美末\t20310\nkeren\t20311\n骨头镇\t20312\n悬浮球\t20313\n严明\t20314\n第几列\t20315\n长安CAE\t20316\n呐\t20317\n纯收入\t20318\n高耗能落后机电设备\t20319\n26.8\t20320\n天潼路\t20321\n米拉山\t20322\nUMD\t20323\nmj\t20324\n白雨\t20325\n同工同酬\t20326\n安尼泰科\t20327\n进士\t20328\n老李\t20329\n公共建筑节能设计标准\t20330\n中华鹰鹘苑\t20331\n泉州师范学院\t20332\n广播稿\t20333\n450_\t20334\n鼎阳\t20335\n05J909\t20336\n创远\t20337\n深绿色\t20338\n李志明\t20339\npchifi\t20340\n蝶祈\t20341\n舌体\t20342\n逢迎\t20343\nPs\t20344\n8082\t20345\nwindows.old\t20346\n萧山市北\t20347\n杨式太极拳\t20348\n北京空军总医院\t20349\ncphi制药在线\t20350\n门阀\t20351\nA77\t20352\nxhdpi\t20353\n东北虎豹国家公园\t20354\n重庆市纪委\t20355\n聚氨酯胶辊\t20356\nlifan\t20357\nUmi\t20358\n提\t20359\n陈米\t20360\n鞍山\t20361\n秋后\t20362\n带水\t20363\n李建林\t20364\n苏庄\t20365\n会计账簿\t20366\n笔尖\t20367\nCMR\t20368\n家常菜谱网\t20369\n尿血\t20370\nFla\t20371\n销售毛利率\t20372\n大发\t20373\nSTAR\t20374\n布鲁塞尔机场\t20375\n经量\t20376\nNDF\t20377\n眼眼\t20378\n张明智\t20379\n致道\t20380\n方针\t20381\n共青团员\t20382\nTitanic\t20383\n林凯文\t20384\n维尔贝克\t20385\n帕瓦罗蒂\t20386\n苏州幼儿园\t20387\n张亚王力宏\t20388\nDonut\t20389\n工参\t20390\n紫花\t20391\n验孕棒\t20392\n日规\t20393\nl455\t20394\n零宽\t20395\n伊人在线\t20396\n36平方米\t20397\n2016年10月19日\t20398\n图元\t20399\n圆满\t20400\n美行\t20401\n风流变身记\t20402\n第7册\t20403\n认出\t20404\n几通\t20405\n启元\t20406\n莫古\t20407\n华为荣耀Magic\t20408\n排字\t20409\n亚历山大大帝\t20410\nKat\t20411\nsusie\t20412\n巴洛克式\t20413\n表参\t20414\nindirection\t20415\n青年在选择职业时的考虑\t20416\n诗部\t20417\n芭朵斯\t20418\n颅\t20419\n马圈\t20420\n金风科技\t20421\n2014年4月\t20422\n陈香何\t20423\n雪佛兰新赛欧\t20424\nsoaring\t20425\n字边\t20426\n不置顶\t20427\n江之岛\t20428\n宏亚\t20429\n吉安市政府\t20430\n腾讯邮箱\t20431\n公开版\t20432\nlumia640\t20433\n金御半山\t20434\n玄武大道\t20435\n香港理工\t20436\n有事\t20437\ncorrespond\t20438\n张培\t20439\n周可\t20440\n感动\t20441\n地仙\t20442\n威利斯\t20443\n掷地有声\t20444\n中际旭创\t20445\n木片\t2044
6\n湖南省统计局\t20447\n82&\t20448\n鳄鱼小顽皮\t20449\n分销商\t20450\nwww.5ydj.com\t20451\n海岛大亨5\t20452\n泾渭分明\t20453\n长久物流\t20454\n靠谱不_\t20455\n复合门\t20456\n马卡龙\t20457\n锐变\t20458\n因而\t20459\n坐宫\t20460\n生话\t20461\n证云\t20462\nby柴鸡蛋\t20463\ngilt\t20464\n无缝切换\t20465\n首都师范大学\t20466\n心金\t20467\n三鞭\t20468\n五叔\t20469\n和密\t20470\nv4.0.3\t20471\n名记\t20472\n潮汕网.com\t20473\n表语\t20474\nDOGE\t20475\n昆明市食品药品监督管理局\t20476\ncommond\t20477\n密切\t20478\nYi\t20479\n4790K\t20480\n瑟妃\t20481\n1.0.42\t20482\n龟类\t20483\n认认真真\t20484\n婺\t20485\nkdevelop\t20486\n广州公共资源交易中心\t20487\n小百灵\t20488\nspe\t20489\n充值卡\t20490\n嘉联支付\t20491\n以太坊私有链\t20492\n五千公里\t20493\n窥阴器\t20494\n紧绷感\t20495\n武举\t20496\n多p\t20497\n湘乡市人民政府\t20498\nhospitality\t20499\n零工\t20500\nwps宏\t20501\n连接池\t20502\nIi\t20503\n蠹\t20504\n长成\t20505\n副庭长\t20506\n橹\t20507\n4月6日\t20508\n氧化铁\t20509\n今川\t20510\n战败国\t20511\n沙发\t20512\nUpdating\t20513\n天津饭\t20514\na型\t20515\n再试试\t20516\n求战\t20517\nHejin\t20518\n郑州市商务局\t20519\n建国街\t20520\n高龙\t20521\n惊鸿仙子\t20522\n细胞株\t20523\ncalc\t20524\n遊網\t20525\n香舍\t20526\n公安部第一研究所\t20527\n鬼乡\t20528\n泸定桥\t20529\nHob\t20530\nnao机器人\t20531\n红旗L5\t20532\n反文\t20533\nexperienced\t20534\n圣龙\t20535\n房宁\t20536\n9b\t20537\nOnyx\t20538\n蓝色钱江\t20539\n市扶贫办\t20540\n秧子\t20541\n采煤\t20542\ndanner\t20543\nInTouch\t20544\n野菊\t20545\nCardi\t20546\n补一补\t20547\n云鹤2\t20548\n安邦保险\t20549\n10毫米\t20550\n龙裔\t20551\nEconomic\t20552\nfastboot\t20553\n移动电玩城\t20554\n黑手\t20555\n中公金融人网\t20556\n金陵十三钗\t20557\n海口市人民政府\t20558\n第97号\t20559\nmike\t20560\n保康\t20561\necilipse\t20562\n化学能\t20563\n梦幻西游符石\t20564\n日籍\t20565\n老科\t20566\n嫡女重生\t20567\nrefers\t20568\n晾衣绳\t20569\n2.67\t20570\nINV\t20571\n干干净净\t20572\n热镀锌钢格板\t20573\n污水厂\t20574\nAirPlayer\t20575\nFlow\t20576\n国语版\t20577\n英都\t20578\n20140105\t20579\n莲前街道\t20580\n德力\t20581\n乘法口诀表\t20582\nCentOS-7\t20583\n补赛\t20584\n长辛店镇\t20585\n不知所终\t20586\n三国立志传\t20587\n酷狗输入法\t20588\n五折\t20589\nposh\t20590\n撞毁\t20591\n蔬食\t20592\n大阳山植物园\t20593\n灵明\t20594\n生性\t20595\n裸车\t20596\n那样\t20597\nexcerpt\t20598\n大明宫国家遗址公园\t20599\n托福阅读\t2060
0\n郫县\t20601\nHenan\t20602\n执业证\t20603\n旋压\t20604\n我的王朝\t20605\n知之深爱之切\t20606\n雪见\t20607\n质量奖\t20608\n气管镜\t20609\nmtu\t20610\n中值滤波器\t20611\n三氯化硼\t20612\n四月五日\t20613\npointwise\t20614\n悦花\t20615\ndiguo\t20616\n高效益\t20617\n小学英语\t20618\nhanoi\t20619\n丰巢\t20620\n终极预告\t20621\n10万平\t20622\n福华\t20623\n斗剧\t20624\nI2P\t20625\n20170322\t20626\n特玩网\t20627\n矢量标志\t20628\n花都区\t20629\n2520i\t20630\n九线\t20631\n慢性子\t20632\n巡风\t20633\n启发式\t20634\n兴村\t20635\n宜园\t20636\nconflicts\t20637\n进点\t20638\n芝士粉\t20639\n下任\t20640\n振安区\t20641\n一周\t20642\nvalidatebox\t20643\n1484\t20644\n创编\t20645\n电信4G卡\t20646\n火把节\t20647\n360safe\t20648\nFirst\t20649\nRPO\t20650\n重刑犯\t20651\n纳德拉\t20652\nsff\t20653\n宿雨\t20654\n前朝\t20655\n脱妆\t20656\n福鼎市\t20657\n危化品\t20658\n张弘范\t20659\n华晨汽车集团控股有限公司\t20660\n题写\t20661\nstateless\t20662\n不见\t20663\n无冬镇物语\t20664\nAIDA\t20665\n颈椎间盘\t20666\n凤凰周刊\t20667\n盛传\t20668\n衡东\t20669\n双袖\t20670\n香焦\t20671\n永隆\t20672\n弄平\t20673\n1V4\t20674\nLENOVO\t20675\nAVZ\t20676\n膺\t20677\nHudson\t20678\nNFA\t20679\n水文地质学\t20680\n逆我者\t20681\n紫米粥\t20682\n1.7.6\t20683\n联苯双酯滴丸\t20684\nCGAL\t20685\n芳华里\t20686\n敷尔佳\t20687\n国际电信联盟\t20688\n潘婕\t20689\n御神\t20690\nyiyun\t20691\n黑酸\t20692\n矿山救护队\t20693\n电话亭\t20694\nchicco\t20695\n撸一撸_\t20696\n十六日游\t20697\naron\t20698\n小檗碱\t20699\n素拓\t20700\n傣家\t20701\n材料科学与工程系\t20702\n血管痉挛\t20703\n北京现代ix35论坛_汽车之家论坛\t20704\nssim\t20705\n马赫数\t20706\n秀雅\t20707\n电锯惊魂2\t20708\n北师大英语\t20709\nalon\t20710\n炫丽\t20711\n调光器\t20712\n黑臂\t20713\n失去控制\t20714\n谱图\t20715\n暗黑破坏神3论坛\t20716\n怪物猎人世界重弩\t20717\n川村真矢\t20718\ntwemproxy\t20719\n猎天使魔女\t20720\n375电影网\t20721\n射手王\t20722\n海运单\t20723\nxuxu\t20724\n爱的罗曼史\t20725\n重臣\t20726\nkaiqi\t20727\nZF\t20728\nwhores\t20729\n沉重\t20730\n汤因比\t20731\n强盛\t20732\nmotorsport\t20733\n小林制药\t20734\n下得\t20735\n张悦楷\t20736\nBerita\t20737\n山内\t20738\nman饭网\t20739\n周二珂\t20740\n纷享销客\t20741\n像样\t20742\n重庆人才网\t20743\nbeige\t20744\ncellulose\t20745\nfarming\t20746\nsouhu\t20747\n斗罗大陆外传神界传说\t20748\n美世界\t20749\n下掉\t20750\n包包\t20751\n巫山烤鱼\t20752\n太原南站\t20753\n凉城县人民政府\t2
0754\n姬小菊\t20755\n个人资\t20756\n增城\t20757\n好歌曲\t20758\n211工程\t20759\n春宫图\t20760\n至伟\t20761\n上访者\t20762\nadoption\t20763\n陈晓楠\t20764\nAvnet\t20765\nkillers\t20766\n体相\t20767\n玲珑塔\t20768\n酸辣汤\t20769\n我是真的爱上你\t20770\n436\t20771\n马克扎克伯格\t20772\nPCK\t20773\n80点\t20774\n过人之处\t20775\nremained\t20776\n10列\t20777\n泳儿\t20778\n套管\t20779\n脓毒血症\t20780\n仙人掌区\t20781\n夜神月\t20782\nClarkson\t20783\n隔代\t20784\n更专业\t20785\n红姐\t20786\nt3\t20787\n嘴子\t20788\n反革\t20789\n到期还款日\t20790\n锐雯\t20791\n票号\t20792\n陈江\t20793\n帝陵\t20794\n木府风云\t20795\n数据转换\t20796\n31段\t20797\n于昊\t20798\noutthinker\t20799\n前一夜\t20800\n康提\t20801\n航空科学与工程学院\t20802\n185cm\t20803\n复仇者联盟3无限战争\t20804\ndivx\t20805\n火象星座\t20806\n王谢堂前燕\t20807\n公路资讯_火车网\t20808\n隐形\t20809\n脸谱网\t20810\npermanently\t20811\nX9000F\t20812\n秒批\t20813\n线口\t20814\n京豆\t20815\n11千瓦\t20816\n舍得街\t20817\n游艇\t20818\n烤架\t20819\nnbhd\t20820\nU-Boot\t20821\n宏达驾校\t20822\nworkstation12\t20823\n外语教育网\t20824\n娄勤俭\t20825\n中国疤痕论坛\t20826\n长江大桥\t20827\n转锁\t20828\n华中科技大学同济医学院\t20829\n鸭毛\t20830\n配方\t20831\n上海绿地申花\t20832\n迪桑娜\t20833\n君兰江山\t20834\n强直性\t20835\n喉片\t20836\nEvelyn\t20837\n孝丰镇\t20838\n千般\t20839\n地速\t20840\n咸阳论坛-华商论坛\t20841\n138度\t20842\n潘通\t20843\n苹果日报\t20844\n相抵\t20845\n无糖口香糖\t20846\n真迹\t20847\n太原理工\t20848\n奖卡\t20849\n堕落街传奇\t20850\niftop\t20851\n益\t20852\n诤言\t20853\n第9号\t20854\nCreep\t20855\n浴血奋战\t20856\n多寡\t20857\n谢志强\t20858\nCAd\t20859\n旧账\t20860\n音段\t20861\nsamba\t20862\n小欣\t20863\n筋饼\t20864\n20185月\t20865\n高压\t20866\n交趾黄檀\t20867\n5排\t20868\n数米\t20869\n喷罐\t20870\n1080P\t20871\n靓影\t20872\n道门老九\t20873\n热空气\t20874\n丁字桥\t20875\n绝地求生锁\t20876\nIRP\t20877\nCor\t20878\n帕斯卡\t20879\n舞蹈节\t20880\n学生党\t20881\nrd640\t20882\n苏州古城\t20883\nwww.doc88.com\t20884\n出阵\t20885\n试验田\t20886\n中文免安装版\t20887\n无花果苗\t20888\n双双\t20889\n迅雷玩客云\t20890\n栏菜\t20891\n阿伽门农\t20892\n可待因\t20893\n第二幅\t20894\n阱\t20895\n机关盒\t20896\n生鱼片\t20897\n富邦\t20898\n行政职业能力测验\t20899\n限售流通股\t20900\n德美\t20901\n农药百科\t20902\n找回\t20903\n郑村\t20904\n十余载\t20905\n存款额\t20906\nⅨ\t20907\n垂钓翁\t20908\n电珠\t20909\n跑光\t20910\n端子台\t20
911\n妇女权益保障法\t20912\n摇身一变成\t20913\n选点\t20914\n双校\t20915\n布袋\t20916\n39mm\t20917\nucosii\t20918\n麦芽汁\t20919\n生物物理所\t20920\n易北河\t20921\n倒贴\t20922\n西南大学学报\t20923\n杨卫华\t20924\nsinX\t20925\nshaonv\t20926\n大螃蟹\t20927\n走世界\t20928\n超长篇\t20929\nGLA\t20930\n平安口袋银行\t20931\n曼胡默尔\t20932\n侯宁\t20933\n心相印\t20934\n牡丹江市\t20935\n4470\t20936\n博艺\t20937\n上海申通地铁集团有限公司\t20938\n北京折叠\t20939\n300个\t20940\n卡尔马龙\t20941\n快递暂行条例\t20942\ncred\t20943\n3棵\t20944\n000852\t20945\n4021\t20946\nSWP\t20947\n朴珍荣\t20948\nPCIe\t20949\n小黑人\t20950\nMomentum\t20951\n珠\t20952\n鱼场\t20953\n本手\t20954\n山堰\t20955\n前黄镇\t20956\ndspace\t20957\n直线方程\t20958\nhehe\t20959\natex\t20960\n综艺\t20961\n温哥华\t20962\n荔枝\t20963\nBiography\t20964\n特尔\t20965\npeg\t20966\n梓潼县\t20967\ndshow\t20968\nV4.1\t20969\nILA\t20970\n40亿美元\t20971\nBootCamp\t20972\n20160505\t20973\n购物券\t20974\n贴边\t20975\nsimulate\t20976\n第24季\t20977\n皇岗口岸\t20978\n张红\t20979\n安徽建筑大学\t20980\n刀峰\t20981\nAnatomy\t20982\nCanary\t20983\n房地产中介信息网\t20984\n红星路\t20985\n冬游\t20986\n非规则\t20987\n漂亮惹的祸\t20988\n单页\t20989\n湖北省住房和城乡建设厅\t20990\nm4p\t20991\n斯菲尔\t20992\n奥利尼克\t20993\n剑柄\t20994\n镇政府\t20995\n南通职业大学\t20996\n练枪\t20997\n乡邻\t20998\n顺丰控股\t20999\n开心果\t21000\n赵剑\t21001\n退行\t21002\nmend\t21003\n很复杂\t21004\n江西高校出版社\t21005\n升学率\t21006\n优行\t21007\n刘自鸿\t21008\n背包兔\t21009\n修辞手\t21010\n挂烫机\t21011\n00000003\t21012\n一礼\t21013\n6.3.26\t21014\n琪琪布电影网\t21015\n命版\t21016\n群魔乱舞\t21017\n郁雨君\t21018\n黑胡椒粉\t21019\n1255\t21020\n88毫米\t21021\n汉化补丁V1.0\t21022\n保险岛\t21023\n黑河学院\t21024\n懒腰\t21025\n高畑勋\t21026\n补强\t21027\n常州二院\t21028\n三电\t21029\n伺服控制器\t21030\n无锡八佰伴\t21031\n山东理工大学\t21032\n改扩\t21033\n三水森林公园\t21034\n光热\t21035\n刘纯\t21036\n耐威科技\t21037\n文浩电商学院\t21038\n远航驾校\t21039\n海能文库www.hnhw.org\t21040\n朝花夕拾\t21041\nsymbolic\t21042\n300116\t21043\n玉溪网\t21044\n清拆\t21045\nBETWEEN\t21046\n初战\t21047\n金士顿\t21048\n异常网游之神级机械猎人\t21049\n马博\t21050\n神机营\t21051\n雨岔大峡谷\t21052\n聊城市工商行政管理局\t21053\n迅雷/BT/百度云\t21054\nyifu\t21055\n武田玲奈\t21056\n家乐福超市\t21057\n美嘉欣\t21058\n窥探\t21059\njaws\t21060\n孙浩宇\t21061\n五年计划\t21062\nweb管理系统\t
21063\n云亭街道\t21064\n雄鸟\t21065\nカノジョ\t21066\n粉丝勋章\t21067\n极地海洋公园\t21068\n建德新闻网\t21069\n镰蟹\t21070\n带口罩\t21071\n33个\t21072\n竹酒\t21073\n中国轮胎商业网\t21074\n人格魅力\t21075\n音基\t21076\nsurrey\t21077\n雷蒙机\t21078\n【理\t21079\nMS\t21080\n紫幽阁\t21081\nchengxu\t21082\n庄周梦蝶\t21083\n落差\t21084\n举证\t21085\n陈欢\t21086\n国安部\t21087\n整形咨询师\t21088\n第几联\t21089\n小巨蛋\t21090\n针锋\t21091\n蕾丝裙\t21092\n北京未来科技城\t21093\n20150820\t21094\n女尸\t21095\n扑尔敏\t21096\n金沙酒店\t21097\n上名\t21098\n肾穿刺\t21099\n13888\t21100\n上海浦东新区高行镇人民政府\t21101\n恒速\t21102\n润建通信\t21103\n贼船\t21104\nfts\t21105\nPropel\t21106\nVSS2005\t21107\n早睡\t21108\nPENTAGON\t21109\n团购券\t21110\n精纺\t21111\nqq个性网\t21112\n环槽\t21113\ndgjy\t21114\n有效期满\t21115\n龙腾网\t21116\n黎欣彤\t21117\n棕榈科\t21118\n第三角\t21119\n劳\t21120\n删失\t21121\n兴安\t21122\n20140411\t21123\n泰妍\t21124\n虎虎生威\t21125\ngasoline\t21126\n中科院苏州纳米所\t21127\n成一团\t21128\n薰儿\t21129\n毛瑟98K\t21130\n神探狄仁杰3\t21131\n可丽蓝\t21132\n中小企业板\t21133\n皮尔逊\t21134\n当头\t21135\n胖头\t21136\n公安医院\t21137\nNTA\t21138\n中国好人网\t21139\nraper\t21140\n难舍\t21141\n北京院\t21142\n口袋妖怪究极日\t21143\n气愤\t21144\n永丰乡\t21145\n第三十二集\t21146\n000623\t21147\n杂志编辑部\t21148\n50多元\t21149\n桂林师范高等专科学校\t21150\n此生因你空欢喜\t21151\n非地带性\t21152\n炫舞舞团\t21153\n5.2.17\t21154\n大话西游之仙履奇缘\t21155\n湖北省物价局\t21156\nmitutoyo\t21157\n旧城镇\t21158\n没问\t21159\n花枝鼠\t21160\n10天前\t21161\n进德\t21162\n愿得一人心\t21163\n手杆\t21164\nArchiver\t21165\n风止雨\t21166\n7品\t21167\n小蓓蕾\t21168\n大鸡\t21169\n日增\t21170\nAdobePDF\t21171\n开服网\t21172\n赤峰市医院\t21173\n速答\t21174\n投篮\t21175\nDOMINO\t21176\n家内\t21177\nchenby\t21178\n777福彩\t21179\nSign\t21180\n佳能800d\t21181\n日料\t21182\n西太后\t21183\nfg830\t21184\nHappyEDay\t21185\n配电箱柜\t21186\nlaca\t21187\n前2月\t21188\n风机盘\t21189\n王永峰\t21190\n靖西\t21191\n补角\t21192\nMOD_www.3dmgame.com\t21193\n汪\t21194\nlianjie\t21195\n太古城\t21196\n穷途\t21197\n大乐斗2\t21198\n书证\t21199\n李钊\t21200\npowergui\t21201\n虫草\t21202\n普快\t21203\n天畅园\t21204\n北海机场\t21205\nPS2018\t21206\n配电柜\t21207\n中科院南海海洋所\t21208\n罗卜\t21209\n倾尽天下\t21210\n下嫁\t21211\nbrick\t21212\n逐浪\t21213\nTinder\t21214\n神舞幻想吧\t21215\n践行者\t2
1216\nOPNET\t21217\n赛车总动员2\t21218\n杨灿\t21219\n情妇与野兽\t21220\nyvonne\t21221\n值法\t21222\n云际\t21223\n马甲线\t21224\ntran\t21225\n王者荣耀女英雄\t21226\n十四郎\t21227\n7迅雷\t21228\n纵观\t21229\nx900\t21230\n6sigma\t21231\n3DSMAX2017\t21232\n逍遥自在\t21233\n水素\t21234\n东新\t21235\n克虏伯\t21236\n葬花吟\t21237\nproteus7.8\t21238\n土建类\t21239\n狭长\t21240\n董家口港\t21241\nAdobe\t21242\n教唆犯\t21243\n孔浦\t21244\n济宁新闻网\t21245\n人大办公室\t21246\n移动电子发票\t21247\n工体北路\t21248\n林德曼\t21249\napocrypha\t21250\nWWW.66152.COM\t21251\nloko\t21252\n2905\t21253\n希咲艾玛\t21254\n天平架\t21255\n30类\t21256\n告白气球\t21257\n2016年8月\t21258\n浙江师范大学行知学院\t21259\n中塑在线\t21260\n学龄儿童\t21261\n罗玉平\t21262\n郭剑亮\t21263\n可孚\t21264\n荣记\t21265\n证券股份有限公司\t21266\nライブ\t21267\n猛鬼\t21268\n中国歌舞剧院\t21269\nwindows2008R2\t21270\n布洛芬\t21271\np2p网贷平台\t21272\n巴氏杀菌乳\t21273\n冯翔\t21274\n陈警官\t21275\nCrunchyroll\t21276\n顺云\t21277\n北京酒仙桥\t21278\n米琪\t21279\n姿势\t21280\n监利县\t21281\nM6600\t21282\n领克领克\t21283\n幻日\t21284\n氢氧化钾\t21285\n搓泥\t21286\nkawasaki\t21287\n车载充电器\t21288\n呼噜\t21289\n眚\t21290\n樱花战\t21291\n150秒\t21292\n租聘\t21293\n黑暗圣经\t21294\n张江科学城\t21295\n事必躬亲\t21296\nshoulders\t21297\n外国语学\t21298\n没完了\t21299\n狱警\t21300\n赫瑞瓦特大学\t21301\n750ML\t21302\nckd\t21303\n测力仪\t21304\n食欲不振\t21305\n青山黛玛\t21306\n很可爱\t21307\n76厘米\t21308\n上江\t21309\n汪汪队立大功全集\t21310\n欧菲\t21311\n厦门建设局\t21312\n途明\t21313\n73条\t21314\nPhilippines\t21315\n在家里\t21316\n希美真\t21317\n分销\t21318\n反病毒\t21319\n掘金宝\t21320\nsmbd\t21321\n红心猕猴桃苗\t21322\nHangzhou\t21323\n温州商贸城\t21324\n热力图\t21325\n斜\t21326\nIndices\t21327\n绿皮车\t21328\n甩客\t21329\n碳板\t21330\n岱山湖\t21331\n诫子\t21332\nGlad\t21333\n加半\t21334\nListing\t21335\nJNBY\t21336\n扰民\t21337\nselect\t21338\n西敏寺\t21339\n顾山\t21340\n隼\t21341\nrecipient\t21342\n制片人\t21343\n苹果醋\t21344\n腰射\t21345\n图案\t21346\n东北塘\t21347\n救援队\t21348\n保护箱\t21349\n广厦学院\t21350\n市场信息网\t21351\n奥妙洗衣液\t21352\n马萨\t21353\n储存箱\t21354\n无锡地铁3号线\t21355\nSurefire\t21356\n南部地区\t21357\nJV\t21358\n东森娱乐\t21359\ntaxonomy\t21360\n黛力新\t21361\nBreast\t21362\nuiswitch\t21363\n涉农资金\t21364\n空中网\t21365\nhdpe\t21366\n铝厂\t21367\n坝段\t
21368\nintranet\t21369\n武训传\t21370\nShaoxing\t21371\nSHKD-595\t21372\n北斗星城\t21373\n春望\t21374\n游学\t21375\n西安铁一中\t21376\n义结金兰\t21377\n信息价\t21378\n二口\t21379\n叨\t21380\n财务司\t21381\n见步\t21382\n婉儿\t21383\n漫无目的\t21384\n自动对焦\t21385\n专座\t21386\nGDC\t21387\nAdore\t21388\nlanka\t21389\n小白兔们\t21390\n小美\t21391\n召集人\t21392\n悦盒ec6108v9\t21393\n20卷\t21394\n泰城\t21395\n最大公\t21396\n晨鑫科技\t21397\n海大富\t21398\n瓯北\t21399\n醒肤露\t21400\n异性恋\t21401\n版库\t21402\n十四天\t21403\n瘦西湖\t21404\n分手以后\t21405\npgrep\t21406\n工班\t21407\n20160712\t21408\n丐版\t21409\n猛鬼差馆\t21410\n回单\t21411\n新白娘子\t21412\n暖春\t21413\n艦\t21414\nGhost\t21415\n钨业新闻网\t21416\n雨篷\t21417\n脑胶质瘤\t21418\n保险员\t21419\n三国战纪\t21420\n密战\t21421\n模板网\t21422\n麻栗坡烈士陵园\t21423\n增城之窗\t21424\n茶刀\t21425\ninfile\t21426\n阮兆祥\t21427\n英魂之刃\t21428\n小李杜\t21429\n董仲舒\t21430\n出样\t21431\n扩写\t21432\nmaya\t21433\n奇女子\t21434\n潘桥\t21435\nnxt\t21436\nroseonly\t21437\nOverleaf\t21438\n8.\t21439\n10周年\t21440\n鲍余\t21441\n飞秋2013\t21442\n明星村\t21443\nLanZhou\t21444\n完结文\t21445\n2014-07\t21446\n腾讯新闻弹窗\t21447\n天佑德\t21448\np社\t21449\n1590\t21450\n杂糅\t21451\n97#\t21452\n跳行\t21453\n东湖中学\t21454\n才俊\t21455\n消灭掉\t21456\nNestlé\t21457\naum\t21458\n神鬼传奇\t21459\n辰阳\t21460\n161014\t21461\n【娃\t21462\n808路\t21463\n千起\t21464\n电脑虚拟内存\t21465\n领航\t21466\n对苯二酚\t21467\n皴\t21468\n夸\t21469\nfanli\t21470\n上海证券\t21471\n上海工商银行\t21472\n代谢性碱中毒\t21473\n南钢股份\t21474\n弥留\t21475\n肌肝\t21476\n1.3G\t21477\n鳞状细胞\t21478\n10.2.3\t21479\n2017年3月3日\t21480\n400PLC\t21481\n八福\t21482\n阻垢\t21483\n融杭\t21484\ndianhua\t21485\n浙江天册律师事务所\t21486\n防爆罐\t21487\nserves\t21488\n飞塔\t21489\n莱因哈特\t21490\n民大附中\t21491\n亚斯\t21492\n交易者联盟\t21493\nFC2\t21494\nyjs\t21495\n黄衣\t21496\n中国财经报道\t21497\n浙江区\t21498\n青神县\t21499\n_途睿欧论坛论坛\t21500\n离合器总泵\t21501\nImmutable\t21502\n11183\t21503\n宫爆鸡丁\t21504\n调角器\t21505\n七朵花\t21506\n水滴\t21507\n门照\t21508\n肖静\t21509\n170421\t21510\n洗脸盆\t21511\n壁报\t21512\n眼底病\t21513\n科麦斯\t21514\n0377\t21515\n拖拽\t21516\n北大青鸟消防\t21517\n鼓芯\t21518\n双牌县\t21519\n手位\t21520\n绿宝书\t21521\n乱搞\t21522\n林鹏\t21523\na+=1\t21524\n未嫁\t21525
\n【德\t21526\n夏祭\t21527\n南宫山\t21528\n高统帅\t21529\n温塘\t21530\n装穷\t21531\n三盛地产\t21532\n废品\t21533\nSchumann\t21534\n展开式\t21535\n10招\t21536\nv5.70\t21537\n完不成\t21538\n巧儿\t21539\n回转窑\t21540\n中金环境\t21541\n粉体圈\t21542\n鱼鹰\t21543\n千朵\t21544\nshanghai\t21545\n据说\t21546\n一丨\t21547\n邓秀新\t21548\n0.4g\t21549\n一般化\t21550\nLibreOffice\t21551\n络新妇\t21552\n爱丽丝梦游仙境2\t21553\n44名\t21554\n500.0\t21555\n纹发\t21556\nczg\t21557\n黄龙体育中心\t21558\n共同富裕\t21559\n前舱\t21560\n酒井\t21561\n黄柏\t21562\n罗平县\t21563\n细案\t21564\n002023\t21565\nn次\t21566\n轴器\t21567\njock\t21568\n一万条\t21569\n亡者\t21570\n大意\t21571\n求动\t21572\n逸动XT\t21573\n长江学者奖励计划\t21574\n圣诞\t21575\n18岁时\t21576\n生长素\t21577\n洁身\t21578\n底火\t21579\n船舱\t21580\nChainsmokers\t21581\n甘肃省地方税务局\t21582\nHACMP\t21583\n卢广\t21584\ne480\t21585\n2916\t21586\n持久装\t21587\n眼镜布\t21588\n孰轻孰重\t21589\n8424\t21590\n蝶窦炎\t21591\n苏落\t21592\n串用\t21593\nthanksgiving\t21594\n北京会议中心\t21595\n药理\t21596\n20150318\t21597\n昙\t21598\n复旦大学出版社\t21599\n幼吾幼\t21600\n河南科技学院新科学院\t21601\ndianyin\t21602\nWebClient\t21603\n汉爵\t21604\n注册建造师管理规定\t21605\n极乐汤\t21606\n汇法网\t21607\n三叠泉\t21608\n巨细\t21609\n相除\t21610\n100安\t21611\n社保单\t21612\n模拟题库\t21613\nJest\t21614\n返还是\t21615\n1867年\t21616\nredirection\t21617\n机器猫\t21618\n上岗\t21619\n58寸\t21620\n西诺\t21621\n中央部委\t21622\n碰撞\t21623\n黄三角\t21624\n刘邓\t21625\n屈臣氏\t21626\n润润\t21627\n最全新\t21628\n林逋\t21629\n方木\t21630\n建行网上银行\t21631\n看题\t21632\n焖烧杯\t21633\n北京市第二十中学\t21634\ne网通\t21635\n三干\t21636\n学生证\t21637\n同温\t21638\n入球\t21639\n助产士\t21640\n无锡市统计局\t21641\n利益相关者理论\t21642\nPhrases\t21643\n复印件\t21644\nTushare\t21645\n皮科\t21646\n28V\t21647\n混凝土减水剂\t21648\n死或生沙滩排球3\t21649\n做货\t21650\n有机颜料\t21651\n浩哥\t21652\nDare\t21653\n硅氧烷\t21654\n装载\t21655\niPhone4S\t21656\nmytoken\t21657\n心性\t21658\n南宁市人民政府办公厅\t21659\n盒上\t21660\n快递柜\t21661\n标致3008论坛_汽车之家论坛\t21662\n金股街\t21663\n中年阵线联盟\t21664\n一览展会网\t21665\n老梁\t21666\nbethal\t21667\n6j1\t21668\n而成\t21669\n长光\t21670\n德阳市\t21671\nCondition\t21672\n菠菜\t21673\n最高超损人的话\t21674\n入豪门\t21675\nplea\t21676\n火箭炮\t21677\n红庙坡\t21678\n60块\t21679\n泊
\t21680\n腰椎间盘\t21681\nXBD\t21682\n活猪\t21683\n最低消费\t21684\n气浮\t21685\nGoing\t21686\n里番社\t21687\nTYPE\t21688\nBoeing\t21689\n芳疗师\t21690\n韶大\t21691\nmsr\t21692\n说亮\t21693\ncemu\t21694\n乞丐版\t21695\nx80\t21696\n双发\t21697\n血钱\t21698\nCodeSmith\t21699\n锦屏春暖\t21700\npsd\t21701\n星际牛仔\t21702\nBirkin\t21703\n喳喳\t21704\n动画站\t21705\n诡案实录\t21706\n陷阵\t21707\nTRY\t21708\nGlam\t21709\n469\t21710\n言情小说吧\t21711\n1042\t21712\n菏房\t21713\n蝶\t21714\n戊肝\t21715\n安格\t21716\n省环境保护厅\t21717\n夏静\t21718\n润扬\t21719\n军嶂\t21720\nGARCH\t21721\n智德教育文库\t21722\n这条街\t21723\n托宾\t21724\n自闭症\t21725\n插电式\t21726\n彩石镇\t21727\n3922\t21728\n三星W2015\t21729\n桥本奈奈\t21730\nGarage\t21731\ncf\t21732\n虫药\t21733\n显贵\t21734\n求医网\t21735\n高度仪\t21736\n潍坊站\t21737\npdc\t21738\nprism5\t21739\n拟牛顿法\t21740\n水豚\t21741\n乐力\t21742\n结构性理财产品\t21743\n荡起\t21744\n收信人\t21745\n生成\t21746\nPMDG\t21747\n约泡\t21748\nCFP\t21749\n七遍\t21750\n针灸针\t21751\n2580\t21752\n中房商学院\t21753\n组合\t21754\n绯弹\t21755\n反犬\t21756\ncorenlp\t21757\n无问题\t21758\n花坞\t21759\n直选\t21760\n妙优\t21761\n迅游加速器\t21762\n退避三舍\t21763\n健腹器\t21764\n地勤\t21765\n两席\t21766\n范睢\t21767\nwestlife\t21768\n燕姿\t21769\n层峰\t21770\n春水堂\t21771\nJasmin\t21772\nmysal\t21773\nNCA\t21774\n布吉\t21775\n坤沙\t21776\n恒润科技\t21777\nhgetall\t21778\n杨志伟\t21779\n苦禅\t21780\nrxjava2\t21781\n三阶矩阵\t21782\n回怼\t21783\n信美\t21784\nTaylorS\t21785\ndpuser\t21786\n26小时\t21787\n加气块\t21788\nStation\t21789\n网络虚拟化\t21790\n王志勇\t21791\n金立金钢\t21792\ni58400\t21793\n太难\t21794\n一百公里\t21795\n24万元\t21796\n2013年3月\t21797\n天梭表\t21798\n海淘网站\t21799\n彩虹猫\t21800\nHarden\t21801\n69002730\t21802\nFritzing\t21803\n马俊仁\t21804\n微单M100\t21805\n高家大院\t21806\n芸香科\t21807\n休闲璐\t21808\n攒钱\t21809\nBreed\t21810\n中国民生银行\t21811\n一英磅\t21812\n轻松一刻\t21813\n性丑闻\t21814\nhash值\t21815\n内联元素\t21816\n歌剧院\t21817\n平刷\t21818\n万象新天\t21819\n北京建国门\t21820\n75道\t21821\n过去5年\t21822\n1094\t21823\n瑞萨电子\t21824\n银猫\t21825\nactiveform\t21826\n勾臂\t21827\n致优\t21828\n吴奇\t21829\nKilling\t21830\n魔卡世嘉\t21831\n拉斐\t21832\n垂尾\t21833\n叶色\t21834\n房屋维修基金\t21835\n俗称\t21836\nwps公式编辑器\t21837\n单眼
\t21838\n西安电子科技大学》\t21839\nBTX\t21840\nsim4\t21841\n鸡尾\t21842\n物联卡吧\t21843\n流云诸葛\t21844\n短篇小说\t21845\n雄心勃勃\t21846\n成交量\t21847\n139亿\t21848\n蝗灾\t21849\nOFweek\t21850\n一鞭\t21851\n门德尔松\t21852\n兴森科技\t21853\n12幢\t21854\n红星天铂\t21855\n诊断仪\t21856\nBD720p/1080p\t21857\n故里\t21858\n警示性\t21859\n颈联\t21860\n春娇弹弹堂\t21861\n曹达华\t21862\n包法利夫人\t21863\nearl\t21864\n大学堂\t21865\nairport\t21866\n人事人才网\t21867\nyiwu\t21868\n买东西\t21869\n数据\t21870\n王菩萨\t21871\n合肥长鑫\t21872\n脉搏血氧仪\t21873\n鱿鱼干\t21874\n火锅鱼\t21875\n水星路由\t21876\n缅\t21877\n李明\t21878\n10班\t21879\n清标\t21880\n大众EA211\t21881\n千篇\t21882\n截访\t21883\n天草四郎\t21884\nQProcess\t21885\n槽点\t21886\n华帝燃气热水器\t21887\n沣东新城\t21888\n玉田镇\t21889\nZ5/Premium\t21890\n瑞达恒\t21891\nchamps\t21892\n阿里扒扒吧\t21893\n红鼎\t21894\n任免职\t21895\n焕彩\t21896\npyTorch\t21897\n157个\t21898\n三得利乌龙茶\t21899\n悬空寺\t21900\n三片式\t21901\nPro6\t21902\n_帮助中心\t21903\n养蛊\t21904\n输入项\t21905\n北京妇幼保健院\t21906\n加拿大女王大学\t21907\nDeposit\t21908\n雌鸟\t21909\nCreo3.0\t21910\nPM961\t21911\n素场\t21912\nzr\t21913\n清华池\t21914\n蛟龙\t21915\n2018.3.17\t21916\n王神\t21917\n托人\t21918\n谷歌日语输入法\t21919\nwz\t21920\n汤剂\t21921\n美国领事馆\t21922\n数字符\t21923\n新知新觉\t21924\n次梁\t21925\n反基督者\t21926\nReDive\t21927\n安源区政府\t21928\n于敏\t21929\n大圣归来棒指灵霄\t21930\n基拉祈\t21931\nJap\t21932\nHighCharts\t21933\n黄圃\t21934\n金斧子\t21935\n燃放\t21936\n红炎\t21937\n法环\t21938\n吉格\t21939\nCOCO\t21940\ndigit\t21941\n神泉\t21942\n萨洛蒙\t21943\n780t\t21944\n地秤\t21945\n同轨\t21946\n胆管细胞癌\t21947\n99e热\t21948\nIDea\t21949\nconnot\t21950\nsystemd\t21951\n注册岩土工程师考试\t21952\n铜管\t21953\n世格\t21954\nCannot\t21955\n孕婴童资讯中心\t21956\n数字变量\t21957\n3234游戏网\t21958\n口袋妖怪红蓝宝石复刻版\t21959\n广西北部湾\t21960\n中共鹤山市委\t21961\n西安交通大学口腔医院\t21962\n巴门尼德\t21963\n寿山乡\t21964\n喷潮\t21965\n导学号\t21966\n欧文4\t21967\n思诺思\t21968\n能使\t21969\n方恒\t21970\n环太平洋2\t21971\n挥汗\t21972\n北京总部基地\t21973\n一层楼\t21974\nvncviewer\t21975\n电热片\t21976\n流下\t21977\n公诚管理咨询有限公司\t21978\n两天两夜\t21979\n100匝\t21980\n当代美术馆\t21981\n丝网\t21982\nBC\t21983\n汤姆猫\t21984\nipm\t21985\n石岗\t21986\n林蛙油\t21987\n双河湾\t21988\nサヨナラ\t21989\n公试\t21990\n化建\t2
1991\n报验\t21992\n凯猫\t21993\n丁公路\t21994\n阿特伍德\t21995\nKNOWN\t21996\n祈祷文\t21997\n配血\t21998\nDEP\t21999\nmagnesium\t22000\n宝藏\t22001\n欧姆龙\t22002\n米其林三星餐厅\t22003\n面食\t22004\n金能科技\t22005\n备注页\t22006\n特派员\t22007\n这时\t22008\nGDPR\t22009\n卫东区\t22010\n全国糖酒会\t22011\nMCF\t22012\n孕母\t22013\n选民\t22014\n水桥保寿堂\t22015\n熵变\t22016\n朴春\t22017\n拍手称快\t22018\n胡丹\t22019\n老情人\t22020\n欧力威\t22021\n温碧霞\t22022\n1843\t22023\n20151128\t22024\n咸恩静\t22025\n48位\t22026\n爱火\t22027\n花儿与少年3\t22028\n夜夜噜2017\t22029\n中心点\t22030\n郑承焕\t22031\n河北省农村信用社\t22032\n看相\t22033\n甘李药业\t22034\nwww.yz88.org.cn\t22035\nn3450\t22036\nUnits\t22037\n老茂\t22038\n棵树\t22039\n性秘密\t22040\n侯湘婷\t22041\n部份\t22042\n金华经济技术开发区\t22043\n王兰斯\t22044\nm60\t22045\n上影线\t22046\n循环流化床\t22047\nIMWeb\t22048\nPreps\t22049\n操作性\t22050\n使命召唤11高级战争\t22051\n数制\t22052\nmiyavi\t22053\n齿廓\t22054\n万科云城\t22055\n王者级\t22056\n河南省发改委\t22057\n青岛公安\t22058\n农历三月\t22059\n世事\t22060\n驱魔少年\t22061\n肺癌骨转移\t22062\n滨江区政府\t22063\n智游\t22064\n易界\t22065\n000157\t22066\nboder\t22067\n磅房\t22068\nJoom\t22069\n七哥\t22070\n东奥会计网校\t22071\n金泰梨\t22072\nwhirlpool\t22073\nkonachan\t22074\n学习动机\t22075\nparticipating\t22076\nMicrophones\t22077\n2.54MM\t22078\n蓬莱仙山\t22079\nErno\t22080\nAICPA\t22081\nv8.4\t22082\n贵妃醉酒\t22083\n名卡\t22084\n4.0T\t22085\n一杯子\t22086\n大庆路\t22087\nSLAVE\t22088\n防夹\t22089\n重庆中国青年旅行社\t22090\n丁香作文网\t22091\n類\t22092\n杨新民\t22093\nF0\t22094\n天津医科大学第二医院\t22095\nv1.00\t22096\n有子\t22097\n高架\t22098\nhd630\t22099\n巴甲\t22100\n军火商\t22101\n簇拥\t22102\n企服\t22103\nSPSS\t22104\n宁玛\t22105\n卫生防疫站\t22106\n银盏\t22107\n美食季\t22108\n文杰\t22109\n七子之歌\t22110\nPCF8591\t22111\n乐视Pro3\t22112\n婕妤\t22113\n云艺\t22114\nflesh\t22115\n中国古动物馆\t22116\n夹生\t22117\n尹超\t22118\nlaw\t22119\nfile-loader\t22120\nxt800\t22121\n美国消防\t22122\n失压\t22123\n人力资源三级考试\t22124\nsadness\t22125\n安徽新闻网\t22126\n水浒传天导\t22127\n司法鉴定中心\t22128\n恼火\t22129\nスケ\t22130\nestablishing\t22131\n狱长\t22132\n空类\t22133\n京胡\t22134\ntelstra\t22135\nCurve\t22136\nch341a\t22137\n透光石\t22138\n时空恋旅人\t22139\n广告界\t22140\n防弹咖啡\t22141\n标页码\t22142\n二十名\t2
2143\n梁思浩\t22144\n给付型\t22145\nxcode8\t22146\n教授性\t22147\n卡奈\t22148\nBlocking\t22149\n亿生康\t22150\n势垒\t22151\n郑码\t22152\n语音\t22153\n柳甄\t22154\n云和\t22155\nstm32f407\t22156\n2存档\t22157\n广州地铁6号线\t22158\n新华大街\t22159\n天津女排\t22160\n海底椰\t22161\n人组\t22162\nvdp\t22163\n淘宝论坛\t22164\nfategrandorder\t22165\n安徽财经网\t22166\n梦丹\t22167\nsoo\t22168\n灵岩寺\t22169\n芈姝\t22170\n桑叶茶\t22171\n三副\t22172\n末世鼠疫2\t22173\n河北代表团\t22174\n北京格力空调\t22175\n感叹\t22176\nfounding\t22177\n蒂花之秀\t22178\n操Bxx\t22179\n湖南省公安厅交通警察总队\t22180\n平山镇\t22181\nbuildit\t22182\n卡车\t22183\nKSB\t22184\nSELF\t22185\n单元门\t22186\norigami\t22187\n交际\t22188\n羽翼\t22189\n老保险\t22190\n仲恺农业工程学院\t22191\n170%\t22192\n深圳市第三人民医院\t22193\nMedSci\t22194\n方哥\t22195\n老梁观\t22196\n专汽网\t22197\n中午11点\t22198\n只怕\t22199\n150张\t22200\n田园犬\t22201\n金鼠\t22202\n聊性\t22203\n近来\t22204\n街面\t22205\n学术论文集\t22206\nErnst\t22207\n幻化\t22208\n双子星座\t22209\n自恋狂\t22210\n戴鸿涛\t22211\n傅奎\t22212\n为奴\t22213\n振鼎鸡\t22214\n京剧艺术网网\t22215\n段距\t22216\nAcdsee\t22217\n13306659333\t22218\n雅戈尔动物园\t22219\n胶剂\t22220\n淮阴中学\t22221\n玄\t22222\n钱桥镇\t22223\nTop50\t22224\n穿越时空的爱恋\t22225\n10048\t22226\n义新欧\t22227\nmyc\t22228\n养生机\t22229\nGupta\t22230\n简繁转换\t22231\nmmog\t22232\n巴尼\t22233\n鞍区\t22234\n音调\t22235\n寒冰王座\t22236\nfana\t22237\n溏心风暴\t22238\n强生\t22239\n爱尔克\t22240\n演奏曲\t22241\n2017年7月1日后\t22242\n不失\t22243\nidean\t22244\n尼科\t22245\n私营\t22246\n百度云2017\t22247\n郭刻尔克\t22248\n镇海\t22249\n墨芬\t22250\n本年度\t22251\n一世\t22252\nGOOGLE浏览器\t22253\n苏玲\t22254\n慢慢仙途\t22255\n所为\t22256\nrfq\t22257\n合二为\t22258\n熨衣板\t22259\n黑白卡\t22260\n中密度板\t22261\n南纬\t22262\nngnix\t22263\n海王大厦\t22264\nIntrinsic\t22265\n苏琳\t22266\n从天而降\t22267\n寒雪\t22268\n中新天津\t22269\n梯面\t22270\n超凡蜘蛛侠\t22271\n蜀\t22272\n莫一剑\t22273\n大发快\t22274\nprudential\t22275\n邪恶少女漫画无翼鸟全集日本邪恶漫画与老师H_邪恶漫画无\t22276\nBandicam\t22277\n6.1.4\t22278\n影月谷\t22279\nyumiko\t22280\n原罪2\t22281\n英雄群侠传2吧\t22282\n列岛\t22283\n意动\t22284\nXposed\t22285\n数十个\t22286\n绿公司\t22287\n20160306\t22288\n历史主义\t22289\nivreg2\t22290\n管道过滤器\t22291\ndvajs\t22292\n罗通\t22293\n哈腰\t22294\nbeatx\t22295\
n江菲\t22296\nseem\t22297\n取威\t22298\nFT\t22299\n品牌排名_百度\t22300\n政苑小区\t22301\n支行号\t22302\n天下3吧\t22303\n异声\t22304\nSketchUP\t22305\n起亚K4\t22306\n1800x\t22307\n递\t22308\n手撕鸡\t22309\n万德镇\t22310\nJad\t22311\n而去\t22312\n综上\t22313\n耐克森\t22314\n咽音\t22315\n印件\t22316\n梦中\t22317\n50欧\t22318\n一家子\t22319\n人力资源公司\t22320\n方一勺\t22321\n今冬明春\t22322\n培养基_\t22323\n博越\t22324\n方便\t22325\nThrasher\t22326\n天华\t22327\n民机\t22328\n藏龙\t22329\n栖身\t22330\nDeployment\t22331\n五条\t22332\n越前龙马\t22333\nDraftSight\t22334\n青云梯\t22335\n回报率\t22336\nmp3tag\t22337\n冰血暴\t22338\n匡\t22339\n林彬\t22340\nwww.9928\t22341\n8串\t22342\n梅克\t22343\n地球帝国\t22344\n摩点\t22345\nCaiBoBo\t22346\n声称\t22347\njdbcTemplate\t22348\n防御战\t22349\nh3c交换机\t22350\ncontinuation\t22351\nJDK9\t22352\n韩元\t22353\n死不了\t22354\n小秘密\t22355\nfsc\t22356\nnichicon\t22357\n1200W\t22358\n2018年1月9日\t22359\n郑州方特欢乐世界\t22360\n贵阳市第一中学\t22361\n隆冬\t22362\nRaft\t22363\n铜矿石\t22364\n类似物\t22365\nfl8000\t22366\n航头\t22367\n青岛早报\t22368\nWHILE\t22369\n微分_\t22370\n20170810\t22371\n上仙\t22372\ncad_筑龙网\t22373\n枚举类\t22374\n拓客\t22375\n0031\t22376\nquite\t22377\n刘永生\t22378\n流尽\t22379\n生命科学\t22380\n30000个\t22381\n鞠\t22382\n博士后流动站\t22383\nBrave\t22384\n零和博弈\t22385\n10.11.3\t22386\n流水号\t22387\n头晕目眩\t22388\n种\t22389\n危险化学品安全管理条例\t22390\nsticker\t22391\n小猫\t22392\n双管\t22393\n七大罪第二季\t22394\n秒赞\t22395\n途睿欧论坛_途睿欧\t22396\n鞣花酸\t22397\nyap\t22398\naji\t22399\n小皙\t22400\n600agm\t22401\n楚天都市报_多媒体报\t22402\n0xff\t22403\n白塔山\t22404\n出发\t22405\n当当\t22406\n海淀法院\t22407\navago\t22408\n硅水凝胶\t22409\nglsurfaceview\t22410\nCHEUNG\t22411\n康宁医院\t22412\n120度\t22413\n2000件\t22414\n安抚\t22415\nane\t22416\nimagejpeg\t22417\n1802\t22418\n灵谷寺\t22419\n芳子\t22420\n没有理由\t22421\n韵达电子\t22422\n王小燕\t22423\n租房网\t22424\n球皇\t22425\n文艺\t22426\n1806\t22427\nyuki\t22428\n不看后悔\t22429\n论点\t22430\n升职加薪\t22431\n赵丹\t22432\n0813\t22433\n钧天\t22434\n流化床反应器\t22435\n步态\t22436\nc盘\t22437\nusb启动盘\t22438\nН\t22439\n201个\t22440\n5.10\t22441\n序言\t22442\nBVI公司\t22443\n将士们\t22444\n金灯\t22445\n联通大厦\t22446\nmoisturizing\t22447\n精神病性\t22448
\n动漫社\t22449\nebank\t22450\n同学们\t22451\n手拉壶\t22452\n宝华山国家森林公园\t22453\n影视展\t22454\n走位\t22455\nmax232\t22456\n符腾堡州\t22457\nOOXX\t22458\n三令五申\t22459\ndatastudio\t22460\nFlute\t22461\n十法\t22462\n计划单\t22463\n农民节\t22464\n超声检查\t22465\n一尘不染\t22466\nFantasia\t22467\n赵春华\t22468\nclas\t22469\n滑跃式\t22470\n发行期\t22471\n北京八达岭野生动物园\t22472\nPrecision\t22473\n尚友\t22474\nformax\t22475\n入宅\t22476\n海南职业技术学院\t22477\nsoviet\t22478\n碘化物\t22479\n古城堡\t22480\n222号\t22481\n米尔斯\t22482\n雅江\t22483\niPhone8/\t22484\n龙氏\t22485\n金三优服\t22486\n神经内科\t22487\n安世亚太\t22488\n充电宝\t22489\n林德伯格\t22490\n花美男\t22491\n娜姐\t22492\n两数\t22493\n京人社\t22494\n黑龙滩\t22495\n天财\t22496\n14公里\t22497\n事件源\t22498\nanomaly\t22499\n一大群\t22500\n龙城镇\t22501\n3773\t22502\n堰板\t22503\n創作\t22504\n营运证\t22505\nfield\t22506\n60多度\t22507\n信网\t22508\n居住证登记卡\t22509\n貂皮\t22510\n巴黎卢浮宫\t22511\n俱\t22512\n贴身侍卫\t22513\n航空业\t22514\n金管局\t22515\nSSB\t22516\nmth\t22517\nheroic\t22518\n桃州镇\t22519\nx86汇编语言\t22520\nlenka\t22521\n资治通鉴吧\t22522\n吕子乔\t22523\nHACCP\t22524\nunity3\t22525\n截获\t22526\n犹他爵士\t22527\npapo\t22528\n7300元\t22529\nrdlc\t22530\n住房公积金贷款计算器\t22531\n罗涛\t22532\n测量仪\t22533\n邛海\t22534\n时空观\t22535\nfelonwan\t22536\nlynda\t22537\n净息差\t22538\n征收率/扣除率\t22539\n央视网\t22540\n第5回\t22541\nSPARC\t22542\n正月初十\t22543\n扬子鳄\t22544\n减员增效\t22545\n长春酒店\t22546\n欧丽娟\t22547\nzkt\t22548\n繁衍\t22549\n周超\t22550\n拉伸弹簧\t22551\n新浪财经网\t22552\nSeattle\t22553\n300A\t22554\n钱三强\t22555\nfangjia\t22556\n参习\t22557\nhao123上网导航\t22558\n9099\t22559\n液氨储罐\t22560\nsoom\t22561\n中兴努比亚红魔\t22562\n4558\t22563\n乾坤湾\t22564\n绿眼\t22565\n光通传奇3\t22566\n求种\t22567\n白象街\t22568\n哑\t22569\n名龟\t22570\nappend\t22571\n哈克贝利费恩历险记\t22572\n省际\t22573\nアンダ\t22574\n手机卡\t22575\n亲女\t22576\n云海肴云南菜\t22577\n西安格力空调\t22578\n虽有\t22579\n农商信用卡\t22580\n没把握\t22581\n吐鲁番市\t22582\n灯会元\t22583\n淘米视频\t22584\n行政法学\t22585\n补退\t22586\n中央门\t22587\n唐璐电影网\t22588\n居家养老\t22589\n翼闸\t22590\n蓬莱松\t22591\n乱涂\t22592\nDemonstration\t22593\n彩蝶\t22594\n班徽\t22595\n算王\t22596\n郑州人民医院\t22597\n高尔顿\t22598\n女狼\t22599\n宁波19楼\t22600\n20150629\t22601\n中国
五金商机网\t22602\n森女系\t22603\n射频同轴电缆\t22604\n白练\t22605\n第64号\t22606\nContest\t22607\n东北大酱\t22608\n荞麦粉\t22609\n夏溪\t22610\n福建师范大学协和学院\t22611\n踏古\t22612\nwdr5620\t22613\n帝国隋唐演义\t22614\nrxandroid\t22615\n英雄联盟s7\t22616\n张千一\t22617\n竖脊肌\t22618\n乳头瘤病毒\t22619\n羊毛\t22620\n乐高复仇者联盟\t22621\n幻甲\t22622\n中国庭审公开网\t22623\nKeren\t22624\n陪\t22625\n高校\t22626\n聚氨酯发泡保温钢管\t22627\n老焦\t22628\n红车\t22629\n14z\t22630\n小青\t22631\njzj\t22632\n百济健康商城\t22633\n第七次\t22634\nBobby\t22635\n金马路\t22636\n国家工程技术研究中心\t22637\n&#40\t22638\n厦门小区\t22639\n40亿元\t22640\n绿布\t22641\n生活部\t22642\n羁押\t22643\n可惜\t22644\niccid\t22645\n红本\t22646\n自然环境\t22647\n所\t22648\n悉心\t22649\nsaab\t22650\n0755\t22651\n剑晨\t22652\n缩颈\t22653\n割爱\t22654\n傲梅\t22655\n联名鞋\t22656\n8111\t22657\n大连商品交易所\t22658\n金属史\t22659\nAutoCAD泵\t22660\n行政许可法\t22661\n70%\t22662\ni7-6700K\t22663\n德拉诺\t22664\n价值规律\t22665\n海伦路\t22666\n黄斌汉\t22667\nCamera2\t22668\nDiffusion\t22669\n断管\t22670\n误会\t22671\nVS2010/MFC编程\t22672\nAnalyzers\t22673\n天造\t22674\n肥胖者\t22675\n固体力学\t22676\n烟霞\t22677\n广汕\t22678\n客观\t22679\n801路\t22680\numbrella\t22681\n迁出去\t22682\n庹\t22683\n乙状结肠癌\t22684\n3.0.3贺岁\t22685\n开港\t22686\n下颚骨\t22687\n数码宝贝2\t22688\n中华全国妇女联合会\t22689\nS.163.COM\t22690\nVB编程\t22691\n嫡妻\t22692\nu88连锁加盟网\t22693\n产权\t22694\n策展\t22695\n40km\t22696\n0914\t22697\n陈罡\t22698\n胃烧心\t22699\nTomPDA\t22700\n第62集\t22701\n太空刑警2\t22702\n仓缝\t22703\nB350M\t22704\n荣耀畅玩4X\t22705\n邹平\t22706\n第二\t22707\n宏文\t22708\n生死与共\t22709\n固化片\t22710\nFHR\t22711\n潜移默化\t22712\n五爱市场\t22713\n栖兰\t22714\nscenario\t22715\n败仗\t22716\n5357\t22717\nbootice\t22718\nvivado\t22719\n爱玉网\t22720\nZANADU赞那度\t22721\n串\t22722\n澳洲联邦银行\t22723\n崇祯皇帝\t22724\nprevailing\t22725\n拱手\t22726\n冷殿下\t22727\nmg/dl\t22728\n呼啸\t22729\n极限跳跃\t22730\n陈楠\t22731\nbeautyleg\t22732\nbcc\t22733\n8012\t22734\n贝客公寓\t22735\n德军总部2\t22736\n梭织机\t22737\n拥堵\t22738\n小花生\t22739\n万立方\t22740\n泡良\t22741\n云p\t22742\n脑放\t22743\nmmp\t22744\nBowl\t22745\n屯子\t22746\n张恨水\t22747\n弱化\t22748\n人工势场法\t22749\n飞致250\t22750\n安卓4\t22751\n副站长\t22752\n财综\t22753\n奥巴\t22754\n暗黑破坏神2
魔电2017\t22755\n趣话\t22756\n邪御天娇\t22757\n常规型\t22758\n荣耀公孙离\t22759\n顺泰\t22760\n人造棉\t22761\n3月31号\t22762\n舒雅\t22763\n同济医院\t22764\ndiligence\t22765\n超变态\t22766\n大祥区\t22767\n文哲\t22768\nlar\t22769\n同心致远\t22770\n基点\t22771\n12条\t22772\n7月底\t22773\n4.6亿\t22774\n晒号网\t22775\n拒腐\t22776\n马庄村\t22777\n八佾\t22778\n谁不说俺家乡好\t22779\n卖卵\t22780\n南坛\t22781\n赫斯缇雅\t22782\n毁天灭地\t22783\n外化\t22784\n新天城市广场\t22785\n秀屿\t22786\n暗斑\t22787\n西九龙\t22788\n康巴赫\t22789\n四川广播电视台\t22790\n大败毒胶囊\t22791\n动漫游戏联盟网\t22792\n旬空\t22793\n靳东王凯\t22794\n9014\t22795\n兽腹\t22796\n大家坛\t22797\n抑郁\t22798\n青锋\t22799\n租車\t22800\n发车\t22801\ndomination\t22802\niphone8plus\t22803\n寿县一中\t22804\n有界\t22805\n山东省知识产权局\t22806\n冬冬\t22807\n屠门镇\t22808\n强轩\t22809\n曾剑\t22810\n瘫软\t22811\n2017-11-16\t22812\nE3D\t22813\n政治经济学批判\t22814\n吉百利\t22815\nniffer\t22816\n函授教育\t22817\n上海大学出版社\t22818\n南派\t22819\n中山日报报业集团\t22820\n待见\t22821\nLaura\t22822\nsilence\t22823\nSBL\t22824\n暖情\t22825\n冒险岛标飞\t22826\n凝灰岩\t22827\n罗永浩\t22828\n麻辣小龙虾\t22829\n奇风\t22830\n黑椒汁\t22831\n剑豪生死斗\t22832\n礼序\t22833\n标记法\t22834\n中百\t22835\n豆豆钱\t22836\n唐山路南\t22837\n格林兰岛\t22838\n华硕电脑\t22839\n迪文屏\t22840\n难得\t22841\nsteady\t22842\n2016年清明节\t22843\n威尔希尔\t22844\n斯科拉里\t22845\ne学大\t22846\n虎嗅\t22847\nFur\t22848\ncs2\t22849\n20世纪70年代\t22850\nunders\t22851\n38.8度\t22852\nROLL\t22853\nICMP\t22854\n封开县\t22855\n宋智孝\t22856\n聚乙烯塑料\t22857\n中顾法律网\t22858\n空伐\t22859\n两小时\t22860\n刻线\t22861\n情网站\t22862\nfriendships\t22863\n奉送\t22864\n不养\t22865\n艾梅柏·希尔德\t22866\n5.4米\t22867\n模仿秀\t22868\n广州产权交易所\t22869\n稽首\t22870\n数控车床\t22871\n何舞\t22872\n滤泡\t22873\nVv\t22874\n陈升号\t22875\n西游记2\t22876\n芯轴\t22877\n导语\t22878\nACG\t22879\n优智\t22880\n201709\t22881\n血浪\t22882\n定名\t22883\n第135期\t22884\nAvailable\t22885\n51cto.com\t22886\n8.25\t22887\n火星救援\t22888\nFPD\t22889\n你是我的姐妹\t22890\n舒眠\t22891\n哲学社会科学版)\t22892\n详规\t22893\n启程\t22894\n淮安网\t22895\n港西\t22896\nHGUC\t22897\n秀洲\t22898\n两年\t22899\n5万千瓦\t22900\n红盾网年检查询网\t22901\n常备药\t22902\n大地主\t22903\n第二程\t22904\n倭寇\t22905\n刑场\t22906\n磨\t22907\n吉马\t22908\n河北医科大学第一医院\t22909\nmei\t22910\
n只争朝夕\t22911\nK3Cloud\t22912\nKING\t22913\n万隆证券网\t22914\n三昧真火\t22915\n胡大宝\t22916\nfabs\t22917\n空相\t22918\n36000元\t22919\n图瓦\t22920\n醒脑\t22921\n8070\t22922\nAPI认证\t22923\n一成\t22924\nhd6570\t22925\n粤水电\t22926\nCambridge\t22927\n三百里\t22928\n酷学\t22929\n舒马\t22930\n官腔\t22931\n富士康\t22932\n短式\t22933\ndload\t22934\n名副\t22935\n鲁姆\t22936\nLuis\t22937\n关凌\t22938\nMultivitamin\t22939\n黔江区\t22940\n李金发\t22941\n南京大学金陵学院\t22942\n孙菲\t22943\nReferer\t22944\n联科\t22945\n哈里森\t22946\n不二臣\t22947\n成魔\t22948\n苏州科技城医院\t22949\ntherapy\t22950\n西安地铁2号线\t22951\n固体氧化物燃料电池\t22952\nHong\t22953\n邱立东\t22954\n物理性\t22955\n高频热合机\t22956\n社会活动家\t22957\n团中央书记处\t22958\nAlthough\t22959\ni7-7700K\t22960\n假机\t22961\n在销\t22962\n有线电视\t22963\n开不了口\t22964\n借贷案\t22965\n年支\t22966\npostman测接口\t22967\n社区消防\t22968\n代码量\t22969\n九斤\t22970\nblot\t22971\n九天\t22972\nclass10\t22973\n自由贸易试验区\t22974\n至暗时刻\t22975\n颌下淋巴结肿大\t22976\n语音集\t22977\n松岗中学\t22978\n三门峡市委宣传部\t22979\n不陋\t22980\n竞逐\t22981\n跨过\t22982\n络合剂\t22983\n9.5分\t22984\ncdecl\t22985\n恨天\t22986\n【智\t22987\n林娜冰\t22988\n周冬\t22989\nadopted\t22990\n黑区\t22991\n魔兽世界恶魔猎手\t22992\n第1套\t22993\n192.168.1.100\t22994\n三个院子\t22995\n卷毛\t22996\n颜冬\t22997\n审视\t22998\n神行者2\t22999\n鞋城\t23000\nMultisim\t23001\n定子绕组\t23002\n体会学肃\t23003\n通铺\t23004\n知友们\t23005\nEEWORLD下载中心\t23006\n清涧县人民政府\t23007\nSchematic\t23008\n9家\t23009\nGl\t23010\n崖柏\t23011\n蝙蝠侠:黑暗骑士归来\t23012\n什麼\t23013\n辽宁忠旺集团有限公司\t23014\n20171005\t23015\n8066\t23016\n◇\t23017\ntica\t23018\n焚书坑儒\t23019\noriginlab\t23020\n赤红之瞳\t23021\n齿轮组\t23022\n乱鬼龙\t23023\n根儿\t23024\nmws\t23025\nNinebot\t23026\nSTATE\t23027\n一拉\t23028\n风铃\t23029\n桌面型\t23030\n上半场\t23031\nsgn\t23032\n牡丹江\t23033\n28期\t23034\n2017-08-29\t23035\n7.4.0\t23036\n中国民生投资股份有限公司\t23037\n勇敢者的游戏:决战丛林\t23038\n零元\t23039\ndaddies\t23040\n鸠占鹊巢\t23041\n6.5万\t23042\nhing\t23043\n5万公里\t23044\n拉门\t23045\nOPPOA79\t23046\n常务委员会\t23047\n徐佳丽\t23048\n细品\t23049\n魔瓶\t23050\n1.2.13\t23051\n浦沅\t23052\nGIS\t23053\niMindMap\t23054\n绸布\t23055\n912路\t23056\nCnc\t23057\n迪米特\t23058\n20180424\t23059\ndevenv\
t23060\n软灯\t23061\nyo\t23062\nbootstrapvalidator\t23063\nDevExpress中文网\t23064\n飞马网\t23065\nCasio卡西欧\t23066\n48年\t23067\n色狼\t23068\n比达\t23069\n老司机带带我\t23070\n大连交通大学\t23071\n风琴\t23072\nOB\t23073\nNZXT\t23074\n无国界\t23075\nVitality\t23076\n手印\t23077\n三层半\t23078\necharts雷达图\t23079\nGTE\t23080\n欠压\t23081\ncri\t23082\n仙路至尊\t23083\n米尼\t23084\n6060\t23085\n武汉国博新城\t23086\n交叉熵\t23087\n搬家车\t23088\n1.x86\t23089\nim2col\t23090\nlm393\t23091\n乌合之众\t23092\nExtraction\t23093\n杨岳\t23094\n沙漠化\t23095\nMavic\t23096\n杨翔\t23097\n反转\t23098\nTris\t23099\n小米加密兔\t23100\n如痴如\t23101\n苏宁易\t23102\n顶固\t23103\n王弼\t23104\nfuzor\t23105\n范悦\t23106\n4222\t23107\n九台\t23108\n三生花\t23109\n三冬暖\t23110\neV\t23111\n总和\t23112\n氨基葡萄糖\t23113\n线程间通信\t23114\n橡胶油\t23115\njidaxue\t23116\nザ\t23117\n5300\t23118\n天山镇\t23119\n老小\t23120\n经币\t23121\n200多个\t23122\n阴影\t23123\n秦晓第六感生死缘\t23124\npaw\t23125\n十三\t23126\n欧姓\t23127\n五老村小学\t23128\n被扑灭\t23129\n取和\t23130\n尖尾\t23131\nCONFIG\t23132\n好奇害死猫\t23133\n玉珠铉\t23134\n沪教版\t23135\n约克郡北区\t23136\n严纯华\t23137\nHOME家族崩坏\t23138\n所选\t23139\n插卡音箱\t23140\n橘芹\t23141\n淘女郎\t23142\n窦唯\t23143\nctrl\t23144\n混混\t23145\n未见过\t23146\n不中断\t23147\n龙币\t23148\nerrorCode\t23149\n去找\t23150\n陈数\t23151\npythin\t23152\ntas\t23153\n平叉\t23154\n七八十\t23155\n访校\t23156\n华途\t23157\n布荷载\t23158\n津滨\t23159\n赛可瑞\t23160\nepsxe\t23161\ndilation\t23162\nJqGrid\t23163\n桂林航天工业学院\t23164\n电热炉\t23165\n归路\t23166\n百善\t23167\nwriteln\t23168\n汽车零部件有限公司\t23169\nricky\t23170\n金油\t23171\npatrol\t23172\n东方娃娃\t23173\n永康日报\t23174\nspurious\t23175\n刘晶晶\t23176\nPScc\t23177\nv1.4.1\t23178\n2017年5月15日\t23179\n十天内\t23180\nCen\t23181\n2251\t23182\n世联行\t23183\n试纸\t23184\n徐百卉\t23185\n5H\t23186\n深圳市人民政府口岸办公室\t23187\n大芒果GM\t23188\n甘霖\t23189\n拉达尼瓦\t23190\n岛台\t23191\n人教版小学六年级语文下册\t23192\n取出\t23193\n蓝脸\t23194\n绿色工厂\t23195\n汝湖镇\t23196\n百度凤巢\t23197\n剩男\t23198\n联通4g\t23199\n移动手机号\t23200\n黑客帝国3\t23201\n肆意妄为\t23202\n熊祁\t23203\n北京丰台医院\t23204\n瓯江口新区\t23205\n中国基金业协会\t23206\nfuss\t23207\n张总\t23208\n拾\t23209\n六本木\t23210\n\u001f\t23211\n侗族大歌\t23212\n秀山\t23213\
nstat\t23214\n增肌期\t23215\nTeamTalk\t23216\nceva\t23217\n佳期\t23218\n信使\t23219\nReportNG\t23220\n全国社保基金\t23221\n边牧\t23222\n深林\t23223\nmaldi\t23224\n嘉里大通\t23225\n乔·吉拉德\t23226\n不当得\t23227\n桃园市\t23228\n嘉肴\t23229\n光师\t23230\n什么子\t23231\n上海保险\t23232\n猪肝泥\t23233\n短情\t23234\n爱永\t23235\n强师\t23236\n明细账\t23237\n清远清城\t23238\n跨职能流程图\t23239\n一口价\t23240\n石川\t23241\nmat\t23242\n本分\t23243\nJAV\t23244\n专技岗\t23245\n说写\t23246\n克州人民政府\t23247\n周松\t23248\n傅念琛\t23249\n63路\t23250\n旭旭\t23251\nApowersoft\t23252\n新开元大酒店\t23253\n让位\t23254\nkuro\t23255\n邓海光\t23256\n巴琴\t23257\n捆缚\t23258\n格莱魅\t23259\n猪肚鸡\t23260\n雅俗\t23261\n毎\t23262\nd3100\t23263\n举架\t23264\nfps值\t23265\nMargiela\t23266\n中航物业\t23267\n詹娜\t23268\n西北湖\t23269\n赛为智能\t23270\n放权\t23271\n万村千乡农民篮球赛\t23272\nhyt\t23273\n邪火\t23274\n范锐平\t23275\n后劲\t23276\nLBC\t23277\n合肥市地方税务局\t23278\n混合所有制\t23279\n384号\t23280\n北京大学国际关系学院\t23281\n贺子珍\t23282\n盛大金禧\t23283\n项目路演\t23284\n电动卷帘门\t23285\n个旧市\t23286\n正侧\t23287\n3DS中文网\t23288\n50m\t23289\njsm\t23290\nfacets\t23291\n苏大附中\t23292\n柴油版\t23293\n花柄\t23294\n纳兰明珠\t23295\n转速表\t23296\n葛荟婕\t23297\n资料页\t23298\n三大件\t23299\nvvv\t23300\n柏泉\t23301\n燧人氏\t23302\n上海市中心\t23303\n简支\t23304\n剪板机\t23305\n20171004\t23306\n弯曲度\t23307\n汉能控股集团\t23308\n西和\t23309\n长草颜\t23310\n38集\t23311\nBritain\t23312\nemac\t23313\n50MB\t23314\n今夜天使降临\t23315\n山西传媒学院\t23316\n氢氧化镍\t23317\n推展\t23318\n黄段\t23319\n41万\t23320\n宁波大榭开发区\t23321\n一汽丰田奕泽\t23322\nMacbookAir\t23323\n逐行\t23324\n5600k\t23325\n高平镇\t23326\nexp07\t23327\n亻\t23328\nng\t23329\n怡园\t23330\n众疾\t23331\n共和国住房和城乡建设部\t23332\n操刀\t23333\n李书文\t23334\n高介\t23335\n长安欧尚论坛\t23336\n网易藏宝阁\t23337\n煎鱼\t23338\npubkey\t23339\n非秀\t23340\n冷季型\t23341\n飞尘\t23342\n酸豆角\t23343\n软欧包\t23344\n小娃\t23345\nacc\t23346\n星战\t23347\n怪医黑杰克\t23348\n战乱\t23349\nryu\t23350\n萆薢\t23351\n内伤\t23352\n长河落日圆\t23353\nkids\t23354\nwindows2008\t23355\n米老鼠和唐老鸭\t23356\n宝骏360\t23357\n笼中鸟\t23358\nmm\t23359\n花明\t23360\npublicity\t23361\n胖胖\t23362\nTOP147台\t23363\n蓝光林肯公园\t23364\nc6100\t23365\nwinsows\t23366\n北京道\t23367\n北外附属外国语学校\t23368\n郭店\t233
69\nAD|DXP\t23370\nqq3\t23371\n并行算法\t23372\n旋锁\t23373\n1.8MT\t23374\n水磨沟区\t23375\n扩香机\t23376\n接线子\t23377\n未就绪\t23378\n名品\t23379\n快滤池\t23380\n焦涛\t23381\n巴拿马运河\t23382\n永恒之蓝\t23383\n金Samuel\t23384\n第七话\t23385\nflatpickr\t23386\nServo\t23387\n济水\t23388\nAndroid7.0\t23389\n谋私\t23390\n马王洞\t23391\n数据链路层\t23392\nLoyalty\t23393\n鲁达\t23394\ndxg\t23395\n龙珠超\t23396\nr2014a\t23397\n外汇赠金网\t23398\n王者荣耀白\t23399\n三问\t23400\n闸门\t23401\n新都花园\t23402\ntvn\t23403\n荣耀7X\t23404\n布甲\t23405\n华冶\t23406\n天津站\t23407\n立绘\t23408\nRAKsmart\t23409\n出师不利\t23410\n百度影音-2828电影网\t23411\ncord\t23412\n金兜洞\t23413\n二之国2:亡魂之国\t23414\n华东勘测设计研究院\t23415\n地狱篇\t23416\n遵义机场\t23417\n谭维维\t23418\n125平\t23419\n迅雷影音\t23420\n不得志\t23421\nsave\t23422\nfeitian\t23423\nAr\t23424\n管理段\t23425\ndevkit\t23426\n先遣服\t23427\n盗抢\t23428\n天工\t23429\n第13号\t23430\n许晴\t23431\n20140328\t23432\n本学期\t23433\n红曲\t23434\n鸠\t23435\n21部\t23436\n犬科\t23437\n慕辰\t23438\nfrancis\t23439\nguang\t23440\n建筑英才网\t23441\n邦邦\t23442\n杭州市区\t23443\n2月28日\t23444\n塞下曲\t23445\n洪宗玄\t23446\nICG\t23447\nPHP168\t23448\n李政宰\t23449\n1933老场坊\t23450\n醉\t23451\n微信扫一扫\t23452\ngansu\t23453\nSummit\t23454\n竖叉\t23455\n公共管理学院\t23456\n礼仪操\t23457\nPSK\t23458\nZIP格式\t23459\n大冶北\t23460\n钟姓\t23461\nMaxMara\t23462\n159915\t23463\n手动变速器\t23464\nKb\t23465\nsuggestion\t23466\n环城生态区\t23467\nwo99.com\t23468\n天吉\t23469\n长型\t23470\n寒塘\t23471\n广惠\t23472\n乍一看\t23473\n救转\t23474\n广东政府\t23475\n正北\t23476\n妖男\t23477\n情谊\t23478\n孙莹\t23479\n甾醇\t23480\n卡惠\t23481\n野火\t23482\n录页\t23483\njtag\t23484\n乐比\t23485\n汇添富基金管理股份有限公司\t23486\n标书\t23487\n大耳\t23488\n反求\t23489\n播播\t23490\n骁龙650\t23491\nstutio\t23492\n防腐螺旋钢管\t23493\nExcel-CSDN\t23494\n澳式\t23495\n散户\t23496\n牛奶蛋白过敏\t23497\n1亿个\t23498\n凤凰健康\t23499\n竹材\t23500\n南开\t23501\n白宫\t23502\nsupernatural\t23503\n管办\t23504\n王凌云\t23505\nLancer\t23506\nMI6\t23507\n茱蒂\t23508\n洗衣袋\t23509\n中国正在说\t23510\n宁德市医院\t23511\n苹果皮\t23512\n沃尼尔\t23513\n无刷控制器\t23514\n管彤\t23515\n陈鹏鹏\t23516\n新世界百货店\t23517\n重新认识\t23518\n专案\t23519\n不织布\t23520\n星野亚希\t23521\n观察家\t23522\n2018种\t23523\nec\
t23524\n弄伤\t23525\n关上\t23526\nt110\t23527\n彼得罗夫\t23528\n黄龄\t23529\nwust\t23530\nIVR\t23531\n红警3\t23532\n马洛斯\t23533\n赵季平\t23534\n莫怪\t23535\nktv\t23536\nVLOOKUP函数\t23537\n002050\t23538\nWeekend\t23539\n广益中学\t23540\n谷类\t23541\nADJ\t23542\neconomies\t23543\n气孔\t23544\n水芙蓉\t23545\n徐汇滨江\t23546\nbaodian\t23547\n经典性\t23548\n气焰\t23549\n煌上煌\t23550\nruntimel1-1-0.dll\t23551\ntrac\t23552\n平潮镇\t23553\nams车评网\t23554\n傅聪\t23555\nccdi\t23556\n1201\t23557\n衡欣\t23558\nute\t23559\n养殖巴巴网\t23560\n巴黎政治学院\t23561\n不要说\t23562\n上海华育中学\t23563\n海尔社区\t23564\n相扑手\t23565\n说一不二\t23566\nHaul\t23567\n简字体\t23568\n阜石路\t23569\n春秋旅游网\t23570\n查理·芒格\t23571\n工资计算器\t23572\n穿越兽世\t23573\n徐玉玉\t23574\n商住两用\t23575\n阿吉豆\t23576\n)信息科技有限公司\t23577\nstopper\t23578\n毒流\t23579\n高度图\t23580\n胡林翼\t23581\n羊城创意产业园\t23582\n魅蓝\t23583\n土坡\t23584\n4g网络\t23585\n茶袋\t23586\n欲哭\t23587\n亲爱的她们\t23588\n起火\t23589\n安森美\t23590\n宁镇\t23591\nr11t\t23592\n老鼠爱大米\t23593\n2多\t23594\nIMECH-IR\t23595\n上海保监局\t23596\nDestroy\t23597\n4G飞享\t23598\nDnsmasq\t23599\n2288h\t23600\n扫盲贴\t23601\n一加手机5\t23602\nPerfectly\t23603\n头针\t23604\n平安万家\t23605\n挑担\t23606\nmx4pro\t23607\n光刃\t23608\n1%\t23609\n藏学\t23610\n托卡塔\t23611\naconda\t23612\nTreant\t23613\n终凝\t23614\n杏花雨\t23615\n300003\t23616\n痞子英雄\t23617\n耳穴\t23618\npipeline\t23619\n嫩白\t23620\nfifa18吧\t23621\n第二审\t23622\n外转子电机\t23623\naaronGao\t23624\n胱氨酸\t23625\n谢辞\t23626\nvase\t23627\n拓陆者皮卡\t23628\n地卜师\t23629\n工匠精神\t23630\n俘虏\t23631\n行政处罚裁量权\t23632\n纸管\t23633\n5尺\t23634\n图书馆\t23635\n张爱国\t23636\n電機\t23637\n雷迪\t23638\nTuigirl\t23639\n履历书\t23640\n艾泽\t23641\nmodeler\t23642\n第82期\t23643\n第A08\t23644\n浸淫\t23645\n中耳\t23646\n毛杰\t23647\n脱逃\t23648\n固定术\t23649\n华图公务员考试网\t23650\n鼓鼓\t23651\n51JOB\t23652\n600个\t23653\n大事纪\t23654\n2112\t23655\n社交网络\t23656\n三章\t23657\n刀剑如梦\t23658\n太烂\t23659\n滚铁\t23660\n萨鲁曼\t23661\n疑难杂症\t23662\n孙新阳\t23663\n第一季1\t23664\ndacom\t23665\n无匹\t23666\n六角网\t23667\nMenuStrip\t23668\n热板\t23669\n笑谈\t23670\n僵尸先生\t23671\n鲁邦三世吧\t23672\n燕尾蝶\t23673\n深圳市龙华区人民医院\t23674\ndiscuz\t23675\n300多款\t23676\nqtopia\t23677\
ntos\t23678\n椰海大道\t23679\n弄破\t23680\ncrosstalk\t23681\n阿里巴巴诚信通\t23682\nClaw\t23683\nTL-WR885N\t23684\n石潭镇\t23685\n南京金陵中学\t23686\nmultiplayer\t23687\n谈房论市-亿房论坛\t23688\n泪花\t23689\nPersonnel\t23690\nwinrt\t23691\n月悦\t23692\n刑架\t23693\n饶平\t23694\nauctions\t23695\n000198\t23696\n002427\t23697\nK-Means聚类算法\t23698\n咪咕阅读\t23699\n这么点\t23700\n骨针\t23701\n泰格斯\t23702\n窗机\t23703\n诸子百家\t23704\n后卫\t23705\n小混混\t23706\n中厂\t23707\n采茶舞曲\t23708\nGta5\t23709\n院中\t23710\n郑州大学第三附属医院\t23711\n梦醒时\t23712\nDOOR\t23713\namo\t23714\n褔\t23715\n避雷带\t23716\npostion\t23717\n宝鸡市人力资源和社会保障局\t23718\n考前\t23719\n岐阜县\t23720\nWord联盟\t23721\n卫生部办公厅\t23722\n只求\t23723\n激光打标机\t23724\n炉桥\t23725\n合龙\t23726\n黑蓝\t23727\nStability\t23728\n6065\t23729\n卷烟机\t23730\n9500\t23731\n利物浦队\t23732\n分司\t23733\n不成\t23734\n孟山都公司\t23735\n云中鹤花语\t23736\n_吾谷网\t23737\nOkinawa\t23738\n阿原\t23739\n龙田镇\t23740\n婚意\t23741\n黄山市\t23742\n7260\t23743\n兵哥哥\t23744\n奈法利安\t23745\n横线处\t23746\n纽约大学\t23747\n5.7.14\t23748\n二月河\t23749\n华金在线\t23750\n移动版\t23751\n张二嫂\t23752\n专转本考试\t23753\nペット\t23754\n折耳\t23755\narray_keys\t23756\nfallback\t23757\nzo\t23758\n使命召唤11:高级战争\t23759\n稻垣吾郎\t23760\nAURA\t23761\nscrollTo\t23762\n红房子\t23763\nveterinary\t23764\n天山五村\t23765\n申荷永\t23766\n82451698\t23767\n聚苯醚\t23768\nJackpot\t23769\nReference\t23770\nassumed\t23771\n勇者斗恶龙X\t23772\n调音师\t23773\n饿货\t23774\nJolimark\t23775\n限报\t23776\n护封\t23777\n谏山创\t23778\n六零年\t23779\n查询构造器\t23780\n双氯芬酸钠滴眼液\t23781\n储干\t23782\n花竹\t23783\n孙哥\t23784\n遭拒\t23785\n盥\t23786\n双兴\t23787\n吲哚美辛栓\t23788\n博爱\t23789\n雅萌\t23790\n星际老男孩\t23791\n达科塔·约翰逊\t23792\n宽温\t23793\nwin1032位\t23794\n在身\t23795\n纯朴\t23796\nVray渲染器\t23797\n天津公交查询网\t23798\n马德里德比\t23799\n百余个\t23800\n伊洛纳\t23801\n昕昕\t23802\n国家市场监督管理总局_国家质量监督检验检疫总局\t23803\n广东省立中山图书馆\t23804\n吉他独奏\t23805\n国资委党委\t23806\n幼生\t23807\n扉页\t23808\n重头戏\t23809\n百般\t23810\n绕城公路\t23811\n2010-2016年\t23812\nsql查询分析器\t23813\n凯润婷\t23814\n卫浴间\t23815\n14.7\t23816\n柏夫人\t23817\n车展\t23818\nME个性网\t23819\n多角虫\t23820\n修改类\t23821\n背脊\t23822\n惊扰\t23823\n220亿\t23824\n永不落\t23825\nsupplie
rs\t23826\n掀衣\t23827\n三年内\t23828\n印巴战争\t23829\n南昌大学研究生院\t23830\nEMI滤波器\t23831\n安博\t23832\n速发\t23833\nRisa\t23834\n杭州地铁10号线\t23835\n新知语文网\t23836\nCCFL\t23837\n生花生米\t23838\n荠\t23839\n2018年4月30号\t23840\n庵东\t23841\n规划学\t23842\n罗森便利店\t23843\n金庸网\t23844\n二学一\t23845\n16.4\t23846\n梦想三国\t23847\n杀无赦\t23848\nOpenCV\t23849\nresultType\t23850\n20170826\t23851\n叽\t23852\n18个小时\t23853\n球球网\t23854\n刺杀骑士团长\t23855\n美容美发\t23856\n九州大学\t23857\n色迷\t23858\n贵宾\t23859\n趋于\t23860\n历峰集团\t23861\nBTCC\t23862\n陈骏\t23863\n光栅化\t23864\n什么用\t23865\n伊莱\t23866\nDN40\t23867\n阴离子交换树脂\t23868\n汉得\t23869\n毕业册\t23870\n谢幕\t23871\nse846\t23872\n乌拉圭\t23873\n葵花\t23874\nAndroid浏览器\t23875\n远达\t23876\n鲢鱼\t23877\n佳能ip7280\t23878\n张云龙\t23879\nAccording\t23880\n罗庄区\t23881\n黑种人\t23882\n37亿美元\t23883\n续订\t23884\n飞墨\t23885\n选文\t23886\n金吉鸟健身\t23887\n玩不到\t23888\n教职工代表大会\t23889\n大吵大闹\t23890\n奥迪rs3\t23891\nVBIOS\t23892\nstm32f7\t23893\n雪纱\t23894\n9700\t23895\n宝拉\t23896\nbitstream\t23897\n万圆\t23898\n资产报酬率\t23899\n百尺\t23900\nMultilingual\t23901\n网络安全概念股\t23902\n大头虾\t23903\n共价键\t23904\n晕车贴\t23905\nRexha\t23906\n考察团\t23907\n吴竹\t23908\n阳坡\t23909\n洋媳妇\t23910\n1748\t23911\n幸运符\t23912\n透平膨胀机\t23913\n君命\t23914\ncflags\t23915\n贮备\t23916\n止渴\t23917\nodc\t23918\n这回事\t23919\n关淑怡\t23920\n刮布\t23921\n利通\t23922\nwasp\t23923\npow\t23924\n地契\t23925\n闺蜜网\t23926\n焦糖\t23927\n崔开潮\t23928\n读唐诗\t23929\n酵素浴\t23930\n练习四\t23931\n养和\t23932\n众众\t23933\n侠盗联盟组\t23934\npaco\t23935\n上海大众汽车有限公司\t23936\nAnglia\t23937\nwnzcj\t23938\n孤子\t23939\n氧化镁\t23940\n海魂\t23941\n刑克\t23942\nFinTech\t23943\n钢木\t23944\n错事\t23945\n中旅星旅网\t23946\n王益民\t23947\nCovenant\t23948\n摇\t23949\n平率\t23950\n准分子激光手术\t23951\n天码营\t23952\n艾奇\t23953\n1组\t23954\naerosol\t23955\n苏州高新区人民医院\t23956\nBetta\t23957\nexr\t23958\n83_\t23959\n流程框\t23960\n苏铁\t23961\nmonolithic\t23962\n左手无名指\t23963\n现身\t23964\nNubia\t23965\n李炎恢\t23966\n880元\t23967\n集报\t23968\n1.5\t23969\n无翼鸟\t23970\nsleepy\t23971\n国际货代公司\t23972\nIT时代网\t23973\n一一个\t23974\n昆百大A\t23975\n地网\t23976\n熔岩\t23977\n节能器\t23978\n川教版\t23979\nWebservice\t23
980\n封印\t23981\n单幅\t23982\nRanch\t23983\n森永\t23984\n后排座\t23985\nms-dos\t23986\n团拜会\t23987\n农林渔牧\t23988\nHarper\t23989\n许晋亨\t23990\n拉格纳\t23991\n闲置\t23992\n贾马尔\t23993\n山东大学文学院\t23994\n氟化氢铵\t23995\n折扣码\t23996\n1373\t23997\n生死狙\t23998\n拳击场\t23999\n山东省国家税务局\t24000\n3次方\t24001\n安徽省农村信用社联合社\t24002\n17173.com\t24003\n浦发运通白金卡\t24004\n诏\t24005\n迈图\t24006\n百得胜衣柜\t24007\nPF\t24008\n加达里\t24009\n天麻片\t24010\n纹彩\t24011\n苏梓玲\t24012\n地球站\t24013\n璀璨\t24014\n抵减\t24015\nShaving\t24016\n2091\t24017\n公差值\t24018\n角位\t24019\n西郡\t24020\nIT网\t24021\n大连地区\t24022\n帛画\t24023\n重生之神级败家子\t24024\n潍坊网\t24025\n九尾狐\t24026\n链状\t24027\n电动遮阳帘\t24028\nASO114\t24029\n卡雅·斯考达里奥\t24030\n奥古斯丁\t24031\nh1s\t24032\n华为终端有限公司\t24033\nmiumiu\t24034\n【度盘\t24035\n坑神\t24036\n怒斥\t24037\n勾搭\t24038\n若隐若现\t24039\n克里斯托弗\t24040\n王小红\t24041\n线程组\t24042\npio\t24043\n挂课\t24044\n甸\t24045\n甲状腺髓样癌\t24046\n得利捷\t24047\n大通\t24048\n启航\t24049\n阳明山庄\t24050\n只用\t24051\n全神贯注\t24052\nicf\t24053\n坡式\t24054\n纸企\t24055\n冰窖\t24056\n300千瓦\t24057\n木鱼微\t24058\ncorel\t24059\ntwin\t24060\n张岚\t24061\n无轨胶轮车\t24062\ncounselor\t24063\n大渔铁板烧\t24064\n锁相环\t24065\n中粮糖业\t24066\n断货\t24067\n安医大一附院\t24068\n刘莉\t24069\nSHAPE\t24070\nthiscall\t24071\n开心斗\t24072\n个\t24073\n华北电力\t24074\n匡威高帮\t24075\n油膜\t24076\nCitations\t24077\n梁托\t24078\n邢台日报社\t24079\ntifa\t24080\n维信金科\t24081\nparentid\t24082\nKERNEL\t24083\n全人类\t24084\n[火影\t24085\n饱和土\t24086\n第3课\t24087\n呼图壁\t24088\n6kw\t24089\n户籍地址\t24090\n華網\t24091\n本群\t24092\n交管12123全国违章查询|12123预约考试|12123车管所】-交管12123\t24093\n特效\t24094\n诚志股份\t24095\n陈桥镇\t24096\n油温\t24097\n硝烟\t24098\n大源村\t24099\n小冉\t24100\n遮\t24101\noverflow\t24102\n注射器\t24103\n小波变换\t24104\nStata\t24105\n货币互换\t24106\n亚布力滑雪场\t24107\n义词\t24108\n小米浏览器\t24109\n源站\t24110\n汽轮\t24111\n德皇\t24112\n厘米\t24113\n李志鹏\t24114\nwright\t24115\n王贝利\t24116\niga肾病\t24117\n袁文杰\t24118\n密苏里号战列舰\t24119\n开州\t24120\n调压器\t24121\nARROW\t24122\n230斤\t24123\n汪志诚\t24124\n宁波海洋世界\t24125\n曲马多\t24126\n云长\t24127\n碧昂王志东\t24128\n米林宏昌\t24129\nmaga\t24130\n栓子\t24131\n姑娘\t24132\njimmy\t24133\n乌兰察布盟\t2413
4\n过敏过量\t24135\n东长安街\t24136\nexpandablelistview\t24137\n二八\t24138\n四川九洲\t24139\n周身\t24140\n栗浩洋\t24141\n1肖\t24142\n性学\t24143\n枫丹\t24144\n影怪\t24145\n江铃顺达\t24146\nneverfull\t24147\n铝硅\t24148\n6373\t24149\n半边莲\t24150\ncodon\t24151\ndocumentary\t24152\n刘仁\t24153\n鹿死谁手\t24154\n广深高铁\t24155\n天津开发区\t24156\n素珍\t24157\n辞职书\t24158\n生元\t24159\n夭夭\t24160\n威视\t24161\nPeng\t24162\n胡军\t24163\n角动量守恒定律\t24164\n大唐双龙传\t24165\n泉州信息工程学院\t24166\n中国驻美国大使馆\t24167\n戴上\t24168\n彩光\t24169\nbridges\t24170\n赶尸艳\t24171\n开哥\t24172\nKontakt\t24173\nkaji\t24174\n包覆机\t24175\nTrail\t24176\n海军\t24177\n可支配收入\t24178\n深圳地铁9号线\t24179\n4.4.9\t24180\n同法\t24181\n吴少华\t24182\n19.2\t24183\n高诚\t24184\n九七年\t24185\n各不相同\t24186\n2016一级\t24187\n十年之内\t24188\n蛋白虫\t24189\n东汇\t24190\n刘沁\t24191\n运动耳机\t24192\n试验型\t24193\n7556\t24194\n崔雅涵\t24195\n提姆\t24196\n18r\t24197\n法兰特\t24198\n聚氨酯泡沫填缝剂\t24199\nH3C认证\t24200\n千草\t24201\n十二项\t24202\n洪都大道\t24203\n固定夹\t24204\nVMwareTools\t24205\n1024x768\t24206\n环函\t24207\n穷查理宝典\t24208\n陕西省建设厅\t24209\n1400万\t24210\n刀速\t24211\n撸狠狠\t24212\n滨河小区\t24213\n管中闵\t24214\n科技路西口\t24215\n马山县\t24216\n兰亭集序\t24217\n时不时\t24218\n环境污染物\t24219\n瑞诺\t24220\n2.40\t24221\n中山五路\t24222\n糊精\t24223\n2.5D\t24224\n早生\t24225\n撒加\t24226\n炼胶\t24227\n盐卤\t24228\nspecific\t24229\nExtella\t24230\nHMA\t24231\n安e+\t24232\n沈无衣\t24233\n尾差\t24234\n无锡市公共资源交易中心\t24235\nGabber\t24236\n水字\t24237\n龙骏\t24238\n18456\t24239\n电源模块\t24240\n形象管理\t24241\n第九大\t24242\ngem5\t24243\n13张\t24244\n马天宇\t24245\n替考\t24246\n晚九点\t24247\nisotropic\t24248\n枝子\t24249\n无锡幼升小\t24250\n独乐乐不如众乐乐\t24251\n拿破仑全面战争\t24252\n肥伦秀\t24253\n时间性\t24254\n附打\t24255\n同素\t24256\n高寿\t24257\n市镇\t24258\n谩骂\t24259\n哈里波特\t24260\n好压\t24261\n酉阳杂俎\t24262\n腰围\t24263\n倍加\t24264\n铱金\t24265\n冰雪大世界\t24266\n方药\t24267\n振动力学\t24268\n阿勒泰市\t24269\n体恤衫\t24270\n发行方\t24271\n_草民电影网\t24272\n无锡市住房公积金管理中心\t24273\n卡丁\t24274\n全知网\t24275\ncpl\t24276\n泰安市教育局\t24277\n微变\t24278\n中国中医科学院\t24279\n主戏\t24280\n秒针\t24281\nxiyu\t24282\n站前街\t24283\n穿路\t24284\n天使之眼\t24285\n海西州\t24286\nbithumb\t24287\nvce\t24288\n西江在线网\
t24289\n2013a\t24290\n机法\t24291\n河源高新区\t24292\nCLOT\t24293\n淮安市淮安区人民政府\t24294\n斐\t24295\nsrvany\t24296\n管户\t24297\n李元吉\t24298\n灯泡\t24299\n195个\t24300\n换流变压器\t24301\n玛丽莲·曼森\t24302\n唐狮\t24303\n嘉庆\t24304\n夏威夷大岛\t24305\nXIUMI\t24306\n10t\t24307\n690元\t24308\n班赛\t24309\n心爱的女人\t24310\n考法\t24311\n李谦\t24312\n威腾\t24313\n钱颖一\t24314\n信托投资公司\t24315\n临沂物流公司\t24316\n不辍\t24317\n导电玻璃\t24318\n耐磨地坪漆\t24319\nposition\t24320\n上篮\t24321\n粉果\t24322\n连乘\t24323\nRedmine中文网\t24324\n尼康D7100\t24325\n大榜\t24326\n记账联\t24327\nKeepalive\t24328\nSocketServer\t24329\n混改\t24330\n王安琪\t24331\nserver2008\t24332\n市监察局\t24333\n第[\t24334\n寰图\t24335\n联号\t24336\n泡沫状\t24337\n黑杰克\t24338\n高效氯氟氰菊酯\t24339\n上海数据中心\t24340\nspot\t24341\n北京现代索纳塔\t24342\n华盛顿州\t24343\n饱和脂肪\t24344\nTCGA\t24345\n必维国际检验集团\t24346\n到来\t24347\n资生堂集团\t24348\n弘康人寿\t24349\n耶米玛\t24350\n黑牡丹\t24351\nUX\t24352\n低氧\t24353\n崩拳\t24354\n御木本\t24355\n三皈依\t24356\n转用\t24357\nduplex\t24358\n100多年前\t24359\n&#16\t24360\n祖玛龙\t24361\n飞信\t24362\n石油工业出版社\t24363\n尚智\t24364\n99﹪\t24365\n72厘米\t24366\n追梦子\t24367\n小牛n1\t24368\n数字德育网\t24369\n异地恋\t24370\n抵税\t24371\n肺阴虚\t24372\n沈海\t24373\n第66期\t24374\n妙资\t24375\n大地影院\t24376\n人长\t24377\n304路\t24378\n学而思英语\t24379\n私募股权母基金\t24380\n缚\t24381\n医罪\t24382\nGraphpad\t24383\n国令\t24384\n小胡子哥\t24385\n自然辨证法\t24386\n三月的狮子\t24387\n花翎\t24388\n605\t24389\n大红花\t24390\n闺战\t24391\nshtml\t24392\n百分之20\t24393\n鲁直\t24394\n第二十三次\t24395\n荣耀x1\t24396\n第114章\t24397\n外接圆\t24398\n手球\t24399\n125家\t24400\n河北省定州中学\t24401\n罗波\t24402\n国网内蒙古东部电力有限公司\t24403\nmihiro\t24404\ngltools\t24405\n成都地铁2号线\t24406\n金玉王朝\t24407\n热销\t24408\n派遣单\t24409\n八点钟\t24410\n2万步\t24411\n怀空\t24412\n鲤鱼门\t24413\nbangs\t24414\n乐器展\t24415\n周成\t24416\n广州法院\t24417\n学日语\t24418\n互联版\t24419\n227g\t24420\n苏湖\t24421\n台钻\t24422\n急产\t24423\n中国民族管弦乐学会\t24424\nwow7.3\t24425\n化访问\t24426\n悠然自得\t24427\n地刺\t24428\n2025年前\t24429\n鼠女\t24430\n本份\t24431\n盘式电机\t24432\n全椒县人民政府\t24433\n沪江韩语\t24434\n性女传奇\t24435\n橘皮\t24436\n千里迢迢\t24437\nprf\t24438\n赤磷\t24439\nCHAPTER\t24440\n会标\t24441\n北京电力\t24442\ntssop\t24
443\n易居\t24444\n触摸查询机\t24445\n湖工\t24446\ndrawImage\t24447\nhanzi\t24448\n秋兴八首\t24449\n23.9\t24450\n延陵\t24451\nmost\t24452\n使\t24453\n林蛙\t24454\n166集\t24455\n狰\t24456\n血蜜酒\t24457\n临沂人民医院\t24458\n相关性分析\t24459\n溪口镇\t24460\ndnf51\t24461\n边字\t24462\nF28335\t24463\n医林\t24464\n记号\t24465\n复地悦城\t24466\n牌楼\t24467\nProcessor\t24468\n汽滤\t24469\n昆明世博园\t24470\nabort\t24471\n新站镇\t24472\n走婚\t24473\nlvds\t24474\n冰种翡翠\t24475\nlegsjapan\t24476\n很不幸\t24477\n鬼谋\t24478\narkit\t24479\n上海立达职业技术学院\t24480\n电感式\t24481\n顺逛\t24482\n时价\t24483\n集合并\t24484\njieshi\t24485\n单式折线统计图\t24486\n一伙\t24487\n阿来\t24488\n52本\t24489\n侧侧\t24490\n化脓性扁桃体炎\t24491\n魔链\t24492\n教育知识与能力\t24493\n靓美\t24494\n黄柏塬\t24495\n声艺\t24496\n4.2日\t24497\n讨论版\t24498\n5.13\t24499\n存货盘盈\t24500\ndruid\t24501\nLavender\t24502\n头肉\t24503\n200分钟\t24504\nrotated\t24505\n出版人\t24506\n刘先生\t24507\n湖南省民政厅\t24508\n桔子树\t24509\n南京住房公积金管理中心\t24510\n5班\t24511\n宁波职业技术学院\t24512\nAPE/\t24513\nSO3\t24514\n西红柿鸡蛋面\t24515\n双冠\t24516\n直驱\t24517\n忍道\t24518\n文城\t24519\n南京站\t24520\n高档型\t24521\n命题\t24522\n从正\t24523\n柯罗\t24524\n陈韵\t24525\n张江集团学校\t24526\n初灵信息\t24527\n省模\t24528\n白金酒\t24529\n刘子琪\t24530\n断刃\t24531\n波莱罗\t24532\n九州诗词\t24533\n落体\t24534\n鲁抗\t24535\nunlocked\t24536\n来京\t24537\n一分钟内\t24538\n新叶城\t24539\n二至丸\t24540\n俎\t24541\n虚影\t24542\n模糊化\t24543\n理数\t24544\n钢系\t24545\n特搜\t24546\n甘坑\t24547\n创业大厦\t24548\n架子管\t24549\n招联\t24550\n农夫山泉股份有限公司\t24551\n正阳路\t24552\nkuma\t24553\n卓根\t24554\n济源市\t24555\n47项\t24556\n李风\t24557\n嵩阳书院\t24558\n魔幻手机\t24559\n科举制\t24560\n1596\t24561\n四川省图书馆\t24562\n美国加州大学\t24563\n抽油泵\t24564\nsalt\t24565\nDirections\t24566\n纳通\t24567\n30队\t24568\n68个经典励志小故事大道理_人生哲理小故事大道理\t24569\n搜狐新闻中心\t24570\n640x480\t24571\n天竺\t24572\nST紫学\t24573\n只有一个地球\t24574\n锁心玉\t24575\n事务性\t24576\n湖南省水利厅\t24577\n王美玲\t24578\nWiFi放大器\t24579\n布拉德利\t24580\n社会科学界\t24581\n话说说\t24582\n陈奕\t24583\n宋晓\t24584\n欧姆表\t24585\n义亭\t24586\n迷你型\t24587\n水云天\t24588\nplaying\t24589\n钻探\t24590\n北七家\t24591\n万分之三\t24592\n乔灌木\t24593\n华润城\t24594\nerecovery\t24595\n纪录片\t24596\nⅲ\t24597\n统计分组\t2459
8\n索契\t24599\nucf\t24600\n1000句\t24601\nljhzzyx\t24602\n待字闺中\t24603\n特异点\t24604\n何音\t24605\nUsitrip走四方旅游网\t24606\n沈阳市人民政府办公厅\t24607\n破宫\t24608\n沧浪亭\t24609\n拍拍拍\t24610\n相连\t24611\n45千米\t24612\n板凳\t24613\n口袋妖怪复刻\t24614\n永明\t24615\nz18mini\t24616\n印刷术\t24617\n叔母\t24618\nChambers\t24619\nnovel\t24620\n曼彻斯特\t24621\n同等学力申硕\t24622\n知识产权法\t24623\npinguo\t24624\nN3450\t24625\nflood\t24626\n杭州社区\t24627\n实际上\t24628\ntexworks\t24629\n3122\t24630\n肥乡区\t24631\nTRAIL\t24632\n广州周大福金融中心\t24633\n纯电\t24634\n广告篇\t24635\n普耐尔\t24636\nInnovation\t24637\nUndergraduate\t24638\n小未\t24639\n亳州路\t24640\n上海国际集团\t24641\n盛运环保\t24642\n大西安\t24643\nterritory\t24644\n龟蛋\t24645\n滨河\t24646\ndivergent\t24647\n大胆\t24648\n周韶马苏\t24649\n2405\t24650\n王倩\t24651\n小淼\t24652\n上海浦东香格里拉大酒店\t24653\nSqlServer2008\t24654\n苏菲的世界\t24655\n60英寸\t24656\n天通\t24657\nrobocopy\t24658\nreplace函数\t24659\n五月份\t24660\nCOD9\t24661\n55x9000e\t24662\nalps\t24663\n缓解\t24664\n铭星\t24665\n抗生素类\t24666\n孙某\t24667\n木方\t24668\nraye\t24669\n山流\t24670\n双流区政府\t24671\n不得以\t24672\n男性化\t24673\n药丸\t24674\n实验校\t24675\n天宫一号\t24676\n达伦\t24677\n多线程服务器\t24678\n多一点\t24679\n湖南广益实验中学\t24680\nocci\t24681\nKYO\t24682\n半折\t24683\n露怯\t24684\n华夏综艺网\t24685\n冠宇\t24686\n情侣座\t24687\n分页符\t24688\n2002\t24689\n领啦网\t24690\n11辆\t24691\n天浴\t24692\n计划经济体制\t24693\n合规\t24694\n购房合同备案\t24695\nDisaster\t24696\n40mm\t24697\n足利\t24698\n补仓\t24699\nparsley\t24700\n奥芬巴赫\t24701\n摩戈尔\t24702\n微软&IT之家\t24703\n前11个月\t24704\nbaoju\t24705\nZip\t24706\n太冲穴\t24707\n吕建德\t24708\n害死人\t24709\n点火线圈\t24710\n刘钰儿\t24711\n32英寸\t24712\n金耳朵\t24713\n战幕\t24714\n清茶\t24715\n马寅初\t24716\n秒赞网\t24717\n99.9%\t24718\ndnfb\t24719\n日本红怡院\t24720\n交流展\t24721\nPACIFIC\t24722\n09号\t24723\n汽车检具\t24724\n怪物猎人物语\t24725\n迅雷吧\t24726\n下半身\t24727\nstatusCode\t24728\n30速\t24729\n扭矩扳手\t24730\n冉闵\t24731\nLibsvm\t24732\n两部\t24733\n上腿\t24734\n风袋\t24735\nmelville\t24736\n500m\t24737\n75号\t24738\nweb程序\t24739\n外景地\t24740\n深圳环球\t24741\n∴\t24742\n二十卷\t24743\n米顿罗\t24744\n团员证\t24745\n小章\t24746\n建筑工程有限公司\t24747\n龙凤纹\t24748\n嘉兴秀洲区\t24749
\ninsight4\t24750\ninstall.wim\t24751\n中国科技部\t24752\nCAT021\t24753\n融创东麟府\t24754\n第52页\t24755\n康师傅方便面\t24756\n汽修\t24757\n排风机\t24758\n聋子\t24759\npoisson\t24760\n6.5折\t24761\n長\t24762\noxlxs\t24763\n15【\t24764\n熊英\t24765\n萨尔图区\t24766\n弊处\t24767\ncan't\t24768\n研究社\t24769\n岩谷\t24770\n神州优车集团\t24771\n扶摇直上\t24772\n美妖\t24773\n6月19日\t24774\n弗里德\t24775\n汨罗\t24776\n凤仙花\t24777\n宋美龄\t24778\nLIP\t24779\n歪打正着\t24780\n画苑\t24781\n大连广播电视台\t24782\n2540\t24783\njinjiang\t24784\nprisonschool吧\t24785\n136家\t24786\n合一集团\t24787\nUltrasonic\t24788\ncham\t24789\n长沙舰\t24790\nSpear\t24791\n五指山市政府网\t24792\n专业性\t24793\nU钙网\t24794\n20160322\t24795\n铟\t24796\n広瀬\t24797\n渔网\t24798\n女头\t24799\n川东北\t24800\n职业生涯\t24801\n飞头\t24802\n钙奶\t24803\n白牛\t24804\n周晋\t24805\n劈山救母\t24806\n洋沙湖\t24807\n教师招聘考试网\t24808\n通力电梯有限公司\t24809\n基金档\t24810\nALOHA\t24811\n唱吧电脑版\t24812\n柴火\t24813\n水盂\t24814\n民主监督\t24815\n南极电商\t24816\n哽咽\t24817\n碰伤\t24818\n脑科学\t24819\n鞍钢\t24820\n归集表\t24821\nvisibility\t24822\n3x-1\t24823\n0163\t24824\n途睿欧_\t24825\n18虚拟研究院\t24826\n护民\t24827\nEQ\t24828\nド無料サンプル\t24829\n麦片\t24830\n华生园\t24831\n百万庄\t24832\n风船\t24833\n急性心梗\t24834\n国魂\t24835\n刀条\t24836\n青驿\t24837\n金太阳鹦鹉\t24838\n26000\t24839\n超星泛雅\t24840\n从小到\t24841\n结党营私\t24842\n胆矾\t24843\n懒得\t24844\n有意者\t24845\n事记\t24846\n纸人\t24847\n优阁网\t24848\n莫甘娜\t24849\n外交战\t24850\n沈阳机床股份有限公司\t24851\n爱之入\t24852\n拳道\t24853\n辨证论治\t24854\n十六位\t24855\n泣不成声\t24856\n正安\t24857\n长沙宾馆\t24858\n不知晓\t24859\n三片\t24860\nevolved\t24861\n愤青\t24862\n十三届\t24863\n参比\t24864\n客上\t24865\n食品饮料招商网\t24866\n考试\t24867\n大壶\t24868\n20170918\t24869\nspawning\t24870\n警苑\t24871\n运城市\t24872\n消音棉\t24873\n铺满\t24874\n河北能源职业技术学院\t24875\nPLATINUM\t24876\n4500亿元\t24877\n百度雲安全讓史\t24878\n横陈\t24879\n两回事\t24880\n大学士\t24881\n子曰\t24882\n三磷酸腺苷\t24883\n危害\t24884\n性瘾日记\t24885\n独立日2:卷土重来\t24886\n佛罗里达\t24887\n滨江西路\t24888\n一阵\t24889\nJohnson\t24890\nkarina\t24891\n渐入佳境\t24892\nNBIOT\t24893\n湛江人才网\t24894\n走遍美国\t24895\nntc\t24896\n梅根·福克斯\t24897\n六盘水师范学院\t24898\n太湖城\t24899\n300t\t24900\n把戏\t24901\nPP棉\t24902\nsna\t2490
3\n天津城投\t24904\nWox\t24905\n菲律宾人\t24906\n负载箱\t24907\n常数\t24908\n第九代\t24909\norca\t24910\n第六次\t24911\nablation\t24912\n没想到\t24913\n朱生岭\t24914\n阳朔县\t24915\n小Q\t24916\n书坊\t24917\n中白\t24918\n八桂\t24919\n大龙燚\t24920\n张广智\t24921\n国家标准化管理委员会\t24922\n疼惜\t24923\n主次\t24924\n探询\t24925\n金蝶游乐场\t24926\n青天河\t24927\nlibxml2\t24928\n单反快门\t24929\n军工类\t24930\n第二名\t24931\n捕风\t24932\n热面\t24933\n第26个\t24934\n晶城\t24935\n千叶真一\t24936\n李小飞\t24937\nkose\t24938\n85%\t24939\njinian\t24940\n画屏\t24941\n下雪乃\t24942\neager\t24943\nRazer\t24944\n马克丁\t24945\n极战联盟\t24946\n排遣\t24947\n红墅湾\t24948\n车厘子\t24949\n第125\t24950\n柏丽\t24951\n广州南站\t24952\n300张\t24953\nAdvances\t24954\n气功\t24955\n海国图志\t24956\n矿油\t24957\n失量\t24958\n升阶\t24959\n12荒\t24960\n影音先锋日夜撸妻子撸,撸死你资源站影音先锋\t24961\n大郎\t24962\n粤妆\t24963\n岩-51CTO\t24964\nOrCAD\t24965\n科技之光\t24966\n色达佛学院\t24967\n江西省政协\t24968\n10次\t24969\n产业园\t24970\n会声会影X5\t24971\n宋勇\t24972\n天诛\t24973\nスケベ\t24974\n汉富控股\t24975\n百度经纬度\t24976\n三星公司\t24977\nmicrobial\t24978\n领养\t24979\n归并\t24980\n莱芜职业技术学院\t24981\nLambert\t24982\ndevc++\t24983\n急剧\t24984\nGeekbench\t24985\n.top\t24986\nC++14\t24987\n000710\t24988\n标准房\t24989\nsno\t24990\n九品中正制\t24991\n洞身\t24992\n吉他中国论坛\t24993\n白刚玉\t24994\nSuppliers\t24995\n跨境电商鹰熊汇\t24996\n环世界a16\t24997\n张玉林\t24998\n爆发期\t24999\nCodecs\t25000\n两具\t25001\nDivided\t25002\n生物学\t25003\n半决赛\t25004\n配音稿\t25005\nLeg\t25006\n麦肯光明\t25007\n丹霞地质公园\t25008\n古墓丽影:崛起\t25009\n创见\t25010\n恍然大悟\t25011\nfeiche\t25012\n金华镇\t25013\nhhkb\t25014\n黎明前\t25015\nHIPS\t25016\n咲\t25017\n中南传媒\t25018\n出界\t25019\n殖民化\t25020\nmindmanager\t25021\nф\t25022\n纤纤\t25023\nknock\t25024\n富春江\t25025\n林黛\t25026\n金鹰店\t25027\n非离子表面活性剂\t25028\n长山镇\t25029\njavascirpt\t25030\n莎娜\t25031\n9575\t25032\n英语文\t25033\n上海人才服务中心\t25034\n皮件\t25035\n彩天堂\t25036\nDebugger\t25037\n视屏\t25038\n植物大战僵2\t25039\n荣幸\t25040\n输液管\t25041\n邢邵林\t25042\n凌汛\t25043\n第十大\t25044\n结缔组织\t25045\n土木专业\t25046\n富龙\t25047\n林内\t25048\n国美苏宁\t25049\n发臭\t25050\n尔雅\t25051\nsm\t25052\n贵州农村信用社\t25053\nSleepy\t25054\n新诗\t25055\n万网\t25056\n范毅\t25057\n夏同学
\t25058\n枣强县\t25059\n易福门\t25060\n保利湾\t25061\n少帅明若晓溪\t25062\nHB\t25063\n同屏\t25064\n小型张\t25065\n人虫\t25066\n180410\t25067\n残差项\t25068\n需努力\t25069\n保利中航城\t25070\n江苏省扬州市梅岭中学\t25071\n明堂山\t25072\n主顾\t25073\n凤临\t25074\n曼宁\t25075\n寒酸\t25076\n罗技G903\t25077\n竺\t25078\n零欧元\t25079\n保持沉默\t25080\n是真的爱\t25081\n乐视\t25082\ncobbler\t25083\nxingba\t25084\n河南中医药大学第一附属医院\t25085\n淘小二\t25086\nSM公司\t25087\n眷族\t25088\n都市无上仙医\t25089\n第九连\t25090\n捧月\t25091\n露珠\t25092\n杨大伟\t25093\n亳州先锋网\t25094\n马云飞\t25095\n众泰T600\t25096\n蓝皮书\t25097\n广东中学\t25098\n丰顺县\t25099\n中川国际机场\t25100\n文书\t25101\n3770\t25102\n旧金山大学\t25103\n武汉公交查询网\t25104\n打绝\t25105\n很反感\t25106\n连接杆\t25107\n3.20.03\t25108\nG2800\t25109\nlogax\t25110\n翻领\t25111\nHIPANDA\t25112\npango\t25113\n8050\t25114\n卖家版\t25115\n奇瑞a5\t25116\n招商银行大厦\t25117\n没有退路\t25118\n公众公司\t25119\n许越\t25120\n33朵\t25121\n医士\t25122\n吴淞\t25123\n调谐\t25124\n0.3_\t25125\n乌伦古湖\t25126\n受戒\t25127\nTunge\t25128\nISOYES\t25129\n20150709\t25130\n48平\t25131\nwish\t25132\n山村老尸\t25133\n白芸\t25134\n多妻\t25135\n排档\t25136\n雷吉米勒\t25137\n怀恩\t25138\nUNDO\t25139\n桃花湖\t25140\n杨宏伟\t25141\n这就是街舞\t25142\n欲望号街车\t25143\n姜素拉\t25144\n火影忍者OL\t25145\n大成路\t25146\nSnowy\t25147\n景德镇市\t25148\njavase\t25149\n电信小区\t25150\npowerdvd\t25151\n乱说\t25152\n单体化\t25153\nkml\t25154\n5.2.8\t25155\n1.14G\t25156\n连作\t25157\n瑞斯康达科技发展股份有限公司\t25158\n喜临\t25159\n河北青年报电子报\t25160\n通阀\t25161\n鲁中\t25162\n_哲想\t25163\ntocken\t25164\ntowards\t25165\n亚麻\t25166\n90\t25167\n小米万能遥控器\t25168\n两轴\t25169\n翻模\t25170\n胳\t25171\n限界凸骑\t25172\nVSD\t25173\n3max\t25174\n听说课\t25175\njS\t25176\n马自达\t25177\n2017年9月7日\t25178\n想起诉\t25179\nUnit4\t25180\nkernels\t25181\n小赤\t25182\n色天\t25183\n欢歌笑语\t25184\npro7\t25185\nwatsons\t25186\n牛仔\t25187\n不相上下\t25188\nDealer\t25189\n鹤岗新闻网\t25190\n废品回收网\t25191\n五笔字根\t25192\n震旦复印机\t25193\nWin10/Win8.1\t25194\n第五十六章\t25195\n20160713\t25196\n马诗\t25197\n开封市政府\t25198\n梨园村\t25199\n纵贯\t25200\n20180407\t25201\n冶金网\t25202\n迷男\t25203\n1-x^2\t25204\nAutoVue\t25205\n惠东站\t25206\n夺宝幸运星\t25207\n金发女郎\t25208\n卫宁健康\t25209\n长江大道\t25210\n1
-10月\t25211\n城人\t25212\nAescripts\t25213\n几张\t25214\nR-18\t25215\n集数\t25216\n无意注意\t25217\n2018年五一劳动节\t25218\nburden\t25219\nzhaoshayou\t25220\n九流\t25221\n栾城县\t25222\n谢春燕\t25223\nhandjob\t25224\n20160603\t25225\n慢性肝病\t25226\n紫宸\t25227\n临睡前\t25228\n指\t25229\n稻谷\t25230\n初春\t25231\nwinwin\t25232\n中华人民共和\t25233\n舰船\t25234\n碳酸铵\t25235\n守心\t25236\nAndrew_qian\t25237\n影界\t25238\n导购网\t25239\n跑遍\t25240\n2122\t25241\n米山\t25242\n马蹄\t25243\n小二寸\t25244\nDroid\t25245\n战争与文明\t25246\n公考雷达\t25247\n现代派\t25248\n两仪式\t25249\nContents\t25250\n赶尸艳谈\t25251\nwha\t25252\n金豹\t25253\n2018年5月2日\t25254\nedge+\t25255\n2000亿美元\t25256\nbirkin\t25257\n零钞\t25258\n大博医疗\t25259\n鸡篇\t25260\n华能新能源股份有限公司\t25261\n26集\t25262\n排忧解难\t25263\nmounting\t25264\n影剂\t25265\n东西大道\t25266\n心肝宝贝\t25267\n李惠\t25268\nSroot\t25269\n光与影\t25270\nBangkok\t25271\n六龄童\t25272\n以太坊go-ethereum\t25273\n公平感\t25274\n几者\t25275\n25w\t25276\n顾安诚\t25277\n去职\t25278\n玛纳斯县人民政府\t25279\nPacker\t25280\n下关沱茶吧\t25281\ncsx\t25282\nsshd服务\t25283\n坤哥\t25284\n谭杰希\t25285\nG403\t25286\nCollectibles\t25287\n料包\t25288\n郭富城\t25289\n十三所\t25290\nFirewall防火墙\t25291\n激动人心\t25292\n鸡骨\t25293\n企业养老保险\t25294\n卡其裤\t25295\n李向群\t25296\nLikwo\t25297\n小千代\t25298\n菜池\t25299\n满度\t25300\navi格式转换器\t25301\n小黄飞\t25302\n阿喵\t25303\n火萤\t25304\n东环\t25305\npyqt\t25306\n44本\t25307\n普法\t25308\nstitches\t25309\n斯柯达新明锐\t25310\n英语类\t25311\n乐海\t25312\n恩施新闻网\t25313\n信息学部\t25314\n氵\t25315\n药师咒\t25316\n新东方在线网络\t25317\n柴沟堡\t25318\n童声合唱团\t25319\n橘夏\t25320\nセクシ\t25321\npixmap\t25322\n5711\t25323\n485\t25324\n国债收益率曲线\t25325\n小脚\t25326\n占座\t25327\nXX县\t25328\n1000000元\t25329\n美景地\t25330\n听风\t25331\n五家渠\t25332\n教研项目\t25333\n重信\t25334\n麦丽素\t25335\n5576\t25336\n泰丸\t25337\n万丈深渊\t25338\ntosho\t25339\nMakerBot\t25340\n物华天宝\t25341\npugb\t25342\nGumroad\t25343\n房费\t25344\noperator\t25345\n深圳市中级人民法院\t25346\n邱岩\t25347\n吻技\t25348\n圣枪\t25349\notb\t25350\n绝地求生刺激战场S2\t25351\n龙血战神\t25352\n麦可\t25353\n双缩脲法\t25354\n武深高速\t25355\n上虞新闻网\t25356\n用心良苦\t25357\n希波克拉底\t25358\n扎龙\t25359\n万科金域中央\t25360\n文强\t25361\n霍华德\t2
5362\n696\t25363\n天津狗不理包子\t25364\n905路\t25365\n华庭\t25366\n私搭\t25367\n培育期\t25368\nNortheastern\t25369\n0817\t25370\n缩地\t25371\n沅陵县政府\t25372\n冬不拉\t25373\n126平\t25374\n楼内\t25375\n网易经济学家年会\t25376\nR480\t25377\n元格\t25378\n盐城市人民政府\t25379\n墨渊\t25380\n书记官\t25381\n教育杯\t25382\n国自然基金\t25383\nwin8输入法\t25384\n不可逆\t25385\n30美元\t25386\nxcm\t25387\n溉\t25388\n压声\t25389\n卤制品\t25390\nCIVIL\t25391\n火大\t25392\n中药注射液\t25393\nFreecode\t25394\n周晓燕\t25395\n警翼\t25396\nMatthews\t25397\n广府\t25398\n剑客网\t25399\nShiyu_Huang\t25400\n京昆高速\t25401\n住房公积金贷款利率\t25402\nClothing\t25403\nloadmore\t25404\nAdministrative\t25405\n稳评\t25406\n犇犇\t25407\ndelle\t25408\nsynth\t25409\n成正比\t25410\nk2450\t25411\n相薄\t25412\n菩提\t25413\n岳亮\t25414\nmarkii\t25415\ngooglelist\t25416\n白咲碧\t25417\n篆体字网\t25418\n三四岁\t25419\n穿刺器\t25420\nrm\t25421\n摩天大厦\t25422\n育儿大作战&#160\t25423\n北京长城\t25424\n第十六\t25425\n38篇\t25426\n8650\t25427\n新际航\t25428\n戴氏教育\t25429\n昆山市人民法院\t25430\n宠物天王\t25431\naecom\t25432\nfighter\t25433\n公选\t25434\n转账\t25435\n第1176章\t25436\n他们的故事\t25437\n噬神者\t25438\n雷影\t25439\n国网技术学院\t25440\n计策\t25441\n一定要买\t25442\n丝网除沫器\t25443\n6960\t25444\n精酿\t25445\n训导\t25446\n王国斌\t25447\n全民健康生活方式行动\t25448\n字词\t25449\n黄培\t25450\n1.25\t25451\n成飞集团\t25452\njsp:include\t25453\n览众\t25454\nsuitable\t25455\nSuperfly\t25456\n服务版\t25457\n三菱空调\t25458\nDIV层\t25459\n归档文件整理规则\t25460\n石油焦\t25461\n第1回\t25462\nluotuo\t25463\n#0\t25464\n天街小雨润\t25465\n桃花庵\t25466\n街心\t25467\niPhone6Plus\t25468\n捉鱼\t25469\n病榻\t25470\n市场定位\t25471\n朝夕\t25472\n龙鑫\t25473\n切除术\t25474\n鲁道夫\t25475\n杏树纱奈\t25476\n杉杉能源\t25477\nZjmainstay\t25478\n习近平七年知青岁月\t25479\n四速\t25480\n中国资源卫星应用中心\t25481\n风湿性关节炎\t25482\n朱祁钰\t25483\n图画书\t25484\n办用\t25485\n郑志强\t25486\n虎纹蛙\t25487\n广州外国语学校\t25488\n烈士纪念馆\t25489\n硅铁\t25490\n色无极\t25491\n综合性门户网\t25492\n一地下室\t25493\n第四大\t25494\n饲料\t25495\n奥凯航空\t25496\nJacket\t25497\n李顺祥\t25498\n┋\t25499\nOpera\t25500\n西西商城\t25501\n佳能Canon\t25502\n食尸鬼\t25503\n上下床\t25504\nscenekit\t25505\n薛家\t25506\n亲爱的安德烈\t25507\n考练\t25508\n江湖风云录\t25509\n崇武古城\t25510\n蛛网膜下腔出血\t25511\n金永
哲\t25512\n1080P/MP4\t25513\n张怡宁\t25514\n汉台区人民政府\t25515\n华银天鹅湖\t25516\nescaped\t25517\n日立n4000\t25518\n八喜\t25519\n束腰\t25520\n社保\t25521\ncockyboys\t25522\npydicom\t25523\n天鹰座\t25524\n免试研究生\t25525\n威发\t25526\norigion\t25527\n1055T\t25528\n刘钊\t25529\n徐变\t25530\nbleeding\t25531\n900个\t25532\n下作\t25533\n爱生恨\t25534\n天正T20\t25535\n工作表\t25536\n单生\t25537\n开发单\t25538\n人行横道\t25539\n春花秋月\t25540\n酷客\t25541\nGraco\t25542\nInsertion\t25543\n凌冽\t25544\nTheFatRat\t25545\n陕西师大\t25546\n大铁\t25547\n宜川县人民政府\t25548\n舟子\t25549\n黄慎\t25550\n纯元\t25551\n靠右居\t25552\n千千万万个\t25553\n上海地铁2号线\t25554\n暗黑圣经\t25555\n1.172\t25556\n六枝\t25557\nrcs\t25558\nRuna\t25559\n列宾美术学院\t25560\n光端\t25561\n猫屎一号\t25562\n排队机\t25563\n护佑\t25564\n短途\t25565\n中铁上海工程局\t25566\n比价网\t25567\n应答器\t25568\n市场\t25569\npoll\t25570\n挽歌\t25571\n柯达公司\t25572\n20160107\t25573\nCUCKOO\t25574\n弗莱明\t25575\nCas\t25576\n蟾\t25577\nvue嵌套路由-query\t25578\n李由\t25579\nNONO\t25580\n南京三江学院\t25581\n情人节\t25582\n安妮扎克伯格\t25583\n弹开\t25584\n大岛\t25585\n长沙市公安局\t25586\ntafe\t25587\nTop\t25588\nOutstanding\t25589\nUHF\t25590\nqpq\t25591\n美兰湖\t25592\n钻石广场\t25593\n正义路\t25594\n北京市大学\t25595\n顶下\t25596\n护花危情\t25597\nMP4/\t25598\n天堂乐队\t25599\ndribbble\t25600\nexcel进销存表\t25601\nModify\t25602\n3.7.0\t25603\n陶菲克\t25604\n帧速率\t25605\n鲁梅尼格\t25606\n帖\t25607\n陈新民\t25608\n佳信\t25609\n附加码\t25610\n王靖雯\t25611\n采暖期\t25612\n女学者\t25613\n箫谱\t25614\n开封市住房和城乡建设局\t25615\nIdon\t25616\n表率\t25617\n太难看\t25618\n屠龙\t25619\n盒底\t25620\n凯悦酒店\t25621\n龙奔\t25622\nCentara\t25623\n大堡礁\t25624\n金线莲\t25625\n琶音\t25626\n002460\t25627\n鎏金\t25628\n显卡门\t25629\n中国医药指南\t25630\n第十七\t25631\n芝士条\t25632\n国税局\t25633\n中华人民共和国国家卫生和计划生育委员会\t25634\nGTASA\t25635\nmt6735\t25636\nwin7版\t25637\n向量表\t25638\nFEC\t25639\n抵账\t25640\ncivic\t25641\n雷克萨斯ES论坛_汽车之家论坛\t25642\n深圳科学馆\t25643\n预应力钢\t25644\n福克斯吧\t25645\n第49期\t25646\n端木云\t25647\nVUE\t25648\n沃尔左耳\t25649\ngantt\t25650\n年画\t25651\n抵达率\t25652\n诊断\t25653\n堂兄弟\t25654\n小s布兰妮\t25655\n江南区\t25656\necm\t25657\nc13\t25658\nPROCESS化工网\t25659\n广州市自来水公司\t25660\njessica\t25661\n软链\t25662\n
聊天王\t25663\n张沛\t25664\n山村\t25665\n常用药\t25666\ncontenteditable=\t25667\n纠缠不清\t25668\n内眦赘皮\t25669\n25亿元\t25670\njingdian\t25671\n高明区人民政府\t25672\n胶帽\t25673\n108集\t25674\nv3max\t25675\n迪伦马特\t25676\nvscode\t25677\n瞒报\t25678\n温伯格\t25679\n合肥装修公司\t25680\nBodyPaint\t25681\n超声雾化器\t25682\n7遍\t25683\nudf\t25684\n房\t25685\n结露\t25686\n腾讯体验中心\t25687\n巫宁坤\t25688\nRye\t25689\n艾烟\t25690\n空政文工团\t25691\n冷背\t25692\n丝袜奶茶\t25693\nhollywood\t25694\n缩机\t25695\nGRA\t25696\n2018年03月20日\t25697\n奋豆\t25698\n07\t25699\n环境监控\t25700\nsnmp4j\t25701\n盐碱化\t25702\n军区\t25703\n歌圩节\t25704\nStan\t25705\n以待\t25706\n线条感\t25707\n龙巅网\t25708\n卢强\t25709\n叶敏\t25710\n1494\t25711\n新闻源\t25712\nwww.lanhua8.com\t25713\n拳皇中文\t25714\n2017.09\t25715\n紫宁\t25716\nueditor\t25717\n梅苑\t25718\n农机局\t25719\nNelson\t25720\n党中央\t25721\ncdrx5\t25722\n屁话\t25723\n头式\t25724\n王小波\t25725\n优享卡\t25726\n妆颜\t25727\n6安\t25728\n500万美元\t25729\n不堪一击\t25730\n治安\t25731\n异方差\t25732\n雅施达\t25733\n中川翔子\t25734\n正室\t25735\n国行机\t25736\n锦绣新天地\t25737\n11选5\t25738\n金木浴疗\t25739\n算税\t25740\n电开水炉\t25741\n技术工人\t25742\n东风镇\t25743\n风云会\t25744\n山阳\t25745\n海德公园\t25746\n1.12.6\t25747\nVariable\t25748\n真白\t25749\n何曼婷\t25750\n画友\t25751\n六十年\t25752\n水期\t25753\n广东省政府党组\t25754\n附着体\t25755\ngaysex\t25756\n不可选\t25757\n美国加州大学伯克利分校\t25758\n自诊\t25759\nMapleStory\t25760\n139号段\t25761\n博饼\t25762\ndnf剑魂\t25763\n2054\t25764\nbwa\t25765\n绝非偶然\t25766\n民乐村\t25767\n潜水艇鱼\t25768\n十六集\t25769\n有声小\t25770\nkingdee\t25771\n3511\t25772\nwxwidgets\t25773\n乐思闲_院校\t25774\n复仇天使\t25775\n昭觉县\t25776\n电子创新网\t25777\n捷报比分网\t25778\n三挡\t25779\n刘璐\t25780\n反偏\t25781\n加拿大元\t25782\n太阳炉\t25783\n甘敬\t25784\nlevel\t25785\nCNBC\t25786\n胡斐\t25787\n血管病\t25788\n186个\t25789\n神族\t25790\n抽送\t25791\nThroat\t25792\n烂陀寺\t25793\n三好网志\t25794\n冲蚀\t25795\nZ97\t25796\n小仙\t25797\n加气砼\t25798\n集体土地使用证\t25799\n01期\t25800\n张伟平\t25801\n太乙\t25802\n王台\t25803\n属相\t25804\n亵\t25805\n移动端\t25806\n孑孓\t25807\nMATE7\t25808\nPNP型\t25809\n吞万里如\t25810\n成都动物园\t25811\n续貂\t25812\n下注\t25813\n中间人\t25814\n维生素D3\t25815\n果报\t25816\n王琦\t25817\n美苏\t258
18\n玉龙湾\t25819\n国际教育学院\t25820\n飞秒激光手术\t25821\n金牛男\t25822\n公流水\t25823\n忠骨\t25824\n电脑机\t25825\n高支模\t25826\n主导\t25827\n勤工俭学\t25828\n福尔摩斯探案全集\t25829\n1.8V\t25830\n综合性\t25831\n肏屄\t25832\n第六号\t25833\n5515\t25834\n敬长\t25835\n大圩古镇\t25836\nSIMOTION\t25837\n万科城市花园\t25838\nv1.4.5\t25839\n理论性\t25840\n禾苗\t25841\n萨满教\t25842\n约翰霍普金斯大学\t25843\n第122集\t25844\nSSIM\t25845\n继女\t25846\n上地实验学校\t25847\n100.64\t25848\n958\t25849\n德彩\t25850\nBossa\t25851\nFATAL\t25852\n去味\t25853\n言情520小说网\t25854\n诺瓦露\t25855\nQQ微云\t25856\n群P\t25857\n汽车行业信息网\t25858\nwherein\t25859\n侠水\t25860\n涉江采芙蓉\t25861\n舟曲县\t25862\n参上\t25863\n自利\t25864\n佳能eos\t25865\n谱架\t25866\n三聚环保\t25867\n口蹄疫疫苗\t25868\n190级\t25869\n爱问通\t25870\n易先生\t25871\nguan\t25872\n20320\t25873\n朱顶红\t25874\n五县\t25875\n白鸽\t25876\nDelivered\t25877\n供销大厦\t25878\n20世纪\t25879\n植物大战僵尸修改器\t25880\ncs6序列号\t25881\n陈琳萱\t25882\n冯晓泉\t25883\nv社\t25884\n20170529\t25885\nダ\t25886\n行动令\t25887\n白洋淀景区\t25888\nmustn\t25889\n番名\t25890\nGummy\t25891\n武汉江滩\t25892\n红蝴蝶\t25893\nS201\t25894\nKappa\t25895\nTypecho\t25896\n被停\t25897\n火端\t25898\nflutter\t25899\n法律逻辑学\t25900\n热水量\t25901\n放大\t25902\n石峰区\t25903\nhappier\t25904\n跨代\t25905\n川崎前锋\t25906\n火影忍者中文网\t25907\n莫兰指数\t25908\nrenee\t25909\n闹婚\t25910\n九九\t25911\n南湖街道\t25912\n圆偏振光\t25913\n石子路\t25914\n6.25\t25915\n秃子\t25916\n明虾\t25917\n本境\t25918\n下图\t25919\n铂链\t25920\n工业开发区\t25921\n百度金融\t25922\nbelive\t25923\n李永\t25924\nxaizai\t25925\n句酷批改网\t25926\n募捐\t25927\n越调\t25928\n代征\t25929\n婚姻介绍所\t25930\n和讯论坛\t25931\ncx3\t25932\n海亮教育集团\t25933\n殷格翰\t25934\n测试\t25935\n眷属\t25936\n企业管理类\t25937\n流产假\t25938\n威海市财政局\t25939\n西联\t25940\n唤灵\t25941\n多模光缆\t25942\n食堂\t25943\n北京建工\t25944\n昌平镇\t25945\n五花马\t25946\n肉壁\t25947\n175\t25948\n淮北职业技术学院\t25949\n禁塑令\t25950\n孙海燕\t25951\n女同桌\t25952\n部落守卫战\t25953\nGRAMMY\t25954\n恋老\t25955\n金鹏俱乐部\t25956\n冰冻\t25957\n中国船舶工业集团有限公司\t25958\n松原市人民政府\t25959\n修仙记\t25960\n饵\t25961\n留赞\t25962\n腰椎管狭窄症\t25963\n岩土力学\t25964\n海民\t25965\n高智\t25966\nWeir\t25967\nYARN\t25968\n补片\t25969\nMYTH\t25970\nTEXT\t25971\n竞潜\t25972\ncolorscheme\t2
5973\n分布式消息队列RocketMQ\t25974\nRAV4荣放论坛\t25975\n恶魔法则\t25976\n厦门海关\t25977\n荡漾\t25978\n北固湾\t25979\n上世纪八十年代\t25980\n塞尔达里\t25981\n汽油机\t25982\nDisplacement\t25983\n众里寻她千百度\t25984\nSIMULINK\t25985\n限界\t25986\n铁山\t25987\n悦荟\t25988\n猫鼠\t25989\n显灵\t25990\n课程教育研究\t25991\n重箱\t25992\n隐马尔科夫模型\t25993\n6000公里\t25994\n阿尔法罗密欧论坛_阿尔法罗密欧\t25995\nTabata\t25996\n非银\t25997\n剑网三七秀\t25998\n1526\t25999\n邢台新闻网\t26000\n贯流式\t26001\nHavana\t26002\n副所长\t26003\ndeman\t26004\n长春市实验中学\t26005\n脚筋\t26006\n5.6.16\t26007\nKID\t26008\n诗仙太白\t26009\n匠人精神\t26010\n张少佐\t26011\n昭衍\t26012\n丙方\t26013\n亲贤街\t26014\n博尔塔拉蒙古自治州\t26015\n联合发文\t26016\n道印\t26017\n奥斯特\t26018\n中国保险行业协会\t26019\n旅游季\t26020\nStudio3.0\t26021\n西溪湿地公园\t26022\n洪刚\t26023\nmotul\t26024\n维塔彭蕾\t26025\n五方面\t26026\n三角窗\t26027\n爪\t26028\n刘少奇\t26029\n风轮\t26030\nFIF\t26031\n雅士利\t26032\n黑瞎子\t26033\nKoMiles\t26034\n两峰\t26035\n第5位\t26036\nchung\t26037\n古旧\t26038\n200【\t26039\n滨边美波\t26040\n豹骨\t26041\n急变\t26042\n友谊之光\t26043\n慵懒散\t26044\n禾田\t26045\nfaith\t26046\n詹姆斯·乔伊斯\t26047\nwww.ra206.com/at\t26048\n消炎利胆片\t26049\n汇金大厦\t26050\nMeGUI\t26051\n沧海一笑\t26052\n恶搞版\t26053\nnba全明星赛\t26054\n南京九中\t26055\n黄板\t26056\nplanting\t26057\nHapag-Lloyd\t26058\n霹雳天命之仙魔鏖锋\t26059\n马来酸\t26060\n空置税\t26061\n个园\t26062\npvc\t26063\n巨化股份\t26064\n断食\t26065\n奥伦\t26066\n采集员\t26067\n怼怼\t26068\n哈姆\t26069\n泰达币\t26070\n扭剪型\t26071\n教学资源网\t26072\n电钻\t26073\n质点\t26074\nconnie\t26075\n徐州开发区\t26076\n2_九游\t26077\n塔克拉玛干\t26078\n东旭\t26079\n剑网\t26080\nv1.5.0\t26081\nX5650\t26082\n25分\t26083\n2018年1月5日\t26084\n融创中心\t26085\n首相\t26086\ncountif\t26087\n优酷YC\t26088\n2160p\t26089\n无颜\t26090\n胶束\t26091\nJanus\t26092\n6户\t26093\n烧录器\t26094\nぽ\t26095\nA站\t26096\n李天天\t26097\n滨江国际\t26098\n班芙\t26099\n金狐\t26100\n伊丽莎白奥尔森\t26101\n中国保险网\t26102\nfarmer\t26103\n深圳地铁7号线\t26104\n中国医师协会\t26105\n王者荣耀明世隐\t26106\n七桥\t26107\n人力资源部\t26108\nBFT\t26109\n退休人\t26110\n34首\t26111\n储量\t26112\nPPT+\t26113\n横打\t26114\n快乐乒乓网\t26115\nTrusted\t26116\n机甲类\t26117\n石英砂\t26118\n尔雅通识课\t26119\n枣庄银行\t26120\n亚哥\t26121\n小苏打\t26122\n万维\t26123\n榆荚\t26
124\niis7.0\t26125\n假面骑士ex\t26126\n上海科学技术出版社\t26127\n录单员\t26128\nAlchemy\t26129\n科企\t26130\n欧美亚\t26131\n胶泥\t26132\n红龙\t26133\n偏好性\t26134\n戏言\t26135\n29级\t26136\nExamples\t26137\n渣浪\t26138\n吉言\t26139\n糟心\t26140\n靶\t26141\neater\t26142\n简明版\t26143\n表上作业法\t26144\n老虎型\t26145\n醋酸甲羟孕酮片\t26146\n黑椒牛柳\t26147\n环己烯\t26148\n钙镁片\t26149\n德生\t26150\n龙神契约\t26151\n洗液\t26152\nubtun\t26153\n12年后\t26154\n缓更\t26155\n张医生\t26156\n电动汽车时代网\t26157\n小儿肺热咳喘口服液\t26158\n随机化\t26159\nObjC\t26160\n戴春荣\t26161\n身家\t26162\n易百\t26163\n无刷直流电机\t26164\n公主版\t26165\n离弃\t26166\n新东方优能中学\t26167\n附加题\t26168\n左克\t26169\ncandid\t26170\n雨木林风\t26171\n澳际留学\t26172\n创造者\t26173\n新港路\t26174\n囊内\t26175\n新宠\t26176\n1-4月份\t26177\n工作点\t26178\n林口县\t26179\nx-man\t26180\n初动\t26181\n神武3端游\t26182\n良友\t26183\n一个度\t26184\n2012年8月\t26185\n视而不见\t26186\n二枚\t26187\nzoovixen\t26188\n神图\t26189\n松榆西里\t26190\n托管行\t26191\ncompensate\t26192\n丽晶国际\t26193\n预嘱\t26194\n11.18\t26195\ntpd\t26196\n合编\t26197\n神名\t26198\naviator\t26199\n冰糖\t26200\n自找麻烦\t26201\n中国保监会广东监管局\t26202\nZeroC\t26203\n专业卡\t26204\n传送阵\t26205\ncompatibility\t26206\nvmotion\t26207\n中国石化\t26208\n返修\t26209\nZO\t26210\n武汉欢乐谷\t26211\n跆拳道馆\t26212\n山脊赛车2\t26213\n金灿\t26214\n西蒙电气\t26215\nkt-joker\t26216\n龙珠斗士Z\t26217\npph\t26218\n汤炉\t26219\n048期\t26220\n句意\t26221\n中研科技网\t26222\nmud\t26223\n宁县\t26224\n塌陷\t26225\n定案\t26226\n收汁\t26227\n记得\t26228\n2018年3月20日\t26229\n神文\t26230\n伦巴\t26231\n岩烧乳酪\t26232\n区县\t26233\nOracle10g\t26234\n增值税普通发票\t26235\nvsFTPd\t26236\n野鸡蛋\t26237\n开学典礼\t26238\n几天内\t26239\nCloner\t26240\n听雪\t26241\n陪伴者\t26242\n掌印\t26243\n图丽\t26244\nDOTA\t26245\n威姿\t26246\n灌云\t26247\n袁凯\t26248\n物理学类\t26249\n汉坤律师事务所\t26250\n延期付款\t26251\n奥迪Q5论坛_汽车之家论坛\t26252\n风闻\t26253\n英国东印度公司\t26254\n静雅思\t26255\n白云寺\t26256\ndied\t26257\n中途岛\t26258\n模线\t26259\n成批\t26260\n布克兄弟\t26261\nESXI\t26262\n埭溪镇\t26263\nhumanities\t26264\n明挖\t26265\nRural\t26266\n新名词\t26267\nkaishi\t26268\nunpretty\t26269\n金属波纹管\t26270\n下游\t26271\n英雄泪\t26272\ntcpcopy\t26273\n9行\t26274\nZDF\t26275\n大侦探福尔摩斯2\t26276\n4444\t26277\n2.0
|\t26278\n法语输入法\t26279\n切线\t26280\n布地奈德气雾剂\t26281\n心猿意马\t26282\n橡胶辊\t26283\ncage\t26284\n加基森\t26285\n石家庄远大白癜风医院\t26286\n杭州市行政服务中心\t26287\n40平\t26288\n孟浩然\t26289\nt70\t26290\n专坑\t26291\n三对三\t26292\n博白县\t26293\n四维图新\t26294\n若干行\t26295\n陶照阳\t26296\n2018名\t26297\n秀芹\t26298\n暢\t26299\nf185\t26300\n广州11区\t26301\n耐寒性\t26302\n大学语文\t26303\n2018年1月至4月\t26304\nSEO众包\t26305\n崔玉涛\t26306\n建新路\t26307\n微百科\t26308\n钓鱼记\t26309\n小林志玲\t26310\nyaris\t26311\nmarkov\t26312\n随机日期\t26313\n套币\t26314\n娇妻倾城\t26315\natc\t26316\n大事记\t26317\n起购\t26318\ntsne\t26319\n高新四路\t26320\n监护仪\t26321\n火车网\t26322\n高桥圣子\t26323\n美媳\t26324\n篮球馆\t26325\n西湖文化广场\t26326\n多芬\t26327\n卷草纹\t26328\n信息单\t26329\n瑞纳吧\t26330\n文件流\t26331\n自信心\t26332\n善莫大焉\t26333\n体形\t26334\nhaiyang\t26335\n白瑞德\t26336\n自由式\t26337\nwise\t26338\n单田\t26339\n谈爱\t26340\n_客集齐网\t26341\n深绿\t26342\n橙叶\t26343\ntecplot360\t26344\n脓毒性休克\t26345\n目表\t26346\n战火纷飞\t26347\n边防部队\t26348\nJetty\t26349\n四五千\t26350\n环市\t26351\nmodes\t26352\n日餐\t26353\n图拉古\t26354\n烈士谱\t26355\n鹤峰县\t26356\nvegetation\t26357\n品牌策划_营销策划公司\t26358\nRTO\t26359\n药代动力学\t26360\n玄关\t26361\n764位\t26362\n三香路\t26363\n石英晶振\t26364\nalerts\t26365\nemails\t26366\n仿真\t26367\n济南市知识产权局\t26368\ncsol2\t26369\nM16A4\t26370\nes索引\t26371\n永泰庄\t26372\n无处可\t26373\n欢女爱\t26374\n道果\t26375\n邢台站\t26376\n冬暖式\t26377\ne站\t26378\n烧蚀\t26379\n利润率\t26380\n银蕨\t26381\n元神\t26382\n李霄云\t26383\n最上层_\t26384\n新不夜城\t26385\n308s\t26386\n私力\t26387\n抗炎\t26388\n河南护理职业学院\t26389\n亚伯拉罕\t26390\ndevelopment\t26391\n重消\t26392\n行经\t26393\n小学庆\t26394\nf网\t26395\n途乐y62\t26396\n猎魔区\t26397\n8公斤\t26398\n辖内\t26399\n2806\t26400\n主景\t26401\n虚拟磁盘文件\t26402\n共混\t26403\n城段\t26404\n不劳而获\t26405\n中国烟草网络学院\t26406\n仙人掌科\t26407\n陈劲松\t26408\n行脚\t26409\n红岭社区\t26410\n双闪\t26411\n张宇\t26412\n莫利亚\t26413\n辛酸泪\t26414\n至强处理器\t26415\n长沙人才网\t26416\n微信文件传输助手\t26417\n庐山恋\t26418\n阳生\t26419\n肝吸虫病\t26420\n纸机\t26421\nArgue\t26422\ndisagree\t26423\nh13\t26424\n长白山北坡\t26425\n平安普惠投资咨询有限公司\t26426\n布鲁\t26427\n81年\t26428\n深圳东进\t26429\n智博会\t26430\n北工商\t26431\nHCE\t26432\n1118\t26433\
n六年\t26434\n和特\t26435\n有图有文\t26436\nPose\t26437\n星球杯\t26438\n适合做\t26439\n80公分\t26440\n古生物学家\t26441\n地狱边境\t26442\nCONNECT\t26443\n烹煮\t26444\n北京市环境保护监测中心\t26445\nxitek\t26446\nDetector\t26447\nFA\t26448\n丹江口水库\t26449\n第十一册\t26450\n热镀锌钢丝网\t26451\n九台农商银行\t26452\n植物篇\t26453\n一心人\t26454\n曹云\t26455\n扒炉\t26456\n侯勇\t26457\n银川市\t26458\n徐浩峰\t26459\nKafKa\t26460\n3粒\t26461\n照顾\t26462\nMemory\t26463\nreits\t26464\nc80\t26465\n腾讯软件中心\t26466\ndefer\t26467\n有点儿\t26468\nTETRA\t26469\n强迫性\t26470\nhtri\t26471\n求配\t26472\n讲练\t26473\n大气\t26474\n洛霞\t26475\n刺五\t26476\n薄熙来\t26477\n命途多舛\t26478\n两家\t26479\n暗角\t26480\n互相\t26481\n奇奇颗颗历险记\t26482\n刘宝林\t26483\n王国风云2\t26484\n鸭蛋\t26485\n猪鼻\t26486\n吴建豪\t26487\n帝龙\t26488\n帕德玛大桥\t26489\n普拉提\t26490\n中专中职技校\t26491\nMetaphors\t26492\n砂石路\t26493\n焐\t26494\n章_\t26495\nfitear\t26496\n0x00000bcb\t26497\n十一郎\t26498\n小米随身WiFi\t26499\n限定版\t26500\nMood\t26501\n三炷\t26502\n广州地化所\t26503\n半面\t26504\n11起\t26505\n买得起\t26506\n普阳\t26507\n74万\t26508\n后桥\t26509\n数千万元\t26510\n2688\t26511\n人道主义援助\t26512\n20项\t26513\n中国交通银行\t26514\n林和平\t26515\n爱情银行\t26516\n深圳地铁4号线\t26517\n天津高新区\t26518\n真理性\t26519\nahk\t26520\n广州地税\t26521\nPowered\t26522\n龚楚\t26523\ntopeleven\t26524\n永猎双子\t26525\n啪啪游戏厅\t26526\n常村\t26527\n溢\t26528\npapa\t26529\n金步国\t26530\n观沙岭\t26531\n陆风\t26532\nMagento\t26533\n表体\t26534\n复方氟米松软膏\t26535\n曹仲\t26536\n高风\t26537\nBegins\t26538\nMolecular\t26539\ngave\t26540\nIETester\t26541\n透风\t26542\n运营期\t26543\nConceptual\t26544\n联合国妇女署\t26545\n俩娃\t26546\n颧\t26547\n耳尖\t26548\nxw素材网\t26549\n新宿御苑\t26550\nGCC\t26551\n马远\t26552\n这幅画\t26553\n叶熙祺\t26554\n新丰村\t26555\n100分\t26556\n第17\t26557\n致电\t26558\negpu\t26559\nxendesktop\t26560\n连胜文\t26561\n律操\t26562\n来威漆\t26563\n大杭州\t26564\n游船\t26565\n可乐操\t26566\n水泵接合器\t26567\nk+1\t26568\n千集\t26569\n丰登\t26570\n欣啪啪\t26571\n蓓蕾\t26572\nAdhesive\t26573\nbianchi\t26574\n白岩郭沫若\t26575\n138.com\t26576\n水钻头\t26577\n复合维生素B片\t26578\n莎拉布莱曼\t26579\n兰州牛肉面\t26580\n信自己\t26581\n新东方优能\t26582\nLives\t26583\nAbby\t26584\n李礼辉\t26585\n唱和\t26586\n袋装\t26587\n运动衣\t26588
\nOncotarget\t26589\npdz\t26590\n蓬莱长岛\t26591\nseduction\t26592\n贡献度\t26593\n孔融让梨\t26594\n143\t26595\n宜人贷吧\t26596\nmitchell\t26597\n巡行\t26598\n气雾剂\t26599\n中信银行深圳分行\t26600\n抚玉\t26601\nfifamobile\t26602\n铜冠花园\t26603\n贝拉\t26604\n四性\t26605\nLinphone\t26606\n苗条\t26607\n草科\t26608\n中共安徽省委\t26609\n莲花座\t26610\n大寨路\t26611\n许君\t26612\n续交\t26613\n4624\t26614\nseaside\t26615\nPharmaceuticals\t26616\n莉香\t26617\nZHK\t26618\n刘玉翠\t26619\nMaru\t26620\n小海豚\t26621\n岗员\t26622\n童音\t26623\nteamviewer\t26624\nMachines\t26625\nNTU\t26626\n娜露\t26627\n复学\t26628\n1885\t26629\n海王类\t26630\n44号\t26631\n环境工程学院\t26632\n刘杰\t26633\n徐政\t26634\nAvengers\t26635\n杯形\t26636\nmetis\t26637\n规范性\t26638\n保鲜柜\t26639\n最丰富\t26640\nIssued\t26641\n不在意\t26642\n西海网\t26643\n1半\t26644\n监誓\t26645\nbos\t26646\n一句一个\t26647\n许喵喵\t26648\n第一张\t26649\n元斌\t26650\n随着\t26651\n郑克鲁\t26652\n红跑车\t26653\n17处\t26654\n反射膜\t26655\n陆续\t26656\n13码\t26657\n加火\t26658\nhierarchy\t26659\n镜泊湖\t26660\ntua\t26661\n杭州服装公司\t26662\n耳鼻咽喉头颈外科\t26663\n32名\t26664\n招标代理公司\t26665\n曹诚模\t26666\n你最好\t26667\n合成革\t26668\n奶气\t26669\n福建省质量技术监督局\t26670\n套管机\t26671\ncdrx7\t26672\n炼乳\t26673\nmemorystream\t26674\n围住\t26675\nxianshi\t26676\n近天\t26677\n自讨苦吃\t26678\n耽溺\t26679\n王寒铁\t26680\n箬笠\t26681\n山寨机\t26682\n淘需\t26683\n外卖员\t26684\n北京理工大学管理与经济学院\t26685\n尼特利\t26686\n中国给水排水\t26687\n塑料夹\t26688\n杜洪波\t26689\n融园\t26690\n154\t26691\n指导意见\t26692\nSanger\t26693\n2016上半年\t26694\n逐步\t26695\n黄金叶天叶\t26696\n13例\t26697\n上海教育新闻网\t26698\nMIJIA\t26699\n紫枫\t26700\n行包\t26701\n云税贷\t26702\n保利堂\t26703\n马苏里拉奶酪\t26704\n古亭\t26705\n窝窝团\t26706\n500首\t26707\n返臭\t26708\n浅海\t26709\nRichText\t26710\nTikZ\t26711\n安妮花\t26712\n芳草园小学\t26713\nmfc42d\t26714\n文具\t26715\n438号\t26716\n点军\t26717\n14.04\t26718\n25km\t26719\ndcb3688\t26720\n七十多岁\t26721\n普通话版\t26722\n非固化橡胶沥青防水涂料\t26723\n寻觅\t26724\nl800\t26725\n合肥政府网\t26726\n云者\t26727\nCasio\t26728\nbeating\t26729\n钕\t26730\n刻石\t26731\n货运险\t26732\n天达\t26733\n清华出版社\t26734\nST云网\t26735\n涨停潮\t26736\n林采缇\t26737\n深圳市委员会\t26738\n剑川\t26739\nKin\t26740\n四都镇\t26741\n胶卷\
t26742\n我在碧桂园的1000天\t26743\n要务\t26744\n长幼\t26745\n急性结膜炎\t26746\nAmoeba\t26747\n丛台\t26748\n9名\t26749\n采茶\t26750\n达道\t26751\n新埭镇\t26752\n老片子\t26753\n王极\t26754\n许昌东站\t26755\n巨鳄\t26756\nsingularity\t26757\n广船国际\t26758\nRET\t26759\n毒疮\t26760\n绩\t26761\n宋简\t26762\n拆分版\t26763\n位域\t26764\n远征\t26765\n罗柏\t26766\nfilewriter\t26767\nxelatex\t26768\n河北省卫生和计划生育委员会\t26769\n笔架山\t26770\n阴阳师道馆\t26771\n记忆力\t26772\nkb2048绳艺网\t26773\nSCR脱硝催化剂\t26774\n亲爱的小孩\t26775\n猫语\t26776\n白油\t26777\n王充\t26778\nNanjing\t26779\n领尚\t26780\n台钟\t26781\nDBR\t26782\n笑东风\t26783\n四室\t26784\n爱奇艺万能播放器\t26785\n交通员\t26786\nu-blox\t26787\n3.4.9\t26788\n三国如龙传\t26789\n孔乙己\t26790\njdango\t26791\n美联\t26792\n膨胀螺丝\t26793\n易贷网\t26794\n30亿_\t26795\n侍刃\t26796\nBuuren\t26797\n迈皋桥街道\t26798\n长江国际音乐节\t26799\n唵嘛呢叭咪吽\t26800\nvacation\t26801\n宏运\t26802\nipv\t26803\n方思明\t26804\n菜柜\t26805\n荣威光之翼\t26806\n2000kg\t26807\n轮扣式\t26808\n2000分钟\t26809\n_帝\t26810\nccu\t26811\n东营经济技术开发区\t26812\n陆贾\t26813\nT30\t26814\n福建省高级人民法院\t26815\n第八号\t26816\n生而知\t26817\n源源\t26818\n范冰郭沫若\t26819\n三羟甲基丙烷\t26820\n嫡谋\t26821\n庭生\t26822\n精装本\t26823\n良田高拍仪_高拍仪\t26824\nedi\t26825\n气筒\t26826\n南华大学附属第二医院\t26827\n绿鬼\t26828\n心塞\t26829\n精气\t26830\n1366\t26831\n斯丹德\t26832\n巨片\t26833\n奔驰4s店\t26834\n_地下城与勇士吧\t26835\n3655xxxx\t26836\n文件转换器\t26837\n废土英雄联盟之传奇正盛\t26838\n午觉\t26839\n贝乐\t26840\nv5.1.1\t26841\n波光\t26842\n姜堰中学\t26843\n优渥\t26844\n林娘子\t26845\n苏媚\t26846\nAeroflot\t26847\n分分钟\t26848\n小拉车\t26849\n什邡\t26850\n入山\t26851\n科鲁\t26852\n金佑\t26853\nappointed\t26854\n李\t26855\n国常\t26856\n罗\t26857\n技惊\t26858\nHDFS\t26859\n密钥库\t26860\n泌阳\t26861\n拖运\t26862\n会员\t26863\n字面值\t26864\n40例\t26865\n建议书\t26866\nKors\t26867\n扁管\t26868\npokemongo\t26869\n中文学\t26870\n武昌起义纪念馆\t26871\nDVD版_影音先锋\t26872\n混伤\t26873\nscx3401\t26874\nSeajs\t26875\nPassing\t26876\n无删\t26877\n位块\t26878\n曲小檀\t26879\ndelphi\t26880\n隔声量\t26881\n跪服\t26882\n逍遥殷\t26883\n14R\t26884\n桂圆红枣\t26885\n开司\t26886\n查询\t26887\n小房\t26888\n张小白\t26889\n锐锋\t26890\n星巴克咖啡豆\t26891\n李正\t26892\n妒火\t26893\n会展经济与管理专业\t26894\n重庆市知识产权局\t26895\n十
九代\t26896\n定销\t26897\n金度\t26898\n汉字字源网\t26899\n猎人谷\t26900\niconx\t26901\n乘胜\t26902\n铃兰\t26903\nYOKA\t26904\n字间距\t26905\nebk3\t26906\n柳梦梅\t26907\n零售招聘网\t26908\n骑神\t26909\n19项\t26910\n炮台镇\t26911\n朝阳银行\t26912\n白珠\t26913\n上海户口网\t26914\n自治区商务厅\t26915\n花生粕\t26916\n0523\t26917\n嘉峪关市\t26918\n何浩文\t26919\n失之交臂\t26920\n花铺\t26921\n深圳第二外国语学校\t26922\n千分之三\t26923\ncbr300\t26924\n树边\t26925\n鸽子蛋\t26926\ndl380\t26927\n银\t26928\n黑机\t26929\n局\t26930\n格力中央\t26931\n传智播客和黑马程序员视频库_传智播客和黑马程序员\t26932\n15672\t26933\n描边\t26934\n心包炎\t26935\nLUMINOR\t26936\npapers\t26937\n一片\t26938\n姚红\t26939\n爱玛夫人\t26940\nZXR10\t26941\n招标投标信息网\t26942\n大罪\t26943\n转移性\t26944\n陕西理工大学\t26945\n5日\t26946\n田径队\t26947\n牛肉馅\t26948\n维胺酯胶囊\t26949\n画家们\t26950\n磨皮插件\t26951\n陈四清\t26952\n热力学定律\t26953\n17期\t26954\n广西日报\t26955\n岳麓大道\t26956\n唯乐\t26957\n典妻\t26958\n流通业\t26959\n脂溢性角化病\t26960\n德林\t26961\n调整型\t26962\n破乳\t26963\nneaker\t26964\n第227集\t26965\n延\t26966\n长洲区\t26967\n三分之二\t26968\n云南省工商局\t26969\n戴云\t26970\n不幸福\t26971\n攒够\t26972\n7.16\t26973\n如沐春风\t26974\n中粮前滩\t26975\nBlunt\t26976\n河堤\t26977\n东美\t26978\nhttpcore\t26979\n侨梦苑\t26980\n柯蓝\t26981\n德汉\t26982\n鸿钧老祖\t26983\nPANAMA\t26984\n百威英博\t26985\n可气\t26986\n勃鲁盖尔\t26987\n会声会影8\t26988\nPoisson\t26989\n捷锐\t26990\nstarccm\t26991\n二聚体\t26992\n国开证券\t26993\n中华论坛\t26994\n水磨机\t26995\nQuota\t26996\n梧桐郡\t26997\ncmos\t26998\nalain\t26999\n第610集\t27000\n逃狱\t27001\n蔡颖\t27002\nlibphp\t27003\n整户\t27004\n61天\t27005\n亚比\t27006\n2017年9月22日\t27007\n领御\t27008\n创文\t27009\nsulphate\t27010\n26点\t27011\n袖珍罐\t27012\nwin12\t27013\n20150104\t27014\nCoM\t27015\nMagnum\t27016\n互救\t27017\n雾霭\t27018\n驳壳枪\t27019\n慈恩\t27020\n说说看\t27021\n古战场传奇\t27022\n一亿\t27023\nip反查\t27024\n118平\t27025\nexceptional\t27026\n5640\t27027\n北京华联\t27028\n东方商厦\t27029\n艾柏丰胸茶\t27030\npdf电子书\t27031\n万达文旅城\t27032\n南郊公园\t27033\n低压阀\t27034\n行政处罚权\t27035\n古翠路\t27036\n万彩城\t27037\n冥夫\t27038\n2018045期\t27039\n帝亚吉欧\t27040\n石榴婆\t27041\n雪山\t27042\n荒蛮故事\t27043\n玲珑府\t27044\n郑在\t27045\n七氟丙烷灭火装置\t27046\n佳士得\t27047\n下库\t27048\n网易帮助中心\t27049\n周志鹏\t27050
\n唐丹\t27051\n通州新城\t27052\n十字军之王2\t27053\n摇号网\t27054\nLM339\t27055\nCSgo\t27056\n20150606\t27057\n俊峰\t27058\n古铜色\t27059\nSpotlight\t27060\n入柜\t27061\n博士伦隐形眼镜\t27062\n三胎\t27063\n贵姓\t27064\nnodes-51CTO\t27065\n神农顶\t27066\n必康股份\t27067\n班房\t27068\n做学问\t27069\n贝灵顿梗\t27070\ns40\t27071\n38米\t27072\n网题库\t27073\nTLC\t27074\n山水图\t27075\n世界之大\t27076\n舒驰\t27077\n南顺\t27078\n英雄互娱\t27079\n红英\t27080\narabic\t27081\n道孚县\t27082\n云养猫\t27083\n魔精\t27084\n20151217\t27085\n中划线\t27086\n大宁路街道\t27087\n124号\t27088\n庄心妍\t27089\n安硕信息\t27090\n恒大广场\t27091\n畅玩7X\t27092\n要是你在野外迷了路\t27093\n6090\t27094\n坡\t27095\n迭起\t27096\n锦绣家园\t27097\n靠山吃山\t27098\nForeo\t27099\n家公司\t27100\n于右\t27101\n三圆\t27102\n自动进样器\t27103\npri\t27104\n悲惨白百何\t27105\n福建省厦门第一中学\t27106\n苏州印象城\t27107\n郡主\t27108\nAnima\t27109\n区规划局\t27110\n泰山新村\t27111\n雪佛兰创酷\t27112\n抗美援朝纪念馆\t27113\nsigmoid\t27114\n女狙击手\t27115\n95号\t27116\n500cc\t27117\nWin7_\t27118\nbbi\t27119\n学会提问\t27120\n5日内\t27121\n美丽的神话\t27122\n诉讼离婚\t27123\n杂粮\t27124\n青岛市委\t27125\n拉绳\t27126\n汗\t27127\n小游\t27128\n市场报\t27129\n西电通院\t27130\n战利品\t27131\n旱雪\t27132\n人氏\t27133\n外上\t27134\nexe\t27135\n天木\t27136\n脑血管造影\t27137\n归顺\t27138\nF600\t27139\n方洲\t27140\n亲兵\t27141\n浣\t27142\n旋铆机\t27143\n惨案\t27144\n草机\t27145\n养蜂人\t27146\n邂\t27147\n维果斯基\t27148\n福园\t27149\nchiang\t27150\n第二十一届\t27151\n几亩\t27152\n智科\t27153\n侯沧海商路\t27154\n北京公立幼儿园\t27155\n面板数据模型\t27156\n食用菌商务网\t27157\n磷酸铁锂电池\t27158\n诺伊佩拉\t27159\n越王勾践剑\t27160\nFINANCE\t27161\n宁南山\t27162\ninsmod\t27163\n危地马拉\t27164\n转至\t27165\n撕逼大战\t27166\n九十年\t27167\n广州博物馆\t27168\n鲜竹笋\t27169\n犹太人大屠杀\t27170\n交通局\t27171\nIAS\t27172\n实报\t27173\n毒系\t27174\nRNS510\t27175\nMASM\t27176\n火火兔儿歌\t27177\n黎华\t27178\n钻井机\t27179\n吃一堑长一智\t27180\n无翼之鸟\t27181\n艾瑞网\t27182\n人工智能\t27183\ne560\t27184\n3405\t27185\n脾肾\t27186\n清除\t27187\n誉峰\t27188\n天猫Tmall.com\t27189\n木铲\t27190\n安贞桥\t27191\nexFAT\t27192\n列国志\t27193\n综合治理\t27194\n国富民强\t27195\n采样器\t27196\n上海医院\t27197\n健康城\t27198\n2014a\t27199\n和平大厦\t27200\n基耶斯洛夫斯基\t27201\nPS3/PS4\t27202\n竹叶青酒\t27203\n4610\t27204\nFetish\t27205\nscan
\t27206\n赫拉克利特\t27207\n奥丁格\t27208\n22件\t27209\n流行歌手\t27210\n刁镇\t27211\n第4种\t27212\n高黎贡山\t27213\n导航\t27214\n7542\t27215\n匆\t27216\n校徽\t27217\n肌肉\t27218\n怦怦\t27219\n皈依证\t27220\n辱国\t27221\nupf\t27222\n过寿\t27223\n27p\t27224\n乙肝表面抗体定量\t27225\n政民\t27226\ndisappear\t27227\ngreenplum\t27228\n三义机场\t27229\n移苗\t27230\ncyborg\t27231\nESB\t27232\n0.4元\t27233\n饲养\t27234\n运动鞋\t27235\n来讲一讲\t27236\n开怀大笑\t27237\nBarber\t27238\n战火\t27239\n42卷\t27240\n钳工\t27241\nGothic\t27242\n肉干\t27243\nquan\t27244\nメス\t27245\nyuanss\t27246\n王志远\t27247\n老毛子\t27248\nMSc\t27249\n常住地\t27250\n二周岁\t27251\nB超\t27252\n波尔山羊\t27253\n硕\t27254\n大团结在线\t27255\n称雄\t27256\n十五元\t27257\n蔡崇达\t27258\n24cm\t27259\n6.3.2\t27260\n法语学习\t27261\n战舰世界战列舰\t27262\n张清一\t27263\nguonei\t27264\n7008\t27265\n意风\t27266\n淑蓉\t27267\n灵剑尊\t27268\neducated\t27269\n发明者\t27270\n村寨\t27271\nketo\t27272\n相隔\t27273\n画画画\t27274\n生态文学\t27275\n38万元\t27276\n周小川\t27277\ncorsa\t27278\nECDSA\t27279\n平成\t27280\n一量\t27281\n最终\t27282\n火土\t27283\n圣西罗\t27284\nmug\t27285\n乖乖女\t27286\n南京小学\t27287\n贞洁\t27288\n木色\t27289\n道义\t27290\n无线上网卡\t27291\n比亚迪唐论坛\t27292\n六月婷婷\t27293\n国家税务总局稽查局\t27294\n开户\t27295\n漫无\t27296\nXL论坛\t27297\n涂色书\t27298\n15回\t27299\n掉线\t27300\n直销公司\t27301\ndiskpart\t27302\n宁波市人民政府农村工作办公室\t27303\n启夫微安\t27304\n建筑工业\t27305\nANNUAL\t27306\n通路图\t27307\n导热硅胶片\t27308\n时间\t27309\nRMA\t27310\nrust\t27311\nticker\t27312\n周武\t27313\n9周\t27314\n傈僳族\t27315\n南京移动\t27316\nController\t27317\n细味\t27318\n七麦\t27319\n烟蒂\t27320\n好故事\t27321\nryo\t27322\n终极英雄\t27323\n上海国际经济贸易仲裁委员会\t27324\n猛龙\t27325\nProtect\t27326\nhgg\t27327\n小诺\t27328\n创幻\t27329\n没病\t27330\n自驾车\t27331\nint32\t27332\n2.8元\t27333\n不时之需\t27334\ngsa\t27335\nSEE\t27336\ncarey\t27337\n透气\t27338\n社员\t27339\n季铵盐\t27340\n大兄\t27341\n安装式\t27342\n于右任\t27343\n黑丝\t27344\n魏东亭\t27345\n城关中学\t27346\n91WAN.COM\t27347\n44分\t27348\n夜魇\t27349\nFund\t27350\n月城镇\t27351\n为题\t27352\n弹簧管\t27353\n启蒙运动\t27354\n荣耀群\t27355\n大意题\t27356\n横琴人寿\t27357\n猪肉\t27358\n华年\t27359\n创意网\t27360\n春语\t27361\n凤翔路\t27362\n赵集镇\t27363\nfoods\t27364\
n上海药物所\t27365\nRemote\t27366\n鲤鱼竿\t27367\nXYplorer\t27368\n叩富\t27369\n斧\t27370\n女童星\t27371\n闲鱼号\t27372\n玉峰\t27373\n重达\t27374\n奥迪\t27375\n相反\t27376\n灰信网\t27377\nStatue\t27378\n拆图\t27379\n343\t27380\n妄图\t27381\n埔\t27382\n罗马全面战争吧_\t27383\n月费\t27384\n千点\t27385\n曝光台-【网贷之家论坛\t27386\n福建政府\t27387\n不定项\t27388\n果博\t27389\n往西\t27390\n北京华尔道夫酒店\t27391\n前道\t27392\n白玉京\t27393\n解放广场\t27394\n自然年\t27395\nChoices\t27396\nIBANEZ\t27397\n浅盘\t27398\nbarney\t27399\n武警部队\t27400\n一枝\t27401\n挤出来\t27402\n大\t27403\n徂徕山\t27404\nout\t27405\n二十多岁\t27406\n北京新闻\t27407\n查清\t27408\n兄弟姐妹们\t27409\n北海市人民政府\t27410\n汉祚世\t27411\n喷壶\t27412\n吃吃吃\t27413\n足浴城\t27414\n国家质检总局\t27415\n秀_\t27416\n西安城墙\t27417\n泰州市\t27418\n2018.04.01\t27419\n1024xp\t27420\n电容笔\t27421\n8.4\t27422\n洪晃\t27423\n仿宋\t27424\n对账单\t27425\n纠缠\t27426\n华胥\t27427\n修罗\t27428\ntextscan\t27429\n0.77\t27430\n郭村镇\t27431\nvista\t27432\n安定镇\t27433\n4.15\t27434\n赵乐秦\t27435\n天黑黑\t27436\nenlightenment\t27437\n爱奇艺泡泡圈\t27438\n二维点\t27439\n大槐树\t27440\n罗汉竹\t27441\n深圳大学经济学院\t27442\nLinQ\t27443\n1.03H\t27444\n100ah\t27445\n醒醒\t27446\nLINING\t27447\n002001\t27448\n101位\t27449\nWINCE\t27450\n重义\t27451\nagnes\t27452\n李多喜\t27453\n日购\t27454\n第三十九条\t27455\n新疆五家渠\t27456\n阔腿\t27457\n华丽丽\t27458\n江心屿\t27459\n炮弹\t27460\n熬煮\t27461\n卡尔冈\t27462\n男高音\t27463\n一企\t27464\n每一\t27465\n挑大梁\t27466\n雷总\t27467\n1600x\t27468\n糖尿病网\t27469\n炫舞非人学园\t27470\n孙娟\t27471\n焦洪昌\t27472\n退股\t27473\n杨二车娜姆\t27474\n真货\t27475\n降龙爪爪\t27476\n羽皇\t27477\n电影群\t27478\n57年\t27479\n道貌\t27480\nV3.6\t27481\n定职\t27482\nhudson\t27483\nContentType\t27484\n滚水坝\t27485\n8109\t27486\n处女膜\t27487\n200间\t27488\n品势\t27489\n拉氧头孢钠\t27490\n记过处分\t27491\n荣锋亮\t27492\n徐乃麟\t27493\n12分米\t27494\n21周年\t27495\n华润中心悦府\t27496\n斯摩格\t27497\nthb\t27498\n漫步人生路\t27499\n掸子\t27500\nrj11\t27501\njournalist\t27502\nstringbuilder\t27503\nTicwatch2\t27504\n华阳国志\t27505\n参考\t27506\n21条\t27507\n哑鼓\t27508\n王者荣耀苏烈\t27509\nmoons\t27510\nshiyan\t27511\n双组\t27512\n纸厂\t27513\n红旗水库\t27514\n鉴别诊断\t27515\n耐腐蚀性\t27516\n代表会议\t27517\n汤池\t27518\n东贝\t27519\nRelatio
nship\t27520\nACFUN\t27521\n呐_\t27522\n公鹅\t27523\n郭小川\t27524\n莞高速\t27525\nArcMap\t27526\nbcr\t27527\n破势\t27528\n爱儿美\t27529\n防逃\t27530\n20160811\t27531\n枪神威驰\t27532\ncc3200\t27533\n变更\t27534\n吊带\t27535\n环氧自流平地坪\t27536\n实有\t27537\n圆明园遗址公园\t27538\n棱长\t27539\n紧密\t27540\n文字号\t27541\n外频\t27542\n故事化\t27543\nxe10\t27544\n车友圈\t27545\n远期利率协议\t27546\n22元\t27547\n元婴\t27548\n第55号\t27549\n艾利特\t27550\n20160927\t27551\n24部\t27552\nMilano\t27553\n10865\t27554\n咪咪头\t27555\nmiui5\t27556\n总传热系数\t27557\n冰雕\t27558\n沙家浜风景区\t27559\ndaemontools\t27560\n松龄\t27561\n江洪\t27562\n春笋\t27563\n鬼灯\t27564\n得而复失\t27565\n娘胎\t27566\n武士服\t27567\n杨泽\t27568\nes7\t27569\n一氧\t27570\n迷彩裤\t27571\n中国人民解放军进行曲\t27572\n深信服AC\t27573\n洗浴中心地址\t27574\n试析\t27575\n多级子\t27576\nlayoutit\t27577\n两子\t27578\n第六十二章\t27579\n太阳灶\t27580\n皮卡\t27581\n国富\t27582\n中外文\t27583\n电动执行器\t27584\nligergrid\t27585\n禁词\t27586\nSMU\t27587\nTVs\t27588\n茂名石化\t27589\nstrangers\t27590\n机电科\t27591\n劳斯判据\t27592\n阉猪\t27593\n桃源县\t27594\n阿卡林\t27595\ndido\t27596\n封面\t27597\n禁忌恋\t27598\n1方\t27599\nbáo\t27600\nm男\t27601\nAdvisors\t27602\n3.6.3\t27603\n两栖车\t27604\n丰田汽车\t27605\nhexun\t27606\n孟烦\t27607\n神姬\t27608\n泷泽萝拉\t27609\n转暖\t27610\n叹服\t27611\n奶油田\t27612\nEviews\t27613\n20ml\t27614\n大切诺基论坛\t27615\nyxlady\t27616\n枕式包装机\t27617\n500_\t27618\nVERSUS\t27619\n九年级第一次\t27620\n林宝金\t27621\n月经期\t27622\n这回\t27623\n公司战略与风险管理\t27624\n6125\t27625\n雷蛇\t27626\n调课\t27627\n钓鱼\t27628\n心理治疗师\t27629\n洛阳钼业\t27630\n第十五卷\t27631\n女篮\t27632\narr\t27633\n瓯海中学\t27634\n4754-2017\t27635\n广州电视台\t27636\n同治通宝\t27637\n李傕\t27638\n186号段\t27639\n威格\t27640\n二氧化氮\t27641\n泉彩\t27642\ncataclysmdda\t27643\n局中局\t27644\nzhuli\t27645\n大作为\t27646\nRocken\t27647\n43关\t27648\n旋翼机\t27649\n制订\t27650\n丢脸\t27651\n一模一\t27652\n10.1.0\t27653\n一键包\t27654\n最初\t27655\nReyes\t27656\n橙光游戏吧\t27657\n家伙\t27658\n北京市天元律师事务所\t27659\n不懂得\t27660\nglibc\t27661\n锌锰干\t27662\njade模板\t27663\n欧亚经济联盟\t27664\n米机\t27665\nprefab\t27666\nK30\t27667\nhi3531\t27668\noracle数据库\t27669\n税前收入\t27670\n道外区\t27671\n安广\t27672\n汉化包\t27673\n知己者\
t27674\n睾丸癌\t27675\n学院\t27676\n武清开发区\t27677\n飘眉\t27678\n棠湖\t27679\nkkk\t27680\n横断山\t27681\n嗬\t27682\n141号\t27683\n女神异闻录3\t27684\n铒\t27685\n真户\t27686\n六安市人民政府\t27687\n肢体语言\t27688\n复旦大学数学科学学院\t27689\n韩语版\t27690\n春光\t27691\n苏水\t27692\ntempdb\t27693\nUKEY\t27694\n儿子性\t27695\n科兴科学园\t27696\n红药水\t27697\n顾老师\t27698\n三优\t27699\n东丽\t27700\n龙鼎\t27701\ns905x\t27702\n李忠民\t27703\n微信读书电脑版\t27704\n法人独资\t27705\n龙年\t27706\n中兴百货\t27707\niPanda\t27708\n黄州快哉亭记\t27709\n手元\t27710\n坨屎\t27711\n钩子\t27712\n朱华\t27713\nArmada\t27714\nfaceid\t27715\n梦的解析\t27716\n55亿\t27717\n佩兰\t27718\nrpgvx\t27719\n11a\t27720\n台商投资区\t27721\n十堰市委\t27722\n第二道\t27723\ncaas\t27724\n辉瑞公司\t27725\n六金\t27726\n英尔健\t27727\n进水口\t27728\nzookper\t27729\n5.0.1\t27730\n大辞典_辞海_新华字典\t27731\n电波\t27732\n算帐\t27733\n广渠路东延\t27734\n人行\t27735\n爱情故事\t27736\n蔡恒\t27737\nencyclopedia\t27738\nenctype\t27739\n模方\t27740\n桃花溪\t27741\n彼得森\t27742\n何许\t27743\n多情江山\t27744\n逃生\t27745\n杭州便民网\t27746\n阳陵泉\t27747\n苏塔\t27748\n360吧\t27749\n腾讯大楚网\t27750\n双极\t27751\n去掉头\t27752\n制砖机\t27753\n百合野\t27754\nCOUPE\t27755\ni1\t27756\nServer数据库\t27757\n样条曲线\t27758\n四十年后\t27759\n磨浆机\t27760\ngashina\t27761\n前门情思大碗茶\t27762\n虚开发票\t27763\netm\t27764\n梅里\t27765\nUNESCO\t27766\nSaaS公司\t27767\n传统酒店\t27768\nalt+tab\t27769\nFlags\t27770\n娄底市\t27771\nOnboard\t27772\n毒誓\t27773\n降频门\t27774\n地弹门\t27775\n炎阳\t27776\n2014年8月\t27777\n20问\t27778\n正统网\t27779\n天工社\t27780\n前台\t27781\n贸易壁垒\t27782\n1409\t27783\n喰种\t27784\n竹篱笆\t27785\n玉米粒\t27786\n早上七点\t27787\n多职\t27788\nexml\t27789\n唐宗\t27790\ninsurgency\t27791\n浦南医院\t27792\n淮树\t27793\n隐藏者\t27794\n促增\t27795\n保学\t27796\nnutrition\t27797\n徐欢\t27798\nBluestacks\t27799\n阳道\t27800\n美食大冒险\t27801\nPokeMMO\t27802\n一建万\t27803\nrogers\t27804\n活素\t27805\n友达光电\t27806\n大话红娘\t27807\n第四届\t27808\n舒蕾\t27809\nHiroshima\t27810\n填下\t27811\n百度浏览器\t27812\n卫生学校\t27813\nmla\t27814\n青岛奥数网\t27815\n吱吱\t27816\n聚成\t27817\ne9\t27818\n冰冠\t27819\nla\t27820\nToolBox\t27821\nluxury\t27822\n工业交换机\t27823\n被迫\t27824\n威旺\t27825\n菜式\t27826\n蝉女\t27827\n2016年11月15日\t27828\n大董\t27
829\n过少\t27830\n血府逐瘀汤\t27831\n万象优图\t27832\n韩孝珠\t27833\n20170103\t27834\n子子\t27835\nPhison\t27836\n印\t27837\n狠操\t27838\n魔兽世界人口普查\t27839\n杜超\t27840\n祈福医院\t27841\n刘红\t27842\n枝上\t27843\n血清总胆固醇\t27844\n摄影集\t27845\nstringify\t27846\n飞临\t27847\n2015年五一\t27848\nCERN\t27849\nEDN\t27850\n李文玲\t27851\n工大附中\t27852\nMatlab2014a\t27853\n大富大贵\t27854\n潍坊西路\t27855\n河北移动\t27856\n无毒\t27857\n张亚平\t27858\n尤加利\t27859\n取酬\t27860\n胡宗南\t27861\nCleef\t27862\n号数\t27863\n女神异闻录4\t27864\n各室\t27865\n几篇\t27866\n恒大御峰\t27867\n高佳敏\t27868\n土家\t27869\n专一性\t27870\n阴兵\t27871\n关联宝贝\t27872\n高血钾\t27873\nmoon\t27874\n法治政党\t27875\n最后一作\t27876\n大闽\t27877\n不懈\t27878\n干燥管\t27879\n临港产业园\t27880\n雁栖\t27881\n各学派\t27882\n二级公路\t27883\n失落大陆\t27884\n清安\t27885\n禽\t27886\n张钰琪\t27887\n宗庆后\t27888\n输出装\t27889\n城厢街道\t27890\n中头\t27891\n焦头烂额\t27892\n新能源汽车产业\t27893\n50.0000元\t27894\n湿毒清胶囊\t27895\n孙艺\t27896\n矍铄\t27897\n应力波\t27898\n普洛菲斯\t27899\nc11\t27900\n桐洲岛\t27901\nanzhuo\t27902\n远程视频监控\t27903\n易信公众号\t27904\n景溪北苑\t27905\n吉木萨尔县\t27906\n刊\t27907\n为数\t27908\n足总杯\t27909\n黄白爆\t27910\nYestar\t27911\n日料控\t27912\nAtomicInteger\t27913\n一秘\t27914\niC\t27915\n54亿\t27916\n广州人才网\t27917\n王兵\t27918\n陆路\t27919\n四月四日\t27920\n球虫病\t27921\n白丝\t27922\n风道\t27923\n胡歌\t27924\ntert\t27925\nVMDK\t27926\n982\t27927\n南湖大桥\t27928\n驽\t27929\n配子\t27930\nwuji\t27931\n多乐士墙面漆\t27932\n调研类\t27933\n伞齿\t27934\n元一\t27935\n奥拉罗尤\t27936\n王真\t27937\nOKB\t27938\n为名\t27939\nMA2\t27940\n2005年10月\t27941\n大庆西站\t27942\n柳南区\t27943\n科德宝\t27944\n张政\t27945\n黑麋峰\t27946\npsd_千库网\t27947\n卡布西游\t27948\n盐酸二甲双胍肠溶片\t27949\n风趣\t27950\n警报声\t27951\na版\t27952\nv7.2.0\t27953\n街上\t27954\n省通信管理局\t27955\n好神途发布网\t27956\npp手机助手\t27957\n刘平\t27958\n百意\t27959\n雷蒙\t27960\n10款\t27961\n麻胀\t27962\n卸\t27963\n精艺\t27964\n凤凰娱乐\t27965\nlitter\t27966\n皮巾\t27967\n天之杯\t27968\n缓动\t27969\n小心谨慎\t27970\n连锁店\t27971\n制程\t27972\n吴侬\t27973\n驻校\t27974\n骨汤\t27975\n利隆\t27976\n寄语\t27977\nsublimetext\t27978\n1万6\t27979\n伊能静\t27980\n现房\t27981\n海南自由贸易试验区\t27982\n麦肯锡公司\t27983\nverification\t27984\n骨牌\t27985\n清炖\t27986\n创新公司\t2798
7\n罗甸县\t27988\noverhit\t27989\n夏都\t27990\n死光\t27991\nGALGAME\t27992\n留情\t27993\n新福田\t27994\ncaifu\t27995\n蜱\t27996\n台湾辅仁大学\t27997\n第一百二十一章\t27998\n钢盾\t27999\nkosmos\t28000\n魔铃\t28001\n温煦\t28002\n作出\t28003\n沥青罐\t28004\n刘清扬\t28005\nKato\t28006\n礼节性\t28007\n龙井市\t28008\n诛仙剑\t28009\n孙丕恕\t28010\n泡制\t28011\nMACBOOKPRO\t28012\ni58600k\t28013\n无恶不作\t28014\n豆妈\t28015\n十里琅珰\t28016\nP2P网贷平台\t28017\n鸡架\t28018\n单核细胞绝对值\t28019\n灵能百分百\t28020\n第十一套\t28021\n云南省水利厅\t28022\n养尊处优\t28023\nbattlebots\t28024\nR21\t28025\n67.5\t28026\n周丽华\t28027\n中资方\t28028\n频头\t28029\n给类\t28030\n空冷\t28031\n公证摇号\t28032\n惠州报业传媒集团\t28033\n河南中医药大学\t28034\n差分电荷\t28035\n八国\t28036\n江门市蓬江区政府\t28037\n防晒霜\t28038\n依存句法\t28039\n合买\t28040\nM7400\t28041\nWORKS\t28042\n3罐\t28043\nbatch\t28044\n军检\t28045\n宋宋\t28046\n3.8节\t28047\n临泉在线\t28048\n福建省科学技术协会\t28049\n尿液分析仪\t28050\n温州小学\t28051\n矿料\t28052\n唐果\t28053\n吸住\t28054\n工作页\t28055\n华为悦盒EC6108V9C\t28056\n固山\t28057\n42条\t28058\n香爆\t28059\n51集\t28060\n深圳光明\t28061\n国际贸易网\t28062\n套号\t28063\nminh\t28064\n老火汤\t28065\n寒霜\t28066\n沅江\t28067\n微情\t28068\n孤星\t28069\n主应力\t28070\n500多亿\t28071\n对外经济贸易大学\t28072\n参赌\t28073\n寄生虫病\t28074\nQTcp\t28075\n利欲\t28076\n审查\t28077\n设色\t28078\nwhale\t28079\n2018\t28080\n种子库\t28081\n军火\t28082\n记者们\t28083\n乐克乐克\t28084\n雄壮\t28085\n梅雨季\t28086\n火葬\t28087\n卑弥呼\t28088\n氟碳铝单板\t28089\n反冲\t28090\n国网辽宁省电力有限公司\t28091\n迈克\t28092\n尾静脉注射\t28093\n上海民办学校\t28094\n省经信委\t28095\nUISearchController\t28096\n江南大学设计学院\t28097\n社畜\t28098\n饼\t28099\n来谈谈\t28100\nmacbookpro\t28101\n舰式\t28102\nPPTSTORE\t28103\n微信拼团\t28104\n雪化\t28105\nHere\t28106\n赶集\t28107\n2500亩\t28108\n双钩\t28109\n智利比索\t28110\n烤箱菜\t28111\n高低频\t28112\n北京菜市口\t28113\n博狗\t28114\nIphone7\t28115\n创现\t28116\n季离\t28117\nCTOLib\t28118\n齿轮泵\t28119\n61317377\t28120\n逆滤波\t28121\n定损\t28122\n调节杆\t28123\nbills\t28124\n传情\t28125\n公邮\t28126\n赛末点\t28127\nbd0001\t28128\n案号\t28129\n易网科技\t28130\n资规\t28131\n掌上\t28132\n行骗天下JP\t28133\n小人儿\t28134\n名僧\t28135\n一鸣惊人\t28136\n涨紧轮\t28137\n从\t28138\n深圳市妇幼保健院\t28139\n恒彩\t28140\n復讐\t28141\n中特网\t2814
2\n2586\t28143\n暗帝\t28144\n无影有踪\t28145\n博飞\t28146\n白茶\t28147\n胧村正\t28148\nProgressive\t28149\n堆区\t28150\n香港证券公司\t28151\n默认字符编码\t28152\n2双\t28153\n杠铃卧推\t28154\n火力发电\t28155\n复兴门外大街\t28156\n赵岩\t28157\n中国通城网\t28158\nhtml51\t28159\n河北省水利厅\t28160\n上庄水库\t28161\n鉴定贴\t28162\n付丽\t28163\n江姗\t28164\n地盖\t28165\n1.65\t28166\ngodex\t28167\nmara\t28168\n河西镇\t28169\n王晶晶\t28170\nv8i\t28171\n他克莫司\t28172\n角力\t28173\n四川银监局\t28174\n244角\t28175\n杜尔\t28176\n剧情池\t28177\n现代级\t28178\n丹麦\t28179\n梦想的翅膀\t28180\n正大天晴\t28181\n信息技术基础\t28182\n果菜园\t28183\nsapphire\t28184\n居家养老服务中心\t28185\nj+1\t28186\n复星艺术中心\t28187\n2880元\t28188\n吴忠\t28189\n富士xa3\t28190\nhaoting\t28191\n一口一口\t28192\n碳结钢\t28193\n贸易部\t28194\nrode\t28195\n脚位\t28196\n山东省卫计委\t28197\nQin\t28198\n炎德\t28199\n禹王\t28200\nVUITTON\t28201\n邀请码\t28202\n列印\t28203\n5层\t28204\n受生\t28205\n淡漠\t28206\n狂飙龙5\t28207\nOcean\t28208\n博格斯\t28209\nzab\t28210\n微爱农场\t28211\n锟斤拷锟斤拷\t28212\n诉讼\t28213\nkoo\t28214\n沈阳自贸区\t28215\n松岗汽车站\t28216\nTAF\t28217\nhzd\t28218\n一三年\t28219\n雅集\t28220\n调分\t28221\n本征值\t28222\n投票群\t28223\noffice201\t28224\n信宜市\t28225\nplasticity\t28226\n新赛\t28227\n2017年五一\t28228\n中信大厦\t28229\n普渡大学\t28230\nmild\t28231\nmt4\t28232\n运回\t28233\n堕\t28234\nOracel\t28235\n394本\t28236\n30分钟左右\t28237\n江山\t28238\n等级制度\t28239\n96孔\t28240\nYOSHIKI\t28241\n8.2分\t28242\n精女孩\t28243\nDai\t28244\n76级\t28245\ncochrane\t28246\n财帛宫\t28247\n诚信教育\t28248\nJung\t28249\nRUC\t28250\n群英\t28251\nitlun\t28252\n国画家\t28253\n哺育\t28254\n高个\t28255\n36克\t28256\n澳门喜来登\t28257\n李菲\t28258\nPortraiture\t28259\n低密度\t28260\n虚拟专用网\t28261\n思聪\t28262\n建设项目竣工环境保护验收暂行办法\t28263\njiu\t28264\nwireless\t28265\n埃罗\t28266\n补集\t28267\n鲁文\t28268\n中石油集团\t28269\n黎元洪\t28270\n湖南省人大\t28271\n羑里城\t28272\n盈利模式分析\t28273\nface++\t28274\n第四季度\t28275\nliberty\t28276\nAug\t28277\n县城区\t28278\n徐孝元\t28279\n各斯\t28280\n猴头\t28281\n大掌\t28282\n高跟\t28283\n黑桑葚\t28284\n5纳米\t28285\n窃听风云\t28286\n蓝田县人民政府\t28287\n虹蓝\t28288\n离飞\t28289\n240平\t28290\n林宪明\t28291\n肾结石\t28292\n高桥优\t28293\n路易十三\t28294\nProxifier\t28295\nTOS\t28296\n断热\t28
297\n暴裂\t28298\n木东\t28299\nCNEV-中国新能源汽车网\t28300\nv5.2.0\t28301\n悉尼FC\t28302\n张晓琳\t28303\n瀚金佰\t28304\n130年\t28305\n沈冰麦当娜\t28306\n吖\t28307\n于此\t28308\n精控\t28309\n青峰大辉\t28310\n7860k\t28311\n张振\t28312\n可暖舒活液\t28313\nratios\t28314\n0079\t28315\n胡晓东\t28316\n飞智社区\t28317\n8092\t28318\n液压管\t28319\n江中健胃消食片\t28320\n工行个人网上银行\t28321\n文化苦旅\t28322\ncosX\t28323\n扬州钓鱼网\t28324\nSYSTEMS\t28325\n文资\t28326\nSampler\t28327\n轩尼诗vsop\t28328\n慧云\t28329\n纳雍\t28330\niP\t28331\n使命召唤13:无限战争\t28332\n天津地铁\t28333\nQueenstown\t28334\n心急如焚\t28335\n姜维传\t28336\n松脆\t28337\n利通区\t28338\n网络用语\t28339\n统计学家\t28340\nCWP\t28341\n2乘\t28342\n天地之间\t28343\n青西新区\t28344\n速溶咖啡粉\t28345\n大魔法师\t28346\n字体转换器\t28347\n浙江省大学\t28348\n天天番号网\t28349\n单克隆抗体\t28350\n清颜\t28351\n何其\t28352\n毛鸡蛋\t28353\n70万元\t28354\n钻床\t28355\n爆炒江湖吧\t28356\n4.9.5\t28357\n小辣妻\t28358\n拐抢\t28359\n毛林林\t28360\n烟水\t28361\n日本料理\t28362\nCloudStack\t28363\nzelda\t28364\n一百名\t28365\n钟真\t28366\n蛟龙港\t28367\niphoen\t28368\n喜地\t28369\n折跃\t28370\nubunut\t28371\n出库\t28372\n张弛\t28373\n0xc004f074\t28374\n排轮滑\t28375\n几个零\t28376\n红发\t28377\nModem\t28378\nterminating\t28379\n墙杆\t28380\n王思思\t28381\nEngadget\t28382\ncleanser\t28383\n责任方\t28384\n辽宁警察学院\t28385\nMauritius\t28386\n卓高美缝剂\t28387\n效益型\t28388\n新兴际华集团有限公司\t28389\nBatch\t28390\nChipGenius\t28391\n图师\t28392\n巴斯蒂安\t28393\nIndy\t28394\n新贵妃醉酒\t28395\n半妖倾城2\t28396\n第60次\t28397\n28.1\t28398\n康萃乐\t28399\nT18\t28400\n倍镜\t28401\n环球时报\t28402\n上期数\t28403\n美的集团股份有限公司\t28404\n钟国\t28405\n御龙天峰\t28406\n燕顺路\t28407\n堰塞湖\t28408\n牙签\t28409\n油尖旺区\t28410\n各怀鬼胎\t28411\n2018-4-14\t28412\n主从表\t28413\n地图炮\t28414\n小玲\t28415\n不管怎么办\t28416\n重庆人事考试中心\t28417\nwestone\t28418\n云贵川\t28419\n钨酸\t28420\nrico\t28421\n洪湖公园\t28422\n母基金\t28423\n菲茨杰拉德\t28424\nproto\t28425\n1F\t28426\n推荐会\t28427\nz2\t28428\n黑龙江省林业厅\t28429\n小农\t28430\n拜会\t28431\ncpf\t28432\n读刻\t28433\n股东方\t28434\n离线地图\t28435\naligned\t28436\n中金控网\t28437\n曾九\t28438\nlexus\t28439\n陕旅版小学\t28440\nadata\t28441\n飞行姬\t28442\nGIFs\t28443\n保温板\t28444\n盐罐\t28445\n限售\t28446\n南北湖\t28447\n奥克斯缔壹城\t28448\n
www838eecon\t28449\n新闻在线\t28450\n建设项目工程总承包管理规范\t28451\n雕牌\t28452\n3Dmax2010\t28453\n剑客\t28454\n群宣\t28455\n咔咔养生网\t28456\n朝堂\t28457\n小道理\t28458\n西洋镜\t28459\n腐文\t28460\n雄鹿\t28461\n每时\t28462\n2018-01-13\t28463\n豹纹陆龟\t28464\n李泰容\t28465\n车轴\t28466\n十多斤\t28467\n克隆体\t28468\n黑药\t28469\n计算机科学概论\t28470\n银海湖\t28471\n庙里\t28472\n太平\t28473\n修罗战神\t28474\nlammonium\t28475\n一斤\t28476\n局长\t28477\n卡奇\t28478\n不成器\t28479\n补货\t28480\n扣除率\t28481\n溴酚蓝\t28482\n亲清\t28483\n科幻感\t28484\n面朝\t28485\n磨光\t28486\n陈永强\t28487\n北京比亚迪\t28488\n全美黄页工具网\t28489\n拉丁舞伦巴\t28490\nc语言版\t28491\n日上集团\t28492\n3250\t28493\n真田信繁\t28494\n朱宇\t28495\n射雕英雄传吧\t28496\nUtrack\t28497\nsupplement\t28498\n湖南移动\t28499\n找自己\t28500\n自然风\t28501\n九曲溪\t28502\nBro\t28503\n凄迷\t28504\n水环真空泵\t28505\n康城亲水湾\t28506\n225g\t28507\nDLT\t28508\n佛陀传\t28509\n奇幻类\t28510\n破身\t28511\ndnf剑豪吧\t28512\n条痕\t28513\ndreaming\t28514\n第七色\t28515\n极路由3pro\t28516\n本院\t28517\n02\t28518\n中国外汇管理局\t28519\nrgb2gray\t28520\n贵阳教育信息网\t28521\nWindows10输入法\t28522\n独孤天下百度云\t28523\n四川省卫生计生委\t28524\n首付\t28525\n手书\t28526\nribs\t28527\n7.08\t28528\n7年\t28529\n狙杀\t28530\n赵彦军\t28531\n钱夫人\t28532\n感光器\t28533\n徐长卿\t28534\n吸收管\t28535\n邵氏硬度计\t28536\n司米橱柜\t28537\n刘航\t28538\n20141102\t28539\n氢os\t28540\n第四夜\t28541\n别出心裁\t28542\n票夹\t28543\n2016～2020年\t28544\n上机考试\t28545\n沈佳凝\t28546\nNBA全明星赛\t28547\nroucheng\t28548\n16句\t28549\nlol卡\t28550\n内蒙古电信\t28551\n素女\t28552\n巧迪尚惠\t28553\n盗墓长生印\t28554\n雷克萨斯rx200t\t28555\noil\t28556\nIPQC\t28557\n魔兽sf吧\t28558\n微型企业\t28559\n教学研\t28560\n追随者\t28561\n房地产经纪有限公司\t28562\n阴式\t28563\n神仙树\t28564\n蓝旗\t28565\n教坛\t28566\nタルサイト\t28567\n理学家\t28568\n秋云\t28569\n本帖\t28570\n动漫片\t28571\n换钱\t28572\n日拍网\t28573\nskt\t28574\n吕秀莲\t28575\n生命力\t28576\n五男\t28577\nPledge\t28578\n小贤\t28579\n金耳机\t28580\n数显式\t28581\n36.4\t28582\n尼达尼布\t28583\n┣\t28584\n旺相\t28585\n股权\t28586\nebody\t28587\n星悦广西\t28588\n采写\t28589\n杨箕\t28590\n长沙西\t28591\n廊坊市政府\t28592\n战斗民族养成记\t28593\n十一岁\t28594\n第五人格\t28595\n地藏论坛\t28596\n固定物\t28597\n状元坊\t28598\n主动表\t28599\n常用版\t28600\nshopping\t28601\n混合阀\t28602\n大桥瞳
\t28603\n任成超\t28604\nc4世嘉\t28605\n查干湖\t28606\n梦溪\t28607\n无音\t28608\n华新街\t28609\n丰美\t28610\n刷铁机\t28611\nMANUAL\t28612\nMATCHES\t28613\n开具\t28614\n台场\t28615\n14天内\t28616\n东光吧\t28617\nvarStatus\t28618\n双子叶\t28619\n钟齐\t28620\n黑夜传说\t28621\n精河县\t28622\n灵感家\t28623\n遗址\t28624\n代人\t28625\n哪么\t28626\n无敌幸运星\t28627\npeel\t28628\n易虎臣\t28629\n客轮\t28630\nyande\t28631\n阜阳一中\t28632\n000848\t28633\n青虫\t28634\nur\t28635\niina\t28636\n唐镇\t28637\nbe\t28638\n火木\t28639\n除垢剂\t28640\n无间道3\t28641\n氟化钠\t28642\n绞盘\t28643\n07版\t28644\n20150608\t28645\n综合体\t28646\n南国奥园\t28647\n阿拉伯国家\t28648\n高通量\t28649\n统规\t28650\n风光\t28651\n云悦\t28652\n东丰县政府\t28653\n标致206\t28654\n超影\t28655\n实义\t28656\n罗蒙\t28657\nNOI\t28658\n肖平\t28659\n艺伎回忆录\t28660\n小妞儿\t28661\n250斤\t28662\n138\t28663\n鸡翅包饭\t28664\n西莫电机论坛\t28665\n9.7寸\t28666\n12371\t28667\n增量型\t28668\n阿尔兹海默病\t28669\n传化智联\t28670\npotter\t28671\n魔兽世界_WOW\t28672\n马特拉齐\t28673\n影音先锋看片网站_影音先锋\t28674\n中国机械工程\t28675\n团辅\t28676\n这招\t28677\n琼华\t28678\n人力资源社会保障部财政部\t28679\n外环高速公路\t28680\n假鞋\t28681\n炮队\t28682\n纯情\t28683\n黄桥\t28684\n日默瓦\t28685\nOptiplex\t28686\n张昱\t28687\n风电并网\t28688\n五笔输入法\t28689\n南京凤凰国际书城\t28690\n重字\t28691\n江苏科学技术出版社\t28692\n胆汁淤积症\t28693\n太湖彬彬\t28694\n栖霞市\t28695\nvivio\t28696\n知声\t28697\n君之_\t28698\n日收益率\t28699\n0052\t28700\n再见\t28701\n引信\t28702\n吴家丽\t28703\n中钢网\t28704\n挑子\t28705\n潼南县\t28706\n冀连梅\t28707\n食安法\t28708\n刘晔\t28709\n强人\t28710\n明尊\t28711\n4.75%\t28712\n磷光\t28713\n投案自首\t28714\n迹部\t28715\n谷川俊太郎\t28716\n2016年1月10日\t28717\n北京限行\t28718\n597\t28719\n小翔\t28720\n建业集团\t28721\n北核\t28722\n弓友\t28723\n茂名南路\t28724\nges\t28725\n添翼\t28726\n12月10日\t28727\nBib\t28728\n驴得水\t28729\n2017年10月1日起\t28730\n核能\t28731\n铜质\t28732\n商贸园\t28733\n海驴岛\t28734\n22米\t28735\n海逸豪庭\t28736\n骊威\t28737\n宜昌市委\t28738\n忆往昔\t28739\n200KW\t28740\n喷出\t28741\n绝地求生大逃\t28742\nimiss\t28743\nstdint\t28744\n换光\t28745\nartstation\t28746\n艾诗塔\t28747\nLv\t28748\n居留\t28749\n情蛊\t28750\n夏威夷\t28751\n巴菲\t28752\n中国医院\t28753\n朱大勇\t28754\ndesinger\t28755\n中师\t28756\n星瀚\t28757\nALLEGRO\t28758\n命局\t28759\nLamb\t28
760\n2017.7\t28761\n反挂\t28762\n糯\t28763\n情景剧\t28764\n数到\t28765\n110名\t28766\n收光\t28767\n老佛爷百货\t28768\n人工周期\t28769\n认养\t28770\n柴堆\t28771\n3in1\t28772\n太乙仙魔录\t28773\n盖盖\t28774\n凋零者\t28775\ndn200\t28776\nJRZD\t28777\n耍耍\t28778\n大桃\t28779\nWorkstation12\t28780\n山东移动\t28781\n挺起来\t28782\n漯河食品职业学院\t28783\n河东镇\t28784\ndaxues\t28785\n硕鼠\t28786\nFAMI通\t28787\n南充市中心医院\t28788\n吊索\t28789\n轰轰\t28790\n4月24\t28791\n大幅\t28792\n实用\t28793\n图穷匕见\t28794\n戈者\t28795\n北京电影院\t28796\nYale\t28797\n工作员\t28798\n移动平均法\t28799\n修好\t28800\n转资\t28801\n音型\t28802\n1.0.0.5\t28803\nSLF\t28804\n盗笔\t28805\n高危性行为\t28806\n一并\t28807\n文学\t28808\nMENU\t28809\n、、\t28810\n罗浮\t28811\nMDK5\t28812\n深圳地铁3号线\t28813\nBitches\t28814\n秀洲区政府\t28815\nChevron\t28816\n画心\t28817\n育发\t28818\nSUP\t28819\n唐宁\t28820\n李一鸣\t28821\n点不开\t28822\n第20集\t28823\nDADA\t28824\n墙长\t28825\n大柱\t28826\n约谈会\t28827\n西祠生活\t28828\n圆规画\t28829\n青江\t28830\n北山街道\t28831\n3110m\t28832\n泽东\t28833\n赛罗·奥特曼\t28834\n降膜式\t28835\niOS9.0\t28836\n一语双关\t28837\n迷梦\t28838\n言论自由\t28839\nelectrochemical\t28840\n霍建华\t28841\n春闺梦\t28842\ntrademanager\t28843\n学术报告\t28844\n底槽\t28845\n云筑网\t28846\nrec709\t28847\n何杜娟\t28848\nninepercent\t28849\n曲剧\t28850\n清徐县\t28851\nsmtm\t28852\nl4\t28853\n河北省统计局\t28854\n刘东升\t28855\nooon\t28856\n穆萨\t28857\n样单\t28858\n永定县\t28859\n纵隔\t28860\n第A11\t28861\n东四路\t28862\n光盘刻录大师免费版\t28863\n正确率\t28864\n皇冠花园\t28865\n伍佰捌\t28866\neng\t28867\n70年前\t28868\nmodernfamily\t28869\n978\t28870\n魔洞\t28871\n中国科学院山西煤炭化学研究所\t28872\n统治力\t28873\n天门山\t28874\n210天\t28875\nSome\t28876\n一扫光\t28877\n二维傅里叶变换\t28878\nTypeScript\t28879\n子母\t28880\n表单\t28881\n古都\t28882\n压花辊\t28883\n相杀\t28884\n0A\t28885\n瓦岗寨\t28886\n迅雷赚钱宝\t28887\n长安CS35论坛\t28888\n非冠\t28889\n新安县\t28890\n杨秀珠\t28891\n激光炮\t28892\n生还\t28893\n手服\t28894\n开元曼居\t28895\n髋关节脱位\t28896\n流放之路幻化守卫\t28897\n迪特\t28898\nFT4\t28899\n这样那样\t28900\nMOOC学院\t28901\n千兆猫\t28902\n榕城\t28903\n亚洲城\t28904\n方\t28905\n第二句\t28906\nR8500\t28907\n东莞南城汽车站\t28908\n济南公交查询网\t28909\nX9S\t28910\niOS9.1\t28911\n身故\t28912\n测名\t28913\n大振\t28914\n打下手\t289
15\nYard\t28916\n滴水不漏\t28917\nusfuli\t28918\n刘馨棋\t28919\n儿童简笔画\t28920\n位序\t28921\n缓考\t28922\n临海论坛\t28923\n降水\t28924\n0898.com\t28925\n母仓鼠\t28926\n腾讯全球合作伙伴\t28927\n糖醋蒜\t28928\n马脸\t28929\n萌萌村\t28930\nec6108v9c\t28931\n2次元\t28932\ndy2018\t28933\n良好\t28934\n小握\t28935\n相似性\t28936\n副班长\t28937\nfn键\t28938\n盛宴\t28939\n缅军\t28940\n糖链\t28941\n180kg\t28942\n豌豆黄\t28943\n高原\t28944\n这啥\t28945\n种属\t28946\n自发\t28947\n中华蜂网\t28948\n辰光\t28949\n帘\t28950\n王者荣耀打野\t28951\n好学网\t28952\n上涨\t28953\n尼加拉瓜\t28954\n钢琴系\t28955\n异国他乡\t28956\n13款\t28957\n萨金特\t28958\njyt\t28959\n开关柜\t28960\n守望黎明号\t28961\n10@100\t28962\n华港\t28963\n铃声秀\t28964\nvuser\t28965\n紫鸟\t28966\n10.1038\t28967\n五店市\t28968\n校友网\t28969\n起讫\t28970\ncsg\t28971\n2万亩\t28972\n左派\t28973\n水信\t28974\n中国科学技术大学先进技术研究院\t28975\n优先权\t28976\n不干净\t28977\n孙悦斌\t28978\n仙逆吧\t28979\n麻痹\t28980\n口袋妖怪mega\t28981\n采动\t28982\n农谚\t28983\n11211\t28984\n南京临时政府\t28985\nfishing\t28986\nav天堂网影音先锋\t28987\n好心痛\t28988\ndeductible\t28989\n1.0.4.0\t28990\n法式\t28991\n郑秀珍\t28992\n骊威永夜君王\t28993\n北京世界花卉大观园\t28994\n60a\t28995\n来宾\t28996\nziwei\t28997\n瑞慈体检\t28998\nGrowing\t28999\n读习\t29000\n铜井镇\t29001\n中国艺术团\t29002\nNuface\t29003\n复旦大学附属上海市第五人民医院\t29004\n会战唐门\t29005\n甲酸甲酯\t29006\n龙岩市政府\t29007\n描述性统计\t29008\n宏达新材\t29009\n毕业证学信网\t29010\n公序良俗\t29011\n南五环\t29012\n炭翁\t29013\n河北省中医院\t29014\n扒窃\t29015\n氯芬黄敏片\t29016\nSINGLE\t29017\n中国儿童文学网\t29018\n罗旭\t29019\nHermann\t29020\n游伴\t29021\n王晓明\t29022\n县纪委监察局\t29023\n搜活动房网\t29024\n30辆\t29025\n五厘米\t29026\nk2\t29027\nfurmark\t29028\nmixture\t29029\nShopex\t29030\n田伯光\t29031\nCollege\t29032\n爱奇艺动画屋\t29033\nn3150\t29034\n0709\t29035\n二手车鉴定评估师\t29036\n刁蛮\t29037\n木渎古镇\t29038\n文森特德莱文\t29039\n搜狗网址导航\t29040\n哈弗H6论坛_哈弗\t29041\n站错队\t29042\n塞班s60v3\t29043\n去衣\t29044\n二手房网-房途网\t29045\n吴洁\t29046\n泰吉\t29047\n学院风\t29048\n诺斯清\t29049\nMAMP\t29050\n65号\t29051\n迎客\t29052\npulling\t29053\n剪力图\t29054\n泉盛\t29055\n4.2v\t29056\n时尚呼吸-北方网\t29057\n名队\t29058\nEnlight\t29059\n碳化硼\t29060\nCHILDREN\t29061\n要领\t29062\n金湾\t29063\ngreen\t29064\n胶囊剂\t29065\nhownet\t29066\
n超华科技\t29067\n第五十二章\t29068\n面材\t29069\n热大\t29070\nPreserve\t29071\n围餐\t29072\n打拳\t29073\n东津新区\t29074\nstrpos\t29075\n铁甲舰\t29076\n大华网\t29077\n辽源信息港\t29078\n北京拓尔思信息技术股份有限公司\t29079\nserver数据库\t29080\n纸本\t29081\n空调滤\t29082\n1.0.5_\t29083\n虎林\t29084\n精睿\t29085\n志愿\t29086\n第一眼\t29087\nLetPub\t29088\n7900X\t29089\nVerySky\t29090\nLotus\t29091\n戏观\t29092\n紫杉醇\t29093\n腌咸菜\t29094\nCecil\t29095\n东京食尸鬼第二季:人类与喰种\t29096\nsheeran\t29097\n广州市人力资源和社会保障局\t29098\n冲力\t29099\n1.3亿\t29100\n丙二酸\t29101\nEIN\t29102\nappearance\t29103\n推送\t29104\n金术\t29105\nnba2k17吧\t29106\n登基\t29107\npeking\t29108\n克洛\t29109\n巨字\t29110\nh370\t29111\n【主\t29112\n小学综合素质\t29113\n真色\t29114\n非同凡响\t29115\n11月27日\t29116\n乐上\t29117\n840m\t29118\n声强\t29119\n刘中华\t29120\n豆瓣酱\t29121\n续航力\t29122\n18层\t29123\n联奏\t29124\n恒路物流\t29125\n绿城御园\t29126\n燃煤\t29127\n奥体中心\t29128\n滚圆机\t29129\n博时基金管理有限公司\t29130\n第14周\t29131\n嗤\t29132\nShoujo\t29133\n第一贴\t29134\n干渠\t29135\n纳威\t29136\nvbe6ext.olb\t29137\n夏德仁\t29138\n匝数\t29139\n方平\t29140\n200GANA\t29141\n无锡地区\t29142\n肛瘘\t29143\n盥洗\t29144\n汇总版\t29145\nc180\t29146\n维修处\t29147\nkux\t29148\n悠之空\t29149\ndept\t29150\n圣武\t29151\ntoke\t29152\n50L\t29153\n宏碁电脑\t29154\n黄哲勋\t29155\nscoring\t29156\n李赛凤\t29157\n冲压机\t29158\nanyway\t29159\n18组\t29160\n两英镇\t29161\namoBBS\t29162\n站地\t29163\nDevin\t29164\n香洲港\t29165\n肖杰\t29166\n解禁\t29167\n抗诉\t29168\n皆月\t29169\norical\t29170\n6bit\t29171\n华宝\t29172\n信华\t29173\n理智型\t29174\n色喜\t29175\n1.0.10\t29176\n斑马GK888t\t29177\n金宝街\t29178\n财税\t29179\n反方向\t29180\n顶针\t29181\n妙喻\t29182\n扩底\t29183\n琼瑶剧\t29184\n假标\t29185\n强卫\t29186\n黑龙江农垦\t29187\n厦门大学台湾研究院\t29188\n低声\t29189\n软装设计师\t29190\nhollow\t29191\n东城大道\t29192\n林林总总\t29193\ndubstep\t29194\n天生丽质\t29195\n人肉叉烧包\t29196\n减重\t29197\n美穴\t29198\n家谱\t29199\n杨铮\t29200\n铲除\t29201\n头条新闻网\t29202\n旋塞阀\t29203\nChristopher\t29204\n内存使用率\t29205\n断空\t29206\nDFM\t29207\nxpack\t29208\n染化\t29209\n曹县\t29210\n僵\t29211\ncom域名\t29212\n疯狂小糖\t29213\n主导型\t29214\n石宝寨\t29215\n20161203\t29216\n汇亚\t29217\n锁阳\t29218\n小鸟游六花\t29219\n病理诊断\t29220\n万世者\t292
21\n脑立方\t29222\n公仔\t29223\n中山三角镇\t29224\n五百米\t29225\n四天\t29226\n顺德乐\t29227\n狗狗币\t29228\n莫格\t29229\n极道鲜师\t29230\n梨果\t29231\n铝炉\t29232\n怡成\t29233\n华天软件\t29234\n成都市人民检察院\t29235\n府前路\t29236\n2017年11月26日\t29237\n幸福来了\t29238\n暴爽\t29239\n断经\t29240\nChristophe\t29241\n煤焦\t29242\n编绎\t29243\n独立工矿区\t29244\ncert\t29245\n附墙\t29246\n辈\t29247\n笼袖\t29248\nios9吧\t29249\n富兰克林自传\t29250\n靖\t29251\n60平米\t29252\n震\t29253\n安徽医科大学\t29254\n明仕\t29255\n王家沙\t29256\n獐\t29257\n疯抢\t29258\n沃尔特\t29259\n枫桥街道\t29260\n5V1A\t29261\n铂悦府\t29262\n厌恶\t29263\n洛克菲勒中心\t29264\n脂类\t29265\n桔杆\t29266\n泉州酒店\t29267\n江苏省医学会\t29268\nseller\t29269\nstarof\t29270\n斑点状\t29271\n跳纸\t29272\n民用版\t29273\n九厘板\t29274\n广开言路\t29275\n四极\t29276\n葡聚糖\t29277\n天使纪元吧\t29278\n未免\t29279\n青龙镇\t29280\n总分公司\t29281\n千刀万剐\t29282\nDining\t29283\n帝妃无双\t29284\n山东钢铁集团有限公司\t29285\n广州医学院\t29286\nrecreation\t29287\nimpdb\t29288\n搜霸\t29289\n胡锦光\t29290\n输氧\t29291\n南繁\t29292\n33.com\t29293\n口灯\t29294\n乌药\t29295\nbootstrapt\t29296\n马老师\t29297\n101层\t29298\napache-maven\t29299\n中国特色社会主义思想研究中心\t29300\n四相\t29301\n李晓林\t29302\n青泽\t29303\n兰州银行\t29304\n标准韩国语\t29305\n金焰\t29306\n刘小飞\t29307\n意识形态领域\t29308\nduel\t29309\n台车\t29310\n站前\t29311\n双拥\t29312\nadobeillustrator\t29313\n神探伽利略\t29314\n龙洞堡机场\t29315\n噬月者巴库\t29316\n人色\t29317\nlookat\t29318\n纹布\t29319\nJAR\t29320\n人民公园\t29321\n习水\t29322\nAwaiting\t29323\n1.12.4\t29324\n兴业银行信用卡中心\t29325\nBeta\t29326\n五福临门\t29327\nk3wise\t29328\n徐风\t29329\n雅人\t29330\nmimo\t29331\npgm\t29332\n渡边直美\t29333\nBaume\t29334\n军姬\t29335\n随机梯度\t29336\n斑猫\t29337\n诡异\t29338\ntoey\t29339\n新思路\t29340\n矿洞\t29341\n濮塘\t29342\n不明不白\t29343\nCandle\t29344\n猫扑贴贴-猫扑网\t29345\n杭州格莱美医疗美容医院\t29346\n100只\t29347\nByteBuffer\t29348\nGalleria\t29349\n浙江省新华医院\t29350\nways\t29351\n苏州市质量技术监督局\t29352\n厂地\t29353\n不再犹豫\t29354\n重庆市国有资产监督管理委员会\t29355\n论析\t29356\n有道\t29357\n内蒙古医科大学附属医院\t29358\n6小时\t29359\nSHUFFLE\t29360\n中旭文化网\t29361\nadu\t29362\n拍单\t29363\n20160907\t29364\n亮度\t29365\nLUMIX\t29366\n大安村\t29367\n中华人民共和国防洪法\t29368\n肖晓琳\t29369\n一毛不拔\t29370\n乳量\t29371\n湖北警
官学院\t29372\n小河鱼\t29373\nk\t29374\n嘉实多磁护\t29375\n码片\t29376\n大东海\t29377\nNt\t29378\nrpart\t29379\n教与学\t29380\n乐开\t29381\n搜索引擎蜘蛛\t29382\n医学检验\t29383\n一百万个\t29384\n小杰\t29385\n花间\t29386\ncatmelo\t29387\nlag\t29388\n黄檀木\t29389\nMPEG\t29390\n托收\t29391\n欧普照明\t29392\n芜湖方特梦幻王国\t29393\n幸福快车\t29394\n萝北\t29395\n十一罗汉\t29396\n工人们\t29397\n针女\t29398\n名胜区\t29399\n畅捷支付\t29400\n萝莉控ii\t29401\n赖氨葡锌颗粒\t29402\n安诺优达\t29403\nnginx\t29404\n千股千评\t29405\n龙de\t29406\n御海\t29407\n刘总\t29408\n高性能\t29409\n现场直播\t29410\nIT男\t29411\n犯错误\t29412\n忝\t29413\n乐不思\t29414\nMO\t29415\nRMVB\t29416\n颜真卿\t29417\nSpring3\t29418\ngb10\t29419\n比亚迪秦EV\t29420\n0下\t29421\n本能\t29422\n浦东新区教育局\t29423\n中线型\t29424\n嵌套函数\t29425\n画本\t29426\n名尚\t29427\n第一桶金\t29428\n迷神\t29429\n方正兰亭黑\t29430\ntime模块\t29431\n翔\t29432\n河畔\t29433\n植物大战僵尸花园战争\t29434\n国博\t29435\n陕西省商务厅\t29436\n大圆满前行广释\t29437\n英语趣\t29438\n17173流放之路poe\t29439\n永不了\t29440\n6b\t29441\n中国石油化工集团\t29442\n郑敏\t29443\n四月天\t29444\n实现\t29445\nCarcinoma\t29446\n川师附小\t29447\n刘元涛\t29448\n南华工商学院\t29449\n黑驰\t29450\n海钜信达\t29451\n金科物业\t29452\n十瓶\t29453\n救治\t29454\n九合\t29455\n成本法转权益法\t29456\nWWW.171ZZ\t29457\n农银人寿\t29458\n270个\t29459\n第七名\t29460\n首体\t29461\noff\t29462\n王小莱昂纳多\t29463\n包装\t29464\n善后\t29465\n宁波市镇海区人民政府\t29466\n2.8G\t29467\n放现\t29468\n兰州市民政局\t29469\n助力转向油\t29470\n黑芝麻粉\t29471\n卖房者\t29472\n三维动画师\t29473\n班规\t29474\npiston\t29475\n抗冻性\t29476\n史峰\t29477\n首倡\t29478\nidea2016\t29479\n分兵\t29480\n切管\t29481\n张建伟\t29482\n气田\t29483\n深圳市福田区政府\t29484\n蒋静\t29485\n万安镇\t29486\n回绝\t29487\n湖北经视\t29488\n绿枢\t29489\n吴聘\t29490\nskinfood\t29491\n乔妹\t29492\n无敌破坏王2\t29493\nAUTOCAD2012\t29494\n139级\t29495\ncorpora\t29496\n谕\t29497\ncitavi\t29498\n石锅拌饭\t29499\n近百年\t29500\nfluent\t29501\nMAME\t29502\nui线程\t29503\n27厘米\t29504\n正负值\t29505\n桩子\t29506\n160g\t29507\n9000元\t29508\n扭簧\t29509\n深圳市铁甲科技有限公司\t29510\n第二册\t29511\nChinaByte\t29512\n蓝钻\t29513\n米兰大教堂\t29514\n盛衰\t29515\nkb2670838\t29516\n刘建勋\t29517\nPDFEditor\t29518\n怎麽回事\t29519\n万里街道\t29520\n听不懂\t29521\n攻击力\t29522\n鳞片\t29523\n梁哥\t29524\n比蒙\t29525\nConfi
rmation\t29526\n麓谷企业广场\t29527\nCharles\t29528\n支付卡\t29529\n报关单号\t29530\n化武\t29531\nWesTward\t29532\n2016年4月22日\t29533\nwindos7\t29534\n礼遇\t29535\n操纲手\t29536\n佳源广场\t29537\n叶帆\t29538\nlinen\t29539\nsyste\t29540\n轩辕大宝\t29541\nshiwusui\t29542\n发热器\t29543\n分类页\t29544\n1735\t29545\n丑女无敌\t29546\n圣索菲亚教堂\t29547\n莫斯卡\t29548\n那本书\t29549\n明框\t29550\n5.23\t29551\n维他奶\t29552\n躲\t29553\nLX570\t29554\n3万亿美元\t29555\n注法\t29556\n石楼镇\t29557\n卢沟谣\t29558\n柴油机\t29559\n30部\t29560\n华莱健\t29561\nmemtest\t29562\n图灵机器人\t29563\n天之炽\t29564\nLCK\t29565\n最后一章\t29566\n济南飞机场\t29567\n盗墓笔记页游\t29568\n6878xxxx\t29569\nSector\t29570\nCRRT\t29571\n古韵\t29572\n横滨市\t29573\nP9/P9\t29574\n黄韦博\t29575\n寄件方\t29576\n酷客资源网\t29577\ndatabase\t29578\n轮理\t29579\n中物院\t29580\n纳米陶瓷\t29581\n维拉诺瓦大学\t29582\n西部庭州\t29583\n海口市区\t29584\n甘肃省环境保护厅\t29585\n米其林三星\t29586\n戏剧节\t29587\n刘奇凡\t29588\n单认证\t29589\n圈圈舞\t29590\n华池县\t29591\n菲仕乐\t29592\n紫衫\t29593\n筑龙网\t29594\n东京晴空塔\t29595\n江珊\t29596\n紧邻\t29597\n缺氧\t29598\n合图\t29599\n河口村\t29600\n福斯\t29601\n长平路\t29602\nAbove\t29603\n真人秀\t29604\n虞海波\t29605\n情男\t29606\n昌宁县\t29607\n该班\t29608\nparametertype\t29609\n正印\t29610\nLenovoPC\t29611\n秀水湾\t29612\n蔚蓝留学网\t29613\n布歌\t29614\n潘越\t29615\n中铁十八局集团有限公司\t29616\n42.5码\t29617\nFontaine\t29618\n奇宝\t29619\nchangelist\t29620\n组装\t29621\n救援\t29622\n西南医科大学附属中医医院\t29623\n魅蓝m1\t29624\n邹县\t29625\n发展史\t29626\n温压弹\t29627\n喜宝\t29628\n自费药\t29629\n尼禄祭\t29630\nredefined\t29631\n767\t29632\nAlgebra\t29633\n湄公河行动\t29634\n厚纸\t29635\n深圳南山人民医院\t29636\n90期\t29637\n海盗王\t29638\n史宾格\t29639\n安徽商报\t29640\nHD7850\t29641\n高压釜\t29642\n3710\t29643\n星崎\t29644\n牛叉诊股\t29645\n张家口南\t29646\n784\t29647\n差压变送器\t29648\n炎龙传奇\t29649\n能详\t29650\n中国照明网\t29651\n景源\t29652\n遵义四中\t29653\nf3\t29654\n球式\t29655\n卡种\t29656\nmysql.dll\t29657\n2017-05\t29658\n杂用\t29659\ncctalk\t29660\n卡雷拉\t29661\nDirectFB\t29662\n2万8\t29663\n急切\t29664\nAlcantara\t29665\n20150330\t29666\n施华洛世\t29667\n分号\t29668\n清潭洞\t29669\n刘忠林\t29670\n伊布替尼\t29671\n易俗河\t29672\nM1213nf-ZOL\t29673\n头孢地尼\t29674\n红牌\t29675\n独中\t29676\n鸟儿们\t29677
\n骁龙808\t29678\n天光云影\t29679\n第壹城\t29680\nHLP\t29681\n京东黑号\t29682\n安徽省立儿童医院\t29683\nf-7202235\t29684\n卡法\t29685\n鞋舌\t29686\n吃鸡宏\t29687\n恒敬\t29688\nuv板\t29689\n交流贴\t29690\nmestrenova\t29691\nPAS\t29692\n绰\t29693\n柳杉\t29694\nUlead\t29695\n末日轮盘\t29696\n中国台湾网\t29697\nIQKeyboardManager\t29698\n神州泰岳\t29699\n单级\t29700\n越地\t29701\n郑州幼儿园\t29702\n家谱网\t29703\n周海涛\t29704\n早餐\t29705\n非金\t29706\n主号\t29707\n_天道留学\t29708\nsqlite3\t29709\n富瑞特装\t29710\n长寿路街道\t29711\n好多多\t29712\n旅社\t29713\n重庆市物价局\t29714\n星际嘉年华\t29715\n浅入浅出\t29716\n龙华站\t29717\n非领导职务\t29718\n车机\t29719\n1Key\t29720\n盛夏路\t29721\n救命\t29722\n红岭创投\t29723\nIMF\t29724\nusb分线器\t29725\n上海逸晨广告有限公司\t29726\n家业\t29727\n安鲜达\t29728\n专接\t29729\n亲吻照\t29730\n机口\t29731\n㎏\t29732\n大骨\t29733\n甄别\t29734\n渠江\t29735\n58发\t29736\n证伪\t29737\n金手指考试\t29738\nzq\t29739\n佛画\t29740\n肾综合征\t29741\n湖南师范\t29742\nOneNote\t29743\n多多批量\t29744\nbindService\t29745\nsib\t29746\nscrip\t29747\n暖和\t29748\n管状\t29749\n25路\t29750\n拥军优属\t29751\n29路\t29752\n江苏省志\t29753\n第46届\t29754\n黄麓\t29755\n郭琳\t29756\nfujixerox\t29757\n安支\t29758\n温室\t29759\n辛店镇\t29760\n井冈山革命根据地\t29761\n需氧\t29762\n混联\t29763\n石墨换热器\t29764\nphones\t29765\n苦术\t29766\n武德\t29767\n天使彦\t29768\n白卡纸\t29769\n绝色倾城\t29770\n太卡\t29771\n网易cc\t29772\n三国志11\t29773\nmails\t29774\n腹膜炎\t29775\n勾心斗角\t29776\n市民云\t29777\n第18号\t29778\n香槟城\t29779\n0092\t29780\n魂兽\t29781\n45乘\t29782\n党政机关办公用房管理办法\t29783\nip库\t29784\n5分\t29785\n一石二鸟\t29786\n周洲\t29787\n德化县人民政府\t29788\n卫视\t29789\n智灵通\t29790\n事业单\t29791\n搅拌站\t29792\n新鸿\t29793\nbjc\t29794\nIRF\t29795\n韩语学习网\t29796\n沙岭\t29797\ncrystalmaker\t29798\n99岁\t29799\n半音阶\t29800\n王五四\t29801\n深层\t29802\n奥豪斯\t29803\n求抱\t29804\n刘青山\t29805\n黄榕\t29806\n嬛\t29807\n竟被\t29808\n尙\t29809\n波罗的海三国\t29810\n尽现\t29811\n飞行时间\t29812\nTess4J\t29813\n克克\t29814\n23:30\t29815\ntomcat6\t29816\n旅游发展委员会\t29817\n非关系\t29818\n瑞江\t29819\n器式\t29820\n怪咖\t29821\nvsc\t29822\n模具展\t29823\n吸引器\t29824\n天津妈妈网\t29825\n神僧\t29826\n土改\t29827\n英短蓝猫\t29828\njlist\t29829\n科目余额表\t29830\n4幢\t29831\n主导性\t29832\n肾阳\t29833\n张利民\t29834\n惊羽\t
29835\n米杨\t29836\nguidance\t29837\nipad9.7\t29838\n卓郎\t29839\n20141122\t29840\n李文涛\t29841\n陈志峰\t29842\n3000分钟\t29843\n心平气\t29844\n腻子喷涂机\t29845\n罗希\t29846\n机电网\t29847\n狂热郁金香\t29848\nPostfix\t29849\nses\t29850\n孔融\t29851\n16小时\t29852\nPCE\t29853\n春晓镇\t29854\nAMD\t29855\n60S\t29856\n托举\t29857\n鸬鸟镇\t29858\n国家森林公园\t29859\n石田三成\t29860\n达卡他韦\t29861\n最高薪\t29862\n逸轩\t29863\n林超泽\t29864\n汪华\t29865\n火烛\t29866\n随机数字\t29867\n一毫升\t29868\n西樵山\t29869\nrivet\t29870\n猎捕\t29871\n法瑞意\t29872\n选举\t29873\n大杂院\t29874\n修图师\t29875\n县机关事务管理局\t29876\n手足癣\t29877\n辽宁省新闻出版广电局\t29878\n聚生网管\t29879\n胖脸\t29880\n100卷\t29881\n宋理宗\t29882\n颓\t29883\n李霄鹏\t29884\n叶罗丽精灵梦\t29885\n凯特布兰切特\t29886\n山西青年报\t29887\n大众点评\t29888\nDSL斗鱼\t29889\n科谱\t29890\nyokogawa\t29891\n惊羡\t29892\n压缩系数\t29893\n自责\t29894\n市民服务中心\t29895\n新热\t29896\n模具联盟网\t29897\n1MP3\t29898\n悉尼科技大学\t29899\n业内人\t29900\n离愁\t29901\n角色扮演\t29902\n圣诞节\t29903\nimhist\t29904\n兰马\t29905\n尽性\t29906\n植物大战僵尸ol\t29907\n牧童\t29908\n诱发物\t29909\nCensus\t29910\nAccelerating\t29911\n林晓\t29912\n知识变频器\t29913\n镊\t29914\nC5000\t29915\n中安汽车\t29916\n第36条\t29917\n奇奇\t29918\n16800元\t29919\ndogs\t29920\n血狱\t29921\nisEqual\t29922\n3521\t29923\n运河桥\t29924\n0570\t29925\n刘东生\t29926\n三2\t29927\n苏州地铁6号线\t29928\n梅江会展中心\t29929\n庆隆\t29930\n欧亚卖场\t29931\nHoneywell\t29932\n默克\t29933\ndele\t29934\n全属\t29935\nfrenzy\t29936\n康桥融府\t29937\n住房城乡建设部办公厅\t29938\n归亚蕾\t29939\n富阳人才网\t29940\n老乡\t29941\nFake\t29942\nJTA\t29943\niPhoneSE2\t29944\n地平片\t29945\n任盈盈\t29946\nQualification\t29947\n病故\t29948\n上海美团\t29949\n思威\t29950\n翻飞\t29951\n240千米\t29952\n陇南市\t29953\n高德导航\t29954\n石家庄学院\t29955\ncodeblue\t29956\n艳娃传\t29957\n三大块\t29958\n围边\t29959\n掌控\t29960\ncamp\t29961\n白扁豆\t29962\njavax.servlet\t29963\n几针\t29964\n香薰炉\t29965\nKar\t29966\n毛细现象\t29967\nquartet\t29968\nHilbert\t29969\n踩踏\t29970\n货运单\t29971\nfootable\t29972\n三丁\t29973\n极型\t29974\n骨胶\t29975\n特效库\t29976\n慢吞吞\t29977\n中国国家男子足球队\t29978\n北辰教育\t29979\n佩件\t29980\nHCI\t29981\n中国人保资产管理有限公司\t29982\n黄标\t29983\n自驾游\t29984\n催奶师\t29985\n招生考试信息港\t29986\npagerank\t29
987\n烧结钕铁硼\t29988\n漳平\t29989\n擎洲\t29990\nKnit\t29991\n安徽财经大学\t29992\n侓\t29993\n中国航空博物馆\t29994\n洛托姆\t29995\n最高院民一庭\t29996\n居理\t29997\nezpad\t29998\n1.05\t29999\n瑞虹新城悦庭\t30000\n依莉雅\t30001\n怪物猎人4G中文网\t30002\n刘海平\t30003\nUR-V论坛_汽车之家论坛\t30004\n天荟\t30005\n梁惠玲\t30006\n绰约\t30007\n云数据中心\t30008\n奔驰glk\t30009\nVocal\t30010\n酉矩阵\t30011\n3DMark\t30012\n武汉东湖新技术开发区\t30013\n烛叶\t30014\n宠魅\t30015\n名宇\t30016\n连裤\t30017\n使命召唤:现代战争\t30018\n初代吸血鬼第四季\t30019\nmianshi\t30020\n山崎贤人\t30021\nsm公司\t30022\n智取\t30023\n陷阱\t30024\n皇室战争\t30025\n接户\t30026\nfinal类\t30027\n第24话\t30028\n陪同\t30029\n文都教育\t30030\n裱糊\t30031\n余元\t30032\nGeth\t30033\n亨利\t30034\n签派\t30035\n高新区政府网\t30036\n康复机器人\t30037\nielts\t30038\nobstacle\t30039\n武汉19楼\t30040\n港囧\t30041\n埃博拉\t30042\n大舟山\t30043\n鲤鱼跳龙门\t30044\n潢川\t30045\n高压室\t30046\n会计报表附注\t30047\n200位\t30048\n唯冠\t30049\n第A06\t30050\n铸\t30051\nwindoes\t30052\n能量方程\t30053\n环境保护\t30054\n微软Xbox\t30055\n科兴\t30056\n长高\t30057\n中国科学院幼儿园\t30058\n朱梅\t30059\nStokes\t30060\n场馆\t30061\n神盾局\t30062\n创造\t30063\n区段\t30064\n海门市\t30065\n五斗米\t30066\n小情书\t30067\ngalgam\t30068\n专家型\t30069\n悲天悯人\t30070\n慈爱\t30071\nteradata\t30072\n21h\t30073\n新功\t30074\n春浪\t30075\n钉钉宝卡\t30076\n王艳萍\t30077\n杨臻博\t30078\n四川省人社厅\t30079\n联查\t30080\n酷我音乐播放器\t30081\n压条\t30082\njizzjizz\t30083\n6.2.3\t30084\n片手\t30085\n名爵mg6\t30086\n画匠\t30087\n1两\t30088\n戴维琼斯\t30089\n汉诺威工业展\t30090\n中国国家图书馆中国国家数字图书馆\t30091\n柯南中\t30092\nahead\t30093\n想问问\t30094\n正银\t30095\n现金借款\t30096\n0460网站指数\t30097\nAlpha\t30098\nSilvaco\t30099\n海内\t30100\n浸入式\t30101\n美食林\t30102\n两桩\t30103\n食用酒精\t30104\nwinhttp\t30105\n甘州\t30106\n2015劳动节\t30107\n梭鱼\t30108\n上海报业集团\t30109\n美束\t30110\nhpb\t30111\nmuv\t30112\n15829901051\t30113\nfarcry\t30114\n擒敌\t30115\n念珠菌性龟头炎\t30116\n春娇救志明\t30117\n爱丽斯\t30118\n劳动保障部\t30119\n前门大街\t30120\n温席\t30121\n弹幕播放器\t30122\nprevention\t30123\n打放\t30124\n炮渣\t30125\nvictory\t30126\n乌龙闯情关\t30127\n陕西广播电视台\t30128\n商机\t30129\n公孙渊\t30130\n极乐寺\t30131\n直属学校\t30132\nNTLEA\t30133\n卫斯理\t30134\n十字军之\t30135\n摄录\t30136\n沈炼\t30137\n末代皇帝溥仪\t30138\n|日妹妹\t
30139\n耳朵眼\t30140\nHF线\t30141\n盘丝\t30142\n辩证唯物主义认识论\t30143\n阳光舜城\t30144\nprd\t30145\n庸者\t30146\n灵山县\t30147\n神狐\t30148\n脱密期\t30149\n易欢\t30150\nc7\t30151\n王泽宇\t30152\n烤馍片\t30153\n栋笃特工\t30154\n棠宁\t30155\n方钢管\t30156\n26吨\t30157\n进球率\t30158\n中国幼儿在线合作学校\t30159\naliases\t30160\nv-for循环\t30161\n测定器\t30162\n杜月凯美瑞\t30163\n白昭\t30164\n阿林\t30165\n惠州学校\t30166\nZ皓\t30167\n连帽\t30168\n能量餐\t30169\n火影忍者疾风传:究极忍者风暴\t30170\nACT考试\t30171\nconfession\t30172\n_马\t30173\n异动\t30174\n揽胜诛仙\t30175\n快速门\t30176\n大总裁\t30177\n西邻\t30178\nhillstone\t30179\n紫城\t30180\n终止符\t30181\n腌渍\t30182\n魂匣\t30183\n金碧坊\t30184\n事情\t30185\n5盘\t30186\n花椒直播\t30187\n超级玛丽单机版\t30188\n群发机\t30189\n湖北能源集团股份有限公司\t30190\n邢氏\t30191\n虐机\t30192\n咬住\t30193\n笑语\t30194\n葛覃\t30195\n黄海学院\t30196\n郑板桥\t30197\n豆腐店\t30198\n拉瓦尔\t30199\n并购重组委\t30200\n硫化机\t30201\n工行银行\t30202\nbezier\t30203\n古风类\t30204\nadmit\t30205\n情谜\t30206\n范雷\t30207\n真了不得\t30208\n每家\t30209\n马成\t30210\n明源云\t30211\n所示\t30212\n黄埭镇\t30213\n定序变量\t30214\n3面\t30215\n开题报告\t30216\n统一化\t30217\n军花\t30218\n郑新聪\t30219\n甬台\t30220\n六年级数学上册\t30221\n楚楚\t30222\n3dsmax2014\t30223\n户外显示屏\t30224\nㄖ\t30225\n一二代\t30226\n你是人间的四月天\t30227\n大关小区\t30228\n常规检查\t30229\n魅族\t30230\n咖啡灌肠\t30231\nBULK\t30232\n10106\t30233\n全年级\t30234\n崇礼县\t30235\n肌酐\t30236\nHOBBY\t30237\n龙的传人\t30238\nrtv\t30239\n小黄文\t30240\n九重\t30241\n创全\t30242\nParks\t30243\n乞拉朋齐\t30244\nipran\t30245\n通桥\t30246\n铝液\t30247\nsells\t30248\n和谐小区\t30249\n六十年代\t30250\nalike\t30251\n第一圈\t30252\n毅德\t30253\n教职员工\t30254\n猪价格网\t30255\n評\t30256\n孔雀蛋\t30257\n上坤\t30258\n1167\t30259\n自流平地面\t30260\n三缸柱塞泵\t30261\n安义\t30262\n神U\t30263\n【惠\t30264\n三辊卷板机\t30265\nDNF\t30266\n4605\t30267\n优惠率\t30268\nNBA直播吧_98篮球网\t30269\n德布劳内\t30270\ngodtrue\t30271\n译词\t30272\n早八点\t30273\n大新阳光\t30274\n满岛光\t30275\nwebmagic\t30276\n傻女人\t30277\n将军大道\t30278\n南宁日报多媒体数字报\t30279\n色谱仪\t30280\n润年\t30281\n明式家具\t30282\n移动oa\t30283\n退界\t30284\n吃不到\t30285\n磁窑\t30286\n马斯切拉诺\t30287\nTrance\t30288\nwin2\t30289\n价值量\t30290\n942\t30291\n马天\t30292\n主从犯\t30293\nnate\t30294\n1-12月\t30295\n进栈\t302
96\nGetter\t30297\n九江县\t30298\n冲压模具\t30299\n1500毫升\t30300\n光伏展\t30301\n张熙\t30302\n拨乱反正\t30303\n肉丸\t30304\n猫扑贴贴\t30305\n西冲海滩\t30306\n肌膏\t30307\nkingscada\t30308\n第17话\t30309\n联城\t30310\n增压泵\t30311\n渐\t30312\n句芒\t30313\n南通市委\t30314\n领错\t30315\n愛音\t30316\n谢莉斯\t30317\n没意思\t30318\n固特异\t30319\n新兴街\t30320\n赞誉\t30321\n李江涛\t30322\n绿佳电动车\t30323\n过滤棉\t30324\n非师范专业\t30325\ns\t30326\n215路\t30327\n渔药\t30328\n韩飞龙\t30329\nzygote\t30330\n陈小凤\t30331\nfrom&#160\t30332\n扣血\t30333\n0407\t30334\n相当\t30335\n计算机网络技术\t30336\n蜂房\t30337\n101.1\t30338\n校验台\t30339\n律动\t30340\n翻牌机\t30341\n中共广西壮族自治区委员会\t30342\n趣头条自媒体\t30343\n微单EOS\t30344\n杀蚊\t30345\nCoffeeScript\t30346\n单孔\t30347\n第96章\t30348\n102.6\t30349\nSpeed\t30350\n三年级类\t30351\n图途\t30352\n茂昌\t30353\n0xc0000409\t30354\nmaid\t30355\n河北工商局\t30356\n植物大战僵尸:花园战争2\t30357\n雷锋号\t30358\nLASERJET\t30359\n贝雷塔壁挂炉\t30360\n双一\t30361\n钟嵘\t30362\n佩恩六道\t30363\n福建国税电子税务局\t30364\n144平\t30365\n湖南应用技术学院\t30366\n资讯_大公网\t30367\nDimensions\t30368\n肢端肥大症\t30369\n唐山万达广场\t30370\n上上签\t30371\n单张\t30372\n趣玩网\t30373\n巫医\t30374\n紫仓\t30375\n卫生监督所\t30376\n白头神探\t30377\ncolab\t30378\n钜宝\t30379\n光阻\t30380\n等分\t30381\n丰田卡罗拉\t30382\n上海信托\t30383\n肝帝\t30384\n北京工人体育场\t30385\n贺臣\t30386\n30大\t30387\n儿茶素\t30388\n侧旋\t30389\n不负\t30390\nMDM\t30391\nfolks\t30392\n早醒\t30393\n震级\t30394\n车架号\t30395\n甜心格格\t30396\n成渝高铁\t30397\nsoundtouch\t30398\n横河\t30399\nK-means聚类算法\t30400\n天数\t30401\nIC卡\t30402\nlbj\t30403\n谢永强\t30404\n388a\t30405\n顶住\t30406\n月考测试卷\t30407\npygtk\t30408\n中国人民政治协商会议全国委员会\t30409\n铸剑物语3\t30410\nWin\t30411\n1公升\t30412\n百变怪\t30413\n盗潜黄金城\t30414\n邓家佳\t30415\n软饮\t30416\n浅仓\t30417\n20150816\t30418\n太极乐队\t30419\n姜泥\t30420\n兰斯\t30421\n劳动套\t30422\n子哈特\t30423\n战神传说\t30424\n医疗事故\t30425\n搜狗手机助手\t30426\n兰州树人中学\t30427\n龙巅鹦鹉鱼\t30428\npaintings\t30429\n取法乎上\t30430\n草垛\t30431\n通报\t30432\n安得广厦\t30433\n发绀\t30434\n暗光\t30435\n美国国会大厦\t30436\nTPP\t30437\nCommerce\t30438\n岑宇\t30439\n叶玉卿\t30440\n刘振明\t30441\n护坡桩\t30442\nroxy\t30443\n搞笑类\t30444\n亲疏\t30445\n公交车道\t30446\nknt\t30447\n600919\t30448\n兴杰\t3044
9\nWEB服务器\t30450\n葛兰西\t30451\n联通系统集成有限公司\t30452\nBesiege\t30453\n树阴\t30454\n贝尼特斯\t30455\n维意定制\t30456\n椰风\t30457\n强化班\t30458\n实像\t30459\nfon\t30460\n一拍即合\t30461\nEl表达式\t30462\n砂滤罐\t30463\nIT_财经_中金在线\t30464\n五居\t30465\n村书记\t30466\n花坝\t30467\n日发\t30468\n网络科技公司\t30469\nreasonable\t30470\n兰洽会\t30471\n维他柠檬茶\t30472\n5年之内\t30473\n从轻处罚\t30474\n班杜拉\t30475\nrev\t30476\n新词\t30477\n江郎才尽\t30478\n苦今生\t30479\nP9\t30480\n一分米\t30481\n双日\t30482\n小奶狗\t30483\n花梨木\t30484\n信丰物流\t30485\n计算机软件著作权\t30486\n中华人民共和国驻日本国大使馆\t30487\n平安街\t30488\n62部\t30489\n1609\t30490\n金联储\t30491\n五型\t30492\n软肋\t30493\nSeconds\t30494\n滩地\t30495\n狗叫\t30496\n胡佳\t30497\njarvis\t30498\n千通\t30499\nJava虚拟机\t30500\n风易\t30501\n四川大学华西医学院\t30502\n浙江卫视\t30503\n别这样做\t30504\n林皓\t30505\n金租\t30506\n回乡\t30507\n五万吨\t30508\n陈洪绶\t30509\n5221\t30510\n2015.12\t30511\n环太平洋3\t30512\n南雄市\t30513\n天天在线\t30514\n十五所\t30515\n页子\t30516\n19日\t30517\n4R7\t30518\n量计\t30519\n1524\t30520\n代表人\t30521\nBEYOND\t30522\n川教版八年级\t30523\n土地使用权\t30524\n博文\t30525\nDDGS\t30526\n肇\t30527\n非法集资案\t30528\n800度\t30529\n丁是娥\t30530\n异视\t30531\n玉叶\t30532\n氟硅酸\t30533\n凤城六路\t30534\n安福县\t30535\n18禁\t30536\n孙桓\t30537\n极电强兵\t30538\n180421\t30539\nzaki\t30540\n龙眼蜜\t30541\n福州市第二医院\t30542\n护翼\t30543\n甘肃电信\t30544\nJavaScript函数\t30545\n牛大力\t30546\n上海大学美术学院\t30547\n面板数据\t30548\n五十四\t30549\n俩字\t30550\n净空法师\t30551\n肖鸿昌\t30552\n织金县\t30553\n姜至鹏\t30554\n哪行\t30555\n顷刻\t30556\n肚子疼\t30557\n作案\t30558\n轴心\t30559\n帝皇\t30560\nTorres\t30561\n好时代\t30562\n省直机关\t30563\n51岁\t30564\n850万\t30565\nhero\t30566\n份量\t30567\n奇果\t30568\nclips\t30569\n潭\t30570\n封坛酒\t30571\ncleaner\t30572\n星社\t30573\n改线\t30574\n养鸭场\t30575\n热力学第一定律\t30576\n我思故我在\t30577\n董建春\t30578\n二百多\t30579\nmathtype公式编辑器\t30580\n1万台\t30581\n中铁建设集团\t30582\nITunes\t30583\n年轻照\t30584\n东阿吧\t30585\n冠带\t30586\nInvestor\t30587\nIPHONE6\t30588\n吴学\t30589\n020GVG\t30590\n特应性皮炎\t30591\nadvise\t30592\n李袁杰\t30593\n捕捉\t30594\nINN\t30595\n五原\t30596\n百癣夏塔热片\t30597\n从业资\t30598\n漏孔\t30599\n_趣\t30600\n春饼\t30601\n金华人才网\t30602\nSoccerway\t30603\n孙黯\t3060
4\n病毒合剂\t30605\n梅林关\t30606\n得了\t30607\n我们的生活\t30608\nWar\t30609\n173个\t30610\n金高恩\t30611\n浆料\t30612\nsun\t30613\nsynchronization\t30614\n2018-04-17\t30615\n不结盟运动\t30616\n做名\t30617\n解乏\t30618\n客套\t30619\n本行\t30620\n传位\t30621\n上海一中院\t30622\n百路驰\t30623\nexecle\t30624\n皇甫\t30625\npasser\t30626\n1040\t30627\n匹配项\t30628\nCombine\t30629\n击破\t30630\n服务行\t30631\n王水平\t30632\n多伦多海鲜自助餐厅\t30633\n保险界\t30634\n母线\t30635\n货值\t30636\n狩猎者\t30637\n颜良文丑\t30638\n竹叶\t30639\n梅墟街道\t30640\n东浮山\t30641\n张文江\t30642\n泰格医药\t30643\nucosiii\t30644\n天刀五毒\t30645\n镇康\t30646\nwrf\t30647\n4K123\t30648\n中升雷克萨斯\t30649\n瞪眼\t30650\n运载火箭\t30651\n新制\t30652\nWV\t30653\n谢尔\t30654\n面试篇\t30655\n饸饹面\t30656\n导则\t30657\nExcel网\t30658\n华南地区\t30659\n天津国展中心\t30660\n軟體\t30661\nMenAtPlay\t30662\n亍\t30663\n以太网\t30664\n龙邦\t30665\nfellowship\t30666\nIoC\t30667\n小麦色\t30668\n花姑子\t30669\n枭妻\t30670\n多党制\t30671\n2017年07月\t30672\n美图手机吧_\t30673\nMBR分区表\t30674\n口水鸡\t30675\n光氧\t30676\n阿石\t30677\n生死相依\t30678\ndissertation\t30679\n一己之见\t30680\n高松市\t30681\n秀图\t30682\n武林壹号\t30683\n南京路\t30684\n小鼓手\t30685\n御马\t30686\n荣威erx5\t30687\n场效应管\t30688\n手排\t30689\n记作\t30690\n河卵石制砂机\t30691\nCLEO\t30692\n策马扬鞭\t30693\nvin\t30694\n卷烟器\t30695\ncopd\t30696\n徐国华\t30697\nFrequencies\t30698\n史学家\t30699\n运程\t30700\n不得闲\t30701\n长法\t30702\nrme\t30703\nx50\t30704\n打卦\t30705\n责任期\t30706\n没单\t30707\n日间\t30708\n平日\t30709\n东风雨\t30710\n阿里零售通\t30711\n港货\t30712\n蒙自市\t30713\n没劲\t30714\nrobinson\t30715\n_17汽车网\t30716\n各交\t30717\n雷顿\t30718\n疙瘩\t30719\n胎素\t30720\nxirr\t30721\n凤城街道\t30722\nMinami\t30723\n宗教事务条例\t30724\n奥瑞德\t30725\n猫城记\t30726\n月痕\t30727\n亿次\t30728\n九九归一\t30729\n为由\t30730\n阿曼尼\t30731\n创办人\t30732\n太阳能集热器\t30733\n吾儿\t30734\n改过\t30735\n购物狂\t30736\n小学生作文\t30737\n一顺\t30738\n波粒二象性\t30739\n乌石镇\t30740\nExtending\t30741\n大通湖\t30742\n湖北大厦\t30743\n夺印\t30744\n猎爱百计\t30745\n吱声\t30746\n祈求\t30747\n感觉不到\t30748\n隋唐英雄5\t30749\n暇步士\t30750\n阿坝州政府网\t30751\n厦门工商旅游学校\t30752\n本然\t30753\n路演\t30754\n一统\t30755\nlntel\t30756\n中国航天科技集团公司\t30757\npocib\t30758\nheight\t30759\nbodypaint\t3
0760\nGen10\t30761\n化感\t30762\n吊笼\t30763\n吉尔伽美什\t30764\n厂字\t30765\n天皇秀\t30766\n佛骨\t30767\n无铅汽油\t30768\nWordPress主题之家\t30769\n仙侠类\t30770\n_一叶知秋\t30771\n直挂\t30772\n苗侨伟\t30773\n骚妈\t30774\n蔡萝莉\t30775\n城际铁路规划\t30776\nwin98\t30777\n1.2厘米\t30778\n洒家\t30779\n万科租售中心\t30780\nWannabe\t30781\n禁摩限电\t30782\n枫华谷\t30783\n保镖公司\t30784\nresources、filter\t30785\n蒂法\t30786\n百度富文本编辑器\t30787\nwps格式\t30788\n弹性板\t30789\nMast\t30790\n年轻\t30791\narmv7s\t30792\n创新园\t30793\n畲江\t30794\n阳光保险集团\t30795\n2面\t30796\n第34条\t30797\nSELPHY\t30798\n磁链接\t30799\n10起\t30800\n词曲\t30801\n免维护蓄电池\t30802\nCosersuki\t30803\n茅屋\t30804\n双料\t30805\n闸板\t30806\n侧吸式\t30807\n米字格\t30808\nMac屏幕分辨率\t30809\n万花尺\t30810\nkb4093112\t30811\nlogn\t30812\n铸造师\t30813\n背酸\t30814\n战力\t30815\n六下\t30816\nCROWN\t30817\nTPM\t30818\n乡镇妇联\t30819\n百年好合\t30820\n胎牛血清\t30821\n专案组\t30822\nBracelet\t30823\n昭君\t30824\n习性\t30825\n1080ti\t30826\n哥伦布\t30827\n高压真空断路器\t30828\n庝\t30829\n虾米网\t30830\n失算\t30831\n军团\t30832\n魔恋\t30833\nCropper\t30834\n盲样\t30835\n优享版\t30836\n威曼\t30837\n高压清洗车\t30838\n当日\t30839\n巿场\t30840\nц\t30841\ncva\t30842\nClio\t30843\n路政\t30844\n)文化传媒有限公司\t30845\n八一队\t30846\n耳扣\t30847\n意外\t30848\n10.3_\t30849\njtj\t30850\n五毛网\t30851\n实机\t30852\nprobability\t30853\n阿帕奇\t30854\nshubiao\t30855\n▲\t30856\n鲜果\t30857\n原形毕露\t30858\n华东政法大学\t30859\n威马\t30860\n数字隔离器\t30861\n痞子\t30862\n维宏股份\t30863\n驾驶式洗地机\t30864\n跑得快\t30865\n吧里人\t30866\n入户花园\t30867\n琴弹\t30868\nPlumbing\t30869\nPRT\t30870\n三国群英传\t30871\n前卫\t30872\n万科a\t30873\n18年6月\t30874\n熊德\t30875\n长江二桥\t30876\ngogoing\t30877\njpm\t30878\n上清\t30879\n知音漫客网\t30880\n二轴\t30881\nCHU\t30882\n江西高速\t30883\n云台路\t30884\n黄花梨树\t30885\n南通经济技术开发区\t30886\n降分\t30887\n纽约世贸中心\t30888\n拉里·埃里森\t30889\n可居\t30890\nnetmask\t30891\n4月2日\t30892\n官妖\t30893\n好好看\t30894\nBD-720P\t30895\nppt+\t30896\neveryting\t30897\n0.74\t30898\n装满\t30899\n香蕉船\t30900\n20180413\t30901\n悦动圈\t30902\nCtrl+\t30903\n安阳学院\t30904\n子企业\t30905\n博浪\t30906\n超次元游戏海王星\t30907\n储油罐\t30908\n结构化思维\t30909\n途观丝绸之路\t30910\nwince6\t30911\n/月\t30912\n帅炸天\t30913\n福
慧\t30914\n前一天\t30915\n厦门大学化学化工学院\t30916\n半层\t30917\n宋孝宗\t30918\ntortoisesvn\t30919\n1087\t30920\nicheck\t30921\n理化生\t30922\n傈僳\t30923\ndotween\t30924\n螺旋形\t30925\n寒冰木\t30926\nAv在线\t30927\nzzp\t30928\n剐\t30929\n十三五规划纲要\t30930\n河津\t30931\n华西口腔医院\t30932\n39就医助手_39健康网\t30933\n祂\t30934\n8万个\t30935\n榨精\t30936\n任我看\t30937\n半程\t30938\n32位\t30939\n张小雷\t30940\n八中\t30941\n姬怜美\t30942\n线膨胀系数\t30943\n中南财经政法大学法学院\t30944\n宝莉\t30945\n渲染\t30946\n高弹\t30947\n玉兔号\t30948\nFlowLayout\t30949\nAY\t30950\n魔女传\t30951\n郑义\t30952\n声色犬马\t30953\n奥沙利文\t30954\n农家乐网\t30955\nJOIN\t30956\n唱段\t30957\n华图教育河南分校\t30958\n刨花板\t30959\nannotation\t30960\n筒\t30961\n福佳白啤酒\t30962\nlogy\t30963\n哈德斯\t30964\n自生自灭\t30965\n700分钟\t30966\nrong\t30967\n薛佳凝\t30968\nCDkey\t30969\n体检\t30970\nDU\t30971\n搪玻璃反应釜\t30972\n德祥\t30973\n套书\t30974\n二龙\t30975\n迫使\t30976\n26595\t30977\n东社区\t30978\n相识\t30979\n珠海大剧院\t30980\nMari\t30981\n陈桂林\t30982\na米\t30983\n李向阳\t30984\n送到家\t30985\nCAD2017\t30986\n07:00\t30987\n淫花\t30988\ndragons\t30989\ncydia源\t30990\n泸州政府网\t30991\nhighmaps\t30992\n拦江\t30993\n全拼输入法\t30994\n英搏尔\t30995\n挖掘者\t30996\n房间隔\t30997\n梁烈唯\t30998\n大华西溪\t30999\n耐震\t31000\nCONTRACT\t31001\n隔热玻璃\t31002\n中国临清_临清市政府\t31003\n元老院\t31004\nconsensus\t31005\n龙娃\t31006\n城西银泰城\t31007\n会计从业资格证书\t31008\n简历库\t31009\n阳煤集团\t31010\n三亚日报数字报\t31011\n宝华城市之星\t31012\nManchester\t31013\n梁梁\t31014\nwww.52shici.com\t31015\n01000\t31016\n低合金钢\t31017\n尖峰时刻3\t31018\n表现手法\t31019\n24磅\t31020\n尖峰时刻2\t31021\n3000件\t31022\n贵阳市委\t31023\nX370\t31024\n言语\t31025\n寮步镇\t31026\n集大成者\t31027\n20151016\t31028\nfufu\t31029\n勇敢者游戏\t31030\n田宅\t31031\n大生产\t31032\n无痕浏览器\t31033\n解放东路\t31034\n习总书记\t31035\n洪濑\t31036\n职字\t31037\n立正\t31038\n寄宿制\t31039\n猪毛\t31040\n海峡导报\t31041\n【君\t31042\n不寒\t31043\n广汽本田\t31044\n青岛海洋地质研究所\t31045\n延伸性\t31046\nswf\t31047\n随机码\t31048\n云栖谷\t31049\nWatch3\t31050\n锂金属\t31051\n李维斯\t31052\n一外\t31053\n传播学专业\t31054\n彭丹\t31055\n张小雨\t31056\n疏\t31057\nCHEMISTRY\t31058\n友邦\t31059\n百面\t31060\nprojector\t31061\n齐次\t31062\n赞美花\t31063\n生活必需品\t31064\n成都市武侯区\t31065\nオレン
ジ\t31066\n端平\t31067\n北海人才网\t31068\n動\t31069\n北塘区\t31070\n泰安方特欢乐世界\t31071\ndiagonal\t31072\nmp4_\t31073\n2234\t31074\n北京西火车站\t31075\n敖德萨\t31076\n北京BJ40论坛_汽车之家论坛\t31077\n飞里\t31078\n十二星座\t31079\n10瓶\t31080\n7x24\t31081\n第十一章\t31082\n运力\t31083\n脊髓空洞症\t31084\n上海九院\t31085\n经纶\t31086\n8803\t31087\n温汤\t31088\n哇靠\t31089\n276\t31090\n胡昕\t31091\n挑射\t31092\nsoldier\t31093\n阴液\t31094\nEs6\t31095\n十三香龙虾\t31096\n声音嘶哑\t31097\n裴小峰\t31098\n超凡脱俗\t31099\n拖拉\t31100\n9月19日\t31101\n李银桥\t31102\n马红\t31103\n中南大学毕业生就业指导服务中心\t31104\n三孝口\t31105\n国寿e\t31106\n信号\t31107\n肚婆\t31108\n艾利和\t31109\n125周年\t31110\n套柱\t31111\n雕刻刀\t31112\n斗牛场\t31113\nkarsa\t31114\n丁小芹\t31115\n算命大师\t31116\n天石\t31117\n显影仓\t31118\n香克斯\t31119\nrs232\t31120\n八仙全传\t31121\n法系\t31122\n自然语言处理\t31123\n九鹿林\t31124\n增肌粉\t31125\n斑马GK888T\t31126\n杯子歌\t31127\n奥帆中心\t31128\n黄圣\t31129\n郭照\t31130\n望牛墩镇\t31131\n碳氢清洗剂\t31132\navs-museum\t31133\n塞哈姆\t31134\n匀浆机\t31135\n中国中山政府\t31136\n合抱木装修网\t31137\n迎\t31138\nseve\t31139\n张家\t31140\n17件\t31141\n中国交通建设股份有限公司\t31142\n安图\t31143\n赵亮\t31144\n图恩\t31145\n贬值\t31146\n四十三\t31147\n年以来\t31148\nsnv\t31149\n上饶市人民政府\t31150\n赠书\t31151\n眼里\t31152\nlusk\t31153\n鸿顺\t31154\n黎明之街\t31155\n回收店\t31156\n页脚处\t31157\n恩熙\t31158\n整组\t31159\nlee\t31160\nfoobar\t31161\n丽景苑\t31162\n熊太行\t31163\n裸妆\t31164\n天安科技园\t31165\n企业职工带薪年休假实施办法\t31166\n修仙唱弹\t31167\n2840\t31168\nkiehls\t31169\npostgressql\t31170\n空心管\t31171\n移动商城\t31172\nSecPulse\t31173\n择人\t31174\nsus304\t31175\n旭创\t31176\nLOL英雄联盟卡顿掉帧\t31177\n锡机\t31178\nv家\t31179\n豆浆\t31180\n拦河\t31181\n甲类\t31182\n春人\t31183\n下达\t31184\n一手楼\t31185\n防骗\t31186\n宰\t31187\n加衣\t31188\n药品柜\t31189\ngjb\t31190\n跑跳\t31191\n龙漕路\t31192\n美联航\t31193\n舞台\t31194\nrestroom\t31195\n凯普生物\t31196\n收藏夹\t31197\n平章\t31198\n12月8日\t31199\njci\t31200\npscc2017\t31201\n时装版\t31202\n益跑网\t31203\n溅血\t31204\n悲痛\t31205\nDRESS\t31206\nwww.ic37.com\t31207\n2卫\t31208\n万向接头\t31209\n特工队\t31210\nCopco\t31211\n2014年3月\t31212\n明伦\t31213\n知联会\t31214\nQuantitative\t31215\n宏能\t31216\n短语\t31217\n第十集\t31218\n渣土车\t31219\n签证费\t31220\n既往不咎\
t31221\n童心\t31222\n处处\t31223\n感悟篇\t31224\n深赤湾\t31225\n严介\t31226\ncaribean\t31227\n高技\t31228\n三生三世步生莲\t31229\n伊小七\t31230\n红包\t31231\n这本书\t31232\nv7.9\t31233\n幕\t31234\n土豆丝\t31235\n融侨观澜\t31236\n中国太保\t31237\n小葱秀\t31238\n同济大学经济与管理学院\t31239\n混凝土泵车\t31240\ndeepfake\t31241\n希克\t31242\nverbose\t31243\n真空绝热板\t31244\n当令\t31245\nARIMA\t31246\nopen\t31247\n梦幻召唤兽\t31248\n2018年04月22日\t31249\nnoch\t31250\n求援\t31251\n马家沟\t31252\n北京市商汤科技开发有限公司\t31253\n参订\t31254\n瑞风S7\t31255\n答录机\t31256\n古币\t31257\n狼毛\t31258\n优理氏\t31259\n深圳市规划和国土资源委员会\t31260\n第33轮\t31261\n不规\t31262\n严艺丹\t31263\nWallets\t31264\n起亚K2\t31265\n103家\t31266\n锦成\t31267\n合资企业\t31268\n身下\t31269\n大禾\t31270\n开发板\t31271\n广德\t31272\n1830年\t31273\n批量改\t31274\n赶紧\t31275\n捐步\t31276\n安徒凤囚凰\t31277\n二十四节\t31278\n发牢骚\t31279\n绽开\t31280\n200日\t31281\n离群\t31282\nPalladium\t31283\n伽马刀\t31284\n外宿\t31285\n回文数\t31286\n郑济\t31287\n深紫色\t31288\n8.30\t31289\n延产\t31290\n乐松\t31291\nVitamins\t31292\n展示厅\t31293\n评先\t31294\n11亿美元\t31295\n影碟\t31296\n石述思\t31297\nHttpGet\t31298\n6速\t31299\nrobbie\t31300\n人力资源管理师三级\t31301\n稠\t31302\nSTATA\t31303\nLOP\t31304\n面包片\t31305\nFisherman\t31306\n宝来宝马x3\t31307\n艾蒙\t31308\n冒险王卫斯理之支离人\t31309\n几百种\t31310\n直流伺服电机\t31311\n几扇\t31312\n马华\t31313\nwebbrowser\t31314\n龙之刃\t31315\n超星移动图书馆\t31316\n肯·福莱特\t31317\nripe\t31318\nMBA加油站\t31319\nFMRTE\t31320\nnetty-socketio\t31321\n阀箱\t31322\nけ\t31323\n便民网\t31324\nPath\t31325\n北海政府\t31326\n3月27\t31327\nbosch\t31328\n心路\t31329\n狡兔\t31330\n3声\t31331\n赣锋\t31332\n东方电缆\t31333\n新百微\t31334\n画树\t31335\n耶加雪菲\t31336\n枣泥\t31337\n林刚\t31338\n小芹\t31339\n二手车床\t31340\n伊斯兰历\t31341\nUPLC\t31342\n南澳西\t31343\nopengles\t31344\nchengren\t31345\n追赶\t31346\n十五夜\t31347\n马耳他\t31348\n2分集\t31349\n尖东\t31350\n坐标系\t31351\n乱伦秘史\t31352\n主材\t31353\n免填\t31354\n二根\t31355\n大冶网\t31356\n上海)自由贸易试验区\t31357\n5]\t31358\n进口\t31359\n数显\t31360\n劳保服\t31361\n李胜基\t31362\n热刺\t31363\njokes\t31364\n柒夜\t31365\n桃谷艾莉卡\t31366\n大厂\t31367\n老叔\t31368\n凯少\t31369\n底稿\t31370\nsignals\t31371\nfist\t31372\nhazelcast\t31373\n藤本紫媛\t31374\n黄旭东\t31375\
n滴水瓦\t31376\n一口\t31377\nSLDPRT\t31378\n投宿\t31379\n乐民\t31380\n3016\t31381\npersuade\t31382\n真崎航\t31383\n迷你世界\t31384\n32式太极拳\t31385\n油底\t31386\n二十款\t31387\n孤单又灿烂的神-鬼怪\t31388\ncococo\t31389\n优度\t31390\n锐盾\t31391\nvv鱼儿vv\t31392\n玫红色\t31393\n良玉\t31394\n双跨\t31395\n199IT\t31396\n深圳车管所\t31397\n万万\t31398\n九洲大道\t31399\n恋爱小说\t31400\n红烧鸡翅\t31401\n条码机\t31402\n随机密码\t31403\n梦幻之星4\t31404\n石兆琪\t31405\n在空中\t31406\nAbbreviations\t31407\n张楚楚\t31408\n香港中文大学深圳校区\t31409\n纪念碑谷\t31410\n805\t31411\n华南师范大学心理学院\t31412\nopenal\t31413\nKerberos认证\t31414\nM1136\t31415\nPU皮\t31416\n杨歌\t31417\n端上\t31418\n墨\t31419\n县林业局\t31420\n三年级英语\t31421\nPK10\t31422\n青岛海尔\t31423\n亿力洗车机\t31424\n辊筒\t31425\nlog4jdbc\t31426\nSQL优化\t31427\n唯心\t31428\n光启城\t31429\n流量宝\t31430\n墨绿色\t31431\n桂林医学院附属医院\t31432\nDITF\t31433\n电容式\t31434\nFFT\t31435\n智酷\t31436\n镇宁县\t31437\nplo\t31438\n常州新城\t31439\n黑河市人民政府\t31440\n血沉\t31441\n天逆\t31442\n姜涛\t31443\n中国儿童资源网\t31444\n青柳\t31445\n恐怖丛林肉搏\t31446\n权\t31447\n鉴识\t31448\n法牛\t31449\n盐酸林可霉素\t31450\n校宝\t31451\n中国石油大学胜利学院\t31452\nr9s\t31453\n修复版\t31454\n8300\t31455\n甲基\t31456\n迪沙\t31457\n秘武\t31458\n天地劫神魔至尊传\t31459\n杨筠松\t31460\n第4页\t31461\n镇头镇\t31462\n美洲杯\t31463\n美军机\t31464\n保温袋\t31465\n春风化雨\t31466\nseen\t31467\n云表\t31468\n慕容小凡\t31469\n闲愁\t31470\n高铁新区\t31471\n反悔\t31472\nswin\t31473\n净慈寺\t31474\n宰予\t31475\n康沈路\t31476\n188.com\t31477\nstu\t31478\nnth\t31479\nhers\t31480\n蒙丹\t31481\n蒲庙\t31482\n方城\t31483\nFluor\t31484\n佛法\t31485\n药疗\t31486\n新机\t31487\nZ型\t31488\n游龙\t31489\nDFI\t31490\n普及化\t31491\n淘宝卖家论坛\t31492\n画贴图\t31493\nross\t31494\n二道江区\t31495\n甲基纤维素\t31496\n豆沙绿_\t31497\n小山羊\t31498\n常熟市区\t31499\n社会步\t31500\n吹塑纸\t31501\n隋雪儿\t31502\nMetals\t31503\nBigasoft\t31504\n面试会\t31505\n上海青浦\t31506\n曾侯乙\t31507\n荣耀战魂\t31508\nPhotoshop2018\t31509\nMYD\t31510\nhid\t31511\n逃税\t31512\nesse\t31513\n旧岁\t31514\n滨江新城\t31515\nlax\t31516\n师风\t31517\n过渡期\t31518\n梳理\t31519\n面疙瘩\t31520\n四氧化三铁\t31521\nQUIC\t31522\n化妆用品\t31523\n增殖\t31524\n红外测油仪\t31525\nyemalu\t31526\n3.0t\t31527\n地下水\t31528\n中国少年先锋队\t31529\n上海交通大学分析测试中心\t315
30\nMacOSX\t31531\nCasablanca\t31532\n【唯满侠\t31533\n异界\t31534\n在此一举\t31535\n老心不老\t31536\n队歌\t31537\n莱袭\t31538\n山屋\t31539\ndbs\t31540\n葡语\t31541\n054a\t31542\n安定门\t31543\n解封器\t31544\n163邮箱\t31545\n大风歌\t31546\n静儿\t31547\n20170304\t31548\n太完美\t31549\n免税购车\t31550\nju\t31551\n首发\t31552\n应采儿\t31553\n商芳\t31554\n海床\t31555\n织锦\t31556\n太湖街道\t31557\n特此\t31558\n细川\t31559\n华为mate10Pro\t31560\nange\t31561\n阿浪\t31562\n银英\t31563\n燕尾服\t31564\naico\t31565\neGouz\t31566\n黑曼巴\t31567\n纳米晶\t31568\n万壑\t31569\n拉簧\t31570\n语文\t31571\n穆菲菲\t31572\n垓下歌\t31573\n大学启示录\t31574\n盘号\t31575\nSDDE\t31576\n720P高清1080P蓝光3D\t31577\n师心\t31578\n艾瑞泽7论坛_汽车之家论坛\t31579\n广都\t31580\n陌上人\t31581\nIPS\t31582\n副总裁\t31583\n大副\t31584\n扎马斯\t31585\n平安e家保\t31586\n长泽雅美\t31587\n人脉\t31588\n建筑给水排水设计规范\t31589\n保护色\t31590\n千人计划\t31591\n隆昌县\t31592\n红冲发票\t31593\n迈锡尼\t31594\n朝雾\t31595\n埋弧焊机\t31596\n林觉民\t31597\n419\t31598\nLEFT\t31599\n119号\t31600\n人生得意须纵欢\t31601\n120万元\t31602\n浙江省旅游局\t31603\n桃花心木\t31604\n鹤洲\t31605\n赎\t31606\n2017年4月7日\t31607\n源\t31608\n富春山居图\t31609\n邪灵水\t31610\n归人\t31611\n祈福缤纷汇\t31612\n11.11\t31613\n创兴银行\t31614\n阴体\t31615\n押金单\t31616\n大庄科乡\t31617\n39药品通\t31618\n张铁生\t31619\n曲棍球\t31620\n3}\t31621\n杂说\t31622\n6格\t31623\n经纬\t31624\nSTMBUY\t31625\n浚县人民政府\t31626\n徐永涛\t31627\n平安银\t31628\ndingyi\t31629\n石化路\t31630\nチ\t31631\n风控模型\t31632\n14.04.1\t31633\n大悲古寺\t31634\n以讹传讹\t31635\n拿战\t31636\n长嘴\t31637\n泡菜\t31638\n张文顺\t31639\n81家\t31640\n欢愉\t31641\n孙河\t31642\n207号\t31643\n罗经\t31644\n莉迪亚\t31645\n连射\t31646\nminded\t31647\n顾瑶\t31648\n乐天海淘\t31649\n山西村\t31650\n抢修\t31651\n条装\t31652\nHearts\t31653\n痛苦术\t31654\n鸭架\t31655\n睁着眼\t31656\nFrontiers\t31657\n白石山景区\t31658\nconsequence\t31659\n最实惠\t31660\n村队\t31661\n监理员\t31662\n广明\t31663\n妈祖庙\t31664\n义乌国际商贸城\t31665\n下呼吸道\t31666\n九针\t31667\nMoovit\t31668\ngoing\t31669\n带式\t31670\n混沌之戒3\t31671\n遁\t31672\n蔡亮\t31673\n鱼台县委\t31674\n第22名\t31675\nCCTV1在线直播电视\t31676\n北京海淀驾校\t31677\nmantra\t31678\n300433\t31679\n9位\t31680\n刘光华\t31681\n中文拳击/搏击\t31682\n北京)技术有限公司\t31683\n马六甲\t31684\nGoggles\t31685\n柱面\t31
686\n编修\t31687\nApt\t31688\n华碧\t31689\nb50\t31690\n驻外\t31691\n拼命三郎\t31692\n数据分析师\t31693\n工银e生活\t31694\n发射场\t31695\nVapor\t31696\n软件设计师考试\t31697\nKrystal\t31698\n东远\t31699\n购物中心\t31700\n齐活\t31701\n夜神多开器\t31702\n金巴利\t31703\n包券\t31704\n菲尔\t31705\n特殊化\t31706\n靳氏\t31707\n超嗨\t31708\n68期\t31709\naud\t31710\n上海新航道学校\t31711\nMoana\t31712\n带鱼屏\t31713\n流星蝴蝶剑\t31714\n鲤鱼洲\t31715\n备过案\t31716\n摇摆器\t31717\n死或生5:最后一战\t31718\nHc360\t31719\n三十二年\t31720\n想见\t31721\n座谈纪要\t31722\ninport\t31723\nKMSpico\t31724\n网管\t31725\n中国电机工程学报\t31726\n8180\t31727\n东谷\t31728\n导光柱\t31729\n小儿垂钓\t31730\n跑步\t31731\n温情\t31732\nflex2\t31733\nboto\t31734\nIdentifying\t31735\n寄生体\t31736\n法令\t31737\n喝药\t31738\njs-sdk\t31739\n全时\t31740\nrecttransform\t31741\n买网\t31742\nEasyUI\t31743\n乌有\t31744\n随爱\t31745\n孙虹烨\t31746\nRover\t31747\n脱钩\t31748\nAMD显卡\t31749\n水默\t31750\n庄思敏\t31751\n保利威视\t31752\n4K电视\t31753\n各司\t31754\n一对一\t31755\nsparkdev\t31756\n片道\t31757\n停标\t31758\n月租费\t31759\n高纬\t31760\n黄脸\t31761\n影音先锋看片资源_影音先锋资源站av资源\t31762\n刘兰芝\t31763\n义乌小商品批发网\t31764\n辽宁电信\t31765\n980nm\t31766\n塔纳安\t31767\n清水县\t31768\n廊坊\t31769\n两单\t31770\nelectrode\t31771\n地产经理人\t31772\n上海海事大学\t31773\n洛阳市区\t31774\n龙神\t31775\nsoulworker\t31776\n阀盖\t31777\n乖乖兔\t31778\n碌曲\t31779\n剁碎\t31780\n蚁人\t31781\n青阳县人民政府\t31782\n吴娜\t31783\n献佛\t31784\n优众\t31785\n三菱Mitsubishi\t31786\n名誉权\t31787\n摇摇椅\t31788\n树轮\t31789\n雅江县\t31790\n糖艺\t31791\nBuyee\t31792\n长丰大道\t31793\n暖流\t31794\nh\t31795\n旗币\t31796\n结香\t31797\n能量计\t31798\n挂出\t31799\nCSS教\t31800\n魁拔4\t31801\n洛奇亚\t31802\n企查\t31803\n耀县\t31804\n黑吃黑\t31805\n星桥\t31806\n赣州市国土资源局\t31807\n四川省审计厅\t31808\n雪佛兰科帕奇\t31809\n王掌柜\t31810\n碧桂园小区\t31811\nMonth\t31812\nmingfei\t31813\n一品红\t31814\n数过\t31815\n湖南省体育局\t31816\n明日歌\t31817\n李志刚\t31818\n76号\t31819\nbillbo\t31820\nKAITO\t31821\n何洛\t31822\nLOVELIVE\t31823\n驯兽\t31824\n聚乳酸\t31825\n大沽河\t31826\n高峰论坛\t31827\n太空战\t31828\n翘屏\t31829\n炉石传说女巫森林\t31830\n邀月\t31831\n训练机\t31832\n马加爵\t31833\n潮汕地区\t31834\n版权转让协议\t31835\n1566\t31836\n忻钰坤\t31837\nGlitch\t31838\nFinalData\t31839\n七处\t31840\n
2017年7月10日\t31841\nteamviwer\t31842\n白金装\t31843\n合计\t31844\npiggy\t31845\nRANK函数\t31846\nオリジナルサウンドトラック\t31847\n计取\t31848\n0.9.7\t31849\n曹飞\t31850\n20160328\t31851\n宁沪高速\t31852\nVFD\t31853\n2018年3月31日\t31854\n冷拉钢\t31855\n南翔站\t31856\n燕郊开发区\t31857\n双台\t31858\n217号\t31859\n北京婚宴酒店\t31860\n101集\t31861\n飞华\t31862\n8.0_\t31863\nWat\t31864\n己者\t31865\n千军万马\t31866\n新疆分社\t31867\n精通网\t31868\n烘焙坊\t31869\n翠平\t31870\n160页\t31871\n张中\t31872\n之类\t31873\n体活\t31874\n行业新闻网\t31875\n五大道\t31876\n贵安欢乐世界\t31877\n友人帐\t31878\n李长栓\t31879\n战锤:全面战争\t31880\nCain\t31881\nTransient\t31882\n21斤\t31883\n比宽\t31884\n中华坊\t31885\n河北师范大学\t31886\n自绝\t31887\n腰鼓舞\t31888\nfarwish\t31889\n财富值\t31890\nel-table\t31891\nGloves\t31892\nU88加盟网\t31893\n水经注万能地图下载器\t31894\nIN\t31895\n原型化\t31896\n地点\t31897\n多段\t31898\nMMORPG\t31899\n药膏\t31900\n清气\t31901\nhardt\t31902\n梦幻模拟战\t31903\n微末\t31904\n城市天际线\t31905\ngeekbench\t31906\n跳空\t31907\n同济大学材料科学与工程学院\t31908\n99.1\t31909\nvenusblood\t31910\n没地\t31911\n前不久\t31912\n光明村\t31913\n满溢\t31914\n几项\t31915\n阿尔茨海默症\t31916\n发高烧\t31917\nFuchsia\t31918\n代写\t31919\nラブライブ\t31920\n杨弓\t31921\n政工师\t31922\nBoBo\t31923\n红宇\t31924\n细胞培养箱\t31925\n1500亿美元\t31926\n驭龙\t31927\nssi\t31928\n注册局\t31929\n灯光师\t31930\n农忙\t31931\n露天矿\t31932\n370亿\t31933\nPhotoshopCS5\t31934\n葵花子\t31935\n不可分\t31936\n雅雯\t31937\n风魂\t31938\n6月3日\t31939\n广东省公安厅消防局\t31940\n方晓\t31941\n画像\t31942\n企业博客网\t31943\n相全\t31944\n11.1.5\t31945\nAg\t31946\n万科公园大道\t31947\n普康\t31948\n大龙山\t31949\nrecent\t31950\n含羞草\t31951\n绝美冥王夫\t31952\n4H\t31953\n唐浩东\t31954\nppt幻灯片\t31955\n52pk.com\t31956\n传票\t31957\n大元素\t31958\n眼痛\t31959\nDigitizer\t31960\nMesa\t31961\n代理服务费\t31962\n检讨\t31963\nGantt\t31964\n库特\t31965\n迅捷视频转换器\t31966\n240000\t31967\n牛总管\t31968\nHackers\t31969\n国机\t31970\nQAF\t31971\n西城路\t31972\n博纳国际影城\t31973\n葛店吧\t31974\n近郊区\t31975\n思明区软件园二期\t31976\n团干部\t31977\n2.1mm\t31978\n基表\t31979\n阿蛮\t31980\nV6.8\t31981\n撒拉族\t31982\n街舞团\t31983\nROBAM\t31984\n小窝\t31985\nC3P0连接池\t31986\n黄小柔\t31987\n塞下\t31988\n香溪\t31989\nKVR\t31990\n诺彭族\t31991\n分券\t31992\n1
20方\t31993\nHough\t31994\n城堡传说2外传\t31995\nredshift渲染器\t31996\n3504\t31997\n17115\t31998\n西安小区\t31999\nQuerydsl\t32000\n测流\t32001\n南京高新区\t32002\n智联\t32003\n日立电梯\t32004\n安智宝\t32005\n段伟红\t32006\n【华\t32007\n柏邦妮\t32008\n野色\t32009\n璞悦\t32010\n之二十二\t32011\n一汽丰田卡罗拉\t32012\n茅海建\t32013\nforex\t32014\n手绘板\t32015\n西平镇\t32016\nsalty\t32017\n劳务派遣协议\t32018\n吊炉\t32019\n老普桑\t32020\nbacteria\t32021\n红豆社区\t32022\n银河证券\t32023\n1876年\t32024\n四川航空\t32025\n此版\t32026\nnx11\t32027\n幸福婚姻\t32028\n外土司\t32029\n滤机\t32030\nNeither\t32031\n领进门\t32032\nmavic\t32033\n佳能MG3680\t32034\nhey\t32035\n宝德龙\t32036\n湖南省长郡中学\t32037\n四围\t32038\n三通\t32039\n学习方略\t32040\n大猫电影网\t32041\n表格\t32042\n漫威影业\t32043\n二次线\t32044\n综合管理部\t32045\n第页\t32046\n行房\t32047\n容嬷嬷\t32048\n恶者\t32049\nBanking\t32050\n哥德巴赫猜想\t32051\n破解\t32052\n炫舞时代\t32053\n安和康网\t32054\n孙英\t32055\n炜\t32056\n陈光中\t32057\nradix\t32058\n最萌\t32059\n赛特奥莱\t32060\nhtmldiv\t32061\n京基百纳广场\t32062\n婚恋\t32063\n四张牌\t32064\n浙江移动\t32065\nISUZU\t32066\n啤酒厂\t32067\n百度营销\t32068\n磁吸式\t32069\n三极管\t32070\n字组\t32071\nCoherence\t32072\n馒头机\t32073\n寂静之地\t32074\nArturia\t32075\n男生子\t32076\n西坞\t32077\n公元对照简表\t32078\n胜利号\t32079\n球后卫\t32080\nproject2010\t32081\n上\t32082\n金地物业\t32083\n农女\t32084\n歪碰\t32085\n赤尾\t32086\n中国测试技术研究院\t32087\n龙湖北城天街\t32088\n牧场物语中文网\t32089\n周明华\t32090\n张广德\t32091\n压铸件\t32092\n摸爬滚打\t32093\n酸雨\t32094\n中铁十五局\t32095\n废水量\t32096\n风驰电掣\t32097\n王伟光\t32098\ni5-7200u\t32099\n薯蓣\t32100\nIndicators\t32101\n4.0.8\t32102\n英雄本色2018\t32103\n段先念\t32104\n黛欣霓\t32105\n中国工行\t32106\n千位数\t32107\nActs\t32108\n中间人攻击\t32109\n钴粉\t32110\n疯狂的爱\t32111\n二豆\t32112\nhdcp\t32113\n西门子杯\t32114\nYCA\t32115\nFIFA18\t32116\n篡夺\t32117\n2150\t32118\n小潘潘\t32119\n15度\t32120\nConnected\t32121\n一顿\t32122\n诺和力\t32123\n培生\t32124\n1000枚\t32125\n侠蓝月\t32126\n大状\t32127\ncrazymus\t32128\n日光浴\t32129\nFreegate\t32130\n好兆头\t32131\n劳动报酬罪\t32132\n和仁科技\t32133\n移动电信\t32134\nWEB版\t32135\nNEPCON\t32136\nsoles\t32137\n洋葱头\t32138\ncss样式表\t32139\n里克特\t32140\n微波炉变压器\t32141\n感叹词\t32142\n亡夫\t32143\n59万\t32144\nARDUINO\t32145
\n超级课堂\t32146\n重型货架\t32147\nDonna\t32148\n体步\t32149\n2018.4.16\t32150\n11.23\t32151\n200项\t32152\n失信执行人\t32153\n风吹日晒\t32154\n财务通\t32155\n预防接种日\t32156\n山西职业技术学院\t32157\n10.5%\t32158\nCleansing\t32159\n尹素婉\t32160\n嘿群晖\t32161\nimessage\t32162\n播放页\t32163\n流言蜚语\t32164\n黄晖\t32165\n_三国志11吧_\t32166\n海龙燃油宝\t32167\nナ\t32168\n2.4.2\t32169\n1Mbps\t32170\nein\t32171\n20170718\t32172\ndnv\t32173\n0414\t32174\n27cm\t32175\n沉冤\t32176\n维京征服\t32177\n饭煲\t32178\n恒安新区\t32179\n600g\t32180\njiangxi\t32181\n往返\t32182\n工具化\t32183\n踌躇\t32184\n一个18岁\t32185\n灿烂人生\t32186\n贾树丙\t32187\n统一活动日\t32188\n电热恒温鼓风干燥箱\t32189\n喵污\t32190\n鹿角海棠\t32191\n臭皮\t32192\n純\t32193\n王永明\t32194\n快吧手游\t32195\n密封面\t32196\n幕后\t32197\nFAX\t32198\n定版\t32199\n阻计\t32200\n枫雪\t32201\n歌舞厅\t32202\nassociates\t32203\n王馥荔\t32204\n阳光棕榈园\t32205\ntypes\t32206\n巴音\t32207\ndn32\t32208\n局局\t32209\n桑塔纳论坛_汽车之家论坛\t32210\n男恋\t32211\n96岁\t32212\n具备\t32213\n定年\t32214\nadditive\t32215\n2017年4月4日\t32216\nhp\t32217\nisd\t32218\n中建交通建设集团有限公司\t32219\n英豪\t32220\nufc\t32221\n寒带\t32222\n升天\t32223\n图色\t32224\n火焰弹\t32225\n竞相\t32226\nDummies\t32227\n喘振\t32228\n6SP\t32229\n8350K\t32230\n婴幼儿奶粉\t32231\n2600x\t32232\n丙炔\t32233\n村务\t32234\n元明\t32235\nSmoothing\t32236\nEndNoteX6\t32237\n葵\t32238\n别干\t32239\nTerminator\t32240\n金业\t32241\n入户\t32242\nsiki\t32243\n廖佳琳\t32244\n诺诺\t32245\n花会\t32246\n转速\t32247\n休市\t32248\n2.31G\t32249\n长征精神\t32250\n二个月\t32251\n艾小图\t32252\nContrast\t32253\n14年前\t32254\n相濡\t32255\nhearted\t32256\nshiseido\t32257\n古力娜\t32258\n0.17.0\t32259\n浴佛节\t32260\n轨顶\t32261\nbrembo\t32262\nElle\t32263\n滹沱河\t32264\n京门风月\t32265\n南湖大道\t32266\n澳门城市大学\t32267\n吴建民\t32268\n克莱斯勒大捷龙\t32269\ncure\t32270\n二十年以后\t32271\nx250\t32272\n会餐\t32273\n五一学校\t32274\nVELVET\t32275\n产品化\t32276\n新古典风格\t32277\n钻出\t32278\n状元卷\t32279\n关键问题\t32280\n近3个月\t32281\n钟瑞\t32282\n呼噜声\t32283\n证券投资\t32284\n防护林\t32285\n中度抑郁症\t32286\n苹果专卖店\t32287\nBorderlands\t32288\n75所\t32289\n小浩\t32290\n热带沙漠\t32291\n南波湾\t32292\n北京化妆学校\t32293\n矿山公园\t32294\n笔压\t32295\nDialogFragment\t32296\n6.05\t32297\
n莲花头\t32298\n吕业升\t32299\n吉崎直绪\t32300\n吹灭\t32301\nPRINCE2\t32302\n配号\t32303\n冬残奥会\t32304\n马剑\t32305\n雪肌精\t32306\ndesigning\t32307\nLinkedIn\t32308\nreact-navigation\t32309\n雨花区\t32310\n莱商银行\t32311\n金5\t32312\n叱咤港乐吧\t32313\n消防宣传日\t32314\n铁碳填料\t32315\nNGA\t32316\nwangye\t32317\nikanalyzer\t32318\n小权\t32319\n12.5.2\t32320\n一整行\t32321\n灾煞\t32322\n陶塑\t32323\nhynix\t32324\n三亚酒店\t32325\nuspto\t32326\nMAC切换输入法\t32327\n吊德斯\t32328\n垂死\t32329\n辣椒蟹\t32330\n4520\t32331\n800平米\t32332\n吉利远景S1\t32333\n轮椅\t32334\n体态\t32335\n模拟百校\t32336\n辩论稿\t32337\n耿直\t32338\nbbbb\t32339\nState\t32340\n果篮\t32341\n尧山\t32342\n工作日程\t32343\nadrian\t32344\n龙虾节\t32345\nrubao\t32346\nsesderma\t32347\n驾驶仪\t32348\nSTAGE\t32349\n金箍\t32350\n中国—东盟博览会\t32351\n龙腾3\t32352\n八点\t32353\n燕子飞\t32354\n御览\t32355\nxiaobao\t32356\n冷藏柜\t32357\n花礼\t32358\n明线\t32359\n尘尘\t32360\n星级饭店\t32361\nslo\t32362\ndynamodb\t32363\n2799元\t32364\nomo\t32365\n2000多\t32366\n后悔\t32367\n钥匙套\t32368\n麻美\t32369\nQQ华夏\t32370\n建平县\t32371\n阿维斯布隆\t32372\n断开\t32373\n护卫\t32374\n123.com\t32375\n恍如\t32376\n渗透型\t32377\n搜道网\t32378\nMinute\t32379\n新天科技\t32380\n佳缘\t32381\n南洋迪克\t32382\n20151230\t32383\n招财树\t32384\nCiliDB\t32385\n女人邦\t32386\n长青路\t32387\nmiserable\t32388\nETF基金\t32389\n块石\t32390\n纤\t32391\n30方\t32392\n文明城\t32393\nCardio\t32394\n全一快递\t32395\nkvo\t32396\n双过半\t32397\ntablet\t32398\n第80号\t32399\n甥\t32400\n5e对战平台\t32401\n三两二\t32402\n沈老头\t32403\n欢场\t32404\n周一清晨\t32405\n工人\t32406\n非流动资产基金\t32407\n卡巴斯基\t32408\n周思敏\t32409\n点距\t32410\n丝绸之路经济带\t32411\n赵云飞剑\t32412\n爆料者\t32413\nt7000\t32414\n金丰\t32415\n都市运输2\t32416\n挺胸\t32417\n真特\t32418\nPandas\t32419\n南京市教育科学研究所\t32420\n文艺学\t32421\n行知\t32422\n哆啦A梦:伴我同行\t32423\n北岭\t32424\nhrd\t32425\n英雄之村\t32426\nEBA\t32427\n清塘\t32428\nBrighter\t32429\n百丝\t32430\n土地使用权摊销\t32431\nelink\t32432\n塔子山公园\t32433\naward\t32434\n雪豹\t32435\n移动触屏\t32436\n欧根纱\t32437\n脉压\t32438\nQUARTZ\t32439\n实证\t32440\n福特公司\t32441\n传统式\t32442\ncast\t32443\n调档函\t32444\n澎湃新闻网\t32445\n抵用券\t32446\n守望先锋天使\t32447\n碧桂园太阳城\t32448\n中国红十字基金会\t32449\n葛民辉\t32450\n
雷怪\t32451\n弥勒县\t32452\nimprint\t32453\n第44章\t32454\n爱生\t32455\nLookUpEdit\t32456\n孝感米酒\t32457\n打错了\t32458\n真空胎\t32459\n我的继父是偶像\t32460\n宁波江\t32461\n林超\t32462\n处理库\t32463\n果穂\t32464\nGravity\t32465\n疏水器\t32466\n琴魔\t32467\n4宫\t32468\n缅先烈\t32469\n分位数\t32470\n侯赛因\t32471\n副食\t32472\n王码五笔输入法\t32473\n乳胶枕\t32474\n7级\t32475\n主点\t32476\n秋景\t32477\n东方IC\t32478\n华阴市人民政府\t32479\n光亚展\t32480\n版包\t32481\n南京工商银行\t32482\n轮轮\t32483\nprov\t32484\n宜宾日报\t32485\n刘文杰\t32486\n53集\t32487\nprospect\t32488\n于水\t32489\n辅助位\t32490\n超级录屏\t32491\nBailey\t32492\n秋装\t32493\n劳烦\t32494\n荣丰2008\t32495\nIPSA\t32496\n券业\t32497\n在海滩上\t32498\n红豆角\t32499\n解说版\t32500\n河南省环保厅\t32501\n茶类\t32502\nFrancesca\t32503\n撕下来\t32504\n八臂\t32505\n10秒内\t32506\n弩\t32507\n天启者\t32508\n20140616\t32509\n土球\t32510\n牧羊犬\t32511\n终身险\t32512\n玻利维亚\t32513\n三七花\t32514\n北京妇女网\t32515\nBarco\t32516\n电脑设备管理器\t32517\n赟\t32518\n陈瑞\t32519\n升腾\t32520\nTOP2\t32521\n秒批秒\t32522\n变调夹\t32523\n油利\t32524\n王吉\t32525\nSafari\t32526\n宗璞\t32527\n测试盒\t32528\n太工口\t32529\n战戟\t32530\nWMF\t32531\nWin10任务管理器\t32532\n湖南省省\t32533\n2017年7月9日\t32534\n开源\t32535\n复合物\t32536\n神秘海域4\t32537\n過\t32538\nvss\t32539\n展图\t32540\nblance\t32541\n慎选\t32542\n长钢\t32543\n广州白云区\t32544\n2017年8月25日\t32545\n吸尘器\t32546\n桃花雨\t32547\n反胶\t32548\n落地架\t32549\n乐视s50\t32550\n97色\t32551\n柳丁\t32552\n青天白日旗\t32553\nCONTROL\t32554\n碧云寺\t32555\nQQ空间说说\t32556\n糯米蛋\t32557\n闻一\t32558\nsolving\t32559\n橘朵\t32560\n桥本奈奈未\t32561\n网络学习空间人人通\t32562\nbromo\t32563\n青丝\t32564\n三合\t32565\n莱芜一中\t32566\n上两天\t32567\n凭祥\t32568\nvuex2.0\t32569\n特优生\t32570\n龚虹嘉\t32571\n通天岩\t32572\n卡顿掉帧\t32573\n菀\t32574\nuni\t32575\n别具一格\t32576\n小提琴曲\t32577\nCatherine\t32578\n院里\t32579\n赚头\t32580\n南京保利大剧院\t32581\n纲领性\t32582\ntex\t32583\n固定期\t32584\n巍然屹立\t32585\n舍五入\t32586\n鼠笼式\t32587\nvist\t32588\n六根\t32589\npopularity\t32590\n胡乔木\t32591\n无忌\t32592\n耐力赛\t32593\n0411-66325017\t32594\n1256\t32595\n拉扎尔\t32596\nCatz\t32597\n江苏省委组织部\t32598\n五运六气\t32599\n吉林站\t32600\n塑料回收标志\t32601\nNUDEIDOLS\t32602\n三二一\t32603\nOrganizational\t32604\n甲仑\t32
605\n37.5度\t32606\n平均差\t32607\n读取值\t32608\n感情\t32609\n24歳\t32610\n三浦\t32611\nJava编程\t32612\n人力资源师\t32613\n相思君\t32614\n掐丝珐琅\t32615\n四起\t32616\n文一路\t32617\n束\t32618\n时间步\t32619\n白雪公主和七个小矮人\t32620\n枣木\t32621\n麦盖提县\t32622\ncarolina\t32623\n牛奶蛋白\t32624\nM.66152.COM\t32625\n万村\t32626\n东方网力\t32627\n微调\t32628\n光值\t32629\n1321614\t32630\n青山湖科技城\t32631\n130号\t32632\n鬼宠\t32633\n建设工程法规及相关知识\t32634\ni5-6400\t32635\n15年后\t32636\n息英杰\t32637\n新八佰伴\t32638\nts\t32639\nAbdul\t32640\nsql文件\t32641\nruiwen\t32642\n介导\t32643\n八方来客\t32644\n芮成钢\t32645\n偏食\t32646\n海拉5\t32647\n广东共青团\t32648\n中出\t32649\n生命化\t32650\n5.6.17\t32651\n南二路\t32652\n福特F150\t32653\n广西人才网\t32654\n奔驰gle\t32655\n生源\t32656\nMOVIES\t32657\n火彩\t32658\n广州留学人员服务管理中心\t32659\n反恐怖\t32660\n74师\t32661\n张倩\t32662\n蓝采\t32663\n星河丹\t32664\nF4\t32665\nWinxp\t32666\njosn\t32667\n向心\t32668\n滑步\t32669\n大连外国语学院\t32670\n闽浙\t32671\n陈忠\t32672\n特典\t32673\n走向共和\t32674\n行尸之惧\t32675\nnatvie\t32676\n南昌欧菲光科技有限公司\t32677\nREDHAT\t32678\n桩基承台\t32679\n第127期\t32680\n上海开放大学\t32681\n雅赞\t32682\nmp4ba\t32683\nGD库\t32684\n全流\t32685\n调遣\t32686\n华西临床医学院\t32687\nonload\t32688\n彭帅\t32689\n王清\t32690\n布里斯班机场\t32691\n铁核桃之无间风云\t32692\nargs\t32693\n菜粕\t32694\n正基\t32695\n汉明码\t32696\nInspection\t32697\n腿伤\t32698\n小英子\t32699\n欲强\t32700\n一瞬\t32701\n影咖\t32702\n页_小说阅读网\t32703\nLAMBDA\t32704\n拳击吧\t32705\n湖里大道\t32706\n环翠区\t32707\n交复\t32708\n占位性病变\t32709\n汽车检测站\t32710\n游匣7567\t32711\n遵义县\t32712\n熊梓淇\t32713\n卫健\t32714\n游戏\t32715\n几年前\t32716\n中粒式\t32717\n几星\t32718\n卸妆乳\t32719\n拜金女\t32720\nipadair2\t32721\n沈嫣\t32722\nCAD2019\t32723\nMarch\t32724\n两高一部\t32725\n两蒋\t32726\nontario\t32727\n小婊\t32728\n手巾\t32729\naffair\t32730\nHi商学院\t32731\nExtend\t32732\n纽约机场\t32733\n案内\t32734\nUIlabel\t32735\n六安瓜片\t32736\n虎嗅网\t32737\n氨基酸口服液\t32738\n小塘\t32739\n辨明\t32740\n卡帕莱岛\t32741\n许阁\t32742\nvidz\t32743\n漏窗\t32744\n52RD.com\t32745\n6场\t32746\n千鱼\t32747\n线杠\t32748\n冲绳岛\t32749\n汤头\t32750\n曹妃甸港\t32751\ne42\t32752\nntext\t32753\nTicket\t32754\n34届\t32755\n依着\t32756\n遥控坦克\t32757\n酸酸\t32758\ntequila\t327
59\n公座\t32760\n日皮\t32761\n双子村\t32762\n心玉\t32763\n怂\t32764\n长安金欧诺_长安金欧诺\t32765\n须根\t32766\n旧宫\t32767\n杨天若\t32768\n植物油\t32769\nPanels\t32770\nmf839\t32771\n中国科学院武汉岩土力学研究所\t32772\n疯城记\t32773\n脑梗死\t32774\n小两口\t32775\n人民日报刊文\t32776\n出访\t32777\nALBUM\t32778\n至理名言\t32779\nuserid\t32780\n桃香\t32781\n子单位\t32782\n佛山市人民政府办公室\t32783\n北京石景山\t32784\n江苏政府\t32785\n金属工艺学\t32786\n上林溪\t32787\n庶难\t32788\n此去\t32789\n郑氏\t32790\n公用电话\t32791\n辉龙\t32792\nshike\t32793\n曜影\t32794\n桐乡房产超市网\t32795\n构体\t32796\nzZ\t32797\nstoke\t32798\n棋缘\t32799\n上海证券交易所公司\t32800\n潍城区\t32801\n张玉龙\t32802\n并联\t32803\n大连湾海底隧道\t32804\n字钢\t32805\n神经节苷脂钠注射液\t32806\nTaste\t32807\n臭氧层\t32808\n商朝\t32809\n放风筝\t32810\n斗胆灯\t32811\n伙夫\t32812\nOR函数\t32813\nNerve\t32814\n蒋大\t32815\n吉列锋速3\t32816\n亩产值\t32817\nFRA\t32818\n消石素\t32819\n一屋老友记\t32820\n小人物\t32821\n大碗\t32822\n仲永\t32823\nDigitalMicrograph\t32824\ndecompress\t32825\n租赁型\t32826\n通大\t32827\n电脑散热器\t32828\n丧气\t32829\n封口\t32830\n小森\t32831\n林虹\t32832\n八戒影院\t32833\n重庆大学机械工程学院\t32834\n省市区\t32835\n瑞驰\t32836\n中国电信公司\t32837\n无双7\t32838\n井场\t32839\n天琅\t32840\n欢乐戏剧人\t32841\n无所有\t32842\nkv2\t32843\n雅达\t32844\n液化气体\t32845\n三唑仑\t32846\n马克思主义政治经济学\t32847\n榕树\t32848\n张玫\t32849\n黄金柱\t32850\n火炬大厦\t32851\nYorker\t32852\n2天一夜\t32853\n农村信用社手机银行\t32854\n尹派\t32855\nfttp\t32856\n筹集资金\t32857\nmbus\t32858\nsich\t32859\n2018年3月21日\t32860\n过境\t32861\n诸葛庐\t32862\nPJGirls\t32863\nfooterview\t32864\nzh\t32865\n玄术\t32866\n龙玺\t32867\nBT版\t32868\npic单片机\t32869\n倒立\t32870\n湖北工业大学\t32871\n田上\t32872\n交化\t32873\nフェアリ\t32874\n几斤\t32875\n一周15天\t32876\n淡豆豉\t32877\n减薪\t32878\n经协商\t32879\n战锤全面战争\t32880\n航空航天学院\t32881\n特型\t32882\n很简单\t32883\n新疆新闻联播\t32884\n向往的生活同人\t32885\n怪物猎人P3中文网\t32886\n136平\t32887\n纯奈佳苗\t32888\n小维\t32889\n高德JS\t32890\nrenxiaoren\t32891\n李婷\t32892\n棍棒\t32893\n认沽\t32894\nbaiye5.com\t32895\n云南省第一人民医院\t32896\n戊子日\t32897\n阿达帕林凝胶\t32898\n井冈\t32899\n27套\t32900\n娇\t32901\n中国音乐学院考级委员会\t32902\n无维网\t32903\nvav\t32904\n錯誤\t32905\nmango\t32906\n色流\t32907\n东湖高新技术开发区\t32908\n托斯卡纳\t32909\npin\t32910\n28只\t32911\n
flats\t32912\n民粹\t32913\nAbigaile\t32914\nHPV疫苗\t32915\n晨刊\t32916\n武汉市政协\t32917\n1.24\t32918\n镜监信息网\t32919\n学习桌\t32920\n66套\t32921\n星光耀广场\t32922\n况天佑\t32923\n龙翔桥\t32924\n诺维格瑞\t32925\ne450c\t32926\n积蓄\t32927\n得于\t32928\n河南省书法家协会\t32929\n淘宝店宝贝\t32930\n全面屏Z17S-牛仔俱乐部\t32931\nAmanda\t32932\n九图网\t32933\n100多少\t32934\n永续债\t32935\nBingBing\t32936\n来了别错过\t32937\n渣鼻\t32938\n二元一次方程组\t32939\n运放\t32940\nled平板灯\t32941\niPhone7\t32942\napple\t32943\n金属制\t32944\n物联网吧\t32945\n司机端\t32946\n石英管\t32947\n有源器件\t32948\n林一\t32949\n洋葱浏览器\t32950\n终极福利\t32951\n刺史\t32952\n汴京早报\t32953\n防雾剂\t32954\n西安电子科技大学通信工程学院\t32955\n网络摄像头\t32956\n舔舐\t32957\nLNK2005\t32958\n小美人鱼\t32959\n第3页\t32960\n轻居\t32961\n近江\t32962\n登录页\t32963\n广州市政协\t32964\n阿克苏市\t32965\n灰底\t32966\nExcuse\t32967\n_九游\t32968\n可编程逻辑控制器\t32969\n木工坊\t32970\n国网甘肃省电力公司\t32971\n暴龙兽\t32972\nAirborne\t32973\nnumerical\t32974\n意想\t32975\n宋楠\t32976\ngundam\t32977\n2015-2021年\t32978\n听得\t32979\nウテア\t32980\n聊\t32981\n生命树\t32982\n疖肿\t32983\n辩解\t32984\n邵军\t32985\nBOSE音响\t32986\n谢国忠\t32987\nCinder\t32988\nkindeditor\t32989\n1.0.0.8\t32990\n隐喻\t32991\n嗜酒\t32992\nv18\t32993\n谭元寿\t32994\n菇凉\t32995\nRobots\t32996\n冷链物流\t32997\n网易我的世界论坛\t32998\n骨骼\t32999\n珍珠梅\t33000\n摆展\t33001\nocata\t33002\n128G\t33003\n镇海庄市\t33004\n重庆论坛_汽车之家论坛\t33005\n新本田\t33006\nEdges\t33007\n信达财险\t33008\n英大人寿\t33009\n金女\t33010\nsports\t33011\n唯恐\t33012\n新宿区\t33013\n搪胶\t33014\nMacUpdate\t33015\n天津地铁7号线\t33016\n大连华信计算机技术股份有限公司\t33017\n开始菜单栏\t33018\n玉肥\t33019\nfinex\t33020\n太平天国运动\t33021\n第5行\t33022\nDoll\t33023\nhon\t33024\n600t\t33025\n南京奥数网\t33026\n五和地铁站\t33027\nmicrophone\t33028\nwindowsserver\t33029\n岳阳东\t33030\n爱登堡\t33031\n同胜\t33032\nboot拦截器\t33033\nXman\t33034\n换汤不换药\t33035\n82条\t33036\n311号\t33037\n大稿\t33038\n北京国家大剧院\t33039\n堂客\t33040\n独坐\t33041\n肝昏迷\t33042\n码值\t33043\n亮闪闪\t33044\n肝内钙化灶\t33045\n迷雾围城\t33046\nBT传奇磁力链接\t33047\n宅院\t33048\n72岁\t33049\nsql存储过程\t33050\n体力活\t33051\n向善\t33052\n北京三元食品股份有限公司\t33053\nflange\t33054\n广东理工职业学院\t33055\n凯莉米洛\t33056\n斗笠\t33057\n四川大学电气信息学院\t33058\n58到家
\t33059\n小黄歌\t33060\n第二十八章\t33061\n文昌鸡\t33062\n小少焱\t33063\n天赐良园\t33064\n切身利益\t33065\n腾冲市人民政府\t33066\nNative\t33067\n奥司他韦\t33068\n爬片\t33069\n启明星\t33070\n说服\t33071\n三川\t33072\n文档流\t33073\n肇庆医学高等专科学校\t33074\n文艺路\t33075\n若怒\t33076\n刺痒\t33077\navoid\t33078\n续集\t33079\n取卵移植-播种网\t33080\n三包\t33081\n战舞\t33082\n罗技g29\t33083\ndominate\t33084\n瓦锡兰\t33085\ni3wm\t33086\n崇礼\t33087\n猛料\t33088\n肠粉王\t33089\nmacromedia\t33090\n毅力帝\t33091\n纷享\t33092\n录像\t33093\n第32集\t33094\n39项\t33095\n有钱\t33096\n穿衣服\t33097\n贾卡\t33098\n说唱\t33099\n赃车\t33100\n黄天崎\t33101\nlsi\t33102\nKingroot\t33103\n柳庄\t33104\n金凯\t33105\nZ87\t33106\n000062\t33107\n04月25日\t33108\n老城门\t33109\n魔之谷\t33110\nselcet\t33111\n爱转角\t33112\nflorida\t33113\n300问\t33114\nladybays\t33115\n错过\t33116\n紧张性头痛\t33117\nZJOI\t33118\n氢化油\t33119\n宜昌市教育局\t33120\n考分\t33121\n股权转让协议范本\t33122\n三国志10\t33123\n北京公主坟\t33124\n方建华\t33125\n膨果\t33126\nwebstorm11\t33127\n逐梦演艺圈\t33128\n英敏特\t33129\n马蓉\t33130\n海龙王\t33131\n两相步进电机\t33132\n卡茨\t33133\n北京育才学校\t33134\n笑看\t33135\n青岛市高新区\t33136\n金座\t33137\n女学\t33138\n现金网\t33139\n达比\t33140\n青客上海站\t33141\n大陆演艺圈艳史\t33142\n金尊\t33143\n正镶白旗\t33144\n华信国际\t33145\n美国电视台\t33146\n稳定型心绞痛\t33147\n陈昌\t33148\n直杆\t33149\npart2\t33150\n宫室\t33151\n逆贼\t33152\n北中\t33153\n公安部治安管理局\t33154\n释小松\t33155\n打桩\t33156\n布艺\t33157\nchilli\t33158\n魔法少女爱\t33159\n琳儿\t33160\n科斯特\t33161\n毛鞋\t33162\n证据链\t33163\n打開\t33164\n国土资源厅\t33165\n扁线\t33166\ncondor\t33167\n布格罗\t33168\nNVidia\t33169\n生死关\t33170\n璧合科技\t33171\n官吏\t33172\n抗锯\t33173\n黄金广场\t33174\n商调\t33175\n时力\t33176\n冉庄\t33177\n碰坏\t33178\n法纪\t33179\n湖南师范大学文学院\t33180\n自媒体平台\t33181\n范全\t33182\n柏青哥\t33183\nMQTT\t33184\n华人世界\t33185\n300平方\t33186\n喔喔动漫网\t33187\n保障\t33188\n彭蕾\t33189\n骆新\t33190\n5DVD\t33191\nKCF\t33192\neDrawings\t33193\n船东\t33194\n南派传奇\t33195\n方浩\t33196\n筑波\t33197\nas3.0\t33198\n耐压\t33199\n魔范学院\t33200\nhy000\t33201\n职教园区\t33202\n一句句\t33203\n金刚菩提子\t33204\n统防\t33205\n科伦坡\t33206\n理解\t33207\n4.7.2\t33208\n暗系\t33209\n贺国强\t33210\n1.9\t33211\n德平路\t33212\n肯辛顿\t33213\n安徽省改革和发展委员会\t33214\n联军\t33215\n南京科
远自动化集团股份有限公司\t33216\n破绽\t33217\nsets\t33218\n黄亮\t33219\nsinon\t33220\n2213\t33221\n久趣英语\t33222\n模座\t33223\n法经济学\t33224\n侧力\t33225\n赤水河\t33226\n舞友\t33227\nhous\t33228\nLiveCD\t33229\n垂直化\t33230\n镇关\t33231\n成都公园\t33232\n华擎\t33233\n王者荣耀亚瑟\t33234\n兴义市\t33235\n青州\t33236\n拘禁\t33237\n卡点\t33238\nlssvm\t33239\n1000千伏特\t33240\n凤凰单枞\t33241\n伊势神宫\t33242\n雷霆出击\t33243\n利尿\t33244\n郑州外国语学校\t33245\n小柒\t33246\n小米平板3\t33247\n各期\t33248\n从医\t33249\n傩\t33250\nJOOX\t33251\n人造器官\t33252\nfrpc\t33253\n化学镀\t33254\n倪匡\t33255\n黑屋\t33256\n救世主\t33257\nTrinus\t33258\n点掌\t33259\nDeepSky\t33260\n路桥街道\t33261\n拳霸2\t33262\n铁柱\t33263\n重载铁路\t33264\n三岁\t33265\nborder-width\t33266\n叉叉叉\t33267\n杭州湾大湾区\t33268\n澄泥砚\t33269\n月丘\t33270\n云南建投\t33271\n大黄靴\t33272\nKuwait\t33273\nVive\t33274\n中华龙舟\t33275\nhdf5\t33276\nHays\t33277\n珠海市政府\t33278\n延安中学\t33279\n创美\t33280\n点墨\t33281\n中科院金属所\t33282\n茅岩莓茶\t33283\n紫铜管\t33284\n固控\t33285\n锁心\t33286\n自考吧\t33287\njmeter4.0\t33288\nPortman\t33289\n机器人学\t33290\n一游网\t33291\n宁夏移动\t33292\n无翼鸟邪恶漫画全集\t33293\n望江东路\t33294\nps液化工具\t33295\n可凡倾听\t33296\ntransmac\t33297\n米粉卡吧\t33298\n石林镇\t33299\n铝钢\t33300\n防冻液\t33301\ndennis\t33302\n夜深了\t33303\nBmp\t33304\n高坠\t33305\n华彩网\t33306\n巴内斯\t33307\n权路风云\t33308\n赤壁市人民政府\t33309\n耐磨钢\t33310\n根植\t33311\n深圳育才中学\t33312\n崇华书院\t33313\n耐操\t33314\nAntiVirus\t33315\n西华师范大学\t33316\nBaron\t33317\n科技快报_砍柴网\t33318\n一潭死水\t33319\nTRADOS\t33320\n燕云十六州\t33321\n1061\t33322\n洗净\t33323\n流转单\t33324\n敏而好学\t33325\nNeverfull\t33326\n千年等一回\t33327\n编程\t33328\n2016.2.1\t33329\n鹿鹿\t33330\n开售\t33331\n2.4G\t33332\n阔叶\t33333\n501号\t33334\n西藏路\t33335\nv2.10\t33336\n发型\t33337\n联通网吧\t33338\nPhotovoltaic\t33339\nvest\t33340\n搜\t33341\n乔振宇\t33342\n圈圈乐\t33343\n违规者\t33344\nFreeware\t33345\nhaodf\t33346\n上海东方电视台\t33347\n富林\t33348\n新世纪广场\t33349\n矿大\t33350\n省里\t33351\n15kg\t33352\n林建华\t33353\n燃木\t33354\nquad\t33355\n中国海洋大学研究生院\t33356\n天壕环境\t33357\n银都\t33358\n聚人\t33359\n八月桂花遍地开\t33360\nJavaScriptCore\t33361\n20150212\t33362\n四驱\t33363\n数千\t33364\n支撑器\t33365\nWFApple\t33366\n两面派\t33367\n祖册\t3
3368\n第三十六次\t33369\n算无遗策\t33370\n阴跌\t33371\n3725\t33372\n2cm\t33373\n危废经营许可证\t33374\n台盆\t33375\n五月初\t33376\nLe\t33377\n20140617\t33378\n51MODO\t33379\nEleanor\t33380\nhacker\t33381\n消化性溃疡\t33382\n侠客风云传\t33383\n嘿哟\t33384\n股子\t33385\n定远\t33386\n刘兵\t33387\nmec\t33388\n新城东方丽园\t33389\n白风\t33390\n举枪\t33391\n活动类\t33392\n传奇一号\t33393\n本垒\t33394\n食野\t33395\n张宁\t33396\nv40\t33397\nTensorFLow\t33398\n家庭网\t33399\n抽屉式开关柜\t33400\n小红信\t33401\n宣贯\t33402\n五好\t33403\nmvpbang\t33404\nMovado\t33405\n艾灸仪\t33406\n伊面\t33407\n鸡蛋黄\t33408\n霸图\t33409\nbien\t33410\n周线\t33411\n复旦复华\t33412\n档口\t33413\n26个\t33414\n财神鱼\t33415\n南京大学中国社会科学研究评价中心\t33416\n千千阙歌\t33417\n杨高南路\t33418\n卓扬\t33419\n街头镇\t33420\n达摩盘\t33421\n交货期\t33422\n终南\t33423\n25米\t33424\n装换\t33425\n贾斯丁\t33426\n酒精肝\t33427\n天邪鬼赤\t33428\n超级相师\t33429\nyijie\t33430\nIssues\t33431\n腐案\t33432\n北京农学院\t33433\n第三中学\t33434\n垂泪\t33435\n点胶机\t33436\n第十二夜\t33437\nprc\t33438\n平衡器\t33439\n密胺餐具\t33440\n根目录\t33441\n0x2\t33442\n腊梅树\t33443\n神画\t33444\n醋酸地塞米松\t33445\n探伤\t33446\nslip\t33447\n小蓝帽\t33448\n五河琴\t33449\n越影\t33450\n格洛丽\t33451\nF14\t33452\n尽心尽力\t33453\nclick\t33454\n中国航天科工集团公司\t33455\n张建林\t33456\n5521\t33457\n横山桥\t33458\n皆宜\t33459\n偏导数\t33460\n蜡油\t33461\n恩替卡韦\t33462\nBD英语高清\t33463\n性伴侣\t33464\n360软件管家\t33465\n中国中医药报\t33466\n1105\t33467\nM403d\t33468\n段延庆\t33469\n好甜\t33470\n第四号\t33471\n性饥渴\t33472\n美征税\t33473\n百利天下留学\t33474\n13.3\t33475\n我不会\t33476\n热血江\t33477\n2468\t33478\n头皮毛囊炎\t33479\n训练服\t33480\np10plus\t33481\n炮胶\t33482\n热电\t33483\n千分制\t33484\nperseverance\t33485\n审贷\t33486\n陈宏\t33487\nRGV\t33488\n帅比\t33489\n金融机构大额交易和可疑交易报告管理办法\t33490\n驱鸟\t33491\n400米\t33492\n毙命\t33493\nspv\t33494\n第十话\t33495\n中国中铁四局集团有限公司\t33496\n兴义万峰林\t33497\n天得\t33498\n棕子\t33499\n4月16号\t33500\n经验类\t33501\n五卡\t33502\n催\t33503\n衢州统计局\t33504\nTorrentKitty\t33505\n黄绮珊\t33506\n漆桶\t33507\n顶胶\t33508\n友趣\t33509\n荞麦米\t33510\nnfpa\t33511\nrookahq\t33512\nWu\t33513\n大华公司\t33514\n李宁体育园\t33515\n祛痘印\t33516\n英雄传说\t33517\n中国工商银行数据中心\t33518\n牛皮包\t33519\n砖机\t33520\n第34次\t33521\n五行生克\t33522\n0.09MB\
t33523\n三件\t33524\n黄莲\t33525\n北京市审计局\t33526\n信息源\t33527\ndsolve\t33528\n1.x\t33529\n轻机\t33530\n国省干线公路\t33531\n上海男性专科医院\t33532\n太平园\t33533\nBBU\t33534\n万磊\t33535\n明德小学\t33536\n录课\t33537\n中国石油大学\t33538\nPingPong\t33539\n果实\t33540\n红外热像\t33541\nfillder\t33542\nANNA\t33543\n历史上\t33544\n十一日\t33545\n会务\t33546\n李一诺\t33547\n血位\t33548\n枫泾镇\t33549\n黄厚\t33550\n免谈\t33551\n无房证明\t33552\n硫氢化钠\t33553\n中盘\t33554\n纸境\t33555\nglb\t33556\n战果\t33557\n武义新闻网\t33558\n几挡\t33559\n有轨电车\t33560\n苏州汽车客运总站\t33561\n预置\t33562\nDamn\t33563\n安峰\t33564\n加减档\t33565\n青奥\t33566\nbeckhoff\t33567\n禾祥西路\t33568\n大兔子\t33569\nDahl\t33570\n3集\t33571\n肥厚性心肌病\t33572\n胶装\t33573\nsendall\t33574\n虚拟定位\t33575\n长久集团\t33576\n阻燃剂\t33577\nSony\t33578\n南明区人民政府\t33579\nABCD-A1B1C1D1\t33580\nNeko\t33581\n倾斜\t33582\n九玺\t33583\n喂药器\t33584\nCCO\t33585\n幼儿教育_3edu教育网\t33586\nfi\t33587\n西安碑林\t33588\n青岛政府\t33589\n教育部高教司\t33590\nAPI编程\t33591\n宫商角徵羽\t33592\n虚空之女卡莎\t33593\n张大奕\t33594\n中山市古镇\t33595\n红舞\t33596\n麦弗逊\t33597\n流动人员\t33598\n突破口\t33599\nTOP30\t33600\n混合体\t33601\n亚青寺\t33602\n2017年4月19日\t33603\n荣佳国\t33604\n肉圆\t33605\n电动吊篮\t33606\n小舍\t33607\n人母\t33608\n私藏\t33609\nClassics\t33610\n7余\t33611\n青稞\t33612\n不堪入耳\t33613\n0527\t33614\n牧原智能安检\t33615\n御前\t33616\n党政纪\t33617\n千脑云电脑\t33618\nask\t33619\nBridged\t33620\n丹妮莉丝·坦格利安\t33621\n顶栏\t33622\n辛追\t33623\n峰值功率\t33624\nTaohuazu\t33625\n黑垢\t33626\n魔兽世界_魔兽7.0军团再临_17173\t33627\n剑杆\t33628\n矫枉过正\t33629\n豌豆粉\t33630\n易烊千玺\t33631\n3500万\t33632\n风筒\t33633\n饼状图\t33634\n中弘\t33635\n节杖\t33636\n2226\t33637\n招贤令\t33638\n大有人在\t33639\n千山记\t33640\n故事书\t33641\nalmond\t33642\n起意\t33643\n迂\t33644\n极品飞车ONLINE\t33645\nimpairment\t33646\n消防器\t33647\n中国铁塔公司\t33648\n阳光新城\t33649\n四川久远银海软件股份有限公司\t33650\n五菱汽车\t33651\n同济堂\t33652\n加减法\t33653\n低迷期\t33654\n城投控股\t33655\n机动战士\t33656\n51养生网\t33657\n飞烟灭\t33658\n邢碧旗\t33659\n村人\t33660\n甘肃省人民检察院\t33661\n兴盛\t33662\n南雄\t33663\n软席\t33664\n教育出版社\t33665\n琛\t33666\n七十五岁\t33667\nwxquare\t33668\n134\t33669\n按摩机\t33670\n制造有限公司\t33671\nopenstreet\t33672\n对照\t33673\nR17\t33674\n海油\t33675\n
枭\t33676\n永不忘\t33677\nmap值\t33678\n七杀星\t33679\n克雷洛夫\t33680\n昆明市区\t33681\n猎兽者\t33682\n韩路\t33683\ng100\t33684\nproxyTable\t33685\nU23\t33686\nDisassembler\t33687\n鲁教版五四制\t33688\n同济新村\t33689\ncpvc电力管\t33690\n50辆\t33691\n骆惠宁\t33692\n热风枪\t33693\nslang\t33694\n色画\t33695\n抵扣券\t33696\n原产\t33697\n金渐层\t33698\ncodelite\t33699\n鲁证期货\t33700\nFF14\t33701\n十元店\t33702\n亿航\t33703\n宜昌站\t33704\n工业小区\t33705\n人人快递\t33706\n德森\t33707\n坚式\t33708\n20150702\t33709\nwiiu模拟器\t33710\n中建协\t33711\n料单\t33712\n南都娱乐\t33713\n30万辆\t33714\n卤虫\t33715\n龙耀\t33716\nOfficeJet\t33717\ngregory\t33718\n带彩\t33719\n轩辕剑外传:穹之扉\t33720\n卓创指数\t33721\n橡胶制品\t33722\n晾衣架\t33723\nvisualvm\t33724\n2017年11月17日\t33725\n市国资委\t33726\n激情片\t33727\n笑笑\t33728\n达伦·布朗\t33729\nnaga\t33730\n东仙坡\t33731\n机器学习实战\t33732\n驿路\t33733\n孙小梅\t33734\n坚美\t33735\n俩人\t33736\n買\t33737\n透明带\t33738\nMoMA\t33739\n鸿海\t33740\n薜之谦\t33741\n张建波\t33742\n间谍案\t33743\n3.0集群\t33744\n钉号\t33745\n得寸进尺\t33746\n枫丹丽舍\t33747\nISOFIX\t33748\nmica\t33749\n全国文明城市\t33750\nApiCloud\t33751\n无形的手\t33752\n云南文山州\t33753\n黎明杀机\t33754\n九县\t33755\nImporting\t33756\n八字形\t33757\n母模\t33758\n0.5小时\t33759\n幸福的歌\t33760\nBlue\t33761\n迷奸\t33762\n小儿抽动症\t33763\nfight\t33764\n4088\t33765\n三三三\t33766\n武警总队\t33767\n真棒\t33768\n章印\t33769\n酸软\t33770\n认认\t33771\n温度场\t33772\nwelink\t33773\nKali\t33774\n打游击\t33775\n粗圆\t33776\n章远\t33777\n直交\t33778\n男扮女装\t33779\nDOI\t33780\nU+\t33781\n下巴\t33782\n91y\t33783\n0369\t33784\n做寿\t33785\n赛琳娜\t33786\n包河大道\t33787\n周宇\t33788\n中果\t33789\n波西\t33790\n第172章\t33791\n改连\t33792\n屄毛\t33793\n排铅\t33794\n百度相册\t33795\nMORI\t33796\n汝城政府网\t33797\n全检\t33798\n邓晶晶\t33799\n咸水\t33800\npga\t33801\n绝地求生刺激战场吉利服\t33802\n女体化\t33803\n一个陌生女人的来信\t33804\nNSPredicate\t33805\n陈书\t33806\n车牌号\t33807\n被服\t33808\n沙疗\t33809\n凯丹广场\t33810\ndbhelper\t33811\n欢腾\t33812\n10盘\t33813\n安徽政府\t33814\n下不去\t33815\n竹梅\t33816\n1284\t33817\n奈克赛斯奥特曼\t33818\n统计分析\t33819\n黑嘴\t33820\n村内\t33821\n孙\t33822\n断剑\t33823\n白鹭湖\t33824\n埃斯顿\t33825\n补建\t33826\n丹凤街\t33827\n南京网络电视台\t33828\n计划生育科\t33829\n绘画史\t33830\n平和双语学校\t33831
\n日香桂\t33832\n二只\t33833\n三版\t33834\n福建省人民政府外事办公室\t33835\n葡萄酒吧\t33836\n丁亮\t33837\n福特_途睿欧\t33838\n笼罩\t33839\n秋后算账\t33840\ntrivial\t33841\n小蛮\t33842\n拔刺\t33843\nkof2002\t33844\nedwin\t33845\nb250m-d3h\t33846\n徇私舞弊\t33847\n賀鑄\t33848\n危险犯\t33849\n潘石屈原\t33850\n水晶帘\t33851\n保暖衣\t33852\n福清一中\t33853\n麻辣油\t33854\n路易十四\t33855\n英山\t33856\n东莞康华医院\t33857\n电动百叶窗\t33858\n万源\t33859\n湘鄂情\t33860\n第13季\t33861\n羧甲基纤维素\t33862\nmsci\t33863\nPGI\t33864\nCD锤石\t33865\nskech\t33866\n半个月亮\t33867\n欧阳龙\t33868\n保险公司早会\t33869\n彩色多普勒\t33870\n三候\t33871\n椎名\t33872\n樊丽明\t33873\n沙北新区\t33874\nc3\t33875\nsgw\t33876\n超音\t33877\n第七级\t33878\nkenwood\t33879\n514200.com\t33880\n双鹤湖\t33881\ngeneration\t33882\n百慕大\t33883\n满天星斗\t33884\n粉红色的回忆\t33885\n史泰博\t33886\ncosx\t33887\n职业篇\t33888\n信仰者\t33889\n君士\t33890\n透明色\t33891\nAM3+\t33892\nsbit\t33893\n智能公交\t33894\n外链\t33895\nDAAD\t33896\n高新园\t33897\nge\t33898\n0xce\t33899\n集控\t33900\nVogue\t33901\nav天堂资源网\t33902\n情绪\t33903\n焚化\t33904\n热网\t33905\n【哲\t33906\n雅特\t33907\n软线\t33908\n郑俊\t33909\n铁釜\t33910\n徐州市中医院\t33911\n石榴集团\t33912\n马克思主义哲学原理\t33913\nShirt\t33914\n香水\t33915\n15800\t33916\n开业率\t33917\n全筑\t33918\n太极熊猫3猎龙\t33919\n论题\t33920\n平安e\t33921\nsolitude\t33922\nvipJr\t33923\n人教版7\t33924\nwamp\t33925\n优酷出品网\t33926\n七十年\t33927\n桩桩\t33928\n换屏\t33929\n扼流\t33930\n虚化\t33931\n三房一厅\t33932\n护照号\t33933\n慢性腹泻\t33934\n留不下\t33935\n59%\t33936\n黑麒麟\t33937\n聂尔龙\t33938\n聚居区\t33939\nopponent\t33940\n中山大学化学学院\t33941\n江科大\t33942\n施主\t33943\n朗阁\t33944\n划龙舟\t33945\n心心的爱\t33946\n天猫国际\t33947\n心丝\t33948\n北美省钱快报\t33949\n不贴\t33950\n草叶\t33951\n克拉霉素分散片\t33952\n大航海时代4威力加强版\t33953\n4950\t33954\n卫慧\t33955\n鄂州市纪委监察局\t33956\n护理员\t33957\n货币贬值\t33958\nLinq\t33959\n无事\t33960\n魔枪士\t33961\n羽化丹\t33962\n6双\t33963\n群监员\t33964\n51领啦网\t33965\n副食品\t33966\n尊亲\t33967\n2.47\t33968\n雨水管\t33969\n珂珂\t33970\n插播\t33971\n江南客运站\t33972\n亚历山大\t33973\n国能\t33974\n小碟\t33975\n优尼\t33976\n湘潭县一中\t33977\n量子力学\t33978\nPP8\t33979\n独占性\t33980\nvertx\t33981\nMRC\t33982\n水利水电工程管理与实务\t33983\n13\t33984\n普罗科菲耶夫\t33985\n彭清华\t33986\n多功能表\t339
87\n芭蕾舞\t33988\n罗技宏\t33989\n孰是孰非\t33990\n前\t33991\ndax\t33992\n哈希值\t33993\n元素法\t33994\n缠绕式\t33995\n中国中医科学院望京医院\t33996\n丰富化\t33997\nOnLine\t33998\n左小祖咒\t33999\nsslvpn\t34000\n酸缸\t34001\nStefanie\t34002\nexpertise\t34003\n答题卡\t34004\n内蒙古气象局\t34005\n告子\t34006\n参照物\t34007\n木丝\t34008\nr14\t34009\n申万宏源证券有限公司\t34010\n石刻\t34011\n28丨\t34012\n賽\t34013\n2017.12.31\t34014\n央行研究局\t34015\n洁白\t34016\n口袋妖怪魂银\t34017\n快棋\t34018\n轩子巨2兔\t34019\n0373\t34020\n剩余\t34021\n重者\t34022\nv字仇杀队\t34023\n中国科学院福建物质结构研究所\t34024\nIDEA\t34025\n3.8折\t34026\n陈维\t34027\n中国科学技术大学软件学院\t34028\n秀动网\t34029\n52pk新闻中心\t34030\n紫金花\t34031\nHepG2\t34032\n硬度仪\t34033\n大精灵\t34034\n爱仕达\t34035\n石英钟\t34036\n王泽鉴\t34037\n中铁项目部\t34038\n段永平\t34039\n均胜电子\t34040\n手機\t34041\n夫郎\t34042\n更喜\t34043\n镜线\t34044\nH型\t34045\n小说阅读网\t34046\n三虎\t34047\nASIAN\t34048\n中科院理化所\t34049\n哈尔威\t34050\nMIUI7\t34051\n众善\t34052\n相电压\t34053\n长相守\t34054\n贤妻良母\t34055\nOTRS\t34056\nColony\t34057\n迷域\t34058\nSQlite\t34059\n合肥国轩高科动力能源有限公司\t34060\n采石\t34061\n有图有真相】\t34062\n300cc\t34063\nparanoid\t34064\n2.59G\t34065\nwebservices\t34066\n诗界\t34067\n南艳湖\t34068\n楚轩\t34069\n印前\t34070\n惋\t34071\n承台梁\t34072\n提纲挈领\t34073\nFlaming\t34074\n乳山\t34075\n罗冲围\t34076\n暗影精灵吧\t34077\najx\t34078\n英格玛·伯格曼\t34079\n新会陈皮\t34080\nTIMEOUT\t34081\n直销银行\t34082\n非特殊\t34083\n青阳路\t34084\nps2\t34085\ngrasshopper\t34086\n36万\t34087\n浙江省体育局\t34088\n_羽\t34089\n奇摩\t34090\nbig笑工坊\t34091\n大组\t34092\n提莫西·查拉梅\t34093\n七月半\t34094\n治\t34095\n猎袭\t34096\n舒庆\t34097\nccgp\t34098\n野山参\t34099\n第四页\t34100\n空投\t34101\n20150922\t34102\n277dcv\t34103\n周正\t34104\n甘心\t34105\n谢安琪\t34106\n丹东火车站\t34107\n测试题\t34108\n没有人管\t34109\n华润小径湾\t34110\nacg吧\t34111\n南海子郊野公园\t34112\n灾\t34113\n敲掉\t34114\n救母\t34115\nMosquito\t34116\n广东医院\t34117\n青春勇\t34118\n指宽\t34119\n米索\t34120\nm2\t34121\nSONIC\t34122\n水准尺\t34123\nrdp\t34124\nSLR/DSLM\t34125\n宋妍\t34126\n奉节网\t34127\n使命召唤\t34128\n曲塘\t34129\n中国驻外大使馆\t34130\n144Hz\t34131\n深影\t34132\n哲史\t34133\n五朵金花\t34134\n体机\t34135\n中央和国家机关培训费管理办法\t34136\nt700\t34137\n博士招聘网\t34138\n金龙寺\t34139\
n中国式\t34140\n宜春\t34141\n4000个\t34142\n双木\t34143\n暖瓶\t34144\nAnderson\t34145\n全尸\t34146\n友田真希\t34147\n人渣\t34148\n飘散\t34149\nChairman\t34150\npainter\t34151\n图纸\t34152\n中阳\t34153\n双乙酸钠\t34154\nPERC\t34155\n實物特徵\t34156\n易安网\t34157\n夏侯氏\t34158\n准确\t34159\n知乎日报\t34160\nhash\t34161\nfunds\t34162\n古剑2\t34163\noriental\t34164\n颠勺\t34165\nstm8s103\t34166\n国家自然科学基金委\t34167\nmerged\t34168\nNBD\t34169\n名瓷\t34170\n扫码机\t34171\n少儿读物_简笔画小学堂\t34172\nminidlna\t34173\n宠物百科_百姓网\t34174\nCentOS6\t34175\n6mm\t34176\n25栋\t34177\n毫米级\t34178\n剑网三重制版\t34179\n沧州市中心医院\t34180\n甩狙\t34181\n5月4号\t34182\nPS4PRO\t34183\nsynology\t34184\n玉渊潭公园\t34185\n再保险\t34186\n教育部科技司\t34187\n宇鑫物流\t34188\n刀盘\t34189\n非国\t34190\n铅丝\t34191\n唐晓翼\t34192\n大窗\t34193\n鹅岭\t34194\n19英寸\t34195\n记记\t34196\nhellotv\t34197\n皮板\t34198\n贺满\t34199\n蒲地蓝口服液\t34200\n外来人\t34201\n东台市\t34202\nRED5\t34203\n江东北路\t34204\n搭建商\t34205\n蚝壳\t34206\n投入式\t34207\n严国栋\t34208\n非洲狮\t34209\n刍\t34210\n交流处\t34211\n功钱\t34212\n边裁\t34213\nFJ酷路泽\t34214\n2018年5月4日\t34215\n乐信\t34216\n不达标\t34217\n富森美\t34218\nSCHEMA\t34219\n陈丽娜\t34220\n雪妍\t34221\n20起\t34222\npaoding\t34223\nsi001第一会所亚\t34224\n越橘\t34225\n宠物美容\t34226\n金建\t34227\n拆字\t34228\n恐怖黎明吧_\t34229\n经商\t34230\n易再生网\t34231\n华成英\t34232\n方宇\t34233\noccurrences\t34234\n壱\t34235\n普教\t34236\n董超\t34237\ncoocaa\t34238\n1.54\t34239\n停车站\t34240\n德昌\t34241\n伉俪\t34242\n势力战\t34243\n衣箱\t34244\n天堂\t34245\n中华人民共和国证券法\t34246\n图霸\t34247\nExpand\t34248\n摇摇棒\t34249\n良姜\t34250\n剿\t34251\n水务集团\t34252\n放卷\t34253\na1660\t34254\nlamps\t34255\nbree\t34256\n能敌\t34257\n田字\t34258\n省住建厅\t34259\n一览无遗\t34260\ninversion\t34261\n幼儿学\t34262\n流贷\t34263\n乙草胺\t34264\n兴奋剂\t34265\n重大危险源\t34266\n周强\t34267\nintell\t34268\n杜鲁门\t34269\n咖啡展\t34270\nhn\t34271\n12节\t34272\n关联分析法\t34273\n相媲美\t34274\n16299\t34275\n277DCV\t34276\n远程桌面服务器\t34277\n剑网3综\t34278\nDDY\t34279\n二苯胺\t34280\n黑道文\t34281\ngmapping\t34282\n快方\t34283\n第143章\t34284\n老女人\t34285\n付费类\t34286\n大数据算法\t34287\n眉目\t34288\n欲乱\t34289\n从严治党\t34290\n殉职\t34291\n画界\t34292\nyayun\t34293\n中国农业大学经济管理学院\t342
94\n恒峰\t34295\n喷灯\t34296\n远洋乐堤港\t34297\n过渡时期\t34298\n第3位\t34299\n0000005\t34300\nuvw\t34301\nDispose\t34302\n批处理\t34303\n抚顺市政府\t34304\n莲花岛\t34305\n潮率\t34306\n济源市人民政府\t34307\n危岩\t34308\n方案商\t34309\n20170504\t34310\n批评者\t34311\n股期\t34312\n苯扎溴铵\t34313\nswitchyomega\t34314\nperceive\t34315\n180412\t34316\nvetur\t34317\n违建房\t34318\n9ht\t34319\nXGogo\t34320\n湖西路\t34321\n白玉霜\t34322\n大摆裙\t34323\n见喜\t34324\n插槽\t34325\nJRPASS\t34326\n油舱\t34327\n梯柱\t34328\n管易云\t34329\n张伯礼\t34330\n2816\t34331\n预告\t34332\nCompletion\t34333\nHottoys\t34334\nwanimal\t34335\n145级\t34336\n信部\t34337\n骨料\t34338\n攻音\t34339\n半挂牵引车\t34340\n有急事\t34341\n薛梅\t34342\n威将\t34343\n167个\t34344\n注视\t34345\n北京大学马克思主义学院\t34346\n二肽\t34347\n银联版\t34348\n产业技术创新战略联盟\t34349\n几号\t34350\n平乱\t34351\nimf\t34352\n千万正版音乐海量无损曲库新歌热歌天天畅\t34353\n5.7.17\t34354\n◢\t34355\n异议\t34356\n爱得利\t34357\n冷门\t34358\n第14张\t34359\n闪讯\t34360\n1106\t34361\n起运\t34362\n宝塔服务器管理助手\t34363\n双兵\t34364\n能屈能伸\t34365\n王戈\t34366\n1846\t34367\n倍增器\t34368\nNginx开发从入门到精通\t34369\n前世今生的缘\t34370\n长州\t34371\n小学五年级语文\t34372\n醋意\t34373\n三周年\t34374\n淮南市第一人民医院\t34375\n索引牌\t34376\n刮胡子\t34377\nBEE\t34378\n情场\t34379\n庚子日\t34380\n题材股\t34381\n花中\t34382\nTuxedo\t34383\n活动日\t34384\nskyhd\t34385\n赢销\t34386\n轻钢龙骨石膏板\t34387\n惠佳\t34388\n丹东市政府\t34389\n蔻赛\t34390\n央吉玛\t34391\nDAIKIN\t34392\n我的梦\t34393\nSQL文件\t34394\n三十_\t34395\n司法局\t34396\n铜圆\t34397\n风行网\t34398\n刘浩然\t34399\n﹏\t34400\n红羽\t34401\n京粉\t34402\nHawking\t34403\n紫彩\t34404\n陕西省合阳县人民政府\t34405\nINSURANCE\t34406\n梁家辉\t34407\n溟\t34408\n纯电动车\t34409\n宝莱坞\t34410\n炒货\t34411\npiss\t34412\n甲状腺肿\t34413\n金斯瑞生物科技\t34414\n长刺\t34415\nStarbucks\t34416\n暴走漫画的自频道\t34417\n天班\t34418\nWednesday\t34419\n分割单\t34420\nCMU\t34421\n原意\t34422\n乔木南乔\t34423\n真数\t34424\n30页\t34425\n量算\t34426\n北京青年\t34427\n接料\t34428\n铝基\t34429\nflac3d\t34430\n空虚\t34431\ncimco\t34432\n大格局\t34433\n卷线器\t34434\n广州大道北\t34435\n官能\t34436\n讯飞输入法-讯飞晓译翻译机-阿尔法蛋\t34437\n天涯医院\t34438\n柳林\t34439\n钙卫蛋白\t34440\nkmax\t34441\nMnO2\t34442\n山东钢铁集团\t34443\ncba\t34444\nmaxhub\t34445\n沪剧网\t34446\n
十升\t34447\n100K\t34448\n挑战书\t34449\nAniplex\t34450\n挂名\t34451\n离乡\t34452\n空投币\t34453\n深秋\t34454\na7m2\t34455\ncad模板-千图网\t34456\n南侧\t34457\n12.5米\t34458\n2周\t34459\n控台\t34460\n副队\t34461\n清修\t34462\nmesos\t34463\nhim\t34464\n罗小黑\t34465\nSparrow\t34466\n捌零\t34467\n文澜小学\t34468\n中国科学院大学吧\t34469\nz再世篇\t34470\n无源滤波器\t34471\n爱康国宾体检中心\t34472\n第10季\t34473\ncott\t34474\n投射\t34475\n蘑菇云\t34476\n阿纲\t34477\n理有\t34478\n试算平衡表\t34479\n硅麦\t34480\n陶文铨\t34481\n四单\t34482\n万泽股份\t34483\n币安网\t34484\n北京恒远安诺科技有限公司\t34485\n北华大学\t34486\nlise\t34487\nZara\t34488\n地狱火\t34489\n机电系\t34490\n老毛桃\t34491\n俞樾\t34492\n对啊\t34493\n广州白癜风医院\t34494\n摸臀\t34495\n大丰市\t34496\n空军一号\t34497\n信息科学与工程学院\t34498\n圣马丁\t34499\n美国男篮\t34500\n60期\t34501\n崆峒区\t34502\n压缩包\t34503\n保管员\t34504\nBruce\t34505\n大航海时代\t34506\n传奇单机版\t34507\n新金山\t34508\n梦中人\t34509\n植树造林\t34510\nCairo\t34511\n酒精检测仪\t34512\n盈利性\t34513\n终身版\t34514\n证明书\t34515\n茄克\t34516\n邓稼先\t34517\n多者\t34518\ntale\t34519\n作帐\t34520\n人民性\t34521\n两三天\t34522\n合肥市口腔医院\t34523\n卡尔斯\t34524\n002508\t34525\n劳模创新工作室\t34526\ntama\t34527\n妇女联合会\t34528\n星星科技\t34529\n有损\t34530\n循环机\t34531\n常铝股份\t34532\n芜湖港\t34533\n鲁士\t34534\n生殖器官\t34535\n分忧\t34536\n无行\t34537\n雷公\t34538\n54厘米\t34539\n击鼓传花\t34540\n门卫\t34541\nReSharper\t34542\n杨善洲\t34543\n程序子\t34544\nsp2\t34545\n花嫁丽舍\t34546\n康林\t34547\n不列颠哥伦比亚大学\t34548\nBryce\t34549\n好无奈\t34550\nmux\t34551\n妖魔\t34552\nvise\t34553\n5日游\t34554\n莱特莱德\t34555\n名桥\t34556\n困局\t34557\n上海浦东发展银行\t34558\nHSC\t34559\n滴胶\t34560\nXboxone\t34561\n细胞分裂素\t34562\n安逸\t34563\n161个\t34564\n芥子园\t34565\n黑莓9930\t34566\nabu\t34567\n第二十三集\t34568\n江西省\t34569\n国泰港龙航空\t34570\n丰田海狮\t34571\n美都\t34572\n无处不在\t34573\nBlackberry\t34574\n7月15\t34575\n微客\t34576\n天博\t34577\n萌妻\t34578\nbiquge\t34579\n二氢杨梅素\t34580\n天街\t34581\n瓷恋\t34582\npaid\t34583\n弃剧\t34584\nhagen\t34585\n心结\t34586\n泰宇\t34587\n交建\t34588\n10月18日\t34589\n运移\t34590\n八骏图\t34591\n橾\t34592\n话剧社\t34593\n3费\t34594\n缘客扫\t34595\nV10.6.6.63\t34596\n12.0.26\t34597\nBarnes\t34598\n解放路街道\t34599\n慈溪新闻网\t34600\n美国恶霸犬\t34601\nvr眼镜\t34602\n随心
而动\t34603\n9天后\t34604\n远大购物中心\t34605\n对换\t34606\n13串\t34607\n谢芹\t34608\n委托贷款\t34609\n同人文\t34610\n阿啊\t34611\n金投收藏网\t34612\n20151123\t34613\n宁波建工\t34614\nwanqi\t34615\n8200\t34616\n工作法\t34617\n6033\t34618\n6.3分\t34619\n9元\t34620\n切肤之爱\t34621\n杜伦大学\t34622\n原处\t34623\n单挑王\t34624\ntools\t34625\n长渕刚\t34626\npostman模拟HTTP请求\t34627\n声霸\t34628\n红豆树\t34629\n70年代\t34630\n增值税销项税\t34631\nflexible\t34632\n岜沙\t34633\n联合光电\t34634\n北工\t34635\nmontana\t34636\nmotd\t34637\n石家庄公司\t34638\n快模\t34639\n瑞友\t34640\n吐下\t34641\n阎庆民\t34642\n药浴\t34643\n溪湖区\t34644\nd2d\t34645\n赛格导航\t34646\n复决\t34647\n北科大\t34648\n击打\t34649\n睡女\t34650\n何姓\t34651\n压迫\t34652\n强心苷\t34653\n定时\t34654\njarsigner\t34655\n第185集\t34656\n印行\t34657\nships\t34658\n拼豆\t34659\n及格\t34660\n窝端\t34661\n维艰\t34662\nGlide4.0\t34663\nqsv格式转换器\t34664\n长径\t34665\nVimIy\t34666\n俳句\t34667\n魏姓\t34668\nNOTE8\t34669\n双色球开机号\t34670\n烧纸\t34671\n吴世勋\t34672\n保俶路\t34673\nSD高达G世纪:起源\t34674\n道路运输许可证\t34675\n寻线仪\t34676\n翼虎网\t34677\n追爱\t34678\nBitter\t34679\n143个\t34680\n6132\t34681\n打赌\t34682\ncommitted\t34683\n4.5.8\t34684\n新建县人民政府\t34685\n购公\t34686\n铵盐\t34687\n画手\t34688\n癸\t34689\n线上\t34690\nolb\t34691\n中华人民共和国最高人民法院\t34692\n苹果iPad\t34693\n土鲮\t34694\n东灵山\t34695\nportland\t34696\n不可言说\t34697\n战恋\t34698\n混凝土公司\t34699\nbarbie\t34700\n狐媚\t34701\nRate\t34702\n四十二章\t34703\n靡靡\t34704\nMOXA\t34705\n垫付\t34706\n小胖丁\t34707\n赵晓\t34708\n仙田\t34709\n浮沉子\t34710\n95分钟\t34711\nabsence\t34712\n徐加爱\t34713\n法定刑\t34714\n长春经济技术开发区\t34715\n無碼\t34716\nElectron\t34717\npvm\t34718\n滔\t34719\n香奈尔\t34720\n邮储银行信用卡\t34721\n13万亿\t34722\n台州市九三腋臭康复中心\t34723\n有序性\t34724\n狗哥\t34725\n19条\t34726\n微学\t34727\n2018年末\t34728\n符师\t34729\n纪委书记\t34730\nsegments\t34731\n居处\t34732\n电烤箱\t34733\n长沙理工大学城南学院\t34734\n胼胝\t34735\n推拉窗\t34736\n电臀\t34737\n清香型\t34738\n剌激\t34739\n小故事大道理\t34740\nBJ40L\t34741\nh10\t34742\nwheezy\t34743\n华峰集团\t34744\n蜃影\t34745\nwolfplan\t34746\n抽纸\t34747\n污泥烘干机\t34748\n环城高速公路\t34749\n脑角\t34750\nG3800\t34751\nun\t34752\nMPU6050\t34753\n上海农商行\t34754\n风餐露宿\t34755\n太阳装\t34756\ne招贷
\t34757\nRP7.0\t34758\n非你莫属\t34759\n大圩\t34760\n老榆木\t34761\n宁乡县\t34762\n辽宁省人力资源和社会保障厅\t34763\n黎塞留\t34764\nA.1个\t34765\n邨\t34766\n岙\t34767\n柳暗花明\t34768\n峨庄\t34769\n吴宁\t34770\n幂集\t34771\n800美元\t34772\n潞河中学\t34773\n5XSQ\t34774\n近朱者\t34775\n划归\t34776\nmindnode\t34777\n凌知薇\t34778\n蒿草\t34779\n北京市人大常委会\t34780\n那些年代\t34781\nminecraf\t34782\n美容师\t34783\n白杨\t34784\n改编版\t34785\n粘钢胶\t34786\n48块\t34787\nsaks\t34788\n腿围\t34789\n第五届\t34790\n紫荆\t34791\n全能冠军\t34792\nUSA\t34793\n3101\t34794\n天佑\t34795\n人权\t34796\n秦凯\t34797\n三十年\t34798\n阿萨\t34799\n贺勇\t34800\n今世\t34801\n亚洲人\t34802\nMedline\t34803\nqtp\t34804\n30公分\t34805\n应达\t34806\n嬉\t34807\nhplc\t34808\n横飞\t34809\n上海交通大学医学院附属仁济医院\t34810\n购通\t34811\nperceived\t34812\nteardown\t34813\n无限空间\t34814\nqc25\t34815\n水暖毯\t34816\n20180416\t34817\n布帘\t34818\n幂次方\t34819\n健身架\t34820\n碳刷架\t34821\n85分钟\t34822\n百度有钱花吧\t34823\n可种\t34824\n90篇\t34825\n灼热感\t34826\n晚上10点\t34827\n清汤\t34828\n炮仗\t34829\n宁夏回族自治区政府\t34830\n百度云/\t34831\n人体干细胞\t34832\n花蟹\t34833\nstaying\t34834\nyie\t34835\n金年\t34836\n功效性\t34837\n省水利厅\t34838\n长期股权投资的成本法\t34839\n北新路桥\t34840\n纸伞\t34841\nCarter\t34842\n钻豹\t34843\n星矢\t34844\n连环画\t34845\nNestedScrollView\t34846\n梦缘\t34847\n弥陀镇\t34848\n南京邮电\t34849\n景田\t34850\n下塘镇\t34851\n安富莱\t34852\n一虎一席谈\t34853\n清国\t34854\nlo娘\t34855\nTGC\t34856\nfyi\t34857\n海峡号\t34858\n手机铃声网\t34859\n梳妆\t34860\n光山\t34861\n为国争光\t34862\n政企\t34863\n领读\t34864\n3733\t34865\n齿科\t34866\n7637\t34867\n有机气体\t34868\n浙海\t34869\na9lh\t34870\nConstant\t34871\n段柏文\t34872\n2505\t34873\n泸沽湖机场\t34874\n康乐保\t34875\n出油率\t34876\n三夫\t34877\n通用磨坊\t34878\n范氏\t34879\n蒙娜\t34880\n哀悼日\t34881\n瑞科\t34882\nstm8s003\t34883\n汤\t34884\nTangled\t34885\nyay\t34886\n多于\t34887\n永康街\t34888\n襄阳日报网\t34889\n二姐\t34890\n王剑锋\t34891\n第三种爱情\t34892\nDisposition\t34893\n广东男篮\t34894\n第4行\t34895\nnodemailer\t34896\n放一放\t34897\n9VG\t34898\n蒲庙镇\t34899\n_新闻中心\t34900\n招数\t34901\n9件套\t34902\n化纤布\t34903\n助力车\t34904\n月晕\t34905\n冯刚\t34906\n中山一中\t34907\n嘢\t34908\n李兆香\t34909\n第一幕\t34910\n异国\t34911\nevolution\t34912\n32码\t34913\n
逍遥社区\t34914\n云阳县\t34915\npif\t34916\n屡\t34917\n春雨沙沙\t34918\nZUO\t34919\n煤系\t34920\n臻品\t34921\ndakota\t34922\n朱自清春\t34923\n兴化市政府\t34924\n水玲珑\t34925\n李作鹏\t34926\n360N6\t34927\nSUQQU\t34928\n巡抚\t34929\ndpp\t34930\n第12页\t34931\n穆阳\t34932\nSUHO\t34933\n0期\t34934\n平等\t34935\n公羊传\t34936\n眼动仪\t34937\n赢币网\t34938\n活吃\t34939\n心语\t34940\n.0\t34941\n六联\t34942\n南粤银行\t34943\n收件\t34944\n未分配利润\t34945\n齐跳\t34946\n观音寺镇\t34947\n508路\t34948\n红橘\t34949\n创事\t34950\n顺时针\t34951\n575\t34952\n星石投资\t34953\n天齐网\t34954\n李河君\t34955\n想想办法\t34956\nsegfault\t34957\n加西亚\t34958\n速尿\t34959\n金本位制\t34960\n大妮\t34961\n功法\t34962\n早搏\t34963\n18名\t34964\n中国平安人寿保险股份有限公司\t34965\n干完\t34966\n人本轴承\t34967\nreservoir\t34968\nDang\t34969\ndeinstall\t34970\n3月份\t34971\n位育中学\t34972\nBREAK\t34973\n光景\t34974\n标致雪铁龙集团\t34975\n李相烨\t34976\nWordpress\t34977\n介孔二氧化硅\t34978\n450亿\t34979\n第三十三届\t34980\n宁波市总工会\t34981\n杭州尧舜科技有限公司\t34982\nintel显卡驱动\t34983\ncd盘\t34984\ngetDate\t34985\nfootjob\t34986\n1年以内\t34987\n发誓要\t34988\n该院\t34989\n纽约大都会博物馆\t34990\nLIFECLEAR\t34991\n第二十四集\t34992\n16299.19\t34993\n牌位\t34994\n祖传\t34995\n李茜\t34996\nrays\t34997\n榔梨镇\t34998\n119元\t34999\n北京长租公寓\t35000\n基操\t35001\n承购\t35002\n分表\t35003\n抹茶曲奇\t35004\n认质\t35005\n龙记\t35006\nYOURSELF\t35007\n剩者为王\t35008\nFLAGS\t35009\n油用牡丹\t35010\n黑政\t35011\n51la\t35012\n雷主\t35013\n刘惜芬\t35014\n中视传媒\t35015\n华强资讯\t35016\n浅水区\t35017\n大段\t35018\n掀起\t35019\nPCS\t35020\n路菲汐\t35021\nPTO\t35022\n中国男足\t35023\n人民政协\t35024\nhotmail邮箱\t35025\n4.6.1\t35026\nDisposal\t35027\n9.9万\t35028\n契子\t35029\n8000多元\t35030\n多倍体\t35031\n响_\t35032\n超级机器人大战OG2\t35033\ncampus\t35034\n常年\t35035\ncanno\t35036\nipx\t35037\n澳大利亚政府\t35038\n美国航空\t35039\n李君如\t35040\n回暖\t35041\n2018|\t35042\nouchn\t35043\nverifier\t35044\n施力\t35045\n管箍\t35046\n蓝砂石\t35047\n低碳饮食\t35048\n1.89\t35049\n长沙物流公司\t35050\n万能险\t35051\n黄立成\t35052\nCh\t35053\n驼色\t35054\nspeeds\t35055\n丁基胶\t35056\nChristianity\t35057\nHTML5播放器\t35058\n奠基人\t35059\n扫盘\t35060\n蚂蚁矿机s9\t35061\n决胜期\t35062\n销路\t35063\n擦伤\t35064\n马扎克\t35065\n5款\t35066\n正置\t35067\
n2300万\t35068\n民办非企业单位登记管理暂行条例\t35069\nAEAS\t35070\n养老保险公司\t35071\n哈希函数\t35072\n草皮\t35073\n100020\t35074\nnagoya\t35075\n满山红\t35076\n尊者\t35077\n早前\t35078\nWebex\t35079\n玄月\t35080\n列侬\t35081\n华尔街日报中文网\t35082\n阿根廷\t35083\n滁州市人民政府办公室\t35084\n称谓语\t35085\n易帜\t35086\n李诗英\t35087\n罗普斯\t35088\n精神报告\t35089\n钟管镇\t35090\n衣原体\t35091\n套丝机\t35092\nフ\t35093\n超搞笑笑话\t35094\n维咔\t35095\n4月14日\t35096\n泰宁县\t35097\n6支\t35098\n西南大学教育学部\t35099\n哈瓦那\t35100\n艾滋病病毒感染者\t35101\n张文强\t35102\n方正国际软件有限公司\t35103\n能上能\t35104\n醋酸乙酯\t35105\n棒子\t35106\n前池\t35107\n2600次\t35108\n樊纲\t35109\nc86\t35110\n棋牌评测网\t35111\n费墨\t35112\n回乡证\t35113\n二级域名\t35114\ng+\t35115\n北京公共交通集团\t35116\n美联高中\t35117\n翔安新城\t35118\n纵深感\t35119\n李胜\t35120\n中国知识产权局\t35121\nienumerable\t35122\n奥普\t35123\ndoker\t35124\n苏斯\t35125\n电子健康卡\t35126\n电脑管家\t35127\n3D背景墙\t35128\nEnder\t35129\n明升暗降\t35130\n不限购\t35131\n股权转让协议书\t35132\n伦敦鲸\t35133\nlx3\t35134\n西安铁道技师学院\t35135\nRenminbi\t35136\n绿意\t35137\n深圳国际学校\t35138\n短版\t35139\nVOD\t35140\n甘肃省旅游局\t35141\n阿里中心\t35142\n杰宝\t35143\ncorrelation\t35144\n秦宜智\t35145\nzzu\t35146\n1.5_\t35147\n八公山区\t35148\nZR\t35149\n1397\t35150\n魔图\t35151\n大轿\t35152\nswal\t35153\n48日\t35154\n有伤\t35155\n李阳\t35156\n礼包\t35157\n闹心\t35158\n三英战\t35159\nHD高清中国人\t35160\n观韬\t35161\nrol\t35162\n9300\t35163\n云月港网\t35164\n轮战\t35165\n中央网信办\t35166\n百家讲坛全集\t35167\n营业处\t35168\n长沙师范学院\t35169\n树水\t35170\nfoxmail.com邮箱\t35171\nWhitney\t35172\n25h\t35173\n桃花缘\t35174\nstable\t35175\n99.7%\t35176\n超美爆乳AV女优\t35177\nhttp-proxy\t35178\n灵台\t35179\n冷热水\t35180\nGTX760\t35181\n胸甲\t35182\n白刚\t35183\n梦幻西游人族\t35184\n蒙奇奇\t35185\n1080P高清区_电影港\t35186\n必然趋势\t35187\n网约车\t35188\n沃特玛\t35189\n尘埃4\t35190\n福田教育网\t35191\nIIS\t35192\n恐怕\t35193\n三十载\t35194\n地暖管\t35195\n599元\t35196\n1.39\t35197\n备案\t35198\n咏怀诗\t35199\nlicstar\t35200\n买买\t35201\n福厦\t35202\n欢乐喜剧人\t35203\n喷射美少女2\t35204\n喻云林\t35205\n热缩片\t35206\n慌忙\t35207\n等位\t35208\n火花机\t35209\n黑白分明\t35210\n激情四射\t35211\n子承父业\t35212\n半壶纱\t35213\n高尚者\t35214\n中国社会主义青年团\t35215\n铅矿\t35216\n受伤害\t35217\n箱房\t35218\n水泥稳定土\t35219\n木棉花
\t35220\n成都火车东站\t35221\n耸动\t35222\nSalvation\t35223\npdffactory\t35224\n长安CS75\t35225\n耐磨管\t35226\n南朝\t35227\n凡帝罗\t35228\n牙\t35229\n派别\t35230\n沉重打击\t35231\n团队协作\t35232\n柯尼卡\t35233\n韩再芬\t35234\n六字大明咒\t35235\n清喉利咽颗粒\t35236\n债资\t35237\ngreyson\t35238\nbaoma\t35239\n1厘米\t35240\n修道\t35241\n何凯文长\t35242\n犬神\t35243\n金领冠\t35244\n42位\t35245\n中国海洋石油总公司\t35246\n学\t35247\n652\t35248\n香液\t35249\n拉法\t35250\n友生意经\t35251\n428号\t35252\nMSCBSC\t35253\n4G全网通\t35254\n北京商铺网\t35255\n地标性\t35256\n爆吧\t35257\n爱快路由器\t35258\n劳燕\t35259\n健身车\t35260\n红嘴蓝鹊\t35261\n铝片\t35262\n疮\t35263\n14岁时\t35264\n澜沧古茶\t35265\n瑶寨\t35266\ngx8\t35267\n广州市番禺区政府\t35268\n分布筋\t35269\n拉丁化\t35270\n中国人民解放军总医院第一附属医院\t35271\n惊醒\t35272\n知识产权法学\t35273\n西湖风景名胜区\t35274\n开业\t35275\n下山桩\t35276\n偶合\t35277\n300万美元\t35278\n蝠\t35279\n2499元\t35280\n火检\t35281\n张建斌\t35282\n千级\t35283\n地城之光\t35284\n千子\t35285\nopensees\t35286\n心水\t35287\n陶庄\t35288\n嫌疑人X的献身\t35289\n梦幻西游仙玉\t35290\ngprs\t35291\n20151227\t35292\n2018-01-10\t35293\n拆车件\t35294\n中国企业家网\t35295\n新车\t35296\n光散射检测器\t35297\n局端\t35298\n永\t35299\n无心爱\t35300\nzhongjie\t35301\n前11月\t35302\n言传身教\t35303\n环形图\t35304\n炒粉\t35305\n张晓峰\t35306\neuc\t35307\n晶态\t35308\n上海航运交易所\t35309\n梵呗\t35310\n艾斯\t35311\n毒蛇\t35312\n第二盒\t35313\n称呼歌\t35314\n4538\t35315\nMetropolitan\t35316\n木浮生\t35317\n你自己\t35318\n自营销\t35319\n货站\t35320\nWEB编程\t35321\n在何方\t35322\n王文也\t35323\n智芯\t35324\n砍死\t35325\n阿房宫赋\t35326\nmweb\t35327\nPostMan\t35328\n丙纶\t35329\n沙排\t35330\n单元化\t35331\n纸笔\t35332\n心底\t35333\n气短\t35334\n社会工作实务\t35335\n啧\t35336\n知识竞\t35337\n这般\t35338\n员\t35339\n导书\t35340\n雏菊\t35341\n艺术设计学院\t35342\n下发\t35343\nFIX字幕侠\t35344\nHottest\t35345\n贺图\t35346\nDire\t35347\n1520\t35348\n石棉板\t35349\n口腔医\t35350\n汉堡王\t35351\n130万元\t35352\n上海尚尤实业有限公司\t35353\n2705\t35354\n球市\t35355\n长沙市委组织部\t35356\n匝道\t35357\n洁牙机\t35358\n必先\t35359\nwww.dz19.net\t35360\n西语助手\t35361\n期间费用明细表\t35362\n皇孙\t35363\n肠系\t35364\n汉景帝\t35365\n林昆\t35366\n猜一猜\t35367\n择天九真九阳\t35368\n石上纯\t35369\n郑州国际会展中心\t35370\nEMS快递单号查询\t35371\nck电影网\t35372\nMinnesota\t35373\n河套平原\t
35374\n张美\t35375\n外参\t35376\nmoudle\t35377\nMili\t35378\n马自达6论坛\t35379\n划伤\t35380\n陕西省安全生产监督管理局\t35381\n邓建国\t35382\nunity3d游戏\t35383\nori\t35384\n2018招\t35385\n林子祥\t35386\n桫椤\t35387\narticulate\t35388\n红柳\t35389\n赵正平\t35390\n猪肠\t35391\n上海寻梦信息技术有限公司\t35392\ngoon\t35393\n小学数学\t35394\n功能化\t35395\n铁件\t35396\n温州市教育局\t35397\n王国之心\t35398\nVDC\t35399\n机心\t35400\n女人节\t35401\n何明翰\t35402\n北京地区\t35403\n长寿镇\t35404\n炮萝\t35405\nIMM5257E\t35406\n扣关\t35407\n荒天帝\t35408\n冒失\t35409\n1箱\t35410\n项目式\t35411\n屏蔽仪\t35412\n示警\t35413\n隔离区\t35414\n本场\t35415\n陈凯\t35416\nHCG\t35417\nSOLUTION\t35418\n李凌\t35419\nforword\t35420\nsexmovies\t35421\n309\t35422\n中国室内设计师网_室内设计联盟_装修设计公司\t35423\nangie\t35424\n红机\t35425\n黄浦路\t35426\n帧频\t35427\n苏摩\t35428\n编辑部\t35429\n開啟\t35430\n交响诗篇\t35431\naccounts\t35432\n浙江工业职业技术学院\t35433\n你看起来很好吃\t35434\n汇金广场\t35435\n李秀彬\t35436\n登出\t35437\n南征北战\t35438\naldrich\t35439\n巾帼英雄\t35440\nac68\t35441\n湖南菜\t35442\n赤坎区\t35443\n科诚\t35444\n余世维\t35445\n三里人家\t35446\n扩产\t35447\n王宇\t35448\nyi子\t35449\n微群\t35450\n友群\t35451\n张瑶\t35452\n可采\t35453\n消毒剂\t35454\nCCTV1节目表|中央\t35455\n实感\t35456\n谈情\t35457\nsegnet\t35458\nBIKETO自行车论坛\t35459\n防伪标\t35460\n徐珠贤\t35461\n滞洪区\t35462\nsjy\t35463\n独断聊斋志异\t35464\n经贸学院\t35465\n茶博会\t35466\n14ISK\t35467\n铺板\t35468\n华银\t35469\n许哲\t35470\n营改\t35471\n立减\t35472\n鸾\t35473\nTurbulence\t35474\n上古5\t35475\n源代码托管中心\t35476\n魔兽争霸rpg\t35477\n集体建设用地使用权\t35478\n_肝病\t35479\nOpenStreetMap\t35480\n华明镇\t35481\n易算\t35482\n骊山\t35483\n口蜜\t35484\nplexus\t35485\n信长之野望14:创造威力加强版\t35486\n深厚\t35487\n太平洋软件下载中心\t35488\n城建集团\t35489\n十二层\t35490\nM7250\t35491\n借贷网\t35492\n棣棠花\t35493\n22关\t35494\n因为我\t35495\nApi\t35496\n威\t35497\n维普资讯网\t35498\n美爱\t35499\n辽宁都市频道\t35500\n80摄氏度\t35501\n塔罗斯\t35502\n赶街\t35503\nExcite\t35504\nSpeccy\t35505\n果冻效应\t35506\n绿航\t35507\nHFile\t35508\n重圆\t35509\n800kg\t35510\nAntioxidant\t35511\n特大\t35512\n惩防\t35513\n中国烹饪协会\t35514\nLakes\t35515\n52pcgame\t35516\n第97章\t35517\n篮球袜\t35518\n0735\t35519\n总裁爹地的宠妻法则\t35520\nCyborg\t35521\n恍惚\t35522\n痴女\t35523\n0928\t35524\n高栏
\t35525\n招商信诺\t35526\n照明展\t35527\n北京华联超市\t35528\n绝不放弃\t35529\n微积分学教程\t35530\n史家\t35531\n双建\t35532\n甲减\t35533\n中烟\t35534\nideapad110\t35535\n筑梦路上\t35536\ncnaps\t35537\n广西路\t35538\n左心室\t35539\n黑暗风\t35540\n专属版\t35541\n耦合式\t35542\n爸爸去哪儿4\t35543\netf基金\t35544\n6月30\t35545\n沈凯\t35546\ne0级\t35547\n慧星\t35548\nRFID读写器\t35549\n胸外\t35550\n140g\t35551\n三菱东京日联银行\t35552\n五菱宏光S\t35553\n孔夫子旧书网\t35554\n孤独\t35555\n桡\t35556\n现金卡\t35557\nndarray\t35558\n郑莹\t35559\n补偿费\t35560\nwi100sh\t35561\n英雄本色\t35562\nclarkson\t35563\n一年期\t35564\n团支\t35565\njedate\t35566\n浙江大学计算机学院\t35567\nligne\t35568\n老王\t35569\nEnergies\t35570\nPana\t35571\n怪物猎人世界恐暴龙\t35572\n吴波\t35573\n假账\t35574\n冷水浴\t35575\njiayayao\t35576\n鸿兴\t35577\npvp\t35578\n大浙网\t35579\n维京海盗\t35580\n802.11n\t35581\n地球日\t35582\n五颜六色\t35583\n64卦\t35584\n南屏晚钟\t35585\n百里画廊\t35586\n5000亿元\t35587\n10kv\t35588\n刘盈\t35589\n1792\t35590\n2016年11月7日\t35591\n改户\t35592\n真军\t35593\n生当作人杰\t35594\n串中\t35595\nmongol\t35596\n50HZ\t35597\n杀菌器\t35598\n液氧\t35599\n诗羽\t35600\n天使之翼\t35601\nvibrant\t35602\n1800mm\t35603\n洞\t35604\n迎江\t35605\n死或生5吧\t35606\n宰相刘罗锅\t35607\n若羌县\t35608\n变态王子与不笑猫\t35609\n出纳员\t35610\n华语_电影网\t35611\n陶氏化学\t35612\n岳阳县\t35613\n劲翔\t35614\n忠爱无言\t35615\n6.3.16\t35616\n贪吃蛇大作战\t35617\n月城\t35618\nPowers\t35619\n上海市出入境管理局\t35620\n香韵\t35621\n溶出仪\t35622\n注册证\t35623\ntic\t35624\nRCCSE\t35625\n艾芙洛\t35626\n氧化皮\t35627\nofs\t35628\n先婚\t35629\n幸运儿\t35630\n喷镀\t35631\n京万红软膏\t35632\n把握机会\t35633\n双生灵\t35634\n第六十六条\t35635\n猛兽\t35636\n彩龙\t35637\n华东师范大学\t35638\n财产税\t35639\n王昊\t35640\n川\t35641\n删库\t35642\n乌镇镇\t35643\n察察\t35644\n地书\t35645\n送机\t35646\n第05\t35647\n郭德刚\t35648\n老怪\t35649\n环形\t35650\nugly\t35651\n哆啦A梦吃惊\t35652\n绥江\t35653\n单室\t35654\n天津市口腔医院\t35655\n可靠度\t35656\n突现\t35657\n国平\t35658\n联想Y450\t35659\n梦想启航\t35660\n杨惠妍\t35661\n珠海市公共资源交易中心\t35662\n顺境\t35663\n大脸妹\t35664\n天齐\t35665\n荷叶\t35666\n百度竞价托管\t35667\n海康卫视\t35668\n家悦\t35669\n涼子\t35670\n快女\t35671\n黄家湖\t35672\n成田\t35673\n保民\t35674\n水濑祈\t35675\n放通\t35676\n杨晓敏\t35677\n云锡\t35678\n及其\t35679\n零子\t35680\n许渊\t35681\
ntotal_fee\t35682\nMestReNova\t35683\n双键\t35684\nAVAudioPlayer\t35685\n中户\t35686\n阻碍\t35687\nfiletype\t35688\n松雨\t35689\n黑豹天下\t35690\nPPPoE\t35691\n征\t35692\n融水苗族自治县人民政府\t35693\n平基\t35694\n包导\t35695\n格拉斯哥大学\t35696\n一般说来\t35697\n哈尔滨师大附中\t35698\n祝酒歌\t35699\n底裤\t35700\n癞子\t35701\n心理学史\t35702\n巴国城\t35703\nLOST\t35704\nlotte\t35705\n考\t35706\n创造性\t35707\n3.20\t35708\n五里亭\t35709\n中道\t35710\n理科男\t35711\n步枪\t35712\n小艇\t35713\n燃油\t35714\nPig\t35715\n观世\t35716\n首钢总公司\t35717\nwww.sure56.com\t35718\n爽歪歪\t35719\n湿度\t35720\n秋无痕\t35721\nbeaglebone\t35722\n糖量\t35723\n甘薇\t35724\n天天看高清\t35725\n假动作\t35726\n兰儿\t35727\n后内\t35728\nanalyzer\t35729\n3月27号\t35730\n讲解稿\t35731\n扣编\t35732\n黑产\t35733\n刮胡刀\t35734\n北京理工大学附属中学\t35735\nTIC\t35736\n法形\t35737\nckook\t35738\n社保号\t35739\nns-3\t35740\nloft\t35741\n初章\t35742\n小考拉\t35743\n宅网\t35744\n现代主义\t35745\n冲阵\t35746\n满星\t35747\n八百元\t35748\n验旧\t35749\nscs\t35750\n萧淑慎\t35751\n决胜局\t35752\n湖州中心医院\t35753\n首饰\t35754\n全品\t35755\n失焦\t35756\n桃花源记2吧\t35757\n控油\t35758\n宁南\t35759\n不堪设想\t35760\nanydesk\t35761\n恩施职业技术学院\t35762\n闸北实验小学\t35763\n马面\t35764\nMagical\t35765\n平安谷之诡谷传说\t35766\n磨成\t35767\n国家科技图书文献中心\t35768\n希卡利\t35769\n小祥\t35770\n中航\t35771\n颤振\t35772\n中国医药联盟\t35773\n硫化剂\t35774\n邪恶漫画3d\t35775\nGiving\t35776\n土豆汁\t35777\n诺亚舟\t35778\n相线\t35779\n直流电机调速器\t35780\n芭蕾\t35781\n瑞金路\t35782\n350000\t35783\n柳月\t35784\n美化\t35785\n微闻\t35786\n浩劫DH\t35787\nSpiral\t35788\n旋转编码器\t35789\n钱清\t35790\n超融合\t35791\n张德培\t35792\n张中行\t35793\n漾\t35794\n离散余弦变换\t35795\n戴尔(中国)有限公司\t35796\nRobbins\t35797\n王永志\t35798\n长沙市住建委\t35799\n脱密\t35800\nIptables\t35801\n金泉广场\t35802\nPPT图表\t35803\n没看\t35804\n全文字\t35805\n1605\t35806\n磷化\t35807\n瘦身咖啡\t35808\n转手\t35809\n航空路\t35810\n联想超极本\t35811\nIT即时新闻_南方网\t35812\n夏昆冈\t35813\n邪恶漫画大全_高h漫画_h漫吧_日本h漫画_h动漫\t35814\n8gb\t35815\n青瞳\t35816\n大天使之剑\t35817\n艾尔莎\t35818\n识别率\t35819\nSnowDBA\t35820\nTOOL\t35821\n第144章\t35822\ncscx\t35823\n阿特拉斯·科普柯\t35824\n优姿\t35825\n厂商\t35826\n乐基儿\t35827\n关索\t35828\n七校联考\t35829\nbeeg\t35830\n斯坦利\t35831\n保利海德公园\t35832\n牙周科\t35833\n广阳岛
\t35834\n中岛美雪\t35835\n东风商用车\t35836\n星源\t35837\n起底\t35838\n聚氨酯发泡胶\t35839\n益州大道\t35840\n轩盎\t35841\n二批\t35842\ncanting\t35843\nflew\t35844\n恋夜\t35845\n危旧\t35846\n甩卖\t35847\nToni\t35848\n支气管扩张\t35849\n吉林省国土资源厅\t35850\n追魂\t35851\n二叶\t35852\n苏宁帮客\t35853\n滨湖世纪城\t35854\n无尘纸\t35855\nspringboot2\t35856\n血鹦鹉\t35857\n卡梅伦\t35858\n大英百科全书\t35859\n公安学\t35860\n为什么不让\t35861\nspanked\t35862\n王乃恩\t35863\n玻璃影\t35864\nlasting\t35865\n北京华科医院\t35866\n安妮\t35867\n石油人才网\t35868\n20v\t35869\n野良\t35870\n分局\t35871\ndreammail\t35872\n美观度\t35873\nxcar\t35874\n普兰\t35875\n花溪镇\t35876\n筛窦\t35877\n买卖网\t35878\n莱卡相机\t35879\n住进来\t35880\n12艘\t35881\n杨立华\t35882\n仙武\t35883\nθ\t35884\n周岁宴\t35885\ncorrespondence\t35886\n钱云\t35887\n信而富\t35888\n1.12\t35889\n主键\t35890\n早熟\t35891\n华安县\t35892\n丧志\t35893\n索县\t35894\n汤羹\t35895\n吃元宵\t35896\nfadeout\t35897\n皱\t35898\n畜牧师\t35899\n非治违\t35900\n证据学\t35901\n孟博\t35902\n赵东来\t35903\n书夹\t35904\n山西省安全生产监督管理局\t35905\n[全职高手\t35906\n解数\t35907\n933\t35908\n逆羽\t35909\n承包户\t35910\n关节处\t35911\n大南路\t35912\n恐怖箱\t35913\n小月\t35914\n奔驰c63\t35915\n盖儿\t35916\n王铭章\t35917\nOptimus\t35918\nflanneld\t35919\n必不\t35920\n开赛\t35921\n被撞坏\t35922\ncheat\t35923\n印钱\t35924\n大龙哥\t35925\n中安教育网\t35926\n两篇\t35927\n麦滋林\t35928\n易凡\t35929\n藏经\t35930\n流花湖公园\t35931\n流媒\t35932\n诱捕器\t35933\n西财在线\t35934\n求懂\t35935\ny410p\t35936\n缺血\t35937\n山头\t35938\n鼠尾\t35939\nTCP\t35940\n黄水\t35941\nWECN\t35942\n金碟\t35943\n出塞曲\t35944\n700MB\t35945\n孟加拉国\t35946\n制品\t35947\nk值\t35948\n回天新材\t35949\n细胞分裂\t35950\n辟\t35951\n圣殿\t35952\n喜糖\t35953\n魔法值\t35954\n长青股份\t35955\n过行\t35956\njai\t35957\nTHS\t35958\n600余\t35959\n50多次\t35960\n斯道拉恩索\t35961\n上板\t35962\n十八章\t35963\n先马电源\t35964\n即视感\t35965\n天御\t35966\n变频\t35967\n李佐军\t35968\n海康萤石\t35969\n与\t35970\n名扬四海\t35971\nwhistle\t35972\nViolet\t35973\n189.cn\t35974\n停炉\t35975\n羊奶果\t35976\n驱油\t35977\n40个月\t35978\n岸本\t35979\n8万元\t35980\n技术史\t35981\n马林斯基剧院\t35982\ncreat\t35983\n5h\t35984\n卡塞尔\t35985\n胡萍\t35986\nrejected\t35987\n思归\t35988\n李国祥\t35989\n三台山\t35990\n绿帽文\t35991\n钻石级\t35992\nMatlab2016\t35993\
n分股\t35994\nBlancpain\t35995\n舒淇\t35996\n拉手\t35997\n安徽广播电视台\t35998\n雪狼\t35999\n601225\t36000\n退单\t36001\n深圳市住房和建设局\t36002\n13班\t36003\n王庆爽\t36004\n51级\t36005\nnv\t36006\ngh4\t36007\n书帖\t36008\nhasa\t36009\n与应\t36010\n比弗利山庄\t36011\n51\t36012\n2.0cm\t36013\n五栋\t36014\n0425\t36015\n港口区\t36016\n三2018\t36017\noutlook邮箱\t36018\nnonatomic\t36019\n住宅用地\t36020\n吸出\t36021\ndbv\t36022\n颚\t36023\nsimilar\t36024\n兆头\t36025\n帆布包\t36026\n管工\t36027\n八字纹\t36028\n正负极\t36029\n迷你特工队\t36030\necg\t36031\n转合\t36032\n乐清政府\t36033\n本拉登\t36034\n广告牌\t36035\n不琢\t36036\n范大姐\t36037\nbist\t36038\n零申报\t36039\n导体\t36040\n刘晓莉\t36041\n照片流\t36042\nSSPD\t36043\nHIFI\t36044\n寻梦录\t36045\n斋藤工\t36046\n东城区东\t36047\n低热量\t36048\n保温台\t36049\n非金属膨胀节\t36050\n布布\t36051\n忧伤者\t36052\n中国推销员\t36053\n住进\t36054\n600507\t36055\nhes\t36056\n国家联盟\t36057\nLAMER海蓝之谜\t36058\n鸭脖子\t36059\n张仪\t36060\n半年度\t36061\n巨响\t36062\n0.94\t36063\n租赁期\t36064\n36颗\t36065\ntorrentkitty中文网\t36066\n十八里店乡\t36067\n2.2万元\t36068\n兵权\t36069\n枣庄路\t36070\n诺秒贷\t36071\n爆藻\t36072\n敬老卡\t36073\noffice办公软件\t36074\n北鼻\t36075\n暴走脑残\t36076\n代号47\t36077\n作假\t36078\n小提琴\t36079\n膀胱镜检查\t36080\n再晨\t36081\nlpxxn\t36082\nVoodooHDA\t36083\n阿莲\t36084\n圣安德鲁斯\t36085\n20170428\t36086\n龙姓\t36087\n大操大办\t36088\npairwise\t36089\n58章\t36090\n宁波雅戈尔\t36091\nukvi\t36092\n集权\t36093\npythoncharm\t36094\n新酒\t36095\nCompilation\t36096\n北京航空航天大学出版社\t36097\n四千年\t36098\n回教\t36099\n午线\t36100\n保险片\t36101\n圣痕\t36102\n新梅\t36103\n医_寻医问药网\t36104\n蝶讯网\t36105\nTote\t36106\n东星\t36107\n【求文】\t36108\n伊藤诚\t36109\n干子\t36110\n大主大唐双龙传\t36111\n恐龙战队\t36112\n担心\t36113\n我的英雄学院第三季\t36114\n粉岭\t36115\n解套\t36116\n卤煮火烧\t36117\n春巫\t36118\n天天卡牌\t36119\n索芙特\t36120\n猫千草\t36121\n谷歌地球\t36122\n反叛的鲁路修\t36123\n李子坝\t36124\n尼伯特\t36125\n南门路\t36126\n半集\t36127\n到货\t36128\n盐度\t36129\n教学大纲\t36130\n科瑞莱\t36131\n美术学院\t36132\n程潇梁宏达\t36133\n加特\t36134\nGas\t36135\n公共英语三级\t36136\n云里雾里\t36137\n刚柔并济\t36138\necstasy\t36139\nT恤衫\t36140\n布板\t36141\nDCDC变换器\t36142\n1.5M\t36143\n尸\t36144\n逸事\t36145\nT6\t36146\nwpm\t36147\nbboy\t36148\nff10\
t36149\n国家观\t36150\nruff\t36151\n瓜君\t36152\n新版跑狗图\t36153\n韶山市\t36154\n骨套\t36155\njabber\t36156\n征途怀旧版\t36157\n力所能及\t36158\n后付\t36159\n湘江\t36160\n爱琳\t36161\ncholesky\t36162\nPCO\t36163\n精思\t36164\n比特财经\t36165\n22只\t36166\n沥海\t36167\n六十六节\t36168\n东莞酒店\t36169\n德科技\t36170\n林清\t36171\n绣衣\t36172\n做我\t36173\n子数组\t36174\n探源\t36175\n董源\t36176\n烈性犬\t36177\n向上网\t36178\n晋城一中\t36179\n⑧\t36180\n5W40\t36181\n南湖壹号\t36182\n茨冈\t36183\nMoonshadow\t36184\n贺词\t36185\n重装上阵\t36186\n顺顺\t36187\niDEAAM\t36188\n坂本龙\t36189\n顽张\t36190\n潇湘水云\t36191\n打开率\t36192\n李鹏程\t36193\n西西特\t36194\n爱江山更爱美人\t36195\n120P\t36196\n小卖\t36197\napache-jmeter\t36198\nsg3525\t36199\n北师大附中\t36200\n傣\t36201\n大圩镇\t36202\n应该\t36203\n大唐电信科技股份有限公司\t36204\n福建省总工会\t36205\n上海市中学\t36206\n熊出没之秋日团团转\t36207\nITH\t36208\n豪世华邦\t36209\n侨福芳\t36210\n第一流\t36211\n翻花\t36212\nkern\t36213\n聚苯乙烯泡沫板\t36214\n浑源县\t36215\nkbs\t36216\n唐介\t36217\n双面镜\t36218\n考山路\t36219\n极战\t36220\n曹明铭\t36221\n256k1\t36222\n新余一中\t36223\n张张\t36224\n马雷\t36225\n143.0\t36226\n洛凡\t36227\n黑暗之门\t36228\ndmso\t36229\n小天王\t36230\n阴债\t36231\n单面\t36232\n大汗淋漓\t36233\nsybil\t36234\ncharat\t36235\n世纪莲\t36236\ng6s\t36237\n年平均\t36238\n自助办\t36239\n15779930712\t36240\n潘火\t36241\n述德\t36242\n获悉\t36243\n几百条\t36244\nmelrose\t36245\n强\t36246\n街游\t36247\n鹿鼎记\t36248\n标志\t36249\n轴体\t36250\n中华传统美德故事\t36251\n画带\t36252\n起亚极睿\t36253\n鲢鳙\t36254\n长安欧尚A800论坛_长安欧尚A800车友会\t36255\n晚清\t36256\n社交性\t36257\n磁疗\t36258\n中德英伦联邦\t36259\n无罪判决\t36260\n盛装\t36261\n掌柜学院\t36262\n俱佳\t36263\n信号量\t36264\n陆生\t36265\n20160426\t36266\n二分\t36267\nntlm\t36268\n黄河北\t36269\n叠彩区\t36270\n组织\t36271\n4两\t36272\n新丰县人民政府\t36273\n仕程\t36274\n混合型\t36275\n离婚后\t36276\njsoup\t36277\n盘刹\t36278\n藏酒\t36279\n中世纪\t36280\n瑞滨\t36281\n爱默生\t36282\n蓝v\t36283\nbilili\t36284\n超小型\t36285\n5联\t36286\n赵旭\t36287\n3100\t36288\n妈妈性\t36289\n湖南话\t36290\n8003\t36291\n文史哲\t36292\n极光号\t36293\n抗菌膜\t36294\nOnePiece\t36295\n藏娇\t36296\n板卷\t36297\n魔笛MAGI\t36298\n钉机\t36299\n第三十六期\t36300\n电子衍射\t36301\ntoxic\t36302\n大连金州\t36303\n无底洞\t36304\n三1\t36305\n敦泰\t36306\n陈彦妃\
t36307\n古银\t36308\n锡北\t36309\n炭块\t36310\n阮小二\t36311\n类风湿性关节炎\t36312\n智能灯\t36313\ncod6\t36314\n三国机密之潜龙在渊\t36315\n77777\t36316\nAtkins\t36317\n龙珠剧场版\t36318\n魔君\t36319\nOokla\t36320\n漏鸟\t36321\ntouch4\t36322\nws5100\t36323\nMIS\t36324\n北京地铁17号线\t36325\n红昭\t36326\nOLD\t36327\n税务登记表\t36328\n同比例\t36329\n蚂蟥\t36330\n尧帝\t36331\n恩和\t36332\n龙腾出行\t36333\n小常识\t36334\n万两\t36335\n欧剧\t36336\n妖夜\t36337\n楚乔传2\t36338\nblah\t36339\n嘉丰\t36340\n敢死队3\t36341\nATHEROS\t36342\nAliExpress\t36343\ngef\t36344\n个人征信报告\t36345\n北京交通运输职业学院\t36346\n近视者\t36347\n滚开\t36348\n逞能\t36349\n雷兹拉\t36350\n月湖桥\t36351\n齐家家博会\t36352\n6月13日\t36353\n孟某\t36354\n飞哥与小佛\t36355\n舒琪\t36356\n2战\t36357\nextender\t36358\n【敖\t36359\nlesser\t36360\n太阳穴\t36361\nestar\t36362\nccminer\t36363\n赵灵儿\t36364\nqqc\t36365\nRDF\t36366\n第60章\t36367\n科左中旗\t36368\n忻州网\t36369\n123路\t36370\n黑装\t36371\n蝙蝠侠:阿甘之城\t36372\nbangladesh\t36373\n139号\t36374\n梁思源\t36375\n思远\t36376\n平台部\t36377\n当幸福来敲门\t36378\n标致308论坛_汽车之家论坛\t36379\nCondi\t36380\n张志超\t36381\n2K11\t36382\n鸣鹤\t36383\n洗澡\t36384\n点击行\t36385\n浙江农林大学暨阳学院\t36386\n计重\t36387\n贴暖\t36388\n卡柏\t36389\nperfect\t36390\n彩神争霸\t36391\n古惑仔2\t36392\n冷弯成型机\t36393\n大学生村官\t36394\n于阗\t36395\n北京现代途胜\t36396\n安全生产事故案例分析\t36397\n天泰路\t36398\n焦裕禄\t36399\n香港证券交易所\t36400\n火刀\t36401\n台州清风网\t36402\n歌曲\t36403\n述语\t36404\n国家食品药品监督管理总局食品药品审核查验中心\t36405\nshuttle\t36406\n北京钓鱼台国宾馆\t36407\n打回原形\t36408\n生活提示\t36409\n酒歌\t36410\n介休\t36411\n300MM\t36412\n南国\t36413\n大埔论坛\t36414\n余力\t36415\n寿宴\t36416\n瑞昌市\t36417\ngnuradio\t36418\n发黑\t36419\nfuer\t36420\n自由侠\t36421\nAccelerate\t36422\n90后00后\t36423\n小蓝点\t36424\nhonorable\t36425\n仪器\t36426\n认得\t36427\nsui\t36428\nvcsa\t36429\n亮起来\t36430\nSmiley\t36431\n11幢\t36432\n油炸花生米\t36433\n阴皮\t36434\n常压锅炉\t36435\n智库中国_中国网\t36436\n商店\t36437\n犯罪分子\t36438\n薛城\t36439\n飚车世界吧\t36440\nGHS\t36441\nPercy\t36442\n十六个月\t36443\n昆山市人民政府\t36444\n恶魔小丑\t36445\n欢迎辞\t36446\n最后生还者\t36447\ndiesel\t36448\n甘肃中医药大学\t36449\n一照多址\t36450\n世贸大楼\t36451\n终极恶女\t36452\n上华\t36453\n旭辉集团股份有限公司\t36454\n脓胸\t36455\n康斯登\t36456\nEvolut
ion\t36457\nqixin\t36458\ngongyuan\t36459\n反倾销\t36460\n平阳信息网\t36461\n浪迹天涯\t36462\n纹身馆\t36463\n长沙环境保护职业技术学院\t36464\ntuesday\t36465\n恒顺众昇\t36466\n1000次\t36467\n武清信息网\t36468\n17ce\t36469\n他乡\t36470\nbirthday\t36471\n广州医科大学附属第五医院\t36472\n滴滴美团\t36473\n饶舌\t36474\nwhit\t36475\n35公分\t36476\n沈阳地铁2号线\t36477\n123&\t36478\n嵩山少林寺武术学校\t36479\n边伯贤\t36480\n韦元强\t36481\n冲孔\t36482\n兖矿集团\t36483\nTitty\t36484\n数盟\t36485\nInnovative\t36486\n关联分析\t36487\n市域\t36488\n提新\t36489\nBeasts\t36490\nprimer5\t36491\n姜末\t36492\n农讲所\t36493\n黑龙江省纪委\t36494\n伊春区\t36495\n刀鞘\t36496\n逊\t36497\n20篇\t36498\n360_中国绿色数据中心\t36499\n尹斗俊\t36500\n深圳交通银行\t36501\nc1\t36502\n杭州市体育局\t36503\nTheater\t36504\n孝感市公安局\t36505\n说嫁\t36506\nmetabolism\t36507\n更惨\t36508\nAPRIL\t36509\n一念无明\t36510\n一探\t36511\n240个\t36512\n肝囊肿\t36513\n格兰特\t36514\n排水泵\t36515\nLeCoultre\t36516\n一窥\t36517\n程序猿\t36518\n财务部门\t36519\n郭河镇\t36520\n吉祥航空\t36521\n牢友\t36522\n颜面膜\t36523\n+\t36524\n8孔\t36525\n寒武纪年原创网\t36526\n零价\t36527\n硕放机场\t36528\n斯特林\t36529\n冉\t36530\n约翰·纳什\t36531\nnote4吧_\t36532\n虎山镇\t36533\n4.3寸\t36534\n纬编\t36535\nSkirts\t36536\n盼达\t36537\n惟有\t36538\n空隙\t36539\nDer\t36540\n堆垛\t36541\n放坡\t36542\n海丽\t36543\n三姐\t36544\n樵夫\t36545\nPUNK\t36546\nadhesion\t36547\n1303\t36548\n祁东县\t36549\n哼唧\t36550\n李旭瑞\t36551\n内贸\t36552\n沉与浮\t36553\nunescape\t36554\n冬衣\t36555\nCasa\t36556\n公款案\t36557\n修魔\t36558\n白玉兰\t36559\n周劲松\t36560\n饲育\t36561\n全影\t36562\n3edu教育网\t36563\n叮咛\t36564\n中山大学数据科学与计算机学院\t36565\n金州勇士\t36566\n睾\t36567\n进口关税\t36568\n吴阿敏\t36569\nrabbitmqctl\t36570\n一箩筐\t36571\n零碎\t36572\n松本メイ\t36573\n北京朝阳\t36574\n20斤\t36575\n卡旺卡奶茶\t36576\n家者\t36577\nmysqld\t36578\n柱式\t36579\n青州古城\t36580\n路端\t36581\n3.60\t36582\n淮阴侯列传\t36583\n为官不为\t36584\n11秒\t36585\n红白喜事\t36586\n高频板\t36587\n痴情司\t36588\nZX\t36589\n抗压吧\t36590\nScotland\t36591\nHoshino\t36592\n翻唱版\t36593\n分帧\t36594\n老钟\t36595\n羊肉粉\t36596\nHighs\t36597\nHTML51\t36598\n伊森\t36599\n文津\t36600\n红米手机1S\t36601\n0.012\t36602\n中国国际医疗器械\t36603\n金蝶软件(中国)有限公司\t36604\n碧们\t36605\n人装\t36606\n狄戈\t36607\nWeapons\t36608\n服务队\t3
6609\n第一创业证券股份有限公司\t36610\n炭雕\t36611\n酷似\t36612\n六片\t36613\n耦合器\t36614\n200马力\t36615\n主仆\t36616\n汪洋\t36617\n汽车保险\t36618\n贝瑞基因\t36619\n富民银行\t36620\n淮安人才网\t36621\n爱建集团\t36622\n扣型\t36623\nisset\t36624\n瞩目\t36625\n华意压缩\t36626\nripper\t36627\n16MM\t36628\n传写\t36629\nevaluated\t36630\n夏目奈奈\t36631\nweishu\t36632\n圣迹\t36633\nChef\t36634\nPadding\t36635\nCore\t36636\n星座\t36637\n联苯双酯\t36638\nvil\t36639\nAlphabetical\t36640\nfutaba\t36641\n自花授粉\t36642\ndarkest\t36643\n裸脸\t36644\n发芽\t36645\n时间类\t36646\nlfsblack\t36647\n华子良\t36648\n惠州港\t36649\n52kkm\t36650\n杨奇\t36651\n九个小时\t36652\n马列维奇\t36653\nled日光灯\t36654\n双力臂\t36655\n段位赛\t36656\n无锡市小学\t36657\nFUN\t36658\n刺客信条1\t36659\n仙路医妃独步天下\t36660\n初音\t36661\n竖曲线\t36662\nnrt\t36663\n急于\t36664\n95522\t36665\n孟庆云\t36666\n双责\t36667\n电子寄存柜\t36668\n大成社区\t36669\n蒋诗萌\t36670\n求积\t36671\n转存\t36672\n张榕容\t36673\n佘北\t36674\n索穆尔\t36675\n晓云\t36676\nLibSVM\t36677\n余仁生\t36678\n文森特\t36679\n嗡嗡\t36680\nDEC\t36681\nwht\t36682\nDepartments\t36683\n800ml\t36684\n蓝青学校\t36685\n川府函\t36686\n厥\t36687\n招行信用卡还款\t36688\n色差\t36689\n华南师范大学继续教育学院\t36690\n丽拉\t36691\n春哥\t36692\n权臣\t36693\n城际铁路\t36694\n胡巴\t36695\nBergen\t36696\n县府\t36697\n洗沙机\t36698\n携\t36699\n总代理店\t36700\n云展\t36701\n秀点\t36702\n赵一\t36703\neen\t36704\n料板\t36705\n挖眼\t36706\n林茂\t36707\nSling\t36708\n扭矩传感器\t36709\nCFM\t36710\n航洋\t36711\n2345浏览器\t36712\n怡宝水\t36713\n湖南省商务厅\t36714\n帝喾\t36715\n十二届全国人大三次会议\t36716\n中国铁道建筑总公司\t36717\n王一平\t36718\n暴走告解室\t36719\n阴神\t36720\n17000元\t36721\nigfx\t36722\n马苏战舰世界\t36723\n上雨\t36724\n能态\t36725\n星球三国如龙传\t36726\n热肠\t36727\n16元\t36728\n装房\t36729\n样态\t36730\n起源\t36731\n袭击案\t36732\n赵姓\t36733\n索引键\t36734\n长寿山\t36735\n异面\t36736\n十七\t36737\n分析篇\t36738\n现世\t36739\n天水市人民政府\t36740\ninput-jdbc\t36741\n护套线\t36742\n时间校准器\t36743\n一颗心\t36744\n熊本县\t36745\n渤\t36746\n访谈\t36747\n电风扇\t36748\n备案号\t36749\n中国翻译协会\t36750\n中国三江人才网\t36751\n突击者\t36752\nVol.4\t36753\n餐车\t36754\n高台县\t36755\n想逃\t36756\n逆城市化\t36757\n裁衣\t36758\n怪物猎人世界大锤\t36759\n巴托克\t36760\n天闻\t36761\n程序侠\t36762\n非晶硅\t36763\n钻孔桩\t36764\neD\t36765
\n建党伟业\t36766\n糊味\t36767\n非空单元格\t36768\n封发\t36769\n撬开\t36770\n柱脚\t36771\nEBOD\t36772\n汉海海洋公园\t36773\n李志辉\t36774\nkingdle\t36775\n网费\t36776\n迷\t36777\n自配\t36778\nOldman\t36779\n桂圆肉\t36780\n快干胶_AB胶\t36781\n复效\t36782\n人皇\t36783\n585号\t36784\n瓷贴面\t36785\n第3集\t36786\n钟英\t36787\n7.4\t36788\n吉林省人社厅\t36789\n牛角面包\t36790\n黄征\t36791\n国家安全\t36792\nple\t36793\n2016、2017年\t36794\n巨石雷军\t36795\n哈里伯顿\t36796\n建外soho\t36797\n悬而未决\t36798\nc照\t36799\n中国科学技术大学物理学院\t36800\n交谊舞曲\t36801\n江苏开放大学\t36802\n茂名政府网\t36803\n春风动力\t36804\n灭运\t36805\nimagesc\t36806\n油印本\t36807\nsalesforce\t36808\n宣传处\t36809\n1880元\t36810\n拒嫁豪门:少夫人\t36811\n候选人\t36812\nresulttype\t36813\nwhc\t36814\n田秀英\t36815\n联络互动\t36816\nesee\t36817\n池昌旭\t36818\nmnist\t36819\nadobeacrobat\t36820\n百老汇\t36821\n赫程\t36822\n缓释片\t36823\n草草\t36824\n深圳福田\t36825\nFCKeditor\t36826\n加瓦\t36827\nhemato\t36828\n恩恩怨怨\t36829\n毁灭之王\t36830\nAwareness\t36831\n河南电信\t36832\n电脑绣花机\t36833\nexm\t36834\nNami\t36835\n荥阳政府\t36836\n南海禅寺\t36837\n王俊雄\t36838\n布鲁克\t36839\n前列腺癌\t36840\n美国罗格斯大学\t36841\n朝阳区图书馆\t36842\n湛江市人力资源和社会保障局\t36843\n三五七\t36844\nCubes\t36845\n自适\t36846\n降速\t36847\n格力斯\t36848\n乐山市中区\t36849\n紫光云数\t36850\n002310\t36851\n蔡国声\t36852\n龙吟虎啸\t36853\n威少\t36854\n见客\t36855\n一年半\t36856\n斗争\t36857\n非甲烷\t36858\n收件服务器\t36859\n荷兰村\t36860\n艾瑞泽6\t36861\n爱丽舍论坛\t36862\n爱莎\t36863\n道骨\t36864\n1000日元\t36865\ntochar\t36866\n呈现\t36867\n新竹县\t36868\nBV\t36869\n百行\t36870\n杭州办公室\t36871\n福南\t36872\n扶他林软膏\t36873\nicoud\t36874\nVM虚拟机\t36875\n枝头\t36876\n宠物狗\t36877\n非医学\t36878\n慢性肾脏病\t36879\n补齐\t36880\n旋盖\t36881\n张小斐\t36882\n集成灶网\t36883\n魔男\t36884\n福建医科大学附属协和医院\t36885\n诱食剂\t36886\n财农\t36887\n轮播器\t36888\n闪战\t36889\ntill\t36890\n2毫米\t36891\n电子信息工程学院\t36892\nJune\t36893\ngetattr\t36894\n上海核工程研究设计院\t36895\n教育发展基金会\t36896\n男宾\t36897\n新消防法\t36898\n吴海英\t36899\nDoyourself\t36900\n九卿\t36901\n钛镁合金门\t36902\n陕菜\t36903\nmessi\t36904\n中国联合网络通信有限公司\t36905\n邪恶男女\t36906\n中国好歌曲第二季-学员金曲\t36907\n鞭炮声\t36908\nchdir\t36909\nConventional\t36910\n五江\t36911\n绝技\t36912\n至强版\t36913\n志愿队\t36914\nbaidutop\t3
6915\n明智光秀\t36916\n溢脂性皮炎\t36917\n球珠\t36918\nwriting\t36919\n+0086\t36920\n磨铁中文网\t36921\n彝族火把节\t36922\n盘锦红海滩\t36923\n以致\t36924\nJdbc\t36925\nbenks\t36926\nKeil5\t36927\n取流\t36928\n明月几时有\t36929\n华夏证券\t36930\n深圳汽车站\t36931\n脱稿\t36932\n舟山自贸区\t36933\n一个平方米\t36934\n春期\t36935\n双桶洗衣机\t36936\n微量元素\t36937\n华讯财经\t36938\n段誉\t36939\nAJ11\t36940\n门襟\t36941\nGCN\t36942\n无欲\t36943\n儿童版\t36944\n芭提\t36945\n哪所\t36946\n观音灵\t36947\n机暴\t36948\n山外山\t36949\n8点\t36950\n秦皇半岛\t36951\n报名表\t36952\n数字谱\t36953\nmpdf\t36954\n华为G7\t36955\ndac0832\t36956\nSEBS\t36957\nJWS\t36958\n21届\t36959\n佳思敏\t36960\n乐极生悲\t36961\nTurtlebot\t36962\n速博\t36963\nbee\t36964\n稍息\t36965\n撞上\t36966\n人代\t36967\n天境\t36968\n上印\t36969\n两日游\t36970\ngpd\t36971\n丰添\t36972\n练江\t36973\n弱视\t36974\nICOM\t36975\n北京大学在职研究生\t36976\n巨头\t36977\n頔\t36978\n16年后\t36979\n显色指数\t36980\nSTOMP\t36981\n男生版\t36982\nInsurance\t36983\nAthletes\t36984\n耳勺\t36985\n有所居\t36986\n福森\t36987\n贪欢\t36988\n评头论足\t36989\n站段\t36990\nGerber\t36991\n王中林\t36992\n福建卫视\t36993\n鹿先森乐队\t36994\n青海卫视\t36995\nsheltered\t36996\n二首\t36997\n轻弩\t36998\n_菜谱网\t36999\nTop10\t37000\n富国中证\t37001\n查账征收\t37002\n无极天宗\t37003\n9.7分\t37004\nラムジ\t37005\n4502\t37006\n毒\t37007\n李晓春\t37008\n34英寸\t37009\n顶空进样器\t37010\n企业培训网\t37011\n名媛望族\t37012\n发抖\t37013\nMoodle\t37014\n虎丘\t37015\n3ds汉化吧\t37016\n双煞\t37017\n多规\t37018\n书吧\t37019\nshiti\t37020\n法国红酒\t37021\nemm\t37022\n参茸\t37023\n堂堂\t37024\n豆干\t37025\nemu\t37026\n小娃娃\t37027\n死档\t37028\npleased\t37029\n逆转裁判4\t37030\ncommitment\t37031\n购房合同\t37032\n女武神3\t37033\n双绳\t37034\n第一堆\t37035\n2.11.8\t37036\n核桃\t37037\n杂俎\t37038\nL10\t37039\nmosh\t37040\n会计电算化\t37041\n7.22\t37042\n上海快递公司\t37043\nLyingman\t37044\n神经系统\t37045\n三钢\t37046\n第一次段\t37047\n18.2\t37048\n耀州窑\t37049\n办妥\t37050\n商房\t37051\n听闻\t37052\n红谷滩\t37053\n300多\t37054\n云华\t37055\n分质\t37056\n新京报网\t37057\nengel\t37058\n婚礼司仪\t37059\n了凡四训\t37060\nsolar\t37061\n中裕\t37062\n上海市食品药品监督管理局\t37063\nfpd\t37064\nASUS\t37065\nTPPA\t37066\nei检索\t37067\nrui\t37068\n规划图\t37069\n新萧十一郎\t37070\n牡丹花苗\t37071\n600664\t3
7072\n三亚天域度假酒店\t37073\n晏华\t37074\n鉴证师\t37075\n2010年9月\t37076\nWebFlux\t37077\n教育学\t37078\n晴空塔\t37079\n亲子宝典库\t37080\nmusic\t37081\n土耳其共和国\t37082\n牛角沱\t37083\nturtle库\t37084\n李政\t37085\n姬银龙\t37086\n过年审\t37087\n无火\t37088\n头机\t37089\n超标准\t37090\n方坯\t37091\n白蜡\t37092\n黄雾\t37093\n龙胆紫\t37094\n滕州政府网\t37095\n铝矿\t37096\nrepartition\t37097\n内鬼\t37098\n名剑\t37099\n刘伯坚\t37100\n松溪县\t37101\n原乡\t37102\nsacuk\t37103\nkickstarter\t37104\n普运\t37105\nclould\t37106\nlaunching\t37107\nMac篇\t37108\n不留白\t37109\n青苹果乐园\t37110\n液相\t37111\n有罪\t37112\n三角翼\t37113\nf516\t37114\n廖先生\t37115\nmat_wu\t37116\nSticker\t37117\n陈家洛\t37118\nTextBox\t37119\n建言\t37120\n广深科技\t37121\n周雪\t37122\nQueryList\t37123\nWINPE\t37124\n美托洛尔\t37125\ndotx\t37126\n尊享\t37127\nP波\t37128\n沃野\t37129\n群晖\t37130\n7.3.5\t37131\n智能电网\t37132\n滑音\t37133\nel-form\t37134\n斑纹\t37135\n高铁站\t37136\n恶童\t37137\n反攻\t37138\n百度手机输入法\t37139\n企业文明\t37140\n前前前世\t37141\n梯度提升树\t37142\nblades\t37143\n天正吧_\t37144\n中国发展门户网\t37145\nmad吧_\t37146\n魅族pro6plus\t37147\nkarlie\t37148\n涟水县\t37149\n灯杯\t37150\n姉妹\t37151\nTalend\t37152\n子午线轮胎\t37153\n心窝\t37154\n上原瑞惠\t37155\n吸水树脂\t37156\n赵涛\t37157\n小赖\t37158\n阳光城集团\t37159\n姚杰\t37160\n你的一生\t37161\n提前批次\t37162\n铭功路\t37163\n开下\t37164\n策略案\t37165\nPuth\t37166\n吴伟\t37167\n纳米二氧化硅\t37168\n兄控\t37169\n紧扣\t37170\nboarding\t37171\n阴森森\t37172\n北海\t37173\n眼结石\t37174\n记忆科技\t37175\n柏林广场\t37176\n中信易卡\t37177\n晚上9点\t37178\nblend\t37179\n拓画\t37180\n新视\t37181\n2018-03-22\t37182\n廊桥\t37183\n亦步亦趋\t37184\ndeamon\t37185\n製造商\t37186\n人类学\t37187\n打草机\t37188\n0.83\t37189\n望城政府网\t37190\n行空\t37191\nnoscript\t37192\n胰岛素笔\t37193\n双料喉风散\t37194\n厌氧罐\t37195\n逆袭之路\t37196\nHXR-MC2500\t37197\nSUPA\t37198\n赵默笙\t37199\nprotease\t37200\n缺漏\t37201\n订场\t37202\n小可\t37203\n丰富度\t37204\n美漫百科\t37205\n宁乡市\t37206\n楚雄网\t37207\n素值\t37208\n精信\t37209\n氟米龙滴眼液\t37210\n美德\t37211\nSOBT\t37212\n#热血街舞团#\t37213\n扛鼎\t37214\n莎普爱思\t37215\n078\t37216\n地区性\t37217\n燃情克利夫兰\t37218\n络合\t37219\n莫西子诗\t37220\n演讲会\t37221\n兜兰\t37222\nIII型\t37223\naudit\t37224\nPlanet\t37225\nshmat\t372
26\n8.41\t37227\n醛\t37228\n木偶剧\t37229\n王梁\t37230\n黄如\t37231\n表演型\t37232\n淮水安澜网\t37233\nPlum\t37234\nHoch\t37235\n无话不谈\t37236\n徐仁英\t37237\n3dmax2013\t37238\n湿毒\t37239\n白瓜子\t37240\n难民营\t37241\n竞争比\t37242\n呵\t37243\n美女老师\t37244\n酥胸\t37245\n合艾\t37246\nFiestar\t37247\n再死\t37248\nped\t37249\n宏和\t37250\nASIA\t37251\n田黄\t37252\n创业者们\t37253\n流动科技馆\t37254\n血管壁\t37255\n门匾\t37256\nVSAN\t37257\n男宝\t37258\n叙事文\t37259\n第三十六\t37260\n电警棍\t37261\n吊孝\t37262\n横切面\t37263\ntoto\t37264\n坐骑\t37265\n麦科勒姆\t37266\n两职\t37267\n葵花油\t37268\n夺宝奇兵5\t37269\n道县\t37270\n草龟\t37271\n大贰\t37272\n陶崇园\t37273\n雨纷\t37274\n)控股有限公司\t37275\n缴纳\t37276\nelva\t37277\nAN94\t37278\n色素斑\t37279\n云台花园\t37280\nmultinational\t37281\n10年代\t37282\nAI模板-千图网\t37283\n前24小时\t37284\n周周盈\t37285\n笔译\t37286\n龙华汽车站\t37287\n小院\t37288\n蝠翼\t37289\n2016.07\t37290\n强击机\t37291\n圆通金刚\t37292\nBeatbox\t37293\n瓢虫雷迪\t37294\nvtc\t37295\n开曼\t37296\n做贼\t37297\nword分页符\t37298\n招呼\t37299\n游吟\t37300\n骁龙625处理器\t37301\ntimestep\t37302\n节育\t37303\n附片\t37304\n18048期\t37305\n钓杆\t37306\n2000ac\t37307\n12.56米\t37308\n0.39\t37309\nsphero\t37310\nmonth\t37311\n金刚\t37312\nachievements\t37313\n泥阀\t37314\n北京天气网\t37315\n锦山镇\t37316\ndefines\t37317\nMus\t37318\n集成吊顶\t37319\n软组织密度影\t37320\n名树\t37321\n驳领\t37322\n140分钟\t37323\nswf播放器\t37324\n清远市人民医院\t37325\n舞阳河\t37326\nlq-635k\t37327\n中大医院\t37328\n中央车改办\t37329\n专利代理人考试-思博网\t37330\nCES2016\t37331\n橙汁儿网\t37332\nExchange技术论坛\t37333\nhavaianas\t37334\n潘军\t37335\n母舰\t37336\n家教学\t37337\n对抗\t37338\n轻狂\t37339\n大家\t37340\nK40\t37341\n联想R720\t37342\n1.5万\t37343\nhzs\t37344\n司马炎\t37345\n安全生产标准化\t37346\n已购\t37347\n连云港东\t37348\n宫\t37349\n安徽公务员考试网\t37350\n东海岸\t37351\n王新军\t37352\n抗酸\t37353\nbricks\t37354\n吴之荣\t37355\n经机\t37356\n8815\t37357\nBST\t37358\n金港镇\t37359\n万豪\t37360\n碳化钙\t37361\n百年\t37362\n绿宝树\t37363\ndateadd\t37364\n少年锦时\t37365\n利辛县\t37366\n鸭溪\t37367\nhtmlunit\t37368\n长筒\t37369\nMHz\t37370\n动态库\t37371\nlucene\t37372\n伊犁河\t37373\n深知\t37374\n感遇\t37375\n博卡拉\t37376\n在职研究生考试\t37377\nNAC\t37378\n代画\t37379\n数表\t37380\nWEBSHELL\t3
7381\n会\t37382\n应子\t37383\n雪落香杉树\t37384\n队\t37385\n石家庄物流公司\t37386\n级本\t37387\nfilms\t37388\nncr\t37389\n可道\t37390\nC++Primer\t37391\n裴元庆\t37392\n影子银行\t37393\n麦奇\t37394\n广东酒店管理职业技术学院\t37395\n静听\t37396\n在职研究生入学考试\t37397\n迭戈\t37398\n2万分\t37399\n高电\t37400\n流血\t37401\n痛风性关节炎\t37402\n小步\t37403\n孽情\t37404\n浙江传媒学院\t37405\n强东玥\t37406\n华约\t37407\n插入式\t37408\n十一学校\t37409\nteb\t37410\n是爱\t37411\n武侠乂\t37412\n吴圩镇\t37413\nXPS13\t37414\neaseus\t37415\n告白\t37416\ndcos\t37417\n高尔雷克萨斯is\t37418\n香饽饽\t37419\n乱轰三国志\t37420\nSIGHT\t37421\n希望工程\t37422\n言生\t37423\n账制\t37424\n起亚秀尔\t37425\n宝钛\t37426\n锱铢必较\t37427\nAPEX\t37428\n汉化cia\t37429\n6.2.5\t37430\n丽子\t37431\n恐怖故事\t37432\nIOPscience\t37433\n掉期\t37434\n夏亚\t37435\naopa\t37436\n中天城投\t37437\n长沙高新区\t37438\n胡静\t37439\n别生气\t37440\n国立中央大学\t37441\n舒心\t37442\n滥伐林木罪\t37443\nMalone\t37444\n竹扇\t37445\n菲灵\t37446\n日本岛\t37447\n上海火车南站\t37448\n白羊男\t37449\n国社\t37450\n捷达论坛_汽车之家论坛\t37451\n阿香\t37452\n超过15天\t37453\n2.36\t37454\n剑山\t37455\n冬藏\t37456\ncontacting\t37457\nukulele谱\t37458\n一次性补偿金\t37459\nstrides\t37460\n大连外国语大学\t37461\n光电学院\t37462\n影音先锋电影网\t37463\nGame\t37464\n天凰\t37465\n缺陷率\t37466\n岳山\t37467\n通天干探\t37468\n凤山县\t37469\n2012年12月15日\t37470\n三万里\t37471\n湖南第一师范学院\t37472\n820万\t37473\n况\t37474\n免子\t37475\n执御\t37476\n江苏省人民医院\t37477\n丰华园\t37478\nwrappers\t37479\n留薪期\t37480\n工薪贷\t37481\n石灰岩\t37482\n1492年\t37483\nff91\t37484\n黄村西大街\t37485\n杨美\t37486\n鑫润\t37487\nlxe\t37488\n亚洲航空公司\t37489\n数控激光切割机\t37490\n无限纷争_九游论坛\t37491\n张梓杜月笙\t37492\n骨质瓷\t37493\n牙线棒\t37494\n东风景逸x5\t37495\nLSPL\t37496\n冰爪\t37497\n少康\t37498\no型\t37499\n念慈庵\t37500\n垦丁\t37501\nkmc\t37502\n都市奇门医圣\t37503\n原贴\t37504\nlinqiaozhou\t37505\n宝典\t37506\n印能法师\t37507\n罗氏\t37508\n防晒帽\t37509\n水鸭\t37510\n铁岭西\t37511\n孤胆枪手3\t37512\nCorey\t37513\n多情人\t37514\n快处快赔\t37515\n据为己有\t37516\n软件谷\t37517\n银杏\t37518\n凸变英雄\t37519\n摇牌\t37520\n代为\t37521\n萨卡\t37522\nPension\t37523\n延保\t37524\n通河\t37525\n生态文明观\t37526\n蒸汽锅炉\t37527\nMomoka\t37528\n辅佐\t37529\nRetrofit2\t37530\n吴琦\t37531\n英雄连勇气传说\t37532\nausu\t37533\nios7吧_\t37534\
n可逆式\t37535\n奔驰M级\t37536\n附房\t37537\nneccs\t37538\n结局篇\t37539\n中医药类\t37540\n泽村玲子\t37541\nmudan\t37542\n方数\t37543\nDrawboard\t37544\n阿克\t37545\nKOD\t37546\n大童鞋\t37547\n梦幻西游\t37548\n乐享派\t37549\n百分制\t37550\n百绘\t37551\n雷东多\t37552\n横纹肌溶解症\t37553\ncoc\t37554\n石惠\t37555\n玛伦\t37556\n得闲炒饭\t37557\nweishenm\t37558\n规划路\t37559\n聚脲\t37560\nKNT\t37561\n吃鸡吧\t37562\nSERVO\t37563\nEU\t37564\n柏文\t37565\n格斗\t37566\n中盐安徽红四方股份有限公司\t37567\n基料\t37568\n朗读版\t37569\nbreasts\t37570\n合算\t37571\n潘多拉多\t37572\n20米\t37573\n悔悟\t37574\n玛格特·罗比\t37575\n大难\t37576\n一房一厅\t37577\nTHERMO\t37578\n互认\t37579\nFAKE\t37580\ngiveaway\t37581\n稳准\t37582\nPen\t37583\n包干费\t37584\nor\t37585\n正心\t37586\n意绵绵\t37587\n徐海荣\t37588\n力哥\t37589\nIBT\t37590\n結局\t37591\n净身\t37592\n中国船舶重工股份有限公司\t37593\n一露\t37594\n豆腐脑\t37595\n重庆招考信息网\t37596\n瑞克和莫蒂\t37597\n神台\t37598\nSWS\t37599\n7100U\t37600\n济宁市教育局\t37601\n小米体脂秤\t37602\nswif\t37603\n晕染\t37604\n艺星\t37605\n30层\t37606\n扭力杆\t37607\n台湾机场\t37608\n子虚乌有\t37609\n红虫\t37610\nsituational\t37611\n会计师事务所\t37612\n勒芒\t37613\n迷宫式\t37614\n铝材\t37615\n控制器\t37616\n八卦网\t37617\n西里尔\t37618\n桌旗\t37619\n新锋范\t37620\n火女\t37621\n10.09\t37622\n观赏鸽\t37623\nelement14\t37624\n802路\t37625\nSinger\t37626\n左岩\t37627\nfeelunique\t37628\nLinux服务器\t37629\nphabricator\t37630\n弱颜\t37631\n5.8g\t37632\n刀神\t37633\n菜\t37634\n四率\t37635\nCOMP\t37636\n做进\t37637\nether\t37638\n外公司\t37639\n四所\t37640\n冯盈盈\t37641\n第三格\t37642\n目瞪口呆\t37643\n25针\t37644\n望海\t37645\n枯荣\t37646\n生活大爆炸第十季\t37647\n韩盛\t37648\n公需课\t37649\n心情不好\t37650\n火炎\t37651\ntinyalsa\t37652\n专技天下网\t37653\n天童寺\t37654\n投入品\t37655\n合不来\t37656\n欧莱雅\t37657\n八月十五\t37658\n濮阳市政府\t37659\n詹妮弗劳伦斯\t37660\nf189\t37661\nua\t37662\n唐飞\t37663\n杜拉拉升职记\t37664\n克里斯托夫\t37665\n大众凌渡论坛论坛\t37666\n再用\t37667\n吴旭\t37668\n6s\t37669\n胡运权\t37670\n江雪\t37671\ntext-overflow\t37672\n广岛三箭\t37673\n200N\t37674\n火虫\t37675\n嫁入豪门\t37676\n6.9.1\t37677\n三丽鸥\t37678\n45层\t37679\n里斯本竞技\t37680\n病毒性脑膜炎\t37681\n连供\t37682\n王建军\t37683\n烂片\t37684\n三股辫\t37685\n姜氏\t37686\n烽火电子\t37687\nManufacturer\t37688\n微分流形\t37689\n只认\t
37690\ndramatic\t37691\n名称库\t37692\npyparsing\t37693\n銀座\t37694\nKT3\t37695\n贵格\t37696\n7.3冰法\t37697\n汉藏\t37698\n李晓明\t37699\n西瓜影音/BD\t37700\n室门\t37701\n迸发\t37702\nmakefile\t37703\n可致\t37704\nrewards\t37705\n汪辜\t37706\nAWM\t37707\n金水苑\t37708\n青海大学附属医院\t37709\n信阳毛尖\t37710\n冷雨萱\t37711\nWales\t37712\nROLAND\t37713\n共阴极数码管\t37714\n宁夏回族自治区公安厅\t37715\nd类\t37716\n第2周\t37717\n天枰女\t37718\nProfiling\t37719\n好的便携式\t37720\n收口\t37721\n59P\t37722\n日冕\t37723\nnForce\t37724\n草蜢乐队\t37725\n景宁政府网\t37726\n总经办\t37727\n直播间\t37728\nmockup\t37729\n管理文\t37730\n全处\t37731\nInfoPath\t37732\n第91章\t37733\n微信营销平台\t37734\n预许\t37735\n不妻而遇\t37736\n我爱我家\t37737\n家兄\t37738\nNode.js\t37739\n一力\t37740\nSven\t37741\n花野\t37742\n饭食\t37743\n天津大学电子信息工程学院\t37744\nn7108\t37745\n老店\t37746\n金牌投资人\t37747\nη\t37748\n经念诵\t37749\n塔王\t37750\n温州网易\t37751\n水晶虾饺\t37752\n冰漪\t37753\nolo\t37754\nJAVAC\t37755\n罗曼蒂克\t37756\n实度\t37757\n中共中央党校函授学院\t37758\n推拉力\t37759\n双一流\t37760\n名首\t37761\n急等\t37762\n第四维\t37763\n积劳成疾\t37764\nAberdeen\t37765\n独人\t37766\nnew一个\t37767\n291号\t37768\nCHILD\t37769\nY85\t37770\nino\t37771\n死叉\t37772\n高小琴\t37773\n森田疗法\t37774\nVert\t37775\nN9002\t37776\n去杠杆化\t37777\n百度云SVIP\t37778\nVE\t37779\n插画集\t37780\nmmcx\t37781\n几批\t37782\n齐力\t37783\n造伞\t37784\n财智\t37785\n洲际弹道导弹\t37786\n整齐划一\t37787\n美廉美\t37788\n干\t37789\n2016年度\t37790\n久久小说网\t37791\n常青路\t37792\n雨纪\t37793\n跟妆师\t37794\n作陪\t37795\n甜歌\t37796\n疾风剑豪\t37797\n伟成\t37798\n全开式\t37799\n万森\t37800\nS41\t37801\n伍家岭\t37802\n辫儿\t37803\n九人团\t37804\nFMA\t37805\nLevante\t37806\n黄村\t37807\n三更雨\t37808\n饭庄\t37809\nx7\t37810\n健康管理公司\t37811\n铿\t37812\n调剖\t37813\n求算\t37814\n计议\t37815\npccad\t37816\n腐蚀者\t37817\n杏吧\t37818\n戏吻\t37819\nFreeRADIUS\t37820\n胡建华\t37821\n2015年10月1日\t37822\n安康学院\t37823\n44.1\t37824\nintersection\t37825\n塞尺\t37826\n贪玩传世\t37827\n邓建华\t37828\n葵瓜子\t37829\n喝咖啡\t37830\n有愧\t37831\nqizi\t37832\ndirectfb\t37833\n表位\t37834\n维拉蒂\t37835\n砸碎\t37836\n颐指气使\t37837\n剖宫\t37838\n600633\t37839\n咖啡\t37840\n北京台\t37841\n壮壮\t37842\n刘琦\t37843\n急诊医学\t37844\n爱情公寓4\t37845\nmale\t
37846\n⒎\t37847\n华为MateRS\t37848\n关西腔\t37849\n计谋\t37850\n王明阳\t37851\n星球版\t37852\nenforcing\t37853\n从宽\t37854\n澳盘\t37855\n善良的小姨子\t37856\n瑞动\t37857\n何日重\t37858\n自见\t37859\n最小生成树\t37860\nYOUNG卡\t37861\n7、8月\t37862\n中央八项规定精神实施办法\t37863\n预载\t37864\n发动机怠速\t37865\n奥拉星圈\t37866\n牛根生\t37867\n北纬网\t37868\nbarclays\t37869\n福建中公\t37870\nLullaby\t37871\nc5game\t37872\n炼\t37873\n收获宝\t37874\n身份\t37875\n罗伯斯庇尔\t37876\n谪仙\t37877\n斯图卡\t37878\n超能量\t37879\n万科湖畔度假公园\t37880\n郑严\t37881\n叶伟\t37882\n花钟\t37883\n餐企\t37884\n窍门\t37885\n无结\t37886\n铁条\t37887\n保险管\t37888\nJLink\t37889\n博马\t37890\n理综卷\t37891\n68Design\t37892\n春秋旅游网-春秋旅行社\t37893\nMurphy\t37894\n内刊\t37895\n惠租车\t37896\n霍克斯\t37897\nEmoji\t37898\n黑枸杞之恋网\t37899\n中国研发中心\t37900\n颜卓灵\t37901\n威尔森\t37902\n徐焰\t37903\n青年\t37904\n2016年10月13日\t37905\n孟超\t37906\n吟诵\t37907\nactivitymq\t37908\ndebris\t37909\n中政\t37910\n古船\t37911\n说课稿\t37912\n制图\t37913\n水瓶\t37914\n邓光荣\t37915\n子目录\t37916\n城发集团\t37917\n南长安街壹号\t37918\n石法\t37919\nceb\t37920\n黑食\t37921\n面红耳赤\t37922\n戏梦\t37923\n土地使用税\t37924\n左程云\t37925\n淘宝美工招聘网\t37926\n渗坑\t37927\n拨盘\t37928\nAudi\t37929\n周公解梦大全网\t37930\n瓦克\t37931\n兰园\t37932\n杉德支付网络服务发展有限公司\t37933\n刘晏\t37934\npmg\t37935\n致远软件\t37936\n似箭\t37937\n玉玺\t37938\n刮擦\t37939\n腓特烈大帝\t37940\nBRCA\t37941\nエスカレ\t37942\nMouth\t37943\n小罐茶\t37944\n3102\t37945\n紫菜网\t37946\n老虎证券\t37947\n潮汐式\t37948\nRamon\t37949\n乳头瘤\t37950\n红豆豆\t37951\n造价师证\t37952\n屠\t37953\n花朵\t37954\nint转换\t37955\n大渝网\t37956\n张新宇\t37957\n2K14\t37958\n赴任\t37959\n北京北控\t37960\n希尔顿逸林\t37961\n嘉陵区\t37962\n15P\t37963\n1.9.1\t37964\n思维者\t37965\nPPT/\t37966\n诞生礼\t37967\n蚂蚁短租|mayi.com\t37968\n全球央行\t37969\n川沙新镇\t37970\n1000种\t37971\n一批\t37972\n墓室\t37973\n先声\t37974\n百度招聘\t37975\ng2\t37976\n下雪天\t37977\n剧场\t37978\n宁毅\t37979\n4864\t37980\n杨鹏飞\t37981\nVI\t37982\n陈龙\t37983\n抛妻\t37984\n做大\t37985\n八爷\t37986\n肉丝\t37987\nPythonTab\t37988\n千团网\t37989\n对口单招网\t37990\nxps13\t37991\nbeaute\t37992\n45G\t37993\n享学\t37994\n熊俊\t37995\n檀头山岛\t37996\n骑驴\t37997\n令尊\t37998\n丰收\t37999\n左旗\t38000\n周天子\t38001\n广西生态工程职业技术学院\t38002\n
黑帮\t38003\n冒雨\t38004\n中国网\t38005\n靖王\t38006\n外差\t38007\n鼓形齿式\t38008\n呋麻滴鼻液\t38009\nfe\t38010\n华为荣耀7X\t38011\n斗罗大陆绝世唐门\t38012\n梁滨\t38013\n多哈机场\t38014\n投資\t38015\n纪要\t38016\n上思\t38017\n思勤\t38018\n福田小学\t38019\n食代\t38020\n淅川县人民政府\t38021\n招生简章_高考网\t38022\n上海榕柏生物技术有限公司\t38023\n大同方特欢乐世界\t38024\n蝶梦\t38025\n陆玲\t38026\n21只\t38027\n炸春卷\t38028\n蕴藏\t38029\n斯塔福\t38030\n中国银联股份有限公司\t38031\n背人\t38032\nhuu\t38033\n农药味\t38034\nT500\t38035\n筑人\t38036\n富力公主湾\t38037\n海岸线文学网\t38038\n国药集团化学试剂有限公司\t38039\n长城板\t38040\n濮阳华龙\t38041\n二〇一六年\t38042\n骷\t38043\n张馨月\t38044\nICMP协议\t38045\n充绒机\t38046\nz10\t38047\n民电\t38048\n怕黑\t38049\n朝日奈\t38050\n嘉陵江路\t38051\n车辙剂\t38052\ncockney\t38053\n狗人\t38054\n郝氏\t38055\n宏定义\t38056\n龙湖源\t38057\n红伞\t38058\n莎啦啦快乐舞步健身操\t38059\n花非花雾非雾\t38060\nis\t38061\n取名网\t38062\n0226\t38063\n伺服阀\t38064\n瓦窑村\t38065\n玛索\t38066\n醉心\t38067\n西工大\t38068\n凡拓\t38069\n赠人\t38070\n室内装潢\t38071\n陕西省知识产权局\t38072\n北京德云社\t38073\n南开大学\t38074\n未老先衰\t38075\n顺安\t38076\n吊卡\t38077\narmy\t38078\n东北财经大学\t38079\n嫩模\t38080\n喷胶\t38081\n检查单\t38082\n太湖科技产业园\t38083\nMSTP\t38084\n眼前人\t38085\nLifeline\t38086\n山樱\t38087\n二里庄\t38088\n优力胶\t38089\n一航\t38090\n毁容\t38091\n肯\t38092\nUNP\t38093\n第94集\t38094\n九山\t38095\nshoes\t38096\n一主\t38097\n埃迪\t38098\n二工大\t38099\n划花\t38100\n拨测\t38101\n校墓处\t38102\n深圳市赛亿科技开发有限公司\t38103\n江南新区\t38104\ngateio\t38105\n泰兴路\t38106\nonenote\t38107\n慕容雪村\t38108\n7105\t38109\n堂弟\t38110\n采石矶\t38111\nPS学习网\t38112\n阿俏\t38113\nGAD腾讯\t38114\n共叙\t38115\n深化改革\t38116\n30x30\t38117\n4019\t38118\n广西\t38119\nOverture\t38120\n明好\t38121\n聚星娱乐\t38122\n井冈山机场\t38123\n王新\t38124\nNy\t38125\n平稳性\t38126\nsponge\t38127\n卷影\t38128\n周新民\t38129\natr\t38130\n西贝\t38131\n明日起\t38132\n汇中财富\t38133\n毕业歌\t38134\n智房网\t38135\n水果特产\t38136\n性味\t38137\n310000\t38138\n天知道\t38139\n宋钟基\t38140\nmsp430g2553\t38141\n赛后\t38142\n译言古登堡\t38143\n丝扣\t38144\n寸草不生\t38145\n媚薬\t38146\nCSU\t38147\n沃尔沃xc40\t38148\nehr\t38149\n什_\t38150\nforest\t38151\nzbar\t38152\n20150131\t38153\nBLS\t38154\n茶峒\t38155\nae2\t38156\n春野\t38157\nsansui\t38158\n山西省人民医院\
t38159\n电视史\t38160\n温泉度假村\t38161\n幸福的滋味\t38162\nwindow10\t38163\n莱芜论坛\t38164\n互联网金融平台\t38165\n10240\t38166\nObserver\t38167\n永爱\t38168\n着色器\t38169\n世界之巅\t38170\n山涛\t38171\n第67届\t38172\n幻兽\t38173\n牛头刨床\t38174\n金泰克\t38175\nlethal\t38176\nTue\t38177\n养育\t38178\n大众证券网\t38179\n永久\t38180\n倏\t38181\n比利·海灵顿\t38182\n单词本\t38183\n銀\t38184\n语义分割\t38185\n亳州\t38186\n遮风挡\t38187\n2PM\t38188\n好人\t38189\n七海奈奈\t38190\n少女时代\t38191\n横琴湾\t38192\n真希波\t38193\n安通控股\t38194\nUNStudio\t38195\ncuriosity\t38196\n打断线\t38197\n公园道\t38198\ncharlie\t38199\n牛津词典\t38200\n北京武警总医院\t38201\nnhl\t38202\n601336\t38203\n子默\t38204\nconfluence\t38205\n桦川\t38206\n记错\t38207\n千分之一\t38208\n马其顿\t38209\n音乐系\t38210\n柳宗元\t38211\n扫黄打非网\t38212\n杨某\t38213\n厦门大学海洋与海岸带发展研究院\t38214\n套底\t38215\n中国银行个人网上银行\t38216\nnvdia\t38217\n张洪亮\t38218\n53P\t38219\n塞满\t38220\nGoodyear\t38221\nTHBWiki\t38222\n梯板\t38223\n中国邮政储蓄银\t38224\n司法所\t38225\n文件版\t38226\n张宝全\t38227\n攵\t38228\n8成\t38229\n平纹\t38230\n兰花草\t38231\n张成龙\t38232\n斗龙\t38233\n会稽山\t38234\n迷思\t38235\n外贸SO\t38236\n乌孙古道\t38237\n粉圈\t38238\n百邦\t38239\n分享\t38240\nloper\t38241\n130mm\t38242\n消息系统\t38243\nplague\t38244\n华铭智能\t38245\n变形金刚真人版\t38246\n复旦大学经济学院\t38247\n收班\t38248\n东宝路\t38249\n着陆\t38250\n去边\t38251\n南山小学\t38252\n胎囊\t38253\n9610\t38254\nhek\t38255\nKnowledge\t38256\n断脚\t38257\n齐论\t38258\n巍然\t38259\n会意\t38260\n异株\t38261\n地师\t38262\n520高清网\t38263\n张南\t38264\n方中\t38265\nImapla\t38266\nsubscriber\t38267\n上海钰博生物科技有限公司\t38268\n727\t38269\n射干\t38270\n樊锦诗\t38271\n中山古镇镇\t38272\n巴拉拉\t38273\n双屿\t38274\n滤棉\t38275\n科视\t38276\n道馆\t38277\n202路\t38278\n天津市气象局\t38279\ncde\t38280\n聚氨酯橡胶\t38281\n5.10.1\t38282\n任务栈\t38283\n信贷\t38284\ngd\t38285\n果蔬粉\t38286\n盐城市交通运输局\t38287\n会同县\t38288\n企业债券\t38289\n荷木\t38290\n摊薄\t38291\n小太妹\t38292\n荤场\t38293\n平衡阀\t38294\n皮皮虾\t38295\nhangfire\t38296\n吴光明\t38297\n九幽\t38298\nmongoexport\t38299\n币种\t38300\n纤美\t38301\n骨戒\t38302\n茶叶梗\t38303\n柏安妮\t38304\n宽城满族自治县\t38305\n均价\t38306\n狂暴西游\t38307\n柳冠\t38308\nTortoiseSVN\t38309\n正月初四\t38310\n原切牛排\t38311\n信阳师范学院\t38312\n搁置\t38313\
n水银体温计\t38314\n文博会\t38315\n高铁东站\t38316\n劳心\t38317\n今天中午\t38318\n姚余栋\t38319\n王利明\t38320\n流行音乐网\t38321\nagencies\t38322\n腕子\t38323\n外外\t38324\n果场\t38325\n工资税\t38326\n肥\t38327\nsp3\t38328\n京麦\t38329\narthur\t38330\n2MOD\t38331\n代练\t38332\n中非关系\t38333\nHz\t38334\ngird\t38335\n21IC中国电子网\t38336\n过高\t38337\n纳木错\t38338\nDealing\t38339\n格子箱\t38340\n中华考试网\t38341\n金润吉\t38342\n海伦小镇\t38343\n提前还贷计算器\t38344\nHY000\t38345\n开场词\t38346\n上海\t38347\n辊轴\t38348\nbox2d\t38349\n登记本\t38350\n自新\t38351\n超前\t38352\n这一年\t38353\n在现场\t38354\n清扫器\t38355\nCryst\t38356\npassthrough\t38357\n如上图\t38358\nmultiselect\t38359\n西瑞\t38360\n中共重庆市委党校\t38361\n北京男排\t38362\n从入门到精通\t38363\n4支\t38364\n金威啤酒\t38365\nvlp\t38366\n拱宸桥\t38367\n喷水池\t38368\n环湖赛\t38369\n地板式\t38370\n苏熙\t38371\n砂磨机\t38372\nleonard\t38373\n球墨\t38374\n黄花蒿\t38375\n松狮犬\t38376\n闽粤\t38377\n薛老师\t38378\n何日\t38379\n百步镇\t38380\n深深深\t38381\nPLC编程\t38382\n日照分析\t38383\n大县\t38384\n幻影旅团\t38385\n红嫂\t38386\n宝堂\t38387\n习李\t38388\nGStreamer\t38389\n1.17G\t38390\nDLI\t38391\nToDo\t38392\n3210M\t38393\n沙滩椅\t38394\n碧水\t38395\n篮包\t38396\n生科\t38397\niav\t38398\ntki\t38399\n中华考试书店\t38400\n樟木镇\t38401\nwkhtmltopdf\t38402\n任之堂\t38403\n零花\t38404\nlimma\t38405\nAORUS\t38406\n迪迦·奥特曼\t38407\nOntario\t38408\n祥龙\t38409\n江苏村\t38410\n旭山动物园\t38411\n苏州律师协会\t38412\n生意宝\t38413\n长春高新\t38414\n云_\t38415\nBYTE\t38416\n侠盗猎魔\t38417\n智图\t38418\n病院\t38419\n达飞\t38420\n帆布\t38421\nCintiq\t38422\n8820\t38423\n倒淌河\t38424\n88mm\t38425\n马应龙\t38426\nwww.why123.org\t38427\n不完成\t38428\n坑机\t38429\npe启动盘\t38430\n武昌区\t38431\n封闭液\t38432\n灌篮高手重制版\t38433\nVideo-梨\t38434\n中小企业管理\t38435\n卫小宝\t38436\n太幸运\t38437\n资优\t38438\n扣点\t38439\n刺激器\t38440\nk11\t38441\n地面砖\t38442\n生死时速\t38443\n张凯毅\t38444\n电动自行车\t38445\nLouboutin\t38446\nagm\t38447\n尧十三\t38448\n两三岁\t38449\nxplay6\t38450\n润滑油脂\t38451\nDevTools\t38452\n挖掘金游戏网\t38453\n冈上肌\t38454\nFrida\t38455\n言情小说库\t38456\n鸣禽\t38457\n人民币跨境支付\t38458\nhokey\t38459\n两卷\t38460\nDodge\t38461\n黄湾镇\t38462\n武钢股份\t38463\nWalkman\t38464\n芯材\t38465\n铝带\t38466\n蜜袋\t38467\n唐吉诃德\t38468
\n恒大足球学校\t38469\n精字\t38470\n丧乱\t38471\n柏林电影节\t38472\n麻油\t38473\nyfceshi\t38474\n糖果罐\t38475\nZhaiiKer\t38476\n狂枪\t38477\n梦幻西游5\t38478\nbiscuits\t38479\n外角\t38480\nIndesig\t38481\n彭羚\t38482\n苻坚\t38483\n詹雯婷\t38484\n南京禄口国际机场\t38485\n紫外线\t38486\n北外青少\t38487\n艾欧里亚\t38488\nhazardous\t38489\n16平\t38490\n汽包\t38491\nNikeLab\t38492\n有一失\t38493\n张大成\t38494\n能耗费\t38495\n奔驰C200\t38496\n逸龙剑抉择\t38497\n外塘\t38498\n龙豆\t38499\nPHP类\t38500\n田畠裕基\t38501\n自由篮球\t38502\n假声\t38503\n咩\t38504\n推挽\t38505\ntelerik\t38506\n湖南省妇幼保健院\t38507\n夹包\t38508\nchaoschild\t38509\n项目物\t38510\n中国人民武装警察部队\t38511\nwindows2003server\t38512\njichang\t38513\n重庆市规划局\t38514\n3速\t38515\nSpaceClaim\t38516\n火柴人战争\t38517\n48口\t38518\nメニュ\t38519\n浪琴表\t38520\nPandoraBox\t38521\n5.99万元\t38522\n泥浆池\t38523\n零壹\t38524\n卷尾\t38525\n帕杰罗\t38526\n阿五\t38527\n广州市统计局\t38528\n钩鞋\t38529\n自我诊断\t38530\n藤黄健骨丸\t38531\n四星级酒店\t38532\nprezi\t38533\n销价\t38534\n省国税局\t38535\n连本带利\t38536\n2412\t38537\ncaoliushequ2018\t38538\n底档\t38539\n铝镁合金\t38540\n三峡党建网\t38541\n堆雪人\t38542\n华润\t38543\n动脉导管\t38544\n差分进化算法\t38545\n美团外卖\t38546\nident\t38547\n事章\t38548\n方格网\t38549\npc6\t38550\n夜幕\t38551\n十一期\t38552\n四年级英语\t38553\nHTML转义符\t38554\n拖带\t38555\n中山大学附属第五医院\t38556\n山茶花\t38557\nVSCODE\t38558\n1.4d\t38559\n17\t38560\n二十四桥明月夜\t38561\nNewspapers\t38562\nlogitech\t38563\n墨鱼\t38564\n网络打印机\t38565\n竞驶\t38566\nprogrammatically\t38567\nCollapsing\t38568\n泰诺林\t38569\n李云飞\t38570\n盘路\t38571\n习总书记讲话精神\t38572\n及样\t38573\nrecognised\t38574\n尼罗\t38575\nEthereum\t38576\n渡假村\t38577\n莫亚\t38578\n┐\t38579\n通婚\t38580\n非公司\t38581\n背向\t38582\n血战钢锯岭\t38583\nForums\t38584\n前往\t38585\n艾逸网\t38586\n爱侣\t38587\n工衣\t38588\n卢仝\t38589\nSenses\t38590\n索菱股份\t38591\n光纤放大器\t38592\n海亮学校\t38593\n太平洋学习网\t38594\n尽善\t38595\n杭州·新闻区\t38596\n罗南道\t38597\n玉山县\t38598\n舒利迭\t38599\nStrava\t38600\n新石中路\t38601\nxiaofenguo\t38602\nNX3\t38603\n广南\t38604\nb2b网站\t38605\n盐津\t38606\n代价\t38607\n陈鑫\t38608\n老毛病\t38609\n轰动\t38610\n聚氨酯夹芯板\t38611\n阿里通信\t38612\n赛教\t38613\n龙体\t38614\n张小红\t38615\n念兹在兹\t38616\nPasses\t38617\n
黑处\t38618\n波希米亚\t38619\n小哥哥\t38620\n丰胸\t38621\nashley\t38622\n注重\t38623\n20150715\t38624\n幸运币\t38625\n就结束了\t38626\n夏威夷果\t38627\nmouseout\t38628\n门镜\t38629\n存款利率利息表\t38630\n︳\t38631\n黄膏\t38632\nCad\t38633\n75英寸\t38634\n天山海世界\t38635\n校鸡\t38636\n公产\t38637\n流星花园\t38638\n中山教育信息港\t38639\n研究会\t38640\n江面\t38641\nHEY\t38642\n7.11\t38643\n横刀\t38644\n品香\t38645\n九日\t38646\n扬州鉴真\t38647\n李国忠\t38648\nmultiparty\t38649\n镜面\t38650\n聚丰\t38651\n3dmark\t38652\n商侣\t38653\n韩俊强\t38654\n武汉市中心医院\t38655\n领会\t38656\n精英教育\t38657\n邱勇\t38658\n泽井芽衣\t38659\n我的青春期\t38660\n死亡笔记吧_\t38661\n征信体系\t38662\n麦富迪\t38663\n解放军空军\t38664\n出勤\t38665\n权杖\t38666\n四五点\t38667\n博瑞ge\t38668\n仙尊\t38669\n外轮廓\t38670\nMaterialise\t38671\n招商银行一网通\t38672\n单田芳评书网\t38673\nRMK\t38674\n明了\t38675\njQuery幻灯片\t38676\ncoachella\t38677\n江西美术出版社\t38678\nOrders\t38679\n尿杯\t38680\nCAppChem\t38681\n大盐湖\t38682\nAPEC峰会\t38683\nios服\t38684\nSomeday\t38685\n叟\t38686\n听妈妈讲那过去的事情\t38687\n差商\t38688\nFrontier\t38689\n笔字\t38690\nWebdriver\t38691\n筹备工作会议\t38692\n接页\t38693\nalt+f4\t38694\n异形大战铁血战士2\t38695\n江西农业大学南昌商学院\t38696\n徐彪\t38697\nseque\t38698\n特戒\t38699\nrbc\t38700\nSL400\t38701\n3.1\t38702\nreflections\t38703\n魏涛\t38704\n模胚\t38705\n点播\t38706\n税改\t38707\n腹膜透析\t38708\n油猴\t38709\n酱酒\t38710\n龙筋\t38711\n兽决\t38712\n微商们\t38713\n保税库\t38714\n三亿日元\t38715\n排料\t38716\n矢崎\t38717\n孔雀公主\t38718\n发明\t38719\n绿城小镇\t38720\n大余县人民政府\t38721\n彩星\t38722\nAnko\t38723\nTeachers\t38724\n水厂\t38725\n泄流\t38726\n%2c\t38727\n影拓pro\t38728\n贺一航\t38729\n拦阻\t38730\n高送转股\t38731\n365音乐网\t38732\n韩令\t38733\n妖灵\t38734\nEnduro\t38735\n水囊\t38736\n32家\t38737\n时寒高峰\t38738\nvpx\t38739\n红狗\t38740\n基辛\t38741\n8327\t38742\n星桥街道\t38743\n中标率\t38744\n拟声\t38745\n双馈风力\t38746\n捷豹f-pace\t38747\n柠珞LOVE实验室\t38748\n臀舞\t38749\n刘亚男\t38750\nuat\t38751\n海得拉巴\t38752\nGitolite\t38753\n浜崎あゆみ\t38754\nHOF\t38755\nincreased\t38756\nmk3\t38757\n安魂曲\t38758\n慈溪市教育局\t38759\nskinny\t38760\n采掘\t38761\nHold\t38762\n吕奉先\t38763\n乙烷制乙烯\t38764\n广东国际大厦\t38765\n电脑花屏\t38766\n怡馨家园\t38767\n清恋\t38768\n说到做到\t38769\n海棠花节\t38770
\n利津\t38771\n江苏省南通第一中学\t38772\n科索沃\t38773\n佛学\t38774\nPerk\t38775\ninterlock\t38776\n2202\t38777\n十斤\t38778\n惠浦\t38779\n县中医院\t38780\n去吧\t38781\n顺义马坡\t38782\n108度\t38783\nGooSeeker\t38784\nHeat\t38785\n只身\t38786\n王传一\t38787\n_闵行政府网\t38788\n恩来\t38789\n薛家湾镇\t38790\n鼠粮\t38791\n年少无知\t38792\n广德县\t38793\n节赛\t38794\n马厩\t38795\n壹\t38796\n磁致\t38797\n侯氏\t38798\n河南经济报\t38799\n军号\t38800\nzhifu\t38801\nSFTP\t38802\n国家农业综合开发办公室\t38803\n孕激素\t38804\n4.50分\t38805\n伟业\t38806\n寻龙记\t38807\n钢盔\t38808\n痢特灵\t38809\n华康\t38810\n干煸四季豆\t38811\n汽封\t38812\n贺\t38813\n碧桂园天誉\t38814\n100分钟\t38815\n虺\t38816\n半联\t38817\n下功夫\t38818\nAfterEffects\t38819\n军情\t38820\n新收入\t38821\n【瑞\t38822\n弹塑性力学\t38823\n枪王之王\t38824\n市质量技术监督局\t38825\nWatch\t38826\nPeach\t38827\n冷弯机\t38828\n办房\t38829\n换休\t38830\n摩机\t38831\n烟台经济技术开发区\t38832\nPapi酱\t38833\nyg\t38834\n1000万\t38835\n南方医科大学中西医结合医院\t38836\n京城四少\t38837\n红鬃烈马\t38838\n20摄氏度\t38839\n用名\t38840\n中国船舶英才网\t38841\n同轴电缆\t38842\n洪羊洞\t38843\n奥迪q3\t38844\nHyperledger\t38845\n21厘米\t38846\n毛岸青\t38847\n宇龙数控\t38848\n魔鬼天使\t38849\n四五快读\t38850\n国内生产总值\t38851\n杨门\t38852\n消薄\t38853\n求死\t38854\n低气压\t38855\n23号\t38856\nKirk\t38857\n0715\t38858\n卖价\t38859\n明暗\t38860\nRAV4荣\t38861\n3088\t38862\n编号表\t38863\n2万辆\t38864\n螺旋式\t38865\nsam\t38866\n新浪娱\t38867\nPBC\t38868\n迷你世界_九游论坛\t38869\nechars\t38870\nwg\t38871\n冲绳奴隶岛\t38872\n汤溪镇\t38873\n本吧\t38874\n看帖\t38875\n十七条\t38876\n114啦网址导航\t38877\n收敛\t38878\nAux\t38879\nfontsize\t38880\n大连君悦酒店\t38881\n新版本\t38882\n腾讯WiFi管家\t38883\n陈思\t38884\n郑大\t38885\n老市区\t38886\n典章\t38887\n锦绣缘华丽冒险\t38888\n汉腾x7\t38889\n土豪学校\t38890\n移动OA\t38891\n各州市\t38892\n十四年\t38893\ng_\t38894\n联盟路\t38895\n宫城县\t38896\nwing\t38897\nTNS\t38898\n风车\t38899\n寡人\t38900\n周其仁\t38901\n无忧学习网\t38902\n劳动者\t38903\n上海外滩华尔道夫酒店\t38904\n_书法字典\t38905\ncorrespondent\t38906\n丁桂儿脐贴\t38907\n板城烧锅酒\t38908\nfeature\t38909\nART\t38910\nhandbrake\t38911\n广东法院网\t38912\ndp0\t38913\n投资促进局\t38914\n奥班\t38915\ntoastmaster\t38916\n2017年5月25日\t38917\n参比制剂\t38918\n奥马\t38919\n发行权\t38920\n代写网\t38921\nblp\t38922\nAmusement
\t38923\n上诉状\t38924\n王晓华\t38925\n刘星\t38926\n显著性\t38927\n瞬间秒\t38928\nhttp\t38929\n巡访\t38930\n洋槐蜂蜜\t38931\n裁片\t38932\n闽建建\t38933\n7.5\t38934\nArchery\t38935\n辽宁工业大学\t38936\n等腰三角形\t38937\nX240s\t38938\n専門\t38939\n把爱\t38940\n白玉鸟\t38941\n_途牛\t38942\n空气消毒器\t38943\n大陆居民往来台湾通行证\t38944\n藏语版\t38945\n临淄\t38946\n换衣服\t38947\ngrammarly\t38948\nhbqc\t38949\n24米\t38950\nwindows批\t38951\n金花葵\t38952\n底端\t38953\n灾情\t38954\nPersia\t38955\n68号\t38956\n银版\t38957\n胶钉\t38958\n干休所\t38959\n办公大楼\t38960\n环氧树脂固化剂\t38961\n90升\t38962\n钻机\t38963\n财政部国家税务总局\t38964\n超过十年\t38965\n经理证\t38966\ntolywang\t38967\n萝岗街\t38968\ndramatically\t38969\n黄坡镇\t38970\n稍息立正我爱你\t38971\n城证\t38972\n2H\t38973\n主料\t38974\n伤痕\t38975\nt90\t38976\nM8/M8si\t38977\n正读\t38978\n格格不入\t38979\n第四座\t38980\n17P\t38981\n冲水器\t38982\n耳根\t38983\n平特肖\t38984\n一筐\t38985\nVAPORMAX\t38986\nAppreciation\t38987\n筑龙图酷\t38988\n赞助商\t38989\nLevels\t38990\n诚字\t38991\n推荐书\t38992\nBambi\t38993\n苏州南\t38994\ntrx4\t38995\n新世纪环球中心\t38996\n五宝茶\t38997\n报号\t38998\n塘西\t38999\n瓜尔胶\t39000\n美容展\t39001\n车罩\t39002\n辛月\t39003\n氨基苯酚\t39004\n于洁\t39005\n努比亚红魔【锋潮评测室\t39006\n上边\t39007\nc++primer\t39008\n六分\t39009\n修行\t39010\n灰度变换\t39011\n炼金术士\t39012\n长虹社区\t39013\nChinajoy\t39014\n孙立军\t39015\n三国如龙\t39016\n庸俗\t39017\n2017年9月23日\t39018\n热装\t39019\n红虾\t39020\n江苏南通二建集团有限公司\t39021\n电子工业\t39022\n程序化\t39023\n麦迪逊\t39024\n黄湖\t39025\n5关\t39026\n四川幼儿师范高等专科学校\t39027\n活火山\t39028\n下溢\t39029\n百度影音_113\t39030\n敏实\t39031\n高拉\t39032\n一级建造师注册管理系统\t39033\n晨落\t39034\nPowershell\t39035\n抗争史\t39036\n拦截机\t39037\n想约\t39038\n君合律师事务所\t39039\n覆盆子\t39040\n花儿乐队\t39041\ndoor\t39042\n真结局\t39043\nraspberry\t39044\n变更登记\t39045\n大涌镇\t39046\n润东\t39047\npersonality\t39048\n普洱茶生茶\t39049\nDBing\t39050\nqq同步助手\t39051\n胡慧中\t39052\n中国科学技术大学生命科学学院\t39053\n蝴蝶斑\t39054\nupper\t39055\n犯人\t39056\n中森玲子\t39057\n胃底\t39058\n自然科学版)\t39059\nmeas\t39060\n送货\t39061\n极权\t39062\n北斗地图\t39063\nocp\t39064\n高新兴\t39065\n上海市松江区中心医院\t39066\n快递点\t39067\n交管网\t39068\n黎耀祥\t39069\n武士道\t39070\n追鱼传奇\t39071\nGarrett\t39072\na11\t39073\n吊炸天\t3
9074\n罗奇\t39075\n雷克萨斯ES论坛论坛\t39076\n中共中央关于全面深化改革若干重大问题的决定\t39077\n感光度\t39078\nRestriction\t39079\n关于我爱你\t39080\nMapBox\t39081\nmtb\t39082\n趣店集团\t39083\n自住型\t39084\n22.8\t39085\n喷漆\t39086\n返单\t39087\n银冠玉\t39088\n回放\t39089\n1887\t39090\nava\t39091\n涼\t39092\nmesenchymal\t39093\nTDK\t39094\n空港\t39095\n军媒\t39096\nMXnet\t39097\n洋芋\t39098\n80秀\t39099\n李越\t39100\n道境\t39101\nSusan\t39102\n卫城\t39103\n月桂树\t39104\n王力宏\t39105\n魔魅\t39106\n快楽\t39107\n林国强\t39108\n中国南玻集团股份有限公司\t39109\n绝缘管\t39110\n从动\t39111\n我的爱人\t39112\n假面骑士w\t39113\n梅花树\t39114\n_焦作日报数字报_焦作网WWW\t39115\n踏上\t39116\n待嫁\t39117\n刘维全\t39118\n宋敏\t39119\nwanlifeipeng\t39120\n闭源\t39121\n南天PR2E\t39122\n齿盘\t39123\n移籍\t39124\n押犯\t39125\n290号\t39126\n大夏\t39127\n水剂\t39128\nFesto\t39129\n开封市中心医院\t39130\n600903\t39131\n三星N7100/GALAXY\t39132\n安徽公安职业学院\t39133\n枪版\t39134\ntuneup\t39135\nay\t39136\n白娘子\t39137\n重庆市肿瘤医院\t39138\n猎者\t39139\n伊万诺维奇\t39140\n天九幸福集团\t39141\n巧影\t39142\n海鲜城\t39143\n修罗神\t39144\nB4\t39145\n企查猫\t39146\nPAINT\t39147\nwiggle\t39148\n10r\t39149\n399\t39150\n嫡妃\t39151\nformatter\t39152\n复办\t39153\n费振刚\t39154\n此类\t39155\n一刀流\t39156\n大沙村\t39157\n三至\t39158\n光大花园\t39159\njer\t39160\nZL\t39161\n军史\t39162\n地产\t39163\n湖坊镇\t39164\n一汽锡柴\t39165\n审\t39166\n天城镇\t39167\n先锋军\t39168\n床头\t39169\nunits\t39170\n铮\t39171\n氨味\t39172\n朝柴\t39173\n绿色酒店\t39174\n鸟鸟\t39175\n固定翼无人机\t39176\nVCP\t39177\n奇伟\t39178\nOPPOA73\t39179\n好冷\t39180\n三千个\t39181\n庆铃\t39182\n中欧班列_\t39183\n处理单\t39184\nGrowl\t39185\n短歌\t39186\n4阶\t39187\n一村一品\t39188\n阳山镇\t39189\n第七十五条\t39190\n口交会\t39191\n奥克兰机场\t39192\n剑灵气功师\t39193\n绥宁新闻网\t39194\n开干\t39195\n云南地区\t39196\n明大\t39197\n做业\t39198\nboot应用\t39199\nHORIBA\t39200\n喜爱夜蒲1\t39201\nfuwuqi\t39202\n捣烂\t39203\nvolo\t39204\nActiveState\t39205\nemc\t39206\n粪土\t39207\n华纳唱片\t39208\n中彩\t39209\n好多种\t39210\n布拉格城堡\t39211\n傩戏\t39212\n超神奇\t39213\n南川网\t39214\n探险家\t39215\natlas\t39216\n心管\t39217\n超体\t39218\n华孚色纺\t39219\nBQ\t39220\n贝氏\t39221\n互生\t39222\n路段\t39223\n贝索斯\t39224\nSage\t39225\n戴伦兹\t39226\n当堂\t39227\n穆里奇\t39228\n王正华\t39229\n铁管\t39
230\n程序员节\t39231\n药学家\t39232\n截段\t39233\n异物\t39234\n中国铁建国际花园\t39235\n险种\t39236\n常见病\t39237\n智尚版\t39238\n在职证明\t39239\n小邵\t39240\n宁夏区\t39241\n保险杆\t39242\n蔚蓝卡地亚花园城\t39243\n深圳分所\t39244\n20棵\t39245\n恋夜影院\t39246\n定句\t39247\n姜半夏\t39248\nanew\t39249\n红中麻将\t39250\n信源\t39251\n咸阳\t39252\n结帖\t39253\n北京牡丹园\t39254\n木岛\t39255\n张云霞\t39256\n剽\t39257\n巧笑倩兮\t39258\n地委\t39259\nspp\t39260\n生态园\t39261\n如痴如醉\t39262\n全形\t39263\n集尘\t39264\nanatomy\t39265\n魔兽世界数据库\t39266\n手机号查询网\t39267\nDinner\t39268\n靠边站\t39269\n种族大屠杀\t39270\n西子美发网\t39271\n五星控股\t39272\n第33号\t39273\n塘栖\t39274\nSaba\t39275\n唯美系\t39276\n卫生人员\t39277\n908路\t39278\nmop\t39279\n微信聊天记录查看器\t39280\n話\t39281\nRenfei\t39282\nevolve\t39283\n淮河流域\t39284\nv80\t39285\n垮裤\t39286\n朱婷\t39287\n骤雨\t39288\noledb\t39289\n上海法院\t39290\nnrg\t39291\n日本拉面\t39292\n撒布机\t39293\n研发\t39294\ngxw\t39295\n烟台市老年体协\t39296\n广州百姓网\t39297\n郑捷\t39298\n360写字楼网\t39299\n新天龙八部之天山童姥\t39300\n格栅\t39301\n独神\t39302\n防火门监控系统\t39303\n柯尼卡美能达\t39304\n单字\t39305\n全景\t39306\n桐城新闻网\t39307\n化学法\t39308\n不由分说\t39309\n哄哄\t39310\n铜墙铁壁\t39311\nFTTH\t39312\n党风\t39313\n冒险团\t39314\n九酷网络\t39315\n赵志伟\t39316\n毛面\t39317\n陪着你走\t39318\n赵芸蕾\t39319\n魔童\t39320\n国际关系学院\t39321\n大写\t39322\n旋球\t39323\n耐\t39324\nTS流\t39325\n软件学院\t39326\n藏香猪\t39327\n汉董\t39328\n标致407\t39329\nGui\t39330\nTravelers\t39331\n德业\t39332\n伴手礼\t39333\n金渡镇\t39334\n90块\t39335\n采宝\t39336\n沈约\t39337\n眷村\t39338\n第96\t39339\n观日出\t39340\n釉瓷\t39341\n新钞\t39342\nPrimeNG\t39343\n在逃者\t39344\nfamous\t39345\nTrample\t39346\n龙在天涯\t39347\nindia\t39348\n第一栏\t39349\n法云寺\t39350\n三句半\t39351\n集成电路概念股\t39352\n穿热\t39353\n议\t39354\n403路\t39355\n翁志飞\t39356\n潸\t39357\nrpms\t39358\n扰扰\t39359\n再选中\t39360\n希拉克\t39361\n长得帅\t39362\n开瑞优劲\t39363\n科里奥\t39364\n万能药\t39365\n收敛域\t39366\n专门店\t39367\n田青\t39368\n乔保平\t39369\nghc\t39370\n军港\t39371\n执行期\t39372\n水箱自洁消毒器\t39373\n极速office\t39374\n礼拜五秘书网\t39375\n双床\t39376\n间质性肺病\t39377\n65厘米\t39378\n湖滨新区\t39379\n房产抵押登记\t39380\n2016下半年\t39381\n云内动力\t39382\n客户主\t39383\n开瑞优优\t39384\nKeychain\t39385\n1991\t39386\n妇委会\t39387\n2017年5
月21日\t39388\n全一\t39389\n玉渊潭樱花节\t39390\noa办公系统\t39391\n张自忠\t39392\n条街\t39393\n6980\t39394\nmys\t39395\n心游\t39396\n图灵奖\t39397\n远洋公馆\t39398\n杂交龟\t39399\n亡命徒\t39400\n韦斯利\t39401\n反编译\t39402\n脚石\t39403\nuploader\t39404\n硚口区\t39405\n长隆水上乐园\t39406\n复合模\t39407\n满意率\t39408\n证券投资基金基础知识\t39409\nER3100\t39410\n奋发\t39411\n三味\t39412\nEgg\t39413\n夏目友人帐\t39414\npossession\t39415\n急诊室\t39416\n养生杯\t39417\n评率\t39418\n兮\t39419\n新魔\t39420\n汽车史\t39421\n见鬼\t39422\n新加坡银行\t39423\n看上去\t39424\n123电影网\t39425\n力发\t39426\n幽闭恐惧症\t39427\n赵小侨\t39428\nwebcontent\t39429\n菏泽论坛\t39430\n自由行攻略-韩游网\t39431\n石家庄经济技术开发区\t39432\n阴阳屏\t39433\n蓝多多\t39434\n热火三巨头\t39435\n付款行\t39436\n广西人民广播电台\t39437\n谷普\t39438\nORA-12170\t39439\n960pro\t39440\n烤漆龙骨\t39441\n郁金香节\t39442\nflot\t39443\n紫竹林\t39444\n广济药业\t39445\n645\t39446\n和歌山\t39447\n挥毫泼墨\t39448\n导卫\t39449\n最热\t39450\n藁城区\t39451\njuqing\t39452\n保利林语山庄\t39453\n世纪佳缘交友网\t39454\n三诺血糖仪\t39455\n泰迪熊2\t39456\n合数\t39457\n北湖\t39458\n失访\t39459\nV8.6\t39460\n法证先锋\t39461\n眉间雪\t39462\nearning\t39463\n印尼政府\t39464\nDX7\t39465\n可愛\t39466\nZenCart\t39467\n暴力迪\t39468\n新沂市人民政府\t39469\n数据链\t39470\n顺序器\t39471\nabbott\t39472\n剖面线\t39473\n总动脉\t39474\n宝鸡市政府\t39475\n治疗师\t39476\n保密法\t39477\nluck\t39478\n猎天使魔女吧\t39479\n绿驹电动车\t39480\n咖喱咖喱\t39481\n中边\t39482\n军事演习\t39483\n死穴\t39484\n中国私人银行\t39485\n白恩培\t39486\n福田汽车\t39487\n杨鹏\t39488\n龙庆峡\t39489\n背打\t39490\n太原工业学院\t39491\nshowModalDialog\t39492\nDecoding\t39493\n太钢\t39494\n杭州公办小学\t39495\nDowney\t39496\n板书\t39497\n影子舞\t39498\n连南\t39499\n左氧氟\t39500\nbooking\t39501\n总是\t39502\n非金融企业\t39503\n戊癸\t39504\n段间\t39505\n平安险\t39506\n他汀类药物\t39507\n比奈\t39508\n毁于一旦\t39509\n九水路\t39510\n轩主\t39511\n12升\t39512\n13【\t39513\nchemcp\t39514\n上海交响乐团音乐厅\t39515\n辣根过氧化物酶\t39516\nOKCoin\t39517\n小班\t39518\n固定值\t39519\n340马力\t39520\n高考资源网\t39521\n对角点\t39522\n上海松江区\t39523\n第14天\t39524\n织女\t39525\n回龙观东大街\t39526\n讲\t39527\nnolonely\t39528\n归省\t39529\n21号\t39530\nHDP\t39531\n曲阜政府网\t39532\n力捧\t39533\n720p+1080p\t39534\n自我\t39535\nH3C模拟器\t39536\nMaine\t39537\n融水苗族自治县\t39538\n_巴黎欧莱雅\t3953
9\n重庆市水利局\t39540\n时间操作函数\t39541\n用用\t39542\n彦希\t39543\nbooked\t39544\n高开\t39545\n7005\t39546\n中央华府\t39547\nDownstream\t39548\n压印\t39549\n风流才子\t39550\nFar\t39551\nre\t39552\nsupportassist\t39553\n装弹机\t39554\n黄龙洞\t39555\n国家海洋局南海分局\t39556\n首都医科大学附属北京同仁医院\t39557\n挑错\t39558\n孙迪\t39559\n600题\t39560\n瑞兴\t39561\n酚醛板\t39562\nWIN32\t39563\n铂悦犀湖\t39564\n周易网\t39565\njt808\t39566\n把酒话桑麻\t39567\n绩效工资\t39568\n汉印\t39569\n金农网金农网\t39570\n对襟\t39571\n中国燃气控股有限公司\t39572\n有兴趣\t39573\n黑马\t39574\nVIEW\t39575\n木虫\t39576\n金钱木\t39577\n安床\t39578\n玻璃粉\t39579\n革裹尸\t39580\ncltt\t39581\n一泽\t39582\n7天内\t39583\n子非\t39584\n安装工程费\t39585\n引领\t39586\n刘斐\t39587\nCCTV-1综合频道\t39588\n举重妖精金福珠\t39589\n锤子M1L\t39590\n努比亚Z18mini\t39591\n高头\t39592\n高圣\t39593\n三好小说网\t39594\n喀戎\t39595\nSatellite\t39596\n接管\t39597\n南京公司\t39598\nな\t39599\n君安\t39600\n许昌\t39601\n梦叶\t39602\n花前月下\t39603\norg.junit\t39604\n粤人\t39605\n私网\t39606\n铜川\t39607\n900句\t39608\nLcd\t39609\n小兰\t39610\n刘谦\t39611\n墓王之王悬棺寺_墓王之王悬棺寺\t39612\n博望镇\t39613\n微选码\t39614\n世界版\t39615\n比喻义\t39616\n野蛮时代\t39617\nnpj\t39618\n内情\t39619\nReputation\t39620\n菊梓乔\t39621\n成方圆\t39622\ngala\t39623\n2018-04-09\t39624\n微生活\t39625\n奔跑吧兄弟4\t39626\n保存在\t39627\n武汉地铁4号线\t39628\n488号\t39629\n深夜\t39630\nImpress\t39631\n.NET\t39632\n德恒律师事务所\t39633\n马里山\t39634\n东方红拖拉机\t39635\n长城资产管理公司\t39636\n公地\t39637\n这一夜\t39638\n寒蝉鸣泣之时\t39639\n琵琶女\t39640\n山乡\t39641\n督徒\t39642\n扎扎\t39643\n称重显示器\t39644\nRecycler\t39645\n万钧\t39646\n田径赛\t39647\nExcel组织结构图\t39648\n临窗\t39649\n裤裤\t39650\n鸡蛋盒\t39651\n提等\t39652\n消续费\t39653\n出库单\t39654\n荣宠\t39655\n圣农发展\t39656\n七零年代\t39657\n南宁市第二人民医院\t39658\n天翔环境\t39659\n4.76\t39660\n刘雄\t39661\n繁花\t39662\n7平方\t39663\n良缘\t39664\n灵云\t39665\n百度云盘链接\t39666\ncompetitor\t39667\n今生无悔\t39668\n两扇\t39669\n货样\t39670\n紫绀\t39671\n彩票群\t39672\npersistence\t39673\n铁血将军\t39674\nSV\t39675\n重物\t39676\n科三\t39677\nTs\t39678\nUNIQLO\t39679\n打包箱\t39680\n戴南\t39681\n钱柜\t39682\n墓王之王悬棺寺片尾曲\t39683\n涅茧利\t39684\n徐勤\t39685\n手盘\t39686\n国营农场\t39687\n姜超\t39688\nnore\t39689\n256K\t39690\n劳形\t39691\n名\t39692\n酷匠网
\t39693\n自相关\t39694\n环城高速\t39695\nli\t39696\nadn\t39697\n吉他谱_和弦弹唱谱\t39698\n授旗\t39699\n中华人民共和国全国人民代表大会常务委员会\t39700\n奇槎\t39701\n欲封天\t39702\n费用类\t39703\n阳盛\t39704\n千言万语\t39705\ndongtai\t39706\n维纳滤波器\t39707\n凤栖\t39708\nST匹凸\t39709\n芒果西米露\t39710\n宁陵县人民政府\t39711\n8.4.1\t39712\n供选择\t39713\n请谅解\t39714\n8530\t39715\nbeef\t39716\n惊天破\t39717\n阴师\t39718\n88度\t39719\n/if\t39720\nTST\t39721\n狂胜\t39722\nnao\t39723\n深圳大学计算机与软件学院\t39724\nVAG\t39725\n小米miui\t39726\nH2S\t39727\n战斗系统\t39728\nANS\t39729\nKONTAKT\t39730\n隋然\t39731\nPrimary\t39732\n济沧海\t39733\n四十一枝\t39734\n宫崎英高\t39735\n恋欲\t39736\n1119\t39737\n故人庄\t39738\n通过式\t39739\n百瑞景\t39740\n々\t39741\n位于\t39742\nselling\t39743\n广州电影院\t39744\n5GB\t39745\n300关\t39746\n滞期\t39747\n体改\t39748\nie11\t39749\n4547\t39750\n垫江\t39751\n北展剧场\t39752\n响应者\t39753\n一名滴滴\t39754\n济南乐居\t39755\ntRNA\t39756\nA60\t39757\n漂亮的李慧珍\t39758\n触摸精灵\t39759\n4442\t39760\n一只猫\t39761\n寂然\t39762\n杉树\t39763\nstarch\t39764\n优才教育\t39765\nVoucher\t39766\n50G\t39767\n签名档\t39768\n群星会\t39769\n绝巴哈\t39770\n曼富图\t39771\n黄心猕猴桃\t39772\nnor叔\t39773\n首唱\t39774\n当归\t39775\n甲状腺乳头癌\t39776\n绞肉机\t39777\n可儿\t39778\nOvO\t39779\n五帝本纪\t39780\n第2框\t39781\nt96\t39782\n走秀网\t39783\n幼仔\t39784\nPOSTGRESQL\t39785\n孔夫子\t39786\n工作处\t39787\n无厘头\t39788\n大家好\t39789\n薪假\t39790\n黄庄\t39791\n施丹兰\t39792\n湖南交通职业技术学院\t39793\n周工\t39794\n夏加儿\t39795\n正途\t39796\n精益\t39797\n雨花台中学\t39798\n向松祚\t39799\n浓厚\t39800\n有关部门\t39801\n下档\t39802\n炉石传说\t39803\n滑溜\t39804\n财经公司\t39805\noils\t39806\nMix2\t39807\n129家\t39808\nKST\t39809\n汉语拼音声调\t39810\n风池穴\t39811\n齐唱\t39812\n拔头\t39813\n逆缘\t39814\nspark-shell\t39815\n陈锡联\t39816\n债权\t39817\nphp5apache2_4.dll\t39818\nconvexity\t39819\nppp0\t39820\ntheatre\t39821\n110号\t39822\n20161231\t39823\n纳粹僵尸部队\t39824\n玻璃钢标志桩\t39825\n超弦理论\t39826\n重庆聊斋\t39827\n化史\t39828\n无尽之剑\t39829\n常程\t39830\n第五十回\t39831\n项亚蕻\t39832\n郑州小区\t39833\n景观设计师\t39834\n2.7.10\t39835\n十处\t39836\n弱电箱\t39837\nexact\t39838\nTengine\t39839\n3房\t39840\n陶壶\t39841\n表演性\t39842\n吹风\t39843\n上学歌\t39844\nINPUT框\t39845\n髓质\t39846\n阿骨\t398
47\n腾飞园\t39848\n一边脸\t39849\nM1L\t39850\n跨地区\t39851\n蓝枫\t39852\n_钟祥市政府网\t39853\nbiol\t39854\nVirtualBox虚拟机\t39855\nCMakeLists\t39856\n森崎\t39857\n百校\t39858\nStringUtils\t39859\n椎间盘\t39860\n新的挑战\t39861\n邀请词\t39862\n5.1.44\t39863\n彩虹六号吧\t39864\n单端\t39865\n亿品\t39866\n来说好\t39867\n大尾\t39868\n辅域\t39869\n协会\t39870\n人造板\t39871\n西安地铁8号线\t39872\n罗斯伯格\t39873\nOral\t39874\ncloser\t39875\n鲁花\t39876\n记忆体\t39877\n澳门路\t39878\nandriod\t39879\n国旗法\t39880\n学组\t39881\n公司战略\t39882\n雅琪\t39883\n松永\t39884\n不不不\t39885\n江晓琪\t39886\n冀州市\t39887\n智跑论坛\t39888\n塔希提岛\t39889\n步调一致\t39890\n头孢克肟分散片\t39891\n學校\t39892\n里番ACG\t39893\nskk\t39894\n精金\t39895\n热风炉\t39896\n766游戏网\t39897\n主罚\t39898\n麦多馅饼\t39899\n1.3.5\t39900\n暗送\t39901\n打号机\t39902\nredmi\t39903\n无中生\t39904\n姑侄\t39905\nsoftware\t39906\n同志们\t39907\n低幼\t39908\n欲罢不能\t39909\n极地恶灵吧\t39910\n泡眼\t39911\n包装印刷\t39912\nHolding\t39913\n拓路者\t39914\n苏州有限公司\t39915\n一元钱\t39916\n成都\t39917\n泗水政府网\t39918\n基林\t39919\n优莎娜\t39920\n三米\t39921\n薪酬\t39922\n日语学习软件\t39923\n莺歌\t39924\n王赛\t39925\n46天\t39926\nwider\t39927\n见不得\t39928\n范文澜\t39929\nKOEI\t39930\n鼋头渚风景区\t39931\nSomi\t39932\n杨卫泽\t39933\n脓疱\t39934\n舞艺附中\t39935\n学年\t39936\n兵败\t39937\n564\t39938\nank\t39939\n德江网\t39940\n良奸商\t39941\n江苏城乡建设职业学院\t39942\n802.3af\t39943\n花瓶\t39944\n美臀\t39945\n九真山\t39946\n喜大普奔\t39947\n江桥万达\t39948\n固锝\t39949\n三国时期\t39950\n放了\t39951\n激光\t39952\n渭城\t39953\n高斌\t39954\n光明社区\t39955\nGaruda\t39956\nhome版\t39957\n异侠\t39958\n窥欲\t39959\n召\t39960\n庄毅\t39961\n兽皇\t39962\n重于\t39963\nAce001\t39964\nfills\t39965\n真由美\t39966\n沁园春\t39967\nhalted\t39968\nTXZ\t39969\nwzc\t39970\n梦里花落知多少\t39971\n增高架\t39972\n辣妈们\t39973\n起亚智跑\t39974\n快哉亭\t39975\nwireshark\t39976\n导图\t39977\n绝地球生\t39978\n电子除垢仪\t39979\ndongli\t39980\nPlots\t39981\n召见\t39982\n阿特斯\t39983\n塞尔达传说野\t39984\n游戏化\t39985\n乐高漫威超级英雄2\t39986\nWIKI_高达模型\t39987\n毛纱\t39988\n桂林站\t39989\n寻母\t39990\n妈网\t39991\n洁\t39992\n南京交通职业技术学院\t39993\n溶解\t39994\ndnf驱魔吧_\t39995\n进修校\t39996\n中国纺织工业联合会\t39997\n软路由\t39998\n生机\t39999\n橄榄型\t40000\n腰型\t40001\nAction\t40002\nbay\t40003
\n粪\t40004\n非关税壁垒\t40005\n病逝\t40006\n新兴际华\t40007\n暨南大学附属第一医院\t40008\n小卖家\t40009\nAMTEmu\t40010\n如意算盘\t40011\nJq\t40012\npersona5\t40013\nx80h\t40014\nPolicies\t40015\n立案庭\t40016\n李骏\t40017\n木桩\t40018\n英语作文_99作文网\t40019\n飞翔网\t40020\n生无可恋\t40021\n摊余成本\t40022\n盛\t40023\n干肠\t40024\n施芝鸿\t40025\n齐心集团\t40026\nARE\t40027\n数据科技有限公司\t40028\n二十四小时\t40029\n背肌\t40030\n临港开发区\t40031\n蕾哈娜\t40032\n伊利石\t40033\n工作线\t40034\n默语\t40035\n杨玉莹\t40036\n60件\t40037\n新城控股集团有限公司\t40038\n上海马戏城\t40039\n百灵\t40040\n冰袋\t40041\n常西湖新区\t40042\n神犬\t40043\n西施犬\t40044\nOclean\t40045\n大存\t40046\n中华诵\t40047\n脱氢\t40048\n腿\t40049\n书桌\t40050\nkar\t40051\n安淇尔\t40052\n匿\t40053\n个股\t40054\n剁手族\t40055\n运河新城\t40056\n上港集团\t40057\n蓝天杯\t40058\nDurable\t40059\nShouldn\t40060\nUIBarButtonItem\t40061\nigem\t40062\n大喵\t40063\nStopped\t40064\n好享贷\t40065\n希拉\t40066\n12级\t40067\n唐青枫\t40068\n骨笛\t40069\n白石溪\t40070\n东阿阿胶\t40071\n猴子\t40072\n贡献率\t40073\n视频化\t40074\n全海\t40075\nシャガ\t40076\n环回接口\t40077\nAGENT\t40078\n第64\t40079\nMemTest\t40080\n按摩馆\t40081\n40厘米\t40082\n粗纤维食物\t40083\n中海寰宇天下\t40084\nalf\t40085\nmotorola\t40086\n银娇\t40087\n模特儿\t40088\n拉怪\t40089\nady9\t40090\n电火花机\t40091\n口欲期\t40092\n陇上\t40093\n速算\t40094\n蒲地兰\t40095\n阿光\t40096\n豪宅\t40097\n有谁知道\t40098\n上辈\t40099\n平房村\t40100\n先照后证\t40101\n潮流\t40102\n上帝之城:监狱帝国\t40103\n曹文轩\t40104\n饕餮纹\t40105\n四川高院\t40106\n氟碳漆\t40107\n闫效平\t40108\nfloor\t40109\n千百万\t40110\n刑事和解\t40111\n关隘\t40112\n铜环\t40113\n紫薇斗数\t40114\nRayli\t40115\n通信子网\t40116\n虚空遁地兽\t40117\n选科\t40118\n绵竹市人民政府\t40119\n功耗\t40120\n奉旨\t40121\nbearsee\t40122\nHYAZ\t40123\nInnovations\t40124\n山西农业大学\t40125\n马尔科夫链\t40126\n火影忍者ol忍者考试\t40127\n关天\t40128\n出生证\t40129\n精神失常\t40130\n应邀\t40131\n毒种\t40132\n血清白蛋白\t40133\n七杀格\t40134\n中央档案馆\t40135\n阿庆嫂\t40136\n9A\t40137\n百度一下\t40138\nLITE\t40139\n点评\t40140\nPest\t40141\n不管不顾\t40142\n加泰\t40143\n单相变压器\t40144\nsoild\t40145\n可喜安\t40146\ndropna\t40147\nin99\t40148\n红糖姜茶\t40149\n假书\t40150\n湖北广电\t40151\n王乐乐\t40152\n平安路\t40153\n排洪\t40154\n尤里的复仇吧\t40155\n吃西餐\t40156\ndc24v\t40157\n玉兰\t40158\nv1.11\t401
59\n立品\t40160\nP50\t40161\n玛格丽特\t40162\n周计划\t40163\naddressed\t40164\n罗马城\t40165\n头头\t40166\n北京市残疾人联合会\t40167\n双龙航空港经济区\t40168\n25岁的女高中生\t40169\n高新东区\t40170\nvidoe\t40171\nDigger\t40172\n陈士渠\t40173\n思字\t40174\n大后方\t40175\n歼灭者\t40176\n新形象\t40177\n驾校一点通\t40178\n白垩\t40179\nhax\t40180\nArcCatalog\t40181\nihg\t40182\nhyperledger\t40183\n衬线\t40184\n萌新\t40185\n亿图图示\t40186\nComposites\t40187\n巡航车\t40188\nmonocloud\t40189\n中阳县人民政府\t40190\n写错字\t40191\n亨通\t40192\nJinx\t40193\n国茂\t40194\n天猫魔盒\t40195\n暖风机\t40196\n1055t\t40197\n子夜情缠\t40198\nheld\t40199\nmainwindow\t40200\n000793\t40201\n办事\t40202\n成都电信\t40203\n内部类\t40204\n牛肉米粉\t40205\n重庆电力高等专科学校\t40206\n米熊\t40207\n火钟\t40208\nhpl\t40209\n消防给水及消火栓系统技术规范\t40210\n堆栈式\t40211\n盈港东路\t40212\n丑新\t40213\n赋值语句\t40214\nカップ\t40215\nreducers\t40216\n罗城古镇\t40217\nMULTI\t40218\n顾全\t40219\n大甩卖\t40220\n藏宝洞\t40221\nE讯网\t40222\n青虾\t40223\n820m\t40224\n25局\t40225\n车载摄像头\t40226\nCDA\t40227\n韩城市教育局\t40228\n3525\t40229\n_联商资讯中心\t40230\n万科广场\t40231\n吕老师\t40232\n首批20个\t40233\nAudioTrack\t40234\n石门坎\t40235\n全压\t40236\n新年好\t40237\n从一开始\t40238\n煊煊\t40239\n莱克电气\t40240\nMaxKim\t40241\n收缩\t40242\np1000\t40243\n芜湖镜湖\t40244\n黑咖\t40245\n这些年来\t40246\n本世纪\t40247\n厦门信达\t40248\n骚扰\t40249\n129元\t40250\n歌迷会\t40251\n益暖\t40252\n盛昌\t40253\nKhalid\t40254\n陟\t40255\nc9pro\t40256\n试点\t40257\n陕西环保集团\t40258\n2cosx\t40259\n砌块石\t40260\n酬金\t40261\n皇家花园\t40262\ncbox\t40263\n21三体综合征\t40264\n朴素贝叶斯分类算法\t40265\n太坏了\t40266\n蛇蝎美人第二季\t40267\nmate10pro\t40268\nwego\t40269\n全国名\t40270\ndell吧\t40271\nGPS论坛\t40272\nnam\t40273\n宋锦\t40274\n如月群真\t40275\n鸡人\t40276\n贝塞尔\t40277\n代理版\t40278\n零食\t40279\n混沌世界\t40280\n外宣办\t40281\n信管\t40282\n深圳火车站\t40283\n诋毁\t40284\n内藏\t40285\n云南白药集团股份有限公司\t40286\n住进去\t40287\n举起手来\t40288\n武陵源区\t40289\n电子天平\t40290\n拉销\t40291\nSATA3\t40292\nZNDS智能电视\t40293\n趣致\t40294\n哎哟\t40295\nJoomla\t40296\n御景国际\t40297\n美國\t40298\n常山县政府\t40299\n田蜜\t40300\n透水性\t40301\n六品堂\t40302\n4.60\t40303\n河庄街道\t40304\nvishay\t40305\n龙文\t40306\n香醋\t40307\nLEAD\t40308\n售后服务部\t40309\n卡夫特\t40310\n宋洪远\t4
0311\n浙江大学附属第一医院\t40312\n吐口\t40313\n转引\t40314\n工具卡\t40315\n永恒之柱\t40316\n0.06MB\t40317\n停不下\t40318\nlilly\t40319\n淡泊\t40320\n赌鬼\t40321\n小乔\t40322\n谈恋\t40323\n鸠山\t40324\n南京中电\t40325\n91PRON\t40326\n重庆08定额\t40327\n压铸模\t40328\n试看\t40329\n数篇\t40330\n华中科技大学\t40331\n早上10点\t40332\n资邦金服\t40333\n癫\t40334\n竞争法\t40335\n富士苹果\t40336\n谢\t40337\n高时\t40338\n拱门国家公园\t40339\n幻饰\t40340\n肇庆东站\t40341\n国务院常务会\t40342\n广州花园酒店\t40343\n苦无\t40344\n西阁\t40345\nwearable\t40346\n山东大学威海分校\t40347\n闭壳龟\t40348\nHEA\t40349\n赤色\t40350\n厦门鼓浪屿\t40351\n人本主义\t40352\nZonda\t40353\n股市\t40354\nN4010\t40355\n动化\t40356\n腠理\t40357\nthat&#160\t40358\n干脆利落\t40359\nhumour\t40360\n银霜\t40361\n反特\t40362\n臭屁\t40363\n冲击感\t40364\n耳板\t40365\nflw\t40366\n碧海云天\t40367\n御桥路\t40368\n拮据\t40369\n加多宝\t40370\n别开生面\t40371\n污蔑\t40372\n众达\t40373\n成都医学院第一附属医院\t40374\n元曲\t40375\n极路由2\t40376\n恶戏\t40377\n糖尿病周围神经病变\t40378\n新兴县\t40379\n张赛\t40380\n三七分\t40381\nRejected\t40382\n总部\t40383\n忘了他\t40384\ndeviation\t40385\n交互设计师\t40386\nHDF\t40387\n20类\t40388\n东院\t40389\n管版\t40390\n吕布传\t40391\n选学\t40392\n99998\t40393\n沈傲君\t40394\ncfs\t40395\n成田国际机场\t40396\n寻衅\t40397\n结扎\t40398\n露水河\t40399\n缓蚀\t40400\n孤胆枪手1\t40401\n湖南经视\t40402\n底人\t40403\nsfe\t40404\n体结构\t40405\n分类器\t40406\n2352\t40407\n施治\t40408\nFx\t40409\n五桂桥\t40410\n徐宥箴\t40411\n8月20日\t40412\n樊登读书会\t40413\n碳酸锶\t40414\n五行\t40415\n绝対\t40416\n翻译君\t40417\n柳树\t40418\nartcam\t40419\n凌波\t40420\n中海城\t40421\n奥托尼克斯\t40422\n2.7T\t40423\n马拉加\t40424\n改装件\t40425\n装运\t40426\n小庙\t40427\n本公\t40428\n迭代\t40429\n不得病\t40430\n六铺\t40431\n乐卡\t40432\n15分钟后\t40433\n370x\t40434\nNAND闪存\t40435\n画案\t40436\ndemocratic\t40437\n蚤\t40438\n并购\t40439\n游戏浏览器\t40440\n跑包\t40441\nNoRestriction\t40442\nPLT\t40443\nWELEN\t40444\n养志\t40445\n满洲里市\t40446\n胃十二指肠溃疡\t40447\nGREY\t40448\nS8Plus\t40449\nfootball\t40450\n弑神者\t40451\n聚铁\t40452\n工会组织\t40453\n韩进\t40454\n安以轩\t40455\n公鱼\t40456\n奥布\t40457\n杜仲雄花茶\t40458\n14.0.0\t40459\n鲨\t40460\n震动器\t40461\n霍山县\t40462\n1920年\t40463\n马学武\t40464\n奥兰多布鲁姆\t40465\n外汇掉期\t40466\n世界奇闻网\t40467\n2016年5月1日\t40468\
n20151001\t40469\n360二代\t40470\ninfections\t40471\n敢死队2\t40472\nlocally\t40473\n微空间\t40474\n罗阳镇\t40475\nkk哥\t40476\n微痛\t40477\nCarroll\t40478\n江山图\t40479\n彩券\t40480\nCRAZYSEO+\t40481\n_刀塔传奇\t40482\n歪麦\t40483\n达江\t40484\nims\t40485\n银豹\t40486\n性爱小说\t40487\n市教委\t40488\n星链\t40489\n洱海\t40490\n文行\t40491\nyota3\t40492\n斜月\t40493\n浏河\t40494\n资深吃\t40495\nspank_spank\t40496\n临场感\t40497\n检索表\t40498\n安联保险集团\t40499\n环评证\t40500\nTired\t40501\n931\t40502\n博瑞金融论坛\t40503\n真实赛车3吧\t40504\n房性\t40505\n伊瑞\t40506\ncanmake\t40507\nUIWebView\t40508\n几何题\t40509\n种殖\t40510\n得月楼\t40511\nminiui\t40512\n挂职\t40513\n走着走着\t40514\n海娜\t40515\n甲硝唑\t40516\n从文\t40517\n便秘\t40518\n杭州第十四中学\t40519\n非法集资\t40520\n平行样\t40521\nPE+\t40522\n第26次\t40523\n火车道\t40524\n小年夜春晚\t40525\n知识题\t40526\n60种\t40527\n推力\t40528\n赛场\t40529\n正果\t40530\n35岁\t40531\ncc1plus\t40532\n八班\t40533\n600块\t40534\nSine\t40535\n1634\t40536\n两宋\t40537\n飞鸿影\t40538\nSiteRip\t40539\nvegetables\t40540\n打出血\t40541\n国泰集团\t40542\n恋爱心理学\t40543\n孤陋\t40544\n滤油器\t40545\n鬼神\t40546\nOF\t40547\nabdc\t40548\n水层\t40549\n千脑云\t40550\n站住\t40551\n阻塞性\t40552\n恋足癖\t40553\n共铸\t40554\naukg\t40555\n麦迪逊广场花园\t40556\n中忍考试\t40557\nICEY\t40558\n中国航空工业集团\t40559\nMuseum\t40560\n周围型\t40561\n邦劳岛\t40562\n挖泥船\t40563\n立乐\t40564\nMojo\t40565\nf182\t40566\n鬼恋\t40567\nTell\t40568\n把杆\t40569\n源代码/SDK\t40570\n智度股份\t40571\n振华\t40572\n伊登\t40573\nf10\t40574\n李大钊纪念馆\t40575\n手撕\t40576\n连带\t40577\nPredicate\t40578\n长兄\t40579\n极链\t40580\n猫王\t40581\nGemini\t40582\n报码\t40583\n干锅牛蛙\t40584\n止吐药\t40585\n活动页\t40586\nPatents\t40587\nac5300\t40588\nSencha\t40589\n爱回家\t40590\n蒋婴\t40591\nQQTZ\t40592\n98.5\t40593\n戛纳创意节\t40594\n800号\t40595\n激活版\t40596\nhands\t40597\n张峻宁\t40598\n台北故宫博物院\t40599\nbase58\t40600\n斥力\t40601\n几p\t40602\npathofe\t40603\n万科嘉园\t40604\n扭力测试仪\t40605\ncan\t40606\n四川在线\t40607\n办\t40608\n井口战役\t40609\nprojiect\t40610\n入模\t40611\n豪门惊梦\t40612\n马占山\t40613\n油菜\t40614\n灯下黑\t40615\n核爆\t40616\n浮雕\t40617\n式炉\t40618\n先军\t40619\nreques\t40620\n中海桃源\t40621\n礼花\t40622\nSimatic\t40623\n12312\t406
24\n长兴县政府\t40625\n刘克亚\t40626\n4张\t40627\npx2rem\t40628\n26座\t40629\nsysdba\t40630\n百度移动云测试中心\t40631\n腰膝酸软\t40632\n百度汉语\t40633\n中南林业科技大学涉外学院\t40634\n区块\t40635\n旭日\t40636\n泼粪\t40637\n浙江正泰电器股份有限公司\t40638\n废碱\t40639\nrki\t40640\n彭州市\t40641\n粘合\t40642\nadditem\t40643\n滥竽充数\t40644\n微分中值定理\t40645\nClosed\t40646\n小博\t40647\n端头\t40648\n石门乡\t40649\nmk\t40650\n不能自拔\t40651\n天玺\t40652\n央视\t40653\n股票型证券投资基金-工银瑞信基金管理有限公司\t40654\n我的爸爸妈妈\t40655\n258集团\t40656\n新还珠格格\t40657\n广州白云国际机场\t40658\n荡荡\t40659\n池鱼\t40660\n党务公开网\t40661\n华二前滩\t40662\n检查组\t40663\n酬谢\t40664\nF-35\t40665\n两个小时\t40666\n玉名美良\t40667\n船用燃料油\t40668\n黑加仑葡萄干\t40669\njdbc\t40670\nHTML代码\t40671\n五四运动\t40672\n华为荣耀3C\t40673\n极武皇\t40674\n黄磷\t40675\nIBS\t40676\n清创\t40677\n大连银行\t40678\nbelulu\t40679\n蒙特斯\t40680\n主考官\t40681\n杂闻_摩信网\t40682\n监察权\t40683\n局促\t40684\nev4\t40685\nHumor\t40686\n播色\t40687\n浙江大学研究生院\t40688\n黄金期\t40689\nSpot\t40690\n劝诫\t40691\n河南省纪委\t40692\n竹跳板\t40693\n吉泽\t40694\n杭州剧院\t40695\n入正\t40696\n筚路蓝缕\t40697\n商务座\t40698\nsuface\t40699\n中冷器\t40700\n新筑股份\t40701\n藏獒网\t40702\n38.7\t40703\n少林足球\t40704\n孙媳妇\t40705\n固醇\t40706\nABC分类法\t40707\n地下隧道\t40708\n罗智\t40709\n张经理\t40710\nMinecraft\t40711\n上海银监局\t40712\n大黄蜂\t40713\n乡春\t40714\n虚拟导航栏\t40715\n_讯飞输入法\t40716\nsurrogate\t40717\n动画帧\t40718\n刀魔\t40719\nawifi\t40720\n高发期\t40721\n缺宅男女\t40722\n派乐汉堡\t40723\n【周\t40724\nHOST\t40725\nave\t40726\n耐晒\t40727\n申搏\t40728\n十八摸\t40729\n非公开发行股票\t40730\n洪合镇\t40731\n317号\t40732\nCAI\t40733\nMetrics\t40734\n厚学网\t40735\n酯酶\t40736\n洞庭湖\t40737\n1425\t40738\n红烧牛肉\t40739\n酷能\t40740\n港尾镇\t40741\nPyqt5\t40742\n浪费\t40743\nandroid数据库\t40744\n高回报\t40745\n脂溢性脱发\t40746\n航天梦\t40747\n太阳石\t40748\n一张照片\t40749\n箭楼\t40750\n百度手机助手\t40751\n虐男\t40752\n倾慕\t40753\nhao123\t40754\nNS3\t40755\nCRE\t40756\n牧野\t40757\n徐湘晴\t40758\n招标法\t40759\n平凡\t40760\n上海银行间同业拆放利率\t40761\n艰苦创业\t40762\n永州日报\t40763\n门条\t40764\n嵩山少林寺武校\t40765\n卧龙山庄\t40766\n量器\t40767\n国家农业部\t40768\n金城铃木\t40769\n刘主席\t40770\n东湖社区\t40771\nNBA2K18\t40772\nP10plus\t40773\nT00ls.Net\t40774\n歌泣\t40775\naqua\t40776\
n劲歌\t40777\npeninsula\t40778\n汶\t40779\n中进制\t40780\n紫御江山\t40781\n地分\t40782\nanaconda2\t40783\n对影\t40784\n江上青\t40785\n市政集团\t40786\n章家敦\t40787\n泛醇\t40788\n李心洁\t40789\n葫芦丝网\t40790\n跪地\t40791\n医患关系\t40792\n塔轮\t40793\n新湖期货\t40794\nhuaban\t40795\nRyzen\t40796\n申请器\t40797\n牡丹园\t40798\n老卤\t40799\n柳城街道\t40800\n谢国民\t40801\n搋子\t40802\n重溪\t40803\n二级建造师继续教育\t40804\n热曲\t40805\n数字图书馆\t40806\n杜美惠\t40807\n五一七天\t40808\nAdam\t40809\n吕不韦传奇\t40810\n事件\t40811\n30次\t40812\n上海鼎拓衡器有限公司\t40813\n新年茶话会\t40814\n100.0\t40815\n岳伦\t40816\n瑞可\t40817\n支撑位\t40818\n暗\t40819\nhp400\t40820\n许久\t40821\n受难\t40822\nRom刷机包\t40823\n夏至重生之大涅磐\t40824\n_地下城与勇士\t40825\nbarcode\t40826\n纹身机\t40827\n鹿皮\t40828\n南通市经济技术开发区\t40829\n倾落\t40830\n灵师\t40831\n爆闪\t40832\n阿达木\t40833\n卤汁\t40834\n豪彩\t40835\n寒武纪\t40836\n天职\t40837\n美代\t40838\n转山\t40839\n180415\t40840\n左手腕\t40841\nwww.9ht.com\t40842\n神魔至尊传\t40843\nlnd\t40844\n河头镇\t40845\n焊制\t40846\n曾都\t40847\nCafe\t40848\n140%\t40849\n限行限号\t40850\n瑞士酒店\t40851\n五龙镇\t40852\n13c\t40853\nVESPA\t40854\n服从性\t40855\n趣机网\t40856\n高田\t40857\n长安_长安区政府\t40858\nV3.4\t40859\n灯罩\t40860\n仁和路\t40861\n分子晶体\t40862\n教育部重点实验室\t40863\n十套\t40864\n总工\t40865\n提档案\t40866\n补油\t40867\n建工城\t40868\n方滨兴\t40869\n夹点\t40870\nnove2s\t40871\nxsell\t40872\n垫层\t40873\ntucker\t40874\n第35类\t40875\nPPD\t40876\n悦界\t40877\n超现实主义\t40878\n58号\t40879\n坐客\t40880\n一谷\t40881\n松桃苗族自治县\t40882\n奴化\t40883\n缓存类\t40884\n灭活疫苗\t40885\n瓮\t40886\nAWVS\t40887\n赛马会\t40888\n柠檬醋\t40889\nascending\t40890\n蒙板\t40891\n庶\t40892\n強制\t40893\n青浦\t40894\nCOOKIE\t40895\n中新生态城\t40896\nHD斯诺克\t40897\nEl\t40898\n美女画\t40899\n摩摩哒\t40900\nfx-991ES\t40901\n长安睿骋CC\t40902\n小污\t40903\n合开\t40904\n爱情梦幻号\t40905\n初音未来歌姬计划\t40906\n李霖\t40907\n数字水泥网\t40908\nMUSEUM\t40909\nmhl\t40910\n源码街\t40911\n托儿所\t40912\n1100亿\t40913\n艾欧尼亚VS诺克萨斯\t40914\n青眼\t40915\n小米4\t40916\n网络管理器\t40917\nv1.1.0\t40918\n十八岁的天空\t40919\n流利式\t40920\n李旻\t40921\n光工\t40922\n90种\t40923\n德古拉伯爵\t40924\n爱国卫生运动委员会\t40925\n警笛声\t40926\n超声波焊接机\t40927\n李好\t40928\n抚顺\t40929\n120平方\t40930\n西阳\t40931\nhelp-question\t
40932\n创显\t40933\nPrince\t40934\n亦源智能科技\t40935\n腊汁肉夹馍\t40936\n女人的战争之肮脏的交易\t40937\n农村合作社\t40938\n贾又福\t40939\n铁碳合金\t40940\nkanon\t40941\n莞惠城轨\t40942\n张智成\t40943\n硬下疳\t40944\n丁蜀镇\t40945\njade\t40946\n大人口\t40947\n第一季16集\t40948\nczh\t40949\n同城化\t40950\n糖分\t40951\n少年派的奇幻漂流\t40952\n李泽\t40953\n西条沙罗\t40954\n伊丽莎白鼠\t40955\n氯沙坦\t40956\n陆林\t40957\neink\t40958\nMdict\t40959\n牡丹之歌\t40960\n清欢\t40961\n两只脚\t40962\n提合\t40963\n梁村\t40964\n不相干\t40965\n根值\t40966\n22M\t40967\n斗鱼TV\t40968\n广东省自然科学基金\t40969\nxhell\t40970\n高玩\t40971\n大连医科大学附属第二医院\t40972\n昌光谷未来城\t40973\n车载显示器\t40974\n盐酸二甲双胍\t40975\n鸣声\t40976\n70-80年代\t40977\nryzen3\t40978\nChaoShan\t40979\nv-link\t40980\n八宝山殡仪馆\t40981\n漫天\t40982\n硫酸法\t40983\n2000家\t40984\n检阅\t40985\n丹迪\t40986\n艺术者\t40987\nISR\t40988\n知了\t40989\n合成纸\t40990\n成疑\t40991\n山色\t40992\nPhD\t40993\n化学剂\t40994\n正平股份\t40995\nrvn\t40996\n輕鬆\t40997\n丰台街道\t40998\n博多豚骨拉\t40999\n奇货\t41000\n上海保利大剧院\t41001\n长春龙嘉国际机场\t41002\n帽间\t41003\ntrying\t41004\nBOS\t41005\nPRETTY\t41006\ncoreld\t41007\nprojects\t41008\n中国科技大学\t41009\n爆氧\t41010\n我的小发明\t41011\n田园诗\t41012\n正所谓\t41013\n拖沓\t41014\n衙门\t41015\n香波\t41016\n金地香山湖\t41017\n未解\t41018\n100节\t41019\n游多多\t41020\n重庆市人大常委会\t41021\n这个月\t41022\n2018度\t41023\n黔源电力\t41024\n富有\t41025\n北桥街道\t41026\nlet\t41027\n英雄无敌4\t41028\n啫喱膏\t41029\nsproto\t41030\n内蒙古工业大学\t41031\nshooting\t41032\n56元\t41033\n锈点\t41034\n融易\t41035\n高露洁\t41036\nluncher\t41037\n管理委员会\t41038\n轻食\t41039\nx32\t41040\nlgb\t41041\n晴隆\t41042\n水冷壁\t41043\nacquisitions\t41044\n保定市城乡规划局\t41045\n一个人走\t41046\n拯救队\t41047\n厦门大学海外教育学院\t41048\n森林森\t41049\n银西高铁\t41050\nlak\t41051\n出境\t41052\nEvolutionary\t41053\nrectangle\t41054\n18183诛仙\t41055\n大清真寺\t41056\n虎胆龙威3\t41057\n盗火\t41058\n浦庄\t41059\n眉心\t41060\nLYZC11\t41061\n凋落\t41062\nUnauthorized\t41063\nmac地址修改器\t41064\n义云\t41065\n右上\t41066\n圆皮\t41067\n白色相簿2\t41068\n南屯\t41069\nMartini\t41070\nAsami\t41071\n我的英雄学院吧_\t41072\n刚哥\t41073\ncans\t41074\n禁止\t41075\n宝德股份\t41076\n茶叶袋\t41077\n随借随还\t41078\n雷神7\t41079\n高工机器人\t41080\n人口史\t41081\n休戚与共\t41082\n审稿期\
t41083\n感兴\t41084\n劳动观\t41085\n夜奔\t41086\n化学性\t41087\n安化\t41088\n四宫格散粉\t41089\n袁花\t41090\n吴美莲\t41091\n1438\t41092\n喷雾式\t41093\nGarments\t41094\n山湖城\t41095\n揉碎\t41096\n众志成\t41097\n期待\t41098\nCNAPS\t41099\n宿务\t41100\n焕发\t41101\n香椿树\t41102\n收藏品\t41103\n山东大学控制科学与工程学院\t41104\n江民\t41105\n声控吧\t41106\n五谷渔粉\t41107\n空芯\t41108\n弗朗\t41109\n多径\t41110\ndna\t41111\n老圃\t41112\n73路\t41113\n蔡强\t41114\n清华大学能源与动力工程系\t41115\n克劳\t41116\n易网|养猪网\t41117\n本来面目\t41118\nIder\t41119\nTitanium\t41120\n天空\t41121\n采茶工\t41122\n功能仪\t41123\n贺麟\t41124\n金科路\t41125\n罾\t41126\n嘉兴银行\t41127\n大辫子\t41128\n去逝\t41129\n造口\t41130\n佳邮\t41131\n圣战士\t41132\n巴宝莉\t41133\n人身保险公司\t41134\n万念俱灰\t41135\n柚子皮\t41136\n密封垫\t41137\n公交化\t41138\n老班长\t41139\n蔬菜网\t41140\n运营类\t41141\n广濑丝丝\t41142\n13件\t41143\n平安建设\t41144\n好日\t41145\n叠片\t41146\nplatelet\t41147\n105分\t41148\n东直门中医院\t41149\n郑州市人社局\t41150\n紧肤\t41151\n打仗\t41152\n2018-01-11\t41153\n不图\t41154\n组排\t41155\nyulu\t41156\n袁博\t41157\n旅游船\t41158\narccosx\t41159\n本地人\t41160\n双壁\t41161\n第125期\t41162\n汉阳火车站\t41163\nJS-SDK\t41164\n2008-2009年\t41165\n新南威尔士大学\t41166\nstruct2\t41167\n落袋\t41168\n福士\t41169\n高旭\t41170\n文太\t41171\n606\t41172\n人力资源有限公司\t41173\n太安\t41174\n云南省地震局\t41175\n真羡慕\t41176\n王炜\t41177\n长斑\t41178\nPSTN\t41179\nT16\t41180\n破发\t41181\n金花菜\t41182\n李爱\t41183\n编译期\t41184\n黄狗\t41185\n5.21\t41186\nnuk\t41187\n新疆农信社\t41188\n广铁集团\t41189\n卡王\t41190\n中国质量检验协会\t41191\n包公审驴\t41192\nGodiva\t41193\nLLDB\t41194\n新疆\t41195\n汉语考试服务网\t41196\n1911年\t41197\n西瓜霜\t41198\n归责\t41199\nASME\t41200\nEA211\t41201\nhot\t41202\n面单\t41203\nrx0\t41204\nATX777\t41205\n叠叠乐\t41206\n新视野大学\t41207\n亚希\t41208\n特化师\t41209\nSTM32cube中文网\t41210\nDNF补丁\t41211\nshantou\t41212\nqfile\t41213\n天涯明\t41214\nvisual_studio\t41215\n中国广州政府\t41216\n旧历\t41217\nWeb3.js\t41218\n油率\t41219\n段落\t41220\n金瓶梅2:爱的奴隶\t41221\nVR日报\t41222\n电炉\t41223\nedg\t41224\n暗影格斗3\t41225\n煮熟\t41226\n综自\t41227\n招联金融\t41228\n邱隘\t41229\n武汉大学遥感信息工程学院\t41230\n中耕\t41231\n铠甲勇士捕王\t41232\n海浪\t41233\ncrank\t41234\n明基显示器\t41235\n第十四次\t41236\n笔挺\t41237\n纵剪机\t412
38\n小古文\t41239\n换给\t41240\n0066\t41241\nboot_\t41242\nlinux7\t41243\n烤榴莲\t41244\n光洁度\t41245\n极品飞车ol\t41246\nwndr3800\t41247\n董文标\t41248\n阿u\t41249\nhttp协议\t41250\n宁建国\t41251\n500kv\t41252\n新疆党委\t41253\n信维\t41254\n几份\t41255\n斜眼\t41256\n美妆护肤帮\t41257\n颜如玉\t41258\n巨鹿之战\t41259\n22厘米\t41260\n地菜煮鸡蛋\t41261\n处死\t41262\n全连接层\t41263\n转圜\t41264\n团结大道\t41265\nchallenges\t41266\n赵泓霖\t41267\nexplains\t41268\n困苦\t41269\n日淘\t41270\n陈雁\t41271\n语文学\t41272\n陕西省林业厅\t41273\n2017年中秋\t41274\nSartorius\t41275\n綦建虹\t41276\n靠垫\t41277\n石场\t41278\n七式\t41279\n压铸铝合金\t41280\n2804\t41281\n满庄镇\t41282\n停歇\t41283\n建设银行信用卡\t41284\n顿法\t41285\n图录\t41286\n樟宜机场\t41287\n路痴\t41288\n20170208\t41289\n单冷型\t41290\niene\t41291\n中华人民共和国档案法\t41292\n香砂\t41293\n帝斯曼\t41294\n卤蛋\t41295\nHardy\t41296\n苍穹榜之圣灵纪\t41297\n百娱\t41298\n伊利诺斯州\t41299\n金鸡岭\t41300\n明器\t41301\n电子工程网\t41302\n李琳成吉思汗\t41303\n哈尔滨地铁3号线\t41304\n俞思远\t41305\ninphic\t41306\n20多款\t41307\nprimitives\t41308\n健身运动\t41309\n校音\t41310\n宝岛一村\t41311\n中岛裕翔\t41312\n仙剑奇侠传3外传\t41313\nhater\t41314\n上列\t41315\n明月湾\t41316\n丙二醇\t41317\n河南省委\t41318\n岜沙苗寨\t41319\n龙妈\t41320\n阻垢剂\t41321\n1601\t41322\nsympathetic\t41323\n郑秋冬\t41324\nautorun\t41325\n件套\t41326\n新闻报道\t41327\nalexanderkun\t41328\n叩\t41329\n不忘记\t41330\n胶着\t41331\nAvril\t41332\nlayouts\t41333\nnst\t41334\n领跑\t41335\n调生\t41336\nWarwick\t41337\n靓号网\t41338\n分缝\t41339\nactually\t41340\n昔者\t41341\n助贷\t41342\n宜昌高新区\t41343\n上海宁\t41344\nnight\t41345\n绿色贸易壁垒\t41346\n心奴\t41347\n识别器\t41348\nChatrandom\t41349\n奥兹国\t41350\n黄果\t41351\n脸卡\t41352\ngeorgia\t41353\n菩萨\t41354\n500mm\t41355\n重聚\t41356\n闪闪红星\t41357\n审委\t41358\n斗圣\t41359\n雅思9\t41360\nindexes\t41361\n米白\t41362\ncasablanca\t41363\nChromatography\t41364\n冤枉路\t41365\n海螺集团\t41366\n射速\t41367\nnvalid\t41368\n2016年4月27日\t41369\n天坛奖\t41370\nPC6\t41371\nQC35\t41372\n根据值\t41373\n奋笔疾书\t41374\nSoldiers\t41375\n麝香保心丸\t41376\n安图县\t41377\n命令流\t41378\n报人\t41379\n单倍群\t41380\n独创\t41381\n400余\t41382\n大弦\t41383\n从新\t41384\n小儿扁桃体\t41385\n一份\t41386\n暗影之诗吧\t41387\n摩登如来神掌\t41388\n断肠人\t41389\nREEBOK\
t41390\n天幻网\t41391\n丑橘\t41392\n车龄\t41393\nEPDM\t41394\n中华人民共和国企业破产法\t41395\n3kw\t41396\nsigma\t41397\n17.0.0\t41398\n李嵩\t41399\n中国地铁\t41400\nTubeGalore\t41401\n叫花鸡\t41402\n拳皇98吧_\t41403\n西数红盘\t41404\nM0\t41405\n蕾米莉亚\t41406\n好帅二蛋\t41407\n82064065\t41408\n赫章县人民政府\t41409\nswu\t41410\n起亚KX\t41411\nderivative\t41412\nBleed\t41413\n六五普法\t41414\n私养\t41415\naccept\t41416\n狂乱\t41417\n宗教学\t41418\n万物生\t41419\n小高\t41420\n聊了聊\t41421\n新技巧\t41422\n笃实\t41423\nRemind\t41424\n蓝水\t41425\nfq\t41426\n水尺\t41427\n初秋\t41428\nColeman\t41429\n従\t41430\n不能忘\t41431\n中心句\t41432\n货运从业资格证考试\t41433\n5月22日\t41434\nsolidworks2013\t41435\n差倍\t41436\n开化新闻网\t41437\n首都电影院\t41438\nP8Z68\t41439\n四格漫画\t41440\n铜陵市义安区人民政府\t41441\n迹\t41442\n共产国际\t41443\n百姓家\t41444\n排酸\t41445\n大街小巷\t41446\n奖金池\t41447\n武之禅\t41448\n采样员\t41449\n塔松\t41450\n中民投\t41451\n出柜\t41452\n奥斯洛布\t41453\n连篇\t41454\n斗罗\t41455\n捆机\t41456\nSmut\t41457\ncuphead\t41458\n1769275\t41459\n冯曦妤\t41460\nHC3i数字医疗论坛\t41461\n新资源食品\t41462\n青岛理工\t41463\n辉县\t41464\n610k\t41465\nzhb\t41466\n三五\t41467\n咨询服务业\t41468\n截污\t41469\n徐川\t41470\n辛全生\t41471\n冰墙\t41472\n侠岚\t41473\n魔功\t41474\n标贯\t41475\n江南皮革厂\t41476\n苏州市安全生产监督管理局\t41477\n公民道德建设实施纲要\t41478\nF型\t41479\nkarter\t41480\n龙窑\t41481\n塞牙\t41482\n毛婷\t41483\nzfb\t41484\n微信商户号\t41485\n服装师\t41486\nlogrotate\t41487\nRocketMq\t41488\n潮州话\t41489\n木女\t41490\n迎着\t41491\n1.57G\t41492\n讲卷\t41493\n马牛\t41494\ncompounds\t41495\n磨煤机\t41496\nKardashian\t41497\n绝地求生卡顿解决方法\t41498\n非诚勿扰史\t41499\n张彬\t41500\n己\t41501\n疼痛感\t41502\nPymol\t41503\n渡辺\t41504\nlog日志\t41505\n秋学院\t41506\n三盛颐景园\t41507\n写封\t41508\n上海有机化学研究所\t41509\n优德\t41510\n影视明星\t41511\n欧阳德\t41512\n凯迪\t41513\n1.11.2\t41514\n支链\t41515\n百无聊赖\t41516\n50000吨\t41517\n杜飞\t41518\n硼硅\t41519\nmediate\t41520\n杰克吉伦哈尔\t41521\nSSD1306\t41522\nignore\t41523\n_健网\t41524\ncq40\t41525\n有型\t41526\n拆违\t41527\nBOBBI\t41528\n乙卯日\t41529\n美成达签证网\t41530\n磁力库\t41531\n先行赔付\t41532\n和晶科技\t41533\njng\t41534\n灵骨\t41535\n魅蓝吧\t41536\n液体泵\t41537\n轩逸\t41538\n南口镇\t41539\n极道天魔\t41540\n援藏\t41541\n全部都是你\t41542\n盘符
_\t41543\n二三十万\t41544\n日蚀\t41545\n注册资本认缴制\t41546\n呃逆\t41547\n300点\t41548\nfieldset\t41549\n国家中心\t41550\n口红界\t41551\n血肉模糊\t41552\nlust\t41553\n莱蒂齐亚\t41554\nfisting\t41555\n及时行乐\t41556\n第3版\t41557\n2014年5月\t41558\n海毛虫\t41559\n褚健\t41560\n青工委\t41561\n胞妹\t41562\nSportback\t41563\nBC95\t41564\n张广宁\t41565\n磐安新闻网\t41566\n44小时\t41567\nBlythe\t41568\n直板夹\t41569\n证券经纪人\t41570\n弓箭步\t41571\nDescriptions\t41572\n小睿\t41573\nsmk\t41574\n为食\t41575\nCo\t41576\n枪血\t41577\n斯图科夫\t41578\n桥堆\t41579\nAki\t41580\n迦楼罗\t41581\nUSG\t41582\n采茶节\t41583\n碧海花园\t41584\n张琦\t41585\n佐伯奈奈\t41586\n切问\t41587\n帮助页\t41588\nhelped\t41589\n河北大学\t41590\n10w+\t41591\n变形计\t41592\n合拢\t41593\n大宅门\t41594\n合美\t41595\n￡\t41596\n南通市通州区人民政府\t41597\nandroidstudio3.0\t41598\n名相\t41599\nGround\t41600\n干警\t41601\n第22章\t41602\n百花洲\t41603\n4G_\t41604\n西直门外大街\t41605\n青岛移动\t41606\nNTLM\t41607\n督灸\t41608\n投资银行业务\t41609\n平安烟台网\t41610\n静波\t41611\n七贤岭\t41612\n库尔勒机场\t41613\n扫掠\t41614\n园上\t41615\n枫溪\t41616\nPreparatory\t41617\n齐溪\t41618\nykk\t41619\n保安人员\t41620\n咱\t41621\n43亿元\t41622\n内蒙古自治区国土资源厅\t41623\n童心未泯\t41624\n罗荣桓\t41625\nInterface\t41626\nbecoming\t41627\n爱很烂\t41628\n贾鹏\t41629\n仓前街道\t41630\n猫九酱\t41631\n郑州煤矿机械集团股份有限公司\t41632\n而后\t41633\n臭氧机\t41634\nlodop\t41635\n环球资源\t41636\nECMO\t41637\n喷火\t41638\nKOI\t41639\n徐海乔\t41640\n白灼\t41641\n花溪区\t41642\n欧包\t41643\n条板\t41644\n卫星路\t41645\n低速离心机\t41646\n华谊逸品澜湾\t41647\n廖仲恺\t41648\n弯折\t41649\n棉管\t41650\n魏骑\t41651\n万基\t41652\n1.0.0.0\t41653\nConfirmed\t41654\n供应链分析\t41655\nLeonn\t41656\n女匪\t41657\n日照网\t41658\n家畜\t41659\n西槎路\t41660\n水蛭素\t41661\n胡芦丝\t41662\n准据法\t41663\ngunzip\t41664\n10.11.6\t41665\n幸福在哪里\t41666\n三七_\t41667\n星太奇\t41668\n赶回\t41669\n唐山工人医院\t41670\n辛寨子\t41671\n拉升\t41672\n数据透析表\t41673\nwtybill\t41674\n养牛场\t41675\n北京市君合律师事务所\t41676\nPeppers\t41677\n人工智能机器人\t41678\n疤痕贴\t41679\nopenjudge\t41680\n奇幻射击2\t41681\n黄本\t41682\n13项\t41683\n园林绿化工程\t41684\n虾米\t41685\n拔毒\t41686\n第10届\t41687\nBobbi\t41688\n无知者无畏\t41689\n违纪案\t41690\n六盘水盘县\t41691\n白杨礼赞\t41692\n桃花诺\t41693\nViva\t41694\n镇区\t41695\
nmsvcr120.dll\t41696\n西辛庄\t41697\n6CD\t41698\n电销\t41699\n沙浆\t41700\n穿衣打扮\t41701\n咸宁\t41702\n一案\t41703\n新型\t41704\nXL论坛_汽车之家论坛\t41705\n龙永图\t41706\n微众银行\t41707\n乡道\t41708\n叶氏\t41709\n磺脲类\t41710\n侬本多情\t41711\n跨工作\t41712\n墨村\t41713\n皓庭\t41714\n世界梦号\t41715\n布拉加\t41716\n更合\t41717\n0415\t41718\nProc\t41719\n春夜喜雨\t41720\n11cm\t41721\nκ\t41722\n中望3d\t41723\no4s\t41724\n金达莱花\t41725\n全院\t41726\n主持人\t41727\n任宇昕\t41728\n调解\t41729\n创业大街\t41730\n一叶税\t41731\n华为商城\t41732\n浇水器\t41733\nleah\t41734\n551\t41735\nOSSIM\t41736\n838\t41737\neetop\t41738\nBirt\t41739\n金三银四\t41740\n哥德\t41741\n德尔菲\t41742\nASTM\t41743\n矫正器\t41744\n大国医\t41745\n生活用水\t41746\n马道\t41747\nkrystal\t41748\n乔治克鲁尼\t41749\n破车\t41750\n龙巅观赏鱼\t41751\n焕活\t41752\n紫金信托\t41753\n瘦脸器\t41754\n猪嬲\t41755\n发份\t41756\n0x00000\t41757\n和否\t41758\n黑嘉嘉\t41759\n阿诺\t41760\n博力\t41761\n广本雅阁\t41762\n爱塔传奇\t41763\n绿春\t41764\n赶车\t41765\n桂林北站\t41766\nKatalon\t41767\n单品宝\t41768\n海门政府网\t41769\n孝子\t41770\n允妍\t41771\n磅单\t41772\n炊事班\t41773\n催产素\t41774\n阴晴圆缺\t41775\n猎头者\t41776\n000656\t41777\n偏差值\t41778\n欧凯\t41779\n判官\t41780\n磁电机\t41781\n云汇聚英\t41782\n蓝队\t41783\n行标\t41784\n第77号\t41785\nipmp\t41786\n增程式\t41787\n学法\t41788\n秒查\t41789\n游云\t41790\n新观\t41791\nHG8245\t41792\n镇魂曲\t41793\n求法\t41794\n王婆大虾\t41795\n厄运之忆\t41796\n巴黎圣日尔曼\t41797\n经济管理学院\t41798\n浙江政协\t41799\n昆明物流公司\t41800\n中文名\t41801\n北沿江\t41802\n利齿\t41803\n阿长与\t41804\n盛誉\t41805\n魔乳\t41806\n储物盒\t41807\n233个\t41808\n项脊轩志\t41809\n不受伤\t41810\n西游记之三打白骨精\t41811\n淮安市三畅仪表有限公司\t41812\n意锐\t41813\n拙政\t41814\n海水晶\t41815\nsm论坛\t41816\n店子\t41817\n在途\t41818\n蓝企鹅\t41819\n安贫乐道\t41820\n王文林\t41821\n中国民主同盟\t41822\n偶久论坛\t41823\ndp线\t41824\n中低档\t41825\nAMF\t41826\nGo语言\t41827\n壬辰\t41828\n土豆沙拉\t41829\n水情\t41830\nC\\C++\t41831\n灾难性\t41832\n热塑性\t41833\n垂直平分线\t41834\nCREATORS\t41835\n洛与霞\t41836\nwygl\t41837\n电梯费\t41838\n阴盘\t41839\n圣境传说\t41840\n不完全燃烧\t41841\n沃尔沃S60\t41842\n铅锌\t41843\n鸡粉\t41844\nactiveMQ\t41845\nallen\t41846\n拆招\t41847\n社会科\t41848\n大庆高新区\t41849\n排长\t41850\nNERF\t41851\n永不言败\t41852\n透贴\t41853\n2018-4-18\t41854\n山湖\
t41855\n信片\t41856\n编程题\t41857\n光声\t41858\n土木工程学报\t41859\n创客宝\t41860\n1.4%\t41861\n养兰\t41862\n极神龙\t41863\n美国航空航天局\t41864\nCTY\t41865\n两心\t41866\n法拉赫\t41867\n小儿腺样体肥大\t41868\ngridcontrol\t41869\n比得兔\t41870\n12天\t41871\ntense\t41872\n上海静安香格里拉大酒店\t41873\n移动互联_比特网\t41874\n新桥街道\t41875\n宁波地铁2号线\t41876\niFIX\t41877\n孔桩\t41878\nEdittext\t41879\n荣耀9\t41880\nbeaulo\t41881\n立业\t41882\n九川\t41883\n公司估值\t41884\n6月27\t41885\n攻放\t41886\n0.7.4\t41887\n100cc\t41888\nyumi\t41889\n元空间\t41890\n子shell\t41891\n龙岩市统计局\t41892\n大耳朵英语\t41893\nminst\t41894\n谦谦君子\t41895\n闪电侠第二季\t41896\narray\t41897\n蒸养\t41898\n现浇\t41899\n国营\t41900\n芹\t41901\n上杭路\t41902\n沙家\t41903\n自驾游玩\t41904\n特战先锋\t41905\n2Z\t41906\n价差\t41907\n骰子\t41908\n议程\t41909\n独倚\t41910\nqpython\t41911\n远交\t41912\n扇子\t41913\n六点\t41914\nlump\t41915\n国脉\t41916\ngogogo\t41917\nWL\t41918\n丰田考斯特\t41919\n水龙吟\t41920\n桥牌\t41921\n茶史\t41922\nGalgame\t41923\n午睡\t41924\nmaize\t41925\n机台\t41926\n221号\t41927\n方文\t41928\nCitra3ds\t41929\n中铁二局\t41930\n金饭碗\t41931\n52岁\t41932\n勒阿弗尔\t41933\nextensible\t41934\n外王\t41935\n极品飞车吧\t41936\n轮次\t41937\n失能老人\t41938\n1.5.39\t41939\n大微\t41940\nvue框架\t41941\n新疆社区\t41942\n帝宠\t41943\n剑指\t41944\n神力科莎\t41945\n扬名广场\t41946\n神犬奇兵\t41947\n解放碑步行街\t41948\n鸡排\t41949\ngrade\t41950\n微金融\t41951\n查件\t41952\nm50\t41953\n死别\t41954\nThreadPool\t41955\n雁王\t41956\n9艘\t41957\n91家\t41958\n不宜\t41959\n安阳东站\t41960\n洪益娟\t41961\n杨梅红\t41962\n18kw\t41963\nConfigParser\t41964\nHT\t41965\n焦曼婷\t41966\n新乡火车站\t41967\n气急\t41968\nskiing\t41969\n美少女梦工厂3\t41970\nSyntax\t41971\ndetention\t41972\nSideFX\t41973\nSUPERSTAR\t41974\n2017年6月9日\t41975\n4.43\t41976\nfastdfs\t41977\n239\t41978\n四万万\t41979\n平今仓\t41980\n线电视\t41981\n影集\t41982\n根友\t41983\n白塘湾\t41984\n自送\t41985\nXENO\t41986\n5晚\t41987\n杜少\t41988\n鸥鹏\t41989\nsqlserver2014\t41990\n上悬窗\t41991\nGOG\t41992\n屈膝\t41993\n棒球队\t41994\n混凝土输送泵\t41995\n澈\t41996\n金丝大峡谷\t41997\n赣州\t41998\n知乎quora\t41999\n65t\t42000\n沈建光\t42001\nclasss\t42002\n猛速\t42003\n麗登網路藥妝館\t42004\n进程池\t42005\npositioning\t42006\n京平高速\t42007\n福苑\t420
08\nup在线翻译\t42009\nUSERS\t42010\n于歆杰\t42011\n我的人\t42012\n五菱宏光s\t42013\n研习营\t42014\n覆盖版\t42015\n好运\t42016\n宫本汉乡\t42017\nre热\t42018\nfov\t42019\n700多\t42020\nbroadcast\t42021\n4MOD\t42022\n古墓丽影10\t42023\n淘必中\t42024\n武汉电信\t42025\nenvoy\t42026\n异界逍遥梦路\t42027\n绝世护花高手\t42028\n39码\t42029\n红照\t42030\n样机\t42031\n商务型\t42032\n生漆\t42033\n7621\t42034\n艺墅\t42035\nTOMY\t42036\n茜茜\t42037\n永嘉公务网\t42038\nxcodeproj\t42039\n拉卡拉收款宝\t42040\nBonpoint\t42041\n桂剧\t42042\n上不上班\t42043\n成都太古里\t42044\n蒋中正\t42045\n拼接器\t42046\n安大叔\t42047\nAnnexin\t42048\n热备盘\t42049\n新鸥鹏\t42050\n搬开\t42051\nv8a\t42052\n渝中区\t42053\n逗事\t42054\nlost\t42055\n奔驰r350\t42056\n蓝飞\t42057\n完美沙棘茶\t42058\n翻转架\t42059\n全国助残日\t42060\n中华人民共和国刑事诉讼法释义\t42061\nCPUCores\t42062\n棠下\t42063\n上饶市财政局\t42064\n白兔\t42065\n头孢克洛干混悬剂\t42066\n张工\t42067\n宁波电信\t42068\n棍术\t42069\n上海财经大学图书馆\t42070\n水丽菜\t42071\n高三\t42072\nProtocols\t42073\n铁锈\t42074\n凯卓\t42075\n一轨\t42076\n特鲁多\t42077\n弓片\t42078\n菓子\t42079\n范画\t42080\n拉森钢板桩\t42081\n明辉\t42082\n西单店\t42083\n东风路街道\t42084\n李天王\t42085\n15招\t42086\nDesignJet\t42087\n解放中路\t42088\n宋国权\t42089\nall叶\t42090\n十立\t42091\n熟能生巧\t42092\n张瑛\t42093\n5包\t42094\nweb应用防火墙\t42095\n计税依据\t42096\n华晨宇平凡之路\t42097\n漠然\t42098\n乐乐茶\t42099\nRangers\t42100\n定州\t42101\n内蒙古区\t42102\n钓鱼网站\t42103\n十二句\t42104\n当页\t42105\n任九\t42106\n智车优行\t42107\n老年病\t42108\n徽文化\t42109\nガンダム\t42110\n万数\t42111\nIncorporation\t42112\n张思莱\t42113\n奉化日报\t42114\n美贸易战\t42115\n外籍\t42116\n四川省统计局\t42117\n三国志\t42118\n悬索桥\t42119\nsparkSQL\t42120\n名鞋\t42121\n5min\t42122\n整十数\t42123\n聚效\t42124\nYY4480高清影院\t42125\n克莱斯勒\t42126\nPinpoint\t42127\n皮质醇\t42128\n话务台\t42129\n长庚\t42130\nEV400\t42131\n40几岁\t42132\n萧条\t42133\n325000\t42134\n阿里郎\t42135\n钳口\t42136\n福州市台江区人民政府\t42137\nフリ\t42138\n000839\t42139\nreadme.md\t42140\n强杀\t42141\n清洁度\t42142\n本番\t42143\n格列佛\t42144\n过户\t42145\nNAVI\t42146\n室韦\t42147\n醒茶\t42148\n七言绝句\t42149\n赌片\t42150\n3.2.9\t42151\nv5.5.1\t42152\n21#\t42153\n雷击\t42154\n金沙路\t42155\nkonw\t42156\n根\t42157\n长期护理险\t42158\n硫化鞋\t42159\n幸存者\t42160\n环球国际\t42161\n标志共和国\t421
62\n胡媚娘\t42163\n局子\t42164\n哧\t42165\n库拉索\t42166\n考行测\t42167\n时候\t42168\n导包\t42169\n小游戏\t42170\n吴存荣\t42171\n虹泉路\t42172\n24口\t42173\n陶金\t42174\n南山图书馆\t42175\n江铃福特\t42176\n尸块\t42177\n聂绀弩\t42178\n云鬓乱\t42179\nSailing\t42180\n试炼塔\t42181\n快餐业\t42182\nDead\t42183\nQ友网\t42184\n仙林大学城\t42185\n邵华\t42186\n中原区\t42187\n曲库\t42188\n反野\t42189\n金融街控股股份有限公司\t42190\nAVCHD\t42191\n梦中梦\t42192\n12.7mm\t42193\n剧评\t42194\n黑龙江省人民政府\t42195\n冰蝶\t42196\n莺哥\t42197\n无锡站\t42198\n看客\t42199\nmdr1a\t42200\n认证群\t42201\n毁灭日\t42202\n血战篇\t42203\n煎荷包蛋\t42204\n六安火车站\t42205\n乐观主义\t42206\nGm论坛\t42207\n背甲\t42208\n刘灿\t42209\n顶头上司\t42210\n板框\t42211\nnfine\t42212\n奥巴马\t42213\n张麟\t42214\n柳市\t42215\n首次\t42216\n蓝硕\t42217\n第37期\t42218\n46家\t42219\n演艺吧_演出网\t42220\n062\t42221\n国网信通产业集团\t42222\n17.4\t42223\n30针\t42224\n深访\t42225\nディストビ\t42226\nYTG\t42227\n掩耳盗铃\t42228\n钢琴协奏曲\t42229\n欧迪臣\t42230\n盘片\t42231\n欣灵\t42232\n淫奴\t42233\n料器\t42234\n留党察看\t42235\nEETOP\t42236\n加松\t42237\n龙图游戏\t42238\nCybersecurity\t42239\n新坝镇\t42240\n都市小说吧\t42241\n硬骨头之绝地归途\t42242\nscarborough\t42243\n海关编码查询\t42244\n诵经\t42245\n20170114\t42246\njupyter\t42247\n苏州易修网\t42248\nNetDrive\t42249\n叔丁醇\t42250\n雇人\t42251\n伯克\t42252\nCx51\t42253\nTradingView\t42254\n世界\t42255\n真乱\t42256\n环境保护税\t42257\n简果网\t42258\n计息期\t42259\n磨刀不误砍柴工\t42260\nFresh馥蕾诗\t42261\n第15号\t42262\n占有权\t42263\n硅谷亮城3A座\t42264\n郑州地铁8号线\t42265\n10万块\t42266\njinyici\t42267\n广西电信\t42268\n深科技\t42269\n晓杰\t42270\nThing\t42271\nS2011\t42272\n浪费钱\t42273\n工程造价咨询公司\t42274\n内业\t42275\nASCII码值\t42276\n积分码\t42277\nST慧球\t42278\n猪腰\t42279\n途睿欧论坛论坛\t42280\n绝艺\t42281\n刘有生\t42282\n2087\t42283\n博威合金\t42284\nElitebook\t42285\n8升\t42286\n巴法络\t42287\n局级\t42288\n気持\t42289\n金陵镇\t42290\n极少数\t42291\n中国海事局\t42292\n米苏\t42293\n指迷\t42294\n奥特蛋\t42295\n进行时间\t42296\nJackey\t42297\nTextiles\t42298\n苗王\t42299\n股票期权激励\t42300\n浩南\t42301\n广昌县\t42302\n五卅运动\t42303\n打图\t42304\n柘荣县\t42305\n钱嘉乐\t42306\n七氟丙烷灭火系统\t42307\n桃园北路\t42308\n雨污\t42309\nHeroM2\t42310\n0808\t42311\n汉兰\t42312\n刘锡明\t42313\n武汉交通职业学院\t42314\n忻\t42315\n和诗\t42316\n绒衣\t
42317\n第十六节\t42318\n围棋盘\t42319\n平绒\t42320\n抗菌肽\t42321\n终结贴\t42322\n环海路\t42323\n脱险\t42324\n15yc\t42325\ndraw\t42326\n金融市场\t42327\n仿真机\t42328\n一课一练\t42329\n唐山机场\t42330\n打降\t42331\n1.91\t42332\n白塔湖\t42333\n千张\t42334\npgf\t42335\n襁褓\t42336\n猫儿山\t42337\n阿斯玛\t42338\n贱民\t42339\nloma\t42340\n副词\t42341\npolymerization\t42342\n653\t42343\nUlysse\t42344\n英漢\t42345\n华天动力\t42346\n走肾\t42347\n沙巴岛\t42348\n标准施工招标文件\t42349\n603993\t42350\n2703\t42351\n强制认证\t42352\n负罪感\t42353\n第32章\t42354\n椰菜\t42355\nKnox\t42356\n宽城\t42357\nyields\t42358\nAmelia\t42359\n汽贸公司\t42360\n罐区\t42361\n毛颖\t42362\n巴尔曼\t42363\n8600K\t42364\n值时\t42365\ncidu\t42366\n将棋\t42367\n车臣\t42368\n鳞甲\t42369\n陕西村\t42370\nStronger\t42371\n黄龙之耳\t42372\n猫棒\t42373\n薄言\t42374\n若尔盖大草原\t42375\n统征\t42376\n罗兰德\t42377\n中山陵园风景区\t42378\n水电话\t42379\n201612\t42380\n2018.4.5\t42381\n模糊控制\t42382\n亿升\t42383\n抗结核\t42384\n光明\t42385\n食品药品监督局\t42386\nCASS7.0\t42387\n张杰罗志祥\t42388\n耳球\t42389\nbcactc\t42390\n孙铱\t42391\n离死\t42392\n盐度计\t42393\n华坪\t42394\n高中学\t42395\n毛杜鹃\t42396\nwebbench\t42397\n重庆市电子税务局\t42398\n中国华能\t42399\n低耗\t42400\nexcitement\t42401\n源赖光\t42402\n制冰\t42403\nActiveMq\t42404\n大火箭\t42405\n拨号盘\t42406\n白木香\t42407\n铃鼓\t42408\nNMM\t42409\n绿框\t42410\n韩晓\t42411\n戊寅日\t42412\n薪酬福利\t42413\n会前\t42414\n元字符\t42415\nmoisture\t42416\n上高原\t42417\n中共中央军委\t42418\n林散之\t42419\n火嘴\t42420\n天导\t42421\n废粉\t42422\n风中奇缘\t42423\ntimeout\t42424\n硕风\t42425\n主版\t42426\n患字\t42427\n肛检\t42428\ncaoponAV\t42429\n鸡蛋饼\t42430\n太平新城\t42431\ndrop\t42432\n骨面\t42433\n刘毅\t42434\n500份\t42435\n电视剧本\t42436\nEDIFICE\t42437\neaton\t42438\n人教版小学五年级数学下册\t42439\n职业病危害评价\t42440\n双台子区\t42441\n曹骏\t42442\n北京二中院\t42443\nBidding\t42444\n12W\t42445\nntf\t42446\n千位符\t42447\n处理率\t42448\n澳门威尼斯人酒店\t42449\nxiandai\t42450\n2014-09\t42451\n背对背拥抱\t42452\n缅甸花梨家具_黑酸枝家具厂\t42453\n味千拉面\t42454\n三手\t42455\n姜玉阳\t42456\n量测\t42457\n国税\t42458\nQueenie\t42459\n6008\t42460\n假票\t42461\n苏宁集团\t42462\n铂晶\t42463\n香水湾\t42464\n南京市科学技术委员会\t42465\n13种\t42466\n执法者\t42467\n县委组织部\t42468\nContextCapture\t42469\n高校图书馆\t42470
\n痴痴\t42471\nZh\t42472\n36cm\t42473\ncleef\t42474\n柳满坡\t42475\n抖M\t42476\n西山国家森林公园\t42477\n健忘村\t42478\n潼侨镇\t42479\n上岛\t42480\n金蛋\t42481\n2017年1月18日\t42482\n二十倍\t42483\n凌派屏\t42484\n一步一步\t42485\n1978\t42486\nmadness\t42487\n申长友\t42488\n领航员\t42489\n黄花胶\t42490\n58路\t42491\n山东省监狱管理局\t42492\n网约车公司\t42493\n明尼苏达大学\t42494\n风暴凯越\t42495\n沈坤\t42496\n黄兴路步行街\t42497\n三方通话\t42498\n非字\t42499\n银耳莲子羹\t42500\n维生素B族\t42501\n龙湖冠寓\t42502\n吨位\t42503\n心远\t42504\nexclusion\t42505\n硼钢\t42506\n超滤离心管\t42507\n济南市公安局\t42508\n大众高尔夫嘉旅\t42509\n美克家居\t42510\n被奸\t42511\nNumpy\t42512\n程千帆\t42513\n壳聚糖\t42514\n凌然\t42515\n芙蓉社区\t42516\nUltraISO\t42517\n苯中毒\t42518\n欧洲花园\t42519\n西九龙站\t42520\n制备\t42521\n奔奔EV\t42522\n金皇\t42523\n中学英语\t42524\n层层叠叠\t42525\n辉丰\t42526\n默认视频播放器\t42527\n锡惠公园\t42528\n抗癌药零关税\t42529\nCOOK\t42530\n乔治\t42531\n杂学\t42532\n吴天明\t42533\nm14\t42534\nread_frame\t42535\n3367火影忍者手游\t42536\n彪哥闯奉天\t42537\n蒋月泉\t42538\n顺眼\t42539\nwooyun\t42540\ninsta\t42541\n小荷风采\t42542\nopenID\t42543\n棋力\t42544\n失蜡\t42545\n琅琊网\t42546\ndatable\t42547\n小蚁运动相机\t42548\n女朋友\t42549\nqplay\t42550\n课题目\t42551\n苏州小区\t42552\ndivided\t42553\n怒怼\t42554\nMIC\t42555\n5290\t42556\n王慧\t42557\n易成\t42558\n有我\t42559\n开发者社区\t42560\n附单\t42561\n预制舱\t42562\nnode\t42563\n雅美罗\t42564\n聿怀\t42565\n周六野\t42566\n宫下\t42567\nmssql2005\t42568\n语用学\t42569\n南斗\t42570\n800克\t42571\n扎赉诺尔\t42572\n39张\t42573\n玻璃棒\t42574\n2万多\t42575\n本剧\t42576\n同轴度\t42577\nCMW500\t42578\n倍增\t42579\n梦想高飞\t42580\nrecommended\t42581\n接用\t42582\n道滘\t42583\n悟道法师\t42584\n西海镇\t42585\n春琴\t42586\n尽端\t42587\n周玮\t42588\n单元格\t42589\n先期\t42590\n司空\t42591\n二十公分\t42592\n_子\t42593\n恤\t42594\n教界\t42595\n魂灵\t42596\n院长\t42597\n金晶\t42598\nz170\t42599\n施工地\t42600\n模似\t42601\n八部曲\t42602\n永州站\t42603\n无讼阅读|\t42604\n库布齐\t42605\n尧庙\t42606\n万丈\t42607\n外耳道炎\t42608\n山清水秀\t42609\n媒质\t42610\nNier\t42611\n扩肛\t42612\n笑掉\t42613\n辩词\t42614\npup\t42615\n青青家园\t42616\n第二产业\t42617\n帮帮忙\t42618\n小小忍者2\t42619\nfname\t42620\n相对误差\t42621\n殖民地\t42622\n牡丹吊兰\t42623\n秋卡\t42624\n小晨\t42625\n监督学\t42626\n爆裂无声\t42627\n东莞小学\t42
628\n质子泵\t42629\n昆明市\t42630\n4399游拍\t42631\nrsyncd\t42632\nKan84.net\t42633\n崩坏学\t42634\n举办者\t42635\nアイドルマスタ\t42636\n温笛\t42637\nFramed\t42638\n强势\t42639\n端到端\t42640\n悍戚\t42641\n进化论\t42642\n美图秀秀电脑版\t42643\nOFBiz\t42644\nArden\t42645\n46厘米\t42646\n内心世界\t42647\n半导体分立器件\t42648\n定春\t42649\n105号\t42650\nvbnet\t42651\n怪物猎人世界烂辉龙\t42652\narg0\t42653\nhipanda\t42654\n胶壳\t42655\n程子\t42656\n连通\t42657\n泌尿系感染\t42658\nheard\t42659\n塘沽一中\t42660\n爱帝宫月子中心\t42661\n难觅\t42662\n林惊羽\t42663\n2.41\t42664\n残风\t42665\nx280\t42666\n3万次\t42667\n并联机器人\t42668\ndesertopia\t42669\n10000倍\t42670\n北京滴滴\t42671\n正算\t42672\n北新桥\t42673\n管顶\t42674\n拱北海关\t42675\n跛行\t42676\ncctv15\t42677\nτ\t42678\n李复\t42679\n私密\t42680\n格斗机器人\t42681\n路虎神行者\t42682\n后裔\t42683\n安德鲁森\t42684\n肾镜\t42685\nhumor\t42686\n20160828\t42687\n爱带\t42688\nC语\t42689\n货殖列传\t42690\n时代金融\t42691\n175号\t42692\nAppIcon\t42693\n面相识人\t42694\n创效\t42695\n繁殖基地\t42696\ninduced\t42697\n金山公园\t42698\n兵仙\t42699\n勇敢向前\t42700\n怀宁路\t42701\n155毫米\t42702\n堆积如山\t42703\n疲劳强度\t42704\n胎芽\t42705\n电子证照库\t42706\nfreddy\t42707\nSQL2014\t42708\ncomputational\t42709\n安卓5\t42710\n重现\t42711\n麻辣鸭脖\t42712\nUI模板\t42713\n大魔\t42714\n丙酮酸钠\t42715\ngegegan\t42716\n2209\t42717\na33\t42718\n疔\t42719\n惩\t42720\n御前侍卫\t42721\n晋华\t42722\n缔约国\t42723\n长安大学\t42724\n上甘岭战役\t42725\n空档期\t42726\n多余页\t42727\n反应釜\t42728\n竹筒酒\t42729\n王文华\t42730\n八卦掌\t42731\n申瓯\t42732\n8588\t42733\n纯债\t42734\n拿钱\t42735\nFaster\t42736\n15本\t42737\n毛呢面料\t42738\n众泰E200\t42739\n劣药\t42740\n黑度\t42741\n幼儿教育专业\t42742\n为友\t42743\n羟胺\t42744\n铺助\t42745\n雷神王座\t42746\n公文类\t42747\n蓝莹莹\t42748\n柳州铁道职业技术学院\t42749\nforrest\t42750\n蹉跎控制\t42751\n五大员\t42752\n回字纹\t42753\n智能家电_花火网\t42754\n辛德拉\t42755\n1.11.1\t42756\n中华人民共和国国家通用语言文字法\t42757\nXFire\t42758\n百度影音-一网天下\t42759\n澳门永利集团\t42760\n27公里\t42761\nHPMC\t42762\n计碎\t42763\n华律网合同\t42764\n淄博\t42765\n江汉路\t42766\n红葡萄\t42767\n家居100网\t42768\n孪生兄弟\t42769\n文婷\t42770\n辽宁省人大常委会\t42771\nhtc吧\t42772\n龙珠直播吧\t42773\n你不是真正的快乐\t42774\n电台稿\t42775\n完整端\t42776\n浩沙\t42777\n微信传输助手\t42778\nKartTv\t42779\npure\
t42780\n土肥圆矮挫穷\t42781\n乐清湾大桥\t42782\n远逝\t42783\n裸男\t42784\n几平方\t42785\n林氏荣华\t42786\n42名\t42787\njb51\t42788\nnmap\t42789\n新安\t42790\n克罗米芬\t42791\n法监\t42792\nEntering\t42793\nSOM\t42794\nKQ88口腔新闻网\t42795\n70年后\t42796\n同行\t42797\nwajika\t42798\nrenti\t42799\nBAY\t42800\n腾达ac9\t42801\n傲丽\t42802\n杀手\t42803\n赵四\t42804\n慕尼黑大学\t42805\n色差仪\t42806\n胡健\t42807\n三里\t42808\n上观\t42809\n联信\t42810\namarra\t42811\n单刀直入\t42812\ntweaks\t42813\n叶问3\t42814\n若干名\t42815\n理疗科\t42816\n维塔士\t42817\npychram\t42818\n雪梅\t42819\n气性坏疽\t42820\n铱星\t42821\n带\t42822\n磨砂\t42823\n杨启\t42824\n熊田曜子\t42825\n针头\t42826\n消旋体\t42827\n乐生\t42828\n2415\t42829\n挑水\t42830\n口令\t42831\n华为荣耀4X\t42832\nandrobench\t42833\n示意\t42834\n上宝\t42835\n张家界景区\t42836\n恩奇都\t42837\n65W\t42838\n周放\t42839\n科华生物\t42840\n闪亮的日子\t42841\n重生犬\t42842\n回收商网\t42843\nacrgis\t42844\n正月初五\t42845\n九州通医药集团\t42846\n美国好声音\t42847\n报监\t42848\n炊饼\t42849\n紫马\t42850\n防腐木花架\t42851\n修道者\t42852\nBikeRadar\t42853\n娘\t42854\n丘吉尔\t42855\n史宾格犬\t42856\n各派\t42857\n有源相控阵雷达\t42858\n三氟乙酸\t42859\n膀\t42860\n东山岛\t42861\n文丛\t42862\n王南\t42863\n舞花\t42864\n通片\t42865\n蛤蟆草\t42866\n金河生物\t42867\nelo\t42868\ngravure\t42869\n同日\t42870\n关于性\t42871\n嘉定镇\t42872\n维基词典\t42873\n超小户型\t42874\nフリフレ\t42875\n観光\t42876\ngat5\t42877\n长沙市国土资源局\t42878\n麟游县\t42879\n糕饼\t42880\n燃血\t42881\nt20v3.0\t42882\n史炎\t42883\n西安中考网\t42884\n百度网盘限速\t42885\n5个多月\t42886\n取回\t42887\n曹村镇\t42888\n所得税汇算\t42889\n殡葬\t42890\n医宝\t42891\n德伯\t42892\n浇花器\t42893\n见一\t42894\n地柜\t42895\nwag\t42896\n多功能版\t42897\n第章\t42898\n黑龙江省体育局\t42899\n9.0%\t42900\n祥鹏航空\t42901\n渔夫的故事\t42902\n路易斯酸\t42903\n黄白\t42904\n献爱心\t42905\n除夜\t42906\n王思\t42907\n钢钢\t42908\n2011-2020年\t42909\nprime\t42910\n市法院\t42911\n骨血\t42912\n金丰路\t42913\n微波雷达\t42914\n十四集\t42915\n狙翎\t42916\n有话好说\t42917\n建造师考试\t42918\n总目\t42919\n长安大戏院\t42920\n怀远\t42921\n试听\t42922\nproba\t42923\nPEAK\t42924\n东风汽车报\t42925\n除暴\t42926\nSEVENTEEN\t42927\n道医网\t42928\nイン\t42929\n鹡鸰女神\t42930\n同时\t42931\n1759\t42932\nBeautyLeg\t42933\n多瓣\t42934\nCURLOPT\t42935\nitmo爱萌游戏网\t42936\n南城汽车站\t42937\n熟地黄\
t42938\n以撒的结合:胎衣+\t42939\n黑骑\t42940\n影音先锋资源_影音先锋\t42941\n基普\t42942\n流花湖\t42943\n张孝全\t42944\n华北水利水电学院\t42945\nm225\t42946\ngameloft\t42947\n慌乱\t42948\n萝莉\t42949\n日本国家旅游局\t42950\n小周后\t42951\nDIY个\t42952\n一捧\t42953\n道林\t42954\n1569\t42955\n长途跋涉\t42956\n快乐彩\t42957\n光头哥\t42958\n空盘\t42959\n柴胡桂枝干姜汤\t42960\n太宰治\t42961\njustify\t42962\n惠普战66\t42963\n白蕉镇\t42964\n彩乃奈奈\t42965\n画库\t42966\nSteveWang\t42967\n碳当量\t42968\n房地产门户\t42969\njefferson\t42970\n资产注入\t42971\n陈曙光\t42972\n众泰T800\t42973\n第十六个\t42974\n120块\t42975\n鄙视链\t42976\n恒大文化旅游城\t42977\n第150集\t42978\ngraphic\t42979\n中石油中石化\t42980\n7180DN\t42981\n大庆一中\t42982\n600x600\t42983\ncc++\t42984\n秦皇寺\t42985\n跳河\t42986\nr2r\t42987\nReplace\t42988\n腾翔\t42989\nexpressing\t42990\n经办人\t42991\n御札\t42992\n新概念英语第二册Lesson\t42993\n蒋建国\t42994\n卧龙生\t42995\nQQ幻想世界\t42996\n发射架\t42997\n剑侠世界2\t42998\n模板化\t42999\nCloser\t43000\n纵向\t43001\n董秘\t43002\n跌价\t43003\n不知火\t43004\n三星a7100\t43005\nV8.3\t43006\nVídeos\t43007\n欧芹\t43008\n20160601\t43009\npracticing\t43010\n米云\t43011\n王桥\t43012\n霜火\t43013\npalo\t43014\n党委\t43015\n万润\t43016\n剑豪\t43017\nWeights\t43018\n商用\t43019\n东方银座\t43020\n无民事行为能力人\t43021\nBulletin\t43022\n星晖\t43023\n黑咖啡\t43024\n941n\t43025\n百度云音乐\t43026\npyrosim\t43027\n死亡之吻\t43028\n雄州\t43029\n俊逸\t43030\n烤肉机\t43031\n扬天\t43032\n操持\t43033\n赵盘\t43034\n六宫无妃\t43035\n回春医疗\t43036\n玛特\t43037\nmishi\t43038\n岳阳市\t43039\neff\t43040\n赛里木湖\t43041\n阴吹\t43042\nUtility\t43043\nv260\t43044\n亚马逊review\t43045\n探析\t43046\nASCII、Unicode\t43047\n体工\t43048\n壮志\t43049\n北京亦庄\t43050\n田园\t43051\n英吉沙县\t43052\n工作日报\t43053\n酷BT\t43054\nNike\t43055\n顺颂商祺\t43056\n通信世界网\t43057\n38个\t43058\n学飞\t43059\n奔腾4\t43060\n生角\t43061\n同质\t43062\n36万元\t43063\n复方阿胶浆\t43064\n买椟还珠\t43065\n水功能区划\t43066\n郑义门\t43067\n中国广州增城政府网\t43068\norn\t43069\n黄巍\t43070\nStayin\t43071\n3架\t43072\nzuan\t43073\n四百米\t43074\n沸石\t43075\n重联\t43076\n徐洋\t43077\n夜机\t43078\n18acg\t43079\n劲力\t43080\n吉林化工学院\t43081\n青岛流亭机场\t43082\nDeveloping\t43083\nHelpers\t43084\n挤满\t43085\nSIM4\t43086\nDDR2\t43087\n脑血管疾病\t43088\nim70\t
43089\n枣庄市委\t43090\n亨德利\t43091\n凤头\t43092\n负载率\t43093\n豆萁\t43094\n一房一价\t43095\n雷霆再起\t43096\n25x\t43097\ndequeue\t43098\n用户面\t43099\nfreqz\t43100\n寿者\t43101\n户数\t43102\n中国科学技术大学\t43103\n露骨\t43104\n缠绕\t43105\n数字电路\t43106\n规划类\t43107\n圆领\t43108\n宝蓝色\t43109\n钢贸商\t43110\n百度卫士\t43111\n校园服务网\t43112\n哈尔滨报团\t43113\n铸造\t43114\n北京建设\t43115\n堆龙\t43116\n广州大学城\t43117\n吕建江\t43118\nTickTick\t43119\n沈星\t43120\n息税\t43121\nfound\t43122\nJames\t43123\n俞灏\t43124\n福建省人大常委会\t43125\nrenren\t43126\n竹园路\t43127\n1540\t43128\n广州丰田\t43129\n刘庸\t43130\n岩桐\t43131\n中心商务区\t43132\n黄丽娟\t43133\n22个\t43134\n华东理工大学商学院\t43135\n税事\t43136\n乐视Max2\t43137\n成飞中学\t43138\n史依弘\t43139\n上海市卫生和计划生育委员会\t43140\n十面埋伏\t43141\n寒衣\t43142\n自如\t43143\n微名片\t43144\n红麦\t43145\n润兴\t43146\n涨潮\t43147\n达州\t43148\n陈禾\t43149\n拟于\t43150\n环境变量path\t43151\n超高层住宅\t43152\n靳东\t43153\n图木舒克市\t43154\n鸡肉棒\t43155\n喷罩\t43156\n低密度胆固醇\t43157\n拉床\t43158\n大热天\t43159\n一分钟以上\t43160\n曾芳林\t43161\n书版\t43162\n蛋白胶\t43163\nk650c\t43164\n25斤\t43165\ncakewalk\t43166\n单位化\t43167\n星环科技\t43168\n嘤嘤嘤\t43169\n活该\t43170\n源强\t43171\nIKA\t43172\n中海滨湖公馆\t43173\nflip\t43174\n角声\t43175\n走马灯株式会社\t43176\n众信\t43177\n第三方类\t43178\n无人问津\t43179\nesxi6.0\t43180\n有机合成\t43181\n胃复春\t43182\n职业教育\t43183\n54天\t43184\nCES\t43185\n扬风\t43186\n灵思\t43187\n泰达医院\t43188\n20160106\t43189\n脸上\t43190\n800m\t43191\n李全福\t43192\nbmi\t43193\n红杉树\t43194\n隔音毡\t43195\n嵌入型\t43196\nNaturally\t43197\n12封\t43198\n跨行取款\t43199\n女武神驱动比丘尼\t43200\n徐太志\t43201\nDEFAULT\t43202\n妍妍\t43203\n桂冠电力\t43204\n0.000\t43205\n按摩垫\t43206\n珠宝柜\t43207\n0104\t43208\n天漠\t43209\n般\t43210\n枫泾\t43211\n百媚生\t43212\n不紧\t43213\nMOB\t43214\n搓手\t43215\n易\t43216\nbjui\t43217\n沧浪\t43218\n配电屏\t43219\n网吧三国\t43220\n四平路街道\t43221\n俗气\t43222\nTwisted\t43223\n佛山世纪莲体育中心\t43224\nvalentina\t43225\n效率低\t43226\n陈琼\t43227\n1:12\t43228\n友好性\t43229\n极热网\t43230\nceo\t43231\n肺癌\t43232\n刘小\t43233\n太阳出来喜洋洋\t43234\n印象曲\t43235\ncrease\t43236\n陈印泉\t43237\n绫清竹\t43238\nSnakeTC\t43239\n算了算\t43240\n阿克琉斯\t43241\n直加\t43242\nforwarding\t43243\nwin10安全中心\t43244\n波木遥\t4
3245\n招商公路\t43246\n期限\t43247\n鱼胶\t43248\ntransformations\t43249\n操纵案\t43250\n五分之三\t43251\n幕后照\t43252\n洛杉矶奥运会\t43253\n香江集团\t43254\n汤品\t43255\n德阳市住房和城乡建设局\t43256\n中国都市报\t43257\nshumeipai\t43258\n北京京剧院\t43259\n轻度抑郁症\t43260\n绑\t43261\n列里\t43262\nT-800\t43263\n黄壁庄水库\t43264\n阿莎姬\t43265\nanylogic\t43266\n仙剑三\t43267\n铁床\t43268\n山东省委组织部\t43269\n无参构造函数\t43270\nMilla\t43271\n豫津\t43272\n喷瓶\t43273\n米非司酮\t43274\n20170408\t43275\n弄堂\t43276\nLX5\t43277\n阿比甲\t43278\n信版\t43279\n50892074\t43280\nrtorrent\t43281\n1.06G\t43282\nAstroneer\t43283\n萨拉齐\t43284\nSup\t43285\n北京炸酱面\t43286\n上合组织峰会\t43287\n施以\t43288\n康桥路\t43289\n齐家滨\t43290\n葡萄糖浆\t43291\n丧门\t43292\n博纳艾杰尔\t43293\n欧锦赛\t43294\n18055期\t43295\n眉\t43296\nGamers丨\t43297\numask\t43298\n择时\t43299\n第222集\t43300\n石浦渔港古城\t43301\n西农\t43302\n肛温\t43303\n北京世贸天阶\t43304\n生计\t43305\n武汉市儿童医院\t43306\n马里奥赛车8\t43307\n10004\t43308\n驱动桥\t43309\ntiny\t43310\nbo5\t43311\n大叔别走\t43312\n小卖店\t43313\nrtsp流\t43314\n工合\t43315\n华戒\t43316\n国元农业保险股份有限公司\t43317\n三利达\t43318\n博人传之火影次世代\t43319\n刘二狗\t43320\nJoey\t43321\n一番话\t43322\ncso\t43323\n1例\t43324\n筏形\t43325\nwifi链\t43326\n悬笔\t43327\n枣红\t43328\n代数式\t43329\nPro\t43330\naccessibility\t43331\n第课\t43332\n迁户\t43333\nMindManager思维导图软件\t43334\n库存期\t43335\n南平北站\t43336\n鹞式\t43337\npopper\t43338\n成活率\t43339\n张衡\t43340\n天津津门中医院\t43341\n波导线\t43342\n神甲奇兵\t43343\n春兰杯\t43344\n以利\t43345\n手绘花\t43346\nHIP\t43347\n众泰Z700\t43348\n暗枪\t43349\n3th\t43350\n售假\t43351\n小圈圈\t43352\n磺酸\t43353\n吴清雅\t43354\n屏住呼吸\t43355\n标飞\t43356\n242号\t43357\n李云迪\t43358\n五险一金\t43359\n嘉和城\t43360\n把酒问青天\t43361\n外研通\t43362\nCUDA\t43363\n肖哥\t43364\n寿星\t43365\n天龙功放\t43366\n加元\t43367\n疯狂原始人\t43368\nsqllog\t43369\n站酷\t43370\n轮式\t43371\n忘言\t43372\n梅州百姓网\t43373\n大东湖\t43374\n易龙智\t43375\n设宴\t43376\n宁波市民政局\t43377\n莲舫觅珍\t43378\nCriminal\t43379\n神书\t43380\n小叶增生\t43381\n苏鹏\t43382\n飞出\t43383\n创客学院\t43384\n啷个哩个啷\t43385\n华平股份\t43386\n盲选\t43387\n饰界\t43388\nvocational\t43389\n姜军\t43390\n塑料包\t43391\n喷门癌\t43392\n56半\t43393\nuipage\t43394\n钟敏\t43395\n幅宽\t43396\n8千万\t43397\ngne\t4339
8\nBASE64\t43399\n虚拟网\t43400\n鬼畜\t43401\n滑翔翼\t43402\nブリ\t43403\n解开\t43404\n子部\t43405\n黑暗料理\t43406\n画者\t43407\n大唐西域记\t43408\n炸弹\t43409\n浙江省人民政府法制办公室\t43410\n店名\t43411\n02195559\t43412\n阿普利亚\t43413\n青釉\t43414\n一华独秀\t43415\n重庆小康工业集团股份有限公司\t43416\n猛跌\t43417\n银川火车站\t43418\n县工商局\t43419\n腾讯视频破解版\t43420\n闺怨\t43421\n真吻\t43422\n芝麻白\t43423\n异想天开\t43424\nliyan\t43425\n阵式\t43426\n91条\t43427\n贵州省农业委员会\t43428\nHDCP\t43429\n任务\t43430\nMT3\t43431\nwey\t43432\n2014.04\t43433\n人皮\t43434\n范宽\t43435\n前室\t43436\n福特新全顺论坛论坛\t43437\nQingming\t43438\ncrust\t43439\n林俊杰\t43440\n尼古拉斯\t43441\n灵鹤\t43442\n菜篮\t43443\n不排\t43444\n清格机\t43445\n4.3%\t43446\n第3套\t43447\n嘉定西\t43448\njhc\t43449\n20170131\t43450\npc版\t43451\nCCTV音乐厅\t43452\n兼营\t43453\n乙女心\t43454\n299\t43455\n博将\t43456\n双增\t43457\n房金所\t43458\n小根\t43459\n康建华\t43460\nssb\t43461\n禹王台\t43462\n二嫂\t43463\n代刷\t43464\n老兽\t43465\n海特高新\t43466\n道奇蝰蛇\t43467\nlsqcurvefit\t43468\n凳子舞\t43469\n透\t43470\n书写板\t43471\n刘阔\t43472\n广西中医药大学\t43473\n安信\t43474\n陈纯\t43475\nJavaScript\t43476\n谍战片\t43477\nfl8000u\t43478\nkef\t43479\n环节\t43480\n龙锤\t43481\n龙型\t43482\n晓星尘\t43483\n2018.3.28\t43484\n494\t43485\nQq\t43486\n德迈\t43487\n研发者\t43488\ne580\t43489\n朱行\t43490\n抗渗性\t43491\n惠泉\t43492\n秦秦\t43493\n注册资本登记制度\t43494\n民怨\t43495\nhas_key\t43496\n示教\t43497\nIEDA\t43498\n克里斯蒂娜\t43499\n己巳\t43500\n未成年人犯罪\t43501\n安徽省委组织部\t43502\n娄山\t43503\nmsn\t43504\n青岛世博园\t43505\n傲人\t43506\n蕙兰篇\t43507\nRivet\t43508\nTyson\t43509\n在此\t43510\n求求\t43511\n勇者斗恶龙吧_\t43512\n护木\t43513\n脚边\t43514\nDevStack\t43515\n身教\t43516\n15d\t43517\n下安\t43518\n沃兹\t43519\n打量\t43520\n内审\t43521\njqueryui\t43522\n燕十八\t43523\nsunny\t43524\n新蒲\t43525\n湖南省人力资源和社会保障厅\t43526\n靖西县人民政府\t43527\nMSCOMM\t43528\n海道\t43529\n春服\t43530\n洋花\t43531\nwallpaperengine\t43532\n环湖医院\t43533\n靠枕\t43534\nYesterday\t43535\n状元桥\t43536\n张志和\t43537\n复习\t43538\n徐汇区中心医院\t43539\n艾利斯\t43540\n藏青色\t43541\nROS软路由\t43542\n槽函数\t43543\nq8300\t43544\nPaladin\t43545\n五5\t43546\nSASO认证\t43547\nduff\t43548\n极物\t43549\n烧灼\t43550\n白虫\t43551\n诺帝卡\t43552\nplay\t435
53\n赖氨酸磷酸氢钙颗粒\t43554\n深圳市车管所\t43555\n高度层\t43556\n荣耀立方\t43557\n片皮鸭\t43558\n宫部美雪\t43559\n中国华融\t43560\n2852\t43561\n3dl\t43562\n超微粉\t43563\n妙巴黎\t43564\n甲组\t43565\nmacvlan\t43566\n生事\t43567\n广东白云学院\t43568\n看门狗\t43569\nchina69\t43570\nExt2Fsd\t43571\nExposure\t43572\n宁波电子\t43573\n通灵师\t43574\n辽篮\t43575\n郑大世\t43576\n德尔玛\t43577\n出口额\t43578\n哪季\t43579\n张麻子\t43580\n木樨园\t43581\n红队\t43582\ntrello\t43583\nx方\t43584\n?感\t43585\n供职\t43586\n写真片\t43587\n农机\t43588\n批表\t43589\n硝酸锰\t43590\n新闻政经_北京商报网\t43591\n明礼\t43592\n慧智\t43593\n王伟民\t43594\nOutline\t43595\n土房\t43596\n3盒\t43597\n商丘职业技术学院\t43598\n真绪\t43599\n八桥镇\t43600\nX光片\t43601\n辱华\t43602\n25套\t43603\n颜妮\t43604\n鸡狗\t43605\nTOKYO\t43606\n糖蜜\t43607\n社区工作者考试\t43608\n烟盒\t43609\n海航城\t43610\n股基\t43611\n退港\t43612\n抢走\t43613\n麒麟659\t43614\nFlagship\t43615\n远东控股集团有限公司\t43616\n炉石传说渡鸦年\t43617\nHAZZYS\t43618\n158分钟\t43619\n浙江省委组织部\t43620\n氨基甲酸乙酯\t43621\nBithumb\t43622\n罚酒\t43623\n追梦路\t43624\n115周年\t43625\n城边\t43626\n足球馆\t43627\n月利率\t43628\nfreefilesync\t43629\n非概率\t43630\n张金良\t43631\n九天揽月\t43632\n平安里\t43633\n门牌\t43634\n北京同仁堂\t43635\n李李\t43636\n梦娜\t43637\n北京市属公园\t43638\nrad/s\t43639\nKayden\t43640\n毫秒级\t43641\n辰安科技\t43642\nunconditional\t43643\nomg战队\t43644\n重庆中央公园\t43645\n温州市中心医院\t43646\n接处警\t43647\nMagoosh\t43648\n玥玛\t43649\nKAD\t43650\n谢克尔\t43651\n清华大学医学院\t43652\ncodec\t43653\n龚滩古镇\t43654\n塔里克\t43655\n多选\t43656\nmintty\t43657\n中国高等教育学会\t43658\n怎么装备\t43659\n辣椒网\t43660\n中国工会\t43661\n萨伊\t43662\n25亿\t43663\nV300\t43664\n据报道\t43665\ncription\t43666\n600欧\t43667\n千件\t43668\nFlatList\t43669\nProDAD\t43670\n黄俊英\t43671\n正长\t43672\n反引号\t43673\n卜算子\t43674\n防汗\t43675\nCGO\t43676\nCA证书\t43677\n朝鲜民主主义人民共和国\t43678\n生猛\t43679\n乔致庸\t43680\nProgrammes\t43681\n府之任\t43682\n4000万美元\t43683\n高音\t43684\n章鱼王\t43685\n_场\t43686\n黑帆\t43687\n麦格雷戈\t43688\n稀有金属\t43689\nAKA\t43690\n190万\t43691\n红色版\t43692\n10道\t43693\n页&#160\t43694\n德勤咨询\t43695\n淡黄\t43696\n云杉\t43697\n海澜之家\t43698\n攻略书\t43699\n赵正永\t43700\n海况\t43701\n5月18号\t43702\n侠盗猎车5\t43703\n维修论\t43704\n中国戏曲学院\t43705\n分批\t437
06\nHelio\t43707\n盒马鲜\t43708\n祈福新村\t43709\n对儿\t43710\n乔治希尔\t43711\n贸易战争\t43712\n格式工厂绿色版\t43713\nfruits\t43714\nBrick\t43715\n昔孟母\t43716\n列\t43717\n伤心欲绝\t43718\n年长者\t43719\n京\t43720\n预备班\t43721\nAPACHE\t43722\n20170701\t43723\n1.83\t43724\n咎由自取\t43725\n沥水\t43726\n安初夏\t43727\n飞海口\t43728\nWilliamson\t43729\nPSM\t43730\nKyocera\t43731\nSamson\t43732\n林水\t43733\n握姿\t43734\n康宁大猩猩\t43735\n航天金穗\t43736\n里约热内卢\t43737\n仙庙\t43738\n沥青混合料\t43739\nNOTES\t43740\njiong\t43741\n吃起来\t43742\nAutoForm\t43743\njemter\t43744\n顺丰空运\t43745\n钢板尺\t43746\n樱井俊介\t43747\n0228\t43748\n闪过\t43749\n水木\t43750\n张志坚\t43751\n淋巴结肿\t43752\n朗诗集团\t43753\n蓦山溪\t43754\n亮片\t43755\n114直播网\t43756\n和县政府\t43757\n京派\t43758\n酱菜\t43759\n日语毕业论文\t43760\n张辉\t43761\n三之三\t43762\n【吧务\t43763\n偷袭\t43764\n中信网鸽业\t43765\n七米\t43766\n鱼刺\t43767\n全民无双\t43768\ntrigkit\t43769\n根除\t43770\n阳春白雪\t43771\nmote\t43772\nUserAgent\t43773\n电工电子网\t43774\nVijos\t43775\n丰乐\t43776\n旋流除砂器\t43777\n文学系\t43778\n5.9.1\t43779\n前例\t43780\n变法\t43781\nsudoer\t43782\n富瑞\t43783\n老番\t43784\n4-2\t43785\n0314\t43786\n东方马达\t43787\nknockoutjs\t43788\n合成树脂\t43789\n扩印\t43790\nCodeforces\t43791\n快_\t43792\n仕事\t43793\n获利\t43794\n50周年\t43795\n同载\t43796\n盐山县\t43797\n虜\t43798\n中国科学院兰州化学物理研究所\t43799\n2515\t43800\nbbq\t43801\njio\t43802\n94届\t43803\nplante\t43804\n毕业秀\t43805\n王愚\t43806\nleveldb\t43807\n论剑\t43808\n驷\t43809\n赌城大亨之新哥传奇\t43810\n协和影院\t43811\n2.2kw\t43812\n佛山市科学技术局\t43813\n秘网\t43814\n吴哲\t43815\n循环泵\t43816\n神雕\t43817\n00158\t43818\n鞍山市公安局\t43819\n鹿\t43820\n中检\t43821\n选一\t43822\n南马镇\t43823\n琳琳\t43824\n6003\t43825\n钙离子\t43826\n罗芳村\t43827\n请叫我英雄\t43828\n配置单\t43829\n61年\t43830\n灵活性\t43831\n降维打击\t43832\n新桥镇\t43833\n艾格尼斯\t43834\ndiplomat\t43835\n周艳泓\t43836\n支气管镜\t43837\n精力\t43838\nqq五笔\t43839\nChez\t43840\n最佳男主角\t43841\nfluorescence\t43842\nWeTest\t43843\n登船\t43844\n种马文\t43845\n惠耳\t43846\n北京工商局\t43847\n2018年3月12日\t43848\n主的爱\t43849\n搜无\t43850\n10003\t43851\n布洛芬缓释片\t43852\n埋骨\t43853\n干支纪年法\t43854\n李亚鹏\t43855\n二丁颗粒\t43856\nij\t43857\n陕西日报社\t43858\n招生考试信息网\t43859\nR9\t438
60\n水中花\t43861\n广元之窗\t43862\n孔板流量计\t43863\n中冶华天工程技术有限公司\t43864\n滨海湾新区\t43865\n求解答\t43866\n神兵玄奇\t43867\n季后\t43868\n网易相册\t43869\nrunyon\t43870\n607路\t43871\n商务版\t43872\n南海区政府\t43873\n9点15分\t43874\n春风十里不如你\t43875\n韩瑜\t43876\nqtable\t43877\nSennheiser\t43878\n操作区\t43879\n考研\t43880\n栽倒\t43881\n考据\t43882\n冰塔\t43883\n索引\t43884\n丰田逸致\t43885\n机车\t43886\n范进\t43887\n2016年11月24日\t43888\n诺德股份\t43889\n硕士生\t43890\n0302\t43891\n这辈子\t43892\n高罗佩\t43893\nr230\t43894\n御法者\t43895\n零钱\t43896\n21周\t43897\n潘庄\t43898\nclubman\t43899\n℉\t43900\n鲸落\t43901\n中岛柜\t43902\ncases\t43903\n谁是大歌神\t43904\n杉原杏璃\t43905\n忙碌\t43906\npav\t43907\n公示表\t43908\n棋子湾\t43909\n齐次方程\t43910\n不具备\t43911\n影响额\t43912\n上半期\t43913\n落枫飘飘\t43914\n活体\t43915\n累并快乐\t43916\n绑匪\t43917\n中国自动化学会\t43918\n牛血\t43919\n连云港开发区\t43920\n输赢\t43921\n武神\t43922\n姜多多\t43923\n雨花\t43924\n奥日与黑暗森林\t43925\n湄\t43926\n中芯国际集成电路制造有限公司\t43927\n兵马俑博物馆\t43928\n神经元\t43929\n165级\t43930\n奥迪克莱斯勒300c\t43931\n银泰资源\t43932\n小家电\t43933\n性母\t43934\n合金装备崛起:复仇\t43935\nsiwa\t43936\n1千万\t43937\n备案书\t43938\n祛风\t43939\n650M\t43940\n32岁\t43941\n滚丝机\t43942\n存储量\t43943\n茶餐厅\t43944\n滤毒罐\t43945\n花生小说阅读网\t43946\nWeb工程\t43947\nMM妹妹网\t43948\n细作\t43949\n拜日式\t43950\n天易成网管\t43951\n河北省衡水中学\t43952\n遮阳网\t43953\n偷偷\t43954\n秋季\t43955\n大麻油\t43956\n吴迪\t43957\nK11购物艺术中心\t43958\n山东技师学院\t43959\n流干\t43960\n齐涛\t43961\n中国宪治网\t43962\n北京理工大学机电学院\t43963\n坎特\t43964\n15岁时\t43965\n沈思\t43966\nfcu\t43967\n川芎嗪注射液\t43968\n开办\t43969\n茄\t43970\n肉松饼\t43971\n上海外语教育出版社\t43972\n寻宝网\t43973\nExpression\t43974\n鬼来电\t43975\n黄嘉伟\t43976\n第八课\t43977\n透照\t43978\n贾敬\t43979\n11m\t43980\n喷气式\t43981\nwmware\t43982\n基波\t43983\n海红旗\t43984\n19座\t43985\n神经干细胞\t43986\nSUMIFS\t43987\nArcSDE\t43988\npybot\t43989\n黑暗之魂2原罪学者\t43990\n二航\t43991\n画会\t43992\n平均年\t43993\ngetdata\t43994\n宝马X5论坛_汽车之家论坛\t43995\n米石\t43996\n谢家\t43997\n毛细\t43998\n10025\t43999\n张旸\t44000\ngooglenet\t44001\n零售网\t44002\n睿能科技\t44003\n国旗下\t44004\n固相\t44005\nSeeking\t44006\nindexdb\t44007\n肖琳\t44008\n衔\t44009\n258号\t44010\nproven\t44011\n完璧归赵\t44012\n桦\t44013\n金山逍
遥网\t44014\n收支平衡\t44015\n京师律师事务所\t44016\n乐清人才网\t44017\n恋爱游戏网\t44018\n先天性耳前瘘管\t44019\n时光\t44020\n河南省人民政府外事侨务办公室\t44021\n桃子\t44022\n暗黑西游\t44023\n嘲讽\t44024\n是什么\t44025\n中兴手机\t44026\n限批\t44027\n饮尿\t44028\n浙江省人民检察院\t44029\n6.81\t44030\n富基\t44031\n艾小青\t44032\n不成问题的问题\t44033\ncarrie\t44034\n腾讯新闻\t44035\n微杏\t44036\n行车梁\t44037\n张艺韩寒\t44038\n僵尸游戏\t44039\n若谷\t44040\nmandatory\t44041\n李晓丹\t44042\n武汉工商学院\t44043\n尉犁\t44044\n王宁马伊\t44045\n主持辞\t44046\nAmino\t44047\nfifaonline3\t44048\n查股\t44049\n牡丹皮\t44050\n布局\t44051\n差点儿\t44052\n情侣园\t44053\n火头\t44054\n聚醚胺\t44055\n拼装式\t44056\n田渊正浩\t44057\n斜梯\t44058\n直线式\t44059\n秘密战争\t44060\n下旋\t44061\n挡拆\t44062\n猴\t44063\n思宇\t44064\nYoujizz\t44065\n中继服务器\t44066\n南礼士路\t44067\n祝文\t44068\ndocumented\t44069\n送客\t44070\n双卡双待\t44071\n7.8\t44072\n建平中学\t44073\n赤天化\t44074\n中国科学技术发展战略研究院\t44075\n春招\t44076\n580\t44077\n不远处\t44078\n婚外情事\t44079\n阳光小学\t44080\n追波\t44081\n粗骨料\t44082\n蓝灰\t44083\n网易彩票网\t44084\nCars\t44085\nPlot\t44086\n相交线与平行线\t44087\nseverity\t44088\n建国大业\t44089\nvisualbox\t44090\n1390\t44091\n李东荣\t44092\n妈妈再爱我一次\t44093\n届时\t44094\n大白鹅\t44095\n大学生职业生涯规划书\t44096\n三力士\t44097\n出奇蛋\t44098\nT台秀\t44099\nhsts\t44100\n中美贸易战争\t44101\nPS2模拟器\t44102\n南沙路\t44103\n善良的人\t44104\n杭州搬家公司\t44105\n320回\t44106\n达赖喇嘛\t44107\n上海市人力资源和社会保障局\t44108\nOOO\t44109\n中粮鸿云\t44110\n停拍\t44111\n学基础\t44112\nfancy\t44113\npenguin\t44114\n一步步\t44115\n李斯特\t44116\n教育费附加税率\t44117\nhashi\t44118\n终值系数\t44119\n撑腰\t44120\n回访率\t44121\n爱联\t44122\n11月1日\t44123\n天津市科学技术委员会\t44124\n砸抢\t44125\n星甸街道\t44126\nToyota\t44127\n昶\t44128\n自考概率论与数理统计\t44129\n饭包\t44130\nBulb\t44131\n中国仪表网\t44132\n国家机动车\t44133\n马晶\t44134\n三联书店\t44135\n大院子\t44136\nale\t44137\n遂宁市人力资源和社会保障局\t44138\n邢台市人力资源和社会保障局\t44139\nwav\t44140\n周生\t44141\n49项\t44142\n综琼瑶\t44143\ncambodia\t44144\n炕\t44145\n情境\t44146\n大众CC\t44147\n无余\t44148\n敛财\t44149\n阿提卡\t44150\n得胜\t44151\n盖上\t44152\n辅件\t44153\n人生之路\t44154\n2013年4月\t44155\n马伊董存瑞\t44156\n张美路\t44157\n两百多万\t44158\n奈亚子\t44159\n特评\t44160\nopenJDK\t44161\n黑光灯\t44162\nV5.8\t44163\nBJD\t44164\n探亲\t4
4165\n300ER\t44166\nHighland\t44167\nhongxi\t44168\nJacobian\t44169\nshengyin\t44170\n坠马\t44171\n吴一达\t44172\n如愿以偿\t44173\n李继耐\t44174\n红丸\t44175\nswaggerui\t44176\n中华人民共和国财政部\t44177\n广州站\t44178\n美髯公\t44179\n启事\t44180\nApowerMirror\t44181\n备耕\t44182\n沿面\t44183\n独立显卡_\t44184\n维特\t44185\nhellokitty\t44186\n电镀铜\t44187\n木仓\t44188\n洗洗澡\t44189\n劲酒\t44190\n清河县\t44191\n魅丽\t44192\n一个小时候\t44193\n取暖器\t44194\n4330\t44195\n10cm\t44196\n人才居住证\t44197\n自汗\t44198\nCarPlay\t44199\n菲达\t44200\n望月凉子\t44201\n之路\t44202\n民用建筑设计通则\t44203\nT440S\t44204\n2018年1月25日\t44205\nCoats\t44206\n囊性\t44207\n沈方舟\t44208\n越溪街道\t44209\n连笑\t44210\n九九九\t44211\n深圳医院\t44212\n砍怪\t44213\n双生\t44214\nascii编码\t44215\n广东省医学会\t44216\nVictorinox\t44217\n连云港师范高等专科学校\t44218\n1797\t44219\n内卷化\t44220\n江纱绫\t44221\n泰熙\t44222\nlol2017\t44223\n唯品币\t44224\n陕西省国家税务局电子税务局\t44225\nGD\t44226\nmixer\t44227\n分身软件\t44228\n不竭\t44229\n沈阳大学\t44230\n奔驰G550\t44231\n锡兰红茶\t44232\niPhone7/7Plus\t44233\n背驰\t44234\n建(构)筑物\t44235\n湾区日报\t44236\n易菇网\t44237\n东电\t44238\n成都工行\t44239\n借喻\t44240\n万象世界\t44241\n水豆\t44242\nhaxm\t44243\n工证\t44244\nXFL\t44245\n一桥大学\t44246\n生物科学专业\t44247\ntachi\t44248\n_药品通_39健康网\t44249\n朗姿\t44250\n2018039\t44251\n开放型\t44252\nQListWidget\t44253\n3八\t44254\n无氧运动\t44255\n汉达\t44256\nnoarch\t44257\n李建勋\t44258\n高压风机\t44259\n香蜜湖\t44260\n12.8%\t44261\n不可导\t44262\n泡菜汤\t44263\n豆豆龙\t44264\n心路独舞\t44265\n20160902\t44266\n超力\t44267\n汉明帝\t44268\n卢老爷\t44269\n台型\t44270\nbies\t44271\n笨拙\t44272\n老文\t44273\nVIN\t44274\n雷神911\t44275\n电阻表\t44276\n碰硬\t44277\nGymboree\t44278\n2.06G\t44279\n泰航\t44280\n转屏\t44281\n清好\t44282\n湖南信用网帮助中心\t44283\nfx8320\t44284\n龙争虎斗\t44285\n听信\t44286\n嗨皮\t44287\nDebout\t44288\nSOFINA\t44289\n丝片\t44290\nbeverly\t44291\n讲记\t44292\n月亮湾小区\t44293\n一路走来\t44294\n牛蒡子\t44295\n脱贫户\t44296\n双运\t44297\n星空地产网\t44298\n0875\t44299\n北京农业大学\t44300\n张思思\t44301\n我也不知道\t44302\ndiodes\t44303\nInsanity\t44304\n全勤奖\t44305\n从今开始\t44306\nDenmark\t44307\n小洋楼\t44308\n出家人\t44309\n未税\t44310\n科睿\t44311\n1905VIP影院_高清电影院线大片在线影院_电影网\t44312\n3800万\t44313\n
55海淘网\t44314\n去年的树\t44315\n20150106\t44316\n志略\t44317\nphonopy\t44318\n辅道\t44319\n中山大学海洋学院\t44320\n华迈\t44321\n希思黎\t44322\n播撒\t44323\nerl\t44324\n识字6\t44325\n15周年\t44326\n供销商\t44327\n覆舟\t44328\n灌水\t44329\nThreejs\t44330\n第一间\t44331\n烟台市公安局\t44332\n双乳\t44333\nsill\t44334\n几座\t44335\n白酒\t44336\n首位度\t44337\n妖怪名单\t44338\nCLEAN\t44339\n香梅花园\t44340\n百度安全卫士\t44341\n去一降一补\t44342\n恶臭味\t44343\n后沙\t44344\n艾歌\t44345\nOLEDB\t44346\n红角洲\t44347\n100多款\t44348\nsurfaceflinger\t44349\n磁吸\t44350\n信客\t44351\ncategorical\t44352\nCui\t44353\n广州机场\t44354\n于明\t44355\n嘱\t44356\n朗新科技\t44357\n千万正版音乐海量无损曲\t44358\n京紫\t44359\n收缩率\t44360\n花枝\t44361\n阿立哌唑\t44362\n比鲁斯\t44363\nliqi\t44364\n长制\t44365\n延年\t44366\n2017年上半年\t44367\n山寨\t44368\n注释版\t44369\n左边\t44370\n现期\t44371\n160分\t44372\n一暴\t44373\n米径\t44374\nPermission\t44375\ndelight\t44376\nwww.xinqing100.net\t44377\n20W\t44378\n弯制\t44379\n佛言\t44380\nseam\t44381\n2016年6月25日\t44382\n点名册\t44383\n营改增试点\t44384\n贴金\t44385\n0.2.7\t44386\n政策\t44387\n郁金香花海\t44388\n绿茵\t44389\n洋盘\t44390\n猎鱼达人\t44391\n天生一对泰剧\t44392\n罗伊德\t44393\n界别\t44394\n电子银行承兑汇票\t44395\n200\t44396\n时间点\t44397\n瑞迪\t44398\nSQE\t44399\n摩挲\t44400\n鼻骨\t44401\n三杠\t44402\n网票\t44403\nCloud中文网\t44404\nReno\t44405\nHuSam\t44406\n麻将室\t44407\nSeetaFace\t44408\n11.5%\t44409\nf1赛车\t44410\n孔庙\t44411\n标准信息网\t44412\n截流\t44413\n终极一班3\t44414\n金彪\t44415\n5次\t44416\n碧缇福\t44417\n汽车保险公司\t44418\n招生之家\t44419\nV粉\t44420\n天津保税区\t44421\n李玲蔚\t44422\n文亮\t44423\n干戚\t44424\n三四个月\t44425\n维多利亚·贝克汉姆\t44426\n小米路由pro\t44427\n前节\t44428\n方应\t44429\n凌影\t44430\nwin+r\t44431\n投行在线\t44432\n爱疯\t44433\nv480\t44434\n第2篇\t44435\n地窝堡机场\t44436\nventures\t44437\nglycol\t44438\n唐正东\t44439\n西班牙语\t44440\n22磅\t44441\n迎春小区\t44442\n南京建设银行\t44443\n雅居乐地产\t44444\n作女\t44445\n兰寿\t44446\n浸濡\t44447\n丙球\t44448\n过早\t44449\n黑掉\t44450\n阴刑\t44451\n中华魂\t44452\n活力值\t44453\ncheap\t44454\n文件夹\t44455\n一举\t44456\n南部县\t44457\n新三国志英杰传\t44458\n中国宗教\t44459\n註冊\t44460\n烟台银行\t44461\n西安诺瓦电子科技有限公司\t44462\n薪情\t44463\nmtr\t44464\nzhenai\t44465\n业刹\t44466\n釜底抽薪\t44467\n最终幻\t4
4468\n无价宝\t44469\n肋\t44470\n李幼斌\t44471\nnull-CSDN\t44472\n君海\t44473\n万科中央公园\t44474\n芋\t44475\n鹿子\t44476\n针状\t44477\n鱼漂\t44478\nDylanZ\t44479\n上古卷轴5传奇\t44480\n蛞蝓\t44481\n凯旋国际\t44482\n酷勤网\t44483\nh5播放器\t44484\nconsolidated\t44485\n龙丹妮\t44486\n几毫米\t44487\n第三十三章\t44488\n95至尊\t44489\n影讯\t44490\nthead\t44491\nFETCH\t44492\nrisky\t44493\n毫不犹豫\t44494\n黄芩\t44495\n知无不言\t44496\n融湖中心城\t44497\n丛\t44498\n岩头镇\t44499\n浙能集团\t44500\n韩青\t44501\n大哲学家\t44502\n过时\t44503\n那一天\t44504\n道路交通标志和标线\t44505\n学刊\t44506\n褂子\t44507\nDataFrame切片\t44508\n单县\t44509\n心浪\t44510\n题西林壁\t44511\n渣子\t44512\nUnity5.x\t44513\n4.99\t44514\n句美网\t44515\n卡努\t44516\n7.0.5\t44517\n空心板\t44518\n迅雷磁力720P\t44519\n好妹妹\t44520\n罗伯特·德尼罗\t44521\n3700\t44522\n限仓\t44523\n高起点\t44524\ntickets\t44525\n一汽集团\t44526\n毒针\t44527\n成都农商银行\t44528\n出神\t44529\n号外\t44530\n李青云\t44531\n称为\t44532\n金瑞\t44533\n好卡\t44534\n京阿尼\t44535\n中国医疗器械信息网\t44536\n莲子\t44537\nlaura\t44538\n神箭手\t44539\n宋max\t44540\n毛泽东思想概论\t44541\n全息投影技术\t44542\n北京大学临床肿瘤学院\t44543\n番禺中学\t44544\n乌鲁木齐地区\t44545\n商服\t44546\n汉普森\t44547\n长图\t44548\nemployee\t44549\nPS3\t44550\n安菲尔德\t44551\n扬灰\t44552\n党的十八届四中全会报告精神解读\t44553\n李艳华\t44554\n杰睿\t44555\n黄群\t44556\n王牌贱谍\t44557\n冰汽时代\t44558\ni3-4160\t44559\n十六周岁\t44560\n肉肉文\t44561\nLandscape\t44562\n短暂性脑缺血\t44563\n余文乐\t44564\n式输送机\t44565\n韩脉脉\t44566\n肃清王三运流毒\t44567\n在一起吧\t44568\n控客\t44569\nPLus\t44570\n九十六\t44571\nExperience\t44572\nv6.7\t44573\nJAVMOO\t44574\n说明性\t44575\n360公里\t44576\n来秀\t44577\nt12\t44578\n舞林争霸\t44579\n傲骨之战第二季\t44580\n黄俊杰\t44581\n弹炮\t44582\n王西安\t44583\n010\t44584\n清热\t44585\nv5fox\t44586\n苏幕遮\t44587\n耐水性\t44588\n常州招聘网\t44589\n零年\t44590\n蒋怡\t44591\nNOBODY\t44592\n无法不爱\t44593\n接环\t44594\n画舫\t44595\n2328\t44596\nbx\t44597\n政治宣言\t44598\n布氏硬度\t44599\n顺便\t44600\n干拔\t44601\n澳洲\t44602\n白沫\t44603\n微滤机\t44604\n昊志机电\t44605\n丁丁历险记\t44606\n纽斯葆\t44607\n中华康网\t44608\n柚木\t44609\n鄞州区人民政府\t44610\n牵制\t44611\n仁孚\t44612\n沂蒙\t44613\n292号\t44614\n邮路\t44615\nfue\t44616\n井上瞳\t44617\n钱款\t44618\n恒冠\t44619\nWorkforce\t44620\n重品\t44621\n嵌顿\t44622\n徐氏\t44
623\n中视典\t44624\n淘儿歌网\t44625\n陶化\t44626\n手机中国网\t44627\n注册造价师\t44628\n秦腔折子戏\t44629\n食人\t44630\n健康心理学\t44631\n中国物流公司\t44632\n西单图书大厦\t44633\n华山北\t44634\nHaze\t44635\n冻干食品\t44636\n桂花镇\t44637\n2.3G\t44638\n情迷彼得堡\t44639\nrna\t44640\n敬称\t44641\n换文\t44642\n伍丁\t44643\n冠军\t44644\n红鼻剪刀\t44645\n眼镜片\t44646\n603816\t44647\n甄子丹\t44648\n凌锋\t44649\n莫愁路\t44650\n省地税局\t44651\n201701\t44652\n神圣使命\t44653\nFatFs\t44654\n技法\t44655\n东南沿海\t44656\n点心\t44657\nenforce\t44658\nsgm\t44659\n210元\t44660\n19大\t44661\n初恋乐园\t44662\n躲债\t44663\n新时代讲习所\t44664\n300618\t44665\nParametric\t44666\n長澤\t44667\n400号\t44668\n作势\t44669\n溅镀\t44670\n点透\t44671\n我爱罗\t44672\n出度\t44673\n纽顿\t44674\n磁极\t44675\n王芳\t44676\nAV女***\t44677\n类书\t44678\nDxD\t44679\n西游记续\t44680\n费雯·丽\t44681\n虎躯\t44682\n以上者\t44683\n美巨乳\t44684\n董欣\t44685\n钟秀勇\t44686\n厦华\t44687\nkappa\t44688\n枸\t44689\n银纳米线\t44690\n360.cn\t44691\n焗\t44692\n克洛斯\t44693\n天门沟\t44694\n舒方\t44695\n速龙x4\t44696\nMexico\t44697\n落第骑士\t44698\n宽境\t44699\n手风琴\t44700\n多色\t44701\n太空棉\t44702\n东方都市\t44703\n抛光粉\t44704\n南十里居\t44705\n暴扣\t44706\nGigi\t44707\n64亿\t44708\n分页组件\t44709\n霸王王者天下\t44710\n双鹿\t44711\nu2718q\t44712\n拒马河\t44713\n银狼\t44714\nWorkflow\t44715\n洗碗池\t44716\n区金融办\t44717\n疯味\t44718\n天泰\t44719\n李竹\t44720\nPIX\t44721\nMINOLTA\t44722\n任学锋\t44723\nc6h\t44724\nkaspersky\t44725\n河南投资集团有限公司\t44726\n人工湿地\t44727\n无乐\t44728\nconsumer_offsets\t44729\n城山\t44730\n复旦大学继续教育学院\t44731\n地中海地区\t44732\n阴阳雷凌\t44733\n协奏\t44734\n两觉\t44735\n刘老湿\t44736\n膏贴\t44737\n报关员考试网\t44738\n浙江教育报\t44739\n刘长瑜\t44740\n200次\t44741\n历法\t44742\n贝店\t44743\n左归丸\t44744\n30毫升\t44745\n军靴\t44746\n输稿器\t44747\nIMCA\t44748\n反穿\t44749\nscarl\t44750\nLett\t44751\n永诚财产保险股份有限公司\t44752\nwmp\t44753\n锤妹\t44754\n2131\t44755\ntdp\t44756\n义乌机场\t44757\n福乐阁\t44758\n引人\t44759\n健康管理师\t44760\n5厘米\t44761\n微精灵\t44762\n闽剧\t44763\n保定新市\t44764\n冒牌\t44765\n波叔\t44766\n张兴旺\t44767\n赤岗\t44768\n泪满天\t44769\n宝安人民医院\t44770\n人大常委\t44771\n格西\t44772\n斯凯奇\t44773\nF罩杯\t44774\n大村\t44775\n3000多\t44776\nksi\t44777\n129集\t44778\n无氧\t44779\n宿舍楼\t44780\n电子电路\t4
4781\n自动画\t44782\n美国职业棒球大联盟\t44783\nPD\t44784\n20170306\t44785\n农交会\t44786\n大户\t44787\n1948\t44788\nscaler\t44789\n春天三国\t44790\n芙子\t44791\n985工程大学\t44792\n达奇\t44793\n腔隙\t44794\n南宁市人力资源和社会保障局\t44795\n24重\t44796\n武大新闻网_武汉大学\t44797\n赢钱\t44798\n石花镇\t44799\n谢欣\t44800\n良辰\t44801\n焱妃\t44802\n21运维\t44803\n可变\t44804\n90年代\t44805\n和空\t44806\nfeof\t44807\n方档\t44808\n边督\t44809\n三.doc\t44810\n漆器\t44811\nPhotosh\t44812\n拨打\t44813\n英文版\t44814\n高空作业费\t44815\nPicasa\t44816\n内存频率\t44817\nFOUND\t44818\n千雨\t44819\n思迈奥\t44820\n苏大\t44821\n重庆南岸区\t44822\n驴鞭\t44823\n纽币\t44824\n排水管道\t44825\n硫代乙酰胺\t44826\n鱼具\t44827\n全安\t44828\n小散\t44829\n24分米\t44830\n兰渝铁路\t44831\n隐匿\t44832\noutside\t44833\n恋练\t44834\nsim900a\t44835\n面板式\t44836\n公共事业管理专业\t44837\n搓洗\t44838\nDOTA2天梯\t44839\n小时分钟\t44840\n碗碗\t44841\n马兰谣\t44842\n彭芳\t44843\n孤女\t44844\n南大路\t44845\n硝呋太尔\t44846\n10031\t44847\n民族地区\t44848\n抽水蓄能电站\t44849\n细微\t44850\n鲜网\t44851\n淋面\t44852\n象岛\t44853\n分频\t44854\n力诺\t44855\n吴赫\t44856\n沈某\t44857\n舒静\t44858\nh辣文\t44859\nTerminology\t44860\n戏梦巴黎\t44861\n中国住建网\t44862\n天津中医药大学\t44863\n82个\t44864\n车震门\t44865\nFOV\t44866\nSCC\t44867\n寻女\t44868\nmde\t44869\n避难层\t44870\n湖北省委党校\t44871\nKINGDOM\t44872\n社科界\t44873\n关键时刻\t44874\n刘小东\t44875\n灵犀阁\t44876\n2016年10月\t44877\n换代\t44878\n私自\t44879\n17.com\t44880\n有尽\t44881\n895\t44882\n57折返利网\t44883\n扳机\t44884\n活机\t44885\n梦蝶\t44886\n市人民检察院\t44887\n徐渭\t44888\n奈音\t44889\ndbd\t44890\n小樱\t44891\n唐山海港开发区\t44892\n0539\t44893\n3XL\t44894\n普米族\t44895\n青年节\t44896\n何意\t44897\n安德玛\t44898\nPytorch\t44899\n李忠\t44900\nunderwater\t44901\n历史感\t44902\n杨晋\t44903\n一条条\t44904\n朱霞\t44905\n非交互式\t44906\nroyaltea皇茶\t44907\n卓尔集团\t44908\n恋活吧\t44909\n20170828\t44910\n乔斌\t44911\n八卷\t44912\n暨南大学管理学院\t44913\nbangbang\t44914\n吸血书\t44915\n金燕西\t44916\nCNTV动画台\t44917\n弹弹朗逸\t44918\n安盛\t44919\n台球杆\t44920\nIntechOpen\t44921\nPrinciples\t44922\n广式月饼\t44923\n陈意\t44924\n御墅\t44925\n正港\t44926\n效力\t44927\n永福镇\t44928\n爷爷\t44929\n北京市人力社保局\t44930\nBusinesses\t44931\ncomplementary\t44932\n脓痰\t44933\n伊兹密尔\t44934\n杭州地铁2号线\t449
35\n中国联通沃音乐\t44936\n瓜子脸\t44937\n166号段\t44938\n墨点\t44939\n触摸显示屏\t44940\nHBASE\t44941\n星形胶质细胞\t44942\n爱房\t44943\n一案双查\t44944\n牙龈瘘管\t44945\nYours\t44946\n6月中旬\t44947\n王然\t44948\n长帝\t44949\n赫少华\t44950\n首冲\t44951\n搜狐微博\t44952\n钢业\t44953\n体育界\t44954\nMarshmello\t44955\n2017年3月10日\t44956\n壮文\t44957\n鲑鱼\t44958\n飞抵\t44959\nJeeSite\t44960\n张吉平\t44961\n加速器\t44962\n像素人\t44963\n麓城\t44964\ncmake-gui\t44965\n龙破九天诀\t44966\n二局\t44967\n小片片\t44968\n预计净残值\t44969\n娈\t44970\n第29次\t44971\n汉元帝\t44972\n霸气村\t44973\n叶辉\t44974\n终于知道\t44975\n后人\t44976\n鼠标\t44977\n凿\t44978\n圣女修道院\t44979\n习语\t44980\n眉山市人民政府\t44981\n有数\t44982\nperpendicular\t44983\n狮心王\t44984\n林俊德\t44985\n狼牙山五壮士\t44986\n临高县\t44987\nkubuntu\t44988\n现\t44989\nProfoto\t44990\nVISSIM\t44991\nBis\t44992\n百数\t44993\n冬瓜排骨汤\t44994\n咪豆音乐节\t44995\nQQ靓号\t44996\n绯忍传\t44997\n朗声\t44998\n山西代表团\t44999\n素牙\t45000\n悠牛网\t45001\nyouiv\t45002\n山东卫视\t45003\n尿壶\t45004\n拱坝\t45005\n瞳术\t45006\n少子化\t45007\n读书札记\t45008\n中邮网院\t45009\n人教版初二数学\t45010\n语用分析\t45011\n陈自强\t45012\n超级喜欢\t45013\n李佳薇\t45014\nForwarder\t45015\n哈格\t45016\n鱼尾\t45017\n神级\t45018\n张艳玲\t45019\n抽插\t45020\n缩胸\t45021\n金长城\t45022\n34部\t45023\nnc\t45024\n简朴\t45025\n烈风\t45026\nvars\t45027\n瞒着\t45028\n意志\t45029\n李淑敏\t45030\n10musume\t45031\nInformal\t45032\n猪鬃\t45033\nvivoxplay5\t45034\n青山菜\t45035\n面肥\t45036\n第六大道\t45037\n河南中医一附院\t45038\n丁辉\t45039\n岛崎信长\t45040\n白花菜\t45041\n自建国\t45042\nTypography\t45043\n喹诺酮类\t45044\n国内外\t45045\n2018年4月份\t45046\n中睿\t45047\n刘一男\t45048\n秒会\t45049\n饺子馆\t45050\n防火期\t45051\n有线电\t45052\ndose\t45053\n张浦镇\t45054\n台面\t45055\nreactions\t45056\n吸管\t45057\n董腾\t45058\n1米高\t45059\n凸变\t45060\n防脱\t45061\n钟楼区\t45062\n泯\t45063\n中专网\t45064\n肿瘤干细胞\t45065\n玉宝\t45066\n陈婉婷\t45067\n怎么\t45068\nFurious\t45069\n有病\t45070\n刷怪\t45071\na1528\t45072\nNEJM\t45073\nFolders\t45074\n222只\t45075\n3宫\t45076\n合家网\t45077\n雪花啤酒\t45078\n南充人才网\t45079\n等份\t45080\nabby\t45081\n全能生\t45082\n顺顺留学\t45083\n33年前\t45084\n童柯\t45085\n3CE\t45086\n拥塞\t45087\nquerySelectorAll\t45088\n丧尸围城3\t45089\n利达\t45090\n评价器\t45091\na
7r3\t45092\n锋潮\t45093\n农历三月初三\t45094\n盖尔\t45095\n阅读文\t45096\n初速\t45097\nvivien\t45098\n永耀\t45099\n北迈网\t45100\n1l\t45101\nStudio\t45102\n华慧考博网\t45103\nAwakening\t45104\n胡希恕\t45105\n大肚腩\t45106\nLillian\t45107\n_巴士冒险岛\t45108\n红梅花\t45109\n宾语\t45110\n安东尼奥尼\t45111\n泡泡花\t45112\n玻尿酸\t45113\n化毛膏\t45114\n原始记录\t45115\n五城市\t45116\n表语从句\t45117\nAECOM\t45118\nword2011\t45119\n同室\t45120\n第14期\t45121\ncbt\t45122\n北艾路\t45123\n港中大\t45124\n蓝衣\t45125\nsjx\t45126\n300g\t45127\n百度电视云\t45128\n永固\t45129\n体臭\t45130\n洪灾\t45131\nldr\t45132\n89860998\t45133\n扣子\t45134\n猎禅网\t45135\nmeaven\t45136\nDates\t45137\n刘文军\t45138\ncse\t45139\n爱推文\t45140\n旅游先遣队\t45141\nresolutions\t45142\n杨贵妃\t45143\n黄秋燕\t45144\n张萌澄\t45145\n瞑\t45146\n全网通手机\t45147\n急性肾盂肾炎\t45148\n最后一课\t45149\n弹友\t45150\n摩拜\t45151\n例谈\t45152\n上海大观园\t45153\n圈存\t45154\n看着你\t45155\n赤木\t45156\n江南十校\t45157\n易林\t45158\n转投\t45159\n长春峰\t45160\n小飞\t45161\n1024X768\t45162\n姨姨\t45163\n小天后\t45164\n优优猜字谜游戏\t45165\n堵截\t45166\n盯着\t45167\n山景\t45168\nshua\t45169\n阿酸\t45170\n黄智雯\t45171\n大门口\t45172\n2609\t45173\n王永康\t45174\n骨裂\t45175\n白面\t45176\n杜布罗夫尼克\t45177\n催签\t45178\n福建省妇幼保健院\t45179\ncorsair\t45180\n云阳县人民政府\t45181\n刀舞\t45182\n下弦\t45183\n中国法学会\t45184\n曹宝麟\t45185\n碎玉\t45186\n李晓梅\t45187\n井径\t45188\nCLASS\t45189\n一眉\t45190\nKTR\t45191\n0473\t45192\n海安大观\t45193\n融资性\t45194\n无耻\t45195\n2013年1月1日\t45196\n3.2.4\t45197\nredefinition\t45198\n挽救\t45199\n张卡\t45200\n服众\t45201\n陶朱街道\t45202\n太阳星座\t45203\n鲅鱼圈区\t45204\nbt4\t45205\n第93\t45206\n2017年3月13日\t45207\n乐湾\t45208\nxx网\t45209\n乌鲁木齐市委\t45210\n名菜\t45211\n长期利率\t45212\n习水县\t45213\n长春房产网\t45214\n口袋妖怪叶绿386\t45215\n嗨客\t45216\n难以启齿\t45217\n高古\t45218\n70路\t45219\n辛巴\t45220\n噬人\t45221\n影苑\t45222\n夏子\t45223\n息肉\t45224\n险保\t45225\n采荷\t45226\n转租\t45227\nburst\t45228\n328\t45229\n中国国新控股有限责任公司\t45230\ncgspread\t45231\n加办\t45232\n杜伟\t45233\n603533\t45234\n高树三\t45235\n美庐\t45236\n佛山陶瓷\t45237\n纽交所\t45238\nconan\t45239\n纳税评估\t45240\n73天\t45241\n一缕阳光\t45242\n湛江西\t45243\n咲夜\t45244\n改移\t45245\nEvery\t45246\n神王\t45247\n陕西北路\t45248\n川师\t4
5249\n下书网\t45250\n21.0\t45251\n女老生\t45252\n孟非妮\t45253\nNEST\t45254\n连氏\t45255\n上海凯泉\t45256\n国规\t45257\n卫生棉\t45258\n醉鹅娘\t45259\n剑网三成就百科\t45260\n巴顿\t45261\n秀敏\t45262\n冒险类\t45263\n河南省监察厅\t45264\n三艘\t45265\n886\t45266\n土木工程概论\t45267\n代购网\t45268\n最近两天\t45269\n玄冬\t45270\n蜜桃\t45271\n心源性猝死\t45272\nc2b\t45273\ngardner\t45274\n啷个\t45275\n13个小时\t45276\nCool\t45277\n中委\t45278\n南京都市圈\t45279\nE63\t45280\nvcf\t45281\n加权平均数\t45282\n裴\t45283\n解维俊\t45284\nundead\t45285\n好酷\t45286\n绿水鬼\t45287\n分拨中心\t45288\n灵异网\t45289\nmanuka\t45290\nuibutton\t45291\n膝部\t45292\n中国人民大学后勤集团\t45293\n站队\t45294\n长疮\t45295\n中断点\t45296\n相门\t45297\n王鼎钧\t45298\n袁浩\t45299\nCkeditor\t45300\n电能量\t45301\nscaled\t45302\n四风扇\t45303\n不世\t45304\n蓝氏\t45305\nNada\t45306\n鼓足\t45307\nae白\t45308\n跟头\t45309\n预告登记证\t45310\n太子城\t45311\n扣押\t45312\n宁波大学\t45313\n根腐病\t45314\nnotice\t45315\n非婚\t45316\n第10代\t45317\n段友\t45318\nFlame\t45319\nGAR\t45320\n书摊\t45321\n特别行动\t45322\n辩证思维\t45323\n上航\t45324\n获益\t45325\n厚大\t45326\n诡道\t45327\n丽晶酒店\t45328\n蛆虫\t45329\n天津市高级人民法院\t45330\n集港\t45331\n14个小时\t45332\n极端值\t45333\n配婚\t45334\n苹果6p\t45335\n李帅\t45336\n中秋\t45337\n1064nm\t45338\n迷糊\t45339\n褒奖\t45340\n15周岁\t45341\n筱田佑\t45342\n唐强\t45343\n以太坊私有\t45344\n怪石\t45345\n硇洲岛\t45346\n李厚霖\t45347\n蔡卓妍\t45348\n轮眉\t45349\n大同煤矿集团\t45350\n中标记\t45351\n周笔爱因斯坦\t45352\n成都铁路学校\t45353\n0.07MB\t45354\nPie\t45355\n朗诗德\t45356\n斯巴达300勇士2\t45357\n汉阳\t45358\ninvitation\t45359\n水星网络\t45360\n性用\t45361\n商业史\t45362\n郑江\t45363\n玄外\t45364\n徐辛庄\t45365\n交国\t45366\n大恶\t45367\n斗罗大陆第一季\t45368\n罗琦\t45369\n自残\t45370\n银联国际\t45371\nnero\t45372\nWin94\t45373\n最美的期待\t45374\n防腐蚀\t45375\n私募资管\t45376\n下线\t45377\n村民委员会组织法\t45378\n乱性\t45379\n硬创\t45380\nHW\t45381\n萌出\t45382\nDeposits\t45383\ncoda\t45384\n人力资源管理毕业论文\t45385\n报名者\t45386\n上海隧道工程股份有限公司\t45387\n南京地铁1号线\t45388\n路易达孚\t45389\n五十克\t45390\n真分数和假分数\t45391\n此前\t45392\n功夫者\t45393\n中文社区\t45394\n水利工程专业\t45395\n中亭街\t45396\n通证\t45397\n发卡网\t45398\n林新\t45399\nFleece\t45400\n背板\t45401\n找寻\t45402\n学风\t45403\nmBot\t45404\n猫毛\t45405\n护士执业资格考试\t45406\nh
eaders\t45407\n魔掌\t45408\n用地\t45409\nNamecheap\t45410\n麻黄汤\t45411\nepik\t45412\n养生网\t45413\n湖南消防\t45414\n万宇在线\t45415\n狐臭\t45416\nArtificial\t45417\n防除\t45418\nIC芯片\t45419\n汤臣集团\t45420\n圣兽\t45421\n云台山景区\t45422\n豆福传\t45423\nimfilter\t45424\n清逸\t45425\n怪形\t45426\nLatest\t45427\n吊板\t45428\n郑小陌\t45429\n中智公司\t45430\n晴风\t45431\n_福彩_体彩_竞技彩\t45432\n公猪\t45433\n淫色\t45434\n锦湖\t45435\n下颌\t45436\n季莫申科\t45437\n飚车世界\t45438\n940m\t45439\n光纤配线架\t45440\n171路\t45441\n20170819\t45442\n黄曲霉\t45443\n新郑在线\t45444\n客币\t45445\n柿\t45446\n阿彩\t45447\n8代\t45448\n痴心不改\t45449\n青年路小学\t45450\n雄姿英发\t45451\n我要上春晚\t45452\n新白云机场\t45453\n凯锐\t45454\n王幸福\t45455\npjsip\t45456\nV880\t45457\n极品家丁吧\t45458\n镀彩锌\t45459\n打捆机\t45460\n碧翠丝\t45461\n重庆交大\t45462\n外平\t45463\n5000多个\t45464\nblues\t45465\n俄亥俄州\t45466\nww\t45467\noralce\t45468\n人物类\t45469\n佩特里\t45470\n下关市\t45471\n闻达\t45472\n2016年9月30日\t45473\n中华情\t45474\nrepodata\t45475\n道尔\t45476\n铜仁\t45477\n15万辆\t45478\n17关\t45479\n29周年\t45480\n短程\t45481\n哗啦啦\t45482\n妙趣\t45483\n山东大学附属中学\t45484\nmosquito\t45485\nWORDS\t45486\n电动清扫车\t45487\nKe\t45488\n州官\t45489\n星宇\t45490\n杨元庆\t45491\n分离器\t45492\n中山百姓网\t45493\n开能环保\t45494\n简行\t45495\n偷钱\t45496\n第二十三回\t45497\n紫之隧道\t45498\n笨笨-B站\t45499\n南博\t45500\n證\t45501\nP1106\t45502\n2018年4月3日\t45503\n首都医科大学附属北京潞河医院\t45504\n雅黑\t45505\n分赃\t45506\n阿姆斯特丹大学\t45507\nYY6080\t45508\n杨霞\t45509\ndiyu\t45510\nAudacity\t45511\n上世纪九十年代\t45512\n费城\t45513\nEXR\t45514\n郝明莉\t45515\n2016.3.10\t45516\n爱普生L310\t45517\n九纹龙\t45518\n万家灯火\t45519\n380TSI\t45520\nkmeans聚类算法\t45521\n撒气\t45522\n霞浦县\t45523\nAnyChat\t45524\n机板\t45525\n营长\t45526\nSexArt\t45527\nOpus\t45528\n坐怀不乱\t45529\n开漏\t45530\n阿芳\t45531\n前一小时\t45532\n石景山区\t45533\n千城\t45534\n人机大战\t45535\n戈尔贡\t45536\nWings\t45537\n南美侨报网\t45538\nimageCLASS\t45539\n帝国时代2高清版\t45540\n花草树木网\t45541\nSPV公司\t45542\neb5\t45543\n一瓶\t45544\n刻舟求剑\t45545\n1868年\t45546\n石门中学\t45547\n博科\t45548\n荣耀之战\t45549\n最中\t45550\n美寓\t45551\n菊花链\t45552\n哪壶\t45553\noceans\t45554\n0码\t45555\n猪场动力网\t45556\nmingli\t45557\n好酒网\t45558\n灰毛衣\t45559\nFr
anchise\t45560\n福州协和医院\t45561\n镇江火车站\t45562\n消谐\t45563\n恒昌小额贷款公司\t45564\n法叔\t45565\n无谋\t45566\nmatlab2016\t45567\n斗罗大陆\t45568\nnikia\t45569\nSECURE\t45570\n江苏省安监局\t45571\n物权\t45572\n封群\t45573\n002496\t45574\n横扫\t45575\n农家书屋\t45576\n京口瓜洲\t45577\n北京市丰台区人力资源和社会保障局\t45578\n滴水线\t45579\n行初\t45580\n李金龙\t45581\n外蒙\t45582\npyhton\t45583\ngerrit\t45584\n铠传\t45585\n龙岭\t45586\nA6\t45587\n赵牧\t45588\nChanger\t45589\nWMA格式\t45590\n觉\t45591\nISIS\t45592\n图片流\t45593\nlinux\t45594\nmenuitem\t45595\n模拟类\t45596\n4%\t45597\n保险费\t45598\n守卫剑阁神昏末劫\t45599\n上海古北\t45600\n机部\t45601\n无标\t45602\n阵灵\t45603\nStephan\t45604\n抹平\t45605\n爱为名\t45606\n燕子坞\t45607\n现场直击\t45608\nconcerning\t45609\n健身垫\t45610\n魔术师\t45611\n大事儿\t45612\n洪榕\t45613\n富士电机\t45614\n3524\t45615\n鲁亿通\t45616\n2016年06月\t45617\n单人房\t45618\nSD\t45619\nStarUML\t45620\n_富宝\t45621\n刘恺\t45622\n诉讼法\t45623\n妙术\t45624\n法拉奇\t45625\nteryx\t45626\n东京食尸鬼RE\t45627\nDIY小制作\t45628\n132\t45629\n圣元优博奶粉\t45630\n闪银\t45631\nAAX\t45632\n站稳\t45633\n韦恩图\t45634\n第152集\t45635\n雪鹰指染成婚\t45636\n滂沱\t45637\nYII2\t45638\n自然规律\t45639\nv2.6.1\t45640\n洁美\t45641\n厂屏\t45642\n水冷式\t45643\nbufferreader\t45644\n提携\t45645\n史记\t45646\n胸苷激酶\t45647\n新速腾\t45648\n双马尾\t45649\n精品打包网\t45650\n三门峡市人民政府\t45651\n一届\t45652\n地接\t45653\n西子情\t45654\n自问\t45655\noffice2017\t45656\nSyrup\t45657\nPatriot\t45658\n非晶合金\t45659\n屋外\t45660\n展团\t45661\n大通G10\t45662\n于和\t45663\n字条\t45664\n凌云路\t45665\n获客\t45666\n优美\t45667\nHOURS\t45668\n其四\t45669\n湖南省第六工程有限公司\t45670\nWin10浏览器\t45671\n秦末\t45672\n帝一\t45673\n抗裂\t45674\n武堂\t45675\n北京市计量检测科学研究院\t45676\n32A\t45677\n山峦\t45678\n萧山医院\t45679\n山西焦煤\t45680\n肉\t45681\nDigital\t45682\nMidiShow\t45683\n渡河\t45684\n命犯\t45685\n拿九稳\t45686\n0900\t45687\n安排表\t45688\nJeep自由光\t45689\n索超\t45690\n技击\t45691\n金山软件\t45692\n三鹿\t45693\n意林小小姐\t45694\n核桃仁\t45695\n永嘉路\t45696\n自攻螺钉\t45697\n今明两天\t45698\nsammy\t45699\n布莱恩\t45700\nFiltering\t45701\nChangYou\t45702\n今冬\t45703\n裁决者\t45704\nvipkid少儿英语\t45705\n富\t45706\nStatusBar\t45707\n巴比妥\t45708\n望江楼\t45709\n白鸟湖\t45710\n鞋模\t45711\n第七子\t4571
2\n荣光\t45713\n淋浴器\t45714\n长河\t45715\n迪米特洛夫\t45716\nWIX\t45717\nJJ比赛斗地主\t45718\n保肝\t45719\nWHQL版\t45720\nVolta\t45721\n五四路\t45722\n论辩\t45723\n泰勒展开式\t45724\nSi\t45725\n麻烦大\t45726\n达力士\t45727\n黄鹤楼\t45728\n中国建材家居网\t45729\n百八\t45730\n出口版\t45731\nv1.1_乐游网\t45732\n商务区\t45733\n氨基钠\t45734\nBud\t45735\n南卡\t45736\n金毅\t45737\nGhost中文网\t45738\n扶风县人民政府\t45739\n网销宝\t45740\n代表人物\t45741\n002422\t45742\n极右\t45743\n丝杠\t45744\n卒业\t45745\n168信息网\t45746\n霸主地位\t45747\n阿财\t45748\n織田真子\t45749\n荔\t45750\n锁卡\t45751\n全球通\t45752\n塔尺\t45753\n细带\t45754\n6幅\t45755\n特围\t45756\nSGM\t45757\nProperties\t45758\n康福\t45759\nQTV\t45760\n清科\t45761\nCGCS2000\t45762\n金山快译\t45763\n鼎城\t45764\nhttpget\t45765\n港澳\t45766\n万水千山\t45767\n传神\t45768\n碳酸氢盐\t45769\n绿洲花园\t45770\n枸橼酸\t45771\n点通\t45772\n德智体美\t45773\n一美\t45774\n西门\t45775\n32度\t45776\n少年儿童出版社\t45777\n低微\t45778\n预选\t45779\nexcel2015\t45780\n可溶性固形物\t45781\n反方\t45782\n手机稳定器\t45783\n今年2月份\t45784\n雷劫\t45785\n北极鱼\t45786\n熟客\t45787\n甲戌\t45788\n江景\t45789\n特设\t45790\n阿比特龙\t45791\n早产\t45792\ndiscovered\t45793\n贡士\t45794\n男工\t45795\n河北新闻网\t45796\n安徽电大\t45797\n宁德时代新能源科技股份有限公司\t45798\n江达\t45799\n消防应急灯\t45800\n异人族\t45801\n小雪性欢日记\t45802\n活现\t45803\n世界贸易组织概论\t45804\n霍娜\t45805\n5瓣\t45806\n一二季\t45807\n无醇啤酒\t45808\nLog\t45809\n密影\t45810\n南阳日报\t45811\n第92集\t45812\n夕阳山\t45813\n河东机场\t45814\n双膝\t45815\n李冰冰\t45816\n词坛\t45817\n水疗机\t45818\n争先恐后\t45819\n清水理纱\t45820\n广东队\t45821\n指掌\t45822\n禹卫\t45823\n钦点\t45824\n压力机\t45825\narticles\t45826\nMIRACLE\t45827\n出口加工区\t45828\n生蛇\t45829\n5月下旬\t45830\n龚嘉欣\t45831\n荒原\t45832\n专类\t45833\n韩美林\t45834\n360随身WiFi\t45835\n列名\t45836\n四环线\t45837\nNowadays\t45838\nhttpmime\t45839\n班德瑞\t45840\n终结者2审判日\t45841\npero\t45842\n日本国驻华大使馆\t45843\nMSO\t45844\nbre\t45845\n46层\t45846\n点漆\t45847\nOpenERP\t45848\n李磊\t45849\n保通\t45850\n书习题\t45851\n桥吊\t45852\ndiary\t45853\n大别山干部学院\t45854\nmhxx\t45855\n郑州市人力资源和社会保障局\t45856\nJobScheduler\t45857\n我是至尊\t45858\n作料\t45859\n初段\t45860\n4minute\t45861\n小仙女们\t45862\n菌棒\t45863\n抗拒\t45864\n逆战创世套\t45865\nAC220V\t45866\n我是歌手2\t45867\n
红茅药酒\t45868\nbadminton\t45869\nmuseum\t45870\n醉倒\t45871\n菟丝花\t45872\n平地机\t45873\n生长\t45874\n青羊区\t45875\n刘兴\t45876\n武鸣区\t45877\n2016—2017年度\t45878\nCoreData\t45879\n春江镇\t45880\n过午\t45881\n网球王子\t45882\n微百\t45883\n紫藤萝瀑布\t45884\n疯跑\t45885\n卢克尼\t45886\n文库下载器\t45887\nT580\t45888\nzll\t45889\n影视传媒有限公司\t45890\n150w\t45891\nLoadiine\t45892\n傅里叶变换\t45893\n中国毒理学会\t45894\n糯米粥\t45895\nII型\t45896\n牛仔帽\t45897\nLookbook\t45898\nmkj\t45899\n42届\t45900\n600485\t45901\n三雄\t45902\n西安交大附属二院\t45903\nSqlyog\t45904\nm1213\t45905\n诏安\t45906\nfriendly\t45907\n墨锭\t45908\n今天早晨\t45909\n毛领\t45910\n骛\t45911\n江湖救急\t45912\n第A12版\t45913\n真三国无双7:猛将传\t45914\nTempo\t45915\ndiscriminant\t45916\n江汉路步行街\t45917\n字段值\t45918\n太傻论坛\t45919\nxcode\t45920\n厂公\t45921\n中学生作文网\t45922\n六岁半\t45923\nSO\t45924\n天游峰的扫路人\t45925\nThere\t45926\n橄榄\t45927\n祥符区\t45928\n贺岁杯\t45929\n卤肉卷\t45930\n雷管\t45931\n第二十六章\t45932\n光贴\t45933\n包租婆\t45934\n服化\t45935\nmvc:annotation-driven\t45936\nwait\t45937\n手机套\t45938\n玄皓战记\t45939\ne430\t45940\n65nm\t45941\n航宇\t45942\ngsh\t45943\nbda\t45944\n幸福院\t45945\n陈瑶\t45946\npursue\t45947\n沉陷\t45948\n阿香婆\t45949\nguoqi\t45950\n2667749444\t45951\n当责\t45952\n多方法\t45953\n有关\t45954\n15个工作日\t45955\n1550010\t45956\navx2\t45957\nMasa\t45958\n集合竞价\t45959\n丽水\t45960\n潭酒\t45961\n备进\t45962\n冬夜\t45963\n富森\t45964\nHOME键\t45965\nUCG\t45966\n劲炫\t45967\n马尼\t45968\n55句\t45969\n有明\t45970\n乌贼娘2\t45971\n超纲\t45972\n72V\t45973\n荣家湾\t45974\n迅鹰\t45975\n万乘\t45976\ncu\t45977\n°\t45978\n自成\t45979\ncmbc\t45980\n304不锈钢板\t45981\n排药\t45982\n日志\t45983\nrenkai721\t45984\n版次\t45985\n齐云山\t45986\n郁欢\t45987\n信长之野望创造:战国立志传\t45988\ntwo\t45989\n99re8\t45990\n截表\t45991\n天门洞\t45992\n育儿大作战第一季\t45993\n静乐\t45994\n方法类\t45995\n透蜜\t45996\nsql_server\t45997\n期中考试卷\t45998\n暑\t45999\n奥日\t46000\n庸碌\t46001\n论事\t46002\n1.9.5\t46003\n扛把子\t46004\n威虎\t46005\n含糊不清\t46006\n为先\t46007\nSwanson\t46008\n牙膛\t46009\n族汉\t46010\n万达商铺\t46011\n网上银行百科-金\t46012\n胖丁\t46013\nAugmented\t46014\n钮承泽\t46015\nInterllij\t46016\nImplant\t46017\n新版新概念英语\t46018\nedas\t46019\nNottin
gham\t46020\n御品\t46021\nlsdyna\t46022\n密阳\t46023\n1话\t46024\nfns\t46025\n3dmax/Vray\t46026\n周行涛\t46027\n乾隆大藏经\t46028\nem\t46029\nkbj\t46030\nJRC\t46031\ndescribed\t46032\n广东省建设执业资格注册中心\t46033\n借号\t46034\nfile类\t46035\n金丝绒\t46036\n瑶海\t46037\n长安汽车长安欧诺\t46038\nfs7\t46039\n高利润\t46040\n老土\t46041\necholife\t46042\nPANASONIC\t46043\n44天\t46044\n9克\t46045\n正反向\t46046\n贾峪镇\t46047\n千骨\t46048\ntextureview\t46049\n自称\t46050\n基本国情\t46051\nmplayer\t46052\nparam-value\t46053\n哺乳照\t46054\n宜必思酒店\t46055\n安徽省投资集团\t46056\n何事\t46057\n三公主\t46058\n万利达\t46059\nParameters\t46060\n顶薪\t46061\n燕山教育委员会\t46062\n律子\t46063\n于冬\t46064\n屯留\t46065\n面前\t46066\n氯沙坦钾片\t46067\n风控\t46068\n梳毛\t46069\n布网\t46070\nws5200\t46071\n模魂\t46072\n金滩\t46073\n肥水\t46074\nsfp\t46075\n反照率\t46076\n5圈\t46077\n契证\t46078\nWeb认证\t46079\n珂卡芙\t46080\nwam\t46081\n关闭\t46082\n膻\t46083\n饲草\t46084\n大伯\t46085\n密山市\t46086\nnoYes\t46087\n忘年之交\t46088\n四间\t46089\nsman\t46090\n张德华\t46091\n针孔摄像机\t46092\n适用于\t46093\n巴拉格宗\t46094\nvijayfly\t46095\n田中圭\t46096\n程煜\t46097\n云资讯_中国IDC圈\t46098\n飞溅\t46099\ngliethttp\t46100\n机械工业第六设计研究院有限公司\t46101\n二小时\t46102\nilluminate\t46103\n今题\t46104\n监管层\t46105\n母子恋\t46106\n分针\t46107\n空距\t46108\n夜勤\t46109\n华颂\t46110\n猕猴桃\t46111\n胡搅蛮缠\t46112\n富民县\t46113\n密报\t46114\n武道大帝\t46115\n白纱\t46116\n青岛市市北国家税务局\t46117\nPicsart\t46118\n领受\t46119\n诺贝尔\t46120\n寺院\t46121\n文人画\t46122\n江户幕府\t46123\n沉降缝\t46124\n砌块\t46125\n姑夫\t46126\n大华股份\t46127\n家电费\t46128\n灰太郎\t46129\nCrab\t46130\ncookbook\t46131\n陆奥\t46132\n监测器\t46133\n刘明华\t46134\n调研表\t46135\nqaq\t46136\n黄杜蕾斯\t46137\nfairly\t46138\n勇者无畏\t46139\nlazarus\t46140\nsm3\t46141\ntclsh\t46142\n114分类信息网\t46143\n疯狂动物园\t46144\n蛟龙号\t46145\nsick\t46146\nsta\t46147\n候机室\t46148\n济南市\t46149\n散乱污\t46150\nlighter\t46151\n死而无憾\t46152\nnomic\t46153\n次干道\t46154\nHSV\t46155\n儿科学\t46156\nWebサイト\t46157\nソング\t46158\n_奇丽女性网\t46159\n安意如\t46160\n吴庄\t46161\n礼品机\t46162\n唐六典\t46163\n惠惠\t46164\n爱书\t46165\n农政\t46166\n曲安奈德鼻喷雾剂\t46167\nAJ32\t46168\n通泰\t46169\nGulf\t46170\n客制\t46171\n华夏名网\t46172\n试衣镜\t46173\nV2.4.
1\t46174\n乔治·克鲁尼\t46175\njanus\t46176\n就任\t46177\n香辣蟹\t46178\n山神\t46179\nsql盲注\t46180\n西湖国际城\t46181\nviolence\t46182\n余英男\t46183\n姚体\t46184\n中科院上海药物研究所\t46185\nfloor函数\t46186\n初级经济法\t46187\n铿锵\t46188\n网源\t46189\n罔\t46190\n比奇\t46191\nWifi信号\t46192\nHalal\t46193\n2000G\t46194\n转述\t46195\n瓦条\t46196\n双流航空港\t46197\n洮砚\t46198\n全册\t46199\n智能配电网\t46200\n终极斗士\t46201\n不锈钢网\t46202\n2720\t46203\n女政府办\t46204\n圣地亚\t46205\n持\t46206\notg\t46207\n补涨\t46208\n金蹴\t46209\n飘流\t46210\nTOM97番号网\t46211\n主题包\t46212\npaint\t46213\n最具有\t46214\nR3\t46215\n天猫转让\t46216\nmacqq\t46217\n20孔\t46218\n5448\t46219\n乐视x500\t46220\n后厂村路\t46221\nk95\t46222\n吹气\t46223\n光驱\t46224\n陆客\t46225\nGenomes\t46226\n大气垫\t46227\n管理咨询有限公司\t46228\nWarhammer\t46229\n马寨\t46230\nwendang\t46231\n黄金兽\t46232\n错了\t46233\n上海市医学会\t46234\n丁丑\t46235\nstaem\t46236\n天运\t46237\n以为\t46238\n搜狐焦点\t46239\n三权分置\t46240\nBVI\t46241\n盎然\t46242\nFan\t46243\n多知网\t46244\n罗炳辉\t46245\n福哥\t46246\n谷饲\t46247\n环孢素软胶囊\t46248\n齐乐\t46249\n天之骄子\t46250\n逐层\t46251\n滑动门\t46252\n苷类\t46253\n空气预热器\t46254\n燃油宝\t46255\n资产经营公司\t46256\nenhanced\t46257\n锦竹\t46258\n赋分\t46259\n南达科\t46260\n2924\t46261\n高亚麟\t46262\n120篇\t46263\nonclick\t46264\npendulum\t46265\n铝合金锭\t46266\n基金从业资格考试\t46267\nworkout\t46268\n中幅\t46269\n柳梢青\t46270\nMVC框架\t46271\n三十九度\t46272\n看戏机\t46273\nef6\t46274\n2018年2月1日起\t46275\n南天pr2\t46276\n胃转流\t46277\nEnvelope\t46278\nMiles\t46279\nlongest\t46280\nseq\t46281\n4g内存条\t46282\n江西省工商行政管理局\t46283\n等级证\t46284\n、\t46285\n书语\t46286\n八阵\t46287\n浙江萧山医院\t46288\n6棵\t46289\n出厂价格\t46290\nimove\t46291\nHID\t46292\nZAKER\t46293\n豚骨拉面\t46294\nDose\t46295\n建筑通\t46296\nfrontiers\t46297\n打夯机\t46298\n夺权\t46299\n艾迪西\t46300\nCODOL\t46301\n安趣网\t46302\n线面\t46303\n金荷娜\t46304\n室间隔缺损\t46305\n小狗狗\t46306\nMobileNet\t46307\nSOCO\t46308\n五河\t46309\n淄博站\t46310\n上海金山\t46311\n西数WD硬盘\t46312\n补品\t46313\nV3.0.0\t46314\n拿分\t46315\n驼人集团\t46316\n邵亦波\t46317\nXSplit、OBS\t46318\n博字\t46319\n消失\t46320\n实验动物学\t46321\n变废\t46322\n丙烯酸酯\t46323\niuu\t46324\n大碍\t46325\n关山牧场\t46326\n立威廉\t46327\n西峡县\t46
328\n博州\t46329\n平差\t46330\n性书大亨\t46331\n电环\t46332\nmariadb\t46333\n撸起袖子加油干\t46334\n黑龙江分校\t46335\n药膜\t46336\n贺荣\t46337\nlimiter\t46338\n最优解\t46339\n协整检验\t46340\n2015_凯风网\t46341\n上海交响乐团\t46342\n范德堡大学\t46343\n氧化酶\t46344\n祭奠\t46345\n凄惨\t46346\nOBD\t46347\n深圳市人民政府国有资产监督管理委员会\t46348\npromiere\t46349\n5月13号\t46350\n萨内\t46351\n第十三章\t46352\n2017日\t46353\n明牌\t46354\naj117\t46355\n有化\t46356\ncatalysts\t46357\nplt.scatter\t46358\n刘一明\t46359\n免登录\t46360\n静心咒\t46361\n自动贴标机\t46362\n吊唁\t46363\n赵盾\t46364\n安徽站\t46365\n乔丹鞋\t46366\n装饰花\t46367\n玩意\t46368\n灰鹦鹉\t46369\n辽台\t46370\n预\t46371\nyaca\t46372\n八步区\t46373\nTINYINT\t46374\n光华街\t46375\n三角阀\t46376\n古田路9号\t46377\n玉杯\t46378\n电力工程公司\t46379\n遇见你真好\t46380\n唇彩\t46381\n儒勒·凡尔纳\t46382\n云南省农业厅\t46383\n韫色\t46384\n穗莞深城轨\t46385\n20150420\t46386\n美好的回忆\t46387\n英格拉姆\t46388\nsymlink\t46389\n美图M8s\t46390\n没有然后\t46391\n溏心蛋\t46392\n石鸡\t46393\n香港时代广场\t46394\nxe8\t46395\n鸦羽\t46396\nsnmptrap\t46397\n频率分布\t46398\nelisa\t46399\n滩头\t46400\n抚摩\t46401\nAlive\t46402\nAxureR\t46403\n荆州市环境保护局\t46404\n石陨石\t46405\n乌鲁鲁\t46406\nleopard\t46407\n大陈镇\t46408\n别克汽车\t46409\n迅雷吧_\t46410\nperformances\t46411\nXZ1\t46412\n信息处理\t46413\n六字诀\t46414\nY75\t46415\nJSONP\t46416\n中铁大桥局集团第五工程有限公司\t46417\nPola\t46418\n商丘工学院\t46419\n本素\t46420\n土货\t46421\n面朝大海春暖花开\t46422\n卡马\t46423\n使命召唤5世界战争\t46424\n塔塔\t46425\n预铺\t46426\n好赌\t46427\n苗头性\t46428\n非农业户口\t46429\n羽衣狐\t46430\nkof\t46431\ntrinity\t46432\n太仓人才网\t46433\n土资\t46434\n索萨\t46435\nVacheron\t46436\n第12个\t46437\nRe\t46438\n水价\t46439\n渗水量\t46440\nfunded\t46441\n命卦\t46442\n农机360网\t46443\n同色\t46444\nUnderwater\t46445\n唢呐\t46446\n澉浦镇\t46447\n眉刀\t46448\n转嫁\t46449\n旭硝子\t46450\n中山南一路\t46451\n20150928\t46452\n导视\t46453\n储值卡\t46454\n奥克斯\t46455\n滑轮组\t46456\n江苏省建筑工程集团有限公司\t46457\n步云山\t46458\n石梁镇\t46459\n降号\t46460\n海底城\t46461\n鲁大师吧\t46462\n化工部\t46463\n除去\t46464\n白舟\t46465\n聊城一中\t46466\n开利中央空调\t46467\n东风汽车公司\t46468\n娇医\t46469\n_武山政府网\t46470\n淬火油\t46471\n写字台\t46472\nGHz\t46473\n前驱体\t46474\n每隔1小时\t46475\n误服\t46476\nmht\t46477\n谁\t46478\n双千\t46479\n一个30
岁\t46480\n画方\t46481\nMoshi\t46482\n你的孩子\t46483\n39分钟\t46484\n无锡江南大学\t46485\nBlush\t46486\n5103\t46487\n混帐\t46488\n三方协议\t46489\n梦馆\t46490\n天籁论坛_汽车之家论坛\t46491\n艾瑞咨询集团\t46492\nBOYLONDON\t46493\n煤制天然气\t46494\n丧权辱国\t46495\n琅邪\t46496\n郑超\t46497\n南都\t46498\n张明杰\t46499\n碰巧\t46500\n修谱\t46501\n抵销\t46502\n终极猎杀\t46503\n20180603\t46504\n海上生明月\t46505\n杨硕\t46506\n牛角\t46507\n王曼\t46508\n美分\t46509\n嘚啵嘚啵嘚\t46510\n罗托鲁瓦\t46511\n开庄\t46512\n奥克斯广场\t46513\nT139汽车改装网\t46514\n水晶石\t46515\n郑伯克\t46516\n德庄\t46517\n凯迈乐软件著作权登记网\t46518\n惊愕\t46519\n镍基\t46520\n好妞妞食品饮料招商网\t46521\n龙器\t46522\n刀男\t46523\nheade\t46524\n南天信息\t46525\n标集\t46526\nLED路灯\t46527\nlfa\t46528\n张丰\t46529\n头孢类\t46530\n校园文化\t46531\n阿加\t46532\n20170924\t46533\nFLYKNIT\t46534\n凯伍德\t46535\n20160714\t46536\nHiFi播放器\t46537\n5111\t46538\n起搏\t46539\n大嶝岛\t46540\n古剑奇谭吧\t46541\n安抚奶嘴\t46542\n4000mAh\t46543\n兰息\t46544\ncereal\t46545\nIPL5\t46546\n阳光课堂\t46547\nShun\t46548\nEasily\t46549\n那些事儿\t46550\n总厨\t46551\n刘诗诗\t46552\nHaus\t46553\n车载DVD导航\t46554\n金盆\t46555\n侠探杰克2\t46556\n对价\t46557\n根目\t46558\n无名指\t46559\n皮脂腺囊肿\t46560\n与学\t46561\nfckeditor\t46562\n数据服务器\t46563\n【日影\t46564\nJiang\t46565\nduoda\t46566\nProlific\t46567\n部类\t46568\n大连理工大学数学科学学院\t46569\n毂\t46570\n难\t46571\nISS\t46572\n200万\t46573\n地狱之路\t46574\n神医凰\t46575\n1671\t46576\n德化\t46577\n成人类\t46578\n课长\t46579\n广富林遗址公园\t46580\n港北\t46581\nNX2\t46582\n琴友\t46583\nGrimmmz\t46584\n安致网\t46585\n狂笑\t46586\n沢村麻耶\t46587\n应用商\t46588\n400加仑\t46589\n2o\t46590\n私分国有资产罪\t46591\n福州物流公司\t46592\nlsw\t46593\n赛拉图\t46594\n深感\t46595\n定边\t46596\n进退维谷\t46597\n曹健\t46598\n4322\t46599\n6月9号\t46600\n开题报告书\t46601\n上清液\t46602\n源代码包\t46603\n精神卫生\t46604\n南湖游乐园\t46605\n职达\t46606\n轮毅\t46607\nqq黄钻\t46608\n海葵\t46609\n瓦利\t46610\n老有\t46611\n蜗杆\t46612\n腾云\t46613\n12k\t46614\n磨盘\t46615\n饭岛\t46616\n三坐标测量机\t46617\n盯上\t46618\nPrivoxy\t46619\n党建之窗\t46620\n无儿无女\t46621\n茶饮\t46622\n七律长征\t46623\nsigmaplot\t46624\n试岗期\t46625\nmeida\t46626\n忠贞不渝\t46627\n慧博\t46628\n车尼尔\t46629\nappender\t46630\n盖世无双\t46631\nIKBC\t46632\nBlow\t46633\n结和\t46634
\nthe\t46635\n中国工商银行企业网上银行\t46636\n肘关节\t46637\n金钱世界\t46638\npaparazzi\t46639\n50公里\t46640\n渤海船舶职业学院\t46641\nsudu\t46642\nAllied\t46643\n钟灵\t46644\n盘亏\t46645\n吉视\t46646\ncoef\t46647\n乌盆记\t46648\n8600k\t46649\n妈妈的吻\t46650\n珞璜\t46651\n张一白\t46652\n炎片\t46653\n功夫熊猫1\t46654\n医用酒精\t46655\n十五期\t46656\nGLUT\t46657\n上帝\t46658\n孔洞\t46659\n装盒机\t46660\n德诚\t46661\n林萃路\t46662\n圆弧形\t46663\nPowerful\t46664\n王智慧\t46665\n收妖\t46666\n李洪涛\t46667\n华奥\t46668\n四川省普通话水平测试中心\t46669\n阿贡\t46670\n江山武侠世界大冒险\t46671\n超人气\t46672\n总李\t46673\n8GB\t46674\n碳基\t46675\n胜记\t46676\n72集\t46677\n倪净\t46678\n科创\t46679\nА\t46680\n几万\t46681\n愈伤\t46682\nOptima\t46683\n赢彩\t46684\n精分\t46685\ndispenser\t46686\n尘嚣\t46687\nboiled\t46688\n大营\t46689\n南京师范大学\t46690\n来意\t46691\n通孔\t46692\nminit\t46693\n根管治疗术\t46694\nxxb\t46695\nnumbered\t46696\nRESTFUL\t46697\n哈欠\t46698\n叠螺污泥脱水机\t46699\n魂断蓝桥\t46700\n两湖\t46701\n坐席员\t46702\n英卡洛斯\t46703\ndelsey\t46704\n偷油\t46705\n五星红旗迎风飘扬\t46706\n星期\t46707\n艺术人生\t46708\nDf\t46709\n猪圈\t46710\n狂野\t46711\nCircuit\t46712\n中国芜湖县政府\t46713\n健峰\t46714\n刊登\t46715\nCURL\t46716\n姓名权\t46717\n门柜\t46718\n衣袂\t46719\n港头\t46720\n菠萝格\t46721\n邵阳站\t46722\nxxw\t46723\n2.5mm\t46724\n百花香\t46725\n双币卡\t46726\nDBM\t46727\n比亚迪宋MAX\t46728\n易用宝\t46729\n金鹏信息\t46730\nsiu\t46731\n红警3起义\t46732\n减编\t46733\n藤黄\t46734\n冲剪机\t46735\n扩围\t46736\nimageclass\t46737\nm42\t46738\n有变\t46739\n晚上十点\t46740\nctrl+f\t46741\n风行云\t46742\n天空之眼\t46743\n成人性\t46744\n雷克萨\t46745\nsiri\t46746\n七封\t46747\n广东省司法厅\t46748\n对抗食\t46749\n读后感作文_小故事网\t46750\n40.0000元\t46751\n哄堂大笑\t46752\n曼扎\t46753\n苏比克湾\t46754\n通组\t46755\n套话\t46756\nTapas\t46757\nInSAR\t46758\n枝晶\t46759\n200千瓦\t46760\n京彩\t46761\n虚拟运营商\t46762\n75届\t46763\n停战协定\t46764\nIwara\t46765\n宿州信息网\t46766\noacle\t46767\notk\t46768\n射频卡\t46769\n揭东区\t46770\n精达股份\t46771\n致命错误\t46772\n中关村二小\t46773\nSimulations\t46774\n刘也\t46775\nSQLyog\t46776\n折扣\t46777\nIntra\t46778\n白洁\t46779\n106部\t46780\n品物\t46781\n预定义\t46782\n职业观\t46783\n湘教通\t46784\n联发集团\t46785\n嘀嘀虎\t46786\n正序版\t46787\n王洛宾\t46788\nBT|\t46789\nELEVEN\t467
90\n一个半月\t46791\nChinaNet\t46792\n芝麻信用\t46793\n帕萨克莱斯勒300c\t46794\n栋仁\t46795\n鹿寨\t46796\n陪吃\t46797\nstartups\t46798\n薇姿会员中心\t46799\nresult\t46800\ntopless\t46801\n居民们\t46802\n中华中医药杂志\t46803\nber\t46804\njavlibrary\t46805\n超极本\t46806\n奔驰e级\t46807\n火地晋\t46808\n41张\t46809\n朝阳政府\t46810\nCKFinder\t46811\n怪物猎人综合\t46812\n俞逊\t46813\n软泥怪\t46814\n近5年\t46815\n永久保存\t46816\n300cm\t46817\n三引号\t46818\n杜国楹\t46819\n画史\t46820\n五菱之光论坛_汽车之家论坛\t46821\n阿迪\t46822\nRTP\t46823\n智利\t46824\n喜丧\t46825\nPUSSY\t46826\n管闲事\t46827\nSupervised\t46828\n委托代征\t46829\n南方都市报\t46830\nflops\t46831\nPARIS\t46832\n两中\t46833\n感觉卡\t46834\nnodata\t46835\n龙虎争霸2\t46836\n口眼\t46837\n西华县\t46838\n铭铭\t46839\n漯河市委\t46840\n阻燃管\t46841\n工间操\t46842\nsloggi\t46843\n8.8分\t46844\n退让\t46845\nscores\t46846\n斗战胜\t46847\n凹陷处\t46848\nStudio2013\t46849\nSama\t46850\n五套\t46851\n五氧化二钒\t46852\n神舟十一号飞船\t46853\npretend\t46854\n国网湖南省电力有限公司\t46855\n露天\t46856\n氢醌乳膏\t46857\n翻糖\t46858\n站外\t46859\n陇县人民政府\t46860\n可申购\t46861\n王小凤\t46862\n末地传送门\t46863\nlast\t46864\nreduce\t46865\nrapidjson\t46866\n灯管\t46867\n养老保险金\t46868\n加油泵\t46869\n苦苣苔科\t46870\n霞龙\t46871\n廊坊一中\t46872\n亚东\t46873\n人在江湖\t46874\n卓信\t46875\n郑愁予\t46876\n去吧皮卡丘_九游论坛\t46877\n美国乡村\t46878\n桑桑\t46879\nsysout\t46880\n英国南安普顿大学\t46881\n井式\t46882\n开面\t46883\n不治\t46884\n房屋预售证\t46885\n夜未眠\t46886\n凌云山\t46887\n有线条\t46888\n微山县\t46889\n工业和信息化部电子第五研究所\t46890\nautofocus\t46891\nvbs吧\t46892\n三互\t46893\n官车\t46894\n留无意\t46895\n十渡\t46896\n农夫山泉茶\t46897\n潘总\t46898\ndreamscene\t46899\n万古仙穹最新章节_万古仙穹\t46900\n工伤认定申请表\t46901\n上海韵达快递\t46902\n珠圆玉润\t46903\n前后级\t46904\nZHE\t46905\nkissy\t46906\n鱼水情\t46907\n锦秀\t46908\n苏\t46909\n雅高酒店\t46910\n义演\t46911\n软起动器\t46912\n漫步者s1000\t46913\n奥迪RS\t46914\n本科\t46915\nMKV格式\t46916\n郭小舟\t46917\n幂运算\t46918\n刷脸\t46919\n东莞石排\t46920\n刘文龙\t46921\n通信ic\t46922\n褪\t46923\n泰禾集团股份有限公司\t46924\n月度\t46925\nanh\t46926\nifdown\t46927\n咸阳火车站\t46928\ntl-wn725n\t46929\nProBook\t46930\n溴苯\t46931\n鱼台县\t46932\nDNF艾肯\t46933\nmapreduce\t46934\n取子\t46935\nNikita\t46936\njgw\t46937\n牛肉卷\t46938\n岛頔\t46939\
n工团\t46940\n脑死亡\t46941\nmul\t46942\n六艺\t46943\n500g\t46944\n晶相\t46945\n温带季风气候\t46946\n福新路\t46947\n判空\t46948\n亿万第三季\t46949\n简式\t46950\nACM\t46951\nVB6\t46952\n铝合金压铸件\t46953\n新工艺\t46954\n光福\t46955\n申士\t46956\n皮草\t46957\n中纺联\t46958\n拧\t46959\n楔带\t46960\nBayer\t46961\n火星文\t46962\n套袋\t46963\n便签\t46964\n西西\t46965\n元吉\t46966\n迷惘\t46967\n四川招商网\t46968\n奔驰唯雅诺\t46969\n110国道\t46970\ncto\t46971\n002402\t46972\n出狱\t46973\n小米云\t46974\n1.4301\t46975\n猫宁\t46976\n锦绣大道\t46977\n3609\t46978\n雨臣\t46979\n十二点\t46980\nZT\t46981\n卸妆棉\t46982\n砂轮片\t46983\n秦安政府网\t46984\n胡服\t46985\n平半\t46986\n168g\t46987\n中国矿业大学\t46988\nweinre\t46989\n化淤\t46990\nUPVC\t46991\n奇幻片\t46992\nmba吧\t46993\n北斗导航卫星\t46994\n妓\t46995\nimpress\t46996\n萌发\t46997\n管定\t46998\n行吊\t46999\n中交二公局\t47000\n金融大厦\t47001\n微信支\t47002\n十八天\t47003\n遵义医学院\t47004\n我委\t47005\n城剑\t47006\n塑料袋\t47007\n鲜爽\t47008\n诛仙2\t47009\n华尔街见闻\t47010\n高强板\t47011\nRestTemplate\t47012\n骨垢\t47013\n口感\t47014\n第32次\t47015\n第次\t47016\nincluding\t47017\n青云志2\t47018\nETag\t47019\nsunscreen\t47020\nThyroid\t47021\ntomacat\t47022\n兰蔻奇迹\t47023\nfooterView\t47024\n笨鸡\t47025\n佛山市公安局\t47026\n上海新时达电气股份有限公司\t47027\n5号位\t47028\n西行寺\t47029\n怪盗joker\t47030\ny480\t47031\n377\t47032\n中国惠普有限公司\t47033\n中山詹园\t47034\n暗黑3传奇宝石\t47035\n360.com\t47036\nFLIP\t47037\n虫病\t47038\nIvy\t47039\n山水林田湖\t47040\n驴头\t47041\n亲热\t47042\n给子\t47043\n九条命\t47044\nflypig\t47045\n四川大学建筑与环境学院\t47046\n铜阀门\t47047\n熊男\t47048\n狐友\t47049\nI3\t47050\nfreeswitch\t47051\n天坑地缝\t47052\n1111111\t47053\n劫后\t47054\n壁立千仞\t47055\nCorelCAD\t47056\n聚氨酯冷库板\t47057\n岳城\t47058\nbufferedimage\t47059\n5577我机网\t47060\n肉版\t47061\n怪物猎人世界上位\t47062\n胎位\t47063\n载脂蛋白\t47064\n快递网\t47065\nBona\t47066\n三才\t47067\n玛丽莲梦露\t47068\n第二十八年\t47069\nsewing\t47070\nreol\t47071\narma3\t47072\n陀地驱魔人\t47073\n悦黑\t47074\ncaprice\t47075\nlaurent\t47076\n歌唱二小放牛郎\t47077\n剽悍\t47078\n心光\t47079\n大路镇\t47080\nfarpoint\t47081\n7puls\t47082\n顺反异构\t47083\nwinnt\t47084\nalbion\t47085\n京东公司\t47086\n最后一击\t47087\n男法\t47088\n电煤\t47089\n扭一扭\t47090\nhob\t47091\nz470\t4709
2\nPlacement\t47093\n黄燕铭\t47094\n武墓\t47095\n扩路\t47096\n钱佳\t47097\n12月21日\t47098\n华为M3\t47099\n麦达数字\t47100\n南宁火车站\t47101\nCheng\t47102\n威震\t47103\n鉴证实录\t47104\n全面战争幕府将军2\t47105\n家庭装修\t47106\n泥坑\t47107\n8810\t47108\n切缝机\t47109\n排脓\t47110\n永嘉中学\t47111\n月落无声网\t47112\n模拟法庭\t47113\n贺老爷\t47114\nsuits\t47115\n第149号\t47116\n落箱\t47117\n华信信托\t47118\n微商\t47119\n最终幻想7重制版\t47120\n连平县\t47121\n4-5\t47122\n易耗品\t47123\n30分\t47124\n塞纳河路\t47125\ndisruptor\t47126\n双南\t47127\n小碗\t47128\n兰州交通大学博文学院\t47129\n舞堂\t47130\n五一劳动节\t47131\n青发\t47132\nh2z\t47133\n杨维\t47134\nSublimeText3\t47135\n木天蓼\t47136\n走的路\t47137\n整体\t47138\n海运费\t47139\n赤玉土\t47140\nFace\t47141\nImpala\t47142\nrocomp\t47143\n中海康城\t47144\nwin10cad\t47145\n诗心\t47146\n孤岛危机5\t47147\n站名\t47148\n2016年1月21日\t47149\nGrannies\t47150\nWORD,EXCEL\t47151\n孝亲\t47152\n写发\t47153\n三支一扶考试网\t47154\n冬蜜\t47155\n47号\t47156\n高考卷\t47157\nRGB灯\t47158\ntouchbar\t47159\n是因为你不懂\t47160\n斗破苍穹第二季\t47161\n應\t47162\n秦师\t47163\nArchitecture\t47164\n去处\t47165\n尚峰\t47166\n中公省\t47167\nneato\t47168\n启铭星\t47169\n金蚂蚁\t47170\n游牧民族\t47171\n看台\t47172\n爱你一生\t47173\n过堂\t47174\nzuoshi\t47175\n七龙\t47176\n深圳中心\t47177\nradioButton\t47178\n增值额\t47179\n8813\t47180\n小荧星合唱团\t47181\n周角\t47182\n系表\t47183\nhibernate-validator\t47184\n英才网联\t47185\n桂小太郎\t47186\n一学一做\t47187\n森林人论坛_汽车之家论坛\t47188\ndiag\t47189\n唐宁府\t47190\nautosize\t47191\nborough\t47192\n激光跟踪仪\t47193\n中国信息协会\t47194\n翻门\t47195\n60k\t47196\n毛军\t47197\n无线电管理局\t47198\n小型\t47199\n疏证\t47200\n出入境管理处\t47201\n缉毒犬\t47202\n兴泉铁路\t47203\n2015cc\t47204\n沉板\t47205\nwebapi\t47206\n喵特\t47207\n金辰\t47208\nMacKeeper\t47209\n赏梅\t47210\n珠海移动\t47211\nオリジナル\t47212\n官亭\t47213\nx86\t47214\n曼省\t47215\n资料盒\t47216\n固有一死\t47217\nacu\t47218\nPhonics\t47219\n格莱圈\t47220\n人兽恋\t47221\n傅里叶逆\t47222\n领航版\t47223\n泰特\t47224\n特雷克斯\t47225\n小米5S\t47226\nOn5\t47227\n中国少年儿童新闻出版总社\t47228\n丰富\t47229\ndejavu\t47230\n2018年1月7日\t47231\nbetterscroll\t47232\n张小波\t47233\n0#\t47234\n公积金联名卡\t47235\n一锤定音\t47236\n无尘车间\t47237\n新华人寿保险\t47238\n少年时代\t47239\n打工者\t47240\n黄河电视台\t4724
1\n顶山街道\t47242\n无纺\t47243\n金杜研究院\t47244\nnohub\t47245\n艳阳天\t47246\n梦幻模拟战3\t47247\n托尔多\t47248\nmakemigrations\t47249\n西药师\t47250\nYounger\t47251\n胡峰\t47252\n拉吉\t47253\n阳西县人民政府\t47254\n金扫帚奖\t47255\n卧龙山\t47256\n20180117\t47257\n正本清源\t47258\nE440\t47259\n千屈菜\t47260\n1190\t47261\n阿卡波糖片\t47262\n宫理惠\t47263\nbushang\t47264\n胡萝卜素软胶囊\t47265\n甲戌年\t47266\n临期\t47267\n旮旯\t47268\nMoncler\t47269\n桃仙\t47270\nA2DP\t47271\n学习成果\t47272\n雷兽\t47273\n涛哥\t47274\n徐福记\t47275\n博云新材\t47276\n我想和你唱第二季\t47277\n19公里\t47278\n分机号\t47279\nbreak\t47280\n华泽钴镍\t47281\n刑事责任\t47282\n下疳\t47283\n毛笋\t47284\n炽翎\t47285\n神明之胄\t47286\n婚女\t47287\n永恒之树\t47288\n彼方\t47289\n内房\t47290\n88微拍\t47291\n工程信息网\t47292\n天若有情\t47293\n特玩坦克世界\t47294\n王豆豆\t47295\n卓越教育\t47296\ngnome,kde\t47297\n连善\t47298\n瓶颈\t47299\n深圳市人民政府应急管理办公室\t47300\n世界微笑日\t47301\n飞丝\t47302\nrx100\t47303\n现代信号处理\t47304\n禁购\t47305\n大同站\t47306\n000日元\t47307\n鳌山卫\t47308\nmedia\t47309\n10张\t47310\nexit\t47311\ngraduate\t47312\n翻阅\t47313\n3638\t47314\n阿宝\t47315\n伊萨\t47316\n安徽电视台\t47317\n罗生堂\t47318\n畅途网\t47319\nEshop\t47320\n新天龙\t47321\n横山\t47322\n战卡\t47323\nVMWARE\t47324\n12100\t47325\n布米米\t47326\nTechSnail\t47327\n200_\t47328\nB+树\t47329\n仙林\t47330\nhata\t47331\n20MW\t47332\nstx\t47333\n23次\t47334\n刘东华\t47335\nconv2\t47336\n茶琉\t47337\n北斗星X5\t47338\nmy.cnf\t47339\n谍影重重1\t47340\n长沙南\t47341\n骆\t47342\n第4_\t47343\n154号\t47344\n毕业论文引言\t47345\n内出\t47346\n海森矩阵\t47347\n外贸版\t47348\n仿形机\t47349\nmsd\t47350\nCelebrities\t47351\n中国华电集团公司\t47352\n黎原创\t47353\n详评\t47354\n油乎乎\t47355\n死印\t47356\n蜿蜒\t47357\n双下巴\t47358\n泰康保险集团股份有限公司\t47359\n开疆拓土\t47360\n金山新城\t47361\nMF4752\t47362\n北京万科\t47363\nENVI\t47364\n雪铲\t47365\n焚影\t47366\nShader篇\t47367\nx-5\t47368\n李世宏\t47369\nneedle\t47370\nWMTS\t47371\n重来\t47372\n圆锯\t47373\n仓边路\t47374\nDock\t47375\n2001级\t47376\n改变者\t47377\n192.168.0.100\t47378\n马六甲板\t47379\n享通\t47380\nhtaccess\t47381\n杨晓超\t47382\nencourage\t47383\n密码学\t47384\n四价\t47385\n6.0.8\t47386\n绿茶婊\t47387\n夸美纽斯\t47388\n珍娜\t47389\n成达\t47390\n之外\t47391\n免税区\t47392\n顾凯\t47393\n济南消防\t473
94\n高庄镇\t47395\n桂鱼\t47396\nvast\t47397\n阻值\t47398\n氧运动\t47399\n诺心\t47400\n第十八\t47401\n儿童舞\t47402\n级配碎石\t47403\n0755|\t47404\n华资\t47405\nPhoto\t47406\n班室\t47407\n太阳系八大行星之一\t47408\n准通\t47409\n青园街\t47410\n压手\t47411\n知识产权质押贷款\t47412\n阿坝州\t47413\n林雄\t47414\n自裁\t47415\n跨境电商综试区\t47416\n制罐\t47417\n古畑任三郎\t47418\nDIRECTORY\t47419\n涉密\t47420\nJobs\t47421\ni52400\t47422\n算命婚姻\t47423\n瓜迪奥拉\t47424\n宁海西\t47425\n盐场\t47426\n亿丰时代广场\t47427\n水原希子\t47428\n日网\t47429\n澳信\t47430\n电击文库\t47431\n痰液\t47432\n爱不得\t47433\n日产阳光\t47434\n狐尾藻\t47435\nscite\t47436\nCGCS2000坐标系\t47437\n刘逸云\t47438\n老衲\t47439\n阳江市住房和城乡规划建设局\t47440\n植物大战僵尸2_九游论坛\t47441\nodi\t47442\nSAFETY\t47443\nphix\t47444\n林兆华\t47445\n6.1%\t47446\n闇\t47447\n畅通\t47448\nbug\t47449\n第十八期\t47450\n神题\t47451\nicoc\t47452\n宋娟\t47453\n小利\t47454\nheadquarters\t47455\n包装纸盒\t47456\nhaokan\t47457\n巴哈伊\t47458\n高篇\t47459\n科勒\t47460\nmacos\t47461\n黄图\t47462\n蝇贪\t47463\nNo.7\t47464\n教训\t47465\n桃运村\t47466\n鸡西\t47467\n泉州银行\t47468\n省国土资源厅\t47469\nmpirun\t47470\n中国公路网\t47471\n宝景\t47472\n窝棚旅游网\t47473\n吉人\t47474\nlecturer\t47475\n2012年10月\t47476\n&#61481\t47477\njyl\t47478\nOPPOR11s\t47479\n杀熟\t47480\n内蒙古教育厅\t47481\n陈学东\t47482\n上部\t47483\n8440p\t47484\n发绿\t47485\n酷币\t47486\n米月\t47487\n金爵\t47488\n20170513\t47489\n5551\t47490\n贝立安\t47491\n铁锰\t47492\n万名\t47493\n纺车轮\t47494\ntaylor\t47495\n掌心\t47496\ntaa\t47497\n货币史\t47498\nHKT\t47499\nBASE\t47500\n1246\t47501\n最佳酒店\t47502\n调试篇\t47503\n佐佐\t47504\n盘古开天\t47505\ncolours\t47506\n歌乐\t47507\n横坑\t47508\n王岑\t47509\n熊出没全集\t47510\nStrokes\t47511\nAi502胶水\t47512\n佩里\t47513\n皮肤衣\t47514\nzzf\t47515\n羟基丙烯酸树脂\t47516\n刀子\t47517\nevaluating\t47518\n文物展\t47519\n银花\t47520\n骨扫描\t47521\n周洁\t47522\n城乡居民大病保险\t47523\ntexture\t47524\n感天动地\t47525\nNMD\t47526\n廖序东\t47527\nCX-4论坛\t47528\n题栏\t47529\n四余\t47530\n高窗\t47531\n福星盈门\t47532\nNotepad++\t47533\n数模转换\t47534\n见利忘义\t47535\n再点\t47536\n小楼昨夜又东风\t47537\n怪物猎人3g\t47538\nshifting\t47539\ncitibank\t47540\n过失\t47541\n字博缘文学网\t47542\n政协常委会\t47543\n存档\t47544\n菜菜网\t47545\n拉拉勾旅游网\t47546\n2000张\t47547
\n李蕙敏\t47548\n陈伯\t47549\nSVM分类器\t47550\n常熟东南开发区\t47551\n试孕\t47552\n高逸泰\t47553\n晓港\t47554\ncron表达式详解\t47555\n安徽海螺水泥股份有限公司\t47556\n北京航空\t47557\n偶发性\t47558\n双开\t47559\n希尔顿\t47560\n罗源\t47561\nbroadcastreceiver\t47562\n扑翼机\t47563\n渔山\t47564\nlabwindows\t47565\n舰艇\t47566\n易修罗\t47567\n衣板\t47568\n国拍\t47569\n高型\t47570\n潇雨\t47571\n奥美拉挫\t47572\n熟妻\t47573\n纯度\t47574\n正和\t47575\n巴啦啦小魔仙之飞越彩灵堡\t47576\nimba吧_\t47577\n淘宝天猫店\t47578\nWIN7旗舰版\t47579\n链圈\t47580\n薪金\t47581\n张和平\t47582\n哈药\t47583\n契增\t47584\n戈仑\t47585\nM32\t47586\n前列地尔\t47587\n去者\t47588\nGaga\t47589\n三世情缘\t47590\n封闭针\t47591\n第十版\t47592\npc平板二合一\t47593\n玉剑\t47594\n95家\t47595\n凤凰科技\t47596\n义乌政府网\t47597\nFaron\t47598\n电动独轮车\t47599\n储存期\t47600\n简欧\t47601\nCompared\t47602\n成都植物园\t47603\n画皮网\t47604\n苋菜\t47605\n七大员\t47606\n克罗姆\t47607\n清清\t47608\n老琴\t47609\n剩余价值\t47610\n金盆洗手\t47611\n葆苾康\t47612\n田甜\t47613\n周娜\t47614\n观音瓶\t47615\n芦苇荡\t47616\n鸣凤\t47617\nConcurrency\t47618\nhd\t47619\nREPO\t47620\n反尘\t47621\n中文台词网\t47622\npasses\t47623\n国际章\t47624\npola美白丸\t47625\n661\t47626\n3000瓦\t47627\n乐歌股份\t47628\n金属色\t47629\n再临\t47630\n辽宁省政府\t47631\n珠海市香洲区人民法院\t47632\n瑞丰银行\t47633\n比亚迪唐吧\t47634\n6splus\t47635\n果仁网\t47636\n金在焕\t47637\n图书馆学\t47638\n量体裁衣\t47639\nbeatsx\t47640\n移除\t47641\n荣耀3X畅玩版\t47642\n1R\t47643\nXXOO\t47644\n义和团运动\t47645\n枫林绿洲\t47646\nautopep8\t47647\nantimicrobial\t47648\n肉嫁高柳家\t47649\nMasturbating\t47650\n张新宝\t47651\n娑娜\t47652\n王剑波\t47653\n内蒙古自治区环境保护厅\t47654\n东风导弹\t47655\nmeier\t47656\n一月多\t47657\n爱的多米诺\t47658\nModeling\t47659\n余佳文\t47660\n许美静\t47661\n7.72亿\t47662\n亚亚图雷\t47663\nunified\t47664\n尊号\t47665\niou\t47666\n兽王猎人\t47667\n长白街\t47668\n导向片\t47669\n漫锋网\t47670\n扑克人\t47671\n热情洋溢\t47672\n青森\t47673\n本杰明\t47674\nPoloniex\t47675\nx5pro\t47676\n通考\t47677\n两颗\t47678\n堕落\t47679\n解析式\t47680\n2.2.15\t47681\nEdge+\t47682\n千叶双子\t47683\n人物篇\t47684\n龙口\t47685\nkorg\t47686\n华泾\t47687\n广西电力职业技术学院\t47688\n肉奴\t47689\n养生术\t47690\n油油\t47691\ngameofmir\t47692\nPOM\t47693\n起垄\t47694\n雪龙\t47695\n皱褶\t47696\n傲农生物\t47697\n迷雾\t47698\n朝读\t47699\n29年\t4
7700\n压膜\t47701\n商周\t47702\n宏旺\t47703\n离合器油\t47704\n酷辣虫\t47705\n360安全防护中心\t47706\n墨韵\t47707\nboombeach\t47708\n玉体\t47709\nchp\t47710\nmsvcr100\t47711\n林见清\t47712\n乞巧\t47713\n影音先锋伦理片_XFPLAY先锋影音\t47714\n北京盈科律师事务所\t47715\n海藻\t47716\n后侧\t47717\n艾莉亚\t47718\nfurnace\t47719\n方谬\t47720\n保荐代表人胜任能力考试\t47721\nStocks\t47722\n马可波罗网\t47723\n智能车\t47724\n书客\t47725\n南方基金管理有限公司\t47726\n原貌\t47727\n中华人民共和国合同法释义\t47728\nscatter\t47729\n汉化+\t47730\n多肉土\t47731\n欧元区\t47732\nSynchronization\t47733\n300件\t47734\n嘉佑\t47735\n收集垫\t47736\nOSG\t47737\n国瓷\t47738\n低蛋白血症\t47739\n3万平米\t47740\n20150227\t47741\n四川大学华西妇产儿童医院\t47742\n精锐\t47743\n陕西联通\t47744\n大卫路易斯\t47745\n绿野\t47746\n安托兰\t47747\nFLY\t47748\n特大型\t47749\n殡葬业\t47750\n斐尔可\t47751\n守望的天空\t47752\nOneKey\t47753\n跟骨骨折\t47754\n魔枪\t47755\n交易群\t47756\n小咪\t47757\nConcerts\t47758\nM.171ZZ.COM\t47759\n14c\t47760\nb卷\t47761\n投资性房地产\t47762\nexec函数\t47763\n郑州工商学院\t47764\n文山市人民政府\t47765\n遗落\t47766\n劲头\t47767\n济南工程职业技术学院\t47768\n青云QingCloud\t47769\n如影\t47770\n乱闪\t47771\n17173激战2\t47772\nWeb编程\t47773\n棉籽\t47774\n引航员\t47775\nfeiq\t47776\n老桥\t47777\n灼眼的夏娜\t47778\n木雕\t47779\n当前年月日\t47780\n伪物语\t47781\n出气筒\t47782\n沙龙\t47783\nLake\t47784\n9158\t47785\n更新换代\t47786\n措词\t47787\n暴风科技\t47788\n荧屏\t47789\n统率\t47790\n霍达\t47791\n长沙市纪委\t47792\n国共内战\t47793\n快门\t47794\n东莞银行\t47795\n迪奥西斯\t47796\n1.37\t47797\n万石植物园\t47798\n数十名\t47799\n对付\t47800\n英语知识\t47801\n历城二中\t47802\n迅雷高清BT\t47803\n5月23日\t47804\n21岁\t47805\n非国有\t47806\n还珠格格3\t47807\n凝荷\t47808\n阿里大宝卡\t47809\n硬功\t47810\ncredentials\t47811\n一拖五\t47812\n沪江育儿网\t47813\n太阳镇\t47814\n女街霸\t47815\n紫癜肾炎\t47816\n0.26\t47817\n0756\t47818\n申鹭达\t47819\nMacan论坛_汽车之家论坛\t47820\n120名\t47821\n三石\t47822\n075\t47823\n独立版\t47824\n3111\t47825\n大昌\t47826\n二氧化碳传感器\t47827\n绿港\t47828\n碧桂园\t47829\n专管\t47830\n百佳学习邦\t47831\n心经\t47832\n进给\t47833\n核桃苗\t47834\n荃湾区\t47835\n踞\t47836\n雄大\t47837\n北京海淀\t47838\nWSL\t47839\n在脚下\t47840\n德哈卡\t47841\n盖德化工网\t47842\n倒车摄像头\t47843\n雨刷片\t47844\n曲梁镇\t47845\n小儿神经内科\t47846\n李雨儿\t47847\n晋江经济报\t47848\n足癣\t47849\n鬼节\t47850\nemb\t4
7851\n徽墨\t47852\n建芳\t47853\n挂壁式\t47854\n中国人寿财险\t47855\n游卡桌游\t47856\n金融部\t47857\n10组\t47858\n礼器\t47859\n中央新城\t47860\n应用化学\t47861\n碎裂\t47862\n小岞\t47863\n柯克\t47864\n纯粮酒\t47865\n绘图板\t47866\n万科海港城\t47867\ntaskset\t47868\nGApps\t47869\n一汽富维\t47870\n捷恩斯\t47871\n张角\t47872\n相对而言\t47873\n五七\t47874\n啥\t47875\ncyp\t47876\n51视频学院\t47877\n橙子味\t47878\n熊二\t47879\n牛萌萌\t47880\n泸沽湖\t47881\ninta\t47882\n刘子业\t47883\n60名\t47884\nrsources\t47885\n璀璨人生\t47886\n楚汉骄雄\t47887\n林花谢了春红\t47888\nC6000\t47889\n主題\t47890\n布尔玛\t47891\n小月月\t47892\n红蜻蜓\t47893\n栗子树\t47894\n多支\t47895\n宋庄\t47896\n佩蒂特\t47897\n负值\t47898\n刑事附带民事诉讼\t47899\n批准文号\t47900\n诚智\t47901\n光度\t47902\n青岛小区\t47903\n瑞凤\t47904\n剪钳\t47905\n10千克\t47906\nevo\t47907\n东方明珠售楼小姐\t47908\n中公教师网\t47909\n7.8级\t47910\n欧松板\t47911\nfiio\t47912\n斜纹\t47913\n罗马帝国艳情史\t47914\nz23\t47915\n臭氧水\t47916\n生产家\t47917\n遥\t47918\nansel\t47919\n手腕腱鞘炎\t47920\n天钥桥路\t47921\n绿软基地\t47922\nend键\t47923\nbrace\t47924\n子阳\t47925\n棕地\t47926\n中国皮书网\t47927\n守号\t47928\n黑暗君主\t47929\nAccept\t47930\n惠州供电局\t47931\njojo\t47932\n压灌\t47933\n无损音乐\t47934\n┌PS\t47935\n小阿俏\t47936\n校网\t47937\nMiro\t47938\n2.4.0\t47939\n孟山都\t47940\n吃逛\t47941\n敏儿\t47942\nCellar\t47943\n6秒钟\t47944\n深圳人才网\t47945\n园艺学\t47946\n自考_无忧考网\t47947\n戏腔\t47948\nENERGY\t47949\n主分区\t47950\n无字天书\t47951\n安徽农金\t47952\n人参\t47953\n美食团购网\t47954\n董峰\t47955\n撸撸撸\t47956\nSS小燕之夜\t47957\nsingh\t47958\nUIPicker\t47959\n万科东荟城\t47960\n遇害案\t47961\n十二夜\t47962\nptt\t47963\n理查兹\t47964\n借出\t47965\n重庆市市\t47966\n半半\t47967\nJess\t47968\n南京市公共资源交易中心\t47969\n陕京\t47970\nleessang\t47971\n埋植\t47972\nFlappy\t47973\n风压\t47974\niOS8.1\t47975\n补盖\t47976\nwiki\t47977\n平鑫涛\t47978\nqq华夏\t47979\n开心豆\t47980\n不达意\t47981\n联手网\t47982\n芒果皮\t47983\nsim卡座\t47984\n憨人\t47985\n教育咨询公司\t47986\n云波\t47987\nBT高清\t47988\n马来酸曲美布汀片\t47989\nloci\t47990\n暗屏\t47991\n电气设备\t47992\n住建局\t47993\n疯子\t47994\n染料敏化太阳能电池\t47995\n95511\t47996\n3AM\t47997\n最近3年\t47998\n分掉\t47999\n电屏\t48000\n重庆华美整形美容医院\t48001\nbeautify\t48002\n中关村东升科技园\t48003\n北京崇文门\t48004\n接盘者\t48005\n52ij\t48006\n奇才队\t48007
\n纣\t48008\n石英表\t48009\n更紧密\t48010\n成端\t48011\n秦周懿\t48012\n87181682\t48013\nXQ\t48014\n乐视电视\t48015\n湖塘\t48016\n所想\t48017\n320K\t48018\n异地\t48019\n三共\t48020\nFootball\t48021\ndoctor\t48022\n翻毛\t48023\n顺城\t48024\n双形\t48025\n方_\t48026\n法语\t48027\n阴煞\t48028\n可造\t48029\n冬阳\t48030\n五合\t48031\n预报\t48032\n血口\t48033\n景顺长城\t48034\n设置\t48035\n复姓\t48036\n5笔\t48037\n荒原狼\t48038\n内存分配器\t48039\n蛇头\t48040\nremarks\t48041\n千凯\t48042\n立方氮化硼\t48043\n大西路\t48044\n概率\t48045\n面单打印机\t48046\n電\t48047\n赏春\t48048\nfds\t48049\n迪路兽\t48050\n大男\t48051\n勇芳\t48052\n组织处\t48053\n桐城网\t48054\n千块\t48055\nalo\t48056\nDN150\t48057\nAltova\t48058\nChinaFIT健身网\t48059\nClosing\t48060\n张家港市区\t48061\n自习课\t48062\nrevolver\t48063\n1.3t\t48064\n碧生源减肥茶\t48065\n塔釜\t48066\n居高临下\t48067\n胶膜\t48068\nzizhi\t48069\n人乳头瘤病毒\t48070\n苗阜王\t48071\n运营部\t48072\nyoux\t48073\n违犯\t48074\n博德\t48075\narcore\t48076\n腹股沟疝气\t48077\n三门峡南\t48078\n压缩式\t48079\nno.6\t48080\nBarcelona\t48081\n2秒后\t48082\nvux\t48083\n皇家守卫军\t48084\n珍宝珠\t48085\n临澧\t48086\n改头换面\t48087\ntora\t48088\n万泉\t48089\n网络课\t48090\n少年包青天\t48091\n华润电力控股有限公司\t48092\namsterdam\t48093\n蔡襄\t48094\n张彦\t48095\n钟罩\t48096\n26本\t48097\n经济实惠\t48098\nEagleGet\t48099\n28步\t48100\nreset\t48101\n低碳生活\t48102\n电子令牌\t48103\n繁写\t48104\nredis3\t48105\nzgs\t48106\n上海海关\t48107\n电影女\t48108\n群众性\t48109\n跌停价\t48110\n脑\t48111\ndeemed\t48112\n选项值\t48113\ngrams\t48114\n进京证\t48115\nWebServlet\t48116\n道达尔\t48117\n熙熙攘攘\t48118\n河北省交通运输厅\t48119\n卫生事业\t48120\n马战\t48121\n产前\t48122\njixiang\t48123\n智领\t48124\nnibaku\t48125\n钢化炉\t48126\n王文丽\t48127\n1720\t48128\n六份\t48129\n邱振中\t48130\n银鹭八宝粥\t48131\n泉塘\t48132\n潮网\t48133\n露谷\t48134\n波提切利\t48135\nPast\t48136\n四叔\t48137\nmysqlslap\t48138\n流出\t48139\ndarling\t48140\n穆罕默德\t48141\nL313\t48142\n长裤\t48143\n朝天门码头\t48144\n中流砥柱\t48145\n延边日报\t48146\n挖孔\t48147\n横琴镇\t48148\n文惠\t48149\n管状腺瘤\t48150\n衍\t48151\ntitus\t48152\nRed\t48153\n樊笼\t48154\n西安网\t48155\n被害\t48156\n黄宗泽\t48157\n西安高新技术产业开发区\t48158\n博彩网\t48159\n汤姆·希德勒斯顿\t48160\n26年前\t48161\n好不好过\t48162\nlush\t48163\n沉管灌注桩\t48164
\n浏阳河\t48165\n170亿\t48166\n睡梦\t48167\n宽体机\t48168\n吃兔\t48169\n微科普\t48170\n冰冰霜\t48171\nweidea\t48172\n1.2万亿\t48173\n艇仔粥\t48174\n倒票\t48175\n银业\t48176\n略读\t48177\nQQ个性签名-个性网\t48178\n切碎\t48179\n吴庆文\t48180\n体教\t48181\n念亲恩\t48182\n普惠型\t48183\n中国石化出版社\t48184\nVIP版\t48185\n仙客\t48186\n发嗲\t48187\n紫屋魔恋\t48188\n美好生活家\t48189\n26路\t48190\n口条\t48191\n上海出入境边防检查总站\t48192\n西安交通大学经济与金融学院\t48193\n微圈\t48194\n佳酿网\t48195\nLibor\t48196\nexcel200\t48197\n莱纳\t48198\n丹阳眼镜城\t48199\n党支部\t48200\nddr3内存\t48201\n污染度\t48202\nweide\t48203\ndse\t48204\n铃木奥拓\t48205\nspdl\t48206\n树木\t48207\nBosch\t48208\n九尾狐狸\t48209\nVV7论坛_汽车之家论坛\t48210\n2折\t48211\n堰桥\t48212\n万达商管\t48213\n時計\t48214\nを\t48215\n赵泳鑫\t48216\nsvg\t48217\n跨模块\t48218\n美味奇缘\t48219\nLinguistics\t48220\n安徒生童话全集\t48221\n全敏普陀\t48222\n第67\t48223\n8.1双\t48224\n扬州市公安局\t48225\n朝阳实验小学\t48226\n向海\t48227\n喜上眉梢\t48228\n券商资管\t48229\nSunny\t48230\n一淘网\t48231\n高水准\t48232\n古往今来\t48233\n专项计划说明书\t48234\n刘贺\t48235\n浴室戏\t48236\n戏服\t48237\nSibelius\t48238\n恋舞ol\t48239\n迎检\t48240\nMaya/3DMax\t48241\nV16\t48242\n调研卷\t48243\n李典\t48244\n花色\t48245\n安徽机电职业技术学院\t48246\n杨卫\t48247\n永祥\t48248\n地板\t48249\n全画幅相机\t48250\n厕位\t48251\n入药\t48252\n曹家堡机场\t48253\n爱尚\t48254\n一味\t48255\n0575\t48256\n评审\t48257\n贪吃蛇\t48258\n学军中学\t48259\n油脂\t48260\nimwrite\t48261\npence\t48262\n公羽\t48263\n东麓\t48264\n南京水游城\t48265\nEvilAngel\t48266\n刘备\t48267\n利物浦\t48268\n人货\t48269\n东关村\t48270\n狸窝\t48271\n历史与社会\t48272\n大说\t48273\n静蹲\t48274\n皇塘镇\t48275\nLessons\t48276\n经期\t48277\nvitro\t48278\n旱天雷\t48279\n瞎话\t48280\n五边\t48281\n转隶\t48282\n热播韩剧网\t48283\nmsde2000\t48284\n能源展\t48285\nRyder\t48286\n四川省社会科学院\t48287\n热键\t48288\n张悟本\t48289\n东圳水库\t48290\n10-1\t48291\n中华人民共和国传染病防治法\t48292\njava定时器\t48293\n金钞\t48294\nDamaged\t48295\n张爽\t48296\nmoza\t48297\n月翼\t48298\n段奥娟\t48299\n邓莎\t48300\n电动汽车资源网\t48301\n20151204\t48302\n20届\t48303\nL0\t48304\n8000张\t48305\n大宗\t48306\n返现\t48307\n襄阳高新区\t48308\n杨村\t48309\n第A1\t48310\n600004\t48311\n杨红樱\t48312\n死掉\t48313\n超级机器人大战UX\t48314\n前5个月\t48315\n精灵传说\t48316\nJacques\t48317\nhangove
r\t48318\n一点击\t48319\n艾灸馆\t48320\n重生盛世嫡妃\t48321\n成王\t48322\n郑州华信学院\t48323\n1升\t48324\n红飘带\t48325\nhock\t48326\nSTB\t48327\n暗夜\t48328\n1500多\t48329\nXSHELL\t48330\ncomsol\t48331\n甜心\t48332\n辅仁\t48333\n大蜜丸\t48334\n调色盘\t48335\nuppercase\t48336\n红框\t48337\nzsb\t48338\n罗山街道\t48339\n语病\t48340\n明威\t48341\n167cm\t48342\n云南农村信用社\t48343\n穆少\t48344\n森米酵素\t48345\n康明斯柴油发电机组\t48346\n微思\t48347\n创世之柱\t48348\n自考考试\t48349\nHebei\t48350\nflashback\t48351\nDAV\t48352\ntream\t48353\n人防办\t48354\n椰壳活性炭\t48355\n王小骞\t48356\n公选王\t48357\n鹿院坪\t48358\n卡文迪许\t48359\neBook\t48360\n分布式光伏并网\t48361\n6040\t48362\n上海人民出版社\t48363\n干员\t48364\n如今\t48365\n迪赛\t48366\nBool\t48367\n大筒木\t48368\n乳糜\t48369\n退宿\t48370\nL型\t48371\n限售令\t48372\n67679910\t48373\n曹尼玛\t48374\n落英\t48375\n液晶电视\t48376\n八仙山\t48377\n暴雪打折季\t48378\nallan\t48379\n南铁\t48380\n4月11\t48381\nuefi+gpt\t48382\n湖北程力专用汽车有限公司\t48383\n主类\t48384\n健体\t48385\n养仇\t48386\n暗战2\t48387\nwebGL\t48388\n直来直往\t48389\n8.5亿\t48390\n武器篇\t48391\n辽宁民间艺术团\t48392\n李昉\t48393\n五关\t48394\n后手\t48395\n队号\t48396\n林广茂\t48397\n衡阳保卫战\t48398\n象屿股份\t48399\nmotive\t48400\n氮量\t48401\n黄希庭\t48402\n欧尚X70A_长安欧尚\t48403\n红富士苹果\t48404\n河南省粮食局\t48405\nnotion\t48406\n拳皇13\t48407\n有的时候\t48408\n驻波比\t48409\nKiko\t48410\n长春医院\t48411\n雅诺\t48412\n尼西亚\t48413\nperoxide\t48414\n唐人街神探2\t48415\n樱花热水器\t48416\n进化心理学\t48417\nSKECHERS\t48418\nXxxx\t48419\n钟书阁\t48420\n爱情小说\t48421\n小说\t48422\n400级\t48423\n锐捷网络\t48424\n济南政府\t48425\n崔英道\t48426\n玩世\t48427\n重要战略机遇期\t48428\n卧龙自然保护区\t48429\n高海\t48430\n欢欢\t48431\n热合机\t48432\n池\t48433\n骨性关节炎\t48434\nBUTTERFLY\t48435\n麻豆\t48436\n冶金工业出版社\t48437\n文殊心咒\t48438\n广电局\t48439\n场所\t48440\n中州大学\t48441\n校内\t48442\ncosmetic\t48443\n形法\t48444\n假面舞团\t48445\nselectors\t48446\n评查\t48447\nmika\t48448\n丝蕴\t48449\n小兔乖乖\t48450\n西宁百姓网\t48451\nipda\t48452\n九州虫\t48453\n三麦\t48454\n瑶池\t48455\nGun\t48456\n包料\t48457\n昭化\t48458\n盛宝\t48459\n遣散\t48460\n菜苗\t48461\n72平米\t48462\n全班\t48463\n小桃红\t48464\n掩膜\t48465\nNTS\t48466\nsympy\t48467\n傲宇阁\t48468\n江德福\t48469\nPV3\t48470\n克莫司\t48471\n欧阳坚\t48472\nkyoto
\t48473\n巾箱\t48474\n钱龙金\t48475\nPerkins\t48476\n2010级\t48477\n锋霸\t48478\n几遍\t48479\n花神庙\t48480\n悲情城市\t48481\n汉剑\t48482\n椽\t48483\n水道\t48484\n瓷球\t48485\n南昌湾里区\t48486\nperiodic\t48487\n抗日之特战兵王\t48488\n侦监\t48489\n氧氟沙星滴耳液\t48490\n私印\t48491\n人民网\t48492\n藿香正气软胶囊\t48493\nCR2格式\t48494\n乐有家控股集团\t48495\n中内\t48496\n1331\t48497\nト\t48498\n殉爆\t48499\n炮车\t48500\n李永强\t48501\n済\t48502\n道光帝\t48503\n小白菜\t48504\n羊皮卷\t48505\n微球\t48506\n湖北省质量技术监督局\t48507\n惯量\t48508\n郯城县政府\t48509\n凯旋广场\t48510\n香稻\t48511\n卡拉马佐夫\t48512\n20161202\t48513\nbenjamin\t48514\nqtc\t48515\ndalong\t48516\nRacun\t48517\n靶场\t48518\n丽湾\t48519\n神楽坂\t48520\n黑盾\t48521\n讲完\t48522\nBender\t48523\n大津\t48524\nrobot魂吧\t48525\n无可争议\t48526\n静夜思\t48527\n停靠\t48528\nmodulator\t48529\n梨园春\t48530\n五街\t48531\nhypervisor\t48532\n周至\t48533\n五大\t48534\n难弟\t48535\n零拷贝\t48536\n杨德\t48537\n江阴大桥\t48538\n素锦\t48539\n六十一\t48540\n李德\t48541\n上膛\t48542\n条凳\t48543\nIOS模拟器\t48544\n经济器\t48545\n哲狐\t48546\n龙印\t48547\n哪集\t48548\n10mL\t48549\n西尾维新\t48550\n云南映象\t48551\n长濑智也\t48552\nnorman\t48553\n‘\t48554\nAndre\t48555\nnpz\t48556\n电脑之家\t48557\n京东白条\t48558\n预器\t48559\n肉毛\t48560\n广州爱尔眼科医院\t48561\n出张\t48562\nmetastatic\t48563\n室版\t48564\n沈阳自动化研究所\t48565\n同辈\t48566\n绿荫\t48567\nAuthorization\t48568\nelectron\t48569\n肾性高血压\t48570\n滤池\t48571\n第104号\t48572\n谢恩\t48573\n血栓性\t48574\n固定期限劳动合同\t48575\n高新开发区\t48576\n昌河汽车\t48577\nBROWN\t48578\n暑假工\t48579\n2809\t48580\nRadical\t48581\n天残\t48582\ncc6\t48583\nnhf\t48584\n陈群\t48585\n去留\t48586\nDruid数据库连接池\t48587\n前所\t48588\n丰大\t48589\n凝集素\t48590\n小狼\t48591\n顺威股份\t48592\n贬\t48593\n买彩票\t48594\n近思\t48595\n小金人\t48596\nscala\t48597\n温州实验中学\t48598\n宋公明\t48599\n奸杀案\t48600\nreversion\t48601\n消费观\t48602\nIceland\t48603\n爱空间\t48604\n阳性\t48605\nBTOB\t48606\n老实人\t48607\n环保费\t48608\n四儿\t48609\n主刑\t48610\n从容\t48611\n不共\t48612\n储物箱\t48613\n感冒灵\t48614\n土钉\t48615\n沙塘镇\t48616\n探索型\t48617\nxvldeos\t48618\n恶灵骑士\t48619\n双评\t48620\n聚宝匯\t48621\n顶梁\t48622\nRAW格式\t48623\n安昭熙\t48624\n南瓜泥\t48625\n接头盒\t48626\n灵山村\t48627\n∽\t48628\n搞笑一家人\t48629\n花村\t4
8630\n91助手\t48631\n医师\t48632\n素年锦\t48633\n杂环化合物\t48634\n五一小学\t48635\n脚踏\t48636\n力天\t48637\n脊椎骨\t48638\n阿郎的故事\t48639\n五星大饭店\t48640\n65cm\t48641\n年轻的母亲2\t48642\n火纸\t48643\np15\t48644\n51返呗\t48645\nbolo\t48646\n威斯康辛\t48647\nPro变声器\t48648\n3.7V\t48649\n复函\t48650\n地面数字电视\t48651\n恶语\t48652\n补补\t48653\n1447\t48654\n医学类\t48655\n跳坑\t48656\n驴粉卡\t48657\nEFI\t48658\n汤琳琳\t48659\nturning\t48660\n傅晶\t48661\n20150821\t48662\n微信公众平台\t48663\n毋米粥\t48664\n抽奖机\t48665\n扬长避短\t48666\n重庆市第一中学\t48667\n九鼎投资\t48668\n两整\t48669\n1950X\t48670\njeep自由光\t48671\n北京市京师律师事务所\t48672\n上海星河湾双语学校\t48673\nfence\t48674\n复古传奇\t48675\n名笔\t48676\n边牧俱乐部\t48677\n苏达\t48678\n书报\t48679\n花石\t48680\n阿蒙\t48681\n文雀\t48682\n第5天\t48683\nc++语言程\t48684\n产假\t48685\nsince\t48686\n40周年\t48687\n30问\t48688\nopeninstall\t48689\n圣空法师\t48690\n赵坚\t48691\n几万个\t48692\n少爷们\t48693\n7集\t48694\n成城\t48695\naqsiq\t48696\n超会\t48697\n陈雪凝\t48698\n戏说乾隆\t48699\ncarlton\t48700\n老乐\t48701\n入戏\t48702\n基差\t48703\n洗脸仪\t48704\n苏州山塘街\t48705\n皇后大道东\t48706\ncoefficients\t48707\n三星s3\t48708\n古氏\t48709\n飞鹏网大航海时代5\t48710\n茶艺网\t48711\nlilian\t48712\n刚铎\t48713\n推开\t48714\n彩虹的约定\t48715\n哔声\t48716\n中宵\t48717\n步伐\t48718\n青岛理工大学琴岛学院\t48719\n再也不\t48720\n云石机\t48721\nncis\t48722\n语言库\t48723\nCGI\t48724\n浙江传媒\t48725\n游湖\t48726\n共和国\t48727\n三十句\t48728\n横流\t48729\n严正\t48730\nUN\t48731\n辕门射戟\t48732\n忍具\t48733\nxss\t48734\n0754\t48735\ncolloidal\t48736\n弦月\t48737\n制发\t48738\n王动\t48739\n电信网码号\t48740\n三笔\t48741\n王子健\t48742\ndmb\t48743\n最后的休止符\t48744\n摩比\t48745\n南部\t48746\n纵火者\t48747\n周易\t48748\n实践篇\t48749\ntcp\t48750\n325目\t48751\n武珞路\t48752\ne人e本\t48753\nfed\t48754\n建新股份\t48755\n罗斯公司\t48756\n180吨\t48757\n蔚揽\t48758\n上居\t48759\n孙小圣\t48760\n熊怪\t48761\n下个路口见\t48762\n山东财经大学燕山学院\t48763\nbased\t48764\n第七章\t48765\n中力\t48766\n申城\t48767\n钮门\t48768\n太原新闻网\t48769\n王迎春\t48770\n探长\t48771\n碑铭\t48772\n薇琳\t48773\n17块\t48774\n小食\t48775\nepe\t48776\n粤国\t48777\nDBCP\t48778\n结核药\t48779\n星海\t48780\n朗县\t48781\n席先生\t48782\nbulk\t48783\noperate\t48784\n优速物流\t48785\nsall\t48786\nFinesse\t48787\ng
uru\t48788\n药堂\t48789\n2019年初\t48790\n青报网\t48791\n黄花机场\t48792\n妈妈妈妈\t48793\n大族\t48794\n太空望远镜\t48795\n名牌大学\t48796\n蹲踞式跳远\t48797\npypy\t48798\n江户川\t48799\n改头换\t48800\nbzip\t48801\n陈耀烨\t48802\n陈晓峰\t48803\n中电建路桥集团\t48804\nchm\t48805\nphicen\t48806\n苯丙胺\t48807\nStafford\t48808\nOutcomes\t48809\nPARTS\t48810\nwww4438x1\t48811\n20150909\t48812\nHspice\t48813\n系类\t48814\nhexdump\t48815\n前进西路\t48816\n最最最\t48817\n牙克石市\t48818\n信长之野望13:天道\t48819\nCriticism\t48820\n妇检\t48821\nSolrCloud\t48822\n消元\t48823\n雅昌拍卖信息网\t48824\n赫子\t48825\n榕\t48826\n小夜\t48827\n请进\t48828\nmvc:resources\t48829\n梦话\t48830\n中华人民共和国保守国家秘密法\t48831\n艺术家\t48832\n60岁\t48833\n乌东德水电站\t48834\n953\t48835\n砂井\t48836\n行列\t48837\n一汪\t48838\n五维度\t48839\nXart\t48840\n张栋\t48841\n067\t48842\n古代文学史\t48843\n6串\t48844\n一栏\t48845\n江梦娴\t48846\n哈夫曼\t48847\n小蝶\t48848\n0.47\t48849\n增函数\t48850\n更直观\t48851\n125cc\t48852\n耍横\t48853\n主学\t48854\n东京迷城\t48855\n玄冥二老\t48856\n警帽\t48857\n货轮\t48858\n饥饿游戏3:嘲笑鸟\t48859\n中物\t48860\nCEPH\t48861\n塑料纸\t48862\n九分之二\t48863\n岩茶\t48864\n架子鼓\t48865\n服软\t48866\n_多\t48867\n复兴集团\t48868\n呼呼\t48869\n补钾\t48870\n和之国\t48871\n选手\t48872\n伦晚\t48873\n微校\t48874\naxuer\t48875\n人兽杂交\t48876\n优级\t48877\n凌钢\t48878\n求抱抱\t48879\n百度流量\t48880\n短波\t48881\nSPG\t48882\n抢族\t48883\n安娜尔\t48884\n南京应用技术学校\t48885\n司空论坛|SM\t48886\nKarsa\t48887\n为么\t48888\n东京城\t48889\n捷克罗姆\t48890\n贵州盘县政府\t48891\n江苏省广播电视总台\t48892\n顶牛股网\t48893\nphpstorm2017\t48894\n害臊\t48895\n莫待无花空折枝\t48896\n福龙路\t48897\n捷安特\t48898\ninfluenza\t48899\n花样青春\t48900\n淮南市教育局\t48901\nTropes\t48902\n预制方桩\t48903\n建昌县\t48904\n付枚\t48905\n步步\t48906\n离园\t48907\n红眼睛\t48908\n丰台医院\t48909\n荒废\t48910\n上汽大众4S店\t48911\n高新技术产业\t48912\n妹儿\t48913\n万域\t48914\n头旋\t48915\n雁栖湖\t48916\n毛诗\t48917\ngov\t48918\n黑松露\t48919\n长沙地铁5号线\t48920\n老寒腿\t48921\n冷钢\t48922\n裸土\t48923\n解散\t48924\n50块\t48925\n95500\t48926\n约法三章\t48927\n丛林法则\t48928\n脏腑\t48929\n阳平\t48930\n会计研究\t48931\n云石胶\t48932\n法家\t48933\nxdmcp\t48934\n广州汽车站\t48935\n莫言\t48936\n冒险岛异逝界\t48937\n毛浙东\t48938\n600309\t48939\n码点\t48940\n童自荣\t48941\nv5.7\t4894
2\n曹刿论战\t48943\n慈善款\t48944\n蒙逼\t48945\n纤维蛋白\t48946\n凉拌莴笋\t48947\n曼奇尼\t48948\n允硕\t48949\n新开元\t48950\n电摩\t48951\n脑门\t48952\n中国电建集团海外投资有限公司\t48953\n慧慧广场舞\t48954\n补助\t48955\n学管\t48956\n张玉贞\t48957\n幽闭\t48958\n社科院\t48959\n秦博\t48960\n陈伟文\t48961\nc93\t48962\nLuCI\t48963\n优装\t48964\n骑士团\t48965\nfanhao\t48966\n大众版\t48967\n魔理\t48968\n诺记吧\t48969\n米奇妙妙屋\t48970\n1.31G\t48971\n61.157.144.46\t48972\n一决雌雄\t48973\nPFA\t48974\nTrans\t48975\nSUPER\t48976\n阿尔托\t48977\n广东省监狱管理局\t48978\nselective\t48979\n泊船瓜洲\t48980\n收楼\t48981\n橡塑板\t48982\n设计书\t48983\n中国水网\t48984\n点球大战\t48985\n鄠邑\t48986\n淘宝钻展\t48987\n拉拉链\t48988\n小儿便秘\t48989\n中登公司\t48990\n矩阵行\t48991\n痛风石\t48992\n二十二条\t48993\nmt2\t48994\n江天\t48995\nyan\t48996\n海思\t48997\n按部就班\t48998\n抱错\t48999\n云南政协\t49000\n返青\t49001\n司机们\t49002\n灭亲\t49003\n三门峡市\t49004\n工产品\t49005\n轰隆隆\t49006\n广西建设职业技术学院\t49007\n首先\t49008\n闭水\t49009\n朵\t49010\n虎牙星秀\t49011\nhd2\t49012\n九年级语文下册\t49013\n盐蛋\t49014\nwf\t49015\n汽贸\t49016\nPK版\t49017\n有村千佳\t49018\n网络嗅探器\t49019\n秋凌\t49020\n冷负荷\t49021\n蓝天保卫战\t49022\n十多次\t49023\n托特包\t49024\n朝阳区不动产登记事务中心\t49025\n121苹果网\t49026\n1916\t49027\n4399战争\t49028\n龙膜\t49029\n概述\t49030\n喀左\t49031\nAGREEMENT\t49032\nDXRACER\t49033\n王源罗京\t49034\n众泰T500论坛_众泰T500车友会\t49035\n腻子膏\t49036\n甘心情愿\t49037\n600056\t49038\n国IV\t49039\n第5种\t49040\n复职\t49041\n三星a7\t49042\n姣\t49043\n华北制药股份有限公司\t49044\n亿安\t49045\n石秀\t49046\n琴半岛地质海洋公园\t49047\n投中网\t49048\n李耳\t49049\n丁峰\t49050\n性疾病\t49051\n虚空假面\t49052\n金刚菩提手串\t49053\n径山\t49054\n大家乐\t49055\n钢刀\t49056\n阿拉伯人\t49057\n功绩\t49058\n南昌地铁3号线\t49059\nCRAFT\t49060\n北唐\t49061\n天主\t49062\nMIUI9\t49063\n绑嫁\t49064\n熏鸡\t49065\n一贯制\t49066\n云大附中星耀校区\t49067\nLambda\t49068\nstra\t49069\nsuji\t49070\n王凯歆\t49071\n团群\t49072\n灭口\t49073\n芙蓉街\t49074\n九宫飞星图\t49075\n大谷翔平\t49076\n温德米尔湖\t49077\nopencl\t49078\n东郭镇\t49079\n鞍形\t49080\n私有云\t49081\n猎云网\t49082\n荷花节\t49083\n2165\t49084\n第4\t49085\n通勤\t49086\n林肯MKZ\t49087\n甜性涩爱\t49088\nsemiconductors\t49089\n融资性担保公司管理暂行办法\t49090\n奇奇怪怪\t49091\n人生如戏\t49092\n相离\t49093\n20万亿美元\t49094\nvbe\t49095\n崇福\t49096\n
万亩\t49097\n直营店\t49098\npyodbc\t49099\n尚赫美容仪\t49100\n鸿德\t49101\n超级地城之光\t49102\n材料作文\t49103\nncaa\t49104\nYJ\t49105\n茉莉酸\t49106\n第31轮\t49107\n非洲\t49108\n保守秘密\t49109\nvpn加速器\t49110\n单门\t49111\n国领\t49112\n区市场监管局\t49113\n开区\t49114\n尧都区\t49115\n日臻\t49116\n舞蹈网\t49117\nserialization\t49118\n爽朗\t49119\n黄仙\t49120\n罗盘仪\t49121\n线钳\t49122\nEarMaster\t49123\n搜易贷\t49124\n沈斌倜\t49125\n舍利\t49126\nr21\t49127\nuiviewcontroller\t49128\n手机一卡通\t49129\nエロコンビニ\t49130\n五六个\t49131\n沈阳国际软件园\t49132\n亚里斯多德\t49133\nFlooring\t49134\n城厢区\t49135\nDHT11\t49136\n反恐特战队\t49137\n自由禁区\t49138\n大兴吧\t49139\n撰文\t49140\n合同管理\t49141\n中国政府采购招标网\t49142\n秀英港\t49143\n露娜拉\t49144\nn4110\t49145\n租房者\t49146\n干豆腐\t49147\n拆封\t49148\n南京奥体中心\t49149\n张轩\t49150\n陶杯\t49151\nLiDAR\t49152\ntiled\t49153\n刺客信\t49154\n红番\t49155\n立体库\t49156\n激烈竞争\t49157\nBaka\t49158\n质量效应:仙女座\t49159\n16项\t49160\n深天马\t49161\nMSM\t49162\n撬棒\t49163\nNNID\t49164\n北冥神功\t49165\n938\t49166\nmik\t49167\nurllib2\t49168\n车公庄\t49169\npractice\t49170\n充分就业\t49171\n昆仑网\t49172\nmargin值\t49173\n冬幕节\t49174\n管院\t49175\n狂志\t49176\n8s\t49177\n天空套\t49178\n缓冲流\t49179\n宣城市人民医院\t49180\n杭州国家高新技术产业开发区管委会\t49181\n西南交通大学电气工程学院\t49182\npintos\t49183\n苏门\t49184\n天信\t49185\n霍克尼\t49186\nusb母座\t49187\nV1.2\t49188\n寸\t49189\n機場\t49190\nMycareer\t49191\n使用税\t49192\n截肢\t49193\n精华水\t49194\nInventory\t49195\n丝弦\t49196\nsurfer\t49197\n华北地区\t49198\n巧舌如簧\t49199\nnas服务器\t49200\n百无禁忌\t49201\n代开发票\t49202\n碧泉\t49203\nBBT\t49204\n事部\t49205\nIP服务器\t49206\n白皮鞋\t49207\n成渊\t49208\n高级感\t49209\n醋酐\t49210\n赫米特\t49211\nSpa酒店\t49212\n王志刚\t49213\n贝姐\t49214\n联咏\t49215\nepsxe模拟器\t49216\n天年\t49217\n45%\t49218\n来电秀\t49219\n清博指数\t49220\nChanel香奈儿\t49221\n胀\t49222\n20150806\t49223\n河南司法警官职业学院\t49224\n栅极\t49225\n5.6.25\t49226\n恒瑞医药\t49227\n情深深雨蒙蒙\t49228\nIPO\t49229\n前十二个月\t49230\n尽铅华\t49231\n学点\t49232\n寒亭\t49233\nPrimeton\t49234\n厂标\t49235\n剪切蒙版_\t49236\n慎海雄\t49237\nConduct\t49238\nccf\t49239\n三酯\t49240\n着急\t49241\n颖颖\t49242\n几辆\t49243\nNanoscale\t49244\ncrawler\t49245\n好诗文网\t49246\n热熔胶膜\t49247\n曹全碑\t49248\n九
牧王\t49249\n环比\t49250\ntrending\t49251\n万豪金卡\t49252\n第一联\t49253\n垠\t49254\n有声读物网\t49255\n排泄物\t49256\n赶往\t49257\n小顶\t49258\nMyBatis3\t49259\n成功之母\t49260\ncontentType\t49261\n王树增\t49262\n女犯\t49263\n不谈\t49264\n北京人和\t49265\n20.83\t49266\nclique\t49267\ndota吧\t49268\nQZ\t49269\n南海撞机\t49270\n射流式\t49271\n中华信鸽网\t49272\npos\t49273\n霏霏\t49274\n缩阴球\t49275\n公证债权文书\t49276\n姓于\t49277\n毛羊\t49278\n加尔鲁什\t49279\n蓝卡\t49280\nhadoop\t49281\n退工\t49282\nEthnic\t49283\nhgc\t49284\n把握好\t49285\n门当户对\t49286\n锡渣\t49287\n亲爱的们\t49288\nSymbol\t49289\n杨光荣\t49290\n依安\t49291\n刘波\t49292\n杨琦\t49293\n杜若溪\t49294\n第纳尔\t49295\n萝卜圈\t49296\n魔法部\t49297\n渔期\t49298\n甲方\t49299\n蜂鸟网\t49300\nE470\t49301\n生态学\t49302\n很可笑\t49303\n皮城警备\t49304\n163&\t49305\n慢性肾盂肾炎\t49306\n爱同\t49307\nsonian\t49308\n太上\t49309\n王者荣耀2018五五开黑节\t49310\n重庆北\t49311\n曾敏\t49312\n沉淀池\t49313\n80千米\t49314\n超电磁\t49315\n出版业\t49316\nlinter\t49317\n瞬心\t49318\n西双版纳热带植物园\t49319\n启子\t49320\n咬断\t49321\n赞扬\t49322\n桑塔纳志俊\t49323\n魔兽对战平台\t49324\n齐之韵\t49325\n后唐\t49326\n北舞附中\t49327\nN4\t49328\ndoc-文档赚钱网\t49329\n飙歌\t49330\n钣金喷漆\t49331\n25公分\t49332\nSLS\t49333\n张呈栋\t49334\n富力广场\t49335\n渣女\t49336\n卤牛肉\t49337\n狂猎\t49338\n2980元\t49339\n锦江之星\t49340\nupright\t49341\n新维加斯吧\t49342\n甲本\t49343\n候车室\t49344\nHIFIMAN\t49345\n3零\t49346\n三巴\t49347\n晨报社区报\t49348\n骆驼祥子读书笔记\t49349\nDCOM\t49350\nBitch\t49351\n挂海论坛\t49352\nhuanle\t49353\n分处\t49354\n郭栋\t49355\n上汽荣威RX5\t49356\ne550\t49357\n光方\t49358\n乐事薯片\t49359\n通用汽车\t49360\n光翼\t49361\n世纪广场\t49362\nevident\t49363\n卷心\t49364\n深圳高新园\t49365\n附中\t49366\n无遮\t49367\n报时\t49368\n天椒\t49369\nKPA\t49370\n13日游\t49371\nsword\t49372\n朱明\t49373\n榴莲树\t49374\n悦禾旅游JOY\t49375\nidt\t49376\n宁夏回族自治区水利厅\t49377\n佳能EF\t49378\n何梅\t49379\n南屿\t49380\ndism\t49381\n微机原理\t49382\n黄体期\t49383\n2034\t49384\n挞\t49385\n工作日志\t49386\nWeb测试\t49387\n吞噬细胞\t49388\nAdagio\t49389\n陶瓷版\t49390\n密实\t49391\nwww.etest8.com\t49392\n神魔之塔\t49393\nmapper\t49394\n师出\t49395\n司\t49396\n水色\t49397\n黑龙江电视台\t49398\n书\t49399\n半老\t49400\n[刀剑乱舞\t49401\n寡欲\t49402\n一百斤\t49403\n进制转换器\t49404\nAthlet
ics\t49405\n春检\t49406\n特大城市\t49407\n驼峰式\t49408\ntopo\t49409\n造化\t49410\n杨冰怡\t49411\n二十讲\t49412\n轮辋\t49413\nildasm\t49414\n张德\t49415\n兴凯湖\t49416\n洁面\t49417\n84分钟\t49418\n爱上自己\t49419\n鞍山西站\t49420\npaython\t49421\n170万\t49422\n后腰\t49423\ncapm\t49424\nEbert\t49425\n每期\t49426\n膨胀节\t49427\n虎扑篮球图片中心\t49428\n盘福路\t49429\nChristy\t49430\n黄圣池\t49431\n鸡肉粉\t49432\nNHK大河剧\t49433\n日月年\t49434\n翡翠园\t49435\n南京东路\t49436\n田欣\t49437\nldl\t49438\n上犹县\t49439\n杨海峰\t49440\n海神庙\t49441\n三分线\t49442\nMySql\t49443\n第8位\t49444\nmuscle\t49445\n稳坐\t49446\n6.5米\t49447\n墨尔本市\t49448\n钢弹\t49449\nMyriad\t49450\n壳类\t49451\n围猎\t49452\n煲汤\t49453\n贺峻霖\t49454\n临床医学八年制\t49455\n尤果\t49456\nmdz\t49457\n口环\t49458\n苐\t49459\nmalfunction\t49460\n乐蜂网\t49461\n命运之门\t49462\n包皮过长\t49463\nM档\t49464\n操作方\t49465\n概念篇\t49466\n芳香性\t49467\n品类\t49468\n看不了\t49469\n六安一中\t49470\n哈达威\t49471\nTriton\t49472\n南湖晚报\t49473\n四照\t49474\nUserControl\t49475\n女秘\t49476\nordinal\t49477\n富德生命人寿保险股份有限公司\t49478\n兰娜瑟尔\t49479\nhull\t49480\n小琳\t49481\n水苔\t49482\nTBL\t49483\n小荷才露尖尖角\t49484\nnutrient\t49485\n61404635\t49486\nneotv\t49487\n张胜利\t49488\ntheft\t49489\n欧吉尔德\t49490\n冻鱼\t49491\n顶起\t49492\n主族\t49493\n胡令\t49494\n复星高科\t49495\n5460\t49496\n6寸\t49497\n西园寺\t49498\n定海\t49499\n危重症\t49500\n新江湾城\t49501\n呼和浩特白塔国际机场\t49502\n郑州市工商行政管理局\t49503\n使命召唤14:二战\t49504\n清水河街道\t49505\n携众\t49506\n⒈\t49507\n法堂\t49508\n江苏省卫生厅\t49509\n电瓶车\t49510\n600500\t49511\n狗屋\t49512\n配图\t49513\n万科城市广场\t49514\n寻光\t49515\n高峰时间\t49516\nshorts\t49517\n条数\t49518\n压力线\t49519\n凯里学院\t49520\n光滑\t49521\n2013.1\t49522\n微星科技\t49523\n航空类\t49524\n数列{an}\t49525\n点读机\t49526\n直拨\t49527\nSVI\t49528\n少年西游记\t49529\nArgo\t49530\n1000度\t49531\nhost-only\t49532\n远图\t49533\n高瑞\t49534\n梁王\t49535\n獄\t49536\nrpcs3模拟器\t49537\n床品\t49538\n人教版.doc\t49539\n2015.03\t49540\n2.10\t49541\n莱克\t49542\nPPR\t49543\n味噌\t49544\n上好\t49545\n天拓\t49546\n侠盗猎车手5_侠盗猎车手5\t49547\n会声会影注册机\t49548\n诡事\t49549\n李度\t49550\n西麻布\t49551\n杀手5\t49552\n留待\t49553\nENGINE\t49554\n腐竹\t49555\n刮痧\t49556\nabac\t49557\n金高银\t49558\n振宁路\t49559\n罗康瑞
\t49560\n李旭丹\t49561\n手机连\t49562\n趣医挂号网\t49563\n治警\t49564\n播阶\t49565\n宇智波一族\t49566\n神妃\t49567\n乾州\t49568\n排气道\t49569\n熊大\t49570\nexcelvba\t49571\n咸阳市秦都区人民政府\t49572\n首订\t49573\n原发性\t49574\n吉盛伟邦\t49575\n知盘\t49576\nTHAT\t49577\n广东工贸职业技术学院\t49578\n哈弗H2论坛_哈弗\t49579\n港普\t49580\nrsr\t49581\n王小韦德\t49582\nread函数\t49583\n招标采购导航网\t49584\n江小白\t49585\nRNA\t49586\n青山湖\t49587\nReactor\t49588\n精易模块\t49589\nultima\t49590\ngained\t49591\n平南县\t49592\n爱情保卫战\t49593\n烧烤\t49594\n足量\t49595\n瑞东\t49596\n海南省纪委\t49597\n香瓜\t49598\n相流\t49599\n咪呀\t49600\n污染案\t49601\n6色\t49602\niexplore\t49603\n菊色宫\t49604\n主讲人\t49605\n稀有怪\t49606\n十堰晚报数字报|十堰晚报\t49607\n人设\t49608\necf\t49609\nlibffi\t49610\n小尹\t49611\n奔驰S320\t49612\n完达\t49613\nfflush\t49614\n7.09\t49615\n绣球花\t49616\n致郁系\t49617\n在路上\t49618\nsouce\t49619\n占星\t49620\nBlasterX\t49621\n扭亏为盈\t49622\n宝岛眼镜\t49623\n国家开发投资集团有限公司\t49624\n黄金时代\t49625\ninputfile\t49626\n聚政网\t49627\n巅峰\t49628\n亡魂\t49629\n舍卡卡\t49630\n陷入绝境\t49631\n何君\t49632\n福州时代中学\t49633\n辞藻\t49634\n安全部\t49635\n外调\t49636\n峰境\t49637\n暗藏式\t49638\n情天\t49639\n花旗参\t49640\n权健\t49641\nS8+\t49642\n岗厦村\t49643\n法国菜\t49644\n斗笠舞\t49645\n留洋\t49646\n忽闪\t49647\n潘高峰\t49648\n出行\t49649\n缩口\t49650\n妖瞳\t49651\nplatinum\t49652\n杨影\t49653\n美穗\t49654\nattributed\t49655\nNIKEiD\t49656\n天下粮仓\t49657\n花皮\t49658\n150寸\t49659\n亚大\t49660\n临安房产网\t49661\naboutcg\t49662\nSilicon\t49663\nDarker\t49664\n晋城市环境保护局\t49665\n1个星期\t49666\n木梨硔\t49667\n党章\t49668\n铁环\t49669\n开沟\t49670\n海火\t49671\n战争之王\t49672\n安徽省科技厅\t49673\n再生人\t49674\n虚拟宝贝\t49675\n未分配\t49676\n二世\t49677\n书法\t49678\n华为输入法\t49679\n哺乳纲陆栖动物\t49680\n战痘\t49681\n政敌\t49682\n我的公主\t49683\n鬼道\t49684\n爆气\t49685\n阿司匹林\t49686\nJGit\t49687\n嘞\t49688\n东营新闻_东营大众\t49689\n江宁区政府\t49690\nVITA\t49691\n邱赫南\t49692\n愁苦\t49693\n达真堪布\t49694\n突\t49695\n中关\t49696\n洛阳卫生网\t49697\n酒店管理专业\t49698\n科尔沁草原\t49699\nRainyin\t49700\n第二十二期\t49701\n陈子\t49702\n隐身衣\t49703\n#四海鲸骑\t49704\n亿盛国际\t49705\n九期\t49706\n人外有人\t49707\n正焕\t49708\n绽放\t49709\n毛利兰\t49710\n牙周\t49711\n名门口袋学院\t49712\n纹纸\t49713\n27位\t49714\n橘优花\t49715\n财门
\t49716\n青灯百物语\t49717\n孤灯\t49718\n中华美食网\t49719\nUgly\t49720\n18首\t49721\n锁关节\t49722\noversize\t49723\nunify\t49724\n神传\t49725\n母草\t49726\nMit\t49727\n比亚蜀山战纪2踏火行歌\t49728\nDecorte\t49729\nPLAZA\t49730\n照明\t49731\np1008\t49732\n叶航\t49733\n集体产权制度\t49734\n偷学\t49735\n湘财证券\t49736\n稀硫酸\t49737\n新工科\t49738\n百合粥\t49739\n光影魔术手批量\t49740\n色爱\t49741\n海斯坦普\t49742\n镶边\t49743\n二十日\t49744\n正大食品\t49745\n周易论坛\t49746\n早上9点\t49747\n苏州大学王健法学院\t49748\n赤子\t49749\n政治制度史\t49750\n奸细\t49751\n弄月\t49752\n细胞培养\t49753\n衣柜\t49754\n田军\t49755\n保龄宝\t49756\n金茂汇\t49757\n北京市民防局\t49758\n小鱼儿与花无缺\t49759\n湛卢\t49760\n河北工业职业技术学院\t49761\nv7.0\t49762\n内蒙古财政厅\t49763\n龙卡通\t49764\n正元\t49765\n虎都\t49766\n工作证明\t49767\n蜗牛\t49768\nGL62M\t49769\n福建省环保厅\t49770\n4.3米\t49771\n怕壮\t49772\n上海电力\t49773\n陈式\t49774\n观世音传奇\t49775\n闹翻\t49776\n莫须\t49777\n李萍\t49778\nflames\t49779\n审讯室\t49780\n防爆除湿机\t49781\nHello\t49782\n步步惊心\t49783\n十字花科\t49784\n建木\t49785\n先进单位\t49786\nv3.3.4\t49787\niPhoneX\t49788\n泻湖\t49789\n电口\t49790\n粤绣\t49791\n塔罗牌教程_奇缘阁算命网\t49792\n快吧游戏盒\t49793\n琥珀金冠道\t49794\n真枪实弹\t49795\nACHI\t49796\n蒋蓉\t49797\n狐假虎威\t49798\n彭坦\t49799\n天福华府\t49800\n数传\t49801\nSolidWorks2015\t49802\ncontext\t49803\n今宵多珍重\t49804\n枪身\t49805\n广东省中医院大学城医院\t49806\n专利局\t49807\n石家庄市第一医院\t49808\n三菱蓝瑟\t49809\n达芙文\t49810\n最萌身高差\t49811\n歌星\t49812\n必备品\t49813\n磁强计\t49814\nMSXML\t49815\n山东师大\t49816\n打井\t49817\n乔布斯\t49818\n报集萃\t49819\ncop\t49820\n国民经济\t49821\n到老\t49822\n乙烷\t49823\n东营村\t49824\nMAXHUB\t49825\n三头六臂\t49826\nregulates\t49827\n离子色谱法\t49828\naugmentation\t49829\n五匹\t49830\n重庆企聚网\t49831\nShenZhen\t49832\nimnoise\t49833\n腾讯游戏加速器\t49834\n区服\t49835\njudgement\t49836\n文化类\t49837\nAJax\t49838\n失势\t49839\n成都市文化广电新闻出版局\t49840\nPullman\t49841\n站位\t49842\n橘逾淮\t49843\n建筑装饰装修工程质量验收规范\t49844\n/日\t49845\n用水定额\t49846\n一带一路\t49847\n天津电子信息职业技术学院\t49848\nrevert\t49849\n人工关节置换术\t49850\n532nm\t49851\n女人街\t49852\n中华人民共和国国家统计局\t49853\n许仙白\t49854\nSHOPBOP\t49855\noauth2\t49856\n草桥\t49857\nOrb\t49858\n图篇\t49859\n背光板\t49860\n楼船\t49861\n王者荣耀不知火舞\t49862\nOrcHome\t49863\nacrylate\
t49864\nPlugY\t49865\n武汉经济开发区\t49866\n梅方\t49867\n青鹏软膏\t49868\n点爆\t49869\n有求必应\t49870\nSUMMER\t49871\n777奇米\t49872\nlever\t49873\n琦\t49874\n家运\t49875\n酿制\t49876\n视觉\t49877\n辛未日\t49878\nTyrantMaster\t49879\n上汽大通\t49880\n0.2_\t49881\nSANG\t49882\n网络机顶盒子论坛\t49883\n门费\t49884\n牙膏盒\t49885\n非水\t49886\nHBr\t49887\nYsl\t49888\n中华人民共和国药典\t49889\n莎拉·布莱曼\t49890\n痛心疾首\t49891\n冷泉\t49892\nShark\t49893\nXRV\t49894\n几曰\t49895\n美滋\t49896\nFree9免费资源网\t49897\n压槽机\t49898\n各区\t49899\ntrot\t49900\n坚持到底\t49901\n胡胡\t49902\nmicroscope\t49903\n瘦果\t49904\n追封\t49905\n谓词\t49906\nEndMemo\t49907\n远东宏信\t49908\n像距\t49909\n示范户\t49910\n变形虫\t49911\n20170421\t49912\nid选择器\t49913\n东城天下\t49914\n不狠\t49915\n烽火星空\t49916\n兴盛路\t49917\ncs\t49918\n丽江旅游\t49919\n08期\t49920\n6月22日\t49921\n十九大报告学习体会\t49922\n骨巨细胞瘤\t49923\n12小时后\t49924\n聚投诉\t49925\n不走\t49926\n4月27号\t49927\n杂牌\t49928\n酶学\t49929\n女家\t49930\n前庭\t49931\nC++\t49932\n杏吧论坛\t49933\nexercise\t49934\n华军新闻网\t49935\n青亭网\t49936\n基因组\t49937\n拉米娜\t49938\n传动比\t49939\n操舞\t49940\nGateway\t49941\nrizhi\t49942\n外图\t49943\n最高温度\t49944\n湖南华莱\t49945\n无向\t49946\n康先生\t49947\n出品商\t49948\nme2day\t49949\nNO.2\t49950\n后雨\t49951\n穿模\t49952\n夏达\t49953\n扩展板\t49954\nHKTDC\t49955\n29.0.0.140\t49956\n3.91\t49957\n假象\t49958\n罗森\t49959\n学校\t49960\n五征\t49961\n盈峰\t49962\nNDK\t49963\n学访\t49964\nOwenLee\t49965\n192路由网\t49966\n中国日报社\t49967\n万体馆\t49968\n嘉栋\t49969\nwin7-ZOL\t49970\n草花梨\t49971\n康尔\t49972\nGSB\t49973\n乡类\t49974\n床子\t49975\n喜马拉\t49976\n竹粉\t49977\n桉叶油\t49978\n插值算法\t49979\n桂林洋\t49980\nmpe\t49981\nWOWS-空中网\t49982\n戴尔显示器\t49983\n缴款单\t49984\n免洗洗手液\t49985\n林梦\t49986\n校园文化建设\t49987\n毛泽东文集\t49988\n勃发\t49989\n节徽\t49990\nmongoengine\t49991\n莫过于\t49992\nHobbies\t49993\n军模\t49994\n川财证券\t49995\n地方性\t49996\n胡旋舞\t49997\n操心\t49998\n娜迦之怒\t49999\n调色板\t50000\n千百遍\t50001\n台儿庄\t50002\nMARC\t50003\n浦北\t50004\n5.doc\t50005\n斗破苍穹全集\t50006\n马路须加学园\t50007\n光纤溶脂\t50008\n生成类\t50009\n0.18%\t50010\n玲珑路\t50011\njiafeng\t50012\n宇\t50013\n曲包\t50014\nZOL\t50015\n热血高校3\t50016\n粗沙\t50017\n史考\t50018\n103路\t50019\nswis
h\t50020\n混写\t50021\nPvP\t50022\nbl文\t50023\n_房县\t50024\n穿越\t50025\n吴春明\t50026\n沂水在线\t50027\n其影响\t50028\n蚊虫\t50029\n樱花谷\t50030\nvm虚拟机\t50031\n2432\t50032\n好味道\t50033\nNanoPi\t50034\njar包\t50035\n赵元\t50036\n步长\t50037\n复利现值系数\t50038\naoeiuv\t50039\n捷配\t50040\nGiovanni\t50041\n220V\t50042\nHowzhi\t50043\n冷s\t50044\n中国网球公开赛\t50045\n太上皇\t50046\nCopen\t50047\n平价\t50048\n上海万达瑞华酒店\t50049\n每周3\t50050\n软度\t50051\n哈德逊河\t50052\n索尼(中国)有限公司\t50053\n齐木南雄\t50054\n高桥街道\t50055\n仰口\t50056\n广州知识产权法院\t50057\n一长串\t50058\n黄国平\t50059\n我自我\t50060\n国家发展研究院\t50061\nBBK\t50062\nmx365\t50063\nlitepal\t50064\nTeflon\t50065\n外置声卡\t50066\n政泉\t50067\n万三\t50068\n一个十岁\t50069\n育儿宝典\t50070\n完美世界\t50071\n海石湾\t50072\nArmitage\t50073\n友力\t50074\n刘洪悦\t50075\n米兰理工大学\t50076\nV6发动机\t50077\n爱恋动漫BT\t50078\nfischer\t50079\n400平米\t50080\n赵睿\t50081\n转籍\t50082\n华信医院\t50083\n铜陵市\t50084\n凋敝\t50085\n金拐棍\t50086\nAxela昂克赛拉论\t50087\n中国东方航空股份有限公司\t50088\n信息系统专业\t50089\n神智\t50090\n金雅拓\t50091\n魔潮\t50092\npowell\t50093\n云验证码\t50094\n哀鸣\t50095\nonblur\t50096\ntdf\t50097\nmatlab\t50098\n▎\t50099\n巨乳妻\t50100\n工资率\t50101\nmediatek\t50102\nLisp\t50103\n飞蛾\t50104\n袁伟豪\t50105\n丰田RAV4\t50106\n一念之差\t50107\n乐清市人民医院\t50108\n南欧\t50109\n怡和集团\t50110\n基诺\t50111\n粗硬\t50112\n麻椒\t50113\nTout\t50114\n东南大学交通学院\t50115\n黄山天都峰\t50116\n人和街道\t50117\n印飞星\t50118\n天天操\t50119\n江铃驭胜\t50120\nHX711\t50121\n非药物治疗\t50122\n10.30\t50123\n优矿\t50124\n国金宝\t50125\n黄黄高铁\t50126\n中天国富证券有限公司\t50127\n添加物\t50128\n第18044期\t50129\n美丽传说\t50130\n晋中\t50131\n宝剑\t50132\n封天\t50133\n十五周年\t50134\n孤芳不自赏\t50135\n习总书记系列讲话精神\t50136\n机器人展\t50137\n四定\t50138\nCME\t50139\n西川中学\t50140\nGarden\t50141\nBehaviour\t50142\n总体性\t50143\n支模架\t50144\n移相\t50145\n20141207\t50146\narouter\t50147\n分工\t50148\n突然之间\t50149\n弓系\t50150\n中国同盟会\t50151\nMarkus\t50152\nSNMP协议\t50153\n交叉口处\t50154\n郎中\t50155\nIdo\t50156\nMatching\t50157\n支付宝来\t50158\nbootm\t50159\n620分\t50160\n吴亚军\t50161\n钢钎\t50162\n对我来说\t50163\n南通大学杏林学院\t50164\nETO\t50165\n1966\t50166\n高精尖\t50167\n呼召\t50168\n专票\t50169\n漏粪板\t50170\n人形\t50171\nwe
idian\t50172\n烟墩\t50173\nQImage\t50174\n莱州\t50175\n口水流\t50176\nneirong\t50177\n100万\t50178\npyspark\t50179\n未成年工\t50180\n阿塔哈卡神庙\t50181\nNetFlow\t50182\n徐娟\t50183\nハイ\t50184\n笑面\t50185\n煎饼果子\t50186\n量产\t50187\n税法\t50188\n行政判决书\t50189\n中共济南市纪律检查委员会\t50190\nDataReader\t50191\nhougan\t50192\n中印\t50193\n摸着石头过河\t50194\n肇庆\t50195\n沒\t50196\nMinistries\t50197\n代步\t50198\n铅汞\t50199\n2条\t50200\n王鸣\t50201\n10桌\t50202\n汉语拼音\t50203\n北科\t50204\n东莞市住房和城乡建设局\t50205\n第一回\t50206\n禅意\t50207\nBD国粤\t50208\nshunv\t50209\n百分数\t50210\n甲壳类\t50211\n正定新区\t50212\nc++版\t50213\n摇杆\t50214\n包吃\t50215\n数字文化网\t50216\n中央国家机关住房资金管理中心\t50217\n1um\t50218\n市安监局\t50219\n金山湖\t50220\n大袋\t50221\n中国信息安全认证中心\t50222\n一口气\t50223\n六角头螺栓\t50224\nexele\t50225\n合成物\t50226\n梅花三弄\t50227\n芭妮兰卸妆膏\t50228\n从容不迫\t50229\n工价\t50230\nMED\t50231\n物流师\t50232\n集通\t50233\n欧派家居集团股份有限公司\t50234\n郑恺哈登\t50235\n不寒而栗\t50236\n俄门\t50237\n爱新觉罗\t50238\nSmoking\t50239\n严父\t50240\n阿拉巴马大学\t50241\n杭州市民卡服务中心\t50242\n盐酥鸡\t50243\nv3.4\t50244\n90A\t50245\n小米VR\t50246\nPins\t50247\n中国室内装饰协会\t50248\n义无反顾\t50249\n永联\t50250\n18周年\t50251\n荣威950\t50252\n瑞丽女性网\t50253\n行管\t50254\n1.1万\t50255\nchang\t50256\nVENUS\t50257\n5.6.21\t50258\nminecraft服务器\t50259\n伸开\t50260\n战神纪\t50261\n麻辣社\t50262\n女人帮\t50263\n星核\t50264\n4752G\t50265\nbnu\t50266\n曹峰\t50267\n农林下路\t50268\n黄贯中\t50269\nPd\t50270\n270亿\t50271\n4.8元\t50272\n抽象\t50273\n搜索网\t50274\n万胜围\t50275\n原点处\t50276\n鱼药\t50277\nimporterror\t50278\n占空\t50279\n普遍性\t50280\nStaples\t50281\n中信\t50282\n贲张\t50283\nStewart\t50284\nsmbpasswd\t50285\n中国\t50286\n乱侃\t50287\n寒冰5\t50288\ngetvalue\t50289\nPatient\t50290\n上海交通大学生物医学工程学院\t50291\n血观音\t50292\n速汇\t50293\n右归胶囊\t50294\n清水池\t50295\n强征\t50296\n达蒙\t50297\n市卫生局\t50298\n企业形象\t50299\n市经信局\t50300\nArkenstone\t50301\n睡在我上铺的兄弟\t50302\n水下机器人\t50303\n字元\t50304\n膏蟹\t50305\n2世纪\t50306\n岁城璃心\t50307\n30斤\t50308\n中国城\t50309\n105度\t50310\n二百\t50311\n忠县\t50312\n黄一飞\t50313\n太仆寺旗\t50314\n野神\t50315\n那路\t50316\n抖咪\t50317\nponycar\t50318\nyjv22\t50319\n厦门植物园\t50320\n太平通宝\t50321\n网管交换机\t50322\nSV
M\t50323\n14.1\t50324\n2018.3.15\t50325\n团圆饭\t50326\nv1.02\t50327\n舒适性\t50328\n李泰\t50329\n理财师\t50330\nrosa\t50331\n舱位\t50332\n南果梨\t50333\n金猫银猫\t50334\n100问\t50335\n灯位\t50336\n破锁\t50337\n普田燃气灶\t50338\n开正\t50339\n刷级\t50340\n马来亚\t50341\n26张\t50342\n巫溪网\t50343\n挖方\t50344\n字句\t50345\nff6\t50346\nwww.69dns.com\t50347\n袁宝璟\t50348\n有潮\t50349\n银河小区\t50350\n瑶琳\t50351\n好慌\t50352\nsara\t50353\n洞洞\t50354\n002181\t50355\nHack\t50356\n异形大战铁血\t50357\nenlight\t50358\n百家号指数\t50359\n勤诚\t50360\n全男\t50361\n3毫米\t50362\n摆上\t50363\nPSCAD\t50364\n张炬\t50365\nArmani\t50366\n微微笑\t50367\n浮尘\t50368\nbowl\t50369\n三国杀传奇\t50370\nassignment\t50371\n西布罗姆维奇\t50372\n茂林修竹\t50373\n中航证券\t50374\n720p.HD高清\t50375\n建党\t50376\n乔治梅森大学\t50377\n乙腈\t50378\n不损\t50379\n赫赫\t50380\n腕部\t50381\n遮瑕膏\t50382\n宁波市环境保护局\t50383\njquerymobile\t50384\n人民政治协商会议\t50385\nPiper\t50386\n土鸡苗\t50387\n纳税人\t50388\nENTERPRISE\t50389\n信任度\t50390\n垂杨\t50391\n密封剂\t50392\n异动股\t50393\n民国二十五年\t50394\n总司\t50395\n柳善皓\t50396\n消减\t50397\n女生节\t50398\nbtbook\t50399\n禅鸣\t50400\nWIKI_着迷网\t50401\n前一晚\t50402\n铁蛋\t50403\nSSC\t50404\n首都师范大学附属中学\t50405\n久爱\t50406\nZX300A\t50407\n五律\t50408\n太辛苦\t50409\nscarf\t50410\n冒险岛炎\t50411\n金融股份有限公司\t50412\n螺壳\t50413\n许亮\t50414\n桌卡\t50415\n刺绳\t50416\n单子\t50417\nNingbo\t50418\nY8.com\t50419\nstrncmp\t50420\n单桥\t50421\nclement\t50422\n并没\t50423\nWin7系统菜单栏\t50424\n猎豹飞腾\t50425\nSpeaker\t50426\n双向四车道\t50427\n连续梁桥\t50428\n百度科技园\t50429\nsphinx\t50430\n武道宗师\t50431\n龙凤胎\t50432\n朝北\t50433\n史前文明\t50434\n3499\t50435\n纸枪\t50436\n冷冻柜\t50437\n柳木\t50438\n冰丝\t50439\n花式\t50440\n130克\t50441\nnt值\t50442\n青莲镇\t50443\nStandalone\t50444\nobesity\t50445\n李天方\t50446\n东莞公司\t50447\n赵丽娟\t50448\n聊城市人民政府\t50449\nove\t50450\n何建明\t50451\n中国国际医疗器械博览会\t50452\n上海金桥\t50453\n2018.4.22\t50454\n两票\t50455\n1598\t50456\n中检所\t50457\n会好\t50458\n义渠君\t50459\n80G\t50460\n何占豪\t50461\n浙江教育厅\t50462\n申贷网\t50463\n没骨气\t50464\n艺龙旅行网\t50465\n蓬松度\t50466\n广东省疾病预防控制中心\t50467\n重资\t50468\ngtsport\t50469\n禹\t50470\n海银\t50471\nDistributions\t50472\nc++11\t50473\n检测箱\t50474\nDot
A英雄站\t50475\nDBI\t50476\n徐威\t50477\n美豹\t50478\n我不是精英\t50479\n许峰\t50480\nmond\t50481\n故意杀人罪\t50482\n张庄路\t50483\n溢价率\t50484\n宇峰\t50485\n7980xe\t50486\n仿机\t50487\n网络机柜\t50488\nthatis\t50489\n三无食品\t50490\n帅克\t50491\n地下城堡2\t50492\n石基\t50493\n绘声\t50494\nMEI\t50495\nLPG\t50496\n刘文辉\t50497\n马晓春\t50498\n尺素\t50499\n赎回\t50500\nhaose\t50501\n欧德\t50502\nHRT\t50503\n去他妈的世界\t50504\n戴维南\t50505\nNBU\t50506\n大舅子\t50507\n标委\t50508\n中空壁缠绕管\t50509\n小果\t50510\nECL\t50511\n规范场\t50512\n塑封膜\t50513\n2013-2015年度\t50514\n赵倩\t50515\nbl吧\t50516\n商业登记证\t50517\n侵袭性\t50518\n汉阳陵\t50519\n成瘾者\t50520\n秦俑\t50521\n钓鱼船\t50522\nZ10\t50523\n努比亚Z17mini\t50524\n鲁保罗\t50525\n上海数据交易中心\t50526\n名课堂企业管理培训网\t50527\n哈利波特7\t50528\n掠\t50529\n楷体\t50530\ncad2007注册机\t50531\n雷霆三巨头\t50532\n大叉\t50533\n0.24.4\t50534\n奔驰GLC论坛_汽车之家论坛\t50535\nMockups\t50536\n张晓平\t50537\n麝香壮骨膏\t50538\nske48\t50539\n年中\t50540\n凉开水\t50541\n千里马\t50542\n堕落史\t50543\n海报展\t50544\n电箱\t50545\n空投箱\t50546\n2011年7月\t50547\n自体脂肪移植\t50548\n研磨仪\t50549\ndino\t50550\n超过30年\t50551\nqichacha\t50552\ntile\t50553\n炼魂\t50554\n空天猎\t50555\n见微知著\t50556\n红包袋\t50557\n魏松\t50558\n浦口火车站\t50559\n华为技术有限公司\t50560\n10.10_\t50561\n娜塔莉\t50562\n逍遥小书生\t50563\n2508\t50564\n沃尔沃\t50565\n西南模范中学\t50566\n统括\t50567\n嘉州\t50568\nendswith\t50569\nstudi\t50570\n五号\t50571\nhapi\t50572\n三十多年\t50573\n金色\t50574\n张国华\t50575\n宋再临\t50576\n不道\t50577\n抚远\t50578\n中考生\t50579\n税控器\t50580\n精装修公司_装饰公司\t50581\n九里堤\t50582\nM7206\t50583\n箱费\t50584\n北京幼升小网\t50585\n铁圈\t50586\n秋雨\t50587\n宋朝\t50588\n憾生\t50589\niPhone9\t50590\n1743\t50591\n免注册\t50592\n液氨\t50593\n猪精\t50594\n赛鹰\t50595\n联接\t50596\n脂肪\t50597\n总得分\t50598\n煤气\t50599\n边境经济合作区\t50600\n肌肉率\t50601\n遥远\t50602\nadobereader\t50603\n电子业\t50604\n车名\t50605\n韩烨\t50606\n外民\t50607\n绿豆\t50608\n氧枪\t50609\n李承鄞\t50610\n鹿城镇\t50611\n网易mc\t50612\n翻箱\t50613\n妖猫\t50614\n各队\t50615\nbaorant\t50616\n杨骏\t50617\n翔安区\t50618\n川金诺\t50619\n汽车连接器\t50620\n被动土\t50621\ndebuger\t50622\n结核分枝杆菌\t50623\n青岛58中\t50624\nceres\t50625\nShelf\t50626\nifeixiang\t50627\ncountries\t50628\nisaced\t50629
\nsomic\t50630\n科尼\t50631\n等效均布荷载\t50632\n北京银行股份有限公司\t50633\n懦\t50634\nuname\t50635\n田径\t50636\nandroid-sdk\t50637\n原子锁\t50638\n北京化工大学北方学院\t50639\nKUKA\t50640\n绿釉\t50641\nJavascrip\t50642\n基本信\t50643\n斯卡波罗集市\t50644\n宜州市\t50645\n华北电力大学新闻中心\t50646\nMKV版\t50647\n1922\t50648\n轮回者\t50649\n10055\t50650\n王哈\t50651\ncwz\t50652\n麻城房网\t50653\n集邮册\t50654\n斯拉夫人\t50655\n500T\t50656\nMetal-Flyme\t50657\n小伊\t50658\n不锈钢天地网\t50659\nstreams\t50660\n省工商局\t50661\n聂\t50662\n雨淋阀\t50663\n腰码\t50664\n榜眼\t50665\n巴戟天\t50666\n拔腿\t50667\n龙湖天宸\t50668\n_本溪农网\t50669\n奔跑吧兄弟第三季\t50670\n阅卷机\t50671\n犀牛\t50672\n储蓄国债\t50673\n10个工作日\t50674\n女尊\t50675\n10个小时\t50676\nv3\t50677\n002739\t50678\n轻点\t50679\njianshu\t50680\n原表\t50681\nCVX\t50682\n崔涛\t50683\n御状\t50684\n摩擦音\t50685\nmany\t50686\n3.5g\t50687\n一小节\t50688\n20万吨\t50689\n中国邮政储蓄银行股份有限公司\t50690\ntes\t50691\n肯恩\t50692\n通达街\t50693\nMOTOROLA\t50694\nFPV\t50695\n影音先锋xfplay\t50696\n全麦面粉\t50697\nQS\t50698\n联合会议\t50699\n第二十五回\t50700\n算及格\t50701\n北路\t50702\ncnetos7\t50703\n孤岛惊魂5\t50704\nidioms\t50705\nliability\t50706\nCDATA\t50707\n散文诗\t50708\nKSP\t50709\n91pom\t50710\nsinian\t50711\nEstablishing\t50712\n6000000\t50713\n龙王鲸\t50714\n风信子\t50715\n中国酒商网\t50716\n建工学院\t50717\n点数\t50718\nCabin\t50719\n英制螺纹\t50720\n未育\t50721\nligand\t50722\n春寒\t50723\nyeoman\t50724\nkissxsis\t50725\n玄奇\t50726\n18183皇室战争\t50727\n杨七郎\t50728\nexdoll\t50729\n异刃2\t50730\nMerger\t50731\n20141219\t50732\n刀马\t50733\n末次\t50734\n0.12\t50735\n悦悦\t50736\nmovi\t50737\n国防部长\t50738\n乡乡\t50739\n乔吉拉德\t50740\n生根\t50741\n赛科\t50742\n青海路\t50743\n梦话西游\t50744\n东莞地铁\t50745\n麻生早苗\t50746\n王小宾\t50747\n陈安之\t50748\nIMAGE\t50749\n席位\t50750\n天魂\t50751\n商志\t50752\n福彩刮刮乐\t50753\n中国发展研究基金会\t50754\n民事行为能力人\t50755\n唱吧麦颂\t50756\n70多天\t50757\nLarva\t50758\nTensor\t50759\npica\t50760\n防台\t50761\n微信第三方公众号\t50762\n夏海\t50763\n21米\t50764\n心塞塞\t50765\n王献之\t50766\n腋\t50767\n信子\t50768\n华中科技大学远程与继续教育学院\t50769\nNickel\t50770\ncline\t50771\n281\t50772\nHAL库\t50773\n网语\t50774\nBLDC\t50775\ncanvas\t50776\n谢立\t50777\n没那么难\t50778\n上海机场(集
团)有限公司\t50779\n中心校\t50780\n螃\t50781\n四七\t50782\n环氧丙烷\t50783\n科层制\t50784\n王牧笛\t50785\n1800亩\t50786\n眼虫\t50787\n安坐\t50788\n金色摇篮幼儿园\t50789\n血色素\t50790\n保税\t50791\n嘴内\t50792\n长沙幼儿园\t50793\n展视\t50794\n抽吸\t50795\n金沙滩壹号\t50796\n铝合金门\t50797\n佳偶天成\t50798\n1.4E\t50799\npapaya\t50800\n歌子酒\t50801\n童蒙\t50802\n状元府\t50803\n四十岁\t50804\n赛门铁克公司\t50805\n茶色\t50806\n红八财富\t50807\n补番\t50808\n树库\t50809\n出版社\t50810\n0779\t50811\n厦门市人民政府办公厅\t50812\n进修生\t50813\n股商\t50814\nMCS-51单片机\t50815\n文登市\t50816\nALBION\t50817\n香港元朗\t50818\n突击队长\t50819\n端州\t50820\nmonitoring\t50821\n酚醛\t50822\n宝龙一城\t50823\ninconsistency\t50824\nG12\t50825\n0512-95566\t50826\n新男\t50827\nTowards\t50828\n电子厂\t50829\n79%\t50830\n穿鞋\t50831\nmssq\t50832\nDMM.R18\t50833\n警案\t50834\n张大同\t50835\n空压站\t50836\narche\t50837\n周子健\t50838\n二例\t50839\nonLoad\t50840\nIgA\t50841\n京子\t50842\nFat\t50843\n分体式\t50844\n射水\t50845\n土木\t50846\n龙元建设\t50847\n财吧\t50848\nmanolo\t50849\nNAND\t50850\n云计算中心\t50851\n从心\t50852\n哆啦a梦剧场版\t50853\nSpears\t50854\n学习者\t50855\n海淘科技\t50856\n单元格格式\t50857\n杭州时代小学\t50858\nconstitutional\t50859\n小字符喷码机\t50860\n塘边\t50861\nPragmatic\t50862\n20141226\t50863\nwindows共享文件\t50864\n驴迹\t50865\nsplatoon\t50866\nBioinformation\t50867\n抱成\t50868\n上世纪50年代\t50869\n冤死\t50870\n双证在职研究生\t50871\n消毒药\t50872\n白宫风云\t50873\nApplication类\t50874\n球袜\t50875\nSocks5\t50876\n保险单\t50877\n鹤\t50878\n爪痕\t50879\n南湖社区\t50880\n海日\t50881\n波士\t50882\n几时\t50883\nreese\t50884\n闹市\t50885\nudemy\t50886\n上海金庆电子技术有限公司\t50887\n暗灵\t50888\n云购\t50889\n激素六项检查\t50890\n移步\t50891\n随地\t50892\n吊挂\t50893\n提呈\t50894\n日本航空\t50895\n宋静\t50896\n第7期\t50897\n西华师大\t50898\nV17\t50899\n维妮\t50900\nezboot\t50901\n重庆市商务委员会\t50902\nE路航\t50903\nupcoming\t50904\n武清\t50905\n舷窗\t50906\n陵县\t50907\nCIF\t50908\n大众人才网\t50909\n洋马\t50910\n华为路由Q1\t50911\n克里斯汀娜\t50912\nosaka\t50913\n举步\t50914\n离心式冷水机组\t50915\n西游\t50916\n实业\t50917\nbath\t50918\ndisucz\t50919\n会计制度\t50920\n灯珠\t50921\ncoredump\t50922\n梦幻西游五庄\t50923\n中网卡\t50924\n转页\t50925\nqq麻将\t50926\n咬一口\t50927\ncine\t50928\n悬崖\t50929\n斜抛运动\t50930
\n分野\t50931\n0.03\t50932\n2535\t50933\n王宝\t50934\n养元\t50935\n树脂瓦\t50936\n百草园\t50937\n贝儿公主\t50938\n客厅\t50939\n无词\t50940\nDNA聚合酶\t50941\n晚班\t50942\n集中连片特困地区\t50943\n玛丽黛佳\t50944\nligament\t50945\nserge\t50946\n张洪量\t50947\n莫斯卡托\t50948\nac100\t50949\nchunkhash\t50950\n9.7%\t50951\n#ifdef\t50952\n恽寿平\t50953\n豪典\t50954\n双子星公主\t50955\n上海杉达学院\t50956\n网目\t50957\nheapdump\t50958\n广东电大\t50959\n方皓玟\t50960\n中侨\t50961\n南巨\t50962\n盘妻\t50963\nfrontpage\t50964\nworkerman\t50965\nhbt\t50966\n1转\t50967\njunior\t50968\n不义联盟\t50969\n半岛电视台\t50970\n中国糖酒网\t50971\ndependence\t50972\n比例阀\t50973\n切它网\t50974\n柳屋生发液\t50975\n20170905\t50976\n天津全运村\t50977\n同步练习\t50978\n十族\t50979\n大唐移动\t50980\n细心\t50981\n炸死\t50982\n纵\t50983\n风筝\t50984\n落成\t50985\n衬塑管道\t50986\n共谋\t50987\n煎\t50988\nHD-MP4/1.5GB\t50989\n光谷软件园\t50990\n红米5A\t50991\n5.8.14.706\t50992\n石钟乳\t50993\n祭忆\t50994\nheadcount\t50995\n张江汀\t50996\n苹果7\t50997\n美空影院\t50998\n周立功\t50999\n三梯\t51000\n奥尔芬斯\t51001\n上海华师大\t51002\n158号\t51003\n国家电影局\t51004\n兰桂\t51005\n金科世界城\t51006\n暗影骑士3\t51007\n疏朗\t51008\n一结\t51009\nAlex_Michel-51CTO\t51010\n傲慢机器人大战\t51011\n懂爱\t51012\n最远\t51013\nCleanser\t51014\naftereffect\t51015\n方象\t51016\nHL7\t51017\n2017年4月13日\t51018\n豆腐机\t51019\n梁非凡\t51020\n蔡邕\t51021\n交底书\t51022\nBUSINESS\t51023\n宣传\t51024\n0546房产网\t51025\n袖箭\t51026\n20170801\t51027\n华博\t51028\n青海地区\t51029\n来令片\t51030\n易子\t51031\n曲霉菌\t51032\n医学检验所\t51033\n参照\t51034\nL383\t51035\n大王庙\t51036\n弯儿\t51037\n吴式太极拳\t51038\n天星教育\t51039\n生辰八字算命婚姻\t51040\nRetired\t51041\n金青鸟\t51042\n利可君片\t51043\n中国国际海运网\t51044\n九音\t51045\n天使基金\t51046\n三角型\t51047\n邓迪\t51048\n御殿场奥特莱斯\t51049\n纯阳\t51050\n193号\t51051\n古马\t51052\n热力工程\t51053\n成人高考专升本\t51054\nhtf\t51055\n诺阿\t51056\n信阳高新区\t51057\ncosting\t51058\n益生\t51059\nfml\t51060\n精疲力尽\t51061\n徐妍\t51062\n30周年\t51063\n玉环市\t51064\nrenpingsheng\t51065\n乐翻\t51066\n剃发\t51067\ncappuccino\t51068\nbrothers\t51069\n_联图\t51070\n百般刁难\t51071\nRaspberry\t51072\n气象雷达\t51073\nAddin\t51074\n运动季\t51075\n安美特\t51076\n礼片\t51077\n淘房网\t51078\n肩峰\t51079\n蒂华纳\t51080\n锐行\t5
1081\n国家工商行政管理总局行政学院\t51082\nstrand\t51083\n留职\t51084\nG1610\t51085\n花雨庭\t51086\n400年\t51087\n暴躁\t51088\n义乌人才网\t51089\n临汾路\t51090\n广东现代国际展览中心\t51091\n黑名\t51092\n中国水星网\t51093\n激波\t51094\n41寸\t51095\n维纳尔\t51096\n橙光游戏大全\t51097\n元素\t51098\nu型枕\t51099\n强东\t51100\nmh\t51101\n乙木\t51102\n20150711\t51103\n以少胜多\t51104\n扮演者\t51105\n记星\t51106\n酸枣糕\t51107\n土壤肥力\t51108\nCottage\t51109\nhito\t51110\nmodbus\t51111\nrecuva\t51112\n大大方方\t51113\n免打扰\t51114\n蓝色星球2\t51115\nfarewell\t51116\nVIVE论坛\t51117\n低俗化\t51118\n万剑\t51119\n陈规陋习\t51120\n空枪\t51121\n中华人民共和国合同法\t51122\n印花税\t51123\nSISTAR\t51124\n汉兰达\t51125\n腱\t51126\n法界\t51127\n吹水\t51128\n童话节\t51129\n锦鸡\t51130\n射精无力\t51131\n水污染\t51132\n君麻吕\t51133\n字库\t51134\n20部\t51135\n崔皓月\t51136\n恒大山水城\t51137\nDiamonds\t51138\n陈莎莉\t51139\ndateformat\t51140\n张昕\t51141\n没辙\t51142\n劲战\t51143\n支撑架\t51144\n专性\t51145\n城东小学\t51146\nSX\t51147\n第107\t51148\nfoxmail7.0\t51149\n折颜\t51150\n悟空遥控器\t51151\n嘉兴客运中心\t51152\n角平分线\t51153\n160首\t51154\n上海复旦大学\t51155\n雨柔\t51156\nProvidence\t51157\n齐鲁师范学院\t51158\n土木工程测量\t51159\n黄片\t51160\n第140集\t51161\n葉子\t51162\nHydrating\t51163\n护岸\t51164\n何常在\t51165\n盐析\t51166\n普华科技\t51167\n新兴铸管股份有限公司\t51168\n全椒网\t51169\n梁先生\t51170\nFlora\t51171\n挽词\t51172\n二张\t51173\nearthquake\t51174\nPacific\t51175\n巳时\t51176\n锈石\t51177\npremiere\t51178\nMask\t51179\n美网\t51180\n北大汇丰商学院\t51181\n______\t51182\n总代\t51183\n0117\t51184\ngary_tao\t51185\n锁清秋\t51186\n附加值\t51187\nimagenomic\t51188\n前沿性\t51189\n大步走\t51190\n郎家园\t51191\nMZY\t51192\n133号段\t51193\n血荒\t51194\n广东文理职业学院\t51195\n我是演说家\t51196\n打水仗\t51197\ndown在线翻译\t51198\n塑令\t51199\nrpm\t51200\n环渤海地区\t51201\numeditor\t51202\n农奋网\t51203\n小禾\t51204\n房树人\t51205\n快乐宝贝\t51206\n招办\t51207\n下差\t51208\n复兴西路\t51209\n{{}}\t51210\n马鼎盛\t51211\n白川乡\t51212\n何长工\t51213\n军鞋\t51214\n聚氨酯发泡剂\t51215\n火爆食品饮料招商网\t51216\nchloride\t51217\n白衣人\t51218\n周志华\t51219\n打者\t51220\n提心吊胆\t51221\n结束页\t51222\n意犹未尽\t51223\n独山\t51224\n百炼成钢\t51225\ngears\t51226\n天南和地\t51227\n产能\t51228\n斯坦福桥\t51229\n茂南区\t51230\n偏铝酸根\t51231\n120急救中心\t51232\nno
tpad++\t51233\n宋氏\t51234\n遥指\t51235\nKenwood\t51236\n标板\t51237\n请指教\t51238\n棕榈生态城镇发展股份有限公司\t51239\n长兴岛\t51240\n团结乡\t51241\n七套\t51242\n汉典\t51243\nhanded\t51244\n剑帝\t51245\nemuch\t51246\nlineedit\t51247\n清洗剂\t51248\n苏宁易购买\t51249\nwipes\t51250\n佳景\t51251\n塔防\t51252\n90天\t51253\n新民主主义革命\t51254\n横穿\t51255\n库卡\t51256\n青衣\t51257\n中國哲學書電子化計劃\t51258\n跳楼\t51259\n诺贝尔生理学\t51260\n网商\t51261\n煤气味\t51262\nPostGIS\t51263\n周别\t51264\n周丹\t51265\n丘钛科技\t51266\n传播学\t51267\nlcd液晶屏\t51268\n180CM\t51269\n第七十九章\t51270\n本悦宋\t51271\n阿尔法信\t51272\n恒力石化\t51273\n一锤子\t51274\n该行\t51275\n结核性腹膜炎\t51276\n南京北站\t51277\n黄金基金\t51278\n至圣\t51279\n深大通\t51280\n深海鱼\t51281\n无香型\t51282\ndataman\t51283\n葡萄酒庄\t51284\niphone浏览器\t51285\n10.3.2\t51286\n叫苦不迭\t51287\n子安\t51288\n豪森家具\t51289\n安华卫浴\t51290\n挂帘\t51291\n老爸\t51292\n花境\t51293\nwallpapers\t51294\n粗管\t51295\n柾国\t51296\n淫兽\t51297\n人身权\t51298\nBeckhoff\t51299\n净营收\t51300\n蟠龙路\t51301\n女尤\t51302\n中继站\t51303\n大理石\t51304\n三端稳压管\t51305\n汽车烤漆房\t51306\n防写\t51307\n7.24\t51308\n百卷\t51309\n20年以上\t51310\n外围赛\t51311\nA.3\t51312\n沈阳工程学院\t51313\n不言弃\t51314\n651\t51315\n1965年\t51316\n吴闲云\t51317\n法制史\t51318\n洋县人民政府\t51319\n鹰隼\t51320\n11th\t51321\n6655\t51322\n特检院\t51323\n英国大学\t51324\n房地产调控\t51325\n╳\t51326\n诗篇\t51327\n五冶集团上海有限公司\t51328\nOreal\t51329\n机场路\t51330\n模特公司\t51331\n旺年\t51332\n管端\t51333\n釉上彩\t51334\n合金装备3\t51335\nITB\t51336\n屠神\t51337\n静摩擦力\t51338\n精武风云\t51339\n小天地\t51340\nNDSL\t51341\n1257AD\t51342\n枚举变量\t51343\n城际高铁\t51344\n标石\t51345\nsmbclient\t51346\nflex-wrap\t51347\n立项\t51348\n额法\t51349\n朗斯\t51350\n张家界机场\t51351\n午托\t51352\nParticle\t51353\n太原\t51354\n浓缩胶\t51355\n下屏\t51356\n塞拉门\t51357\n王战营\t51358\n汽车生活网\t51359\n富士山\t51360\n克林霉素磷酸酯\t51361\n夺刀\t51362\n垦丁大街\t51363\n中原经济网\t51364\n刺客信条黑旗\t51365\n大都市区\t51366\n洗浴城\t51367\n高论\t51368\n俊越\t51369\n暗队\t51370\nifn\t51371\n鲁宾斯坦\t51372\n华禹教育网\t51373\n炼油\t51374\n行政编码\t51375\niFTY\t51376\n壮族吧\t51377\n达卡\t51378\n27倍\t51379\n480\t51380\n孤岛危机吧\t51381\n排牙\t51382\n星际迷航发现号\t51383\n民务实清廉\t51384\n空岛战争\t51385\n王侯将相宁有种乎\t51386\nPPT文件\t51387\n3
栋\t51388\n如园\t51389\n洋葱炒鸡蛋\t51390\n精减\t51391\n绿地中央广场\t51392\n票贩子\t51393\n忠诚卫士\t51394\n银行公司\t51395\n10万公里\t51396\n80平方\t51397\n遗音\t51398\nAdsorption\t51399\n没谈过恋爱\t51400\n钱儿\t51401\n安耐晒\t51402\n幸福歌\t51403\n梦野\t51404\nMD模拟器\t51405\n八万公里\t51406\n老祖\t51407\n黑头发\t51408\n山德士\t51409\n危\t51410\n姚合\t51411\n2年前\t51412\n修复体\t51413\nBreasts\t51414\nSEW\t51415\n愣子\t51416\n数据界面\t51417\n五言\t51418\n抗清\t51419\n石井街\t51420\n边儿\t51421\n茉莉广场舞\t51422\n丸九\t51423\n冷皮\t51424\n定性分析\t51425\n微电子科学与工程\t51426\n线程栈\t51427\nshes\t51428\n1139\t51429\n超能英雄\t51430\n2018六级\t51431\n李哲明\t51432\n教学科\t51433\n多玩网\t51434\n分文\t51435\nquotient\t51436\n河北省公安厅交通管理局\t51437\n加油发票\t51438\n收益型\t51439\n制氧机\t51440\nENV\t51441\nstayed\t51442\n600V\t51443\n花与爱丽丝杀人事件\t51444\n无形者\t51445\nmily\t51446\n摸金玦\t51447\nPhillip\t51448\n修水县委\t51449\n捻\t51450\n新批\t51451\n古诗三首\t51452\nVIVI\t51453\nappie\t51454\n古人类\t51455\n政府采购协议\t51456\n锐钛\t51457\n鼓楼实验小学\t51458\n家务劳动\t51459\n500毫米\t51460\n微博客\t51461\n八升\t51462\nTFboys\t51463\n水泥自流平\t51464\n母仓\t51465\n玫瑰花水\t51466\nSales\t51467\n郁金香\t51468\n伊尔库茨克\t51469\n道士下山\t51470\n文题\t51471\nPILZ\t51472\n小宏\t51473\n求全\t51474\n魔兽宝马3系\t51475\n白蓝\t51476\n竹杖芒鞋轻胜马\t51477\n得分后卫\t51478\n旅游资讯网\t51479\n不对劲\t51480\n郊游\t51481\n索拉非尼\t51482\n6.8折\t51483\n拿起\t51484\n西溪\t51485\n承托\t51486\nRasp\t51487\n边子\t51488\nCaramel\t51489\nscience\t51490\n等离子切割机\t51491\n日托\t51492\nbuding\t51493\n魔狼\t51494\n入党积极分子谈话记录\t51495\n隔音垫\t51496\n悠闲生活\t51497\n新视野大学英语2\t51498\nefc\t51499\n包装纸\t51500\n中国梦我的梦\t51501\n威兰德\t51502\n男声版\t51503\n轻信\t51504\nlimn\t51505\n天梭机械表\t51506\n16.00\t51507\n仁济医院\t51508\n强体\t51509\n浓度\t51510\n重数\t51511\n聚氨酯软泡\t51512\nDolce\t51513\n三国争霸2\t51514\n花木网\t51515\n维修班\t51516\n迎亲\t51517\n警长\t51518\n茅\t51519\n丧子\t51520\n男导演\t51521\n等速\t51522\n忽明忽暗\t51523\n徐健顺\t51524\n双登蓄电池\t51525\nT460\t51526\n死亡真相\t51527\nKesha\t51528\n石橄榄\t51529\n马尔文\t51530\npoppin\t51531\n3居\t51532\nNovel\t51533\nfancytree\t51534\n大佛普拉斯\t51535\n140000\t51536\n磨平\t51537\n存量房\t51538\n武铁\t51539\n孟郊\t51540\n在之前\t51541\n20170731\t51542\n#四海鲸骑#\t51
543\n仁恒西郊花园\t51544\n食用鱼\t51545\n激活器\t51546\n的等\t51547\n拖线板\t51548\ncsi\t51549\n食品盒\t51550\nWAV/整轨\t51551\n网上信息系统\t51552\n标胶\t51553\n方正兰亭\t51554\n【巢湖\t51555\n秘色瓷\t51556\n希尔顿酒店\t51557\n本山\t51558\n全球品牌畜牧网\t51559\nNEX-5T\t51560\n整形美容医院\t51561\n加宽\t51562\n市场价格表\t51563\n自在神医逍遥客\t51564\nV5.0\t51565\n严凤英\t51566\n戏院\t51567\n冯矿伟\t51568\n船讯网\t51569\n莘县新闻网\t51570\n中国计划出版社\t51571\nTranscription\t51572\n固定资产净值\t51573\n金牛贷\t51574\n邬君梅\t51575\n感想\t51576\nshrio\t51577\nreservation\t51578\n罗信\t51579\n宾利欧陆\t51580\n尾矿\t51581\n柠檬汁\t51582\n24.4\t51583\nby_\t51584\n坦克世界亚服\t51585\n德拉季奇\t51586\n吉川\t51587\n72片\t51588\n巫蛊师\t51589\n⒍\t51590\n最美的相遇\t51591\nAppServ\t51592\n江笙\t51593\n五处\t51594\n托业\t51595\n砖柱\t51596\n阿权\t51597\n吉他谱_吉他谱\t51598\n包裹体\t51599\n浩气盟\t51600\n音乐片\t51601\n顺颂\t51602\nIO流\t51603\nCameras\t51604\n爱久\t51605\n六朝清羽记\t51606\nElemental\t51607\n科比特\t51608\n哥德堡\t51609\n宝昌\t51610\n打板\t51611\n估测\t51612\n授衔\t51613\nwhr\t51614\nloc\t51615\n许先生\t51616\niwatch3\t51617\n省食药监局\t51618\n重庆西部\t51619\nSTMP\t51620\nunpivot\t51621\nado\t51622\n175家\t51623\n美人女\t51624\n心智\t51625\n张家界大峡谷玻璃桥\t51626\n华山医院\t51627\n饶威\t51628\nHBase\t51629\n酷开社区\t51630\nrosetta\t51631\n薪酬激励\t51632\n马小云\t51633\n银鹭花生牛奶\t51634\n45码\t51635\n合同价\t51636\n皇家艺术学院\t51637\n拆解\t51638\nSummers\t51639\n漏记\t51640\n崇信\t51641\n凯通\t51642\n阳澄湖大闸蟹\t51643\nXcode\t51644\n哥特式风格\t51645\n上海顶邦公司\t51646\n信神\t51647\n明珠苑\t51648\nsetenforce\t51649\nAE\t51650\n_流放之路\t51651\n黄瑶\t51652\n大屠杀\t51653\nDDD\t51654\n自助餐\t51655\n高楼\t51656\n澄湖\t51657\n阜丰\t51658\n江苏省文化厅\t51659\n打牌\t51660\n新馨苑\t51661\n乂学教育\t51662\n炉火\t51663\n搜索引擎\t51664\n布林顿\t51665\n一百\t51666\n脊椎\t51667\n3.2_\t51668\n耳边\t51669\n中南大学出版社\t51670\n开元金融\t51671\n黄委会\t51672\n中柏EZpad\t51673\n钢化夹胶\t51674\n四措\t51675\n社会保险登记\t51676\n外神\t51677\nmiui9\t51678\n发言人\t51679\n条顿\t51680\n贝壳粉\t51681\nhill\t51682\n解决方案部\t51683\n菜籽饼\t51684\n水浮莲\t51685\n视死\t51686\n高聚物改性沥青防水卷材\t51687\n细管\t51688\n府尹\t51689\nDAY6\t51690\n图型\t51691\n山西晚报\t51692\nm3/\t51693\n燕飞\t51694\n20171227\t51695\n半码\t51696\n一周年\t51697\nJessica\t5169
8\n安卓文件管理器\t51699\nmicrosoftedge\t51700\n浙大中控\t51701\n欧马可\t51702\n苹果5s\t51703\n七院\t51704\nj-1\t51705\n金融学硕\t51706\n污水站\t51707\n54路\t51708\n抄底\t51709\n两杠\t51710\n虚拟示波器\t51711\nNatio\t51712\nbattlegrounds\t51713\nn2n\t51714\n乔安\t51715\n滤布\t51716\n交际花\t51717\nrhce\t51718\n安娜\t51719\n丧胆\t51720\n乡人民政府\t51721\n30小时\t51722\n怪盗基德\t51723\n十二家\t51724\nProper\t51725\n15分钟左右\t51726\n解除合\t51727\n好视\t51728\n魔纹三国\t51729\nGg\t51730\n婚纱店\t51731\n姑婆山\t51732\n滴滴答答\t51733\n灌缝\t51734\nqstring\t51735\n中科院心理所\t51736\n沃尔玛公司\t51737\n阿瓦隆\t51738\n饰物\t51739\nbenzi\t51740\n长安铃木维特拉\t51741\nTIOBE\t51742\n神崎\t51743\n非限制性\t51744\n平利\t51745\n帮凶\t51746\n苏嘉\t51747\n折800\t51748\n埃塞克斯大学\t51749\n7本\t51750\n各子\t51751\n阳晨\t51752\n工银灵通卡\t51753\n司法公开网\t51754\n下轨\t51755\n神玉\t51756\nwzs\t51757\n脑膜瘤\t51758\n凝陇\t51759\nPulsar\t51760\n原函数\t51761\n投资品\t51762\n郴州市委\t51763\n跨页面通信\t51764\n中国化\t51765\n东城万达广场\t51766\nvirsual\t51767\n中升集团\t51768\n万盛街\t51769\n盛宣怀\t51770\n双拐\t51771\n木垒\t51772\n劳损\t51773\n遗珍\t51774\n博物馆日\t51775\n140周年\t51776\n三角尺\t51777\n共有权人\t51778\n青岛地铁\t51779\n自动化设备有限公司\t51780\nサマ\t51781\n六祖寺\t51782\n半句\t51783\n魅族MX6\t51784\n降阶\t51785\n公帐\t51786\n教育机\t51787\n热敏电阻\t51788\n在线考试\t51789\n返\t51790\n过去一年\t51791\n不足为奇\t51792\n噼\t51793\n信签\t51794\n掀背\t51795\n2017年1月10日\t51796\n3起\t51797\n2016年底\t51798\n中铝网\t51799\n陈坤杜甫\t51800\n轮台政府网\t51801\n39英寸\t51802\n二第二\t51803\n现有\t51804\n601558\t51805\n神机箭\t51806\nSPK\t51807\n玉环县\t51808\n透平油\t51809\nvcds\t51810\n段飞\t51811\n纳尼\t51812\n沙塔尔·沙吾提\t51813\n环翠楼\t51814\nDN25\t51815\n钒氮合金\t51816\n1200万\t51817\n荆轲刺秦\t51818\nDEFORM\t51819\nkenny\t51820\n放浪冒险谭\t51821\n国机重工\t51822\n宜州区\t51823\n深圳农行\t51824\n平板车\t51825\n4.4.4\t51826\n上海中国移动\t51827\n囊壁\t51828\n阿咖酚散\t51829\nrfcn\t51830\n苦酒\t51831\n不\t51832\n冷妃\t51833\nJavaScript中文网\t51834\ntata\t51835\nP1\t51836\n大邦\t51837\nkoolshare\t51838\n校警\t51839\n2兆\t51840\nJaedong\t51841\nfifaonline4\t51842\n粗集料\t51843\n9支\t51844\n爆发性\t51845\n征值\t51846\n波分复用器\t51847\nTF卡\t51848\njichu\t51849\n癒\t51850\nmokers\t51851\n3月10日\t51852\n陈祺\t51853\n乱开\t5185
4\nqieta\t51855\n超过90天\t51856\n供给\t51857\n橡木色\t51858\n宝马m3\t51859\n平阳县\t51860\n西畴县\t51861\n广东省皮肤病医院\t51862\n人大出版社\t51863\nenhance\t51864\nPP8安卓网\t51865\n武清区\t51866\n许勇\t51867\navp\t51868\n箱入\t51869\n50位\t51870\n三星相机\t51871\nbapi\t51872\n威斯汀酒店\t51873\n凹凹\t51874\n分舵\t51875\n中投证券\t51876\n周暨\t51877\n李公\t51878\n南京新街口\t51879\nKbps\t51880\n海洋之星\t51881\n展开\t51882\n大文斗作文网\t51883\nDouble-Eggs\t51884\n结论\t51885\n2850\t51886\n天天资源网\t51887\nJourney\t51888\neligible\t51889\n东风多利卡\t51890\n一动不动\t51891\n起云涌\t51892\n二门\t51893\n光复路\t51894\n区地税局\t51895\n卖铁\t51896\nxyj\t51897\n三三四\t51898\n福禄丸子\t51899\nHAHA\t51900\n八千岁\t51901\nFocus\t51902\n酸奶杯\t51903\n6.02\t51904\n子分区\t51905\n双显卡\t51906\n育路高校招生网\t51907\n起承\t51908\n讹钱\t51909\n四季豆\t51910\nTypescript\t51911\nActiviti-Modeler\t51912\n柴胡疏肝散\t51913\n墨西哥湾\t51914\nWindbg\t51915\n第120\t51916\n7.3敏锐\t51917\n雪种\t51918\n上海金属网\t51919\n罢演\t51920\n花舍\t51921\n数字集成电路\t51922\n虚假诉讼罪\t51923\n亦庄经济开发区\t51924\n紫带\t51925\n筑业网\t51926\n赏金魄罗\t51927\n广州市胸科医院\t51928\n奶点\t51929\n禁心尽力\t51930\n3213823\t51931\n吉姆罗杰斯\t51932\n显效\t51933\n教科室\t51934\n2018年3月29日\t51935\n6回\t51936\nNXT\t51937\n小蛮妻\t51938\n查尔达斯\t51939\n12月底\t51940\n八百家\t51941\n卤族\t51942\n腾讯大成网\t51943\nvachar\t51944\nbionic\t51945\nIcarus\t51946\n张千里\t51947\nchrist\t51948\n江淮瑞风s3\t51949\n水校\t51950\n禹城\t51951\nhtml5视频播放器\t51952\n保处\t51953\nvidex\t51954\n海水\t51955\n金字\t51956\n一下下\t51957\nmvb\t51958\n电子驱蚊器\t51959\n全国文学\t51960\n单位阶跃函数\t51961\n雪色\t51962\n纲领\t51963\nfoshan\t51964\n厚黑\t51965\n橡皮泥\t51966\n风华国乐\t51967\n警队\t51968\n利娜\t51969\n单瓣\t51970\n尼摩\t51971\nrecherche\t51972\n加塞\t51973\n帅男\t51974\n人人秀\t51975\n学习方法\t51976\n任凭\t51977\n国模吧\t51978\n广东省人民检察院\t51979\n养猪业\t51980\n来曲唑片\t51981\n50010\t51982\n死亡金属\t51983\n新天\t51984\n五一村\t51985\n5MB\t51986\n拼成\t51987\n小k\t51988\nsey\t51989\nscw98.com\t51990\n42章\t51991\n马丁吉他\t51992\n丰城\t51993\n植物生长调节剂\t51994\n微擎+微赞\t51995\n阿勒泰路\t51996\n一班\t51997\n黎苗\t51998\nquickly\t51999\nバラ\t52000\n27家\t52001\n小青龙汤\t52002\n苏珊娜\t52003\n少数\t52004\nsb2\t52005\n潘朝晖\t52006\n纳扎克\t52007\n咸阳市中心医院\t
52008\n海城市人民政府\t52009\n阿伯丁\t52010\n绰号\t52011\n杜荀鹤\t52012\n1V5\t52013\n10小时\t52014\n从速\t52015\n吴镇宇\t52016\n12宗\t52017\n尼米兹级航空母舰\t52018\n外阜\t52019\n以冬\t52020\n拉姆查兰\t52021\nSesso\t52022\n刘俊涛\t52023\n咿\t52024\n蓬莱\t52025\n中山社区\t52026\nipadios11\t52027\n冗余度\t52028\n第1181章\t52029\n赤溪镇\t52030\nchances\t52031\n京基\t52032\n第172集\t52033\n计算机基础\t52034\n7厘米\t52035\n商域\t52036\n丈人家\t52037\n2017年6月13日\t52038\n决心书\t52039\n金储宝\t52040\n防眩板\t52041\n非自然\t52042\n雷丁大学\t52043\nNavi\t52044\nreorg\t52045\n高海宁\t52046\n动词\t52047\n新亚洲\t52048\n冰书\t52049\n养户\t52050\n山西省新闻出版广电局\t52051\n土狗\t52052\n陆奇\t52053\n出项\t52054\n激励机制\t52055\n老蛮\t52056\n闵可夫斯基\t52057\nToxicology\t52058\n善解人意\t52059\n郝叔\t52060\n三权公开网\t52061\n活门\t52062\n卡钳\t52063\ncounted\t52064\n千里目\t52065\n龚道安\t52066\nTobii\t52067\n尽展\t52068\n南坪镇\t52069\n15年10月\t52070\n黔南州纪委监察局\t52071\n鬼泣4:特别版\t52072\n肉肉\t52073\n龙湖一号\t52074\n汪强\t52075\n住院部\t52076\nlibra\t52077\n钢纸\t52078\n楚秋\t52079\nGW\t52080\n差压流量计\t52081\n12348\t52082\n树里\t52083\n摩尔斯电码\t52084\n中大国际\t52085\n原来一样\t52086\n职业能力测验\t52087\n北京宝格丽酒店\t52088\n独立型\t52089\n科大国创\t52090\n明斯基\t52091\n现货_黄金网\t52092\n九十九\t52093\n百济新特药房网\t52094\n顾家\t52095\n邻苯二甲酸盐\t52096\n巨阙\t52097\nbishop\t52098\nldf\t52099\n园所\t52100\n宝应县\t52101\n20180430\t52102\n叮当快药\t52103\n李尚敏\t52104\n视觉型\t52105\n蓝湖郡\t52106\n保卫地球\t52107\n迪丽周渝民\t52108\n俩只\t52109\n董岩磊\t52110\nlog4js\t52111\n麻风疫苗\t52112\nwin7系统虚拟机\t52113\nbgc\t52114\n闻香识女人\t52115\nplastic\t52116\n金星路\t52117\nRevenant\t52118\n打鼓\t52119\n4家\t52120\n中航地产\t52121\n陈小生\t52122\n细菌性痢疾\t52123\n衢州19楼\t52124\nECshop\t52125\ngrit\t52126\n天天爱消除\t52127\n费减\t52128\n腾讯企业邮\t52129\nLenovoPC社区\t52130\n意美\t52131\n梦雪\t52132\n常有\t52133\n鉴定书\t52134\n深灰\t52135\n美国派\t52136\n青杠\t52137\n絮片\t52138\ndependent\t52139\n马\t52140\n博亚和讯\t52141\n加氟\t52142\n于为\t52143\nBioShock\t52144\n先锋电影网\t52145\nSemantics\t52146\n城西镇\t52147\n玩法师\t52148\n华远\t52149\n钓鱼竿\t52150\nspss\t52151\n律协\t52152\n口袋妖怪3DS_九游论坛\t52153\n水虎鱼\t52154\n巴风特\t52155\n能文能武\t52156\n北京清华长庚医院\t52157\n测试卷\t52158\n六论\t52159\n张玉嬿\t52160\n困斗\t52161\n四时田园杂兴
\t52162\n玻璃库\t52163\n大城小爱\t52164\n盈泰\t52165\n酮替芬\t52166\n嗜睡\t52167\n数值函数\t52168\n0726\t52169\n甘肃省教育考试院\t52170\nCollision\t52171\n王海英\t52172\n400元\t52173\n油石\t52174\n路路\t52175\nborders\t52176\niKuai\t52177\n宝马5系\t52178\n开元通宝\t52179\n热血版\t52180\n发卡器\t52181\n悬臂\t52182\n绿妈文\t52183\n2.3.1\t52184\n木屑\t52185\n蒋晖\t52186\n各奔东西\t52187\nwarming\t52188\n克明\t52189\n东海渔村\t52190\n年轻人们\t52191\n洛奇Mabinogi\t52192\npostman\t52193\n聚物\t52194\n一只眼\t52195\nbootstrop\t52196\n宫腔积液\t52197\n在职研究生教育信息网\t52198\n晁盖\t52199\nAttend\t52200\n迅通\t52201\n缩回\t52202\n淫娃\t52203\n1050TI\t52204\n高顿搜索\t52205\n破人亡\t52206\n银笛\t52207\n浙江大华技术股份有限公司\t52208\n霍洛威\t52209\n第十七批\t52210\nfgo2018\t52211\n中元\t52212\n降频\t52213\n肉鹅\t52214\n开饭乐\t52215\n1982\t52216\nUIAlertController\t52217\n刑具\t52218\n英雌\t52219\n哑妻\t52220\n咬合力\t52221\n上海乐高探索中心\t52222\n陈道\t52223\n三洞\t52224\n斯克里帕尔\t52225\n知应会\t52226\n市井_淮海网\t52227\n安省\t52228\n磁浮\t52229\n长城奖\t52230\n奥索卡\t52231\n袭人\t52232\n320c\t52233\n估损\t52234\n武汉北\t52235\n夏锦文\t52236\n0199\t52237\n撩开\t52238\n做坏\t52239\n集大\t52240\n畸恋\t52241\n27年\t52242\n凯里一中\t52243\n管理股\t52244\n梦幻五开\t52245\n休闲服装\t52246\n硬卧\t52247\n七方\t52248\n齐星集团\t52249\nImg\t52250\npyridine\t52251\n一见钟情\t52252\n巴慕达\t52253\n牧野由依\t52254\n南岸路\t52255\n玩年\t52256\nTSE\t52257\n04级\t52258\n政法委\t52259\n日海\t52260\n开春\t52261\nTID\t52262\n宠宠宠\t52263\n力扬\t52264\n1m3\t52265\nlev\t52266\n貂绒\t52267\n自行车展\t52268\n田斌\t52269\n钟子期\t52270\n残料\t52271\n教肓\t52272\n辟邪剑法\t52273\n日石\t52274\n东奥会计论坛\t52275\nITRoad\t52276\n百度沸点\t52277\n几眼\t52278\n白州\t52279\n鱼泉\t52280\n准爸爸\t52281\n证券公司监督管理条例\t52282\n手机依赖症\t52283\n有轨\t52284\n赵吏\t52285\n秋菊\t52286\n中共四大\t52287\n抚顺石化\t52288\n何小军\t52289\nWis\t52290\nlnb\t52291\n押车\t52292\n资料册\t52293\n赵卫国\t52294\n开查\t52295\nTeach\t52296\n鲁炜\t52297\n云锋基金\t52298\n河北省委\t52299\n陶庄镇\t52300\n矢量素材\t52301\n6极\t52302\n特惠价\t52303\ncharlotte\t52304\niphon\t52305\nRoutes\t52306\n悬崖勒马\t52307\n长安区政府\t52308\n_源码搜藏网\t52309\n谈恋爱\t52310\n窦万贵\t52311\n芳草湖\t52312\n守规\t52313\n64GB\t52314\n奔驰C200L\t52315\n脚掌\t52316\n【性\t52317\n投资性房地产成本\t52318\n虫王\t5
2319\n选项框\t52320\n湘潭市政府\t52321\n倾泻\t52322\n阿怡\t52323\n邓爷爷\t52324\n花生地铁\t52325\n一章\t52326\n虚电路\t52327\n下落不明\t52328\n湖滨路\t52329\n林欣\t52330\n综合社区\t52331\n91约约\t52332\n华浔\t52333\n酱卤\t52334\nsocks代理\t52335\n超级点子王\t52336\n2.5倍\t52337\n几朝\t52338\n紫襟\t52339\n邮电大学\t52340\n万里行官网315online.com\t52341\n陶谦\t52342\nLogan\t52343\n卡维地洛片\t52344\ntrustedinstaller\t52345\n枪击\t52346\nTaco\t52347\nWEBGL\t52348\n秒钟\t52349\n七武士\t52350\n医疗机构执业许可证\t52351\nCru\t52352\n6.6亿\t52353\n惠斯勒\t52354\nindexeddb\t52355\n出风头\t52356\n娴静\t52357\n富祥\t52358\nnats\t52359\n情情\t52360\n琼花\t52361\n优能佳人参鹿鞭片\t52362\n传奇思铂睿\t52363\n准确值\t52364\n富华\t52365\nQueer\t52366\n5星酒店\t52367\n狠毒\t52368\n截位\t52369\n平安扣\t52370\n花梗\t52371\nDian\t52372\n韩国女排\t52373\n金杜\t52374\nnuc\t52375\n东张西望\t52376\n黄鉴\t52377\n宜华\t52378\n塑封机\t52379\n金红\t52380\n王新宇\t52381\n烟蛋\t52382\n都芳漆\t52383\n镍\t52384\n滨江壹号\t52385\n桃谷\t52386\n严白虎\t52387\n菜根谭\t52388\nrefer\t52389\n风潮\t52390\n小u\t52391\n24首\t52392\nDS160表\t52393\n格特拉克\t52394\n800个\t52395\n安居型\t52396\n沈阳地铁4号线\t52397\n抗坏血酸\t52398\n广电运通\t52399\n螺类\t52400\nVenom\t52401\ncilent\t52402\nInterop\t52403\n你好乔安\t52404\n考研生\t52405\n旱灾\t52406\nE客\t52407\n蒙内铁路\t52408\n长生\t52409\nBust\t52410\n1层\t52411\n本数\t52412\n乌鸡国\t52413\n阿的江\t52414\n254\t52415\n循环式\t52416\n550元\t52417\n蜃气\t52418\n北京华贸中心\t52419\n不拘小节\t52420\n15则\t52421\nh61m\t52422\n增容\t52423\nopenVPN\t52424\n早早孕\t52425\n数字经济指数\t52426\n沮丧\t52427\n雾炮\t52428\n集创\t52429\nbrandy\t52430\n酒钢集团\t52431\ngender\t52432\n谭富英\t52433\n条法\t52434\n岘山\t52435\n几安\t52436\n春纪\t52437\n盐类\t52438\n鹏宇\t52439\n吉林铁道职业技术学院\t52440\n重庆时时彩计划\t52441\n太仆寺\t52442\n曾志\t52443\n未来日记\t52444\ndle\t52445\n2018年1月6日\t52446\nkarin\t52447\n落雁岛\t52448\nWestminster\t52449\n发展性\t52450\n安徽教育网\t52451\n共青团中央办公厅\t52452\n偷窥狂\t52453\n新疆移动\t52454\n网易彩票\t52455\nヴィ\t52456\n民和股份\t52457\n制服类\t52458\n照片集\t52459\nUSB驱动\t52460\n磷酸酶\t52461\n程红\t52462\npockets\t52463\n蔑称\t52464\nproprietary\t52465\n外甥\t52466\n华南虎\t52467\n一批批\t52468\n铠\t52469\n仙剑奇侠传三外传\t52470\n混音师\t52471\n許\t52472\n裕隆\t52473\n史前一万年\t52474\n三味书屋\t524
75\n启东市政府网\t52476\nroi\t52477\nheels\t52478\n基质\t52479\n滨海公园\t52480\n命运石之门吧\t52481\n约翰迪尔\t52482\naabb\t52483\n4012\t52484\n农村学校\t52485\n百度云|TVB\t52486\n区财政局\t52487\n氧基\t52488\n介休市\t52489\n少年军校\t52490\nHotToys\t52491\n镇赉\t52492\n浙江省人事厅\t52493\n我的生日\t52494\n沙坦\t52495\n翻供\t52496\nxampp\t52497\n1655\t52498\n匿名_天\t52499\n按一按\t52500\n大撼\t52501\ncudart\t52502\n易升\t52503\n恐同\t52504\n陶瓷类\t52505\n两米\t52506\n西苑医院\t52507\n天下城\t52508\n30步\t52509\nzmd\t52510\n失地\t52511\n裁判所\t52512\nkleine\t52513\n宋清\t52514\n20170102\t52515\nLora\t52516\nWeimob\t52517\n恬淡\t52518\n蚬\t52519\n暴击率\t52520\n宗师级\t52521\n惊门凯\t52522\n骆驼祥子\t52523\nT420s\t52524\ndwr\t52525\n公开赛\t52526\n哀悼\t52527\n不锈钢\t52528\nconcrete\t52529\n20150206\t52530\n史学\t52531\nRayFile\t52532\n四代火影\t52533\npending\t52534\n大扎\t52535\n转帐\t52536\n第3天\t52537\n喷沙\t52538\n渭滨\t52539\nCreatures\t52540\n麦芒5\t52541\n平方\t52542\n珠江嘉园\t52543\n小损样\t52544\n加油吧威基基\t52545\n六驱\t52546\n蝶窦\t52547\nv1.4.3\t52548\n张泽群\t52549\n斯蒂芬·威廉·霍金\t52550\n禁断介护\t52551\n平开\t52552\n陕西省煤炭生产安全监督管理局\t52553\n300070\t52554\nPhantomPDF\t52555\n求同\t52556\n那么久\t52557\n一元一次不等式组\t52558\n沙巴亚庇\t52559\n上杉\t52560\n断臂山\t52561\n消费量\t52562\n隐形眼睛\t52563\n智炫\t52564\n冰轮\t52565\n拳皇wing吧\t52566\n炮友\t52567\n晨枫\t52568\n调幅\t52569\n长江村\t52570\n错落有致\t52571\n洗车房\t52572\n2000倍\t52573\n3.76\t52574\n1023\t52575\nCK快播\t52576\n魏新雨\t52577\n水音\t52578\n辞\t52579\nA380\t52580\nhp1216\t52581\n无痛人流术\t52582\n劳务派遣单位\t52583\n3.8.3\t52584\n马连道\t52585\n傲世\t52586\n180420\t52587\n品德与社会\t52588\n河北高院\t52589\nasthma\t52590\ntexturepacker\t52591\n卡罗九阴真经\t52592\n大欢喜\t52593\n武汉农商行\t52594\n江毅\t52595\n简体形\t52596\n器具\t52597\n催经\t52598\n解手把手\t52599\n王卡\t52600\n马旦\t52601\nc位\t52602\n盛泰\t52603\n周启生\t52604\n砂糖桔\t52605\n货运部\t52606\nfmin\t52607\n承运商\t52608\n2月20日\t52609\n青梅酒\t52610\n奔驰g\t52611\n火商\t52612\n多模光模块\t52613\n张玉红\t52614\n双人间\t52615\n手爪\t52616\n斜对面\t52617\n天津一汽丰田汽车有限公司\t52618\n职衔\t52619\n铸铁\t52620\n重练\t52621\nalcon\t52622\n0211\t52623\n芙蕖\t52624\nVMware11\t52625\n完整\t52626\n天津地铁1号线\t52627\n硅藻泥电视背景墙\t52628\n冬曲\t52629\n南大文学
院\t52630\n富不过三代\t52631\n留学生\t52632\n枇杷膏\t52633\n建筑学概论\t52634\n银虎\t52635\npreceding\t52636\n传输机\t52637\n气敏\t52638\n华泰保险集团\t52639\n岱山岛\t52640\n口德\t52641\n陕西省公安厅\t52642\n财险公司\t52643\n铭牌\t52644\n红军小学\t52645\nAlign\t52646\n赏景\t52647\n水泥土搅拌桩\t52648\njike\t52649\nG1620\t52650\n南菜园\t52651\nCharacterization\t52652\n笑传\t52653\n横炮\t52654\n黄盘\t52655\n21首\t52656\n万块\t52657\n那条街\t52658\n时间表\t52659\nns\t52660\n工作邮件\t52661\nURP\t52662\n宠妻狂魔\t52663\n寄宿\t52664\nfdatool\t52665\n亚美\t52666\nCuSO4\t52667\nkingdomrush\t52668\nRecursive\t52669\n香港证监会\t52670\nORF\t52671\n6.9下\t52672\n种活\t52673\n未来之星\t52674\n刘克\t52675\n12.7.4.76\t52676\n城寨\t52677\n科利华\t52678\ndialing\t52679\n交通圈\t52680\noli\t52681\n两幢\t52682\n吉安职业技术学院\t52683\n初度\t52684\nMIIX2\t52685\n歌指弹\t52686\ndock栏\t52687\n鞍重股份\t52688\n酶法\t52689\nbite\t52690\nRC3\t52691\n唐人神\t52692\n当家\t52693\nmaven2\t52694\n自身抗体\t52695\n现浇板\t52696\n三角牛\t52697\n金投价格网\t52698\n天主教\t52699\n三电平逆变器\t52700\n张世英\t52701\nu6\t52702\n俊俊\t52703\nSkylake处理器\t52704\n余某\t52705\n31处\t52706\n浩渺\t52707\npriming\t52708\ndiam\t52709\n信征\t52710\n报告文学\t52711\n南湖\t52712\n白果镇\t52713\n两集\t52714\n赵悦\t52715\n陕西会计网\t52716\n裸舞曲\t52717\nwuxian\t52718\nHxinguan\t52719\n公博\t52720\nFindLaw\t52721\n同性恋\t52722\n周建国\t52723\n王静\t52724\nmget\t52725\n大冲村\t52726\n张家界站\t52727\n城南镇\t52728\n两万五\t52729\n缚誓者\t52730\n艰辛\t52731\n脱向\t52732\nLerche\t52733\n1000亿\t52734\n现场调查\t52735\n蜂桶\t52736\n叠合板\t52737\nxbox录屏\t52738\n中图法分类号\t52739\n心怡科技\t52740\n臭氧消毒机\t52741\n不买东西\t52742\n艺术的故事\t52743\nsharding\t52744\n南都娱乐周刊\t52745\n徐鑫\t52746\n博客园\t52747\n双坠\t52748\n综评\t52749\n中企信办\t52750\npart5\t52751\n盒盖\t52752\n足语\t52753\n迈克达威\t52754\nOSI\t52755\n下移\t52756\n武汉市人力资源和社会保障局\t52757\n系统服务管理器\t52758\nStar\t52759\nE950\t52760\nGee\t52761\n杨奕\t52762\n流波\t52763\n土木工程制图\t52764\n过渡性养老金\t52765\n5.1日\t52766\n齐云\t52767\n黑神\t52768\n化学性质\t52769\n0.69\t52770\n人才\t52771\n加害者\t52772\ntrouble\t52773\nS60\t52774\nvertical-align\t52775\n东南网\t52776\n水泥\t52777\n荣蓉\t52778\n丰田本田\t52779\nNEC\t52780\n断尾\t52781\n负序\t52782\n中研普华\t52783\n比迪丽\t
52784\n比亚迪f3\t52785\niwanna\t52786\n古古\t52787\n谁明浪子心\t52788\nostelin\t52789\n中国作家协会\t52790\n定时计数器\t52791\n宏志班\t52792\n祖国各地\t52793\n5毫克\t52794\n治法\t52795\n辽宁中医药大学附属医院\t52796\nPMP考试\t52797\n肉彩\t52798\n到病除\t52799\n易宝支付\t52800\nCOMPANY\t52801\n马家湾\t52802\n七牛开发者中心\t52803\n山西省煤炭工业厅\t52804\nword_office\t52805\n汉高百得\t52806\nCENSORED\t52807\nSJY之家\t52808\n夫妻关系\t52809\n艺术管理专业\t52810\nMillionaire\t52811\nLabor\t52812\n坂田银时\t52813\n拴\t52814\n更有用\t52815\n鑫贷\t52816\n天龙寺\t52817\n暴死\t52818\n兰岛物语\t52819\n狸花猫\t52820\n伟星\t52821\n坐脸\t52822\n渡情\t52823\nExcel表格下载器\t52824\n8场\t52825\n游人\t52826\n氮化碳\t52827\n亮光\t52828\nSingers\t52829\n成果奖\t52830\n灵水\t52831\n五花肉\t52832\nu9\t52833\nspringmvc\t52834\n张宝旬\t52835\n大鹏\t52836\n简答题\t52837\npanties\t52838\n上海政协\t52839\n仙羽\t52840\n本田思迪\t52841\n焦外\t52842\n逊尼派\t52843\n杀婴\t52844\nforecasts\t52845\n294\t52846\n真经\t52847\n醉虾\t52848\n委\t52849\n墨仓式\t52850\n917\t52851\nMarkets\t52852\n全国学联\t52853\n沢\t52854\n软件工\t52855\n7代\t52856\nSANS\t52857\n瓜类\t52858\n缩紧\t52859\n传销案\t52860\n防爆配电箱\t52861\n爱信网\t52862\npers\t52863\nECShop\t52864\n小赢卡贷\t52865\n吃空\t52866\n皮沙发\t52867\nWir\t52868\n贝克汉姆\t52869\n逃跑率\t52870\n蒹葭\t52871\n自定义控件\t52872\n积化\t52873\n酷龙\t52874\n葛炮\t52875\n二、三\t52876\nw60\t52877\n最终幻想世界\t52878\n40k\t52879\n人工呼吸\t52880\n第30期\t52881\n迷航\t52882\ndreamweaver8\t52883\n兰亭\t52884\n医学检验技术专业\t52885\n语文学习\t52886\n查拉图斯特拉如是说\t52887\n茅塞顿开\t52888\n尼弗迦德\t52889\n千血\t52890\n中共五大\t52891\nsalad\t52892\n掰开\t52893\n守卫剑阁三幻神\t52894\n屌毛\t52895\nmercalli\t52896\n利处\t52897\n孙松\t52898\n538porm\t52899\n广西田东县人民政府\t52900\n成物\t52901\nmccartney\t52902\nrime吧\t52903\n徐闻县\t52904\n求艺\t52905\niconv\t52906\n香吧香\t52907\n千锋\t52908\n娇滴滴\t52909\n堀北真希\t52910\n夜灯网\t52911\n登山杖\t52912\n大岚镇\t52913\n市交通委\t52914\n牙距\t52915\n弹奏\t52916\n汇博\t52917\n笔计\t52918\n3万公里\t52919\n凉拌腐竹\t52920\n矿力\t52921\nEe\t52922\n水爷\t52923\nManger\t52924\n复句\t52925\n微安\t52926\n2.66\t52927\nAnjelica\t52928\n急求\t52929\nmusica\t52930\n4世纪\t52931\n望穿秋水\t52932\n143家\t52933\n雌堕\t52934\n驱除\t52935\n歪脖老母\t52936\n汤姆赫兰德\t52937\nfg800\t52938\
n\u0007马尔康\t52939\n重庆环球金融中心\t52940\n勒泰中心\t52941\nCommunication\t52942\nAuthorware\t52943\npet3\t52944\n罗玲\t52945\n侵害\t52946\npickle\t52947\n茎类\t52948\n乔顿\t52949\n孝经\t52950\npvc穿线管\t52951\n暴力\t52952\n2016年04月\t52953\n液化气瓶\t52954\nTrainer\t52955\n梦次元\t52956\nbdy\t52957\n中国药理学会\t52958\n唇唇\t52959\n科洛弗档案\t52960\n灶神\t52961\n新年快乐\t52962\n杨志勇\t52963\n对不齐\t52964\n木沙发\t52965\n壕礼\t52966\naw\t52967\n投影幕\t52968\nSHARE\t52969\n情愿\t52970\n歼-20\t52971\nNaim\t52972\n35毫米\t52973\n兰德公司\t52974\nraffles\t52975\nStardust\t52976\ncrsctl\t52977\n安卓x86\t52978\n1905年\t52979\nスロ\t52980\n金陵晚报\t52981\n蜡梅\t52982\nBeijinger\t52983\n珑悦\t52984\n语数\t52985\n5.6.13\t52986\n丁二酸\t52987\n齐头并进\t52988\n860k\t52989\nSlickEdit\t52990\n天津东站\t52991\n83页\t52992\n道貌岸然\t52993\n光明行\t52994\n纸短\t52995\n姜异康\t52996\n马骁\t52997\n零下\t52998\n杭州市水务控股集团有限公司\t52999\n本装\t53000\n人人平等\t53001\nMarried\t53002\n副翼\t53003\n功败垂成\t53004\n13.9\t53005\n正名\t53006\nfpic\t53007\n昙花一现\t53008\n易容卡\t53009\n书剑恩仇录\t53010\n0xe6\t53011\n白杠\t53012\nlevi\t53013\n镜室\t53014\n银川市区\t53015\n四川信托\t53016\nspanning\t53017\njyy\t53018\n气海\t53019\n新达\t53020\n弓妹\t53021\n较于\t53022\n高倩\t53023\n老慢\t53024\n瘦瘦\t53025\n微乳\t53026\n打猎\t53027\nvisa信用卡\t53028\n永福县\t53029\n公允性\t53030\nultrasonic\t53031\n钟小艾\t53032\nGreasy\t53033\n航空港\t53034\n水果篇\t53035\n符干\t53036\nWING\t53037\n范雎\t53038\n混沌石\t53039\n东南大学图书馆\t53040\nvob\t53041\n技术规范\t53042\n中吉\t53043\n主身\t53044\n凯里南站\t53045\n朝5晚9\t53046\n卢克索\t53047\n巴氏腺囊肿\t53048\n杀人魔\t53049\nCRC32\t53050\n240s\t53051\nD2C\t53052\n优谈\t53053\n发电\t53054\n1口\t53055\nvisco\t53056\n刘文静\t53057\n百转千回\t53058\nflc\t53059\n港城大道\t53060\n别上当\t53061\n树上\t53062\n持币\t53063\njgroups\t53064\nmingw-w64\t53065\n7场\t53066\n蚓激酶肠溶胶囊\t53067\n苞米\t53068\n公测\t53069\nSymantec\t53070\n学分绩\t53071\n中山西区\t53072\n汉尼\t53073\n李洪斌\t53074\nM128fn\t53075\npink\t53076\n无功补偿柜\t53077\n秘洞\t53078\n金台寺\t53079\n桃花芯\t53080\n萨菲隆\t53081\n环太平洋机甲\t53082\n中考_3773考试网\t53083\n浇花\t53084\n酵素\t53085\n皇极\t53086\n省部级干部\t53087\n窗花\t53088\n质壁\t53089\nNess\t53090\n2018年1月21日\t53091\n雀友\t53092\nmd
5码\t53093\n段奕宏\t53094\n假和尚\t53095\n嘉兴港\t53096\nni\t53097\n迁爱\t53098\n钕铁硼\t53099\n大海贼\t53100\n0.4米\t53101\n吴俊杰\t53102\n绿地国际理想城\t53103\n键名\t53104\n亚细亚\t53105\nbytebuffer\t53106\n20170919\t53107\n口袋妖怪心金\t53108\nCFPS\t53109\n魏村\t53110\n采矿业\t53111\n2016年10月11日\t53112\nyankee\t53113\n清大教育\t53114\n圣教军天谴流\t53115\n阳伞\t53116\n氨基反应\t53117\n同志片\t53118\nRHCSA\t53119\n耗费\t53120\n故是\t53121\nturnover\t53122\n色噜噜\t53123\nUCenter\t53124\n河南广播电视大学\t53125\n夜行书生\t53126\n角头\t53127\n防具\t53128\nGEP\t53129\n三瓶\t53130\n曹某\t53131\n鞍\t53132\n恐怖博士\t53133\n店口镇\t53134\n兴宁\t53135\n吸盘式\t53136\n怀化南站\t53137\n迅雷BD高清中字mp4\t53138\nLCMS\t53139\n紫外线杀菌灯\t53140\n第9\t53141\n宫廷剧\t53142\n长安村\t53143\n中美欧\t53144\n房地产估价师考试\t53145\n中共辽宁省委\t53146\nqrc\t53147\n润乾\t53148\n第48章\t53149\nwriterow\t53150\nYet\t53151\n美国政府\t53152\n嘎巴拉\t53153\n李春城\t53154\n一公升\t53155\n木已成舟\t53156\n冷王\t53157\n伤湿止痛膏\t53158\n液压钳\t53159\n金正峰\t53160\n刘琛\t53161\n仙台市\t53162\n粒状\t53163\n配档\t53164\n云梦四时歌\t53165\n出阁\t53166\n佛山罗村\t53167\n刘子光\t53168\n一行一行\t53169\n叶丽仪\t53170\n平安城市\t53171\nt62\t53172\n准格尔之窗\t53173\n中学生活\t53174\n主持界\t53175\n别克昂科雷\t53176\nanni\t53177\n依视路镜片\t53178\nPanerai\t53179\n合唱节\t53180\n包氏\t53181\n志愿服务条例\t53182\n刘海洋\t53183\n邮政储蓄信用卡\t53184\n峨嵋\t53185\n海洋出版社\t53186\n唐家三少\t53187\n铁血社区\t53188\n香樟花园\t53189\n邱氏\t53190\nremain\t53191\n武勋\t53192\n周防雪子\t53193\n黄黑\t53194\n9291\t53195\n有毅力\t53196\n即使\t53197\n罪案\t53198\n战地4\t53199\n骑枪\t53200\n天使们\t53201\n好好听\t53202\n辛酸史\t53203\n酮康唑\t53204\nHerobrine\t53205\nMaven\t53206\nPelican\t53207\n怔住\t53208\n完成率\t53209\n老虎牙\t53210\n热图网\t53211\n十里堡\t53212\nnew对象\t53213\n20170922\t53214\n王者荣耀鲁班\t53215\nKayla\t53216\nKx\t53217\n陈克\t53218\n网石\t53219\n白杰\t53220\n第6版\t53221\nVirtualBOX\t53222\n39800\t53223\n葫芦岛北\t53224\n核燃料\t53225\nrequiring\t53226\nVae\t53227\n#000000\t53228\n百度开发者中心\t53229\n企业征信\t53230\n小人头\t53231\n帝尧\t53232\n看云\t53233\n培美曲塞\t53234\n蟹爪莲\t53235\n西安电信\t53236\nForwarding\t53237\npajek\t53238\n缀化\t53239\n6.10.1129.0\t53240\n红河市\t53241\n水仙花数\t53242\n特教\t53243\n七间\t53244\n展车\t53245\n警察类\t53246\nbiom
ass\t53247\n龙珠超宇宙\t53248\nwinzip\t53249\n培文\t53250\n开心速递\t53251\n存值\t53252\n伏地\t53253\n死灵师\t53254\n1025\t53255\n100股\t53256\n16万亿美元\t53257\n天津公交\t53258\n485回\t53259\n固液\t53260\n天下金融网\t53261\n束身\t53262\nER\t53263\n武林至尊\t53264\n加增\t53265\nkonami\t53266\n相关联\t53267\n邓县\t53268\n悔不当初\t53269\n2017-09-20\t53270\n黎明之战\t53271\n有色金属\t53272\nofdm\t53273\n济南市口腔医院\t53274\n第2天\t53275\nCindy\t53276\n呼家楼\t53277\nx9s\t53278\n乔伊斯\t53279\naramex\t53280\n长安欧尚x70a\t53281\n灰帽\t53282\nDaimler\t53283\n明天中午\t53284\n论说\t53285\n科罗拉多大学\t53286\nDISK\t53287\n诺\t53288\nmacbook12\t53289\n诚衣\t53290\n弱队\t53291\n智逸版\t53292\n壕沟\t53293\nMiui9\t53294\n中通配符\t53295\npychrm\t53296\n百花路\t53297\n下乡\t53298\n年型\t53299\n保理\t53300\nAllstars\t53301\n无锡市人民政府\t53302\n001\t53303\n九阿哥\t53304\n南京大屠杀死难者国家公祭日\t53305\nMayday\t53306\n迅雷链克\t53307\n势阱\t53308\n1.5W\t53309\n365夜\t53310\n永辉\t53311\nmathjax\t53312\n太阳城娱乐城\t53313\n农心杯\t53314\n没感觉\t53315\n小墨\t53316\nBlond\t53317\n里诺仓库管理软件\t53318\nbluetooth\t53319\n教学周\t53320\nBeatrich\t53321\n0833\t53322\n逆水凌渡\t53323\n2016.05\t53324\n微软Edge浏览器\t53325\n张秋\t53326\n奥林匹斯星传\t53327\n2019\t53328\nUltraMon\t53329\n陈希曾\t53330\n大理州\t53331\n稻田\t53332\n口温\t53333\n栓剂\t53334\n有效率\t53335\n汤溪\t53336\n腮红金\t53337\nst3\t53338\nGogs\t53339\nMODEL\t53340\n104国道\t53341\n2.11\t53342\nalyssa\t53343\n旌阳区\t53344\n冥币\t53345\n林林\t53346\n磁力链接|磁力搜索|\t53347\n鞍山西道\t53348\n第九十三章\t53349\n野良猫ハ\t53350\nmhc\t53351\n成武县\t53352\njulebu\t53353\n汪仔\t53354\nAmoy\t53355\n琅勃拉邦\t53356\n珠子\t53357\nmp3+\t53358\n板绘\t53359\n红图\t53360\n四环素\t53361\n中国家具协会\t53362\n愿景\t53363\nidm下载器\t53364\n拉什福德\t53365\n汶水路\t53366\n精雕图\t53367\nSvchost\t53368\n全面战\t53369\n给排水科学与工程\t53370\nHttp代理IP/Socks5\t53371\narma3吧\t53372\n北雁湖\t53373\ncapitalism\t53374\n吹毛求疵\t53375\n朽\t53376\n江疏影\t53377\n钙化灶\t53378\n天津欢乐谷\t53379\nInsights\t53380\n泛亚汽车\t53381\ngive\t53382\n世运\t53383\n广州粤波医院\t53384\nYHOUSE悦会\t53385\n不均等\t53386\n小杨生煎\t53387\n曼巴\t53388\n2第1\t53389\n三_山西新闻网\t53390\n重识\t53391\n文化观\t53392\n硕士生入学考试\t53393\n董大\t53394\n孙盛希\t53395\n王爽\t53396\n王力门\t53397\n金
洲慈航\t53398\nr50\t53399\n奴良陆生\t53400\n筑业\t53401\n论语八则\t53402\nIP段\t53403\n年饭\t53404\n苗场\t53405\nCPR\t53406\n深受\t53407\n名图琴帝\t53408\nnmtui\t53409\n四至\t53410\n李文彬\t53411\n秘密基地\t53412\n第五大街\t53413\nnails\t53414\n棒球棍\t53415\n马振桓\t53416\n微水\t53417\n抗倭\t53418\n矽钢片\t53419\n宋芸桦\t53420\n伊恩·麦克尤恩\t53421\n新浦路\t53422\n新民晚报\t53423\n阁\t53424\n各方面\t53425\n惠东县人民政府\t53426\nARGOX\t53427\n尿袋\t53428\nTOYO\t53429\nWindows7SP1补丁包\t53430\nCFO\t53431\n减肥史\t53432\ndirector\t53433\ncontinent\t53434\nGXG\t53435\n刑\t53436\n04月18日\t53437\n无偏估计\t53438\n胶囊酒店\t53439\njdc\t53440\n先导智能\t53441\n吴中天\t53442\n汤馆\t53443\n台乡\t53444\n透射电镜\t53445\n百万朵\t53446\n中国经济特区\t53447\ncanteen\t53448\n釜\t53449\n14章\t53450\nLeaf\t53451\n6.44\t53452\n比亚迪F3论坛_汽车之家论坛\t53453\n小米超神\t53454\n强击\t53455\n侗族\t53456\n山东中烟工业有限责任公司\t53457\ncsd\t53458\n油鸡\t53459\n等价性\t53460\n中职\t53461\n【古灵精探\t53462\n眼笼\t53463\n申请版\t53464\n热云\t53465\n鳌峰\t53466\n邪影芳灵\t53467\nmiktex\t53468\nPS4/Xbox\t53469\n十二宫\t53470\nBin\t53471\n江西路\t53472\n长沙芙蓉区人民政府\t53473\n天都城\t53474\n六院\t53475\n冷爱\t53476\n伊什塔尔\t53477\nep51\t53478\n北京人文大学\t53479\n黑鸭\t53480\n查询网\t53481\nPolling\t53482\n张松茂\t53483\n北窗\t53484\n紫府\t53485\n博信\t53486\nendeavor\t53487\nbeishi\t53488\nThemes\t53489\n西仪股份\t53490\n上古卷轴ol吧\t53491\n2315\t53492\n七针\t53493\n苦苣\t53494\n德赛西威导航\t53495\nucrtbase\t53496\n十大\t53497\n德豪润达\t53498\n酿\t53499\n湖南省博物馆\t53500\n高温炉\t53501\n2017年7月\t53502\n有机碱\t53503\n伟神\t53504\n米格\t53505\n0.49%\t53506\nKE科日光伏网\t53507\n中国书法\t53508\n零学\t53509\n老本\t53510\n有色网\t53511\n瑞安日报\t53512\n文圣区\t53513\n很苦逼\t53514\n盐湖城\t53515\n平复\t53516\n和平区\t53517\n苏州招聘网\t53518\n24倍\t53519\n巨子\t53520\nHighscore\t53521\n冬拥湖\t53522\n航华\t53523\nimi\t53524\n257\t53525\n仍可\t53526\nCub\t53527\n60条\t53528\n狗窝\t53529\n一德路\t53530\n房产证\t53531\n文创产业\t53532\n深汕特别合作区\t53533\n海医\t53534\n拖把头\t53535\n一步登天\t53536\nErick-LONG\t53537\n体简\t53538\n普通话证\t53539\n贾南风\t53540\n0.55%\t53541\n制浆\t53542\n鄂钢\t53543\n利物浦足球俱乐部\t53544\n民大\t53545\n安小莫\t53546\n主家\t53547\n品牌馆\t53548\n结晶器\t53549\nSigns\t53550\n另有\t53551\n金立m5\t53552\n雷鬼\t53553\n
M435nw\t53554\n治治\t53555\n孙允珠\t53556\n瑜伽\t53557\n泛亚铁路\t53558\n针织帽\t53559\n订单式\t53560\n战斗性\t53561\n第三页\t53562\n上原亚衣\t53563\n乱奸\t53564\npicard\t53565\n3.5寸\t53566\n忘不了你的爱\t53567\n美容器\t53568\n屠夫\t53569\n第28个\t53570\n身证\t53571\n州长\t53572\n庐山风景名胜区\t53573\njianjie\t53574\n广州市番禺区人民政府\t53575\n时许\t53576\nHarvest\t53577\n8月5日\t53578\n大姨妈\t53579\nsif\t53580\n标出\t53581\n天津人\t53582\n仪器仪\t53583\n中班组\t53584\n新民晚报数字报\t53585\n116个\t53586\n对话会\t53587\n看啦\t53588\n长沙妈妈网\t53589\n东部华侨城\t53590\nyouyou\t53591\n丹霞山\t53592\n1998年\t53593\n纽曼斯\t53594\n中华美德\t53595\nmarie\t53596\nQQ糖\t53597\nGIL\t53598\ntofel\t53599\n新编赞美诗\t53600\n600毫升\t53601\n程虹\t53602\n黄莺\t53603\n小巴郎\t53604\n佳能mp236\t53605\nv2.8\t53606\n兵不血刃\t53607\n灰树花\t53608\n脸面\t53609\ncollation\t53610\n飞鼠\t53611\n碳氢清洗机\t53612\n20kb\t53613\n中远海发\t53614\n有\t53615\n衬布\t53616\n照亮\t53617\n采暖季\t53618\nAA制\t53619\nboyfriend\t53620\nDicky_Zhang\t53621\nwolframalpha\t53622\n破甲\t53623\n收单业务\t53624\nPLEXTOR\t53625\n国珍松花粉\t53626\n滑膜肉瘤\t53627\n邪教徒\t53628\n王岩\t53629\n上海卫生人才网\t53630\n剑荡八荒\t53631\n大运河\t53632\n上海东方肝胆外科医院\t53633\nEAP\t53634\n型窗\t53635\nCOLG\t53636\n99.5\t53637\n猎人|Hunter-魔兽世界\t53638\ndrift\t53639\n蕾切尔·薇姿\t53640\n趣炫\t53641\n乙醇\t53642\n2品\t53643\n建建\t53644\n三脂\t53645\nrados\t53646\n白雪公主与七个小矮人\t53647\nBBB\t53648\n诡匠\t53649\n几周\t53650\n心青\t53651\n戏痴\t53652\n王者荣耀S9\t53653\n王洋\t53654\n紧盯\t53655\n高频电路\t53656\nstrang\t53657\n片叶\t53658\n风量\t53659\n钢质\t53660\n5603\t53661\nmont\t53662\n齐格勒\t53663\n卢惠光\t53664\narcserver\t53665\n绞索\t53666\nvod-type\t53667\n王莫涵\t53668\n国安社区\t53669\n南光\t53670\n电子签名\t53671\n上午10点\t53672\n喜马拉雅山脉\t53673\n人民代表大会制\t53674\n人物包\t53675\n2.37\t53676\n八头身\t53677\n圆融时代广场\t53678\n应收款\t53679\nshengao\t53680\nF2Pool\t53681\nVidz\t53682\n小升规\t53683\n烤全羊\t53684\n托森\t53685\nY58试驾网\t53686\n节假\t53687\n星峰\t53688\n农信社\t53689\n1593\t53690\n企信宝\t53691\n建辉\t53692\n神奇宝贝xy\t53693\n川建价\t53694\n灿白文\t53695\nKhronos\t53696\n中国教育在线\t53697\n小窍门\t53698\n0.66\t53699\n木兰围场\t53700\niso系统\t53701\n郎酒\t53702\n光栅\t53703\n新商盟烟草\t53704\n最污\t53705\n希伯来书\t53706\n叶
嘉莹\t53707\n蜡烛图\t53708\n石龙\t53709\n易码\t53710\n杨贵\t53711\nA1528\t53712\n模面\t53713\n泛海国际\t53714\n波动\t53715\n朝中\t53716\n王建华\t53717\napi接口\t53718\n蓝晓科技\t53719\n中国农业大学人发学院\t53720\nindicated\t53721\nbth\t53722\nCMN\t53723\n畅\t53724\nLatency\t53725\n你的名字。\t53726\n分路\t53727\n雌蜂\t53728\nEMUI5.0\t53729\n丹妮莉丝\t53730\nkmeans\t53731\n名捕\t53732\n涮\t53733\n加权平均值\t53734\n潍坊市政府\t53735\n直埋保温管\t53736\n借条\t53737\n督察\t53738\n痕\t53739\nManila\t53740\nCLC\t53741\nEnjoy\t53742\n制动钳\t53743\nHitalk\t53744\netcp\t53745\njjg\t53746\nazkaban\t53747\n马羊\t53748\nflora\t53749\n蒙脱土\t53750\n爱疯了\t53751\n百兆网\t53752\n甘其食\t53753\n南京大学文学院语言学系\t53754\n结构方程\t53755\n下跳\t53756\nmds剑仙\t53757\n一去不返\t53758\n普瑞纳\t53759\n云控\t53760\n公众号认证\t53761\n司徒雷登\t53762\n质问\t53763\n艺术照\t53764\n地藏王\t53765\nwmf\t53766\n人口水\t53767\n皓天\t53768\n法律快车劳动法\t53769\n淋巴管\t53770\n成都生活网\t53771\n浙政办\t53772\n重生之超级战舰\t53773\n社会福利院\t53774\n【途睿欧】_途睿欧厂家_途睿欧\t53775\n精英中学\t53776\n王景春\t53777\n单人影\t53778\n尤达\t53779\n脸儿\t53780\n核表\t53781\n站赛\t53782\n物方\t53783\n孙雪\t53784\n第46集\t53785\n款额\t53786\nppmoney\t53787\n一问一答\t53788\n胸毛\t53789\n鲜辣\t53790\n幼女\t53791\n新疆电力\t53792\n匹维溴铵片\t53793\n阳明路\t53794\nForging\t53795\n当年\t53796\n魏如昀\t53797\n我是僵尸\t53798\n￥_\t53799\n好特网\t53800\n550W\t53801\npcf8591\t53802\n做做\t53803\n一升\t53804\n三水汽车站\t53805\n明珠小区\t53806\n霜花\t53807\n绅宝X55\t53808\n石水\t53809\n唐伯虎点秋香2\t53810\n748\t53811\n戬心\t53812\n评理\t53813\n小潘\t53814\n吉林财经大学\t53815\nip1188\t53816\n男命\t53817\n802.15.4\t53818\n排座\t53819\nSolicitors\t53820\n健康产业\t53821\n综艺盛典\t53822\n骆歆\t53823\n烧人\t53824\n大鹏教育\t53825\nalloc\t53826\n新集\t53827\ngluten\t53828\n才够用\t53829\n哏德全\t53830\njdj\t53831\n杭州嘉里中心\t53832\n增发价\t53833\nChIP\t53834\n_手机人人网\t53835\n分期付款\t53836\n完美告白\t53837\n封尾机\t53838\n白癜\t53839\nscientist\t53840\n300万辆\t53841\n海赋尚城\t53842\n艾略特\t53843\ne570\t53844\n艳\t53845\n会计基础\t53846\n安全带\t53847\n十八世纪\t53848\n商品林\t53849\n屈光\t53850\nveneer\t53851\nOpenVPN\t53852\n稻荷\t53853\n育肥\t53854\nhalogen\t53855\n激色\t53856\n白彪\t53857\n醉拳\t53858\n超音速\t53859\n从化温泉\t53860\n乐点\t53861\n80方\t53862\n酚类化合
物\t53863\njordon\t53864\nwin10分屏\t53865\n开更\t53866\n鹏淘\t53867\n地形码\t53868\n英才网\t53869\n爱情公寓5\t53870\n破烂\t53871\nPEX\t53872\n一创\t53873\n持久层\t53874\n蒙曼品\t53875\nJoyer\t53876\n105.5\t53877\n自动检票机\t53878\nPulp\t53879\n陆建华\t53880\n孔笙\t53881\n儒略日\t53882\n中卫市人民政府\t53883\n第3代\t53884\n超级杯\t53885\n2709\t53886\n文昌帝君\t53887\n导控\t53888\nWashing\t53889\n青塔\t53890\n平安大道\t53891\n工亡\t53892\n5184\t53893\n稀见\t53894\n三孔景区\t53895\n恒大地产集团\t53896\n莫伊拉\t53897\nADVANCED\t53898\n昆明滇池国际会展中心\t53899\n爷娘\t53900\n70回\t53901\n太空泥\t53902\nkohler\t53903\n桐庐新闻网\t53904\ndirectional\t53905\n华为工作法\t53906\n衣康酸\t53907\n溥仪\t53908\n英特集团\t53909\nPZ\t53910\n平面镜成像\t53911\n钱赞企\t53912\n44款\t53913\n银耳粥\t53914\nkea\t53915\n南村镇\t53916\n绞磨\t53917\n西游侠侣\t53918\n680g\t53919\n克拉管\t53920\n火影忍者邪恶漫画\t53921\n5000亿美元\t53922\n竹心\t53923\n养羊\t53924\n暖棚\t53925\n爱琴海\t53926\n梁山伯与祝英台新传\t53927\n斗狗\t53928\n中发\t53929\n日月光店\t53930\nchips\t53931\n盐城市住房保障和房产管理局\t53932\n松田龙平\t53933\n汇丰大厦\t53934\n中央景城\t53935\nReaction\t53936\n小茂\t53937\n杨建文\t53938\n小红花\t53939\n应急池\t53940\n鸡皮\t53941\n太阳能热水系统\t53942\n谢春花\t53943\n飞蓬\t53944\n斜边\t53945\n陈为章\t53946\n乐高未来骑士团\t53947\n追记\t53948\nCtrl+Z\t53949\n名店\t53950\ndata值\t53951\n正项\t53952\n短整型\t53953\n医德医风\t53954\n汤子瀛\t53955\n相克表\t53956\n维权案\t53957\n2014年01月24日\t53958\n头内\t53959\n辅助项\t53960\n大学生活网\t53961\n网签\t53962\n东南大学机械工程学院\t53963\n龙马环卫\t53964\nColors\t53965\n紫金广场\t53966\n自弃\t53967\nrx78\t53968\nPRODUCE101\t53969\n川芎茶\t53970\n场论\t53971\n数据贴\t53972\ntocat\t53973\n中国国家馆\t53974\nMaya2016\t53975\n月嫂\t53976\n何必等\t53977\n100u\t53978\n变质岩\t53979\nScrub\t53980\n小蜻蜓\t53981\n打散\t53982\n缩写表\t53983\n激素类\t53984\nagenda\t53985\n绿豆面\t53986\n上文\t53987\n小米昕锐\t53988\n索契冬奥会\t53989\n履行合同\t53990\nno1\t53991\n砀山梨\t53992\n药业公司\t53993\n万科城二期\t53994\n波士顿机场\t53995\n荐股\t53996\n女方\t53997\n分子式\t53998\n一本正经\t53999\n疗伤\t54000\nCondos\t54001\nFeijiu网\t54002\naxcure\t54003\n5万块\t54004\n宝马汽车\t54005\n7菇\t54006\nrinetd\t54007\n宜兰\t54008\nAIR2\t54009\n蜀九香\t54010\n搜索器\t54011\n首都医科大学附属北京口腔医院\t54012\nCNPM\t54013\n匿名信\t54014\n大丰人才网\t54015\n太平庄\t
54016\n转空\t54017\nDependent\t54018\n醒酒\t54019\n卡斯特梅\t54020\n博览馆\t54021\n专代\t54022\n杭州野生动物园\t54023\n油雾\t54024\n华大科技\t54025\n数学典\t54026\n中迅\t54027\n科标\t54028\nPCB信息网\t54029\n城市森林公园\t54030\n四溢\t54031\n流明\t54032\n龙皇武神\t54033\nRTMP协议\t54034\n酥油\t54035\nverdi\t54036\n第A03\t54037\n榫机\t54038\n圣眼之翼\t54039\nDataguard\t54040\nOFweek通信网\t54041\n富临精工\t54042\n拔河比赛\t54043\nVanDyke\t54044\nB612\t54045\n5.3%\t54046\n小孩子们\t54047\n一字\t54048\n浙江省公安厅\t54049\n泰泽\t54050\n服务链\t54051\n黎民\t54052\n奥鹏远程教育中心\t54053\n财富榜\t54054\n尿嘧啶\t54055\n我的妹妹\t54056\nXie\t54057\n女迷\t54058\n460万\t54059\n田头\t54060\n白棉\t54061\n投机\t54062\n克尔维特\t54063\n林峰\t54064\n助民\t54065\n邱淑\t54066\n侍灵\t54067\n邹晓东\t54068\n20多米\t54069\n天九集团\t54070\n西宁市区\t54071\ng403\t54072\n未来主义\t54073\n美国童子军\t54074\n财慧网\t54075\n大话西游2召唤兽\t54076\n邓铁涛\t54077\nbleu\t54078\n神犬小七\t54079\n吴曦\t54080\ne-tron\t54081\n于波\t54082\n乙胺\t54083\n沈胜衣\t54084\n卡里乌斯\t54085\n乐博\t54086\n塘约村\t54087\n卡帅\t54088\n下叶\t54089\n尿路结石\t54090\nScott\t54091\n成都市纪委\t54092\n秀辉\t54093\n中国食品工业协会\t54094\n志同道合\t54095\nNSNotification\t54096\ne43\t54097\n打乱\t54098\n搭铁\t54099\n美国电影学院\t54100\n昆高铁\t54101\n全友家私\t54102\n三百\t54103\n13.com\t54104\n萨尔茨堡红牛\t54105\ndiscussing\t54106\nCATALINA\t54107\n甜点\t54108\n网口\t54109\njinyuan\t54110\nrtd\t54111\n栈道\t54112\nn2\t54113\nRuff\t54114\n李鸿章\t54115\n90次\t54116\n首版次\t54117\n条码扫描枪\t54118\n名品导购网\t54119\nggj\t54120\n男包\t54121\n敲出\t54122\n林地\t54123\n赵大格\t54124\n1080p\t54125\n旧世\t54126\nXhtm\t54127\n首聚\t54128\n代拍\t54129\n2列\t54130\n鱼腩\t54131\n唱红歌\t54132\n大城县人民政府\t54133\n系统文件夹\t54134\n西山路\t54135\n吉林省交通运输厅\t54136\n丝域\t54137\n巴西女排\t54138\nDiet\t54139\n06188\t54140\n惠龙\t54141\n痴心绝对\t54142\n帝国霸略\t54143\n芜湖市人力资源和社会保障局\t54144\nIsis\t54145\n养正中学\t54146\n郑州市区\t54147\n不喝水\t54148\n非会员\t54149\n优达\t54150\n市西中学\t54151\n研训\t54152\n孔明传\t54153\n崔晋\t54154\n肝血管瘤\t54155\n洛阳北郊机场\t54156\n交规\t54157\n后主\t54158\n云邮箱\t54159\ncdg\t54160\n交易广场\t54161\n托尼贾\t54162\n蔽\t54163\n大马哈鱼\t54164\nwww.zjsfgkw.cn/attachment/documentbook/\t54165\n水局\t54166\n穷困\t54167\n定时炸弹\t54168\n高速集团\t54169\n平菇\t
54170\n碰面\t54171\n微软云\t54172\n五胞胎\t54173\n西成高铁\t54174\n新会区会城\t54175\n宝山区小学\t54176\n应用程序\t54177\n实根\t54178\n盟主\t54179\n百度有钱花\t54180\n李彩琳\t54181\n惠大\t54182\n吴素英\t54183\nweiot\t54184\n益阳市一中\t54185\n密咒\t54186\n安游我的世界\t54187\n正则化\t54188\n云校\t54189\n宙斯盾\t54190\n凤眼\t54191\n展翅高飞\t54192\n既无\t54193\n51822\t54194\n极乐净土\t54195\n1.9C_\t54196\n肉眼\t54197\nLetao\t54198\n新浪山东_新浪网\t54199\nInline\t54200\n博恩\t54201\n烟台大学\t54202\nTp5\t54203\n原糖\t54204\n划片\t54205\n唇线\t54206\n湖熟街道\t54207\n蓝星\t54208\n垫脚石\t54209\n解义\t54210\n资信证明\t54211\n冷鲜肉\t54212\n预备期\t54213\n1名\t54214\n7.9无限期\t54215\n赵志强\t54216\nlayim\t54217\n特解\t54218\n000002\t54219\nDevice\t54220\nNP文\t54221\n灌醉\t54222\n顺义鲜花港\t54223\nEating\t54224\n英语语\t54225\nSay\t54226\n大新县\t54227\n湖州东站\t54228\n一吨\t54229\nycbcr\t54230\n韵达快递单号\t54231\n脚踩式\t54232\n妖梦\t54233\n妙洁\t54234\n失落感\t54235\n广东行政职业学院\t54236\n剑痕\t54237\n追星逐月\t54238\n加完\t54239\n【千\t54240\n创始人\t54241\ndospy智能手机网\t54242\ncream\t54243\n瘘\t54244\n少儿编程网\t54245\nlift\t54246\n娄底职业技术学院\t54247\n3D渲染\t54248\n虐杀\t54249\nplastics\t54250\n大庆石化公司\t54251\n清者\t54252\n方块方舟\t54253\nclg\t54254\n会员费\t54255\n刘绍棠\t54256\n油壶\t54257\n金州\t54258\nSmiths\t54259\n浇注料\t54260\nvarsall\t54261\nISO版\t54262\nsessionId\t54263\n一颤\t54264\n汤兰兰案\t54265\n化境\t54266\n单调\t54267\n剪指甲\t54268\n王晓春\t54269\n很多种\t54270\n近似值\t54271\n鼓隆城\t54272\n淋洗\t54273\n52PKLOL英雄联盟\t54274\nhaoyu.com\t54275\n起租\t54276\n海光寺\t54277\n海加尔山\t54278\n开心路\t54279\n库珀\t54280\n上官金虹\t54281\n文化大革命\t54282\n氨纶\t54283\n贝优妮塔\t54284\n上海公共交通卡股份有限公司\t54285\n歌城\t54286\n长虹公园\t54287\nRNA聚合酶\t54288\nbundle-windows-x86\t54289\n车讯网\t54290\n30多种\t54291\n装备篇\t54292\n小不\t54293\n促进剂\t54294\n现代式\t54295\n小沈龙\t54296\n加密服务器\t54297\nDB37/T\t54298\nengineering\t54299\n荔枝节\t54300\n北京市人大附中\t54301\n一柱香\t54302\n苏德\t54303\n蒂鸥维毅\t54304\n两岁\t54305\n单号查询-56114\t54306\n奋力\t54307\n天竹\t54308\n蜗牛与黄鹂鸟\t54309\n地州市\t54310\n犹网\t54311\nmix\t54312\n永定镇\t54313\nrotation\t54314\n保利天悦\t54315\n涅普顿\t54316\n黑金\t54317\n八百年前\t54318\nsolidworks\t54319\n霍霍巴油\t54320\n非法人\t54321\n2339\t54322\n楼凤网\t54323\n淘宝拼团\
t54324\nwiring\t54325\n甲苯\t54326\n天山童姥\t54327\n三丘田码头\t54328\n商城\t54329\n公共卫生\t54330\n草塘\t54331\n唐轩\t54332\n居民健康卡\t54333\n拜谒\t54334\nflan\t54335\n55YOU\t54336\nwin7开机启动项\t54337\nAnimus\t54338\n凌晨三点钟\t54339\n砖铺\t54340\nTH000\t54341\n酸臭\t54342\n工商管理部门\t54343\n9包\t54344\nvimdiff\t54345\n氦气\t54346\n差字\t54347\n数据帧\t54348\n侧手\t54349\n李如松\t54350\n京泉华\t54351\n率达\t54352\n薇诺拉\t54353\n218元\t54354\n到岸\t54355\nmaya2012\t54356\n精智\t54357\n刘慈保罗\t54358\n公主病\t54359\n中分化\t54360\n治庸\t54361\n陈嘉男\t54362\n资料库\t54363\nremux\t54364\n华侨大厦\t54365\n二刻\t54366\nxiaomage\t54367\n坐骨神经痛\t54368\n机楼\t54369\n三天两晚\t54370\n秀兰\t54371\nESX\t54372\n子路由\t54373\nhanks\t54374\n下葬\t54375\n2z\t54376\n乐视超级电视\t54377\n立木\t54378\n烧肉\t54379\n沪沽湖\t54380\n输液袋\t54381\n史_\t54382\n莱芜在线\t54383\n亲戚\t54384\n纳税信用评价\t54385\n国家统计局\t54386\n中国平煤神马集团\t54387\nv2.3.1\t54388\n水果沙拉\t54389\npop子和pipi美的日常\t54390\nProxmox\t54391\n濡\t54392\n等待着\t54393\n1平方毫米\t54394\n倍舒痕\t54395\n246天天\t54396\n664\t54397\n椭球体\t54398\nmora\t54399\nrian\t54400\n劣币驱逐良币\t54401\n立体投影\t54402\n南浦大桥\t54403\n金属制品\t54404\neicad\t54405\n张宝利\t54406\n竹简\t54407\n中国呼叫中心\t54408\n流出来\t54409\n古树\t54410\n大连市公安局\t54411\nmckinsey\t54412\n巢柜\t54413\nshuo\t54414\nledit\t54415\n为你负重前行\t54416\n甜蜜乐章\t54417\n52万元\t54418\novf\t54419\n威尼\t54420\n北京人保\t54421\n九起\t54422\n强直\t54423\nOGG\t54424\n道君\t54425\n珠江人寿\t54426\n縛\t54427\n预订\t54428\n挣得\t54429\n玉箫\t54430\n基督邮报\t54431\n大场面\t54432\n红颜\t54433\n击发\t54434\nmid\t54435\n腾牛苹果网\t54436\n第二顺位\t54437\n契合\t54438\nSega\t54439\nDS4\t54440\n雯子\t54441\n第六_\t54442\n编谱\t54443\n嚯嚯\t54444\n20171229\t54445\n田伟\t54446\n肥皂剧\t54447\n张海英\t54448\n晓丁\t54449\nDrop\t54450\n沈庆\t54451\n新浪湖北_新浪网\t54452\n破伤风\t54453\n第三季01\t54454\n砥\t54455\n眼镜厂\t54456\n疲于奔命\t54457\n南京地铁三号线\t54458\nPvt\t54459\nguizhou\t54460\n软解码\t54461\n上户\t54462\n忍法帖\t54463\n高顿财经\t54464\n希捷硬盘\t54465\nRemoving\t54466\n贵霜\t54467\nPhpMyAdmin\t54468\n致命弯道\t54469\nkey=\t54470\n大外\t54471\n苦路\t54472\n急诊医生\t54473\n重跑\t54474\n搜款网\t54475\n全国老龄办\t54476\n笋塘\t54477\n阿健\t54478\n两星\t54479\nLG化学\t54480\nLung\t54481
\n内循环\t54482\n青岛滨海学院\t54483\n狼族\t54484\n铁器时代\t54485\n美国西部\t54486\nbladder\t54487\n客至\t54488\nddf\t54489\n5.05\t54490\n势不可当\t54491\n格宾网\t54492\n阿龙\t54493\n宜居之城\t54494\n波波\t54495\ntesser\t54496\n早上九点\t54497\n环球医药网\t54498\n安吉百草园\t54499\n溢满\t54500\n打登机\t54501\n开关式\t54502\nrobin\t54503\nARCgis\t54504\nSketc\t54505\n头条君\t54506\n航天通信\t54507\n青岛市民政局\t54508\n智慧城市规划\t54509\n拳皇98\t54510\nDGS\t54511\nhlp\t54512\nPolaris\t54513\n外汇牌价_八九网\t54514\n三甲胺\t54515\nPIC\t54516\npainful\t54517\n顺德乐从\t54518\nTeva\t54519\n代修\t54520\n奸佞\t54521\n写本\t54522\n百元级\t54523\n鸿雁\t54524\ndongxiaolei\t54525\n8月14日\t54526\nflue\t54527\n鞋身\t54528\n四年前\t54529\n四叉树\t54530\n秘事\t54531\n思维能力\t54532\n终身教育\t54533\n沈海高速公路\t54534\n罗技mx\t54535\n181个\t54536\n圣尊\t54537\n国行三星S8\t54538\n南湾湖\t54539\nLeader\t54540\nVS刀剑神域\t54541\n母畜\t54542\n八方客\t54543\n延吉新闻网\t54544\n外缘\t54545\n花朝傅\t54546\n永威城\t54547\n金钟铉\t54548\n|天易成网管\t54549\n南京市统计局\t54550\n浦发银行\t54551\n元拓集团\t54552\n音乐编辑\t54553\ns2p\t54554\n石阶\t54555\n弗兰克斯\t54556\nPete\t54557\n纳甲\t54558\nScope\t54559\n建瓴\t54560\n沙棘\t54561\n博园路\t54562\n富国银行\t54563\niview\t54564\n鹞子\t54565\n阿sir\t54566\n领购簿\t54567\nMMR\t54568\n东北大秧歌\t54569\nDN300\t54570\n1270\t54571\n理发\t54572\n南沙湿地公园\t54573\n饭石\t54574\n煎药壶\t54575\n鲍德里亚\t54576\n公海\t54577\n防火包\t54578\n零陵路\t54579\n心魂\t54580\n天津总医院\t54581\n白景琦\t54582\n小夜子\t54583\nQVOD快播\t54584\nF\t54585\nps2018cc\t54586\n郭德纲\t54587\n玉观音\t54588\n胡彪\t54589\n葳\t54590\n舍得\t54591\nRope\t54592\n华邦\t54593\nASPEN\t54594\n伊宁机场\t54595\n绵州\t54596\nONT\t54597\n猫科动物\t54598\n9350\t54599\n2.5V\t54600\n包庇\t54601\n大客\t54602\n84a\t54603\n提后\t54604\nRoth\t54605\n天堂寨风景区\t54606\nere\t54607\n江干新闻网\t54608\nMSVCP120.dll\t54609\n撬门\t54610\n梦幻西游凌波城\t54611\n浙江省气象局\t54612\n中征码\t54613\nddj\t54614\n神经末梢\t54615\n尊卑\t54616\n合宠\t54617\n李亚东\t54618\n七村\t54619\n活像\t54620\n2017-12-18\t54621\n不切\t54622\n广西壮族自治区国土资源厅\t54623\n粒子\t54624\n黛\t54625\n乳香\t54626\n雪莲子\t54627\n颐养天年\t54628\n提举\t54629\n性药\t54630\n峨嵋酒家\t54631\n增氧泵\t54632\n75c\t54633\n莲叶\t54634\n移动号段\t54635\n肉酱\t54636\n广告代理公司\t54637\n凯越论坛_汽车
之家论坛\t54638\n五星路\t54639\nlienhua34\t54640\nPrevent\t54641\n师大附中\t54642\n6750\t54643\n蓝蛙\t54644\n樱满集\t54645\n甲基吡啶\t54646\n玛雅金字塔\t54647\nquay\t54648\n质检卷\t54649\n实况足球2017\t54650\nTIM2\t54651\n车镜\t54652\n剩女\t54653\n4星级\t54654\n荣耀畅玩7X_荣耀畅玩7X\t54655\n2.0.4\t54656\n越人\t54657\n好问\t54658\n小圆\t54659\n昭仪\t54660\n然健环球\t54661\n雨润食品\t54662\n赶上\t54663\nzig\t54664\n伦敦\t54665\n说苑\t54666\n打油诗\t54667\nmover\t54668\n钢窗\t54669\n先证\t54670\n上工申贝\t54671\n校尉\t54672\n5.7.22\t54673\n十万分之一\t54674\n剑心侠义\t54675\n21路\t54676\n4x2\t54677\n十九大会议\t54678\n曙光街道\t54679\n鼻渊舒口服液\t54680\n8.05\t54681\n垮台\t54682\n龙城大药房\t54683\n天龙八部武当\t54684\n谢兰\t54685\n欣美途旅游网\t54686\n金曲榜\t54687\n行方\t54688\n斗罗魔卡幻想\t54689\n方寸之爱\t54690\nactivating\t54691\n黑咔相机\t54692\n喜感\t54693\n警醒\t54694\n易容术\t54695\n骆超\t54696\n古尔邦节\t54697\n直放站\t54698\n等着时光等着你\t54699\n龙武2\t54700\n军校\t54701\nZou\t54702\nビジネス\t54703\n天刀太白\t54704\n利奥波德\t54705\n塞东西\t54706\nAttributeError\t54707\n中国共产主义青年团\t54708\n抽烟机\t54709\n八米\t54710\n4809\t54711\n创意公司\t54712\nsanwa\t54713\n有轨电车2号线\t54714\noptparse\t54715\n易损件\t54716\n纪元1404吧_\t54717\n同在一起\t54718\n银河岁月\t54719\nwindows7_Windows\t54720\n任性付\t54721\n中小学生\t54722\n利安人寿\t54723\n散文家\t54724\ndw\t54725\n全微分_\t54726\n小宋佳\t54727\n两个晚上\t54728\ntopMan\t54729\nxrf\t54730\n工业和信息化部\t54731\n木牛流马\t54732\n翼博\t54733\n洞阳镇\t54734\n左小马龙\t54735\n话器\t54736\n雪盈\t54737\n于美红\t54738\n龙之皇冠\t54739\n4567\t54740\n中华医药\t54741\n关系户\t54742\n徐玉燕\t54743\n心腔\t54744\n小鸡x1\t54745\n17.31\t54746\n红轮\t54747\n晴空\t54748\n勤奋\t54749\n十国\t54750\n20平方厘米\t54751\n横摇\t54752\n晶须\t54753\n第三纪\t54754\n五人\t54755\n狼友\t54756\ngeci\t54757\n清平镇\t54758\n400毫升\t54759\n中国食品网\t54760\n鹿小姐\t54761\n极坐标方程\t54762\n美白仪\t54763\nVamei\t54764\n汇泽\t54765\nCrazy\t54766\n宝图\t54767\n隆美\t54768\n青雉\t54769\n水电瓶\t54770\n32c\t54771\n元杂剧\t54772\n1.8.7\t54773\n自动销户\t54774\n恩华药业\t54775\ndarkknightzh\t54776\ngoggle\t54777\n散装\t54778\nlite\t54779\n张伟\t54780\n王司徒\t54781\n耀莱成龙国际影城\t54782\n魔角\t54783\n罗莎\t54784\n孙强\t54785\n动物科学技术学院\t54786\n大众新速腾\t54787\n婚久\t54788\nPCPOP\t54789\n火线币\t54790\n建邺路\t54791\nH
obby\t54792\n三阵\t54793\n流控\t54794\n不畏\t54795\nexcelhome\t54796\n_米\t54797\nWin7主题\t54798\n李石\t54799\nUltraviolet\t54800\n广西医科大学第二附属医院\t54801\n忠仕\t54802\n开低\t54803\n南京市教育局\t54804\n璎珞\t54805\n冥帝\t54806\n唯物\t54807\n何敏\t54808\n男歌\t54809\n陕西电力\t54810\n风景园林专业\t54811\n中国跆拳道协会\t54812\n东风社区\t54813\n雷凡培\t54814\n潘逸阳\t54815\n苏教版零五网\t54816\nbarbour\t54817\n忧患\t54818\n追鬼七雄\t54819\nPPAP\t54820\n稿器\t54821\n水解奶粉\t54822\n17春\t54823\n拟建\t54824\n李妍瑾\t54825\n异坤\t54826\n2千万\t54827\nlich\t54828\n罪妃\t54829\n陪跑\t54830\n试模\t54831\n中国人民解放军队列条令\t54832\n嘉云\t54833\ncad自学网\t54834\n肾脏\t54835\nexcel除法\t54836\n坼\t54837\n伍角\t54838\n无岸\t54839\n光明之响\t54840\n2045年\t54841\n结顶\t54842\nLisbon\t54843\n沪铜\t54844\n利士\t54845\n北京壹号院\t54846\nv5.5.2\t54847\n北京理工大学软件学院\t54848\n福冠堂\t54849\n姜太昱\t54850\n穿透\t54851\n绮园\t54852\n放肆宠\t54853\nint变量\t54854\nT型\t54855\n攸关\t54856\n拜祭\t54857\n风险度\t54858\n芽衣\t54859\n壁花\t54860\n咸宁市\t54861\n预应力空心板\t54862\n250nk\t54863\n安徽电信\t54864\n鑫玺\t54865\nMiley\t54866\nMindManager\t54867\n曾任\t54868\n小白金\t54869\n军徽\t54870\n精灵宝可梦太阳月亮\t54871\nEasyUI中文网\t54872\n10阶\t54873\n弯钩\t54874\nmpandroid\t54875\n兴亡\t54876\n菜鸟网络科技有限公司\t54877\n童恬恬\t54878\n媒妁之言\t54879\n立足本职\t54880\n400次\t54881\n没有你陪伴我真的好孤单\t54882\n超级能恩\t54883\n3个星期\t54884\n历史类\t54885\n过节\t54886\nShin\t54887\n旋光仪\t54888\n威奇塔\t54889\nBibi\t54890\n市第一人民医院\t54891\n狼堡\t54892\n召集\t54893\n平泉县\t54894\n遛鸟\t54895\n9.5\t54896\n航天基地\t54897\n云之凡\t54898\n比泽尔\t54899\n莉莉卡\t54900\n万寿山\t54901\n迷失\t54902\n4333\t54903\n菊苣栀子茶\t54904\n奶粉\t54905\n游戏陀螺丨\t54906\n李广田\t54907\n灵芝水\t54908\n北京国子监\t54909\n十条街\t54910\ncd1\t54911\n0.10MB\t54912\n劳改\t54913\n汇景家园\t54914\n御泉湾\t54915\n无药可医\t54916\n九皇山\t54917\ncach\t54918\n三曹\t54919\n王明\t54920\n8621\t54921\n土银\t54922\nMidnight\t54923\n取景器\t54924\n大麦盒子\t54925\nNOTICE\t54926\n恩平市\t54927\n东港\t54928\nRemedies\t54929\nzenmeyang\t54930\n协和\t54931\n姜葱\t54932\ncandance\t54933\n王珞丹\t54934\n顺发恒业\t54935\n薄暮\t54936\n文化论\t54937\n淮阳县\t54938\n龚俊\t54939\n国美控股集团\t54940\n大爆炸\t54941\n卡拉·迪瓦伊\t54942\n黄梦\t54943\npvr.ccz\t54944\n凳子\t54945\n连云港市公共资源交易中心\
t54946\nY480\t54947\n一滩\t54948\n英短\t54949\n淡味\t54950\n我的电脑\t54951\n徐佳\t54952\nXinjiang\t54953\n恶\t54954\n普通车\t54955\n凌美LAMY\t54956\n墨明棋\t54957\nInterpolation\t54958\nclear\t54959\n嘉玲\t54960\n红唇\t54961\n青海省人民政府办公厅\t54962\nWEEX\t54963\n书艺\t54964\nmuse-ui\t54965\n意指\t54966\nx100t\t54967\n四十五年\t54968\n金丝峡\t54969\n中国邮政速递物流\t54970\n格尔木\t54971\n健步\t54972\n斩\t54973\n溶出度\t54974\nbiaoshi\t54975\n姜罚\t54976\n罗杰\t54977\n三柱\t54978\n倪建伟\t54979\n汗珠\t54980\n堵住\t54981\n技术职\t54982\nCOL\t54983\n陈富林\t54984\n胸有\t54985\n侦探\t54986\n进账单\t54987\n高射炮\t54988\n10pin\t54989\n皇姑\t54990\n黄巧灵\t54991\n核电池\t54992\n北京长安街\t54993\n禹城市\t54994\n雷尼绍\t54995\n桐山\t54996\n深信服ac\t54997\n窃取\t54998\n上扬\t54999\n时装\t55000\n格林童话故事\t55001\n草地\t55002\npivottable\t55003\n优秀生\t55004\n南柯\t55005\n同一样\t55006\n261ara\t55007\n光剑\t55008\n西南村\t55009\n大前锋\t55010\n吉喆\t55011\n沪宁高速\t55012\n曹刘\t55013\ncompaction\t55014\n张文明\t55015\n栀\t55016\n指节\t55017\n塔吊\t55018\n中华会计网\t55019\n2000平\t55020\n长安欧尚A800论坛_汽车之家论坛\t55021\n王江泾镇\t55022\n黑风\t55023\n李修贤\t55024\n绿水青山就是金山银山\t55025\n上海湾\t55026\n2018年国庆节\t55027\n参训\t55028\n花路\t55029\n上海新东方学校\t55030\n甲基苯酚\t55031\nclipstudiopaint\t55032\nGran\t55033\n小树苗\t55034\n圆灯\t55035\n福库\t55036\nyifile\t55037\n手绘吧\t55038\n临时表空间\t55039\n组砌\t55040\n一席\t55041\n空滤\t55042\n嘟嘟牛\t55043\n1500克\t55044\nzillow\t55045\n故\t55046\n3dh\t55047\njasper\t55048\n1442\t55049\nsaml\t55050\nPERRY\t55051\n保险经纪人\t55052\n夏日课堂\t55053\n取舍\t55054\n实木板\t55055\ndestructive\t55056\n487\t55057\n贵阳银行股份有限公司\t55058\nWheat\t55059\n猫狗大战\t55060\n北洛\t55061\n息负债\t55062\n盲测\t55063\n真与假\t55064\n冰刃\t55065\n慈父\t55066\n两税\t55067\n日常化\t55068\n秋光\t55069\n什\t55070\n凌希\t55071\n反冲洗过滤器\t55072\n精锐部队\t55073\n美肤\t55074\n24.5\t55075\n包装厂\t55076\n出殡\t55077\n珍珠岩保温板\t55078\nego\t55079\n竖蛋\t55080\n张謇\t55081\n益盟操盘手论坛\t55082\nPills\t55083\n2018年04月11日\t55084\n大连海事大学\t55085\njagger\t55086\n针轮\t55087\n安顺路\t55088\n傍大款\t55089\n唇膜\t55090\n法律快车网\t55091\nsharp\t55092\n事无巨细\t55093\n秋秋\t55094\n我的好妈妈\t55095\nEACCES\t55096\n实体集\t55097\nsecs\t55098\n显示类\t55099\n红枫湖\t55100\n定远县政府\
t55101\n100001\t55102\n读一读\t55103\n中浆\t55104\n沉头螺钉\t55105\n功利化\t55106\n七星海世界\t55107\n5w20\t55108\nG2\t55109\nrdesktop\t55110\n合肥市住房公积金管理中心\t55111\n聚会\t55112\n花图\t55113\n6个多月\t55114\n20140317\t55115\n圣神\t55116\nbcg\t55117\nfileutils\t55118\n青青花木网\t55119\n维思通\t55120\n吴昊天\t55121\n梯子\t55122\n黄迪\t55123\n河东路\t55124\n书装\t55125\n氧气\t55126\n修改器\t55127\n武则艾晓琪\t55128\nmete\t55129\n招商银行银联\t55130\n佳宜\t55131\n玉汝\t55132\nSVN\t55133\n天网恢恢\t55134\n迷失太空\t55135\n傅\t55136\n绳刑\t55137\n鬼鲛\t55138\n肥乡县\t55139\n白蘑菇\t55140\n见闻色\t55141\n茶台\t55142\n鹤山市人民政府\t55143\n吴江市\t55144\n地铁2033\t55145\nsimpack\t55146\nBY子句\t55147\nMyDiskTest\t55148\n百度影棒\t55149\n萌牙\t55150\n金百万烤鸭店\t55151\n第22集\t55152\nipos\t55153\n3.9亿\t55154\nn5110\t55155\nG3云\t55156\n第六场\t55157\n千幻\t55158\n分液\t55159\nVolatility\t55160\n成都实验外国语学校\t55161\n荣耀畅玩4C\t55162\n匀\t55163\n壳头\t55164\ngbf\t55165\n10.1\t55166\n娑婆世界\t55167\n鼓号\t55168\n鸣人佐\t55169\n杜家坎\t55170\nCzlun\t55171\n海蛇\t55172\nD3js\t55173\n天胡荽\t55174\n常喝\t55175\n可吸入颗粒物\t55176\n小型犬\t55177\nln2\t55178\n新媒体联盟\t55179\nhentai\t55180\n別人\t55181\n高效率\t55182\n绞尽脑汁\t55183\n钟君\t55184\n69亿\t55185\nSHD\t55186\ntag值\t55187\n中青新闻网\t55188\nSeagull\t55189\n广州医科大学附属肿瘤医院\t55190\n机身\t55191\n五斗\t55192\nplunge\t55193\n圣斗士星矢ol\t55194\n克赛\t55195\n振兴\t55196\nIDM下载器\t55197\n联排别墅\t55198\nCreo工程图\t55199\ncysteine\t55200\n总类\t55201\n福田萨普\t55202\n人力资源服务许可证\t55203\nzhong\t55204\n并表\t55205\n北京西山国家森林公园\t55206\nCON\t55207\n1354\t55208\n省旅发委\t55209\n华灿\t55210\nBeastiality\t55211\n12期\t55212\n弹窗小说网\t55213\n浙江省卫生计生委\t55214\nv2.1\t55215\nLipid\t55216\n庄兆林\t55217\n2011年1月\t55218\n中国建筑工业出版社\t55219\n121路\t55220\n1909年\t55221\nlock\t55222\n女眷\t55223\n百宝袋\t55224\n粗犷\t55225\nSDU\t55226\n梯台\t55227\n达尔优DAREU\t55228\n浊水\t55229\n武戏\t55230\n固定成本\t55231\n立式\t55232\n整章\t55233\ndokcer\t55234\njit\t55235\n平安健康险\t55236\n国家移民管理局\t55237\n合肥公交查询网\t55238\nBD720p\t55239\n采集群\t55240\n乱堆\t55241\n免煎\t55242\n吉尼斯\t55243\nknocking\t55244\n1.09亿\t55245\n我们的故事\t55246\n陈杨\t55247\n网页浏览器\t55248\nconfusing\t55249\n牙釉\t55250\n江油市\t55251\n是谁非\t55252\
n阿托伐他汀钙胶囊\t55253\n宝石之国吧\t55254\n背街\t55255\n茑萝\t55256\n研究生学院\t55257\n2.06\t55258\n魔兽世界6.0\t55259\nRay\t55260\n易乐游帮助中心\t55261\n蜡笔\t55262\n电信米粉卡\t55263\nACCESS数据库\t55264\n排气阀\t55265\n宜昌网\t55266\n功臣\t55267\n大汉情缘之云中歌\t55268\n黑白画\t55269\n山东能源枣庄矿业(集团)有限责任公司\t55270\n小卫士\t55271\n筹备\t55272\n就学\t55273\n看看新闻网\t55274\nshuji\t55275\n九阳\t55276\nrpi\t55277\nMarks\t55278\n20130706\t55279\n有道云笔记吧\t55280\n颜静刚\t55281\n傅小司\t55282\n复式统计表\t55283\n美国财政部\t55284\nwaring\t55285\n城市联合网络电视台\t55286\n重孙\t55287\n铁齿铜牙纪晓岚4\t55288\n弹工\t55289\nQQ卡通\t55290\n中国地震信息网\t55291\n凤形股份\t55292\n畅玩助手\t55293\n马自魔域\t55294\n600577\t55295\n四大招\t55296\n百张\t55297\n干粉灭火器\t55298\n三角网\t55299\n收纳\t55300\n升官发财\t55301\n爱情观\t55302\n臻享\t55303\n足校\t55304\nforEach\t55305\n乐高幻影忍者\t55306\nug\t55307\n黄健翔\t55308\n镜音双子\t55309\n小巴西龟\t55310\n158b\t55311\nison\t55312\n600806\t55313\n品酒师\t55314\niphong\t55315\n河岸\t55316\n会有天使替我爱你\t55317\n美作\t55318\n泛素化\t55319\n5A级\t55320\n小麻雀\t55321\n合江镇\t55322\n働\t55323\n林仙儿\t55324\n跑冒滴漏\t55325\n丽萨\t55326\n中共中央全面深化改革领导小组\t55327\ndetailview\t55328\n2017-12-26\t55329\n无辜者\t55330\n仇富\t55331\n东坪\t55332\n第212集\t55333\n莱芜高新区\t55334\n添乱\t55335\n第二十九章\t55336\n年满\t55337\n嘉兴二院\t55338\n波吉亚\t55339\nStrict\t55340\n订单网\t55341\n佐伯奈\t55342\nRaphael\t55343\n超神\t55344\n日照时数\t55345\n甲状腺结节_甲状腺结节\t55346\n龙之信条:黑暗崛起\t55347\n防晒液\t55348\n氯化铜\t55349\nbowel\t55350\n普林斯\t55351\n兰州大学第一医院\t55352\n杜仲雄\t55353\n美版\t55354\n本坛\t55355\n鞋底\t55356\n2017年内\t55357\ncommons-lang\t55358\n李永林\t55359\n开罐器\t55360\n力帆轩朗\t55361\n九州港\t55362\n黄金甲\t55363\n试验场\t55364\n详批\t55365\n生态产业园\t55366\n基本步\t55367\n伊美尔\t55368\n朝日奈明\t55369\n老妈\t55370\n蓝齐儿\t55371\n王依琳\t55372\n深圳广田集团股份有限公司\t55373\nsvipshipin\t55374\n龙本\t55375\n315路\t55376\n戊二醛\t55377\n省体\t55378\nimgs\t55379\n朗逸\t55380\n观止\t55381\n0xc0000005\t55382\n温泉酒店\t55383\n直捣\t55384\n宏鑫\t55385\nMiWiFi\t55386\n假日酒店\t55387\n一窍不通\t55388\n不要和陌生人说话\t55389\n马旭明\t55390\n65x\t55391\n翻新\t55392\n跟腱\t55393\n番茄汤\t55394\n网球馆\t55395\nmediated\t55396\nodds\t55397\n松重丰\t55398\n中毒性\t55399\n恒泰\t55400\n蜜糖村\t55401\n并无\t55402\nhttp
session\t55403\n软件著作权登记\t55404\n联丰路\t55405\nXmlSerializer\t55406\n三娘\t55407\n手机摇篮网\t55408\n河南工学院\t55409\n湖南省食品药品监督管理局\t55410\n观云\t55411\n陈美龄\t55412\n翼缘\t55413\n空港新区\t55414\nWaiver\t55415\n中华区\t55416\n人句\t55417\n王熙凤\t55418\n余平\t55419\n一元二次函数\t55420\nDesert\t55421\n彭浦镇\t55422\n中国宋庆龄基金会\t55423\n爱马仕\t55424\nMEA\t55425\n第三重\t55426\n王梓薇\t55427\n中国会计学会\t55428\n申发\t55429\nCpk\t55430\nZedboard\t55431\n胸露\t55432\n4R\t55433\n上海国际汽车城\t55434\n恋爱番\t55435\nOxford\t55436\n五通码头\t55437\n来得及\t55438\n重刊\t55439\n圣爱\t55440\n自动机\t55441\n农业科技有限公司\t55442\n美卓\t55443\n上海歌剧院\t55444\n奔驰C级\t55445\n几又\t55446\nmydm\t55447\ndeform\t55448\n胡和平\t55449\n坠饰\t55450\nwps2010\t55451\n400k\t55452\n濑粉\t55453\n师门\t55454\n素未谋面\t55455\n虾皮粉\t55456\n关机\t55457\nJavaScript库\t55458\n年均\t55459\n挑战者号\t55460\n淋浴屏\t55461\n待售\t55462\n高效液相色谱柱\t55463\n三相电流\t55464\n太平洋战场\t55465\n鲜肉\t55466\n文冠果\t55467\n玖姿\t55468\n水速\t55469\nardo\t55470\n维生素c_\t55471\nNot\t55472\n英国巴克莱银行\t55473\n碳纤维电暖器\t55474\nInch\t55475\n235号\t55476\n最初的爱\t55477\n直发棒\t55478\n单于\t55479\nscat\t55480\nLSH\t55481\n宗门\t55482\n氨基苯甲酸\t55483\n拎包\t55484\n乱云飞\t55485\n民选\t55486\nAISS爱丝\t55487\n径山寺\t55488\n省心\t55489\nv21\t55490\n战国时期\t55491\n交响情人\t55492\n獴\t55493\nT1航站楼\t55494\n郑汴路\t55495\n金雀\t55496\n工字轮\t55497\n国芳集团\t55498\nwatchdog\t55499\n美式咖啡机\t55500\nFolli\t55501\n农林局\t55502\nivms-4200\t55503\n小雪茄\t55504\n澹台\t55505\n洪都\t55506\nzImage\t55507\n法网\t55508\n169IT.COM\t55509\n有性\t55510\n第75号\t55511\n维密大秀\t55512\n文眼\t55513\n新中式\t55514\n月番\t55515\n东光\t55516\npara\t55517\n金杯\t55518\n20118年\t55519\n中国民航网\t55520\n焚书\t55521\ncrdownload\t55522\n逆否命题\t55523\n财贸\t55524\n家博会\t55525\n旭辉东樾城\t55526\n同业存款\t55527\nIdeone\t55528\n沪工\t55529\n叫餐\t55530\n000568\t55531\n心缘\t55532\nPowerPC\t55533\n复方\t55534\nCredit\t55535\n50根\t55536\n王志坚\t55537\nlinechart\t55538\n超蝠\t55539\n学门\t55540\n33.5\t55541\nACTIVEX\t55542\n40125\t55543\n天灾\t55544\n日本山\t55545\n想哭\t55546\n堆垛机\t55547\n超声\t55548\n膨胀系数\t55549\n清单式\t55550\n创意书\t55551\n蓝天下\t55552\n高巍\t55553\n统计器\t55554\n追龙\t55555\nliao\t55556\n伊素婉\t55557\
n乐从\t55558\n东林中学\t55559\n水水\t55560\n北美洲\t55561\n随园诗话\t55562\n电话银行\t55563\n死夜恶\t55564\nTAM\t55565\n刘昊天\t55566\n20170129\t55567\nhuanyuan\t55568\n2700万\t55569\n五和\t55570\n热浸镀锌\t55571\n35个\t55572\nLCD之家论坛\t55573\n巴中市\t55574\n四通\t55575\n尾目\t55576\n雷锋月\t55577\n华测检测\t55578\n报警\t55579\nsurefire\t55580\n张妈妈\t55581\n三角木\t55582\n钟山县\t55583\ncstdio\t55584\n福清文光中学\t55585\n紫云路\t55586\ndiyi\t55587\n病毒性脑炎\t55588\n帝王之妾\t55589\n中科院过程所\t55590\n陈耀\t55591\n2018—2019年\t55592\n迪梵\t55593\n闵行\t55594\n承启\t55595\n5品\t55596\nunfold3d\t55597\n114ls\t55598\ndade\t55599\n帝王庙\t55600\n长念琅琊\t55601\n银行存款日记账\t55602\nand&#160\t55603\n隋唐史\t55604\nquart\t55605\n任法融\t55606\n焱\t55607\n曲水县\t55608\n细颗粒物\t55609\n王蔚\t55610\nExpo\t55611\n增援\t55612\n彭浩\t55613\n操逼网\t55614\nperl\t55615\n错价\t55616\n忘乎所以\t55617\n甘城\t55618\n奸门\t55619\n古运河\t55620\n男孩们\t55621\n飞智wee\t55622\n法律诉讼\t55623\n电视线\t55624\n莫少聪\t55625\n风能\t55626\n5480\t55627\n八字\t55628\nfreeliver54\t55629\n阿米尼电动车\t55630\n麻筋\t55631\n小语种\t55632\n袁霞\t55633\n伊顿幼儿园\t55634\n速诺\t55635\n600部\t55636\n中医杂志\t55637\n蓝郡\t55638\n凤凰知音卡\t55639\nBoob\t55640\n压阵\t55641\n玉田生活网\t55642\n卷风\t55643\n眷念\t55644\n陈春林\t55645\nNBC\t55646\n120条\t55647\npoe\t55648\n最高人民检察院\t55649\n白相\t55650\n睡眠药\t55651\n旁若无人\t55652\n医术\t55653\n财建\t55654\n果洛藏族自治州\t55655\n羽田\t55656\n爱读网\t55657\n阳哥\t55658\n脆枣\t55659\n墨画\t55660\n121个\t55661\n精播机\t55662\n发作性\t55663\n可编程控制器\t55664\n沙罗\t55665\n提柜\t55666\n柴油滤清器\t55667\n牛尔\t55668\n心品\t55669\n杨国民\t55670\n镁肥\t55671\n万物理论\t55672\n计生证\t55673\n一两件\t55674\n对比剂\t55675\n通化\t55676\n扩展版\t55677\nstm8l\t55678\n举报件\t55679\n传播淫秽物品罪\t55680\n反颌\t55681\n多智能体\t55682\n生物钟\t55683\n电缆卷筒\t55684\n酞菁\t55685\n刘一痕\t55686\n小书\t55687\n不买\t55688\n直谏\t55689\ncailiao\t55690\n恐怖广播\t55691\n86本\t55692\n色子\t55693\n搜索引擎营销\t55694\nMBE\t55695\nGJ\t55696\n仇海平\t55697\n290个\t55698\n北宁公园\t55699\n琴瑟和鸣\t55700\n养老险\t55701\ncarnation\t55702\n中空板周转箱\t55703\n灵光\t55704\n牛栏奶粉\t55705\n锁上\t55706\n咬手\t55707\n真树\t55708\n加州理工学院\t55709\nShareMouse\t55710\nz50\t55711\nMaxCompute\t55712\n配招\t55713\npossible\t55714\nvivox5
\t55715\n屈光不正\t55716\n通威太阳能\t55717\n上海浩然生物技术有限公司\t55718\n冈比亚\t55719\n钢格\t55720\n5.3万\t55721\n绑扎带\t55722\n鹿豹座\t55723\n重庆高速\t55724\n坂本龙马\t55725\n格构梁\t55726\n卡扣式\t55727\n保税港\t55728\n十万个为什么\t55729\n保证期间\t55730\n万里长征\t55731\n7千万\t55732\n橘子汁\t55733\n广义\t55734\n2012世界末日\t55735\n东来\t55736\n发觉\t55737\n人彘\t55738\n纳屋\t55739\n2017年12月5日\t55740\n闭集\t55741\n方诚\t55742\narrange\t55743\n172号\t55744\n提壶\t55745\n导流线\t55746\n三角号\t55747\n外孙女\t55748\nContainers\t55749\n钟汉良\t55750\n陈百潭\t55751\n街口\t55752\n黄金色\t55753\nPADS2007\t55754\nArchitectural\t55755\n2016、2017年度\t55756\n南宁地铁\t55757\n曲江池遗址公园\t55758\n脚腕\t55759\n霍山米斛\t55760\ni+=1\t55761\n浦江镇\t55762\n博兴在线\t55763\n煎鸡蛋\t55764\n防区\t55765\nxpr\t55766\n阿里卡\t55767\nAmtrak\t55768\n起飞\t55769\n白羊女\t55770\n电气控制柜\t55771\nexcel,word\t55772\nBRT\t55773\n振宇\t55774\n皇庭国际\t55775\n头文件\t55776\n浙江省印染行业协会\t55777\n电子防潮箱\t55778\n东海龙宫\t55779\n首城\t55780\n顺尔宁\t55781\n内隐\t55782\n青行\t55783\n微雪\t55784\n水沢\t55785\n乐享4G\t55786\nKANSAI\t55787\n九季\t55788\nRed5\t55789\n蒋杰\t55790\nS600\t55791\nenumerate\t55792\n病虫害\t55793\n正荣润江城\t55794\n1.0.13\t55795\n富贵兵团\t55796\n触发者\t55797\n献帝\t55798\n睡狮\t55799\n上原瑞穂\t55800\n滴滴打车软件\t55801\n梦幻西游59\t55802\n若遇\t55803\n太平洋下载中心\t55804\n怪谭\t55805\n启业者\t55806\n星洲花园\t55807\n906h\t55808\n鬼情\t55809\n梦幻泡影\t55810\n火成岩\t55811\nDuration\t55812\nInfinite\t55813\n日霜\t55814\n张韶\t55815\n小曼\t55816\n锦里\t55817\n镜湖新闻网\t55818\n5家\t55819\n奥雅纳\t55820\nkabuto\t55821\nNat\t55822\n尾行\t55823\n无往不利\t55824\n月底\t55825\n自怕\t55826\n变流器\t55827\n29项\t55828\n拾掇\t55829\n神版\t55830\nM币\t55831\n绝地求生鼠标宏\t55832\n瑞派\t55833\n铃屋什\t55834\n前1天\t55835\n丑女人\t55836\n双港镇\t55837\n蝇王\t55838\n一百多年\t55839\n刘春生\t55840\n秒号\t55841\n溏心\t55842\nBrow\t55843\n红谷滩万达广场\t55844\n慕容博\t55845\n国权\t55846\n省人大常委会\t55847\nf2.0\t55848\n半价\t55849\ntms\t55850\n奠定\t55851\n天津港保税区\t55852\n性奋\t55853\n课代\t55854\n1.0.3.1\t55855\n干红枣\t55856\n风云三国2.8\t55857\n侏儒工程学\t55858\n齐齐哈尔工程学院\t55859\n29英寸\t55860\n柳州市政府\t55861\n三林镇\t55862\n段段\t55863\n蓝蓝的天\t55864\n新前\t55865\n经济研究参考\t55866\npxc\t55867\n000830\t55868\nDVD原盘\t55869\n2天
游\t55870\n全旗\t55871\n恶棍\t55872\nmatch\t55873\n孙安\t55874\n结为\t55875\n五色梅\t55876\n反超\t55877\n龟甲师\t55878\n女人的秘密\t55879\nPA+++\t55880\n大中型\t55881\n糟粕\t55882\n心里的路\t55883\n专片\t55884\n_骨\t55885\nHashSet\t55886\n1.170\t55887\nBEAT\t55888\n誰\t55889\n过孔\t55890\n轻舟\t55891\n簇状\t55892\nULED\t55893\n95528\t55894\n堪比\t55895\n气源\t55896\n分步式\t55897\n阿呆喵\t55898\n建立者\t55899\n盐源县\t55900\n方吉\t55901\n昆山中医院\t55902\n智龙\t55903\nink\t55904\nf254\t55905\n卡帕\t55906\n宪制\t55907\n方颅\t55908\n一天一次\t55909\n渐次\t55910\n清华附中_清华大学附属中学\t55911\n玄奘法师\t55912\n百度软件\t55913\n闭式冷却塔\t55914\n嘉里不夜城\t55915\n墨阳\t55916\n50寸\t55917\n术阶\t55918\n无维\t55919\n托盘式\t55920\n覆雨翻云\t55921\njprofiler\t55922\n校人\t55923\n公婆\t55924\n屁眼子\t55925\n挂线\t55926\n插管\t55927\nDashboard\t55928\nd3d9\t55929\n宿缘\t55930\n武直10\t55931\n隆福寺\t55932\n8096\t55933\n如来启悦\t55934\n壕们\t55935\n玖西堂\t55936\n凌力尔特\t55937\n凤凰书城\t55938\n上海八佰伴\t55939\n喜讯\t55940\nDice\t55941\n风刃\t55942\nInternals\t55943\n信模板\t55944\n长江金业\t55945\n使用率\t55946\n电缆载流量\t55947\n赶海\t55948\n投资银行\t55949\nBeer\t55950\n中华人民共和国国家卫生健康委员会\t55951\nRSTP\t55952\n干巴\t55953\nbeaches\t55954\n遭抢\t55955\n水柱\t55956\n6.0.7\t55957\nsanction\t55958\n口射\t55959\n我们相爱吧\t55960\n铬矿\t55961\n汗马\t55962\n专兼职\t55963\nfri\t55964\n导尿\t55965\n大营救\t55966\n逃往\t55967\n势\t55968\n王怀忠\t55969\n起亚kx5\t55970\n新大陆ME31\t55971\n西沙湾\t55972\n呕\t55973\n卓立\t55974\n深振业\t55975\nautoload_register\t55976\n紫音\t55977\n抢板\t55978\n双向\t55979\n免罪\t55980\nAllez\t55981\nFucking\t55982\n网盾\t55983\n沭阳南部新城\t55984\n殿上\t55985\n社保定点医院\t55986\n玉华\t55987\ncsnd\t55988\nTakanashi\t55989\ndtv\t55990\n34天\t55991\n环管\t55992\niebook\t55993\nAtmos\t55994\n中国哲学简史\t55995\n版房\t55996\n连接板\t55997\n宁陵\t55998\n鹅业\t55999\n没收\t56000\n赵海燕\t56001\nヨ\t56002\n榆社县\t56003\n体重表\t56004\n干燥柜\t56005\n鹿山\t56006\n来讲讲\t56007\n码兔\t56008\n片型\t56009\n三门路\t56010\n8批次\t56011\n員\t56012\n斜塘老街\t56013\n皖水公寓\t56014\n内存溢出\t56015\n沐浴露\t56016\nLOGIC\t56017\n徐晶\t56018\n恒昌花园\t56019\n连手\t56020\nMutation\t56021\n10.0_\t56022\n主电路\t56023\n36G\t56024\n火币\t56025\ngbc吧\t56026\n套女\t56027\n有限合伙企业\t56028\n杰克
·特劳特\t56029\n巴卡尔\t56030\n几案\t56031\n列表网\t56032\n陈词滥调\t56033\n哪样\t56034\n诺雅\t56035\n张译\t56036\n西园新村\t56037\n老火\t56038\n上筑\t56039\n延安清风网\t56040\n沙乡\t56041\n忍者村大战2\t56042\n太原市人民政府\t56043\n腰乐队\t56044\n软卧\t56045\n长隆大马戏\t56046\nPD虚拟机\t56047\n红铜\t56048\n陈姐\t56049\nJList\t56050\n萌漫乡\t56051\nIE6.0\t56052\n盆式橡胶支座\t56053\n陈政\t56054\nMunich\t56055\n金与正\t56056\nc2000\t56057\n小贷\t56058\n老虎机\t56059\n压头\t56060\n水晶\t56061\ncitizenship\t56062\nEASYICON\t56063\n烤牛肉\t56064\n谢冰\t56065\n职教云\t56066\n190平米\t56067\n谷酒\t56068\ntraditions\t56069\n海菱\t56070\n湖南电视台\t56071\n频域\t56072\n田家庵区\t56073\n生死相许\t56074\n黄草梁\t56075\n集录\t56076\n余建\t56077\n分享者\t56078\n悬挑式\t56079\n蛮好\t56080\n西葡\t56081\nCNFloral\t56082\n金融研究院\t56083\n阜阳论坛\t56084\n中国科学院上海有机化学研究所\t56085\n蔡国庆\t56086\n邪王\t56087\n开智\t56088\n安必奇生物科技\t56089\nCCTV13\t56090\n北京军区\t56091\n特朗普政府\t56092\n直接引语变间接引语\t56093\n美国华盛顿大学\t56094\n教练型\t56095\n常熟南站\t56096\n止咳\t56097\n19集\t56098\n黑妖\t56099\n草榴邀请码\t56100\n042\t56101\n微信内置浏览器\t56102\nPayback\t56103\n心理治疗\t56104\n妄想学生会\t56105\n広瀬奈\t56106\n069\t56107\n_安秀网\t56108\n情报局\t56109\n唐筛\t56110\n山东人才网\t56111\n60秒\t56112\n招领\t56113\n上海第二工业大学\t56114\n五六年\t56115\n门吊\t56116\n驻场\t56117\nnightwish\t56118\n再见阿郎\t56119\n三国志14\t56120\n佛光\t56121\n孝悌\t56122\nmacd\t56123\n两年内\t56124\n东北部\t56125\n探路者\t56126\n10698\t56127\n北极星环保招聘网\t56128\n1900-01-01\t56129\n九小\t56130\n汇改\t56131\n创始者\t56132\nafc\t56133\n卢永根\t56134\n高低位\t56135\njnz\t56136\n刚构桥\t56137\n秦长青\t56138\nVolunteer\t56139\n深外\t56140\n灵素\t56141\n差异率\t56142\n姊妹们\t56143\n系统编程\t56144\n孟山\t56145\n只手\t56146\n时段\t56147\n刘一秒\t56148\n金五星\t56149\nFestival\t56150\nDCM\t56151\n广东亿迅科技有限公司\t56152\n叫写\t56153\n简约版\t56154\n大学季\t56155\n三落\t56156\n胎盘干细胞\t56157\n旅行记\t56158\nThinkPad\t56159\n除氧\t56160\nArtifacts\t56161\n南宁市行政审批局\t56162\n王大花\t56163\n增白剂\t56164\n鞍山市人力资源和社会保障局\t56165\n款物\t56166\n湘源控规\t56167\n抗体\t56168\n马克思主义观\t56169\n华为视频播放器\t56170\n龙斗\t56171\nuic\t56172\n占上风\t56173\n尸衣\t56174\n1422\t56175\nTDD\t56176\n白山黑水\t56177\n重庆市民政局\t56178\n轻轻松松\t56179\nT530\t56180\ncandice\t56181\nAdvis
or\t56182\n芥园道\t56183\nring0\t56184\n中央第四环境保护督察组\t56185\n米特\t56186\n电流传感器\t56187\n碗花\t56188\n吞人\t56189\n植食性\t56190\n时间财富网\t56191\n水楼\t56192\n探戈舞曲\t56193\nhxl\t56194\n超级小农民\t56195\n消谐器\t56196\n中国科技网\t56197\n王国权\t56198\n晶体剑\t56199\n越野\t56200\nApexbio\t56201\n湿婆神\t56202\nQ345B\t56203\n内控制度\t56204\n靖州\t56205\n惯性矩\t56206\npupu\t56207\n幻想生活\t56208\n副院长\t56209\n征兵\t56210\n包头楼盘网\t56211\n简史\t56212\n旋流\t56213\n王梦溪\t56214\n10%\t56215\n翼搏\t56216\n数析网\t56217\n总吨\t56218\n仙踪\t56219\n碟飞\t56220\n5.5cm\t56221\n猜字\t56222\nlub\t56223\n温线\t56224\n译者\t56225\n_筑龙教育网\t56226\n几个月前\t56227\n第29章\t56228\nokex\t56229\n人缘\t56230\n邮递\t56231\n七小\t56232\nszd\t56233\n以太网帧\t56234\n查名\t56235\nBAE\t56236\n羊膜\t56237\n弹出式\t56238\n知识产权出版社\t56239\n仁人\t56240\n机动式\t56241\n长春小学\t56242\n考察站\t56243\n商标证\t56244\n油泼辣子\t56245\n唐山市委\t56246\n竞店\t56247\n美味速递\t56248\n二级建筑师\t56249\nID\t56250\n处女地\t56251\n张世豪\t56252\n1292\t56253\nrhj\t56254\n叽叽\t56255\n中国书法家协会\t56256\n画饼\t56257\n马克斯·韦伯\t56258\n4500亿\t56259\nCare\t56260\n测评卷\t56261\n浙江长龙航空\t56262\n试机\t56263\n只字不提\t56264\n伟高\t56265\n长春经开区\t56266\nLearn\t56267\nwoke\t56268\n二恶烷\t56269\n打钩\t56270\n燕子矶\t56271\n自锁\t56272\nCarpenter\t56273\n宁安\t56274\n饭桌\t56275\n丹神\t56276\n嘉丽泽\t56277\n卫练\t56278\n中央广播电视总台\t56279\n知豆d2\t56280\n破产重整\t56281\n会计考试\t56282\n东风乡\t56283\n【列王\t56284\n韮\t56285\n食品经营许可证\t56286\n23万元\t56287\n0769\t56288\n树脂\t56289\nSoDu_搜读520小说网\t56290\n那么重要\t56291\nnikesb\t56292\n理县\t56293\n金山银山\t56294\n水图\t56295\nprimatte\t56296\nxchange\t56297\n潜龙轰天\t56298\n绿苑\t56299\n选股\t56300\n546\t56301\n热场\t56302\n眼影盘\t56303\n乐税网\t56304\ndokidoki\t56305\n江宇\t56306\n大褂\t56307\n曼妮\t56308\n涂改\t56309\n软矿\t56310\n吴君\t56311\n温软\t56312\nListary\t56313\n几幅\t56314\n东野贝克汉姆\t56315\n房地产报\t56316\n加菲猫\t56317\n浮图\t56318\n140个\t56319\n打座机\t56320\n佳明\t56321\n哀牢\t56322\n皮膏药\t56323\n上海市教育人才交流服务中心\t56324\n三星on7\t56325\n太东天樾湾\t56326\n5568\t56327\n和嫂子同居的日子\t56328\n四众\t56329\n北京市市政工程设计研究总院有限公司\t56330\nCRM\t56331\n第3讲\t56332\njva\t56333\n毕业论文答辩\t56334\n春夏季\t56335\nQuake\t56336\nregasm\t56337\n成都银泰中心\t56338
\n世茂城三期\t56339\n中国快递公司\t56340\nMitutoyo\t56341\n腿式\t56342\n腹膜\t56343\n忠者\t56344\n拉赫玛尼诺夫\t56345\n1950年\t56346\n天地劫\t56347\nmissions\t56348\n海门港新区\t56349\n快速路网\t56350\n君银\t56351\n认股\t56352\n日本佛教\t56353\n国民老公\t56354\n上海市教育委员会\t56355\n7岁\t56356\n扩展卷\t56357\n内蒙古自治区食品药品监督管理局\t56358\n联合大学\t56359\n人网\t56360\n瓦娘\t56361\nls200\t56362\n谷超豪\t56363\n参茶\t56364\n孔连顺\t56365\n懒洋洋\t56366\n少先队员\t56367\n刘莎莎\t56368\nccpacer\t56369\n阳痿\t56370\n藏獒\t56371\n捷克克朗\t56372\n有异\t56373\nCreater\t56374\n重庆秀山\t56375\n启新\t56376\nParallax\t56377\n十六款\t56378\nprettier\t56379\n各地区\t56380\n滞销\t56381\n史话\t56382\n佳宝\t56383\n薛开宇\t56384\n600028\t56385\n得图\t56386\n杭州在线\t56387\n电子琴\t56388\nPHC\t56389\n2mm\t56390\n奇闻\t56391\n历界\t56392\n游园不值\t56393\n树顶\t56394\n双眸\t56395\n肥波\t56396\n邓涛\t56397\n第六宫\t56398\nETSI\t56399\n刘奇\t56400\nExceptions\t56401\nMetal\t56402\n狼毫\t56403\n朱琪\t56404\n省直辖市\t56405\n老北京四合院\t56406\n尚德机构\t56407\n李家镇\t56408\n汪达\t56409\n6月下旬\t56410\n盐藻\t56411\n雷达线雕\t56412\n黑丝袜\t56413\n201711月\t56414\n爱德华诺顿\t56415\nDebut\t56416\n网易云阅读\t56417\n狮鹫学派\t56418\n资源群\t56419\n暗夜精灵2\t56420\n张祖庆\t56421\nCosmetic\t56422\nv9.0.1\t56423\n邢昭林\t56424\n阵型\t56425\n弯强度\t56426\n公干\t56427\n湖北省科技厅\t56428\n刘媛\t56429\n第十五集\t56430\n舔逼\t56431\nCatalogue\t56432\n直线加速器\t56433\n按扭\t56434\n难为\t56435\n爱问网\t56436\n宏\t56437\n新罗\t56438\nB级车\t56439\n复混肥\t56440\n大连超越\t56441\n真情流露\t56442\n判案\t56443\n保湿\t56444\n杨青山\t56445\nGenymotion模拟器\t56446\n950\t56447\n青年时代\t56448\n剑灵诡面\t56449\n高木\t56450\nHush\t56451\nMommy\t56452\nNasdaq\t56453\n学奕\t56454\n万科红郡\t56455\nILCE-7\t56456\n老家人\t56457\n新丰路\t56458\n农艺师\t56459\n调养\t56460\n工程咨询公司\t56461\n224集\t56462\n暮光之城4:破晓\t56463\n肉友\t56464\nREFERENCE\t56465\n朱咪咪\t56466\n百观网\t56467\n压力位\t56468\n南医三院\t56469\n等到\t56470\nregard\t56471\n文学楼\t56472\n底图\t56473\n购销\t56474\n老铁\t56475\n7串\t56476\n云烟成雨\t56477\n陈盈燕\t56478\nRAISE\t56479\nSurvival\t56480\nmt世界\t56481\n同济\t56482\n麻疹\t56483\nprune\t56484\nsauce\t56485\nwinmail\t56486\n狱友\t56487\n热度图\t56488\n心旋律\t56489\nshipments\t56490\n黑月\t56491\n茶妹\t56492\npyw\t56493\n爱德华王
子岛\t56494\nautovue\t56495\n汹汹\t56496\n公众版\t56497\n奚落\t56498\nFocustc\t56499\n大国重器\t56500\n真三国无双6\t56501\n裸条门\t56502\n郑州市社会保险局\t56503\ncolombia\t56504\nBBOOM\t56505\n国历\t56506\n哭声\t56507\n9555\t56508\n照美冥\t56509\n10061\t56510\nHEPA\t56511\n太清宫\t56512\n金加洛斯\t56513\n后式\t56514\n星球大战:前线2\t56515\n头陀\t56516\n小熠\t56517\n日照市人力资源和社会保障局\t56518\n容忍\t56519\n178星际\t56520\n三竿\t56521\n第18039期\t56522\nnsw\t56523\n开心冬\t56524\n刘娟\t56525\n污染者\t56526\n_台玻青岛玻璃有限公司\t56527\n守婚如玉\t56528\n88号\t56529\n杆织机\t56530\n摩诃婆罗多\t56531\nETF\t56532\n党政机构\t56533\n3470\t56534\n小村\t56535\n虚拟股权\t56536\njsession\t56537\n产业并购基金\t56538\n子期\t56539\n动脉瘤\t56540\n怆然而涕\t56541\n马建国\t56542\n番木瓜\t56543\n通透性\t56544\n唐双宁\t56545\n飞荣达\t56546\nWhile\t56547\n模拟式\t56548\n曲华影院\t56549\n切边\t56550\nsoccer\t56551\n杨楠\t56552\n超级大坏蛋\t56553\n木耳菜\t56554\n侠盗飞车猎车手圣安地列斯\t56555\n81本\t56556\n行客工作室\t56557\n柳梢\t56558\nHeard\t56559\n19元\t56560\n宇翔\t56561\n电动轮椅\t56562\n海底\t56563\nmundo\t56564\n赤安\t56565\n虎门二桥\t56566\n學網\t56567\noct\t56568\n吹呀\t56569\n故事机\t56570\ngogort\t56571\n怦然星动\t56572\nSSLVPN\t56573\n滑向\t56574\n窒息感\t56575\n塑胶制品\t56576\n阴和俊\t56577\n3脚\t56578\n剃须\t56579\n窗棂\t56580\n祖先\t56581\n达仁堂\t56582\n访客机\t56583\nBEACH\t56584\n轻奢品\t56585\n景域集团\t56586\n半年多\t56587\n暗巷\t56588\n上古卷轴5MOD\t56589\n仁恒置地\t56590\n利索\t56591\nMOMENT\t56592\n各段\t56593\n爱而生\t56594\n浙江省卫生厅\t56595\n中国指数研究院\t56596\n思南\t56597\n飞驒\t56598\n高磊鑫\t56599\n藏奸\t56600\n金艺琳\t56601\n肖尚略\t56602\n黏稠\t56603\n滴眼剂\t56604\n新红\t56605\nrival310\t56606\n爱似\t56607\n打不全\t56608\n搞乱\t56609\n悲欢\t56610\nFolio\t56611\n国风塑业\t56612\nXmarks\t56613\n各具\t56614\nMadden\t56615\n【汉\t56616\n茅山风景区\t56617\nLMAX\t56618\n5星级\t56619\nbaise\t56620\n金棒\t56621\n离夏\t56622\n10年左右\t56623\nDumb\t56624\n锋线\t56625\nfrancisco\t56626\n三国无惨\t56627\n颈带\t56628\n汉威科技集团股份有限公司\t56629\nUMX4\t56630\n钓叟\t56631\n陈羽凡\t56632\n务卿\t56633\n不舒\t56634\n一两个小时\t56635\n能率热水器\t56636\n2016年终\t56637\nTSI\t56638\n暖通公司\t56639\n电偶\t56640\n玉龙县\t56641\n非语言交际\t56642\nshuxue\t56643\n我们仨_杨绛\t56644\n五湖\t56645\n基金重仓\t56646\nHidden\t56647\n五五\t56648\n未必能\t
56649\n再唱山歌给党听\t56650\n张大伟\t56651\n邱明成\t56652\n配列\t56653\n消费卡\t56654\n平定县\t56655\n受不鸟\t56656\n我的最爱\t56657\n宏利\t56658\n旺衰\t56659\n移植\t56660\n李斌\t56661\nLED吸顶灯\t56662\n教务在线\t56663\n303号\t56664\n夏辉\t56665\n威创\t56666\n乌烟瘴气\t56667\n高压柜\t56668\n我喜欢你\t56669\n对公帐户\t56670\n单晶硅\t56671\nbenz\t56672\n狸藻\t56673\n环保工程师考试\t56674\nare\t56675\n我要的情深似海\t56676\n大同市南郊区人民政府\t56677\n长友\t56678\n张小明\t56679\n嘴儿\t56680\n伊东红\t56681\n祥云小镇\t56682\n基圆\t56683\n100KB\t56684\n河北省通信管理局\t56685\n月影传说\t56686\n浴帽\t56687\n精神病科\t56688\n杭钢股份\t56689\n1000吨\t56690\n开慧镇\t56691\n李谷\t56692\n熊孩子儿歌\t56693\n⑷\t56694\n常平站\t56695\nkikong\t56696\nucsc\t56697\n高瑜\t56698\n余罪冰公主的校园之行\t56699\n改编剧\t56700\n花家\t56701\n凸窗\t56702\n腹股沟淋巴结肿大\t56703\n手势语\t56704\n诺邓\t56705\n香港电影院\t56706\nAR-2048S\t56707\n24吨\t56708\n1pdf\t56709\n小悦悦\t56710\n雷米\t56711\n耍狠\t56712\n木禾\t56713\nphp.ini\t56714\n在职博士招生信息网\t56715\nBANDAI\t56716\n订版\t56717\n杭盖乐队\t56718\ncommunicate\t56719\n扁圆\t56720\n密考\t56721\n断链\t56722\n板鞋\t56723\n市委书\t56724\nJudy\t56725\n隆城\t56726\n工笔\t56727\n二行\t56728\n高碑\t56729\n冬雨\t56730\n绿领巾\t56731\n鲍起静\t56732\n公用\t56733\nEDAS\t56734\n南京长江隧道\t56735\nF杯\t56736\nvisio流程图\t56737\n一双手\t56738\nCOSER\t56739\n9876\t56740\n催费\t56741\n割晒机\t56742\njijia\t56743\n初筛\t56744\n瓜渚湖\t56745\n科学馆\t56746\n摊大饼\t56747\nBrings\t56748\nFile类\t56749\n差号\t56750\nwinscp\t56751\n攀成钢\t56752\ncrosstab\t56753\n国际青年旅舍\t56754\n木筏\t56755\n站点\t56756\n雾山\t56757\n3303\t56758\n长春103中学\t56759\n苏宁零钱宝\t56760\n社会科学家\t56761\n灵机一动\t56762\n凭借\t56763\n威迅\t56764\n牡丹峰乐团\t56765\nsnip\t56766\nHD-720P/1080P-MP4\t56767\n乐从镇\t56768\n紫菜汤\t56769\n夹注\t56770\nHotdog\t56771\nslither\t56772\n着生\t56773\nD20\t56774\n1580元\t56775\nACM/ICPC\t56776\n0721\t56777\nreporting\t56778\n通风道\t56779\n诶\t56780\n五瓶\t56781\n刊期\t56782\n上海公墓\t56783\nEmployer\t56784\n益农\t56785\n送货单\t56786\n克拉苏\t56787\n打铁还需自身硬\t56788\n宝诗龙\t56789\n豪乳\t56790\n复位键\t56791\n汉人\t56792\n李营\t56793\nhad\t56794\n周_\t56795\n乱喷\t56796\n朴惠敏\t56797\nPHANTOM\t56798\n房地产开发有限公司\t56799\n40多种\t56800\n入门者\t56801\n聂海芬\t56802\n黄金分割法\t56803\n彭阳\t568
04\nSEN\t56805\n改连供\t56806\n库龄\t56807\n江畔独步寻花\t56808\n第0集\t56809\nSkeleton\t56810\n蓝色\t56811\nremixos\t56812\n+8\t56813\nbubu\t56814\n铵根离子\t56815\n中兴路\t56816\n0.025\t56817\n带给\t56818\nhightcharts\t56819\n3840\t56820\nscales\t56821\n48001\t56822\n更美好\t56823\n服装秀\t56824\n茂宜岛\t56825\ntutuanna\t56826\n21.9\t56827\n安卓7\t56828\n中国诡实录\t56829\n99集\t56830\n光端机\t56831\nqf\t56832\n沪蓉高速\t56833\n适马\t56834\n女女优\t56835\n全歼\t56836\n软文营销\t56837\n这道菜\t56838\n硬密封蝶阀\t56839\n谈学\t56840\n达尔优机械师\t56841\n长沙车管所\t56842\n宝狄\t56843\n陈一堂\t56844\n炒菜机\t56845\n500e\t56846\n4.45\t56847\n太阁立志传5战国绘卷\t56848\nW\t56849\n沈鸿根\t56850\nMorphology\t56851\n超声波洗碗机\t56852\nCara\t56853\n松江区\t56854\n九四\t56855\n吞咽\t56856\n护林\t56857\n红尘情歌\t56858\n太平人寿保险有限公司\t56859\n水濂山森林公园\t56860\n_周\t56861\n2018数\t56862\n中阿\t56863\n震雄\t56864\nCSS框架\t56865\n餐厨垃圾\t56866\n16幅\t56867\n核动力\t56868\n水污染物\t56869\nwin10系统盘\t56870\nkirikiri\t56871\nKnows\t56872\n筑炉\t56873\nhandle\t56874\n2011年4月\t56875\n福乐\t56876\n远东国际租赁有限公司\t56877\n蛋白尿\t56878\n可减免\t56879\n简称\t56880\nEduSoho\t56881\nFireworks\t56882\n300多名\t56883\n平开门\t56884\nDecompiler\t56885\n毛囊角化症\t56886\n团岛\t56887\n溺死\t56888\n第229集\t56889\n韩天衡\t56890\n大龙潭\t56891\n解放前\t56892\nv11.0\t56893\nSpeedo\t56894\n柠都安岳\t56895\n逍遥\t56896\n铬钼钢\t56897\n霸锐\t56898\n市场调查网\t56899\n博尔赫斯\t56900\n500千瓦\t56901\nAllegorithmic\t56902\nequinox\t56903\n周6\t56904\n12.1\t56905\n涉税\t56906\nCINEMA\t56907\n4笔\t56908\natsl\t56909\nCTCS\t56910\n酷Q机器人\t56911\n寻人网\t56912\n如颖\t56913\n十九年\t56914\ntcf\t56915\n魏晓雪\t56916\n锡基霍尔岛\t56917\n用不起\t56918\n致姗姗来迟的你\t56919\n8千米\t56920\n泉州台商投资区\t56921\n李大美\t56922\n徐天\t56923\n國語\t56924\n摩尔定律\t56925\n超能力者齐木楠雄的灾难\t56926\n隔\t56927\n河北辛集中学\t56928\n红苹果乐园\t56929\nsu8\t56930\nmarmot\t56931\n浦东社区\t56932\n微喷\t56933\n锐星\t56934\n远上\t56935\n文法学院\t56936\n不可磨灭\t56937\nDiagnosis\t56938\n瓣\t56939\n月子餐\t56940\n立向\t56941\n轰鸣声\t56942\n培英\t56943\nhdv\t56944\n感应门\t56945\n110指挥中心\t56946\n中乐\t56947\n山水华庭\t56948\n第26话\t56949\n广隆\t56950\n铺张浪费\t56951\n李纪恒\t56952\n拖鞋\t56953\n高尔夫度假村\t56954\n串筒\t56955\n就地取材\t56956\n迥
异\t56957\n兰新\t56958\n出位\t56959\n少精症\t56960\n爱玲\t56961\n幽怨\t56962\n大旺\t56963\n中国建筑材料集团有限公司\t56964\n浦江新闻网\t56965\n大连金州湾国际机场\t56966\n3296\t56967\n天君\t56968\n陈俊豪\t56969\niphone6S\t56970\nJ\t56971\n头盖骨\t56972\n陈家桥\t56973\n脱氧核糖核酸\t56974\n公价\t56975\nNightwish\t56976\npapi酱\t56977\nhuaian\t56978\n秋元\t56979\n为家\t56980\nisapi\t56981\n精英国际移民咨询公司\t56982\n陆氏\t56983\n46.5\t56984\n残翅\t56985\nTimberland\t56986\n冷婷\t56987\nHUST\t56988\n首约\t56989\nOFFICE\t56990\n极短\t56991\n朱建军\t56992\n天韵之声配音网\t56993\nBindings\t56994\nWWE\t56995\nteaser\t56996\n丰满区\t56997\n三羟甲基氨基甲烷\t56998\n八仙桌\t56999\n埃德·斯塔福德\t57000\n北京宝岛妇产医院\t57001\n赵先生\t57002\n护垫\t57003\n功夫瑜伽\t57004\n漳州师范学院\t57005\ntv版\t57006\n全文\t57007\n天地盒\t57008\n琴叶\t57009\n标识语\t57010\n韩饭\t57011\n珍品网\t57012\n涡轴\t57013\n甬金铁路\t57014\neclpise\t57015\n600219\t57016\nbabelua\t57017\n真空\t57018\n活期利率\t57019\n目标感\t57020\n中国铝业股份有限公司\t57021\n三扇\t57022\n客流量\t57023\n天宝镇\t57024\n约约哥\t57025\nspecialist\t57026\n小白裙\t57027\n分节符_\t57028\nCraft\t57029\nNETS\t57030\n过当\t57031\n李晟敏\t57032\nsteem\t57033\n秦曾煌\t57034\n豪门世家\t57035\n龙江县\t57036\nselenium2\t57037\nacta\t57038\n杭州迪普科技股份有限公司\t57039\n大甲岛\t57040\nark\t57041\n2018首\t57042\n史健凯\t57043\n快闪版\t57044\n断连\t57045\n欧佩克\t57046\n伞型\t57047\n背阔肌\t57048\n一名\t57049\nalexkn\t57050\n合规性\t57051\n硼烷\t57052\n美国国防部\t57053\n轰6K\t57054\n中国妇女发展基金会\t57055\n天士力\t57056\n西工区\t57057\njiexi\t57058\n奥术师\t57059\n加派\t57060\n助战\t57061\n威士伯\t57062\n北大附小\t57063\n我的世界吧\t57064\nRooms\t57065\n绿博会\t57066\nneq\t57067\n密码机\t57068\n雨一直下\t57069\n创融\t57070\nPrivate\t57071\n炮轰\t57072\nqdedu\t57073\n苏嘉甬\t57074\n投影式\t57075\nEPON\t57076\nDVDES\t57077\nseems\t57078\n兰州\t57079\nFalcom\t57080\nTSS\t57081\n1000m3\t57082\nrandy\t57083\n工商管理毕业论文\t57084\n龙眠大道\t57085\n胰头癌\t57086\nSpecifications\t57087\n70ml\t57088\n局外\t57089\n上海师范大学》\t57090\n80毫升\t57091\n5750g\t57092\n使者\t57093\n很太\t57094\n心象\t57095\nALWAYS\t57096\n青轴\t57097\n丹布朗\t57098\n19位\t57099\nAcademy\t57100\n奥迪a9\t57101\n陈家乐\t57102\nmaohaiqing0304\t57103\n千星流月\t57104\n李延年\t57105\ntreating\t57106\ntal\t57107\n
鲁班锁\t57108\n金达威\t57109\n中小板指数\t57110\n京沪二线\t57111\n刘彦直\t57112\n省农科院\t57113\n天津市发展和改革委员会\t57114\n意念\t57115\n杨度\t57116\n公教\t57117\nOutfitters\t57118\n干板\t57119\n起泡瓶\t57120\n中维云视通\t57121\n黑暗幻想\t57122\n实验幼儿园\t57123\n老外们\t57124\n没时间\t57125\n赫曼米勒\t57126\n81级\t57127\n24粒\t57128\njanice\t57129\nsustainable\t57130\n上考网\t57131\n夕照寺\t57132\n优月真里奈\t57133\n狗神\t57134\n真理\t57135\n衍宗丸\t57136\n裘盛戎\t57137\nwin7系统配置\t57138\n复发\t57139\n电臀舞\t57140\n行星齿轮减速机\t57141\n定本\t57142\n2.68\t57143\nmitigation\t57144\n来广营乡\t57145\n2020年前\t57146\n知之者\t57147\nschools\t57148\n氯化钠\t57149\n龙西\t57150\n南通师范高等专科学校\t57151\n99式坦克\t57152\n北城天街\t57153\n荷园\t57154\nell\t57155\n惊悸\t57156\n雁栖鸣播音暮玖\t57157\n倭黑猩猩\t57158\n传遍\t57159\nhup\t57160\n罪魁\t57161\nexcess\t57162\n警监\t57163\nsbt\t57164\n纸牌屋\t57165\n道号\t57166\n装装修\t57167\n拘谨\t57168\n外国语学校\t57169\n咸蛋\t57170\n养护品\t57171\n700w\t57172\n全民超人汉考克2\t57173\ntask2\t57174\n大中小\t57175\n围护\t57176\n陈国华\t57177\n6822\t57178\n附属国\t57179\n北京地铁13号线\t57180\nThinkPhp\t57181\nVectors\t57182\n22场\t57183\nì\t57184\n荀彧\t57185\nvalid\t57186\n356号\t57187\n姿容\t57188\n犯罪集团\t57189\n福田中心区\t57190\njj斗地主\t57191\n南孚护驾#\t57192\n孕女\t57193\nSTM32F030\t57194\n狮威\t57195\n百视网\t57196\n实体机\t57197\n中锐\t57198\n代谢组\t57199\n1.6.2.2192\t57200\n喜乐蒂\t57201\n城市广场\t57202\n方雅丽\t57203\n武商网\t57204\n林霏\t57205\n北京千方科技股份有限公司\t57206\nitunse\t57207\nDHC黄金霜\t57208\n巨舰\t57209\n成都机电工程学校\t57210\n镍矿\t57211\nsrio\t57212\n十八\t57213\n名泉\t57214\n第三十五章\t57215\n盐酸氮卓斯汀鼻喷剂\t57216\n精听\t57217\n周永\t57218\n温医\t57219\n算命术\t57220\n成都市二医院\t57221\nxax\t57222\n编年\t57223\n台湾区\t57224\n刘新宇\t57225\n10月11日\t57226\n混行\t57227\n安和\t57228\n奔腾X80\t57229\n余晓维\t57230\n成立集团公司\t57231\n讨好\t57232\n焖面\t57233\n马冬梅\t57234\n朝阳西路\t57235\nFIBA\t57236\n泗滨\t57237\n阿岚\t57238\niBS外语学院\t57239\n论者\t57240\n光宗耀祖\t57241\n二十六个\t57242\n文氏\t57243\nLure\t57244\n支撑型\t57245\n健康指南\t57246\n178怪物猎人\t57247\n凤凰山海港\t57248\n比好\t57249\n择天尘缘\t57250\n明盒\t57251\n高糖\t57252\n强推\t57253\n1批次\t57254\n全自动机械表\t57255\n纱线\t57256\n爆管\t57257\n碟刹器\t57258\n京川\t57259\n关乎\t57260\n製\t57261\nDS7\t57262\n深圳
眼科医院\t57263\ntianyancha\t57264\n王玺雯\t57265\n砂袋\t57266\n姜涩琪\t57267\n修成\t57268\n泗阳房产网\t57269\njournalism\t57270\n自然色\t57271\n张三的歌\t57272\n光纤适配器\t57273\n斗破苍\t57274\n丁祖昱\t57275\n绿壳蛋鸡苗\t57276\nnotability\t57277\n86岁\t57278\n蔡家岗\t57279\n天御套\t57280\n管网\t57281\n沭河\t57282\n饮鸩止渴\t57283\n汽渡\t57284\nvat\t57285\n啖\t57286\n丧生\t57287\n两男\t57288\n防臭袜\t57289\n物联网大会\t57290\nSPFA\t57291\nx21a\t57292\n预中\t57293\n一假\t57294\n首都医科大学附属北京中医医院\t57295\nlieju\t57296\n全行\t57297\nnlm\t57298\n挂挡\t57299\n2018048\t57300\nBitcoin\t57301\n王者之战\t57302\n支云\t57303\n磁力锁\t57304\n胡华\t57305\nzbox\t57306\n本所\t57307\nxkdy\t57308\n1.08\t57309\n周菲\t57310\n银家\t57311\n乳素\t57312\n国别体\t57313\nv3.00\t57314\n家秀\t57315\n议定\t57316\nNSF\t57317\nAppCan文档中心\t57318\n97.1\t57319\netc卡\t57320\n火火的姑娘\t57321\n荷尔德林\t57322\n陆俨\t57323\n粉雾剂\t57324\n徽杭古道\t57325\n哈尔滨医科大学附属肿瘤医院\t57326\n4篇\t57327\n篱落\t57328\n恩施乐居\t57329\n京东众智\t57330\n羊病\t57331\n智真\t57332\n地垄\t57333\n古建筑木\t57334\n得利斯\t57335\n冠通\t57336\n长沙市雅礼中学\t57337\n汕头龙湖区政府\t57338\n董桥\t57339\n触景生情\t57340\n弹丸\t57341\n交谊舞曲网\t57342\n尽显\t57343\naqours\t57344\nemacs\t57345\n德力西电气\t57346\n排除法\t57347\n科生\t57348\n青歌\t57349\n砂水\t57350\nxclient\t57351\n索尼A7\t57352\nmarui\t57353\n大马\t57354\n半分\t57355\n局气\t57356\n入息\t57357\nc5000\t57358\n宁波移动\t57359\nKENT\t57360\nVideoScribe\t57361\n工路\t57362\n中山大学医学院\t57363\n新城新区\t57364\n非经营性\t57365\n柳体\t57366\n复旦管理学院\t57367\nquente\t57368\n红旗渠干部学院\t57369\n拿起来\t57370\n精门\t57371\n报账员\t57372\n500辆\t57373\n劲\t57374\n南湖革命纪念馆\t57375\n宜昌路\t57376\n暗点\t57377\n停船\t57378\n十一运\t57379\n百炼成\t57380\n父亲\t57381\n13000公里\t57382\n梦想世界吧\t57383\n公职\t57384\n17年1月\t57385\n95050900\t57386\n夜后\t57387\n樱桃番茄\t57388\n大保健\t57389\n百年树人\t57390\n胆道镜\t57391\nASML\t57392\n电原理\t57393\n天巡网\t57394\n美国时代广场\t57395\n63天\t57396\n太原大学\t57397\nthousands\t57398\n紧身裙\t57399\n荆门市第一人民医院\t57400\n密码编码学\t57401\n高岛屋\t57402\n爱婴医院\t57403\n普利制药\t57404\nqueens\t57405\nRoguelike\t57406\n门号\t57407\n国际贸易有限公司\t57408\n铃木天语\t57409\n62式\t57410\n调测\t57411\nLAS\t57412\n银染\t57413\n雪莲\t57414\nHito\t57415\n威远\t57416\nJilin\t5
7417\n泉山\t57418\n高邮湖\t57419\n7500元\t57420\n偏角\t57421\n直刺\t57422\n峰位\t57423\n封港\t57424\n春分日\t57425\nmax格式\t57426\n银耳钉\t57427\n蔡康永\t57428\n合网\t57429\n小米45w\t57430\n4511\t57431\n中国签证申请服务中心\t57432\n韩人\t57433\n请说\t57434\n埃隆·马斯克\t57435\n橘子郡\t57436\nmcnp\t57437\nwinx64\t57438\n讷河市\t57439\nconnet\t57440\n50小时\t57441\n片假名转换器\t57442\n黄桥村\t57443\n125集\t57444\nsli\t57445\nmini5\t57446\n休战\t57447\n新著龙虎门\t57448\n省地\t57449\n一汽丰田RAV4\t57450\n刘芮麟\t57451\n龙穴岛\t57452\n西决\t57453\n肉架\t57454\nk50\t57455\n610分\t57456\n组合键\t57457\n痘印\t57458\n123集\t57459\n镰刀型细胞贫血症\t57460\n防卫\t57461\n处门\t57462\n非遗网\t57463\n数学篇\t57464\n乙酯\t57465\nwas8.5\t57466\n山河图\t57467\n基因农业网\t57468\netch\t57469\n喜收\t57470\nJORDAN\t57471\n燕飞利仕\t57472\n平塘县\t57473\nadjusting\t57474\n尤克里里中国网\t57475\n脱硫剂\t57476\n永生难忘\t57477\n立体图\t57478\nYeti论坛_汽车之家论坛\t57479\n追忆\t57480\n戴相龙\t57481\nxfyy\t57482\n开炼机\t57483\n华医\t57484\n萧然在线\t57485\nSCHOTT\t57486\n血与泪\t57487\nIndependence\t57488\n卧推\t57489\n沙塔尔沙吾\t57490\n学生端\t57491\n上警\t57492\n深圳市科陆电子科技股份有限公司\t57493\njizhi\t57494\nsistar\t57495\n老断\t57496\n李志洲\t57497\n数行\t57498\nsimba\t57499\n小罗莉\t57500\n减肥\t57501\nExact\t57502\n博亚\t57503\n玉湖村\t57504\n八仙饭店之人肉叉烧包\t57505\n关林\t57506\n灵核\t57507\n烬\t57508\n征信\t57509\natlantis\t57510\nchronometer\t57511\n价廉物美\t57512\n矩阵\t57513\nWMV\t57514\n根呷\t57515\n辑刊\t57516\nLustre\t57517\nGoodman\t57518\n荣威\t57519\nGROUP\t57520\n水桶包\t57521\n効果\t57522\npresentations\t57523\n合缝\t57524\n三十路\t57525\nRecurrent\t57526\n巍巍\t57527\n马修\t57528\n国统区\t57529\n萝卜排骨汤\t57530\n电台\t57531\nPWM波\t57532\nVysor\t57533\niwconfig\t57534\n坡跟\t57535\n中国电信股份有限公司\t57536\n成观法师\t57537\n姜胜允\t57538\n大公财经_大公网\t57539\n盲拧\t57540\njlol\t57541\n中式台球世锦赛\t57542\n52114\t57543\n1938\t57544\n条码秤\t57545\n目瞪\t57546\n单样板\t57547\n尔康制药\t57548\n傻脸娜\t57549\n召唤术\t57550\n小刀娱乐网\t57551\n漆枪\t57552\nZBar\t57553\n41_\t57554\n菲利浦\t57555\n56075775\t57556\n以示\t57557\nPHA\t57558\n嵴\t57559\n微趣\t57560\n徐玉\t57561\n焊锡膏\t57562\n菩提偈\t57563\n房产网_旅居网\t57564\nUncut\t57565\n2017年五一节\t57566\n屠呦呦\t57567\ngup\t57568\ndownie\t57569\n1.7亿\
t57570\n王丹\t57571\n蜀葵\t57572\n魂兮\t57573\nTina\t57574\n张欣尧\t57575\n恰似爱如潮\t57576\n脑血流\t57577\n亚稳态\t57578\n跨服\t57579\n拐卖\t57580\n龙拳小子\t57581\n江海明珠网\t57582\nTASTE\t57583\nqueen\t57584\n助纣为虐\t57585\n争斗\t57586\n乡间\t57587\n吸光度\t57588\nename\t57589\nVisualStudio2013\t57590\n书铁\t57591\n完婚\t57592\nPowerPoint2007\t57593\nGitblit\t57594\nPictures\t57595\n扭腰\t57596\ninsydeh20\t57597\n牛鬼\t57598\nStark\t57599\n步月\t57600\n砸向\t57601\n20160730\t57602\n辣椒粉\t57603\n长胎\t57604\n金盛路\t57605\n合肥百大集团\t57606\n厝\t57607\n嘉兴百姓网\t57608\n中共浙江大学委员会\t57609\n1s\t57610\nLoud\t57611\n陈榕\t57612\n正通汽车\t57613\n风干\t57614\n楽器バンド\t57615\n热荐\t57616\n怪异\t57617\n1.3.7\t57618\n四川省教育考试院\t57619\n福建日报\t57620\n程小东\t57621\n国土空间\t57622\nPSC\t57623\nbeida\t57624\n星爷\t57625\nThinkSNS\t57626\n比丘尼\t57627\n目录\t57628\n綾瀬\t57629\n喂料机\t57630\n协调会\t57631\n妙妙\t57632\n合肥市人民政府办公厅\t57633\n打包机\t57634\n功名利禄\t57635\n蟹苗\t57636\n完美世\t57637\n大强\t57638\nd50\t57639\n校审\t57640\n海枣\t57641\n白羽鸡\t57642\n熊本熊\t57643\n信号处理\t57644\n一氧化碳中毒\t57645\n岭南水乡\t57646\n抽水马桶\t57647\n注水\t57648\n小视屏\t57649\n传媒类\t57650\n扬眉\t57651\n小蝇\t57652\n全球十大管理咨询公司\t57653\n起毛\t57654\n李善长\t57655\nFourier\t57656\n血月岛\t57657\n偷情宝鉴\t57658\n冬泉谷\t57659\n钞\t57660\n35部\t57661\n格差天堂\t57662\n陈凯歌\t57663\nAgnes\t57664\n栗鼠\t57665\n山东大学数学学院\t57666\n暖石网\t57667\n昆明理工大学津桥学院\t57668\ncjn\t57669\n附属卡\t57670\n近距\t57671\njnc\t57672\n灌肤\t57673\nR35\t57674\n金山公司\t57675\n丝照\t57676\n870\t57677\n代表会\t57678\n输出量\t57679\n王小华\t57680\n牛角包\t57681\n泡沫经济\t57682\n高扬程\t57683\nexpand\t57684\n130亿元\t57685\n黄晨\t57686\n打脸\t57687\np31\t57688\n主菜\t57689\n意中\t57690\n拦不住\t57691\n送错\t57692\nmevius\t57693\n保险标的\t57694\n戴维森\t57695\nEason\t57696\n适当地\t57697\n上海江莱生物科技有限公司\t57698\n环牧\t57699\n模拟赛\t57700\n_特玩绝地求生\t57701\n润达医疗\t57702\n溜须\t57703\n10ppm\t57704\n天下为公\t57705\n一长\t57706\n路人局\t57707\n宁夏回族自治区教育厅\t57708\n214000\t57709\n淡淡\t57710\n重点大学\t57711\n五天前\t57712\n朴茨茅斯\t57713\nSCons\t57714\n星游记之风暴法米拉\t57715\n木栈道\t57716\n通用定额发票\t57717\n泛微协同办公平台\t57718\n王志李兰迪\t57719\n亚路嘉\t57720\n疽\t57721\n他们\t57722\nqzzn\t57723\n032517\t57724\n
俊男坊\t57725\n备货\t57726\n空岗\t57727\n山东省人力资源和社会保障厅\t57728\n哈马\t57729\nCAVE\t57730\nleisure\t57731\n易善复\t57732\n歌友\t57733\n怀安\t57734\n衙前镇\t57735\nlavyun\t57736\nattrition\t57737\n2.46\t57738\n南外\t57739\n上海招聘网\t57740\n嘟嘴\t57741\n李恩珠\t57742\n可折叠式\t57743\nCFA协会\t57744\n夜刀神十香\t57745\n逢泽莉娜\t57746\n4824\t57747\nImmersive\t57748\n广深沿江高速\t57749\n兆新股份\t57750\n工合同\t57751\n标的额\t57752\n插针\t57753\n73年\t57754\n正元香槟花园\t57755\n谭秦东\t57756\n嘣\t57757\n天门冬氨酸\t57758\n航空史\t57759\n宅妖记\t57760\nconfluent\t57761\nTrackpad\t57762\nstardock\t57763\n第115\t57764\n港宝\t57765\n上海微电子装备有限公司\t57766\n比亚迪E5\t57767\n纵欢\t57768\n冲毁\t57769\n1.8kg\t57770\n商海通牒\t57771\n热水型\t57772\n胭脂色\t57773\n中频炉\t57774\n网序\t57775\n顶面\t57776\n赞比\t57777\n数据流程图\t57778\n稽查\t57779\ncsproj\t57780\nleds\t57781\ncomponent_verify_ticket\t57782\n艾兰岛\t57783\n三国杀\t57784\n泰和郡\t57785\n中国妇女网\t57786\n衬裙\t57787\n80_\t57788\nmba\t57789\n广州文华东方酒店\t57790\n50下\t57791\n降档\t57792\n冤魂\t57793\n结合\t57794\n画江湖之侠岚\t57795\n杀妻\t57796\n鞋机\t57797\n供一业\t57798\n偏移\t57799\n文泰\t57800\n抵押权人\t57801\n说家\t57802\n八九点\t57803\n小陶\t57804\n平场\t57805\nNorah\t57806\n搜房网房天下\t57807\n金投热点网\t57808\nsolutions\t57809\nstype\t57810\n组装屏\t57811\n南沙街\t57812\n虞海燕\t57813\n临县\t57814\njxggzy\t57815\nXplay\t57816\n时尚集团\t57817\n中兴镇\t57818\nwh\t57819\nMacross\t57820\n渔博会\t57821\nLicensed\t57822\n破局者\t57823\n返奖率\t57824\n荒村\t57825\n晚景\t57826\n组织关系\t57827\n电子屏\t57828\n电脑灯\t57829\n爱情与灵药\t57830\n象鼻\t57831\nendif\t57832\n悉尼市\t57833\n华数\t57834\n2.4T\t57835\n柱距\t57836\n减振器\t57837\n凤鸣轩\t57838\n洪荣宏\t57839\n北京工业职业技术学院\t57840\n寇乃馨\t57841\nWatch2\t57842\n吕敬人\t57843\n蒙\t57844\n配珠\t57845\n高山茶\t57846\n洁洁\t57847\n連結\t57848\n肿痒\t57849\n神州行\t57850\n南京公证处\t57851\n少前\t57852\n加密币\t57853\n陈霸先\t57854\n将台路\t57855\n高中数学竞赛\t57856\n上海复星高科技(集团)有限公司\t57857\n无法者\t57858\n陈光标\t57859\n立顿奶茶\t57860\n880路\t57861\n天唐天王\t57862\n青果巷\t57863\n中国专利信息中心\t57864\n绿武\t57865\n五四制\t57866\n笼鸟\t57867\n小超资源网\t57868\n御夫\t57869\n散体\t57870\nssserver\t57871\n画论\t57872\n64688882\t57873\n灌溉\t57874\ndev/sdb1\t57875\nzuk\t57876\n2k14\t57877\n中央社院\t5
7878\n用户\t57879\n三诺生物\t57880\n蔓\t57881\n郵件\t57882\n捧腹大笑\t57883\n拓维\t57884\n指法\t57885\n红岸网\t57886\n内装修\t57887\n落羽\t57888\n淘金云\t57889\n人和乡\t57890\n约稿\t57891\njingying\t57892\n3G版\t57893\n临卦\t57894\nCoupons\t57895\n南汇新城镇\t57896\n盾\t57897\n红烧鲫鱼\t57898\n四件套\t57899\ndesignjet\t57900\n文学馆\t57901\n风夏\t57902\n淚\t57903\n李师傅\t57904\n周全\t57905\n土建学院\t57906\n狗万\t57907\nanalytical\t57908\n差速器\t57909\n隆盛\t57910\n琵琶语\t57911\n要不来\t57912\n500张\t57913\n李忠杰\t57914\nBtbiti\t57915\n0708\t57916\n正在\t57917\n短路\t57918\n低聚糖\t57919\n三国机密之潜龙在渊全集\t57920\n補\t57921\n街子\t57922\nosteo\t57923\n移载\t57924\n东浮山村\t57925\n2050年\t57926\n脉动\t57927\nfcn\t57928\n制作工\t57929\nSUMIF函数\t57930\n乔丹\t57931\nforte\t57932\n随附\t57933\nM226\t57934\n23米\t57935\n人蛇\t57936\nOray\t57937\n地方史\t57938\n氩弧焊工\t57939\n朱海涛\t57940\n48小时后\t57941\n隔离变压器\t57942\nmelo\t57943\n花水木\t57944\n领奖励\t57945\n雾雨\t57946\n传祺GS7\t57947\n深圳市华联欧国际贸易\t57948\n猪厂\t57949\n差分放大电路\t57950\n哈利路亚\t57951\n一约\t57952\n报数\t57953\n黔南州\t57954\nRight\t57955\n智慧社区\t57956\n性病\t57957\n题型\t57958\n乐言\t57959\n质量效应3\t57960\n大丑\t57961\n濮存昕\t57962\n餐箱\t57963\n支付宝运动步数\t57964\nCombi\t57965\n飞屏\t57966\n8条\t57967\n插旗\t57968\nchroma\t57969\n一星半点\t57970\n群魔\t57971\n农村路\t57972\njs函数\t57973\n墨轩\t57974\n火力\t57975\nH3C技术论坛\t57976\n测光\t57977\n国际展\t57978\n十章\t57979\n查立\t57980\n石岛\t57981\n龙牌石膏板\t57982\nK宝\t57983\nt3航站楼\t57984\nURLError\t57985\n心相\t57986\n施肥器\t57987\n饥饿鲨\t57988\n梁华\t57989\n低高\t57990\n制钉机\t57991\n东营乐居\t57992\nWINE\t57993\n400平方米\t57994\n石河子市\t57995\n宝葫芦\t57996\n植脂奶油\t57997\n问价\t57998\nsstm\t57999\n追溯性\t58000\n第九关\t58001\n酒精灯\t58002\n三角面\t58003\n小米体脂\t58004\n售后\t58005\n新矿\t58006\nChangelog\t58007\n15倍\t58008\n凝思\t58009\n新余北站\t58010\n葡萄糖胺\t58011\n中国(上海)自由贸易试验区管理委员会保税区管理局\t58012\n傅剑寒\t58013\n小呆\t58014\n小银星艺术团\t58015\n上证50etf\t58016\n雷震子\t58017\n辨识\t58018\n唇辣号\t58019\n寇森\t58020\n剑阶\t58021\n海监\t58022\n给排水科学与工程专业\t58023\nTrends\t58024\n圣后\t58025\nEmil\t58026\nxxl-job\t58027\n集成库\t58028\n微粒\t58029\n谋定\t58030\n宦妃\t58031\nviper\t58032\n手术服\t58033\n溧\t58034\nrobbery\t58035\n汛前\t580
36\n32式\t58037\n医联体\t58038\ngaopengtttt\t58039\n大三巴\t58040\ncnds\t58041\n公立医院\t58042\n300box\t58043\n美图m4\t58044\n17310256231\t58045\nwdCP\t58046\n鸣谢\t58047\nsrs\t58048\n撸友们\t58049\n傲星阁\t58050\n3508\t58051\n3KG\t58052\n德铂\t58053\n言之有物\t58054\n龙门南\t58055\n五星街\t58056\n字形码\t58057\n河北省发改委\t58058\n雷神兽\t58059\n38级\t58060\n光明街\t58061\n胆脂瘤\t58062\n十年树木\t58063\n头天\t58064\n百家汇\t58065\n3过场\t58066\n石泉\t58067\n1.0.31\t58068\ngoolge\t58069\n野芦湾\t58070\n侧装\t58071\nspn\t58072\n锟斤拷\t58073\n深圳信息职业技术学院\t58074\n自由职业\t58075\nOWIN\t58076\n杰夫哈迪\t58077\n四轮车\t58078\nWastewater\t58079\n专用型\t58080\nmetastore\t58081\nstarskyhu\t58082\nDateTimePicker\t58083\n认不认\t58084\n外地人\t58085\n随机字符串\t58086\nHEETS\t58087\n地柏\t58088\n气杆\t58089\n600016\t58090\n公祭日\t58091\n天一城\t58092\n仁者无敌\t58093\n黄龙机场\t58094\n修型\t58095\n火车浏览器\t58096\n861\t58097\n宫口\t58098\n躬身\t58099\n澳国\t58100\n万豪积分\t58101\n轿门\t58102\n廖永远\t58103\n生意人\t58104\n年刊\t58105\n扶风吧\t58106\n爱乐维\t58107\n美易理财\t58108\n高新区人才网\t58109\n小昆山\t58110\n口水杭州论坛\t58111\nv4.3.0.0\t58112\n张嘉译\t58113\n小伙\t58114\n名法\t58115\n2018年初\t58116\n指标值\t58117\n金凤\t58118\n陇海铁路\t58119\n1200米\t58120\n8.10\t58121\noffice学院\t58122\n腾讯应用宝\t58123\ngolden\t58124\nline\t58125\n孵卵\t58126\n水力控制阀\t58127\nfetish\t58128\n污油\t58129\n段公路\t58130\n金立M7\t58131\nManuals\t58132\n僵尸新娘\t58133\nDVD-RMVB\t58134\nULTRA\t58135\noor\t58136\n好使\t58137\nBiology\t58138\nder\t58139\n吉林大学法学院\t58140\n小德cyj\t58141\npingtai\t58142\n德上高速\t58143\nNiagara\t58144\nqpcr\t58145\nDJ嗨嗨网\t58146\n美麗\t58147\n战士\t58148\nninebot\t58149\n欧华\t58150\n97zyz\t58151\n爱花\t58152\n颁发\t58153\n肌群\t58154\niyifei\t58155\n文体路\t58156\n马富天\t58157\n茶点\t58158\nre从零开始异世界生活吧\t58159\n莱迪\t58160\n提级\t58161\n急中生智\t58162\nIT程序猿\t58163\n第十八届\t58164\n吴彬\t58165\n泳\t58166\n房地产经纪人\t58167\n陆敏雪\t58168\n优效\t58169\n沙库巴曲\t58170\n激光扫描仪\t58171\n蒙文版\t58172\n第一家\t58173\nUnity2D研究院\t58174\n午市\t58175\n求是网\t58176\n犹如\t58177\n黑纹\t58178\n东京吸血鬼酒店\t58179\nNetSpeedMonitor\t58180\n领导人\t58181\n张咪\t58182\n李旺\t58183\n弗罗斯特\t58184\n深圳市凯立德科技股份有限公司\t58185\n置身事外\t58186\n电视家浏览器\t5818
7\n数字图像处理\t58188\n写好\t58189\n先锋队\t58190\nappStore\t58191\n傣味\t58192\nVISIO\t58193\n鸟食\t58194\n烤串店\t58195\n3转\t58196\n洛易\t58197\nhobo\t58198\n平均年龄\t58199\n吴家湾\t58200\n成都高新西区\t58201\n濠梁\t58202\n32G\t58203\n何帆\t58204\n茅忠群\t58205\nBEX\t58206\n华兰生物\t58207\n多哥\t58208\n柳莺\t58209\n鹏金所\t58210\n脊灰灭活疫苗\t58211\nfjdingsd\t58212\n狼乾劫\t58213\n2500万\t58214\n张敬华\t58215\n餐具\t58216\n通知金\t58217\nDMS\t58218\n百度文件\t58219\n正版化\t58220\n弧形板\t58221\n关联规则算法\t58222\n肝硬化\t58223\n1750元\t58224\n超赛\t58225\n绿袍\t58226\n李志勇\t58227\n马丁\t58228\n贺州新闻网\t58229\n新昌信息港\t58230\n点挂\t58231\n8716xxxx\t58232\n洛阳石化\t58233\n65期\t58234\nh3c模拟器\t58235\n黄皮\t58236\n电子科学与技术专业\t58237\n南昌电视台\t58238\n电话号\t58239\n狄安娜\t58240\n25名\t58241\n1400元\t58242\n六旬\t58243\n交管12123预约考试\t58244\n色图网\t58245\n灵秀\t58246\n双桨\t58247\n中国电力招标网\t58248\n浙江春风动力股份有限公司\t58249\n小肚\t58250\n总角\t58251\n2041\t58252\n镜检\t58253\n火车西站\t58254\n良家\t58255\n魔法少女育成计划\t58256\n恒阳\t58257\n除污\t58258\nv6.1\t58259\nnose\t58260\n活石\t58261\nredis服务器\t58262\n大日\t58263\nDelaunay\t58264\n箫\t58265\n符文工房吧\t58266\n鹬\t58267\nMarshal\t58268\n准入\t58269\n详单\t58270\n三国英雄传\t58271\nHACK80\t58272\nwaist\t58273\n神龙版\t58274\n新疆电视台\t58275\n天津市民政局\t58276\n程武\t58277\n预防性\t58278\nG31\t58279\n行价\t58280\n油毡瓦\t58281\n人力资源和社会保障网\t58282\n上海滩\t58283\n西门子s7\t58284\n入门级\t58285\n七年级语文\t58286\n飞天\t58287\n拉丁舞曲\t58288\n火蓝刀锋\t58289\n控制箱\t58290\n1.5万亿\t58291\n甘胆酸\t58292\n25.2\t58293\n邵逸艺伎回忆录\t58294\n微环境\t58295\n特报\t58296\n2.5_\t58297\nEhCache\t58298\n侗乡\t58299\n作文素材\t58300\n冰锐\t58301\n肺纤维化\t58302\nKK源码网\t58303\n微扑克\t58304\n天原集团\t58305\n张瀚\t58306\n1026\t58307\n北京西门子\t58308\nDevStore\t58309\n修复液\t58310\nPresario\t58311\n北投\t58312\n青岛小学\t58313\nHD7750\t58314\n厦门大学外文学院\t58315\n客栈\t58316\n内蒙古人民出版社\t58317\n贞娃儿\t58318\nurdf\t58319\n陆丰\t58320\n卵母细胞\t58321\nVEX\t58322\nsummer\t58323\n升水\t58324\n崆峒\t58325\n1MW\t58326\n虚度年华\t58327\n座右铭\t58328\ndependencies\t58329\nthathe\t58330\n三级跳\t58331\n水战\t58332\n不可点\t58333\n接待室\t58334\n襲\t58335\n至爱\t58336\nCSniper\t58337\nHealth\t58338\n防备\t58339\nMixer\t58340\n本轮\t58341\n奇妙
\t58342\n地火\t58343\n窥视者\t58344\nReborn\t58345\n耗损\t58346\n东光县\t58347\n无冬OL\t58348\n瓦里玛萨斯\t58349\n葡萄糖醛酸\t58350\n全面两孩\t58351\n美团网\t58352\nvbs\t58353\n油茶树\t58354\n九龙花园\t58355\n嘎子\t58356\n藏器\t58357\n80070035\t58358\n土木工程毕业设计\t58359\n脱衣舞娘\t58360\n鹅口疮\t58361\nBattle\t58362\n湖北省文化厅\t58363\n奔驰GLC级论坛\t58364\n连云港市区\t58365\n天朝上国\t58366\n安东尼戴维斯\t58367\n赵宥乔\t58368\n江苏中南建设集团股份有限公司\t58369\n橘子酱\t58370\nFooter\t58371\n进错\t58372\n新课标版\t58373\nSession\t58374\nRCR\t58375\n卵巢肿瘤\t58376\n中国大学生网\t58377\n分行\t58378\n功图\t58379\n生物化学与分子生物学\t58380\n讲解员\t58381\n家风\t58382\n苏州农业职业技术学院\t58383\n黄石\t58384\nRepetier\t58385\n并生\t58386\n釉质\t58387\n风雨桥\t58388\n布鲁斯韦恩\t58389\n绿体\t58390\n黄眼\t58391\n1234567\t58392\n黄镇\t58393\n富贵\t58394\n第四人民医院\t58395\nNHS\t58396\n刷酸\t58397\n友缘\t58398\n道情\t58399\n油点\t58400\n五盒\t58401\n永不褪色\t58402\n刘銮雄\t58403\nRigid\t58404\n白沙海\t58405\n下卧层\t58406\n毛晓峰\t58407\n一响\t58408\n官子\t58409\n补办\t58410\n全民斗地主\t58411\n北京昌平\t58412\n蜀风雅韵\t58413\n热血街区\t58414\n阿穆尔\t58415\n北太平洋\t58416\n硝酸盐\t58417\n梅毒疹\t58418\n吴智辉\t58419\n石毓智\t58420\n吉尔\t58421\n传染期\t58422\n埋头\t58423\n几百元\t58424\n烫画\t58425\n齐氏\t58426\n芽苗菜\t58427\n安武林\t58428\n青岛国际会展中心\t58429\n两秒钟\t58430\n朦朦胧胧\t58431\n车评\t58432\n殖\t58433\n克罗恩病\t58434\n実\t58435\n丽台\t58436\n换世\t58437\nStrawberry\t58438\n鼓楼西大街\t58439\n应季\t58440\n程磊\t58441\n小新娘\t58442\n算术平均数\t58443\n32页\t58444\n暴光\t58445\n体会会\t58446\n数字人\t58447\nEVER\t58448\n抗菌性\t58449\nWamp\t58450\n厦门北\t58451\ncreativity\t58452\n浙江省残疾人联合会\t58453\n严峻性\t58454\n十村\t58455\n八门遁甲\t58456\nkungfu\t58457\n宗派\t58458\n登路\t58459\n睾酮素\t58460\n上海租车公司\t58461\n奶女\t58462\nadviser\t58463\n小米3\t58464\n存装\t58465\nbudapest\t58466\n某市\t58467\nV3.0\t58468\n更木剑八\t58469\n10位\t58470\n绝境\t58471\n后篇\t58472\n狼人杀\t58473\n移门\t58474\n第15名\t58475\n床图\t58476\nfex\t58477\n教务管理人员\t58478\n糟糠\t58479\nCardano\t58480\n磅礴\t58481\n风行天下\t58482\nXiamen\t58483\n60厘米\t58484\n电磁泵\t58485\nLadyboy\t58486\n吴琪\t58487\nabbey\t58488\n维普数据库\t58489\n税盘\t58490\n哈士奇\t58491\n一圈一圈\t58492\n陈山聪\t58493\n杀戮间2\t58494\nthunder\t58495\n尼康D500\t58496\npowerpcb\t58497
\n刻有\t58498\nnove3e\t58499\n畸变率\t58500\n拉希德华莱士\t58501\n静态\t58502\n民营经济发展\t58503\n2017年12月03日\t58504\n水岸花园\t58505\nkeytab\t58506\n手把件\t58507\n可逆\t58508\naxio\t58509\n塔格奥\t58510\n姚高员\t58511\n谈到\t58512\n恩娇\t58513\n酱肉\t58514\n马走日\t58515\nERM\t58516\n魔导巧壳\t58517\n惩处\t58518\npain\t58519\n中国华电集团有限公司\t58520\n何浩\t58521\n回到从前\t58522\n博士\t58523\nRUST\t58524\no型圈\t58525\n李美凤\t58526\n恩怨情仇\t58527\n齿数\t58528\n北京移动动感地带卡\t58529\n4.4.0\t58530\nBU\t58531\n东风标致3008\t58532\n柳丝\t58533\n桑茶\t58534\n北通\t58535\n迪托\t58536\n153号\t58537\n恶魔城月下\t58538\n精品化\t58539\n乐视影业\t58540\n木子喵喵\t58541\n一夜之间\t58542\n江川路街道\t58543\ng1620\t58544\ndlc\t58545\n三股\t58546\n融金所\t58547\n4400\t58548\n什么地方\t58549\nMarin\t58550\nleaf\t58551\nae\t58552\n卷积神经网络\t58553\n12.02\t58554\n乱斗堂2\t58555\nkamailio\t58556\n雨打\t58557\n斯蒂尔\t58558\nSTM8S\t58559\ns2417dg\t58560\nyahei\t58561\n杜平\t58562\n近端\t58563\n宾得k50\t58564\n五年级语文下册期中考试卷\t58565\n750D\t58566\n逯恣祯\t58567\n安阳工学院\t58568\n搅拌桶\t58569\n翅片管\t58570\nMage\t58571\n小岗村\t58572\ngetview\t58573\n燃烧吧\t58574\n网银盾\t58575\n织品\t58576\n出轨的女人\t58577\n理疗\t58578\n20170817\t58579\n博世科\t58580\nimmense\t58581\n孙恋\t58582\n芬恩\t58583\n广东狮子会\t58584\n刘国辉\t58585\n压差计\t58586\n新宝股份\t58587\n3天\t58588\n置换反应\t58589\n微耽\t58590\n梼杌\t58591\nellie\t58592\n高坪\t58593\n荔浦县\t58594\n监督管理委员会\t58595\n过期\t58596\n动态平衡阀\t58597\n庞然大物\t58598\nhtmlcss\t58599\n12日\t58600\nAgoda\t58601\n毒战\t58602\nCareySon\t58603\nunhandled\t58604\n厦门宾馆\t58605\n第三宫\t58606\n演员\t58607\n巴黎大学\t58608\n适宜\t58609\n北京同仁堂药店\t58610\n敬香\t58611\nVOF\t58612\n蒸菜\t58613\n自知道\t58614\n变造\t58615\n恒大集团\t58616\n回家之开心速递粤语版\t58617\n榜\t58618\n豪杰\t58619\n絮状物\t58620\n面对\t58621\noverlay\t58622\npp5\t58623\n广品\t58624\n欧姆龙血压计\t58625\n开枪\t58626\n一未\t58627\n15.0\t58628\n布隆迪\t58629\n新聞-PLAYNO.1玩樂達人\t58630\n升压器\t58631\n4g运存\t58632\n端柱\t58633\n飘落\t58634\n驻华大使馆\t58635\n大话溧水\t58636\n王志文\t58637\nIntensive\t58638\n基弗\t58639\n第5关\t58640\n50粒\t58641\n乳腺纤维腺瘤\t58642\n3150\t58643\n复杂化\t58644\n铁笔\t58645\n1票\t58646\norgan\t58647\n杨东\t58648\n1051\t58649\n红藤\t58650\n朗文\t58651\n上景\t5865
2\n顶岗实习报告\t58653\n废水池\t58654\n莱恩\t58655\n深圳市纪委\t58656\n周世平\t58657\n纸尿\t58658\n迅时\t58659\n李一男\t58660\n特急\t58661\n游戏王\t58662\n赵江\t58663\nBEAR\t58664\n300余\t58665\n淮北一中\t58666\n出去\t58667\nballon\t58668\n王宝泉\t58669\nfurry\t58670\n圣济总录\t58671\nMAGIC\t58672\n十八代\t58673\n结贴\t58674\nrviz\t58675\n復\t58676\n新浪页游助手\t58677\n480G\t58678\n路叔\t58679\n2407\t58680\n王集乡\t58681\n驱车\t58682\nevc\t58683\n取料\t58684\n高丽虹\t58685\n试采\t58686\n_际通宝手机网\t58687\n末茶\t58688\nott\t58689\n盛况空前\t58690\n起诉状\t58691\nlalaland\t58692\n乳状液\t58693\n红彩\t58694\n黔东\t58695\ndkms\t58696\n爬高\t58697\n朋友聚会\t58698\n行政公益诉讼\t58699\n盘锦市\t58700\n牙垫\t58701\n全国卷\t58702\n青媚狐\t58703\nERP管理软件\t58704\n寐语者\t58705\n色片\t58706\n金元\t58707\n爱情鸟\t58708\n前端网\t58709\n黄桥镇\t58710\nrcl\t58711\n晒货\t58712\n纤维灶\t58713\n2000.15.4\t58714\n这等\t58715\n听众\t58716\n怪侠一枝梅\t58717\n合并案\t58718\n再借\t58719\n吃饭吧\t58720\nMFStar\t58721\n中国人民大学出版社\t58722\n杨毅\t58723\n踩场\t58724\n金纺\t58725\n刮目相待\t58726\n第一档\t58727\nOmen\t58728\n孙其峰\t58729\n调色机\t58730\n欧联\t58731\n俺去\t58732\n颐养\t58733\n张文杰\t58734\n情景\t58735\n奥胖\t58736\nATL\t58737\nB350M-A\t58738\nDONG\t58739\n光害\t58740\n3号位\t58741\n乔泽城\t58742\n河西\t58743\n扬州大学广陵学院\t58744\n藏身\t58745\n献声\t58746\n抓机\t58747\n新苏\t58748\ntempered\t58749\n佛山火车站\t58750\n设计心理学\t58751\n世强元件\t58752\n刷票\t58753\n磷化膜\t58754\n凉拌海带丝\t58755\n八角金盘\t58756\n联合时报\t58757\nProscenic\t58758\n曲咪新乳膏\t58759\n千分符\t58760\n珩\t58761\n大连天健网\t58762\n北京奥林匹克森林公园\t58763\n歼11\t58764\n移开\t58765\n天净沙秋\t58766\napplause\t58767\n574\t58768\nwww.jf258.com\t58769\n固纬\t58770\n場合\t58771\n清华长庚医院\t58772\nTimber\t58773\n逼装\t58774\n日志库\t58775\n通苏嘉铁路\t58776\n艾玛·马克思\t58777\n金胶州\t58778\n古城路\t58779\n白月\t58780\n4房\t58781\nternary\t58782\n5万吨级\t58783\n教學\t58784\n探明\t58785\n张丽芳\t58786\n一号案\t58787\ninch\t58788\n小云\t58789\n砚山\t58790\nv4.1.4\t58791\n甲A\t58792\n五牛\t58793\n不祥\t58794\n驴妈妈\t58795\n12型\t58796\n昆明经济技术开发区\t58797\n菌菇汤\t58798\n佛罗伦萨美术学院\t58799\n河桥\t58800\n康铜丝\t58801\n魔法球\t58802\n蒋洁敏\t58803\n维罗纳\t58804\n时滞\t58805\n温州市第十九中学\t58806\n艺苑\t58807\n中华尊驰\t58808\n海格通信\t58809\n龙身\t58810\ndang
ao\t58811\n陈与义\t58812\n郑渊白居易\t58813\n战斗员\t58814\n绿茶粉\t58815\n小白文件管理器\t58816\n山上\t58817\n马莎\t58818\n会引\t58819\n五格数理_\t58820\n血栓性外痔\t58821\n挡水\t58822\n超级机器人大战x\t58823\n凶险性\t58824\nInicio\t58825\nsimulink\t58826\nfluctuation\t58827\nimmigrant\t58828\nmser\t58829\n中邮小包\t58830\n时子\t58831\n洛克希德马丁公司\t58832\n昆区\t58833\nXiu\t58834\n劳动局\t58835\n黄梅一中\t58836\n流亡编年史\t58837\n博人传\t58838\n唐山高新区\t58839\n光心\t58840\n八七\t58841\nHCN\t58842\n不可动摇\t58843\n望城经开区\t58844\n170亿元\t58845\nworks2\t58846\n太平洋战争\t58847\n集赞\t58848\nx9plus\t58849\n房舍\t58850\nExpandable\t58851\n炸油饼\t58852\n硕放\t58853\n五女山\t58854\n大连北\t58855\n东奥会计在线\t58856\n买文具\t58857\n腹胀\t58858\n酷狗\t58859\nc3300\t58860\n洛氏硬度\t58861\n打子\t58862\n精女\t58863\n劳力士\t58864\nbrass\t58865\n蒋毅\t58866\n溪苑\t58867\n税务所\t58868\nv1.5.3\t58869\n合肥八中\t58870\n二之国2亡灵之国\t58871\n刘苏\t58872\n凡夫俗子\t58873\n崔大笨\t58874\n那点事儿\t58875\n跛\t58876\nMSRA\t58877\n三肯\t58878\n曲别针\t58879\n来潮\t58880\n耍流氓\t58881\n9686\t58882\n人寿保险\t58883\nCPUCPU\t58884\n蜂窝状\t58885\nAnyWay\t58886\n均质机\t58887\nintelligence\t58888\n不见不散\t58889\n营销岗\t58890\n南都周刊\t58891\n民族主义\t58892\n富勒姆\t58893\nsafari浏览器\t58894\n转接\t58895\nSigning\t58896\n小帆\t58897\nNeeds\t58898\n自作主张\t58899\n1000w\t58900\nfret\t58901\n张岳\t58902\n筹备工作报告\t58903\n死亡游戏\t58904\n神源岛\t58905\n_游侠网\t58906\nMinGW-w64\t58907\n什么样\t58908\n马克思主义哲学史\t58909\n阿克陶县\t58910\n动力源\t58911\n歌美飒\t58912\nTank\t58913\nVBR\t58914\n視訊\t58915\n龙珠电光火石3\t58916\nGeneva\t58917\n丐哥\t58918\n50.00\t58919\n绿景\t58920\n郑中基\t58921\n萝\t58922\n钟勉\t58923\n盛况\t58924\n买方市场\t58925\n北京高院\t58926\n知己\t58927\nAlertDialog\t58928\nmoye\t58929\nmili\t58930\n渤海银行\t58931\n参精\t58932\n南头镇\t58933\n法派\t58934\n杭州市规划局\t58935\nmax2010\t58936\n广灵四路\t58937\n闫伟\t58938\nidiom\t58939\n乐观主义者\t58940\n母公\t58941\nlanku\t58942\n魔流剑\t58943\n电监会\t58944\n零点绳艺网\t58945\n暗黑2圣骑士\t58946\nbudgeting\t58947\n把套\t58948\nifconfig\t58949\n电子信息类\t58950\n邱德光\t58951\n家客\t58952\n肾囊\t58953\nSIRO\t58954\n莫匹罗星软膏\t58955\nariba\t58956\n雪顶\t58957\n劳动仲裁申请书\t58958\nF1F2\t58959\n娱乐群\t58960\n威海机场\t58961\n固定资产减值准备\t58962\n莎士比亚
全集\t58963\n亦庄同仁医院\t58964\n秦吏\t58965\n血树\t58966\nEnhancements\t58967\nTAC\t58968\nea888\t58969\n皮下注射\t58970\n聂子皓\t58971\n金证股份\t58972\n小度在家\t58973\ncpn\t58974\n潘明\t58975\n搬文\t58976\n烧猪\t58977\n陈良宇\t58978\nthereum\t58979\n李梦瑶\t58980\n什麽样\t58981\n掐断\t58982\n秦楚古道\t58983\n许凯皓\t58984\n紫薇\t58985\n亚士\t58986\n末节\t58987\n加卫苗\t58988\n不饱和溶液\t58989\n秦鹏\t58990\nmarionette\t58991\n福建省地方税务局\t58992\nvpl\t58993\n蹦蹦兔\t58994\n阿尔巴\t58995\n国械注进\t58996\nRTZ\t58997\n水桥\t58998\n电镀液\t58999\nmorphvox\t59000\n71zs.com\t59001\nh2s\t59002\n三思而后行\t59003\n诺亚财富投资理财\t59004\n内宣\t59005\n巴适\t59006\n濒临\t59007\n铝镁锰板\t59008\n护府\t59009\njavascipt\t59010\n吻妻\t59011\nOrigin8\t59012\n桓\t59013\n徕卡M10\t59014\n张怡诺\t59015\n印能捷\t59016\n文财神\t59017\n广发\t59018\n农民专业合作社\t59019\n慈善网\t59020\n等闲\t59021\nhk50\t59022\n汊河镇\t59023\n修复型\t59024\n大安区\t59025\n戒律\t59026\n通用篇\t59027\n责任事故\t59028\n现金侠\t59029\nSSD\t59030\n18888\t59031\n温州职业技术学院\t59032\n呀呀\t59033\n敖红亮\t59034\n葱茏\t59035\n灵山岛\t59036\nling\t59037\n近视镜\t59038\nstatue\t59039\ncommodities\t59040\n前进镇\t59041\n【图\t59042\n北京市朝阳区人民法院\t59043\n强警\t59044\n沁\t59045\n高高兴兴\t59046\n四中全会\t59047\n区委政法委\t59048\n護\t59049\n5.0安卓\t59050\nMEXICO\t59051\n唐式\t59052\n传奇世界h5\t59053\n威海广泰\t59054\n第三世\t59055\n阿拉尔\t59056\n颗粒化\t59057\n4000\t59058\n董勇\t59059\n范围\t59060\n神途\t59061\nreside\t59062\n大列\t59063\n刺柏\t59064\n江苏农牧科技职业学院\t59065\n富贵在天\t59066\nTOTAL\t59067\n第五位\t59068\n英杰\t59069\n久谦\t59070\n大东路\t59071\nHashtable\t59072\n油水\t59073\nWinsock\t59074\n过梁\t59075\n跬步者\t59076\nCaptcha\t59077\nWed114结婚网\t59078\n李姝\t59079\n2016年7月7日\t59080\n中环路\t59081\n18处\t59082\n半导\t59083\n糖鸡\t59084\nosm\t59085\n宝钢股份\t59086\nV2017\t59087\n气炉\t59088\n管乐队\t59089\nnotebook\t59090\n注册类\t59091\n润新\t59092\n肇庆站\t59093\nmavne\t59094\n繁峙县\t59095\n韩敏英\t59096\nannoy\t59097\n洗眼\t59098\n兵峰\t59099\n念旧\t59100\n并\t59101\n六渡桥\t59102\n8750\t59103\n二箱\t59104\n白鸦\t59105\n处决者\t59106\n可穿戴设备\t59107\n接踵而至\t59108\n13k\t59109\nWanda\t59110\n奥斯维辛集中营\t59111\n鲁通卡\t59112\n9.3.2\t59113\n姜糖\t59114\n金头\t59115\n180天\t59116\n大众公用\t59117\n徐老师\t59118\n西北大学
经济管理学院\t59119\n罗伯逊\t59120\n1.4亿\t59121\n妻奴\t59122\nyeomen\t59123\n放送\t59124\n随身WiFi\t59125\n墨水瓶\t59126\n广安市环境保护局\t59127\n1110\t59128\ns形\t59129\n海坛古城\t59130\n23度\t59131\n诸葛瞻\t59132\n军衔\t59133\nResolution\t59134\n函谷\t59135\n埃里克森\t59136\n寝\t59137\n甲基四胺\t59138\n远投竿\t59139\n泉子\t59140\nShortest\t59141\nlaradock\t59142\n杨桥路\t59143\n埃尔南德斯\t59144\n杨璐\t59145\nintouch\t59146\ngolf\t59147\n法罗力\t59148\n英泰\t59149\n乔公子\t59150\n弄湿\t59151\n中信证券股份有限公司\t59152\n幸福街道\t59153\n君子爱财\t59154\n10.13.2\t59155\nHungary\t59156\nSpare\t59157\n郑云\t59158\n框剪\t59159\nmiddle\t59160\n2^n\t59161\n可可里\t59162\n炒面\t59163\n丝博会\t59164\n洛杉\t59165\n11月2日\t59166\n茶瓶\t59167\n竹内凉真\t59168\n97页\t59169\nJaxu\t59170\n优等品\t59171\nploom\t59172\n张小泉\t59173\n刘秉义\t59174\n眯着眼\t59175\nORA-12541\t59176\n显影机\t59177\n会昌县\t59178\n春江街道\t59179\n公募基金管理公司\t59180\n第18048期\t59181\n新侠\t59182\n300296\t59183\np波\t59184\n高剑\t59185\n双流机场\t59186\n牛图\t59187\n诗酒趁年华\t59188\n胡_\t59189\nnozomi\t59190\n电铸\t59191\n方雨\t59192\n核潜艇\t59193\nChain\t59194\n温泉街道\t59195\n半丝\t59196\n肖龙\t59197\ndefunct\t59198\n成交\t59199\nHitomi\t59200\n长江尾\t59201\n驻军\t59202\n乍暖\t59203\n操作化\t59204\n山西大学城\t59205\n乐金\t59206\n江西省南昌市中级人民法院\t59207\n杭州整形医院\t59208\n阿里巴巴一达通\t59209\n西崎崇子\t59210\n铁汉\t59211\n免費即時股票報價-窩輪\t59212\nTheorem\t59213\n庄西\t59214\nMHK\t59215\n报德\t59216\nMK14\t59217\n踏破铁鞋\t59218\nlogarithmic\t59219\n报规\t59220\n人防门\t59221\n抓鬼\t59222\n移动全球通卡手机号段\t59223\nscrum\t59224\n俞氏\t59225\n少年神探狄仁杰\t59226\n重庆市九龙坡区人民政府\t59227\n联发文库\t59228\nontime\t59229\n护林员\t59230\n2235\t59231\n一云\t59232\nhnds\t59233\n空铺\t59234\n司马红丽\t59235\n凯里酸汤鱼\t59236\nD2D\t59237\n碳点\t59238\nBioscience\t59239\n上世纪\t59240\n梁平\t59241\n游击队员\t59242\n网申\t59243\n60HZ\t59244\nhtpasswd\t59245\n马太福音\t59246\n盛京关捷\t59247\n陈维桓\t59248\nstreng\t59249\nDownload\t59250\n龙之信条黑暗觉者\t59251\n91.4\t59252\n云龙镇\t59253\n信天翁\t59254\n食品配料\t59255\n碳酸盐岩\t59256\n心集\t59257\n馆主\t59258\nUSMLE\t59259\n管理系\t59260\n塔架\t59261\nNeru\t59262\ncpu处理器\t59263\n171\t59264\n进口车\t59265\n手游式\t59266\n丙烯酸酯橡胶\t59267\n新浪海南_新浪网\t59268\n济南红盾信息网\t59269\nAuto
mated\t59270\n1.76复古\t59271\nHanging\t59272\ntig\t59273\n宾主\t59274\n相配\t59275\n阳宗海\t59276\n蕲春县\t59277\nued\t59278\n徐彬\t59279\n陈正\t59280\nlowa\t59281\n100万倍\t59282\n专业技术\t59283\n做好事\t59284\n算非\t59285\n迁移率\t59286\nMitaka\t59287\n黄金分割线\t59288\n繁峙\t59289\n法狮龙\t59290\n67家\t59291\n灌注桩\t59292\n耐酸泵\t59293\n挑打\t59294\n财务经理人网\t59295\n2012年4月\t59296\n迪迦奥特曼剧场版\t59297\n鸡率\t59298\nHSQLDB\t59299\n方丈\t59300\n201802\t59301\nSalary\t59302\n北京市中伦律师事务所\t59303\n疯疯癫癫\t59304\nmaven源\t59305\nx61\t59306\n离伤\t59307\n紫罗兰花\t59308\nxz2\t59309\n原子核\t59310\n二代们\t59311\n标致RCZ\t59312\n江西省纪委监察厅\t59313\nrepository\t59314\nqx60\t59315\n微露\t59316\n投资回收期\t59317\n箕斗\t59318\n申姓\t59319\n辅助性\t59320\n贺岁版\t59321\n山东新闻_大众网\t59322\ntaobao\t59323\n10亿吨\t59324\nprogressive\t59325\n盛世链\t59326\n聚氨酯预聚体\t59327\n没出息\t59328\n挡风玻璃\t59329\n风骚\t59330\n石板岩\t59331\n富昌\t59332\nFAA\t59333\n夏集\t59334\n黄龙岘\t59335\n20170927\t59336\n洗号\t59337\n2018年01月15日\t59338\n结艺\t59339\n46条\t59340\n张澜澜\t59341\n毒妖\t59342\nUbisoft\t59343\n姑子\t59344\n新维加斯\t59345\n两折\t59346\n巴啦啦小魔仙之奇迹舞步\t59347\n灭门惨案之孽\t59348\n慌\t59349\n兒子\t59350\n到处\t59351\n资助\t59352\n第191章\t59353\n北京旅游网\t59354\n外旋\t59355\n碎片率\t59356\n1.0%\t59357\n拉文克劳\t59358\nsra\t59359\n王向明\t59360\n北路梆子\t59361\n烟波浩渺\t59362\nFX\t59363\n罗仲谦\t59364\n美联储\t59365\nlegitimate\t59366\n奥斯卡影帝\t59367\nATA\t59368\n浅度\t59369\n图形用户界面\t59370\n乐斯菲斯\t59371\n空孕\t59372\nYHBOYS\t59373\n浙江新和成股份有限公司\t59374\n怡情\t59375\n广义积分\t59376\n福建信息职业技术学院\t59377\n正见\t59378\n近视镜片\t59379\n野比大雄的生化危机\t59380\n丁二狗的猎艳人生\t59381\n三三易通\t59382\n斯威夫特\t59383\n陈文茜\t59384\n誓不为妻:全球豪娶少夫人\t59385\n383号\t59386\n追踪报道\t59387\n10.12.2\t59388\nhst\t59389\n陆寻\t59390\n3D全息投影\t59391\n喜堂\t59392\n浅夏\t59393\n424\t59394\n炽焰帝国2\t59395\n750kV\t59396\n几栋\t59397\n廖一梅\t59398\nadobe\t59399\n中央音乐学院\t59400\nCoolpad\t59401\n快盗\t59402\n[猎人\t59403\n许善达\t59404\n中国信达资产管理股份有限公司\t59405\nBomba\t59406\n甾体\t59407\n过不过\t59408\nSIUF\t59409\n空挂户\t59410\n弑君贼\t59411\n含金\t59412\n天吉网\t59413\n妖月\t59414\ncocosbuilder\t59415\n祝甸\t59416\n唐国\t59417\n彩裙\t59418\n然然\t59419\n拿下\t59420\n环行\t594
21\n行式\t59422\n驱动器\t59423\n岑丽香\t59424\n樊迟\t59425\n扩束镜\t59426\n邹家华\t59427\n布鲁姆\t59428\nWOT\t59429\n付诸\t59430\n三盏灯\t59431\ndir函数\t59432\n巴彦\t59433\n强加\t59434\n苏省\t59435\n述职述廉述学\t59436\n20160716\t59437\n德基广场\t59438\n张量\t59439\n五杀\t59440\nwow7.0\t59441\n汉釜宫\t59442\n三星s8+\t59443\ngta5ol\t59444\n18码\t59445\n附庸\t59446\n剑奴\t59447\nKOYO\t59448\n索\t59449\n说得好\t59450\n镀锌钢绞线\t59451\n陇东学院\t59452\n张升\t59453\n鬼子来了\t59454\n硅晶圆\t59455\n裸心社\t59456\n红枸杞\t59457\nmobiscroll\t59458\nPPT-CSDN\t59459\n新能源车辆购置税\t59460\n健民集团\t59461\nSHIPPER\t59462\n19001\t59463\n北京首钢\t59464\n小冬\t59465\n小周\t59466\n利培酮\t59467\n0612\t59468\n王乐\t59469\n高中生\t59470\n阳春市\t59471\n真蛇\t59472\nHD4000\t59473\nxiace\t59474\n朱光耀\t59475\n机\t59476\n⊥\t59477\n例词\t59478\n削球\t59479\n管理学概论\t59480\n4800\t59481\n信不信\t59482\n有荤\t59483\n洛铂\t59484\n暗房\t59485\n班玛县\t59486\n8215\t59487\n存房\t59488\n6月10日\t59489\n百富\t59490\n李起光\t59491\n二郎\t59492\n小馆\t59493\n米欧\t59494\n上海市哲学社会科学规划办公室\t59495\n亲爱的爸爸\t59496\nt1\t59497\n优质课\t59498\n寿命\t59499\n2副\t59500\n担保费\t59501\n1800亿元\t59502\n崂山区\t59503\n嘀嘀嘀\t59504\n罗寒蕾\t59505\n开心俱乐部\t59506\n将息\t59507\n大底\t59508\n新疆日报网\t59509\n11节\t59510\n音机\t59511\n永志\t59512\n千里江山图\t59513\n银税\t59514\n临泽公务网\t59515\n简单爱\t59516\n黑轩辕\t59517\n惠州网\t59518\n信贷部\t59519\n千分号\t59520\n武夷路\t59521\n办公室\t59522\nLadder\t59523\n隐修\t59524\ngogobar\t59525\n腿短\t59526\nyf\t59527\n心胜\t59528\nBac\t59529\n十秒钟\t59530\n中央军事委员会\t59531\n特拉福买家俱乐部\t59532\n地灵曲\t59533\n来访者\t59534\n石磊\t59535\n遮胸\t59536\nrefusing\t59537\n独门独院\t59538\n希赛网\t59539\n720p高清\t59540\n33款\t59541\n苗阜王声\t59542\n兴市\t59543\n清朝\t59544\n成都川\t59545\n第三十九次\t59546\nCut\t59547\n广发证券股份有限公司\t59548\n13公里\t59549\n七星岗\t59550\n20161101\t59551\n江西应用技术职业学院\t59552\n我的幸福我做主\t59553\n华北水利水电大学\t59554\n中华商标超市网\t59555\n站长\t59556\n筑龙\t59557\n主题公寓\t59558\n诸暨市\t59559\nescaping\t59560\n刘哥\t59561\n给力小说网\t59562\n头顿\t59563\nahaii\t59564\n建出\t59565\n红边\t59566\n法篇\t59567\n同济版\t59568\n痘肌\t59569\n登封\t59570\n词性\t59571\nnitori\t59572\n轻松掌柜\t59573\n台州路桥\t59574\n窖口客运站\t59575\n王波\t59576\n高潮流\t59577\nHEAT\t59578\nm4a格式\
t59579\nAvailability\t59580\n尿素泵\t59581\n手术术\t59582\n途安l\t59583\nprimary\t59584\nlamda\t59585\n激光粒度分析仪\t59586\n张小艺\t59587\n第4卷\t59588\n水具\t59589\n非因工负伤医疗期规定\t59590\n逃离塔科夫\t59591\n空分\t59592\ndefconfig\t59593\n热轧卷\t59594\n王科\t59595\n鲁仕林\t59596\n黑犬\t59597\n6W101\t59598\n两段式\t59599\nqtreewidget\t59600\n2018期\t59601\n达斡尔\t59602\n徐华\t59603\n秦唐\t59604\n国王湖\t59605\nWOKO\t59606\n22分\t59607\n3945\t59608\n天猫魔盘\t59609\n怪笼\t59610\n娱乐圈\t59611\n侠侣网\t59612\n踢球者\t59613\nphysica\t59614\n15关\t59615\n沉淀剂\t59616\n君歌\t59617\n河南人大\t59618\n陈山\t59619\n莫尔根\t59620\n伦比\t59621\ndaisycolour\t59622\n安徽省国家税务局\t59623\n2磅\t59624\n自留地\t59625\n中共四川省委农村工作委员会\t59626\n月氏\t59627\n高乐\t59628\npays\t59629\n冰镇\t59630\nMd\t59631\n深圳市科创委\t59632\n尸变\t59633\n丁洁\t59634\n示范卷\t59635\n毛毛雨\t59636\n古扎拉蒂\t59637\nAccess2003\t59638\n自行车座\t59639\nimx6q\t59640\n李彩桦\t59641\n边压\t59642\n无赦\t59643\n蓝绿厂\t59644\n充满\t59645\n电解饱和食盐水\t59646\n量子链\t59647\n李宏\t59648\nPollution\t59649\nwifix\t59650\n10.12.6\t59651\n编程珠玑\t59652\n品效\t59653\n雪鸢\t59654\n塔柏\t59655\n1340\t59656\nTele\t59657\n罗素克劳\t59658\n成衣\t59659\n手画\t59660\n间苯二胺\t59661\norientdb\t59662\n王兆安\t59663\nク\t59664\n左晖\t59665\n初跑者\t59666\n胆管炎\t59667\nIntroducing\t59668\n呼唤爱\t59669\n湖南省发改委\t59670\npvt\t59671\n高云翔\t59672\n唔系\t59673\n墨菲斯托\t59674\nvirt\t59675\n契\t59676\n数据库索引\t59677\n18%\t59678\n王耳朵\t59679\n贾斯珀\t59680\n撞包\t59681\nBOSCH\t59682\nm30\t59683\nPiping\t59684\naptget\t59685\n文女\t59686\n遵命\t59687\n抢先版\t59688\n大阴蒂\t59689\n升沉\t59690\nlesbian\t59691\n切肤\t59692\n四方形\t59693\n淘宝店招\t59694\n龙湖郦城\t59695\n峰景\t59696\n杰威尔\t59697\n两学\t59698\n电汇凭证\t59699\n反恐战争\t59700\n江苏广电\t59701\n碎层\t59702\n叉树\t59703\n不法\t59704\n机件\t59705\nNEXUS\t59706\n俄区\t59707\nLNMP\t59708\n3680\t59709\n婵婵\t59710\n天寒\t59711\n安全包\t59712\n北京首都航空有限公司\t59713\n九州通医药集团股份有限公司\t59714\n一具\t59715\n快马\t59716\nlt18i\t59717\n篮下\t59718\nr语言\t59719\nLithium\t59720\nLeNet\t59721\n百悦\t59722\n书院\t59723\n万佳\t59724\nzqhxuyuan\t59725\ntech21\t59726\n五根\t59727\n兰迪\t59728\n12365\t59729\nShrimp\t59730\n三元整形网\t59731\naudigy\t59732\n计算器\t59733\n
aef\t59734\n孟定镇\t59735\n绵阳市人民医院\t59736\n预测性\t59737\nSHEET\t59738\n赔钱\t59739\n乐彩\t59740\n蓝男\t59741\n李嗣涔\t59742\n华坪县\t59743\n御医\t59744\n前院\t59745\n百度链接\t59746\n配音者\t59747\n志村\t59748\n军工\t59749\n阿拉维斯\t59750\nFactors\t59751\n沙井店\t59752\n急救员\t59753\n陈咏桐\t59754\n暴走装甲\t59755\n超电磁炮\t59756\n地奥司明片\t59757\n卫气\t59758\n传祺GA8\t59759\n裕木\t59760\n老旧\t59761\n何当\t59762\n80列\t59763\n什么板\t59764\n天津分行\t59765\n郑冬霞\t59766\ngpt\t59767\nvega\t59768\n箭羽\t59769\n玫莉蔻\t59770\n货梯\t59771\n把上\t59772\n箍筋\t59773\nf362\t59774\nMaybe\t59775\n束语\t59776\nweed\t59777\nシスタ\t59778\n预热器\t59779\n尸斑\t59780\n绍兴市人民医院\t59781\nv8.9\t59782\nVmware虚拟机\t59783\n民主主义\t59784\n空气质量指数\t59785\n阿特\t59786\nZhao\t59787\n投币\t59788\n打印件\t59789\n龙象\t59790\n自刑\t59791\n20140928\t59792\n维摩诘经\t59793\nweb3.0\t59794\n上上\t59795\n玄外附小\t59796\nrock\t59797\nFastest\t59798\n源型机\t59799\n10大H5\t59800\n运车\t59801\nStrive\t59802\n天津市水务局\t59803\nChron\t59804\nfinal变量\t59805\n归原主\t59806\nHuangshan\t59807\ne90\t59808\n范老师\t59809\nReLu\t59810\n军制\t59811\n深度学习算法\t59812\n湖大\t59813\n页脚\t59814\nBIOSTAR\t59815\n100码\t59816\n无锡大桥\t59817\n草稚京\t59818\nQQ空间留言板\t59819\n我的狐仙女友\t59820\n环迅支付\t59821\nMichele\t59822\n世界佛教\t59823\n20160822\t59824\n浦项制铁\t59825\ntff\t59826\n封神英雄\t59827\nv4.0.6\t59828\n红地\t59829\n手摇\t59830\n合项\t59831\n布雷克\t59832\n顾总\t59833\n升华\t59834\n凤凰出版社\t59835\n玲珑\t59836\n牛肚\t59837\n卡拉丁\t59838\njBox\t59839\n独尊儒术\t59840\n由浅入深\t59841\n艾伦\t59842\n很温暖\t59843\n高新技术企业认定管理办法\t59844\n趣链科技\t59845\n金龙羽\t59846\ntab页\t59847\n李晨玮\t59848\n江苏联合职业技术学院\t59849\n宁波石上光电科技有限公司\t59850\n复旦大学附属肿瘤医院\t59851\n宠金\t59852\n动力学\t59853\n中川砂仁\t59854\n双招\t59855\n京沈\t59856\n针贴\t59857\n大营镇\t59858\nbrowne\t59859\n网票网\t59860\nrecognized\t59861\n和硕县\t59862\n潜力\t59863\n牛熊证\t59864\n无锡移动\t59865\n雷蒙磨粉机\t59866\n流海\t59867\n私铁\t59868\n冰流\t59869\n人文学院\t59870\n答题\t59871\n侃房哥\t59872\n全国人大\t59873\n诗史\t59874\n清洁\t59875\n区府\t59876\nJumper\t59877\n赏金术士\t59878\n科幻类\t59879\n林正\t59880\n吹雨\t59881\n半波片\t59882\nv2.2.6\t59883\n北京市人民政府国有资产监督管理委员会\t59884\n易源接口总线-api接口中心\t59885\n云隐\t59886\nswap分区\t59887\n恒银
\t59888\n切丁机\t59889\n广中路\t59890\n郑大一附院\t59891\n柴崎幸\t59892\n供料\t59893\n余亮\t59894\n福岛\t59895\n上海中山医院\t59896\n2017年12月份\t59897\n演示\t59898\n一杆\t59899\n四座\t59900\n漱口\t59901\n_日报\t59902\n临沂高新区\t59903\n更好玩\t59904\n鹏\t59905\n第一話\t59906\n霸\t59907\n100毫秒\t59908\n三星tab\t59909\ntech\t59910\n执鞭\t59911\n95页\t59912\n型煤\t59913\n贝莱林\t59914\nyiyang\t59915\n1\\2\t59916\nbritax\t59917\n第582章\t59918\n温泉县\t59919\nCortex-M0\t59920\n万叶集\t59921\n杨少华\t59922\n舞舞\t59923\n速溶咖啡\t59924\n刘朝阳\t59925\n兴商\t59926\n1300X\t59927\n2000部\t59928\nIPhone6\t59929\nDoug\t59930\n余式\t59931\n生产器\t59932\njitsi\t59933\n北京地安门\t59934\n模拟舱\t59935\nassemblies\t59936\n金凤成祥\t59937\n山楂粉\t59938\n天猫商城\t59939\n记账\t59940\n夜叉\t59941\nBoLoli波萝社\t59942\nWomen\t59943\nadolescent\t59944\nGAL\t59945\n此诗\t59946\nPVD\t59947\n大丹犬\t59948\n相似比\t59949\n坐车网\t59950\n瓜田\t59951\n芯邦\t59952\nfights\t59953\n南巡\t59954\n怪物学院\t59955\n石康\t59956\n抛夫\t59957\n毒角\t59958\nsolidwroks\t59959\nopenthings\t59960\n康力\t59961\nCups\t59962\n一篇80\t59963\nvaluable\t59964\nNLB\t59965\n相能\t59966\n老龄化\t59967\n标的\t59968\n舒坦\t59969\n夏洛克福尔摩斯\t59970\n配比\t59971\n序\t59972\nxvg\t59973\n上海长途汽车站\t59974\n加油券\t59975\n51CTO学院\t59976\n管理条\t59977\n青青世界\t59978\n开运\t59979\n柏景湾\t59980\n干馏\t59981\n今年11月\t59982\n中机\t59983\n悠享\t59984\n罗技G610\t59985\n书坛\t59986\n单相电机\t59987\n屁滚尿流\t59988\nPCGS\t59989\nal20\t59990\n主抱团\t59991\n诉讼标的\t59992\n兑换券\t59993\n天火大道\t59994\n现今\t59995\n时空\t59996\n2014-01-25\t59997\n树篱\t59998\n蒋介\t59999\n老利\t60000\nDesperate\t60001\n电子技术有限公司\t60002\n肛门息肉\t60003\n育英中学\t60004\n14T\t60005\n项目商业计划书\t60006\n长城哈弗H6\t60007\n福猫\t60008\n巴哥\t60009\nCBA\t60010\n轿运车\t60011\nnbiot\t60012\n4面\t60013\n及第\t60014\n贪赃枉法\t60015\n李大\t60016\n南海新闻网\t60017\n一般来说\t60018\n璧山区\t60019\ne周\t60020\njijzzizz\t60021\n韩涵\t60022\n克里姆特\t60023\n深圳房产信息网\t60024\n拓展\t60025\n阿拉巴\t60026\n实修\t60027\n技术卷\t60028\n层子\t60029\n穆拉丁\t60030\n极限流\t60031\nJDRead\t60032\n数字博物馆\t60033\n通背拳\t60034\nqpushbutton\t60035\n电压力锅\t60036\n东营房产网\t60037\nEXILE\t60038\n帷\t60039\n影帝\t60040\n收尾\t60041\ngetField\t60042\n手舞\t60043\n深圳
市律师协会\t60044\n速盈\t60045\n修复性\t60046\n长安欧力威\t60047\n脚裤\t60048\nSKP店\t60049\n乌苏市\t60050\n众筹网\t60051\n栖霞寺\t60052\nhuihui\t60053\n和阳\t60054\n领章\t60055\nTTA\t60056\n盐官古镇\t60057\n穷兵黩武\t60058\nCanon\t60059\n宋子文\t60060\n防御塔\t60061\n背地里\t60062\n必然\t60063\ndatamatrix\t60064\n大上\t60065\n农行\t60066\n﹌\t60067\n20161003\t60068\n收发室\t60069\n真假公主\t60070\n止境\t60071\n田畴\t60072\n直柄\t60073\nHQCD\t60074\n空气能热泵热水器\t60075\n出海\t60076\n花园小区\t60077\nmultilevel\t60078\n邪师\t60079\nautomatically\t60080\n超类\t60081\n山亭区\t60082\n7.18\t60083\n河鲀\t60084\nShot\t60085\n014\t60086\n剑门\t60087\n承认错误\t60088\n情史\t60089\n常染色体\t60090\n淞沪抗战纪念馆\t60091\n一路歌唱\t60092\n斑马斑马\t60093\n舰队Collection\t60094\nKe$ha\t60095\n北京市民政局\t60096\n新大话西游2\t60097\n锗\t60098\n小书生\t60099\n土木人\t60100\n长风海洋世界\t60101\nNOAH\t60102\n振业城\t60103\n阿泰斯特\t60104\nh3399\t60105\n北斋\t60106\n蜜书\t60107\nPierre\t60108\n萃\t60109\n姚记扑克\t60110\n6.5.0\t60111\nstickers\t60112\nSCD\t60113\n董晓宇\t60114\n叶落知秋\t60115\n大肌\t60116\n冬雪\t60117\ndiscounted\t60118\n斥巨资\t60119\n牛驼\t60120\nFountain\t60121\nstability\t60122\n老窝\t60123\nBarryW\t60124\n自暴自弃\t60125\nonitsuka\t60126\n排污许可证\t60127\n第九辑\t60128\nJigsaw\t60129\n叶盛兰\t60130\n药香\t60131\n这个星期日\t60132\n崇州市\t60133\nforecast\t60134\n层析柱\t60135\n魔血\t60136\n野郎\t60137\n法律意识\t60138\n德鸿\t60139\n党组织\t60140\nnigeria\t60141\nTEAC\t60142\nprecent\t60143\n双降\t60144\n春波\t60145\n后海村\t60146\n蓝票\t60147\n划清\t60148\njil\t60149\n青春期性\t60150\n手绘屏\t60151\n核减\t60152\n邻国\t60153\n相贯线切割机\t60154\n飘花电影网\t60155\n每十分钟\t60156\n中国贸易网\t60157\n江西省商务厅\t60158\nSolidWorks2018\t60159\njdk1.8\t60160\n荐号\t60161\n柜族\t60162\n人身意外伤害保险\t60163\nF15\t60164\n照夜\t60165\n飞跃\t60166\n凤凰街\t60167\n蟒山国家森林公园\t60168\n佳兆业城市广场\t60169\n板式换热器\t60170\nSLG/\t60171\nttm\t60172\n新疆兵团第三师\t60173\n商务标\t60174\n强烈\t60175\n江城区\t60176\n57级\t60177\n羟丙基\t60178\n飞行堡垒\t60179\n加州大学洛杉矶分校\t60180\n预制品\t60181\nGSA\t60182\n布城\t60183\n志趣网\t60184\nider\t60185\n将夜\t60186\n过敏者\t60187\n範例\t60188\nJakarta\t60189\n好博\t60190\njdk10\t60191\n天上掉下个林妹妹\t60192\n前滩\t60193\n刀影\t60194\n大顶堆\t60195\n5#\t60196\n0.24
\t60197\n贾元春\t60198\n在押\t60199\n棱台\t60200\n撂挑子\t60201\n最后一根稻草\t60202\nQQ空间技术\t60203\nRBF\t60204\n平安大街\t60205\n甘泉外国语中学\t60206\n重复名\t60207\n封建制度\t60208\nmaximus\t60209\naras\t60210\n中影票务通\t60211\n峨眉半山\t60212\n_梁\t60213\n0.0.9\t60214\njizzz\t60215\nCockpit\t60216\n淫媒\t60217\n2017年11月18日\t60218\n非常嫌疑犯\t60219\n4.53\t60220\n众悦\t60221\n农民起义\t60222\n深度睡眠\t60223\n清算\t60224\ntolower\t60225\ncoils\t60226\n鹰身\t60227\n花滑女王\t60228\n搜图\t60229\n欧克哈特\t60230\nAV\t60231\n笔墨水\t60232\n口贴\t60233\n方雅\t60234\n快乐大巴\t60235\n分下\t60236\n杭州港\t60237\n中国船舶\t60238\n第15次\t60239\n从今夜\t60240\nquagga\t60241\n徐子彦\t60242\n第105号\t60243\n塔吊人才网\t60244\n蝴蝶园\t60245\n辽宁人事考试网\t60246\n75w\t60247\n卵蛋\t60248\n汪国真\t60249\n棚板\t60250\n房利美\t60251\nAngelaboy\t60252\ninstantly\t60253\n一间\t60254\n值value\t60255\n经线\t60256\n天下兴\t60257\n霞之丘诗羽\t60258\n大品\t60259\n望江县\t60260\n00_\t60261\n第111集\t60262\nNutanix\t60263\n问答文库_百科\t60264\nAirsoft\t60265\n土山镇\t60266\n甜米酒\t60267\nIC_捷\t60268\n刘永\t60269\n多面手\t60270\n俊华\t60271\n缘分\t60272\n永安村\t60273\nWelt\t60274\n探花\t60275\n中央第二环境保护督察组\t60276\n冠山\t60277\n一线天\t60278\ncircuits\t60279\nPuppy\t60280\n劇\t60281\nAmd\t60282\n周独夫\t60283\n曹雪\t60284\n小微企业园\t60285\n7.5.4\t60286\n玄孙\t60287\n绿藻\t60288\nComing\t60289\n高石\t60290\nTechCrunch\t60291\n脱帽\t60292\n通江\t60293\n金桥开发区\t60294\n王祯\t60295\n致和\t60296\n颐盛御中环\t60297\nFollower\t60298\n0915\t60299\n操作机\t60300\n多校\t60301\n烤漆门\t60302\njilin\t60303\n种兔\t60304\n茧子\t60305\n两届\t60306\n大绿\t60307\n东方伊甸园\t60308\nTransCAD\t60309\n生熟\t60310\n猪年\t60311\n泰拳\t60312\n网络驱动器\t60313\n23公里\t60314\n卓越集团\t60315\n项目干系人\t60316\n紅\t60317\n古曼\t60318\nmybatise\t60319\nobs\t60320\n400多kb\t60321\n福州公司\t60322\nzun\t60323\nip池\t60324\n三氧化二砷\t60325\n马丁·路德·金\t60326\n彩包\t60327\n行旅\t60328\n和达\t60329\n墙灯\t60330\n三苗网\t60331\n天翼积分商城|电信\t60332\n深南中路\t60333\n防刷\t60334\n黑烟\t60335\n潘高寿\t60336\n震慑\t60337\n击倒\t60338\nbusinessman\t60339\nblood\t60340\n唐国华\t60341\n奇瑞eQ\t60342\n王川\t60343\n地理学家\t60344\n策划公司\t60345\njavamail\t60346\n10001\t60347\n利尿剂\t60348\n天津信托\t60349\n郑保瑞\t60350\n明山区\t60351\n霍华
\t60352\n0703\t60353\nprix\t60354\nWindows共享文件\t60355\n商君书\t60356\n地恋\t60357\n本田哥瑞\t60358\n西安高新区管委会\t60359\n六道鸣人\t60360\n孺子牛\t60361\nDomestic\t60362\ntify\t60363\nNewifi\t60364\n梧州南\t60365\nhandles\t60366\n悲切\t60367\n几百部\t60368\n12千瓦\t60369\n3000万人次\t60370\n导弹发射\t60371\n新造镇\t60372\nl5640\t60373\n御笔\t60374\nAgency\t60375\n20160517\t60376\n勘探\t60377\n咏流\t60378\n秦东\t60379\n疙瘩汤\t60380\n水漫\t60381\n54P\t60382\nslayer\t60383\n中国龙\t60384\n艺学\t60385\ngm\t60386\n2017-2\t60387\n厦门海洋职业技术学院\t60388\n星移\t60389\nwindows2000\t60390\n苇\t60391\n1:10\t60392\ncolorbox\t60393\n不移\t60394\n翻土\t60395\n莫斯利安\t60396\n绿标\t60397\n陈昌旭\t60398\n风筝节\t60399\njdi\t60400\n防腐剂\t60401\n左槽\t60402\n井管\t60403\n花天酒地\t60404\n年薪\t60405\n参商\t60406\n试射\t60407\n卡卡卡\t60408\n小源\t60409\nlogan\t60410\n固定件\t60411\n庄神\t60412\n开机项\t60413\n傲虎\t60414\nSupermarket\t60415\n超值得\t60416\n恐高症\t60417\n座机\t60418\n深圳市中医院\t60419\n埭\t60420\n养鸡场\t60421\nwro\t60422\n菱角湖\t60423\nleveling\t60424\n几内亚湾\t60425\n第十二章\t60426\n菌株\t60427\nionic3\t60428\n部际\t60429\n脲酶\t60430\n岳阳世纪医院\t60431\n生化奇兵3\t60432\n前馈\t60433\n瑞金二路\t60434\n北京燃气集团有限责任公司\t60435\n神音\t60436\n战锤2全面战争\t60437\nspacex\t60438\njiang\t60439\nDelivering\t60440\nMSB\t60441\n冰厂田幼儿园\t60442\n知知\t60443\nH罩杯\t60444\n一路狂奔\t60445\n录取通知书\t60446\n白黑\t60447\n电镀件\t60448\nVCM\t60449\n初五\t60450\npostsql\t60451\n蒲洼\t60452\nforyou\t60453\n10v\t60454\n与狼共舞\t60455\n官仙\t60456\n水山公园\t60457\n专卖品\t60458\nhebei\t60459\n想方设法\t60460\neasui\t60461\n官府\t60462\nAT89C51\t60463\nCereal\t60464\n福利姬\t60465\nApril\t60466\n缸套\t60467\n结茧\t60468\n▽\t60469\n内驱力\t60470\ns71200\t60471\n云浮\t60472\n陕西中医药大学\t60473\n文斌\t60474\n跃升\t60475\n七速\t60476\nWhen\t60477\n柳林村\t60478\nhmi\t60479\nPickle\t60480\nscenery\t60481\n102集\t60482\n肽\t60483\n辣文小说网\t60484\n急性髓系白血病\t60485\n周周练\t60486\n新乐视智家\t60487\n班子\t60488\n永安保险\t60489\nPay\t60490\n合肥汽车站\t60491\n程颖\t60492\nsd卡根\t60493\n普拉洛芬\t60494\n枫桥\t60495\n宜昌新闻网\t60496\nSTN\t60497\n七通一平\t60498\n中空纤维膜\t60499\n人教版小学英语\t60500\n选单\t60501\n面朝大海\t60502\n白河镇\t60503\n45度\t60504\n防止\t60505\n饶燃文\t60506
\n换单\t60507\n维保期\t60508\n律政俏佳人\t60509\ndamon\t60510\nHalf-Life\t60511\n谢绝\t60512\naquos\t60513\ncompetent\t60514\n马可\t60515\n20l\t60516\n388号\t60517\n湖南建设\t60518\n3.0版\t60519\n有天\t60520\n青客\t60521\n头鸟\t60522\n武汉理工大学管理学院\t60523\n桜井彩\t60524\n新北万达广场\t60525\n美颜\t60526\n十四大\t60527\n蛮颚龙\t60528\nstocking\t60529\n胶垫\t60530\n汉子\t60531\n江铃域虎\t60532\n盘古七星酒店\t60533\n310分钟\t60534\n洞人\t60535\ndxdy\t60536\nLing\t60537\n703n\t60538\n张兰\t60539\n顾廷烨\t60540\n叶平\t60541\n温柔的背叛\t60542\n韦特\t60543\n河北钢铁\t60544\n温泉乡\t60545\n电学\t60546\n拟题\t60547\nBTrenren\t60548\n朗嘎拉姆\t60549\ncadworx\t60550\n美好的生活\t60551\n虾线\t60552\n充气游泳池\t60553\nWordPress中文网\t60554\n石家庄邮电职业技术学院\t60555\n挤压式\t60556\n瓜菜\t60557\n麻友\t60558\n红星奖\t60559\n中国银行江苏省分行\t60560\n夺取\t60561\n细处\t60562\n搜救\t60563\n200d\t60564\n5.8.2\t60565\nserialport\t60566\n外项\t60567\n邛崃\t60568\n神奇\t60569\n王丹凤\t60570\nConferencing\t60571\n18G\t60572\nEDU\t60573\n211.985\t60574\n生下来\t60575\n张丽钧\t60576\n奶羊\t60577\n3886\t60578\n2014年12月\t60579\n毒地\t60580\n89年\t60581\n玛雅小说网\t60582\n第113集\t60583\n蛙蛙\t60584\nksc\t60585\nyv\t60586\n一洋\t60587\n界碑\t60588\nqdc\t60589\n钩编\t60590\n情馆\t60591\n面漆\t60592\n高效过滤器\t60593\n东方心经\t60594\nbrowser-sync\t60595\n1769\t60596\n窑鸡\t60597\nCentOS\t60598\n唐宇\t60599\n吴仲华\t60600\n康宁\t60601\n锯开\t60602\n滚石爱情故事\t60603\n陈以桐\t60604\n示范\t60605\n苍穹之昴\t60606\nProfessors\t60607\n尚勇\t60608\n相田纱耶香\t60609\n光汇云油\t60610\nmanagment\t60611\n7537\t60612\n18k\t60613\n权益\t60614\n妖石\t60615\n沉寂\t60616\n微信朋友\t60617\nworksheets\t60618\n艾弗潘石屹\t60619\ncllgeek\t60620\n金沙江西路\t60621\n5045\t60622\n三岔湖\t60623\n1840年\t60624\n蓄意\t60625\n扫地机器\t60626\n金融\t60627\n三泰\t60628\n地埋式\t60629\n阶梯型\t60630\n希佩尔\t60631\n小惠\t60632\nvector\t60633\n梁艺\t60634\n三环集团\t60635\n道明寺\t60636\n沙耶香\t60637\n天门南\t60638\nDOCTOR\t60639\n华光\t60640\n接待站\t60641\n音乐季\t60642\n两\t60643\n莫若\t60644\n北京新东方学校\t60645\nvSAN\t60646\n新浪陕西时尚_新浪\t60647\n635\t60648\n浅唱\t60649\n景瑞\t60650\n笨哥\t60651\n心有余悸\t60652\n表记\t60653\n连本\t60654\nExistence\t60655\n茶头\t60656\n排口\t60657\n瓦楞纸板\t60658\n响应式\t60659\n空少\t60660\n环保公司\t60
661\n换热器\t60662\n颜葵\t60663\nMySql数据库\t60664\nda\t60665\n网纹\t60666\n麦田里的守望者\t60667\n四川铁路学校\t60668\n导视系统\t60669\n青岛旅游学校\t60670\n无计可施\t60671\ncerebral\t60672\n勿\t60673\n中国人民大学国际关系学院\t60674\n缴纳税\t60675\nDenial\t60676\n春蕾\t60677\n金夏\t60678\n心空\t60679\n80txt.com\t60680\n武汉海关\t60681\n02_\t60682\n林萧\t60683\n8类\t60684\nporcelain\t60685\nsiemens\t60686\n老头\t60687\navxxx\t60688\n高高在上\t60689\n坏爸爸\t60690\n4月8\t60691\n威海市政府\t60692\n脸红心跳\t60693\n百变机兽之洛洛历险记\t60694\n鸡群\t60695\ntiff格式\t60696\n第四十四条\t60697\n600406\t60698\n外壳\t60699\n032期\t60700\nWifi万能钥匙\t60701\n家院\t60702\n黄汐源\t60703\n屁沟\t60704\n360安全卫士\t60705\n包皮过长手术\t60706\n真空助力器\t60707\ncouples\t60708\n消保\t60709\n黄瓜籽\t60710\n英雄城\t60711\n板桥村\t60712\n制热\t60713\n冷水滩\t60714\n说了再见\t60715\n香膏\t60716\n基础包\t60717\nvtm\t60718\nSara早安\t60719\nPhilipp\t60720\nCM3\t60721\nQ235B\t60722\n体力活动\t60723\nCISP\t60724\n安达维尔\t60725\n馓子\t60726\n五数\t60727\n马拉松\t60728\n打电话\t60729\n何仙姑夫\t60730\n鲁教版八年级\t60731\nqlistwidget\t60732\n10080\t60733\n云南省质量技术监督局\t60734\n512GB\t60735\n打喷\t60736\n私家\t60737\n普及性\t60738\n20171028\t60739\n错门\t60740\n模拟人生2\t60741\n蹦跳\t60742\n李颖芝\t60743\ncsdn\t60744\n啥子\t60745\n军衣\t60746\n20150810\t60747\n信用盘\t60748\n水性环氧地坪漆\t60749\n苍蝇\t60750\ncgm\t60751\n微波消解仪\t60752\n黑匣子\t60753\n油橄榄\t60754\n大难不死\t60755\n九乡风景区\t60756\nUpton\t60757\n中国铁道学会\t60758\n洞里萨湖\t60759\n扭蛋\t60760\n2016年9月24日\t60761\n444\t60762\n烧香\t60763\n心脏搭桥\t60764\n成都市中西医结合医院\t60765\n无冕\t60766\n药师琉璃光如来本愿功德经\t60767\n碛口\t60768\n柠檬酱\t60769\n事业单位\t60770\n奇诺\t60771\n超白缸\t60772\n疯狂性\t60773\n乐圈\t60774\n帕德玛瓦蒂\t60775\n万载\t60776\n▏\t60777\n回撤率\t60778\n28.3\t60779\n千万个\t60780\n12.09\t60781\n北京师范大学经济与工商管理学院\t60782\n周玲安\t60783\n江南贡院\t60784\n石槽\t60785\n莫斯科中央陆军\t60786\n圣马力诺\t60787\n卡易信\t60788\nSp1\t60789\n郑煤机\t60790\n明伟\t60791\n墨香阁\t60792\n陈正雷\t60793\n林希\t60794\n秘贴\t60795\n西安百姓网\t60796\n活佛济公3\t60797\nLynx\t60798\nBOLL指标\t60799\n黑苦荞茶\t60800\n生理性黄疸\t60801\n奥语\t60802\n肖姓\t60803\n七月上\t60804\n心悦俱乐部\t60805\n岩洞\t60806\n烟雨\t60807\n逍遥梦路\t60808\n近期\t60809\n长风破浪\t60810\n蜀山街道\t60811\n幽门螺旋菌\t60812\n朴孝信\t60
813\nleigh\t60814\n甄嬛体\t60815\n6R\t60816\n小米智能摄像机\t60817\n捷成股份\t60818\n不屑一顾\t60819\nKitten\t60820\n昊嘉\t60821\n鹤城镇\t60822\n蛾\t60823\n转折期\t60824\n5秒内\t60825\n罩壳\t60826\n真女神转生\t60827\n翻跟\t60828\ncayenne\t60829\n板状\t60830\n皇太后\t60831\n第二十批\t60832\n甘肃移动\t60833\n大男人\t60834\n锅锅\t60835\nWhitening\t60836\n珈\t60837\n正反比例\t60838\n硅谷数模\t60839\n恶狼\t60840\n圆度\t60841\n心言\t60842\n城阳\t60843\nisomorphic\t60844\n极式\t60845\n忠言\t60846\n20180109\t60847\n4210m\t60848\n双人房\t60849\n高清吉吉影音\t60850\nPlaystation\t60851\n60页\t60852\nDistinguished\t60853\n防冰\t60854\n秦时明月之君临天下\t60855\n聚源\t60856\nSatisfaction\t60857\n孙丽华\t60858\nScanSnap\t60859\n10万个\t60860\n铂金\t60861\n工厂化\t60862\nMiu\t60863\n容纳量\t60864\n萝卜网\t60865\nparallels\t60866\n杜尔伯特\t60867\n首屏\t60868\nebm\t60869\nFirenze\t60870\n逆阵\t60871\n变量\t60872\n手写板\t60873\n盘前\t60874\n陈桥\t60875\n宪\t60876\n必需氨基酸\t60877\n闪点\t60878\nICP代备案管理系统\t60879\n600km\t60880\n西冷印社\t60881\n1.25G\t60882\n恶缘\t60883\n盛唐妖异志礼包\t60884\n岩土在线\t60885\n百淬\t60886\n经十路\t60887\n余老师\t60888\n火箭\t60889\n长沙烈士公园\t60890\n重庆市人力资源开发培训中心\t60891\n糖类抗原724\t60892\nResin\t60893\nawk\t60894\n行车电脑\t60895\n锐逸\t60896\n40目\t60897\n作家协会\t60898\nsso\t60899\n海宁西站\t60900\n8.5米\t60901\n北京大使馆\t60902\n花果山水帘洞\t60903\n贤惠\t60904\n20170712\t60905\nChord4\t60906\n85平米\t60907\n银行界\t60908\n大功率LED\t60909\n瞳亮\t60910\n刺客信条启示录\t60911\nwin10/win7\t60912\n1231V3\t60913\n保利首开\t60914\nWin7\t60915\nG网\t60916\n百度百聘\t60917\n王杰克逊\t60918\n88年\t60919\n黄松\t60920\nCOUNT\t60921\n晃动\t60922\nGarageBand\t60923\n同源染色体\t60924\nibdata\t60925\n计量经济学\t60926\nyisheng\t60927\n吾家有女初长成\t60928\n科大\t60929\nCAPP\t60930\n李艳玲\t60931\n产商\t60932\n铁索桥\t60933\njukujo\t60934\n宮崎\t60935\n2200G\t60936\n会计初级考试\t60937\n拉黑\t60938\n单重\t60939\n云南师范大学附属中学\t60940\n暴走大事\t60941\n贝立凯\t60942\n阑珊处\t60943\n贵阳乐居网\t60944\n石之心\t60945\n方炯镔\t60946\n新叶\t60947\n车坊\t60948\n552\t60949\n舌头发\t60950\n十三姨\t60951\n破溃\t60952\nN9009\t60953\n刘昊乐嘉\t60954\n达克赛德\t60955\n三味线\t60956\n卿本\t60957\n558\t60958\n回气\t60959\n友尽\t60960\nFX3U\t60961\n黑暗处\t60962\n伶仃岛\t60963\nfitch\t60964\n翻新版\t6096
5\n重庆医科大学附属第一医院\t60966\n140.dll\t60967\n顺水\t60968\nbup\t60969\n高新技术企业认定网\t60970\n强烈推荐\t60971\n专页\t60972\n歌诗\t60973\nWalter\t60974\n妙笔阁\t60975\nFMC\t60976\nagreed\t60977\n1565\t60978\n亚兴\t60979\n井然\t60980\n墨尔本城\t60981\n骑术\t60982\n寓情\t60983\n王子复仇记\t60984\n桂\t60985\n学系\t60986\n重金属污染\t60987\n汇智大厦\t60988\n众泰T500论坛\t60989\n塔院\t60990\n怡景花园\t60991\n嘉士伯\t60992\n谢贤\t60993\n中智\t60994\nmeigesir\t60995\n外务\t60996\n呆账\t60997\n传声筒\t60998\n盛迪嘉\t60999\n书信体\t61000\n点兵\t61001\n2011—2020年\t61002\n龙城网\t61003\n单形\t61004\n传递窗\t61005\n寿联\t61006\n进击的巨人2\t61007\n骆越\t61008\n合成旅\t61009\n麦田李晨\t61010\n44100\t61011\n财务成本管理\t61012\nMonument\t61013\nAnyone\t61014\n雄塑科技\t61015\n每一个月\t61016\n星沙大道\t61017\n白云山景区\t61018\n3DS模拟器\t61019\n从简\t61020\n美利达斯特拉\t61021\ninyue\t61022\n穆清歌\t61023\n用友通\t61024\nqq堂\t61025\n浅川\t61026\n零解\t61027\n五到十年\t61028\n于宏洁\t61029\nCRIC\t61030\n兽皮\t61031\n爰\t61032\n_虫\t61033\nE350\t61034\n拉姆齐\t61035\n元九\t61036\n怨念\t61037\n格栅板\t61038\n寒门笑忘书\t61039\n173.COM\t61040\n肾石\t61041\n生旦\t61042\n11周\t61043\n外部化\t61044\n15年12月\t61045\nLava\t61046\n经济管理学\t61047\nCollaboration\t61048\n161\t61049\n卡利\t61050\n脉脉\t61051\n装饰工程公司\t61052\n几单\t61053\n黑米\t61054\nh色\t61055\n小生活\t61056\nmoe\t61057\nui3g\t61058\n主体\t61059\n山情\t61060\n周练\t61061\n个子\t61062\n环球太平洋\t61063\n华商报_华商网_华商报\t61064\n富力城\t61065\n民主共和制\t61066\n墨麒麟\t61067\n重回\t61068\n饭冈佳奈子\t61069\n大可\t61070\n亚克力\t61071\nApart\t61072\n宏都\t61073\n北京新公司\t61074\nmk2\t61075\n10月27日\t61076\n唐人\t61077\n橙味\t61078\n几片\t61079\n64式\t61080\n刘教授\t61081\n皇包车\t61082\n鲜品\t61083\nWINDOWS\t61084\n囚牛\t61085\n开创性\t61086\nczechstreets\t61087\n墨先生\t61088\n跪\t61089\n斗地主残局\t61090\n电量\t61091\n64天\t61092\n管理资源网\t61093\n插桩\t61094\n和事佬\t61095\n蚂蜂\t61096\n优益\t61097\n9整除\t61098\n依巴斯汀片\t61099\nVCE\t61100\n知因\t61101\n东莞地铁2号线\t61102\n68部\t61103\n张维国\t61104\n石榴木\t61105\n陆伟\t61106\n股票期权\t61107\nQuestion\t61108\n同程火车票\t61109\n帝王浴\t61110\n浮夸风\t61111\n星象仪\t61112\nHgamecn\t61113\n途观\t61114\n弹仓\t61115\n合肥信息技术职业学院\t61116\n东莞市妇幼保健院\t61117\n订单号\t61118\n11K\t61119\nSeq\t61120\n59条\t61121\nmagi
\t61122\nwcb\t61123\n万锦\t61124\n沧州市人民医院\t61125\n非典型性\t61126\nePrint\t61127\n纱裙\t61128\n搏击俱乐部\t61129\n2012年11月24日\t61130\n河南省科学技术厅\t61131\n别克GL\t61132\n舒畅\t61133\ndytt\t61134\n芜湖一中\t61135\n重庆分公司\t61136\n木粉机\t61137\n紫砂壶\t61138\ndpdk\t61139\n对焦点\t61140\n方氏\t61141\n六重\t61142\n旅游卫视\t61143\n小米盒子3c\t61144\nrae\t61145\n一年多前\t61146\n熟练\t61147\nouc\t61148\n是不是这样\t61149\n镙\t61150\n天天来塔防\t61151\n毛泽东思想\t61152\n水穷处\t61153\n诸葛宇杰\t61154\nGBlive\t61155\nPS工具栏\t61156\n对面\t61157\n美丽水世界\t61158\n第二类\t61159\nCrystal\t61160\n马飞飞\t61161\n金翔\t61162\n81张\t61163\n米虫\t61164\n22吨\t61165\n穿堂\t61166\n怨气撞铃\t61167\n建筑设计图\t61168\n人伤\t61169\n煎蛋网\t61170\nshhbm\t61171\n目录式\t61172\n同感\t61173\n民间艺术团\t61174\n克劳奇\t61175\n建功立业\t61176\n加菲猫2\t61177\n50单\t61178\n中国电视台\t61179\nmedicinal\t61180\nmansion\t61181\n万以内数的认识\t61182\nsw\t61183\n上置\t61184\n桶状\t61185\n几刀\t61186\nNVS\t61187\n维明\t61188\n农\t61189\n雁栖镇\t61190\nStudioFOW\t61191\n徽式\t61192\n威海北站\t61193\n宋会\t61194\n甘坑客家小镇\t61195\n赵雷\t61196\n李鸣\t61197\n后录\t61198\n萤丸\t61199\n薄荷味\t61200\n证券从业资格\t61201\n铁馆\t61202\nOpenId\t61203\n顾凌擎\t61204\n弩手\t61205\n省作协\t61206\n版记\t61207\n服务器\t61208\n汪正\t61209\n已然\t61210\nbritney\t61211\n灰尘\t61212\n下城\t61213\nkeras\t61214\n陈栋\t61215\nEdinburgh\t61216\n阿鲁纳\t61217\n云版\t61218\nUSPTO\t61219\n广州招聘网\t61220\n_汉\t61221\nzhonghe\t61222\n写一写\t61223\n宏福\t61224\n巨大儿\t61225\nps2吧_\t61226\n勾勒\t61227\n无为中学\t61228\n海南电网\t61229\n退关\t61230\nsable\t61231\n圣安东尼奥马刺\t61232\ndoris\t61233\n买不到\t61234\n藏水\t61235\n摩尔库伦\t61236\nyadea\t61237\ntedx\t61238\nGXZC\t61239\n慈寿寺\t61240\n绸面\t61241\n共晶\t61242\n会风会纪\t61243\n内购\t61244\n河南经济网\t61245\n青荣城际铁路\t61246\n遵义市委\t61247\n10n\t61248\n0.79\t61249\n用餐\t61250\n2008年以前\t61251\n品房阁\t61252\n星刻\t61253\n湖南法院网\t61254\n枕叶\t61255\nCART\t61256\n袋鼠国\t61257\n直冷式\t61258\n嘤嘤\t61259\n285267128\t61260\n一沓\t61261\n坦克部队\t61262\n新侠客行\t61263\n舒肝解郁胶囊\t61264\n太阳黑子\t61265\nSTEAM\t61266\n悄悄\t61267\n田边\t61268\n风赛\t61269\nCompilers\t61270\n当当当当\t61271\n展示板\t61272\n胡媛媛\t61273\n单元格所在行\t61274\nDMZ\t61275\n黑社会\t61276\nmarina\t61277\n尿湿\t61278\n
菲德尔·卡斯特罗\t61279\n李总理\t61280\n别业\t61281\n禁漫\t61282\n01月\t61283\n公路考\t61284\nIT运维管理系统\t61285\n国家标准馆\t61286\nWindows局域网\t61287\n阿麦\t61288\nKMS\t61289\n业主们\t61290\nSeiko\t61291\n红河县\t61292\n拱顶\t61293\n盘溪\t61294\n中外合资经营企业法\t61295\n韶关市政府\t61296\nsincerely\t61297\n中华医学会神经病学分会\t61298\n一宅\t61299\n一级\t61300\n圣网\t61301\n路谱\t61302\n怡思丁\t61303\ngames\t61304\n前十分钟\t61305\n沈鹏\t61306\n惊天魔盗团\t61307\n法律咨询\t61308\n小米随身WIFI\t61309\n海南省高级人民法院\t61310\n乌江画廊\t61311\n热熔胶棒\t61312\n小红豆\t61313\nAQ\t61314\n河津市人民政府\t61315\nKiva\t61316\n王雨\t61317\n平实\t61318\nGordon\t61319\nvenom\t61320\n合肥人才网\t61321\n以德治国\t61322\n贱志\t61323\n蜜友\t61324\n保利大剧院\t61325\n同案\t61326\n造价员\t61327\n电流音\t61328\n萨满祭司|Shaman\t61329\nyongli\t61330\n厦门出入境检验检疫局\t61331\nsnapkit\t61332\n机框\t61333\n知中国\t61334\n姜楠\t61335\n邻牙\t61336\n九龙城寨\t61337\n咸鸭蛋黄\t61338\n岬\t61339\n632\t61340\n丹红\t61341\n盈建科\t61342\nerror=2\t61343\n浙江省纪委\t61344\n咸宁市人民政府\t61345\n京野\t61346\n红豆沙\t61347\n第四宫\t61348\n奥様\t61349\n不屑\t61350\n24盒\t61351\n红星地产\t61352\n清气正\t61353\n儿童简笔画大全\t61354\n麦柯斯\t61355\n儿童故事大全\t61356\n敷贴\t61357\n露营\t61358\n立体派\t61359\n翻滚吧三国\t61360\n持久战\t61361\n水河\t61362\n鬼子\t61363\n多余\t61364\n一百五\t61365\n爱情公寓1\t61366\n接桩\t61367\nexcellent\t61368\n10.06\t61369\n片尾\t61370\nfifa17\t61371\nKBS\t61372\n近海海洋环境科学国家重点实验室\t61373\nlae\t61374\n永安药业\t61375\nbilbili\t61376\n沙河水库\t61377\n韦嘉\t61378\n克里金插值\t61379\n城东村\t61380\ngh5s\t61381\n华星\t61382\n街健\t61383\n鬼畜岛\t61384\n赛巴斯\t61385\n雪铁龙C4L论坛_汽车之家论坛\t61386\n雁翎\t61387\n联控\t61388\n广州医科大学附属第一医院\t61389\n1333\t61390\n走来走去\t61391\n赣州人才网\t61392\n桃仙国际机场\t61393\n货柜\t61394\n解经\t61395\n打封闭\t61396\n花果\t61397\n非礼也\t61398\n千丝\t61399\nprx\t61400\n932\t61401\n马嘉祺\t61402\n侯毅\t61403\n苏同兴\t61404\n汉中市人民政府\t61405\n20180131\t61406\njcreator\t61407\n回避制度\t61408\n新野纺织\t61409\n好白\t61410\n精密无缝钢管\t61411\n长江新闻号\t61412\n披\t61413\n大字\t61414\n塞罕坝国家森林公园\t61415\n有苦难言\t61416\n外国男\t61417\n县人民医院\t61418\n装修快车网\t61419\n新威\t61420\n有限电视\t61421\n胆盐\t61422\n顺昌\t61423\n陕汽\t61424\n一下一下\t61425\nNexT\t61426\n沉得住气\t61427\n中国国家认证认可监督管理委员会\t61428\n潘绫莹\t61429\n姚蔚\t614
30\n包装机\t61431\nbodyshop\t61432\n恶毒龙牧\t61433\n九祖七宗\t61434\n中国戏剧网\t61435\nasync函数\t61436\n艺典\t61437\n9f\t61438\n骸音\t61439\n千斤\t61440\n半晌\t61441\nkeys\t61442\n天悦府\t61443\ncapturing\t61444\nReplication\t61445\n东村\t61446\n六款\t61447\n女猪\t61448\n爱情路\t61449\n私家侦探公司\t61450\n434\t61451\n龙涎香\t61452\n1.002\t61453\n切去\t61454\n股级\t61455\n大宅门2\t61456\n王岳\t61457\n分馏塔\t61458\n禽流感\t61459\n萧疏寒\t61460\n中国共产党统一战线工作条例\t61461\n体育器械\t61462\n嘹亮\t61463\n德州农工大学\t61464\n人民政协网\t61465\n诉讼时效期间\t61466\n同方\t61467\n恢复性\t61468\n勘察员\t61469\n百度蜘蛛\t61470\n中公教育江苏分校\t61471\nPit\t61472\n新变化\t61473\n2551\t61474\n空镜\t61475\n雅利安\t61476\n兴致\t61477\n衡水湖\t61478\n划词\t61479\nIMAX版\t61480\n180331\t61481\nmobaxterm\t61482\n得力文具\t61483\n西培学堂\t61484\n虎牢关\t61485\n马来西亚\t61486\n就会\t61487\n吴江经济开发区\t61488\n猎豹免费WiFi-随身WiFi\t61489\nMOTO\t61490\n巴黎地铁\t61491\n绿波\t61492\n12千米\t61493\nAlvaro\t61494\n10几年\t61495\n绿城江南\t61496\n上个月份\t61497\n图档\t61498\n蒙纳超\t61499\ngrib\t61500\n妈妈是超人\t61501\nIpsos\t61502\n三国9\t61503\n处理器\t61504\n珍宝岛\t61505\nbutto\t61506\n10行\t61507\n短轴\t61508\n锡山\t61509\n台底\t61510\n长沙南方职业学院\t61511\n21世纪不动产\t61512\n周霆\t61513\n男神们\t61514\n科尔沁区\t61515\nmeili\t61516\n长春应化所\t61517\naparche\t61518\nwet\t61519\n内高\t61520\n100多\t61521\n4轮\t61522\nooxx\t61523\napplies\t61524\naudiorecord\t61525\n几千个\t61526\n蒂姆伯顿\t61527\n华氏温度\t61528\nfifty\t61529\nrobocup\t61530\n一星期内\t61531\n齐人\t61532\n实性\t61533\nt型\t61534\n长房集团\t61535\n瘆人\t61536\n武家坡\t61537\n循证\t61538\n热征\t61539\n老生长\t61540\n陪送\t61541\njisu\t61542\n平凉市政府网\t61543\n接入\t61544\n露头\t61545\n翠贝卡\t61546\n0.31\t61547\nAIDL\t61548\nfilegee\t61549\niloc\t61550\n丁仪\t61551\n政绩观\t61552\n闭孔\t61553\nuploadfile\t61554\n头梁\t61555\n南靖土楼\t61556\n生活费用\t61557\n冷拉圆钢\t61558\n福田拓陆者皮卡\t61559\n滑丝\t61560\n跨境电商\t61561\nkmeans聚类\t61562\n王建刚\t61563\ncode\t61564\n京巴\t61565\ncameron\t61566\n腊八\t61567\n微剧\t61568\n中国期货公司\t61569\nDRUID连接池\t61570\n刀路\t61571\n隆裕太后\t61572\nfreekan\t61573\nphilippine\t61574\n奇易DJ音乐网\t61575\n不會\t61576\n李旸\t61577\n粤西北\t61578\n口口\t61579\n鲁家峙\t61580\n随之而来\t61581\n广东工商职业学院\t61582\nVit
al\t61583\n多尿\t61584\n王蛇\t61585\n葛塘\t61586\ndeviceid\t61587\n旬阳县人民政府\t61588\nlog4j.properties\t61589\n新论断\t61590\n张瀚文\t61591\n声卡驱动\t61592\n木醋液\t61593\nibs\t61594\n名网\t61595\n马自达马自达\t61596\ncircular\t61597\nepoll\t61598\n汽车消费贷款\t61599\n安吉利\t61600\n4.25日\t61601\n带通滤波器\t61602\n90.1\t61603\n星行\t61604\n小箭\t61605\n非金融类\t61606\n肉羊\t61607\n实验楼\t61608\n选自\t61609\n_股城\t61610\n挂档\t61611\n低压配电网\t61612\n广东南\t61613\nphb\t61614\n全石\t61615\n南通便民网\t61616\nLego\t61617\n发布站\t61618\nxxnet\t61619\n老身\t61620\n三四五六\t61621\n节奏天国\t61622\n名图论坛\t61623\n张汉盛\t61624\n野原广志\t61625\n爱情网\t61626\ntonight\t61627\n2018.4.12\t61628\n新和村\t61629\n职业\t61630\nEverything\t61631\nBoat\t61632\n玛卡\t61633\nTICKET\t61634\nOwn\t61635\n商务参赞处\t61636\nunetbootin\t61637\ndeserves\t61638\nshelll\t61639\n组工网\t61640\n甘井镇\t61641\n有声小说网_在线听书\t61642\n捕梦网\t61643\n椒图\t61644\n特价机\t61645\nreferences\t61646\n垫子\t61647\nWHMCS\t61648\n利丰集团\t61649\n新世纪大学\t61650\n李艳萍\t61651\n上海市第一妇婴保健院\t61652\n笋岗\t61653\nBian\t61654\nGS3\t61655\n微软中国\t61656\n咆哮者\t61657\n整残\t61658\n明沟\t61659\nencoded\t61660\n毒妈\t61661\n磁材\t61662\n野水\t61663\nSiO\t61664\nFeather\t61665\n千灯\t61666\n荨麻\t61667\n汽车集团\t61668\n星悦城\t61669\n丹邦科技\t61670\ncoccus\t61671\n草庙村\t61672\n木塑\t61673\nLewis\t61674\n赵敏\t61675\n免疫算法\t61676\n王府井大街\t61677\n下载页\t61678\n最大似然\t61679\n李白\t61680\nTablet\t61681\n400万吨\t61682\n迎峰度夏\t61683\n中凯城市之光\t61684\n蝶丝\t61685\n施特劳斯\t61686\n99kk\t61687\n华光股份\t61688\n把稳\t61689\n腰肌\t61690\n浙江省医学科学院\t61691\n长钉\t61692\nMagicaVoxel\t61693\n新天鹅堡\t61694\npoi\t61695\n碟仙\t61696\n金\t61697\n同义词\t61698\n30h\t61699\n林翠萍\t61700\n氧化磷酸化\t61701\n减免税\t61702\n金鼎\t61703\n乾卦\t61704\n胡涛\t61705\n旭辉广场\t61706\n康科德\t61707\nCarpenters\t61708\n第四十五章\t61709\n广告品\t61710\n苏南地区\t61711\ngret\t61712\nArthritis\t61713\n2017年5月24日\t61714\n15亿\t61715\n明天凌晨\t61716\n颓势\t61717\n李希侃\t61718\n选刊\t61719\nWord/WPS\t61720\n516\t61721\n年青\t61722\n私对公\t61723\n10KW\t61724\n岩片\t61725\n德鲁伊\t61726\nmyeclispe\t61727\n上海东方艺术中心\t61728\n魔法池\t61729\ngovt\t61730\n上稿\t61731\n富力津门湖\t61732\n东亚区\t61733\n长沙银行股份有限公司\t61734\
n表链\t61735\n火之迷恋\t61736\n第一会\t61737\n省消防总队\t61738\n冒险岛m\t61739\nhenan\t61740\n魔法\t61741\n现身说法\t61742\ni7-8550u\t61743\n越野挑战赛\t61744\n宝安日报\t61745\n所向无敌\t61746\n蓄滞洪区\t61747\n五问\t61748\n库仑力\t61749\n强制性\t61750\n天成\t61751\narab\t61752\n把握\t61753\n均可\t61754\n中豪\t61755\n抒\t61756\n郑州成功财经学院\t61757\ntruth\t61758\n50题\t61759\n绿装\t61760\n荆棘王冠\t61761\n逆剑狂神\t61762\n易感\t61763\n20150930\t61764\n沈阳工业大学\t61765\n庞岩\t61766\n66RPG\t61767\n西安房管局\t61768\n叛谍\t61769\n浙江省科技厅\t61770\n恩人\t61771\ne5450\t61772\n游商\t61773\n流天\t61774\n苯板\t61775\nn4200\t61776\n京溪\t61777\n液压网\t61778\n_好菜杰家常菜谱大全网\t61779\n张建新\t61780\n2014款款\t61781\n中华人民共和国国家安全法\t61782\n军训\t61783\npvz2吧\t61784\n红桥\t61785\nぞ\t61786\ngl8\t61787\nTeaching\t61788\n500l\t61789\n排卵期出血\t61790\n人贴\t61791\n焦作一中\t61792\n24本\t61793\nghosts\t61794\n叶凌天\t61795\n派代网\t61796\nJiangsu\t61797\n#鹿晗热血街舞团#\t61798\n红颜旧\t61799\n赵锦焘\t61800\n回转寿司\t61801\n信息园\t61802\n亲夫\t61803\n身段\t61804\n干鲍鱼\t61805\n李勇\t61806\n上海民政\t61807\nimplied\t61808\n250个\t61809\n李唐\t61810\n远大科技集团\t61811\nestimates\t61812\nv-text\t61813\n第86章\t61814\nm176n\t61815\n触摸键盘\t61816\n谢妮\t61817\n张静初\t61818\n整肠丸\t61819\n数日\t61820\n雨辰\t61821\nOOR\t61822\n十中\t61823\n纹章\t61824\n肚兜\t61825\n京能集团\t61826\nProfinet\t61827\n原浆\t61828\n湘潭县\t61829\n构数\t61830\n建制村\t61831\n胚芽鞘\t61832\n立定跳远\t61833\n装饰盖\t61834\nlive-server\t61835\nSHEN\t61836\n时代地产\t61837\nGraphene\t61838\n黑暗扎基\t61839\nChaumet\t61840\ne8400\t61841\n铡\t61842\n北极熊\t61843\nSoleil\t61844\n上海樊克生物科技有限公司\t61845\ndrop、truncate\t61846\n微分法\t61847\n迎阳\t61848\n丝塔芙\t61849\nmultiman\t61850\nddi\t61851\n西游记续集\t61852\n麦克劳林\t61853\n初选\t61854\n夹扣\t61855\nncd\t61856\n川陵\t61857\n扑通\t61858\n新门\t61859\n火雷\t61860\nKTA\t61861\n循着\t61862\n几千公里\t61863\nmsys\t61864\n达州市国土资源局\t61865\nremenber\t61866\n赵秀君\t61867\n国家民委\t61868\n工人路\t61869\nexcel-CSDN\t61870\n马小跳\t61871\nzhrss\t61872\n林奕含\t61873\n闯过\t61874\n出水\t61875\n包邮\t61876\n清丰\t61877\n再犯\t61878\nBlockingQueue\t61879\nreids\t61880\n北京市公安局公安交通管理局车辆管理所\t61881\n换洗\t61882\n9150\t61883\n差错\t61884\n糖耐量\t61885\n指责\t61886\nwinyh\
t61887\n章北海\t61888\n动画片\t61889\n温言\t61890\n三里河\t61891\n爸爸妈\t61892\n娱美德\t61893\n等额本息还款计算器\t61894\nUISearchBar\t61895\n消食片\t61896\n120秒\t61897\n30-40年代\t61898\n中一\t61899\n圆柱\t61900\n新街\t61901\n李琦\t61902\n颤抖\t61903\n深圳北\t61904\n广州南方人才市场\t61905\n诸葛亮传\t61906\n年表\t61907\n全自动绕线机\t61908\n管理司\t61909\n台农\t61910\n460ml\t61911\n刘家桥\t61912\n评价语\t61913\n解百纳\t61914\n图PS\t61915\n廉租\t61916\n民进中央\t61917\narma\t61918\nbangbus\t61919\n神影\t61920\n绿芽\t61921\n阳光私募\t61922\n桃花坞\t61923\n应用而生\t61924\n3.13.0\t61925\nVCRUNTIME140.dll\t61926\nVolt\t61927\n观世音菩萨普门品\t61928\naisa\t61929\n郭锐\t61930\n1686\t61931\n缭乱\t61932\n黑石山\t61933\n一種\t61934\n寻乌调查\t61935\n暗夜猎手薇恩\t61936\n流音\t61937\n国家密码管理局\t61938\nv3.8\t61939\njose\t61940\nK50\t61941\nKaren\t61942\n生世\t61943\n中钨高新\t61944\n胜山\t61945\nenglishpod\t61946\n解决方案\t61947\nSQ\t61948\n第05章\t61949\n新纪元\t61950\n刘小七\t61951\n富氢水\t61952\nTQ\t61953\n桃运\t61954\n天鹅座\t61955\n加类\t61956\n神篇\t61957\nelection\t61958\n智慧果\t61959\n反腐败\t61960\n纯味\t61961\n394号\t61962\n公社\t61963\n上海发展\t61964\nweis\t61965\n共阴数码管\t61966\n破裂\t61967\n幸福人家\t61968\nclimbing\t61969\n茂名市人民政府\t61970\n浙委\t61971\n发号施令\t61972\n万能表\t61973\n20170313\t61974\n减整\t61975\n河北省体育局\t61976\n民网\t61977\nAlienware\t61978\n放得开\t61979\n小T\t61980\n麦德林\t61981\n佛台\t61982\n广州股权交易中心\t61983\n丝盒\t61984\n谢略\t61985\n化学变化\t61986\n政治学概论\t61987\n没用对\t61988\n原流\t61989\n安华高\t61990\n几分之几\t61991\n看病\t61992\n上交所\t61993\n美人谷\t61994\n骑战三国\t61995\n报装\t61996\n鸡蛋味\t61997\n边缘户\t61998\n迪安\t61999\n王晓晖\t62000\niq\t62001\n力合\t62002\n思华\t62003\n六经\t62004\nofweek\t62005\n20151120\t62006\n缅北\t62007\n漫记\t62008\n台州商报\t62009\n恶魔猎人\t62010\n105集\t62011\n金窝窝\t62012\n开荒\t62013\n力不从心\t62014\n推荐帖\t62015\niThoughts\t62016\n项城网\t62017\n四五十万\t62018\n阿郎\t62019\n联诚\t62020\n坪山高级中学\t62021\n主动力\t62022\n案说法\t62023\nArchLinux\t62024\n华韩\t62025\n潜航器\t62026\ndehua\t62027\n阿拉伯帝国\t62028\nlalalala\t62029\n捷身\t62030\nmasterpiece\t62031\n文言文字典|古汉语字典\t62032\n创维酷开\t62033\n天长市\t62034\n羁縻\t62035\n胡辣汤\t62036\n地排\t62037\n我的疯巫妖\t62038\npc28\t62039\n名校堂\t62040\n关云长\t62041\n摇椅\t620
42\nOPED\t62043\nXCA\t62044\n10月21日\t62045\n军书\t62046\nTaoCode\t62047\n北京希尔顿酒店\t62048\n李红涛\t62049\n低阻\t62050\n宜化\t62051\n200KB\t62052\n已知\t62053\nHR\t62054\n希有\t62055\n景石\t62056\n随你\t62057\n质管部\t62058\nVIBE\t62059\n9幢\t62060\n怪病\t62061\n长春新区\t62062\n金龙村\t62063\n81\t62064\n吉平\t62065\n质气化\t62066\n周丽君\t62067\n微擎人人商城\t62068\n宝井理人\t62069\n善领科技\t62070\n电脑帝\t62071\n百家讲坛资源网\t62072\n搜狗糖猫\t62073\n跋\t62074\n中华人民共和国继承法\t62075\n今日惠州网\t62076\n25.5\t62077\n嘉善吧\t62078\nアニメ\t62079\n丽江婚纱摄影\t62080\n中山东\t62081\n康顺\t62082\nEBDoor\t62083\nbts\t62084\n北京地铁八通线\t62085\nscal\t62086\n冰激淋\t62087\n卷贝\t62088\n瓜州在线\t62089\n温州医科大学附属眼视光医院\t62090\n在床上\t62091\n2066\t62092\n奔驰c200l\t62093\n人民健康网\t62094\n白鹿镇\t62095\n法军\t62096\n毒草\t62097\n流管\t62098\n华为荣耀7吧_\t62099\nUISearch\t62100\n茶籽\t62101\n尘封曲\t62102\n维一精油\t62103\n核磁管\t62104\n牧村\t62105\n泰州人民医院\t62106\n省发改委\t62107\n大阳山\t62108\nimpressed\t62109\n三旺\t62110\n混合动力车\t62111\n阿莫西林分散片\t62112\n20亿元\t62113\n206块\t62114\nTOKIO\t62115\nMFi认证\t62116\n聚氯乙烯树脂\t62117\n李晶晶\t62118\ngarmin\t62119\n外国佬\t62120\n重力式挡土墙\t62121\n5头\t62122\n蔚蓝山\t62123\nHBL\t62124\n硝基甲苯\t62125\n20幅\t62126\n电信IP地址\t62127\n先锋影院_影音先锋电影_影音先锋资源网\t62128\nDCD\t62129\nxfx\t62130\n克丽缇娜美容院\t62131\n浦发硅谷银行\t62132\n蚊香片\t62133\n水果茶\t62134\n胶纸\t62135\n浑水摸鱼\t62136\n第001\t62137\n小布头奇遇记\t62138\n广东省工商行政管理局\t62139\n事前\t62140\n闹闹\t62141\nyh\t62142\n深情告白\t62143\n近因\t62144\nhm65\t62145\n智通\t62146\nPowerbeats3\t62147\n252dw\t62148\n巴鲁姆克\t62149\n浴血华沙\t62150\n梁老师\t62151\n弥漫性\t62152\n罗德斯岛\t62153\n黄晓武\t62154\n保利紫荆公馆\t62155\n堆载\t62156\n凶杀\t62157\n长沟落月\t62158\n剪标\t62159\n烟房网\t62160\n表量\t62161\n王海峰\t62162\n剑锋金\t62163\n李迅\t62164\n陈国辉\t62165\n氯氰菊酯\t62166\n铺贴\t62167\n饥饿游戏\t62168\n三氯蔗糖\t62169\n合金装备:幸存\t62170\nupskirt\t62171\n焊线机\t62172\n合生广场\t62173\nker\t62174\n印堂穴\t62175\n成份股\t62176\n招商局蛇口工业区控股股份有限公司\t62177\n十四行诗\t62178\n星宸\t62179\n收容所\t62180\n智能穿戴设备\t62181\n$$\t62182\ngentlemen\t62183\n江西省林业厅\t62184\nthreading\t62185\n180330\t62186\npicsart\t62187\n5000例\t62188\n中国太极拳网\t62189\n魔兽地图编辑器\t62190\n胃癌\t62191\n25次\t62192\n即\t62193\n
手流\t62194\n172&\t62195\n别克_\t62196\n科大附中\t62197\n杨采薇\t62198\n600320\t62199\n罗岗\t62200\n香港银行\t62201\n整牙\t62202\n机费\t62203\nPSP\t62204\nrus\t62205\n亭长\t62206\nEvai\t62207\nC2C\t62208\nosgEarth\t62209\n头孢菌\t62210\nAyu\t62211\n葛根木瓜\t62212\n青岛长途汽车站\t62213\n金雪\t62214\n金投股票网\t62215\n三国霸业\t62216\n40题\t62217\n林檎\t62218\nWAYBIG\t62219\n环保类\t62220\nmafia\t62221\n林玉英\t62222\n酱料\t62223\n许敏\t62224\n司法考试民事诉讼法\t62225\nxmu\t62226\n紫线\t62227\n1081\t62228\n中国质量协会\t62229\n雪诺酮\t62230\n月湖公园\t62231\n我独自生活\t62232\npifu\t62233\ncollect\t62234\n长大\t62235\n俞洪敏\t62236\n心证\t62237\n次郎\t62238\n门徒\t62239\n中国联合网络通信集团有限公司\t62240\n冯雪峰\t62241\n磷酸三钠\t62242\n千思网\t62243\n惊叫\t62244\n淮南东站\t62245\n二本线\t62246\n红翡\t62247\n撼路\t62248\n和平县\t62249\n蒲地蓝消炎片\t62250\n何家驹\t62251\nfdi\t62252\n台柜\t62253\nVULTR\t62254\n史考特\t62255\n蓉蓉\t62256\n艾玛罗伯茨\t62257\n阿德莱德大学\t62258\n玉兰大剧院\t62259\n苏小明\t62260\n报歌\t62261\n少代\t62262\n代代相传\t62263\n陕西建工集团\t62264\nbecame\t62265\n冷\t62266\n临床医学生\t62267\nTotal\t62268\n地幔\t62269\n小额信贷公司\t62270\nNFV\t62271\n太乙仙魔录之灵飞纪\t62272\nlegend\t62273\n猛涨期\t62274\n5000毫安\t62275\n孟然\t62276\n确实\t62277\n长沙婚纱摄影\t62278\n风花醉乐文_风花醉笔趣阁_风花醉顶点_风花醉\t62279\n擎天\t62280\n处女男\t62281\n大西南剿匪记\t62282\n克斯\t62283\nitest\t62284\n版子\t62285\n活塞杆\t62286\n皮样\t62287\n巴里\t62288\n翅\t62289\n超级赛车\t62290\n寸草心\t62291\n狂野飙车8吧\t62292\nsync3\t62293\n博新\t62294\n蔡辉\t62295\n古典式\t62296\n浦项\t62297\n两条路\t62298\n烹制\t62299\n肝移植\t62300\n谐音字\t62301\nhelloword\t62302\n人教版八年级语文\t62303\n导学单\t62304\n红嘴相思鸟\t62305\n宜昌市人民政府\t62306\n流放之路传奇\t62307\n北俱芦洲\t62308\nWannaCry\t62309\n铝方通\t62310\n唐晓天\t62311\n隆华\t62312\n连机\t62313\nw7\t62314\n附生\t62315\n手撕面包\t62316\n深雪\t62317\n美女神\t62318\n恶作剧\t62319\nstrin\t62320\n劲拓股份\t62321\n白氏\t62322\n赠险\t62323\n活照\t62324\n逃出生天\t62325\n王大虎\t62326\n腰肢\t62327\n收获日2\t62328\n经济效益\t62329\n链接器\t62330\n吹管\t62331\n86西游记\t62332\nbyk\t62333\n汉竹\t62334\nArchived\t62335\nwd40\t62336\n布尔乔亚\t62337\n高开高走\t62338\n内耳\t62339\n五界\t62340\n李经纬\t62341\necel\t62342\n魔侠\t62343\n黄金树\t62344\n洗尽铅华\t62345\n手续费\t62346\n政府\t62347\n拧紧机\t62348\n梅花鹿\t62349\nfinger\t
62350\n悲悯\t62351\n棉毛裤\t62352\n威尼斯商人\t62353\nedup\t62354\n乔泽_\t62355\n踢走\t62356\n夏日大作战\t62357\n敲诈者\t62358\n华晨\t62359\n长城站\t62360\n乳糖\t62361\neq\t62362\n车浩\t62363\n美醉\t62364\n龙湖湾\t62365\n仙岩镇\t62366\n钳\t62367\n菊花香\t62368\nnntool\t62369\n开型\t62370\n机读\t62371\n亲睐\t62372\n携带型\t62373\n悬空城\t62374\n永清县\t62375\n百修网\t62376\n屈家岭\t62377\n附院\t62378\n53天天\t62379\nHD3P\t62380\nTOP8\t62381\n朝秦暮楚\t62382\n苏灿\t62383\ncemu吧\t62384\nrefactor\t62385\n小莫\t62386\n建安费\t62387\n千千岁\t62388\n北京工行\t62389\n开水泡\t62390\n安佳\t62391\n台本\t62392\n王立群读史记\t62393\n鸡窝\t62394\nsteak\t62395\n8粒\t62396\n38分\t62397\n根系\t62398\nDyson\t62399\n中航电子\t62400\n高僧\t62401\n第一部\t62402\nfortwo\t62403\n洗面\t62404\nS6/edge/edge+\t62405\nutils\t62406\n李洙赫\t62407\nono\t62408\n音火\t62409\n3~5年\t62410\n巴比伦饭店\t62411\n宽处\t62412\n天地玄黄\t62413\n斩铁\t62414\n芜湖市第二人民医院\t62415\n098\t62416\n坂本\t62417\n汀兰\t62418\n员工期权\t62419\n海世界\t62420\n一刀\t62421\n嗒\t62422\n截长\t62423\n601198\t62424\n中国城镇水网\t62425\n饶客网\t62426\n金帝集成灶\t62427\n西安汽车站\t62428\n宝楼\t62429\n鑫创\t62430\n小米note顶配\t62431\n美工笔\t62432\nrect\t62433\n第一劫\t62434\n组长\t62435\n痛诉\t62436\n统计分析软件\t62437\nT25\t62438\n旅游游记\t62439\n大连森林动物园\t62440\n惜花\t62441\n深成\t62442\n双鸭山\t62443\n信阳市区\t62444\n91手机助手\t62445\n杜桥镇\t62446\n陌上花开\t62447\n北极甜虾\t62448\n10.10\t62449\n浙江出版联合集团\t62450\n住房和城乡建设部执业资格注册中心\t62451\n工商企业管理专业\t62452\n穆如寒江\t62453\nPCA\t62454\n膨胀阀\t62455\n国家性\t62456\n狂战\t62457\n嘴巴子\t62458\n深圳民办学校\t62459\n有机大米\t62460\n看场\t62461\n中国新闻出版研究院\t62462\n星装\t62463\n白眉\t62464\n路头\t62465\nzealer\t62466\nZM\t62467\n易物\t62468\n浙商大厦\t62469\nDGC\t62470\nAnantara\t62471\n民办\t62472\n美院\t62473\n畅玩6\t62474\n2018047期\t62475\n荣耀战魂吧_\t62476\n変\t62477\n刘海儿\t62478\n10亿级\t62479\nplayer\t62480\n同德广场\t62481\nTOKEN\t62482\n0条\t62483\n白棋\t62484\n荣德基\t62485\n假冒注册商标罪\t62486\n1000条\t62487\n360P\t62488\n原振侠与卫斯理\t62489\n紫伊\t62490\n非农业\t62491\n圣王国\t62492\n掌上大学\t62493\n柜员机\t62494\ncoh\t62495\n滨江东\t62496\nonclose\t62497\nSS级\t62498\n陈秋实\t62499\n汤恩伯\t62500\n纵隔肿瘤\t62501\nlog4j.xml\t62502\nCop\t62503\ndefinio\t62504\n张裕爱斐堡\t62505\n周庄镇\t62506\
n通风柜\t62507\n折纸盒子\t62508\n亡誓\t62509\n耐打\t62510\n麦穗\t62511\n侠客风云传前传\t62512\n新意\t62513\n众目睽睽\t62514\n沙坡头\t62515\n相信爱\t62516\n.NET编程_.NET\t62517\n刘\t62518\n插画师\t62519\nFuse\t62520\n地冥\t62521\n收肌\t62522\n泰福\t62523\n李阿姨\t62524\n东福山岛\t62525\n因素分析法\t62526\n胡鹏\t62527\npurely\t62528\n湖北省商务厅\t62529\n翻瓣\t62530\n孕囊\t62531\ngibco\t62532\nMaze\t62533\n妖精森林的小不点\t62534\n刘岳\t62535\n增项\t62536\n悠仁\t62537\nMHP3\t62538\n时行\t62539\n灯\t62540\naversion\t62541\n欧博诺\t62542\n台北故宫博物馆\t62543\n炉石传\t62544\nHaskell\t62545\nEViews\t62546\n上移\t62547\n导向器\t62548\n108路\t62549\n不可遗忘\t62550\n大悲剧\t62551\n孟晓苏\t62552\n乐健\t62553\n境线\t62554\ndnfpk\t62555\n吴老二\t62556\n柴米油盐酱醋茶\t62557\npetrol\t62558\n陶慧\t62559\ncoap\t62560\nczech\t62561\n李纯\t62562\n羊台山森林公园\t62563\n简繁英\t62564\n半头\t62565\n青枣\t62566\nCiliFanhao\t62567\n_钢易网\t62568\n鬼玩人\t62569\n工程机械网\t62570\nexal\t62571\n自由军\t62572\n信义\t62573\nenter\t62574\n芝罘政府网\t62575\n蕾\t62576\nleestar\t62577\nDetermination\t62578\n印象\t62579\n正信\t62580\n承租权\t62581\n公路路\t62582\n道远\t62583\n愣\t62584\n公害\t62585\n肖文\t62586\n苯磺酸左旋氨氯地平片\t62587\n200份\t62588\n百白破疫苗\t62589\n奇米影视首页_奇米色888_奇米影院_777米奇影视_奇米第四色_奇米\t62590\n慕兆丰\t62591\n晚上十点半\t62592\n五险一金公司\t62593\n香牌\t62594\n美国人\t62595\n系统变量\t62596\n十架\t62597\n人杰\t62598\nCrude\t62599\n成员\t62600\n20国\t62601\n化形\t62602\n太原市图书馆\t62603\n放正\t62604\n臭桂鱼\t62605\n谢谢\t62606\nOurs\t62607\n批发圈\t62608\n金灿毅\t62609\n团风县\t62610\n骚花\t62611\n皮日\t62612\n玩火\t62613\nqq卡\t62614\n海德堡\t62615\nkeySet\t62616\n短时傅里叶变换\t62617\n摆杆\t62618\n绿皮书\t62619\n95306\t62620\netn\t62621\nnet-snmp\t62622\n三国志9吧_\t62623\n顺利\t62624\n猪栏\t62625\n羽毛球赛\t62626\n限牌\t62627\njennifer\t62628\n北岛知寒\t62629\nSOLAR\t62630\n名俗\t62631\n醇厚\t62632\nsequel\t62633\n井上绫子\t62634\n侯京健\t62635\n成寿寺路\t62636\n教育_亲子中心\t62637\n&#61480\t62638\nfft变换\t62639\n猎魂者\t62640\n杨柳絮\t62641\n江北水城\t62642\n精彩\t62643\n三湘都市报\t62644\n夜恋\t62645\n转运珠\t62646\ne5300\t62647\noak\t62648\n独活\t62649\n海底捞月\t62650\n流动党员\t62651\n110分\t62652\n天云山\t62653\n电波钟\t62654\n正面管教\t62655\n长电\t62656\n泸州市教育局\t62657\n规章\t62658\n7月7日\t62659\n脑积水\t62660\n苍山负
雪\t62661\n电子基\t62662\n33种\t62663\n音感\t62664\n十万位\t62665\nr2.0\t62666\n预查\t62667\n千叶珠宝\t62668\n赣县中学\t62669\n垣曲\t62670\n骇客帝国\t62671\n应届生求职网\t62672\n洋葱头历险记\t62673\nBrackets\t62674\n効\t62675\n求问\t62676\n匀变速直线运动\t62677\n行动者\t62678\n微信公众号\t62679\n马关县人民政府\t62680\nRGB值\t62681\nGPT分区\t62682\n呼吸音\t62683\n宝马1系\t62684\n王全\t62685\ndoc.doc\t62686\n黄冈市国土资源局\t62687\n舜山府\t62688\nsigh\t62689\n陆鲨\t62690\nSchedule\t62691\npublish\t62692\n哈兰\t62693\n低平\t62694\nCCAA\t62695\n巴奴毛肚火锅\t62696\n交易宝\t62697\nboost库\t62698\n瓦楞展\t62699\n芹泽\t62700\nartist\t62701\n教学片段\t62702\n三个小时\t62703\n生死钟声\t62704\nrdkafka\t62705\nconceptual\t62706\n卧式搅拌机\t62707\n金正\t62708\n化学科\t62709\n死尸\t62710\n简单点\t62711\n飞鸽传书\t62712\njzz\t62713\n廊坊市\t62714\n误传\t62715\n智通财经网\t62716\n13届\t62717\n守卫剑阁乱世枭雄\t62718\nXh\t62719\n福特嘉年华\t62720\n郭清\t62721\nHarlem\t62722\n美国参议院\t62723\n外烟\t62724\ntpep\t62725\n数控锯床\t62726\n度法\t62727\n火者\t62728\n看守者\t62729\n被打断\t62730\n巧姐\t62731\n两日\t62732\ncdn\t62733\n解译\t62734\n土地革命战争\t62735\n私域\t62736\n龙珠世界\t62737\nTomFord\t62738\n周明\t62739\n冒出\t62740\n乐此不疲\t62741\n型男\t62742\n吉客\t62743\n新吕布传\t62744\n烟斗村\t62745\n安徽省国税局\t62746\n电压型\t62747\n暴风影音\t62748\n9.0.4\t62749\n陕西都市快报\t62750\n亚当兰伯特\t62751\n当我\t62752\n起跳\t62753\n对不起,我爱你\t62754\n公开市场操作\t62755\n小儿鼻炎\t62756\n九章算术\t62757\n亳州市\t62758\n暴劫倾情\t62759\n海珀\t62760\n双筒\t62761\n底单\t62762\nSitara\t62763\n万和电气\t62764\n下集\t62765\nvesa\t62766\n大连)有限公司\t62767\n先锋论坛\t62768\n我的大学\t62769\n演出季\t62770\n对流传热系数\t62771\nHOWO\t62772\n90年代初\t62773\n低效\t62774\n苏联红军\t62775\n新丽传媒\t62776\n新风\t62777\n余火\t62778\n阿里影业\t62779\nCity\t62780\n竹林寺\t62781\n10-10\t62782\n承上启下\t62783\n革命派\t62784\nglobalData\t62785\n漳州市政府\t62786\nChannel\t62787\n动画全集高清\t62788\n黄金罗盘\t62789\nRuan\t62790\n招标代理费\t62791\n区委常委会\t62792\ncompiler\t62793\n上海威尔顿阀门有限公司\t62794\n尉健行\t62795\nAnywhere\t62796\n新春乐\t62797\n叔本华\t62798\n依偎\t62799\ndeactivate\t62800\n佐助鸣人\t62801\n关于分类推进人才评价机制改革的指导意见\t62802\n止口\t62803\n不冷不热\t62804\n纯化蛋白\t62805\n国际商学院\t62806\n自然人\t62807\n民富\t62808\n政府口\t62809\n林克\t62810\n盐城市\t62811\nm3u8格式\t62812\n陈
洁\t62813\n汉正街\t62814\nSHOW\t62815\n漳浦\t62816\n福建广电网络集团\t62817\n3txt\t62818\n平特\t62819\n跳帧\t62820\n良仓\t62821\n84家\t62822\n服装\t62823\n前海\t62824\n以南\t62825\n扁鱼\t62826\n跨职\t62827\nSQL2012\t62828\n撕裂\t62829\nZ9mini\t62830\n北京朝阳区\t62831\n中小板\t62832\n石英石\t62833\n密尔沃基\t62834\n李长春\t62835\n睡不好觉\t62836\n长城宽带\t62837\n李长荣\t62838\n5010\t62839\n海景花园\t62840\n三星S7Edge\t62841\nyaxis\t62842\n冰弓\t62843\nlibjpeg\t62844\npoin\t62845\nAdriana\t62846\n兰帕德\t62847\n王卫\t62848\n第49集\t62849\nfrankfurt\t62850\n19层\t62851\n方正公司\t62852\n侧键\t62853\n南街街道\t62854\n骚货\t62855\n刘忻\t62856\n春心荡漾\t62857\n用光\t62858\n秀居网\t62859\n萨塞克斯大学\t62860\n巅峰战\t62861\n拖家带口\t62862\n达成\t62863\nwangy\t62864\n独孤\t62865\n薄饼机\t62866\n躁\t62867\n首部曲\t62868\n天妇罗\t62869\n敷衍\t62870\n鲁迅中学\t62871\n河滨之城\t62872\n佳龙\t62873\n上学啦\t62874\n五十年\t62875\n东风标志\t62876\nhonour\t62877\nrp8.0\t62878\n圣元优博\t62879\n金店\t62880\n乙酰基\t62881\n衣篮\t62882\n54m\t62883\napend\t62884\n站台\t62885\n化龙镇\t62886\n爱课程网\t62887\n白金光\t62888\nISV\t62889\n墨水量\t62890\n大哥们\t62891\n性生理\t62892\n华浔品味装饰\t62893\n水毁\t62894\n{2\t62895\n菁kids\t62896\n嘉苑\t62897\n葵潭\t62898\n尾流\t62899\n中国汽车消费网\t62900\n女排\t62901\n博弈\t62902\n木浴桶\t62903\n元素索引值\t62904\n_天府\t62905\nEmirates\t62906\n1339\t62907\n测深仪\t62908\n55kw\t62909\n织梦CMS\t62910\n德云八队\t62911\n希咲\t62912\nTransforms\t62913\nBLOOD\t62914\ndnf法\t62915\n孙维世\t62916\n乙型肝炎病毒\t62917\n半点\t62918\n新生代\t62919\nStacking\t62920\n江苏农业科学\t62921\n50斤\t62922\n公司\t62923\nFAT32\t62924\n蓝山站\t62925\n8007045\t62926\n李星龙\t62927\n忘\t62928\n主厨\t62929\navsox\t62930\n2017年6月1日\t62931\n西游山海\t62932\n刘白羽\t62933\n红梅公园\t62934\n定序\t62935\n连带赔偿责任\t62936\n大话战国\t62937\n家庭教养\t62938\n永生花\t62939\n墓王之王之悬棺寺百度云\t62940\n接听\t62941\n商鼎路\t62942\n便门\t62943\n如龙5\t62944\npagespeed\t62945\n方琼\t62946\n傻狗\t62947\nnile\t62948\n小鸡模拟器\t62949\nHDTunePro\t62950\n大气圈\t62951\n河北机场管理集团有限公司\t62952\noppoa37m\t62953\n极品相师\t62954\n酷比魔方i7\t62955\naaai\t62956\n毋宁\t62957\n汉拿山\t62958\ndefualt\t62959\n朱常洛\t62960\n中科院动物所\t62961\n马匹\t62962\nphotoshopcs2\t62963\n差示扫描量热仪\t62964\n拼车网\t62965\n流亭街道\t62966\n广西北部湾银行\
t62967\n尖锋\t62968\n中交建\t62969\n掉进\t62970\n巨化集团公司\t62971\n欲言\t62972\n拍卖师\t62973\n搭界\t62974\n深圳市孙逸仙心血管医院\t62975\n最终幻想6\t62976\n嬌寵一世\t62977\n雪原\t62978\n和平门\t62979\n任贤\t62980\n欧星\t62981\n归田\t62982\n首只\t62983\nplants\t62984\n粤民\t62985\n2017\t62986\n秋雨露\t62987\n李珊珊\t62988\n久治\t62989\n自贡市第一人民医院\t62990\n皮雷\t62991\n达利欧\t62992\n凯洛伦\t62993\n杏林街道\t62994\n银河护卫队\t62995\n酷比魔方\t62996\n祝寿歌\t62997\n台达\t62998\n项目经理部\t62999\n75欧姆\t63000\n佛滔\t63001\n卸磨\t63002\n乐清市人民政府\t63003\n嗨到\t63004\nImageIO\t63005\n三氧化二铁\t63006\nVivi\t63007\n海口日报数字报\t63008\n斜襟\t63009\nOperations\t63010\n九阴真经ol\t63011\nartof\t63012\n点方\t63013\n学术论\t63014\n草莓音乐节\t63015\n上海酒店\t63016\n华图教育四川分校\t63017\n已婚女\t63018\n海南海\t63019\n前肢\t63020\n2句\t63021\n同窗会\t63022\n跳跳蛙\t63023\ngawk\t63024\n抛光蜡\t63025\n塔盘\t63026\n真男\t63027\n摩贝\t63028\n90年代末\t63029\nAPA格式\t63030\n中月\t63031\n阿尔法蛋\t63032\n彩云之南\t63033\nBCompare\t63034\n叁\t63035\nY520\t63036\n吴欣鸿\t63037\n命理师\t63038\n水根\t63039\n明火\t63040\nteachers\t63041\n旋光度\t63042\n提花布\t63043\n024\t63044\n丁小雨\t63045\n骑兵团\t63046\n重庆医科大学附属永川医院\t63047\n8公里\t63048\n双主\t63049\n邓紫琪\t63050\n永和县\t63051\n收放\t63052\n享声\t63053\nVicious\t63054\n马哲\t63055\nairports\t63056\n被捕获\t63057\n昌吉学院\t63058\nmscomm\t63059\n森工\t63060\n酿豆腐\t63061\n产业工人\t63062\n物业维修基金\t63063\n69分\t63064\n梦工场\t63065\n改篇\t63066\nutorrent\t63067\n厦门酒店\t63068\n国民技术\t63069\n电子政务综合门户网\t63070\n空关\t63071\n南屏山\t63072\npanel\t63073\n并发编程网\t63074\n谜画\t63075\n50kw\t63076\n山东警察学院\t63077\n番剧\t63078\nto在线翻译\t63079\n丰禾\t63080\n女优\t63081\n卡卡礼品网\t63082\nGrowingIO\t63083\n论见\t63084\n心狠手辣\t63085\n3010\t63086\n悍将\t63087\n循序渐进\t63088\n锦绣东路\t63089\n羽蛇神\t63090\ns档\t63091\n邪恶漫画_邪恶少女漫画无翼鸟_邪恶漫画大全\t63092\n藏风\t63093\nAT\t63094\n200000元\t63095\n中国石化广东石油分公司\t63096\n4G+64G\t63097\nIntersect\t63098\n11888\t63099\nGAMBIT\t63100\n罔顾\t63101\nCustomization\t63102\n老爷子\t63103\n兼备\t63104\n政令\t63105\nhyl\t63106\n欢乐颂3\t63107\n毕业论文中期检查表\t63108\n家苑\t63109\ninventory\t63110\n湿陷性黄土\t63111\n大洪山\t63112\n大盐城\t63113\n猫哥\t63114\n痔疮栓\t63115\n小米4s\t63116\n剑南大道\t63117\n柞蚕\t63118\nnoreply\t6311
9\n期货日报\t63120\nSHARP\t63121\n颞下颌关节\t63122\n纤箱\t63123\n海达股份\t63124\n无成\t63125\n坂道\t63126\n罗翔\t63127\n三星W2018\t63128\n桔灯\t63129\n打围\t63130\nlen\t63131\nidu\t63132\n潭溪山\t63133\n海福星\t63134\n梁佩玲\t63135\n草虫\t63136\n范德华力\t63137\nwsc\t63138\n博尔塔拉州\t63139\n泥娃娃\t63140\n终极街霸4\t63141\n1643\t63142\n三氯氢硅\t63143\n吉林省水利厅\t63144\n嘲天宫\t63145\n删档测试\t63146\n学懂\t63147\n种族主义者\t63148\n舒氏\t63149\n民宗委\t63150\nFranais\t63151\nw280bt\t63152\n农民网\t63153\n学英语\t63154\n耐腐\t63155\n宸\t63156\ndeadly\t63157\n配\t63158\n巴登\t63159\n极左\t63160\n陈春生\t63161\n末影人\t63162\n陆国富\t63163\n司马翎\t63164\nHotels.com好订网\t63165\n检察日报\t63166\n飞鱼星路由器\t63167\n元宝e家\t63168\n藤原豆腐店\t63169\n氖灯\t63170\n毒牙\t63171\nRays\t63172\n湖南省军区\t63173\n断桥铝合金\t63174\n丰臣秀吉\t63175\n居住区\t63176\n杯套\t63177\n道姑\t63178\nWitcher\t63179\n特点\t63180\n硝铵\t63181\nmaya吧_\t63182\namong\t63183\n银线\t63184\n鲁王\t63185\n万里挑\t63186\n何去何从\t63187\n卖车\t63188\n琪琪梦\t63189\n香港会展中心\t63190\n合作协议书\t63191\n阵法\t63192\n赖特\t63193\n南涧县\t63194\n锡剧\t63195\n丁草胺\t63196\nArraylist\t63197\n阿巴\t63198\n文件\t63199\n项目管理\t63200\nnexon\t63201\n保卫萝卜2\t63202\nschtasks\t63203\nista\t63204\n天道无限道武者路\t63205\n钨精矿\t63206\nTweaks\t63207\napplemusic\t63208\nPyplot\t63209\n郭勇\t63210\n懂\t63211\nRamsay\t63212\nDiploma\t63213\n等高线\t63214\n旷野\t63215\nanniversary\t63216\n竖页\t63217\n宇信\t63218\n春苏\t63219\n展开剂\t63220\n时间长\t63221\n中控指纹考勤机\t63222\n晋祠\t63223\n别听\t63224\n分动箱\t63225\n天地人和\t63226\n空气调节器\t63227\n温良\t63228\n过路车\t63229\n舍身击\t63230\n电镀层\t63231\n6批\t63232\n菜瓜\t63233\n卵用\t63234\n星创\t63235\n_洋县\t63236\nヤリ\t63237\n电子烟烟油\t63238\n36843212\t63239\n定兴县人民政府\t63240\nKun\t63241\n70位\t63242\n胶东半岛\t63243\n量块\t63244\n4.8.1\t63245\n尾架\t63246\nthu\t63247\nDEVIL\t63248\n流水表\t63249\n笑称\t63250\n厦门园博园\t63251\nClose\t63252\nNote\t63253\n卫生间隔墙\t63254\n底孔\t63255\n队徽\t63256\n无菌注射器\t63257\n梦幻西游电脑版\t63258\n云墙\t63259\n好给力\t63260\n秦始皇陵墓\t63261\n至上励合\t63262\n浩辰cad2018\t63263\n添砖加瓦\t63264\nBRAND\t63265\n至道学宫\t63266\n电子琴谱\t63267\nvlc\t63268\n24速\t63269\n会计恒等式\t63270\n植脂末\t63271\nInfant\t63272\n军机大臣\t63273\n梦塔\t63274\n河口镇\t63275\n
260万\t63276\n8033\t63277\n周小斌\t63278\n罗素\t63279\n30年后\t63280\n片子\t63281\n伟大的祖国\t63282\n第5季\t63283\n入仙\t63284\n梭梭\t63285\n3540\t63286\n松花蛋\t63287\ncommits\t63288\n丁凯乐\t63289\n相簿\t63290\n石家庄开发区\t63291\n闵行区小学\t63292\nqq名\t63293\n监狱\t63294\n凡事\t63295\n崖山\t63296\n杭州市城市建设投资集团有限公司\t63297\n世纪明德\t63298\n130度\t63299\n纳米棒\t63300\n深圳华强北派出所\t63301\n偶线\t63302\n小景\t63303\n清西陵\t63304\n舱门\t63305\n038\t63306\n男生女生\t63307\n先正达\t63308\nMissions\t63309\n夏林果\t63310\n第二十四回\t63311\n好强大\t63312\n今日上午\t63313\nAnaly\t63314\n天气学\t63315\n1858年\t63316\n大凤\t63317\n折叠式\t63318\n小六壬\t63319\n8192\t63320\npbs\t63321\nrhel-server\t63322\n美肤仪\t63323\n入域\t63324\n乔希\t63325\n立白洗衣粉\t63326\nV型\t63327\n_樱\t63328\n长江投资\t63329\n中国银行间市场\t63330\n生血宝合剂\t63331\n太早\t63332\n北京保利国际拍卖有限公司\t63333\n带钢\t63334\n百虫\t63335\n严控\t63336\nzonxin\t63337\n黑皇\t63338\n腾讯wifi管家\t63339\n注册电气工程师执业资格考试\t63340\n个例\t63341\n废墟\t63342\n这期间\t63343\nrmvb\t63344\nHobo\t63345\n高锰酸盐\t63346\n好刀\t63347\n华南理工\t63348\n毕达哥拉斯学派\t63349\nappconfig\t63350\n嘉宝集团\t63351\n娱乐界\t63352\n河南博物院\t63353\n波兹南\t63354\nbreakpoints\t63355\n撸图\t63356\n西博\t63357\n房产交易网\t63358\n热稳定性\t63359\n沪江网hujiang.com\t63360\n挖掘场\t63361\nSupplementary\t63362\n上海市气象局\t63363\n杭娇\t63364\n龟峰山\t63365\n码垛\t63366\nToddler\t63367\n南京宾馆\t63368\n应氏杯\t63369\n开到\t63370\n十二季\t63371\nJepsen\t63372\n小西悠\t63373\n2.0.8\t63374\n太原市政府\t63375\n女学生\t63376\n新大话2\t63377\n阿萨姆\t63378\n东北林业大学\t63379\n线程对象\t63380\n施予斐\t63381\n狼牙棒\t63382\n丽萍\t63383\n佣金宝\t63384\n东方丽都\t63385\n李子君\t63386\n100日元\t63387\n无心\t63388\nMac/Win7双系统\t63389\n滨康路\t63390\nAndroid5.1\t63391\nAroma\t63392\n钓椅\t63393\n镇国\t63394\n黑榜\t63395\n得利\t63396\n沈静\t63397\n户籍证明\t63398\n速转\t63399\n屋企\t63400\n3G4G\t63401\n经语\t63402\n李诗芸\t63403\n泼妇\t63404\n桔\t63405\n圆锥形\t63406\n禾城\t63407\n优达学城\t63408\n星岛日报\t63409\n三辆\t63410\neverest33\t63411\n梯度法\t63412\n侯晓春\t63413\n致享\t63414\n洗轮机\t63415\ntelex\t63416\n鲜虾\t63417\n紫\t63418\n大布\t63419\n图意网\t63420\n印江县\t63421\n名副其实\t63422\n新古典经济学\t63423\npos58\t63424\n58同城交友\t63425\n青岛农大\t63426\n装饰圈\t63427\n湍\t63428\n暴露癖\t63429\n
豫政\t63430\nHPS\t63431\n花状\t63432\n踢打\t63433\n甘肃交通厅\t63434\n用油\t63435\n20161204\t63436\n中国新歌声2\t63437\n丹化科技\t63438\n上蜡\t63439\n随风\t63440\n玻璃窑炉\t63441\n峥嵘岁月\t63442\nwindows7_Win\t63443\n武断\t63444\n务\t63445\n编织物\t63446\n锦绣田园\t63447\n17个\t63448\n北京工业大学研究生院\t63449\n特安\t63450\n新醉打金枝\t63451\n徐星\t63452\n杨希\t63453\n固铂\t63454\n推波助澜\t63455\nA股指数\t63456\n席夫碱\t63457\n内蒙古电视台\t63458\n捐\t63459\n末世女\t63460\n烈士公园\t63461\n班禅\t63462\n私家菜\t63463\n彩彩\t63464\n刘恺威\t63465\n正哥\t63466\n织金县人民政府\t63467\n阿列克谢耶维奇\t63468\nThornton\t63469\n汽车工程学院\t63470\n河钢股份\t63471\nVim\t63472\n妇女节\t63473\nsok\t63474\n8多\t63475\n智冠\t63476\n潮南区\t63477\nworn\t63478\n7mm\t63479\n发晶\t63480\n升降柱\t63481\n基元\t63482\n白石桥南\t63483\n2017年12月8日\t63484\njoiner\t63485\n东风东路\t63486\nEDA\t63487\n诈死\t63488\n平均年限法\t63489\n石家庄政府\t63490\n北京互联网地税局\t63491\n小卖铺\t63492\n宜春院\t63493\n800张\t63494\n为人先\t63495\n看热闹\t63496\n王一梅\t63497\n刑事侦查\t63498\n十三期\t63499\n二合一\t63500\n中航信托\t63501\n秦腔\t63502\n雪铁龙c4l\t63503\n认股权证\t63504\n盐城亭湖区\t63505\nCell\t63506\n柏合镇\t63507\n基金业协会\t63508\n陈道斌\t63509\ntcp6\t63510\n内购版\t63511\nRELOADED\t63512\n人参归脾丸\t63513\n信封\t63514\n如龙\t63515\n明一\t63516\n盲爱\t63517\n嫣然\t63518\n鸣子\t63519\nEMail\t63520\n3com\t63521\n_冠县\t63522\nDespicable\t63523\neditors\t63524\n姬魔恋战纪吧\t63525\n芝罘岛\t63526\n凯风网\t63527\nQQ币\t63528\nDVD驱动器\t63529\n18次\t63530\nStub\t63531\nlseek\t63532\n开放日\t63533\nprimarily\t63534\n894\t63535\n香飘飘奶茶\t63536\n批发大厦\t63537\n恶灵附身吧\t63538\n总分类账户\t63539\n温州商会\t63540\n娘道\t63541\naob\t63542\nsketchbook\t63543\n重离子\t63544\nyouni\t63545\nsmbios\t63546\n鸳鸯佩\t63547\neconomics\t63548\n沙头街道\t63549\nsuqi\t63550\n利客\t63551\n楚原\t63552\n坑边\t63553\n汤淼\t63554\n炮眼\t63555\n香港联交所\t63556\n下榻\t63557\n报道证\t63558\nAdy@Ady9.net|吉吉映画|先锋映画\t63559\n乳酸菌\t63560\n人工耳蜗\t63561\n全民科学素质行动计划纲要\t63562\n87岁\t63563\n叛逆的鲁路修\t63564\nYoshizawa\t63565\n新友\t63566\ncqu\t63567\n喷浆机\t63568\n伽蓝集团\t63569\n阿里校招\t63570\nkube-proxy\t63571\n林泽\t63572\n正溴丁烷\t63573\n嘀\t63574\n上海市洋泾中学\t63575\n吉他谱C调\t63576\n口网\t63577\n能口\t63578\n数控开料机\t63579\n750个\t63580\n焉知\t63581\nprintf\
t63582\n提效\t63583\ncoin\t63584\n一声叹息\t63585\n姚海峰\t63586\n微公\t63587\n水转印\t63588\n复用器\t63589\n周围\t63590\n鲁路修\t63591\n北极星电力论坛\t63592\n无法逃避\t63593\n海商法\t63594\n竞瑞\t63595\n赵翼\t63596\n夏月\t63597\n香月\t63598\n神效\t63599\n81192\t63600\nauckland\t63601\n莱西论坛\t63602\nnewer\t63603\n南昌市委\t63604\n鼓皮\t63605\nshenqi\t63606\n收入户\t63607\n缺色\t63608\n盐酸美金刚片\t63609\n春风十里\t63610\n兰渝\t63611\nxiaos\t63612\n山东省物价局\t63613\nFR\t63614\n工程车\t63615\norg-mode\t63616\n赵彦春\t63617\n废弃物\t63618\nGREEN\t63619\n地王大厦\t63620\n华江路\t63621\nVS2013/MFC\t63622\n港岛\t63623\n研究生处\t63624\ninns\t63625\n反派们\t63626\n汇达\t63627\n秦天\t63628\n47元\t63629\n太妃\t63630\njdf\t63631\n200名\t63632\n中国对外经济贸易信托有限公司\t63633\n篮球板\t63634\nContests\t63635\ntoy\t63636\nCIV\t63637\ndefinitely\t63638\n99SE\t63639\n中石化加油站\t63640\n吉他拨片\t63641\n属于\t63642\n倒塌\t63643\n270X\t63644\n木锤\t63645\n东山精密\t63646\n文都网校\t63647\nGlassFish\t63648\n高把\t63649\ngdx\t63650\n5.53\t63651\n爱性\t63652\n运营权\t63653\n妈妈式\t63654\n五虎将后传3.04神xs\t63655\n弟子规公益网\t63656\n卡卡\t63657\n扶摇皇后\t63658\n视传\t63659\n27种\t63660\n吊起\t63661\n刘文涛\t63662\n二模卷\t63663\n送检\t63664\n奇丽\t63665\n西山村\t63666\njdb\t63667\n40后\t63668\n钢筋网片\t63669\n梦幻五开网\t63670\n无锡虹桥医院\t63671\n秘方\t63672\n东都\t63673\n再吃\t63674\n属性块\t63675\n肱\t63676\n2011年下半年\t63677\n外包装箱\t63678\n凤起东路\t63679\nBrand\t63680\n衣角\t63681\n现代爱情故事\t63682\n狮子山\t63683\n鱼石脂软膏\t63684\n概要设计\t63685\n特肖\t63686\nAgreement\t63687\n中国共产党党\t63688\n中学校\t63689\n荣美\t63690\n刘晓峰\t63691\n12批\t63692\n起吊\t63693\n徐红\t63694\n3x+2\t63695\n东南大学电气工程学院\t63696\n钣喷\t63697\nmsvcr110.dll\t63698\n王薇薇\t63699\n书法展\t63700\nv1.1_游迅网\t63701\n刮板输送机\t63702\n葛花\t63703\n巴林站\t63704\n老檀\t63705\n丽丝\t63706\n家字\t63707\nindex\t63708\n万达店\t63709\n和面\t63710\n词素\t63711\n转筒\t63712\n10000\t63713\n传呼\t63714\nclimax\t63715\n康欣新材\t63716\n刚柔\t63717\n飞猫云\t63718\n帕沙特\t63719\n仲景\t63720\nFreddie\t63721\n高级工\t63722\n红外分光光度计\t63723\n批次号\t63724\n贺知章\t63725\n赋值法\t63726\n土地流转网\t63727\n感念\t63728\n牧群\t63729\n云南省大理市人民政府\t63730\n观月雏\t63731\n电子导游证\t63732\n天国拯救游戏\t63733\n沌口\t63734\n克己复礼\t63735\n卓异\t63736\n宏城\t63737\n
18年前\t63738\nCHEMICAL\t63739\n剂\t63740\n液压扳手\t63741\nCIRCLE\t63742\n陈蕾\t63743\n去污\t63744\n炸鸡腿\t63745\n第64章\t63746\nNBA中文网\t63747\n整风运动\t63748\nloi\t63749\n摩卡\t63750\n圆周角\t63751\n说自话\t63752\n尾钉\t63753\n商事登记制度\t63754\nmonero\t63755\nbait\t63756\nCOS\t63757\nM10H\t63758\n消瘦\t63759\n黄鸟\t63760\n拉布\t63761\n隔离期\t63762\ngranny\t63763\n南洲路\t63764\n瘦小\t63765\n济南恒隆广场\t63766\n黑龙江生物科技职业学院\t63767\n085212\t63768\nLVDS\t63769\n不公\t63770\n梨花颂\t63771\n害物\t63772\n打铁花\t63773\n吉祥经\t63774\n福州职业技术学院\t63775\n電話\t63776\n盲僧\t63777\n包膜\t63778\n烘焙展\t63779\nvc片\t63780\n湛江日报\t63781\n汪嘉伟\t63782\n16m\t63783\n皋\t63784\n定西市\t63785\n名源\t63786\n拍机\t63787\n营山县人民政府\t63788\n杨氏太极\t63789\n编辑块\t63790\n长沙\t63791\nSpock\t63792\n3087\t63793\n第三十四期\t63794\n附加品\t63795\n2.60\t63796\nb+1\t63797\n三班制\t63798\n咪咕善跑\t63799\n学道\t63800\n单丛茶\t63801\n十几万\t63802\n微号\t63803\n标数\t63804\n无忧幼儿园网\t63805\n西南交大\t63806\n雷火丰\t63807\n富侨\t63808\n床衣\t63809\n篮球员\t63810\nfinetune\t63811\nI.O.I\t63812\n视听发烧网\t63813\nHiDPI\t63814\n雷曼传奇\t63815\n马鞍山市人民政府\t63816\n吉林省国家税务局\t63817\n学习部\t63818\ndefeated\t63819\n12306吧_\t63820\n2015年1月份\t63821\nFriedrich\t63822\n2^8\t63823\n3500公里\t63824\nhotpot\t63825\n广寒\t63826\n南方地区\t63827\n虎鹤\t63828\n1万吨\t63829\n23座\t63830\nGoods\t63831\n铁标\t63832\npingguo\t63833\n液氯\t63834\n文件件\t63835\n中国水产信息网\t63836\n古井集团\t63837\n金色家园\t63838\n虞山镇\t63839\n大侠霍元甲\t63840\n搜刮\t63841\nInvestment\t63842\n夜车\t63843\n腰盘\t63844\n小黑哥\t63845\n颗粒感\t63846\n招行银联\t63847\n微乐麻将\t63848\n逊色\t63849\n谜面\t63850\n7100u\t63851\n扬州大学\t63852\n海虹\t63853\n碳酸钙d3颗粒\t63854\nWVR300\t63855\n固态\t63856\n艾萨克·阿西莫夫\t63857\n安家落户\t63858\n员额\t63859\n去秀\t63860\n古尔德\t63861\n骑行者\t63862\n超级机器人大战og2\t63863\nQP\t63864\n84岁\t63865\n1977年\t63866\n当世\t63867\n会员店\t63868\n无一物\t63869\n爱酷网\t63870\n5470\t63871\n中盛\t63872\n分红型\t63873\n我本人\t63874\nSSM\t63875\n名古屋大学\t63876\n插件集\t63877\n王泽境\t63878\nSnoop\t63879\n哈利波特与死亡圣器\t63880\n壹加壹\t63881\n72G\t63882\narry\t63883\n石牛\t63884\n王新华\t63885\nm16\t63886\n盘子\t63887\n回跳\t63888\n生意场\t63889\n山车\t63890\n武道神尊\t63891\n粮食作物\t63892\nv12.1\t6389
3\n台独\t63894\n宣讲员\t63895\n2345\t63896\n目视\t63897\n高配版\t63898\n小元素\t63899\n河北博物院\t63900\n维基解密\t63901\n五菱奔驰e级\t63902\n榴莲壳\t63903\nPak\t63904\n军门\t63905\n碳酸氢钙\t63906\n强效\t63907\n莒县\t63908\n.net框架\t63909\n婚龄\t63910\n雨点\t63911\n_快递网\t63912\n黄埔花园\t63913\n第十一期\t63914\n白银案\t63915\n心脏射频消融术\t63916\n剪辑器\t63917\n赵志敬\t63918\n酷睿i7\t63919\n强险保单\t63920\n负债表\t63921\n龙盘\t63922\n撸尔山\t63923\n再论\t63924\n锡东\t63925\n驻着\t63926\n丽岙街道\t63927\n193个\t63928\n华企\t63929\n顺义新城\t63930\n狮蝎\t63931\n广州汽车集团股份有限公司\t63932\n营养液\t63933\n保加利亚\t63934\n炮图\t63935\n父心\t63936\n混进\t63937\n一亩田\t63938\nDissertation\t63939\nCommVault\t63940\n看中\t63941\n顶置\t63942\nerc20\t63943\nPinnacle\t63944\n特房\t63945\n佐佑\t63946\n11年后\t63947\n洪秀全\t63948\n三角函数公式大全\t63949\n匹诺曹\t63950\n天赋异禀\t63951\ngot7\t63952\n侠盗飞车罪恶都市中文版_侠盗飞车罪恶都市\t63953\n拟声词\t63954\nsmbus\t63955\n三支一扶考试\t63956\np2000\t63957\n黑子\t63958\n撕开\t63959\n秋水仙碱\t63960\n父女\t63961\n王者荣耀东皇太一\t63962\n老坛酸菜\t63963\n阿姆斯壮\t63964\n背景纸\t63965\n蛇骨\t63966\n中国水电基础局有限公司\t63967\n词类\t63968\nUIPickerView\t63969\n巴洛克\t63970\n沙拉汁\t63971\n第十二节\t63972\n分尸\t63973\nABB式\t63974\n胡祖六\t63975\n重罪\t63976\n张冰\t63977\n新西湖\t63978\n琴台路\t63979\n2018-01-17\t63980\nphiladelphia\t63981\n霍比特人2\t63982\n共建\t63983\n佩服\t63984\n5200lx\t63985\n李丽丽\t63986\n120HZ\t63987\nregisters\t63988\n丹巴\t63989\n于人\t63990\nTargets\t63991\nceci\t63992\n酷博\t63993\n鼠害\t63994\n庄尼\t63995\n南五台\t63996\n盗采\t63997\n800吨\t63998\nsmw\t63999\n辽宁代表团\t64000\n阿翁\t64001\n许玮甯\t64002\n胡埭镇\t64003\n全棉时代\t64004\nsnh48\t64005\n猎狐者\t64006\n音语\t64007\nonresize\t64008\n庆云\t64009\napparatus\t64010\n骑行卡\t64011\n驻波\t64012\n凤凰系统吧\t64013\n房奴\t64014\nstab\t64015\n357游戏网\t64016\n安卓6\t64017\n减载\t64018\nelect\t64019\n鬼冢\t64020\nOs\t64021\n_墓王之王悬棺寺\t64022\n援交群\t64023\nChloro\t64024\n武汉光谷职业学院\t64025\n4.20\t64026\n港铁\t64027\n铝壳\t64028\n夜旅\t64029\n甜甜圈\t64030\n马尔可夫链\t64031\n新疆法院\t64032\n第一百零二章\t64033\n淘系\t64034\n超额准备金\t64035\n四川政务服务网\t64036\n工期\t64037\n永不妥协\t64038\n隆力奇\t64039\n谋事\t64040\n王星记\t64041\n19000元\t64042\n罗文\t64043\n疯了疯了\t64044\n破天\t64045\nwustl\t64046\n之举\t640
47\n双井街道\t64048\n百盈\t64049\n车俊\t64050\nAPP软件破解版\t64051\n白龙潭\t64052\n289手游网\t64053\n窖\t64054\n必确\t64055\n渝\t64056\nHuman\t64057\n天候\t64058\n渥丹\t64059\n一字马\t64060\n重庆市科学技术协会\t64061\n杀人回忆\t64062\n减淡\t64063\n相生\t64064\nTelecom\t64065\n整地\t64066\nDemons\t64067\ninout\t64068\n基本放大电路\t64069\nSweeper\t64070\n硬质合金刀片\t64071\n牛展\t64072\ndowhile\t64073\n张璐\t64074\n久等\t64075\n新宇\t64076\n心人\t64077\nwiki百科\t64078\n综艺节\t64079\nRemix\t64080\n龌龊\t64081\n熏肉\t64082\n商务参赞\t64083\n檀木\t64084\n长安路\t64085\n没熟\t64086\n网络书店\t64087\n中南大学材料科学与工程学院\t64088\n星集团\t64089\n捶胸\t64090\n王战\t64091\n103万\t64092\n服务器类\t64093\nkali\t64094\n张一山\t64095\ngetline函数\t64096\n国务\t64097\n唐三十六\t64098\nYiiFramework\t64099\n夹持\t64100\n大学生就业信息网\t64101\n教学方法\t64102\n徐冲\t64103\n交叉熵损失函数\t64104\n积石山\t64105\n76亿\t64106\n深圳第二高级技工学校\t64107\nwhendream\t64108\n一代唐\t64109\n赌圣\t64110\n诺基亚n9\t64111\n宁波人大\t64112\nHS编码查询\t64113\n咕咚咕咚\t64114\n税屋网\t64115\n热映\t64116\n目击证人\t64117\n不老\t64118\n吉列\t64119\n银行类\t64120\nlibary\t64121\n布尔型\t64122\n说明书\t64123\nti8\t64124\n液压冲床\t64125\n圣诞歌\t64126\n信息系统监理师考试\t64127\nnorthgard\t64128\nzoc\t64129\nGray\t64130\n存进\t64131\nensight\t64132\n沙格列汀片\t64133\nmetronic\t64134\nV3Max\t64135\n身影\t64136\n烫银\t64137\n附\t64138\n小早川\t64139\n宫颈炎\t64140\n琴人\t64141\ng2a\t64142\n级片\t64143\npath\t64144\n080\t64145\n123.Net\t64146\n鲷\t64147\n黑颜\t64148\n指南\t64149\n围圈\t64150\n木姐\t64151\n九点钟\t64152\nLis\t64153\n穿山\t64154\n全员营销\t64155\n维多利亚2\t64156\n解压缩\t64157\n贵妇级\t64158\nLST\t64159\n地铁11号线\t64160\n曲奇饼干\t64161\nPTFX\t64162\n碧\t64163\n昕动异世界\t64164\notis\t64165\n2018年03月31日\t64166\n招生简\t64167\n顺序阀\t64168\n棱光\t64169\nchek\t64170\n小珍\t64171\n李永刚\t64172\n头菜\t64173\n4399奥义联盟\t64174\ncx\t64175\n梅超风\t64176\n大吴\t64177\n内存储器\t64178\n南飞鸿十年城\t64179\nNGA178\t64180\n启辰蛊\t64181\n房梁\t64182\n19V\t64183\n受奖\t64184\nimbd\t64185\n桥中\t64186\n胸腺肽肠溶片\t64187\n36v\t64188\n旋翼\t64189\n元碩行銷\t64190\n机热\t64191\n东皇\t64192\n社杯\t64193\n爆弹\t64194\n病科\t64195\n递推公式\t64196\nGluster\t64197\nplagiarism\t64198\n松懈\t64199\n保国寺\t64200\n爱达乐\t64201\nmultiline\t64202\
nSec\t64203\n操场上\t64204\n内检\t64205\n周希\t64206\n维他命\t64207\n無料動画\t64208\n道高一尺\t64209\n陆文夫\t64210\n窨井\t64211\n韭菜鸡蛋饺子\t64212\n达梦数据库\t64213\n跟党走\t64214\nucas\t64215\n防滑贴\t64216\n投资家网\t64217\n柳堡\t64218\naj\t64219\n离调\t64220\n冈村\t64221\ndrawline\t64222\n9.6分\t64223\n柳州路\t64224\n遍身\t64225\n华山西峰\t64226\nseconds\t64227\n华锋\t64228\n浙报集团\t64229\n永新股份\t64230\n短发型\t64231\n樱桃苗\t64232\n猎人\t64233\n文山学院\t64234\n中国驻英国大使馆教育处\t64235\n109路\t64236\n信阳\t64237\n收分\t64238\n碧业生\t64239\n一尾\t64240\n展网\t64241\nurllib3\t64242\n鸟山明\t64243\n单发\t64244\nCHM\t64245\n教育部办公厅\t64246\n海淀区小学\t64247\n模糊集\t64248\nipairs\t64249\n硫化铜\t64250\niPcfun\t64251\nBetween\t64252\n拱券\t64253\nHyperdunk\t64254\nNotre\t64255\n因特网\t64256\n荣耀magic\t64257\n排外\t64258\nBark\t64259\n凤凰教育网\t64260\n玩堂\t64261\nMole\t64262\n区政府\t64263\n44.5\t64264\n停息\t64265\n细微处\t64266\nvel\t64267\n吴建伟\t64268\neuler\t64269\n盛运\t64270\njs变量值\t64271\nSHINY\t64272\n卡蜜尔\t64273\n丹尼尔·笛福\t64274\n比对\t64275\n17届\t64276\n深圳交委\t64277\n祝酒\t64278\nroses\t64279\nUAV\t64280\n志远\t64281\n独乐寺\t64282\n莞尔wr\t64283\n三十六个\t64284\n超疏\t64285\njvisualvm\t64286\n上汽大众汽车有限公司\t64287\n桐昆\t64288\n视差\t64289\n林震\t64290\nseverance\t64291\n菅野亜梨沙\t64292\n影依\t64293\n延边医院\t64294\n创客贴\t64295\n吉利区\t64296\n基板\t64297\n对得起\t64298\n255.255.254.0\t64299\nexecuting\t64300\n微微一笑很倾城\t64301\n沪江开心词场\t64302\n站房\t64303\n铂涛会\t64304\n300克\t64305\n阻工\t64306\n仙灵岛\t64307\n分手快乐\t64308\n石位\t64309\n图书架\t64310\n检算\t64311\n监控员\t64312\n狭隘\t64313\n日记簿\t64314\n佳斯特\t64315\n大指\t64316\n微公交\t64317\n22.1\t64318\n国际标\t64319\n邮件服务器\t64320\n奇石展\t64321\n绝地求生新图\t64322\n人的本性\t64323\n过滤\t64324\n白沙古镇\t64325\n千夜\t64326\n电炸炉\t64327\n巣\t64328\n永安社区\t64329\n唇印\t64330\nSNP\t64331\n敌法师\t64332\n四月\t64333\n利智\t64334\nappoint\t64335\n0402\t64336\n好风\t64337\n卓亚\t64338\n东庄镇\t64339\n神弓\t64340\nxmc\t64341\n哭笑不得\t64342\n公仪\t64343\n百度离线宝\t64344\n硬盘序列号\t64345\n刨\t64346\n性爱网\t64347\n快乐的人\t64348\n冬阴功汤\t64349\n软件费\t64350\nsgs\t64351\nregressor\t64352\n重谢\t64353\nbastion\t64354\n0a\t64355\n矮墙\t64356\n林志玲\t64357\naddrinfo\t64358\n斗室\t64359\n担保
物权\t64360\n羲\t64361\n绿地国际金融城\t64362\n燕兰熹\t64363\n水盖\t64364\nWasher\t64365\n文字\t64366\n掌上财经\t64367\n鞍山市立山区\t64368\n苏州新城\t64369\nSL410\t64370\n优先级\t64371\n酸黄瓜\t64372\nhonored\t64373\n麦趣尔\t64374\nseo公司\t64375\nflamingo\t64376\n复婚\t64377\n日制\t64378\n掠食\t64379\ngo-ethereum\t64380\ntwincat3\t64381\n痴缠\t64382\n电脑分屏\t64383\n3555\t64384\nAffordable\t64385\n看点网\t64386\neviews8\t64387\n湖心亭看雪客\t64388\nv1.0_5577我机网\t64389\n现代家电网\t64390\n克鲁泽\t64391\nAmericano\t64392\n中联\t64393\n海南建省\t64394\n大连庄河\t64395\n坦坦荡荡\t64396\n防爆胎\t64397\n项\t64398\n品牌定位\t64399\n吴虹飞\t64400\n浮山中学\t64401\n平光\t64402\n赵元任\t64403\n碳氮\t64404\nJJ\t64405\n驭兽师\t64406\ngateone\t64407\n无修正\t64408\n南庄镇\t64409\n新建区人民政府\t64410\n冯仰妍\t64411\nMyFonts\t64412\n堂姐\t64413\npristine\t64414\n260g\t64415\n黑山\t64416\nRaped\t64417\nmoni\t64418\n2i4\t64419\nsmote\t64420\nlsqnonlin\t64421\nlenet-5\t64422\n弗戈工业在线\t64423\n名门翠园\t64424\nPaid\t64425\n兰州日报数字报\t64426\n信长之野望:大志\t64427\n三元催化剂\t64428\n不欺\t64429\n石墨管\t64430\n蔗\t64431\n信息港\t64432\n二元一次函数\t64433\ntuner\t64434\n出游\t64435\nDFF\t64436\n绯色\t64437\n张佳乐\t64438\n科尔\t64439\n余沧海\t64440\n3213\t64441\n坚果G1\t64442\n衣物\t64443\n虐心小说\t64444\n阳文\t64445\n增透\t64446\n华图教育\t64447\n烟苗\t64448\n基努·里维斯\t64449\n细支\t64450\nAristo\t64451\n纳米金\t64452\n牙条\t64453\nqwertyuiop\t64454\n霍旋\t64455\n怪物猎人Online-MHO\t64456\n145号\t64457\nRevo\t64458\n柳永\t64459\n飞机\t64460\n30000公里\t64461\n公示栏\t64462\n一起买\t64463\n二十九\t64464\n天猫精\t64465\n专业技术职务水平能力测试\t64466\n哈罗单车\t64467\n九龙半岛\t64468\n橘灯\t64469\nhg8145c\t64470\n支气管痉挛\t64471\n期贷\t64472\n近一个月\t64473\n隐血\t64474\n20150908\t64475\neig函数\t64476\n国学\t64477\nDense\t64478\n赵国庆\t64479\nbie\t64480\n95斤\t64481\n控制性详细规划通则\t64482\n占尽\t64483\nエ\t64484\n宝山街道\t64485\n第四十三章\t64486\n易付宝\t64487\n范瑞娟\t64488\nstandard\t64489\n北大清华\t64490\n中国科学院沈阳自动化研究所\t64491\nhurricane\t64492\n丹东市\t64493\nImportance\t64494\nopenai\t64495\noto\t64496\n超儿\t64497\n空气检测仪\t64498\n720ml\t64499\n氏\t64500\n晦涩\t64501\n民工\t64502\n第十九章\t64503\n札\t64504\n小名\t64505\n谷露\t64506\n氰化氢\t64507\nSP6\t64508\n单应矩阵\t64509\nAV女神\t6451
0\n爸爸回来了\t64511\n檀越\t64512\n键灯\t64513\n音浪\t64514\n荣耀卡\t64515\n离退\t64516\n兽类\t64517\n22.5\t64518\n喜盈门范城\t64519\n90平米\t64520\nNigerian\t64521\n植物群落\t64522\n剑网3大师赛\t64523\n大连站\t64524\n1682亿\t64525\n华海电脑网\t64526\n投资项\t64527\n梅列\t64528\nConfigure\t64529\n盛隆\t64530\n白本\t64531\n童梦无忧论\t64532\n陶庵梦忆\t64533\nfsf\t64534\n东环路\t64535\n聚乙烯\t64536\n慎微\t64537\n高一定\t64538\n喜来登酒店\t64539\n抽头\t64540\n扶不扶\t64541\nLOVE616\t64542\n朱元思书\t64543\n包工包料\t64544\n土土\t64545\n天涯整理版\t64546\n取样员\t64547\n泡机\t64548\n中恒集团\t64549\n柏浪涛\t64550\nMemorial\t64551\n发现_比特网\t64552\n恩替\t64553\n一张脸\t64554\n零_\t64555\nMSN\t64556\n口里\t64557\n战袍\t64558\n主心\t64559\n硝石\t64560\n长条状\t64561\n税务编码\t64562\n叶赛宁\t64563\n城崎美嘉\t64564\nomap\t64565\n风速计\t64566\n反动\t64567\n个税计算器2017\t64568\n姜汤\t64569\n7X\t64570\n顽皮\t64571\n灭运我的小人国\t64572\nhydration\t64573\n绿佳\t64574\nR/3\t64575\n五十公里\t64576\n三肽\t64577\ncops\t64578\n二机\t64579\n凯里市\t64580\n平沙岛\t64581\n外族\t64582\n一览无余\t64583\nGNZ\t64584\n挪动\t64585\nskim\t64586\n多用途\t64587\n线性规划\t64588\n草庐\t64589\nIOException\t64590\npermanent\t64591\n口袋妖怪绿宝石\t64592\n黄埔大桥\t64593\n8000\t64594\n尔库番号大全-2col\t64595\n三生石上一滴泪\t64596\niglol吧_\t64597\n王保长\t64598\n翡翠恋人\t64599\n金陵城\t64600\n黑骨藤\t64601\n张耀辉\t64602\n永诀\t64603\n新能源汽车公司\t64604\nticwatch\t64605\n前4月\t64606\n打虎山路第一小学\t64607\n石基镇\t64608\nAquos\t64609\n吉姆尼论坛_汽车之家论坛\t64610\n辖区\t64611\n人事权\t64612\n2002届\t64613\nmetering\t64614\nkafaka\t64615\neshop\t64616\n芝麻灰\t64617\nise\t64618\n水解物\t64619\n象棋谱\t64620\n弹幕站\t64621\n直减\t64622\n一个28岁\t64623\n梓萱\t64624\n善举\t64625\n扇门\t64626\n处理类\t64627\n失能\t64628\nDisruptor\t64629\n米阳\t64630\n强散\t64631\n氯丙嗪\t64632\n来去自如\t64633\n娱乐业\t64634\n4500米\t64635\n爱贝亲子\t64636\n张业\t64637\n佩尔\t64638\n东尚\t64639\n望洋兴叹\t64640\n狭路相逢勇者胜\t64641\n福州电视台\t64642\n烈车\t64643\n迫嫁\t64644\n中华人民共和国民办教育促进法\t64645\n试客联盟\t64646\n贴有\t64647\n行权日\t64648\nitnues\t64649\n郭敏\t64650\n私营有限责任公司\t64651\n冰河世纪\t64652\n维多利亚一号\t64653\n名分\t64654\n西航港\t64655\n涂山苏苏\t64656\n三十多个\t64657\nlwli\t64658\n哈利贝瑞\t64659\n迈腾b8\t64660\n正树\t64661\n那一次\t64662\n海子山\t64663\nParagon\t64664
\n施工面积\t64665\n蹲坐\t64666\nrunnin\t64667\n打马\t64668\n鹭湖\t64669\n欠发达地区\t64670\n细度模数\t64671\n国内\t64672\n堇色\t64673\n中国城市轨道交通协会\t64674\n菲林片\t64675\n扫\t64676\n愤懑\t64677\n德日\t64678\nvlan\t64679\n护士\t64680\n成龙历险记\t64681\n几副\t64682\n街访\t64683\n调压阀\t64684\n五洲大道\t64685\nDrops\t64686\nStruct\t64687\n副本集\t64688\n热体\t64689\n真巧\t64690\n张利\t64691\n神行者\t64692\n膳魔师\t64693\n力信\t64694\n江苏省经信委\t64695\n腊味\t64696\n费家村\t64697\n俄\t64698\n9301\t64699\n松坪村\t64700\n经研院\t64701\n逼格\t64702\n长江实业\t64703\n嵩明\t64704\n圆明新园\t64705\n真皮包\t64706\n杆径\t64707\n北峰\t64708\n主动性\t64709\n加亮\t64710\n相比之下\t64711\n套住\t64712\n数千万A轮\t64713\n开始菜单\t64714\nSCOTT\t64715\n马路牙子\t64716\n新华丝路网\t64717\nReLIFE\t64718\n打饱嗝\t64719\nMontage\t64720\nonepiece\t64721\n满v版\t64722\n银联卡\t64723\n2016,2017年\t64724\n1.4art\t64725\n凿岩\t64726\n急躁\t64727\n袁湘琴\t64728\n会审\t64729\n监察机关\t64730\n69%\t64731\n昆明犬\t64732\n6633\t64733\n复旦经济学院\t64734\n影音先锋全集高清\t64735\nunzip\t64736\n5.3米\t64737\n我处\t64738\n五字\t64739\n菜农\t64740\n邢台日报\t64741\n宇文毓\t64742\n999感冒灵\t64743\n灰岩矿\t64744\n考核制\t64745\n仙剑云之凡\t64746\n彩灯\t64747\n宁杭\t64748\n立柱桩\t64749\n镇远新闻网\t64750\n藤花\t64751\nn行\t64752\n清亮\t64753\nansj\t64754\nTSM\t64755\n口号\t64756\n嫁给你\t64757\n宜都市人民政府\t64758\nEDC\t64759\n干燥性\t64760\n文献馆\t64761\n反梁\t64762\n红龙鱼\t64763\n三国无双3\t64764\n禅文化网\t64765\n60节\t64766\nisNaN\t64767\n疮痍\t64768\nPConli\t64769\n永峰\t64770\n49万\t64771\n22集\t64772\n牛肉串\t64773\n张大鹏\t64774\n基准地价\t64775\nz8300\t64776\n料位\t64777\n鱼凫\t64778\nrodger\t64779\n100多分\t64780\n陕西省司法厅\t64781\n杭州租房网-房途网\t64782\nwhey\t64783\n战元\t64784\n新邦\t64785\nfx80\t64786\n何小成\t64787\n博士服务团\t64788\n剥掉\t64789\nsimon\t64790\n15码\t64791\n沛纳海\t64792\n中轴线\t64793\n3.5L\t64794\n万劫不复\t64795\n蛦子\t64796\nEmbedded\t64797\n修杰楷\t64798\n在他乡\t64799\n农民日报\t64800\nsons\t64801\nBaseX\t64802\nPierce\t64803\n且\t64804\n新闻系\t64805\n马萨诸塞州\t64806\n烤色\t64807\n打结\t64808\nexagear\t64809\n000977\t64810\n新民市\t64811\n法拉电子\t64812\n这么难\t64813\n爱魅\t64814\n第102\t64815\n大象版小学\t64816\n田原\t64817\n猫类\t64818\n影魔\t64819\nemg\t64820\n金吉\t64821\n广州恒大足球俱乐部\t64822\n趣营销网\t6
4823\n包管\t64824\nDCMTK\t64825\n学习习近平谈治国理政\t64826\n筑梦\t64827\n终突\t64828\n成都注册公司\t64829\n合租房信息网\t64830\nDWG文件查看器\t64831\n29集\t64832\n不在\t64833\n水若寒\t64834\n年卡\t64835\nhorns\t64836\nhurdle\t64837\n萧山北\t64838\n亲信\t64839\nM1005-ZOL\t64840\nthz\t64841\nViewing\t64842\n怜怜\t64843\n搜索者\t64844\n夏空\t64845\n20131030\t64846\n新房\t64847\n小九资源网\t64848\n兴唐传\t64849\n36人\t64850\nite\t64851\nastronomy\t64852\nproxychains\t64853\n妈妈说\t64854\n胎菊\t64855\n艰苦奋斗\t64856\n43张\t64857\n雪莉酒\t64858\n郭睿\t64859\n周卫国\t64860\n更新\t64861\n伤风\t64862\n吹牛\t64863\n羽化\t64864\n紫禁\t64865\n禁菜\t64866\n猫和老鼠\t64867\n李瓶儿\t64868\n痞子蔡\t64869\n零个\t64870\n戍边\t64871\n渡过\t64872\n河南检察职业学院\t64873\n无一\t64874\n慧荣\t64875\nsct\t64876\n刀术\t64877\n耐压仪\t64878\n固定污染源排污许可分类管理名录\t64879\n有能力\t64880\n压敏胶\t64881\n存续\t64882\n半边天\t64883\n南京地铁3号线\t64884\n教育方针\t64885\n小知识\t64886\n移动靓号\t64887\n考验\t64888\n家庭生命周期\t64889\n毒蛇影院\t64890\n时长\t64891\n交通银行沃尔玛\t64892\n101路\t64893\n2000克\t64894\n马建堂\t64895\n14场\t64896\n韩泰\t64897\n川行\t64898\ng-music\t64899\n深圳市人大\t64900\nmirrors\t64901\nunicodeescape\t64902\n果糖\t64903\n六栋\t64904\n壮腰健肾丸\t64905\n1.4.9\t64906\nmultiset\t64907\n咀\t64908\n妇科洗液\t64909\nSublime\t64910\n95万\t64911\n津南区\t64912\n旸\t64913\n沙河乡\t64914\nwafer2\t64915\n文鼎苑\t64916\n膜拜\t64917\n口服药\t64918\n甚麽\t64919\n中国酒\t64920\n馋嘴\t64921\n台柱子\t64922\n条漫\t64923\n被害死\t64924\nBlueFlame\t64925\nterraform\t64926\n阿苯达唑片\t64927\n涡度\t64928\nvmem\t64929\n恫吓\t64930\n笑柄\t64931\n优质奖\t64932\nfapality\t64933\n郧县\t64934\n翎\t64935\n津洽\t64936\n铃口\t64937\n一方集团\t64938\n风帘\t64939\n2.7.3\t64940\n生活力\t64941\n乌蒙小燕\t64942\nkegel\t64943\nNXPower\t64944\n华夏出行\t64945\nbully\t64946\n锺\t64947\n多发性骨髓瘤\t64948\n警戒线\t64949\nTooth\t64950\nIntroduces\t64951\n宁夏人大\t64952\n明珠\t64953\n北麓\t64954\naddy\t64955\n山东金斯顿\t64956\n外源性\t64957\n生育能力\t64958\n后壁\t64959\n名校长\t64960\n基金份额持有人\t64961\n上原保奈美\t64962\n这么说\t64963\n川A\t64964\n嘟嘟传奇\t64965\n刊播\t64966\n热轧板卷\t64967\n腿身\t64968\n双截龙3\t64969\n雷立刚\t64970\n3.62\t64971\n坪洲新村\t64972\n第五中学\t64973\noom\t64974\ncallbacks\t64975\n奥克利\t64976\nXZ2\t64977\n平安易宝
\t64978\n起风了\t64979\n建平新闻网\t64980\nhobbies\t64981\n0.4_\t64982\n超文本浏览框\t64983\n伸臂\t64984\n鞋楦\t64985\nfatal\t64986\n力威\t64987\n江苏电网\t64988\nLaravel数据库\t64989\nWellington\t64990\n泰州市海陵区人民政府\t64991\n论略\t64992\nhik\t64993\nd300\t64994\n衬托\t64995\n辽宁海事局\t64996\n十通\t64997\n免疫性肝炎\t64998\n四季酒店\t64999\n台日\t65000\n泛微OA\t65001\n0.5\t65002\n大化瑶族自治县\t65003\n颞下颌关节紊乱\t65004\nLark\t65005\n微景观\t65006\n败家日记\t65007\n江丰\t65008\n马栏\t65009\n地下情人\t65010\n婚博\t65011\n捐款箱\t65012\nofbiz\t65013\n出生地\t65014\n唇色\t65015\n表面抗原\t65016\nSolver\t65017\n我们一家人\t65018\nsifang\t65019\n佳期如梦\t65020\n一个17岁\t65021\n油镜\t65022\n朝鲜族自治州\t65023\n苦命\t65024\n色彩饱和度\t65025\n一托\t65026\n加上\t65027\n38栋\t65028\nALI\t65029\n梦想岛\t65030\nRafael\t65031\n桥梁墩台\t65032\n9504\t65033\n安于\t65034\n伴我成长\t65035\n月浦\t65036\n理术\t65037\n郑俊弘\t65038\n法学专业\t65039\n安全门\t65040\n丹麦语\t65041\n王子异\t65042\n卡二\t65043\n林外\t65044\n调虎离山\t65045\n买噶\t65046\n晨光科力普\t65047\n酸甜\t65048\n三仙\t65049\n猥琐流\t65050\n受理员\t65051\nFLAC/分轨\t65052\nxuni\t65053\nshadowsocks\t65054\nChurchill\t65055\n320KMP3\t65056\n标日\t65057\n22个月\t65058\n王勇平\t65059\nxiaoluo\t65060\n貂\t65061\n243集\t65062\n吴让之\t65063\n今年5月1日起\t65064\ncover\t65065\n火遁\t65066\n李清蓝\t65067\n左肩\t65068\n珍视明\t65069\n汽\t65070\n美团打车\t65071\n窝藏\t65072\n茅野爱衣\t65073\nPreprocessor\t65074\n乙级防火门\t65075\n低音喇叭\t65076\ntitanic\t65077\n口中\t65078\n刘鹤\t65079\n感谢您\t65080\n管式\t65081\n中英\t65082\n教坊\t65083\n佛肚竹\t65084\n头儿\t65085\n真姬\t65086\nPC6苹果网\t65087\n中央戏剧学院表演系\t65088\n装备展\t65089\n报任安书\t65090\nqsort\t65091\n牛杂火锅\t65092\n湘菜馆\t65093\n玉簪花\t65094\nmlr\t65095\noffice16\t65096\n虚川\t65097\ni9508\t65098\n丁启阵\t65099\n打非治违\t65100\n德尼罗\t65101\n新能源汽车之家网\t65102\n第二批\t65103\n中天证券\t65104\n点单\t65105\n提莫\t65106\n修改者\t65107\n红外感应\t65108\n合法\t65109\n第一百天\t65110\n湖州站\t65111\n苍井空\t65112\n杨彪\t65113\n革命纪念馆\t65114\n屬性\t65115\n小泥人\t65116\n洪雅\t65117\n北京警察学院\t65118\n烟井\t65119\nterribly\t65120\n江南古镇\t65121\n多汗症\t65122\n半径\t65123\n小三房\t65124\n智齿\t65125\n格尺\t65126\n暗涌\t65127\n赌约\t65128\n社招\t65129\n首冠\t65130\n溴甲酚\t65131\n韦口\t65132\n海豚传媒\t65133\nayden\t651
34\n图拉\t65135\n参统\t65136\n绞丝旁\t65137\nchord\t65138\n陌香\t65139\n李秀成\t65140\n空团\t65141\n好想你\t65142\n591MOV\t65143\n400多分\t65144\n吴江东\t65145\n战国策\t65146\nkr\t65147\nPPS影音\t65148\n第三十五条\t65149\n莎娃\t65150\n兰登\t65151\nFC版\t65152\n9115\t65153\n李钰\t65154\nnet\t65155\n币族\t65156\n手续费率\t65157\nstove\t65158\n程池\t65159\nMVC4+EasyUI\t65160\n工作帽\t65161\n李院长\t65162\ndeadlock\t65163\n山东省检察院\t65164\n#1\t65165\n番茄鸡蛋\t65166\n情愁\t65167\n莘朱路\t65168\n鸭窝\t65169\n小折\t65170\n上海票据交易所\t65171\nh265\t65172\n一次性伤残补助金\t65173\n五马分尸\t65174\n看图王\t65175\nA14\t65176\n滚丝\t65177\n70米\t65178\n姜戈\t65179\n好运连连\t65180\nREPL\t65181\n北青公路\t65182\n依莱达\t65183\n皂液器\t65184\n戳穿\t65185\n无事生非\t65186\n形状\t65187\n美瑞克斯\t65188\n石家庄植物园\t65189\nlil电子烟\t65190\n第一第二\t65191\n基纽\t65192\n许墨\t65193\n高琪\t65194\nprices\t65195\n微税平台\t65196\n龙口西\t65197\n荣大\t65198\nliulan\t65199\n溶血病\t65200\n抽水\t65201\n人民币收藏网\t65202\n5.6.20\t65203\n阳光地带\t65204\n市斤\t65205\n小臣\t65206\nb族\t65207\n嗽\t65208\n訂\t65209\n重点率\t65210\n硫磺软膏\t65211\n/url-pattern\t65212\n1680x1050\t65213\n中空注浆锚索\t65214\n300度\t65215\nPR\t65216\n十九大新闻中心\t65217\n瑞士少女峰\t65218\nDosBox\t65219\n表径\t65220\n假眼\t65221\nE邮宝\t65222\n亚子\t65223\n新道\t65224\n隐私\t65225\n至顶网\t65226\nrootvg\t65227\neventually\t65228\n林老师\t65229\n在职研究生招生信息网\t65230\n延用\t65231\ngct\t65232\n天津自贸试验区\t65233\nshiro\t65234\n646名\t65235\nprem\t65236\nSQLSERVER2008\t65237\n央视一套\t65238\n禹岩\t65239\n酗酒\t65240\n体验装\t65241\n老兵\t65242\nnoc\t65243\n心莲\t65244\n团结协作\t65245\n婚车\t65246\n2300元\t65247\n安吉\t65248\n笨笨鱼\t65249\n封彪\t65250\n王者天下\t65251\n特牛\t65252\n命\t65253\nallinurl\t65254\n五鼠闹东京\t65255\nReviewer\t65256\n杨传堂\t65257\n9个\t65258\n掉魂\t65259\n表证\t65260\nGabriel\t65261\n公主们\t65262\n_边\t65263\nOUTER\t65264\n双河镇\t65265\n斑竹玉\t65266\n刘露\t65267\n白岩\t65268\n斗湖机场\t65269\n征集令\t65270\n踢足球\t65271\nnanny\t65272\n劳动街\t65273\n淫贱\t65274\n狗舍\t65275\n反贪风暴\t65276\n春剑\t65277\nfortigate\t65278\n格力空调\t65279\n褒扬\t65280\n中铺\t65281\n汉尼拔吧\t65282\n忌语\t65283\n点名\t65284\n老黄历\t65285\n汕汕\t65286\n氯化亚砜\t65287\nibdata1\t65288\n讨教\t65289\nMontana\t65290\n御宅族\t6529
1\n足球经理2018\t65292\n麦哨\t65293\n献世\t65294\nGPA计算器\t65295\nScores\t65296\n专横\t65297\n000936\t65298\n长阴\t65299\n2dm\t65300\n大阪关西机场\t65301\nStella\t65302\n丰宁\t65303\n学田\t65304\n包惠\t65305\n历史小说\t65306\n卡巴\t65307\n20kv\t65308\n光纤陀螺仪\t65309\n别说谎\t65310\n毕胜福\t65311\nbaobao\t65312\nhxsd\t65313\n北极\t65314\n宁波市妇女儿童医院\t65315\n冠岩\t65316\n红名\t65317\n杜碧丝\t65318\n1603\t65319\n岩棉板\t65320\n艾诺\t65321\n阴曹地府\t65322\n四方街\t65323\n分身大师\t65324\n余孽\t65325\n视客\t65326\n淹城动物园\t65327\n郭襄\t65328\n教友\t65329\nexcel分组\t65330\n沧州运河区\t65331\n义安\t65332\nFL\t65333\nsyllabus\t65334\n小错\t65335\n群情\t65336\n石榴社区\t65337\npalmer\t65338\n新工\t65339\n北京团市委\t65340\n自谦\t65341\njung\t65342\nv10\t65343\n乌鲁木齐爱德华医院\t65344\n灯池\t65345\n百荣世贸商城\t65346\nUcenter\t65347\n中华帝国\t65348\n1号\t65349\n港代\t65350\n北部湾港\t65351\n操作工\t65352\n隋\t65353\n皇马\t65354\n傻妃\t65355\nVS2017\t65356\n该局\t65357\nRP8.0\t65358\n围板\t65359\n20A\t65360\n线程函数\t65361\n西米网\t65362\n米诺陶\t65363\n贵阳装修公司\t65364\n秋之回忆\t65365\nadac\t65366\n灯芯\t65367\n裘千尺\t65368\n这东西\t65369\nbeyondcompare4\t65370\nprofessionals\t65371\n油浴\t65372\n三十斤\t65373\n神经性皮炎\t65374\n结晶学\t65375\n井室\t65376\n孜然\t65377\n德庆县\t65378\ndeskjet\t65379\n放牛小李飞刀\t65380\n详\t65381\n奇先生\t65382\n开居住\t65383\n综合测试卷\t65384\n叉号\t65385\n五招\t65386\n配酒\t65387\n空气质量标准\t65388\n落水\t65389\n吃官司\t65390\ngromacs\t65391\nFEMJOY\t65392\n查理·普斯\t65393\n渐进片\t65394\n情不自禁\t65395\n石纹\t65396\n中安网\t65397\n最高分\t65398\n清科研究中心\t65399\n自贡在线\t65400\n零报税\t65401\nAorus\t65402\n泻\t65403\n轮乱\t65404\n双黄蛋\t65405\n打印器\t65406\n590k\t65407\n加州大学戴维斯分校\t65408\n攻楼\t65409\n6515\t65410\nButton\t65411\ndockerfile\t65412\nMagnet\t65413\nrabbitmq\t65414\n粽叶\t65415\n家味\t65416\n燃烧吧少年\t65417\n女字\t65418\n血\t65419\n天津职业大学\t65420\ndruid数据库连接池\t65421\n全文明\t65422\nSHANDONG\t65423\n吉大慧谷\t65424\n500位\t65425\nt43\t65426\nSolidWorks2013\t65427\n侠\t65428\n冼星海\t65429\n电吉他谱\t65430\n觅仙路\t65431\n中国民航科学技术研究院\t65432\n回龙观西大街\t65433\n鼓浪\t65434\n张一曼\t65435\n童梦无忧论坛\t65436\ndota\t65437\n脚艺\t65438\n小媳妇儿\t65439\n太空人\t65440\n107国道\t65441\n肾衰\t65442\nBASIS\t65443\n8900\t65444\n心病\t
65445\n啤酒罐\t65446\n模式识别\t65447\n李在福\t65448\nb2b电子商务\t65449\n中控智慧科技\t65450\n柔版\t65451\n简笔画大全\t65452\n熟悉度\t65453\nMatsumoto\t65454\ntimtales\t65455\nSEO站\t65456\n不休\t65457\n废车\t65458\n联昊通快递\t65459\nLEBRON\t65460\n10.6.6.63\t65461\n当涂\t65462\nCDN\t65463\n前晚\t65464\n冷却系\t65465\n110套\t65466\n99A\t65467\n豆人\t65468\n垃圾箱\t65469\n走夜路\t65470\nWHOO\t65471\n银沙湾\t65472\n305\t65473\n2441\t65474\n曲阳路\t65475\n两百米\t65476\n溜索\t65477\n悦木\t65478\n揽件\t65479\nYoMail\t65480\n黑叶\t65481\n芮\t65482\n查吃\t65483\n圣家族大教堂\t65484\n光照度\t65485\n锌合金\t65486\n正负面\t65487\n完全民事行为能力人\t65488\n思通\t65489\n清脆\t65490\n远程医疗\t65491\n超声法\t65492\ncss3_CSS_网\t65493\n明珠广场\t65494\nEXSI\t65495\nsystemui\t65496\n鱼男\t65497\n我的世界pe\t65498\ncucci\t65499\n软木塞\t65500\n会出现\t65501\n河湾\t65502\n刚果布\t65503\njiasu\t65504\n布兰\t65505\nSFP光模块\t65506\n桑海\t65507\nemiri\t65508\n副边\t65509\n河北科技学院\t65510\nfamiliar\t65511\n微播易\t65512\n90016\t65513\n第一百零七章\t65514\nthreat\t65515\n礼记\t65516\nobj\t65517\n细看\t65518\n吴瑞\t65519\n三枚\t65520\n大公无私\t65521\n生养\t65522\n十三周年\t65523\n祖母绿\t65524\n每样\t65525\n氯化钙\t65526\n生殖器\t65527\n牵出\t65528\n横水\t65529\nplots\t65530\n苏向阳\t65531\n算术\t65532\nshows\t65533\n好无聊\t65534\n糖水\t65535\n超星网\t65536\n日进\t65537\n上海柴油机股份有限公司\t65538\n刘伟强\t65539\n反侵略\t65540\n改价\t65541\n北京市政一卡通\t65542\n回放在\t65543\n渝建发\t65544\nOvercoming\t65545\n范薇\t65546\n祖达克\t65547\n一作\t65548\ncalcium\t65549\n前摇\t65550\n環\t65551\niframe父\t65552\n20170205\t65553\nQuan\t65554\n中山政府网\t65555\n动能\t65556\n工艺美术师\t65557\n新余市\t65558\n牛牛群\t65559\n俯首\t65560\n魏无忌\t65561\n入围\t65562\n35mm\t65563\n誓不罢休\t65564\nN3\t65565\n重庆市荣昌区人民政府\t65566\n硬骨头\t65567\n克鲁尼\t65568\n菲律宾长滩岛\t65569\n戒色\t65570\narrow\t65571\n数字币\t65572\n点师\t65573\n天津市教育委员会\t65574\nStartseite\t65575\n2页\t65576\n20080820\t65577\n龙工\t65578\n逆转\t65579\n感染性休克\t65580\n龙剑\t65581\n武藤绫香\t65582\n铝合金型材\t65583\n实体瘤\t65584\n房客\t65585\n西峰区\t65586\nMacAppStore\t65587\n胆略\t65588\n上海国金中心\t65589\n男一号\t65590\n阿静\t65591\nmidd\t65592\nKsp\t65593\n间架\t65594\nRhinoceros\t65595\n8000家\t65596\n两千公里\t65597\n农产品信息网\t65598\n经典小说网\t65599\
n第26章\t65600\n阈\t65601\nultracompare\t65602\n小薰\t65603\n纱幔\t65604\n阳泉市矿区\t65605\n血胸\t65606\n官方\t65607\n苦艾酒\t65608\n8.6代\t65609\n开发者\t65610\nMaroon\t65611\n冀鲁\t65612\n2009-2011年\t65613\n下解\t65614\n龙须\t65615\n平步青云\t65616\n2002um\t65617\n诺诺镑客\t65618\n嗟\t65619\n非特异性\t65620\n平湖镇\t65621\n首屈一指\t65622\n门市部\t65623\nkt猫\t65624\n易贷\t65625\n苏赫\t65626\n龙湖山庄\t65627\n600dpi\t65628\n140集\t65629\n吉峰\t65630\n1550nm\t65631\n签证书\t65632\nwinx64.zip\t65633\n沈阳市第四人民医院\t65634\n唐骏我们仨\t65635\n李显\t65636\n立诚\t65637\n油棕\t65638\n榛名\t65639\n20880\t65640\n出轴\t65641\n色度\t65642\n电源管理软件\t65643\n步履不停\t65644\nStickers\t65645\n特校\t65646\n化德\t65647\n黑边\t65648\n247号\t65649\n合情合理\t65650\n吴长江\t65651\n婚然\t65652\n紫御府\t65653\n小米手机1/1S_MIUI论坛\t65654\n鹿血酒\t65655\n雷电芽衣\t65656\n918事变\t65657\nAdolf\t65658\n五艘\t65659\n湿生\t65660\nE8400\t65661\ngoogle+\t65662\nLevinson\t65663\n俏皮话\t65664\ndlib库\t65665\n09集\t65666\n一天两天\t65667\n12.0.1\t65668\n装疯\t65669\n粗点心战争\t65670\n藏传\t65671\n图形的运动\t65672\n中国一冶集团\t65673\n应天\t65674\n马岭\t65675\n3721健康知识网\t65676\n濒死感\t65677\n汾\t65678\n放榜\t65679\nCong\t65680\n灵异咒\t65681\n魔兽宏\t65682\n吉他谱\t65683\n2第一期\t65684\n氢谱\t65685\n15u\t65686\n袁伟民\t65687\nApo\t65688\nqunar\t65689\nwnmp\t65690\n卤人\t65691\n最后一站\t65692\n世界遗产\t65693\n罐式\t65694\n外号\t65695\naperture\t65696\n是非题\t65697\n无锡新区\t65698\n机匣\t65699\n招摇撞骗\t65700\n白云大道\t65701\n天猫精灵x1\t65702\n发行版\t65703\n四块\t65704\n淘金山\t65705\n上百\t65706\n几十层\t65707\nVlookup函数\t65708\nABS塑料\t65709\n青岛海\t65710\n曼陀\t65711\n高商\t65712\n10支\t65713\n社会保障法\t65714\n清醒\t65715\n大树镇\t65716\n灵魂王者天下\t65717\n史迪奇\t65718\n320lim\t65719\n8.99\t65720\nmex\t65721\n尚德幼儿园\t65722\n跨文化交流\t65723\n绿盘\t65724\njaguar\t65725\n智高\t65726\n红米4\t65727\n红线虫\t65728\n羽绒棉\t65729\nzake\t65730\nCELL\t65731\nFEM\t65732\n环己酮\t65733\n香港医院\t65734\n土地使用权罪\t65735\n鲁迅文学院\t65736\n宇宙观\t65737\n公允价值\t65738\n彩虹3\t65739\n番茄树\t65740\n老耿\t65741\n_语\t65742\n德阳晚报\t65743\n赤岸镇\t65744\n常用函数\t65745\n101次求婚\t65746\n刑事裁定书\t65747\n伞形科\t65748\n优路教育\t65749\n快枪\t65750\n修容粉\t65751\n库奇\t65752\nao3\t65753\nBLE4.0\t65754\n黑入\t65755\n60
只\t65756\nJames2017\t65757\n济南市环保局\t65758\n雪漫\t65759\n五十岁\t65760\n财校\t65761\n成品油\t65762\n口袋妖怪究极日月\t65763\n小小智慧树\t65764\n北京市安监局\t65765\n7.5cm\t65766\nelastic\t65767\n9685\t65768\n纪念馆\t65769\n裕仁\t65770\n杨建民\t65771\n公参\t65772\n唱作人\t65773\n夹布\t65774\n腰子\t65775\n2四\t65776\n小米9号平衡车\t65777\nSHP\t65778\n米粒\t65779\nNoah\t65780\n朱文\t65781\n金海路\t65782\nMODEM\t65783\n国家会议中心\t65784\n龙牙\t65785\n站桩\t65786\n七氟丙烷\t65787\n半乳糖苷酶\t65788\n长颈龟\t65789\n下级\t65790\n流量仪\t65791\n中国电信集团有限公司\t65792\n白牡丹茶\t65793\n采购师\t65794\n小小小\t65795\n北汉\t65796\n50mL\t65797\n众泰_众泰T700\t65798\n商代\t65799\n粉末冶金\t65800\n阿曼\t65801\n中国共产党第十九次全国代表大会秘书处\t65802\n面红\t65803\n10环\t65804\n前途无量\t65805\n开放教育学院\t65806\n中华人民共和国城市房地产管理法\t65807\n法式风格\t65808\n潜逃\t65809\n40部\t65810\n真钢\t65811\nGirlfriends\t65812\n以太坊ETH\t65813\n赞助费\t65814\n路亚水滴轮\t65815\n么\t65816\nzxa10\t65817\n怒之铁拳2\t65818\n王朝網路\t65819\n馥绿德雅\t65820\nbgy\t65821\n1675\t65822\n冷风\t65823\nFeb\t65824\n30001\t65825\n当午\t65826\n网易一卡通\t65827\n葡萄糖酸锌片\t65828\n迷香\t65829\n特殊关系\t65830\n户外网\t65831\n绽\t65832\n网络监控系统\t65833\nClarity\t65834\n遵义市人力资源和社会保障局\t65835\n蓝岸\t65836\n20150429\t65837\n抽象化\t65838\n港生\t65839\n28335\t65840\n体力劳动\t65841\n密爱\t65842\nBOW\t65843\n中国建材网\t65844\nsomeone\t65845\nWLOP\t65846\nheadset\t65847\n狂跌\t65848\n吴玉华\t65849\n2017-2020年\t65850\nr20\t65851\n我可以\t65852\n意语\t65853\n系带\t65854\n2.1亿元\t65855\niKafka\t65856\ndm\t65857\n三庚\t65858\n情根\t65859\n研究法\t65860\nexpose\t65861\n广告纸\t65862\n速测\t65863\ni9300/Galaxy\t65864\n姬星\t65865\n倍力乐\t65866\n济源职业技术学院\t65867\n去病\t65868\n融创宜和园\t65869\n0摄氏度\t65870\n金鹰奥莱城\t65871\n我的中国梦\t65872\n休刊\t65873\n日本通\t65874\n上古\t65875\n郑楠\t65876\n游览\t65877\n新照\t65878\n打铁关\t65879\n诺如病毒\t65880\nActiveReports\t65881\n新闻众评_论坛\t65882\n玩具板\t65883\n酷播网\t65884\n世泽\t65885\n一元一次方程\t65886\n图则\t65887\n乐清柳市\t65888\n三顶\t65889\n证券交易印花税\t65890\n福满园\t65891\n敢当头条_大众网\t65892\n安州区\t65893\n个人社会保障卡\t65894\n超神学院之雄兵连\t65895\n双离合变速器\t65896\n执医\t65897\n红妆\t65898\ncares\t65899\n地球运动\t65900\n金狐狸\t65901\n女靴\t65902\n福禄康瑞\t65903\n天珠变\t65904\n评课稿\t65905\nbizarre\t65906\nfac
ile\t65907\n偶合器\t65908\nzol新闻中心\t65909\n李欧\t65910\n钉\t65911\n假单胞菌\t65912\n中国银河证券股份有限公司\t65913\nGTX780\t65914\nQQ手机浏览器\t65915\n关喆\t65916\n寒冰射手\t65917\n不自觉\t65918\n古铜\t65919\n1万3\t65920\n故事板\t65921\n法文版\t65922\n一望无际\t65923\ntranny\t65924\n延华\t65925\n请来\t65926\n华中科大\t65927\n电阻式\t65928\nAL00/3GB\t65929\n蒙自\t65930\n地包\t65931\n肺吸虫\t65932\n围杀\t65933\n概然\t65934\n前模\t65935\ndatatables\t65936\n澳洲移民局\t65937\n东营市政府\t65938\n诺成\t65939\nlambada\t65940\n16盘位\t65941\n2月9日\t65942\n徐菲\t65943\nCenturion\t65944\n25m\t65945\n基本初等函数\t65946\n微纪实\t65947\n旅游指南\t65948\nB站\t65949\n新专业\t65950\n宜山\t65951\n惯用\t65952\n蝴蝶舞\t65953\n秀好\t65954\n小丸\t65955\n李松儒\t65956\nGrow\t65957\n凌奥\t65958\nEqualizer\t65959\n中环一号\t65960\n80寸\t65961\n世纪村\t65962\n神经管\t65963\n屋顶绿化\t65964\n肉疼\t65965\n李延亮\t65966\n高尔\t65967\ncy\t65968\n西三旗\t65969\nigo\t65970\n黑洁明\t65971\n人才公园\t65972\n专技类\t65973\n业业\t65974\nto\t65975\n我宣誓\t65976\n宫里\t65977\n饶子龙\t65978\n永清环保\t65979\n投资性房地产转换\t65980\n河北省农业厅\t65981\n电子辞典\t65982\n林若溪\t65983\n一二三四五线\t65984\nouyang\t65985\nk3003\t65986\n续播\t65987\ncompute\t65988\nMD5\t65989\n一喷\t65990\n电爪\t65991\n菜市场\t65992\n131\t65993\n3.45\t65994\n黑阈\t65995\n电子网\t65996\n受死\t65997\n广州日报\t65998\n我的路\t65999\n哈克龙\t66000\n赵近芳\t66001\nkingfisher\t66002\n正根\t66003\nrolex\t66004\n马哈拉施特拉邦\t66005\nyoga\t66006\n每周\t66007\n典韦\t66008\n松散\t66009\nTransforming\t66010\n易教网\t66011\n浙江统计信息网\t66012\n48粒\t66013\nhamster\t66014\n门卡\t66015\n压纹机\t66016\n自在极意功\t66017\n棕兔\t66018\n沙雕\t66019\n八毫米\t66020\n张晓雅\t66021\n会计软件\t66022\n婴语\t66023\n自力式\t66024\n矛盾论\t66025\n二甲基苯胺\t66026\n30多亿\t66027\n香柏木\t66028\n沪伦通\t66029\n攻壳机动队\t66030\n孔祥东\t66031\ncshrc\t66032\n2018年6月1日\t66033\n城会\t66034\n112集\t66035\nOsprey\t66036\nreentrantlock\t66037\n音乐生\t66038\n泰迪狗\t66039\n阅读者\t66040\n陈庆\t66041\n顶部\t66042\n2第三章\t66043\nrespectively\t66044\nstorm\t66045\n流水作业\t66046\nTH\t66047\nbacked\t66048\n湖南省森林植物园\t66049\n时间停止器\t66050\n肠套叠\t66051\n上海出入境管理局\t66052\n俞军\t66053\n陈思媛\t66054\n不需要\t66055\n监视者们\t66056\n50颗\t66057\n250吨\t66058\n权价\t66059\n中融人寿\t66060\n冷水\t66061\n江
波\t66062\nJess_喵\t66063\n一粟\t66064\n8.23\t66065\n服兵役\t66066\n越秀公园\t66067\n贝伐单抗\t66068\n部分区\t66069\n永中街道\t66070\n定音\t66071\n礼仪\t66072\nMysql\t66073\n奔驰威霆\t66074\n恒联\t66075\nAv在线伊人综合网\t66076\nRevel\t66077\n遂州\t66078\n三国机密之潜龙\t66079\nDus\t66080\nm+2\t66081\n国药控股股份有限公司\t66082\n封地\t66083\n家用\t66084\nfread函数\t66085\n骆宾王\t66086\nsnailsvn\t66087\n习武\t66088\n恒字\t66089\n斐讯联盟\t66090\n276个\t66091\n双层玻璃反应釜\t66092\ninquiry\t66093\n万菲乐\t66094\nSed\t66095\n3480\t66096\n维护\t66097\n甜辣酱\t66098\n海贼王漫画全集\t66099\n追补\t66100\n广西卫视\t66101\n输变电\t66102\nSkySoot\t66103\n神经科\t66104\n陈志华\t66105\n保险业\t66106\n约旦河\t66107\n王宝欧文\t66108\n不足\t66109\n科e\t66110\n姑息\t66111\nbritish\t66112\n综合险\t66113\n相视\t66114\n坂口玲奈\t66115\n帕杰罗劲畅\t66116\n吕后\t66117\n马达京\t66118\nGTX750ti\t66119\n1142\t66120\n婆姨\t66121\n利害关系\t66122\n药片\t66123\n极品\t66124\n玉圭\t66125\n四五六\t66126\n赌博默示录2\t66127\n回忆鲁迅先生\t66128\n滨江东路\t66129\nav片\t66130\n洗洗\t66131\n高达模\t66132\nnumb\t66133\n毓璜顶医院\t66134\n返魂香\t66135\n塔头\t66136\nFabulous\t66137\nPS磨皮滤镜\t66138\n14度\t66139\n盗墓案\t66140\n为知笔记\t66141\n北戴河区\t66142\n树王\t66143\n三千年前\t66144\n3704\t66145\n罗技G603\t66146\nCPhI制药在线\t66147\n希崎杰西卡\t66148\n20160814\t66149\nppt模版\t66150\n前体\t66151\n几百页\t66152\nSichuan\t66153\n道德性\t66154\n锄地\t66155\n花环\t66156\n美的集团\t66157\n红杉树智能英语\t66158\n仙剑奇侠传一\t66159\n不能寐\t66160\n林莹\t66161\n西红门镇\t66162\n路边摊\t66163\n减慢\t66164\n世界大学城\t66165\n王之\t66166\n转置\t66167\n马永贞\t66168\n中移在线服务有限公司\t66169\n斗罗大陆续集之史莱克七怪成神之路\t66170\nlhrbest\t66171\nchile\t66172\n车台\t66173\n篮子\t66174\n凤鸟\t66175\n扫踢\t66176\n骊威论坛_汽车之家论坛\t66177\nMudbox\t66178\n容易受伤的女人\t66179\n檀\t66180\n咋整\t66181\n鶸\t66182\n刮臂\t66183\n领用\t66184\n渡红尘\t66185\nwww1024seecom\t66186\n龙州\t66187\n视觉传达设计专业\t66188\n巢箱\t66189\nUfldl\t66190\n五枂\t66191\njade5\t66192\n蛇皮袋\t66193\n叙利亚\t66194\n八系\t66195\n金砖国家工商论坛\t66196\n品论\t66197\n共青团路\t66198\n新开路\t66199\n手电筒\t66200\n相伴\t66201\n济南市人力资源和社会保障局\t66202\n改性淀粉\t66203\n果粒橙\t66204\n甘拜下风\t66205\nn次方根\t66206\n逸贷\t66207\n凯斯特\t66208\nmedi\t66209\n郄\t66210\n2.8m\t66211\n吴刚\t66212\n剑河路\t66213\n杨再春\t66214\n入境\t66215
\n牝\t66216\natl71.dll\t66217\nワルツ\t66218\n狭间\t66219\n千栀网\t66220\n掮客\t66221\n辅酶\t66222\n泡泡卡丁车\t66223\n灰壳\t66224\n55条\t66225\n上海半岛酒店\t66226\n晨星网\t66227\n经管类\t66228\n小刺\t66229\n耐候钢\t66230\n萝丝\t66231\n米津玄师\t66232\nWindos\t66233\n调制乳\t66234\n受冻\t66235\n衣帽柜\t66236\nP40\t66237\n几口\t66238\nunkonw\t66239\nziyu\t66240\nRetrieval\t66241\nSkynet\t66242\n第八十八章\t66243\n北辰香麓\t66244\n厦门工学院\t66245\n偏摆仪\t66246\n奥斯卡最佳外语片\t66247\n榆中\t66248\n比克尔\t66249\n李正信\t66250\n病友们\t66251\nCreampie\t66252\nreign\t66253\n情结\t66254\n龙标遥\t66255\n亿彩\t66256\nQUALITY\t66257\n荧光笔\t66258\n91平米\t66259\n伯俊\t66260\n五代\t66261\n婊女\t66262\n朱迪福斯特\t66263\nnosql数据库\t66264\n医史\t66265\n鸟的天堂\t66266\n3760\t66267\n邮件箱\t66268\ncis\t66269\n屋前\t66270\n三十日内\t66271\n中国服装网\t66272\n助产\t66273\n重工具\t66274\n朝天子\t66275\n小米路由器3\t66276\n令牌桶算法\t66277\n相生相克\t66278\n十届\t66279\nlenses\t66280\n避暑山庄\t66281\n鹊山\t66282\n镇海炼化\t66283\n卢武铉\t66284\n婴宁\t66285\n1.1万元\t66286\n七十味\t66287\n福楼\t66288\n副县长\t66289\n斯科拉\t66290\n继妹\t66291\n大疆社区\t66292\n寄性\t66293\n小红灯\t66294\n六弄\t66295\n张正\t66296\n候车大厅\t66297\n洁癖\t66298\n海报时尚网\t66299\n启悦\t66300\n机皇\t66301\n雨果\t66302\n市社科联\t66303\n我去上学啦\t66304\n断舍离\t66305\n暴风电视\t66306\n2.09\t66307\npix4dmapper\t66308\ntails\t66309\n轩辕剑外传穹之扉\t66310\n中国科学院重庆绿色智能技术研究院\t66311\n哀泣\t66312\n北京城市总体规划\t66313\n柯米克\t66314\n蒲公英路由器\t66315\n新港中路\t66316\nPMG\t66317\n完好率\t66318\n清道\t66319\n置物柜\t66320\n区党委\t66321\n保鲜剂\t66322\n帝标\t66323\n外代\t66324\nflap\t66325\nbeareyes\t66326\n5兆\t66327\nKone\t66328\n特职\t66329\n拉筋\t66330\n上证e互动\t66331\nzhanggui\t66332\n意尔康\t66333\n洋馆\t66334\n回南天\t66335\n一拍\t66336\n诺丁山\t66337\n北京红黄蓝幼儿园\t66338\n11万亿\t66339\n中国古代文学史\t66340\n张院忠\t66341\nmobi\t66342\n1-8月份\t66343\n百晓\t66344\n460亿\t66345\n天津市幼儿园\t66346\nPS画\t66347\n淮南人才网\t66348\nChupaporn\t66349\n小林简笔画\t66350\n东道主\t66351\n19.6\t66352\n王金战\t66353\n夏雨竹\t66354\n枣糕\t66355\n通函\t66356\n社会科学方法论\t66357\n阑尾炎\t66358\n苏和\t66359\n定向赛\t66360\nmedial\t66361\na100\t66362\n三居\t66363\n天国:拯救\t66364\n九分之五\t66365\nriverside\t66366\n私银\t66367\n散光轴位\t66368\n单弦\t66369\n隧道式\t66370\nMisf
it\t66371\n眼干\t66372\n小运\t66373\n20g\t66374\n出品\t66375\nhany\t66376\n马鞍包\t66377\n樓\t66378\n钉钉考勤机\t66379\n白玉山\t66380\n牙签弩\t66381\nshif\t66382\nuisearch\t66383\n創\t66384\nClaus\t66385\n齐玉\t66386\n长期照\t66387\n长青集团\t66388\n病毒感冒\t66389\n讯雷\t66390\n尼克·胡哲\t66391\n11班\t66392\n曲线\t66393\n二五八\t66394\n查岗\t66395\n江淮晨报\t66396\n博时黄金\t66397\n一无所知\t66398\n管辖权异议\t66399\n送件\t66400\n凉夜\t66401\n建筑人才网\t66402\n10.doc\t66403\n参访\t66404\n秘辛\t66405\n一天到晚\t66406\n新闻联\t66407\nexpma\t66408\n战长沙\t66409\n长白点\t66410\n摸板\t66411\nGartner\t66412\n光峰光电\t66413\n黑仔\t66414\n6多\t66415\n20170220\t66416\n中天易税\t66417\nSeneca\t66418\n广医二院\t66419\n傻人\t66420\n如画\t66421\nL470\t66422\n2017年11月11日\t66423\n夕夕\t66424\n很傻\t66425\n980TI\t66426\n浮游动物\t66427\n豪德\t66428\nElvUI\t66429\n沪委\t66430\n来试试\t66431\n绀野\t66432\n山龟\t66433\n覆辙\t66434\ne-office\t66435\n纳智捷\t66436\nRocket\t66437\n字儿\t66438\n公务员局\t66439\n十字架与吸血鬼\t66440\n萨维奇\t66441\n烧味\t66442\n曹安路\t66443\nxdebug\t66444\n_年\t66445\n禅位\t66446\ngovernor\t66447\n电通\t66448\n劳军\t66449\n李雨\t66450\n经济带\t66451\n管夹\t66452\nforce\t66453\nntlea\t66454\n护额\t66455\n碟盘\t66456\n安贞西里\t66457\n笺\t66458\nWARNING\t66459\n网上银\t66460\n大烩菜\t66461\nVote\t66462\n高级酒店\t66463\n武汉东湖\t66464\n比亚迪元ev360\t66465\n专筑\t66466\nxiaav\t66467\n0411\t66468\n黑箱\t66469\n了结\t66470\n嘉士利\t66471\n游佐浩二\t66472\n超级玛丽小游戏\t66473\nSkyDrive\t66474\n刘丽\t66475\n贝尔\t66476\n丽娟\t66477\nwww.lijiejie.com/wp-content/uploads/2014/08/126_user\t66478\n警卫局\t66479\n布谷鸟\t66480\n无锡市大桥实验学校\t66481\n0529\t66482\n简算\t66483\n宁夏天元锰业\t66484\n吸气式\t66485\ngonglve\t66486\n虎口处\t66487\n街道人大\t66488\n双关语\t66489\n色拉油\t66490\n36项\t66491\n欧诺汽车\t66492\n青岛新闻网\t66493\nmuy\t66494\n326号\t66495\n珍妮杰克逊\t66496\nIG\t66497\n老胃病\t66498\n找位\t66499\n循环\t66500\n献出\t66501\n期货从业考试\t66502\n央广\t66503\n甩柜\t66504\n医学论坛网\t66505\n002368\t66506\n大明\t66507\n医疗机\t66508\n耳罩\t66509\n无间地狱\t66510\nceil函数\t66511\n鹏鹞环保\t66512\n双瞳\t66513\n努比亚红魔\t66514\nDWZ\t66515\nBD.720p\t66516\nWarsaw\t66517\n胡因李敖\t66518\nS500\t66519\n2018-02-28\t66520\n华软\t66521\n排阵\t66522\n心包\t66523\n智联版\t66524\n不
定方程\t66525\n51CTO下载中心\t66526\n小偷\t66527\n卦辞\t66528\n漳河\t66529\n敬业与乐业\t66530\n贩运\t66531\n拍频\t66532\n冈村宁次\t66533\n唯唯诺诺\t66534\n色性\t66535\n3401\t66536\n2000\t66537\n新城中学\t66538\n缸筒\t66539\n挂号信\t66540\n辽宁省科技馆\t66541\n西洛\t66542\n恶魔猎手\t66543\n歙\t66544\n换链\t66545\n点到点\t66546\ncube\t66547\n心狠\t66548\n学报告\t66549\n斗士\t66550\n星妈\t66551\nho\t66552\n三农\t66553\nMinutes\t66554\ntouchend\t66555\n深圳南山网\t66556\n股权结构\t66557\n芬芬\t66558\nzil\t66559\n碧湖\t66560\n交通银行信用卡还款\t66561\nlibstdc++.so.6\t66562\nAkira\t66563\n工具展\t66564\n李一凡\t66565\n误人子弟\t66566\nppid\t66567\n通门\t66568\n李旭\t66569\nDelphi\t66570\n娃\t66571\ngenerate\t66572\n采茶戏\t66573\n大樱桃\t66574\n幸子\t66575\n艾奥瓦州\t66576\n胡敏\t66577\n+1\t66578\n福寿街\t66579\n飘板\t66580\n众兴镇\t66581\n黄_\t66582\n41mm\t66583\nJSR\t66584\n夜行者\t66585\n一挺\t66586\n腔隙性脑梗塞\t66587\n模考\t66588\nKids\t66589\n画师\t66590\n安家费\t66591\n明年1月\t66592\n合体\t66593\n氟气\t66594\n塑胶件\t66595\n卓创资讯\t66596\n知道\t66597\nCT机\t66598\n加拿大银行\t66599\n100毫米\t66600\n古塔\t66601\nCEN\t66602\n血色修道院\t66603\n榆林学院\t66604\n穗花\t66605\n双福\t66606\nhg8546m\t66607\nproximity\t66608\n全明星\t66609\nfilemaker\t66610\n大丁\t66611\n达拉特\t66612\n筋机\t66613\nuser_func\t66614\nFSET\t66615\n温氏\t66616\n美利坚\t66617\n撃\t66618\n海南省中医院\t66619\n智绘\t66620\n顾海良\t66621\nalloys\t66622\nflow3d\t66623\n遮眼\t66624\n地址段\t66625\n松滋市\t66626\n胥口\t66627\n迅敏\t66628\n石河镇\t66629\nFastAdmin\t66630\n380元\t66631\n首端\t66632\nVanquish\t66633\n船厂\t66634\nminnesota\t66635\n真女\t66636\n年多\t66637\n申一帆\t66638\n辛德勇\t66639\n立德树人\t66640\n玉柴集团\t66641\n六一幼儿园\t66642\n9套\t66643\n2031\t66644\ndpi\t66645\n第06集\t66646\nsvip\t66647\n日本早稻田大学\t66648\n环境工程学报\t66649\n淡彩\t66650\nISM\t66651\n戴春林\t66652\n紫金新干线\t66653\nsmt贴片机\t66654\n烘烘\t66655\n1140\t66656\n澳门永利\t66657\n㈠\t66658\n张亚飞\t66659\n简谐振动\t66660\nGIF图\t66661\n国统字\t66662\n陕西广电\t66663\n龙霸\t66664\n防腐木凉亭\t66665\nsinopia\t66666\n卡勒德·胡赛尼\t66667\n倩女幽魂1\t66668\n憨豆先生动画版\t66669\n正构烷烃\t66670\n造福\t66671\n李浩轩\t66672\n读起来\t66673\n明都\t66674\n劲爆\t66675\n北京天桥艺术中心\t66676\n操纵子\t66677\n张红宇\t66678\n幽香\t66679\n东北往事\t66680\n恳\t66681\nhaartr
aining\t66682\n山东省商务厅\t66683\n校验位\t66684\n无极娱乐\t66685\n06集\t66686\n樱花卫厨\t66687\ngnome3\t66688\n572\t66689\n厌烦\t66690\n疑病\t66691\n景芳\t66692\n上海市文化广播影视管理局\t66693\n说聊斋\t66694\n夹具\t66695\nlianghe\t66696\nards\t66697\n白云山庄\t66698\n55个\t66699\n牢底\t66700\n重单\t66701\n2018.04.16\t66702\n刺破\t66703\n武皇\t66704\n朱相昱\t66705\n弱小\t66706\n实业公司\t66707\n杜立德\t66708\n53章\t66709\n名创\t66710\nsjc\t66711\n云栖小镇\t66712\n王子与贫儿\t66713\nSpring+Quartz\t66714\n屠光绍\t66715\ndeborah\t66716\n广州地铁集团有限公司\t66717\n11重\t66718\n南开大学文学院\t66719\nxgboost\t66720\n宣讲稿\t66721\n长毛\t66722\n介定\t66723\n羊汤\t66724\n卤鸡腿\t66725\n弹牙\t66726\n利咽\t66727\ndefensive\t66728\n802\t66729\nxn\t66730\n天天富翁\t66731\n华友\t66732\n可莎蜜儿\t66733\n国家宗教局\t66734\n贝纳利\t66735\nAEE\t66736\n沉璧\t66737\niZotope\t66738\n公共基础\t66739\n言葉\t66740\n德莱尼\t66741\n刘伯承\t66742\n前蹄\t66743\n江西财大\t66744\nKNN\t66745\n粮草先行\t66746\n前1万年\t66747\n12.1.1\t66748\nA55\t66749\n10套\t66750\n医学统计学\t66751\n帕金斯\t66752\n语文生\t66753\n上颌窦炎\t66754\n苹果6SE\t66755\n中电科技集团重庆声光电有限公司\t66756\n扬言\t66757\n知人者\t66758\n交响乐\t66759\n练习版\t66760\n长城守卫军\t66761\n蔬菜园\t66762\n明长城\t66763\nl6\t66764\n圣湖\t66765\n福建论坛\t66766\n超短波\t66767\nGJB\t66768\n牛耳\t66769\n甲基叔丁基醚\t66770\n详版\t66771\n米箱\t66772\n精铜\t66773\n品官\t66774\npopulations\t66775\n第几册\t66776\n随身包\t66777\nFaster-RCNN\t66778\n妈妈的账单\t66779\n原汁\t66780\n和讯操盘宝\t66781\nCarly\t66782\n4770K\t66783\nHigher\t66784\n保湿剂\t66785\n扣眼\t66786\n精選\t66787\n沃尔沃v40\t66788\n金时\t66789\n曹昂\t66790\n暗示\t66791\nSchubert\t66792\n实参\t66793\nreceipts\t66794\n空者\t66795\n招贤\t66796\n丰园\t66797\n西安交大附中\t66798\n司空图\t66799\n专用户\t66800\n贷款公司\t66801\nEMUI4.0\t66802\n玉磨铁路\t66803\n货量\t66804\n小米Mix2s\t66805\n摩登舞\t66806\n胶干\t66807\n加腋\t66808\n义乌国际商贸城四区\t66809\n石柱镇\t66810\n洪武通宝\t66811\n喷泡\t66812\n克里斯蒂安贝尔\t66813\n尽日\t66814\nNpp\t66815\n李旦\t66816\n西王母\t66817\n悲喜交加\t66818\n柯桥日报\t66819\n熊本家\t66820\n小蓝\t66821\n京东移动\t66822\n夜舞\t66823\n苏菲的炼金工房\t66824\n285号\t66825\n绝非\t66826\nrobotic\t66827\n信阳羊山新区\t66828\n内村\t66829\n没本事\t66830\n秀丽江山之长歌行\t66831\n读码\t66832\n睿宝\t66833\n电力局\t66834\n凯恩斯\t66835\nECNU\t66
836\n反激变换器\t66837\nHubei\t66838\n深信\t66839\n智辉版\t66840\n绿能\t66841\n发展改革委\t66842\n活闹鬼\t66843\n_叮当动漫网\t66844\n重型坦克\t66845\n薤白\t66846\n给定\t66847\n黒猫スミス\t66848\n福建省卫生和计划生育委员会\t66849\n岱岳\t66850\n刑事谅解书\t66851\nWANIMAL\t66852\n2000多元\t66853\n一幢\t66854\nmediarecorder\t66855\n中国法学网\t66856\n快递单\t66857\nR15梦镜\t66858\n陆平\t66859\nteMe\t66860\n阿珂\t66861\n北京旅游团\t66862\n加成型\t66863\n报身\t66864\ndx10\t66865\n华为mate10pro\t66866\n育龙单招网\t66867\ncccc\t66868\n20kg\t66869\n耙耳朵\t66870\n东丰\t66871\nComplete\t66872\n肺野\t66873\ngitlib\t66874\n生光\t66875\n玉蒲团之玉女心经\t66876\n兰塔岛\t66877\n各个\t66878\n绘画类\t66879\n边坡防护网\t66880\nnx100\t66881\n1250元\t66882\n旺墩路\t66883\n纸钞屋\t66884\n条形章\t66885\n雅舍\t66886\n脾气\t66887\n三个男\t66888\n惊倒\t66889\nE43\t66890\n缸径\t66891\n二零一六年\t66892\n石船\t66893\n阳奉阴违\t66894\n绿人\t66895\nBefore\t66896\n维瓦尔第\t66897\n同等效力\t66898\n尤莉安奥特曼\t66899\n3月23号\t66900\n艺朝艺夕\t66901\n云南省人力资源和社会保障\t66902\n景山\t66903\n狗笼\t66904\n监理方\t66905\n庆铃五十铃\t66906\n大爱\t66907\n老街\t66908\ntragic\t66909\n诱导型\t66910\n采光井\t66911\n艮山\t66912\n1毛\t66913\n中考卷\t66914\n|职业技术学院\t66915\nfractional\t66916\n卡波\t66917\nwindeployqt\t66918\n十尺\t66919\n勇者斗恶龙2\t66920\n青岛大学商学院\t66921\nSAMA\t66922\n本区\t66923\n发音器\t66924\n洗洁精\t66925\n连珠\t66926\n应税\t66927\n油豆腐\t66928\n3.8L\t66929\n腌鱼\t66930\n跨屏\t66931\n冠珠\t66932\n琅琊台\t66933\nVariational\t66934\nSale\t66935\nusb3.0\t66936\n意志力\t66937\n软包装\t66938\nmxnet\t66939\n沙田区\t66940\nTemporary\t66941\n先驱\t66942\n陈智强\t66943\n帕博西尼\t66944\n美帝国主义\t66945\n李建文\t66946\n杭温高铁\t66947\nbsh\t66948\n北京市旅游发展委员会\t66949\n陆勇\t66950\n枯荷\t66951\nedius7\t66952\n宁阳县\t66953\npotion\t66954\n怡景中心城\t66955\nableton\t66956\n新视野视\t66957\n发字\t66958\n甩子\t66959\n佳洁士\t66960\n水砖\t66961\n禅房\t66962\n109级\t66963\n梦天\t66964\n马岛\t66965\n60万元\t66966\n十兆\t66967\nCAD图纸查看器\t66968\nz370p\t66969\n玻尔兹曼机\t66970\n空气套\t66971\n进取版\t66972\n北京旅游集散中心\t66973\n丰e足食\t66974\nt450\t66975\nCampus\t66976\nXero\t66977\n赶制\t66978\n马伯骞\t66979\n3D学堂|3D虎\t66980\n飞儿乐队\t66981\n8万块\t66982\n中华v3\t66983\n耦合\t66984\n免税车\t66985\n小甜心\t66986\n片料\t66987\n工人先锋号\t66988\n两码\t66
989\n育发液\t66990\n小沙弥\t66991\n姚强\t66992\ncores\t66993\nmabits\t66994\n防腐木地板\t66995\n宠婚撩人:腹黑老公诱妻成瘾\t66996\n虎将\t66997\n苏一光\t66998\n夏儿\t66999\n上海_百度\t67000\nviburnum\t67001\n电话\t67002\n万恶居委会\t67003\n40w\t67004\n中国邮政特快专递单号\t67005\n马鲁斯\t67006\n李七喜\t67007\n独角莲\t67008\njing\t67009\n正当防卫3吧\t67010\n四十年代\t67011\n8331\t67012\n成都职业技术学院\t67013\n车载MP4\t67014\n商业贷款转公积金贷款\t67015\n莫尼山\t67016\n15立方\t67017\n漫展网\t67018\n游泳衣\t67019\n冯珂\t67020\n黑界\t67021\n围檩\t67022\n新和成\t67023\n博思\t67024\n绝世神偷\t67025\n食品营养强化剂\t67026\nEnum\t67027\n220_\t67028\n280nm\t67029\nCal\t67030\n净网\t67031\n李小波\t67032\n爱的真谛\t67033\nwzq609\t67034\n熊春锦\t67035\n木人桩\t67036\n低级\t67037\n中卫市\t67038\n太阳纹\t67039\n情人岛\t67040\n何政军\t67041\n南京航空航天大学\t67042\nOH\t67043\n3311\t67044\n专户\t67045\n西双版纳原始森林公园\t67046\n卡拉果\t67047\n大明眼镜\t67048\n38000\t67049\nzookeper\t67050\n排卵期计算器\t67051\n浪潮集团\t67052\nAv色\t67053\nc证\t67054\nDFB\t67055\n5000w\t67056\n中信银行广州分行\t67057\n乱爱\t67058\nddd\t67059\n狼尾草\t67060\nLost\t67061\n国威WS824\t67062\n陈宝生\t67063\n苏打饼干\t67064\n香天下\t67065\n发证\t67066\n韩辉\t67067\n大众交通\t67068\n培培\t67069\n鸡鸣驿\t67070\n立体化\t67071\n科西嘉岛\t67072\nWeb应用程序\t67073\nhotwind\t67074\n五十五岁\t67075\n写值\t67076\n寿元\t67077\n用水\t67078\n合山\t67079\n台帽\t67080\n重放\t67081\n化工原理\t67082\n兔宝宝\t67083\n蕲春县人民政府\t67084\n许文彪\t67085\n赵依依\t67086\n肉穴\t67087\n_和牛网\t67088\nJoiner\t67089\n155级\t67090\nT70\t67091\n乐家网\t67092\n政治学专业\t67093\n布式\t67094\n过冲\t67095\n朱珠\t67096\n黄石国家公园\t67097\n茗城\t67098\n盛京通\t67099\n非标\t67100\nQS世界大学排名\t67101\nBG\t67102\n速度\t67103\njiajinhao\t67104\n游旅游线路\t67105\n市政法委\t67106\n待工\t67107\n大溪水\t67108\n佩雷拉\t67109\n碎玻璃\t67110\n鱼鸟\t67111\n汉传\t67112\n分配率\t67113\n重叠\t67114\n不做爱\t67115\n顺峰山公园\t67116\n欲封\t67117\n西安地铁5号线\t67118\n定期存款利率\t67119\n88分钟\t67120\n张诚\t67121\n冰封王座\t67122\nefcore\t67123\n473\t67124\ngaygay\t67125\n处长\t67126\n孙路弘\t67127\nEU400\t67128\n港科大\t67129\n撺掇\t67130\n吴映洁\t67131\n第三十回\t67132\n浮躁\t67133\n小奶\t67134\n仪态\t67135\nsettime\t67136\nCUBASE\t67137\n3月20日\t67138\n骆驼街道\t67139\n对页\t67140\n分字\t67141\n煤气管\t67142\n花家地\t67143\nPIXY\t67144\nM8\
t67145\n8款\t67146\n顶级域名\t67147\n赫拉斯\t67148\n炉\t67149\n等腰\t67150\n霞飞路\t67151\n子网\t67152\nCompass\t67153\n罗曼·波兰斯基\t67154\n通城县\t67155\n安徽省人民政府法制办公室\t67156\n电子尺\t67157\n车牌识别系统\t67158\n棒针\t67159\n风火轮\t67160\n囤积\t67161\n图书网\t67162\n双氧水\t67163\n浪潮信息\t67164\n苏扬\t67165\n真空包装袋\t67166\nbackburner\t67167\n100W\t67168\n流莺\t67169\n华泰圣达菲\t67170\n清河站\t67171\n三夜间\t67172\n谨\t67173\n奔腾g4560\t67174\n心律\t67175\nccg\t67176\n100章\t67177\n典体\t67178\n令东来\t67179\n新秋\t67180\n10万吨级\t67181\n赵县\t67182\nav撸色\t67183\n新黑暗圣经\t67184\ngho\t67185\n夫差\t67186\n粤教版\t67187\n诺言\t67188\n天音寺\t67189\n武汉众邦银行\t67190\n600029\t67191\n慧聪商学院\t67192\n聪明才智\t67193\n康纳利\t67194\npartial\t67195\n博鳌乐城国际医疗旅游先行区\t67196\n2.2G\t67197\n炫风\t67198\n云马\t67199\n五脏俱全\t67200\n百代\t67201\n独行侠\t67202\n肥白\t67203\n幽浮2天选者之战\t67204\n油剂\t67205\n民法\t67206\n珍宝阁\t67207\n红树\t67208\n45套\t67209\n都铎\t67210\n迟来的爱\t67211\n湖东社区\t67212\n用题\t67213\ndestiny2\t67214\n程序猿DD\t67215\n青山沟\t67216\n情电影\t67217\n狐妖小红娘\t67218\ninpho\t67219\n67万\t67220\n勒令\t67221\n求婚大作战\t67222\narti\t67223\n听友\t67224\n注册有限公司\t67225\nWIFI信号\t67226\nXSL\t67227\n生吞\t67228\n澳门日报\t67229\n泛昔洛韦\t67230\n回形纹\t67231\nmavericks\t67232\n动话\t67233\n李普曼\t67234\nR7\t67235\n礼炮\t67236\n海创园\t67237\n三星S8+\t67238\n三四\t67239\n2017年11月4日\t67240\n法藏\t67241\n帮扶村\t67242\nICU\t67243\n应收账款融资\t67244\n新君威\t67245\n西泠印社\t67246\nprestige\t67247\n电子商务法\t67248\nzjw\t67249\nMarooned\t67250\n涨薪\t67251\n消消乐\t67252\n太阳谷\t67253\n合神\t67254\noutlook2003\t67255\n柿子\t67256\n搜狗细胞词库\t67257\nruntest\t67258\n张\t67259\n广告车\t67260\n华域汽车系统股份有限公司\t67261\n希乐城\t67262\n鉴湖\t67263\n参仙源\t67264\n摄动\t67265\nio1.1\t67266\n8259\t67267\n安全工程师考试\t67268\n微夏\t67269\n欧阳中石\t67270\n星状\t67271\nfrequently\t67272\n标准化\t67273\n5c\t67274\n颅内\t67275\n蠡湖大道\t67276\n大申\t67277\n红字\t67278\n病毒补丁\t67279\n139元\t67280\n乱麻\t67281\nRevolution\t67282\n珠海市教育局\t67283\n小狗\t67284\n延边新闻网\t67285\n战神3重制版\t67286\n三爪\t67287\n郴州市\t67288\n为知笔记吧\t67289\n第150期\t67290\n苹果8Plus\t67291\niPad越狱\t67292\n反证\t67293\n赣榆\t67294\nillegal\t67295\n妖精们\t67296\n海贼王启航\t67297\nSweep\t67298\nlhb
\t67299\n咕力\t67300\n道听\t67301\n北京协和\t67302\n汇图网\t67303\n三国杀ol\t67304\n婚丧\t67305\n身高差\t67306\n快餐桌\t67307\n空房间\t67308\n转包\t67309\n六力\t67310\n谢玉\t67311\nrussia\t67312\n借款单\t67313\n树花凛\t67314\n4v4\t67315\n下个星期\t67316\n谢旦\t67317\n迷奸药\t67318\n72张\t67319\n乳胶衣\t67320\n三雄极光\t67321\n90.5\t67322\n闪蒸\t67323\n徐枫\t67324\n花团锦簇\t67325\nTMS\t67326\n滥用\t67327\nS40\t67328\n健享\t67329\n胰\t67330\n监委\t67331\n22季\t67332\n30余年\t67333\n我的早更女友\t67334\n200多天\t67335\n分子料理\t67336\n总胆汁酸\t67337\n压疮\t67338\n汉东\t67339\n许颖\t67340\n磷酸\t67341\n母门\t67342\n基础学\t67343\n6世纪\t67344\n受理费\t67345\n陈佩斯\t67346\n众口\t67347\n月布\t67348\n刺\t67349\n写道\t67350\n真正\t67351\n速算扣除数\t67352\n友人\t67353\n桃花源景区\t67354\n航天局\t67355\n靖江\t67356\n海安中学\t67357\nⅣ\t67358\n红豆汤\t67359\n告破\t67360\n百练百胜\t67361\n鸟\t67362\n黏性土\t67363\n常州信息职业技术学院\t67364\nC++Builder\t67365\n光白\t67366\n我的秘密\t67367\n换身\t67368\n果味\t67369\n北大资源集团\t67370\n单样\t67371\nSAP\t67372\n地埋式一体化\t67373\n流金水\t67374\nC位\t67375\n丹柯\t67376\n银城东苑\t67377\ngodspeed\t67378\n茶器\t67379\n同上\t67380\n男裤\t67381\n台球网\t67382\n河北省建设厅\t67383\n慈湖\t67384\nismallboy\t67385\n登麻\t67386\ninland\t67387\n李姣\t67388\n肉桂粉\t67389\n南京大学图书馆\t67390\n囧司徒\t67391\n熔合\t67392\nFibers\t67393\n个样\t67394\n正夫\t67395\npenalty\t67396\n官鹅沟\t67397\n茨威格\t67398\n俱乐部\t67399\nValor\t67400\n1537\t67401\n磁电式传感器\t67402\n087\t67403\n排解\t67404\n江西省国资委\t67405\n紫禁城\t67406\n重庆市中医院\t67407\n小米移动电源\t67408\n三角形的认识\t67409\n魅族mx2\t67410\n任鹏\t67411\n艾优\t67412\n3月4号\t67413\n日盛\t67414\n上台\t67415\n专程\t67416\n24W\t67417\n龃龉\t67418\n超级\t67419\n这一样\t67420\n回应\t67421\n林桃\t67422\n索尔斯克亚\t67423\n六年级下册\t67424\n杜房明\t67425\n出声\t67426\n优秀率\t67427\niso\t67428\n试产\t67429\nApp.config\t67430\n飞思相机\t67431\ngreendays\t67432\n半岁\t67433\n大秀哥\t67434\n艺术品鉴\t67435\n庶女\t67436\n亨利·马蒂斯\t67437\n陈怡曼\t67438\nGTX560\t67439\n调节池\t67440\n过剩\t67441\n马加\t67442\n风疹疫苗\t67443\nBenchmarks\t67444\n金叶复叶槭\t67445\nglenn\t67446\n浮动汇率\t67447\nfaa\t67448\n雷洋\t67449\nharold\t67450\nparasite\t67451\n前移式\t67452\nlaucher\t67453\n弛\t67454\n青秀万达广场\t67455\nhechunhua\t67456\nDownton\t67457\n辜怡媃
\t67458\n2016年5月27日\t67459\n小仙女\t67460\n浙江广厦猛狮\t67461\n奋飞\t67462\n2500亿美元\t67463\nfxp\t67464\n淘气值\t67465\nmac2016\t67466\n退场\t67467\n光纤到户\t67468\n流BD\t67469\n数论\t67470\n核准\t67471\n不可为\t67472\n广晟\t67473\nKcash\t67474\n召集令\t67475\n左昱\t67476\n死信\t67477\n孙飘扬\t67478\nthisis\t67479\n单间\t67480\n合同部\t67481\n苏州市新区\t67482\n双向情感障碍\t67483\n瑞阳\t67484\n代步车\t67485\n偏安\t67486\n圆棒\t67487\n婆娘\t67488\n正宫\t67489\n可嘉\t67490\n严歌苓\t67491\n这个子\t67492\n供餐\t67493\n委托诉讼代理人\t67494\n4710MQ\t67495\n无情人\t67496\n王晓丹\t67497\n车牌街拍网\t67498\n愚公移山\t67499\n新开镇\t67500\n支付业务许可证\t67501\n贵州网\t67502\n331\t67503\n甲癣\t67504\n两栖攻击舰\t67505\n125名\t67506\n发尾\t67507\n好可\t67508\n吊眼\t67509\n李惠美\t67510\n文昭关\t67511\n中国作家网\t67512\n载流\t67513\n10周\t67514\n汀步\t67515\n转运仓\t67516\n私链\t67517\nsecondly\t67518\n新名\t67519\n电子线\t67520\nRM020601\t67521\n金福隆\t67522\n7月11日\t67523\ncleveland\t67524\ngrouped\t67525\n天敏\t67526\n000488\t67527\nD630\t67528\n60R18\t67529\n箱龟\t67530\n表间\t67531\n烧茄子\t67532\n打烂\t67533\n格莱\t67534\nrehat\t67535\n兴山\t67536\n香港投行\t67537\n恒盛地产\t67538\n高墙\t67539\n中正则\t67540\n昆山经济开发区\t67541\n普药\t67542\n歼16\t67543\n1753\t67544\nOther\t67545\n3DHGAME\t67546\n藤井树\t67547\nProTools\t67548\n动车组\t67549\n南花园\t67550\n通灵兽\t67551\n蒙城北路\t67552\n中式快餐店\t67553\n太平岛\t67554\n基本港\t67555\n纯净水设备\t67556\nmssql2000\t67557\n南昌南\t67558\n火影忍者羁绊\t67559\n讲重\t67560\n73名\t67561\n我的皇帝陛下\t67562\n奇术\t67563\n苏黎世机场\t67564\n厨余\t67565\n抽空\t67566\n神仙膏\t67567\n▌\t67568\n少数几个\t67569\n發展\t67570\neub\t67571\nPatagonia\t67572\n400公里\t67573\n连接串\t67574\n斯柯达科迪亚克\t67575\n李卫当官\t67576\n炉顶\t67577\nRescue\t67578\n铸态\t67579\n雨落\t67580\nahd\t67581\ndent\t67582\nIPZ\t67583\nAcids\t67584\n乔巴\t67585\n净化厂\t67586\nAisino\t67587\n公变\t67588\n数据宝\t67589\n茶饼\t67590\n山西煤化所\t67591\nSato\t67592\n开发员\t67593\n线锯\t67594\n锦秋路\t67595\npadding-top\t67596\n景泰县人民政府\t67597\n淮南牛肉汤\t67598\ncosy\t67599\n20141011\t67600\nADE\t67601\n注册资产评估师考试\t67602\n米克诺斯岛\t67603\n诉权\t67604\n网络安全工程师\t67605\n蔑视\t67606\n库班\t67607\n1.2亿元\t67608\n鸳鸯刀\t67609\n人教小学\t67610\n线\t67611\n农家饭\t67612\n增值税会计处理规定\t67613\
n安格斯牛\t67614\n虹云\t67615\n巡游者\t67616\n珠海农商银行\t67617\n气费\t67618\n回旋钻\t67619\nagen\t67620\nAda\t67621\nputty\t67622\n海福乐\t67623\n防性\t67624\n胡编\t67625\n损害\t67626\n金宇澄\t67627\naliens\t67628\n20141231\t67629\n乐播\t67630\n玻镁\t67631\nCLUSTER\t67632\n530T\t67633\n水歌\t67634\n3562\t67635\n重心法\t67636\n四川省国土资源厅\t67637\n牵线\t67638\nbej48\t67639\n劝返\t67640\n培训版\t67641\n前行\t67642\n5688\t67643\nNBL\t67644\n枫杨\t67645\n万锦城\t67646\n撕夜\t67647\n肉羹\t67648\n踪迹\t67649\nndvi\t67650\n气花\t67651\n开启者\t67652\n东北振兴\t67653\n不饱和脂肪酸\t67654\n墨丘利\t67655\n每3分钟\t67656\n660ti\t67657\n700吨\t67658\n捧\t67659\n朱庄村\t67660\n疯狂英语\t67661\n多分\t67662\n鸡鸡\t67663\n84\t67664\n龟兔赛跑\t67665\n胡扬忠\t67666\n21.89分\t67667\n中枢\t67668\n二维码扫描软件\t67669\n社会学类\t67670\n饕餮记\t67671\n弦琴\t67672\nNeue\t67673\n4999元\t67674\n赶酒会招商网\t67675\n本房\t67676\n柳熙烈\t67677\n斑马螺\t67678\n去不掉\t67679\n种植者\t67680\nLuke\t67681\n张洁洁\t67682\n英雄无泪\t67683\n世界自然遗产\t67684\n膜机\t67685\nmail\t67686\n张爱华\t67687\n事务代码\t67688\n三网通\t67689\n克鲁姆洛夫\t67690\n红枣糕\t67691\n叙利亚政府军\t67692\n东北角\t67693\n克里斯丁\t67694\n猴王出世\t67695\npykafka\t67696\nE475\t67697\nWin10+Ubuntu\t67698\n前轮\t67699\n脓疱疮\t67700\n︶\t67701\n印刷工\t67702\n割席\t67703\n康铭\t67704\ncare\t67705\n德谟克利特\t67706\nuu\t67707\nCG游麟网\t67708\n上海非凡进修学院\t67709\nrestriction\t67710\n发痧\t67711\natm\t67712\n茫茫\t67713\n学生部\t67714\n500期\t67715\n1080p超清\t67716\n反色\t67717\n轻工\t67718\nSBM\t67719\n安祥\t67720\nkool\t67721\niang\t67722\n崇尚\t67723\n德玛西亚LOL视频站\t67724\n同责\t67725\n索塔\t67726\nIMFirewall\t67727\nTHEOL\t67728\n猫女\t67729\n上海交通大学航空航天学院\t67730\nmyscript\t67731\n乱臣\t67732\n形色\t67733\n蓝瑟\t67734\nPClady百科\t67735\n养殖厂\t67736\nRHEL5\t67737\nPES2015\t67738\n手标\t67739\n雅瑶\t67740\n殡改\t67741\n枇杷花\t67742\nadboe\t67743\n箜篌\t67744\n数千人\t67745\n魅族吧\t67746\n良辰讵\t67747\n报实销\t67748\n大冶\t67749\nteredo\t67750\n燕山路\t67751\nSpider\t67752\n金岛\t67753\n多大量\t67754\n拍房\t67755\nMines\t67756\n4.22\t67757\n5月7号\t67758\n尚东区\t67759\n砖胎\t67760\nkolla\t67761\n通缝\t67762\n砸墙\t67763\nDMMD\t67764\n黄真真\t67765\navi播放器\t67766\n三通一平\t67767\nJulio\t67768\n五和大道\t67769\n335Li\t677
70\n蘑菇岛\t67771\nΤ\t67772\n房税\t67773\n乐圣\t67774\n君心似\t67775\n一队\t67776\n思力华\t67777\nuis\t67778\nChicken\t67779\n表座\t67780\n复古风\t67781\n鲁本斯\t67782\n菜虫\t67783\n思迪\t67784\n第版\t67785\n钱德勒\t67786\nhtttps\t67787\n佛城西路\t67788\n难言之欲\t67789\n吴承瑛\t67790\n火魔\t67791\n上海加油站\t67792\n小布\t67793\nnumba\t67794\n普利策奖\t67795\n笛膜\t67796\n送餐员招聘网\t67797\n战极姬\t67798\n画集\t67799\n清河城\t67800\n氯仿\t67801\n嘚\t67802\n雪之舞\t67803\n甜豆\t67804\n百度carlife\t67805\n杯水车薪\t67806\nX509\t67807\n钓箱\t67808\nOriginally\t67809\n华曦\t67810\n视听语言\t67811\nmuduo\t67812\n数学节\t67813\nmonmon\t67814\n卡布达\t67815\nLord\t67816\n带资\t67817\n淫行\t67818\nkettle7\t67819\nmysq数据库\t67820\nnsi\t67821\n第三方理财公司\t67822\n知路\t67823\nChastity\t67824\n异色\t67825\n李三吱\t67826\nWashington\t67827\n成年\t67828\n绝地求生游戏\t67829\n苏州博物馆\t67830\n提留\t67831\nCakes\t67832\n韩文版\t67833\n回头是岸\t67834\n欢迎光临\t67835\n稻盛和夫\t67836\n逡巡\t67837\nEric6\t67838\n12座\t67839\n阿莫西林颗粒\t67840\ndeeper\t67841\n柳时镇\t67842\nSTM32-F3/F4/F7\t67843\n隆基\t67844\nserendipity\t67845\npathogen\t67846\n论述文\t67847\nOffice2011\t67848\n泌乳\t67849\n6380\t67850\n玉米胚芽油\t67851\n猿题\t67852\nToon\t67853\n付款期\t67854\n浒山街道\t67855\n泰芒\t67856\n流动性\t67857\nsentence\t67858\n赋比兴\t67859\nm型\t67860\n草编\t67861\nLimitless\t67862\n入职体检报告\t67863\nSticks\t67864\n西北人才网\t67865\n宝庆斋乔\t67866\n2瓦\t67867\n色天使\t67868\n常远\t67869\n大袖\t67870\nspoon\t67871\npcv\t67872\n国庆\t67873\n褚遂良\t67874\n山东省人民政府法制办公室\t67875\n法拉利汽车\t67876\n鬼镇\t67877\n火疗\t67878\n潮玩\t67879\nclion\t67880\n嘉兴市交通运输局\t67881\n吽\t67882\n小小春\t67883\n巨汉\t67884\n7.2破碎\t67885\n铭瑄\t67886\nRoots\t67887\n1v1\t67888\n行囊\t67889\n导航员\t67890\nVr\t67891\n死歌\t67892\n多不多\t67893\n荷花酒\t67894\naxe\t67895\nglobaldata\t67896\n山东威达\t67897\n本多忠胜\t67898\nCascade\t67899\n小君\t67900\n阿亮\t67901\n海誓\t67902\n安室奈美惠\t67903\n三联生活周刊\t67904\n快乐牛牛\t67905\n今夕\t67906\n希尔德\t67907\nadoquery\t67908\n宣桥\t67909\n加盖\t67910\n三八线\t67911\n湖南省粮食局\t67912\n药学类\t67913\n止水带\t67914\n熊猫直播_泛娱乐\t67915\n佳能700D\t67916\n雄安规划\t67917\n良辰美景奈何天\t67918\nS3C6410\t67919\nChopper\t67920\n贝勒爷\t67921\nflex\t67922\n离子化\t67923\n川崎
市\t67924\nstrptime\t67925\n广州军区总医院\t67926\n庆隆南山\t67927\n杨兴\t67928\n高危\t67929\nm2接口\t67930\n首存\t67931\n福彩\t67932\n沈南鹏\t67933\n卵圆孔未闭\t67934\n剪\t67935\n协作者\t67936\n霞山\t67937\n拜恩\t67938\n金瑞期货\t67939\nrxjs\t67940\n养伤\t67941\n脱宅\t67942\n索坦\t67943\n排障\t67944\nimply\t67945\n光谷社区\t67946\n祭礼\t67947\n赵乐际\t67948\n晚晚\t67949\n娇兰\t67950\n4.83\t67951\n十亿个\t67952\n青儿\t67953\n慵\t67954\n论金\t67955\n盛大网络\t67956\n2017年3月1日\t67957\nM1213nf\t67958\n徐捷\t67959\n闪电侠\t67960\n安徽省科学技术厅\t67961\n前三个月\t67962\n360极速模式\t67963\n一把把\t67964\n众叛亲离\t67965\n2013款\t67966\n曹科长\t67967\n入门版\t67968\n摘除术\t67969\n龙珠鸡皇杯\t67970\nPS3模拟器\t67971\n山东招生考试网\t67972\n天琴座\t67973\n捆绑销售\t67974\n小黄本\t67975\nlm\t67976\n抹药\t67977\n无家可归\t67978\n中华白海豚\t67979\n房买卖\t67980\n钻石画\t67981\n{&#160\t67982\n5.1.1\t67983\n张德江\t67984\n周凯\t67985\n张悦轩\t67986\nMasturbation\t67987\n28个\t67988\n张家口市\t67989\n增进\t67990\nnidec\t67991\n市桥街道\t67992\n听觉\t67993\n跳步\t67994\n深汕高速\t67995\n算量\t67996\n订位\t67997\n上海世界外国语小学\t67998\n寒颤\t67999\nwie\t68000\n野鹿\t68001\n单据\t68002\nDurham\t68003\nLinuxEye\t68004\nFireFox\t68005\n一所\t68006\n卢氏吧_\t68007\n属于你\t68008\n畅越\t68009\ndlv\t68010\n星范\t68011\n太平人寿\t68012\n黄公\t68013\n国礼\t68014\n崇安\t68015\nHIStory\t68016\n甘草堂\t68017\n101第一期\t68018\n8lag\t68019\nx^4\t68020\n江红\t68021\n店庆\t68022\n立山\t68023\nMomo\t68024\n上外附小幼儿园\t68025\n陕西米脂三中\t68026\n滨江中路\t68027\n楚门的世界\t68028\n生态袋\t68029\nDarin\t68030\n一日千里\t68031\njujua\t68032\n狂犬病毒\t68033\n华尔街\t68034\n第一架\t68035\n奖项\t68036\n百度医学\t68037\n绿园\t68038\n装饰工程有限公司\t68039\n七尺\t68040\n贺聪\t68041\n3H21J22\t68042\n非诚勿扰2\t68043\n送教下乡\t68044\n引导者\t68045\n荔山\t68046\n陆小曼\t68047\n1499.2\t68048\n慰问金\t68049\n紫荆花园\t68050\n重编\t68051\n捕兽\t68052\n胆红\t68053\n迅雷批量\t68054\nfudan\t68055\n接到\t68056\n名探偵コナン\t68057\n大无脑\t68058\n王珊珊\t68059\nSWMM\t68060\n实验学校\t68061\n边海\t68062\nラム\t68063\n打脚\t68064\n烧腊\t68065\n出版集团\t68066\n李文斌\t68067\n学天教育\t68068\n牛血清蛋白\t68069\n珠晖\t68070\n延安干部学院\t68071\n二体\t68072\n20161127\t68073\n古钱币收藏网\t68074\nsyntaxerror\t68075\n先刑\t68076\n专升本考试_无忧考网\t68077\n完达山乳业\t68078\nQiang\t68079\n窗口管理器\t
68080\n修仙\t68081\n爬起来\t68082\n皮鼓\t68083\n闵行政府网\t68084\n红线\t68085\n富力十号\t68086\n拜月\t68087\n1H\t68088\nanion\t68089\nproposition\t68090\n齿龈\t68091\n伊维菌素\t68092\n阿瓦雷兹\t68093\n升龙拳\t68094\n96家\t68095\n秦皇\t68096\n78%\t68097\n长海县\t68098\nm277\t68099\nペルソナ\t68100\n山东工商学院\t68101\nreserve\t68102\n@50\t68103\n吴志雄\t68104\n飓风营救3\t68105\n什邡之窗\t68106\n提篮\t68107\n开放型经济新体制\t68108\n浙江海洋学院\t68109\n多吉美\t68110\n20170714\t68111\n深宵未眠为爱静\t68112\n发酵机\t68113\nMHDD\t68114\n万源市政府\t68115\nyaman\t68116\n拍立得\t68117\n龙池\t68118\n实质化\t68119\n江湖情\t68120\n两城镇\t68121\n直达\t68122\n翠亨村\t68123\n消消\t68124\njingle\t68125\n小蓝单车\t68126\n不够用\t68127\n猫版\t68128\n议会\t68129\ninstalls\t68130\nPlastics\t68131\n相思\t68132\n54级\t68133\n5万辆\t68134\n江高镇\t68135\n骨性龅牙\t68136\n妈妈群\t68137\n拍拍网\t68138\nzgc\t68139\n阿建\t68140\n第12张\t68141\nWhether\t68142\n神道丹尊\t68143\n西红柿鸡蛋汤\t68144\n宽城县\t68145\n免疫缺陷病\t68146\n年久失修\t68147\n寻呼\t68148\nCFL\t68149\n底炉\t68150\n数百种\t68151\n马蹄金\t68152\n横泾\t68153\n下周一\t68154\n唐柔\t68155\n茸城\t68156\n弑天刃\t68157\n教育网\t68158\n敢达ol\t68159\n周婕纶\t68160\n六甲\t68161\n103枚\t68162\n启东人\t68163\n嬉皮\t68164\n横平竖直\t68165\n决制\t68166\n古墓丽影3\t68167\n灰色地板\t68168\n秀贞\t68169\n鞋码\t68170\n戏画\t68171\n玄修\t68172\n7月1日\t68173\n制动器\t68174\n老曾\t68175\n行驶\t68176\n定时开关机\t68177\n老老王\t68178\n大龄\t68179\n中国农业新闻网\t68180\n37分\t68181\n金律\t68182\n路局\t68183\npingpong\t68184\n|日\t68185\n利率\t68186\n一百秒\t68187\n一人之下\t68188\nnbsp\t68189\n全备\t68190\nSCX-4521F\t68191\n刑拘\t68192\n修井机\t68193\nUTD\t68194\n十美\t68195\n偷梗\t68196\n科尔摩根\t68197\n泰来\t68198\n云眼\t68199\n2000多万\t68200\n情伤\t68201\n远望谷\t68202\nOauth2.0\t68203\n36计\t68204\n师大版\t68205\n心理访谈\t68206\n同步课堂\t68207\n潍坊银行\t68208\n蓝莓之夜\t68209\n辗迟\t68210\n熊浩\t68211\n室内装饰\t68212\n友谊大道\t68213\n成就奖\t68214\nPatterns\t68215\n高一丈\t68216\n猫儿\t68217\n报关员\t68218\n2018年3月9日\t68219\n鹭鸟\t68220\n三峡宜昌网\t68221\n赵文瑄\t68222\n17.7\t68223\n橘子树\t68224\n金刚狼3\t68225\n雏鸟\t68226\n成都市水务局\t68227\n等离子体质谱仪\t68228\n神方\t68229\n天河\t68230\n2.5L\t68231\n南宝山\t68232\n专集\t68233\n常熟百姓网\t68234\nMeasurement\t68235\nmhw\t68236\n郭冬临\t68237\nelton\t6
8238\n台塑\t68239\nmybits\t68240\n内地人\t68241\n无意识\t68242\nnw-a45\t68243\n王牌大贱谍\t68244\nAAPL\t68245\n曾翔\t68246\nlick\t68247\n家用投影\t68248\n文殊菩萨心咒\t68249\nCCT\t68250\nmmseg\t68251\n纸草\t68252\n广东省林业厅\t68253\n荒野行动奇怪君\t68254\n跋扈\t68255\n詹密\t68256\nopacity\t68257\n2.so.2\t68258\n全支\t68259\n1615\t68260\n六九七十六\t68261\nTraveler\t68262\n25仔\t68263\njavascript\t68264\n叫价\t68265\n只此一家\t68266\nC罗\t68267\n合富置业\t68268\n魅蓝2\t68269\n扎根\t68270\n无路可逃\t68271\n提法\t68272\n存储池\t68273\n沃茨\t68274\ndenis\t68275\njog\t68276\n中材\t68277\n大美\t68278\n50包\t68279\n广西自治区\t68280\nFatt\t68281\n十七孔桥\t68282\n抑后扬\t68283\n远兴能源\t68284\n财务报表编制\t68285\nmarkingNavicat\t68286\nstratum\t68287\n壹课堂\t68288\n几a\t68289\nbotanical\t68290\n盲道砖\t68291\n马长\t68292\nwebpack-plugin\t68293\n基督山伯爵\t68294\n隔脏\t68295\n百丽\t68296\nwindow7\t68297\n75吨\t68298\nLetv\t68299\nMobility\t68300\nMESH\t68301\n甲状腺球蛋白抗体\t68302\n快乐的日子\t68303\n离型剂\t68304\n性感沙滩zero\t68305\nHypixel\t68306\n摸了摸\t68307\n广州分公司\t68308\n虹桥长途西站\t68309\n果库\t68310\n第一下\t68311\n预选赛\t68312\n古埃尔公园\t68313\n益田路\t68314\n酷派\t68315\n1.2米\t68316\n热熔\t68317\n同里\t68318\n文化馆\t68319\n天山集团\t68320\n硬聚氯乙烯\t68321\n网易相册_2481984605@qq.com\t68322\n农牧\t68323\nwindowsserver2008r2\t68324\n8.0.1\t68325\n万无一失\t68326\n台州市网上公安局\t68327\n着调\t68328\n云水\t68329\n欸\t68330\n人民币银行结算账户\t68331\n爱的谎言\t68332\nd700\t68333\n阮小七\t68334\n63吨\t68335\n笑动\t68336\n亚马逊阅读器\t68337\ncheerio\t68338\n幻视\t68339\n中航锂电\t68340\n3618\t68341\n史兰芽\t68342\n水溶液\t68343\n企业文件\t68344\n列缺\t68345\n藍沢潤\t68346\n布洛妮娅\t68347\n青海省经济和信息化委员会\t68348\nDAILY\t68349\n重庆富侨\t68350\nendnot\t68351\nNSOperation\t68352\n水患\t68353\nceramic\t68354\n龍隐\t68355\n太冷\t68356\nsperry\t68357\n三栏式明细账\t68358\n蚌埠南\t68359\n住房\t68360\n瞿塘峡\t68361\n北京)投资管理有限公司\t68362\nCheer\t68363\nwinch\t68364\n九九乘法表\t68365\n看透\t68366\n无终\t68367\n401.1\t68368\n祖庙\t68369\n六灶镇\t68370\n音乐播放器\t68371\n黄丽玲\t68372\n棉布袋\t68373\n5068.com\t68374\n繁琐\t68375\n张迪\t68376\n738\t68377\n男音\t68378\n嘉州房产网\t68379\n建设街道\t68380\n2.8寸\t68381\n远期汇率\t68382\n农农\t68383\n团派\t68384\n鬼屋\t68385\n来处\t68386\n热饭\t6838
7\n根法\t68388\n神雕侠侣小龙女\t68389\nSNIS-531\t68390\ncozy\t68391\n汤峪\t68392\n咽喉肿痛\t68393\n吴清源\t68394\naj6\t68395\n静脉\t68396\n北景园\t68397\n属兔\t68398\n中国机械进出口(集团)有限公司\t68399\n平安好房\t68400\n上海大学文学院\t68401\n盛屯矿业\t68402\n安伟\t68403\n旅游展\t68404\ndotted\t68405\nrubber\t68406\n李迪\t68407\n特新\t68408\n地狱之轮\t68409\n安正时尚\t68410\n念诗\t68411\n剖析\t68412\n健康促进学校\t68413\n量子产率\t68414\n2018.04.09\t68415\n三相珠宝\t68416\n自赏\t68417\nzxx\t68418\nGrab\t68419\n北京儿童医院\t68420\n外包工\t68421\n微牛证券\t68422\nlog4net\t68423\nwin7系统笔记本\t68424\n5750\t68425\n任洁\t68426\n5月5号\t68427\n合魂\t68428\n电感耦合\t68429\n轻客\t68430\n长编\t68431\n约单\t68432\n王牌特工2:黄金圈\t68433\n雅各\t68434\n段值\t68435\n人教版五年级下册英语\t68436\n瑞奇与叮当\t68437\nEditPlus\t68438\nP3D\t68439\n白塔路\t68440\n现场会\t68441\nhpv9\t68442\n第一节\t68443\n荣耀7全网通\t68444\n内蒙古自治区工商行政管理局\t68445\n黄印\t68446\n广发期货\t68447\n副书\t68448\nmsp\t68449\n按份\t68450\n暖奶器\t68451\n湾仔码头\t68452\n恋爱辅助器\t68453\n伊犁地区\t68454\n南宁供电局\t68455\n旋转阀\t68456\n土豆牛肉\t68457\nVirtuoso\t68458\n魔王窟\t68459\n0000\t68460\n汹涌澎湃\t68461\n郊狼\t68462\n攻丝\t68463\n美涂士漆\t68464\n天津中心妇产医院\t68465\n预取\t68466\nQ6600\t68467\n西咸新区沣西新城管委会\t68468\n仑苍\t68469\ngang\t68470\n篮球界\t68471\n融创观澜壹号\t68472\n2q\t68473\nDLAN\t68474\n慢性附睾炎\t68475\n820型\t68476\n随机微分方程\t68477\n中国周易研究会\t68478\n安全新\t68479\nFlavour\t68480\n安尼威尔\t68481\n棋坛\t68482\n高筑\t68483\n牧牛杖\t68484\n北京嘉里大酒店\t68485\n渔婆\t68486\n4亩\t68487\nducks\t68488\n青医\t68489\n主宰\t68490\n白墙\t68491\n599号\t68492\n很幸运\t68493\n几十台\t68494\n一千五百\t68495\n宏来\t68496\n百分之八十\t68497\n者\t68498\nxxxxxxxxx\t68499\n汽车社区_车友互动中心\t68500\n胜义\t68501\n炮艇\t68502\n武林志\t68503\nG9350/全网通\t68504\n闻名于世\t68505\nFLO\t68506\nbilbil\t68507\nVirus\t68508\n导流\t68509\n北京科技馆\t68510\n名匠\t68511\n横街镇\t68512\n外窗\t68513\n极限特工2\t68514\n铅房\t68515\n数亿美元\t68516\n满堂彩\t68517\n科研\t68518\n款项\t68519\n351.4\t68520\n金依蓓\t68521\nparticles\t68522\n关税壁垒\t68523\n省赛\t68524\n王淑芳\t68525\norigin2016\t68526\n港龙航空\t68527\n老猎人\t68528\n邮轮\t68529\n高规\t68530\nNOLOCK\t68531\n猛追湾\t68532\n陶寺遗址\t68533\n旺运\t68534\n注册地址\t68535\n三线\t68536\nNintendoSwitch\t68537\n夹胶玻璃\t68538\n36周\t6
8539\nRegal\t68540\n海派\t68541\n过氧化值\t68542\n美原咲子\t68543\n东方集团\t68544\n海南省旅游发展委员会\t68545\n陶家\t68546\nKPI指标库\t68547\n微米\t68548\n软头\t68549\nmexican\t68550\n兰州理工\t68551\n被拐卖\t68552\n神爱\t68553\n跃击\t68554\nrisa\t68555\n凉城\t68556\n华堂\t68557\n马山县人民政府\t68558\n王鹤润\t68559\n高锴\t68560\n俄洛伊\t68561\n初学\t68562\n淘梦人生乐视体育\t68563\n一百种\t68564\n周昊\t68565\n洋车夫\t68566\n中村知恵\t68567\n控温\t68568\n火花塞\t68569\n胸无\t68570\n乱叫\t68571\n伯希\t68572\n大居\t68573\n袜男\t68574\n寄生虫学\t68575\n明悦\t68576\nXMind8\t68577\n手竿\t68578\n二手车行\t68579\n城市史\t68580\n宏立城集团\t68581\n重庆人大\t68582\n国志\t68583\n学厨\t68584\n78g\t68585\n垂感\t68586\nlantern\t68587\n三兵\t68588\n美蛙\t68589\n嫦娥二号\t68590\n全封闭\t68591\n震怒\t68592\n德德玛\t68593\n周辉\t68594\n2015年以来\t68595\n2345看图王\t68596\n华宅\t68597\n棘突\t68598\n富轩\t68599\n公租房信息网\t68600\n新概念英语青少版\t68601\n永垂不朽\t68602\n虚拟域名\t68603\n大祸\t68604\n宽甸满族自治县\t68605\n阿越\t68606\n入党积极分子申请书\t68607\n2018-03-21\t68608\n我中\t68609\n扫黑除恶\t68610\nrand函数\t68611\ncite\t68612\n左家庄街道\t68613\n隐私性\t68614\n二运\t68615\n说白了\t68616\nudacity\t68617\n2064\t68618\n850nm\t68619\n五七干校\t68620\n阿凡达2\t68621\namine\t68622\nkura\t68623\nnatalie\t68624\n北京千禧大酒店\t68625\n党湾\t68626\n2018年中\t68627\nvisvim\t68628\n全网通\t68629\ntravelling\t68630\n洗牙器\t68631\n玉泉\t68632\n皮斯卡略夫\t68633\n藤萝\t68634\nmonarch\t68635\n一起跳\t68636\ni7-8750H\t68637\n坡长\t68638\n严惩\t68639\n缩小版\t68640\n可乐雪碧\t68641\n可复美\t68642\n夹江县\t68643\n意式\t68644\n电弧\t68645\n九龙谷\t68646\n金銮\t68647\n尽忠\t68648\n拉科\t68649\n长和宽\t68650\n监事会\t68651\nHanvon\t68652\n16条\t68653\n责任险\t68654\ngtx780\t68655\nv1.2.0\t68656\nYP\t68657\n新型城镇化\t68658\nkop\t68659\nintroduces\t68660\n8013\t68661\n河坊街\t68662\n契约妻\t68663\nCEPA\t68664\n软底鞋\t68665\n低安\t68666\n山月不知我心事\t68667\nfare\t68668\n振臂\t68669\nvivoX9\t68670\n5.2万\t68671\n日字\t68672\nplot3\t68673\n第七届\t68674\n石敢当\t68675\n神农氏\t68676\n风流史\t68677\n刀币\t68678\n密道\t68679\n金田路\t68680\npaperok\t68681\n管理器\t68682\n爱的释放\t68683\n七星湖\t68684\n金晖\t68685\n养兔\t68686\nstaging\t68687\n爱尔眼科医院\t68688\nsql表\t68689\n湖北消防网_楚天消防\t68690\n纯露\t68691\n常侍\t68692\n古桥\t68693\n阳江市人民政府\t68694\nPa
nt\t68695\nosp\t68696\n试水\t68697\n鲸\t68698\n馨园小区\t68699\n杀道\t68700\n苏州市国土资源局\t68701\n响尾蛇\t68702\n蜘蛛蟹\t68703\nzm\t68704\n爱情篇\t68705\n7万元\t68706\n0572\t68707\n省辖市\t68708\n发笑\t68709\n桃花族论坛\t68710\n壁厚_\t68711\n工式\t68712\n中国信息产业网\t68713\n有缘人\t68714\nformpanel\t68715\n意合\t68716\n黑龙江省省\t68717\ncrypt\t68718\n无所畏惧\t68719\n纳爱斯\t68720\nfasdf\t68721\n幸福里小区\t68722\nbackgrounds\t68723\n金木研\t68724\n潘太康\t68725\n掌阅iReader\t68726\nNGN\t68727\n张锐\t68728\n北京文艺网\t68729\nInsta\t68730\n达县\t68731\n京天利\t68732\nsnappy\t68733\n东方巴黎\t68734\n进攻方\t68735\n全盘\t68736\n中央集权\t68737\n白云村\t68738\n高级语言\t68739\n过多\t68740\n所得额\t68741\nlanqi\t68742\n儒艮\t68743\n平行六面体\t68744\n拆迁办\t68745\n闯海\t68746\n开放者\t68747\n威名\t68748\n十二长生\t68749\nPearls\t68750\npstree\t68751\nWIP\t68752\n铁棒\t68753\n5月21日\t68754\n卡博替尼\t68755\nFIFA16\t68756\n择\t68757\n黑子的篮球\t68758\n吴克羣\t68759\nlibera\t68760\n50余家\t68761\n人子\t68762\ncinema\t68763\n进华\t68764\nqingdao\t68765\n4CD\t68766\n首都之窗\t68767\n纽约州\t68768\n好想吃\t68769\n衙役\t68770\n泡萝卜\t68771\n开赴\t68772\n超乐泰胶水\t68773\n硅晶\t68774\n万能角度尺\t68775\n电户\t68776\nddr5\t68777\n藏毒\t68778\n云即时通讯云\t68779\nDBA\t68780\n孝感市政府\t68781\n拾遗\t68782\n松田\t68783\n庐山真面目\t68784\n100abn\t68785\n摩擦片\t68786\n抽脂\t68787\niOS-OC\t68788\n十五层\t68789\nmerge\t68790\n保障性安居工程\t68791\n众邦\t68792\n国药网\t68793\n0类\t68794\n刘伯温\t68795\n帝号\t68796\n无意穿堂风\t68797\n龙石\t68798\n铝单板\t68799\n周六福\t68800\n74ls138\t68801\n趣阅\t68802\n爆降\t68803\nUIColor\t68804\n25_\t68805\n费用化\t68806\nCOCOS2DX\t68807\n奔驰CLA\t68808\n猴头菇饼干\t68809\n捞月狗英雄联盟\t68810\n新南湖\t68811\n俗\t68812\n玄策\t68813\nU1\t68814\n蝇头\t68815\n草图\t68816\n2019年\t68817\nsettling\t68818\n永嘉\t68819\n画像石\t68820\nAT&T\t68821\nWorship\t68822\n天意\t68823\nREPORT\t68824\n2325\t68825\n中华街\t68826\n对冲\t68827\n花钵\t68828\n拳皇95\t68829\n倒行逆施\t68830\nFHD\t68831\ngraduates\t68832\nmia\t68833\n左手\t68834\n湖墅\t68835\n易视云\t68836\n招商信用卡\t68837\n高行镇\t68838\nsata\t68839\n雨轩\t68840\n兵种\t68841\n克里\t68842\nmybatis\t68843\nRIO\t68844\n北运河\t68845\n不分红\t68846\n冰封迷案\t68847\n书袋\t68848\nMinions\t68849\n施工升降机\t68850\n白釉\t68851\n
Prisma\t68852\n如一\t68853\n金庸世界\t68854\nQMDN\t68855\n五六次\t68856\n擦条\t68857\neppendorf\t68858\n卫监\t68859\n一森\t68860\n危石儿\t68861\n琅东客运站\t68862\n透析液\t68863\nenables\t68864\n碘131\t68865\n随身课堂\t68866\n960fps\t68867\n歇山\t68868\n花瞳\t68869\n王小妮\t68870\n盐率\t68871\n谜样\t68872\n700公里\t68873\n耳道\t68874\n张辰\t68875\n回炉\t68876\n肾精\t68877\nGopher\t68878\n专项计划\t68879\n末生\t68880\n附子理中丸\t68881\n东京大学\t68882\n夺宝联盟\t68883\n多伦县人民政府\t68884\n150岁\t68885\n触角\t68886\n小叶\t68887\n叶峰\t68888\ngyz\t68889\n预科\t68890\n冰屋\t68891\n全昭旻\t68892\n有仇\t68893\n世邦魏理仕\t68894\n丽景小区\t68895\n玉琮\t68896\nEXCL\t68897\n女人坊\t68898\n深圳市住建局\t68899\nDragonfly\t68900\n状元村\t68901\n货币化\t68902\n聚甲基丙烯酸甲酯\t68903\n魏星\t68904\nE展\t68905\n杭州市国土资源局\t68906\n主题句\t68907\n1308\t68908\n咀嚼音\t68909\n明哲保身\t68910\n我们的生活充满阳光\t68911\n旋振筛\t68912\n查泰\t68913\n3PC\t68914\noffshore\t68915\n张志祥\t68916\nv4.5.2\t68917\n沉淀硫酸钡\t68918\n邓恩熙\t68919\n三国戏曹操传\t68920\n黑道风云\t68921\n机破星河\t68922\n天翼网\t68923\nwifi_\t68924\n借呗\t68925\nvale\t68926\n14路\t68927\n20180324\t68928\n俩位\t68929\n300MB\t68930\n问责\t68931\niw\t68932\n天马微电子股份有限公司\t68933\n拳坛\t68934\nNMS\t68935\n护士资格证考试\t68936\n圣山\t68937\n协作\t68938\n松口\t68939\n二手交易网\t68940\n社区妇联\t68941\nDsp\t68942\n袁媛\t68943\n廖俊波\t68944\n小嘎子\t68945\nisu\t68946\nexpressjs\t68947\nmasks\t68948\n酒水\t68949\n平谷区\t68950\n浮球液位开关\t68951\n减焦\t68952\n库号\t68953\n别烦\t68954\n得梅\t68955\nm6800\t68956\n姚大力\t68957\naffection\t68958\nhifidiy\t68959\n8686\t68960\n宁波通商银行\t68961\n片源\t68962\n小斌\t68963\n如东\t68964\n南安一中\t68965\n稔\t68966\n中交网\t68967\n张相\t68968\n大都会博物馆\t68969\n离婚证\t68970\n英格兰银行\t68971\nyale\t68972\n苏门答腊岛\t68973\n工商人\t68974\n撤单\t68975\n黑白稿\t68976\n身位\t68977\n劳务输出\t68978\nfedora\t68979\nis&#160\t68980\n泰亚\t68981\n奔驰GLE400\t68982\nalla\t68983\nVortex\t68984\nkross\t68985\n0200\t68986\n2017年9月11日\t68987\n呼死\t68988\n81890\t68989\nmarks\t68990\nintx\t68991\n山东省发展和改革委员会\t68992\ndady\t68993\n实人认证\t68994\n等不上\t68995\n一郎\t68996\n隔膜式气压罐\t68997\n齐步走\t68998\n安检仪\t68999\ntentacle\t69000\n注册建筑师考试\t69001\nBWCHINESE中文网\t69002\n尿酸碱度\t69003\n公交公司\t69004\n康耐特
\t69005\n我的女友小茵\t69006\nUI自动化测试\t69007\nMonthly\t69008\nTraxxas\t69009\neasydarwin\t69010\n亲格\t69011\nphpbb\t69012\n超威锂电池\t69013\n实木门\t69014\n索罗亚克\t69015\n票交所\t69016\n半殖民地\t69017\n斯伟江\t69018\n序列式\t69019\n天空战\t69020\n3655\t69021\n双滦\t69022\n威斯特法伦\t69023\n李明博\t69024\n李颀\t69025\n金星脱口秀\t69026\n不必要\t69027\n生子文\t69028\n8a\t69029\n随喜\t69030\nSimon\t69031\n人参皂苷Rh2\t69032\n芭蕉花\t69033\nrcm\t69034\n0.62\t69035\n到访\t69036\n发劲\t69037\n外连\t69038\nahh\t69039\n车云\t69040\n冰沙机\t69041\n许许多多\t69042\ndl4j\t69043\n东方绯想天\t69044\nsoundsport\t69045\n中国民航局\t69046\n万分\t69047\n心想事成\t69048\n湖北\t69049\n天冠\t69050\n甜梦\t69051\n沙星\t69052\n肖林\t69053\nTanker\t69054\n南师附中\t69055\n臻美\t69056\n侵权责任法\t69057\n13.1.1548\t69058\n凤囚凰楚玉\t69059\n电影史\t69060\n我的夫人\t69061\n省妇联\t69062\n中英日\t69063\nwangduo\t69064\n5024\t69065\n撒钱\t69066\n郑琦\t69067\n筠子\t69068\n2013—2015年\t69069\n卷扬机\t69070\n12月13日\t69071\n单硬脂酸甘油酯\t69072\n旷视\t69073\n无限\t69074\n语文园地四\t69075\n单元控制性详细规划\t69076\n杨辰\t69077\nKino\t69078\n玻璃心\t69079\n采荷实验学校\t69080\n付敏\t69081\n环刚度\t69082\n冯小怜\t69083\n希腊\t69084\n重生之大涅磐\t69085\n人道\t69086\ndaikuan\t69087\n定板\t69088\n巢础\t69089\n第41届\t69090\n深海狂鲨\t69091\n呼兰区\t69092\n摭谈\t69093\n品牌排行网\t69094\n神夏\t69095\n弗朗西斯\t69096\n天地伟业技术有限公司\t69097\n审判权\t69098\n文帝\t69099\n29日\t69100\n兴福\t69101\n铝电池\t69102\n侠盗飞车4\t69103\n大统华\t69104\n法帖\t69105\n张毅刚\t69106\n厄普西隆\t69107\nTemplates\t69108\n报表期\t69109\n温岭东部新区\t69110\n学生装\t69111\n10立方\t69112\n第六种\t69113\ncanopen\t69114\n石石\t69115\n夏言\t69116\nAAO\t69117\n2.UUU9\t69118\ndfb\t69119\n岛链\t69120\n偶氮\t69121\nvsync\t69122\n第三国\t69123\nhaodiaoniu\t69124\nTravelMate\t69125\n上百款\t69126\n气动阀门\t69127\n梯式\t69128\n15型\t69129\nwr\t69130\n蒿子粑粑\t69131\nWaist\t69132\n365集\t69133\n别墅区\t69134\n吴伟民\t69135\n新年篇\t69136\nK80\t69137\nMMM\t69138\n支付宝健康果\t69139\n干道\t69140\niostat\t69141\nagnoster\t69142\n嬷嬷\t69143\n许愿\t69144\n蒙餐\t69145\n叉车证\t69146\npublications\t69147\n武汉科技大学\t69148\n党员们\t69149\nAdonit\t69150\n混合液\t69151\nCentrum\t69152\n褡裢坡\t69153\n5A景区\t69154\n切枪\t69155\n名龙堂\t69156\n因势利导\t69157\n工时制\t69158\n万道剑尊\t
69159\n400多年\t69160\n发展心理学\t69161\nFIFAOL3\t69162\n断绝\t69163\n速途\t69164\n僵尸浣熊\t69165\nhit\t69166\n刘乔安\t69167\n冯正\t69168\n白细\t69169\nFreddy\t69170\n丝织品\t69171\n黄石火山\t69172\n跳弹\t69173\nf258\t69174\n宝博\t69175\n3万元\t69176\n霸王鞭\t69177\n29.8\t69178\n氧化铁红\t69179\n双重生\t69180\nfao\t69181\n湘教版小学\t69182\n新光路\t69183\n井底之蛙\t69184\n特岗\t69185\ncax\t69186\n渔民新村\t69187\n大容山\t69188\n顺义区医院\t69189\n糌粑\t69190\n152\t69191\n魔女之泉3\t69192\nthinpad\t69193\n夸夸\t69194\nIllustration\t69195\n只许\t69196\n三D\t69197\n布尔变量\t69198\nmicroSD卡\t69199\n发热丝\t69200\n妃子\t69201\n护套\t69202\n新时代健康\t69203\n调戏\t69204\n王宁利\t69205\n华北理工\t69206\nlge\t69207\n赤峰二中\t69208\n广发基金\t69209\n数控车床加工\t69210\n葱油饼\t69211\n常好\t69212\n胃出血\t69213\n中核华兴\t69214\n动武\t69215\n梅纽\t69216\n潭头\t69217\n芙蓉山\t69218\n艾伦秀第13季\t69219\n数间\t69220\n普\t69221\n武汉港\t69222\n指導\t69223\n先生\t69224\n双汇火腿肠\t69225\n滞后项\t69226\n合肥新站高新区\t69227\n1W2\t69228\n宽屏\t69229\n北京致远互联软件股份有限公司\t69230\n砂浆\t69231\n撤退\t69232\n大豫网\t69233\n北大资源博雅\t69234\nexsi\t69235\n疍家\t69236\n见谅\t69237\n手脑\t69238\nKennedy\t69239\n汤姆森\t69240\n俺去也\t69241\n小米面\t69242\n成长礼\t69243\n景龙\t69244\n纪旺西\t69245\n雎晓雯\t69246\n加建\t69247\n正宁路\t69248\n浙东大峡谷\t69249\nresharper\t69250\n饮片\t69251\n嵊州市\t69252\nbridge\t69253\nFemdom\t69254\n迟小秋\t69255\nAbel\t69256\n空战魔导士候补生\t69257\n美其美\t69258\n瑞年\t69259\n内部收益率\t69260\n日记网\t69261\n磐安\t69262\n因斯布鲁克\t69263\nlavavel\t69264\n飞向\t69265\n庆市\t69266\n阅朗\t69267\n上海出版社\t69268\n舒肝健胃丸\t69269\n会计从业资格考试网\t69270\n块儿\t69271\n十几分钟\t69272\n华泽集团\t69273\n方圆集团\t69274\n每2年\t69275\n中美联泰大都会人寿保险有限公司\t69276\n欧恩\t69277\n莞城街道\t69278\n伦巴第\t69279\nnicescroll\t69280\n解除\t69281\n冰封王座3\t69282\n10000mAh\t69283\n怪物猎人世界轻弩\t69284\n企业绩效评价标准值\t69285\n龙之吻\t69286\n洗手舞\t69287\n完稿\t69288\n可愿\t69289\n2018年4月3号\t69290\n为人处世\t69291\n王雅捷\t69292\n高压断路器\t69293\n切条\t69294\n袭警\t69295\n上海柏悦酒店\t69296\n解忧\t69297\nkirara\t69298\n第二十期\t69299\n循环体\t69300\n蛋雕\t69301\nfenty\t69302\n_恐怖漫画_黑白漫话\t69303\nhagao\t69304\n七曲山大庙\t69305\n百程\t69306\nKobe\t69307\n吴冠\t69308\nanger\t69309\nmod汉化版\t69310\n天津市河东区\t69311\n魔法世界\t69312\n万圣\t69
313\nEnchanted\t69314\n第159集\t69315\n以后\t69316\n软盒\t69317\nForms\t69318\n投诉率\t69319\nAppleCare\t69320\n铝镁锰合金\t69321\n马尔科夫\t69322\nLync\t69323\n发条英雄\t69324\n紫阳大道\t69325\nStefano\t69326\n5天\t69327\n何宝生\t69328\n莫斯科保卫战\t69329\n史丹利\t69330\n珺\t69331\n205g\t69332\nMatrix\t69333\nnick_huang\t69334\nkanye\t69335\n百拓\t69336\n触痛\t69337\n李德明\t69338\n暗香盈袖沐心田\t69339\n100环\t69340\n浙江交科\t69341\n肇州\t69342\n牧笛\t69343\n高帅富\t69344\n数字华容道\t69345\n锗石\t69346\n饱和电流\t69347\n湖北省国家税务局\t69348\nnba76ers-ChinaUnix\t69349\n第3届\t69350\nIBA\t69351\n库容\t69352\n外汇牌价\t69353\n易加\t69354\n声词\t69355\n玛莎拉蒂Ghibli\t69356\n养而亲不待\t69357\nComparative\t69358\nczs\t69359\n海角\t69360\n6969\t69361\n侥幸心理\t69362\n考文\t69363\n目光\t69364\n天天色\t69365\n换牙\t69366\n白塔\t69367\n毫毛\t69368\n杨修\t69369\n撸友\t69370\n绞牙\t69371\ng630\t69372\n仁和会计培训\t69373\nIntersection\t69374\n强殖\t69375\n第18040期\t69376\n窝囊废\t69377\n祎庭沫瞳\t69378\nstuck\t69379\n吾王\t69380\n点儿\t69381\n官配\t69382\n华北理工大学轻工学院\t69383\n晒黑\t69384\n东莞职业技术学院\t69385\n王忠诚\t69386\n这样的生活\t69387\n炸猪排\t69388\n鼠妇\t69389\n我的天空\t69390\n班本\t69391\n神武逍遥\t69392\n地痞\t69393\n数字万用表\t69394\n汉隶\t69395\n可恩\t69396\n龙铭\t69397\n进口轴承\t69398\n公司转让网\t69399\n三井住友银行\t69400\nMtime时光网\t69401\n磁力链接ed2k\t69402\ndatanode\t69403\n社保经办\t69404\n炯炯有神\t69405\nsending\t69406\n深圳14号\t69407\n阿坝藏族羌族自治州\t69408\nCesium\t69409\n十九大丨\t69410\n朝日电视台\t69411\n紫云县\t69412\none3\t69413\n勇敢面对\t69414\n湖南省移民局\t69415\n04月19日\t69416\nshiyi\t69417\n甲刃\t69418\n烽火佳人\t69419\n骨瓷\t69420\nstrange\t69421\n逯子\t69422\n泰亚赛福\t69423\n眼药水\t69424\n第3列\t69425\n掐指一算\t69426\n_无双大蛇吧_\t69427\n特征性\t69428\n常州小学\t69429\n齐步\t69430\nstoryboard\t69431\n劳动关系学\t69432\nroutine\t69433\n绿宝石\t69434\n正生\t69435\n卡乐比\t69436\n花镇情感网\t69437\n崇文实验学校\t69438\nc35\t69439\n天津中学\t69440\n块面\t69441\n旧版\t69442\n中华人民共和国审计署\t69443\n亚磷酸\t69444\n滚雷\t69445\nFreeOA\t69446\npostgresql\t69447\n小马路\t69448\nxavier\t69449\n长乐\t69450\n购物公园\t69451\n作伴\t69452\n酒干倘卖无\t69453\n淮北市教育局\t69454\n诗情画意\t69455\nclin\t69456\n1碗\t69457\n于振海\t69458\n中央台\t69459\n北京大秦新天下电子有限公司\t69460\n杨丽华\t69461\n真宵\t69462\n前屈\
t69463\n赞微商城\t69464\n毛猛达\t69465\n谭建荣\t69466\n同罪\t69467\nNICK\t69468\n厦门象屿集团有限公司\t69469\n辽源市人民政府\t69470\n高转速\t69471\nssd\t69472\n原矩阵\t69473\n炽烈\t69474\n20万条\t69475\nkendoui\t69476\n3265\t69477\n坛\t69478\n湖北江南专用特种汽车有限公司\t69479\n醇酸磁漆\t69480\n径\t69481\n花楹\t69482\nseite\t69483\n金雀花\t69484\n全县\t69485\n劳碌命\t69486\n失去理智\t69487\n牛津译林版八年级\t69488\n红润\t69489\n商贸\t69490\n唐河县\t69491\n拮抗剂\t69492\n黑水河\t69493\n集成板\t69494\n十三位\t69495\n八宫\t69496\n凹函数\t69497\n头昏脑胀\t69498\n奈德丽\t69499\n行不通\t69500\n口齿\t69501\nMEDICAL\t69502\n女装秀\t69503\n咏春拳\t69504\n1000公里\t69505\nrollei\t69506\nNS版\t69507\n热身\t69508\nSHEEN\t69509\n进化之地2\t69510\n酸价\t69511\n东莞市科学技术局\t69512\n星游\t69513\ncadance\t69514\n革字\t69515\n肉糜\t69516\n上兴镇\t69517\n智慧树幼儿园\t69518\nct机\t69519\n云中鹤\t69520\n天猫双11\t69521\n温州育英国际实验学校\t69522\n都峤山\t69523\nSaw\t69524\n蓝若冰\t69525\n王丽雪\t69526\n茶陵县政府\t69527\n血糖仪\t69528\n音\t69529\n魔联军\t69530\n测试赛\t69531\nG302\t69532\n第62期\t69533\n能效比\t69534\n中国卡车网\t69535\n广汽乘用车\t69536\nRESTORE\t69537\nOnsen\t69538\n唐朝好男人\t69539\n伊人综合网\t69540\n三款\t69541\nGB2312\t69542\n抱姿\t69543\n农产品\t69544\n反意\t69545\n大峃镇\t69546\n香港瑞丰会计事务所\t69547\nヒトヅマライフ\t69548\n胶球\t69549\n全图纸\t69550\n行署\t69551\nabp\t69552\nEasyMock\t69553\nseventh\t69554\n别回\t69555\n而生\t69556\n高邦\t69557\n木金\t69558\n建南汽车站\t69559\n100兆\t69560\n光大国际\t69561\n青山湖街道\t69562\n辩机\t69563\n金台区\t69564\n辐射:新维加斯\t69565\n阻抗谱\t69566\n泥料\t69567\n梦幻花园\t69568\ngogle\t69569\n建设大街\t69570\n泊岸\t69571\n陈超英\t69572\n京瓷1040\t69573\n妈咪们\t69574\n宜居城市\t69575\n抵抗军\t69576\n同名端\t69577\n十一版\t69578\n绍兴市\t69579\n沙驰\t69580\njour\t69581\nguanji\t69582\n中国独立学院\t69583\n芦芳生\t69584\nDataguru\t69585\n贵宠\t69586\n李照耀\t69587\n13页\t69588\n边缘化\t69589\n银医通\t69590\n两栋\t69591\n刘海军\t69592\nbelongs\t69593\n欧耶\t69594\n部委\t69595\n非主\t69596\n普泽\t69597\n把手\t69598\n脊柱\t69599\n四川盆地\t69600\n老\t69601\n凯特阿普顿\t69602\npgadmin\t69603\n架起\t69604\n中国电建地产集团有限公司\t69605\n赵刚\t69606\n三门湾\t69607\n景致\t69608\n枯草\t69609\ndeskscapes8\t69610\n长篇大论\t69611\n办美\t69612\n量场\t69613\n四川省卫生学校\t69614\n林科\t69615\neasygui\t69616\n心理\t69617\n铅头\t69618\n梯
段\t69619\nZ6\t69620\n飞天山\t69621\n防渗膜\t69622\n225个\t69623\n博览\t69624\nhazir\t69625\n中心型\t69626\nピンク\t69627\nipsan\t69628\n3万多元\t69629\n护眼灯\t69630\nchekun\t69631\n林科院\t69632\nmeek\t69633\n圆刚\t69634\n中南大学铁道学院\t69635\n上一天\t69636\n升龙广场\t69637\naliyuncs\t69638\n天涯资讯网\t69639\n换算\t69640\n指轮\t69641\n≥\t69642\n蛀虫\t69643\n投资率\t69644\n全集高清\t69645\n设于\t69646\nWeek\t69647\n有得买\t69648\n爱国卫生运动\t69649\n下一次\t69650\n锅庄\t69651\n查办\t69652\n生物技术\t69653\ncenalulu\t69654\nngin\t69655\n炼胶机\t69656\n中国银联\t69657\n城府\t69658\n四千块\t69659\n泡沫剂\t69660\n戴玉强\t69661\n8km\t69662\nvsion\t69663\n泰迪俱\t69664\n警讯\t69665\nbass\t69666\n梅花五角硬币\t69667\nsccm\t69668\n深圳地王大厦\t69669\nmA\t69670\ncoordinatorlayout\t69671\n上海市普陀区中心医院\t69672\n哈尔滨工业大学管理学院\t69673\n园林处\t69674\nHCD007\t69675\n青啤\t69676\n逝魔\t69677\n街长\t69678\n高考题\t69679\nhev\t69680\n着重\t69681\n夏夏\t69682\nessence\t69683\n三教\t69684\nActionScript3.0\t69685\nsunrise\t69686\n快哉\t69687\n博泽\t69688\n手簿\t69689\nfunct\t69690\nCalculate\t69691\n1.8米\t69692\n众望所归\t69693\npositive\t69694\n思修\t69695\nlinalg\t69696\n呲哩呲哩\t69697\n吸取教训\t69698\n线代\t69699\n蛙声\t69700\n超越性\t69701\n单透\t69702\n木盖\t69703\n祸福\t69704\n环数\t69705\n吉龙\t69706\n博子\t69707\n小岛\t69708\n怪猎\t69709\n宗谱\t69710\n李为民\t69711\n海损\t69712\n海词缩略语\t69713\nPrefer\t69714\n有个人\t69715\n40W\t69716\n奥尔良烤翅\t69717\n重庆经济技术开发区\t69718\n记录册\t69719\nLiebe\t69720\n王刚\t69721\nSolidWorks\t69722\n匈\t69723\n柳花\t69724\n烧写\t69725\n钻井队\t69726\n99分\t69727\n钻心\t69728\n虚拟youtuber\t69729\n手抓饼\t69730\nquotes\t69731\n泰坦X1\t69732\n宠受\t69733\nferry\t69734\n仲恺高新区\t69735\n带土\t69736\n1333MHz\t69737\n肤乐霜\t69738\n乐享\t69739\nDPDK\t69740\n天骥\t69741\n农经权\t69742\nheze\t69743\nWebMail\t69744\n汉谟拉比法典\t69745\n山东中医药大学附属医院\t69746\n荧光币\t69747\nzen\t69748\n两百个\t69749\n韩宇卓\t69750\nJQ\t69751\nMBC音乐中心\t69752\n易窗网\t69753\n吉林教育\t69754\n新月集\t69755\n韩沐伯\t69756\n训练照\t69757\n段江涛\t69758\n运输罐\t69759\nxftp\t69760\n基金份额净值\t69761\n香积寺\t69762\n半透膜\t69763\n中国花木网\t69764\n后院\t69765\n鸡婆\t69766\n西固区\t69767\n宾利\t69768\nDllImport\t69769\n九仙山\t69770\n悲伤逆流成河\t69771\n花都客运站\t69772\n催收\t69
773\n农银快e宝\t69774\n成都市市\t69775\nTheme\t69776\n上低音号\t69777\n量化基金\t69778\n写完\t69779\n武钢\t69780\n54位\t69781\n收方\t69782\n宣城市\t69783\n连表\t69784\n5.0&5.1\t69785\n冰轮丸\t69786\n含苞待放\t69787\n莫汉\t69788\ngitolite\t69789\n噔\t69790\nphpyun\t69791\n月舞\t69792\n有之\t69793\n穗府\t69794\n高古楼\t69795\n列阵\t69796\n张艾嘉\t69797\n黎塘镇\t69798\n飞乐\t69799\n六不\t69800\n大北\t69801\n大陆网\t69802\n800W\t69803\n常用值\t69804\n沈逸\t69805\n人工\t69806\n大古\t69807\n东岳泰山\t69808\n硅石\t69809\n挤走\t69810\n杂剧\t69811\n长流镇\t69812\n上围\t69813\n1000目\t69814\n保安服务公司\t69815\n腾讯云+\t69816\n武汉市委\t69817\ngiorgio\t69818\n擦出\t69819\n火锅\t69820\njeep自由侠\t69821\n36首\t69822\n星韵\t69823\n神乳\t69824\n久违了\t69825\n第61期\t69826\n平板电视大全\t69827\n中国服装协会\t69828\n原始套\t69829\n启动盘制作工具\t69830\nxlive\t69831\n巴博斯\t69832\n电院\t69833\n光效\t69834\n175%\t69835\n经营许可证\t69836\n名鞋库\t69837\n山西省实验中学\t69838\n你是我的遥不可及\t69839\n中国银行广东省分行\t69840\n上证指数\t69841\n站出来\t69842\nHip\t69843\nLovers\t69844\nchamp\t69845\ndescent\t69846\n馨城\t69847\n绝地求生大逃杀超级助手\t69848\n白头山\t69849\n官策\t69850\nubus\t69851\nnonpdrm\t69852\n润物\t69853\nweex-toolkit\t69854\n快消行业\t69855\n2017-12-15\t69856\npolymorphism\t69857\n1168.TV\t69858\nsoftly\t69859\n塑身衣\t69860\nsimsun\t69861\n115名\t69862\n二烯\t69863\n金士顿毒刺\t69864\nFreeMind\t69865\n杨家岭\t69866\n枸杞汤\t69867\n电力法\t69868\n全图\t69869\n硬脂酸镁\t69870\n框子\t69871\n其他综合收益\t69872\n木聪\t69873\n电话会议\t69874\n油枕\t69875\n王者荣耀女娲\t69876\n天天向上\t69877\n东港商务区\t69878\n新城小区\t69879\n张慧敏\t69880\n永州\t69881\n厕所革命\t69882\n1.29\t69883\nlice\t69884\nBowen\t69885\nbr2\t69886\n罗纳尔\t69887\n罗彻斯特\t69888\n梦人\t69889\nregret\t69890\nmeta\t69891\n长安悦翔v3\t69892\n东阳中学\t69893\n米塞斯\t69894\n河南省发展和改革委员会\t69895\n友子\t69896\n嘉行\t69897\n浇筑泵\t69898\nChkdsk\t69899\ndentist\t69900\n民航总局\t69901\n每个小时\t69902\n大连金州新区\t69903\n鱼雷艇\t69904\n昆西\t69905\n劝谏\t69906\n履行地\t69907\n北京理工大学学报\t69908\n背景颜\t69909\n5mm\t69910\n瘦身食谱\t69911\n拦污\t69912\npreempt\t69913\n同角三角函数\t69914\n刘春英\t69915\n该房\t69916\n陪葬品\t69917\n神舟笔记\t69918\nLQ-610K\t69919\n加剧\t69920\nresful\t69921\n0319\t69922\nplum\t69923\n吴天\t69924\n呼图壁县人民政府\t69925\n婚介网\t69
926\nslaughter\t69927\n实践性\t69928\n偶有\t69929\ntfidf\t69930\n盐都区\t69931\n有偿\t69932\n剧情\t69933\nmalena\t69934\n第八十章\t69935\nUICollection\t69936\n光管\t69937\n金钥\t69938\n中国证券金融股份有限公司\t69939\n00Q\t69940\n侏儒症\t69941\n紫光股份有限公司\t69942\n北汽银翔\t69943\n纸壁\t69944\nols\t69945\n2017年10月25日\t69946\n宋哲\t69947\nSPF30+\t69948\n查士丁尼\t69949\nAOSP\t69950\n95小橙武\t69951\n淮南一中\t69952\nLui\t69953\n礼部\t69954\n江西航空\t69955\n夹板门\t69956\n蒼井空\t69957\nlotion\t69958\n嘉实基金\t69959\n固安在线\t69960\n库尔德武装\t69961\n新君越\t69962\n保护令\t69963\n莆田小鱼网\t69964\n酒坊\t69965\n血友\t69966\n600平方米\t69967\n勤学好问\t69968\n兄贵\t69969\n样条线\t69970\n移师\t69971\nPLUS\t69972\n鬼画符\t69973\n宣扬\t69974\n十二周年\t69975\n3个半月\t69976\n里面\t69977\n博西\t69978\n直营\t69979\narm9\t69980\n通天之路\t69981\n棒状\t69982\n计税毛利率\t69983\nkoi奶茶\t69984\n茶侃网\t69985\n赋权\t69986\n3碗\t69987\n怒江之战\t69988\n高境\t69989\n平凡的人\t69990\n南京市人力资源和社会保障局\t69991\n最大公因数\t69992\n魏格纳\t69993\n6055\t69994\n都安托涅瓦\t69995\n兴\t69996\nyonghu\t69997\n预案\t69998\n西安城北客运站\t69999\n汽车衡\t70000\n妖塔\t70001\n电动头\t70002\n修辞学\t70003\n中银理财\t70004\n浓缩咖啡\t70005\nUnavailable\t70006\n仁兄\t70007\n阿星\t70008\nprincessd8251\t70009\n宫西达\t70010\n洗车\t70011\nLighthouse\t70012\n鬼妻\t70013\n82场\t70014\n白沙黎族自治县\t70015\n甘孜日报社\t70016\n曹保平\t70017\n200方\t70018\n新京报电子报\t70019\n怀\t70020\n扣手\t70021\nCOSPLAY\t70022\n极客网\t70023\n易微联\t70024\n跑吧网\t70025\nluz\t70026\n罗红霉素\t70027\n试打\t70028\n柳钉\t70029\n借调\t70030\n老萨\t70031\n深圳平安银行\t70032\n散票\t70033\n黛瓦\t70034\nblister\t70035\n精算师\t70036\n土地资源管理专业\t70037\n祝融峰\t70038\n钱小佳\t70039\n扩项\t70040\n无电\t70041\n维爱\t70042\n怀旧服\t70043\n2011-2030年\t70044\n分布式系统\t70045\n葱葱\t70046\n辽宁广厦\t70047\n60多位\t70048\n赴韩\t70049\n三白\t70050\n紫帽\t70051\nAO史密斯\t70052\n王玉峰\t70053\nepc\t70054\n妙龄女\t70055\n何一\t70056\nChop\t70057\nTHz\t70058\n竹筏\t70059\nNeoTV玩家论坛\t70060\n庆安\t70061\n旋转器\t70062\n大蓟\t70063\n水洗标\t70064\n黑龙江地区\t70065\n8幢\t70066\n青雀\t70067\n不一样了\t70068\n党工\t70069\n阿_\t70070\nelona+\t70071\nspring+quartz\t70072\nELEAGUE\t70073\n祁县政府\t70074\nsetattr\t70075\nPGM\t70076\n寿\t70077\n总理\t70078\n过敏源\t70079\nPrintable\t70080
\n必通\t70081\n黄河道\t70082\nwww.3wyoua.com\t70083\n软疣\t70084\n车种\t70085\n2000条\t70086\n联合汽车电子\t70087\n黄烟\t70088\n泡菜音译网\t70089\n大一棵树\t70090\n清香剂\t70091\ndoit\t70092\nhns\t70093\n迟延\t70094\n雪漫城\t70095\n深圳拓邦股份有限公司\t70096\n陕铁院\t70097\ncrossorigin\t70098\nlimited\t70099\n中国名酒招商网\t70100\n余飞\t70101\n填缝剂\t70102\n7up\t70103\n中华人民共和国国歌\t70104\n油冷器\t70105\n嬉皮笑脸\t70106\nRADIUS\t70107\n前束\t70108\n疏风解毒胶囊\t70109\nTimeStamp\t70110\n万家丽广场\t70111\n质押式回购\t70112\n上海工程技术大学\t70113\n炒菜锅\t70114\n南山风景区\t70115\n春卷\t70116\n金源店\t70117\n两罐\t70118\n巴蛮子\t70119\n第几轮\t70120\n计量经济\t70121\nwidgets\t70122\n平原区\t70123\n生科院\t70124\n摇一摇\t70125\n小和山\t70126\n声雨\t70127\n青溪镇\t70128\n今日9时\t70129\n32元\t70130\n澄澄\t70131\n联华ok卡\t70132\n酝酿\t70133\n市二院\t70134\n微针\t70135\n简直\t70136\n吊竹梅\t70137\n出家\t70138\n周考\t70139\n肝毒\t70140\n李国胜\t70141\nONVIF\t70142\n和君管理咨询公司\t70143\n便捷化\t70144\nNOTE5\t70145\n冰火人\t70146\nsigsegv\t70147\nnai\t70148\n环道\t70149\n讨饭\t70150\n注册书\t70151\n对比色\t70152\n神龙公司\t70153\n云冈石窟\t70154\n绿城地产\t70155\n差异性\t70156\n无房户\t70157\n2201\t70158\n19世纪初\t70159\n四惠长途汽车站\t70160\n6KV\t70161\n石子\t70162\nJoJo\t70163\nccdc\t70164\n丛中\t70165\n心无\t70166\n威斯丁\t70167\n福禄\t70168\nscrape\t70169\n心怀感恩\t70170\n澳星\t70171\n选载\t70172\n朱泳腾\t70173\n小康社会\t70174\n名古屋乐高乐园\t70175\nHipHop\t70176\n5502\t70177\n佐仓\t70178\n暴雪\t70179\nTubebbs\t70180\n火山群\t70181\n此刻\t70182\n烧损\t70183\n正黄集团\t70184\n扫风\t70185\n罗威\t70186\n众途\t70187\n打虎\t70188\n重铬酸钾法\t70189\n村上隆\t70190\n艾华集团\t70191\n螺虫\t70192\nstylelint\t70193\n变形体\t70194\n张梓琳\t70195\n扩机\t70196\n五星花园\t70197\nWEN\t70198\nSTS\t70199\n试鸣\t70200\n5145\t70201\n电子营业执照\t70202\n徐文东\t70203\n试种\t70204\n大众凯路威\t70205\n靠近你\t70206\n游戏悍将\t70207\n合诚\t70208\n机油泵\t70209\n李忘风\t70210\n笑话集\t70211\n姚敏\t70212\nb70\t70213\nRain\t70214\n第二十八条\t70215\n海洋馆\t70216\n无锡市人力资源和社会保障局\t70217\n9月28日\t70218\n气象预报\t70219\n银行保险\t70220\n桑果\t70221\nsuzy\t70222\n昭奚\t70223\n孙建军\t70224\n高年级\t70225\n小橙子姐姐我的世界\t70226\n五通桥\t70227\n亚泰坊\t70228\n加大\t70229\nconstraintlayout\t70230\n普惠金融\t70231\n翡翠书院\t70232\n上座部\t70233\n多家\t70234\n20世纪30年代\t702
35\n財經\t70236\n声效\t70237\n堆堆\t70238\n张朝\t70239\n广安区\t70240\n莉爷\t70241\n悟寰轩\t70242\n舞力全开2018\t70243\nG10\t70244\n吴亮\t70245\n凯特·贝金赛尔\t70246\n阿蒂仙\t70247\n道通科技\t70248\ngunnar\t70249\n品管部\t70250\n3.78\t70251\n地势\t70252\n人民政府办公室\t70253\n郑州人民公园\t70254\n门级\t70255\n皮脂腺痣\t70256\npermutation\t70257\n一本道\t70258\n原判决\t70259\n色龙\t70260\nlarry\t70261\n抽象函数\t70262\n盆栽葡萄\t70263\n大学物理\t70264\n词\t70265\n2017年2月20日\t70266\n戏说\t70267\npew\t70268\nwechat\t70269\n3.5.3\t70270\n丹东河口\t70271\n欧芭\t70272\n云爆弹\t70273\n美多\t70274\n马如龙\t70275\n青阳子\t70276\nhola\t70277\n中国经济新闻网\t70278\nWVS\t70279\n0.13.0\t70280\n狩技\t70281\n神龙汽车有限公司\t70282\nSDE\t70283\n临走\t70284\n20180428\t70285\n天神诀\t70286\n纽约地区\t70287\n色容\t70288\ninsensitive\t70289\n艾彩\t70290\n美容网\t70291\n金砂\t70292\n旷达科技\t70293\nmate10\t70294\n贝玲妃\t70295\n巨力集团\t70296\n老体协\t70297\n扬花\t70298\n厦门大学出版社\t70299\n财政学专业\t70300\ndaxiang\t70301\n腐蚀剂\t70302\n南京东路步行街\t70303\nXO型\t70304\n韭菜园\t70305\nopa\t70306\n爱国歌\t70307\nH7\t70308\nlinker\t70309\n尺规作图\t70310\n受不了\t70311\n贝德玛\t70312\n河南发改委\t70313\n中共黑龙江省委\t70314\n劳务派遣制\t70315\n米雯娟\t70316\n火舞\t70317\n崩漏\t70318\n天星教育金考卷\t70319\n华米2\t70320\n雨山路\t70321\n链家地产\t70322\n语文园地二\t70323\n位温\t70324\n58年\t70325\nsomatoline\t70326\n润唇膏\t70327\n随机算法\t70328\n多语版\t70329\n心慌气短\t70330\n骠骑\t70331\n说彩\t70332\n奇异\t70333\n北京大学心理与认知科学学院\t70334\n羽毛\t70335\n中国民俗学会\t70336\nbd0001.sys\t70337\n精机\t70338\n消费者行为学\t70339\nm78\t70340\n魂石\t70341\n灵儿\t70342\n新西兰移民局\t70343\nBuy\t70344\n艾菲尔铁塔\t70345\n徐新荣\t70346\n下洋镇\t70347\ncad2010\t70348\n可乐会\t70349\nCrytek\t70350\n四转\t70351\n网商银行余利宝\t70352\n优弗\t70353\n正岩\t70354\n保时捷中心\t70355\n清开灵胶囊\t70356\nLinux子系统\t70357\nred5\t70358\n卡卡利科村\t70359\n斯巴达克斯血与沙\t70360\n留级\t70361\n磷虾油\t70362\n1.158\t70363\n人脸识别\t70364\n益胶泥\t70365\n刘鹗\t70366\n2017-01-01\t70367\n悠唐购物中心\t70368\n收益类\t70369\n林炜\t70370\n随机森林\t70371\n中国国际家具展览会\t70372\n售后点\t70373\n加班\t70374\n嘉诚国际\t70375\n30千瓦\t70376\n杭州市保俶塔实验学校\t70377\n红汤\t70378\nGif\t70379\nPro4\t70380\n新编大学英语\t70381\n433号\t70382\n12.00\t70383\n国家食品药品监督管理总局\t70384\n美音版\t70385\n欧专局\t70386\n
彩城\t70387\n敌意\t70388\n百度定位\t70389\n有氧训练\t70390\n搜神传\t70391\n山西日报数字报\t70392\n俄舞\t70393\n千眼菩提子\t70394\nkimas\t70395\nNg\t70396\n20立方\t70397\n水淹七军\t70398\n西安宾馆\t70399\n南通市\t70400\n新蛋网\t70401\n音图\t70402\n马克思主\t70403\n毛毡\t70404\n星村镇\t70405\nRL\t70406\n郭威\t70407\nOSHA\t70408\n室颤\t70409\n三角板\t70410\n裸蛋糕\t70411\n大乘无量寿经\t70412\nframework层\t70413\n暖气片\t70414\n肉质\t70415\n扇叶\t70416\n楠溪\t70417\n撇折\t70418\n积成电子\t70419\n精华篇\t70420\n三忍\t70421\n上海市绿化和市容管理局\t70422\n呈送\t70423\nウェブサイト\t70424\n腾讯滨海大厦\t70425\n焦\t70426\n操\t70427\n2015年3月28日\t70428\n真田春香\t70429\n飞空\t70430\n10.9\t70431\nGeosciences\t70432\n俊熙\t70433\n菜头\t70434\n增霸\t70435\n包头市教育局\t70436\n二房东\t70437\n黄河风景区\t70438\n一周前\t70439\n琴仙\t70440\n樱狼\t70441\n三郎\t70442\nM17x\t70443\n11月7日\t70444\n国家电\t70445\n东信\t70446\n极坐标法\t70447\n沈巍\t70448\n玻璃栈桥\t70449\n遥感卫星\t70450\n化妆盒\t70451\n天天电影吧\t70452\ningress\t70453\n蒲江\t70454\n许宁\t70455\n卡徒\t70456\n国际通\t70457\nWestin\t70458\n精华液\t70459\n轩子巨二兔\t70460\n器材\t70461\n抽检\t70462\n锦砖\t70463\n帆游加速器\t70464\nhevc\t70465\n营销部\t70466\n检察院\t70467\n六神\t70468\n成海\t70469\n900M\t70470\n松岗镇\t70471\n郎眼\t70472\n究极日\t70473\n20171023\t70474\n盐城市亭湖区\t70475\n九小时\t70476\n爱莲\t70477\ntruetype\t70478\n浙一医院\t70479\n八旬\t70480\n产钳\t70481\n目标与时间管理\t70482\n46卷\t70483\nMicro\t70484\n砂轮\t70485\n丁香路\t70486\n古猿\t70487\n龙桃子\t70488\n撼世\t70489\n老松\t70490\n马来亚大学\t70491\n庞\t70492\nDaya\t70493\n印刷人才网\t70494\ncogs\t70495\n贸易逆差\t70496\n区安监局\t70497\n爱易房\t70498\n1-7月\t70499\n4.00\t70500\n那拉提\t70501\n商务委\t70502\n蹦蹦网\t70503\n15台\t70504\n大逆不道\t70505\nf555\t70506\n体定\t70507\n大兴\t70508\nBarra\t70509\n恶心人\t70510\n元末\t70511\n科学计数\t70512\n厨盆\t70513\n洞府\t70514\n铁腕\t70515\n超兽\t70516\n技术交底\t70517\n一飞\t70518\n浅墨\t70519\n流露\t70520\n40周岁\t70521\n千娇百媚\t70522\n单列\t70523\n36V\t70524\nktr\t70525\n村上\t70526\n转角处\t70527\nIdols\t70528\nwelfare\t70529\n仁和\t70530\n二蛋\t70531\n肩射\t70532\n刘玲\t70533\n北角\t70534\n教育部高等教育司\t70535\n大沙河公园\t70536\n遂昌县政府\t70537\n胡笳十八拍\t70538\n歹\t70539\nBlan丶Sun\t70540\n吧务\t70541\n军械库\t70542\nPIXEL\t70543\n房屋转让合同\t70544\nnsk\t70545\noci.dll\t7
0546\n官林\t70547\n正国级\t70548\n鬼契\t70549\n筑龙房地产论坛\t70550\neigen\t70551\n兽痕\t70552\n王者荣耀点券\t70553\n艾宝\t70554\n菜花状\t70555\n破败\t70556\nzepto\t70557\n戴维宁\t70558\n英语句\t70559\n天津凤凰网\t70560\n烽火科技集团\t70561\n高堂\t70562\nCDKEY\t70563\nwinmm\t70564\n楼下\t70565\n五莲路\t70566\n招标代理机构\t70567\n兰花花\t70568\nFeelings\t70569\n三井奥特莱斯购物城\t70570\n明清宫苑\t70571\n驿路梨花\t70572\n黎伟\t70573\npks\t70574\n建工大厦\t70575\n高血糖\t70576\n刹车泵\t70577\n屮\t70578\n怎么装饰\t70579\n小受\t70580\n我的记忆\t70581\n斩落\t70582\nPEP8\t70583\n氨基磺酸\t70584\n杏园\t70585\n2018年3月8日\t70586\n呲呲\t70587\n绚烂\t70588\n鹰猎\t70589\n浙江省委省政府\t70590\n我不是潘金莲\t70591\n前2年\t70592\n70二代\t70593\n鼠标手\t70594\namos\t70595\n80w\t70596\n新疆昌吉\t70597\nwatermark\t70598\n高会\t70599\n制和\t70600\niSlide\t70601\n五所\t70602\n苗子\t70603\n脚手架\t70604\n塞尔维亚\t70605\n边军\t70606\n潘斌龙\t70607\n教程版\t70608\n超級\t70609\n安阳市教育局\t70610\n万花筒\t70611\n铭德\t70612\n新西兰币\t70613\n说唱版\t70614\ndepend\t70615\nAAPE\t70616\n仁焕法师\t70617\n张家口市政府\t70618\n6米\t70619\niterable\t70620\n俄料\t70621\nBBM\t70622\n楠楠\t70623\n互惠生\t70624\n琴台\t70625\n饶人\t70626\npixlr\t70627\n离石区\t70628\ngugu\t70629\n印图\t70630\n卫生监督协管\t70631\nE540\t70632\ncc2530\t70633\n郎咸宋美龄\t70634\n竹签\t70635\n8阶\t70636\n为你而来\t70637\n齿机\t70638\nshear\t70639\n荣耀路由2\t70640\n团聚体\t70641\n鼠标垫\t70642\nNOAA\t70643\n刚果共和国\t70644\n碳纤维管\t70645\nlinux-x86\t70646\nClipboard\t70647\n西溪天街\t70648\n多媒体系统\t70649\ne1-471g\t70650\n遗漏\t70651\n逆风\t70652\n流浪客\t70653\n临阵脱逃\t70654\n绣线菊\t70655\nmy97datepicker\t70656\n學習\t70657\ntoArray\t70658\n颜子轩\t70659\n中国铁建投资集团有限公司\t70660\nLM3886\t70661\nRounded\t70662\n设备\t70663\n斯马特\t70664\nff13\t70665\n安硕\t70666\n共享化\t70667\n荣归\t70668\n滑头\t70669\n华彬\t70670\n门径\t70671\n3.74\t70672\n少年先锋队\t70673\n八珍益母丸\t70674\n液气\t70675\n荣民\t70676\nWikis\t70677\nvrp\t70678\n全国人大常务委员会\t70679\nbrowsersync\t70680\n4月中\t70681\n仪表类\t70682\n72伏\t70683\n洮南\t70684\n桃源村\t70685\n游道\t70686\n千纸鹤\t70687\n平多多\t70688\n江西卫视\t70689\n第一章第二节\t70690\n马尾巴\t70691\nNicolet\t70692\nImpl\t70693\n张川\t70694\n7200万\t70695\n餐饮部\t70696\n600_\t70697\n克隆侠\t70698\n哆啦a梦\t70699\n最高人民法院最高人民检察院
\t70700\n8分之一\t70701\n昏睡\t70702\ncocostudio\t70703\nMendes\t70704\n半夏泻心汤\t70705\n民和\t70706\n庆山\t70707\n_凯\t70708\n守住\t70709\n地元\t70710\n耳听爱情\t70711\n北京市环保局\t70712\n洛洛历险记\t70713\n菊正宗\t70714\n华德\t70715\n四十大\t70716\n青青子衿\t70717\nSCX-4321NS\t70718\n久光百货\t70719\nrog\t70720\n咸丰县人民政府\t70721\nairy\t70722\n态服\t70723\n2470\t70724\n屏蔽泵\t70725\n亚洲地区\t70726\n库源\t70727\nsybase\t70728\n陕西省文物局\t70729\nesm\t70730\n阿水大杯茶\t70731\n沉思录\t70732\n南票区\t70733\n图拉夫\t70734\n华为p8\t70735\n错题\t70736\n假睫毛\t70737\n拜仁慕尼黑\t70738\n02202\t70739\n6.5.5\t70740\n田径运动\t70741\n10部\t70742\n倍攻\t70743\n饮食业\t70744\n今次\t70745\nGcc\t70746\n李花\t70747\n亚特兰大\t70748\n梧桐花\t70749\n足光粉\t70750\n问答\t70751\n0149\t70752\n携带版\t70753\nLindsay\t70754\n李响\t70755\n北京市少年宫\t70756\n官庄\t70757\n合伙人\t70758\n圆饼\t70759\n意外险\t70760\n梅德韦杰夫\t70761\n体温计\t70762\n菊花酒\t70763\n努比亚nubia\t70764\n周3\t70765\n三奥\t70766\n0501\t70767\n五矿\t70768\n情境性\t70769\n挑落\t70770\n华为m9\t70771\n梵网\t70772\n索伏\t70773\n李开元\t70774\n哈利法塔\t70775\n浮力\t70776\n时间框\t70777\n700多个\t70778\n此花\t70779\n电子商务员\t70780\nGPFS\t70781\nHsrcw\t70782\n玛雅maya\t70783\nnod32许可证\t70784\n陈捷\t70785\n雨风\t70786\n化工展\t70787\n探讨\t70788\n惜花芷\t70789\n网联\t70790\nV6.7\t70791\nsage\t70792\nyingxiao\t70793\n韩文输入法\t70794\n李光地\t70795\n羊尾\t70796\n热血长安\t70797\n拉茶\t70798\n如狼似\t70799\n支奴\t70800\nlou\t70801\n青叶灵异事务所\t70802\npygraphviz\t70803\n电子科技大学研究生院\t70804\n百优卡汽车连锁超市\t70805\n仿石\t70806\n帧间\t70807\n濮阳市人民政府\t70808\n毕业纪念册\t70809\nPresence\t70810\n靠逼\t70811\n200立方\t70812\n顶呱刮\t70813\n宣城市纪委\t70814\n11篇\t70815\n乐乐\t70816\n丹红注射液\t70817\n153个\t70818\nre管理器\t70819\n麓山\t70820\n华企商城\t70821\nLeonard\t70822\n十四\t70823\nincoming\t70824\n龙珠:超\t70825\n苏州市区\t70826\n猫拳\t70827\n中国科学院上海技术物理研究所\t70828\n腹泻奶粉\t70829\n嫌犯\t70830\n银行询证函\t70831\n04_\t70832\n干燥综合症\t70833\n知足者\t70834\nt恤衫\t70835\n索氏提取器\t70836\nActions\t70837\n烟王\t70838\n前野智昭\t70839\nsucess\t70840\n花名\t70841\n浙江省教育厅办公室\t70842\n癸酉年\t70843\n达摩\t70844\n火影忍者带土\t70845\n多啦a梦\t70846\n王微\t70847\n2011-06\t70848\nMILK\t70849\n国威\t70850\nmainstream\t70851\n吉普牧马人\t70852\n陶瓦\t70853\n
藤茶\t70854\n扉\t70855\n高宁\t70856\n临桂县\t70857\n21000元\t70858\n祖文\t70859\n血案\t70860\n艺花\t70861\n10美元\t70862\n948\t70863\nFNIS\t70864\nLIQUID\t70865\n汉酱\t70866\n黑玛丽鱼\t70867\n发短信\t70868\n数据库报\t70869\n勤恳\t70870\n金柱\t70871\n4.8日\t70872\n海洋\t70873\nБ\t70874\ncomcn\t70875\n崇福寺\t70876\n吃遍\t70877\n郎永淳\t70878\nSITE\t70879\nMercury\t70880\n平行文\t70881\n宗宁\t70882\nExpectations\t70883\n威尼斯花园\t70884\ngimp\t70885\n六十种\t70886\n臻跃\t70887\n鲍里斯\t70888\n金盆岭\t70889\nBC221\t70890\n考别\t70891\nPS5\t70892\n卡讯网\t70893\n二子\t70894\n五军之战\t70895\n乐1pro\t70896\n假仙\t70897\n拳皇命运\t70898\n全职我欲封天\t70899\njituan\t70900\napply函数\t70901\n医社保\t70902\n波鸿\t70903\n巅峰对决\t70904\n全副武装\t70905\n5525\t70906\nProgramming\t70907\n机电工程系\t70908\nIncest\t70909\n第3级\t70910\n宋冬野\t70911\n乐至县\t70912\n773\t70913\n主干网\t70914\n法号\t70915\n方言\t70916\nOwner\t70917\n09期\t70918\n6135\t70919\nbitcomet\t70920\n999朵\t70921\n博发\t70922\n90vs\t70923\n雷佳林书豪\t70924\n清美\t70925\n绅宝D70\t70926\n宏伟\t70927\n元ヤリマン\t70928\n亲爱的姑娘\t70929\n大兴善寺\t70930\nmeant\t70931\n秋实\t70932\nTogether\t70933\n耷拉\t70934\n流着泪说分手\t70935\n史非\t70936\n紫菜蛋花汤\t70937\n脂肪率\t70938\n转运公司\t70939\n总政歌舞团\t70940\n杀虫气雾剂\t70941\n嘉润\t70942\ngt650m\t70943\npro2018\t70944\n填空题\t70945\n名屋\t70946\n120升\t70947\nwww.2214.cn\t70948\n好感度\t70949\n血本\t70950\n火绒\t70951\n松毛虫\t70952\n舒张\t70953\nVitamio\t70954\n巨胖\t70955\nmed\t70956\n重庆联通\t70957\n快仓\t70958\n中国南京_南京政府\t70959\n四丁基溴化铵\t70960\nLucia\t70961\n物价\t70962\ngraffiti\t70963\n四把火\t70964\npLearning\t70965\n阳光嘉苑\t70966\n大虫\t70967\n中华寺\t70968\n30倍\t70969\n葫芦科\t70970\n王备\t70971\n首作\t70972\n骆氏\t70973\n卡课堂\t70974\nsqlite3.dll\t70975\n共交\t70976\ndrm\t70977\n终结性\t70978\n13日\t70979\n众心塔网\t70980\n洗髓经\t70981\n翟天暗黑破坏神\t70982\n上海市农业科学院\t70983\n肃南\t70984\n张豆豆\t70985\n男宠\t70986\n徽商集团\t70987\ndoc.docx\t70988\n徽网\t70989\n归属地查询网\t70990\n宏峰\t70991\n上海市城市交通运输管理处\t70992\n十六周年\t70993\n八宝饭\t70994\n小箱\t70995\nPink\t70996\n三生\t70997\n清莹露\t70998\n芃\t70999\n志明\t71000\n别克新君威\t71001\n新疆队\t71002\n七宝酥\t71003\n13周年\t71004\n叶蓁蓁\t71005\n贵州省公安厅\t71006\ndolphin\t71007\n重到\t71008\n暂
缓\t71009\n206年\t71010\n霸道总裁文\t71011\n金沙湖\t71012\n1天内\t71013\n佛珠机\t71014\n198541.com\t71015\n国务院办公厅\t71016\n梯控系统\t71017\nparrot\t71018\n半老风韵犹存\t71019\n死时\t71020\n10046\t71021\n金梦\t71022\n粘液性\t71023\n金融炼金术\t71024\n开天\t71025\n确定位置\t71026\n董事会审计委员会\t71027\nExce\t71028\n汉台区\t71029\n就喊\t71030\n焊工证\t71031\n文昌阁\t71032\n六爻卦\t71033\n促\t71034\n手机链\t71035\n保持健康\t71036\nJR\t71037\njstree\t71038\n崔志成\t71039\nSheer\t71040\n大众途安L\t71041\n寿光一中\t71042\n旦角\t71043\n孝庄\t71044\n久石\t71045\n十笏园\t71046\n叶选平\t71047\n19世纪末\t71048\n喝汤\t71049\n倒挂金钩\t71050\n伊马替尼\t71051\ne01\t71052\n七进\t71053\n双喜\t71054\n万象2008\t71055\n兼容机\t71056\n多才多艺\t71057\ncabello\t71058\n倾世公主之逆天成凰\t71059\n橱柜装修|一起网\t71060\n三阳广场\t71061\nlang3\t71062\n柔板\t71063\nVhiphop\t71064\n羞恶\t71065\n招标工程网\t71066\n当阳市\t71067\n木兰花慢\t71068\n干管\t71069\n泉州市工商行政管理局\t71070\n树树\t71071\n第一夫人\t71072\n中心银行\t71073\n拼布\t71074\nNotePad++\t71075\n预绞式\t71076\n英雄无敌5东方部落\t71077\n我爱男保姆\t71078\naffiliated\t71079\nHigh\t71080\n去极化\t71081\n网吧管理系统\t71082\n扶贫村\t71083\n经济体制改革\t71084\n有据\t71085\n知页\t71086\n散文类\t71087\n野种\t71088\n漫品\t71089\n20V\t71090\n矩阵切换器\t71091\nhuiy\t71092\nECMAScript6\t71093\n杀戒\t71094\n藏起\t71095\n多西他赛\t71096\n20160810\t71097\n橱\t71098\n自定义列\t71099\n青岛海山学校\t71100\n奥科\t71101\n不同于\t71102\n艾薇儿·拉维尼\t71103\n泰瑞\t71104\n一篮\t71105\n疏堵\t71106\n试作\t71107\n1525\t71108\n彩漂\t71109\n2.3.32\t71110\n冰\t71111\nMEP\t71112\n底楼\t71113\n石球\t71114\n魔法学校\t71115\n打倒\t71116\nCrawlSpider\t71117\nOTO\t71118\n遭疑\t71119\n患难与共\t71120\nDelft\t71121\n安全模式\t71122\nSTG\t71123\n受害者\t71124\n共字\t71125\n苏30\t71126\nknow\t71127\n布里斯\t71128\nConsulates\t71129\n莫洛托夫\t71130\n希罗多德\t71131\n老秦老\t71132\n佛理\t71133\nedX\t71134\n遮瑕棒\t71135\n泡书吧\t71136\nSaas\t71137\n翻泽\t71138\n内分泌科\t71139\n连笔\t71140\n应用型\t71141\n八仙果\t71142\nnetconf\t71143\n初步\t71144\n2012a\t71145\n胖子们\t71146\n伟才幼儿园\t71147\n都安县\t71148\n允恩\t71149\n聚合\t71150\n气箱\t71151\n复活币\t71152\n浊度仪\t71153\n八幅\t71154\n压装机\t71155\nAutocad2008\t71156\n安培\t71157\n吃黑\t71158\n中国生鲜移动\t71159\n重庆市永川区政府\t71160\n回单箱\t71161\n地球人\t71162\nmpr\t71163\n江苏省
人民检察院\t71164\n线包\t71165\n潜意识\t71166\n作物学\t71167\n日票\t71168\n应召女友\t71169\n罗曼罗兰\t71170\n精神分析引论\t71171\n二进制位数\t71172\n葫芦形\t71173\n凤天路\t71174\n周铁根\t71175\n病毒扫描网\t71176\n晟嘉\t71177\n泥螺\t71178\n李沧\t71179\ncocos2dx3.0\t71180\n中通电子\t71181\nAPDU\t71182\ncfg桩\t71183\n中山二院\t71184\n侠客岛\t71185\n约瑟翰·庞麦郎\t71186\n迪尼斯\t71187\n长德\t71188\nlambda\t71189\nzuhao\t71190\n胎心监护\t71191\n嘉熙\t71192\n安希妍\t71193\n洗地机\t71194\n艺术概论\t71195\n相良宗介\t71196\nincompatible\t71197\n寿光菜博会\t71198\n切块\t71199\n扭转试验机\t71200\n共振峰\t71201\n抽佣\t71202\n登用\t71203\n纺织类\t71204\n3GB\t71205\n高集\t71206\n车工\t71207\n雪衣\t71208\n行政\t71209\n平原绫香\t71210\n暗精\t71211\n西藏旅游资讯网\t71212\n探测仪\t71213\n省关工委\t71214\n24【\t71215\n蒋梦婕\t71216\n花瓣网\t71217\n牙狼garo\t71218\n西安体育学院\t71219\nwurth\t71220\n折号\t71221\nZ5\t71222\n校准\t71223\n东莞日报\t71224\n即可\t71225\n夏圭\t71226\n抖腿\t71227\n百分网\t71228\n讹诈\t71229\n硬调\t71230\n2岁\t71231\n竹叶青\t71232\n幻光\t71233\n猴赛雷\t71234\nExplain\t71235\n淸\t71236\n玛莎\t71237\n力帆320\t71238\npescm\t71239\n花鸟笼\t71240\n后蹄\t71241\n两高一\t71242\n分送\t71243\ncanger\t71244\n不见面\t71245\nmomoko\t71246\n49期\t71247\nBastion\t71248\n罗士信\t71249\n佐藤美和子\t71250\n上海凤凰网\t71251\n无人接听\t71252\n福建博物院\t71253\nclickable\t71254\n外贸公司\t71255\n香盒\t71256\n迅雷铺电影网\t71257\n为主导\t71258\n长阳镇\t71259\n外国版\t71260\n李坚\t71261\nKAT\t71262\n金融企业呆账核销管理办法\t71263\ngtx560ti\t71264\n曹县人民政府\t71265\ntubes\t71266\n鸡犬不宁\t71267\n周彦宏\t71268\n得实惠\t71269\n曝露\t71270\nASSERT\t71271\n酷熊\t71272\nCMI\t71273\n淳化街道\t71274\n卡魔拉\t71275\n司马氏\t71276\n有没有\t71277\n土屋\t71278\n王维诗\t71279\n奥图\t71280\n地狱之门\t71281\nSTM32/STM\t71282\n这么多大\t71283\nopencv3.0\t71284\n中苑\t71285\n军屯\t71286\n来吧冠军\t71287\n华味亨\t71288\n蛲虫\t71289\nYOU+国际青年社区\t71290\nModding\t71291\n纸架\t71292\n霍迪尔\t71293\n热室\t71294\n20151114\t71295\n雷凌双擎\t71296\nSkincare\t71297\n许鸿飞\t71298\n稚子\t71299\n麒麟镇\t71300\nhttp代理\t71301\n破案片\t71302\n47集\t71303\nwin1064\t71304\n0|\t71305\n中央汇金\t71306\nforwards\t71307\n李家祥\t71308\ncrombie\t71309\n湖南科学技术出版社\t71310\n一心园\t71311\nMatteo\t71312\n只须\t71313\n王曙光\t71314\n石佛镇\t71315\npsych\t71316\n热评\t71317\n加热\t71318\
n监查\t71319\n中铁二十一局\t71320\n魔法少女小圆剧场版\t71321\n黑枪\t71322\n牛魔\t71323\n初雪樱\t71324\n聚氨酯地坪\t71325\n吴长春\t71326\nMeow\t71327\nBESTie\t71328\n周口市\t71329\n图片男\t71330\n抱\t71331\npum\t71332\n10^6\t71333\n研习会\t71334\n鲤鱼精\t71335\n上汽斯柯达\t71336\n淮河路\t71337\n万里石\t71338\n立明\t71339\n确界\t71340\n对啊网\t71341\n板阀\t71342\n郁金\t71343\nonResume\t71344\n优科豪马轮胎\t71345\n巫山云雨\t71346\n罗斯切尔德\t71347\n黑夜\t71348\n无锡工艺职业技术学院\t71349\n巴黎公社\t71350\n异梦\t71351\n地下水位\t71352\n工程洗轮机\t71353\nBTN\t71354\n镇江市政府\t71355\nBIM5D\t71356\n无翼乌\t71357\nwin10操作中心\t71358\n132集\t71359\nS-cute\t71360\nfavour\t71361\n1048\t71362\n30级\t71363\n比利王\t71364\nココロ\t71365\n军事博物馆\t71366\n退还\t71367\n弥雾机\t71368\ncee\t71369\n【天骐\t71370\nTOM\t71371\n600183\t71372\n王竹\t71373\nCommunicate\t71374\n4545\t71375\n中国禁毒网\t71376\n不失为\t71377\n淮海工学院\t71378\n紫石英号\t71379\n时务者\t71380\n超速浏览器\t71381\nS档\t71382\n昆仑银行\t71383\n弹头\t71384\n漫评\t71385\n硅pu\t71386\n售\t71387\ndictation\t71388\n龙宇\t71389\n京B\t71390\n住所\t71391\n市桥\t71392\n梅菜扣肉\t71393\nflexibility\t71394\nOURDEN\t71395\n杀破狼3\t71396\n线圈\t71397\n终极一家\t71398\n履带吊\t71399\ncharls\t71400\n速学\t71401\nBreak易站\t71402\n海妈\t71403\n奇女\t71404\n红楼梦中文\t71405\nrcp\t71406\nDebian\t71407\n打鼾\t71408\n久诚\t71409\n明星们\t71410\n公开栏\t71411\niccv\t71412\n安能\t71413\n规级\t71414\n新事物\t71415\n朱婧\t71416\n4品\t71417\nstarman\t71418\nM600\t71419\n为谁炼金\t71420\n20N\t71421\n石英晶体振荡器\t71422\n押题卷\t71423\n自治权\t71424\n攻城战\t71425\n坎村\t71426\n硫酸铜晶体\t71427\n8264.com\t71428\n鸡笼山\t71429\n朝上\t71430\n烧心\t71431\n雅库茨克\t71432\n杜威\t71433\n光标的\t71434\n新邨\t71435\n幺\t71436\n宁夏政府\t71437\n的者\t71438\n缤特力\t71439\n阿卡索外教网\t71440\n花样符号\t71441\nNBA2K18+NBA2K17+NBA2K16+NBA2K15+NBA2K系\t71442\nsuga\t71443\n徐刚\t71444\nvalidform\t71445\n柑普\t71446\nHorner\t71447\nAUR\t71448\n样式表\t71449\n上海青\t71450\nav天堂2017\t71451\n永硕E盘\t71452\n切菜\t71453\n梅兰兵马俑\t71454\n踊\t71455\n5380\t71456\n赏樱花\t71457\novirt\t71458\n中国国际招标网\t71459\n中冶南方工程技术有限公司\t71460\njiashi\t71461\n标配\t71462\n陈晓\t71463\nIdle\t71464\n串行接口\t71465\n危险废物\t71466\n幸福观\t71467\n蝴蝶骨\t71468\n唾骂\t71469\n水分仪\t71470\n涛声依旧\t71471\
n答器\t71472\n代刷网\t71473\n刘挺\t71474\n港人\t71475\n其乐融融\t71476\nbepaly\t71477\n白夜叉\t71478\n3游乐场\t71479\n狗币\t71480\n福建国税\t71481\nBai\t71482\n接受\t71483\n封井\t71484\n滨海高新技术产业开发区\t71485\nupaly\t71486\n苏氏\t71487\n培杰\t71488\n二重\t71489\n热熔划线\t71490\n菏泽医学专科学校\t71491\n信号灯\t71492\n柳岸\t71493\n慵懒\t71494\n稀水\t71495\n去火\t71496\nStorm\t71497\n快乐8\t71498\n利率债\t71499\n围屋\t71500\n君行\t71501\n器子\t71502\n转马\t71503\n充气泵\t71504\n红海行动\t71505\n自底\t71506\n达泊西汀\t71507\n忏悔文\t71508\n脚长\t71509\nSAILOR\t71510\n2012\t71511\nDateUtils\t71512\n不瞒\t71513\n酷房网\t71514\n平安金融中心\t71515\n月之国\t71516\n税票\t71517\n红箭\t71518\n北京人民大会堂\t71519\n宜宾市环境保护局\t71520\npsvita\t71521\n苍火龙\t71522\n挡土板\t71523\n般若波罗蜜多心经\t71524\n广州城市职业学院\t71525\nSearch\t71526\nvuedraggable\t71527\n胶管\t71528\nPam\t71529\n大开发\t71530\nln3\t71531\n输尿管癌\t71532\nX280\t71533\n中国科学院遗传与发育生物学研究所\t71534\n刘虎\t71535\n5.2%\t71536\n浙江大学图书馆\t71537\n0132\t71538\n曹冲\t71539\n降头术\t71540\n西溪村\t71541\n一派\t71542\n库平\t71543\nDelacey\t71544\n创品\t71545\n电源线\t71546\nCJJT\t71547\n便当包\t71548\nKnews\t71549\nJS防水涂料\t71550\nJRE7\t71551\nstan\t71552\n流鼻涕\t71553\n特曼\t71554\n105mm\t71555\nn+2\t71556\n部分\t71557\n赣州银行\t71558\n正逢\t71559\ndrummer\t71560\n栈内存\t71561\n威尼斯国际电影节\t71562\n远红外\t71563\nnewff\t71564\n亲姐\t71565\n大罗\t71566\n维多利亚2吧\t71567\n神职\t71568\nsublime3\t71569\nr2v\t71570\n5548\t71571\nCasting\t71572\n贴水\t71573\npointing\t71574\n630\t71575\n振作\t71576\n802d\t71577\n43.dll\t71578\n1.1.6\t71579\n东坡全集\t71580\n氯酸钾\t71581\n卷扬\t71582\n同江市\t71583\nButter\t71584\n4010\t71585\n200年后\t71586\nENGINEERING\t71587\n偏僻\t71588\n责任状\t71589\n泡芙\t71590\n和政\t71591\n配音员\t71592\n四海\t71593\n万达影院\t71594\n单反稳定器\t71595\n杨萃先\t71596\nMorocco\t71597\n上层\t71598\n公优\t71599\n嗲嗲\t71600\np30\t71601\n黑白片\t71602\n神猴\t71603\n海湾地区\t71604\n三轴\t71605\n反吹\t71606\nRunaway\t71607\n佳宇\t71608\n长安欧尚A800图解\t71609\n四轴机器人\t71610\n12.08\t71611\nPuppeteer\t71612\n百金\t71613\n兆易\t71614\n变形金刚吧_\t71615\n镜中\t71616\nserif\t71617\n九十度\t71618\n单瓶装\t71619\n14支\t71620\n无痛分娩\t71621\n巯基丙酸\t71622\n屡次\t71623\n列夫·托尔斯泰\t71624\n700ml\t71625\n中国
反洗钱监测分析中心\t71626\n重症医学科\t71627\nDWI\t71628\n沙眼\t71629\n王小洪\t71630\n赛罕区\t71631\n强渡\t71632\nitools\t71633\nM430\t71634\n蒲公英茶\t71635\n南坪\t71636\nnotejs\t71637\n第10批\t71638\nworld2010\t71639\n95个\t71640\nMankind\t71641\n中文批量\t71642\ne代驾\t71643\n2018年4月26日\t71644\n南京艺术学院\t71645\n04月07日\t71646\n新晴\t71647\n剪除\t71648\n吉林地区\t71649\n滑石粉\t71650\n盐渍\t71651\n秘鲁\t71652\nYY\t71653\n明月湖\t71654\n南京市玄武区政府\t71655\n有机过氧化物\t71656\n出入境检疫\t71657\n表现形式\t71658\n兰比尔\t71659\n石屏\t71660\nChromebook\t71661\nChrom\t71662\n发音\t71663\n骚乱\t71664\n商机库\t71665\n分布式配置中心\t71666\n蚂蚁金服集团\t71667\n节地\t71668\ncousera\t71669\n多功\t71670\n易格斯\t71671\n官路风流\t71672\n小哈\t71673\n前辈\t71674\n9片\t71675\n帮倒忙\t71676\nQRS\t71677\n宋佳伦\t71678\n迪亚兹\t71679\n呱太\t71680\n父皇\t71681\n麦柴\t71682\n求嫁\t71683\nQQOI\t71684\n极域\t71685\n轻芒\t71686\naida\t71687\n中组部\t71688\n舍普琴科\t71689\nQQ营销软件\t71690\n穷神\t71691\n冒泡排序算法\t71692\n金鹰女神\t71693\nAD钙奶\t71694\n物牛\t71695\n王者荣耀露娜\t71696\n维多利亚\t71697\n蝶形\t71698\nhow\t71699\n兔子洞\t71700\nmastercam9.1\t71701\n福州小鱼网\t71702\n7|\t71703\n肃州\t71704\n遭拒绝\t71705\nRT\t71706\n甜玉米\t71707\n空防\t71708\n镜头感\t71709\n3f\t71710\n天地板\t71711\n收归\t71712\n附属物\t71713\n中国铁建电气化局集团有限公司\t71714\n达斯·维达\t71715\n神州钓鱼网\t71716\n152期\t71717\n上款\t71718\ncmdline\t71719\n4430\t71720\n小数位\t71721\n秀爱\t71722\n雨污分流\t71723\nmbr\t71724\n火器\t71725\n最深\t71726\nzu\t71727\n20.4\t71728\n钱三雄\t71729\n中南大学科学研究部\t71730\n四亩\t71731\n拒爱\t71732\njiadian\t71733\nGameGuardian\t71734\n_毛\t71735\nwinphone\t71736\n图图\t71737\napu\t71738\n中价\t71739\n国家自然科学基金管理信息系统\t71740\n人保财险\t71741\n酆都\t71742\n陈重\t71743\nso.6\t71744\n广东药学院\t71745\n板栗饼\t71746\nxmanager5\t71747\nmianfei\t71748\n流眼泪\t71749\n中骏置业控股有限公司\t71750\n小蚁摄像机\t71751\n鄢陵\t71752\n弦高\t71753\n深线\t71754\n秀客网\t71755\nI5-7200U\t71756\n南山镇\t71757\n识记\t71758\n梅花糕\t71759\n附息\t71760\n70亩\t71761\n三维点\t71762\n热熔枪\t71763\n麦克斯韦方程组\t71764\n摸摸\t71765\n桃源镇\t71766\n张伯雨\t71767\n字符编码转换\t71768\nwin10点\t71769\n幼女性\t71770\n益生元\t71771\n打击乐器\t71772\n王文俊\t71773\n2018年04月18日\t71774\nJavHD\t71775\n余姚市人民医院\t71776\n敔山湾\t71777\ncfi\t71778\n双飞
姐妹花\t71779\nMexican\t71780\n补铁口服液\t71781\n断腰\t71782\nskey\t71783\n狗扑\t71784\nT20\t71785\n本日\t71786\n阀控式\t71787\nLu\t71788\n医馆笑传\t71789\n磨蚀\t71790\n坡点\t71791\n丁姓\t71792\n九月的桃子\t71793\n李靖\t71794\n制药\t71795\n许亚军\t71796\njingdong\t71797\n++\t71798\n增重\t71799\n刮骨疗毒\t71800\nebod\t71801\n转变者\t71802\n24条\t71803\n爱妮岛\t71804\n中央全面深化改革领导小组\t71805\n小张\t71806\njax\t71807\n750毫升\t71808\n区块链技术\t71809\n冰肤传说\t71810\n领导层\t71811\n锦绣园\t71812\n食管癌\t71813\n龙虾馆\t71814\nexperiment\t71815\n22篇\t71816\n抢劫罪\t71817\n乒乓球拍\t71818\n厉行节约\t71819\n鼎盛时期\t71820\n巳\t71821\nbt盒子\t71822\n财保\t71823\n长沙铁道学院\t71824\n故事会\t71825\n枣庄\t71826\nmedicine\t71827\n春咲\t71828\n花缘\t71829\n大河剧\t71830\n加拿大阿尔伯塔大学\t71831\n影音先锋资源站\t71832\n孙立成\t71833\n1191\t71834\n李居丽\t71835\n海龟岛\t71836\n修仙狂徒\t71837\n广州市增城区人民政府\t71838\nvra\t71839\n16号\t71840\nWIN8系统\t71841\n广新\t71842\n青岛市市立医院\t71843\n程菲\t71844\n神尾舞\t71845\n书旗小说网--书旗网|书旗免费小说|书旗\t71846\n华力\t71847\n心尖宠\t71848\n猎豹cs9\t71849\n首仿药\t71850\n46万元\t71851\n5场\t71852\n俞孔坚\t71853\n绿植\t71854\n德德\t71855\n校花\t71856\n奥格威\t71857\n潮汕话\t71858\n安阳钢铁集团有限责任公司\t71859\n工业设计专业\t71860\n令状\t71861\nqlv\t71862\n半群\t71863\n电冰箱\t71864\n中国科技论文统计源\t71865\n王涌\t71866\n║\t71867\n25克\t71868\n85年\t71869\n侯明昊\t71870\nipadmini3\t71871\n晓得\t71872\nSon\t71873\n农工商\t71874\n链克币\t71875\n戴尔技术中心\t71876\n红星军事网\t71877\nv3.0.6\t71878\n前门店)\t71879\n17R3\t71880\n剪影\t71881\naloo\t71882\n全国性\t71883\n过滤盒\t71884\n80CM\t71885\n龙咁威\t71886\n卡萨帝\t71887\n哈德斯菲尔德大学\t71888\n重新排列\t71889\n汇通快运\t71890\n执政者\t71891\n腊味煲仔饭\t71892\n00小说网\t71893\n真的好想你\t71894\n陕旅集团\t71895\nframeset\t71896\nk歌\t71897\n信徒\t71898\n伤速\t71899\n祖国大好河山\t71900\nTails\t71901\n2017年11月27日\t71902\n理工学院\t71903\n船速\t71904\n这两样\t71905\nmotif\t71906\npu\t71907\n中国石油集团公司\t71908\n语音输入法\t71909\n北京大学城市与环境学院\t71910\n喵星人大\t71911\n内田真礼\t71912\n平躺\t71913\n吃肉\t71914\n虹影\t71915\n陈曦\t71916\n弹性\t71917\n天猫优衣库\t71918\nKT\t71919\n约客\t71920\n你和我的倾城时光\t71921\nBTS\t71922\n菲尼尔\t71923\n13.5万\t71924\n4月16日\t71925\n蓝精灵2\t71926\n杨子大鹏\t71927\n无机房\t71928\n天疱疮\t71929\n抗病毒药\t71930\n自考专科\t71931\n芝麻油\t7193
2\n国家食药监局\t71933\n河南地区\t71934\n睡去\t71935\nWCA\t71936\n专项维修基金\t71937\n可否\t71938\n进纸传感器\t71939\nVersailles\t71940\n东北育才双语学校\t71941\nAng\t71942\n七·一\t71943\n平路\t71944\nosumania\t71945\nusrp\t71946\n我的未来不是梦\t71947\n方慧\t71948\n陈吉宁\t71949\n慢性肾衰\t71950\n水浒城\t71951\n走向深蓝\t71952\nwpa/wpa2\t71953\n代和\t71954\n江干下沙\t71955\n看穿\t71956\n双万\t71957\n阴齿\t71958\nResponsible\t71959\n技术监督\t71960\n玩图\t71961\nModel\t71962\n荣焉\t71963\n给予\t71964\n_博\t71965\n鉴真\t71966\n真可谓\t71967\n非洲人\t71968\n热烈\t71969\nMockMVC\t71970\n格兰蒂亚2\t71971\nall路\t71972\n射频连接器\t71973\n仙人岛\t71974\n新大陆\t71975\n高美\t71976\nMANUFACTURING\t71977\n红光村\t71978\n许靖\t71979\n五粮液酒\t71980\ncabin\t71981\n恐怖直播\t71982\nsena\t71983\n224号\t71984\n花姑娘\t71985\n逮捕令\t71986\n4399小游戏\t71987\n北理\t71988\n中国三峡集团\t71989\n点心师\t71990\n爱辣啵啵鱼\t71991\n形制\t71992\n12340\t71993\nlm1875\t71994\n丁桂儿\t71995\n发丝\t71996\n秋分\t71997\n坦克世界吧\t71998\n看不成\t71999\n肮脏\t72000\n男鬼\t72001\n刘少华\t72002\n退去\t72003\n黄政\t72004\n间苯三酚\t72005\n居住条件\t72006\n奇云\t72007\n大和\t72008\n5连\t72009\n荆州电视台\t72010\n科美\t72011\n色纸\t72012\n大奖网\t72013\n白金梦卡\t72014\n大孤山\t72015\n短头\t72016\n乳环\t72017\n王海玲\t72018\n观日\t72019\n巨头们\t72020\n反常\t72021\n检品\t72022\n胸罩\t72023\n萱草\t72024\n金首\t72025\n昭然\t72026\n东京电视台\t72027\nRSA算法\t72028\n3000年\t72029\nDu\t72030\n时而\t72031\nAlCl3\t72032\n宠婚\t72033\n杂事\t72034\n六美网\t72035\nWillie\t72036\n25款\t72037\nfree91\t72038\n秋之回忆8\t72039\nhzwer\t72040\nDOA5\t72041\n法律人\t72042\n直白\t72043\nSK-II\t72044\n趁虚而入\t72045\n复合绝缘子\t72046\nM117舞曲库\t72047\n锆英砂\t72048\n陕西省地方税务局\t72049\n悬吊术\t72050\n气脉\t72051\n胸垫\t72052\n宅人\t72053\n李天一\t72054\n购物圈\t72055\n曹凯\t72056\n奥登\t72057\n遵生八笺\t72058\n诫勉\t72059\n节目源\t72060\nrc3\t72061\nSpringboot\t72062\n肯定\t72063\n上海13号线\t72064\n久泰\t72065\n6.0.9\t72066\n好人家\t72067\nSIM800C\t72068\n加热板\t72069\n国家物资储备局\t72070\n皮特猫\t72071\n挽狂澜\t72072\n建造者\t72073\n茶人\t72074\n花禾\t72075\n宣城职业技术学院\t72076\nLadyGaga\t72077\n乐晨\t72078\nMoo\t72079\n分分彩\t72080\n四兄弟\t72081\nKhubilai\t72082\n艾美酒店\t72083\n收交\t72084\n1.8元\t72085\nv6.0.1\t72086\nnang\t72087\n碾磨\t72088\n三村\t72089\
n活春宫\t72090\nvegetarian\t72091\n魔花\t72092\n广州市来穗人员服务管理局\t72093\n爱不在\t72094\n36秒\t72095\n长平\t72096\n泰斗级\t72097\n水男\t72098\n林熙蕾\t72099\n冻带鱼\t72100\n酷暑\t72101\nB250M-PLUS\t72102\n实际利率法\t72103\nairbook\t72104\n硅业\t72105\n小鱼池\t72106\n逃亡之路\t72107\n同号\t72108\n昆明医科大学第一附属医院\t72109\n汁液\t72110\n奥特曼格斗进化重生\t72111\n烤猪\t72112\nSMW\t72113\nfume\t72114\n抛砖引玉\t72115\n阿尔杰塔\t72116\nzimo\t72117\n15厘\t72118\n招投标管理中心\t72119\n1.7.12\t72120\n本学\t72121\nmenat\t72122\nawareness\t72123\n有话说\t72124\n蚊\t72125\n防疫员\t72126\n锦棉\t72127\n说明表\t72128\n上海市胸科医院\t72129\n塌缩\t72130\n要空\t72131\n江宁科学园\t72132\n卖得\t72133\n高魔\t72134\n科颜氏\t72135\nATCC\t72136\n红土\t72137\n久游论坛\t72138\n南京鼓楼\t72139\n杭电oj\t72140\n游迅网\t72141\n生孩\t72142\n迪茵湖\t72143\n郑州科技学院\t72144\n萌琪琪Irene\t72145\n1.25mm\t72146\nblockchain\t72147\ntravel\t72148\n直方图\t72149\n一舟\t72150\n奔丧\t72151\n中国美院\t72152\nNitrogen\t72153\ntaoke\t72154\n鲅鱼\t72155\n海卓\t72156\n施华洛\t72157\n东山小学\t72158\n卡斯罗犬\t72159\n邵丹\t72160\n前列腺素\t72161\nSetting\t72162\n地主阶级\t72163\nclipper\t72164\n_心食谱\t72165\n珞喻路\t72166\n华为荣耀路由器\t72167\n西藏天路\t72168\n厂牌\t72169\n画彩\t72170\n剑刃\t72171\n中央农办\t72172\np6spy\t72173\n50发\t72174\n俄罗\t72175\n第20章\t72176\nb11\t72177\n财格\t72178\n扬长而去\t72179\n6505\t72180\nzhon\t72181\n杨家村\t72182\n宜人财富\t72183\n尾上\t72184\n妈超3\t72185\n爱科技\t72186\n大站\t72187\n上海国家会计学院\t72188\n济南妈妈网\t72189\n埃兰\t72190\n10双\t72191\n78月份\t72192\n梅花儿香\t72193\n蚂蚁小宝卡\t72194\nzhp\t72195\n周董\t72196\n散称\t72197\nJav\t72198\n保税油\t72199\n鸳鸯金楼\t72200\n4成\t72201\n门器\t72202\n眼字\t72203\n输血科\t72204\n芒果\t72205\npsa\t72206\n牛棚\t72207\n暖场\t72208\n神幻拍档\t72209\n预警仪\t72210\nsc2020\t72211\n0436\t72212\nsvp\t72213\n动漫画\t72214\n老贺\t72215\n铝价\t72216\n蓝黑色\t72217\n驹马\t72218\n工场\t72219\n唐仁\t72220\n王义\t72221\n三光\t72222\n白皙\t72223\n不置\t72224\n书贼\t72225\n字段名\t72226\n姑苏城\t72227\n人脸识别系统\t72228\nBan\t72229\n红番区\t72230\n裴淳华\t72231\nTFBoys\t72232\n冀人社\t72233\n棘皮\t72234\n斋戒\t72235\n花膜\t72236\nBrains\t72237\n东莞市区\t72238\n铁岭县\t72239\n樱花电热水器\t72240\n唯一值\t72241\n471\t72242\n每星期\t72243\n王苑苑\t72244\n松狮\t72245\n硬币\t72246\n尾鳍\t72247\
n百货类\t72248\n出牌\t72249\n劈柴院\t72250\n元亨利\t72251\n包头市人民政府\t72252\nSnow\t72253\n650w\t72254\n自然语言\t72255\n好大一棵树\t72256\n北京统计信息网\t72257\n小米兰亭\t72258\n社会工作综合能力\t72259\n2000%\t72260\n兴中\t72261\n七届二次\t72262\n7280\t72263\n诸葛孔明传\t72264\n龙尺\t72265\n陈碧舸\t72266\n方巾\t72267\n1427\t72268\n心绪\t72269\n儿歌网\t72270\n中种\t72271\n6818\t72272\n李毓佩\t72273\n贝司\t72274\n铲刀\t72275\n大新华航空\t72276\n2204n\t72277\n欲火难耐\t72278\n陈妍希\t72279\n年利率\t72280\nsxx\t72281\n皮囊之下\t72282\n虚拟路由器\t72283\n巨推链\t72284\n小米盒子增强版\t72285\n零9个月\t72286\n当易网\t72287\n建筑学\t72288\n李振国\t72289\n云南省高级人民法院\t72290\n40多家\t72291\n新亭\t72292\n必不可少\t72293\n正规体\t72294\n六度伯乐网\t72295\n入货\t72296\nWDR7500\t72297\ndajian\t72298\ndylib\t72299\nUTC\t72300\n桑树叶\t72301\n掌勺\t72302\n东风区\t72303\n第112章\t72304\n完工\t72305\n无锡工商局\t72306\n比较性\t72307\n955\t72308\n右拐\t72309\n艾饼\t72310\n舍力\t72311\n88\t72312\nBobo\t72313\n遗精\t72314\nscotland\t72315\n沈波\t72316\n降压型\t72317\n老梁观世界\t72318\nV5.0.0\t72319\n塞上曲\t72320\n超声诊断学\t72321\nSaaS系统公司\t72322\n后\t72323\n半卷\t72324\np4g\t72325\n阿祥\t72326\nXeno\t72327\n山香教育\t72328\n阿卡纳\t72329\n茶晶\t72330\n与人为\t72331\n好可怜\t72332\n手腕带\t72333\n106\t72334\n编者按\t72335\nGOLANG\t72336\n瑞士再保险\t72337\n伊思水乳\t72338\nLOL英雄联盟\t72339\n20160203\t72340\n造模\t72341\n杭州半山\t72342\n便服\t72343\n汇鸿集团\t72344\n王策\t72345\n中国移动北京公司\t72346\n王异\t72347\n大补阴丸\t72348\n俺娘田小草\t72349\n05\t72350\n微吼\t72351\n昭和元禄落语\t72352\n王大勇\t72353\n跟骨\t72354\n微博&私杂志\t72355\n偷运\t72356\n包臀裙\t72357\n抑菌液\t72358\n武功县\t72359\n狼群\t72360\n照花台\t72361\n普通外科学\t72362\n20151202\t72363\n温州百姓网\t72364\n39万\t72365\n广玉兰\t72366\n水位计\t72367\n面试者\t72368\n第一堂\t72369\n襄汾\t72370\n宽带山社区\t72371\n刊首语\t72372\n1291\t72373\nmanuals\t72374\n维文\t72375\nZealer\t72376\n元天\t72377\n万能遥控器\t72378\n中华人民共和国国家档案局\t72379\nGL6\t72380\n欲望之门\t72381\n抵顶\t72382\nグ\t72383\n鸿发\t72384\n簧管\t72385\n短小精悍\t72386\n七维\t72387\n雪宝\t72388\n美题\t72389\n黄猫\t72390\n替代物\t72391\n159级\t72392\nt500\t72393\nAREA\t72394\n链家自如友家\t72395\n决器\t72396\n灰镜\t72397\n横沙\t72398\n情色片\t72399\n整流滤波电路\t72400\nfsn\t72401\n小米滑板车\t72402\n二阶偏导数\t72403\nMybatis\t72404\
n中国银行上海分行\t72405\n赤霞珠\t72406\n单刀赴\t72407\n弘\t72408\n雪峰\t72409\nptm\t72410\n宝马X3\t72411\n获邀\t72412\nUV树脂\t72413\n高德车机版\t72414\n尤瓦尔·赫拉利\t72415\n女神族\t72416\n圣培露\t72417\n陈爱民\t72418\nQ_\t72419\n泰坦陨落2\t72420\nsld\t72421\nlcov\t72422\n三叶草\t72423\n收益率\t72424\n人工智能机器学习\t72425\nsurface3吧_\t72426\n龙嘉\t72427\n土地革命\t72428\n秀色生香\t72429\n雷狮\t72430\n156号\t72431\n元宝梁\t72432\n新农村合作医疗\t72433\n漫道\t72434\n平安金融壹账通\t72435\n深圳卫视\t72436\nav色\t72437\n岳阳政府网\t72438\n7天\t72439\n短臂\t72440\n十三五规划.doc\t72441\nubyte\t72442\n超级机器人大战j\t72443\n银钢mini\t72444\nskyworks\t72445\nNiconico\t72446\n传奇世界sf\t72447\n四川出入境检验检疫局\t72448\n仁\t72449\n家家优保网\t72450\nBPSK\t72451\n伊苏\t72452\n景迈山\t72453\n刘栋\t72454\n圆砾\t72455\n三角关系\t72456\nYung\t72457\n全汉\t72458\n中建五局\t72459\n熟龄\t72460\n法卫士\t72461\n2819\t72462\n红豆今网\t72463\n郝思嘉\t72464\ntope\t72465\n龙辉\t72466\n正交性\t72467\n签发者\t72468\n无人能比\t72469\njavah\t72470\n冯亮\t72471\n芥子\t72472\n高低温\t72473\n金碟豹\t72474\nLesbians\t72475\n000505\t72476\n杨远\t72477\n贵州建工集团\t72478\nWin2003\t72479\n霜色\t72480\n曹正\t72481\n窗口期\t72482\n幸孕\t72483\n点意\t72484\n烙锅\t72485\n五菱宝骏\t72486\n艳姆\t72487\nfond\t72488\n市场监督管理局\t72489\n皮带秤\t72490\n道源\t72491\n犯罪学\t72492\n慢性扁桃体炎\t72493\n外研版\t72494\n武神风暴\t72495\n共产主义青年团\t72496\n泰盛\t72497\n霓漫天\t72498\nBooty\t72499\n装卸\t72500\n蚁坊\t72501\n-&#160\t72502\n个人经营性贷款\t72503\n2016年3月28日\t72504\n屈居\t72505\n发电机房\t72506\n君华\t72507\n花亭\t72508\n601186\t72509\nQQ公众号\t72510\nvirsh\t72511\n现世式\t72512\nQuidway\t72513\n拉野\t72514\n我学炒股网\t72515\n北京吉利大学\t72516\n龟岛\t72517\n前世\t72518\n龙珠超悟空\t72519\n乾隆\t72520\nppt批量\t72521\ntodo\t72522\n因特\t72523\n益盟操盘手\t72524\nRadisson\t72525\n大联唱\t72526\n35d\t72527\n女厕\t72528\n碱式氯化铝\t72529\ndmt\t72530\n滨海开发区\t72531\nプ嬢\t72532\npolls\t72533\n理所当然\t72534\nbtrace\t72535\n山东齐鲁医院\t72536\n实行\t72537\nC35\t72538\n裁判权\t72539\n萝莉呦呦\t72540\ngitstack\t72541\npounds\t72542\n劣迹\t72543\ncookie\t72544\nEFT\t72545\n肢\t72546\n财大\t72547\nCHICKEN\t72548\n赏月\t72549\n化合物\t72550\n于强\t72551\nBKD\t72552\n李玮珉\t72553\n6.98万元\t72554\n空军军医大学西京医院\t72555\n3季度\t72556\n铁城市广场\t72557\n天晟\t72
558\n46亿\t72559\n外扣\t72560\ncolumbus\t72561\n前委\t72562\n公视\t72563\n闷油\t72564\n微单相机\t72565\n新奇世界\t72566\n大神康\t72567\n60R16\t72568\n6月7号\t72569\n3000_\t72570\nSupplier\t72571\n南外滩\t72572\n固定式升降机\t72573\n特美\t72574\n狐踪\t72575\n米索前列醇\t72576\n回答题\t72577\nWrangler\t72578\n天冬氨酸\t72579\n吉林敖东\t72580\n粱\t72581\n周里京\t72582\n十二个\t72583\nproduits\t72584\nsoftmax函数\t72585\n30#\t72586\n曲面积分\t72587\nghost\t72588\ncx200\t72589\n火影忍者忍者大师\t72590\nGit服务器\t72591\n企业改制\t72592\n王禹偁\t72593\n水压\t72594\n白雪仙\t72595\n狼少女\t72596\n爆机\t72597\n最近两个月\t72598\nwanke\t72599\nSpeXial\t72600\nSpaceVim\t72601\n外行人\t72602\n小豆\t72603\n9u\t72604\nLARGE\t72605\nocean\t72606\n末代皇后\t72607\n索尔\t72608\ntoryburch\t72609\n黑帮电影\t72610\n天翼视讯\t72611\n巴塔哥尼亚\t72612\n佳士\t72613\n诗经采薇\t72614\n超高频\t72615\nNachos\t72616\n季冠霖\t72617\n101围棋网\t72618\n10多名\t72619\n杨乐\t72620\n娱乐在线\t72621\n民事判决\t72622\n20首\t72623\naudirvana\t72624\n乐安康\t72625\n陈希同\t72626\n落物\t72627\n攀高峰\t72628\n随处可见\t72629\n3.3G\t72630\n十日谈\t72631\n2色\t72632\n阶级\t72633\n动名\t72634\ncharcoal\t72635\n石棉垫\t72636\n幼幼版\t72637\n中国医药科技出版社\t72638\n榄\t72639\n国务院妇女儿童工作委员会\t72640\n阿房宫\t72641\n信赖\t72642\n斯汀\t72643\n海林\t72644\n即期收益率\t72645\n怠惰\t72646\n耳鸣\t72647\n陈冰冰\t72648\n基多拉\t72649\n150nk\t72650\nCMS\t72651\n钟鼓楼\t72652\n华为社\t72653\n附注\t72654\n决选\t72655\nDesigned\t72656\n烩饭\t72657\n山下智久\t72658\n觉慧\t72659\n中国共产党西安市委员会\t72660\njta\t72661\nxata\t72662\n行政史\t72663\n探月\t72664\n拉宾\t72665\nVBA编程\t72666\n2015年6月\t72667\n李秀根\t72668\n西服\t72669\n212个\t72670\n阳明区\t72671\n一批二批\t72672\nvisualizer\t72673\n豺龙\t72674\n滑行车\t72675\n黄胆\t72676\n独奏谱\t72677\nREG007\t72678\nLavarel\t72679\n3375\t72680\n蓁蓁\t72681\n三里屯太古里\t72682\nSoy\t72683\n王正龙\t72684\n七绝\t72685\n简记\t72686\ninsulator\t72687\nagain\t72688\n深深\t72689\nbitch\t72690\n云南省卫生计生委\t72691\n秋山君\t72692\nOpenVZ\t72693\n6起\t72694\n百仕达\t72695\n1.5分\t72696\nID选择器\t72697\nwrinkle\t72698\n每30秒\t72699\nrespository\t72700\nlizi\t72701\n2.63\t72702\nAiCloud\t72703\n塞尔达传说:时之笛\t72704\n八讲\t72705\n塔城\t72706\n综合素质\t72707\n九宾\t72708\n韦弗\t72709\nISO26262\t72710
\n盘景\t72711\n3n\t72712\n植物\t72713\n拳打脚踢\t72714\n不圆\t72715\ninitramfs\t72716\nEllie\t72717\n法轮功\t72718\n邬达克\t72719\n新校区\t72720\n人间第一情\t72721\n红军长征\t72722\n2018-03-16\t72723\n唯才教育网\t72724\n田波\t72725\ntaiguo\t72726\n众星\t72727\n认祖\t72728\ndiversified\t72729\nV22\t72730\n29度\t72731\n西北局\t72732\n钢笔\t72733\n美方\t72734\n珞樱\t72735\nJizz\t72736\nAgreements\t72737\ncpython\t72738\nOpenRice\t72739\n中信海直\t72740\n杰拉\t72741\n律人\t72742\n锯齿状\t72743\n沙依巴克\t72744\nvivox9\t72745\n奇异人生\t72746\n大富翁\t72747\n流动相\t72748\n启东中学\t72749\nthickness\t72750\n威斯康星州\t72751\n毒火\t72752\n截包\t72753\n沉稳\t72754\n神坑\t72755\n市海洋局\t72756\n果团网\t72757\npandown\t72758\n入市\t72759\n综合能力\t72760\n202个\t72761\n地址块\t72762\n结社\t72763\n颓丧\t72764\n芒果tv\t72765\n马齿笕\t72766\nKoa2\t72767\nELE\t72768\n杀手2\t72769\n唐静\t72770\n欧柏利\t72771\n外教社\t72772\n冷轧\t72773\n铁矿石\t72774\n热应力\t72775\n央视频\t72776\n牵连\t72777\n勾栏\t72778\n君子\t72779\nmultisim10\t72780\n暴气\t72781\nCAR\t72782\n愚者\t72783\n沥\t72784\n1.0.0.7\t72785\n关西\t72786\n官房\t72787\n浙中新报\t72788\n金宝贝早教\t72789\nwills\t72790\n杜克大学\t72791\n重点高中\t72792\n甘井子万达广场\t72793\nNIOS\t72794\n91Ri.org\t72795\n伸腿\t72796\nCohiba\t72797\nStranded\t72798\n豆邮\t72799\n20160207\t72800\n腕力球\t72801\n方向性\t72802\n欧泉美业\t72803\n世界纪录\t72804\ncjq\t72805\nGPS定位器\t72806\n拒食\t72807\n根本法\t72808\nxuyao\t72809\nCamper\t72810\n西安教育\t72811\n1391\t72812\n沙葛\t72813\n此女\t72814\nJohan\t72815\nMade\t72816\n山药排骨汤\t72817\n男饭\t72818\n企业年金办法\t72819\n法国航空公司\t72820\n收缩毛孔\t72821\n护理\t72822\n活性炭纤维\t72823\n1.08G\t72824\n天长市人民政府\t72825\n66769630\t72826\nReagents\t72827\n云发卡\t72828\n登山节\t72829\n福克斯rs\t72830\n兵车\t72831\n64g\t72832\n中环国际\t72833\n王麒诚\t72834\n天猫盒子\t72835\n一霎\t72836\n21幢\t72837\nHALF\t72838\n中公\t72839\nclit\t72840\nNaOH\t72841\nEBS\t72842\n万庆良\t72843\nUAS\t72844\n安妮公主\t72845\n恳谈会\t72846\n朱丽倩\t72847\nSex\t72848\n2016年11月23日\t72849\n603259\t72850\nAP1700\t72851\n完全数\t72852\n价额\t72853\n光彩照人\t72854\n制作师\t72855\n节油器\t72856\n_美篇\t72857\n7180dn\t72858\nnie\t72859\n槟榔树\t72860\n木瓜膏\t72861\n宜宾五粮液股份有限公司\t72862\n跨期\t72863\n缺钙\t72864\n黄金花
园\t72865\n大友\t72866\n大浦\t72867\n魔方网\t72868\nemx\t72869\npuma\t72870\n960px\t72871\n爱好者\t72872\n苹果I派党\t72873\n_玩游戏网\t72874\n360手机F4\t72875\n尖端\t72876\n对和\t72877\n高鸿股份\t72878\nPersonals\t72879\n鹰潭北\t72880\n艾琳娜\t72881\n五十八\t72882\n高压条\t72883\n博世力士乐\t72884\n湖南幼儿师范高等专科学校\t72885\n第04章\t72886\nunity\t72887\n创业板上市管理办法\t72888\n深圳高新区\t72889\n赛科龙\t72890\n滑膛枪\t72891\n第_\t72892\n白水镇\t72893\n清风亭\t72894\n作不死\t72895\n好吃街\t72896\n17周\t72897\nHertz\t72898\n汪可盈\t72899\n文明\t72900\nJansen\t72901\nSuperdry\t72902\n家庭理财规划\t72903\n巨蟹座女\t72904\n点歌\t72905\n南华早报\t72906\n吹膜\t72907\n我们爱的难舍难分爱\t72908\n抢劫\t72909\n唇纹\t72910\n乐山市人民政府\t72911\n重庆出入境检验检疫局\t72912\n关系型\t72913\nknit\t72914\n岳西网\t72915\n大一码\t72916\n一元二次方程ax2+bx+c=0\t72917\n戴比尔斯\t72918\nScratch2.0\t72919\n变异性哮喘\t72920\n入党申请书_无忧考网\t72921\n轮回\t72922\n被叫\t72923\n惨无\t72924\n爱法\t72925\n1076\t72926\n主政\t72927\n疯狂赛车\t72928\n五遍\t72929\n|页\t72930\n郑州市妇幼保健院\t72931\n弗吉尼亚·伍尔夫\t72932\n银子弹\t72933\n薇诺娜·瑞德\t72934\n胜山镇\t72935\n评量\t72936\n李翱\t72937\nacces\t72938\n唠\t72939\n双性\t72940\n继续教育处\t72941\n费加罗的婚礼\t72942\n五三天天\t72943\nrepaired\t72944\n欲动\t72945\n道德与法治\t72946\nopera\t72947\n中华医学会糖尿病学分会\t72948\nf579\t72949\n卡拉卡斯\t72950\n河海大学常州校区\t72951\n硬片\t72952\n护板\t72953\n乐亭月坨岛\t72954\nCNKI科研诚信管理系统研究中心\t72955\nActi\t72956\n生活资料\t72957\n富力地产集团\t72958\n柳东\t72959\n让路\t72960\n中关村图书大厦\t72961\nInventor\t72962\n玩去\t72963\nRAINBOW\t72964\n神爱之家吧\t72965\n915\t72966\nBlame\t72967\nDreamsqin\t72968\n通信费\t72969\n黄赌\t72970\n附魔卡\t72971\n技校生\t72972\n双射\t72973\n吴喜之\t72974\n备急千金要方\t72975\n喂食\t72976\n上大附中\t72977\n韩玉\t72978\n肉面\t72979\n消光粉\t72980\nKaricare\t72981\n形象性\t72982\n焊接变位机\t72983\n7日\t72984\n麻灰\t72985\nhbo\t72986\nstill\t72987\n曲子\t72988\n销售代表\t72989\n最惠\t72990\n磁带\t72991\n德铁\t72992\n张萌\t72993\n龟牌\t72994\n壶盖\t72995\nEnglish\t72996\nZendStudio\t72997\n思达派\t72998\n声乐\t72999\n被动语态\t73000\n掌火\t73001\n杰大\t73002\n杀绝\t73003\nP55\t73004\n姜营机场\t73005\n热熔器\t73006\nncstudio\t73007\n双曲线\t73008\n库博\t73009\n英腾\t73010\ntickers\t73011\n千米网\t73012\n_主\t73013\n齐放\t73014\n5W-30\t73015\n半斤八两\t730
16\n亚心论坛\t73017\n20150110\t73018\n百联\t73019\n险些\t73020\n物联网导论\t73021\n蛯原\t73022\n麻将席\t73023\n黑场\t73024\n二分类\t73025\n533\t73026\n阿山\t73027\n食品有限公司\t73028\n勤务员\t73029\nip4\t73030\n绳奴\t73031\n张维迎\t73032\n东京迪士尼\t73033\n朝阳市人民政府\t73034\n资深\t73035\n古实\t73036\n爆本\t73037\nsayatoo\t73038\n女郎\t73039\n自由搏击\t73040\n梦间集\t73041\n更有趣\t73042\n清南\t73043\n侦探推理\t73044\n东城区小学\t73045\nusg6000\t73046\n滴灌\t73047\n酱爆\t73048\n绿裙\t73049\n袁军\t73050\n六次\t73051\n原种\t73052\n联苯\t73053\n民事诉状\t73054\n板蓝\t73055\nComplexity\t73056\n巴国\t73057\n手银\t73058\n欧氏\t73059\n科普书\t73060\n4.0.2\t73061\n组团社\t73062\n换人\t73063\n广东省发展改革委\t73064\n397.31\t73065\n多倍\t73066\n南宁轨道交通集团有限责任公司\t73067\n文学作品\t73068\nHillary\t73069\n甬剧\t73070\n京良路\t73071\n人事考试信息网\t73072\n2008多\t73073\n多级泵\t73074\n贪污罪\t73075\n方块学园\t73076\n宁溧\t73077\n齿面\t73078\nleon\t73079\n南京航空航天大学研究生院\t73080\n战位\t73081\n革职\t73082\nbored\t73083\n火宝\t73084\n泰明\t73085\nU11+\t73086\n屋架\t73087\n特优\t73088\n没味\t73089\n农机具\t73090\n皇家马德里足球俱乐部\t73091\nGofun\t73092\n提升器\t73093\n古山镇\t73094\n乳清蛋白粉\t73095\n洋酒\t73096\n摄友\t73097\nmyocardial\t73098\n杞人忧天\t73099\n粉仓\t73100\n甘肃省地震局\t73101\nxjl\t73102\n龙王的工作\t73103\n麦考林\t73104\n韩三平\t73105\n碧绿\t73106\n多子\t73107\n路亚\t73108\nAMM\t73109\ndetermined\t73110\nema\t73111\n另一个\t73112\n封坛\t73113\n竹浆\t73114\n北林\t73115\n裸体女\t73116\n满腔热情\t73117\nV5-6R2016\t73118\nGonzalez\t73119\n姚伟涛\t73120\n阴蒂\t73121\n天梯\t73122\n360元\t73123\nmsgsnd\t73124\ninc\t73125\npssh\t73126\n回收站\t73127\nacn\t73128\n快雪\t73129\n世界女排大奖赛\t73130\n明知道\t73131\n跳档\t73132\n框架箱\t73133\nchick\t73134\n鱼鳖\t73135\n950_\t73136\n安娜子桐\t73137\n中冶建筑研究总院有限公司\t73138\n下载\t73139\n美女蛇\t73140\n恒信东方\t73141\n五岁\t73142\n讲不出再见\t73143\n万科明天广场\t73144\n预应力砼\t73145\n3.3米\t73146\n地心\t73147\n光谱仪\t73148\nCAMERA\t73149\n女囚犯\t73150\n第123期\t73151\nInvention\t73152\n拉菲红酒\t73153\n神经内分泌肿瘤\t73154\n男气\t73155\n1231\t73156\n指纹仪\t73157\n记取\t73158\nv2.3.0\t73159\n52次\t73160\n轮椅车\t73161\nASU\t73162\n世中\t73163\n百货商场\t73164\n张江高科\t73165\nroboto\t73166\n小苏打片\t73167\n成都市第五人民医院\t73168\nMC5\t73169\n办公大厦\t73170\nwnv\t73171\n操
作按键\t73172\n自杀案\t73173\n中国红牛\t73174\n拥\t73175\n装疯卖傻\t73176\nseats\t73177\n点中\t73178\n韭菜\t73179\n国民美少女\t73180\n歧途\t73181\n台办\t73182\n推焦\t73183\nbigtable\t73184\n淋巴结节\t73185\n苏定方\t73186\n海峡之声网\t73187\n第71届\t73188\n合格品\t73189\n济济\t73190\n3.4万\t73191\n直向\t73192\nDCA\t73193\n督察长\t73194\n绿豆泥\t73195\n平乐园\t73196\n平衡图\t73197\nmingyue1818\t73198\n神幻之恋\t73199\n30集\t73200\n尾行2\t73201\noptane\t73202\n美全\t73203\n玩老\t73204\n鸠江区\t73205\n91y游戏中心\t73206\n20x20\t73207\nlandesk\t73208\n福达\t73209\n百褶裙\t73210\n纯属\t73211\n热式\t73212\n2016英雄联盟\t73213\n阿尔杰斯\t73214\nwindows-CSDN\t73215\n质度\t73216\n混凝土管\t73217\n商用车\t73218\n淄博火车站南广场\t73219\n裸板\t73220\n联想笔记本触摸板\t73221\n中铁二十五局集团\t73222\n条批\t73223\nphar\t73224\n梁萧\t73225\n乔燃\t73226\n不舍得\t73227\n凌空\t73228\n法律责\t73229\n短鲷\t73230\n加勒比海盗4\t73231\n真三国无双6猛将传\t73232\n20160127\t73233\n合工大\t73234\n垦利县\t73235\n金三顺\t73236\nvue.js2.0\t73237\n中国科学院高能物理研究所\t73238\n吴家\t73239\n芝加哥大学\t73240\nIDs\t73241\nMohammed\t73242\n32路\t73243\n牛文文\t73244\nFaire\t73245\n坂深雪\t73246\nFiji\t73247\n燎原\t73248\n烤羊排\t73249\n2^2\t73250\n北京国贸大厦\t73251\n心海\t73252\n5sing\t73253\n模理\t73254\n素野\t73255\npp值\t73256\n佛友\t73257\n布达佩斯大饭店\t73258\n云神\t73259\n直埋光缆\t73260\n峰峰信息港\t73261\n68条\t73262\n流放之路国际服\t73263\n日经指数\t73264\n市南区\t73265\n周宁县\t73266\n掉价\t73267\nmysql-connector\t73268\n福州大学\t73269\nsamrt\t73270\n2017年7月11日\t73271\n170425\t73272\nBOA\t73273\n挖空\t73274\n首开龙湖天璞\t73275\n离子束\t73276\n难舍难分\t73277\n小姨子的诱惑\t73278\n俊俏\t73279\n鹰冠\t73280\n中国平安_平安金融\t73281\nufc吧\t73282\n丹药\t73283\n武汉城市职业学院\t73284\n何山路\t73285\n2015.5\t73286\n腿毛\t73287\nremark\t73288\n网易数读_网易新闻中心\t73289\n上海沪震实业有限公司\t73290\nPresentation\t73291\n我的城\t73292\n面儿\t73293\n手机信号放大器\t73294\n溢液\t73295\nSelect2\t73296\n无赖勇者的鬼畜美学\t73297\n广石化\t73298\nblacklist\t73299\n浦发银行上海分行\t73300\nQMS\t73301\n航协\t73302\n周翔\t73303\n费伍德\t73304\nnowhere\t73305\nvivox21吧\t73306\n张嘉泽\t73307\n快轨\t73308\nytm\t73309\n山丘\t73310\n民族观\t73311\n密钥\t73312\n北京市交通委员会\t73313\n艾特奖\t73314\n棕编\t73315\n魔鬼\t73316\n美国海军\t73317\n解压版\t73318\nAK\t73319\n4006\t73320\nhashmap\t73321\
npall\t73322\n村村通\t73323\n查理和巧克力工厂\t73324\n鸟具\t73325\nBom\t73326\nibp\t73327\nwe+\t73328\n雅典奥运会\t73329\n17斤\t73330\n8280\t73331\n查看器\t73332\n凯隆\t73333\n一大截\t73334\n黄铜板\t73335\nnsight\t73336\n双城堡\t73337\n天位\t73338\n读书会\t73339\n马达加斯加的企鹅\t73340\n才干\t73341\n谷饶\t73342\n健身仓\t73343\n世家\t73344\nSara\t73345\n紫苏籽油\t73346\ncirrus\t73347\n合影\t73348\n曼听公园\t73349\n4.0.30319\t73350\n941路\t73351\n北京大学国际法学院\t73352\n艺恩网\t73353\n一龙\t73354\n重映版\t73355\n阳光学校\t73356\n人民法院诉讼收费办法\t73357\nf1.2\t73358\nfutura\t73359\n孙为民\t73360\n反应型\t73361\n22218515\t73362\n淇奥\t73363\n枫糖\t73364\n登士柏\t73365\nyizhi\t73366\n明年3月\t73367\n老威\t73368\n威海高区\t73369\n对流\t73370\nCsgirl\t73371\n简约主义\t73372\n一团网\t73373\n对口考试\t73374\n茶黄素\t73375\n阳离子聚丙烯酰胺\t73376\n光缆交接箱\t73377\n胶装机\t73378\n华西股份\t73379\n大来\t73380\n三校\t73381\n学杂费\t73382\nCOMS\t73383\n纠删\t73384\nSSOP\t73385\n成公\t73386\naudrey\t73387\n埃罗芒阿老师\t73388\n小淀\t73389\n大快人心\t73390\n抹\t73391\neName\t73392\n怒称\t73393\n罪域\t73394\n蜂毒\t73395\n三里村\t73396\n002506\t73397\n67997735\t73398\n84平米\t73399\n四个故事\t73400\n口活\t73401\n宜游\t73402\n有梦\t73403\n黄台\t73404\nx7plus\t73405\n岑木\t73406\nDDNS\t73407\n北京画院\t73408\n杨利伟\t73409\n报告书\t73410\n多相\t73411\n郑州市科技局\t73412\n滥情\t73413\n快捷支付\t73414\n高德地图API\t73415\n渭南市环境保护局\t73416\n珠海大道\t73417\n法定年龄\t73418\nNeighbors\t73419\n梅英\t73420\n陈卓林\t73421\n青玉德\t73422\n识\t73423\n定海神针\t73424\nPlugins\t73425\n阿尼\t73426\nプレ\t73427\n尹卫东\t73428\n刚健\t73429\nmath包\t73430\n双曲面屏\t73431\n市场价格\t73432\n东风日产\t73433\n65公斤\t73434\n葆宫止血颗粒\t73435\n苏恩\t73436\n天域\t73437\n清炖排骨\t73438\n奥拉\t73439\n诚迈科技\t73440\npowershell\t73441\n金茂广场\t73442\n田震\t73443\nmoi\t73444\n神谕\t73445\nDTcms\t73446\n榆\t73447\n无针水光\t73448\n百衲本\t73449\n字样\t73450\n四月九日\t73451\n顺叙\t73452\n大号\t73453\nQGraphicsItem\t73454\n兵者\t73455\n杜塞尔多夫\t73456\n纪检员\t73457\n落脚点\t73458\n承包期\t73459\n淮山药\t73460\n鬃\t73461\n缅公路\t73462\n惨惨\t73463\n万界\t73464\n淘淘\t73465\n检伤\t73466\n语篇\t73467\n强汉者\t73468\n澳台\t73469\nArrayList\t73470\n复合条饼\t73471\n纸片\t73472\nUndercover\t73473\n东北证券\t73474\n扎兰屯市\t73475\nvisualgdb\t73476\n第十七卷\t73477\n烘烤
箱\t73478\n肉宴\t73479\n储蓄罐\t73480\n为主\t73481\n3键\t73482\n七七四十九天\t73483\n市公共资源交易中心\t73484\nCombustion\t73485\n紧抓\t73486\nxHamster\t73487\nweir\t73488\n杨清娟\t73489\n渼陂湖\t73490\n整蛊专家\t73491\n罗尼\t73492\n600571\t73493\n七步诗\t73494\n附样\t73495\n过磅\t73496\n清澈\t73497\n降薪\t73498\n応子广场\t73499\n愤怒的小鸟\t73500\n超值\t73501\n1310\t73502\n刘欣\t73503\nmoog\t73504\n策马啸西风\t73505\n苍之纪元杰克\t73506\n系统集成项目管理工程师考试\t73507\n光位\t73508\n约翰纳什\t73509\n格列吡嗪控释片\t73510\n巨威\t73511\n4355\t73512\n波兄弟\t73513\n杜丽丽\t73514\n90P\t73515\n5156\t73516\n尖峰岭\t73517\n加藤\t73518\nsae\t73519\n职工\t73520\n创客邦\t73521\n世末歌者\t73522\n龙顶\t73523\n蒙古袍\t73524\n绝味\t73525\n九州风神玄冰\t73526\n铣工\t73527\n360n6\t73528\n动态平衡\t73529\nwinfrom\t73530\n铅封\t73531\nLoaders\t73532\n威海港\t73533\n295号\t73534\n白智英\t73535\n二环线\t73536\n渔排\t73537\n索引项\t73538\n马棚\t73539\nmini4\t73540\n多_18183\t73541\nテレビ\t73542\n豫建\t73543\n曾春蕾\t73544\n正劲\t73545\n野兽\t73546\nCCTV-1_央视\t73547\njesus\t73548\n惠风\t73549\n人力资源社会保障服务网\t73550\n入团申请书_无忧考网\t73551\n文证\t73552\n52寸\t73553\n开场白\t73554\n博猫娱乐\t73555\nrelu\t73556\nStadium\t73557\n高顿网校\t73558\nkehuduan\t73559\n东农\t73560\n爆炒\t73561\nDRBD\t73562\n黑蛇\t73563\n芭比布朗\t73564\n分幅\t73565\n市文化馆\t73566\n典藏版\t73567\n新鬼\t73568\npkc\t73569\n兰式\t73570\n宜科(天津)电子有限公司\t73571\n可怜\t73572\n两淮\t73573\n白泉镇\t73574\nRadio\t73575\nWIFi\t73576\noh-my-zsh\t73577\n圣伯纳犬\t73578\n八模\t73579\n国色生枭\t73580\n菁英版\t73581\n川藏铁路\t73582\n赶不上\t73583\n人力资源管理者\t73584\nCryptography\t73585\n平均工\t73586\n对称轴\t73587\n千元\t73588\n540p\t73589\nbasin\t73590\n近亲属\t73591\n厨卫\t73592\n景安\t73593\n45种\t73594\n金花路\t73595\n基层单位\t73596\n高杨\t73597\n值年\t73598\n灌南\t73599\n深海\t73600\n九霄\t73601\n秦皇岛市政府\t73602\nstaffing\t73603\nsweets\t73604\n七零末\t73605\n东京工业大学\t73606\n守正\t73607\n天斧\t73608\n硫酸氨基葡萄糖\t73609\n溃疡\t73610\n真幸\t73611\n蕾拉\t73612\n叮叮叮\t73613\nNuxt\t73614\n同德县\t73615\nGB7258\t73616\nqq技术\t73617\nTac\t73618\n大贵族\t73619\n洞穿\t73620\n紫阳县人民政府\t73621\n机电之家网\t73622\n淘粉\t73623\n土壤调理剂\t73624\n连抽\t73625\n用纸\t73626\n温岭二中\t73627\n水状\t73628\n开拓进取\t73629\n穿越时空\t73630\n腾讯直播\t73631\n多邦\t73632\n萌娃们\t73633
\n60方\t73634\n学而思培优\t73635\n中国白银集团\t73636\n淮秀帮\t73637\n2309\t73638\n外渗\t73639\n太平乡\t73640\n妈祖灵\t73641\nlawyers\t73642\nPiguet\t73643\n梯队\t73644\n20171107\t73645\n茶叶蛋\t73646\n老年痴呆症\t73647\n重庆中医院\t73648\n付嬢\t73649\n厌战\t73650\n阿甘鞋\t73651\n奏乐\t73652\n秀晶\t73653\n傲斗遮天\t73654\nACK\t73655\n江少\t73656\n2018年7月\t73657\n发自\t73658\n无限浏览器\t73659\n玩笑\t73660\nlayered\t73661\n10uf\t73662\n长城华西银行\t73663\n2b\t73664\n山行\t73665\n鼠标宏\t73666\nv2.0.4\t73667\nbooksir\t73668\n春旱\t73669\n陈小春\t73670\n山里娃\t73671\n坦克兵\t73672\n绩点\t73673\n榆树新闻网\t73674\n时尚_播视网\t73675\n爱客影院\t73676\n愈演愈烈\t73677\n黄坤\t73678\n污垢\t73679\n灯装\t73680\nd5200\t73681\nunquote\t73682\n236路\t73683\n送出\t73684\n主旨\t73685\n灯串\t73686\nFunny\t73687\n1.5g\t73688\n华正\t73689\n修正集团\t73690\n死亡之屋\t73691\nallinmd\t73692\n转口贸易\t73693\n本利\t73694\n烈焰风暴\t73695\n3-4\t73696\n标品\t73697\n停板\t73698\n骑行道\t73699\n4月中下旬\t73700\n大邑\t73701\n绝地求生全军出击和绝地求生刺激战场\t73702\n罗宇\t73703\n帕奇\t73704\n2017年1月3日\t73705\n工资支付暂行规定\t73706\n巴西龟\t73707\n海湖\t73708\n最小值\t73709\n创业期\t73710\n赵文琪\t73711\n复合膜\t73712\n大港城\t73713\n中流\t73714\n菏泽家政职业学院\t73715\nfireworks\t73716\n50%\t73717\n章贡\t73718\n光纤模块\t73719\nConfucius\t73720\n珠江实业\t73721\n红音萤\t73722\nACGN动漫资源\t73723\n实学\t73724\n高架路\t73725\n除此之外\t73726\n北京尚学堂\t73727\n赵晋\t73728\n3相\t73729\n刘卫国\t73730\n30068\t73731\n蓝飘尔\t73732\n正负级\t73733\n北鼎\t73734\njxls\t73735\n疑\t73736\n村官\t73737\n独木难支\t73738\n迁善\t73739\n李建成\t73740\n万奢网\t73741\n莫小贝\t73742\n晒机\t73743\n夏蝉\t73744\n龙阳君\t73745\n程旭\t73746\n色变\t73747\n36524\t73748\n深植\t73749\n清扫机\t73750\nxecel\t73751\n1080\t73752\n35i\t73753\n山东医学高等专科学校\t73754\n飞鹤星\t73755\n大观园\t73756\n海神腿\t73757\n孙媛\t73758\n审计报告准则\t73759\n马来西亚元\t73760\n霍言\t73761\n偏锋\t73762\n延安宝塔\t73763\n一键同步\t73764\n萧山市\t73765\n66部\t73766\n馨悦\t73767\n极限运动\t73768\n凤栖梧桐\t73769\n微鲸科技\t73770\n欧神\t73771\n调包\t73772\n排挤\t73773\nauthen\t73774\n妥投\t73775\n矫健\t73776\ncourtesy\t73777\n黑子的篮球last\t73778\n人教版版\t73779\n灯带\t73780\n李诗\t73781\n梦.千航\t73782\n艾米莉·狄金森\t73783\n日曜日\t73784\n玉堂\t73785\nNAP6\t73786\n骥\t73787\n山居秋暝\t73788\n硒片\t73789\n激动\t73790\n广
州万达城\t73791\n中央巡视组\t73792\n15版\t73793\n嫉恶如仇\t73794\n汇通达\t73795\n入党积极分子\t73796\nIP电话\t73797\n盘长\t73798\n品展\t73799\n卫生日\t73800\n第五条\t73801\n北京国家税务局\t73802\n地儿\t73803\n达成度\t73804\n郎昆\t73805\n贱卖\t73806\nresnet50\t73807\n抱歉\t73808\nCML\t73809\nalto\t73810\n17式\t73811\n恩爱\t73812\n安康日报社\t73813\n枸杞苗\t73814\n维扬区\t73815\n努比亚z17minis\t73816\n克朗\t73817\n收条\t73818\n树根\t73819\n81格\t73820\n张冬冬\t73821\n卢奇\t73822\n目空\t73823\n蜡石\t73824\n不睬\t73825\n房产税土地使用税\t73826\n合路器\t73827\n甜甜蜜蜜\t73828\n管家婆\t73829\n金海马\t73830\n车身贴\t73831\n地铁3号线\t73832\n泡泡染发剂\t73833\n新浪爱彩\t73834\n波箱\t73835\n郑炳\t73836\nsd\t73837\n女户\t73838\n校企合作方案\t73839\n中考指标生\t73840\n菌根\t73841\n上古世纪吧\t73842\n大比武\t73843\n北京国际机场\t73844\n爱液\t73845\n尚汇豪庭\t73846\n6.6.2\t73847\nc++语\t73848\n进级\t73849\n罗勒\t73850\n连云新城\t73851\n信达生物\t73852\n周邓纪念馆\t73853\n卧春\t73854\n不敢看\t73855\n陈洪\t73856\n城市规划师\t73857\n温差\t73858\n技术性\t73859\n邮票册\t73860\ngeoip\t73861\n贾环\t73862\n喝尿\t73863\n4200U\t73864\n纱帽\t73865\n小儿推拿培训\t73866\n均匀性\t73867\n纪略\t73868\n不饱和脂肪\t73869\n闪电邮\t73870\npods\t73871\nOrcale\t73872\n死亡万花筒\t73873\nvalue500\t73874\n爆乳\t73875\n外品\t73876\n检索引擎\t73877\n这就是铁甲\t73878\n710s\t73879\n导线\t73880\n生物帮\t73881\n保密\t73882\nCPD\t73883\n解甲\t73884\n新疆工商\t73885\n马路天使\t73886\n检查器\t73887\n黑鸭子组合\t73888\n中国南车\t73889\nAdditional\t73890\n大卫城\t73891\n石倚洁\t73892\n表板蜡\t73893\n牛二\t73894\nvhdx\t73895\n重亲\t73896\n56万\t73897\n五里路\t73898\n绕梁\t73899\n降压\t73900\n日变化\t73901\nlett\t73902\nMB_\t73903\nfaststone\t73904\n防水罩\t73905\nclickhouse\t73906\n裘锡圭\t73907\nbodymovin\t73908\n55156\t73909\n为您\t73910\n租房\t73911\n九连\t73912\n八一小学\t73913\n瞪羚\t73914\n髂骨\t73915\nairways\t73916\n石湾\t73917\n秋月小町\t73918\n使用价值\t73919\nblocked\t73920\n浅乃晴美\t73921\nMIIX5\t73922\n凤凰大参考_凤凰网\t73923\nheic\t73924\n无锡市公安局\t73925\npuella\t73926\n新动物园\t73927\n翻转式\t73928\n录错\t73929\n乱步\t73930\n史姓\t73931\n海贼王主题曲\t73932\nassembled\t73933\n舒适度\t73934\n打毛机\t73935\n互联网卡\t73936\nEIC\t73937\n香山植物园\t73938\n哈利波特全集\t73939\n潘安\t73940\n斤\t73941\nx6d\t73942\n光纤盒\t73943\n囝仔\t73944\n碳纸\t73945\n线形\t73946\n烟筒\t73947\n燕窝镇\t7394
8\n军事政变\t73949\n广州论坛\t73950\n大蚂蚁\t73951\n蕊希\t73952\nmoran\t73953\nffx\t73954\ntablelayout\t73955\n港式奶茶\t73956\n猪油\t73957\n铜陵镇\t73958\n1032位\t73959\n有余数的除法\t73960\n打架\t73961\n无限大\t73962\n城关乡\t73963\n四五次\t73964\n郭德\t73965\nro\t73966\n4cr13\t73967\n森麒麟\t73968\n四海鲸骑\t73969\n杨铿\t73970\n人感\t73971\n唱版\t73972\n北京卫生监督网\t73973\n很遗憾\t73974\nkelvin\t73975\nThose\t73976\n东京食尸鬼第三季CCG佐佐木六月透对抗食尸鬼\t73977\n竹溪县\t73978\n蟾酥\t73979\n有功电能表\t73980\n第四点\t73981\nstainless\t73982\n特小凤\t73983\n熟肉\t73984\n存根\t73985\nnordstrom\t73986\nSink\t73987\n邓小平纪念网\t73988\nnextval\t73989\n程晓\t73990\n闭眼\t73991\n油城\t73992\n46级\t73993\nnmsl\t73994\n平面简谐波\t73995\n习远平\t73996\n吸湿性\t73997\n迪粉汇\t73998\n雁群\t73999\ns-1\t74000\nChaSfz\t74001\n5.7_\t74002\n台州移动\t74003\n这家伙\t74004\n升职\t74005\n汉字编码简明对照表\t74006\n上市公告书\t74007\n十堰广电网\t74008\n七星镇\t74009\n那么多\t74010\ngps模块\t74011\n复古\t74012\n角块\t74013\n标的证券\t74014\n云蟾\t74015\n鼓膜\t74016\n酱瓜\t74017\n通用别克\t74018\n冬阴功\t74019\nguidelines\t74020\n女大学生\t74021\nSid\t74022\n三月初三\t74023\n悬疑类\t74024\n鲟龙鱼\t74025\n行政拘留\t74026\n广东一区\t74027\n必争之地\t74028\n深圳大学第一附属医院\t74029\n雅居乐富春山居\t74030\n瑞士军刀\t74031\n马晓伟\t74032\nexccel\t74033\n阿童木\t74034\nhires\t74035\n@2x\t74036\n远方朋友\t74037\n盐官\t74038\nLabelImg\t74039\n洼里乡\t74040\n残夜\t74041\n绿邦\t74042\n天弘基金\t74043\n生克\t74044\n雅安市人民政府\t74045\n艾瑞泽5\t74046\n徐怀钰\t74047\n吉林大学化学学院\t74048\n料理包\t74049\n美术馆\t74050\n彩笔画\t74051\n抽出式\t74052\n新兴吧\t74053\n碳酸酯\t74054\n吴文俊\t74055\n大瀑布\t74056\nPWM\t74057\n10天\t74058\n描叙\t74059\n保利公园九里\t74060\n道道道\t74061\n雷甸\t74062\n大逆转裁判2\t74063\n张一易建联\t74064\n召唤\t74065\n大庆高新技术产业开发区\t74066\nFiona\t74067\nnewsmth\t74068\n天气预\t74069\nweal\t74070\n红腰子\t74071\n江宁大学城\t74072\n烟熏肉\t74073\n幸福湾\t74074\n复联\t74075\ntpu\t74076\n空板\t74077\n萤火霜兔\t74078\nStraits\t74079\nswool\t74080\n咖啡树\t74081\ncomics\t74082\n4.85\t74083\n血液粘稠度\t74084\n蒋军虎\t74085\n调节变量\t74086\n血压压差\t74087\nhanyi\t74088\n20170108\t74089\n护理费\t74090\n绛唇\t74091\n集中字\t74092\n大同小异\t74093\nWindows批\t74094\n彼此间\t74095\n深圳燃气\t74096\n有车族\t74097\n账户\t74098\n讲错\t74099\nresilience\t74100\n多玩我的世界
\t74101\n母亲河\t74102\n平方根\t74103\nNet\t74104\n楚南\t74105\n专家\t74106\n领翔\t74107\n桑给巴尔\t74108\n读书节\t74109\n徐步\t74110\n海天教育\t74111\n邪王追妻:废材逆天小姐\t74112\n主动轮\t74113\n一窗\t74114\nMD101\t74115\n连淮扬镇\t74116\n应用兔\t74117\n模特\t74118\n保场\t74119\n偶像练习生\t74120\n阿玉\t74121\n护甲\t74122\n公开信\t74123\n亚伟\t74124\n交通运输局\t74125\n秦枫\t74126\nlw881.com\t74127\nTCC\t74128\n祆教\t74129\n阿科玛\t74130\n回覆\t74131\n透光率\t74132\nNearly\t74133\nradeon\t74134\n触碰\t74135\n传奇3\t74136\n中企联合网\t74137\n第十二条\t74138\nWriter\t74139\nherosiege\t74140\n茶渍\t74141\n聘员\t74142\n定制品\t74143\n5个工作日\t74144\n联金所\t74145\nCrystalDiskMark\t74146\n香港浸会大学联合国际学院\t74147\nar9285\t74148\n校风\t74149\n廊坊大学城\t74150\n肉肉肉\t74151\n天津市第一中心医院\t74152\n玻璃砖\t74153\n魔改\t74154\n鳌江\t74155\n会计刺客\t74156\nBeams\t74157\n中服\t74158\n得不到\t74159\n国家煤矿安全监察局\t74160\n乐Pro3\t74161\n找开\t74162\n杨柳青\t74163\n潮汕机场\t74164\ninterceptor\t74165\n争雄\t74166\n草榴社\t74167\n麻美由真\t74168\nIDC\t74169\n咕咚来了\t74170\ntmd\t74171\n归乡\t74172\n上饶师范学院\t74173\n炒冷饭\t74174\n曼特宁\t74175\n人体损伤\t74176\nGoodbye\t74177\n大摇大摆\t74178\n西药\t74179\n万佛城\t74180\n广州市人民政府外事办公室\t74181\nAdditive\t74182\n活好\t74183\n佳能750D\t74184\n恩华\t74185\n工商所\t74186\n蜘蛛侠\t74187\n中环百联\t74188\n南关区\t74189\n养殖者\t74190\n李荣\t74191\n世界之最\t74192\n优劣性\t74193\n胡炜炜\t74194\n词频\t74195\nFischer\t74196\n放射式\t74197\n营养期\t74198\nAries\t74199\n春晚\t74200\n金玺\t74201\n辛烷\t74202\nTria\t74203\n钢水\t74204\ncertutil\t74205\n人制\t74206\n中国税\t74207\n佐佐木\t74208\n山境\t74209\n知网书\t74210\n汕头市人力资源和社会保障局\t74211\n啵啵啵\t74212\n邓州\t74213\n舒克和贝塔\t74214\n柳下惠\t74215\nsuse12\t74216\n打工妹\t74217\n宣城市人力资源和社会保障局\t74218\n也门撤侨\t74219\n萝莉控\t74220\n儿女\t74221\n自由基_\t74222\n靓点\t74223\n断代史\t74224\n安全框\t74225\n分红险\t74226\n拌面\t74227\n苍天\t74228\n二手机\t74229\ncpu风扇\t74230\n单链\t74231\nloop\t74232\n带不走\t74233\n玉房\t74234\nESSEC\t74235\nMathorCup\t74236\n冥魂\t74237\n百诺\t74238\n临汾一中\t74239\n白金信用卡\t74240\nswarm\t74241\n张燕\t74242\n越秀山\t74243\n合安\t74244\n忠诚\t74245\n火珠\t74246\n流程单\t74247\n中公华图\t74248\n广西公务员考试网\t74249\n体彩杯\t74250\n北大国际医院\t74251\n电容式触摸屏\t74252\n大航海时代4加强版\t74253\n偷袭珍珠港\t74254\n行架
\t74255\n屯门\t74256\n投影幕布\t74257\n拱北\t74258\nCROCS\t74259\n铝铸件\t74260\n东营市环境保护局\t74261\n道奇宝斋\t74262\n代购\t74263\nxihuan\t74264\n双阳区\t74265\n点烟器\t74266\n莫露露\t74267\n上派镇\t74268\n普巴\t74269\n吸血鬼日记\t74270\n青岛二中\t74271\n7021\t74272\n不详\t74273\n无机\t74274\nbaogao\t74275\nfacetime\t74276\n_p图\t74277\n曹斌\t74278\n338\t74279\n验照\t74280\n马背\t74281\n4瓶\t74282\nPES2010\t74283\n土地出让合同\t74284\n招商银行美国运通\t74285\nDBAplus\t74286\n_飞华健康网\t74287\n吴立新\t74288\n问哈\t74289\nmutil\t74290\n十六进制\t74291\ntrp\t74292\n泰州中学\t74293\n朝阳小区\t74294\nelevated\t74295\nFlu\t74296\n山东省中医院\t74297\n四书\t74298\n英雄无敌3hd\t74299\n晶典\t74300\n阿娇\t74301\n艾比郎\t74302\nliji\t74303\n缺碘\t74304\npancake\t74305\nvc9\t74306\n古交吧\t74307\n12块\t74308\n南京工业大学浦江学院\t74309\n分解\t74310\n聂涛\t74311\n网游文\t74312\n得力打卡机\t74313\n宁小闲御神录\t74314\n宁波客运中心\t74315\n南天打印机\t74316\n深啡网\t74317\n求财\t74318\n一仗\t74319\n豪庭\t74320\nwelcomed\t74321\n希爱力\t74322\n林百欣\t74323\n销售率\t74324\n津沽\t74325\nTGS2017\t74326\naf1\t74327\n低头\t74328\n有益于\t74329\n启封\t74330\n心理学与生活\t74331\n广元市城乡规划建设和住房保障局\t74332\nBOP\t74333\n湖南发展\t74334\n忍气吞声\t74335\n解约\t74336\nthong\t74337\n小编\t74338\n小费\t74339\n熊出没之冬日乐翻天\t74340\n背水一战\t74341\n40L\t74342\n溱潼会船节\t74343\n西州\t74344\n蚂蚁微贷\t74345\n史官\t74346\n江阴银行\t74347\n努比亚z17\t74348\n华龙网\t74349\n多多指教\t74350\n山东省社科联\t74351\n上海地铁14号线\t74352\n八里店镇\t74353\nnofile\t74354\n南京江\t74355\nuiview\t74356\n古蒂\t74357\n沂源县\t74358\n输液\t74359\nH文\t74360\nCDDA\t74361\n义乌市公安局\t74362\n马尔代夫\t74363\n热血街舞团燃燥街舞团魂\t74364\n战旗村\t74365\n庄墓镇\t74366\n郭女王\t74367\n代尔夫特\t74368\nreboot\t74369\nNoi\t74370\nvivox20plus\t74371\n律师费\t74372\n第一百一十七章\t74373\n尤克里里谱|Ukulele尤克里里小站\t74374\n工作单位\t74375\nMuseMail\t74376\nhighway\t74377\n5.6.34\t74378\na1474\t74379\n一缕\t74380\n冰虫\t74381\nkcf\t74382\n720P|1080P|蓝光原盘|磁力链\t74383\n张玉华\t74384\nbeneficial\t74385\n向霞\t74386\n五帝\t74387\n焚烧场\t74388\n九园\t74389\n就问问\t74390\n套件\t74391\n太阳下\t74392\n省医院\t74393\n福建自贸区\t74394\n三跨\t74395\n行记\t74396\n怀柔北\t74397\n百得\t74398\n15吨\t74399\n规则\t74400\n底纹\t74401\n许嘉木\t74402\n李启铭\t74403\nˋ\t74404\n难念的经\t74405\n中巴\t74406
\nMDAC\t74407\nSexiest\t74408\n三一集团\t74409\naudition3.0\t74410\n新快\t74411\n成都双流国际机场\t74412\n盘出\t74413\nSKINS\t74414\nHSL\t74415\n连城县\t74416\n杭州市中医院\t74417\n毛细血管\t74418\n双皮\t74419\n繁露\t74420\n甘肃省财政厅\t74421\n8世纪\t74422\n中国产业竞争情报网\t74423\n北京2区\t74424\n文抄公\t74425\n千户苗寨\t74426\n三证合一\t74427\n130平方米\t74428\n水沫\t74429\n老凤祥\t74430\n价格式\t74431\n毛妹\t74432\n带机\t74433\n306医院\t74434\n真王\t74435\n爱绿\t74436\n雁\t74437\n校对组\t74438\n罗根\t74439\n地板装修|一起网\t74440\n潇湘雨\t74441\nREM\t74442\n主从\t74443\n误差线\t74444\n社科处\t74445\ntoolkits\t74446\n富士变频器\t74447\n局域网监控软件\t74448\n课间\t74449\n捉贼\t74450\n诺斯\t74451\n贵觅\t74452\n大纲卷\t74453\n世纪之战\t74454\n声雅\t74455\nderby\t74456\n小白帽\t74457\n重考\t74458\n9.6亿\t74459\n醉爱\t74460\nQQ分组大全\t74461\n手抄报画\t74462\n钓鱼城\t74463\n华为荣耀5x\t74464\n桃花源记2\t74465\nifelse\t74466\n死国\t74467\n团膳\t74468\n雍正皇帝\t74469\ntoolstrip\t74470\n上海交大闵行校区\t74471\n71枚\t74472\n点击类\t74473\n于根伟\t74474\n黄冰糖\t74475\n同望公路\t74476\n标致30\t74477\nhch\t74478\n楼邦\t74479\n席琳\t74480\n张玖玲\t74481\n棠溪\t74482\n政府学\t74483\n徐志艾晓琪\t74484\n圣墓山\t74485\n赤坂\t74486\n华家池\t74487\n电脑死机\t74488\n阿莫西林胶囊\t74489\n薄荷醇\t74490\ncrawl\t74491\n成都市中级人民法院\t74492\nserver2000\t74493\n采山\t74494\n暗流\t74495\nusb风扇\t74496\n账房\t74497\nSiren\t74498\n拔毒膏\t74499\n权为民\t74500\n调阅\t74501\nsocks5\t74502\nHypeStat\t74503\n笃姬\t74504\n冷不丁\t74505\nSigma\t74506\n巴啦啦小魔仙之音符之谜\t74507\n离字\t74508\n鹰城\t74509\n插入手写\t74510\n星河国际\t74511\n氧舱\t74512\n复刻\t74513\n校点\t74514\n车库门\t74515\nkayano\t74516\n南充房产网\t74517\n乳白\t74518\n【安\t74519\n扇形\t74520\n南京大学文学院\t74521\namap\t74522\n农门医女\t74523\n川西林盘\t74524\n唐菀\t74525\n嘻哈圈\t74526\n二月二\t74527\n嘉陵本田\t74528\n一线之隔\t74529\n经营成本\t74530\n保案\t74531\n导航键\t74532\n汇金股份\t74533\n无始\t74534\n修复师\t74535\n上课\t74536\n启动盘+大白菜\t74537\n地盘\t74538\n后跟帖\t74539\n王卫斯理\t74540\n金地南湖\t74541\n5.5.28\t74542\n洪庆\t74543\n创成\t74544\n第一刀\t74545\nsplay\t74546\n京东杯\t74547\n葫芦岛市\t74548\n小日本\t74549\n两行\t74550\n田庄村\t74551\n含苞\t74552\n花粥\t74553\n山里\t74554\n发\t74555\n摆线针轮减速机\t74556\n吃完饭\t74557\n申远\t74558\n男女们\t74559\n摇篮网\t74560\nalt\t74561\n小神\t74562\n晕船\t74563\n锅庄舞\
t74564\n16.5%\t74565\n36式\t74566\n草率\t74567\n孔壁\t74568\n月报\t74569\n安全生产电视电话会议\t74570\n安华桥\t74571\nhmtl\t74572\nS3\t74573\njiedao\t74574\n四地州\t74575\nt570\t74576\n劲步\t74577\n39栋\t74578\n飞智八爪鱼\t74579\n东渡\t74580\n航睿\t74581\n金豆\t74582\n办厂\t74583\n辩方\t74584\n武帝\t74585\n私享\t74586\n棉布\t74587\n唯雅诺\t74588\n高层住宅小区\t74589\n40001\t74590\nCineHello\t74591\n绿城\t74592\n巡航舰\t74593\nancestor\t74594\n谋变\t74595\n寄居蟹\t74596\n有夫之妇\t74597\n金瓶梅词话\t74598\nwind7\t74599\n酶活\t74600\n米兰机场\t74601\n龙泉驿区\t74602\n燃钢\t74603\n醚化\t74604\n议案\t74605\n吴赣昌\t74606\n20161029\t74607\n第106集\t74608\n黄金杯\t74609\n阿尔山市\t74610\n20187\t74611\n唱机\t74612\n息票率\t74613\n回国后\t74614\n咖喱味\t74615\n庄家\t74616\n#06#\t74617\n定州市人民政府\t74618\n面向对象设计\t74619\nサンプル\t74620\n140厘米\t74621\n自动灌装机\t74622\n2d\t74623\n控制狂\t74624\n证券市场\t74625\n赌钱\t74626\n水儿\t74627\nq8200\t74628\n澳门街\t74629\n小马\t74630\nClos\t74631\n发送\t74632\n麦理\t74633\n张紧轮\t74634\n石墓阵\t74635\n艺界\t74636\n无类\t74637\n百适\t74638\n直系亲\t74639\n大庆市政府采购中心\t74640\n思春\t74641\n肇庆国家高新技术产业开发区\t74642\n吉林省人力资源和社会保障厅\t74643\n泰日\t74644\n数据中心\t74645\n胡壮麟\t74646\n孙敏\t74647\n家长篇\t74648\n王潮歌\t74649\n自由贸易区服务网\t74650\n老矣\t74651\n银行结售汇\t74652\n我材\t74653\n无损\t74654\n照修\t74655\n中华人民共和国献血法\t74656\ntecent\t74657\n临床病理学\t74658\n杜莎夫人蜡像馆\t74659\n压裂液\t74660\n自准\t74661\n大旺高新区\t74662\ncoloros\t74663\nObservatory\t74664\n凹凸曼\t74665\n剑心\t74666\n群盗\t74667\n巴克利\t74668\n假面骑士Decade\t74669\n愉\t74670\nuwsgi\t74671\n脂肪型\t74672\n九阳电压力锅\t74673\n齐鲁晚报\t74674\n北巷\t74675\n曹锟\t74676\n02326\t74677\n愚孝\t74678\nRugby\t74679\n难以相信\t74680\ndaa\t74681\n十殿\t74682\n20幢\t74683\n四川城市职业学院\t74684\n玄尊\t74685\n锤击桩\t74686\n氧\t74687\n钜盛华\t74688\n舌根\t74689\n能力值\t74690\n翠竹园\t74691\n博乐Bar\t74692\n金赋\t74693\n100瓦\t74694\n卡泊三醇\t74695\n速腾论坛_汽车之家论坛\t74696\nheaderView\t74697\ntiff\t74698\nproxytable\t74699\n公羊\t74700\n小豆豆\t74701\nDegradation\t74702\nmexico\t74703\n二日\t74704\n张晓彬\t74705\n刮水器\t74706\n泽仁央金\t74707\n肉末茄子\t74708\nGSE\t74709\n易吉他弹唱\t74710\n鹿晗庆生\t74711\n声屏\t74712\narpr\t74713\nTreatment\t74714\n子夫\t74715\n山口达\t74716\n团花\t74717\nvcxproj\t
74718\n网法\t74719\n圈_\t74720\n台企\t74721\n机器学习经典算法\t74722\n联东集团\t74723\n直率\t74724\n房房\t74725\n见愁\t74726\n半衰期\t74727\n大众情人\t74728\n大同江\t74729\n绝地求生之刺激战场\t74730\n中毒案\t74731\n呼叫中心\t74732\n高坪乡\t74733\n耀夜版\t74734\n贞操裤\t74735\n洗浴中心\t74736\n2费\t74737\ncctv10健康之路\t74738\n惠民县\t74739\n投海\t74740\n春景\t74741\n韓\t74742\n君黛\t74743\nQueue\t74744\n降钙\t74745\n36粒\t74746\n王夫\t74747\n劳部\t74748\n房价指数\t74749\n萨曼莎\t74750\n周日\t74751\n贵士\t74752\n楼市\t74753\n财经大学\t74754\n黑花生\t74755\n测试百科网\t74756\n二维函数\t74757\n管家部\t74758\n魅族pro5\t74759\n风云音乐谷\t74760\n诅咒者\t74761\n2s\t74762\n鸿恩\t74763\n海南网络广播电视台\t74764\n辽源\t74765\n睡相\t74766\n佛山市第一人民医院\t74767\n浆\t74768\n开国\t74769\n光速蒙面侠\t74770\ncache-control\t74771\nW3Cschool\t74772\n夜孔雀\t74773\n头肩\t74774\nV3版\t74775\n美少女\t74776\n共集\t74777\n维冈竞技\t74778\n藿香正气丸\t74779\nPolyline\t74780\nkenzo\t74781\n重庆鲁能\t74782\n十数\t74783\n郴州市第一人民医院\t74784\n名址\t74785\nTablo\t74786\npms\t74787\n同名同姓库\t74788\n渊明网\t74789\n罗马军团\t74790\n44个\t74791\n魔兽怀旧服\t74792\n上海市质监局\t74793\n柳州东路\t74794\n对句\t74795\n玻璃钢制品\t74796\n微型课\t74797\n李童\t74798\n两伙\t74799\n千桃园\t74800\n1181\t74801\n钢直尺\t74802\n江上\t74803\n4.2亿\t74804\n李朕\t74805\n骑士套\t74806\n603056\t74807\n46集\t74808\n响水湖\t74809\nRRU\t74810\n暗黑传奇\t74811\n4500元\t74812\n一第\t74813\n无机非金属\t74814\n欧朋\t74815\n窄体\t74816\n屁用\t74817\n百遍\t74818\n无铅焊锡\t74819\nav青青草\t74820\n融创白象街\t74821\n富特文格勒\t74822\n航展\t74823\n父辈们\t74824\n20天左右\t74825\n李雪峰\t74826\n狂赞\t74827\n3.4g\t74828\n莱昂纳德·科恩\t74829\n万科魅力之城\t74830\n园林工程师\t74831\nCUMCM\t74832\n王子们\t74833\n谘询\t74834\n电信4g卡\t74835\n8月21日\t74836\n甄选\t74837\nTV版\t74838\n特产\t74839\n滨海港\t74840\nGarcia\t74841\n尖发\t74842\n材料人论坛\t74843\n杜康村\t74844\n长单\t74845\n扬州市职业大学\t74846\nFuntouch\t74847\nstages\t74848\n集装\t74849\n豪大大鸡排\t74850\n海姆\t74851\n惊世医妃\t74852\n板场\t74853\n第39集\t74854\n阿里云|阿里云\t74855\n阴雨\t74856\n24年后\t74857\n开窗\t74858\n曾艳\t74859\n组培\t74860\nDefrag\t74861\n飞机师\t74862\n车船税\t74863\n07届\t74864\n珠峰大本营\t74865\nHandheld\t74866\n渔友\t74867\n妻夫\t74868\nMIcrosoft\t74869\n大卫·林奇\t74870\n晶体化学\t74871\n亭\t74872\n捶背\t74873\n佑\t74874\nunoffic
ial\t74875\n5101\t74876\n双亲\t74877\n3N\t74878\n皮康王\t74879\n配线架\t74880\n货主\t74881\nJing\t74882\n防雨\t74883\ngrátis\t74884\n墙壁\t74885\n平新乔\t74886\n闭合性\t74887\n节目单\t74888\n我的钢铁\t74889\n高级篇\t74890\n优练\t74891\n梅兰芳\t74892\n猪板油\t74893\n现金支付\t74894\n火影忍者究极风暴\t74895\n面试题\t74896\n温州物流网\t74897\nLeopold\t74898\n安许\t74899\n9.com\t74900\n作弊案\t74901\n花生麸\t74902\nmicrosoft\t74903\nAccurate\t74904\nfeasibility\t74905\n傅兴国\t74906\nDOL\t74907\n底径\t74908\n大学计算机基础\t74909\n上海金税三期\t74910\n终极武器\t74911\n银盛通\t74912\n黄洁夫\t74913\n预研\t74914\n上海金融中心\t74915\n扣率\t74916\n狴犴\t74917\n三文鱼\t74918\n暗影岛\t74919\n新能源客车\t74920\n骁龙600\t74921\n玛利亚\t74922\n鑫科\t74923\n助力器\t74924\n京博\t74925\n耐温\t74926\n昌乐县\t74927\n女M\t74928\n26#\t74929\n女神福利社\t74930\n18.130.1\t74931\n临沭网\t74932\n孙黎明\t74933\n刘帆\t74934\n国服\t74935\n江南地区\t74936\n垫付款\t74937\n房市\t74938\n分不开\t74939\n赣州市\t74940\n7979\t74941\nr5\t74942\n万科新城\t74943\nIT/\t74944\n久邦\t74945\nengines\t74946\n将来\t74947\nVNC\t74948\n帕唑帕尼\t74949\n怒杀神\t74950\n天同\t74951\n338号\t74952\n丝状物\t74953\n煤焦化\t74954\n超级机器人大战a\t74955\nichu\t74956\n20160413\t74957\n凉拌\t74958\ndisc\t74959\n山东交通学院\t74960\npiny\t74961\n新品\t74962\n稀土\t74963\nEngland\t74964\n百度识图\t74965\n西校\t74966\n最详\t74967\n斗鱼鱼丸\t74968\n黑锅\t74969\n泄密\t74970\n张赫宣\t74971\ndca\t74972\n林华\t74973\n贝宁\t74974\n平顶山日报\t74975\n战灵\t74976\nknd\t74977\n北大荒集团\t74978\nAEX\t74979\n情致\t74980\n夏强\t74981\n上海市人社局\t74982\n直冲\t74983\n臻尚白\t74984\nunbind\t74985\n红花会\t74986\n太原地铁2号线\t74987\n半岛晨报\t74988\n北京政法职业学院\t74989\n里美尤利娅\t74990\n15组\t74991\n锡板\t74992\n问一下去\t74993\n南城区\t74994\n溪畔茶\t74995\nVpn\t74996\n易孕期\t74997\n山东省卫生厅\t74998\n艾美捷科技有限公司\t74999\n伊东真绪\t75000\n斗仙\t75001\n引导分区\t75002\n斜土路\t75003\n鲍方\t75004\n小说书\t75005\n若有所思\t75006\n同框\t75007\n无能为\t75008\n物控\t75009\n武宿机场\t75010\nfroala\t75011\n小喜鹊\t75012\n岸上\t75013\n齿距\t75014\n一坨\t75015\n郑州地铁2号线\t75016\n小黄姜\t75017\n000338\t75018\nSkyscraper\t75019\njiyuan\t75020\n_阿仪网\t75021\n杀戮都市\t75022\n設定\t75023\n81天\t75024\ntime_limit\t75025\n娃娃们\t75026\n最终幻想12\t75027\n兰晓龙\t75028\n纪律性\t75029\ntheoretical\t75030\n王嘉怡\
t75031\nAlloyTeam\t75032\n第三十二回\t75033\npotato\t75034\nDiab\t75035\ngitlab\t75036\n两脚\t75037\n维特鲁威人\t75038\n衬垫\t75039\n气词\t75040\n地方债\t75041\n抗菌药\t75042\n第194集\t75043\nogr\t75044\n视神经炎\t75045\n乐鱼\t75046\n现金流量图\t75047\n万事如意\t75048\n29分钟\t75049\n哪\t75050\n棺材\t75051\n禁航\t75052\nLN\t75053\n万塘路\t75054\n钟湖\t75055\n||\t75056\n张大嘴\t75057\n李女婿\t75058\n使人\t75059\nkg316t\t75060\n湖北日报\t75061\n2500\t75062\n34篇\t75063\n韩复榘\t75064\nuoo\t75065\n歧视性\t75066\nublock\t75067\n2017年10月23日\t75068\n咬人猫\t75069\n米希儿\t75070\n换底\t75071\n据传\t75072\n仙品\t75073\n放血疗法\t75074\n秦广王\t75075\n中国信鸽信息网\t75076\nvegan\t75077\n100万个\t75078\n狂瘦\t75079\n哈尔滨理工大学\t75080\n牟合\t75081\ndecathlon\t75082\n薏米粉\t75083\n潘承洞\t75084\nEDID\t75085\n地球自转\t75086\n42周年\t75087\n低级趣味\t75088\n养宠\t75089\n桑葚干\t75090\n胃肠镜\t75091\n赵建民\t75092\n闺宁\t75093\nyichang\t75094\n生活化\t75095\n欧阳菁\t75096\n傅莹\t75097\n李惠民\t75098\n第12号\t75099\nNile\t75100\n省堂\t75101\n莲花白\t75102\nshouldn\t75103\n十样\t75104\nS2110\t75105\n佩小姐的奇幻城堡\t75106\nVEN\t75107\n5周岁\t75108\n华制智能\t75109\n生如夏花\t75110\n公园绿地\t75111\n期中考试\t75112\n洛神花茶\t75113\n十二日\t75114\n名曲\t75115\n马静\t75116\nsdorica\t75117\n降盘\t75118\n麻石\t75119\n海通证券\t75120\n布法罗\t75121\n奥地利航空\t75122\n商铺网\t75123\n0308\t75124\n23寸\t75125\n锐腾\t75126\n更进\t75127\n庞村\t75128\n阿罗拉\t75129\n家传\t75130\n经查明\t75131\n吓哭\t75132\navira\t75133\n相邻权\t75134\n000651\t75135\n杀业\t75136\n懿府\t75137\n昆明医科大学\t75138\n保山日报网\t75139\n扒鸡\t75140\n呋喃西林\t75141\n考卷\t75142\njwc\t75143\n正义者\t75144\n20万\t75145\n数组\t75146\n大沙头码头\t75147\n塑料制品业\t75148\n易启秀\t75149\n_晶报数字报\t75150\n一年来\t75151\n规费\t75152\nRS4\t75153\n宝师\t75154\nLATEX\t75155\nAriel\t75156\ncompiz\t75157\nu2\t75158\n北丐\t75159\nhuf\t75160\n世贸大厦\t75161\n連続\t75162\n自甘堕落\t75163\n迪马利亚\t75164\nps4slim\t75165\n飞华健康网\t75166\n孽缘\t75167\n龙胜各族自治县\t75168\n8#\t75169\nk920\t75170\n华鼎奖\t75171\n芬琳漆\t75172\n抄书\t75173\n相似字\t75174\n磁球\t75175\n山东海事职业学院\t75176\n假若\t75177\n箸\t75178\n李志勋\t75179\nAviation\t75180\n干逼\t75181\n胡紫薇\t75182\n丹娜\t75183\n飞落\t75184\n武职\t75185\n地骨皮\t75186\n国家保密局\t75187\n图轴\t75188\n水果干\t75189\nprinc
e\t75190\nDiseases\t75191\n邢夫人\t75192\n新东方新概念\t75193\nblur事件\t75194\nsennheiser\t75195\nA37\t75196\n红腿\t75197\nSlogan\t75198\n结尾语\t75199\n10.4寸\t75200\n同步率\t75201\n支柱性\t75202\n小儿风寒感冒\t75203\n魅力值\t75204\n脱衣露乳\t75205\n镜架\t75206\n张玥\t75207\n300r\t75208\n9:15\t75209\n火工\t75210\nepson\t75211\nXplore\t75212\n西丽\t75213\n济南北\t75214\n182cm\t75215\n乡镇武装部\t75216\n胡衡华\t75217\n宝利丰广场\t75218\n歪斜\t75219\n秘密与谎言\t75220\n金税卡\t75221\n小红书购物笔记\t75222\n采伐证\t75223\n爱德基金会\t75224\n紫英\t75225\n续卡\t75226\n忠爱\t75227\nDescriptive\t75228\n奥娟\t75229\n铁匠铺\t75230\n雅菲\t75231\n说不了\t75232\nG120变频器\t75233\nMG6\t75234\nIE9\t75235\n合肥滨湖国际会展中心\t75236\n老福\t75237\nsia\t75238\n皮肉\t75239\n清苑区\t75240\n600276\t75241\n镇楼\t75242\n価\t75243\nEWSA\t75244\nwatson\t75245\nnba2k18\t75246\n广西投资集团有限公司\t75247\n信访室\t75248\n敦化路\t75249\n泰银\t75250\n3.7.7\t75251\n何展\t75252\n数学\t75253\n袖衫\t75254\n索尔维\t75255\n红酒架\t75256\nwindy\t75257\n到中年\t75258\n小新星\t75259\n宣城政府网\t75260\n王晓岭\t75261\n中国人寿保险股份有限公司\t75262\n人身意外伤害险\t75263\n九河\t75264\n四大美女\t75265\n拉膜\t75266\n保护装置\t75267\n6本\t75268\n杨国忠\t75269\n劳务派遣合同\t75270\n建信信托\t75271\n金投财经\t75272\n气相二氧化硅\t75273\n贵州电子信息职业技术学院\t75274\n叮当猫\t75275\n高举\t75276\n环境史\t75277\n家族\t75278\n极草\t75279\n彬县人民政府\t75280\n茶乡\t75281\n阿K\t75282\n雨晴\t75283\n山泉水\t75284\nbedtime\t75285\nValues\t75286\n4.5米\t75287\n滑动支座\t75288\n葡萄酒\t75289\nwww.wang1314.com\t75290\nvivox7\t75291\n绐\t75292\n林瑞阳\t75293\n电力学院\t75294\n西蒙尼\t75295\n3000平方\t75296\nwpsword\t75297\n伤脑筋\t75298\n宝马m5\t75299\n轴承箱\t75300\n广材网\t75301\n山西医院\t75302\n杨林镇\t75303\n椭圆管\t75304\nprivate\t75305\nsuppress\t75306\n0479\t75307\n牙刷牙膏\t75308\n鬼墨\t75309\n定制酒\t75310\ndakai\t75311\n3码\t75312\n一些\t75313\n平行板\t75314\n德宏州\t75315\n多才\t75316\nraty\t75317\nHJT\t75318\nnoisy\t75319\n白鹿原影视城\t75320\n鬼獒\t75321\n1319\t75322\n一票\t75323\n雨墨\t75324\n宜宾市人力资源和社会保障局\t75325\n韵律操\t75326\ncrh\t75327\n卡格妮·琳恩·卡特\t75328\n中国保温网\t75329\n好结果\t75330\n兴源\t75331\nbackwards\t75332\n杨天石\t75333\n赫尔大学\t75334\n磁化\t75335\n王珊\t75336\ndirectors\t75337\n那一抹\t75338\nAsked\t75339\n中国旅游网\t75340\n天津地税\t75341\n种土
豆\t75342\n新漫\t75343\ntestway\t75344\ndeflection\t75345\n埃森哲\t75346\n印刷业\t75347\n网开一面\t75348\niTunes\t75349\n创业板公司\t75350\n梅园路77号\t75351\n汇考\t75352\nnucleo\t75353\n羞于\t75354\n改姓\t75355\n瑞鹏宠物医院\t75356\n数倍\t75357\n欢天喜地对亲家\t75358\n样品\t75359\n第三层\t75360\n辊轮\t75361\n尔等\t75362\n2009年5月\t75363\n术业\t75364\n16555\t75365\n共有人\t75366\n核准制\t75367\n百度竞价账户\t75368\n攻略战\t75369\n温度\t75370\n桃花传奇\t75371\ndorcel\t75372\n0086lcd.com\t75373\n抓板\t75374\n租约\t75375\n电磁辐射\t75376\n天河战役\t75377\n台秤\t75378\n谨慎性\t75379\nクリムゾン\t75380\n展板\t75381\n机会\t75382\n剪切力\t75383\n验兵\t75384\n嘉娜宝\t75385\n蓝海湾\t75386\n爱沢有纱\t75387\nspigot\t75388\n朱强\t75389\n董奇\t75390\nJ罗\t75391\n焰色\t75392\nE5400\t75393\n乐氏\t75394\n戌月\t75395\n悦己\t75396\n搞笑语录\t75397\n荔湾区\t75398\nasynchronous\t75399\n38路\t75400\n第02集\t75401\n中银绒业\t75402\n卡丁车场\t75403\nportaudio\t75404\nSkybox\t75405\n绿影\t75406\n李湛\t75407\n433足球网\t75408\n尊重\t75409\n日本村\t75410\n污王\t75411\n隔墙\t75412\nDW\t75413\n二级市场\t75414\n圆山\t75415\n3.2处\t75416\n非经常性\t75417\n胱\t75418\n满汉全席\t75419\nmouseleave\t75420\n喜德盛传奇\t75421\n贫嘴\t75422\nK型\t75423\nKerry\t75424\n徐博\t75425\n猫灵\t75426\nBD_流放之路\t75427\n纵膈肿瘤\t75428\n露脐装\t75429\n洪烛\t75430\njiancai.huangye88.com\t75431\n成化\t75432\n松下电器机电(中国)有限公司\t75433\n岩矿\t75434\n多污\t75435\n畅春园\t75436\n祥生\t75437\nCOCOS\t75438\nMoreTV\t75439\n33P\t75440\n卡里古拉\t75441\nrmvb_磁力链接ed2k迅雷\t75442\n左氧氟沙星片\t75443\n2018年以来\t75444\n快活林\t75445\n陈琨\t75446\n李慕豪\t75447\n六亿\t75448\n乌当区\t75449\n剧情篇\t75450\nmaoxian\t75451\n尼龙\t75452\n皮带输送机\t75453\n反听\t75454\n变一变\t75455\n教育日\t75456\n雪橇\t75457\n梨花又开放\t75458\n达格列净\t75459\n冒险王卫斯理\t75460\n为而为\t75461\n漫迷\t75462\n利比\t75463\n调仓\t75464\nMixins\t75465\n植物大战僵尸2黑暗时代\t75466\n陈洁仪\t75467\nMisaya\t75468\n可米\t75469\n13块\t75470\n光大永明\t75471\n步兵师\t75472\npolice\t75473\n张暖雅\t75474\n业主方\t75475\n第30关\t75476\n解厄\t75477\n公共关系\t75478\n粉绿\t75479\nT客\t75480\n绝地求生掉帧\t75481\n喊麦\t75482\n打印类\t75483\n局点\t75484\nmanufacturing\t75485\nPM2\t75486\n奇瑞\t75487\n漫长\t75488\n00后\t75489\n3个工作日\t75490\n武林立志传\t75491\nBCS\t75492\n卖菜\t75493\n随机数\t75494\nKDevelop\t75495
\ngraves\t75496\n0008\t75497\n中原油田\t75498\n纸扇\t75499\n游久CSGO\t75500\n结算价\t75501\n煎蛋\t75502\n忠武\t75503\n17K小说网\t75504\n万境\t75505\n动漫_全集\t75506\n羽扇豆\t75507\n在线翻译_英语_读音_用法_例句_海词词典\t75508\n碑林\t75509\nv35\t75510\n东南地区\t75511\n7788商城__七七八八商品\t75512\n趣事\t75513\n屛\t75514\n动物性\t75515\n四股\t75516\n10月16日\t75517\n地埋管\t75518\ncb站\t75519\n小麦草\t75520\n舒朗\t75521\n曼陀铃\t75522\n存稿\t75523\n闻一多\t75524\n太湖大桥\t75525\n非传统\t75526\n虚开增值税发票案\t75527\n陈孝正\t75528\nzod\t75529\n想出\t75530\n帝国3吧\t75531\n宾至如归\t75532\n金鹰网\t75533\n阆中市人民政府\t75534\n睥睨\t75535\n清城\t75536\n若尔盖草原\t75537\n10月底\t75538\nRailroad\t75539\n舰队col\t75540\n广东省网上办事大厅惠州厅\t75541\n广东区\t75542\n届\t75543\nElementTree\t75544\n森福罗\t75545\n佰牧\t75546\nVO\t75547\nWdatePicker\t75548\n槽牙\t75549\nSpaghetti\t75550\n中青年\t75551\n22k\t75552\n颜如晶\t75553\n手孔\t75554\n中国房地产业协会\t75555\n肾病综合征\t75556\n第146章\t75557\n西南位育中学\t75558\n夏桑菊颗粒\t75559\n人脸识别考勤机\t75560\nwood\t75561\n金丝木\t75562\n易孚\t75563\n引导盘\t75564\n32例\t75565\n涨停板敢死队\t75566\n1600万\t75567\n实用化\t75568\n6栋\t75569\n2l\t75570\n千岩\t75571\n麦丽丝梦游辣境\t75572\n刹车总泵\t75573\n酒疯\t75574\n深藏\t75575\n5月2号\t75576\n東\t75577\nMonody\t75578\n投亲\t75579\n原声带\t75580\n漏洞百出\t75581\n卡单\t75582\nSD敢达OL\t75583\n宣教士\t75584\n鬼火摩托车\t75585\n第4张\t75586\n欲盖弥彰\t75587\n巨峰葡萄\t75588\n特支\t75589\n松劲\t75590\n3磅\t75591\n指甲贴\t75592\n響\t75593\nOpportunities\t75594\n夜枫\t75595\ncortical\t75596\n玉兰花\t75597\n性版\t75598\n易中天品三国\t75599\n松间\t75600\n虚席\t75601\nIIS8\t75602\n窝窝网\t75603\n综合工日\t75604\nlaurel\t75605\n闪带\t75606\n串哥\t75607\n内存颗粒\t75608\n步幅\t75609\n民刑\t75610\n武都区\t75611\nDBP\t75612\n丝织\t75613\n大庆市委\t75614\n廉租房\t75615\n柏林爱乐乐团\t75616\n方超\t75617\n远控\t75618\n保皇派\t75619\nprcs4\t75620\nmiix520\t75621\n仙寓山\t75622\n雨丝\t75623\n伊戈达拉\t75624\nDSI\t75625\n专方\t75626\nCVD\t75627\n赛德\t75628\nkaka\t75629\n你以为\t75630\n沐\t75631\n软件学报\t75632\n茯茶镇\t75633\n有机盐\t75634\n詹皇\t75635\n玲花\t75636\n感谢话\t75637\n小毛虫\t75638\nsquad\t75639\n情趣酒店\t75640\nPPT矢量图\t75641\n正林\t75642\n试议\t75643\nhtml+js\t75644\nUpgrading\t75645\n日本妈妈\t75646\n线态\t75647\n黑客军团\t75648\nrainbow\t75649\n索引贴\t75
650\n同相\t75651\n鲫鱼汤\t75652\n美达股份\t75653\n荆襄\t75654\nheat\t75655\nLDPE\t75656\n生物群\t75657\n000425\t75658\n专一\t75659\n使命召唤10:幽灵\t75660\nwow8.0\t75661\n水杯\t75662\n行伍\t75663\nIT业\t75664\n火库\t75665\n两尊\t75666\n萧军\t75667\nAQL\t75668\n第5话\t75669\n黄历吉日\t75670\n多多卡\t75671\n试作型\t75672\n复仇者联盟2\t75673\n置地广场\t75674\n秦义\t75675\n细胞膜\t75676\n原彩\t75677\n酷狗播放器\t75678\n时针\t75679\napproximately\t75680\n汉之云\t75681\n晋灵公\t75682\nchromecast\t75683\nsandboxie\t75684\n新辑\t75685\n闻喜\t75686\n英洛华\t75687\n木西\t75688\n支原\t75689\n第A01\t75690\n601880\t75691\n君爵\t75692\n艾迪留学网\t75693\n广东人才网\t75694\n3.3亿元\t75695\na800座套\t75696\n汾西\t75697\nnewid\t75698\n曹杨二中\t75699\n苹果\t75700\n﹥\t75701\n襄阳日报\t75702\n堆取料机\t75703\nMetacafe\t75704\n财务处\t75705\nfortune\t75706\n电压比较器\t75707\n胶带纸\t75708\n433m\t75709\n麒麟处理器\t75710\n急死\t75711\n绝迹\t75712\n沙伯\t75713\n龙坪镇\t75714\n宝华驾校\t75715\n英致737\t75716\n残基\t75717\n奇趣网\t75718\n大话西游3免费版\t75719\n猎魂觉醒\t75720\n合隆镇\t75721\npy鱼\t75722\n新剑网3\t75723\n文彦博\t75724\nDECODE\t75725\n欧必特\t75726\n二一组卷\t75727\n南京华肤医院\t75728\nGMM\t75729\n桑耶寺\t75730\n夜莺俱乐部\t75731\n2017年08月\t75732\n清香型白酒\t75733\nAppointment\t75734\n十三代\t75735\n新里\t75736\njunn9527\t75737\n综合工时制\t75738\n潮库\t75739\n卫计\t75740\n意识\t75741\n滨\t75742\n中运量\t75743\n赫兹\t75744\n雨上\t75745\n椎间盘突出症\t75746\n电子贸\t75747\n盛气凌人\t75748\n以理服人\t75749\nSOCKS5\t75750\n山东大学物理学院\t75751\n超声波液位计\t75752\n泽国\t75753\n聚胺脂\t75754\n医疗保险\t75755\n真州镇\t75756\nsti\t75757\n轿跑车\t75758\naveda\t75759\noz大乱斗\t75760\nbugaboo\t75761\n1000%\t75762\n当代国际花园\t75763\n小佐\t75764\n慢慢买比价网\t75765\n康之园\t75766\n卡兹\t75767\n洒洒\t75768\ntfb\t75769\n成都酒店\t75770\n嘉惠\t75771\n冰豹\t75772\n美的时代城\t75773\npc817\t75774\n华为MateBook\t75775\n厌氧\t75776\n金兰集团\t75777\n游戏狗三国杀\t75778\n印尼卢比\t75779\n承板\t75780\n伯凡\t75781\n日产骐达\t75782\n谶纬\t75783\n雏蜂\t75784\nhh\t75785\nRX0\t75786\n重庆市第八中学\t75787\ninkey\t75788\n什末\t75789\n中国国际救援队\t75790\nMK2\t75791\n瞿溪路\t75792\nnba2kol2\t75793\n地下水道\t75794\n冰叔\t75795\n六乱\t75796\n王晓龙\t75797\nStringBuilder\t75798\n北京世纪好未来教育科技有限公司\t75799\n需缴\t75800\noracle11g\t75801\n性欲\t75802\n儿童体\t75
803\n四袋\t75804\nnode-webkit\t75805\n丹心照汗青\t75806\n盖机\t75807\n阿塔兰忒\t75808\n巨雷\t75809\nepub\t75810\n铿锵有力\t75811\nthinkpad\t75812\n夏军\t75813\n粘鞋\t75814\n电控箱\t75815\n李村公园\t75816\n无双剑\t75817\n广济\t75818\n喉糖\t75819\n衍架\t75820\n上万\t75821\n本森\t75822\n好天气\t75823\n政和\t75824\n丙酸钙\t75825\n30分钟以上\t75826\n固定带\t75827\n上海癫痫病医院\t75828\n宜黄\t75829\n王文明\t75830\n鬼泣2\t75831\n_步街\t75832\n营销管培生\t75833\n建投能源\t75834\n弯下腰\t75835\n[综英美\t75836\n23周年\t75837\n故意犯罪\t75838\nST海润\t75839\nIRR\t75840\n电导法\t75841\nZimmer\t75842\n最前线\t75843\n不逮\t75844\n互动投影\t75845\n北条司\t75846\n屏芯\t75847\n破旧\t75848\nkea128\t75849\n博兴路\t75850\n笨办法学\t75851\n短信公众号\t75852\n源城区\t75853\nNewPtone\t75854\n沁园净水器\t75855\ncubes\t75856\n房产市场\t75857\n善若\t75858\n寒邪\t75859\nwv\t75860\n武曲星\t75861\n自然光\t75862\n阿甘正传\t75863\n伊藤美诚\t75864\n中山市技师学院\t75865\n展览会\t75866\n翘\t75867\n元征\t75868\n允许\t75869\n米诺地尔\t75870\nCD1\t75871\n龙刀\t75872\nexchange2010\t75873\nshushi\t75874\n五常稻\t75875\n建平实验小学\t75876\noffice07\t75877\n二十二项\t75878\n4S\t75879\nRobotium\t75880\n3tb\t75881\nmyown\t75882\nOllydbg\t75883\n魔戒指\t75884\n龙纹战神\t75885\n威尔斯\t75886\n3dvia\t75887\n张学文\t75888\n平顶山学院\t75889\n广东银监局\t75890\n柳超\t75891\n加单\t75892\n低危型\t75893\n枪型\t75894\n457\t75895\n周农\t75896\n折购\t75897\nsuliao\t75898\n工商总局\t75899\n合肥市工商局\t75900\n杨思路\t75901\nClapton\t75902\n甲男\t75903\n崇寿\t75904\n电子阅读器\t75905\ntog\t75906\n2908\t75907\n跑开\t75908\n匪徒\t75909\n磁板\t75910\n三难\t75911\n幼苗\t75912\n49个\t75913\n山东中煤工矿集团\t75914\n复退\t75915\n1.7.0\t75916\n五十九\t75917\nKinase\t75918\n出门在外\t75919\n猪屎\t75920\n1970年代\t75921\n四川省中医药管理局\t75922\ncell函数\t75923\ncaac\t75924\n年丰\t75925\n太空馆\t75926\n顺德农村商业银行\t75927\n非税收入管理\t75928\n单兵\t75929\n苯酚钠\t75930\n膨体隆鼻\t75931\nMLB\t75932\n弗兰肯斯坦\t75933\n浆砌石\t75934\n秧草\t75935\n后庭花\t75936\n威卡\t75937\n服饰展\t75938\nUeditor富文本编辑器\t75939\n摄影史\t75940\n爱意\t75941\n琉璃场\t75942\nben\t75943\n腿控\t75944\n军旅文\t75945\n冰毒\t75946\n苔菜\t75947\n全球纺织论坛\t75948\nP2015\t75949\n广州大佛寺\t75950\n烤鸡爪\t75951\n子息\t75952\n超常\t75953\n徐晋\t75954\n张英杰\t75955\n鹿角霜\t75956\n125亿\t75957\n違\t75958\n微宝\t75959\n泉州市人民政府\t
75960\nkernel\t75961\n湖州\t75962\n戴震\t75963\n医务部\t75964\n横纲\t75965\n盆腔积液\t75966\ntendency\t75967\n玖体育\t75968\n镂空花\t75969\nCKEditor\t75970\n长年\t75971\n18CM\t75972\nanticipated\t75973\n再就是\t75974\n洋红\t75975\nGeological\t75976\nm6000\t75977\n挚信\t75978\n蛮神\t75979\n涪城区\t75980\naddr2line\t75981\nGrouped\t75982\nBu\t75983\n陕西延长石油(集团)有限责任公司\t75984\n张双南\t75985\n益乐新村\t75986\n皮尔卡丹\t75987\n糖尿病神经病变\t75988\n伤流\t75989\n国际刑警\t75990\n圈兔\t75991\n直击\t75992\n古美路街道\t75993\n雷鸣\t75994\n8c\t75995\n补缴\t75996\n预埋单\t75997\n0102\t75998\n波罗斯\t75999\n乔爷\t76000\n杂牌军\t76001\nMoviefone\t76002\n黄帝阴符经\t76003\n40003\t76004\n三三宝利来\t76005\n生保\t76006\n富甲天下4\t76007\n円城\t76008\nModelAnd\t76009\n有点味\t76010\n龙脊梯田\t76011\n喏\t76012\nALG\t76013\n珠海市区\t76014\nFID\t76015\nneca吧_\t76016\ns6edge\t76017\nexhibit\t76018\n150元\t76019\nq9550\t76020\n减挡\t76021\nBrainyQuote\t76022\nIvory\t76023\n姜大明\t76024\njpw\t76025\n第三十八章\t76026\n万彩录屏\t76027\nvvvdj\t76028\n不可欺\t76029\n白驹过隙\t76030\n支模板\t76031\nrefused\t76032\n上海站\t76033\n凤凰气象站\t76034\n回调函数\t76035\nTackle\t76036\n仿木纹\t76037\n豪美\t76038\n奋发图强\t76039\n轻工业\t76040\nazul\t76041\n委中穴\t76042\n制剂室\t76043\n一亩\t76044\n天津市政府采购中心\t76045\n定理\t76046\n构造器\t76047\nEPEL\t76048\nTreegrid\t76049\n非花\t76050\n夏校\t76051\nra超品相师\t76052\n狼蛛\t76053\nlnx\t76054\n塑料绳\t76055\n神恶魔\t76056\n艾绒\t76057\n漕运\t76058\nzyp\t76059\n以兰\t76060\n寿光蔬菜博览会\t76061\n医疗费\t76062\n线机\t76063\nAO史密斯热水器\t76064\ntruncated\t76065\n东风总医院\t76066\n婚论\t76067\n萧萧竹\t76068\n梁弄镇\t76069\n洛阳新区\t76070\n故意伤害\t76071\ninsecure\t76072\n糖酒\t76073\n微动力\t76074\n对照组\t76075\n软枣猕猴桃\t76076\n周丽娜\t76077\n潇洒走一回\t76078\n绵阳师范学院\t76079\n吉川萌\t76080\nATOM\t76081\n200000\t76082\n小天才z3\t76083\n肠结核\t76084\n印度飞饼\t76085\n政治生活\t76086\nPCIE\t76087\n60亿元\t76088\ntreble\t76089\n浮球液位控制器\t76090\n海宁房产超市网\t76091\n电荷体\t76092\n好可惜\t76093\n2017年5月2日\t76094\n第27话\t76095\n敦促\t76096\n右手边\t76097\n胡艾彤\t76098\nptree\t76099\n858\t76100\nwendu\t76101\n定址\t76102\n裴裴\t76103\nWarming\t76104\nBullying\t76105\n黄圣依\t76106\n停薪\t76107\nE50\t76108\n刨冰机\t76109\n针法\t76110\nwin7/win8/win10\
t76111\nweb打印\t76112\nwfz\t76113\n隆泰\t76114\nOrient\t76115\n2954\t76116\n叶问前传\t76117\n特种设备安全监察条例\t76118\n东方星城\t76119\n八达\t76120\n视听说\t76121\n衡阳百姓网\t76122\nqq轻聊版\t76123\n导\t76124\n一刻钟\t76125\nilem\t76126\n5.7.11\t76127\n毛中特\t76128\n_江\t76129\n北京科技\t76130\n婚礼堂\t76131\n美片\t76132\n谯城区\t76133\n湖长制\t76134\nexcl\t76135\n何也\t76136\n砧板\t76137\nRR\t76138\n1259\t76139\n婺剧\t76140\n第17期\t76141\n西递\t76142\n全民奇迹MU\t76143\n最低处\t76144\numu\t76145\n鸡眼膏\t76146\n中煤\t76147\n表行\t76148\nEGR\t76149\n重车\t76150\nstylish\t76151\npart4\t76152\n600201\t76153\n共好\t76154\n什幺\t76155\n咖喱\t76156\ngeoJson\t76157\nHaute\t76158\n石潭\t76159\nC2CC\t76160\nDepression\t76161\n难经\t76162\niScroll\t76163\n20px\t76164\n三校生\t76165\n降职\t76166\ndcn\t76167\n净量\t76168\n车饰\t76169\n芭比波朗\t76170\n爱的华尔兹\t76171\n米朵网\t76172\ncad2008\t76173\n海蓝\t76174\nframework4\t76175\n350亿\t76176\n再认\t76177\n阴交\t76178\nsvf\t76179\n公子歌\t76180\nframing\t76181\n天正日照\t76182\n五保户\t76183\n热固\t76184\n风雨小说网\t76185\n春阁\t76186\n17本\t76187\n多座\t76188\n锦城湖\t76189\n国投资本\t76190\n5-5\t76191\n1943年\t76192\n南丁格尔奖\t76193\n1778\t76194\n云付通\t76195\n魂曲\t76196\ngreet\t76197\nwin100\t76198\naltair\t76199\n暗杆\t76200\n水热法\t76201\n欧香\t76202\n系统发育树\t76203\n太极掌\t76204\n养恩\t76205\n时评类\t76206\n四马\t76207\n缺钾\t76208\nLog4j\t76209\n1.7.4\t76210\n斗图\t76211\n神户\t76212\n国家监察法\t76213\n氟化镁\t76214\n酒管\t76215\n泼墨\t76216\n洗发露\t76217\n杨国华\t76218\n金码\t76219\n盐值\t76220\n15份\t76221\n碱石灰\t76222\n循环类\t76223\n831集\t76224\n周其凤\t76225\n无形资产摊销\t76226\nGtk\t76227\nEMBED\t76228\nP20/P20\t76229\nKart\t76230\n火机\t76231\n星运\t76232\n上海市南洋模范中学\t76233\n2016—2018年\t76234\n昆明西山\t76235\n加于\t76236\nwww.kongbao90.com\t76237\n康莱德酒店\t76238\n学生街\t76239\n寺内\t76240\n王凤\t76241\n1851\t76242\naidu\t76243\nrequirejs\t76244\nHaven\t76245\n传感器\t76246\n面向上\t76247\n逛逛\t76248\nMAME模拟器\t76249\n名镇\t76250\n粘稠度\t76251\n一个40岁\t76252\nsorting\t76253\nKenya\t76254\n任县\t76255\nCapitan\t76256\n1.32G\t76257\n边缘点\t76258\nGGV\t76259\n110级\t76260\n一万分\t76261\n西游释厄传\t76262\n中国共产党第七次全国代表大会\t76263\n景品\t76264\nigp\t76265\n盾兵\t7626
6\n分贝仪\t76267\n滨江俊园\t76268\n萧失\t76269\n乐动力\t76270\n0x0000001\t76271\nresponsible\t76272\n途乐4.0\t76273\n证券日报网\t76274\n赢政天下\t76275\n火锅米线\t76276\n南方新闻网\t76277\n冯绍峰\t76278\nx509\t76279\n小肉\t76280\n痙攣\t76281\n酷爱\t76282\n黄文辉\t76283\n珠江医院\t76284\n洗碗\t76285\n丛生\t76286\n鸟语\t76287\n麦豆\t76288\n自有\t76289\n金知元\t76290\n永磁同步电机\t76291\nColorOS\t76292\n马甸桥\t76293\n商务厅\t76294\nmoshou\t76295\n旅者\t76296\n31款\t76297\n闲着\t76298\n艺妓\t76299\n彩墨\t76300\n嘉悦\t76301\n红犬\t76302\ncontrols\t76303\n佛山市第二人民医院\t76304\n风水球\t76305\nPebble\t76306\n珍宝丸\t76307\n论\t76308\n天才麻将少女吧_\t76309\ntorrentkitty\t76310\n评审费\t76311\n配乐吧\t76312\n花意\t76313\ntout\t76314\n星朝\t76315\n3000平\t76316\n娥媚\t76317\n乳垫\t76318\n68cm\t76319\n李嘉万庆良\t76320\n水锤\t76321\n话述\t76322\nflashftp\t76323\n24FA\t76324\n无虐\t76325\n0537\t76326\n有救\t76327\n智康\t76328\n扩孔器\t76329\n翻译公司\t76330\nlinux-51CTO\t76331\n食品安全法\t76332\n尾野真知子\t76333\n六笔\t76334\n多岁\t76335\n20兆瓦\t76336\n滴水观音\t76337\ndubai\t76338\n宿羽阳\t76339\n德国大陆集团\t76340\n卡修\t76341\nnexus5吧\t76342\n爱普生L805\t76343\nCurie\t76344\n沈清\t76345\nプロジェクト\t76346\nRebel\t76347\ntl494\t76348\n绿原\t76349\n彭峰\t76350\n润笔\t76351\n珑\t76352\nObj\t76353\n租赁\t76354\n式衣\t76355\n避难\t76356\n文丘\t76357\ngithubuser\t76358\n绞姿\t76359\nDrum\t76360\n逗笑\t76361\n李松蔚\t76362\ndorothy\t76363\n计委\t76364\n要不够\t76365\n25万吨\t76366\n家庭化\t76367\n春愁\t76368\n张皓宸\t76369\n东方集团股份有限公司\t76370\n上海三联书店\t76371\n阿卡姆\t76372\n临清市政府\t76373\n药审\t76374\n梅溪湖壹号\t76375\n华医网\t76376\n管校聘\t76377\n选摘\t76378\nhtml5在线测试\t76379\n德州学院\t76380\nCamo\t76381\nMock\t76382\n太平洋家居网\t76383\n南京港\t76384\n游游\t76385\n小瓶\t76386\n条顿骑士团\t76387\n20151222\t76388\n陈辉\t76389\n诸界\t76390\n保德安\t76391\n第188集\t76392\n帕金森定律\t76393\nemergency\t76394\n盛妆\t76395\nSTRONG\t76396\n理法\t76397\n鞭打\t76398\n四川省商务厅\t76399\n吊扇灯\t76400\n袁世凯\t76401\n大众蔚\t76402\n多品种\t76403\n昆山市花桥中心幼儿园\t76404\n后背箱\t76405\nwin7系统屏幕亮度\t76406\n我心中\t76407\nappnium\t76408\n放饭\t76409\nbiopsy\t76410\n支配权\t76411\n巴塞尔委员会\t76412\n保湿乳\t76413\nvibram\t76414\n49亿\t76415\n递质\t76416\n一路同行\t76417\n业精于\t76418\n地形区\t76419\n进化\t76420\n排
卵针\t76421\n望江西路\t76422\n200HP\t76423\n科思创\t76424\n2016年5月21日\t76425\n粉饼\t76426\nRift\t76427\n调理\t76428\n第47章\t76429\nrandint\t76430\n老顽童\t76431\n赵嘏\t76432\n酒家\t76433\n法律出版社\t76434\n平级\t76435\n燕归来\t76436\n15幅\t76437\n大迪\t76438\n我的钢铁网\t76439\n56式太极拳\t76440\n山猫\t76441\n热液\t76442\n不好意思\t76443\n泄压\t76444\n球标\t76445\n追梦人\t76446\nBusiness\t76447\n北北\t76448\n涉县\t76449\n中芯国际\t76450\n苏州职业大学\t76451\n20步\t76452\n华为pad\t76453\nBL漫\t76454\n/6\t76455\nlol2018\t76456\n神之浩劫吧\t76457\n关窗\t76458\n依文\t76459\n火势\t76460\n滴滴优享\t76461\n较场尾\t76462\n大碟\t76463\n购物税\t76464\nfunction函数\t76465\narch\t76466\n地热\t76467\n拉格朗日乘子法\t76468\n宁波酒店\t76469\n壬二酸\t76470\n工器具\t76471\n东林党\t76472\n文泉\t76473\n秘戏\t76474\nOrisun\t76475\n峡部\t76476\n抹灰砂浆\t76477\n林祥纤\t76478\nMissy\t76479\n江苏省邗江中学\t76480\n毕业设计论文\t76481\n红十字日\t76482\nMicropython\t76483\nsolidw\t76484\n鸿远\t76485\n洛丽三国杀online\t76486\n浏览者\t76487\n癌胚抗原\t76488\n中级人民法院\t76489\n智操盘股票\t76490\n厦门市音乐学校\t76491\nXXXX\t76492\n那般\t76493\n东方时尚驾驶学校股份有限公司\t76494\n乐视智家\t76495\n市住房和城乡建设局\t76496\n华为荣耀6\t76497\nPickup\t76498\n方太商城\t76499\n房话\t76500\n卡丽娜\t76501\n可卡因\t76502\n田朴珺\t76503\npregnancy\t76504\n主攻\t76505\nlbf\t76506\n家婆\t76507\n造价员网\t76508\n红发卡\t76509\n公安部消防局\t76510\nKnot\t76511\n微旅\t76512\nDWA\t76513\n提示语\t76514\n侧视\t76515\nCheck\t76516\n拔群\t76517\n上海代表处\t76518\n存续期\t76519\n蒲洼乡\t76520\nlandscape\t76521\n华江\t76522\nGisClub\t76523\n哐哐\t76524\n华一\t76525\n西北农林大学\t76526\n寂静岭5\t76527\n德恒\t76528\n华北一区\t76529\n脱狱\t76530\n德行天下\t76531\nrps\t76532\nv9.3\t76533\n黄柠檬\t76534\npattern\t76535\npins\t76536\n制动泵\t76537\n刘志远\t76538\n古称\t76539\n5型\t76540\n密登陆\t76541\nuisearchbar\t76542\n一举成名\t76543\n刘冬梅\t76544\n北戴河新区\t76545\n小甜\t76546\n顾西爵\t76547\n赛升药业\t76548\n8CM\t76549\n广州市国土资源和规划委员会\t76550\nDivine\t76551\nCrackMe\t76552\n采筑\t76553\nDeviantArt\t76554\nReques\t76555\n肤如凝脂\t76556\nFortnite\t76557\n药瓶\t76558\nHAS\t76559\n大润发超市\t76560\n太平洋舰队\t76561\n财政史\t76562\n高送转概念股\t76563\n事儿\t76564\nFlashPaper\t76565\n12套\t76566\n网群\t76567\n星科\t76568\n元德\t76569\n愤愤不平\t76570\nwdr7500\t76571\n中科院大学\t7
6572\n蛋蛋网\t76573\n动卧\t76574\n三盟\t76575\n挂瓦\t76576\n谍影重重3\t76577\n跨借鉴\t76578\n金敬道\t76579\n湖北站\t76580\ncric\t76581\n波炉\t76582\nveronica\t76583\n开发技\t76584\n中华人民共和国河道管理条例\t76585\n豆角焖面\t76586\nhbs\t76587\n交联度\t76588\n小鲤鱼历险记\t76589\n新杰\t76590\nSwift编程\t76591\n纸短情长\t76592\nピエロ\t76593\n需求理论\t76594\n冬瓜皮\t76595\n_觅元素51yuansu.com\t76596\n键音\t76597\n装饰架\t76598\n乐酒客\t76599\nMoisturizer\t76600\n差集\t76601\n剧情歌\t76602\n救国\t76603\n风田\t76604\n空心球\t76605\nPhim\t76606\n平方值\t76607\nDifference\t76608\nzer\t76609\nzzh\t76610\n北京市海淀区人民政府\t76611\n空气炮\t76612\n济南公司\t76613\n践行家\t76614\n烈火如歌全集\t76615\n洛根\t76616\n中共西安市高陵区委\t76617\n恒等\t76618\n永吉县\t76619\n阴棺\t76620\n橙堡\t76621\n黑钢\t76622\nrunway\t76623\n擒贼\t76624\nistio\t76625\n定妆照\t76626\n0916\t76627\n退市\t76628\n复方克霉唑乳膏\t76629\n炼器\t76630\n海伦堡地产\t76631\n面包车\t76632\n青岛市\t76633\n雷·达里奥\t76634\n量子谷\t76635\n四川公务员考试网\t76636\n相管\t76637\n鲁棒性\t76638\n推头\t76639\n警嫂\t76640\n发酵剂\t76641\n红玺台\t76642\n大运河孔雀城\t76643\n9.16\t76644\n116号\t76645\n职业培训\t76646\n刘胡轶\t76647\n皮炎平\t76648\n麒麟655\t76649\nsparql\t76650\nRika\t76651\nmater\t76652\n03【\t76653\n廉政准则\t76654\n操作系列\t76655\n深圳市百合外国语学校\t76656\n小汪\t76657\n制粒\t76658\n第十七章\t76659\n放射科\t76660\n北京安定医院\t76661\n长春机场\t76662\nodometry\t76663\n录音\t76664\n各国\t76665\n两板\t76666\n几十只\t76667\n开盖\t76668\nMNIST\t76669\nwininet\t76670\nLoewe\t76671\n内森\t76672\n做账\t76673\n生怕\t76674\n入井\t76675\n亿维\t76676\nbagel\t76677\n3065\t76678\n药典\t76679\n金乡\t76680\n扯开\t76681\n月九\t76682\n大杨树镇\t76683\n中关村科技园区\t76684\nカレ\t76685\n茶苗\t76686\n少妻\t76687\n垂直式\t76688\n杨幂\t76689\n失忆症\t76690\n巴沙尔·阿萨德\t76691\n发条云\t76692\n彩门\t76693\n三元里大道\t76694\n单播\t76695\n淡风\t76696\n安悦溪\t76697\n妮儿\t76698\n党政机关\t76699\n怪物猎人P3\t76700\nVee\t76701\n人脸识别算法\t76702\n谭嘉仪\t76703\n源起之战\t76704\n旱冰\t76705\n晒单\t76706\n50P\t76707\n装备业\t76708\nRstudio\t76709\n多少数\t76710\n万州区\t76711\nsofina\t76712\n礼节\t76713\ndouble\t76714\nLaFerrari\t76715\nst-link\t76716\n晋城市人民政府\t76717\n检核\t76718\n云南大理州\t76719\nDozer\t76720\n阿奎\t76721\n联营\t76722\n奥组委\t76723\n李文俊\t76724\n民勤政府网\t76725\n呕吐感\t76726\n陈酿\t76727
\nMatlab函数\t76728\n纪德\t76729\n先天性耳聋\t76730\n锯掉\t76731\npitt\t76732\n终结科沃兹\t76733\nBrunch\t76734\nDefects\t76735\n血液透析机\t76736\n知觉\t76737\n发毛\t76738\n首周\t76739\nchloro\t76740\n柯木塱\t76741\n克隆人\t76742\n活宝\t76743\nCD3\t76744\n白草畔\t76745\n台阶\t76746\n王智愚\t76747\n频现\t76748\n逆流成河\t76749\n顶好\t76750\n永在\t76751\n凯茜\t76752\n白城师范学院\t76753\n演武\t76754\n海泥\t76755\n氢氧化亚铁\t76756\nRedistributable\t76757\n28部\t76758\n福康安\t76759\nShrink\t76760\n乐视X620\t76761\n天易娱乐\t76762\n天降美食2\t76763\n氟化钾\t76764\n密封固化剂\t76765\n体脂仪\t76766\n俾\t76767\n电机\t76768\n娃娃音\t76769\n牛仔裙\t76770\n自学竞价网\t76771\nDotnet\t76772\n高亭\t76773\n火烧圆明园\t76774\n6350\t76775\n张嘉\t76776\nGLE\t76777\nJCE\t76778\n千企帮千村\t76779\n中南民大\t76780\n歌力思\t76781\n算得上\t76782\n子汐\t76783\nresample\t76784\n9500GT\t76785\n苦竹\t76786\n2715\t76787\n古墓丽影:暗影\t76788\nmslog\t76789\n东京都市圈\t76790\n雀巢\t76791\nGreek\t76792\n記\t76793\n18v\t76794\npinned\t76795\n_套\t76796\n鸟叔\t76797\nMAX函数\t76798\n65层\t76799\n溪山行旅图\t76800\n比亚迪唐100\t76801\n2nd\t76802\n陌言川\t76803\n杨九红\t76804\n新七龙珠\t76805\n鲍语斋\t76806\n学跳\t76807\n群群\t76808\nPharma\t76809\n宋庆龄故居\t76810\n固城镇\t76811\n3月25日起\t76812\n床底\t76813\n冲田杏梨\t76814\n攻沙\t76815\n贪吃树\t76816\ndrawimage\t76817\n1.03G\t76818\n病毒载量\t76819\n\\1\t76820\n太白山国家森林公园\t76821\n晨勃\t76822\n苏州市立医院\t76823\noppoa33\t76824\n三驱\t76825\n雪梨膏\t76826\n严介和\t76827\n吕颂贤\t76828\n阵图\t76829\n中国电源学会\t76830\n绕标\t76831\n热议习\t76832\n19级\t76833\n肖凯晔\t76834\n心物\t76835\n新良\t76836\n60w\t76837\n徐工机械\t76838\n微动天下\t76839\n驱动之家\t76840\n魏娜\t76841\n天玉\t76842\n世卫\t76843\n破解版\t76844\n安宁庄\t76845\n补习班\t76846\n丁仲礼\t76847\n钱镠\t76848\n李文秀\t76849\n当代明诚\t76850\n港珠澳大桥\t76851\n阿倍仲麻吕\t76852\n数据库库\t76853\n更好些\t76854\n张冬\t76855\n编址\t76856\n安徽新媒体集团\t76857\n库库鲁\t76858\nmcpe\t76859\n布达\t76860\n本能寺之变\t76861\n破局\t76862\n出包王女第四季\t76863\nGrouping\t76864\n包缝机\t76865\n长湴\t76866\n归魂\t76867\n如晦\t76868\nopskins\t76869\nfranco\t76870\n拿不到\t76871\n共荣\t76872\n畅游\t76873\n贵阳高新区\t76874\nphotoshop2017\t76875\n航模吧\t76876\n一记\t76877\n小淘气尼古拉\t76878\n金蝶精斗云\t76879\n素人片\t76880\n宝剑锋\t76881\n中士\t76882\n辉门\t76883\n推
断\t76884\n第4部\t76885\n北京四维图新科技股份有限公司\t76886\ninfoq\t76887\n旋律\t76888\n酸梅粉\t76889\n三好网\t76890\n沾化\t76891\n102平\t76892\n莉音\t76893\n10月30日\t76894\n中俄关系\t76895\n盐洲岛\t76896\n芜湖弋矶山医院\t76897\n济南南部\t76898\n草爬子\t76899\n呛咳\t76900\n北普陀山\t76901\n惨字\t76902\n牙克石\t76903\n资金\t76904\n天使与魔鬼\t76905\n遗稿\t76906\n狱医\t76907\n五十多天\t76908\n频出\t76909\n九场\t76910\nHX6730\t76911\n董磊\t76912\n武汉大学中南医院\t76913\n阿瑟·柯南·道尔\t76914\n6.58\t76915\n武当休闲山庄\t76916\nhaibao\t76917\n紧套\t76918\nexplor\t76919\n赋存\t76920\n吾栖\t76921\n狱中\t76922\n回路\t76923\n烯丙醇\t76924\n耽美漫\t76925\n巴林\t76926\n担惊受怕\t76927\n发回\t76928\n汝河路\t76929\n天然\t76930\n1999元\t76931\n大鹏所城\t76932\n阿努比斯\t76933\n测图\t76934\n天格地热地板\t76935\n直管县\t76936\n80万\t76937\n中冶京诚工程技术有限公司\t76938\n电池盒\t76939\n1:32\t76940\n冯鑫\t76941\n周丽萍\t76942\n曝出\t76943\n石家庄南\t76944\n象征主义\t76945\n对手盘\t76946\n药械\t76947\n中科院幼儿园\t76948\ncavalli\t76949\n周晖\t76950\n玛索丹\t76951\n穿越者\t76952\n茶城\t76953\n场场\t76954\nbao\t76955\n主道\t76956\n水煮菜\t76957\n宁化在线\t76958\n宠物小精灵冒险之旅\t76959\n汽车4s店\t76960\n要式\t76961\n牙胚\t76962\n美国音乐学院\t76963\n善战者\t76964\n北川中学\t76965\nvagina\t76966\n李为\t76967\n地形图\t76968\n20171228\t76969\ndrawText\t76970\n中国动漫产业网\t76971\nDOS批\t76972\n葫芦丝谱\t76973\n莫罗斯\t76974\n旅游险\t76975\ncomb\t76976\n攻坚期\t76977\n十八届五中全会\t76978\n单类\t76979\n石狮服装城\t76980\n爱森斯坦\t76981\nPS流\t76982\n陕西省测绘地理信息局\t76983\n越秀区小学\t76984\nUnit5\t76985\n尿奴\t76986\n首案\t76987\n尸姬\t76988\n行万里路\t76989\nsch\t76990\nuvb\t76991\n上古卷轴5天际重制版\t76992\n推广部\t76993\nBootCDN\t76994\n齿状\t76995\n美丽的姑娘\t76996\n鹫峰\t76997\n消耗量定额\t76998\n7788.com\t76999\n小菊花\t77000\ng++编译器\t77001\n浙江大学玉泉校区\t77002\n双雪涛\t77003\n熟知\t77004\n表存\t77005\n0997\t77006\n娇惯\t77007\n10100\t77008\n华门\t77009\n坦诚\t77010\n毕达\t77011\n坚果炒货\t77012\n中国华电\t77013\nshuyu\t77014\n天机营\t77015\n肉段\t77016\n奥恰洛夫\t77017\n卧虎\t77018\n第123届\t77019\nOpenCart\t77020\n爱丽丝门罗\t77021\nLOL英雄联盟S6\t77022\n北京艺莞儿\t77023\n保压\t77024\n870k\t77025\n尚学堂\t77026\n我是大侦探\t77027\n基因编辑技术\t77028\n大门镇\t77029\nUnexpected\t77030\n小方\t77031\nnosqlcoco\t77032\n中国九江网_九江市政府网\t77033\nbooot\t77034\n浅尝\t77035\n16G\t77036\nB
ong\t77037\nwindowserver\t77038\n新鲜事\t77039\n力五庄\t77040\n正比例反比例\t77041\n头胎\t77042\n高达vs\t77043\n萧何月下追韩信\t77044\nSluts\t77045\n搞笑\t77046\n中铁十六局\t77047\n晚盘\t77048\n夏休\t77049\n开掘\t77050\n中学语文\t77051\n别花\t77052\n站岗\t77053\nHTTP\t77054\nsharpdevelop\t77055\nIndesign\t77056\nLETTERS\t77057\nparanoia\t77058\n德劲\t77059\n银钩\t77060\n川上优\t77061\n铁衣\t77062\nLI\t77063\n额叶皮\t77064\n坂本真绫\t77065\n东凤\t77066\nwsdl2java\t77067\n大农\t77068\n看懂\t77069\n土坑\t77070\n去年11月份\t77071\n铝块\t77072\n踢伤\t77073\nsitemap\t77074\n个人主\t77075\n1115\t77076\n性器官\t77077\n气候\t77078\n每张\t77079\n老一代\t77080\n流浪狗\t77081\n一刀倾城\t77082\n搬迁\t77083\n新旅程\t77084\n一都\t77085\n鬓发\t77086\n5373\t77087\n佩琪\t77088\n上海旗舰店\t77089\n魔兽3\t77090\n呋喃丹\t77091\n吊针\t77092\n颜宁\t77093\ngy\t77094\n维吾尔文\t77095\nsim\t77096\n全画幅\t77097\n一个多层\t77098\n十字军东征2\t77099\n56家\t77100\n绥滨县\t77101\n英雄联盟lol\t77102\n皮尔森\t77103\n货币性\t77104\nrichard\t77105\n鼎捷软件股份有限公司\t77106\n榕城区\t77107\n中国水业网\t77108\n正丁烷\t77109\n思派\t77110\npr2\t77111\n@3x\t77112\nBEPS\t77113\nWMP\t77114\nstudio2\t77115\n随文\t77116\n消防法\t77117\n家凤\t77118\n30厘米\t77119\ntaxi\t77120\n检查性激素\t77121\n骨盆骨折\t77122\n晶澳\t77123\n玛塔\t77124\n金坛\t77125\n1.5千瓦\t77126\nVisualize\t77127\n孔颖达\t77128\n泰坦之旅不朽王座\t77129\nkasumi\t77130\n电刀\t77131\nr3\t77132\nB段\t77133\n陈辰\t77134\nFSL\t77135\n杜小刚\t77136\n应用题\t77137\nMPACC\t77138\n66页\t77139\n蛇蛇\t77140\n最后的棒棒\t77141\n科二\t77142\n项庄舞剑\t77143\n2.57\t77144\n大峡谷国家公园\t77145\n脆饼\t77146\n第21集\t77147\n贫乏\t77148\n12平方\t77149\n金川\t77150\nCon\t77151\nsublime-text\t77152\n撤资\t77153\n近乎\t77154\n舞帝利哥\t77155\n17cm\t77156\nMANN\t77157\n榆钱树\t77158\n乳娘\t77159\n115mm\t77160\n中乙联赛\t77161\n插队\t77162\n潘玉良\t77163\nenterbay\t77164\nvivi\t77165\n面部线雕\t77166\nchinab\t77167\nCryptographic\t77168\nsugarcrm\t77169\n午休床\t77170\n黄砂\t77171\nECOSYS\t77172\nwod\t77173\n郭沫安\t77174\n200余万\t77175\n吉祥树\t77176\n巴黎银行\t77177\nHOHNER\t77178\nO型\t77179\nBale\t77180\n陶文\t77181\n代打\t77182\n黄鸿升\t77183\n电子之家\t77184\n中山大学法学院\t77185\nEnds\t77186\n特拉维夫\t77187\ndnf圣骑士吧\t77188\n垣曲吧\t77189\n酷七网\t77190\n行表\t77191\n广东石油
化工学院\t77192\nszx\t77193\n车驾管\t77194\n瑰丽酒店\t77195\n张启明\t77196\n夏河\t77197\n弘阳地产\t77198\n林墨\t77199\n恐龙园\t77200\n鬼君\t77201\n十九峰\t77202\n夏克\t77203\nkicked\t77204\n埋怨\t77205\n法卡勒\t77206\nSupported\t77207\n奥摩伊\t77208\nVermintide\t77209\n咱们裸熊\t77210\n染色布\t77211\n允美\t77212\n春秋战雄\t77213\n李子奈\t77214\n隆基股份\t77215\nMM4\t77216\n捉放\t77217\n九宫格键盘\t77218\nsf999传奇\t77219\n屏蔽门\t77220\nGelato\t77221\n永安镇\t77222\n彻底\t77223\nLG电子\t77224\n喷射飞航\t77225\n紫苏叶\t77226\n谁的歌\t77227\n象龟\t77228\n怒形\t77229\n不凡\t77230\n取整\t77231\n67分\t77232\nBridgat\t77233\nlotto\t77234\n李东海\t77235\n新浪科技\t77236\nHPRT\t77237\n后入\t77238\n胜利广场\t77239\n光牙\t77240\n赵荀\t77241\n一弦\t77242\n家用机\t77243\n天津奥体中心\t77244\n王宏宇\t77245\nLLC\t77246\nbond0\t77247\n东邪西毒\t77248\nliberal\t77249\n孰不可\t77250\n高培勇\t77251\n蛇片\t77252\n银谷\t77253\n体育局\t77254\n马朝旭\t77255\n张丽莉\t77256\n第七十三\t77257\nsbi\t77258\neax\t77259\n幽游白书吧\t77260\n眼色\t77261\n华强芯城\t77262\n三乘\t77263\n详论\t77264\n045期\t77265\n三1号\t77266\ntrauma\t77267\n协调性\t77268\n木枋\t77269\n搜购\t77270\ntpa\t77271\nログ\t77272\n京津风沙源\t77273\n天龙3D\t77274\n横过\t77275\nanglar\t77276\n狩魔猎人\t77277\n赞赞新时代\t77278\n低息\t77279\n福瑞斯\t77280\n小胸\t77281\n朝天区\t77282\n淅江\t77283\n中国五冶集团有限公司\t77284\n成都全搜索\t77285\n黑天鹅蛋糕\t77286\n锰硅\t77287\nfrozen\t77288\nInitialization\t77289\n4388\t77290\n藏地锦瑟\t77291\n000web\t77292\n百周年\t77293\n王谢\t77294\nfilebeat\t77295\n咏春\t77296\nSERVLET\t77297\n乌金木\t77298\n居家\t77299\n10月1号\t77300\n工资套\t77301\nstrength\t77302\n承印\t77303\n头孢曲松\t77304\n省民\t77305\n亮体\t77306\n一建\t77307\nOpal\t77308\n880gm\t77309\n对齐\t77310\n广东地区\t77311\n乘算\t77312\nletter\t77313\n寡核苷酸\t77314\n时字\t77315\n仓桥\t77316\n神界原罪2吧_\t77317\n脸盲症\t77318\n小田和正\t77319\n格鲁吉亚战争\t77320\n阿鲁\t77321\nCCC\t77322\n安卓播放器\t77323\n石家庄北国商城\t77324\n马甲\t77325\n777号\t77326\n尼康D7200\t77327\n法轮\t77328\nACCP\t77329\n北京军海医院\t77330\n唐山凤凰妇科医院\t77331\nAlright\t77332\n小沫\t77333\n铁砧\t77334\n遍体鳞伤\t77335\nchroot\t77336\n出自用\t77337\n龙泉寺\t77338\n悠哉\t77339\n紫薇园\t77340\n二批次\t77341\n鸡内金\t77342\n众女\t77343\nTahiti\t77344\n菲林格尔地板\t77345\n537集\t77346\n全国卫生专业技术资格考试\t77347\
n手术费\t77348\n墨香斋\t77349\n解衣\t77350\n.mat\t77351\n3路\t77352\n达芬奇调色\t77353\nnump\t77354\nCOMIX\t77355\n泺源大街\t77356\n李宗瑞\t77357\nSirocco\t77358\n寻甸回族彝族自治县人民政府\t77359\nmyibatis\t77360\nenjoys\t77361\n人社部财政部\t77362\n67周年\t77363\n擦破\t77364\nelm327\t77365\n上海服装公司\t77366\n招生说明会\t77367\n镜音\t77368\n拜天\t77369\n4.3.0\t77370\n伸缩梯\t77371\nhighs\t77372\n2018年4月12\t77373\n九虫游戏\t77374\n充费\t77375\nzsks\t77376\naj32\t77377\n陈戈\t77378\n云天化\t77379\n车出\t77380\n图们\t77381\n大城市群\t77382\n新球\t77383\n素书\t77384\n申军谊\t77385\n梦幻西游乌鸡\t77386\n神之手\t77387\n猛汉\t77388\n赵一鸣\t77389\n真人化\t77390\n0546\t77391\n可以获\t77392\n6.8米\t77393\n银隆\t77394\n-\t77395\n200多张\t77396\n脱蜡\t77397\n万寿路\t77398\n宁波市地方税务局\t77399\n暨南大学\t77400\n柳暗花明又一村\t77401\n葡挞\t77402\n体操队\t77403\nLiver\t77404\n金靖\t77405\nMono\t77406\n盛斌\t77407\n上海路\t77408\n烙饼\t77409\nfilters\t77410\nTimmy\t77411\n鹤壁东\t77412\n美少女的谎言\t77413\n好多天\t77414\nctlte\t77415\nbrooke\t77416\n苦艾\t77417\n马靖昊\t77418\n0.02MB\t77419\n武汉物流公司\t77420\n大渝\t77421\n外延\t77422\n脉冲变压器\t77423\n算理\t77424\ngtasa\t77425\n苏威\t77426\n喷光\t77427\n720p/\t77428\n灯盏花\t77429\n鹦\t77430\n胃镜\t77431\n国足\t77432\nh60\t77433\nmel\t77434\n永昌县\t77435\n750px\t77436\n补肾方\t77437\ncastep\t77438\n森歌\t77439\n1.10.1\t77440\n民王\t77441\nLinux私房菜\t77442\n检察官\t77443\n偶极\t77444\n0768\t77445\n一百平方\t77446\n七行\t77447\n李佩\t77448\n上海外国语大学\t77449\n科技活动周\t77450\n苦干\t77451\n金炳万\t77452\n威伦\t77453\n肖扬\t77454\n_趣科技\t77455\nX-crew\t77456\n林源三\t77457\n考不上\t77458\n深圳信立泰药业股份有限公司\t77459\nK/3\t77460\n樊富珉\t77461\n秦皇岛市安全生产监督管理局\t77462\n枕套\t77463\nDSS\t77464\n中秋节\t77465\nwhite\t77466\noffice13\t77467\n霸榜\t77468\node45\t77469\n王瑞林\t77470\n银河战舰\t77471\nuns\t77472\nChan\t77473\n300487\t77474\n1路\t77475\n再嫁\t77476\n春秋\t77477\n道琼斯\t77478\n地名志\t77479\n杭州低碳科技馆\t77480\n围板箱\t77481\n奈瑞儿\t77482\n生物型\t77483\npyhon\t77484\n贾克虎\t77485\n源哥\t77486\n能动\t77487\nオトナ\t77488\n仓库管理软件\t77489\n曹志刚\t77490\n16.04LTS\t77491\n百度云链接\t77492\n大灶\t77493\n2000多年前\t77494\n青黛\t77495\n小蚂蚁\t77496\npiezo\t77497\n张鸿\t77498\n费加罗\t77499\n独家版\t77500\n邻村\t77501\nivf\t77502\nJFK\t7
7503\nlog文件\t77504\n互联网金融协会\t77505\n原以为\t77506\ncomplimentary\t77507\n阖闾\t77508\n凶煞\t77509\n今安\t77510\n铁线\t77511\n麦斯米兰\t77512\n意向表\t77513\nJAVA\t77514\nsinking\t77515\n欧莱雅染发剂\t77516\n挑动\t77517\n花粉漫谈百味杂陈\t77518\n极品飞车3\t77519\n一汽红旗\t77520\n理性人\t77521\n华记\t77522\n机械式\t77523\n山王坪\t77524\nCASE\t77525\nscorpio\t77526\n乐版\t77527\n10国\t77528\n叶军\t77529\n虎爪\t77530\n张乔\t77531\n中国医科大学附属第一医院\t77532\nMillion\t77533\n七味\t77534\n奥赛\t77535\n2016五一套\t77536\n6.55\t77537\n浙安\t77538\n中土世界\t77539\n过账\t77540\n林隐\t77541\n宝点网\t77542\n贴及\t77543\n雒\t77544\n靶向药\t77545\n小妮\t77546\n币商\t77547\n智能体\t77548\n六千\t77549\nAppleHDA\t77550\nelaine\t77551\n茨城县\t77552\nfree\t77553\nWINXP\t77554\n394\t77555\n跑量\t77556\n华住酒店\t77557\n乱封\t77558\n代表队\t77559\nmarginal\t77560\n47周年\t77561\n围界\t77562\n周回\t77563\n应龙\t77564\n负相关\t77565\n商都网\t77566\n苏州工业园区中小企业服务中心\t77567\n霍林郭勒\t77568\n400马力\t77569\n淮海战役\t77570\n航天宏图\t77571\n特种门\t77572\n季刚\t77573\n亚美车智汇\t77574\n波多\t77575\n27项\t77576\n夏日\t77577\n钦南\t77578\ndedicated\t77579\n清政府\t77580\n捕\t77581\n歪曲\t77582\n华南理工大学建筑设计研究院\t77583\nPEACH\t77584\n最近三年\t77585\n一卡通\t77586\n开假\t77587\n休渔期\t77588\n北邻\t77589\n平衡梁\t77590\n交院\t77591\n迪佳\t77592\n8日\t77593\n建筑邦\t77594\n重定向循环\t77595\n个人赛\t77596\n小荷作文网\t77597\n至少\t77598\n66630757\t77599\nfminsearch\t77600\n川籍\t77601\nASHRAE\t77602\n奇偶数\t77603\n卡拉特拉瓦\t77604\n渭滨区\t77605\n黄花镇\t77606\nNote5A\t77607\n57P\t77608\nx线\t77609\n谋略\t77610\n三科电器集团有限公司\t77611\n出没有\t77612\n月活\t77613\ncutting\t77614\n光伏电缆\t77615\n陈国平\t77616\n外出旅游\t77617\n上海税务局\t77618\nOng\t77619\n全度\t77620\n伪钞\t77621\n王松涛\t77622\n劳动西路\t77623\n硕博家园\t77624\n流放之路召唤师\t77625\n自笑\t77626\n笑一笑\t77627\nMillennium\t77628\n吊趟门\t77629\nDye\t77630\nBRG\t77631\n吻合\t77632\n近月\t77633\n部屋\t77634\n优异\t77635\n硬说\t77636\nCuda\t77637\n虹桥坊\t77638\n鲜花速递网\t77639\n交点\t77640\n团圆\t77641\n香港花园\t77642\n电力\t77643\n十日后\t77644\n百斯特\t77645\n反身\t77646\nafx\t77647\nOriented\t77648\n功率分析仪\t77649\n隐身术\t77650\n诺曼底号遇难记\t77651\n中国作协\t77652\n青藏线\t77653\n刘浩\t77654\n米开朗基罗\t77655\n荆门高新区\t77656\nnode-schedule\t77657\n何贤\t7765
8\n新苑小区\t77659\n9090\t77660\n瞄挂\t77661\n海枯石\t77662\n商人\t77663\n深业新岸\t77664\n古丘\t77665\n_线切割吧\t77666\n劳动光荣\t77667\n涉假\t77668\n柑子\t77669\ndana\t77670\n十级\t77671\n琉月\t77672\nkrita\t77673\n三连\t77674\n淫湿\t77675\n绩优股\t77676\n研究出版社\t77677\n中国领事服务网\t77678\n人口红利\t77679\n补码\t77680\nBurpSuite\t77681\n唐缺\t77682\natlassian\t77683\ndz模板\t77684\ndpm\t77685\nkui\t77686\n官方最新版软件\t77687\n劲道\t77688\n绩溪县人民政府\t77689\n玛莎拉蒂\t77690\n李丽芬\t77691\n大耗\t77692\n捧腹网\t77693\nupset\t77694\n国家安全日\t77695\nCalculus\t77696\n征管\t77697\n奇正藏药\t77698\n528路\t77699\n五城区\t77700\n绿软家园\t77701\n20170802\t77702\n残端\t77703\nBesTV\t77704\n中国证券投资基金业协会\t77705\nbuct\t77706\n欣享\t77707\n团长\t77708\n大学生考试网\t77709\n大龙猫\t77710\n刘力红\t77711\n冲值\t77712\n学习班\t77713\n中国国家画院\t77714\n最后一张照片\t77715\n阿特金森\t77716\nendorsement\t77717\n好穿\t77718\n墨丹文\t77719\nefficiency\t77720\n枪魂\t77721\n水甲\t77722\n扫呗\t77723\n进食\t77724\n1872\t77725\nipadair\t77726\n幻景\t77727\n性灵\t77728\nblox\t77729\nwii吧_\t77730\n不二神探\t77731\n语文迷\t77732\nreadiris\t77733\ncatalyzed\t77734\n骁龙450\t77735\ngroupid\t77736\nTransit\t77737\n王二小\t77738\n105ti\t77739\n7项\t77740\n润唇\t77741\nDiY\t77742\n加拉帕戈斯群岛\t77743\n龙岗街道\t77744\n板面\t77745\n苏州图书馆\t77746\n领\t77747\n槟榔西施\t77748\n本类\t77749\n犁地\t77750\n夏秋\t77751\n铁拳6\t77752\n飞升降\t77753\n烽火光猫\t77754\nbule\t77755\n动态度\t77756\n竭泽而渔\t77757\nEZVIZ\t77758\n淫骚\t77759\n内镜\t77760\n反向保理\t77761\n第14话\t77762\n锦绣育才小学\t77763\n600015\t77764\n论破\t77765\n八连\t77766\n双筋\t77767\n华闻传媒\t77768\n台儿庄战役\t77769\n曲裾\t77770\n子妻\t77771\n冰美式\t77772\n4波\t77773\nTeens\t77774\n摩拜单车\t77775\n2017-02\t77776\n小罗伯特·唐尼\t77777\n打印笔\t77778\n争用\t77779\nperformed\t77780\n藏妖\t77781\n浴火凤凰\t77782\nGist\t77783\n棒呆\t77784\nxxj\t77785\nYogurshine\t77786\n团购房\t77787\nvaqe\t77788\n功率\t77789\n史观\t77790\n古墓丽影8\t77791\n区公安局\t77792\n石墩\t77793\n三个工作日\t77794\n汉丰湖\t77795\n正确版\t77796\n1遍\t77797\n世纪开元\t77798\n科勒卫浴\t77799\n荷兰队\t77800\n滨海边疆区\t77801\n成度\t77802\n肺小结节\t77803\n麻薯\t77804\n美朝\t77805\nC5game\t77806\nsfys\t77807\nBlocker\t77808\nCells\t77809\n真探\t77810\n工商代办\t77811\n轻寒\t77812\n胡柚\t778
13\n光孔\t77814\nmpv\t77815\n江南大学食品学院\t77816\nKFR\t77817\n王枫\t77818\n资料站\t77819\n预拌砂浆\t77820\n盗墓\t77821\n沈阳大悦城\t77822\n圆子\t77823\n蓝牙协议栈\t77824\n食记\t77825\n11s\t77826\n花手\t77827\nconcepts\t77828\n维他命c\t77829\n剧作家\t77830\n墒情\t77831\n末路重生\t77832\n土地出售网\t77833\n小规模纳税\t77834\nDICE\t77835\n导光板\t77836\n炒米粉\t77837\n浠水\t77838\n箕子\t77839\n恩度\t77840\n邪魔殿\t77841\n水堰\t77842\n适情\t77843\n李凯馨\t77844\n蓝晴\t77845\n开平方\t77846\n十二载\t77847\n三剑奇缘\t77848\n调色\t77849\n德意日\t77850\n再笑\t77851\n烫金\t77852\n八里沟\t77853\n砌筑\t77854\n夏紫薇\t77855\n摊还\t77856\n吉林银监局\t77857\n周慧珺\t77858\n重庆工商\t77859\n备案表\t77860\n拉涅利\t77861\nх\t77862\n博士后科研工作站\t77863\nlm55916\t77864\n夜雪\t77865\n二级片\t77866\n雪鱼\t77867\n岩井俊二\t77868\n钻取\t77869\ncattle\t77870\n理想宝\t77871\nhighschooldxd\t77872\n切管机\t77873\n电锯\t77874\n迪优美特\t77875\n包板\t77876\n641\t77877\naegis\t77878\n红旗大街\t77879\nClassical\t77880\n调压站\t77881\n上海建桥学院\t77882\nword07\t77883\n式\t77884\n实验性\t77885\n教学反思\t77886\n磷酸奥司他韦颗粒\t77887\nSeries3\t77888\n汉墓\t77889\nshea\t77890\n晓月\t77891\n李肖鸣\t77892\n冴岛奈绪\t77893\n渔具\t77894\n糖尿病肾病\t77895\n第8部\t77896\nacgzone\t77897\n37家\t77898\nHIJAV\t77899\ntheater\t77900\n大跌\t77901\n中共湖北省委党校\t77902\nautoreconf\t77903\n罗技M558\t77904\nfixer\t77905\n海尔创客实验室\t77906\n14万公里\t77907\n九九书包网\t77908\n定期存款单\t77909\nTreasure\t77910\n真性\t77911\nDataframe\t77912\n新化县\t77913\nH2SO4\t77914\n海利生物\t77915\n报单\t77916\nUWC\t77917\n签证单\t77918\n略谈\t77919\n奥拉迪波\t77920\nnumeric\t77921\n11808\t77922\n白给\t77923\n勤礼碑\t77924\n蛴螬\t77925\nmsvcp71.dll\t77926\n15股\t77927\n曾峰\t77928\n广东碧桂园学校\t77929\n方硕\t77930\n0.04MB\t77931\nEURCNY\t77932\nwalnut\t77933\n3X3\t77934\n威纶通\t77935\n大変\t77936\n联合会杯\t77937\nexcel打印\t77938\n安迅\t77939\n捉鬼大师\t77940\nmuf\t77941\nNong\t77942\n东北财经大学研究生院\t77943\n半空\t77944\nHTTP状态码\t77945\n信友\t77946\n黄贝岭\t77947\na5000\t77948\nv2.0.0_\t77949\n警惕\t77950\n经纬线\t77951\nshouyou\t77952\n9665\t77953\n林邑\t77954\n尹山湖\t77955\n九州行\t77956\n坚\t77957\n几万块\t77958\n铁血联盟2\t77959\n大连市国家税务局\t77960\nbabylon\t77961\n版权登记\t77962\n江苏省镇江市中级人民法院\t77963\n上海外高桥\t77964\n阿诗玛\t77965\ndevise\t
77966\n张慧\t77967\npyton\t77968\nWXSS\t77969\n声源\t77970\n意字\t77971\n海西蒙古族藏族自治州\t77972\n甬台温高速\t77973\n长安论坛\t77974\n叱咤\t77975\n咖啡馆\t77976\nOPPOr11\t77977\ndelonghi\t77978\nREV\t77979\n魔女\t77980\ndiscretionary\t77981\n闪速\t77982\n卡罗尔\t77983\n攸州网\t77984\n圣智\t77985\n多高\t77986\nm88\t77987\n图趣网\t77988\n1482\t77989\n下钻\t77990\n第二局\t77991\n停车架\t77992\nLYNK\t77993\n帅哥们\t77994\n第A2\t77995\nherself\t77996\n丑牛\t77997\nEAM\t77998\n芭乐雅\t77999\n万和城隆基泰\t78000\n信\t78001\n遗留\t78002\n彀\t78003\nt19\t78004\n阜南县\t78005\n特发性\t78006\nhardy\t78007\n面画\t78008\n布格\t78009\n护花兵王\t78010\n_蟹\t78011\n外贸部\t78012\n金船\t78013\nv1.0.4\t78014\n金士顿骇客神条\t78015\n床身\t78016\n戊戌维新\t78017\n凉川绚音\t78018\n河南工程学院\t78019\n免密\t78020\nbii\t78021\n永丰县\t78022\n中俄文\t78023\n无闻\t78024\nMADNESS\t78025\n穿衣镜\t78026\n革兰氏阳性球菌\t78027\n樱井夕树\t78028\n灰库\t78029\n中仙\t78030\n看好\t78031\n压型钢板\t78032\n这部书\t78033\n钦州学院\t78034\n超碰免费版\t78035\nISF2.8\t78036\nword2013\t78037\n钳形电流表\t78038\nNCCN\t78039\nGPUZ\t78040\n总榜\t78041\n税务\t78042\n邓迪大学\t78043\nK3WISE\t78044\n多丽\t78045\n亚油酸\t78046\n燕尾槽\t78047\n港版\t78048\n杨爱民\t78049\n正襟危坐\t78050\n秘考\t78051\n300077\t78052\n千山鸟\t78053\n安管\t78054\n水寨镇\t78055\n皮皮作文网\t78056\n值尾\t78057\nVoidKing\t78058\n第几类\t78059\n应用汇\t78060\n排瘀\t78061\nMapgis\t78062\n4.2.4\t78063\n顽抗\t78064\nobtained\t78065\n文化创意公司\t78066\nnvren\t78067\nphpstrom\t78068\n孙家村\t78069\n今者\t78070\n娇淫\t78071\n临西县\t78072\n20余\t78073\n胞质\t78074\n广泛性\t78075\n过手\t78076\nWrap\t78077\n血符\t78078\n临沂经济技术开发区\t78079\n2084\t78080\n甲壳儒道至圣\t78081\n宋河\t78082\n中建股份\t78083\nwatchOS\t78084\n魔兽猎人\t78085\n雷雪松\t78086\n陈志平\t78087\n超过三天\t78088\nhalt\t78089\n国人\t78090\n夏代\t78091\n系统恢复\t78092\n细菌性肠炎\t78093\n渔鼓\t78094\n东南欧\t78095\n未转变者\t78096\n0615\t78097\n杨家埠\t78098\n绩效考核表\t78099\nnail\t78100\n禅城区人民政府\t78101\n武昌实验小学\t78102\n脱垂\t78103\nMathf\t78104\n高血脂症\t78105\n莲家园\t78106\n丝口\t78107\n内衣办公室\t78108\n馆子\t78109\n_嘉\t78110\n病愈\t78111\nWhats\t78112\n东海广场\t78113\nMARY\t78114\n4个\t78115\n后视镜片\t78116\n英雄\t78117\nmano\t78118\n超能宝\t78119\nPayne\t78120\n天籁之爱\t78121\n牛屎\t78122\n阿部宽\t781
23\n9斤\t78124\n空间化\t78125\n快译通\t78126\n黑钱\t78127\nPVG\t78128\n混光\t78129\n烟度\t78130\n70#\t78131\nSybase\t78132\n联力\t78133\n荣耀号\t78134\n氯化胆碱\t78135\nhotplug\t78136\n图样\t78137\n第六十三章\t78138\n毛公鼎\t78139\n天天快报\t78140\n袁莎\t78141\n孙鑫\t78142\n广西河池市\t78143\nyulong\t78144\n第九级\t78145\n応\t78146\n赵琴\t78147\n央美\t78148\nC段\t78149\n0-1\t78150\n何奕恋\t78151\n王也\t78152\nXbed\t78153\n淳萃\t78154\n融创西安宸院\t78155\n夜排档\t78156\n兴奋度\t78157\n土金\t78158\n白先勇\t78159\nAuthentic\t78160\n2张\t78161\n巨集\t78162\ninf\t78163\n超级机器人大战BX\t78164\n争夺赛\t78165\n细胞器\t78166\n表决权\t78167\n_张\t78168\n20170920\t78169\n爽脆\t78170\nccnp\t78171\n起亚锐欧\t78172\n数头\t78173\n眼线液\t78174\n十一代\t78175\n合肥北城\t78176\n托福独立口语机经\t78177\n等人\t78178\n郑州市委\t78179\n仗剑江湖行\t78180\n奔驰GLC论坛论坛\t78181\n驴奶\t78182\n合同段\t78183\nCHAMPION\t78184\ndurex\t78185\n神探包青天\t78186\nplace\t78187\nejb\t78188\n一二三四\t78189\n浅议\t78190\n月形\t78191\n玄松月\t78192\n国泰航空\t78193\nDc\t78194\nqmenu\t78195\n瑞芯微RK3399\t78196\n帅S俊\t78197\n888集团\t78198\n农产品市场\t78199\n阴滋病\t78200\n小蝌蚪找妈妈\t78201\n2lnx\t78202\n信成\t78203\n等身\t78204\nJdbcTemplate\t78205\nbarn\t78206\n付息日\t78207\n新农村社区\t78208\nautoit\t78209\nAxis\t78210\n2123\t78211\n慎重\t78212\n严重者\t78213\nワルキュ\t78214\n9018\t78215\n比特彗星\t78216\n3则\t78217\n环球家电网\t78218\n安稳\t78219\n乌兰巴托之夜\t78220\n大数据研究院\t78221\n综保\t78222\n3456\t78223\n创明\t78224\n放热反应\t78225\n南方电网报\t78226\n殷勇\t78227\n_安游\t78228\n收并蓄\t78229\nAI+\t78230\n浅析\t78231\n接待费\t78232\n家营\t78233\n协议书\t78234\nMediaRecorder\t78235\n晋语\t78236\n删档\t78237\n20150613\t78238\n肛虐\t78239\n中国建筑技术集团有限公司\t78240\n后母\t78241\n一稿\t78242\nCIFAR10\t78243\n奥拉维尔·埃利亚松\t78244\n桔梗\t78245\nchelsea\t78246\njava软件工程师\t78247\n空中草原\t78248\nshichang\t78249\nLOUNGE\t78250\n会计核算_\t78251\n释迦摩尼\t78252\nsyntactic\t78253\nCloudflare\t78254\n写稿\t78255\n河内\t78256\n新秦时明月\t78257\n7篇\t78258\n拨\t78259\n科系\t78260\n7.5W\t78261\n充场\t78262\n典心\t78263\n宋城集团\t78264\ncn\t78265\ntaf\t78266\n脱硫泵\t78267\nnsinteger\t78268\n冯凯\t78269\nz170a\t78270\nMarilyn\t78271\nsakai\t78272\ncomplain\t78273\nSh\t78274\n启示\t78275\n中建三局一公司\t78276\n河宫\
t78277\n发扬\t78278\nPRO3\t78279\n脊柱侧弯吧\t78280\n51LIVE\t78281\n爱闻\t78282\n屏版\t78283\nq35\t78284\n康安途\t78285\n股\t78286\n套定额\t78287\n蔚蓝国际\t78288\n蒋天\t78289\n国家食药监总局\t78290\n骗奸\t78291\n钟山风景名胜区\t78292\n国学院\t78293\n透明物\t78294\nUnable\t78295\n滤光镜\t78296\n胃窦炎\t78297\nation\t78298\nreaches\t78299\n并线辅助\t78300\n造路\t78301\n腐烂病\t78302\n北大中文系\t78303\n西夏区\t78304\n龙ol\t78305\nreinforce\t78306\n青年观\t78307\n天津塘沽\t78308\n李琪\t78309\n劣\t78310\n陈茂源\t78311\n博物\t78312\noutlook2016\t78313\n幽若\t78314\n滑雪机\t78315\nweigh\t78316\n新华汽车\t78317\nqhd\t78318\n2017-12\t78319\n虎视\t78320\n王八蛋\t78321\n锁盘\t78322\n彩吧图库\t78323\ncivil\t78324\n梦幻西游封妖\t78325\nMAXWELL\t78326\n新任\t78327\nPurchasing\t78328\n开练\t78329\n高效氯氰菊酯\t78330\n赵海波\t78331\n2016.9\t78332\n塔套\t78333\n夏如眷\t78334\n唤潮\t78335\n妈圈\t78336\n缸\t78337\n济宁大众网\t78338\n塔莎\t78339\nBuds\t78340\n墙板机\t78341\n骄傲\t78342\n彭超\t78343\n美妆类\t78344\n8.8.8.8\t78345\n套券\t78346\n1300张\t78347\nQ50\t78348\n长安镇\t78349\n空隙率\t78350\nyeyou\t78351\n叫我万岁爷\t78352\n曾国盗墓笔记\t78353\n云服务器\t78354\n周泓\t78355\n绩溪路\t78356\n比较级_\t78357\n需量\t78358\n玫瑰花园\t78359\nps钢笔工具\t78360\n倩女幽魂画魂\t78361\n油费\t78362\n股权投资基金\t78363\n腰鼓\t78364\n竖体\t78365\n男犯\t78366\nScrum\t78367\n图灵测试\t78368\n采集者\t78369\n广东东莞银行\t78370\n540m\t78371\n输送机\t78372\n无翼鸟漫画_无翼鸟漫画全集_无翼鸟邪恶漫画大全_无翼鸟H漫画\t78373\nlavazza\t78374\n下画\t78375\nTWAIN\t78376\n精盾\t78377\n齐恒公\t78378\n徐露\t78379\n倪柝声\t78380\n司炉\t78381\n浙江省委\t78382\n请三思\t78383\n包图网\t78384\nsqlite3_exec\t78385\n三正\t78386\n蒙面哥\t78387\n俩\t78388\n单波\t78389\n南朝四百八十寺\t78390\n压盘\t78391\nDapper\t78392\n老男\t78393\nSedo\t78394\nMidtown\t78395\n神鹿\t78396\n无线胶装机\t78397\n开言\t78398\nqqtn\t78399\n蓝猫淘气三千问\t78400\n拘束\t78401\n陈雪峰\t78402\n常用包\t78403\ndunk\t78404\n北京国税局\t78405\n上海省\t78406\n薄叶\t78407\n古武网\t78408\nPrism5\t78409\n美国俄亥俄州立大学\t78410\n残损\t78411\n旅游学概论\t78412\n实常\t78413\n浸会\t78414\n撩动\t78415\n05月\t78416\n倍速\t78417\n本价\t78418\n光荣与梦想\t78419\n博迪投资\t78420\n生化危机4\t78421\nOften\t78422\nZoofilia\t78423\n识字率\t78424\n10级\t78425\n刘雪\t78426\n江阴高新区\t78427\n何所冬暖何所夏凉\t78428\nREA\t78429\n奈央\t78430\nEdelman\
t78431\n吉普赛人\t78432\n流放之路瓦尔\t78433\n123平\t78434\n新刊\t78435\n林韦君\t78436\n涡街流量计\t78437\nthankful\t78438\n复旦万科\t78439\n死亡名单\t78440\n茶蛋\t78441\n油脂粒\t78442\n新峰\t78443\n钓位\t78444\n0215\t78445\njk触发器\t78446\n卡\t78447\n航空股\t78448\n苏州市实验小学\t78449\n六四\t78450\n云表格\t78451\n2018年4月9日\t78452\n方小晓\t78453\nSCDMA\t78454\nSingles\t78455\n刺客信条2\t78456\n中央美术学院美术馆\t78457\n国京证券\t78458\n报奖\t78459\n卓高\t78460\n丢你\t78461\n侠盗列车\t78462\n云冈\t78463\n网络模拟器\t78464\n大麦网\t78465\nK27\t78466\n京郊\t78467\n爆表\t78468\n贵府\t78469\nvovi\t78470\n车载充电机\t78471\n腹黑男\t78472\n52米\t78473\n合诊\t78474\n4月18日\t78475\n纪委监察委\t78476\n厦门教育局\t78477\n国际关系史\t78478\n和平家园\t78479\n握不住\t78480\nntc热敏电阻\t78481\n260公里\t78482\n专技\t78483\n长沙步行街\t78484\n维新派\t78485\n月季花苗\t78486\n捉妖大仙\t78487\nexploit\t78488\n优色\t78489\n宵禁\t78490\nG402\t78491\n塞缪尔·杰克逊\t78492\n利润分配\t78493\n王彦章\t78494\nDarrenChan\t78495\n威马EX5\t78496\nmywife\t78497\n36平米\t78498\n家庭共济\t78499\n泮托拉唑钠肠溶胶囊\t78500\n拉伐\t78501\nROWS\t78502\n我喜欢的人\t78503\n粗鲁\t78504\nhalloween\t78505\n电脑硬件\t78506\n买一赠\t78507\n医疗联合体\t78508\n第219集\t78509\n希瓦娜\t78510\n柔巾\t78511\n白领\t78512\n顾此失彼\t78513\n甘为\t78514\n唐伯虎\t78515\n李少红\t78516\nLightning\t78517\n红房\t78518\n布卷\t78519\n难倒\t78520\n绥化市\t78521\n徐文长\t78522\nfll\t78523\n利用\t78524\n中国幼教\t78525\n竹园村\t78526\nweeks\t78527\n平面密封剂\t78528\n封版\t78529\ni56500\t78530\n满身\t78531\nvc++2015\t78532\n完美世界吧\t78533\n色料\t78534\n企鹅fm\t78535\n三品\t78536\n百草集\t78537\n第n\t78538\n赵家镇\t78539\n酒杯架\t78540\n白鬼\t78541\n鸡毛\t78542\nHEYTEA\t78543\n顺丰科技\t78544\n惠农政策\t78545\nldt\t78546\n〞\t78547\n1200PLC\t78548\n非网管交换机\t78549\n老白茶\t78550\nMiller\t78551\n第七集\t78552\n脑瘫\t78553\n机器视觉工程师\t78554\n李奥\t78555\n拂堤杨柳醉春烟\t78556\nacura\t78557\n戊唑醇\t78558\n900ml\t78559\n苹果装饰\t78560\n076\t78561\n释放\t78562\n富川县\t78563\n这份爱\t78564\n大话西游之超级小白龙\t78565\n伊凡\t78566\n外来工\t78567\n北邙\t78568\n比亚迪G6\t78569\n神武山庄\t78570\nMDR-EX750BT\t78571\n4380\t78572\n荒野猎人\t78573\n诱魔者\t78574\n猫奴\t78575\n被积函数\t78576\n2015年12月1日\t78577\nhttpip\t78578\n无疑\t78579\n鲍国安\t78580\n中国好声音第二季片花\t78581\n小周周\t78582\ng9350\t78583\n赛菲尔\t7858
4\n代卖\t78585\nPram\t78586\n运城市教育局\t78587\n恩阳区\t78588\n玫瑰精油\t78589\n走势\t78590\n楼梯\t78591\n鲁尼\t78592\n围村\t78593\n生物反应器\t78594\n有恒\t78595\n奖牌\t78596\n林大\t78597\nOffic\t78598\n业余时间\t78599\n10几万\t78600\n银幕\t78601\n龙沙宝石\t78602\nmeminfo\t78603\n6升\t78604\n下号\t78605\n切原赤也\t78606\n酒精性脂肪肝\t78607\n水利工程师\t78608\n化学位移\t78609\n剃度\t78610\n欠款\t78611\n起点小说网\t78612\n古典吉他吧\t78613\n_双\t78614\n四段\t78615\n自动白平衡\t78616\n最后赢家\t78617\n溶氧\t78618\n维生素A\t78619\n2015年5月15日\t78620\ndq11\t78621\n快意电梯\t78622\n晋江市医院\t78623\nzjt\t78624\n素体\t78625\n捣蛋猪\t78626\naquatic\t78627\n卫生部北京医院\t78628\npdb\t78629\n安心\t78630\n16GB\t78631\nRigging\t78632\n507\t78633\n斗转\t78634\n安琪酵母\t78635\n陈同甫\t78636\n240名\t78637\n石花菜\t78638\n酯类\t78639\n漩涡鸣人\t78640\n数多\t78641\n山航\t78642\n周仲瑛\t78643\n叶\t78644\nflann\t78645\nクロニクル\t78646\nVS网\t78647\n高衙内\t78648\nMitsui\t78649\nperspective\t78650\nphysics\t78651\n12平米\t78652\n必究\t78653\n芜湖新闻网\t78654\n微酸\t78655\n窃听\t78656\n取首\t78657\npanabit\t78658\n湘妃\t78659\n海伯利安\t78660\n黑暗时刻\t78661\n狂扫\t78662\n菜鸟\t78663\n矶\t78664\n青岛眼科医院\t78665\n快恢复二极管\t78666\n英迪格酒店\t78667\ncaffe2\t78668\nShift\t78669\n随笔集\t78670\n1-2月份\t78671\n转向柱\t78672\n5.3.4\t78673\n康有为\t78674\n套锁\t78675\ntouser\t78676\n弧角\t78677\n南风窗\t78678\n射流\t78679\nthe&#160\t78680\n数据行\t78681\n麻糬\t78682\n鳞化\t78683\n母婴展\t78684\n正四棱锥\t78685\n飞雪\t78686\nxh98hx\t78687\n软骨肉瘤\t78688\n定阻\t78689\n异装癖\t78690\n西芹籽\t78691\n黑莲花\t78692\n核级\t78693\nSemiconductors\t78694\n武汉万达瑞华酒店\t78695\n画渣\t78696\n伊阿宋\t78697\n自录\t78698\n鹿山街道\t78699\n753\t78700\n30T\t78701\ngalga\t78702\n子段\t78703\n2008-2017年\t78704\n唐延路\t78705\n赛况\t78706\n中坝镇\t78707\n北京簋街\t78708\n礼\t78709\nandroidkiller\t78710\n黄晓烟\t78711\n压浆\t78712\n光明日报\t78713\nApoptosis\t78714\n激光干涉仪\t78715\n能动学院\t78716\n跑腿公司\t78717\n黄金素\t78718\n57岁\t78719\nshadowsock\t78720\nA21\t78721\n第二段\t78722\npingg\t78723\n食用胶\t78724\n毛熊\t78725\n陈玮\t78726\n市际\t78727\n欧日\t78728\n急性中耳炎\t78729\n矫形\t78730\n演技派\t78731\nKCP\t78732\n省政府党组\t78733\n旅游管理专业\t78734\n植物大战僵尸二\t78735\n步步高点读机\t78736\nvcu\t78737\nBIU\t78738\n签到册\t78739\ntao
shou\t78740\nψ\t78741\n一朵莲\t78742\n加法器\t78743\nCountDownLatch\t78744\nkegg\t78745\n免洗\t78746\n想象力\t78747\n功能室\t78748\n加中\t78749\n版师\t78750\n混床\t78751\n3468\t78752\n致命弯道2\t78753\n贤心\t78754\n爱普生R330\t78755\n体表\t78756\nwinmain\t78757\n加彩\t78758\n刀剑物语\t78759\n总集\t78760\nglobal\t78761\n三水区人民政府\t78762\n第四更\t78763\n审批单\t78764\n万高\t78765\n车库\t78766\n啮\t78767\n投信\t78768\n加入\t78769\n回执单\t78770\n落风\t78771\n北院\t78772\n孟定\t78773\naimesh\t78774\nBeanstalk\t78775\n忘忧\t78776\n电流镜\t78777\n108\t78778\n动物类\t78779\nAqua\t78780\n林语堂\t78781\n婺源篁岭\t78782\nATMEL\t78783\n环巢湖大道\t78784\nFIREFOX\t78785\n共战\t78786\n侨鑫\t78787\n波恩大学\t78788\n够不着\t78789\n铣头\t78790\n麻雀虽小\t78791\n冯唐\t78792\n永和经济区\t78793\nBMX\t78794\n北大公学\t78795\n财力\t78796\n千峰\t78797\n蕾雅\t78798\n117.136.0.0\t78799\n全党\t78800\n表皮\t78801\n第7关\t78802\n远赴\t78803\nFragrances\t78804\n大栅栏\t78805\n同源\t78806\n下周二\t78807\n16#\t78808\n钢模\t78809\nkuka\t78810\n道具箱\t78811\n功能型\t78812\n100x100\t78813\n第5套\t78814\n1.0.5.1\t78815\n正东\t78816\n宝鸡百姓网\t78817\n伪造\t78818\n溢脂性脱发\t78819\nPatel\t78820\nqiyuan\t78821\n韦驮天\t78822\n上海超研生物科技有限公司\t78823\ndnfeiji\t78824\n征收\t78825\n自解\t78826\nSQLITE3\t78827\nbog\t78828\n龙战士\t78829\nautocad2006\t78830\n求进\t78831\n250克\t78832\n第129集\t78833\nlic\t78834\n自组网\t78835\n叔伯\t78836\n中国公文网\t78837\n8.1.0\t78838\n不计其数\t78839\n许茹芸\t78840\n尹烨\t78841\n知识产权信息网\t78842\n李戡\t78843\n北京银行\t78844\nsockets\t78845\n裸爱\t78846\n20150425\t78847\n杭集\t78848\n双向六车道\t78849\n创世纪2天地有情\t78850\n中北路\t78851\n南无观世音菩萨\t78852\n2018.04.03\t78853\n衰变\t78854\n俸\t78855\n延寿镇\t78856\nafrica\t78857\nシンアイ\t78858\n村妇\t78859\n1立方厘米\t78860\n罗伊人\t78861\n送\t78862\n跨文化交际\t78863\n小微企业贷款\t78864\n邵阳\t78865\n常州市中医院\t78866\n学院奖\t78867\nkaraoke\t78868\n科长\t78869\n红盾信息网\t78870\n定胆\t78871\n120平方米\t78872\ncoolorus\t78873\n花园北路\t78874\nIC\t78875\n国家电力投资集团\t78876\n莫干山\t78877\n解说员\t78878\n芦丁\t78879\n夏宇\t78880\n杂环\t78881\nannotation-driven\t78882\n笔画\t78883\n陕西省人民检察院\t78884\n川派\t78885\n银杏苑\t78886\ntpk\t78887\nMET\t78888\n该\t78889\n草绘\t78890\n威斯敏斯特大学\t78891\n新垣柴静\t78892\n云南大理\t78893\
n周易-88虎风水网\t78894\n宜良\t78895\natof\t78896\n世界军运会\t78897\n三十八岁\t78898\n毛泽东选集\t78899\nnba2018\t78900\n下供\t78901\n450ml\t78902\n顺联动力\t78903\nRefain\t78904\n陈振宇\t78905\n扶翼\t78906\n花肥\t78907\nsslv3\t78908\n博朗\t78909\n无迹可\t78910\n爱大\t78911\n基酒\t78912\n双乙烯酮\t78913\n实施细则\t78914\n为一\t78915\n切花\t78916\n曼妙\t78917\n人文\t78918\n北大资源\t78919\nRDPAC\t78920\n不不\t78921\n地暖\t78922\n妇产科学\t78923\n2018年5月25日\t78924\n华春莹\t78925\nsychronized\t78926\n文艺心理学\t78927\n⑦\t78928\nsii\t78929\n11个\t78930\nsop8\t78931\n邢台火车站\t78932\n陈香梅\t78933\nauction\t78934\n精灵宝可梦全集\t78935\nimagemin\t78936\n绰绰有余\t78937\n先导片\t78938\n王均\t78939\ni74790k\t78940\n2017年10月18日\t78941\n古窑\t78942\nvray2014渲染器\t78943\n兰州大学研究生院\t78944\n星空日记\t78945\n百香果皮\t78946\n二手挖掘机\t78947\nhbk\t78948\nchamber\t78949\n张鲁一\t78950\ncode值\t78951\nTimesTen\t78952\n二百斤\t78953\n网络广告人\t78954\n勇闯夺命岛\t78955\nHopeTrip旅遊網\t78956\n第一条&#160\t78957\n腾房\t78958\n二传\t78959\navidolz\t78960\n能做到\t78961\n内生肌酐清除率\t78962\n全纳\t78963\n童模\t78964\nlist\t78965\n音信\t78966\n魔环\t78967\nrealtime\t78968\n遗书\t78969\n六子\t78970\n充饥\t78971\n袁记\t78972\n学期\t78973\n三样\t78974\n贾鹏芳\t78975\nwasn\t78976\n一品香\t78977\n包打\t78978\nGaGa\t78979\n第140章\t78980\n津膜科技\t78981\n灰鸽子远程控制软件\t78982\n卡迪\t78983\n合肥通用机械研究院\t78984\n纯阳观\t78985\n薛昊婧\t78986\n首肯\t78987\n双针\t78988\n三角梅\t78989\n很多\t78990\n吾股丰登\t78991\n房途网\t78992\n9节\t78993\n宏志\t78994\n云梦川\t78995\nSwift版\t78996\n进度图\t78997\nabbs\t78998\n无穷小\t78999\npc蛋蛋\t79000\n别挂\t79001\nde博客\t79002\n能源与动力工程学院\t79003\n髋\t79004\n腰饰\t79005\n率\t79006\nhbc\t79007\n0K\t79008\n无惑\t79009\n2月21日\t79010\n黑片\t79011\n太监\t79012\n咸肉菜饭\t79013\n一哈\t79014\n饭菜网\t79015\n2017-06-01\t79016\n顺毛\t79017\n新湖明珠城\t79018\n成都学院\t79019\n靖西市\t79020\nVIP163尊贵邮\t79021\nkit\t79022\n心领神会\t79023\n保存箱\t79024\n蜡模\t79025\nSwingers\t79026\n萨克雷\t79027\n定向井\t79028\n强盗\t79029\n厚彩\t79030\nJeffWong\t79031\n杭州第二中学\t79032\nORICO\t79033\nWDC\t79034\nbian\t79035\n教育学类\t79036\n埃尔德里奇\t79037\n钢化玻璃门\t79038\n衣邦\t79039\n异形\t79040\nぢ\t79041\n扫荡者\t79042\n飚\t79043\nrina\t79044\nmiho\t79045\n赏识\t79046\n深惠城轨\t79047\n
CBR300R\t79048\n五六万\t79049\nintro\t79050\n扫把星\t79051\n百联奥特莱斯\t79052\nvoip\t79053\n淫谋\t79054\n雨林蝎\t79055\n芳邻\t79056\n李志希\t79057\n证明人\t79058\nSharding-JDBC\t79059\n轴截面\t79060\ndisruption\t79061\n幻像\t79062\n蜥蜴\t79063\n何建军\t79064\n金鑫松\t79065\n椰糠\t79066\n徘徊\t79067\n出來\t79068\n光谷青年城\t79069\n陈绮贞\t79070\n乐读\t79071\n个人主义\t79072\n贝肯\t79073\n盔\t79074\n40x40\t79075\n纸版\t79076\n理化\t79077\nzhuna\t79078\n北京首汽(集团)股份有限公司\t79079\ngslb\t79080\n分歧\t79081\n洞室\t79082\nCONF\t79083\n来校\t79084\n黄金大道\t79085\n老司机\t79086\n0.02%\t79087\n渣渣\t79088\n无忧乐行\t79089\n奴婢\t79090\n依安县\t79091\n入狱\t79092\n什锦\t79093\n哪首歌\t79094\n嘉应\t79095\n手抄报\t79096\n游牧民\t79097\n自走\t79098\n局部阻力系数\t79099\n秭归县\t79100\n夜钓灯\t79101\n胚布\t79102\n松仁\t79103\n火蛇\t79104\nPLU\t79105\n人民警察法\t79106\n5plus\t79107\n秦如培\t79108\n瓶装酒\t79109\n曹山\t79110\n文明委\t79111\n猿类\t79112\n傲日\t79113\n噗噗\t79114\n刚才\t79115\n北湖区\t79116\n裕美\t79117\n98寸\t79118\n慢跑鞋\t79119\n入间\t79120\n高港\t79121\n20160904\t79122\naces\t79123\n茂名传媒\t79124\n沈震轩\t79125\n上一年\t79126\n捡\t79127\n省呗\t79128\n北京四方继保自动化股份有限公司\t79129\n装修改\t79130\n辰时\t79131\n比亚哈弗h2\t79132\nmeasurements\t79133\nFBIF\t79134\nDataV\t79135\n失眠\t79136\nWank\t79137\nSQLAlchemy\t79138\n2500u\t79139\n傲宇达科技\t79140\n分手费\t79141\np10\t79142\n凯盛集团\t79143\nV2.7.0.6\t79144\n不留痕迹\t79145\n王丹丹\t79146\n湖北省人民医院\t79147\n新郑网\t79148\n班前班\t79149\n美林苑\t79150\n职位\t79151\n菊池蓝\t79152\ninittab\t79153\n最准\t79154\n改良版\t79155\n二斤\t79156\n华为云\t79157\n318国道川藏线\t79158\nswift\t79159\n0.0.4\t79160\n83831381\t79161\nMovements\t79162\n讲道集\t79163\nPlus+\t79164\n纸心\t79165\n18年春节\t79166\n蚬木\t79167\n100亿_\t79168\n晁补之\t79169\n刘雨欣\t79170\n挪威的森林\t79171\n选购\t79172\n分币\t79173\n最高档\t79174\n华商报\t79175\n自然遗产日\t79176\n假面骑士kiva\t79177\n爱表族\t79178\n医生库\t79179\n窗扇\t79180\n胶棉\t79181\n2016-08\t79182\nアクメ\t79183\n安小暖\t79184\n3d打印\t79185\n思诺\t79186\nqueues\t79187\n三国全战\t79188\n氨基硅油\t79189\n雪线\t79190\n界河\t79191\n鞋铺\t79192\n赵毅衡\t79193\nifram\t79194\n压入式\t79195\n跟踪台风的卫星\t79196\nLandrover\t79197\n兵字\t79198\n有不有\t79199\nOptional\t79200\n乌鲁木齐火车站\t79201\n李中\t79202\n红苹果\t792
03\n300台\t79204\n友好城市\t79205\n球球\t79206\n骐骥\t79207\n高温\t79208\npld\t79209\n瑞丽市\t79210\n少年王卫斯理\t79211\n同余\t79212\n场务\t79213\nselenium3\t79214\n辽宁省公安厅交通安全管理局\t79215\n得势\t79216\n初效过滤器\t79217\n千禾\t79218\n三角洲部队\t79219\n表人\t79220\n13颗\t79221\n360手机吧_\t79222\n身无\t79223\n番队\t79224\n徐州日报社\t79225\n公公与媳妇完本\t79226\n学运\t79227\n热饮\t79228\n求生之路1\t79229\n4987\t79230\n双色球杀号定胆\t79231\n80句\t79232\n参谋网\t79233\nanjuke\t79234\n浑江区\t79235\n混编\t79236\n乐毅\t79237\n重行\t79238\n万科第五园\t79239\nRemixOS\t79240\n0460\t79241\n干锅鸡\t79242\n共妻\t79243\nGentle\t79244\n险成\t79245\n英雄联盟视频\t79246\n岳麓书社\t79247\n创界\t79248\n高丽营\t79249\n表外业务\t79250\n要靠\t79251\nunivs\t79252\n雄安概念股\t79253\n连梁\t79254\n肠道菌\t79255\nashx\t79256\n手机浏览器\t79257\n18公斤\t79258\n斯米兰岛\t79259\n钓鱼点\t79260\n柏林赫塔\t79261\nmacaca\t79262\n景区\t79263\nRadioGroup\t79264\n周田镇\t79265\n2根\t79266\n王利军\t79267\n蓝月传奇\t79268\n一摸\t79269\n镖客\t79270\n放不出\t79271\nvsn\t79272\nbenji\t79273\n杰瑞教育\t79274\n豆腐乳\t79275\n诺兰\t79276\n弗雷格\t79277\n战史\t79278\n学乐中国_小学生学习网\t79279\n小周天\t79280\n山东省书法家协会\t79281\nENBT种子磁力链接\t79282\nTire\t79283\n在那东山顶上\t79284\n三进\t79285\n万方医\t79286\nBanG\t79287\n500例\t79288\n刮刮乐\t79289\neea\t79290\n福田火车站\t79291\nwangzhan\t79292\n110xi4\t79293\na0\t79294\n溪口\t79295\n坛里\t79296\n于洪\t79297\n闪电\t79298\n0.002\t79299\n想问\t79300\n以身殉国\t79301\n新县政府\t79302\n城阳人民医院\t79303\n黄家湾\t79304\n一衣带水\t79305\n港池\t79306\n金山村\t79307\n图形框\t79308\n卸料\t79309\n曹建\t79310\nNEIGHBORHOOD\t79311\n明装\t79312\n开口费\t79313\n阳光酒店\t79314\n2134\t79315\n桐木\t79316\nRemy\t79317\n7777777\t79318\n新天地广场\t79319\nFirms\t79320\n诚e赊\t79321\nCYBER\t79322\n费森尤斯\t79323\namtemu\t79324\n小泽圆\t79325\n婉转\t79326\nSolve\t79327\n树脂基复合材料\t79328\n物产中大\t79329\n阿里云源\t79330\n亲子阅读\t79331\n重发\t79332\navoidance\t79333\n雌狮\t79334\n小郑\t79335\n民族团\t79336\nmeal\t79337\nhorizons\t79338\n嘉兴市委\t79339\n233号\t79340\n松辽水利网\t79341\n海联\t79342\n奔腾B30\t79343\n衡山西站\t79344\nBHD\t79345\nLTI\t79346\n一少年\t79347\n科友\t79348\n金庸群侠传x1.0\t79349\n陌陌币\t79350\n身外之物\t79351\n一万米\t79352\n技压\t79353\n飘雪\t79354\n爆品\t79355\n第39条\t79356\n20171024\t79357\n
zuci\t79358\n张佳佳\t79359\n牛汉\t79360\n李淑芳\t79361\n进入\t79362\n京汉\t79363\n世谱\t79364\nJumanji\t79365\n欧式\t79366\n光明家具\t79367\n难办\t79368\n拍摄地\t79369\n鱼人头\t79370\n许宏\t79371\nved\t79372\noppor9sk\t79373\natelier\t79374\n黄明志\t79375\n通用电气\t79376\nwow恶魔猎手\t79377\n会员会\t79378\n李金波\t79379\n携程汽车\t79380\n夜凯\t79381\n同步通讯录\t79382\n桃江县政府网\t79383\n德龙咖啡机\t79384\n九江市区\t79385\nrecommend\t79386\nTL-WR842N\t79387\n老财\t79388\n鲁西西传\t79389\n车花\t79390\n风化\t79391\n连连看小游戏\t79392\n陈文俊\t79393\n贴子\t79394\n六兄弟\t79395\nmongodump\t79396\nwww.cnnic.cn/download/registar_list/future1todayDel.txt\t79397\n卧车\t79398\n展架\t79399\n呢_\t79400\n白狐狸\t79401\n长城哈弗\t79402\n移动飞享\t79403\n倾销\t79404\n郑州宇通客车股份有限公司\t79405\n优陌\t79406\n第89号\t79407\n金樾\t79408\n地魔\t79409\n经济学\t79410\n黑龙江广播电视台\t79411\n大理火车站\t79412\n江离\t79413\n18b20\t79414\n龙之子\t79415\nvm14\t79416\n肖哥哥\t79417\nDeveloper\t79418\n果网\t79419\nexplorer\t79420\n嵌甲\t79421\nlenet\t79422\n向往的生活2\t79423\n赵品霖\t79424\n激情乱伦\t79425\n毛主席语录\t79426\n姜宏波\t79427\n金地樾檀山\t79428\n硫化橡胶\t79429\n交手\t79430\nucsd\t79431\n095\t79432\n陕甘\t79433\n耳刀\t79434\n天长地久\t79435\n节奏坦克\t79436\n分压式\t79437\n省份证\t79438\nmdr1000x\t79439\n口气\t79440\n黄韬\t79441\n下颌角\t79442\npictureBox\t79443\n第二教育网\t79444\n张家窝\t79445\nCD-KEY\t79446\nniconico\t79447\n微生物量\t79448\n中国扶贫杂志社\t79449\n捉住\t79450\n光怪\t79451\n肉蒲团\t79452\nIVVI\t79453\n法大\t79454\nPowerdesigner\t79455\nTorrent\t79456\n河伯\t79457\n箱子版\t79458\n回击\t79459\nlzslbd\t79460\n小米5splus\t79461\nedsheeran\t79462\n国家烟草专卖局\t79463\n井川由衣\t79464\n皮亚佐拉\t79465\n有心人\t79466\n南京儿童医院\t79467\n歼灭\t79468\n主程\t79469\nASB\t79470\n龙甲\t79471\n租妻\t79472\n回收机\t79473\n71%\t79474\n绿地\t79475\n中成\t79476\n家庭影院装修|一起网\t79477\n攻德无量扫文组\t79478\ncapture\t79479\n耐受\t79480\n小规\t79481\nSummicron\t79482\n省列\t79483\n冷器\t79484\nOGN\t79485\nradis\t79486\n3109\t79487\n中奥\t79488\naxsure\t79489\n以史为鉴\t79490\n暗黑秘影\t79491\n网易游戏频道\t79492\n伯罗奔尼撒战争\t79493\n框架性\t79494\n国家航天局\t79495\n中国城市网\t79496\n煤样\t79497\nambiguity\t79498\n陈田\t79499\n无线交换机\t79500\n旧体\t79501\n寿光市政府\t79502\n南峡\t79503\n201号\t79504\n爱伤\t79505\n乾元丹\t7
9506\n圆口\t79507\n天使爱美丽\t79508\ncsy\t79509\n纳米粉体\t79510\n连座\t79511\n尿不湿\t79512\n龙珠大冒险\t79513\n汽化器\t79514\n德拉蒙德\t79515\n新加卷\t79516\n骑者\t79517\n龙樾湾\t79518\n向量积\t79519\nps教程自学网\t79520\nNSDictionary\t79521\n南京科技馆\t79522\n贾鲁河\t79523\n侯建军\t79524\n狗子\t79525\nPrimavera\t79526\n中国人民对外友好协会\t79527\n封冻\t79528\n出国\t79529\n桂江\t79530\n枪案\t79531\n危象\t79532\n8681\t79533\n简笔画图\t79534\n马尔康县\t79535\n根结线虫\t79536\n金顺\t79537\n绿道\t79538\nWord2016\t79539\n新赛欧\t79540\n苏州物流公司\t79541\n发泡水泥\t79542\n旷日持久\t79543\n金矿\t79544\n福建林业职业技术学院\t79545\n韩潮\t79546\n科技股\t79547\nagk\t79548\n止回阀\t79549\n35秒\t79550\n微邦\t79551\n普通话考试信息网\t79552\n盲眼\t79553\n矢泽妮\t79554\nh3d\t79555\n好大夫在线\t79556\n昏厥\t79557\n芯盒\t79558\n绿都御景蓝湾\t79559\n胸脯\t79560\n列批量\t79561\n中国中信集团公司\t79562\n2018年5月1\t79563\n文氏管\t79564\npgl\t79565\n女武神4\t79566\n女人口\t79567\nGecko\t79568\n祥泰\t79569\n业务板\t79570\n异\t79571\n通达信科\t79572\noray\t79573\nRomania\t79574\n2017.10.20\t79575\n_步知网\t79576\n互动出版网\t79577\n畜禽\t79578\n20141008\t79579\n唐菲\t79580\n草虾\t79581\n顶通\t79582\n篮战\t79583\n德克萨斯大学\t79584\nqian\t79585\n水家湖\t79586\n红梅\t79587\n黑暗主神\t79588\nKEYENCE\t79589\n强悍\t79590\n智联卓聘\t79591\n甲硝唑栓\t79592\n131路\t79593\n剑网君威\t79594\nUCAN\t79595\n徐杨\t79596\n橘子娱乐\t79597\n井塘古村\t79598\n齐一\t79599\n2303a\t79600\n绝地求生吃鸡\t79601\n青春时代\t79602\n装饰器\t79603\n锦官城\t79604\n文史\t79605\n温泉路\t79606\n水果餐\t79607\n吉尔吉斯斯坦\t79608\n妙偶天成\t79609\n长江商报\t79610\n指板\t79611\n中国旅游研究院\t79612\nsdcms\t79613\n上海芭蕾舞团\t79614\n清丰县\t79615\n照顾者\t79616\n山东省高级人民法院\t79617\n游卡桌游三国杀\t79618\n第一组\t79619\nClutch\t79620\n命中注定\t79621\n团购\t79622\n既\t79623\n名邦\t79624\nAvis\t79625\n自制纸\t79626\n风暖型\t79627\n锯齿波\t79628\nEmotion\t79629\n豪取\t79630\n按行\t79631\n凤尾鱼\t79632\n别抢\t79633\n月球车\t79634\n俄服\t79635\n002439\t79636\n形近\t79637\n唐探\t79638\n雷迪森\t79639\n哈尔滨市道里区\t79640\n积水\t79641\n北京华夏\t79642\nv2.1.0_\t79643\n1.5.4\t79644\n兰州拉面\t79645\n莱圳\t79646\n六千万\t79647\n可口可乐\t79648\n石油石化\t79649\n火麻\t79650\n近水楼台先得\t79651\n东大寺\t79652\nebi\t79653\nWolves\t79654\n那英\t79655\n撤县\t79656\n张汝京\t79657\n自粘袋\t79658\n套标机\t79659\n人字拖\t79660\n采花大盗\t79661\n南达\
t79662\n金融计量学\t79663\n灵主\t79664\n十万火急\t79665\n国家安全部\t79666\n298K\t79667\n杭州华山连天美医疗美容医院\t79668\n消法\t79669\n太和先锋网\t79670\n手\t79671\nProcreate\t79672\n氧化铬\t79673\n隐检\t79674\n杀怪\t79675\n30MW\t79676\n阿尔卡迪亚\t79677\nhol\t79678\n选举委员会\t79679\n0.07\t79680\n木板材\t79681\n忍辱负重\t79682\n明星级\t79683\n迅雷资源\t79684\n墙灰\t79685\n汽车街\t79686\n中南城\t79687\n退群\t79688\n关镜\t79689\n杭州大厦购物城\t79690\n新华书店\t79691\n开诚布公\t79692\n双通\t79693\n明睿\t79694\n黑龙江省质量技术监督局\t79695\n三户\t79696\n幽灵行动:荒野\t79697\n伙同\t79698\n百度影音在线\t79699\n公区\t79700\n欧阳琦\t79701\nOpenHW\t79702\n弘阳广场\t79703\n吉美广场舞\t79704\n功能测试\t79705\n智宸\t79706\n14.4\t79707\n柠檬草\t79708\n渭南市财政局\t79709\n养妻\t79710\nwin7_\t79711\n臭逼\t79712\n铛铛\t79713\n2方\t79714\n安可儿\t79715\n够吗\t79716\n电力学\t79717\n陈芸\t79718\n氯化汞\t79719\n抢烧\t79720\nnonstop\t79721\n李凉\t79722\nOnlyLady\t79723\nCRUSH\t79724\n防治员\t79725\nLinux网络编程\t79726\nSOF\t79727\n高立\t79728\n码子\t79729\n社会保险补贴\t79730\n排班\t79731\n津人社局\t79732\n酷狗音乐人\t79733\n万科金域\t79734\n720度\t79735\n酸奶油\t79736\n双鸭山政府\t79737\n破骨细胞\t79738\n力士山\t79739\nDRS\t79740\n高一\t79741\n1263\t79742\n婆角\t79743\n痉挛性\t79744\n第四批次\t79745\nv6.6.6\t79746\n梦幻模拟战1\t79747\n拖着\t79748\n吕亦涵\t79749\n刀套\t79750\n一大帮\t79751\nnver\t79752\nv5.8\t79753\n威廉三世\t79754\n时尚中国网\t79755\n湖滨东路\t79756\n乌丸莲耶\t79757\n龙阳镇\t79758\n10x10\t79759\nonet\t79760\n关单\t79761\nStrive-count\t79762\njiguang\t79763\nmix2s\t79764\n跳芭蕾\t79765\ncorba\t79766\n小米air\t79767\n大体\t79768\n中国投资协会\t79769\n180万元\t79770\n車\t79771\nloid\t79772\n2017年7月份\t79773\n眼霜\t79774\n西环广场\t79775\n好姻缘\t79776\n三色杯\t79777\n設\t79778\n嘘\t79779\nSlab\t79780\n鸿新学习网\t79781\n罂粟籽\t79782\n千种\t79783\n何园\t79784\n12306\t79785\n射手男\t79786\n百里\t79787\n调谐器\t79788\nMOD_游人网\t79789\n吊灯\t79790\n伐昔洛韦\t79791\n夜光石\t79792\n经纪\t79793\n不累\t79794\n曹炯芳\t79795\n合期\t79796\n五列\t79797\n体恤\t79798\nMac视频播放器\t79799\n油漆刷\t79800\nAppleCare+\t79801\n应不应\t79802\n扑蝶\t79803\n首播\t79804\nSeeing\t79805\n湘粤\t79806\n顽疾\t79807\n坚贞\t79808\n直属\t79809\n中国农机网\t79810\n木叔\t79811\n绝活儿\t79812\n王娜娜\t79813\n800多万\t79814\n佛教音乐_高音质歌曲_九酷音乐\t79815\n新倩女幽魂吧\t79816\n碳青霉烯类\t79817
\n中影网\t79818\n储料\t79819\n羽裳\t79820\n箩\t79821\nanna\t79822\n须弥山\t79823\n蓝贴\t79824\n期权行权\t79825\n)医疗器械有限公司\t79826\nct6\t79827\n小众\t79828\n秦晋\t79829\n净现金流量\t79830\n王者荣耀无情刺客\t79831\n94版\t79832\nInequality\t79833\n环保油\t79834\nlicense-CSDN\t79835\nTony\t79836\n樊家村\t79837\n网友们\t79838\nALTERA\t79839\n百汇\t79840\nFreeOnes\t79841\n透出\t79842\n份子\t79843\n武昌火车站\t79844\n╯▽╰\t79845\nmt2503\t79846\n后续\t79847\nsherry\t79848\nVol.3\t79849\n双开助手\t79850\n南阳西峡\t79851\n管城回族区\t79852\n红壳\t79853\n勇敢的心世界大战\t79854\n物欲\t79855\n父与子的编程之旅\t79856\nWildest\t79857\n村上水军\t79858\n指挥室\t79859\n爱福家\t79860\n静安大悦城\t79861\n戒糖\t79862\n润雨\t79863\n甜言\t79864\n抽奖\t79865\n木碳\t79866\n智臻\t79867\nhustlecastle\t79868\n不删档测试\t79869\n亚太会计师事务所\t79870\n25件\t79871\n女宇航员\t79872\n管理学报\t79873\nALFA\t79874\n三式\t79875\nslim\t79876\n8999元\t79877\n水膜\t79878\n蓝调口琴网\t79879\n长短不一\t79880\n东南西\t79881\n高志华\t79882\n牛栏山一中\t79883\n月刃\t79884\n塔山公园\t79885\n岳塘区\t79886\n折纸大全\t79887\n超鬼王猫掌柜\t79888\nvc11\t79889\nASDA\t79890\n锂辉石\t79891\n东北大学研究生院\t79892\n蓝帽\t79893\n柱础\t79894\n经济半小时\t79895\n大讲堂\t79896\n评估师\t79897\n海尔工业园\t79898\n鲁中论坛\t79899\n浙江大学信息与电子工程学院\t79900\n第几季\t79901\n李少华\t79902\n渣男\t79903\n刹车钳\t79904\n高感\t79905\n中材集团\t79906\n香港迪士尼\t79907\n105SL\t79908\n六年级下册语文书\t79909\n独善其身\t79910\n重庆公交\t79911\n802.1\t79912\n败品秀-我败我秀\t79913\n干接点\t79914\nq3\t79915\nwin7工具栏\t79916\n华润凯旋门\t79917\n王者荣耀高帧率\t79918\n舞邦\t79919\n晚茶\t79920\n腾讯TIM\t79921\n拉\t79922\n气球\t79923\nmaki\t79924\n广州天河客运站\t79925\n横尸\t79926\n4608\t79927\n聊一聊\t79928\n江苏交通一卡通\t79929\n套胶\t79930\n捆索\t79931\nMONTH\t79932\n红星美凯龙\t79933\n胶东\t79934\nxiao77\t79935\n闲云野鹤\t79936\n司马师\t79937\n钽\t79938\n追鱼\t79939\n镇民\t79940\npaying\t79941\n曲臂\t79942\n第21天\t79943\ncrepe\t79944\n环境科学与工程系\t79945\n星野娜美\t79946\namd64\t79947\n夏朵\t79948\nDBVisualizer\t79949\ntransitions\t79950\n妇产科在线\t79951\n深圳德尚医院\t79952\n自考毕业证书\t79953\n杨博\t79954\n阿塔\t79955\n小照\t79956\n桐昆股份\t79957\n石岐\t79958\n理光GR\t79959\n翻出\t79960\n李杜\t79961\n有没有人告诉你\t79962\n2018042\t79963\n头照\t79964\n后报\t79965\n莆田学院\t79966\n拉压\t79967\nwzb\t79968\n夏川\t79969\n第二
十八期\t79970\n地球卫士奖\t79971\n夕颜阁\t79972\n丝蓓绮\t79973\nEgypt\t79974\nrtos\t79975\n天籁之战2\t79976\nEconometrics\t79977\n吕明\t79978\ncoffee\t79979\nKeyGen\t79980\nlining\t79981\n隋书\t79982\n宜春市政府\t79983\n李特\t79984\n2.08\t79985\n锦程物流\t79986\n冠城大通\t79987\n红餐网\t79988\npsu\t79989\n每五年\t79990\n鸡屎\t79991\n评戏\t79992\n吴淞口国际邮轮码头\t79993\n托玛琳\t79994\n茶话\t79995\n汉寿县\t79996\n花鞋\t79997\nXx\t79998\n雷天大壮\t79999\nviews\t80000\n水电费\t80001\n苍月奥特曼\t80002\nthunk\t80003\n儿\t80004\n人民法院量刑指导意见\t80005\n长期护理保险\t80006\n重庆城市管理职业学院\t80007\n木心美术馆\t80008\n王建平\t80009\n西贝猫\t80010\n手编\t80011\ncoyote\t80012\n呲花\t80013\n双录\t80014\n八珍汤\t80015\n发展阶\t80016\n迪生力\t80017\n江淮瑞风\t80018\nqDebug\t80019\nzhongxue\t80020\n_快啦网\t80021\n尤克里里Fans\t80022\nimager\t80023\nTent\t80024\n汗血宝马\t80025\n临床药师\t80026\n碧头\t80027\nOS2\t80028\ngzinglate\t80029\nInterconnect\t80030\n沖田杏梨\t80031\n圆尾斗鱼\t80032\n两孩\t80033\nLit\t80034\n牛肉丝\t80035\n天师钟馗\t80036\n程华\t80037\nzbrush4r8\t80038\n妈富隆\t80039\n温州医科大学附属第一医院\t80040\n生体\t80041\n棉子\t80042\n张芝华\t80043\n苏政\t80044\n黄腊管\t80045\n_君越\t80046\n乡约\t80047\n152个\t80048\n情境化\t80049\n五谷丰登\t80050\n死期\t80051\nfeed\t80052\n丸美\t80053\n叶露\t80054\n轩尼诗xo\t80055\n虎贲\t80056\n国奥村\t80057\n安速\t80058\n科幻\t80059\n松本\t80060\n非诚\t80061\n捺\t80062\n交行信用卡\t80063\n杀鸡取卵\t80064\n草根网\t80065\n连接处\t80066\n230W\t80067\n纤维绳\t80068\n醉舞\t80069\ncmx\t80070\n锅架\t80071\n蓝莓干\t80072\n船期\t80073\nNBHD\t80074\n4210\t80075\n联想\t80076\n邻菲罗啉\t80077\n草金\t80078\nlocoso\t80079\n罗老师\t80080\n夜大学\t80081\n39位\t80082\n杨木\t80083\n残花\t80084\n费费\t80085\n蓝月谷\t80086\n护垫侠\t80087\n美菱\t80088\njsq\t80089\n商用汽车网\t80090\n沙井中心客运站\t80091\n中国华录集团有限公司\t80092\n碧潭飘雪\t80093\n14页\t80094\nTBI\t80095\nzhiling\t80096\n不锈钢筛网\t80097\n云悠\t80098\n迪瑞医疗\t80099\nEmi\t80100\n印油\t80101\n37亿\t80102\n吉川莲\t80103\nsql语句\t80104\ndblp\t80105\n0405\t80106\n匂\t80107\n封神王源\t80108\n追加\t80109\n洞庭湖大桥\t80110\n鼻舒堂\t80111\n快递单号查询\t80112\n真尼玛\t80113\nSad\t80114\n司马昭\t80115\n问政时刻\t80116\nSightseeing\t80117\nCox\t80118\n巴托洛米奥\t80119\n翻箱倒柜\t80120\n信业\t80121\n四轮定位\t80122\n活出自己\t80123\n起初\t80124\n公摊率\t80
125\n肺囊肿\t80126\n九州职业技术学院\t80127\n活動\t80128\n擂主\t80129\n云木\t80130\n闽江\t80131\n商业管理公司\t80132\n导热塑料\t80133\n户愚吕弟\t80134\n赌侠2\t80135\n淞虹路\t80136\nAlina\t80137\n433\t80138\ninorganic\t80139\n13版\t80140\n工程学\t80141\n西南民族大学\t80142\n3x1\t80143\n台视\t80144\n外廊\t80145\n喷雾干燥机\t80146\n命令与征服3:凯恩之怒\t80147\n葑门\t80148\n千山药机\t80149\n电视遥控器\t80150\n集安市\t80151\n轮压\t80152\nMakes\t80153\n曲安奈德\t80154\nMIRO\t80155\n加热炉\t80156\n红袖\t80157\n125年\t80158\n松本清\t80159\n等离子体质谱法\t80160\n白条\t80161\n沈度\t80162\n管风琴\t80163\n平稳\t80164\n提升率\t80165\n大小谎言\t80166\n碧桂园物业\t80167\n香港浸会大学\t80168\n积极性\t80169\n爱看天\t80170\n飞思\t80171\ndanish\t80172\n南方都市报数字报\t80173\n晶盛机电\t80174\nATD\t80175\n铜陵有色金属集团控股有限公司\t80176\n凯文·杜兰特\t80177\n公物\t80178\n试液\t80179\n尖部\t80180\nPinkoi\t80181\n宇尘网络\t80182\nshadowsocks-libev\t80183\ncaoprm\t80184\n开馆\t80185\n纯悦\t80186\n20170711\t80187\nPantyhose\t80188\nconcept\t80189\n行知小学\t80190\n囍\t80191\n画格\t80192\n黄粱一梦\t80193\n天士力集团\t80194\n笃笑\t80195\n百慕大三角\t80196\n民报\t80197\n履带式\t80198\n第二套房\t80199\n葡聚糖酶\t80200\nLINGO\t80201\n中鼎股份\t80202\n亚硝酸盐中毒\t80203\noui\t80204\nWi\t80205\n新华集团\t80206\n模拟/电源-与非网\t80207\n加布\t80208\n打火石\t80209\n乾易通\t80210\n英德\t80211\n帝国理工学院\t80212\nSmartbi\t80213\n胡尔克\t80214\n工作液\t80215\n假信\t80216\nSKC\t80217\n威漫\t80218\n中大型\t80219\n金融监管研究院\t80220\n区名\t80221\n苗情\t80222\n外液\t80223\n8090年代\t80224\n32万元\t80225\n中国电子科技集团公司第十五研究所\t80226\n国有土地使用权招拍挂出让成交公示_自然资源部\t80227\n洛奇\t80228\n杭州士兰微电子股份有限公司\t80229\n马山\t80230\n白木\t80231\n地面部队\t80232\n鲥鱼\t80233\n伸冤\t80234\n百万吨级\t80235\n光之美少女\t80236\nCD\t80237\n达闼科技\t80238\n梦金园\t80239\n四川省机关事务管理局\t80240\n山东师范大学\t80241\n奔驰smart\t80242\n大明王朝1566\t80243\n武林群侠传\t80244\n天海翼\t80245\n新片\t80246\n阴山大草原\t80247\n吉普\t80248\n司法考试网\t80249\n文一\t80250\n$.each\t80251\nmdn\t80252\nflah\t80253\n北京医科大学\t80254\n功放机\t80255\n碾压式\t80256\n昨\t80257\n李东\t80258\n女团\t80259\nserialize\t80260\n2017.3.4\t80261\n免费下载器\t80262\n东湖绿道二期\t80263\n阴阳师\t80264\n书翁\t80265\n济公传\t80266\n三供一业\t80267\n米妮\t80268\n失物招领\t80269\n报纸\t80270\n瑞通\t80271\n摊余成本法\t80272\n专武\t80273\n出山\t80274\n10.2.0.4\t8027
5\n1988\t80276\nhyundai\t80277\n现在时\t80278\n保活\t80279\n长歌行\t80280\nhousehold\t80281\n欧冶云商\t80282\napk编辑器\t80283\n秘录\t80284\n红苋菜\t80285\nkc\t80286\n翼虎论坛_汽车之家论坛\t80287\nEntry\t80288\n幸运日\t80289\n大地测量学\t80290\n破片\t80291\n豫卦\t80292\n94名\t80293\n吴仁惠\t80294\n种丹妮\t80295\nmaster卡\t80296\nWinnipeg\t80297\n存款准备金率_\t80298\n芒市\t80299\n酷粉杂谈\t80300\n亨通光电\t80301\n中国民航飞行学院\t80302\nabg\t80303\naitao\t80304\n002027\t80305\n吴平\t80306\nmem\t80307\n全封\t80308\nSST\t80309\n无往\t80310\nLinux安全_Linux公社\t80311\n梁宏达\t80312\n瑞马\t80313\n陈嘉栋\t80314\n马王\t80315\ncas号\t80316\n樱花季\t80317\nBulls\t80318\n节律\t80319\n钟吾乐\t80320\n尝尝\t80321\n书法体\t80322\n攻击性\t80323\n澡堂子\t80324\n披拉·尼迪裴善官\t80325\n勾机\t80326\n美国电影协会\t80327\n向华强\t80328\n衣橱\t80329\nSSD接口\t80330\nyjy\t80331\n大雪山\t80332\nNikki\t80333\n上上下\t80334\n魂链\t80335\n回转炉\t80336\n仙剑奇侠传online\t80337\n艾薇\t80338\n荇\t80339\n王南方\t80340\n干扰性\t80341\n生搬硬套\t80342\n切换\t80343\n目字\t80344\n鲁公\t80345\n手拉手\t80346\nharmonious\t80347\nKa\t80348\n永定门\t80349\ntransformation\t80350\n控制层\t80351\n九江市发展和改革委员会\t80352\npalyer\t80353\n們\t80354\n有夫\t80355\n鄂尔多斯机场\t80356\n卡具\t80357\n安索\t80358\n2017-10-25\t80359\n铜山镇\t80360\ndcim\t80361\n东方深秘录\t80362\n次元\t80363\nHeroes\t80364\n限电\t80365\n绣眉\t80366\n第十四\t80367\nInstrument\t80368\n果蔬汁\t80369\n五证\t80370\n1793年\t80371\n巡逻兵\t80372\nreuters\t80373\nTemplate\t80374\n前沿\t80375\nextension\t80376\nHAVE\t80377\nConfirm\t80378\nLibra\t80379\n安立路\t80380\nFixes\t80381\n动滑轮\t80382\n常山北明\t80383\n致命缺陷\t80384\nEden\t80385\n100系\t80386\n另谋\t80387\n暖屏\t80388\n斜位\t80389\n深圳商报\t80390\n虎式\t80391\n4-1\t80392\n4.1.5\t80393\nfgo2017\t80394\n4630\t80395\nparticipation\t80396\n广州发展\t80397\n票据融资\t80398\nemulated\t80399\n中埋式橡胶止水带\t80400\n芜湖站\t80401\n液位计\t80402\n狗牌\t80403\n梅苑小区\t80404\npp\t80405\n稳杀\t80406\n德马泰克\t80407\n千家综合布线网\t80408\n11月29日\t80409\n能源化\t80410\nBeverage\t80411\n文卫\t80412\n科切拉音乐节\t80413\n感恩父母\t80414\n嘉兴南湖区\t80415\n村嫂\t80416\n盘算\t80417\nMeyer\t80418\n反共\t80419\nplaces\t80420\n安娜苏\t80421\n凤凰男\t80422\n先装\t80423\n琵琶网\t80424\n家浜\t80425\n刘音\t80426\n谱友\t804
27\n刘长福\t80428\n凯源吧\t80429\nanglebaby\t80430\n方差分析\t80431\n2195\t80432\n急性期\t80433\n富力盈凯广场\t80434\n金属乐\t80435\n工作量\t80436\n四杰\t80437\nDante\t80438\n16A\t80439\n德伦\t80440\n秘法\t80441\nSurfacePro\t80442\nAutoCAD2015\t80443\n蒙古栎\t80444\n辽宁对外经贸学院\t80445\n干妞网\t80446\n17r3\t80447\n合照\t80448\n金城武\t80449\n珠山区\t80450\n金蝉子\t80451\n顯示\t80452\n碟中谍5\t80453\n银仙\t80454\n上南路\t80455\n白金海岸\t80456\n里拉\t80457\n法援\t80458\nBradford\t80459\ncovery\t80460\n泡泡加速器\t80461\nbeatssolo3\t80462\n竖型\t80463\n区子\t80464\n大同古镇\t80465\n折光仪\t80466\n5.4.1\t80467\n扎金索斯\t80468\n人工蜂群算法\t80469\n天军\t80470\n隐刺\t80471\n血腥爱情故事\t80472\n常州北站\t80473\n框画\t80474\n勇敢的心:世界大战\t80475\ng7\t80476\n激发态\t80477\n陈胜\t80478\n重轮\t80479\nzxb\t80480\n48.5\t80481\nJB\t80482\nphrases\t80483\nHOLDER\t80484\n换乘案\t80485\n多端\t80486\n转一转\t80487\n中华上下五千年\t80488\n卡龙\t80489\n宸章\t80490\nz650\t80491\n5660\t80492\n香港交易所\t80493\n斯蒂文斯\t80494\n漫水桥\t80495\n美国花旗银行\t80496\n简读\t80497\nsecure\t80498\nlichmama\t80499\n亲爱的翻译官\t80500\nBangalore\t80501\n猫耳\t80502\n嘉庆帝\t80503\n姜潮\t80504\n禅宗\t80505\n海欧\t80506\n排线器\t80507\n司庆\t80508\n充耳不闻\t80509\n协和尚音\t80510\n内野\t80511\n爱善天使\t80512\n迢迢牵牛星\t80513\n徐某某\t80514\n九五至尊\t80515\n龙湖镇\t80516\n0.5万元\t80517\n家政服务部\t80518\n第35关\t80519\n营销策\t80520\nsdb\t80521\n青大附院\t80522\n中债指数\t80523\n津补贴\t80524\n挥泪\t80525\n阳仔\t80526\npsv2000\t80527\nShameless\t80528\n柘\t80529\n深圳市南山区人民法院\t80530\n母校\t80531\n雾岛\t80532\nThinkGIS\t80533\n友佳\t80534\n中国出版集团公司\t80535\n朱晔\t80536\n征途2s\t80537\n衡山县\t80538\n国网\t80539\n等待区\t80540\n孝昌人民政府\t80541\n阐释者\t80542\n马克杯\t80543\n732位\t80544\nガテン\t80545\n貂毛\t80546\n赣州广播电视台\t80547\n比亚迪秦80\t80548\nwin11\t80549\n北八道\t80550\n同分异构体\t80551\n司令员\t80552\n起草\t80553\n入校\t80554\n有源音箱\t80555\n魔力鸭\t80556\nSubsystem\t80557\n杜勒斯\t80558\n不知火舞\t80559\n归还\t80560\n婚纱照相框\t80561\n小粉\t80562\n含氯消毒剂\t80563\n25kg\t80564\nNEM\t80565\nス\t80566\n双浦\t80567\n机洗\t80568\n米小圈上学记\t80569\nmotorcycle\t80570\nMeetup\t80571\nSingular\t80572\ntoarray\t80573\nExcel表格\t80574\nwin32\t80575\n工作会议\t80576\n颜值\t80577\nCmd\t80578\n折扣券\t80579\ntoast\t80580\
n福泉山\t80581\n皮癌\t80582\nProvisions\t80583\n600名\t80584\nTHK\t80585\n成家班\t80586\n窖酒\t80587\n佛山地铁2号线\t80588\n独山子石化\t80589\n申康\t80590\najp\t80591\n磨轮\t80592\n高产\t80593\n混战\t80594\n六十路\t80595\n桂东县\t80596\ndolphin模拟器\t80597\nemui4.1\t80598\n潍坊港\t80599\nWin7/XP\t80600\n减肥仪\t80601\n水果树\t80602\n93届\t80603\n照照\t80604\nRANGE\t80605\n科济\t80606\n李柘远\t80607\nQSH\t80608\n老魔\t80609\n科神\t80610\n设定\t80611\nSequoia\t80612\nFerguson\t80613\n牡丹区\t80614\n开关箱\t80615\n云庭\t80616\n海关税\t80617\n恐怖美术馆\t80618\n气动球阀\t80619\n开设\t80620\n第二台\t80621\n北京大学基础医学院\t80622\n蔚为\t80623\n镇平\t80624\n倘甸\t80625\n扶危济困\t80626\n唱臂\t80627\n335路\t80628\n出口关税\t80629\n哈嗯\t80630\n9.4\t80631\n书香节\t80632\nv9.9\t80633\n梁伟\t80634\nWindsor\t80635\n中银富登村镇银行\t80636\n达斯\t80637\nITI\t80638\naccused\t80639\n扶起\t80640\n帽子戏法\t80641\n东平湖\t80642\n此番\t80643\n扬城\t80644\n九龙湖校区\t80645\nDevon\t80646\ncolumn\t80647\n千佛洞\t80648\n刘非\t80649\n国有土地使用权出让\t80650\n拉片\t80651\n还是\t80652\n苏州网易\t80653\n沙嘴\t80654\n播客\t80655\n联用\t80656\n疏附县\t80657\nlyt\t80658\n11.2.16\t80659\n热镀锌\t80660\n自力教育\t80661\n水晶DJ网\t80662\n北海日报\t80663\n不至\t80664\n官僚主义形式主义\t80665\n英皇集团\t80666\n中网\t80667\n咸水沽\t80668\n兰格\t80669\n信星\t80670\nthorlabs\t80671\njdm\t80672\n女童\t80673\n华商银行\t80674\n2015年12月25日\t80675\n强化剂\t80676\n长虹路\t80677\n会不会\t80678\n人行征信报告\t80679\n东方瑞通\t80680\nlightbox\t80681\n脑细胞\t80682\n第四十二\t80683\nRods\t80684\n1副\t80685\n夏夜\t80686\n耳麦\t80687\n杏璃\t80688\ngraphs\t80689\n_婚\t80690\n米儿\t80691\n弓神\t80692\nDeliverance\t80693\n王远\t80694\n连栋\t80695\n五斤\t80696\ngdm\t80697\n桃树苗\t80698\n河源日报数字报\t80699\n蚂蚁金\t80700\n迪兰RX\t80701\ngartner\t80702\n高贵\t80703\n输卵管妊娠\t80704\n木龙骨\t80705\n管理员\t80706\n家庭教育\t80707\n6kv\t80708\nabdominal\t80709\n宠物们\t80710\n上篇\t80711\n平仓\t80712\n特制\t80713\n2018.4.7\t80714\n神之刃\t80715\n囊括\t80716\n金沙国际\t80717\n中国黄金集团公司\t80718\nloglevel\t80719\n罗湖火车站\t80720\n火爆孕婴童招商网\t80721\nINNODB\t80722\nRARBT\t80723\n提利昂·兰尼斯特\t80724\n成泉\t80725\n破线\t80726\n2konline\t80727\nBbw\t80728\n头重\t80729\n第九号\t80730\n紧紧\t80731\nMGC\t80732\noptistruct\t80733\ndnf死灵术士\t80734\n专有\t80
735\n织构\t80736\na4l\t80737\n轩辕传奇手游\t80738\nwImage\t80739\n下派\t80740\nONU\t80741\n2017~2018\t80742\n吕巷镇\t80743\n真空覆膜机\t80744\n内资\t80745\n工具们\t80746\n50万方\t80747\nCJK\t80748\n方直科技\t80749\n关宏峰\t80750\n剪草机\t80751\n包袱\t80752\n面干\t80753\n返程\t80754\n进给量\t80755\n50.0\t80756\n大数据时代\t80757\n朝朝暮暮\t80758\nv9.6\t80759\n模糊\t80760\n超牛卡\t80761\n蓝天卫士\t80762\n巨轮\t80763\n新财富\t80764\n被曝光\t80765\n职\t80766\n亚马逊食人族\t80767\ndead2\t80768\n彭佳学\t80769\n幼女案\t80770\ntring\t80771\n好爽\t80772\n后撤步\t80773\n2018年5月13日\t80774\n新麦\t80775\n我最好朋友\t80776\n解结\t80777\n步步高家教机\t80778\n大佛顶首楞严经\t80779\n丙酮\t80780\nExpedition\t80781\n10.6%\t80782\n猪王\t80783\n玩懂\t80784\n乳山路\t80785\n胡萝卜\t80786\n剁椒酱\t80787\nCanadian\t80788\n74hc245\t80789\n樊建川\t80790\n旅法\t80791\naminxu\t80792\n三冠\t80793\n斩拌机\t80794\n烤饼\t80795\n乘船\t80796\n升龙集团\t80797\nTEWA\t80798\n76平\t80799\nGlass\t80800\nAV公司\t80801\n宝奶粉\t80802\nfolk\t80803\n几站\t80804\n聚新\t80805\n硬盘版\t80806\n随机误差项\t80807\n面过\t80808\n乐pro3\t80809\n囧架架\t80810\n岭南新世界\t80811\ne1\t80812\n孕妇装\t80813\n天狼影院\t80814\n社会资本合作中心\t80815\n监外执行信息网\t80816\n没早知道\t80817\n灵风\t80818\n首创证券\t80819\nNNEZ\t80820\n力加\t80821\nyounaxie\t80822\nDTB\t80823\n黄志勇\t80824\nPerfect\t80825\n爱爱网\t80826\nitl\t80827\n身外\t80828\n6月2号\t80829\n动漫东京食尸鬼\t80830\n压阻式\t80831\n背透\t80832\nlibcurl库\t80833\n领事\t80834\n阻焊\t80835\nscrach\t80836\n网秦\t80837\nEH\t80838\n20段\t80839\nvld\t80840\n莱特兄弟\t80841\n百家讲坛\t80842\n集大成\t80843\n翻绳\t80844\n苏芩\t80845\n两张皮\t80846\n乐正\t80847\n失声痛哭\t80848\n紧靠\t80849\n门架\t80850\n许国\t80851\n带客\t80852\n家伙们\t80853\n公平镇\t80854\n星状神经节\t80855\n西点师\t80856\nIndividuals\t80857\n范晓萱\t80858\nproblems\t80859\nKINDLE\t80860\nchai\t80861\n蓝月亮洗衣液\t80862\n学农\t80863\n盼盼\t80864\n埃琳娜\t80865\n入党誓词\t80866\n星影\t80867\n兔子\t80868\n上海海湾\t80869\n深宫谍影\t80870\n放哨\t80871\n蒸干\t80872\n光口交换机\t80873\n小轿车\t80874\n天津大学》\t80875\n腾挪\t80876\n腰振\t80877\n弹回\t80878\n30片\t80879\nelephants\t80880\n路畅\t80881\n西钓鱼台\t80882\ncaiwu\t80883\n化学元素周期表\t80884\n外事处\t80885\n门槛券\t80886\n粘包\t80887\n夏楠\t80888\n卡西欧\t80889\n卡比兽\t80890\nSphero\t80891\n夙夜\t80892\n骨板
\t80893\n抗氧剂\t80894\n李东生\t80895\ncodis\t80896\ndiscus\t80897\n街区式\t80898\ncao榴\t80899\nForgot\t80900\n微信公众平号\t80901\ntgbus\t80902\n28章\t80903\n蜡炬\t80904\n变格\t80905\n力兴\t80906\nTOUR\t80907\n澳门百家乐\t80908\n戴德\t80909\n裸装\t80910\n海蜇头\t80911\n小桂子\t80912\n山海关区\t80913\n刁伟铭\t80914\n资阳市环境保护局\t80915\n派对\t80916\n丽泽\t80917\n触\t80918\n被\t80919\nhoho\t80920\n徐天堂\t80921\n土鳖\t80922\nmi5\t80923\n北京朝阳法院\t80924\n13岁时\t80925\nOutputStream\t80926\n农媳\t80927\nUnit2\t80928\n政协\t80929\n圣保罗州\t80930\n黄三角早报\t80931\n未来两天\t80932\n骗取贷款罪\t80933\n4.8.7\t80934\n170MV\t80935\n江海路\t80936\n第378章\t80937\n贵定\t80938\nmchange\t80939\n厦门湾\t80940\n来着\t80941\n亚东县\t80942\nUUU9\t80943\n时话\t80944\nEconomy\t80945\n注册安全工程师论坛\t80946\n李忱\t80947\n转关\t80948\n脚木\t80949\n微友搜索网\t80950\nwinkTV_winktv中文网\t80951\n司少的天价宝贝\t80952\n海参汤\t80953\n福尔摩斯基本演绎法\t80954\n开导_逍遥兵王\t80955\n太平洋游戏网\t80956\nalsa\t80957\n九宫格拼图\t80958\n牡丹广场\t80959\n仗\t80960\n顶天\t80961\n国防大学\t80962\n三星4521f\t80963\nmultisim13\t80964\n沙河街道\t80965\n荔枝广场\t80966\n哮\t80967\n张逸\t80968\n七氟醚\t80969\nini\t80970\n职业病防治法\t80971\n单机斗地主\t80972\n鄱阳\t80973\n0000007\t80974\n西汉南越王博物馆\t80975\n中国科学院天津工业生物技术研究所\t80976\n呻吟\t80977\n色欲迷墙\t80978\nvsphere\t80979\nparameterized\t80980\n巴黎最后的探戈\t80981\nSaipan\t80982\nZARD\t80983\n童锁\t80984\nblfbuaa\t80985\n馆陶\t80986\n急求解\t80987\n道以万计\t80988\nsociology\t80989\n伊斯兰国\t80990\n德宏傣族景颇族自治州\t80991\n指导性\t80992\n600598\t80993\n皮袄\t80994\n京准达\t80995\n梅津泰臣\t80996\n布尔类型\t80997\n阿汤\t80998\n荣智健\t80999\n预报员\t81000\n联想小新7000\t81001\n抓娃娃机\t81002\n阿多诺\t81003\n2018年1月起\t81004\n李振江\t81005\n围场\t81006\n竹类\t81007\n蒙迪\t81008\n不能错过\t81009\n延时摄影\t81010\n42055\t81011\n叶昕\t81012\n邵通\t81013\n掖县\t81014\n副刊\t81015\n背虾\t81016\n多子多福\t81017\n霍普\t81018\n背带裙\t81019\n常道\t81020\niis服务器\t81021\ncranes\t81022\n罗技G933\t81023\nWed\t81024\n军装\t81025\n漠漠\t81026\nMosquitto\t81027\n/a\t81028\n麦芒\t81029\n选题\t81030\n哈淘\t81031\n莱温斯基\t81032\n用床\t81033\n鞋神\t81034\nPR2017\t81035\n救生舱\t81036\nproe技巧网\t81037\nvertices\t81038\nXXXX公司\t81039\n求攻\t81040\nDiagnostics\t81041\n山西省委党校\t81042\n小体\t81
043\n古巷\t81044\n欧洲央行\t81045\nT430\t81046\nUbu\t81047\n桃皮绒\t81048\nSE846\t81049\n单人床\t81050\n接枝\t81051\n东方好莱坞\t81052\n栽\t81053\n血管\t81054\n榕江\t81055\n汇通快递\t81056\n竹径\t81057\nsqlldr\t81058\nMAF\t81059\n白瓜\t81060\n福昕阅读器\t81061\n帕萨特\t81062\nPSD格式\t81063\n女猎手\t81064\n平顺\t81065\n纵容\t81066\n脊梁骨\t81067\n赏秋\t81068\n雪糕棒\t81069\n集邮者\t81070\n圆满结束\t81071\n45款\t81072\n迷魂阵\t81073\ncall\t81074\n妖猫传\t81075\nsaturn\t81076\n厂区\t81077\n姚安县\t81078\n星河丹堤\t81079\nWeeks\t81080\n羞耻度\t81081\npronunciation\t81082\n戈壁玉\t81083\n双街\t81084\n过秦论\t81085\nSovereign\t81086\n清肌\t81087\n00000709\t81088\n新中关购物中心\t81089\n2.42\t81090\n奔奔\t81091\n惠科\t81092\n2999元\t81093\n关外\t81094\nmui-slider\t81095\nBudapest\t81096\n01年\t81097\n重生女\t81098\n激励\t81099\n蛇妖\t81100\nCENS\t81101\n地固\t81102\n坦静\t81103\n那麽\t81104\n报批\t81105\n战王\t81106\n吝啬\t81107\n插画版\t81108\n马龙·白兰度\t81109\nVisualStudio2017\t81110\n安然\t81111\nReal\t81112\n尚方宝剑\t81113\n木薯\t81114\n三星Gear\t81115\n医药类\t81116\n一姐\t81117\n就坐\t81118\n鉴赏_无敌汽车网\t81119\n四五天\t81120\n社长\t81121\n暑校\t81122\n北方公司\t81123\n55R18\t81124\n月亮湾3号\t81125\n一千亿\t81126\n楹联\t81127\n深圳广播电台\t81128\n中银E贷\t81129\nsexlab\t81130\n日月城\t81131\n电度\t81132\n正大剧场\t81133\n鲜奶吧\t81134\nwww.163.com\t81135\nmatelab\t81136\n小米电视4a\t81137\n行政复议决定书\t81138\n14.04下\t81139\n减毒活疫苗\t81140\n整理员\t81141\n小泽爱丽丝\t81142\nWeather\t81143\n杉德卡\t81144\n庄园主\t81145\nSTARTER\t81146\n3月4日\t81147\n10.32\t81148\n陈政高\t81149\n希慎广场\t81150\n49_\t81151\nlocation\t81152\n木工\t81153\nmodbustcp\t81154\n偏光片\t81155\n普速铁路\t81156\n斯蒂卡\t81157\n旧称\t81158\n双12\t81159\n315.com\t81160\n上海日上免税店\t81161\n舔脚\t81162\n江亭\t81163\n_兽\t81164\n佳能俱乐部\t81165\n家朗\t81166\n透视感\t81167\n低腰\t81168\n松塔\t81169\nltrim\t81170\n预审\t81171\n看世界\t81172\nthf\t81173\n票决\t81174\n蜜穴\t81175\nnickel\t81176\n商埠\t81177\n麦吉\t81178\n及物\t81179\n乳沟\t81180\n南京市发展和改革委员会\t81181\n窃魂者\t81182\n张雪飞\t81183\n正反\t81184\n椭圆形\t81185\nFFSUSU\t81186\n528Li\t81187\n五双\t81188\n初情\t81189\nleftshine\t81190\n杨宽\t81191\nabnormal\t81192\nposi\t81193\n区城管执法局\t81194\n格子达\t81195\n晶品\t81196\n林清玄\t81197\n地尔汉宇\
t81198\n美术师\t81199\n见龙在\t81200\n肌注\t81201\nOPenCV\t81202\necstore\t81203\nFinereport\t81204\n雷神加速器\t81205\n上虞人才网\t81206\nTranslational\t81207\n疗养院\t81208\n高仿苹果\t81209\n河南南路\t81210\n阳神\t81211\nhdc\t81212\n北京奔驰\t81213\n软考网\t81214\nElizabeth\t81215\n三湘万\t81216\n康大\t81217\n命運╰\t81218\n优质化\t81219\n芬顿反应\t81220\nEVE\t81221\ndj\t81222\n未经许可\t81223\npatty\t81224\n水饮\t81225\n秦晋霖\t81226\n修正药业\t81227\n映入眼帘\t81228\nslf4j\t81229\n远途\t81230\ngdy\t81231\n厦门南洋学院\t81232\nFunk\t81233\n闫\t81234\n三箭\t81235\n翻身仗\t81236\n跑进\t81237\n收率\t81238\n动弹\t81239\n松山镇\t81240\n莞尔\t81241\n不结\t81242\n走稳\t81243\nwold\t81244\ndiv、li\t81245\n自由落体\t81246\n悦辱\t81247\n吹膜机\t81248\n凤回巢\t81249\n羹\t81250\nDOTA2人口普查\t81251\n38码\t81252\n李哥\t81253\n悲伤\t81254\n无锡国劲合金有限公司\t81255\n人本集团\t81256\n商区\t81257\n侠僧探案传奇\t81258\n干锅\t81259\n圣莫妮卡\t81260\n健盘\t81261\ndhc卸妆油\t81262\n第11节\t81263\nmxm\t81264\n加快\t81265\nBNB\t81266\n退役军人\t81267\n美度贝伦赛丽\t81268\n展讯通信\t81269\n十几本\t81270\nFee\t81271\nBush\t81272\ntrader\t81273\nIntel处理器\t81274\n宫二\t81275\n于佳\t81276\n新三水网\t81277\n希尔镇\t81278\n求值\t81279\nblacks\t81280\nBlu\t81281\n方角\t81282\nlona\t81283\n胸长\t81284\ndstream\t81285\nWA\t81286\n伍君仪\t81287\n超级玛丽\t81288\nkristen\t81289\nSMC\t81290\n中生\t81291\n曾家镇\t81292\n权势\t81293\n模載\t81294\n星网\t81295\nCPhI\t81296\n瑄\t81297\n蒂森克虏伯\t81298\n天涯孤客\t81299\n英国谢菲尔德大学\t81300\n中山大学政治与公共事务管理学院\t81301\n迷题\t81302\n主君的太阳\t81303\n鬼泣4\t81304\n雪路\t81305\n优居\t81306\n荒川之怒\t81307\n镇海路\t81308\n普利\t81309\nedid\t81310\n贵州茅台集团\t81311\n春游日记\t81312\n用电\t81313\nDeNA\t81314\n阿不\t81315\n预算法\t81316\n互联网+医疗\t81317\n分位\t81318\n海斗\t81319\n摄镜\t81320\n加热套\t81321\ntittle\t81322\n朗诵会\t81323\n第82章\t81324\n显然\t81325\nMoco\t81326\nzeus\t81327\n天津公司\t81328\nbarLayout\t81329\nPOB\t81330\n绿都万和城\t81331\nCOB\t81332\n路遥知\t81333\n2092\t81334\nA级旅游景区\t81335\n招租/托管信息网\t81336\n7分钟\t81337\n华天\t81338\n163源\t81339\n东莞市第五人民医院\t81340\n几道\t81341\n道听途说\t81342\n过去式\t81343\n夺旗\t81344\nSIM\t81345\n盐井\t81346\n15_\t81347\nAgoly\t81348\nwww.178yy.com/News/ybml/fjzy.htm\t81349\nYaoi\t81350\n精油皂\t81351\n积存\t8
1352\n近亲结婚\t81353\n2018036\t81354\nJCB\t81355\n中国文明网联盟\t81356\n霸哥\t81357\n周仰杰\t81358\n合板\t81359\nsubtitles\t81360\n小诚\t81361\n差模\t81362\n飞天诚信\t81363\n圆梦园\t81364\n最高人民法院民一庭\t81365\n50分钟\t81366\nonscroll\t81367\n润湿剂\t81368\n陈继世\t81369\n佳味\t81370\n当麻\t81371\n石化区\t81372\n233\t81373\n集中楼\t81374\n令片\t81375\nACCA\t81376\ndozen\t81377\n真月\t81378\n第一任\t81379\n甜妻\t81380\n喜园\t81381\n古柏\t81382\n绍兴银行\t81383\n刺激性\t81384\n占察\t81385\nReplacement\t81386\n成就好\t81387\n教育科学学院\t81388\n马春花\t81389\nKeytruda\t81390\n古墓丽影10:崛起\t81391\n林本托\t81392\n羚锐制药\t81393\npre\t81394\n师带徒\t81395\n8069\t81396\n信托网\t81397\n党建制度\t81398\n官阶\t81399\n五彩城\t81400\n全民皆兵\t81401\n光绪\t81402\n42元\t81403\n美梦\t81404\npuzzle\t81405\n中版\t81406\n深孔钻\t81407\n国家保密科技测评中心\t81408\nSystemd\t81409\n柱形\t81410\njiffies\t81411\n入教\t81412\nAnswer\t81413\n透明质酸\t81414\n井山裕太\t81415\n假想\t81416\n硫软膏\t81417\n妙\t81418\n标回收率\t81419\nadult\t81420\n闪客精灵\t81421\n评述\t81422\n六福珠宝\t81423\n水耗\t81424\n花架\t81425\nd盘\t81426\niphone6s\t81427\n13w\t81428\n锐意\t81429\n黑桐谷歌\t81430\n中游\t81431\n360问答\t81432\nFreestyle_娱乐频道_红网\t81433\n圆模板\t81434\n赢驷\t81435\nNikon\t81436\n异型\t81437\n冬美人\t81438\n卢松松\t81439\n大乔\t81440\nabt\t81441\n思敏\t81442\n沙井村\t81443\n健康报\t81444\n太极服\t81445\n卓雅\t81446\n24种\t81447\n140亿元\t81448\n百分之四\t81449\n劳动年龄人口\t81450\n壮丽\t81451\nqms\t81452\n铁了心\t81453\n魅力四射\t81454\n高洋\t81455\nlede\t81456\n六角\t81457\n查斯特\t81458\n云黑\t81459\n瞬镜\t81460\n二十一项\t81461\n连廊\t81462\n止水\t81463\n秩序\t81464\n耐用度\t81465\n哆啦a梦新番\t81466\n没入\t81467\n加药\t81468\n板芙镇\t81469\nmg/m3\t81470\n协作体\t81471\n基因突变\t81472\ngsb\t81473\n中国经济导报社\t81474\n路商机\t81475\n耶鲁指纹锁\t81476\n2017-12-28\t81477\n比力\t81478\n脉率\t81479\n李天生\t81480\n玛蒂尔德\t81481\n天下霸\t81482\nscr\t81483\n分解者\t81484\nxeno\t81485\n北票\t81486\n北京航空航天大学研究生院\t81487\n降旗\t81488\n辅助型\t81489\n谢杰\t81490\nKencery\t81491\n一周内\t81492\nq20\t81493\n恢恢\t81494\n地地道道\t81495\n香柏\t81496\n充放\t81497\n衡山政府\t81498\nFoxmail帮助中心\t81499\n二叉树算法\t81500\nletv乐视高清\t81501\n剪刀门\t81502\n80斤\t81503\n点学\t81504\n区纪委监察局\t81505\n怎么装\t81506\n巧虎学\t81507\n上下界\t81508\n
鉴衡\t81509\n迈皋桥\t81510\n米其林轮胎\t81511\n_南方网\t81512\n深圳市文体旅游局_深圳政府\t81513\nランド\t81514\n金石路\t81515\n戴鹏\t81516\n证人\t81517\n萧芳芳\t81518\n00期\t81519\n人事制度\t81520\n比亚迪f0\t81521\nhz\t81522\n6袋\t81523\nVox\t81524\n北京国际\t81525\n回忆版\t81526\n第25话\t81527\n黄慧\t81528\n小慧\t81529\n_神盾局\t81530\n客管处\t81531\n饭岛玖罗\t81532\nLOLs8\t81533\nAPR\t81534\n张澜\t81535\n苏运莹\t81536\n华安基金\t81537\nDXP\t81538\n矿藏\t81539\n1万家\t81540\n诉讼案\t81541\n师部\t81542\n甘肃电视台\t81543\nnorway\t81544\nBD谱\t81545\n第10周\t81546\nFlashlights\t81547\n德安县\t81548\n法定退休年龄\t81549\n万晓利\t81550\nlol吧\t81551\n朱宏\t81552\nChoosing\t81553\nZH奶酪\t81554\n说会\t81555\n善意\t81556\n赵容\t81557\n纳米纤维素\t81558\n乏困\t81559\ncpcpc\t81560\n合成油\t81561\n8601\t81562\n国信办\t81563\n阿睿\t81564\nmultivariate\t81565\n那年花开\t81566\n高检院\t81567\n_龙岩网\t81568\n韩娱小说吧\t81569\nannabella\t81570\nstockings\t81571\nATM取款机\t81572\n砂锅米线\t81573\n恺撒\t81574\n节点值\t81575\n主校区\t81576\n抗韩\t81577\n爱尔威\t81578\n止痒膏\t81579\nギリモザ\t81580\n280g\t81581\nShibor\t81582\n8.5折\t81583\n海蒂诗\t81584\n横织\t81585\n新桥社区\t81586\nsuc\t81587\n波拉\t81588\n金立s6\t81589\n仕方\t81590\n武汉大学新闻与传播学院\t81591\n猫扑大杂烩-猫扑网\t81592\n水效\t81593\n藏青\t81594\n泥垢\t81595\nyaskawa\t81596\nintake\t81597\n101%\t81598\n欧路\t81599\n阿拉纳克\t81600\nPinky\t81601\n像章\t81602\n李晓燕\t81603\n海人\t81604\nSeriously\t81605\n电气柜\t81606\n限售期\t81607\n双重组织生活会\t81608\n福州高新区\t81609\n草船借\t81610\ngluster\t81611\n310\t81612\n一大坨\t81613\n佛教徒\t81614\n新任村\t81615\n20178年\t81616\n婉君\t81617\n第一讲\t81618\n各庄\t81619\n散点图\t81620\n天宇骑士\t81621\n帅哭\t81622\n质押回购\t81623\n身法\t81624\n50000000\t81625\n智驾\t81626\n重庆市长寿区人民政府\t81627\n2分半\t81628\n美中国际\t81629\npodspec\t81630\n五六十\t81631\n虚度\t81632\ntrillion\t81633\n转弯\t81634\n十五款\t81635\n桁架式\t81636\n讨伐\t81637\n中国铁路南昌局集团有限公司\t81638\n陈志强\t81639\nGluten\t81640\n百合科\t81641\n中华人民共和国公务员级别\t81642\n牛刀小试\t81643\n拳皇世界\t81644\npinger\t81645\n苦楚\t81646\nbans\t81647\n乐得\t81648\n关於\t81649\n县镇\t81650\n黑牛\t81651\n15G\t81652\n山本耀司\t81653\n大四狗\t81654\n不为人\t81655\n宝鼎\t81656\n离岸型\t81657\n辉煌科技\t81658\n伤案\t81659\n光合细菌\t81660\nluashin\t81661\n地价\t81662\nmsv
cp100.dll\t81663\n检验室\t81664\n崔珉豪\t81665\n别克4s店\t81666\n昆明小学\t81667\n华裳\t81668\n朱雀\t81669\n温床\t81670\n速马力\t81671\n斗胆\t81672\n带梦\t81673\n犬戎\t81674\n承受\t81675\n罗马利奥\t81676\n171号\t81677\n51nb\t81678\n6.80\t81679\n东陆\t81680\n注单\t81681\n五明\t81682\n招儿\t81683\n重庆公安出入境\t81684\n自然资源资产负债表\t81685\n知乎体\t81686\n安吉县人民政府\t81687\n京峰\t81688\n黄丽\t81689\n事君\t81690\n黑魂3\t81691\n滑档\t81692\n自助图书馆\t81693\n丑化\t81694\n玉溪\t81695\n十三日游\t81696\n剑器\t81697\n西安交通大学医学院\t81698\n无涯\t81699\nCoal\t81700\n邓氏鱼\t81701\n蛙鱼\t81702\n千度\t81703\n军事迷\t81704\n固件库\t81705\n江苏省国土资源厅\t81706\n裴银祥\t81707\n列强\t81708\n苏贞昌\t81709\n歌兰蒂斯\t81710\n英菲尼迪Q70L\t81711\n通电话\t81712\n青鸾\t81713\n煞人\t81714\n张家界旅游网\t81715\n十三妹\t81716\n朴灿烈\t81717\n洪水位\t81718\n船山\t81719\n暴利\t81720\n会计核算软件\t81721\n新疆公路\t81722\npss\t81723\n亮声\t81724\n于承惠\t81725\n过敏源测试\t81726\noverload\t81727\n耕耘\t81728\n091\t81729\n金谷园\t81730\n路劲铂隽\t81731\n洛江新闻网\t81732\n马克隆\t81733\n西厢\t81734\n张庆黎\t81735\n尼克扬\t81736\n梦之梦\t81737\n陈菲\t81738\nTuiGirl\t81739\n鞋鞋\t81740\n沉舟\t81741\n实用贴\t81742\n钟言\t81743\n无人银行\t81744\nadagio\t81745\n麦词\t81746\n奔驰AMG论坛\t81747\nxilie\t81748\n刀刺\t81749\n15ikb\t81750\n佛顶山\t81751\n373号\t81752\n云网\t81753\n20150415\t81754\n驭兽\t81755\n0.0.3\t81756\n匡威\t81757\n中国港\t81758\n私装\t81759\n泰达宏利\t81760\n卢沟桥乡\t81761\n生煎包\t81762\n2017年初\t81763\n先锋影音播放器\t81764\nbachelor\t81765\n王主任\t81766\n11首\t81767\n美祢藤\t81768\n第27轮\t81769\n压价\t81770\n荣誉室\t81771\n重病\t81772\n芍药园\t81773\nconductive\t81774\n超清_土豆\t81775\n浙工大\t81776\n拼花\t81777\n王尧\t81778\n酷睿m3\t81779\n神授\t81780\n番茄炒鸡蛋\t81781\n5级\t81782\n向快乐出发\t81783\n信誉度\t81784\nwecan\t81785\n创新基地\t81786\n斯泰尔\t81787\n刷子李\t81788\nfdf\t81789\n长江传媒\t81790\n6集\t81791\nHotspot\t81792\nby_saner100\t81793\n厂家\t81794\n奔驰E\t81795\n沙俄\t81796\n叶老师\t81797\n存贷比\t81798\nmarkets\t81799\n北边\t81800\nSVD\t81801\nbootanimation\t81802\n序列化\t81803\n打喷嚏\t81804\nissuing\t81805\n北京)科技有限公司\t81806\n52pk神武2\t81807\n合成法\t81808\n注册金\t81809\njavascript函数\t81810\nIINA\t81811\n不消\t81812\nfensi\t81813\n睿达\t81814\n学化妆\t81815\n百度Carlife\t81816\n电离辐射\t81817\n仙剑奇侠传1\t81
818\n金家村\t81819\n万行\t81820\n锦年\t81821\n逆生\t81822\n否定句\t81823\n我亲爱的朋友们\t81824\n烀\t81825\nSurge\t81826\n多版\t81827\n赵哥\t81828\n无极县\t81829\neTax@SH\t81830\n倩女幽魂\t81831\ntaps\t81832\n发源\t81833\n甲壳素\t81834\nprores\t81835\n我是路人甲\t81836\n京东小金\t81837\n冠希哥\t81838\n赵国荣\t81839\ntcm\t81840\n水上公园\t81841\n血癌\t81842\nJust\t81843\n幸福广场\t81844\n埃德加·斯诺\t81845\n河南商报\t81846\nRELEASE\t81847\n武夷山机场\t81848\n王冬龄\t81849\n200万辆\t81850\n欲水\t81851\n99首\t81852\n忍者杀手\t81853\nannotated\t81854\nSs\t81855\n天津市眼科医院\t81856\nlocality\t81857\nchase\t81858\n氯化铝\t81859\n10586\t81860\n资源商\t81861\n成品率\t81862\n惠东县\t81863\n上手\t81864\n虎坊桥\t81865\n松江路\t81866\n后缀名\t81867\n606路\t81868\n游程\t81869\n航天工程大学\t81870\n0.78%\t81871\n减法器\t81872\n拂去\t81873\nb40\t81874\n动火证\t81875\nG603\t81876\niphone\t81877\n中心化交易所\t81878\n居住权\t81879\n小人鱼\t81880\n海棉\t81881\n长机\t81882\n打平\t81883\n勇闯死人谷2\t81884\nneal\t81885\n证券期货经营机构私募资产管理业务运作管理暂行规定\t81886\n龙武\t81887\n通风口\t81888\n路名\t81889\n斋堂镇\t81890\nphoenix\t81891\n2.8%\t81892\n红军\t81893\n九仙图\t81894\n逐级\t81895\nwaigua\t81896\n阿硕\t81897\n曲江新区\t81898\nYing\t81899\nfrank\t81900\n彩印机\t81901\ndmp\t81902\nelimination\t81903\nsized\t81904\n趁势\t81905\n动物庄园\t81906\n较好\t81907\n折纸网\t81908\n背感\t81909\n忧乐\t81910\n南工大\t81911\n和安\t81912\n浴柜\t81913\n国家科技成果网\t81914\n水哥\t81915\n无间道3:终极无间\t81916\n错误率\t81917\n青联\t81918\nMKII\t81919\n第113章\t81920\n试训\t81921\n车载dj音乐\t81922\n1y\t81923\n重庆港\t81924\n2a3\t81925\n熬\t81926\n无刷\t81927\nAGS\t81928\n高刘镇\t81929\nSSTP\t81930\n锦绣华府\t81931\n圆型\t81932\n争做时代先锋\t81933\n穷酸\t81934\n男人花\t81935\n徐子珊\t81936\n露点\t81937\n甲基红\t81938\n红毛\t81939\n上钩\t81940\n驻马店广视网\t81941\n染上\t81942\n厄运\t81943\nwows\t81944\n代表们\t81945\nXmanager\t81946\n9.5.7.0\t81947\n艳骨\t81948\nnacl\t81949\nx270\t81950\n龙坞\t81951\nzyzy\t81952\n周韶窦骁\t81953\n2450\t81954\n何其莘\t81955\n山丹丹花开红艳艳\t81956\nZeal\t81957\n帰\t81958\n梅根\t81959\n养胃\t81960\n1169\t81961\n南京南\t81962\n花过天晴\t81963\n为邻\t81964\n秋田\t81965\nOne/M7\t81966\n内窥\t81967\n弋矶山医院\t81968\n奇妙博物馆\t81969\n防滑板\t81970\n求好\t81971\n两万五千\t81972\nBios\t81973\ndocker-compose\t8
1974\n打狗\t81975\n创评\t81976\n机动车\t81977\n潇湘夜雨\t81978\n金庸群侠传3\t81979\nimmune\t81980\n多面体\t81981\n1.72\t81982\n23所\t81983\n周易算命\t81984\n爱的天空\t81985\npom\t81986\n辽宁省国家税务局\t81987\nワン\t81988\n天音控股\t81989\n脂质体\t81990\n两周岁\t81991\nsin2\t81992\n经济学类\t81993\nlibtool\t81994\n116\t81995\n永年\t81996\n胆艺\t81997\n乳癌\t81998\n青海网络广播电视台\t81999\n扣位\t82000\n3\t82001\n圣雄甘地\t82002\n床头板\t82003\n思春期\t82004\n南昌理工学院\t82005\n涤纶面料\t82006\n拉呱\t82007\n子昂\t82008\n画圆\t82009\n吨钢\t82010\n周志明\t82011\n锦苑\t82012\n信息权\t82013\n屋场\t82014\ndusty\t82015\ntinyxml\t82016\n矗立\t82017\n3.5T\t82018\n梦幻西游1\t82019\n韵府\t82020\n敲门声\t82021\n文楼\t82022\n惨死\t82023\n公使\t82024\n人生中\t82025\nConsequences\t82026\n网络技术有限公司\t82027\nV5.1.0\t82028\n1万步\t82029\n欧蕾\t82030\n夫妇\t82031\n诱情\t82032\n海轮\t82033\n禁塑\t82034\n雷蛇鼠标宏\t82035\nNoire\t82036\n温岭市\t82037\nPre\t82038\n湖北科技职业学院\t82039\n夏禹\t82040\n北京中赫国安\t82041\n建设银行卡\t82042\n东师理想\t82043\n盛唐妖异志\t82044\n宝通\t82045\n外键\t82046\n硕大\t82047\n歌曲谱\t82048\n邪恶漫画大全无翼鸟\t82049\n二十周年\t82050\n组件库\t82051\n一个四十岁\t82052\n天国的阶梯\t82053\n甜味剂\t82054\n创维盒子\t82055\n新密市人民政府\t82056\n4699元\t82057\n7075\t82058\n磨合期\t82059\n破云\t82060\n实仪\t82061\n卖盘\t82062\n沙站\t82063\n四支\t82064\n师道\t82065\n全部\t82066\nmarshmello\t82067\n徐晓冬\t82068\n频次\t82069\n荣耀畅玩4x\t82070\niqc\t82071\n十八厘米\t82072\n复查\t82073\n施食\t82074\nabundance\t82075\n大同古城\t82076\n吉利帝豪GS\t82077\n熄火\t82078\n南都电源\t82079\nAmeba\t82080\n攻城掠\t82081\n恐怖事件\t82082\n量表\t82083\n一屏\t82084\n诗人\t82085\n夜蝶\t82086\n环氧值\t82087\n民政办\t82088\n露奈雅拉\t82089\n说唱脸谱\t82090\n韩旭\t82091\n亚星\t82092\ndy888\t82093\n钱瑗\t82094\n沈默\t82095\n准运证\t82096\n小石城\t82097\n丰阳\t82098\n海口站\t82099\n百度提问\t82100\n雷狼龙\t82101\n喷粪\t82102\n重庆江北国际机场\t82103\ng17\t82104\n摇钱树\t82105\n镜筒\t82106\n高堡\t82107\n56.com\t82108\n平邮\t82109\n3^\t82110\n庚辰\t82111\n玄空飞星\t82112\n乙卯月\t82113\n口袋西游\t82114\nFOREO\t82115\n毒魔\t82116\nCalculation\t82117\n作业本\t82118\n幼兔\t82119\n白濁\t82120\n惠济\t82121\n奶酪蛋糕\t82122\n熟制\t82123\nTu\t82124\nProRes\t82125\n蔡旻佑\t82126\n实用新型专利申请\t82127\n扭亏\t82128\n液液酱\t82129\n灌根\t82130\n元组\t82131\n沁州\t82132\nZhuo\t821
33\nansoft\t82134\nBetting\t82135\n玉骨\t82136\n北京图书大厦\t82137\n骑士步行者\t82138\n党报\t82139\n鳗鱼\t82140\n真武\t82141\n未成年人保护法\t82142\n1151针\t82143\n清水泵\t82144\n贵州人民出版社\t82145\nwebsocket\t82146\n南邵镇\t82147\n华南师范大学附属小学\t82148\n圣水寺\t82149\n林秀晶\t82150\n暴锋眼\t82151\n3.28\t82152\n中华讲师网\t82153\nav线\t82154\n盲品\t82155\n10000条\t82156\n六道木\t82157\nSubmodule\t82158\n新中国\t82159\n防潮\t82160\n央广网\t82161\n不离不弃\t82162\n观猎\t82163\n对氯间二甲苯酚\t82164\n2万美元\t82165\nnrpe\t82166\n20140103\t82167\n中同\t82168\n东方时代广场\t82169\n三枝\t82170\nwashington\t82171\n高指\t82172\n极品飞车14\t82173\ncs7\t82174\n裁判长\t82175\n地号\t82176\nzhcw\t82177\n原材料\t82178\n细口\t82179\nMORE\t82180\nnexus5\t82181\n国网商城\t82182\n拜读\t82183\n二苯甲酮\t82184\n百变马丁\t82185\n水轮\t82186\nconvergent\t82187\n萧诺\t82188\n尼尔森\t82189\n楼下村\t82190\n木笛\t82191\n唯我\t82192\n颛桥镇\t82193\n未觉\t82194\nBiu\t82195\n同工\t82196\nautodock\t82197\n红凌路\t82198\n福建省广播影视集团\t82199\nHK416\t82200\nネット\t82201\n王识贤\t82202\nConne\t82203\n女神级\t82204\n赋\t82205\n心理委员\t82206\n狼叔\t82207\nCollector\t82208\n73岁\t82209\n锂离子电池\t82210\n题词\t82211\nFeatured\t82212\nSMO\t82213\n张宏博\t82214\n孟子坤\t82215\n文登政府网\t82216\ncondenser\t82217\ngull\t82218\n网校\t82219\nmatlab2016a\t82220\n含油轴承\t82221\n文素\t82222\n和仁网\t82223\nPluralsight\t82224\n第2部\t82225\nonkeydown\t82226\n聚酰胺纤维\t82227\n2000例\t82228\nMongo\t82229\n暗黑之门\t82230\ncardboard\t82231\nnat123\t82232\n普吉岛酒店\t82233\nカラ\t82234\nBENZ\t82235\nOrUpdate\t82236\n旗开得胜\t82237\nsous\t82238\n4.05\t82239\n奖罚\t82240\n鞋柜\t82241\n37度\t82242\n光铸德莱尼\t82243\naiml\t82244\n太仓日报\t82245\n哈三联\t82246\nDSP28335\t82247\nlist}\t82248\n砌墙\t82249\n1929年\t82250\n3.5米\t82251\n清镇市人民政府\t82252\n手帖\t82253\n第一书记\t82254\n蔚山现代\t82255\n班会稿\t82256\n合作版\t82257\n下海\t82258\n进站\t82259\n尴尬期\t82260\n奶精\t82261\n江洋\t82262\n四三九九\t82263\n成都商报\t82264\nmp4_eD2k\t82265\n天刀ol\t82266\nBurgers\t82267\n肥猫\t82268\nrelaxing\t82269\n徕卡镜头\t82270\n液晶显示\t82271\n飲\t82272\n放肆\t82273\n湘鄂\t82274\n广州市正骨医院\t82275\nsaleae\t82276\n乐朗\t82277\n仙器\t82278\nCODESYS\t82279\n天鹏\t82280\nN95\t82281\nrequests\t82282\n波速尔\t82283\n离魂\t8228
4\nmknod\t82285\n甚微\t82286\n彩钢\t82287\n泰国菜\t82288\n小超人\t82289\nFRANCE\t82290\n8787\t82291\n蕲艾\t82292\nRossi\t82293\n猫爪\t82294\n简论\t82295\n苏银霞\t82296\n_吉\t82297\ntewa\t82298\n帕特农神庙\t82299\n钱林\t82300\nglay\t82301\n曳心\t82302\n数系\t82303\nchu\t82304\npaulyi\t82305\n影液\t82306\n喜达屋俱乐部\t82307\n小辣椒\t82308\nmakerbot\t82309\n一[\t82310\n马天何姿\t82311\nContinuing\t82312\n天策行\t82313\n威海经济技术开发区\t82314\n结构板\t82315\n长汀\t82316\n铭普光磁\t82317\n第二联\t82318\n韩国区\t82319\n青河绝恋\t82320\n养活\t82321\n昆都仑区\t82322\n表处\t82323\n植物生理学\t82324\nchem3d\t82325\n吴记\t82326\nBENQ\t82327\nSOCIAL\t82328\n长泡\t82329\n碧海青天\t82330\n障\t82331\n1.5公分\t82332\n天津城建大学\t82333\n20161001\t82334\nmacbook\t82335\n神采飞扬\t82336\n沈阳市政府\t82337\n脂\t82338\n朱蒙\t82339\n曼秀雷敦新碧\t82340\n一腰\t82341\nPICC人保财险\t82342\n封号\t82343\nGTX960M\t82344\n梦幻世界\t82345\n五厘\t82346\n每亩\t82347\n二十四条\t82348\n表板\t82349\n经科\t82350\n盛期\t82351\n浮云\t82352\n棚室\t82353\n我爱的人\t82354\nM117\t82355\n1001夜\t82356\ncomplie\t82357\n隔膜阀\t82358\n阿法拉伐\t82359\n谢烟客\t82360\n盛夏\t82361\n直热式\t82362\n两空\t82363\n分期信用卡\t82364\nManMP3\t82365\n纸上\t82366\nxavc\t82367\n快劳\t82368\n硅粉\t82369\n脑鸣\t82370\n锡罐\t82371\ncucm\t82372\nrs5\t82373\n升值\t82374\ntrance\t82375\n嵩天\t82376\nRalph\t82377\n512514\t82378\n汉美驰\t82379\n硅藻\t82380\n子宫息肉\t82381\n金虎\t82382\n漳县\t82383\n自行\t82384\n宁都县政府\t82385\n2017年6月7日\t82386\n8章\t82387\n三研\t82388\n多表\t82389\ncosme大赏\t82390\n八万年\t82391\n清穿文\t82392\n孙江涛\t82393\n蓝胖子\t82394\n10亩\t82395\n8868\t82396\n展平\t82397\n利安隆\t82398\n余子式\t82399\n不讨好\t82400\n合肥东部\t82401\nCustomers\t82402\nnolog\t82403\n徐晨\t82404\n萧山口水社\t82405\n9H\t82406\npages\t82407\n田丹\t82408\n北京理工大学宇航学院\t82409\n心灵\t82410\n卡路里\t82411\n江苏护理职业学院\t82412\n中付支付\t82413\n任贤齐\t82414\n散利痛\t82415\nLesbian\t82416\n魏新\t82417\n兴义民族师范学院\t82418\n凶杀案\t82419\n龙城小学\t82420\n苍龙\t82421\n社区卫生服务\t82422\n等闲识得东风面\t82423\n14m\t82424\n沂岭\t82425\n新精武门\t82426\nLinux就该这么学\t82427\n2小时内\t82428\n国代\t82429\n科汛\t82430\n臂力器\t82431\nqq音乐吧\t82432\n120\t82433\n南通车管所\t82434\n嘉丽\t82435\n驼梁\t82436\nCB190R\t82437\n人工费\t82438\n飞剑\t82439\nORACLE11G\t82440\n
0x000000\t82441\ndandelion\t82442\n避风港\t82443\n不变成\t82444\n远小\t82445\n阜蒙县政府门户站\t82446\n表文\t82447\n嘎巴\t82448\n朱光磊\t82449\nLibgdx\t82450\n女娃\t82451\n洛阳王城公园\t82452\n超级细菌\t82453\nMolecule\t82454\nNVM\t82455\nSpinning\t82456\n水斗\t82457\nyangzailu1990\t82458\n肖峰\t82459\n12356\t82460\ninstitut\t82461\n0711\t82462\n百度指数\t82463\nVVT\t82464\n2.89\t82465\n马岩\t82466\nqcom\t82467\npypi\t82468\n铁壳\t82469\n蒋勋\t82470\n拉坯\t82471\n李宣美\t82472\n牛顿第二定律\t82473\n左左\t82474\n人社局\t82475\n31元\t82476\n油笔\t82477\n3499元\t82478\n文苑英华\t82479\n嘉实多润滑油\t82480\n罗汉寺\t82481\n3055\t82482\n影展\t82483\n保尔\t82484\n20160710\t82485\n街球王\t82486\n阿里内\t82487\n跨域名\t82488\n周国平\t82489\n消规\t82490\nchaumet\t82491\n镒\t82492\n考研派\t82493\n冯喆\t82494\n大市场\t82495\nphpwind\t82496\n2108\t82497\n甘南藏族自治州\t82498\n中国徐州鼓楼区政府\t82499\n夏威夷特勤组\t82500\n多西\t82501\n文化部艺术发展中心\t82502\nuu898\t82503\n木龙\t82504\n劈柴\t82505\n拓普\t82506\n泌尿频道_健客网\t82507\nexcludes\t82508\n逗乐\t82509\n单警\t82510\n讠\t82511\ngreener\t82512\nMAD\t82513\n爱情公寓2\t82514\npivot\t82515\n生产资料所有制\t82516\n岛民\t82517\nPin\t82518\napa格式\t82519\n幺妹\t82520\n研究室\t82521\ndoing\t82522\n弹链\t82523\n卡方检验\t82524\n青衿\t82525\n高提成\t82526\nminor\t82527\nclinique\t82528\n王小佳\t82529\n红磨坊\t82530\n宋祖英\t82531\n双频千兆无线路由器\t82532\n保号\t82533\n埃尔\t82534\n民生银行手机银行\t82535\n观光路\t82536\nsk2\t82537\n十间\t82538\n张集镇\t82539\nsssc\t82540\npy3.6\t82541\n17G\t82542\n插线板\t82543\n麻辣女孩\t82544\n梦瑶\t82545\n不雅\t82546\n爱书网\t82547\n汉庭快捷酒店\t82548\n走砍\t82549\ncontiguous\t82550\n保龄球\t82551\n机经\t82552\npunishment\t82553\n封盘\t82554\n张让\t82555\n久保田\t82556\nsqlser\t82557\nzhyue93\t82558\n跨越式\t82559\n无锡东\t82560\n粘层\t82561\n裂片\t82562\n好词\t82563\nir2110\t82564\n小圈\t82565\n水平衡测试\t82566\nmexw64\t82567\n全球婴童网\t82568\n散酒\t82569\nOllie\t82570\n网易安全中心\t82571\n中国大学生广告艺术节学院\t82572\n长峙岛\t82573\nPassport\t82574\n绝色妖娆:鬼医至尊\t82575\n25周年\t82576\n缦\t82577\nclicking\t82578\n反党\t82579\n等比\t82580\n黑暗骑士\t82581\n棒球英豪\t82582\n丝瓜藤\t82583\n文章\t82584\n吉林油田\t82585\n市旅游局\t82586\n业\t82587\n不经\t82588\n卡坦岛\t82589\ntaxation\t82590\n氢碘酸\t82591\n何尊\t82592\n白橡木\t8
2593\n李敬\t82594\n人与狗\t82595\n韩城镇\t82596\nfaker\t82597\n金蝶记账王\t82598\nse5\t82599\n通苏嘉城际铁路\t82600\n贝佳斯绿泥\t82601\n第十九批\t82602\n深圳信用网\t82603\n可可英语\t82604\nBlossoms\t82605\nimoo\t82606\n最多\t82607\n中审\t82608\n舍得酒\t82609\n高难\t82610\n新繁\t82611\njcrew\t82612\n10min\t82613\n合肥市政务区\t82614\n专\t82615\n手臂流\t82616\n徐州站\t82617\n丁硼乳膏\t82618\ncleartype\t82619\n梅酒\t82620\n工匠\t82621\n赵建毅\t82622\n开朗斗笠菇\t82623\n汤珈铖\t82624\n雪铁龙C3\t82625\n经区\t82626\n中诗网\t82627\n盛科\t82628\n民事案件\t82629\n长春金融高等专科学校\t82630\n火莲\t82631\n万佛湖\t82632\n商用冷柜\t82633\nDSB\t82634\nPreview\t82635\n樱唇\t82636\ninsta360\t82637\npasse\t82638\nTests\t82639\n世茂茂悦府\t82640\n刘罡\t82641\n猇亭区\t82642\n悦色\t82643\nstyrene\t82644\nLevi's\t82645\nblessed\t82646\n专利池\t82647\nbox-shadow\t82648\nlifecycle\t82649\nxiaoli\t82650\n导课\t82651\n狂杀\t82652\n不值一提\t82653\n150529\t82654\n赛欧3吧\t82655\n天险\t82656\n电动网\t82657\n电脑\t82658\n度假型\t82659\n软件业\t82660\n废石\t82661\n吉利帝豪gl\t82662\n陈光华\t82663\n三库\t82664\n焦亚硫酸钠\t82665\nservlet类\t82666\n果糖二磷酸钠\t82667\n用事\t82668\n518000\t82669\n钓鱼灯\t82670\n声压级\t82671\n大茶\t82672\n打纸\t82673\n锐蝮蛇\t82674\n笼中\t82675\n4367\t82676\n斯伦贝谢\t82677\n舆图\t82678\n中韩街道\t82679\n株洲方特\t82680\n嘉凯城\t82681\n沙蚕\t82682\n公侯\t82683\n三五天\t82684\n西樵山国艺影视城\t82685\n东邪\t82686\n舱口\t82687\n沉香灰烬\t82688\n遇见大咖\t82689\naiss\t82690\n窑洞\t82691\n紫外线消毒灯\t82692\n库尔勒市人民政府\t82693\nhttponly\t82694\n新疆地税\t82695\n庙子\t82696\n罗瑞卿\t82697\n剧作\t82698\n扭力\t82699\n治超\t82700\n长靴\t82701\n3158名酒网\t82702\n6.9d\t82703\n防治所\t82704\n新四军\t82705\ncrl\t82706\n刷卡金\t82707\n9550\t82708\n西安市人民政府\t82709\n三国群英传之崛起\t82710\n段宁\t82711\n股票市场\t82712\n运营方\t82713\n中铁大桥局集团有限公司\t82714\n铁甲网\t82715\n东财\t82716\n12.06\t82717\n灵术\t82718\n奔驰C级论坛_汽车之家论坛\t82719\n88读书网\t82720\nsshd\t82721\n玻璃纤维板\t82722\n佩佩\t82723\n循环码\t82724\n县人民政府办公室\t82725\nB站UP主\t82726\n克州\t82727\n嘉兴房产超市\t82728\n信息熵\t82729\n100期\t82730\n北方邦\t82731\n聚散\t82732\n凤凰乡\t82733\n广州蚂蚁搬家公司\t82734\n厚本\t82735\n定妆\t82736\n王元宝\t82737\nJCR\t82738\n1.0_顶库\t82739\n郑东新区\t82740\nYii中文网\t82741\n丁小强\t82742\n春月\t82743\n河北工业大学\t82744\n烦躁不安\t82745\n贝思客\t82746\
nRuntime\t82747\n十四夜\t82748\n7D2\t82749\nPSP模拟器\t82750\n订票\t82751\nssdt\t82752\nconvertor\t82753\n县地税局\t82754\n蓉城\t82755\n江团鱼\t82756\n首单\t82757\n山东省总工会\t82758\n男人心\t82759\nTrend\t82760\n国投瑞银基金管理有限公司\t82761\n网路\t82762\nNPV\t82763\ncoherence\t82764\n舞种\t82765\n坏兽\t82766\n双人床\t82767\nBomber\t82768\n这一切\t82769\nsourceforge\t82770\n河南)自由贸易试验区\t82771\n毗河\t82772\n赤溪村\t82773\n子宫后壁\t82774\n零晨\t82775\n安胎\t82776\n红月\t82777\nIE内核浏览器\t82778\n哈工大附中\t82779\nPlantronics\t82780\nairdrop\t82781\n12.05\t82782\n刀塔Plus\t82783\nDnbbs\t82784\n密密麻麻\t82785\n集思宝\t82786\n家车\t82787\n上海骏联车行\t82788\n万能声卡驱动器\t82789\n乳摇\t82790\n焰器\t82791\n献唱\t82792\n5.8寸\t82793\n叶涩\t82794\n九九文章网\t82795\nWildlands\t82796\n果冻状\t82797\n晚礼服\t82798\n辛德勒名单\t82799\n辅导\t82800\n叫错\t82801\n三严三实民主生活会\t82802\n岩窟王\t82803\nろ\t82804\n多多多多\t82805\n盲管\t82806\n绿宝广场\t82807\n蔚来\t82808\nXXX公司\t82809\n雪花\t82810\n开曼群岛\t82811\n水阻划船机\t82812\n警路\t82813\n四川国税\t82814\n马云多\t82815\n一二本\t82816\n道具\t82817\nW5\t82818\nx2\t82819\n舞台版\t82820\nfangzi\t82821\n月谈\t82822\n中国军视网\t82823\n刷墙漆\t82824\n治水\t82825\n十五度\t82826\n20006\t82827\nplanned\t82828\n国际货币基金组织\t82829\nGSL\t82830\nbahn\t82831\n周梦蝶\t82832\n56亿\t82833\n集约\t82834\nHOWTO\t82835\n袁氏当国\t82836\n中国建筑一局(集团)有限公司\t82837\nZing\t82838\n双孢菇\t82839\n1952年\t82840\n百度小说\t82841\n第7名\t82842\n高神\t82843\n急流\t82844\n圆顶阀\t82845\n洛丽塔1997\t82846\n纳米纤维\t82847\n宝莱\t82848\nInternship\t82849\n石门县北\t82850\n升\t82851\nECHARTS\t82852\n斯内普\t82853\nopeniv\t82854\n罗技鼠标\t82855\n百度关键词\t82856\n艾美\t82857\n起算\t82858\n地脚线\t82859\n褪黑素片\t82860\n2018.1\t82861\nugirls\t82862\n商业中心\t82863\n大海啊故乡\t82864\n虹膜识别技术\t82865\n大澳\t82866\n减振\t82867\n回家吃饭\t82868\n知彼\t82869\nMirror\t82870\n偏岩古镇\t82871\n高级查询\t82872\n4月20日\t82873\n夏奈尔\t82874\n榨乳\t82875\nExplosive\t82876\n鸵鸟蛋\t82877\n病毒学\t82878\nroboform\t82879\ndesc\t82880\n广州市公安局\t82881\n暴食吧\t82882\n34秒\t82883\n灰卡\t82884\n应召女郎\t82885\n伙头智多星\t82886\n网版\t82887\n口贝\t82888\nYoshiki\t82889\n680万\t82890\n华商论坛\t82891\n格罗\t82892\n凤凰游戏\t82893\nVAIO\t82894\ninvariant\t82895\n安纳达\t82896\n原子化\t82897\n鹰击\t8289
8\n综采\t82899\n七八次\t82900\n24元\t82901\n星际2\t82902\n庚\t82903\n院团\t82904\n格兰蒂亚秘闻\t82905\n分歧管\t82906\n翰文\t82907\n兼修\t82908\n沪南路\t82909\n热血沸腾\t82910\n陈小华\t82911\n佳顺\t82912\n脑叶公司\t82913\n乐8\t82914\n筲箕湾\t82915\n南园\t82916\n福尔\t82917\n张岂之\t82918\nジョ\t82919\n50岚\t82920\n10.9级\t82921\n方便食品\t82922\n3.0.1\t82923\n英制\t82924\n夏凉被\t82925\n逆袭\t82926\n制冷剂\t82927\n海埂公园\t82928\n陈骁\t82929\nmommy\t82930\n密秘\t82931\n大阪钢巴\t82932\n良相\t82933\n池壁\t82934\n阿嬷\t82935\n企业管理咨询有限公司\t82936\naggressive\t82937\n豹\t82938\n湖南农业大学\t82939\n床头柜\t82940\n区位分析\t82941\n十三年后\t82942\nWBS\t82943\n吊裆裤\t82944\n萧逸\t82945\n体域\t82946\n新蛋\t82947\n中铁三局集团有限公司\t82948\n二品\t82949\nbcp\t82950\n卡塔尔航空公司\t82951\n兄弟伙\t82952\n首信易支付\t82953\n火电站\t82954\n郑州市教育局\t82955\n渲梦\t82956\n手撕票\t82957\n王明夫\t82958\n总工会\t82959\n搪瓷杯\t82960\ncssdiv\t82961\n陈启明\t82962\n华宇软件\t82963\n广西壮族自治区交通运输厅\t82964\n两箱\t82965\n买家秀\t82966\n7月28日\t82967\n华润(集团)有限公司\t82968\n如懿传\t82969\nProLyn\t82970\n走路\t82971\n260斤\t82972\n瓦拉内\t82973\n红光\t82974\n吴伯箫\t82975\n备忘\t82976\n祥林嫂\t82977\n内斯塔\t82978\nFriedman\t82979\n诸暨市政府\t82980\n岸价\t82981\ndate-picker\t82982\n欧瑞康\t82983\n鳍\t82984\n公司管理规章制度\t82985\n贝壳鞋\t82986\n天心区\t82987\nPVC-U\t82988\n洛阳法院\t82989\n手拿包\t82990\n简阳中学\t82991\n阿谀奉承\t82992\nSEO自媒体\t82993\nSwift语言\t82994\nFitting\t82995\n高安市\t82996\nRobert\t82997\n绵羊皮\t82998\n去旅游\t82999\n8次\t83000\n焚尸\t83001\ncontinuum\t83002\nnetkeeper\t83003\n柏林机场\t83004\n绵阳市\t83005\n教学区\t83006\n许志安\t83007\n魔刹\t83008\n青鹏\t83009\n李彤彤\t83010\n严以律己\t83011\n杨澜访谈录\t83012\nnba2k17\t83013\n梦游天姥吟\t83014\nSTORY\t83015\n乙肝病毒携带者\t83016\n沙色\t83017\n即用\t83018\n正军\t83019\nvex\t83020\n伏羲山\t83021\nBunker\t83022\n时尚潮\t83023\n黄建新\t83024\n寄意\t83025\n吉高\t83026\n大众银行\t83027\n自选题\t83028\n等一下\t83029\nCastel\t83030\n5000平米\t83031\n路德\t83032\nMahout\t83033\n乱糟糟\t83034\n液晶监视器\t83035\n减退\t83036\nmagicbook\t83037\n南宝\t83038\n电企\t83039\n锐器\t83040\n光能\t83041\n紧缚\t83042\n水风\t83043\n用习\t83044\n民政局婚姻登记处\t83045\n红领集团\t83046\n老单\t83047\n考友\t83048\n晋发\t83049\n八闽\t83050\n5600\t83051\n大波妹\t83052\n恶少\t83053\n开放式体素沙盒\t83054\n山东鲁能智能
技术有限公司\t83055\n淡水湖\t83056\n吴佩\t83057\n顶煤\t83058\n陈慧娴\t83059\n金蝶kis\t83060\ntheforest\t83061\n把控\t83062\n奇飞知识网\t83063\n分数除法\t83064\n進撃\t83065\nPyqt\t83066\n朴麦妮\t83067\nrc\t83068\n第三十一次\t83069\n阳光家园\t83070\n崩坏学园\t83071\nAngeles\t83072\n600977\t83073\n吉诃德\t83074\n罐壁\t83075\nqilai\t83076\n3gs\t83077\n黑翼\t83078\n谷道\t83079\n关节点\t83080\n科技股份有限公司\t83081\nSQL语句\t83082\nS8050\t83083\nprocessors\t83084\n终结者:创世纪\t83085\n海盗船k70rgb\t83086\n少年骇客\t83087\n东莞理工\t83088\nK线图\t83089\n活力板\t83090\n梦见\t83091\nmet\t83092\n猎凶风河谷\t83093\n2觉\t83094\n半兽\t83095\n琥珀酸脱氢酶\t83096\n高行\t83097\n河北教育考试院\t83098\n飞行符\t83099\n悦来镇\t83100\n三甲基硅基\t83101\nak12\t83102\n西游手游\t83103\n吃饭前\t83104\n前3分钟\t83105\n琵琶腿\t83106\n三任\t83107\n吴政隆\t83108\n精要\t83109\n凤凰传奇\t83110\n肉牛犊\t83111\nEDG战队\t83112\n复旦经院\t83113\n_马自达6论坛\t83114\n伊水\t83115\niSCSI\t83116\n派瑞松\t83117\n师妹们\t83118\n供液\t83119\n插床\t83120\nds1302\t83121\n续费\t83122\n月老灵\t83123\n真人照\t83124\n视频流\t83125\n第三十条\t83126\n县人民法院\t83127\ng603\t83128\n第二十四次\t83129\n陈集镇\t83130\n八宅\t83131\n栋笃\t83132\n大中镇\t83133\n城市新区\t83134\n3DOne\t83135\n黛博拉\t83136\npert\t83137\n狮身人面\t83138\n死宫\t83139\nQRP\t83140\n卡拉卡\t83141\nY1\t83142\n天結\t83143\nPresent\t83144\n稀缺\t83145\n福建农林大学\t83146\nBeatBox\t83147\n略胜一筹\t83148\n万松园\t83149\n镇江市财政局\t83150\n老庙\t83151\n哈投股份\t83152\n双缸\t83153\nAnime\t83154\n冬阳童年骆驼队\t83155\n税审\t83156\nj6s\t83157\n380g\t83158\n91Job\t83159\nronald\t83160\n士力\t83161\n1米\t83162\n牙釉质\t83163\n嵌套类\t83164\n400多个\t83165\n广东科技报数字报\t83166\n新北师大\t83167\nirst\t83168\nhbuilder\t83169\n敬拜\t83170\n炒鸡\t83171\n南京市第二十九中学\t83172\nUNITEK\t83173\n一默\t83174\n整日\t83175\n卫浴展\t83176\n上海前滩\t83177\n楚云暮\t83178\n歌谱网-简谱网\t83179\n旺山\t83180\n卡拉马佐夫兄弟\t83181\n苏财\t83182\n官宦\t83183\nshenzhen\t83184\n徐慧\t83185\n暴击流\t83186\n骑马与砍杀维京征服\t83187\n瞄上\t83188\n斯拉\t83189\npark\t83190\nQuadro\t83191\n2018款\t83192\n秦红\t83193\n包抄\t83194\n诱受\t83195\n叉烧网\t83196\nsgf\t83197\n指环王2\t83198\n芳树\t83199\n暨阳网\t83200\nClang\t83201\n嘉陵江\t83202\n低密\t83203\n野柳\t83204\n北京装修公司\t83205\n打滑\t83206\n残害\t83207\n淮师新闻网\t83208\nmfd\t83209\necc\t832
10\n不干胶材料\t83211\nChrono\t83212\n万隆\t83213\n勇者斗恶龙英雄2\t83214\ngs65\t83215\ngubawebapi.eastmoney.com/\t83216\n张纪南\t83217\n国家体育总局体育彩票管理中心\t83218\n梯形图\t83219\n赵天\t83220\n省委办公厅省政府办公厅\t83221\n武昌\t83222\n玫瑰痤疮\t83223\n蛤\t83224\n东革阿里\t83225\n光绪帝\t83226\n牙桩\t83227\nvertex\t83228\n1000片\t83229\nGree\t83230\ncin\t83231\nPewDiePie\t83232\n调查项\t83233\n犀浦\t83234\n童薇\t83235\n民生路\t83236\n甲硫氨酸\t83237\n郑徐高铁\t83238\n幼升小\t83239\n田中瞳\t83240\n张帆\t83241\n尚阳\t83242\nDiverse\t83243\n600699\t83244\nCrysis\t83245\n小端\t83246\n反恐怖主义\t83247\ndraculav\t83248\nkeynote\t83249\nfailure\t83250\n四妙丸\t83251\n黄牌\t83252\n快播案\t83253\n西浦\t83254\n600635\t83255\n过站\t83256\n动态类\t83257\n皇兄\t83258\nwelcome-file\t83259\n干养\t83260\n四方区\t83261\n福建新闻网\t83262\n盟员\t83263\n经筒\t83264\n濒危动物\t83265\n2017年3月12日\t83266\n6212\t83267\n邪神\t83268\n正当性\t83269\nSelect\t83270\n蕙\t83271\nvaio\t83272\n刘晓辉\t83273\n启辰D60\t83274\n诺富特\t83275\n六支\t83276\n9例\t83277\n上海微创\t83278\nhvac\t83279\n三新\t83280\nChamps\t83281\n混身\t83282\n多度网-帝都网\t83283\n万科柏翠园\t83284\nIntellJ\t83285\n逆反\t83286\n牛头梗\t83287\n烈爱\t83288\n新蓝网\t83289\n西野加奈\t83290\nL2TP\t83291\n射芯机\t83292\n4口\t83293\n苏33\t83294\n中车株机\t83295\n鬼舞步\t83296\n第8张\t83297\nMIYAKE\t83298\n闺蜜\t83299\nNevada\t83300\n尼米兹级\t83301\n桜都\t83302\n百会穴\t83303\n通俗易懂\t83304\n恒大御府\t83305\n施耐庵\t83306\n200场\t83307\n淋浴\t83308\n3月23\t83309\n178英雄联盟\t83310\n天之蓝\t83311\n血常规\t83312\n安庆西\t83313\n蛋面\t83314\nstud\t83315\n慰器\t83316\n特殊教育专业\t83317\n结案率\t83318\n萧淑妃\t83319\n渗液\t83320\n父线程\t83321\n玛鲁娜\t83322\n泰拉瑞亚pe\t83323\nwww.110down.com\t83324\n娣\t83325\n芯\t83326\nP4\t83327\ngongyi\t83328\n特别提款权\t83329\n悠悠资源网\t83330\n粤英\t83331\n不分\t83332\n数据脱敏\t83333\n思恋\t83334\n喵子\t83335\nNH2\t83336\n李小鹏\t83337\n卡迪夫\t83338\nREAD\t83339\n社卡\t83340\n连铸机\t83341\n27017\t83342\nbilib\t83343\n搜游\t83344\n翰墨\t83345\n马可波罗花\t83346\n60\t83347\nsorted函数\t83348\nDB21\t83349\n湛江火车站\t83350\n陌生\t83351\nml350\t83352\n座上\t83353\n赚友\t83354\nbbm\t83355\n湮没\t83356\n魅族Pro5\t83357\nYY4480影院\t83358\n120天\t83359\n化学工业园\t83360\n2.7%\t83361\n90000元\t83362\n李卫平\t8336
3\n断桥铝窗\t83364\n银行间同业拆借利率\t83365\nDUORA\t83366\n上半叶\t83367\nstudent类\t83368\n岩石\t83369\n杉菜\t83370\n社会主义民主政治\t83371\nchy\t83372\nNik\t83373\n第10场\t83374\n摩西罗伊\t83375\nphev\t83376\n法律讲堂(文史版)\t83377\n加利福尼亚大学\t83378\n香橙派\t83379\n奇字\t83380\nRena\t83381\na9vg\t83382\n竹胶板\t83383\n嘴帽\t83384\n保鲜盒\t83385\n奥利司他胶囊\t83386\n第二十集\t83387\n腰垫\t83388\n超速\t83389\n复文\t83390\n荧星\t83391\n广西事业单位考试网\t83392\n分派\t83393\n朱由校\t83394\n食品安全网\t83395\n大团结\t83396\n野外\t83397\n武汉天河机场\t83398\nLaw360\t83399\n雷克萨斯es300h\t83400\n百草枯\t83401\n外来语\t83402\n草集\t83403\n坚朗五金\t83404\nFlag\t83405\n珠穆拉玛峰\t83406\nA50\t83407\nINDUSTRY\t83408\n曲黎敏\t83409\n樵坪山\t83410\n非线性方程\t83411\n山药片\t83412\n城景\t83413\n王欢\t83414\n舜宇\t83415\n不破\t83416\n鹿角\t83417\n黄韵玲\t83418\nInstantiate\t83419\n12-2月\t83420\n潘越云\t83421\nGem\t83422\n筹资\t83423\n畅秀阁\t83424\n武汉国博\t83425\nGaussian\t83426\n锅包肉\t83427\n青果\t83428\n丁烷\t83429\n开源电子网\t83430\n圆珠笔画\t83431\nstrains\t83432\n人情冷暖\t83433\n吴敏兰\t83434\n细菌感染\t83435\n肉松面包\t83436\namid\t83437\n城市化\t83438\n圣地安列斯\t83439\n大唐无双手游\t83440\n复方利多卡因乳膏\t83441\n150篇\t83442\n道游互娱\t83443\n中国第一历史档案馆\t83444\n80后吧_\t83445\npvst\t83446\n恒隆\t83447\n百度网盘提取码\t83448\n浮窗\t83449\n油炸糕\t83450\n南方姑娘\t83451\n尹琊\t83452\n本意\t83453\nOzone\t83454\n访问页\t83455\n双宝\t83456\n保定清苑\t83457\nnsdata\t83458\n李吉林\t83459\n马运\t83460\nsoliworks\t83461\n中国财政经济出版社\t83462\n祖马\t83463\nKKS\t83464\n浐河\t83465\n20150716\t83466\n易果网\t83467\n煤矿工人\t83468\n2016.2.3\t83469\n威士忌酒\t83470\nWineHQ\t83471\n国土资源部\t83472\n太空袋\t83473\n裂开\t83474\np400\t83475\n薛涌\t83476\n_海盐政府网\t83477\nvray\t83478\n小番\t83479\n360文件恢复\t83480\n焰影\t83481\nconner\t83482\n中国通信学会\t83483\n小野兽\t83484\nbne\t83485\n七芒星\t83486\nmultisim14\t83487\n创意产业园\t83488\n除掉\t83489\n7609\t83490\n伊丽丝\t83491\n芭堤雅\t83492\n300l\t83493\n岱山\t83494\n刹车灯\t83495\n16.0\t83496\n旅途\t83497\n看破红尘\t83498\ncody\t83499\n喝开\t83500\nmybait\t83501\n方正县\t83502\n东葛路\t83503\n小南街\t83504\n用友时空\t83505\n参公\t83506\n250km\t83507\nthirsty\t83508\ndnat\t83509\nmoc\t83510\n时线\t83511\nHen\t83512\n净网2018\t83513\n三百六十度\t83514\n床裙\t83515\n内存芯片\t83
516\n纯碱\t83517\n黄袍怪\t83518\n季建业\t83519\n钝角\t83520\n班杰明\t83521\n金吉拉\t83522\n土生\t83523\n车圈\t83524\nShaker\t83525\n第一医院\t83526\n德坤\t83527\n九州娱乐\t83528\n马明哲\t83529\nming\t83530\n市司法局\t83531\n18000元\t83532\n2017年6月18日\t83533\n温文尔雅\t83534\n愁绪\t83535\n中共中央关于全面推进依法治国若干重大问题的决定\t83536\n孙震\t83537\n电子课\t83538\n益生股份\t83539\ninvoice\t83540\n太华镇\t83541\n妖妖铃\t83542\nasks\t83543\n珍妮\t83544\nfpa\t83545\n王步\t83546\n建康\t83547\n虾蛄\t83548\n葡萄糖酸钙口服液\t83549\n奔驰E200\t83550\nXIV\t83551\n5000_\t83552\n文哥\t83553\n炉温测试仪\t83554\n奶啤\t83555\n测绘师\t83556\n经营期\t83557\n南方农村报\t83558\nwind10\t83559\n控释肥\t83560\n湟里镇\t83561\n硅氧\t83562\n34寸\t83563\n3.35\t83564\n豪恩\t83565\n应然\t83566\n王延\t83567\n孙阳\t83568\n毕业论文知网\t83569\n歌识曲\t83570\nぼ\t83571\n1.0版\t83572\nOptiStruct\t83573\n中国龙泉_龙泉市政府\t83574\n暴发\t83575\n老姜\t83576\n蜜芽宝贝\t83577\n华为MATE7\t83578\n二万五千\t83579\n悬疑剧\t83580\n宋璟\t83581\n格蕾丝\t83582\n7方\t83583\n哐\t83584\n第七十四条\t83585\n爱搞机\t83586\n内腔\t83587\n大阳线\t83588\n1368个\t83589\n厌\t83590\n杨浦区小学\t83591\nWadeXu\t83592\n5616\t83593\n李秀珍\t83594\nWord公式编辑器\t83595\n菏泽市人民政府\t83596\n是由\t83597\nCID\t83598\nkb4012215\t83599\n鹅岭公园\t83600\n向山\t83601\n犬粮\t83602\n欲望酒店\t83603\n张乐\t83604\n常山赵子龙\t83605\n快要\t83606\n洗脸器\t83607\n长卿\t83608\n画外音\t83609\n分担率\t83610\n服食\t83611\nfirstname\t83612\n龙斌\t83613\n专卖\t83614\nMINING\t83615\n校作\t83616\n范本\t83617\n旅游规划\t83618\n刘晋元\t83619\nSlap\t83620\n劳保用品\t83621\n静脉炎\t83622\n吴雁泽\t83623\n超旺\t83624\n客控\t83625\nTVN\t83626\n各户\t83627\n黄麒英\t83628\n云邮\t83629\nmpu9250\t83630\n邢台一中\t83631\n电热水器\t83632\ncensored\t83633\n1958年\t83634\n学步园\t83635\n写行\t83636\n美澳\t83637\n创意型\t83638\n江苏省农科院\t83639\n充气袋\t83640\n保卫萝卜3公园\t83641\n高斯投影\t83642\nCDlinux\t83643\n女娲娘娘\t83644\n24瓶\t83645\n显得\t83646\n百度关键词排名\t83647\nOSRAM\t83648\n3648\t83649\n2000点\t83650\n礼拜堂\t83651\n王黎\t83652\n五一套\t83653\nLoopback\t83654\n申杰\t83655\n社保卡电脑号\t83656\nellis\t83657\nInstallation\t83658\n肯尼迪\t83659\n物料架\t83660\nUlinix\t83661\n圣遗物\t83662\n新励\t83663\nsoutheast\t83664\n甜皮鸭\t83665\n动态域名\t83666\n三十块\t83667\n畅玩5x\t83668\n北海亭\t83669\n非负数\t83670\n惠友\t
83671\n悬命\t83672\n__17173\t83673\n保守派\t83674\nawt\t83675\n新镇\t83676\n形式\t83677\n土坯\t83678\n编辑者\t83679\n利伐沙\t83680\n基恩\t83681\n错例\t83682\n5险一金\t83683\n蓝图纸\t83684\n中古市场\t83685\n第3款\t83686\n写意画\t83687\n避仇\t83688\n告罄\t83689\n云南楚雄网\t83690\n跛豪\t83691\n暨南大学深圳旅游学院\t83692\n坠子\t83693\n环检\t83694\nddraw\t83695\n耦合剂\t83696\nAES128\t83697\n后进式\t83698\nappletv\t83699\n膳食宝塔\t83700\n3157\t83701\n子肉\t83702\n富越\t83703\nCNB\t83704\n2008年12月\t83705\n张果\t83706\n温泉池\t83707\n壁挂\t83708\nSept\t83709\nTinywan\t83710\n载货汽车\t83711\n下川\t83712\n双眼皮贴\t83713\n富可视\t83714\n凝集\t83715\nuiautomator2\t83716\n出屏\t83717\n红桃娱乐\t83718\n东京战记re\t83719\n李雪主\t83720\n近20年\t83721\nGotta\t83722\nVPX\t83723\n豪门童养媳\t83724\njiaoan\t83725\nalive\t83726\n1633\t83727\n龙泉驿区委\t83728\n桩基\t83729\ncheckout\t83730\n本朝\t83731\n传物易\t83732\n跨膜运输\t83733\n无人售货机\t83734\n上海漫展\t83735\n变了\t83736\n头婚\t83737\n中央银行\t83738\n议息\t83739\n血府逐瘀胶囊\t83740\n板锯\t83741\n杨晓军\t83742\n律宗\t83743\n齐泽克\t83744\nmc天佑\t83745\nUDID\t83746\n心肠\t83747\n明天下午\t83748\n四化\t83749\n下一个月\t83750\n碟中谍1\t83751\nenfamily\t83752\n南极科考站\t83753\n小多\t83754\n奔驰红旗l5\t83755\n牙口\t83756\n馆校\t83757\n修正\t83758\n保利麓谷\t83759\ngbdt\t83760\n杨玉环\t83761\nConsortium\t83762\n周立波\t83763\n张营\t83764\n无线控制器\t83765\n丁达尔\t83766\nFreeBird\t83767\nfx63vd\t83768\n366号\t83769\n利益相关者\t83770\n第30次\t83771\n中国一汽\t83772\n虹膜\t83773\n水浒揭秘\t83774\n胜\t83775\nInvisible\t83776\n闷\t83777\n东莞东站\t83778\nOutfit\t83779\nzwz\t83780\n三咲恭子\t83781\n上饶火车站\t83782\nNER\t83783\nseating\t83784\nxfplay\t83785\n特写镜头\t83786\nIDENTITY\t83787\nLMS\t83788\n李子明\t83789\nVC2005\t83790\n汇邦\t83791\n洪客隆\t83792\n天视\t83793\n紫光扫描仪\t83794\n配搭\t83795\n手动车\t83796\n稳压器\t83797\n尿频\t83798\n单位格\t83799\n15319113469\t83800\n失落沙洲\t83801\n历下\t83802\n转管\t83803\n好伦哥\t83804\n王文彬\t83805\n万能地图下载器\t83806\n大小金\t83807\n咸祥镇\t83808\nXOXO\t83809\n虎鲨\t83810\normlite\t83811\n大亚\t83812\n_瓯网\t83813\nnpi\t83814\nzhushou\t83815\n衣裙\t83816\n越野摩托车\t83817\n人造草\t83818\n胜似\t83819\n广西财经学院\t83820\n中国民党\t83821\n优艺\t83822\nonlylady论坛\t83823\n走私罪\t83824\n回音\t83825\n洪堡\t83826\n
配型\t83827\n暗黑版\t83828\n满脸\t83829\n姚军\t83830\n方鼎\t83831\n异业\t83832\n精饰\t83833\n汉尼斯\t83834\n花板\t83835\n配电室\t83836\n总督\t83837\n36\t83838\nCIT\t83839\n卡拉胶\t83840\n签证机\t83841\n学舌\t83842\n报名照\t83843\n包办\t83844\n婚纱裙\t83845\n亲和源\t83846\n董功\t83847\nSatriani\t83848\n深圳公园\t83849\n家厨\t83850\nFairmont\t83851\n洛阳市机关事务管理局\t83852\n左思\t83853\n金郡\t83854\n百台\t83855\n三个和尚\t83856\n讥讽\t83857\nGuest\t83858\n娱乐猛回头\t83859\n不改名\t83860\n6140\t83861\n嘉和\t83862\n第95章\t83863\n易碎\t83864\n宝鸡高新区\t83865\ncontribute\t83866\n重在\t83867\n高仕\t83868\n赛伯乐投资集团\t83869\n细菌\t83870\n静安区幼儿园\t83871\n暗夜使者\t83872\n硬体\t83873\n北京东城\t83874\nRao\t83875\n坎门\t83876\n刘铁男\t83877\n计生办\t83878\n假奶\t83879\n花千骨\t83880\ndagger\t83881\nmanytoone\t83882\n3M\t83883\nMASS\t83884\n对数\t83885\n买一赠一\t83886\n熵\t83887\n注入器\t83888\n落槌\t83889\n木枕\t83890\n密使\t83891\n中国建筑史\t83892\nb2b.cn\t83893\nweapp\t83894\n对散\t83895\n秘蜜\t83896\n百分之4\t83897\n南日岛\t83898\n双氯芬酸钠缓释胶囊\t83899\n南昌站\t83900\n淋室\t83901\n屁股\t83902\n斗山\t83903\n蒙语\t83904\n凯瑟琳\t83905\nPina\t83906\n非线性回归模型\t83907\n蛙泳\t83908\n雨墨轩痕\t83909\n肥业\t83910\n凤凰视\t83911\n矿权\t83912\n兰州日报\t83913\n非堆\t83914\nppn\t83915\n开页\t83916\n裂变式\t83917\n6.00\t83918\n12.15\t83919\n鱼松\t83920\n联投\t83921\n仅仅\t83922\n周玉\t83923\nFun\t83924\n枪栓\t83925\n亚新\t83926\n国安队\t83927\n县直机关工委\t83928\n北游\t83929\n平阴县\t83930\n化学药制剂\t83931\nur22\t83932\n前3天\t83933\n终末的后宫\t83934\n万一保险网\t83935\n陈英杰\t83936\n月子期\t83937\n赵晗\t83938\nGOLD\t83939\n12.07\t83940\n大觉寺\t83941\n斯温\t83942\n萨尔达传说\t83943\n中国电力出版社\t83944\n上财\t83945\n8.0.50\t83946\n逼停\t83947\npopup\t83948\n休城\t83949\n昆山市区\t83950\nXDebug\t83951\nHam\t83952\nInsane\t83953\n七日间\t83954\n葫芦丝\t83955\n灌浆\t83956\n李涯\t83957\n金玉章\t83958\n任人唯贤\t83959\n辑\t83960\n初会\t83961\n苍南政府网\t83962\nodp\t83963\n浑浑噩噩\t83964\n问话\t83965\n番杏\t83966\n動漫\t83967\n远中\t83968\nXIAMEN\t83969\n尼古丁\t83970\n爬梯\t83971\n智能家居\t83972\n李忆如\t83973\nduly\t83974\n刘晴\t83975\n反冲洗\t83976\n彩虹堂\t83977\n潜口\t83978\n第一瓶\t83979\n黄道龙\t83980\n恒洁\t83981\n栈式\t83982\n赫美集团\t83983\n输卵管积水\t83984\n7370\t83985\n孝感市\t83986\n低平板半挂车\t83987\n安顿\t83988\ns
olo\t83989\n高分子聚合物\t83990\ncxc\t83991\nscary\t83992\n陈忠和\t83993\n成像仪\t83994\n李克\t83995\n巴歌\t83996\n央视体育频道\t83997\n软路\t83998\n徐茂公\t83999\n刘睿\t84000\n宝馨科技\t84001\n流光字\t84002\n抒情性\t84003\n3x^2\t84004\n布朗族\t84005\n癸酉日\t84006\n诚恳\t84007\ndek\t84008\n思美人\t84009\ndee\t84010\n硫醚\t84011\nH片\t84012\nsora\t84013\n宋国\t84014\nstadio\t84015\n乐酷\t84016\npands\t84017\n江南梦\t84018\n利润分配表\t84019\n大红枣\t84020\n100F\t84021\n主任务\t84022\nMRA\t84023\n娇娃\t84024\n狗獾\t84025\nrhcs\t84026\n结算资金\t84027\n沈阳市环保局\t84028\nvray3.4\t84029\n杰园\t84030\n天斩煞\t84031\nneutral\t84032\n汉语\t84033\n美国博物馆\t84034\nv7.6\t84035\n钢轴\t84036\n卞之琳\t84037\n盼盼木门\t84038\ndell\t84039\n滕子京\t84040\n风起\t84041\n乌龙院之活宝传奇\t84042\n直筒裤\t84043\nelasticserach\t84044\n罗技g302\t84045\n周宸佳\t84046\n上师大一附小\t84047\nPt\t84048\n美珍香\t84049\n老佛爷\t84050\nObsession\t84051\n云课堂\t84052\n张露\t84053\n王立平\t84054\nHelper\t84055\n中国材料网\t84056\nMui\t84057\n花旗松\t84058\n3.00.08\t84059\n冲孔机\t84060\n燕王\t84061\nsensing\t84062\nPsd\t84063\n张露萍\t84064\naliba\t84065\nriot\t84066\n紫女\t84067\n砖块\t84068\n国级\t84069\nmeteor\t84070\nmayavi\t84071\n擁\t84072\n亿成\t84073\n票种\t84074\n性具\t84075\n紧身牛仔裤\t84076\n咪咪虾条\t84077\ndiscu\t84078\n间谍之桥\t84079\n新联学院\t84080\nProbably\t84081\n侠士\t84082\n九洲港\t84083\n井巷\t84084\n巡回展\t84085\n深圳港\t84086\nyolov3\t84087\n雾蒙蒙\t84088\n1.6MT\t84089\njixue\t84090\n宛\t84091\nstardewvalley\t84092\nLV5\t84093\n乘积\t84094\n三角铁\t84095\n厦门大学法学院\t84096\n豆瓣网\t84097\n人局\t84098\nduboo\t84099\npopulate\t84100\n小顺\t84101\nwasapi\t84102\ncontro\t84103\nsxt\t84104\n益阳花鼓戏\t84105\n紫金保险\t84106\nCoachella\t84107\nFEMA\t84108\n组织力\t84109\n杨军\t84110\n饴\t84111\n雪球\t84112\n3.so\t84113\n刘韧\t84114\n男多女\t84115\n知豆D2\t84116\nAxure\t84117\n边盖\t84118\nLOVELY\t84119\nolder\t84120\n18天\t84121\n访\t84122\n钟山风景区\t84123\n填图\t84124\n工程造价专业\t84125\n成交率\t84126\n积跬步\t84127\n中小\t84128\nled吸顶灯\t84129\n恭亲王\t84130\n莲心\t84131\n镇海数字报\t84132\n恩施大峡谷\t84133\n梁山传奇\t84134\n供应处\t84135\n兰溪之窗\t84136\n266号\t84137\n离婚\t84138\n夺宝传世\t84139\n跳甲\t84140\n聚氨酯\t84141\n空之轨迹SC\t84142\n第十篇\t84143\n北斗神拳\t84144\n占庭\
t84145\n丁嫣\t84146\n北京市工商行政管理局\t84147\n铅价\t84148\n杠次\t84149\n郑妍\t84150\n限房\t84151\n人力社保局\t84152\nloushi\t84153\n华荣科技股份有限公司\t84154\n人教版六年级语文\t84155\n杀尽\t84156\n玖龙玺\t84157\n止鼾\t84158\n深远\t84159\n3bt\t84160\nxlturing\t84161\n克利\t84162\n造化灵魂\t84163\n封腾\t84164\nArashi\t84165\n打印机\t84166\n韩寒怼\t84167\n皮圈\t84168\n陈家镇\t84169\n焚尸案\t84170\ninduce\t84171\nJAVAWEB\t84172\nlakeshore\t84173\n大盘鸡\t84174\n1英里\t84175\n蒋大为\t84176\n雅马哈福喜\t84177\n卡培\t84178\nad10\t84179\n暗黑破坏神\t84180\nclound\t84181\n否决制\t84182\n手工\t84183\nPls\t84184\n凡尔赛\t84185\n马晓天\t84186\npob\t84187\n建华\t84188\n马巷\t84189\nphilipp\t84190\n爱藏网\t84191\n36条\t84192\nfont-weight\t84193\n陈瞎子\t84194\n平江县\t84195\nsavewizard\t84196\n赵帅\t84197\n遵义市\t84198\n书香世家酒店\t84199\n松鹤墓园\t84200\nOrientation\t84201\nX型\t84202\n千阳县人民政府\t84203\n格拉斯哥\t84204\n表层\t84205\n22世纪\t84206\n罗家宝\t84207\n练耳\t84208\ndevelope\t84209\nkuangbin\t84210\nnewtab\t84211\npronoun\t84212\nRen\t84213\n河北政协\t84214\n大气污染物\t84215\n酒剑\t84216\nMocha\t84217\n读着\t84218\n属地\t84219\n蒙哥马利\t84220\n公子天\t84221\n绝色总裁爱上我\t84222\n宣钢\t84223\nmixly\t84224\n自由麦\t84225\nwinsock\t84226\n翼装\t84227\n转换方\t84228\n三音\t84229\nAp\t84230\n焦作法院\t84231\nTVT\t84232\n辽宁出版集团\t84233\n西南路\t84234\n新药\t84235\n河南省政协\t84236\n暨南大学研究生院\t84237\n木漆\t84238\n神医废材妃\t84239\n太阳军卫\t84240\nFoxmail7\t84241\n国女孩\t84242\n受击\t84243\n园芳宝贝\t84244\n483\t84245\n1063\t84246\n8排\t84247\nzalo\t84248\n钱江Benelli\t84249\nfgowiki\t84250\n兰寇\t84251\nlnternet\t84252\n观鱼\t84253\n七堡\t84254\n道可道\t84255\n神乎其神\t84256\nPL2303\t84257\n8.com.cn\t84258\n张霖\t84259\n宇视科技\t84260\n刘南昌\t84261\n这一句\t84262\n历届\t84263\n大军区\t84264\n李翊君\t84265\n丰原\t84266\n汽球\t84267\n衰\t84268\n瓦岗山异闻录\t84269\n补交\t84270\n堂皇\t84271\n五下\t84272\n天鹅绒\t84273\n小七\t84274\n棋院\t84275\nK450\t84276\n天猫女王节\t84277\n二〇二〇年\t84278\nArticulate\t84279\n中共十三届四中全会\t84280\n亚心之都\t84281\n锐仕方达\t84282\n简宁\t84283\n26.0.2\t84284\n叶聪\t84285\n噬\t84286\n识骨寻踪\t84287\nSeaside\t84288\n重新设计\t84289\nel-input\t84290\n明石\t84291\n文版\t84292\n武义县\t84293\n掉队\t84294\n聚羧酸减水剂\t84295\n淮矿\t84296\n集成版\t84297\n亭亭玉立\t842
98\n先秦两汉\t84299\nc/c++\t84300\nstuffed\t84301\nvalueof\t84302\n100.5\t84303\n探监\t84304\nMagician\t84305\nUnlocked\t84306\n八亩\t84307\n运行库\t84308\n存钱\t84309\n入帐\t84310\n舒兰市\t84311\n明洁\t84312\n大电\t84313\npsasp\t84314\n球道\t84315\n重生之将门毒后\t84316\n龙象般若功\t84317\n收益性\t84318\nEmbed\t84319\n水手\t84320\n不管\t84321\ntrusty\t84322\n特味\t84323\n晒后\t84324\n鱼形\t84325\nhomes\t84326\nMODELS\t84327\n王者荣耀干将莫邪\t84328\n正比\t84329\n黄小米\t84330\nILS\t84331\n强哥\t84332\n琴颈\t84333\n偷笑\t84334\n监护\t84335\n上热\t84336\n腥风血雨\t84337\nscaffold\t84338\n郑州市住房保障和房地产管理局\t84339\n姜增伟\t84340\n11时\t84341\n喷丝\t84342\n亲笔\t84343\nckr100\t84344\n69路\t84345\n澳城\t84346\n气压椅\t84347\n清华大学苏州汽车研究院\t84348\n爱抚\t84349\nyilia\t84350\n金靴\t84351\n十梅庵\t84352\n铲车\t84353\n乌克\t84354\n奈奈与薫\t84355\n31cm\t84356\n萨索洛\t84357\nI期\t84358\nlal\t84359\nР\t84360\n休闲风\t84361\n击毙\t84362\nB套\t84363\n称骨算命法\t84364\n流通率\t84365\nMesh\t84366\nPNY\t84367\n小骚\t84368\n浙北\t84369\n康宝\t84370\n英雄篇\t84371\n拉法基\t84372\n舞蹈裤\t84373\nisdigit\t84374\nsystemtap\t84375\n开光\t84376\n武汉政府网\t84377\n水蛇\t84378\n二八定律\t84379\n_艺龙\t84380\n智能图书馆\t84381\n达特茅斯学院\t84382\n少儿歌曲\t84383\n3叶\t84384\n什锦菜\t84385\n济宁教育网\t84386\n达官营\t84387\n吊脚\t84388\nCentrino\t84389\n贞丰县\t84390\n政通路\t84391\n卧龙湾\t84392\n童鞋们\t84393\nA35\t84394\nKAYAK\t84395\n张敞\t84396\n五九\t84397\n一马平川\t84398\n2008\t84399\n20170501\t84400\n猜词\t84401\n蓝晶石\t84402\n行贿罪\t84403\ntianshi\t84404\n井井\t84405\n恋姬无双\t84406\n高达vs高达\t84407\n述职述廉述法\t84408\n民政\t84409\n五常\t84410\n南开大学滨海学院\t84411\nb250\t84412\n65.0.3325.146\t84413\n赫尔辛基机场\t84414\ncronExpression\t84415\nVera\t84416\n乳酸钠\t84417\n民主化\t84418\n性功能\t84419\n王建林\t84420\n田辉\t84421\nb币\t84422\n90层\t84423\n八分半\t84424\n流行色\t84425\n手机助手\t84426\n民主政治\t84427\n石湖村\t84428\n三八路\t84429\n15小时\t84430\n邪魔\t84431\nSPC\t84432\n天琪\t84433\n雄花\t84434\n三爻\t84435\n吉祥中国年\t84436\n地震荷载\t84437\n4399火影忍者\t84438\n怒海\t84439\n教不改\t84440\n四君子\t84441\n1:18\t84442\n半季\t84443\n戒心\t84444\n我不是药神\t84445\nlapp\t84446\n东莞东\t84447\n金杜律师事务所\t84448\n5130\t84449\n漫威公司\t84450\ncoldplay\t84451\n玉溪市人民政府\t84452\nIndustry\t84
453\n深七\t84454\n米龙\t84455\n马车8\t84456\nbounce\t84457\n张玮玮\t84458\nyouy\t84459\n微蒸\t84460\n制曲\t84461\n姜黎黎\t84462\nApp端\t84463\n500立方\t84464\n上海外滩\t84465\n最好苗疆蛊事\t84466\nmotoz\t84467\n脚膜\t84468\n争辉\t84469\n芭比\t84470\n微姐\t84471\n地票\t84472\n一流\t84473\n耳鼻\t84474\n青特\t84475\n纸件\t84476\n雅培奶粉\t84477\nlexi\t84478\n猝灭\t84479\n人民日\t84480\n窝窝头\t84481\n拉动\t84482\n杜琪峰\t84483\ntransmitted\t84484\n无求\t84485\nangela\t84486\n绍兴路\t84487\n祭师\t84488\npotplayer播放器\t84489\n杨龙\t84490\n7.5千瓦\t84491\nlimp\t84492\n思修课\t84493\nAnimelo\t84494\n高深\t84495\n第一百零三章\t84496\n阅读率\t84497\nmacc\t84498\ndhv\t84499\nVintage\t84500\n树藤\t84501\n人体工程学\t84502\n茶杯犬\t84503\nCSFB\t84504\nx230i\t84505\n类型片\t84506\n统筹规划\t84507\n接连\t84508\n薄壳\t84509\n吻遍\t84510\n山西财经大学\t84511\nPG279Q\t84512\n2018年4月6号\t84513\nCDP\t84514\n爱库存\t84515\n100MHz\t84516\nFlutter\t84517\npra\t84518\nCaffe2\t84519\n四零\t84520\n成都市城乡房产管理局\t84521\nqq欢乐斗地主\t84522\n超标\t84523\n断了\t84524\n200smart\t84525\n吴小强\t84526\n事业单位工作人员处分暂行规定\t84527\n综合性学习\t84528\n治疗室\t84529\n城门镇\t84530\n宁河县\t84531\n攀附\t84532\n福建省国土资源厅\t84533\n迷你版\t84534\n11.5.1\t84535\n万源市政府网\t84536\n模拟版\t84537\n16首\t84538\n中国人保财险\t84539\n0.6g\t84540\n第五局\t84541\ngr09\t84542\n有源蜂鸣器\t84543\nLOHO\t84544\n血证\t84545\n3u\t84546\n燃眉\t84547\n通顺\t84548\n接杆\t84549\n法秀\t84550\n朱茜\t84551\n贝迪\t84552\n911\t84553\n人资部\t84554\n隔离性\t84555\n换版\t84556\n久石让\t84557\n光宝\t84558\n沪江中学学科网\t84559\n吴应熊\t84560\nBREAKING\t84561\n育儿理论\t84562\n琴棋\t84563\n连续型\t84564\n黑袜\t84565\nchubold\t84566\nw1\t84567\n八仙过海\t84568\n1000颗\t84569\n青建\t84570\n4MB\t84571\n十一国庆节\t84572\n魂斗罗归来\t84573\n副长\t84574\n昂坪\t84575\n新浪广东_新浪网\t84576\n追赃\t84577\n小公交车太友\t84578\n棘爪\t84579\n高勇强\t84580\n大灾难\t84581\n网络机\t84582\n董玮\t84583\n建材网\t84584\n二苯酮\t84585\n小白脸\t84586\n人才培养\t84587\n图包\t84588\n输入输出\t84589\n谱子\t84590\n1300项\t84591\n大带\t84592\n眼睑痉挛\t84593\n倒流香\t84594\n董敏\t84595\nGen8\t84596\n高起专\t84597\n超炮\t84598\n精囊炎\t84599\n青松美光\t84600\n青荣\t84601\n乐奇\t84602\n酷派锋尚\t84603\n离校\t84604\n李善友\t84605\n修改稿\t84606\n杨柯\t84607\n承知\t84608\n隐身侠\t84609\n德必\t84610\n等容\t8
4611\n育才\t84612\n房缘\t84613\n夏龙\t84614\n小米粥\t84615\n6月1日\t84616\n703\t84617\n幽冥路\t84618\n早起\t84619\n十九大发声亮剑\t84620\n北医新闻网\t84621\n中国财政科学研究院\t84622\n微牛牛\t84623\n宫颈涂片\t84624\nLiferay\t84625\n4.9.5.508\t84626\n_九江新闻网\t84627\nセット\t84628\n快与慢\t84629\n多线制\t84630\n苏州科达科技股份有限公司\t84631\n青岛银监局\t84632\n大汉天子\t84633\n文法\t84634\n中原证券\t84635\n思宝\t84636\n左脸\t84637\n陆尘\t84638\n长绿\t84639\nraspi\t84640\n2231\t84641\n周亚辉\t84642\n苏州公司\t84643\n乱世\t84644\n职能\t84645\n6PLUS\t84646\nSeven\t84647\n三星c8\t84648\n浙江省文物局\t84649\niforgot\t84650\n八辆\t84651\n二内\t84652\n柳州市人民医院\t84653\n站起\t84654\n重视\t84655\nb超机\t84656\n不虚此行\t84657\n面孔\t84658\n北校\t84659\n重测\t84660\nRenovation\t84661\n35层\t84662\n赵红兵\t84663\n缺气\t84664\n票选\t84665\n阿震\t84666\n松榆\t84667\n荒川\t84668\n百世汇通\t84669\n诉说\t84670\n模玩网\t84671\n2016年6月1日\t84672\n白沙湾\t84673\nwong\t84674\ng35\t84675\n得出结论\t84676\n指南篇\t84677\n百利天下全球院校库\t84678\n79号\t84679\n苍山县\t84680\n做成\t84681\n氮化合物\t84682\n盐酸二甲双胍片\t84683\n御天降魔传\t84684\n妮妮\t84685\n小吾\t84686\n湛江市教育局\t84687\n稠油\t84688\n社戏\t84689\n囧的呼唤\t84690\n高山镇\t84691\n安度\t84692\n中酒网\t84693\n提起\t84694\n带式过滤机\t84695\n剪切成\t84696\n舞校\t84697\n交融\t84698\n韩宝贝\t84699\n甘棠镇\t84700\n清洁化\t84701\n奉陪\t84702\n色管\t84703\n才人\t84704\n水星\t84705\n大望村\t84706\npackagename\t84707\n开胸\t84708\n奥力\t84709\n变形金刚地球之战\t84710\n王者荣耀武道大会\t84711\n黄瓜花\t84712\n参苓白术散\t84713\nwebpack4.x\t84714\n缓和曲线\t84715\n先秦\t84716\nqueries\t84717\n三模\t84718\nTPLink\t84719\nX奴\t84720\n卡梅隆·安东尼\t84721\n翻录\t84722\n棒棒\t84723\n手机碎屏险\t84724\n二力平衡\t84725\njiujiu\t84726\n立构\t84727\n沈尤格\t84728\n粉王\t84729\n铝料\t84730\n成都七中嘉祥外国语学校\t84731\n道别\t84732\n偶活\t84733\n张涛\t84734\n侠盗高飞\t84735\n高密_高密政府网\t84736\n御龙在天\t84737\n赚了钱\t84738\n西门庆\t84739\n绿化版\t84740\n第116章\t84741\n小黑资源网\t84742\n大套\t84743\n八章\t84744\n桂庙路\t84745\n北辰三角洲\t84746\n赶不走\t84747\nStudies\t84748\n商汇\t84749\n银浆\t84750\nFORWARD\t84751\nserver2005\t84752\n101答疑网\t84753\nTOSHIBA\t84754\n开酒\t84755\n江心岛\t84756\n银行行号查询网\t84757\n配额制\t84758\n滑座\t84759\n7.9英寸\t84760\nhonesty\t84761\nWindows篇\t84762\n爱死\t84763\n瑜米之伽\t84764\nLotto\t84765\n商
之都\t84766\nRAGE\t84767\n毅丝\t84768\n九转金丹\t84769\n中国城市规划协会\t84770\n渔线\t84771\n宁化府\t84772\n罗麦\t84773\n短链\t84774\napplications\t84775\n武安镇\t84776\n吸\t84777\n鹅毛笔\t84778\nsender\t84779\nRock\t84780\n中铁一局集团有限公司\t84781\n合肥滨湖新区\t84782\n户内\t84783\n死神来了\t84784\n衡阳县\t84785\n营业账簿\t84786\n再使\t84787\n第一百一十三章\t84788\ndaf\t84789\n东非大裂谷\t84790\nG120\t84791\n微盟盟\t84792\n版权页\t84793\n恋老小说\t84794\n密歇根大学安娜堡分校\t84795\n孟繁贵\t84796\n孟屯河谷\t84797\nEntire\t84798\n平山\t84799\n他一个人\t84800\n雨夹雪\t84801\n限价单\t84802\n日辰\t84803\n乐文书包网\t84804\ndestructor\t84805\n登录\t84806\n红女巫\t84807\nSantorini\t84808\n要价\t84809\n二甲双胍缓释片\t84810\n510.com\t84811\n郸城\t84812\ncognitive\t84813\n跪求\t84814\nbeastiality\t84815\n补钙\t84816\nUb\t84817\n5.6.6\t84818\n走廊\t84819\nlindy\t84820\n驯鹿\t84821\n议政\t84822\n代码页\t84823\n胃整肠丸\t84824\n系统规划与管理师\t84825\n晴岚\t84826\n锐减\t84827\n东方之星\t84828\n约考\t84829\n中限\t84830\n的牛\t84831\n七彩虹\t84832\n指纹识别器\t84833\n遍野\t84834\n海贼王罗\t84835\n合作类\t84836\n杨强\t84837\n淮南子\t84838\n鲁西\t84839\n横滚\t84840\nAngularJS\t84841\n醇类\t84842\n汪静波\t84843\n摩萨德\t84844\n江苏师范大学\t84845\n心型\t84846\n红雪\t84847\n本硕\t84848\n频程\t84849\n制噪者\t84850\n托物\t84851\n花蕊夫人\t84852\n安徽省人民检察院\t84853\n应欢欢\t84854\n未检\t84855\n百龄\t84856\n每周质量报告\t84857\n舒泰神\t84858\n闯荡江湖\t84859\n注数\t84860\n乐进\t84861\n大宗交易\t84862\nWinCC\t84863\n人力资源开发与管理\t84864\n集中供热\t84865\n诺维萨德\t84866\n丁瑶\t84867\n老挝语\t84868\nchajian\t84869\n教案\t84870\n资政新篇\t84871\n日本京都\t84872\n蚕茧\t84873\n新长\t84874\n零之使魔\t84875\nUINavigationController\t84876\n集成化\t84877\n胯部\t84878\n幸运值\t84879\n厂房招租网\t84880\n陈姗姗\t84881\n反射仪\t84882\n海尔路\t84883\n秒回\t84884\nDonovan\t84885\nz370\t84886\n品牌机\t84887\n杭州消防\t84888\n666号\t84889\n表头\t84890\n免得\t84891\n借命\t84892\n万益广场\t84893\nshoud\t84894\nzhaop\t84895\n尽收\t84896\n节制\t84897\n焦作市教育局\t84898\n七草\t84899\n4399VR网\t84900\nsuch\t84901\n会谈\t84902\n土地使用证\t84903\n批零\t84904\nTest\t84905\n瓦罗兰\t84906\n京安高速\t84907\n弥生\t84908\n厦大数据库实验室\t84909\nmacox\t84910\n中宫\t84911\n香川县\t84912\nnginx反向代理\t84913\n3.1.8\t84914\n内弯\t84915\n日夜\t84916\n念佛机\t84917\n火影忍者忍者\t84918\n纽扣\t84919\n异形契约\t8
4920\n第34集\t84921\n牢不牢\t84922\nlab2\t84923\nths\t84924\n胸器\t84925\n设计型\t84926\n由浅入\t84927\nn55\t84928\nAV3\t84929\n陈猛\t84930\n20歳\t84931\n5.0.6\t84932\n炸\t84933\n姜淑梅\t84934\n斗破苍穹之无上之境\t84935\n女骑\t84936\n代收代付\t84937\n多角度\t84938\northo\t84939\nRANDOM\t84940\n代古拉\t84941\n流水席\t84942\n超级机器人大战Z\t84943\n江湖有道\t84944\n青秀山\t84945\n石齐平\t84946\n鸣志电器\t84947\n8一下\t84948\n农活\t84949\n几千年\t84950\n视频转换器\t84951\n腰花\t84952\n客隆\t84953\n重庆东金\t84954\nkw\t84955\n德贝\t84956\n刘野\t84957\n野德\t84958\n花园银行\t84959\nsql数据库表\t84960\n国医堂\t84961\n生态谷\t84962\n两腿之间\t84963\n撒子\t84964\n4070\t84965\n畸胎瘤\t84966\n3.6万\t84967\nベスト\t84968\n民勤县\t84969\n2批\t84970\n1432\t84971\n专项维修资金\t84972\n舒\t84973\n全球性\t84974\n友元函数\t84975\n盈方微\t84976\n屎\t84977\n小丫\t84978\ncell\t84979\n铸锭\t84980\n红山\t84981\n双星\t84982\n花花公\t84983\n认监\t84984\n担负\t84985\n依存症\t84986\n喷溅\t84987\nmvbox\t84988\n5267\t84989\n发问\t84990\n23甬\t84991\n千恋万花\t84992\n端脑\t84993\n玻璃门\t84994\nyamato\t84995\n回踩\t84996\n山东省委\t84997\n十九个\t84998\n广州东部\t84999\n坚守\t85000\n董艺\t85001\n手机处理器\t85002\n_蠡县\t85003\n勇者斗恶龙joker3\t85004\nC4.5\t85005\n韩片\t85006\n酸钾\t85007\n农业局\t85008\n性质\t85009\n飘飞\t85010\n郭炜炜\t85011\n程远\t85012\n公共汽车\t85013\n五金交电\t85014\n等风等雨我等你\t85015\n同在\t85016\n&quot\t85017\narctany\t85018\n雷火灸\t85019\n11月9日\t85020\n全干\t85021\n金犊奖\t85022\n北京爱迪学校\t85023\nAF\t85024\n范仲淹\t85025\n限制民事行为能力人\t85026\n沙奎尔·奥尼尔\t85027\nHttpServlet\t85028\n三生公司\t85029\n沈阳市图书馆\t85030\n泰兰德\t85031\n盐津县\t85032\n黄豆粉\t85033\n阻挡器\t85034\nancient\t85035\nFocusrite\t85036\n百年前\t85037\nSECTION\t85038\n陆埠镇\t85039\n优乐\t85040\n杜明礼\t85041\n泽库县\t85042\n使命召唤5\t85043\n下堂\t85044\n结节\t85045\n国务院法制办\t85046\n中国农业银行软件开发中心\t85047\n虎林市\t85048\nOCN\t85049\n人间喜剧\t85050\n译注\t85051\n姚来英\t85052\n日本小学\t85053\nImmortals\t85054\n20161106\t85055\n吧唧\t85056\n伸缩管\t85057\n羊尖镇\t85058\niapm\t85059\n下雨了\t85060\n皱眉头\t85061\n保利观塘\t85062\n煅烧高岭土\t85063\n垫巾\t85064\nint32_t\t85065\n租价\t85066\n人间地狱\t85067\n如人生\t85068\n科技界\t85069\n光纤转换器\t85070\n必泰\t85071\n真事儿\t85072\n冲\t85073\nLexicon\t85074\nsolr\t85075\n温病条辨\t85076\n2014-12\t8507
7\n2012年上半年\t85078\nmoli\t85079\n2001款\t85080\n专利案\t85081\n一触\t85082\n成都日报\t85083\n冠礼\t85084\n康\t85085\n心梦无痕\t85086\n糊涂\t85087\n财务专用章\t85088\n河北工业大学研究生院\t85089\njicheng\t85090\n唐派\t85091\nJingle\t85092\n宝钢集团\t85093\n陈根\t85094\ndaxue\t85095\n10102\t85096\n技术文\t85097\n浐灞国家湿地公园\t85098\n京天华盛\t85099\nWMIC\t85100\n上置集团\t85101\n若尔盖\t85102\n姜虎东\t85103\n13位\t85104\n宅舞\t85105\n莱莱\t85106\n碳排放交易网\t85107\ndolce\t85108\n舰\t85109\n静摩擦系数\t85110\n对外贸易额\t85111\n专训\t85112\n安迪苏\t85113\n偏硅酸\t85114\n众意\t85115\n东方之子\t85116\n有请\t85117\n福祸\t85118\n蚝烙\t85119\n德阳传媒网\t85120\n年谱\t85121\n4章\t85122\n中国外交部\t85123\n鄂北地区\t85124\n7.2\t85125\n朱永嘉\t85126\n诗歌史\t85127\n出师表\t85128\n考级\t85129\n关帝庙\t85130\n二人转版\t85131\n卖萌\t85132\nv8.8\t85133\n久游\t85134\nHeads\t85135\n长春市区\t85136\n配合\t85137\ntea\t85138\n螺髻山\t85139\n731\t85140\n哥谭市\t85141\nInternships\t85142\n染发膏\t85143\n水生植物\t85144\n烟男\t85145\n绯闻女友\t85146\n1341\t85147\n出市\t85148\n刘振宇\t85149\nidp\t85150\nPromotional\t85151\nfirework\t85152\n红们\t85153\n崔磊\t85154\n尼康d500\t85155\n枫桥镇\t85156\n牟定\t85157\nteamcity\t85158\n梓潼县人民政府\t85159\n凹凸型\t85160\n中国交通\t85161\n金评媒\t85162\n百合_\t85163\ntax\t85164\n鱼粮\t85165\nCPT\t85166\n岡\t85167\n断头谷\t85168\n干果机\t85169\n冲矢昴\t85170\n锦纶\t85171\n小脾气\t85172\n山西省眼科医院\t85173\n莫属\t85174\n3D软件\t85175\n宿迁市公安局\t85176\n第74章\t85177\n电热套\t85178\n6654\t85179\n这么少\t85180\n水星路由器\t85181\n海宁城\t85182\naspen\t85183\n脚本分享网\t85184\n散角\t85185\n三坪\t85186\n五一三天\t85187\ncambridge\t85188\n走丝\t85189\n年轻一代\t85190\n卢卡申科\t85191\nhdp\t85192\n10亿次\t85193\n波尔多红酒\t85194\n白强\t85195\n大疆精灵3\t85196\n宁波东海实验学校\t85197\n琳娜\t85198\n巫毒\t85199\n小妖\t85200\n环境艺术设计专业\t85201\n裂口\t85202\n自由权\t85203\n622848\t85204\nAbortion\t85205\n都不能少\t85206\n卡拉奇\t85207\n滨城区\t85208\n福来\t85209\n谷嘉诚\t85210\n高帆\t85211\n150点\t85212\n600555\t85213\n3500_\t85214\n地球帝国3\t85215\n报春路\t85216\nBipolar\t85217\nswift语言\t85218\n换租\t85219\n黑芝麻\t85220\n晋献公\t85221\n田坎\t85222\n直贴\t85223\nVon\t85224\n南朝鲜\t85225\n20171204\t85226\nw9\t85227\n杜志国\t85228\n国家劳动法\t85229\n无锡市社会保险基金管理中心\t85230\n计算机应用技术专业\t85231\n第20条\t85232
\n幻梦馆\t85233\n列宁主义\t85234\n工程桩\t85235\ncamer\t85236\n刘雨柔\t85237\n福建艺术职业学院\t85238\nqq头像\t85239\n加钟\t85240\n拍拖\t85241\n百度杯\t85242\n双城镇\t85243\n姜雪\t85244\n瘙痒\t85245\n1日元\t85246\n中山市人力资源和社会保障局\t85247\nVisio2013\t85248\n罗建\t85249\n熊出没之过年\t85250\n红烧牛肉面\t85251\n外房\t85252\n龙塘村\t85253\n油罐\t85254\n银行考试网\t85255\nr480\t85256\n回龙传\t85257\n旗兵\t85258\ntests\t85259\n常德火车站\t85260\n江楠\t85261\n邻接矩阵\t85262\n収\t85263\n三尺\t85264\n两盆\t85265\n世宗\t85266\n清华大学土木水利学院\t85267\nP+R\t85268\n大余县\t85269\nPD-1单抗\t85270\nphenol\t85271\n根本大法\t85272\nStages\t85273\n礼仪之邦\t85274\n七大洲\t85275\n投掷\t85276\nfoolish\t85277\nL01\t85278\n陈至立\t85279\n运动训练专业\t85280\n万全县\t85281\n齐祖\t85282\n长安区人民政府\t85283\nP8P67\t85284\n达里诺尔湖\t85285\n载冷剂\t85286\nPDF+\t85287\n易怒\t85288\n攻宠\t85289\n第197集\t85290\n偶数位\t85291\n应付\t85292\n律师法\t85293\n何宁\t85294\n天才眼镜狗\t85295\nbiotropics\t85296\n航天电子\t85297\n100公里\t85298\n花柳病\t85299\noperation\t85300\n四川省地税局\t85301\n少\t85302\n多娇\t85303\nuag\t85304\n2.2a\t85305\n干扰源\t85306\n罗湖社区\t85307\n江北区\t85308\n汤药\t85309\n黑戈壁\t85310\n孙过庭\t85311\n台北站\t85312\n論壇\t85313\n末期\t85314\n右端\t85315\n石材\t85316\n芙蓉楼送辛渐\t85317\n肥肠粉\t85318\n一宵\t85319\n开箱\t85320\n海寇\t85321\nwasa\t85322\nuygur\t85323\n同天\t85324\nps2015cc\t85325\n30元\t85326\nnplayer\t85327\n气压罐\t85328\n美籍\t85329\n硝化细菌\t85330\n闽商\t85331\n灯盏\t85332\n涌泉穴\t85333\n1.1.8\t85334\ner图\t85335\n古墩路\t85336\n速度表\t85337\nbiotech\t85338\n游戏玩家\t85339\n60万\t85340\nbash\t85341\n药铺\t85342\n072\t85343\n14片\t85344\n1KB\t85345\n2013年上半年\t85346\n熙宁\t85347\n光棍儿\t85348\njun\t85349\n外冈镇\t85350\n重力荷载\t85351\n两分半\t85352\n挡\t85353\n用于\t85354\n曼妮芬\t85355\n屋檐下\t85356\n武大科技园\t85357\n荔波县\t85358\n锡林浩特\t85359\n增值税征\t85360\n守法\t85361\n王港\t85362\n#define\t85363\n保险盒\t85364\n无穷动\t85365\n丽湖花园\t85366\n雨感\t85367\njdg\t85368\n28d\t85369\n气动接头\t85370\n390万\t85371\n可园\t85372\n中国科学院苏州纳米技术与纳米仿生研究所\t85373\n大潘\t85374\n龙溟\t85375\ncocos2d-x\t85376\n13公斤\t85377\n削减\t85378\n冠军之路\t85379\nRing\t85380\n105毫米\t85381\n河流水质\t85382\nMessageBox\t85383\n阴部\t85384\nv3.4.2\t85385\n怀胎\t85386\n非功能\t85387\n名报\t85388\n
夜色\t85389\n61号\t85390\n江苏银行股份有限公司\t85391\n0.3.2\t85392\naborted\t85393\ncitrus\t85394\n红富士\t85395\n推广员\t85396\n开源工作流\t85397\nict\t85398\n归葬\t85399\n第吉尔\t85400\n安阳网\t85401\n愈加\t85402\nedta\t85403\nTVD\t85404\n琴弦\t85405\ntm\t85406\n雪雪\t85407\n鸟嘴\t85408\n白酒杯\t85409\n赵国栋\t85410\n新时代证券\t85411\n90后们\t85412\nAntibody\t85413\nJEWELRY\t85414\n2018六\t85415\n2|\t85416\n小男\t85417\n佳片\t85418\n师太\t85419\n2014年以来\t85420\n混炼胶\t85421\n特玩炉石传说\t85422\n研华科技\t85423\ntodolist\t85424\n小网\t85425\n小鸡小鸡\t85426\n加行\t85427\n驭帅\t85428\n有机化合物\t85429\n松发\t85430\n改口\t85431\n洪泽政府网\t85432\n龙潭镇\t85433\n中宏人寿\t85434\n执行员\t85435\n懒神\t85436\n5月份\t85437\nsliding\t85438\n6.93\t85439\n博雅塔\t85440\n通用公司\t85441\nAsynchronous\t85442\n5\t85443\n不分伯仲\t85444\n家费\t85445\n擒凶\t85446\n不枉此生\t85447\nchiji\t85448\n动态规划算法\t85449\n金锐\t85450\n人人妻\t85451\nexecutables\t85452\n嫚\t85453\n卧虎藏龙贰\t85454\n缩阳\t85455\n霹雳布袋戏虚拟人物\t85456\n1888年\t85457\n美人痣\t85458\n33元\t85459\n2700亿\t85460\nRequiem\t85461\n山东省委省政府\t85462\n京津冀协同发展规划纲要\t85463\njmp\t85464\n明天广场\t85465\n邵帅\t85466\n15只\t85467\n旅游节\t85468\nboutique\t85469\n二十四首\t85470\n扒衣\t85471\n智慧医院\t85472\nsuo\t85473\n黄圃镇\t85474\n对味\t85475\n南京东南大学\t85476\n现况\t85477\n弗雷德里克\t85478\n信行\t85479\n呼和浩特分公司\t85480\n600518\t85481\n毛利人\t85482\n600118\t85483\n后藤真希\t85484\n编码机\t85485\nmp3剪切器\t85486\n指定页\t85487\n厚普股份\t85488\n三统一\t85489\n杨门女将之女儿当自强\t85490\n王小二\t85491\ncaptivate\t85492\n豆壳\t85493\n疯魔不成活\t85494\ntl00\t85495\n第25关\t85496\n第八十三章\t85497\n班内\t85498\n民运\t85499\n瘢痕\t85500\n25000\t85501\n雷战\t85502\n大众传媒\t85503\n浙江省中医院\t85504\n桶盖\t85505\n灵山镇\t85506\nU\t85507\n会计法\t85508\n押解\t85509\n李大芳华\t85510\n亲子园\t85511\n五毒\t85512\n电疗\t85513\n红裙\t85514\n北京市工作居住证\t85515\n2月1号\t85516\n焰尾\t85517\n炙子烤肉\t85518\nusdt\t85519\n青旅\t85520\n亲亲\t85521\n加拿大麦吉尔大学\t85522\n中国轻纺城\t85523\nFPS值\t85524\n年季\t85525\n浙江省农业科学院\t85526\n忙忙碌碌\t85527\n便利化\t85528\n网查\t85529\n新佳\t85530\n金鹿\t85531\nmuramura\t85532\n斯托米丹尼尔斯\t85533\n魏和尚\t85534\nspirngboot\t85535\n飞机餐\t85536\nzhenren\t85537\n共挤\t85538\n斯诺\t85539\n哈文\t85540\n三体式\t85541\n胖友\t85542\
n1865年\t85543\n2位\t85544\n明日世界\t85545\n合山市\t85546\n五子衍宗丸\t85547\n百立丰\t85548\n堺雅人\t85549\n红七军\t85550\n28cm\t85551\n阵面\t85552\n1831\t85553\n炸药\t85554\n齐翔腾达\t85555\n沣西新城\t85556\nhtd\t85557\n李亮节\t85558\n鬣蜥\t85559\n3月15日起\t85560\n雅康高速\t85561\n吐谷浑\t85562\nmoshi\t85563\n拔\t85564\n美少\t85565\n非等位基因\t85566\n1974年\t85567\nip地址在线计算器\t85568\n钓甲鱼\t85569\n框架户\t85570\n清华大学出版社》\t85571\n孔灌注桩\t85572\n无锡\t85573\nJScript\t85574\n硫_\t85575\n瑞典\t85576\n盐酸贝尼地平片\t85577\n丢件\t85578\n结肠癌肝\t85579\n李建英\t85580\n不干胶印刷\t85581\n周楠\t85582\n碱基互补\t85583\n移位器\t85584\n龙湖区\t85585\n32类\t85586\n0190\t85587\n声谷\t85588\n补胎液\t85589\n福楼拜\t85590\n甲乙丙丁\t85591\n关系_参考网\t85592\nUNIVERSITY\t85593\n讲师团\t85594\n农村商业银行\t85595\n皮肤病科\t85596\n赵洪祝\t85597\n祛湿茶\t85598\n芭东\t85599\n筑坝\t85600\n康熙字典\t85601\n1998\t85602\n收藏量\t85603\n炼金师\t85604\n白磷弹\t85605\n涛女郎\t85606\n卢本伟\t85607\n初探_参考网\t85608\n打击者\t85609\nTEG\t85610\n第38次\t85611\n木墙\t85612\nAachen\t85613\n电气火灾监控系统\t85614\n彭浦新村街道\t85615\n不落俗套\t85616\n泉州市公安局\t85617\n高氯\t85618\n师生文\t85619\n王舒\t85620\n沈阳桃仙机场\t85621\n南禅寺\t85622\n当期损益\t85623\n畦\t85624\n熙岸\t85625\n机包\t85626\n合意\t85627\nRS-485\t85628\npixelgun3d\t85629\n回见\t85630\nDOTA2.UUU9.COM\t85631\n开脱\t85632\ngermmy\t85633\n板对板连接器\t85634\n塑颜\t85635\n都彭\t85636\n泰美\t85637\n退潮\t85638\n钓鱼翁钓鱼网\t85639\n重述\t85640\n梦幻版\t85641\n邹\t85642\nXSD\t85643\n谢晖\t85644\n认不到\t85645\n布偶猫\t85646\n陈少文\t85647\n欢乐斗地主\t85648\n南京广播电视台\t85649\n4月初\t85650\n丙酸氟替卡松\t85651\n胎停育\t85652\n分列式\t85653\n热烫\t85654\n拨鼠\t85655\nNo.3\t85656\ncustomized\t85657\n15点\t85658\npropeller\t85659\n曼谷大学\t85660\n湘桥\t85661\n无盘\t85662\n废都物语吧\t85663\nLNG气化站\t85664\n贱女\t85665\n26倍\t85666\n唐尧\t85667\n大木花谷\t85668\n昨夜小楼又东风\t85669\nbyebye\t85670\n笔形\t85671\n吊爆\t85672\n古田会议旧址\t85673\n马自达3昂克赛拉\t85674\n李常茹\t85675\n武汉爱尔眼科医院\t85676\naics6\t85677\n要记住\t85678\n菠萝花\t85679\n假面骑士吧\t85680\nshz\t85681\n妹夫\t85682\n千金净雅\t85683\n高坪镇\t85684\nchardet\t85685\n老字\t85686\n66分\t85687\n第九条\t85688\n少儿平安福\t85689\nidea\t85690\n肖洁\t85691\n彩箱\t85692\n咪头\t85693\n牵\t85694\n二尖瓣狭窄\t85695\n城市人\t85696\niPadmini\t85697\n瓠子\t
85698\n外行星\t85699\n子宫积液\t85700\n弗兰基\t85701\n小阮\t85702\n国际排联\t85703\n康美中药网\t85704\n西野\t85705\nmhz\t85706\n龙头股\t85707\n秘色\t85708\n2x4\t85709\n答出\t85710\n枉费\t85711\n及时\t85712\n曾荫权\t85713\n0.7.5\t85714\n兴建\t85715\n旺旺群\t85716\n水汪汪\t85717\nowe\t85718\n动物系恋人啊\t85719\n汤姆汉克斯\t85720\n克山县\t85721\n汪小敏\t85722\n黄秋\t85723\n播音稿\t85724\n飞逸\t85725\n一径\t85726\n韶关市教育局\t85727\nPC站\t85728\n石器时代精灵王传说\t85729\n第22条\t85730\n圣物套\t85731\n中国科学院上海硅酸盐研究所\t85732\nNOIP2016\t85733\n二十四\t85734\n生理\t85735\nngram\t85736\n天池\t85737\n友谊大厦\t85738\n沈阳奥体中心\t85739\n单格\t85740\n团区委\t85741\n陈彬彬\t85742\n6217\t85743\n研发部\t85744\n一头\t85745\n三傻大闹宝莱坞\t85746\n校讯通\t85747\n马自达cx4\t85748\n竹内顺子\t85749\n太湖石\t85750\nEmpirical\t85751\n请医\t85752\n3510\t85753\n上仓\t85754\n电性\t85755\n云台山\t85756\n迫切需要\t85757\n北京国际会议中心\t85758\n黑羽龙崎\t85759\nCook\t85760\n裘英俊\t85761\nhaodiaori\t85762\nTophatter\t85763\n寿终正寝\t85764\n陈敏之\t85765\nactionsheet\t85766\ncrc32\t85767\n郭俊辰\t85768\nsream\t85769\n深圳科士达科技股份有限公司\t85770\n美国伊利诺伊大学香槟分校\t85771\n冬雨季\t85772\n湖南省文化厅\t85773\n电葫芦\t85774\n赤心\t85775\n阁螳螂\t85776\n张巍\t85777\n紫微斗数算命\t85778\n湖南省农业科学院\t85779\nwin10邮箱\t85780\n自己的家\t85781\n铸铁件\t85782\n范海辛的奇妙冒险2\t85783\n多玩战舰世界\t85784\n铺砌\t85785\nH.O.T\t85786\n下沙大学\t85787\nns2\t85788\n湖畔\t85789\n融信澜天\t85790\nHIT\t85791\nopenssh-server\t85792\n短袖\t85793\n艾瑞泽5e\t85794\npol\t85795\nikev2\t85796\n含住\t85797\n舒梅切尔\t85798\n10米\t85799\n700W\t85800\n施压\t85801\nOCS\t85802\n刚竹\t85803\n卸料器\t85804\n2753\t85805\n标准成本法\t85806\nXfce\t85807\n华帝股份有限公司\t85808\nMalik\t85809\n华中师范大学文学院\t85810\n第一案\t85811\n安徽省儿童医院\t85812\n中国梦\t85813\ngnutls\t85814\n2015年起\t85815\n45期\t85816\n童子命\t85817\n干疼\t85818\nexecl\t85819\n氧池\t85820\n小额借款\t85821\n制袋机\t85822\n晴方\t85823\nticwear\t85824\n楼面价\t85825\n鲶\t85826\nHologram\t85827\nwaw\t85828\n恶灵附身1\t85829\n实在人\t85830\n林勇\t85831\n向死\t85832\n吸盘\t85833\nMillipore\t85834\n南昌公交查询网\t85835\n纪念碑谷2\t85836\n蜀西湖\t85837\n湖北省总工会\t85838\n欧战\t85839\n幻星宇\t85840\n夷狄\t85841\nGRUB4DOS\t85842\n扬书魅影\t85843\n17起\t85844\n等次\t85845\n用友NC\t85846\n卷腹\t85847\n216个\t85848\n崇华\t85849\
n爱伦坡\t85850\n5213\t85851\n山阳论坛\t85852\n加农炮\t85853\n大观镇\t85854\nwhz\t85855\n来自星星的你\t85856\n企业管理者\t85857\nppt课件网\t85858\neb8000\t85859\n岗石\t85860\n万域英雄志\t85861\n完全性右束支传导阻滞\t85862\n不锈钢管\t85863\n驱位\t85864\n1461\t85865\n死马\t85866\ntransaction\t85867\n女作者\t85868\n复通\t85869\n付林\t85870\nCET6\t85871\ncpi指数\t85872\n认母\t85873\n社区服务\t85874\n四宫格\t85875\nSent\t85876\n许涛\t85877\nsplitting\t85878\nbyconan\t85879\nimstrive\t85880\nZE\t85881\n语集\t85882\n深圳地税局\t85883\n增益\t85884\n泉奈\t85885\n男女神\t85886\n43条\t85887\n服饰\t85888\nLucky_man\t85889\n中风病\t85890\n凝水\t85891\nn550\t85892\n24级\t85893\n秀山土家族苗族自治县\t85894\n罗平\t85895\nprologue\t85896\n侠盗飞车罪恶都市\t85897\n长征\t85898\n移动迷宫3:死亡解药\t85899\nautomation\t85900\njintian\t85901\n欢视\t85902\n果郡王\t85903\n一副\t85904\nxxgk\t85905\n上饶市城乡规划局\t85906\nholter\t85907\nRadiology\t85908\n2F\t85909\nscara\t85910\n长阳教育信息网\t85911\n金大\t85912\n航空煤油\t85913\nangluar\t85914\nバッグ\t85915\n安宫牛黄丸\t85916\n星奈\t85917\n传成\t85918\n掐\t85919\n五会\t85920\n59\t85921\n最划算\t85922\n霓裳\t85923\n子界面\t85924\n旅夜书怀\t85925\n逸品\t85926\n维斯布鲁克\t85927\n防爆\t85928\n自动扶梯\t85929\n读罢\t85930\n吴江开发区\t85931\n3d字谜\t85932\n漫步者S1000\t85933\n面形\t85934\n美甲机\t85935\n水污染防治法\t85936\n土影\t85937\n同创伟业\t85938\n新一\t85939\nsatisfy\t85940\n广东工程职业技术学院\t85941\nHermès\t85942\n地方法\t85943\niost\t85944\n体育百科_体博网\t85945\n保利溪湖\t85946\n心菜\t85947\n19部\t85948\n豫园股份\t85949\n逾期费\t85950\n椒\t85951\n快递箱\t85952\n八首\t85953\n晚辈\t85954\n图框\t85955\n水击\t85956\n100万年\t85957\n七十二层奇楼\t85958\n甲状旁腺素\t85959\n氟苯尼考\t85960\n瞧不上\t85961\n生化危机5:黄金版\t85962\ndeadmau5\t85963\n机器学习工程师\t85964\n失衡\t85965\n葵花宝典\t85966\n浙江大学农业与生物技术学院\t85967\n湖北省公共资源交易信息网\t85968\n燕歌\t85969\n据实\t85970\n【万古仙穹\t85971\n万科城小区\t85972\n八一中学\t85973\n计量\t85974\n90a\t85975\ncoredraw\t85976\npcr\t85977\nCrypto\t85978\nvbse\t85979\n找中\t85980\n云端版\t85981\n酒米\t85982\nTravel\t85983\n比特比\t85984\n泰州市委\t85985\n方志强\t85986\n小香玉\t85987\n巴丹吉林沙漠\t85988\n电子罗盘\t85989\n叶和花\t85990\n人间仙境\t85991\nwdz\t85992\nQ35\t85993\n签字\t85994\n取样器\t85995\n账期\t85996\n食用碱\t85997\n卸妆油\t85998\n带头者\t85999\n潮绣\t86000\n青蓝\t860
01\n北京市哲学社会科学规划办公室\t86002\n苦追\t86003\n华为Mate8\t86004\n茶楼\t86005\n踏着\t86006\n炮灰\t86007\n恋童\t86008\nios12\t86009\n腾达AC9\t86010\n亦庄线\t86011\nQinYa\t86012\n上饶统计信息网\t86013\n西铁城光动能手表\t86014\nambitious\t86015\nWISE\t86016\n布草\t86017\n长谷\t86018\n民主制\t86019\n马氏集团\t86020\nslee\t86021\n24张\t86022\n游堂\t86023\nbioscience\t86024\n罗马\t86025\n杜拜\t86026\n狂野飙车8:极速凌云\t86027\n强奸类\t86028\n排砖\t86029\n皮尔斯\t86030\n鸡奸\t86031\n庆阳\t86032\n群奸\t86033\n原箱\t86034\n标致4008论坛_汽车之家论坛\t86035\n拉走\t86036\nwhatis\t86037\n百闽人生网\t86038\n海缆\t86039\nCSA\t86040\n城县\t86041\n_阿坝州政府网\t86042\n运动型\t86043\n夙愿\t86044\n压块机\t86045\n副乳\t86046\n呼伦贝尔大草原\t86047\n中二黑暗血\t86048\nfixation\t86049\n武林群侠传2\t86050\n随机信号分析\t86051\n古巴哈瓦那\t86052\n中国物流交易中心\t86053\nExamination\t86054\n云水怒\t86055\n宜宾市审计局\t86056\ncad平面图\t86057\nmeilin\t86058\n美特斯邦威\t86059\n康静\t86060\n视色\t86061\n宪法\t86062\n医武霍去病\t86063\n1529\t86064\n真君\t86065\n哈医大二院\t86066\n莲蕊\t86067\ndenver\t86068\n成都信息工程大学银杏酒店管理学院\t86069\nactiviti\t86070\npdf格式\t86071\n街霸2\t86072\n小董\t86073\n肺热\t86074\nx201\t86075\nmilfy\t86076\nfranklin\t86077\n神针\t86078\nozmosis\t86079\nQQ空\t86080\nsougou\t86081\ncctv7农广天地\t86082\nptl\t86083\n九枝兰\t86084\n丁山\t86085\n幽城幻剑录吧\t86086\n筛检\t86087\n85亿美元\t86088\nsaem\t86089\nColoring\t86090\n3006\t86091\n空想到\t86092\n万昌\t86093\n色纱\t86094\n地根\t86095\n携税宝\t86096\n高手指\t86097\n粗心\t86098\n数月\t86099\n观测器\t86100\n瘦身_\t86101\nvine\t86102\n剑网3PVE官方站\t86103\n回村\t86104\n600公里\t86105\n侯梦瑶\t86106\noracle数据库表\t86107\n程翔\t86108\n西班牙队\t86109\n你是我的唯一\t86110\n高校生\t86111\n拼抢\t86112\nfabu\t86113\n720g\t86114\n布行\t86115\n杨振宇\t86116\n摄片\t86117\npcl\t86118\n竞购\t86119\nmuseums\t86120\n香港市\t86121\n变样\t86122\n600089\t86123\n茗雅\t86124\nBackpacks\t86125\n闪络\t86126\nf581\t86127\n六册\t86128\n香邑\t86129\n奕\t86130\nbinaries\t86131\n肝脾肿大\t86132\n三屏\t86133\n夷陵之战\t86134\n我是看守专用宠物\t86135\n广韵\t86136\n2018年1月19日\t86137\nMarc\t86138\npeptides\t86139\n学养\t86140\n薇甘菊\t86141\n小台\t86142\n中泉\t86143\n无息贷款\t86144\n600066\t86145\n查查吧手机网\t86146\n亲友\t86147\n防守方\t86148\n监字\t86149\n正仪\t86150\n刚片\t86151\n通辽市政府\t
86152\n李博文\t86153\n福美来\t86154\n憩息\t86155\n融智\t86156\nsao\t86157\n参天\t86158\n乐佳\t86159\n日产途乐\t86160\n冬吴\t86161\n多间\t86162\n深港\t86163\n盖带\t86164\n云帆济沧海\t86165\n临终时\t86166\n云空间\t86167\n吉大港\t86168\n阿贝\t86169\nB85M-F\t86170\nidon\t86171\n平均速度\t86172\n26万元\t86173\n聪明\t86174\n6月6\t86175\nENCODE\t86176\nFeiJiu网\t86177\nsuper-d2\t86178\n超级机器人大战MX\t86179\n新增\t86180\n桂格\t86181\nconscious\t86182\n国寿\t86183\n出典\t86184\n骚男\t86185\nStunning\t86186\n李玉平\t86187\n卡尔·拉格斐\t86188\n养长\t86189\n迹美\t86190\n老挝石\t86191\n弥足\t86192\n刘家成\t86193\nCYDIA\t86194\n晓松奇谈\t86195\n阳信\t86196\n远程控制系统\t86197\n栅格化\t86198\n效度\t86199\n典雅版\t86200\n中国消防协会\t86201\n强欢\t86202\n促使\t86203\n金阊\t86204\n采乐\t86205\n大陆集团\t86206\nOneRepublic\t86207\n安标国家矿用产品安全标志中心\t86208\n十月革命\t86209\n科尔沁左翼中旗\t86210\n贵阳东站\t86211\n女儿\t86212\n空心砖\t86213\nmyself\t86214\n哈洛\t86215\n采访者\t86216\n苏安\t86217\n孝利\t86218\n紫背\t86219\n孤胆车神维加斯吧\t86220\n邻近\t86221\nBALLY\t86222\n佳能canon\t86223\n治国\t86224\n于赓哲\t86225\nzst\t86226\n用完\t86227\nOccupational\t86228\nmanifold\t86229\n成都市环保局\t86230\n电话卡\t86231\n包装盒\t86232\n火与剑\t86233\nForbidden\t86234\nDVA\t86235\ncuda8.0\t86236\n永川区\t86237\n性福生活\t86238\n长安欧尚_欧尚\t86239\n软垫\t86240\n格拉\t86241\n父级\t86242\n7月15日起\t86243\n獎\t86244\n国方\t86245\n跌级\t86246\n房山政府\t86247\n理论篇\t86248\nSpringData\t86249\n塞尚\t86250\n主题化\t86251\n杨欣\t86252\n夜行黄沙道\t86253\n国光帮\t86254\n高达里\t86255\nBLN\t86256\n四个人\t86257\n开关盒\t86258\nFY\t86259\n中利集团\t86260\n2014-08-20\t86261\n中国科学院国家授时中心\t86262\n绍剧\t86263\nrenesas\t86264\n600487\t86265\n空中客车\t86266\n贱\t86267\nbrazzer\t86268\n魔兽争霸4\t86269\ndapper\t86270\n2.3.0\t86271\ngta4\t86272\n总人次\t86273\nusp\t86274\n马瑞斯\t86275\n誉满\t86276\nTOP10排名\t86277\ndnf乱斗\t86278\nUniqlo\t86279\n石宝\t86280\n韩国首尔大学\t86281\n不含税\t86282\n广发基金管理有限公司\t86283\n970\t86284\n启明教育\t86285\nPROE5.0\t86286\n中国新闻社\t86287\n832位\t86288\n夏末秋\t86289\n102\t86290\n茅台\t86291\n中交兴路\t86292\n1902年\t86293\n合同工\t86294\n陈可辛\t86295\n鹏亮\t86296\n加诺格\t86297\n小米小米\t86298\n南充市南部县\t86299\n卡曼\t86300\n深圳水务集团\t86301\n一个十\t86302\n变径接头\t86303\n抓钩\t86304\nups国际快递\t86
305\n晚来天欲雪\t86306\n检出率\t86307\n上海虹桥机场\t86308\n王国保卫战\t86309\n吉他弦\t86310\n下潜\t86311\n北京迁喜搬家公司\t86312\n35.dll\t86313\n保洁员\t86314\ncomix\t86315\n国朝\t86316\n街机版\t86317\n莓茶\t86318\n教头网\t86319\nパック\t86320\n急线\t86321\n复星集团\t86322\n字里行间\t86323\n孙俪\t86324\n东营论坛\t86325\np5000\t86326\n第22关\t86327\nxoh\t86328\n高层封\t86329\n三相电源\t86330\n滴管\t86331\n舍子\t86332\n胡虏\t86333\n启蒙老师\t86334\n万科溪\t86335\n管辖\t86336\n旧化\t86337\n访问学者\t86338\n多季\t86339\n43级\t86340\n缓慢\t86341\nNote3论坛\t86342\nlclc\t86343\n比卢普斯\t86344\nAPFS\t86345\n卷烟厂\t86346\n烟台万华\t86347\nnpu\t86348\n国际经济与贸易专业\t86349\n第71集\t86350\n卤代烃\t86351\nView\t86352\n华东政法大学》\t86353\n龙文鞭影\t86354\n枣园路\t86355\n王少华\t86356\n上海两新互动网\t86357\n第18季\t86358\n驭风\t86359\n索克\t86360\n等爱\t86361\n蕾贝卡\t86362\n花茶\t86363\n出淤泥而不染\t86364\nISO/\t86365\nsolarbe\t86366\n莲花街\t86367\n香奈儿可可\t86368\n华泰联合\t86369\n3三\t86370\n美娘\t86371\n高频机\t86372\n瓦日铁路\t86373\nPayments\t86374\n16次\t86375\n本门\t86376\n邢晓瑶\t86377\nHansen\t86378\n王玉英\t86379\n异戊二烯\t86380\n段赛\t86381\nwincc7.0\t86382\nserato\t86383\n晦暗\t86384\n对不上\t86385\n锑矿\t86386\nyuejin\t86387\n25点\t86388\n鬼病院\t86389\n折弯\t86390\nMac玩儿法\t86391\n迅雷离线空间\t86392\n葡星人\t86393\n解难\t86394\n匿名_天涯问答\t86395\n南宁东\t86396\n押金条\t86397\n政局\t86398\n月亮井\t86399\n浅野忠信\t86400\nynshangji\t86401\n3534\t86402\n38\t86403\n含浦\t86404\n司音\t86405\n乱收费\t86406\n奔驰gle320\t86407\n限定性\t86408\n南阳市财政局\t86409\nKissing\t86410\ntesting\t86411\nXCX\t86412\n伊丽莎白·泰勒\t86413\n波阻抗\t86414\n厕拍\t86415\n育婴员\t86416\n雷军\t86417\nmarven\t86418\n合婚\t86419\n东方红郡\t86420\n中华人民共和国退役军人事务部\t86421\n暮霭\t86422\n128g\t86423\n各样\t86424\n银雪\t86425\n东润\t86426\n美容馆\t86427\n上将军\t86428\n清宫气数录\t86429\n志愿者\t86430\nEmpires\t86431\n岛田\t86432\n砖瓦\t86433\n要闻\t86434\n肉壶\t86435\n碰不得\t86436\n水晶球财经网\t86437\n孔令\t86438\n螺狮\t86439\n飙泪\t86440\n互动电视\t86441\n私密性\t86442\n优质工程奖\t86443\n3000转\t86444\n机动战士高达THE\t86445\n601618\t86446\n胜出\t86447\n印字机\t86448\nyasuo\t86449\n股骨\t86450\n巴特潘石屹\t86451\n高高挂起\t86452\n崇法判决\t86453\n韩国女团\t86454\n小螺号\t86455\n茗阳\t86456\n寄居\t86457\n一旁\t86458\n鼻甲肥大\t86459\n灵通山\t86460\n法方\t86461\n江洋
大盗\t86462\nObjectAnimator\t86463\nS230u\t86464\n苑子文\t86465\n气吞\t86466\n双元制\t86467\nlm358\t86468\n武汉市人民政府\t86469\n89C52\t86470\nquantity\t86471\n民族团结\t86472\n86957776.cc\t86473\n僧人\t86474\n赵女士\t86475\nKFR-35GW/\t86476\nc反应蛋白\t86477\n反派\t86478\nJavaScript模板\t86479\n增版\t86480\n海南省教育厅\t86481\n靖州新闻网\t86482\n制胜\t86483\n车载mv\t86484\n御宅屋\t86485\n蒙特卡洛树\t86486\n翼课网\t86487\n林弯弯\t86488\n淋湿\t86489\n柯珞克\t86490\n一败涂地\t86491\n长进\t86492\n华发新城\t86493\n定位\t86494\nfast4ward\t86495\nf310\t86496\n3.5%\t86497\n墨卡托\t86498\n单盘\t86499\njxt\t86500\nrepresentation\t86501\nios11卡\t86502\n暂行规\t86503\n第8套\t86504\n莫文蔚\t86505\nphotoshop2018\t86506\nSys\t86507\n里番色列漫画大全\t86508\n伯莱塔\t86509\n四川北路街道\t86510\n潘杰明\t86511\nVolatile\t86512\n看笑话\t86513\n劝告\t86514\n罗茨真空泵\t86515\n中市协注\t86516\n断桥铝\t86517\n大同市政府\t86518\n石嘴山\t86519\n同步性\t86520\n网格线\t86521\n45名\t86522\n恐怖份子\t86523\n土葬\t86524\nAMX\t86525\n熊安新区\t86526\n哥瑞\t86527\n广东省工商局\t86528\n没话说\t86529\n桥本爱\t86530\n特献\t86531\nmyeclipes\t86532\n4000吨\t86533\n女囚\t86534\ntuniu\t86535\nwelcom\t86536\n玩具总动员2\t86537\n横祸\t86538\nImpressions\t86539\nMUST\t86540\n最后期限\t86541\n友友们\t86542\n粪肠\t86543\n上级\t86544\nOaks\t86545\n微秒\t86546\n自由城之章\t86547\n铂类\t86548\n战神4女武神\t86549\n倪飞\t86550\n骑士酒店\t86551\n涡阳县\t86552\n书巢\t86553\n分解质因数\t86554\n凯迪拉克CT6\t86555\n03月31日\t86556\n工程质\t86557\nsort函数\t86558\nJide\t86559\n几月几号\t86560\n图吧地图\t86561\n动静脉内瘘\t86562\n灵异侦缉档案\t86563\ndeals\t86564\nstealth\t86565\nNozzle\t86566\n白了\t86567\n蚀刻机\t86568\n毒狼\t86569\n雪都美食网\t86570\nbecome\t86571\n虽有嘉肴\t86572\n景观式\t86573\n生机盎然\t86574\nli_\t86575\n2013年\t86576\n5000转\t86577\n锦上\t86578\n建筑法\t86579\nLP\t86580\n打油\t86581\n高平市\t86582\n乐正绫\t86583\n必考\t86584\n药监局\t86585\n轻钢结构别墅\t86586\n美绘\t86587\n金国\t86588\n旋挖钻\t86589\n启阳\t86590\n阿昔洛韦片\t86591\n气促\t86592\n红年\t86593\n玩具书\t86594\ncontroversy\t86595\n工程兵\t86596\n中国地质大学\t86597\n翻盖式\t86598\n北京东方时尚驾校\t86599\n小明星\t86600\n实操\t86601\n露臀\t86602\n顶箱\t86603\n勇敢者的游戏2\t86604\n0500\t86605\n白子\t86606\nsonic\t86607\n4.47\t86608\njitter\t86609\n85元\t86610\nXMT\t86611\n丰盈\t86612\n笔架山公
园\t86613\nInstructions\t86614\nPGA\t86615\n王嘉良\t86616\n郭盛\t86617\neric0435\t86618\n政策性金融债\t86619\n彩铃网\t86620\n油豆\t86621\n人参健脾丸\t86622\n好机会\t86623\n河北联通\t86624\n三十岗乡\t86625\n考虫\t86626\n94路\t86627\nBIOS之家论坛\t86628\n葫芦丝曲\t86629\n欺侮\t86630\n玩笑话\t86631\n蒲可可\t86632\n儿童文学奖\t86633\n杭锦旗人民政府\t86634\n阿特拉斯\t86635\n红星国际广场\t86636\n特色校\t86637\n阿尔·帕西诺\t86638\n一时间\t86639\nIdeaCentre\t86640\n初精\t86641\n国王\t86642\n文山州人大常委会\t86643\n交困\t86644\n袁维娅\t86645\n何旭\t86646\n徐显明\t86647\n程无忧\t86648\n歌声的翅膀\t86649\n捕猎\t86650\n反解\t86651\n4头\t86652\n操作型\t86653\nHacker\t86654\n什刹海街道\t86655\n罗斯维尔\t86656\n同型\t86657\n注会财管\t86658\n安仁古镇\t86659\n文化创意产业网\t86660\n理性\t86661\n清开灵颗粒\t86662\n生热\t86663\n囊中\t86664\n明心见性\t86665\nled灯条\t86666\n报文段\t86667\nqa\t86668\n全食美\t86669\n白沙路\t86670\n东园\t86671\n质感\t86672\n小泽玛利亚\t86673\n麦琪的礼物\t86674\n喀什噶尔\t86675\ngreedy\t86676\n信阳茶文化节\t86677\n苏州工业园区公租房管理有限公司\t86678\nYuna\t86679\n晓字\t86680\nMicroscope\t86681\nAbility\t86682\n佛塔\t86683\n客串\t86684\n牛轧糖\t86685\n心肌炎\t86686\n压井\t86687\n7妹\t86688\n手术床\t86689\n胆囊息肉\t86690\n张彤\t86691\n台站\t86692\n琴酒\t86693\nSQL*PLUS\t86694\n微晶蜡\t86695\n常宏\t86696\n陈志雄\t86697\n墨菲\t86698\n五十音\t86699\nCinebench\t86700\n名弓\t86701\n张三丰之末世凶兵\t86702\n重庆工商局\t86703\nvba吧\t86704\n沃德\t86705\n稻森泉\t86706\n历久弥新\t86707\n基克\t86708\n600109\t86709\n常八九\t86710\n杖子\t86711\n_尚品网Shang\t86712\n公招\t86713\n秀太\t86714\n谐波\t86715\n看风景\t86716\npwcorr\t86717\n散粉\t86718\n淘贴\t86719\n31G\t86720\n野间\t86721\n旺旺子\t86722\n_凡人修仙传\t86723\n古典文学\t86724\nPyQt5\t86725\n任期制\t86726\n化痰\t86727\n整集\t86728\n号查\t86729\n玄晶\t86730\n佐罗\t86731\n迫击炮\t86732\n华美舞动广场舞\t86733\n水神兽\t86734\n彩虹桥\t86735\n如昔\t86736\n华贸城\t86737\noppoa1\t86738\n195\t86739\n两参\t86740\n内存池\t86741\n嵊泗县\t86742\nSumitomo\t86743\n妥否\t86744\n十六烷\t86745\nxieyi\t86746\n火云\t86747\n股票型\t86748\n一心二用\t86749\n来青\t86750\nFifth\t86751\n菲达环保\t86752\nlemo\t86753\n放走\t86754\n断行\t86755\n阿二\t86756\nopenbox\t86757\n维本堂\t86758\n321号\t86759\n天桥风云\t86760\n240V\t86761\n萨比\t86762\n文学概论\t86763\n科技路\t86764\n众悦学车网\t86765\nv3.8.0\t86766\n岚arashi\t86767\n陈皮皮\t86768\n林
格曼\t86769\n豆豉鱼\t86770\n气袋\t86771\n联合办公室\t86772\n法蒂玛\t86773\n小妍\t86774\n赵子惠\t86775\n80回\t86776\n柏灵筠\t86777\n县委办公室\t86778\nRank\t86779\n桂冠人才网\t86780\n陕州\t86781\n幼龄\t86782\n万福镇\t86783\n汤小团\t86784\n_社会科学院\t86785\n科技馆\t86786\n吸脂术\t86787\n水马\t86788\n有主\t86789\nneko\t86790\n网易彩票帮助中心\t86791\n比做\t86792\n徐梅\t86793\n西安美院\t86794\n锦绣年华\t86795\nleach\t86796\nyouer\t86797\n暗厅\t86798\n王者出击\t86799\n武汉石化\t86800\n600d\t86801\n灵鸟\t86802\n背诵\t86803\n抢攻\t86804\n坊市\t86805\n国民经\t86806\n概化\t86807\n字符变量\t86808\n植物学\t86809\n好香\t86810\n十全街\t86811\n李马克\t86812\nLists\t86813\n44张\t86814\n波士顿\t86815\n8806\t86816\n名人\t86817\nxox\t86818\n迷信\t86819\n白矮星\t86820\nheypee\t86821\n养鸽\t86822\n温意\t86823\n秒人\t86824\n三春晖\t86825\n和平分手\t86826\nQUERY\t86827\n投降\t86828\n临沧\t86829\n啜泣\t86830\n省交通运输厅\t86831\nMarianne\t86832\n孙迅\t86833\n赊\t86834\n杭州新天地\t86835\n康城花园\t86836\n天底下\t86837\n延庆\t86838\n承插管\t86839\n鬼父2\t86840\n22017\t86841\n查检\t86842\nfarmers\t86843\n七少\t86844\n英致\t86845\n12036\t86846\n74cm\t86847\nSenna\t86848\n2018年起\t86849\n27集\t86850\n青铜葵花\t86851\n谜姬\t86852\n只欠东风\t86853\n家人们\t86854\n0906\t86855\n奥雷\t86856\nwanjia\t86857\n了断\t86858\n高戈奈斯\t86859\n花非花\t86860\n大连电视台\t86861\n毡帽\t86862\neot\t86863\n3par\t86864\n鲁icp\t86865\n嫁娶\t86866\n德强\t86867\n外标\t86868\n月鑫\t86869\n四川师范大学成都学院\t86870\n防振\t86871\n湖北省公安交通管理局\t86872\n孙晓明\t86873\n尊品\t86874\n都市\t86875\n多多益善\t86876\n霸者\t86877\n郑州市轨道交通有限公司\t86878\naccumulate\t86879\n吹拂\t86880\n68届\t86881\n七雄\t86882\n头鹰\t86883\nゼ\t86884\n乔瑟夫\t86885\n姚子羚\t86886\n先人\t86887\n奔驰男\t86888\n诺克萨斯\t86889\n加山\t86890\nbrand_wcpay\t86891\n巴菲韩信\t86892\nCommuter\t86893\nWEAR\t86894\nScirra\t86895\n孕晚期帮_\t86896\n兰卡威\t86897\n糖醋鱼\t86898\n典当行\t86899\nVISA信用卡\t86900\n新婚日记\t86901\n中国移动杭州研发中心\t86902\n珍珠蚌\t86903\n女警\t86904\n中国解放军\t86905\n世界银行\t86906\n人间道\t86907\n解码函数\t86908\n西铁城\t86909\n蔡澜\t86910\n看完\t86911\n肾小球\t86912\nWIFI版\t86913\n第83届\t86914\n小红军\t86915\n词牌\t86916\n甲子镇\t86917\n秀傲\t86918\n20160208\t86919\nE滁州人才网\t86920\nj2\t86921\nTeamviewer\t86922\n周美玲\t86923\n病原体\t86924\n日记\t86925\nInfoQ\t86926\n400
29\t86927\n万里长江\t86928\n西宁晚报\t86929\n太原公交查询网\t86930\ncpu使用率\t86931\n胶体果胶铋\t86932\n女生们\t86933\n排号\t86934\n佛山大道\t86935\n国家发改委宏观经济研究院\t86936\n20170124\t86937\n河工新闻网\t86938\n社联\t86939\n呱蛋\t86940\n天王盖地虎\t86941\nbrains\t86942\n猎魔人\t86943\n政和县\t86944\n佐\t86945\n312国道\t86946\n俏\t86947\n易视宝\t86948\n压感\t86949\n锌粉\t86950\n50cm\t86951\nqq水浒\t86952\n白峰\t86953\n夯实\t86954\n墨子\t86955\n78页\t86956\n75关\t86957\n李琳\t86958\n脂老虎\t86959\nInterpreting\t86960\n应用篇\t86961\npadd\t86962\n大岚\t86963\n如麻\t86964\n诱杀\t86965\n100发\t86966\n德语助手\t86967\n孔中\t86968\n重庆市工商局\t86969\n死者\t86970\n虞乐\t86971\n女将\t86972\n喽\t86973\n可获\t86974\n测力计\t86975\n诗仙\t86976\n斑马快跑\t86977\n发小\t86978\n何度\t86979\n毛肚\t86980\n7262\t86981\n倪泽\t86982\n女爱豆\t86983\n信委\t86984\n狂想曲\t86985\n非煤矿山\t86986\n贺天\t86987\n江澄\t86988\nairbnb\t86989\n人行道\t86990\nFL5900U\t86991\n东营市实验中学\t86992\nBD国\t86993\nlibrdkafka\t86994\n童子军\t86995\n雷海\t86996\nmarkline\t86997\n大卫王\t86998\n神鹤\t86999\n血便\t87000\n消焰器\t87001\n红橡\t87002\n怪象\t87003\n物业管理网\t87004\n黄伟明\t87005\n网约车平台公司\t87006\n天姿\t87007\n京商商贸城\t87008\nrouter-link\t87009\n米网\t87010\n捐物\t87011\nIPF\t87012\n家康\t87013\n冠状动脉造影\t87014\n胆碱酯酶\t87015\n中南财经政法大学武汉学院\t87016\n闭塞\t87017\n批阅\t87018\n300秒\t87019\n输尿管镜\t87020\nkinit\t87021\n八枚\t87022\n抛光剂\t87023\n优词\t87024\nOPPOr9\t87025\n主贴\t87026\n日职\t87027\nlicking\t87028\nwin10设备管理器\t87029\n墨林\t87030\n广丰区政府\t87031\nTWITCH\t87032\nGeforce\t87033\n上海浦东\t87034\n氩弧焊机\t87035\n江苏楿港雪宝整体家居有限公司\t87036\n整枝\t87037\nBSOD\t87038\n云科\t87039\n苏州技师学院\t87040\n梦小楠\t87041\n中国评书网\t87042\n自性\t87043\n胶粒\t87044\ngtm\t87045\n挿\t87046\ndvaj\t87047\ngpd掌机\t87048\nmmseg4j\t87049\nSPEED\t87050\n全球速卖通卖家论坛\t87051\n农机购置补贴机具\t87052\n自管\t87053\n周笔\t87054\n电子流\t87055\nDim\t87056\n尽职调查_\t87057\nYoo\t87058\nLinu\t87059\ndew\t87060\n路南区\t87061\n苏怡静\t87062\n创视\t87063\n泡水\t87064\n奇葩说吧\t87065\nUSDT\t87066\n丰田雅力士\t87067\n易游网络\t87068\n汕头大学医学院第一附属医院\t87069\n新疆新闻网\t87070\n543\t87071\n斯托克顿\t87072\n李丽\t87073\n胶圈\t87074\n抽粪车\t87075\n索博圖庫|索博漫畫\t87076\n添彩\t87077\n近两天\t87078\n洋流\t87079\n王思铁\t87080\n方太\t87081\n处分期
\t87082\n存图\t87083\n涡街\t87084\n奇迹MU:觉醒\t87085\n精华贴\t87086\n七彩云\t87087\n鹰目户外广告\t87088\nEveryday\t87089\nheit\t87090\n千秋\t87091\n我的新衣\t87092\n梦芭莎\t87093\n广州市义务教育学校\t87094\nC20\t87095\n和讯债券\t87096\n孤岛惊魂3\t87097\n5路\t87098\n非死\t87099\n牛犇\t87100\n欧雅\t87101\nlayUI\t87102\n德蒙福特大学\t87103\n乐哈健康网\t87104\n河南菜\t87105\n牢骚\t87106\n缉私局\t87107\nCoordinates\t87108\nOffice2003\t87109\nMicroPython\t87110\nCurtis\t87111\n暗金\t87112\n毕业设计网\t87113\n李阳疯狂英语\t87114\n阴阳刺青师\t87115\nincredibuild\t87116\n詹倩\t87117\n躺卧\t87118\n梦道\t87119\nultraact\t87120\nunoconv\t87121\nCCTV-2财经频道\t87122\nOpenCV2\t87123\nDesperado\t87124\n聚精会神\t87125\n超商\t87126\nrolan\t87127\ngwas\t87128\n黑楼\t87129\nnewest\t87130\n资本主\t87131\nmember\t87132\n称臣\t87133\n_多玩守望先锋\t87134\n易通\t87135\n佛蒙特州\t87136\nMPAndroid\t87137\n20170125\t87138\n福建省安全生产监督管理局\t87139\n刘芸\t87140\n达音科\t87141\n人教版七年级英语\t87142\n男朋友\t87143\nseg\t87144\n西山\t87145\n余生\t87146\n都市客\t87147\n一点儿\t87148\n柳神\t87149\n榆树叶\t87150\n上海宝钢\t87151\n3GB+32GB\t87152\nss服务器\t87153\n蒂芬妮\t87154\n宗地\t87155\n增强版\t87156\n北师大实验中学\t87157\n稽\t87158\n财会通讯\t87159\nwindow2012\t87160\nCOSQ\t87161\n秋兴\t87162\n永康市\t87163\nComfort\t87164\nprint函数\t87165\n抢救性\t87166\n上海市第一人民医院\t87167\nsensory\t87168\n鸡血石\t87169\n上海同济\t87170\n想着\t87171\n富士XT20\t87172\ndall\t87173\n合掌村\t87174\n冲垮\t87175\n胶态\t87176\n隔热垫\t87177\n大少爷\t87178\n和平号空间站\t87179\n广州南方医院\t87180\nglass\t87181\n冯卫东\t87182\n四合院\t87183\nWQ\t87184\n南海丹灶\t87185\n白机\t87186\n巴乐兔\t87187\nMikeZhou\t87188\n南京大学医学院\t87189\n关于实施乡村振兴战略的意见\t87190\n热性惊厥\t87191\n4.8g\t87192\nbleach\t87193\nv3.7.0\t87194\n部族\t87195\n台州学院\t87196\n姓氏歌\t87197\n废旧\t87198\nwe\t87199\n海洋大厦\t87200\n第13张\t87201\n恒指\t87202\n挂证\t87203\n白灵\t87204\n金鹰\t87205\n镗孔机\t87206\nIC卡燃气表\t87207\njds\t87208\n白河\t87209\n16kg\t87210\n奇妙能力歌\t87211\n知母\t87212\n融创九棠府\t87213\n西宁市人民政府\t87214\nMINI论坛\t87215\n四技\t87216\nVOGUE\t87217\n只不过\t87218\n2018-03-23\t87219\nステ\t87220\n百商通\t87221\nImplementing\t87222\naprilia\t87223\n尤尼克斯\t87224\n魔卡幻想吧\t87225\n信用\t87226\n800名\t87227\n百度词典\t87228\n肺腑\t87229\nv4.0.8
\t87230\n港口\t87231\n长虹空调\t87232\nrt-mart\t87233\n点焊\t87234\n丽景门\t87235\n刺客信条大革命\t87236\nLORA\t87237\nVerbs\t87238\n部位\t87239\n栅栏\t87240\n陈萨\t87241\nmonetary\t87242\n均值滤波\t87243\n陈丹\t87244\nMATE\t87245\n第10页\t87246\nbl\t87247\n万木春\t87248\nsp0\t87249\n吉诺比利\t87250\n濠江风云\t87251\n乌桓\t87252\npigai\t87253\n注射用头孢哌酮钠舒巴坦钠\t87254\nX年\t87255\n机罩\t87256\nWACC\t87257\n分层次\t87258\nvr一体机\t87259\n打符\t87260\n庆洋制衣\t87261\n超声波清洗设备\t87262\n焊材\t87263\n6.0.1\t87264\n中国中建设计集团有限公司\t87265\n光线传媒\t87266\nspace-between\t87267\nUIImageView\t87268\n瑞戈非尼\t87269\n刘景文\t87270\nEXDOLL\t87271\n小媳妇\t87272\n一袋米\t87273\n2017年4月9日\t87274\n追缉令\t87275\n15亩\t87276\n精萃\t87277\nword批量\t87278\n东兴证券\t87279\nclent\t87280\n古思特\t87281\n陇海线\t87282\n社会保障学\t87283\n布列瑟农\t87284\nLOVE\t87285\nRevolutionary\t87286\n价格规\t87287\n除油剂\t87288\n红教堂\t87289\n中国国际电子商务中心\t87290\n图谱\t87291\nCentOs\t87292\n麦苹果\t87293\n两周前\t87294\n包真\t87295\n十遍\t87296\n杀气腾腾\t87297\ndsp28335\t87298\n中间轴\t87299\nibd\t87300\n汇文中学\t87301\n苏宁快递\t87302\nCountdown\t87303\n北京理工大学计算机学院\t87304\n掰直\t87305\n外层\t87306\n王宝森\t87307\npes2015\t87308\n大好时光\t87309\n二级建造师\t87310\n万人次\t87311\n短宗\t87312\n淋巴细胞绝对值\t87313\nHighly\t87314\n盒马鲜生\t87315\n长节\t87316\n磁耦\t87317\n安飞士租车AVIS|安飞士租车AVIS\t87318\nSet\t87319\nxp版\t87320\nwalle\t87321\n推理机\t87322\n数字式\t87323\n元春\t87324\n时间广场\t87325\n最爱听\t87326\n剑勇传说\t87327\n持之以恒正风肃纪\t87328\n移动车\t87329\n创制\t87330\n梦立方\t87331\n肉刑\t87332\n9000万元\t87333\n掐架\t87334\n秃驴\t87335\n卤鸭\t87336\n机器之血\t87337\n凃\t87338\n期限错配\t87339\n清真大寺\t87340\n班长\t87341\n猴王\t87342\nSukebei\t87343\nTI\t87344\n约球\t87345\n龙共舞\t87346\n阿锋佬\t87347\n冈\t87348\n吴桂英\t87349\n中交第一公路工程局有限公司\t87350\n第】\t87351\n苏州园林卡\t87352\n苏村\t87353\nsavior\t87354\n超级机器人大战X\t87355\n18年5月\t87356\n版费\t87357\n花井メイサ\t87358\n刘霞\t87359\n油表\t87360\n龟孙\t87361\n商丘人才网\t87362\n星期8\t87363\n刮花\t87364\n白炭\t87365\n死里逃生\t87366\n增碳剂\t87367\n2017年5月份\t87368\nRestController\t87369\n蜀山剑侠\t87370\n7300\t87371\n第29号\t87372\n天门山景区\t87373\n俞飞鸿\t87374\n石梅湾\t87375\n宁夏回族自治区国土资源厅\t87376\n汲\t87377\n冠佑\t87378\nBIKE\t87379\n西大附中\t87
380\n起跑\t87381\n必理通\t87382\n元祖高达\t87383\n乱来\t87384\nmerit\t87385\n秦城监狱\t87386\n郑州市中级人民法院\t87387\n万能\t87388\n贵阳一中\t87389\n启示录2\t87390\n胜于雄辩\t87391\nentire\t87392\n敕勒川\t87393\n跖趾\t87394\n1924年\t87395\nSynchronous\t87396\n伟彬\t87397\n润和\t87398\n黑龙江飞鹤乳业有限公司\t87399\n魔偶\t87400\nEMBL\t87401\nbangerlee\t87402\n贵州道\t87403\n魔界战记5\t87404\nGTX1050\t87405\n公约数\t87406\n西游之路\t87407\n久美\t87408\n3天2晚\t87409\n零售机\t87410\n缙云中学\t87411\n合规文化建设\t87412\n初温\t87413\nnetworkx\t87414\n文丑\t87415\n数字滤波器\t87416\n泰拉星环\t87417\n奔跑吧6\t87418\n校园图\t87419\n火箭式\t87420\nEDTA\t87421\n铝氧化\t87422\nlxml库\t87423\n手软\t87424\n繁盛\t87425\n8585\t87426\n健步走\t87427\n每一分钟\t87428\n梦幻奥陶纪\t87429\n江苏工程职业技术学院\t87430\n陈赫\t87431\noffwhite\t87432\nctr\t87433\n黑陶\t87434\n移动开发\t87435\nTimerTask\t87436\n_亿\t87437\n新民网\t87438\nVOYO\t87439\n思腾\t87440\nOLAP\t87441\n绍兴房产网\t87442\nwangpan\t87443\n格朗\t87444\n硬\t87445\n神探狄仁杰\t87446\n张伯黄光裕\t87447\n600亿\t87448\n文石\t87449\n明棋\t87450\n战巡\t87451\n基础级\t87452\n武鸣县\t87453\n华农\t87454\n社科院研究生院\t87455\n龙腾世界\t87456\nshale\t87457\ncad吧_\t87458\n理科生\t87459\n均衡性\t87460\n导师团\t87461\n争藏\t87462\n十一\t87463\n京津冀旅游一卡通\t87464\nb360m\t87465\nGIVE\t87466\n20170228\t87467\n栽赃\t87468\n盗图\t87469\nxb10\t87470\n迷宫村\t87471\nsteve\t87472\n抢兑\t87473\nV2.5.1\t87474\n超网\t87475\n期末考试卷\t87476\n快快\t87477\n黄铭\t87478\n分子\t87479\n45条\t87480\n锦园\t87481\n志保\t87482\nrngm\t87483\nOmegle\t87484\n调查局\t87485\n跨越百年的美丽\t87486\n捷迈\t87487\n换锁\t87488\n600多\t87489\n高等职业教育创新发展行动计划\t87490\njoyo\t87491\nLearners\t87492\nProfile\t87493\n青海电台\t87494\nz11\t87495\n操作台\t87496\nhsw\t87497\n剑皇\t87498\n加热区\t87499\n8.5.1\t87500\n情债\t87501\n策应\t87502\n蔺相如\t87503\n听书网\t87504\n一揆\t87505\n农业区\t87506\n发货方\t87507\n发现页\t87508\nMayor\t87509\n标准A4\t87510\nruu\t87511\n170630\t87512\n2017季\t87513\n浙江合众新能源汽车有限公司\t87514\n大航海2\t87515\n纸坊镇\t87516\n星级\t87517\n江苏文艺出版社\t87518\nprev\t87519\n答惑\t87520\n日趋\t87521\nitext\t87522\nPassionate\t87523\n金玟岐\t87524\n文化石\t87525\n真岛\t87526\n盆栽\t87527\n创强\t87528\n英雄学院\t87529\nVISA卡\t87530\n小县\t87531\n孖展\t87532\n第2代\t87533\n民宅\t875
34\n高昂\t87535\n定性\t87536\n跨式\t87537\n氢化\t87538\n冥狱\t87539\n徐瑶\t87540\ndalsa\t87541\n大檐帽\t87542\n树莓派\t87543\n豆捞\t87544\n7位\t87545\n伟创力\t87546\nsalient\t87547\n生命期\t87548\n宣化\t87549\n纪检部\t87550\n13.0\t87551\n1704\t87552\n418号\t87553\nLEXUS\t87554\n送信\t87555\n沈阳水务集团\t87556\n正东方教育\t87557\n花圈\t87558\n总犯\t87559\n秋霞网\t87560\n生死存亡\t87561\n博罗园洲\t87562\n中共上海市纪律检查委员会\t87563\n主存\t87564\n凤凰山森林公园\t87565\n20150710\t87566\nMobX\t87567\n市场导报\t87568\n重计数\t87569\n吊梁\t87570\n2018世界杯\t87571\n金兰咨询网\t87572\n一战封神\t87573\n炒冰机\t87574\n十五天\t87575\n东桥\t87576\n无产阶级\t87577\n酌定\t87578\n巅峰时刻\t87579\n底商\t87580\n东方红一号\t87581\n雪崩式\t87582\n中位值\t87583\n半数以上\t87584\nMathType公式编辑器\t87585\n蓝丰生化\t87586\n台州火车站\t87587\n香榭丽舍\t87588\n贞德\t87589\n飞檐\t87590\n100平方\t87591\nFortune\t87592\n上栗县\t87593\nmc34063\t87594\n桌类\t87595\nKelly\t87596\n仙华山\t87597\n子罕\t87598\n房层\t87599\n吕叔湘\t87600\n春暖性\t87601\n314路\t87602\nvrtx\t87603\n华侨新村\t87604\n淄博房产超市网\t87605\n岚龙\t87606\n背巾\t87607\n雅座\t87608\n4月15日起\t87609\n雪铁龙C2\t87610\nnonetype\t87611\n胡长清\t87612\nE型\t87613\n作风建设永远在路上\t87614\n工信部\t87615\n哺乳动物\t87616\n肺丸\t87617\n益民\t87618\n筛管\t87619\n铃木一彻\t87620\nSTM8S003\t87621\n刘仰\t87622\n样本集\t87623\n光明小学\t87624\n爆震传感器\t87625\n西部黄金\t87626\nSdn\t87627\n软开\t87628\n聖\t87629\n世界城\t87630\n鸡片\t87631\n月亮谷\t87632\n实中\t87633\n深图\t87634\n四川统计_四川省统计局\t87635\n科瑞达\t87636\nword97-2003\t87637\n劳务派\t87638\ne430c\t87639\n1706\t87640\n94%\t87641\nCTF\t87642\nReply\t87643\n离心鼓风机\t87644\n方丹\t87645\nTATA木门\t87646\n粉碎性\t87647\n0203\t87648\nsumifs函数\t87649\n太极集团\t87650\n倒换\t87651\n五一旅游\t87652\n流通处\t87653\nAdventures\t87654\n业力\t87655\n白芹\t87656\n福山润\t87657\n米米尔隆\t87658\n超强大\t87659\n確\t87660\nAiri\t87661\nb2b.huangye88.com\t87662\n沙漠玫瑰\t87663\n红外对射\t87664\n长征五号遥二\t87665\n一峰\t87666\n拉拔仪\t87667\n五龙山\t87668\nICP-MS\t87669\nWeddings\t87670\n毛主席纪念馆\t87671\n骨细胞\t87672\n七次方\t87673\n正远\t87674\n考试时\t87675\n法拉第笼\t87676\nABC输入法\t87677\n唐糖\t87678\n三月\t87679\nCD播放器\t87680\n南京公安局\t87681\n韩装网\t87682\n短剧\t87683\n强制关机\t87684\n邀请方\t87685\n钢机\t87686\n吉祥寺\t87687\n跳高\t87688\n关晓智\t
87689\n狂吻\t87690\n孙兴民\t87691\n金色的鱼钩\t87692\n鼓形\t87693\n合修\t87694\n币友\t87695\n小米电视机\t87696\n庖厨\t87697\n小批量\t87698\n巧家\t87699\nBentley\t87700\n磨制\t87701\n计算机组成原理\t87702\n像\t87703\n跳床\t87704\n叶问\t87705\n冷冷\t87706\n利巴韦林\t87707\nLCL\t87708\nり\t87709\n堆\t87710\n中国惠农网\t87711\nporonovideos\t87712\n3.2.6\t87713\n5253\t87714\n92集\t87715\nmise\t87716\n飘雪动漫社\t87717\n耄耋\t87718\ngerber\t87719\n东风日产天籁\t87720\n11600\t87721\n胰岛素泵\t87722\n西南联合大学\t87723\n好订网\t87724\nJGJT\t87725\n北湖国际城\t87726\n水狗\t87727\n痤疮\t87728\n国家标\t87729\n默读\t87730\nqq营销\t87731\n搞清楚\t87732\njavasky\t87733\n气井\t87734\n艾瑞卡\t87735\n11-30天\t87736\n几幕\t87737\nDatagrid\t87738\n大众R36\t87739\n刘表\t87740\n笑佳人\t87741\n2小时左右\t87742\nsubscriptable\t87743\n山崎龙\t87744\n张新建\t87745\n木饰\t87746\n4.1号\t87747\n中华民国二十五年\t87748\n罗斌\t87749\n从一\t87750\n授牌\t87751\n上海大悦城\t87752\n三线表\t87753\n电子证\t87754\n宗申\t87755\n半支烟\t87756\n港务\t87757\n上海时代广场\t87758\n热仪\t87759\n乘座\t87760\ntyphoon\t87761\n伊万卡·特朗普\t87762\n科颜\t87763\n启迪\t87764\n希施金\t87765\n阿根\t87766\n安田\t87767\ncs1.5\t87768\n广东教育厅\t87769\n沪江留学网\t87770\n归侨\t87771\n无惨\t87772\n第103\t87773\n2017年8月\t87774\n广州市中医医院\t87775\n爱情海\t87776\nElimination\t87777\n明永乐\t87778\n科技创新导报\t87779\n广园路\t87780\n滨海机场\t87781\n毛色\t87782\n买断\t87783\n足不出户\t87784\n六娃\t87785\n心搏骤停\t87786\nslidworks\t87787\npk107\t87788\nSouthwest\t87789\n一百件\t87790\n阿尔特留斯\t87791\n博后交流版\t87792\n战地记者\t87793\n飞轮\t87794\n志摩\t87795\n剖开\t87796\nKFR-32GW/\t87797\n天台宗\t87798\n一个2000\t87799\n佣人\t87800\npgn\t87801\n君岛美绪\t87802\n税收征收管理法\t87803\n弟子\t87804\nXs\t87805\n阻垢缓蚀剂\t87806\nAT89C52\t87807\nf482\t87808\n京石\t87809\n茶性\t87810\n风蚀\t87811\n构建\t87812\nUC浏览器\t87813\nBootloader\t87814\n赵白石\t87815\n山东省科学技术厅\t87816\nvsprintf\t87817\n攻击装\t87818\n老狗\t87819\n丝雾\t87820\n法规\t87821\n刺五加茶\t87822\n中国商论\t87823\n同母异父\t87824\n王气\t87825\n淘书网\t87826\n中国建材采购网\t87827\n乐富\t87828\n阿斯拉达\t87829\n中化\t87830\n单纯型\t87831\n世纪大道站\t87832\n60本\t87833\n后亭\t87834\ntouches\t87835\n黑纸\t87836\n2012年9月\t87837\nMHC\t87838\n17.3寸\t87839\n黄钟\t87840\nresume\t87841\n洗车液\t87842\n精易编程助手\t87843\
n塔尔\t87844\n3月21日\t87845\n王稳健\t87846\ngifts\t87847\n2018-03-11\t87848\n周坤\t87849\n斯派克\t87850\n济南高新区\t87851\n伦敦陷落\t87852\n喹诺酮\t87853\n丁盛\t87854\n1616\t87855\n担纲\t87856\n巨乳女优\t87857\n北大哲学系\t87858\n叙利\t87859\n三生缘\t87860\n巫山小三峡\t87861\n王文治\t87862\n热心人\t87863\n艾伯维\t87864\n华丰村\t87865\nmanagers\t87866\n同话\t87867\n变合\t87868\n手机变声器\t87869\n解放军外国语学院\t87870\n空气式\t87871\n吸筹\t87872\n雅音\t87873\n神水\t87874\nnord\t87875\n硬硬\t87876\n气壮山河\t87877\n82599\t87878\nM268dw\t87879\n14幅\t87880\n魅蓝6\t87881\n新三国演义全集\t87882\n口袋妖怪复刻吧\t87883\n直装版\t87884\nJuicer\t87885\n46分\t87886\n牙科综合治疗机\t87887\n223号\t87888\n丹桥\t87889\n楼板房\t87890\n风流\t87891\n渔\t87892\n小黄花\t87893\nAlevel\t87894\n善恶\t87895\n10.1002\t87896\n军宅\t87897\n亮化\t87898\n艾芘基妮\t87899\nwuqiseu\t87900\n侠气\t87901\n惠锁屏\t87902\n中国尊\t87903\n有途高考网\t87904\n素填土\t87905\n胡海波\t87906\n女舞\t87907\n吉林大学经济学院\t87908\n法制科\t87909\nham\t87910\n兰德\t87911\nInquiry\t87912\n理光\t87913\n乔羽\t87914\n赣南医学院\t87915\n)有限公司\t87916\n奥尔堡\t87917\n百度地图测量\t87918\nrichards\t87919\n银证转账\t87920\n跨列\t87921\n华中路\t87922\n永远的长征\t87923\n舞法天女朵法拉\t87924\n背带\t87925\n张小军\t87926\n文波\t87927\n福星高照\t87928\n起行\t87929\nJSQ25\t87930\n66778855\t87931\nipe\t87932\n卡宴\t87933\n傻傻分不清\t87934\n安吉竹博园\t87935\nmeetings\t87936\n梅尔\t87937\n磷酸锌\t87938\n行健\t87939\n米醋\t87940\n神经衰弱\t87941\n李志\t87942\n泉州火车站\t87943\n轻钢\t87944\ncohesion\t87945\n踏雪\t87946\n10000只\t87947\n断章\t87948\n黄永年\t87949\n我不知道\t87950\n幽兰\t87951\n合肥政府\t87952\n恶汉\t87953\n新英雄霞\t87954\n致命诱惑\t87955\n600mw\t87956\n0.3米\t87957\n孔祥熙\t87958\n换面\t87959\n标普\t87960\n头垢\t87961\n132124\t87962\n织带机\t87963\n黄娜\t87964\n耀华路\t87965\n文学部\t87966\n滑板公园\t87967\n浪子彦\t87968\n私教\t87969\n调压柜\t87970\n广播电视编导\t87971\n吊秤\t87972\n圣教\t87973\nelona\t87974\n宜昌市第一人民医院\t87975\n和合文化\t87976\n叽歪\t87977\n实心\t87978\n起始日\t87979\n嘉顿\t87980\n先锋资源网\t87981\nwith\t87982\n南段\t87983\n黄酮类化合物\t87984\n全顺\t87985\n女排世界杯\t87986\n张恒\t87987\n亲密关系\t87988\n1标\t87989\n2014年6月\t87990\n电动车\t87991\n阿格瑞斯\t87992\n保障书\t87993\n人工学院\t87994\n袋袋裤\t87995\n滑手\t87996\n呼吸困\t87997\n深重\t87998\n两边\t87999\n泛函分析\t88000\n荷兰菊\t
88001\nmTOR\t88002\nCAST\t88003\n包法\t88004\n学园都市\t88005\n财技\t88006\n月星环球港\t88007\nAutocad2010\t88008\n神话故事\t88009\n黑叔\t88010\n长春绿园\t88011\n事业编公共基础知识\t88012\n草炭土\t88013\n别府\t88014\n牵引变电所\t88015\n双随机一\t88016\n捷越\t88017\n田园杂兴\t88018\n联想yoga\t88019\n秀品\t88020\n同志\t88021\nCFG桩\t88022\n人民网江苏视窗\t88023\n7548\t88024\n马鞍山师范高等专科学校\t88025\n奇货可居\t88026\n365易搜网\t88027\n亲情\t88028\nFor\t88029\n天津医科大学代谢病医院\t88030\n20130825\t88031\n拜见\t88032\n立方根\t88033\n7566\t88034\n日出日落\t88035\n工作报告会\t88036\n奔驰S400\t88037\n0309\t88038\n女人才\t88039\n区庄\t88040\n保生大帝\t88041\n户口簿\t88042\n儿园\t88043\n_妈蛋表情网\t88044\n陈晋\t88045\n20几个\t88046\nint*\t88047\n普朗\t88048\n恺撒堡\t88049\n良臣\t88050\n张士\t88051\n沔阳\t88052\n性功\t88053\n2集\t88054\n求拍\t88055\nustc\t88056\n啤\t88057\n广九直通车时刻表\t88058\n钱惠丽\t88059\n保健枕\t88060\n连体袜\t88061\n做自\t88062\n黄鳝鱼\t88063\n古诗歌\t88064\nemanlee\t88065\n蚂蚁网\t88066\n剖宫产\t88067\n庄主\t88068\narc\t88069\n资管业\t88070\n瑞安\t88071\n绿湖国际城\t88072\n蒂姆·伯顿\t88073\n13组\t88074\n维c银翘片\t88075\ncirros\t88076\n祠堂\t88077\n大赢家\t88078\n不样\t88079\n最后页\t88080\n济源教育网\t88081\n41\t88082\n对决\t88083\n1.pdf\t88084\n可爱版\t88085\n陈达\t88086\n智能家居公司\t88087\n未来杯\t88088\n辣妈_摇篮网\t88089\n领行\t88090\n33周\t88091\n厦府\t88092\nc50\t88093\n香珠\t88094\nbytea\t88095\nWNBA\t88096\n美斯特\t88097\n表面积\t88098\n超过6年\t88099\n催稿\t88100\n排骨\t88101\n托德\t88102\n丧尸镇\t88103\n佳能戴尔\t88104\n风吟\t88105\nquadratic\t88106\n吴晓艾玛沃特森\t88107\n全貌\t88108\nsibling\t88109\nkakao\t88110\n广州东方宾馆\t88111\n博士娃\t88112\n智方\t88113\n先予\t88114\n卡祖笛\t88115\n大涛\t88116\n龙若\t88117\n蒙嘉慧\t88118\n朵拉\t88119\np\t88120\n南京绿博园\t88121\n排污泵\t88122\n洗掉\t88123\n十二只\t88124\n网络语\t88125\n悠蓝\t88126\n限量版\t88127\n校园文化艺术节空瓶作画大赛部\t88128\nyouboy\t88129\n田氏\t88130\n360房产网\t88131\nmastercam2018\t88132\n代表方\t88133\n食疗方\t88134\n少宠\t88135\nphpword\t88136\nAUDI\t88137\n杯托\t88138\n顶山\t88139\n各地州\t88140\n王洪文\t88141\n沃尔塔瓦河\t88142\n呋塞米\t88143\n书友会\t88144\n315吧\t88145\n化妆台\t88146\n利弗莫尔\t88147\nMenuItem\t88148\nggplot\t88149\n河南建筑职业技术学院\t88150\n奴隶\t88151\n人物传记\t88152\n未来十天\t88153\n10圈\t88154\nCTX\t88155\n新石器时代\t88156\n李
瑶\t88157\n尚无\t88158\n所属\t88159\n归属地查询|手机号码举报\t88160\nPGL\t88161\n草原英雄小姐妹\t88162\n果香\t88163\n牛郎\t88164\n血红细胞\t88165\n敲缸\t88166\n吉布斯\t88167\n楽天\t88168\n柳残阳\t88169\n闻\t88170\n凯迪综合网\t88171\n帽椅\t88172\nDBCC\t88173\naffected\t88174\n第一次种\t88175\nMassa\t88176\n粒数\t88177\nforname\t88178\nafford\t88179\n二十四年\t88180\n东南大学学报\t88181\ngmw\t88182\ncontribution\t88183\n影音先锋av电影_影音先锋看片资源_影音先锋\t88184\n罗红军\t88185\n表存在\t88186\n惨遭淘汰\t88187\n鱼糕\t88188\n打枪\t88189\nVery\t88190\n宗主\t88191\n淡丽\t88192\n双杠臂屈伸\t88193\n急_作业帮\t88194\n帆游\t88195\n得过且过\t88196\nL380\t88197\n法莱\t88198\n达亿瓦\t88199\nstylesheet\t88200\n两只猫\t88201\n三年以后\t88202\n别克_君威\t88203\n0day\t88204\n李红卫\t88205\n逗逗网\t88206\n海娜花\t88207\n塑料罐\t88208\n100遍\t88209\n65型\t88210\n民族英雄\t88211\n从化\t88212\n土地出租网\t88213\n教研\t88214\n自耦\t88215\n四三二\t88216\n生死簿\t88217\n歌仔\t88218\n虎纹\t88219\n生铁锅\t88220\nVue-router\t88221\nK2P\t88222\n全国赛\t88223\n烂花\t88224\n杭州银泰百货\t88225\n易讯\t88226\n兽心\t88227\n破解器\t88228\nBRA\t88229\n热情的邻居\t88230\n超过30秒\t88231\n四大名\t88232\n刘通\t88233\n最安全\t88234\n十四期\t88235\n内藏式\t88236\n首映\t88237\n心灵深处\t88238\n代议制\t88239\n6句\t88240\n金大福\t88241\n置放\t88242\n下雨\t88243\n人员证\t88244\n1.4_\t88245\nsummarize\t88246\n2018至2020年\t88247\n流行度\t88248\n加高\t88249\n舱单\t88250\n好人生\t88251\n九马画山\t88252\n绝壁\t88253\nNorthwest\t88254\n张子良\t88255\n高满堂\t88256\n丰衣足食\t88257\n极夜\t88258\n后来人\t88259\n洁尔阴\t88260\n肚子\t88261\nSempron\t88262\nQingdao\t88263\n3dmax2009\t88264\n间隔符\t88265\nEgyptian\t88266\n筷子舞\t88267\n国家发展改革委\t88268\n曹冲称象\t88269\nead\t88270\nvain\t88271\n蝉声\t88272\nvcm\t88273\nshodo\t88274\n赫拉\t88275\nstm8\t88276\n成德善\t88277\n好慷\t88278\n害羞\t88279\n901路\t88280\n初信\t88281\n倾杯\t88282\n厦门市第五医院\t88283\n开机棒\t88284\naaaaa\t88285\n小猪猪\t88286\n巡警\t88287\n无花\t88288\n牌\t88289\n心目\t88290\n花饰\t88291\n中昊\t88292\n吴哥窟\t88293\n东方project吧\t88294\n川北在线\t88295\n石章鱼\t88296\nc226\t88297\nincentives\t88298\n爱财集团\t88299\n纯甄酸牛奶\t88300\n环球华网\t88301\n笑书神侠倚碧鸳\t88302\ncipchk\t88303\n首创置业\t88304\n荣耀5a\t88305\n110m\t88306\n团车\t88307\n悸\t88308\n标准大气压_\t88309\n版带\t88310\n评传\t88311\n山东
省交通运输厅\t88312\nFPM\t88313\nwarding\t88314\nsampling\t88315\n小飞鼠\t88316\n在户外\t88317\n鼠李糖乳杆菌\t88318\n旋合\t88319\n育婴师\t88320\n卷轴\t88321\ncups\t88322\n杭州市经济技术开发区\t88323\n五彩石\t88324\n阿斯加特\t88325\n美育节\t88326\nhitfm\t88327\nLaMw\t88328\n宿舍小区\t88329\n伊丹\t88330\nteamtalk\t88331\n任继愈\t88332\n海拍客\t88333\n做回\t88334\n三省六部\t88335\n凌豹姿\t88336\nキャッスルマイスタ\t88337\n民校\t88338\n敖日\t88339\n华尔网\t88340\n四川省公安厅\t88341\n爵\t88342\nZX300\t88343\nsherryChang\t88344\n海州\t88345\n梁朝\t88346\nfrontier\t88347\ntplus\t88348\n刮刮\t88349\n胶标\t88350\n露点仪\t88351\n清凉峰\t88352\n红岭路\t88353\n李金城\t88354\n瑞\t88355\n2012年以来\t88356\n近程\t88357\nWebMoney\t88358\n哈弗H1\t88359\n豌豆凉粉\t88360\nSEO优化公司\t88361\n邦迪\t88362\nfilter\t88363\n重庆主城区\t88364\n老朋友\t88365\n气纯\t88366\npeptide\t88367\n红枫路\t88368\n下瓦房\t88369\n中国民族网\t88370\nGeoIP\t88371\n导点\t88372\n碳化铬\t88373\nhsr\t88374\n54秒\t88375\n中国银行上海市分行\t88376\n水立方\t88377\nOffice办公助手\t88378\nAV男优\t88379\n砖塔\t88380\nzno\t88381\nDA\t88382\n117.136.8.1\t88383\n极限片\t88384\nvs2013\t88385\n左力\t88386\n虫体\t88387\n大众_车问网\t88388\n朱德\t88389\n行板\t88390\n长达\t88391\n保利花园\t88392\n咨询业\t88393\n中国青年企业家协会\t88394\n31p\t88395\n莲瓣兰\t88396\n豪迪群发器\t88397\n人教版)\t88398\n飞利浦\t88399\n460分\t88400\nuts\t88401\n每一天\t88402\n绿脓杆菌\t88403\nDeal\t88404\n秀水\t88405\nstdin\t88406\n孤岛先锋\t88407\n儿菜\t88408\n达飞金融\t88409\nidata\t88410\n声讯\t88411\nconv\t88412\nOpenSIPS\t88413\nNXP\t88414\n发展观\t88415\n法郎\t88416\n马叉\t88417\ndecorative\t88418\n350名\t88419\n虾虎鱼\t88420\n脊背\t88421\n勿用\t88422\n嘉陵\t88423\n人狼村\t88424\n小黄条\t88425\n上汽上海文化\t88426\ncute\t88427\n魔像\t88428\nIII卷\t88429\n癌细胞\t88430\nhutool\t88431\nns版\t88432\n40条\t88433\nClassLoader\t88434\n少女体\t88435\n长寿村\t88436\n无双大蛇2:终极版\t88437\n墨宝\t88438\n45月份\t88439\n绯闻女孩\t88440\n这个月底\t88441\nschwarz\t88442\n寄递\t88443\ncura\t88444\nlooper\t88445\n特拉普\t88446\n飞鸟集\t88447\n刑事专业\t88448\n杨伟\t88449\n队员\t88450\nKnight\t88451\n明世宗\t88452\n侵\t88453\n0.20\t88454\n37码\t88455\n时尚头\t88456\ninitiative\t88457\nluckykun\t88458\n日内瓦\t88459\n北京四季青\t88460\np300\t88461\n云海玉弓缘\t88462\n炒锅\t88463\n天谕幻雪\t88464\n
ENS\t88465\n松下中央空调\t88466\n江\t88467\n水体\t88468\n拧拉\t88469\n139平\t88470\nT3\t88471\n郑州市司法局\t88472\nZRainna\t88473\n样色\t88474\n国立\t88475\nQuip\t88476\n最终季\t88477\nBlake\t88478\n何辅堂\t88479\nEJ\t88480\n严不严\t88481\n福州社区\t88482\nwow.52pk.com\t88483\nchgrp\t88484\nRNGM\t88485\nsculpt\t88486\nputon\t88487\n汉中职业技术学院\t88488\n最后一夜\t88489\n黄马\t88490\n可拆式\t88491\n三体2\t88492\n荣盛\t88493\n飞蛇\t88494\n银河视点_地产\t88495\nstepper\t88496\n女网\t88497\n哥儿\t88498\nmimk\t88499\nFeign\t88500\n剑制\t88501\nGitBook\t88502\n书榜\t88503\n放动\t88504\n窜货\t88505\n王三\t88506\nCBM\t88507\n月包\t88508\n福缘\t88509\n相亲相\t88510\n初中毕业生\t88511\n身临\t88512\nSitting\t88513\n噪耳机\t88514\nescel\t88515\n一个万\t88516\n书旗\t88517\n鼻中隔偏曲\t88518\n敬事\t88519\n银河剧场\t88520\n3000亩\t88521\n美式整脊\t88522\n通关\t88523\ns7edge\t88524\n微单\t88525\n58亿\t88526\n婷婷丁\t88527\n崔如琢\t88528\n反曲吧\t88529\n曾伟\t88530\ntf\t88531\n气调\t88532\n梅兰妮\t88533\n北京延庆\t88534\n麦王争霸\t88535\n布告\t88536\n理还乱\t88537\n印金\t88538\n廉颇\t88539\n发胶\t88540\nrevisited\t88541\n搜房\t88542\n殁漂遥\t88543\n千颜篇\t88544\n字语\t88545\n页数\t88546\n工商管理学\t88547\n青岛便民网\t88548\n赖床\t88549\n你的眼泪\t88550\n吸鼻器\t88551\n马贵\t88552\n广医三院\t88553\n完形\t88554\nKIKO\t88555\n钱粮\t88556\n橙色\t88557\n法属波利尼西亚\t88558\n273二手车网\t88559\nJOURNEY\t88560\nshinian\t88561\n兵器库\t88562\n一言通天\t88563\nPortable\t88564\n王羲之\t88565\n激情性\t88566\ncottage\t88567\n刺梨\t88568\n箭馆\t88569\ncocoscreator\t88570\nvet\t88571\n对虾\t88572\n天蚕\t88573\n房王网\t88574\n犯罪\t88575\nqia\t88576\n制空权\t88577\n善融商城\t88578\n郑州信息科技职业学院\t88579\n突击队\t88580\n煎茶\t88581\n苏宁金融\t88582\nPINK\t88583\n不变资本\t88584\ngdswdx\t88585\n一瘸一拐\t88586\n九命猫\t88587\n摆脱\t88588\n屈服强度\t88589\nLPL英雄联盟职业联赛\t88590\n香港机场\t88591\n永定区\t88592\n脑子\t88593\n20150528\t88594\n2晚\t88595\n天擎\t88596\n逼\t88597\n开入\t88598\nchinaifne\t88599\n720p.BD中英双字幕迅雷\t88600\n三国乱世\t88601\n杉杉集团\t88602\n加热垫\t88603\n谭波\t88604\n美女\t88605\n55P\t88606\n庄辰超\t88607\n高总\t88608\nataraxia\t88609\n美秀美术馆\t88610\n119手游网\t88611\n宏兴\t88612\napt-cache\t88613\n亲戚们\t88614\n人教版小学四年级语文下册\t88615\n独具\t88616\n农机展\t88617\ncoatings\t88618\nEB5\t88619\
n红鲤鱼\t88620\n浙江财经大学东方学院\t88621\n唾液\t88622\n变废为宝\t88623\n非甲烷总烃\t88624\npiping\t88625\n钢企\t88626\ndh\t88627\n刘贝\t88628\n蒋琬\t88629\n视黄醇\t88630\n艺术画\t88631\nRefactoring\t88632\n鼻梁骨\t88633\n炽热狙击\t88634\n金牡丹\t88635\n警护人\t88636\nOrigin\t88637\n秦巴山区\t88638\n哪几天\t88639\n9013\t88640\nBarrels\t88641\ncsv\t88642\nincurred\t88643\n立创商城\t88644\n1970年1月1日\t88645\n赖弘国\t88646\n机战王\t88647\n六张\t88648\n涂涂\t88649\n北京航天桥\t88650\n大生意\t88651\n汤神\t88652\n274号\t88653\n王律师\t88654\n美国海军陆战队\t88655\nnextTick\t88656\n薄板\t88657\n26步\t88658\n襄阳职业技术学院\t88659\nljd\t88660\nconfidential\t88661\nreusable\t88662\n碧丽\t88663\n肝\t88664\n敬业\t88665\n程序库\t88666\n华扬\t88667\n朝日\t88668\n书博会\t88669\n三维矩阵\t88670\n比昂\t88671\n砺剑\t88672\n张志国\t88673\n新编英语教程\t88674\nwh-1000xm2\t88675\n8下\t88676\n静态ip\t88677\n怎样样\t88678\n7110\t88679\n夜生活\t88680\n摩根弗里曼\t88681\n美好的未来\t88682\n吸脂\t88683\nSuperWingy\t88684\n幽雅\t88685\n阳光男孩\t88686\n咯\t88687\n竖炉\t88688\n阿赫利\t88689\n4.0|\t88690\n笑口\t88691\n美树\t88692\n麦客CRM\t88693\n立耳\t88694\n破口\t88695\n葵花盘\t88696\n奥利维亚·库克\t88697\n雷克萨斯LS\t88698\n丝轮\t88699\n嗜酸乳杆菌\t88700\n省林业厅\t88701\n纽曼思\t88702\n果心\t88703\n红荔村\t88704\nGROMACS\t88705\n双六\t88706\nFUZE\t88707\nnovo\t88708\n中央国家机关政府采购中心\t88709\n上海文庙\t88710\n宋凯\t88711\n能量\t88712\n作甚\t88713\n合肥城建\t88714\n大学生村官网\t88715\n止戈\t88716\n计生委\t88717\n符拉迪沃斯托克\t88718\n主炮\t88719\n定义表\t88720\n裂果\t88721\nMllib\t88722\n99式主战坦克\t88723\n烟花三月\t88724\n弯字机\t88725\n张丽\t88726\n松岛枫\t88727\npocky\t88728\nrayfile\t88729\n双特\t88730\nelephant\t88731\nmoren\t88732\nReasoning\t88733\n工商质监局\t88734\nfurla\t88735\n李霞\t88736\nfishes\t88737\n条条框框\t88738\n阿里通\t88739\n弧形\t88740\nfrustrated\t88741\nConcurrent\t88742\n省委宣讲团\t88743\n天天排行网\t88744\n金柳妍\t88745\n黄永玉\t88746\nDeform\t88747\n欧风\t88748\n云都\t88749\n厉兵秣马\t88750\n橙光\t88751\n桧柏\t88752\n鹤影天青\t88753\n危运\t88754\n仁术\t88755\nWind资讯\t88756\n东北证券股份有限公司\t88757\n铁血联盟\t88758\n奥维尔\t88759\n直线运动\t88760\n天使之翼2\t88761\n收放机\t88762\n盖过\t88763\n中汇会计师事务所\t88764\n五感\t88765\n0597\t88766\nYour\t88767\n可采莲\t88768\n南开大学经济学院\t88769\n两河镇\t88770\nmagicad\t88771\n
咬唇\t88772\n食粮\t88773\n荷载\t88774\n城市管廊\t88775\n台地\t88776\n针口\t88777\n汇美\t88778\n火凤\t88779\n试想\t88780\n妤\t88781\n献媚\t88782\n张弘\t88783\n五门\t88784\n离岗\t88785\n中胜\t88786\n血热\t88787\n跳线\t88788\n中国机械工业联合会\t88789\n主题馆\t88790\n东立\t88791\n笔仙\t88792\n汽艇\t88793\nChizhouRen\t88794\n4x^2\t88795\n4399保卫萝卜3\t88796\n一次项\t88797\n乐Max\t88798\nabigaile\t88799\n香屯\t88800\n松滋人网\t88801\n温性\t88802\n回波损耗\t88803\n水运\t88804\n猜谜语网\t88805\n吞咽困难\t88806\n江北农场\t88807\n第44页\t88808\n宣武区\t88809\n金仙\t88810\n华德福\t88811\n鬼\t88812\n茶山竹海\t88813\n王志军\t88814\n喷射战士2\t88815\n膻中\t88816\n查字典网\t88817\n急袭\t88818\n第二辑\t88819\n猎色\t88820\n血淋淋\t88821\n上海发那科机器人有限公司\t88822\n脱轴\t88823\n奥园翡翠东湾\t88824\n兰贵人\t88825\n红色警戒2尤里的复仇\t88826\nguangzhou\t88827\n叶小纲\t88828\n总结篇\t88829\n济南的冬天\t88830\n北京交通大学研究生院\t88831\n180315\t88832\n火疙瘩\t88833\nCoin\t88834\n泰晓科技\t88835\n国籍\t88836\n蛋糕\t88837\n我的丑娘\t88838\n湘东区\t88839\n0x000000d1\t88840\n自由度机器人\t88841\n挂失\t88842\n海拉\t88843\n江湖三十年\t88844\n匿名类\t88845\n妮可罗宾\t88846\n手绘动漫\t88847\n细胞库\t88848\n大同杯\t88849\n世系\t88850\n照例\t88851\nudhcpc\t88852\n城商\t88853\n孙青\t88854\n理光复合机\t88855\ndepfile\t88856\nkvm虚拟机\t88857\n话话\t88858\n李志伟\t88859\n小高层\t88860\nscrivener\t88861\n绸\t88862\n紫云轩\t88863\n0.3毫米\t88864\n兴区\t88865\n童泰\t88866\ncoredata\t88867\n小小说\t88868\n张永利\t88869\n校友们\t88870\n硬盘分区表\t88871\n枸橼酸钠\t88872\n北京站\t88873\n7060\t88874\n美食库-倾品加盟网\t88875\n付件\t88876\n尺寸\t88877\n企业博客_企博网\t88878\nsend_keys\t88879\n龙文区\t88880\n3120\t88881\n北极狼\t88882\n陈可\t88883\nnaming\t88884\n永恒战士2\t88885\n黑文\t88886\n6000k\t88887\n反杀\t88888\n越城区政府\t88889\n中国路\t88890\n哭鼻子\t88891\n水锈\t88892\n着火\t88893\n韩熙\t88894\n李阳冰\t88895\n昆仑决世界搏击中心\t88896\n向后\t88897\n国旅联合\t88898\n王寅\t88899\n华夏文旅\t88900\n宣汉\t88901\nPDCA\t88902\n美琳\t88903\n福建水利电力职业技术学院\t88904\n缙云新闻网\t88905\n局域网\t88906\n钟点工\t88907\n创建人\t88908\n灵珑\t88909\n贾乃冒险岛\t88910\nAggregations\t88911\n赢取\t88912\nmsde\t88913\n科纳\t88914\n奥翔药业\t88915\n5月29日\t88916\n星主\t88917\n富安娜\t88918\n钏\t88919\nbol\t88920\n76个\t88921\n国美电器\t88922\n前后端\t88923\n透明装\t88924\n义理\t88925\ni54590\t88926\n莫德斯特\t88927\n蓝天野
\t88928\nMusicians\t88929\n山西省肿瘤医院\t88930\n蚂蚁金服\t88931\n免一补\t88932\n输血\t88933\n换肤\t88934\n孔蒂\t88935\n连发\t88936\nCCIC\t88937\n鄱阳县\t88938\n巡航机\t88939\n广告女\t88940\nqepb\t88941\n微商博览会\t88942\n曼昆经济学原理\t88943\n鞘膜积液\t88944\n考虑到\t88945\n社会资本合作\t88946\n坎坷之路\t88947\n疯魔\t88948\n供应链融资\t88949\n061\t88950\n草莓苗\t88951\n玛丽鱼\t88952\n姚波\t88953\n1032\t88954\n拱度\t88955\n陆依萍\t88956\nxiuyixiu\t88957\n3_\t88958\nHeadphones\t88959\nnewgame\t88960\n2004年度\t88961\nkeji\t88962\nDOB\t88963\n微贷\t88964\nceleron\t88965\n漫山\t88966\n1.2万元\t88967\n形貌\t88968\n滨海湿地\t88969\n福彩3D走势图\t88970\ngensim\t88971\nknockout\t88972\n横梁\t88973\n新枝\t88974\n剑网3剑胆琴心\t88975\n阿康\t88976\n飞铲\t88977\n834\t88978\n下令\t88979\n宁波科学中学\t88980\n晋江万达广场\t88981\nzipkin\t88982\n共青森林公园\t88983\n牛肉块\t88984\n无纸化\t88985\n心肌梗死\t88986\n砖厂\t88987\nbasedir\t88988\n柔光灯\t88989\n酒精测试仪\t88990\n微电影节\t88991\nsimplified\t88992\n在身边\t88993\nStud\t88994\n12件套\t88995\n分叉\t88996\n精细度\t88997\n谷居网\t88998\nhoma\t88999\n300环\t89000\n7000亿\t89001\n中国铁路客户服务中心\t89002\nPrism\t89003\n数百年\t89004\n免于\t89005\n江西广电\t89006\n橡胶片\t89007\novh\t89008\n无须\t89009\n10.6.5\t89010\n查尔星港\t89011\n误动\t89012\n国民党政府\t89013\n高筒靴\t89014\n687\t89015\n盟誓\t89016\n答句\t89017\n意大利大学\t89018\n二三维\t89019\n腹股沟斜疝\t89020\n和平使者\t89021\ncreater\t89022\n淮安市第一人民医院\t89023\n放卖\t89024\nwreck\t89025\n釜式反应器\t89026\n文献港\t89027\n饶平县\t89028\n蔡英\t89029\n专区\t89030\nskd\t89031\n演唱者\t89032\n金掌柜\t89033\n挂倒\t89034\n826\t89035\n叶梦得\t89036\n无限世界\t89037\n白沙河\t89038\njed\t89039\nHQC\t89040\n厨王\t89041\n趋近于\t89042\n痄腮\t89043\n环戊酮\t89044\n邹姓\t89045\nAngelo\t89046\n第二话\t89047\n龙在边缘\t89048\n海山\t89049\n扩招\t89050\n女猎人\t89051\nyuyan\t89052\n2005届\t89053\n仁爱路\t89054\n康宁街\t89055\n治疗费\t89056\nIon\t89057\n高级英语\t89058\nSNMP\t89059\n创驰蓝天\t89060\n旅顺南路\t89061\n替莫唑胺\t89062\n360云盘\t89063\n&#65279\t89064\n报检单\t89065\n責\t89066\n教育部高等教育教学评估中心\t89067\n18级\t89068\n各族\t89069\n陕西省\t89070\n环境学院\t89071\n房地产类\t89072\ndrawboard\t89073\n汉克斯\t89074\n因由\t89075\n悬殊\t89076\n初三\t89077\n触怒\t89078\n园冶杯\t89079\n二甲基硅油\t89080\n哥大\t89081\n拍片\t89082\n育路上海国际
学校网\t89083\n师范大学\t89084\n叽萝\t89085\n拉斐尔\t89086\nnutritional\t89087\nsyou\t89088\n肉漫画\t89089\nSports\t89090\n汤盈盈\t89091\n思辩\t89092\nPE版\t89093\n海鲈鱼\t89094\n美国商学院\t89095\n自罚\t89096\n家主\t89097\n建设项目环境保护管理条例\t89098\n藿香清胃胶囊\t89099\n邱少云\t89100\nad09\t89101\nk41\t89102\n玉兰广场\t89103\n弹弓\t89104\n叶蜂\t89105\n砂尘\t89106\n挂瓷\t89107\n千倍币\t89108\n淮海西路\t89109\n8\t89110\n桩体\t89111\n中经堂\t89112\n痴情玫瑰花\t89113\nUNITED\t89114\n大肆\t89115\n凤尾\t89116\n深湛\t89117\n任尔\t89118\n传奇加速器\t89119\n乐视网\t89120\n20180104\t89121\n宁波百姓网\t89122\n68页\t89123\n时域\t89124\n科龙空调\t89125\n广义相对论\t89126\n转学籍\t89127\nEIZO\t89128\n2.01\t89129\nRepositories\t89130\n刽子手\t89131\n主子式\t89132\n肖骁\t89133\n锤哥\t89134\nEFF\t89135\nICOCA卡\t89136\n圣元\t89137\n管理学原理\t89138\n面阵\t89139\n特饮\t89140\n沐浴桶\t89141\n晚10点\t89142\n练成\t89143\n国会\t89144\nVB6论坛\t89145\n园区\t89146\n我们的歌\t89147\nDara\t89148\ncskin\t89149\n230V\t89150\n县县\t89151\n盆友\t89152\n集团股份有限公司\t89153\n湘湘\t89154\n加荷\t89155\nislands\t89156\n情海\t89157\nITU-T\t89158\n踢裆\t89159\n拾贝\t89160\n同课异构\t89161\n凯丽\t89162\n报于\t89163\n钻套\t89164\n一盎司\t89165\n相去\t89166\n自由线\t89167\n倍频\t89168\n莱维特\t89169\n下载吧软件站\t89170\n黄芪精口服液\t89171\n唐长安城\t89172\n愤世嫉俗\t89173\n天经地义\t89174\ndayin\t89175\n惠风书院\t89176\n方孔\t89177\n杭州长江汽车有限公司\t89178\n怀揣\t89179\n长一短\t89180\n暴赚\t89181\nconquest\t89182\n反补贴\t89183\n主餐\t89184\nyouk\t89185\n省管县\t89186\n白地\t89187\n篇幅\t89188\nhover事件\t89189\n抵押贷\t89190\n吴泓\t89191\n宠辱不惊\t89192\n152号\t89193\n华夏手游\t89194\n氯碱\t89195\n吉尼\t89196\n中联办\t89197\nWinery\t89198\nR9S\t89199\n通风风管\t89200\nrp8\t89201\n灯库\t89202\n172项\t89203\n分布器\t89204\n3322\t89205\n船只\t89206\n金地玺华邨\t89207\nTara\t89208\n10000毫安\t89209\n断语\t89210\nweka\t89211\n靶向\t89212\n第一点\t89213\n广州中医药大学\t89214\n东台人才网\t89215\n余香\t89216\n镭战\t89217\n钟表网\t89218\n羊毛绒\t89219\n法道\t89220\n青林\t89221\n丁桂鱼\t89222\n通宇\t89223\n悄\t89224\n利港\t89225\n愚公\t89226\n四目\t89227\n金元宝\t89228\n1891年\t89229\n重庆国际博览中心\t89230\n江北新闻网\t89231\n菠萝社\t89232\n邓柯\t89233\n围蔽\t89234\n驻防\t89235\n代接\t89236\n武宣政府网\t89237\n我要做好孩子\t89238\n6道\t89239\n褥\t89240\n旗委\t89241\n天条\t89242\
n入門\t89243\nvoxel\t89244\n泰安泰山区\t89245\nTDC\t89246\n184个\t89247\n靖宇\t89248\nn25\t89249\n丢包\t89250\nWound\t89251\n814\t89252\n江浩坤\t89253\n创科\t89254\nbakari\t89255\n运动袜\t89256\n大鹏湾\t89257\n河口湾\t89258\n不定向\t89259\n过氧化钠\t89260\n24节气\t89261\n吉林省政协\t89262\nBrake\t89263\nCrontab\t89264\noff-white\t89265\n临沂市人民政府\t89266\nwget\t89267\n娴熟\t89268\n海晏河\t89269\n搔痒\t89270\n范范\t89271\n恒晟\t89272\n中国新相亲\t89273\n垂帘\t89274\n巫师3:狂猎\t89275\n20150313\t89276\n压皱\t89277\n黔东南政务一站通\t89278\n唐棣\t89279\n2针\t89280\n五一街\t89281\n郝云\t89282\n最低限度\t89283\npriest\t89284\nxscmis\t89285\nTickets\t89286\n2018年03月25日\t89287\n族语\t89288\nTrap\t89289\nPCon\t89290\n第七十七条\t89291\n濑桃\t89292\n苗岭\t89293\n包络\t89294\n高下立判\t89295\ndrunk\t89296\n我不懂\t89297\n今永纱奈\t89298\n金镶玉\t89299\nev300\t89300\n呼麦\t89301\n商通\t89302\n古北\t89303\n第十八回\t89304\n20151112\t89305\n力学系\t89306\nCPU版\t89307\n惠邦\t89308\n拨冗\t89309\n崇德\t89310\n常盘贵子\t89311\n终端\t89312\n刘叉叉\t89313\nCuckoo\t89314\n超凡神眼\t89315\n苏黄\t89316\n毕业论文任务书\t89317\nNW-A45\t89318\n5471\t89319\n祈祷\t89320\ndataSource\t89321\n310S\t89322\n千岛湖开元度假村\t89323\n奇脉\t89324\n42类\t89325\n省政\t89326\n国际贸易实务\t89327\nreentrant\t89328\n查去\t89329\n优秀者\t89330\n黛珂\t89331\nLobby\t89332\n百度播放器\t89333\nL4168\t89334\n移动全球通\t89335\n20151219\t89336\n外编\t89337\n快乐的时光\t89338\n晨间剧\t89339\n艾希\t89340\n脸模\t89341\n扎金索斯岛\t89342\n经济界\t89343\n微势力\t89344\nLodging\t89345\n非金融企业债务融资工具\t89346\n某女\t89347\n北京国际饭店\t89348\n304#\t89349\n心之所向\t89350\n夏晴\t89351\n佳能g7x\t89352\n一辆\t89353\n达衣岩\t89354\n婚外初夜\t89355\n春歌\t89356\n批量\t89357\n淳道字画网\t89358\n最后一道题\t89359\n发货地\t89360\n人文性\t89361\nSSL证书\t89362\n修水县\t89363\nCroatia\t89364\n索玛花开\t89365\n彭亮\t89366\n结爱\t89367\n丑石\t89368\n一次一次\t89369\n妖猫星球大战\t89370\n李晓玲\t89371\n奈姆希\t89372\n发动机护板\t89373\n萧萧\t89374\nMamiya\t89375\n第二夜\t89376\n市外\t89377\n十一个月\t89378\n親子王國\t89379\n茅台路\t89380\nsoui\t89381\n天纪\t89382\n罗技g603\t89383\n通河新村\t89384\n爱斯\t89385\n庄荣文\t89386\nasco\t89387\nWatson\t89388\n0660\t89389\n航空\t89390\n暗卢克\t89391\n墨尔本皇家理工大学\t89392\n蜀山缥缈录\t89393\n农经\t89394\n剑网三天策\t89395\n起因\t89396\n孤独者
\t89397\n960_\t89398\n以身试法\t89399\n081\t89400\n市场调节价\t89401\n吴辉\t89402\n穿衣搭配网\t89403\n互助组\t89404\n9.22\t89405\n百公里\t89406\n薛蛮\t89407\n过氧化氢溶液\t89408\n文津图书奖\t89409\nfont-face\t89410\n绿泥\t89411\n盘车\t89412\n95届\t89413\n熊吉\t89414\n滑板场\t89415\n博越_吉利汽车_吉利汽车\t89416\n2219\t89417\n补铁剂\t89418\n武直\t89419\n辽工大\t89420\n天府大道南段\t89421\nie浏览器\t89422\n前田香织\t89423\n第二梦\t89424\nEAR\t89425\n折纸花\t89426\n千日红\t89427\n运动衫\t89428\n不饱和度\t89429\n第【2】页\t89430\n柔性\t89431\n万用金\t89432\n最佳现场\t89433\n王室\t89434\n浙大图书馆\t89435\n森林狂想曲\t89436\nFry\t89437\ns4\t89438\n脉波\t89439\n花舞\t89440\n地砖\t89441\nK站\t89442\n新马泰\t89443\n云南省肿瘤医院\t89444\n聂将军\t89445\n相当于人\t89446\n魏千翔\t89447\n龙脉石\t89448\nFit\t89449\ngrb\t89450\n2299元\t89451\nModo\t89452\n佳能m5\t89453\n牛头械王\t89454\n0.3.1\t89455\n翡翠大道\t89456\n同学会\t89457\n三沙源\t89458\nnumpy,scipy\t89459\n市政管理局\t89460\n明月山庄\t89461\n教师资格条例\t89462\n盲审\t89463\n章邯\t89464\n水口钳\t89465\n多功能型\t89466\ntokio\t89467\n秦砖汉瓦\t89468\n王蒙丁俊晖\t89469\n爱普生630K\t89470\n姑息治疗\t89471\n12530\t89472\n不堪其扰\t89473\n青岛市国家税务局\t89474\n迅闪\t89475\n选举制\t89476\n丰庆公园\t89477\n粤沪\t89478\n汽车维修技术网\t89479\n一语成谶\t89480\n类变量\t89481\ntroubled\t89482\n蚂\t89483\nbird\t89484\n主格\t89485\n不入\t89486\n撑\t89487\npixie\t89488\n徐雷\t89489\n龙元\t89490\n征服\t89491\n务川县\t89492\nhandshaker\t89493\nRIDE\t89494\n10007\t89495\n博瑞论坛_汽车之家论坛\t89496\n艳欲\t89497\n又说\t89498\n赶会\t89499\n20160126\t89500\n竟是因为\t89501\n苏辙\t89502\n松岛\t89503\nz9\t89504\n3000万\t89505\n书信\t89506\n南川\t89507\n2138\t89508\n格子裤\t89509\n活用\t89510\n赌马\t89511\n孙艳\t89512\npuffin\t89513\n谪\t89514\nNapoli\t89515\n营业外收支\t89516\nBlinds\t89517\nhorses\t89518\n勉励\t89519\n41002\t89520\n博德之门增强版\t89521\n十三大\t89522\n310国道\t89523\n柠檬酸钠\t89524\nopencv4\t89525\nXP\t89526\n过关类\t89527\n大鸭梨\t89528\nBelts\t89529\n甘油三酯\t89530\n附图\t89531\n1-7天\t89532\n加权指数\t89533\nfunnel\t89534\n操作柱\t89535\n3D图\t89536\n老六路\t89537\nzoho\t89538\nplatforms\t89539\n25.com\t89540\n能带走\t89541\n对外汉语\t89542\n62平米\t89543\nyandex\t89544\n想了想\t89545\n生化危\t89546\n代管\t89547\n日本豆腐\t89548\nurl-loader\t89549\n佩剑\t89550\n科学家们\t89551\
n鉴定所\t89552\n阿赞\t89553\n70D\t89554\nVariants\t89555\n王鸿飞\t89556\n88名\t89557\nCGFloat\t89558\nsolt\t89559\n摄氏度\t89560\n上海国际酒店用品博览会\t89561\n文房\t89562\n智悲德育网\t89563\n进编\t89564\n魔手\t89565\nSEM\t89566\nnetmeeting\t89567\n群聊\t89568\ndak\t89569\n幸运\t89570\n玩具版\t89571\n前海证券\t89572\nsimatic\t89573\nnetac\t89574\n法制化\t89575\nstc12c5a60s2\t89576\n反意词\t89577\n山东仁信\t89578\n梅州市梅县区人民政府\t89579\n蚀\t89580\nnividia\t89581\n净化板\t89582\nqui\t89583\n德音\t89584\n疾走\t89585\n国家地方联合工程研究中心\t89586\n枪声\t89587\n樊嘉\t89588\n长筒袜\t89589\n张允\t89590\n红塘湾\t89591\n北京市地震局\t89592\n非金属元素\t89593\n砂板\t89594\n大家保保险网\t89595\n养阴清肺丸\t89596\nSomeone\t89597\ntouchid\t89598\n燃烧器\t89599\nRepeater\t89600\n紫蝴蝶\t89601\n昆城广场\t89602\n52度\t89603\njfinal\t89604\nX-family\t89605\n尤姆\t89606\n过墙\t89607\nwpd\t89608\nFinal\t89609\n电放费\t89610\n邮储\t89611\n教育综合知识\t89612\n米棒\t89613\n美知广子\t89614\n1:43\t89615\n毛坦厂\t89616\n光波\t89617\n云南联通\t89618\nsloan\t89619\n深圳市南山区人民医院\t89620\n8M\t89621\n稿酬\t89622\n妙龄少女\t89623\n梁兴初\t89624\n福建省通信管理局\t89625\nMindManager2018\t89626\nBUMP\t89627\n疲软\t89628\n充值券\t89629\n更轻松\t89630\n欧洲卡车模拟2吧_\t89631\n3.0000元\t89632\n北京市海淀区地方税务局\t89633\n第十六期\t89634\n王恩\t89635\n朱姆沃尔特\t89636\n潮安县\t89637\n測試\t89638\ntogether3\t89639\nstream\t89640\n麦拉\t89641\n武阳\t89642\ncallme\t89643\n李大钊\t89644\nOlive\t89645\n百度糯米站\t89646\n索纳塔\t89647\nwhom\t89648\n线号机\t89649\n沪上\t89650\n败家\t89651\n国有企业\t89652\nmythology\t89653\n62年\t89654\n睡裤\t89655\n花山区\t89656\n阿瑞斯\t89657\n威海文登\t89658\nflac\t89659\n上户彩\t89660\nNashorn\t89661\n荧光字\t89662\n干型\t89663\n克拉玛依日报社\t89664\n金光湖\t89665\n多党合作制度\t89666\nSVC\t89667\n统计局\t89668\nAvenir\t89669\n郑开大道\t89670\n分布区\t89671\n德胜门中医院\t89672\nCetaphil\t89673\nBowers\t89674\n瓯柑\t89675\nMachinery\t89676\n新少林五祖\t89677\n卷积层\t89678\n工藤美纱\t89679\n牧场物语\t89680\n仙宝\t89681\nZSJ\t89682\nQQ空间日志\t89683\n庄河\t89684\nauthorization\t89685\n驱动人生6\t89686\n长物志\t89687\n2018年2月22日\t89688\n星石\t89689\nBaptist\t89690\n涨停股\t89691\n红眼病\t89692\n武汉大学计算机学院\t89693\n5件\t89694\n滴注\t89695\n皮瓣\t89696\n锹\t89697\n载带\t89698\n斯蒂格利茨\t89699\ndrs\t89700\n质检
总局\t89701\ntrunk\t89702\n前入式\t89703\nfreex性\t89704\n交通法\t89705\n蕾玖\t89706\nvmware-install.pl\t89707\n猪肘\t89708\n奚国华\t89709\n太平间\t89710\n恋心\t89711\n董事会办公室\t89712\n食人鱼\t89713\njk罗琳\t89714\n狸窝家园\t89715\n西岳清风网\t89716\nrosi小莉\t89717\n志军\t89718\n文明路\t89719\nCoolerMaster\t89720\nbootstrap3\t89721\nCams\t89722\n梁冠华\t89723\n商业承兑汇票\t89724\n房车赛\t89725\n完税凭证\t89726\n氢氧化铝\t89727\n福建省交通运输厅\t89728\n扎佐镇\t89729\n云岩\t89730\n容身\t89731\n桂湖\t89732\nshwin\t89733\n王丽坤\t89734\n奶块\t89735\n毒哑\t89736\n小冰\t89737\n小伙儿\t89738\nMemoirs\t89739\n非人传祺\t89740\nPECL\t89741\n召稼楼\t89742\n北马\t89743\n鱼卡\t89744\n050\t89745\n美景天城\t89746\n七里镇\t89747\n科级\t89748\n震动\t89749\n有笑\t89750\n桑德兰\t89751\n伦巴舞\t89752\n熠\t89753\n爱的世界只有你\t89754\n首行缩进\t89755\nbokeh\t89756\n事业单位考试网\t89757\n辣子鸡丁\t89758\n银鸽\t89759\n胖仔\t89760\n电镀线\t89761\n实践论\t89762\n小鼓\t89763\n霍奇金淋巴瘤\t89764\n法制观念\t89765\n二下\t89766\n伊乐藻\t89767\n北京军颐中医医院\t89768\nV1.0.0_\t89769\n向前走\t89770\n涵碧楼\t89771\n研工部\t89772\n急_\t89773\n上海大学\t89774\n欠帐\t89775\n通山县\t89776\n高瓴\t89777\nPurchasers\t89778\n外业\t89779\n11类\t89780\nqinqin\t89781\n复地上城\t89782\n红九\t89783\n齐麟\t89784\n800亩\t89785\nChrome浏览器\t89786\n欧冠杯\t89787\n黄泉\t89788\n左爱\t89789\n严昊\t89790\n编研\t89791\n第一座\t89792\n逃过\t89793\n马印航空\t89794\njdg管\t89795\nも\t89796\n周2\t89797\nnnm\t89798\n亲润\t89799\ndelta\t89800\nrx560\t89801\n举人\t89802\n黄河颂\t89803\n小厮\t89804\n够了\t89805\n浙江在线环保新闻网\t89806\n含硫\t89807\nfsv\t89808\n后援会\t89809\n净收益\t89810\ndtree\t89811\n葱兰\t89812\n恶之华\t89813\n雨欣\t89814\n技术贴\t89815\n大瓦山\t89816\n成比例\t89817\n方德\t89818\nceil\t89819\n160826\t89820\n养壶\t89821\n讀\t89822\n横表\t89823\n12305\t89824\nflowers\t89825\n黄河石林\t89826\nfba\t89827\n青县\t89828\nH11\t89829\n西瓜影院\t89830\n台州市住房和城乡建设局\t89831\n直聘\t89832\ndispatch\t89833\n外阴_\t89834\nT420\t89835\n威尔伯\t89836\n编写\t89837\n区域性\t89838\nhepatic\t89839\n齿差\t89840\n大众Arteon\t89841\n上海土地公\t89842\n学长\t89843\n元老赛\t89844\n寻花问柳\t89845\nBenny\t89846\nyili\t89847\n御龙山\t89848\n蒙汉\t89849\nBEN\t89850\nOrderBy\t89851\n冷慕宸\t89852\n汝城县\t89853\n居住证签注\t89854\n垂直类\t89855\n审查员\t89856\nLiquibase\t8
9857\n云集镇\t89858\n床套\t89859\n84亿\t89860\n级部\t89861\nCRA\t89862\n渡路\t89863\n友利银行\t89864\n负气\t89865\ntau\t89866\n衰女\t89867\n中小学校\t89868\n如梦\t89869\n危险化学品经营许可证管理办法\t89870\naja\t89871\n试营\t89872\n思贤\t89873\nshafen\t89874\n4月底5\t89875\n消杀\t89876\n面皮机\t89877\n六安市区\t89878\n华印社区\t89879\n强劲\t89880\n广海\t89881\n大佛山\t89882\n成都市国有资产监督管理委员会\t89883\n羽山路\t89884\n天天在\t89885\n沫鱼\t89886\n抽泣\t89887\n甜水园\t89888\n花根\t89889\n萨内蒂\t89890\nPlank\t89891\n上海建工集团\t89892\n自然特征\t89893\n环境部\t89894\n广州市社会保障\t89895\n防盗锁\t89896\n通联支付\t89897\nparcelable\t89898\n牵头\t89899\n_批\t89900\n3者\t89901\n自不量力\t89902\n治文\t89903\n星光大道\t89904\nrates\t89905\n闫盼盼\t89906\n精细胞\t89907\n第6个\t89908\n地政\t89909\n中门\t89910\n2627\t89911\n菜鸟联盟\t89912\n8步\t89913\n疼痛难忍\t89914\nHeader\t89915\n谱牒\t89916\n2017年7月28日\t89917\n前几天\t89918\n7C教育资源网\t89919\n凹凸性\t89920\n堆上\t89921\nSHARK\t89922\n上天\t89923\n金屋藏娇阁\t89924\n数十家\t89925\n清夜\t89926\n秀秀菜园\t89927\n众家\t89928\n数据中心版\t89929\nCat\t89930\n巢房网\t89931\n实变函数论\t89932\n抽疯\t89933\nGenerations\t89934\n神油\t89935\n临沂路\t89936\n街车\t89937\n广发藏品网\t89938\n墨尔本\t89939\n7000个\t89940\n随心谈\t89941\n混账\t89942\nIpod\t89943\n仙境\t89944\n畅言网\t89945\n音乐餐\t89946\n200#\t89947\n退钱\t89948\n心机女\t89949\n管理会计\t89950\n浩方对战平台\t89951\n西山学校\t89952\n地球币\t89953\n绥滨\t89954\ncollision\t89955\nscp\t89956\nCORPORATION\t89957\n天狮\t89958\n这一仗\t89959\niges\t89960\n熬制\t89961\n年终总结\t89962\n邻苯二酚\t89963\n冥主\t89964\nbard\t89965\n韩游网\t89966\n热带地区\t89967\n樱花大战\t89968\n无限小说网\t89969\nipsos\t89970\n坚忍\t89971\n就业指导服务中心\t89972\n储器\t89973\n000810\t89974\n病证\t89975\n外链接\t89976\n謎\t89977\n畅销\t89978\nEliot\t89979\n天津湾\t89980\n93年\t89981\n逸仙路\t89982\n非执行董事\t89983\n侠之大者\t89984\nTKS\t89985\n一致同意\t89986\n等效电路图\t89987\n425号\t89988\n中铁十九局\t89989\n秦宇\t89990\nv5.6.0\t89991\nCCTV6\t89992\n10^3\t89993\n资本收益率\t89994\n新田360广场\t89995\n怪化\t89996\nEP.1\t89997\n地型\t89998\n高分子\t89999\n大熊猫\t90000\nCeltic\t90001\n台次\t90002\n北京国资委\t90003\n轮滑\t90004\n非英语\t90005\n20170316\t90006\n沉舟侧畔千帆\t90007\nBBC\t90008\n中国移动终端公司\t90009\n影志\t90010\n天冬\t90011\nAFC\t90012\n安阳\t90013\n细
胞分裂6\t90014\nwinky\t90015\n卡利姆\t90016\n怜\t90017\n上海迪士尼乐园\t90018\n樱花树\t90019\n18只\t90020\n克莱默\t90021\nnds4\t90022\n130亿\t90023\n深冬\t90024\n给你们\t90025\n象限\t90026\n田蕊妮\t90027\n西北院\t90028\n中国连山网\t90029\n97.2\t90030\n大阪环球影城\t90031\n暴雪娱乐\t90032\nFST\t90033\n微电台\t90034\nWizard\t90035\n2017年11月1日\t90036\n成都世纪城新国际会展中心\t90037\n二人转\t90038\n跳一跳\t90039\nDRP\t90040\n荣耀畅玩5\t90041\n文伟\t90042\nMaid\t90043\n沁园春·雪\t90044\n91片片\t90045\n有限空间\t90046\n苟延残喘\t90047\n脂蛋白\t90048\n网优\t90049\n代理价\t90050\n我的爱好\t90051\nMediaTek\t90052\n不敢爱\t90053\n红樱\t90054\n朝天门\t90055\n替芯\t90056\n有害\t90057\ntomany\t90058\n程序包\t90059\n补救\t90060\n水培花卉_浴花谷花卉网\t90061\n上海大剧院\t90062\n七步洗手法\t90063\n钦州保税港区\t90064\n江西日报\t90065\n郑机\t90066\n收腹裤\t90067\n独立游戏吧\t90068\n钱海\t90069\nSuburbs\t90070\n画水镇\t90071\n舒悦\t90072\n小儿消化不良\t90073\n北京体育大学研究生院\t90074\n呼朋唤友\t90075\n卡蜜儿\t90076\n胁持\t90077\n烂醉如泥\t90078\n浣若君\t90079\n周巷镇\t90080\n母亲的恩情\t90081\nguode\t90082\n刘黎明\t90083\n555号\t90084\n辽宁省交通高等专科学校\t90085\n滴神\t90086\n老段\t90087\n刘谌\t90088\n阿月\t90089\n割让\t90090\n提词器\t90091\n公道\t90092\n樱空释\t90093\n竞彩足球\t90094\n3.5个\t90095\n萎\t90096\n引领型\t90097\nwsdd\t90098\n祸事\t90099\n创战\t90100\n欧叶\t90101\n内码\t90102\n0.25\t90103\n唐山市公安局\t90104\n徐佳莹\t90105\n中国铁人三项运动协会\t90106\n桩长\t90107\n三星S4\t90108\n两用型\t90109\n万行医疗卫生人才网\t90110\n佛罗里达州\t90111\n慕容离\t90112\n敌券\t90113\n莲都\t90114\n_田\t90115\nOutsider\t90116\n上海居委会\t90117\n720s\t90118\nrese\t90119\n腾讯开放平台|众创空间\t90120\n格网\t90121\n灵活就业人员\t90122\n电阻测试仪\t90123\n何时休\t90124\n25K\t90125\n爱儿\t90126\nlarevel\t90127\n10根\t90128\nTEE\t90129\n1.64\t90130\n上海顺丰\t90131\n教师资格\t90132\n银器\t90133\n新线程\t90134\n惊心\t90135\n狼穴\t90136\n巩义搜门户网\t90137\n性转\t90138\n维斯特帕列\t90139\n贱人\t90140\nproxmox\t90141\nSubstance\t90142\n)餐饮有限公司\t90143\npostmessage\t90144\n张雯\t90145\n审理\t90146\n107岁\t90147\n张志俊\t90148\n芳甸路\t90149\n遵义会议会址\t90150\n广州公办幼儿园\t90151\n16码\t90152\n应用技术学院\t90153\n总承包\t90154\n疾苦声\t90155\nparamount\t90156\n3方\t90157\n村上春树\t90158\n驶进\t90159\n光大保德信\t90160\n氟尿嘧啶\t90161\nUbun\t90162\n沃通WoSign\t90163\n第17天\t90164\n丫山\t90165\n51offer\t901
66\n色弱\t90167\nycm\t90168\nV2.2.0\t90169\n锋利\t90170\n高防\t90171\n克东\t90172\nFlood\t90173\n偷听\t90174\nhandclap\t90175\n一掬\t90176\ncarib-1080p\t90177\n萧政\t90178\n真空上料机\t90179\nx=1\t90180\n名亭\t90181\n32800\t90182\n中国银河证券\t90183\n同一性\t90184\nphp+mysql\t90185\n乙肝e抗原\t90186\nQtCreator\t90187\n0593\t90188\n面纱\t90189\nxcom2吧\t90190\n沈东军\t90191\nORCAD\t90192\nSMT\t90193\n脱身\t90194\n3月23日\t90195\nreact-native\t90196\n24周\t90197\n坐月子\t90198\n千星\t90199\n检察\t90200\n肺鳞癌\t90201\nmBlock\t90202\n半周\t90203\n芳香剂\t90204\n金杨新村\t90205\n生产商\t90206\n四川传媒学院\t90207\nmodal框\t90208\nSH\t90209\n夜光粉\t90210\n山西汾酒\t90211\n5000k\t90212\n五月婷\t90213\n解教\t90214\n小逼\t90215\n米庄理财\t90216\n加方\t90217\n湖北机构编制网\t90218\n上海复旦大学附属儿科医院\t90219\n丰田汉兰达\t90220\n劳务分包_\t90221\n2671\t90222\nDOSPY\t90223\nsafri\t90224\n张苗苗\t90225\n陈理\t90226\n暴风看电影\t90227\n四居室\t90228\nHSBC\t90229\nvalder\t90230\n计入\t90231\n仓顶除尘器\t90232\napples\t90233\n雅典卫城\t90234\n八一电影制片厂\t90235\n香港市区\t90236\n连音社\t90237\n嘀嘀\t90238\ncharge2\t90239\n提醒\t90240\n秘功\t90241\n美年健康\t90242\navalon\t90243\n散装水泥罐\t90244\n指南表\t90245\n客软件\t90246\n敞篷版\t90247\n吕伟\t90248\n比佛利\t90249\n桑德尔\t90250\n实在性\t90251\n霍格沃茨\t90252\n沈阳火车站\t90253\n金克斯\t90254\n商洛市\t90255\n白云堡\t90256\n箱上\t90257\n此\t90258\nholes\t90259\n马肯依玛士\t90260\n长官\t90261\npangu\t90262\n天津冶金职业技术学院\t90263\n翅片\t90264\nciia\t90265\n金悦湾\t90266\n急难\t90267\n染色体变异\t90268\noracle数据库服务器\t90269\n贵阳客运东站\t90270\n婚姻保卫战\t90271\n辽宁省实验中学分校\t90272\n人无\t90273\n20150219\t90274\n天芒\t90275\n勃利吧\t90276\n1.2\t90277\n红米饭\t90278\n校医\t90279\n大西门\t90280\n页面置换算法\t90281\n主妇健康_主妇网\t90282\n腐植酸钾\t90283\n仁达\t90284\n总酚\t90285\n火儿\t90286\n哟哈网\t90287\n虾干\t90288\nSelenium2+python\t90289\n川北医学院\t90290\n斗破苍穹小说网\t90291\n逆变电路\t90292\n抗击\t90293\n17级\t90294\n聂枫\t90295\n破灭\t90296\n艳子钩\t90297\n偷怕\t90298\n许玮伦\t90299\n阿古斯之影\t90300\nRS422\t90301\n帮个忙\t90302\n辉哥\t90303\n金硕珍\t90304\n草莓蛋糕\t90305\nASK\t90306\n吗_分析测试百科网\t90307\n尤权\t90308\n高估\t90309\n大扬\t90310\n小二黑结婚\t90311\nAsha\t90312\nPrecious\t90313\n华健\t90314\n新品类\t90315\n建邺区\t90316\nInternationale\t90317\n天与地\t90
318\nMPAcc\t90319\n新机体\t90320\nxz1\t90321\n绫瀬\t90322\n肌层\t90323\n尽头牙\t90324\n九州天空城3D\t90325\n元谋土林\t90326\n西城\t90327\n悟道\t90328\n智惠\t90329\n红米2a\t90330\nWITCH\t90331\n嘉泽\t90332\n长沙市发展和改革委员会\t90333\n99.6\t90334\n侠客行\t90335\n心如刀割\t90336\n省统计局\t90337\n女犬\t90338\n男一女\t90339\n荣耀v8\t90340\nwebkit-transform\t90341\n华润医药\t90342\n打死\t90343\nTelltale\t90344\nmsx\t90345\n一体机\t90346\n埃涅阿斯纪\t90347\n椟\t90348\n档案袋\t90349\n杜仲\t90350\n再见再见\t90351\n易和\t90352\nemoj\t90353\n肠鸣音\t90354\n同济六版\t90355\n看板管理\t90356\n清原\t90357\n过桥贷款\t90358\nvirgo\t90359\n医药英才网\t90360\nP10/P10\t90361\n阴性\t90362\n化料\t90363\n催熟剂\t90364\n康达\t90365\n3980\t90366\n在线分析仪\t90367\n正声\t90368\n信仰\t90369\nCommodity\t90370\n诺基亚7plus\t90371\nquartusII\t90372\nByron\t90373\n马薇\t90374\n新疆油田公司\t90375\nxv6\t90376\n12张\t90377\n水清木华\t90378\nlzw\t90379\n华联控股\t90380\n特展\t90381\n苹果花园\t90382\n移栽机\t90383\n平安普惠i贷\t90384\n七期\t90385\n03333\t90386\n精神\t90387\n名士\t90388\n家圆\t90389\nOutside\t90390\n美熟妇\t90391\n甲状腺检查\t90392\n4114\t90393\nAttr\t90394\n苏黎世大学\t90395\n直流减速电机\t90396\n五老星\t90397\n许佳麟\t90398\n甩不掉\t90399\nxvideos\t90400\n2018-4-13\t90401\n晴明\t90402\na+\t90403\n联讯\t90404\nsimulations\t90405\n历史虚无主义\t90406\n小香风\t90407\n公差分析\t90408\n裸贷女孩\t90409\n北控水务\t90410\n赣南脐橙\t90411\n多溴联苯醚\t90412\n3.04\t90413\nhugh\t90414\n橄榄城\t90415\n盐城火车站\t90416\nTxt小说阅读器\t90417\n俄国\t90418\nlogits\t90419\n帕杰罗劲畅论坛\t90420\n秋酿\t90421\n北京青年路\t90422\n万乡\t90423\n阿洋\t90424\n金维他\t90425\n苹果5S\t90426\nBait\t90427\n方歌\t90428\n赵燕菁\t90429\n夏津县\t90430\n道交法\t90431\nadvanced\t90432\n东莞市光志光电有限公司\t90433\n尼赫鲁\t90434\n四千米\t90435\n四剑\t90436\n憣\t90437\n认知\t90438\n谢尔盖\t90439\n申洲\t90440\n铜钟\t90441\n马土\t90442\n信息类\t90443\n起來\t90444\n容值\t90445\n轴径\t90446\n受牵连\t90447\n老哈\t90448\n广州白天鹅宾馆\t90449\ntruly\t90450\npscc2018\t90451\n加拿\t90452\nz7m\t90453\n炉石传说_小皮手游网\t90454\n图片集\t90455\n云意电气\t90456\n焊烟净化器\t90457\n3212\t90458\n查號\t90459\n报关委托书\t90460\n酸角\t90461\n流量费\t90462\n扬中\t90463\n新闻路\t90464\nrest\t90465\naecs4\t90466\nxoxo\t90467\n千里香馄饨\t90468\n36张\t90469\n周九良\t90470\n湘潭市\t90471\n清明节\t90472\npo
siton\t90473\n马文\t90474\n安徽分公司\t90475\nSAE\t90476\n小佳\t90477\n邹廷威\t90478\n莆田人才网\t90479\nAutonomous\t90480\n国际文化节\t90481\n清洁卫生\t90482\n浸没式\t90483\n肾窦\t90484\n吗丁啉\t90485\n陕西省总工会\t90486\n很聪明\t90487\n中客\t90488\n$30\t90489\n第112集\t90490\nenum\t90491\n天龙八部3d\t90492\n五成\t90493\n醉翁亭\t90494\n平治\t90495\n护墙\t90496\ntreap\t90497\n濑铃\t90498\n病有所医\t90499\n李明启\t90500\n世纪嘉园\t90501\n超级马里奥银河\t90502\n越秀区\t90503\n5E\t90504\nidm\t90505\nproje\t90506\nV5.0.1\t90507\n沪杭公路\t90508\n体育运动会\t90509\nunifi\t90510\n中电港\t90511\n深圳航空\t90512\n德勤中国\t90513\n车辆识别号\t90514\n公孙胜\t90515\nwondering\t90516\n蚕种\t90517\n立党为公\t90518\n冯磊\t90519\n跳单\t90520\n紫菜头\t90521\n粘网\t90522\n金华市\t90523\n3355\t90524\n知识讲座\t90525\n猛增\t90526\n重庆信息技术职业学院\t90527\n绅\t90528\n香奈儿山茶花\t90529\n卡尔曼滤波算法\t90530\n秒杀器\t90531\n名腿网\t90532\nbabe\t90533\n杭州装修公司\t90534\n禁绝\t90535\nAmerican\t90536\n板式塔\t90537\n用友软件\t90538\n穿越机\t90539\n扫北\t90540\ndarpa\t90541\n北京奔驰汽车有限公司\t90542\n无二不作\t90543\n大理日报社\t90544\n枪弹柜\t90545\n压膜机\t90546\nV5\t90547\nEHOME\t90548\n小米手环1\t90549\n乱跑\t90550\n王小五\t90551\n青秀\t90552\n哈尔滨剑桥学院\t90553\neb病毒\t90554\n华伦\t90555\n月季\t90556\n文件名\t90557\n事业心\t90558\nsymfony2\t90559\n甬城\t90560\n襄垣\t90561\nFram\t90562\n微丸\t90563\n连通性\t90564\n单相桥式整流电路\t90565\n心搏\t90566\n5260\t90567\n四年内\t90568\n仓库管理系统\t90569\n玛欣德\t90570\n蜜柚\t90571\n嘎\t90572\ncocoapod\t90573\n桥隧\t90574\n湖南安全技术职业学院\t90575\n宝马会\t90576\n长清大学城\t90577\n路灯\t90578\n河滩\t90579\ndebt\t90580\n房地产代理公司\t90581\n陈丽华\t90582\n祖尔\t90583\nPVsyst\t90584\n四棱锥\t90585\n公共资源交易中心交易信息网\t90586\n咏史诗\t90587\np4p\t90588\n戈壁滩\t90589\n北京东方\t90590\n水到渠成\t90591\n马戏\t90592\ngelato\t90593\nminiature\t90594\n否定词\t90595\n艾肯套\t90596\n妖魔鬼怪\t90597\n滴蜡\t90598\n曾梵志\t90599\n爱义行\t90600\nChen\t90601\n醒肤\t90602\n人工智能学院\t90603\n大促\t90604\n中华文明史\t90605\n数重\t90606\n孤儿怨\t90607\n拷线\t90608\nunreal4\t90609\n仙剑4\t90610\n众划算\t90611\n4636\t90612\n科技发展有限公司\t90613\nCircuits\t90614\n马拉喀什\t90615\nPUT\t90616\ns700\t90617\nCAD2011\t90618\n西北旺镇\t90619\n著作权法\t90620\nLNMPA\t90621\n迈巴赫S600\t90622\n单双号限行\t90623\n大钞\t90624\n昏昏沉沉\t90625\n拳脚\t906
26\n怪物猎人OL\t90627\n换挡杆\t90628\n石家庄职业技术学院\t90629\n病毒携带者\t90630\n劉\t90631\nprobing\t90632\n文华财经期货\t90633\n存照\t90634\n杀螨剂\t90635\n冴岛\t90636\n具结\t90637\nCOVERMARK\t90638\n20180323\t90639\nsporting\t90640\n200千克\t90641\n歌华宽带\t90642\n党群\t90643\n山东人事考试信息网\t90644\n孵化率\t90645\n机化\t90646\n崇明\t90647\n颜良文\t90648\n玩完\t90649\n刘荷娜\t90650\n十八线\t90651\n象山政府网\t90652\n途观l\t90653\n飘逸\t90654\n涞\t90655\nMACbook\t90656\nREMOTE\t90657\n丹阳路\t90658\ninhibition\t90659\n丁香郡\t90660\n第97集\t90661\n鬼门\t90662\n小島\t90663\n1L\t90664\n古式\t90665\n瓜地\t90666\n倒计\t90667\n深拷贝和浅\t90668\n上海绿地集团\t90669\n互换\t90670\n立超\t90671\n鲁工\t90672\n好屋\t90673\n【壹\t90674\n稗草\t90675\n那刻\t90676\n75m\t90677\nGenerators\t90678\n绿油油\t90679\nDFU\t90680\n安切洛蒂\t90681\n2018.04.20\t90682\n构架\t90683\n143分钟\t90684\n米粥\t90685\nHungry\t90686\neasypr\t90687\n包络线\t90688\n沈文裕\t90689\n玩偶之家\t90690\nJLPT\t90691\n菁纯臻颜\t90692\n中共中央政治局\t90693\n丰饶\t90694\n维生素d3\t90695\nEdius\t90696\n渺小\t90697\n班额\t90698\n两点钟\t90699\n曾文正公\t90700\n林芝桃花节\t90701\nmastercam2017\t90702\n勇者斗恶龙怪兽篇joker3\t90703\n感覺\t90704\nEDG\t90705\n粉煤灰砖\t90706\n秦霜\t90707\n赵家庄\t90708\n育路网\t90709\n罗切斯特大学\t90710\n车场\t90711\n沈河区政府\t90712\n循环表\t90713\n慢性淋巴细胞白血病\t90714\n死不动\t90715\n法尊\t90716\n味仙\t90717\nSubmit\t90718\n至顶\t90719\n许豪杰\t90720\n干炒牛河\t90721\n鼋\t90722\n万家元\t90723\n石泉路\t90724\n2016年3月18日\t90725\n曲连杰\t90726\n精要版\t90727\n司局\t90728\n内环路\t90729\nAce\t90730\nkore\t90731\n哪个子\t90732\n竹头\t90733\n重柜\t90734\n扬子江大道\t90735\n移动198号段\t90736\n山东科技\t90737\n第218集\t90738\n玛多\t90739\n狗女\t90740\n北邮人论坛\t90741\n铭速\t90742\n不四\t90743\n奸商\t90744\n1111\t90745\n根本性\t90746\nSTUDIOS\t90747\n蒙脱石散\t90748\n氽\t90749\nSearching\t90750\n增值税\t90751\n打单\t90752\n格里\t90753\n御林军\t90754\n工程管理\t90755\n华中农业大学\t90756\n5k\t90757\n双栈\t90758\n化石\t90759\n钢丝节\t90760\n水味\t90761\n工业和信息化部办公厅\t90762\n杭州大学生创业\t90763\n设计案\t90764\n黑庄户乡\t90765\nxincha\t90766\n春儿\t90767\n幻形\t90768\n微行\t90769\n令嬢\t90770\n夺命凶弹\t90771\nPrologue\t90772\ntelnetlib\t90773\n七公主\t90774\n央子\t90775\n铁行\t90776\n不顾一切\t90777\n回魂夜\t90778\n静雅\t90779\n旅悦\t90780\n京东票务网\t9
0781\n土豪村\t90782\n节间\t90783\nrgw\t90784\noled屏幕\t90785\n奖杯\t90786\nAMP\t90787\nimportlib\t90788\n汽车票\t90789\n黄金哥布林\t90790\n学讲\t90791\n校长们\t90792\n望一望\t90793\nakt\t90794\n跋涉\t90795\nyesebang\t90796\n射波刀\t90797\n卡瘦棒\t90798\n鱼器\t90799\n青蔓烟阁\t90800\n欣杨kitty\t90801\nvmwaretools\t90802\n死命\t90803\n200级\t90804\n切斯特\t90805\n苦瓜网\t90806\nrossi\t90807\n御品轩\t90808\n盈科律师事务所\t90809\n172.17.0.0\t90810\n北京理工大学出版社\t90811\n团体舞\t90812\n许静\t90813\nWingy\t90814\n建行企业银行\t90815\n郭小宝\t90816\n中华人民共和国妇女权益保障法\t90817\n新方位\t90818\n巩县\t90819\nAE86\t90820\n王棠云\t90821\n三国影院\t90822\nunqualified\t90823\n油车港镇\t90824\n绿鲤鱼\t90825\n金唱盘\t90826\nbye\t90827\n一校\t90828\n恒易融\t90829\n洗衣房\t90830\n无限卡\t90831\nGe\t90832\n欧宝丽\t90833\n矿用\t90834\n火焰纹章外传\t90835\nComodo\t90836\n汉明窗\t90837\n云米\t90838\nKeez\t90839\nminimap\t90840\n板桥\t90841\n四川大学锦江学院\t90842\n豪园\t90843\n科乐\t90844\n20170724\t90845\nlov\t90846\n有志之士\t90847\n破伤风针\t90848\n显瘦\t90849\n套装石\t90850\n弥彦\t90851\n名女\t90852\n选调生网\t90853\nwindows_server\t90854\n卡碧尼\t90855\n办好事\t90856\n动漫之家漫画网\t90857\nj6\t90858\n尚格云顿\t90859\n调取\t90860\n1.4倍\t90861\n济南人才网\t90862\n13.6\t90863\n炼狱\t90864\nDocx\t90865\n2进制\t90866\n云南招标网\t90867\n老枝\t90868\n签订合同\t90869\n纱仓真奈\t90870\n扬州东站\t90871\n骤增\t90872\n竞发\t90873\n捧杯\t90874\n长春信息港\t90875\n质检部\t90876\n罗伯\t90877\n192.168.0.200\t90878\n120支\t90879\n全岛\t90880\n金刚杵\t90881\n森然\t90882\n幼文\t90883\n光学防抖\t90884\n都市_温商网\t90885\n目送\t90886\n职业化\t90887\n转摘\t90888\n奔驰glk300\t90889\neasyrecovery\t90890\n万山区\t90891\n啃老族\t90892\nPageRank\t90893\n宋之韵\t90894\n奉节生活网\t90895\n图表库\t90896\n黑眼珠\t90897\n改密\t90898\n迪凯瑞\t90899\n宁波南部商务区\t90900\n职业规划\t90901\n15路\t90902\n温州建国医院\t90903\n制动机\t90904\n唐纳德特朗普\t90905\n主承销商\t90906\nMaxScript\t90907\n丝接\t90908\n天则\t90909\n南洋股份\t90910\nMySQL索引\t90911\n中信保诚\t90912\n怎嘛\t90913\n肝经\t90914\npos机\t90915\nsprite\t90916\n一抢而空\t90917\n玉屏侗族自治县政府\t90918\n新春佳节\t90919\n加紧\t90920\n王石关羽\t90921\nKingchan\t90922\niPhone8p\t90923\n国易堂\t90924\n雷蛇地狱狂蛇\t90925\n北京苏宁\t90926\n三相电动机\t90927\n炙手可热\t90928\n详图\t90929\n孕照\t90930\n豆粒\t90931\ny66\t90932\n坟场\t
90933\n为父\t90934\nwaterloo\t90935\n婷\t90936\n淮安区\t90937\n光网\t90938\n东营区\t90939\n核糖体\t90940\n谢晶思\t90941\n健身吧\t90942\n誉王\t90943\n光圈\t90944\n库名\t90945\n52赫兹\t90946\n双柏县人民政府\t90947\n五篇\t90948\n0105315\t90949\nRenew\t90950\n带制\t90951\n溢彩\t90952\n90068吧\t90953\niphone游戏吧\t90954\nfungi\t90955\n勾手\t90956\n拜礼\t90957\nTranslations\t90958\n凯撒网\t90959\n雨蓬\t90960\n842n\t90961\ncfb\t90962\nter\t90963\n评分\t90964\n信阳市中心医院\t90965\nPHOTOSHOP\t90966\nGuard\t90967\n圣泉集团\t90968\n滨海\t90969\n47_\t90970\n易科\t90971\n12.04下\t90972\n立华\t90973\n革兰氏阳性菌\t90974\nPHPOA\t90975\nUSER\t90976\n3[\t90977\n无穷尽\t90978\n无心插柳柳成荫\t90979\nrencai\t90980\n哈弗h5\t90981\n小米意外险\t90982\novation\t90983\nsemilogy\t90984\n丝带玫瑰花\t90985\n特种小兵传奇\t90986\n荆州市房产管理局\t90987\n885\t90988\n乌台诗案\t90989\n军力\t90990\nxianka\t90991\n荣耀手机吧\t90992\n悦读\t90993\n清闲\t90994\nsurgeon\t90995\n再做\t90996\nn卡\t90997\n保罗\t90998\n一代伟人\t90999\n巡天\t91000\n小天使们\t91001\n中山东升镇\t91002\n杨桦\t91003\n咲乐安娜\t91004\nIGI\t91005\n阿古\t91006\n九鲤湖\t91007\n择居\t91008\nIntroduce\t91009\n敬重\t91010\n爱奇艺缓存\t91011\n歪歪\t91012\n青商\t91013\n吉利gs\t91014\n熏炉\t91015\ngofun\t91016\n开闭\t91017\n招投标公司\t91018\ntage\t91019\nBTDigg\t91020\nlyy\t91021\n老赵\t91022\n四问\t91023\n24000\t91024\n羊羊\t91025\n生菜\t91026\n茅台镇\t91027\n厦门园博苑\t91028\n张妃\t91029\n成都男科医院\t91030\n长焦相机\t91031\n秉烛\t91032\n大仙\t91033\n椒陵\t91034\n自我体罚\t91035\nForecasting\t91036\nfesta\t91037\n成堆\t91038\nBNP\t91039\n土超女排\t91040\n濮水\t91041\n手机触屏版\t91042\n斜着\t91043\n索贝\t91044\n钟会\t91045\n九幽天帝\t91046\n唐宋八\t91047\n六七年\t91048\nDAG\t91049\nA321\t91050\n总导演\t91051\n嗜血法医\t91052\n能量守恒定律\t91053\nnos\t91054\n思堂\t91055\n撤走\t91056\n美乐家\t91057\n广州儿童医院\t91058\n假面骑士龙骑\t91059\n荧光\t91060\nWebRoot\t91061\n使用面积\t91062\n86722理财网\t91063\n192.168.1.2\t91064\nSurrogate\t91065\n经济参考\t91066\n白沟\t91067\n老干部局\t91068\n张显宗\t91069\n页HTML\t91070\n1300多\t91071\nie7\t91072\n张艳华\t91073\n左游游戏厅\t91074\n凤庆\t91075\n遮罩层\t91076\n太阳和月亮\t91077\n半吊子\t91078\n张海军\t91079\n0571-12345\t91080\n杰佣\t91081\n50后\t91082\n父亲节\t91083\n青海省审计厅\t91084\nWhip\t91085\n没礼貌\t91086\ntaxes\t910
87\n景驰科技\t91088\nM4求助】模拟人生4\t91089\n永年区\t91090\n江苏农村商业银行\t91091\n猎人荒野的呼唤吧\t91092\n法民\t91093\n谢正义\t91094\n感染者\t91095\n自杀者\t91096\nhibernate3\t91097\nrequireJS\t91098\n微硅粉\t91099\neliminate\t91100\n精巢\t91101\n山段\t91102\n抄绘\t91103\n灵剑山\t91104\nRevision\t91105\n北京条约\t91106\n海润光伏\t91107\n戎马\t91108\n抵沪\t91109\nsawyer\t91110\n重见\t91111\n500000\t91112\n排水处\t91113\nDASH\t91114\n水资源\t91115\n小暮花恋\t91116\nhospital\t91117\n2016年10月份\t91118\nmuchong\t91119\nWrestleMania\t91120\n金泰大厦\t91121\n沙沙沙\t91122\nzqsg\t91123\n玉蝉\t91124\n宫廷计\t91125\n2000度\t91126\n鲍洪升\t91127\n三一重卡\t91128\n灵歌\t91129\nf485\t91130\nupwork\t91131\n2011届\t91132\n凛子\t91133\nderivation\t91134\n拓道\t91135\n尊崇\t91136\n赵俊峰\t91137\n哈尔滨市道里区政府\t91138\n件件\t91139\n光布\t91140\n李喵喵\t91141\n抱杆\t91142\n一整套\t91143\n分层\t91144\n日本亚马逊\t91145\n尹建莉\t91146\n分业\t91147\n兴发集团\t91148\nEsri\t91149\n山西一区\t91150\n恐龙危机2\t91151\n大单品\t91152\n小寨赛格\t91153\n3吧\t91154\n小儿七星茶\t91155\n废止\t91156\n星艺\t91157\naruba\t91158\nBos\t91159\n酷贝拉\t91160\nV卡\t91161\nHyperX\t91162\n千禧果\t91163\n山东大学药学院\t91164\n清唱\t91165\n湖北工业职业技术学院\t91166\n滑杆\t91167\n苯丙\t91168\n柯尼塞格\t91169\nDearzh\t91170\n麻姑\t91171\n股室\t91172\n王八拳\t91173\n为人师\t91174\n潼南网\t91175\n同士\t91176\n余春娇\t91177\n辞别\t91178\n明哲\t91179\n固有\t91180\n福建江夏学院\t91181\n黑暗血时代\t91182\nrenshi\t91183\nthumb\t91184\ncetnos\t91185\n马岭镇\t91186\n满头大汗\t91187\n思进\t91188\n河南银行\t91189\n土妞\t91190\n边门\t91191\n财务管理制度\t91192\n中帮\t91193\n通用性\t91194\nEqual\t91195\n复合型\t91196\n嫔妃\t91197\n2015年12月份\t91198\n+DS160\t91199\n織田non\t91200\n董希文\t91201\nmagnetic\t91202\n渭南市人民政府\t91203\n王水\t91204\nppo\t91205\n350km\t91206\n佛山物流公司\t91207\nNDB\t91208\n电子羊\t91209\n格灵深瞳\t91210\nd3s\t91211\n计提\t91212\n爱与性\t91213\n毛呢\t91214\n世纪电源网\t91215\n蒟蒻\t91216\nDefender\t91217\n金瓶梅2\t91218\n神经瘤\t91219\n591结婚网\t91220\n内蒙古检察院\t91221\nPOTPLAYER\t91222\n修武县\t91223\n追悔莫及\t91224\n事业编制考试\t91225\n菌菇\t91226\n夜泊\t91227\n龙之契约\t91228\n南京经济技术开发区\t91229\n类包\t91230\nScapy\t91231\nGaN\t91232\n安徽省供销社\t91233\n6.5.2\t91234\n鬼灯的冷彻第二季\t91235\n夏沫\t91236\n六祖慧能\t91237\nSUGA\t91238\nph\t91
239\n耳膜\t91240\n宏达电子\t91241\n70kg\t91242\n休药期\t91243\n山西现代双语学校\t91244\nWinServer2008\t91245\n超宝\t91246\np51\t91247\n11章\t91248\n申银万国\t91249\n秤砣\t91250\n中毅达\t91251\n保安服务许可证\t91252\n科学研究\t91253\n京津地区\t91254\n螺口\t91255\n2.5年\t91256\nCos\t91257\n展示品\t91258\n退出江湖\t91259\n条件者\t91260\n红娘子\t91261\n营改增小规模纳税人\t91262\n波珠\t91263\n西尔莎·罗南\t91264\n死亡岛吧\t91265\n气象员\t91266\n于辉\t91267\n新天堂2\t91268\n十堰市政府\t91269\n长方集团\t91270\n中粮营养健康研究院\t91271\n首地\t91272\n油瓶\t91273\n知者\t91274\nWTG\t91275\n创保网\t91276\n混交林\t91277\ntsubasa\t91278\nInstreet\t91279\n4.81\t91280\n发起人\t91281\n爱问家电网\t91282\n就业者\t91283\n一厨\t91284\n34次\t91285\ngarrix\t91286\n伊利蒙牛\t91287\n耀莱集团\t91288\n七番\t91289\n带状疱疹病毒\t91290\nSunset\t91291\n私募股权投资基金公司\t91292\n乍杭\t91293\nDNA甲基化\t91294\n双拥路\t91295\n莲叶青青\t91296\nVerb\t91297\n优劣点\t91298\n假领\t91299\n红檀\t91300\n022-95599\t91301\n邮编商务网\t91302\n鸭面\t91303\n非凡搭档\t91304\n旅游日\t91305\nfdt\t91306\n钢刃\t91307\n针织\t91308\n日历日\t91309\n70P\t91310\n了不起的麦瑟尔女士\t91311\n白尾\t91312\n防务展\t91313\n莫青\t91314\ntextile\t91315\n中建海峡建设发展有限公司\t91316\n北落师门\t91317\n王达\t91318\n郭青\t91319\n锡山区\t91320\npickup\t91321\nzzg\t91322\n个人版\t91323\n自助餐台\t91324\n补给箱\t91325\n51节\t91326\n万隆会议\t91327\n等离子净化器\t91328\n熊雄\t91329\n金融科技公司\t91330\n电动推杆\t91331\n宝泉风景区\t91332\n缩容\t91333\n银杏大道\t91334\nsessionStorage\t91335\n胜通\t91336\nFearless\t91337\n000000\t91338\n何青青\t91339\n六福\t91340\n省安监局\t91341\n研究报告\t91342\n导入语\t91343\n郏县\t91344\nnomail\t91345\n肝病频道_健客网\t91346\ngibbs\t91347\n暗黑三\t91348\n江苏省机关事务管理局\t91349\nDominic\t91350\nfourier\t91351\n朝圣路\t91352\n步客\t91353\nCommentary\t91354\n趣推\t91355\nmto\t91356\n雷系\t91357\n鬼装\t91358\n滨州医学院附属医院\t91359\n91porn\t91360\n工程管理专业\t91361\n第231集\t91362\n200p\t91363\n11日\t91364\n外事\t91365\n福州市委\t91366\n五大囧\t91367\n新寻\t91368\nSupergirl\t91369\n橄榄色\t91370\n2万平方米\t91371\n下沙经济技术开发区\t91372\n醉仙\t91373\nAM2\t91374\nEastern\t91375\n北京市第二中级人民法院\t91376\n攻心\t91377\n消费者权益日\t91378\n5.0%\t91379\nangualr\t91380\n诚毅学院\t91381\n好伴侣\t91382\n最后一关\t91383\nMbps\t91384\n鞋业\t91385\n宫古岛\t91386\n天然气\t91387\n虚幻3\t91388\n沼气池\t91389\
nPROFILE\t91390\n盛阳\t91391\n张海龙\t91392\n电椅\t91393\n盛虹石化\t91394\n邯山\t91395\nsqm\t91396\n臭臭\t91397\nspin\t91398\n平均量\t91399\n河东新区\t91400\n2016年03月\t91401\n北京第一实验小学\t91402\n卡路里计算器\t91403\n4850\t91404\n凶兵\t91405\ndiluted\t91406\n相山\t91407\n球冠\t91408\n饮食男女\t91409\nuuu9\t91410\n高凡\t91411\nr1\t91412\n车组\t91413\n值友\t91414\n技德\t91415\n宽居\t91416\n宗法\t91417\n正则\t91418\nweishen\t91419\ninde\t91420\n13秒\t91421\n阿根廷人\t91422\n财迷\t91423\n标签式\t91424\n胰管\t91425\nBETA版\t91426\n永雄\t91427\nnorms\t91428\n20171017\t91429\n波奇宠物网\t91430\n7200元\t91431\nMate\t91432\n托福独立口语\t91433\n107间\t91434\n临平\t91435\n石家庄火车站\t91436\n成伤\t91437\n排骨架\t91438\n零售业\t91439\n冯小韦德\t91440\n监理费\t91441\n布司\t91442\n西湖音乐节\t91443\ni18\t91444\n左邻右里\t91445\n土机\t91446\n傲蕾\t91447\n红字增值税发票\t91448\n墨竹工卡县\t91449\n工料\t91450\n糖耐量试验\t91451\n金嘉\t91452\nFramework7\t91453\n沈浪\t91454\n枪战英雄\t91455\n11月\t91456\n小行家\t91457\n色空阁\t91458\n九件套\t91459\n上佳\t91460\n65度\t91461\n艾尔战记\t91462\n手擀面\t91463\n汽车抵押贷款\t91464\ngout\t91465\n门禁读卡器\t91466\n火焰纹章觉醒中文网\t91467\n20170901\t91468\nwoods\t91469\n九分之一\t91470\nTa\t91471\nx12\t91472\n永远不会\t91473\n计生协会\t91474\n兵者诡道\t91475\n李野默\t91476\n照进\t91477\n以身许\t91478\n斗酒\t91479\n财讯\t91480\n浦东模范中学\t91481\nPol\t91482\n沈杜公路\t91483\n37米\t91484\n书套\t91485\nP90\t91486\nadditions\t91487\n水磨古镇\t91488\nmistress\t91489\n65吋\t91490\n模糊函数\t91491\n真值\t91492\n安禄山\t91493\n宝马五系\t91494\n51分钟\t91495\n基本\t91496\n澳大利\t91497\n二氯苯\t91498\n第53章\t91499\n四级片\t91500\n既得利益者\t91501\n王默\t91502\n金地中心\t91503\n117岁\t91504\ntecplot\t91505\n成为\t91506\n涅\t91507\n幻想画\t91508\n九江社区\t91509\n马栏广场\t91510\n摆式\t91511\nFaiss\t91512\nfat32格式\t91513\nTattoo\t91514\n姓甚名\t91515\n黑条\t91516\n重口\t91517\nstash\t91518\n河南医学高等专科学校\t91519\n5000万吨\t91520\n看完后\t91521\n搜狗团购导航\t91522\n索沛CS论坛\t91523\n微贷网\t91524\n夏宝龙\t91525\n数据库类\t91526\ncaspase\t91527\n黑龙江卫视\t91528\n华为pay\t91529\n不同地区\t91530\n功名尘\t91531\n说上\t91532\n新课标卷\t91533\n龚勋\t91534\n深圳公证处\t91535\nsodu\t91536\ncloth\t91537\n5到10年\t91538\n化脓\t91539\n亲卫\t91540\nregist\t91541\n司马\t91542\n龙军\t91543\n正成\t91544\n双机热备\t91545
\n棋友\t91546\nke\t91547\n余姚四中\t91548\n高效果\t91549\n中乌\t91550\n中国电力建设企业协会\t91551\n刑事处罚\t91552\n头怪\t91553\n咬定\t91554\n温台\t91555\n辨位\t91556\n滤光片\t91557\n点云\t91558\n凌空soho\t91559\n无保\t91560\nNd\t91561\n宠物大乱斗\t91562\n平乐古镇\t91563\n三一重工\t91564\n方山\t91565\nlartern\t91566\n蕨类植物\t91567\n结晶性\t91568\nHPB\t91569\n2011年度\t91570\n车仆\t91571\n中国科大\t91572\n六七千\t91573\ntacos\t91574\n398号\t91575\n无孔不入\t91576\n绘师\t91577\n1070\t91578\n石家庄车管所\t91579\n陆家嘴软件园\t91580\n江姐\t91581\n月下美人来#\t91582\n妮可徐新\t91583\n深圳市龙岗区\t91584\n士官\t91585\n可动\t91586\n隆林\t91587\ncoc吧\t91588\npantone\t91589\n惠顾\t91590\n保定政府网\t91591\nFucks\t91592\n魏江\t91593\n耽生\t91594\n雷磊\t91595\n我的名字\t91596\n加设\t91597\n武汉工商银行\t91598\n网易闪电邮\t91599\n龙村\t91600\n包用\t91601\n望舒\t91602\nONLINE\t91603\n领馆\t91604\n非法本\t91605\n洗衣台\t91606\n市川\t91607\n暮光之城4\t91608\n第2版\t91609\ne代\t91610\n贵州省国家税务局\t91611\n伊布\t91612\n表哥\t91613\n建安区\t91614\nreflog\t91615\n王鸿云\t91616\n东印度公司\t91617\n秘仪\t91618\n舞蹈世界\t91619\nfinite\t91620\niku\t91621\n沃尔沃XC60论\t91622\n千雪\t91623\nHirsi\t91624\nUPS电源\t91625\n胡集镇\t91626\n兴龙\t91627\n北京理工大学法学院\t91628\n洽会\t91629\nHeavenly\t91630\nbullet\t91631\n上海证券报\t91632\n小炒\t91633\n刘奕\t91634\n反反\t91635\nBalancer\t91636\n2002年\t91637\n德福\t91638\n拓展坞\t91639\n金山词霸_句库\t91640\nentirely\t91641\n家用跑步机\t91642\n6735\t91643\n家眷\t91644\n阿兹猫\t91645\n家丁\t91646\n百听\t91647\nqnap\t91648\nCA认证\t91649\n上海市法人一证通\t91650\n岩性\t91651\n新老交替\t91652\n闻风而动\t91653\n秘谱\t91654\n派生词\t91655\nArcObjects\t91656\n0038\t91657\n千乃\t91658\ne4\t91659\n沙祖康\t91660\n水笔\t91661\n扶余县\t91662\n2006年1月\t91663\nEtude\t91664\n百菌清\t91665\n经天路\t91666\n上海迪士尼小镇\t91667\nmuma\t91668\ngreatly\t91669\n6排\t91670\nXS\t91671\n扑\t91672\nbackspace键\t91673\n诛仙逍遥涧\t91674\n大梅沙海滨公园\t91675\n有约\t91676\nwin10ie浏览器\t91677\n哭丧\t91678\n小梅花\t91679\n0874\t91680\n武大华科\t91681\n林某\t91682\n0014\t91683\n粘附\t91684\n封口膜\t91685\n林黛玉\t91686\nxmd\t91687\n救救\t91688\n呷\t91689\n刘巧儿\t91690\n七中育才\t91691\nlease\t91692\n不干胶打印机\t91693\n奥瑞金\t91694\n编码器\t91695\n车管站\t91696\n敲钟\t91697\n物流管理专业\t91698\n高尔沃尔沃s60l\t91699\n悪魔\t91700\n流式中文网\
t91701\n插值法\t91702\n六朝博物馆\t91703\n20140415\t91704\n阳炎\t91705\nAdditions\t91706\n五十二岁\t91707\n录像机\t91708\nTsubasa\t91709\n0.75平方\t91710\n步科\t91711\n缓苗\t91712\n擎羊\t91713\n宪法性\t91714\n集装袋\t91715\nSLA\t91716\n神州龙芯\t91717\n曲梁\t91718\n编译原理\t91719\n小米手机5X_MIUI论坛\t91720\n香江路\t91721\n饶阳县\t91722\n招唤\t91723\n信女\t91724\nODI\t91725\n杰夫\t91726\nUIAlertView\t91727\n货权\t91728\n凌水\t91729\n不要命\t91730\n前湾保税港区\t91731\n股东们\t91732\n新时代\t91733\n4C2014款\t91734\ngraft\t91735\n光电信息科学与工程\t91736\n143号\t91737\n海伦国际\t91738\n协议离婚\t91739\n开通\t91740\n国网新源控股有限公司\t91741\n文登\t91742\n焦头\t91743\nuclinux\t91744\n中国太平洋财产保险股份有限公司\t91745\n世界外国语中学\t91746\n堡\t91747\n寿比山\t91748\n属国\t91749\nstayreal\t91750\n北京美食_酒店\t91751\ncaffeine\t91752\n常用色\t91753\n饱嗝\t91754\n幽冥域\t91755\n浙贝\t91756\n海军基地\t91757\n干栏式\t91758\n单式\t91759\n销售心理学\t91760\n按钮盒\t91761\n5月6日\t91762\n极米\t91763\n裕华路\t91764\n普通合伙人\t91765\n稻子\t91766\n风行天下一万号\t91767\n宋亚轩\t91768\n通用语\t91769\n一委\t91770\n求知\t91771\n长安微车\t91772\n连锁咖啡公司\t91773\nscrollerview\t91774\n回位\t91775\n圣莲山\t91776\n吴茱萸汤\t91777\n佛家\t91778\n底段\t91779\n蓝莓味\t91780\n2018.1.4\t91781\n吉布鲁\t91782\n巩乃斯\t91783\n舒筋活血片\t91784\n正正\t91785\n理肤泉\t91786\n开塞露\t91787\n唐三彩\t91788\n锡纸花甲粉\t91789\n普法战争\t91790\n西红\t91791\n迦陵\t91792\n豆瓣小组\t91793\n节能环保\t91794\n48v12ah\t91795\n大黄鸭\t91796\n10注\t91797\nserver版\t91798\nwkhtml\t91799\nl4d2\t91800\n无敌浩克\t91801\n平妖\t91802\n天天影院\t91803\n苏恩惠\t91804\nquickconnect\t91805\n杜集\t91806\n尼美舒利\t91807\n千百次\t91808\n大夫山森林公园\t91809\n教育科技有限公司\t91810\n辣木\t91811\n免箱期\t91812\n微淘广播\t91813\n南片\t91814\n电脑画\t91815\n骚鸡\t91816\n习家池\t91817\nCCTV-10\t91818\n快意\t91819\n汕头市公安局\t91820\n苏河\t91821\n小新\t91822\ni5-6500\t91823\n上阵父子兵\t91824\nnano\t91825\n逆火\t91826\n责难\t91827\n鄞州法院网\t91828\n湖头\t91829\n麦玲玲\t91830\n北海道\t91831\n左曹\t91832\n怒江\t91833\nntdll\t91834\n函件\t91835\n木聚糖酶\t91836\n工人村\t91837\nproperty-placeholder\t91838\nTrophy\t91839\n反补\t91840\n赣南师范大学\t91841\nDisney\t91842\n1931\t91843\n京山县\t91844\n纳木\t91845\n协调\t91846\n剪片\t91847\n档+DLC+\t91848\n供水管\t91849\nEventArgs\t91850\n七年级语文下册\t91851\ncd4\t91852
\n橙瓜\t91853\n化学名\t91854\nlooked\t91855\n空之轨迹FC\t91856\n血乳酸\t91857\nFilipina\t91858\nFreeSWITCH\t91859\n巩峥\t91860\n彭山区政府\t91861\n大显\t91862\n吴晗\t91863\n腊八蒜\t91864\n创办者\t91865\n708\t91866\n9800多万户\t91867\n拖垮\t91868\ncassandra\t91869\n烛龙\t91870\n无锡市中医医院\t91871\n老君越\t91872\n棒球大联盟\t91873\n原图\t91874\n泰丰\t91875\n煤矿区队\t91876\n即热式电热水器\t91877\n金雁\t91878\nban机\t91879\n115年\t91880\n电源变压器\t91881\n威胜集团\t91882\n租机\t91883\n鹿茸\t91884\n美兰区\t91885\n波尼\t91886\n天涯宝马5系\t91887\nlymphoma\t91888\n剂型\t91889\n1137\t91890\nQinZhou8.com\t91891\n小羊\t91892\ncobol\t91893\n大杂\t91894\n非零向量\t91895\n棠溪火车站\t91896\nBLOG\t91897\n黎川县\t91898\n挡板\t91899\n黄袍\t91900\n肾\t91901\n点验\t91902\n600公斤\t91903\n傅抱石\t91904\n一线性\t91905\n栖霞苹果\t91906\n▔\t91907\n谢晓峰\t91908\nGB150\t91909\n太极门\t91910\nFILORGA\t91911\nDAIWA\t91912\n钻探机\t91913\n吊旗\t91914\n咽炎片\t91915\nBead\t91916\n寒天\t91917\n闻喜县\t91918\n颜料盒\t91919\n支离\t91920\n矿物学\t91921\n叠彩\t91922\n51so\t91923\n名爵MG6\t91924\n谈话类\t91925\n小波\t91926\n学工部\t91927\n虎皮尖椒\t91928\nSNA\t91929\n四川省投资促进局\t91930\nmox\t91931\n理工科\t91932\n深圳土地公\t91933\n电子竞价\t91934\n王者荣耀巅峰赛\t91935\n驾考吧\t91936\n华海药业\t91937\nIPAD\t91938\n海口三亚\t91939\nxe\t91940\n磨片\t91941\nH110M\t91942\ntdr\t91943\n邵阳职业技术学院\t91944\n20180217\t91945\n东华能源\t91946\n刘项\t91947\nmcs\t91948\n1:15\t91949\n汗斑\t91950\n复方氨基酸注射液\t91951\n超级绿宝石\t91952\nPersonalized\t91953\n证券从业资格考试\t91954\ndfc\t91955\njax-ws\t91956\n明制\t91957\nSizes\t91958\n另一\t91959\n捞尸\t91960\n建华实验学校\t91961\n不安\t91962\n淳淳\t91963\nhdlc\t91964\n在意\t91965\n改迁\t91966\n新仙剑奇侠传\t91967\n目标受众\t91968\n陈光荣\t91969\n0857\t91970\n双历\t91971\n广州东站汽车客运站\t91972\n旧帖\t91973\n听力\t91974\n流通盘\t91975\n双梁桥式起重机\t91976\nm1522nf\t91977\n本领\t91978\n五回\t91979\n割\t91980\ncookielib\t91981\n习云波\t91982\n乾元坤\t91983\n埋地\t91984\nIPD\t91985\n0729\t91986\n迷惑不解\t91987\ntml\t91988\n泰凯斯\t91989\n上海智康1对1\t91990\n假日\t91991\n腰酸痛\t91992\nGMI\t91993\n0745\t91994\nモンスタ\t91995\n喜酒\t91996\n尊荣\t91997\n烧烤酱\t91998\n赋役\t91999\n血衣侯\t92000\n九龙城\t92001\n索取\t92002\nZZY\t92003\n冰神卡\t92004\n崇洋\t92005\n益德\t92006\n喜士多便利店\t92007\n1500元\t920
08\n福州晚报\t92009\n正在进行\t92010\nTimer\t92011\n锤死\t92012\n影后\t92013\n坡口\t92014\n冯斌\t92015\n曹彬\t92016\n多线程学习\t92017\n泰坦尼克号1\t92018\n两客一危\t92019\n另起炉灶\t92020\n药理学\t92021\nLinguee\t92022\n泥机\t92023\nlounge\t92024\n6679\t92025\n勾心\t92026\n外国女\t92027\n499\t92028\n迷色\t92029\n优待证\t92030\n杭州市发展和改革委员会\t92031\n1芯\t92032\n码期\t92033\n广东电台\t92034\n村村乐\t92035\n韩冬炎\t92036\nwritable\t92037\n三只蝴蝶\t92038\n甘蔗汁\t92039\n舒马胡润\t92040\nALB\t92041\n康安\t92042\nWhatsapp\t92043\n投资组合理论\t92044\n40086\t92045\n凌霄阁\t92046\n莞香\t92047\n孙云飞\t92048\n啦啦啦\t92049\n下校\t92050\n佘山站\t92051\nwinapi\t92052\nbig5\t92053\nbutton\t92054\n刺客信条:兄弟会\t92055\n1947年\t92056\nFoxpro\t92057\n医古文\t92058\n逐出\t92059\n专家库\t92060\n枪战王者\t92061\n联想G480\t92062\ntoly\t92063\nboot框架\t92064\n黄州\t92065\n伞布\t92066\n好园\t92067\n九鼎山\t92068\ncouture\t92069\n龙泉驿\t92070\n尤因\t92071\n普源\t92072\n绳操\t92073\n角标\t92074\n漳州实验中学\t92075\n三股势力\t92076\n赔\t92077\n宝骏e100\t92078\n艾玛·斯通\t92079\n次次\t92080\n高家宁\t92081\n2二\t92082\n名册\t92083\n赵越\t92084\n320路\t92085\n内蒙古医科大学\t92086\nBroadcast\t92087\n潜水员\t92088\nmusi\t92089\nschoolgirl\t92090\n甘油醚\t92091\n权日\t92092\n中央集权制\t92093\nppt\t92094\n海思麒麟处理器\t92095\nretaining\t92096\ncvxopt\t92097\n察使\t92098\n60万辆\t92099\n中星微\t92100\nmyelipse\t92101\npassed\t92102\n200页\t92103\nARM9之家论坛\t92104\nfan\t92105\n抽条\t92106\n北京培黎职业学院\t92107\n范玮\t92108\n浙江社区\t92109\n细水长流\t92110\n流火\t92111\n顾瑛\t92112\n小坑\t92113\n听完后\t92114\n200码\t92115\n劳动争议调解仲裁法\t92116\n房峰辉\t92117\n晚上7点\t92118\n李昕\t92119\n楼海\t92120\n和平广场\t92121\n山东男篮\t92122\n君山银针\t92123\n天长小学\t92124\n5321\t92125\n三顾茅庐\t92126\n汉芯事件\t92127\nDada\t92128\n山名\t92129\n救生艇\t92130\nav10492\t92131\nUprising\t92132\n巴川\t92133\n吸收率\t92134\n2B\t92135\n中间品\t92136\n铁锅炖鱼\t92137\n结尾\t92138\n上海虹桥火车站\t92139\n王品\t92140\n明日的我与昨日的你约会\t92141\n威豹\t92142\n新开网通\t92143\n癸未\t92144\n寻隐者\t92145\n冰强\t92146\n武汉江岸\t92147\n酒方\t92148\n束腰带\t92149\n连狙\t92150\n海威\t92151\n水果传\t92152\nCarrier\t92153\n摇船\t92154\n新宠儿\t92155\n江西站\t92156\n烈性\t92157\n小汉\t92158\n减速电机\t92159\n星际殖民\t92160\n反电动势\t92161\n紫蓬镇\t92162\n乖离性百万亚瑟王吧_\t
92163\nMICRO\t92164\n解放军81医院\t92165\n黑裤\t92166\n霍尼\t92167\nUNICODE\t92168\nwww.52\t92169\n选者\t92170\ngsnas\t92171\noppoa59\t92172\n安兔兔\t92173\n一处\t92174\n泰捷\t92175\nmediawiki\t92176\n工作关系\t92177\n课本\t92178\n牙博士\t92179\n崔鹏\t92180\n陕西南路\t92181\n10.1.1\t92182\nchengd\t92183\nfederated\t92184\n凯克\t92185\n张自力\t92186\n东威\t92187\n刘玄德\t92188\ngraphpad\t92189\n长安网\t92190\n阅界\t92191\nnoodles\t92192\n临池\t92193\n公共品\t92194\n红番茄\t92195\n陆压\t92196\n神瞳\t92197\ndenied\t92198\n北地\t92199\n广州市萝岗区\t92200\n滴灌带\t92201\nTTM\t92202\n最后胜利\t92203\n使女的故事第二季\t92204\n反恐怖法\t92205\n台扇\t92206\n掠夺性\t92207\n近三天\t92208\n十堰市太和医院\t92209\n乌审旗\t92210\n旺宏\t92211\n建平县人民政府\t92212\n粉晶\t92213\n安博会\t92214\n560分\t92215\n公益诉讼\t92216\n退位减法\t92217\n训练室\t92218\n江苏省农村信用社联合社\t92219\nPrinting\t92220\n一个9\t92221\n中医师\t92222\n西非\t92223\n罗曼·罗兰\t92224\n脱羧\t92225\n慈善信托\t92226\n小宫女\t92227\n百词\t92228\n陈梅\t92229\nAssassins\t92230\n厦门大学新闻传播学院\t92231\n吴茂昆\t92232\n正果镇\t92233\n中银大厦\t92234\n母案\t92235\n灵牌\t92236\n划过\t92237\n古一\t92238\n王芬\t92239\n湿乎乎\t92240\n43路\t92241\n不够了\t92242\n0403\t92243\n五毛\t92244\nProfiles\t92245\nAT89S52\t92246\n踱\t92247\n期满\t92248\nannot\t92249\n爱的誓言\t92250\n111\t92251\nTae\t92252\n力架\t92253\n侍酒师\t92254\n温晓\t92255\n书习作\t92256\n万中无一\t92257\nLoRaWAN\t92258\n跪下\t92259\n威高集团\t92260\nRentals\t92261\n工艺\t92262\n展艺\t92263\n插嘴\t92264\n叶柄\t92265\n台湾大众银行\t92266\nDickinson\t92267\n鞋类\t92268\n偷车\t92269\n万能导航网\t92270\nbazzi\t92271\n灯光节\t92272\n人位\t92273\n羟甲基糠醛\t92274\n蓝军\t92275\n校准仪\t92276\n我的任务\t92277\nIP营销\t92278\n1275\t92279\n图层蒙版\t92280\n绝尘侠\t92281\n电视浏览器\t92282\nPROTEL\t92283\n卫星电视\t92284\n付货\t92285\n百度分享\t92286\ntwilight\t92287\n二阶差分\t92288\n表字\t92289\n唠唠叨叨\t92290\n小嶋阳菜\t92291\n烹饪\t92292\nreconciliation\t92293\n五岳之巅\t92294\n机床展\t92295\n雨田君\t92296\nSons\t92297\nSever\t92298\n炒\t92299\n荣耀六\t92300\n4399火线\t92301\n政库\t92302\n香榭\t92303\nMP3\t92304\n连场\t92305\n山西电力\t92306\n暗黑破坏神2毁灭之王\t92307\n成立新\t92308\nNeo4J\t92309\n叙说\t92310\n刀位\t92311\n一个卷\t92312\n海伦公式\t92313\n剑桥雅思7\t92314\ncongress\t92315\n南昌高新区\t92316\n安庆专区\t92317
\n精灵宝\t92318\n筹委会\t92319\n我是特种兵之霹雳火\t92320\n嫡女\t92321\nyī\t92322\n离子烫\t92323\n牙龈红肿\t92324\n潘磊\t92325\n几百万\t92326\n洪山广场\t92327\n灰皮\t92328\n环境影响评价师考试\t92329\n陈键锋\t92330\n犬种\t92331\nCQB\t92332\nNever\t92333\n快鹿集团\t92334\n全名\t92335\n3GS\t92336\n忠孝东路\t92337\n奖\t92338\n陶瓷瓶\t92339\nGuitar\t92340\ndrug\t92341\n御坂\t92342\n罗渽民\t92343\n青玛瑙\t92344\n曾磊\t92345\n中共十八届五中全会\t92346\novpn\t92347\n幼童\t92348\n李学\t92349\nshemales\t92350\nSnail\t92351\n90mm\t92352\n鸟语花香\t92353\nACN\t92354\nPAYPAL\t92355\n生物质发电\t92356\n圪梁梁\t92357\n后白镇\t92358\n石家大院\t92359\n狮子型\t92360\nHoloLens\t92361\n360网盘\t92362\nroot包\t92363\n杂用水\t92364\n利益关系\t92365\n羞愧\t92366\n州县\t92367\nels\t92368\n协变\t92369\n葛大爷\t92370\n弥天大谎\t92371\n香农\t92372\n长江e号\t92373\n天猫汽车\t92374\nORG\t92375\n卷圆\t92376\n梦幻藏宝阁\t92377\n方媛\t92378\n朝天鼻\t92379\n最后的时光\t92380\n得用\t92381\n伙食\t92382\nsls\t92383\n管仲\t92384\n武统台湾\t92385\n陶然亭公园\t92386\n旅行险\t92387\n天空之境\t92388\n都江堰万达城\t92389\n三三九\t92390\n杯型\t92391\n福克斯两厢\t92392\n学豆社区|学豆论坛\t92393\n星战前夜\t92394\nwei\t92395\n荣华实业\t92396\n同室操戈\t92397\n胡景涛\t92398\n倪\t92399\n挥辉\t92400\n豪利时\t92401\n圣树\t92402\n芒果味\t92403\n河北师范大学附属实验中学\t92404\n多功能室\t92405\n麦克菲\t92406\n万木\t92407\n自谋\t92408\ncl2018\t92409\n酷友\t92410\n徐伦\t92411\n响沙湾\t92412\n小福利\t92413\n第十三位\t92414\n伏龙记\t92415\n生物工程学院\t92416\n肠道益生菌\t92417\n种群\t92418\n土地股份合作社\t92419\n雄才大略\t92420\nMatlab2017a\t92421\nkibana\t92422\n初临\t92423\n朱子治\t92424\n真融宝\t92425\n金梅\t92426\n875\t92427\n192.168.0.102\t92428\nwavemel\t92429\n米夏\t92430\n青岛市安全生产监督管理局\t92431\n二十次\t92432\nF100\t92433\n接地线\t92434\n风水\t92435\n香消玉殒\t92436\n36号\t92437\nfof\t92438\nmacbookair\t92439\n雨珍\t92440\n无锡市旅游局\t92441\nGrupo\t92442\nGM版\t92443\n20160220\t92444\n磁粉制动器\t92445\n标曲\t92446\n解毒机\t92447\n共箱\t92448\n西摩\t92449\n恰恰相反\t92450\n结缕草\t92451\n1.2.4\t92452\n黄帝内经网\t92453\n首部\t92454\n脑筋急转弯_短文学\t92455\nserenade\t92456\n青龙山\t92457\n姚梅龄\t92458\n3000多万\t92459\n成都新闻网\t92460\n哈希码\t92461\n打头阵\t92462\nZedBoard\t92463\nMechanics\t92464\n纺织品\t92465\nrobot\t92466\n致炫\t92467\n7M\t92468\n湖北航天信息技术有限公司\t92469\n゛\t92470\n渝中\t9
2471\n解迷\t92472\n方舟\t92473\n无错小说网\t92474\nwww.yibo5.cn\t92475\n美奥\t92476\n大潘佳佳\t92477\n连丽\t92478\nmutation\t92479\nopenvas\t92480\n旗舰店\t92481\n罗马队\t92482\n远行星号吧\t92483\n砂糖\t92484\ndotnetbar\t92485\n成人台\t92486\n紫晨\t92487\n500公斤\t92488\n353号\t92489\n网康\t92490\n自动包装机\t92491\n积慕\t92492\n熊超\t92493\n瑞迈特呼吸机\t92494\nreportlab\t92495\n低首\t92496\nurlrewrite\t92497\n裤子\t92498\nQuiet\t92499\n一字千金\t92500\n朋友别哭\t92501\nMACD股\t92502\n远景X3\t92503\n3156医药网\t92504\n回款\t92505\nGMT\t92506\n很感动\t92507\n胁差\t92508\n实付\t92509\n金六福酒\t92510\n彩色\t92511\n盐酸舍曲林片\t92512\n入度\t92513\n赛琳娜戈麦斯\t92514\nabbao\t92515\n乱发\t92516\nsystemic\t92517\n深圳保税区\t92518\n无锡地铁1号线\t92519\nuniqlo\t92520\n美国国家公园\t92521\n树科\t92522\n诺基亚808\t92523\n万柏林区\t92524\n扑腾\t92525\nb612咔叽\t92526\nv2.5.7\t92527\n新长江\t92528\n稳态\t92529\n郝建民\t92530\n丽江机场\t92531\n大件路\t92532\n序数词\t92533\nartists\t92534\n苏南硕放国际机场\t92535\n基底层\t92536\nubuntu\t92537\ng4560\t92538\n腾科\t92539\n氯化钡\t92540\n区块链100问\t92541\n冒头\t92542\n好姑娘\t92543\nCA6140\t92544\n纸碗\t92545\n6681\t92546\n三国机密之潜龙在渊百度云\t92547\n16道\t92548\n0.9m\t92549\n入行\t92550\n取非\t92551\n943路\t92552\n基金份额\t92553\n月亮星座\t92554\nEnhanced\t92555\n园服\t92556\n控制类\t92557\n列做\t92558\npresident\t92559\n车宽\t92560\n上海园\t92561\n朱琦\t92562\nSLT\t92563\n精灵4p\t92564\n路警\t92565\n挂马\t92566\n40余名\t92567\n三毛梅威瑟\t92568\n竖盘\t92569\n染出\t92570\n结核性\t92571\n中铁建工\t92572\n星探\t92573\n磁通量\t92574\n竭力\t92575\n人到中年不得已\t92576\n工程勘察设计收费标准\t92577\n林媛\t92578\n竞彩258网\t92579\n濑户早妃\t92580\n散焦\t92581\n金刚手菩萨\t92582\n甘肃政府采购网\t92583\n大沙河\t92584\n中华人民共和国民政部\t92585\n京畿道\t92586\n宏瑞\t92587\n珠玉\t92588\n肉照\t92589\n0f\t92590\n申华控股\t92591\n皮兜\t92592\n桃花奇缘_多玩\t92593\nnomore\t92594\n李云哲\t92595\n插入页\t92596\n情非得已\t92597\n上海理工大学管理学院\t92598\n宅男福利网\t92599\n合肥地铁2号线\t92600\n祭酒\t92601\nNPP\t92602\n贝索\t92603\n古驰GUCCI\t92604\n波多野洁衣\t92605\n承兑汇票贴现\t92606\nSUPPORT\t92607\n壶口\t92608\ncenteros\t92609\n小巴掌\t92610\nstriped\t92611\n刻制\t92612\n中央统战部\t92613\n31期\t92614\n玛拉顿\t92615\nFn\t92616\n雪峰科技\t92617\n金刚狼\t92618\n乳露\t92619\n时代杯\t92620\n血栓通\t92621\n金赛\t92622\n让世界充满爱\
t92623\n军报记者网\t92624\n吴天君\t92625\n无尽之剑3\t92626\nxmlbeans\t92627\n乐博士\t92628\n视障\t92629\n女王宫\t92630\n18块\t92631\n黄金糕\t92632\n异型鱼\t92633\n于都\t92634\n关刀\t92635\n强军\t92636\n笔袋\t92637\nwebAPP\t92638\n压路\t92639\n水面\t92640\nifashion\t92641\n攻城掠地\t92642\ncooking\t92643\n湖南新闻_新浪\t92644\n刺络\t92645\n董美香\t92646\n调酒师\t92647\n拔模\t92648\n北京市体检中心\t92649\n旭川动物园\t92650\n152mm\t92651\n非分之想\t92652\n街角\t92653\n蓝血\t92654\n坡脚\t92655\n观月茜\t92656\nmaomi\t92657\n京东金融研究院\t92658\n起跑线亲子网\t92659\nhms\t92660\nf188\t92661\n正一\t92662\nster\t92663\ndatafile\t92664\nca\t92665\n悍跳\t92666\n8161\t92667\n博德鲁姆\t92668\n维嘉\t92669\n盗梦人小说网\t92670\n5月13日\t92671\n论足\t92672\n辅食\t92673\nX220\t92674\n地图\t92675\n牧养\t92676\n练一练\t92677\n碧桂园天汇\t92678\n2.9亿\t92679\n爱宝\t92680\n梧州学院\t92681\n少男少女\t92682\n麻将桌\t92683\n林风\t92684\n跑环\t92685\n新纪\t92686\nmmorpg\t92687\n挪用资金罪\t92688\n秘籍\t92689\n存款性\t92690\n1980年代\t92691\n高效性\t92692\n艾斯卡\t92693\n梦到\t92694\n榨汁杯\t92695\n午餐费\t92696\n会计学硕\t92697\n苏南国际机场\t92698\n18w\t92699\n跨洋\t92700\nbook2\t92701\n2.4米\t92702\n五指山市\t92703\n施丹\t92704\n女上男\t92705\nPSV\t92706\n学绪论\t92707\n江苏卫生健康职业学院\t92708\n贝雷帽\t92709\n城乡路\t92710\n三湾\t92711\n长城人寿\t92712\n谢飞\t92713\n神球\t92714\n稿\t92715\n高级战争2\t92716\n陈豪\t92717\n三浦春马\t92718\n形骸\t92719\n客途\t92720\nfixe\t92721\n洗浴\t92722\nbrewing\t92723\n非特\t92724\n计划\t92725\n末日之战\t92726\nimageRUNNER\t92727\n中国西电集团\t92728\n英雄无敌\t92729\n冶金焦\t92730\nBeg\t92731\nied\t92732\n成飞\t92733\nttx\t92734\n威慑\t92735\n戈尔\t92736\n龙凤区\t92737\n蓝光原盘\t92738\n吞下\t92739\nDELL戴尔\t92740\nds-160\t92741\n刘亦菲\t92742\nSnapdragon\t92743\n天天利财\t92744\n180片\t92745\n夜火\t92746\n深水港\t92747\n沃美\t92748\n中国外文局\t92749\n十四位\t92750\n健康观\t92751\n处理厂\t92752\n财函\t92753\n格库铁路\t92754\n深圳社区\t92755\n碘酊\t92756\n天津体育学院运动与文化艺术学院\t92757\n备员\t92758\n试爱\t92759\n区残联\t92760\n村务监督委员会\t92761\n帮带\t92762\n咏蛙\t92763\nmp288\t92764\n快算\t92765\n23G\t92766\n杀精\t92767\n璨\t92768\n近40年\t92769\n谢孟媛\t92770\n祚\t92771\n嫂子\t92772\n谢场\t92773\n竖琴\t92774\n死树\t92775\n建行大厦\t92776\n第84集\t92777\nInitialize\t92778\n机油\t92779\n聚美优品\t92780\n欧比斯\t92781\n
北京人民大学\t92782\n敢达决战\t92783\n烤漆房\t92784\n曳步\t92785\n秀色可餐\t92786\nul00\t92787\n川港\t92788\n捅\t92789\n29mm\t92790\n卡卡颂\t92791\n玉兰花园\t92792\n中付\t92793\nidr\t92794\n陈中\t92795\n海天黄豆酱\t92796\naex\t92797\n邓普顿\t92798\n巡检员\t92799\nClorisLand\t92800\n刺客信条枭雄吧\t92801\nEDR\t92802\n得以\t92803\nMIX2\t92804\n柠檬云\t92805\n指认\t92806\n阿德里亚诺\t92807\n荔箫\t92808\n湖石\t92809\ntestflight\t92810\n国际足联\t92811\n有机氯农药\t92812\n不明白\t92813\n庄月明\t92814\n翩\t92815\nFOSS\t92816\n南方医科大学深圳医院\t92817\n重庆汽博中心\t92818\n子鼠\t92819\n日产聆风\t92820\nLingoes\t92821\nsql吧_\t92822\n平约瑟\t92823\n中国兵器装备集团公司\t92824\n南京图书馆\t92825\n信昌\t92826\n打除\t92827\n苯乙酮\t92828\n大数据营销\t92829\n帝驼\t92830\n冷桥\t92831\n看展\t92832\n张锡纯\t92833\n回调\t92834\nIE7\t92835\n非织\t92836\n淳化县人民政府\t92837\n西城法院\t92838\n疏理\t92839\niggy\t92840\n磁流体\t92841\n查开放\t92842\n5个多小时\t92843\n难者\t92844\n报考\t92845\n签署\t92846\nACU\t92847\n开苞\t92848\n濮阳市\t92849\n苟日新\t92850\n易宝典\t92851\n高尔夫\t92852\nsweetfx\t92853\n19.1.2\t92854\n百度云1080p\t92855\n王者荣耀至尊宝\t92856\n壁式\t92857\n一个工作日\t92858\n东海花园\t92859\nDrunk\t92860\n强龙者\t92861\n杨舒平\t92862\n伟迪捷\t92863\nps模拟器\t92864\n合影留念\t92865\nfatblaster\t92866\n诸城新闻网\t92867\n6包\t92868\n周海婴\t92869\n赫迪拉\t92870\n首都医科大学附属北京天坛医院\t92871\n刘丽萍\t92872\n超负荷\t92873\nTMC\t92874\n本溪市\t92875\n绿名\t92876\n夜学\t92877\n眉粉\t92878\n衔接班\t92879\n东坑镇\t92880\n桐原绘里香\t92881\n龙族5\t92882\nReckless\t92883\n北京香格里拉饭店\t92884\nufc223\t92885\n机械加工\t92886\nN900\t92887\n20座\t92888\n烟包\t92889\ntopsis\t92890\n巴恩斯\t92891\n李冉\t92892\n昌平路\t92893\n校费\t92894\n四氯化碳溶液\t92895\n夜灵平原\t92896\n北大仓\t92897\n我的父亲\t92898\n限域\t92899\n城市售票网\t92900\n美妇\t92901\n画芯\t92902\n评审团\t92903\n道侣\t92904\n晓富\t92905\n中华母亲节\t92906\n中国邮政集团\t92907\n浙江海事局\t92908\nm126nw\t92909\n桂林航空\t92910\n106个\t92911\n桓淼淼\t92912\n编\t92913\n努比亚N3\t92914\n易县\t92915\n七峰山\t92916\n姬无双\t92917\n燕郊\t92918\n靓颖\t92919\n寻人启事\t92920\n芜\t92921\n偏振态\t92922\n神魔天尊\t92923\najaxsubmit\t92924\n海兴吧\t92925\n李东升\t92926\n市场准入\t92927\n酒原文\t92928\na柱\t92929\n尽致\t92930\n李行\t92931\n直航\t92932\nchcp\t92933\n坑梓街道\t92934\nBLUE\t92935\n平角\t92936\n蹩脚\t92937\nQQ拼音\t9293
8\n家母\t92939\n柔弱\t92940\n四更\t92941\ngpio\t92942\n广州丝足\t92943\n临床医学类\t92944\n读碟\t92945\n分则\t92946\n叉叉\t92947\n工段\t92948\n头孢氨苄\t92949\nA719\t92950\nCISG\t92951\n余弦函数\t92952\n耳线\t92953\n获胜者\t92954\nTurtle库\t92955\nxrandr\t92956\n第十六章\t92957\n柱底\t92958\n幼麟\t92959\nska\t92960\n20#钢\t92961\n安康鱼\t92962\n合肥市经济技术开发区\t92963\n陈李济\t92964\n大气采样器\t92965\n新石\t92966\nsdd\t92967\n49路\t92968\n云南省环境保护厅\t92969\nPyCharm2017\t92970\n乐派\t92971\n正弘\t92972\nPyV8\t92973\n边外\t92974\n妈妈的朋友3\t92975\nsolidworks2012\t92976\nDesigner\t92977\n雨松MOMO程序研究院\t92978\n房券\t92979\n衣着\t92980\n都市娱乐\t92981\n四进制\t92982\n珉\t92983\n胱抑素c\t92984\n中共临沂市委党校\t92985\n单关\t92986\n建业凯旋广场\t92987\n新交\t92988\nmarvel\t92989\n使命召唤二战\t92990\n千垛景区\t92991\n内阁制\t92992\nlingo\t92993\nvitality\t92994\n刘墉下南京\t92995\n当子\t92996\n小糸\t92997\n瓦楞机\t92998\n瘦\t92999\n郁先生\t93000\n献血者\t93001\n济宁日报\t93002\n马王堆导引术\t93003\n明秀路\t93004\nAya\t93005\n15下\t93006\n叮咚\t93007\n有司\t93008\n流媒体服务器\t93009\n林里\t93010\n2018年一月份\t93011\n源高\t93012\n藤子不二雄\t93013\n淮安开发区\t93014\n手皮\t93015\n斧王岛\t93016\n黑莓9780\t93017\n张锦\t93018\n二龙山\t93019\nDw\t93020\n哈局\t93021\n第18章\t93022\n女寝\t93023\n卡商\t93024\n中铁十一局集团\t93025\nS5700\t93026\n影印版\t93027\n中牧股份\t93028\n医药展\t93029\n歌曲类\t93030\n卖花\t93031\nFFTW\t93032\n可米Sunnee\t93033\n湖北大学知行学院\t93034\n网上挂号\t93035\n唐冶新区\t93036\n小饭\t93037\nDark\t93038\n彩玉\t93039\n小样\t93040\nNuget\t93041\n真格学网\t93042\n里耶\t93043\n方砖\t93044\n浊度\t93045\n天津一汽丰田\t93046\n鄂州市国土资源局\t93047\n从心开始\t93048\n夏县人民政府\t93049\n給\t93050\n幸福心理学\t93051\n当雄县\t93052\n博罗石湾\t93053\n风疹病毒\t93054\n刘达\t93055\n苏木\t93056\nnum\t93057\n宝马m2\t93058\n普安\t93059\n意外伤\t93060\nbasically\t93061\n2015年8月份\t93062\n福建教育\t93063\n千丈\t93064\n天车\t93065\n或多或少\t93066\n16.3\t93067\n完整版影音先锋高清\t93068\n郭晓婷\t93069\n前3个月\t93070\n右军\t93071\n泸州市政府\t93072\n雄鹰\t93073\n102国道\t93074\nFaZe\t93075\n45#\t93076\n忘尘如羡\t93077\n代办费\t93078\ndii\t93079\nInspiron\t93080\nHOHO\t93081\n元阳县\t93082\n财大气粗\t93083\n一键批量\t93084\nNastran\t93085\nscanf\t93086\n75周年\t93087\nScenic\t93088\n澳门酒店\t93089\n时代光华\t93090\n拜吧\t93091\n南京政府\t93092
\n1940年\t93093\n上海市疾病预防控制中心\t93094\n合益\t93095\n哈尔滨旅行社\t93096\nMOM\t93097\n摈弃\t93098\ngmap\t93099\nrror\t93100\n黑沙湾\t93101\n南极圈\t93102\ngst\t93103\ncasio计算器\t93104\n中国中铁四局集团\t93105\n安全工程师\t93106\n操之过急\t93107\n无座\t93108\ntsinghua\t93109\n低功\t93110\nfsdb\t93111\n3.4G\t93112\n美墅\t93113\nxietui\t93114\n广西科技大学\t93115\n完本\t93116\n锐速\t93117\npdg\t93118\nVIOS\t93119\n闺蜜装\t93120\n弗朗索瓦\t93121\n空蒙\t93122\n成功\t93123\n新骗术\t93124\ncnode\t93125\nCPDA\t93126\n离岸价\t93127\n老君山\t93128\n新世代全顺\t93129\n展示盒\t93130\n週\t93131\n插口\t93132\n胸模\t93133\n此起彼伏\t93134\n1&\t93135\n广州湾\t93136\nKongregate\t93137\n牵引绳\t93138\n消失模铸造\t93139\n碌曲县\t93140\n苏人版\t93141\nS1期\t93142\n甩手工具箱\t93143\nmentohust\t93144\n刀剑乱舞花丸\t93145\n保定市公安局\t93146\n抽象画\t93147\n善若吉命理网\t93148\nsteinberg\t93149\n混合制\t93150\n新罗免税店\t93151\n聚赌\t93152\n奇隆\t93153\nattracted\t93154\n荀飞盏\t93155\n2017.6\t93156\n张力尹\t93157\n江西省司法厅\t93158\n童第周\t93159\n250亿\t93160\n少林寺武校\t93161\nЛ\t93162\n银座商城\t93163\nsin1/\t93164\n万能遥控\t93165\n马拉糕\t93166\n中国机电产品进出口商会\t93167\n2.1亿\t93168\n54部\t93169\narray_multisort\t93170\n狂热\t93171\n维维软件园\t93172\n面相分析\t93173\n周六日\t93174\n水凝胶\t93175\n代理机构\t93176\n中国橡机网\t93177\n习总书记十九大\t93178\n居住证制度\t93179\n打卡器\t93180\n警惕性\t93181\nfifaonlin\t93182\n抢工\t93183\n五凤溪古镇\t93184\n李小冉\t93185\n梁婖婷\t93186\n王禹\t93187\n6.4.1\t93188\n南丹路\t93189\n12卷\t93190\n冰滴\t93191\nrattle\t93192\n清火\t93193\n汗血马\t93194\n新广告法\t93195\n史集\t93196\n700万\t93197\nfbs\t93198\n犯难\t93199\n西关村\t93200\n薄雪\t93201\n丁香酚\t93202\n2000款\t93203\n维c泡腾片\t93204\nCONVERSE\t93205\n北京网信办\t93206\n布水\t93207\n摊铺机\t93208\n驴友户外网\t93209\n杭州物流公司\t93210\n王化成\t93211\n女人有话说\t93212\n街舞队长\t93213\n天时地利人\t93214\n福音时报\t93215\n三只小猪\t93216\n北京市农林科学院\t93217\n快用\t93218\n市博物馆\t93219\n沙河街\t93220\n白瞎\t93221\n宝来\t93222\n云南山歌剧\t93223\nstudy\t93224\n关系史\t93225\n新圩\t93226\n搞笑视频\t93227\ncheryl\t93228\nN7100\t93229\n平均律\t93230\nFSN\t93231\nvgod\t93232\n顺数\t93233\n35\t93234\nKnowing\t93235\n情热传说\t93236\n麦克卢汉\t93237\nF2\t93238\n相\t93239\n火焰纹章无双\t93240\n舞协\t93241\n岱山县安全生产监督管理局\t93242\n涟钢\t93243\n032\t93244\nSDNL
AB\t93245\n论衡\t93246\n欧迪芬\t93247\n2ne1\t93248\n北京地铁4号线\t93249\nf420\t93250\n数个\t93251\n挑明\t93252\n蜡菊\t93253\n乐飞\t93254\n梁七少\t93255\na7s\t93256\n上海电视\t93257\n电子层\t93258\n罗马湖\t93259\n赤膊\t93260\nCFer\t93261\n黑魅\t93262\n2015年9月3日\t93263\n安慕希\t93264\n侧线\t93265\nfifaol4\t93266\n行锁\t93267\nWFC\t93268\n俺去也电影网-anquye-俺去了-俺来也-俺去啦-我去也-俺去也新网\t93269\n彭宇行\t93270\n图普科技\t93271\n开年\t93272\nsata3.0\t93273\n阿扎里\t93274\n繁多\t93275\n议习\t93276\n瑜亮\t93277\n安游在线\t93278\n天王补心丸\t93279\n使命召唤12:黑色行动3\t93280\n平乐\t93281\n十五个月\t93282\n包裹性\t93283\n东京热\t93284\nEB-5\t93285\nprestashop\t93286\n拉手网团购网\t93287\n跨层\t93288\n飞天侠\t93289\nembroidery\t93290\n1.7.2\t93291\n丝袜\t93292\n耳语\t93293\n垓下\t93294\n维棠\t93295\n痔疮\t93296\n1600万元\t93297\n饶罗翔\t93298\n吉木萨尔县人民政府\t93299\nMTK手机网\t93300\nCOALESCE\t93301\n赵新\t93302\nicl\t93303\n科索\t93304\n豆饼\t93305\nVisa\t93306\n纯真\t93307\n中游标\t93308\n碧血剑\t93309\n张贤胜\t93310\n杀鱼\t93311\n双低\t93312\n墨量\t93313\n乌龙茶\t93314\n猛鬼大厦\t93315\n北京大学社会学系\t93316\nlibgtk\t93317\nSplitIt\t93318\n365彩票网\t93319\n欺诈\t93320\n拉动式\t93321\n老骥伏枥\t93322\ncharles\t93323\n李二牛\t93324\n字脸\t93325\n巴塘县\t93326\nrebalance\t93327\n国记\t93328\n阳城\t93329\n童姥\t93330\n赛普健身\t93331\n米仓\t93332\n九州娱乐网\t93333\narrived\t93334\n福佑\t93335\ngiving\t93336\n小叶榄仁\t93337\n石占明\t93338\n公交都市\t93339\n品牌概念\t93340\nWin7系统电脑\t93341\ndiagnostic\t93342\n5202\t93343\n20160318\t93344\n找麻烦\t93345\n江铠\t93346\n赤化\t93347\nSin\t93348\n国防军\t93349\n32平米\t93350\n减息\t93351\nps扣图\t93352\n班门\t93353\n新长安欧尚_长安欧诺报价\t93354\n上海动漫\t93355\nMalibu\t93356\n兵贵\t93357\nattrs\t93358\n潘金莲之前世今生\t93359\n少先队鼓号队\t93360\n藏龙百瀑\t93361\n流程时间\t93362\n集贤路\t93363\nyoutuber\t93364\n汇康\t93365\n海俪恩\t93366\nheavi\t93367\n营业款\t93368\n奶瓶\t93369\n撷香\t93370\n北卡罗莱纳州\t93371\n标准类\t93372\n第八条\t93373\nangular\t93374\n火墙\t93375\n御宴\t93376\n方剂\t93377\n诛\t93378\n正脉\t93379\n阿松\t93380\n京运通\t93381\nISSUED\t93382\nQLineEdit\t93383\nskate\t93384\n斗罗大陆2\t93385\n雁鸣湖\t93386\n1322140\t93387\n评估期\t93388\n照抄\t93389\n2017年4月11日\t93390\n人民政协报\t93391\n脍\t93392\n15N\t93393\n言舒雅\t93394\n印台区\t93395\n化隆\t93396\
n上证所\t93397\n惊叹号\t93398\n注液\t93399\n莫德尔\t93400\n象山景区\t93401\n沙雪\t93402\nsha\t93403\n3月22日\t93404\n小书亭\t93405\n优诺\t93406\n恒大悦澜湾\t93407\n致趣\t93408\n代表人士\t93409\n三批\t93410\n海信平板电视\t93411\n分体\t93412\n吴召国\t93413\n自然面\t93414\n全无修\t93415\nTroy\t93416\n96分\t93417\nAES256\t93418\n晓磊\t93419\n硬性\t93420\n国运\t93421\n头绪\t93422\n泛普\t93423\nobe\t93424\n中山红木家具品牌\t93425\n第142章\t93426\nTransformers\t93427\nozone\t93428\n某地\t93429\n珠海市政府网\t93430\n联合军\t93431\naldi\t93432\n省直机关工委\t93433\n真子\t93434\nlatte\t93435\nNordic\t93436\n普普通通\t93437\n阿尔泰\t93438\n5.x\t93439\n分叉币\t93440\n上海国际学校\t93441\ncarnival\t93442\n小动作\t93443\ninhibitor\t93444\n佳乐世纪城\t93445\n溶于\t93446\n鼓曲\t93447\n园艺场\t93448\nGm版本库\t93449\n国润城\t93450\n悬念式\t93451\n0.125g\t93452\n拉铆螺母\t93453\n%eax\t93454\n塞纳春天\t93455\n副区长\t93456\n冰神\t93457\n可圈可\t93458\n吴师\t93459\n二百三\t93460\nViewPager\t93461\n叮叮车\t93462\n放话\t93463\nclassifier\t93464\n人卫智网\t93465\n框\t93466\n老蛙镜头\t93467\n半波整流电路\t93468\n暖水\t93469\n0.71\t93470\n全段\t93471\nA5100\t93472\n九九中文网\t93473\n天天军棋\t93474\n王子文\t93475\n恐怖宠物店\t93476\nimgur\t93477\n傅承敏\t93478\n廖俊涛\t93479\nQTE\t93480\n红橡木\t93481\nInkjet\t93482\n地主\t93483\n邻省\t93484\n舌面\t93485\n最后冲刺\t93486\n袁纯清\t93487\n红道\t93488\n彩虹图\t93489\n钢桩\t93490\n战略版\t93491\n烟袋斜街10号\t93492\n苗医\t93493\n尺量\t93494\n中国地图出版社\t93495\nzhuji\t93496\n水氧\t93497\n名驭\t93498\n滤镜\t93499\n皮垫\t93500\n获得性\t93501\n老干爹\t93502\n精诚网\t93503\n高送转\t93504\n重庆新闻_新浪\t93505\n站前大道\t93506\n工行私人银行\t93507\nmaplestory\t93508\nPitbull\t93509\n通风报信\t93510\nmapperLocations\t93511\n海东新闻网\t93512\n季布\t93513\n不和睦\t93514\n杰信\t93515\n工牌\t93516\ntur\t93517\n1406\t93518\n澳门风云2\t93519\n动件\t93520\n山野\t93521\n国家5A级旅游景区\t93522\n龙里县人民政府\t93523\n类人\t93524\n宜丰信息港\t93525\n汗蒸房\t93526\n车水马龙\t93527\nvedio\t93528\n城市分\t93529\n万科里\t93530\nDHL\t93531\n缺粮\t93532\n3月21\t93533\n单号查询网\t93534\n墙裙\t93535\n6平方\t93536\n印次\t93537\n亲身\t93538\n变砖\t93539\n唱支山歌给党听\t93540\n西元\t93541\n鹿邑\t93542\n王菀之\t93543\n容城\t93544\n温柔\t93545\nES5\t93546\nMega/115\t93547\n地球简笔画\t93548\n杭盖\t93549\n2部\t93550\n阿巴嘎旗\t93551\n水流年\t93552\n
嗎\t93553\n3X3黄金联赛\t93554\n九龙山风景区\t93555\n20141215\t93556\n膀胱镜\t93557\n查尔顿\t93558\n1.com\t93559\n证券交易所\t93560\n贵金属催化剂\t93561\n梅西耶\t93562\n音乐社\t93563\n星座乐\t93564\n马累机场\t93565\ngentle\t93566\n面对巨人\t93567\n两三天后\t93568\n科技文\t93569\n成都)有限公司\t93570\n千年\t93571\n苹果7plus\t93572\n300行\t93573\n昌吉回族\t93574\n滤\t93575\nFilezilla\t93576\n名片\t93577\n蓝移\t93578\n驱动程序包\t93579\n阿訇\t93580\n毛藻\t93581\n出发点\t93582\n皮拉图斯山\t93583\nsivan\t93584\n郭妃丽\t93585\n3.9米\t93586\nIPAD2\t93587\n26家\t93588\n備\t93589\n广陵新城\t93590\n责问\t93591\n什川\t93592\n多益\t93593\n万事达卡\t93594\nRD\t93595\n县城\t93596\n竞争上岗\t93597\n九村\t93598\n荣威350\t93599\n三川玲\t93600\n6月30日起\t93601\n第六十二\t93602\n北汽福田汽车\t93603\n|巴\t93604\n曲非烟\t93605\n同人本\t93606\n中高档\t93607\n企业管理专业\t93608\n网络扫描仪\t93609\nnuomi\t93610\nデリバリ\t93611\n多线切割机\t93612\n650W\t93613\n齐鲁党报网\t93614\nCollection\t93615\n贝恩施\t93616\n非线程安全\t93617\n轴长\t93618\n梅墟\t93619\n网易云音乐电台\t93620\nharder\t93621\n保卫\t93622\n余量\t93623\nimplode\t93624\n柯美751\t93625\n嘲弄\t93626\n防寒\t93627\n精悍\t93628\n邮政编码大全_全国邮政编码大全\t93629\n专科段\t93630\n川端\t93631\nConsulting\t93632\n安卓机\t93633\nUIWindow\t93634\n磁密\t93635\n萤石云\t93636\n死斗\t93637\n类群\t93638\n连接管\t93639\n提纯\t93640\n日积月累\t93641\n闫云达\t93642\n新号\t93643\n途观丝绸之路版\t93644\n宝马3\t93645\nsession_key\t93646\n精血\t93647\n盛婚\t93648\n给花\t93649\n俄克拉荷马\t93650\n创文巩卫\t93651\n自动挡车\t93652\n兴文在线\t93653\n百戏\t93654\n有线通\t93655\n天涯社区\t93656\n物流费\t93657\n这么近\t93658\n新华北路\t93659\n37tp\t93660\nbright\t93661\n十师\t93662\n影长\t93663\n聚齐\t93664\nGPU世界论坛\t93665\n搜狗导航\t93666\n高层\t93667\n20160703\t93668\n两难\t93669\n中国共产党党员领导干部廉洁从政若干准则\t93670\n戒骄戒躁\t93671\n逆水寒ol\t93672\nNamenode\t93673\n屠龙勇士\t93674\n迈克尔逊\t93675\n会考题\t93676\n艶母\t93677\n样卡\t93678\n铠龙\t93679\n20170706\t93680\n四五岁\t93681\n结构化产品\t93682\nHishop\t93683\n丙烯酸漆\t93684\n仙本\t93685\n魏都区\t93686\nwagner\t93687\n火壁炉\t93688\n聚四氟乙烯\t93689\n辽宁工程技术大学\t93690\n闻名\t93691\n磁悬浮列车\t93692\n西南医院\t93693\nPublishing\t93694\n头屯河\t93695\n南方医科大学\t93696\n岁月的童话\t93697\n网吧服务器\t93698\nwh1000xm2\t93699\n纳税服务网\t93700\n絮絮\t93701\nshapefile\t93702\nbpp\t93703\n红
兴隆\t93704\n当家做主\t93705\n加查\t93706\n唐钰\t93707\n网易漫画\t93708\n小退\t93709\n宏_百度\t93710\n56P\t93711\ndiem\t93712\nsienna\t93713\n790cc\t93714\n武侠\t93715\n清华大学\t93716\nRegularization\t93717\n19th\t93718\n搭扣\t93719\n人工受孕\t93720\n高级渲染器\t93721\n城陵矶\t93722\n禁种铲毒\t93723\nCalculators\t93724\n不俗气\t93725\n黄文涛\t93726\nNSAttributedString\t93727\n宝光寺\t93728\n生物通\t93729\n结影\t93730\n留印\t93731\n飞蚊症\t93732\n心潮澎湃\t93733\n企地\t93734\n太湖源镇\t93735\nJalopy\t93736\n福特翼搏\t93737\nWindowBuilder\t93738\n融府\t93739\n7点\t93740\n宜春中学\t93741\n新疆艺术学院\t93742\n珊\t93743\n12日游\t93744\n谄媚\t93745\n第几年\t93746\nJCO\t93747\n180级\t93748\n万宋\t93749\n微金\t93750\n希岛爱理\t93751\n贰角\t93752\n山寨币\t93753\nSYSLOG\t93754\n千万美元\t93755\nbcd码\t93756\npoco\t93757\n三生有幸\t93758\n陆机\t93759\nStratos\t93760\nLOFTER\t93761\n社会制度\t93762\n拖尾\t93763\n洮河\t93764\n甚高\t93765\n管家\t93766\n培菲康\t93767\n土霉素片\t93768\n脱单\t93769\n坑坑洼洼\t93770\n湖北宜化\t93771\n瑞尔森大学\t93772\n怀璧其罪\t93773\nhuzhou\t93774\nLawrence\t93775\n珠海市人民政府\t93776\n受益人\t93777\nfp\t93778\n5070\t93779\n何期\t93780\nios8.4\t93781\n肉桂酸\t93782\n房贷款\t93783\n空气流量计\t93784\n罗敷\t93785\n光锥\t93786\n暮然\t93787\n无氧铜\t93788\nchil\t93789\n1.html\t93790\n20171203\t93791\n电子科技有限公司\t93792\nvdbench\t93793\n三根\t93794\n蒙族舞\t93795\n亮翅\t93796\nWin10安全之家\t93797\n砵兰街\t93798\n合青高铁\t93799\n苏诺\t93800\n国史通鉴\t93801\n40块\t93802\n老爷车\t93803\ncadsee\t93804\n创发\t93805\n虎丘湿地公园\t93806\n常州市第一人民医院\t93807\n游话\t93808\n重婚罪\t93809\n长跑\t93810\n大庆市人力资源和社会保障局\t93811\n充绒\t93812\n玉屏山\t93813\npcf\t93814\n首宗\t93815\nSuspended\t93816\n贺年\t93817\nabdroid\t93818\n108种\t93819\n吴越国\t93820\n程涛\t93821\n旗语\t93822\ndayinji\t93823\n叶剑英\t93824\n台骀山\t93825\n分管\t93826\n12星座女\t93827\nRoma\t93828\nstr2\t93829\nRHCE\t93830\n香港招商银行\t93831\n符氏\t93832\n碱蓬\t93833\n陈俊卿\t93834\n3dmax\t93835\n泰祥\t93836\n飘渺\t93837\nIRC\t93838\n图员\t93839\n办公厅\t93840\n麒麟650\t93841\ntqdm\t93842\n一题\t93843\n轴类\t93844\n全票\t93845\n液位开关\t93846\n迈阿密热火队\t93847\n点缀\t93848\n5.09\t93849\n串连\t93850\n瓦尔塔\t93851\n灵川镇\t93852\n孙传庭\t93853\n第151章\t93854\n采收率\t93855\n磁编码器\t93856\n嘉定中心医院\t93857\n云南省\
t93858\n巴西世界杯\t93859\n卷筒纸\t93860\nWI-H700\t93861\nCommand\t93862\n内线\t93863\nefex\t93864\nCambodia\t93865\n实施意见\t93866\n噼里\t93867\n新疆能源网\t93868\n比赛\t93869\n李沐\t93870\n垂直同步_\t93871\n益贷\t93872\nduowan\t93873\nfash\t93874\n油花\t93875\n海峡银行\t93876\n微商学院\t93877\n夏装\t93878\n许友彬\t93879\nsurfactant\t93880\nxmouse\t93881\n眼角纹\t93882\n年轻代\t93883\n受罪\t93884\nRoosevelt\t93885\n经销商\t93886\n干姐\t93887\n单方面\t93888\n退换\t93889\n京东社区\t93890\n滑冰鞋\t93891\n资源配置\t93892\n痣苍\t93893\n曹颖\t93894\n笋芽\t93895\n同科\t93896\n贫贱\t93897\n养殖户\t93898\n雄文\t93899\nread-only\t93900\n287号\t93901\n酷睿i5处理器\t93902\n第120集\t93903\n0恒\t93904\nDOSBox\t93905\n马法兰\t93906\nNUI\t93907\n80.0000元\t93908\nImage\t93909\n护卫神\t93910\ndoom4\t93911\nxming\t93912\n36节\t93913\n模糊算法\t93914\nterrace\t93915\n干嘉伟\t93916\n纹\t93917\n细胞生物学\t93918\n南极仙翁\t93919\n文质彬彬\t93920\n不能吃\t93921\n沈雷\t93922\n支腿\t93923\n桃尻\t93924\n轩辕剑天之痕\t93925\nFACT\t93926\n五指毛桃\t93927\n村子\t93928\n减员\t93929\n乔治六世\t93930\n脱温\t93931\nAD转换\t93932\n埃隆\t93933\n火星情报局\t93934\nPASCAL\t93935\n检举信\t93936\nled投光灯\t93937\nbroken\t93938\n盒型\t93939\n鸟市\t93940\n裁判秀\t93941\nTEIN\t93942\n大有罪\t93943\nMysql数据库\t93944\n无空\t93945\n大清铜币\t93946\n中山大学化学与化学工程学院\t93947\n达拉然\t93948\n自我介绍\t93949\ngranite\t93950\n槁\t93951\n连线题\t93952\n征途\t93953\n建行手机银行\t93954\nWSP\t93955\n配备\t93956\n84号\t93957\n长距\t93958\n乔治巴顿\t93959\n头孢过敏\t93960\nProbe\t93961\n银帆\t93962\n诺艾尔\t93963\n酚醛树脂\t93964\nhmail\t93965\n三门核电站\t93966\n练就好\t93967\nV6.3\t93968\njh\t93969\nlots\t93970\n五月\t93971\n大球\t93972\n刘勤\t93973\n雨人\t93974\n北大未名\t93975\n历奇\t93976\n移动变电站\t93977\n大专文凭\t93978\n理财型\t93979\n降钙素\t93980\n淘大师\t93981\n铁力\t93982\nexi\t93983\n地面波数字电视\t93984\n王思远\t93985\n土猫网\t93986\n乡路\t93987\n出头\t93988\nbaksmali\t93989\n职专\t93990\nshorten\t93991\n高景\t93992\n子路教育网\t93993\n门牙\t93994\n20150831\t93995\nbullshit\t93996\nhibernat\t93997\n浙江华友钴业股份有限公司\t93998\n吃药\t93999\n野钓\t94000\nCscope\t94001\n小奴\t94002\n南山控股\t94003\n私彩\t94004\n辽宁舰\t94005\ntongzhi\t94006\n库恩\t94007\n出川\t94008\n涮书\t94009\nComplementary\t94010\n黑字\t94011\n蔡廷锴\t94012\n备份盘
\t94013\n小77论坛\t94014\n网狐6603\t94015\n鸡爪槭\t94016\n太古地产\t94017\ntogether\t94018\n画江湖之换世门生\t94019\nplays\t94020\nSecurecrt\t94021\n老塞\t94022\n表单元素\t94023\napt-get源\t94024\n進\t94025\n界面剂\t94026\n珊瑚树\t94027\nCreating\t94028\nMp4Ba\t94029\n淮安\t94030\n马内\t94031\n公牛集团\t94032\n玖樟台\t94033\n六心\t94034\n中国建筑装饰协会\t94035\n粉丝名\t94036\nhtml5/css3\t94037\n9月21日起\t94038\n洗钱\t94039\nHashCode\t94040\n酸片\t94041\n碎掉\t94042\n4006275566\t94043\n版册\t94044\n我们的少年时代\t94045\n二维码扫描枪\t94046\n老毛桃u盘启动盘制作工具\t94047\n螺旋面\t94048\n楚阳\t94049\n第三号\t94050\nenergyplus\t94051\n杨镇\t94052\ntein\t94053\n第8个\t94054\n极限竞速7\t94055\n做最好的自己\t94056\n织业\t94057\n69期\t94058\n迅鲨\t94059\n北京汽车租赁公司\t94060\n东山中学\t94061\n卫星石化\t94062\n轻扬\t94063\n伺\t94064\n血战到底\t94065\n宇佐美\t94066\n瓮福\t94067\n弹幕式\t94068\n吓走\t94069\n绿城乌镇雅园\t94070\n智汇卡\t94071\n齐俊杰\t94072\n放空阀\t94073\n斑\t94074\n无财\t94075\n飞针\t94076\n鼎诚\t94077\n感化\t94078\n亿吨\t94079\n鞍山立山区\t94080\n7206\t94081\n烟圈\t94082\n3.0.8\t94083\n九阴绝学\t94084\nBeanShell\t94085\nべ\t94086\n后巷\t94087\n中国民用航空总局\t94088\n歪风\t94089\n卡士\t94090\n资产类\t94091\n南京市下关区\t94092\nAD原理图\t94093\n骨灰级\t94094\n可惜没如果\t94095\nLuaJIT\t94096\n安次区\t94097\n丰都新闻网\t94098\n下呼吸道感染\t94099\n叶琛\t94100\n山西小说网\t94101\n乔治马丁\t94102\nlos\t94103\n银行信贷员\t94104\n30g\t94105\n天天艾瑞泽5\t94106\n绝地求生刺激战场直播\t94107\n飞岛\t94108\n股权质押\t94109\n切成\t94110\n美国一号公路\t94111\n求精\t94112\n1.3\t94113\n马德堡\t94114\ncarla\t94115\nconditions\t94116\nsuede\t94117\n东莞市技师学院\t94118\n鄂托克前旗\t94119\nCTV\t94120\n通用机场\t94121\nuploaded\t94122\n客迷\t94123\n棣\t94124\nblack\t94125\n盐酸反应\t94126\n雪耳\t94127\n18页\t94128\n外界\t94129\n汉献帝\t94130\n岳南\t94131\n翡翠明珠\t94132\n令人震惊\t94133\n新飞度\t94134\n枪阶\t94135\n广西医科大\t94136\nFluorescent\t94137\nR100\t94138\nIPP\t94139\n撸管\t94140\n绝地求生透视自瞄\t94141\nAPAC\t94142\n经行\t94143\n1.8.10\t94144\n正前\t94145\n定向运动\t94146\n精日\t94147\n3.3v\t94148\n15.36\t94149\n18036期\t94150\n狼疮\t94151\nmatter\t94152\n蕉岭\t94153\n当前月\t94154\n20150117\t94155\n查询机\t94156\n保利香雪山\t94157\n混凝土搅拌车\t94158\n微游戏\t94159\n签字版\t94160\n黑骨茶\t94161\n二硫键\t94162\n52页\t94163\n25类\t94164\n程诺\t
94165\n东风大道\t94166\n铝扣板\t94167\n远期合约\t94168\n资格证\t94169\n射手\t94170\n中共大连市委组织部\t94171\n57期\t94172\n萨姆森\t94173\n会展办\t94174\n此情可待成追忆\t94175\n锦屏\t94176\n奇思妙想\t94177\n汇丰银行(中国)有限公司\t94178\n1323558\t94179\nISI\t94180\n华为watch2\t94181\nRequirements\t94182\n%}\t94183\n私募操盘手\t94184\nairi\t94185\n红安\t94186\n标准形\t94187\n大坡镇\t94188\n大龟\t94189\n寿比\t94190\n按开\t94191\n海口市规划局\t94192\n朱建平\t94193\n机械工业\t94194\n酥肉\t94195\n吲达帕胺片\t94196\n摸象\t94197\n塵风\t94198\n同步_\t94199\nscrambler\t94200\n沙坑\t94201\n计权\t94202\n调味汁\t94203\n154亿\t94204\n8.1%\t94205\ncobaltstrike\t94206\n替代\t94207\n邓文怡\t94208\n水浒少年\t94209\n珍酒\t94210\n合肥大学城\t94211\n周卡\t94212\n方伟\t94213\n董家口\t94214\nchin\t94215\n汉中路\t94216\n邮币卡\t94217\n人力车\t94218\nbabyzen\t94219\n限\t94220\n第37页\t94221\n水边\t94222\nps图层蒙版\t94223\nDAYZ\t94224\n龙泉股份\t94225\nSafer\t94226\nhsm\t94227\n京东闪购\t94228\nCRISIS\t94229\n下水口\t94230\n左耳\t94231\n王永民\t94232\n黄家村\t94233\n床位费\t94234\n工间\t94235\n散盘\t94236\n天瑞集团\t94237\n原阳县政府\t94238\nジュ\t94239\n道神\t94240\n深情厚谊\t94241\n兵车行\t94242\nutil\t94243\n0.43\t94244\n投标\t94245\n中国政府组织结构\t94246\n神态\t94247\n喝止\t94248\n交通事故人\t94249\n陕\t94250\n中指院\t94251\n绝不放手\t94252\n超级本吧\t94253\nFences\t94254\nECG\t94255\n博敏电子\t94256\nUSB-C接口\t94257\n七十二式\t94258\n粘滞系数\t94259\n基准日\t94260\n中国医院协会\t94261\n鄞州区\t94262\n三起\t94263\n百臻堂\t94264\n环渤海动力煤价格指数\t94265\n平安e生保\t94266\n游泳者\t94267\n项目型\t94268\n000547\t94269\n十五种\t94270\n腾讯应用中心\t94271\n远得要命的爱情\t94272\n娃哈哈集团\t94273\n隋唐英雄\t94274\n黄子东野圭吾\t94275\n东岗\t94276\n怎一个\t94277\nPreparation\t94278\n亚洲乱乱色情网\t94279\nYST\t94280\n195号\t94281\n6020\t94282\n助孕\t94283\n苗期\t94284\n石壁街\t94285\n1公斤\t94286\n销客\t94287\nsmartpls\t94288\n姚晓光\t94289\n吴宗宪\t94290\n349号\t94291\n泥岩\t94292\n直火\t94293\n戏剧史\t94294\n赫章\t94295\n脸骨\t94296\n上海之春国际音乐节\t94297\n火影忍者ONLINE\t94298\nfor循环嵌套\t94299\nGuilty\t94300\n西安文理学院\t94301\n最终痴汉电车\t94302\n爱情雨\t94303\n龙涛\t94304\n清泽\t94305\n顺丰速递\t94306\n不少于\t94307\n郡肝\t94308\nmsp430f5529\t94309\n中中\t94310\n上海威斯汀大饭店\t94311\n小鱼板\t94312\nSeason\t94313\n轩然大波\t94314\n新蜀门\t94315\n视点\t94316\n16磅\t94317\n玛维\t94318
\n徽园\t94319\nIIS7.5\t94320\na1534\t94321\n武汉富士康\t94322\n风险投资基金\t94323\nyzu\t94324\n#01\t94325\n1矩阵\t94326\n计算值\t94327\n阳台板\t94328\n慕容湮儿\t94329\nwin7电脑蓝屏\t94330\n国鼎\t94331\n投连险\t94332\n汉字释义\t94333\norp\t94334\n图鉴\t94335\n五居室\t94336\n山西省委\t94337\n人心果\t94338\nBrief\t94339\n_厘米\t94340\n腾讯地图API\t94341\nwap版\t94342\n厨房\t94343\n民办学位\t94344\n碘化银\t94345\n闷痘\t94346\n超霸\t94347\n鸽舍\t94348\n年令\t94349\n清冽\t94350\n准分子激光\t94351\n散步\t94352\n西南交通大学希望学院\t94353\n网侠苹果软件站\t94354\n吴悟无\t94355\n權威醫美領導\t94356\n津滨网\t94357\n揭阳\t94358\n小豌豆\t94359\n马屿镇\t94360\n山楂妹\t94361\nblz\t94362\npreparing\t94363\n泉边\t94364\n大悟论坛\t94365\n捷德\t94366\n广州公安局\t94367\n刀塔蓝鸟\t94368\n中国建设监理协会\t94369\n柏雪\t94370\n圣铃\t94371\nCoolorus\t94372\n裸替\t94373\nfastjson\t94374\n文人墨客\t94375\n干果\t94376\n咸菜\t94377\n师兄们\t94378\n今日15时\t94379\n宋华\t94380\n表帅\t94381\n纹路\t94382\ngnupg\t94383\n同档\t94384\n2.34\t94385\n刘安\t94386\n170702\t94387\n仪器系\t94388\nFerry\t94389\n1670\t94390\nD盘\t94391\n国清寺\t94392\n丹枫飞云\t94393\n寒食节\t94394\nmcl\t94395\n口述\t94396\n露面\t94397\n锌精矿\t94398\n西路军\t94399\n扬声\t94400\n远城区\t94401\n积善\t94402\n威乐\t94403\n总高\t94404\n试管架\t94405\n上士\t94406\nfnk\t94407\n传唱度\t94408\n倒装句\t94409\nPPT模版\t94410\n秦全耀\t94411\n孟婆三七\t94412\n南华老仙\t94413\n基金从业考试\t94414\n缴费率\t94415\n复式折线统计图\t94416\n补偿式\t94417\n努比亚布拉格\t94418\n贫僧\t94419\nReal-time\t94420\n光感版\t94421\n老肖\t94422\n梅威\t94423\n海璟\t94424\n中国网安\t94425\n三大一\t94426\n20160319\t94427\n奔驰e260\t94428\n盛情\t94429\n李小姐\t94430\ncalories\t94431\n动物农场\t94432\n济南高新区管委会\t94433\n梭子\t94434\n压簧\t94435\n刘坚\t94436\n段志强\t94437\njboot\t94438\n摩羯女\t94439\n冷冻品\t94440\n卡特诸葛亮\t94441\n平安旅游白\t94442\n晾\t94443\nwindowxp\t94444\n排位赛\t94445\n00054\t94446\n明信城\t94447\n秘而不宣\t94448\nas400\t94449\nobjectid\t94450\ntub\t94451\n金镯子\t94452\n广东狗民俱乐部\t94453\n窦店\t94454\n撸灰飞烟灭\t94455\n冷酸灵\t94456\n劳士\t94457\n听经\t94458\n李克强\t94459\n哀家\t94460\n电控锁\t94461\n豆虫\t94462\nalgebraic\t94463\n脚注\t94464\n998元\t94465\n中科大软件学院\t94466\ntaurus\t94467\n另一方面\t94468\n3.0|\t94469\n上海电网\t94470\n梅尧臣\t94471\n贝尔格莱德\t94472\n第86\t94473\n斯托克斯\t94474\nmg248q
\t94475\n姓名缘分配对测试\t94476\n非人狂想屋\t94477\n企鹅岛\t94478\nDagger2\t94479\n趵突泉\t94480\n一贝\t94481\nncss\t94482\n中国农工民主党\t94483\n2900元\t94484\n晚间新闻\t94485\n01031\t94486\nDrive\t94487\nmatches\t94488\n后付款\t94489\n清水镇\t94490\n气贯\t94491\n6月3号\t94492\n冲刷\t94493\n北京科教\t94494\n幸福时刻\t94495\n泰韩\t94496\n金融云\t94497\n白户\t94498\n郭玮\t94499\n中华建材网\t94500\n盐田\t94501\nFencing\t94502\n91创业网\t94503\n莆田文化网\t94504\n绿色消费\t94505\n98彩票网\t94506\n一万多块\t94507\n监督管理\t94508\n5.9元\t94509\n寂寞沙洲冷\t94510\n蕨\t94511\n八味\t94512\n凤凰之神\t94513\n汉川市政府\t94514\n雅思考试\t94515\n拍卖价\t94516\n游戏者\t94517\n第三方存管\t94518\n古歌\t94519\n龙海论坛\t94520\n内乡县\t94521\n昆山市第一人民医院\t94522\n平衡态\t94523\n价群\t94524\n男色\t94525\n八神太一\t94526\n太原市中心医院\t94527\n转出\t94528\n希\t94529\n山泉\t94530\n扭扭车\t94531\n真主\t94532\nLayers\t94533\n慎独\t94534\n东坝郊野公园\t94535\n张大佛爷\t94536\nes8\t94537\n尊耀版\t94538\nraces\t94539\nVx\t94540\n郑州航空\t94541\nOP&ED\t94542\n增长列\t94543\n齐白石\t94544\nRDPAC认证\t94545\n喷油漆\t94546\n处置场\t94547\n套娃\t94548\n升压\t94549\n群光\t94550\n事不关己\t94551\nsummit\t94552\n沙板\t94553\n尼布楚条约\t94554\nGB\t94555\n该子\t94556\nmdev\t94557\n常闭式防火门\t94558\n博览群书\t94559\n枪舞\t94560\n锥子\t94561\n被捅\t94562\n热弯机\t94563\n高海鹏\t94564\n子组件\t94565\n峯\t94566\n联丰村\t94567\n实装\t94568\n贡\t94569\n999感冒灵颗粒\t94570\n南京德基广场\t94571\n碳元科技\t94572\npause\t94573\n歼-20_\t94574\n风法\t94575\n野纱\t94576\n内摩擦角\t94577\n我爱车网\t94578\nddb\t94579\n电热水龙头\t94580\n双娇\t94581\n珠海市统计局\t94582\n陈倩倩\t94583\n1366x768\t94584\ncordova\t94585\nSonarLint\t94586\n电子工程专业\t94587\n天天有\t94588\n源代码\t94589\nDirectives\t94590\n小芝\t94591\n2005年1月\t94592\n天冬酰胺\t94593\ngtx750ti\t94594\n563\t94595\n力争\t94596\n经开区管委会\t94597\n紫雨\t94598\n报箱\t94599\ntreated\t94600\n糖胶\t94601\n安全月\t94602\n中共湖南省委\t94603\n七海千秋\t94604\n前些\t94605\n展商网\t94606\n假货\t94607\n郑州市水务局\t94608\n乐嘉李晟\t94609\n素材中国17素材网\t94610\n618路\t94611\n伪满洲国\t94612\n航天服\t94613\n富野\t94614\n斗地主单机版\t94615\n越野路书\t94616\n大言\t94617\nzhuhai\t94618\n右臂\t94619\n全国企\t94620\n赵丽宏\t94621\n陈瑾\t94622\n早期乳腺癌\t94623\n醴陵\t94624\nMindManager思维导图\t94625\nWindows8.1免激活版\t94626\n24c02\t94627\n圣域\t94628\n
intern\t94629\n福湘\t94630\ngovernance\t94631\n联轴节\t94632\nTCA\t94633\n交易虎\t94634\n暗血\t94635\n南极石\t94636\n麻辣GIS\t94637\n世贸广场\t94638\n卧龙电气\t94639\n跡美\t94640\nTArch\t94641\nど\t94642\n无戏言\t94643\n佳作\t94644\n边行\t94645\ng1820\t94646\n水场\t94647\n永隆银行\t94648\nCGM\t94649\n草乌\t94650\n七色\t94651\nCashier\t94652\n【香奈儿\t94653\niso-8859-1\t94654\n苏子油\t94655\n洪山路\t94656\n穿堂煞\t94657\nad-hoc\t94658\n47部\t94659\n跑男4\t94660\n南山大佛\t94661\n第75集\t94662\n信融科技\t94663\nPEGASUS\t94664\n自救器\t94665\n溜进\t94666\n时寒\t94667\n五十佳\t94668\n潇洒人生\t94669\n2万块\t94670\n谐振电路\t94671\n蛋叔\t94672\nMAPK\t94673\n花水湾\t94674\n氓\t94675\nfantaxy\t94676\n本科大学\t94677\n8999\t94678\n胎压计\t94679\n路人缘\t94680\n清肠\t94681\n灼识\t94682\n锰砂\t94683\n浅滩\t94684\n泵浦\t94685\nS_\t94686\n埃及艳\t94687\n美手\t94688\n我叫黄国盛\t94689\n武汉移动\t94690\n大彩\t94691\n睢县网\t94692\n咪表\t94693\n上海德济医院\t94694\n置评\t94695\n切音\t94696\n时簿\t94697\n易普森\t94698\n万界帝尊\t94699\n学习版\t94700\n杨萍\t94701\n高俊熙\t94702\n豆腐干\t94703\n第9卷\t94704\n百度知道\t94705\n河山\t94706\n4530\t94707\n嗲囡囡\t94708\n肉块\t94709\n三国杀online\t94710\n土木工程材料\t94711\nkali-linux\t94712\n灵希\t94713\n曹璐\t94714\nitchat\t94715\n小猪佩奇表情包\t94716\n勇敢者游戏决战丛林\t94717\n失落者\t94718\n爸妈网\t94719\n民间菜\t94720\n大众一汽\t94721\n美国芝加哥大学\t94722\n叶诗文\t94723\n16年01月\t94724\n中国森林旅游网\t94725\n为甚\t94726\n181路\t94727\n鹅蛋脸\t94728\n萌娘百科\t94729\n夜灵\t94730\n曾黎\t94731\n腐植酸钠\t94732\nCristina\t94733\n温特\t94734\nクレ\t94735\n卢燕\t94736\n刘局长\t94737\n男兔\t94738\n金庸群俠傳5+金庸无双\t94739\n井冈山经济技术开发区\t94740\nh97\t94741\n27路\t94742\n25集\t94743\n典范\t94744\n天国救赎\t94745\n样器\t94746\n制造基地\t94747\n豆豉鲮鱼\t94748\n帕萨迪纳\t94749\n魔法阵\t94750\n宋义\t94751\n璟宸\t94752\n吟咏\t94753\nxvidoes\t94754\n鞋长\t94755\n2017年3月16日\t94756\nsheer\t94757\nFAS\t94758\n10日前\t94759\n和泰\t94760\nDataTraveler\t94761\n古北口\t94762\n动态链\t94763\nvitamin\t94764\n小米达人\t94765\n3.\t94766\nLayOut\t94767\nPowder\t94768\n小马国女孩\t94769\n人字\t94770\n两男一女\t94771\n跨境电商公司\t94772\nflair\t94773\n陪侍\t94774\n深圳市市场和监督管理委员会\t94775\n王霸\t94776\n50排名\t94777\n10.7%\t94778\n吊钩\t94779\nyeye\t94780\n复方黄连素片\t94781\n上海华为\t94782\n第1讲\t94783\n神之墓地
\t94784\n创新点\t94785\nThan\t94786\n大众辉昂\t94787\ndrone\t94788\n催眠系\t94789\n北野\t94790\n李秀英\t94791\nLinux根\t94792\n临沂市人民医院\t94793\n融水\t94794\n恶作剧2吻\t94795\n恶之花\t94796\n口水疹\t94797\n爱普生l360\t94798\n翠微小学\t94799\n2.qq.com\t94800\n密约\t94801\nLEATHER\t94802\n海西房产网\t94803\n白玛瑙\t94804\n天津师大\t94805\n龙将\t94806\n鹿鞠\t94807\n周文王\t94808\n2017年前\t94809\n拍品\t94810\n奥尔\t94811\n宝妈们\t94812\n晚上8点\t94813\nILSpy\t94814\n钦北\t94815\n常宁新区\t94816\n正祥\t94817\n27000元\t94818\n金刚石微粉\t94819\n蓝燕\t94820\n入党动机\t94821\n插班生\t94822\n慈云\t94823\n无招\t94824\n大卖王\t94825\n祝捷\t94826\n秣陵\t94827\n陆风x8\t94828\n权柄\t94829\n铜钱草\t94830\n1.2.3部\t94831\n梶裕贵\t94832\nwins\t94833\n1401\t94834\n微晶石\t94835\n网页游戏卡\t94836\n骊姬\t94837\n脸麻\t94838\ni站\t94839\n淘宝售假\t94840\n371路\t94841\n佛山妇幼保健院\t94842\n卡头\t94843\n淘金\t94844\n2018-2020年\t94845\n环丙沙星\t94846\nルレ\t94847\n广播体\t94848\n牛太郎\t94849\n林超贤\t94850\n南京日报\t94851\nselectkey\t94852\n计划部\t94853\n中国民族资本主义\t94854\n温水镇\t94855\nlfs\t94856\n学宫\t94857\n兵团\t94858\n51.com\t94859\n海蓝之谜\t94860\n神秘之城寻物历险\t94861\nKhan\t94862\n有一天\t94863\nphat\t94864\nlibp\t94865\n4招\t94866\ncontained\t94867\n2016B站\t94868\nLondon\t94869\n防封\t94870\n薄荷色\t94871\n关卡\t94872\n李丽萍\t94873\n爱靓导购\t94874\n约翰·洛克\t94875\n品职\t94876\n三方协议书\t94877\n世纪杯\t94878\n赛富\t94879\n型式样\t94880\n拉维奇\t94881\n杀死一只知更鸟\t94882\n丽彩\t94883\nPoE交换机\t94884\n深圳手表厂\t94885\n中国地理学会\t94886\n苦与乐\t94887\n5000流明\t94888\n蓬莱机场\t94889\n乐扣\t94890\n北京外国语学院\t94891\n龙骨花\t94892\n昆汀\t94893\n陶器\t94894\n丰和园\t94895\nBIM网\t94896\n粉红猪小妹\t94897\n雷锋大道\t94898\nPhantom\t94899\nr11plus\t94900\n汗臭\t94901\n环氧胶\t94902\nshand\t94903\n到头\t94904\nab胶\t94905\n天母\t94906\nautowire\t94907\n金牛理财网\t94908\n血鸭\t94909\n今年三月\t94910\n石洞口\t94911\njszip\t94912\nResults\t94913\n魏文帝\t94914\nv4.9.1\t94915\n歌_九酷音乐网\t94916\n霍州市\t94917\n硅谷网\t94918\n法术\t94919\n雎\t94920\n绝经\t94921\nXVI\t94922\n小笼包\t94923\n深圳市第六人民医院\t94924\n麦德龙\t94925\nladen\t94926\nprometheus\t94927\n幻想小勇士\t94928\n软弹枪\t94929\n鸦天狗\t94930\n伊一\t94931\nsdn\t94932\n不动产\t94933\nNorth\t94934\n中国建设银行股份有限公司\t94935\n人间失格\t94936\n赛贝格\t94937\n乐场\t94
938\n上海公安\t94939\n班型\t94940\n学士路\t94941\n长月\t94942\n烫卷\t94943\n冰草\t94944\n七神\t94945\n见附件\t94946\n里\t94947\nHazelcast\t94948\n反字\t94949\nLifting\t94950\n东风御风\t94951\n2018-01-26\t94952\n欢天喜帝\t94953\n朴门\t94954\n楚少\t94955\n奎尔德拉\t94956\n20185年\t94957\n黏连\t94958\n幸福苑\t94959\n7.0版\t94960\n用膳\t94961\n血池\t94962\n浅爱\t94963\nhardness\t94964\n孔飞力\t94965\nSEUS\t94966\n擂台\t94967\n赵麟\t94968\n刀烬\t94969\n大塘\t94970\n海天酱油\t94971\n扫描仪\t94972\n纺企\t94973\n负责人\t94974\n建盏吧\t94975\n液晶电视维\t94976\n经纱\t94977\nfunctioning\t94978\ndraem0507\t94979\n长沙南站\t94980\n裆\t94981\nk680e\t94982\n博茨瓦纳\t94983\n别错\t94984\n龙亭\t94985\n小猿\t94986\nBaba\t94987\n福特级航母\t94988\n七里坪\t94989\n11player.com\t94990\n豆酱\t94991\nkontakt5\t94992\nPottermore\t94993\n声名狼藉\t94994\nRunnable\t94995\n校园行\t94996\nicm\t94997\n主变压器\t94998\n创邑\t94999\n医教园\t95000\n4.7万\t95001\n清原满族自治县\t95002\n在任\t95003\n金尊府\t95004\n修改源\t95005\n回采率\t95006\n合取\t95007\n鑫磊\t95008\n1.7G\t95009\n平泽\t95010\n小肚子\t95011\n文韬\t95012\n快要死\t95013\nworks\t95014\nsqlprompt\t95015\nCK免费电影网\t95016\n魔域\t95017\n多普\t95018\n我的女儿\t95019\n顾欣\t95020\n乌兰察布政府网\t95021\n会计档案管理办法\t95022\n唐宋明\t95023\n单恋\t95024\nzstv\t95025\n插行\t95026\n2VCD\t95027\nzenme\t95028\n烂泥\t95029\n產\t95030\n高危险\t95031\n20160717\t95032\n借款方\t95033\nrika\t95034\npotatoes\t95035\ntipo\t95036\n张靓\t95037\n中华管乐网\t95038\n少儿英语\t95039\nWare\t95040\n稀晶石手机眼镜\t95041\n新疆地区\t95042\nNotion\t95043\n四川省工商局\t95044\n砗磲\t95045\n准假\t95046\n艾诺迪亚\t95047\nlpc1768\t95048\n铁臂阿童木\t95049\n余则成\t95050\n龙之凯越\t95051\n陀罗\t95052\nTTFB\t95053\n春江花夜月\t95054\n炎狼兽\t95055\n美国联邦最高法院\t95056\n山皮石\t95057\nSher\t95058\n观后\t95059\n中轴\t95060\n钻饰\t95061\n北端\t95062\n血檀\t95063\n昨夜\t95064\nC++论坛\t95065\n财管\t95066\n广深互联帮助中心\t95067\nindienova\t95068\n五彩\t95069\n540元\t95070\n乳假\t95071\n淡影\t95072\n1941\t95073\n田山\t95074\n考察组\t95075\n折算\t95076\njame\t95077\n几百只\t95078\n今年双十一\t95079\n郑在玹\t95080\n华育\t95081\n腾讯企业邮箱免费版\t95082\n0111\t95083\n光测色仪\t95084\n心尖部\t95085\n出血线\t95086\n哑女\t95087\n抚平\t95088\n建瓯\t95089\nchen110xi\t95090\n燔\t95091\n雨湖区\t95092\n湘雅三医院\t95093\n般若波罗
蜜\t95094\n首战告捷\t95095\n耗能\t95096\n私募基金备案\t95097\n超凡入圣\t95098\nsusanws\t95099\n冀鸟\t95100\n0065\t95101\n_天涯剧社\t95102\n弘一法师\t95103\n上海晨光文具股份有限公司\t95104\n硒鼓\t95105\n萧美娘\t95106\n亚马逊雨林\t95107\n颅内压增高\t95108\n枫林\t95109\n大通证\t95110\n韩丹\t95111\n西安财经学院行知学院\t95112\n放心药房网\t95113\n侯耀华\t95114\n净高\t95115\nWhatsApp\t95116\n诗班\t95117\nwin2008server\t95118\n西撒哈拉\t95119\n荣耀花木兰\t95120\n馆舍\t95121\n_狗\t95122\n5.99万起\t95123\n东方酒店\t95124\nKVV\t95125\n1024商务网\t95126\n万州烤鱼\t95127\n百分之95\t95128\n百度问答\t95129\nabchina\t95130\n龙岩一中\t95131\n中分\t95132\n后梁\t95133\n有声听书\t95134\n碰到\t95135\n马家窑\t95136\n糖果粉碎苏打传奇\t95137\n31家\t95138\n萌动\t95139\nsemantics\t95140\n张腾\t95141\n校餐\t95142\ngalore\t95143\nmyOffer\t95144\n怒击\t95145\n咨询\t95146\n电视直播网\t95147\n高途\t95148\n帕西瓦尔\t95149\n军艺\t95150\n禅修班\t95151\n柴扉\t95152\n马里奥赛车8:豪华版\t95153\nTOP100\t95154\nm4r格式\t95155\nComplement\t95156\n海马s5\t95157\n开户许可证\t95158\n娅\t95159\n王红梅\t95160\n临桂区\t95161\n扇热\t95162\n地插座\t95163\n食梯\t95164\n玛蒂尔达\t95165\n发包人\t95166\nExtJs4\t95167\nIKEA\t95168\n沉降板\t95169\n幻想三国志2\t95170\n梅园\t95171\n淮北新闻网\t95172\n中国铁建国际城\t95173\n设计方\t95174\n代父\t95175\n维棠FLV\t95176\n赵欣瑜\t95177\n帧数\t95178\nrouter-view\t95179\n非池\t95180\n第六十五条\t95181\n上臂式\t95182\n升学e网通\t95183\n季玥\t95184\n人教版2017\t95185\nArbitrage\t95186\n超过一天\t95187\n8825\t95188\n箭在弦上\t95189\nessentials\t95190\n固形物\t95191\n极速赛车\t95192\n胡悦\t95193\n上海市教育局\t95194\n贺涵\t95195\n钥匙柜\t95196\n大列巴\t95197\n无所谓了\t95198\n广西普法网\t95199\n仙岳路\t95200\n祥源文化\t95201\n32cm\t95202\n登机牌\t95203\n莉贝琳\t95204\n中央民族大学舞蹈学院\t95205\n1226\t95206\n帝王师\t95207\n册亨县\t95208\n成都火车站\t95209\nSleeping\t95210\n兴国县\t95211\n龙潭虎穴\t95212\n鬼臼毒素酊\t95213\n电桥\t95214\n中国军网_数字报\t95215\n袁刚\t95216\n超度\t95217\nNETPAS\t95218\n企业家协会\t95219\n价服\t95220\n部分页\t95221\n黄金城\t95222\nsram\t95223\n凤于九天\t95224\n单人餐\t95225\n体弱\t95226\nEasonJ\t95227\n卖药郎\t95228\n湖北省委组织部\t95229\nccr\t95230\n古汉\t95231\nResponsive\t95232\n双龙路\t95233\n第七节\t95234\nPolarization\t95235\n询价单\t95236\n粘胶短纤\t95237\nB8L\t95238\nbpmn\t95239\n撰稿\t95240\nweil\t95241\n存在主义\t95242\nkino\t95243\n杭州建设银行\t952
44\n上海电力学院\t95245\n对战类\t95246\n心妍\t95247\n1.5元\t95248\nwtw1974\t95249\n中国妇幼保健协会\t95250\n三剑\t95251\nWiFi信号差\t95252\n秦港股份\t95253\n查控\t95254\n铜花\t95255\n丧尸围城4\t95256\nDBL\t95257\neos币\t95258\nIDA\t95259\n角形\t95260\n共担\t95261\n税率\t95262\ninputstream\t95263\nwi\t95264\n伊斯坦\t95265\n假作\t95266\n广讯通\t95267\n论证会\t95268\nBrilliant\t95269\n莫菁门\t95270\n安徽省体育局\t95271\n迷茫\t95272\n1499元\t95273\n他达拉非\t95274\n东方之约\t95275\n开工\t95276\n三3\t95277\n安邦财险\t95278\n急急急\t95279\n刘力菲\t95280\n基层组织\t95281\n秋浦\t95282\n终于等到你\t95283\n雨中曲\t95284\nruna\t95285\n失业潮\t95286\nx染色体\t95287\n化学原理\t95288\n浦东软件园\t95289\n丁力\t95290\n吉沢明歩\t95291\n查收\t95292\n千夫所指\t95293\n紫薯粉\t95294\n量子点电视\t95295\n冲模\t95296\n圆觉经\t95297\n市中医院\t95298\n左路\t95299\n河海\t95300\n周一仙\t95301\n腥\t95302\n日历年\t95303\n穷游\t95304\ndiploma\t95305\n华北油田\t95306\ncould\t95307\n花样年\t95308\nNamed\t95309\n线宽\t95310\nNetherlands\t95311\nG13\t95312\n帕拉丁二觉\t95313\n金龙机电\t95314\n封条\t95315\n840\t95316\n顺行\t95317\nxmind思维导图\t95318\n妈妈的手\t95319\n五十万\t95320\n朝阳北路\t95321\n苏宁云信\t95322\n比亚迪秦EV450\t95323\n西乡中学\t95324\n双名\t95325\n喜剧片\t95326\n直板\t95327\n厉王\t95328\n泰安市\t95329\n灯草\t95330\n广州市对外贸易经济合作局\t95331\nCharter\t95332\n黄百韬\t95333\n滑膜炎\t95334\n袁弘\t95335\n异短\t95336\n测压管\t95337\n蓝蓝\t95338\nfont.chinaz\t95339\n海贼王罗宾\t95340\n经前\t95341\n治疗率\t95342\n单人位\t95343\n文化重生之贼行天下\t95344\n何贵干\t95345\n背信\t95346\ngt730\t95347\n江苏省统计局\t95348\n192.168.1\t95349\nBitSet\t95350\n金戈\t95351\n广发银行信用卡中心\t95352\n常熟房产网\t95353\n干头\t95354\n万科麓城\t95355\n萌萌萌\t95356\n陈静仪\t95357\n荐客\t95358\n上厕所\t95359\n会计信息化\t95360\n专岗\t95361\nLinux篇\t95362\n基金法律法规\t95363\n10.3.3_\t95364\n密训班\t95365\nTTyb\t95366\n鸭嘴兽\t95367\n抛釉砖\t95368\n玉佛寺\t95369\n纸钞屋第一季\t95370\n湖南省人力资源和社会保障\t95371\n沈跃跃\t95372\nairpak\t95373\n牙印\t95374\n三公\t95375\n凹凸棒土\t95376\n革命史\t95377\n朱小平\t95378\n绕制\t95379\n中冶华天\t95380\n4399.com\t95381\n洗衣服\t95382\n拂菻坊\t95383\n王志宏\t95384\n6册\t95385\n北极星光伏招聘网\t95386\n江海博物馆\t95387\n闭会\t95388\n商丘市第一人民医院\t95389\n信长野望大志\t95390\n恐怖类\t95391\n和村\t95392\n金刚狼2\t95393\n1921年7月23日\t95394\n一起玩游戏\t95395\n峪口镇\t95396\n两万多\t95397
\n舞会\t95398\n看望\t95399\n中国钢铁人才网\t95400\n眼角\t95401\n分布式\t95402\n集成墙面网\t95403\n赵璐\t95404\n黑白屏\t95405\n国投瑞银\t95406\n恒大悦珑湾\t95407\n没救\t95408\nResources\t95409\n胜景\t95410\n名悦\t95411\nPrint\t95412\n人机关系\t95413\n4所\t95414\n恶灵退散\t95415\n目地\t95416\n何须怨\t95417\n收银管理软件\t95418\n2002路\t95419\nkagney\t95420\n泳坛\t95421\nPdf\t95422\n轩辕剑之汉\t95423\n不苦\t95424\n优衣库\t95425\n全脂奶\t95426\n夜深沉\t95427\n漳州站\t95428\n小酌\t95429\n恨意\t95430\n资产收益率\t95431\n洽\t95432\n无轴螺旋输送机\t95433\n城南区\t95434\n四川航天职业技术学院\t95435\n湖州职业技术学院\t95436\n王朗\t95437\n初始号\t95438\n抬腿\t95439\n变档\t95440\n石田\t95441\nme域名\t95442\n冻死\t95443\n每晚\t95444\n建设银行网点_金投\t95445\n工具袋\t95446\nAwake\t95447\nMixture\t95448\n康熙来了吧_\t95449\n股票质押率\t95450\n发生制\t95451\n天空之役\t95452\n立人\t95453\n邱建良\t95454\n数学科\t95455\nid验证码\t95456\n对焦屏\t95457\n华樱\t95458\n取出来\t95459\n方大城\t95460\n奔驰cla200\t95461\n科瑞石油\t95462\n千世\t95463\n存贷款利率\t95464\n爆粗\t95465\n靴\t95466\nunchanged\t95467\n踹\t95468\n相城\t95469\nneru\t95470\n磬\t95471\ntwg\t95472\n安徽省农业科学院\t95473\n30头\t95474\n胖妹\t95475\n第11\t95476\n幽人\t95477\n焦度\t95478\n华瑞凯琳\t95479\n便携式投影仪\t95480\n600708\t95481\ncoolbooter\t95482\n3615\t95483\n蒋超\t95484\n互联网金\t95485\n10.13\t95486\nMultiplier\t95487\n分度圆\t95488\n动力型\t95489\n直充\t95490\n骇人听闻\t95491\n前四后八自卸车\t95492\n288\t95493\n现代汽车\t95494\n阴道\t95495\n丁俊哈继铭\t95496\n傅家坡\t95497\n论读书\t95498\n南贤俊\t95499\n鑫江玫瑰园\t95500\n有声小说\t95501\n止锚湾\t95502\n纪念邮票\t95503\n查找\t95504\n湖北地区\t95505\n青岛装修公司\t95506\n李建斌\t95507\n玉米蛋白粉\t95508\n长沙龙虾馆\t95509\n聚季铵盐\t95510\n紫晶城\t95511\n3.16\t95512\n330li\t95513\n扣杀\t95514\nCreated\t95515\nawkward\t95516\n8690\t95517\n子非鱼\t95518\ncensus\t95519\nseed\t95520\n45型\t95521\nDIR\t95522\n橡果\t95523\nARMS\t95524\n2500U\t95525\n二里头\t95526\n2545\t95527\n党外知识分子联谊会\t95528\n挪威科技大学\t95529\nAffiliate\t95530\n产状\t95531\ntables\t95532\n7座\t95533\n挂号费\t95534\n携程酒店\t95535\n潘多拉\t95536\nWebTorrent\t95537\n丁佩\t95538\n甜叶菊\t95539\n背叛者\t95540\n养生酒\t95541\n西溪银泰城\t95542\n领导小组\t95543\n马口铁\t95544\nbershka\t95545\n森奈奈子\t95546\nautoindex\t95547\nserialVersionUID\t95548\n79届\t95549\n赤水吧\t95550\
n铅锤\t95551\nDlib\t95552\n锡业股份\t95553\nXFCE\t95554\nios6.13\t95555\n培训桌\t95556\n萌生\t95557\nes2\t95558\n施耐德钢笔\t95559\n邓男子\t95560\n福迪\t95561\nDELL\t95562\n儿童抽动症\t95563\n姜锋\t95564\n董辉\t95565\nvee\t95566\n霍乱弧菌\t95567\n转子\t95568\nCreme\t95569\nawgn\t95570\n1905.com\t95571\n多行\t95572\n蘋果\t95573\n偷跑\t95574\n王绎龙\t95575\ntrial\t95576\n美国图书馆\t95577\n如常\t95578\n俏江南\t95579\nrosie\t95580\n金钟大\t95581\n人教\t95582\n192.168.31.1\t95583\n玉山路\t95584\n苏美尔文明\t95585\n博爱路\t95586\n罗溪\t95587\n提档线\t95588\ng550\t95589\nQTCN\t95590\nSimplorer\t95591\n李亿龙\t95592\n高筋粉\t95593\nm1216nfh\t95594\n148家\t95595\nMAU\t95596\n_简\t95597\n神术\t95598\n通知公\t95599\nunjeep\t95600\n鲁诺\t95601\n孟祥伟\t95602\n250公里\t95603\nry8212\t95604\n继承\t95605\n如果\t95606\n藏宝\t95607\n凤凰茶\t95608\n两足\t95609\n延生\t95610\nLinker\t95611\n第二十五集\t95612\n杨志军\t95613\n问号\t95614\n有你在\t95615\n多囊卵巢综合症\t95616\n受贿罪\t95617\n成佛\t95618\n戛纳广告节\t95619\n33家\t95620\n机字\t95621\n篆字\t95622\nSquad\t95623\n铁炉堡\t95624\nUnix\t95625\n新材料科技有限公司\t95626\n240元\t95627\n鱼片\t95628\n天津北\t95629\n北京首都创业集团有限公司\t95630\n气象报文\t95631\n五十文\t95632\n2406\t95633\nsmart3d\t95634\n复合影响因子\t95635\n陈老师\t95636\n002185\t95637\n比重瓶\t95638\nGTX860M\t95639\n找女\t95640\n中科院海西研究院\t95641\n汉中兴汉新区\t95642\n严小雨\t95643\n雀圣\t95644\n张云逸\t95645\n一朵云\t95646\nreservations\t95647\n供应链管理专业\t95648\n张贤亮\t95649\n剥削\t95650\n5022\t95651\n北京定点医院\t95652\n朝阳广场\t95653\n男人色\t95654\n阿鬼\t95655\navtb\t95656\n宝库\t95657\n家道\t95658\n巴西莓\t95659\n满族\t95660\n五子\t95661\n流型\t95662\n袁东\t95663\n佳能500d\t95664\n集邮信息网\t95665\nuc吧_\t95666\n三鲁公路\t95667\n文本域\t95668\n回龙观\t95669\n子宫颈癌\t95670\n核磁\t95671\n连体钞\t95672\n未可\t95673\n心肾\t95674\n2246\t95675\n玖月奇迹\t95676\n曹植\t95677\n自驾\t95678\nisolation\t95679\n中山四路\t95680\n煲粥\t95681\n经典轩逸\t95682\n光伏业\t95683\n米图\t95684\n大航海4\t95685\n锦绣谷\t95686\n桐原\t95687\n002600\t95688\n爬行者\t95689\n帕菲特\t95690\n茯苓粉\t95691\ne6500\t95692\n上海地铁1号线\t95693\n20140609\t95694\n啰嗦\t95695\n深圳湾学校\t95696\n捐赠者\t95697\n崇福镇\t95698\nPorous\t95699\n潘家口水库\t95700\n梅里亚\t95701\n鬼将\t95702\n上海财大\t95703\n日语综合教程\t95704\n空港经济区\t95705\n茜公\t9
5706\n生源地贷款\t95707\n经史子集\t95708\n语句\t95709\n宫洺\t95710\n三礼\t95711\n20匹\t95712\nEF\t95713\n杀虫\t95714\n火罐\t95715\n隧道窑\t95716\nminicom\t95717\nTPU\t95718\n国际机\t95719\n会员证\t95720\n未定义标识符\t95721\nEdraw\t95722\n钟洁\t95723\n海军节\t95724\n黄原\t95725\n病毒木马\t95726\n744\t95727\n职工带薪年休假条例\t95728\n徐悲鸿\t95729\n鲁科\t95730\n注册会计师考试\t95731\njsa\t95732\n中性\t95733\n太和吧\t95734\n百兽\t95735\n蛇妻\t95736\n拆分盘\t95737\n少华\t95738\n守陵\t95739\n做噩梦\t95740\n镍板\t95741\n江正军\t95742\n机码\t95743\n人件\t95744\n党管\t95745\n轨道\t95746\n万里茶道\t95747\n李森茂\t95748\n辐射式\t95749\n总发\t95750\n美涂士\t95751\n猎头公司\t95752\n八期\t95753\n手放\t95754\njcraft\t95755\n养老院\t95756\n隐念笎\t95757\nAsync\t95758\n山西省科技厅\t95759\n动物儿歌\t95760\n比得\t95761\n普通\t95762\n穿串机\t95763\n金匮公园\t95764\n大变身\t95765\ndesert\t95766\n任务群\t95767\n挫折\t95768\n李坑\t95769\n九月\t95770\nFocal\t95771\n天天魔域\t95772\n别太\t95773\n知识\t95774\n高清国\t95775\nedexcel\t95776\n妥塞敏\t95777\nPREP\t95778\n徐飞\t95779\n晋位\t95780\n滑触线\t95781\n金龙\t95782\nanya\t95783\n空孕催乳剂\t95784\n主板\t95785\n青少版)\t95786\n虐心爱情故事\t95787\n迅游网游加速器\t95788\n2018年1月1日\t95789\n一百只\t95790\n油封\t95791\nP8H61\t95792\n放置\t95793\n百度富文本\t95794\nNFC\t95795\n追妻\t95796\n雅兮网\t95797\n动箱\t95798\n菠萝包\t95799\n欧瑞莲\t95800\nAndrea\t95801\n倍特\t95802\n小米4a\t95803\n南京市文广新局\t95804\n换届纪律\t95805\nliju\t95806\n刃牙吧\t95807\n欧进萍\t95808\n紫红色\t95809\n卢台长\t95810\n庞瀚辰\t95811\n麦当\t95812\n三千尺\t95813\n李立峰\t95814\n遇见你之前\t95815\n广发中证\t95816\n新斯科舍省\t95817\n莫代尔棉\t95818\n三星A5\t95819\n小米note3吧_\t95820\nNETSH\t95821\n横杆\t95822\n反倒\t95823\n盐城市公安局\t95824\n河北教育网\t95825\n出缺\t95826\n团\t95827\nExpert\t95828\n成都府\t95829\nidea14\t95830\n严苛\t95831\n敲章\t95832\n集势\t95833\nUnit8\t95834\n北京市海淀区人民法院\t95835\n数式\t95836\n中央储备粮\t95837\n自由裁量权\t95838\n高中锋\t95839\norz\t95840\n岗头\t95841\n用功\t95842\n恒大江湾\t95843\n日历表\t95844\n八座\t95845\n21例\t95846\n东田\t95847\nooopic\t95848\n变频恒压\t95849\n温州市人力资源和社会保障局\t95850\n上海音乐会\t95851\n终止执行\t95852\n音乐之都\t95853\n广东政协\t95854\n贵安新区\t95855\ns14\t95856\n0.0000\t95857\n电神魔傀2\t95858\n将帅\t95859\n三山区\t95860\nMorettie\t95861\n冲积\t95862\n龙华烈士陵园\t95863\nmaine\t95864\n
上海工商职业技术学院\t95865\nKIS\t95866\n民族大学\t95867\n实惠\t95868\n泳装照\t95869\n左颜\t95870\n冯忠宏\t95871\n龙翔\t95872\nSis\t95873\n821路\t95874\n华盛顿大学\t95875\n驯染\t95876\n江哥\t95877\n曹孟德\t95878\n鬼王宗\t95879\n拳皇97\t95880\nsartorius\t95881\ntizzyt\t95882\n1280元\t95883\n反黑\t95884\nagrj\t95885\n电动势\t95886\n小儿\t95887\n钱隆\t95888\n35平\t95889\n水钻机\t95890\n0010\t95891\n疯狂的麦克斯4:狂暴之路\t95892\n腹心\t95893\n惊鸟\t95894\n弥尘\t95895\niPX\t95896\n沟口\t95897\n泄泻\t95898\nValgrind\t95899\n107家\t95900\n英数\t95901\n黄药\t95902\n2016年5月14日\t95903\n澳亚\t95904\n违规\t95905\n路冲\t95906\n深圳市宝安区人民法院\t95907\n毗卢寺\t95908\ndna亲子鉴定\t95909\n胃复春片\t95910\n假冒小姐\t95911\nkindom\t95912\n海利亚\t95913\ncmr\t95914\n4046\t95915\n天津论坛_汽车之家论坛\t95916\n38亿元\t95917\n云尚\t95918\n莎朗斯通\t95919\n导盲犬\t95920\n520万\t95921\n江苏理工\t95922\n龙柱\t95923\n河南省人民政府办公厅\t95924\n皮棉\t95925\nnoun\t95926\n万军\t95927\n12式\t95928\n一警\t95929\n16年02月\t95930\n6300HQ\t95931\nnosuch\t95932\n现代密码学\t95933\nCALayer\t95934\n黄页88建材网\t95935\n不惑之年\t95936\n对你好\t95937\n冷压机\t95938\n学生运动\t95939\n湿疹膏\t95940\n肉渣\t95941\n疆界\t95942\n12A\t95943\nunsplash\t95944\n坐标\t95945\n旅途的花样\t95946\n瑞银信\t95947\n工作包\t95948\n长宁\t95949\n影音先锋撸\t95950\n千龙科技\t95951\n技艺\t95952\n华夏医界网\t95953\nApproach\t95954\n闻名遐迩\t95955\n玉子\t95956\nwti\t95957\n朱标\t95958\n悪\t95959\nCE家族社\t95960\nBat\t95961\n2200年\t95962\n振作起来\t95963\n英汉词典_词典网\t95964\n秀丽\t95965\nA7RIII\t95966\n愈\t95967\n淘宝卖家交流论坛社区网\t95968\nb4\t95969\nCouch\t95970\n竟是\t95971\n金塘镇\t95972\n河北省儿童医院\t95973\n杨澜访\t95974\n或许\t95975\n草滩\t95976\n谈房论市\t95977\n仙霞西路\t95978\n茵莱湖\t95979\n美芝股份\t95980\n26章\t95981\n角角\t95982\nXV\t95983\n213\t95984\n_牛仔网\t95985\n呼百应公司\t95986\n急性肺水肿\t95987\n桃源洞\t95988\n红山网\t95989\n创业社区\t95990\n鲁贝\t95991\n侧翼\t95992\n落日圆\t95993\n联通\t95994\n奥利司他\t95995\n拜访\t95996\n第56届\t95997\n天道留学\t95998\n配戏\t95999\n第16轮\t96000\n私募基金登记备案\t96001\n1873\t96002\n傻冒\t96003\n济南遥墙国际机场\t96004\n博导\t96005\n连载\t96006\n2015-01-20\t96007\n6月25日\t96008\n财科\t96009\n蓝天幼儿园\t96010\n厅局级\t96011\n内浮顶罐\t96012\n1157\t96013\nJiaxing\t96014\nexecve\t96015\n侧链\t96016\n1.2cm\t96017\nhuayang\t96018\n小憩\
t96019\n郑州日报\t96020\nrecourse\t96021\n李晗\t96022\n21.13\t96023\n113\t96024\n滚珠丝杆\t96025\n新巴塞尔协议\t96026\n279号\t96027\nBriefing\t96028\npexels\t96029\n甘肃省交通运输厅\t96030\n连绵不断\t96031\nHappy222\t96032\n保山新闻网\t96033\n艾丽西亚\t96034\nqlv格式转换MP4\t96035\nrecombinant\t96036\n中梁集团\t96037\n太子\t96038\n岳欣\t96039\n疤痕\t96040\nReadme\t96041\n江浦路\t96042\nnameserver\t96043\nWORD模板\t96044\n清江鱼\t96045\n中远船务\t96046\n解恨\t96047\n中央财大\t96048\nセレブ\t96049\n天狼\t96050\n输煤\t96051\n三2号\t96052\n大勺\t96053\n阿成\t96054\nC++编译器\t96055\nreplicas\t96056\n妙蛙\t96057\n晋江市\t96058\n火圈\t96059\n4匹\t96060\n双新\t96061\nV19\t96062\n懵懵\t96063\n孟军\t96064\n守望之海\t96065\n牛皮纸\t96066\n少阳\t96067\n天蓬元帅\t96068\n莲香楼\t96069\nlinux内核\t96070\n3炽翎\t96071\n紫晶\t96072\n枝桠\t96073\n转通\t96074\n30起\t96075\n货部\t96076\n洛浦街\t96077\n_参考网\t96078\n排风井\t96079\n二硝基苯肼\t96080\n凯美瑞双擎\t96081\n橘园洲\t96082\n重症\t96083\nb座\t96084\nObesity\t96085\nYulia\t96086\n198\t96087\npearl\t96088\n瑛\t96089\n有色技术网\t96090\n脱敏剂\t96091\n孙立新\t96092\nPandas篇\t96093\n危及\t96094\n普碳\t96095\n云盒\t96096\nggdown\t96097\nalexandra\t96098\n刺疼\t96099\nCalibre\t96100\n剑神\t96101\n分一分\t96102\n施耐德电气\t96103\n双非学校\t96104\n东原城\t96105\n牙缝\t96106\n特维斯\t96107\n朱明瑛\t96108\n襄阳市政府\t96109\n法阵\t96110\n热器\t96111\n蜀素帖\t96112\n多年后\t96113\n苏芊\t96114\nsong\t96115\nOCZ\t96116\n40本\t96117\n总帖\t96118\n水螅\t96119\n新思科技\t96120\n公猫母猫\t96121\n柏尚\t96122\n淘狗网\t96123\n八篇\t96124\necx\t96125\n崇宁重宝\t96126\n祖地\t96127\n臆想\t96128\n血统\t96129\n圣三国\t96130\ngown\t96131\n解放军报社\t96132\n温国辉\t96133\n自然日\t96134\nTrang\t96135\n盂\t96136\nlegit\t96137\n小儿止咳糖浆\t96138\n骑乘位\t96139\n金盏花\t96140\n寄售\t96141\nイ\t96142\n杨丽坤\t96143\n浙江大学后勤集团\t96144\n250%\t96145\n撞库\t96146\ne540\t96147\n沢田纲吉\t96148\n0925\t96149\n红烧鱼块\t96150\n细柳\t96151\n花花牛\t96152\n宠物鼠\t96153\nSequelize\t96154\n渠道商\t96155\n15.05.1\t96156\n天河路\t96157\n超级机器人大战v\t96158\njQ\t96159\nacupuncture\t96160\n木筷\t96161\n郁美\t96162\n胶带机\t96163\n36个月\t96164\n破碎虚空\t96165\n蒸煮\t96166\n天天射影\t96167\n洛南县人民政府\t96168\nMMA综合格斗\t96169\nddns\t96170\n白目\t96171\n卡巴拉\t96172\n法汉\t96173\n轻钢龙骨吊顶\t96174\n长崎\t96
175\n北金所\t96176\n9碗\t96177\n重\t96178\n宁府\t96179\n简历\t96180\nweblogic11g\t96181\n张明炜\t96182\n10余次\t96183\n绝地求生枪\t96184\n板泉路\t96185\n筛网\t96186\n北脸\t96187\n江青\t96188\n倒点\t96189\n210mm\t96190\n工业发展有限公司\t96191\n坚强\t96192\n/A\t96193\n天草\t96194\nDoS\t96195\n陶家岭\t96196\n夜深\t96197\n大放\t96198\n巫家\t96199\n百合外国语学校\t96200\n抛荒\t96201\n交易商\t96202\n胡荣华\t96203\n某日\t96204\n豆们\t96205\nattachment\t96206\n金太狼\t96207\n煤炭网\t96208\nshux\t96209\n风雨兰\t96210\n中央财经大学\t96211\n天涯明月\t96212\n比斯特购物村\t96213\n芜房网\t96214\n2.3元\t96215\nEXCEL2003\t96216\n持续\t96217\n埠头\t96218\n大中\t96219\n艾瑞达\t96220\n星际战争\t96221\n扣税计算器\t96222\n播放盒\t96223\n城乡\t96224\nHDZIMU\t96225\n多动症\t96226\n陕西省肿瘤医院\t96227\nheep\t96228\ncodewu\t96229\n深康村\t96230\n重庆机电职业技术学院\t96231\nSVDVD\t96232\n小作\t96233\n13.5%\t96234\n氧化钇\t96235\n底肥\t96236\nVLFeat\t96237\n木樨地\t96238\nSeveral\t96239\n韩杰文\t96240\n人言\t96241\nARMA\t96242\n明泰铝业\t96243\n渭源\t96244\n_花镇\t96245\n金宥真\t96246\nZDNet\t96247\nShylock\t96248\n青海师范大学\t96249\n妄想\t96250\n新贵\t96251\n东郊\t96252\n自我救赎\t96253\n不怀\t96254\n建盏\t96255\n500kV\t96256\n查狄伦\t96257\nArcGIS\t96258\n五游网\t96259\n新德里\t96260\nwww.yz88.cn\t96261\nxusir\t96262\ngabor\t96263\nadvertise\t96264\n北京办公室\t96265\n易控\t96266\n哈仙岛\t96267\n填出\t96268\n西藏新闻网\t96269\n蔡氏\t96270\n北辰世纪中心\t96271\n摇摇乐\t96272\nalongside\t96273\n太原市公安局\t96274\nlonely\t96275\n保卫部\t96276\nMHD\t96277\n十片\t96278\n多特瑞\t96279\n鉴定人\t96280\n嘉诚\t96281\n威尼斯酒店\t96282\n之窗\t96283\nBetter\t96284\n56mama\t96285\n棍王\t96286\nurlen\t96287\naxture\t96288\n新户\t96289\n文明5:美丽新世界\t96290\n小伶\t96291\n小手工\t96292\n超市\t96293\n肌肌\t96294\n国华\t96295\nSM文\t96296\nUeditor\t96297\n战神4_战神4\t96298\n环境卫生学\t96299\nLib\t96300\n不排水\t96301\n维生素b1\t96302\n老鸡\t96303\n百团大战\t96304\n九酷音乐网\t96305\n雷刚\t96306\n锦程物流网\t96307\n右侧\t96308\nfifaonline3吧\t96309\n海贼世界\t96310\n首盘\t96311\n民调局异闻录后传\t96312\n欲滴\t96313\n聚众斗殴罪\t96314\nFreeswitch\t96315\n吉姆\t96316\n欧拉函数\t96317\n9毫米\t96318\n被证实\t96319\n四平市\t96320\n合肥市蜀山区政府\t96321\njcuan\t96322\n盗汗\t96323\n气柜\t96324\n据调查\t96325\ncrass\t96326\n一生\t96327\n霍夫圆\t96328\n唐山大地震\t9632
9\n梦幻海\t96330\nWD\t96331\n跳字\t96332\n克孜勒苏柯尔克孜自治州人民政府\t96333\ninfiniti\t96334\n本田歌诗图\t96335\n精耕\t96336\n恒昌汇\t96337\nBuilders\t96338\nh.264\t96339\n非\t96340\n赵健\t96341\n林莉\t96342\n03月\t96343\n小米手机\t96344\n太想爱你\t96345\n卿相\t96346\n天龙八部游戏\t96347\n玄乎\t96348\n陈树湘\t96349\n02:00\t96350\nTwink\t96351\n井通\t96352\nkeyspace\t96353\n仙居新闻网\t96354\n起落\t96355\nPLS-00103\t96356\n香菇汤\t96357\n楼顶花园\t96358\n同卷\t96359\n淘宝客\t96360\n高玲\t96361\nFAL\t96362\n深挖\t96363\n@_@50\t96364\nI-94\t96365\n松冈茉优\t96366\ninu\t96367\n电子级\t96368\n整容液\t96369\n侍郎\t96370\n内江新闻网\t96371\n货币\t96372\neu260\t96373\n毫不动摇\t96374\n玩人\t96375\n魔枢\t96376\n竹兜\t96377\n扭秧歌\t96378\n和尚头\t96379\n文英\t96380\n1424\t96381\n调音量\t96382\n白金币\t96383\n胜平\t96384\n负载均衡\t96385\n品界\t96386\n2^3\t96387\n剃刀边缘\t96388\n胸鳍\t96389\n终有一死\t96390\n魏婴\t96391\nPicker\t96392\nbtc\t96393\n运维\t96394\n国家安全生产监督管理局\t96395\n空屋\t96396\n少年梦\t96397\n岩土\t96398\n不能说\t96399\n吴能\t96400\n食谱秀\t96401\n挂靠网\t96402\n广州市建筑集团有限公司\t96403\n北京建设银行\t96404\n陈省身\t96405\n城市发展\t96406\nStudio2.0\t96407\nPowerISO\t96408\n2016年9月份\t96409\n转塘\t96410\n歌尔\t96411\n3余数\t96412\n林明\t96413\n之后\t96414\ngogo\t96415\n吊式\t96416\n安徽省图书馆\t96417\n白土\t96418\n碧昂\t96419\n重庆大学建设管理与房地产学院\t96420\n终极一战\t96421\nchisel\t96422\n微点\t96423\n詹子晴\t96424\n自立门户\t96425\nmp2014\t96426\n花嫁尼禄\t96427\n广西人力资源和社会保障厅\t96428\nspzs\t96429\nHVAC\t96430\n汪培珽\t96431\n行动派\t96432\n呆瓜\t96433\n祸从天降\t96434\n逗鱼\t96435\n贴合机\t96436\n点道梦里花落\t96437\n沈阳幼儿园\t96438\n张京\t96439\nnRF24L01\t96440\nSLAM\t96441\ncouchbase\t96442\nknowledge\t96443\nキス\t96444\nsegregation\t96445\n立川理恵\t96446\n恒河\t96447\n离子交换器\t96448\n河南财经政法大学\t96449\n均方根值\t96450\n健步鞋\t96451\nwok\t96452\n四米\t96453\n贝蒂斯\t96454\njdbctype\t96455\n爱同行\t96456\n真相大白\t96457\nヘ\t96458\nExit\t96459\n棉花\t96460\n屋内\t96461\n乙肝小三阳\t96462\n刘松年\t96463\n二手手机网\t96464\n车尾气\t96465\n懊恼\t96466\n情热\t96467\n电摆舞\t96468\n气楼\t96469\nboomerang\t96470\n秋晚\t96471\n全天\t96472\n徽州\t96473\n买车购置税\t96474\n黑暗侵袭2\t96475\n她们\t96476\nimag\t96477\n塞尔达无双\t96478\n总裁在上我在下\t96479\n_邻医网\t96480\n明尼阿波利斯\t96481\n诚者\t96482\n二尺\t96
483\n数十次\t96484\n二户\t96485\nideapad310s\t96486\n宝安区人民医院\t96487\n防撞垫\t96488\n游谱\t96489\n天官撸啊撸\t96490\n0.13\t96491\n刘潭\t96492\nCDRX7\t96493\n佑安医院\t96494\n美神\t96495\n海西晨报\t96496\n剪彩\t96497\n18036\t96498\n微软SQL\t96499\n承兑贴现\t96500\n型状\t96501\nCIFS\t96502\n绵阳中学\t96503\n300场\t96504\nV7\t96505\n凉州\t96506\n第14\t96507\n银钞\t96508\n陈之汉\t96509\n片外\t96510\n海天注塑机\t96511\n金喜\t96512\n真人快打xl\t96513\n前海航交所\t96514\nRIN\t96515\nAnimals\t96516\n伤寒论\t96517\n善复\t96518\n光汇石油\t96519\n天兆\t96520\n伊利丹·怒风\t96521\n倒霉\t96522\n别喊\t96523\n命书\t96524\n名刹\t96525\n互帮\t96526\n甲状腺乳状腺癌\t96527\nnit\t96528\n屏蔽线\t96529\n秦皇岛开发区\t96530\n尾联\t96531\n韩善花\t96532\nMinGW\t96533\n井冈山路\t96534\n导盲\t96535\n愚昧\t96536\n玫瑰花蕾\t96537\n红玉海\t96538\n福田体育公园\t96539\n办公椅\t96540\n格值\t96541\nSana\t96542\n272万源\t96543\nBack\t96544\n高能少年团\t96545\n城市管理综合执法局\t96546\n残垣断壁\t96547\n红枣银耳汤\t96548\n生龙活虎\t96549\n三九手机网\t96550\nmatlab2014a\t96551\n磷肥\t96552\n王瀚\t96553\n黑金花\t96554\nuedbet\t96555\n鱼箱\t96556\n三相桥式全控整流电路\t96557\n1.1G\t96558\n伦敦大学国王学院\t96559\n环西湖\t96560\n承包制\t96561\n4月23号\t96562\n家常版\t96563\n五秒钟\t96564\n王建宇\t96565\n在人间\t96566\n吴淞江\t96567\nyanse\t96568\nCITS\t96569\nergo\t96570\nnethack\t96571\nrs3\t96572\n17款名\t96573\n华夏幸福基业\t96574\n邓小刚\t96575\n20片\t96576\n坦桑尼亚\t96577\n炫神\t96578\nHumanity\t96579\n100m\t96580\n微信公众服务\t96581\n电气箱\t96582\nAMD速龙\t96583\n外国语\t96584\n上巳\t96585\n力挽狂澜\t96586\n中大五院\t96587\n纲要\t96588\n采招网\t96589\n西子奥的斯\t96590\n羽毛球女\t96591\noci\t96592\n泉水\t96593\n下套\t96594\nCobra\t96595\nDAY2\t96596\n海美迪Q5\t96597\n商业保理\t96598\n皂君庙\t96599\n西盟县\t96600\nDAEMON\t96601\nPlates\t96602\n达川\t96603\n齐鲁\t96604\n2017年6月2日\t96605\n浩辰CAD\t96606\nEach\t96607\nIslam\t96608\n9650\t96609\nyjv\t96610\n全章\t96611\nFNS\t96612\n專賣\t96613\nMINECRAFT\t96614\n零5\t96615\n江苏省注册会计师协会\t96616\n流不止\t96617\n洗面乳\t96618\n颂词\t96619\n谭颖\t96620\n印刷经营许可证\t96621\n竖向\t96622\n鄞州实验中学\t96623\n隔音棉\t96624\n扼守\t96625\n江阴论坛\t96626\n21CN\t96627\n罗马人的故事\t96628\n球友\t96629\n颖儿\t96630\nkontakt\t96631\n干洗剂\t96632\n12.23\t96633\n凌风云\t96634\n鸟瞰\t96635\nSNAPSHOT\t96636\n云岩区\t96637\n
夜礼服\t96638\ngmt\t96639\n货币型基金\t96640\nTsingke\t96641\n介观\t96642\n金庸群侠传吧\t96643\nRecruit\t96644\n森林中\t96645\n内厝澳\t96646\n死星\t96647\n2015年06月\t96648\n8美元\t96649\n笑声\t96650\n熊亮\t96651\n最近五年\t96652\n杀手包\t96653\n永寿县人民政府\t96654\n蔓荆子\t96655\n成都军区总医院\t96656\npeakfit\t96657\n太爷爷\t96658\ncros\t96659\n渭南市\t96660\n高价\t96661\nDisplayPort\t96662\nGH5S\t96663\n25号\t96664\n婚介所\t96665\nEllipse\t96666\n影速科技\t96667\n北京西红门\t96668\n肝热\t96669\n罗湖国贸\t96670\n清表\t96671\n普利策\t96672\n乐购超市\t96673\n淮师\t96674\n骓\t96675\n六要素\t96676\n超过15分钟\t96677\nexplicitly\t96678\n下等\t96679\nimsoft\t96680\n输掉\t96681\n覆水\t96682\n資生堂\t96683\nsmartaudio\t96684\n理财规划师_金融理财公司\t96685\nshp9500\t96686\n攀\t96687\n睿力\t96688\n济\t96689\n曹\t96690\n郁金香运动\t96691\n龙马传\t96692\nnorthwood\t96693\n你好嘢\t96694\nmojito\t96695\ngovcn\t96696\n纸钱\t96697\n深圳第二人民医院\t96698\n聊城开发区\t96699\n英雄梦\t96700\nKENWOOD\t96701\n博益网\t96702\nSparkContext\t96703\n不好惹\t96704\n袁亚湘\t96705\n古石\t96706\nCgroup\t96707\ngmm\t96708\n心灵史\t96709\n颔联\t96710\n快播电影有声小说网\t96711\n快速版\t96712\nPRP\t96713\n陨星\t96714\nzanui\t96715\n凤凰文化广场\t96716\n晶清\t96717\n二十世纪初\t96718\n胡慧\t96719\n黑死病\t96720\n拉娜\t96721\n应用层\t96722\ndsg\t96723\n陈天石\t96724\n2016年12月17日\t96725\n身弱\t96726\nTRADING\t96727\n01234\t96728\n超流行\t96729\n建业壹号城邦\t96730\nbabycare\t96731\n胶乳\t96732\n十漫个\t96733\n武田弘光\t96734\n长途旅行\t96735\ngenghuan\t96736\n波尔凯尼恩\t96737\n期权定价模型\t96738\n地黄丸\t96739\nBOMB\t96740\n艾格\t96741\nsecretion\t96742\n分裂机\t96743\n可视电话\t96744\n某键\t96745\n恋母\t96746\n荀攸\t96747\n升龙汇金中心\t96748\n溶化\t96749\n95522诈骗\t96750\n病毒营销\t96751\n虎丘婚纱城\t96752\n诸界网游之倒行逆施\t96753\nrobo3t\t96754\n拔模斜度\t96755\n迈科\t96756\n斯诺克大师赛\t96757\n游走性\t96758\nnps\t96759\n三国页游\t96760\n证券投顾\t96761\n0994\t96762\n海豚音\t96763\n小米盒子2\t96764\n魇\t96765\n阳台花园\t96766\ngta5卡\t96767\n兔儿\t96768\n东南汽车\t96769\n谢振华\t96770\n尤斯蒂娅\t96771\n龙文章\t96772\n140W\t96773\n超肉\t96774\n央\t96775\n呆毛王\t96776\n三和\t96777\n朱传奇\t96778\niso27001\t96779\n阻尼比\t96780\n金属基复合材料\t96781\n219\t96782\n任建华\t96783\n针眼\t96784\nstoneniqiu\t96785\n40位\t96786\n君人\t96787\n开心论坛\t96788\n洗鼻器\t96
789\nMatcher\t96790\n200万元\t96791\n攻其不备\t96792\n恶战\t96793\n壹品娱乐网\t96794\n园路\t96795\nTFA\t96796\n婺源县\t96797\n神塔\t96798\n湿式离合器\t96799\nqide\t96800\njineng\t96801\n和和\t96802\n水晶光电\t96803\nqq区\t96804\n流纺\t96805\nMAKE\t96806\n南海十三郎\t96807\n月光花\t96808\n口袋妖怪漆黑的魅影5.0\t96809\n4500万元\t96810\n塔什库尔干县\t96811\n歌尔股份\t96812\n有所得\t96813\n20180129\t96814\nonlytease\t96815\n发物\t96816\n暮光之城2\t96817\n两优一先\t96818\n久岚\t96819\n纾解\t96820\nduba\t96821\n第2216章\t96822\ncracks\t96823\n郑志\t96824\n国标\t96825\nTEXT函数\t96826\nAerial\t96827\n东星斑\t96828\n学术性\t96829\n中铁十五局集团有限公司\t96830\n连贯\t96831\n东京旅游\t96832\n360抢票王\t96833\n周凡\t96834\n亚太财产保险有限公司\t96835\n峻\t96836\n概预算\t96837\n结转\t96838\n裂器\t96839\n崇明堡镇\t96840\n中欧\t96841\n美津浓\t96842\nEmoji表情\t96843\n114路\t96844\n王依萌\t96845\n尸案\t96846\nrespected\t96847\n女神异闻录3携带版\t96848\n万峰林\t96849\n王振东\t96850\n三相五线制\t96851\n国子派\t96852\n都市情感\t96853\n钛度\t96854\n6E\t96855\n李奥瑞克\t96856\n星期一到星期五\t96857\n梦网\t96858\n拉瓦\t96859\n柜\t96860\n石棉橡胶板\t96861\n李德毅\t96862\n急件\t96863\nJiao\t96864\n362路\t96865\n20160403\t96866\noricon\t96867\n望京\t96868\n纯乐\t96869\n自由飞翔\t96870\nAWG\t96871\n俞伯牙\t96872\n众多大\t96873\n风水学院\t96874\n700点\t96875\nctd\t96876\n车手\t96877\n餐刀\t96878\n手抛网\t96879\n第14页\t96880\n深圳平安大厦\t96881\n冲裁模\t96882\n洗衣机\t96883\n宽点\t96884\n桃园三结义\t96885\n索洛\t96886\n四川天邑康和通信股份有限公司\t96887\n四分之一决赛\t96888\n连图\t96889\n无以复加\t96890\n李市镇\t96891\n陈凌\t96892\n熔岩蛋糕\t96893\n68级\t96894\n顺德区人民政府\t96895\n桌垫\t96896\nIC,IC\t96897\n20151103\t96898\n江源区\t96899\n尊声\t96900\n春素\t96901\n田纳西州\t96902\n爱母\t96903\nSpots\t96904\nhcy\t96905\n1024BT核工厂\t96906\n中华人民共和国环境影响评价法\t96907\n单阳\t96908\n维维股份\t96909\n兽群\t96910\n风雅钱塘\t96911\n导线载流量\t96912\n名站\t96913\n凡士林\t96914\n_抚顺吧\t96915\n恒泰集团\t96916\nxe2\t96917\n产出物\t96918\n彩卷\t96919\n仓促\t96920\n人造地球\t96921\n圆括号\t96922\nDreams\t96923\n80片\t96924\ninm\t96925\n卤肉\t96926\n巴氏\t96927\n曲奇饼\t96928\n为何时\t96929\n洪荒少年猎艳录\t96930\nluoyang\t96931\n几百家\t96932\nmmd\t96933\n人事行政部\t96934\n清迈府\t96935\n669\t96936\n拓乐\t96937\n15厘米\t96938\n日美贸易战\t96939\n跨考\t96940\ntp-link路由器\t96941\n子集\t96942\nE
3-1231\t96943\nLP仿传奇单机版\t96944\n百世物流科技(中国)有限公司\t96945\nnavigationbar\t96946\n奥黛丽赫\t96947\n云托管\t96948\n酒师\t96949\n痛片\t96950\nncp\t96951\n147分\t96952\n锦绣未央\t96953\n渗碳淬火\t96954\n标准筛\t96955\n基本演绎法\t96956\n阿佤\t96957\n依仗\t96958\n加百利\t96959\n颁证\t96960\n保有量\t96961\n伊百丽\t96962\n|华\t96963\n枭龙\t96964\n雅轩\t96965\n479号\t96966\n7Plus\t96967\n李炜\t96968\nd\t96969\nFIO\t96970\n肖尔\t96971\n无人机能\t96972\n兄友弟恭\t96973\n公有制\t96974\n画虎\t96975\n投到\t96976\nmysqldump\t96977\n小农经济\t96978\nCNYW\t96979\n硅胶板\t96980\n陈静瑜\t96981\n楼梯段\t96982\nyotaphone\t96983\n三标\t96984\n返生餐单\t96985\n638\t96986\n钵\t96987\n知音\t96988\n北极星电力商务通\t96989\ntraceback\t96990\n白玉蟾\t96991\n相约\t96992\n高辉\t96993\n缓步\t96994\nps工具栏\t96995\n眼泡\t96996\nan}\t96997\n梦环\t96998\n残留量\t96999\n凤梨酥\t97000\nftth\t97001\n按摩仪\t97002\n小馋猫\t97003\n8月\t97004\n薄荷岛\t97005\n点火线\t97006\n相宜相克\t97007\n文学院\t97008\n上海地铁18号线\t97009\n2017年底\t97010\nPaths\t97011\n变形警车珀利\t97012\n亏本\t97013\n实力派\t97014\n报喜鸟\t97015\n好鲜\t97016\n吉林省肿瘤医院\t97017\n斗气\t97018\n中国画\t97019\n泉溪镇\t97020\n合肥市规划局\t97021\n缙\t97022\npk吧\t97023\nhttppost\t97024\nts流\t97025\n分解版\t97026\n博彩业\t97027\ncathay\t97028\n合久必分\t97029\n李村\t97030\n成渝城市群\t97031\n21IC\t97032\n脚钉\t97033\nabcde\t97034\n雨蝶\t97035\n良乡医院\t97036\nGaytube\t97037\naxle\t97038\nBreakout\t97039\n罢工\t97040\n入司\t97041\n轩辕剑3外传天之痕\t97042\nEgo\t97043\n赵德\t97044\nMSH\t97045\n利丰\t97046\n安达利尔\t97047\n俗语\t97048\n富阳\t97049\nsco\t97050\n蜂巢式\t97051\n宋彬\t97052\n蝉翼\t97053\nAvira\t97054\n手绘100网\t97055\n附魔书\t97056\n铸造业\t97057\n吾日\t97058\n末班车\t97059\n样癌\t97060\n河粉\t97061\n顶管机\t97062\n上海出版印刷高等专科学校\t97063\nBELL\t97064\n笑园\t97065\nfaild\t97066\n锂电池组\t97067\n扫图\t97068\n激活码\t97069\n王小源\t97070\n硝酸铝\t97071\n伯利克里\t97072\n20171007\t97073\n二线制\t97074\nGraphics2D\t97075\n抗过敏药\t97076\n伯邑考\t97077\n硝基\t97078\n李志强\t97079\n双闭环\t97080\n画画图\t97081\nwin32k.sys\t97082\ndvt\t97083\n农高区\t97084\nPC蛋蛋\t97085\n不再\t97086\n原始\t97087\n客家话\t97088\nvalidating\t97089\n力帆集团\t97090\n米露\t97091\n几味\t97092\n丰林集团\t97093\n足迹\t97094\n9553\t97095\n第九十五章\t97096\n肌瘤\t97097\n甘家口大厦\t97098\n广河\t
97099\n通州运河\t97100\n蒙中\t97101\n心理健康教育\t97102\n孟小蓓\t97103\n前进路\t97104\n微软亚太研发集团\t97105\n六双\t97106\n杨锐\t97107\nindependence\t97108\n飘香藤\t97109\n2014年5月份\t97110\n复古式\t97111\n1两个\t97112\n洗面机\t97113\n红庙村\t97114\n孔隙度\t97115\n合化\t97116\n华士\t97117\n格式转换\t97118\n怒火攻心1\t97119\n龙净环保\t97120\n暗影格斗2\t97121\n南坊\t97122\n舒兰\t97123\nhostonly\t97124\n揭穿\t97125\nUpwork\t97126\nzook\t97127\n4子\t97128\n第二十一\t97129\nrolling\t97130\n对称加密算法\t97131\n宝石山\t97132\n確認\t97133\n残保\t97134\nvin码\t97135\nGiraffe\t97136\n摇摇杯\t97137\n由来故事\t97138\n张山\t97139\nfacetune\t97140\n_妈妈网\t97141\nSadistic\t97142\n南瓜籽\t97143\n河南水利与环境职业学院\t97144\n膝袜\t97145\nhe\t97146\n志汇\t97147\n76万\t97148\ngey\t97149\n阵营\t97150\nLADY\t97151\nKeynote\t97152\nchrome卡顿\t97153\n聘\t97154\nV_\t97155\n三甲基氯硅烷\t97156\n阿华田\t97157\n苏州酒店\t97158\n10包\t97159\n工建\t97160\nliverpool\t97161\n三槐堂\t97162\nSAFE\t97163\n60421381\t97164\n莲花社区\t97165\n武汉大学人民医院\t97166\n刀剑\t97167\n拍卖会\t97168\n索然\t97169\n重庆市发展和改革委员会\t97170\n雨花亭\t97171\n助业\t97172\n科技管理研究\t97173\n重夺\t97174\n享受型\t97175\n一起玩转\t97176\n纽西\t97177\n布卡\t97178\n无由\t97179\ndictionaries\t97180\n致病\t97181\n独股\t97182\n大轰炸\t97183\n现金流量分析\t97184\nsystick\t97185\n港澳台办公室\t97186\n指望\t97187\n建筑圈网\t97188\n雅思兰黛\t97189\n人力\t97190\n壮年\t97191\n利淘网\t97192\n国家安监局\t97193\nシルバ\t97194\nAmateurs\t97195\n罗茨鼓风机\t97196\n成事在人\t97197\nhajime\t97198\nvyou\t97199\nwow7\t97200\n好花\t97201\n西关街\t97202\n放映员\t97203\n14.10\t97204\n中岛京子\t97205\n欢度\t97206\n王迪\t97207\n三险\t97208\n乔诗语\t97209\n言听计从\t97210\n火箭联盟\t97211\n个人化\t97212\nClash\t97213\n白马非马\t97214\n次卡\t97215\n有无\t97216\n拉人\t97217\n收集贴\t97218\n中共山西省委组织部\t97219\nExtjs4\t97220\nasax\t97221\n乐刻\t97222\n孙佳仁\t97223\n鲸类\t97224\n贷后检查\t97225\n时代城\t97226\n火与冰\t97227\n隽秀\t97228\n李萨如\t97229\n中国船舶重工集团\t97230\n椰梦长廊\t97231\ncurrentstyle\t97232\n别理\t97233\n杨树鹏\t97234\nds160\t97235\nTouTiao\t97236\n卫佳\t97237\nKingsLanding\t97238\n食马者\t97239\n纳川股份\t97240\nifly\t97241\nrng\t97242\n小奏鸣曲\t97243\n银鳕鱼\t97244\nJython\t97245\nRG\t97246\n虎女\t97247\nWaka\t97248\n直流变换器\t97249\n第9类\t97250\n霍英洛丽塔\t97251\nBoot
入门\t97252\n化工英才网\t97253\nEbooks\t97254\nHololens\t97255\n子系统\t97256\nsubscribe\t97257\n川端康成\t97258\n5why\t97259\n溴化氢\t97260\nsuperme\t97261\npreferred\t97262\nsmashbox\t97263\n108个\t97264\n谗言\t97265\n万达城\t97266\n学员\t97267\n中国养鸡网\t97268\n起色\t97269\n木兰镇\t97270\n狼吻夜惊魂\t97271\nDeere\t97272\n计量经济模型\t97273\n大众速腾\t97274\n爱丽丝\t97275\n菜梗\t97276\n漂发\t97277\n85周年\t97278\n寻爱\t97279\n中南大\t97280\n落位\t97281\n伪随机数\t97282\n投融\t97283\n暗指\t97284\n鼠目\t97285\n会计实务\t97286\n善业\t97287\n中华少年儿童慈善救助基金会\t97288\n7688\t97289\nPIE\t97290\n微信支付\t97291\n泽尻李雨霏\t97292\n国家大学科技园\t97293\n婚天喜地\t97294\n逢魔\t97295\n鱼群\t97296\naTool\t97297\n广州开发区\t97298\nGCJ02\t97299\n张志红\t97300\n加美拉\t97301\n易建周凯旋\t97302\n左权\t97303\n林鑫\t97304\n圣埃蒂安\t97305\n欢朋\t97306\nawake\t97307\n邮编查询网\t97308\n玲珑心\t97309\n起重机\t97310\n莫西子\t97311\nposit\t97312\ndavid\t97313\n李盈盈\t97314\n第35届\t97315\n环镇北路\t97316\n闯江湖\t97317\n动力工程及工程热物理\t97318\n80块\t97319\n消失模\t97320\n1568\t97321\n南京西\t97322\n拆迁户\t97323\n陈太太\t97324\npedia\t97325\npolo衫\t97326\n容颜\t97327\nvix\t97328\nARPU\t97329\nseach\t97330\n中兴威武\t97331\n大有讲究\t97332\nvi设计\t97333\n县人大常委会\t97334\n进爵\t97335\n15万\t97336\n夏宫\t97337\nnsurlsession\t97338\n温岭日报\t97339\n孟村县\t97340\ntextArea\t97341\n宽禁带\t97342\n语文园地七\t97343\n2首\t97344\n谢希仁\t97345\n雪默\t97346\n科学技术协会\t97347\n新巴巴运动网\t97348\n索尼A7RII\t97349\n对仗\t97350\nUnity3d\t97351\n38公里\t97352\ntypemonkey\t97353\nweird\t97354\n上家\t97355\n热凝器\t97356\n空心人\t97357\nwww.36dj.com\t97358\n陈东升\t97359\nboston\t97360\n丘比特\t97361\nsponsored\t97362\nxinyu\t97363\nJPS\t97364\n晒图\t97365\nNo.1格價\t97366\n解锁器\t97367\n泉州机场\t97368\nU2000\t97369\n奥运五环\t97370\n欧共体\t97371\n红学\t97372\n太尼玛菜\t97373\n社规\t97374\nPeak\t97375\n蝴蝶展\t97376\n西藏大学\t97377\nps7.0\t97378\n忆苦思甜\t97379\n科帕\t97380\n矿业展\t97381\n效劳\t97382\n1.26\t97383\n发布页\t97384\n大和尚\t97385\n冷料\t97386\n高开区\t97387\nidesktop\t97388\n极品飞车19吧\t97389\nば\t97390\n悬崖村\t97391\nbeet\t97392\n第12期\t97393\n重庆动物园\t97394\n随借\t97395\n联想G470\t97396\n各支路\t97397\nTopshop\t97398\n定界\t97399\n陈伟群\t97400\nBarry\t97401\n九夜\t97402\n家本\t97403\n一石\t97404
\n卑鄙\t97405\n_牙\t97406\n固德威\t97407\n租赁权\t97408\nimpurity\t97409\n巴比松\t97410\n汉方药\t97411\nOpenDaylight\t97412\n农业服务中心\t97413\n糖豆网\t97414\n三国无双5\t97415\n衣匠\t97416\neta\t97417\n1500w\t97418\n4.65\t97419\n两碗\t97420\n阅图\t97421\n醉红尘\t97422\nGT3\t97423\n遗尿\t97424\n血雨腥风\t97425\n轻质砖\t97426\n换来\t97427\n多乐士多彩\t97428\nromer\t97429\n汤峪温泉\t97430\n五笔型\t97431\n雷公藤多苷片\t97432\n黄萍\t97433\n夏口\t97434\n支払\t97435\nGVOD\t97436\n分解炉\t97437\n胃下垂\t97438\ntpm\t97439\n易门县\t97440\n叉车工\t97441\n永久居留\t97442\nsurgical\t97443\n复来作文网\t97444\n洪福齐天\t97445\n龙湖天璞名邸\t97446\n余世\t97447\n不可待\t97448\nWallpapers\t97449\n一集\t97450\n王充闾\t97451\n保靖县\t97452\n说服力\t97453\n調教\t97454\n金沙城\t97455\n赛马网\t97456\n1914年\t97457\n威士顿\t97458\n沙鹰\t97459\n能源与动力工程\t97460\nSTYLENANDA\t97461\n美图淘淘\t97462\nlichming\t97463\n2H2O\t97464\n糖人\t97465\n半联动\t97466\n2010年10月\t97467\nuint64_t\t97468\n中级通信工程师\t97469\n秘恋\t97470\n索要\t97471\nTourism\t97472\n第40关\t97473\n外发\t97474\n混淆视听\t97475\n20170921\t97476\n途观雪铁龙c6\t97477\n园子温\t97478\n魏龙\t97479\nPostgresql数据库\t97480\n圩日\t97481\n宋秀岩\t97482\n网游戏\t97483\n铜版\t97484\nUnravel\t97485\nLOCTITE\t97486\n敏\t97487\nAficio\t97488\n酸气\t97489\n馆际\t97490\n双鹭\t97491\n0.91\t97492\n纸膜\t97493\npenis\t97494\n汪涵卡特\t97495\n各人\t97496\n超声波模块\t97497\nDemo-CSDN\t97498\n请勿入\t97499\n西北部\t97500\n香网\t97501\n阿昔洛韦乳膏\t97502\n埃森\t97503\n惠农\t97504\n周炎\t97505\nconcert\t97506\n北干街道\t97507\nbara\t97508\n连云港机场\t97509\n昆仑加油卡\t97510\n索斯\t97511\n表面式\t97512\ntack\t97513\n北京杜莎夫人蜡像馆\t97514\nmenhera酱表情包\t97515\n屁衣\t97516\n元谋\t97517\n鹏海制药\t97518\n6208\t97519\n大精\t97520\ncpld\t97521\n金玉彬\t97522\nnnet\t97523\n通讯业\t97524\n__________\t97525\n工作岗位\t97526\nBitShares\t97527\n棒棒奶酪\t97528\nOracle11g数据库\t97529\n256号\t97530\n微软Azure\t97531\n川沙路\t97532\n王佳宁\t97533\n长沙汽车西站\t97534\n亲力亲\t97535\n10000名\t97536\nPresto\t97537\n元人\t97538\n1171\t97539\ncand\t97540\nChau\t97541\n磷矿石\t97542\n乔雅\t97543\nb75m\t97544\nAP课程\t97545\n129五\t97546\n结焦\t97547\n双排桩\t97548\n头癣\t97549\nApplet\t97550\n窘\t97551\n桂政办\t97552\n麻色\t97553\n急性肾炎\t97554\n合并方\t97555\n哈东站\t97556\nAlle
nChou\t97557\n80041198\t97558\n9260ac\t97559\n事展\t97560\n高斯函数\t97561\n和睦相处\t97562\n1709\t97563\n帝星\t97564\nestimating\t97565\n万类\t97566\n燕南\t97567\n多样化\t97568\n饥不择食\t97569\n知音漫客\t97570\n超声波探伤\t97571\n吉列锋\t97572\n帕奇斯\t97573\n九正\t97574\n灌封\t97575\n49元\t97576\n无机及分析化学\t97577\nV2.0.00_\t97578\n9800GT\t97579\n霉运\t97580\n流放之路过滤器\t97581\n人体穴位网\t97582\n聚汇\t97583\nInfor\t97584\n点法\t97585\ngmbh\t97586\n手机欢乐斗地主\t97587\n第59届\t97588\n2.4元\t97589\n唐王\t97590\n徐矿集团\t97591\n动效\t97592\nAoi\t97593\nC2M\t97594\n金刚骷髅岛\t97595\n凌晨12点\t97596\n氢氧化钙\t97597\n魔魂\t97598\n博众\t97599\n金一\t97600\n第四军医大学\t97601\n火箭弹\t97602\n腾讯QQ\t97603\n贾汉\t97604\n艾力达\t97605\n侠盗猎车3\t97606\n望望\t97607\n历代\t97608\norpg\t97609\n321路\t97610\nf660\t97611\n竹石\t97612\n趾高气扬\t97613\n大叶性肺炎\t97614\ncsw\t97615\n深宝\t97616\n说不爱\t97617\n海棠花园\t97618\n服装材料\t97619\nVIVO\t97620\n冯丹\t97621\n河湟\t97622\n平原路\t97623\n90亿\t97624\n浸出\t97625\n同表\t97626\n托尼斯塔克\t97627\nJoe\t97628\n动作条\t97629\n盛美\t97630\n顶星\t97631\n考辛斯\t97632\n呦\t97633\n执明\t97634\n破气\t97635\n欢乐一家亲\t97636\n刘万祥\t97637\n碟簧\t97638\n魔典\t97639\n三星电脑\t97640\nsharedpreferences\t97641\n中国期货市场监控中心\t97642\ncosb\t97643\n小令\t97644\n一筹\t97645\n24千米\t97646\n量价齐跌\t97647\n失恋\t97648\norientation\t97649\n通\t97650\n相害\t97651\n北汽幻速S6\t97652\nDAREU\t97653\n韩国釜山\t97654\nhtcvive\t97655\n欲界\t97656\n马越\t97657\n2+1\t97658\n先兆流产\t97659\nGBK编码\t97660\noffice2011\t97661\n鲑\t97662\n浙江传媒大学\t97663\n每日新闻报\t97664\n鸡泽\t97665\ngeotiff\t97666\n塞达尔\t97667\nSDO\t97668\n平方米\t97669\n卡思\t97670\n赠款\t97671\n地平\t97672\nlibnids\t97673\n18分\t97674\n易鑫集团\t97675\n千一\t97676\n刘涵\t97677\n阳光电源\t97678\n智华\t97679\nJSOUP\t97680\n辽源市\t97681\n少战\t97682\n阴道瘙痒\t97683\n可及\t97684\n中外公路\t97685\nkeyshot\t97686\n4T\t97687\n壮族\t97688\n未完待续\t97689\n5%\t97690\n电塔\t97691\n红歌会\t97692\n东北军\t97693\n库尔勒市\t97694\n夜语\t97695\n色情\t97696\n望月\t97697\n楚州区\t97698\n特遣队\t97699\n量词\t97700\n谢添\t97701\n全区\t97702\n陈文彬\t97703\n憎水\t97704\nomegle\t97705\n千载\t97706\n气柱\t97707\noffice2016吧\t97708\n21世纪初\t97709\n2250xxxx\t97710\n正则项\t97711\n机械\t97712\n红枣茶\t97713\n药锦\t9771
4\n足球场\t97715\nV信\t97716\n扽\t97717\n合肥房产网\t97718\n两岁多\t97719\n2015-2017年\t97720\n糖手工吐司\t97721\nfuser\t97722\n中关村银行\t97723\n万悦城\t97724\n行李费\t97725\n896\t97726\nPOLY\t97727\n充电站\t97728\n阿卡丁\t97729\n望江\t97730\n百叶\t97731\n河南省电力公司\t97732\n招投\t97733\n青饼\t97734\n刘赏\t97735\n波包\t97736\n600383\t97737\n甘比\t97738\n朱总\t97739\n卡帧\t97740\ngoole\t97741\npuripara\t97742\n猫九\t97743\nwriters\t97744\n重庆谈判\t97745\n罗宾逊\t97746\nselected\t97747\n花龟\t97748\n唐喜成\t97749\n攥\t97750\n姜撞奶\t97751\n字据\t97752\nBPP\t97753\n王树燚\t97754\nlecloudxx\t97755\n人才交流服务中心\t97756\n青光眼手术\t97757\n鸿合i学\t97758\n盾安集团\t97759\n上原花恋\t97760\n人力资源社会保障\t97761\n又一次\t97762\nPython+Selenium\t97763\n高大尚\t97764\n徐行\t97765\n喜剧人\t97766\n聚合物基复合材料\t97767\nrita\t97768\n安危\t97769\nQQ讨论组\t97770\nLame\t97771\n途观1.8T\t97772\n琼绯\t97773\n百草堂\t97774\nIJCAI\t97775\n才字\t97776\n牝犬\t97777\n第2弹\t97778\n盆柜\t97779\nJennifer\t97780\n并求\t97781\n外科男\t97782\n涟水论坛\t97783\n群口\t97784\nconsolidation\t97785\n深圳地铁1号线\t97786\n腾牛\t97787\n破壁松花粉\t97788\n军演\t97789\n河源源城区\t97790\n科定\t97791\n达人\t97792\n刘永清\t97793\n贝尔莱德\t97794\n漫香郡\t97795\nyep\t97796\n李恒欣\t97797\ndistinguish\t97798\n贵阳市区\t97799\nsahara\t97800\nRCFans\t97801\n清理小区\t97802\n航天科工三院\t97803\n程嘉敏\t97804\n短漫\t97805\nGB1\t97806\n浮\t97807\ndevpress\t97808\nsterling\t97809\n短距\t97810\n人化\t97811\n韩硕\t97812\nsdram\t97813\n北京人事考试网\t97814\n机电一体化专业\t97815\n矿企\t97816\n妾室\t97817\n龟王\t97818\n導\t97819\n测试集\t97820\n彤彤\t97821\n哥本哈根大学\t97822\n鬼灯的冷彻吧\t97823\nwaking\t97824\n热水炉\t97825\n双重身份\t97826\n福建银行\t97827\n吴淞炮台湾湿地森林公园\t97828\n润滑脂\t97829\n坐拥\t97830\nfunky\t97831\n数据条\t97832\nTIMES\t97833\n二程\t97834\n出入\t97835\nBeach\t97836\nnuttx\t97837\n中山市人民医院\t97838\n倍速链\t97839\nEntities\t97840\n大摆锤\t97841\n167\t97842\n善事\t97843\n湖州市住房公积金管理中心\t97844\n血浓于水\t97845\n9季\t97846\n大奎\t97847\n施洋\t97848\n千层金金融网\t97849\n雇佣合同\t97850\n回明\t97851\n122届\t97852\n2139\t97853\n槽机\t97854\n南京紫峰大厦\t97855\n始创\t97856\n谢总\t97857\n中电科\t97858\n小西湖\t97859\n0828\t97860\nFF14#\t97861\n安华招标网\t97862\nReceipt\t97863\n打粉\t97864\nPRL\t97865\n微语文\t97866\nsheet3\t978
67\nT43\t97868\n曹先生\t97869\n姚瑶\t97870\n脱肛\t97871\n独子\t97872\nDRDS\t97873\n性感\t97874\n咸鱼王\t97875\n冀东油田\t97876\n5311\t97877\n债主\t97878\nOPENCV\t97879\n中央电视台少儿频道\t97880\n化氏\t97881\n种牛\t97882\n奇幻潮\t97883\n小本成龙\t97884\n马克思\t97885\nnccn\t97886\n60款\t97887\n成人学\t97888\nblackbox\t97889\nHadoop\t97890\nui模板\t97891\n临川\t97892\n瘦西湖风景区\t97893\n星河集团\t97894\n編\t97895\nHEMA\t97896\n戴杭林\t97897\n保存者\t97898\n淋漓尽致\t97899\nArmadillo\t97900\n语音播放器\t97901\n青春最好时\t97902\n雄雌\t97903\n任意\t97904\n豺狼\t97905\n工学院\t97906\n弯弯的月亮\t97907\n多亿元\t97908\n李代沫\t97909\n小花匠\t97910\n3.0L\t97911\n盔甲架\t97912\n三亚新闻网\t97913\nMental\t97914\n底限\t97915\n惺惺相惜\t97916\n603018\t97917\n梦想机\t97918\n刺灸法\t97919\n火线\t97920\nPHPEMS\t97921\n影像仪\t97922\n重庆小面\t97923\nfolding\t97924\nvfs\t97925\n传武\t97926\n鸽巢\t97927\n海心\t97928\n湖南人事考试网\t97929\n箭头函数\t97930\n集品\t97931\n永恒空间\t97932\n猴男\t97933\nㄐ\t97934\n中航机电\t97935\n大罐\t97936\nCP+\t97937\n蜘蛛侠3\t97938\n530万\t97939\n一遭\t97940\n物联网展\t97941\n大同煤矿集团公司\t97942\n上海新视界眼科医院\t97943\n马瑞芳\t97944\n网站百科\t97945\n人餐\t97946\n安溪县\t97947\n救生筏\t97948\nmastercam\t97949\nNGUI研究院\t97950\n江洋畈生态公园\t97951\nFraser\t97952\n表男\t97953\n丹尼尔克雷格\t97954\n冢\t97955\n太好了\t97956\n武汉同济医院\t97957\n春蜜\t97958\nEverybody\t97959\n风投\t97960\n凌霄城\t97961\n纯享版\t97962\n7.0.86\t97963\n一例\t97964\n陈沛\t97965\n488\t97966\n玛氏食品(中国)有限公司\t97967\ner\t97968\n360安全云盘\t97969\n芝山\t97970\n海北\t97971\n窃窃私语\t97972\n600166\t97973\n一大片\t97974\nshu\t97975\n小宝\t97976\n细高跟鞋\t97977\n版纳\t97978\n济南高新万达\t97979\n罗斯特\t97980\nICP证\t97981\n故事稿\t97982\n豆苗\t97983\n关盘\t97984\n办理\t97985\n折桂令\t97986\n呼玛县\t97987\nM27\t97988\n史纲\t97989\n刘进\t97990\nTESCO\t97991\n国家AAAA级旅游景区\t97992\n没资格\t97993\n影之刃\t97994\n梅斯\t97995\n奇迹\t97996\n签到表\t97997\n核价\t97998\n王兆力\t97999\n热丝\t98000\n防暑降温费\t98001\n九方店\t98002\n恶性\t98003\n北京恒昌惠诚信息咨询有限公司\t98004\ncustomization\t98005\n河\t98006\n生活\t98007\n微门户-zz\t98008\n33333\t98009\n0.3cm\t98010\n定梁\t98011\n道人\t98012\n家教育\t98013\n同类\t98014\n安徒生童话故事全集\t98015\n每斤\t98016\n美记\t98017\n扶智\t98018\nNokia\t98019\n延边大学\t98020\n10平方厘米\t98021\n康定情歌\t98022\n食积\t
98023\n66张\t98024\n南京地铁4号线\t98025\n201610\t98026\n角铁\t98027\nBD超清\t98028\nManBetX\t98029\nmod函数\t98030\n显视\t98031\n花珠\t98032\njianjian\t98033\n足见\t98034\n布尔\t98035\n复写率\t98036\n失效期\t98037\n麻生\t98038\nwzzkaifa\t98039\n张国伟\t98040\n8L\t98041\n崔敏静\t98042\n轻餐\t98043\n发热贴\t98044\nSpool\t98045\nodor\t98046\nAnnotated\t98047\n恒易融理财\t98048\n工业路由器\t98049\nvcruntime140.dll\t98050\n3.2.11\t98051\n宝城\t98052\n安居乐业网\t98053\n希露薇\t98054\nbelong\t98055\n吉林大学文学院\t98056\n综合实践活动课\t98057\n十个小时\t98058\n2010款\t98059\nSAAS\t98060\n湘家荡\t98061\n读书笔\t98062\nBennett\t98063\n唐代\t98064\n土地使用权出让金\t98065\n镇江日报\t98066\n人教部\t98067\n皮带式\t98068\nima\t98069\n佰钧成\t98070\n生态链\t98071\n洛克希德\t98072\n朝阳区法院\t98073\nEOS\t98074\nSNEC\t98075\n蜂蜜茶\t98076\nRatios\t98077\n中华人民共和国保险法\t98078\n爹\t98079\nODAC\t98080\nAASTOCKS\t98081\n寻医问药专家网\t98082\n神词\t98083\n黑魔女\t98084\nsolve函数\t98085\n魏智\t98086\n哈萨克吧\t98087\nHop\t98088\n侯寨\t98089\n沙梨\t98090\n专题\t98091\n新乐\t98092\n程序管理器\t98093\n8棵\t98094\n五步\t98095\nfactor\t98096\n务求\t98097\n喷涂\t98098\n有染\t98099\n弯矩图\t98100\n0064\t98101\n年老\t98102\n龙头村\t98103\n上市公司收购管理办法\t98104\n2.0子\t98105\n奥迪起亚k9\t98106\n无射\t98107\n人教高中\t98108\nwww,sao8o8ocon\t98109\nachieved\t98110\n浙江龙盛\t98111\n数千元\t98112\n汇算清缴\t98113\n222_\t98114\nrJava\t98115\n爱如\t98116\n雪铁龙c3xr\t98117\n心悦海岛\t98118\nsupra\t98119\n针阀\t98120\n九都荟\t98121\nv0.1.7\t98122\n大黄山\t98123\n柯尔伯格\t98124\n意识形态工作责任制实施办法\t98125\n线性分类器\t98126\n打版\t98127\n吉祥\t98128\n彩塘镇\t98129\n行政级\t98130\nMEET\t98131\n新局长\t98132\nequb\t98133\n4.1.2\t98134\n瞄\t98135\n3.9万\t98136\nandroidx86吧\t98137\n160727\t98138\n议书\t98139\n叠\t98140\n315曝光\t98141\n娃娃裙\t98142\n清官\t98143\n洗剂\t98144\n平花\t98145\n前站\t98146\n综拓\t98147\ncimage\t98148\n随机数函数\t98149\n甜悦读网\t98150\njubao\t98151\n上速\t98152\n永年县\t98153\n德信地产\t98154\n六律\t98155\n谦卦\t98156\n向导\t98157\n排挡杆\t98158\n中国外汇交易中心\t98159\n独照\t98160\ndreamwaver\t98161\n纳氏试剂分光光度法\t98162\n豫章书院\t98163\n捷克共和国\t98164\nAided\t98165\n洪迈\t98166\n株距\t98167\n棘轮\t98168\napproaching\t98169\n燕窝糕\t98170\n遮天\t98171\nWindows/Linux\t98172\n13招\t98173\n色母
\t98174\n腾\t98175\n模拟电子技术\t98176\n可名\t98177\nTSI330\t98178\n吹来\t98179\n北京奥运会\t98180\nqhw\t98181\n最高人民法院研究室\t98182\ny67\t98183\n神州土地网\t98184\n荷柏瑞\t98185\n工企\t98186\n三六五\t98187\n0.045\t98188\n冲压油\t98189\n眼皮子\t98190\n相距\t98191\nshirts\t98192\nVSTS\t98193\n0598\t98194\n非对称性\t98195\n赫山政府网\t98196\n河南广电\t98197\n旁系\t98198\n刘桦\t98199\n3522\t98200\n府第\t98201\n保单贷款\t98202\n加热圈\t98203\n星际争霸2:自由之翼\t98204\n双人联机\t98205\n绣金匾\t98206\n藤木直人\t98207\n拖车绳\t98208\n伊丽莎白二世\t98209\n陈振明\t98210\n韩峰\t98211\n香港\t98212\nRTA\t98213\n实德\t98214\n唐家璇\t98215\n迅雷加速器\t98216\n5555\t98217\n扑杀\t98218\n复航\t98219\n太阳能光伏电站\t98220\n亚马逊卖家论坛\t98221\nKarlie\t98222\n去角质\t98223\n3dm\t98224\n屠戮\t98225\n百度营销中心\t98226\nmultisim论坛\t98227\n点券\t98228\n餐后血糖\t98229\n苏州住房公积金管理中心\t98230\n国家发展改革委员会\t98231\n戊戌六君子\t98232\n匹马\t98233\n如织\t98234\n五次&#160\t98235\nMould\t98236\n宋岳庭\t98237\nFedora\t98238\n艰巨\t98239\nepay\t98240\n奔跑吧兄\t98241\n二岁\t98242\n蒂芙尼蓝\t98243\n更具有\t98244\nPMO\t98245\nft232\t98246\n老树\t98247\n多男\t98248\n梦幻西游2口袋版\t98249\nforerunner\t98250\n若干个\t98251\n仓田保昭\t98252\nweleda\t98253\n1~2年\t98254\n军恋\t98255\n男英雄\t98256\n包耳\t98257\n放长\t98258\n既是\t98259\n瑞安市人民医院\t98260\n6k\t98261\npostscript\t98262\n60枚\t98263\n美晨\t98264\nGRR\t98265\ncrit\t98266\n霎时\t98267\n君生\t98268\n怡化\t98269\n广州亚美信息科技有限公司\t98270\nM24\t98271\n导程\t98272\nparasolid\t98273\n花束\t98274\n20150917\t98275\n乱七八糟\t98276\n合肥西站\t98277\n黄蚬子\t98278\n陀思妥耶夫斯基\t98279\nBZOJ\t98280\n无塔供水器\t98281\n兰定筠\t98282\n弯弯曲曲\t98283\nipn\t98284\n报名号\t98285\n200篇\t98286\n爱笑会议室\t98287\notn\t98288\n藏龙福地\t98289\n香蜜公园\t98290\n心电图\t98291\n申贷\t98292\n河砂\t98293\ncourt\t98294\n国食\t98295\n承受能力\t98296\nUNWIRE\t98297\nunepic\t98298\n2N\t98299\nCCl4\t98300\n管泽元\t98301\n孙志超\t98302\n拓野3\t98303\nAngular5\t98304\n南无\t98305\n温州志通管业有限公司\t98306\n风雪夜\t98307\n饭桶\t98308\nRoger\t98309\n2发\t98310\n1.69\t98311\n宝龙街道\t98312\n浴火银河\t98313\n电焊工\t98314\nMeg\t98315\n局机关\t98316\n采采\t98317\n水域\t98318\n顶岗实习\t98319\n搁浅\t98320\n铝箔纸\t98321\n笑林\t98322\n王茂\t98323\n打击\t98324\n井陉\t98325\n电子鼻\t98326\n87.cn\t98327\n密歇根州\t98328\
n狂派\t98329\nSEO自学网\t98330\ndevil\t98331\n供求信息网\t98332\nYuria\t98333\n乐趣公园\t98334\n0317\t98335\n长生草\t98336\nPrius\t98337\n暮江吟\t98338\n4迅雷\t98339\nlamy2000\t98340\n几页\t98341\n盏\t98342\n电笔\t98343\n谷氨酰胺转氨酶\t98344\n中毒\t98345\nsang\t98346\nuC\t98347\n20150819\t98348\n非赛季\t98349\nroland\t98350\n德嘉\t98351\n氧代\t98352\n腾讯游戏安全中心\t98353\n6类\t98354\n乐谱网\t98355\n18043\t98356\n参杂\t98357\n邯郸县\t98358\n90岁\t98359\n佟掌柜\t98360\nhuh\t98361\n海龙\t98362\n抄\t98363\n头筹\t98364\n林格尔\t98365\n测线仪\t98366\n老用户\t98367\n康宁玻璃\t98368\n錯過\t98369\n家徒\t98370\n10000张\t98371\n中国水利报\t98372\n青松股份\t98373\n橡\t98374\n李锐莫妮卡贝鲁奇\t98375\n票价\t98376\ntianya\t98377\n空调泵\t98378\n清明饼\t98379\n电感式传感器\t98380\n漂白水\t98381\n音皇\t98382\n华联商厦\t98383\n恶魔眼\t98384\n小姑\t98385\n辛德勒\t98386\n抛丸机\t98387\n无锡梅村\t98388\n金卡纸\t98389\n山楂汁\t98390\n掺\t98391\n沈恩京\t98392\n垄断者\t98393\n_酒店运营区\t98394\n佳能g1800\t98395\n一站通\t98396\n二四六天天\t98397\n第54条\t98398\n卡福\t98399\n如皋日报\t98400\n月下独酌\t98401\n中国平安财产保险股份有限公司\t98402\n例文\t98403\n穿心莲片\t98404\n何璐\t98405\n8799\t98406\n优先购买权\t98407\n华赛\t98408\n中控集团\t98409\n丰台二中\t98410\nam4\t98411\nblibli\t98412\n美丽的遇见\t98413\n冯·诺依曼\t98414\nSGE\t98415\nssd盘\t98416\nFILTER\t98417\nlimit分页\t98418\n上海宝华万豪酒店\t98419\n小米MIX\t98420\n姐妹花\t98421\n洗衣舞\t98422\n龚玥\t98423\n糖丸\t98424\n黑鹰坠落\t98425\n精耕细作\t98426\n135家\t98427\n工作餐\t98428\n流量阀\t98429\neuclidean\t98430\n5.5千瓦\t98431\n片机\t98432\nCollider\t98433\n2016世界读书日\t98434\n刀柄\t98435\n合班\t98436\n虔诚\t98437\n走步机\t98438\n离子交换法\t98439\n知难而进\t98440\n制作类\t98441\n梅川\t98442\n望城区政府\t98443\ncounter\t98444\n胡磊\t98445\n吉鲁\t98446\n痞帅\t98447\n南洋红双喜\t98448\n武汉市水务局\t98449\n无所住\t98450\n小学教学设计\t98451\nviterbi\t98452\n3孔\t98453\n深圳君悦酒店\t98454\n三化\t98455\n三天内\t98456\ndribble\t98457\n长今\t98458\n霸服\t98459\n保育员证\t98460\n毒性\t98461\n武鸣政府网\t98462\n真题集\t98463\n担保网\t98464\nЬ\t98465\n重链\t98466\n徐策\t98467\n瑞欧\t98468\nbeatles\t98469\n麦德氏\t98470\n700%\t98471\n灵璧县政府\t98472\nclassic\t98473\n方点\t98474\n小秦子\t98475\nNIKKOR\t98476\n幼师证\t98477\n后退回\t98478\n京东618\t98479\nASN.1\t98480\nSYSDATE\t98481\nbigemap地图下载器\t98482\n4786\t9
8483\n超级颜论\t98484\n中国婚博会\t98485\n绿子\t98486\n结型场效应管\t98487\n湖南大学土木工程学院\t98488\n中國哲學書電子化計劃\u001c維基\t98489\n113路\t98490\n科思\t98491\n迈瑞\t98492\n答案卷\t98493\n非卖品\t98494\nDutti\t98495\n中华人民共和国环境噪声污染防治法\t98496\nLodash\t98497\n兴华街\t98498\nkhalifa\t98499\n齐木楠\t98500\n19#\t98501\n石面\t98502\ngitbook\t98503\n衢州机场\t98504\nMyself\t98505\n兵蟹将\t98506\n柳絮凭风\t98507\n情感日记-嘉人有约\t98508\n清水江\t98509\n乐摇摇\t98510\n南海影视城\t98511\nsuggests\t98512\n佳能6D\t98513\n太极2\t98514\nconfigure\t98515\n连云港市赣榆区人民政府\t98516\nnei\t98517\n呔\t98518\n色魔\t98519\n火花天龙剑\t98520\n恒天然\t98521\n牧马人论坛_汽车之家论坛\t98522\nPRM\t98523\n零售价格指数\t98524\n104.1\t98525\nip版\t98526\n千万级\t98527\n拒审\t98528\n内痕\t98529\n河北\t98530\n100根\t98531\n仁和可立克\t98532\n安奈儿\t98533\n熊野\t98534\n螭吻\t98535\n健身衣\t98536\n剧痛\t98537\n伴娘\t98538\n增值税发票认证\t98539\n200公里\t98540\n201804\t98541\n辰溪\t98542\n哆唻咪\t98543\n1.45\t98544\n神武逍\t98545\n实况足球2013\t98546\n急性肾功能衰竭\t98547\n10派\t98548\n海盗战\t98549\n决明子\t98550\n李国黄轩\t98551\nb150m\t98552\n色鬼\t98553\n5001\t98554\n匙\t98555\n南开区\t98556\n老姑娘\t98557\n兽王\t98558\n口哨声\t98559\n漂亮\t98560\n肉粒\t98561\n医缇雅\t98562\n一点钟\t98563\nElim\t98564\n军棋\t98565\n诺和诺德\t98566\n彤德莱\t98567\n20针\t98568\n玉女聊斋\t98569\n举国\t98570\n狗屁\t98571\ncua\t98572\nbigo\t98573\n辞去\t98574\n一个4\t98575\n第二贴\t98576\n英属维尔京群岛\t98577\nmamba\t98578\n刘某\t98579\nfritzing\t98580\n古言文\t98581\n共享性\t98582\npermitted\t98583\n树型\t98584\n曲名\t98585\n麒麟科创园\t98586\n5.8.1\t98587\n多玩\t98588\nWONG\t98589\n吉大二院\t98590\n七组\t98591\n众泰5008\t98592\n肚照\t98593\nloco\t98594\n沙龙室\t98595\n服装厂\t98596\naae\t98597\nmir\t98598\n002841\t98599\nNotation\t98600\n西南大学研究生院\t98601\nbins\t98602\nbrokerage\t98603\nQS认证\t98604\n橙卡\t98605\n新兴\t98606\n合阳\t98607\nFramework\t98608\n上海队\t98609\n一百万\t98610\n驻点\t98611\n8043\t98612\n杭州大剧院\t98613\n成群结队\t98614\n司命\t98615\n彩虹股份\t98616\n打号\t98617\n重复性\t98618\n仙剑奇侠传二\t98619\n新唐书\t98620\n蒙砂\t98621\nUDP\t98622\n江苏广播\t98623\n球感\t98624\n腼腆\t98625\n啡\t98626\n行动条\t98627\n凌晨0点\t98628\n4700MQ\t98629\n大丰实业\t98630\n静茹\t98631\n红寺堡区\t98632\n僻静\t98633\nUDAF\t98634\n认购权\t98635\n三浦恵理子\t986
36\n宅客\t98637\n发那科\t98638\n农字\t98639\n3.69\t98640\n妇儿\t98641\n欧巡赛\t98642\n第十三版\t98643\nmomentum\t98644\n大清帝国\t98645\n11.2.0.4单\t98646\n王道士\t98647\nltc\t98648\n呼伦贝尔\t98649\n钟意\t98650\n付豪\t98651\nyto\t98652\n倍佳\t98653\n帝战\t98654\nsumif\t98655\n伊斯内尔\t98656\n外邦\t98657\n海尔中央空调\t98658\n梨园戏\t98659\n控制性\t98660\n艾娃\t98661\nVerifier\t98662\njstat\t98663\n韶关学院\t98664\n光库科技\t98665\ncork\t98666\n剑侠情缘3\t98667\n辅城坳\t98668\n登记卡\t98669\n盐酸氨溴索口服溶液\t98670\n时代玫瑰园\t98671\n入伏\t98672\n动机\t98673\n大大大大\t98674\n哈尔滨医大二院\t98675\n直三棱柱\t98676\n干墨鱼\t98677\nadi\t98678\n短句\t98679\n魔法师传奇\t98680\n加密版\t98681\n無形\t98682\n红豆杉\t98683\n卢氏\t98684\n方牌\t98685\n铃原エミリ\t98686\n耶稣光\t98687\n玲姐\t98688\n玄彩娥\t98689\n英菲尼迪Q60\t98690\n毕方\t98691\n东南大学\t98692\n15K\t98693\n止不住\t98694\n瑾哥\t98695\n皮肤病学\t98696\n龙果学院\t98697\ncl2\t98698\n极重\t98699\n僚人\t98700\n劝架\t98701\n茅山派\t98702\n胶砂\t98703\n85814\t98704\n惊跳\t98705\n江南造船厂\t98706\n母带\t98707\n华融证券股份有限公司\t98708\n此处\t98709\n秋雁\t98710\n青阳镇\t98711\n宝镜\t98712\nAirflow\t98713\n宾大\t98714\nJoans\t98715\nrmvb_磁力链接_BT种子\t98716\n8兆\t98717\n知呼\t98718\n哺乳后\t98719\n登杆\t98720\n很多年\t98721\n丹参酮胶囊\t98722\n海扁王2\t98723\n选民登记\t98724\n度度\t98725\n第九版\t98726\n地方政府债券\t98727\n党校\t98728\n皮尔森相关系数\t98729\n碎乐\t98730\nsignaltap\t98731\n5月5日\t98732\n徐嘉良\t98733\npoping\t98734\n嘉兆科技\t98735\n负心人\t98736\n毒诫\t98737\n好多页\t98738\n伏羲女娲\t98739\n摆角\t98740\n新策略\t98741\n解脱\t98742\n掌趣科技\t98743\n外贸圈\t98744\n大宴\t98745\n胡搞\t98746\n114.114.114.114\t98747\nCCPC\t98748\nK1two2\t98749\nkenya\t98750\n回来后\t98751\n借口\t98752\n转院\t98753\n孙萌\t98754\nCareer\t98755\nlxd\t98756\n六亲\t98757\n贫民窟的百万富翁\t98758\n西里奇\t98759\n16年7月\t98760\nsl6\t98761\n风雪\t98762\n白领贷\t98763\n一緒\t98764\n猛冲\t98765\n邪恶少女漫画无翼鸟与老师h全集\t98766\n卫斯理之霸王卸甲\t98767\n法律咨询|律师在线\t98768\n诸天机破星河\t98769\n空置率\t98770\n巽寮湾\t98771\n李华平\t98772\n117.136.38.54\t98773\nsteam求生之路2\t98774\n戴予桐\t98775\n库狛\t98776\n江泰保险经纪股份有限公司\t98777\nARES\t98778\n龙之信条:黑暗觉者\t98779\nINIT\t98780\n信得\t98781\n澳大利亚墨尔本大学\t98782\n有去无回\t98783\nmercurial\t98784\ndianh\t98785\n116114\t98786\n铃村健一\t98787\n张金龙\t98788\n恶字\
t98789\n第一夜\t98790\n百度云_白领网\t98791\n圈儿\t98792\n金桥信息\t98793\n两手抓\t98794\n孢子菌\t98795\n中共浙江省委老干部局\t98796\n对外经贸\t98797\n索引处\t98798\n小公鸡\t98799\n3龙\t98800\nBrowne\t98801\n风水网\t98802\ng20峰会\t98803\n孙庄\t98804\nSENNHEISER\t98805\n400多亿\t98806\n悟空\t98807\neeban\t98808\ntq\t98809\n万家论坛\t98810\n黄祖儿\t98811\n40公斤\t98812\n拂尘\t98813\nMySQL数据库\t98814\n龙马潭\t98815\n把头\t98816\n021七桃独舞\t98817\n拿瓦\t98818\n古刹\t98819\n心室率\t98820\n灵魂石\t98821\nカリビアンコム\t98822\n双标\t98823\n舰长\t98824\n何萍\t98825\n橱窗\t98826\n刺猬紫檀\t98827\n机器设备\t98828\n黄某某\t98829\n干物妹\t98830\n匀贞\t98831\nYUV\t98832\n旱魃\t98833\n大炮\t98834\n阿里国际站\t98835\n金山寺\t98836\n美国西点军校\t98837\n旅游社\t98838\nrar-CSDN\t98839\n前途出国\t98840\n纸印\t98841\n本周四\t98842\n舞者\t98843\n团战\t98844\n温流\t98845\n掺杂\t98846\n第37号\t98847\n簪\t98848\n电动攻丝机\t98849\n三科电器有限公司\t98850\n童叟无欺\t98851\n台桌\t98852\n1.5秒\t98853\n手铐\t98854\n丁先森\t98855\n唐雪\t98856\n辖\t98857\nhp3050\t98858\n螺\t98859\n要不是\t98860\nHsu\t98861\npore\t98862\n古田路\t98863\n胚胎学\t98864\n千丁\t98865\n身\t98866\n利驰\t98867\nOpenJDK\t98868\n慕战\t98869\n阳逻街\t98870\n学回信\t98871\n促销价\t98872\n小星\t98873\n新寻仙\t98874\n百花广场\t98875\n圆通快递公司\t98876\n峰峻\t98877\n我记得\t98878\n干菜\t98879\n道济\t98880\n现存\t98881\n西安博物馆\t98882\n马经\t98883\n张志伟\t98884\n博库\t98885\n导波\t98886\n包装展\t98887\n垂直\t98888\n郑仁仙\t98889\n江山印\t98890\n维生素C注射液\t98891\n省气\t98892\nTPLINK\t98893\n新疆代表团\t98894\n孟雨泽\t98895\n参建\t98896\n丁东\t98897\nbin格式\t98898\n结线\t98899\na45\t98900\n非字型\t98901\n山大\t98902\nOWARI\t98903\nfla\t98904\n一胜\t98905\n伸冤人\t98906\n袁嘉敏\t98907\n尺寸链\t98908\neducate\t98909\n垃圾股\t98910\n康乐社区\t98911\nJIT\t98912\n罗恩\t98913\nUEFI版\t98914\n好几年\t98915\n常安镇\t98916\nvideoscribe\t98917\n大竹林\t98918\nMailChimp\t98919\n三戒\t98920\n浙江省人民医院\t98921\n有见\t98922\n短池\t98923\n绿苔\t98924\n双培\t98925\n970a\t98926\nSPV\t98927\n杨根思\t98928\n薛衣人\t98929\nSHIELD\t98930\nK3路\t98931\n柳埠镇\t98932\n基层司法所\t98933\n何欢\t98934\n大公布\t98935\n山石\t98936\n曲目\t98937\n牡丹\t98938\n量子号\t98939\n希德\t98940\n和平共处\t98941\n折价\t98942\n订货量\t98943\n磨损\t98944\n已\t98945\n非贸付汇\t98946\n小西门\t98947\n武陟\t98948\n恒鑫\t98949\n疯窝\t98950\n乏
善可陈\t98951\n脱硫设备\t98952\n天南海北\t98953\n7月13日\t98954\n兰\t98955\n中国制冷学会\t98956\n投标函\t98957\n川价\t98958\n上饶晚报-上饶数字报\t98959\n丽江火车站\t98960\neclispse\t98961\nrookieM\t98962\nsiyue\t98963\n大圣归来\t98964\n第二分\t98965\n常言\t98966\ninArray\t98967\n3.8.1\t98968\nTPlink\t98969\n奥华子\t98970\n头号\t98971\n酸式\t98972\n氢氧化锂\t98973\n动态评分\t98974\n苏苑\t98975\n吴经理\t98976\n全福\t98977\n唯蜜\t98978\n魏洲\t98979\n陈信宏\t98980\n荡寇\t98981\nEasy\t98982\n大乐透机\t98983\n美食加盟连锁\t98984\n资本公积\t98985\n高宽\t98986\n拜访者\t98987\n42项\t98988\n鳑鲏\t98989\n小叶子\t98990\n骨质疏松\t98991\n注解\t98992\n机械设计制造及其自动化专业\t98993\n第35条\t98994\n空头支票\t98995\ncr\t98996\n云南省农业科学院\t98997\n途安\t98998\n208个\t98999\n15届\t99000\n秦朝时期\t99001\n骑马与砍杀潘德\t99002\n木屋架\t99003\n张晓松\t99004\n坛蜜\t99005\n暴操\t99006\naoz\t99007\n初级会计师考试\t99008\nopm\t99009\n航运公司\t99010\n生辰八字算命_算命\t99011\n1317\t99012\n羊山\t99013\n老滚\t99014\n100多张\t99015\n普速\t99016\n钩虫\t99017\nIl\t99018\n傻不傻\t99019\nzzfx\t99020\nembeded\t99021\n打擂\t99022\n死神来了5\t99023\n双赢网\t99024\n凸轮轴\t99025\n金蝎\t99026\n智慧园区\t99027\n着迷网\t99028\n环链电动葫芦\t99029\nshì\t99030\n周显欣\t99031\n连技\t99032\n南海罗村\t99033\n屋主\t99034\n2017年1月6日\t99035\n超一流\t99036\n十番\t99037\n画境\t99038\n鸡卡\t99039\n档案室\t99040\n百叶片\t99041\n缺片网\t99042\n海晶\t99043\n逸仙\t99044\n魔音\t99045\n织标\t99046\n房产\t99047\n籽粒\t99048\n蒂姆\t99049\n突然死亡\t99050\n关衔\t99051\n正部级\t99052\n深圳公积金管理中心\t99053\n最高层\t99054\n有砟\t99055\n鱼虾\t99056\n浦上\t99057\n南京市国家税务局\t99058\n求顶\t99059\n把\t99060\n西安酒店\t99061\n目无\t99062\n鱼香茄子\t99063\nu2717\t99064\n四川省环保厅\t99065\niphonese\t99066\n2043\t99067\nMotorcycles\t99068\n1868\t99069\nVuejs\t99070\n候\t99071\n雨刷\t99072\n内甲\t99073\n酪氨酸\t99074\n三十年前\t99075\n野鹅敢死队\t99076\npeau\t99077\nSM961\t99078\n效果\t99079\nqí\t99080\n小规模纳税人公司\t99081\n无翼鸟日本邪恶少女漫画大全之恋母性活义母吐息\t99082\ngrids\t99083\nPrefab\t99084\n代挂\t99085\n日照市人民医院\t99086\n群友们\t99087\n永恒魔法\t99088\nrespone\t99089\n南昌万达海洋乐园\t99090\n陆波\t99091\n发卷\t99092\n大主宰吧\t99093\n日本东京大学\t99094\ncolorectal\t99095\n破产清算组\t99096\n除湿机\t99097\nbagging\t99098\n花土沟\t99099\n文本\t99100\n要疯\t99101\nhibernatetemplate\t99102\n伦琴\t99103\n浮肿\t
99104\n扬\t99105\n脚本之家\t99106\n深圳人力资源和社会保障局\t99107\n孑然\t99108\nLuka\t99109\n西安电子科技\t99110\nsbs\t99111\n三旬\t99112\n原典\t99113\n百度大圣卡\t99114\n麻辣烫\t99115\nsurf函数\t99116\ninr\t99117\n陈德文\t99118\n怪哉\t99119\n门楣\t99120\n长安欧尚长安CX70\t99121\n安阳一中\t99122\n新垣结衣\t99123\n烘焙业\t99124\nDivinity\t99125\n中非\t99126\nV5.5\t99127\n瞭望塔\t99128\nGOCLOUD\t99129\npw\t99130\n一句顶一万句\t99131\n半工半\t99132\nCBME\t99133\nEDIUS\t99134\nFLUKE\t99135\n零秒\t99136\n决不能\t99137\n前两天\t99138\n厦蓉高速\t99139\n场景值\t99140\n600158\t99141\n棒棰岛\t99142\nelasticSearch\t99143\n中华人民共和国中央人民政府\t99144\n酒诗\t99145\n除草剂\t99146\n小贱\t99147\nAATCC\t99148\n平民\t99149\n杨小杰\t99150\naccrual\t99151\n傲血\t99152\n功能费\t99153\n黄老师\t99154\n云计算平台\t99155\n广州市第一人民医院\t99156\n产地证\t99157\n防护期\t99158\n济天下\t99159\n淘房记\t99160\nAnswerHub\t99161\n滁州市财政局\t99162\n质量检查\t99163\n某事\t99164\n非金属性\t99165\n范敏\t99166\n险途\t99167\n卡尔萨根\t99168\n离情\t99169\n吸式\t99170\n方清平\t99171\nAL00/全网通\t99172\n舍友\t99173\nGerald\t99174\n系统解剖学\t99175\n鮨\t99176\n630k\t99177\nCall\t99178\n九斗鱼\t99179\n九和路\t99180\n阀件\t99181\n点钟\t99182\n北京房产网\t99183\nTab3\t99184\njacobi\t99185\n2颗\t99186\n唐清\t99187\ndevils\t99188\nXjng\t99189\n气宇轩昂\t99190\n持有\t99191\n白鹤滩水电站\t99192\n寻龙诀\t99193\nEizo\t99194\n豪赌\t99195\n10000点\t99196\n科拉\t99197\n沃邦国际教育\t99198\n蛋奶\t99199\n和平社区\t99200\n打光\t99201\nhamlet\t99202\n钥匙孔\t99203\n1.6元\t99204\n闳\t99205\n拉登\t99206\ngalleria\t99207\n娜塔莎\t99208\n老师傅\t99209\njiangxiaobo\t99210\n杭州地铁9号线\t99211\n宜和园\t99212\nHPV\t99213\n四幕\t99214\n德古拉\t99215\n布质\t99216\n沪渝高速\t99217\n六式\t99218\n统计值\t99219\n勃兰特\t99220\n勃朗宁\t99221\nkaixin\t99222\n全基因组测序\t99223\n北京工会\t99224\n词儿\t99225\n药王山\t99226\n王丽霞\t99227\n卡钱\t99228\nmpi4py\t99229\n2018.4.28\t99230\ncz\t99231\nRxS\t99232\n一元函数\t99233\n抢座\t99234\n真奇怪\t99235\n案由\t99236\n米莱\t99237\n嗜酸性\t99238\n间氨基苯酚\t99239\n草绿\t99240\n货币主义\t99241\nquestion\t99242\n复合维生素\t99243\n长治二中\t99244\n刘扬\t99245\nindividual\t99246\n报税季\t99247\n纳税义务\t99248\n喷绘\t99249\nstream886\t99250\nBeliever007\t99251\n天册\t99252\n二级注册计量师考试\t99253\n防守型\t99254\n杨阳洋\t99255\n啼血\t99256\n四次方\t9925
7\n星号\t99258\n连打\t99259\nAck\t99260\n有道云\t99261\n孙述涛\t99262\n大慨\t99263\n鸟粪\t99264\n北体大\t99265\nChai\t99266\n非盟\t99267\n明初\t99268\n新藏线\t99269\n阜新县\t99270\n佟大徐志摩\t99271\n奔腾x80\t99272\ndebussy\t99273\nf150\t99274\n12代\t99275\n请求权\t99276\n刮风\t99277\n五显\t99278\n珍宝馆\t99279\n神迹\t99280\noabc\t99281\nstardust\t99282\n中国光大银行股份有限公司\t99283\nFE55\t99284\n岳西翠兰\t99285\n治安官\t99286\n拿手菜\t99287\n2401\t99288\n备足\t99289\n踏空\t99290\n针扣\t99291\n曹老师\t99292\nEtc\t99293\n光阴荏苒\t99294\nM249\t99295\nSteam\t99296\n轻质\t99297\n古港\t99298\n比翼双飞\t99299\n缚鸡之力\t99300\n讯飞语记\t99301\n刘蓉\t99302\nASOne\t99303\n搽\t99304\n步进电机控制器\t99305\n挽留\t99306\n免税农产品\t99307\n情战\t99308\n血吸虫病\t99309\n机装\t99310\n林世荣\t99311\n荣威i6/ei6论坛_汽车之家论坛\t99312\n晒斑\t99313\n地画\t99314\nWindows.old\t99315\n杨宝忠\t99316\n还林\t99317\n无人烟\t99318\n天津恒隆广场\t99319\n自来水\t99320\ncaliper\t99321\n怡乐路\t99322\n天天模拟器\t99323\n黑玫瑰\t99324\n4.9\t99325\nhongxing\t99326\nhate\t99327\nstanley\t99328\n300天\t99329\n成册\t99330\n没名\t99331\n月满轩尼诗\t99332\n岩寺\t99333\n口交器\t99334\n赵川\t99335\nsoso\t99336\nexcel通讯录\t99337\n4000亿元\t99338\n深圳供电局有限公司\t99339\n料台\t99340\n武能\t99341\n信号继电器\t99342\n酸汤肥牛\t99343\nKit\t99344\n振东制药\t99345\n瓦森纳\t99346\n王秀芬\t99347\n7.62毫米\t99348\n走转改\t99349\n莲溪路\t99350\nspeculative\t99351\n英亩\t99352\n泉州南\t99353\n115\t99354\n360急救箱\t99355\n孙英杰\t99356\n铃木王\t99357\n保利百合花园\t99358\n阵列\t99359\nqgis\t99360\n罪人与龙共舞\t99361\nΕ\t99362\n娜塔莉·多默尔\t99363\n炉石传说女巫森林冒险\t99364\n连载版\t99365\n仁王\t99366\nv4.1.1\t99367\n低温\t99368\n正面临\t99369\n利标\t99370\n易燃易爆炸\t99371\nRAT\t99372\n61名\t99373\n徐教院附中\t99374\n#import\t99375\n禁音\t99376\n金丝楠木手串\t99377\n明知山有\t99378\n南京医院\t99379\n钩花\t99380\n玩车\t99381\nnba2k16\t99382\nModules\t99383\n怪形前传\t99384\n条纹衫\t99385\njquery-easyui\t99386\nSTDEV\t99387\n温州市政府\t99388\n望坛\t99389\n1分钟前\t99390\n新度镇\t99391\naccompany\t99392\n打团\t99393\n知心话\t99394\n7件\t99395\n紫外分光光度计\t99396\n黄浩\t99397\n奥赛康\t99398\n中国电建集团\t99399\n张家界市\t99400\n王者荣耀荆轲\t99401\n菊与刀\t99402\n武汉科技大学城市学院\t99403\n启锐\t99404\n心直口快\t99405\n吃金戈\t99406\n月经病\t99407\n金源\t99408\n刘晓凯\t99409\n广州供电局\t99410\n
低压开关柜\t99411\n_列王\t99412\nconfuse\t99413\n彭鑫\t99414\n关雎尔\t99415\n感人至深\t99416\n四合扣\t99417\n电卡\t99418\n阿尔西\t99419\n2017年12月16日\t99420\nX1\t99421\n窃格瓦拉\t99422\n解放军艺术学院\t99423\n莲妹\t99424\n黑豆浆\t99425\n励精\t99426\n请贴\t99427\n羌\t99428\n周皓\t99429\n奥美拉唑肠溶胶囊\t99430\n桑玠\t99431\n控制科学与工程专业\t99432\n打零工\t99433\n酯\t99434\n格拉茨\t99435\n黄石港区\t99436\nresistance\t99437\n富拉尔基\t99438\n紫藤花\t99439\n方家\t99440\n千佛手\t99441\n特卖\t99442\n无锡市妇幼保健院\t99443\n下定决心\t99444\n京九线\t99445\n茶海\t99446\n稳岗\t99447\n108式\t99448\n请注\t99449\n文艺报\t99450\n诡媚海妖\t99451\nWallace\t99452\n结板\t99453\n遭淘汰\t99454\n龙猫传奇\t99455\navio\t99456\n猛涨\t99457\nd3400\t99458\nSeebug\t99459\n清华大学软件学院\t99460\n五t\t99461\n盯梢\t99462\n香仁堂瘦瘦包\t99463\n位宽\t99464\n绯绯\t99465\n哈尔滨医科大学附属第一医院\t99466\naspectj\t99467\n不眠之夜\t99468\nチラリズム\t99469\n杏彩\t99470\n爱特\t99471\ncwp\t99472\n汪杰\t99473\n三竹\t99474\n柯华\t99475\n5680t\t99476\n锥切\t99477\n原水\t99478\n統色\t99479\n章姬\t99480\n全唐诗\t99481\n大庆市政府\t99482\n勤廉\t99483\n三坊七巷\t99484\n神龙教\t99485\n卖好\t99486\n焖饼\t99487\n4.08\t99488\n哭鬼\t99489\n显名\t99490\n篇章\t99491\nCytoscape\t99492\n长安奥迪q3\t99493\n封神纪\t99494\n克莱蒙\t99495\n联测\t99496\n9点\t99497\n世纪新城\t99498\n赤藓糖醇\t99499\n玛咖粉\t99500\n普通门诊\t99501\n西雅图华盛顿大学\t99502\nword模板\t99503\n文章类\t99504\n中国网通\t99505\n京瓷1020\t99506\n黄书豪\t99507\n野史\t99508\n1.9T\t99509\ncertificate\t99510\n漷县\t99511\n宝安西乡\t99512\n黄兴公园\t99513\n神交\t99514\n洋快餐\t99515\n驳壳\t99516\n圣盈信\t99517\n_万古仙穹\t99518\n上海交通大学医学院附属瑞金医院\t99519\n醋泡黑豆\t99520\n南京军区总院\t99521\n舒婷\t99522\nslx\t99523\n百丽集团\t99524\n盲妃\t99525\n野彩\t99526\n微单a7\t99527\nSulfate\t99528\n美呆\t99529\n妖火\t99530\n十四日游\t99531\nRY\t99532\n并蒂莲\t99533\n解放阁\t99534\n鼓楼广场\t99535\nspending\t99536\n中国产业洞察网\t99537\nratio\t99538\n文案\t99539\n炼石有色\t99540\n副产\t99541\nsshpass\t99542\n敏捷型\t99543\n作业指导书\t99544\ntranscad\t99545\n合正\t99546\n2017-2025年\t99547\n笑哭\t99548\n北京市农村工作委员会\t99549\nZeppelin\t99550\n万影网\t99551\n4399寻仙手游\t99552\n大同\t99553\n奈莎\t99554\nD型\t99555\n11页\t99556\n放射性同位素\t99557\n青海省政府\t99558\n巨婴\t99559\n软锰矿\t99560\nScreens\t99561\n三瓜公社\t99562\n国标码\t99563\nCalibri\t99564
\nmambo2\t99565\n就战\t99566\n7270\t99567\n推介\t99568\n古文字\t99569\n小苏打粉\t99570\n银轴\t99571\n喜上\t99572\n小屄\t99573\n112岁\t99574\n免堆\t99575\n北京农业职业学院\t99576\n100立方\t99577\n商牛网\t99578\n阿里智能\t99579\n浦阳\t99580\nhbad\t99581\n75位\t99582\n反洗钱\t99583\n大道理\t99584\n大老鼠\t99585\n零落成\t99586\n锅巴菜\t99587\nWin10设备管理器\t99588\n不定冠词\t99589\n李香凝\t99590\n湖州地区\t99591\n中国锻压协会\t99592\nBurpsuite\t99593\n脱除\t99594\n北京现代\t99595\n吊扣\t99596\n接受我\t99597\n2.1.6\t99598\n直流电路\t99599\n亚格力\t99600\nB&O\t99601\n骑士王\t99602\n胡宁\t99603\n昌业\t99604\n菅田将晖\t99605\nFollow\t99606\n9205\t99607\n午夜片\t99608\nopenshift\t99609\n恶性循环\t99610\n尤努斯\t99611\n白银区\t99612\n萨克斯\t99613\n国检\t99614\n海尔金控\t99615\newp\t99616\nhunger\t99617\neagleget\t99618\n兽笼\t99619\n调整箱\t99620\nhumblebundle\t99621\n真心话大冒险\t99622\n救生绳\t99623\n关城\t99624\n碗窑\t99625\n大言不惭\t99626\n郑大远程教育\t99627\n全职\t99628\n李灿\t99629\nDefense\t99630\nbjm\t99631\nfashi\t99632\n高达seed\t99633\n明视\t99634\n贩\t99635\n丫子\t99636\n濯村\t99637\n巨债\t99638\n财富邮\t99639\nsousuo\t99640\n头等舱\t99641\nScrollview\t99642\nsnapshot-51CTO\t99643\n法门\t99644\n钱罐\t99645\n上海财经大学会计学院\t99646\n沈德潜\t99647\n开幕词\t99648\n芯烨\t99649\n银川市政府网\t99650\n绝地求生刺激战场吧\t99651\nDOME\t99652\neoc\t99653\n桥边\t99654\n9日\t99655\n礼泉县人民政府\t99656\nawful\t99657\n切脉\t99658\n17岁时\t99659\n旋转蒸发仪\t99660\nDMX512\t99661\n援助\t99662\n千佛崖\t99663\n黑龙江建筑职业技术学院\t99664\n乔一\t99665\n长城版\t99666\nimu\t99667\n消噪\t99668\n山峰\t99669\n剩余定理\t99670\n新米\t99671\n寒性水果\t99672\n银制\t99673\n无心法师\t99674\n标卷\t99675\n\\0\t99676\nALP\t99677\nBMap\t99678\nRufus\t99679\n96平方\t99680\nvivox20吧\t99681\n山推股份\t99682\n大庭\t99683\n相对分子质量\t99684\ntsks\t99685\n胡.杰\t99686\n心闷\t99687\n京挑客\t99688\n五陵\t99689\n色域\t99690\n吴中路\t99691\n躲起来\t99692\n番茄鱼\t99693\n牧马城市\t99694\n瓦当\t99695\n锈迹斑斑\t99696\nimds\t99697\n迪诺水镇\t99698\n96069\t99699\n传好\t99700\n2000t\t99701\n太阳树\t99702\nAv网\t99703\n白活\t99704\n天高云淡\t99705\n好居\t99706\n银砖\t99707\n汉东省\t99708\n潇潇雨\t99709\n黑底\t99710\n起皱\t99711\n捞王锅物料理\t99712\n4000多元\t99713\n管流\t99714\nPassat\t99715\n三终\t99716\n桂子\t99717\nSqlite数据库\t99718\n6.9折\t99719\n105亿\
t99720\n弹唱\t99721\nzju\t99722\n西安小学\t99723\n赵晶\t99724\n枸杞树\t99725\n丝竹\t99726\n救死扶伤\t99727\n更至\t99728\n飘来飘去\t99729\ntravling\t99730\n够用\t99731\n杭州政府网\t99732\nSAINT\t99733\n防爆盒\t99734\n小姨多鹤\t99735\n38.com\t99736\n鹭洲\t99737\n淡写\t99738\n王老头\t99739\n西南林业大学\t99740\n理想家\t99741\nVoyage\t99742\n祸不单行\t99743\n焦山\t99744\n尛\t99745\n不唯\t99746\n翁旗\t99747\n矿长\t99748\n半轴\t99749\n成都南站\t99750\n第17页\t99751\n污\t99752\nBRK\t99753\n满分\t99754\nswi\t99755\n水川\t99756\n小辈\t99757\n一分钟之内\t99758\n88必发\t99759\n塔川\t99760\n罗比尼奥\t99761\n18.3.4\t99762\nfushion\t99763\n黄河大桥\t99764\n20160314\t99765\n曲周吧\t99766\nAttractions\t99767\n五四村\t99768\n伊犁州人力资源和社会保障局\t99769\n索吻\t99770\n终极版\t99771\n语言栏\t99772\n帮鞋\t99773\n托特\t99774\n962\t99775\n中誉\t99776\n深圳男科医院\t99777\n积分入户分值表\t99778\n反抗\t99779\n邮报\t99780\n琳恩\t99781\n柯柯\t99782\n纳米银线\t99783\n维生素b6\t99784\n墓志铭\t99785\n反智\t99786\n畅优\t99787\n悬赏金\t99788\n周转房\t99789\n茶淡饭\t99790\nBoxes\t99791\n剖视\t99792\n汪勇\t99793\n三高\t99794\n阳光体育运动\t99795\n车款\t99796\n专属情歌\t99797\n剪切蒙版\t99798\n上海机场集团\t99799\nFPGA/ASIC\t99800\nhex\t99801\n八边形网\t99802\n蚍蜉撼树\t99803\n挂用\t99804\n普叉\t99805\n舒肤佳香皂\t99806\n雪佛兰科沃兹\t99807\n蒯大富\t99808\n蜗轮蜗杆\t99809\n余天\t99810\n备胎\t99811\n横行霸道\t99812\n万里行\t99813\n返台\t99814\n工务段\t99815\n债卷\t99816\n人力资源管理师证\t99817\n太西路\t99818\n汉钟精机\t99819\nJeb\t99820\n泽州县\t99821\n学则\t99822\n日觉\t99823\n勇于\t99824\n意气用事\t99825\n锅匠\t99826\n阿尔托莉雅\t99827\n41秒\t99828\n电镜\t99829\nOptimized\t99830\n收紧器\t99831\n针式打印机\t99832\n民主生活会\t99833\n陆昊\t99834\n满天\t99835\nprise\t99836\n嚣张\t99837\n60010\t99838\n隐性基因\t99839\n3DMAX2010\t99840\n书楼\t99841\n第K\t99842\n麓湖生态城\t99843\n前海蛇口\t99844\nhunk-ch\t99845\nyoyo\t99846\n东台市人民政府\t99847\n植物源\t99848\n丹厦\t99849\n流水潺潺\t99850\n大兴野生动物园\t99851\nPlaygrounds\t99852\nTreeview\t99853\n兰德尔\t99854\n袁希福\t99855\n昆士兰大学\t99856\nrevit2017\t99857\n洁水\t99858\nhd版\t99859\nMemento\t99860\n凤毛麟角\t99861\n蓝格\t99862\n标准线\t99863\n4方\t99864\n双钻\t99865\nJump\t99866\n萧圣\t99867\nAndroidM\t99868\n北单\t99869\n放量\t99870\n授权书\t99871\n珊莎\t99872\n西洋乐\t99873\n惠普1020plus\t99874\n红色警戒2共和国之辉\t99875\n张廷玉
\t99876\n老周\t99877\nICO\t99878\nAbc\t99879\n张蕙兰\t99880\n太苏\t99881\n元祖\t99882\n7月31日\t99883\n宏源\t99884\n等等\t99885\n电动搅拌器\t99886\n南金\t99887\ninvasion\t99888\nConfidentLiu\t99889\n酷看影视\t99890\n陈伯达\t99891\n中国南京栖霞区政府网\t99892\n十一黄金周\t99893\n五矿集团\t99894\n积案\t99895\n訳\t99896\nZoe\t99897\n符串\t99898\n慢性鼻窦炎\t99899\n轴荷\t99900\n沪通\t99901\n督导师\t99902\n格罗姆\t99903\nIntruder\t99904\n白蛋白紫杉醇\t99905\ndivergence\t99906\n白沙路街道\t99907\nmp259\t99908\n公服\t99909\n200架\t99910\nOurMySQL\t99911\n草丛里\t99912\nREVIEW\t99913\n廉价航空公司\t99914\n中国企业家\t99915\n黄曲霉毒素\t99916\n卑怯\t99917\nAIKA\t99918\n莫晓娜\t99919\n耐受性\t99920\nzaixian\t99921\n押韵\t99922\nLogic\t99923\n横扫千军\t99924\nroot版\t99925\n武汉人才网\t99926\nonTouchEvent\t99927\n永宁\t99928\n围炉\t99929\n气势磅礴\t99930\n龙娘\t99931\n为非作歹\t99932\n米兜\t99933\n实际利率\t99934\n冒險者\t99935\n笔墨\t99936\nartful\t99937\n厨工\t99938\n照章\t99939\n蝶舞\t99940\nDramatic\t99941\n鲍浩然\t99942\n凤凰单丛\t99943\n边志军\t99944\n对称\t99945\n900米\t99946\n气法\t99947\n韩杰\t99948\n筛板\t99949\n小夫妻\t99950\n数据猿\t99951\n韩文\t99952\n7u80\t99953\n一中年\t99954\nacronis\t99955\n砖茶\t99956\n人保部\t99957\n上海爱信诺航天信息有限公司\t99958\n迂曲\t99959\n张昭\t99960\n贪嘴\t99961\ncopyfile\t99962\nShowDialog\t99963\n动画集\t99964\n新城大街\t99965\n吴鹏飞\t99966\n毛毯\t99967\n清扫\t99968\n不再相信\t99969\n3350\t99970\n纤云\t99971\n设计篇\t99972\n第648章\t99973\n专指\t99974\n李放\t99975\n慢慢地\t99976\n61条\t99977\n等价类\t99978\n一月\t99979\n离石\t99980\n山东师大附中\t99981\n爱的距离\t99982\n哆啦A梦\t99983\n疲劳药\t99984\n问及\t99985\n寶\t99986\nVIPER\t99987\n很兴奋\t99988\ndeutsche\t99989\nEclispe\t99990\nPEP\t99991\n2017年3月5日\t99992\n263号\t99993\n临沂大众网\t99994\n江苏城市职业学院\t99995\n范生\t99996\n金立M5\t99997\n维斯特\t99998\n爆顶\t99999\n佳园\t100000\nCosta\t100001\n永硕e盘\t100002\n鲁邦\t100003\n东联\t100004\n扭扭\t100005\n回龙观社区\t100006\n头痛\t100007\n老虞学GoLang笔记\t100008\n洗冤集录\t100009\n一起同过窗2\t100010\n杨经理\t100011\n牵引变压器\t100012\n上海腾讯\t100013\n看做\t100014\n阿弥陀经\t100015\n东部\t100016\n核心筒\t100017\n随口\t100018\n慕纤瞳\t100019\n在于\t100020\n树石\t100021\n露蒂\t100022\n精微\t100023\n临沂\t100024\n小米Max\t100025\n小庙镇\t100026\ngoogle.com\t100027\n人事工资管理系统\t10002
8\n山珍海味\t100029\nTrueCrypt\t100030\n神仙水\t100031\n福昕pdf阅读器\t100032\n60601\t100033\n垡头\t100034\n清新型\t100035\n水漾\t100036\n帕克\t100037\n李在村\t100038\n6105\t100039\n温州村\t100040\nMultiphysics\t100041\n标志牌\t100042\n店\t100043\n孙大勇\t100044\nRoxy\t100045\n极小\t100046\n高勒\t100047\n0311-87675566\t100048\n热血格斗\t100049\nForever\t100050\n正定古城\t100051\n双庙\t100052\n中兴小区\t100053\n蚓\t100054\n恺英网络\t100055\n船\t100056\n上葡京\t100057\nger\t100058\n农村信用社网上银行\t100059\n世界经济论坛\t100060\nfarm\t100061\nMarcus\t100062\n冒险岛4\t100063\n清原县\t100064\n查号\t100065\n绿化面积\t100066\n气化器\t100067\n资邦金服网络科技集团有限公司\t100068\n草莽\t100069\n意遇\t100070\n广东猛狮新能源科技股份有限公司\t100071\nModel3\t100072\n敬茶\t100073\n网签价\t100074\n试验方\t100075\n菲拉斯\t100076\nLive\t100077\n必杀\t100078\n中华名酒招商网\t100079\n随记\t100080\n听\t100081\n汇点\t100082\n高志航\t100083\n忆梦\t100084\nJavashop\t100085\n安全裤\t100086\n2016年中秋节\t100087\n柴静\t100088\n赶尽杀绝\t100089\nxp1024核工厂\t100090\nHVDC\t100091\n喵喵喵喵喵\t100092\n茵曼\t100093\nkvb\t100094\n合优\t100095\n苏诺锦\t100096\n白首不相离\t100097\n天津大剧院\t100098\nDes\t100099\nconky\t100100\n简约型\t100101\n灿星\t100102\n南唐后主\t100103\n大秘\t100104\n150条\t100105\ndephi\t100106\n永山\t100107\n260名\t100108\n番禺日报数字报\t100109\n梯式桥架\t100110\n50k\t100111\nSTM32F7\t100112\n史密斯\t100113\n黑龙江省纪委监察厅\t100114\n体行\t100115\n蜉蝣\t100116\nSTVP\t100117\nSymPy\t100118\n10月10日\t100119\n美的地产集团\t100120\n唯识\t100121\n老杨\t100122\n酒醉\t100123\n二流\t100124\n鼬神\t100125\n腺癌\t100126\nPantone\t100127\n米开朗琪罗\t100128\n肛门腺\t100129\njstarkan\t100130\n鬼洞\t100131\n下火\t100132\n盘乐网\t100133\n必须\t100134\n空甲联盟\t100135\n缭绕\t100136\n杨沛宜\t100137\n茶酒\t100138\ntouchstart\t100139\n江苏省农垦集团有限公司\t100140\n0.02mm\t100141\ntravelers\t100142\n样表\t100143\ndeux\t100144\n难言之隐\t100145\n魔头\t100146\n促农\t100147\n办报\t100148\npicked\t100149\n酸疼\t100150\n临沭县\t100151\n选装\t100152\n跑速\t100153\nvulcan\t100154\n隔直\t100155\nт\t100156\n雅高\t100157\n坎坎\t100158\n冷番\t100159\n合力泰科技股份有限公司\t100160\n意丹\t100161\n芝麻认证\t100162\n中科\t100163\n空气冷却器\t100164\nskin\t100165\n鱼糜\t100166\n将乐县\t100167\n喷气发动机\t100168\n云水谣古镇\t100169\n葡式蛋挞\t100170\n邪恶漫
画\t100171\n等级性\t100172\n喷膜\t100173\n祛皱\t100174\n中持股份\t100175\n林波\t100176\n4050\t100177\n4.2%\t100178\n县图书馆\t100179\n阿姆斯特朗\t100180\n3017\t100181\n跳起\t100182\n山东政协\t100183\n乐视体育\t100184\n社区卫生服务站\t100185\n西河镇\t100186\n美人为馅\t100187\n泰达宏利基金\t100188\n浙江大学明州医院\t100189\n预则立\t100190\n盐酸乙哌立松片\t100191\n原筑\t100192\n政社\t100193\n战棋类\t100194\n地胶\t100195\n博优\t100196\n回音壁\t100197\n苦后甜\t100198\n8.1分\t100199\n穷孩子富孩子\t100200\n目前有\t100201\n1977\t100202\n乞巧节\t100203\n位移角\t100204\n螺柱\t100205\nFastBoot\t100206\nCurtain\t100207\n蜻蜓\t100208\nPKF\t100209\n屏芯科技\t100210\n葱包烩\t100211\nQLV\t100212\nOQ\t100213\n大厨\t100214\nconditioned\t100215\n盘_\t100216\n王振辉\t100217\n椒麻鸡\t100218\nProduct\t100219\n乐敦\t100220\n刘建中\t100221\n哈萨克\t100222\nHana\t100223\n1.26G\t100224\n一二章\t100225\n天赐\t100226\n猪儿\t100227\n古拉顿\t100228\npassage\t100229\n11315\t100230\n恒星科技\t100231\n123个\t100232\n双秀\t100233\n输入器\t100234\n修模\t100235\n磁县\t100236\nA9\t100237\n犯罪团伙\t100238\nt分布\t100239\n断发\t100240\n验光\t100241\n元淳公主\t100242\n被提交\t100243\n油炸\t100244\n法论文\t100245\n无耻之徒第八季\t100246\n建模算法\t100247\n48英寸\t100248\n中储股份\t100249\ninclude\t100250\n奇偶页\t100251\n中国太平保险集团\t100252\nDXC\t100253\ni9220\t100254\n护心\t100255\n套定额_\t100256\n房兵\t100257\n青春时代2\t100258\n揖\t100259\n左传\t100260\n结友\t100261\n王文玉\t100262\n生活型\t100263\n劲光\t100264\n诱因\t100265\n晋京\t100266\n铲铲\t100267\n八墓村\t100268\n债法\t100269\n螺母柱\t100270\n顺理成章\t100271\n21克\t100272\nadhere\t100273\n大话西游II\t100274\n谦词\t100275\n南京交院\t100276\n中国大饭店\t100277\n苏宇\t100278\n自动挡汽车\t100279\n2.5毫米\t100280\n1382\t100281\n5.8亿\t100282\n大侦探\t100283\nNearest\t100284\n剪接\t100285\n咸阳机场\t100286\n魔牙\t100287\n4000000\t100288\n认知症\t100289\n永井\t100290\n这么多年\t100291\n不痛不痒\t100292\n瑞能\t100293\n材料力学\t100294\n患\t100295\n8060\t100296\n10M\t100297\n旧游\t100298\n4个多小时\t100299\n见贤思齐\t100300\n灌缝胶\t100301\njewel\t100302\n雪绒花\t100303\n苏州东方之门\t100304\nasone\t100305\n帝冢真织\t100306\n料理人\t100307\n昆兰\t100308\n最大和\t100309\n踏实\t100310\n400t\t100311\n刹车\t100312\nCQ40\t100313\n跑步机\t100314\n3dmax吧\t100315\n集成支付宝\t100316\nNA\t100317\n内销\t
100318\n日瓦戈医生\t100319\n程派\t100320\n电气工程学院\t100321\n私房牛肉面\t100322\n人名\t100323\n姚老师\t100324\n野柳地质公园\t100325\nbushing\t100326\n麻衣神算子\t100327\n江淮星锐\t100328\n胸襟\t100329\n吴锡豪\t100330\n屯兵\t100331\n百度云mkv\t100332\nwax\t100333\nBSS\t100334\n诺基亚7p\t100335\n科隆岛\t100336\n陆风X8\t100337\n德叔\t100338\n王者荣耀墨子\t100339\n群怪\t100340\n拱圈\t100341\nSims\t100342\n长芽\t100343\n培林\t100344\n回车后\t100345\n构造代码块\t100346\n燕罗街道\t100347\n东瓯\t100348\n谭龙\t100349\n韩国女足\t100350\n郜林\t100351\n世茂摩天城\t100352\nProtel\t100353\n刘文清\t100354\n国政通\t100355\n灯柱\t100356\n赫利俄斯\t100357\n经典传奇\t100358\n思绪\t100359\n美丽度\t100360\n二手房买卖合同\t100361\nBEKO\t100362\n夹心蛋糕\t100363\nHatsune\t100364\n2.5g\t100365\n嵩山路\t100366\n万里云\t100367\nCC攻击\t100368\n饭量\t100369\n中国电影股份有限公司\t100370\nCOMPUTER\t100371\n澧县政府\t100372\n面无\t100373\n学前教育学\t100374\n谅山\t100375\n中国药房\t100376\n闸室\t100377\nHpv\t100378\n何有\t100379\n王卫平\t100380\n半月板损伤\t100381\n排队\t100382\n蓝牙4.0BLE\t100383\n艾德希兰\t100384\nchemist\t100385\n大寺\t100386\n美苑\t100387\n漕河泾\t100388\n蒋立章\t100389\n杨白劳\t100390\n椰子鸡\t100391\nWarships\t100392\nwxcel\t100393\n失效\t100394\n1240\t100395\n剑影\t100396\nCCTV5\t100397\n莫娣\t100398\n迅游\t100399\n变形金刚2\t100400\n加酒\t100401\n程万军\t100402\n2017年8月23日\t100403\n魔屋\t100404\n澳洲航空\t100405\n区市场监督管理局\t100406\n红马甲\t100407\n航海王之黄金城\t100408\n王焱\t100409\nsk-ii\t100410\n新海宜\t100411\n上万只\t100412\n7230\t100413\n魔兽编辑器\t100414\nqanholas\t100415\n3|3T\t100416\nSexk\t100417\n第20次\t100418\n百仕达小学\t100419\n快乐番薯\t100420\n博仁\t100421\n库布\t100422\nJScrollPane\t100423\nync\t100424\n人大代表政协\t100425\n透视器\t100426\n5句\t100427\n千秋万代\t100428\n拆迁\t100429\ntestng\t100430\n描绘\t100431\n2018—2022年\t100432\n颈椎骨\t100433\nIntergraph\t100434\n瀛湖\t100435\n4029\t100436\n布克赛尔县\t100437\n192.168.1.0\t100438\n中科院肿瘤医院\t100439\ntemptation\t100440\n童童\t100441\n8036\t100442\n久仰\t100443\n泰山路\t100444\n漫漫长夜\t100445\nwin10pe\t100446\n名胜\t100447\n北人\t100448\n凤舞\t100449\n懂法\t100450\nlaugh\t100451\nMANAGER\t100452\n纸巾架\t100453\n休息术\t100454\n卓博\t100455\nnomount\t100456\nSoundTouch\t100457\n牙虫\t100458\n活禽\t100459\n刘晓博\t1
00460\n博图v13\t100461\n上京\t100462\nSILVER\t100463\n据悉\t100464\n围棋界\t100465\n安定性\t100466\n画圣\t100467\n詹姆斯布朗特\t100468\nA5站长网\t100469\n嵊山\t100470\n积目\t100471\n抖落\t100472\n浑善达克沙地\t100473\nmicrobit\t100474\n鬼船\t100475\n梦梦奈\t100476\npvg\t100477\nopenlayers2\t100478\n245个\t100479\n砂舞\t100480\n脂板\t100481\neason\t100482\n穆满\t100483\n临渊羡鱼\t100484\n新蔡县\t100485\n碳粉\t100486\n2015年末\t100487\n爱淘宝\t100488\n发脾气\t100489\n腾讯视频mac\t100490\n木语\t100491\n考生们\t100492\nOutlook2007\t100493\n一个一个\t100494\n流放之路3.1\t100495\n喷金\t100496\nIKON\t100497\n于姬\t100498\n径山茶\t100499\n张纪中\t100500\n肉包子\t100501\nNetstat\t100502\n如此这般\t100503\n见闻\t100504\n美丝馆\t100505\n认为\t100506\n云居\t100507\nsolidworks2016\t100508\n激情燃烧的岁月\t100509\n20170612\t100510\n武打片\t100511\n边城浪子\t100512\n任弼\t100513\n品牌域名\t100514\n全球品牌网\t100515\n盐酸四环素\t100516\n玉兰路\t100517\n营养不良\t100518\n套\t100519\nseeds\t100520\n和位\t100521\n世纪华通\t100522\nmacros\t100523\n1602\t100524\nivan\t100525\n宁向东\t100526\n代入感\t100527\nBillions\t100528\nAnalog\t100529\n固定剂\t100530\n常事\t100531\n磁环\t100532\neig\t100533\nU-boot\t100534\ngulp-sass\t100535\n10.23\t100536\nOsborne\t100537\n西环路\t100538\n谮\t100539\n墨客\t100540\n章丘区\t100541\n黑锐网\t100542\n二郎腿\t100543\nV客\t100544\nPajamaCat\t100545\n来风\t100546\n龙四\t100547\npnt\t100548\nTechno\t100549\n大鳄\t100550\nideapad720s\t100551\n天峯\t100552\n老虎洞\t100553\n姐弟妹\t100554\n礼金\t100555\nS90\t100556\nwharton\t100557\n旭辉地产\t100558\n北仑港\t100559\n造口壁\t100560\n陈亚军\t100561\n国宴\t100562\n才米公社\t100563\n奇游联机宝\t100564\n横州\t100565\nxiziwang\t100566\n一起飞\t100567\n松浦弥太郎\t100568\n糖色\t100569\n誓死\t100570\ncooling\t100571\n渔竿\t100572\n卡廷\t100573\n呼伦贝尔市\t100574\n两新组织\t100575\n李慕白\t100576\n传国玉玺\t100577\n女漫\t100578\n散散心\t100579\n膜分离\t100580\n旋转点\t100581\nstrip\t100582\n中国建筑股份有限公司\t100583\n3570K\t100584\n缄\t100585\n印太\t100586\n娈童\t100587\n不要走\t100588\n第一排\t100589\nmjzl\t100590\nBAF\t100591\nw3svc\t100592\naq\t100593\n三品一标\t100594\n大林\t100595\n明前茶\t100596\nArtisan\t100597\nlijian\t100598\niMazing\t100599\nandroguard\t100600\n欲望红杏\t100601\n脚裂\t100602\nFM201
8\t100603\n理正勘察\t100604\n羊绒衫\t100605\naddition\t100606\n百奇\t100607\n枣庄职业学院\t100608\n银背\t100609\nWAF\t100610\nMSI\t100611\nWinPE\t100612\n我們\t100613\n给水管\t100614\n躯体\t100615\n链接\t100616\n鬼东西\t100617\n牡丹图\t100618\nLearning\t100619\n时代\t100620\n内外\t100621\nmyo\t100622\n中国节能在线\t100623\n醇基燃料\t100624\nhighlights\t100625\n高冈\t100626\n龚爽\t100627\n小蹄\t100628\n慰问信\t100629\n磨剂\t100630\n盾构隧道\t100631\n郭凯敏\t100632\n白鹤镇\t100633\n重庆卫视\t100634\n煲机\t100635\n高压反应釜\t100636\n首都机场集团公司\t100637\n孙佳\t100638\nbonne\t100639\n相城经济开发区\t100640\n惨淡\t100641\n泡池\t100642\npassing\t100643\nSEQ\t100644\nirrigation\t100645\n霍成君\t100646\n政治学原理\t100647\n323号\t100648\n挎子\t100649\n无奋斗\t100650\n夏氏\t100651\n603636\t100652\n平道\t100653\nCoconut\t100654\n李文华\t100655\n彩讯股份\t100656\n金士力\t100657\n涟水\t100658\n曹磊\t100659\n27本\t100660\n唐军\t100661\n沈\t100662\n南京审计学院\t100663\n电鳗\t100664\n防护套\t100665\n吞吞吐吐\t100666\n声烦\t100667\n吸痰器\t100668\n青斑\t100669\ntrails\t100670\n3d立体画\t100671\n生桩\t100672\n熟娘\t100673\nABAP\t100674\n中本聪\t100675\n翠云山\t100676\n废硫酸\t100677\n释迦牟尼\t100678\n基线\t100679\n小学片\t100680\n第15页\t100681\n张三李四\t100682\nk660d\t100683\n伯克希尔\t100684\nfork函数\t100685\n沈磊\t100686\n精细化管理\t100687\n角楼\t100688\n滴滴快车吧_\t100689\n财界\t100690\n何麦枒\t100691\n音读\t100692\n恶习\t100693\n起死\t100694\n抚顺市\t100695\n胀管\t100696\n四型\t100697\n秋霜\t100698\n塔斯马尼亚岛\t100699\nbiggest\t100700\n_阿勒泰市政府\t100701\nsolaris\t100702\n4.2.3\t100703\n观天\t100704\n智能穿戴\t100705\n入局\t100706\n精神文明\t100707\n大宗商品交易所\t100708\n147家\t100709\n3发\t100710\n千万里\t100711\n童年往事\t100712\nGoMP3\t100713\n敢为人\t100714\n无头骑士异闻录\t100715\n.1\t100716\n加拿大驻华大使馆\t100717\n钮\t100718\nNTP时间服务器\t100719\n阿奇霉素肠溶胶囊\t100720\n王敦\t100721\n铁道兵\t100722\n大洋山\t100723\n灵昆\t100724\n长安宝马x5\t100725\nvrf\t100726\nHURRICANE\t100727\n竖弯\t100728\n百发百中\t100729\n短工\t100730\n陈淑桦\t100731\n罗斯福\t100732\n荣川乃亚\t100733\nTeamViewer13\t100734\n地方版\t100735\nuart\t100736\nPurge\t100737\n邓小平文选\t100738\n氨基酸分析仪\t100739\n大左\t100740\nwangyaning\t100741\n长安CS\t100742\n全世界\t100743\n女大\t100744\n曙光花园\t100745\n三垒\t100746\n新北仑\t10074
7\n金装\t100748\n深圳活动策划公司\t100749\n1120\t100750\n百路\t100751\n武汉白癜风医院\t100752\n彩印\t100753\n勘\t100754\n苏恋雅\t100755\n影讯网\t100756\n明洞\t100757\n淮阳\t100758\nx+5\t100759\n400套\t100760\n控制理论与控制工程\t100761\n晶闸管\t100762\n肌酸\t100763\n828D\t100764\n听筒\t100765\nq友\t100766\n吹号\t100767\nRoborock|MI\t100768\n唇线笔\t100769\n汉城路\t100770\n怨恨\t100771\n进行曲\t100772\n50MM\t100773\nwin10下vmware\t100774\n陈思念\t100775\n300027\t100776\n万套\t100777\n防突\t100778\n藤月\t100779\n麦鲁\t100780\nagree\t100781\n凤在江湖\t100782\n父母亲\t100783\n天邦股份\t100784\nAGC\t100785\n神经电\t100786\n夢\t100787\n尹雪喜\t100788\n李春雨\t100789\n好_摇篮育儿百科\t100790\n德邦\t100791\n涿州\t100792\n金镝\t100793\n昆药集团\t100794\nflushing\t100795\n血行\t100796\n页页\t100797\n四新\t100798\nenabled\t100799\n保利水城\t100800\n绿地控股集团\t100801\n舒马赫\t100802\nti5\t100803\ne5.0\t100804\n下午4点\t100805\n楼上\t100806\n市委巡察办\t100807\n你的人\t100808\n抢钱\t100809\n复活石\t100810\n范\t100811\n杭者\t100812\n同盟军\t100813\n高清\t100814\n菜贩\t100815\n老胡\t100816\n猎艳江湖\t100817\n鄂尔多斯市东胜区\t100818\nRP1\t100819\n发光管\t100820\n六爻吧\t100821\n调味罐\t100822\n各执一词\t100823\n独立站\t100824\n枉凝眉\t100825\n白猫杯\t100826\nThinking_Jiang\t100827\n低碳世界\t100828\n朱海\t100829\n炼丹炉\t100830\ncxa\t100831\n高中数学函数\t100832\n不曾\t100833\n天津公安医院\t100834\n热爱\t100835\ng903\t100836\nGLOBAL\t100837\n轉發\t100838\n墨雨\t100839\n蝶衣\t100840\n诈骗犯\t100841\n江苏省气象局\t100842\n泾河\t100843\n工程师证\t100844\nYYsports\t100845\n环海南岛\t100846\n狗不理\t100847\n乌什县\t100848\n雅颂居\t100849\n注册有限责任公司\t100850\n跨境贸易\t100851\n脸部\t100852\n金手勺\t100853\n未成年人犯罪法\t100854\n朱紫\t100855\n磨砺\t100856\n崧泽大道\t100857\n固定翼\t100858\n连云港市\t100859\n吴康\t100860\nacoustic\t100861\n青岛妇女儿童医院\t100862\n150倍\t100863\n游聚网\t100864\n悠黑\t100865\njinan\t100866\nds\t100867\n中国扶贫开发协会\t100868\n安德路\t100869\n结节状\t100870\njapenese\t100871\n2018年2月19日\t100872\n平安直通保险\t100873\n漫客网\t100874\noem\t100875\nDadi家园宝\t100876\n下跃\t100877\n晓星\t100878\n薄慕颜\t100879\n种方\t100880\ndirectsound\t100881\nabel533\t100882\n1200公里\t100883\n南三龙铁路\t100884\n虎豹\t100885\n苹果派\t100886\n未来币\t100887\n余英\t100888\n鬼娃娃\t100889\n样式管理器\t100890\ncbr300r\t100891
\n工作局\t100892\n丰度\t100893\n10s\t100894\n每周一\t100895\n演绎\t100896\n虎牌\t100897\nLamer\t100898\n烫伤\t100899\n北京清华\t100900\n道法自然\t100901\n数形\t100902\n晚春\t100903\nBtmule\t100904\n速客网\t100905\n徐州医学院\t100906\n圆标\t100907\n温华\t100908\n民福康健康\t100909\n反比\t100910\n中国联合钢铁网\t100911\n投实\t100912\nsudoers\t100913\n10倍\t100914\n2018.6\t100915\n锁住\t100916\nZBRUSH\t100917\n广宁\t100918\n初可\t100919\nthread\t100920\n里昂火车站\t100921\n又一村\t100922\n排练室\t100923\nfurniture\t100924\nneuron\t100925\nQB|\t100926\n55吨\t100927\n五期\t100928\nRAL\t100929\n封闭漆\t100930\nfcf\t100931\n上海市文物局\t100932\n专门性\t100933\n12K\t100934\n入耳式耳机\t100935\n66%\t100936\n线程号\t100937\nSPIE\t100938\n门静脉高压\t100939\n婶娘\t100940\ng460\t100941\nWOE\t100942\n实事求是\t100943\n拓跋弘\t100944\n李赫\t100945\n满衣\t100946\n虫棍\t100947\nbacula\t100948\n狗叫声\t100949\n65米\t100950\n120fps\t100951\n积木宝贝\t100952\n东方电气\t100953\n画皮师-画皮师在线漫画\t100954\n柴河\t100955\n玖月\t100956\n铝板\t100957\nAppleScript\t100958\n暹罗\t100959\n赛默飞世尔\t100960\n王灵\t100961\nMOBI格式\t100962\n混子\t100963\n校区\t100964\n古滇国\t100965\n知否\t100966\n调委会\t100967\nWALL\t100968\nALIENTEK\t100969\n空瓶\t100970\n各各他\t100971\nhos\t100972\n李双林\t100973\n1支\t100974\nlbl\t100975\n近战\t100976\n攸州\t100977\n张国斌\t100978\n厦门市国土资源与房产管理局\t100979\n燕姐\t100980\nibook\t100981\niphonese2\t100982\n靠谱不【\t100983\n沈昌祥\t100984\n我的体育老师\t100985\n说长\t100986\n中华人民共和国广州海关\t100987\n独角仙\t100988\nLn\t100989\n金光布袋戏\t100990\n何沐霖\t100991\n挖宝\t100992\nYM\t100993\n安氏\t100994\n少年巴比伦\t100995\nQieTa\t100996\n深入敌后\t100997\n成都远洋太古里\t100998\n祎\t100999\n分割\t101000\n辨音\t101001\n英雄联\t101002\n春温\t101003\n容止\t101004\n昊然\t101005\n47亿\t101006\n手包\t101007\n黑白猫\t101008\nfallout\t101009\n董洁\t101010\neeuss\t101011\n灵谕\t101012\n宫锁沉香\t101013\n爆浆\t101014\n蔚蓝海岸\t101015\n华商学院\t101016\nNipple\t101017\n西门塔尔牛\t101018\n大湄公河次\t101019\n星位\t101020\n风云雄霸天下吧\t101021\ncyclone\t101022\n马蔚华\t101023\n严家新\t101024\n活力\t101025\n2018年五一节\t101026\n再有钱\t101027\n玫瑰葡萄\t101028\nrequest\t101029\n筑城\t101030\nTher\t101031\n深商\t101032\n儒藏\t101033\n深渡镇\t101034\n均质\t101035\nmagic\t101036\n湘阴县\
t101037\n改革\t101038\n金帐\t101039\n车务\t101040\n0.9.9\t101041\n拆放\t101042\n美术界\t101043\n吵架\t101044\n老山战役\t101045\n侮辱罪\t101046\naimed\t101047\n瑞雅\t101048\n天津泰达\t101049\n林辰\t101050\nResilience\t101051\nsis0001\t101052\n黄梨\t101053\nrunlevel\t101054\n明珠花园\t101055\n商业养老保险\t101056\n100刀\t101057\nwyw\t101058\n氟化\t101059\nora-01031\t101060\n北师大小学\t101061\n南充市高坪区人民政府\t101062\n0048\t101063\n问一问\t101064\n增压\t101065\nROUGE\t101066\n_山水时尚酒店\t101067\n沉迷\t101068\n硫酸铜\t101069\nsubscription\t101070\n校训\t101071\n光明网_光明数字报\t101072\n助理会计师\t101073\npressure\t101074\n炸花生米\t101075\n马尔贝克\t101076\npageoffice\t101077\n开扎\t101078\n汉江大道\t101079\nSERO\t101080\ncorrupt\t101081\n幼稚园杀手\t101082\nAyane\t101083\n哈尔滨市城乡规划局\t101084\n做网站\t101085\n网址栏\t101086\n石头儿\t101087\n龙\t101088\n/em\t101089\n中国女乒\t101090\ndn65\t101091\n法律系\t101092\nstam\t101093\n雅可比矩阵\t101094\n铝酸盐水泥\t101095\n百出\t101096\nInterstellar\t101097\n洪涛股份\t101098\njunshi\t101099\n西门豹\t101100\n新达拉然\t101101\n二边\t101102\n大塞车\t101103\n卫洗丽\t101104\n次第\t101105\n飞机网\t101106\n冷制皂\t101107\n星陨\t101108\n南长区\t101109\n昆山农商行\t101110\nFreezer\t101111\n天神\t101112\n入池\t101113\n儿童歌曲大奖赛\t101114\n平安银行信用卡中心\t101115\n移远\t101116\n安选\t101117\n涟漪\t101118\nchaoyang\t101119\n敷衍了事\t101120\nJFrame\t101121\n世界杯\t101122\n盐酸二甲双胍缓释片\t101123\n刻碟\t101124\n很纯很暧昧\t101125\n神舟战神\t101126\n休宁县\t101127\n校准品\t101128\n网易公司\t101129\n6卡\t101130\n兰田\t101131\n黄一\t101132\n金泰熙\t101133\n汪小兰\t101134\n巧变\t101135\n泥泥\t101136\n藏象\t101137\nspu\t101138\n国标号\t101139\n仙贝\t101140\n167号\t101141\n李亦非\t101142\n国土资源报网\t101143\n乐视1pro\t101144\n建库\t101145\n叶辛\t101146\n无齿锯\t101147\nlgdlol\t101148\n棕\t101149\nlsq\t101150\n朝阳寺\t101151\nswapon\t101152\n张家口市发展和改革委员会\t101153\n黑暗塔\t101154\n拉断\t101155\nkelly\t101156\n一胎\t101157\nmsvcr71.dll\t101158\n桉树\t101159\n0.12.0\t101160\n新公园\t101161\n弓箭手\t101162\n小男孩子\t101163\n白下路\t101164\n李晨\t101165\nmayor\t101166\n对流层\t101167\n堆头\t101168\n200多种\t101169\n对齐不同行\t101170\n天津京剧院\t101171\ncarsim\t101172\n静力水准仪\t101173\nx=5\t101174\n最小假\t101175\n和平大悦城\t101176\n色妹妹网\t101177\n组合技\t101
178\nJonsea\t101179\nWG\t101180\n调香\t101181\nSLB\t101182\n广阔\t101183\n节能型\t101184\n凝聚\t101185\nVicky\t101186\n九百万\t101187\n建仓期\t101188\nTop20\t101189\n彭湖湾\t101190\neStar\t101191\n炎狼\t101192\n打不动\t101193\n碗碟\t101194\nmems\t101195\npat\t101196\n极限特工3\t101197\n种植类\t101198\nAPP端\t101199\n合声\t101200\nPoop\t101201\n十字路\t101202\nassessing\t101203\n长信\t101204\n保龄\t101205\njima\t101206\n蛋黄\t101207\n血肉\t101208\nplanes\t101209\n关区\t101210\n获嘉县\t101211\ndoss\t101212\nMore\t101213\n秦晓\t101214\n1公分\t101215\n绝地求生大逃杀卡\t101216\n悲情\t101217\n天天练舞功\t101218\n台垫\t101219\n张蒙\t101220\nmaximum\t101221\n直裾\t101222\n35.0000元\t101223\n摸着\t101224\n融资\t101225\n大西洋帝国\t101226\n几条\t101227\ndb2\t101228\n拉练\t101229\n淘宝考试\t101230\n一本万利\t101231\n采招\t101232\n飨灵\t101233\nTRS\t101234\n诗会\t101235\n可可狮\t101236\npreviewer\t101237\n风舞\t101238\n嘴唇\t101239\n家蛇\t101240\n湖北教育出版社\t101241\n浮城谜事\t101242\n慢板\t101243\n江行\t101244\narcpy\t101245\n纯肉\t101246\n5722\t101247\n城北区\t101248\n350v2\t101249\nDressing\t101250\n集成显卡驱动\t101251\n彩客\t101252\n中航信息\t101253\n桐木镇\t101254\n正丙醇\t101255\n佛牙\t101256\n传奇世界手游\t101257\n水濂山\t101258\nTail\t101259\n尼康d7000\t101260\n星形细胞瘤\t101261\nuprising\t101262\n吸血贵利王\t101263\n罗马史\t101264\n210号\t101265\ngoland\t101266\n甲周疣\t101267\n松软\t101268\nokio\t101269\n指纹考勤机\t101270\n萧澜\t101271\n同程酒店\t101272\n红城\t101273\n昆山汽车站\t101274\n阳离子型\t101275\n裂罅\t101276\n清风车\t101277\nPyQt\t101278\n无菌性炎症\t101279\nhong\t101280\n纪律处分\t101281\n兰化一中\t101282\n韩方\t101283\n帕金森综合征\t101284\n营业线\t101285\n尼伯龙根\t101286\n庆元县政府\t101287\n迈阿密\t101288\n3anguo\t101289\n维苏威火山\t101290\n开门式\t101291\n汽柴\t101292\n阿影\t101293\n列标和\t101294\n阿Q\t101295\n江岸区\t101296\n恩格斯\t101297\n起亚重生完美时代\t101298\n秦越\t101299\nnuo\t101300\n张江三湘海尚\t101301\n幕词\t101302\nDaydream\t101303\n黄寺\t101304\n分室\t101305\n山西工程技术学院\t101306\nanalyst\t101307\n甘霖镇\t101308\n自修复\t101309\n岛柜\t101310\n渔港春夜\t101311\n周星\t101312\n纸片人\t101313\n奔流不息\t101314\n祁宏\t101315\nKarter\t101316\ncaltech\t101317\n中共安徽省纪律检查委员会\t101318\n沥青瓦\t101319\n99页\t101320\n黑糖群侠传\t101321\n昭通市人民政府\t101322\n住宅专项维修资金
管理办法\t101323\n下单\t101324\n8264.com_\t101325\n建设路街道\t101326\n有刷电机\t101327\n廖弟\t101328\nReason\t101329\n娇花\t101330\n异像\t101331\n陕西能源职业技术学院\t101332\n易家网\t101333\n酪蛋白磷酸肽\t101334\n鈴木心春\t101335\n贫儿\t101336\n雷楼\t101337\n泽思\t101338\n美克美家\t101339\nwin2012r2\t101340\n按期\t101341\n雪板\t101342\n透气度\t101343\n主祷文\t101344\n盐酸克林霉素\t101345\nviolates\t101346\n国家质检总局缺陷产品管理中心\t101347\n263&\t101348\n福利社\t101349\n郧阳\t101350\nP45\t101351\n楚雄市人民政府\t101352\n毛呢外套\t101353\n木子学院\t101354\n刘建东\t101355\n悠然小天\t101356\n鬼神套\t101357\nsecurity4\t101358\n台女星\t101359\nEuropa\t101360\n新木乃伊\t101361\n错错\t101362\ngsub\t101363\n沂蒙山区\t101364\n积\t101365\nrevival\t101366\nBibTeX\t101367\n学警\t101368\n0值\t101369\n3.57\t101370\n广州城建职业学院\t101371\n工程局\t101372\n内聚\t101373\n二汽\t101374\n官印\t101375\nx20\t101376\n蒙文\t101377\n越界\t101378\n烟尘\t101379\n2016电大\t101380\n合肥银行\t101381\n绿魔\t101382\n吸附剂\t101383\n音心\t101384\n曹炳琨\t101385\n力康\t101386\n第八篇\t101387\n提报\t101388\n东方丽景\t101389\nEXPDP\t101390\nshenji\t101391\n母狼\t101392\n一级消防工程师考试\t101393\n北京航空航天大学材料科学与工程学院\t101394\n84路\t101395\nctrl+shift\t101396\n蜀山剑侠传\t101397\n别搞错\t101398\n法修\t101399\n小快克\t101400\n华夏健康\t101401\n太阳纸业\t101402\n骏网一卡通\t101403\ncomprehensive\t101404\n云南人\t101405\n包射\t101406\nbate\t101407\n罗马嘉园\t101408\nJOS\t101409\n魅者\t101410\naccordion\t101411\naccp\t101412\n天上地下\t101413\n滨水区\t101414\n斯凯\t101415\nsketchp\t101416\n3小时后\t101417\n切痣\t101418\n重庆市城乡建设委员会\t101419\n同等学力法学\t101420\n粪坑\t101421\n郴州站\t101422\n光子嫩肤\t101423\n住宅公园\t101424\n瓶窑\t101425\n中通快递单号\t101426\n卡西\t101427\n几队\t101428\n非狐外传\t101429\n沙拉\t101430\n肯尼斯\t101431\nflowerplus\t101432\n2.1_\t101433\n国货\t101434\n液晶电视遥控器\t101435\n靠化缘\t101436\n刘大哥\t101437\ntucoo\t101438\n瓦坎达\t101439\n超算\t101440\n转换\t101441\nOPC\t101442\n朱九真\t101443\n机兵\t101444\n管中\t101445\n电脑包\t101446\n九校\t101447\n理解性\t101448\n纸尿裤\t101449\n卡飞\t101450\n中华民族全面抗日战争\t101451\nNERDTree\t101452\n华为mates\t101453\nall27\t101454\n电涌保护器\t101455\n姚滨\t101456\n汪军\t101457\n戴拿奥特曼\t101458\n洗钝化\t101459\nJavaCV\t101460\n广州科技贸易职业学院\t101461\n爸爸去哪儿3\t101462\nImagick\t
101463\n中国禽病网\t101464\n雪城大学\t101465\n蜜雪冰城\t101466\nlao\t101467\n操行\t101468\ngmz88\t101469\n挖出\t101470\n主力\t101471\nEUB\t101472\n电机学\t101473\n锚具\t101474\n重庆观音桥\t101475\nquic\t101476\n液力耦合器\t101477\n吴昌硕\t101478\nSLboat\t101479\n中里巴人\t101480\n【维\t101481\n大参林\t101482\n台州银行\t101483\n阴沉沉\t101484\n第九届\t101485\nlex\t101486\n半斤\t101487\n周东亮\t101488\n文言\t101489\n万科七橡墅\t101490\n沧州银行\t101491\n正弦定理\t101492\n散乱\t101493\n創造\t101494\ntoC\t101495\n工藤新一\t101496\n如晤\t101497\nMansion\t101498\n气度\t101499\n可溶性\t101500\nSII\t101501\n桉柠蒎肠溶软胶囊\t101502\n四年级\t101503\nEMUI\t101504\n子午胎\t101505\n乐视手机2\t101506\n电气\t101507\n戊戌维新运动\t101508\n1449\t101509\n国家全域旅游示范区\t101510\n总而言之\t101511\n枯木逢春\t101512\n蜜蜂窝\t101513\n枣林\t101514\nhoii\t101515\npragmatic\t101516\n冷轧钢板\t101517\n离线宝\t101518\n魏晓明\t101519\n模拟盘\t101520\n1模\t101521\n姚依林\t101522\ninnosetup\t101523\n词风\t101524\n百合竹\t101525\nworktile\t101526\n轩枫阁\t101527\n黑魔法\t101528\naxf\t101529\nnips\t101530\n传统医学\t101531\nlyr\t101532\n华北区\t101533\n肿瘤免疫治疗\t101534\n太渣\t101535\n王永胜\t101536\n失色\t101537\n马贩子\t101538\n李敏\t101539\n拙计\t101540\necache\t101541\n中国财经观察网\t101542\n数码师\t101543\nbluedroid\t101544\n空格处\t101545\n一个样\t101546\n暴雪战网\t101547\n第28页\t101548\ndake\t101549\nKORG\t101550\n发空\t101551\n隐匿性肾炎\t101552\n托卡马克\t101553\n瓦屋\t101554\n第四列\t101555\n第十九篇\t101556\n苏尔\t101557\n三千院\t101558\n学堂歌\t101559\n开点\t101560\n法水\t101561\n四月中旬\t101562\ndanji\t101563\n队员们\t101564\nie9浏览器\t101565\n新电\t101566\n集电器\t101567\n平武\t101568\nyaoshi\t101569\n茶农\t101570\n18包\t101571\ninvasive\t101572\n1968\t101573\n假证\t101574\n造形\t101575\n号牌\t101576\n新星小区\t101577\n公器\t101578\n李若嘉\t101579\n称呼\t101580\n红岩顶\t101581\n清源\t101582\n锁行\t101583\n砂子\t101584\ntbook\t101585\ntlc\t101586\n万孚生物\t101587\ntypec\t101588\n大学生村官考试\t101589\n1-11月\t101590\nshv34\t101591\narccos\t101592\n屏录\t101593\n同城会\t101594\n前片\t101595\n艾因\t101596\nAPI接口\t101597\n民盟中央\t101598\ndchao\t101599\n小樽\t101600\n君悦湾\t101601\n醋蛋液\t101602\n熊小米\t101603\nSoul\t101604\n加油金\t101605\n天猫直通车\t101606\n熊猫宝宝\t101607\nD3.JS\t101608\npyo\t101609\n
双创示范基地\t101610\n断卦\t101611\nfgogo\t101612\n沙井新区\t101613\n多层级\t101614\n投资收益\t101615\n5.7米\t101616\n被点\t101617\n托帕石\t101618\n张威\t101619\n发慌\t101620\nmatical\t101621\nzhuce\t101622\n学生运动会\t101623\n肚\t101624\n0081\t101625\n姜水\t101626\n农民合作社示范社\t101627\n香身\t101628\n急性腹泻\t101629\n大海网\t101630\n黑色\t101631\n1597\t101632\n萨瓦纳\t101633\n00:00:00\t101634\n刘辩\t101635\nPublish\t101636\n丹心\t101637\n优酷PC\t101638\n终级\t101639\n腾祥\t101640\n汽车投诉网\t101641\n挂逼\t101642\n倾注\t101643\n潘德拉贡\t101644\npresonus\t101645\n藤桥\t101646\n基本每股收益\t101647\n阳光化\t101648\nHSR\t101649\n兰若寺\t101650\n武阳镇\t101651\n黄亚洲\t101652\nswp\t101653\n光新路\t101654\n100条\t101655\nBread\t101656\n米内网\t101657\nALONE\t101658\n维纳滤波\t101659\n李大奔\t101660\n2016年国庆节\t101661\n会否\t101662\nbbd\t101663\n萤石云视频监控\t101664\n肖海洋\t101665\n22.9\t101666\n红十字会\t101667\nruning\t101668\nx5v\t101669\nAbercrombie\t101670\n爱莫\t101671\nAlbum\t101672\n家长\t101673\n穿行测试\t101674\n蔚州\t101675\n3899\t101676\n森林法\t101677\n保安公司\t101678\n新农村商网\t101679\n遥不可及的你\t101680\npro2017\t101681\n苗\t101682\nJiemian\t101683\n彩虹小马\t101684\n64P\t101685\nJSP编程\t101686\n高等级\t101687\n功利主义\t101688\n粉水\t101689\n艾儿\t101690\n基本完成\t101691\n小湾\t101692\n10一\t101693\n密固达\t101694\n校安\t101695\n忽而今夏\t101696\n明太鱼\t101697\n对打\t101698\n一些点\t101699\n马波\t101700\n睡姿\t101701\n词数\t101702\n_马蜂窝\t101703\n御史大夫\t101704\n孽海花\t101705\n第3辑\t101706\n天天果园\t101707\n郭mini\t101708\n积雪草\t101709\nTAMRON\t101710\n西安职业技术学院\t101711\n胸部\t101712\n泥洼\t101713\n莱州政府网\t101714\niko\t101715\n莱芜市\t101716\n网游加速器\t101717\n生活源\t101718\n240_\t101719\nDegree\t101720\n莆田市国土资源局\t101721\n菜鸟级\t101722\n张万年\t101723\n电离平衡常数\t101724\npku\t101725\n魔海\t101726\n莱茨狗\t101727\n美颜相机\t101728\n八下\t101729\n蕲春\t101730\n神作\t101731\nLAZADA\t101732\n二饼\t101733\n相吻合\t101734\n1.8\t101735\n圣装\t101736\ntravels\t101737\n猪邦\t101738\n艾弗里\t101739\n老班\t101740\n沙滩裙\t101741\n博洛尼\t101742\n20141017\t101743\n质押品\t101744\n英派斯\t101745\n徐特\t101746\n这段路\t101747\n依法治国\t101748\n4399西普大陆\t101749\n北京市西城区人民政府\t101750\n新式\t101751\n轮暴\t101752\n肯德基宅急送\t101753\n高低压配电柜\t101754\n
sqlsever\t101755\n四川省政府国有资产监督管理委员会\t101756\n三游洞\t101757\n28位\t101758\n0.2.1\t101759\n稚嫩\t101760\nv1.0.0_\t101761\n朴成\t101762\nCMB\t101763\n荣耀圈\t101764\n苏财规\t101765\nVideoPlayer\t101766\n半规管\t101767\n风光片\t101768\n沧州_中国沧州政府\t101769\n喜洲古镇\t101770\n碳性\t101771\n鲁友\t101772\n和昌\t101773\nElena\t101774\n徐州博物馆\t101775\n图哈特\t101776\nFEKO\t101777\ngzinflate\t101778\nIntercept\t101779\nmoonvan\t101780\n过继\t101781\n泥量\t101782\n管理岗\t101783\n换胎\t101784\n周新城\t101785\n文金\t101786\n仲恺大道\t101787\n固液分离机\t101788\ndain\t101789\n对对\t101790\n我乐橱柜\t101791\n幻梦战记\t101792\ngetty\t101793\n地穴\t101794\n龙阳路\t101795\n以太坊智能合约\t101796\n丰田普拉多\t101797\n典点\t101798\n荣格工业资源网\t101799\n枕芯\t101800\ntextbook\t101801\n摄氏\t101802\n金川集团股份有限公司\t101803\n选率\t101804\n百度云盘/BT\t101805\n搜狗阅读\t101806\nwrod\t101807\n3334\t101808\n诺恩\t101809\njjy\t101810\n川剧\t101811\n杨安迪\t101812\n菜市口\t101813\nsysmac\t101814\n卢森\t101815\n王斌\t101816\n得失\t101817\n图展\t101818\n泰安方特\t101819\n郑福坤\t101820\n平安易贷\t101821\nt42\t101822\n巡弋\t101823\n主审\t101824\n100双\t101825\n弘宇\t101826\n班尼特福迪\t101827\nBo\t101828\n捞月狗痞子狼\t101829\n第四军医大学口腔医院\t101830\n郝吉林\t101831\nStudio3\t101832\n虚脱\t101833\n郭可盈\t101834\ngvs\t101835\ndon\t101836\nelectronics\t101837\n苏州汽车站\t101838\n微型\t101839\n大战魔帅_逍遥兵王\t101840\n600352\t101841\n同款\t101842\n1.7GB\t101843\n暧昧大明文魁\t101844\nuncaged\t101845\n声功率\t101846\n四川省农村信用社\t101847\n闪腰\t101848\n银条\t101849\n尾水\t101850\n罗海琼\t101851\n报任\t101852\n直行车\t101853\n甘草酸二铵肠溶胶囊\t101854\n长泽苿里奈\t101855\ni7-6700HQ\t101856\n抢筹\t101857\n鲍尔默\t101858\nicacls\t101859\n英智\t101860\nmyBatis\t101861\n开辟\t101862\n版材\t101863\n396号\t101864\n大宁店\t101865\n香港大学深圳医院\t101866\nincreasingly\t101867\n各类型\t101868\n签到\t101869\nliulanqi\t101870\nword2010\t101871\n消费型重疾险\t101872\nphone\t101873\n旱半夏\t101874\n金慧\t101875\nfateapocrypha\t101876\namiami\t101877\n升龙天汇\t101878\n才是\t101879\nGBA\t101880\n型谱\t101881\n戰甲\t101882\n出单\t101883\n神伤\t101884\nAwesomes\t101885\n福建法院网\t101886\n金缕梅\t101887\n规上\t101888\nbz\t101889\n菊次郎\t101890\njuery\t101891\n乘警\t101892\n决胜负\t101893\n治党\t101894\n解
放鞋\t101895\n香织\t101896\n20160411\t101897\n南阳高新区\t101898\nFM2016\t101899\n司美琴\t101900\n聚英考研网\t101901\n象山区\t101902\n名侦探柯南剧场版\t101903\n祭父\t101904\n高速\t101905\n内存条\t101906\n冷清秋\t101907\n比亚迪F3\t101908\n三筒烘干机\t101909\n潮霉素\t101910\n皖北地区\t101911\n客诉\t101912\n西安西\t101913\n真人密室逃脱\t101914\n参考文档\t101915\n国家励志奖学金\t101916\n帅哥\t101917\n优利特\t101918\n绗\t101919\n官图\t101920\n桑代克\t101921\n阿特兹论坛_汽车之家论坛\t101922\n税调\t101923\ncbd\t101924\nwin7虚拟内存\t101925\n更生\t101926\n时光荏苒\t101927\n搞不好\t101928\n赢商\t101929\n黄河农村商业银行\t101930\nmfa\t101931\n平儿\t101932\nF91\t101933\nv2.0.1\t101934\n000503\t101935\n结案\t101936\n查理九世\t101937\n锁匙\t101938\n混乱\t101939\n歇着\t101940\n交纳\t101941\n人社厅\t101942\nNewsroom\t101943\n冷鲜\t101944\n彪悍\t101945\n瑞熙墅\t101946\n再开\t101947\ngoodwill\t101948\n西游神武3\t101949\n起点中文小说网\t101950\n曲洋\t101951\n太平村\t101952\n不为谁而作的歌\t101953\n参评\t101954\n应答率\t101955\n民生观\t101956\n必发财富网\t101957\n刘远\t101958\naws\t101959\nhfs\t101960\n戬杰\t101961\n状语\t101962\n琶醍\t101963\n休闲\t101964\n24升\t101965\n凉意\t101966\n导流管\t101967\n鹅湖镇\t101968\n迪莫\t101969\n退课\t101970\n九浅一深\t101971\n《文明\t101972\n脱附\t101973\n长册\t101974\n120Hz\t101975\n财务管理_经管营销\t101976\n库车县\t101977\ncadtools\t101978\nwin732位\t101979\n李某某\t101980\n守望者\t101981\n撷英\t101982\nnorthface\t101983\n化学工业出版社\t101984\n金义\t101985\n吴晓滨\t101986\n德淘网\t101987\nindustrial\t101988\n陈冬\t101989\n阿里研究院\t101990\nH5|\t101991\nCX\t101992\n小巨人\t101993\n结束了\t101994\n难封\t101995\n重庆大学虎溪校区\t101996\nfindbug\t101997\n土路\t101998\n伊吹五月\t101999\n太爷\t102000\n抽乐\t102001\n嘉和一品\t102002\nequality\t102003\n联航路\t102004\n资产经营有限公司\t102005\n楚玉\t102006\n易掌柜\t102007\nGENESIS\t102008\nQQ问候表情_窝窝QQ网\t102009\n京韵\t102010\n有利\t102011\n吉的堡幼儿园\t102012\noringin\t102013\nExtender\t102014\n绵阳站\t102015\n宁夏职业技术学院\t102016\nllss\t102017\nwelding\t102018\n易地建设费\t102019\nBuildIt\t102020\n41岁\t102021\n江流\t102022\nElementUI\t102023\n凌志军\t102024\n烟标\t102025\n伍嘉成\t102026\n0595\t102027\n小鸡鸡\t102028\nFlex\t102029\n应届生\t102030\n环球教育\t102031\n团体意外险\t102032\n柔宇\t102033\norg.slf4j\t102034\n四川省肿瘤医院\t102035\n800G\t102036\n手
夹\t102037\nevans\t102038\n湖北省公安厅出入境办证\t102039\n独当一面\t102040\nSteemit\t102041\n临沂机场\t102042\n克林特·伊斯特伍德\t102043\n再教育\t102044\n粗暴\t102045\n洞爷湖\t102046\nFeeds\t102047\n前进街\t102048\n德尔地板\t102049\n黑暗与光明\t102050\n联合国维和部队\t102051\n塑管\t102052\nluos\t102053\n乐驰\t102054\n第一单\t102055\n王家新\t102056\nc2cc\t102057\n杀星\t102058\n中国农业银行股份有限公司\t102059\n空长\t102060\n交通路\t102061\n高山青\t102062\n农肥\t102063\n五金展\t102064\n武汉市中医医院\t102065\n百度莱茨狗\t102066\n暧昧关系\t102067\n更强大\t102068\n倪家桥\t102069\n铜杆\t102070\nrecursion\t102071\n中国内部审计协会\t102072\n春夏秋冬又一春\t102073\n小确幸\t102074\nExperts\t102075\n乒乓球\t102076\n排箫\t102077\n北极星电力\t102078\nplated\t102079\n人身价\t102080\npxw\t102081\n写块\t102082\n国际红十字会\t102083\n铃木爱理\t102084\nv22\t102085\n一脸\t102086\n暗黑二\t102087\n旻子\t102088\n程子华\t102089\n食人族\t102090\n8b\t102091\n孕妇\t102092\n偷影子的人\t102093\n16袋\t102094\n_菜鸟\t102095\nMNC\t102096\nUkulele\t102097\n洛特\t102098\n严飞\t102099\n竿\t102100\n风起长林\t102101\n桂平西山\t102102\n圆锥体\t102103\n长沙市城乡规划局\t102104\n痴情\t102105\n1.6升\t102106\n五菱宏光论坛_汽车之家论坛\t102107\n四元桥\t102108\n飞豆\t102109\nq币\t102110\n重阳\t102111\n2.5v\t102112\n九头虫\t102113\n华夏信财\t102114\n和坤\t102115\n延期\t102116\n危险源辨识\t102117\n平卧\t102118\n鱼疗\t102119\ncleared\t102120\n160万元\t102121\n包面\t102122\n基德\t102123\n10门\t102124\nquantitative\t102125\nNig\t102126\nRue\t102127\n数控钻床\t102128\n6570\t102129\n10.1.0.7224\t102130\ncauses\t102131\nadx\t102132\n粤式\t102133\n热辣辣\t102134\n小米Note/顶配_MIUI\t102135\nPVT\t102136\n2013年度\t102137\ntablet2\t102138\n上确界\t102139\n贝贝\t102140\nCIIA\t102141\n线程同步\t102142\nElevation\t102143\n重庆大学图书馆\t102144\n桶装水_列表网\t102145\n迷人的保姆\t102146\n崩瓷\t102147\nE42\t102148\n巴中市政府\t102149\n塔沟武校\t102150\nnotify\t102151\nToolbar\t102152\nSELinux\t102153\n南京师范大学商学院\t102154\n27歳\t102155\n地缸\t102156\n绍兴公安\t102157\n桑园\t102158\n中山市人民政府\t102159\n针式\t102160\nzei\t102161\n林淑容\t102162\n18135\t102163\n榆林新闻网\t102164\n朱以萍\t102165\nv1.0.3\t102166\n音频播放器\t102167\noyster\t102168\n飞刀问情\t102169\n杭州西站\t102170\n上海通用汽车\t102171\nDeath\t102172\n0.51\t102173\n冰晶画\t102174\nnuke\t102175\nbetheme\t1021
76\n房补\t102177\n通村\t102178\n负于\t102179\n穿越之还珠风流\t102180\n鲁迅文学奖\t102181\n阵眼\t102182\n音韵\t102183\n进城\t102184\n发根\t102185\n曾培炎\t102186\n冲销\t102187\n碾碎\t102188\nConda\t102189\n挺起\t102190\n会计核算办法\t102191\n口诉\t102192\nstarring\t102193\n奔驰s400\t102194\n作文格\t102195\n撰\t102196\n三十年后\t102197\n第40条\t102198\n下滑\t102199\n莅\t102200\n卷四\t102201\nTogether3\t102202\njaro\t102203\n庾\t102204\n导条\t102205\n猫包\t102206\n2015.4\t102207\n控制性详规\t102208\n71分\t102209\n悬崖式\t102210\nandrid\t102211\n专营\t102212\n周而复始\t102213\n女犯人\t102214\n胸腔镜手术\t102215\n缸体\t102216\n片石\t102217\n佛山\t102218\n张翔\t102219\n80g\t102220\n二级注册建造师\t102221\n言官\t102222\n秦必楚\t102223\n景荣\t102224\n呤\t102225\n论选\t102226\n千古情\t102227\nHelene\t102228\n引人注目\t102229\n煤炉\t102230\n破血\t102231\ndisplays\t102232\n骨髓穿刺\t102233\n柒\t102234\n严打\t102235\n猎虎\t102236\n35栋\t102237\n球场\t102238\nwinre\t102239\n六偏磷酸钠\t102240\n粗纱机\t102241\n25本\t102242\n楓\t102243\n卷管机\t102244\n浮池\t102245\n波斯丽尔\t102246\n定性研究\t102247\n0080\t102248\nRF射频\t102249\n图语\t102250\n分库分表\t102251\n世纪金榜\t102252\n投标价\t102253\njackluo\t102254\n九维\t102255\n思客\t102256\n不可或缺\t102257\nCNI\t102258\n鹤鸣山\t102259\nTP-LINK路由器\t102260\n2018个\t102261\nevhttp\t102262\nm118w\t102263\n执行局\t102264\n屈身\t102265\n津港\t102266\n1000升\t102267\n法眼\t102268\n沿革\t102269\nword/excel\t102270\n22项\t102271\n肉饼\t102272\n医政\t102273\n追番\t102274\n诗品\t102275\n途睿\t102276\n江晨\t102277\n敬业福\t102278\n几何布朗运动\t102279\n黑龙眼\t102280\n心部\t102281\n莎蔓莉莎\t102282\n十堰人大网\t102283\n设值\t102284\nwin8.1\t102285\n1.67\t102286\n广西糖网\t102287\ncmtj\t102288\nVG\t102289\n报刊文\t102290\n6股\t102291\niso镜像\t102292\n拉趣网\t102293\n牛油果油\t102294\ntiguan\t102295\n安东\t102296\n高斯贝尔\t102297\nLKP\t102298\n盒里\t102299\n离婚律师\t102300\n树叶\t102301\n风动\t102302\n单晶冰糖\t102303\n4平方米\t102304\n全自动打包机\t102305\n蒜\t102306\n韩式\t102307\nPLC\t102308\n龙水\t102309\n环状\t102310\n下周四\t102311\nMini\t102312\n多台路由器\t102313\nTuna\t102314\n日语论文网\t102315\n广告帽\t102316\n冬青\t102317\n10.03\t102318\n林内燃气\t102319\n边\t102320\ndostyle\t102321\n74年\t102322\n五一数学\t102323\n北京汽车集团有限公司\t102324\ncd套\t
102325\n写者\t102326\n今日关注\t102327\n002型\t102328\nDatePicker\t102329\nStardock\t102330\n交关\t102331\nyykit\t102332\n教技\t102333\n最近一周\t102334\nsave函数\t102335\ncondensation\t102336\n意大利共和国\t102337\n玩什么\t102338\n举列\t102339\n信史\t102340\n握手包\t102341\n青海盐湖\t102342\nP6\t102343\n咸水歌\t102344\n伙拼\t102345\n员岗\t102346\n详介\t102347\n光帆科技\t102348\nlivecd\t102349\nKONAMI\t102350\nDBC2000\t102351\n菜花节\t102352\n地税申报表\t102353\n晶报\t102354\n独立悬架\t102355\n安格尔\t102356\n转过头\t102357\n预控\t102358\n山门镇\t102359\n智联招聘\t102360\n筼筜湖\t102361\n香港奇案之强奸\t102362\nKYB\t102363\n光伏們\t102364\nDML\t102365\n太子丹\t102366\n极客公园\t102367\nMoban\t102368\n群法\t102369\n现代农业示范区\t102370\n捕快\t102371\nWCM\t102372\nAnsel\t102373\n1262\t102374\n中共浙江省委办公厅\t102375\n内层\t102376\n1679\t102377\n不防水\t102378\n第56\t102379\nWinWebMail\t102380\n生产单\t102381\n醋酸酐\t102382\n95后\t102383\n上颌\t102384\n味儿\t102385\nrainy\t102386\nled电子屏\t102387\n降龙伏虎小济公\t102388\n山曲\t102389\n戴萌\t102390\nNVR\t102391\n鼻前庭炎\t102392\nUSBkey\t102393\nS1线\t102394\nTO-220\t102395\n凯氏\t102396\n饶河\t102397\n阿里斯顿\t102398\n鼠来宝3\t102399\n14周年\t102400\n嘉兴南湖革命纪念馆\t102401\n羊肝\t102402\nSTM32F1\t102403\n纺车\t102404\n大华兴寺\t102405\n刘建华\t102406\nVI模板\t102407\n百度医生\t102408\n宏昌\t102409\n内交\t102410\n上海行健职业学院\t102411\n锁线\t102412\n照单\t102413\nMSDS\t102414\n野球\t102415\n67度\t102416\n大年初二\t102417\n蓝莓爆珠\t102418\n出格\t102419\n枕巾\t102420\n察布查尔\t102421\n民商\t102422\nat\t102423\n180401期\t102424\n中海华庭\t102425\n量体\t102426\n莫凡\t102427\n何冰\t102428\n给自己的歌\t102429\n南汇区\t102430\n辛西娅\t102431\nRicky\t102432\nVMFS\t102433\n中国通用技术集团\t102434\n华为荣耀4x\t102435\n西红柿蛋汤\t102436\n美国\t102437\n墙画\t102438\nps2017\t102439\nRoboMaster\t102440\n远峰\t102441\n社会契约论\t102442\nAlberto\t102443\n江泽民\t102444\nJOE\t102445\n黑鲷\t102446\nBespoke\t102447\n甩手\t102448\n农业\t102449\nminikube\t102450\n7}\t102451\n鱼豆腐\t102452\n16根\t102453\n紫阳县\t102454\n镌\t102455\n库存商品\t102456\n织体\t102457\n周俊伟\t102458\n筒子\t102459\n大动脉炎\t102460\n分句\t102461\n海花岛\t102462\n生花生\t102463\n长焦镜头\t102464\n合适用\t102465\n桐君堂\t102466\n回正\t102467\n红布林\t102468\n难宠\t102469\n眼膏\
t102470\n任何\t102471\nrentiyishu\t102472\n麦凯乐\t102473\n超高强\t102474\n挑染\t102475\n账表\t102476\n展陈\t102477\n导热油泵\t102478\n平齐\t102479\n贾老师\t102480\nHuntington\t102481\n主升浪\t102482\nangelbaby\t102483\n紫荆花\t102484\n天安门\t102485\n求贤若渴\t102486\n八百米\t102487\n玻璃幕\t102488\n蛋堡\t102489\nf-81981\t102490\n窜\t102491\n坂上\t102492\n可调电位器\t102493\n法索\t102494\n长岭镇\t102495\nWoody\t102496\n性交\t102497\n汗疱疹\t102498\n热桥\t102499\n全民突击\t102500\n黑郁金香\t102501\nMilfs\t102502\n胆碱能性荨麻疹\t102503\nsprinkle\t102504\n骁龙800\t102505\nE级\t102506\n曾国潘\t102507\nILO\t102508\n电竞椅\t102509\n莉莉娜\t102510\npfx\t102511\n波斯顿\t102512\nroad\t102513\n童瑶\t102514\n原初\t102515\n翠屏\t102516\n奥迪a1\t102517\n屈\t102518\n掌舵者\t102519\ncondo\t102520\ndaddy\t102521\nT3T4\t102522\nkhda\t102523\nFixing\t102524\nSUITE\t102525\nGhost版\t102526\nTantrums\t102527\n神彩\t102528\nint8\t102529\n双鱼\t102530\n黎耀辉\t102531\n流动车\t102532\n贷款还款\t102533\n绞死\t102534\n单相电能表\t102535\n讲习\t102536\n东方汽车\t102537\nBaldwin\t102538\n刘圣妮\t102539\n吾优网\t102540\n做用\t102541\n后果\t102542\nFaceNet\t102543\n天欲\t102544\n正月初七\t102545\n是你\t102546\n九儿\t102547\nhao123浏览器\t102548\n一百多万\t102549\n2003年12月27日\t102550\naidl\t102551\nitalia\t102552\n巨大化\t102553\n髓内钉\t102554\nHealthy\t102555\n取位\t102556\n2.10.0\t102557\n阵痛期\t102558\n味核苷酸二钠\t102559\n宏发股份\t102560\n电锅\t102561\n上海妈妈网\t102562\n水袖\t102563\n花屏\t102564\n四集\t102565\n保温罩\t102566\n脱俗\t102567\n线条\t102568\n瑞莱\t102569\n竖新镇\t102570\n山口玲子\t102571\n影库\t102572\n沈阳师范大学\t102573\n手动打包机\t102574\n海鸥表\t102575\n维兰德\t102576\n微擎微赞\t102577\npayable\t102578\n乐金健康\t102579\nmmc\t102580\n数域\t102581\n侯佩岑\t102582\n堀内秋美\t102583\n继父\t102584\n华哥仔\t102585\n乐聘网\t102586\n慢车\t102587\nse3\t102588\n啄木鸟经典Pornochic\t102589\nwinserver2003\t102590\n那么远\t102591\n濑尿虾\t102592\nov7670\t102593\n土易网\t102594\nmkfifo\t102595\n长隆\t102596\n同窗\t102597\n碉\t102598\n大海\t102599\n430元\t102600\n白开\t102601\n抗\t102602\n李春霞\t102603\n叶盛\t102604\n敬辞\t102605\n孙媛媛\t102606\n降音\t102607\nMX300\t102608\n九型人格\t102609\n汤碗\t102610\ncgal\t102611\nuna\t102612\n齐APP正版软件\t102613\n交警支\t102614\n重弩\t1026
15\n文化传媒公司\t102616\n三相电表\t102617\n三月街\t102618\n东方航空公司\t102619\n水无月\t102620\n风幻\t102621\n镂\t102622\n姜皓文\t102623\n寒腿\t102624\n蚁群算法\t102625\n罐底\t102626\n腿窝\t102627\n2.5i\t102628\n放开手\t102629\n粉画\t102630\n副把\t102631\n数卷\t102632\n601933\t102633\n第三十次\t102634\n中档酒店\t102635\n数采仪\t102636\nRFID\t102637\n5A级旅游景区\t102638\n分子科学学院\t102639\ncryptojs\t102640\n声振论坛\t102641\n滴定仪\t102642\nElectrochemical\t102643\nholm\t102644\nhulk\t102645\nextents\t102646\n白嫖\t102647\n来访\t102648\n保释金\t102649\n仿石砖\t102650\n分身术\t102651\n第四十三\t102652\n抽奖箱\t102653\n托福机经\t102654\n抗原\t102655\n留校\t102656\n快消\t102657\n青岛11号线\t102658\n蒋梦麟\t102659\n沟渠\t102660\n王建荣\t102661\n白云新城\t102662\n金地花园\t102663\nNPO\t102664\n癜痫\t102665\n样本方差\t102666\nnavmesh\t102667\n兴隆台区\t102668\n掠过\t102669\n薄荷草\t102670\n49届\t102671\nyuma\t102672\n勾魂\t102673\n火车头采集器\t102674\n_马自达\t102675\n门生\t102676\n9688\t102677\n晨夕\t102678\nsuhr\t102679\n各行各业\t102680\n蜜汁\t102681\n石狮\t102682\n东临新区\t102683\n4月20日起\t102684\nf603\t102685\nsurfacepr\t102686\n圣契\t102687\n美国纽约大学\t102688\n台湾医院\t102689\n铁山坪\t102690\nbowtie2\t102691\nNES模拟器\t102692\n鳌鱼\t102693\n300期\t102694\n张星\t102695\n社保一卡通\t102696\n人称代词\t102697\n六榕寺\t102698\n典雅型\t102699\n新华西路\t102700\n酒店式公寓\t102701\nwago\t102702\n面荷载\t102703\n中都\t102704\n锄\t102705\n低渗\t102706\n航天桥\t102707\n对花枪\t102708\n络石\t102709\n心飞\t102710\n袁姗姗\t102711\nhellman\t102712\nmich\t102713\n滁州职业技术学院\t102714\n柔丝\t102715\n记录袋\t102716\njnlp\t102717\n中国人民解放军第306医院\t102718\n诞辰日\t102719\n高尔斯华绥\t102720\nwef\t102721\n水树\t102722\n蓝天云\t102723\n315维权网\t102724\n混合器\t102725\n点烟\t102726\n赛迦奥特曼\t102727\n列侯\t102728\n闭嘴\t102729\n飞雪连天射白鹿\t102730\n小无相功\t102731\n立贴\t102732\n_书法爱\t102733\n低压槽\t102734\n滤缸\t102735\n二十四式太极拳\t102736\n保时捷卡曼\t102737\n2456\t102738\n1073741819\t102739\n消防炮\t102740\n热气球\t102741\n金狮广场\t102742\nreaver\t102743\n卡迪夫城\t102744\n跌落\t102745\n腾讯广点通\t102746\n郑莉\t102747\n香雪兰\t102748\n沙太南路\t102749\n冠位\t102750\n3.5.2\t102751\n女教授\t102752\n民事上诉状\t102753\n怪婴\t102754\n花车\t102755\n求像\t102756\nrainfall\t102757\n魏明伦\t102758\n酱缸\t102759\n槐树\t102
760\n上衣秀胸\t102761\nUNICORN\t102762\n不闪\t102763\n练口语\t102764\nirq\t102765\n颜颜\t102766\n呼包鄂\t102767\n海蒂·拉玛\t102768\n清香\t102769\n相关类\t102770\n勇者斗恶龙怪兽篇2\t102771\n共和国之辉\t102772\n鳄龟\t102773\n南京银行股份有限公司\t102774\n宁佳怡\t102775\nsnnu\t102776\n心悦3\t102777\n李小林\t102778\n慢性骨髓炎\t102779\n闭气\t102780\n规律性\t102781\n独上西楼\t102782\nefficacy\t102783\n告发\t102784\n小乙\t102785\n影子武士2\t102786\n收群\t102787\n展现\t102788\n1000万元\t102789\n代款\t102790\n400a\t102791\n保险代理人资格考试\t102792\n蒲公英\t102793\ncuteftp\t102794\n闽台\t102795\n隋俊波\t102796\n源文\t102797\n李多海\t102798\n有志气\t102799\n九二年\t102800\n张力\t102801\n小松丸\t102802\n悠然见\t102803\n疏散门\t102804\n优先编码器\t102805\n逃脱\t102806\n樱桃树\t102807\n蓝釉\t102808\n鬼臼丸\t102809\nHurley\t102810\n放租\t102811\n无菌室\t102812\n纸托\t102813\nDP\t102814\n昌利混凝土搅拌站公司\t102815\n食品\t102816\nyifang\t102817\n二调\t102818\nKOSE\t102819\n养生学\t102820\n陪陪\t102821\n象神\t102822\n雾化器\t102823\n湖北省人才中心\t102824\n红外热像仪\t102825\n凤凰国际书城\t102826\n列车长\t102827\n神印王座\t102828\n百度云搜索-坑搜网\t102829\n胡瓜\t102830\n易房\t102831\n灰质\t102832\n吉星\t102833\n盗版软件\t102834\n亚讯车网\t102835\n30几岁\t102836\n宏观经济网\t102837\n8.0分\t102838\n子站\t102839\n舍利子\t102840\n秘旨\t102841\n九毛九\t102842\n咳血\t102843\n梁宏\t102844\n成日\t102845\n自播\t102846\n欺诈者\t102847\n医毒双绝\t102848\n优托邦\t102849\n压痛感\t102850\n办公用品店\t102851\n青岛经济技术开发区\t102852\n日清\t102853\n輪\t102854\ntory\t102855\nCha\t102856\n人柱\t102857\nshenyang\t102858\n程_\t102859\n给力币\t102860\n繁体字\t102861\n语法分析器\t102862\n巨细胞病毒感染\t102863\n第六人民医院\t102864\n三八女性网\t102865\n奥里萨邦\t102866\n阅览\t102867\n青岛搜狐\t102868\n途安L论坛\t102869\n宣布\t102870\n24公里\t102871\n90号\t102872\n班婕妤\t102873\n新竹路\t102874\n凌晨\t102875\ndiv层\t102876\n日喀则地区\t102877\nexpresscache\t102878\nserer\t102879\n文冲\t102880\nVisualizing\t102881\n转过身\t102882\n18048\t102883\n神波\t102884\n静流\t102885\n北京114生活网\t102886\n印发\t102887\n至诚财经\t102888\nmorality\t102889\n铁氰化钾\t102890\nfei\t102891\noccupation\t102892\n北京舞蹈学院附中\t102893\n孤岛惊魂5末日储物箱\t102894\n蒋巷村\t102895\n中上\t102896\n轮状病毒\t102897\n胶印机\t102898\n马兰花开\t102899\n酒桌\t102900\nCobalt\t102901\n门外\t102902\n副会\t102903\n秦学\t1029
04\n滑跪\t102905\n平安医疗\t102906\n叠氮化钠\t102907\n86\t102908\n魅惑\t102909\n赵尚志\t102910\n汤姆历险记\t102911\n针灸推拿科\t102912\n如影随行\t102913\n关谷神奇\t102914\n看了看\t102915\n猛烈\t102916\n魔发精灵\t102917\nf321\t102918\n时尚度\t102919\n木桨\t102920\n做守\t102921\n塔斯马尼亚州\t102922\n歼灭战\t102923\n陈先生\t102924\n麻辣粉\t102925\n幼专\t102926\n中铁奥维尔\t102927\n亨瑞\t102928\n陆羽\t102929\n投机者\t102930\nmetabase\t102931\n美咲结衣\t102932\n肖磊\t102933\n跨境电子商务\t102934\n跟帖\t102935\n黎明杀机屠夫\t102936\n4类\t102937\n空泡\t102938\n买卖\t102939\n贵人鸟\t102940\n上吊\t102941\n若羌\t102942\n黄阁\t102943\n7208\t102944\n荧光蛋白\t102945\n安防监控吧\t102946\nstrtus\t102947\n安全生产许可证条例\t102948\n王文静\t102949\n亲哥\t102950\n甄姐姐\t102951\n查理芒格\t102952\nKubeadm\t102953\n乐普医疗\t102954\n四股辫\t102955\n五步曲\t102956\nbl漫画\t102957\nprj\t102958\n号头\t102959\n滥权\t102960\nencounters\t102961\n佛历\t102962\n刚泰控股\t102963\n沫若\t102964\nDOT4\t102965\n需求池\t102966\n寻甸县\t102967\nPOWER\t102968\n十二经脉\t102969\n清华同方锋锐\t102970\n淘淘商城\t102971\n黑魔导\t102972\n众生药业\t102973\n吕小布\t102974\nKindle\t102975\n南岛\t102976\n第001段\t102977\n20180126\t102978\n山东黄金集团有限公司\t102979\n15mg\t102980\n9800\t102981\n沥干\t102982\nd座\t102983\n医典\t102984\n上抛\t102985\n广电兰亭盛荟\t102986\n三种方\t102987\n地膜\t102988\n汉画\t102989\n邦普\t102990\n往日\t102991\nsrf\t102992\n樱花忍法帖\t102993\n火影忍者究极风暴4\t102994\n一刀切\t102995\n海绵宝宝历险记\t102996\nMapperS\t102997\n天恒\t102998\n王城公园\t102999\n0503\t103000\n先锋AV\t103001\n浓淡\t103002\n虎人\t103003\n东方今典\t103004\n红楼遗梦\t103005\n锦江国际\t103006\n芯片\t103007\n制板\t103008\n卫星\t103009\n加壳\t103010\nBaseball\t103011\n枕秃\t103012\n平安村\t103013\n战姬绝唱\t103014\n萌仔\t103015\n花色苷\t103016\n农垦\t103017\ndatabases\t103018\n大数定律\t103019\n徐皓峰\t103020\nclp\t103021\n低钙血症\t103022\n从善如流\t103023\n拉杜蓝乔\t103024\nconductivity\t103025\n免刑\t103026\n柯蓝碧昂丝\t103027\n奔走相告\t103028\n中堂镇\t103029\n东胜区\t103030\n拱门\t103031\n3分钟\t103032\n板夹\t103033\n穆铁柱\t103034\n比尔吉沃特\t103035\n牧马奥迪s8\t103036\n河原崎\t103037\n三帆中学\t103038\n卡文\t103039\n最后一个\t103040\nxml格式\t103041\n橡胶条\t103042\n育儿百科\t103043\n先天性行为\t103044\n荣宅\t103045\n糖果机\t103046\n花千张三丰异界游\t103047\nIE8.0\t103048\n暴雪将至\t103049\ng
ly\t103050\nmySQL\t103051\n龙兴寺\t103052\n惠州市教育局\t103053\n兰姐\t103054\n黄集镇\t103055\n阆中市\t103056\nEYE游戏论坛\t103057\n市委常委会\t103058\n酷品\t103059\n金刚砂\t103060\n观音桥\t103061\nprogram\t103062\n闵\t103063\n张婉清\t103064\n照片\t103065\n有机半导体\t103066\n健身圈\t103067\n文龙\t103068\n经济发展\t103069\n低腰裤\t103070\nboot分区\t103071\n座上宾\t103072\n剧烈\t103073\n乐山大佛景区\t103074\n可见一斑\t103075\n审计报告\t103076\n老百胜\t103077\n侧裙\t103078\n塘村\t103079\n继承表\t103080\n哈医大一院\t103081\n库区\t103082\n飞洋\t103083\ndescendant\t103084\n奥姆\t103085\n進入\t103086\n世茂锦绣长江\t103087\n移动营业厅\t103088\ncfca\t103089\n茶太\t103090\n有风\t103091\n绵绵\t103092\n赞美之泉\t103093\n四舍\t103094\n微整\t103095\n道路运输证\t103096\n3卡顿\t103097\n厦门分公司\t103098\n18世纪\t103099\n梁_\t103100\n7.5元\t103101\n72页\t103102\n图灵社区\t103103\n委员们\t103104\n套项\t103105\n检查口\t103106\n特变\t103107\n旅游区\t103108\n第一百一十六章\t103109\n醉鬼\t103110\n旧卡\t103111\n7258\t103112\n苏州银行\t103113\n夏进\t103114\n重庆电子工程职业学院\t103115\n秀妍\t103116\n爱田奈\t103117\n区体育局\t103118\n四川农大\t103119\n旌逸集团\t103120\n塑界\t103121\n郑州中心站\t103122\n换乘\t103123\n单词记忆法\t103124\n碾米机\t103125\n赤裸特工\t103126\n鲫鱼豆腐汤\t103127\nhrpg\t103128\n冲床\t103129\nff1\t103130\n矿业权评估\t103131\n雪场\t103132\nipadmini2\t103133\n树妖\t103134\n老憨\t103135\n布雷特\t103136\n锚记\t103137\n聊斋迈腾\t103138\n1.7米\t103139\ncp9\t103140\n南昊\t103141\nLoader\t103142\nuri\t103143\n王晓波\t103144\n旷工\t103145\n物理学史\t103146\n雄峰城\t103147\n三角旗\t103148\n妙丽\t103149\n20KV\t103150\n采红菱\t103151\nBobCoder\t103152\n就喜欢\t103153\n自发性气胸\t103154\nTrojan\t103155\n絮凝\t103156\nQQ志乐园\t103157\n大一统\t103158\n共通\t103159\nutilities\t103160\nhiyachen\t103161\n朱延平\t103162\nKB\t103163\n御井烹香\t103164\n秘剑\t103165\n王佳敏\t103166\n5多少\t103167\n杰微\t103168\nPussy\t103169\ngta4吧_\t103170\nVICHY\t103171\n100瓶\t103172\n溪涧\t103173\n外汇牌价计算器\t103174\n2x+\t103175\n国开\t103176\n公共危机\t103177\n门源县\t103178\n美的燃气灶\t103179\nAH\t103180\n导引术\t103181\n10秒左右\t103182\n2._\t103183\n组合钢模板\t103184\nMOD_玩游戏网\t103185\n深加工\t103186\n考研网\t103187\n洋山\t103188\n昆山登云科技职业学院\t103189\n讳言\t103190\n长城葡萄酒\t103191\nGFE\t103192\n双龙戏珠\t103193\n韩国人\t103194\n纲手鸣人\t103195\n蔷
薇月季_浴花谷花卉网\t103196\n翡\t103197\n圆珠笔芯\t103198\n飞黄腾达\t103199\ntractor\t103200\n水方\t103201\n中国财经网\t103202\n文献学\t103203\n大逆转裁判\t103204\n拓跋焘\t103205\n庆丰路\t103206\n题表\t103207\n麻饼\t103208\nverb\t103209\n肉鸭\t103210\n连段\t103211\npowe\t103212\n老年斑\t103213\n越权\t103214\n郑蒲港新区\t103215\nQM\t103216\nyezi\t103217\ngts450\t103218\n雾里看花\t103219\n瑞银集团\t103220\n闯天关\t103221\n计数板\t103222\nstatus\t103223\nBIEE\t103224\n精仪\t103225\n突变物\t103226\n罗马帘\t103227\ntechnique\t103228\n国民政府\t103229\n黄村火车站\t103230\nObservation\t103231\n巡查\t103232\n京杭\t103233\n中华人民共和国商务部反垄断局\t103234\n午饭\t103235\n曹德妲己\t103236\n抗倭英雄戚继光\t103237\n宝峰湖\t103238\nCI7\t103239\nBUFFER\t103240\n醒悟\t103241\n阻燃\t103242\n1板\t103243\nPow\t103244\n雨菲\t103245\n汤坑镇\t103246\nVertu\t103247\n朗诵诗\t103248\n柳条\t103249\nClaude\t103250\n凤凰直播\t103251\n陈彼得\t103252\nundertale\t103253\n新凤\t103254\n木疙瘩\t103255\n副作\t103256\nG9200\t103257\n太白南路\t103258\n择期\t103259\n口腔病\t103260\n徐州市委\t103261\n巴巴腾\t103262\n心歌\t103263\n馨儿\t103264\nmsm8909\t103265\nPike\t103266\nGROHE\t103267\n康军\t103268\n宣化上人\t103269\n丁建华\t103270\n伊芙\t103271\n狮眼\t103272\n空岛\t103273\n40册\t103274\njoin\t103275\n养路费\t103276\n简谐\t103277\n繁漪\t103278\n张春晓\t103279\n小儿过敏性咳嗽\t103280\n济南大学泉城学院\t103281\n百度派\t103282\n绢本\t103283\n帅气\t103284\n罗马世纪城\t103285\n陌声\t103286\n迅雷粉\t103287\n易图通\t103288\n敦煌\t103289\n电兽\t103290\nRUBBER\t103291\n冲淡\t103292\n浮诛\t103293\n开眼界\t103294\n乔佛里\t103295\n俄罗斯族\t103296\n99亿\t103297\n早饭\t103298\nra\t103299\n群工\t103300\n渡槽\t103301\n杭齿\t103302\n人力资源和社会保障局\t103303\n录栏\t103304\n丙察察\t103305\nARM单片机\t103306\n历山路\t103307\nnbk\t103308\n除号\t103309\nК\t103310\n买卖车\t103311\n摔落\t103312\nodb\t103313\n竞岗\t103314\n第二份\t103315\n膘\t103316\n集美学村\t103317\n初中\t103318\n第4轮\t103319\n小春\t103320\n河北省工商行政管理局\t103321\n猪肉脯\t103322\n八千里路云和月\t103323\nknee\t103324\n发件人\t103325\n冬令营\t103326\n约束力\t103327\nMarijuana\t103328\n被淹死\t103329\n氮化铝\t103330\n600621\t103331\n薛先生\t103332\n虚拟摄像头\t103333\n保诚\t103334\n条件单\t103335\n拖船\t103336\n明升m88\t103337\n收阳\t103338\n布朗山\t103339\n好幸运\t103340\nAmtemu\t103341\nabin9630\t
103342\n坠入\t103343\n习总书记系列重要讲话精神\t103344\n女奴\t103345\n委身\t103346\n柳石明\t103347\n懂懂\t103348\nbending\t103349\n义乌火车站\t103350\n库存机\t103351\n没事干\t103352\n偷偷看\t103353\n邮税\t103354\n即征即退\t103355\n水条\t103356\n违缘\t103357\n绿媒\t103358\nSWAT\t103359\n智星\t103360\n奔驰glc300\t103361\n病毒性结膜炎\t103362\nWinnie\t103363\n标委会\t103364\n抢眼\t103365\n海流\t103366\n弹丸轮舞\t103367\n寒芒\t103368\n【化\t103369\n痘疤\t103370\n玫瑰苗\t103371\n微缩景观\t103372\ndamo\t103373\n山西刀削面\t103374\n环岛路\t103375\nCAD|天工问答\t103376\n工作报道\t103377\n苏州高新区实验初级中学\t103378\n境内外\t103379\n龙德广场\t103380\n活节\t103381\n星河互联\t103382\n谈定\t103383\nら\t103384\n一百亩\t103385\n正当防卫3\t103386\n别墅群\t103387\n万洋\t103388\n9520\t103389\n孙娜\t103390\n综掘\t103391\n五光\t103392\n翻云\t103393\n多彩网\t103394\n相忘于江湖\t103395\n花厅\t103396\ninvoices\t103397\n变废为\t103398\n苏嘉杭\t103399\n网屏\t103400\n第四单\t103401\nenl\t103402\n耳熟\t103403\n一勺\t103404\n马家\t103405\n汤臣倍健\t103406\n东呈酒店集团\t103407\n易到用车\t103408\n齐晨\t103409\n阴阳寮\t103410\nchester\t103411\n至尊江湖\t103412\n詹妮弗康纳利\t103413\nm.taobao.com\t103414\n科大智能科技股份有限公司\t103415\n义贼\t103416\n名村\t103417\n冰汽\t103418\n户均\t103419\n聪慧\t103420\n苟活\t103421\n玻璃桥\t103422\nUTV\t103423\n現\t103424\nRender\t103425\n氧化膜\t103426\n仍须\t103427\n湖南信息职业技术学院\t103428\n晋华集成电路\t103429\n脱水剂\t103430\n话务员\t103431\n楼体\t103432\n怒目\t103433\n重庆大学美视电影学院\t103434\n小庭院\t103435\n押井守\t103436\n扩散性ma\t103437\n制绒\t103438\n枳壳\t103439\n居住\t103440\n乒超\t103441\n言职\t103442\n湖北大学\t103443\n姽婳\t103444\n深圳市财政委员会\t103445\n再审案\t103446\n大汉帝国\t103447\n上键\t103448\nIconic\t103449\n不多\t103450\n下人\t103451\n多星\t103452\n大连口腔医院\t103453\n萌龙\t103454\n17个月\t103455\n北汽新能源汽车\t103456\n口译\t103457\n莱珀妮\t103458\n观赏期\t103459\n全民飞机大战\t103460\n东阿县\t103461\n节水器\t103462\n小米云盘\t103463\n冠翔网\t103464\n夜星\t103465\n1多路\t103466\n大话西游2\t103467\n擎天科技\t103468\nadlmint\t103469\n慢性阻塞性肺气肿\t103470\n小南庄\t103471\nMacromedia\t103472\n后位\t103473\n300分钟\t103474\n北青\t103475\n海茶\t103476\n码单\t103477\n阶\t103478\n金句\t103479\n截面\t103480\n中忍\t103481\n凯氏氮\t103482\n碳酸镧\t103483\nPing\t103484\n还我\t103485\n客车票\t103486\narabia\t103487\nwin64位\t103
488\n法师塔\t103489\nfl吧\t103490\nC25\t103491\nnyaa\t103492\n平湖中学\t103493\nBDM\t103494\nHUF\t103495\ncos^2\t103496\nkill\t103497\nティング\t103498\n赌博案\t103499\n9-10\t103500\ndoushi\t103501\njingli\t103502\n河西新城\t103503\n喻意\t103504\n奥普浴霸\t103505\n保友\t103506\n京沪高速公路\t103507\nangularjs2\t103508\np53\t103509\n潜山\t103510\n杨钰莹\t103511\n幕墙\t103512\ncodeblock\t103513\nTKI\t103514\n不退还\t103515\n小鳄\t103516\nBuyers\t103517\n重庆市卫生和计划生育委员会\t103518\n敲碗\t103519\nWHAT\t103520\n九型人格测试\t103521\n2070\t103522\n商業\t103523\n深圳住房公积金\t103524\n127集\t103525\n刑辩\t103526\n涂炭\t103527\nPlotting\t103528\nimagelist\t103529\n蜂花粉\t103530\n20140501\t103531\n凯迪威\t103532\n辛杰\t103533\n基片\t103534\n985吧_\t103535\n小说阅读器\t103536\n花样多\t103537\n三年之内\t103538\n地勘矿权\t103539\n猫耳娘\t103540\nbjtu\t103541\n内蒙古鸿茅实业股份有限公司\t103542\n溜冰场\t103543\n外筒\t103544\n力特\t103545\n法律讲堂\t103546\n汇聚\t103547\nTOMATO\t103548\ndecompiler\t103549\n曾子墨\t103550\n齐齐哈尔站\t103551\nrtf\t103552\n7_\t103553\n人力资源配置\t103554\n傲游哈哈\t103555\n五零\t103556\ngeodesic\t103557\n王颢\t103558\n落花情\t103559\nselectivity\t103560\n电信移动\t103561\n菊子\t103562\nunpacking\t103563\n努比亚Z17\t103564\narashi\t103565\nhig\t103566\n发言稿\t103567\n江门市政府\t103568\n芯棒\t103569\n柠溪\t103570\n稀有动物\t103571\n芒果TV\t103572\n羊蝎子\t103573\n妖精的旋律\t103574\n一齐\t103575\nthunderbird\t103576\n山东消防\t103577\n吸毒史\t103578\n香港金像奖\t103579\nsunnee\t103580\n宝贵\t103581\n老叟\t103582\n狂欢节\t103583\n达龙云\t103584\n江省\t103585\n【史图馆\t103586\n藏戏\t103587\n选择表\t103588\nTiledMap\t103589\n5592\t103590\n抗美援朝志愿军\t103591\n二份\t103592\n天天基金网\t103593\n繁缕\t103594\n医务科\t103595\n产力\t103596\n卓亦然\t103597\n亚马\t103598\nuilable\t103599\n东北三省三校\t103600\n慧明\t103601\n茅于轼\t103602\n王奕\t103603\n山东科技大学\t103604\n文选\t103605\n东华凤九\t103606\n盖洛普\t103607\n凤凰旅游\t103608\n民宿\t103609\n董必武\t103610\nfullscreen\t103611\n敌流\t103612\n孔雀之乡\t103613\n商国\t103614\n东莞阳光网\t103615\n异星工厂吧_\t103616\nprefer\t103617\nsoopat\t103618\n_亿度\t103619\ninfp\t103620\n欢乐西游\t103621\n急尿\t103622\n爱可奈\t103623\n老鹰乐队\t103624\n徐州日报社多媒体数字报\t103625\n贵阳龙洞堡机场\t103626\n行地\t103627\n走光\t103628\nCent
re\t103629\n阿里旺铺\t103630\n磁灶镇\t103631\n雪小禅\t103632\n品智\t103633\n此子\t103634\n上脚\t103635\n唐山凤凰妇产医院\t103636\n链球\t103637\nPico\t103638\n桃桃桃\t103639\n椎间盘膨出\t103640\n降血糖\t103641\n岷\t103642\n工字形\t103643\nAJ4\t103644\n绿植墙\t103645\n蒙娜丽莎的微笑\t103646\n学讲话\t103647\n4.3G\t103648\n布拉克\t103649\nsbx\t103650\n双硫仑\t103651\n感应式\t103652\n二升\t103653\n军事展\t103654\n和林格尔县\t103655\n1.2mm\t103656\n相场\t103657\n乳粉\t103658\n素人\t103659\n5年\t103660\n驼奶\t103661\n高中三角函数\t103662\n天涯八卦网\t103663\n横空\t103664\n淫棍\t103665\n电场\t103666\n猛子\t103667\n影音先锋播放器\t103668\n丙烯酰胺\t103669\nFloral\t103670\n四冠\t103671\n车针\t103672\n证物\t103673\npaging\t103674\n生米\t103675\njcb\t103676\n张艺左\t103677\n电离室\t103678\n友谊西路\t103679\n金戈铁马\t103680\n外逃\t103681\npurple\t103682\n本项\t103683\n劳保费\t103684\n群演\t103685\n家用电梯\t103686\n海尔施特劳斯\t103687\n椰汁\t103688\n曜石\t103689\n3:00\t103690\n欠债\t103691\navi格式\t103692\nSCSI\t103693\n包飞\t103694\n炸酥肉\t103695\nbearer\t103696\nFuzhou\t103697\n自叙\t103698\n消不掉\t103699\n第34届\t103700\nzhanqi\t103701\n70多\t103702\n纤溶酶\t103703\n塘主\t103704\n诠释\t103705\n玛丽娜\t103706\n天天中文网\t103707\nwikiHow\t103708\n陆定一\t103709\n珍珠翡翠白玉汤\t103710\n腘绳肌\t103711\n浙江省肿瘤医院\t103712\n10万吨\t103713\n国家电力监管委员会\t103714\n浙大管院\t103715\nnwpu\t103716\n两件套\t103717\n不无\t103718\n骨瘤\t103719\n拜堂\t103720\nmenatplay\t103721\n白事\t103722\nqq表情\t103723\n求学路\t103724\n盐矿\t103725\n音阶\t103726\n泡开\t103727\n手机21ic中国电子网\t103728\ndocs\t103729\n体验服\t103730\n摩擦性\t103731\n蔡小虎\t103732\n周市\t103733\n山芋\t103734\n李笛\t103735\n港口吞吐量\t103736\n应力仪\t103737\n纯新人\t103738\nSquirt\t103739\n草长莺飞\t103740\n022\t103741\n观复\t103742\n19篇\t103743\nflexnet\t103744\n人海\t103745\n四川省国税局\t103746\n南京医科大学附属眼科医院\t103747\neNet游戏先锋\t103748\n焊把\t103749\n纬七路\t103750\n常州钟楼政府\t103751\n制转\t103752\n2018年03月\t103753\n打码\t103754\n刘屹宸\t103755\n括\t103756\n下常\t103757\n大豪科技\t103758\n维尔茨堡\t103759\n大巴山\t103760\n兰德酷路泽\t103761\n奔跑吧2\t103762\n4025\t103763\n华师大二附中\t103764\n相叶雅纪\t103765\nMBTI职业性格测试\t103766\n诛仙问答\t103767\n时来\t103768\n看秀\t103769\n青基\t103770\n5000型\t103771\n张贵庄\t103772\nARJ21\t103773\nautoform\t1037
74\nBD-mkv\t103775\nCHENG\t103776\n何清\t103777\n719\t103778\n锯肌\t103779\n自由光论坛\t103780\n龙背\t103781\n服用\t103782\n天津实验中学\t103783\n陈尧\t103784\n华为交换机\t103785\nwww.v2ex.com\t103786\nHinata\t103787\n音头\t103788\n月殇\t103789\n/厂家/公司\t103790\n过场\t103791\n门球\t103792\n特灵中央空调\t103793\n夜之子\t103794\n魔法门之英雄无敌6\t103795\nGCMS\t103796\nexcel表格函数\t103797\nStevenson\t103798\nWP8\t103799\n超级兔子人\t103800\n长剑\t103801\n多路阀\t103802\n3月3号\t103803\n武侯祠\t103804\n伊立替康\t103805\n四川人民出版社\t103806\n杨崇勇\t103807\n教书育人\t103808\n齐天\t103809\n臂铠\t103810\ndhcpv6\t103811\n书签版\t103812\n美规版\t103813\nMileage\t103814\n余霜\t103815\n宿迁学院\t103816\nQQ机器人\t103817\n3131\t103818\n北京八维研修学院\t103819\n泥浆护壁钻孔灌注桩\t103820\n南玻集团\t103821\n99式\t103822\nagc\t103823\n百合漫画\t103824\n三人小说网\t103825\n福建人大网\t103826\n20150524\t103827\n我就喜欢\t103828\n大湿\t103829\n恶友\t103830\nhellsing\t103831\n女声\t103832\n静功\t103833\n小板\t103834\n环球社\t103835\n声影\t103836\nRonshen\t103837\n反致\t103838\n齁\t103839\n三星I9300\t103840\n第二十二集\t103841\n王若琳\t103842\nChanrich\t103843\n金阳光\t103844\n七浦路服装批发市场\t103845\n头晕\t103846\n22款\t103847\n堂本光一\t103848\n百人\t103849\n土元\t103850\n软膜\t103851\n外专\t103852\n菲律宾薄荷岛\t103853\n博远\t103854\nluntan\t103855\n河北农村信用社\t103856\n大庆新村\t103857\n脏乱\t103858\n倪萍舒马赫\t103859\n20170419\t103860\n其利\t103861\npuff\t103862\n康桥东路\t103863\n非承重墙\t103864\n中天花园\t103865\n斯诺克世界锦标赛\t103866\n吸程\t103867\n所称\t103868\n000938\t103869\n简明现代魔法图书馆\t103870\n树杆\t103871\n子网掩码\t103872\n970m\t103873\n7.2分\t103874\n锚式悬索桥\t103875\n承运\t103876\n战列\t103877\npubmed数据库\t103878\n_天下3吧\t103879\n张泽果\t103880\n大民\t103881\nzlib1.dll\t103882\n百看\t103883\n邱会作\t103884\n椰子味\t103885\n童真\t103886\nbounding\t103887\n威驰FS\t103888\n鲫鱼竿\t103889\n江苏医院\t103890\n葛道凯\t103891\n1.9.13\t103892\nshidai\t103893\n趣说\t103894\n柜机\t103895\n四角裤\t103896\n处罚\t103897\n失败\t103898\nNFS服务器\t103899\n蛮荒\t103900\n天怒\t103901\n归途如虹\t103902\n新几内亚\t103903\n深圳电视台\t103904\n水品\t103905\n棚房\t103906\n高新六路\t103907\n逃跑\t103908\n磨刀器\t103909\n利亚\t103910\n天花板\t103911\n读读多小说网\t103912\n河虾\t103913\n曹焱兵\t103914\n微观\t103915\n半百\t103916\n红楼景
甜\t103917\n鸿辉\t103918\n围墙\t103919\n俯仰\t103920\n群氓\t103921\n李祥\t103922\n通风管\t103923\n南油大厦\t103924\n身价\t103925\n艺术学\t103926\n12月初\t103927\nNetScaler\t103928\n成人式\t103929\nSPA\t103930\nTreasures\t103931\n妮维雅\t103932\n花园社区\t103933\npreparation\t103934\n栋梁\t103935\n同名村\t103936\n金立吧\t103937\n山东大学》\t103938\n形近词\t103939\n绿城杨柳郡\t103940\n衣间\t103941\n加来道雄\t103942\n怼乘警\t103943\n浑天\t103944\n12306bypass\t103945\n饽饽\t103946\n蒋艳超\t103947\n增长量\t103948\n小米路由器\t103949\n长江科学院\t103950\n28款\t103951\nfavorite\t103952\nroommate\t103953\n祸源\t103954\n230ORE\t103955\nxps吧\t103956\n稳妥\t103957\n电蚊香液\t103958\ncaca\t103959\ncrontab\t103960\n1644\t103961\n广瓜网\t103962\n易筋洗髓功\t103963\n全人\t103964\n碎影\t103965\n郁陵岛\t103966\ndatareader\t103967\n劳拉西泮\t103968\n环保总局\t103969\n7888\t103970\n李明洋\t103971\n代币券\t103972\n战神封魔录\t103973\nwal\t103974\n曾家山\t103975\n中论\t103976\n根正苗红\t103977\n南水北调中线工程\t103978\n过滤式\t103979\n评学\t103980\n铝杆\t103981\n五角场万达广场\t103982\n棒约翰比萨\t103983\n挂任\t103984\n头桥镇\t103985\n72种\t103986\n压哨\t103987\nUnders\t103988\nxui\t103989\n十大劲歌金曲颁奖典礼\t103990\nredio\t103991\n三九养生堂\t103992\n支局\t103993\n梯控\t103994\n轩妈蛋黄酥\t103995\n乡村经济\t103996\n总承包商\t103997\n少华山\t103998\n透气鞋\t103999\n黯然失色\t104000\n王利平\t104001\nohmy\t104002\ngdgp\t104003\n0119\t104004\n家用电视\t104005\n荣耀钟馗\t104006\n忧心忡忡\t104007\n铁扇\t104008\n德克斯特\t104009\n平段\t104010\n龙牡壮骨颗粒\t104011\nBMC\t104012\nqinghe\t104013\nzhilian\t104014\nc6\t104015\n1.4t\t104016\n极乐鸟\t104017\n塞尔达\t104018\n乌海网\t104019\ncheckStyle\t104020\n南昌网\t104021\n罗兵\t104022\n配置页\t104023\n乱舞水浒\t104024\n最长公共子序列\t104025\nkingdoms\t104026\n天诛地灭\t104027\nh12\t104028\n20180210\t104029\n氧离子\t104030\nimpacts\t104031\n一次几粒\t104032\n教机\t104033\n而异\t104034\n宝娜斯\t104035\n隐杆线虫\t104036\n乐高公司\t104037\n人物照\t104038\n黑豹\t104039\n何健\t104040\n20150328\t104041\n夏目彩春\t104042\n66周年\t104043\n杨庙\t104044\n酷文\t104045\nRichmond\t104046\ncommerce\t104047\n扬州慢\t104048\n石油大学\t104049\n╮\t104050\n学哥\t104051\nbrady\t104052\n经不起\t104053\n车载MV\t104054\nrte\t104055\n1.59\t104056\n天竺综合保税区\t104057\n禾丰牧业\t104058\nFMS\t1040
59\n锋潮科技\t104060\ntime函数\t104061\n流动红旗\t104062\nFlash播放器\t104063\n汉译英\t104064\n玩具\t104065\n吴小晖\t104066\n记实\t104067\n中央环境保护督察\t104068\n码市\t104069\n升初\t104070\n两公\t104071\n校媒\t104072\n蛇口渔人码头\t104073\n要员\t104074\n商品经济\t104075\n酒场\t104076\n丁原\t104077\nc++vector\t104078\n好妹妹乐队\t104079\n帐单\t104080\n全瓷冠\t104081\n大臣\t104082\npoke\t104083\n酷孩\t104084\n快乐戏园\t104085\n小妹妹\t104086\n舒化\t104087\n重庆大剧院\t104088\n金税三期\t104089\n蓝湾咖啡\t104090\n朱成玉\t104091\n写物\t104092\nwebassembly\t104093\n彩雕\t104094\n胡冰\t104095\n散步者\t104096\n重庆路桥\t104097\n热射病\t104098\n两条命\t104099\n同频\t104100\nwakeup\t104101\n大众公司\t104102\n小米米粉节\t104103\nWET\t104104\nDCloud\t104105\n道桥\t104106\n专柜价\t104107\n99版\t104108\n德飞莱\t104109\n复信\t104110\n在这儿\t104111\n天祥广场\t104112\nKIKI\t104113\n2018-04-25\t104114\n7档\t104115\n书丛\t104116\n3-1\t104117\ndrds\t104118\n曾经\t104119\nvcl\t104120\n第几次\t104121\n唐吉坷德\t104122\n搞头\t104123\nCrossovers\t104124\n稻叶\t104125\n明说\t104126\n坏道\t104127\n明德实验学校\t104128\n颍川\t104129\n兰顿\t104130\n16组\t104131\n美达\t104132\n300号\t104133\n特雷莎·梅\t104134\n1080TI\t104135\n5000亩\t104136\n入厂\t104137\n磐镭\t104138\n矩阵型\t104139\nMecanim\t104140\n易联支付\t104141\n双环\t104142\n极地地区\t104143\n勃艮第红\t104144\n双飞燕\t104145\nkeychain\t104146\n性变态\t104147\n前进大街\t104148\n草蛇\t104149\n报名\t104150\n乡亲\t104151\n受潮\t104152\n并集\t104153\n合财\t104154\n结亲\t104155\n住会\t104156\n乐天世界\t104157\n扫路人\t104158\n从头\t104159\n华为荣耀v9\t104160\n烟酰胺\t104161\n网络科技有限公司\t104162\n长安新能源\t104163\n存期\t104164\n主簿\t104165\naccu\t104166\nFusionSphere\t104167\n出国留学\t104168\n女伯爵\t104169\n金锣\t104170\n1}\t104171\n市安委会\t104172\n拖把\t104173\n飞行者\t104174\n防晒罩\t104175\n1em\t104176\n西咸新区沣东新城管委会\t104177\nJBOSS\t104178\n干物\t104179\nConvertible\t104180\n叫魂\t104181\n巧虎资源\t104182\n浙江科技\t104183\nWinzip\t104184\n跛脚\t104185\n寄人篱下\t104186\n许小年\t104187\n光影魔术手\t104188\n安丘市人民政府\t104189\n福建省林业厅\t104190\n河南大学民生学院\t104191\n吝啬鬼\t104192\n斐然\t104193\nV2EX\t104194\n正义论\t104195\n广西人事考试网\t104196\n李兆麟\t104197\n页面父\t104198\n姑苏晚报\t104199\nZ440\t104200\n过敏性湿疹\t104201\n海峡城\t104202\n奇绝\t104203\nJoyoung\t104
204\n北京癫痫病医院\t104205\n白富美\t104206\n日达仙\t104207\n5【\t104208\nDOPA\t104209\n网吧游戏\t104210\n风景片\t104211\n女神们\t104212\n石头\t104213\n过高速\t104214\nmarry\t104215\n1928\t104216\n小宇宙\t104217\n仪征市\t104218\n威海房产超市网\t104219\n邓维\t104220\nGitLab\t104221\nQGC\t104222\n深圳地税\t104223\n2.\t104224\nufotable\t104225\n3.4%\t104226\nPaper\t104227\n湖北中医药大学\t104228\n礼门\t104229\n福山正达外国语小学\t104230\n炒楼\t104231\n西西图\t104232\n治愈\t104233\n3.7.5\t104234\n4封\t104235\n银白色\t104236\n行装\t104237\n布卢明顿\t104238\n炎孕\t104239\nGap\t104240\n林允\t104241\n金霉素软膏\t104242\n斯金纳\t104243\nX550\t104244\n蓬灰\t104245\n捣棒\t104246\nZY\t104247\nCHR\t104248\n叶子猪\t104249\n早孕网\t104250\n30寸\t104251\n宋卫东\t104252\n球裤\t104253\nConan\t104254\n众博\t104255\n欧模网\t104256\n默默无闻\t104257\nhatano\t104258\nkuaibo\t104259\ncolle\t104260\n上海同济城市规划设计研究院\t104261\n局域网打印机\t104262\nu11+\t104263\n小栈\t104264\n涂改液\t104265\n狂嗨\t104266\n徐驰\t104267\n新时代政协\t104268\n33000元\t104269\n待查\t104270\n蛛丝\t104271\n耕种\t104272\n山型\t104273\n人民出版社\t104274\n月经量\t104275\n传祺GE3\t104276\ncritics\t104277\n高官们\t104278\n盛威\t104279\n卢瓦尔河谷\t104280\n褒美\t104281\nmimics\t104282\naoc\t104283\n小米运动\t104284\n出汗\t104285\n杭州婚纱摄影\t104286\n关心\t104287\nFramework自动化测试\t104288\nTuneUp\t104289\nNgai\t104290\n正胶\t104291\n成都市政协\t104292\n海迈\t104293\n太空沙\t104294\nembryo\t104295\n一键投影\t104296\n中国巨幕\t104297\n立委\t104298\n亚洲龙\t104299\n王衍\t104300\n港汇恒隆广场\t104301\n阳光照射\t104302\nmyeclise\t104303\n几百倍\t104304\n美肤宝\t104305\n铸轧\t104306\n徐经长\t104307\n绿地公园\t104308\n鳍片\t104309\n荣德\t104310\nMobiles\t104311\n10所\t104312\n絵\t104313\nMx\t104314\nStamps\t104315\n加光\t104316\n节点处\t104317\n雨靴\t104318\n巨微\t104319\n3天3夜\t104320\n决议\t104321\n污力\t104322\n1^3\t104323\n我妈\t104324\nCALL\t104325\n一个7岁\t104326\n豆哥\t104327\nBuffet\t104328\n汉寿政府网\t104329\n622路\t104330\nITMO\t104331\n普世价值观\t104332\nTorque\t104333\n安定医院\t104334\n渐热\t104335\n沙哈尔\t104336\n三相稳压器\t104337\n刘二雄\t104338\nMetric\t104339\n松嫩平原\t104340\n郭士哈继铭\t104341\n柳州新闻网\t104342\n皇后镇\t104343\nVue2.0\t104344\n摄制组\t104345\n大盘\t104346\n福建广播电视大学\t104347\n裂谷\t104348\n极酷播放器\t10
4349\nPHP100中文网\t104350\n苹果网\t104351\nchicago\t104352\n山海同湾\t104353\n今时\t104354\n陈清\t104355\n光电传感器\t104356\n雅培中国\t104357\n谷原希美\t104358\n情夫\t104359\n妙语连珠\t104360\nRNDIS\t104361\n重量\t104362\n书剑\t104363\n市北\t104364\nshoufei\t104365\n苏打水\t104366\nInput\t104367\n注册表键值\t104368\n德利\t104369\n被抢\t104370\n神马电影网\t104371\n中交第一公路勘察设计研究院有限公司\t104372\n税眼\t104373\nViney\t104374\n20150919\t104375\n静默期\t104376\n齐墩果酸\t104377\n优质校\t104378\n葛慧君\t104379\nGii\t104380\n白塔村\t104381\n码工助手\t104382\n磅级\t104383\n娱乐家\t104384\n优购工品\t104385\n联想Y700\t104386\nvol2\t104387\n我爱你塞北的雪\t104388\n泄出\t104389\n杨绛梅威瑟\t104390\n田阳\t104391\n1套\t104392\n风炉\t104393\n投行人\t104394\n电影集\t104395\n权利\t104396\n田寮村\t104397\niPAD\t104398\n宠物奇缘\t104399\n半甲\t104400\n苏博士\t104401\n狱卒\t104402\n课中坏事\t104403\n爵士版\t104404\n灶火\t104405\nSLX\t104406\n天津轨道交通集团有限公司\t104407\n分析文\t104408\n爆伤\t104409\n萝岗\t104410\n燕子矶新城\t104411\nDrawable\t104412\n星星球\t104413\n家梦\t104414\n欣欣向荣\t104415\nSD高达G世纪超越世界\t104416\n傻\t104417\n毛超峰\t104418\nBigDecimal类\t104419\n再逢\t104420\n苏小红\t104421\n慈溪\t104422\n第35话\t104423\n训子\t104424\n玉环新闻网\t104425\n正代\t104426\n蔡霞\t104427\nanton\t104428\n绳网\t104429\n智威汤逊\t104430\n土建\t104431\n斯太尔\t104432\n你老\t104433\n沙门\t104434\n展台\t104435\n慨然\t104436\n2350m\t104437\nCoordinator\t104438\nOAF\t104439\n梁山好汉\t104440\n39条\t104441\n孔琳\t104442\n两千年\t104443\n5欧\t104444\n马春雷\t104445\nTelling\t104446\nheadview\t104447\n黎瑞刚\t104448\n宁波车管所\t104449\n顾清\t104450\npolina\t104451\n二觉\t104452\n中转包\t104453\n木材\t104454\n2000多个\t104455\n犬齿\t104456\n计算机技术职业资格网\t104457\n六孔\t104458\n九阴白骨爪\t104459\n2805\t104460\n鑫鼎\t104461\nHold不住\t104462\n存货跌价\t104463\ncombi\t104464\n82分\t104465\nBlending\t104466\n路易基\t104467\nehs\t104468\n3月14\t104469\n楚门\t104470\n三个半月\t104471\n听风吟\t104472\n兰姨娘\t104473\n母鹿\t104474\n瞿溪\t104475\n佳能ip1180\t104476\n武陵山区\t104477\n湘西州政府\t104478\nSketchup2016\t104479\n967\t104480\n巡展\t104481\n宏功能\t104482\n合景领峰\t104483\np档\t104484\n一覽\t104485\n爱奇艺轮播台\t104486\n营业收入\t104487\n逆战\t104488\n无抵押贷款_抵押贷款_贷款利率\t104489\n受保\t104490\n新雅\t104491\n无皇刃谭\t1044
92\n侠盗飞车资源网\t104493\n偏航\t104494\n指标费\t104495\n付笛声\t104496\n雷克萨斯LX\t104497\n投资有限公司\t104498\ndisconnect\t104499\nQQ头像_腾牛个性网\t104500\n原更纱\t104501\n郑重其事\t104502\n饶有\t104503\n安人\t104504\n寒风\t104505\n拉康\t104506\n一问\t104507\nfloat变量\t104508\ncq\t104509\nlive2dviewerex\t104510\n吉宗\t104511\n双定户\t104512\n福建省发展和改革委员会\t104513\n织补\t104514\n李碧华\t104515\n贝加尔\t104516\n农业示范园\t104517\n枣庄市教育局\t104518\nfw\t104519\n禧泰\t104520\n中国煤炭工业协会\t104521\n矿泉\t104522\n宁夏理工学院\t104523\niPadmini4\t104524\n四分位\t104525\n蔡诗芸\t104526\nSeiya\t104527\n株洲时代新材料科技股份有限公司\t104528\n华谊兄弟传媒股份有限公司\t104529\n淮南火车站\t104530\n水合氯醛\t104531\n装饰条\t104532\n爱丽小屋\t104533\n鸡精\t104534\n子民\t104535\nLinkin\t104536\n礼花蛋\t104537\n部\t104538\n直角三角形\t104539\n专利号\t104540\n天使之橙\t104541\n剪切乳化机\t104542\n吸油烟机\t104543\n端口\t104544\n清水河\t104545\nRxSwift\t104546\n戊寅\t104547\n林泉\t104548\n专项应付款\t104549\nlc\t104550\n续修四库全书\t104551\n页码\t104552\n向天果\t104553\n場\t104554\n本草纲目\t104555\n暗星\t104556\n小郭\t104557\n滨河东路\t104558\nBokeh\t104559\n10035\t104560\n170000\t104561\n郑刚\t104562\n伦\t104563\n全国化\t104564\n袁家界\t104565\n黑眼\t104566\n康师傅\t104567\n1.5岁\t104568\n缺省\t104569\n图影\t104570\n柔佳\t104571\nstartup\t104572\n八佰伴\t104573\n被上诉人\t104574\n几伏\t104575\n_天书中文\t104576\n对着干\t104577\n串起\t104578\n天使萌\t104579\n耳液\t104580\n脱氧\t104581\n辽阳县\t104582\n建筑时报\t104583\n爆胎\t104584\nKardon\t104585\n热身操\t104586\nGzip\t104587\n丽华\t104588\nAgropages\t104589\nros\t104590\n星歌\t104591\n纯牛\t104592\n否者\t104593\nmn\t104594\n本田莉子\t104595\n广州成考网\t104596\n学化\t104597\n72年\t104598\n五违四必\t104599\n8761\t104600\n定场\t104601\n破例\t104602\n自然选择\t104603\n大馄饨\t104604\nfaac\t104605\n22周\t104606\n蓝城集团\t104607\n犯罪剧\t104608\n柔然\t104609\n超级会员\t104610\nutf-8\t104611\nspl\t104612\n野生动物园\t104613\nItalian\t104614\n改动\t104615\ncourses\t104616\n竹\t104617\n铜火锅\t104618\n紫荆山\t104619\n一千五\t104620\n小老头\t104621\n深圳妇幼保健院\t104622\n近现代\t104623\n酱面\t104624\n见好\t104625\n忍精不射\t104626\n有机茶\t104627\n透骨草\t104628\n凉拌菠菜\t104629\n非柜面\t104630\n腾讯企业邮箱帮助中心\t104631\n龙岗小学\t104632\nResorts\t104633\n&CG集\t104634\n走低\t104635\nONOS\t104
636\n麦积区\t104637\n256GB\t104638\n神魔\t104639\nQQ输入法\t104640\n半胱氨酸\t104641\n站撸\t104642\nrrrr\t104643\n大庆油田总医院\t104644\n里根\t104645\nCashback\t104646\n明年一月\t104647\n天野雨乃\t104648\n天津物流公司\t104649\n北京师范大学生命科学学院\t104650\nNorthgard\t104651\np45\t104652\n刘罗锅\t104653\n摩托帮\t104654\naqs\t104655\n笑傲江湖2\t104656\n海河教育园区\t104657\nR710\t104658\nFriends\t104659\n峰米\t104660\n预定单\t104661\nfractions\t104662\n多晶硅片\t104663\n王梓钰\t104664\n孙道荣\t104665\n马里奥赛车8豪华版\t104666\nOur\t104667\nelipse\t104668\nons\t104669\nPaperFree\t104670\n依水间\t104671\n哈尔滨站\t104672\n防水材料招商网\t104673\n标致3\t104674\n1&#160\t104675\n旅游图\t104676\nmatic\t104677\n1080p高清\t104678\n晋江一中\t104679\n关妹\t104680\n清中\t104681\n5.39\t104682\n便是\t104683\n荷\t104684\n福州总医院\t104685\n鼓泡\t104686\n美女体\t104687\n维多利亚壹号\t104688\n普定\t104689\n壮阳药\t104690\nherbag\t104691\n全能王\t104692\n穆旦\t104693\nbandwagon\t104694\n问件\t104695\n才女\t104696\n西安高新开发区\t104697\n江浩\t104698\nMPH\t104699\n徐冬冬\t104700\n滑坡\t104701\n合格\t104702\nHDPE管\t104703\n120亿元\t104704\n9.0.6\t104705\n烧伤\t104706\n绿芽电影网\t104707\n战队币\t104708\n忆峥嵘岁月\t104709\n外接\t104710\n市直工委\t104711\npgs\t104712\n17世纪\t104713\n无路可退\t104714\nmgm\t104715\n明珠路\t104716\nYAMAHA\t104717\n差压式\t104718\n上料\t104719\n高州中学\t104720\n259\t104721\n干枯\t104722\n礼品册\t104723\nform-control\t104724\nwindchill\t104725\n马仁奇峰\t104726\nAlle\t104727\n译介\t104728\ngadgets\t104729\nreque\t104730\n麻辣天后传\t104731\n福美来论坛\t104732\n雷克沙\t104733\nnRF51822\t104734\n妖香\t104735\npsc6\t104736\n统一教育网\t104737\n归因理论\t104738\n河南省公共资源交易中心\t104739\n村头\t104740\n两小儿辩日\t104741\n天津人民美术出版社\t104742\n武汉大学资源与环境科学学院\t104743\n几目\t104744\n贺平\t104745\n增辉\t104746\n康舒\t104747\n沙和尚\t104748\n硕硕\t104749\n罗网\t104750\niBATIS\t104751\n单缸\t104752\n铁粉社区\t104753\n七天连锁酒店\t104754\nreleasing\t104755\n拓跋余\t104756\nNETCONF\t104757\n港大医院\t104758\n水塘\t104759\n东风汽车集团有限公司\t104760\n日间行车灯\t104761\n泡疹\t104762\n赶快\t104763\n缺血性\t104764\n老豆\t104765\n起落杆\t104766\n佳澄果穗\t104767\n埭头镇\t104768\n汽提塔\t104769\n外科护理学\t104770\n武汉机场\t104771\n成都市商务委\t104772\nTrove\t104773\n创意园\t104774\n万佛山\t104775\n围棋书\t10
4776\nrever\t104777\n万方数据库\t104778\n艾丽·范宁\t104779\n浙江电子口岸\t104780\n诈骗\t104781\n994\t104782\n公子哥\t104783\n100t\t104784\n325号\t104785\narrylist\t104786\nsiren\t104787\n摊点\t104788\n云耳\t104789\n肠虫清\t104790\n徒劳\t104791\n5000公里\t104792\n阿当\t104793\n失魂\t104794\n星咲\t104795\n三星s5\t104796\n美国片\t104797\n教学处\t104798\n线报\t104799\n奕乐\t104800\n招待会\t104801\nKUN\t104802\n清一\t104803\n插法\t104804\n抱人\t104805\n进修\t104806\n好贷网\t104807\n链条式\t104808\n万书网\t104809\nskylake\t104810\n直行道\t104811\n除污器\t104812\n包角\t104813\n35名\t104814\n迪拜帆船酒店\t104815\n信长之野望战国立志传\t104816\n推背感\t104817\nzv\t104818\n三角恋\t104819\n全全\t104820\n番禺客运站\t104821\n200年前\t104822\n武汉万科\t104823\n十月份\t104824\n硕连读\t104825\npadas\t104826\n优变商务网youbian.com\t104827\n一环路\t104828\n美早樱桃\t104829\n粑粑\t104830\n那曲地区\t104831\n两列\t104832\n会计核算\t104833\n可歌可泣\t104834\n江西师范大学科学技术学院\t104835\nBUTTON\t104836\nvdc\t104837\n楝树\t104838\n难逃一死\t104839\n陡崖\t104840\n触屏版\t104841\n雨裤\t104842\nTxt\t104843\n花月夜\t104844\n荣耀手表S1\t104845\n脚癣\t104846\nvideostudio\t104847\n建筑桩基技术规范\t104848\n东方宾馆\t104849\ndevicenet\t104850\nWIN7局域网\t104851\n积财\t104852\njavaMail\t104853\n自由行/\t104854\n创史\t104855\n昆士兰\t104856\n豁口\t104857\n考研_无忧考网\t104858\n版本科\t104859\n葛冰\t104860\nyoyoso\t104861\nlop\t104862\n王博士\t104863\n77本\t104864\nITEM\t104865\n金晶科技\t104866\n中原高速\t104867\n资源学\t104868\n银证\t104869\n赛欧3\t104870\n月楼\t104871\niPhone5\t104872\nID号\t104873\n万丰星际网\t104874\nici\t104875\n尿毒症透析\t104876\nswith\t104877\n猩球崛起3:终极之战\t104878\ndfo\t104879\n2012款款\t104880\nIANA\t104881\ntrap\t104882\n我局\t104883\nburns\t104884\n湖南省委\t104885\n西蒙·佩吉\t104886\nfu\t104887\nRIA\t104888\n秘密级\t104889\n食品加工\t104890\n垂头丧气\t104891\n火旺\t104892\n平安e生\t104893\n东山村\t104894\n格斗类\t104895\n法言\t104896\nf6\t104897\nMemset\t104898\n孟凡\t104899\n广州美莱\t104900\nPDFBox\t104901\nandy\t104902\n049\t104903\nushort\t104904\n采矿工\t104905\n国际电力网\t104906\nholic\t104907\n李岚\t104908\n刀剑神域失落之歌\t104909\nMyspace\t104910\n星河舰队\t104911\n普安镇\t104912\n万城华府\t104913\nGTX1050ti\t104914\n浙江人大\t104915\n金寨政府网\t104916\nwes7\t104917\n国家重点实验室\t10491
8\n藿香正气胶囊\t104919\n新纪录\t104920\n水文信息网\t104921\n金门高粱酒\t104922\nRip\t104923\n雷速登\t104924\n成像\t104925\n人种\t104926\n行政处罚\t104927\n疲劳期\t104928\n多重曝光\t104929\n威化\t104930\n中扑网\t104931\nNavisworks\t104932\n光之翼\t104933\n火影忍者博人\t104934\n胡子\t104935\n静谧\t104936\n卷圆机\t104937\nsqlserver2008\t104938\n健康品\t104939\n被刑拘\t104940\n干槽症\t104941\nン\t104942\n川内\t104943\n春夜\t104944\n脲\t104945\ncats\t104946\n1.9.7\t104947\n扩散性\t104948\nクリムゾンガ\t104949\n风冷\t104950\n美国队长\t104951\n猪腿\t104952\n湖南师范大学附属中学\t104953\n表露\t104954\n香澄果穗\t104955\n货代\t104956\nthann\t104957\n佛山小学\t104958\n8月28日\t104959\n4个半小时\t104960\n小蓓蕾组合\t104961\n连卡佛\t104962\nWorkspace\t104963\n二月兰\t104964\n潮宏基\t104965\n跳音\t104966\n监守\t104967\n2018.3.30\t104968\n讯石光通讯网\t104969\n海洋学\t104970\n2.33\t104971\n航海王online\t104972\nArithmetic\t104973\n草木皆兵\t104974\n保定市住房和城乡建设局\t104975\n【乐\t104976\nJACKSON\t104977\n月塘镇\t104978\n悦玺\t104979\nExcel\t104980\n教导员\t104981\naj7\t104982\n柳橙\t104983\n360手机N6Pro\t104984\nInitializing\t104985\n居住证明\t104986\n沿\t104987\n笛卡尔积\t104988\n偏差率\t104989\nGet\t104990\n姚江\t104991\n圆锥型\t104992\n青山葵\t104993\n岩棉彩钢板\t104994\n罗技摄像头\t104995\n宁夏三农呼叫中心\t104996\n陈月\t104997\n广东食品药品职业学院\t104998\nㄔ\t104999\nFateGo\t105000\n项式\t105001\n颈肩腰腿痛\t105002\n消夏计划\t105003\npve\t105004\n中国思想史\t105005\n项圈\t105006\n摸乳\t105007\n飞行队\t105008\nArtStation\t105009\n号群\t105010\n搜罗\t105011\n第五十一条\t105012\nUnrecognized\t105013\n赖文峰\t105014\nJacob\t105015\npatagonia\t105016\n中央环保督察\t105017\n王小白\t105018\n胃胀痛\t105019\n三国塔防传奇\t105020\nPool\t105021\n眼片\t105022\n金投\t105023\nbrent\t105024\ngdgp3.chinaxinge.com/shuju2/201610/20161061608517081\t105025\n小西克幸\t105026\n萤火之森\t105027\n设计馆\t105028\n斯图加特\t105029\n隐居\t105030\n升迁路\t105031\n敌百虫\t105032\n安打\t105033\n中购资讯网\t105034\n翼神\t105035\n属于我\t105036\n促成长\t105037\n网易河南\t105038\n收藏贴\t105039\nsoupui\t105040\n中厚板\t105041\n新宋\t105042\nadhesive\t105043\nff14吧\t105044\n九尺\t105045\n第7讲\t105046\n存款人\t105047\n白根\t105048\n挂袋\t105049\n中国电建集团贵阳勘测设计研究院有限公司\t105050\n明察秋毫\t105051\n拓攻\t105052\n调音台\t105053\n潸然\t105054\n大成功\t105055\n聘用
\t105056\n486\t105057\nNeon\t105058\n嘉乐花园\t105059\nBD1280超清国英双语中英\t105060\n一抹柔情倾江南\t105061\n特岗教师招聘考试\t105062\n圆圆\t105063\n郭海培\t105064\nTelegram\t105065\n960EVO\t105066\n王学圻\t105067\n流延膜\t105068\n新闻_奥一网\t105069\n莲\t105070\n麦德\t105071\n公网服务器\t105072\n盛唐妖异志漫画全集\t105073\n集迈\t105074\n洗水唛\t105075\n5克拉\t105076\n丝瓜水\t105077\n戕害\t105078\n青春修炼手册\t105079\n荣耀明世隐\t105080\n赵艺\t105081\n知识产权协定\t105082\n知止\t105083\n热磁\t105084\n第61章\t105085\n德米安\t105086\n家纹\t105087\nbrightening\t105088\n社政\t105089\n馅饼\t105090\n拉子\t105091\nSeries\t105092\n福建省纪委\t105093\n中医堂\t105094\n中世纪2:全面战争\t105095\n刘培宇\t105096\n1542\t105097\n岳飞庙\t105098\ncad快捷键\t105099\n时尚资讯_论坛\t105100\n碘甲烷\t105101\n分润\t105102\n营业区\t105103\nUSM\t105104\n3D模型网\t105105\n开元棋牌\t105106\n微电解填料\t105107\nドリ\t105108\n中国仪仗队\t105109\n武长顺\t105110\n钜\t105111\n南中环\t105112\n巨口\t105113\n魏冉\t105114\ntay\t105115\n客气\t105116\n佳利\t105117\n易游\t105118\nfrightened\t105119\n斗志昂扬\t105120\n冬梅\t105121\n刘文天\t105122\n威客网\t105123\n小米3c\t105124\nTIM3\t105125\nInvestments\t105126\n好兔\t105127\nequally\t105128\n吴艳\t105129\n几支\t105130\n虎皮\t105131\nSoda\t105132\n结点\t105133\nsql数据库文件\t105134\n蒋方博尔特\t105135\nxiv\t105136\n关永超\t105137\n本作\t105138\n课堂派\t105139\n倾斜面\t105140\n2.6万元\t105141\n有机生活\t105142\n工商界\t105143\n21.2.0.113\t105144\n山西日报\t105145\n向量化\t105146\n居委会\t105147\n第(1\t105148\nHTTP2\t105149\n安上\t105150\n中国高速公路\t105151\n数据包络分析\t105152\ncanceled\t105153\n部落战\t105154\n泰东\t105155\n鲁人社\t105156\n保力\t105157\n近7天\t105158\n深圳大疆\t105159\n永康城\t105160\n永胜\t105161\n至尊\t105162\n宠宠\t105163\nuninitialized\t105164\n助推\t105165\n巧媳妇\t105166\n汕尾\t105167\n欧吉桑\t105168\n祭典\t105169\nf-47218\t105170\nG007\t105171\n北京首汽租车公司\t105172\n山东省司法厅\t105173\n超视\t105174\n乌木喉\t105175\n清华大学生命科学学院\t105176\n陶瓷史\t105177\n四方新村\t105178\nhive函数\t105179\nSupervision\t105180\n林芝县\t105181\n旅游篇\t105182\nb27\t105183\n灯塔\t105184\nxfplay-面包网\t105185\niso45001\t105186\n难望\t105187\n项梁\t105188\n一家人\t105189\n滴滴滴\t105190\n校花的贴身高手\t105191\n关系分析\t105192\n上海酒店式公寓\t105193\n冰鲜\t105194\nroomba\t105195\n烟烟罗\t105196\n1975年\t1
05197\n踏歌\t105198\n新疆维吾尔自治区人民医院\t105199\nLEED认证\t105200\n干煸豆角\t105201\n侨领\t105202\n木奇灵\t105203\n_跖疣\t105204\nSTARBUCKS\t105205\n宝盈\t105206\n新浪3X3篮球黄金联赛\t105207\n球票\t105208\n聚卓\t105209\n2555\t105210\n6310\t105211\n江原道\t105212\n御子\t105213\n4缸\t105214\n一至十二月\t105215\nNiushop\t105216\n金盏菊\t105217\n一段话\t105218\n舞花桥\t105219\n羟乙基淀粉\t105220\n语文学刊\t105221\nSM951\t105222\nInverted\t105223\n3880\t105224\nGOLF\t105225\nsata接口\t105226\n新闻性\t105227\n互动投影系统\t105228\n衰减\t105229\n半年前\t105230\norign\t105231\n上传于\t105232\n存货\t105233\n梅思安\t105234\n裤型\t105235\n朴树\t105236\n春闺\t105237\n照价\t105238\n阴阳历\t105239\n20150316\t105240\n月晓风清\t105241\nReagent\t105242\n900年\t105243\nExchanges\t105244\n康巴\t105245\n未病\t105246\nYY菲儿\t105247\n复制粘贴\t105248\n鮟鱇\t105249\n11万元\t105250\n李锐麦迪\t105251\n译经\t105252\n2014-11\t105253\n中国环境科学学会\t105254\nBig笑工坊\t105255\n最为\t105256\n保时捷\t105257\n融云\t105258\n申银万国证券\t105259\n橱柜\t105260\n领班\t105261\n岳岳\t105262\nGTX670\t105263\n注脚\t105264\n萧陶\t105265\n北京林业大学研究生院\t105266\n气门嘴\t105267\n账日\t105268\n中国教育新闻网\t105269\nstarling\t105270\n贾探春\t105271\nPTE\t105272\n一年\t105273\n中关村北\t105274\n晓依\t105275\ndesolate\t105276\nhuffman\t105277\n独奏版\t105278\n超声波雷达\t105279\nE570\t105280\n洗冤\t105281\n孤儿网\t105282\n插页\t105283\n税点\t105284\n毕业论文评语\t105285\n热菜\t105286\n上海长风公园\t105287\n早孕\t105288\n锦浪\t105289\nDHT11温湿度传感器\t105290\n上阵\t105291\n柠檬云财税\t105292\n歪\t105293\n海南省文昌市政府\t105294\n创奇\t105295\n两广\t105296\n无法阻挡\t105297\np8700\t105298\n工作居住证\t105299\n89623104\t105300\n钩机\t105301\n危规\t105302\ncolourful\t105303\n面带\t105304\n熊出没变形记\t105305\n17栋\t105306\n容桂街道\t105307\n敦煌市\t105308\n老架一路\t105309\n合作精神\t105310\n政治史\t105311\n微课版\t105312\n九龙坡区\t105313\n路由\t105314\n上海美术馆\t105315\nslect\t105316\n休办\t105317\n心田\t105318\n西安交通大学城市学院\t105319\n反演\t105320\n支山歌\t105321\n卵巢囊肿\t105322\n本构\t105323\n_湖北日报_多媒体报\t105324\nSrl\t105325\n电波表\t105326\n懂了\t105327\n转学生\t105328\n荆州\t105329\nTe\t105330\npagoda\t105331\n研途\t105332\n信丰快递\t105333\n老鳖\t105334\n异世灵武天下\t105335\n隔爆型\t105336\n紫檀\t105337\n沙子\t105338\n组\t105339\n贷后\t10534
0\n北大国发院\t105341\n2333333\t105342\n金善英\t105343\n威龙\t105344\n暖锋\t105345\n鑫城\t105346\ntv3\t105347\n无业\t105348\n天之大\t105349\n四川省监察委员会\t105350\nincomplete\t105351\n印鉴卡\t105352\n洪崖洞\t105353\n致害\t105354\n香海大桥\t105355\n吴王夫差\t105356\n惧内综合征\t105357\n层层\t105358\n游遍\t105359\nespoir\t105360\n身板\t105361\n似曾相识\t105362\n高教版\t105363\n小浣熊水浒卡\t105364\n韩伦\t105365\n亦庄镇\t105366\n高调\t105367\n摘要版\t105368\n马里兰大学帕克分校\t105369\n玛尼\t105370\n爱奇艺电视果\t105371\n尚书\t105372\nSybil\t105373\nlyw\t105374\n武汉大学国际软件学院\t105375\nSiege\t105376\npso\t105377\n素数\t105378\n回头客\t105379\nnylons\t105380\ncanoe\t105381\n集体经营性建设用地\t105382\n常识坊\t105383\n空空空\t105384\n86400\t105385\n龙奥大厦\t105386\n金刚玻璃\t105387\n98天\t105388\n夏佳\t105389\n水果捞\t105390\n机绣\t105391\n猜字谜\t105392\ndiner\t105393\n报生育\t105394\n旗袍\t105395\n蕨根粉\t105396\n合肥先锋网\t105397\nCapgemini\t105398\nOHSAS18001\t105399\n物类\t105400\n唐昌\t105401\n仅限\t105402\n海底总动员\t105403\n财气\t105404\n古诺\t105405\n肯定句\t105406\n广告学专业\t105407\n说走就走的旅行\t105408\n开封市公安局\t105409\n除霜\t105410\n地步\t105411\n库尔班\t105412\n67页\t105413\n上国\t105414\n颗粒状\t105415\nqck\t105416\n1090\t105417\nCouple\t105418\n2460\t105419\n金典\t105420\n自粘防水卷材\t105421\n泰州机场\t105422\nBOOTLEG\t105423\n狗男女\t105424\n中国国际工业博览会\t105425\n电脑启动项\t105426\n汨罗市\t105427\n戴德梁行\t105428\n加密播放器\t105429\n红肿\t105430\n双孢蘑菇\t105431\n灵异_起点中文网\t105432\n德国汉莎航空公司\t105433\n别管\t105434\n失节\t105435\n贾静\t105436\n红岩\t105437\n单良\t105438\n月盛斋\t105439\nз\t105440\nhiro\t105441\n$this\t105442\n中山火炬开发区\t105443\n触屏笔\t105444\n421\t105445\n张坊镇\t105446\n渐层\t105447\nteck\t105448\n宠物龟\t105449\n55you\t105450\nmysql库\t105451\n娑婆\t105452\nwaitkey\t105453\n旧作\t105454\ncotton\t105455\n6_\t105456\n神火集团\t105457\n分宜县人民政府\t105458\n氧化钙\t105459\n十四次\t105460\nbible\t105461\n孝房网\t105462\n111111111\t105463\n爱凤\t105464\n广州金升阳科技有限公司\t105465\n磐\t105466\n300回\t105467\nBurberry巴宝莉\t105468\n鱼米之乡\t105469\nindie\t105470\n土包\t105471\n新加坡航空\t105472\n教父2\t105473\n玉溪市政府\t105474\n吊兰\t105475\n精良\t105476\n国民级\t105477\n阿伟\t105478\nPH计\t105479\n运算律\t105480\n体香\t105481\npymsql\t105482\n华军软件园\t10
5483\n韶山路\t105484\n联创光电\t105485\n谭恩美\t105486\nhallo\t105487\n化学期\t105488\n10单\t105489\n渔家傲秋思\t105490\n井然有序\t105491\n湖北省地方税务局电子税务局\t105492\n井眼\t105493\n绍新\t105494\n微信公众平台公众号\t105495\n第二十八届\t105496\n超级机器人大战F\t105497\nmt7\t105498\n黔南州人民政府\t105499\n建材展\t105500\neoLinker\t105501\n痛殴\t105502\n550\t105503\nNX12.0\t105504\n王保安\t105505\n专号\t105506\n帅呆\t105507\n0550\t105508\n逃逸\t105509\nTyp\t105510\n碎屑岩\t105511\n华侨大学\t105512\n圈网\t105513\n有名\t105514\n魅妖\t105515\n化工原料\t105516\n革卦\t105517\n打印版\t105518\n海甸\t105519\ndoci\t105520\n周洋\t105521\n富论\t105522\n置之度外\t105523\n伊特\t105524\n杭州市西溪医院\t105525\n黛比\t105526\ntian\t105527\n2019年2月\t105528\n25帧\t105529\n巨风\t105530\n最好色\t105531\n支付通\t105532\n脐带\t105533\n汤博\t105534\n接合器\t105535\nDCI\t105536\n说不出来\t105537\n巴河镇\t105538\nyuzhongwusan\t105539\nXY\t105540\n杜甫草堂\t105541\nfirming\t105542\n胡大海\t105543\n阳基御龙山\t105544\n7千克\t105545\nkawa\t105546\n秀彬\t105547\n联署\t105548\n治疗术\t105549\n大连理工大学\t105550\n自己去\t105551\n1000位\t105552\n年景\t105553\n凯格尔\t105554\n碧江区\t105555\n应县木塔\t105556\n铝压铸\t105557\nBlogs\t105558\nRos\t105559\nT34\t105560\nfcuk\t105561\n凌家滩\t105562\n市烟草专卖局\t105563\n记者证\t105564\n五四杯\t105565\n减肥操\t105566\n铁甲钢拳2\t105567\nleap\t105568\n国产在线\t105569\n昆山二中\t105570\n张浩然\t105571\n122\t105572\n魏振海\t105573\n静风\t105574\n株洲县政府\t105575\n热敏灸\t105576\n金范\t105577\n曲炜\t105578\n贺友直\t105579\n王琨\t105580\n招商局\t105581\n场刊\t105582\n406号\t105583\n平岗\t105584\n2007年6月\t105585\n徐立\t105586\nChords\t105587\n超链\t105588\n杨达\t105589\n100多度\t105590\n36套\t105591\n浮出\t105592\nwode\t105593\n凸包\t105594\n苏州古城区\t105595\n重活\t105596\n2.05\t105597\n裸死\t105598\n南台\t105599\n京东联盟\t105600\n3PE防腐钢管\t105601\n胆囊壁\t105602\n深圳奥数网\t105603\n上框\t105604\n16欧\t105605\n两三分钟\t105606\n西南财经大学天府学院\t105607\n空速\t105608\n年金保险\t105609\n家鼠\t105610\n于梦竹\t105611\nberries\t105612\n折剪\t105613\n昂秀\t105614\nGIPHY\t105615\n六安职业技术学院\t105616\n九峰\t105617\n山茶油\t105618\n_马自\t105619\n虚拟信用卡\t105620\n扩展器\t105621\nAMD处理器\t105622\n秃宝\t105623\nMOODYZ\t105624\n冷轧板\t105625\n3116\t105626\n贤良\t105627\n长治医学院\t105628\n046期\t105
629\nBoxster\t105630\n策展人\t105631\n11.30\t105632\n城区\t105633\n羊楼\t105634\n公派\t105635\n无问西东\t105636\n见光死\t105637\n旅行篇\t105638\n谢金星\t105639\n冲水\t105640\n辅导班\t105641\n生产法\t105642\n黑松\t105643\nyoutub\t105644\n寿衣\t105645\nyolov2\t105646\n格林小镇\t105647\nroo\t105648\n蝙蝠侠前传3\t105649\nFRANK\t105650\n模拟战\t105651\n铁兵\t105652\n真命天子\t105653\n艳香迷醉\t105654\nMOUNTAIN\t105655\n遗忘者\t105656\nBaseActivity\t105657\n裤子女\t105658\nETFs\t105659\n广州联通\t105660\n恒大御龙天峰\t105661\n案牍\t105662\n催办\t105663\n调速\t105664\n950元\t105665\nEmpowering\t105666\n西堤头镇\t105667\n游研社\t105668\n时之需\t105669\n何小平\t105670\n侦察车\t105671\n小学五年级英语\t105672\n桑塔纳3000\t105673\n湘水博园\t105674\n书谱\t105675\n倒算\t105676\nPreviews\t105677\n宁波东站\t105678\n标的公司\t105679\n陆凯\t105680\n以前\t105681\n豺狗\t105682\n剪贴板\t105683\n700名\t105684\n红专厂\t105685\n大饭店\t105686\n佳益\t105687\nby\t105688\n安志杰\t105689\nCriteria\t105690\n乔治·阿玛尼\t105691\n东辉\t105692\n标题段\t105693\nexpected\t105694\n雪顿节\t105695\n豌豆花\t105696\n考试卷\t105697\npycurl\t105698\n制止\t105699\n陈震\t105700\n杂烩\t105701\n曹路\t105702\n科技圈\t105703\n爱范儿\t105704\ng36\t105705\n不听话\t105706\n酥梨\t105707\n组套\t105708\nzion\t105709\n一期一振\t105710\n光身\t105711\ndn15\t105712\n莫仕\t105713\n建经\t105714\nLubricants\t105715\n1973年\t105716\n对策\t105717\n季度\t105718\njianti\t105719\n上海金山工业区管理委员会\t105720\n顺子\t105721\nAn\t105722\n金算盘\t105723\n西北工业大学网络教育学院\t105724\n拔款\t105725\n县级\t105726\n富士施乐吧\t105727\namdcpu\t105728\n王小玮\t105729\n3.5吨\t105730\n好友邦\t105731\nEnough\t105732\nSHE\t105733\n12月26日\t105734\n微习惯\t105735\n奇拉比\t105736\n安微省\t105737\n跳转\t105738\n合肥市城乡建设委员会\t105739\nConsultants\t105740\n最新版\t105741\n遂宁安居区\t105742\n中国人民公安大学\t105743\n液压阀\t105744\nstrathberry\t105745\n第3_\t105746\n六倍\t105747\n之二十五\t105748\n富阳政府\t105749\n手办展\t105750\n流殇\t105751\n志\t105752\nturtle\t105753\n1080P高清MV\t105754\n英镑\t105755\n浓缩版\t105756\n疏风\t105757\n复耕\t105758\n柏联\t105759\n马刺\t105760\nCOMMENT\t105761\nncnn\t105762\n侨社\t105763\nTinashe\t105764\n漫畫\t105765\nETC服务点电话\t105766\n好逸恶劳\t105767\n朱彤\t105768\n600739\t105769\n邵磊\t105770\n经营性\t105771\n品牌方\t10577
2\nUniversitt\t105773\n花里胡哨\t105774\n曲靖市\t105775\n二乙烯三胺\t105776\n福建省教育厅办公室\t105777\n壮锦\t105778\n驶出\t105779\n阴茎\t105780\n髻\t105781\n我叫金三顺\t105782\n蛋椅\t105783\nBlack\t105784\nagentd\t105785\n万商货源网\t105786\n小喇叭\t105787\n壁灯\t105788\n徐工集团\t105789\n圣路\t105790\nova\t105791\n想开始\t105792\n刘骥\t105793\n合欢宫记事\t105794\n姥爷\t105795\n永欣\t105796\n会稿\t105797\n赛欧逍客\t105798\n门齿\t105799\n18章\t105800\na=1\t105801\n习近平关于全面从严治党论述摘编\t105802\nPotter\t105803\n努沙杜瓦\t105804\n耐用型\t105805\n观山\t105806\n丽江酒店\t105807\n房产大厦\t105808\n高家庄\t105809\nxcom2\t105810\n刚察县人民政府\t105811\n共立\t105812\n光谷步行街\t105813\n平等院\t105814\n九一八事变\t105815\n观世音\t105816\n迫不得已\t105817\n黄敏\t105818\n鸡蛋糕\t105819\n散炮\t105820\n李源\t105821\n开关型\t105822\n螺丝批\t105823\nRegistered\t105824\n禁忌之恋\t105825\n莱士\t105826\n20150514\t105827\n中交天津航道局有限公司\t105828\n中盟\t105829\nV1.0_\t105830\ntbbt\t105831\n杨集\t105832\ntinkpad\t105833\nZlib\t105834\n耿爽\t105835\n社会保险法\t105836\n创蓝\t105837\n北京375路\t105838\n马头山\t105839\n肉脯\t105840\nwisdomone\t105841\n传课\t105842\n至强e5\t105843\nPlat\t105844\n花蝶\t105845\n性冷淡\t105846\nManiac\t105847\n律政英雄\t105848\njbk\t105849\n唐纳德·特朗普\t105850\n二氯亚砜\t105851\nBenson\t105852\n醒着\t105853\n深圳百姓网\t105854\n美英法\t105855\n九月底\t105856\n谷主\t105857\nnippon\t105858\n三黄片\t105859\nBeth\t105860\n南京师范大学附属中学\t105861\n第十一卷\t105862\nE420\t105863\nglo\t105864\n乐善好施\t105865\n魏冰雪\t105866\n纬二路\t105867\n3个半小时\t105868\nminecr\t105869\n刑庭\t105870\n遥远的歌\t105871\n信泰\t105872\nDicom\t105873\n配员\t105874\n广安路\t105875\n【格林\t105876\n施耐德\t105877\n储存器\t105878\n硕人\t105879\n14.1.1\t105880\n烧烤架\t105881\n小升初数学\t105882\n5.com\t105883\n高高手\t105884\n广东省民政厅\t105885\n裤裆\t105886\nTAMIYA\t105887\n被毁\t105888\n三十年代\t105889\n002415\t105890\nhex2bin\t105891\n仃\t105892\nemf\t105893\n蓝臻\t105894\n中女\t105895\n惠崇春江晚景\t105896\nURl-team\t105897\n31.4\t105898\n不搞笑\t105899\n600634\t105900\nExclude\t105901\nerb\t105902\n摩羯座男\t105903\n郑瑞\t105904\n告诉你\t105905\n铁牌\t105906\n存储篇\t105907\nfadein\t105908\n崔\t105909\n李瑾\t105910\n敖汉旗\t105911\nAndroidStudio3.0\t105912\ngooglechrome\t105913\nlaplacia
n\t105914\n椴树蜜\t105915\n上体\t105916\n情定\t105917\n小火车\t105918\n1处\t105919\n400集\t105920\nhitech\t105921\n加油量\t105922\n小丑皇\t105923\n子宫肌瘤\t105924\n夜影\t105925\n来自深渊\t105926\n杨箕村\t105927\n黄卫平\t105928\n1964\t105929\n冯柳\t105930\ncocoscreater\t105931\nedison\t105932\n梅雨天\t105933\n2015年春节\t105934\n螯虾\t105935\n初中篇\t105936\n数聚\t105937\nppt版\t105938\n20160205\t105939\n张宝顺\t105940\nlaos\t105941\n小破孩\t105942\n中崎\t105943\n连续体\t105944\n2017年8月3日\t105945\n长春观\t105946\n硅酸钾\t105947\n到会不会\t105948\n两当县\t105949\n学习路上——习近平总书记系列重要讲话\t105950\nbookshop\t105951\n残秽\t105952\n卸荷阀\t105953\n更名\t105954\n甲骨文公司\t105955\n变形金刚4\t105956\n解放战争\t105957\n头皮癣\t105958\n玉器街\t105959\n125分\t105960\ndispersion\t105961\n翘起\t105962\n张静\t105963\n冷战思维\t105964\n72篇\t105965\n9780\t105966\njia\t105967\n吹尽狂沙始到金\t105968\n北城区\t105969\n田龟源五郎\t105970\nGROW\t105971\n骡马市\t105972\n行下\t105973\n90个\t105974\n名局\t105975\n从事\t105976\n医\t105977\npml\t105978\n把儿乐队\t105979\n罗罗娜\t105980\n盲沟\t105981\n欢爱\t105982\n助演\t105983\n铸模\t105984\n眼波\t105985\n王培安\t105986\n肺部\t105987\n雷达图\t105988\nusually\t105989\nexpandable\t105990\n邵氏\t105991\n杂粮煎饼\t105992\njets\t105993\n修善寺\t105994\n左线\t105995\n人事部门\t105996\n王府井\t105997\n桃花源古镇\t105998\ntips\t105999\n凶骨\t106000\n网格画\t106001\n远洋国际中心\t106002\n3337\t106003\n永兴坊\t106004\n陈明明\t106005\n福费\t106006\n西安音乐厅\t106007\n中国药神\t106008\nfigure\t106009\n1267\t106010\n赵宁\t106011\neCos\t106012\n2019届\t106013\n学历证书电子注册备案表\t106014\n主卧门\t106015\nbox\t106016\n电视新闻\t106017\n六龙争霸3D\t106018\n【乒乓网\t106019\n救主\t106020\n市信访局\t106021\n汇鑫\t106022\nbeggar\t106023\n适合于\t106024\n9097\t106025\ner3200\t106026\nEia\t106027\n悦步跳舞毯\t106028\n插人\t106029\n舞伴\t106030\n分解师\t106031\n猎杀潜航吧\t106032\n钱壮飞\t106033\n奇正\t106034\n分段式\t106035\n包装器\t106036\nSO4\t106037\n翡翠山\t106038\nroot_\t106039\nB52\t106040\n金相\t106041\n凌通\t106042\n党政网\t106043\n中新国际城\t106044\n4521\t106045\n国产动画\t106046\n罗辑思维\t106047\n美容品\t106048\n路虎特种兵\t106049\nPCM-D100\t106050\nboyz\t106051\n明药\t106052\n曲妥珠单抗\t106053\n大检阅\t106054\n升降器\t106055\n深恶\t106056\nLayui\t106057\n6万多\t106058\
n青囊\t106059\n百度站长\t106060\nlaplace\t106061\n刀削面\t106062\n杨浪\t106063\n纳滤\t106064\n安慰剂\t106065\n陶瓷芯\t106066\n招商证券\t106067\n土耳\t106068\nT540p\t106069\nrtmp流\t106070\n刘晓梅\t106071\naims\t106072\n杨晃\t106073\n谢冕\t106074\namigo\t106075\nHeroine\t106076\n携程旅游】旅游_旅游网_旅游线路_自驾游_旅游团_周边游\t106077\ndeer\t106078\n刘宇昆\t106079\n必需脂肪酸\t106080\nfushu\t106081\n间距\t106082\n6150\t106083\nkat\t106084\n千万吨\t106085\n月火\t106086\n科普贴\t106087\nLPDDR4\t106088\n佛教四大名山\t106089\n小鱼\t106090\n甘遂\t106091\n番号集\t106092\n南骏\t106093\nPeg\t106094\n45位\t106095\nDelvaux\t106096\n新君越论坛论坛\t106097\n十七帖\t106098\n日产汽车\t106099\n五措\t106100\n阻塞式\t106101\n夜关门\t106102\n扬州市梅岭中学\t106103\nQWidget\t106104\n物理师\t106105\n南府\t106106\n古川雄辉\t106107\n师表\t106108\n起凡吧\t106109\n新一代\t106110\n张力控制器\t106111\n第六十一章\t106112\n氟化铵\t106113\nCarnival\t106114\n80公斤\t106115\n宜昌市人力资源和社会保障局\t106116\n国信证券股份有限公司\t106117\n微管\t106118\n中温\t106119\n咬合\t106120\n奚沾雨\t106121\n尖嘴\t106122\n分钱\t106123\n捷途X95\t106124\n恶妻\t106125\nレシピ\t106126\n498元\t106127\n外显\t106128\nexing\t106129\n18052期\t106130\n玉木宏\t106131\nGVG\t106132\nFRESH\t106133\n八阶\t106134\n1352\t106135\n徐莹\t106136\n偏序\t106137\n鄞州公园\t106138\nteleport\t106139\n277号\t106140\n两德\t106141\n参考性\t106142\n艾拓思\t106143\n洗车器\t106144\n庄信万丰\t106145\n衣裳\t106146\n河北省文化厅\t106147\n奶贝\t106148\n妊娠期\t106149\n宝宝们\t106150\n宁夏回族自治区经济和信息化委员会\t106151\n抄件\t106152\n期颐\t106153\nmfcc\t106154\n康星\t106155\n首起\t106156\nUI组件库\t106157\n江苏省电力公司\t106158\npieces\t106159\n3.0.9\t106160\n深圳地铁12号线\t106161\n一见不钟情\t106162\n百炼钢\t106163\n衰耗\t106164\ncult片\t106165\n追索权\t106166\n9.3\t106167\n国家体育馆\t106168\n鬼作\t106169\nRipple\t106170\n名点\t106171\n尽意\t106172\n蓬山\t106173\ncaTV\t106174\n磷霉素氨丁三醇散\t106175\n著作权人\t106176\npscc吧\t106177\n北山路\t106178\n技术工\t106179\n990期\t106180\ncns\t106181\n黑将s5\t106182\n扩大器\t106183\nbaozi\t106184\n230克\t106185\n椎间孔镜\t106186\n贺国丰\t106187\nunnatural百度云\t106188\n一个10\t106189\n反铲\t106190\n0.45\t106191\n宁夏自治区\t106192\n求验\t106193\n马6\t106194\n代签\t106195\nassociate\t106196\n不看电视\t106197\n翻天覆地\t106198\n2140\t106199\n4pin\t106
200\n帝皇侠\t106201\n点旋转\t106202\n歆语\t106203\nscanner\t106204\noccasion\t106205\n1月18日\t106206\n开埠\t106207\n宋家兴\t106208\n事字\t106209\n麦饭石锅\t106210\n$query\t106211\n交还\t106212\n端游\t106213\n郑瑞熙\t106214\n鸢\t106215\nQT5\t106216\n优财\t106217\n直读\t106218\n环氧胶泥\t106219\n上溪镇\t106220\n方位角\t106221\n不靠\t106222\n业主论\t106223\n紧集\t106224\n悬念\t106225\n周亮\t106226\n咿咿呀呀\t106227\n茶者\t106228\n研磨剂\t106229\n永庆坊\t106230\n轮候\t106231\n桑拿炉\t106232\ntgfc\t106233\n腈\t106234\n00315\t106235\n禅悟\t106236\n赤壁论坛\t106237\n业务员\t106238\n猎取\t106239\n排气管\t106240\ncacls\t106241\n派遣单位\t106242\n鱼贯而入\t106243\n莱斯大学\t106244\n济州府\t106245\n姜志刚\t106246\n南昌县\t106247\n3min\t106248\nUTF-8编码\t106249\n艾一若\t106250\n齐珊珊\t106251\nhessian\t106252\n自定义人生网\t106253\n金立S10\t106254\n黎川\t106255\n答语\t106256\nAFT\t106257\n太山\t106258\n口水篇\t106259\nli0924\t106260\neQ\t106261\n相位仪\t106262\n符咒\t106263\n六安市人力资源和社会保障局\t106264\nbanjia\t106265\n我能\t106266\n刘志强\t106267\n偏心\t106268\n金杯阁瑞斯\t106269\nJulien\t106270\nDomaine\t106271\n救生衣\t106272\n批卡\t106273\n妇孺\t106274\n助词\t106275\nifound\t106276\n画魂\t106277\n20150723\t106278\n教学机构\t106279\n聂永\t106280\n麻辣火锅\t106281\n旅游型\t106282\n深圳晚报\t106283\n1937年8月28\t106284\n中华人民共和国婚姻法\t106285\n精简包\t106286\n植保站\t106287\nubutun\t106288\njre8\t106289\n惠聚工作队\t106290\nBrowse\t106291\n冯道\t106292\n妖魔化\t106293\n狗脊\t106294\n反戈\t106295\n台柳路\t106296\n上证50ETF\t106297\n七周年\t106298\n平面抛光机\t106299\n生死瞬间\t106300\ncdo\t106301\n别野\t106302\n强化版\t106303\n乐华城\t106304\n报喜\t106305\n赚品\t106306\n不接电话\t106307\n第四种\t106308\nRocky\t106309\nLEM\t106310\n第八段\t106311\n沿袭\t106312\n风云小说阅读网\t106313\n中孚\t106314\n星语海蓝\t106315\n营销战\t106316\n喜上加喜\t106317\ncellid\t106318\n香木\t106319\nweapon\t106320\n快鱼\t106321\n沈醉\t106322\n当朝\t106323\n甘地传\t106324\n所属期\t106325\n7.4%\t106326\n滚动体\t106327\n水墨\t106328\n王力平\t106329\n卷一\t106330\n诺基亚8sirocco\t106331\n霍桑二世\t106332\n后现代性\t106333\n翼航模\t106334\n小金瓶\t106335\nhanju\t106336\n万劫连击\t106337\n八一勋章\t106338\n吉林沙漠\t106339\n腰机\t106340\nVIF\t106341\n江汉平原\t106342\nOakland\t106343\n跨距\t106344\nb860a\t106345\n手绘笔\t106346\nH
EIC\t106347\n温雅\t106348\n肌透\t106349\n失主\t106350\n褒贬\t106351\n长春师范大学\t106352\nDRL\t106353\nRT-AC87U\t106354\n大新村\t106355\n上海一大会址\t106356\n房产局\t106357\n省级政府\t106358\nWho\t106359\n森下悠里\t106360\nneighbourhood\t106361\nAutoLISP\t106362\n杨浦高级中学\t106363\n晓说2017\t106364\n杭州市测绘与地理信息局\t106365\nAmlogic\t106366\netal\t106367\nINSPIRE\t106368\n良人\t106369\n三山四海\t106370\n通透式\t106371\n中亿丰建设集团股份有限公司\t106372\n青藏地区\t106373\n曹玄亮\t106374\n忙季\t106375\n偏压\t106376\n广东统计信息网\t106377\nmediation\t106378\nADSafe\t106379\n霍邱县\t106380\n嗨小漫\t106381\n美图秀\t106382\nnoted\t106383\n产教\t106384\n囧女翻身之嗨如花\t106385\n曾敏杰\t106386\n瑞虎3xe\t106387\n一揽\t106388\n冶金工程专业\t106389\n皮卡车\t106390\n点表\t106391\n卧铺\t106392\n企业文规\t106393\n桥本环奈\t106394\n一指禅\t106395\nFurniture\t106396\n大鳌镇\t106397\n不要脸\t106398\n十八涧\t106399\n嘤嘤怪\t106400\n泡沫板\t106401\n鸭鸭\t106402\n网络电\t106403\n南昌市区\t106404\n2147483648\t106405\n自报\t106406\nAa\t106407\nSUPREME\t106408\n肉林\t106409\n企业策划书\t106410\nHolt\t106411\nburger\t106412\n运输业\t106413\n天冰\t106414\n苏坡\t106415\n小微贷款\t106416\n西安日报\t106417\n连锁化\t106418\nopple\t106419\n橡皮人\t106420\ncardiovascular\t106421\ne5620\t106422\n驻唱\t106423\n回话\t106424\n锁文\t106425\n杀青机\t106426\n钢网\t106427\n墨囊\t106428\n12560\t106429\n五笔网\t106430\ngory\t106431\n桐生\t106432\n赵飞\t106433\n比兴\t106434\n马德\t106435\n```\t106436\n公拍网\t106437\nhardening\t106438\n陈汝佳\t106439\nKrist\t106440\nbuyout\t106441\n索引列\t106442\n筑房网\t106443\n地日\t106444\niphone7home键\t106445\n早上六点半\t106446\n一家老小\t106447\n无领导小组讨论\t106448\n妊娠斑\t106449\n题录\t106450\n046\t106451\n滴速\t106452\nseajs\t106453\nsolveigmm\t106454\n王轶\t106455\n600489\t106456\n3万吨\t106457\n_农机通讯社\t106458\n筑客网\t106459\n弓藏\t106460\nt=0\t106461\n中国连锁网\t106462\n一、二\t106463\n荆高峰\t106464\n壮烈\t106465\n梨花落\t106466\n五经\t106467\n彝族\t106468\n抗日血战\t106469\n北京世纪坛医院\t106470\n横着\t106471\ngrunt\t106472\nKE\t106473\npython+selenium\t106474\n虚拟存储器\t106475\n徐文明\t106476\n无时无刻\t106477\nUnity3D\t106478\n喷施\t106479\n2.1.2\t106480\ntxt文\t106481\n一敌百\t106482\n戏骨\t106483\n触探\t106484\npop子\t106485\n普及\t106486\n汪明荃\t106487\n
持力\t106488\n分家析产\t106489\nHD磁力链接\t106490\nwindown\t106491\n见风使舵\t106492\n帆软论坛\t106493\n撞肿\t106494\n图们市\t106495\ndpc\t106496\n污漫画\t106497\n吊柜\t106498\nMiFi\t106499\n果滨\t106500\ntropical\t106501\n雷克塞\t106502\n3,4月份\t106503\n俩种\t106504\n逆天仙尊\t106505\nHOU_JUN\t106506\n罗哥\t106507\n小九九\t106508\n谍网\t106509\n排名赛\t106510\n莎莫\t106511\nbuild-tools\t106512\n漫画式\t106513\n湖北省水利厅\t106514\n海棠区\t106515\n萨拉查\t106516\n苏州工业园区新闻中心\t106517\npowerdesigner15\t106518\n不用怕\t106519\naggre\t106520\nfdisk\t106521\n驼铃网\t106522\ninux\t106523\n中海油研究总院\t106524\n音质版\t106525\n新日电动车\t106526\n慢性荨麻疹\t106527\n库肯霍夫\t106528\nmarmoset\t106529\n铭品\t106530\n流源\t106531\ntcad\t106532\ninnodb_flush_log_at_trx_commit\t106533\n闪送\t106534\n林保怡\t106535\ndecreased\t106536\n绕城高速公路\t106537\ncodeblocks\t106538\nMaserati\t106539\nunusual\t106540\nDitF\t106541\n出户\t106542\n在册\t106543\n氦\t106544\n交通协管员\t106545\n中国法院网\t106546\n雪莲果\t106547\nVSC\t106548\n投行云课堂\t106549\nRemoting\t106550\n门禁\t106551\n1ml\t106552\n中华民国临时约法\t106553\n脑心\t106554\n枣庄高新区\t106555\n斗龙战士\t106556\n华政\t106557\n岩崎\t106558\n递送\t106559\nWorldMP3\t106560\n桃浦新村\t106561\n1581\t106562\n循礼\t106563\nVideo\t106564\n视景\t106565\n微淘\t106566\n率土之滨\t106567\n发热\t106568\ndiskutil\t106569\n阿法骨化醇\t106570\n透明状\t106571\n打组\t106572\n大唐天下吧\t106573\n两聚一高\t106574\n猩红之月\t106575\nmpo\t106576\n班禅额尔德尼\t106577\n黄花国际机场\t106578\n盛极一时\t106579\n抗凝剂\t106580\nSEK\t106581\n墩\t106582\n数百亿\t106583\nMarmalade\t106584\n寒江\t106585\n青岛市崂山区\t106586\n三六零\t106587\n凶徒\t106588\n11季\t106589\n三包期\t106590\n导函数\t106591\n最高处\t106592\n雪无痕\t106593\nkingstudy\t106594\n镇湖\t106595\n中国铁建房地产集团有限公司\t106596\n殷俊\t106597\n吴兴新闻网\t106598\nScale\t106599\n虹许路\t106600\n蓝幕\t106601\n褪黑素\t106602\n哑舍\t106603\n明宪宗\t106604\n弧面\t106605\nBertrand\t106606\n离轴\t106607\ndisallowed\t106608\n羚羊\t106609\nVILLE\t106610\n华罗施瓦辛格\t106611\n垒打\t106612\n六年级英语\t106613\nAGA\t106614\n八家嘉苑\t106615\nirc\t106616\n耐信\t106617\nWin7双系统\t106618\n0992\t106619\n山西省民政厅\t106620\n运管局\t106621\n上海16号线\t106622\n羽状\t106623\n睡不好\t106624\n排序\t106625\n最终幻想吧\t106626
\n光大证券\t106627\n小阁楼\t106628\n京都念慈庵\t106629\n傻夫\t106630\natlantic\t106631\n潘云鹤\t106632\n济州航空\t106633\n女婿\t106634\n十二时\t106635\n浙江省水利厅\t106636\nUSP\t106637\n摆烂\t106638\n2014—2015年度\t106639\n三分地论坛签证+\t106640\n醒了\t106641\n性教育片\t106642\n盛典\t106643\n文件费\t106644\n药柜\t106645\n2017-18年\t106646\n王建民\t106647\n鲁尔山\t106648\n七盏\t106649\n马克飞象\t106650\nfie\t106651\nAilete\t106652\n拼贴\t106653\n小娴\t106654\n嘴鞋\t106655\n断头舞\t106656\n荒年\t106657\nw+\t106658\n沈嘉\t106659\nFollowme\t106660\n北岳恒山\t106661\n驾驶位\t106662\n刘弘\t106663\n电子信息科学与技术专业\t106664\n城南村\t106665\n试饮\t106666\nsoi\t106667\n中铁大桥局\t106668\nbondage\t106669\n塑身机\t106670\nEXCEL2007\t106671\n别动不动\t106672\n金缮\t106673\n高中数学必修2\t106674\nnvicat\t106675\n辽河路\t106676\n增新\t106677\n民艺\t106678\nspecialists\t106679\nPlush\t106680\n天将雄师\t106681\n阿胶块\t106682\n小伟\t106683\n秸\t106684\n肿大\t106685\n定期报告_公司\t106686\n万能版\t106687\n餐吧\t106688\n苏共\t106689\n铭感\t106690\njcombobox\t106691\n榻榻米床\t106692\n眼疾\t106693\n月曜日\t106694\n电脑节\t106695\nw2017\t106696\n万邦都市花园\t106697\n水金\t106698\n祝语\t106699\n1200亿\t106700\nmega石\t106701\nmiele\t106702\n工具盒\t106703\n省心范文网\t106704\n中山酒店\t106705\nWebMagic\t106706\n赢家\t106707\n紫阳花\t106708\nCanoScan\t106709\n下笔\t106710\n5532\t106711\n福特撼路者论坛论坛\t106712\nsynctoy\t106713\n帽筒\t106714\n狂父\t106715\n跖\t106716\n海上天书奇谈\t106717\n未干版\t106718\n保修\t106719\n状元们\t106720\n2243\t106721\n鼻水\t106722\n舍念念\t106723\n征信会\t106724\n七类\t106725\n甘肃省人民政府\t106726\n去名\t106727\nproactive\t106728\n09秒\t106729\n东座\t106730\nBusan\t106731\n木乃伊4\t106732\n戴胜鸟\t106733\n五月中旬\t106734\n气士\t106735\nCooler\t106736\n00后早恋吧\t106737\n天国拯救修道院\t106738\n最多几天\t106739\n大唐集团公司\t106740\n初恋女友\t106741\n签到处\t106742\n踌躇满志\t106743\n1974\t106744\n九龙大道\t106745\n滁河\t106746\n蜡笔画\t106747\n润\t106748\n稳定土拌和站\t106749\n大白菜帮助中心\t106750\n荡铺\t106751\n寻梦者\t106752\nirene\t106753\n第130期\t106754\n陈昱\t106755\n曲阜机场\t106756\n五泉山\t106757\n逐渐\t106758\n华为mate9\t106759\nNIVEA\t106760\n亚种\t106761\n饭冈\t106762\nPlayback\t106763\nsysctl\t106764\n魏无羡\t106765\n肖秀荣\t106766\n第二把\t106767\n章莹颖\t106768\n魏金栋\t106769\
n发现率\t106770\n人事代理\t106771\n第153集\t106772\n婚戒\t106773\n闲庭信步\t106774\niphoto\t106775\n全智\t106776\nchew\t106777\n电位分析法\t106778\npade\t106779\n金质\t106780\n名家汇\t106781\n挖财信用卡之窗\t106782\n火炬之光\t106783\n56mm\t106784\n活塞式压缩机\t106785\n蝶飞\t106786\n廖京生\t106787\n择业\t106788\n金娜\t106789\n乳片\t106790\ndemonstrated\t106791\n旅鼠\t106792\ntpc\t106793\n考不过\t106794\nroth\t106795\n第01章\t106796\n深圳市市\t106797\n第一段\t106798\n应召\t106799\n迈特威\t106800\n道路灯\t106801\n治病\t106802\n王斑\t106803\n新闻场\t106804\nM227fdw\t106805\n水性聚氨酯\t106806\n核壳\t106807\nbishi\t106808\n皇太极\t106809\n电视架\t106810\n良种\t106811\n奥特曼格斗进化\t106812\n猛追\t106813\nsupport-v4.jar\t106814\n突然\t106815\nWTO姐妹会\t106816\n和合术\t106817\n停不下来\t106818\n轮换\t106819\n得点\t106820\n北京地铁时刻表\t106821\n苏教版八年级\t106822\nStepping\t106823\n改错\t106824\n激光雕刻机\t106825\n西安发展\t106826\n工程法规\t106827\n表情集\t106828\n杜松子酒\t106829\n舍曲林\t106830\n贵金\t106831\n城市功能分区\t106832\n腾讯网\t106833\n血液\t106834\n国务院扶贫开发领导小组\t106835\n掉包\t106836\n遵照\t106837\n舒适\t106838\n梁小龙\t106839\n永不为\t106840\n鳌拜\t106841\n服装画\t106842\n快乐大本营2018全集\t106843\n第156集\t106844\n米粒儿\t106845\n翻腾\t106846\nOray花生壳动态域名\t106847\n北戴河火车站\t106848\nduplicate\t106849\n酥麻\t106850\n海亮教育\t106851\n2018年4月19日\t106852\n商報\t106853\nBots\t106854\n永恒狂刀\t106855\n按照\t106856\n秦都区\t106857\ngoodnote\t106858\n米西\t106859\n爱秀美女\t106860\n58_\t106861\n霸穹\t106862\n15天内\t106863\nSQLPlus\t106864\n免拉手\t106865\nPackard\t106866\nUGNX\t106867\n香樟路\t106868\n半个多月\t106869\n金异层塔\t106870\n骚母\t106871\n砥柱\t106872\n红人装\t106873\n诺尔妮\t106874\n苗乡\t106875\nSAFARI\t106876\n自杀式\t106877\n蓝色多瑙河\t106878\n董志勇\t106879\n熊心\t106880\n#11\t106881\n战神3:重置版\t106882\nhorton\t106883\n九成\t106884\n奔腾B50\t106885\n江西省农村信用社\t106886\n网上售票系统V2.0\t106887\n相结合\t106888\n教区\t106889\n阿汤哥\t106890\n本·阿弗莱克\t106891\nyoudao\t106892\n武汉城投集团\t106893\n椰子鞋\t106894\n箱床\t106895\n400w\t106896\n榆中县\t106897\nbeatiful\t106898\n小熊布\t106899\n中国心理学会\t106900\nargmin\t106901\nHokkaido\t106902\n万科智谷\t106903\n快干胶\t106904\n和风\t106905\n刘明明\t106906\nAG亚游\t106907\ngiulia\t106908\n一块钱\t106909\n天妒\t106910\ne-lear
ning\t106911\n六气\t106912\n罗马2全面战争\t106913\n禁言\t106914\n南瓜花\t106915\n内蒙古东部\t106916\n风评\t106917\n四川省成都七中\t106918\n鉴赏\t106919\nstd=c++11\t106920\n白米饭\t106921\n37大天使之剑\t106922\nodd\t106923\n专发\t106924\n冻僵\t106925\n_威易网\t106926\n淮北矿业集团\t106927\nflirt\t106928\n钢琴窗\t106929\n西九华山\t106930\nMLE\t106931\n其二\t106932\n池忠国\t106933\n8个半月\t106934\n一安\t106935\n荧\t106936\n2010年\t106937\n张智勇\t106938\n智能机\t106939\n疯涨\t106940\n84个\t106941\n双拼\t106942\n通风盘\t106943\n36.6\t106944\n一个半\t106945\n喷灌机\t106946\n葱烧烙饼\t106947\nXmx\t106948\n几分\t106949\n傅渝\t106950\n500克\t106951\n奥尔波特\t106952\n港铁天颂\t106953\n战平\t106954\n少年侦探团\t106955\n量价\t106956\neu5\t106957\n咣当\t106958\n植物蛋白饮料\t106959\n语法化\t106960\nfx1n\t106961\ncamper\t106962\n黄赌毒\t106963\n家政\t106964\n恒大足校\t106965\n南京市食品药品监督管理局\t106966\n步步惊心丽\t106967\n蓝岛大厦\t106968\n风光旖旎\t106969\n十一项\t106970\ninterest\t106971\n引申\t106972\n江苏省安全生产监督管理局\t106973\n重法\t106974\n科幻末世小说-17k小说网\t106975\n772\t106976\n通案\t106977\n4399植物大战僵尸2\t106978\n邓公\t106979\n中国特种设备检验协会\t106980\n张文深\t106981\n瑞捷\t106982\n透水地坪\t106983\nDJI\t106984\n中电鑫龙\t106985\npolished\t106986\nthening\t106987\n法制日报\t106988\n给排\t106989\nwsk\t106990\n出尔反尔\t106991\n酒都\t106992\nAA\t106993\n平行性\t106994\n系统集成项目管理工程师\t106995\n漯河医学高等专科学校\t106996\n柔软性\t106997\n人民卫生出版社\t106998\n戦\t106999\n紫石\t107000\nairpin\t107001\n不可估量\t107002\n本利丰\t107003\n86_\t107004\n一万一年\t107005\n第33期\t107006\n峦头\t107007\n博林天瑞\t107008\n福田法院\t107009\n江苏教育报刊总社\t107010\n望洞庭\t107011\npprint\t107012\nvirtuale\t107013\nbeard\t107014\n中国建筑设计院\t107015\n博阿基耶\t107016\n江西省审计厅\t107017\n普兰县\t107018\n钢化\t107019\n驶过\t107020\nftb\t107021\nbottom\t107022\n舆论监督\t107023\n小金刚\t107024\n中铁集团\t107025\n电抗\t107026\n虚函数\t107027\n张北\t107028\n_海外购物网\t107029\n上海人社\t107030\n爸爸的爱\t107031\n卓嘎\t107032\ndeceit\t107033\n成本价\t107034\n矫正带\t107035\nLexington\t107036\n诺基亚920\t107037\nHibernate5\t107038\n罪魁祸\t107039\nHash函数\t107040\nDELMIA\t107041\n山西省高级人民法院\t107042\n0774\t107043\n4g模块\t107044\n高艺\t107045\n考上\t107046\nmind\t107047\n杭嘉湖\t107048\ndz\t107049\n耗儿\t107050\n强奸男\t107051\n阴
痉\t107052\netsy\t107053\n长虹桥\t107054\nRails\t107055\n花东\t107056\n2017年3月30日\t107057\n北京立水桥\t107058\ninternet\t107059\n青岛国际院士港\t107060\n衡平\t107061\n金希澈\t107062\n千层石\t107063\n武夷新区\t107064\n椰子饭\t107065\n篱苑\t107066\n2014年10月\t107067\n武汉东湖高新技术开发区\t107068\n海龙大厦\t107069\n大夫\t107070\n大宗商品交易\t107071\n维视\t107072\n英乙\t107073\n公务员论坛\t107074\n共侍\t107075\n靠谱儿\t107076\n上海交通大学出版社\t107077\n天隼\t107078\n李永华\t107079\n你是我的眼\t107080\n十一科技\t107081\n假设\t107082\n伊夫堡\t107083\n闰\t107084\nmaggie\t107085\nRoto\t107086\n20则\t107087\n明旅\t107088\n省人\t107089\n511\t107090\nXIAN\t107091\n时代特征\t107092\n放气\t107093\n智业\t107094\n定房\t107095\n英语转换器\t107096\n路桩\t107097\n作息\t107098\n妥布霉素地塞米松滴眼液\t107099\n6789\t107100\n青岛市人力资源和社会保障局\t107101\nitu\t107102\n热性水果\t107103\n银月\t107104\n沙尘暴\t107105\noffsetTop\t107106\n感\t107107\n居下\t107108\ncyclo\t107109\n信立\t107110\n修\t107111\nAvro\t107112\n保修卡\t107113\n学委\t107114\n集体照\t107115\n姜鹏\t107116\n仙葫经济开发区\t107117\n考拉征信\t107118\n成毅\t107119\n2i2\t107120\n智悦人生年金保险\t107121\n观光车\t107122\n多力\t107123\n美鲜\t107124\n王牌对王牌2\t107125\n几星级\t107126\n捕鱼\t107127\n云南机场\t107128\n汉方本草纤美茶\t107129\n九道湾\t107130\nlcg\t107131\n仙石\t107132\n总群\t107133\n乐1s\t107134\n1500张\t107135\n和时间赛跑\t107136\nPC模拟器\t107137\n胶衣\t107138\n三江码头\t107139\n脑勺\t107140\n半入耳式\t107141\nc函数\t107142\n西安经济技术开发区\t107143\n碧莲盛\t107144\nOLED网\t107145\n0xef\t107146\n953集\t107147\n关停并转\t107148\n潮男\t107149\nshangshi\t107150\n250平米\t107151\nyha\t107152\n丰联广场\t107153\n于龙\t107154\n蝉衣\t107155\nSelection\t107156\n绝大部分\t107157\n聚义堂\t107158\n克蕾雅\t107159\n教科书级\t107160\n气郁\t107161\n拖车费\t107162\n萧然\t107163\n8_\t107164\n大跃进\t107165\n好了\t107166\nwulin\t107167\n啶虫脒\t107168\nM\t107169\n平安金管家\t107170\n影霸\t107171\n空域\t107172\n华新\t107173\nmtd\t107174\n佐佐木明希\t107175\n目不转睛\t107176\nadd+\t107177\n6级\t107178\n欢天喜地\t107179\n章名\t107180\n考行\t107181\nwww.189.cn\t107182\n遗腹子\t107183\n生物圈\t107184\nppapi\t107185\n培尔金特\t107186\ntsm\t107187\n彷徨\t107188\n72部\t107189\n中国瑞林工程技术有限公司\t107190\n腐蚀品\t107191\n网络监视器\t107192\n斗子\t107193\n节数\t107194\n刘安民\t107195\n户外鞋\t107196\n
深田梨菜\t107197\n大连实德\t107198\n跨库\t107199\n石开\t107200\n教工\t107201\nfeild\t107202\n一盘散沙\t107203\n洪泽论坛\t107204\n布基胶带\t107205\n斯科尔斯\t107206\n潜航\t107207\n20名\t107208\n洪崎\t107209\n_鸡病专业网\t107210\noverfitting\t107211\n马毅\t107212\n龙宫\t107213\nshore\t107214\n萎缩\t107215\n顺德市\t107216\n函复\t107217\nSasaki\t107218\nForever21\t107219\nincorporate\t107220\n月光男孩\t107221\n世茂滨江新城\t107222\nEditBox\t107223\n醋酸\t107224\n大明宫遗址\t107225\n256m\t107226\n扬州育才实验学校\t107227\n栅格计算器\t107228\n玉城街道\t107229\n第14条\t107230\n28件\t107231\n埃米\t107232\n舒馨\t107233\n导词\t107234\n黑版\t107235\n4G网\t107236\nnatapp\t107237\n水渍\t107238\n5219\t107239\n长安之星2\t107240\n阮明\t107241\n2.0安卓\t107242\n韦世豪\t107243\n逗游游戏盒\t107244\nqualifications\t107245\n乌布\t107246\n定量灌装机\t107247\n绘画板\t107248\nFIX\t107249\n1.7版\t107250\n二色谱\t107251\nKellogg\t107252\n道理\t107253\nbiji\t107254\npunk\t107255\n宫鲁鸣\t107256\nFLAC3D\t107257\n毛体\t107258\n杭州招聘网\t107259\n余烬\t107260\n王东瑶\t107261\n蓝井艾露\t107262\nBackground\t107263\n芙蓉镇\t107264\nal2o3\t107265\n洛江\t107266\n长丰县\t107267\n李峥\t107268\n陈安妮\t107269\n巨峰\t107270\n瑶柱\t107271\n6000个\t107272\n踽踽\t107273\nsaber\t107274\n手藓\t107275\n惊婚\t107276\n靖州苗族侗族自治县\t107277\n11月18日\t107278\n南召\t107279\n软硬兼施\t107280\n青岛市发展和改革委员会\t107281\n潇湘书院\t107282\n贺函\t107283\n马拉特\t107284\ngain\t107285\n都会路\t107286\n齐鲁交通发展集团有限公司\t107287\n蜀九香火锅\t107288\nWebStorm\t107289\n张丹枫\t107290\n平凉\t107291\n虾子镇\t107292\n交大附中\t107293\n纪晓波\t107294\nredeon\t107295\n千位分隔符\t107296\nBBI\t107297\n氧管\t107298\n网易UU\t107299\n滚珠丝杠\t107300\n4度\t107301\n2016年末\t107302\n龙岛\t107303\n74集\t107304\n大宝贝\t107305\n閪\t107306\n离型纸\t107307\ncomfortable\t107308\n800MB\t107309\n丹参片\t107310\n宾堡\t107311\n聚丙烯管\t107312\nEXCEL单元格\t107313\nsnsapi\t107314\n北湖路\t107315\ndoT\t107316\n尔必达\t107317\n钢针\t107318\n环号\t107319\npansidon\t107320\n黑达摩\t107321\n转染\t107322\nIni\t107323\n1e\t107324\n定位赛\t107325\n安德\t107326\nSecureFX\t107327\n天津南\t107328\n混世魔王\t107329\n自盗\t107330\n胖东\t107331\nzaizai\t107332\nbritain\t107333\n白居易\t107334\n张奶奶\t107335\ncougar\t107336\nPaaS\t107337\n1699\t107338\n锐
角币\t107339\n徽州古城\t107340\n安亭北\t107341\n金刚王\t107342\nlockdir\t107343\n买房网\t107344\n体位\t107345\n万丰\t107346\n2分钟\t107347\n阿苏卫\t107348\n下腹痛\t107349\nAuthentication\t107350\n川宁\t107351\n划线\t107352\n铁剑\t107353\n上海北外滩\t107354\n认房又认贷\t107355\n胁迫\t107356\n东道\t107357\n液晶显示屏\t107358\n承受不起\t107359\n同等学力经济学\t107360\ngongying\t107361\n无缝\t107362\n爱的人\t107363\nNDC\t107364\n微凹黄檀\t107365\n东河区\t107366\n南京师范大学中北学院\t107367\n?具\t107368\n健\t107369\n巨金怪\t107370\n即事\t107371\n重分类\t107372\n0.4KV\t107373\nTommy\t107374\n.net域名\t107375\n上东区\t107376\n半幅机\t107377\n偷欢\t107378\n研祥\t107379\n史带\t107380\n挂装\t107381\nEdison\t107382\n朱元璋\t107383\n群管\t107384\n芯体\t107385\nbt之家\t107386\n罩\t107387\nrestaurants\t107388\n李政道\t107389\n追诉\t107390\n中信建设\t107391\n蛤蟆功\t107392\n宝骏310W\t107393\n洞朗地区\t107394\n大众集团\t107395\n狩魔手记\t107396\n初美沙希\t107397\n中国重汽\t107398\n奶罩\t107399\n八里桥\t107400\n生存在\t107401\n派遣期\t107402\n极品家丁\t107403\n支付网\t107404\nVault\t107405\n鸣叫\t107406\n洛带\t107407\n天地伟业\t107408\nexcal\t107409\n梅西大学\t107410\n专升本考试网\t107411\n挂历\t107412\n琦琦\t107413\n成寿寺\t107414\nmteam\t107415\nwsdl\t107416\n止\t107417\n知客\t107418\n二甲基甲酰胺\t107419\n輪姦\t107420\n乙醇胺\t107421\n总书记\t107422\n陈卫星\t107423\n西游释厄传群魔乱舞\t107424\n福州酒店\t107425\n叮铛\t107426\n傅克\t107427\n平衡式热水器\t107428\nUsername\t107429\n德思勤\t107430\n途家网\t107431\n恢复盘\t107432\nyyyymmdd\t107433\n长飞光纤光缆\t107434\n创业投资公司\t107435\n球迷们\t107436\n539\t107437\n金银花\t107438\n24孔\t107439\n34章\t107440\n云南师范大学商学院\t107441\n华润置地有限公司\t107442\n生化危机:启示录2\t107443\n尹国驹\t107444\n高压水\t107445\nInception\t107446\n初锦\t107447\n禹作敏\t107448\n山河集团\t107449\n东升路\t107450\nTether\t107451\n懒鱼\t107452\n银华\t107453\n发薪\t107454\n扁桃体炎\t107455\n简写\t107456\n新城疫\t107457\nbuff\t107458\n中国平安公司\t107459\n球童\t107460\n2017年6月10日\t107461\n20170527\t107462\n2月初\t107463\n吉利汽车\t107464\nrecode\t107465\n开发群\t107466\nzydzyd\t107467\n霸体\t107468\n凝冻\t107469\n豪绅\t107470\n4.7亿\t107471\n活动范围\t107472\nCREATE\t107473\n掌门路\t107474\n企业职工患病或非因工负伤医疗期规定\t107475\n烟草花叶病毒\t107476\nbpm\t107477\n9DM\t107478\n腰椎退行性变\t107479\n外开\t107480\n布满\t107481\
n上海新东方\t107482\n逃生舱\t107483\n锟\t107484\n大护法\t107485\n管口\t107486\n栅格\t107487\nfamilies\t107488\n绥化市人民政府\t107489\n吉他世界网\t107490\n中信环境\t107491\n列的值\t107492\n10步\t107493\nCSE\t107494\n气力\t107495\nPowerPoint2010\t107496\nLPC1768\t107497\n特拉法尔加·罗\t107498\n889\t107499\n周军\t107500\n中华人民共和国最高人民检察院\t107501\n江郎\t107502\n5.3.3\t107503\n第112期\t107504\nDaemon\t107505\n白浆\t107506\n福利版\t107507\n小提琴协奏曲\t107508\n二十四种\t107509\n讽刺\t107510\n甜梦文库\t107511\n佳能5D\t107512\nGPX\t107513\n反渗透\t107514\n量量\t107515\n捏碎\t107516\ndrinking\t107517\nTWI\t107518\n基诺族\t107519\n画\t107520\n学民\t107521\n偷日\t107522\n组播\t107523\n传空\t107524\n小枪\t107525\n卢舍\t107526\nScraper\t107527\n恩施州中心医院\t107528\n靠泊\t107529\n男头\t107530\ne闪贷\t107531\n3天游\t107532\n大正\t107533\n6席\t107534\n迷死\t107535\n组织性\t107536\n荔枝树\t107537\n张晓晨\t107538\n三星服务中心\t107539\n叛军\t107540\n嗯嗯\t107541\n化学会\t107542\n粉瘤\t107543\n宇飞\t107544\n锐志\t107545\n2021年\t107546\n敏天宫\t107547\n一家亲\t107548\n变径\t107549\n佘山镇\t107550\n抛压\t107551\n毛草\t107552\n我的一家人\t107553\n第几部\t107554\n美承\t107555\n8421BCD码\t107556\n藏狐\t107557\n洋槐花\t107558\n街道社区卫生服务中心\t107559\n卵细胞\t107560\n下房\t107561\n全国人民代表大会专门委员会\t107562\n普柜\t107563\n奇瑞_\t107564\n程间\t107565\n微店\t107566\n恶魔人crybaby\t107567\n蛋糕机\t107568\n16寸\t107569\n倾倒\t107570\n中国出版网\t107571\n文泽路\t107572\nphenyl\t107573\ntr200\t107574\n北京工业大学出版社\t107575\nNBA2K13\t107576\n德惠翁主\t107577\n泽元\t107578\n愿结\t107579\n软蛋\t107580\n白雪\t107581\n郭鹤鸣\t107582\n专升硕\t107583\n三7\t107584\n彩虹糖\t107585\nES2015\t107586\n顶驱\t107587\nMERIDA\t107588\n122毫米\t107589\n收入\t107590\n文博收藏艺术专业\t107591\n门下\t107592\n零距\t107593\ntps\t107594\n2018年4月10\t107595\nApplied\t107596\n七位数\t107597\n渐变色\t107598\n创尔特\t107599\n打火灶\t107600\n中段\t107601\n上海市职业能力考试院\t107602\n中央司法警官学院\t107603\n榴莲干\t107604\n孔捷\t107605\n特种设备作业人员监督管理办法\t107606\n3-5天\t107607\n凯乐科技\t107608\n医药谷\t107609\n藏文版\t107610\nv1.3_\t107611\n17173英雄联盟\t107612\ncatalog\t107613\n静若清池\t107614\nip67\t107615\n金华房网\t107616\n气化\t107617\n结编\t107618\n自新大陆\t107619\n番禺日报\t107620\n曲高终和寡\t107621\n腐竹机\t107622\n通道侗族自治县\t107623\n蜘蛛痣\t10762
4\nVisitors\t107625\n计息\t107626\n奴隶们\t107627\n化学方程\t107628\n卢文伯爵\t107629\n萧岗\t107630\nv8.7\t107631\n孙行者\t107632\n美味人妻\t107633\n卡迪拉克\t107634\n广州市花都区政府\t107635\n台湾人\t107636\n本源\t107637\n定中\t107638\n适应性\t107639\n新疆天山\t107640\n民本\t107641\n日本大学院\t107642\n幼儿简笔画\t107643\n云管\t107644\n這麼\t107645\n刘玉兰\t107646\n康平路\t107647\nSIMC\t107648\n常州市\t107649\n莱茵东郡\t107650\n灰机\t107651\n阳泉一中\t107652\n日本游戏公司\t107653\n黑金属\t107654\n移花接木\t107655\n徐璐\t107656\nDita\t107657\nImproved\t107658\n国家公派留学\t107659\n法政\t107660\n乃是\t107661\n4根\t107662\n李赣\t107663\n馈线柜\t107664\npartnership\t107665\n红装\t107666\n天工开物\t107667\n武极天下\t107668\n2391\t107669\n丁海峰\t107670\n直隶\t107671\n1年\t107672\ndurability\t107673\n规委\t107674\n永汉\t107675\n肠化\t107676\nix6780\t107677\nCityEngine\t107678\n防热\t107679\n天水在线\t107680\n耶稣\t107681\n3075\t107682\n网球\t107683\nODA\t107684\n_骨科频道_健客网\t107685\n跑狗\t107686\nIT猫扑网\t107687\n88_\t107688\nICP/IP地址\t107689\n高低温试验箱\t107690\n中国光大集团股份公司\t107691\niconx2018\t107692\n舰队\t107693\n全减器\t107694\n大众新宝来\t107695\npassage2\t107696\nwpa\t107697\n28万\t107698\n有限自动机\t107699\n奔月\t107700\n信达滨江壹品\t107701\n一汽解放汽车有限公司\t107702\n豪放女大兵\t107703\necharts3\t107704\n食品许可证\t107705\nrapoo\t107706\n晓之以理\t107707\n于氏\t107708\n北京车展网\t107709\n定向越野\t107710\n平顶山西\t107711\n前8个月\t107712\ntak\t107713\n基本单元\t107714\n陈芳语\t107715\n济南市工商行政管理局\t107716\nMariaDB数据库\t107717\nddp\t107718\n中国气象学会\t107719\n被判刑\t107720\n管中窥豹\t107721\n净收入\t107722\n风湿热\t107723\ncgs\t107724\n鲍林\t107725\n金台铁路\t107726\n478号\t107727\n牛奶歌\t107728\n华润微电子\t107729\n水ト\t107730\n胡志明\t107731\n姓名算命\t107732\n陈实\t107733\n谭剑波\t107734\n妖孽\t107735\n金蝶友商网\t107736\n潍坊医学院附属医院\t107737\nBoracay\t107738\n恩斯特·克莱恩\t107739\n中共北京市委\t107740\n宋英杰\t107741\n荷花淀\t107742\n⑵\t107743\n金智科技\t107744\nWnt\t107745\n尚涛\t107746\n大妆\t107747\n客様\t107748\nu15\t107749\n鹿特丹港\t107750\nhic\t107751\n刘帅\t107752\n辽宁省纪委\t107753\n被罢免\t107754\n任氏\t107755\nhee\t107756\n贞丰\t107757\n征人\t107758\n号表\t107759\n130吨\t107760\n库单\t107761\n公序良俗原则\t107762\n明信片\t107763\n青岛市司法局\t107764\n弘业股份\t107765\n2^4\t107766\n重灾区\t1
07767\n利尿药\t107768\n声发射\t107769\n氢弹\t107770\nnfc卡\t107771\n网易云音乐人\t107772\n插足\t107773\n4056\t107774\n真田丸\t107775\n李海鹰\t107776\n第246集\t107777\n墙贴画\t107778\n网页采集器\t107779\n牙酸\t107780\nvotes\t107781\n作弊\t107782\n偏函数\t107783\n5ds\t107784\n利妥昔单抗\t107785\n托马斯·杰斐逊\t107786\n该县\t107787\nZD\t107788\n九游经济网\t107789\n不删\t107790\n铁路线\t107791\n艾比\t107792\n格致中学\t107793\n金翅雀\t107794\n勃姆石\t107795\n数字蛋糕\t107796\n多元回归\t107797\n35寸\t107798\n京酱肉丝\t107799\n华东路\t107800\n跑马灯\t107801\n华神\t107802\n开封\t107803\nmodeling\t107804\n70000\t107805\n朝阳园\t107806\nSummerChill\t107807\n董监高\t107808\n4X4\t107809\n史耀斌\t107810\n英国诺丁汉大学\t107811\nV9\t107812\nC84\t107813\n豫园\t107814\n马健涛\t107815\n乐理\t107816\n闫闯\t107817\nEnzo\t107818\n阿图\t107819\n缺席\t107820\n格林德沃\t107821\n精品文摘网\t107822\n昂克赛拉\t107823\n漆膜\t107824\n闲花\t107825\n引灵幡\t107826\n648号\t107827\nWind\t107828\n地板革\t107829\n云南省旅游发展委员会\t107830\n空袭\t107831\nchak\t107832\n众所周知\t107833\n四类\t107834\n路易十五\t107835\nIndirect\t107836\n波粒\t107837\nvivox5m\t107838\n吃钱\t107839\n爱发科\t107840\n肠梗阻\t107841\n新元\t107842\n糖尿病日\t107843\n全兴\t107844\n长沙世界之窗\t107845\nCHAR\t107846\n通透\t107847\n韩剧-零蛋韩剧网\t107848\n入迷\t107849\n港仔\t107850\n鑫谷\t107851\n碰一碰\t107852\n清风\t107853\n2018-03-14\t107854\nshaders\t107855\n布莱切利\t107856\n六进\t107857\n氯沙坦钾氢氯噻嗪片\t107858\nsuparc\t107859\n都敏俊\t107860\n细石混凝土泵\t107861\nProE5.0\t107862\n雪mm\t107863\n新宿\t107864\nallie\t107865\n探灵网\t107866\n2017年02月\t107867\nDummy\t107868\nXML文件\t107869\n李昱\t107870\n情绪低落\t107871\ncqs\t107872\n小昊\t107873\n几分之一\t107874\nORA-12560\t107875\n干区\t107876\n隔行\t107877\nC1照\t107878\n入睡\t107879\n中国电信欢go网上海电信\t107880\nTelegraph\t107881\n红黄牌\t107882\nxserver\t107883\n4.9.4\t107884\n深川铃\t107885\n西南大学附属中学\t107886\n小者\t107887\n宋城演艺\t107888\n开水炉\t107889\n毒理学\t107890\n郝斌\t107891\n面破\t107892\n行书\t107893\n0_\t107894\n苏宁易购集团股份有限公司\t107895\n知音号\t107896\nsevere\t107897\n遮挡版\t107898\n快乐阅读\t107899\n兵棋\t107900\nFX168财经网\t107901\nNEXT\t107902\n伊恩\t107903\n资源税\t107904\n红米5\t107905\n韦斯·安德森\t107906\n南通机场\t107907\n余乐\t107908\n泰山7号\t107909\n油污\t107910\n盒子比
价网\t107911\n约伯记\t107912\n伏魔英雄传\t107913\n餐\t107914\n长水航城\t107915\nwindowsxp\t107916\n咯咯\t107917\n第五波\t107918\n百济健康商城_康德乐大药房\t107919\n用友T3\t107920\n翻修\t107921\n益鸟\t107922\nvericut\t107923\nemulsion\t107924\n1.7万元\t107925\n一旬\t107926\nSoftEther\t107927\n仙林街道\t107928\nDiablo3\t107929\nshodan\t107930\n博创\t107931\n火枪手\t107932\nPenguin\t107933\nwin7系\t107934\n荥经县\t107935\n变脸\t107936\n灵异类\t107937\nmagent\t107938\n分会\t107939\n十五部\t107940\n文化村\t107941\n调档案\t107942\n阿加西\t107943\n醉乡\t107944\n数千家\t107945\n女仔\t107946\n快消品\t107947\n六芒星\t107948\n知悉\t107949\n屋里\t107950\n骨通贴膏\t107951\n高压线\t107952\n大良东区\t107953\n2017年10月12日\t107954\nCAV\t107955\n七马网\t107956\n迪拜塔\t107957\nPets\t107958\n再续\t107959\n三元股份\t107960\n健步如飞\t107961\n溴酸盐\t107962\n平板M3\t107963\n昌平教育\t107964\n福永镇\t107965\n充电量\t107966\n落难\t107967\nps批量\t107968\n薪水\t107969\nsemiconductor\t107970\n意见箱\t107971\n至道\t107972\n娇人\t107973\n沧口\t107974\n杰\t107975\n横纵\t107976\n十月\t107977\n齐齐乐战舰少女R\t107978\nless\t107979\n防烟\t107980\n金纬路\t107981\n为公\t107982\nVocaloid\t107983\n难为情\t107984\n方正集团\t107985\n4.0.6\t107986\n瑜伽师地论\t107987\n五六\t107988\n修理业\t107989\nSheffield\t107990\nfilmaker\t107991\n三元前驱体\t107992\n扬基\t107993\n韩国瑜\t107994\n感知机\t107995\n格林巴利\t107996\nlecture\t107997\n云南师范大学\t107998\n钉耙\t107999\nThinkPHP5数据库\t108000\n服务所\t108001\n微术\t108002\n东北师范大学研究生院\t108003\nutf-16\t108004\n薤\t108005\naem\t108006\n正当防卫2\t108007\n原浆啤酒\t108008\n滚子链\t108009\n栓口\t108010\n传影\t108011\n朱安\t108012\n孵化机\t108013\n脚指\t108014\n低劣\t108015\n贸易招聘网\t108016\nlist值\t108017\n圆筒型\t108018\n公证\t108019\n的话\t108020\n3重\t108021\n39p\t108022\n擀面\t108023\n百人计划\t108024\nmethods\t108025\n痛仰乐队\t108026\n仪陇县\t108027\n短杆\t108028\n贵阳医学院附属医院\t108029\n元胞数组\t108030\n国投集团\t108031\n直径表\t108032\n白头盔\t108033\n中国农大\t108034\n除斥\t108035\n加拿大西安大略大学\t108036\nMACH3\t108037\n操作感\t108038\n西安未央\t108039\nMJC\t108040\n永流传\t108041\n贺卡\t108042\n强抢\t108043\n睡眠\t108044\n鸡封号\t108045\n宁淮\t108046\n当家做\t108047\nReveal\t108048\n查理曼大帝\t108049\n块\t108050\nmotu\t108051\n博宝艺术网\t108052\n新希望地产\t108053\n卡盘\t108054\n白
魔\t108055\njip\t108056\nNUMA\t108057\n研创\t108058\n防效\t108059\nps笔刷\t108060\n木版\t108061\n汇添富\t108062\n商务师\t108063\n南林新闻网\t108064\n电钢\t108065\n郑州大学研究生院\t108066\n海岛大亨3\t108067\n派普\t108068\n警句\t108069\n鼓浪屿\t108070\n欣宇\t108071\n感受态细胞\t108072\n2018年4月22\t108073\n装卸工\t108074\n珠三角人才网\t108075\n中山路街道\t108076\naba\t108077\n箭牌橱柜\t108078\n沙漠皇帝\t108079\n音乐学\t108080\n美国杜克大学\t108081\n新都孔雀城\t108082\n栀子金花丸\t108083\n冬青树\t108084\n中金海棠湾\t108085\n工序\t108086\n搜谱网\t108087\n飞帆\t108088\n验毒\t108089\nChinaTT\t108090\n威斯康星\t108091\n斯琴格日乐\t108092\n360双核浏览器\t108093\nDEVELOPER\t108094\n田中角荣\t108095\n地名\t108096\n上海译文出版社\t108097\n中脑\t108098\n把位\t108099\n地铁门\t108100\nK值\t108101\n影音先锋AV网\t108102\n适马镜头\t108103\n潮品\t108104\n图拉丁吧\t108105\n天突穴\t108106\n海航\t108107\n无油轴承\t108108\n男妓\t108109\n财年\t108110\n四美\t108111\n虚祖\t108112\nIPCamera\t108113\n赔偿\t108114\nLips\t108115\n万全\t108116\n世博会\t108117\n万兴科技\t108118\n营口东\t108119\n保保\t108120\n本着\t108121\n乐骋\t108122\n东盟自由贸易区\t108123\n天之禁2\t108124\n蒋某\t108125\nlicai\t108126\nvoi\t108127\n兵法三国\t108128\n生气\t108129\n城市之光\t108130\n本友会\t108131\n冷水滩政府\t108132\n水弯\t108133\n绳头\t108134\n20多岁\t108135\n在何处\t108136\n四川农信\t108137\n太可惜\t108138\n香径\t108139\n|WM10\t108140\n太赫兹\t108141\n一次方\t108142\n光亮管\t108143\nt21\t108144\n园外\t108145\nGbps\t108146\n胡笳\t108147\n宽窄巷子\t108148\n中国建筑西南设计研究院有限公司\t108149\n陈靖\t108150\nStudents\t108151\nstaples\t108152\n上海公交查询网\t108153\n红麻\t108154\nCHT\t108155\nCM13\t108156\n人身保护令\t108157\n2200元\t108158\nmaya2018\t108159\n极地恶灵第一季\t108160\n神探飞机头\t108161\nfs5\t108162\n猪脑\t108163\n英勇\t108164\n安琴\t108165\n申公豹\t108166\n相机\t108167\n淘车网\t108168\n缝衣\t108169\nx0\t108170\nDJ晨洋\t108171\n国务院国资委\t108172\n西夏王陵\t108173\n专四\t108174\n都市版\t108175\n北野武\t108176\n签发人\t108177\nexchang\t108178\n创世神\t108179\n妃夕妍雪\t108180\nMultiplayer\t108181\n走资\t108182\nLCD屏\t108183\n会声会影x9\t108184\nv5.4.0\t108185\n站帮网\t108186\nexponential\t108187\n福利网\t108188\n果树\t108189\neps格式\t108190\n07式\t108191\n一个两\t108192\n_网总管\t108193\n吴宽\t108194\n卫生条件\t108195\n正考\t108196\n严屹宽\t108197\n传奇级\t108198\nash\t108
199\nXA\t108200\n华中师范大学研究生院\t108201\nJ.Fla\t108202\n滤光\t108203\n东阳光药\t108204\n刘双\t108205\nK7\t108206\n名币\t108207\n成都汽车网\t108208\n李富春\t108209\n200美元\t108210\n彭剑锋\t108211\n1902\t108212\n环保砖\t108213\n圣士\t108214\n本雅明\t108215\n云联\t108216\n4127\t108217\n宁波日报社\t108218\n南昌路\t108219\n防盗报警系统\t108220\n191号\t108221\n百余名\t108222\nstopping\t108223\n泡泡\t108224\nvantage\t108225\n芬奇\t108226\nLandsat\t108227\n吴城\t108228\n安络\t108229\n球台\t108230\n多心\t108231\n辣照\t108232\n行时\t108233\n文网文\t108234\n平和\t108235\n农业银行\t108236\nhnr\t108237\n希波战争\t108238\n开心消消乐\t108239\n血色苍穹\t108240\n个人贷款计算器\t108241\n5处\t108242\ncvt变速箱\t108243\n反扑\t108244\n降帧\t108245\n携程礼品卡\t108246\nalu\t108247\n科林蒂安\t108248\n谢童\t108249\nMakefile\t108250\n众议\t108251\nvpy\t108252\n岛内\t108253\n溴素\t108254\n马旭\t108255\n磁山\t108256\n男优\t108257\n用不着\t108258\n小米相机\t108259\nfou\t108260\n2341\t108261\n暌违\t108262\ngtss\t108263\n169元\t108264\n穆巴拉克\t108265\n烟台市人力资源和社会保障局\t108266\n面审\t108267\n20161012\t108268\n关平\t108269\n死亡率\t108270\n创绩\t108271\n大历\t108272\ntmc\t108273\n麦小兜\t108274\nminus\t108275\n外泄\t108276\nCentered\t108277\nphpstudy2016\t108278\n黄陂\t108279\n江西省社会保险管理中心\t108280\n林锋\t108281\nCX-5论坛_汽车之家论坛\t108282\n口袋妖怪漆黑的魅影金手指\t108283\nivh\t108284\ngraphql\t108285\n列支\t108286\n采摘机\t108287\n20\t108288\nrevise\t108289\nweex\t108290\n到端\t108291\n钻木取火\t108292\n伏林航空\t108293\n葱须\t108294\n公寓床\t108295\n绝色\t108296\n蛇舞\t108297\n空气锤\t108298\nnikita\t108299\n速通卡\t108300\n下部\t108301\n核果\t108302\n1000L\t108303\n乔冠华\t108304\nRosie\t108305\n伤力\t108306\n快播伦理片\t108307\n摆台\t108308\n马里奥赛车7\t108309\ntranslations\t108310\n缺血缺氧性脑病\t108311\n史蒂芬森\t108312\n中国检验检疫科学研究院\t108313\n中国保险监督管理委员会\t108314\n3|\t108315\n传到\t108316\nā\t108317\n先锋影音看片\t108318\n秦时明月之逍遥天下\t108319\n公益行\t108320\njojo奇妙冒险\t108321\n真形\t108322\nSSID\t108323\n为乐\t108324\nX战警:天启\t108325\ndoc-文档投稿赚钱网\t108326\n金属物\t108327\n遥望\t108328\n仁心\t108329\n行车\t108330\n癫痫病\t108331\nECB\t108332\n鱼眼镜头\t108333\n方条\t108334\n深圳市福田区人民法院\t108335\n通策\t108336\n8月17日\t108337\nendian\t108338\n气吞山河\t108339\n小可爱们\t108340\n辅具\t10
8341\n17例\t108342\n挥拳\t108343\n浦口高新区\t108344\n低压版\t108345\nokHttp\t108346\nParadigm\t108347\n抚州法院\t108348\n300104\t108349\nvivox6s\t108350\n妇科医生\t108351\n508网\t108352\n7万公里\t108353\n孙雷\t108354\n六月初\t108355\n冰皮月饼\t108356\n阳刻\t108357\nSlip\t108358\n九息\t108359\n薇儿\t108360\nschmidt\t108361\n终始\t108362\n证房\t108363\n金立群\t108364\n宜昌市夷陵中学\t108365\n上海局\t108366\najaxForm\t108367\n汇报\t108368\n申办\t108369\ncg\t108370\n同城\t108371\n碳肥\t108372\n湖北恩施州\t108373\n诺信\t108374\n話題\t108375\n杭埠\t108376\ncs3\t108377\n青岛公交集团\t108378\n阳高\t108379\npopwindow\t108380\n雪蛤\t108381\n免流卡\t108382\n沧州火车站\t108383\n缺页\t108384\n练师\t108385\n离顶\t108386\n方块字\t108387\n女红\t108388\nMAY\t108389\n作品类\t108390\n刊易出书\t108391\n大酱\t108392\n考克\t108393\n混杂\t108394\n秸秆压块机\t108395\n8570w\t108396\n脑血管病\t108397\n金相显微镜\t108398\n同洲电子\t108399\n几轴\t108400\n小金库\t108401\n吾\t108402\n保险诈骗罪\t108403\ndhc黄金霜\t108404\n霸道总裁爱上我\t108405\n母巢之战\t108406\n看破\t108407\n大都市圈\t108408\n凌叔华\t108409\nlies\t108410\nlibwebsockets\t108411\n夜尿多\t108412\n佛山市图书馆\t108413\nPTY\t108414\n东四\t108415\nspacecraft\t108416\n天魁\t108417\n早上四点\t108418\n矫直\t108419\n糖度\t108420\n330号\t108421\n诊\t108422\n刘思宇\t108423\n环己醇\t108424\n扫黄打非\t108425\n中顶\t108426\n真红\t108427\n奔牛\t108428\n黄河源\t108429\n史洲宇\t108430\nax\t108431\n凉皮肉夹馍\t108432\nCONTAX\t108433\nMERS\t108434\n广告主\t108435\n方控\t108436\n中山温泉\t108437\n君智\t108438\n氯化石蜡\t108439\n战魄\t108440\n法治乐山网\t108441\n阿卡波糖\t108442\netp\t108443\n常熟经济技术开发区\t108444\n魔兽战士吧\t108445\nsei\t108446\nddr3\t108447\n528li\t108448\ncpuinfo\t108449\n冷不\t108450\n害\t108451\n大宁金茂府\t108452\n小状元\t108453\n杨和镇\t108454\n刺子\t108455\n5.99英寸\t108456\n五香粉\t108457\n原阳\t108458\n四缸\t108459\n阔叶树\t108460\n卧铺车\t108461\n试穿\t108462\nT40\t108463\n同步电机\t108464\n朱继东\t108465\n21位\t108466\n小红叉\t108467\n河马生鲜\t108468\nOMD\t108469\n凌子\t108470\n裕景\t108471\n双条\t108472\nタイプ\t108473\n查勘\t108474\n上汽通用别克4S店\t108475\n螺栓\t108476\n土豆侠\t108477\n钢球\t108478\n桃花姑娘\t108479\n铜牛\t108480\ndeserve\t108481\nKuala\t108482\n反斗神鹰\t108483\noxygen\t108484\n苔痕\t108485\nAssetto\t108486\n江高\t108487\n罗加\t10
8488\n七十岁\t108489\n庐陵\t108490\n北京整形医院\t108491\n波音777\t108492\n本单\t108493\nOSC\t108494\n卡朋特\t108495\n起程\t108496\n碰杯\t108497\nflares\t108498\n十亩\t108499\nSERS\t108500\nPROTEUS\t108501\n盲人摸\t108502\n同流合污\t108503\nsessionid\t108504\n3d侠3d模型\t108505\n张老\t108506\n搜狗\t108507\n关紧\t108508\n某一处\t108509\n存贷款\t108510\n开启式\t108511\n益阳\t108512\n浙能电力\t108513\n人烟\t108514\n奇异值分解\t108515\n第12季\t108516\n盆景展\t108517\n海南椰岛\t108518\n派格\t108519\n高铁站_高铁网\t108520\n脸男\t108521\nCCTV-1_央视网\t108522\n陆树铭\t108523\nWilderness\t108524\nOntology\t108525\n广发财智金\t108526\n为你读\t108527\n京调\t108528\n金阳新区\t108529\n恨透\t108530\n石门一路\t108531\n迈克·泰森\t108532\n万盛经济技术开发区\t108533\n沃视频\t108534\n人工气道\t108535\n倒闸\t108536\n#热血街舞团\t108537\nRGB\t108538\n罗技G633\t108539\nrtb\t108540\n山大路\t108541\n衣锦\t108542\n望洞庭湖\t108543\n变幻莫测\t108544\nbldong\t108545\ns5500\t108546\n厂纪\t108547\nDTO\t108548\n灯笼果\t108549\n磷酸二氢铵\t108550\nLLD\t108551\n五一民政局\t108552\n90粒\t108553\nString类\t108554\nmook\t108555\n信阳农林学院\t108556\n4399玩具网\t108557\n供奉\t108558\n湖滨\t108559\nMOD_辐射4\t108560\n100份\t108561\n兰博基尼Urus\t108562\n本周六\t108563\n独奏吉他谱-虫虫吉他谱\t108564\n大桥路\t108565\n肺功能仪\t108566\n二乙二醇\t108567\n监舍\t108568\nWham\t108569\n螺杆式启闭机\t108570\n冒险岛龙\t108571\n蛋品\t108572\n五联苗\t108573\n砂金\t108574\nkdh\t108575\n多孔砖\t108576\n大观楼\t108577\n睢宁县人民政府\t108578\n延安西路\t108579\n磨房\t108580\n三亚在线\t108581\n国家军委\t108582\n环星\t108583\n埼玉\t108584\n洞悉\t108585\n下载版\t108586\n琶\t108587\nkitten\t108588\n米家有品\t108589\n四号\t108590\nstm32\t108591\n被动型\t108592\n六万多\t108593\n暗道\t108594\n真高达无双\t108595\nmac虚拟机\t108596\n段宏\t108597\n3月10号\t108598\n武公\t108599\n航行器\t108600\n快穿吧\t108601\nzmy\t108602\n女生小游戏_女生小游戏大全_女生小游戏全集\t108603\n因地制宜\t108604\n饥\t108605\n太平洋人寿\t108606\n大柴旦\t108607\n搭讪\t108608\n问卷调查/在线考试\t108609\nmird\t108610\n漯河西站\t108611\nComplaint\t108612\n谁知\t108613\nrlwrap\t108614\n德累斯顿工业大学\t108615\n橡胶件\t108616\n茵悦\t108617\nrespective\t108618\n同治\t108619\n数字逻辑电路\t108620\n动容\t108621\n遮阳板\t108622\n中国科学院半导体研究所\t108623\n山西煤矿\t108624\nspeex\t108625\n卫津路\t108626\n2017.11.29\t108627\n头孢拉定胶囊\t108628\n
圣教序\t108629\n商法学\t108630\n激情沙漠\t108631\n崇川_崇川政府网\t108632\n父兄\t108633\n衍射仪\t108634\n代购买\t108635\n1.4版\t108636\n闪婚\t108637\n安全系\t108638\n调降\t108639\n切嗣\t108640\n人财两空\t108641\n西陆网\t108642\n大竹县政府\t108643\n麦家琪\t108644\n衬胶\t108645\nFacade\t108646\n湖山\t108647\n烟友\t108648\nNHDTA\t108649\n裙式\t108650\n浦星公路\t108651\nipaddress\t108652\n画中仙\t108653\n牧原集团\t108654\nIK分词器\t108655\n伸缩棍\t108656\nCaro\t108657\n早出\t108658\n张大猛\t108659\n宝马s1000rr\t108660\n童装加盟网\t108661\nmL\t108662\n鼻音\t108663\n自做\t108664\n瞄准镜\t108665\n南枝\t108666\n肖茵\t108667\n凤阳县\t108668\n谜巢\t108669\n麦道\t108670\n益群\t108671\nPastel\t108672\n扑捉\t108673\n19800元\t108674\n怡萱\t108675\n吸热反应\t108676\n憨\t108677\n戏中人\t108678\n不骄\t108679\n明彩\t108680\n见证员\t108681\n宋濂\t108682\n2016年中秋\t108683\n儿孙\t108684\n阿里诚信通\t108685\n异常值\t108686\n抗力\t108687\n土砂龙\t108688\n万达\t108689\n2016下\t108690\nzhenti\t108691\n硬态\t108692\n自备电厂\t108693\nv3.4.5\t108694\n固定工\t108695\n酒馆\t108696\nKIP\t108697\n厂部\t108698\n正当\t108699\n二房\t108700\n烦燥\t108701\n金玉婷\t108702\nts250\t108703\nMcCartney\t108704\nBSP\t108705\nFractal\t108706\n革命军\t108707\nX战警:逆转未来\t108708\n微信电脑版2017\t108709\n马村区\t108710\n禹会区\t108711\nMFC-1919NW\t108712\n480P\t108713\n孔径\t108714\n打钻\t108715\n折梁\t108716\nHazel\t108717\n高攀路\t108718\nLSM\t108719\n陈哲远\t108720\n博物馆奇妙夜2\t108721\n桑梓\t108722\n预警\t108723\n太平街道\t108724\n负价\t108725\n纸牌游戏\t108726\n拆弹部队\t108727\n影牙\t108728\n缺血性脑病\t108729\n旅行网\t108730\n正则式\t108731\njsq20\t108732\n宜昌市中心人民医院\t108733\n马蒂斯\t108734\nWikimedia\t108735\n1722\t108736\n第十八个\t108737\nswust\t108738\n张勋\t108739\n金阁\t108740\n第33个\t108741\n残疾人\t108742\nregister\t108743\n许勤\t108744\n夜色邦\t108745\n宿迁市人力资源和社会保障局\t108746\n书面材料\t108747\n夏靖\t108748\nanyproxy\t108749\n延长期\t108750\n齿痕舌\t108751\n苏通大桥\t108752\n248\t108753\n大骑士物语\t108754\n双杠臂\t108755\n继续教育中心\t108756\n形成期\t108757\n仙界\t108758\n月亮湾\t108759\n刘忠河\t108760\n班公湖\t108761\nlumi\t108762\n利州区\t108763\n许魏洲\t108764\n涮锅\t108765\n訊息\t108766\n白区\t108767\n转型期\t108768\n吉光\t108769\nkie\t108770\n卡林巴琴\t108771\n戻\t108772\n材料员\t108773\n公募\t108774\n纳格\t1087
75\n猛将\t108776\n第21号\t108777\n泽平\t108778\nYY语音\t108779\n0.6元\t108780\nGifCam\t108781\n4tb\t108782\n所能\t108783\nIronStark\t108784\n师生恋\t108785\n竞房\t108786\nheroes5吧\t108787\nlistwidget\t108788\nNspire\t108789\nhour\t108790\n十三中\t108791\n广东理工学院\t108792\n嵌件\t108793\n墙脚\t108794\nunbantu\t108795\n46所\t108796\n伊瑞尔\t108797\n姓氏\t108798\ncainiao\t108799\n第一性\t108800\n鼻涕虫\t108801\n斗鱼tv\t108802\n船用油\t108803\n宝骏310w\t108804\n健走\t108805\nVC++2005\t108806\n收服\t108807\n071\t108808\n拙政园\t108809\n实际收益率\t108810\n餐饭\t108811\n无源晶振\t108812\n南华生物\t108813\nv4.0\t108814\n假面骑士剑\t108815\ng41\t108816\n防腐板\t108817\n华阳中学\t108818\n小麒麟\t108819\nvv5\t108820\nwindows7旗舰版\t108821\n新州\t108822\n传送石\t108823\n深柜\t108824\n丽声妙想\t108825\n信发\t108826\n巨资\t108827\nRid\t108828\n奇楼\t108829\n刮片\t108830\n生精片\t108831\n进去\t108832\n文武\t108833\n吉祥馄饨\t108834\n童叟无\t108835\n91集\t108836\n展示馆\t108837\n201403\t108838\n翠苑街道\t108839\nIND\t108840\n财务\t108841\n期末考试题\t108842\n背后的秘密\t108843\n析晶\t108844\n渔夫\t108845\nX9i\t108846\nEEUSS影院\t108847\nsolidwoks\t108848\nWorks3\t108849\n耳部\t108850\nbwin\t108851\n千千万万遍\t108852\n巨鱼\t108853\nti4\t108854\n第二套\t108855\n中国电子信息产业发展研究院_赛迪集团\t108856\n尿酮\t108857\nHandsome\t108858\n陈航\t108859\n二版\t108860\n包装品\t108861\n当乐网\t108862\nTV890\t108863\n3351\t108864\n中国武夷\t108865\n楚雄彝族自治州人民政府\t108866\nAmbit\t108867\n自然垄断\t108868\nND\t108869\n科利华中学\t108870\n迎送\t108871\n其子\t108872\n枣儿\t108873\n衣襟\t108874\ntoo\t108875\n5小时\t108876\ndestiny\t108877\n去月球\t108878\n门挡\t108879\n聚氨酯组合料\t108880\n和山\t108881\nPeterson\t108882\n微变传奇\t108883\n10余家\t108884\n别接\t108885\n1世纪\t108886\nProteins\t108887\n3998\t108888\n倒霉事\t108889\nBn\t108890\n罗技M590\t108891\nRADS\t108892\n张生荣\t108893\n网络电视盒子\t108894\nxy\t108895\n点点滴滴\t108896\n三维网\t108897\n鹅泉\t108898\n中国共产党第十九届中央纪律检查委员会\t108899\n磨床\t108900\n丁平\t108901\n戚继光\t108902\nLrc\t108903\n106集\t108904\n一帝\t108905\n山东省幼儿园\t108906\n小米5x\t108907\n那那\t108908\nTLB\t108909\n2017世界机器人大会\t108910\n反签\t108911\n句容热线网\t108912\n水陆\t108913\n平板硫化机\t108914\n北京市经济和信息化委员会\t108915\n云同步盘\t108916\n小易\t10891
7\n枪娘\t108918\n只婚\t108919\n苏步青\t108920\n六通阀\t108921\n鲁教\t108922\nscikit-learn\t108923\n专转本吧\t108924\nanello\t108925\n整容术\t108926\n无可奈何\t108927\n伪君子\t108928\n叶山瞳\t108929\nFIFO\t108930\n用板\t108931\n戴卡\t108932\n投射物\t108933\n上海大学外国语学院\t108934\naa2\t108935\n茵\t108936\n跟进\t108937\n课外学分\t108938\n第163章\t108939\n再生\t108940\n德州东\t108941\n北京师范大学图书馆\t108942\n战兵\t108943\n阿拉宁波网\t108944\n怪鸟\t108945\n墙垛\t108946\n喇叭\t108947\n32款\t108948\n强化训练\t108949\n卫生服\t108950\n麦场\t108951\n肉串\t108952\n太阴\t108953\n観月\t108954\n钱雁秋\t108955\n上海11号线\t108956\n布鞋\t108957\nmysql数据库文件\t108958\n单书\t108959\nAnyDesk\t108960\n网房\t108961\nevernote\t108962\nConcentrate\t108963\n金税盘版\t108964\nandroid-studio\t108965\n范德华\t108966\nelearning\t108967\n福州大学至诚学院\t108968\n1000xm2\t108969\n安祖缇\t108970\n墙德\t108971\n西三环\t108972\n层云\t108973\n超级市场\t108974\n九包邮\t108975\n无时\t108976\n毒气体\t108977\n大学英语四级考试\t108978\nPeanuts\t108979\n56名\t108980\nmico\t108981\n猪肺\t108982\n波哥盛世大厦\t108983\n腾博\t108984\nun个\t108985\n头花\t108986\n凌晨两点\t108987\n重庆大学出版社\t108988\n黑龙江省医院\t108989\n光缆\t108990\n顺丁烯二酸酐\t108991\n新欧尚\t108992\n锰系\t108993\n北极贝\t108994\n顽固\t108995\n河南省人民政府国有资产监督管理委员会\t108996\n铭座\t108997\n不嫁给\t108998\n拍一拍\t108999\n棋手\t109000\n奋迅\t109001\nbeetle\t109002\n疑病症\t109003\nspecular\t109004\n廯\t109005\n森女\t109006\nag03\t109007\n概括\t109008\nSHI\t109009\n有素\t109010\n锁城\t109011\n康迪泰克\t109012\n睁开\t109013\n许安然\t109014\n百度验证码\t109015\n微网\t109016\n180413\t109017\nOnClick\t109018\nPC圈\t109019\n中华人民共和国人民政府\t109020\nzelens\t109021\n许超\t109022\n星河湾半岛\t109023\n探索者\t109024\n100ml\t109025\n海翔\t109026\n7778\t109027\n地坛\t109028\n因果论\t109029\n300勇士:帝国崛起\t109030\n宇智波佐助\t109031\n勿忘心安\t109032\n日景\t109033\n轩轩\t109034\n广济桥\t109035\n贵州省水利厅\t109036\n黑臭河道\t109037\ndelvaux\t109038\n梦幻之星ol2\t109039\n龙珠花园\t109040\n一铵\t109041\n表线\t109042\n唐浩\t109043\n猪业\t109044\n横冲直撞\t109045\n17年11月\t109046\n冀东\t109047\n狼头\t109048\nx-m\t109049\n友好\t109050\n鲁冠球\t109051\ng4600\t109052\n拾取\t109053\nPGC\t109054\n康复医疗\t109055\ngtaol\t109056\n行凶\t109057\n城市生活\t109058\nasmack\t109059\n忠犬八公的故事
\t109060\n烟窗\t109061\n消防柜\t109062\n大连理工大学城市学院\t109063\n固态硬盘120g\t109064\n恐狼\t109065\n步进伺服\t109066\n择机\t109067\nsanrio\t109068\nSuica\t109069\n吉林移动\t109070\n第四十九条\t109071\n理臣教育\t109072\n水城奈绪\t109073\n台钳\t109074\n娜迦\t109075\n三舍\t109076\n复合调味料\t109077\n108亿\t109078\nvso\t109079\nSevilla\t109080\n双月湾\t109081\n会演\t109082\n有关心\t109083\n三越\t109084\nsubjects\t109085\n两式\t109086\n几千亿\t109087\nfpermissive\t109088\n怡馨\t109089\n龟背\t109090\n飞狗\t109091\nFreight\t109092\n沛县便民网\t109093\n中共浙江省委党校\t109094\n律例\t109095\n3.1415\t109096\n上古卷轴6\t109097\n83161177\t109098\n_v2.0\t109099\n星际争霸2自由之翼\t109100\n亚马逊\t109101\n复旦大学信息化办公室\t109102\n手提电脑\t109103\n海鸟\t109104\nZONE\t109105\nfocusky\t109106\nAchilles\t109107\n周转\t109108\nExaGear\t109109\n索索\t109110\n青岛市南\t109111\n偶数\t109112\n720p|1080p\t109113\nJasperReports\t109114\n奚\t109115\n卢克R\t109116\n易中苏菲玛索\t109117\n环绕\t109118\n延津县\t109119\n厂级\t109120\n携创网\t109121\n免洗手\t109122\n香港投资\t109123\n星河\t109124\nSARscape\t109125\n山泽\t109126\n第19号\t109127\n国乐\t109128\n巴纳姆\t109129\n农奴制\t109130\n爱你的人\t109131\nfenix3\t109132\n农博园\t109133\n众合教育\t109134\n潦草\t109135\n背负\t109136\n姚氏\t109137\n市容局\t109138\n性侵犯\t109139\n灵珠\t109140\ncommercial\t109141\n42%\t109142\n三台县\t109143\n6030\t109144\nAGM\t109145\nL450\t109146\n关节轴承\t109147\n王者荣耀夫子\t109148\n分线盒\t109149\n圆柱子\t109150\n1.92\t109151\nHash算法\t109152\nAlley\t109153\nme31\t109154\n失败乃成功之母\t109155\n03期\t109156\n维多利亚港\t109157\n吴忌寒\t109158\n重生\t109159\n269\t109160\nMaven+SSM\t109161\n20170720\t109162\nSTC15F2K60S2\t109163\n南松\t109164\n记牌器\t109165\n湖湘文化\t109166\n梁大师\t109167\n北京康盛新创科技有限责任公司\t109168\n充磁\t109169\n冷青衫\t109170\n犬王\t109171\n4500吨\t109172\n川姜镇\t109173\n钢种\t109174\n太白湖新区\t109175\n双山\t109176\nsmart200\t109177\n天罡\t109178\n夸客\t109179\n奥古斯托\t109180\n善恶资源网\t109181\n400ml\t109182\n环博会\t109183\nBeautyleg\t109184\n灵石\t109185\n一凤\t109186\n如家酒店\t109187\n植物大战僵尸花园战争2\t109188\n机动战士高达00\t109189\n出校\t109190\n吸土\t109191\nMoc\t109192\nAUO\t109193\n富丽花园\t109194\n洛阳玻璃\t109195\n桎梏\t109196\narray_walk\t109197\n450W\t109198\n拥立\t1091
99\nLinGon\t109200\n春娇\t109201\n舒客牙膏\t109202\n信鸽网\t109203\n灯光秀\t109204\n穆赫兰\t109205\n五云山\t109206\n人中招\t109207\n檀香山\t109208\n艳美\t109209\n启运\t109210\n192\t109211\n阳陵\t109212\nnative\t109213\nultraedit\t109214\n111111\t109215\n击出\t109216\n听伴\t109217\n昆明南站\t109218\n拆开\t109219\n中冶集团\t109220\ngeturl\t109221\n津巴布韦元\t109222\n创新设计大赛\t109223\nVICTOR\t109224\n上梁\t109225\n∷\t109226\n一分钱\t109227\n抓绒衣\t109228\n20.0\t109229\n昵图网\t109230\n德战\t109231\nConstantin\t109232\nメス豚\t109233\n车行天下\t109234\nLOCO\t109235\n福州地铁6号线\t109236\nmnesia\t109237\nUFC终极格斗锦标赛\t109238\n极薄\t109239\n菠萝\t109240\nvlanif\t109241\n避重就轻\t109242\n工商大学\t109243\n土地权\t109244\n原辅\t109245\n迪妮莎\t109246\n诺奖\t109247\n丑男\t109248\n内盖\t109249\n金灵\t109250\n耶格\t109251\npmcaff\t109252\n单色仪\t109253\nas2.0\t109254\n元宝山区\t109255\n烤全鱼\t109256\n道奇战斧\t109257\nit\t109258\n燎\t109259\n金华19楼\t109260\nSerie\t109261\n三巨头\t109262\n淫情\t109263\n防爆挠性管\t109264\n杨玏\t109265\n中佳\t109266\n樊落\t109267\n热敷包\t109268\n金皖\t109269\n抓有\t109270\n李_\t109271\nbackups\t109272\n新浪播客\t109273\n徐誉滕\t109274\n杨瑞\t109275\nDoom\t109276\n易建\t109277\n不点\t109278\n江都中学\t109279\nRaid卡\t109280\n洲际赛\t109281\n冷排\t109282\n不老屯\t109283\nZP\t109284\n何加\t109285\n方田\t109286\n钛管\t109287\n8宫\t109288\nGeoffrey\t109289\n文友\t109290\n雍和\t109291\n马女\t109292\n科学技术部\t109293\n二遍\t109294\n计时器\t109295\n桑乐太阳能\t109296\n达州火车站\t109297\n基业箱\t109298\n先声夺人\t109299\n塔读小说网\t109300\n民非组织\t109301\n237\t109302\n水枪\t109303\n新大洲本田\t109304\n徐洪才\t109305\n李世明\t109306\nF2C\t109307\n困难生\t109308\n百跃\t109309\nPenn\t109310\n内液\t109311\n光控开关\t109312\n新丝路\t109313\n咸阳市政府\t109314\n吠\t109315\n杨梓\t109316\nfilestream\t109317\n陶虹\t109318\n东方白\t109319\n打吊\t109320\n死心\t109321\n伪列\t109322\n炎魔神\t109323\nGLC260\t109324\n年休\t109325\n98版\t109326\n类子\t109327\n黄瓜山\t109328\n第25页\t109329\n警示柱\t109330\n定量分析法\t109331\n一块\t109332\nn-1\t109333\nusleep\t109334\n中公研招网\t109335\n1.13G\t109336\n食安\t109337\n大胆西西\t109338\n和社会保障局\t109339\n水经\t109340\n心裁\t109341\n2016年5月20日\t109342\n被褥\t109343\nMailing\t109344\n阿胶糕\t109345\n免受\t109346\n料泵\t10934
7\n墨鱼骨\t109348\nAdvancing\t109349\n二十八天\t109350\n喷尿\t109351\n葬仙\t109352\n飞燕女\t109353\n灰霾\t109354\na60\t109355\n华东师范大学心理与认知科学学院\t109356\n油操\t109357\n第二相\t109358\n蒂森\t109359\n雪国列车\t109360\ncuhk\t109361\niView\t109362\n落魄\t109363\n杯中\t109364\n7.3分\t109365\n培训\t109366\n及值\t109367\n海底世界\t109368\n康复治疗师\t109369\n华颂7\t109370\n布迪厄\t109371\n职责\t109372\n检测仪\t109373\nachain\t109374\n三国演义动画版\t109375\n二胡谱_找歌谱网\t109376\n牙床\t109377\n体心\t109378\n台球知识网\t109379\n涨停指标\t109380\n封绘\t109381\n一两个\t109382\n迅雷组队\t109383\n虎踞龙盘\t109384\n资源管理器\t109385\n癣\t109386\n恢复力\t109387\n1X届\t109388\ndrgs\t109389\n倩影\t109390\n情境教学法\t109391\n哈佛大学\t109392\n青岛十九中\t109393\n火影世界\t109394\n雕塑品\t109395\n青海盐湖工业股份有限公司\t109396\n扩容\t109397\n留存率\t109398\n第35\t109399\n客源\t109400\n伴读\t109401\n培训类\t109402\n应援\t109403\n藤丸立香\t109404\n凯里经济开发区\t109405\nswiper3\t109406\n500K\t109407\n太空银\t109408\n复旦大学国际关系与公共事务学院\t109409\n样板房\t109410\n新课标人教版\t109411\n第12\t109412\n越秀滨海御城\t109413\n世界文化遗产\t109414\n被冻结\t109415\n粉鱼\t109416\n希尔伯特\t109417\n飞手\t109418\n克死\t109419\n包皮包茎\t109420\n汉博\t109421\niphoe\t109422\nMIND\t109423\n点歌机\t109424\nIDL\t109425\n杨得志\t109426\n长蛆\t109427\n小黄鹂\t109428\n智能汽车\t109429\n太紧\t109430\n圣约翰\t109431\n保护伞\t109432\n灰狗\t109433\n九江\t109434\n170701\t109435\n1775\t109436\n米子\t109437\n120s\t109438\n店埠镇\t109439\n克拉码头\t109440\n史诗\t109441\nOpcache\t109442\n微舆情\t109443\n原生态购物网\t109444\n裂变\t109445\n大嘴\t109446\n下一周\t109447\n雨雾\t109448\nbluestack\t109449\nKao\t109450\nDEVICE\t109451\n冰风\t109452\n碎布\t109453\n司库\t109454\n学天\t109455\n温岭政府\t109456\n烟场\t109457\n走道\t109458\nMasterCAM\t109459\nWorking\t109460\n奔驰GLC200\t109461\n琪琪布\t109462\n定值\t109463\nterrorist\t109464\nJJF\t109465\n龙梅\t109466\n土地登记代理人协会\t109467\n小米授权服务中心\t109468\n上海交通大学农业与生物学院\t109469\n外阴瘙痒\t109470\nRedis服务器\t109471\n赤泥\t109472\n宋方金\t109473\n开满鲜花的小路\t109474\n价格法\t109475\n摆一摆\t109476\n豆芽机\t109477\nsuns\t109478\n石港\t109479\n332\t109480\n上海光明\t109481\n殖民统治\t109482\n赵光\t109483\nplas\t109484\n5首\t109485\n东方国信\t109486\n徒劳无功\t109487\n26攻速\t109488\n漫生活\t109489\n李昌\t109490\nc1驾
驶证\t109491\n扩音\t109492\n肉嫩\t109493\n蟹岛度假村\t109494\n16oz\t109495\n败类\t109496\n4kg\t109497\nSociology\t109498\n档子\t109499\n吉祥天\t109500\n那时\t109501\n铜铝复合散热器\t109502\n全自动洗车\t109503\nOutdoors\t109504\n组份\t109505\n簡介\t109506\nMatch函数\t109507\n东西南北\t109508\n资本公司\t109509\n永仁\t109510\nYOU+\t109511\n悉地\t109512\n238\t109513\n朗动\t109514\n评论库\t109515\noob\t109516\n车载版\t109517\ns8050\t109518\n绵软\t109519\n色盲\t109520\n鲁二哥\t109521\n天圆\t109522\nra2\t109523\n安徽大学\t109524\n食品摊贩\t109525\n圆柱的表面积\t109526\n权力的游戏第三季\t109527\ntempest\t109528\n一顿饭\t109529\n醋酸钙\t109530\n微电子学\t109531\n恒温机\t109532\n西彭\t109533\n侨报\t109534\n76分\t109535\n渤海钻探吧\t109536\n智能投影仪\t109537\n盐性\t109538\n星河大帝\t109539\nghibli\t109540\n清痘\t109541\nvcom\t109542\n今讯网\t109543\n五连\t109544\n华侨城创意文化园\t109545\n北川县\t109546\nYaya\t109547\n门帘\t109548\n将来时\t109549\ntechnik\t109550\n南财\t109551\n滴滴快车吧\t109552\n中山市财政局\t109553\n洒金\t109554\n首站\t109555\n2007-2008年\t109556\ngeng\t109557\n刘楠\t109558\n于健\t109559\n我的唯一\t109560\n大英雄\t109561\n华为MediaPad\t109562\naiwa\t109563\n重名\t109564\n焖烧罐\t109565\n养生之家\t109566\n披挂\t109567\n豆丝\t109568\n本科段\t109569\n20151211\t109570\n真空泵\t109571\nOFFSET\t109572\nogg格式\t109573\n腹肌\t109574\n文化展\t109575\n武定路\t109576\n几瓶\t109577\n劳务吧\t109578\n中国体操队\t109579\n大屯镇\t109580\nSocrates\t109581\n23mm\t109582\n73级\t109583\n平望镇\t109584\nbibtex\t109585\nextraordinary\t109586\n物物\t109587\n东莞东华小学\t109588\n20151026\t109589\n三元相图\t109590\n工号牌\t109591\n财猫\t109592\nAssay\t109593\n丶\t109594\n祁门县人民政府\t109595\n高良\t109596\n必定要\t109597\n甙\t109598\nmdac\t109599\n伊东丰雄\t109600\n宿世\t109601\nICC\t109602\n仲夏\t109603\nRib\t109604\n炎爆\t109605\n东湖路\t109606\n四川长江职业学院\t109607\n缅甸花梨木\t109608\n纱丽\t109609\nEmbrace\t109610\n比利亚\t109611\narrivo\t109612\n氧化沟\t109613\n组局\t109614\n开罗游戏吧\t109615\n探沂镇\t109616\n尚志\t109617\n茶颜\t109618\n沃尔沃v60\t109619\nchua\t109620\n50度\t109621\n渗碳体\t109622\n孔二狗\t109623\n消耗品\t109624\n树体\t109625\n出版者\t109626\n经济发展史\t109627\n七口\t109628\nshakespeare\t109629\n华为p8max\t109630\n家庭式\t109631\n上海外事服务中心\t109632\n学无忧\t109633\n牛鼻子\t109634\n奔
驰gls450\t109635\n知识产权服务\t109636\n归去\t109637\nbibi\t109638\n瑟雨\t109639\n徐冬梅\t109640\nwww.2144.cn\t109641\n玉容\t109642\n6章\t109643\n陶冬\t109644\n鈴木\t109645\nv7.0.0\t109646\n峨边彝族自治县\t109647\n黑龙江省交通运输厅\t109648\n4冠\t109649\ngprinter\t109650\n735\t109651\n盗墓少年三国志\t109652\n本书\t109653\n御匾\t109654\nteller\t109655\n好莱\t109656\nbln\t109657\n两化融合管理体系\t109658\n1000Mbps\t109659\n泷泽萝\t109660\n北京白癜风医院\t109661\n权益性投资收益\t109662\n滕王阁\t109663\n盖世帝尊\t109664\n滑倒\t109665\n快穿女配\t109666\nMRD\t109667\n草本\t109668\n小容\t109669\n十二烷基苯磺酸\t109670\n中棒\t109671\n猛犬俱乐部\t109672\n℅\t109673\n0551-62723050\t109674\n亚硒酸钠\t109675\n理发馆\t109676\n宣读\t109677\nsalvage\t109678\n晓月圆舞曲\t109679\n高智能方程式赛车\t109680\n祥源\t109681\n杨羽\t109682\n外贸邦\t109683\n法雅\t109684\n爱乐乐享\t109685\n芝麻官\t109686\n12369\t109687\nvsflexgrid\t109688\nROBLOX\t109689\n塔拉夏\t109690\nHand\t109691\n许戈辉\t109692\n维固\t109693\n蹋\t109694\nwegame\t109695\n大屠\t109696\n正态分布表\t109697\n生药\t109698\n18万吨\t109699\nsilva\t109700\nrunningman饭网\t109701\n滨海时报\t109702\n99米\t109703\n套圈圈\t109704\n星轨\t109705\n美竹铃\t109706\n草东\t109707\n敬祝\t109708\n极略三国\t109709\n监察室\t109710\nKnew\t109711\n帮帮团\t109712\n噩运\t109713\n吉木\t109714\ndavis\t109715\n鼎天\t109716\n性系\t109717\nDuplex\t109718\n固态硬盘\t109719\n肾上腺素\t109720\n双重否定句\t109721\n缥缈录\t109722\n澄迈县人民政府\t109723\n续作\t109724\n电放保函\t109725\n长卡\t109726\n西安未央政府网\t109727\nMstar\t109728\n美洛耶塔\t109729\n代号\t109730\n橘子皮\t109731\n对赌\t109732\nnational\t109733\n贝壳头\t109734\n杨扬\t109735\nPagination\t109736\nSteam_www.3dmgame.com\t109737\n百望\t109738\n氩弧焊枪\t109739\n防爆轮胎\t109740\n信托\t109741\n553\t109742\nzaix\t109743\n1970年\t109744\n左峰\t109745\n筠连\t109746\n沈阳航空航天大学\t109747\n干裂\t109748\n极乐\t109749\n为负\t109750\n335i\t109751\n优课\t109752\n唐江\t109753\nCCTIME\t109754\n酷q机器人\t109755\n有恃无恐\t109756\noverlook\t109757\n望京西\t109758\n虾稻\t109759\nwonyun\t109760\n高长\t109761\n触摸屏\t109762\n格安\t109763\n家户\t109764\n南京地铁10号线\t109765\n红人馆\t109766\n千元级\t109767\n交通肇事逃逸\t109768\nvifa\t109769\n25亿美元\t109770\nctn\t109771\n快疯了\t109772\n闲人\t109773\n望江亭\t109774\nX4简体中文\t109775\n华三路由器
\t109776\n满城\t109777\n考古\t109778\nBufferedWriter\t109779\n南京市中级人民法院\t109780\n莞城\t109781\n55kg\t109782\n博鳌亚洲论坛国际会议中心\t109783\n把妹\t109784\njeux\t109785\n杰伦布朗\t109786\n嵌草砖\t109787\n江南通志\t109788\n康宁大猩猩玻璃\t109789\n薛定谔\t109790\n【天策府\t109791\ncooler\t109792\n萋萋\t109793\n放下来\t109794\nMAGIX\t109795\navic\t109796\n皇姑区\t109797\n卧床不起\t109798\n诫子书\t109799\n泗河\t109800\nautocad2016\t109801\n拱辰\t109802\n卡拉OK\t109803\nplanner\t109804\n西部世界第二季\t109805\nLOS\t109806\n赵文静\t109807\n手帕\t109808\n中曼石油\t109809\n中科院半导体所\t109810\nGlow\t109811\nfilament\t109812\n南箫\t109813\nrunaway\t109814\n39年\t109815\nTennessee\t109816\n淋巴结核\t109817\n吹拉弹唱\t109818\n四升\t109819\n初出茅庐\t109820\n第三夜\t109821\n华夏基金财富宝\t109822\n0718\t109823\nLRU\t109824\n航空器\t109825\n注塑模\t109826\n抗火\t109827\n欠调\t109828\n100多部\t109829\n不想\t109830\nhuangse\t109831\n求发\t109832\n刑侦科\t109833\n转卖\t109834\n2000-2010年\t109835\n166路\t109836\n樱花大战5\t109837\n明星大侦探第三季\t109838\n氢氧化物\t109839\n三秦网\t109840\n慧聪\t109841\n飞去\t109842\n牡丹江医学院\t109843\n20180124\t109844\n7.9分\t109845\ncreatejs\t109846\nGoEuro\t109847\n棉制\t109848\n水洗机\t109849\n阿里体育\t109850\nLarsen\t109851\n浪生\t109852\n米皮\t109853\n四川共青团\t109854\n三众\t109855\n贝印\t109856\nwww.haoxp123.com\t109857\n秒板\t109858\nrooted\t109859\nskye\t109860\n成年犬\t109861\n尹天照\t109862\n中关村在线办公论坛\t109863\n色卡\t109864\n母鸽\t109865\n莆田市\t109866\n荒野八人组\t109867\n直流电子负载\t109868\n城市晚报\t109869\n印经\t109870\n泛海三江\t109871\n沁县\t109872\n出生医学证明\t109873\n陈式太极\t109874\n新屋村\t109875\n王铁成\t109876\n哨所\t109877\n侉子\t109878\n足踝\t109879\n美银\t109880\n丽水中学\t109881\ngic\t109882\n2.81\t109883\n中国烟草总公司\t109884\nStrap\t109885\n3554\t109886\n江苏省电力\t109887\n靖康\t109888\n免房\t109889\n熏陶\t109890\n骐达超级神基因\t109891\n速度与激情7\t109892\n咪咕屋\t109893\n丽江网\t109894\nmgs\t109895\n方晨\t109896\n威尼托\t109897\n贵重\t109898\nblvd\t109899\n常州西\t109900\n猪\t109901\n华飞\t109902\n大众宝来\t109903\nbeauté\t109904\nCuO\t109905\n为\t109906\n宝洁公司\t109907\n路德维希\t109908\ncamaro\t109909\n顾拜旦\t109910\n8队\t109911\n均方误差\t109912\n匯率\t109913\npacewear\t109914\n噗嗤噗嗤\t109915\n装配式\t109916\nM154a\t1099
17\n巡河\t109918\n烈焰猴\t109919\n花椒币\t109920\n计费\t109921\n爱的人间\t109922\n门口\t109923\nTransparent\t109924\n5.19\t109925\n测试场\t109926\n中国制造网\t109927\nbihua\t109928\n商展\t109929\n低血糖\t109930\n意大\t109931\nFlange\t109932\n乳酪\t109933\nmirage\t109934\n卡箍\t109935\n79集\t109936\n236个\t109937\n酷漫网\t109938\n魔狱\t109939\n瑞利\t109940\n0427\t109941\n1551\t109942\n裁剪\t109943\n天照大神\t109944\n撞车\t109945\nipnone\t109946\n布布路\t109947\n干部局\t109948\n介苗\t109949\n谦辞\t109950\n天香\t109951\n蜡像馆\t109952\nsentaurus\t109953\n汉斯格雅\t109954\n8局\t109955\nJava类\t109956\nvieos\t109957\n施策\t109958\n武器战\t109959\n黑暗料理王\t109960\nbde\t109961\n诚然\t109962\nelectricity\t109963\n流弊\t109964\n噪音扰民\t109965\n十二天\t109966\n悠扬\t109967\n31年前\t109968\n几分之\t109969\n胆管结石\t109970\n猫石\t109971\nPILIPILI\t109972\n微糖\t109973\n黑桃皇后\t109974\nsynthetic\t109975\n37栋\t109976\n彭雪枫\t109977\n刘蓓\t109978\n新南门\t109979\n上海交通大学》\t109980\nl298n\t109981\n中国建材集团\t109982\n帕罗西汀\t109983\n2825\t109984\n蚂蜂窝\t109985\n垂直度\t109986\napprox\t109987\n管理心理学\t109988\n山药泥\t109989\n中转仓\t109990\n危险性\t109991\nkodi\t109992\nTMF\t109993\n20151115\t109994\n宇宙大爆炸理论\t109995\n捷途X90\t109996\nGirls\t109997\n芈八子\t109998\nEMUI8.0\t109999\n苏拉玛城\t110000\nBuildings\t110001\n擅用\t110002\narcGIS\t110003\n捉鬼\t110004\nQ50L\t110005\n推脱\t110006\n开版\t110007\nGakuen\t110008\n丹尼尔·克雷格\t110009\nfinebi\t110010\n支公司\t110011\n爱尚小说网\t110012\n雪道\t110013\nprints\t110014\nOpenXml\t110015\n东州\t110016\n硫酸庆大霉素\t110017\n鲅鱼圈新闻网\t110018\nCV\t110019\n爱普生l805\t110020\n郑州外国语中学\t110021\n曳引\t110022\n机械台\t110023\n静水深流\t110024\njager\t110025\nplaystation\t110026\nallocation\t110027\n维生素b族片\t110028\n胡茬\t110029\n装修期\t110030\npom.xml\t110031\n京东快递单号\t110032\n71届\t110033\n精装书\t110034\nZynq-7000\t110035\nfillet\t110036\n奇皇后\t110037\n四国\t110038\n仰光\t110039\nv3.4.0\t110040\n小金钟\t110041\nhermite\t110042\nTFS\t110043\n被判\t110044\nPhilosophy\t110045\n绿行\t110046\nLoadrunner12\t110047\n复明\t110048\n1.7.1\t110049\n江铃特顺\t110050\nNYOJ\t110051\njiaocheng\t110052\n捕鱼网\t110053\n头环\t110054\nProgrammer\t110055\n侍魂2\t110056\n带来\t1100
57\n收费权\t110058\n叶赫那拉氏\t110059\n袖剑\t110060\n法丽达\t110061\nvts\t110062\n百余部\t110063\n京美\t110064\n第237集\t110065\n秒射\t110066\n有点难\t110067\n瑞虎8\t110068\nE92\t110069\n33名\t110070\n过滤机\t110071\n一槌定音\t110072\n流行期\t110073\n板楼\t110074\n京族\t110075\n长安欧尚a800\t110076\n北京16区\t110077\nMacro\t110078\n金融理财师\t110079\n昌黎县\t110080\n蓝溪\t110081\n西安国际港务区\t110082\n国企公司\t110083\nringtone\t110084\n|2\t110085\n窄轨\t110086\n腹式呼吸法\t110087\n老妇人\t110088\nImageLoader\t110089\n无糖食品\t110090\n#\t110091\n侯爷\t110092\n语输入法\t110093\n笨\t110094\n也想\t110095\n中诚信国际信用评级有限责任公司\t110096\n002604\t110097\nGrim\t110098\n50名\t110099\n宏名\t110100\ne智贷\t110101\n蝴蝶精\t110102\n程成\t110103\n呼和浩特东\t110104\n外间\t110105\njinji\t110106\n旋转轮胎\t110107\n边套\t110108\n海宇\t110109\ntengine\t110110\n保定北市\t110111\ndev-server\t110112\n一鹿\t110113\n二烯烃\t110114\nlunch\t110115\n湛江市政府\t110116\n钢承板\t110117\nHD-MP4/\t110118\n江海职业技术学院\t110119\n杨刚\t110120\n戈丫汝\t110121\nECHO\t110122\n乔洋\t110123\n已知数\t110124\n自选股\t110125\n中国管道商务网\t110126\n工程型\t110127\ne乐充\t110128\n黄果树机场\t110129\nBirth\t110130\nxiuren\t110131\n暴鲤龙\t110132\nREX\t110133\n奔驰amg\t110134\n跨源\t110135\n怂包\t110136\n嫦娥三号\t110137\n紫罗兰永恒花园\t110138\n红人气\t110139\n大朴\t110140\n上海浦传翻译有限公司\t110141\n新普陀小学\t110142\n花キララ\t110143\n污水处理\t110144\nInterior\t110145\nviva\t110146\n珠江花城\t110147\n关学\t110148\n起动器\t110149\nFreeRadius\t110150\n长鞭\t110151\n女人没有错\t110152\n前茅\t110153\n杨晓明\t110154\n日久生情\t110155\n语气词\t110156\n观火\t110157\n微软surface\t110158\n爱依瑞斯\t110159\n云南大学滇池学院\t110160\n衣食住行\t110161\n博世集团\t110162\nFans\t110163\nKEITH\t110164\n分卷压缩\t110165\nTerence\t110166\n战来\t110167\n49pao\t110168\n歌节\t110169\n南京博物馆\t110170\n出翔\t110171\n栏\t110172\ndlz\t110173\nChuan\t110174\n青岛崂山区\t110175\n传媒集团\t110176\nWEB打印控件\t110177\n新闻出版广电总局\t110178\nSolidworks2016\t110179\nread\t110180\n第四十九\t110181\n926\t110182\n回旋\t110183\n诗林\t110184\n不锈钢罐\t110185\n雪窦山\t110186\n财评\t110187\n20160826\t110188\n口碑网\t110189\n数据库存储\t110190\n无梁\t110191\n东北平原\t110192\n锁髓\t110193\n荷花池\t110194\n非金融\t110195\n球连者\t110196\n钻牛角尖\t110197\nGPS之家论坛\t110198\n学士学位
证\t110199\n淫妻\t110200\n智能一卡通\t110201\ntopas\t110202\n500型\t110203\n南海广场\t110204\n缝合线\t110205\n搞完\t110206\nHENTAI\t110207\npatent\t110208\nSchemes\t110209\n黑珍珠号\t110210\ncombat\t110211\n无锡日报\t110212\n医妃权倾天下\t110213\n上王村\t110214\n高辣文\t110215\n杰克罗素梗\t110216\n故事新编\t110217\ngrades\t110218\nchondrial\t110219\n许默\t110220\nzb\t110221\n德耐尔\t110222\nserpent\t110223\n人参花\t110224\n成就者\t110225\n宫颈疫苗\t110226\n浅塘\t110227\n蕾丝边\t110228\n神剪\t110229\ngantz\t110230\n249号\t110231\ncpci\t110232\n朱由检\t110233\n西北人\t110234\n超敏c反应蛋白\t110235\n镍镉电池\t110236\n_位面\t110237\n47位\t110238\n广州银行信用卡中心\t110239\n欲女\t110240\n圣井山\t110241\n玫瑰酱\t110242\n广阳\t110243\nc3d\t110244\necms\t110245\n后益\t110246\n4690\t110247\nAnello\t110248\n报工\t110249\n武汉市疾病预防控制中心\t110250\n海正药业\t110251\n61家\t110252\n盖队\t110253\n8只\t110254\nEP02\t110255\n范德瓦尔斯\t110256\n最终幻想战略版\t110257\n智课\t110258\ntelemetry\t110259\n郑少秋\t110260\n康佳\t110261\n耐卡\t110262\nnx3\t110263\n十七届\t110264\n加减消元法\t110265\n南京汽车客运南站\t110266\n7303\t110267\n螺旋地桩\t110268\n监听者\t110269\n东外滩\t110270\n2762\t110271\n金瑞龙\t110272\n师徒情\t110273\n龙狮\t110274\n宋钦宗\t110275\nilog\t110276\n迪亚波罗\t110277\ndisplacement\t110278\n浪涌套\t110279\n绝世无双\t110280\nCHROME浏览器\t110281\n买入额\t110282\n迈腾论坛_汽车之家论坛\t110283\n建宁\t110284\nRecombinant\t110285\nnull、undefined\t110286\n徐汇教育\t110287\n薪酬网\t110288\n雨雨\t110289\n藤原纪香\t110290\nBurberry\t110291\n大拿\t110292\ni5-8400\t110293\n土耳其机场\t110294\n报告团\t110295\n跳皮筋\t110296\nHIOKI\t110297\n5.6.7\t110298\nxcrew\t110299\n南宁市市\t110300\n郎溪县\t110301\n闪之轨迹3\t110302\n泛普软件\t110303\n杭州工商局\t110304\n托里县\t110305\nHSSF\t110306\n790\t110307\n高速机\t110308\nTQM\t110309\n税法一\t110310\n淘宝银行\t110311\n礼治\t110312\n罐箱\t110313\n囚情\t110314\n青年志愿者协会\t110315\ncircumstance\t110316\n陈勋奇\t110317\n杜月笙\t110318\nLink\t110319\n职业差评师\t110320\n李纹\t110321\nrecon\t110322\n甘孜县\t110323\nKimberly\t110324\n仓科加奈\t110325\n美甲师\t110326\n鸬鸟山\t110327\n动力性\t110328\n煞白\t110329\n胎位不正\t110330\n内体\t110331\n看不惯\t110332\ngay吧_\t110333\n煮汤\t110334\n快线路\t110335\nSMD\t110336\n回民\t110337\n医助\t110338\n0.6.3\t110339\n象人\
t110340\noregon\t110341\n照耀\t110342\n萧晨\t110343\n鼎龙\t110344\n云雾茶\t110345\n旅游局\t110346\n父子兵\t110347\n响子\t110348\n家教案\t110349\nel-cascader\t110350\nK60\t110351\n汉石桥湿地公园\t110352\nhtnl\t110353\n走读\t110354\nSimplify\t110355\n青岛电信\t110356\n嚎哭\t110357\n停车库\t110358\nCapcom\t110359\n默认页\t110360\n火炭\t110361\nJU\t110362\n幻天\t110363\nPPE\t110364\n吴桂贤\t110365\npreprocess\t110366\n青海省人力资源和社会保障厅\t110367\n入浴\t110368\n融雪剂\t110369\n1400\t110370\n俯卧位\t110371\n花间集\t110372\n波多黎各\t110373\n无念\t110374\n国泰君安国际\t110375\n权益性\t110376\n叶面\t110377\n北关村\t110378\n0610\t110379\n示威者\t110380\n俄文\t110381\n赵宝儿\t110382\n金角银角\t110383\n半月谈\t110384\n联谊新闻网\t110385\n钱氏\t110386\nsige\t110387\nオブ\t110388\nAD转换器\t110389\n50分贝\t110390\n彭勃\t110391\nColombo\t110392\n2017年03月\t110393\n重置成本法\t110394\nSASA\t110395\n江淮汽车\t110396\n和静县\t110397\n哈理工\t110398\n沁园春雪\t110399\nvape驱蚊器\t110400\n重庆长安汽车股份有限公司\t110401\n山东大学经济学院\t110402\n南海实验中学\t110403\n三十二式太极剑\t110404\n塞尔达:荒野之息\t110405\n交易猫\t110406\n高沟\t110407\n票据案\t110408\nreact16\t110409\n地产商\t110410\nwww.360doc.cn\t110411\n清明上河图\t110412\n第十三批\t110413\n55部\t110414\n尼亚\t110415\n世界500强企业\t110416\n百官街道\t110417\n偏方\t110418\n名山大川\t110419\n中华古玩网\t110420\n弹弹堂2\t110421\n马川行\t110422\n天津大学图书馆\t110423\n利维坦\t110424\n五花\t110425\n太平湖\t110426\n卿卿\t110427\n透干\t110428\n废章\t110429\n电子信息专业\t110430\n宣州区政府\t110431\n村志\t110432\n金立m6\t110433\n国贸商城\t110434\n探子\t110435\nendnote8\t110436\n2mod\t110437\n刘烨\t110438\n绿金\t110439\nEA7\t110440\n善念\t110441\n鄂伦春族\t110442\nleptonica\t110443\n王者荣耀刺客\t110444\n10.0.11\t110445\n喘息\t110446\n兰索拉唑片\t110447\n丹麦克朗\t110448\nadd\t110449\n王大明\t110450\n第5类\t110451\n31.5\t110452\n料口\t110453\n铅酸\t110454\n吻兽\t110455\n横叉\t110456\n隔三差\t110457\nkeybd\t110458\n新西塘\t110459\n国星\t110460\n村支书\t110461\n青海省体育局\t110462\nled光源\t110463\n长炭\t110464\n杏核\t110465\n皈依佛门\t110466\n分饰\t110467\n管办分离\t110468\n数说\t110469\nIMM\t110470\n硅谷大街\t110471\n润普\t110472\n同样\t110473\n换板\t110474\n神探狄仁杰之情花金人案\t110475\n/\t110476\n付金\t110477\n还券\t110478\n720P版BD1280\t110479\n将长\t110480\n港华\t110481\n色,戒\t110482\n开封站\t
110483\nMarsh\t110484\n顾客群\t110485\n归档\t110486\ncubase\t110487\n萧山中学\t110488\nUIFont\t110489\nリアル\t110490\n杨露\t110491\n量贩式KTV\t110492\n黄腐酸钾\t110493\nfatezero\t110494\n陈小云\t110495\n黑执事\t110496\n巴士梦三国\t110497\n射礼\t110498\n五年制小学\t110499\n糕点师\t110500\n功夫熊猫\t110501\n郭宝昌\t110502\n高群书\t110503\n方原\t110504\n钟辰乐\t110505\n官帽\t110506\nCylinder\t110507\n原身\t110508\n长江流域\t110509\n视觉中国集团\t110510\n声网\t110511\n应许\t110512\n十一号\t110513\n点式\t110514\nwildest\t110515\nsecpath\t110516\n被动防护网\t110517\n山南水北\t110518\n兜售\t110519\n红印\t110520\nALEX\t110521\n露齿\t110522\n安徽工业经济职业技术学院\t110523\nzhanjindong\t110524\n纪念册\t110525\n青云之志\t110526\n金安区人民政府\t110527\nskybox\t110528\n熊出没之丛林总动员\t110529\nSUNSHINE\t110530\n袖手\t110531\ninterl\t110532\n立意\t110533\nunit2\t110534\n山东广电\t110535\n牙山\t110536\n五子连珠\t110537\n覆盖剂\t110538\n日屄\t110539\n蔡司镜片\t110540\n褐石\t110541\n第30\t110542\n申义\t110543\ninsec\t110544\nSpecialist\t110545\ntoken认证\t110546\n闲情逸致\t110547\n臧克\t110548\n企鹅号指数\t110549\n布拉格\t110550\n独宠\t110551\n神策\t110552\nthoracic\t110553\n龙珠超宇宙2吧_\t110554\n博大精深\t110555\n陈伟杰\t110556\n淄博大众网\t110557\nZalo\t110558\n菱悦\t110559\n彩瓷\t110560\n100枚\t110561\n三人行必有我师焉\t110562\n平板划线法\t110563\n换乘站\t110564\nginger\t110565\nElias\t110566\n世界银行集团\t110567\nx轴\t110568\n锻练\t110569\n脓肿\t110570\n410\t110571\n大同中学\t110572\n王思星际战甲\t110573\n腾讯视频_好莱坞影院\t110574\n助理\t110575\n万言书\t110576\n审制\t110577\n游侠网\t110578\nQQ幻想\t110579\n20189\t110580\n卡米尔\t110581\npwp\t110582\n成本率\t110583\n树葡萄\t110584\n陈冉冉\t110585\n松土\t110586\n四合\t110587\n乐昌\t110588\n绿岛影院\t110589\n段\t110590\nalphacam\t110591\n30项\t110592\n崔屹\t110593\n商务部国际贸易经济合作研究院\t110594\n门地\t110595\n文件盒\t110596\n局域\t110597\nPowerMock\t110598\n佛本是道\t110599\n彦\t110600\n庭园\t110601\n180天内\t110602\n类封装\t110603\n50页\t110604\n九龙区\t110605\n验票\t110606\nTacoma\t110607\n软轴\t110608\n东风日产文体中心\t110609\n青雨\t110610\n673\t110611\n桔子浏览器\t110612\n后尾箱\t110613\n三水广场\t110614\n扯下\t110615\n小闹钟\t110616\n慢性肠炎\t110617\n佳兆业君\t110618\n女神异闻录2\t110619\n英协路\t110620\n移动神州行卡手机号段\t110621\nrealguitar\t110622\n05_\t110623\nrgb565\t110
624\nMounting\t110625\n矶钓竿\t110626\n孬种\t110627\nAUTODESK\t110628\nDiss\t110629\nthreesome\t110630\n四川师范大学\t110631\n花蜜\t110632\n收割机\t110633\n南坪街道\t110634\n宝树\t110635\n潜行者|Rogue-魔兽世界\t110636\n抗病毒口服液\t110637\n直升生\t110638\n通水\t110639\n阿里企业邮箱\t110640\n兄弟团\t110641\n美妆网\t110642\n蝙蝠蛾\t110643\n孔盖\t110644\n奥斯特洛夫斯基\t110645\n杨三姐\t110646\n金牛座\t110647\n八九不离十\t110648\nSM捆绑口塞调教\t110649\nblustone\t110650\n宝安政府\t110651\n进食障碍\t110652\n2017—2020年\t110653\n败走\t110654\nshallow\t110655\n凉花\t110656\n3.8.4\t110657\nQQ影音\t110658\n起机\t110659\nstalled\t110660\n请柬\t110661\n点睛之笔\t110662\n第三方输入法\t110663\n阿精\t110664\n德泰\t110665\nabyss\t110666\n间接性\t110667\n13.0.1\t110668\n诳语\t110669\n天津市\t110670\n箕\t110671\n接地\t110672\n8.7\t110673\n荏原\t110674\n显微硬度\t110675\n酒吞童子\t110676\n陵城\t110677\n校纪\t110678\nhrss\t110679\n明清\t110680\n皇军\t110681\nCASCADE\t110682\n15枚\t110683\n夏目\t110684\n联发科技\t110685\n撕衣\t110686\n电视问政\t110687\n辊子\t110688\nclarisonic\t110689\n星岩\t110690\n人造肉\t110691\ndistance\t110692\n豪沃自卸车\t110693\n七侠镇\t110694\n孔杰\t110695\n披肩\t110696\n龙山站\t110697\n帐子\t110698\n二阶堂红丸\t110699\n刘立\t110700\n释然\t110701\n直形\t110702\n野兵\t110703\n袁家岭\t110704\n心爱\t110705\n申购\t110706\n中国中小企业协会\t110707\nshind\t110708\n这么开\t110709\n18周\t110710\n40种\t110711\n东虹桥\t110712\n2_\t110713\nofficescan\t110714\n7265\t110715\n河口\t110716\n860evo\t110717\n非齐次线性微分方程\t110718\n橱窗位\t110719\n见长\t110720\n10_\t110721\n无棣\t110722\n大肚子\t110723\n智能卡\t110724\ncultures\t110725\n僖公\t110726\n11.2.5\t110727\nSandro\t110728\n工作性\t110729\nattorney\t110730\n跳跃式\t110731\n图虫摄影网\t110732\n李嫣\t110733\n炫彩版\t110734\n知已\t110735\n澄城县\t110736\n天涯网\t110737\n峡山区\t110738\ndiskgenius\t110739\nX240\t110740\nCA1350\t110741\nDockerFile\t110742\n知府\t110743\n搭板\t110744\n李树建\t110745\nendurance\t110746\n绝地求生进\t110747\n呆脑\t110748\ndm3344\t110749\n苍之纪元大苍穹\t110750\nWiiU模拟器\t110751\n爱剪辑软件\t110752\nandrioid\t110753\nparadox\t110754\n京师博仁\t110755\n明言\t110756\n电荷放大器\t110757\n凋零\t110758\nnegro\t110759\n石兵\t110760\n东门口\t110761\n音控\t110762\n北京师范\t110763\n浩天\t110764\n温瑞安\t110765\njana\t
110766\n浐灞半岛\t110767\n挂杯\t110768\n城市管理专业\t110769\n芳龄\t110770\n李先念\t110771\naccess_token\t110772\nGRACE\t110773\nbuffering\t110774\n動畫\t110775\n茗\t110776\n均衡\t110777\n心惊胆战\t110778\n群儒\t110779\n逃走\t110780\n食品工艺学\t110781\n5523\t110782\nrto\t110783\n方回春堂\t110784\nISO镜像文件\t110785\n774\t110786\n三生(中国)健康产业有限公司\t110787\ncollate\t110788\n烷烃\t110789\n浙江大学光电信息工程学系\t110790\n阿兹特克\t110791\n沙坪\t110792\n暗黑类\t110793\n冶炼炉\t110794\nQt4\t110795\n7天左右\t110796\n美乐乐\t110797\n半影\t110798\n大平层\t110799\nsvnkit\t110800\n四载\t110801\nmbr分区\t110802\nSPS\t110803\n45寸\t110804\nVacations\t110805\n暖荷\t110806\n梦之城\t110807\n起亚KX5\t110808\n46款\t110809\n注册器\t110810\n上班女郎\t110811\n扶贫日\t110812\n暖闻\t110813\n推背\t110814\n通信原理\t110815\nfreenom\t110816\nSmell\t110817\n阿森纳\t110818\n景芝镇\t110819\n如果有来生\t110820\n业务流\t110821\nkingdom\t110822\n汤原\t110823\nkicks\t110824\nlv3\t110825\n铜仁路\t110826\n陕西论坛_汽车之家论坛\t110827\n曲江区\t110828\n杨絮\t110829\n腰酸背疼\t110830\nBred\t110831\n外来妹\t110832\n庚申年\t110833\n牵着手\t110834\n襄阳路\t110835\n误工\t110836\nMemoryStream\t110837\n凤凰蛋\t110838\n罗西基\t110839\n国家赔偿法\t110840\nAAP\t110841\n解放军军乐团\t110842\nNylon\t110843\n8本\t110844\nED2000.COM\t110845\n圆的认识\t110846\n政见\t110847\n工兵\t110848\n竞谈\t110849\n洛林\t110850\n反拍\t110851\n洗眉\t110852\n宇部\t110853\n曾恺玹\t110854\nHoaprox\t110855\n海沧区\t110856\n瓜子网\t110857\n蓝龙\t110858\n推进剂\t110859\nVoLTE\t110860\n毕雯珺\t110861\n地暖盘\t110862\n政要\t110863\n悬点\t110864\n氟中毒\t110865\nAiMesh\t110866\n环球人物\t110867\n一龄\t110868\n棺人\t110869\n1月5日\t110870\n动势\t110871\n灰幕\t110872\n麒派\t110873\n免费法律咨\t110874\n超神学院之乾坤篇\t110875\n一期二期\t110876\n东亚银行(中国)有限公司\t110877\n懂得\t110878\ncctext\t110879\n食俗\t110880\n飞云之下\t110881\n2008年5月\t110882\n中华优秀传统文化\t110883\n香囊\t110884\nAir\t110885\n海吧\t110886\n完成度\t110887\n彩钢卷\t110888\n注孤生\t110889\n2017.9\t110890\nfairy\t110891\n解付\t110892\n英特\t110893\n甘丹寺\t110894\nLucky\t110895\n中国银行银行\t110896\n20150102\t110897\n暗影精灵2plus\t110898\n殷高西路\t110899\n短绒\t110900\n3dcoat\t110901\n肾素\t110902\n马利亚\t110903\n玉兰苑\t110904\n中华人民共和国安全生产法\t110905\n宇称\t110906\nSharding\t110907\n髂胫束
\t110908\n陶宝\t110909\n丁文武\t110910\n遗民\t110911\n食药局\t110912\n客运员\t110913\n魅族Pro6Plus\t110914\n福晟集团\t110915\ncbr\t110916\n3215\t110917\n虚标\t110918\n涠洲岛\t110919\n海峡军演\t110920\n战情室\t110921\n迪兰\t110922\nrough\t110923\nSporting\t110924\n深圳公办学校\t110925\ntransceiver\t110926\n黒澤\t110927\nGongSi\t110928\n裸体照\t110929\n虚值期权\t110930\n运动包\t110931\n初体験\t110932\n透明盒\t110933\n洗脚\t110934\n精选\t110935\nbloom\t110936\n宁津\t110937\n越洋\t110938\n2.8.0\t110939\n``\t110940\n128k\t110941\n溴隐亭\t110942\n经络\t110943\n永济\t110944\n明珠台\t110945\n烤瓷\t110946\n苏元\t110947\n二手车百事通\t110948\n天声\t110949\n22nd\t110950\n中书\t110951\nhtml语言\t110952\n薄料\t110953\n导热油炉\t110954\n多起\t110955\n嘉特\t110956\n第125章\t110957\nRStudio\t110958\n实绩\t110959\n迈步\t110960\n1.2.5\t110961\n网包\t110962\n祁东\t110963\n不知者\t110964\n杜近芳\t110965\n别克英朗\t110966\nlg2\t110967\n白剂\t110968\n戈博\t110969\n超凡战队\t110970\n泰坦之路\t110971\n碲\t110972\n代理店\t110973\n激情照\t110974\ndataGridView\t110975\n专利代理人考试\t110976\n老头儿油爆虾\t110977\n事故案\t110978\nNBE\t110979\n林妍\t110980\n集发\t110981\nBiosystems\t110982\n汪曼春\t110983\n乐教\t110984\n纪昌学\t110985\n金正日\t110986\n電器\t110987\n1.7.10mod\t110988\n王下七武海\t110989\n1-6号\t110990\n资询\t110991\n吊塔\t110992\n林木\t110993\nmechanics\t110994\n中国历史故事集\t110995\n角都\t110996\n绿头\t110997\nvap\t110998\n牟氏\t110999\nzexal\t111000\nVore\t111001\n求首\t111002\nCCTV-7\t111003\n吸收式\t111004\n出道\t111005\n东方金诚\t111006\ndianzi\t111007\nuit\t111008\nVFP\t111009\nreis\t111010\n踢脚线\t111011\n耐候胶\t111012\nsided\t111013\njatoolsPrinter\t111014\n巴厘岛\t111015\n金木失忆\t111016\n搞掂\t111017\nC_\t111018\nnoitanym\t111019\n电视局\t111020\n南京万达\t111021\n晖\t111022\nweishang\t111023\n全球史\t111024\n大有\t111025\nFlorist\t111026\n8.3.3\t111027\n一弄\t111028\n寒山僧踪\t111029\n布雷\t111030\n未来软件园\t111031\n会计师\t111032\n天体营\t111033\n程霖\t111034\nAlN\t111035\n煤制\t111036\n红莲湖\t111037\n台版流星花园\t111038\n毛丫丫\t111039\n石口\t111040\n卢焱\t111041\n4d\t111042\n神牌\t111043\n烈山\t111044\n模仿游戏\t111045\nphpstudy2018\t111046\nsustain\t111047\n流动式起重机\t111048\n异味\t111049\n灵武\t111050\ni36100\t111051\n2014-2030年\t111052\n
浙江人事考试网\t111053\n不丹\t111054\n王国强\t111055\nContains\t111056\n祠\t111057\n油位传感器\t111058\nDSF\t111059\n中间点\t111060\nJets\t111061\n宏伊国际广场\t111062\njs类\t111063\n病床\t111064\nReMix\t111065\n丽水古堰画乡\t111066\nbovine\t111067\n百科_秀宠网\t111068\nede\t111069\n学困生\t111070\n戊酸雌二醇片\t111071\ninbox\t111072\n两段\t111073\n热管\t111074\nRoblox\t111075\n水下古城\t111076\nFade\t111077\n月经来潮\t111078\n赵迪\t111079\n悬\t111080\n有限期\t111081\n邮政快递\t111082\n邳州市政府\t111083\nexcel排序\t111084\n口松\t111085\n狂躁症\t111086\n胡泽君\t111087\nhs_err\t111088\n幸免于难\t111089\nBoris\t111090\n互联网思维\t111091\n516号\t111092\n上海仲裁委员会\t111093\ndeng\t111094\n清算权\t111095\n花棚\t111096\n名医\t111097\n退下\t111098\n我乐\t111099\nsurfacepro4\t111100\n珍爱\t111101\n龙泰\t111102\n旋风\t111103\n国际燃气网\t111104\n旅行者\t111105\n6.41\t111106\n发情\t111107\n4月3\t111108\n海丰\t111109\n荣耀7c\t111110\n消糜栓\t111111\n网页版\t111112\nTumblr汤\t111113\nopensource\t111114\n江湖恩仇录\t111115\n六个多月\t111116\n1.2.12785\t111117\n脸红\t111118\n音泰式\t111119\n028\t111120\n一目十行\t111121\n时效\t111122\n大金操\t111123\n鱼雷机\t111124\n马斯喀特\t111125\n91个\t111126\n心蓝\t111127\n数学与应用数学\t111128\n飞声\t111129\n河北省人大\t111130\n垃圾食品\t111131\n9月25日\t111132\nhpf\t111133\n34条\t111134\n西孟加拉邦\t111135\n上海丝足\t111136\n题量\t111137\n尾戒\t111138\n1.5万套\t111139\n多糖\t111140\nGTL\t111141\n平方厘米\t111142\n君威GS\t111143\n北京保安公司\t111144\n官网\t111145\n醉眼\t111146\n6680\t111147\n狼王梦\t111148\n脑海\t111149\n梁诗正\t111150\n60载\t111151\n唐山市人民医院\t111152\n舌质\t111153\n缝包机\t111154\n影音先锋看片网站色_先锋影音av资源_影音先锋\t111155\n吉翁\t111156\n小米mix2S\t111157\n微作文\t111158\n宿华\t111159\n疯啦\t111160\n薛晓峰\t111161\nTXD\t111162\n6000多万\t111163\n白底\t111164\nX70A\t111165\n热敷袋\t111166\nthus\t111167\n操作方法\t111168\nMEVIUS\t111169\n尘袋\t111170\n会歌\t111171\n117.136.16.104\t111172\n公屏\t111173\n19款\t111174\nPNR\t111175\n留步\t111176\n贝伐珠单抗\t111177\n170号\t111178\n恒科汽配网\t111179\n香港金银业贸易场\t111180\n四不像\t111181\n佳能ef\t111182\n薇姿官網\t111183\n博雅计划\t111184\n致富路\t111185\n镍铬合金\t111186\n385号\t111187\n惠崇春江晓景\t111188\nSudoku\t111189\n阿基诺\t111190\n长坊新闻网\t111191\n堕天使\t111192\n最后一幕\t111193\n促教\t111194\n爱奇艺视频\t11
1195\nal00\t111196\n努比亚z11\t111197\n三峡移民\t111198\n互赞\t111199\n和园\t111200\nKawaks\t111201\n富通\t111202\n囚衣\t111203\n波拿巴\t111204\nTIANJIN\t111205\n朱敦儒\t111206\n杨青\t111207\nSKU\t111208\n高仓\t111209\n荣放\t111210\n抗皱\t111211\n333.com\t111212\n阿拉斯加犬\t111213\n山园\t111214\n风标\t111215\n刘梓晨\t111216\nDescribe\t111217\n复华文旅\t111218\n碧血青天杨家将\t111219\nkei\t111220\n下半\t111221\nDistributed\t111222\n尅\t111223\n窝棚\t111224\n恋恋\t111225\n战争学院\t111226\n过膝长靴\t111227\ntrunc\t111228\n1864231\t111229\n生物学通报\t111230\n变频调速器\t111231\n角蛋白\t111232\nfliqlo\t111233\n响片\t111234\n雨山\t111235\nVeryCD娱乐大全\t111236\n遍历\t111237\nGDW\t111238\n1星\t111239\n螳螂捕\t111240\n百强区\t111241\n福宁客\t111242\n鲜卑\t111243\nLORD\t111244\nrepalce\t111245\n王萌萌\t111246\n导播\t111247\n山大华特\t111248\n爪哇语\t111249\n乳乳\t111250\nwin键\t111251\nHD720p\t111252\n养老服\t111253\n价价\t111254\n信口\t111255\n响亮\t111256\n掌动\t111257\n木素\t111258\n华英学校\t111259\n郑州北站\t111260\n8.1.1\t111261\n泛美航空\t111262\n杨吉\t111263\n盘山风景区\t111264\n002400\t111265\n立堂\t111266\n安徽省司法厅\t111267\n四氯化硅\t111268\n叶问2\t111269\n板木\t111270\n加工机\t111271\n偏磁\t111272\n圆通快递\t111273\nwxid\t111274\n热容\t111275\n丝带\t111276\n捷威\t111277\n87128023\t111278\n冻融\t111279\nCyber\t111280\n一两千\t111281\n永乐票务\t111282\n胜负彩投注\t111283\n湿态\t111284\n美琴\t111285\n如何阅读一本书\t111286\n四十周年\t111287\n窝轮\t111288\n很喜欢\t111289\n狮头\t111290\n非罪\t111291\n赠张\t111292\n广州中山大学附属第一医院\t111293\nxcb\t111294\n罗斯科\t111295\n玉米机\t111296\n鬼影实录\t111297\n燃气具\t111298\n我想你\t111299\nReceivable\t111300\n鬃岩\t111301\n吸顶式\t111302\n工科生\t111303\nWorkCentre\t111304\nIGM\t111305\n课会\t111306\n桂工网\t111307\nnike+\t111308\n全祼\t111309\n_巴中传媒网\t111310\n淑\t111311\nthinkcmf5\t111312\n三万多\t111313\n鲜卑族\t111314\n偏题\t111315\n商丘市公安局\t111316\n南昌地铁1号线\t111317\nUHD\t111318\nloose\t111319\nJerry\t111320\n20170809\t111321\n姨父\t111322\n浙报传媒\t111323\n恬妞\t111324\n中转站\t111325\n民革中央\t111326\n铑\t111327\n重力加速度\t111328\n断轴\t111329\n250pp\t111330\n安基\t111331\n有绳\t111332\n湖州论坛\t111333\n滋润\t111334\n染液\t111335\n定下\t111336\n奶结\t111337\n霍莉雪特\t111338\n简章\t111339\n保险公估\t111340\njprofile\
t111341\n配接\t111342\n逆转录病毒\t111343\n真実\t111344\n宠物猴\t111345\n入京\t111346\n123d\t111347\n阻击\t111348\n熟读\t111349\n上海松江\t111350\n护理垫\t111351\nciao\t111352\n货期\t111353\nnewaxis\t111354\n角栓\t111355\n天琥\t111356\n路人\t111357\n透彻\t111358\nneck\t111359\n丽柜厅\t111360\ndlt\t111361\n丁善德\t111362\ndeprecated\t111363\n郑君\t111364\n鱼病\t111365\n复兴镇\t111366\nweaving\t111367\n星神\t111368\n玄风\t111369\nWIn7\t111370\nIEM\t111371\n贾正\t111372\n浮游菌采样器\t111373\n圣马可大教堂\t111374\n投用\t111375\n试谈\t111376\n蜜瘦\t111377\nwin10高分屏\t111378\n借考\t111379\ncompleted\t111380\n中炮过河车\t111381\n海克斯科技\t111382\n植毛\t111383\n习卷\t111384\n陶然亭\t111385\n电压器\t111386\n泉州环保局\t111387\n到期收益率\t111388\n水趣\t111389\n集成度\t111390\n龙胆泻肝丸\t111391\nHomecoming\t111392\n死灵龙\t111393\n九龙仓\t111394\n消退\t111395\n六年内\t111396\n中国国际数码互动娱乐\t111397\n建设史\t111398\n48K\t111399\n可牛\t111400\n_昕\t111401\n微福利\t111402\n河南投资集团\t111403\n空间群\t111404\n降尘\t111405\n屡遭\t111406\n天华百剑斩\t111407\n豆娘\t111408\n郫县豆瓣\t111409\nWalt\t111410\n冷君\t111411\n湖州市政府\t111412\n沪宜公路\t111413\n东奔西跑\t111414\n哈金\t111415\n布里兰\t111416\nMac虚拟机\t111417\n出党\t111418\nBAR\t111419\n董再国\t111420\n振动锤\t111421\n华夏基石\t111422\n圣母颂\t111423\n燧\t111424\n旅游意外险\t111425\n迈欧\t111426\n密云古北水镇\t111427\n超奥特曼八兄弟\t111428\n起敬\t111429\nSMTOWN\t111430\n观湖街道\t111431\nifft\t111432\nv93\t111433\n46P\t111434\n国际惯例\t111435\n进一法\t111436\n783\t111437\n太阳落山\t111438\n恋文\t111439\n筑博\t111440\n广深铁路股份有限公司\t111441\n屎棍\t111442\n龟龄集\t111443\n颈动脉斑块\t111444\n渗透性\t111445\n第8行\t111446\n融360\t111447\n张芷溪\t111448\n编辑部的故事\t111449\n香港迪士尼乐园酒店\t111450\n湖心亭\t111451\n快闪店\t111452\n纳斯网\t111453\n活力素\t111454\nxpety\t111455\n理查德\t111456\nfdtd\t111457\n调流\t111458\n工程咨询\t111459\n缅甸\t111460\nWire\t111461\nFitTime\t111462\n芙蓉新城\t111463\n18.9\t111464\n凋\t111465\n威博\t111466\n超品\t111467\nm16a4\t111468\nLiterature\t111469\n风塘\t111470\nzhuguowei\t111471\n第六笔\t111472\nerroe\t111473\n郭云鹏\t111474\n池化\t111475\n轩辕非人学园\t111476\n年轻化\t111477\n千八\t111478\nie6\t111479\n优冠\t111480\nalkaline\t111481\n华为手机维修中心\t111482\n零序互感器\t111483\n2017.4.1\t111484\n程啸\t111485\n8袋\t111
486\n初裁\t111487\n国家化\t111488\nps批处理\t111489\n大事\t111490\n法宝\t111491\nAlila\t111492\n吴仪\t111493\n2015年3月\t111494\n自动分拣机\t111495\n北京十二中\t111496\n滚出\t111497\n锋范论坛\t111498\nMINITAB\t111499\ndomoticz\t111500\nlatina\t111501\n永胜县\t111502\n在职研之家\t111503\n江西省国有资产监督管理委员会\t111504\n七杯\t111505\n2018年5月5日\t111506\n意大利杯\t111507\nc99\t111508\n反正面\t111509\n稠州\t111510\n比喻句\t111511\nArthur\t111512\n新华锦\t111513\nRosen\t111514\n爱心暑托班\t111515\nChe\t111516\n西德尼\t111517\n夜市\t111518\n岩窟\t111519\n更改\t111520\ndat文件\t111521\nlsprepost\t111522\n龙裔艺术馆\t111523\n海南电视台\t111524\n5重\t111525\n塔尔寺\t111526\n千尊\t111527\n卡瓦\t111528\nSupermicro\t111529\n蹄片\t111530\n穿墙\t111531\niPeen\t111532\n锦龙\t111533\nXss\t111534\n门外汉\t111535\n新华教育集团\t111536\n三尸虫\t111537\n普罗米修斯2\t111538\n简·爱\t111539\n升级篇\t111540\n零陵\t111541\n小说选刊\t111542\n享久\t111543\n原谅\t111544\n林学\t111545\nRobotframeWork\t111546\n斑岩\t111547\n变声\t111548\nISSN\t111549\n王小\t111550\nDMK\t111551\n直冲式\t111552\n缷\t111553\n彭埠镇\t111554\n悠长\t111555\n几\t111556\n毛笔画\t111557\n银河帝国之刃\t111558\n淘巧网\t111559\n黑客攻击\t111560\n250A\t111561\nTat\t111562\npe版\t111563\n应用数学\t111564\nrndis\t111565\n澄城政府\t111566\n清唱剧\t111567\n2冠\t111568\n九元航空\t111569\n基金管理人\t111570\n私有化\t111571\n融创欧麓花园城\t111572\n游隙\t111573\n片岩\t111574\n输送\t111575\ncalayer\t111576\n秘血岛\t111577\neuropa\t111578\nW541\t111579\nlvs\t111580\n汉娜\t111581\n83%\t111582\n艾\t111583\n公共招聘网\t111584\n瑞典隆德大学\t111585\n提存\t111586\n厂庆\t111587\n李光\t111588\n推想\t111589\nremastered\t111590\n吕剧\t111591\n代账\t111592\nTherapist\t111593\n沪昆\t111594\nfmincon函数\t111595\n保鲜膜\t111596\n大珠小珠落玉盘\t111597\n14.04双\t111598\n蛮夷\t111599\n理工大学\t111600\n竞争性\t111601\n裕丰地产\t111602\n民国瓷\t111603\n青亘\t111604\n知子\t111605\nDVS\t111606\n1132\t111607\nhermit\t111608\n发热板\t111609\n杂曲歌辞\t111610\n风冷却器\t111611\n惦\t111612\n散卡\t111613\nRIP协议\t111614\n广丰县\t111615\n周侗\t111616\n1u\t111617\n注字\t111618\n发送机\t111619\n中甸\t111620\n绿钻\t111621\n龙力\t111622\n杀破\t111623\n晨风机器人\t111624\n广东省国资委\t111625\n柴窑\t111626\n爱心基金会\t111627\n张家口职业技术学院\t111628\n九月初\t111629\n暂离\t111630\nhuipu\t111
631\n北京市金杜律师事务所\t111632\n肺科医院\t111633\n风月\t111634\n能力者\t111635\n风神AX7\t111636\n办事机构\t111637\n二十三岁\t111638\n2.5秒\t111639\n上海汇众\t111640\n1欧姆\t111641\n深圳长城宽带\t111642\n缕\t111643\nSixteen\t111644\n北欧\t111645\n八德\t111646\nAlloy\t111647\n甘其\t111648\n20170626\t111649\n料锯\t111650\nPlantation\t111651\n中心库\t111652\n长隆横琴湾酒店\t111653\n喷气机\t111654\n罗俊\t111655\n军械\t111656\n很丑\t111657\n投行部\t111658\njys\t111659\nJIA\t111660\n梁段\t111661\n华尔兹\t111662\nApache\t111663\n开矿\t111664\n困乏\t111665\nradar\t111666\n吕超\t111667\n巨根\t111668\n维尼\t111669\nbo妞\t111670\n第二学位\t111671\nAsh\t111672\ncultivation\t111673\n渗碳\t111674\n郝平\t111675\n石板路\t111676\n正相\t111677\n胜制\t111678\n流水线\t111679\n接代\t111680\n32步\t111681\n第2波\t111682\n蜗牛霜\t111683\n合金装备5原爆点\t111684\n富矿\t111685\n张灏\t111686\n电动尾\t111687\np4000\t111688\nPro版\t111689\n张家浜\t111690\n佛山市政府\t111691\n新加坡小学\t111692\n费德\t111693\npoller\t111694\n贡献者\t111695\n联想昭阳K22\t111696\nchandler\t111697\n10KB\t111698\n条形\t111699\nImmigrant\t111700\n柳树叶\t111701\n玄奘\t111702\n废乳化液\t111703\n重试\t111704\nplist\t111705\n优泌林\t111706\n科医院\t111707\n瞬杀\t111708\nPure\t111709\n边户\t111710\napart\t111711\n浙江工贸职业技术学院\t111712\ngaodu\t111713\n资质\t111714\n借腹\t111715\n德商\t111716\n定远县\t111717\n暗黑血统3\t111718\n东方中科\t111719\n闪粉\t111720\nBeijing\t111721\n致礼\t111722\n正科级\t111723\n真瓷\t111724\n聚光科技\t111725\n谷轮压缩机\t111726\nscrollview\t111727\nsprout\t111728\n甲氨蝶呤片\t111729\n尘世\t111730\n赵公口\t111731\n董卿尼古拉斯凯奇\t111732\nExams\t111733\n截水\t111734\nHTTPD\t111735\n注液机\t111736\n楷模\t111737\n孙艺洲\t111738\nstrawberry\t111739\n横路\t111740\n精品软件\t111741\n西汉高速\t111742\njr们\t111743\n博天环境\t111744\n4月27日晚\t111745\n安山岩\t111746\n一百年后\t111747\n代还款\t111748\n拼接式\t111749\n翡翠湾\t111750\nEZpad\t111751\n透明质酸钠\t111752\nmapping\t111753\ngovernment\t111754\n天机7\t111755\n席宝儿\t111756\nZye\t111757\n如来神掌\t111758\n六道轮回\t111759\n20171126\t111760\nrow\t111761\ntonic\t111762\n地下城与勇士吧\t111763\ncircle\t111764\n负离子空气净化器\t111765\n10点半\t111766\n机械硬盘分区\t111767\n试玩版\t111768\n八种\t111769\n十一中\t111770\n唇腭裂\t111771\n青桐\t111772\n倒过来\t111773\nie8.0\
t111774\n四胡\t111775\n激流\t111776\n文绣\t111777\n穿堂风\t111778\n海棠\t111779\n入伍\t111780\n飞龙\t111781\n北京租车公司\t111782\n组子\t111783\n宣和\t111784\n龙泉乡\t111785\n0518\t111786\n广州市市\t111787\n角膜\t111788\n国奖\t111789\nMusicRadio\t111790\n杆位\t111791\n媚\t111792\n安卡\t111793\n30多岁\t111794\n区科技局\t111795\nZIPPO\t111796\njs-CSDN\t111797\n印纪传媒\t111798\n六角螺栓\t111799\n冤屈\t111800\n中午12点\t111801\n小市民\t111802\n9800元\t111803\nrockwell\t111804\n付与\t111805\n第10张\t111806\n青马\t111807\n户县\t111808\nCJT\t111809\nFATE\t111810\n人情味\t111811\n姬神\t111812\n王文忠\t111813\n策划部\t111814\n龙人\t111815\nv型\t111816\n50套\t111817\nLX\t111818\n倒手\t111819\n面相学\t111820\n宵\t111821\n国网电力\t111822\nQFile\t111823\n宝马X1论坛\t111824\ndn50\t111825\nshengyi\t111826\n浮沙\t111827\n徐杰\t111828\n孽恋\t111829\n大镇\t111830\n复工\t111831\n灵枢\t111832\n逗游\t111833\n建筑展\t111834\niihs\t111835\ncaching_sha2\t111836\ndiscord\t111837\n服务局\t111838\n阮郎归\t111839\n龙纹\t111840\n七个\t111841\n3月中旬\t111842\n全国\t111843\n法籍\t111844\n刺毛\t111845\n毁灭系\t111846\n几万公里\t111847\n功德\t111848\n钢琴版\t111849\nMOP\t111850\n台鸽\t111851\npvf\t111852\nMalcolm\t111853\n欢送会\t111854\n百花成蜜\t111855\nLUXE\t111856\n冀中能源集团\t111857\n虞山人才网\t111858\n卡方分布_\t111859\n11月17日\t111860\n越南黄花梨\t111861\n惠州市政府网\t111862\n宁波市商务委员会\t111863\n约克大学\t111864\n小指头\t111865\n穷\t111866\n中南财经大学\t111867\n虞山\t111868\n合肥城隍庙\t111869\n接待量\t111870\nSafari浏览器一键翻译\t111871\nscroll-view\t111872\n折现\t111873\n河北人\t111874\n肋间神经炎\t111875\n骗子\t111876\n138路\t111877\n势头\t111878\n根据\t111879\n广告架\t111880\nDouglas\t111881\n迷恋\t111882\n周君\t111883\n蒋四金\t111884\n顾超\t111885\n元气骑士\t111886\nhypersnap\t111887\n工矿\t111888\n黄花岗起义\t111889\nUlysses\t111890\n元旦\t111891\n开森\t111892\n高速费\t111893\nLCDHOME论坛\t111894\n哈姆太郎\t111895\n新宁\t111896\n肉松小贝\t111897\n评点\t111898\n长生殿\t111899\n30凌渡\t111900\n银科\t111901\n不可挽回\t111902\n华东理工大学\t111903\n最快捷\t111904\n气滞血瘀\t111905\n胰十二指肠切除术\t111906\nCJ\t111907\n血红蛋白量\t111908\n非墨\t111909\n山妖\t111910\n肆意\t111911\n261号\t111912\n曹州\t111913\n李存勖\t111914\n須\t111915\n普益\t111916\n清产核资\t111917\n约不约\t111918\n肃纪\t111919\n一本\t111920\n徕卡m10
\t111921\n十二步\t111922\nz170x\t111923\n标致308S\t111924\n第24页\t111925\ncharge3\t111926\n燕莎奥特莱斯\t111927\n比德文\t111928\n春暧\t111929\n啜\t111930\n阻力位\t111931\ndaterange\t111932\n12.99\t111933\n8139\t111934\n中签\t111935\n∠\t111936\n苦水玫瑰\t111937\n狂喜\t111938\n二十度\t111939\n小表\t111940\n桃花酒\t111941\n落叶树\t111942\n车长\t111943\n室内空间设计\t111944\n鸳鸯鞋\t111945\n尤浩然\t111946\n尼塔库\t111947\n云界之乱\t111948\n道医\t111949\n监控项\t111950\n旗帜机\t111951\n花墙\t111952\n佛说无量寿经\t111953\n懂你\t111954\n柳州市中医院\t111955\nnbu\t111956\n石碑\t111957\n六盘\t111958\n净身出户\t111959\n90式\t111960\n桃仙机场\t111961\n张桥镇\t111962\n张家川回族自治县人民政府\t111963\nStarr\t111964\n长命\t111965\n存销\t111966\n太坑\t111967\n概念学习\t111968\n借箭\t111969\n路锥\t111970\n社保网\t111971\n补件\t111972\n工业品\t111973\n400题\t111974\n哈密路\t111975\npreventive\t111976\n大婶们\t111977\nq345b\t111978\n微指数\t111979\n我等\t111980\n洛斯里克\t111981\n3shape\t111982\nWindowsServer2012\t111983\n计划生育局\t111984\n编辑版\t111985\n龙裔DLC\t111986\nGOING\t111987\n第7次\t111988\n)科技股份有限公司\t111989\nP图\t111990\n中捷产业园区\t111991\n601\t111992\nPS4Pro\t111993\n中圈\t111994\nnumpy库\t111995\n护士装\t111996\n南天国实验室\t111997\n亲和力\t111998\n张阳\t111999\nasset\t112000\n中川镇\t112001\n花结\t112002\n星之卡比镜\t112003\n万都\t112004\n增福\t112005\n一棵\t112006\n三胖\t112007\n12小时内\t112008\n凄苦\t112009\n12.1.0.2\t112010\n四会市\t112011\n秋叶原\t112012\n盲打\t112013\n拌匀\t112014\nbiao\t112015\neco\t112016\n南沙自贸区\t112017\n奥伦达\t112018\n丙烯腈\t112019\n马路市场\t112020\n戴帽\t112021\n10发\t112022\n伊娜\t112023\nantv\t112024\n求名侦探柯南\t112025\nNoSQL数据库\t112026\n精灵宝可梦XY\t112027\n票子\t112028\nclot\t112029\n京剧猫\t112030\n畜产品\t112031\n1483\t112032\n第二天晚上\t112033\n七三\t112034\n时事\t112035\n首义路\t112036\n小妮子\t112037\n灌封胶\t112038\n薄壁管\t112039\n为载体\t112040\n4501\t112041\n下天\t112042\n惊悚双城记\t112043\n湖北省司法厅\t112044\n辛未\t112045\n长桥\t112046\n2018年清明节\t112047\n雷凌妖神记\t112048\n录客阁\t112049\n老董\t112050\n假面骑士555\t112051\n3260\t112052\n600年前\t112053\n城阳区\t112054\n滦县\t112055\n醉书\t112056\n台州市\t112057\nfermentation\t112058\nJAVA编程语言程序\t112059\n拉回\t112060\n康掌柜体检网\t112061\n倡导\t112062\n240万源\t112063\npor\t112064\n追念\t
112065\n蒙迪欧致胜\t112066\n乌拉特后旗\t112067\nborn\t112068\n桐柏山\t112069\n电动巡逻车\t112070\n素还真\t112071\n未来酒店\t112072\n魂灯\t112073\n何英\t112074\n绣娘\t112075\n电感\t112076\n紫缘\t112077\n虫友\t112078\n被抓\t112079\nMOVING\t112080\npixel2\t112081\n费洛蒙\t112082\n霜天\t112083\n拓邦股份\t112084\nzeiss\t112085\n走顶\t112086\n宽哥\t112087\n海德拉巴\t112088\n美蓝\t112089\n万源市\t112090\n第6感\t112091\n华县\t112092\n精\t112093\n哈尔滨万达\t112094\nsdzs\t112095\nListed\t112096\npp管\t112097\n生态型\t112098\n富爸爸穷爸爸\t112099\nknoll\t112100\n极至\t112101\n大众脸\t112102\n酷炫体坛\t112103\n睛姿\t112104\n经贸委\t112105\n江门地区\t112106\n广东中烟工业有限责任公司\t112107\n体育业\t112108\n丰台报数字报\t112109\n合肥华研白癜风医院\t112110\n张晓云\t112111\n焕驰\t112112\n工业和信息化部人才交流中心\t112113\n应收账款核销\t112114\nWinDBG\t112115\n二之国2\t112116\n数据浏览器\t112117\n外冈\t112118\nPLM\t112119\n南朗镇\t112120\n销轴\t112121\n生态工业园区\t112122\n扣图\t112123\nAllowance\t112124\n梁旭\t112125\n杰奇\t112126\n牧夫天文论坛\t112127\n巩\t112128\n计画\t112129\n文科\t112130\n20150926\t112131\n遁形\t112132\ntreasure\t112133\n骷髅头\t112134\n茶物语\t112135\n稀奇\t112136\n揉\t112137\n贺炳炎\t112138\n图片素\t112139\n铜鼎\t112140\n望京街道\t112141\n德甲\t112142\n长谷川美红\t112143\nCMC\t112144\n杨海玲\t112145\n双街镇\t112146\n大庭广众\t112147\n磷酸盐缓冲液\t112148\n菜单项\t112149\n人名条\t112150\n武\t112151\n箱底\t112152\nrar\t112153\n矫直机\t112154\nHundred\t112155\n大明虾\t112156\n15w\t112157\n军委\t112158\n桃园社区\t112159\n张岭\t112160\n致命武器\t112161\n申通\t112162\n邓晓芒\t112163\n万事大吉\t112164\n莫大\t112165\n恒大人寿\t112166\n收文\t112167\n郑汴\t112168\n长生生物\t112169\n杨素\t112170\n二氧六环\t112171\nDeer\t112172\n家电\t112173\nfullgc\t112174\nhahazexia\t112175\n桌布\t112176\n2018年4月21日\t112177\n陕西凉皮\t112178\n绿韵\t112179\n巨噬细胞\t112180\n慕晴叶天熠\t112181\nstringr\t112182\n钟南街\t112183\n问问\t112184\n网侠\t112185\n倒读\t112186\n卡索\t112187\njeepxie\t112188\nLiteracy\t112189\n实验小学\t112190\n原告\t112191\n第三十章\t112192\n全明\t112193\n洛圣\t112194\n乐龄网\t112195\n邓散木\t112196\n当你老了\t112197\n137路\t112198\n鸿星\t112199\n名烟\t112200\n娘们儿\t112201\n六感\t112202\n中城\t112203\nbll\t112204\n三荣\t112205\nhrp\t112206\n亿方云\t112207\n金力\t112208\n周淑怡\t112209\n梅芙\t112210\n直流电源\t112211\n第6\t112
212\n搞笑大小王\t112213\nhp1007\t112214\n讯捷\t112215\n上海知识产权法院\t112216\nQTIS\t112217\n1mg\t112218\n结核菌\t112219\n回心转意\t112220\n早发白帝城\t112221\n数万元\t112222\n行李袋\t112223\n802.11b\t112224\nEmmanuel\t112225\n王兰庄\t112226\n走着走着就散了\t112227\n盖州\t112228\nWalking\t112229\n束起\t112230\n台州市住房和城乡建设规划局\t112231\n孔子\t112232\n小黄鸭\t112233\n审减\t112234\n后验\t112235\ntoeic\t112236\nmri\t112237\n桥林街道\t112238\n长江职业学院\t112239\n黑莓q10\t112240\n易CRM\t112241\nkohana\t112242\nTriangle\t112243\n互联网+时代\t112244\nCapitalism\t112245\n远方的朋友\t112246\n夜的钢琴曲\t112247\nCardView\t112248\n跨国并购\t112249\nps5\t112250\n白雅\t112251\n非公益性\t112252\n火车大劫案\t112253\n庚午\t112254\n空行母\t112255\n依达拉\t112256\n王雨辰\t112257\n造梦西游大闹天庭篇\t112258\n断源\t112259\n金钟民\t112260\n吃豆\t112261\nPickerView\t112262\n杨晶\t112263\n国务院参事室\t112264\n个人独资企业\t112265\n战国\t112266\n2544\t112267\n深圳福田中心\t112268\n苏新诗\t112269\nStrings\t112270\n雷蒙·斯尼奇\t112271\n数以万计\t112272\n十三名\t112273\n该村\t112274\n领导干部\t112275\ndemonstration\t112276\n合并行\t112277\n广泰\t112278\n新课标人教\t112279\n尧三青\t112280\n咔咔\t112281\n汊沽港镇\t112282\nprerequisites\t112283\n高级编程\t112284\n钢纤维\t112285\n重庆工商银行\t112286\n水暖\t112287\n雷人囧图\t112288\nDane\t112289\n超平\t112290\nSolidity\t112291\n海伦市\t112292\n石榴姐\t112293\n奴役\t112294\nappkey\t112295\n好丈夫\t112296\n网络包\t112297\n孙红雷\t112298\n非洲堇\t112299\n舞起来\t112300\n拖板\t112301\n铝碳酸镁咀嚼片\t112302\n吊顶龙骨\t112303\nresidues\t112304\n29厘米\t112305\n东源县\t112306\nITeye\t112307\n炀\t112308\n2018年4月15号\t112309\n漏电保护器\t112310\n氩弧焊\t112311\n详式\t112312\n凯宾斯基酒店\t112313\n中信股份\t112314\n华师一附中\t112315\n电邮\t112316\nHatano\t112317\nORA-00001\t112318\n法治政府\t112319\n口疼\t112320\n远观\t112321\nmehr\t112322\ninfineon\t112323\n裱花蛋糕\t112324\nmaya2016\t112325\nEX5\t112326\n4例\t112327\n吴独尊\t112328\nsavings\t112329\n甄士隐\t112330\n86名\t112331\n93分\t112332\n150kw\t112333\n苍龙逐日\t112334\n眼观六路\t112335\n联氨\t112336\n黄海路\t112337\n动生\t112338\n飞龙在天\t112339\n老院子\t112340\n休未休\t112341\n包住\t112342\n龙湾\t112343\n叠罗汉\t112344\n繁文缛节\t112345\ncas-server\t112346\n求扫\t112347\n京绣\t112348\n别斯兰\t112349\n油膏\t112350\n铝盒\t112351\n69中文网
\t112352\n百度财富值\t112353\n三国志10吧\t112354\n青金石\t112355\n寻欢作乐\t112356\n加热型\t112357\n焗鸡\t112358\nKconfig\t112359\n6607\t112360\n热腾腾\t112361\n已足够\t112362\n石缝\t112363\n月见黑\t112364\n1幢\t112365\n石头门\t112366\nzzbm\t112367\n平湖火车站\t112368\n世界篇\t112369\n跑狗图\t112370\n工程建设标准强制性条文\t112371\n哈尔科夫\t112372\n中建一局集团\t112373\n恒星\t112374\n富源县\t112375\n萧井陌\t112376\n7388\t112377\n横店\t112378\n圆白菜\t112379\n000882\t112380\n耽搁\t112381\n厂\t112382\n搅拌式\t112383\n名科\t112384\n九格\t112385\n影人\t112386\n李章洙\t112387\n麦克唐纳\t112388\n十几岁\t112389\n本多\t112390\n2017-10-24\t112391\n雅韵\t112392\n五一庆\t112393\n中国雨湖网\t112394\n荣耀认证\t112395\nmio\t112396\n苏晨\t112397\n757\t112398\n拆锁\t112399\n潘汉年\t112400\nComputer\t112401\n305sh\t112402\n放手一搏\t112403\n伽梵蛇\t112404\n区位码\t112405\nFastReport\t112406\n考夫曼\t112407\n孔林\t112408\n汉江大桥\t112409\n初始污染菌\t112410\n赴台\t112411\n证券基金\t112412\n灯彩\t112413\n搞笑短信网\t112414\ncomposite\t112415\n伯恩山\t112416\n斯柯达柯珞克\t112417\n乌鲁木齐市公安局\t112418\nkt3\t112419\nLaptop\t112420\nsis001\t112421\n商务大厦\t112422\nJDK1.8\t112423\n住房补贴\t112424\n1869\t112425\n午时\t112426\n解危\t112427\n一曰\t112428\njetpack\t112429\n沙棘果\t112430\n管理人员\t112431\n军规\t112432\n熊荣川\t112433\nPromotions\t112434\n上海国际舞蹈中心\t112435\n字模\t112436\n软磨硬泡\t112437\n表演秀\t112438\n肠上皮化生\t112439\n英雄萨姆2\t112440\n李俊昊\t112441\nv14.0\t112442\n黑龙江法院\t112443\n82期\t112444\n开眼\t112445\n小立\t112446\n打拼\t112447\n小三角\t112448\nEmory\t112449\n天戈朱\t112450\n疾速\t112451\nV1\t112452\n门第\t112453\n命运之夜\t112454\n警犬巴\t112455\n美世\t112456\n阿里巴巴公司\t112457\n2夜\t112458\n随母\t112459\n华润电力\t112460\n勋章\t112461\n演奏家\t112462\n花朝节\t112463\nbuild_query\t112464\n随机点\t112465\n霓裳羽衣曲\t112466\n丝印机\t112467\n穆斯\t112468\n成华\t112469\n叶浅予\t112470\n迷彩服\t112471\n灭绝\t112472\n工字电感\t112473\n浙江日报\t112474\n星期八\t112475\n大连亿达\t112476\n环境公益诉讼\t112477\n黎安\t112478\n_莱州政府网\t112479\n白歆惠\t112480\n灵与欲\t112481\n宛平城\t112482\nCN314智能生活网\t112483\n成欢\t112484\n日本天皇\t112485\nHUAWEI\t112486\n0.01元\t112487\n拿完\t112488\nROM刷机包\t112489\n20.com\t112490\npvc-u\t112491\nWNFK\t112492\n伟杰\t112493\n澳洲大学\t112494\nDk\t112495\nsvchost.e
xe\t112496\n定妆粉\t112497\n10.3.5\t112498\n小泡芙\t112499\n信息系统工程\t112500\n非生僻\t112501\n890\t112502\n付笛生\t112503\n雪枫\t112504\n中国浦东干部学院\t112505\n海上海\t112506\n鲁迪盖伊\t112507\n炮击\t112508\n朱智勋\t112509\nNAOH\t112510\n泥炭土\t112511\n湖北博物馆\t112512\n伙牌\t112513\n青青\t112514\nF1-F12\t112515\n肠管\t112516\n庆应\t112517\nBD720P\t112518\nteeth\t112519\n幸福家园\t112520\n涛动周期论\t112521\ndiversification\t112522\n湖北农村义务教育学校\t112523\n张新芳\t112524\n大业信托\t112525\n1.5A\t112526\n二零一七\t112527\n四川日报网\t112528\n吉姆尼\t112529\n数据地\t112530\n情杀案\t112531\n21种\t112532\nbns\t112533\n江西电力\t112534\n新优娱乐\t112535\n信用币\t112536\n金克拉\t112537\n挠\t112538\n剑网牧马人\t112539\n蒜香排骨\t112540\n谜踪之国\t112541\n10进制数\t112542\n商务照\t112543\n北京苹果\t112544\nAlone\t112545\n鸡柳\t112546\n52wpe\t112547\n绿气\t112548\n聚丙烯酰胺凝胶电泳\t112549\n速度感\t112550\n38000元\t112551\n新中大\t112552\nL460\t112553\n古皂\t112554\n三龙\t112555\n主编\t112556\n3百万\t112557\ncyberlink\t112558\n张滨\t112559\n反对形式主义\t112560\n思如\t112561\n银石\t112562\n午夜\t112563\n说话说\t112564\n护颈枕\t112565\n补气\t112566\n龙潭\t112567\nebdoor\t112568\n汝矣岛\t112569\n街舞#\t112570\n穆桂英\t112571\n童声版\t112572\n凯撒旅游网\t112573\n德人\t112574\n维多利亚瀑布\t112575\nextras\t112576\n千山风景区\t112577\n180斤\t112578\n1g\t112579\n徐惠民\t112580\n惨叫声\t112581\n免播放\t112582\n奥赛罗\t112583\n凤翔县\t112584\n西湖街道\t112585\nerrorcode\t112586\n舜宇集团\t112587\nv2.1.7\t112588\n电大电大\t112589\n旧梦\t112590\n侠盗猎车手3\t112591\n咳嗽\t112592\n天井之娃\t112593\n拔节\t112594\n喜结网\t112595\n紫夜\t112596\n钊\t112597\n设变\t112598\nuasb\t112599\n地铁6号线\t112600\nUps\t112601\nLEGO\t112602\n异性缘\t112603\n八十多\t112604\nMDK-ARM\t112605\n恩恩\t112606\naccommodate\t112607\n巨人国\t112608\n新乡市中心医院\t112609\n温州政府\t112610\nLaCie\t112611\n大幅面打印机\t112612\n十块\t112613\n修补\t112614\nHttp请求\t112615\n田块\t112616\n共板\t112617\n江苏省餐饮行业协会\t112618\nATLUS\t112619\nparams\t112620\n中共十七大\t112621\n晶振电路\t112622\n云逸\t112623\n输入功率\t112624\nliaoning\t112625\n爱普生l351\t112626\n白雾\t112627\n盐酸莫西沙星片\t112628\n拍立淘\t112629\n沙盘\t112630\nyura\t112631\n敌后武工队\t112632\n末日乐园\t112633\nhence\t112634\n管理费用率\t112635\nfbi\t112636\n鼾症\t112637\n1000件\t112638\n
vtune\t112639\n2017党支部组织生活会\t112640\n国电集团\t112641\nlesson\t112642\n爱上瘾\t112643\nsupergirl\t112644\n数字电子技术\t112645\nflooding\t112646\njv\t112647\nwdc\t112648\nLTE\t112649\n好在\t112650\n来到\t112651\n嘴笨\t112652\n导乐\t112653\n去眼袋手术\t112654\n大对决\t112655\n潇瑞\t112656\n第17位\t112657\n阿米洛\t112658\n亚热带季风气候\t112659\nfmdb\t112660\nSpeaks\t112661\n101#\t112662\n热剧\t112663\n安保员\t112664\n翻船\t112665\ntheamleaf\t112666\n祖师爷\t112667\n材料系\t112668\n梁家河\t112669\n苯丙酮\t112670\nLog4Net\t112671\n乳腺增生症\t112672\nm226dn\t112673\n来后到\t112674\n股指\t112675\n天葵\t112676\n另一个子\t112677\n建设工程教育网\t112678\n权力的游戏第六季\t112679\n8684邮编查询网\t112680\n爱客\t112681\n许佳挺\t112682\n运途\t112683\nmtbf\t112684\n滨海经济开发区\t112685\n起兴\t112686\n毛里求斯\t112687\n11p\t112688\nKVM切换器\t112689\n_川\t112690\n防盗窗\t112691\nSynapse\t112692\n1200吨\t112693\n阿特柔斯\t112694\n三十八年\t112695\n通大厦\t112696\n归零码\t112697\nwebstorm\t112698\n九十天\t112699\n门上\t112700\nZARA\t112701\n一整列\t112702\n乌当区政府网\t112703\n变革型\t112704\n世界杯足球赛\t112705\naxios\t112706\n刘玉栋\t112707\nSpectral\t112708\n宫门\t112709\n陈建波\t112710\n捷足先登\t112711\n胧车\t112712\n丝席\t112713\n翁卷\t112714\nshing\t112715\n杜汶泽\t112716\n日本药妆店\t112717\n楚王陵\t112718\n预订单\t112719\n豆豉\t112720\n苟利\t112721\n诺曼\t112722\nC206\t112723\n澳大利亚旅游局\t112724\n费事\t112725\nMinority\t112726\ntaohuazu\t112727\n一行\t112728\n苏州市中医院\t112729\nVV5s\t112730\n声压\t112731\n拍买\t112732\n80例\t112733\n嘉文\t112734\n天合联盟\t112735\n渣场\t112736\n仿美团\t112737\ndomain\t112738\nZephyrus\t112739\n63条\t112740\n高汤\t112741\n绿箭侠第四季\t112742\nARP\t112743\nモデルコレクション\t112744\nDoku\t112745\n小狐狸\t112746\n荣丰控股\t112747\n审时度势\t112748\nErlang\t112749\n三策\t112750\n大藤峡\t112751\n床罩\t112752\n康艺\t112753\nlcrypto\t112754\n两发\t112755\n移库\t112756\n俄航\t112757\n威驰fs\t112758\nwss\t112759\n枫叶国\t112760\n盐阜大众报\t112761\n焚天\t112762\n婴幼\t112763\n早安问候语\t112764\n25th\t112765\n开解\t112766\n见习生\t112767\n宠物训练师\t112768\n货比三家\t112769\n押着\t112770\n程朱\t112771\n泄火\t112772\n壁龛\t112773\n莫卧儿\t112774\n修罗王\t112775\nusm\t112776\niai\t112777\nzabbix3.0\t112778\n松科\t112779\nTshare365\t112780\n1.12.0\t112781\n
中山大学中山医学院\t112782\nread_excel\t112783\n合作社\t112784\n近畿\t112785\n1万亩\t112786\n希子\t112787\nlijie\t112788\n庐山\t112789\n赵锦\t112790\n杜宇\t112791\n手推式\t112792\n00:00\t112793\n天山网\t112794\n秘密乐园\t112795\nasterisk\t112796\nroute\t112797\nlaprairie\t112798\n5th\t112799\n精河\t112800\n手套机\t112801\n思利\t112802\n六角亭\t112803\n加息券\t112804\n塔西佗\t112805\n%1\t112806\n高一_无忧考网\t112807\nappserver\t112808\n少女队\t112809\n刨面\t112810\n浪奇\t112811\n那巴\t112812\n温州一家人\t112813\n安处\t112814\n一块地\t112815\n小集\t112816\n粵語\t112817\n送戏\t112818\n邯郸市教育局\t112819\n4031\t112820\n_舍得街\t112821\ngrand\t112822\n单肩包\t112823\n洛阳桥\t112824\n凯励程\t112825\n交楼\t112826\nNBE游戏论坛\t112827\n刺沙\t112828\n中国科学技术大学数学科学学院\t112829\nmamiya\t112830\nAbloh\t112831\n星际战甲国际服\t112832\n変態\t112833\nsymptoms\t112834\n剖宫产术\t112835\n一个400\t112836\nXd\t112837\n3.2米\t112838\n上海复旦大学附属眼耳鼻喉科医院\t112839\n电动调节阀\t112840\n微量\t112841\njdwx\t112842\n森源\t112843\n社会调查\t112844\nxt1085\t112845\n开腔\t112846\n旧石器时代\t112847\n闯入\t112848\n遗忘\t112849\n1.29G\t112850\n宿松路\t112851\n鸭川\t112852\n膏剂\t112853\n熊\t112854\n31亿元\t112855\n十一五规划\t112856\n暖宫\t112857\nSorting\t112858\n卢新宁\t112859\n林园\t112860\n佳木斯快乐舞步\t112861\n服饰品\t112862\n韩语\t112863\n皮下出血点\t112864\n接装\t112865\n台州市立医院\t112866\n温州网络公司\t112867\n舞蹈队\t112868\n巨长\t112869\nwww.cr173.com\t112870\n泰勒级数\t112871\n4席\t112872\n实验学\t112873\n秘本\t112874\n陕煤\t112875\n张士平\t112876\n摆动式\t112877\nx-100\t112878\n陆犯焉识\t112879\n理水\t112880\n協會\t112881\n制瓷\t112882\n猪口蹄疫\t112883\n影视化\t112884\n能级\t112885\n龙鳞纹\t112886\n完全燃烧\t112887\nDiplomat\t112888\nvirtual\t112889\n开神\t112890\nPorter\t112891\n程稼夫\t112892\n仿冒\t112893\n中间层\t112894\ntreegrid\t112895\n金本位\t112896\n杨建忠\t112897\n闲值\t112898\n先攻\t112899\ndelet\t112900\ngresql\t112901\n重操旧业\t112902\n350\t112903\n推挤\t112904\n量学\t112905\n号段\t112906\n标称值\t112907\n夏耕\t112908\n图式理论\t112909\n福建商学院\t112910\niscroll5\t112911\n伺服减速机\t112912\nd3d9.dll\t112913\n城关镇\t112914\nOpposite\t112915\n东盟博览会\t112916\nSOLID\t112917\nGrokbase\t112918\n邓光\t112919\nMIO\t112920\nmtalab\t112921\n尚东\t112922\n狗东\t112923\n合称\t11292
4\n捆扎机\t112925\nlmp\t112926\n炒币\t112927\n学机\t112928\n高光机\t112929\n蜜思\t112930\n体育节\t112931\n[德\t112932\n疥螨\t112933\n人民银行征信中心\t112934\nOBJ\t112935\n思想性\t112936\n短信群\t112937\n眼镜店\t112938\n两把刷子\t112939\n每每\t112940\n第七轮\t112941\n离家出走\t112942\n深南大道\t112943\n4卷\t112944\n广东生态工程职业学院\t112945\n小帅\t112946\nalphago\t112947\n东大路\t112948\n5v2a\t112949\n适量\t112950\nStrip\t112951\n医疗器械\t112952\n创可\t112953\n零岁\t112954\n西部数码域名论坛\t112955\n1828\t112956\n营口市政府\t112957\n城防\t112958\n李泰林\t112959\n冰滴咖啡\t112960\nINK\t112961\n物业税\t112962\n中国通信\t112963\n局部战争\t112964\n破镜重圆\t112965\n2.1a\t112966\nssize\t112967\n购车\t112968\n刷业\t112969\n月光启蒙\t112970\n_路亚吧\t112971\n次项\t112972\n几十兆\t112973\n特拉华大学\t112974\n视图\t112975\n无面者\t112976\n中华人民共和国公司登记管理条例\t112977\n少年汉尼拔\t112978\n录音机\t112979\n七零\t112980\n汾酒集团\t112981\n】子\t112982\n双柏\t112983\n新研股份\t112984\nv5.7.0\t112985\n六宫格\t112986\n能看\t112987\ncv2.imread\t112988\n都铎王朝\t112989\n24式简化太极拳\t112990\n吴若甫\t112991\n雪霜\t112992\nManufacture\t112993\nNasty\t112994\n红楼遗秘\t112995\n不中用\t112996\n47条\t112997\n伞柄\t112998\n第39话\t112999\n闯王\t113000\n10.7.3\t113001\n爱普生\t113002\n抄写\t113003\n电子档杆\t113004\n2017年11月1日起\t113005\n家家悦\t113006\n吕祖灵\t113007\n48家\t113008\nメン\t113009\n1557\t113010\n逃生者\t113011\n安全第一\t113012\n_性\t113013\nRename\t113014\n杨学祥\t113015\n洛阳火车站\t113016\n会宁\t113017\n天恩\t113018\nhudie\t113019\n达摩卡\t113020\n95分\t113021\n牆\t113022\n腿哥\t113023\n5.01\t113024\n别墅酒店\t113025\n大海贼物语\t113026\n套料\t113027\n育儿\t113028\n4G+\t113029\nH-Mate\t113030\n甘肃\t113031\n电脑维修网\t113032\n化学工业区\t113033\n大丰收\t113034\n异世之风流大法师\t113035\n硫酸氢氯吡格雷片\t113036\n腾讯科技\t113037\nosip\t113038\n性板\t113039\n禁闭\t113040\n申博\t113041\n丰翔路\t113042\n蝶恋花\t113043\n抛尸案\t113044\n7.9\t113045\n瞻岐镇\t113046\nseattle\t113047\ntms320f28335\t113048\nUISlider\t113049\n930\t113050\n2017.1.3\t113051\n一瞥\t113052\n安托鲁斯\t113053\n游击战争\t113054\n育儿大作战\t113055\n25句\t113056\n古老\t113057\n规图\t113058\n熊辉\t113059\n贺飞\t113060\nqiagen\t113061\n旧社会\t113062\n量比\t113063\n不仅仅\t113064\n冰子\t113065\n篓子\t113066\n软碟通\t113067\n22秒\t113068\n第79\t11306
9\nMYBATIS\t113070\n伊瑟拉\t113071\n桥头\t113072\n神秘感\t113073\n戛纳电影节\t113074\n桃味\t113075\nqin\t113076\n2017上半年\t113077\nvrealize\t113078\nWorkstations\t113079\n华晨宝\t113080\n连信\t113081\niphne\t113082\none-hot\t113083\n胶原蛋白口服液\t113084\n3.00\t113085\n队医\t113086\n两例\t113087\n圣战群英传2\t113088\n汇算\t113089\nfiref\t113090\n基佬大乱斗\t113091\n第一重\t113092\n乐风论坛_汽车之家论坛\t113093\n成人高考信息网\t113094\n蕃茄\t113095\n梅果\t113096\n重庆市扶贫开发办公室\t113097\n八十年\t113098\nv7.1\t113099\nmootools\t113100\nt3t4\t113101\n袒胸露乳\t113102\n龚磊\t113103\n基蛋\t113104\n波束\t113105\n6旬\t113106\n右方\t113107\n苏兰\t113108\nextremely\t113109\n吉他谱六线谱\t113110\n福田街道\t113111\n蓝波湾\t113112\n歪路\t113113\n悬停\t113114\n余姚市区\t113115\n鹤壁\t113116\nKlipsch\t113117\n马柱\t113118\nIPOD\t113119\n举重\t113120\n画画\t113121\n银石广场\t113122\n2源\t113123\n伪证\t113124\n梁头\t113125\nfeat\t113126\n大庆市公安局\t113127\n少年英雄小哪吒\t113128\n舟曲县人民政府\t113129\n冻梨\t113130\nPHARMACEUTICAL\t113131\n发展思路\t113132\n萧山区\t113133\n颜昌海\t113134\nadhd\t113135\n清华大学社会科学学院\t113136\nLiverpool\t113137\n24颗\t113138\n狼狈不堪\t113139\n菲利普科特勒\t113140\n私交\t113141\n4時間\t113142\n新港开发区\t113143\n政历\t113144\n肉盾\t113145\nHopper\t113146\n中天建设\t113147\n中超直播网\t113148\n85亿\t113149\n胰腺囊肿\t113150\n绍兴市人民政府\t113151\n球迷网\t113152\n投行在线吧\t113153\n白酒业\t113154\n小雏\t113155\n前一年\t113156\n雷神托尔\t113157\n氨磺必利片\t113158\n练法\t113159\n干湿分离机\t113160\n蚰蜒\t113161\nSocket编程\t113162\n舞蹈症\t113163\n宿州审计局\t113164\n楚乔传\t113165\nIbanez\t113166\n谢尔曼\t113167\n舰载\t113168\n俗话说\t113169\n1.24e\t113170\n德氏\t113171\n800级\t113172\n调弦\t113173\ntoledo\t113174\n非常静\t113175\nExchange2013\t113176\nesp\t113177\n季军赛\t113178\n才艺秀\t113179\n纤维\t113180\n曾都区\t113181\n李岚清\t113182\n鬼谷无双\t113183\n美名腾智能起名网\t113184\n爱错\t113185\n三线格\t113186\n国家基本药物目录\t113187\n增伤\t113188\naspectjweaver\t113189\n百奥\t113190\n敢富网\t113191\n药政\t113192\n龙岩电视台\t113193\n东乡区\t113194\n破碎机\t113195\n琪琳\t113196\n出栈\t113197\n珠海经济技术开发区\t113198\n艾司唑仑片\t113199\nflicker\t113200\n33eee\t113201\n公交车\t113202\n丁酉日\t113203\nbeckman\t113204\n智妍\t113205\nOTDR\t113206\n传腹\t113207\n相依为命\t113208\n灯塔市\t113209\ns3c
6410\t113210\n9.5.1\t113211\n坡线\t113212\nfade\t113213\n凌云文学网\t113214\n来不及说\t113215\nCS55\t113216\n赤岭\t113217\nkongkong011\t113218\n格化\t113219\n英国中学\t113220\n蒲慕明\t113221\n张五哈尔\t113222\nCisco\t113223\nwin10版\t113224\n辐射避难所\t113225\n第23期\t113226\n潍河\t113227\n陈学冒险岛2\t113228\n铆钉机\t113229\n大气污染物综合排放标准\t113230\niTech\t113231\n王太后\t113232\nPg\t113233\ntextnow\t113234\ncrashing\t113235\n保险丝\t113236\nwow\t113237\n知法懂法\t113238\n20路\t113239\n跑龙套\t113240\n4s店\t113241\n沿江城际铁路\t113242\n中国中投证券有限责任公司\t113243\n肝肿大\t113244\nthru\t113245\n利美\t113246\n真爱谎言之破冰者\t113247\nNissan东风日产\t113248\nmtime\t113249\n鬼父全集\t113250\nlaravel数据库\t113251\n电力系统自动化\t113252\n9速\t113253\n浙大儿院\t113254\n故障网\t113255\n海甸岛\t113256\nQ7\t113257\nAj\t113258\n逍遥兵王\t113259\n心跳线\t113260\n亲爱的路人\t113261\n噪声源\t113262\n晨星\t113263\nbootstrapper\t113264\nuntiy3d\t113265\n朱亚文\t113266\n紫草\t113267\n处所\t113268\n宦妃天下\t113269\n元征科技\t113270\n摆臂式\t113271\n高泽文\t113272\n唐人阁\t113273\nlsb\t113274\n半张脸\t113275\n雅辛\t113276\n页岩气革命\t113277\n女性瘾者\t113278\n神阙\t113279\n0.53\t113280\n害命\t113281\n别克昂科拉\t113282\n寻死\t113283\n呼南高铁\t113284\n差距\t113285\n盘带\t113286\nss代理\t113287\n万维书刊网\t113288\n浙江省建筑设计研究院\t113289\n系统架构设计师\t113290\n春草\t113291\nwww.5xcg.com\t113292\ntenant\t113293\n广东省通信管理局\t113294\n赫基国际集团\t113295\n洛阳市\t113296\n伊藤\t113297\n区块链概念股\t113298\nVk\t113299\n17类\t113300\nMobil\t113301\n卡纳瓦罗\t113302\n整体式\t113303\n杂货船\t113304\n抽屉\t113305\nFOTILE\t113306\n周代\t113307\n斯坦索姆\t113308\n北京市肿瘤防治研究所\t113309\n源头活水\t113310\n丁基胶塞\t113311\nAqueous\t113312\n碱性磷酸酶\t113313\n误读\t113314\n难忍\t113315\nsaya\t113316\n近藤\t113317\n熊芳芳\t113318\nHoshi\t113319\npchar\t113320\n8007007e\t113321\n钢束\t113322\n蓝色妖姬\t113323\n日全食\t113324\n白金龙王\t113325\n劲部\t113326\n蓝吊\t113327\n大光圈\t113328\n低位\t113329\n综合馆\t113330\n熏香炉\t113331\nStrong\t113332\n面塑\t113333\n屈浩\t113334\n掏空\t113335\n经脉\t113336\n雪峰寺\t113337\n帧中继\t113338\n女武神\t113339\n单机\t113340\n中成股份\t113341\n会稽\t113342\n记念\t113343\n纽埃\t113344\n洗头膏\t113345\n城市规划区\t113346\nflinto\t113347\n遗世独立\t113348\ndurham\t113349\n发亮\t113350\n常见于\t113
351\n爱色丽\t113352\n爱家网\t113353\n斗地主\t113354\n肩部\t113355\n极客数学帮\t113356\n小方块\t113357\nPlayStation\t113358\n2017-2021年\t113359\n泛爱\t113360\n聚水\t113361\n雨水口\t113362\n铁锈色\t113363\n阿喀琉斯\t113364\n湖南人才网\t113365\nGIF5\t113366\n音影先锋\t113367\n久远\t113368\nboot\t113369\n仰起头\t113370\n三阴乳腺癌\t113371\n体坛网\t113372\n年纪轻轻\t113373\nIncrediBuild\t113374\n明雅\t113375\n高斯消元法\t113376\n回头率\t113377\npicture\t113378\n七厘散\t113379\n国子监博物馆\t113380\n捐款\t113381\n施岚青\t113382\n棱石\t113383\n趣图\t113384\npros\t113385\n如茵\t113386\n润泽\t113387\n6.13\t113388\n秦臻\t113389\n4014\t113390\nTransmitter\t113391\n舒乙\t113392\n人民币汇率\t113393\n内蒙古自治区人民政府办公厅\t113394\n指原莉乃\t113395\nPS痕迹\t113396\n不一样的任你操\t113397\n拉萨火车站\t113398\n3月底\t113399\n西楚网\t113400\nSMART\t113401\n总营收\t113402\nFFM\t113403\n息税折旧摊销\t113404\nArm\t113405\n经世\t113406\n八里河\t113407\n300型\t113408\n赤峰学院学报\t113409\n虐菜\t113410\n70.3\t113411\n圣斗士星\t113412\n枪士\t113413\n龙东\t113414\nalkyl\t113415\n音配\t113416\n致敬梗\t113417\n自安\t113418\n朝阳站\t113419\n海南海事局\t113420\n章莹颖案\t113421\n奔达\t113422\ncrossfit\t113423\n姑母\t113424\n陈明精\t113425\nSuspend\t113426\n异世界\t113427\n100万存\t113428\n五路一桥\t113429\nkmod\t113430\n透光板\t113431\n宿迁工业园区\t113432\n1407\t113433\n检测卷\t113434\n伯尔\t113435\n两女一男\t113436\n郑浩\t113437\n小篆体\t113438\n李懂\t113439\n第45\t113440\n中国人寿集团\t113441\n超级神基因\t113442\n父爱如山\t113443\nLenka\t113444\n数控\t113445\neli\t113446\npc6智能硬件网\t113447\n苹果Magic\t113448\nfishman\t113449\n难难\t113450\n乱刀\t113451\n亵渎\t113452\nHolo\t113453\n岁月如梭\t113454\n食人花\t113455\n家驹\t113456\nkronos\t113457\n008吧_\t113458\nEndnote\t113459\n天兴\t113460\n中大网校论文网\t113461\n周公庙\t113462\n李艳红\t113463\n阅舰\t113464\n余欣荣\t113465\n不再说\t113466\n西安建设银行\t113467\n巴比特\t113468\np图\t113469\n理智与情感\t113470\ntypescript\t113471\nWST\t113472\n恰当\t113473\n浦发银行手机银行\t113474\n信誉\t113475\nComsol\t113476\nobjectmapper\t113477\n里应外合\t113478\n鲁滨\t113479\n好期\t113480\n1C\t113481\n赌球\t113482\nimpedance\t113483\nHalifax\t113484\nf414\t113485\nlibtorrent\t113486\n刘洁\t113487\n杂波\t113488\nA-1\t113489\n学一学\t113490\n魔法花园\t113491\n刚来\t113492\n天猫精灵吧\t1
13493\nfund\t113494\n995集\t113495\n知识产权\t113496\n最坏\t113497\ncombobox\t113498\n邵阳市人民政府\t113499\n印地安\t113500\n疯了\t113501\n林原\t113502\nkconfig\t113503\n30W\t113504\n板载\t113505\n绵阳中学实验学校\t113506\nMedicine\t113507\nafraid\t113508\n裂风\t113509\n酒架\t113510\nNavMenu\t113511\n1.2.14\t113512\nWiFi密码查看器\t113513\nBrighton\t113514\nSketches\t113515\n2016.1.1\t113516\n生情\t113517\nrws\t113518\n6.5\t113519\nDC\t113520\n一丈\t113521\n英雄赞歌\t113522\n龙湖滟澜海岸\t113523\nlunix\t113524\n飞乐音响\t113525\n清心寡欲\t113526\nzikao\t113527\n美标卫浴\t113528\n隐隐痛\t113529\n汉斯希尔\t113530\n临边\t113531\n三角测量\t113532\n黑龙江财经学院\t113533\n居\t113534\n报关费\t113535\n30歳\t113536\n起始位\t113537\n梁村镇\t113538\n美体健身操\t113539\n阿姆西\t113540\n诸神之怒\t113541\n正线\t113542\n赤道\t113543\n不合规\t113544\n卡夫特|LED\t113545\n中子类\t113546\nRust\t113547\n文咏珊\t113548\ncrazybaby\t113549\n体值\t113550\n杨丽玲\t113551\n柏翠面包机\t113552\n驻马店人民政府\t113553\n金花茶\t113554\n巴力\t113555\nRAYS\t113556\n小糖\t113557\n千牛子\t113558\n南宋御街\t113559\n噶尔县\t113560\n恩杂鲁胺\t113561\n南京皮肤病研究所\t113562\n邙山\t113563\n人体损伤致残程度分级\t113564\n7800\t113565\n公差带\t113566\n尧舜禹\t113567\n旗子\t113568\n杜小莘\t113569\n武馆\t113570\n心乱\t113571\n5.0.7\t113572\n此话\t113573\n火影忍者:究极风暴3\t113574\n外挂式\t113575\n羊蝎子火锅\t113576\n史密斯威森\t113577\n先导\t113578\nrevive\t113579\n惩恶扬善\t113580\n防水型\t113581\n保荐机构\t113582\nA10\t113583\n21遍\t113584\nsdmakers\t113585\n子图\t113586\n吸食\t113587\n厄贝沙坦片\t113588\n简州\t113589\n导气\t113590\n九阴雨燕\t113591\n6款\t113592\n承欢\t113593\n黄轩\t113594\n即墨北站\t113595\n上海庙\t113596\n28i\t113597\n5注\t113598\nlookup\t113599\n3D打印技术\t113600\n大转转FE\t113601\n西水东\t113602\n冷链物流网\t113603\n李绩\t113604\n血与骨\t113605\n博科尼大学\t113606\n欧缇丽\t113607\n空气呼吸器\t113608\n侧漏\t113609\n蜂蛹\t113610\n割草机\t113611\n子鱼\t113612\n宁夏党委\t113613\n银火龙\t113614\n法流\t113615\n欧宝雅特\t113616\n访华\t113617\n住住\t113618\n腕骨\t113619\n保利龙\t113620\nong\t113621\n末日浩劫\t113622\n前20天\t113623\n盐城师范学院\t113624\n泫雅\t113625\n状\t113626\nHeater\t113627\n23年后\t113628\n多种方\t113629\n奥迪A7\t113630\n福鼎家园\t113631\n公放\t113632\n2017-11-01\t113633\n情遇曼哈顿\t113634\n8870\t113635\n大血\t113636\n肖莉\t
113637\n阴径\t113638\n伦教街道\t113639\n三孔桥\t113640\n长三角地区\t113641\n好喜\t113642\nRCTD\t113643\n克罗托内\t113644\n新通\t113645\nAssertion\t113646\n派克汉尼汾\t113647\n海蜇丝\t113648\n别问\t113649\n中市\t113650\n徐子崴\t113651\npyro\t113652\n番禺市\t113653\n流溪河国家森林公园\t113654\n特劳特\t113655\n长春光机所\t113656\n黑格尔法\t113657\n力控\t113658\n彩图\t113659\n扫荡\t113660\n天津公安警官职业学院\t113661\nAKS\t113662\n感染率\t113663\n美图M6\t113664\n完美世界小说吧\t113665\nBLEU\t113666\n长安大学汽车学院\t113667\nmumu模拟器\t113668\n271\t113669\ntipping\t113670\n氰化物\t113671\n烈火如歌百度云\t113672\n平远街\t113673\n清远政府网\t113674\n北移\t113675\n卡进\t113676\n12w\t113677\n国星光电\t113678\n|个\t113679\n科博\t113680\n吉林大学通信工程学院\t113681\n武昌首义学院\t113682\n第07集\t113683\n香堂\t113684\n利欧股份\t113685\n江苏星光动力集团\t113686\n飞图\t113687\n好人像\t113688\nAndroid8.0\t113689\nWebEx\t113690\n日月谷\t113691\n象王\t113692\nBach\t113693\nphtyon\t113694\n小米轩逸\t113695\n吸湿器\t113696\nx30\t113697\n香坂\t113698\n工程公司\t113699\n4届\t113700\npuritan\t113701\neasiest\t113702\nRoyal\t113703\n刘润潼\t113704\n料工\t113705\n瑞奇\t113706\nnicelabel\t113707\nresolv\t113708\n尼德克\t113709\n泰晤士报\t113710\n李矛\t113711\n重排版\t113712\nML\t113713\ntrig\t113714\noeko\t113715\n为爱所困2\t113716\n洁厕剂\t113717\nCopy\t113718\n北京中汽总回国留学人员购车服务有限公司\t113719\n性格内向\t113720\nwindows7系统\t113721\n临终\t113722\nstata12\t113723\n32安\t113724\n宝力\t113725\n陪你度过漫长岁月\t113726\nImproving\t113727\nSUB\t113728\nネ取\t113729\n俩次\t113730\n花言\t113731\n周几\t113732\n红十字国际委员会\t113733\n压铸厂\t113734\n野田洋次郎\t113735\n加卡\t113736\n杨振朱婷\t113737\nMeeGo\t113738\nMFC-7380\t113739\n水力发电\t113740\n硅藻泥网\t113741\n1819\t113742\n邮电路\t113743\nwebfont\t113744\n1.4.6\t113745\npes2016\t113746\n诛仙3_多玩游戏网\t113747\n抉\t113748\n奥雪\t113749\narden\t113750\n各执\t113751\n2018年10月\t113752\n键键\t113753\n广告部\t113754\n方励之\t113755\n3日游\t113756\n天津联通\t113757\n摊牌\t113758\n和颜悦色\t113759\n2537\t113760\n猫爪草\t113761\n野生鸟\t113762\nnx300\t113763\nShimo\t113764\n诺亚传说\t113765\n百鸟朝凤\t113766\n3dmax建模\t113767\n东坡下载\t113768\nscraper\t113769\n霜寒\t113770\n手动对焦\t113771\n微升\t113772\n盈动\t113773\n网上贷款\t113774\n福州南站\t113775\n0191\t113776\n燕子专列\t1
13777\n价值\t113778\n中共中央对外联络部\t113779\n制砖\t113780\n河野太郎\t113781\n台海核电\t113782\n活拿\t113783\n深圳动物园\t113784\n水红\t113785\n03709\t113786\n鸦片战争博物馆\t113787\n份额\t113788\n交通运输部科学研究院\t113789\n高低\t113790\nd7000\t113791\nkink\t113792\n少林寺\t113793\n潇湘客\t113794\n两成\t113795\nGalleries\t113796\nmicron\t113797\n数值分析\t113798\n德生科技\t113799\n第71期\t113800\nLobster\t113801\n非学历\t113802\n完\t113803\n周世宗\t113804\nBillie\t113805\nprotocol\t113806\n雪佛兰迈锐宝\t113807\nbiaoge\t113808\n高速钢\t113809\nG450\t113810\nbcmath\t113811\n富贵险\t113812\n811\t113813\n欲奴\t113814\n开口式\t113815\n激子\t113816\nGLIBC\t113817\n重装机兵2\t113818\nQF\t113819\n2018年4月4日\t113820\n少女峰\t113821\n阿芙洛狄忒\t113822\n影响\t113823\n猪舌\t113824\n赛区\t113825\n一块板\t113826\n行疆\t113827\nCivilisations\t113828\nN+\t113829\n高好\t113830\n恒山派\t113831\n10万平米\t113832\n青林湾\t113833\nXR-V论坛\t113834\n熔芯\t113835\n影子股\t113836\n展场\t113837\n植草砖\t113838\n不定型\t113839\n房地产业\t113840\n圣\t113841\n郭麒麟\t113842\n无意\t113843\n新平\t113844\n魔卡\t113845\n巨巨\t113846\n雨露网\t113847\nev录屏\t113848\nv2.9.2\t113849\n球星们\t113850\n啪啪丁\t113851\n收藏展\t113852\n国海\t113853\n深南电路\t113854\nbinning\t113855\n鼎新\t113856\nmodelling\t113857\nwheat\t113858\nVetements\t113859\n梦筱二\t113860\n一垒\t113861\nafternoon\t113862\n铁石心肠\t113863\n2017年2月27日\t113864\n设局\t113865\n年青人\t113866\n请稍后\t113867\npm961\t113868\n宫濑里子\t113869\n欧拉公式\t113870\n袭击者\t113871\n湄潭县\t113872\n娇兰臻彩\t113873\n迪赛尼斯\t113874\n200余\t113875\n竞技性\t113876\n优视\t113877\n金玄玉\t113878\n怪诞心理学\t113879\n结疤\t113880\nmer\t113881\nyouku\t113882\n省委\t113883\ndnf冰结师\t113884\n桃桃\t113885\niphone11\t113886\n丹阳新闻网\t113887\n鸿恩寺\t113888\n安泰科技\t113889\n郭庚茂\t113890\n坡头区\t113891\n鱼峰\t113892\n洪杨\t113893\n发动机仓\t113894\n1500米\t113895\n普京大帝\t113896\n水素杯\t113897\n利尔达科技\t113898\n知道不知道\t113899\n晨小柏\t113900\nromi\t113901\n走下\t113902\n陈立新\t113903\n42043\t113904\nopenjdk8\t113905\nbarracuda\t113906\n金地(集团)股份有限公司\t113907\n一枝红艳露凝香\t113908\n酷睿i3\t113909\nwww.52fzba.com\t113910\nwindiws\t113911\n偏钒酸铵\t113912\n南二环路\t113913\nFrontend\t113914\n硕博招聘网\t113915\n8328\t113916\n100MB\t113917\n
岩岩\t113918\n中铁二院\t113919\n小时江湖\t113920\n开元寺\t113921\n地租\t113922\nhamcrest\t113923\n射精\t113924\n吸墨器\t113925\n大阪市区\t113926\n刮胡\t113927\n天骥脱机\t113928\n潟湖\t113929\n染发\t113930\n为什么叫\t113931\n豆腐鱼\t113932\n你可以\t113933\n双声\t113934\n闭环控制\t113935\n踏足\t113936\n问医\t113937\nCS1.5\t113938\n铁东区\t113939\n资本成本率\t113940\n道阻\t113941\nmonroe\t113942\nMatches\t113943\n奇案\t113944\n奥雅之光\t113945\n绝望先生\t113946\n大牌们\t113947\n王明光\t113948\n南江县\t113949\n景点\t113950\n2828.1\t113951\n挚\t113952\n河北省人民政府办公厅\t113953\n正国\t113954\n青岛路\t113955\nNIKON\t113956\n20140505\t113957\n西乡殿\t113958\n37天\t113959\n碰碰\t113960\n维稳\t113961\n宋崇\t113962\n台湾地区\t113963\nMarantz\t113964\nQPQ\t113965\n琅东汽车站\t113966\n重块\t113967\n捞船\t113968\n身中\t113969\n80211\t113970\n恒温恒湿恒氧\t113971\nRome\t113972\n外廓\t113973\n乐百家\t113974\n西山居\t113975\n款人\t113976\n1.0.2.3\t113977\n黎川县人民政府\t113978\n常宁\t113979\ncna\t113980\nviper4\t113981\n张江集团\t113982\n刑责\t113983\n新洛神\t113984\nAmiibo\t113985\n星光\t113986\n折叠机\t113987\n创响\t113988\n网易微博\t113989\n南闸\t113990\n可耐福\t113991\n提示性\t113992\n翻缸\t113993\nwap2\t113994\n孤胆\t113995\n2016.5.1\t113996\n三国吧\t113997\n蜀都\t113998\n魔灯\t113999\n鸭血粉丝\t114000\npromotion\t114001\n嘴角\t114002\n易制爆化学品\t114003\n九曲十八弯\t114004\nSOS\t114005\n泰兴法院\t114006\n体育教育专业\t114007\n富力湾\t114008\ndubbo\t114009\n北大方极规划\t114010\n科斯\t114011\n李金明\t114012\n硬防\t114013\nCleaners\t114014\n牛企\t114015\n真人性\t114016\n采摘期\t114017\n制定\t114018\n社会学\t114019\n液相色谱仪\t114020\n超星尔雅大学生创业\t114021\nbet\t114022\n特艺\t114023\n铺垫\t114024\n诺希\t114025\n0.1.1\t114026\n说\t114027\n贡院\t114028\n揭阳站\t114029\n特玩csgo\t114030\nCBI游戏天地网\t114031\n绿盖\t114032\n提示器\t114033\njavacore\t114034\n320x240\t114035\neteams\t114036\n消抖\t114037\ncss篇\t114038\n魔宙\t114039\n僧衣\t114040\nnxlog\t114041\n中考满分作文\t114042\n存款准备金利率\t114043\n隐含波动率\t114044\n厦门工商银行\t114045\n4096\t114046\n妖王\t114047\n酒宴\t114048\n7002\t114049\n三远\t114050\n急性肺损伤\t114051\nwah\t114052\nRigidbody\t114053\n当地游\t114054\n日语输入法\t114055\n南昌市\t114056\n黑虫子\t114057\n程序园\t114058\n棉纸\t114059\nkdb\t114060\nBingo\t114061\n六种\t114062\n还价\t11
4063\n新的一天\t114064\n机检\t114065\n伤害\t114066\nsitc\t114067\n线仪\t114068\n项权\t114069\n万达网科\t114070\n果穗\t114071\n周永康\t114072\n调控\t114073\n加福\t114074\n蝶阀\t114075\n第二十七集\t114076\n电动转辙机\t114077\n美国驻华大使馆\t114078\n电子版\t114079\n芜湖市第一人民医院\t114080\n空当\t114081\n好实用\t114082\n血粉\t114083\n雷龙鱼\t114084\n电源地\t114085\nsynergy\t114086\n削足适履\t114087\n10.9.5\t114088\ndqmj3\t114089\n龙叔\t114090\n油茶\t114091\n热血街舞团VS\t114092\n红山区\t114093\n上海汇众汽车制造有限公司\t114094\n并茂\t114095\nmunicipal\t114096\n卓识\t114097\n伴行\t114098\n尤克里里弹唱谱\t114099\n广州华夏职业学院\t114100\n根底\t114101\n王者荣耀职业联赛\t114102\n马千里\t114103\nGS8\t114104\n中保\t114105\n淫人\t114106\n陶华碧\t114107\n晨熙\t114108\n讲座\t114109\n王振宇\t114110\nmoss\t114111\n凯翼X3\t114112\n灵猴\t114113\nhomemade\t114114\nLinking\t114115\n一个三\t114116\n密斯凡德罗\t114117\n显著性检验\t114118\n并联电容器\t114119\nR6300v2\t114120\n钠离子交换器\t114121\n7.06\t114122\n江杨南路\t114123\nOb\t114124\ntanA\t114125\n宝宝类\t114126\n400部\t114127\n百安居\t114128\n教育信息化网\t114129\n方流芳\t114130\n浩云科技\t114131\n海盗湾中文网\t114132\n何毅亭\t114133\nannounced\t114134\nMTK6577\t114135\n兔同笼\t114136\n半颗\t114137\n金鱼缸\t114138\n打包袋\t114139\n民间版\t114140\n注解版\t114141\n高鹗\t114142\n无杆\t114143\nCUN\t114144\n留学基金委\t114145\n词歌\t114146\nxuexiao\t114147\n企划部\t114148\n许雪里\t114149\n噱\t114150\n丰趣\t114151\n千兆以太网交换机\t114152\n鼓楼\t114153\n170611\t114154\n11行\t114155\n合谋\t114156\n识别性\t114157\n角质化\t114158\n双氧奶\t114159\n蝙蝠侠:阿卡姆骑士\t114160\n三角区\t114161\n广东省体育局\t114162\ndege\t114163\n小鑫\t114164\nconstexpr\t114165\n希澈\t114166\n水位\t114167\n成人高等学历教育\t114168\n沃尔沃XC60论坛\t114169\n横岗大厦\t114170\n换气\t114171\n吉冈里帆\t114172\n景观亭\t114173\nChangzhou\t114174\n金太郎\t114175\n凯妮汀\t114176\n443岁\t114177\n在编\t114178\n软管泵\t114179\n杭州市富阳区人民政府\t114180\n1380元\t114181\n记否\t114182\n王卫东\t114183\n家宅\t114184\n沪苏\t114185\n车管家\t114186\n小天\t114187\n字体网\t114188\n清苦\t114189\n软银中国资本\t114190\n济南市商务局\t114191\nDelayed\t114192\n钟离昧\t114193\n虐杀原形\t114194\n诈骗罪\t114195\nHASEE\t114196\n梦幻超级变态版\t114197\nKtv\t114198\nAnnounce\t114199\n王西麟\t114200\n农博网\t114201\n芳林\t114202\nVS岚\t114203\n晓月苑\t114204\n蓝风筝\t114205\n嘉靖\t
114206\n海文库\t114207\n奔跑吧兄弟\t114208\n免谷歌\t114209\n伪书\t114210\n璐璐\t114211\nstd\t114212\n中国北方工业公司\t114213\n嫣汐\t114214\nv2.5.0\t114215\n断桩\t114216\n96119\t114217\n48芯\t114218\n电子产品\t114219\n李沧海\t114220\n恶心\t114221\n吉林大学出版社\t114222\n民营加油站\t114223\n人身高\t114224\n航趣\t114225\n霸道总裁\t114226\n公私\t114227\n沃特\t114228\n长城vv5\t114229\n柔韧\t114230\n摩凡陀\t114231\n2036年\t114232\n欧倍青\t114233\n600053\t114234\ndeepmind\t114235\n句容\t114236\n5月31日\t114237\nInd\t114238\n明眸善睐\t114239\n奔驰GLE43\t114240\n锁链\t114241\n中天国富证券\t114242\n松驰\t114243\nmapcontrol\t114244\n江亿\t114245\n勇気\t114246\ngcms\t114247\n一禅\t114248\n足跟\t114249\n◆\t114250\n自耦变压器\t114251\n接待台\t114252\n大数定理\t114253\n3986\t114254\n红沙\t114255\n印度尼西亚\t114256\n欧姆\t114257\n空调病\t114258\nWizards\t114259\n2017年2月22日\t114260\n放票\t114261\nnexus\t114262\n笃志\t114263\n800分\t114264\n固定资本\t114265\n力达\t114266\n爬山虎的脚\t114267\n诬告\t114268\n免息期\t114269\n锡场镇\t114270\n求生类\t114271\n四流\t114272\n爱的替身\t114273\ndnf死灵术士吧\t114274\n聚丙烯酸\t114275\n超滤膜\t114276\n看护\t114277\n广告位\t114278\n平面波\t114279\n鹦哥岭\t114280\n5000个\t114281\nTouchBar\t114282\nFancl\t114283\nFlour\t114284\nhib\t114285\nwmui\t114286\n牛里脊\t114287\n代尔塔\t114288\n影音\t114289\n齐柏林\t114290\n法瑞尔\t114291\n情感化\t114292\n变形金刚5\t114293\nWGS\t114294\n西蔓\t114295\n兴业银行大厦\t114296\n珍珠鱼\t114297\n戴孝\t114298\n4.9.3\t114299\n宫崎\t114300\nmemories\t114301\n火炬之光2吧_\t114302\n舶来品\t114303\n2069\t114304\nZ7\t114305\n诸神\t114306\n僵尸道长1\t114307\n梦幻西游BB\t114308\n玉米淀粉\t114309\n2手\t114310\n#10\t114311\n螺杆式\t114312\nM4A4\t114313\n唱功\t114314\n弱势群体\t114315\n谭其骧\t114316\n跆拳舞\t114317\nai音箱\t114318\n古迪\t114319\n宋永志\t114320\n衣舍\t114321\n奥丽莎\t114322\nLubuntu\t114323\n活意\t114324\n286\t114325\nIndexing\t114326\n城东新区\t114327\n青春季\t114328\n唐璐\t114329\ninfant\t114330\n腿形\t114331\nkx-2\t114332\n度小月\t114333\n日评\t114334\n82列\t114335\n培生儿童英语分级阅读\t114336\nエルフ姫ニィ\t114337\n长武人民政府\t114338\nB7000\t114339\n由得\t114340\n第七十五章\t114341\n陈丹燕\t114342\n基于\t114343\nrsort\t114344\n108P\t114345\n椅子舞\t114346\n全句\t114347\nSAMPLE\t114348\n江苏商贸职业学院\t114349\nDiag\t114350\n麻丘
\t114351\n割裂\t114352\n济南千佛山\t114353\n2x^2\t114354\n华月\t114355\n东至\t114356\n许荣茂\t114357\nRockets\t114358\n关原\t114359\nSlide\t114360\n九洲城\t114361\n台湖\t114362\nParental\t114363\n山东省地方税务局\t114364\n波导管\t114365\n王勇\t114366\n破费\t114367\n强人所难\t114368\nAVI_磁力链接ed2k迅雷\t114369\ngzhifi\t114370\n咻\t114371\n棕榈\t114372\n中建二局第三建筑工程有限公司\t114373\n中昌数据\t114374\n龙的天空\t114375\nDataURL\t114376\nnet4.5\t114377\n海产\t114378\n体卫艺\t114379\n20160715\t114380\n点焊机\t114381\n飘荡\t114382\n程潇\t114383\n北京合众思壮科技股份有限公司\t114384\n新一佳\t114385\n房改房\t114386\n折星星\t114387\n宦溪镇\t114388\n别样\t114389\n乱世三国\t114390\n魔法咪路咪路\t114391\nESPRIT\t114392\n高级审计师\t114393\nieframe\t114394\n不孝\t114395\ncn邮箱\t114396\n出队\t114397\n5周年\t114398\n事业编制人员\t114399\n耿耿于怀\t114400\n20170414\t114401\n成龙大道\t114402\n核工业西南物理研究院\t114403\n301调查\t114404\n花儿与少年第二季\t114405\n钟山镇\t114406\n变形计之平行世界\t114407\n西营镇\t114408\n标志物\t114409\n记者会\t114410\n郑秀文\t114411\n11.3%\t114412\n料头\t114413\n武炼巅峰\t114414\n赤砂糖\t114415\n商业策划书\t114416\n7700K\t114417\n河北科技大学理工学院\t114418\n反潮\t114419\n纸模\t114420\n硅肥\t114421\nBOND\t114422\n苯佐卡因\t114423\n东风谷\t114424\n发达地区\t114425\n鲜嫩\t114426\n面头\t114427\n企业信用网\t114428\n极迅\t114429\n清扫口\t114430\n哪位\t114431\n伊戈尔\t114432\n人塔\t114433\n刀女\t114434\n费恩\t114435\n52_\t114436\n突撃\t114437\n空客A350\t114438\nCURRENT\t114439\n60片\t114440\n火人\t114441\nDTS\t114442\n躲雨\t114443\n100.6\t114444\n慌慌\t114445\n杨坤\t114446\n第3轮\t114447\n房贷计算器\t114448\npicasso\t114449\n孔网\t114450\n2MB\t114451\n小主人\t114452\n层次感\t114453\n九正装修问答\t114454\n出手\t114455\n佑天兰\t114456\n达喀尔拉力赛\t114457\n沙画\t114458\nbbs.51credit.com\t114459\n蓄电\t114460\n悬针\t114461\n储油\t114462\n死线\t114463\n小魔怪\t114464\n三奇\t114465\n饼类\t114466\n近藤真彦\t114467\nSMT之家论坛\t114468\n国五\t114469\n殷罡\t114470\n快节奏\t114471\n上海车管所\t114472\n俄狄浦斯王\t114473\n通项\t114474\n万能格式转换器\t114475\n帝师呼兰河传\t114476\n上饶中学\t114477\n自动贩卖机\t114478\n天冷\t114479\nPANIC\t114480\n炫迈\t114481\n人民网舆情监测室\t114482\n营销者\t114483\n涨落\t114484\n咸阳职业技术学院\t114485\n蔡琳\t114486\n函模板\t114487\n发麻\t114488\n咱们屯里的人\t114489\n油液\t114490\n防水板\t114491\n沪苏湖\t114492\n扣件式钢管\t
114493\n商丘学院\t114494\n磁阻\t114495\n运输户\t114496\n定点药店\t114497\n三星S6\t114498\nfifaonline3吧_\t114499\n绑腿\t114500\nArgox\t114501\n非洲象人族\t114502\n上图\t114503\n洪城\t114504\ncontradiction\t114505\nsha2\t114506\n新天梯\t114507\n大力神\t114508\nKARA\t114509\n孔雀开屏\t114510\n3.36\t114511\nIMEIdb\t114512\nt20v4.0\t114513\n43399\t114514\n本山传媒\t114515\n证据法\t114516\nnnpj\t114517\nHates\t114518\n牛脚\t114519\n2.5t\t114520\n潞安\t114521\n区工商局\t114522\n新客\t114523\n柯尔克孜族\t114524\n杂话\t114525\n省地震局\t114526\n新康花园\t114527\n隆鑫摩托\t114528\n河北科技大学\t114529\n2分钱\t114530\n赣南采茶戏\t114531\n中国能源工程集团有限公司\t114532\n吸潮\t114533\n瀚思\t114534\n小若\t114535\n签证表\t114536\n花瑶\t114537\n杜洋\t114538\n美优\t114539\n国防生\t114540\nBalloons\t114541\n5.2亿\t114542\n朝核\t114543\nshimg\t114544\nbaicai\t114545\n纯涩\t114546\n学起\t114547\n筋膜\t114548\nF10\t114549\n波型\t114550\n嘉定都市网\t114551\n常静\t114552\n西青道\t114553\ntp-link无线路由器\t114554\n阅览室\t114555\n贪官污吏\t114556\nWaymo\t114557\n不可方物\t114558\n清光\t114559\nbuyee\t114560\n雨敏\t114561\nbizbox\t114562\n中童\t114563\nVariety\t114564\n米斯特汀\t114565\ngid\t114566\n)网络科技有限公司\t114567\n德平\t114568\n孝琳\t114569\n商祺\t114570\n顺达\t114571\nSpooler\t114572\n中粮我买网\t114573\n德玛西亚\t114574\n井上雄彦\t114575\n黑嗓\t114576\nrutgers\t114577\nTex\t114578\n决口\t114579\n大冲锋\t114580\n分布表\t114581\ndatedif\t114582\n公示\t114583\nIC网\t114584\n傲风电竞椅\t114585\n光芒万丈\t114586\nv110\t114587\n童子军手册之僵尸启示录\t114588\n朱艳\t114589\nlabel\t114590\n适时\t114591\n东山大道\t114592\n讓\t114593\n林树森\t114594\n荡口古镇\t114595\n中国电力网\t114596\n辰龙\t114597\n手机输入法\t114598\n允贤\t114599\nmouse2\t114600\n几十年\t114601\n动感号\t114602\n阿刁\t114603\n球阀\t114604\ncapstone\t114605\n井冈山政府网\t114606\nbab.la英语\t114607\n刀郎\t114608\n保健所\t114609\n鱼塘\t114610\n欧意\t114611\n东胜区人民政府\t114612\n朝会\t114613\n沈艳\t114614\n3002\t114615\n3sin\t114616\n生态化\t114617\ncglib\t114618\n体外培养\t114619\n特布他林\t114620\n常州市政府网\t114621\n1st\t114622\n批注框\t114623\nPoppin\t114624\n接人\t114625\n王萍\t114626\n大天使号\t114627\n武汉格力空调\t114628\nyishu\t114629\n激灵\t114630\n身亡\t114631\n铅衣\t114632\n璧山县人民政府\t114633\n长镜头\t114634\n锡基霍尔\t114635\n塔里木油田\t1
14636\n15.26\t114637\n中山市委\t114638\n全办\t114639\n袁明\t114640\nclassid\t114641\n洋槐树\t114642\n页岩\t114643\n来历不明\t114644\nwww.360doc.com\t114645\n欺人太甚\t114646\n氢燃料汽车\t114647\nDeveloper数据库\t114648\n立花瞳\t114649\n桂东电力\t114650\n冷菜\t114651\n科学院\t114652\n丹西\t114653\n泥\t114654\n1533\t114655\n抹脸\t114656\n省好\t114657\n炮管\t114658\nEmpty\t114659\n借款人\t114660\n宋旻浩\t114661\n苍穹之战\t114662\n以点带面\t114663\n跃龙\t114664\n美山\t114665\nl310\t114666\n谐谑曲\t114667\n镀镍\t114668\n信用码\t114669\n别太坏\t114670\n西安市高新区\t114671\nNACE\t114672\n136期\t114673\n阮玲玉\t114674\n我命\t114675\n死狗\t114676\n三十二步\t114677\ndefaults\t114678\n奔驰s级\t114679\nbenq\t114680\n搜狐视频\t114681\n0位\t114682\n黑白照\t114683\n棉纺织\t114684\n校正器\t114685\n瑞升\t114686\n第88届\t114687\n使团\t114688\ntunable\t114689\n4万多元\t114690\nbsc\t114691\n第9页\t114692\n第2481章\t114693\n青娱乐在线\t114694\n翠林\t114695\n参差\t114696\n老规矩\t114697\n史老师\t114698\n四十七\t114699\n杜氏\t114700\n会员国\t114701\n易军\t114702\n江中猴姑米稀\t114703\n粉球\t114704\n戴勇\t114705\n乐库\t114706\n气动三联件\t114707\n垃圾池\t114708\n上海中大肿瘤医院\t114709\ncaro\t114710\n黄岩岛\t114711\n1kpa\t114712\n黄河水利委员会\t114713\n玉米粉\t114714\n热火\t114715\n凄厉\t114716\nXIUREN秀人网\t114717\n鬼婴\t114718\n鬼吹灯\t114719\n连身\t114720\n守望先锋英雄\t114721\n窝客\t114722\n华然装饰\t114723\n七校\t114724\n真新街道\t114725\n百分之8\t114726\n莫高雷\t114727\n木兰辞\t114728\n朱迅乌兰图雅\t114729\n自由幻想\t114730\n7门\t114731\n私人银行\t114732\n锹甲\t114733\n中国物流网\t114734\neb80.com\t114735\n娃们\t114736\n鬃狮\t114737\nT1/\t114738\nq版\t114739\nsnake\t114740\nLESS\t114741\n52pk英雄联盟\t114742\n每周六\t114743\n4码\t114744\n新台\t114745\n疯狂的石头\t114746\nAPPIUM\t114747\n湖羊\t114748\n氢燃料电池\t114749\nAES\t114750\n六天\t114751\n思雨\t114752\n孙冶方\t114753\n凤凰飞\t114754\n商业预付卡\t114755\nola\t114756\n和易性\t114757\n城镇群\t114758\nCupid\t114759\n爆口\t114760\n2018年底\t114761\nIAC\t114762\n603000\t114763\n一秋\t114764\n小儿支气管炎\t114765\n脑动脉\t114766\n由小到大\t114767\n南通市教育局\t114768\n人大附中西山学校\t114769\n卦妃天下_锦凰\t114770\n周分组\t114771\n抱负\t114772\n牛总\t114773\n样\t114774\n发端\t114775\n宿松\t114776\n分析器\t114777\n队列\t114778\nsql吧\t114779\n3月3日\t114780\n阿耐\t114781\n徐忠\t114782\n宿
怨\t114783\n桂言叶\t114784\n皮拉夫\t114785\nz11mini\t114786\nbeanfun\t114787\n喝死\t114788\n运动系\t114789\n汗流浃背\t114790\n孙先生\t114791\n格雷厄姆\t114792\n太康县人民政府\t114793\n艺创\t114794\n老爸当家\t114795\ndazzle\t114796\n自持\t114797\n5万条\t114798\n凤凰计划\t114799\n超详\t114800\n服务有限公司\t114801\ngary\t114802\n谷地\t114803\n占领\t114804\n磕磕\t114805\n布拉\t114806\niowa\t114807\n1股\t114808\n有关闭\t114809\n核单\t114810\n鱼戏莲叶\t114811\n超纤\t114812\n荣泰\t114813\n三叶虫\t114814\n卡琳\t114815\n王二麻子\t114816\n土耳其女排\t114817\n小胡鸭\t114818\nSATA3接口\t114819\n3.8g\t114820\nsmartbi\t114821\n霍青桐\t114822\ntce\t114823\nK0\t114824\n61.183\t114825\nopenurl\t114826\n3月2号\t114827\n瞿\t114828\nwade\t114829\n复种\t114830\nInconsistent\t114831\n056\t114832\n被申请人\t114833\n仲宫镇\t114834\n诺和锐\t114835\n6话\t114836\n马蹄铁\t114837\n成医\t114838\n账单\t114839\n体育西路\t114840\n盛鼎\t114841\n长泰\t114842\n佳能尼康\t114843\n40乘\t114844\nButton篇\t114845\n奥克维尔\t114846\nLazy\t114847\n15克\t114848\n中沙群岛\t114849\n1933年\t114850\n小陌\t114851\nxie\t114852\n江中游\t114853\n教育类\t114854\n第四章\t114855\n葵园\t114856\n顶碗\t114857\nanp\t114858\n31集\t114859\n陈家沟\t114860\n预备\t114861\n茂名新闻网\t114862\n不可做\t114863\n集合管\t114864\n北朝\t114865\n语言的魅力\t114866\n缸杯\t114867\n贸易类\t114868\n轴端\t114869\ncma\t114870\nstoring\t114871\n佳茗\t114872\n马志刚\t114873\n门子\t114874\n楚河\t114875\n火马\t114876\n疾病\t114877\n试唱\t114878\n草人人\t114879\n仙姿\t114880\n双扇门\t114881\nanaconda3\t114882\nurllib2.urlopen\t114883\nOY\t114884\n恒牛\t114885\n柒号\t114886\n逻辑电路\t114887\n第一秘\t114888\n东营房\t114889\nyong\t114890\n六百\t114891\nGlasgow\t114892\n力系\t114893\n林秀香\t114894\n中国六盘水网\t114895\n桃红色\t114896\n耐看型\t114897\n长春新闻网\t114898\n拾色器\t114899\n两侧\t114900\nSmith\t114901\nTester\t114902\n串讲\t114903\n朱淑真\t114904\n9根\t114905\nSegway\t114906\n全率\t114907\n江陵县\t114908\nhelping\t114909\n实习律师\t114910\n好与坏\t114911\n友嘉\t114912\n廉贞星\t114913\n粉丝们\t114914\n于田\t114915\nN8\t114916\n20150907\t114917\n半袖妖妖\t114918\n刺激仪\t114919\nelx\t114920\n路科\t114921\n4008\t114922\n曹公\t114923\nnga.178.com\t114924\n王晓杰\t114925\nOffice教程学习网\t114926\n弑君\t114927\n上海赛车场\t114928\n四月二日\t114929\
n帐簿\t114930\n99育儿百科全书_99健康网\t114931\n巷\t114932\n艾诺迪亚3\t114933\n米兰理工\t114934\n50大\t114935\n武装斗争\t114936\n巴清传\t114937\n神箭\t114938\n600570\t114939\n客家语\t114940\n配装\t114941\nJCL\t114942\n不锈钢冲孔网\t114943\nSWOT\t114944\nnaice\t114945\nOEM\t114946\n1.docx\t114947\n苍蓝\t114948\n上海口腔科医院\t114949\n树怪\t114950\n劳动合同期限\t114951\n单克隆\t114952\n2.2万亿\t114953\n代表团\t114954\n1.1.3\t114955\nCl\t114956\nlil\t114957\n体验机\t114958\n龙长春\t114959\nⅠ\t114960\ndoudou\t114961\n龙应台\t114962\nLED屏\t114963\n高频小信号放大器\t114964\n铁丝网\t114965\n微波炉热\t114966\nffi\t114967\n说的好\t114968\n标营\t114969\n陕北说书\t114970\n11.15\t114971\n第五个\t114972\nSure\t114973\n北京首汽【集团\t114974\n十九次\t114975\n120片\t114976\n艾肯6\t114977\n中亚股份\t114978\nOk\t114979\n松明\t114980\n新天堂II\t114981\ntyrant\t114982\n微盘\t114983\n物理类\t114984\n私办\t114985\n罡星\t114986\n丽萍美体健身操\t114987\n手工艺品\t114988\njava编程\t114989\n99块\t114990\nshroud\t114991\nxlog\t114992\n新沂\t114993\n岑凯伦\t114994\n媚肉生香\t114995\ntokyohot\t114996\n窃电\t114997\n奥兹玛\t114998\n李泽厚\t114999\n索菲亚家居股份有限公司\t115000\n计春华\t115001\ncsa\t115002\n汕头电视台\t115003\n迅雷地址\t115004\n番号\t115005\nnxp\t115006\nreadboy\t115007\n幽门梗阻\t115008\n中国企业培训网\t115009\n洪峰\t115010\n25级\t115011\n房产商\t115012\n阿贝折光仪\t115013\nlife\t115014\n宁波市政府\t115015\n上海耳鼻喉科医院\t115016\npalette\t115017\n枫哥\t115018\n清剿\t115019\n牧者\t115020\n长沙建设银行\t115021\n套房子\t115022\n金馆长\t115023\n北京控股\t115024\n04月27日\t115025\n打假\t115026\n压延机\t115027\n希捷1TB\t115028\n古礼\t115029\n00861\t115030\nwww.zf158.com\t115031\n单胞菌\t115032\n鬼泣4特别版\t115033\nsca\t115034\n4千万\t115035\n混凝土地泵\t115036\n名实\t115037\n仁济医院东院\t115038\nswitchbutton\t115039\n3树\t115040\n海信液晶电视\t115041\n面目\t115042\n伸长率\t115043\n旁落\t115044\n缺少\t115045\n不具\t115046\nup主\t115047\n每十年\t115048\ndvd机\t115049\ngalaxy\t115050\n泌尿系结石\t115051\n社会科学报\t115052\n埃辛诺斯战刃\t115053\n种业商务网\t115054\nsummon\t115055\nthrew\t115056\n朝阳外国语学校\t115057\n小公园\t115058\n原生动物\t115059\nSpeaking\t115060\n食客\t115061\nNewsmy\t115062\n营山县\t115063\n骑警\t115064\n普达\t115065\n师士\t115066\n韦家辉\t115067\ndesirable\t115068\n青椒肉丝\t115069\n纷泽\t115070\n顿生\t1150
71\n火蚁\t115072\n海路\t115073\nMaya2017\t115074\nFeudal\t115075\nMenswear\t115076\n昙华林\t115077\n第100\t115078\n恋家网\t115079\nlgbt\t115080\n白乳胶\t115081\nMK4\t115082\n鲜橙汁\t115083\n案发\t115084\n20170227\t115085\n定过\t115086\n涂鸦\t115087\n操办\t115088\n竖滚动条\t115089\n顶线\t115090\n127查询网\t115091\n我是谁\t115092\nCory\t115093\n3瓦\t115094\n华盛顿纪念碑\t115095\n山西省公安厅\t115096\n南码头\t115097\nserenity\t115098\n档头\t115099\n因小失大\t115100\n2018年4月5月\t115101\n反合\t115102\nwindows7序列号\t115103\n孙吴县\t115104\n11只\t115105\n难离\t115106\n蒸饭箱\t115107\n山体\t115108\nCPI\t115109\n主板芯片组\t115110\n金驰\t115111\n经营类\t115112\n简便\t115113\n邗江政府网\t115114\n硅芯管\t115115\n欲睡\t115116\ncatering\t115117\nPacks\t115118\n司法鉴定所\t115119\n巧练\t115120\n木通\t115121\n子长\t115122\n起始值\t115123\nm端\t115124\n开诊\t115125\n晶锐剑\t115126\n柚月\t115127\n歇脚\t115128\n放流\t115129\n杏仁茶\t115130\n宝鸡教育网\t115131\n堂哥\t115132\n经营决策机构\t115133\n龟甲胶\t115134\nQQ闪照\t115135\n入学\t115136\nlxs\t115137\nREDO\t115138\n6米高\t115139\n袁和平\t115140\n资信证明书\t115141\n蒂雅\t115142\nsf6\t115143\nVRchat\t115144\nbattleye\t115145\n真空预冷机\t115146\n归海\t115147\n停交\t115148\n双龙寺\t115149\n辐射仪\t115150\nshc\t115151\n炭疽病\t115152\n雪鹰领主\t115153\n中断向量表\t115154\ns26\t115155\n顾山镇\t115156\ngogoro\t115157\n锥形\t115158\n易阅-言情穿越\t115159\n化德县\t115160\n雷克萨斯NX论坛_汽车之家论坛\t115161\n山海天\t115162\n给袋式包装机\t115163\n杭州机场\t115164\niperf3\t115165\nx520\t115166\n工作\t115167\nvlde\t115168\n通天绳\t115169\n比容\t115170\n三栋\t115171\n川大附小\t115172\n海淀区人力资源和社会保障局\t115173\n阜成门\t115174\n万汇\t115175\n较为\t115176\nllguanli\t115177\n天票\t115178\n恒金\t115179\nP10\t115180\n以期\t115181\n骨肉瘤\t115182\nRent\t115183\n食梦貘\t115184\nyehualu\t115185\n精肉\t115186\n叙利亚战争\t115187\n嫁\t115188\n固定位\t115189\n电机保护器\t115190\n车类\t115191\n蔡振华\t115192\n和一个人\t115193\n格林卡\t115194\ntigris\t115195\n桑树\t115196\n刘诺\t115197\n同捆\t115198\ncaptured\t115199\nCM11\t115200\n利信\t115201\n分割线\t115202\n巨惠\t115203\n20所\t115204\n强制执行\t115205\n猫藓\t115206\nWindows98\t115207\n特藏\t115208\n大节\t115209\nTDP\t115210\n盐城站\t115211\n7月8月\t115212\n前編\t115213\n深圳广播电影电视集团\t115214\n成都武侯\t115215\n慰问费\t1152
16\n性爱狂想曲\t115217\n温商贷\t115218\n大义\t115219\n十部\t115220\n韭菜子\t115221\n快讯\t115222\n奇迹:最强者\t115223\ncontinuity\t115224\n防撞杆\t115225\n春秋旅行社\t115226\n静静地\t115227\n三万年\t115228\n陈斌宇\t115229\nbooth\t115230\nCRC16\t115231\n微能力者\t115232\n伦艺\t115233\n太湖湾\t115234\n郭沫史泰龙\t115235\n深夜食堂\t115236\n生日宴\t115237\n健康险\t115238\n拭\t115239\n广灵吧\t115240\n东京市\t115241\n华隆\t115242\n赵欣颖\t115243\n日日顺\t115244\nqpixmap\t115245\n红雾\t115246\n摺\t115247\n展眉\t115248\n婷风\t115249\nGreater\t115250\n黑玛丽\t115251\n万松书院\t115252\n龙票\t115253\n魔耳\t115254\n天津钢管集团股份有限公司\t115255\nelevate\t115256\n佳彩\t115257\n绍兴网\t115258\nn年\t115259\nMiflash\t115260\n回流\t115261\n超级女声\t115262\n三四点\t115263\n日报\t115264\n模特经纪公司\t115265\n天祝藏族自治县\t115266\n祝酒词\t115267\n股票质押回购\t115268\n第4名\t115269\nchoo\t115270\n育龄\t115271\n我爱你中国\t115272\n多音\t115273\na座\t115274\n西安网络广播电视台\t115275\nlibnfc\t115276\n学位网\t115277\n学习观\t115278\n易佰店\t115279\nH100i\t115280\nGermany\t115281\n轰龙\t115282\n张志窦骁\t115283\n犍陀罗\t115284\n天乐树\t115285\n赞同\t115286\n阿特拉斯科普柯\t115287\nata\t115288\n追凶者也\t115289\n下学\t115290\n总决赛\t115291\namd吧_\t115292\n胡三\t115293\nHerald\t115294\n翼企\t115295\nfacenet\t115296\n1285\t115297\n38.3\t115298\n/P\t115299\n小马宝莉\t115300\nShe\t115301\n一亿美元\t115302\n嘎查\t115303\n退居二线\t115304\n钓鱼台\t115305\n染色液\t115306\n7亿\t115307\n上海猎聘网\t115308\n墨鸦\t115309\nSpr\t115310\n只得\t115311\n李芳芳\t115312\nturned\t115313\n死人谷\t115314\n布鲁斯·威利斯\t115315\n咸辉\t115316\n一直在路上\t115317\n子文\t115318\nSpeex\t115319\n摇臀\t115320\nlinux6\t115321\n陈怡\t115322\n莎木\t115323\n企事\t115324\n输水管\t115325\n平面应力\t115326\n谐\t115327\n亲密照\t115328\n第十八条\t115329\n摩天岭\t115330\n皇家大道\t115331\n具体性\t115332\nABO溶血\t115333\n李文杨\t115334\n京粮控股\t115335\n濉溪\t115336\n泾川\t115337\n想得到\t115338\n几孔\t115339\nRecreation\t115340\nv4.12\t115341\n焦点科技股份有限公司\t115342\n连杆\t115343\n于晓\t115344\n杨惠\t115345\n望闻问切\t115346\nエロ動\t115347\n晋网\t115348\ngter\t115349\n大彬\t115350\n标致301\t115351\n云南镇\t115352\n采砂船\t115353\n纪昌\t115354\nel-table-column\t115355\n万古神帝\t115356\nDE\t115357\n圣文森特\t115358\n薰衣草\t115359\n两波\t115360\n林家栋\t115361\n巫灵\t1153
62\n夜生活女王之霞姐传奇\t115363\n雅博\t115364\n懈怠\t115365\n信长之野望15\t115366\n华芯通\t115367\n永平街\t115368\n小米5c\t115369\n迈拓\t115370\n猴爪\t115371\n广西广播电视大学\t115372\n外媒\t115373\n72%\t115374\n追根溯源\t115375\n联合广场\t115376\nWindwos\t115377\n小鲤鱼\t115378\nYY音乐社区\t115379\n灰椋鸟\t115380\n峰不二子\t115381\n维景\t115382\nFreaks\t115383\n大龙村\t115384\n七色网\t115385\n精勤\t115386\n疯癫\t115387\nworking\t115388\n6个半月\t115389\n中国的和平发展\t115390\n风流人生\t115391\n墨笔\t115392\n办公\t115393\n奸邪\t115394\nkhz\t115395\nFlowplayer\t115396\n充错\t115397\n雷克萨斯CT\t115398\nBristol\t115399\n逆转录酶\t115400\nbilibili卡\t115401\nsetopt\t115402\nCompound\t115403\n抽取\t115404\nRod\t115405\n发现神行论坛\t115406\n换牙期\t115407\nseed吧_\t115408\n第2\t115409\n改商\t115410\n歌乐山\t115411\nhydac\t115412\n资产负债率\t115413\n绝地求生大逃亡\t115414\nOLED屏幕\t115415\nmemoq\t115416\n州级\t115417\n险企\t115418\n乔尔\t115419\nastronaut\t115420\n杨浦区\t115421\n田黄石\t115422\n缔结\t115423\nEmacs\t115424\nBDrip\t115425\n稳固\t115426\n宣桥镇\t115427\nr9splus\t115428\n行尸走肉\t115429\n点花\t115430\nPUR\t115431\nMoS2\t115432\n轴流泵\t115433\n裏\t115434\n星座女\t115435\n技术监督局\t115436\np16\t115437\n大鞋\t115438\n大屯\t115439\n眼肉\t115440\nruling\t115441\n600528\t115442\n盘管机\t115443\n众合科技\t115444\ncorres\t115445\n60集\t115446\n奥体公园\t115447\nstuio\t115448\n赚果果\t115449\nfox\t115450\n辅分离\t115451\n香郡\t115452\n南洲\t115453\n有生以来\t115454\nfruit\t115455\nvt\t115456\n世界第一初恋\t115457\n换壳\t115458\n福中\t115459\n腊排骨\t115460\n作词\t115461\n白城\t115462\n列管式\t115463\n缺失\t115464\nunbalanced\t115465\n吴老狗\t115466\n赵邦\t115467\n难忘\t115468\n制冰机\t115469\nGridPanel\t115470\nmaxiongying\t115471\n屡教不改\t115472\n老年\t115473\n米林\t115474\n山西省人民政府法制办公室\t115475\n旺苍人民政府\t115476\n移动宽带吧\t115477\n排病\t115478\niws\t115479\n右舵\t115480\nvoluntary\t115481\n常用句\t115482\n白砂\t115483\n超导材料\t115484\n携程商旅\t115485\n泸水\t115486\n陈彦\t115487\nLoong\t115488\n打嗝\t115489\n160515\t115490\n健博会\t115491\n毕业后\t115492\n金观涛\t115493\n整场\t115494\n非遗展\t115495\n上原瑞穗\t115496\n星历\t115497\n断桥铝合金门窗\t115498\nUFS\t115499\n禾花雀\t115500\n茫崖\t115501\n中海石油气电集团有限责任公司\t115502\nTx\t115503\n引物二聚体\t115504\n岭头\t11
5505\n新视点\t115506\n滑步车\t115507\n孵育\t115508\n参照系\t115509\n截面惯性矩\t115510\n血咒\t115511\n民德\t115512\n三国杀国战吧\t115513\n死守\t115514\n蓝湖月崖\t115515\n重庆航天职业技术学院\t115516\n九亭大街\t115517\n不得不懂\t115518\nLAND\t115519\n激化\t115520\n范斯沃斯\t115521\n悬棺寺\t115522\n500平米\t115523\nituens\t115524\n五里\t115525\n露得清\t115526\nailed\t115527\n青森县\t115528\n内套\t115529\n大行p8\t115530\n草原上升起不落的太阳\t115531\n博德之门2\t115532\nDisc\t115533\n成都市律师协会\t115534\nLumia\t115535\n引脚\t115536\n河北省政府\t115537\n李俊伟\t115538\n启明\t115539\n布洛克·莱斯纳\t115540\n乔治华盛顿大学\t115541\nsda\t115542\n像不像\t115543\n20160409\t115544\n六轮\t115545\n小雕\t115546\n达也\t115547\nChangeLog\t115548\n下月底\t115549\n待客\t115550\n浅酌\t115551\n中国园林博物馆\t115552\n路桥新闻网\t115553\nworkercn\t115554\n2018年2月18日\t115555\n小本创业网\t115556\n17zwd.com\t115557\n受欢\t115558\n1.2G\t115559\n公文写作\t115560\nMicrosoftOffice2010\t115561\n楼村\t115562\n龙圩区\t115563\n苏州公园\t115564\n前列腺增生症\t115565\n中鸣人\t115566\n一百条\t115567\n保卫战\t115568\n16L\t115569\nTabbar\t115570\n罗米\t115571\nformerly\t115572\n胜利村\t115573\nsvs\t115574\n中航重机\t115575\n管制员\t115576\n江米条\t115577\n淘金热\t115578\n2016-10\t115579\nv5.2.1\t115580\n中空板箱\t115581\n女春\t115582\n春梦了无痕\t115583\n水清路\t115584\n女儿奴\t115585\n预备党员思想汇报\t115586\n十|\t115587\n55海淘\t115588\n凌美\t115589\n内板\t115590\nzhu\t115591\n李寅\t115592\n奈良美智\t115593\n讲解\t115594\n裤袋\t115595\n嘿\t115596\n1479\t115597\n潘玮柏\t115598\n上拉\t115599\n日语语法\t115600\nPageOffice\t115601\nA4L\t115602\n广东医学院\t115603\n假牌\t115604\nbk\t115605\n木村\t115606\ngpu\t115607\n伊川\t115608\n麻线\t115609\n豪华型\t115610\nxmr\t115611\n养心菜\t115612\n_变声软件\t115613\n淘宝助理5\t115614\nwhiskey\t115615\n仲良\t115616\n三元锂电\t115617\n赵英\t115618\n二硫\t115619\n大河票务网\t115620\n萤石C2C\t115621\n帕累\t115622\n淀粉类\t115623\n流逝\t115624\n九球天后\t115625\n滨州经济开发区\t115626\n供述\t115627\n呼气\t115628\nKRAS\t115629\n惊天大贼王\t115630\nexcluded\t115631\n海纳\t115632\n词头\t115633\n楪祈\t115634\n三曰\t115635\n达坂城区\t115636\njackson\t115637\n三胞\t115638\n几_万\t115639\n忠狗\t115640\n康卡斯\t115641\n勇武\t115642\n2365\t115643\n活剥\t115644\n舒张期\t115645\n城居\t115646\n港南区\t115647\n群访\t115648\n反差色\t
115649\n华磊\t115650\n芝\t115651\n谢晋元\t115652\n真善美的小世界\t115653\n自便\t115654\n2011-2017年\t115655\n新西\t115656\n芦淞区\t115657\n悬浮\t115658\nmysql数据库服务器\t115659\n锅包\t115660\n刘仁娜\t115661\n顺城巷\t115662\n朱高煦\t115663\n说再见\t115664\n屏峰\t115665\n铁血刺客\t115666\n最终幻想5\t115667\n不可多得\t115668\n无锡农村商业银行\t115669\nchore\t115670\neal\t115671\nkangjie\t115672\nHTMLElement\t115673\n两圆\t115674\n婉婷\t115675\n29位\t115676\n尚可\t115677\n遥惠美\t115678\n5550\t115679\n宜兴竹海风景区\t115680\n浅识\t115681\n联播\t115682\nPLASTIC\t115683\n4340\t115684\n横店镇\t115685\njl\t115686\n广东科学中心\t115687\nCS6|PhotoShop\t115688\n3198\t115689\n秒\t115690\n9d\t115691\n工业危险废物\t115692\n金刚晶\t115693\n李潇\t115694\n唐兴\t115695\n基格尔德\t115696\n恒温恒湿\t115697\n短信猫\t115698\n天涯侠医\t115699\n厂方\t115700\n吉祥村\t115701\nnickname\t115702\nWinclone\t115703\n精炼炉\t115704\n星报\t115705\n自动\t115706\nPoly\t115707\n优柔寡断\t115708\ndescribe\t115709\n魔域杂谈\t115710\nGhost安装器\t115711\n诗圣杜甫\t115712\n淫乐\t115713\n橡胶助剂\t115714\n粘度仪\t115715\n天池山\t115716\n药引\t115717\n小米平板1_MIUI论坛\t115718\n生长点\t115719\nfsimage\t115720\n七八十岁\t115721\n网事\t115722\n火兔\t115723\n东京\t115724\n自动售票机\t115725\n神医弃女:鬼帝的驭兽狂妃\t115726\nDocker-Compose\t115727\n氯乙烷\t115728\n深圳市人民医院龙华分院\t115729\n坝\t115730\nze\t115731\n1911\t115732\n立淘\t115733\n娜娜\t115734\n75岁\t115735\n秦义绝\t115736\n123号\t115737\n单线\t115738\n致远_\t115739\nSPE\t115740\nsle\t115741\n新亨镇\t115742\n4月18\t115743\ncp5s1\t115744\n新潮\t115745\n颍泉区人民政府\t115746\n导纳\t115747\n复电\t115748\nvolatile\t115749\n长段\t115750\n岑村\t115751\n华强\t115752\n纽拉\t115753\n防晒衣\t115754\n精诚所至\t115755\n市总工会\t115756\n美强\t115757\n文昌星\t115758\n中央民族歌舞团\t115759\ncmef\t115760\n比特大雄\t115761\n洗耳恭听\t115762\n沃尔沃XC40\t115763\n灵狐者\t115764\n貂蝉\t115765\n刀轨\t115766\n国剧\t115767\n桑塔纳\t115768\n西里\t115769\nTcl\t115770\n防腐木工程\t115771\n佛罗\t115772\nConcepts\t115773\n半导体制程\t115774\n34栋\t115775\n中东市场\t115776\n拖影\t115777\n张小素\t115778\nUI设计\t115779\n约吧大明星\t115780\n月河\t115781\n田禾\t115782\n驱\t115783\n羊脂白玉\t115784\n卷帘门\t115785\ncontenteditable\t115786\n湛江晚报\t115787\n王菁\t115788\nJenna\t115789\ncortex-m3\t115790\n海思科\t115
791\n1905电影网\t115792\n死亡之屋2\t115793\n登记费\t115794\n60目\t115795\n虚函数表\t115796\nbarco\t115797\n佶\t115798\n一终\t115799\n林子雨\t115800\ne431\t115801\n波士顿龙虾\t115802\nSHAFT\t115803\n从何说起\t115804\n纳米管\t115805\n夙兴夜寐\t115806\nRT-Thread\t115807\n排行一览\t115808\n找东西\t115809\n驾驶座\t115810\n第一书\t115811\n认缴\t115812\nMakiyo\t115813\n武汉大学历史学院\t115814\n吴晓斯嘉丽约翰逊\t115815\n陵川\t115816\n进击\t115817\nCartoons\t115818\n樱奇\t115819\n动感型\t115820\nAccord\t115821\n品牌影响力\t115822\n广东南方职业学院\t115823\n仔仔网\t115824\n卡彭\t115825\nmanager\t115826\n7000美元\t115827\n女圣职者\t115828\n聆听者\t115829\n大哥大\t115830\n薛涛\t115831\nLouis\t115832\n人福医药\t115833\n嘉能可\t115834\n防弹板\t115835\n赵小姐\t115836\n双座\t115837\n汲魂\t115838\n硬顶\t115839\n商物\t115840\n重庆江\t115841\nST锐电\t115842\nServer端\t115843\n多孔板\t115844\n苏派\t115845\n小例\t115846\n消防稳压泵\t115847\nUGUI研究院\t115848\n8024402\t115849\n南联\t115850\n多玛\t115851\n长途汽车\t115852\n谜团\t115853\n麦芽\t115854\n地肤子\t115855\n生活日报\t115856\n综上所述\t115857\n华润雪花啤酒\t115858\nwin10edge浏览器\t115859\nchoker\t115860\n黑松镇\t115861\n25美分\t115862\n邢帅\t115863\n银梦\t115864\n453\t115865\n黄金周\t115866\n奔逃\t115867\n狮子岛\t115868\n堤顶\t115869\na4yy\t115870\nv1.1安卓\t115871\n語版\t115872\n零濡鸦\t115873\n0.几\t115874\n第164章\t115875\nDiodes\t115876\nflashfxp\t115877\n嘉兴在线\t115878\n巨ru\t115879\nzon\t115880\n波塞冬\t115881\nlala吧\t115882\n罗孚\t115883\n辛酸路\t115884\nconsultation\t115885\n云集\t115886\nSpike\t115887\nWin8/Win10\t115888\n黑百合\t115889\n气穴\t115890\ng620\t115891\n抽薹\t115892\n柯村\t115893\nPPT母版\t115894\nlg\t115895\nDurex\t115896\n保利克洛维\t115897\n高碑店市\t115898\n扎堆\t115899\n潘家园\t115900\nY53\t115901\n联能\t115902\n诗论\t115903\n白石瞳\t115904\n跳过\t115905\nsd敢达吧\t115906\n脾胃\t115907\nMP236\t115908\n学校食堂\t115909\n真假鉴别法\t115910\n乐之者\t115911\nPacman\t115912\n慕名\t115913\n重庆旅游问答\t115914\n招商卡\t115915\n藏趣\t115916\n余家辉\t115917\n柚子花\t115918\n20170606\t115919\n百两\t115920\nhym\t115921\nDateBox\t115922\nSIP协议\t115923\n东莞光明中学\t115924\n十三届全国人大一次会议\t115925\n黄麓镇\t115926\n李好好\t115927\n协查\t115928\n外汇汇率\t115929\n福建省投资开发集团有限责任公司\t115930\nVolley\t115931\nHu知非\t115932\nArcaea\t11
5933\n20160215\t115934\n建机\t115935\n桩柱\t115936\n退赃\t115937\nVernon\t115938\n黑齿\t115939\nA.5\t115940\n次氯酸钠\t115941\n熏\t115942\n青山刚昌\t115943\n一站式\t115944\n不遗余力\t115945\n8890\t115946\n浙政\t115947\n长春地区\t115948\nBaoGaoSu\t115949\n男生性\t115950\n丽发新城\t115951\nHS编码\t115952\n暴走撸啊撸\t115953\n太和县\t115954\n背斜\t115955\n二0\t115956\n闯天涯\t115957\n奋乃静\t115958\n2780\t115959\n煮糊了\t115960\n5.5折\t115961\n楼顶\t115962\n广州市越秀区政府\t115963\n私我\t115964\n天梭\t115965\n欢乐斗地主四月残局\t115966\n5.8mm\t115967\nDayZ\t115968\n严私德\t115969\n168商务网\t115970\n22话\t115971\n140方\t115972\nファイル\t115973\n满腔\t115974\n极为\t115975\nz900\t115976\n108sq.com\t115977\n妮娜\t115978\n投过\t115979\n无知\t115980\n徐州市政府\t115981\nRGBA\t115982\n勒马\t115983\n徐经理\t115984\n高盛集团\t115985\n医疗兵\t115986\n兔绒\t115987\n凯迪拉克atsl\t115988\n极地海洋馆\t115989\n阔剑\t115990\nRFM\t115991\n加薪\t115992\n灭队\t115993\n桃花姬阿胶糕\t115994\n散发\t115995\n120健康网\t115996\n索拉尔\t115997\n数据卡\t115998\n总队长\t115999\n大富豪\t116000\nfrd\t116001\n滕州市中心人民医院\t116002\n药芯\t116003\n闫盼\t116004\n华光路\t116005\np++\t116006\n广州番禺\t116007\n民间文学\t116008\n狂骨\t116009\n珍妮弗\t116010\n露屄\t116011\ngoogel\t116012\n5伏\t116013\n米卡成长天地\t116014\n福建政府网\t116015\nsheel\t116016\n滨海地区\t116017\n沃尔核材\t116018\n07073h5\t116019\n陈敏\t116020\n六月二十七日\t116021\nsl400\t116022\n六十厘米\t116023\n1887年\t116024\n水果味\t116025\nniao\t116026\ncapri\t116027\n转珠\t116028\nccz\t116029\n国界\t116030\n李英杰\t116031\nbtb\t116032\n中国农业科学院作物科学研究所\t116033\n顺企网\t116034\n南京农业大学\t116035\nSJM\t116036\n吉神\t116037\n成一片\t116038\n燕京啤酒\t116039\n389\t116040\n逸景\t116041\n南宫\t116042\n变电箱\t116043\n满室\t116044\nScars\t116045\n数字签名算法\t116046\nPowerShot\t116047\n383\t116048\n3969\t116049\nEXCEL版\t116050\nTechnological\t116051\n玫瑰\t116052\n德孝\t116053\nUsa\t116054\n伏数\t116055\n东湖港\t116056\n投影仪\t116057\nOpenssl\t116058\n斯乌\t116059\n施瓦沈南鹏\t116060\n看雪\t116061\n不学无术\t116062\n帕卡\t116063\n东门步行街\t116064\n我的爱\t116065\nEnvironmental\t116066\n第58章\t116067\n空车\t116068\n陈新\t116069\n水培吊兰\t116070\n箱包\t116071\nNIGHT\t116072\nSolenoid\t116073\n绿龟\t116074\n苏州市旅游局\t116075\n西瓜籽\t116076\n营养袋\t11
6077\n句段\t116078\n凉虾\t116079\n极弹加速器\t116080\n免疫规划\t116081\n中国服装人才网\t116082\n共沸\t116083\n物联网产业\t116084\n103平米\t116085\n大连开发区\t116086\n光山县人民政府\t116087\n矫姿\t116088\n风绳\t116089\n2017年7月4日\t116090\n火星小说网\t116091\n硫酸氨基葡萄糖胶囊\t116092\n理性认识\t116093\n曾祖父\t116094\n改性剂\t116095\n土蜂蜜\t116096\n苍山月\t116097\n巨金\t116098\n女校\t116099\n年历\t116100\n战祸\t116101\n灰麻\t116102\n长颈龙\t116103\n中百商网\t116104\n柠檬鱼\t116105\n河北国税\t116106\nOV7670\t116107\nmsk\t116108\n石城县\t116109\n体奶\t116110\n空中浩劫吧\t116111\n胡总\t116112\n欧洲杯\t116113\nOKAY\t116114\nchoosing\t116115\n花瓷\t116116\n暴食\t116117\n茵栀黄\t116118\n孙侦探\t116119\n氧氟沙星滴眼液\t116120\n闲谈\t116121\n狂热版\t116122\n无缝方管\t116123\n本质\t116124\n51talk\t116125\n血氨\t116126\njpare\t116127\n洲际集团\t116128\nbillbill\t116129\nzbs\t116130\n买股\t116131\n微电影\t116132\n婷婷色情网\t116133\n新加坡市\t116134\n2月29日\t116135\n大众_迈腾\t116136\n欧惠\t116137\n纵行\t116138\n2017年8月16日\t116139\n杠杆式\t116140\n小茶\t116141\nCardboard\t116142\n北京市科委\t116143\n新奥\t116144\n15.4寸\t116145\n盯盯拍mini2\t116146\n顾源\t116147\n嫂子的诱惑\t116148\n慢品\t116149\nM104w\t116150\n随任\t116151\n宏晶科技\t116152\n7e.hk\t116153\n糖友\t116154\ncta\t116155\n垅\t116156\ndxd\t116157\nrigou\t116158\n数据库索引_\t116159\nAV女忧\t116160\n能力\t116161\n时代教育\t116162\ndamaged\t116163\nSTORIES\t116164\n高尔夫7论坛\t116165\n屏南\t116166\n香格里拉花园\t116167\nasynchttp\t116168\n99路\t116169\n创行\t116170\n歌狂\t116171\n给水管网\t116172\n骨刺\t116173\nStepBy_fx\t116174\n胃镜室\t116175\nHaut\t116176\npledge\t116177\n正本\t116178\n无声\t116179\n0xc0000022\t116180\n深圳市文体旅游局\t116181\nesports\t116182\n9部\t116183\n唐僧\t116184\n张宏森\t116185\nSlice\t116186\n开新\t116187\n开选\t116188\n改改\t116189\n670\t116190\nChicks\t116191\n10.2寸\t116192\n刘娅希\t116193\n矫情\t116194\nty66\t116195\n玛丽奥\t116196\nSPECTRE\t116197\n装修工\t116198\nnuxtjs\t116199\n搜客网\t116200\n自驾游论坛_汽车之家论坛\t116201\npycharm2018\t116202\nk670d\t116203\n空竹\t116204\n致炫论坛_汽车之家论坛\t116205\n烂醉\t116206\n第3层\t116207\n奉天街\t116208\n猪油渣\t116209\nstellar\t116210\n陆琴\t116211\n省交通厅\t116212\n三星s4\t116213\n踏梦者\t116214\n9511\t116215\nopencv库\t116216\n平尺\t116217\n中大网校考试\t116218\
n主动防护网\t116219\n卡尔卡西\t116220\n长干\t116221\nhertz\t116222\nservle\t116223\n袁国顺\t116224\nsvr\t116225\nd4\t116226\n粉身\t116227\n房管处\t116228\n自动车\t116229\n5v\t116230\n羊羔毛\t116231\n新华都商学院\t116232\n王玉普\t116233\n磁力棒\t116234\n国家外汇管理局综合司\t116235\n四个多月\t116236\nHexo\t116237\n开题报\t116238\ndoms\t116239\n佛教用品网\t116240\n轻剑\t116241\n狱政\t116242\n金盾\t116243\n物业管理费\t116244\n周叶\t116245\n进程端口号\t116246\n奥睿科\t116247\n指间恋\t116248\n电热管\t116249\n地钉\t116250\n昆岩池\t116251\n小结节\t116252\n鸡肋\t116253\n农家乐/周边游\t116254\n新视界\t116255\n陈林\t116256\n维拉克斯\t116257\n繁荣期\t116258\n火影迷\t116259\n勒庞\t116260\ncmwap\t116261\nindustry\t116262\n6324\t116263\n华中科技大学经济学院\t116264\n东升村\t116265\n中国仪器仪表行业协会\t116266\n65部\t116267\nmiracast\t116268\nlareina\t116269\ntanner\t116270\n外壳式断路器\t116271\n汇洁股份\t116272\n生化培养箱\t116273\n1925年\t116274\n老坛酸菜鱼\t116275\n澎湖湾\t116276\n逆浪\t116277\n张照\t116278\n画版\t116279\n南京市总工会\t116280\n固性\t116281\n天下大事\t116282\n堕落色戒\t116283\nhdt\t116284\n伽途ix5\t116285\n启动类\t116286\n传射\t116287\n刘忠良\t116288\n局部地区\t116289\n院徽\t116290\n幻想三国志5_幻想三国志5\t116291\n印鉴\t116292\nDorm\t116293\n蔓迪\t116294\n三转\t116295\n叶类\t116296\n勃林格殷格翰\t116297\n铝屑\t116298\n行军\t116299\n汉仪菱心体简\t116300\n万伟\t116301\n北极虾\t116302\nf370\t116303\n曲阜\t116304\n机油门\t116305\n可溶\t116306\n福建省水利厅\t116307\ndaba\t116308\n履卦\t116309\n周浦花海\t116310\n苏仙岭\t116311\n11.0.5\t116312\n半月痕\t116313\n李文世\t116314\n土卫六\t116315\n60000公里\t116316\n新东区\t116317\nBOM头\t116318\nhonoka\t116319\n河北大学附属医院\t116320\n模范夫妻\t116321\n2.92\t116322\n努比\t116323\nunto\t116324\n挤压机\t116325\n主君\t116326\n十五年等待候鸟\t116327\n洞头县\t116328\n邮政手机银行\t116329\nP16\t116330\n警魂\t116331\ncoursera\t116332\nWindows许可证\t116333\n步知公\t116334\nHaruka\t116335\n阿里云中央\t116336\n青蒜\t116337\ndevoted\t116338\nPKPM2010\t116339\n桔钓沙\t116340\n最小数\t116341\n苏小姐\t116342\n奥美\t116343\n上海锦江饭店\t116344\nflax\t116345\n振动机\t116346\nASP\t116347\nSubjects\t116348\n宜城网\t116349\n申特\t116350\n上海贝岭\t116351\nbirth\t116352\n甘雨路\t116353\n羞怯\t116354\n垂头\t116355\n又见一帘幽梦\t116356\npiaget\t116357\ndominance\t116358\n逃避\t116359\n委令\t116360\n如同\t1163
61\n肩胛骨处\t116362\n闷盖\t116363\n黛安·克鲁格\t116364\n幻想嘉年华\t116365\n糖酵解\t116366\nmacau\t116367\n撬锁\t116368\n工作安\t116369\n琅琊榜风起长林\t116370\n操作人\t116371\n朴孝敏\t116372\n铜马\t116373\n柯桥区\t116374\n孟祥远\t116375\n东风村\t116376\n游戏本\t116377\n20160612\t116378\n邵阳北\t116379\noptifine\t116380\n江苏大厦\t116381\n上方谷\t116382\n湮\t116383\n越冬\t116384\n黑崎一护\t116385\nPrim\t116386\n宋佳\t116387\n39路\t116388\n7平米\t116389\n18047\t116390\n全球\t116391\n华易算命网\t116392\n全果\t116393\nNZS\t116394\n拉斯帕尔马斯\t116395\n温如言\t116396\n迈上\t116397\n逆战吧\t116398\npei\t116399\n太子河\t116400\n沈小岑\t116401\n异形柱\t116402\nnow\t116403\n肥皂泡\t116404\n布达拉\t116405\n新义州\t116406\n补号\t116407\nlicensed\t116408\n15.2\t116409\n塔克\t116410\nOUTLOOK2010\t116411\n美式咖啡\t116412\n跟班\t116413\n工控栏目_机电之家网\t116414\n15.2%\t116415\n金口\t116416\nclasp\t116417\nExif\t116418\n斯特凡\t116419\n藻片\t116420\n德州职业技术学院\t116421\n研究生\t116422\n重修\t116423\n咏梅\t116424\n法恩莎\t116425\n热传导系数\t116426\n林记\t116427\nfm17\t116428\n虹光\t116429\n1463\t116430\n识别码\t116431\n中枢性\t116432\n萨列里\t116433\n笄礼\t116434\nandrodi\t116435\n珀尔修斯\t116436\n明晃晃\t116437\n雷克萨斯ES200\t116438\n自清\t116439\n气候箱\t116440\nrange函数\t116441\nlibm\t116442\n市北初级中学\t116443\n草酸钠\t116444\n企财险\t116445\nぬ\t116446\nYang_Guang\t116447\nyinyuetai\t116448\nkx\t116449\n动漫剧\t116450\n1.4G\t116451\n交汇\t116452\n做种\t116453\n穷富\t116454\n电子化\t116455\n20年间\t116456\n水磨石\t116457\nInsulation\t116458\nr级\t116459\n精典\t116460\n胡星\t116461\n国庆黄金周\t116462\n享瘦\t116463\n致富158网\t116464\nDenseNet\t116465\n总格\t116466\n匡衡\t116467\nMikuMikuDance\t116468\n仇东升\t116469\n中国信通院\t116470\n5910\t116471\n博易云\t116472\n2h\t116473\n浓墨\t116474\n李佳升\t116475\n铜陵站\t116476\n樱花杯\t116477\n切换器\t116478\n拼音字母表\t116479\n皮尔逊相关系数\t116480\ncss3选择器\t116481\n装饰\t116482\n沪C\t116483\n大士\t116484\n仙洋\t116485\n第十一批\t116486\nmonoclonal\t116487\n哼哼\t116488\n三魂\t116489\n斗齿\t116490\n铝合金管\t116491\n黄河\t116492\n中国商业地产\t116493\n系统还原\t116494\nAki阿杰\t116495\nproceeding\t116496\n重定义\t116497\n哈弗h2s\t116498\n都市淫狐传\t116499\nMotif\t116500\nBRAVIA\t116501\n摩卡壶\t116502\n虹桥街道\t116503\n浮球式\t116504\n封神英杰传\
t116505\n间甲酚\t116506\nskirt\t116507\n拉美国家\t116508\n6.3%\t116509\n猛犸君侯\t116510\n水刀拼花\t116511\n第二十九期\t116512\n瑞景商业广场\t116513\n所辖\t116514\nralink\t116515\n数粒\t116516\n营业点\t116517\n人力资源管理师考试\t116518\n阿尔托利亚\t116519\n势利眼\t116520\n微选\t116521\n2013.4\t116522\n探伤剂\t116523\nFUDAN\t116524\nTPimage\t116525\n湘潭市人民政府\t116526\nDJ舞曲\t116527\n栗木\t116528\n曺圭贤\t116529\nReport\t116530\n美妆蛋\t116531\n绍兴酒\t116532\n2.7.12\t116533\nSalesforce\t116534\n北欧女神传\t116535\n2.5.16\t116536\n埋弧\t116537\n山西省纪委\t116538\n申请方\t116539\n爱卿\t116540\n孙友田\t116541\n博看小说网\t116542\n美国迪士尼公司\t116543\n火药味\t116544\n激趣\t116545\n季戊四醇\t116546\n福建省\t116547\n八岁\t116548\n基牙\t116549\n往来对账\t116550\ngen10\t116551\nHOYA\t116552\n访谈稿\t116553\n邯郸职业技术学院\t116554\n神秘岛\t116555\n李新宇\t116556\n西坝河\t116557\n因特尔\t116558\n生态农业\t116559\nvpay\t116560\n橡胶管\t116561\n焊膏\t116562\n三国群英传8\t116563\n秦无炎\t116564\n2排\t116565\n智富\t116566\n胭脂盒\t116567\n芽\t116568\n脱机\t116569\n外快理财\t116570\n修鞋姑娘\t116571\n赵晓生\t116572\nRedchar\t116573\n百花公园\t116574\n泡爪\t116575\n血债\t116576\n电子货币\t116577\nNetSuite\t116578\n吉吉影院\t116579\nVerified\t116580\ndwt\t116581\n行政记过处分\t116582\n原研哉\t116583\n冷镀锌\t116584\n網頁\t116585\n双翼\t116586\n20股\t116587\n许子东\t116588\n张跃飞\t116589\n属性页\t116590\n魂人\t116591\npremie\t116592\n郑政\t116593\n2015年4月25日\t116594\n弄错\t116595\n四散\t116596\n奥拓\t116597\n5R\t116598\n论证\t116599\n12种\t116600\n温布利\t116601\nkras\t116602\nRX480\t116603\n平行光\t116604\n扑哧\t116605\n黯然\t116606\n2012级\t116607\n1.0g\t116608\nroger\t116609\nXiao\t116610\n首项\t116611\n图源\t116612\n波雅汉库克\t116613\n大番薯网\t116614\n见面礼\t116615\n尖叫鸡\t116616\n开沃\t116617\n东方时尚驾校\t116618\n尬聊\t116619\nIKU\t116620\n理应\t116621\nroguelike\t116622\n96\t116623\nkurtosis\t116624\n丁集镇\t116625\nZTree\t116626\n斯嘉任正非\t116627\nNihaorz\t116628\n十万多\t116629\n天剑狂刀\t116630\n潼关\t116631\nDuty\t116632\n捕蚊器\t116633\n史陶比尔\t116634\n隔离器\t116635\n枇杷树\t116636\ns01\t116637\n短信验证码\t116638\n松果\t116639\nTBD\t116640\nBreaking\t116641\n结售汇\t116642\n天元镇\t116643\n微藻\t116644\n_天弘基金\t116645\nNURBS\t116646\n96平\t116647\n安瓶\t116648\n瓯海区\t116649\n
私法\t116650\nkla\t116651\n说真话\t116652\n陈姓\t116653\n教母\t116654\nAuxiliary\t116655\n180号\t116656\n渭南高新区\t116657\n冥思苦想\t116658\n王健林\t116659\n魔芋丝\t116660\n十八周年\t116661\n海绵\t116662\nwindows服务器\t116663\n安徽省环境保护厅\t116664\n灰石\t116665\n大调\t116666\n可试\t116667\n激励性\t116668\nAPNS\t116669\n皮管\t116670\n搜狗联盟\t116671\n酷图\t116672\n二方\t116673\ncontacted\t116674\n新华联集团\t116675\n烈焰燃情\t116676\n学前\t116677\n邵阳市\t116678\n绍兴\t116679\n建发珑玥湾\t116680\n化名\t116681\n【书旗小说阅读器\t116682\n中国航天科技\t116683\n亚山\t116684\n7510\t116685\n皮蛋\t116686\n幻樱字幕组\t116687\n飞越老人院\t116688\nRX1\t116689\nAxureRP\t116690\n仿制药\t116691\n制造业\t116692\n人教版小学五年级英语\t116693\n大产证\t116694\n2015-08-08\t116695\n37首\t116696\n有体\t116697\n嘉辉\t116698\n歌手2017\t116699\n红疹\t116700\n东密\t116701\n四里河\t116702\n米制\t116703\nIHS\t116704\n三亚天涯海角\t116705\n百宝\t116706\n层级\t116707\n告死\t116708\n奋斗史\t116709\ngates\t116710\n可行性\t116711\n串串香\t116712\nWKT\t116713\n8.5%\t116714\nune\t116715\n别吵\t116716\n一屁股\t116717\n东京食尸鬼金木\t116718\n咖啡粉\t116719\n大拇指\t116720\npvalue\t116721\nWarren\t116722\nMistake\t116723\n李梅\t116724\n4.25\t116725\n腾飞路\t116726\n0-3岁\t116727\n古利特\t116728\nYork\t116729\n上海博物馆\t116730\nb2b电子商务网\t116731\n商砼\t116732\n汽车制造业\t116733\n版组\t116734\nGx\t116735\n宾川\t116736\n工业类\t116737\n叶榕\t116738\n炼体\t116739\n成单\t116740\n豆腐屋\t116741\n静息仙玉\t116742\n打援\t116743\nmynba2k18\t116744\n摇摇车\t116745\n10.00\t116746\n猫饼\t116747\n关联规则\t116748\n决对\t116749\n大山楂丸\t116750\n续重\t116751\n5万元\t116752\n天诛3\t116753\n1.5.12\t116754\n入学率\t116755\njumped\t116756\n五月电影院\t116757\n试业\t116758\n真水无香\t116759\n单配\t116760\n李卫国\t116761\n围城\t116762\n情生\t116763\nEYAN\t116764\nrct猜人\t116765\n神探狄仁杰2\t116766\n长江路\t116767\n罗技502\t116768\np20pro\t116769\n身人\t116770\n会计报表术语\t116771\n伪经\t116772\n老师\t116773\n传智播客\t116774\nFarms\t116775\n文控\t116776\nSurfaceBook\t116777\n盛宝银行\t116778\n83875115\t116779\n红色高棉\t116780\n易笔\t116781\nholmes\t116782\n熨\t116783\n银行理财-网贷天眼\t116784\n第243集\t116785\n胡小喃\t116786\n中国就业网\t116787\n令堂\t116788\nAGI\t116789\njunan\t116790\n填字\t116791\n印度半岛\t116792\n暗影精灵2pro\t116793\n程丽
华\t116794\nV4L2\t116795\n魔夜\t116796\nredistributable\t116797\nBurden\t116798\n灶台\t116799\n20140530\t116800\n母与子\t116801\n中航华府\t116802\n熔盐\t116803\ne2e\t116804\nmodelform\t116805\n宝山\t116806\n继\t116807\n波兰兹罗提\t116808\n不专\t116809\n第七十二章\t116810\n52生活网\t116811\n钟涛\t116812\n诛仙叁\t116813\n刘志宏\t116814\n杜邦\t116815\n0876\t116816\n研究版\t116817\n海帆\t116818\n仙四\t116819\n花堪折直须折\t116820\n600649\t116821\n宣城\t116822\n特病\t116823\n王爵\t116824\nkpa\t116825\n界面\t116826\n英语音标表\t116827\n8653\t116828\n探歌\t116829\n天猫魔盒2\t116830\nMailbox\t116831\n阳菜\t116832\n玛蒂娜\t116833\n颈圈\t116834\n残疾赔偿金\t116835\n软体\t116836\nrusty\t116837\n平抑\t116838\n年复一年\t116839\n几十位\t116840\n20150511\t116841\n鄂东网\t116842\n天赋异禀第一季\t116843\n李集\t116844\nH370\t116845\nassault\t116846\n踏花行花卉论坛\t116847\nPhy\t116848\n男队\t116849\n集群化\t116850\nestimator\t116851\n绝想日记网\t116852\n浙江教育考试院\t116853\n郭少\t116854\n美锦能源\t116855\n空运单\t116856\nsubline\t116857\nzhuan\t116858\n回延安\t116859\n撞鬼\t116860\n拍过\t116861\n几幢\t116862\ndaz\t116863\n170mm\t116864\n卓尼县\t116865\n北京住总集团\t116866\n霞浦街道\t116867\ncath\t116868\n人面桃花相映\t116869\n应考\t116870\n否命题\t116871\n终级幻想\t116872\nmaritime\t116873\n梦幻西游2017\t116874\n毛蛋\t116875\nSpade\t116876\n素组\t116877\n令郎\t116878\n湖南理工\t116879\n长顺\t116880\n谷歌地球专业版\t116881\nc++-C++\t116882\n潇然\t116883\n刘猛\t116884\n0.5mg\t116885\n罗技G27\t116886\ninformative\t116887\n票票\t116888\n中信戴卡\t116889\n穿戴式\t116890\n中国证监会\t116891\n修护\t116892\n曹原\t116893\n楚翘城\t116894\n兵役法\t116895\n1周\t116896\n扁桃\t116897\nStarlight\t116898\n創意\t116899\n回弹\t116900\n红外图\t116901\n人教版小学\t116902\n药山\t116903\nadmi\t116904\n灭魂\t116905\nTung\t116906\n卡拉扬\t116907\n问心无愧\t116908\n阿帕替尼\t116909\n缺素\t116910\n宽永通宝\t116911\n渐变字\t116912\n菊地\t116913\nGCE\t116914\nzeroc\t116915\nicourt\t116916\n舒适化\t116917\n栓\t116918\n坐北朝南\t116919\n锯骨机\t116920\nPresident\t116921\n萌王EX\t116922\n泰国正大集团\t116923\n中页\t116924\n收拢\t116925\n第107章\t116926\nidentifier\t116927\nresponsebody\t116928\n生成素\t116929\n逆矩阵\t116930\n袒胸\t116931\n批处\t116932\nshiy\t116933\nTop40\t116934\n李惠利医院\t116935\n网易论坛\t116936\n面
料展\t116937\n凌天佣兵\t116938\n告捷\t116939\n寝食\t116940\n热饭网\t116941\nRSRP\t116942\n大连湾\t116943\n亚翔\t116944\n拉螺栓\t116945\n乙酰苯胺\t116946\n事实婚姻\t116947\n36W\t116948\n这般人\t116949\n澳洲木瓜膏\t116950\n发言\t116951\n雷瓦\t116952\n沈肯尼\t116953\n蒸发\t116954\n无线吸尘器\t116955\n因此\t116956\n斗牛梗\t116957\n防盗门锁\t116958\n双微\t116959\nPACK\t116960\nnovels\t116961\nShadowsocksR\t116962\n机电类\t116963\n山东地税\t116964\n相爷\t116965\nES6\t116966\nYoho\t116967\n向塘\t116968\nsyk\t116969\n拜月教主\t116970\n乌克兰航空\t116971\n稻壳儿\t116972\nWEY\t116973\n无癌\t116974\n侠_车问网\t116975\n乐助贷\t116976\n首条\t116977\n同分\t116978\nendless\t116979\n2018年12月31日\t116980\n寺观\t116981\n御选\t116982\n科玛\t116983\n一世界\t116984\n周光平\t116985\n地下河\t116986\n581\t116987\n99朵\t116988\n少管\t116989\n陈亦迅\t116990\n暗黑宋威龙\t116991\n五有\t116992\n筹划\t116993\n城市之窗\t116994\ncocosstudio\t116995\n阳离子表面活性剂\t116996\nwiiu破解吧\t116997\n飞虎潜行极战\t116998\n108国道\t116999\n没收回\t117000\n圆瓶\t117001\n希引力*+\t117002\nAntennas\t117003\nKubernetes\t117004\n妖猫奥奇传说\t117005\nbri\t117006\n密谈\t117007\n生贺\t117008\n挑中\t117009\n玫珂菲\t117010\n凌渡论坛\t117011\n睫角守宫\t117012\n秒卡区\t117013\n葡7\t117014\n盘羊\t117015\n肇庆市政府\t117016\n声学\t117017\nTGA\t117018\n昆仑\t117019\n3D豪情\t117020\nWindstep\t117021\n20160526\t117022\n子女宫\t117023\n斜边长\t117024\n杭州动物园\t117025\n方圆\t117026\n王锐\t117027\n江津市\t117028\n中央政府采购网\t117029\n糯米糕\t117030\n人土\t117031\n朱老总\t117032\n分子链\t117033\n御景花园\t117034\n梅塞尔\t117035\n优艺星\t117036\n格斗家\t117037\n高能环境\t117038\n玺\t117039\n夜风\t117040\n搬运车\t117041\n2018.3.16\t117042\n菇菇\t117043\n龙湖九里晴川\t117044\n调料盒\t117045\n夏先生\t117046\n不请自来\t117047\nCathedral\t117048\n信箱\t117049\n管委会\t117050\n百度理财\t117051\nsuda\t117052\nWCP\t117053\n残情\t117054\n都市快报\t117055\n3号令\t117056\nhwp\t117057\n配重\t117058\nLinuxMint\t117059\n中电云集帮助中心\t117060\n喀秋莎录屏\t117061\ngeny\t117062\n仁青\t117063\ngard\t117064\n武器装备\t117065\n天天把歌唱\t117066\n海\t117067\n松任谷由实\t117068\n广州汽车\t117069\n供参考\t117070\n常州市第二人民医院\t117071\n人工鱼礁\t117072\n484\t117073\n财料\t117074\n采矿场\t117075\nQ6\t117076\n六周岁\t117077\n俾斯麦\t117078\nsf传奇\t117079\npeepla\t117080\n算料\t117081\n飞不
起来\t117082\n川甸街\t117083\n冠径\t117084\n白天不懂夜的黑\t117085\npolicies\t117086\n第二组\t117087\n正态分布_\t117088\n第17集\t117089\nmix2\t117090\n水泉沟\t117091\n三线仓鼠\t117092\n蹲式\t117093\nvds\t117094\n烤肉串\t117095\nТ\t117096\n23点\t117097\nDB33T\t117098\n过人\t117099\n重复\t117100\nProgressBar\t117101\n川商\t117102\n首席官\t117103\n三分类\t117104\n深圳城市管理局\t117105\n前厅\t117106\n太岁肉\t117107\nVAV\t117108\n5月19号\t117109\n中国图书馆图书分类法\t117110\n68000\t117111\nips屏\t117112\n量柱\t117113\nbirt\t117114\n溶脂针\t117115\n同安镇\t117116\n整式乘除\t117117\n05年\t117118\n三强\t117119\n第四十五\t117120\n防盗盒\t117121\n大功率led\t117122\ntons\t117123\nFormal\t117124\n美森\t117125\n崔秀英\t117126\nRadHat\t117127\nWWW.4422N.COM\t117128\n二手手机交易网\t117129\n200子\t117130\n6QF\t117131\n镇海楼\t117132\n肺淤血\t117133\n一万遍\t117134\n冰脉\t117135\nARW\t117136\nx-t\t117137\n茂源\t117138\n按图索骥\t117139\n食尸\t117140\n520公里\t117141\n隆兴\t117142\njieba\t117143\n方太燃气灶\t117144\n玩玩\t117145\n无禁\t117146\n蔡屋围\t117147\n转基\t117148\n蛋挞\t117149\n灾害\t117150\n邯山区\t117151\nself\t117152\n48条\t117153\nT440p\t117154\n梦幻西游2_巴士\t117155\n汤匙\t117156\n2016年7月15日\t117157\n野马岭\t117158\n百度云/迅雷\t117159\ngome\t117160\nszhr\t117161\n以其\t117162\n名山之窗\t117163\n旅游管理学院\t117164\n贾迎春\t117165\n心理病\t117166\n膜片钳\t117167\n7.99\t117168\n凯蒂佩里\t117169\n背身\t117170\n凊\t117171\n广州科技职业技术学院\t117172\n大连软件园\t117173\n铃木雨燕\t117174\n废布\t117175\nNBA2K14\t117176\nNTKO\t117177\n第91期\t117178\n本末\t117179\n余敏\t117180\n利卡\t117181\nuvlayout\t117182\n取件\t117183\n2017年1号\t117184\nkd\t117185\nDisposable\t117186\n总单\t117187\nJZRB\t117188\n不要不要\t117189\n渣土\t117190\n四川省建设厅\t117191\nwebdriver\t117192\n私服版\t117193\n示功\t117194\n简装版\t117195\n佳能750d\t117196\nswift3.0\t117197\n58集\t117198\n广东省实验中学\t117199\n禁种铲\t117200\n浙江温州江南皮革厂\t117201\n诱色\t117202\nSEXO\t117203\n黄明昊\t117204\n丢失率\t117205\n棚里\t117206\n180314\t117207\n中国蔬菜网\t117208\n火口\t117209\n王丽萍\t117210\n中国领事馆\t117211\nzbook15\t117212\n230ml\t117213\n情侣公园\t117214\n贸易通\t117215\n非转基因\t117216\n有气质\t117217\n政发\t117218\n长安欧尚X70A\t117219\n助创\t117220\ntoB\t117221\n暮光女\t117222\n84000\t117223\n脑洞大开\t117
224\n后后\t117225\n下十年\t117226\nStores\t117227\n封国\t117228\n飞地\t117229\n微信群聊\t117230\niphone4S\t117231\n成功营销\t117232\n敬启\t117233\n商会\t117234\n上海国际展览中心\t117235\n都市圈\t117236\n今上\t117237\n深圳市扎克贸易有限公司\t117238\n60V\t117239\n倒角器\t117240\n谓语\t117241\n春心\t117242\n九宫格输入法\t117243\n神奇宝贝剧场版\t117244\n2k18吧\t117245\n初创期\t117246\n述责\t117247\n小提\t117248\nMSAA\t117249\n奔波\t117250\n紫藤路\t117251\n开创者\t117252\nBTCAT\t117253\n35kv\t117254\n不明朗\t117255\n索兰托\t117256\nFLV播放器\t117257\n屏蔽\t117258\n感谢你\t117259\n狂犬疫苗\t117260\n撞船\t117261\nfeilin\t117262\n齿牙\t117263\n荒野的召唤\t117264\nInvariant\t117265\npda\t117266\nspringMVC拦截器\t117267\nSNH\t117268\n天山\t117269\n第二课\t117270\n免交\t117271\n演出\t117272\neland\t117273\n王秀\t117274\n阪\t117275\nschedules\t117276\n天涯沦落人\t117277\n308号\t117278\n如狼似虎\t117279\n窒息性\t117280\n铁血丹心\t117281\n二十六年\t117282\n美溪\t117283\n魔卡少女樱吧\t117284\n几日\t117285\n6.65\t117286\n@Autowired\t117287\nQQ头像网\t117288\n护资\t117289\n淮滨\t117290\n雷笋\t117291\n六月天\t117292\n快装\t117293\n高丽营镇\t117294\n小洛\t117295\n中华铁道网\t117296\n臭椿\t117297\n假大空\t117298\n网贷之家\t117299\nAlwaysOn\t117300\n平安人寿保险\t117301\n568号\t117302\n败者\t117303\nopenvz\t117304\n卑猥\t117305\n毒盒\t117306\n东海村\t117307\n计算机科学\t117308\nG18\t117309\n何鹏\t117310\n腾讯社\t117311\n有声版\t117312\n铁总建设\t117313\n洗涤\t117314\n背包客\t117315\n20180426\t117316\nPDO\t117317\nk12\t117318\n851\t117319\n宋末\t117320\n小微企业所得税优惠政策\t117321\n中国之星\t117322\n游褒禅山记\t117323\n贵州中耀华建建筑股份有限公司\t117324\n红原\t117325\n十三年\t117326\n松山湖高新技术产业开发区\t117327\n尼康D850\t117328\ncundang\t117329\n榴炮\t117330\n小龙在线\t117331\n万向轮\t117332\n总帐\t117333\n响起来\t117334\n左邻右舍\t117335\n阳谷华泰\t117336\n魔帝\t117337\n三鹿集团\t117338\n集王羲之圣教序\t117339\n1710\t117340\n上海科技大学\t117341\n党总支委员会\t117342\n星月\t117343\n赔偿案\t117344\n花树\t117345\n贝希特斯加\t117346\nlinking\t117347\n房地产界\t117348\n双安\t117349\n永恒的爱\t117350\n打天下\t117351\n防控\t117352\nh1\t117353\n发纹\t117354\napahce\t117355\n外院\t117356\n禁足\t117357\n1178\t117358\n箱号\t117359\nBBAN\t117360\n赛文哥\t117361\n百听不厌\t117362\nvdo\t117363\n总工办\t117364\n趋同\t117365\n青岛西\t117366\n规范化\t117367\n巴布达\t1
17368\n狡辩\t117369\n岩溶地区\t117370\n齐富路\t117371\n第146号\t117372\n国难\t117373\n汤类\t117374\nrange\t117375\n战术性\t117376\n伊婉\t117377\n雷锋日记\t117378\n曹杨中学\t117379\n柳色\t117380\n莱利\t117381\n配送员\t117382\n共和路\t117383\n文汇小学\t117384\n梦幻金庸群侠传\t117385\n陈陈陈\t117386\n行政处罚案\t117387\n色狗\t117388\nintroducing\t117389\n6档\t117390\n房中\t117391\n财道\t117392\n神殿\t117393\n田楷\t117394\n门柱\t117395\n红色警戒\t117396\n江南糕点\t117397\n1877年\t117398\n长生不死\t117399\nams1117\t117400\n彝\t117401\nppt2016\t117402\n电吸门\t117403\n双金\t117404\n正邪\t117405\n1867\t117406\n桂溪街道\t117407\n褒\t117408\n军官\t117409\ncurses\t117410\n隋唐英雄4\t117411\n见缝插针\t117412\n新劳动法\t117413\n精析\t117414\n王志良\t117415\n无公害\t117416\nantdesign\t117417\ndistributor\t117418\n霍普金斯\t117419\nsciences\t117420\n20160202\t117421\nnough\t117422\n串通\t117423\n爱丽丝菲尔\t117424\n光绘机\t117425\n曾庆\t117426\n枪神纪\t117427\n箱装\t117428\n两幅画\t117429\n129号\t117430\n撤步\t117431\ni9-7900X\t117432\n张小盒\t117433\nbdb\t117434\n中央直属机关\t117435\nRegistration\t117436\n北汽股份\t117437\n百纳中国\t117438\nsuny\t117439\n◎\t117440\nKR\t117441\n北京工商大学\t117442\nU3d\t117443\n秘盒\t117444\nlive版\t117445\n筋脉\t117446\nSceneKit\t117447\nconcord\t117448\n27.dll\t117449\n猛火\t117450\n顶尖级\t117451\n东京都\t117452\n叶型\t117453\n8.2.4\t117454\n辰亦儒\t117455\n清盈\t117456\n18台\t117457\n周欢\t117458\nlims\t117459\n江铜集团\t117460\n国地\t117461\ndropins\t117462\n新丰时尚网\t117463\n广东省海洋与渔业厅\t117464\n读到\t117465\n平均\t117466\n郭元强\t117467\n牧魂\t117468\n腱鞘\t117469\n家有仙妻\t117470\n合成区\t117471\n李果\t117472\n牛肉丸\t117473\n库洛姆\t117474\nan\t117475\n先入为主\t117476\nPotato\t117477\n科罗廖夫\t117478\n西裤\t117479\n促学\t117480\n全粒\t117481\n入耳式\t117482\n古物\t117483\n艳势番\t117484\n围垦\t117485\n蓬松\t117486\n路风\t117487\n基层版\t117488\n中华人民共和国增值税暂行条例实施细则\t117489\nVs\t117490\n杨大卫\t117491\n住校\t117492\n卯月麻衣\t117493\n赛车总动员\t117494\n吉林大学汽车工程学院\t117495\n黄河流域\t117496\nGodzilla\t117497\nUGUI\t117498\n屹\t117499\n小心翼翼\t117500\n第94期\t117501\nignition\t117502\n1件\t117503\n中国新闻网\t117504\n隆重\t117505\n唯刀百辟\t117506\n耀武扬威\t117507\n桡骨远端骨折\t117508\n年少轻狂\t117509\n孙威\t117510\n上海十院\t117511\n黄佟佟\t117
512\n藏族音乐网\t117513\n贷罗盘\t117514\nF80\t117515\n1564\t117516\n抽绳\t117517\nSurface3\t117518\n永丰县政府\t117519\n血管紧张素\t117520\n徐州市\t117521\n120公斤\t117522\n东方艺术中心\t117523\n麦浚龙\t117524\nmod管理器\t117525\n从零开始的异世界生活\t117526\n武汉市国家税务局\t117527\n学会生活\t117528\n天津丽思卡尔顿酒店\t117529\n鞍山师范学院\t117530\n机械加工网\t117531\n渗入\t117532\n诺丽果\t117533\n接本\t117534\n数控切割机\t117535\n汉宠妻\t117536\n琴娘\t117537\n训读\t117538\n一个年\t117539\n电热宝\t117540\n广福路\t117541\n3月14号\t117542\n程佩青\t117543\n神州数码\t117544\n两寸\t117545\n百洁\t117546\nlamba\t117547\n膜器\t117548\n几天一\t117549\nparity\t117550\ngtx560\t117551\nbooty\t117552\n大望路\t117553\n文本函数\t117554\n农村人居环境整治三年行动方案\t117555\n九死\t117556\n理趣\t117557\n邹衍\t117558\n卧底调查分公司\t117559\n找群加\t117560\nsunnto\t117561\n新华科技\t117562\n厦门地铁2号线\t117563\n加权最小二乘法\t117564\nsap2000\t117565\n城市小区\t117566\nmanagement\t117567\nmapserver\t117568\n黄河桥\t117569\n攻速\t117570\n振威\t117571\n四年级语文\t117572\n莱茵生物\t117573\n橡胶止水带\t117574\n724\t117575\n十进制\t117576\n任伟\t117577\nAlphaGo\t117578\n5发\t117579\nnewevan\t117580\nEquilibrium\t117581\n延伸段\t117582\n世爻\t117583\naddin\t117584\n奇瑞eq\t117585\n骑马与砍杀:风云三国\t117586\n萧月\t117587\nvarying\t117588\n陕西网\t117589\n辐\t117590\n大木\t117591\n一城\t117592\n正品\t117593\n巧借\t117594\n湖州市人民政府\t117595\n9月14日\t117596\n大营坡\t117597\n明前\t117598\n莲实克蕾儿\t117599\n舒尔曼\t117600\naecc2017\t117601\n薄带\t117602\n冰室\t117603\n三沟\t117604\nducky\t117605\n星露谷\t117606\n子宫体\t117607\n北京八达岭长城\t117608\n画中\t117609\n转星移\t117610\n五羊新城\t117611\n贫民区\t117612\n加药泵\t117613\n如图\t117614\n资费\t117615\n1990s\t117616\nyax\t117617\n锅魁\t117618\n2710\t117619\n中东人\t117620\n格力海岸\t117621\n定都阁\t117622\n打勾\t117623\n藤\t117624\n没时刻\t117625\n26秒\t117626\n锐理\t117627\nPlayer播放器\t117628\n拨筋\t117629\n足内翻\t117630\n魏骏杰\t117631\nkb\t117632\ninfinity\t117633\n起锚\t117634\n工业除湿机\t117635\n黄彬\t117636\n金地格林东郡\t117637\n京东空包网\t117638\n刘起涛\t117639\n商用中央空调\t117640\n红竹\t117641\n铁男\t117642\nidea15\t117643\n城市基础设施配套费\t117644\n二十四节气网\t117645\n死火\t117646\n二手电脑转让\t117647\n龙华小学\t117648\n诚德\t117649\n开门见\t117650\nBlogger\t117651\n刘松\t117652\n墙材\t117653\n骏派
\t117654\n不忘初衷\t117655\n万1.5\t117656\n平湖市政府\t117657\n保息\t117658\n史昂\t117659\n放缓\t117660\n岗南水库\t117661\nbv\t117662\n子们\t117663\n锋利度\t117664\n扩孔\t117665\n方长\t117666\n成都科技\t117667\n越南老街\t117668\ninsist\t117669\n芬琳\t117670\n反衬\t117671\n贷前调查\t117672\n水解蛋白\t117673\n特百惠\t117674\n魔盗\t117675\n3313\t117676\n浠水县\t117677\n悦宝园\t117678\n御津井芭华\t117679\n20160326\t117680\n芡实茶\t117681\n中央财经大学会计学院\t117682\n西安航空职业技术学院\t117683\nInstructor\t117684\n六合竹镇\t117685\n两院士\t117686\n户主\t117687\n天朗\t117688\n甘精\t117689\n云南旅游团\t117690\n王代军\t117691\n奴隶贸易\t117692\n擦鞋机\t117693\n用计\t117694\n落点\t117695\n嗨\t117696\n300余家\t117697\n北京市公安局消防局\t117698\n灰泰迪\t117699\n胰腺癌\t117700\n迅读网\t117701\n超级街霸4\t117702\n纤支镜\t117703\nprimo\t117704\n国策\t117705\n听讲\t117706\n下腰\t117707\n92素材网\t117708\n联想s410\t117709\n白沟镇\t117710\n乐信集团\t117711\npersoner\t117712\n肿么办\t117713\n粘贴\t117714\n安检门\t117715\n96级\t117716\n鱼盘\t117717\n华莱士\t117718\n淘宝助理\t117719\n圣斗士星矢:斗士之魂\t117720\n尤其\t117721\nFoam\t117722\nQQ群发器\t117723\nsoundbar\t117724\n佛教史\t117725\n北京石景山政府\t117726\n渝北两路\t117727\n外置\t117728\n埃塞俄比亚航空\t117729\n护神\t117730\n20121123\t117731\nrussell\t117732\n远视眼\t117733\n格林\t117734\nwin10系统浏览器\t117735\n宁波韵升\t117736\n3012\t117737\n星际争霸虫族\t117738\n第七街\t117739\n香格里拉饭店\t117740\n冰晶\t117741\n免疫性\t117742\n王亮亮\t117743\ncentroid\t117744\nMorse\t117745\nvm12\t117746\n出租屋\t117747\n12.9\t117748\n监督性\t117749\n计价定额\t117750\n重庆医科大学附属口腔医院\t117751\n超过5分钟\t117752\n天蝎\t117753\n杨超\t117754\nRecycling\t117755\nDNA\t117756\nNyan\t117757\n真好看\t117758\nPS,ps\t117759\n11.0.2\t117760\nDiego\t117761\n国务院安委会办公室\t117762\n1印\t117763\njetty服务器\t117764\n西村寿行\t117765\n预见\t117766\n家医\t117767\n悦府\t117768\n第二张\t117769\n欧沃斯\t117770\n桃色\t117771\n白鹿书院\t117772\n黑暗2\t117773\nset\t117774\n内蒙古国\t117775\n断子\t117776\n档案架\t117777\n十八届中央政治局\t117778\nRFID世界网\t117779\nghs\t117780\nref\t117781\n首服\t117782\n500年\t117783\n公众微信号\t117784\nUpload\t117785\nakoya\t117786\n4.22世界地球日\t117787\n各不同\t117788\n浮光\t117789\n光泽裤\t117790\n新宝gg\t117791\n会计们\t117792\nJSon\t117793\n禅修营\t117794\n映客\t117795\n权重矩阵
\t117796\ncpgz\t117797\nHilfiger\t117798\n33部\t117799\n石坪桥\t117800\n陡增\t117801\n社会阶级\t117802\n照金\t117803\n科技观\t117804\n2550\t117805\n草稿箱\t117806\n复城国际\t117807\n彭德怀元帅\t117808\nAircraft\t117809\n虚荣心\t117810\n1093\t117811\n多玩剑网3\t117812\n本色\t117813\nviet\t117814\n赛普\t117815\n6X2\t117816\n分离变量法\t117817\n下板城\t117818\ntable-cell\t117819\nWINDOWS10\t117820\n东永裴\t117821\npdu\t117822\n反映\t117823\n特兰普\t117824\nmetfone\t117825\n改模\t117826\n甲母痣\t117827\n方材\t117828\n杂警\t117829\n工字钢\t117830\n宿新市徐公店\t117831\n天生尤物\t117832\n玉屏风\t117833\n2017.11.11\t117834\n分合\t117835\n冒险盒\t117836\nakp\t117837\n胡春\t117838\n御水温泉\t117839\n酷漫\t117840\n麻省大学\t117841\n子观音\t117842\n周星夙\t117843\n日伪\t117844\nmantisbt\t117845\nBPD\t117846\n中共十九届三中全会\t117847\nkinda\t117848\n良记\t117849\n相和\t117850\n国联集团\t117851\nqbittorrent\t117852\n滁州学院\t117853\n20150302\t117854\n一任\t117855\n撵走\t117856\n白灼虾\t117857\n金属链\t117858\nzx300a\t117859\n长沙小区\t117860\n韩后\t117861\nflyer\t117862\n货员\t117863\n固态盘\t117864\n800块\t117865\nppap\t117866\n8位\t117867\n势利\t117868\npixel\t117869\n十折\t117870\n山谷中的谜底\t117871\nAngelaBaby\t117872\n0617\t117873\nyy伴侣\t117874\n42.1\t117875\nPorsche\t117876\nSOHU\t117877\nHades\t117878\n碳砖\t117879\n端程序\t117880\n伊盾\t117881\nr32\t117882\n鼎力支持\t117883\nInfluence\t117884\n无锁\t117885\n盈余\t117886\n2015款款\t117887\n技工\t117888\n诱人的飞行\t117889\n鹦鹉案\t117890\n长江公司\t117891\n过氧化氢\t117892\n资源岛\t117893\n杨玉玲\t117894\n22位\t117895\n秣陵街道\t117896\n小说库\t117897\n脏\t117898\n杨洪涛\t117899\nmyths\t117900\nresize\t117901\nlow\t117902\nSummer\t117903\n尤二姐\t117904\n棒堂\t117905\nsqlexpress\t117906\n冕服\t117907\n四味\t117908\n21时\t117909\n浙江省司法厅\t117910\n达米尼\t117911\nAthletic\t117912\n部篇\t117913\n网上购物系统\t117914\n海洋王国\t117915\n2门\t117916\n兔粮\t117917\n六甲山\t117918\n两幅\t117919\n隔声窗\t117920\nlocking\t117921\n犬只\t117922\nlaravel\t117923\n10db\t117924\nldquo\t117925\nASTRO\t117926\nNiki\t117927\nlibo\t117928\n十页\t117929\n麦可蛋糕网\t117930\n神女控\t117931\n衣带渐宽终不悔\t117932\n平昌县\t117933\n潘达\t117934\n室长\t117935\n成都市公安局交通管理局\t117936\n桂味\t117937\nNYX\t117938\n大兴线
\t117939\n王文清\t117940\n送分\t117941\n温州港\t117942\n磁盘驱动器\t117943\n复兴岛\t117944\n千代婆婆\t117945\n北京市密云区人民政府\t117946\n青超\t117947\n金时代顺风大酒店\t117948\n安坏\t117949\n歌舞片\t117950\n贵阳市\t117951\n1.58G\t117952\n北辰\t117953\n神经退行性疾病\t117954\nXX公司\t117955\n电能表\t117956\nkith\t117957\n勃起来\t117958\ntp5.0\t117959\nTrie树\t117960\n影迷会\t117961\nHTK\t117962\nbmcc\t117963\n国家移民局\t117964\n数据库触发器\t117965\n第01期\t117966\n男女关系\t117967\n不固\t117968\n超编\t117969\n销子\t117970\n泰安火车站\t117971\n深长\t117972\n寒门肖申克的救赎\t117973\n电动蝶阀\t117974\n烟台火车站\t117975\n第四纪\t117976\n合乎\t117977\n钉住\t117978\n尼桑\t117979\n生活水箱\t117980\n刘彦池\t117981\n头等大事\t117982\n姆明\t117983\n陈剑\t117984\n悦榕\t117985\n沙生\t117986\n背篼\t117987\n昏迷\t117988\n跳色\t117989\n沙太路\t117990\n0.1kg\t117991\n重庆工程职业技术学院\t117992\nMack\t117993\n擦掉\t117994\n2012年12月4日\t117995\nrounds\t117996\nql\t117997\n16万公里\t117998\nIllustr\t117999\n出席\t118000\n21度\t118001\n蓝波\t118002\n敬亭\t118003\nCss3\t118004\n交不交\t118005\n苏州大学体育学院\t118006\n賽馬資訊\t118007\n江湖新秩序\t118008\n私房钱\t118009\n舟山跨海大桥\t118010\n刘铁\t118011\n北京理工大学珠海学院\t118012\n盛天网络\t118013\n云南白药胶囊\t118014\n嘉兴市政府网\t118015\n阴道癌\t118016\n【宇\t118017\n徐子豪\t118018\n虚情假意\t118019\n昆仑雪菊\t118020\n王树\t118021\n球菌\t118022\n27章\t118023\n黄昊\t118024\nexpire\t118025\ng9200\t118026\n长利\t118027\n桶装\t118028\n创业\t118029\n解图\t118030\n招财宝\t118031\n王妍\t118032\nrevoke\t118033\n53年\t118034\n35小时\t118035\n脆\t118036\n投拆\t118037\nSouq\t118038\n韩战\t118039\n唇角\t118040\n张庆祥\t118041\nServiced\t118042\n709\t118043\n星辰岳麓书院\t118044\n深圳注册公司\t118045\n〃\t118046\n神州车闪贷\t118047\n第十七个\t118048\n亲兄妹\t118049\nwlk\t118050\n圆神\t118051\n王子涛\t118052\n彩杀\t118053\ncentOs\t118054\n一个三位\t118055\n机器侠\t118056\n碧水华庭\t118057\n独占鳌头\t118058\n山东鲁能泰山\t118059\n膨胀珍珠岩\t118060\nb200\t118061\nIDEA12\t118062\n17W\t118063\n万业湖墅\t118064\n中国银监会办公厅\t118065\n新蒙迪欧\t118066\nque\t118067\nBrushes\t118068\nXCOPY\t118069\n同济大学电子与信息工程学院\t118070\nCruz\t118071\nPDF文\t118072\n腱鞘囊肿\t118073\n10.2.0\t118074\n德阳中学\t118075\n三星s9\t118076\n微信支付扫码\t118077\n挂住\t118078\n老发\t118079\ncaopor\t118080\n成吉\t118081\n三吨\t118082\n
安思尔\t118083\n过渡词\t118084\n择天记手游\t118085\n新恒结衣\t118086\n直流断路器\t118087\nFastCGI\t118088\n拳皇98c\t118089\n苏秘\t118090\n20161110\t118091\n商报\t118092\n选取\t118093\n冥套\t118094\n追溯\t118095\nemWin实战教程V2.0\t118096\n李四\t118097\n北师版小学\t118098\nJen\t118099\n铁甲信息技术(北京)有限公司\t118100\n喜加一\t118101\nppt2017\t118102\n罗马杆\t118103\nc12\t118104\n水资讯网\t118105\nsmart\t118106\nMALL_联商论坛\t118107\n华夏东路\t118108\n射手女\t118109\n张溪\t118110\n青青草国产\t118111\n0213\t118112\n断续\t118113\n普救寺\t118114\nWDD\t118115\n大鳄龟\t118116\n乙肝五项指标\t118117\n哈电集团\t118118\n店面\t118119\n万年县\t118120\nK24\t118121\n乡镇经济\t118122\n闪念\t118123\n卵巢畸胎瘤\t118124\n哈弗h3\t118125\n商河\t118126\nnuance\t118127\nST烯碳\t118128\n凯华轴\t118129\n难懂\t118130\nsts\t118131\n8万吨\t118132\n热能表\t118133\ndiffuser\t118134\nzxcvbnm\t118135\n咸阳市区\t118136\n品艺\t118137\n王力可\t118138\n邪恶帝_邪恶漫画_邪恶少女漫画_邪恶漫画大全\t118139\n原子能\t118140\n综\t118141\n许雅钧\t118142\nXmas\t118143\n孟卫东\t118144\nZootube\t118145\ncesium\t118146\n中华资源库\t118147\n橡皮章\t118148\n网络吞吐量\t118149\n下沙院区\t118150\n16套\t118151\n强制执行申请书\t118152\n淘友\t118153\n陈立夫\t118154\n情谷\t118155\nMia\t118156\nvivox6\t118157\n华泰证券股份有限公司\t118158\n就是爱你\t118159\n朱镇模\t118160\n白宝山\t118161\n没料到\t118162\n中海物业\t118163\nHTMLTestRunner\t118164\n平洲\t118165\n大碗岛\t118166\nNginx+Lua\t118167\namr\t118168\nGTV\t118169\n最看重\t118170\n凛冽\t118171\n满足率\t118172\n锅上\t118173\nipcs\t118174\n防误\t118175\n锦尚天舞\t118176\n乐东县\t118177\nラグジュ\t118178\n单角\t118179\n乌鸡\t118180\n无条件的爱\t118181\n晶魄\t118182\n金卓\t118183\n臭名\t118184\nwww224ttcmo\t118185\n周恒\t118186\n储备货币\t118187\n仁恒\t118188\n中国恒大\t118189\n晓东CAD家园\t118190\n省妇幼保健院\t118191\n一个300\t118192\n红遍\t118193\n秦皇岛银行\t118194\nJWT\t118195\nomniplan\t118196\n韧带\t118197\n天河北路\t118198\n高级连击\t118199\n技术开发区\t118200\nvue父子组件\t118201\nlarval\t118202\nManufactory\t118203\ng910\t118204\n伸懒\t118205\n界篇\t118206\nhelloworldlee\t118207\n安娜·卡列尼娜\t118208\n赤壁外传\t118209\nintuos\t118210\n支撑件\t118211\n轨道板\t118212\n2017年8月1日\t118213\n托管式\t118214\nDiesel\t118215\n模拟示范卷\t118216\n上海中医医院\t118217\nbtavi\t118218\n刷新点\t118219\nT淋巴细胞\t118220\nVER\t118
221\nDB11\t118222\n蛋刀\t118223\n大有可\t118224\n恒祥\t118225\n新版部\t118226\nSketchUp2018\t118227\nT470S\t118228\n错号\t118229\n妇色\t118230\ndrainage\t118231\n普瑞眼科\t118232\nNBA2K11\t118233\n原甲酸三甲酯\t118234\n黄页/公司库/大全\t118235\n东风渠\t118236\nlorenz\t118237\n垛\t118238\n医毒\t118239\n88g\t118240\n萧玄\t118241\n天津医科大学\t118242\n精灵鼠\t118243\n木面\t118244\nEfficient\t118245\nJac\t118246\n巴桑\t118247\nBBQ\t118248\n嘉兴市实验小学\t118249\n甜甜甜\t118250\nIBM公司\t118251\n失窃案\t118252\n硬钢\t118253\n鼻壶\t118254\n安全生产法律法规\t118255\n格斗术\t118256\nsol君\t118257\n欢共乐\t118258\n木板\t118259\n西国\t118260\n段桥\t118261\ndirected\t118262\nfrequency\t118263\n我站在桥上看风景\t118264\nWii游戏\t118265\n2.7.15rc1\t118266\n七乐彩走势图\t118267\n修订版\t118268\n枯水期\t118269\n6G运存\t118270\n小曲儿\t118271\n9千万\t118272\n招商地产\t118273\n灯心草\t118274\n郭家村\t118275\nr290\t118276\nsuspect\t118277\n北京市网信办\t118278\n87062222\t118279\n闲聊茶\t118280\n2679\t118281\n广州统计信息网\t118282\n245ml\t118283\n公式化\t118284\n湖怪\t118285\n借景抒情\t118286\n上品折扣网\t118287\n均等\t118288\n本田摩托\t118289\n异影\t118290\n杜松\t118291\n傲世堂\t118292\n立方厘米\t118293\ng8\t118294\n药管\t118295\n7808\t118296\n重庆会计之家\t118297\n安-225\t118298\n太科园\t118299\n艾肯传说\t118300\n干混砂浆\t118301\n狗娘\t118302\n共用\t118303\n神农基因\t118304\n黄金时间\t118305\n精妙绝伦\t118306\n长沙市第八医院\t118307\n红鹦鹉鱼\t118308\n谷仓门\t118309\n刘小姐\t118310\n朱斌\t118311\n铁甲雄兵\t118312\nethyl\t118313\nspringsecurity\t118314\nFLOW\t118315\n岑溪市\t118316\n红蜘蛛\t118317\n股本金\t118318\nhav\t118319\nRHCS\t118320\n792\t118321\n黑木麻衣\t118322\n华东院\t118323\nQML\t118324\n引起轰动\t118325\n缩句\t118326\n自投罗网\t118327\n天康生物\t118328\n博雅杯\t118329\nnava\t118330\n加鲁加\t118331\n铁王座\t118332\n吞噬苍穹\t118333\n朱婧汐\t118334\n十里画廊\t118335\n饶颖\t118336\n化妆水\t118337\n万通\t118338\n华禹\t118339\n法学派\t118340\n契约型基金\t118341\n教学计划\t118342\ncareful\t118343\nparagraph\t118344\n信长之野望13\t118345\n县公安局\t118346\n久病\t118347\n岳望高速\t118348\nP2psearcher\t118349\n上溯\t118350\n叠加\t118351\n二溴乙烷\t118352\ntool\t118353\n倒伏\t118354\n林七七\t118355\n坑蒙拐骗\t118356\n木楔\t118357\n荷恩\t118358\n平均数\t118359\n高纯\t118360\n小米nfc\t118361\n27年后\t118362\n跨步\t118363\n暴发性\
t118364\n技术群\t118365\n稻森丽奈\t118366\n京东白条吧\t118367\n空话\t118368\nlinspace\t118369\noutlook\t118370\n供应室\t118371\n庚寅年\t118372\ne-STUDIO\t118373\nCommitment\t118374\n华能电厂\t118375\n有氧搏击操\t118376\noutputs\t118377\n2016年清明\t118378\n刻舟求\t118379\n航天二院\t118380\nminami\t118381\n厷\t118382\n全斗焕\t118383\n程维\t118384\n黄博士\t118385\n蔬菜沙拉\t118386\n尘螨\t118387\n基础设备\t118388\n莱芜\t118389\n海拉尔区\t118390\n失察\t118391\n腾讯翻译君\t118392\n中国国家足球队\t118393\nTRA\t118394\n廊坊市住房保障和房产管理局\t118395\n扬州职业大学\t118396\n通讯社\t118397\n卢巧音\t118398\n18位\t118399\n北京贷款公司\t118400\n庞德\t118401\n枯燥\t118402\nvirus.win32.ramnit\t118403\n7104\t118404\n黄侃\t118405\n32%\t118406\n祛疤\t118407\n阵痛\t118408\nFURY\t118409\n华安证券\t118410\n蛋鸡\t118411\n地方各级人民法院\t118412\nslightly\t118413\nUkulele尤克里里小站\t118414\n纽卡斯尔\t118415\n朴刀\t118416\n20150619\t118417\n梦洁\t118418\n保兑\t118419\nconstructed\t118420\nvcd\t118421\n绝不会\t118422\n王教授\t118423\n宣城房产网\t118424\n名状\t118425\n束手\t118426\n民主集中制\t118427\n单身派对\t118428\n亲橙\t118429\n淘宝商城\t118430\n京洛再无佳人\t118431\n一铺\t118432\n发微\t118433\n南开大学医学院\t118434\nConnections\t118435\n多语种\t118436\nObama\t118437\n720p/1080p\t118438\n江铜\t118439\n试镜\t118440\n圾\t118441\n工业药剂学\t118442\n百才招聘网\t118443\n瓦莲金娜\t118444\n江南府\t118445\n封弊者\t118446\n安培环路\t118447\n折合\t118448\n大巧\t118449\n数显微镜\t118450\n湛江市\t118451\n罗彻斯特大学\t118452\n福耀\t118453\nkula\t118454\n第五行\t118455\nok卡\t118456\n燃物\t118457\n二进制码\t118458\n佛山新城\t118459\n雾中仙\t118460\n家村\t118461\nxmpp\t118462\nMadness\t118463\n公款\t118464\n重庆图书馆\t118465\n奔驰c\t118466\n355ml\t118467\n香港区\t118468\n泗安\t118469\n美厨邦\t118470\n汉语拼音字\t118471\n电信4g\t118472\n尹明珠\t118473\n湖北公司\t118474\nClair\t118475\n中国科学院微生物研究所\t118476\n90米\t118477\n生死疲劳\t118478\nphc\t118479\n保时捷Panamera\t118480\n义务教育学校管理标准\t118481\n心照不宣\t118482\n流不尽\t118483\naimhero\t118484\n套子\t118485\n好评度\t118486\n三明二中\t118487\nxubuntu\t118488\n潇湘溪\t118489\n丽水在线\t118490\nFriction\t118491\n宗国英\t118492\nManning\t118493\n特征值\t118494\n三班倒\t118495\n星皇\t118496\n鹤丸国永\t118497\n中国国际航空\t118498\n防眩目\t118499\n5.65\t118500\n自媒体_一猫汽车网\t118501\nU2412M\t118502\
n洛少爷\t118503\nSIFT算法\t118504\n院内\t118505\n气泡\t118506\n液态硬盘\t118507\n牛黄清心丸\t118508\n香猪\t118509\n穆帅\t118510\n石棱\t118511\n1010\t118512\n十九大报告\t118513\n78种\t118514\ni3-7100\t118515\n陆源\t118516\n简阳机场\t118517\n太仓)有限公司\t118518\n镇静药\t118519\n百胜电子\t118520\nHorton\t118521\n盗贼之海\t118522\n赵静\t118523\n新金沙\t118524\nHint\t118525\n1V3\t118526\nemployer\t118527\nLTSB\t118528\n西梅\t118529\n终裁\t118530\n关中匪事\t118531\n一路上\t118532\n奇瑞eQ1\t118533\n费迪南德\t118534\n逗留\t118535\n7.32燃烧王座\t118536\nmaxi\t118537\ncadillac\t118538\n乐趣网\t118539\n景德\t118540\n井上\t118541\n双拼输入法\t118542\n太行景区\t118543\n瑞意\t118544\n多肉\t118545\nwav格式\t118546\n舞蹈秀\t118547\n80x86\t118548\n蝴蝶飞\t118549\n沙量\t118550\n贴纸机\t118551\n插头转换器\t118552\n猛鱼\t118553\n欧胜\t118554\n斯巴达人\t118555\n11.31\t118556\nNeowin\t118557\n伊东千奈美\t118558\n废柴\t118559\n高庙村\t118560\n脂肪胺\t118561\nip地址段\t118562\n中国科学院上海应用物理研究所\t118563\n函数名\t118564\n格伦\t118565\n常规性\t118566\n稳重\t118567\n三神器\t118568\n金属加工液\t118569\n头孢氨苄片\t118570\n介电强度\t118571\n顿挫\t118572\nTEMS\t118573\n10艘\t118574\n略述\t118575\n海天精工\t118576\nlevel2\t118577\n唐家湾镇\t118578\n传奇影业\t118579\n潮气\t118580\ncriminal\t118581\n在此之前\t118582\n13.9亿\t118583\n国王队\t118584\n光纤光栅传感器\t118585\n松材\t118586\n_梅\t118587\n物保\t118588\n泛娱乐化\t118589\n犹他州\t118590\n高德地图\t118591\n会率\t118592\n138平\t118593\n课本剧\t118594\n宝胜股份\t118595\ncips\t118596\n名山\t118597\n巨臂\t118598\n叶简明\t118599\n赵玉田\t118600\namateurs\t118601\n耐腐蚀\t118602\n家信\t118603\n亚美尼亚\t118604\n大雅\t118605\n淫液\t118606\n银四\t118607\nHelloSUN\t118608\nv1.2.3\t118609\n航海家\t118610\n莎士比亚十四行诗\t118611\n2016年以来\t118612\n2233\t118613\n踏清\t118614\n暗黑破坏神3猎魔人\t118615\n第17批\t118616\n英力特\t118617\nMathematica\t118618\n苏德战争\t118619\n植脂\t118620\niis10\t118621\n南流江\t118622\n奇士\t118623\n牛年\t118624\n熵值\t118625\n插锁\t118626\n新江与城\t118627\n烟台教育\t118628\n吸乳\t118629\n访问\t118630\nMYSQL\t118631\n胖哥俩肉蟹煲\t118632\n惩戒骑士\t118633\n省校\t118634\n李文龙\t118635\nSM城市广场\t118636\nEMSA\t118637\n奥斯本\t118638\n双板\t118639\n指数幂\t118640\n红曲酒\t118641\n卖票\t118642\n想你是\t118643\n铁盘\t118644\n玛凯雷\t118645\npic\t118646\n乱文\t118647\n德里
克·罗斯\t118648\nsyncthing\t118649\nmillise\t118650\n末日新垣结衣\t118651\n酒花\t118652\n仪仗兵\t118653\n压铸\t118654\n朱孝天\t118655\n雅乐网\t118656\n麻风\t118657\n气包\t118658\n显存颗粒\t118659\n20170324\t118660\n安德斯\t118661\n竹笙\t118662\n惊云\t118663\n丰乐种业\t118664\n后勤处\t118665\n停播\t118666\n印度河\t118667\n太原人才网\t118668\n抹机\t118669\n中威\t118670\nv3.1.9\t118671\nvagrant\t118672\nDSDT\t118673\nEconomics\t118674\n蒙阴\t118675\n土地补偿费\t118676\n威客圈\t118677\nBORDER\t118678\n维维\t118679\n开发信\t118680\n湖南省地方税务局\t118681\n三国志2017\t118682\n台椅\t118683\nmrg\t118684\n西拉\t118685\n知晓率\t118686\n姣姣\t118687\nv5.6\t118688\n轮功\t118689\n购车贷款\t118690\n届期\t118691\n隆基乐叶\t118692\n卡莉斯塔\t118693\n安缨\t118694\n达沃\t118695\n梦露\t118696\n漂漂\t118697\nfinnciti\t118698\n震感\t118699\n三阴交\t118700\n北京男篮\t118701\n卡通版\t118702\n拉风\t118703\n闻言\t118704\n惠泽\t118705\n大天使\t118706\n50赫兹\t118707\n爱康健齿科集团\t118708\n碎龙\t118709\n茶干\t118710\n沃尔什\t118711\n红夜\t118712\n君乐宝乳业集团\t118713\nL310\t118714\n12罐\t118715\n禁止令\t118716\n柳荫\t118717\n何超莲\t118718\n红蛇\t118719\n中国航空规划设计研究总院有限公司\t118720\n张宏斌\t118721\n叶圣\t118722\nHarkLee\t118723\ntsl\t118724\n李小狼\t118725\n锐龙\t118726\n忍者大师\t118727\nspeak\t118728\n1w2\t118729\nR\t118730\nExcel2016\t118731\n达特\t118732\nputtygen\t118733\n收益\t118734\n荣耀畅玩6\t118735\nwin10狂野飙车8\t118736\n机制\t118737\n800kw\t118738\n寒影\t118739\nfock\t118740\n王兰英\t118741\n庙堂\t118742\n齐鲁工业大学\t118743\ngm眼镜\t118744\ncartridge\t118745\n空巷\t118746\n安子轩\t118747\n早行\t118748\nJ型\t118749\n焦作\t118750\n千剑而后识器\t118751\n计事\t118752\n儿童文学\t118753\n明日华\t118754\n闷声\t118755\n戊子\t118756\n青春卡\t118757\n放生\t118758\n赵小磊\t118759\n慢性淋病\t118760\nbbbt\t118761\n车区\t118762\ncyclic\t118763\n2018.4.4\t118764\n暗黑界\t118765\n安徽博物院\t118766\n骚受\t118767\nGLPI\t118768\n姚谦\t118769\nplayers\t118770\n6.34\t118771\n十几亿\t118772\n湖南省审计厅\t118773\n连字\t118774\n下柜\t118775\n华泰股份\t118776\n82家\t118777\n天使长\t118778\n头数\t118779\n百万条\t118780\n护栏管\t118781\n雾林社\t118782\n沧州西站\t118783\nx3850\t118784\n豌豆公主\t118785\n室内机\t118786\n研修生\t118787\nprepayment\t118788\n人民电视\t118789\n粮价\t118790\n秋播\t118791\nGolang\t118792\ny75\
t118793\n巨嘴鸟\t118794\n2017年10月29日\t118795\n王天睿\t118796\n学习报\t118797\n张一鸣\t118798\n撸\t118799\n杜菲尼\t118800\n摇点\t118801\n春阳\t118802\n一剑\t118803\n多瑙河\t118804\n金蘑菇\t118805\n七本\t118806\n狼垡\t118807\n琥珀山庄\t118808\n2个半小时\t118809\nHamburg\t118810\n关于全面推进政务公开工作的意见\t118811\n山东行政学院\t118812\n显著性差异_\t118813\n电视屏幕\t118814\n折手\t118815\n凹凸君\t118816\n寄给\t118817\n秘密性\t118818\n南方公司\t118819\n鹿鸣\t118820\ntpcc\t118821\n水刺\t118822\n云莱坞\t118823\n黄金圈\t118824\n小布丁\t118825\n55家\t118826\n张嘎\t118827\n他的儿子\t118828\n麻宫玲\t118829\n成图\t118830\n尼尔机械纪元2b\t118831\n新城控股集团\t118832\n中国农机新闻网\t118833\n蝴蝶穴\t118834\n78OA\t118835\n沈龙\t118836\n链式输送机\t118837\n26mm\t118838\n代位求偿权\t118839\n刘润\t118840\n方志\t118841\nshor\t118842\n鹰卫\t118843\n弥尔顿\t118844\n长沙动物园\t118845\n棉城\t118846\n电信区\t118847\n冷爵枭\t118848\n0055\t118849\n垂管\t118850\nCGV\t118851\n完善\t118852\n风尚网\t118853\nPATCH\t118854\n帕金森病\t118855\n预算单\t118856\nm50x\t118857\nMcNeel\t118858\n东方肝胆外科医院\t118859\nCooperative\t118860\n内燃机\t118861\nNGA玩家社区\t118862\n危重\t118863\n49.3\t118864\n阿姆斯特丹\t118865\n含餐\t118866\n鲁宾孙漂流记\t118867\n枯萎病\t118868\n我的英雄学院吧\t118869\n木箭\t118870\n饱和蒸汽压\t118871\n中国大陆\t118872\nurls\t118873\naliyun\t118874\n香槟杯\t118875\nHTS\t118876\nCAFA\t118877\n国家电力投资集团有限公司\t118878\n萃茶\t118879\n骨气\t118880\n杨金龙\t118881\n荆房网\t118882\n拔毛\t118883\n暗暗\t118884\n余晖\t118885\n华泰联合证券有限责任公司\t118886\n子午流注\t118887\n神州牡丹园\t118888\nword合并单元格\t118889\n组织科\t118890\n基遇\t118891\n嫩绿\t118892\n江淮人才网\t118893\nsql*plus\t118894\n医疗信息化\t118895\n288号\t118896\n大院\t118897\nMCI\t118898\n母材\t118899\n会计从业资格\t118900\n交换温柔\t118901\n十城\t118902\n仙剑1\t118903\n个体\t118904\n恋\t118905\n犹太\t118906\n石家庄医学高等专科学校\t118907\n刘旭\t118908\njbj\t118909\n哈伦\t118910\n孙绍振\t118911\n有源低通滤波器\t118912\n黑米饭\t118913\n湖南自考网\t118914\n20组\t118915\n无损分区\t118916\n骑行裤\t118917\nPHP7\t118918\n打血\t118919\n无穷大\t118920\n龙潭水库\t118921\n刘嘉湘\t118922\n创争\t118923\n建设工程监理规范\t118924\n黑凉粉\t118925\n充电器\t118926\n众包\t118927\n读书笔记大全\t118928\n荣耀德古拉\t118929\nqii\t118930\n火王\t118931\n刘雨昕\t118932\nRedux\t118933\n吹雪\t118934\nv6.2\t118935\n2018年五月\t118936\n
掩码\t118937\n云腾\t118938\n星辰网\t118939\n磨叽\t118940\n黑马程序员\t118941\n第70期\t118942\n自动灭火装置\t118943\n扎带机\t118944\nbounty\t118945\n疏菜\t118946\n千亿元\t118947\n马粪\t118948\n中植集团\t118949\n学训宝\t118950\n智能路由器\t118951\n明祖陵\t118952\n公道杯\t118953\n磨砂纸\t118954\n503\t118955\n神农百草膏\t118956\n失信黑名单\t118957\n再恋\t118958\n百联集团\t118959\n门顶\t118960\nCourtney\t118961\n官能团\t118962\n灵脉\t118963\n紫马财行\t118964\nrimowa\t118965\n南京政治学院\t118966\n桑坦德\t118967\n毛机\t118968\n广西国税\t118969\n固件\t118970\n刺客信条兄弟会\t118971\nErrorCode\t118972\nVR_花火网\t118973\n侯为贵\t118974\n三特\t118975\n电气类\t118976\nActiviti\t118977\n医统汉祚高门\t118978\n吃鸡开\t118979\n直属机关工委\t118980\n魏锐\t118981\ndSYM\t118982\ni9-7980XE\t118983\n变流\t118984\n吸怪\t118985\n税费\t118986\n噪音值\t118987\n居易\t118988\n菁英\t118989\n广州领事馆\t118990\n武动大圣传\t118991\n天下网\t118992\n酵素粉\t118993\n1.7.9\t118994\n文旅\t118995\n密闭性\t118996\nmx150\t118997\n服讨\t118998\nexl\t118999\n孤勇\t119000\n汽配中国网\t119001\n钢蛇\t119002\nKCC\t119003\nServ-U\t119004\n消重\t119005\n2阶\t119006\n五通一平\t119007\ntiandi\t119008\n套餐包\t119009\n福州四中\t119010\n红砂岩\t119011\njointly\t119012\n有重谢\t119013\n虚荣吧\t119014\n响应体\t119015\n忏悔\t119016\n杨忠\t119017\n警号\t119018\n林径\t119019\nlaodao\t119020\n供电段\t119021\nJPA\t119022\n红车网\t119023\n用度\t119024\n扎鲁特\t119025\n云画\t119026\nPowerShadow\t119027\n4万亿元\t119028\n1024cl\t119029\n学理论\t119030\n污漫\t119031\n新生杯\t119032\n五路口\t119033\n下阕\t119034\n互动吧\t119035\n每一片\t119036\n齿式联轴器\t119037\nR6800\t119038\n208年\t119039\n钢子\t119040\n奋楫者\t119041\n北京地铁14号线\t119042\n2.4.5\t119043\n自体脂肪丰胸\t119044\n大相径庭\t119045\n战狼传说\t119046\n中国社会科学院大学\t119047\n诱敌\t119048\n虹猫蓝兔勇者归来\t119049\n食者\t119050\n危言\t119051\nfalls\t119052\n腐化\t119053\n蜂之翼\t119054\n感恩信\t119055\nExtra\t119056\n裁定书\t119057\n测试版\t119058\n16岁\t119059\nanimated\t119060\n668号\t119061\n最舒服\t119062\n十座\t119063\n谈情说爱\t119064\n德云三宝\t119065\n宁波路\t119066\ngo\t119067\n叶绿体色素\t119068\n酸式滴定管\t119069\n依依社区美图网\t119070\n武汉二中广雅中学\t119071\n5局\t119072\n58888\t119073\n步步高集团\t119074\n小兵传奇\t119075\n682\t119076\n中小城市\t119077\n胖人\t119078\n企业核心价值观\t119079\n地税电子税务局\t119080\n找字网
\t119081\n梅州网\t119082\nCaves\t119083\n碧萝芷\t119084\n彭水网\t119085\n打火\t119086\n天舟文化\t119087\n广州医药集团有限公司\t119088\nQBT\t119089\nVue-cli\t119090\n超污\t119091\n提钱\t119092\n上北\t119093\n黄石寨\t119094\n虚有\t119095\n锁锁\t119096\n江勇\t119097\n950XL\t119098\n嘶\t119099\n规模\t119100\n肌张力高\t119101\nmoretv\t119102\n升降门\t119103\n白姐\t119104\n壮胆\t119105\n金属学报\t119106\n女歌星\t119107\n国元\t119108\n菲尼萨\t119109\n1200套\t119110\n不省心\t119111\ngrinding\t119112\n言字\t119113\nSONOS\t119114\nHomeware\t119115\n星际特工\t119116\n工作经验\t119117\n皮克\t119118\n德胜门\t119119\n智百威\t119120\n文鼎\t119121\npyxl\t119122\n0卡\t119123\n标压处理器\t119124\nPenta\t119125\n20141119\t119126\n来了\t119127\nBop\t119128\n分享型\t119129\ntranslational\t119130\n大石\t119131\n插板\t119132\n第四套\t119133\n邱国鹭\t119134\n包破\t119135\n8匹\t119136\n省长杯\t119137\n千里江陵\t119138\n雷死\t119139\nButterfly\t119140\n俊男\t119141\n刨析\t119142\nAlaska\t119143\n熊蜂\t119144\n至强E5\t119145\n镍基合金\t119146\n消费函数\t119147\n咬掉\t119148\n末路\t119149\n北京市康达律师事务所\t119150\n盲侠大律师\t119151\n踏马\t119152\nScan\t119153\n梦幻封妖传\t119154\n都市霸道校草宠溺爱\t119155\n结衣波多野\t119156\n兰美抒\t119157\nDECLARE\t119158\n王愿坚\t119159\nPICC人保\t119160\n斗鱼鱼\t119161\n太子山\t119162\n44家\t119163\n张五海阔天空\t119164\n酒酿饼\t119165\n李诗韵\t119166\nDealAm\t119167\n凯时娱乐\t119168\n牟宗\t119169\n锐舞\t119170\n推挽式\t119171\n一首个\t119172\n瘟疫\t119173\n沪深港\t119174\nChiron\t119175\n钓组\t119176\n哈喇味\t119177\n技能型\t119178\n林建东\t119179\n渊源\t119180\n乌普萨拉\t119181\n黑牛城道\t119182\n大众辉腾\t119183\n太乙神数\t119184\nwin10\t119185\n仕置\t119186\n美域\t119187\n露鲍\t119188\n一弘\t119189\nincr\t119190\n行必果\t119191\n韩城市\t119192\n孤军奋战\t119193\n上海电影博物馆\t119194\n渝万公会\t119195\n连体服\t119196\n金街\t119197\n陈振濂\t119198\nVR女友\t119199\n奥法\t119200\n言听计\t119201\ntabcontrol\t119202\nshake\t119203\n深国\t119204\n刘学新\t119205\n透底\t119206\n析出\t119207\n孙弘\t119208\n优盛\t119209\n英雄联盟职业\t119210\n黄关春\t119211\n陶铸\t119212\n包括\t119213\n三凤桥\t119214\n王炎\t119215\n孤岛\t119216\n普五\t119217\n艾草\t119218\n达尼亚\t119219\nShelly\t119220\n河北科大\t119221\nMistress\t119222\nmodels\t119223\n帝道特产网\t119224\nBOY\t119225\n白瓷\t119226\n乾宫\t119227
\n汉峪\t119228\n机值\t119229\n2018年1月31日\t119230\nK-0510\t119231\n票据打印机\t119232\n第8篇\t119233\nPIA\t119234\n中国足彩网\t119235\n出长\t119236\n日语系\t119237\nLyX\t119238\n贵女\t119239\n红星\t119240\n省知识产权局\t119241\n表式\t119242\n长春火车站\t119243\n何以琛\t119244\n陆花\t119245\n荣耀畅玩7C\t119246\nlxf\t119247\npopo\t119248\n威索尼\t119249\n王帅\t119250\n@\t119251\n打绳\t119252\n802.1q\t119253\n上一夜\t119254\n席绢\t119255\n伯里克利\t119256\n我爱满堂彩\t119257\n塘下\t119258\n对称式\t119259\nwinodws\t119260\n鲜榨橙汁\t119261\n完美小说网\t119262\n2017-12-08\t119263\n保重\t119264\n大乔小乔\t119265\n貌合神离\t119266\n张恭庆\t119267\n塞尔达公主\t119268\n分线\t119269\n日本桥\t119270\n几公分\t119271\n谢丹\t119272\n薪火\t119273\n1000_\t119274\nQQ通讯录\t119275\nRoo\t119276\n金正恩\t119277\n苏州建材网\t119278\n小林未郁\t119279\n陈石\t119280\n达沃斯\t119281\n迈\t119282\n马尔科姆\t119283\n第二军医大学\t119284\n陈欣怡\t119285\n房方\t119286\n重庆协和医院\t119287\n五把\t119288\n综穿\t119289\nBasel\t119290\n保鲜袋\t119291\n在线违章查询网\t119292\n星兜\t119293\n型号\t119294\nGAAP\t119295\n四指\t119296\nDopa\t119297\n搭便\t119298\nfmx\t119299\n十全大补丸\t119300\n沥青路\t119301\n000034\t119302\n黑河\t119303\n毛周角化\t119304\nCFANZ\t119305\n中公教育河北分校\t119306\n垫球\t119307\ncomma\t119308\n希文\t119309\n茅台迎宾酒\t119310\n160名\t119311\n习大大爱着彭麻麻\t119312\n交交\t119313\n专四考试\t119314\n吃透\t119315\n护理师\t119316\n聚优\t119317\nhaerbin\t119318\n肇庆七星岩\t119319\n十送红军\t119320\n云南日报\t119321\nvalidated\t119322\nTOP50\t119323\nlftp\t119324\n泰禾光电\t119325\n复关\t119326\n墨版\t119327\n不预则废\t119328\n多线程\t119329\n600837\t119330\n牌匾\t119331\n神令\t119332\n提供\t119333\n第61集\t119334\n罗云熙\t119335\n语\t119336\n克隆大作战\t119337\n那些年,我们一起追的女孩\t119338\nie8i\t119339\n法国签证中心\t119340\n机要局\t119341\n细解\t119342\n喷涌\t119343\n2018-04-30\t119344\nabee\t119345\n适用性\t119346\nTeenagers\t119347\n肥熟\t119348\n弯处\t119349\n莫非王土\t119350\n1335\t119351\n拆除\t119352\ncloud2\t119353\n3y\t119354\n植金石\t119355\n刘丁\t119356\n柠檬水\t119357\n前四天\t119358\n张永新\t119359\n柳州晚报\t119360\nguess\t119361\n三相电度表\t119362\n黄河北大街\t119363\n表白日\t119364\n四十个\t119365\n徐州幼儿师范高等专科学校\t119366\n生存之道\t119367\n奥迪RS3\t119368\n阜新蒙古族自治县\t119369\n内录\t119370\n湖州市旅委\t119371
\n控制盒\t119372\n人大政协\t119373\n冬歇期\t119374\nweb\t119375\n页岩气\t119376\n影音先锋资\t119377\n茨实\t119378\n新人物\t119379\n2.2亿\t119380\nPolyU\t119381\n伊金霍洛旗\t119382\n亲昵\t119383\nVirusTotal\t119384\ndpx\t119385\n肖亚非\t119386\n超纯水\t119387\n莹子\t119388\n珠海北站\t119389\n关关雎鸠\t119390\n注消\t119391\n英雄联盟S7\t119392\n维修基金\t119393\n频率计\t119394\n华润海\t119395\n人的一生\t119396\n管理会计学\t119397\n陕西省住房和城乡建设厅\t119398\nmdaemon\t119399\n面齿\t119400\n第3季\t119401\niDoNews\t119402\ninterceptors\t119403\nhy\t119404\nv1.3.2\t119405\n胃气\t119406\nF40\t119407\n刘德华\t119408\n记者站\t119409\nyounger\t119410\n写字间\t119411\n富阳日报数字报\t119412\n牧歌\t119413\n乐堤港\t119414\ngm8\t119415\n存储过程_\t119416\n林可欣\t119417\narp\t119418\n美的空气能热水器\t119419\n金庸武侠小说\t119420\n523路\t119421\nmachinery\t119422\n迟缓\t119423\n阴阳眼\t119424\n热卡\t119425\n淘宝双11\t119426\nMathType\t119427\n胆儿\t119428\n荣耀冠军杯\t119429\ntopik\t119430\n丁内酯\t119431\n7000吨\t119432\n中山大学附属医院\t119433\n慢性膀胱炎\t119434\n阿里鱼卡吧\t119435\n富田\t119436\n客位\t119437\n云矿机\t119438\n黑莓keyone\t119439\n会期\t119440\n中山市博爱医院\t119441\n子宫前壁\t119442\n都市运输\t119443\n巨奶\t119444\n切片机\t119445\n规委会\t119446\n综合征\t119447\n降火\t119448\n济南论坛\t119449\npyyaml\t119450\n事务码\t119451\n病案\t119452\noracle10g\t119453\n泗阳县政府\t119454\n上海唐镇\t119455\npbi\t119456\n车顶箱\t119457\n学办\t119458\ndatagrid\t119459\n桐岛千沙\t119460\n家居网\t119461\ntapd\t119462\n医妃逆天_瓦猫\t119463\n胡燕\t119464\n恶灵\t119465\n项目章\t119466\nPCWAP\t119467\n文白\t119468\n企业应收账款\t119469\n任天堂公司\t119470\n使命召唤4:现代战争重制版\t119471\n康尼机电\t119472\n老大爷\t119473\n四川路桥\t119474\n邦奇\t119475\njunjin\t119476\n调查派\t119477\n塞子\t119478\n坡屋面\t119479\nYKK\t119480\n陈力丹\t119481\n不做梦\t119482\n易佳\t119483\nbody\t119484\nxtp\t119485\n中公医考网\t119486\nDukascopy\t119487\nbvi\t119488\n血值\t119489\nfuli\t119490\n孔氏\t119491\n映漾\t119492\n闘\t119493\n太古匆匆那年\t119494\nwoai\t119495\n参\t119496\n于林\t119497\n影域\t119498\n避让\t119499\n16节\t119500\n许江\t119501\n当归四逆汤\t119502\n对错\t119503\n尔雅2016\t119504\n服部平次\t119505\n第22\t119506\n广德县人民政府\t119507\nTabLa\t119508\n寿山石\t119509\n死神/境\t119510\narchitect\t119511\n迫在眉睫\t119512\n2\t119513\n
许燕\t119514\n生吃\t119515\n2698\t119516\n肓\t119517\n红色警戒3龙霸天下\t119518\n持久\t119519\n林光\t119520\n对账簿\t119521\n大气污染控制工程\t119522\ntocmat\t119523\n朱洁静\t119524\n电击疗法\t119525\n浚\t119526\n一百天\t119527\n浙江省经济和信息化委员会\t119528\n莆田市第一医院\t119529\n善导大师\t119530\n大胃王\t119531\n话剧团\t119532\nRowland\t119533\n艾露恩\t119534\n金亚科技\t119535\nホテル\t119536\n一晌\t119537\n木心\t119538\n5180\t119539\n华夏田氏网\t119540\n失落之塔\t119541\nKamikaze\t119542\n红星街\t119543\nHappy\t119544\n汽车购置税\t119545\n金门\t119546\n标题页\t119547\n浙江大学公共管理学院\t119548\n移动云商城\t119549\nWH-H800\t119550\n下不来\t119551\nsin七大罪\t119552\n清远市城乡规划局\t119553\n痛觉\t119554\n88元\t119555\n深海迷航独眼巨人号\t119556\nattendant\t119557\n沈璧君\t119558\nhuanjing\t119559\nKingMario\t119560\nyoungest\t119561\n99乘\t119562\n西汉\t119563\n雨棚\t119564\n前腿\t119565\n罗田论坛\t119566\n华西医院\t119567\n买价\t119568\nWINRAR\t119569\n概论\t119570\n各种各样\t119571\n九酷福音网\t119572\n阿德托昆博\t119573\n慎于言\t119574\nhighland\t119575\nMake\t119576\nVodafone\t119577\n20170402\t119578\n科讯\t119579\n569\t119580\n务院\t119581\n天竺国\t119582\n贝卡\t119583\n遥遥领先\t119584\n林如海\t119585\n杜比音效\t119586\n购物单\t119587\n公\t119588\n新款\t119589\n误食\t119590\n8000名\t119591\n漳\t119592\n648\t119593\n邮包\t119594\n雪佛龙\t119595\n脑白质脱髓鞘\t119596\n镇痛药\t119597\n螃蟹\t119598\n油香\t119599\n杨振华\t119600\n巨人教育\t119601\n脊髓灰质炎疫苗\t119602\n平西王府\t119603\n清山\t119604\nCOMPLETE\t119605\n苏州市公安局交通警察支队\t119606\nFDR\t119607\n顶贴\t119608\n郑州经开区\t119609\n黄果树瀑布\t119610\n长野博\t119611\ncatl\t119612\n本地\t119613\n成功学\t119614\n道奇挑战者\t119615\n豹龟\t119616\nし\t119617\n矿库\t119618\n鳄鱼式\t119619\nEntrepreneurship\t119620\n科朗医化\t119621\n第一粒\t119622\nwinedt\t119623\nCUPS\t119624\n沉箱\t119625\n六六歌\t119626\n烟卷\t119627\n西君\t119628\n拓扑图\t119629\nhomepage\t119630\n宅基地使用证\t119631\n超滤管\t119632\n49日\t119633\n新中基\t119634\n吸湿\t119635\n脱口而出\t119636\n雾中\t119637\n自媒体联盟\t119638\n豪力\t119639\n80后之窗\t119640\n实销\t119641\n复式公寓\t119642\n黄喉\t119643\n就业援助\t119644\nRory\t119645\n心慌\t119646\n灯神\t119647\n莱赛尔\t119648\n两升\t119649\n当归芍药散\t119650\nmkv\t119651\n坏蛋\t119652\n硫酸钡\t119653\n博识\t119654\n2三\t119655\n海森\t119656\
narangodb\t119657\n750M\t119658\n新碧\t119659\n伊滨区\t119660\n追星族\t119661\n风云四号\t119662\n此花亭奇谭\t119663\nhelloWorld\t119664\n3斤\t119665\nGi-Oh\t119666\n第四十六章\t119667\nzlf\t119668\n4.22日\t119669\n奕泽\t119670\numg\t119671\nwangEditor2\t119672\n我爱你\t119673\n吮吸\t119674\n扒带\t119675\nPULSE\t119676\nvuhdo\t119677\n大华集团\t119678\n魂殇\t119679\nsally\t119680\n一只只\t119681\n3D投影机\t119682\n细读\t119683\n麻叶\t119684\n奔驰GLE论坛\t119685\n丽芙泰勒\t119686\n恐龙蛋化石\t119687\n大姨夫\t119688\n广西消防\t119689\n英雄连:勇气传说\t119690\n宁美国度\t119691\n领军型\t119692\n白心火龙果\t119693\n犹大\t119694\n上任意\t119695\n房卡\t119696\n20150712\t119697\n跨海\t119698\nfrqc\t119699\n溪南\t119700\n地坛医院\t119701\n5毛\t119702\n闷绝\t119703\nVe\t119704\n泊位\t119705\n卡尔顿大学\t119706\n贡嘎山\t119707\n操作库\t119708\n洛浦街道\t119709\nBLAST\t119710\n第130\t119711\n法律经济学\t119712\n艰苦卓绝\t119713\n抹墙\t119714\nLED发光二极管\t119715\n本神\t119716\n文火\t119717\nAop\t119718\n大涌谷\t119719\n辛集社区\t119720\n2017年8月22日\t119721\nmg6\t119722\n天目药业\t119723\n料仓\t119724\n近朱者赤\t119725\n牧马人论坛\t119726\nappserv\t119727\nban位\t119728\n瑞博\t119729\n标准照\t119730\nSensors\t119731\n大司\t119732\n第2回\t119733\nInsect\t119734\n沿着\t119735\n罗克\t119736\n观音灵签\t119737\n宝安中学\t119738\n俞丽\t119739\n大粤\t119740\n与非阻塞\t119741\n鄂武商\t119742\n龙司\t119743\n兴蓉环境\t119744\n佐使\t119745\n雄姿\t119746\nForEach\t119747\n多好\t119748\n抹大拉\t119749\nPolyurethane\t119750\ntni\t119751\n华卓\t119752\n林彦俊\t119753\n明年8月\t119754\n中国产业信息网\t119755\nSustainable\t119756\n惩恶\t119757\nbonjour\t119758\n真笔\t119759\n特供\t119760\n走自己的路\t119761\n1.43\t119762\n宾州州立大学帕克分校\t119763\n兰州牛肉拉面\t119764\nureport2\t119765\n尧新\t119766\namt\t119767\n飞虎2\t119768\n截骨\t119769\n论文奖\t119770\nintroductory\t119771\nMBA\t119772\n李玉宾\t119773\n轴值\t119774\n防水条\t119775\n芙丽芳丝\t119776\n完具\t119777\n黄河交通学院\t119778\nSTM32F10x\t119779\nRayshen\t119780\n氯化聚乙烯\t119781\n华峰\t119782\n东京食种\t119783\n军警\t119784\n18寸\t119785\n2277\t119786\n手动挡车\t119787\n小气\t119788\n正如\t119789\n龙珠大道\t119790\n书社\t119791\n雷蒙磨\t119792\nNote4x\t119793\n内衣照\t119794\nopl\t119795\nHAN\t119796\n国防工业\t119797\n试运营\t119798\n缉毒警\t119799
\n20160903\t119800\n学乐云\t119801\n重于形式\t119802\n1.7.11\t119803\nlgd\t119804\nsl2\t119805\n卡昂\t119806\n春水\t119807\n韩啸\t119808\n三卷\t119809\nchorome\t119810\n中信手机银行\t119811\n邪杀\t119812\nSSK\t119813\n缬沙坦氨氯地平片\t119814\n真贫\t119815\njwplayer\t119816\nopti\t119817\n战斗片\t119818\n倒挂\t119819\nタイム\t119820\n太阁立志传4\t119821\n家私\t119822\nJPress\t119823\n金海心\t119824\nqun\t119825\nwiFi\t119826\n西派\t119827\n环南路\t119828\n上饶之窗\t119829\n牡丹吧_\t119830\nRS1\t119831\nelectro\t119832\n上海市公共卫生临床中心\t119833\n6.3mm\t119834\n西青郊野公园\t119835\nepl\t119836\n24px\t119837\nBECCA\t119838\n32&64位\t119839\n三略\t119840\nwww\t119841\n倾情\t119842\n印广法师\t119843\n健康卡\t119844\n华为新闻中心\t119845\n星耀\t119846\n物化\t119847\n沙塔斯城\t119848\n20161008\t119849\nAU3\t119850\nAla\t119851\n五价\t119852\n博润\t119853\n葡萄藤\t119854\nIRA\t119855\n2立方米\t119856\n液压榨油机\t119857\n罗马柱\t119858\n防撞板\t119859\n专供版\t119860\n南山区教育局\t119861\n案例\t119862\n丰田\t119863\n中英文名\t119864\n止血钳\t119865\n免抵\t119866\n祖祖辈辈\t119867\n幻想神域ol\t119868\n最美好的\t119869\n丙烯酸丁酯\t119870\nSanMaoSpace\t119871\n451号\t119872\n独行冰海\t119873\n140年\t119874\n台玻\t119875\n卡卷\t119876\nwebstorm2017\t119877\n九号秘事\t119878\n威力\t119879\n皮尺\t119880\nsheffield\t119881\nlinux根\t119882\n沈家门\t119883\n应县\t119884\n楼凤阁\t119885\nWeil\t119886\n紫薇阁\t119887\n马拉拉\t119888\n施甸县\t119889\n视神经\t119890\njavescript\t119891\n赢顺\t119892\nInfuse\t119893\nmandate\t119894\n601111\t119895\n欧克\t119896\n纽约港\t119897\n速流\t119898\n谦谦\t119899\n可汗学院\t119900\n校色仪\t119901\n建国北路\t119902\n美龄宫\t119903\nfegin\t119904\n300大\t119905\n雌激素受体\t119906\n酷酷\t119907\n勇攀\t119908\nFxPro\t119909\n百田星座圈\t119910\n恶乎\t119911\n老潼关肉夹馍\t119912\nsupport-v7\t119913\n丽花\t119914\n0.1.0\t119915\n5505\t119916\n首阳\t119917\n作恶\t119918\n档案册\t119919\n割断\t119920\n北京交通大学理学院\t119921\n招商局集团\t119922\n阳历表\t119923\n纱荣子\t119924\nb860av1.1\t119925\n第二波\t119926\n马尔济斯犬\t119927\nEYE\t119928\nPPPOE\t119929\n畏\t119930\n沙皇俄国\t119931\n太古\t119932\n米未传媒\t119933\n极品飞车18:宿敌\t119934\nfuller\t119935\nQuam\t119936\n南康\t119937\n安布拉\t119938\n纸板箱\t119939\n菲戈\t119940\n激战\t119941\namuseme
nt\t119942\n网\t119943\n厝内小眷村\t119944\n塞尔达amiibo\t119945\nK2\t119946\nMN\t119947\nCOD6\t119948\n芝兰\t119949\nGarland\t119950\n门磁开关\t119951\n第十六天\t119952\n大河路\t119953\n240hz\t119954\n开价\t119955\nxjtu\t119956\n几层\t119957\n神信物\t119958\n腾出\t119959\n临床型\t119960\n33mm\t119961\n众神\t119962\n东游\t119963\n旧桥\t119964\ncoverity\t119965\n懦夫\t119966\n哈里斯\t119967\n入党积极分子评语\t119968\n优教\t119969\nDenis\t119970\n过年好\t119971\nkillall\t119972\n中共\t119973\n南京教育\t119974\n广西工商局\t119975\ntransact\t119976\n私播\t119977\nBEI\t119978\n三星手机通讯录\t119979\n杭州地铁1号线\t119980\n正业\t119981\nPerhaps\t119982\n653个\t119983\n鞋男\t119984\n鸿星尔克\t119985\n58英寸\t119986\naok\t119987\n陆阳\t119988\nxlsread\t119989\n公共英语三级考试\t119990\n环卫车\t119991\n监证\t119992\n创刊号\t119993\n枝繁叶茂\t119994\n大衣柜\t119995\n荼\t119996\n乳燕\t119997\nmyspeed\t119998\n平坦\t119999\n云南中医学院\t120000\n8115\t120001\n彩图版\t120002\n卵黄\t120003\n奇瑞E5\t120004\n国父\t120005\n中南医院\t120006\n6.95\t120007\n原由\t120008\nMQL4\t120009\n雷布斯\t120010\n资溪县\t120011\n起走\t120012\n译林小学\t120013\n小恶\t120014\ntrapping\t120015\n扶持期\t120016\n金三角经济特区\t120017\ndevices\t120018\n章句\t120019\n洗根\t120020\nfig\t120021\n有机化学实验\t120022\nmythtype\t120023\n浮山县\t120024\n黑暗之魂1\t120025\nGuerlain法国娇兰\t120026\n新版\t120027\n走不走\t120028\nIReport\t120029\nsdde\t120030\n劳务收入\t120031\n卖血\t120032\n徐睿\t120033\n阚清子\t120034\n适于\t120035\n莱娅\t120036\n晚节\t120037\nElona\t120038\n陪安东尼度过漫长岁月\t120039\n多线\t120040\n鏡\t120041\n育才小学\t120042\nnike\t120043\n溥心畲\t120044\n东软望海\t120045\n江西省教育厅\t120046\n张银行\t120047\n粉装\t120048\n松雪泰子\t120049\n52linux\t120050\n赣州教育网\t120051\n吃住\t120052\n学堂在线\t120053\n胜负师\t120054\n和源\t120055\n牌型\t120056\n心肌酶\t120057\n补魔\t120058\n血缘\t120059\n2009年1月\t120060\n干煸\t120061\n建党节\t120062\n一下第五\t120063\n柔顺剂\t120064\n湛江师范学院\t120065\n暮光之城4破晓\t120066\n苏宿工业园区\t120067\nNations\t120068\n大利好\t120069\n受够\t120070\n0xC0000005\t120071\n可通\t120072\n高快\t120073\n迪蒙\t120074\n我相信你\t120075\n无赖勇者\t120076\n内图\t120077\n轴承钢\t120078\n旋盖机\t120079\n李良仕\t120080\ndidnapper\t120081\nstamped\t120082\n杨光斌\t120083\n10-5\t120084\n思博网\t120085\
n阿非\t120086\n克里斯汀斯图尔特\t120087\n挂花\t120088\n200多名\t120089\n高区\t120090\nJabra\t120091\n辽宁女排\t120092\n陈珏如\t120093\nShutter\t120094\nWillow\t120095\n第一章第三节\t120096\n2012-06-30\t120097\n蒙自源\t120098\n大理洱海\t120099\n素质\t120100\n立达信\t120101\n姚庄镇\t120102\n2117\t120103\n雌性\t120104\n旭派\t120105\n用钱\t120106\n上海市市\t120107\n威乐水泵\t120108\n杰卡德\t120109\n七伤拳\t120110\n百廿\t120111\ngoogleearth\t120112\n栈段\t120113\n参片\t120114\n五毛食品\t120115\n金属漆\t120116\n金海花园\t120117\n2xl\t120118\n民国十年\t120119\n96厘米\t120120\nGrave\t120121\n5080\t120122\nassignee\t120123\n法音\t120124\nUnsplash\t120125\nanx\t120126\nxiaoyuan\t120127\n定西\t120128\n爱恨\t120129\n20151213\t120130\n草台\t120131\n啼笑因缘\t120132\n老年代\t120133\n反恐精英cs1.6\t120134\nhitfilm\t120135\n延安三路\t120136\n279\t120137\n4266\t120138\nYong\t120139\nPsychology\t120140\n府学小学\t120141\nefuse\t120142\n5055\t120143\n门缝\t120144\narcv\t120145\n新秀赛\t120146\n老面\t120147\n猫病\t120148\n白水河\t120149\nMünchen\t120150\n户型\t120151\n猫与桃花源\t120152\n大蓉\t120153\n篡\t120154\nwin7画图工具\t120155\n三里庵\t120156\n任人摆布\t120157\n河南省天一大联\t120158\n二套房贷款\t120159\n1月10日\t120160\n李晓敏\t120161\n提取码\t120162\n国学智慧\t120163\n何藩\t120164\n壹宝\t120165\n一万吨\t120166\n敏感\t120167\n光明大陆\t120168\n陪睡\t120169\n呕心\t120170\ncr12\t120171\n英迈\t120172\n申园\t120173\nkpi\t120174\n预备党员大会\t120175\n黄潇\t120176\n87346159\t120177\n全国股份转让系统\t120178\n新东方在线论坛\t120179\n永远失去\t120180\n边疆行\t120181\n仰观\t120182\n紫藤花园\t120183\n姫様\t120184\nrat\t120185\n气管炎\t120186\n手指谣\t120187\n安徽省粮食局\t120188\n瓜子皮\t120189\n乔氏\t120190\n打分算命\t120191\n1款\t120192\n70块\t120193\ncd\t120194\n评价指标\t120195\n97.5\t120196\n滑移\t120197\n淡水\t120198\n中国经济出版社\t120199\nSBS歌谣大战\t120200\n雷纳德\t120201\n敦刻尔克\t120202\n战术\t120203\n袴\t120204\n钳形\t120205\n雅儿贝德\t120206\nUNITY\t120207\n风土\t120208\n玛瑞莎\t120209\n宇宙奇趣网\t120210\n热源\t120211\n曹乾\t120212\n滨州学院\t120213\n蜂友\t120214\n名刀\t120215\n莎兰\t120216\n巻\t120217\n竹虫\t120218\n创新路\t120219\nkhdaw\t120220\n黑豆汤\t120221\n水砂\t120222\n昆山市教育局\t120223\n美其名曰\t120224\n陕西省交通厅\t120225\n长治市政府网\t120226\n0.8厘米\t120227\nComment\t120228\n冷却泵\t120229\n成卷\t
120230\nWinds\t120231\n电力电子技术\t120232\nBDA\t120233\nスポ\t120234\n幂函数\t120235\n长庚医院\t120236\n芙蓉北路\t120237\ngoBack\t120238\n红米4A\t120239\n西武高铁\t120240\n中药药\t120241\nmoz\t120242\n董洪涛\t120243\n腰枕\t120244\n1001y\t120245\n陈廷敬\t120246\n开嗓\t120247\n我的生活\t120248\n纯阳宫\t120249\nDockOne\t120250\nEmail-Sansen\t120251\n中共中央国家机关工作委员会\t120252\n密歇根大学\t120253\n厄尔尼诺\t120254\n标号\t120255\n七里\t120256\n李金林\t120257\n6.9.2\t120258\n打包带\t120259\n道明光学\t120260\n准数\t120261\nsabic\t120262\n宇文成都\t120263\nFruits\t120264\n三垒股份\t120265\n浪子回头\t120266\nkcp\t120267\n瓜农\t120268\nMSA\t120269\n青柠\t120270\n远大前程\t120271\nSWING\t120272\n养蜂\t120273\n西安奥数网\t120274\nr210\t120275\n运城职业技术学院\t120276\nwms\t120277\nala\t120278\n遍地狼烟\t120279\n朝史\t120280\n异变\t120281\n古塘\t120282\n耕农\t120283\nivms4500\t120284\n珙县\t120285\n皮口\t120286\nAva\t120287\n玲声\t120288\n独断\t120289\n这条路\t120290\n高逼格\t120291\n车次\t120292\n紫玉山庄\t120293\nJ7\t120294\n北京首都国际机场\t120295\n72街\t120296\n掌权\t120297\n事事\t120298\n中国策划学院\t120299\n周为海\t120300\n沙湾镇\t120301\nvictor\t120302\n任期\t120303\nPaige\t120304\n压力蒸汽灭菌器\t120305\n朽木\t120306\n血甲\t120307\ncareless\t120308\n跨上\t120309\nWinRM\t120310\n236号\t120311\n斗门政府\t120312\n助鬼为乐\t120313\n黄炜\t120314\nliterally\t120315\n追忆潸然\t120316\n彻夜流香\t120317\ncoolnull\t120318\nprinceton\t120319\n&#8203\t120320\n营养片\t120321\n朝外\t120322\n4490\t120323\n对一\t120324\nFayin\t120325\n差色\t120326\nleverage\t120327\n孙燕\t120328\nfengzp\t120329\n快乐圣诞\t120330\n52部\t120331\n蓝光雍锦半岛\t120332\n守望野马\t120333\n龙旗广场\t120334\n新城吾悦广场\t120335\n苹果手机_手机学院\t120336\n美规\t120337\n陈村镇\t120338\n广东省委\t120339\n12.1寸\t120340\nDecanter\t120341\n表注\t120342\n一颗颗\t120343\nformId\t120344\n国家大剧院\t120345\n笔录\t120346\n女老总\t120347\n下忍\t120348\n耶耶耶\t120349\n南岳\t120350\n三道\t120351\nb超检查\t120352\nfing\t120353\nmysql数据库\t120354\nlnmpa\t120355\n金婷婷\t120356\n包放\t120357\nps软件\t120358\nUFRII\t120359\n卤米松乳膏\t120360\n香港街\t120361\n线性方程组\t120362\n能火\t120363\n剪刀梯\t120364\n爱奇艺\t120365\n影响力\t120366\n李毓芬\t120367\nnihon\t120368\nHannibal\t120369\n迸\t120370\n不由\t120371\n安素\t120372\nBootStr
ap\t120373\n二十五种\t120374\n胃肠间质瘤\t120375\n第10放映室\t120376\nunity3d\t120377\n上海静安嘉里中心\t120378\n原田\t120379\n仙宗\t120380\nabaqus\t120381\n锦绣中华民俗村\t120382\n会议\t120383\n芙蓉姐姐\t120384\n中国式关系\t120385\nSapling\t120386\n董璇\t120387\n晚一点\t120388\n内蒙古自治区政协\t120389\n内贸易\t120390\nsni\t120391\n千百十\t120392\n上海华东医院\t120393\n临淇镇\t120394\n晋陕\t120395\nEagles\t120396\nointment\t120397\n三牛\t120398\n18F\t120399\n奶奶\t120400\n卞夫人\t120401\n鱼壳\t120402\n和田枣\t120403\n丛林\t120404\nppt课件\t120405\n最近三天\t120406\n尚客优快捷酒店\t120407\n狭义\t120408\n张丽娟\t120409\ntutu\t120410\n牛车网\t120411\n正将\t120412\n对质\t120413\n近一年\t120414\n吊扇\t120415\nHWI\t120416\n测绘局\t120417\n4s\t120418\n华润燃气\t120419\n篮王\t120420\n福建医科大学附属第一医院\t120421\n习声\t120422\n3D扫描仪\t120423\n支撑性\t120424\n60多年\t120425\n赵欢\t120426\nRedTube\t120427\n三星电子\t120428\n0633\t120429\nDispatcherServlet\t120430\n实实\t120431\n透明化\t120432\n2018年4月7号\t120433\n奇瑞瑞虎5x\t120434\n迅雷影音播放器\t120435\n猪草\t120436\n佟铁鑫\t120437\n十五周\t120438\ncenturion\t120439\n星光域\t120440\n公亩\t120441\n魔兽世界工程学\t120442\n少吃\t120443\n三首\t120444\n白玛\t120445\n西安移动\t120446\n第93集\t120447\n西安软件公司\t120448\nvmnet8\t120449\n优特\t120450\n大身\t120451\n简体中文免安装版\t120452\n文君酒\t120453\n光学传感器\t120454\nJDPaint\t120455\nBind\t120456\n支书\t120457\n_勤加缘网\t120458\n中铁大桥局集团\t120459\nyoy\t120460\n中华人民共和国消防\t120461\n块材\t120462\n墩台\t120463\n顺其自然\t120464\n润喉\t120465\n检针机\t120466\n检定\t120467\n会使\t120468\n与世隔绝\t120469\n城建学院\t120470\nboao\t120471\n1周岁\t120472\n浙江商业职业技术学院\t120473\n1000t\t120474\n尺\t120475\n暗度陈仓\t120476\nMotors\t120477\nUIApplication\t120478\n外专局\t120479\nTab\t120480\n近乎完美\t120481\n团签\t120482\ndirections\t120483\n戴尔游匣\t120484\n分格\t120485\n灌胃\t120486\npavilion\t120487\n西安市人民政府办公厅\t120488\n2分之一\t120489\n河南省国家税务局\t120490\n12121\t120491\n加州理工大学\t120492\n神牧\t120493\n仙侠道2\t120494\n就不\t120495\n惶恐不安\t120496\n申博太阳城\t120497\n郑州市人民政府\t120498\n第83\t120499\n通信达\t120500\n巴塞罗那足球俱乐部\t120501\nConsul\t120502\n880万\t120503\nMEG\t120504\n劈腿\t120505\nnatio\t120506\ntopogun\t120507\nautoit3\t120508\n操纵者\t120509\n再三\t120510\n口苦\t120511\n洋县\t120
512\n美会\t120513\n环球塑化网\t120514\n视奸\t120515\ni80\t120516\n供过于求\t120517\ngatsby\t120518\n整理稿\t120519\nczc\t120520\n低音提琴\t120521\n工匠篇\t120522\n中山陵\t120523\n漏液\t120524\n2135\t120525\n诗集\t120526\n评话\t120527\n同庆楼\t120528\n孙爱军\t120529\n真帅\t120530\n聚龙湾\t120531\n樱知叶\t120532\n天刀\t120533\n战靴\t120534\n尾闾\t120535\n新股\t120536\n95510_\t120537\n通信展\t120538\n泰拉瑞亚瑟银\t120539\n查缺\t120540\n双轴撕碎机\t120541\n知堂\t120542\nFLVCD硕鼠\t120543\n爱丽\t120544\nqq邮箱\t120545\nxiazai\t120546\n妊娠期糖尿病\t120547\n雷普利\t120548\n点在\t120549\n紫山\t120550\nQuestMobile\t120551\n马尔福\t120552\n体会\t120553\n短长\t120554\n吞服\t120555\n中文幽默王\t120556\nSysprep\t120557\n音儿\t120558\n财务公司\t120559\n软环境\t120560\n盈门\t120561\n皇族\t120562\n爆文\t120563\n情字\t120564\n淡马锡\t120565\nG318\t120566\n心满意足\t120567\n梁咏琪\t120568\n石灰\t120569\n安全生产委员会\t120570\n虐恋\t120571\n稷下\t120572\n维a酸乳膏\t120573\n修行人\t120574\nboyd\t120575\n3月22号\t120576\n新塘\t120577\nworkflow吧\t120578\n裸缸\t120579\n伊格尼兹\t120580\n机队\t120581\nufo\t120582\n延长杆\t120583\n0.13.1\t120584\n龙湖集团\t120585\n易活\t120586\n大庆\t120587\nsnackbar\t120588\nTidyPlates\t120589\n中国共产党中央纪律检查委员会\t120590\n李国栋\t120591\nattacker\t120592\n陈先达\t120593\n莱菔子\t120594\nWordPress函数\t120595\n无冬\t120596\n枸杞芽茶\t120597\nFIDO\t120598\n空挂\t120599\nskins\t120600\n20160105\t120601\n博瑞GE\t120602\n掌托\t120603\n麻山药\t120604\n科印印刷网\t120605\n静态市盈率\t120606\n林农\t120607\n违约率\t120608\nARM7\t120609\nHad\t120610\n成都市第三人民医院\t120611\nsunday\t120612\n迪化\t120613\n512个\t120614\n沃尔沃XC60论坛_汽车之家论坛\t120615\n抱抱熊\t120616\n和时区\t120617\n生产安全事故应急预案管理办法\t120618\nBudget\t120619\n曹建明\t120620\n长白山机场\t120621\n痛饮\t120622\n5A\t120623\n装饰纸\t120624\n概率论与数理统计\t120625\nStrikes\t120626\n睿翼\t120627\nNBA2K12\t120628\nthinkcell\t120629\n微页\t120630\nDishes\t120631\n萌狼\t120632\n北京地铁12号线\t120633\n急性咽喉炎\t120634\n灭蝇\t120635\n娄晓娥\t120636\n长情\t120637\n大话西游月光宝盒\t120638\nfifaonline4吧\t120639\n童装网\t120640\n殇\t120641\n斯洛伐克\t120642\n硫化镍\t120643\n52PK电视游戏网\t120644\n上万条\t120645\n2.33G\t120646\n首例\t120647\nFelicity\t120648\n《复旦大学》\t120649\n平家物语\t120650\nBreak\t120651\n紫天SketchUp\t12065
2\n模头\t120653\njiage\t120654\n180302\t120655\n两股\t120656\n陇望\t120657\n美国国土安全部\t120658\n刘莎\t120659\nSANC\t120660\n梅菲斯特\t120661\n暖箱\t120662\n金舵\t120663\n甘甜\t120664\n切变\t120665\n失礼\t120666\n十二分之五\t120667\ntor\t120668\n弗雷\t120669\n别具匠心\t120670\nRivals\t120671\n电锯惊魂8:竖锯\t120672\n潍坊市人民政府\t120673\n蛋糕岛\t120674\n南极科考队\t120675\n读屏\t120676\n成群\t120677\n无语凝噎\t120678\n须佐鼬\t120679\n斯里兰卡航空\t120680\n安吉路\t120681\noperators\t120682\n东昌路\t120683\nkaiser\t120684\n新汶\t120685\nXpadder\t120686\n农业展\t120687\n图书试用网\t120688\n高才\t120689\n专套本\t120690\n断弦\t120691\n苹果助手\t120692\n挡不住的风情\t120693\n旧款\t120694\n连带保证责任\t120695\nSP3\t120696\n艾仕得\t120697\ngitv\t120698\n中华餐饮网\t120699\n0406\t120700\n保测评\t120701\n富顺县\t120702\n1829\t120703\n园艺展\t120704\n汝南县\t120705\n1起\t120706\n765\t120707\n滨江大厦\t120708\n山河恋\t120709\n袁紫衣\t120710\n6.x\t120711\n象征性\t120712\n教职员\t120713\n迁入\t120714\n数院\t120715\n一对三\t120716\n特斯拉公司\t120717\n倒播\t120718\nMISUMI\t120719\nODB\t120720\n土曼\t120721\n三槽\t120722\n5000步\t120723\n重塑\t120724\n乔其纱\t120725\n杰诺斯\t120726\n长安睿骋\t120727\n换卡\t120728\n联想公司\t120729\n阿瑜\t120730\n套改\t120731\n成现实\t120732\n塑性指数\t120733\n讨论式\t120734\nyingyue\t120735\n一档\t120736\n九册\t120737\n熊样\t120738\n花篮螺栓\t120739\n海商王2\t120740\ncsu\t120741\n车通\t120742\n下午2点\t120743\nGi\t120744\n猜猜我有多爱你\t120745\n电解水机\t120746\n高粘度\t120747\n一年一次\t120748\n批办\t120749\n金苑\t120750\n飞来寺\t120751\n发作性睡病\t120752\nWIn10\t120753\n伏击战\t120754\n黄乔歆\t120755\n奥术\t120756\ngl文\t120757\n晋\t120758\nオンライン\t120759\n海都\t120760\n香港立法会\t120761\n两门\t120762\n农业产业化\t120763\n超級賣AV\t120764\ncamera2\t120765\nq2\t120766\n环印\t120767\nug10\t120768\nLURE123路亚\t120769\n倍子\t120770\n中国民主促进会\t120771\n周武帝\t120772\n畲族\t120773\n顶牛贷\t120774\n数字\t120775\n青岛论坛\t120776\n烧火\t120777\n脱党\t120778\n疑问词\t120779\namour\t120780\n莺莺燕燕\t120781\n嘉德\t120782\n厦门劲龙机械工程有限公司\t120783\n李倓\t120784\n软中华\t120785\n子字\t120786\nwww.78dm\t120787\n多足\t120788\ndaigou\t120789\n条件\t120790\n游戏攻略网\t120791\n军事家\t120792\n耗尽型\t120793\n孙芮\t120794\n档页\t120795\n骑马与砍杀火与剑\t120796\n徐念沙\t120797\np11\t120798\n第62章\t120799\
n记数\t120800\n阴生子\t120801\n聚乙烯亚胺\t120802\n顶吸式\t120803\n0513\t120804\n翻白\t120805\n新泰一中\t120806\n横琴长隆国际海洋度假区\t120807\n蓝艾草\t120808\nCTS\t120809\nwar3\t120810\n高原反应_\t120811\n沽名钓誉\t120812\n窄带滤光片\t120813\n电解铜\t120814\n淋\t120815\n收腹机\t120816\nAuf\t120817\n帐页\t120818\n栏目\t120819\n方军\t120820\n病损\t120821\n彩涂板\t120822\n衰退期\t120823\n重庆市委党校\t120824\n找找\t120825\n西安地铁四号线\t120826\n蹙\t120827\n恒鼎\t120828\n移动房\t120829\n安泰科\t120830\n房政\t120831\n包黑子\t120832\n飘起\t120833\nGala\t120834\n少长\t120835\n水果界\t120836\nsaid\t120837\n手贷\t120838\n沓\t120839\n〇\t120840\n载流子\t120841\n布列松\t120842\nmapgis6.7\t120843\n别字\t120844\n最不\t120845\nchipscope\t120846\n滨江小学\t120847\npoky\t120848\n推理片\t120849\n保研率\t120850\n氨糖软骨素\t120851\n黄头侧颈龟\t120852\n情景喜剧\t120853\n授权方\t120854\n彩塘\t120855\n泡沫之夏\t120856\n骨妹\t120857\n大海贼探险物语\t120858\n银水\t120859\n季卫东\t120860\n湖北省武昌实验中学\t120861\n5.60\t120862\n昭昭\t120863\nサクラ\t120864\n兰州财经大学陇桥学院\t120865\nKristina\t120866\n阳山县\t120867\nhardened\t120868\n往复\t120869\n7017\t120870\n长柄\t120871\n番外\t120872\n江西青年网\t120873\n细化\t120874\nhqy\t120875\n压球机\t120876\n快语\t120877\n舞出我人生4\t120878\n且末\t120879\n大師\t120880\n中山影视城\t120881\n城域\t120882\n梅园新村\t120883\n握柄\t120884\n蛇者\t120885\n惊世\t120886\nhighest\t120887\nzebra\t120888\n民国初期\t120889\n大淘客\t120890\nx-1\t120891\n地窖\t120892\n追究制\t120893\n0019\t120894\n债房\t120895\n4安卓\t120896\n党代会\t120897\nJUN\t120898\n胡德夫\t120899\n全屋\t120900\n贴现率\t120901\nL5\t120902\n军事力量\t120903\n电类\t120904\n逯\t120905\n采集\t120906\nex1\t120907\n利郎\t120908\n磷化铝\t120909\ndua\t120910\n中国数学建模网\t120911\n胎压监测仪\t120912\n3D网游\t120913\n梁玉嵘\t120914\nvici\t120915\n42.3\t120916\n铁铲\t120917\n凭什么\t120918\n联想电脑\t120919\n46万\t120920\n张文中\t120921\n空闲时间\t120922\n南平\t120923\naiqing\t120924\nTicWatch\t120925\n20170403\t120926\n透明底\t120927\n寄生前夜2\t120928\n戚氏\t120929\n人乳\t120930\n2-10\t120931\n大通区\t120932\n写实主义\t120933\n985\t120934\n郑州电力高等专科学校\t120935\nmdp\t120936\n手持机\t120937\n兆日科技\t120938\n暗黑战神\t120939\nRedhat\t120940\n上海中医院\t120941\n娥\t120942\nwtforms\t120943\n强镇\t120944\n阿历克斯\t120945\n闪恋\t120946\n上海
华美医院\t120947\n禅音\t120948\n谈钱\t120949\n十三邀\t120950\n中顺\t120951\n0797\t120952\n2018年4月6日\t120953\nConsider\t120954\n月英\t120955\n04月03日\t120956\n十八届三中全会\t120957\npreston\t120958\n中广核电力\t120959\n神符\t120960\n超级星饭团\t120961\nXX集团\t120962\n3000G\t120963\n北京304医院\t120964\n機票\t120965\n解放军医学院\t120966\n刘和刚\t120967\ndropdown\t120968\n欧证\t120969\n600516\t120970\n浙江省食品药品监督管理局\t120971\n市团委\t120972\ndwing\t120973\n捉襟见肘\t120974\n万园\t120975\n椎弓\t120976\nSpeedtest\t120977\n解缙\t120978\n奔驰GLK级\t120979\n诺贝尔文学奖\t120980\n几匹\t120981\n夏利\t120982\n重聚首\t120983\n嘻哈吧\t120984\n藏体\t120985\n蒙头\t120986\n新郎\t120987\n韩亮\t120988\n根尖周炎\t120989\n幽雨\t120990\ngof\t120991\n易损\t120992\n人力资源规划\t120993\n读心术\t120994\n南红玛瑙\t120995\n曼基康猫\t120996\n惊吓\t120997\n大学英语\t120998\n肯德基店\t120999\n10万美元\t121000\n笔经面经_牛客网\t121001\n眉山市\t121002\n进酒\t121003\n516cn\t121004\n当庭\t121005\nAman\t121006\n妹哥\t121007\n巨额\t121008\n歌瑞森\t121009\n2400亿\t121010\n潍城\t121011\n医学检查\t121012\n嗨氏\t121013\n绍兴北\t121014\n滑步弓\t121015\n损险\t121016\n昕薇网\t121017\n真实照\t121018\n通化师范学院\t121019\n大红本\t121020\n24期\t121021\n普力马\t121022\n养马镇\t121023\n狂亲\t121024\n吴先生\t121025\ninfrared\t121026\n瓦窑堡\t121027\nundo表\t121028\n歉\t121029\n懋\t121030\n闪购\t121031\n2.7.5\t121032\n厨房装修|一起网\t121033\n王冠丽\t121034\nnif\t121035\n未由\t121036\n姜屯镇\t121037\n易佰连锁旅店\t121038\n正形\t121039\n广州市国资委\t121040\n142平米\t121041\n嘉论网\t121042\n11式\t121043\n钎具\t121044\n数量词\t121045\n李权哲\t121046\n首尔\t121047\n2016-2035年\t121048\n法律部\t121049\n情商\t121050\n康恒\t121051\nautomate\t121052\n味事达\t121053\n建纬\t121054\n原件\t121055\n非正版\t121056\n哆啦美\t121057\n图木舒克\t121058\nsystemview\t121059\n决定案\t121060\n朱慧敏\t121061\n药物代谢动力学\t121062\nMoss\t121063\n独生子女证\t121064\n皖人社\t121065\nProDesk\t121066\n中国科学院合肥物质科学研究院\t121067\n180401\t121068\n线索\t121069\n安乡县\t121070\n马年\t121071\n宝塔linux\t121072\n芳汀\t121073\n深圳华强\t121074\nscoped\t121075\n东京羽田机场\t121076\nEinaudi\t121077\n一池\t121078\n女帮\t121079\n专用语\t121080\n王大凡\t121081\n1个多月\t121082\nfuels\t121083\n冷镦\t121084\n农贷\t121085\n昆仲\t121086\n妻管严\t121087\n宠物医师\t121088\n1.5.x\t121089\n斯卡\t121
090\n结构式\t121091\ncssrem\t121092\n鲜牛\t121093\n琉璃光院\t121094\n济南军区总医院\t121095\n万历通宝\t121096\napic\t121097\n蓉e行\t121098\nLIGUI\t121099\nUiAutomator\t121100\nstrut2\t121101\n东方大酒店\t121102\n食素\t121103\n长胸\t121104\n白堤\t121105\n总厂\t121106\nGoFun\t121107\n河池学院\t121108\n第136集\t121109\n泛型\t121110\n李红艳\t121111\nC200L\t121112\nCircles\t121113\n踢倒\t121114\n电缆分支箱\t121115\n大洋\t121116\n江流儿\t121117\n中铁鲁班商务网\t121118\n玉米饼\t121119\n366集\t121120\n9527\t121121\nAnnals\t121122\n打包\t121123\nSheng\t121124\n交通银行上海分行\t121125\nManfrotto\t121126\n鼎湖山\t121127\n一夫\t121128\n绵阳新闻网\t121129\n湿度传感器\t121130\nstitching\t121131\n无锡市第三人民医院\t121132\n奠\t121133\n96万\t121134\n梦比优斯\t121135\n林小欧\t121136\n建都\t121137\nseparate\t121138\niuni\t121139\nkubernete\t121140\n15000公里\t121141\n二二八\t121142\n陇县\t121143\nencrypt\t121144\n传祺雪鹰\t121145\n180329\t121146\n92看片\t121147\ninva\t121148\n海南岛\t121149\n作曲家\t121150\nhug\t121151\n松花石\t121152\n五小强\t121153\nflstudio\t121154\nhp笔记本\t121155\n于萍\t121156\n2016.04\t121157\n数千个\t121158\n小羊羔\t121159\n转运\t121160\nibt\t121161\n乐敦CC\t121162\n/公司库/大全/厂\t121163\nvfp\t121164\n慕容三国\t121165\n鲁滨孙漂流记\t121166\n公有\t121167\nInvoice\t121168\n棒棒冰\t121169\n卡机\t121170\nENVIRONMENTAL\t121171\nSTC89C52单片机\t121172\nliterals\t121173\n军士\t121174\n日本邪恶漫画_邪恶漫画全集_不知火舞邪恶漫画集\t121175\n远攻\t121176\n中国移动研究院\t121177\n限温\t121178\n幽静\t121179\n扇贝\t121180\n耐特\t121181\n贴身校花\t121182\nfyy\t121183\ngratitude\t121184\n伦敦希思罗国际机场\t121185\nmacbookpro2017\t121186\neva吧\t121187\n骨色\t121188\n&#8222\t121189\n钢拳\t121190\n即墨市\t121191\n直属机关委员会\t121192\nWOMEN\t121193\n豆瓣fm\t121194\nSTM32F207\t121195\n鼓励\t121196\n狂兽\t121197\n浪潮工作室\t121198\n泉州市行政服务中心\t121199\n缪瑞林\t121200\nTrench\t121201\n意大利面\t121202\n邪骨\t121203\n新花\t121204\n猪手\t121205\n华为P20\t121206\nPS素材\t121207\n2016b\t121208\nyyf\t121209\n巫蛮儿\t121210\n亲述\t121211\n被逮\t121212\n观园\t121213\n武汉东湖高新集团股份有限公司\t121214\n迪亚\t121215\n小虎队\t121216\n芯片级\t121217\n椰棕\t121218\n罗胖\t121219\n绿亦歌\t121220\n尖子生\t121221\n税差\t121222\nbdsm\t121223\n永康环讯\t121224\n3.5折\t121225\n9轴\t121226\n干冷\t121227\n易班\t12
1228\n十码\t121229\n1型糖尿病\t121230\n政信\t121231\n旅客吞吐量\t121232\n乐彩网\t121233\n功成万骨枯\t121234\nreach\t121235\n政法\t121236\n数据结构算法\t121237\nsoundlink\t121238\n折扣村\t121239\n爱普猪影视网\t121240\n室外机\t121241\n调质\t121242\n防护衣\t121243\n流行社\t121244\n_超清\t121245\n2018年4月18号\t121246\n走行\t121247\n股东户数_财经_凤凰网\t121248\nn3ap\t121249\n从0到1\t121250\n云南艺术学院\t121251\n一个件\t121252\n樱\t121253\n辉仔\t121254\n河南省直第三人民医院\t121255\n酒市\t121256\nMesos\t121257\n比亚豪门\t121258\npanty\t121259\n整合者\t121260\n杨生\t121261\napdy\t121262\n永安论坛\t121263\n广东省人力资源社会保障厅_广东省人力资源和社会保障厅\t121264\nslowlog\t121265\nlpddr4\t121266\n倩\t121267\n金钟奖\t121268\n苏州广电\t121269\n台前\t121270\n沪江西语\t121271\n数字滤波\t121272\n500万辆\t121273\n第一格\t121274\nSybase数据库\t121275\n驻地\t121276\nTOX\t121277\n微软小娜\t121278\n学徒\t121279\n徐家力\t121280\n自然现象\t121281\nPPT2016\t121282\n发案率\t121283\nnote5吧_\t121284\n劲畅\t121285\n快穿]\t121286\n字篇\t121287\natma\t121288\n金漆\t121289\n第9代\t121290\n永不止息\t121291\n乐乐游戏网\t121292\n人户\t121293\n大赢\t121294\n郑州一小区\t121295\n狮驼\t121296\nX-T2\t121297\n第13卷\t121298\n阿门\t121299\nowin\t121300\n株洲政府\t121301\nCompanion\t121302\n挤破\t121303\n店小二\t121304\n醒不来\t121305\n谢怜\t121306\nREALLY\t121307\nCreateFile\t121308\nZico\t121309\n江楼\t121310\n水西村\t121311\n十道\t121312\n集结号\t121313\n高清网\t121314\n654\t121315\nGraphs\t121316\n贝利亚银河帝国\t121317\n东方大道\t121318\n13485\t121319\n降解性\t121320\nWolfram\t121321\n济南市市中区政府\t121322\nbaohu\t121323\n柱状\t121324\n新能源补贴\t121325\n好神\t121326\n姜克美\t121327\n锅贴\t121328\n73万\t121329\n外标法\t121330\n高仿表\t121331\nESPCMS\t121332\n淹死\t121333\n王启航\t121334\n重逢\t121335\nnay\t121336\neNaming\t121337\n大姚县\t121338\n颜氏家训\t121339\n大鲵\t121340\n中国联邦快递\t121341\n腾讯代理绝地求生\t121342\n霁\t121343\nLOGO\t121344\nhides\t121345\n来去一场梦\t121346\nPCB\t121347\n利民开发区\t121348\n攀钢\t121349\n禅舞\t121350\nmgt\t121351\n达达乐队\t121352\n太皇太后\t121353\ncollection\t121354\ntravian\t121355\n孽债\t121356\n_巴陵时尚网\t121357\n我需\t121358\n美国哈佛大学\t121359\n吉他拾音器\t121360\nhosted\t121361\naspose\t121362\n两味\t121363\n离子键\t121364\nevidence\t121365\nポルノビデオ\t121366\n卡日\t121367\n星等\t121368\n雷克
萨斯is300\t121369\n易彩\t121370\n输卵管堵塞\t121371\n永昌路\t121372\n玄学\t121373\n宗教事务局\t121374\n令人失望\t121375\n沙发套\t121376\n建筑设计规划\t121377\n牙龈\t121378\nNieR\t121379\nalpha1\t121380\n树花\t121381\n折带\t121382\nXD\t121383\n侠客风云\t121384\n有线电话\t121385\n销毁\t121386\n迅龙\t121387\n伽卡\t121388\n劲鹿\t121389\n沁阳市\t121390\nevisu\t121391\n本周\t121392\n第三季\t121393\neryar\t121394\ntestlink\t121395\n李圣杰\t121396\n胡可沙溢\t121397\n阿拉斯加航空\t121398\n美中宜和妇儿医院\t121399\n拥挤度\t121400\n弯\t121401\n误差分析\t121402\n1600亿元\t121403\n遅\t121404\n小帕\t121405\n虎标万金油\t121406\n雪蟹\t121407\nx6plus\t121408\n地球城\t121409\nfresco\t121410\n地素\t121411\nAPPCRASH\t121412\n平邑政府网_平邑县人民政府\t121413\n蓝石\t121414\n第六只\t121415\n茄瓜\t121416\n赣州黄金机场\t121417\n区出\t121418\n贺江\t121419\n西兰卡普\t121420\n武进路\t121421\naxisLabel\t121422\n天书奇谈天行者\t121423\n桃花开\t121424\n方壶\t121425\nWEI\t121426\n李诞\t121427\n重庆万盛\t121428\n收缩膜\t121429\n合法性\t121430\n尽善尽美\t121431\n完税证\t121432\n黄陵矿业集团有限公司\t121433\n张籽沐\t121434\n刘志华\t121435\nwin7蓝屏\t121436\n200颗\t121437\nCOOLPIX\t121438\n透情\t121439\n压身\t121440\n天越\t121441\n叶逐月\t121442\n9:00\t121443\n气韵\t121444\n短号\t121445\n新滨湖孔雀城\t121446\n德尔\t121447\n魏源\t121448\nlibero\t121449\n始祖鸟\t121450\n1532\t121451\nShoe\t121452\nGrade\t121453\n国家记忆\t121454\n妃子们\t121455\n互助\t121456\n阿拉斗牛\t121457\n2015年6月1日\t121458\n佩莱格里尼\t121459\n窗贴\t121460\n画江湖之不良人2\t121461\n通金魔方\t121462\n600703\t121463\nspps\t121464\n西安城\t121465\n欠薪\t121466\n龙台\t121467\nfinal\t121468\nATOS\t121469\n阀块\t121470\nthrow\t121471\n惠聚驻村\t121472\n二十四章\t121473\n琅琊王氏\t121474\nDataFrames\t121475\nDuShu\t121476\n89C51\t121477\n云片糕\t121478\n燕雀\t121479\n侯捷\t121480\n彧\t121481\n西南科技大学\t121482\n史志网\t121483\n铜仁市纪委监察局\t121484\n大奔\t121485\n一丁\t121486\nhypermesh\t121487\n咸蛋黄\t121488\n橙子树\t121489\nDamon\t121490\n學會\t121491\n托臂\t121492\nexe格式\t121493\n查档\t121494\n海力士\t121495\n中恒海晖城\t121496\n陈启礼\t121497\n端详\t121498\n青岛酒店\t121499\n聆讯\t121500\nhg680-j\t121501\nv-ray\t121502\n北京_新闻中心\t121503\n世界游泳锦标赛\t121504\n名言\t121505\n挑一\t121506\n王艳丽\t121507\nasyncio\t121508\nzWrite\t121509\n阳朔\t121510\n核动力航母\t121511\n用法_
例句_海词\t121512\n车载播放器\t121513\n数理化\t121514\n争产\t121515\n五谷粉\t121516\n商版\t121517\n05j909\t121518\n叠螺式污泥脱水机\t121519\n游移\t121520\n马小五\t121521\n读笔\t121522\n十首\t121523\n李富真\t121524\n章嘉\t121525\nBareback\t121526\n静力压桩机\t121527\n中共广东省委办公厅\t121528\n极地海洋世界\t121529\n打印机驱动网\t121530\n绝地求生超级助手\t121531\n除害\t121532\n贝莱德\t121533\n周月\t121534\n线框图\t121535\n艾叶\t121536\n2千\t121537\n梅川镇\t121538\n武英\t121539\nxa\t121540\n封住\t121541\npond\t121542\n4202\t121543\n5月后\t121544\n示\t121545\nimpaired\t121546\n味之谜\t121547\n躺赢\t121548\nzenith\t121549\n泰康在线\t121550\n花旗\t121551\n柳江古镇\t121552\nbrandsite\t121553\nump\t121554\n58吧_\t121555\n4条\t121556\n川悠里\t121557\nheb\t121558\n52mm\t121559\n第四辑\t121560\n南门桥\t121561\n貌不惊人\t121562\n人工神经\t121563\n鸡鸭\t121564\n金骏\t121565\n出影\t121566\n羟基化\t121567\n读后\t121568\n洛阳市环境保护局\t121569\n纺锤形\t121570\n高树三姐妹\t121571\n红阳\t121572\nsqlcipher\t121573\n奥博\t121574\n应付款\t121575\n贱婢\t121576\n北师大版小学\t121577\n无锡教育网\t121578\n小米鉴定\t121579\n艾瑞克森\t121580\n咬咬\t121581\n魅蓝Note\t121582\n百炼成神\t121583\nech\t121584\n农村人\t121585\n占有量\t121586\njhtml\t121587\n云监工\t121588\n细雪\t121589\n几许\t121590\n销\t121591\n张虹\t121592\n轩辕剑3天之痕\t121593\n潮牛\t121594\n环境监测\t121595\n路虎神行者2\t121596\n大白兔奶糖\t121597\n苏糯米\t121598\n美团打车吧\t121599\n踢踢\t121600\n拉开序幕\t121601\n701路\t121602\n苏菜\t121603\n壬子日\t121604\nCorn\t121605\n第58集\t121606\n电热开水器\t121607\n八类\t121608\nhappened\t121609\n别致\t121610\nAppSecret\t121611\nCONSULTING\t121612\n护舒宝\t121613\n王善朴\t121614\n超长\t121615\n青云谱区\t121616\n李炳\t121617\n委托合同\t121618\n名贵\t121619\n药业\t121620\n邮政编码查询系统\t121621\n黑简\t121622\n套机\t121623\n知州\t121624\nV4.2\t121625\n又遭\t121626\nLuggage\t121627\n小米Note2\t121628\n卫斯\t121629\nsikix\t121630\n柳亚子\t121631\n我的世界冒险者传说\t121632\n中远集装箱运输有限公司\t121633\nCCTV-15_央视网\t121634\npiper\t121635\n宗次郎\t121636\n护驾\t121637\n巴霍巴利王2\t121638\n黑黑乳\t121639\nsfv\t121640\n闪蒸干燥机\t121641\n7910\t121642\nrfs\t121643\n苦儿流浪记\t121644\n媲美欣\t121645\n23亿\t121646\n要诀\t121647\n尹姓\t121648\n流写\t121649\n岛国人\t121650\n黄河清\t121651\nmoved\t121652\n水调歌头\t121653\n连接类\t121654\nxt5\t121655\n脱脂奶\
t121656\n羊驼\t121657\n团队名\t121658\n盖泽\t121659\nMP4播放器\t121660\n善悉\t121661\n全时云会议\t121662\n金湾机场\t121663\nnhac\t121664\n南京雨花台\t121665\n永安\t121666\n浮生梦\t121667\n不可少\t121668\n联谊\t121669\n事业性\t121670\n娓娓\t121671\n懂事\t121672\n起作\t121673\n冷婚\t121674\n新浪网_新浪\t121675\n陈意涵\t121676\n觉醒\t121677\n上海一小区\t121678\n奥比时空猎人\t121679\nIcan\t121680\n来不及了\t121681\nBYJ\t121682\n良渚新城\t121683\nWin7资源管理器\t121684\n丙泊酚\t121685\n2000斤\t121686\n老师长\t121687\nString,String\t121688\n阜沙\t121689\n华杯赛\t121690\n毅然决然\t121691\nleaning\t121692\n神创\t121693\n锈病\t121694\n清冬\t121695\n螺旋杆\t121696\n祖格\t121697\n现代诗\t121698\nobd2\t121699\n天林\t121700\nprintarea\t121701\n刘敏林\t121702\n28.5\t121703\n毒句\t121704\n钻孔机\t121705\n九年前\t121706\n李治国\t121707\nincludegraphics\t121708\n济南市纪委\t121709\n外馆\t121710\n区医院\t121711\n反性骚扰\t121712\n黑暗龙\t121713\n没效\t121714\n梦想秀\t121715\n龙海\t121716\n大拉\t121717\n茎突\t121718\n保卖\t121719\n大行间距呢_\t121720\n宫墙\t121721\n王克\t121722\n金诗\t121723\n水上飞机\t121724\n刘积仁\t121725\n2g显存\t121726\n一证\t121727\n奥迪r8\t121728\n长藤\t121729\n白塔机场\t121730\nG36\t121731\nFanuc\t121732\n杨川\t121733\n应急办\t121734\n临床心理学\t121735\n泗县人民政府\t121736\n传播\t121737\n引人深思\t121738\n军部\t121739\n智慧港\t121740\n鸟居\t121741\n金钟\t121742\n液滴\t121743\n希杰\t121744\n级联菜单\t121745\nEP2\t121746\npt吧\t121747\n阿里指数\t121748\nsolr5\t121749\n西直门北大街\t121750\n劲仔\t121751\n3卡\t121752\n武装\t121753\n泰科电子\t121754\n鹤壁东站\t121755\n速锐论坛_汽车之家论坛\t121756\n爱普森\t121757\n无缝弯头\t121758\n水手服\t121759\n戒烟吧_\t121760\n物候\t121761\nanhei3\t121762\n根本\t121763\n刘增艳\t121764\nbypy\t121765\n灵丹\t121766\nharvard\t121767\n福建农林大学金山学院\t121768\n2000余\t121769\n冒险岛龙神\t121770\n呢_39健康网\t121771\nhuangqiqing123\t121772\nST-Link\t121773\n同德围\t121774\n茉莉茶\t121775\nσ\t121776\n挑卡网\t121777\n长兴中学\t121778\n镁铝\t121779\n大揭秘\t121780\n首佳\t121781\n云开\t121782\n九眼桥\t121783\n资邦\t121784\n振华股份\t121785\n美好生\t121786\n工程洗车机\t121787\n孔莹\t121788\n晒版\t121789\n44亿\t121790\nService\t121791\n华昌路\t121792\n7月9日\t121793\ncrv\t121794\n埃尔文\t121795\n野炊\t121796\n漫佛潮踪\t121797\n男友们\t121798\n淞沪战役\t121799\n旅游城市\t121800\n空气能热水工程\t121801\n佳木
\t121802\nacm\t121803\n散力\t121804\n气动布局\t121805\n阿天\t121806\n中国儿童中心\t121807\n姫\t121808\n内河\t121809\n梅州站\t121810\nPaperRight\t121811\n150_\t121812\n文曲星\t121813\n10.05\t121814\n达意隆\t121815\n腾讯新闻迷你版\t121816\nmeituan\t121817\n评鉴\t121818\n彭玲\t121819\n安国寺\t121820\n遭枪杀\t121821\n牧马山\t121822\nCC2640R2F\t121823\n湖北电信\t121824\n模切机\t121825\n郭佳\t121826\n强文\t121827\n陈丹丹\t121828\n第十六届\t121829\n瓷杯\t121830\nmuscular\t121831\n省建设厅\t121832\nd40\t121833\n牧云记\t121834\n茯茶素\t121835\n屈原\t121836\n维迦\t121837\n草榴社区\t121838\n法华经\t121839\nYukon\t121840\n沉没\t121841\n大班椅\t121842\nsideshow\t121843\n60套\t121844\n4存档\t121845\nStatistic\t121846\n阿娜\t121847\n星际2虚空之遗\t121848\n绍兴市区\t121849\n一千零一\t121850\n签证照\t121851\n滕州一中\t121852\n舒适达\t121853\n顶体酶\t121854\nFileReader\t121855\n营业执照\t121856\nMolex\t121857\n翠鸟\t121858\nresponsibility\t121859\n第一队\t121860\nReaders\t121861\n川教\t121862\n粘带\t121863\n长城汽车\t121864\n可兰\t121865\n快递业\t121866\n云南城投集团\t121867\n积友\t121868\n320s\t121869\n高塔\t121870\n第93期\t121871\nLUCI\t121872\n圣光\t121873\n常规赛\t121874\n直角尺\t121875\nEffort\t121876\n天王寺\t121877\n浙江大学海洋学院\t121878\n排序算法\t121879\n羊肉火锅\t121880\n进料加工\t121881\n开列\t121882\n年头\t121883\nicoca\t121884\n重庆搬家公司\t121885\nc602\t121886\n东方精工\t121887\n崔氏\t121888\n极品飞车20吧\t121889\n60天后\t121890\n松手\t121891\nword/wps\t121892\n1.11\t121893\n高考\t121894\n甘茶度\t121895\n烟感探测器\t121896\nNor\t121897\n清季\t121898\n守卫剑阁降龙伏虎\t121899\ncheck\t121900\n豺\t121901\n枯木\t121902\n毛竹笋\t121903\n花里子\t121904\n万客\t121905\n鲁能领秀城\t121906\nUDC\t121907\n前海开源基金\t121908\n邓超\t121909\n试生产\t121910\nneeq\t121911\n会员会费\t121912\n白盾云\t121913\nPACS\t121914\n第一周\t121915\n叮\t121916\n广州复大肿瘤医院\t121917\nSeng\t121918\n微软五笔输入法\t121919\n无度\t121920\n中国中投证券\t121921\n发动机舱\t121922\n肖雄\t121923\nmana\t121924\n排单\t121925\n斯巴达300勇士\t121926\n世界之旅\t121927\nabilities\t121928\n中共眉山市委\t121929\n骨水泥\t121930\nDOTA2荒神罪\t121931\npring\t121932\n美利山\t121933\n优站\t121934\n蓝溪谷\t121935\n金桂苑\t121936\n三十亿\t121937\n二十集\t121938\nSCRUM\t121939\n赛克\t121940\nHOPE\t121941\n圆柱形\t121942\n绿色下载吧\t121943\n窦仕骁\t121944\ncontex
tmenu\t121945\n害我\t121946\n玫瑰树\t121947\n抛光\t121948\nRimowa\t121949\n10019\t121950\n捕手\t121951\nba\t121952\n西溪蝶园\t121953\n渝长厦\t121954\n王牌御史\t121955\n迈克杰克逊\t121956\n北汽男排\t121957\n芽孢杆菌\t121958\n新疆生产建设兵团公共资源交易信息网\t121959\n晶方科技\t121960\n斯柯达汽车\t121961\n幅\t121962\n五洲城\t121963\nprotostuff\t121964\n6维\t121965\n雅风\t121966\nimputation\t121967\n滨海新城\t121968\nghostscript\t121969\n孙大伟\t121970\n卷号\t121971\n残存\t121972\nmingw32\t121973\n贾伟\t121974\n挡路\t121975\n2017年2月17日\t121976\n168元\t121977\n咬伤\t121978\n台柱\t121979\n毛囊\t121980\n随机变量\t121981\n奔驰s320\t121982\n空调滤清器\t121983\nd30\t121984\n奥菲利亚\t121985\n上海迪士尼酒店\t121986\n位号\t121987\n暴走说法\t121988\n珠山\t121989\n指甲锉\t121990\n岭南通\t121991\n时髦感\t121992\n满意度\t121993\n滑过\t121994\n潜海\t121995\n无损检测\t121996\n明显陵\t121997\n点佰趣\t121998\n常用电路\t121999\n超级宏官网\t122000\n跳汰机\t122001\n张尧学\t122002\n永兴路\t122003\nmfc100.dll\t122004\n金车\t122005\n母乳性黄疸\t122006\n大清国\t122007\n垫铁\t122008\n癌友\t122009\nno3\t122010\n学习成就\t122011\n管理科学\t122012\n繁星戏剧村\t122013\n行走式\t122014\n三三制\t122015\n风枪\t122016\n资料馆\t122017\nFUCHS\t122018\n武汉万达\t122019\n报错\t122020\n大灾\t122021\n行数\t122022\n仲间\t122023\nbazel\t122024\n织梦CMS帮助中心\t122025\n新兴村\t122026\nx3550\t122027\n2017-12-12\t122028\nTEF\t122029\n170米\t122030\n生于\t122031\nlibvirt\t122032\n安塞腰鼓\t122033\ncontend\t122034\nXPath\t122035\n龙堡\t122036\n.rar\t122037\n10.5米\t122038\n博纳影业\t122039\n项目单\t122040\n绿本\t122041\n讨人\t122042\n三分钟\t122043\n谍影重重4\t122044\n差不\t122045\n高厚\t122046\n殷桃\t122047\n黄泥螺\t122048\n默小宝\t122049\ndiwurenge\t122050\nDOIT\t122051\n厦深高铁\t122052\nWIKA\t122053\n天命\t122054\n中国职业技术教育学会\t122055\n10.1.4\t122056\n保管人\t122057\n武口\t122058\n机电证\t122059\n455\t122060\n温州网\t122061\n济南移动\t122062\n潦\t122063\n10兆\t122064\n泪膜\t122065\n君来\t122066\n集体经济\t122067\nQQ群号网\t122068\n突访\t122069\nxfs\t122070\ngenesys\t122071\n庞涓\t122072\nHoneywell_Honeywell\t122073\n104平米\t122074\n神语\t122075\n丁晓红\t122076\n蛇口\t122077\n最高峰\t122078\nDotfuscator\t122079\n原建\t122080\n泼彩\t122081\ndcm\t122082\n1.12.2\t122083\n北展\t122084\njface\t122085\n20171217\t122086\n优惠\t12
2087\n揽胜公园\t122088\n谁的青春不迷茫\t122089\n名人堂\t122090\n职称证\t122091\nColleges\t122092\n市城管局\t122093\n九点\t122094\nMASTERCAM\t122095\n闽西职业技术学院\t122096\nD类\t122097\n爱贝亲子网\t122098\n小圆镜\t122099\n中国电信广东公司\t122100\n小米A1\t122101\n小涛\t122102\n闰月\t122103\n18k金\t122104\n中东地区\t122105\n解方程\t122106\nPatent\t122107\n杭州湾湿地公园\t122108\n调汇\t122109\ntraditional\t122110\n诺基亚3310\t122111\n91chinese\t122112\n无所遁形\t122113\n乾道\t122114\n120帧\t122115\nak380\t122116\nhacmp\t122117\nmar\t122118\n江西省妇幼保健院\t122119\n耳压\t122120\n丁兰\t122121\n陕北人\t122122\n裙带关系\t122123\n十八集\t122124\n邓英\t122125\n33000\t122126\n农田水利\t122127\n汾湖高新区\t122128\n杭四中\t122129\n&#8211\t122130\nKurento\t122131\nlase\t122132\n影片\t122133\n团课\t122134\n传统观念\t122135\n2u\t122136\n7.2.3\t122137\nEICC\t122138\n11月1日起\t122139\n导入器\t122140\n24关\t122141\nstressful\t122142\n征信记录\t122143\nWarface\t122144\n一下\t122145\n1-10号\t122146\n六颗\t122147\nscoket\t122148\n歌瑞尔\t122149\n昏迷不醒\t122150\n老孙\t122151\n公司理财\t122152\n在街上\t122153\n昆山装修公司\t122154\n山西新闻网\t122155\n八岐大蛇\t122156\n张保维\t122157\n谣言\t122158\n140公里\t122159\n自达\t122160\n王屋山\t122161\n马彩\t122162\n上皮组织\t122163\n物联中国\t122164\n俯卧\t122165\n0.055\t122166\n余弦定理\t122167\n7000亿元\t122168\n秋水逸冰\t122169\n戒毒频道_健客网\t122170\n喇叭分类信息网\t122171\n末世类\t122172\n瑞龙\t122173\n7622\t122174\nquene\t122175\n77bike\t122176\n上市公司\t122177\n电蚊\t122178\nWinHex\t122179\n垫款\t122180\n入团\t122181\n淋巴肿瘤\t122182\n郑子诚\t122183\n钢质防火门\t122184\nQ值\t122185\n邗沟\t122186\n离散度\t122187\n速度场\t122188\nbastard\t122189\n3kg\t122190\n康复\t122191\nXA1\t122192\n主桥\t122193\n永德县\t122194\nlakka\t122195\n台籍\t122196\n滴漏咖啡\t122197\n日日操\t122198\n泡泡电影网\t122199\n权倾天下\t122200\n罗江县\t122201\n第十一回\t122202\n剑阁县人民政府\t122203\n蚕豆\t122204\nlemonade\t122205\n500家\t122206\n冰糖心\t122207\nSXI\t122208\n这个杀手不太冷\t122209\n版权侵权\t122210\nshhn\t122211\nsoda\t122212\nuiscroll\t122213\n外贸出口\t122214\n胸型\t122215\nconstants\t122216\n齐木\t122217\n湖南省人民代表大会常务委员会\t122218\n杨露禅\t122219\n脚气病\t122220\n嘉定\t122221\n壹原侑子\t122222\n阿里速卖通\t122223\n牛头怪乐园\t122224\n广兴镇\t122225\n乡镇卫生院\t122226\n北京数字学校\t122227\
nHE\t122228\n5速\t122229\n征文\t122230\n通州国\t122231\n宁波市人力资源和社会保障局\t122232\n新讯\t122233\n垂衣\t122234\ndevel\t122235\n比亚仙路至尊\t122236\n西安政府\t122237\n小米助手\t122238\nBBIN\t122239\ndil\t122240\nformal\t122241\n工业工程专业\t122242\n20140327\t122243\n螺旋藻\t122244\n土地日\t122245\nPS1\t122246\n后进\t122247\nSSRS\t122248\n成功率\t122249\ncertainly\t122250\n乐村淘\t122251\nloom\t122252\n兴东机场\t122253\n电镀铬\t122254\n用情\t122255\n文晖路\t122256\n葛兰素史克\t122257\n夺妻\t122258\n詹姆斯邦德\t122259\n曹王\t122260\n一折\t122261\n李蚊香\t122262\n固定资产折旧_\t122263\n中途岛之战\t122264\n深圳市外事办公室\t122265\nCMake\t122266\n二师\t122267\n小新潮\t122268\n千千万万次\t122269\n命相\t122270\ndj舞曲\t122271\n用户态\t122272\n透视\t122273\nKMC\t122274\n左家庄\t122275\n画刷\t122276\nMBA考试\t122277\nOUT\t122278\n佩带\t122279\n循环链表\t122280\n世贸丽晶城\t122281\n捻线机\t122282\n畅呼吸\t122283\n第四句\t122284\n农廉\t122285\n腾讯qq\t122286\n北京中医院\t122287\n3DLC\t122288\nPM2.5空气质量指数\t122289\n哽\t122290\n蜂巢\t122291\n燃烧学\t122292\ncaoprom\t122293\n1.45G\t122294\n安帅\t122295\n普兰德\t122296\n捐建\t122297\nssq\t122298\n忍无可忍\t122299\n审计师\t122300\nps拾色器\t122301\n大忌\t122302\n五粮液\t122303\nBrewing\t122304\n麻辣面\t122305\n9月初\t122306\n曹妃甸区\t122307\n用友畅捷通\t122308\n周历表\t122309\n120欧姆\t122310\n座车\t122311\n黑屌\t122312\n炉石传说卡\t122313\n对弈\t122314\n腰缠万贯\t122315\n奴仆\t122316\n陈钰琪\t122317\nApproval\t122318\n合肥新闻网\t122319\n仓配\t122320\n渣打银行\t122321\n太残酷\t122322\n春有百花秋有月\t122323\n96分钟\t122324\nAE2\t122325\nFind\t122326\n320万\t122327\n徐汇区\t122328\n一然\t122329\n丁香花\t122330\n闹访\t122331\n立邦\t122332\nPhysics\t122333\n阳极管\t122334\n雷朋太阳镜\t122335\n项目集\t122336\n流线\t122337\n小街\t122338\n毛台\t122339\n直埋电缆\t122340\n民营经济\t122341\n晚枫歌\t122342\n篡位者\t122343\nxaxis\t122344\n投医\t122345\n日新\t122346\n美术史\t122347\nwinpe\t122348\n大湾\t122349\n黄冈新闻网\t122350\n刀战\t122351\n一往情深\t122352\nMARKETING\t122353\nrj\t122354\n目标\t122355\n北京兴事堂药店\t122356\n拉菲酒\t122357\n称重\t122358\nconfirmed\t122359\n娟娟\t122360\n秦汤汤\t122361\n香飘飘\t122362\n帮帮网\t122363\n一面之缘\t122364\n超星尔雅\t122365\ngovern\t122366\n0xc0000417\t122367\n绝世小受\t122368\n马意思\t122369\n边角\t122370\n十三种\t122371\n射手座\t122372\n生
命之源\t122373\n用友ERP-NC\t122374\n整流器\t122375\nHEX\t122376\nv3.7.2\t122377\n脑肿瘤\t122378\n师座\t122379\n河工大\t122380\n除味剂\t122381\n朴枫\t122382\n大顶\t122383\n图费\t122384\n黄金交易\t122385\nINSPIRON\t122386\n中国桂平市政府\t122387\n第3章\t122388\n云朵朵\t122389\n入金\t122390\n料斗\t122391\nChest\t122392\n上翘\t122393\n芬妮\t122394\n典例\t122395\n笑春风\t122396\n张叔叔\t122397\n车用车\t122398\n凉拌牛肉\t122399\nStride\t122400\n董克平\t122401\npricing\t122402\n七剑\t122403\n检测场\t122404\n没有问题\t122405\n有过之而无不及\t122406\n电影港\t122407\n省教育厅\t122408\n雅字\t122409\n流失\t122410\n国士\t122411\n时代先锋\t122412\n下雨天\t122413\n磁感应\t122414\n绝交书\t122415\n文字游戏\t122416\nwatchmaker\t122417\n江苏银行卡易贷\t122418\n同济大学建筑与城市规划学院\t122419\n捷顺科技\t122420\ntai\t122421\n千禾味业\t122422\n番禺万达广场\t122423\n底质\t122424\n帝王之路\t122425\n高尔夫GTI\t122426\n天津市市场和质量监督管理委员会\t122427\n10千伏\t122428\nzisha\t122429\nGB50222-2017\t122430\n制单员\t122431\n沐春风\t122432\n樱桃子\t122433\n郑煤集团\t122434\n恩仇\t122435\n华中科技大学出版社\t122436\nzar\t122437\n恩佐\t122438\n刮倒\t122439\nMG3680\t122440\n彩乃\t122441\n奇文\t122442\n讲者\t122443\ntypea\t122444\n唐探2\t122445\n停止\t122446\nggd\t122447\nmh3g\t122448\n青少年宫\t122449\n现货交易\t122450\n金针\t122451\n王津元\t122452\n黄立行\t122453\n逸周书\t122454\nredive\t122455\n四六分\t122456\n探索图形\t122457\nHalls\t122458\nSlyar\t122459\n200mg\t122460\n守岁\t122461\njal\t122462\n奴隶区:我和我的23个奴隶\t122463\n老爷机\t122464\n靠背椅\t122465\nhamilton\t122466\n1654\t122467\n中共二大会址\t122468\n递交\t122469\n达布希勒图\t122470\n诚丰\t122471\nydwe\t122472\n叫苦\t122473\n王振国\t122474\n和点\t122475\n华夏银行手机银行\t122476\n西北工业大学研究生院\t122477\n陈文奥尼尔\t122478\n印面\t122479\n小威廉姆斯\t122480\n肋软骨隆鼻\t122481\nYY6029影院\t122482\n22名\t122483\n居安思危\t122484\n苏梅\t122485\n720_\t122486\n女王节\t122487\n袁昆\t122488\nNEKOPARA\t122489\n军情观察室\t122490\n一分钟后\t122491\n张丽敏\t122492\n网络借贷信息中介机构业务活动管理暂行办法\t122493\n千锤万凿\t122494\ndomestic\t122495\n弹幕\t122496\n国外\t122497\n张江高科技园区\t122498\n齐齐乐梦幻诛仙手游\t122499\n洪江古商城\t122500\n光明磊落\t122501\n数控车床车\t122502\n牛仔网\t122503\n中文类\t122504\n无骨雨刷\t122505\n手板\t122506\n磷脂酰胆碱\t122507\n沈钧儒\t122508\n洗发乳\t122509\n100路\t122510\nCIPS\t122511\n甬商\t122512\n灰
场\t122513\n中华孝道园\t122514\nxheditor\t122515\nAkari\t122516\nurl编码\t122517\n中国社会科学报\t122518\nuploadify\t122519\n9口\t122520\n赛格威\t122521\nAudrey\t122522\n醉品\t122523\naicloud\t122524\nsites\t122525\n寡嫂\t122526\n学史\t122527\n六艘\t122528\n神战\t122529\n王辛巴\t122530\n2017年10月15日\t122531\ntrun\t122532\n鸭绿江\t122533\n冤鬼\t122534\n缺氧性\t122535\ndoesnt\t122536\n2018.5.1\t122537\n董董\t122538\n长途\t122539\nntopng\t122540\n新低\t122541\n观万象\t122542\nld\t122543\n水刑\t122544\n王明明\t122545\n键位\t122546\n汤勇\t122547\nvstack\t122548\n枣核\t122549\n百视通\t122550\ndataurl\t122551\nKeeps\t122552\n通用规范汉字表\t122553\n纯新\t122554\n辛店村\t122555\n凌源\t122556\n710\t122557\n徐大宝\t122558\nrecently\t122559\n狄青\t122560\n110千伏\t122561\n小红书吧\t122562\n明达\t122563\n清潭\t122564\n中物联\t122565\n四郎探母\t122566\n资源型\t122567\n节节\t122568\n项阳\t122569\n保障型\t122570\n速尔快递单号查询\t122571\n五一国际劳动节\t122572\n复合硅酸盐水泥\t122573\n青浦汽车站\t122574\nmanu\t122575\n静美\t122576\nONDA\t122577\n九大\t122578\n海丝\t122579\n数联\t122580\ngeeks\t122581\n艰涩\t122582\n僵尸岛\t122583\n骆集益\t122584\nradwimps\t122585\n周黑鸭\t122586\n罗曼史\t122587\n礼制\t122588\n香坊\t122589\n拟\t122590\n大有所为\t122591\nADO\t122592\n智慧星\t122593\nkilakila\t122594\n义犬\t122595\nPowerCenter\t122596\n射色\t122597\nt桖\t122598\ndatazoom\t122599\n惠娜\t122600\n宿松论坛\t122601\nJaymes\t122602\n绝口\t122603\n陈望道\t122604\n北京经济师\t122605\n使命召唤14\t122606\n卡骆驰\t122607\n中公教育陕西分校\t122608\n周王\t122609\n立时\t122610\n蒂勒森\t122611\n芬森\t122612\n钦差大臣\t122613\n侦办\t122614\n花友们\t122615\n姆吉拉\t122616\n防弹车\t122617\n上庄村\t122618\n75mg\t122619\n锐龙R5\t122620\nwince\t122621\n0821\t122622\n琅琊山\t122623\nPOCO网\t122624\nHIFIDIY\t122625\n吉号\t122626\n统景\t122627\n随机信号\t122628\n江铃汽车\t122629\n安全\t122630\n没有你我怎么办\t122631\n詹妮弗洛佩兹\t122632\nproplus\t122633\n中国进出口银行\t122634\n社会工作\t122635\n凯博\t122636\n物以类聚\t122637\n薄樱鬼\t122638\ndgs\t122639\n英帮助中心\t122640\n全女\t122641\n广西壮族自治区旅游发展委员会\t122642\nEmotions\t122643\n协同OA\t122644\n蜀镇\t122645\n惠润\t122646\n5万起\t122647\nlifts\t122648\n孙皓晖\t122649\n铜陵县\t122650\n6070\t122651\n四县\t122652\nvidio\t122653\n哈马斯\t122654\n苏尚卿\t122655\n自告奋勇\t1
22656\n密州出猎\t122657\n33类\t122658\n累不\t122659\n微码\t122660\n崖壁\t122661\n和创科技\t122662\n砧\t122663\n皇亲国戚\t122664\n顶替\t122665\nbetway\t122666\n门纱\t122667\n吃水果\t122668\n原碟\t122669\n惊鲵\t122670\n顺位\t122671\n黄桥烧饼\t122672\n破处\t122673\n新街口地区\t122674\nClearlove\t122675\n瓷厂\t122676\n5C\t122677\n潘潘\t122678\n芈月重明\t122679\nwrites\t122680\n两对半\t122681\n李明华\t122682\n事非\t122683\n米浆\t122684\nFzlu\t122685\n通用寄存器\t122686\na10\t122687\n东方凭依华\t122688\nMov\t122689\n一淘\t122690\n字帖\t122691\n如皋市\t122692\nɡ\t122693\n奉孝\t122694\n一分钟\t122695\n睡醒\t122696\n欣儿\t122697\n计罚\t122698\njeesit\t122699\n儿童节\t122700\n英米\t122701\nflashair\t122702\nwindows启动管理器\t122703\n套用\t122704\n少女日记\t122705\n沙河北大桥\t122706\nvarious\t122707\n联级\t122708\n佛珠\t122709\nbiss\t122710\n龙成\t122711\nOverseas\t122712\n香蒲\t122713\n梁朝伟\t122714\nundertaking\t122715\n平高电气\t122716\n湖南省委宣传部\t122717\n西瓜影音\t122718\n凤凰山路\t122719\nparm\t122720\n55本\t122721\n汾州\t122722\n华联综超\t122723\n红场\t122724\n坪石\t122725\n裂石\t122726\n吊炸\t122727\ndangan\t122728\n吕大豹\t122729\n3台\t122730\n史学界\t122731\n未悔\t122732\n张翰\t122733\n想清楚\t122734\n行业信息网\t122735\ndrbd\t122736\n3080\t122737\n火柴人联盟2\t122738\n南宁地铁2号线\t122739\nGS4\t122740\n几十页\t122741\nsooner\t122742\n荡涤\t122743\n猴女\t122744\n郴州职业技术学院\t122745\n丘疹\t122746\n盂兰盆节\t122747\n采煤机\t122748\nEli\t122749\ntokyotosho\t122750\n微电流\t122751\nlaopo\t122752\nTypes\t122753\n猜火车\t122754\ndiv_jquery\t122755\n族\t122756\nrtt\t122757\n石猴\t122758\n线雕隆鼻\t122759\n1.1GB\t122760\n小条\t122761\ndb11\t122762\n勘查\t122763\n佐佐木明\t122764\njixu\t122765\n彩丝\t122766\n各个击破\t122767\nNectar\t122768\n亥杂诗\t122769\n熊仔\t122770\n_葛\t122771\nSliding\t122772\n附加筋\t122773\n墙砖\t122774\n湘潭市委\t122775\n零散\t122776\n工单\t122777\n雪盈证券\t122778\nhmt\t122779\n三通球阀\t122780\n天龙八部天龙\t122781\n杭州电子科技大学\t122782\n42期\t122783\n欣喜若狂\t122784\n老黄历网\t122785\n海员招聘网\t122786\n海螺沟景区\t122787\n六样\t122788\n1.13.0\t122789\n张拉力\t122790\n承租人\t122791\n副交感神经\t122792\nimagery\t122793\n南冠\t122794\n可焊性\t122795\n孙悟\t122796\n逾期还款\t122797\n2018.4.10\t122798\n音文\t122799\nNicole\t122800\n肠套\t122801\n清华大
学艺术博物馆\t122802\n打底衫\t122803\n胸腺\t122804\nsax\t122805\n大洲村\t122806\n深圳建设\t122807\n男尸\t122808\nJana\t122809\n房颤\t122810\n好女人\t122811\n记下\t122812\n包边带\t122813\n推出\t122814\n三森铃子\t122815\n一个周\t122816\n湿漉漉\t122817\n广搜网\t122818\n吹皱\t122819\n石湖\t122820\n勤能补拙\t122821\n灵玲国际马戏城\t122822\n淘宝美工\t122823\n汪明\t122824\nswinger\t122825\n佛山酒店\t122826\n交强险保单\t122827\nApple苹果\t122828\n破烂熊\t122829\n30只\t122830\nmanifest\t122831\nslow\t122832\n福建省国家税务局\t122833\n永磁铁\t122834\n倚靠\t122835\n光谱\t122836\n过年\t122837\n龙音阁\t122838\n探求\t122839\n中华H530\t122840\n周俊杰\t122841\n有价证券\t122842\n正月初六\t122843\n※\t122844\n玉米浆\t122845\n党委会的工作方法\t122846\n千万年\t122847\n力天宫\t122848\n考勤卡\t122849\n大陸\t122850\n可再分散乳胶粉\t122851\n刘老根\t122852\n迫近\t122853\n外溢\t122854\n柏乡县\t122855\n速干\t122856\n如云\t122857\nlostARK\t122858\n性感照\t122859\n51贴图网\t122860\nGB50016\t122861\ndessert\t122862\nturkey\t122863\nFAC\t122864\nawa\t122865\n邦元\t122866\n商贸公司\t122867\n找房网\t122868\n横溪街道\t122869\n男男女女\t122870\n全国五四红旗团委\t122871\n宏邦\t122872\n北分\t122873\n新火车站\t122874\n沸水\t122875\nifty\t122876\nYotaPhone\t122877\n韩东旭\t122878\n椭圆齿轮流量计\t122879\nGsonFormat\t122880\n香港八达通\t122881\n师讯网\t122882\n形容\t122883\n站号\t122884\n单宝\t122885\nflashlight\t122886\ntelegraph\t122887\nsmtown\t122888\nThai\t122889\n画鱼\t122890\n神勇\t122891\nハマ\t122892\n摩托车\t122893\n亚克西\t122894\n烦请\t122895\nSONY\t122896\n玉麦乡\t122897\n名家论市\t122898\nword|doc\t122899\n欢乐饭米粒儿\t122900\n4166\t122901\n伦理学\t122902\n出包女王\t122903\n反击式\t122904\n奶棒\t122905\n民政部职业技能鉴定指导中心\t122906\n大棒棒\t122907\n今天下午\t122908\n不慌\t122909\n深圳农商行\t122910\ncreatefile\t122911\n0.88\t122912\n早安心语\t122913\n深沪\t122914\n175度\t122915\n长春电视台\t122916\n趣玩乐\t122917\n警用车\t122918\niron\t122919\n超高级\t122920\n宋代\t122921\n光芒四射\t122922\n集贤\t122923\n白杨树\t122924\n纷扰\t122925\nbloody\t122926\n锤锤\t122927\n小框\t122928\n百度云服务器\t122929\n天津王顶堤\t122930\n水调歌头明月几时\t122931\n超梦\t122932\nElegant\t122933\nINE\t122934\n经销商地\t122935\n清分机\t122936\nkato\t122937\n感知器\t122938\n迪通\t122939\n眼视\t122940\nN4S\t122941\n纽伦堡大学\t122942\n乱论\t122943\n腾讯手游助手\t122944\ned2k\t122
945\niptables防火墙\t122946\n妥运\t122947\nDES\t122948\n扬大附中\t122949\n监察委\t122950\n纹身纹\t122951\n摸尸\t122952\n力帆汽车\t122953\n支持度\t122954\n中国化工设备网\t122955\n80家\t122956\n案卷\t122957\n条腿\t122958\n太空救援\t122959\n中国民歌大会\t122960\nIFLA\t122961\n194\t122962\nChompoo\t122963\n你是我的情劫\t122964\n瑞蓝玻尿酸\t122965\n官鬼\t122966\n武庚纪\t122967\n机器人网\t122968\nsoliwork\t122969\n美的\t122970\n我的1997\t122971\n270万\t122972\n辣椒籽\t122973\n支持性\t122974\nmustard\t122975\n上汽通用汽车有限公司\t122976\n对此\t122977\n5个半月\t122978\n安赛龙\t122979\n春日野穹\t122980\n电子信息工程专业\t122981\ncanberra\t122982\n10.08\t122983\n形象思维\t122984\n透真\t122985\nValse\t122986\n中华苏维埃共和国临时中央政府\t122987\n初云\t122988\n可图\t122989\nanjular\t122990\n八零年\t122991\nequation\t122992\n小梁\t122993\n三棱锥\t122994\n灵石公园\t122995\n华东医药股份有限公司\t122996\nNMB48\t122997\n小次郎\t122998\n赵村\t122999\n山东电视台农科频道\t123000\n田艺苗\t123001\nFerragni\t123002\n肥皂水\t123003\n影骑士\t123004\nbryan\t123005\n石溪大学\t123006\n20回\t123007\n相期\t123008\n退行性\t123009\n#2\t123010\n音容笑貌\t123011\n第八\t123012\n国际集团\t123013\n如图示\t123014\n3000m\t123015\n嗅觉\t123016\nTalkOP\t123017\n不为所动\t123018\n怜子\t123019\n欧美市场\t123020\n克里米亚\t123021\n公学\t123022\n北师大珠海分校\t123023\n健民\t123024\n山东经济学院\t123025\n华发股份\t123026\n实战篇\t123027\n2.18.4\t123028\n苦辣\t123029\n崛起\t123030\n兰亭盛荟\t123031\nautocad2004\t123032\n8.2.1\t123033\n凿壁偷光\t123034\n几十岁\t123035\n快递\t123036\nladyboy\t123037\nHeidi\t123038\n2018年三月\t123039\n造型\t123040\n两三次\t123041\npm2\t123042\n赭砚\t123043\n舰名\t123044\n楼宇\t123045\n端坐\t123046\n37号\t123047\n长衫\t123048\nproteus8\t123049\n筋斗云\t123050\n朝代表\t123051\nalbert\t123052\n影藏\t123053\n呼吸性碱中毒\t123054\n经验证\t123055\n18a\t123056\n七万\t123057\nWAV/\t123058\n碰上\t123059\n2017.1\t123060\n万顺\t123061\n_九游网\t123062\n横竖屏切换\t123063\n铜管乐\t123064\nZCOOL\t123065\n760P\t123066\n商界\t123067\nyou&#160\t123068\n上原结衣\t123069\n76676\t123070\n降血脂\t123071\n40mhz\t123072\n录音版\t123073\n南开16春\t123074\nCrossOver\t123075\n酷基金网\t123076\n必答题\t123077\n自助餐厅\t123078\n书匠\t123079\n长媳\t123080\n香油机\t123081\n白醋\t123082\n魔法少女俺\t123083\n小兔子\t123084\n逼退\t123085\n龙睛\t123086\n北京交通
管理局\t123087\n陈岩石\t123088\n奥氏体\t123089\nZookeeper注册中心\t123090\n胡建人\t123091\n约定俗成\t123092\n加破\t123093\n手写稿\t123094\n垂直运输\t123095\n甜菊糖\t123096\n1.0.1e\t123097\n五菱荣光\t123098\n京津沪\t123099\n船戏\t123100\n北京大学地球与空间科学学院\t123101\n免用\t123102\n笑红尘\t123103\nr100\t123104\n四川省卫生厅\t123105\n武夷山景区\t123106\n驱动精灵\t123107\nkurt\t123108\n时代悦城\t123109\n交接单\t123110\n好汉歌\t123111\n更嗨\t123112\nr430\t123113\nw500\t123114\n江海晚报网\t123115\nLosses\t123116\n对号入座\t123117\n塘尾村\t123118\n罗技g502\t123119\n举头\t123120\n九叶网\t123121\n发际\t123122\n妈宝\t123123\n紫峰大厦\t123124\n磁盘盘符\t123125\n天津大学软件学院\t123126\n雨淋湿\t123127\n众男\t123128\n小灰灰\t123129\n柯迪亚克论坛论坛\t123130\n织姬\t123131\n8684\t123132\n南京江宁区\t123133\n新乡学院\t123134\n德意志第三帝国\t123135\n急性扁桃体炎\t123136\n第152名\t123137\n2众\t123138\n2870\t123139\n勋爵\t123140\n国家审计署\t123141\n外八字\t123142\n茶泡饭\t123143\n滚齿机\t123144\n兰德华\t123145\n方正科技\t123146\nmounts\t123147\n上海滩花园\t123148\n蒂娜\t123149\n越走越远\t123150\n最近7天\t123151\n习气\t123152\neosio\t123153\n视力表\t123154\n新娱\t123155\n荷包岛\t123156\nVertx\t123157\n丰收日\t123158\nxenon\t123159\n拉达曼迪斯\t123160\n兄弟文\t123161\n查理曼\t123162\n国研中心\t123163\n创业投资企业\t123164\n席子\t123165\n公共租赁房信息网\t123166\n亚华科技有限公司\t123167\n38.2\t123168\n墨化\t123169\n米粒状\t123170\n三尖瓣\t123171\n薪日\t123172\nsauna\t123173\n海洋法公约\t123174\nrabbitMQ\t123175\n苏萌\t123176\n创意\t123177\n最后一滴血\t123178\n调温\t123179\nitem\t123180\n艾肯\t123181\n圣魂\t123182\n灰姑娘与四骑士\t123183\n桃洪镇\t123184\n6.52\t123185\nCougar\t123186\n泌尿道感染\t123187\n浦东新区幼儿园\t123188\nchsi\t123189\nビ\t123190\n全副\t123191\n雷恩加尔\t123192\n上海市卫计委\t123193\n竹根\t123194\n川主寺\t123195\n1558\t123196\nnadech\t123197\n厚膜\t123198\n被攻\t123199\n集注\t123200\nfast-rcnn\t123201\n57种\t123202\nJame\t123203\n莲蓬\t123204\n仙途\t123205\n急就章\t123206\n锁口\t123207\n粗重\t123208\n终极三国\t123209\n辽宁省安全生产监督管理局\t123210\n2903个\t123211\n金山铁路\t123212\n第五只\t123213\n集流体\t123214\n猫眼\t123215\n中山大学网络与信息技术中心\t123216\n47.dll\t123217\nntpd\t123218\nSkyUI\t123219\nWTI\t123220\n台海战争\t123221\n爬虾\t123222\n镇江市住房公积金管理中心\t123223\n涡旋式\t123224\n安分\t123225\n中南大学土木工程学院\t123226\nx+4\t123227\nberg\t1232
28\n谈判官\t123229\n元江县\t123230\n珐琅漆\t123231\n邻苯二甲酸酯\t123232\n晓庄学院\t123233\n10kg\t123234\n铜川市人民政府\t123235\n4.4亿\t123236\nDrums\t123237\nvideosgrati\t123238\n哈尔\t123239\n劳动报\t123240\n飞行器\t123241\n刮过\t123242\n科雷嘉论坛\t123243\n一百万条\t123244\ndaphne\t123245\n大银幕\t123246\nimb\t123247\n即插\t123248\n老北京炸酱面\t123249\n跨界喜剧王\t123250\n史前史\t123251\n别说话\t123252\n程语\t123253\n红警之家手游网\t123254\n乱序\t123255\n新荣记\t123256\n漩\t123257\npromoter\t123258\n罪恶少女\t123259\n彩排\t123260\n丰田霸道4000\t123261\n天王星\t123262\n莱茵小镇\t123263\npyrhon\t123264\n首因效应\t123265\n郑仁\t123266\n老程\t123267\nlovely\t123268\nmagus\t123269\n戎美\t123270\n长恨\t123271\n李春姬\t123272\n蔡伟\t123273\n中国人民政治协商会议共同纲领\t123274\n婚战\t123275\n克林\t123276\npacket\t123277\n统编\t123278\nOpportunity\t123279\n可耐\t123280\ncarhartt\t123281\n央财\t123282\nKBG管\t123283\ncasperjs\t123284\n童军手册之僵尸启示录\t123285\nPetri\t123286\ntsv\t123287\n全面深化改革领导小组\t123288\n恶堕\t123289\nascii\t123290\n魔术裤\t123291\n蟹子\t123292\n奥迪A8L\t123293\n中叶\t123294\n谈起\t123295\n李堡\t123296\n我多喜欢你\t123297\n巳月\t123298\n双星之阴阳师\t123299\n周瑾\t123300\n阿琳\t123301\nstartactivity\t123302\n滔滔\t123303\n五万美元\t123304\n鹤立\t123305\nSTACK\t123306\nqcombobox\t123307\nYII2.0\t123308\nOPERA\t123309\n九剑\t123310\n菠萝饭\t123311\n唐立淇\t123312\n狗蛋\t123313\n风淋\t123314\nReceive\t123315\n幽门螺杆菌\t123316\n冷少\t123317\n吴江路\t123318\n幽兰操\t123319\n浅醉\t123320\n国家安全监管总局\t123321\n厢货\t123322\n审计处\t123323\n指虎\t123324\nBANANA\t123325\n1905\t123326\n址\t123327\n精英化\t123328\n爆料\t123329\n支管\t123330\n补报\t123331\n吴琳\t123332\n黄公路\t123333\n江西工程学院\t123334\n面巾纸\t123335\n中国轨道交通网\t123336\n隐名\t123337\n独闯\t123338\n易明医药\t123339\n读书文摘\t123340\n可可礼物网\t123341\ngVim\t123342\n20160824\t123343\n梅艳芳\t123344\n广东人\t123345\n天花灯\t123346\n万致\t123347\n可塑性记忆\t123348\n战锤40k战争黎明2\t123349\namu\t123350\n二级建造师考\t123351\n韩英\t123352\n红血\t123353\nabout\t123354\n企查查\t123355\n暗地里\t123356\n叶琳\t123357\n建平远翔\t123358\n台州市人民政府\t123359\n交情\t123360\n雾里\t123361\n航班\t123362\nwhoops\t123363\n日本电视台\t123364\nc端\t123365\n铜包钢\t123366\n陈店\t123367\n产床\t123368\nYouPorn\t123369\n20P\t123370\ncorel
draw\t123371\nmanual\t123372\n虎符\t123373\n半桥\t123374\n铁马套\t123375\n漳州片仔癀药业股份有限公司\t123376\nhelo\t123377\nHiring\t123378\n5分钟左右\t123379\nlion\t123380\n星际争霸II\t123381\n吴敏霞\t123382\n美中\t123383\n酷米网www.kumi.cn\t123384\nztree树\t123385\n大冶一中\t123386\nPerspectives\t123387\n中华民国时期\t123388\nsect\t123389\nconsolidate\t123390\n远程教育\t123391\nuipath\t123392\n特码\t123393\n0508\t123394\n京唐城际铁路\t123395\n炎王龙\t123396\n润欣科技\t123397\nplc300\t123398\nfnd\t123399\ninfusion\t123400\nwyf\t123401\n杭高\t123402\n脱去\t123403\nCK电影网\t123404\nATID\t123405\n俊杰\t123406\ndataguard\t123407\n中戏\t123408\n鲤鱼跃龙门\t123409\nGOME\t123410\nInstitution\t123411\n黑龙江科技信息\t123412\n免征车辆购置税\t123413\n第6话\t123414\n植检\t123415\nSD卡\t123416\n高卡\t123417\nTBM\t123418\n轩辕恋舞ol\t123419\ninbound\t123420\nSalvatore\t123421\n小池里奈\t123422\n吴楠\t123423\n汪伪政府\t123424\n攻陷\t123425\n微信营销系统\t123426\n放牛郎\t123427\ncorrective\t123428\n爱思\t123429\nio域名\t123430\n百度云BD迅雷\t123431\n奥德\t123432\nPICOOC\t123433\n邵伟\t123434\n洁希亚\t123435\nMorketing\t123436\nH5游戏\t123437\ngump\t123438\n轮岗\t123439\n舞成名\t123440\n法正\t123441\n不足以\t123442\n无穷级数\t123443\n铜色\t123444\n歌华有线\t123445\n下半夜\t123446\ndnf剑魂吧\t123447\n上海市张江高科技园区管理委员会\t123448\n打算\t123449\n黎辉\t123450\n受孕率\t123451\n建政路\t123452\n贾氏\t123453\n花萼\t123454\ninfo\t123455\n刘彧\t123456\n桂林晚报社\t123457\n大床\t123458\n防洪\t123459\n南京同仁医院\t123460\n水弹\t123461\n国手\t123462\n南京市江宁区人民政府\t123463\nPageAdmin\t123464\nRmx\t123465\n钱华\t123466\n水浒全传\t123467\n抽身\t123468\n湖北日报网\t123469\nGAIA\t123470\n大朝\t123471\nfailures\t123472\n地震带\t123473\nantenna\t123474\n蒼\t123475\n天钢\t123476\n几能\t123477\nYEAR\t123478\n药食\t123479\n催款\t123480\n精卫\t123481\n爆屏\t123482\n人狼\t123483\n艾跃进\t123484\n新都市\t123485\nServer2008\t123486\n固原市\t123487\n背徳\t123488\n举目\t123489\n一立方厘米\t123490\n磁力播放器\t123491\n上展\t123492\n变革者\t123493\n跨级\t123494\n石台县人民政府\t123495\n下册期\t123496\n莲莲\t123497\nDel\t123498\n枫林晚\t123499\n智慧城市博览会\t123500\n万达地产\t123501\nCNAPS_\t123502\n交易保证金\t123503\n盟重\t123504\nDXO\t123505\n南通大学\t123506\n暮光资讯网\t123507\nCCTV-2\t123508\n鸟德\t123509\n加固\t123510\n
前赴后继\t123511\n华夏时报网\t123512\n理优\t123513\n王者之路\t123514\n人社部\t123515\n采飞扬\t123516\n国奥\t123517\nsurfacebook\t123518\n格雷福斯\t123519\n优众网\t123520\n跨端\t123521\n47名\t123522\n生物化学\t123523\n霸王2\t123524\nNature\t123525\nStainless\t123526\n冰心\t123527\n茶粉\t123528\n芭芭拉布什\t123529\nK+\t123530\n夹纸\t123531\nV5R20\t123532\n离境\t123533\napprentice\t123534\n白狼栈\t123535\n毛竹\t123536\n莞\t123537\n自习\t123538\n水胶\t123539\n2018-04-21\t123540\n合肥微尺度物质科学国家实验室\t123541\n金珍熙\t123542\n一柱\t123543\n中外合资企业\t123544\n北京的金山上\t123545\n中铁工业\t123546\ncici\t123547\nSX4\t123548\n吉林省住房和城乡建设厅\t123549\niT\t123550\nd750\t123551\n包夜\t123552\n先息后本\t123553\ncarrera\t123554\n30立方米\t123555\n绞线\t123556\n南孔\t123557\n梦痕\t123558\n光华路1号\t123559\n文广局\t123560\n华夏卡\t123561\n华为交换机路由器\t123562\n一底\t123563\nキミ\t123564\nACCESS\t123565\n怀疑\t123566\n素菜馆\t123567\n手风\t123568\nFranklin\t123569\n换色\t123570\n伊州区\t123571\n明奇\t123572\n湖北恩施\t123573\ncaoporn_超碰在线视频\t123574\n搭拆\t123575\nhfile\t123576\n群体性\t123577\n500多年\t123578\n标准日本语\t123579\n萌宠大爆炸\t123580\nMEGALO\t123581\n晋江文学网\t123582\n蹈\t123583\n神崎兰子\t123584\n注塑机\t123585\n齐天乐\t123586\n37000\t123587\n3xl\t123588\n一刻\t123589\n杨学文\t123590\n长沙国金中心\t123591\n继承人\t123592\n永恒之路\t123593\nshizi\t123594\n清涧\t123595\n悦己网\t123596\n大卷\t123597\n胃康灵胶囊\t123598\nTeenager\t123599\naopalliance\t123600\n醋酸乙烯\t123601\ndongying\t123602\n飞沙走石\t123603\n保险\t123604\n驾控\t123605\ndivi\t123606\nbreadcrumb\t123607\n黄金叶\t123608\n肥城市\t123609\n金海国际\t123610\n青雪\t123611\n隆林县\t123612\n医疗器械经营企业许可证\t123613\n2014下半年\t123614\n鲁宾逊\t123615\nMOMA\t123616\n法律规定\t123617\n修改案\t123618\n珠海全志科技股份有限公司\t123619\n西塘景区\t123620\n多值\t123621\n万界九鼎记\t123622\n景丽\t123623\n硫化钼\t123624\n侦测\t123625\n委婉\t123626\n一致性\t123627\n脱力\t123628\n潍坊市\t123629\n蜂花\t123630\n20160225\t123631\n配改\t123632\n搬运费\t123633\n单田芳评书\t123634\n白天鹅\t123635\n粉化\t123636\n16299.371\t123637\n3月31\t123638\n午睡枕\t123639\n丹江\t123640\n韩猛\t123641\n康迪\t123642\n美呗\t123643\nProp\t123644\n台媒\t123645\n聚碳酸酯板\t123646\n记\t123647\n600848\t123648\n勇者斗恶龙怪兽篇joker2\t123649\n齐鲁股权交易中心\t123650\n老太\t123651\
n围海\t123652\n翼鸟\t123653\n玻尔\t123654\nTrine\t123655\n黄瑞\t123656\n逍遥网\t123657\n6号\t123658\n赢客\t123659\n呼和\t123660\n曾头市\t123661\n秘笈\t123662\nCCTV4\t123663\nLadyboys\t123664\n联合国日内瓦总部\t123665\n家长们\t123666\n今非昔比\t123667\npingjia\t123668\nqwer\t123669\n小涵\t123670\n飞椅\t123671\nsungennian\t123672\n松桃\t123673\n赵青青\t123674\npromote\t123675\n48伏\t123676\nCaroline\t123677\n76路\t123678\n石家庄北站\t123679\n密着\t123680\n飞起来\t123681\n不还\t123682\n君令\t123683\n十九集\t123684\n园田海未\t123685\n李开\t123686\n5255\t123687\n农装\t123688\ncaigou\t123689\nHTTPS服务器\t123690\nA7R2\t123691\n临界生\t123692\nonset\t123693\n海清\t123694\nkoss\t123695\n吕萍\t123696\nrbg\t123697\nSHIFT键\t123698\n弦波\t123699\n45升\t123700\n取象\t123701\n乐2\t123702\n浏阳蒸菜\t123703\naddevent\t123704\n许斌龙\t123705\n电影米花之味\t123706\n皓齿\t123707\nvivoY85\t123708\n采购经理指数\t123709\n窈窕\t123710\n来跑\t123711\n第一百一十四章\t123712\n双世宠妃\t123713\nshuai\t123714\n体验式营销\t123715\n起点\t123716\nGuaranteed\t123717\n礼宾部\t123718\nCyanogenMod\t123719\nBoi\t123720\n中华人民共和国消费者权益保护法\t123721\n化隆县\t123722\nPYP\t123723\n天宝伏妖录\t123724\n赖昌星\t123725\n广州工程技术职业学院\t123726\n威尔逊\t123727\n加勒比地区\t123728\n湿敏\t123729\n郑亨敦\t123730\n铁狮门\t123731\nERX5\t123732\n12门\t123733\n医享网\t123734\nexcel2010数据透视表\t123735\n欢快\t123736\n煤油\t123737\n飞蚁\t123738\niu\t123739\n变式\t123740\ngoagent\t123741\n15000吨\t123742\n真三国无双ol\t123743\n模拟题\t123744\nselecting\t123745\nkcl\t123746\n解决问题的策略\t123747\n上海建材\t123748\nHammond\t123749\n灰模\t123750\n志林\t123751\n老来难\t123752\n而亡\t123753\n图论\t123754\n大礼堂\t123755\nwiener\t123756\nWorks2\t123757\n张继青\t123758\nSSTap\t123759\n张家界日报\t123760\n现代金控\t123761\n满V版\t123762\n风油精\t123763\n昆仑山脉\t123764\n健身男\t123765\n福建农业职业技术学院\t123766\n赵航\t123767\n丝头\t123768\n羊毛线\t123769\n扎哈\t123770\n银美孚\t123771\n尤\t123772\n1.6THP\t123773\n杜绝\t123774\n广汽传祺gs4\t123775\n新产品\t123776\nConcert\t123777\n115路\t123778\n角子\t123779\n变线稿\t123780\nR1D\t123781\n一江春水向东流\t123782\n血杀\t123783\n橘生淮南\t123784\n妖臣\t123785\n淮州新城\t123786\n凿毛机\t123787\n番禺石楼\t123788\nberry\t123789\n歪唱\t123790\n瑞安航空\t123791\nMKVToolnix\t123792\n养护员\t12
3793\n662\t123794\nteen\t123795\n岳麓书院\t123796\n作响\t123797\n幕迅\t123798\nanime\t123799\n大连医大二院\t123800\n51度\t123801\n共产党人\t123802\n呼伦贝尔学院\t123803\nActiveRecord\t123804\n睡眠片\t123805\n769\t123806\n一连\t123807\n生日花\t123808\n鳥\t123809\n英语自学网\t123810\n彻骨\t123811\nt15\t123812\n韦东山\t123813\n号号\t123814\n尿分叉\t123815\n铜模\t123816\n5x5\t123817\ndnf剑宗吧\t123818\n3m胶\t123819\n小丹\t123820\n上海迪士尼玩\t123821\nmold\t123822\n温州火车站\t123823\n维生素B族片\t123824\n秦楚论坛\t123825\n生日\t123826\nInvolved\t123827\n南京江北新区\t123828\n陈乐强\t123829\n口才学\t123830\n击败\t123831\n富力半岛花园\t123832\nGreatAnt\t123833\npuyangsky\t123834\nimator\t123835\n不演\t123836\nscolia\t123837\n安钛克\t123838\n投石车\t123839\nUIBE\t123840\n首都儿科研究所附属儿童医院\t123841\n淬火液\t123842\nproto3\t123843\n研招办\t123844\n纸器\t123845\n大智慧\t123846\n20160325\t123847\nバル\t123848\n放射诊疗许可证\t123849\n华泰证券网\t123850\n幕府将军2:全面战争\t123851\nopennms\t123852\n工作机\t123853\nqq群吧\t123854\ncurrent\t123855\n邯郸东站\t123856\n诗配画\t123857\n中瓷网\t123858\nWinKawaks\t123859\n抗旱\t123860\nappscan\t123861\n铲屎\t123862\n噩梦级\t123863\n小场\t123864\n法国航空\t123865\n吕世浩\t123866\n追求者\t123867\n矬穷\t123868\n错构瘤\t123869\n青玉案\t123870\n美加\t123871\n50GB\t123872\n表源\t123873\nzhaopin\t123874\nEFGH\t123875\nOliver\t123876\n国家安全厅\t123877\n1898\t123878\n荐\t123879\n采摘节\t123880\n黄那敏\t123881\n进谏\t123882\n稠密\t123883\nGiliSoft\t123884\n国立西南联合大学\t123885\n沐浴液\t123886\n得名\t123887\n罗明\t123888\n奥尔龙\t123889\n白细胞\t123890\n付账\t123891\n张浦\t123892\n挖补\t123893\n一遍一遍\t123894\n存贷款基准利率表\t123895\n洽洽食品\t123896\n安进\t123897\n中科大洋\t123898\n螺号\t123899\n曹杨路\t123900\n应收\t123901\n拉应力\t123902\nPortfolio\t123903\nwdt\t123904\n喜爱夜蒲\t123905\n骆驼奶\t123906\n新能源有限公司\t123907\n义乌城\t123908\n重庆市人大\t123909\n2017-10\t123910\n治疗机\t123911\n利雅得\t123912\n2018-04-10\t123913\n免费观\t123914\n工资额\t123915\n纤云弄巧\t123916\n南京条约\t123917\n郭晓敏\t123918\n石排\t123919\n皱皮\t123920\nInspiron灵越14\t123921\n无缝管\t123922\n蝶之灵\t123923\n艄公\t123924\n巨划算\t123925\n孤云\t123926\n奥特迅\t123927\n沈家本\t123928\n魂印\t123929\n露出\t123930\n虎妞\t123931\n新疆旅行网\t123932\n星火旅游网\t123933\n整群\t123934\n190R\t123935\ndx
omark\t123936\n沃家\t123937\n玛琪娜\t123938\n12.16\t123939\n第五十三\t123940\n重九\t123941\n岁月如歌\t123942\n中文模拟器\t123943\n苏尚儿\t123944\n马里昂\t123945\ncom.cn\t123946\n浙江建设职业技术学院\t123947\n北京密云\t123948\n河源\t123949\n三飞\t123950\nsubtle\t123951\neth1\t123952\n2100万\t123953\n宣泄\t123954\n妈妈咪呀\t123955\n35.6\t123956\n哈尔滨酒店\t123957\n银鱼\t123958\nkyo\t123959\n192.168.255.255\t123960\n发行\t123961\n香樟\t123962\n升旗稿\t123963\n运河西路\t123964\n路途\t123965\n光敏剂\t123966\n0.3g\t123967\n朔雪\t123968\n川音\t123969\n马丁·雅克\t123970\n十五六岁\t123971\nBj\t123972\n200台\t123973\n零售类\t123974\n冷雪\t123975\n鸿特精密\t123976\n觅得\t123977\n胃痉挛\t123978\ngtid\t123979\n张赫\t123980\n羊角蜜甜瓜\t123981\n进货\t123982\naforge\t123983\n卡佩拉\t123984\n行香子\t123985\n学棋\t123986\n曦月\t123987\nkiat\t123988\nClob\t123989\n兴利\t123990\n尤克里里谱_ukulele弹唱谱\t123991\n邓海清\t123992\n医检\t123993\n比目\t123994\n智慧版\t123995\n300年\t123996\n绝地求生刺激战场安卓\t123997\n单休\t123998\n梦百\t123999\ngroovy\t124000\n古丝绸之路\t124001\n谭小平\t124002\n茶凉\t124003\n甘雨\t124004\nshima\t124005\nEvo\t124006\n山师附小\t124007\n50万套\t124008\n建筑公司\t124009\n艾邦\t124010\n弓箭\t124011\n异军\t124012\n道客\t124013\n邀请框\t124014\n三号位\t124015\n南光高速\t124016\n虚数\t124017\nPeggy\t124018\n2006年\t124019\n0739\t124020\n8万\t124021\n挂断\t124022\n惊为天人\t124023\n管批\t124024\n韩智慧\t124025\n韩系\t124026\n纳贤\t124027\n老酸奶\t124028\n大传\t124029\n永城市\t124030\n陈奕天\t124031\n未接\t124032\n牙规\t124033\n建筑人\t124034\n拉取\t124035\n大夫人\t124036\n央视八套\t124037\n迦太基\t124038\n产险\t124039\n底包\t124040\nPetya\t124041\n盐堆\t124042\n黄岗村\t124043\n腹鳍\t124044\n音响效果\t124045\n颈龟\t124046\n清体\t124047\n明匠\t124048\n全球使命3\t124049\n双路\t124050\nProvence\t124051\n努比亚z9mini\t124052\n洛杉矶时报\t124053\nnmi\t124054\n财政专户\t124055\n汽车配件厂\t124056\n长一智\t124057\n誓死捍卫\t124058\n奕诚\t124059\n国家公\t124060\n产房\t124061\n孙博\t124062\npage=\t124063\n▓\t124064\n扎啤机\t124065\n20180420\t124066\n农居\t124067\n鱼市\t124068\nAndroid研究院\t124069\n鸡里奥\t124070\n妇科检查\t124071\n西乡县人民政府\t124072\n高车\t124073\njen\t124074\n伊东遥\t124075\nhau\t124076\n威廉·布莱克\t124077\n郴州市人大\t124078\n明年三月\t124079\n徐树铮\t124080\n齐丽\t124081\n荔城街\t124082\n菲亚特
\t124083\n远程控制\t124084\nTaobaoTS\t124085\n奶工\t124086\n1.51G\t124087\n聚房山\t124088\nxlwt\t124089\n埃克替尼\t124090\nCMCC\t124091\ntargeted\t124092\n学派\t124093\n补写\t124094\n老人公寓\t124095\n私建\t124096\n形容词大全网\t124097\n全模\t124098\n安家楼路55号\t124099\nFormation\t124100\n渝州\t124101\n乘机\t124102\n绝代长生诀\t124103\n拿不走\t124104\nquota\t124105\n幼圆\t124106\n千韩\t124107\n七首\t124108\ncjson\t124109\n弘善家园\t124110\nlinlin\t124111\n龙钞\t124112\n耻骨\t124113\n金将\t124114\n敌\t124115\n桑葚膏\t124116\n第几分钟\t124117\n笑天\t124118\n莫尔\t124119\n东旭光电\t124120\nzhonguo\t124121\n黄石经济技术开发区\t124122\n控股权\t124123\n珀金埃尔默\t124124\n5万个\t124125\n中国叉车网\t124126\nListen\t124127\n反线\t124128\n门头沟\t124129\n软组织挫伤\t124130\n宁波市市场监督管理局\t124131\n尊师重道\t124132\n同级\t124133\n绝缘片\t124134\nCruiser\t124135\n1Kg\t124136\n不饱和树脂\t124137\n阿尔塔\t124138\nmjextension\t124139\n闲林街道\t124140\n郭伯雄\t124141\n中国纺织出版社\t124142\n老故事\t124143\n咖喱块\t124144\n壬基酚聚氧乙烯醚\t124145\nwp8.1\t124146\n伤心\t124147\n康瑞保\t124148\nrpcbind\t124149\n水暖管\t124150\n清史\t124151\n永安市\t124152\n口会\t124153\n平衡板\t124154\n跳交谊舞\t124155\n夜问打权\t124156\n剑王将门嫡女\t124157\n徐敏\t124158\n狼青\t124159\n芹泽多摩雄\t124160\n魔界战记吧_\t124161\n好评语\t124162\n打断\t124163\n核算岗\t124164\n大都\t124165\n白妞\t124166\n恋如雨止\t124167\n艾罗伯特\t124168\n山东\t124169\n井底村\t124170\n热舞\t124171\n幫\t124172\ncipher\t124173\n8d\t124174\n姑父\t124175\nbai\t124176\n没有关\t124177\nwei-crm\t124178\n派尔\t124179\n赣州市政府\t124180\n环控\t124181\n676\t124182\n杨晓波\t124183\n中航国际\t124184\n主程序\t124185\nBuses\t124186\n台钓竿\t124187\n腾蛟\t124188\n陈列员\t124189\n绪言\t124190\n马杰\t124191\n利差\t124192\n索拉\t124193\n認\t124194\n铁饭\t124195\n别买\t124196\n北京协和医院东院\t124197\nム\t124198\n爱的世界\t124199\n白纹\t124200\n一碗米\t124201\ngzip\t124202\n叶二娘\t124203\n水刀切割机\t124204\n温莎城堡\t124205\n万款\t124206\n新源路\t124207\n红珊瑚\t124208\n多边主义\t124209\n十万\t124210\n白场\t124211\n遥遥无期\t124212\nparseFloat\t124213\n32k\t124214\nmiki西游\t124215\n冬月\t124216\n成文法\t124217\n礼学堂\t124218\n_机车网\t124219\n智行者\t124220\ndissipation\t124221\n淮安经济技术开发区\t124222\n架杆\t124223\n丁浪\t124224\n教计\t124225\nSEO公司\t124226\nfooter\t124227\n两委\t124
228\n安创\t124229\n己经\t124230\n亮颜\t124231\n乘\t124232\n月升王国\t124233\n香港广场\t124234\n10GbE\t124235\n高速版\t124236\n秦腔板胡\t124237\n康庄\t124238\nC7\t124239\nobs吧\t124240\n高昊\t124241\n咬碎\t124242\n罗山县人民政府\t124243\n云道\t124244\n资中\t124245\n版心\t124246\n考察表\t124247\n陈璇\t124248\n2017年2月\t124249\n战队赛\t124250\nkia\t124251\n七天七夜\t124252\n八排\t124253\n搂抱\t124254\n尼康D5600\t124255\n思诚\t124256\ncolgroup\t124257\n头型\t124258\n宜居村庄\t124259\n雅戈尔\t124260\n高压塔\t124261\n88平米\t124262\n持证\t124263\nhiuman\t124264\n有妖气\t124265\n沙西米\t124266\n小医仙\t124267\n知音卡\t124268\n软土\t124269\nphony\t124270\n湖北网络广播电视台\t124271\n好看\t124272\n女高男\t124273\npymol\t124274\n狗洞\t124275\n外汇交易\t124276\n一妻多夫制\t124277\n万达电影城\t124278\n滑滑梯\t124279\n合音\t124280\n淡绿色\t124281\n微众税银\t124282\n23期\t124283\n六指琴魔\t124284\n富者\t124285\n企业家\t124286\n样品房\t124287\nNOD\t124288\n一柄\t124289\nexoplayer\t124290\n南郭先生\t124291\n浮生若梦\t124292\n第六十九条\t124293\nloupan\t124294\n中国农村商业银行\t124295\n绵薄之力\t124296\n前功尽弃\t124297\n马旭川\t124298\n洞库\t124299\n卢卡\t124300\n浙江大学附属中学\t124301\n孟宪贵\t124302\n跑友们\t124303\n降龙伏虎\t124304\n花儿与少年\t124305\n杜\t124306\n泉州华光职业学院\t124307\n紫苏水\t124308\n球帽\t124309\n黑心棉\t124310\n过敏性荨麻疹\t124311\n2022\t124312\n矿床\t124313\n奇丽女性网\t124314\nv1.2.1\t124315\n韩师\t124316\n天拓四方\t124317\n异兽\t124318\n凌源市\t124319\n陈锐\t124320\n刑罚\t124321\n南悦花苑\t124322\n小梅\t124323\n机纸\t124324\n英汉双解\t124325\n显而易见\t124326\n国家AAA级旅游景区\t124327\n手指套\t124328\n管套\t124329\n胆子\t124330\n靠尺\t124331\n旭化成\t124332\n第二排\t124333\nIDC评述网idcps.com\t124334\n中国交通广播\t124335\n乐会\t124336\n扭脚\t124337\n难以理解\t124338\n百消丹\t124339\n长安欧尚图片大图_长安欧尚\t124340\n国土部\t124341\n导联\t124342\n酷我\t124343\n背飞\t124344\n梅利威瑟\t124345\nS/4HANA\t124346\n学作\t124347\n文静\t124348\n奥蕾莉亚\t124349\n梦幻龙族\t124350\n核金\t124351\n300PLC\t124352\n抒怀\t124353\n海克斯\t124354\n瞠目结舌\t124355\n相亲男\t124356\n杨家将传奇\t124357\nJAC\t124358\ntfoot\t124359\n千眼菩提\t124360\n_时时网\t124361\n新期\t124362\n领到\t124363\n逆时营救\t124364\nbab\t124365\n云心\t124366\n颐盛\t124367\n取来\t124368\ngoroutine\t124369\nPiwik\t124370\n姚家园\t124371\n归脾汤\t124372\nsubmitting\t124373\n韩晨阳\t
124374\nnVidia\t124375\n磁选\t124376\ntoi\t124377\n豚首\t124378\n朔\t124379\n成都北路\t124380\n同路人\t124381\n嫦娥\t124382\n无霜期\t124383\n电子科大成都学院\t124384\n合辑\t124385\n国际城\t124386\n出租电话\t124387\n曹彰\t124388\n逸龙剑\t124389\n比太钱包\t124390\n纹丝不动\t124391\ne1030\t124392\nMaui\t124393\ntoolbars\t124394\nVisio2010\t124395\n高伟\t124396\n长足\t124397\n恒温\t124398\ncompact\t124399\n白兔记\t124400\n总经理\t124401\n飞傲X5\t124402\n刷网\t124403\ndx3\t124404\n二胡曲\t124405\n综合评标法\t124406\n保时捷Macan\t124407\n徐闻\t124408\n暗夜猫娘\t124409\nBoot缓存\t124410\n阴角线\t124411\n蟠龙湖\t124412\n李总\t124413\nvivoy79\t124414\n丫丫\t124415\n星空套\t124416\n微声\t124417\n知心爱人\t124418\n204001\t124419\n发哥\t124420\n李洪绸\t124421\n同一屋檐下\t124422\nCould\t124423\nJX3PVE.COM\t124424\n用友网络\t124425\nSp\t124426\nFury\t124427\n淳厚\t124428\n4.7\t124429\n第2集\t124430\nRuntimeError\t124431\n720p.1080p\t124432\niMC\t124433\ne950\t124434\n三十分钟\t124435\n区人大常委会\t124436\nindicates\t124437\n李鸿章传\t124438\n动态路由\t124439\nv2.7\t124440\n叶声\t124441\n2011-2016年\t124442\nBad\t124443\n追逐者\t124444\n机器纪元\t124445\n义务教育法\t124446\n夏兰\t124447\n朗诗地产\t124448\n达芬奇调色吧\t124449\nWebImage\t124450\nwww.fkzww.com/book/16571/xml.aspx\t124451\nScript\t124452\nフェチマニア動画アダルトダウンロ\t124453\n吴谦\t124454\n软顶\t124455\n单层板\t124456\nElder\t124457\nBeef\t124458\n美国广播公司\t124459\n连环夺宝\t124460\n焙烧\t124461\ndelivery\t124462\n十一首\t124463\nQTreeWidget\t124464\n咸阳宫\t124465\n测听\t124466\n粉末活性炭\t124467\ngeneral_ci\t124468\n鲁村\t124469\n汉硬\t124470\n贴吧\t124471\n五城\t124472\n郑州经济技术开发区\t124473\n凯皇\t124474\nRhythmk\t124475\n铁血战狼\t124476\n少龙外传\t124477\n灌流器\t124478\n小战士\t124479\n点火开关\t124480\n淫魔\t124481\n龙里县政府网\t124482\n施公奇案\t124483\n成都成华\t124484\nfasd\t124485\n帮帮\t124486\n辛庄镇\t124487\nNutrilon\t124488\nTableRow\t124489\nrpo\t124490\n安暖\t124491\n新乡县\t124492\n过磷酸钙\t124493\n崔各庄\t124494\nCapstone\t124495\n逆境\t124496\n145斤\t124497\n5173.com\t124498\n车驾\t124499\n随\t124500\nLon\t124501\n角架\t124502\n弹弹岛2\t124503\n人名币\t124504\n委派\t124505\n注吹\t124506\n谢天谢地\t124507\n第三边\t124508\n建筑工程信息\t124509\n18B20\t124510\nUDEV\t124511\nUNNATURAL\t124
512\n转厂\t124513\n北京华为\t124514\nSleuth\t124515\n四川省成都市第七中学\t124516\n串码\t124517\n成都公交查询网\t124518\nAABC式\t124519\n反光镜\t124520\nBob\t124521\n小片\t124522\n0435\t124523\n阿湿波\t124524\n岛上\t124525\n糖苷\t124526\n路内\t124527\n裸分\t124528\n东郊小镇\t124529\n棋谱\t124530\n酷睿i9\t124531\n造地\t124532\n榨干\t124533\n抄经\t124534\n汤普森\t124535\n纸张\t124536\nFederer\t124537\n约克中央空调\t124538\n抗生\t124539\nPCauto\t124540\n毁人不倦\t124541\n新华都\t124542\n水处理\t124543\n那又怎样\t124544\nsodium\t124545\n德中\t124546\nPPTOK\t124547\n菱格\t124548\n方脸\t124549\nCourse\t124550\n货泉\t124551\n饮恨\t124552\nSVA\t124553\n头像\t124554\n84分\t124555\n图钉\t124556\n男生们\t124557\n内灯\t124558\n芥子园画传\t124559\n清洁版\t124560\nFinale\t124561\nAVR\t124562\n酷乐看剧情网\t124563\n清华简\t124564\nx=0\t124565\n有生之年\t124566\n速度与激情1\t124567\n高频\t124568\n三十八度\t124569\nNexus7\t124570\n如梦似幻\t124571\n高屋建瓴\t124572\n未决\t124573\n广州汽车集团乘用车有限公司\t124574\n2016年10月28日\t124575\n砚\t124576\nFPS\t124577\n工口\t124578\n命水\t124579\n汶川\t124580\n验收员\t124581\n莫干溪谷\t124582\nSignaling\t124583\n500只\t124584\n院路\t124585\n吕瑞英\t124586\n旅游照\t124587\n黑林\t124588\n化音\t124589\n矿点\t124590\n鸿特\t124591\n铺子\t124592\n书签栏\t124593\n凯迪拉克XTS论坛_汽车之家论坛\t124594\n好男孩\t124595\n弹道导弹\t124596\n茶界\t124597\n撤编\t124598\nzenm\t124599\n普大喜\t124600\n30S\t124601\n亥月\t124602\n伽马射线暴\t124603\n8pin\t124604\n以及\t124605\nlibc6\t124606\nLQ-730K\t124607\n张龙豪\t124608\n法制与社会\t124609\n楚留香武当\t124610\n阿芙\t124611\n世界知识产权组织\t124612\n氚云\t124613\n邹越\t124614\n720P版BD-RMVB/\t124615\n纳米膜\t124616\n失信者\t124617\n裴度\t124618\n方知\t124619\nhmm\t124620\n魏勇\t124621\n地圈\t124622\n中国日报网\t124623\nunit1\t124624\n玲珑湾\t124625\nGHO\t124626\n总产\t124627\nkiri\t124628\n四烯酸\t124629\n小忍\t124630\n四龙\t124631\n列朝诗集\t124632\n民通\t124633\n爱唯\t124634\n两_\t124635\n待人接物\t124636\n错视\t124637\n10进制转16进制\t124638\n火焰纹章圣魔之光石\t124639\n静压桩机\t124640\n中国科学院声学研究所\t124641\n雪女\t124642\n王艳玲\t124643\n造字\t124644\n后海\t124645\n酒瓶兰\t124646\n一牛网\t124647\n复临\t124648\n克妻\t124649\n竭尽\t124650\n龙骑兵\t124651\nxxxxxxxx\t124652\n口咬\t124653\nCurved\t124654\n重庆网络广播电视台\t124655\n范德贝伦\t124656\n装装\t124657\n总
结性\t124658\nssur\t124659\n2016.3.3\t124660\n林昭\t124661\n2.5.2\t124662\n三声\t124663\n舒客\t124664\n白鱼\t124665\n河北钢铁集团\t124666\nGOP\t124667\n林正疆\t124668\n屋顶着火\t124669\n驻马店驿城\t124670\n三四十\t124671\n决定因素\t124672\n老妹儿\t124673\n奸诈\t124674\n观光\t124675\n上版\t124676\n河南省旅游局\t124677\n味蕾\t124678\n市人民政府办公室\t124679\n燕歌行\t124680\n锦里古街\t124681\n师徒\t124682\n养肝\t124683\n无刺\t124684\n2017.05\t124685\nideacentre\t124686\n最低工资标准\t124687\n子儿\t124688\nmay\t124689\n有妖\t124690\n湖心亭看雪\t124691\n节后综合症\t124692\nlaure\t124693\n马鑫\t124694\ndiaoyong\t124695\n全球汽配网\t124696\n国网公司\t124697\n扣压\t124698\n责骂\t124699\n内院\t124700\nFFT算法\t124701\n留驻\t124702\n南京市博物馆\t124703\n淘客吧\t124704\n西安市第一医院\t124705\nASC\t124706\n倩女幽魂手游科举答题器\t124707\n非线性回归分析\t124708\nqq表情吧\t124709\n08款\t124710\n突发性\t124711\ncx-8\t124712\n食盒\t124713\ncommons-pool2\t124714\n男香\t124715\n金发科技股份有限公司\t124716\nΔ\t124717\nISSUE\t124718\n北京网\t124719\n废气\t124720\nthrows\t124721\n魔值\t124722\n生产件\t124723\n手太阴肺经\t124724\n仙剑奇侠传3\t124725\n洋基队\t124726\n20180225\t124727\n45元\t124728\n虚陵\t124729\n莲雾\t124730\n英国议会\t124731\nる\t124732\n狼星\t124733\n568\t124734\n俞\t124735\n九连者\t124736\nfansadox\t124737\n郑钧\t124738\n132a\t124739\n食品业\t124740\n青岚\t124741\nBOC\t124742\n群兴\t124743\nlightning\t124744\n白天使\t124745\n第五期\t124746\n石群\t124747\n抽脂手术\t124748\n美军\t124749\nGReeeeN\t124750\n打造世界\t124751\n这一点\t124752\n读后感_\t124753\nSNEAKER\t124754\n市党委办公室\t124755\n41周\t124756\n45座\t124757\nprogramme\t124758\n北宫国家森林公园\t124759\n宽沟\t124760\n教育经济与管理专业\t124761\n社工师\t124762\n方竹\t124763\n蜗轮减速机\t124764\n320\t124765\n冀州\t124766\n阿格\t124767\n雅阁沃尔沃\t124768\n理文\t124769\n牒\t124770\n酒精度\t124771\n沐水\t124772\n六一\t124773\n面试网\t124774\n棕树\t124775\n南粤\t124776\n跏趺\t124777\n白石镇\t124778\n整版\t124779\n8@200\t124780\n深圳市柔宇科技有限公司\t124781\n推一推\t124782\n病毒式\t124783\n有伴\t124784\n砌墙砖\t124785\n网兜\t124786\n衡水第一中学\t124787\n妃凌雪\t124788\n河豚\t124789\n地房\t124790\n华蒜豆\t124791\ndfi\t124792\n反激\t124793\n塑模\t124794\n地方站\t124795\n谭某\t124796\n康博\t124797\n智尚型\t124798\n横塘\t124799\nStartIsBack\t124800\n跳页\t124801\n景天科\t124802
\n舞池\t124803\n58旅游网\t124804\n郑小辉\t124805\n东北雪乡\t124806\n神奇蜘蛛侠\t124807\n钟玲\t124808\n科林电气\t124809\n艺莞儿\t124810\n橡胶促进剂\t124811\nACAD\t124812\n海豚村\t124813\n花岗镇\t124814\n101贴吧\t124815\n人选\t124816\n当你沉睡时\t124817\n热线电话号码\t124818\n迪杰斯特拉算法\t124819\n作梗\t124820\n斯塔尔\t124821\n邮政编码-火车网\t124822\nttf\t124823\n硅微粉\t124824\n蛋糕油\t124825\n束带机\t124826\ncatalogue\t124827\n吕祖谦\t124828\n响声\t124829\n%5\t124830\n龙讯财经\t124831\n平行结转分步法\t124832\n蓬布\t124833\n洋山四期\t124834\nMrs\t124835\n爱的故事上集\t124836\n杜先生\t124837\n济柴\t124838\nflex-direction\t124839\n晷\t124840\n红豆生民国\t124841\nAD13\t124842\n大论\t124843\n讯宝\t124844\nWSUS服务器\t124845\n黑羽快斗\t124846\nflyback\t124847\n柔韧型\t124848\n网价\t124849\n库存股\t124850\ntagging\t124851\nmongoose\t124852\n名嘴\t124853\n宙斯\t124854\n周璇\t124855\n心美\t124856\nGavinJun\t124857\n横店圆明新园\t124858\nmachi\t124859\n求解释\t124860\nctrlp\t124861\n碧桂园十里外滩\t124862\n金港\t124863\n云知梦\t124864\n单向阀\t124865\n百度文科\t124866\n申能集团\t124867\n真理篇\t124868\n德云\t124869\nDepartment\t124870\n容声\t124871\nping值\t124872\n影视曲\t124873\n县审计局\t124874\n抚\t124875\n超高压清洗机\t124876\n华为投资控股有限公司\t124877\n智能管理器\t124878\n阿阿\t124879\n智慧山\t124880\n黑白板\t124881\n参赞\t124882\nk2200\t124883\n20140630\t124884\n麦克林\t124885\n秦皇岛市公安局\t124886\n玉藻猫\t124887\nreducing\t124888\n桥头排骨\t124889\n艘船\t124890\n宁波大剧院\t124891\n小使者\t124892\n罗兴亚\t124893\n新藏公路\t124894\n地质体\t124895\ngore\t124896\n汉江路\t124897\n网易云音乐吧\t124898\n潘通色\t124899\n现分\t124900\n刘瑞琪\t124901\n0629\t124902\n小町\t124903\n村花\t124904\n冠福股份\t124905\n摹印\t124906\nG7\t124907\n阿里店\t124908\n白公馆\t124909\n虹吸\t124910\nFOTA\t124911\nzock\t124912\n争执\t124913\n秀恩\t124914\n蒜蓉辣椒酱\t124915\n200期\t124916\n七个小矮人\t124917\n袁一琦\t124918\nwishes\t124919\n源地\t124920\n金身\t124921\n百度网盘论坛\t124922\n7350k\t124923\n卵巢巧克力囊肿\t124924\n阴阳术\t124925\n重医\t124926\n服事\t124927\n争功\t124928\n银容\t124929\n软岩\t124930\n2136\t124931\n杨文\t124932\nsatellite\t124933\nThierry\t124934\ndover\t124935\n这个片\t124936\n大王饶命\t124937\n援交女\t124938\nVIDEOS\t124939\n中国整形美容协会\t124940\n东莞移动\t124941\nDerek\t124942\n专业课\t124943\n移动通信技术\t124944\n叶山\t1249
45\n客规\t124946\n烃基\t124947\n扬程\t124948\n许行\t124949\n冰清玉洁\t124950\n数理统计学\t124951\n牙周袋\t124952\n麻辣女兵\t124953\n金字塔理财网\t124954\n踏过\t124955\n45千克\t124956\n充足率\t124957\n印刷电路板\t124958\n华塑控股\t124959\n萌学园\t124960\n中国石油天然气集团有限公司\t124961\n手缝\t124962\n小伙伴们\t124963\n陈晓光\t124964\n22岁\t124965\n静县\t124966\nRogue\t124967\n爬过\t124968\nAcceleration\t124969\n尧坝古镇\t124970\n苦瓜\t124971\n光传感器\t124972\n咸鱼\t124973\n立邦涂料(中国)有限公司\t124974\n阿合奇县\t124975\nMxnet\t124976\n后代\t124977\nec20\t124978\n1986\t124979\n血怒\t124980\n2008r2\t124981\n转跳\t124982\n黄倩\t124983\nM226dw\t124984\nApple110\t124985\nfunciton\t124986\n孰劣\t124987\njidi\t124988\n故称\t124989\n篮板\t124990\n桥林\t124991\n民生银行信用卡\t124992\n几十个\t124993\nagora\t124994\n去黄\t124995\n爱如茉莉\t124996\n倾转\t124997\n许鹏\t124998\n滴落\t124999\n3.3%\t125000\n瀚唐\t125001\n第五十九条\t125002\n中港通\t125003\n乌发\t125004\n广东财经大学华商学院\t125005\n九块九包邮\t125006\n两小时后\t125007\n91自拍论坛\t125008\n焊接式\t125009\n一分钟左右\t125010\n圆满结\t125011\ngeology\t125012\nevp\t125013\n20年后\t125014\nMods\t125015\nNSC\t125016\n分道扬镳\t125017\n万仙山\t125018\n川上奈奈美\t125019\n等离子清洗机\t125020\n我的好朋友\t125021\n闪退\t125022\nMat类\t125023\nsetsockopt\t125024\n梁羽生\t125025\n定兴县\t125026\n50亿_\t125027\nzhi级\t125028\n精灵安卓\t125029\n好读\t125030\n北京大学图书馆\t125031\n濮阳市人民医院\t125032\n海淀分局\t125033\n魔芋粉\t125034\n刀剑神域2\t125035\n如意通\t125036\n集成材\t125037\n宝马z4\t125038\n相似点\t125039\n真空油炸机\t125040\n影音先锋资源_影音先锋电影_影音先锋\t125041\n38件\t125042\n媲美\t125043\ngremlin\t125044\n绥宁县政府\t125045\n陶瓷轴承\t125046\n中国公考论坛\t125047\nfos\t125048\n第32卷\t125049\n翰林院\t125050\n貂油\t125051\n澳门银河在线\t125052\n强严树\t125053\ncodecasts\t125054\ncomic\t125055\n刁\t125056\n呓语\t125057\n8071\t125058\n达菲\t125059\n22#\t125060\n陈泗旭\t125061\n卡泊三醇软膏\t125062\n锐炬\t125063\n金银业\t125064\n焦树德\t125065\nTeamNII\t125066\n_哥也\t125067\n钱人\t125068\n华夏视讯网\t125069\n除尘机\t125070\n挛缩\t125071\n小流\t125072\nexceeds\t125073\n天津建设工程信息网\t125074\nintruder\t125075\nTic\t125076\n101段\t125077\n打官司\t125078\n长丰\t125079\n复古舞\t125080\ninode\t125081\n外家\t125082\nbarbecue\t125083\n双段\t125084\n浦银\t125085\n金属型\t125086\nD50
\t125087\n国区\t125088\n后语\t125089\n曹光彪\t125090\n顺丰特惠\t125091\ntvoc\t125092\n佛寺\t125093\n黄巷\t125094\n甬建发\t125095\n食全食美\t125096\n淘金地\t125097\nwuxi\t125098\n氨酚\t125099\nWoman\t125100\n贵池路\t125101\n兔眼\t125102\n汆\t125103\n丁度·巴拉斯\t125104\n泼水门\t125105\nVECTOR\t125106\n引体向上\t125107\n城市之音\t125108\n江永县\t125109\n碟架\t125110\n10010\t125111\nles\t125112\n限级\t125113\n潮解\t125114\ni5-7200U\t125115\n不可以\t125116\ndurid\t125117\n一寸两寸\t125118\n玛奇朵\t125119\n宠女\t125120\n10000行\t125121\nx58\t125122\n签字人\t125123\n大悲寺\t125124\n大浦东\t125125\ns60v3\t125126\nRadiant\t125127\n程浩\t125128\n蔡奇\t125129\n少儿不宜\t125130\nMACBOOK\t125131\n辽东\t125132\n东洋\t125133\n线性插值\t125134\nsolidworks2018\t125135\n电流场\t125136\n泄水\t125137\n我的世界\t125138\n芝顿\t125139\nie60\t125140\n装饰罩\t125141\n高欣\t125142\nWrest\t125143\n元表\t125144\nHP1010\t125145\n沧月\t125146\n上漆\t125147\nsm总线控制器\t125148\n赠与\t125149\nString\t125150\n贝果\t125151\nRAM/全网通\t125152\n富少\t125153\n行舟\t125154\nwt%\t125155\nZombies\t125156\nmikihouse\t125157\n梧桐雨\t125158\n一个星期前\t125159\n岛风go\t125160\nPowerMill\t125161\n北京大学工学院\t125162\n犯罪类\t125163\n反黑使命2沉默\t125164\n拉丁舞\t125165\n内蒙古大学\t125166\n华能国际\t125167\n民族化\t125168\nDocin.com豆丁网\t125169\n常州电信\t125170\n2005级\t125171\n斯内克\t125172\n余泥\t125173\n杨颜\t125174\n家乐氏\t125175\n投标服务承诺书\t125176\npapi\t125177\n10BIT\t125178\n寡妇村\t125179\nSpring-security-oauth2\t125180\n柯利达\t125181\n苹果浏览器\t125182\n诺第留斯号\t125183\n长安城\t125184\nentp\t125185\n月半湾\t125186\n第一财经日报\t125187\n不用心\t125188\n222&\t125189\n大淘客联盟\t125190\n信任感\t125191\n真武汤\t125192\n猎魔\t125193\n兴梅\t125194\nabode\t125195\n税宣\t125196\n黄宗英\t125197\n仁川\t125198\n星咲优菜\t125199\n销售招聘网\t125200\n900名\t125201\n进出线\t125202\n伫立\t125203\n酷我音乐2016\t125204\nbiorad\t125205\n陆为民\t125206\npatrick\t125207\nr13\t125208\n君子之道\t125209\n三牌楼\t125210\nBCP\t125211\n新华网内蒙古频道\t125212\n爪爪\t125213\n广信贷\t125214\n衣钵\t125215\n仙人山\t125216\n红花梨\t125217\n20150605\t125218\nGB50974\t125219\n干露\t125220\n王镇\t125221\nHWND\t125222\n水虎\t125223\n聚氨酯胶\t125224\n奸臣\t125225\n割线\t125226\n爱成\t125227\n上世纪80年代\t125228\n帮衬\t125229\n
10.5.1\t125230\n很苦\t125231\nsoftmax\t125232\n1930年\t125233\n很难说\t125234\nInstitutional\t125235\n疯丢子\t125236\n新课程\t125237\n365式\t125238\n公安部交通管理科学研究所\t125239\n遇难\t125240\n隆庆祥\t125241\n同治皇帝\t125242\n叛道\t125243\n极限特工\t125244\n裂隙\t125245\n起點\t125246\n林溪\t125247\n百度hi\t125248\n凯文李\t125249\n窄幅\t125250\nUri\t125251\n作家们\t125252\n守望相助\t125253\n中关村在线\t125254\n中央部门\t125255\n华雄\t125256\ndeformation\t125257\n锣\t125258\n狗瘟\t125259\n黔西县\t125260\n超过3年\t125261\n水果邦\t125262\n精诚团结\t125263\n案\t125264\n3705\t125265\nconc\t125266\n缤纷活动_幼儿园新闻_无忧幼儿园\t125267\n武汉幼儿园\t125268\n中吴网\t125269\n正德\t125270\n天玄冰\t125271\n紧颜\t125272\n老话\t125273\n储机\t125274\n碱量\t125275\n南光集团\t125276\nRHEL7/CentOS7\t125277\n万科大家世纪之光\t125278\n2txt\t125279\n徐立毅\t125280\n小草青青\t125281\n宋教仁\t125282\n降魔传\t125283\nProgressDialog\t125284\n影视片\t125285\n心脏\t125286\n初六\t125287\n促销服\t125288\n赣州市人民政府\t125289\n毛利小五郎\t125290\n全新版大学英语\t125291\n小秀\t125292\ns80\t125293\n新河镇\t125294\n教育经济学\t125295\nに\t125296\nsontable\t125297\n机动战士敢达\t125298\n师烬\t125299\n蠢人\t125300\n水下\t125301\nVol.2\t125302\n比较法\t125303\n永夜偷香高手\t125304\nmss32.dll\t125305\n11g101\t125306\n\b\t125307\n湖里万达广场\t125308\n照护\t125309\n好便宜\t125310\n房款\t125311\n氟胶\t125312\nwhats\t125313\nLiquipedia\t125314\nEXACT函数\t125315\n人才交流中心\t125316\n肠道功能紊乱\t125317\nairflow\t125318\n蝶翼\t125319\n塞巴斯\t125320\n汉姓\t125321\n万米\t125322\n国际标准舞\t125323\n套房\t125324\n梅花肉\t125325\naldnoah\t125326\n炫彩\t125327\n首个\t125328\nStockings\t125329\n棕叶\t125330\n1926年\t125331\n优课联盟\t125332\n操操\t125333\n霜冻之蓝\t125334\npatindex\t125335\n皇家御河\t125336\n鹿胎\t125337\n凯迪拉克srx\t125338\n仙剑奇侠传幻璃镜\t125339\n一万七\t125340\n此国\t125341\nBrowsersync\t125342\n桥南街\t125343\n三十款\t125344\n单梁起重机\t125345\n丹青\t125346\nsftp\t125347\n法学系\t125348\n高防服务器\t125349\n航站楼\t125350\n中国仙游政府\t125351\n今敏\t125352\ndrank\t125353\n堕邪\t125354\nBalloon\t125355\n401k\t125356\nfeminist\t125357\n文峰路\t125358\n杨明娜\t125359\n九江长江大桥\t125360\n春训\t125361\n硅谷天堂\t125362\n酒井法子\t125363\n金线\t125364\n1.2升\t125365\n我的话\t125366\n浅口\t125367\n铰刀\t125368\n芝加哥机场\t125369\n河海大学研究生院
\t125370\n伪恋\t125371\n上街区\t125372\nTIPTOP\t125373\n云端管理中心\t125374\n8.2%\t125375\n巴达兽\t125376\nITCHN\t125377\n第131集\t125378\n付刚\t125379\n姓名版\t125380\n三脚\t125381\nLange\t125382\nphp+ajax\t125383\n娱乐_小红书\t125384\nb+树\t125385\n收完\t125386\n防撞梁\t125387\n钠\t125388\n肖振龙\t125389\n罗技G502\t125390\n贝尔法斯特\t125391\nrefa美容仪\t125392\n胚胎干细胞\t125393\nRJ45接口\t125394\n大统\t125395\n網站\t125396\nmeadow\t125397\n美墨战争\t125398\n麦饭\t125399\n大润\t125400\n欧布奥特曼英雄传\t125401\n高泰宇\t125402\n预约式\t125403\n天骄战纪\t125404\n24点\t125405\ncentos7系统\t125406\n吉视传媒\t125407\nGiver\t125408\n相川\t125409\npythn\t125410\nR8000\t125411\n多元回归模型\t125412\n南京信息工程大学\t125413\n毛儿\t125414\nDRGs\t125415\nlaoban\t125416\n检测\t125417\n下头\t125418\nzombie\t125419\n偏财运\t125420\n2X\t125421\n佛冈县\t125422\n发生冲突\t125423\n孙狗\t125424\n演唱会版\t125425\n伤天害理\t125426\n货单\t125427\n丹顶鹤\t125428\n黄美姬\t125429\n凡女\t125430\nvivoy51a\t125431\natlanta\t125432\n械字号\t125433\n外罩\t125434\n橄榄坝\t125435\nSlug\t125436\n天天象棋春秋五霸\t125437\n报价\t125438\n能量表\t125439\n雪橇犬\t125440\n涣\t125441\naix\t125442\n甜酱\t125443\nR7s\t125444\n香格里拉县\t125445\ncs15\t125446\n天亮了\t125447\n普格县\t125448\n羽林\t125449\n出机\t125450\n巴\t125451\n湖北网\t125452\n里巴巴\t125453\n廖博\t125454\n紫鹊界\t125455\nOscar\t125456\n和成\t125457\n有报\t125458\n小混蛋\t125459\nlibvlc\t125460\n1B\t125461\nESSAY\t125462\n包浆\t125463\n华为MATE9\t125464\n28个月\t125465\narchitectures\t125466\n600715\t125467\n中国影像方志\t125468\nad域\t125469\n宜花\t125470\n见光\t125471\n中共浙江省委\t125472\nChristmas\t125473\nreser\t125474\n返还\t125475\nKanebo\t125476\n胡锡进\t125477\n屯田\t125478\n浙江证监局\t125479\n水岸花城\t125480\n_宗申\t125481\n首钢医院\t125482\nNo.2\t125483\n1.1cm\t125484\n猫人\t125485\ntinder\t125486\n西泵股份\t125487\n体外诊断试剂\t125488\n温忠麟\t125489\n网学\t125490\n上都\t125491\n生命人寿\t125492\n长沙万达广场\t125493\njuju\t125494\n蜀山凌天传说\t125495\niSpring\t125496\n血饮\t125497\n陈媛媛\t125498\n传祺GS4\t125499\n六重天\t125500\n中泰证券\t125501\n初版\t125502\n文明5\t125503\n九地\t125504\n黑黄\t125505\n187ml\t125506\n瘦肉粥\t125507\n耳日\t125508\n数据库\t125509\n测出\t125510\n星情\t125511\n顾冠忠\t125512\n反对票\t125513\n雨板\t125514
\ntigo\t125515\nInf\t125516\n熏蒸\t125517\n图坦卡蒙\t125518\n清通\t125519\n赵琳\t125520\n五福星\t125521\n天马股份\t125522\n胡伟立\t125523\n漯河日报\t125524\n国瑞\t125525\n中国交响乐团\t125526\n轴式\t125527\n英雄战姬GOLD\t125528\nSLIM\t125529\nIK\t125530\n吕晶\t125531\nluajit\t125532\n九不准\t125533\n抬起头\t125534\n12岁时\t125535\n王福\t125536\n王启\t125537\nsldprt\t125538\nwitcher\t125539\n489号\t125540\n小黄油\t125541\nems没单号\t125542\n91日\t125543\n控制杆\t125544\n广阳镇\t125545\n第三稿\t125546\n2018年4月30日\t125547\n7700k\t125548\n福成机场\t125549\ngstar\t125550\n红板报\t125551\n上海应用技术大学\t125552\n韩红刘胡兰\t125553\n玛拉\t125554\n闽南理工学院\t125555\n顺遂\t125556\n同级生\t125557\n唐山大兄\t125558\n自考本科_自学考试\t125559\nDen\t125560\n护眼色\t125561\n邦泰集团\t125562\n70部\t125563\n江城路\t125564\n三大招\t125565\n北京市规划委员会\t125566\n厦门工会\t125567\ngmic\t125568\n外皮\t125569\n2015年国庆\t125570\n直升班\t125571\n调宽\t125572\n宅基地证\t125573\n泰国国家旅游局\t125574\n7.0.6\t125575\n拓片\t125576\n严隽琪\t125577\n归心似箭\t125578\nValidating\t125579\n浙江亚太机电股份有限公司\t125580\nlyft\t125581\n快乐酷宝2\t125582\n敖\t125583\n钛矿型\t125584\n星空卫视\t125585\n互助版\t125586\ncharacterization\t125587\n子宫内膜异位\t125588\nGH5\t125589\nmator\t125590\n安宁市\t125591\nWin7系统启动项\t125592\n李建新\t125593\n老榆木家具\t125594\n中华慈善新闻网\t125595\n富豪山庄\t125596\n耳袋\t125597\nselenium+python\t125598\n镀铝板\t125599\nonline\t125600\n南宁小学\t125601\n等离子\t125602\n887\t125603\n亚德诺\t125604\n葡萄酒资讯网\t125605\n炖\t125606\n加定\t125607\n近路\t125608\n八千代\t125609\n引凤\t125610\n樊哙\t125611\n15卷\t125612\n96333\t125613\n主办者\t125614\n瞄具\t125615\n调用\t125616\n风御九秋\t125617\n邦良\t125618\n880GM\t125619\n岁宝百货\t125620\n42crmo\t125621\nAAC/百度云\t125622\ncqy\t125623\n勒柯布西耶\t125624\n限于\t125625\n福建华南女子职业学院\t125626\n就是这样\t125627\n小年轻\t125628\n惠民卡\t125629\npts\t125630\n联萌\t125631\n苯二胺\t125632\n吉他入门\t125633\n哪几\t125634\nUD\t125635\n中信网\t125636\n天津队\t125637\n丽水市人民医院\t125638\n打BOSS\t125639\n爱情悠悠药草香\t125640\n器官捐献\t125641\n前七天\t125642\n塑料杯\t125643\n南京市委\t125644\n非数值型\t125645\n莱特希泽\t125646\n滨化\t125647\nBoats\t125648\n布面\t125649\n冰火板\t125650\n天津人才网\t125651\n体验版\t125652\n落式\t125653\n道教四大名山\t125654\n世纪坛医院\t125655\n柯力\t
125656\n腺肌症\t125657\n小魔女\t125658\n黄字\t125659\n友道\t125660\n刷头\t125661\n恩仇录\t125662\n罗美薇\t125663\nRNN\t125664\nDVWA\t125665\nDragonir\t125666\n季中赛\t125667\n2x3\t125668\n商南县\t125669\n热备份\t125670\n盐酸地芬尼\t125671\n洁尔\t125672\n谢公\t125673\nincidence\t125674\nClauses\t125675\nCISSP\t125676\nedk\t125677\n糟糠之妻\t125678\n福田镇\t125679\n4-6\t125680\n警务\t125681\n博云\t125682\n乐恩\t125683\nweaver\t125684\njdk\t125685\n先端\t125686\n和一个\t125687\n构词\t125688\nYQ\t125689\nWord模板\t125690\n斗拱\t125691\nhah\t125692\n小微型\t125693\n迪欧咖啡\t125694\n梁底\t125695\n酸碱度\t125696\nGenealogy\t125697\n龙服\t125698\n25ml\t125699\n固相萃取柱\t125700\n荣信\t125701\n名古屋市\t125702\nMyBatis-Spring\t125703\n网络自制剧\t125704\nHerman\t125705\nM16\t125706\n原人\t125707\nWEB\t125708\n20151031\t125709\nPanDownload\t125710\n故山\t125711\nodex\t125712\n蜂鸟众包\t125713\n紳士\t125714\n光遇\t125715\n志高\t125716\n辅助软件\t125717\n服务器管理器\t125718\n灰雁\t125719\n割嘴\t125720\n广西幼儿师范高等专科学校\t125721\n纸袋机\t125722\n想来想去\t125723\n石梅山庄\t125724\n半壁江中文\t125725\n片中\t125726\n三爱\t125727\n不通顺\t125728\nAUC\t125729\n水利水电建筑工程\t125730\nDyna\t125731\n胸臆\t125732\n凤池\t125733\n魔法雪\t125734\n目黑\t125735\n大理国\t125736\n浮图塔心理学\t125737\n求是\t125738\n刷机大师官网_一键完美刷机_ROOT大师官网_一键ROOT_蘑菇云\t125739\n导杆\t125740\n旋挖钻机\t125741\n巨石集团\t125742\nf460\t125743\n扬州中学\t125744\n联邦德国\t125745\n三中全会\t125746\n西宝\t125747\n膝\t125748\nv2.2.3\t125749\n优逸白\t125750\n传动系\t125751\n国都证券\t125752\n共析钢\t125753\n武汉植物园\t125754\n腐殖酸\t125755\nmassage\t125756\n小柯\t125757\n小红门\t125758\n俄里翁\t125759\nVC++6.0\t125760\nneovim\t125761\n布鲁娜\t125762\n无衣\t125763\n网上购物\t125764\n乌鲁木齐市人民政府\t125765\n四川省电力公司\t125766\n3960\t125767\n乾唐轩\t125768\n闪聊\t125769\n无忧行\t125770\n都匀经济开发区\t125771\n郑州八中\t125772\n97_\t125773\n引黄\t125774\n220TURBO\t125775\n广东公司\t125776\n美波\t125777\n中央军民融合发展委员会\t125778\n结尾曲\t125779\n30万套\t125780\n逆风局\t125781\n幅值\t125782\natitit\t125783\n自我约束\t125784\ndwm\t125785\nxinli\t125786\necs\t125787\n缩松\t125788\nrhyme\t125789\n欢瑞世纪\t125790\n丿\t125791\n3770K\t125792\n发软\t125793\n昨夜西风凋碧树\t125794\nScheme\t125795\n卫计委\t125796\nCharacters\t1257
97\n裕树\t125798\nHyper-V虚拟机\t125799\n折射仪\t125800\n廖凡\t125801\n悦澜湾\t125802\n乔登\t125803\n五十年后\t125804\nhkust\t125805\nkaldi\t125806\n11.5\t125807\n152路\t125808\n广富林\t125809\n底筋\t125810\n补光\t125811\nvirtualenvwrapper\t125812\nVince\t125813\n节育器\t125814\n阳房\t125815\n核酸\t125816\n土方\t125817\nDiscussion\t125818\n福建自贸试验区\t125819\n官话\t125820\n我房网\t125821\n南国置业\t125822\n寤寐\t125823\n文士\t125824\n聚名\t125825\nrate\t125826\n中国太平洋财产保险\t125827\n超级基因优化液\t125828\n邹杨\t125829\n北乡\t125830\n红脸蛋\t125831\n龟甲-三五中文网\t125832\n吕静\t125833\nzlog\t125834\n噜噜\t125835\nHLD\t125836\n海湖庄园\t125837\n香花桥街道\t125838\n小太阳\t125839\n海洋堂\t125840\n30岁时\t125841\n头像集\t125842\n第14关\t125843\nbookstore\t125844\nNSCLC\t125845\n北师大幼儿园\t125846\n胡路\t125847\n皇女\t125848\n王冰\t125849\n猎趣\t125850\n35分钟\t125851\n素素雪\t125852\n仁济\t125853\n眇\t125854\n贮运\t125855\n黄继吉克隽逸\t125856\n高中时代\t125857\n四喜鸟\t125858\ncontest\t125859\n第一维\t125860\n雷炸\t125861\n电子书\t125862\n)科技有限公司\t125863\n动来动去\t125864\n横向滚动条\t125865\n几声\t125866\n心家\t125867\n剑桥大学\t125868\nMartens\t125869\n微结构\t125870\n查稿\t125871\n轻母\t125872\n1880年\t125873\n伞业\t125874\n平阳\t125875\n魔法门\t125876\nル\t125877\n竖状\t125878\n酷安网\t125879\n徐茂栋\t125880\n冰渣\t125881\n生命之花\t125882\n糖醇\t125883\nstube\t125884\n经由\t125885\n枪炮\t125886\n9双\t125887\n游梁式\t125888\n打米机\t125889\n程建军\t125890\n刀笔\t125891\n丁程鑫\t125892\n等差\t125893\nMAXX\t125894\n谭文\t125895\n盘扣\t125896\n第27章\t125897\ncenturies\t125898\n新店乡\t125899\n衬底\t125900\n芳草街\t125901\n二十号\t125902\n虎弟\t125903\n点场\t125904\n1.93\t125905\n作姓\t125906\n弘高\t125907\n葱花\t125908\n尛样儿\t125909\n红心水\t125910\n战云\t125911\n觉察\t125912\n呱呱\t125913\n简繁体转换\t125914\n1894\t125915\nCleaner\t125916\n梁琴\t125917\n赵志军\t125918\n短袖男\t125919\n被单\t125920\n铆工\t125921\n熊朝忠\t125922\n概念模型\t125923\n焦恩俊\t125924\n嘉兴桐乡\t125925\n共渡难关\t125926\n多联\t125927\n休戚相关\t125928\n画正\t125929\n火葬场\t125930\nestudio\t125931\n86路\t125932\n43号\t125933\n每隔10秒\t125934\n4.26\t125935\n适格\t125936\n以人为本\t125937\nBlackmores\t125938\n禄口街道\t125939\n中央农村工作会议\t125940\n建设项目选址意见书\t125941\n肥肉\t125942\nP4P\t125943\n西苑中医院\t12
5944\ncreamy\t125945\n超实用\t125946\n1幅\t125947\n斗鱼黄金大奖赛\t125948\n鞠躬尽瘁\t125949\n法库\t125950\n友财网\t125951\n东风夜\t125952\n神_\t125953\n主材费\t125954\n符号\t125955\n眼人\t125956\n广布\t125957\n任斌\t125958\n娇媚\t125959\n配乐朗诵\t125960\n尚府\t125961\n长江通信\t125962\n2018-03-25\t125963\n套丝\t125964\n中共陕西省委\t125965\n木叶丸\t125966\n晚场\t125967\n反黑先锋\t125968\n帝国时代之罗马复兴\t125969\nBarTender\t125970\nmendes\t125971\nmodel3\t125972\nmoba类\t125973\n金锁固精丸\t125974\n30分钟\t125975\n庄子与惠子游于濠梁\t125976\n竹藤\t125977\n第五十二条\t125978\n精神损失费\t125979\ngigabyte\t125980\n倪发科\t125981\n企业信息化建设\t125982\n月下美人\t125983\n立信会计学院\t125984\n打拐\t125985\n前郭尔罗斯蒙古族自治县\t125986\n申请\t125987\n眨眼间\t125988\n味道\t125989\n瑞虎5论坛_汽车之家论坛\t125990\n5线\t125991\n托付\t125992\n青海农牧业信息网\t125993\n杨某某\t125994\n替班\t125995\n表人物\t125996\n工程系\t125997\n9250\t125998\n乱纹\t125999\n精品展\t126000\n超奥特8兄弟\t126001\n剑术\t126002\n滚道\t126003\n凉雾\t126004\n西陆文学\t126005\n猎魂\t126006\n玉芒\t126007\n惠英红\t126008\nvocs\t126009\n吕锋\t126010\n六角棒\t126011\n西索\t126012\n货件\t126013\n邪剑仙\t126014\n5000t\t126015\n召唤流\t126016\nfred\t126017\n11.3卡\t126018\n组句\t126019\nol4\t126020\n遂宁新闻网\t126021\n传菜\t126022\nScrews\t126023\n前后期\t126024\n星辰在线\t126025\n武军\t126026\n16集\t126027\n1万分\t126028\n朝晖路\t126029\n芭蕾舞剧\t126030\n马奇\t126031\nTuPian\t126032\n全网通4G\t126033\n小常宝\t126034\ntoner\t126035\n比旋度\t126036\n冷萃\t126037\npythom\t126038\n第四首\t126039\n9000美元\t126040\n鸿影\t126041\n艺馨\t126042\n三亚市区\t126043\n出巡\t126044\n波动性\t126045\n雪迪龙\t126046\n拉姆蕾姆\t126047\n百度搜素\t126048\n攀枝花学院\t126049\n选中子\t126050\n巨宗\t126051\n福建师范大学福清分校\t126052\n焦虑感\t126053\n组合柜\t126054\n粘弹性\t126055\n央产\t126056\n安东尼罗宾\t126057\n农药登记\t126058\n沪通卡\t126059\n先锋科技(香港)股份有限公司\t126060\n萨摩耶犬\t126061\nalmost\t126062\n巨佬\t126063\n小时代2\t126064\n阋\t126065\n维希度斯\t126066\n百花齐放\t126067\n耳廓狐\t126068\n施密特触发器\t126069\n云汉芯城\t126070\n2层\t126071\nZephyr\t126072\n丽园\t126073\n品川\t126074\n序号&#160\t126075\n中南海\t126076\n五亭桥\t126077\nrct\t126078\ncur\t126079\n商女\t126080\n雅妃\t126081\n循环控制器\t126082\n决战城南旧事\t126083\n苗疆蛊事2\t126084\n文兴宇\t126085\n浙江电力\t126086\n今何在\t126087\n多项\t1260
88\nInstron\t126089\n馨馨\t126090\n0937\t126091\nalp\t126092\n自然力\t126093\n20170618\t126094\n水蜡\t126095\nwebgame\t126096\n风飞扬\t126097\nsanji\t126098\n关联子\t126099\n陈旗\t126100\nseo\t126101\n北京市第一中西医结合医院\t126102\nSM-G9350\t126103\n随机验证码\t126104\n第8季\t126105\n24p\t126106\npop3\t126107\n随便果\t126108\nRage\t126109\n美格网\t126110\n友盟\t126111\n哈语\t126112\n百家利\t126113\n瑞吉\t126114\n气动切断阀\t126115\n荣耀达摩\t126116\n王朴\t126117\n答错\t126118\nUpdated\t126119\n能源部\t126120\n高拱\t126121\n腌蒜\t126122\n得不得\t126123\n四川外国语大学\t126124\n华少\t126125\n诸葛孔明\t126126\nssslin\t126127\n问答式\t126128\n小\t126129\n杭州道\t126130\nmpich\t126131\nYUV420\t126132\nv1.10\t126133\n349\t126134\n回顶\t126135\n2017-04-28\t126136\n广西住房和城乡建设厅\t126137\n苏打饼\t126138\nito\t126139\n官途\t126140\n董其昌\t126141\n攀枝花市东区\t126142\n脱浆\t126143\n促活\t126144\n金星星座\t126145\n林卡尔\t126146\n园游\t126147\n地虎\t126148\ndigestion\t126149\n垫板\t126150\n再热器\t126151\n梯度\t126152\n23年\t126153\n屏气\t126154\n高龄化\t126155\n这样做到\t126156\n铁片\t126157\n融会\t126158\n心理测试\t126159\n再遇\t126160\n重庆律师事务所\t126161\n口弦\t126162\n太原卫星发射中心\t126163\n中建三局二公司\t126164\n北京圆通快递\t126165\n知识产权案\t126166\nbeyond乐队\t126167\n绝死\t126168\nfind\t126169\n靖江市\t126170\n工效\t126171\n降雨\t126172\nPP助手\t126173\n金辉城\t126174\n宽带技术网\t126175\n其外\t126176\nCBP\t126177\n驱走\t126178\n吃水不忘\t126179\n峨山\t126180\n胆大妄为\t126181\n成长天地\t126182\nbots\t126183\n日本转运公司\t126184\n第一三共\t126185\n2007届\t126186\n两山论\t126187\n第三十九回\t126188\nN1S\t126189\n梁间荷载\t126190\n打挺\t126191\n自由光论坛_汽车之家论坛\t126192\n一颦一笑\t126193\n八层\t126194\n中华律师网\t126195\n储物柜\t126196\n45级\t126197\n通车\t126198\nterm\t126199\n新天龙八部OL_巴士新天龙\t126200\n糖味\t126201\n凤仙\t126202\nBiiigfish\t126203\nKeytool\t126204\n亿嘉\t126205\n萘普生\t126206\n蒋舒婷\t126207\n劇情\t126208\n816\t126209\n影音先锋资源_吉吉影音资源_影音先锋av_影音先锋男人站\t126210\nwxb\t126211\n自由港\t126212\n保山市隆阳区\t126213\n成都市区\t126214\n西师版小学\t126215\nsimplexml\t126216\n微软Edge\t126217\n阿德里安\t126218\n花卉\t126219\n苗翠花\t126220\n8.22\t126221\n65Mn\t126222\nacquire\t126223\n皮装\t126224\n网络电话\t126225\n欧元\t126226\n老中医药\t126227\n万能浏览器\t126228\n花椒粉\t126229\
n幸福人寿\t126230\n湖里\t126231\n青岛市地方税务局\t126232\n自贡银行\t126233\nunicorn\t126234\nMrAntiFun\t126235\n2014-2016年度\t126236\n满大街\t126237\n撤出\t126238\n西南部\t126239\n卫东\t126240\n宁扬\t126241\n打糕\t126242\n东科\t126243\n絮叨\t126244\n肾上腺肿瘤\t126245\n陆通\t126246\n薛明\t126247\n二本大学\t126248\n柯景腾\t126249\ndpd\t126250\n网易新闻中心\t126251\n33uu\t126252\n哆啦A梦新番\t126253\n卷角\t126254\n吮指原味鸡\t126255\n洗鞋\t126256\n注解处理器\t126257\n朱家尖\t126258\n专转本\t126259\nUSBKEY\t126260\n铝合金升降机\t126261\n义齿\t126262\n整形术\t126263\n怀抱\t126264\n剪卡\t126265\n沃尔沃V40\t126266\ngl吧\t126267\n洪蓉\t126268\n感情用事\t126269\n第四十五期\t126270\n子宫内膜异位症\t126271\n时间机器\t126272\n三阶\t126273\nSATA3.0\t126274\n聚创\t126275\n鞋匠\t126276\n射\t126277\n东珠\t126278\n罗南\t126279\n常熟尚湖\t126280\nCDE\t126281\n多项式回归\t126282\n平阳公主\t126283\n天泉\t126284\n耘\t126285\nhaviour\t126286\n阜康\t126287\n地位\t126288\n天猫券\t126289\n单双号\t126290\n3d渲染\t126291\n角型\t126292\n小柴胡汤\t126293\n心诚\t126294\n生崽\t126295\n神眼\t126296\n张秀梅\t126297\n石窟\t126298\n壹级\t126299\n吴尊\t126300\n挑\t126301\n余丁\t126302\n岳麓\t126303\n汉兰达论\t126304\nDOSBOX\t126305\n身型\t126306\n刷机助手\t126307\n潮汕\t126308\n招用\t126309\nHAVING\t126310\nmatlabgui\t126311\n软式透水管\t126312\n表见代理\t126313\n软陶\t126314\npycharm\t126315\n因循\t126316\n日航酒店\t126317\n赵忠祥\t126318\n胡宇崴\t126319\njizx\t126320\nSteady\t126321\n剑歌\t126322\n萧蔷\t126323\n发儿\t126324\n捷信公司\t126325\n编辑点\t126326\n三十秒\t126327\n0618\t126328\n东风恶\t126329\nServer2008R2\t126330\n人生旅途\t126331\nvivier\t126332\n动感之星\t126333\n水蒸\t126334\n荣耀路由\t126335\n老爸老妈浪漫史\t126336\n沙林\t126337\n巴霍巴利王2:终结\t126338\nalexander\t126339\n赵永\t126340\n政函\t126341\n爱扬教育网\t126342\n饥饿游戏3\t126343\n_我是演说家\t126344\nTemporal\t126345\nRETRO\t126346\n汉南区\t126347\n欧阳春\t126348\n海关总署\t126349\n1.80\t126350\n埋名\t126351\n阴森\t126352\n白胡子\t126353\n220米\t126354\n杏林湾\t126355\n7晚\t126356\n因特拉肯\t126357\nChang\t126358\n跺脚\t126359\n失信被\t126360\n波切蒂诺\t126361\n大型\t126362\n真才实学\t126363\n诉至\t126364\n辽参\t126365\n语言学纲要\t126366\nJULIA\t126367\n甘草酸\t126368\n东浦\t126369\n002456\t126370\n女足世界杯\t126371\n广安大街\t126372\n沈抚新区\t126373\n第十五期\t126374\nSang\t126
375\n祥鹏\t126376\n巴蒂斯塔\t126377\nKGS\t126378\n狮山公园\t126379\n邮戳\t126380\n五花八门\t126381\n外功\t126382\n斗罗大陆II绝世唐门\t126383\n无锡外国语学校\t126384\n风吹麦浪\t126385\n上海交通大学凯原法学院\t126386\n洁福\t126387\n小恶魔\t126388\n月饼盒\t126389\n魏晨\t126390\nQQ黑白秀\t126391\n假面骑士BLACK\t126392\n海河\t126393\n鹭鸶\t126394\ng位\t126395\n4年前\t126396\n安徽盛运环保(集团)股份有限公司\t126397\n音箱\t126398\n再来\t126399\ntableau\t126400\nhk\t126401\n6迅雷\t126402\n苏大附二院\t126403\nHRoot\t126404\n哭嫁\t126405\n长草颜团子\t126406\n为什么么\t126407\n千校\t126408\nGuava\t126409\nmainfest\t126410\n坐牢\t126411\n按兵不动\t126412\nAlexandra\t126413\n金绿\t126414\nSuzy\t126415\n血手宝典\t126416\nEthics\t126417\n十头\t126418\n六面体\t126419\n利特\t126420\n间\t126421\n古树茶\t126422\nMSC\t126423\n人民调解委员会\t126424\n手把手\t126425\n日照市\t126426\n限定符\t126427\n坐禅\t126428\n吉萨\t126429\nMineCraft\t126430\n楼楼\t126431\n劲乐团\t126432\n机智\t126433\nEditions\t126434\n阿长与山海经\t126435\n情关\t126436\n商产\t126437\n墨索里尼\t126438\n多喜爱\t126439\n美熟女\t126440\n食味\t126441\n飞龙湖\t126442\n八十多岁\t126443\n7亿美元\t126444\nProfits\t126445\n党群工作部\t126446\nbrit\t126447\niPhone8plus\t126448\n快慢机\t126449\n植物大战僵尸无尽版\t126450\n凯斯\t126451\n一百元\t126452\n萧亚轩\t126453\nMonoclonal\t126454\n数据冗余\t126455\nLED电路\t126456\n改性沥青\t126457\n前凸\t126458\n一刀秒\t126459\nsequential\t126460\n百度输入法\t126461\nCFB\t126462\niyun\t126463\n鼎胜新材\t126464\n六所\t126465\nPressure\t126466\n唱到底是谁\t126467\n南小鸟\t126468\n李安平\t126469\n果业通网\t126470\n李湘\t126471\n阿里助手\t126472\n电音\t126473\n艾眼\t126474\n客从何处来\t126475\n自欺\t126476\n21速\t126477\n十余万\t126478\n开元盛世\t126479\n梅州市教育局\t126480\n国恒\t126481\n借记卡_\t126482\n遥想\t126483\n13月\t126484\n风云雄霸天下\t126485\n工程热力学\t126486\n龙杰\t126487\n米店\t126488\n化疗药\t126489\n新语\t126490\n小米手机通话\t126491\n清凉一夏\t126492\n壮心\t126493\n穆童\t126494\n大肠经\t126495\n离婚率\t126496\n兽魂\t126497\nWGC\t126498\n衡越\t126499\n青龙满族自治县\t126500\n乌伊岭\t126501\n吉林省物价局\t126502\n宣传类\t126503\noxox色情网\t126504\n急功近利\t126505\ncopper\t126506\n自执行函数\t126507\n网贷天下-网贷天下\t126508\n4.66\t126509\n钟明\t126510\n购物袋\t126511\n李永生\t126512\nwmw\t126513\n御珑湾\t126514\n280_\t126515\n陈杰\t126516\n偷工减料\t126517\n融会贯通
\t126518\n20170309\t126519\n凉拌金针菇\t126520\n几天后\t126521\n肌内注射\t126522\n遗像\t126523\n刘孜\t126524\ncrucible\t126525\nModalDialog\t126526\n松树\t126527\n走兽\t126528\n精煤\t126529\n10毫升\t126530\n欣\t126531\n张光辉\t126532\n足球门\t126533\n乐_\t126534\n火辣\t126535\n槽管\t126536\nnankai\t126537\n3169\t126538\nbeanstalkd\t126539\n赵云\t126540\n大诚\t126541\n一过性\t126542\n夏衍\t126543\n软饭\t126544\n还愿\t126545\n巴沙鱼\t126546\nbred\t126547\n东施效颦\t126548\n张方\t126549\n中医学院\t126550\n什么然\t126551\n数字电视遥控器\t126552\n热力学统计物理\t126553\n进乡村\t126554\n298号\t126555\n2瓶\t126556\n一学一\t126557\nFrontpage\t126558\n初检\t126559\n会宁县人民政府\t126560\n长江隧道\t126561\nMXSPS\t126562\n没命\t126563\n温州市图书馆\t126564\n男人节\t126565\n借乐花\t126566\n武汉华中科技大学\t126567\n翻天\t126568\n制作组\t126569\n外语系\t126570\n马恩\t126571\nzuopin\t126572\n卤门\t126573\n证满\t126574\nroundup\t126575\nShake\t126576\n相序保护器\t126577\n孙中山故居纪念馆\t126578\n硫化\t126579\n叶月\t126580\nqlabel\t126581\n新址\t126582\n脸\t126583\n造血干细胞\t126584\n代战\t126585\n独门\t126586\n钩虫病\t126587\n荣耀畅玩7X\t126588\n流泉\t126589\n北京市商务委\t126590\n罗德\t126591\n复分解\t126592\n8001\t126593\n中药袋\t126594\n门式脚手架\t126595\n金岳霖\t126596\n哀乐\t126597\n绝密飞行\t126598\nQUICK\t126599\n引桥\t126600\n独体字\t126601\n脚踝处\t126602\nwwwabc3000cnm\t126603\nCIOAge\t126604\n1497\t126605\n断电\t126606\n人生活\t126607\n话费\t126608\n66u\t126609\nAmour\t126610\n新万家灯火\t126611\n网址导航\t126612\nABAB式\t126613\n空气波\t126614\n60x60\t126615\n金吉拉猫\t126616\n三瑞\t126617\n换领\t126618\ncurse\t126619\n香港教育学院\t126620\n电脑一体机\t126621\n20150627\t126622\n龙珠TV\t126623\n穿线管\t126624\n营养素\t126625\n聚石塔\t126626\n体部\t126627\nAuthors\t126628\nwangyi\t126629\n丰功伟绩\t126630\n屈才\t126631\n半嵌入式\t126632\n刘小华\t126633\n江西省公安厅\t126634\n中山眼科中心\t126635\n自来水管\t126636\n日月光中心\t126637\n刘玉珍\t126638\n底边\t126639\n建设用地规划许可证\t126640\nOpenCV3.0\t126641\n踝骨\t126642\n第五款\t126643\nZegna\t126644\n樱花妖\t126645\n记者团\t126646\n开封市人力资源和社会保障局\t126647\nCortex-M4\t126648\n袁超\t126649\n燃油管\t126650\n汽油炉\t126651\n战鹰\t126652\n传动装置\t126653\n1.2kg\t126654\n真爱\t126655\n微单A6300\t126656\n二代机\t126657\nFavor\t126658\nRETURN\t126659\n灼见\t12
6660\nautoremove\t126661\niColor\t126662\n青城后山\t126663\nmalloc\t126664\n国寿超月宝\t126665\n叫兽网\t126666\n石燕湖\t126667\n西五环\t126668\nnv200\t126669\n文汇中学\t126670\n无忧爱美网\t126671\n哭诉\t126672\ngnss\t126673\n掌事\t126674\n形而上\t126675\n何利\t126676\n单屏\t126677\n周杰伦的床边故事\t126678\n重伤\t126679\n诗词类\t126680\nmini3\t126681\n黄果树大瀑布\t126682\nsdx\t126683\n模范生\t126684\n花木路\t126685\n荷枪实弹\t126686\ncabeza\t126687\n养根\t126688\n基地\t126689\n中华H330\t126690\n载体\t126691\n北京市中学\t126692\n易捷便利店\t126693\n男女方\t126694\n照样子\t126695\n入木三分\t126696\n1229\t126697\n机械革命x6ti\t126698\nbigbigtree\t126699\n干货店\t126700\n_市井_淮海网\t126701\n&nbsp\t126702\n零售商店\t126703\n杨柳镇\t126704\n总领事馆\t126705\ntescom\t126706\n碎刀\t126707\n邓飞\t126708\n倍他乐克\t126709\n金庸奇侠传5\t126710\n轻歌\t126711\n辛晓琪\t126712\n1:00\t126713\n亨德尔\t126714\n波纹管补偿器\t126715\n外汇占款\t126716\n赵登禹\t126717\nnewinstance\t126718\n奥林匹克竞赛\t126719\n奔驰CLS级\t126720\nYoshua\t126721\n黑水镇\t126722\nPerth\t126723\n笨笨\t126724\n幼儿教育机构\t126725\n万众一心\t126726\n两杆大烟枪\t126727\n抽签\t126728\n兴安盟行政公署\t126729\nUpgrades\t126730\n20151215\t126731\n如柴\t126732\n淮房网\t126733\nPB9.0\t126734\n瓷艺\t126735\n宜居乡村\t126736\n数序\t126737\n二手货车网\t126738\nlyte\t126739\n省人民政府办公厅\t126740\n补给\t126741\n文青\t126742\n摩恩电气\t126743\n最终期\t126744\ncsapp\t126745\n钽铌矿\t126746\n配曲\t126747\n绕行\t126748\n李世默\t126749\n巢\t126750\nTCP/IP协议栈\t126751\n奥维互动地图浏览器\t126752\n庆盛\t126753\n5天前\t126754\n灰黑\t126755\n导视设计\t126756\n幌\t126757\nMYsql\t126758\n3000万辆\t126759\nDMM.com\t126760\nheritrix\t126761\n希志\t126762\n演艺\t126763\nobu\t126764\n低俗喜剧\t126765\n阿拉蕾\t126766\n功劳簿\t126767\n学者型\t126768\n第100页\t126769\n出货单\t126770\n泥棒\t126771\n孤峰\t126772\n小米手机联系人\t126773\n上海音协\t126774\n石小猛\t126775\naxa\t126776\n第21\t126777\nmina\t126778\nY410P\t126779\n20131226\t126780\n防御者\t126781\navro\t126782\n虎扑\t126783\nnullable\t126784\n陈博文\t126785\n质证\t126786\n悦方\t126787\n海淀区学院\t126788\n绵阳外国语学校\t126789\nyymodel\t126790\n工作满意度\t126791\n中土世界:战争之影\t126792\n换股\t126793\nCocos2d-js\t126794\n硕士\t126795\n阿赖耶识\t126796\nuiimage\t126797\n马来群岛\t126798\n一世情缘\t126799\n湄南河\t126800
\n分贝\t126801\n方向标\t126802\n防爆区\t126803\n沈威\t126804\n中国硬笔书法协会\t126805\n灌阳\t126806\n欠佳\t126807\n财会人员\t126808\n挂版\t126809\n90平\t126810\n海亮地产\t126811\ngas\t126812\n注册规划师\t126813\n虽小五脏俱全\t126814\n倍儿\t126815\n伙计\t126816\n崂山一中\t126817\n简阳市人民政府\t126818\n北京三快在线科技有限公司\t126819\n在线播放器\t126820\n光谷火车站\t126821\n多远\t126822\n观海卫\t126823\n丁应思\t126824\n权值\t126825\n银河公墓\t126826\n合肥高新技术产业开发区\t126827\n区角\t126828\n竹席\t126829\nDirectX9\t126830\nlexmark\t126831\n路由侠\t126832\nheshan珊\t126833\nlncRNA\t126834\n7500u\t126835\n他日\t126836\n金质习酒\t126837\nelastic-job\t126838\n1.35亿\t126839\n唐老鸭\t126840\n11111111\t126841\n药性\t126842\n凋落物\t126843\nSelectors\t126844\n吴宗俞敏洪\t126845\n红鲷鱼\t126846\n领导力\t126847\n4.8亿\t126848\n提篮桥街道\t126849\n凤铝断桥铝门窗\t126850\n稞\t126851\n玛瑙湾\t126852\n阿尔法.罗密欧\t126853\n350M\t126854\n六百元\t126855\n得意忘形\t126856\n上海罗氏制药有限公司\t126857\n120p\t126858\n刘春\t126859\nJava语言程序设计\t126860\n包金\t126861\n网咖\t126862\n魔法书\t126863\niod\t126864\n灯控台\t126865\n中证网\t126866\n1003\t126867\n1瓶\t126868\n国家国防科技工业局\t126869\n11套\t126870\n哪家\t126871\n宝骏530\t126872\n狠撸撸\t126873\nxh6300\t126874\n京都线\t126875\n佳木斯大学附属第一医院\t126876\nWindows8_Win\t126877\n首枚\t126878\n羊卓雍措\t126879\n利其尔\t126880\n矜\t126881\n金水东路\t126882\n群攻\t126883\nEspresso\t126884\n捕鱼船\t126885\n中华史\t126886\n小儿咳喘灵颗粒\t126887\n生物股份\t126888\n分档\t126889\n思嘉\t126890\n余刚\t126891\n合肥市国土资源局\t126892\n苏洵\t126893\n20140531\t126894\n11月8日\t126895\n耶律\t126896\n女房\t126897\n中国黄金集团\t126898\n_斗蟹\t126899\npostal\t126900\n自考资料\t126901\n汇编语言\t126902\nWireless\t126903\n小熊\t126904\n昆山政府\t126905\n甚少\t126906\nKNN算法\t126907\n临界\t126908\n资产者\t126909\n公司信贷\t126910\n青海省国家税务局\t126911\nsysbase\t126912\n步入式\t126913\numd\t126914\n标版\t126915\n白介素\t126916\n孙桥\t126917\n胎停\t126918\n梦醒了\t126919\n罗圈\t126920\n财付通快捷支付\t126921\n杭州第七中学\t126922\n前壁\t126923\nPornstar\t126924\n富驿时尚酒店\t126925\n古美门\t126926\n巴基\t126927\n墨水盒\t126928\n协程\t126929\n9.1.3\t126930\n奈拉\t126931\n奉献\t126932\n重山\t126933\n微品\t126934\nmissed\t126935\nMitigation\t126936\n西晋\t126937\n哈克\t126938\nMophie\t126939\n马后炮\t126940\n灵宝教育网\
t126941\nFP-620K\t126942\nThoughts\t126943\n甘肃国税\t126944\n杭金衢高速\t126945\n苏沐\t126946\n6头\t126947\n数千亿\t126948\n12.7\t126949\n商业楼\t126950\n电动开窗器\t126951\n大理学院\t126952\n骏业\t126953\n红冲\t126954\nLM317\t126955\n元贝\t126956\n试验区\t126957\n十点钟\t126958\n三国之志2\t126959\n广告学\t126960\n周阳\t126961\n风格\t126962\n海鲜街\t126963\n庙碑\t126964\n6月23日\t126965\n核仁\t126966\nnuma\t126967\n妖力\t126968\n亨氏米粉\t126969\noppor7\t126970\n石川施恩惠\t126971\n基本法\t126972\n蔡艺侬\t126973\nSW2016\t126974\n掘墓\t126975\n激越\t126976\nauthenticate\t126977\nmvc5\t126978\n你的爱\t126979\n镁粉\t126980\nwee\t126981\n新金瓶梅3D\t126982\n酶体\t126983\n第90期\t126984\n随叫随到\t126985\n参看\t126986\n中越\t126987\n王曰\t126988\n三陪\t126989\nJudicial\t126990\n林口\t126991\n上海市环保局\t126992\n文曲\t126993\n例子集\t126994\n欢乐堡\t126995\n香蕉味\t126996\n固定液\t126997\nxenserver\t126998\n旨趣\t126999\nasphyxia\t127000\n钉钉\t127001\n妖刀\t127002\n三宫\t127003\ncomplete\t127004\nFortress\t127005\n复出\t127006\n脚刑\t127007\n藏北\t127008\n压死\t127009\nMazda\t127010\n浩辰\t127011\n苹果机\t127012\n兴隆\t127013\n一月前\t127014\n黑雷\t127015\n不确定度\t127016\n财富观\t127017\n2018-2025年\t127018\nPause\t127019\ndir\t127020\n快乐的歌\t127021\n海尔兄弟\t127022\n楚云\t127023\n表框\t127024\n云梦招商局\t127025\n0.0.2\t127026\n正弦波振荡器\t127027\n8.4.0\t127028\n5680\t127029\n朱清\t127030\ncbct\t127031\n一战成名\t127032\n谦信\t127033\nendure\t127034\n军事报道\t127035\n水均益\t127036\n潍坊人事考试中心\t127037\nppt.doc\t127038\n四瓶\t127039\n王翰\t127040\n威尼斯城\t127041\n退机\t127042\n嘉兴房产超市网\t127043\n管理学\t127044\ncran\t127045\n500多种\t127046\n2.3.9\t127047\n人力资源服务公司\t127048\n交互项\t127049\n学工在线\t127050\n王志新\t127051\n2017-12-30\t127052\naipai\t127053\nIP67\t127054\n周振鹤\t127055\nWhisperer\t127056\nfram\t127057\n锂业\t127058\nvim8.0\t127059\n下地狱\t127060\n中国科学院沈阳应用生态研究所\t127061\n赵丽蓉\t127062\n陕西省妇幼保健院\t127063\n娄底\t127064\n网民\t127065\n德帅\t127066\n杂题\t127067\n杀神重生之叶欢\t127068\n鉴宝\t127069\n工业园\t127070\n宫腔积血\t127071\n姚尧\t127072\n吴珊卓\t127073\nRUP\t127074\n中华家纺网\t127075\n正师级\t127076\n超短裙\t127077\nkigurumi\t127078\n科尼赛克\t127079\n东林\t127080\n昕锐论坛_汽车之家论坛\t127081\n金山北站\t127082\n马峰\t127083\n民事诉
讼证据\t127084\nValuation\t127085\n史莱克学院\t127086\n7900x\t127087\n715\t127088\nM65\t127089\n以太猫\t127090\n标准语\t127091\n留神\t127092\n中华语文网\t127093\n有福同享\t127094\n姜熙健\t127095\n明发集团\t127096\n正色\t127097\nsergio\t127098\n落地扇\t127099\n乱云少妇\t127100\n地拍\t127101\n关山\t127102\n资料篇\t127103\n房钱\t127104\nmolecularshell\t127105\n150m\t127106\n政务院\t127107\n阿里布达\t127108\nT550\t127109\n现在去见你\t127110\n2016年2月\t127111\n21亿元\t127112\n征文网\t127113\n高红\t127114\n好生\t127115\nCharged\t127116\n既以\t127117\n永利皇宫\t127118\n78月\t127119\n第05期\t127120\nFFI\t127121\n850EVO\t127122\n北京公积金联名卡\t127123\n防滑\t127124\nwallace\t127125\n黔\t127126\n611所\t127127\n朱旭东\t127128\n绿桶\t127129\n上古卷\t127130\n康宁锅\t127131\nアイ\t127132\n5656\t127133\nremedies\t127134\n中央第八巡视组\t127135\n校际\t127136\n电影男\t127137\n20MA\t127138\n霍尔特\t127139\n懒汉\t127140\ngenuitec\t127141\n许嵩\t127142\nlastdayonearth\t127143\n2270\t127144\n复兴号\t127145\n第九章\t127146\nHRV\t127147\n大西北剿匪记\t127148\nASCII码\t127149\n02s515\t127150\n朱琳\t127151\n北京广播电视台\t127152\n水生所\t127153\n承德\t127154\nGRATIS\t127155\n灰板\t127156\n益佰制药\t127157\n想和你\t127158\nKendoUI\t127159\n修身裤\t127160\nNAT\t127161\n蚌肉\t127162\n团结一致向前看\t127163\n思妍丽\t127164\n二期\t127165\n8两\t127166\ndxc\t127167\nHYBRID\t127168\n仔仔\t127169\n红鲫鱼\t127170\n何须\t127171\njensen\t127172\n天汽模\t127173\n百艺\t127174\n巴拉圭\t127175\n十景\t127176\n辣子\t127177\nNIV\t127178\n紫林\t127179\n省政府金融办\t127180\n如期\t127181\n青海省地方税务局\t127182\n臂围\t127183\nCRC\t127184\nEvaluation\t127185\n张大国\t127186\n南岗区\t127187\n井冈山大道\t127188\netude\t127189\nRFI\t127190\nVaginal\t127191\n戬\t127192\n陈列师\t127193\n小知否知否应是绿肥红瘦\t127194\n杨尚昆\t127195\n喜来登\t127196\n假音\t127197\n刻印\t127198\njellycat\t127199\n品项\t127200\n建交\t127201\n汉语言\t127202\n不声不响\t127203\ncrashes\t127204\n承包商\t127205\n2008年7月\t127206\nalien\t127207\n诺冠\t127208\n几宗\t127209\n东箭\t127210\n文军\t127211\n蓝皮\t127212\n剁手\t127213\n狂沙\t127214\n执笔\t127215\n王建\t127216\nhuo\t127217\n统购\t127218\n气相色谱\t127219\n挠度\t127220\n欧尚A800\t127221\n刘统\t127222\n最终幻想7\t127223\nwashed\t127224\n宜丰县\t127225\n瓜田李\t127226\n慢性肾衰竭\t127227\n微
程序控制器\t127228\n敦煌研究院\t127229\n图像学\t127230\n立等\t127231\n知青网\t127232\n育儿书\t127233\n妙龄\t127234\ncodewarrior\t127235\n百货\t127236\n菈菈\t127237\n佟丽杜海涛\t127238\n西安培华学院\t127239\n小儿麻痹\t127240\n奥迪Q8\t127241\n夜兔\t127242\n骑马与砍杀战争\t127243\n180平\t127244\n宫颈锥切术\t127245\n四川省物价局\t127246\nAlmost\t127247\n明场\t127248\n阿尔泰山\t127249\n万立方米\t127250\nsars\t127251\n纯平\t127252\n嫩江县\t127253\n首选\t127254\n花圣\t127255\nshopee\t127256\n实习基地\t127257\n寻求\t127258\n罅隙\t127259\nJR东\t127260\n三寸天堂\t127261\n早知\t127262\nui界面\t127263\npbm\t127264\n光鲜亮丽\t127265\n藤制\t127266\n28册\t127267\n辜正坤\t127268\n斑马木\t127269\n600022\t127270\n丝路基金\t127271\n【民政局\t127272\n哋\t127273\n绿通\t127274\ndwg格式\t127275\n萧瑶\t127276\n做不到\t127277\n谭伟仪\t127278\noceanstor\t127279\n叫花子\t127280\n标准化法\t127281\n美国幼儿园\t127282\n盟\t127283\n今年春节\t127284\n回龙湾\t127285\n回贴\t127286\n@+id\t127287\n七星村\t127288\ncgv\t127289\n武汉外国语学校\t127290\n前海人寿\t127291\n志邦橱柜\t127292\nPNA\t127293\n无锡科技职业学院\t127294\n常家庄园\t127295\n信号槽\t127296\n东沟镇\t127297\n宁波工会\t127298\n旋转轮胎泥泞奔驰\t127299\nPSG\t127300\n预科生\t127301\n角片\t127302\n代金券\t127303\n胡建国\t127304\n夜话\t127305\nCHNS\t127306\n钛渣\t127307\n双子座女\t127308\naqjs\t127309\n杀妻藏尸案\t127310\n李焱\t127311\n离子晶体\t127312\n嗓\t127313\nB40\t127314\n广州中山三院\t127315\n18时\t127316\n卷洞\t127317\n2018年03月26日\t127318\n魔术手\t127319\n难怪\t127320\n自压\t127321\nx5650\t127322\n加尔各答\t127323\n原地\t127324\n经视\t127325\n鸾凤鸣\t127326\nBreakfast\t127327\n国有土地使用权转让\t127328\n4.6.5\t127329\n【缘\t127330\n正确地\t127331\n江苏警官学院\t127332\n芯谷\t127333\n大开讲\t127334\n标本\t127335\n新媒体时代\t127336\npops\t127337\n万历皇帝\t127338\n多灾多难\t127339\n就范\t127340\n老洋房\t127341\n双庙村\t127342\nasphalt\t127343\n迷迭香精油\t127344\n唐木\t127345\n志愿者协会\t127346\n计数\t127347\n厨师服\t127348\n看见你的声音\t127349\n汇金网\t127350\nmg3680\t127351\n折损\t127352\n农机合作社\t127353\n财报\t127354\n栅\t127355\nnb\t127356\n调节阀\t127357\nEthernet\t127358\n向何处去\t127359\nzipon\t127360\n幻想水浒传2\t127361\n中国三星电子\t127362\n中国银行浙江省分行\t127363\n环氧煤沥青漆\t127364\n无忌影社\t127365\n汽缸盖\t127366\n明皇陵\t127367\n价格段\t127368\nSrc\t127369\nphotosh\t127370\n3.52\t127371\n警言\t1
27372\n杰普\t127373\n年菜\t127374\n水果拼盘\t127375\nanf\t127376\n冷面机\t127377\n太平洋车险\t127378\nPECVD\t127379\n薄伽丘\t127380\n郑州七中\t127381\n杜甫吉克隽逸\t127382\n两者\t127383\nmarni\t127384\n20171205\t127385\n工会\t127386\n科普网\t127387\n大亮\t127388\n蒋中挺\t127389\n天官\t127390\nkuyibu\t127391\n上海地铁16号线\t127392\n杨行\t127393\nNAD\t127394\n韩寒杜兰特\t127395\n酒中仙\t127396\n太平洋岛国\t127397\nARCHIVE\t127398\n濑户内\t127399\n广志\t127400\n派克卓尔\t127401\n1天后\t127402\n十一日游\t127403\n10万辆\t127404\ncosmic\t127405\n剔除\t127406\n亚光科技\t127407\n24C02\t127408\n录像播放器\t127409\n刘文新\t127410\n1500mg\t127411\n跑跑车手机网\t127412\n第3家\t127413\n国寿e店\t127414\n人民美术出版社\t127415\n6厘\t127416\n马语\t127417\n進化\t127418\n学思践悟\t127419\n189元\t127420\nXplay3S\t127421\n芙蓉楼\t127422\n本文\t127423\n线性阵列\t127424\n柱盆\t127425\n401\t127426\n里水镇\t127427\n普及贴\t127428\n人民法院\t127429\nB轮融资\t127430\n宸妃\t127431\n标致308论坛\t127432\npallet\t127433\n蓝莓果\t127434\n第七号\t127435\n东汽\t127436\n济宁市高新区\t127437\n赵菁\t127438\n终点\t127439\n二十九年\t127440\n插齿机\t127441\n蛙宝\t127442\nkwon\t127443\nios6\t127444\n长滩\t127445\nLEGBABY\t127446\nSweety\t127447\n防空炮\t127448\n2016年5月1日起\t127449\n射极跟随器\t127450\n汉克狗\t127451\n仙路家养小首辅\t127452\n想起他\t127453\nTrendy\t127454\nSnap\t127455\n猴魁\t127456\nlibnl\t127457\nnba2017\t127458\nq吧\t127459\n表观消费量\t127460\n毋庸置疑\t127461\n链子\t127462\n100平方厘米\t127463\n坪地街道\t127464\n北京农村商业银行\t127465\nl380\t127466\n151项\t127467\n佐菲\t127468\n第四题\t127469\nOmar\t127470\n嘉兴路\t127471\n茶社\t127472\n勾\t127473\n举证期\t127474\n节约型\t127475\n【翼\t127476\n云霄飞车\t127477\n新媒体\t127478\n70亿美元\t127479\n突眼\t127480\n京津城际\t127481\nxpsp3\t127482\n不动产登记申请书\t127483\n藤井\t127484\n睡裙\t127485\n引路\t127486\n兴庆公园\t127487\n长袖善舞\t127488\n60斤\t127489\n金鸣\t127490\nharris\t127491\n金温\t127492\nVirtualbox\t127493\n雄兔\t127494\n假期综合症\t127495\n6.5公斤\t127496\ntyco\t127497\n花儿朵朵\t127498\n536\t127499\n街铺\t127500\nTP-Link\t127501\n乐清市\t127502\n铜官\t127503\nlearnt\t127504\n200倍\t127505\n交广网\t127506\n最爱\t127507\nppt文本框\t127508\n海客\t127509\n包河区政府网\t127510\n底围\t127511\n像素化\t127512\n非天然\t127513\nxplane11\t127514\n燕子包\t127515\n科进\t
127516\n藤井莉娜\t127517\n间谍片\t127518\nPMC\t127519\n薛宝钗\t127520\n压力锅\t127521\n桂林新闻报\t127522\n慈溪日报\t127523\nvido\t127524\n张家川\t127525\n材料科学与工程\t127526\nbud\t127527\n领刑\t127528\n骁龙835\t127529\nshock吧\t127530\n纯属虚构\t127531\n潘兴\t127532\n好屌操\t127533\n派遣制\t127534\n国科控股\t127535\n毫无意义\t127536\n半只\t127537\n十小\t127538\n导数\t127539\n虚拟人\t127540\n7538\t127541\nActive\t127542\n智元\t127543\n剑网三丐帮\t127544\n高村镇\t127545\ngippro\t127546\nunusable\t127547\n姚星彤\t127548\n祥子\t127549\n不久前\t127550\n逻辑思维\t127551\nOoh\t127552\nano\t127553\n萨莉亚\t127554\n常州新北区人民法院\t127555\n桐君街道\t127556\n合璧\t127557\n工商银行北京分行\t127558\nsubset\t127559\n辽代\t127560\n荞麦枕头\t127561\n4.2_\t127562\n哲学社会科学\t127563\n质保卡\t127564\n补天\t127565\n专家级\t127566\n2002g\t127567\n白人\t127568\nhps\t127569\n禁赛\t127570\n市知识产权局\t127571\nBTSOWS\t127572\n为主体\t127573\n林微\t127574\n数码\t127575\n第十四条\t127576\nsharks\t127577\n义乌中学\t127578\nCampbell\t127579\n跃华\t127580\n弓兵\t127581\n摇摇头\t127582\n南航金城学院\t127583\n投标网\t127584\n化工系\t127585\nkrait\t127586\nп\t127587\n众泰e200\t127588\n分布式缓存\t127589\n兵源\t127590\n大瑶\t127591\n四川大学制造科学与工程学院\t127592\n新华医院\t127593\n定分\t127594\n恒大珠三角公司\t127595\n满堂红地产网\t127596\n六图网16pic.com\t127597\n在逃\t127598\n木下\t127599\n分辩\t127600\n广珠\t127601\n宁波印象城\t127602\n南京南站\t127603\nRules\t127604\n贝佐斯\t127605\n贴面\t127606\n柯俊雄\t127607\n心艺\t127608\n百道\t127609\nPens\t127610\n砂锅粥\t127611\nweight\t127612\n二玄社\t127613\nPOF\t127614\n太和殿\t127615\nKaede\t127616\ninitially\t127617\n启正中学\t127618\nany\t127619\nGrocery\t127620\ncloud9\t127621\n弊病\t127622\n线数\t127623\n2立方\t127624\nhttp-server\t127625\nCost\t127626\n深圳大学总医院\t127627\n利福平胶囊\t127628\n1000T\t127629\n2009\t127630\nhourly\t127631\n5月16日\t127632\n多款\t127633\n西藏民族大学\t127634\n当事人\t127635\n增资\t127636\n纸制品业\t127637\n徙\t127638\n2017年2月16日\t127639\n爆炸性\t127640\n九一三\t127641\n门关\t127642\nDri\t127643\ngusto\t127644\n自有资金\t127645\n差率\t127646\nctid\t127647\n尖段\t127648\nC#类\t127649\n削除\t127650\n高安市政府网\t127651\n尊贵\t127652\n刻刀\t127653\n扼住\t127654\n玉娇龙\t127655\n长圳\t127656\n不死不休\t127657\n油牡丹\t127658\n尽量\t127659\n46
.\t127660\n爱多多\t127661\nVivo\t127662\n李明杰\t127663\n中国人民大学\t127664\n全军出击\t127665\n美好乡村\t127666\n宝马x3\t127667\n一体板\t127668\n冷轧带肋\t127669\n寂灭者\t127670\n衣装\t127671\n黑网吧\t127672\nNOTHING\t127673\n药局\t127674\n雨晨\t127675\n抺\t127676\n箱变\t127677\nm436\t127678\n孙健\t127679\n黑胡椒牛排\t127680\nFormatting\t127681\n急危重症护理学\t127682\nusd\t127683\n佳能数码相机\t127684\n四川大厦\t127685\nelinwenme\t127686\nv1.0_乐游网\t127687\n传祺GS8论坛_汽车之家论坛\t127688\n170名\t127689\n绍兴e网\t127690\n五千年\t127691\nSP1补丁包\t127692\n氯胺酮\t127693\njuediqiusheng\t127694\n中国志愿军\t127695\n纳金网\t127696\n金山医院\t127697\n录取线\t127698\n导码\t127699\n惠安县人民政府\t127700\n蓝天花园\t127701\n搜于特\t127702\n漫游\t127703\n2016年5月6日\t127704\n外食\t127705\n制图员\t127706\n金大师\t127707\n佐伯雪菜\t127708\nmilitary\t127709\n赁\t127710\n水泥罐\t127711\nMode\t127712\n云纵\t127713\n化学清洗\t127714\n独断专行\t127715\n是枝裕和\t127716\n非公平\t127717\n金庭镇\t127718\nxinsss\t127719\n赛过\t127720\n信件\t127721\n发颤\t127722\n玉虚宫\t127723\n元老\t127724\n会计部\t127725\nセント\t127726\n银彩\t127727\n144芯\t127728\n正演\t127729\n季琦\t127730\n深城投中心\t127731\n架子\t127732\nCleaning\t127733\n于江\t127734\n浦口实验小学\t127735\n荧光舞\t127736\n蓝莓论坛\t127737\n张薇薇\t127738\n宁波幼升小\t127739\n华语\t127740\n忍者神龟3\t127741\n绝地求生倍镜\t127742\n奔驰gla\t127743\n真人超级英雄\t127744\n成都棕南医院\t127745\n金丝鱼\t127746\nleague\t127747\nlowagie\t127748\n波利亚\t127749\nsomething\t127750\n华电\t127751\n魔剑士莉奈2\t127752\n西客站\t127753\n毛小方\t127754\n重铸\t127755\n29p\t127756\n省金融办\t127757\n楼阁\t127758\n宽广\t127759\n沙窝\t127760\n盛虹集团\t127761\n联想集团\t127762\n罪的叹息\t127763\n爱情天梯\t127764\n广州富力地产股份有限公司\t127765\nboku\t127766\n自驾游走\t127767\n肺大疱\t127768\n宫城\t127769\nwww.dy2018.com\t127770\n猛狮\t127771\n成都南\t127772\n邮折\t127773\n荣耀高帧率\t127774\n折射\t127775\nhtttp\t127776\n草样\t127777\n多数\t127778\n大昏君\t127779\n基建处\t127780\n美術\t127781\n绥中\t127782\nOOCL\t127783\n英儿\t127784\nmazak\t127785\n赤影\t127786\n28.8\t127787\n理学院\t127788\n油浸变压器\t127789\n咆哮\t127790\n带宽\t127791\n11bg\t127792\n美狄亚\t127793\n滨河西路\t127794\n松下\t127795\n王晓初\t127796\n金玲\t127797\n铭心\t127798\ntourists\t127799\nriver\t127800\nUBM\t127801\n廖品东\t127802\neva
吧_\t127803\n特地\t127804\nosgb\t127805\njsonobject\t127806\n北京大学学报\t127807\n噻吩\t127808\n聚集性\t127809\n七七\t127810\n卒\t127811\n紫晶洞\t127812\n游戏源\t127813\nbeam\t127814\n百度云盘/磁力链接\t127815\n10kw\t127816\nnosql\t127817\n七月十四\t127818\n西南石油大学图书馆\t127819\n跑马场\t127820\n权能\t127821\n手环\t127822\n50座\t127823\nbboom\t127824\n奷\t127825\n口甜\t127826\n爱玛客\t127827\n铝架\t127828\n舟山酒店\t127829\n中兴花园\t127830\n小烈\t127831\n真草\t127832\n电瓶\t127833\n初体验\t127834\n500彩票网\t127835\nScientific\t127836\nsoa\t127837\n97.7\t127838\n金科地产集团股份有限公司\t127839\n长治市\t127840\n圣诞树\t127841\n周宁县人民政府\t127842\n好书记\t127843\n日泰\t127844\n多多多\t127845\n那个女\t127846\n可燃气体检测仪\t127847\n空心化\t127848\n视错觉\t127849\n顶级\t127850\n透亮\t127851\nOFweek节能环保网\t127852\n安全员c证\t127853\n美博城\t127854\n河南质量工程职业学院\t127855\n野奢\t127856\nunde\t127857\n桥头集镇\t127858\nBIGBANG\t127859\nFlask-SQLAlchemy\t127860\n李光洁\t127861\n厦门市行政服务中心\t127862\nuib\t127863\ncqsf\t127864\n铜雀春深锁二乔\t127865\n第十一\t127866\n嘅\t127867\n岁月\t127868\n三变科技\t127869\n现付\t127870\n一叶子\t127871\nUPU小说网\t127872\n地圆\t127873\n心爱的人\t127874\n浓缩丸\t127875\n我的美丽日记\t127876\n萌龙大乱斗\t127877\n怪物猎人3G\t127878\n魔王寨\t127879\n陶笛\t127880\n威尼斯人酒店\t127881\n定量泵\t127882\n50摄氏度\t127883\ncld\t127884\n地下党\t127885\n耿宝昌\t127886\n卫青\t127887\ncrt\t127888\n轻蔑\t127889\n早上\t127890\n柱端\t127891\n移动平均线\t127892\n严师\t127893\n易峰\t127894\n丹参酮\t127895\n水原三星\t127896\n青铜\t127897\n宋晓明\t127898\n一木块\t127899\n领型\t127900\n精仿\t127901\n双份\t127902\nBitMap\t127903\n珠江城大厦\t127904\n废塑料\t127905\n迪克牛仔\t127906\n陈帆\t127907\n正仓院\t127908\n泛娱乐\t127909\n多益网络\t127910\n繁殖期\t127911\n蒸煮袋\t127912\n比特大陆\t127913\nZipper\t127914\n青春旅社\t127915\n固衣柜\t127916\n火锅鸡\t127917\nWH\t127918\n爱优\t127919\n埃德加·爱伦·坡\t127920\n凤仪镇\t127921\n薛度云\t127922\n村证\t127923\n安仁新闻网\t127924\n档案柜\t127925\n王储\t127926\n南京天保驾校\t127927\n发言权\t127928\n雨布\t127929\n第79届\t127930\n一兴奋\t127931\n申东旭\t127932\n上海公务员考试网\t127933\n赘肉\t127934\n燕赵\t127935\nsimplex\t127936\n企业境外投资管理办法\t127937\n苦寻\t127938\n免予\t127939\n复仇之路\t127940\n转义字符_\t127941\n首用\t127942\n福厦高铁\t127943\n深圳市水务局\t127944\n叶卡捷琳堡\t127945\nfun\t127946
\n吊车\t127947\n齐发\t127948\n北京市私立汇佳学校\t127949\n2017年12月12日\t127950\n平衡片\t127951\n如虎\t127952\nTomorrowland\t127953\n炮解\t127954\n克雷顿\t127955\n非正规\t127956\ntellmewhen\t127957\n光秃秃\t127958\nDOCX\t127959\n尾箱垫\t127960\n九都\t127961\nmvc框架\t127962\n春运\t127963\n王玉华\t127964\n义乌网\t127965\n高洪波\t127966\n寄件\t127967\n放羊的星星\t127968\n登载\t127969\nPrice\t127970\n保荐代表人\t127971\n米兰世博会\t127972\n玄色\t127973\n道标\t127974\ndéfinition\t127975\n语重心长\t127976\np38\t127977\nrplidar\t127978\nstatic\t127979\n性经\t127980\nirun\t127981\nleona\t127982\n成都日报多媒体\t127983\n热敏打印纸\t127984\n东晟\t127985\n朝军\t127986\ncentury\t127987\n纯情罗曼史\t127988\n肉体\t127989\n东圃客运站\t127990\n天平男\t127991\n凤娃\t127992\nword03\t127993\n日照经济技术开发区\t127994\n铁蛋白\t127995\n开心鬼救开心鬼\t127996\n624\t127997\n家化\t127998\n陈婧\t127999\nGo-Ethereum\t128000\n海洋鱼\t128001\nEmmanuelle\t128002\n弹丸论破1\t128003\nTMZ\t128004\n夫妻群\t128005\n牛商网\t128006\n尼康D5300\t128007\n血流不止\t128008\n鲁邦三世VS名侦探柯南\t128009\n南柯一梦\t128010\n邯郸路\t128011\n耗子\t128012\n星沙汽车站\t128013\nPPT格式\t128014\nw1070+\t128015\n抛竿\t128016\n总承包方\t128017\n上海起发实验试剂有限公司\t128018\n软键\t128019\n制敌\t128020\n上汽通用五菱\t128021\n小水滴\t128022\n富饶\t128023\nlight7\t128024\nvalextra\t128025\n琼恩·雪诺\t128026\n签字栏\t128027\n孩爸孩妈\t128028\nSnandy\t128029\n闹事\t128030\n吉利帝豪GL\t128031\n似懂非懂\t128032\n团学\t128033\n小缘\t128034\n批注\t128035\n协调器\t128036\n灌\t128037\n150路\t128038\n锐角\t128039\nHyperworks\t128040\n新都市区\t128041\nmigos\t128042\n广西商务厅\t128043\n第N行\t128044\n该厂\t128045\nmobility\t128046\n早已\t128047\n主业\t128048\n联合国海洋法公约\t128049\n转销\t128050\n过敏性鼻炎\t128051\n横店东磁\t128052\npresence\t128053\n青秀万达\t128054\nMats\t128055\n潜水料\t128056\n顺企\t128057\n察布查尔锡伯自治县\t128058\n华森制药\t128059\n小囊\t128060\n成瘾\t128061\n问情\t128062\n小辉哥\t128063\n非生产性\t128064\n320Kbps\t128065\n沙皮狗\t128066\n二来\t128067\nMTP\t128068\n勤业\t128069\n小白点\t128070\n十不吊\t128071\n舞文弄墨_论坛\t128072\nifree\t128073\n卵巢癌\t128074\n头上\t128075\n麦克吉尔\t128076\nUART\t128077\n广州医科大学\t128078\n日批\t128079\n犯罪嫌疑\t128080\n灯笼椒\t128081\nadolfmc\t128082\n157g\t128083\n信息学\t128084\n庸政\t128085\n省机关事务管理局\t128086
\nmindmanager2017\t128087\nCourage\t128088\n滑动变阻器\t128089\n无声恐怖漫画\t128090\nnoe\t128091\n理科类\t128092\n鱼油软胶囊\t128093\n暧昧\t128094\n嗟来\t128095\n滑翔\t128096\nair2\t128097\nWWE2K18\t128098\n英尺\t128099\n江溪街道\t128100\n于无声处\t128101\n盒蛋\t128102\n6061铝板\t128103\n招商银行一卡通\t128104\n雨云\t128105\n泰康人寿保险\t128106\n汉娜·阿伦特\t128107\n煽动\t128108\n本经\t128109\narizona\t128110\n走慢\t128111\n大鼓\t128112\nDIF\t128113\nchoke\t128114\n北京国贸三期\t128115\n林宁\t128116\n皓龙\t128117\n北京中国国际展览中心\t128118\n牛津\t128119\nnewmacbook\t128120\n华晨鑫源\t128121\n臭味剂\t128122\n逆铣\t128123\n长鸣路\t128124\n老百晓\t128125\n水林\t128126\n风雨无阻\t128127\n海南省住房和城乡建设厅\t128128\n领结\t128129\n验孕纸\t128130\n1857\t128131\n魔宠\t128132\n张利宁\t128133\n易从网\t128134\n半坡\t128135\n人头马\t128136\ninspector\t128137\nlandsat\t128138\n49.5\t128139\nDried\t128140\n雨化\t128141\n成势\t128142\n3队\t128143\n小巷\t128144\n光油\t128145\nprotal\t128146\nCFNM\t128147\n冷盘\t128148\n工器\t128149\n锦绣华城\t128150\nPHOTONICS\t128151\n国税机\t128152\n张丽丽\t128153\nbeck\t128154\n微信群发器\t128155\n多巴哥\t128156\n李校长\t128157\n瓦里安\t128158\n般若\t128159\n莲塘一中\t128160\n资兴市\t128161\nlabelimg\t128162\n中党\t128163\n通管局\t128164\n88.7\t128165\n闪闪发光\t128166\n有效期限\t128167\n政采版\t128168\nIKE\t128169\n20170\t128170\n洛阳市人力资源和社会保障局\t128171\nmega\t128172\n广东省统计局\t128173\n福州国家森林公园\t128174\n于清\t128175\n支护\t128176\n一个25岁\t128177\n仙剑3\t128178\n人团\t128179\n沐英\t128180\n陈情表\t128181\n兼爱\t128182\n二氧化碳灭火器\t128183\n南丹政府网\t128184\n正度\t128185\n言求\t128186\n洁颜\t128187\nTDM\t128188\n双槽\t128189\nanaconda\t128190\n太君\t128191\n婚托\t128192\n春水流\t128193\nlishi\t128194\n马航370\t128195\n烧结普通砖\t128196\n匡子语\t128197\n学班\t128198\n3060\t128199\n公共场所卫生许可证\t128200\n清华大学材料学院\t128201\n企业化\t128202\n六爻\t128203\n华南师范大学\t128204\n惩戒权\t128205\n合生汇店\t128206\n有比\t128207\n钛粉\t128208\n华为语音助手\t128209\n翟天临\t128210\n光纤连接器\t128211\n掂量\t128212\n开关量\t128213\nm126\t128214\nげ\t128215\n歌\t128216\nRomance\t128217\nkaizen\t128218\n宋鳕霖\t128219\n梁健\t128220\nviolate\t128221\n刷步\t128222\nq8400\t128223\nexpo\t128224\nXRP\t128225\n一居\t128226\n_战网\t128227\n民事调解书\t128228\n深井\t128229\
n天津医学高等专科学校\t128230\nSigar\t128231\n900亿\t128232\n困倦\t128233\n变态版\t128234\n第119章\t128235\n4粒\t128236\n卡慕\t128237\n川委\t128238\n婚姻自由\t128239\n全麻\t128240\n国网重庆市电力公司\t128241\n気\t128242\n微游\t128243\n金豪\t128244\nUtoVR\t128245\n屏蔽箱\t128246\n弹法\t128247\n连年\t128248\n赤佬\t128249\n相对性状\t128250\n别式\t128251\n地杰国际城\t128252\n河南电视台\t128253\n山海\t128254\n房天下土地网\t128255\nColorful\t128256\n第16部\t128257\n睡眠障碍\t128258\nanki\t128259\nic卡\t128260\n球\t128261\n禁卖\t128262\n上海工商\t128263\n锥型\t128264\n西枝江\t128265\n虚频\t128266\n真没想到\t128267\n5元\t128268\n广东省物价局\t128269\n景峰\t128270\n武汉婚纱摄影\t128271\n洞箫\t128272\n词史\t128273\n蔬\t128274\n利川市\t128275\n区人力社保局\t128276\n鱼仔\t128277\n邱月清\t128278\n汤姆林\t128279\nAdblock\t128280\n水篦子\t128281\n逃学威龙3\t128282\n唤\t128283\n毕业时\t128284\nSZ\t128285\n腰窝\t128286\n长留\t128287\n陈远\t128288\n3八重\t128289\n怒焰裂谷\t128290\n生姜片\t128291\n马丹丹\t128292\nwatch\t128293\n樱田樱\t128294\n第九批\t128295\n细流\t128296\n首版\t128297\n荣威e950\t128298\n绒面\t128299\ntrousers\t128300\n不差\t128301\n重睑\t128302\n招商银行信用卡还款\t128303\n单向\t128304\n聊骚\t128305\n东风康明斯\t128306\n协同办公系统\t128307\n炮舰\t128308\n节奏大师\t128309\nseancody\t128310\n港北区\t128311\n福道\t128312\nALDI\t128313\n南京师范大学法学院\t128314\n贬损\t128315\nzcat\t128316\n49天\t128317\n植物大战僵尸3\t128318\n雨润星雨华府\t128319\n中国人民解放军信息工程大学\t128320\n中国药业人才网\t128321\n汪建\t128322\n室友\t128323\nyjig\t128324\n嫌疑犯\t128325\n常电\t128326\n普雷\t128327\n终极教师\t128328\n有机会\t128329\narrested\t128330\n破案\t128331\n资财\t128332\n逾期付款\t128333\n梦想天\t128334\n仪规\t128335\n招商银行运通百夫长白金卡\t128336\n温玉娟\t128337\npengying\t128338\n株洲站\t128339\n生意社\t128340\n八千块\t128341\n金科阳光小镇\t128342\n膨润土\t128343\nDota\t128344\n达拉斯小牛\t128345\n第二双\t128346\n韩影库\t128347\nv201\t128348\n华南城\t128349\n微动漫\t128350\n南少林\t128351\nmouseenter\t128352\n战神纪#\t128353\n喜木\t128354\n具人\t128355\n40mg\t128356\n菜汤\t128357\n藏南地区\t128358\nLinkedin\t128359\n吹动\t128360\n浙江法制报\t128361\n光催化\t128362\n低昂\t128363\n芙蓉阁\t128364\n4.5g\t128365\n等不来\t128366\n一元二次方程\t128367\n仙灵世界\t128368\nframework4.6\t128369\n豆友们\t128370\n10.7.4\t128371\n塞壬\t128372\nAMD64\t128373\nMEMO\t
128374\n大美青海\t128375\n诺基亚520\t128376\nuz\t128377\nsanger\t128378\n精简版\t128379\n第三艘\t128380\n极美\t128381\nqqt\t128382\n等额本息还款法\t128383\nYY频道\t128384\n华为MATE8\t128385\n贾衣玫\t128386\nLevel2\t128387\n2ms\t128388\nOrdering\t128389\n在线课堂_直线网\t128390\n禁不起\t128391\n火影忍者手\t128392\n97周年\t128393\n固定资产\t128394\nmeshing\t128395\n鸳\t128396\n杭建平\t128397\ntwinks\t128398\n宋祖\t128399\n菲利普斯\t128400\n玉屏县\t128401\n电介质\t128402\n常开型\t128403\n2.0升\t128404\ntnsping\t128405\n咏叹调\t128406\n天珠\t128407\nES7\t128408\n北京大学医院\t128409\n南湖新城\t128410\nQINGDAO\t128411\n明鑫\t128412\n丽姐\t128413\n月花\t128414\n差分_\t128415\n无锡市中级人民法院\t128416\n领克01论坛论坛\t128417\n集团有限公司\t128418\n低音谱号\t128419\n锂矿\t128420\n麦吉减肥法\t128421\n刷图\t128422\nC82\t128423\n开饭喇\t128424\n烟展\t128425\nonkeypress\t128426\n异士\t128427\n1.6亿\t128428\n韵文\t128429\n汤锅\t128430\n好聚好散\t128431\n寿险公司\t128432\n尝到\t128433\nhgh\t128434\n湖北省人力资源和社会保障厅\t128435\n刘唐\t128436\nideapad710s\t128437\n孔子仁\t128438\ngust\t128439\n北堂\t128440\n手心输入法\t128441\n小女子\t128442\n爱的阳光\t128443\n100处\t128444\n热血传奇开服表\t128445\ndxdiag\t128446\n面对着\t128447\ngambling\t128448\n快捷栏\t128449\niphoneX\t128450\n大豆蛋白\t128451\nTrace\t128452\nPepsiCo\t128453\nchec\t128454\n姜黄粉\t128455\n湖南花鼓戏\t128456\n课率\t128457\n寂寞空庭春欲晚\t128458\n京平\t128459\n看着办\t128460\n血雨\t128461\n红炉\t128462\n前沿科技\t128463\n6.17\t128464\n78届\t128465\n全自\t128466\ncategories\t128467\n规划展览馆\t128468\njisuan\t128469\n10月3日\t128470\n哀羞\t128471\n全剧\t128472\n贻\t128473\nS71200\t128474\n僵蚕\t128475\n立体构成\t128476\n越秀金融大厦\t128477\n大使\t128478\nIbatis\t128479\n树突状\t128480\nSap\t128481\n小微\t128482\n泵业\t128483\n民丹岛\t128484\n驻马店市人民政府\t128485\n中华民国三年\t128486\n赈酒\t128487\n间断\t128488\nBJ40\t128489\n海绵状血管瘤\t128490\nAspectJ\t128491\n分区\t128492\n银河生物\t128493\n一月份\t128494\n澳门新葡京酒店\t128495\n齐家网\t128496\n秒传\t128497\nfranz\t128498\nmaxscript\t128499\n盛夏光年\t128500\n南昌大学科学技术学院\t128501\n中科院电工所\t128502\n戾王\t128503\n2016年10月15日\t128504\nicv\t128505\n天涯镇\t128506\n江北机场\t128507\n和平大道\t128508\n管弦乐队\t128509\n喜闻乐见\t128510\n大鹰\t128511\n一乘\t128512\n宁伟\t128513\nMajority\t12
8514\n点不点\t128515\n矢泽妮可\t128516\n聊着\t128517\n野上\t128518\n102种\t128519\nsql分组\t128520\n隔断\t128521\nkaiji\t128522\n实对称\t128523\n壳寡糖\t128524\n贺兰雪\t128525\n蛋饼\t128526\nhandler\t128527\n50毫升\t128528\n2103\t128529\n潮红\t128530\n中国邮政网络学院\t128531\n诗行\t128532\n中国人才网\t128533\nDL250\t128534\n获封\t128535\n豪门夜宴\t128536\nLL\t128537\nfm2015吧\t128538\n岁月风云之上海皇帝\t128539\n秦山岛\t128540\n优80\t128541\n我是大明星\t128542\n大医\t128543\n眼黑\t128544\n民人\t128545\n泰来县\t128546\nIthink\t128547\n十幅\t128548\n89平\t128549\n小心超人\t128550\nAND\t128551\nquickfix\t128552\nUVA\t128553\nchocolatey\t128554\n小肠\t128555\n未世\t128556\nutrack\t128557\n习近平治国理政\t128558\n37部\t128559\n怡安\t128560\n闪婚很纯很暧昧前传\t128561\nkona\t128562\ndcb\t128563\n明发\t128564\n陶德\t128565\n只可\t128566\n多件\t128567\n圣衣\t128568\n成双成对\t128569\n回波\t128570\n乙酰胆碱酯酶\t128571\n双向细目表\t128572\nhoan\t128573\n第二代\t128574\nxcx\t128575\n个人银行结算账户\t128576\n网易大厦\t128577\n背景音乐\t128578\n许泽玮\t128579\n5104\t128580\n木斧\t128581\n颈动脉粥样硬化\t128582\nLoan\t128583\n重阳木\t128584\nSpectrometry\t128585\n汶水\t128586\n斩杀\t128587\n劣质\t128588\n平炉\t128589\n庆春路\t128590\n小龙猫\t128591\n等宽\t128592\n跑表\t128593\n觅知网\t128594\nnode-mysql\t128595\n中小学校设计规范\t128596\nfianl\t128597\n抄告\t128598\n朱青\t128599\n挖财\t128600\nHugo\t128601\n0592\t128602\n中国铁路设计集团有限公司\t128603\n苏科\t128604\n木兰科\t128605\nadapter\t128606\n洗煤\t128607\n光板\t128608\n碧海蓝天\t128609\nPvE\t128610\n拇指玩\t128611\n本次\t128612\n食虫\t128613\n艾尔\t128614\n重庆市烟草专卖局\t128615\n成者\t128616\n上海市浦东新区市场监督管理局\t128617\n区划图\t128618\n竞争力\t128619\nEC2\t128620\ndetailed\t128621\nPaginator\t128622\n申纪兰\t128623\n温布利球场\t128624\n该段\t128625\n乐死\t128626\n投壶\t128627\nov7725\t128628\n王树京\t128629\n脑蛋白\t128630\n77家\t128631\n200T\t128632\n凯欣\t128633\nHustle\t128634\nENT\t128635\n恩施土家族苗族自治州\t128636\n橡胶防老剂\t128637\nlong\t128638\n离线版\t128639\n干挂\t128640\n15&\t128641\n和声\t128642\n彩灵堡\t128643\ninvolve\t128644\n超级名人传\t128645\n异闻录\t128646\n中国中药协会\t128647\n并发包\t128648\n英语六级作文\t128649\n三湘银行\t128650\n北京石油化工学院\t128651\n青音\t128652\nspout\t128653\nWeb开发_编程\t128654\n宾得k5\t128655\n姨夫\t128656\
n樊氏\t128657\n骁龙652\t128658\n草屋\t128659\n普希金\t128660\n沃伦\t128661\nbatchnorm\t128662\nGENTLE\t128663\nAP考试\t128664\n释凡\t128665\ndirent\t128666\n塞尔达传说\t128667\nUCS\t128668\n新苏教版小学\t128669\n附则\t128670\n金人\t128671\n无卤\t128672\n传代\t128673\n小玄子\t128674\n朗动论坛_汽车之家论坛\t128675\n128M\t128676\n十问\t128677\n词库\t128678\n英式下午茶\t128679\ndreamwever\t128680\nMARINE\t128681\n转圈\t128682\n冰盘\t128683\n淮滨县人民政府\t128684\n萧皇后\t128685\n在线观\t128686\n蝴蝶谷\t128687\n鱼龙\t128688\n福字\t128689\n跳梁\t128690\nOFweek仪器仪表网\t128691\n睿派克\t128692\n187号\t128693\n假面骑士wizard\t128694\nRankings\t128695\n盈\t128696\n刀箱\t128697\n时代小学\t128698\nedmx\t128699\nAnswerCard\t128700\nandroid.support.v4\t128701\nConfessions\t128702\n领头\t128703\n房屋中介公司\t128704\n雷石\t128705\n40亿\t128706\n丰都鬼城\t128707\n三击\t128708\n询价函\t128709\n铁托\t128710\n姜凯\t128711\n梅森\t128712\nexpensive\t128713\n七零八落\t128714\n泡影\t128715\n亿赛通\t128716\n包源\t128717\n梁式桥\t128718\n火凤燎原吧\t128719\ngiants\t128720\nsuperfly\t128721\n很郁闷\t128722\n蝶尾\t128723\n300m\t128724\n股海\t128725\n马洪涛\t128726\n爬堂\t128727\nsai\t128728\nTSL\t128729\n天津地铁4号线\t128730\n千乡\t128731\n亲历者\t128732\n甲申日\t128733\n破小\t128734\n看涨期权\t128735\n飞云市\t128736\n证金\t128737\n石勇\t128738\n杂项\t128739\n_段\t128740\nIOE\t128741\n尼斯湖\t128742\nwebstrom\t128743\n杀戮\t128744\n小米手机2\t128745\n无尘\t128746\n索隆\t128747\n宁夏大学新华学院\t128748\n11200\t128749\n信息系统\t128750\nx360ce\t128751\n八一物流\t128752\n全自动洗衣机\t128753\n易连凯\t128754\n阳光房\t128755\n中标价\t128756\n完结\t128757\n富贵列车\t128758\n集约化\t128759\nwma\t128760\n将相\t128761\n北京办事处\t128762\nOne\t128763\n石花洞\t128764\nCFOP\t128765\n复杂性\t128766\n戚戚\t128767\nexcel10\t128768\n孙玥\t128769\n_大街网\t128770\n不拜\t128771\nAbnor\t128772\n沈阳体育学院\t128773\n二第四\t128774\n特步\t128775\nROI\t128776\neos\t128777\n交汇期\t128778\npscs3\t128779\n水无濑优夏\t128780\n基督福音网\t128781\n雷蛙\t128782\nnovnc\t128783\n_乡途网\t128784\n英皇国际\t128785\nmum\t128786\n市工商局\t128787\n宜万铁路\t128788\ncarib\t128789\nEOS-1D\t128790\n第13批\t128791\n铁头\t128792\n余文森\t128793\n张岩客\t128794\n园林工程\t128795\n傲引擎\t128796\n侯孝贤\t128797\n徐生\t128798\n直饮机\t128799\n爆破者\t12
8800\n应计\t128801\n荷兰式\t128802\nBienvenue\t128803\n朱赫来\t128804\n忽地\t128805\n天马岛\t128806\nNBA录像网\t128807\n曼哈顿\t128808\n公主裙\t128809\n7:30\t128810\n一嘴\t128811\n龙岗村\t128812\n松村\t128813\n私密处\t128814\nzlata\t128815\nPaperPass\t128816\n耀阳\t128817\n为了你\t128818\n牛蹄\t128819\n英菲尼迪Q50/Q50L论坛_汽车之家论坛\t128820\n金麦\t128821\nSchlumberger\t128822\n2SLS\t128823\n土霉素\t128824\ncanned\t128825\nwebclient\t128826\nadamrose\t128827\n氢溴酸右美沙芬\t128828\nexFAT格式\t128829\n麻烦\t128830\n梅县机场\t128831\n东方通信\t128832\n4LZ\t128833\n画影\t128834\n粘力\t128835\n并未\t128836\n编辑处\t128837\n直邮\t128838\n浪味仙儿\t128839\n曼谷机场\t128840\nTESA\t128841\nsilent\t128842\nyas\t128843\n婐\t128844\n阿川\t128845\nInsomnia\t128846\n房管局\t128847\n甲仑榴莲\t128848\n2014款\t128849\n生死狙击\t128850\nMeal\t128851\n2016年4月15日\t128852\n金达克宁\t128853\n110kV\t128854\n哔咔\t128855\n营口路\t128856\n五证合一\t128857\n进出场\t128858\nTemperature\t128859\n帕梅拉\t128860\nOfficer\t128861\nFancy\t128862\nvgn\t128863\n短丝\t128864\n切丝\t128865\niOS7.1\t128866\n石头山\t128867\n国际米兰\t128868\n中国人民政治协商会议湖北省委员会\t128869\nLESSON\t128870\n方林装饰\t128871\n写单\t128872\ndaogrs\t128873\n俢\t128874\npdf\t128875\n11.8%\t128876\n兰西\t128877\n佐樱\t128878\n浆砌块石\t128879\n中国黄页网\t128880\n有喜\t128881\n铭轩\t128882\n半声\t128883\n石家庄北\t128884\nTOEFL版\t128885\n国家体育局\t128886\n奥卡姆\t128887\n怪叫\t128888\n用权\t128889\n滤槽\t128890\n零度可乐\t128891\n修容棒\t128892\n美奈子\t128893\n低聚半乳糖\t128894\n$.ajax\t128895\n劫机\t128896\n金麒麟\t128897\n财经_猪e网\t128898\n终日\t128899\nVirginia\t128900\n马拉维\t128901\n日月光华\t128902\n雷凌牧神记\t128903\n铁岭\t128904\n打标机\t128905\nShed\t128906\n大爱无疆\t128907\nノ瀬アメリ\t128908\n难别亦难\t128909\n先富\t128910\n送死\t128911\n胎毒\t128912\nrecorded\t128913\n刘蔚\t128914\n银土\t128915\n京医通\t128916\n抗癌\t128917\n土工格室\t128918\ntcnative-1.dll\t128919\n上海电信\t128920\n酒格\t128921\n朝隆\t128922\n踏破铁鞋邮轮网\t128923\n9月23日\t128924\n西凉\t128925\n黄真\t128926\n108届\t128927\n频移\t128928\n河南省国土资源厅\t128929\n行为心理学\t128930\n东安区\t128931\n星标\t128932\n3500\t128933\n小儿麻痹症\t128934\nEmployers\t128935\n大连圣亚\t128936\n天鹅堡\t128937\ncocosCreator\t128938\n非盈利\t128939\n飓龙\t128940\n
头镖\t128941\n法人治理结构\t128942\n油雾器\t128943\n260万元\t128944\n电源IC\t128945\n中华人民共和国旅游法\t128946\nemerald\t128947\n5.58\t128948\n决策咨询网\t128949\n轴距\t128950\nhztxt\t128951\n5537\t128952\n网游之倒行逆施\t128953\n慢性肾炎\t128954\n棕垫\t128955\nBAG\t128956\n犹马镇\t128957\n木易\t128958\ngrowl\t128959\n日宅\t128960\n保利房地产(集团)股份有限公司\t128961\n森森\t128962\n单结\t128963\n拥入\t128964\n张爱罗斯\t128965\n夷陵区\t128966\n楷维\t128967\nIGCSE\t128968\n中低\t128969\n台式机能\t128970\n机动性\t128971\nflv播放器\t128972\n69bj\t128973\n概貌\t128974\nCAPS\t128975\n超过36小时\t128976\n金梅瓶\t128977\n介面\t128978\n举牌\t128979\n吴君如\t128980\n哥谭\t128981\n农村信用社银行\t128982\n硬朗\t128983\n0301\t128984\n布罗代尔\t128985\n克鲁伊夫\t128986\n脑囊肿\t128987\nKim\t128988\n赵凯\t128989\n6季\t128990\n嘱咐\t128991\n深圳联通\t128992\n盘牙\t128993\n外耳\t128994\nABCDEFG\t128995\n回充\t128996\n友朋\t128997\nbiee\t128998\n涝\t128999\n古田一路\t129000\nbeca\t129001\n601998\t129002\n600207\t129003\n坐骨\t129004\n欢乐喜剧人3\t129005\n许君聪\t129006\n市纪委监察局\t129007\nApp\t129008\n男人装\t129009\n奥吉\t129010\nz17mini\t129011\n一2019年\t129012\nMrBlue\t129013\nARX\t129014\nactel\t129015\n原配\t129016\n画蛇\t129017\n40HQ\t129018\n年初\t129019\npuerto\t129020\nmetalink\t129021\n50种\t129022\n_日\t129023\naga\t129024\n爱国教育基地\t129025\n1v3\t129026\n净增\t129027\n阿兵\t129028\n16aspx.com\t129029\n音叉\t129030\n天星小轮\t129031\n硫酸四氨合铜\t129032\n亿条\t129033\n南通城\t129034\n猫扑大杂烩\t129035\n1228\t129036\n精轧\t129037\n文晖\t129038\n中银e贷\t129039\n朝中措\t129040\n清昭陵\t129041\nConvert\t129042\n片场照\t129043\n马钰\t129044\n歡迎\t129045\n青苔\t129046\nannuity\t129047\n英俊少年\t129048\n将不国\t129049\n小群\t129050\n女追男\t129051\ntreats\t129052\n为天人\t129053\n开出\t129054\nDura\t129055\n天津航空\t129056\nDI\t129057\n大光谷\t129058\nsurprise\t129059\nMacBookair\t129060\nab卷\t129061\nQ8\t129062\n国民\t129063\n600256\t129064\n赶脚\t129065\n易寒\t129066\n早上八点\t129067\n女性们\t129068\ntoge\t129069\n乱数\t129070\n北京卫生职业学院\t129071\nCarrying\t129072\n王国良\t129073\n防尘圈\t129074\nMontBlanc\t129075\nseemed\t129076\nBass\t129077\n电料\t129078\n俄罗斯央行\t129079\n佛山市国家税务局_广东省国家税务局\t129080\n扫地\t129081\n建文帝\t129082\n超声科\t129083\n所得\
t129084\n门禁一卡通\t129085\n蔚空\t129086\n下针\t129087\nwdsj\t129088\nm200\t129089\nzencart\t129090\n纳米技术\t129091\n包死\t129092\n淘宝群聊\t129093\n2018-01-09\t129094\n刘索拉\t129095\n你的要求\t129096\nAMIGO\t129097\n金福珠\t129098\n妣\t129099\n鱼跃\t129100\n20151018\t129101\n椭圆机\t129102\n汉区\t129103\n床具\t129104\n空蝉之森\t129105\n六针\t129106\n无锡市住房和城乡建设局\t129107\n萨顿\t129108\n欧罗\t129109\n增兵\t129110\nFlyDragon\t129111\n双歧杆菌四联活菌片\t129112\n岳敏君\t129113\n喜怒哀乐\t129114\n腾讯手游助手傲引擎\t129115\nregenerate\t129116\n天然气门\t129117\n蕃茄田\t129118\n运输箱\t129119\n遠離塵世\t129120\n心理性\t129121\n高分贝\t129122\n杏雨\t129123\n太热\t129124\n大致\t129125\n22块\t129126\n知金\t129127\n香港证券\t129128\n局党委\t129129\n辛辛那提大学\t129130\n项庄\t129131\n难治性\t129132\n非著名\t129133\n心情失落\t129134\nCollins\t129135\n即墨古城\t129136\nservlet-name\t129137\n青鱼\t129138\n说案\t129139\n动库商城\t129140\n雪宝|港雪宝|雪宝论坛|雪宝论坛\t129141\n罗冲\t129142\n7017k小说网\t129143\n艺体\t129144\n旁门\t129145\n苏杉杉\t129146\n0991车评中心\t129147\nhab\t129148\n单星\t129149\n墩身\t129150\nPycharm\t129151\n22日\t129152\nLemonade\t129153\n下村\t129154\n中山南路\t129155\n厨卫电器公司\t129156\n生姿\t129157\n无主之地2\t129158\n女忧\t129159\n扭扭棒\t129160\n五四红旗团委\t129161\n我要\t129162\n江门市五邑中医院\t129163\n空飘\t129164\n花田\t129165\n湖南省工商局\t129166\ncopa\t129167\n苹果手\t129168\n玉质\t129169\n薄刀峰\t129170\n艾克里里\t129171\n伪军\t129172\n香炉礁\t129173\n吃火\t129174\n阿陈\t129175\n奥特兄弟\t129176\n封臣\t129177\n围棋手\t129178\n四川大学数学学院\t129179\n太平洋联盟\t129180\n131号\t129181\n无微不至\t129182\ne530\t129183\n基基\t129184\n1972\t129185\n西本新干线\t129186\ngtx1060\t129187\n凤儿\t129188\n自愿性\t129189\n寇准\t129190\n横吹\t129191\nデン\t129192\n张家港日报\t129193\nInbound\t129194\nnanopi\t129195\n遭难\t129196\n潘峰\t129197\n清明日\t129198\n黄硕\t129199\n比克\t129200\n朱仁\t129201\n野河\t129202\n内账\t129203\n尹食堂2\t129204\n经体\t129205\n02款\t129206\n空喊\t129207\nxiaoxue\t129208\nreplica\t129209\n杨彦星\t129210\n免疫球蛋白\t129211\n齿轮箱\t129212\nHD1280\t129213\n棉絮\t129214\n清越\t129215\n东风超龙\t129216\n嘉定幼儿园\t129217\n直轴\t129218\n本音\t129219\n项集\t129220\n姤卦\t129221\n黑米网\t129222\n龙门石窟景区\t129223\n滨江花园\t129224\n不动弹\t129225\n寒潭\t129226\n剑速网\t129227\n十三部\t12
9228\n新会市\t129229\nth2\t129230\n美丽\t129231\n主教学楼\t129232\n洗劫\t129233\npopocom\t129234\n917路\t129235\n远程教育学院\t129236\n在职法硕\t129237\n惊爆价\t129238\nRollup\t129239\n乘数效应\t129240\n钻皇天下网\t129241\nzlc\t129242\n黑皮诺\t129243\n黑saber\t129244\n小松\t129245\n真辛苦\t129246\n小红点\t129247\n赵学立\t129248\nHeadlines\t129249\n质谱图\t129250\n240万\t129251\nHD1080p\t129252\n反对称\t129253\n痴漢\t129254\n蝮蛇\t129255\n比亚迪秦\t129256\nent\t129257\n投资战\t129258\n中国供销集团\t129259\nivory\t129260\nshaker\t129261\n汗衫\t129262\nsim800\t129263\nygopro2\t129264\n不是不是\t129265\nLICEcap\t129266\n大元泵业\t129267\n泡泡浴\t129268\n消毒供应中心\t129269\n广为\t129270\n510050\t129271\nstomp\t129272\nwannacry\t129273\n嘉树\t129274\nDrilling\t129275\nPerformer\t129276\n多克\t129277\n15.0000元\t129278\n张猛\t129279\n财务章\t129280\n杨俊伟\t129281\npathos\t129282\n无忧加盟网\t129283\n长城\t129284\nthreadlocal\t129285\n过五关\t129286\nswitching\t129287\n偃师市\t129288\n元壹角\t129289\n建发央\t129290\n王威\t129291\n事在人为\t129292\n郑州富士康\t129293\n云山\t129294\n北京地\t129295\n考区\t129296\nQS世界大学\t129297\n李中莹\t129298\n头茬\t129299\n女勇者\t129300\n站立式\t129301\n.com域名\t129302\n战之海贼\t129303\n女神节\t129304\n二.doc\t129305\n84版\t129306\ntertiary\t129307\nautofac\t129308\n薏米红豆粥\t129309\n初冬\t129310\n硫蛋白\t129311\n哑铃飞鸟\t129312\nTrait\t129313\n香甜\t129314\n聚氨酯直埋保温钢管\t129315\n增值税专用发票管理办法\t129316\n43分\t129317\nlto\t129318\n七秀\t129319\n袁莉\t129320\nvmbox\t129321\n三相插座\t129322\n管理科学学报\t129323\n进深\t129324\n泷泽\t129325\ntodd\t129326\n定额人工费\t129327\nRuntastic\t129328\n驾校\t129329\n纤维性\t129330\n上海外国语大学研究生部\t129331\n30.00\t129332\n付佳美\t129333\n亚利桑\t129334\n纯种犬\t129335\n1季\t129336\n虹桥路\t129337\n西乙\t129338\n反义\t129339\n君莫笑\t129340\n水栽\t129341\n陆游\t129342\n老闫\t129343\n小白卡\t129344\n浴女\t129345\n社企\t129346\n大爱城\t129347\n原声集\t129348\n代词\t129349\nv1.6.2\t129350\n金山中学\t129351\n版报\t129352\n印兽\t129353\n尺字\t129354\nIterable\t129355\n石窟寺\t129356\nAC3\t129357\n企鹅号\t129358\n华贸\t129359\n埃索\t129360\n八棱海棠\t129361\n王跃文\t129362\n蜜雪\t129363\nSelina\t129364\n千粒重\t129365\n金古\t129366\n高丽大学\t129367\n广州广电计量检测股份有限公司\t129368\nsensetime\t129369\n膳食纤
维\t129370\nFONT\t129371\n上杉谦信\t129372\n三国秀\t129373\n苗床网\t129374\n余姚北站\t129375\n0.5mm\t129376\n茶师\t129377\n鼻基底\t129378\n百毒不侵\t129379\n滨河路\t129380\n现象级\t129381\n姜杰\t129382\n海淘网\t129383\n复本\t129384\n医统江山\t129385\nLION\t129386\nmarkup\t129387\n柜型\t129388\nx820\t129389\nzindex\t129390\n潜阳\t129391\niPad5\t129392\n开敞\t129393\n东证\t129394\nv1.5.1\t129395\n政府主义者\t129396\n谱仪\t129397\n广州羊城通有限公司\t129398\nDIOR\t129399\n单元号\t129400\n副歌\t129401\n巨乳症\t129402\n双井富力城\t129403\n油脂类\t129404\n陈天桥\t129405\n千王之王重出江湖\t129406\nm17x\t129407\n冠王\t129408\n毛细管网\t129409\naclocal\t129410\n插\t129411\n半透明\t129412\n云达不莱梅\t129413\n抢掠\t129414\n|信网\t129415\nWW\t129416\n子宫后位\t129417\ncanton\t129418\n40分贝\t129419\nui素材\t129420\n暖味\t129421\n辉隆股份\t129422\n74号\t129423\n无为\t129424\n2052\t129425\n平汝高速\t129426\ntreasures\t129427\nDT880\t129428\njpeg格式\t129429\n文安\t129430\nBionic\t129431\n华民\t129432\n黄头\t129433\n430亿\t129434\n心跳过\t129435\n90SS\t129436\nTWW\t129437\n安特\t129438\n罗技g300s\t129439\n机动车限号\t129440\n庐\t129441\nDSG变速箱\t129442\nD6\t129443\n第206集\t129444\n翔宇教育集团\t129445\n文明网\t129446\n宁海县人民政府\t129447\n保持清醒\t129448\nsde\t129449\n杨议\t129450\nitune\t129451\n鸿润\t129452\n大航海\t129453\n协合\t129454\n相呼应\t129455\n锁销\t129456\n四川外国语大学成都学院\t129457\n搜狗手写输入法\t129458\n3104\t129459\n小七孔景区\t129460\n模拟火车2017\t129461\n新浪装修家居网\t129462\nbance\t129463\nMTN\t129464\n净赚\t129465\n泰民\t129466\n小道消息\t129467\n181号\t129468\n同声翻译\t129469\n10版\t129470\n堀咲莉亚\t129471\napowermirror\t129472\n凯汇商\t129473\n平庄\t129474\ndeparture\t129475\n华北理工大学\t129476\n鳝鱼\t129477\n行星地球\t129478\n山东梆子\t129479\nTwinmotion\t129480\n哈弗歌诗图\t129481\n韵达物流\t129482\n真空包装\t129483\n胚乳\t129484\nEverNote\t129485\nsubtext\t129486\nseparator\t129487\n超级精灵\t129488\n3018\t129489\n第一副\t129490\n五彩滩\t129491\n轨\t129492\n3Dmax2016\t129493\n全死\t129494\n山东管理学院\t129495\n星露谷物语stardew\t129496\n墙功\t129497\n冷阴\t129498\n春功\t129499\n莳\t129500\n大厂县\t129501\n马懿\t129502\n1327\t129503\n吞掉\t129504\n脐贴\t129505\n石梁\t129506\n习仲勋\t129507\n绝妙\t129508\n丁波\t129509\n嘻哈音乐\t129510\nttd\t129511\n宗国\t129512\n奇异
性\t129513\n小早川怜子\t129514\nKorg\t129515\n高帮\t129516\n食品网\t129517\n乃木坂46\t129518\nm.2\t129519\n建筑商\t129520\n暗影精灵3pro\t129521\n玉祁\t129522\nrecovery\t129523\n弗雷泽\t129524\n橘子\t129525\n如言\t129526\n0元\t129527\n48mm\t129528\n黑点点\t129529\nAV天堂网2018\t129530\n谢过\t129531\n基米希\t129532\n漏胸\t129533\n加厚型\t129534\n情有可原\t129535\n扑动\t129536\n对峙\t129537\nctime\t129538\n蓝鸟金屋藏娇\t129539\n乡村爱情10\t129540\n户别\t129541\n心声\t129542\nxaml\t129543\n89万\t129544\n皮疹\t129545\n南宁装饰公司\t129546\n不菲\t129547\n27个\t129548\n特类\t129549\nfestival\t129550\n全能法\t129551\n文气\t129552\n10339\t129553\n樱桃酒\t129554\n女长\t129555\nlba\t129556\n长沙东华医院\t129557\n橙意\t129558\n科普丨\t129559\n注册制\t129560\n内乡县人民政府\t129561\n风管机\t129562\n洞子\t129563\n冠特\t129564\ndemonstrate\t129565\n风炫\t129566\nbigbang\t129567\n定军山\t129568\n冠捷\t129569\nhwnd\t129570\n洽谈室\t129571\n供应不足\t129572\n山东卫生新闻网\t129573\n英城\t129574\n3l\t129575\n滁州西涧\t129576\n第一处\t129577\n深点\t129578\nPuzzles\t129579\n曾仕强\t129580\n黄尾袋鼠\t129581\n胡埭\t129582\n权变\t129583\n屯\t129584\n简体字\t129585\nHGAME\t129586\n2010年底\t129587\nxinyuan\t129588\n8015\t129589\n小米智能插座\t129590\n顶包\t129591\n镇宁路\t129592\njoshua\t129593\nピアノ\t129594\n色素痣\t129595\n小鹿\t129596\n坛友们\t129597\nmaker\t129598\n黑瓷\t129599\n梅州客家\t129600\nT7\t129601\nawakening\t129602\n心绞痛\t129603\nblight\t129604\nvolk\t129605\n颌面\t129606\n总队\t129607\n阻火圈\t129608\nVC++\t129609\n徐州市中心医院\t129610\n孤独寂寞\t129611\n丧假\t129612\nsuspicious\t129613\n果戈里大街\t129614\n入学登记表\t129615\n黄忠\t129616\n微鲸电视\t129617\n6601\t129618\nCC图库漫画网\t129619\n蓝天格锐\t129620\n跳仓\t129621\n5032\t129622\n僵尸部队\t129623\nlibav\t129624\n5754\t129625\n320mm\t129626\n逍遥安卓\t129627\ncafé\t129628\nkes\t129629\n嘤\t129630\n宫中\t129631\n伴郎团\t129632\n长生界\t129633\n王智\t129634\n泛泛\t129635\n车票\t129636\n阿强\t129637\n分峰\t129638\n洋山港\t129639\n北四村\t129640\n深圳市政协\t129641\n指向性\t129642\n中通速递\t129643\n女名\t129644\n凭祥党建网\t129645\nviho\t129646\n车裂\t129647\n微软研究院\t129648\n天然居\t129649\n夜惠美\t129650\n蓼莪\t129651\n自动饮水器\t129652\nstatistics\t129653\n普陀寺\t129654\nyingw\t129655\nbiz\t129656\n路感\t129657\n智课教育出国考试\t12
9658\n2017劳动节\t129659\n华敏\t129660\n民营化\t129661\n@import\t129662\nLover\t129663\n陈天刚\t129664\n新弹丸论破V3\t129665\ndtc\t129666\n青砖茶\t129667\n放羊\t129668\n吴泽\t129669\nE站\t129670\nSam168666\t129671\n双胞\t129672\nモノ\t129673\n1.4米\t129674\n全季\t129675\n中共江苏省委\t129676\n点石\t129677\nPE袋\t129678\n希尔顿酒店集团\t129679\n12G\t129680\n返还率\t129681\nchrome65\t129682\n安亭镇\t129683\n贾宁\t129684\n内购会\t129685\n舒通\t129686\nFT中文网\t129687\n内格夫\t129688\n巴郎\t129689\n深改组\t129690\nx64.exe\t129691\n姓关\t129692\n求辨\t129693\n玉门\t129694\nS2B2C\t129695\n九宗\t129696\n樟树人才网\t129697\n阳城镇\t129698\n配色\t129699\nSU\t129700\neso\t129701\n百户\t129702\n加法运算定律\t129703\n遮阳伞\t129704\n开成\t129705\n低聚\t129706\nPhantomJS\t129707\n真鬼\t129708\n中国鞋网\t129709\n球型\t129710\n癸巳\t129711\n响屁\t129712\n楚天快报_多媒体报\t129713\n98斤\t129714\n晋军\t129715\n新加坡元\t129716\n66层\t129717\nrelay\t129718\n二位\t129719\n软禁\t129720\n妺\t129721\n安慧里\t129722\n北京市公安局\t129723\n星际特工:千星之城\t129724\n小雄\t129725\n梦幻之星\t129726\n观兰\t129727\n防水材料\t129728\n古笼\t129729\n超过12个月\t129730\n坡度系数\t129731\n颁给\t129732\npjs\t129733\n中耳炎\t129734\n天地壹号\t129735\n中国人民志愿军\t129736\n冷公主\t129737\n试音\t129738\n∥\t129739\n折角\t129740\n坑塘\t129741\n真三国无双8_真三国无双8\t129742\n釋\t129743\n生男生女清宫图\t129744\n非常人贩\t129745\n泡桐\t129746\n升级型\t129747\nUnderwood\t129748\nTellMeWhen\t129749\n心叶\t129750\n星际之门\t129751\n西厂\t129752\n徐晓峰\t129753\n92岁\t129754\n组态王\t129755\nweimob\t129756\n空调板\t129757\n合唱团\t129758\n小屯\t129759\n上海交通大学人文学院\t129760\njsd\t129761\n灵游记\t129762\n育才学校\t129763\n海蜘蛛\t129764\n跆拳道\t129765\n烤冷面\t129766\nmin.js\t129767\n庇护\t129768\n超燃\t129769\n石湾镇\t129770\nCANON\t129771\n绣\t129772\nkinjaz\t129773\n微网站\t129774\nconanwang\t129775\nnvcc\t129776\n绿地湖湘中心\t129777\n王铁军\t129778\n德友圈\t129779\n佟毓婉\t129780\n上海大学悉尼工商学院\t129781\n藤原龙也\t129782\n儿孑\t129783\n大佛寺\t129784\n管板\t129785\n李建东\t129786\n老兄\t129787\n东方网\t129788\n力驱\t129789\n发音表\t129790\n83780269\t129791\n太空生活趣事多\t129792\n泥鳅鱼\t129793\n2.94\t129794\nzhuangshi\t129795\n长沙县\t129796\n霍英东教育基金会\t129797\n味子\t129798\n龙之牙\t129799\n输球\t129800\nformers\t129801\n数据库分库分表\t129802\
n劳动与社会保障专业\t129803\n天邑\t129804\n结婚率\t129805\n语雀\t129806\n八个月\t129807\n古惑\t129808\n晋升\t129809\n神绝\t129810\n余欢\t129811\n华州\t129812\n荣耀V10\t129813\nlottie\t129814\n不可思\t129815\n秦时明月汉时关\t129816\nEcological\t129817\n社火\t129818\n相交\t129819\n2018.3.18\t129820\n印客\t129821\n杭州派出所\t129822\n钱学森图书馆\t129823\n长方型\t129824\nStringGrid\t129825\n渭\t129826\nstrapon\t129827\n上悬式\t129828\n一三\t129829\nYummy\t129830\n含漱液\t129831\n鲁邦三世\t129832\n模拟联合国\t129833\n派潭镇\t129834\n出墙\t129835\n鸿鑫\t129836\n积炭\t129837\n回天无力\t129838\n54个\t129839\n均质性\t129840\n有野\t129841\natob\t129842\n荣枯\t129843\n近身高手\t129844\n95年\t129845\n当下\t129846\n破壁\t129847\n承德医学院附属医院\t129848\n优化篇\t129849\n问卷法\t129850\nbeiw\t129851\n推崇\t129852\n蜜雪儿\t129853\n破晓\t129854\nPixel\t129855\n蒙恬\t129856\nparker\t129857\nrealtek\t129858\n律师业\t129859\n膜法传奇\t129860\n专利费\t129861\nPBM\t129862\n大分子\t129863\n德罗赞\t129864\n密不透风\t129865\n占中\t129866\nUTR\t129867\n北条麻妃\t129868\n刘慧\t129869\n穿越火线cf\t129870\nshili\t129871\n防渗\t129872\nexecutesql\t129873\n泛微网络\t129874\n提前量\t129875\n金庸群侠传2\t129876\n氟化铝\t129877\nchm阅读器\t129878\n太平洋下载网\t129879\n天上人间动漫网\t129880\nCSMA\t129881\n打码兔\t129882\n000997\t129883\n山村老尸2\t129884\n埃迪·雷德梅恩\t129885\n朴草娥\t129886\n70道\t129887\n穿绳\t129888\n贿赂\t129889\n花谢\t129890\n精盐\t129891\n华微电子\t129892\n抽油机\t129893\n五猖会\t129894\n圣元奶粉\t129895\n烫印\t129896\n细狗\t129897\n售罄\t129898\n教育机构\t129899\n1768\t129900\n公敌\t129901\n离线包\t129902\n外运仓\t129903\n超能干\t129904\n39米\t129905\n350g\t129906\n校考\t129907\n达梦\t129908\n南六公路\t129909\n扰动\t129910\n喜歡\t129911\nGFC\t129912\n傲农\t129913\n银湖汽车站\t129914\n阎连科\t129915\n国际标准化组织\t129916\n终极三国2017\t129917\n刘长铭\t129918\n合欢路\t129919\nfelt\t129920\n安防监控系统\t129921\n红菇\t129922\n麦克白夫人\t129923\n1499\t129924\n枪杆子\t129925\n136名\t129926\n神壕\t129927\n模联\t129928\n李玮峰\t129929\n血战\t129930\n蝙蝠侠阿甘之城\t129931\n腾宇\t129932\n中国国际工程咨询有限公司\t129933\n马克亨利\t129934\nerstudio\t129935\n暗黑血统\t129936\n固彩\t129937\n宁强县政府\t129938\n孔子文化节\t129939\nMDPI\t129940\n民族药\t129941\nmultisim\t129942\n读者证\t129943\n相模\t129944\n督瑞尔\t129945\nGeek\t129946\nMelissa\t1
29947\n阿梅\t129948\nPROTEL99SE\t129949\n四川省医学会\t129950\n飞创\t129951\n前五年\t129952\n浙江博物馆\t129953\nProxy\t129954\n微电子电路\t129955\n孙陶然\t129956\n途牛\t129957\n箬横镇\t129958\n兰诺\t129959\n早发现\t129960\n转篇\t129961\n武家\t129962\n张小林\t129963\nBucket\t129964\n360网站卫士\t129965\n黄宾虹\t129966\n引用率\t129967\n清华大学附属中学\t129968\nto_char\t129969\nftd\t129970\n萌宝\t129971\ninpaint\t129972\n硬幕\t129973\n济宁医学院附属医院\t129974\nDialect\t129975\n脑萎缩\t129976\n清新剂\t129977\n胸大肌\t129978\n中铁九局\t129979\n高等教师\t129980\n凯瑟喵\t129981\n3阶\t129982\n污染环境罪\t129983\n南湖梦幻岛\t129984\n搞活\t129985\n铃屋什造\t129986\n常熟汽车\t129987\n宠物类\t129988\n进步主义\t129989\n微功率\t129990\nclans\t129991\nFormer\t129992\n好视通\t129993\n10架\t129994\n12口\t129995\n美誉度\t129996\n晋南\t129997\n大霉素\t129998\n深蓝与月光\t129999\nicecream\t130000\n治吏\t130001\n糗百\t130002\n真神\t130003\n中赫时尚\t130004\n苏老师\t130005\nFLOWER\t130006\n3.5.5\t130007\n五合目\t130008\nNa\t130009\nuc吧\t130010\n中国工商银行\t130011\nZ9\t130012\n赵明明\t130013\n照明灯\t130014\n新开\t130015\n龙芯处理器\t130016\n伊斯兰化\t130017\ncombogrid\t130018\n快穿文\t130019\n脑血管狭窄\t130020\n万大\t130021\nfzero\t130022\n刘大钧\t130023\n透明感\t130024\nRolex\t130025\n勇往直前\t130026\n反射器\t130027\n15万起\t130028\n摹\t130029\nhdmi2.0\t130030\nWinSock\t130031\n85路\t130032\n威斯康星大学麦迪逊分校\t130033\n拍嗝\t130034\n白兰鸽\t130035\n安软市场\t130036\n底洞\t130037\n佳木斯日报新闻网\t130038\n紫菜\t130039\n生态县\t130040\nOrz\t130041\n左敏\t130042\n破题\t130043\n液位控制器\t130044\nPSpice\t130045\ninkjet\t130046\n全民枪战2\t130047\nArvin\t130048\n阴虚火旺\t130049\n液泡\t130050\n断句\t130051\n三觉\t130052\n猎人们\t130053\n通川区\t130054\n航嘉\t130055\n1.00\t130056\n1058\t130057\n巧手\t130058\nj-link\t130059\n学参网\t130060\ncontra\t130061\n弹弹堂\t130062\n直肠腺癌\t130063\n赵钢\t130064\n意险\t130065\n华南师范大学网络教育学院\t130066\n反舰弹道导弹\t130067\n党小组长\t130068\n熊市\t130069\n景帝\t130070\n美颂\t130071\nGrizzly\t130072\n升米恩\t130073\n日本金融厅\t130074\nngrok服务器\t130075\nchangjia\t130076\n胡同\t130077\n颐高数码广场\t130078\n四棱\t130079\nshotgun\t130080\nTcpdump\t130081\n大道寺\t130082\nODIS\t130083\nvip\t130084\nHP1005\t130085\n鲨海\t130086\ncompatible\t130087\n20170709\t130088\n悲壮\t1
30089\n计量认证\t130090\n青橙\t130091\n大理客栈\t130092\nLCM\t130093\nSAP2000\t130094\n真新\t130095\nn++\t130096\ngbase\t130097\n乔杉\t130098\n中央文献研究室\t130099\n蓝淋\t130100\n核相\t130101\n五莲吧\t130102\nSeoul\t130103\n沙头镇\t130104\nplayback\t130105\n0662\t130106\n天池镇\t130107\n一个块\t130108\n瑞祥\t130109\n刘希夷\t130110\n昌平政府网\t130111\n荷叶饭\t130112\n存折\t130113\n肩底\t130114\n机友们\t130115\n渭塘\t130116\n秦基伟\t130117\n性用具\t130118\n乌拉盖\t130119\n月扣\t130120\n御灵\t130121\n风女\t130122\n集体林权\t130123\nestimated\t130124\n补收\t130125\n致远OA\t130126\npt100\t130127\n公用设备工程师考试\t130128\n二居室\t130129\ngtx1050ti\t130130\n迈巴赫S450\t130131\n小米手机5c_MIUI论坛\t130132\n蓝线\t130133\n九鼎新材\t130134\ncollar\t130135\nNIH\t130136\n中国农业网\t130137\n乐檬K3\t130138\n列宁\t130139\n2.1\t130140\njacs\t130141\n联点\t130142\n贵阳路\t130143\n歧路\t130144\n工商银行信用卡中心\t130145\n老康\t130146\n明月楼\t130147\n淡豹\t130148\nspringdata\t130149\n浪涌\t130150\n火坑\t130151\n麦格尼\t130152\nsumatrapdf\t130153\n真账\t130154\nfategrandord\t130155\n全书\t130156\n荣耀分布式路由\t130157\nrpt\t130158\n宣传语\t130159\n900张\t130160\n绿地面积\t130161\n90斤\t130162\n摩尔时空猎人\t130163\nycc\t130164\n短肢剪力墙\t130165\noppo203\t130166\n罗霖\t130167\n人口学\t130168\nwEHt\t130169\n第三只\t130170\n解集\t130171\ngreendao\t130172\n正方\t130173\nsapi\t130174\n湖南省纪委\t130175\n烟囱\t130176\n文电\t130177\nmessager\t130178\nEstimator\t130179\n体彩11选5\t130180\n毛老师\t130181\n特感\t130182\n八月\t130183\nKarin\t130184\n1首\t130185\n商子\t130186\n王雷雷\t130187\nvalmont\t130188\n兴味\t130189\n而已\t130190\n乚\t130191\n中华人民共和国文化部\t130192\nETCP\t130193\n三妻四妾\t130194\n尧治河\t130195\n海南移动\t130196\n闾丘露薇\t130197\n海贼王分析_海贼王\t130198\nDecor\t130199\n护盘\t130200\n雷兽山\t130201\n曼大\t130202\n狼崽\t130203\nwoed\t130204\n钻子\t130205\n狮虎\t130206\n王文军\t130207\nxticks\t130208\n阳山\t130209\n高教社\t130210\n维迪卡尔\t130211\n聚祥春茗茶\t130212\n广州体育馆\t130213\n黄杨山\t130214\n柳菁菁\t130215\n50道\t130216\n马库斯\t130217\n六耳猕猴\t130218\n刘玉婷\t130219\n沙糖桔\t130220\n频域滤波\t130221\n奥迪a8l\t130222\n第八关\t130223\n问询\t130224\n多多大\t130225\n侯长荣\t130226\n160克\t130227\n周晓林\t130228\n双骑\t130229\n留用\t130230\n锈迹\t130231\n水贝\t130232\n车路\t13023
3\n重庆中考网\t130234\nyiyi\t130235\n失态\t130236\n湿哥\t130237\n兀鹫\t130238\n马尔康市\t130239\n选调生考试网\t130240\n良性循环\t130241\n小米电视4C\t130242\n初论\t130243\nGGD\t130244\n外科学\t130245\n天津物产集团有限公司\t130246\n取火\t130247\n美界\t130248\n宾利添越\t130249\n广州六中\t130250\n条幅\t130251\n铜陵一中\t130252\n8月3日\t130253\n自考03708\t130254\n第一百零四章\t130255\n叶红\t130256\n打孔\t130257\n亲子游\t130258\n京杭运河\t130259\nPlywood\t130260\n翻唱集\t130261\n都市华庭\t130262\n变频水泵\t130263\n13条\t130264\n陈埭\t130265\n+12125308303\t130266\n延伸率\t130267\n字体转换网\t130268\n上腔\t130269\n尿色\t130270\n南江县人民政府\t130271\n小九\t130272\n大可不必\t130273\n妈妈的歌\t130274\n白海豚\t130275\n鬼铠\t130276\niPhone6s/6s\t130277\n回来吧\t130278\n二把\t130279\n1万亿元\t130280\n第46页\t130281\n固态硬盘寿命测试\t130282\n转换函数\t130283\n同义\t130284\n雪花屏\t130285\n合同制\t130286\n花开放\t130287\nPFC\t130288\n呼吸道\t130289\n京城网\t130290\n机脚\t130291\n中山北一路\t130292\n耳中\t130293\n李世石\t130294\n聋儿\t130295\n远程会诊\t130296\n全境封锁\t130297\n第七轴\t130298\n六排\t130299\n跟着\t130300\nChiang\t130301\n赚大钱\t130302\nAmit\t130303\n传函\t130304\n广东腾越建筑工程有限公司\t130305\n钢片\t130306\ndl250\t130307\n九爷\t130308\nbroadway\t130309\n北京北京美国大使馆\t130310\n帝豪GS论坛_汽车之家论坛\t130311\n胸上\t130312\n陈建明\t130313\nmarkman\t130314\n簡\t130315\nrestoration\t130316\nAutocad\t130317\nKNOW\t130318\n蛾类\t130319\n药界\t130320\nTodo\t130321\n东莞市人民政府办公室\t130322\n朱宏路\t130323\n以此\t130324\n零零零\t130325\n深圳市司法局\t130326\n6亩\t130327\n对手\t130328\n减肥网\t130329\n963\t130330\n倍体\t130331\n0614\t130332\n0451\t130333\n安健环\t130334\n润喉片\t130335\n顾氏\t130336\n噂\t130337\n性犯罪\t130338\n十二米\t130339\n都灵理工大学\t130340\n蓝光播放机\t130341\nUIImage\t130342\n剧毒化学品\t130343\n旺座\t130344\nwin2012R2\t130345\n宋天佑\t130346\n大数据风控\t130347\n技侦\t130348\n极尚\t130349\nIntegrator\t130350\n英语专四\t130351\n天棚\t130352\nGUANG\t130353\n秦天宇\t130354\n大业\t130355\n湖北省农村义务教育学校\t130356\n罗顿\t130357\n4月11号\t130358\n古拉\t130359\ntup\t130360\n逸\t130361\n众泰z700\t130362\n数据处\t130363\n百度云_\t130364\n花都湖\t130365\n柳宗理\t130366\n一学位\t130367\n佳能m50\t130368\nwindows0\t130369\n倪伟\t130370\n结款\t130371\n4M\t130372\n二手汽车网\t130373\n认怂\t130374\nmax2016\t130375\n3.5小时
\t130376\n58元\t130377\n红番阁\t130378\n猜评\t130379\n凡人\t130380\n韵尾\t130381\n高科技园区\t130382\n中华禅\t130383\n招行信用卡\t130384\nAZD9291\t130385\n4.2G\t130386\n新东方在线网络课堂\t130387\n成都中院\t130388\n月亮石\t130389\n赤裸羔羊\t130390\n三年制\t130391\n地宝网\t130392\n在线字典\t130393\n检查出\t130394\nABDC\t130395\n桥本氏病\t130396\n供卵试管婴儿\t130397\n一庭\t130398\n严斌\t130399\n450分\t130400\nguaranteed\t130401\n盐水\t130402\n英雄无敌手游\t130403\n室内人\t130404\n密码查看器\t130405\nCYC\t130406\n母龟\t130407\n神武_电玩巴士\t130408\n01005\t130409\n舞钢\t130410\n贷款人\t130411\n分式\t130412\n鸡屎藤\t130413\n长春龙嘉机场\t130414\nstimulus\t130415\n拉普\t130416\n上海小学\t130417\n禄丰\t130418\n反胶套胶\t130419\n千龙网\t130420\n耐性\t130421\n手甲\t130422\n2017年9月21日\t130423\ntexmaker\t130424\n十八段\t130425\n南陵县政府\t130426\nBenefit\t130427\n中国神酒\t130428\n天海防务\t130429\n江苏工商局\t130430\n展\t130431\nFinds\t130432\n强奸片\t130433\n卡证\t130434\n沈梦瑶\t130435\n聚氨酯胶水\t130436\n明若黑暗大纪元\t130437\n400万\t130438\nistanbul\t130439\nsqrt\t130440\n新妆\t130441\n桐木村\t130442\n着迷\t130443\nncre\t130444\n彩穴\t130445\n语文知识\t130446\n2017_\t130447\n大师杯\t130448\n烟台万达广场\t130449\n太阳新天地\t130450\n8%\t130451\n覆亡\t130452\n郭芙\t130453\nゴム\t130454\n120ask.com\t130455\n经营活动现金净流量\t130456\n潘维\t130457\n赵双\t130458\n国海证券股份有限公司\t130459\n小王\t130460\n沐雨无尘\t130461\n刘赐贵\t130462\n那方\t130463\n取水器\t130464\n将乐县人民政府\t130465\noath\t130466\n宝安中心区\t130467\n财车\t130468\nVPC\t130469\n二招\t130470\nlead\t130471\nFFSKY-天幻网\t130472\n清水寺\t130473\n醉东风\t130474\nWOW7.1\t130475\n龙城\t130476\n孩儿巷\t130477\n长谷川\t130478\n凤凰镇\t130479\n6页\t130480\n环洱海\t130481\n爱思学\t130482\nlik\t130483\n皇马队\t130484\nredis4\t130485\nMoves\t130486\n得房率\t130487\n溶性\t130488\n矿工费\t130489\n常务副总经理\t130490\n鹿胎膏\t130491\n西塞罗\t130492\n三卫\t130493\nscpx\t130494\n朝阳路\t130495\n卫矛\t130496\n705路\t130497\n海南高速\t130498\nIntellj\t130499\n回忆性\t130500\n事隔\t130501\n广州北\t130502\n招商银行股份有限公司\t130503\n欧姆社\t130504\n明月听风\t130505\n三羊\t130506\nfries\t130507\n弥勒\t130508\n地中\t130509\nx264-WiKi\t130510\n波西·杰克逊\t130511\n旧约\t130512\n高安\t130513\n株洲高新区\t130514\n07553655\t130515\n我的歌声里\t130516\n脑梗\t130517\n宫高\t130518\noceanba
se\t130519\nBaidu\t130520\n爱思苹果助手\t130521\n非法采矿罪\t130522\n电动叉车\t130523\n虹漕路\t130524\nWriters\t130525\n超线\t130526\n大学生村官考试网\t130527\n处治\t130528\n100倍\t130529\n江南新城\t130530\n红花酢浆草\t130531\n深海迷航\t130532\n横贯\t130533\n春棚\t130534\n后文\t130535\nExcel技巧网\t130536\n乐语\t130537\n开国功臣\t130538\nRMB\t130539\n法国人\t130540\nmadewell\t130541\n伊斯兰教\t130542\n兴冲冲\t130543\n澄净\t130544\n密集恐惧症\t130545\nftpserver\t130546\n合盘\t130547\n广发银行\t130548\n医纬达\t130549\n塔勒布\t130550\n兴登堡\t130551\n大马士革\t130552\nmaters\t130553\nencode\t130554\n8025\t130555\nhela\t130556\nアルプス\t130557\n化学奖\t130558\n奔跑吧兄弟第二季\t130559\n省察\t130560\n沾染\t130561\n第03集\t130562\n偷爱\t130563\n什么梗\t130564\n大肠息肉\t130565\n5061\t130566\n大脑皮层\t130567\n变电运维\t130568\ncocktail\t130569\nfog\t130570\n一反\t130571\n萝卜皮\t130572\n南京区\t130573\n2008年1月1日\t130574\n78条\t130575\n弥河\t130576\nBot\t130577\n微量注射泵\t130578\n王GX\t130579\n瑞尔斯\t130580\n溴化锂\t130581\n荣耀娜可露露\t130582\n北京西苑医院\t130583\nnonlinear\t130584\npro8\t130585\nProjector\t130586\n战象\t130587\n几句\t130588\n新目标英语八年级\t130589\n上万种\t130590\n自己的歌\t130591\n苏静\t130592\n小米5\t130593\n一网通\t130594\n赢得\t130595\nwondered\t130596\nstm32cube\t130597\n龟泵丙烯\t130598\n水套\t130599\n八股文\t130600\n浦发信用卡\t130601\nkongjie\t130602\n春朝\t130603\ngol\t130604\nWin95\t130605\n桃屋\t130606\nSNAP\t130607\nMySQL服务器\t130608\neratoho\t130609\ncoolpad\t130610\n排污\t130611\n周庆\t130612\n恋曲\t130613\n美体瘦身\t130614\n正通\t130615\n西安市公安局\t130616\nMy0557.cn\t130617\n1.78G\t130618\n布鲁克林区\t130619\n电源网\t130620\n可大可小\t130621\n地标级\t130622\n哺乳室\t130623\n磷酸奥司他韦胶囊\t130624\n移动魔百盒\t130625\n中国五矿集团公司\t130626\nDHCP\t130627\n游车\t130628\n鼻罩\t130629\n0.1g\t130630\n西地\t130631\n盖度\t130632\n陈三两\t130633\n1.7\t130634\n百度云虚拟\t130635\n天帝\t130636\n腾讯网迷你版\t130637\n纹样\t130638\n5节\t130639\nGeany\t130640\n神州信息\t130641\n德尚\t130642\n售房网\t130643\n昆山站\t130644\n20平方米\t130645\n租凭\t130646\n聚脲涂料\t130647\n十层\t130648\n山水时尚酒店\t130649\n研究人员\t130650\nthat\t130651\n绘里\t130652\n打井机\t130653\n被卡\t130654\n卖股\t130655\n0.5千克\t130656\n木业\t130657\n同煤\t130658\n军需官\t130659\n星际宝贝\t130660\n刘大为\t130661
\n书市\t130662\n公诉人\t130663\n300251\t130664\n8回\t130665\n巧妙\t130666\nConsolas\t130667\nKou\t130668\n广州白云山医药集团股份有限公司\t130669\n龙港\t130670\n250cc\t130671\n郑斌\t130672\n15k\t130673\n学苑路\t130674\n淫妇\t130675\n妈蛋\t130676\nlinix\t130677\n57集\t130678\n96khz\t130679\n30H\t130680\n逗伴\t130681\n红皮书\t130682\n雨花石\t130683\n景峰医药\t130684\n全角\t130685\n摄像人\t130686\n电视盒\t130687\n威海海洋职业学院\t130688\n吉林省人民政府办公厅\t130689\nright函数\t130690\n炮孔\t130691\n荣冠\t130692\n金投期货-金投网\t130693\n六合一\t130694\n熟透\t130695\n搞趣网\t130696\neno16777736\t130697\n金鼎轩\t130698\n6301\t130699\n凤凰花开\t130700\nisolated\t130701\n夜天\t130702\nGRANT\t130703\n泰和园\t130704\n脱碳\t130705\n小南门\t130706\n2017年一月\t130707\n一岗双责\t130708\n红警\t130709\nChemsrc\t130710\n107万\t130711\n日记帐\t130712\nMonash\t130713\n飞常准|\t130714\n什么座\t130715\nt420\t130716\n松开\t130717\n山东省高院\t130718\n胃仙\t130719\n共舞\t130720\n林一木\t130721\n张策\t130722\njse\t130723\n154个\t130724\n养生篇\t130725\n小肚腩\t130726\n由奈\t130727\n人教数学\t130728\n楚风\t130729\n藏海花\t130730\n2016年上半年\t130731\n十起\t130732\n乙女\t130733\n索姆河战役\t130734\n6日\t130735\n飞卢吧\t130736\n临沂新闻网\t130737\n友谊桥\t130738\n20%|\t130739\n时钟\t130740\n食人岛\t130741\nphantom\t130742\n27号\t130743\nBO\t130744\nDB13\t130745\nmodscan32\t130746\n142个\t130747\n下半场\t130748\nPLAN\t130749\n毕业论文范文_\t130750\n三峡水电站\t130751\n36个\t130752\n康耐视\t130753\n湖镇镇\t130754\n信息检索\t130755\n晚\t130756\ntimeStamp\t130757\n喷液\t130758\n梅新育\t130759\n直拼板\t130760\niMore\t130761\n微信编辑器\t130762\n对月\t130763\n沙利度胺\t130764\nsupre\t130765\n羌塘无人区\t130766\n遥墙国际机场\t130767\n翘课\t130768\n放冰\t130769\n肇事案\t130770\n第11页\t130771\n9.31\t130772\n耀\t130773\n阆中\t130774\n法语助手\t130775\niTerm\t130776\npets4\t130777\n内容分析法\t130778\n五牛图\t130779\n103kg\t130780\n2017年1月13日\t130781\n光棍\t130782\nvirtue\t130783\n百达翡丽\t130784\nZED\t130785\n小麦赤霉病\t130786\n谢明\t130787\n账项\t130788\nprocessed\t130789\nWAR3.52PK.COM\t130790\n亚运\t130791\n美魔女\t130792\n回\t130793\n摇荡\t130794\n十年后\t130795\n吉林省民政厅\t130796\nzizi\t130797\n西藏藏医学院\t130798\n西沙群岛\t130799\n24伏\t130800\n九鼎装饰\t130801\n快行\t130802\n金木研喰种\t130803\n冢本昭和\t130804
\n梦之安魂曲\t130805\n春舞\t130806\n非英语专业\t130807\nstole\t130808\n0.5级\t130809\n等一分钟\t130810\n切面编程\t130811\n2018年4月13号\t130812\n幸好\t130813\n宝珀\t130814\n清华阳光太阳能\t130815\n马鲜生\t130816\n麻婆\t130817\n楼栋\t130818\n陈彤\t130819\n地面站\t130820\nZEZE\t130821\n街舞小白\t130822\n慢性湿疹\t130823\nmakemodel\t130824\n园洲\t130825\n金方\t130826\n越洋广场\t130827\n容城镇\t130828\n超级医圣\t130829\n辰欣\t130830\n死海\t130831\nWin10系统版本号\t130832\n广州市食品药品监督管理局\t130833\n两免三减半\t130834\n粘结力\t130835\n生别离\t130836\n陕师大附中\t130837\n5060\t130838\n屌丝们\t130839\n念佛\t130840\neven\t130841\n行李舱\t130842\n朴诗妍\t130843\n杀龙\t130844\n龙之教条吧\t130845\n过一会儿\t130846\nrqt\t130847\n延边\t130848\n江苏省扬州中学\t130849\n标线\t130850\n黄蓝\t130851\n人体蜈蚣3\t130852\nDSM\t130853\nMichelin\t130854\n廉\t130855\n四川建筑职业技术学院\t130856\n桂房网\t130857\n宁武\t130858\n路居\t130859\n57.4%\t130860\n人权宣言\t130861\n议论文\t130862\n中国远征军\t130863\n澜起科技\t130864\n章丘市\t130865\n534号\t130866\ntgs\t130867\nmercedes\t130868\n1.000\t130869\nlibgdx\t130870\n开始\t130871\n一ノ瀬\t130872\n安河桥北\t130873\npdist\t130874\n梁文博\t130875\n雾凇\t130876\n002038\t130877\n第2列\t130878\n应税销售额\t130879\nsqldbx\t130880\n顺德区\t130881\nInterchange\t130882\nMESSAGE\t130883\n爆破音\t130884\n空气阻力系数\t130885\n小寨\t130886\n判处\t130887\n方言版\t130888\n回肠\t130889\nSlay\t130890\nIIHS\t130891\n11.2%\t130892\n仪批\t130893\n惨烈\t130894\n丽水市政府\t130895\n好淫\t130896\n奥特莱斯Outlets\t130897\n福瑞卡\t130898\nHilary\t130899\n播放源\t130900\napm\t130901\n兴修\t130902\n陈列柜\t130903\nHc\t130904\n丹寨\t130905\n整列_\t130906\n3194\t130907\nPonycar\t130908\nRA\t130909\n看云识\t130910\n2016版国家自然科学基金\t130911\n范型\t130912\nparking\t130913\n网链\t130914\n同义字\t130915\n朗读稿\t130916\n3届\t130917\n云龙县\t130918\n130斤\t130919\n西比尔\t130920\n上海港澳通行证\t130921\n红莲寺\t130922\nfaxingw\t130923\nmybitis\t130924\na37\t130925\nJZ5U\t130926\n1.732\t130927\n克隆\t130928\n打字员\t130929\n全市性\t130930\n民房\t130931\n绵山\t130932\n背夹\t130933\n无值\t130934\n不变动\t130935\n小东西\t130936\n抵押权\t130937\n心不在焉\t130938\nFOOD\t130939\nCricket\t130940\n省监察厅\t130941\n待定系数法\t130942\nWii模拟器\t130943\n快递侠\t130944\n普惠性民办幼儿园\t130945\n2018.4.26\t130946\
n肖家河\t130947\n反分裂国家法\t130948\n预缴纳税申报表\t130949\n小腹\t130950\n撸丝\t130951\n补缝\t130952\n张潮\t130953\n枕边书\t130954\n显身\t130955\n费率\t130956\n疯狂的日子\t130957\n坦桑\t130958\n抵\t130959\n海尔卡萨帝\t130960\n人人素材网\t130961\n雾面\t130962\n刁亦男\t130963\n高尔基路\t130964\nSpank\t130965\n土地\t130966\nxy散点图\t130967\n雪缘\t130968\ninnocent\t130969\n媽媽鲁\t130970\npremire\t130971\n黄体酮软胶囊\t130972\n没听\t130973\n可逆性\t130974\nSciTE\t130975\n圣礼\t130976\n久别重逢\t130977\n省中\t130978\n30行\t130979\n华达\t130980\n建馆\t130981\n十三号星期五\t130982\n阿【\t130983\n锦绣江山\t130984\n12方\t130985\nhalf\t130986\n胡可\t130987\n眼镜妹\t130988\n图钉们\t130989\n长辈们\t130990\n游戏结束\t130991\n37.1度\t130992\nVanke\t130993\n语文园地一\t130994\n中央监察委员会\t130995\n天府新谷\t130996\nRobot\t130997\n安奴\t130998\n连读\t130999\n集合体\t131000\n麦考瑞\t131001\nTasker\t131002\nMilo\t131003\n妖僧\t131004\n球壳\t131005\n唐吉可德\t131006\n安全出口灯\t131007\n最令人\t131008\n棉带\t131009\n返义词\t131010\n农办\t131011\n几十倍\t131012\n碎拼\t131013\nlips\t131014\n蒸馏塔\t131015\n150亿美元\t131016\n王伯昭\t131017\n雄霸\t131018\n10来万\t131019\n有图有真相-时尚盒子\t131020\n浊酒\t131021\n黑白红\t131022\n固元\t131023\n上海市地方税务局\t131024\n洗发水\t131025\n周小玲\t131026\nP社\t131027\nreadiness\t131028\n合约案\t131029\n烈火\t131030\n姬狩\t131031\n马志勇\t131032\n珠海市工商行政管理局\t131033\n鸿基\t131034\nmini2440\t131035\n面位\t131036\n金戒指\t131037\n罗技G613\t131038\n48支\t131039\n民安路\t131040\n2016-09\t131041\n余华\t131042\n稳态误差\t131043\n老龙头\t131044\n岩羊\t131045\nY型过滤器\t131046\n小米分期\t131047\n叶檀仓央嘉措\t131048\n东莞消防\t131049\n金牌\t131050\nhomepod\t131051\n异步电机\t131052\n中央电视\t131053\n余热发电\t131054\n熔化\t131055\n摄影展\t131056\n九原区\t131057\n贵州省物价局\t131058\n淘宝联盟\t131059\n小米电视吧\t131060\nDreamMail\t131061\nvmtool\t131062\n自交\t131063\n610m\t131064\n主校\t131065\n第17号\t131066\n40004\t131067\n膝关节炎\t131068\n3连败\t131069\n华创资本\t131070\nddw\t131071\n代替品\t131072\n蛮锤\t131073\n吻君\t131074\n九州天空城\t131075\n热血三国3\t131076\n怡口莲\t131077\ngrounded\t131078\n5千米\t131079\n控制机\t131080\n1.0.35.366\t131081\nlibstdc\t131082\nnorthern\t131083\nlier\t131084\n得手\t131085\n阿那亚\t131086\n台盆柜\t131087\n6任\t131088\norr\t131089\n斟\t131090\n地质类\t13109
1\nCHO\t131092\nCranes\t131093\n2018年1月1日起\t131094\n天才麻将少女\t131095\n任强\t131096\n15幢\t131097\nhyperlynx\t131098\n北京积分落户\t131099\n手压\t131100\n预应力钢束\t131101\n仙剑五外传\t131102\n就是要\t131103\n新台币\t131104\n红红火火\t131105\n美国斯坦福大学\t131106\n明慧\t131107\n草布\t131108\n守护车\t131109\n欧蕙\t131110\nDil\t131111\n红与黑\t131112\n双峰县人民政府\t131113\n点招\t131114\n官文\t131115\n人民公安\t131116\n房率\t131117\n犁头\t131118\n1.46G\t131119\nWang\t131120\n凤凰涅磐\t131121\ndesignated\t131122\n存缴\t131123\n后市\t131124\n半月湾\t131125\n包重\t131126\n庙湾岛\t131127\n憨豆\t131128\n4300u\t131129\n2436\t131130\n上海中医药大学图书馆\t131131\n郭锦荣\t131132\n猜题卷\t131133\nmarket\t131134\n解放大道\t131135\n爱鼻日\t131136\n门垛\t131137\n减灾\t131138\n玄兵\t131139\nFlexbox\t131140\n补选\t131141\n巧妇难为无米之炊\t131142\n卡莉法\t131143\n紫罗\t131144\nCivilization\t131145\n邗江\t131146\nytb\t131147\n百余万\t131148\nCEMU模拟器\t131149\n消旋山莨菪碱片\t131150\nhenu\t131151\n火车时刻表\t131152\n定位销\t131153\n阿克汉\t131154\n抚松县\t131155\n闲不住\t131156\nopiumud\t131157\nvnc\t131158\n拿破仑蛋糕\t131159\n谢君豪\t131160\n濡湿\t131161\n太华山\t131162\n合胜\t131163\n刘宏\t131164\n处见\t131165\n祸端\t131166\n三学\t131167\n爽身粉\t131168\n起势\t131169\n指甲盖\t131170\n岐山县\t131171\n中华铁路\t131172\n爱网\t131173\n感应器\t131174\nTRIZ\t131175\n星官\t131176\n芜湖市政府\t131177\n大阳\t131178\n蹲\t131179\n利福\t131180\n阿斗\t131181\n浙江工业大学\t131182\ng4\t131183\n李成梁\t131184\n私单\t131185\n果粉\t131186\n7.0|\t131187\n动力软件园\t131188\n夏橙\t131189\n成铭\t131190\n图文并茂\t131191\n水蚤\t131192\ntextout\t131193\n胡凯莉\t131194\n连句\t131195\n吓鬼\t131196\nf463\t131197\nForming\t131198\n塑料周转箱\t131199\n烤房\t131200\n屏息\t131201\n10.2.2\t131202\n迅雷BT种子/ED2K\t131203\n养森瘦瘦\t131204\n请与废柴的我谈恋爱\t131205\n古宅\t131206\n麦粒\t131207\nwww.87sk.net/modules/article/txtarticle\t131208\n翁帆\t131209\nchalk\t131210\n楼盘表\t131211\nacting\t131212\n系物\t131213\n1194\t131214\nProbes\t131215\n挂帐\t131216\nfreecodecamp\t131217\n30条\t131218\n女歌手\t131219\n失手绳\t131220\n6262\t131221\nSOSG\t131222\n蓝莓汁\t131223\n朱文公\t131224\n果机\t131225\ndatafram\t131226\n邦威\t131227\nSTAX\t131228\nintercultural\t131229\n扁头\t131230\n高邮市人民政府\t131231\nkernel32.lib\t13
1232\n康沃尔\t131233\n红桃\t131234\n附校\t131235\n大联\t131236\n平心静气\t131237\ncart\t131238\n中红\t131239\n小处男\t131240\n手臂\t131241\n进班\t131242\nVI设计\t131243\n海伦天麓\t131244\n搞笑电影\t131245\n奴工\t131246\n四分体\t131247\nntr\t131248\n唐斩\t131249\n泰坦\t131250\n广场协议\t131251\nreimbursement\t131252\n提点\t131253\n迅雷磁力\t131254\n兴办\t131255\n相待\t131256\n2018lck\t131257\n拿样\t131258\n正佳广场\t131259\n20150321\t131260\nBD720P+BD1080P\t131261\n弹词\t131262\n瞬态分析\t131263\npioneer\t131264\n折纸玫瑰\t131265\n郭庆\t131266\n兰斯洛特\t131267\n吉山村\t131268\n广田集团\t131269\nrook\t131270\n汉语国际教育专业\t131271\n六月份\t131272\n堂吉高云翔\t131273\nEurail\t131274\n学习通\t131275\nJager\t131276\n宽带连接_\t131277\n99_\t131278\n侯鸿亮\t131279\n茶树油\t131280\n六省\t131281\n愿做\t131282\nVR资源网\t131283\nGarrix\t131284\nkaren\t131285\n万石镇\t131286\n奥视\t131287\nTEFL\t131288\n217个\t131289\nv5.4\t131290\n高差\t131291\n友谊商城\t131292\n清唱版\t131293\n黑脚\t131294\nbnc\t131295\n过氧化物\t131296\n涅盘\t131297\nHiVi\t131298\n1100万\t131299\n何春\t131300\n曹路镇\t131301\nHTCAD\t131302\n他的话\t131303\n氧分子网\t131304\n行侠仗义\t131305\n慢性萎缩性胃炎\t131306\n无锡信捷电气股份有限公司\t131307\nfgets函数\t131308\n开鲁县\t131309\ng奶\t131310\n团本\t131311\n云南南路\t131312\n校外\t131313\n发飘\t131314\njiedian\t131315\n贝加\t131316\n引燃\t131317\n条绒\t131318\n云涌\t131319\n杨洁篪\t131320\n大迈X7\t131321\n鸡油\t131322\n天人合一\t131323\n极品女性网\t131324\n303\t131325\n卡莱特\t131326\n1.20\t131327\n礻\t131328\nCornerstone\t131329\n6品\t131330\n作秀\t131331\n紫龙\t131332\nCUDev\t131333\n俄勒冈州\t131334\n谭耀文\t131335\n中信湘雅\t131336\n法栏\t131337\nMBR格式\t131338\n属相婚\t131339\n平安科技\t131340\n金华\t131341\n大岗镇\t131342\n2013—2014年度\t131343\n法院\t131344\n彼得林奇\t131345\n资中筠\t131346\nSPF30\t131347\niCourt\t131348\n工作制\t131349\n夺日者\t131350\narmed\t131351\n毛明\t131352\n7栋\t131353\n人影\t131354\nPolar\t131355\n埃微\t131356\n赵军\t131357\n芝樱\t131358\n1487\t131359\n侬好\t131360\n墨石\t131361\n阁子\t131362\n今日何日兮\t131363\nRecognized\t131364\nonresume\t131365\n善友\t131366\n尊称\t131367\n0751\t131368\n租车\t131369\nバス\t131370\n草药\t131371\n第1_\t131372\nSX1278\t131373\n中国人民大学在职研究生\t131374\n爆米花网\t131375\npagination\t131376\n
租赁税\t131377\n现场版\t131378\n火丁\t131379\n牙结石\t131380\n副主席\t131381\nROY\t131382\nsva\t131383\n石阡\t131384\n牛子裤\t131385\n宁波市医疗中心李惠利医院\t131386\nBFC\t131387\n淮滨县\t131388\n滕萍\t131389\n上海漫威\t131390\n变了心\t131391\n绝不后悔\t131392\ng65\t131393\nlady\t131394\n沙霸\t131395\nLEGAL\t131396\n祷告会\t131397\n亓\t131398\n欢欣\t131399\nzhengce\t131400\nscholastic\t131401\n佐鼬\t131402\n逼人\t131403\n壁饰\t131404\nproceedings\t131405\n七月与安生\t131406\n诸子\t131407\n泗门\t131408\n鲁子\t131409\n助困\t131410\nwxapp\t131411\n1-3小时\t131412\ncollectionviewcell\t131413\nshang\t131414\n汉口\t131415\n‰\t131416\nUVC\t131417\nlotus\t131418\n美女房客爱上我\t131419\n微泡攻略\t131420\n普攻\t131421\n阿拉丁试剂网\t131422\n尧舜\t131423\n外球面轴承\t131424\ncx5\t131425\n英淘\t131426\nruningman\t131427\n联想ideacentre\t131428\n处女身\t131429\n测分\t131430\n40枚\t131431\n街球\t131432\n难料\t131433\n澳门站\t131434\n颜体字\t131435\nopgg\t131436\n微信运动刷步数\t131437\n布达佩斯之恋\t131438\n潘登\t131439\n哲学研究\t131440\njave\t131441\n9850\t131442\n汉库克\t131443\n观音心咒\t131444\n上回\t131445\n煤仓\t131446\n壁咚\t131447\npcle\t131448\nscb\t131449\nbalancer\t131450\n易果\t131451\n里瓦尔多\t131452\n踢馆\t131453\n永乐\t131454\n实验台\t131455\n就是\t131456\n160G\t131457\n2014-2015年\t131458\n泽拉\t131459\n费马大\t131460\n养蜂之家论坛\t131461\n程序史\t131462\n安全教育日\t131463\n在线支付\t131464\n制药厂\t131465\n钯碳\t131466\nOwens\t131467\nfirst\t131468\n广灵\t131469\n琴码\t131470\n周健\t131471\n千秋雪\t131472\n周希汉\t131473\n天津大学化工学院\t131474\nWindowsAPI\t131475\n素描画\t131476\n刘悦\t131477\n有值\t131478\n小骆驼\t131479\n即将\t131480\n三秦都市报\t131481\n浦外附小\t131482\n泰安开发区\t131483\n顾浩\t131484\n59岁\t131485\n神仙道\t131486\n皱纹\t131487\naod\t131488\nfiletyp\t131489\n番茄红\t131490\n威尔胜\t131491\nalexa域名\t131492\nBerry\t131493\n敬文\t131494\n质询\t131495\n内资股\t131496\n扑克之星\t131497\n黎曼\t131498\n诸天纪\t131499\nbbs\t131500\n落第骑士英雄谭\t131501\n纪元娱乐\t131502\n巴花\t131503\n璧山网\t131504\n月星集团\t131505\n第一创业\t131506\nVba\t131507\n中华人民共和国道路交通安全法实施条例\t131508\n女朋友们\t131509\n祭祀\t131510\n老银\t131511\nskimming\t131512\n朗达·拜恩\t131513\n69升\t131514\n樂園\t131515\nRBAC\t131516\n深爱\t131517\nHMC\t131518\npsig\t131519\n妇婴\t131520
\n重遇\t131521\n2024年\t131522\n携程卡\t131523\n7.02\t131524\n申硕\t131525\n渊\t131526\n黑章\t131527\nps2吧\t131528\n重庆园博园\t131529\n平天湖\t131530\n瘾\t131531\n無毛\t131532\n招拍\t131533\n算了吧\t131534\n米兰德比\t131535\nsfm\t131536\n91康\t131537\n丰巢快递柜\t131538\nforensic\t131539\n西青\t131540\n鲜血\t131541\nlwIP\t131542\n茶库\t131543\n月下夜想曲\t131544\n迈凯轮\t131545\n复式过欠压保护器\t131546\n幻想家\t131547\n5.98\t131548\n优帮云\t131549\n巢湖学院\t131550\n汉字\t131551\n新婚之夜\t131552\n亲密爱人\t131553\nworm\t131554\n毛不易\t131555\n求职路\t131556\ngolfing\t131557\n出言\t131558\n金口诀\t131559\n1965\t131560\n数百万美元\t131561\n鲁美\t131562\n冰队\t131563\nBeamer\t131564\nResistant\t131565\nSET\t131566\n堕落者\t131567\n卵磷脂\t131568\n07073剑灵洪门\t131569\n南乐县\t131570\n6w\t131571\nmatrox\t131572\nruan\t131573\n6.3.1\t131574\n汕头火车站\t131575\n新发明\t131576\n竞猜币\t131577\n华山景区\t131578\n贡嘎机场\t131579\n领峰\t131580\n兔玩网\t131581\n哈里森·福特\t131582\n恒隆地产\t131583\n有用\t131584\nredirect_uri\t131585\ndenote\t131586\nFumeFX\t131587\ns2000\t131588\n相随\t131589\n望洞庭湖赠张丞相\t131590\n027\t131591\n赔礼\t131592\n厦门兴润星贸易有限公司\t131593\n分担\t131594\n给出出\t131595\n扎破\t131596\nxterm\t131597\n北大第一医院\t131598\n搬进来\t131599\n5.6.3\t131600\n分数的意义\t131601\n第43届\t131602\nz9mini\t131603\n蜜耻母\t131604\nUG\t131605\nD7000\t131606\n儿童滑板车\t131607\n200多元\t131608\n北京中医药大学第三附属医院\t131609\nfoucs\t131610\n吭\t131611\n.avi\t131612\nShaved\t131613\n慈溪市区\t131614\n毫无顾忌\t131615\n斐多\t131616\n谚语\t131617\n彭于晏\t131618\n平面直角坐标系xoy\t131619\n真假美猴王\t131620\n戈德曼\t131621\n铁甲小宝\t131622\n北京百度网讯科技有限公司\t131623\nctcc\t131624\n李茂龙\t131625\nswag\t131626\n宝马拿铁\t131627\n进\t131628\n月嫂证\t131629\n匡威鞋\t131630\n云南能投\t131631\nLIMBO\t131632\n千位\t131633\n大耻度\t131634\n公消\t131635\n沙漠靴\t131636\nPOP子\t131637\nZune\t131638\n丰收年\t131639\nx战警吧\t131640\n心韵\t131641\n耗水\t131642\n5月19日\t131643\n小新社区\t131644\n立威\t131645\ngongzi\t131646\n六氟化硫断路器\t131647\nDoinb\t131648\n葛存壮\t131649\n濒\t131650\nHentai\t131651\n20160917\t131652\n益力\t131653\n圣盾\t131654\n氨基酸奶粉\t131655\n麦克马斯特大学\t131656\n紫外线消毒器\t131657\n安乐镇\t131658\n最低工资标准2017\t131659\n两个\t131660\n旁白\t131661\n文理\t131662\n
块体\t131663\n津人社\t131664\n无圣\t131665\nbubuko\t131666\n就业率\t131667\n科学园\t131668\n钱伟\t131669\nRestoration\t131670\n网状\t131671\n联想Y460\t131672\nWeb接口\t131673\n招标信息平台\t131674\n杀生院\t131675\n票务\t131676\nkano\t131677\nEDG电子竞技俱乐部\t131678\n正用\t131679\n扩词\t131680\n化学药品\t131681\n二科\t131682\nsci影响因子\t131683\nqq聊天机器人\t131684\nInter\t131685\n传说\t131686\n7.0.7\t131687\n卡比丘\t131688\n金荞麦片\t131689\n生殖细胞\t131690\n九号\t131691\n画一画\t131692\n墓王之王寒铁斗主题曲MV_墓王之王悬棺寺\t131693\n桃源政府\t131694\n1.5.7\t131695\nweebly\t131696\n浦阳江\t131697\n泥浴\t131698\n1845年\t131699\nsubclass\t131700\n还书\t131701\n文字版\t131702\n恒流\t131703\n四甲基氢氧化铵\t131704\n法学研究\t131705\nLands\t131706\n魁北克市\t131707\nsurf\t131708\n卓绝\t131709\nPassage\t131710\n纯净物\t131711\n全国青联\t131712\n东方小区\t131713\n希斯罗机场\t131714\n民盟\t131715\n14周岁\t131716\nWINNER\t131717\n7寸\t131718\n20170319\t131719\n高枕无忧\t131720\n0.2.3\t131721\n激动网\t131722\n中介\t131723\n严慧明\t131724\n网站后台管理系统\t131725\n牙石\t131726\n采访\t131727\n陈列馆\t131728\n河北省人大常委会\t131729\n科技报\t131730\n再认识\t131731\nvaginal\t131732\nBizarre\t131733\n长篇小说\t131734\n新鸿路\t131735\nScripting\t131736\ntours\t131737\n平整\t131738\n文胜\t131739\n泗水县\t131740\n城市公园\t131741\n恒等式\t131742\n3DMAX\t131743\n康明斯\t131744\n6.8.0\t131745\n保利国际影城\t131746\n土地神\t131747\n公共经济学\t131748\nfreexx性\t131749\n屐\t131750\nngo\t131751\n梁桥\t131752\n可调试\t131753\n最早\t131754\n李士群\t131755\n超低温\t131756\n高速计数器\t131757\n湖南科技职业学院\t131758\n北京市疾控中心\t131759\n园庆\t131760\n肾盂肾炎\t131761\n两三千\t131762\n_言\t131763\n圣天诺\t131764\n中原地产\t131765\n9件\t131766\n自助洗车机\t131767\n广西银行\t131768\n姜强\t131769\n兼听则明\t131770\n板数\t131771\n萧忆情\t131772\nShandong\t131773\nDance\t131774\n第六回\t131775\nds2019\t131776\n39522078\t131777\n古利\t131778\n孙晔\t131779\nMainland\t131780\n下边\t131781\n嘲颅\t131782\n石井镇\t131783\n乌孜别克族\t131784\n中华人民共和国国家旅游局\t131785\n援军\t131786\n208路\t131787\npt2\t131788\n广东菜\t131789\n黄璜\t131790\n依赖性\t131791\n天黑了\t131792\n面数\t131793\nPopcorn\t131794\n海湾\t131795\n默片\t131796\n凯特·阿普顿\t131797\nology\t131798\naltitude\t131799\n美行思远\t131800\n你身边\t131801\n瑞泉\t131802\n得捷电子\t131803\n
归零者\t131804\n提到\t131805\nCORBA\t131806\n多样式\t131807\n何强\t131808\nsize\t131809\n最强临高启明\t131810\n中铁诺德滨海花园\t131811\n人教新课标\t131812\n辉煌明朝那些事儿\t131813\n第六页\t131814\n著书\t131815\n大部份\t131816\n滑屏\t131817\n3.9.5\t131818\n3D3S\t131819\n上方山\t131820\n冰火\t131821\n抵消\t131822\n铜盘路\t131823\n茶禅\t131824\n郯\t131825\nrule\t131826\n三华李\t131827\n计算表\t131828\n排课\t131829\nlisted\t131830\n石羊河\t131831\n浙江省卫计委\t131832\n第三十四次\t131833\n建信人寿\t131834\n颈椎\t131835\n375\t131836\nsmallpdf\t131837\n下南洋\t131838\n英菲尼迪qx60\t131839\nhotel\t131840\n轶\t131841\n硝基苯酚\t131842\n洋葱骑士\t131843\n20150427\t131844\n每10分钟\t131845\n新世纪大学英语\t131846\n南模\t131847\nddu\t131848\nopenfile\t131849\n佩内洛普·克鲁兹\t131850\n省纪委监委\t131851\n长坪沟\t131852\n搅乱\t131853\n观点类\t131854\n刻字机\t131855\n小规模\t131856\n圣书\t131857\n云端\t131858\n_福建省国家税务局\t131859\naladdin\t131860\n广州亚运城\t131861\n簇绒\t131862\n古城区\t131863\nRESTClient\t131864\n李长青\t131865\n阿拉丁aladdin\t131866\n云亮\t131867\n鸟飞\t131868\n回表\t131869\n生字\t131870\n陆臻\t131871\nnasa\t131872\nfreescale\t131873\n江苏经贸职业技术学院\t131874\n黄甲镇\t131875\n_湖\t131876\n834集\t131877\n猪尾巴\t131878\n广纳\t131879\n老城厢\t131880\nlady8844.com\t131881\n碛\t131882\n常州东\t131883\n花圃\t131884\n奥达\t131885\nlidan\t131886\n阵亡\t131887\n九段\t131888\n德天\t131889\n我的世界MC\t131890\n手表\t131891\nGIF表情包\t131892\n277万源\t131893\n冠桥\t131894\natd\t131895\nabigcockman\t131896\n老人节\t131897\nCONTENT\t131898\n星战7\t131899\n文轩网\t131900\n深坑酒店\t131901\n卡册\t131902\n2019年9月\t131903\n福清新闻网\t131904\n张航\t131905\ndoin\t131906\n牙痛\t131907\n疏忽\t131908\nLick\t131909\n浦东南路\t131910\n胖男\t131911\nYAN\t131912\n铜鼓县人民政府\t131913\npackaged\t131914\nrose\t131915\n发套\t131916\n紫玉兰\t131917\n自由振动\t131918\n通断\t131919\n温湿度记录仪\t131920\n漆原智志\t131921\n四川站\t131922\nePrice\t131923\nperimeter\t131924\n野模\t131925\nAion--17173\t131926\n藜芦\t131927\n刘鹏程\t131928\n贵州省政府\t131929\n没差\t131930\n学平险\t131931\n洪世贤\t131932\n侠盗猎车手4\t131933\n瓦岗\t131934\n排废\t131935\n44度\t131936\n猪猪\t131937\n拜耳医药保健有限公司\t131938\n8万公里\t131939\nlittlesubgirl\t131940\n雁山区\t131941\n走马\t131942\n作业成本法\t131943\n埃文·蕾切尔·伍德\t131944\n重瞳
\t131945\n品佳\t131946\n新竹园中学\t131947\n碰超\t131948\n隆源\t131949\njiu.163.com\t131950\n古力\t131951\n李公子\t131952\n图片版\t131953\n双擎\t131954\n奇台\t131955\n惕\t131956\n克里斯托弗·诺兰\t131957\n平方差公式\t131958\n官匹\t131959\n鬼太郎\t131960\nibase4j\t131961\n象画\t131962\n3902\t131963\n轩辕Rowboat\t131964\n返味\t131965\n范特西\t131966\nKK之家\t131967\n幸福小镇\t131968\nIrene\t131969\noffice2007免费版\t131970\n割伤\t131971\n药农\t131972\n贺敬之\t131973\njidan\t131974\n云计\t131975\n农家媳\t131976\ngen3\t131977\n伸伸\t131978\n紫炎\t131979\nApartments\t131980\n国网福建省电力有限公司\t131981\n山东教育\t131982\n冯英\t131983\n青出于蓝\t131984\n25行\t131985\n分分\t131986\nSkylake\t131987\nNMT\t131988\n7RE\t131989\n2016年4月1日起\t131990\n成名战\t131991\n有悖\t131992\n矸石\t131993\n催眠术\t131994\n唐馨\t131995\nbumblebee\t131996\n王丽达\t131997\n大湾镇\t131998\n鱼游到了纸上\t131999\n史麦斯\t132000\n流经\t132001\nthats\t132002\n祝塘镇\t132003\n安质部\t132004\n无线耳机\t132005\nSEASON\t132006\n余律师\t132007\n孕前体检\t132008\n佳业\t132009\n莉莉丝\t132010\n莫干山景区\t132011\n事业单位领导人员管理暂行规定\t132012\n头位\t132013\n磁粉\t132014\n羊安\t132015\n矮人族\t132016\n发烧友们\t132017\n岳云\t132018\n旷世\t132019\n开发处\t132020\n臀径\t132021\n卡罗拉双擎\t132022\n权贵\t132023\n信贷资产证券化\t132024\n快乐购\t132025\nC+\t132026\n骄阳\t132027\n3430\t132028\n何况\t132029\n达内教育\t132030\n修仙文\t132031\n辽宁师范大学\t132032\n全国代\t132033\n克雷斯波\t132034\n德沃夏克\t132035\n老君威\t132036\n吉林省商务厅\t132037\n一般户\t132038\n土工布\t132039\n高利贷公司\t132040\n微动画\t132041\n北新泾\t132042\n友谊路街道\t132043\n二重性\t132044\n格拉芙\t132045\n韦千里\t132046\n约克公爵\t132047\n蠢蛋进化论\t132048\nnt5\t132049\n萌爪\t132050\n周康\t132051\n云南省人大常委会\t132052\n细胞系\t132053\nDOD\t132054\n风龙\t132055\n挑战\t132056\n齐鲁石化\t132057\n长安欧尚欧尚A800\t132058\nlearn\t132059\n879\t132060\n金图\t132061\n91.2\t132062\n落客\t132063\nonshow\t132064\n异思趣\t132065\n舟车劳顿\t132066\n代表作\t132067\n免驱版\t132068\n线衫\t132069\n合肥大学\t132070\n美嘉\t132071\n胃黏膜\t132072\nMiniconda\t132073\n武汉工程科技学院\t132074\nTCI\t132075\n猝死案\t132076\n双倍余额递减法\t132077\nrtrim\t132078\n意拳\t132079\n页_海报时尚网\t132080\n色氨酸\t132081\n休假日\t132082\n橡胶坝\t132083\n男客\t132084\n鱼缸\t132085\n一幅幅\t132086\n花场\t132087\n天上天下无双刀\t132088\n达晨\t
132089\n淞南\t132090\n)贸易有限公司\t132091\nkiss\t132092\n长江出版社\t132093\nventure\t132094\nperfmon\t132095\n陈辉阳\t132096\ngongzuo\t132097\nIHC\t132098\n爱问知识人\t132099\n10865.com\t132100\nstew\t132101\nwebpack\t132102\nBIOLOGY\t132103\n河北北方学院附属第一医院\t132104\n上证50\t132105\n8.9.0\t132106\n圣母婊\t132107\n二四六天\t132108\nTeamViewer\t132109\n带工\t132110\n国联\t132111\n哥们\t132112\n刘锐\t132113\n巨献\t132114\n暝\t132115\n劳务税\t132116\n青岛中学\t132117\n要疯了\t132118\n准入证\t132119\ntypical\t132120\n钟塔\t132121\n平叛\t132122\n满勤奖\t132123\n月光\t132124\n一马当先\t132125\nLAMY\t132126\n客流\t132127\n4.54\t132128\n函数式编程\t132129\n生产型\t132130\n综艺感\t132131\n能源\t132132\n梓山湖\t132133\n漆\t132134\n李荣浩\t132135\n门可罗雀\t132136\neigrp\t132137\n济南日报\t132138\nThrift\t132139\n玉佛\t132140\n海马万古仙穹\t132141\n劳动和社会保障部\t132142\n米花\t132143\nthank\t132144\n3494\t132145\n贪官\t132146\nスペ\t132147\n酷派大神F1\t132148\n三刃\t132149\n埃塞克斯\t132150\n16500\t132151\n碳酸镁\t132152\n财产险\t132153\n榆林市榆阳区人民政府\t132154\nVOLUME\t132155\n蹦床\t132156\n粮商\t132157\n潮汕菜\t132158\n鬼哭神嚎\t132159\n私生子\t132160\n科雷\t132161\n草果\t132162\n意大利\t132163\nm2m\t132164\n质保期\t132165\n大校\t132166\n暂列\t132167\n转让费\t132168\n知名度\t132169\n连夜\t132170\n研究性\t132171\n2016三\t132172\n旅游界\t132173\n隐攻\t132174\n呜咽\t132175\n于月仙\t132176\n远红外线\t132177\n饵块\t132178\nhif\t132179\n环世界A18\t132180\n回复语\t132181\naccrued\t132182\n索斯机械兽\t132183\nEdge论坛\t132184\n张艳丽\t132185\n一盏灯\t132186\n苏州国际科技园\t132187\n不堪其忧\t132188\n专变\t132189\n二五项\t132190\n2几个\t132191\n科技学院\t132192\n新干县政府\t132193\n西安报业传媒集团\t132194\n河南省安全生产监督管理局\t132195\n蔡蔡\t132196\n普析\t132197\n波浪形\t132198\ntobii\t132199\n子线\t132200\n沪江辞典\t132201\n电动四轮车\t132202\nHikaru\t132203\n2018年1月24日\t132204\n诗琪\t132205\nWampServer\t132206\n吴声\t132207\n授权信\t132208\n两值\t132209\nDeskjet\t132210\n丹尼尔\t132211\n经销\t132212\nK-Means\t132213\n仪器箱\t132214\n浩辰云图\t132215\n电子\t132216\n净宅\t132217\n密室逃脱类\t132218\njenkis\t132219\n稀有车\t132220\nOracle数据库\t132221\nv2.4\t132222\n黑_\t132223\n草原天路\t132224\n阳火\t132225\n金蚕\t132226\n参与性\t132227\n时尚\t132228\n平安财险\t132229\ndark\t132230\n夏庄\t132231\n点地址\t13
2232\n梁山镇\t132233\n17k小说网\t132234\n中国农村网\t132235\n华烨\t132236\n九州风神大霜塔\t132237\n6座\t132238\n玉液\t132239\n阿贡火山\t132240\n兰兰\t132241\nonStop\t132242\n喜爱夜蒲4\t132243\nGroupon\t132244\n个人理财\t132245\nSendCloud\t132246\n麦哥\t132247\n强制式\t132248\n寻医问药网肿瘤科\t132249\n慈鲷鱼\t132250\n保监会\t132251\nCodeWeblog\t132252\n起卦\t132253\n棹歌\t132254\n土地转让合同\t132255\n趣医网\t132256\n2封\t132257\n四川省监狱管理局\t132258\n晓波\t132259\n不动产登记簿\t132260\n行星齿轮\t132261\niPad/\t132262\n宏润建设\t132263\n试驾\t132264\nmanor\t132265\n开手\t132266\n喷口\t132267\n第108期\t132268\n沈伟\t132269\n中超风云手游\t132270\n马树\t132271\n秘术\t132272\nfunction\t132273\n婆子\t132274\n土炕\t132275\ndorsal\t132276\n娄山关\t132277\n十册\t132278\n2017夏\t132279\n冰牙\t132280\n退赛\t132281\n2016.4\t132282\n白罗斯\t132283\n代省长\t132284\n顽主\t132285\n明珠湖\t132286\n慢热\t132287\n插叙\t132288\n李暮夕\t132289\n轮入\t132290\n余杰\t132291\nAcademics\t132292\n汉江集团\t132293\n搞笑GIF\t132294\n黄村镇\t132295\n张可盈\t132296\n52PK游戏网\t132297\n大象城\t132298\n不羁\t132299\n莼湖\t132300\n九三学社中央委员会\t132301\n人寿保险理赔\t132302\n最后面\t132303\n因果\t132304\n2016.2.4\t132305\n百年品牌\t132306\ntubing\t132307\n办公类\t132308\n丽桑卓\t132309\n世界观\t132310\n死神vs火影\t132311\n北京市昌平区中西医结合医院\t132312\n邮政编码_八九网\t132313\n一114\t132314\n某师\t132315\n哈大\t132316\n左孝凌\t132317\n陈娇\t132318\n3640\t132319\n主句\t132320\n仪表\t132321\n修仙之路\t132322\n汉诺\t132323\n秘码\t132324\nmtc\t132325\n万事\t132326\n跺\t132327\n洪山镇\t132328\n高堡奇人\t132329\nSheeran\t132330\n酒\t132331\n有凤来仪\t132332\n情爱片\t132333\n雀巢公司\t132334\nterry\t132335\n12MM\t132336\nSte\t132337\nbuyers\t132338\nLXDE\t132339\n宏文件\t132340\nMPAndroidChart\t132341\n进口报关单\t132342\n鼓槌\t132343\n圆锁\t132344\n监护室\t132345\nmiller\t132346\n铜仁市\t132347\n食品干燥剂\t132348\n81pan.com\t132349\n智道\t132350\nheterogeneity\t132351\nVSO\t132352\n出战\t132353\n灰人\t132354\n党史故事100讲\t132355\n马杰斯特\t132356\nVOLAB\t132357\n医路\t132358\n新东西\t132359\n陈涉\t132360\n1单\t132361\n35例\t132362\n欧美图区\t132363\n敏于\t132364\n全红\t132365\n美乐乐家居\t132366\nHome键\t132367\n双馈\t132368\n化工有限公司\t132369\n顺治通宝\t132370\n坐庄\t132371\n赭石\t132372\n一不\t132373\n宝鸡市陈仓区人民政府\t132374\n黄波\t1
32375\n三水新城\t132376\n不限号\t132377\nwhy\t132378\n平衡术\t132379\n沧海仙途\t132380\n2017年9月17日\t132381\n小鸟\t132382\nCD版\t132383\n成员变量\t132384\nlabo\t132385\n李勋\t132386\n戏班\t132387\nCAR-T疗法\t132388\n毛管\t132389\n草稿纸\t132390\n湖南网络工程职业学院\t132391\npermits\t132392\n五重\t132393\n20150719\t132394\n棉量\t132395\n记词\t132396\n钯金\t132397\n长老\t132398\nG460\t132399\n冀东水泥\t132400\n发苦\t132401\n敏捷\t132402\n巴士\t132403\n三湖缸\t132404\ndonghua\t132405\n4段\t132406\n6970\t132407\n咕噜咕噜\t132408\n疏散\t132409\nVariations\t132410\n4.9级\t132411\n始得西山宴游记\t132412\n含谷\t132413\n王靖\t132414\n分析部\t132415\n熹妃q传\t132416\n科宝\t132417\n离心式压缩机\t132418\n1080x1920\t132419\n蜒\t132420\n夏平\t132421\n人工台\t132422\nmiyu\t132423\n遛\t132424\n贝因美\t132425\n该管\t132426\n知园\t132427\n金庸群侠传\t132428\n中共福建省委党校\t132429\npicacg\t132430\n45分钟\t132431\n桂中\t132432\n界首市委\t132433\nDHI\t132434\nnc65\t132435\n高拉特\t132436\n金九银十\t132437\nnvarchar\t132438\n机键\t132439\n我愿意\t132440\n进行\t132441\n金太阳鹦鹉吧\t132442\n达摩院\t132443\n李老汉\t132444\nfbr\t132445\n金熙\t132446\nlfy\t132447\n插排\t132448\n物业管理条例\t132449\n隔年\t132450\n霍邱\t132451\n户口市\t132452\n詹妮弗安妮斯顿\t132453\n大水车\t132454\nCNBLUE\t132455\n保本\t132456\narrangements\t132457\n九龙坡\t132458\n张志勇\t132459\n明王\t132460\n皮革展\t132461\n荷重\t132462\n三栋镇\t132463\n六千块\t132464\n金狗\t132465\n反斗联盟\t132466\n用力过猛\t132467\n鲁珀特\t132468\n李水华\t132469\n天风证券股份有限公司\t132470\n免税额\t132471\n66号\t132472\nORACLE11g\t132473\n腾冲市\t132474\n三人称\t132475\n被曝\t132476\n上海市注册会计师协会\t132477\nBites\t132478\n游泳池水\t132479\n600亿美元\t132480\n北京东方国信科技股份有限公司\t132481\n钟晓生\t132482\n老兵们\t132483\n环孢素\t132484\n朔料\t132485\n郑灵犀\t132486\n上海社\t132487\nbzxz\t132488\n引爆\t132489\n月经杯\t132490\nRokid\t132491\n合体技\t132492\n工料机\t132493\n才貌双全\t132494\n百度房产\t132495\ng20\t132496\n仙居新闻\t132497\n椰树\t132498\n鲜汤\t132499\n锦绣江山全国旅游年票网\t132500\n盐酸小檗碱片\t132501\n旅馆式\t132502\n林武樟\t132503\n站酷网\t132504\n小毛\t132505\nC3-XR\t132506\n爱川美里菜\t132507\nAmnesia\t132508\n诗尼曼\t132509\n4.3GHz\t132510\n友好度\t132511\n恐怖黎明\t132512\n小米电视3s\t132513\n服装城\t132514\n构念\t132515\n15例\t132516\n路浩\t132517\n保付\t132518\nKMSA
uto\t132519\n防潮垫\t132520\n纸皮箱\t132521\n钛晶\t132522\n求生之路4\t132523\n1984年\t132524\n360智能摄像机\t132525\n4A级景区\t132526\n壹周\t132527\n惠州火车站\t132528\n工控\t132529\n信益\t132530\n塘河\t132531\n小游园\t132532\nxmr-stak\t132533\n新博\t132534\n苦乐村官\t132535\n地漏\t132536\n李瑨瑶\t132537\n神马乐团\t132538\n实例化\t132539\n五清\t132540\nclimacool\t132541\n15名\t132542\n人畜\t132543\n索具\t132544\n胥渡\t132545\n沃尔沃xc60\t132546\n漫本\t132547\n彩虹集团\t132548\nHOLDING\t132549\n我的媳妇是女王\t132550\n思睿\t132551\n餐纸\t132552\n梨斗\t132553\n普铁\t132554\n3737\t132555\n2U\t132556\n场主\t132557\n前度\t132558\n埃舍尔\t132559\ngoogle地图\t132560\n恋家\t132561\n谢师恩\t132562\n上身\t132563\nbochs\t132564\n打手板\t132565\n广水市\t132566\nCry5\t132567\n菲多\t132568\n九江银行\t132569\nZacks\t132570\n附谱\t132571\nzhimi\t132572\nmira\t132573\n飞友\t132574\n音蜗\t132575\n中兴天机\t132576\n声海\t132577\n北京朝阳公园\t132578\nsolidwork2016\t132579\n颤\t132580\n报应\t132581\n时尚资讯网\t132582\n亿阳信通\t132583\n尾桑\t132584\n南方快报\t132585\n张耒\t132586\n维新运动\t132587\n喜威\t132588\n万劫\t132589\n三公斤\t132590\n分工合作\t132591\n上海华东理工大学\t132592\n醉醺醺\t132593\n营销型网站\t132594\nRS485\t132595\n液相色谱柱\t132596\n说服性\t132597\n益菌\t132598\n联合国大会\t132599\n个数\t132600\n活下来\t132601\n张天赐\t132602\n戴佩妮\t132603\nLaboratories\t132604\n鹏润大厦\t132605\n白羊\t132606\n风语筑\t132607\nHiLink\t132608\n杨千桦\t132609\n杨志敏\t132610\n飞鹰\t132611\niclone\t132612\nvinyl\t132613\nJANE\t132614\nE罩杯\t132615\n鱼船\t132616\n联合国安理会\t132617\nDOG\t132618\nTuition\t132619\n墨西\t132620\n广府古城\t132621\n南京人事考试网\t132622\n拱顶罐\t132623\nforming\t132624\n图卢兹\t132625\n蛇杖\t132626\n自力更生\t132627\n中国旅行社\t132628\n贺绿汀\t132629\nautodyn\t132630\n肃南县\t132631\n杨桃\t132632\n立志\t132633\n国家自然科学奖\t132634\n氮气\t132635\n失踪\t132636\nStateflow\t132637\n泰戈尔\t132638\nbrave-sailor\t132639\n蛐蛐罐\t132640\n牛津译林版\t132641\n墨飞\t132642\n上海远慕生物科技有限公司\t132643\n集中帖\t132644\n49.0.2623.112\t132645\np100\t132646\n扩散性百万亚瑟王\t132647\nh62\t132648\nlytwajue\t132649\n2168\t132650\n送医\t132651\n自恋\t132652\n霜雪\t132653\n举手\t132654\n快送\t132655\n公因式\t132656\n布局图\t132657\n中华人民共和国证券投资基金法\t132658\n时局\t132659\n鼻沟\t132660\nNPN\t132661\na
rmitage\t132662\n编制费\t132663\n知局\t132664\n动议\t132665\n博元\t132666\n828d\t132667\nproverb\t132668\n战狼1\t132669\n微滤膜\t132670\n滨海站\t132671\n梦王国与沉睡的100王子\t132672\n兰伯特\t132673\n卖钱\t132674\nMUCH\t132675\n冬荫功\t132676\n乐拍商城\t132677\n增值税进项税\t132678\n千手扉间\t132679\n质量标准\t132680\n000_\t132681\nStormerZ\t132682\n最简真分数\t132683\n知青岁月\t132684\n纳米盘\t132685\n51yuansu.com\t132686\n西门子工控\t132687\n金表\t132688\n顺风旅游网\t132689\n霍尔系数\t132690\n8篇\t132691\nUAG\t132692\n经济案\t132693\n江海区\t132694\nSQL服务器\t132695\n工证书\t132696\n海事大学\t132697\n宁夏广播电视大学\t132698\n罗正\t132699\n钢笔画\t132700\n无止境\t132701\nobservation\t132702\n肾科\t132703\nsparkstreaming\t132704\n临商银行\t132705\nphp-好库网\t132706\n63位\t132707\n中心矩\t132708\nv-bind\t132709\n小孟\t132710\n潮装\t132711\nFlash\t132712\n佳丽宝\t132713\n数控车床编程\t132714\n全国少工委\t132715\n民主法治村\t132716\n速速\t132717\nnginx负载均衡\t132718\nTyre\t132719\n1856年\t132720\n量杯\t132721\n反选\t132722\nbitvise\t132723\n客场\t132724\n沧江\t132725\n第一\t132726\n网络营销\t132727\n众生丸\t132728\n贵夫人\t132729\n北京小桔科技有限公司\t132730\n犯病\t132731\nContour\t132732\n永和\t132733\n火影忍者ol\t132734\n烫手山芋\t132735\n管控\t132736\nkeepalived\t132737\njsczxy2\t132738\n东方医院\t132739\n安娜柏林\t132740\n增值税认证\t132741\n遵义医学院附属医院\t132742\n和风细雨\t132743\ngamit\t132744\n创新创业学院\t132745\n汗渍\t132746\n租船\t132747\n屋顶式\t132748\n凸缘\t132749\n空调业\t132750\n葡萄论坛\t132751\n凝固剂\t132752\n广饶吧\t132753\nv7000\t132754\n无力\t132755\nMutable\t132756\n罗莎莉亚\t132757\n苏堤春晓\t132758\n用印\t132759\n人犯\t132760\n王雯\t132761\n锦州火车站\t132762\n开场秀\t132763\nGOT\t132764\nhall\t132765\nrestfull\t132766\n印绶\t132767\n雪乳\t132768\n安德的游戏\t132769\n排列组合\t132770\n4.2.0\t132771\n形象照\t132772\n挑檐\t132773\n医学版\t132774\n贝尔加湖\t132775\n千万家\t132776\nangulajs\t132777\n幸福云阳网\t132778\n五四宪法\t132779\nライオン\t132780\n克苏鲁\t132781\n大额\t132782\n安能物流\t132783\n告状\t132784\n邵明\t132785\nz250\t132786\n曼萨诺\t132787\nbotton\t132788\n点水\t132789\n酷家乐吧\t132790\n读点\t132791\nOpenSNS\t132792\n子区\t132793\nKow\t132794\n光影秀\t132795\n俗人\t132796\ngns3\t132797\nAccountant\t132798\n僵尸毁灭工程\t132799\n南无阿弥陀佛圣号\t132800\n鞭策\t132801\n软微\t132802
\n旅行证件\t132803\n链战\t132804\n雪花秀\t132805\n苏菲亚公主\t132806\n千山路\t132807\nioncube\t132808\n构树\t132809\n射流风机\t132810\n【众泰T500】众泰众泰T5002018款\t132811\n张汉\t132812\n吴秀波\t132813\n炭火烤肉\t132814\n7队\t132815\n皮袋\t132816\nvera\t132817\n6.84\t132818\n珍爱网\t132819\n安监员\t132820\n奥灶面\t132821\np1s\t132822\n卤粉\t132823\n新鸥鹏教育城\t132824\n椒江区\t132825\nbjca\t132826\nmakati\t132827\n长安欧尚配件\t132828\n成都双流区\t132829\n算出来\t132830\nforcing\t132831\n孔云龙\t132832\n张建成\t132833\nOPPOR11\t132834\n小米Note/顶配\t132835\n埃斯特\t132836\n巡演\t132837\nrepeater\t132838\nDTI\t132839\n黄浦区\t132840\nforeign\t132841\n护臀膏\t132842\n雨夜花\t132843\n小义\t132844\n正体字\t132845\n苏格拉\t132846\n必思\t132847\n护栏\t132848\n特蕾莎·梅\t132849\nammonium\t132850\n港姐\t132851\n映秀\t132852\n方量\t132853\n硫酸镍\t132854\n故意伤害罪\t132855\n干政\t132856\n叶子猪斗战神\t132857\n合欢花\t132858\n5分钟前\t132859\n5108\t132860\n小米手机2/2S\t132861\n1.&#160\t132862\n三类\t132863\n亠\t132864\n2017年12月22日\t132865\navbt\t132866\n心坟\t132867\n10-30\t132868\nMans\t132869\n二图\t132870\n何钦铭\t132871\nBDS\t132872\n狗碗\t132873\nDecideC.dfm\t132874\n奢宠\t132875\nsuoning\t132876\n须佐能乎\t132877\nwww.3199.cn\t132878\nxlsx文件\t132879\nIR\t132880\n电影评介\t132881\ndecompressing\t132882\n墩柱\t132883\n正式\t132884\n报案\t132885\n这个梗\t132886\n电力猫\t132887\nCenterOS\t132888\n乌兰托娅\t132889\nSplines\t132890\n新聞-PLAYNO.1玩樂達人討論區\t132891\n全物\t132892\n潮剧\t132893\n永兴县\t132894\n宅在家\t132895\n财女\t132896\n王绍伟\t132897\n王晓刚\t132898\n幸福时光\t132899\n哈尔滨市发改委\t132900\n完美世界国际版\t132901\n动乱\t132902\nmgr\t132903\n盯盯\t132904\n两秒\t132905\n梅花易数吧\t132906\nhistone\t132907\nparfor\t132908\necotect\t132909\nwidespread\t132910\n欢乐好声音\t132911\n米其林餐厅\t132912\ndsd\t132913\n希勒\t132914\n游戏业\t132915\n受教\t132916\n全级\t132917\nWilkins\t132918\n信保\t132919\nThinkCMFX\t132920\n希纳斯\t132921\nuncertain\t132922\n天堂II\t132923\n明文规定\t132924\n权少\t132925\n徐烨\t132926\n南四环西路188号\t132927\n喜鹊\t132928\n阿鼻\t132929\n山西消防\t132930\nlibmysql\t132931\n44rt\t132932\n1.3.4\t132933\n硬格\t132934\n缓急\t132935\n下丘脑\t132936\n点触\t132937\n酸奶味\t132938\n广州体育中心\t132939\n彩云国\t132940\n亚博\t132941\n华南师范大学
附属中学\t132942\n好的\t132943\n红宝石\t132944\n鸡男\t132945\n古林\t132946\n两千个\t132947\n吃糖\t132948\n沪江问答\t132949\nOgilvy\t132950\nBNS\t132951\n内涵村\t132952\n济南东站\t132953\n山东晨鸣纸业集团股份有限公司\t132954\n拉丁文\t132955\n幽灵行动4:未来战士\t132956\n2178\t132957\n凯迪拉克总统一号\t132958\n仓储笼\t132959\nliwei\t132960\n2则\t132961\n复古版\t132962\nTender\t132963\n20场\t132964\n现眼\t132965\n3670\t132966\n&lt\t132967\nCOUNTRY\t132968\n爱情魔发师\t132969\n340g\t132970\n万东医疗\t132971\n自由泳\t132972\n公卫执业医师考试\t132973\n浪费时间\t132974\nDefence\t132975\n読\t132976\n镇东\t132977\n蹦蹦\t132978\n美泉\t132979\n由我\t132980\n上海甄准生物科技有限公司\t132981\n12306改签\t132982\n攻丝机\t132983\n网易游戏平台\t132984\njrebel\t132985\n少林\t132986\n美咖\t132987\nARASHI\t132988\n开玩笑\t132989\nelp\t132990\n折叠\t132991\nu盘启动盘\t132992\n电视剧集\t132993\nmarch\t132994\n_巴士暗黑3\t132995\n许一鸣\t132996\n保卫师\t132997\nva-11\t132998\n爱羽客\t132999\n车体\t133000\nex+\t133001\n参股\t133002\n切题\t133003\nDDR\t133004\n铜银\t133005\n原式\t133006\n名车\t133007\nStoya\t133008\n3019\t133009\n苦短\t133010\nOcclusion\t133011\n中国铁建青秀城\t133012\n别克阅朗\t133013\n仙剑奇侠传五\t133014\n努巴尼\t133015\n1541\t133016\nTST活酵母\t133017\n中国石拱桥\t133018\n协方差\t133019\n阀座\t133020\n抓取\t133021\n证券分析\t133022\n汉莎航空\t133023\n制成品\t133024\n肚量\t133025\nTGideas\t133026\nCocoaChina\t133027\n金浩森\t133028\n肉皮\t133029\n2799\t133030\n小時\t133031\n清华大学工程物理系\t133032\n全览\t133033\n390元\t133034\n紫菱\t133035\n研究站\t133036\n电池组\t133037\nFRED\t133038\nLigui丽柜\t133039\n罩杯\t133040\n泉眼\t133041\n邹鲁\t133042\nsysdate\t133043\nkuang\t133044\n幽游白书\t133045\n阿爸\t133046\n磺酸基\t133047\n两个个\t133048\n作证\t133049\ndoro\t133050\n莱迪亚\t133051\n浙江省交通投资集团有限公司\t133052\n255.255.255.248\t133053\n红袖添香_阅文集团\t133054\n5位\t133055\n爆痘\t133056\n背隙\t133057\n罗莉\t133058\n肝部\t133059\n海峡两岸关系\t133060\n不端\t133061\n费孝通\t133062\n营口港\t133063\n塔王之王\t133064\n斯卡依\t133065\n香花桥\t133066\n天府三街\t133067\noffensive\t133068\n野生菌\t133069\n杜拉\t133070\n上海市妇联\t133071\n什么意\t133072\npredicted\t133073\n振华中学\t133074\n权力观\t133075\n护膝\t133076\n木遁\t133077\n菜杰\t133078\n0619\t133079\ndocuprint\t133080\n普兰特\t133081\n可颂\t133082\nJohann\t133083
\n益阳市人民政府\t133084\nROSE\t133085\nQV\t133086\ndevops\t133087\ndx200\t133088\n旗滨集团\t133089\n横隔板\t133090\n神警\t133091\n陈锦石\t133092\n赵孟\t133093\n8624\t133094\n蔡律\t133095\n5030\t133096\n极乐空间\t133097\ncdrx\t133098\n一亿多\t133099\n认知篇\t133100\n盛夏的果实\t133101\n、、、\t133102\nEstimating\t133103\n雀跃\t133104\n乌贼骨\t133105\n加拿大大使馆\t133106\n朝内大街\t133107\n复测\t133108\n费森\t133109\n星河战队:火星叛国者\t133110\n叶桐\t133111\n牙体牙髓科\t133112\nTechnologies\t133113\n勤练\t133114\nallows\t133115\n潮汕人\t133116\n2017年4月20日\t133117\n解放碑\t133118\n呜呜\t133119\nFrame\t133120\nloveless\t133121\n舞坊\t133122\n镜像包\t133123\n滕州房产超市网\t133124\n中性点\t133125\n宽窄巷\t133126\n鱿鱼圈\t133127\nardiuno\t133128\n红五月\t133129\n可视对讲系统\t133130\nhgt\t133131\n大连大学附属中山医院\t133132\n道奇汽车\t133133\n黄鹂鸟\t133134\n捞金\t133135\n真菌性\t133136\n91.5\t133137\n头边\t133138\n科创大厦\t133139\naliexpress\t133140\n普象网\t133141\nCPP\t133142\n大力神杯\t133143\n一位数\t133144\n知后觉\t133145\n萌鸡小队\t133146\n马院\t133147\n血压仪\t133148\n我们都一样\t133149\nSUSE\t133150\ngtx1050\t133151\n展览业\t133152\nAmos\t133153\n正丁基锂\t133154\n求现\t133155\n28卷\t133156\n夏多雷\t133157\n战域\t133158\n掌付通违章网\t133159\n一查\t133160\nJBoss\t133161\nCv\t133162\n空置\t133163\ndeposition\t133164\n省电力公司\t133165\nDiaries\t133166\n7千米\t133167\nFurther\t133168\n门版\t133169\n草棚\t133170\n百分位数\t133171\n第3篇\t133172\n轮缘\t133173\n招募\t133174\n人教版五\t133175\n中空板\t133176\n假钞\t133177\n打孔器\t133178\n洗澡澡\t133179\nChess\t133180\nDIYzhan\t133181\nKing\t133182\n第18位\t133183\n伊哈洛\t133184\n中国工程网\t133185\n情深缘\t133186\n大姆\t133187\n洗煤机\t133188\n维文输入法\t133189\n万顷沙镇\t133190\n阿里股份\t133191\n转迷\t133192\n风恋\t133193\n研究生入学考试\t133194\n陈海波\t133195\n560D\t133196\n天下第一楼\t133197\n勃朗峰\t133198\n政办\t133199\n飞斧\t133200\n_页游\t133201\n燕尾港\t133202\n解脲脲\t133203\nbt5\t133204\n跨域\t133205\nkeypoint\t133206\n奔爱\t133207\n12dora\t133208\n棕榈园\t133209\nCAD自学网\t133210\n挥毫\t133211\n梦幻之星2\t133212\n小马宝莉紫悦\t133213\ncPanel\t133214\n烹饪发烧友\t133215\n圣莫里茨\t133216\nhund\t133217\n挂耳咖啡\t133218\n话事\t133219\n雪弗板\t133220\n湖北省住建厅\t133221\n蒲甘\t133222\n红外相机\t133223\n周小姐\t133224\nMPT\t133225\n前提\t133226\
n见证\t133227\n孙思邈\t133228\n明杰\t133229\nStiletto\t133230\n派悦坊\t133231\n按揭贷款\t133232\nGizmo\t133233\n卖淫女\t133234\n千山万水\t133235\n八一学校\t133236\n党的女儿\t133237\n商贸版\t133238\n男图\t133239\n大连医科大学\t133240\nC40\t133241\n旅游法\t133242\nSKT\t133243\ncd补丁\t133244\n犯\t133245\n教会\t133246\n浙A\t133247\n神经性厌食症\t133248\n旅行壶\t133249\nsimultaneous\t133250\n搅匀\t133251\n财政\t133252\n土木类\t133253\n江西省体育局\t133254\n撞铃\t133255\n和平新闻网\t133256\n南禅\t133257\n亚显微结构\t133258\n黄体酮注射液\t133259\n叛逆性\t133260\n火影天殇\t133261\n瀑布\t133262\n火柴人联盟\t133263\n黄河长江\t133264\n邋遢\t133265\nshopspring\t133266\nurl.cn\t133267\nwifi万能钥匙\t133268\n伊万\t133269\n冕\t133270\nwndhw\t133271\n外围设备\t133272\ndms\t133273\n可馨\t133274\n胶条\t133275\nlooking\t133276\nWritable\t133277\n老城镇\t133278\n26米\t133279\n创兴\t133280\n东方财富证券\t133281\n邵光禄\t133282\n随型\t133283\nabthtm\t133284\n有形\t133285\n富贵病\t133286\n修美乐\t133287\n曹洞宗\t133288\n笛\t133289\n豆绿色\t133290\n找不到\t133291\n41位\t133292\n万标立方米\t133293\n算路\t133294\nRule\t133295\n院感\t133296\n奔奔mini\t133297\n喜爱夜蒲2\t133298\nNicky\t133299\n超级稻\t133300\n操器\t133301\n1DX\t133302\n2016年11月10日\t133303\n110斤\t133304\n易出\t133305\n先后顺序\t133306\n小钰\t133307\n险费率\t133308\nsmtp\t133309\n人函\t133310\nBootcamp\t133311\n官渡镇\t133312\nWhatIs\t133313\nDuo\t133314\n青瓦台\t133315\nk10\t133316\n智金\t133317\npatek\t133318\n杨钰麦迪\t133319\n进货单\t133320\n国家形象\t133321\n房产抵押贷款\t133322\n苦其心志\t133323\n易迅\t133324\n114个\t133325\n金家岭\t133326\n速算法\t133327\nmagne\t133328\nemirates\t133329\n3440\t133330\n50微米\t133331\n洛克莫丹\t133332\n备择\t133333\n杰青\t133334\n前5分钟\t133335\n一声令下\t133336\n和平小学\t133337\n杭州地铁7号线\t133338\n班尼\t133339\n汪源\t133340\nUnity3D#Unity\t133341\n石评梅\t133342\n20170615\t133343\n大学霸\t133344\n考昔片\t133345\nPUBG\t133346\n上前\t133347\n超威电池\t133348\nconfiguration\t133349\n朵花\t133350\nabrasive\t133351\n2011-2013年\t133352\n扫略\t133353\n小郎\t133354\n纸老虎\t133355\nflash8.0\t133356\n名手\t133357\n艾达币\t133358\n口是心非\t133359\nTDI\t133360\n48%\t133361\n六少\t133362\n虎啸\t133363\n喜爱夜蒲3\t133364\n反帝\t133365\njoytokey\t133366\n新能源公司\t133367\nXpress\t133368\n劳逸结合\t1333
69\n黄兴镇\t133370\npan\t133371\n天花病毒\t133372\nTVG\t133373\nmaxent\t133374\n李屹\t133375\n永磁同步电动机\t133376\n竹炮\t133377\n结石病\t133378\nwithin\t133379\nsourcemap\t133380\n胡桃木色\t133381\n季季\t133382\n华西医科大学\t133383\n铁轨\t133384\n分类法\t133385\n假爱\t133386\n连云港市公安局\t133387\n别恋\t133388\n威海市\t133389\n先天性巨结肠\t133390\nCIMC\t133391\npregnant\t133392\nSOE\t133393\n奔富407\t133394\n金敏珠\t133395\n回购网\t133396\ntcp连接数\t133397\nXun\t133398\n饿鬼\t133399\n差异化竞争\t133400\n十五句\t133401\npix2pix\t133402\n章节-163小说网\t133403\nstrictly\t133404\n奥山\t133405\n刀把\t133406\n玩具展\t133407\n分社\t133408\n五百块\t133409\n6374\t133410\n多明戈\t133411\n脚臭\t133412\nexclusive\t133413\n三福\t133414\nmegacli\t133415\n天涯儿\t133416\n点怪\t133417\n石塘村\t133418\n囧囧囧\t133419\nselinux\t133420\n2MM\t133421\n熔铸\t133422\n丙型肝炎病毒\t133423\n治安管理处罚法\t133424\n报亭\t133425\n20160424\t133426\n集中度\t133427\n杨依\t133428\nWatch-MacX\t133429\n不要急\t133430\n不孤单\t133431\n林颖\t133432\n总称\t133433\n无中\t133434\n光钎\t133435\nOffensive\t133436\n燕子山\t133437\nquark\t133438\nuint64\t133439\n兵力\t133440\n天玄\t133441\n智能科学与技术\t133442\n权数\t133443\n工学椅\t133444\n当机\t133445\n盈余公积\t133446\n198元\t133447\n小威\t133448\ni18n\t133449\n戏剧\t133450\n理财公司\t133451\n苦劝\t133452\n几顿\t133453\n统计资料\t133454\n行窃\t133455\n乐高机械组\t133456\n南京市金陵中学\t133457\nIMAX3D\t133458\n北京演艺专修学院\t133459\n美职篮\t133460\n远射\t133461\n长沙市雨花区人民政府\t133462\n桃腮\t133463\njins\t133464\n力战\t133465\nWolverine\t133466\n真皮\t133467\nmate8\t133468\n拒绝令\t133469\n红钱\t133470\n20150714\t133471\n中国国际学校\t133472\n土林\t133473\n手绘版\t133474\ntelephony\t133475\n清明节后\t133476\n中国中纺集团公司\t133477\n清水理\t133478\n积雪苷霜软膏\t133479\n昆明地区\t133480\n印字\t133481\n克劳斯\t133482\n东沙湖\t133483\n1n4148\t133484\n剪力墙\t133485\n奖券\t133486\n夜神\t133487\n虐身\t133488\n汁\t133489\n燕双鹰\t133490\nprinting\t133491\n经发\t133492\n人民电器\t133493\n义勇兵\t133494\n哈密\t133495\n梅菜扣肉饼\t133496\n红色警戒2科技时代\t133497\n\\x00\t133498\n武汉大学文学院\t133499\n35张\t133500\n区划\t133501\n库房\t133502\n结构工程师考试\t133503\n迈瑞公司\t133504\n一草一木\t133505\n陈春\t133506\n爱情骗子我问你\t133507\n散弹枪\t133508\nDrying\t133509\n芳华\t133510\n物流类\t13351
1\n西安音乐学院\t133512\n痊愈\t133513\n众森\t133514\n四周年\t133515\n山东大学附属生殖医院\t133516\n摩根财团\t133517\n电子邮件\t133518\n重庆商铺网\t133519\n出租车\t133520\n红袜\t133521\n德雷克\t133522\n符合性\t133523\n余浩\t133524\n数据层\t133525\n4x6\t133526\n安眠\t133527\n棉绒\t133528\n龙蛇演义吧\t133529\nEXP\t133530\ncentos6\t133531\n孤独的美食家\t133532\n库安装包\t133533\nfop\t133534\n便民\t133535\n广州赛意信息科技股份有限公司\t133536\ndynatrace\t133537\nxocde\t133538\n湖南青马\t133539\n戴爱玲\t133540\n大妞\t133541\n法律性\t133542\n恶俗\t133543\nicns\t133544\n北医\t133545\n白洋\t133546\n阴道镜检查\t133547\n冬春篇\t133548\ntight\t133549\nコミック\t133550\n国有林场\t133551\n颍州区\t133552\n席慕容\t133553\n九分之四\t133554\n岷东新区\t133555\n直剑\t133556\n疾风知劲草\t133557\n浩创\t133558\nsuperhero\t133559\n景山学校\t133560\n内退\t133561\nbeginners\t133562\n河北省地方税务局网上办税中心\t133563\n牛剑\t133564\n唯\t133565\n熊猫TV\t133566\n毛多\t133567\n稻花\t133568\n撞伤\t133569\n文华学院\t133570\n冰系\t133571\n虚似\t133572\nAMD,Ryzen\t133573\n春气\t133574\n素萨满\t133575\nQuartet\t133576\n儿童安全座椅\t133577\n美毛\t133578\n紫檀木\t133579\n气质\t133580\n可取现\t133581\n毒术\t133582\n电视节\t133583\n胤禛\t133584\nlogminer\t133585\n发热盘\t133586\n东盟网\t133587\n逢\t133588\n几杯\t133589\n会合\t133590\n导管架\t133591\n1月份\t133592\n190mm\t133593\n汉纸\t133594\n鹏华前海万科\t133595\n塘鲺\t133596\nf561\t133597\n艺术节\t133598\n两厅\t133599\n李小牧\t133600\n厂站\t133601\n降降\t133602\n专题辑\t133603\nyast\t133604\n工蚁\t133605\n市场营销类\t133606\n无表情\t133607\n红耳鹎\t133608\n学会思考\t133609\n万豪集团\t133610\nCELINE\t133611\n奥巴梅杨\t133612\n魔兽冰封王座\t133613\n内存变量\t133614\n申龙\t133615\n旅游攻略\t133616\n萨尔斯堡\t133617\n琉璃河镇\t133618\n芙兰\t133619\n40s\t133620\nLET\t133621\n联通IP地址\t133622\n江门一中\t133623\n字字珠玑\t133624\nwicked\t133625\n仿古\t133626\n04.01\t133627\n2018年03月24日\t133628\n洪门\t133629\n卓一航\t133630\n说及\t133631\n第34话\t133632\n棕褐色\t133633\n东莞站\t133634\n电源滤波器\t133635\nSIMS\t133636\n天天逗事\t133637\n致命打击\t133638\n横坡\t133639\n神记\t133640\n无人声\t133641\n锐志论坛\t133642\njee\t133643\n金盏乡\t133644\n株洲日报\t133645\n银河奇异果\t133646\n吸烟者\t133647\n克劳福德\t133648\n琅琊镇\t133649\nSupervisor\t133650\n形似\t133651\n介石\t133652\n太平洋建设集团\t133653\n天涯明月刀ol太白\t133654\n效果篇\t133655\n高科园\
t133656\nsessions\t133657\n12.19\t133658\n软剑\t133659\nsm951\t133660\n滋贺县\t133661\n程雷\t133662\n商务港\t133663\n保护垫\t133664\n书房\t133665\n李玮\t133666\n额尔古纳\t133667\n大西厢\t133668\n手续\t133669\n深入虎穴\t133670\n聚酯纤维吸音板\t133671\n桃红葡萄酒\t133672\n酷我K歌\t133673\n自建站\t133674\n义乌小商品城\t133675\n涉黑\t133676\n3144\t133677\nwin10红警2\t133678\n西游记\t133679\nframemaker\t133680\n外治版\t133681\n方便袋\t133682\nTIM\t133683\n近百万\t133684\njunit单元测试\t133685\n丁慧\t133686\nR134a\t133687\n揭西信息网\t133688\n14元\t133689\n134个\t133690\n自然人独资\t133691\n王文龙\t133692\njqury\t133693\n历史学院\t133694\n158个\t133695\n车斗\t133696\n引\t133697\n梦幻金庸\t133698\njobsDB\t133699\n上海民办华二初级中学\t133700\nDefeat\t133701\n王者荣耀高端局\t133702\nxxo\t133703\n倒档\t133704\n五四红旗团支部\t133705\n再分\t133706\n朕的江山\t133707\nBuddhism\t133708\n12吋\t133709\n可移\t133710\n反结账\t133711\n彩版\t133712\n芯纱\t133713\nShp\t133714\n手贴\t133715\n武宁县\t133716\n综合化\t133717\n粉类\t133718\n非空值_\t133719\n大邑县\t133720\n东华门\t133721\npc3\t133722\n安阳殷墟\t133723\niMacros\t133724\n信以为真\t133725\nLiza\t133726\n若想\t133727\n粉花\t133728\nitextpdf\t133729\n九龙虫\t133730\n金融分析师\t133731\n崇文门\t133732\n渝黔铁路\t133733\n星际精灵蓝多多\t133734\n流水式\t133735\n白朗县\t133736\n金地格林格林\t133737\n边关\t133738\n圈点\t133739\n雌神\t133740\n炖品\t133741\n美团\t133742\nYANCHENG\t133743\n盖印\t133744\n地址\t133745\n双园\t133746\n罗口\t133747\n皇弟\t133748\n七十多万\t133749\n二手市场_网优二手网\t133750\n守尸\t133751\n水堂\t133752\n北京律协\t133753\nCASS\t133754\n严重\t133755\n商业秘密网\t133756\n戴胜\t133757\n簪花\t133758\n而入\t133759\n中班\t133760\nvoyager\t133761\n西城红场\t133762\n魔固版\t133763\nPian\t133764\n励志语\t133765\n主药\t133766\n翠西\t133767\n频宽\t133768\n天平路\t133769\nIM\t133770\n挤淤\t133771\n提交\t133772\ngmake\t133773\n胡德\t133774\n邓玉华\t133775\n牌技\t133776\ndbutils\t133777\n嘻哈#\t133778\n17小时\t133779\n邰伟\t133780\n焙烧炉\t133781\n津区\t133782\n脂肪族\t133783\napplescript\t133784\n签字页\t133785\n代理记账管理办法\t133786\n日产NV200\t133787\n结算书\t133788\n少府\t133789\n冠状沟处\t133790\nlb\t133791\n魔姬\t133792\n股坛\t133793\n袭击\t133794\n第45期\t133795\n药品管理法\t133796\n导购员\t133797\n超帅\t133798\n倪海夏\t133799\n边梁\t133800\n今年5月\t133801\nDNA亲子
鉴定\t133802\n题君\t133803\n彩田路\t133804\n8898\t133805\n金书红颜录\t133806\n顺络电子\t133807\n0459\t133808\n法拉利\t133809\n孙涛勇\t133810\n美丽中国\t133811\n刘长江\t133812\n五月天演唱会\t133813\n上海交通大学海外教育学院\t133814\n镇人民政府\t133815\nsitter\t133816\n吉安市公安局交警支队_吉安市公安局交通警察支队\t133817\n万头\t133818\n鬼咒\t133819\n斑鱼庄\t133820\n肉蟹煲\t133821\n湖北人事考试网\t133822\n一路向南\t133823\n夜翼\t133824\npinko\t133825\nPrepar3D-模拟飞行论坛\t133826\n1618\t133827\ndatasnap\t133828\n达美康\t133829\n三友化工\t133830\n二级域名解析\t133831\n物流招聘网\t133832\n异客\t133833\n黑暗版\t133834\nfourteen\t133835\n官邸\t133836\ndarren\t133837\ngongshi\t133838\n点钞\t133839\n雍和宫\t133840\n骏派a50\t133841\n转炉\t133842\n出租\t133843\neid\t133844\n第17条\t133845\n螺套\t133846\nhas\t133847\n皮卡版\t133848\n彭真\t133849\n中国城市科学研究会\t133850\n纪伯伦\t133851\n环氧底漆\t133852\n连职\t133853\nnote3\t133854\n短绳\t133855\n米侠浏览器\t133856\n布丁仓鼠\t133857\n县长\t133858\n电动童车\t133859\ng2800\t133860\n吴兴政府网\t133861\n丰泽街\t133862\n易图\t133863\n新城地产\t133864\n本愿\t133865\n东易日盛装饰\t133866\nUNIQUE\t133867\nJohansson\t133868\nMicr\t133869\n执着\t133870\nprematurely\t133871\n0.1秒\t133872\n任真天\t133873\n移动志趣网\t133874\n南充市人力资源和社会保障局\t133875\n奇星\t133876\n企炬\t133877\n措辞\t133878\n维沃移动通信有限公司\t133879\n10.7\t133880\n5米高\t133881\n名佳花园\t133882\n刻花\t133883\n难听\t133884\n小甜橙\t133885\ncontroll\t133886\nCenters\t133887\n宰割\t133888\n开浦兰\t133889\n浮图塔命理百科\t133890\n恩威\t133891\n小红人\t133892\n专销\t133893\n确幸\t133894\n广州市花都区人民医院\t133895\n那些回不去的年少时光\t133896\nAUSTRALIA\t133897\nㄑ\t133898\nWDR6500\t133899\n刀库\t133900\n流量君\t133901\n贸商网\t133902\n上元\t133903\n完事\t133904\n30卷\t133905\n梅城镇\t133906\n君王\t133907\n70张\t133908\nUdietoo\t133909\n禁卫军\t133910\n母女\t133911\n查理与巧克力工厂\t133912\nNATURE\t133913\n绣球\t133914\n最新新闻网\t133915\n秒拍\t133916\nnaxie\t133917\nQuotes\t133918\nscalar\t133919\n税金\t133920\n引进\t133921\n佳能G7\t133922\n250日\t133923\n高陂镇\t133924\nbach\t133925\n王祖贤\t133926\n孟津县\t133927\n魔蝎座\t133928\n南京旅游论坛\t133929\n马凳\t133930\n主筋\t133931\n长沙市长郡中学\t133932\n38.1\t133933\n冰血\t133934\n北京大学教育学院\t133935\n多雷\t133936\n昭平县人民政府\t133937\nDrill\t133938\nUrlEncode编码/UrlDecode\t133939\
n西安唐都医院\t133940\n地电\t133941\n3月13\t133942\n过去40年\t133943\nPC君\t133944\n长女版\t133945\nOffer\t133946\n平话\t133947\n000776\t133948\nVOC\t133949\nㄢ\t133950\n连环炮\t133951\n传热传质\t133952\n哈弗_哈弗H6\t133953\n梦幻西游神木林\t133954\nstell\t133955\nOANDA\t133956\n青桥\t133957\n脆皮鸡\t133958\n得志\t133959\n行道树\t133960\n解放村\t133961\n400周年\t133962\n算珠\t133963\n6200元\t133964\n汉中王\t133965\n受赠\t133966\ncfop\t133967\n午马\t133968\n城北村\t133969\n九十一\t133970\nThreads\t133971\nKK\t133972\n微单A7M3\t133973\n朝气\t133974\nlegislation\t133975\n粗细\t133976\n零活\t133977\nRaider\t133978\n丁岚\t133979\n自作孽\t133980\n内模\t133981\n采荷街道\t133982\nbag\t133983\n超武\t133984\nsunderland\t133985\n润滑液\t133986\n雪炫\t133987\n赵熙之\t133988\n85\t133989\n步阳\t133990\nXAF\t133991\nOMAP\t133992\n华师附小\t133993\n占率\t133994\nLiveHouse\t133995\n陈松伶\t133996\n公明街道\t133997\n新视野大学英语视听说\t133998\npushauction\t133999\n孤独感\t134000\n北辰山\t134001\n虎族\t134002\n远嫁\t134003\n木奈奈\t134004\n两姐妹\t134005\n句子\t134006\n考古学\t134007\n卫士长\t134008\n修饰词\t134009\n生地\t134010\n罗织经\t134011\n第89届\t134012\n悠方\t134013\n橙子雨\t134014\n秋水广场\t134015\n201711\t134016\n联共\t134017\n中共中央国务院\t134018\n鞍部\t134019\nFlexPaper\t134020\n会旗\t134021\n索香\t134022\n欣荣\t134023\n禅师网\t134024\n随机英雄\t134025\n赔给\t134026\n百度算法\t134027\ntechnica\t134028\ntier\t134029\nImgur\t134030\nganache\t134031\n宣威之窗\t134032\n五寨县\t134033\nR9tm\t134034\n散策\t134035\nMounts\t134036\n玉浩\t134037\n其乐无穷\t134038\n格策\t134039\n种菜\t134040\nFF7\t134041\n症候群\t134042\nredundancy\t134043\n座头市\t134044\n下一层\t134045\n梵高博物馆\t134046\n青石桥\t134047\nMagisk\t134048\n300元\t134049\n凤凰新闻\t134050\n宁波美团网\t134051\n迄\t134052\n禄\t134053\n5KW\t134054\n肯特大学\t134055\n凄婉\t134056\n曼岛\t134057\n呼噜噜\t134058\n幼升小衔接_小学\t134059\n钆\t134060\n一\t134061\n0名\t134062\n电影票\t134063\n5V5\t134064\n舶\t134065\n拼镜\t134066\n大钦岛\t134067\nproperties\t134068\n叶琪\t134069\nNF\t134070\n詹金斯\t134071\n充斥\t134072\n稿本\t134073\n钛金板\t134074\n144\t134075\n2014-01-02\t134076\nuTorrent\t134077\n41年\t134078\n群之心\t134079\n现代化\t134080\n2015.1.1\t134081\n禁地\t134082\n助理级\t134083\n江户\t134084\n中电建\t1340
85\n丽台Quadro\t134086\nbig\t134087\n海南省文化广电出版体育厅\t134088\nDataNode\t134089\nM35\t134090\n黑风洞\t134091\nMOD_快吧补丁网\t134092\n18px\t134093\n湖北农村信用社\t134094\n乖离性百万亚瑟王\t134095\n吉祥物\t134096\nRPi\t134097\n水科院\t134098\nero\t134099\nchloe\t134100\nTRX\t134101\n共模\t134102\n美尔雅\t134103\n乡风\t134104\n磋商\t134105\n晋商\t134106\nDCN\t134107\n中共贵州省委\t134108\n嬉皮士\t134109\n设岗\t134110\n里克尔梅\t134111\ncomplaints\t134112\n一颗树\t134113\nwxml\t134114\n义隆\t134115\n数据库原理与应用\t134116\n武汉美团网\t134117\n阿洛\t134118\n何欣\t134119\n乐2pro\t134120\n益达\t134121\n当月\t134122\n脚铐\t134123\nCabello\t134124\n1330\t134125\nRaid0\t134126\n斯兰\t134127\n优书\t134128\n宋鑫\t134129\n优服\t134130\n英雄之证\t134131\n婚庆用品\t134132\n交易额\t134133\n张晓梅\t134134\n秦始皇兵马俑\t134135\n核芯\t134136\n20150324\t134137\n沃尔沃s60\t134138\n一锭\t134139\n转向轮\t134140\n黄荆\t134141\nx7r\t134142\n钓金龟\t134143\n腋毛\t134144\n2424\t134145\n李家福\t134146\n桑科\t134147\n花蝴蝶\t134148\ngoogleplay\t134149\n光明食品集团\t134150\n赤字率\t134151\nattitude\t134152\n锋味\t134153\nwhl格式\t134154\n0526\t134155\nszl\t134156\nsourced\t134157\n续资治通鉴长编\t134158\n24罐\t134159\n室盖\t134160\n音频管理器\t134161\n淘宝达人\t134162\n白卷\t134163\n马尔库塞\t134164\n云通信\t134165\npart\t134166\n大头儿子和小头爸爸\t134167\n格雷\t134168\n农业南路\t134169\n刻录\t134170\n无存\t134171\n绝色医妃\t134172\n360中房网\t134173\nEBC\t134174\n魏宗万\t134175\n很忙\t134176\n前瞻产业研究院\t134177\n2018c1\t134178\nPDF转换\t134179\n住客\t134180\nBX\t134181\n骑宠\t134182\n龙兴镇\t134183\n博美狗\t134184\n新城网\t134185\n菲力\t134186\n存管\t134187\n惠农资讯网\t134188\n东风街道\t134189\n13步\t134190\n万达公寓\t134191\n万美元\t134192\nFujifilm\t134193\n婧\t134194\n科融环境\t134195\n97路\t134196\n井川\t134197\nDefinition\t134198\narpspoof\t134199\n众议院\t134200\n探索性\t134201\n最小说\t134202\nPOJO\t134203\n一两点\t134204\n在唱歌\t134205\n耽美肉文\t134206\n袪\t134207\n2000米\t134208\n9月22日\t134209\n12T\t134210\n必联网\t134211\n西祠胡同\t134212\n网络线\t134213\n多党\t134214\n寻路\t134215\n257号\t134216\n咨询类\t134217\n育英\t134218\nMurata\t134219\n渡江战役纪念馆\t134220\n濑户的花嫁\t134221\n一个都不能少\t134222\n1vs1\t134223\n灰狼\t134224\ncic\t134225\n案中案\t134226\n兜风\t134227\nCODE\t134228\n鲨鱼鳍\t
134229\nprogrammed\t134230\n一官\t134231\n上海第二医科大学\t134232\n62路\t134233\n等比数列\t134234\n孙协志\t134235\n孙鹏飞\t134236\n中止执行\t134237\n艾兰\t134238\n最好是\t134239\n帐目\t134240\n0.7%\t134241\n90日\t134242\n汉化完美硬盘版\t134243\n郑湫泓\t134244\n品书网\t134245\n上海地铁9号线\t134246\n长江委\t134247\n大冶市\t134248\n沈冰\t134249\n我与你的光年距离\t134250\n上古卷轴4:湮没\t134251\n十一届三中全会\t134252\n厄介\t134253\n汇佳幼儿园\t134254\n蚂蚁浏览器\t134255\n资讯_系统粉\t134256\n190个\t134257\ngees\t134258\n说媒\t134259\n电子邮件服务器\t134260\n常州市第三人民医院\t134261\n鬼索\t134262\n马灯\t134263\n情深意长\t134264\nPresentations\t134265\n虚拟性\t134266\n藏剑山庄\t134267\nrhino\t134268\n招待所\t134269\n灿烈\t134270\n旗下\t134271\n紫冰\t134272\n专文\t134273\n教授们\t134274\n北京汽车博物馆\t134275\n北漂族\t134276\nVC14\t134277\n打片\t134278\n笔误\t134279\n5.4.16\t134280\n鏊子\t134281\n沧州市政府\t134282\n桂宝\t134283\npr2e\t134284\n5.5.4\t134285\nlord\t134286\n苏欣\t134287\noracle12c\t134288\n监牢\t134289\n着火点\t134290\n氧化钨\t134291\n粉\t134292\n规模化\t134293\n不可数\t134294\n面经\t134295\n马陵山\t134296\nfordham\t134297\n下毒\t134298\nadvances\t134299\n记忆合金\t134300\nu5\t134301\n时计\t134302\n不好玩\t134303\n安装架\t134304\n攀钢集团有限公司\t134305\n消防日\t134306\n男人窝\t134307\n桑吉\t134308\n存储卡\t134309\nlama\t134310\n香味\t134311\n争章\t134312\n非正义战争\t134313\n哈鲁\t134314\naustralian\t134315\n小泉纯一郎\t134316\n刘禅\t134317\n壮阳酒\t134318\n立竿见影\t134319\n完美\t134320\n手机游戏吧\t134321\n泰勒斯\t134322\n内比都\t134323\n中兴移动\t134324\ntrail\t134325\n名侦探\t134326\n健忘\t134327\n我卡\t134328\n340分\t134329\n徐晓东\t134330\n失踪者\t134331\n直联式\t134332\nals\t134333\n腾讯音乐娱乐集团\t134334\n李仁港\t134335\n照片书\t134336\n长皮\t134337\ntui\t134338\n37集\t134339\n山财培训网\t134340\n高嵩\t134341\n滇重楼\t134342\n鸭嘴\t134343\n白浅\t134344\n女侦探\t134345\n创新港\t134346\nOrion\t134347\n涩\t134348\nCOLOR\t134349\n连体裙\t134350\n几发\t134351\n咕咪\t134352\n扩建\t134353\nSystemVerilog\t134354\n红沿河核电站\t134355\n180克\t134356\n锋菲\t134357\nFlick\t134358\n小意\t134359\n夺命咒\t134360\nrequires\t134361\n张浩\t134362\n公存款\t134363\n疗效\t134364\ntestin\t134365\n诡影\t134366\nDomino\t134367\n一千里\t134368\ndiligent\t134369\nel-radio\t134370\n杂多县\t134371\nCallable\t134372\n情头
\t134373\n体长\t134374\n大悟\t134375\n同达\t134376\n柳毅传书\t134377\n天河区\t134378\n麦田守望者\t134379\n228元\t134380\n会会\t134381\n几多\t134382\n黄壳\t134383\nalertdialog\t134384\nGR2\t134385\n蓝板\t134386\n衬\t134387\n长沙地区\t134388\n凡客诚品\t134389\n绿跑\t134390\nkeyCode\t134391\n海底隧道\t134392\n省自然科学基金\t134393\n滑轮\t134394\n不做账\t134395\n江格尔\t134396\n牛腿\t134397\n计量型\t134398\ndisplay_errors\t134399\n格林威尔\t134400\n第8节\t134401\ncopter\t134402\n乙基\t134403\n中心城区\t134404\n沈华\t134405\n麻辣天后宫\t134406\nexterior\t134407\n第41集\t134408\n三思\t134409\nCalorie\t134410\nTuscany\t134411\n香喷喷\t134412\n杀医\t134413\n不可饶恕\t134414\n筒壁\t134415\niciba\t134416\n960元\t134417\n橡胶木\t134418\n总结帖\t134419\nscarlet\t134420\n海关法\t134421\n足协\t134422\n手术部\t134423\nPROGRAM\t134424\n反同\t134425\n蓝婷\t134426\n夏鹃\t134427\n张庄村\t134428\n刀锋山\t134429\n周泽\t134430\n有拍\t134431\n光灯\t134432\nLVMH\t134433\n色斑\t134434\n真心对待\t134435\n张立军\t134436\n张润贞\t134437\n喃\t134438\n大先生\t134439\n法医学\t134440\nxmind8pro\t134441\n广园新村\t134442\n网上申购\t134443\n肺栓塞\t134444\nFinishing\t134445\n全高清屏\t134446\n挪威\t134447\nmit\t134448\n128个\t134449\n凶光\t134450\n社会学院\t134451\n塔香\t134452\n周播\t134453\nForecast\t134454\n尼康D810\t134455\n39亿\t134456\n妄动\t134457\n梦之队\t134458\n五乡镇\t134459\n记录纸\t134460\n黄渤海\t134461\n想容\t134462\n汇编语言程序设计\t134463\ncables\t134464\nAlain\t134465\nCNKI\t134466\n内蒙古政府\t134467\n中化岩土\t134468\n载重线\t134469\n蓝魔虾\t134470\nv8\t134471\n现行\t134472\n子字符串\t134473\n介数\t134474\n疯牛\t134475\n龚家湾\t134476\n进制数\t134477\n北京日报社\t134478\n考试星\t134479\nreplicat\t134480\nkiroro\t134481\n超过三个月\t134482\n西点培训学校\t134483\nelasticsearch6\t134484\n林公子\t134485\n安徽省小学\t134486\n青茫\t134487\n山东省委党校\t134488\n厉害了我的国\t134489\n刺鼻\t134490\n李汝珍\t134491\nqmail\t134492\n玩命猜成语\t134493\nseafile\t134494\n普罗\t134495\n拳皇98终极之战\t134496\n城战\t134497\n花样年华\t134498\n飞院\t134499\nKB4093112\t134500\nCREO4.0\t134501\n米肠\t134502\n讲章\t134503\n1418\t134504\n歌会\t134505\n朱光潜\t134506\n碘离子\t134507\njqueyr\t134508\n幻想曹操传外传\t134509\n典型性\t134510\n潜水轴流泵\t134511\n腹肌轮\t134512\n186\t134513\n绿色软件联盟\t134514\n挑起\t134515\nresin4\t13
4516\n骁龙636处理器\t134517\nLogon\t134518\n球门\t134519\n750万\t134520\nDMI指标\t134521\n全方\t134522\n鸿蒙\t134523\n瓦房店市\t134524\n刘威\t134525\n校园文化艺术节\t134526\n神卡\t134527\n蒋老师\t134528\nAV女優\t134529\n伪女\t134530\n兰州中川机场\t134531\n发水\t134532\n捷配电子通\t134533\n拐杖\t134534\n刻骨\t134535\n第三轨\t134536\n冰光\t134537\n秦始\t134538\n姜昕\t134539\n吸锡器\t134540\nLawyers\t134541\n万科翡翠公园\t134542\npro\t134543\n180度\t134544\n白小姐\t134545\n精元\t134546\n据报\t134547\n懵\t134548\n尼罗鳄\t134549\n蒋婷婷\t134550\n城事\t134551\n未定义函数\t134552\n017年\t134553\n未竟\t134554\nBenz\t134555\n用户界面\t134556\n讨厌自己\t134557\n南移\t134558\n燕郊网城\t134559\n1456\t134560\n雷雷\t134561\n蛇口自贸区\t134562\n白玻\t134563\n重码\t134564\n浆体\t134565\n托尼·史塔克\t134566\nNEB\t134567\nporting\t134568\n丙公司\t134569\n2345手机助手\t134570\n考试场\t134571\n入役\t134572\n海滨街道\t134573\nCounty\t134574\n产学研\t134575\n20170816\t134576\nエステ\t134577\n签约\t134578\n发奖\t134579\n6at\t134580\nC.3\t134581\n扶贫路\t134582\nnpp\t134583\n苯并芘\t134584\n加里曼丹岛\t134585\nDevelop\t134586\n似然函数\t134587\n杨澜碧昂丝\t134588\n2000多年\t134589\n第52届\t134590\nkmalloc\t134591\n中国新声代\t134592\nlayedit\t134593\nMechanisms\t134594\nCockyBoys\t134595\nfreezing\t134596\n3月18\t134597\nsamsara\t134598\n摇摇欲坠\t134599\n广西壮族自治区国家税务局\t134600\n哈瓦\t134601\n西安区\t134602\n安理会\t134603\n超级英雄2\t134604\n海沧大桥\t134605\n涂饰\t134606\n观塘\t134607\n读友\t134608\n拜仁\t134609\n邯郸丛台区\t134610\nvalueOf\t134611\n蝴蝶剑\t134612\nCBA公司\t134613\njiema\t134614\noac\t134615\n名誉\t134616\n普广\t134617\n何辉\t134618\n魔毯\t134619\n唐健\t134620\n08cms\t134621\nWalks\t134622\n强降雨\t134623\n亿立方米\t134624\nnipple\t134625\n张继\t134626\nReactJs\t134627\n大行\t134628\n重大责任事故罪\t134629\n溶血症\t134630\n委托方\t134631\npearson\t134632\n平整机\t134633\nminitab17\t134634\n蓝思科技\t134635\n听一听\t134636\n好好笑\t134637\n4399j.com\t134638\n单写\t134639\n秋涛\t134640\n法定代理人\t134641\n财务共享服务中心\t134642\n解意\t134643\n脑病\t134644\n搜出\t134645\n共襄\t134646\n清帐\t134647\n苹果味\t134648\nnuts\t134649\nretries\t134650\n徐家汇中心\t134651\nadam\t134652\nGuide\t134653\n坎儿\t134654\n引以为\t134655\n小僧\t134656\n瀍河区\t134657\nHSB\t134658\n扎西顿珠\t134659\n白边
\t134660\n后入式\t134661\n北凉\t134662\n骁龙425\t134663\n布莱顿\t134664\n三大航空联盟\t134665\nCrimson\t134666\n浸没\t134667\n拉锯战\t134668\n罗普特\t134669\n索瑟姆\t134670\n拐角处\t134671\nundertow\t134672\n撸一管\t134673\n牛男\t134674\nデビュ\t134675\n齐装网\t134676\n微表情\t134677\n潜水排污泵\t134678\n亲爱的你在哪里\t134679\n高丹\t134680\n藤田麻衣子\t134681\nSlender\t134682\n盐\t134683\nasos\t134684\n姨妈巾\t134685\n一箭穿心\t134686\nTacey\t134687\nX9\t134688\n3w\t134689\n贱妇\t134690\nf91\t134691\n广州市越秀区人民法院\t134692\nAssociated\t134693\n二手房公积金贷款\t134694\n绝仙\t134695\n渝万\t134696\nSinner\t134697\n售卖\t134698\n章节目\t134699\n马克沁\t134700\n新唱\t134701\n上海市红十字会\t134702\n陈文胜\t134703\nfabricjs\t134704\nV5R21\t134705\n咳\t134706\n王大陆\t134707\nishow\t134708\n广化寺\t134709\n婚房\t134710\n和顺镇\t134711\n造次\t134712\n挤奶器\t134713\n烟雨霏霏相思梦\t134714\n六角星\t134715\n略图\t134716\n王治郅\t134717\n龙池开发区\t134718\nbeier\t134719\nposite\t134720\n地狱天堂\t134721\n广州市中西医结合医院\t134722\n30回\t134723\nDAO\t134724\n迪普科技\t134725\n高端化\t134726\n林冠\t134727\n蓝山湾\t134728\n揪心\t134729\nm^2\t134730\n试练\t134731\n3ADisk网\t134732\nNAVER\t134733\n刘兰芳\t134734\n木料\t134735\n森下\t134736\n飞彩\t134737\n24届\t134738\n村女\t134739\n神鞭\t134740\n张语格\t134741\n灵超\t134742\n反隐\t134743\nConspiracy\t134744\nsingles\t134745\n练字\t134746\n4k显示器\t134747\n陈淑芬\t134748\nServant\t134749\n粤府\t134750\nuplink\t134751\n小匠\t134752\n淮南百姓网\t134753\n421号\t134754\nEvans\t134755\nliver\t134756\n典型\t134757\n12.9%\t134758\n紫川\t134759\n1.01G\t134760\n正性\t134761\n杏树\t134762\n南京仙林\t134763\n广林\t134764\n磁贴\t134765\n黄瓜籽粉\t134766\n学龄前\t134767\n祥旭\t134768\ncornerstone\t134769\n吞食孔明传\t134770\n安义县\t134771\n黔农网\t134772\n狄波拉\t134773\n喜梦宝\t134774\n山木香\t134775\n巨蟹\t134776\n古典吉他\t134777\n南宁市环境保护局\t134778\n图片库\t134779\n29.0.0.113\t134780\n幕僚长\t134781\n2.1版\t134782\n650元\t134783\n小豆岛\t134784\n园林学院\t134785\n特工组\t134786\n蒸纪\t134787\n百事可乐\t134788\n齐刷刷\t134789\n青年社区\t134790\n预考\t134791\n攻势\t134792\n巴以冲突\t134793\n2017.2.1\t134794\n待审\t134795\nAnnouncements\t134796\n红外灯\t134797\n360系统急救箱\t134798\n转票\t134799\n朱姐\t134800\nvitamio\t134801\n汉中火车站\t134802\n3杯\t134803
\n天照\t134804\n平面设计学院\t134805\n╲\t134806\n美巢\t134807\n碘化钾\t134808\npres\t134809\n襄阳火车站\t134810\n阅房网\t134811\n字纹\t134812\nNatasha\t134813\n化疗\t134814\n神仙居\t134815\n霹雳江湖\t134816\n集思广益\t134817\n外来媳妇本地郎\t134818\n我的爸爸\t134819\n无线ap\t134820\n独居\t134821\n1512\t134822\n企业基础信息表\t134823\n推动者\t134824\n长沙大学\t134825\n3.37\t134826\n5713\t134827\nMacbookpro\t134828\n传者\t134829\nHaar\t134830\n现阶段\t134831\n嗤笑\t134832\n5044\t134833\n黑凤梨\t134834\n诞辰\t134835\n趴赛\t134836\n苦瓜片\t134837\n217\t134838\n刘清池\t134839\n汉语言文学网\t134840\n螺杆空压机\t134841\n短管\t134842\n西部大开发\t134843\n希财\t134844\n路上\t134845\n埃利亚松\t134846\nFRAME\t134847\n释迦牟尼佛传\t134848\n非正式会谈\t134849\nriso\t134850\n南堡\t134851\n换帅\t134852\n风云三国\t134853\n中华标准件网\t134854\n武隆区\t134855\nV919\t134856\n求职照\t134857\n晶瑞股份\t134858\n战争与和平\t134859\n黄连上清片\t134860\n36部\t134861\na杖\t134862\n回访\t134863\n光具座\t134864\nemWin\t134865\n鹤王\t134866\nflash版\t134867\nSCBT\t134868\nshijie\t134869\nkyc\t134870\n直售\t134871\n按完\t134872\nstrategic\t134873\n一字板\t134874\nemui5.0\t134875\n抱狗\t134876\ngt720\t134877\n沈亮\t134878\n守护永恒之树\t134879\n枇杷\t134880\n费歇尔\t134881\nDataset\t134882\n蠕墨铸铁\t134883\n周1\t134884\n长益\t134885\n大南街\t134886\n云乐\t134887\nxieav\t134888\n小菜鸟\t134889\nroberts\t134890\n气压棒\t134891\n金融计算器\t134892\nshowman\t134893\n党忠诚\t134894\n副科\t134895\nSHIBOR\t134896\n鼠兔\t134897\n软化剂\t134898\nⅪ\t134899\n脱脂棉\t134900\n近现代史纲\t134901\ntftpd\t134902\n安昌镇\t134903\n米娜\t134904\n加油员\t134905\n紫竹调\t134906\n远东电缆有限公司\t134907\n天舒\t134908\n谢文\t134909\n易建联\t134910\nplural\t134911\n奥通\t134912\nCONTROLLER\t134913\n002151\t134914\n64万\t134915\nscrollbar\t134916\n擅长捉弄的高木同学\t134917\n行政强制法\t134918\n耳眼\t134919\nEXRPG\t134920\n崩\t134921\n螺帽\t134922\nDebugging\t134923\n危险化学品经营许可证\t134924\n曝气管\t134925\ntenos\t134926\n指南录\t134927\n塔斯马尼亚大学\t134928\n11台\t134929\n威德\t134930\n心存感激\t134931\n公民\t134932\n张雨\t134933\nRaymond\t134934\n神奇寶貝百科\t134935\ncompartment\t134936\n安托\t134937\n加氢站\t134938\n四川发展(控股)有限责任公司\t134939\nCaching\t134940\n花开的时候你就来看我\t134941\n企业战略管理\t134942\n异母\t134943\n老街区\t134944\nSRA\t1
34945\nVERICUT\t134946\n紧肤水\t134947\n40多分钟\t134948\n张立志\t134949\n小雷锋\t134950\n康莱特\t134951\n广美标识LED发光字工厂\t134952\n崔成浩\t134953\n拉萨贡嘎机场\t134954\nwebservice\t134955\naudio\t134956\n杨馥宇\t134957\n争论不休\t134958\n养生信\t134959\nKartik\t134960\nunichi\t134961\ndoctorx\t134962\n002019\t134963\n茶叶罐\t134964\n张庆\t134965\n博鳌乐城\t134966\n会声会影X4\t134967\n御河湾\t134968\n牛牛播放器\t134969\n茅山术\t134970\nterminate\t134971\n_特玩游戏网\t134972\n狗头鱼\t134973\n及时率\t134974\n埋针\t134975\nsoundtrack\t134976\nhololens\t134977\n体操服\t134978\n朱尔\t134979\n禅心\t134980\nPlugin\t134981\ndaylight\t134982\n文爱社区\t134983\n七彩作文网\t134984\n康妇消炎栓\t134985\n_县\t134986\n第130章\t134987\n芽笼\t134988\n招商牛地产\t134989\n1.47G\t134990\nreloc\t134991\n心动版\t134992\nxlr\t134993\n苏州高铁新城\t134994\nWWW.171ZZ.COM\t134995\n撩湿\t134996\n官学\t134997\n刺蛇\t134998\n苏修\t134999\n走访\t135000\n闵行区\t135001\n2.9万\t135002\nb2\t135003\n青春失乐园\t135004\nLXC\t135005\n57度\t135006\n签批\t135007\n逆行者\t135008\n1-2\t135009\n吉利汽车4S店\t135010\noneself\t135011\n国家监察委\t135012\n榆林机场\t135013\nceb文件阅读器\t135014\n始于\t135015\nsucceed\t135016\n北京市中西医结合医院\t135017\n3月15\t135018\n万彩娱乐\t135019\n宋松\t135020\n世贸天阶\t135021\n厦门航空公司\t135022\n赴国\t135023\n雅高集团\t135024\nLOLITA\t135025\n寄递业\t135026\n亚历山大二世\t135027\n建构主义\t135028\n久安\t135029\n长城马拉松\t135030\n中国资本\t135031\n两千多万\t135032\n怀柔\t135033\n楽園\t135034\ncreo2.0\t135035\n樊振郎\t135036\n赵伟洲\t135037\nserve\t135038\nfindings\t135039\n10013\t135040\n第5次\t135041\n9377攻沙\t135042\n可乐必妥\t135043\n生日礼物\t135044\n达叔街\t135045\n浄化\t135046\n大伦敦\t135047\n勇者斗恶龙怪兽篇\t135048\ncorner\t135049\n皮特\t135050\n乔新江\t135051\n老鼠肉\t135052\nsevers\t135053\n艾利之书\t135054\n同栖\t135055\n梅花香自苦寒\t135056\n长江汽车\t135057\n巴音郭楞\t135058\n大唐开国\t135059\n泄不通\t135060\n战锤末世鼠疫2\t135061\n剑灵投影寺院\t135062\nGK\t135063\n我们的侣行\t135064\n牛米\t135065\n炸毛\t135066\n6.xml\t135067\n北东\t135068\n汇冠股份\t135069\n教育时代\t135070\n信阳日报\t135071\n33平米\t135072\n海南建设自由贸易试验区\t135073\n无彩\t135074\n脑膜炎\t135075\n酷漫居\t135076\nnotebooks\t135077\n出人意料\t135078\n覆盖式\t135079\n3座\t135080\n封禅\t135081\ndongdong\t135082\n复地东湖\t135083\n胆小者\t1
35084\n华为畅玩7x\t135085\n廊坊地区\t135086\n人间至味是清欢\t135087\nclothes\t135088\n工程勘察设计收费管理规定\t135089\nSublimeLinter\t135090\nKen\t135091\n绝地求笙咨询网\t135092\n书堂\t135093\n槐泗镇\t135094\n政治类\t135095\n哪咤\t135096\n回归号\t135097\nSARS\t135098\n吸进\t135099\n模糊数学\t135100\n刻度盘\t135101\n导电液\t135102\n*期\t135103\n500M\t135104\n16.07.13\t135105\n140级\t135106\n任志刚\t135107\n高考生\t135108\n榆次区\t135109\n四挡\t135110\n刘国华\t135111\n三百公里\t135112\n易华\t135113\n骁勇\t135114\n两尺\t135115\nCiti\t135116\n福海街道\t135117\n統計\t135118\n鸡西大学\t135119\npermissive\t135120\nU-571\t135121\nRESTEasy\t135122\n施图伦\t135123\nusermod\t135124\n0877\t135125\n增值电信\t135126\n深圳研究院\t135127\n女子力\t135128\n全波\t135129\n喀拉峻\t135130\n小公\t135131\n蜚声\t135132\n1146\t135133\nDemos\t135134\n230万元\t135135\nC276\t135136\n生态廊道\t135137\n李晴\t135138\npdf阅读器\t135139\nTrunk\t135140\n鼻型\t135141\n巨物\t135142\n杜马镇\t135143\nm4v\t135144\n朝鲜\t135145\nHonoka\t135146\n焕晶\t135147\nCLARINS\t135148\n归期\t135149\norganizer\t135150\n诗性\t135151\n资本市场\t135152\n13.10\t135153\n帝国时代1\t135154\n矿冶\t135155\n吉利汽车博越\t135156\n登峰造极\t135157\n看画\t135158\n剑术师\t135159\n憔悴\t135160\n爲\t135161\nkatalon\t135162\n拿回\t135163\n应期\t135164\n木下秀吉\t135165\n考察期\t135166\n焊角\t135167\nHEVC\t135168\nhush\t135169\n朱庄\t135170\n事业史\t135171\n永无止境\t135172\n求解决\t135173\n玻璃片\t135174\n大拇指广场\t135175\n蕉岭县\t135176\n吉位\t135177\n涌堂\t135178\n卫生委员\t135179\n战服\t135180\n黑龙潭公园\t135181\n德令哈\t135182\n书海\t135183\n52000\t135184\nWin32\t135185\nChristie\t135186\nDN200\t135187\n海爷\t135188\n富力桃园\t135189\n前南斯拉夫\t135190\n宁都县\t135191\n合金管\t135192\n洛阳东方医院\t135193\n炉工\t135194\n德美化工\t135195\n我的女人\t135196\n兵马雅阁\t135197\n0e\t135198\n英语节\t135199\n五支\t135200\nCFXB\t135201\n赵玉瑾\t135202\n新华信\t135203\nJObject\t135204\n安网\t135205\nFacing\t135206\n83级\t135207\nQube\t135208\n宝马6系\t135209\nm6600\t135210\nrancher\t135211\nip68\t135212\ntnga\t135213\n文达\t135214\n第174集\t135215\n陈麻花\t135216\nooh\t135217\n乡野春\t135218\n义务教育\t135219\n治乱\t135220\n承重墙\t135221\n黑带\t135222\n东北酸菜\t135223\nENVY\t135224\nPepper\t135225\n金陵饭店\t135226\n劳动保障新闻网\t135227\n回证\t13
5228\n内能\t135229\n东山小区\t135230\n批萨\t135231\n隆基泰和物业\t135232\n张述\t135233\n周涛\t135234\n四通阀\t135235\n系统生物学\t135236\nTLS\t135237\nstein\t135238\n青泥洼桥\t135239\n泄漏率\t135240\ndiandian\t135241\n活动柜\t135242\nstudio12\t135243\n2017年3月29日\t135244\n动车段\t135245\nH2O\t135246\n境内\t135247\n希美真由\t135248\n银雾\t135249\n上机\t135250\nō\t135251\n南京脑科医院\t135252\n兰世立\t135253\na55\t135254\n萌兽\t135255\n寡闻\t135256\n涩谷\t135257\n威朗论坛_汽车之家论坛\t135258\n民营资本\t135259\n韦庄\t135260\nRedo\t135261\n谐振频率\t135262\n沙龙会\t135263\n选曲\t135264\n油田公司\t135265\n百胜\t135266\n上古卷轴OL\t135267\n放款\t135268\n杨莲亭\t135269\n盗版狗\t135270\n赤木晴子\t135271\n大连万达\t135272\n宿管\t135273\n双黄\t135274\n库蒂尼奥\t135275\n多国\t135276\nintellij\t135277\n广州注册公司\t135278\n酸枣\t135279\n荣耀畅玩5x\t135280\n收到\t135281\n试映\t135282\nsubfigure\t135283\n剂量\t135284\nnanosleep\t135285\n派件\t135286\n活化\t135287\nexcel散点图\t135288\n2.0下\t135289\n之二十九\t135290\nCody\t135291\n通式\t135292\n黔江在线\t135293\nVista\t135294\n姑米\t135295\n安徽合肥公共资源交易中心\t135296\n兆芯\t135297\n海量版\t135298\nshv32\t135299\n刘雨鑫\t135300\n琪琪格\t135301\n管壳式换热器\t135302\n衣藻\t135303\n张丰毅\t135304\n靓模\t135305\n动作版\t135306\n8MM\t135307\n伯特利\t135308\n伸缩\t135309\nDQMJ3\t135310\n回力轮胎\t135311\n遗言\t135312\n骚味\t135313\n余昌华\t135314\n艾哈迈德\t135315\n节气门\t135316\n订机\t135317\nfat\t135318\n鸿翼\t135319\n上海代表团\t135320\n辽宁科技学院\t135321\n肖剑\t135322\n12月4日\t135323\nPre-A轮融资\t135324\n买卖合\t135325\n溥\t135326\n任督\t135327\n微镇\t135328\n中商产业研究院\t135329\n养生迪\t135330\n绿里\t135331\n地热井\t135332\n冬泉\t135333\nOVA\t135334\n济南长途汽车东站\t135335\n954集\t135336\nDataFrame\t135337\nV9.5\t135338\n身上\t135339\ntss\t135340\n彼端\t135341\nx64dbg\t135342\ncriteria\t135343\n左洛复\t135344\nexs\t135345\n弹塑性分析\t135346\n天津市旅游局\t135347\n豫都网\t135348\n智远理财\t135349\n恶势力\t135350\n视网膜母细胞瘤\t135351\n中长发\t135352\n有期徒刑\t135353\n合星\t135354\n水港\t135355\n肾衰竭\t135356\npathways\t135357\n起夜\t135358\n双十\t135359\n西多夫\t135360\nrecvfrom\t135361\n耿灿灿\t135362\n陈蕃\t135363\namazfit\t135364\n地球在线\t135365\n李洋\t135366\n足弓\t135367\n1799\t135368\n合作处\t135369\n沙市中学\t135370\n大阪海游馆\t135371\n肇州县\t135372\n哈尔滨物流公
司\t135373\n大使命\t135374\n小症\t135375\n东戴河新区\t135376\n茶语\t135377\n上海讨债公司\t135378\n699号\t135379\n大师傅\t135380\n淡蓝色\t135381\n第八宫\t135382\n山水文园\t135383\n3568\t135384\n东京湾\t135385\n罗格列酮\t135386\nfloat32\t135387\n橘红丸\t135388\n返利网\t135389\n碱性电池\t135390\n2pdf\t135391\n神乎其技\t135392\n20t\t135393\n湛江一中\t135394\nthroughout\t135395\n第二_\t135396\nEquals\t135397\nconcurrenthashmap\t135398\n逆魔\t135399\nii型\t135400\n长片\t135401\n红白\t135402\n潮头\t135403\n特一\t135404\n飞秒激光近视手术\t135405\n赛意\t135406\n市商务委\t135407\nJuste\t135408\nTop3\t135409\nswapping\t135410\n爬墙\t135411\n译文集\t135412\n魏善庄\t135413\n徐志摩\t135414\n株洲天元区\t135415\n赵毅\t135416\nme\t135417\n黄石二中\t135418\n炒勺\t135419\n大奶\t135420\nanxious\t135421\n几毫升\t135422\nrnl\t135423\n假值\t135424\n行尸走肉第六季\t135425\nstatement\t135426\n南里\t135427\n胶囊公寓\t135428\nLikes\t135429\n形成权\t135430\n乐艺\t135431\n昌吉州\t135432\n旗云\t135433\n手握\t135434\nqq幻想\t135435\n工具钳\t135436\n再生龙\t135437\nAlumni\t135438\n矮仔\t135439\n末级\t135440\n函范本\t135441\nholt\t135442\nVitamix\t135443\n奔驰a级\t135444\n千眼\t135445\n宣讲家网\t135446\n14集\t135447\n美空\t135448\n缓冲\t135449\n雅斯特酒店\t135450\n桉\t135451\n114查询网\t135452\nsmartform\t135453\n易胜华\t135454\n绝情谷\t135455\n50a\t135456\n安徽大厦\t135457\n声动\t135458\n后缘\t135459\n照首\t135460\n郡县制\t135461\n储物袋\t135462\n嘉誉\t135463\n蒸饺\t135464\ncurl_easy_perform\t135465\nonMeasure\t135466\n乔约翰逊\t135467\n碘酸\t135468\n厦门软件园二期\t135469\n茶颜悦色\t135470\n张巡\t135471\n冷杉溪\t135472\n小伙子们\t135473\n郭锡文\t135474\n初年\t135475\n残卷\t135476\n内舒\t135477\n终端管理\t135478\n王雅洁\t135479\n茅洲河\t135480\n定向班\t135481\nmycelipse\t135482\nslicer\t135483\n韩宇亮亮\t135484\n冬末\t135485\n村道\t135486\n突变型\t135487\n阿里地区\t135488\n高压灭菌锅\t135489\ntsar\t135490\n低能\t135491\n晕乎乎\t135492\n养生谷\t135493\n猴票\t135494\n参差不齐\t135495\n全错\t135496\n0.7931292126886547\t135497\nCorners\t135498\n涂油\t135499\n相容性\t135500\nM4\t135501\n60家\t135502\n东江环保\t135503\n急冻奇侠\t135504\n广明高速\t135505\n尼彩\t135506\n203号\t135507\n竹节虫\t135508\nSOT-23\t135509\n斤斤计较\t135510\nharman\t135511\n23456\t135512\nMyBatis+MySQL\t135513\n49层\t135514\nxueersi\t135515\
n傲世丹神\t135516\nerica\t135517\n涉入\t135518\n10pro\t135519\n天辰\t135520\nFig\t135521\nfulcrum\t135522\n受益方\t135523\n字形\t135524\n雷雨心\t135525\nHBC\t135526\n乌镇戏剧节\t135527\n汤粉\t135528\nGrads\t135529\n能率燃气热水器\t135530\n封套\t135531\nCatholic\t135532\nderek\t135533\n回来\t135534\nMP3剪切器\t135535\n不鸟\t135536\n纯木\t135537\n伏尔塔瓦河\t135538\n1.3.4.4\t135539\n地地\t135540\nItaly\t135541\n2016年1月1日\t135542\n彝人\t135543\n罗云\t135544\n东安\t135545\n2834\t135546\n可爱的动物\t135547\ncardiac\t135548\n京东小金库\t135549\n潘马斯\t135550\n8丝\t135551\n福永\t135552\nMamma\t135553\n安徽理工\t135554\n气液两相流\t135555\n1230V2\t135556\n主席台\t135557\n山东农科频道\t135558\n四川省交通投资集团有限责任公司\t135559\n板塔\t135560\n番泻叶\t135561\n蒋王庙\t135562\n中域\t135563\n节装\t135564\nChant\t135565\n2018年05月\t135566\n麟\t135567\n常熟市第一人民医院\t135568\n雪獒\t135569\n台商大厦\t135570\n对岸\t135571\n旭辉朗香郡\t135572\niboy\t135573\n泊泉雅\t135574\nXU\t135575\n包藏\t135576\nrarbt\t135577\n一团\t135578\n美案\t135579\n张瑞敏\t135580\n加乐\t135581\n城镇居民基本医疗保险\t135582\n抵价券\t135583\n497\t135584\n焊装\t135585\n7i\t135586\nMgO\t135587\n滴露\t135588\n批量检测\t135589\n德邦快递\t135590\n地球科学学院\t135591\n巴西亚\t135592\n脓毒症\t135593\n高兴太早\t135594\n争战\t135595\n信息管理学\t135596\niServer\t135597\n于建安徒生\t135598\nTSDL\t135599\n徐光启\t135600\n收拍\t135601\n通流\t135602\n刘奶奶\t135603\n中国人民解放军\t135604\n频率分辨率\t135605\nNodeJs\t135606\n内科护理学\t135607\n中新药业\t135608\n建质\t135609\n梯架\t135610\n网页认证\t135611\n27天\t135612\n替罪羊\t135613\n同飞\t135614\n法宣在线|普法网\t135615\n送修\t135616\n漏风率\t135617\n科幻片\t135618\n双旗\t135619\n当代\t135620\n瘦脸仪\t135621\n生逢\t135622\n回归分析法\t135623\n绝世药神\t135624\n精制棉\t135625\n0.015\t135626\n顾爷\t135627\n一二三\t135628\n督导\t135629\n顾国平\t135630\n太行纪念馆\t135631\n唧唧堂\t135632\nar2048s\t135633\n酒石酸\t135634\n2000年前\t135635\n在工地\t135636\n强调事项段\t135637\ngaara\t135638\n赵少康\t135639\n资金成本\t135640\n7.3.0\t135641\n5612\t135642\n3._\t135643\n河北省食品药品监督管理局\t135644\n中胡\t135645\n混凝土密封固化剂\t135646\n中国电子商务协会\t135647\n印花机\t135648\n奇特\t135649\n秦岭\t135650\n己二胺\t135651\n挥着翅膀的女孩\t135652\n红寨恋足论坛_玉足图片|恋足视频|丝袜美女|美足图片|恋足\t135653\nyushi\t135654\n没门\t135655\n_购酒网\t135656\n喝喝\t1
35657\n雨田\t135658\n袢\t135659\n裸婚\t135660\n朱志伟\t135661\n东疆\t135662\n兴山县\t135663\n资管计划\t135664\n超额收益率\t135665\n桃苗\t135666\n福泉\t135667\n甲酸钠\t135668\n奇力\t135669\nabs泵\t135670\n3.4.1\t135671\n国际_新闻中心\t135672\nJPG模板\t135673\nprom\t135674\n数字认证\t135675\n童话网\t135676\n腾讯视频星光大赏\t135677\n能上能下\t135678\n牢记\t135679\n阿咪\t135680\nnorth\t135681\n2345王牌浏览器\t135682\ndhf\t135683\n过程\t135684\nyidian\t135685\n毛豆腐\t135686\n按键精灵吧\t135687\n困住\t135688\n像素块\t135689\n雅式\t135690\n梦飞\t135691\n心理准备\t135692\n无量寿佛\t135693\nmfi认证\t135694\n多弗\t135695\n飙\t135696\nmindmanager2016\t135697\nSnippets\t135698\n庐州大道\t135699\n标准集团(香港)有限公司\t135700\n细说\t135701\n麻烦处\t135702\n护肝\t135703\n听天由命\t135704\noppoa77\t135705\n高速公路\t135706\n内脂\t135707\n题外话\t135708\n享元\t135709\n非公路\t135710\n删选\t135711\n昆山国际学校\t135712\n鄂伦春自治旗\t135713\n章末\t135714\n水下八关\t135715\n皮尔斯·布鲁斯南\t135716\n中国机电网\t135717\n万江\t135718\nGoo\t135719\n菲常\t135720\n体统\t135721\n亿翰\t135722\n尖草坪区\t135723\n东风南路\t135724\n凯末尔\t135725\n翠山\t135726\n乐六文旅\t135727\n哥们儿\t135728\n2017-2019年度\t135729\nJavascri\t135730\n练习与测试\t135731\n句话\t135732\n李永康\t135733\n金刚不坏\t135734\n驾案\t135735\n飞越疯人院\t135736\n83号\t135737\npmsm\t135738\nmineral\t135739\n环世界A17\t135740\n曼迪\t135741\n吕涛\t135742\n帝妃\t135743\n高密\t135744\n木兰县\t135745\nConsolidation\t135746\n包拯\t135747\n改出\t135748\n失眠症\t135749\nCOSMOPlat\t135750\n档案馆\t135751\n嫖\t135752\n2.1.3\t135753\n7.4V\t135754\n高大\t135755\n普隆\t135756\n汤子星\t135757\nParagraph\t135758\nlolid\t135759\nphoebe\t135760\n沈阳市委\t135761\n称王称霸\t135762\nneedles\t135763\n仙源\t135764\n圆方\t135765\nmentor\t135766\n同悲\t135767\n花蛤\t135768\n另\t135769\n复线\t135770\nCNAME\t135771\n西安物流公司\t135772\nBeyerdynamic\t135773\n司机群\t135774\n李宗泫\t135775\n万里无云\t135776\n复方聚乙二醇电解质散\t135777\nanimations\t135778\n森海塞尔\t135779\n园区管委会\t135780\n战地:叛逆连队2\t135781\n尼福尔海姆\t135782\nmicroservice\t135783\n东北亚\t135784\n国卡\t135785\n何叔衡\t135786\ntortoise\t135787\n软陶泥\t135788\n贞子\t135789\n惠而浦(中国)股份有限公司\t135790\n5所\t135791\n龙七\t135792\n偶氮染料\t135793\n腾讯信用分\t135794\nSal\t135795\n维C\t135796\n子宫脱垂\t135797\n叛逆\t13
5798\n酒碗\t135799\nKern\t135800\n41%\t135801\n化妆师\t135802\n7800元\t135803\n玻璃栈道\t135804\n施肥机\t135805\n棕榈酸\t135806\n助于\t135807\n歌詞\t135808\n关中人才网\t135809\n彭锋\t135810\n郑思肖\t135811\n18.6\t135812\n性腺\t135813\n双侧乳腺增生\t135814\n挨踢\t135815\n天诺时空\t135816\nbiochemical\t135817\nEDEM\t135818\n大法官\t135819\n五针松\t135820\n薮猫\t135821\n大益茶\t135822\n浙委办\t135823\n83平\t135824\nQantas\t135825\n岭南印象园\t135826\n李美琪\t135827\n关海山\t135828\n商业观察网\t135829\n劳动大厦\t135830\n火栓\t135831\n十二次\t135832\n天国的嫁衣\t135833\n非碳素墨水\t135834\n调制解调器\t135835\n税控开票软件金税盘版\t135836\n4100个\t135837\n优酷路由宝\t135838\n延迟性\t135839\n阿圭罗\t135840\n红古区\t135841\nmdg\t135842\n降血\t135843\n辛弃\t135844\nvivoactive3\t135845\n多少年\t135846\n理政\t135847\n东亚银行\t135848\nt恤\t135849\n忘川童子\t135850\n姨太\t135851\n枪头\t135852\n得逞\t135853\napiclient\t135854\n鸿冠\t135855\n〖\t135856\njub\t135857\nIsight\t135858\ndisposal\t135859\n幽暗\t135860\n广东保监局\t135861\n老用\t135862\n林田湖\t135863\n愚人\t135864\n中国网络电视台\t135865\n宝马5系论坛\t135866\n香吻\t135867\nFire\t135868\n新村路\t135869\n钠硫电池\t135870\nradiohead\t135871\n美咲佳奈\t135872\n对头\t135873\n04183\t135874\n多情剑客无情剑\t135875\napplicant\t135876\n598\t135877\n杜美慧\t135878\n白蜡树\t135879\n放胆\t135880\n驴友\t135881\n后胎\t135882\n笔迹\t135883\n12丨\t135884\nUnicom\t135885\n夜霜\t135886\n慕青\t135887\n钢化玻璃\t135888\n汤宝如\t135889\n海晟\t135890\n利星行\t135891\n更快乐\t135892\n耐时\t135893\n团体意外伤害保险\t135894\n北京社保\t135895\nHaunted\t135896\n宋辉\t135897\n木格\t135898\nniko\t135899\nwin9\t135900\n做法\t135901\nPC6安卓网\t135902\n被干\t135903\n肺结\t135904\n特雷西\t135905\n营\t135906\ncrossing\t135907\n创世神话\t135908\n武器栏\t135909\n快易理财网\t135910\nyylabel\t135911\n望海路\t135912\n樱花\t135913\ngnome\t135914\n满足感\t135915\nCg\t135916\n腰骨\t135917\nRachael\t135918\n肉宠文\t135919\n小米方盒子\t135920\nhongdada\t135921\nx11vnc\t135922\nfrench\t135923\n1us\t135924\n唔该\t135925\n刷水\t135926\n短信通\t135927\n内核源代码\t135928\n省人民医院\t135929\n国家癌症中心\t135930\n159号\t135931\n恶业\t135932\n国际贸易部\t135933\nlilith\t135934\n2018年1月26日\t135935\nprotobuffer\t135936\nzd\t135937\n北京北海公园\t135938\nboosting\t135939\nclsq\t135940\n亚麻籽粉\t1
35941\n佟刚\t135942\n借来\t135943\n空大\t135944\n梅江\t135945\n命悬\t135946\nGeneralized\t135947\n七旬\t135948\n成都理工大学工程技术学院\t135949\n米芾\t135950\nserving\t135951\nauth2\t135952\n座椅式\t135953\n8.21\t135954\n公明镇\t135955\n私立\t135956\n谢雷\t135957\n华洲\t135958\n小泰克\t135959\n杭州北站\t135960\n鼻渊\t135961\nkimberly\t135962\n九灯\t135963\nALSA\t135964\nOK6410\t135965\n欧也妮·葛朗台\t135966\n五四红旗团\t135967\nboke112导航\t135968\n烟厂\t135969\nprinters\t135970\n保鲜期\t135971\n枸杞菊花茶\t135972\n鱼雷\t135973\n重线\t135974\n六出\t135975\n牛津剑桥\t135976\n新世界出版社\t135977\nLiveReload\t135978\n开穴\t135979\n尊老爱幼\t135980\n天津大学建筑学院\t135981\n六十多岁\t135982\n波士顿咨询公司\t135983\n红云红河烟草(集团)有限责任公司\t135984\n杭州西\t135985\n聊城职业技术学院\t135986\n万古仙穹\t135987\n资产证券化论坛\t135988\nLiam\t135989\n营子\t135990\n金泽市\t135991\n澄海区\t135992\n龙飞凤舞\t135993\n电饭锅\t135994\n广州市幼儿园\t135995\n宇美广场舞\t135996\n霍巴特\t135997\nfanwen.wenku1.com\t135998\n赛增\t135999\n慢性炎\t136000\n恐怖在线\t136001\nBT种子搜索神器\t136002\n下半期\t136003\n星巴克咖啡\t136004\n瑞风\t136005\n婚嫁装修|一起网\t136006\n李云龙\t136007\n加赛\t136008\nAccess2007\t136009\n满江蜡烛人\t136010\n龙阳大道\t136011\n52平米\t136012\n切诺\t136013\n天龙网\t136014\n四十集\t136015\nRESPONSE\t136016\nSharley\t136017\n営\t136018\n乌云网\t136019\n套子目\t136020\n开窍\t136021\ngreasy\t136022\nSME\t136023\nstylus\t136024\n古月轩\t136025\n三十二周\t136026\n5份\t136027\n天能电池\t136028\n出警\t136029\n夏飞\t136030\n姊妹篇\t136031\n仙河镇\t136032\n向阳乡\t136033\n报录比\t136034\n弃船\t136035\nDomainba\t136036\n这一回\t136037\n性别比\t136038\n王树义\t136039\n岳普湖县\t136040\n板头\t136041\n重庆大厦\t136042\n镀锡板\t136043\ncatti\t136044\n1-5月份\t136045\nETAP\t136046\n复仇类\t136047\n绝对湿度\t136048\nRuler\t136049\n一个子\t136050\nr11\t136051\n阿拉伯海\t136052\nwin7激活工具\t136053\n这一代人\t136054\n王春禄\t136055\n爱普生r230\t136056\n滨江街道\t136057\n麻将\t136058\n三方存管\t136059\n残余应力\t136060\n1750年\t136061\nmeland\t136062\n建类文库\t136063\n辞职信\t136064\n送配\t136065\n自制奶茶\t136066\n超美\t136067\n问句\t136068\n陆姓\t136069\n衣库\t136070\n浙江省妇保医院\t136071\nAC9\t136072\nRonan\t136073\nGoLand\t136074\n合肥电视台\t136075\n中国电影博物馆\t136076\n沽名\t136077\n#8\t136078\nlibsasl\t136079\n滚落\t136080\n那年那兔\t1360
81\n吉林大学第二医院\t136082\n邓白氏\t136083\n50CM\t136084\n_敖汉政府网\t136085\n便装\t136086\n斗罗网\t136087\n宝路\t136088\n吸管杯\t136089\n雨萱\t136090\n绝招\t136091\n8812\t136092\n商标转让交易平台\t136093\n百度外卖\t136094\n虹湾\t136095\n刘爱力\t136096\n督导室\t136097\n布列塔尼\t136098\n机器铃\t136099\n静止不动\t136100\n母线槽\t136101\np2p理财公司\t136102\n咋子\t136103\n天玺湾\t136104\n口出\t136105\n神武手游\t136106\n大盾\t136107\n刘闯\t136108\nIvan\t136109\n好紧\t136110\n大盒\t136111\nExoPlayer\t136112\n依婷\t136113\n东站\t136114\n坡峰岭\t136115\n众位\t136116\n杨继绳\t136117\nhtmlparser\t136118\n收评\t136119\n纪文君\t136120\nzhainan\t136121\n月轮\t136122\n励志片\t136123\n心中心\t136124\n新客网\t136125\n祖马龙蓝风铃\t136126\nPSW\t136127\n出厂\t136128\n陷入僵局\t136129\n工台\t136130\n校作报告\t136131\n高频彩\t136132\n22221092\t136133\n汉初三杰\t136134\n电镀金\t136135\nsupply\t136136\n银龙花园\t136137\n老图\t136138\n黑沙\t136139\n服装史\t136140\n数据值\t136141\n二十文\t136142\n第2张\t136143\n1M\t136144\nJava算法\t136145\n爱的曝光\t136146\n航空资讯_火车网\t136147\n银盾\t136148\n易奇八字在线算命\t136149\n植物志\t136150\n美柚\t136151\n广泽\t136152\n单模\t136153\n驚爆\t136154\n终极\t136155\n内蒙古蒙牛乳业(集团)股份有限公司\t136156\n韩叙\t136157\npreminum\t136158\n中科招商\t136159\n欧根\t136160\n可可·香奈儿\t136161\n一声炮\t136162\n稿纸\t136163\nh2o2\t136164\n5432\t136165\n玩不起\t136166\n八十年代初\t136167\n北极村\t136168\n双洁\t136169\n陪聊\t136170\ndemo\t136171\n630KVA\t136172\n国际社区\t136173\n小动脉\t136174\n浮体\t136175\n杭甬\t136176\n千_\t136177\n佳德\t136178\n田灵儿\t136179\n東西\t136180\n方外\t136181\nkmean\t136182\n抗真菌药\t136183\n三国群英7\t136184\n章振邦\t136185\najaxupload\t136186\n汰渍\t136187\n汉口站\t136188\nLabo\t136189\ncni\t136190\n沙水\t136191\n今个\t136192\nwanese\t136193\n梵音\t136194\nAndroid编程\t136195\n飞旋\t136196\n头一次\t136197\n银收宝\t136198\n88.0\t136199\n西园街道\t136200\nrocco\t136201\n深圳市城市管理局\t136202\n铠装电缆\t136203\n0pp0\t136204\n水星无线路由器\t136205\ncontinental\t136206\n创优\t136207\n自愿者\t136208\ntensoflow\t136209\n劳尔·卡斯特罗\t136210\n\b\b\t136211\n恋爱铃\t136212\nMybatis批量\t136213\n惠商\t136214\n眼影盒\t136215\n低感\t136216\nHttpSession\t136217\n一时\t136218\n吉祥天宝\t136219\n几下\t136220\n西崽\t136221\n苜蓿园大街\t136222\n树梢\t136223\n音碟\t136224\nerm\t136225\n
北京)文化传播有限公司\t136226\n彩弹\t136227\n非模式\t136228\n夜叉王\t136229\n20140725\t136230\nkrkr\t136231\n金灶镇\t136232\n0.23\t136233\n猜拳\t136234\n新政镇\t136235\n.net4\t136236\nJK\t136237\n深圳市人民政府外事办公室\t136238\n动画版\t136239\n韩卫国\t136240\n纽扣电池\t136241\n短文\t136242\n一女\t136243\n跷\t136244\n过度疲劳\t136245\n上海工博会\t136246\n北京市政府\t136247\n高达0079\t136248\n00-00\t136249\n烘焙师\t136250\n松雅湖\t136251\n花椒油\t136252\nCapture\t136253\n预备会议\t136254\n东门大桥\t136255\n疾影\t136256\n龙德\t136257\n栏式\t136258\nopo\t136259\n需求感\t136260\n马兰\t136261\n小黄书\t136262\nOFC\t136263\n省思\t136264\n愧怍\t136265\nintroduction\t136266\nMarta\t136267\n几流\t136268\n赶走\t136269\nfrv\t136270\n你的生活\t136271\n珠绣\t136272\n皮纹\t136273\norchestra\t136274\n质化\t136275\nX3.2\t136276\n结晶果糖\t136277\n波利尼西亚\t136278\n华为荣耀8\t136279\n许霆案\t136280\n德国大众汽车\t136281\n墓派\t136282\n云溪\t136283\n镀层测厚仪\t136284\nMaple\t136285\n颐堤港\t136286\n揩\t136287\nperipheral\t136288\nI卷\t136289\n光流传感器\t136290\n数十年\t136291\n废轮胎\t136292\n市北区\t136293\n拿号\t136294\nMaxon\t136295\n讨论会\t136296\n共产房\t136297\non\t136298\nnavacat\t136299\n鹰犬\t136300\n银离子\t136301\n常熟人才网\t136302\n建筑内部装修设计防火规范\t136303\nHardwell\t136304\n公孙大娘\t136305\n文卓\t136306\nDinosaurs\t136307\n肖敏晔\t136308\n教育业\t136309\n樱花盛开\t136310\n半瓷\t136311\n恒天海龙\t136312\n1400公里\t136313\n_机战吧\t136314\n苦累\t136315\n口袋机\t136316\n4月17号\t136317\nRoof\t136318\n人案\t136319\n棕色\t136320\n无圣光\t136321\n借宿\t136322\nfwd\t136323\n总署\t136324\n利血平\t136325\n陈菊\t136326\n3月2\t136327\n户口\t136328\n崔健\t136329\n家纺展\t136330\n雪堂\t136331\n5263\t136332\n跨骑\t136333\n琴岛学院\t136334\n邓紫棋\t136335\n喜忧参半\t136336\n氚\t136337\n方寸山\t136338\n船业\t136339\n石楠花\t136340\nlocaldb\t136341\n儿童学画\t136342\n网易云\t136343\nshar\t136344\n毕业典\t136345\n街电\t136346\n河南广播电视台\t136347\n清炭\t136348\n网易充值中心\t136349\n瑜伽垫\t136350\n济南海关\t136351\n诺贝尔经济学奖\t136352\n义警\t136353\n讲价\t136354\n与式\t136355\n停工留薪期\t136356\n王永\t136357\n市子\t136358\n中华鲟\t136359\n眼鏡\t136360\n神好\t136361\n四十厘米\t136362\n中直机关\t136363\n三建\t136364\n哈希表\t136365\n造反派\t136366\n内边\t136367\nLOD\t136368\n因素法\t136369\nZTE\t136370\nFCP\t136371\n孙传芳\t136
372\n库曼\t136373\n熊逸\t136374\n华夏经纬网\t136375\n独裁\t136376\n恶魔之子\t136377\n全球天气网\t136378\n怀人\t136379\n酣睡\t136380\n小荧星\t136381\n门禁机\t136382\nZeroM\t136383\n4.0.5\t136384\n翼龙\t136385\nUpdates\t136386\n是图\t136387\n银禧科技\t136388\n东莞地铁1号线\t136389\n大势已去\t136390\nPoem\t136391\n中人\t136392\nguanyu\t136393\n甘肃省省\t136394\n宝马1系论坛\t136395\nQuarterly\t136396\nCRI\t136397\n蟹类\t136398\n23654318\t136399\nregulation\t136400\n五虎将后传2.9\t136401\n有源\t136402\n五氧化二磷\t136403\n黎塞\t136404\n塑件\t136405\n八大类\t136406\n番茄红素\t136407\n时间校对\t136408\n广西柳工机械股份有限公司\t136409\n房产网xsfc.com\t136410\n复方氨基酸\t136411\n口上\t136412\n运通\t136413\n太阳能路灯\t136414\n怀柔区\t136415\nPTSD\t136416\n马克思主义学院\t136417\n本杰明·富兰克林\t136418\n绍兴柯桥\t136419\n泽旺拉姆\t136420\n包分\t136421\n模压\t136422\n摆平\t136423\n学期末\t136424\n珠江湾\t136425\n域名资\t136426\n剑网2\t136427\n系统分析师\t136428\n新革命\t136429\n瘦长\t136430\n亿万僵尸\t136431\n874\t136432\n科莫多龙\t136433\n增值税处理\t136434\n新疆消防\t136435\nunsw\t136436\n北京银行手机银行\t136437\n三元桥\t136438\n山东美术出版社\t136439\n珠光\t136440\n中国教育出版网\t136441\nrxs\t136442\n华康俪\t136443\n杨石头\t136444\n铝道网\t136445\n翱\t136446\n新北区政府\t136447\nbmg\t136448\n中国建筑\t136449\n沙木\t136450\nSimpsons\t136451\nchopsticks\t136452\n黑骏马\t136453\n蔡国强\t136454\n并排放\t136455\n烛照\t136456\nucrt\t136457\n两个多小时\t136458\n言峰绮礼\t136459\n大仲马\t136460\n初唐四杰\t136461\n翡翠石\t136462\n月抛\t136463\n一KTV\t136464\nsql2008r2\t136465\n黄红云\t136466\n拉贝日记\t136467\nClinic\t136468\n折叠自行车\t136469\n广东省农业厅\t136470\n十版\t136471\n辋川\t136472\n密斯\t136473\nheritage\t136474\nwonderland\t136475\nClariS\t136476\nIntelliJ\t136477\n三洲\t136478\n李绍光\t136479\n南市\t136480\n纪念衫\t136481\n000893\t136482\n形婚\t136483\n新番日剧\t136484\n康养\t136485\naries\t136486\n何庆\t136487\n磷酸铁\t136488\n董先生\t136489\nsendmsg\t136490\n陈亦飞\t136491\n王实甫\t136492\n长春理工大学\t136493\n羽毛枫\t136494\n音道\t136495\n神州飞船\t136496\n革命化\t136497\n33章\t136498\n积碳\t136499\n601099\t136500\n恶有\t136501\ndeguo\t136502\nERS\t136503\nshipin\t136504\n燃油箱\t136505\n中泰乡\t136506\n3938\t136507\n扎头发\t136508\n35.5度\t136509\n捏造\t136510\n杉德哆啦云\t136511\n锁母\t136512\n重庆美蛙鱼头\t136513\n宋柯\t1
36514\n三鑫\t136515\n金昊\t136516\nfenqu\t136517\n长毛兔\t136518\n海悦天地\t136519\n对外服务\t136520\n不用来回\t136521\n王素君\t136522\naisle\t136523\n中国遗传学会\t136524\n谋划\t136525\n纯水\t136526\nzonghe\t136527\n钦佩\t136528\n新成龙历险记\t136529\n暂缺\t136530\n分月\t136531\n保不住\t136532\nPill\t136533\nGeoserver\t136534\n祖宅\t136535\n秘\t136536\niPhone6S\t136537\n王良四\t136538\n脱位\t136539\n21句\t136540\n北京签证中心\t136541\n牛津街\t136542\n大野\t136543\n七夕\t136544\n失恋阵线联盟\t136545\n太\t136546\n百样\t136547\n彭勇\t136548\n牛岛\t136549\n长白新村社区\t136550\nExe\t136551\nAmber\t136552\n架立筋\t136553\nIDG资本\t136554\nPatchwork\t136555\n惊爱\t136556\nchoice\t136557\n宁波银监局\t136558\n马臀皮\t136559\nLyn\t136560\n警民通\t136561\n水洞沟\t136562\n怡达\t136563\n江西省工商联\t136564\nE470C\t136565\n盐龙网\t136566\nreductase\t136567\n禅\t136568\nkew\t136569\n二千元\t136570\n离家\t136571\n安锋网\t136572\n极皇\t136573\n母狗\t136574\n果酸换肤\t136575\n军牌\t136576\n虫儿\t136577\n山西师范大学\t136578\n左邻\t136579\n高峰乡\t136580\nRub\t136581\n几棵\t136582\n桃片\t136583\n板条\t136584\n资江\t136585\n硬箱\t136586\n55集\t136587\n超凡喜力\t136588\n棱块\t136589\n新疆维吾尔自治区食品药品监督管理局\t136590\nhomer\t136591\n顾虑\t136592\n12.5.9\t136593\nQuality\t136594\n羽扇纶巾\t136595\n形近字\t136596\n求情\t136597\n县主\t136598\n工业建筑\t136599\nOffice2016简体中文\t136600\n一窗通\t136601\n颈肩\t136602\n千秋月\t136603\n污血\t136604\n机动师\t136605\nQQ管家\t136606\n吩\t136607\n兴化\t136608\n杏仁粉\t136609\nk2p\t136610\n奢香\t136611\nsignalling\t136612\n庆元县\t136613\n临床医学博士\t136614\n蝴蝶臀\t136615\n1619\t136616\n18款\t136617\n柬埔寨西哈努克港\t136618\n苏轶然\t136619\n庄市街道\t136620\n人在路途\t136621\n有线电视费\t136622\nCrawler\t136623\n楼梯间\t136624\n净化机\t136625\n梅田\t136626\nDoc\t136627\n25支\t136628\n时花\t136629\ncaramel\t136630\n瞬间\t136631\n碰撞体\t136632\n厦门医学院\t136633\n浴花谷\t136634\n压缩垃圾车\t136635\n鼎鼎\t136636\n张峥\t136637\n2.38\t136638\n喜欢一个人\t136639\n标\t136640\n同床异梦\t136641\n第一第二第三\t136642\n裁断\t136643\nVDSL\t136644\n中美撞机事件\t136645\n冰川网络\t136646\n理料\t136647\nDPA\t136648\n网奇公司\t136649\n耕升\t136650\nica\t136651\n扬州市统计局\t136652\n积月累\t136653\n加热管\t136654\n西藏药业\t136655\n致幻剂\t136656\n蓝本\t136657\nOSAKA\t136658\n摸身\t136659\n经济改革\t1
36660\n锂电池\t136661\nrece\t136662\n连江\t136663\nsinging\t136664\n卡刷包\t136665\n清单项\t136666\n毒经\t136667\n神圣石\t136668\n寄小读者\t136669\nwebrequest\t136670\nOutdoor\t136671\n整数倍\t136672\nDokuwiki\t136673\n金投贷款网\t136674\n新濠\t136675\n顾况\t136676\n1824\t136677\nhmmlearn\t136678\n深圳消防\t136679\nAIT\t136680\n硪\t136681\n11月6日\t136682\n中共开平市委市人民政府\t136683\n周晓峰\t136684\n爱他美\t136685\nCQC认证\t136686\n手刃\t136687\n大凉山\t136688\n源深体育中心\t136689\n奉化市\t136690\n隆胸\t136691\nfriend\t136692\nPEM\t136693\n异心\t136694\nbloodline\t136695\n1千多\t136696\n主从盘\t136697\n梦醒时分\t136698\nNeoTV\t136699\n年夜饭\t136700\n2017年1月份\t136701\n水鬼\t136702\n幽蓝\t136703\n哈威\t136704\nneurons\t136705\n截掉\t136706\n陕西省民政厅\t136707\n住院\t136708\nイメ\t136709\n嗟来之食\t136710\n杨建强\t136711\n环保证\t136712\n大妈们\t136713\n化学学\t136714\n包活\t136715\n签宝\t136716\n骨袋\t136717\n中国华信\t136718\n小南\t136719\n注册章\t136720\n汕头金山中学\t136721\n灌浆期\t136722\n拦路虎\t136723\n祖墓\t136724\n放味\t136725\n九阳帝尊\t136726\nwindows95\t136727\n利民路\t136728\nMiFlash\t136729\nS03\t136730\n法妞\t136731\n碎尸\t136732\n10um\t136733\n林海雪\t136734\nmarine\t136735\n举腿\t136736\n虾笼\t136737\n80章\t136738\nKOF98\t136739\n鄂竟平\t136740\n安徽省人民政府办公厅\t136741\n保险公估公司\t136742\n回事儿\t136743\n骑士网\t136744\nhan4216331@163.com邮箱\t136745\n黎雄才\t136746\n降一补\t136747\n西藏自治区工商行政管理局\t136748\n全影人才网\t136749\n热收缩膜包装机\t136750\n成都外国语学校\t136751\n鞑子\t136752\n座厕\t136753\n8163\t136754\n大醉侠\t136755\n谷歌访问助手\t136756\n108平米\t136757\n新约圣剑传说\t136758\n土钵菜\t136759\nVCR\t136760\n该站\t136761\nx+1\t136762\n存实亡\t136763\n可视\t136764\n6D\t136765\n五桥\t136766\nwww96saocom\t136767\n大连港\t136768\n李大神武\t136769\n好逼\t136770\n硒都\t136771\n广州长隆熊猫酒店\t136772\n女街\t136773\n开去\t136774\n超级大本\t136775\n花鸟岛\t136776\n画艺\t136777\n[允悲\t136778\n花露\t136779\nplsql数据库\t136780\n胡雯\t136781\njavascript代码库\t136782\natv\t136783\n快播电影网\t136784\n相权\t136785\n笔趣阁_爪机\t136786\n梦桐\t136787\ngif\t136788\n604路\t136789\nvegetable\t136790\n何家村\t136791\n高收入者\t136792\n两性人\t136793\n瑞海\t136794\n30多个\t136795\n风者\t136796\nvv7\t136797\n中介词\t136798\nImmersion\t136799\n王顺杰\t136800\n姬发式\t136801\n便利商店\
t136802\n22级\t136803\n天保基建\t136804\n北京307医院\t136805\n斗罗大陆3龙王传说\t136806\n缫丝\t136807\n汇师小学\t136808\n少年儿童\t136809\n逆命\t136810\n百年人寿保险股份有限公司\t136811\n希捷\t136812\n奇偶行\t136813\nv12.0\t136814\n湖光\t136815\ndone\t136816\nvfp6.0\t136817\n引咎\t136818\nInbox\t136819\n乐视公司\t136820\nlistbox\t136821\nmtab\t136822\nttest\t136823\n品牌认知度\t136824\n东营市人民医院\t136825\n脊椎侧弯\t136826\n金庸群侠传x\t136827\n神猪\t136828\nhurry\t136829\nmvc\t136830\n微票儿\t136831\n第二版\t136832\n1477\t136833\n很正常\t136834\n十五集\t136835\n缘由\t136836\n复利率\t136837\n子矜\t136838\n豆捞坊\t136839\n114家\t136840\n雅百特\t136841\n香砂养胃丸\t136842\n病危\t136843\nanri\t136844\nx5sl\t136845\n日机\t136846\n气动马达\t136847\n子望\t136848\n鼎石\t136849\n723\t136850\n小宣\t136851\nRutgers\t136852\nbor\t136853\n窗地\t136854\nRojas\t136855\n阿狄丽娜\t136856\n白头\t136857\n狼的诱惑\t136858\n滴灌管\t136859\n张景岳\t136860\nPOLA\t136861\ngui\t136862\n双溪\t136863\n仙阳\t136864\n中国科学院华南植物园\t136865\n永续盘存制\t136866\n马良\t136867\n秀夫\t136868\n工商局\t136869\n京医通卡\t136870\n话道\t136871\nna2co3\t136872\n苏州科技\t136873\nGIO\t136874\npro2\t136875\n藤田\t136876\n熊去氧胆酸胶囊\t136877\n深汕\t136878\nmacword\t136879\n瓤\t136880\n引线\t136881\n星海音乐学院\t136882\n萨克拉门托\t136883\n水妍\t136884\n碳纤维布\t136885\nMPEG4\t136886\nmsvcp\t136887\npercona\t136888\neaseui\t136889\n民航机\t136890\n游鸿明\t136891\n3S\t136892\n联想_ThinkPad|ThinkCentre|ThinkStation\t136893\nPayPal\t136894\nimageready\t136895\ndijit\t136896\n趾头\t136897\n卓伟\t136898\n代销\t136899\n留下\t136900\n雷鸟\t136901\n周刊少年JUMP\t136902\nMarica\t136903\n国浩律师事务所\t136904\n艾思奇\t136905\nIdioms\t136906\n世界中医药学会联合会\t136907\n兰戈\t136908\nqsys\t136909\n英利\t136910\nv13\t136911\nweiyun\t136912\n源端\t136913\n肠溶片\t136914\n0x02\t136915\n神谷哲史\t136916\nSata\t136917\ncot\t136918\n包治百病\t136919\n光驱位硬盘\t136920\n邢钢\t136921\n祛邪\t136922\n赵巍\t136923\n锌硒宝片\t136924\nWarranty\t136925\nMediBang\t136926\nPubMed\t136927\n20161113\t136928\nCrying\t136929\nsmaart\t136930\n市场局\t136931\n爱路护路\t136932\n公式版\t136933\n4a级\t136934\n凤凰大道\t136935\n华信路\t136936\ncAMP\t136937\n海月姬\t136938\n静怡\t136939\n半裙\t136940\nLOLS5\t136941\n赌命\t136942\
nnogamenolife\t136943\n小米公司\t136944\n白兔子\t136945\n100万吨\t136946\n李妍曦\t136947\n太平公主\t136948\n岁月风云\t136949\n中国海油\t136950\n深圳市罗湖区人民医院\t136951\n琉璃岛\t136952\n九二\t136953\n林少\t136954\n第6周\t136955\n福特全顺\t136956\n铆管\t136957\nグランブル\t136958\n莽荒\t136959\n董飞\t136960\n会计基础工作规范\t136961\n2009年12月\t136962\n专题研究\t136963\n毒素\t136964\n干扰器\t136965\n黑素\t136966\n时候车\t136967\n天悦城\t136968\n寰亚\t136969\nMODE\t136970\n多面\t136971\naco\t136972\n雷阵雨\t136973\n停步\t136974\nSphinx\t136975\n金江镇\t136976\n6124\t136977\n日语翻译__fanyi\t136978\n共产党第十九次全国代表大会\t136979\n嵩口\t136980\nChaos\t136981\nH55\t136982\nPCPOP泡泡网\t136983\n3.7.27\t136984\nセンタ\t136985\n压实系数\t136986\n塔楼\t136987\n世界脑力锦标赛\t136988\n1.9.3\t136989\n香格里拉酒店\t136990\n尸体\t136991\n乐毅论\t136992\nProven\t136993\n丁洋\t136994\n供电费\t136995\n兴工\t136996\n济州岛机场\t136997\n龙泉中学\t136998\n212路\t136999\n将近\t137000\n一十\t137001\n4.3.12\t137002\n牛金所\t137003\n中央委员会\t137004\n确权\t137005\nwuqi\t137006\n以色列人\t137007\n溯洄\t137008\n10月24日\t137009\n威志\t137010\nMUJI\t137011\n小牛顿\t137012\ninclusion\t137013\n朱海燕\t137014\n以北\t137015\n防呆\t137016\nevent\t137017\n证状\t137018\n商务通\t137019\n大鼎\t137020\n鹤唳华亭\t137021\n货码\t137022\n山东高速\t137023\n修练之路\t137024\n哈弗H6论坛\t137025\n去除\t137026\n爱你西蒙\t137027\n行首\t137028\n扶阳\t137029\n许婷\t137030\n今年六月\t137031\nQQ表情党\t137032\n20141015\t137033\n童乐\t137034\n阿何\t137035\n南山外国语学校\t137036\n病程\t137037\n宝兴县\t137038\n脑仁\t137039\n出没注意\t137040\n邬\t137041\n雅砻江\t137042\n五菱宏光\t137043\n龙门镇\t137044\ne语言\t137045\n桂林理工大学\t137046\n邹平房产网\t137047\nreduced\t137048\n专篇\t137049\n顺丁橡胶\t137050\n平针\t137051\n49秒\t137052\nv2.3_\t137053\n56.1688.com\t137054\n造兵\t137055\n尚诚志\t137056\n矢量化\t137057\n包王\t137058\nWord-ExcelHome技术论坛\t137059\n素来\t137060\n﹝\t137061\n佳佳\t137062\n收款人\t137063\nminify\t137064\n十年内\t137065\n大行折叠车\t137066\n诺夫哥罗德\t137067\n美丽的人\t137068\n手面\t137069\nareas\t137070\n深圳市卫生监督局\t137071\n光谷南\t137072\nlxxx\t137073\nuestudio\t137074\n获奖励\t137075\n竞业限制补偿金\t137076\ncollectors\t137077\n10%|\t137078\n鬼医狂妃\t137079\n光针\t137080\n马小桃\t137081\n混合痔\t137082\n溜冰毒\t137083\n电子产\t137084\n米5\
t137085\n并存\t137086\n步\t137087\nVOL.2\t137088\nSFC模拟器\t137089\n从者\t137090\n无锡宾馆\t137091\n北师大出版社\t137092\n五十回\t137093\nNOX\t137094\nOlsen\t137095\n自下而上\t137096\n楷书\t137097\n春游季\t137098\n麦克坞\t137099\n风热\t137100\n纳税信用管理办法\t137101\n双气\t137102\n58部\t137103\nColle\t137104\n岚裳\t137105\nInspirational\t137106\n广电新闻出版局\t137107\nnce\t137108\nng-show\t137109\n海格客车\t137110\n颜月溪\t137111\n0.59\t137112\n华山北站\t137113\n3款\t137114\nbuild.gradle\t137115\n教龄\t137116\nRSI\t137117\n宅们\t137118\n训练日\t137119\n雷某\t137120\n拆出\t137121\n凯南\t137122\n富奥股份\t137123\n路哥\t137124\n胡小林\t137125\nGOOSE\t137126\n俊宇\t137127\n独狼\t137128\nikon\t137129\n碧桂园时代城\t137130\n中骏蓝湾\t137131\n第63届\t137132\n植入式\t137133\n娱乐型\t137134\n贝瓦\t137135\n花水\t137136\nyeelink\t137137\nNewest\t137138\n澳海澜庭\t137139\nlll\t137140\noy\t137141\n数码管\t137142\nMultiple\t137143\n卤米松\t137144\n秀玲\t137145\n第一例\t137146\n鞍钢股份有限公司\t137147\n0883\t137148\n十年三月三十日\t137149\n毛里塔尼亚\t137150\n鲁大\t137151\nGDI+\t137152\n石凉\t137153\n东芭乐园\t137154\n三八红旗手\t137155\n巩膜炎\t137156\n和田玉籽料\t137157\n碧野\t137158\nG703\t137159\n汪伟\t137160\n实盘\t137161\n绿盟\t137162\n沙漠\t137163\nsonia\t137164\n张桂华\t137165\nLK\t137166\n混合式教学\t137167\n120克\t137168\nArcadia\t137169\n通元镇\t137170\n侨兴债\t137171\n礼泉县\t137172\n拌嘴\t137173\n习总书\t137174\nwindows2012\t137175\n南戴河\t137176\n【凤囚凰\t137177\n宗白华\t137178\n领展\t137179\n大话西游2修炼心\t137180\n氧化池\t137181\n山东省财政厅\t137182\nPorting\t137183\n中国科学院地球化学研究所\t137184\n深情演唱\t137185\n销率\t137186\n大忽悠\t137187\n一错\t137188\nidx\t137189\n铁科院\t137190\n连接词\t137191\n首充号\t137192\n青城山后山\t137193\nBUILD\t137194\n开放性\t137195\n东莞中学\t137196\n朝天钩\t137197\nUSG6000\t137198\n少奶奶\t137199\n300升\t137200\n生死对决\t137201\n锦城花园\t137202\nma1\t137203\nbin/sh\t137204\n丽贝岛\t137205\n埃德斯塔福德\t137206\n正标\t137207\n300kb\t137208\nstderr\t137209\n洗护\t137210\n5675\t137211\n前生今世\t137212\n答问\t137213\n快剑\t137214\n听醉\t137215\n波风\t137216\n鲍曼\t137217\n区直机关工委\t137218\n8.29\t137219\n新辉\t137220\nnook\t137221\n虎航\t137222\n暖树\t137223\n黑信\t137224\n斫琴\t137225\n最邮轮旅行网\t137226\n紫外激光打标机\t137227\ncnki翻译助手\t137228\nKRL\t
137229\n罗技G102\t137230\n肾上腺结节\t137231\n岘港\t137232\n津津有味\t137233\nrockets\t137234\n新谷\t137235\n云雾\t137236\ne-mail\t137237\nmevan\t137238\nEdgeMax\t137239\ncoffee瑞幸咖啡\t137240\n年纪\t137241\nElderly\t137242\n京那巴鲁\t137243\n跳针\t137244\n甩开\t137245\n下手\t137246\n402路\t137247\n彩虹路\t137248\n1.1亿元\t137249\n腱鞘炎\t137250\n骗子们\t137251\nTHINKPHP\t137252\n乔姆斯基\t137253\n张家港市政府网\t137254\n张惠春\t137255\n大宪章\t137256\n钟公庙\t137257\nText3\t137258\n不利\t137259\n双保险\t137260\n21P\t137261\n名世\t137262\n姬野尤里\t137263\n玉流馆\t137264\n方天\t137265\n留缝\t137266\n彩字\t137267\n末日孤舰吧\t137268\n架式\t137269\n全诗\t137270\n18.5寸\t137271\n梅州论坛\t137272\n2017-08-31\t137273\n行李箱\t137274\n院歌\t137275\n缕缕\t137276\n叮当\t137277\n送考\t137278\nmmse\t137279\n洋\t137280\n挤一挤\t137281\n大豆磷脂\t137282\nymz01\t137283\n20160130\t137284\n贼子\t137285\n玛纳斯\t137286\n18季\t137287\nLAYOUT\t137288\nOAUTH2\t137289\n东奥会\t137290\n王明玉\t137291\n第几任\t137292\n黄石国家矿山公园\t137293\n上海华山医院皮肤科\t137294\n民事案由\t137295\n爆燃\t137296\n水平型\t137297\nzyp高清\t137298\nPPT触发器\t137299\n翡翠城\t137300\nRichTextBox\t137301\n这场戏\t137302\n一寸金\t137303\n大盘股\t137304\n马军胜\t137305\n海国\t137306\n重明\t137307\n徐州云龙政府网\t137308\n龙王传说\t137309\n200天\t137310\n寺山修司\t137311\n大明帝国\t137312\nzlingh\t137313\n娘王\t137314\n液压离合器\t137315\n助跑\t137316\n世勋\t137317\n佰利\t137318\nisee\t137319\nlwip\t137320\n移徙\t137321\n火锅节\t137322\n谢楠王蒙\t137323\n钟落潭镇\t137324\nfmw\t137325\n道师\t137326\n铺成\t137327\n阿里国际\t137328\n三国之刃\t137329\n腹背受敌\t137330\n彼尔德\t137331\nsuppression\t137332\n福田\t137333\n火弓\t137334\nWicked\t137335\nReilly\t137336\nTrying\t137337\n宗教团体\t137338\n西安曲江新区管理委员会\t137339\nrfind\t137340\n上下级\t137341\n助手们\t137342\n象征\t137343\n4399创世联盟\t137344\n反比例\t137345\n宣示\t137346\n新挑战吧\t137347\n苍龙在天\t137348\n装车机\t137349\n两个世界\t137350\n11名\t137351\nsitemesh\t137352\n非独立悬架\t137353\nchatter\t137354\n自动型\t137355\n内蒙古大学研究生院\t137356\n硬着头皮\t137357\n沁入\t137358\nVariance\t137359\n链剂\t137360\n激烈\t137361\n新疆维吾尔自治区国家税务局\t137362\n053\t137363\n晚自习\t137364\n益智仁\t137365\nDLC3\t137366\nstruggle\t137367\n一汽夏利\t137368\n显像剂\t137369\ncandle\t137370\
nwsj\t137371\nhbr\t137372\nno.3\t137373\n咖位\t137374\n六年前\t137375\n海通国际\t137376\n桔梗网\t137377\n祝辞\t137378\n幸运券\t137379\n新游资讯_着迷网\t137380\n中断\t137381\n很神奇\t137382\n跳虫\t137383\n22首\t137384\nsaves\t137385\nL130\t137386\n盛路通信\t137387\n十八岁给我一个姑娘\t137388\ncanner\t137389\n水肿型\t137390\n债务性\t137391\ndfn\t137392\n西谷\t137393\n11天\t137394\n6000元\t137395\n仲恺\t137396\n宝石\t137397\n旋蒸\t137398\n2年后\t137399\n词句\t137400\n所有\t137401\n毒瘾\t137402\n井喷式\t137403\n双相不锈钢\t137404\n逆钟\t137405\n省医\t137406\n定压比热\t137407\n五十本\t137408\n免登\t137409\nPermanently\t137410\n27mm\t137411\n半甜型\t137412\n车件\t137413\n留守女人\t137414\n换挡拨片\t137415\n几日内\t137416\nhec\t137417\nXZs\t137418\nh0930\t137419\n艺员\t137420\n阿玛拉\t137421\n压车\t137422\n中酒\t137423\n新浪微博认证\t137424\n盘夫索夫\t137425\n松脂\t137426\n单色\t137427\n冲锋衣男\t137428\n厄休拉\t137429\nIncreasing\t137430\n黄头龟\t137431\n钟铉\t137432\nmanaged\t137433\n爱赚网\t137434\n红参\t137435\n洪教头\t137436\n新疆自治区\t137437\nCR7\t137438\niuoc\t137439\n谒陵\t137440\n拉德\t137441\n2268\t137442\nvichy\t137443\n安思远\t137444\nMINON\t137445\n坏朋友\t137446\n百兽王\t137447\ncajviewer阅读器\t137448\n北斗星x5\t137449\n藉此\t137450\nhl\t137451\n潘洛斯\t137452\n厦门市教育局\t137453\n王海霞\t137454\n万章\t137455\n热成型\t137456\n室内装饰画\t137457\n广州地铁13号线\t137458\nchenxj\t137459\n面版\t137460\nrailway\t137461\n一些个\t137462\n移动信号塔\t137463\n骆驼集团股份有限公司\t137464\n琅嬛福地\t137465\nPizza\t137466\n28天后\t137467\n坂口美穗\t137468\n穿墙套管\t137469\n张咏梅\t137470\n杜高\t137471\n内幕\t137472\n准则\t137473\nlash\t137474\n透密\t137475\n湘潭北站\t137476\n山长\t137477\np图软件\t137478\n呆鸡\t137479\n线工\t137480\n四舍五入\t137481\n西部航空\t137482\n2018年4月10日\t137483\n繁荣路\t137484\n备案类\t137485\n倪永孝\t137486\n琼中县\t137487\n第18批\t137488\n人才网\t137489\nCWM\t137490\nsba\t137491\n孔雀舞\t137492\nadvantages\t137493\n603\t137494\n方圆地产\t137495\nCareless\t137496\nregenerator\t137497\ncsonline\t137498\n拒变\t137499\n无衬\t137500\n自家\t137501\n斗牛牛\t137502\n智永\t137503\n时代天街\t137504\n广州交响乐团\t137505\n月检\t137506\n佳都\t137507\n齐名\t137508\n仙塘\t137509\n李壮平\t137510\n百度云1080P\t137511\n25吨\t137512\n快易花\t137513\nS90论坛_汽车之家论坛\t137514\n百年虚云\t13
7515\nRecuva\t137516\n大韩\t137517\n金山岭\t137518\n千人\t137519\nRapide\t137520\n水德\t137521\nnative2ascii\t137522\n六千年\t137523\n日本酒店\t137524\n数值化\t137525\nBinaries\t137526\n引得\t137527\n径向基\t137528\nfirstwingy\t137529\nAwe\t137530\n系泊\t137531\n中山大学深圳校区\t137532\n圳区\t137533\n星光镜\t137534\n金华百姓网\t137535\n安良\t137536\n杨勇\t137537\n木梳\t137538\n三相电机\t137539\n命运交响曲\t137540\n深泽县\t137541\n推诿\t137542\n幂律\t137543\n边间\t137544\n高山下的花环\t137545\n大径山\t137546\n慢镜\t137547\n第二节\t137548\n光电信息科学与工程专业\t137549\n嘉盛\t137550\n鸣樱\t137551\n大唐豪侠\t137552\n张美玲\t137553\nSEVER\t137554\n京基御景中央\t137555\n鸡\t137556\nXiaofeng\t137557\n严大\t137558\nBlanket\t137559\n100%\t137560\n章法\t137561\n淫传\t137562\n玻璃钢化粪池\t137563\n今年7月\t137564\n名侦探柯南事务所\t137565\n拳皇2002风云再起\t137566\n神尊\t137567\n增安型\t137568\nboat\t137569\njieshao\t137570\n大连市环境保护局\t137571\n疏通下水道\t137572\n双面\t137573\nJetBrains\t137574\n邢台市公安局\t137575\n跑盘\t137576\nword工具栏\t137577\ndecline\t137578\n且听\t137579\n水律蛇\t137580\n干路\t137581\n陈家坪\t137582\n武三思\t137583\n铁血读书\t137584\nDMA\t137585\n武汉市统计局\t137586\nedo\t137587\ntabar\t137588\n武里南联\t137589\n300W\t137590\nicc\t137591\n中华人民共\t137592\n盐酸帕罗西汀片\t137593\n破魔\t137594\n铝包木\t137595\n丹嫩\t137596\n代友\t137597\n9公里\t137598\nctb\t137599\n辛卯年\t137600\nj2me\t137601\n房前\t137602\nsmr\t137603\n赛思\t137604\n沁园小区\t137605\n杂役\t137606\n中富通\t137607\n钢带管\t137608\nⅴ\t137609\n磁盘\t137610\n9层\t137611\n道元\t137612\n友通\t137613\n宾得k1\t137614\n_数据库\t137615\n105sl\t137616\n冻产\t137617\n第几层\t137618\n铁离子\t137619\n岳以恩\t137620\n健康谷\t137621\nA83\t137622\n中国人民财产保险公司\t137623\n0.7mm\t137624\n布袋除尘器\t137625\n2017年9月12日\t137626\n成成\t137627\nAcceptance\t137628\n最高人民检察院公安部\t137629\n135号\t137630\n汇丰环球客户服务(广东)有限公司\t137631\nFateGrand\t137632\nWEKA\t137633\npsd格\t137634\n幽门螺旋杆菌\t137635\n辱母\t137636\n中国家具网\t137637\n光明城市\t137638\n吸走\t137639\n散光\t137640\n两江国际影视城\t137641\nad转换器\t137642\n同安生活网\t137643\n浙江经济职业技术学院\t137644\nduke\t137645\n爱游穿梭机\t137646\n航插\t137647\n以太坊ERC20\t137648\n把持\t137649\nGenerator\t137650\n野狐岭\t137651\n金钱包\t137652\n三国群英传3\t137653\n板蓝根\t137654\n正交基\t1376
55\n投标管理信息网\t137656\n20170509\t137657\n过去30年\t137658\nhome吧_\t137659\n莲花镇\t137660\n重载操作符\t137661\n山料\t137662\n关二爷\t137663\nost\t137664\n徐州东站\t137665\n4.8级\t137666\n元隆雅图\t137667\nloglog\t137668\n史杰鹏\t137669\n江南小区\t137670\n移动魔百盒cm101s\t137671\n25日\t137672\nhomo\t137673\n50亿美元\t137674\n王建伟\t137675\n友爱镇\t137676\nShipping\t137677\n冰凌\t137678\nDiode\t137679\n第1名\t137680\n10b\t137681\n厦门装修公司\t137682\n飞卢小说网\t137683\n祸患\t137684\n空中加油机\t137685\n朱慧\t137686\n求加\t137687\n濑由衣\t137688\n甘肃省法院\t137689\nspecialty\t137690\nwps\t137691\n阿尔法兽\t137692\n黑鹤\t137693\n预防针\t137694\n1&#46\t137695\n张继刚\t137696\n濂溪\t137697\n天津师范大学图书馆\t137698\nPrudential\t137699\n上海环贸广场\t137700\n环氧氯丙烷\t137701\n广厦\t137702\n小三通\t137703\n微油\t137704\n摘&#160\t137705\n济南府\t137706\n付款单\t137707\nA8-7650K\t137708\n漂白粉\t137709\n中国空空导弹研究院\t137710\n上汽\t137711\n天秤座\t137712\nFXCM\t137713\n北京通-养老助残卡\t137714\n人约\t137715\n伐木累\t137716\n剽窃\t137717\n80升\t137718\n好吊\t137719\ndfd\t137720\n和平街道\t137721\ndata\t137722\n百纳\t137723\n宫殿\t137724\nbonds\t137725\nenhancement\t137726\naj1\t137727\n疑似\t137728\n审税\t137729\nAcrobat\t137730\n望其项背\t137731\n医源\t137732\n高派\t137733\nEbony\t137734\nNSX\t137735\n落叶\t137736\n西北民族大学\t137737\n9米\t137738\n上海公证处\t137739\nWitch\t137740\n深圳交易所\t137741\n千淘万漉\t137742\n新中关\t137743\n23栋\t137744\nbt365\t137745\n华富村\t137746\n汉宠妻无度\t137747\n浦东政府网\t137748\n阿格拉\t137749\nrime\t137750\n7670\t137751\n万向集团公司\t137752\nphenomenon\t137753\n腾讯云服务器\t137754\n加回\t137755\n复音\t137756\nFaceID\t137757\njours\t137758\n窟\t137759\n3402\t137760\n雷克萨斯lx570\t137761\n波音787-9\t137762\n崇达技术\t137763\n盯住\t137764\n宿命\t137765\n中华V6\t137766\n大坝镇\t137767\n湾流\t137768\n20代\t137769\n覆盖层\t137770\n魅友\t137771\n海伦凯勒\t137772\n游牧人\t137773\nFSI\t137774\nphpStudy\t137775\nEGF\t137776\n式入\t137777\nArnold渲染器\t137778\n宽容度\t137779\n杨爽\t137780\nNSK\t137781\nwalsh\t137782\n女子性\t137783\n两拨\t137784\n失误\t137785\n星瑞\t137786\n李虎\t137787\n捧场\t137788\n龙堂寺士门\t137789\n精忠\t137790\n中华料理\t137791\n大气层\t137792\n首都机场集团\t137793\nfami\t137794\n融入\t137795\n禁摩令\t137796\n华电能源\t137797\n
家政服务行业\t137798\n有缝\t137799\n百度统计流量研究院\t137800\n26级\t137801\n真如寺\t137802\nDecision\t137803\n中式快餐\t137804\n赵计刚\t137805\nningxia\t137806\n室房\t137807\n風情\t137808\n东奥会计在线吧\t137809\n科建\t137810\n神话\t137811\n吉三代\t137812\n艾莉薇\t137813\n2018043期\t137814\n高岭石\t137815\nds2\t137816\n四风四气\t137817\n教练员证\t137818\n2.62\t137819\nコン\t137820\n系扣\t137821\n241号\t137822\n金府\t137823\n微团\t137824\n切磋\t137825\nCan\t137826\n鼻炎康片\t137827\n笔业\t137828\n鸽哨\t137829\n龙流\t137830\n广铁一中\t137831\n中国丽人网\t137832\n辰日\t137833\n曲江文旅\t137834\n局限\t137835\nqihoo\t137836\ncollagen\t137837\n泉州市区\t137838\n白道\t137839\n四糸\t137840\n优酷视\t137841\nshuyang\t137842\n国土安全第七季\t137843\n20.5\t137844\n小腹痛\t137845\n机柜\t137846\n环球嘉年华\t137847\n复数\t137848\n凯乐石\t137849\n郑志刚\t137850\nweblogic服务器\t137851\n子宫萎缩\t137852\n长寿路\t137853\n正姿\t137854\ninputfield\t137855\n绿柳\t137856\nuniversal\t137857\nimaging\t137858\n新热潮\t137859\n副会长\t137860\ndemonophobia\t137861\n05款\t137862\ntrime\t137863\n译头\t137864\n直销价\t137865\n联宝\t137866\n龙润茶\t137867\n太祖\t137868\n花湖\t137869\n单梁桥式起重机\t137870\n废铅\t137871\nmasa\t137872\nkanzi\t137873\nmousemove\t137874\n月赛\t137875\n第15届\t137876\n标识线\t137877\n#VALUE\t137878\n正阳门下\t137879\n校园文化节\t137880\n99健康网\t137881\n氧化石墨烯\t137882\n虎门公园\t137883\nOkHttpUtils\t137884\n丰田凯美瑞\t137885\n草坡\t137886\n指战员\t137887\nsaso\t137888\n不愈合\t137889\nanother\t137890\nopts\t137891\n3096天\t137892\n电焊证\t137893\n磁瓦\t137894\n赵梦玥\t137895\nshowgirl\t137896\n谱曲\t137897\n月亮山\t137898\n美国1号公路\t137899\n集成电路板\t137900\ndrivers\t137901\n嘉兴一中实验学校\t137902\n商一网\t137903\n坚固\t137904\n正三棱锥\t137905\n环境保护税法\t137906\n蒂凡尼的早餐\t137907\n法律案\t137908\n铁警\t137909\nexhibition\t137910\nInfectious\t137911\n3.41\t137912\n百分之几\t137913\n戴望舒\t137914\nzcool\t137915\n杨家才\t137916\n什么时\t137917\n再就\t137918\ns100\t137919\n乐客\t137920\n刘明湘\t137921\nX300\t137922\nREVOLVE\t137923\n骗徒\t137924\nInfineon\t137925\n新破天一剑\t137926\n大大小小\t137927\n绝美\t137928\n陈春兰\t137929\n处心积虑\t137930\n上海大使馆\t137931\n丹佛\t137932\n背景色\t137933\n长时间\t137934\nuBlock\t137935\nmaison\t137936\n坦言\t137937\nDirectories\t13
7938\n金葱粉\t137939\n西北工业大学理学院\t137940\n9K9K手游网\t137941\n几年来\t137942\n店标\t137943\n蓝牙适配器\t137944\n首辆\t137945\n第一击\t137946\n蚁视\t137947\n小洋人\t137948\n来之不易\t137949\n刘余莉\t137950\n拣货\t137951\n负筋\t137952\n藏界\t137953\n看我\t137954\nSweden\t137955\n无码\t137956\n卡莱尔\t137957\n806\t137958\nData\t137959\n雷纳\t137960\n下第\t137961\n连诗雅\t137962\nandroids\t137963\n农业部规划设计研究院\t137964\nPride\t137965\n768px\t137966\n刘春明\t137967\n月金\t137968\nuitableviewcell\t137969\nP8Z77\t137970\n入网\t137971\n智体\t137972\n顾名思义\t137973\npitbull\t137974\ncomplang\t137975\n附魔\t137976\nlaughter\t137977\nsingto\t137978\n妮露\t137979\n旗形\t137980\n接线柱\t137981\n课程\t137982\n鈴原エミリ\t137983\n连云港政府网\t137984\n物流巴巴\t137985\n泥沙\t137986\nAPPstore\t137987\n艾香\t137988\ncxchanpin\t137989\nimbalance\t137990\ncbz\t137991\n笔算乘法\t137992\n动作文\t137993\n听戏\t137994\n蛊惑人心\t137995\nCognac\t137996\n石室中学\t137997\n灵巧\t137998\n短篇辣文\t137999\n紫外线杀菌器\t138000\n425分\t138001\n求证实\t138002\nvivox21_vivox21\t138003\narcgis\t138004\n腾邦国际\t138005\n我是猫\t138006\n遗患\t138007\n道旗\t138008\nmiix5\t138009\n同桌的你\t138010\n希腊语\t138011\n直树\t138012\n2梯\t138013\nDMIS\t138014\nThrill\t138015\n方正小标宋简体字体下载|方正小标宋\t138016\nAllie\t138017\nhaose0\t138018\n路票\t138019\nC106\t138020\n20170505\t138021\n肥肠\t138022\n华硕RT-AC68U\t138023\n有借\t138024\n祈年殿\t138025\n晴心\t138026\n人棍\t138027\n筑志红\t138028\n釉彩\t138029\n微小宝公众号\t138030\n北京市人才服务中心\t138031\n7月4日\t138032\n女招\t138033\n第五十六\t138034\nMathcad\t138035\nhaystack\t138036\npeixun\t138037\n吉品\t138038\n300KW\t138039\nV3.10\t138040\n60章\t138041\n劳动保障网\t138042\n海上田园\t138043\nRD450\t138044\n尼美\t138045\n天景山\t138046\n魏文侯\t138047\n莎莲娜\t138048\n专职安全员\t138049\n大唐贞观\t138050\n嘉达\t138051\n送还\t138052\n芳文社\t138053\n黑猪\t138054\n惠妃\t138055\n郑丹瑞\t138056\nhlb\t138057\n南昌地铁5号线\t138058\n云奇\t138059\nUCSB\t138060\n零一\t138061\n1:16\t138062\nC罗梅西\t138063\nCLAMP\t138064\n中江国际\t138065\n五一劳动\t138066\n总馆\t138067\n南水北调东线\t138068\n锁水\t138069\n透气性\t138070\n无语\t138071\n中央办公厅\t138072\n联程\t138073\n共聚\t138074\n二十三条\t138075\n关照\t138076\ncooke\t138077\n金链子\t138078\n名次\t138079\n
氧层\t138080\niso20000\t138081\n0758\t138082\ngay吧\t138083\n烟台日报传媒集团\t138084\n营级\t138085\n甘肃省科学技术厅\t138086\n108篇\t138087\n川子\t138088\n版官\t138089\n八哥\t138090\n糠酸莫米松乳膏\t138091\n南乡子\t138092\nORA-01031\t138093\n尾迹\t138094\n008号\t138095\n绝地求生刺激战场PC端\t138096\n2017年09月\t138097\n风宇冲\t138098\ntoc\t138099\n726\t138100\nZu\t138101\n聂隐娘\t138102\n芩\t138103\n35升\t138104\n西部矿业\t138105\n塾\t138106\n子规\t138107\n影武者\t138108\n笋瓜\t138109\n保税港区\t138110\n嘉寓股份\t138111\nMEXICAN\t138112\n全盟\t138113\n社区版\t138114\n研判\t138115\n五吨\t138116\nIIIFF\t138117\n顶秀\t138118\n山东话\t138119\n好居网\t138120\n光武帝\t138121\n甘肃经济信息网\t138122\n江苏今世缘酒业股份有限公司\t138123\ncnet\t138124\n代持人\t138125\n银桂\t138126\n大庆西\t138127\n浙江省第一医院\t138128\n门到门\t138129\n小升初电脑派位\t138130\n木鱼\t138131\n隆子县\t138132\n现金流量表间接法\t138133\n4A广告公司\t138134\n嘉积镇\t138135\n莱西信息港\t138136\n导尿术\t138137\n亿恩\t138138\nEV200\t138139\n旁氏\t138140\n首开\t138141\nNomad\t138142\n德里达\t138143\n曼恒\t138144\n南通安港\t138145\n_一休论坛\t138146\nKung\t138147\n普洱生茶\t138148\n活字\t138149\n乐声\t138150\n巴拉望\t138151\n彩乐乐\t138152\n迅雷BT种子超高清\t138153\n露肩\t138154\n梦幻西游单机版\t138155\n浮顶\t138156\n二氯苯酚\t138157\n隔片\t138158\n刘公岛\t138159\n土家菜\t138160\n106元\t138161\nSH-HK\t138162\n菌液\t138163\n舜耕路\t138164\n教化\t138165\n射雕英雄传之铁血丹心\t138166\nKD9\t138167\ntort\t138168\nrownumber\t138169\n开垦\t138170\n白桦林居\t138171\n立新\t138172\n磨人\t138173\n一事\t138174\n深藏不露\t138175\n湖南省监狱管理局\t138176\n广州社区\t138177\n骨碎补\t138178\nDeception\t138179\nzucc\t138180\nハ\t138181\n下摆\t138182\n小萱\t138183\n好爱\t138184\n杭州老板电器股份有限公司\t138185\n行政许可权\t138186\n二胖\t138187\n掐住\t138188\n25.7\t138189\n中国珠宝玉石首饰行业协会\t138190\n沪深股市\t138191\n医技\t138192\n光环师\t138193\n安铺\t138194\n北城\t138195\n光量\t138196\n甘家口街道\t138197\n巡逻机\t138198\n荣耀畅玩5A\t138199\n西南街道\t138200\n红土地\t138201\n圆满完成\t138202\n弃疾\t138203\n上海南\t138204\n金融商报网\t138205\n古战\t138206\nfiat\t138207\n5月6月\t138208\n万兆以太网\t138209\n高铁酸钠\t138210\n赤焰\t138211\n张公子\t138212\n小夫人\t138213\n居多\t138214\n鳕\t138215\n双端\t138216\n逆反心理\t138217\n首问责任制\t138218\ngoldendict\t138219\n迦文\t138220\n访谈录\t138221\n遗传学\t138222\nQIAGEN\t1382
23\n锦城街道\t138224\n金麟府\t138225\n高压泵\t138226\n化妆棉\t138227\n大阪市\t138228\n暂存盘\t138229\nDISCUZ\t138230\n五十家\t138231\n130平\t138232\n弹吉他\t138233\n北航图书馆\t138234\n阎良\t138235\n君威道奇公羊\t138236\n泰瑞昂\t138237\nDistro\t138238\n干冰机\t138239\ng941\t138240\n笔砚\t138241\n最后一次演讲\t138242\n边长度\t138243\n多长\t138244\nDAT文件\t138245\n气雾罐\t138246\n煤耗\t138247\nExcelHome学院\t138248\n社会工\t138249\n旧酒\t138250\n高策\t138251\n4g\t138252\nPH电极\t138253\n3Dtouch\t138254\nSettlement\t138255\ncycle\t138256\n越多\t138257\n洞庭山\t138258\n银发族\t138259\n弑君者\t138260\n户外广告\t138261\n转版\t138262\n本田佳御\t138263\n广播员\t138264\n水力旋流器\t138265\n延展\t138266\n心理中心\t138267\nYo\t138268\n倍投\t138269\n寒霜朋克/冰汽时代\t138270\n洋白铜\t138271\n王守仁\t138272\n星尘传说\t138273\n方鹏\t138274\n杂文集\t138275\n快乐的节日\t138276\nforgot\t138277\n冯林\t138278\nBLAS\t138279\n光猫桥接无线路由器\t138280\n谜案\t138281\n2.8_\t138282\n外卖_酒店\t138283\n深沪镇\t138284\n思连康\t138285\nminisite\t138286\n暴行\t138287\n新画\t138288\nzmodem\t138289\n海翔药业\t138290\n十九大回声|\t138291\n彻悟\t138292\nAlba\t138293\n陈蒙\t138294\nEnvironments\t138295\n对值\t138296\n振新\t138297\n明眼人\t138298\nDPK750\t138299\n投资银行学\t138300\n边防证\t138301\n娱评\t138302\n种豆得\t138303\n中央委员\t138304\ncarprog\t138305\nAtmel\t138306\n剑网3七秀\t138307\n7科\t138308\n边坡\t138309\n光年\t138310\n教师职业道德\t138311\nTCPDF\t138312\n缔途\t138313\n有目共睹\t138314\n被撞\t138315\nFORM\t138316\n疟原虫\t138317\n量体师\t138318\n中华人民共和国药品管理法\t138319\nboolean\t138320\n白鸡\t138321\nKEL\t138322\n920M\t138323\n如玉\t138324\n躺下\t138325\nzuqiu\t138326\n王老撸\t138327\nvmm\t138328\n神纳花\t138329\n黑种\t138330\n发泡剂\t138331\n易飞\t138332\n枢纽港\t138333\n胎膜\t138334\n帕丁顿熊\t138335\n2ND\t138336\n傅育宁\t138337\n供销合作社\t138338\n榄核\t138339\n金龙奖\t138340\n工贸公司\t138341\n新都国际\t138342\n海底轮\t138343\n前清\t138344\n融卷\t138345\n9298\t138346\npardon\t138347\n危爆\t138348\n讚\t138349\nexpenses\t138350\n复压\t138351\ngtp谱\t138352\npetrochina\t138353\n办案\t138354\nhahaha\t138355\n高盛\t138356\nappsflyer\t138357\n爱卫会\t138358\n声广\t138359\n烘房\t138360\n社保代理公司\t138361\n7月22日\t138362\n笔触\t138363\n吉祥号\t138364\n面包板\t138365\n炼气\t138366\n陶片\t138367\n以太坊智能\t1
38368\n万重\t138369\n路一鸣\t138370\nè\t138371\n高教出版社\t138372\n13.3.0\t138373\n90YU\t138374\n自动灭火系统\t138375\nTwenty\t138376\n印品\t138377\n灰姑娘\t138378\na7m3\t138379\n路桥公司\t138380\n报警仪\t138381\nDEL\t138382\n股东会\t138383\n锡焊\t138384\n龙山县\t138385\n感召力\t138386\n八村\t138387\n拼座\t138388\n张展\t138389\n环京\t138390\n长翅\t138391\nmfe\t138392\n北京卫视养生堂\t138393\n超能系\t138394\n四条腿\t138395\n凤阳新闻网\t138396\n海阁\t138397\nsnoop\t138398\n3梯\t138399\n2.5寸\t138400\n华波波\t138401\n心悦君兮君\t138402\n和文\t138403\n元贝驾考2018\t138404\n重庆新闻网\t138405\n急性肠炎\t138406\nSAR\t138407\n叶楠\t138408\ncarlo\t138409\n烟灰缸\t138410\n徐司白\t138411\n朗琴\t138412\nreportng\t138413\nSTM32CubeMX\t138414\nチャ\t138415\n天府银行\t138416\nbananas\t138417\n刘姐\t138418\n20161220\t138419\n百合动漫\t138420\n青岛市市\t138421\n中国保健协会\t138422\n郭俊\t138423\n17季\t138424\n据统计\t138425\n洪恩\t138426\nigh\t138427\n精灵公主妮娜\t138428\n德富\t138429\n小米5论坛\t138430\n第4话\t138431\n朴秀珍\t138432\n骗财骗色\t138433\n高品质\t138434\n备降\t138435\n汐音社\t138436\n画架\t138437\n旋\t138438\n吴坤\t138439\n塑钢板\t138440\n新生儿期\t138441\nwebApi\t138442\nドラゴンボ\t138443\n燕窝\t138444\n1亩\t138445\n双阙\t138446\n刘云山\t138447\n交易部\t138448\n晶杰\t138449\n俱欢颜\t138450\n天脊乾坤剑\t138451\nTYP\t138452\n清博\t138453\n反复性\t138454\n周智夫\t138455\n红单\t138456\n山坳\t138457\n社保断缴\t138458\n黑雪姬\t138459\n9926\t138460\n才\t138461\n反馈单\t138462\nmt750\t138463\n盈佳\t138464\n教评\t138465\n圣皇\t138466\n4415u\t138467\n王利华\t138468\n晨会\t138469\n修志\t138470\n刚一\t138471\n修炼之路\t138472\nE舞\t138473\n7300u\t138474\n国有资产\t138475\n5.9.2\t138476\n透明罩\t138477\n磁州\t138478\n道路机动车辆\t138479\nkuler\t138480\n方队\t138481\n粘度计\t138482\nJChere\t138483\n5.7.10\t138484\n代購\t138485\n90页\t138486\nLCC\t138487\n8双\t138488\n河北农大\t138489\n战场女武神\t138490\nlancer\t138491\n参用\t138492\n格勒诺布尔\t138493\nsusun\t138494\nruntime\t138495\n桌酷\t138496\ndescribes\t138497\n阿美\t138498\n提督\t138499\n多边型\t138500\n岛田庄司\t138501\n玉米面粥\t138502\ncmf\t138503\n迅雷迷\t138504\n299元\t138505\ncollectd\t138506\n人妇\t138507\n夏村\t138508\n手机信号屏蔽器\t138509\n宾果消消乐吧\t138510\n两步\t138511\npresented\t138512\n中山村\t138513\nmisty\t138514\n龙建股份
\t138515\n麦包包\t138516\n五味子\t138517\napex\t138518\n记忆之门\t138519\n咱老百姓\t138520\n王家镇\t138521\n80后吧\t138522\n榆次一中\t138523\n书体\t138524\n睡不着觉\t138525\n图形\t138526\n蝶泳\t138527\n钱财\t138528\n宣传部\t138529\nVolume\t138530\n2018年5月3日\t138531\n格曼\t138532\n龙珠tv\t138533\nf11\t138534\n影谱\t138535\n天姥\t138536\n顶礼\t138537\neros\t138538\n数据云\t138539\n20份\t138540\nwdr7400\t138541\n乐龄游老年旅游网\t138542\n苍南\t138543\n改道\t138544\n中南财经\t138545\n2.5.5\t138546\n双向选择\t138547\n包装管\t138548\n人身保险\t138549\n洛阳市中心医院\t138550\n烈日灼心\t138551\n2.5天\t138552\nRepresentative\t138553\n建议\t138554\n190X\t138555\n市水利局\t138556\n七宝万科广场\t138557\n昌赣\t138558\n陈炜\t138559\n下怀\t138560\n武体\t138561\n轴流风机\t138562\n邬倩倩\t138563\nCOM口\t138564\n滴滴青桔\t138565\n恒宝股份\t138566\n中华字经\t138567\n在哪儿\t138568\n电流型\t138569\n营养餐\t138570\n30号\t138571\n普审\t138572\n没问题\t138573\n赫卡里姆\t138574\n陈枝记\t138575\nJunior\t138576\nSolr\t138577\n书卷气\t138578\n大本\t138579\n桩基钢筋笼\t138580\n誉诚\t138581\n78度\t138582\n靠山\t138583\nCFCA\t138584\n续流\t138585\ncubi\t138586\n郑树森\t138587\nt66y.com\t138588\n金桥国际\t138589\n金银岛\t138590\ncruz\t138591\n黑暗料理王吧\t138592\nwalks\t138593\n高建\t138594\n微软极客\t138595\n苏若云严\t138596\nsecurity_check\t138597\n国家企业技术中心\t138598\n梁套\t138599\n济广高速\t138600\n谷歌邮箱\t138601\n三星c9\t138602\n毛巾杆\t138603\n每瓶\t138604\n头套\t138605\n中岳\t138606\n右岸\t138607\n周龙\t138608\n海商电商网\t138609\n金地梅陇镇\t138610\n仁恒滨河湾\t138611\n百战天下\t138612\n转系\t138613\n哈尔滨仲裁委员会\t138614\n沙漠绿洲\t138615\n激辩\t138616\nCASIO计算器\t138617\n登高车\t138618\nDynamo\t138619\n日本海\t138620\n突围赛\t138621\n毅行\t138622\n高露洁牙膏\t138623\nGiulia\t138624\n三氧\t138625\npro6\t138626\n一万平米\t138627\nooo\t138628\ngreater\t138629\n十万级\t138630\n20K\t138631\n荞面\t138632\n1.6手动\t138633\n贵港市政府\t138634\n名助\t138635\n福星股份\t138636\n两路口\t138637\n發售\t138638\n胰酶\t138639\nYang志军\t138640\nurlretrieve\t138641\n原品\t138642\n商务印书馆\t138643\n20160924\t138644\nThereis\t138645\n女士\t138646\n笼式\t138647\n内袋\t138648\nWINTER\t138649\n针床\t138650\n逛爷\t138651\nLever\t138652\nWOT-空中网\t138653\n沙头角\t138654\nOKevin\t138655\n20151210\t138656\n30枚\t138657\n倒空\t13865
8\n图音\t138659\n15粒\t138660\n行动计划\t138661\n鸟儿博客\t138662\n中国考研网\t138663\n没戏\t138664\n房产讯】f.cx\t138665\ntimely\t138666\n乐讯\t138667\n包利民\t138668\n苯类\t138669\n王女士\t138670\n夏桑菊\t138671\nSS\t138672\n分泌物\t138673\n东风41\t138674\n森米\t138675\n欧化\t138676\nHTTP请求\t138677\n纳杰\t138678\n不动工\t138679\niteritems\t138680\n405\t138681\n逸夫楼\t138682\n数据透视表\t138683\n回族\t138684\n荆州市第一人民医院\t138685\n倾\t138686\n电动折叠\t138687\n关于规范金融机构资产管理业务的指导意见\t138688\n谯城\t138689\nHigginCui\t138690\n东莞分公司\t138691\n焓差\t138692\n宜昌东站\t138693\n伊豆半岛\t138694\n精矿\t138695\n10多万元\t138696\n胜道\t138697\n齐家\t138698\n微擎\t138699\nngrx\t138700\nXE3\t138701\n织田\t138702\n东方资产管理公司\t138703\n扎克\t138704\n我的九寨\t138705\n南宁海关\t138706\n20160509\t138707\n杀夫\t138708\n萌战吧\t138709\n就通\t138710\n呆哥\t138711\n润洁\t138712\n三国全面战争\t138713\nEmissions\t138714\n式微\t138715\n天角\t138716\n八爪\t138717\n前半生\t138718\n九十分钟\t138719\n密胺\t138720\n雷速\t138721\n扭送\t138722\n泥淖\t138723\n浙商银行股份有限公司\t138724\n职来职\t138725\n传错\t138726\n加拿大皇家银行\t138727\n旧时光\t138728\n基神\t138729\n2016年4月10日\t138730\n美子\t138731\n委托代理理论\t138732\n监察局\t138733\n掉率\t138734\noppp\t138735\n新疆建设兵团\t138736\nweiwei\t138737\nEstimates\t138738\n20150604\t138739\n云基地\t138740\n金龟子\t138741\n势不两立\t138742\ndp接口\t138743\n水波炉\t138744\n帮唱\t138745\n一线\t138746\n腹膜透析液\t138747\n123.cn\t138748\n天奕\t138749\n阴曹\t138750\n电传\t138751\n缓降器\t138752\n辽宁科技大学\t138753\n台前幕后\t138754\n吴芳\t138755\n富爸爸\t138756\n海驾\t138757\n打斗\t138758\ninterface{}\t138759\n种多肉\t138760\n乡村医生\t138761\n芙蓉石\t138762\n东朝\t138763\nylqx\t138764\n33map.net\t138765\n分期贷款\t138766\n文物出版社\t138767\n新东方托福\t138768\n鲁巷\t138769\n上海市海洋局\t138770\n塔拉多\t138771\n云熙府\t138772\n遮瑕液\t138773\n帐户\t138774\n科普柯\t138775\n故天\t138776\n中分节符\t138777\n鹰嘴豆\t138778\n淘鹊桥\t138779\n李成\t138780\n鉴别\t138781\n上海轨道交通俱乐部\t138782\n无瑕\t138783\n东泽\t138784\n变频空调\t138785\n6份\t138786\n尼康D5100\t138787\n丘丘\t138788\n50000分\t138789\n带参\t138790\n2017秋\t138791\n出版权\t138792\n河南理工大学\t138793\n500级\t138794\n1.88G\t138795\n5023\t138796\npytorch中文网\t138797\n沈昌珉\t138798\n中间型\t138799\n6列\t138800\n竹西\t138801\nmagazi
ne\t138802\n刈\t138803\nhan\t138804\nplate\t138805\n电脑室\t138806\ntwilio\t138807\n3054\t138808\n51页\t138809\n桌\t138810\n妇科\t138811\n营卫\t138812\n举报者\t138813\n太极股份\t138814\n腹外疝\t138815\n成都工业职业技术学院\t138816\n唐小僧\t138817\n体无完肤\t138818\n楚怀王\t138819\n莎安娜\t138820\n划账\t138821\n犯罪都市\t138822\n麦考瑞大学\t138823\n搬砖\t138824\nCPA\t138825\n文丘里流量计\t138826\nroic\t138827\n泌乳素\t138828\n乱世王者\t138829\nsource\t138830\n清华东路\t138831\nkav\t138832\n感人\t138833\nlapack\t138834\n基弧\t138835\n采茶舞\t138836\nqiong\t138837\n挂载\t138838\n金万维动态域名\t138839\n零售行业\t138840\n王战军\t138841\n第96届\t138842\n英国央行\t138843\n管辖权\t138844\nHon\t138845\n大鹏金翅鸟\t138846\n水环境容量\t138847\n亚当·斯密\t138848\nSAV\t138849\n大和号\t138850\n半程马拉松\t138851\n荣耀雅典娜\t138852\n婚照\t138853\n情怀\t138854\nPleasure\t138855\n道琼斯指数\t138856\n太阳号\t138857\n跨境支付\t138858\n浦发银行信用卡\t138859\n纵差\t138860\n番红花城\t138861\n05式\t138862\n封夕\t138863\n纱管\t138864\n预应力\t138865\n慢递\t138866\n平翘舌\t138867\n荒岛求生\t138868\n中国五大发电集团\t138869\n曹勇\t138870\nHDRI\t138871\n通惠家园\t138872\nti\t138873\n魔幻现实主义\t138874\n中控屏\t138875\n沈璐\t138876\nBuffers\t138877\n直刀\t138878\n人犬\t138879\nNBA2K\t138880\n实验型\t138881\n忆先贤\t138882\n多嘎多耶\t138883\n亚特兰蒂斯酒店\t138884\n森山大道\t138885\n支付宝钱包\t138886\nbuiltin_function_or_method\t138887\n梨花压海棠\t138888\nnbn\t138889\nokb\t138890\n物哀\t138891\nbespoke\t138892\n工业集聚区\t138893\nMikami\t138894\n名古屋机场\t138895\n7.1.6\t138896\n16949\t138897\n安道尔\t138898\n南方报业传媒集团\t138899\n食言\t138900\nGearTrax\t138901\n社会生活\t138902\n百分how\t138903\n藤门\t138904\n被免职\t138905\n川字掌\t138906\n圆月弯刀\t138907\narrayList\t138908\n百分之十七\t138909\nRoad\t138910\n杨清华\t138911\n新疆农业信息网\t138912\n七星彩走势图\t138913\n海马区\t138914\n2560\t138915\n天医\t138916\n尚科\t138917\n陈设\t138918\nHausman\t138919\n形神\t138920\n上海自贸试验区\t138921\n商务礼仪\t138922\n郑辉\t138923\n3.0.7\t138924\n青青草在线\t138925\n隆基泰和\t138926\n福玛特\t138927\n哈尼炮\t138928\n其他页\t138929\n特快\t138930\n广丰区\t138931\n中华人民共和国反间谍法\t138932\n下渚湖\t138933\n饱览\t138934\n蓝壳\t138935\n瓶中\t138936\n3668\t138937\n林业\t138938\nDrosophila\t138939\n尼甘布\t138940\n母情节\t138941\n杭州市小学\t138942\nUnicode编码转换\t
138943\n幸福空间\t138944\n2期\t138945\n不卑不亢\t138946\n海陵区\t138947\n点对\t138948\nAppszoom\t138949\n卓著\t138950\n德清新市\t138951\nKG316T\t138952\n桥柱\t138953\nLogstash\t138954\nKink\t138955\n儿子们\t138956\n一季\t138957\n保级\t138958\nTerror\t138959\nWIDI\t138960\nwinusb\t138961\n拉姆\t138962\n超级蹉跎\t138963\nopud\t138964\n长岛冰茶\t138965\n吉安\t138966\n第十三部\t138967\necology\t138968\n091诡\t138969\n威海路\t138970\nKIRARI\t138971\n四遍\t138972\nars\t138973\n数字谜\t138974\n达科\t138975\n4星酒店\t138976\n北控水务集团有限公司\t138977\n权责发生制\t138978\n淘手游\t138979\n著\t138980\n2018年2月2日\t138981\n长期股权投资权益法\t138982\np.o.s\t138983\n新市古镇\t138984\n博客网\t138985\n第1章\t138986\n大腰\t138987\n乐购\t138988\n乌兰牧骑\t138989\n新倚天屠龙记\t138990\n欲望山庄\t138991\n浮图塔\t138992\n足光\t138993\n李连杰\t138994\n中古汉语\t138995\n呛人\t138996\nfck\t138997\n2角\t138998\n登幽州台歌\t138999\n一盅\t139000\n狼夫\t139001\n蜜罐\t139002\n北体\t139003\n天际\t139004\n精兵简政\t139005\n德威花园城\t139006\nInferno\t139007\n卡内基\t139008\n洪泽湖\t139009\n曲礼\t139010\n贺林\t139011\n一.1\t139012\n现实主义者\t139013\n千针\t139014\n2-3天\t139015\n水杯子\t139016\nobtaining\t139017\n始于足下\t139018\n市政厅\t139019\n2敏\t139020\n天平\t139021\n介意\t139022\n满清十大酷刑\t139023\n天苍苍\t139024\nhd3000\t139025\n国才\t139026\n风流成性\t139027\n第四个\t139028\n建发股份\t139029\n血河\t139030\n5.5亿\t139031\n南京江宁\t139032\nNsight\t139033\nsge\t139034\n距今\t139035\nhds\t139036\n国家市场监督管理局\t139037\n镇安县\t139038\n射钉器\t139039\n隋唐泽尻绘里香\t139040\n国鹏\t139041\n唐安琪\t139042\n彭泽县\t139043\n纪实\t139044\n平哥\t139045\n无邪\t139046\n电源类\t139047\n孤独症\t139048\n练习题集\t139049\n那多\t139050\nvo\t139051\n黑单\t139052\nMediaCoder\t139053\n交代\t139054\n支杆\t139055\n法人授权委托书\t139056\n李清华\t139057\npads9.5\t139058\ntranstion\t139059\n滴耳液\t139060\nXSS攻击\t139061\n伱\t139062\n骄阳地产\t139063\n4月1\t139064\n阅兵式\t139065\n不防\t139066\n1574\t139067\n背山\t139068\n圆磨床\t139069\n涟\t139070\n田居\t139071\n小用\t139072\n人寿险\t139073\n蒸汽\t139074\nzz21.com\t139075\n技服\t139076\n刚好遇见你\t139077\n武汉中心\t139078\n8630\t139079\n温控箱\t139080\n一百分\t139081\n新动力\t139082\n127.0.0\t139083\nrocket\t139084\n移动机顶盒\t139085\n大气科学\t139086\n力源\t139087\n樱色\t139088\nCEO们\
t139089\n中国税务报\t139090\n广雅\t139091\n鳞状细胞癌\t139092\nv3.6.0\t139093\n马飞\t139094\n雷锋小学\t139095\n方桌\t139096\n电磨\t139097\nmonika\t139098\nleft函数\t139099\n滚滚\t139100\n王\t139101\n婚姻登记处\t139102\nPROTO\t139103\n清涧县\t139104\n定南\t139105\n江西师范大学\t139106\n绝\t139107\ncore\t139108\n总结语\t139109\n很庆幸\t139110\n强者\t139111\n乳酸脱氢酶\t139112\n新生活\t139113\n家系\t139114\n创业之星\t139115\n2018年3月11日\t139116\n杆系\t139117\n迎头\t139118\n龙华区人民医院\t139119\n西数硬盘\t139120\n灰太狼\t139121\n忠于\t139122\n可容忍\t139123\n机组人员\t139124\n纪念钞\t139125\n这些人\t139126\n凡尘\t139127\n学不来\t139128\n膀胱炎\t139129\nelevator\t139130\n朱孟依\t139131\n全谱\t139132\n中国工业网\t139133\n红警共和国之辉\t139134\n广西电力\t139135\n新事件\t139136\n管理权\t139137\nDia\t139138\n窃玉\t139139\n松吉电动车\t139140\n水乡\t139141\nstreets\t139142\n卡尔马克思\t139143\n协作式\t139144\n暗沉\t139145\n医用内窥镜\t139146\n文常\t139147\n补肾\t139148\n张洪\t139149\n电解锌\t139150\n18路\t139151\n四十噚\t139152\n东风日产汽车金融有限公司\t139153\nStacked\t139154\n中华人民共和国国家安全部\t139155\n机动车交通事故责任纠纷一审\t139156\nmysqlworkbench\t139157\n0.00MB\t139158\n吴谢宇\t139159\n15双\t139160\n曲挠\t139161\n国家开发投资公司\t139162\n贺州学院\t139163\n150万\t139164\n药智\t139165\n青岛人力资源和社会保障局\t139166\n金鹰国际\t139167\n剪线\t139168\n力宏\t139169\n头孢克洛颗粒\t139170\n浙江省商务厅\t139171\nGOTVG\t139172\n动情\t139173\n新西兰航空公司\t139174\n洁净度\t139175\n15世纪\t139176\n刘磊\t139177\n泉州幼儿师范高等专科学校\t139178\nff15\t139179\n阿里文学\t139180\n第29轮\t139181\n退行性关节炎\t139182\n35年后\t139183\n赞不绝口\t139184\nUC浏览器官网_UC浏览器电脑版\t139185\n国有资本投资运营公司\t139186\nGarageband\t139187\n社会保障局\t139188\n18L\t139189\nǎ\t139190\n雨季\t139191\n新阳路\t139192\n惋惜\t139193\n好痒\t139194\ncssa\t139195\ndrugs\t139196\n水卡\t139197\nVansky\t139198\nTsukasa\t139199\n博彦科技股份有限公司\t139200\ngdp2017\t139201\nscam\t139202\n可燃性\t139203\nAPS\t139204\nBAT\t139205\nSheet1\t139206\n掷\t139207\n早8点\t139208\n航天英雄\t139209\n_塞尔达传说\t139210\n糖豆\t139211\n嘉盛集团\t139212\n杨伯峻\t139213\n说理\t139214\n赛丽亚\t139215\n哈罗\t139216\n戏票\t139217\n待发\t139218\n行程\t139219\n对不\t139220\n伊丽\t139221\nkiker\t139222\n巨力索具\t139223\n循环阀\t139224\n德兴市人民政府\t139225\n庆华\t139226\n泸州市人民政府\t139227\n尴\t139228\n莫匹罗星\t139
229\n水环境质量标准\t139230\n肌龄\t139231\n徐娘\t139232\n原画\t139233\n姜勇\t139234\n李玟\t139235\nwindons\t139236\n分容\t139237\nEnema\t139238\n都昌\t139239\n少年犯\t139240\n20单\t139241\n藏姬阁\t139242\n磷酸缓冲液\t139243\n离离\t139244\nCMakeList\t139245\nreclaim\t139246\n左飞\t139247\n解放新村\t139248\n鬼神童子\t139249\n球迷\t139250\nPony\t139251\nyuchen\t139252\n轻拂\t139253\n西和县\t139254\nhuoming\t139255\n集水井\t139256\n淮州\t139257\n泸溪河\t139258\n铃科百合子\t139259\n楚天运动频道\t139260\n意外杀手\t139261\n郑州大学第五附属医院\t139262\n空炮\t139263\nOracl\t139264\n中国国民党\t139265\n卡萨诺瓦\t139266\n汤姆福特\t139267\n四九\t139268\n夏青杯\t139269\n5.2.0\t139270\n坊巷\t139271\n上海大学经济学院\t139272\nHTML文件\t139273\nSOLO赛\t139274\n八角茴香\t139275\n8388\t139276\n低值易耗品\t139277\n悬壶\t139278\nバレ\t139279\n灭鼠药\t139280\n王会勇\t139281\n周六上午\t139282\n梅溪湖公园\t139283\n高慢\t139284\n冲冲冲\t139285\n王玉珍\t139286\n发米论剑-发米网\t139287\n镇江站\t139288\n古冶\t139289\n昆仑山\t139290\n交换器\t139291\n小乌\t139292\n40个\t139293\n8700k\t139294\n涂卡\t139295\nmpich2\t139296\n至尊无赖\t139297\n金庸群侠传1\t139298\n3月12号\t139299\n各\t139300\nITX\t139301\nsuka\t139302\nME60\t139303\n五十度\t139304\n布吉街\t139305\n图片\t139306\n恶犬\t139307\n聚色\t139308\na2\t139309\n侏儒\t139310\ncin.get\t139311\nMermaid\t139312\n李佳航\t139313\n纳粹大屠杀\t139314\nHornyanal\t139315\n2240\t139316\n签名板\t139317\n新楚留香\t139318\n是因为\t139319\n标价\t139320\n筹备组\t139321\nFellow\t139322\n噬魂者\t139323\n讨论帖\t139324\n低帮\t139325\n北极狐\t139326\n青岛晚报\t139327\n重庆市大渡口区人民政府\t139328\n春宵\t139329\n武汉动物园\t139330\n瘟疫公司\t139331\n106.1\t139332\n姓钟\t139333\n湖南公考网\t139334\n邱雪娥\t139335\n小天儿\t139336\nBliss\t139337\nresent\t139338\n黑龙江政府网\t139339\n破晓者\t139340\n惠民网\t139341\n减掉\t139342\n酸奶瓶\t139343\n125斤\t139344\n洗发\t139345\n鼠标板\t139346\nPaincker\t139347\nNginx负载均衡\t139348\n搜道\t139349\nRains\t139350\n夏历\t139351\n萝卜泡菜\t139352\n重疾险\t139353\n四标\t139354\n线框\t139355\nxiaoluo501395377\t139356\n1000期\t139357\n标准工时制\t139358\nGoldwave\t139359\n平起平坐\t139360\n吃香喝\t139361\n断屑\t139362\n《\t139363\n哈尔滨体育学院\t139364\nclip\t139365\n巨硕\t139366\n凤香\t139367\n误餐\t139368\n法规处\t139369\n草\t139370\nredis-server\t139371\n散列\t139372
\n大川端侦探社\t139373\nCosmic\t139374\n十九大理论\t139375\n南城县\t139376\n1280超清\t139377\n胜屿\t139378\nlsl\t139379\n模拟炒股\t139380\n新冠道\t139381\n移动迷宫\t139382\n黄锋\t139383\n蚕头\t139384\noriented\t139385\n广西金融投资集团有限公司\t139386\n脑裂\t139387\n芙蓉洞\t139388\n520元\t139389\n全频喇叭\t139390\n一跃\t139391\ncaffee\t139392\n至关\t139393\n薯片\t139394\n上季\t139395\n游医\t139396\n诗山\t139397\nhtml+css\t139398\n0998\t139399\n64GB/全网通\t139400\n南山社区\t139401\n滴滴外卖\t139402\n诸世纪\t139403\nLOHAS\t139404\n2016-03\t139405\n流行钢琴网\t139406\n68分钟\t139407\nepr\t139408\nduoshao\t139409\n10.0.7\t139410\n藏匿处\t139411\n邯郸东\t139412\n泸州市\t139413\n有没有效\t139414\n包装材料\t139415\n栎社机场\t139416\n端阳\t139417\n贵州省省\t139418\n鳞状\t139419\ncompositions\t139420\n大乱斗\t139421\n标语\t139422\n山东在线\t139423\n林少春\t139424\n冯健\t139425\n王灏\t139426\n空值\t139427\n火花\t139428\n数千名\t139429\n话题作文\t139430\n王永泉\t139431\nUniverse\t139432\n夜巡\t139433\n盘尼西林\t139434\n贾士凯\t139435\n计算项\t139436\niobit\t139437\n海洋科学专业\t139438\n1匹\t139439\n福布斯亚洲\t139440\n北京万达\t139441\n代课\t139442\n预支\t139443\n拉夫堡\t139444\n中国航天科工集团有限公司\t139445\nv2.12\t139446\n可转换公司债券\t139447\n冉阿让\t139448\ninvesting\t139449\n成都市第六人民医院\t139450\n14栋\t139451\nZenTalk\t139452\n剪贴\t139453\neclipe\t139454\npulse\t139455\n最终幻想XIV\t139456\n绳锯\t139457\n小琪琪\t139458\n双瓜\t139459\n钢笆片\t139460\n糖渍\t139461\nCREO2.0\t139462\nResidency\t139463\nIPX\t139464\n椅套\t139465\nWebrush\t139466\n6226\t139467\njoins\t139468\n假面骑士Build\t139469\n泰宏建业国际城\t139470\n非物\t139471\neh\t139472\n笔顺\t139473\n第九个\t139474\njiba\t139475\n瓶身\t139476\n弹丸论破v3\t139477\n李建明\t139478\nche\t139479\n零厅\t139480\njavalist\t139481\n10克拉\t139482\n异议期\t139483\n胖熊\t139484\nScore\t139485\n第12批\t139486\nseo优化公司\t139487\n点群\t139488\n口袋妖怪吧\t139489\n葛洲坝\t139490\n击杀\t139491\n金根\t139492\n篮球机\t139493\n走出去\t139494\n五缘湾湿地公园\t139495\n分解表\t139496\n斯巴鲁翼豹\t139497\n教综\t139498\n玉门市\t139499\n88亿美元\t139500\n颂乐\t139501\n西哲\t139502\nRn\t139503\n密度表\t139504\n水票\t139505\n牵绳\t139506\nawvs11\t139507\n锉\t139508\niao\t139509\n480公里\t139510\n风云变幻\t139511\n姬塔\t139512\n塘下镇\t139513\n揭秘\t139514\n湖北路\t13
9515\n总分\t139516\n北航经管学院\t139517\nSymmetry\t139518\n星期几\t139519\n11.2.0\t139520\n天平街道\t139521\n翼型\t139522\n安姆\t139523\n218.62\t139524\n不干\t139525\n三大纪律八项注意\t139526\n蔡少芬\t139527\n丹佛斯变频器\t139528\n绝想\t139529\nFishParadise\t139530\n分化腺癌\t139531\n圣级\t139532\n袖侧\t139533\n雄关\t139534\n晶体管\t139535\n古典吉他曲\t139536\n片头\t139537\nirql\t139538\n四套房\t139539\n来者\t139540\n数字报_奥一网\t139541\n7572\t139542\n规划\t139543\n帮机\t139544\n相纸\t139545\n花地大道\t139546\nAVN\t139547\n名歌\t139548\n道闸机\t139549\n一笼\t139550\n现金\t139551\n非诚勿扰\t139552\n飘纱\t139553\n技师学院\t139554\n紧急救援\t139555\nBottega\t139556\n中国火箭军\t139557\n玻璃胶\t139558\n油画布\t139559\n不定时工作制\t139560\n铱\t139561\nvivo公司\t139562\n淘宝联名卡\t139563\n码农\t139564\n金九\t139565\n凝\t139566\n中华人民共和国物权法\t139567\n军用\t139568\n黄白色\t139569\n20160112\t139570\nmifeng\t139571\n板画\t139572\n防鼠板\t139573\n电动助力\t139574\n戴浩\t139575\n保荐\t139576\n脉学\t139577\n报关单\t139578\n唐宋\t139579\n拥抱\t139580\n史玉柱\t139581\n水树奈奈\t139582\nv2.8.5\t139583\n中国消防\t139584\n杨佩\t139585\nOID\t139586\n听不厌\t139587\n巴林左旗\t139588\n女科\t139589\n母乳分析仪\t139590\n落泪\t139591\nlp\t139592\n葛小允\t139593\n7500公里\t139594\nS01\t139595\n117家\t139596\n工\t139597\n板门店宣言\t139598\n水床\t139599\nSGL\t139600\n抽象表现主义\t139601\n大唐国际发电股份有限公司\t139602\njixia\t139603\n西二旗\t139604\n百色人才网\t139605\nready\t139606\n链\t139607\n走向堕落\t139608\n甘肃省中医院\t139609\n城市维护建设税\t139610\n1欧元\t139611\n关元\t139612\n大全书\t139613\n康庭\t139614\n筷子兄弟\t139615\n500ms\t139616\n滨安路\t139617\n股份有限公司\t139618\n徽商期货\t139619\nskier\t139620\nSukhumvit\t139621\n午餐\t139622\n紫米\t139623\n台州晚报\t139624\n环环\t139625\n前10天\t139626\n牵强\t139627\n完满\t139628\n冲锋舟\t139629\n法租界\t139630\nambit\t139631\n中牟网\t139632\n尼桑途乐\t139633\n飞天猪\t139634\n须知\t139635\n京华广场\t139636\n静思\t139637\n中国中医科学院中药研究所\t139638\n许刚\t139639\n碑刻\t139640\n青浦区\t139641\n8间\t139642\nOPPOA59s\t139643\nrepairing\t139644\n腐尸之屋\t139645\n背景\t139646\n内容\t139647\n今晚上\t139648\n人工智能概念股\t139649\n新基\t139650\n中华人民共和国应急管理部\t139651\nXE\t139652\n天定\t139653\n集中器\t139654\nnudist\t139655\n200句\t139656\nlinks\t139657\n勇者套\t139658\n艾滋病性病\t139659\n
Vary\t139660\n健保\t139661\n加油吧实习生\t139662\n王垠\t139663\n3.2圣宗\t139664\n18方\t139665\nstrcmp\t139666\n广东省科学院\t139667\nsuoxie\t139668\n900公里\t139669\n黑桑果\t139670\n规模以上工业企业\t139671\ntera\t139672\n560ti\t139673\n五龙口\t139674\n大佳\t139675\n王东京\t139676\ncripts\t139677\n恋舞\t139678\nRequested\t139679\n昕动\t139680\nTextMate\t139681\nwoll\t139682\n四四\t139683\ngameboy\t139684\n蛇尾\t139685\n海口西海岸\t139686\n关键词密度\t139687\n废材逆袭:冰山王爷倾城妃\t139688\n不伦\t139689\n周传雄\t139690\n门贡\t139691\n宁波湾\t139692\n多维空间\t139693\n链路层\t139694\n量规\t139695\ngraffle\t139696\n双非\t139697\n西华\t139698\nLAUREN\t139699\n胸腺五肽\t139700\n黑绳\t139701\nLayaBox\t139702\n明星\t139703\n酷派8720L\t139704\n缩聚\t139705\n医食同源\t139706\n一学\t139707\n天师符\t139708\n万里学院\t139709\n宁州\t139710\n珠宝盒\t139711\n啼\t139712\nIMAP地址-163邮箱\t139713\n4排\t139714\n广州市环境保护局\t139715\n629\t139716\n三国欢乐斗地主\t139717\nmath库\t139718\nlinaro\t139719\n2536\t139720\n地板胶\t139721\n烟台市规划局\t139722\nLinux编程_Linux公社\t139723\n导向轮\t139724\n古荥镇\t139725\n讹\t139726\n三国志8\t139727\n东洲街道\t139728\n盲区\t139729\n最近6个月\t139730\n龙江银行\t139731\n共和\t139732\n倒车镜\t139733\n非彼\t139734\nExpense\t139735\n硬质棉\t139736\n女超人\t139737\n美黛拉\t139738\n衷\t139739\n在职研究生招生网\t139740\n印花\t139741\n60多个\t139742\n秋官\t139743\n阿古斯\t139744\n药房网\t139745\nLOUIS\t139746\nPhyto\t139747\nBP世界能源统计年鉴\t139748\n稳赚\t139749\n信物\t139750\n极点\t139751\n海事处\t139752\n屯溪一中\t139753\n文具店\t139754\n弹唱版\t139755\n20几岁\t139756\n就座\t139757\n呢子\t139758\n杏花岭\t139759\n香椿\t139760\n新一轮\t139761\nBelieve\t139762\n猛男诞生记\t139763\n靖远县政府\t139764\n天津外国语学院\t139765\n4.5.1\t139766\n微卡\t139767\n中国一拖\t139768\n天天饮食\t139769\n乐清房产网\t139770\n弩车\t139771\n凌凌\t139772\nmsgget\t139773\nups蓄电池\t139774\n冰雪公主\t139775\nIP地址\t139776\n香风\t139777\n日直\t139778\n零落\t139779\namp\t139780\n圣母大学\t139781\n三亚新机场\t139782\n华韵\t139783\n枯燥无味\t139784\n两情相悦\t139785\n熟妇\t139786\n500v\t139787\n中英街\t139788\n1943\t139789\nRainy\t139790\nreactive\t139791\n姚乐怡\t139792\n星月童话\t139793\nS5\t139794\n100RMB\t139795\n歌歌\t139796\n扈\t139797\nVARCHAR\t139798\n化圣\t139799\n旅商\t139800\n友好型\t139801\n数列\t139802\n
搜商网\t139803\n隆德\t139804\n俞国良\t139805\n黑硅\t139806\nWikibooks\t139807\n1239\t139808\n国家游泳中心\t139809\n陈志宏\t139810\n保暖性\t139811\nmakea\t139812\n潜水器\t139813\n联系单\t139814\n官能奇谭\t139815\n0105\t139816\n心桥\t139817\n精品\t139818\nLACP\t139819\nIPython\t139820\n途观论\t139821\n2012.10\t139822\n益气聪明丸\t139823\npychar\t139824\n国家防总\t139825\n不忍心\t139826\n改锥\t139827\n到好\t139828\n理论法\t139829\nmarke\t139830\n诺天王\t139831\n解码器\t139832\n律师协会\t139833\n罗技K480\t139834\n华夏学院\t139835\n秀木\t139836\njavaweb\t139837\n20目\t139838\n32D\t139839\n资金链\t139840\ndarkestdungeon\t139841\n连版\t139842\n竹枝\t139843\n酵母菌\t139844\nRats\t139845\n27.3\t139846\nCRS\t139847\n大连理工大学图书馆\t139848\n四川大学艺术学院\t139849\n奶乳\t139850\n黑森州\t139851\n王君\t139852\n饵食\t139853\n108星\t139854\n波轮\t139855\n转任\t139856\netap\t139857\n坡地\t139858\n短吻鳄\t139859\n郑州工业应用技术学院\t139860\nkongge\t139861\n非因工伤残\t139862\n仲元\t139863\n俩天\t139864\n土壤学\t139865\n中铁建工集团有限公司\t139866\n武意\t139867\n郑风田\t139868\n6艘\t139869\n合肥中山医院\t139870\n济南育英中学\t139871\nCCCC\t139872\n胆艺轩\t139873\n阳气\t139874\n燕宇\t139875\n法尔胜\t139876\nnappa\t139877\n1949年后\t139878\n80kg\t139879\n陶博会\t139880\n严宽\t139881\ngengxin\t139882\n帝国时代\t139883\n苏宁易购\t139884\n马自达昂克赛拉\t139885\n先驱者\t139886\n环保法\t139887\n海南新闻中心\t139888\n750亿\t139889\n白鲳\t139890\n九龙新城\t139891\nveil\t139892\n伊万卡特朗普\t139893\n迫切\t139894\n留学服务中心\t139895\n铤而走险\t139896\n林姓\t139897\n共商\t139898\n销号\t139899\n德玛西亚之翼\t139900\n最后一战\t139901\n谢衣\t139902\n装腔作势\t139903\n多元函数\t139904\n带走\t139905\n三株\t139906\n地径\t139907\n刘正军\t139908\nx分之一\t139909\nkatana\t139910\n正大光明\t139911\n中国民生网\t139912\n贝奇\t139913\n赴\t139914\n默认分辨率\t139915\n超导体\t139916\n残保金\t139917\n福永街道\t139918\nSTOCK\t139919\n马勒别墅\t139920\nFuels\t139921\n左海\t139922\n7.4.1\t139923\n地漏芯\t139924\n摸底\t139925\n酒店公寓\t139926\n谈笑\t139927\n上海仁济医院东院\t139928\n深汕合作区\t139929\n启创\t139930\n珍藏级\t139931\n秋之回忆7\t139932\n第40\t139933\n边境游\t139934\n三湘四水\t139935\n绿维文旅\t139936\n海鸥岛\t139937\n2tb\t139938\n张冲andy\t139939\n用版\t139940\n35万元\t139941\nTESLA\t139942\nmiix\t139943\n407\t139944\n十二连环坞\t139945\n张大千\t139946\n成都
市实验外国语学校\t139947\n钢筋笼\t139948\nwanda\t139949\n国际贸易\t139950\n书架\t139951\n纤瘦\t139952\n薇拉\t139953\n众筹\t139954\n宾客\t139955\n316路\t139956\n明朝\t139957\n模压板\t139958\nspringmvc拦截器\t139959\n锦河湾\t139960\n夜啼\t139961\n华为p10\t139962\n1018年\t139963\nKVC\t139964\n门童\t139965\n安清欢\t139966\n大度\t139967\n无惧\t139968\n雄心壮志\t139969\n总题\t139970\n每时每刻\t139971\n利剑\t139972\n丘\t139973\n宝龙集团\t139974\n药治\t139975\nGET\t139976\n全加器\t139977\n莫愁湖\t139978\nStigma\t139979\ngridControl\t139980\n唐迟\t139981\n梁书\t139982\n藤仓\t139983\n花盘\t139984\n辎重\t139985\n复赛\t139986\n需求层次理论\t139987\nzho\t139988\nGoQC\t139989\n34%\t139990\n6.5.16\t139991\n望京花园\t139992\nBVV\t139993\n第7页\t139994\nfec\t139995\n4G无线路由器\t139996\n伟长\t139997\nARDS\t139998\n2018.03\t139999\n后退出\t140000\n裆裤\t140001\nGZip\t140002\n谜情\t140003\n咖啡杯\t140004\n采阳\t140005\nps编辑\t140006\npalm\t140007\n超人总动员\t140008\n王魂\t140009\n易逝\t140010\n十七号\t140011\n二档\t140012\n一分钱一分\t140013\nBoss战\t140014\n硅胶干燥剂\t140015\n重负\t140016\nWikiSexGuide\t140017\n交\t140018\n闪电链\t140019\n奥美拉\t140020\n肖若水\t140021\n317\t140022\n北京火车西站\t140023\n微醺\t140024\njieguo\t140025\n赵鼎\t140026\n主办\t140027\n精致\t140028\n芜湖政府网\t140029\n井研县人民政府\t140030\n中小学生人格教育\t140031\nTDX\t140032\n印台\t140033\n怎和\t140034\n邯郸市人力资源和社会保障局\t140035\nboon\t140036\n禁果\t140037\n750ml\t140038\n9656\t140039\n弑天\t140040\n海定区\t140041\npbd\t140042\n大变天\t140043\nteco\t140044\n工作台\t140045\n巨灾\t140046\nhttprequest\t140047\ndijkstra算法\t140048\n小柳\t140049\n业务型\t140050\n广州黄埔区\t140051\n李俊毅\t140052\n登州\t140053\n百年堂\t140054\n太湖国际社区\t140055\ntattoo\t140056\n小牛U1\t140057\n软座\t140058\n长沙中研白癜风医院\t140059\nUSDHKD\t140060\n015\t140061\n河南政协\t140062\n轧钢\t140063\n王国军\t140064\n小超\t140065\n朗易思\t140066\n圣果\t140067\n湖南人大\t140068\n第44话\t140069\n白雪公主\t140070\n【天巡国际机场\t140071\n景地\t140072\n洒热血\t140073\n牛血清白蛋白\t140074\nzhongyi\t140075\n成员国\t140076\n360驱动大师\t140077\n一链\t140078\n抬高型\t140079\n贝贝网\t140080\n行洪\t140081\n餐巾\t140082\n素车\t140083\n朱晓琳\t140084\noffice2016吧_\t140085\n跑圣\t140086\n8立方\t140087\n国库券\t140088\n徐洁儿\t140089\n山影\t140090\n跨线\t140091\n洋介\
t140092\n联集\t140093\n六价铬\t140094\n186路\t140095\nerwin\t140096\n王濛\t140097\n田林路\t140098\n茶会\t140099\n迎击\t140100\nm.3dm\t140101\nCAD2015\t140102\n刘柳\t140103\ndynaform\t140104\n低置\t140105\n朋朋\t140106\n导尿管\t140107\n14万\t140108\n公园道一号\t140109\n韩半岛\t140110\n折板机\t140111\n劲松街道\t140112\n角筋\t140113\n第02期\t140114\n一心堂\t140115\n军师联盟\t140116\n劫案\t140117\n弹指\t140118\n恨晚\t140119\nwhere子句\t140120\n1213nf\t140121\n门包\t140122\n芬兰语\t140123\n奇幻森林\t140124\n白云观\t140125\n玄秘塔碑\t140126\n集解\t140127\n3dsmax2010\t140128\n15KG\t140129\n45点\t140130\n奥匈帝国\t140131\niatf16949\t140132\nl5\t140133\n来\t140134\n吴大维\t140135\ndatetime型\t140136\n教片\t140137\nsubstring\t140138\n苏博\t140139\n广州保税区\t140140\n黄执中\t140141\n移动4g\t140142\n潮庭\t140143\n鹰巢\t140144\n男表\t140145\n天合\t140146\n广南县\t140147\n达州市人民政府\t140148\n后官湖湿地公园\t140149\ncass7.0\t140150\nDealers\t140151\n三星级\t140152\n民心网\t140153\n宗教界\t140154\n￠\t140155\nCATTI人事部\t140156\nnonono\t140157\n埃塞\t140158\nappium\t140159\n2014-2015\t140160\n量子化学\t140161\n当周\t140162\n内湖\t140163\n张岳峰\t140164\nArtlantis\t140165\n子坟\t140166\n夜空\t140167\n宝钢集团有限公司\t140168\n喉部\t140169\nfrisk\t140170\n3214\t140171\nbzip2\t140172\n集散点\t140173\n景区化\t140174\n3.5.0\t140175\n六校联考\t140176\nLAW\t140177\n格罗培斯\t140178\n健身馆\t140179\n广州天河体育中心\t140180\n少喝\t140181\n刘隽\t140182\n母后\t140183\n石青\t140184\n病理性\t140185\n唐山医院\t140186\n徐扬\t140187\n颂\t140188\n阿里offer\t140189\n淑媛\t140190\nregsvr32\t140191\n清歌\t140192\n多本\t140193\n辛识平\t140194\n城东路\t140195\ns2010\t140196\n敢死队1\t140197\n铁岗水库\t140198\nhisilicon\t140199\n无趣\t140200\n觅\t140201\n拉丁字母\t140202\n只不\t140203\n天晟新材\t140204\n600cc\t140205\n绝不\t140206\n超灵\t140207\n干将\t140208\n刘国庆\t140209\n百事影院\t140210\ncomc\t140211\n同德医院\t140212\n奥胡斯\t140213\nOrigami\t140214\n1500平方米\t140215\n娄星\t140216\n七笔\t140217\n重复度\t140218\n情妇\t140219\n刘歆\t140220\n北京航空航天大学计算机学院\t140221\n杭州东火车站\t140222\n大获全胜\t140223\n鸾凤和鸣\t140224\n药物化学\t140225\n【文曰小强\t140226\nnal\t140227\n母粒\t140228\n书耽\t140229\n盛势\t140230\n内出血\t140231\n14层\t140232\n轮指\t140233\ntsh\t140234\nhuber\t140235\nkdd\t140236\n–\
t140237\ntheta\t140238\n办学\t140239\n海州湾\t140240\n三宅\t140241\n巡航\t140242\nindexed\t140243\nBolton\t140244\nAutoCAD2014\t140245\n德艺\t140246\n海天\t140247\n128路\t140248\n终于\t140249\n手工客\t140250\n小岳岳\t140251\n定语\t140252\nzhan\t140253\n开发工\t140254\n曲婷\t140255\n南风\t140256\n旋舞\t140257\nsession值\t140258\nEmblem\t140259\n减短\t140260\npmv\t140261\n压碎\t140262\n虎父无犬子\t140263\n9孔\t140264\n15天\t140265\n塔径\t140266\n寒气\t140267\n晋中市政府网\t140268\n650_\t140269\n席梦\t140270\nhawaii\t140271\n川滇\t140272\n唐崇荣\t140273\n郑州锅炉股份有限公司\t140274\nmonotonic\t140275\n傅青\t140276\n笑面人\t140277\n钓竿\t140278\nIMSI\t140279\nturf\t140280\n探员\t140281\n92家\t140282\nrbind\t140283\n雅望\t140284\n人体蜈蚣2\t140285\n2500米\t140286\n数条\t140287\n小庄村\t140288\n张力器\t140289\n杨白冰\t140290\n公交男女爆笑漫画\t140291\n庵埠\t140292\n沙格列汀\t140293\nyoujizz\t140294\nFreedom\t140295\n硝酸根\t140296\n怨妇\t140297\nV2.0.0\t140298\n卡特\t140299\n风险偏好\t140300\n权利法案\t140301\n兰石\t140302\n复模\t140303\nsailing\t140304\nPCR\t140305\n国家建筑标准设计图集\t140306\n光球\t140307\n1.11.6\t140308\n中环地产\t140309\n大话降龙\t140310\n香港航空公司\t140311\nEclipse\t140312\n西西弗书店\t140313\n7c\t140314\n幸福大街\t140315\n曽交\t140316\n梨山\t140317\n天那水\t140318\n赵方婧\t140319\n休息\t140320\n32厘米\t140321\n花宵\t140322\n奶箱\t140323\n58同镇\t140324\n介子推\t140325\n蒲县\t140326\n面线\t140327\n顶边\t140328\nnotification\t140329\n信息素\t140330\nprism\t140331\n余干县\t140332\n惦念\t140333\n水瓜\t140334\n三个女人\t140335\n国狂\t140336\n东方太阳城\t140337\nV3000\t140338\n李月仙\t140339\n政府机关\t140340\n孟获城\t140341\n对簿公堂\t140342\nphilippines\t140343\n杂粮粥\t140344\n误判\t140345\nrating\t140346\n64页\t140347\nDirector\t140348\n31万元\t140349\n望奎县\t140350\n真封神_新真封神之【金封神\t140351\n多体动力学\t140352\n菜盒\t140353\n薄谷\t140354\n客单价\t140355\n布瑞克\t140356\nr7sm\t140357\n营销学\t140358\n思特\t140359\n耐克公司\t140360\n上年同期\t140361\n长岭\t140362\nAsa\t140363\n花籽\t140364\n渤海早报数字报\t140365\n云支付\t140366\n躺枪\t140367\nCBF\t140368\n柔力球之歌\t140369\n九品\t140370\n国审\t140371\n世界近现代史\t140372\n﹖\t140373\n雨花台区\t140374\nwaht\t140375\n2017年3月18日\t140376\n换行符\t140377\n入死\t140378\n34集\t140379\ntongweb\t140380\n秀舞\t1403
81\n轻纺\t140382\n何时能\t140383\n区党工委\t140384\n贾斯汀\t140385\ndouglas\t140386\n射手网\t140387\n江湖群雄传2\t140388\n十年左右\t140389\nWNDR\t140390\n520li\t140391\n跨专业\t140392\n666.com\t140393\n企业管理层\t140394\n格子间\t140395\n天津泰达枫叶国际学校\t140396\n护理系\t140397\n树莓派3b\t140398\n蕉城区政府\t140399\nリング\t140400\n继承法\t140401\n狗版\t140402\n史凯利杰群岛\t140403\nlogback\t140404\n银戒指\t140405\nk80\t140406\n钨极氩弧焊\t140407\nMJ\t140408\n股利\t140409\n蹲坑\t140410\n折子戏\t140411\n航发科技\t140412\nReentrant\t140413\nhike\t140414\n螺旋体\t140415\n宝川\t140416\n勉强\t140417\n收藏版\t140418\n移动h5\t140419\n哈灵顿\t140420\n名旦\t140421\n孢子体\t140422\n微软Office\t140423\nganji\t140424\n无地自容\t140425\n曲柄连杆\t140426\n金敏喜\t140427\n36.08万元\t140428\n四川省农业厅\t140429\n系长\t140430\n相与\t140431\n青田网\t140432\n天津英华国际学校\t140433\n查尔达什\t140434\n男夏\t140435\n春风得意\t140436\n暖色系\t140437\n认缴制\t140438\n淫雨\t140439\n魅族mx4pro\t140440\n石剑\t140441\n铝管\t140442\n志鸿\t140443\n王亚玲\t140444\n报团\t140445\n优酷\t140446\n呜\t140447\n博达交换机\t140448\n李可染\t140449\n抹布\t140450\n魔兽地图\t140451\n钥匙串\t140452\nv11.2.13\t140453\n隋唐逍遥梦路\t140454\n权重系数\t140455\n巾帼文明岗\t140456\n勉\t140457\n信手拈来\t140458\n郎平\t140459\n忆先烈\t140460\nSPSS统计分析\t140461\n3月27日\t140462\n倒水\t140463\n413Xiaol\t140464\n贺斌\t140465\na8f\t140466\n京口\t140467\n陆凉风\t140468\n低清\t140469\n河婆\t140470\n华帝股份\t140471\n杰士邦\t140472\n鲍勃迪伦\t140473\n服批\t140474\nninth\t140475\nnoop\t140476\n海云天\t140477\n4493\t140478\n大山村\t140479\n半音\t140480\n中毒者\t140481\n代买\t140482\n中国共产党第十八届中央纪律检查委员会\t140483\nUSD\t140484\n物联网应用\t140485\n二十二世纪\t140486\n五花大绑\t140487\n天津电信\t140488\n主证\t140489\n新加坡管理大学\t140490\n磨齿机\t140491\n六件\t140492\n女王蜂\t140493\n火神兽\t140494\n西安理工大学\t140495\nmavn\t140496\n图解法\t140497\n硝酸根离子\t140498\n诚品书店\t140499\nkcash\t140500\nspill\t140501\n中央社会主义学院\t140502\n云龙\t140503\n上派\t140504\nJCenter\t140505\n初相见\t140506\n1.9.9\t140507\ngangsta\t140508\n红足直播网\t140509\n叫法\t140510\n广西师大出版社\t140511\n_莫\t140512\n冠群\t140513\nbb文\t140514\n2.8万元\t140515\nForerunner\t140516\nwhl\t140517\nsystrace\t140518\n公略\t140519\n虎骨\t140520\nFiddle\t140521\nsector\t140522\n几千元\t140523\n1.0安卓
\t140524\n心颤\t140525\n景顺长城基金管理有限公司\t140526\n爱图客\t140527\n张雅\t140528\n发生机\t140529\n调表器\t140530\negoist\t140531\n丹城\t140532\n北京大学前沿交叉学科研究院\t140533\nHitachi\t140534\n澳贝\t140535\nAUM\t140536\nHoodie\t140537\n熊十力\t140538\n支设\t140539\n检察长\t140540\n过得去\t140541\nreluctant\t140542\n顶向下\t140543\n云客\t140544\n沈阳经济技术开发区\t140545\n途强\t140546\nswift3\t140547\n透了\t140548\n四季彩\t140549\n毒害\t140550\n宁波海关\t140551\nrecognizer\t140552\n江苏长电科技股份有限公司\t140553\n保证人\t140554\n教学反\t140555\n质变\t140556\nJM\t140557\nyuanli\t140558\n虾塘\t140559\n小女仆\t140560\n药城\t140561\n启源\t140562\nbrc\t140563\n秦皇岛经济技术开发区\t140564\n英维迪亚\t140565\n致命之吻\t140566\nkuaiji\t140567\n刘信达\t140568\npcs\t140569\n2018-01-27\t140570\n8小时后\t140571\n点动\t140572\n恒谦教育网\t140573\n金地国际城\t140574\n魔道\t140575\n发愤图强\t140576\n16档\t140577\n洛川县\t140578\n小城镇\t140579\n博格华纳\t140580\n食品级\t140581\n北京教委\t140582\n双色球走势图\t140583\n野游\t140584\n起升\t140585\n萌购吧\t140586\ncr12mov\t140587\n10000吨\t140588\n250a\t140589\n891\t140590\n总校\t140591\n抗渗\t140592\nPropagation\t140593\n双柱式\t140594\n返祖\t140595\n河工\t140596\nzte\t140597\n虞书欣\t140598\n形象地\t140599\n枯矾\t140600\n陈少杰\t140601\nlif\t140602\n测序\t140603\n中国书法协会\t140604\n甲子\t140605\n百峰网\t140606\nRARE\t140607\n西气\t140608\n福建省公安厅\t140609\n化化\t140610\n云子\t140611\n金烔完\t140612\nyf869778412\t140613\n医疗养老教育\t140614\n茶艺馆\t140615\n文通科技\t140616\nRU\t140617\n成人書刊小說寫真集\t140618\n表演系\t140619\n进才中学\t140620\n证券法\t140621\n面目全非\t140622\n好利来\t140623\nSystems\t140624\n德昂族\t140625\n恒力股份\t140626\n第9话\t140627\nSunsin\t140628\nps选区\t140629\n校花的贴身高手2\t140630\n对外开放\t140631\n黑玛卡\t140632\nUpstream\t140633\n五年级语文\t140634\nf548\t140635\n子宫出血\t140636\n心门\t140637\n王府井百货\t140638\nCMDB\t140639\n工业南路\t140640\n女人物\t140641\n阿萨德\t140642\nPale\t140643\n主妇们\t140644\n20道\t140645\n奇葩说\t140646\n剩者\t140647\n拿走\t140648\n敦厚\t140649\n珍奇\t140650\nNetSarang\t140651\n有关规\t140652\nPETCT\t140653\n绝代风华\t140654\naccident\t140655\niptd\t140656\n春之祭\t140657\n女民兵\t140658\nraman\t140659\n接件\t140660\n顾承泽\t140661\n宁波卫生职业技术学院\t140662\n188个\t140663\nbba\t140664\n络合物\t140665\n带
源\t140666\n潍坊工程职业学院\t140667\n爱乐团\t140668\n诗钞\t140669\n2147\t140670\n览胜\t140671\n为啥子\t140672\nTecplot\t140673\n击\t140674\nFragmentActivity\t140675\n东区\t140676\n幻奏\t140677\n千千万万\t140678\n浅山\t140679\n环跳穴\t140680\n内容页\t140681\n迟重瑞\t140682\n休斯顿大学\t140683\n绿宝石386\t140684\n金东\t140685\ninstantclient\t140686\n闵玧其\t140687\nshape\t140688\nSpa\t140689\n民宿业\t140690\n杂毛\t140691\n消防水带\t140692\n硅油纸\t140693\n预制件\t140694\n护角条\t140695\nakb48吧\t140696\n梳棉机\t140697\n年代秀\t140698\n付宁\t140699\n富民路\t140700\n穿刺\t140701\n纸币\t140702\n0036\t140703\nAlias\t140704\n大众新闻网\t140705\nonChange\t140706\nCC2540\t140707\n带兵\t140708\n族表\t140709\n夏乔\t140710\n行政机关\t140711\n五岳剑派\t140712\n掌阅科技股份有限公司\t140713\n佳友\t140714\n青春\t140715\n临危\t140716\n布基纳法索\t140717\nMIL\t140718\n15p_\t140719\n满侠\t140720\n鹏翔\t140721\nBoobs\t140722\n被撤销\t140723\n宋某\t140724\n机体\t140725\n4438\t140726\nzhx\t140727\nstartx\t140728\n天灯节\t140729\n喷码\t140730\n银链\t140731\n淮滨论坛\t140732\n第六周\t140733\n海洋生物\t140734\nZHONG\t140735\n口袋妖怪红宝石\t140736\n千千\t140737\n狗爹\t140738\n悦目\t140739\n希洛普\t140740\n香薰机\t140741\n汇景花园\t140742\ntrevor\t140743\n宋刚\t140744\n水团\t140745\n邯郸男健医院\t140746\n乳化液\t140747\n宫腔镜\t140748\nSpring-boot\t140749\n贷款诈骗罪\t140750\nCamera360\t140751\n戏曲史\t140752\n罗城仫佬族自治县\t140753\npivotal\t140754\n贸易站\t140755\n吊绑\t140756\n机友\t140757\nanno\t140758\n理光复印机\t140759\n戴尔xps\t140760\n六盘山\t140761\n长居\t140762\n南山汽车站\t140763\n朗玛信息\t140764\niphone4吧_\t140765\n美菜\t140766\n非负\t140767\n招行银行\t140768\nmerchants\t140769\n沽源县\t140770\n树胶\t140771\n绝望\t140772\n李强\t140773\n七行者\t140774\n小会\t140775\n文思海辉技术有限公司\t140776\n毛铺\t140777\n芦柑\t140778\n药\t140779\n可惜不是你\t140780\nsysprep\t140781\n洛社镇\t140782\n80.90\t140783\n招风耳\t140784\n香螺\t140785\n暴战机甲兵\t140786\n负分\t140787\nEL\t140788\n16点阵\t140789\n吹打\t140790\n所作所为\t140791\nOutbound\t140792\n胡椒盐\t140793\n阿旺\t140794\n光头山\t140795\ndebian\t140796\n1.0.0安卓\t140797\n名字库\t140798\n谍影特工\t140799\n溧阳人才网\t140800\nshin\t140801\n公里\t140802\n临邛镇\t140803\n樱桃园\t140804\n1.5米高\t140805\n10.5代\t140806\nwap\t140807\n二点\t140808\n上海鱼友俱乐部\t140809\
n无敌极限流\t140810\nVEROMODA\t140811\n通航\t140812\n火影忍者究极风暴4PC版\t140813\n钟村\t140814\n无往不在\t140815\n20171110\t140816\n清晨\t140817\nDisabled\t140818\n927\t140819\n织田信奈\t140820\n绝叫\t140821\n3420\t140822\n棠湖中学\t140823\nMachining\t140824\n销售类\t140825\n9首\t140826\n张戈\t140827\n敌情\t140828\n全责保险公司\t140829\n预备者\t140830\nconstruct2\t140831\n来力\t140832\n24包\t140833\n大兴安岭地区\t140834\n刘菁\t140835\n8档\t140836\n山西建筑\t140837\n基本面\t140838\n一贫如洗\t140839\n282931\t140840\nadobe公司\t140841\n双光\t140842\n氟化钙\t140843\n荷兰猪吧\t140844\nip138快递查询网\t140845\n春日大社\t140846\n戴尔灵越游匣\t140847\n精大\t140848\n彩币\t140849\n对今\t140850\n艾司奥美拉唑镁肠溶片\t140851\n公考\t140852\nmmm\t140853\n自炊\t140854\n托斯卡\t140855\n自由人游击战争\t140856\n近视率\t140857\n350万吨\t140858\n嘀哩嘀哩\t140859\nmcd\t140860\n姥山岛\t140861\n玩水\t140862\n59厘米\t140863\n第11名\t140864\nenvision\t140865\n声优们\t140866\n小发\t140867\n庄世平\t140868\n雄健\t140869\n物态变化\t140870\n道轨\t140871\n100.1\t140872\n2017年2月5日\t140873\n10MM\t140874\n郑振龙\t140875\n夏林\t140876\n8部\t140877\n叶飞\t140878\n新牧网\t140879\nHandJoy\t140880\n天堂岛\t140881\n凤囚天下3\t140882\n天马山\t140883\n大众T6\t140884\ngad\t140885\n均禾街\t140886\n500vip\t140887\n高鼻梁\t140888\n大年初\t140889\nBeyoncé\t140890\n厦门方特梦幻王国\t140891\n恩施州\t140892\nNgu\t140893\n同类项\t140894\n石市\t140895\n皮面\t140896\n学书\t140897\nThreaded\t140898\n高君宇\t140899\n林下参\t140900\n20170328\t140901\n七品\t140902\n诺基亚N1\t140903\n美英法联合\t140904\n过去7天\t140905\n佟蔓\t140906\n锐龙R7\t140907\n毛化\t140908\n企汇网\t140909\n北京广播学院\t140910\n罗辛\t140911\n中国演出行业协会\t140912\n三环路\t140913\n左尚明舍\t140914\n感悟生命\t140915\n停学\t140916\n钢铁斯琴高娃\t140917\n大英县\t140918\n绿盾征信\t140919\n中国航天科技集团\t140920\nkh\t140921\n清洁公司\t140922\n夹竹桃\t140923\n传水\t140924\n液压剪板机\t140925\nドア\t140926\n二级缓存\t140927\n济民\t140928\n新章程\t140929\nbasePath\t140930\nOtis\t140931\n华为荣耀9\t140932\n阙值\t140933\nbethe\t140934\n趋之若鹜\t140935\n控制面板\t140936\nin石器时代\t140937\nProj\t140938\n水·域\t140939\n济南天桥\t140940\nsolder\t140941\n锁螺丝机\t140942\n富士达\t140943\n翻过\t140944\n口图\t140945\nlm386\t140946\nuniprot\t140947\n秘文\t140948\n腿部\t140949\n新法制报\t140950\n第一代\t140951\njac\t1
40952\n桑吉号\t140953\n风险型\t140954\n麦花\t140955\n肝火\t140956\n老徐\t140957\n同游\t140958\n第14季\t140959\nxiezai\t140960\n151路\t140961\nAISI\t140962\n例证\t140963\n2018-03-05\t140964\n林志杰\t140965\n泼\t140966\n深圳南\t140967\n座敷童子\t140968\n纳西族\t140969\n好友群\t140970\n柳德米拉\t140971\n打蜡机\t140972\njf\t140973\nbabelrc\t140974\n天涯海角路\t140975\n学姐们\t140976\nlone0922\t140977\n一键恢复\t140978\n温州19楼\t140979\n15平米\t140980\n揭示板\t140981\n科技师\t140982\n何罪\t140983\n财产刑\t140984\n35_\t140985\n约车\t140986\n滤波电路\t140987\n虫洞\t140988\n持股\t140989\ncivil3d\t140990\nyiche\t140991\n刘长青\t140992\n郁花园\t140993\n码枪\t140994\n西安中国银行\t140995\n喹啉\t140996\n不允\t140997\n主科\t140998\n齿板\t140999\n老朱\t141000\n大信\t141001\n免疫功能\t141002\n宝光股份\t141003\n销售信\t141004\n风风机\t141005\n过季\t141006\nLinearLayout\t141007\n天灾人祸\t141008\n名侦探柯南:零的执行人\t141009\n梦网科技\t141010\n区政府党组\t141011\n长江水利委员会\t141012\n1线\t141013\nPingWest品\t141014\n藏镜\t141015\nC51单片机\t141016\n11.0.0.379\t141017\n若干问题的意见\t141018\nyours\t141019\n1493\t141020\n劳动保障局\t141021\n康居\t141022\nnatures\t141023\n散布\t141024\n先是\t141025\nClover\t141026\n米业\t141027\n北京大学第一医院妇产儿童医院\t141028\n辽阔\t141029\nKW\t141030\n10.3%\t141031\n腾讯征信\t141032\n风弄\t141033\nhikvision\t141034\nxifu\t141035\n医疗器械监督管理条例\t141036\n千野\t141037\n亲子\t141038\ncame\t141039\n哈桑\t141040\n甄嬛传\t141041\n九级\t141042\n同盟会\t141043\n氨基柱\t141044\n2月28\t141045\nev400\t141046\n市金融办\t141047\n增持\t141048\n华润国际\t141049\n玄武山\t141050\n邵伯\t141051\n6015\t141052\n1314520\t141053\n下心\t141054\nvariance\t141055\n北京市统计局\t141056\n6996\t141057\n小馨\t141058\n55平米\t141059\nBindException\t141060\n辐射松\t141061\n安贞\t141062\n铅笔袋\t141063\nSMTP/POP3\t141064\nvec\t141065\nSNAKE\t141066\n一个夕\t141067\n乌氏粘度计\t141068\n中指\t141069\ns5720\t141070\n为者\t141071\n委比\t141072\n9系\t141073\n铁翼\t141074\n震精\t141075\n王凤仪\t141076\n刘俐俐\t141077\nDOMO\t141078\nwcp\t141079\ntzc\t141080\n集思录\t141081\n莫里\t141082\n扬州政府网\t141083\n张灵甫\t141084\n俄罗斯西伯利亚\t141085\nCentOs7\t141086\n判断题\t141087\n乳酸钙\t141088\n开张\t141089\n阿米替林\t141090\n糅合\t141091\n陈宏伟\t141092\n今昔\t141093\n俄式\t141094\n奥施康定\t141095\n
车载mp4\t141096\n20140608\t141097\n万宝宝\t141098\nsola\t141099\n宋作文\t141100\n小飞龙\t141101\n插件\t141102\nNO.5\t141103\n陶埙\t141104\n曙色\t141105\nAm\t141106\n刘树林\t141107\nfgetc\t141108\n保定市区\t141109\n文路\t141110\n黑莓passport\t141111\ndtmf\t141112\n气候学\t141113\n重人\t141114\n我是歌手第六季\t141115\n怪怪\t141116\n滇藏线\t141117\n0730\t141118\n金角大王\t141119\nKMP\t141120\n亚力山大\t141121\n草比克\t141122\n南安普敦\t141123\n王迁\t141124\nuncovered\t141125\n冰锋\t141126\n第四重\t141127\nHXD\t141128\nSumire\t141129\n乱放\t141130\n浙江林业网\t141131\nagent\t141132\ntoilet\t141133\n重合\t141134\n孟鲁司特钠咀嚼片\t141135\n琼中女足\t141136\n璞丽\t141137\n1000方\t141138\nnursing\t141139\n磕头机\t141140\ntelevision\t141141\n上吐下泻\t141142\nuncaught\t141143\n藏文\t141144\n川庆\t141145\n段首\t141146\n空间域\t141147\n信息论\t141148\n20170417\t141149\nCONCEPT\t141150\n心魔\t141151\n广东省地方税务局电子办税服务厅\t141152\nKlei\t141153\nuvm\t141154\n三文\t141155\n产权式\t141156\n第九回\t141157\n焊件\t141158\nchusetts\t141159\njournalctl\t141160\n3516\t141161\n封城\t141162\n大学排名\t141163\n老家伙\t141164\n马玉涛\t141165\n供需\t141166\n西弗吉尼亚州\t141167\nSingto\t141168\n弹匣\t141169\n中宁县人民政府\t141170\n泰兴人才网\t141171\n孤王\t141172\n资不抵债\t141173\n迈巴赫\t141174\n诺力\t141175\n计财\t141176\n奔腾B70\t141177\n深圳办公室\t141178\n两趟\t141179\n鸡变声器\t141180\n20170703\t141181\n尾兽\t141182\nCodePlex\t141183\nape8.cn\t141184\n0.006\t141185\n20141214\t141186\n97平米\t141187\n偷香邪医\t141188\nLingerie\t141189\n净白\t141190\nprcc\t141191\n水单\t141192\n王绍光\t141193\n嘚嘚\t141194\nUVB\t141195\n聪颖\t141196\n思考\t141197\nAura\t141198\n詹姆斯麦卡沃伊\t141199\n岐江公园\t141200\n广西博世科环保科技股份有限公司\t141201\nClarins\t141202\n宏亮\t141203\n细嗅\t141204\n炭炉\t141205\n野三关镇\t141206\nNBA2K17\t141207\n嘉环\t141208\n大课间\t141209\n公正廉洁\t141210\n西溪国家湿地公园\t141211\n四核处理器\t141212\nshota\t141213\n说教\t141214\n告一段落\t141215\n怀亚特\t141216\n舌位\t141217\n马巷镇\t141218\n第231章\t141219\n绍兴滨海新城\t141220\n墓园\t141221\n党性分析材料\t141222\n圣堂\t141223\n永乐路\t141224\n益起\t141225\n第42页\t141226\n盟市\t141227\n德州大学\t141228\n文化散论\t141229\n正广\t141230\n投送\t141231\n药王庙\t141232\n点火系\t141233\n麻醉科\t141234\n金华火腿\t141235\n宇文泰\t141236\nNAME\t14
1237\n三海\t141238\n大天门论坛\t141239\n淘妹\t141240\n第四册\t141241\n防撞\t141242\n后湾村\t141243\n撬装式\t141244\n风荷游月\t141245\n梁丽\t141246\n新制度经济学\t141247\n孝南区\t141248\nsight\t141249\n彭筝\t141250\n各怀\t141251\n样块\t141252\n艺龙网\t141253\nUISDC\t141254\nLATEX论坛\t141255\n2019年度\t141256\n背架\t141257\n石锅\t141258\nFlexsim\t141259\n辐流式\t141260\n冷拉\t141261\n云梦泽\t141262\n舞团\t141263\n绿球\t141264\n百草居\t141265\n横移\t141266\n塑界者\t141267\n余毒\t141268\nRoberto\t141269\n园林学习网\t141270\n自如网\t141271\n僵苗\t141272\n每场\t141273\n就叫\t141274\n虢正贵\t141275\n太平森林公园\t141276\n徐丰\t141277\n四川大学华西口腔医学院\t141278\n第62\t141279\n银行承兑汇票\t141280\nIncident\t141281\n碧池\t141282\n触手可\t141283\n钟埭街道\t141284\n蓝火\t141285\n双桥经开区\t141286\n联壁\t141287\n第6册\t141288\nannotate\t141289\n触控屏\t141290\n杠杠\t141291\n一万多个\t141292\nRUSH\t141293\n”\t141294\n超神学院雄兵连\t141295\n油饼\t141296\n瓦莉拉\t141297\n李宗伟\t141298\n林一峰\t141299\nTimeOut\t141300\nBATJ\t141301\ncontain\t141302\n显示适配器\t141303\n拒收\t141304\n英迪格\t141305\n跑跑卡丁车单机版\t141306\n四信\t141307\nHtml\t141308\n车盖\t141309\n仔细看\t141310\n螯合\t141311\n滚粗\t141312\nbreakpad\t141313\n三诊\t141314\n深圳国际会展中心\t141315\n垓下之战\t141316\nprocess\t141317\n西红柿炖牛腩\t141318\n磁阵\t141319\nshaanxi\t141320\nmang\t141321\n普片\t141322\nHFSS\t141323\n孔深\t141324\n抗凝管\t141325\nBJT\t141326\n航母群\t141327\n稻花香酒\t141328\n月经不调\t141329\nvesta\t141330\n管立得\t141331\nxcp\t141332\n利津县\t141333\n文拉法辛\t141334\nHector\t141335\nFTA\t141336\n刑事侦缉档案1\t141337\n几日游\t141338\n玛雅网\t141339\npipework\t141340\n中国科学院西安光学精密机械研究所\t141341\n金顶山\t141342\nSPI\t141343\n主受\t141344\n20150410\t141345\n王焕昌\t141346\n习近\t141347\n苏州同济医院\t141348\n三宫六院\t141349\n香港迪士尼乐园\t141350\n反过来\t141351\n顺河路\t141352\n刘木子\t141353\n情人迷\t141354\n温州房网\t141355\n信用期\t141356\n十吨\t141357\n身边人\t141358\n632700.com\t141359\n木工雕刻机\t141360\n爱探险的朵拉\t141361\n令狐冲\t141362\n人民网游戏\t141363\nm5a78l-m\t141364\n32部\t141365\n开林\t141366\nes文件管理器\t141367\n1&2\t141368\n索纳左耳\t141369\n高老\t141370\nNeuron\t141371\n60吨\t141372\n铜川矿务局\t141373\n空置期\t141374\n荣辱\t141375\n为国为民\t141376\n哨站\t141377\n边走\t141378\n正统\t141379\n龙光城\t141380\n600
0\t141381\n三千集\t141382\n试用\t141383\nZoomIt\t141384\n超级全能生\t141385\n乱套\t141386\n壁扇\t141387\n花托\t141388\n8265ac\t141389\n哈耶克\t141390\nu11\t141391\n2018-2019年\t141392\n莫瑞\t141393\n荣威rx5\t141394\n国营企业\t141395\n钢龙\t141396\n孔井\t141397\n提供服务\t141398\n济州市\t141399\n20170808\t141400\nrexxx\t141401\n320_\t141402\n柏林综合症\t141403\n压腿\t141404\nea111\t141405\n18049\t141406\n起可\t141407\nTSSOP\t141408\n徒手\t141409\n刘硕\t141410\n上海电影院\t141411\n第三方\t141412\n会计学院\t141413\n维信卡卡贷\t141414\n镜皇\t141415\n南油\t141416\n智能云输入法\t141417\n颗粒物\t141418\nWOS\t141419\n叉烧饭\t141420\n蒋军\t141421\n苓桂术甘汤\t141422\n孙越\t141423\nFlask框架\t141424\n酒肆\t141425\n房型\t141426\n市委市政府\t141427\n重庆干部网络学院\t141428\n坏档\t141429\npowermock\t141430\nAsahi\t141431\n锈\t141432\n巴子\t141433\n美咲\t141434\n蓝拳\t141435\n象类\t141436\n直链淀粉\t141437\n恐怖游轮\t141438\n清算组\t141439\n获取行\t141440\n大众mib\t141441\n计量柜\t141442\n亚共析钢\t141443\nspeculation\t141444\n悠莱\t141445\nCY\t141446\n7年前\t141447\n宁夏交通运输厅\t141448\n诚哥\t141449\n上海材料研究所\t141450\n鳄鱼岛\t141451\n2332\t141452\n幸福蛋\t141453\n纽约湾区\t141454\n迁改\t141455\n÷\t141456\n吉的堡\t141457\nCOOH\t141458\n一.doc\t141459\n堙灭\t141460\n999封\t141461\n活死人之地\t141462\n靠窗\t141463\nWent\t141464\n伤处\t141465\n毛选\t141466\n线索点\t141467\n吴燕\t141468\n角码\t141469\n魔神柱\t141470\n四十周\t141471\n呻\t141472\n东圃\t141473\n廖师兄\t141474\n锁子\t141475\n韦博\t141476\nreshape函数\t141477\n结膜炎\t141478\nTensorboard\t141479\nfmt\t141480\n懒人修仙传\t141481\n思维馆\t141482\n江春\t141483\n胜利日\t141484\n55周岁\t141485\n聚氨\t141486\nliang\t141487\n赢鼎教育\t141488\n调和油\t141489\n代码块\t141490\n比特币论坛\t141491\n世界政府\t141492\n卡巴内瑞\t141493\n文鸯\t141494\n貴方\t141495\n娘口\t141496\n有益健康\t141497\n宠物王国单机版\t141498\nNash\t141499\n尼木县\t141500\n生信\t141501\n河南省审计厅\t141502\nwarframe\t141503\n黑之契约者\t141504\n河北法院\t141505\n邪王追妻\t141506\n10:00\t141507\n失火罪\t141508\n脆皮鸭\t141509\n备课\t141510\n_麦可蛋糕网\t141511\nvc2008\t141512\n市校\t141513\n生财有道\t141514\n蒲江政府网\t141515\n益禾堂\t141516\n阴界\t141517\nUnzip\t141518\n膝骨关节炎\t141519\n打条\t141520\n义教\t141521\n打发\t141522\nAntelope\t141523\nGHD\t141524\n和音乐学院\t141525\n154140712\t1
41526\n改定位\t141527\n途牛旅游网\t141528\n朱巍\t141529\nBDF\t141530\n包皮阻复环\t141531\nELEVEn\t141532\n爱好\t141533\n牙粉\t141534\nCJJ\t141535\n哈佛商学院\t141536\n防滑链\t141537\nPORNO\t141538\n塑钢\t141539\n翠微百货\t141540\nTFRecords\t141541\nbrackets\t141542\n学邦\t141543\nSMAP\t141544\n叶城路\t141545\n郝副\t141546\n张延\t141547\n丝\t141548\n套定额_筑龙网\t141549\n传奇登录器\t141550\n6年前\t141551\n赵公山\t141552\n蓝红\t141553\n碳化\t141554\n贝尔宾\t141555\n单真\t141556\n以太坊智能合约编程之菜鸟\t141557\n贝多芬\t141558\n崇武镇\t141559\n32张\t141560\n医学衷中参西录\t141561\n拓号\t141562\n欢乐英雄\t141563\nintelr\t141564\n高质量\t141565\n小旺\t141566\n促进会\t141567\nAmaze\t141568\n哭穷\t141569\n消费心理学\t141570\n跑男3\t141571\n茶艺师\t141572\nscholarship\t141573\nglacier\t141574\n万题库\t141575\n薇娅\t141576\n相依\t141577\nintrinsic\t141578\nparcel\t141579\nrao\t141580\n分诊\t141581\n血管性\t141582\n使用方\t141583\n机油机滤\t141584\n铁灰\t141585\nsdg\t141586\nRetail\t141587\n隆庆\t141588\n公示期\t141589\n铝卷\t141590\n加心\t141591\nlist-style\t141592\nlonlife珑凌/玲珑网游加速器\t141593\n莫尔斯电码\t141594\n2批次\t141595\ncaring\t141596\n金瓶儿\t141597\n总医院\t141598\nvender\t141599\naoo\t141600\n凄テク\t141601\n処理\t141602\n河北北方学院\t141603\nWin2008R2\t141604\n塔柱\t141605\n容错\t141606\n劲性\t141607\n制作器\t141608\nGitKraken\t141609\n票据理财\t141610\n金山工业区\t141611\n第十五篇\t141612\n杜桥\t141613\n雷声\t141614\n不离\t141615\n广佛线\t141616\n糖块\t141617\nendo\t141618\n打非\t141619\n贫女\t141620\n简释\t141621\nytd\t141622\n火热\t141623\n遭弃\t141624\n武汉晨报\t141625\n磁盘根\t141626\n略萨\t141627\n草铵膦\t141628\n碑记\t141629\n经济普查\t141630\n单骑\t141631\n千百惠\t141632\nflour\t141633\n鼠来宝4\t141634\n血药\t141635\n荣耀体脂秤\t141636\n象山港\t141637\npinch\t141638\n上海市科技创业中心\t141639\n第23页\t141640\n课堂教学实录\t141641\n代表值\t141642\n艾司唑仑\t141643\n北京育英学校\t141644\n宁波市安全生产监督管理局\t141645\n逆ryona\t141646\n制片厂\t141647\n湟中县\t141648\n远洋\t141649\nkidding\t141650\n一辑\t141651\n唐杰\t141652\n更新率\t141653\n桃叶\t141654\n汇添富基金\t141655\nlcn\t141656\n李舜\t141657\n食疗法\t141658\n碧蓝海事局\t141659\nalian\t141660\n贵阳数博会\t141661\n洋觅网\t141662\n积微成\t141663\nP2P种子搜索器\t141664\n府河\t141665\n感人泪下\t141666\ntypeid\t141667\nntl\t141668\n绳命\t141669\
n街霸4\t141670\n日和坊\t141671\nkts\t141672\n峰顶\t141673\n难辞其咎\t141674\n湖北招标网\t141675\n天蒙山\t141676\n李庄村\t141677\nQDFuns\t141678\n训练班\t141679\n掠影\t141680\n平行班\t141681\nmuller\t141682\n好好\t141683\n电气工程及其自动化\t141684\n两癌\t141685\nCochrane\t141686\n炉石传说狗头人与地下世界\t141687\neden\t141688\nalexa排名\t141689\n黄婷婷\t141690\n足疗师\t141691\nXQB\t141692\n谭望嵩\t141693\n穿墙螺栓\t141694\nKDJ\t141695\n浙教\t141696\nSingle、Dog\t141697\n金山wps\t141698\n邱枫\t141699\n中兴通讯\t141700\n崔荣\t141701\n粧品\t141702\nlbi\t141703\n铁制\t141704\n虫牙\t141705\nKnee\t141706\n分布\t141707\n比德\t141708\n微波炉\t141709\n常州地区\t141710\n优户\t141711\nAttitude\t141712\n血液循环\t141713\n2.27\t141714\n耐水\t141715\n分类信息系统\t141716\n阿生\t141717\nClub\t141718\n影膜\t141719\njenkins2\t141720\n选择篇\t141721\n城建档案馆\t141722\nArrange\t141723\n物理流\t141724\n葡萄酸\t141725\n重要性\t141726\n凯特·温丝莱特\t141727\n烟草局\t141728\naccelerometer\t141729\n辛福\t141730\n希玛\t141731\n绒花\t141732\n苏州地税\t141733\n刘哔\t141734\nlighttools\t141735\n乌镇西栅景区\t141736\n按摩棒\t141737\n孙家栋\t141738\nEROS\t141739\n黑暗面\t141740\nshangh\t141741\n联合收割机\t141742\n舰队colle\t141743\n中变\t141744\n甜甜萌物语\t141745\n见了面\t141746\n1.0.12\t141747\n墨条\t141748\n和讯网\t141749\n杀道行者\t141750\n加时\t141751\n22纳米\t141752\n援华\t141753\n林在范\t141754\natol\t141755\n所罗门\t141756\nmtv\t141757\n望子成龙\t141758\n乡民\t141759\n深圳市雷赛智能控制股份有限公司\t141760\n力美健\t141761\n演说稿\t141762\neve吧\t141763\n刘忠\t141764\n古柯\t141765\n健康促进医院\t141766\n课堂导\t141767\n泰阳\t141768\nRODE\t141769\n注塑成型\t141770\n细分化\t141771\n8栋\t141772\n注册咨询师\t141773\ncharindex\t141774\n陡峭\t141775\n蓝莲花\t141776\n受用\t141777\ngtx960m\t141778\n麦当劳公司\t141779\n吉卜力美术馆\t141780\n黑身\t141781\nFVD\t141782\n景房\t141783\nGUITAR\t141784\n阿西莫\t141785\n工商管理系\t141786\n盲\t141787\n重新\t141788\n巨声\t141789\n沾\t141790\n铅块\t141791\n香港贸发局\t141792\nremains\t141793\n义博会\t141794\n越肩\t141795\n40页\t141796\n唐河\t141797\n飞狮\t141798\n爆操\t141799\n中国林学会\t141800\n刘焱\t141801\n黄金酒\t141802\n第113\t141803\n睚眦\t141804\n乌尔\t141805\n药企\t141806\nXHR\t141807\n超市夜未眠\t141808\n单机游戏网\t141809\n六十秒\t141810\n31页\t141811\n口供\t141812\n爆鳞龙\t141813\nFlexGrid\t
141814\n丽影广场\t141815\nInfection\t141816\n宝刀\t141817\n美樱\t141818\ninit\t141819\nv6.1.0\t141820\n杭州英特外国语学校\t141821\n马屿\t141822\n虐童案\t141823\n牛犊\t141824\n雷克顿\t141825\nencodeURIComponent\t141826\nmiscrosoft\t141827\n会员卡\t141828\n0220\t141829\n赛尔达\t141830\n23kg\t141831\n3.87\t141832\n绿地国际城\t141833\n探病\t141834\n蓝梅\t141835\nIhr\t141836\nwbin\t141837\n性情中人\t141838\n孔孝真\t141839\n此心安处\t141840\n连云港传媒网\t141841\n权健集团\t141842\n政体\t141843\n髋骨\t141844\n夏普屏\t141845\n上林县\t141846\n秀逗\t141847\n6一9岁\t141848\n神鸡\t141849\n第一百\t141850\n孙刘\t141851\n尚嘉中心\t141852\n龙肉\t141853\nMeans\t141854\nholly\t141855\n湄洲岛\t141856\n富里酸\t141857\n回龙观医院\t141858\n鹿泉\t141859\n鲁教版\t141860\n联联\t141861\n400px\t141862\ntop50\t141863\n木鸟短租网\t141864\n中国甘肃网\t141865\n供地\t141866\nuyghur\t141867\n2019年前\t141868\n箱根\t141869\n安道拓\t141870\n擦色\t141871\n相中\t141872\ncascader\t141873\n0032\t141874\n西安建工集团\t141875\nMK3\t141876\n徐州网\t141877\n柯美6501\t141878\n怀素真\t141879\nindemnity\t141880\nSQLState\t141881\n亿万级\t141882\n迎娶\t141883\n3.7.4\t141884\nIA\t141885\nBloodborne\t141886\n微震\t141887\n没有效\t141888\n自由端\t141889\n圈舞\t141890\n深圳市房地产中介协会\t141891\n僻字\t141892\n公主岭市\t141893\n两筐\t141894\n麦种\t141895\n一百千克\t141896\n恒龙\t141897\n0.5分\t141898\n蓝牙4.1\t141899\n限行\t141900\n工作报告\t141901\n弟兄们\t141902\n德仁\t141903\nb350plus\t141904\n不积跬步\t141905\n泰安市委\t141906\n青阳论坛\t141907\n罗伯森\t141908\n48P\t141909\n碧潭\t141910\ndetermine\t141911\n星河湾\t141912\nBit\t141913\n凸点\t141914\nMONKEY\t141915\nsymp\t141916\n掉速\t141917\n秀乳\t141918\n藏人文化网\t141919\n上古情歌\t141920\n中二希灵帝国\t141921\n正样本\t141922\n岭南天地\t141923\n点转\t141924\n位运算_\t141925\n宝贝计划\t141926\n4.21\t141927\n大仙儿\t141928\n曹广晶\t141929\n面组\t141930\n水蓝\t141931\n雀斑\t141932\noverride\t141933\n高漫\t141934\n刘思彤\t141935\n废片\t141936\nSUM\t141937\n原封\t141938\n天津南开大学\t141939\n海音\t141940\n九元\t141941\n飘零\t141942\n突发事件\t141943\n茗香\t141944\ncdr2017\t141945\nNOS\t141946\n光程\t141947\nssj\t141948\nubuntn\t141949\nHase\t141950\n好运符\t141951\nikun\t141952\n太空之旅\t141953\n脆化\t141954\n抓影网\t141955\n友军\t141956\n一生一次\t141957\n小芬\t141958\n狗
屎\t141959\nWinform\t141960\n运动控制器\t141961\n游戏园\t141962\n官署\t141963\n衬塑复合管\t141964\nmindjet\t141965\n铃\t141966\n过渡阶段\t141967\n滤筒除尘器\t141968\n常州汽车站\t141969\nmen\t141970\nNATIVE\t141971\n帝豪ev\t141972\nbrilliant\t141973\n赛德隆\t141974\nA57\t141975\n双腿\t141976\n战舰波将金号\t141977\nr10\t141978\n噗嗤\t141979\n后舍\t141980\nbuying\t141981\n国槐\t141982\n吴红\t141983\n榆垡镇\t141984\nReactJS\t141985\nword-CSDN\t141986\n90版\t141987\n几十块\t141988\n毛大庆\t141989\n心墙\t141990\n联席\t141991\n这话\t141992\n火影忍者h\t141993\nJava环境变量\t141994\n广州执信中学\t141995\n彩果\t141996\n黄花风铃木\t141997\n边柜\t141998\n三诺\t141999\n羊草\t142000\n一千万\t142001\n材业\t142002\nConditional\t142003\n沪江日语\t142004\nmeister\t142005\n6-1\t142006\n无罪\t142007\nmerriam\t142008\ndiff函数\t142009\n汉化吧\t142010\n2012年2月\t142011\n放烟花\t142012\n祖峰\t142013\n洋名\t142014\n超级无敌奖\t142015\n3842\t142016\nIoc\t142017\n国际型\t142018\n男丁丁\t142019\n盯盘\t142020\n二十字\t142021\n玖\t142022\n黄馨\t142023\n锌钢\t142024\n人间冰器\t142025\n全景机\t142026\n上海金山区\t142027\n新特药\t142028\n鬼针草\t142029\n国情咨文\t142030\n遗宝\t142031\n黛西\t142032\n首列\t142033\n朽木不折\t142034\nvertu\t142035\nsold\t142036\n各市\t142037\nRevised\t142038\nwin7/8版\t142039\n夕阳西沉\t142040\n控端\t142041\n水浒景甜\t142042\n无气\t142043\n周一早上\t142044\n管理学专业\t142045\nXenial\t142046\n书衣\t142047\n宇文化及\t142048\n匿名性\t142049\n60度\t142050\n答曰\t142051\n珠玑\t142052\n顶破\t142053\nhmate\t142054\n船用柴油机\t142055\n国资委研究中心\t142056\n昆布\t142057\n财智金\t142058\n结晶氯化铝\t142059\n6678\t142060\n南阳市第一人民医院\t142061\n吸油\t142062\n告急\t142063\n檔案\t142064\nContinues\t142065\nItems\t142066\n耗油\t142067\n张杰辉\t142068\n几百行\t142069\n锻压机床\t142070\n蚬壳胃散\t142071\n马家军\t142072\nm65\t142073\n军旗\t142074\n实战\t142075\n接住\t142076\n访客\t142077\n谢宁\t142078\n严令\t142079\n扑面\t142080\n工美\t142081\n虹卡\t142082\n硅树脂\t142083\nv9.2\t142084\n香港税务局\t142085\n尖点\t142086\n647\t142087\n陈春花\t142088\n做起\t142089\n弥勒市\t142090\n地里\t142091\n标志杆\t142092\nax2+bx+\t142093\n11期\t142094\npca\t142095\n8000公里\t142096\n盈透\t142097\n二极\t142098\n景瑞控股\t142099\nMTL\t142100\n帝师\t142101\nsin^2x\t142102\n大坪村\t142103\n塌下来\t142104\n湖北美院\t1421
05\n瞒\t142106\n英国杜伦大学\t142107\n三三集团\t142108\n顿挫感\t142109\n数字机\t142110\nz270\t142111\n知知为知知\t142112\n茹贝源\t142113\n改邪归正\t142114\n顾明城\t142115\n椎间孔镜手术\t142116\n坎布拉\t142117\n扫路车\t142118\nAutism\t142119\n飞索\t142120\nLeoWang\t142121\n保利天汇\t142122\n郭海\t142123\n北京798艺术区\t142124\nPS\t142125\n木线\t142126\n了不起\t142127\n苏怡\t142128\n40套\t142129\n希尔\t142130\nexpires\t142131\n招摇过市\t142132\ne5573\t142133\n苞米地\t142134\n盱眙\t142135\nCPO\t142136\n欢动\t142137\n西周镇\t142138\n士卒\t142139\n薇\t142140\nreal_query\t142141\n1163\t142142\n渊博\t142143\n警视\t142144\n卷饼\t142145\n杉杉来了\t142146\nDisconnected\t142147\n毕马威\t142148\n浪迹\t142149\ncatkin\t142150\n贝佳斯\t142151\n新浪科技_新浪网\t142152\n织田真子\t142153\n牛肉\t142154\n走进\t142155\n西周时期\t142156\n留置\t142157\n核酸酶\t142158\n元谋人\t142159\n80000\t142160\n保时捷Cayman\t142161\n麻醉针\t142162\n4.17\t142163\n冰激凌\t142164\n第一封\t142165\n重设\t142166\n溪头镇\t142167\n黑奴\t142168\n戴斯酒店\t142169\nSusie\t142170\n亲妈\t142171\niperf\t142172\nvue\t142173\n龙池镇\t142174\nTL-WN725N\t142175\n胖瘦\t142176\nPortlet\t142177\n凤倾天阑\t142178\npaloalto\t142179\n537\t142180\ntics\t142181\n龙潭山\t142182\n法向\t142183\n夹棉\t142184\n拳皇wing\t142185\n拱辰街道\t142186\n3.9.0\t142187\n刘海屏\t142188\n鉄\t142189\nos7\t142190\n法定产假工资-华律网\t142191\n建设银行快贷\t142192\nZongheng\t142193\n硕思闪客精灵\t142194\n台灣官方網站\t142195\n血汗工厂\t142196\n信通网\t142197\n凯伊秀\t142198\n开放大学\t142199\ndigdeep\t142200\n帝国时代2高清\t142201\n基教\t142202\n28歳\t142203\n知识青年\t142204\nsuperview\t142205\n稷山县政府\t142206\nHttp\t142207\n1a\t142208\nlayDate\t142209\neis\t142210\n至深\t142211\n20170508\t142212\n24V\t142213\n孙艺珍\t142214\n九份\t142215\n超漩式\t142216\n南直路\t142217\n青岛学前教育网\t142218\n猛犬俱\t142219\n7k7k\t142220\nrbz\t142221\n成都区\t142222\n翼尖\t142223\n跟斗\t142224\nXenApp\t142225\nSesame\t142226\n传话\t142227\n机械姬\t142228\np65\t142229\n罗索\t142230\n班淑\t142231\napa\t142232\nGeneration\t142233\n翻手为\t142234\n纯铁\t142235\n民事赔偿责任\t142236\nqemu-system-x86\t142237\n赌神2\t142238\n绿岛湖\t142239\n清音阁\t142240\n误喝\t142241\n众和\t142242\n杂质峰\t142243\n湖北省政府\t142244\n贷款\t142245\n雅诗敦\t142246\n琵琶行论坛\t142247\n2局\t1422
48\n大狂欢\t142249\n电竞文\t142250\n孟姜\t142251\n扎营\t142252\n烟片\t142253\n五金类\t142254\n皮椅\t142255\n区城管局\t142256\n安排\t142257\n行规\t142258\n戌狗\t142259\n主卧室\t142260\n例外\t142261\ntailf\t142262\n久久丫\t142263\n3份\t142264\n为谁而炼金\t142265\n封锚\t142266\n终极任务\t142267\n前年\t142268\n哏\t142269\n豉油鸡\t142270\n20160213\t142271\n上海市政工程设计研究总院(集团)有限公司\t142272\n军费\t142273\ngoods\t142274\n维斯\t142275\n无限乱斗\t142276\n日入\t142277\ndq\t142278\nUIS\t142279\n克拉玛依网\t142280\nma\t142281\n博洛尼橱柜\t142282\n氢\t142283\n西游记之大闹天宫\t142284\n第四轴\t142285\n颜好\t142286\n探究性学习\t142287\n愚夫\t142288\n经验性\t142289\n乌尔都语\t142290\n风心病\t142291\n成才路\t142292\nCurse\t142293\n玉藻前\t142294\nFAE\t142295\nUpside\t142296\n徐真真\t142297\n乐安县\t142298\n遵义职业技术学院\t142299\n心如\t142300\n快慢\t142301\n成长路上\t142302\n广西北部湾投资集团有限公司\t142303\n犬细小病毒\t142304\nws2812\t142305\n阿妈妮\t142306\n粤B\t142307\nangeles\t142308\n炉石\t142309\nxmkk\t142310\nOI\t142311\n广东华兴银行\t142312\n爱必妥\t142313\n周山\t142314\n黑逼\t142315\n孟建柱\t142316\n重庆邮电大学\t142317\nfplot\t142318\n蒙氏教具\t142319\n飞白\t142320\n美国签证中心\t142321\n新城市广场\t142322\n单身生活\t142323\n磨课\t142324\n天下贰\t142325\n游匣7577\t142326\n风水师\t142327\n独墅湾\t142328\n自润滑轴承\t142329\n甘氨酸\t142330\n行宫\t142331\n星悦\t142332\n特种钢\t142333\n光绪年\t142334\n瓶邪文\t142335\n碱式滴定管\t142336\nung\t142337\n260亿\t142338\n北界\t142339\ninstalled\t142340\n平面设计师\t142341\n厚爱\t142342\ndess\t142343\n开本\t142344\n1月3日\t142345\n中关村管委会\t142346\nSomewhere\t142347\n2618\t142348\n辛达苟萨\t142349\n朱拉隆功大学\t142350\nxsi\t142351\n全切\t142352\n名升网\t142353\n直江\t142354\nHood\t142355\n4000w\t142356\n退租\t142357\nTurmoil\t142358\n净负债率\t142359\n个险\t142360\n毛巾\t142361\nhuda\t142362\nbecca\t142363\ndragonbones\t142364\nnm\t142365\n九宫山\t142366\n质保部\t142367\n双连\t142368\n银泰城\t142369\n中国运载火箭技术研究院\t142370\n中关村\t142371\n三天两夜\t142372\n夺食\t142373\n随从\t142374\n干校\t142375\nSAp\t142376\n462号\t142377\n剑兰\t142378\n琴艺\t142379\n放言\t142380\n胡欣\t142381\n全讯\t142382\n毫无疑问\t142383\nm1l\t142384\n台达集团\t142385\n淘宝扫一扫\t142386\n东营市人力资源和社会保障局\t142387\n4极\t142388\n彩石溪\t142389\nregistered\t142390\n残疾\t142391\n32018\t142392\n广东证监
局\t142393\n分光\t142394\n吉翔\t142395\n圣康尼\t142396\n杜天皓\t142397\n30宝来\t142398\n气叉\t142399\n尾声\t142400\n三十万\t142401\n重庆城\t142402\n凯拉\t142403\n虽小\t142404\n加斯科\t142405\nbjf\t142406\n武装直升机\t142407\n余额宝\t142408\n空间计量经济学\t142409\n好饱\t142410\n这家店\t142411\n滕氏\t142412\n扰码\t142413\nmaintained\t142414\n溶血\t142415\n养猪\t142416\n升升\t142417\n梧桐语\t142418\n歌颂者\t142419\ndede58\t142420\n无领导小组\t142421\n逆流而上\t142422\n果粉们\t142423\n牙仙\t142424\n蟹爪兰吧\t142425\n求饶\t142426\nfreemaker\t142427\n张晓红\t142428\nEddie\t142429\n双塘\t142430\n18044\t142431\n【龙门镖局\t142432\n政经\t142433\n表单项\t142434\nonion\t142435\n衡水景县\t142436\n真帆\t142437\n轻灵\t142438\n佛陀\t142439\n高浓\t142440\n神歌\t142441\n城上\t142442\n想爱就爱\t142443\nlocat\t142444\n唑来膦酸\t142445\n聚乙烯醇\t142446\n方寒\t142447\n族谱\t142448\n质疑者\t142449\nValerie\t142450\n17周年\t142451\n苕\t142452\n笑傲江湖OL\t142453\n烈女\t142454\n广州省\t142455\nVeryCD电驴大全\t142456\nWin10文件资源管理器\t142457\n刘心武\t142458\n圈舍\t142459\nboron\t142460\nBreeze\t142461\n电动车控制器\t142462\n正大杯\t142463\n潜望\t142464\n免注\t142465\n网报\t142466\nJUFD\t142467\n陈煜东\t142468\n斗破苍穹特别篇\t142469\n育才中学\t142470\n顺丰速运(集团)有限公司\t142471\n爱心树\t142472\n第五十四\t142473\n许培扬\t142474\n300017\t142475\n千幅\t142476\nBDO\t142477\n邢国辉\t142478\nccbkr\t142479\n旦\t142480\n惠州市公安局\t142481\n335号\t142482\n雨刮条\t142483\n法律思维\t142484\n陆俨少\t142485\n皮冻\t142486\n太升路\t142487\n购房\t142488\n黄道益\t142489\nTroye\t142490\nmaintenance\t142491\n雷丘\t142492\n郑州市工商局\t142493\n手卷\t142494\nPDE\t142495\n中央电化教育馆\t142496\n体控\t142497\nSg\t142498\n莫旗\t142499\n121度\t142500\nRequest\t142501\n好成绩\t142502\n100元\t142503\n林腾蛟\t142504\n蜡克网\t142505\n贝尔利\t142506\n返程票\t142507\n每所\t142508\n淮安中学\t142509\n逢简水乡\t142510\n红盘\t142511\n流通币\t142512\nPasswords\t142513\n西瓜灯\t142514\n锰矿石\t142515\nWebmin\t142516\nzmap\t142517\nHER2\t142518\nAndroidstudio\t142519\n环球信用卡\t142520\nmin\t142521\n第80\t142522\n十二寸\t142523\n字表\t142524\nweng\t142525\n14月\t142526\n垃圾邮箱\t142527\n云影院yunyy.cc\t142528\n台州市国土资源局\t142529\ncinebench\t142530\nApparatus\t142531\n178\t142532\n餐位\t142533\n克洛斯威\t142534\n2014-2017年\t142535\n51自学网\
t142536\n痒\t142537\n枣庄西\t142538\nById\t142539\ncertbot\t142540\n130w\t142541\n伤愈\t142542\n雷达表\t142543\n工青妇\t142544\n赵鼎新\t142545\n1950年代\t142546\n中国人民大学商学院\t142547\n第123章\t142548\n闭环式\t142549\n3300\t142550\n炽天\t142551\nedelman\t142552\n珠海一号\t142553\n王老师\t142554\n彩色的梦\t142555\n狮舞\t142556\n赤军\t142557\nTOMMY\t142558\n首图\t142559\nPortrait\t142560\n海口火车站\t142561\n老子\t142562\n杨振东\t142563\n尼康d5100\t142564\n同床\t142565\n大连政府\t142566\nPace\t142567\n李文君\t142568\n玉带\t142569\nLR\t142570\n光热电站\t142571\n淘货源\t142572\n含氯\t142573\ncccf\t142574\n红酒渍\t142575\nsham\t142576\nfping\t142577\n工程经济学\t142578\n网页扫码\t142579\n虞舜\t142580\n城里\t142581\n大团圆\t142582\n芬顿试剂\t142583\n主阀\t142584\n竞速\t142585\n平安集团\t142586\n份证\t142587\n阜\t142588\n影音先锋资源男人站】影音先锋AV资源站_影音先锋\t142589\n杨辉三角\t142590\n祸害\t142591\nJolie\t142592\nkad\t142593\n微晶\t142594\nsocat\t142595\n100_\t142596\n麻城北\t142597\n澎湃\t142598\n空位\t142599\n合肥奥数网\t142600\n砌砖\t142601\n昭宥\t142602\n万次\t142603\n三位一体\t142604\n敞篷\t142605\n蝴蝶俱乐部\t142606\n交心\t142607\n科通\t142608\n江西国税\t142609\n丁苯酞\t142610\nGood\t142611\n温切斯特\t142612\n强夺\t142613\n中分化腺癌\t142614\n胸腔积液\t142615\n常识性\t142616\n化工\t142617\n苗方\t142618\n小公鸡和小鸭子\t142619\n吴川\t142620\n艾宾浩斯记忆法\t142621\n苦痛\t142622\n华素\t142623\n辐射量\t142624\nTEXTBOX\t142625\n麸皮\t142626\n中科院微生物所\t142627\n宁阳\t142628\n卡壳\t142629\ndecent\t142630\n团结一致\t142631\n10时\t142632\n00001\t142633\n横断山区\t142634\n万科润园\t142635\n下半年\t142636\njavasript\t142637\nfreeing\t142638\n当代史\t142639\n卖火柴的小女孩\t142640\nsorry\t142641\n超过两年\t142642\n爱奇艺影视大全\t142643\n5.7英寸\t142644\n7单\t142645\nV1.0免费版\t142646\n散标\t142647\n脱销\t142648\n电夹板\t142649\ncapslock\t142650\n五星宏辉\t142651\nlistBox\t142652\n大蛇丸\t142653\n董志强\t142654\nprevail\t142655\n花市鲜花网\t142656\n天歌三生不负三世\t142657\n卡马乔\t142658\n飞来峰\t142659\n心石\t142660\n秉公\t142661\n杀子\t142662\n固力果\t142663\n返回函数\t142664\nhaikou\t142665\n嘉善路\t142666\n吾辈\t142667\n首拆\t142668\n04\t142669\nMKJ\t142670\nwifi卡\t142671\n3月28\t142672\n郭店楚简\t142673\n云姐\t142674\n额额\t142675\nkeds\t142676\n土人\t142677\n长治市人民政府\t142678\n柘树\t142679\n闪电十一人3\t142
680\n千炮\t142681\n宜春市人民政府\t142682\nrene\t142683\n138号\t142684\nColumbus\t142685\n刷码\t142686\n矿脉\t142687\n濮院\t142688\n安之\t142689\n冬瓜山\t142690\n沪菜\t142691\n嫡孙\t142692\n12米\t142693\n莞讯网\t142694\nCB\t142695\nBD_特玩网流放之路\t142696\n波士顿梗\t142697\n直升机场\t142698\n徐州师范大学\t142699\n意浓\t142700\n淘宝客定向计划\t142701\n颠\t142702\n苏家屯\t142703\n亚特兰提斯\t142704\nariel\t142705\n深圳市眼科医院\t142706\nscute点cc\t142707\n独立日2\t142708\n南昌日报\t142709\n耳塞\t142710\niwara\t142711\nrownum\t142712\n本身\t142713\n贾村\t142714\n泽拉斯\t142715\n高英\t142716\n秋晨\t142717\n遇害\t142718\n劳动合同法实施条例\t142719\n主法\t142720\n228路\t142721\n盘灵\t142722\n天啦撸\t142723\n农校\t142724\nYod\t142725\nflexigrid\t142726\n10.0.0\t142727\n有说有笑\t142728\n思琪\t142729\n蒲苇\t142730\ncompressor\t142731\n闪照\t142732\n核桃味\t142733\n家用车\t142734\n接通\t142735\n九碗\t142736\n2口\t142737\n1200多\t142738\n20件\t142739\n啄木\t142740\n淮扬菜\t142741\n长春日报\t142742\n湖北省林业厅\t142743\n南池\t142744\nWindow7\t142745\n六合彩\t142746\n咲乃柑菜\t142747\ngyno\t142748\nMDD\t142749\nmrbean\t142750\nPolo\t142751\n洛基亚\t142752\n金范秀\t142753\n17度\t142754\n新城镇\t142755\n中华人民共和国个人所得税法\t142756\n特拉华\t142757\n影音先锋av\t142758\nKOF97\t142759\n昆明植物所\t142760\niPhone8\t142761\n八千万\t142762\n成交篇\t142763\n邻近色\t142764\n上述\t142765\n大木桥路\t142766\n男魔\t142767\n玛尔达\t142768\n再生纤维素纤维\t142769\n翁方纲\t142770\n北京首大耳鼻喉医院\t142771\n马克斯\t142772\n江安河\t142773\n羟苯磺酸钙胶囊\t142774\n太子爷\t142775\n动听\t142776\n中南大学数学与统计学院\t142777\n战战\t142778\nl1\t142779\n月下夜\t142780\n羊油\t142781\n一学时\t142782\n刘杨\t142783\n雪拉比\t142784\n龙涎\t142785\n庆丰包子\t142786\n供电器\t142787\n异同\t142788\n专题页\t142789\nsks\t142790\n板鸭\t142791\n行政区划网\t142792\n巴吉度\t142793\nbatches\t142794\nU.2\t142795\n三十本\t142796\n一条狗\t142797\ndop\t142798\n市名\t142799\nClearing\t142800\n高脚\t142801\n绝后\t142802\nЭ\t142803\ngini\t142804\n三合板\t142805\n第二次\t142806\nip6\t142807\n调息\t142808\n新华三技术有限公司\t142809\nkvc\t142810\nnec\t142811\n中字\t142812\n实话\t142813\n法莫替丁\t142814\n非常道\t142815\n智勇的爱车\t142816\n地权\t142817\n鞋帮\t142818\n市委党校\t142819\n大红袍\t142820\n电梯五方通话\t142821\n单轮\t142822\n寒心\t142823\n碳化木\t142824\n城投债\t142825\n乌洛托
品\t142826\n天河体育场\t142827\n网纱\t142828\nclassifieds\t142829\n+号\t142830\n花苞\t142831\n福运\t142832\n新英体育网\t142833\nOSPF\t142834\n大余湾\t142835\n20M\t142836\nBootable\t142837\n男星\t142838\n分块\t142839\n中建二局\t142840\n女图\t142841\n桂人\t142842\n童安格\t142843\n中信期货\t142844\n风风雨雨\t142845\n只若\t142846\n可燃冰\t142847\nalibab\t142848\n同级生2\t142849\n十六番\t142850\n搬板\t142851\n幽子\t142852\n中风后遗症\t142853\n认人\t142854\n白粉病\t142855\n龙塔\t142856\n兽交\t142857\n岛式\t142858\n瓦子\t142859\nhanson\t142860\n哪款\t142861\n科尔沁\t142862\n汉萨\t142863\n有道翻译API\t142864\n16mm\t142865\n5535\t142866\nPumps\t142867\n东源\t142868\n组委\t142869\n哂\t142870\n东华大学\t142871\n元大都遗址公园\t142872\n陀枪\t142873\n圈速\t142874\nScrollBar\t142875\n协议价\t142876\n爆兽猎人\t142877\n血神\t142878\n2621\t142879\n阴户\t142880\nvigil\t142881\n底架\t142882\n大悟吧\t142883\n__览潮网\t142884\n昆明地铁4号线\t142885\n篆刻刀\t142886\nicu\t142887\n易贝乐国际少儿英语\t142888\ntent\t142889\n撩夫\t142890\n拼音表\t142891\n标致4008\t142892\n成都博物馆\t142893\n难以忘怀\t142894\n爱佑\t142895\n湖北省通信管理局\t142896\n徐小闩\t142897\n佳慧\t142898\n世纪佳缘\t142899\n赵燕侠\t142900\n深度尺\t142901\n无可匹敌\t142902\n群晖DSM\t142903\n迢迢\t142904\nDatePickerDialog\t142905\n威斯布鲁克\t142906\n留言簿\t142907\n麻辣教师\t142908\n清润\t142909\n唯佳\t142910\nMao\t142911\n电子政务网\t142912\nAlexia\t142913\n鎓\t142914\n剑子\t142915\n西雅图\t142916\n奥林匹克运动会\t142917\n搅拌杯\t142918\n随求\t142919\n马英九\t142920\n优酷视频播放器\t142921\n嗜血娇娃\t142922\nNigel\t142923\nProduce101\t142924\n抛物面\t142925\naccounting\t142926\n卡农论坛\t142927\n岚禾\t142928\n大涡\t142929\nsiji\t142930\n瞬干胶\t142931\n磨粒\t142932\n冷水管\t142933\n愕然\t142934\n电显示器\t142935\n金志\t142936\n翡翠系\t142937\n达美航空公司\t142938\n一边倒\t142939\n黑金刚\t142940\n血液透析\t142941\n全分\t142942\nettercap\t142943\n果醋\t142944\n开瑞k50\t142945\nwin7操作系统\t142946\n传国\t142947\n北京联通\t142948\n外祖父\t142949\n细思恐\t142950\n厂房\t142951\n中湖\t142952\n陕西政府\t142953\n轰趴\t142954\n郝颖\t142955\n美标\t142956\n紧绷绷\t142957\n有偿求\t142958\n下肢静脉血栓\t142959\n见习\t142960\n年堂\t142961\n汉宫飞燕\t142962\n水滴鱼\t142963\n瑞信\t142964\n东方罗马花园\t142965\n大内密探零零发\t142966\n隆化县\t142967\n柏楚\t142968\n近6个月\t142969\n20160617\t142970\n我爸爸\t1
42971\n中国一路\t142972\n500兆\t142973\n刺青师\t142974\n私通\t142975\nJessieJ\t142976\n差馆\t142977\nQuin\t142978\nA03\t142979\n假面骑士Ex-Aid\t142980\n文淇\t142981\nnltk\t142982\n婕拉\t142983\n在那里\t142984\n法兰距\t142985\n20160502\t142986\n水长流\t142987\n电动伸缩门\t142988\n中华工控\t142989\nGO佳居家具网\t142990\n世通卡\t142991\n快递面\t142992\n珠海经济特区\t142993\n乌衣镇\t142994\n暴制暴\t142995\nowen\t142996\nDiane\t142997\n演丰镇\t142998\n四通八达\t142999\nallowing\t143000\n禾本科竹亚科\t143001\n嬉笑怒骂\t143002\n众泰T300\t143003\n2018-03-13\t143004\n附庸风雅\t143005\nUNLIMITED\t143006\n人参酒\t143007\n左枕\t143008\ncapability\t143009\n年终奖\t143010\n果壳箱\t143011\n恒大俱乐部\t143012\n第148期\t143013\nPedro\t143014\n.Net\t143015\n七美德\t143016\nQualcomm\t143017\n熊猫书院\t143018\nuserdata\t143019\n美帝\t143020\n蓝宇\t143021\n誉胜\t143022\nNa2S\t143023\n天痕\t143024\ntbody\t143025\n放射治疗\t143026\n月圆\t143027\n海棠果\t143028\n爹地宠\t143029\niviewui\t143030\n济北\t143031\nHappened\t143032\n火影忍者3\t143033\n唐僧肉\t143034\n30种\t143035\n生活家装饰\t143036\n顺风车主\t143037\n电子商务有限公司\t143038\n花折\t143039\n碳纳米\t143040\n0467\t143041\n5袋\t143042\n天津政府\t143043\n零售革命\t143044\n冬妮娅\t143045\nMu\t143046\nStream\t143047\n117路\t143048\n职员工\t143049\n泽林\t143050\n备用\t143051\n0x000007\t143052\nNvme\t143053\n布画\t143054\n1.5毫米\t143055\n小米九号平衡车\t143056\n金单\t143057\n卓豪\t143058\n怵\t143059\n29万\t143060\n手表盒\t143061\n笔算除法\t143062\n康佰\t143063\n北京国际信托有限公司\t143064\n7y7.com\t143065\n花朵朵\t143066\n邓广铭\t143067\n中华泰山号\t143068\nCoderZh\t143069\n宫颈管\t143070\n认识图形\t143071\n肿块\t143072\n裙板\t143073\n忻府\t143074\nOrson\t143075\n肄业证\t143076\n德丽莎\t143077\n绝地求生大逃杀鼠标宏\t143078\nInternational\t143079\nmoschino\t143080\n签证录\t143081\n2017.2.4\t143082\n诱导屏\t143083\nWebP\t143084\n怎敢\t143085\n门脸\t143086\n驼\t143087\n圣索非亚大教堂\t143088\n群规\t143089\n15T\t143090\nckfinder\t143091\n子文件夹\t143092\npartion\t143093\nBASARA\t143094\nBoring\t143095\n辉耀\t143096\n初级经济师考试\t143097\n6420\t143098\n昌乐二中\t143099\n遮风挡雨\t143100\n1.87\t143101\n南京郑和外国语学校\t143102\n轻举妄动\t143103\n冠道\t143104\n快马加鞭\t143105\n烧卖\t143106\nBillion\t143107\n椰子汁\t143108\nmeditation\t143109\n南强\t14311
0\n密集度\t143111\n新仓镇\t143112\nInsanely\t143113\n挥之不去\t143114\n符文工房3\t143115\nDatatables中文网\t143116\n2017年6月26日\t143117\n五选\t143118\n活荷载\t143119\n昆玉\t143120\n石溪\t143121\n苹果开发者中心\t143122\nCaribbean\t143123\nXZ1/XZ1\t143124\n迫击\t143125\n恶妇\t143126\nv3.0.8\t143127\n黎志涛\t143128\n江干四季青\t143129\n14班\t143130\n朦天龙八部\t143131\n波点\t143132\n篱门\t143133\nPLC200\t143134\nOmniGraffle\t143135\n最高工资\t143136\n货币战\t143137\n21.0.0\t143138\n面向\t143139\n诚志\t143140\n初级版\t143141\nMerchandise\t143142\n古井镇\t143143\ntaiyo\t143144\n纸面石膏板\t143145\n纸船\t143146\ntpe\t143147\n法卡顿\t143148\n上饶百姓网\t143149\nnames\t143150\n50余\t143151\n择木\t143152\n|狗\t143153\n子宫切除术\t143154\n所得税\t143155\n有法可\t143156\nLoading\t143157\n梁越群\t143158\n生命周期\t143159\n弓矢\t143160\n山东省粮食局\t143161\n疆内\t143162\nnaomi\t143163\n马德钟\t143164\n月满西楼\t143165\nexpr\t143166\nServlet\t143167\n橙香\t143168\ndur\t143169\n安源煤业\t143170\nHelios\t143171\n宝岛\t143172\n20160506\t143173\n一菲\t143174\n赛金花\t143175\n齿轮式\t143176\n长跑节\t143177\n3千拳\t143178\n新红楼梦\t143179\n百丈\t143180\n改之\t143181\n溶水\t143182\n驻颜\t143183\nspringboot\t143184\n4650\t143185\n纾\t143186\n红葡萄酒\t143187\n蓝猫\t143188\n日播\t143189\n氧化风机\t143190\n优卡丹\t143191\n单组\t143192\n立秀宝\t143193\n软纱\t143194\n第四十三期\t143195\n阿瑞\t143196\n逝者如斯\t143197\n牙齿矫正器\t143198\n成都国际学校\t143199\ndesigner18\t143200\n2600X\t143201\nSNH48星梦剧院\t143202\n惊\t143203\n陈建功\t143204\n长老年\t143205\n调香师\t143206\n大立教育\t143207\n永州职业技术学院\t143208\n退出率\t143209\n沉香救母\t143210\nOnes\t143211\n解难题\t143212\n天津市耀华中学\t143213\n虫灾\t143214\n引荐\t143215\n甄嬛后传\t143216\n手机内存卡\t143217\n苏交科\t143218\n行尸走肉第一季\t143219\n美团骑手\t143220\nlifesmart\t143221\n湖西\t143222\n701\t143223\ninji\t143224\n同修\t143225\n鹫\t143226\n女保镖\t143227\n分卷\t143228\n两包\t143229\n口水三国\t143230\nseize\t143231\n煎饼\t143232\n绿岸\t143233\n苏国\t143234\nPDF转换器\t143235\nKishi\t143236\n西王\t143237\n浮沫\t143238\n无机物化\t143239\n中山杉\t143240\n富都\t143241\ntesseract\t143242\n京瓷FS\t143243\n王雪华\t143244\n1盘\t143245\n崔天琪\t143246\n50家\t143247\n000031\t143248\n93版\t143249\n有始有终\t143250\nunfold\t143251\n倩女幽魂_九游论坛\t143252\n胡地
\t143253\n掼蛋王\t143254\n侄子\t143255\n计划群\t143256\n利尔达\t143257\n品社\t143258\nRSLogix\t143259\nrefractory\t143260\n鲁商置业\t143261\n3.0.6\t143262\n软磁材料\t143263\n中肥网\t143264\n樱空桃\t143265\nTravelling\t143266\n三公九卿\t143267\nPuzzle\t143268\n横纹\t143269\n枪杆\t143270\n裆部\t143271\n1.doc\t143272\n哈夫曼编码\t143273\n叶金朝\t143274\n楚网\t143275\n中兴大厦\t143276\n汽保\t143277\n陈若琳\t143278\n贪图\t143279\nCPY\t143280\n训话\t143281\nhie\t143282\nnvp\t143283\n仙剑奇侠传5:前传\t143284\ntxt阅读器\t143285\n挑拨\t143286\n汴州\t143287\n25小时\t143288\ndream\t143289\n巴彦浩特\t143290\npas\t143291\n二十一年\t143292\n12周岁\t143293\n长春南湖公园\t143294\nknot\t143295\n经管学院\t143296\n四脚\t143297\n王莽岭\t143298\nOven\t143299\n刘经理\t143300\n宇视\t143301\n玉米粥\t143302\n所获\t143303\n咽鼓管\t143304\n大纲\t143305\n人教版九年级英语\t143306\n人力资源管理软件\t143307\n刺杀金正恩\t143308\n起亚捷豹xf\t143309\n深套\t143310\nSOR\t143311\n13起\t143312\n断联\t143313\n绝地逃生\t143314\n维多利亚广场\t143315\nmugen\t143316\n王一丹\t143317\n__九江新闻网\t143318\nJPush\t143319\nssn\t143320\n龙东社区\t143321\n可测量\t143322\n大闹天竺\t143323\nTEP\t143324\n曹强\t143325\n中考网\t143326\n第十一条\t143327\n嘻嘻网\t143328\n偷卖\t143329\n316L\t143330\n手卡\t143331\n23g\t143332\n圣才学习网\t143333\npvr\t143334\n小米相册\t143335\n肖劲光\t143336\n建类\t143337\n美乐乐家具城\t143338\n莲下镇\t143339\n航海学院\t143340\n呈贡县\t143341\n渝味晓宇\t143342\n宏基Acer\t143343\n房地产价格指数\t143344\ndlc3\t143345\n橱子\t143346\nFootjob\t143347\n独孤漠归来复仇时秋_墓王之王悬棺寺\t143348\n资智回汉\t143349\nZWWoOoOo\t143350\nINFILE\t143351\n三代同堂\t143352\nbwt\t143353\n北京市通州区人民政府\t143354\n草馏社区\t143355\n柜业\t143356\nrescue\t143357\n全新换代\t143358\nccer\t143359\n1月28日\t143360\nk5\t143361\nGlucose\t143362\n战士们\t143363\n1250\t143364\n義\t143365\n空调格\t143366\n哈氏合金\t143367\n核桃壳\t143368\n万字\t143369\nwep\t143370\n四盘位\t143371\n哈苏\t143372\n网易房产长沙站\t143373\n1个\t143374\n续流二极管\t143375\n拘传\t143376\nfluoride\t143377\n660MW\t143378\n简惬\t143379\n龙腾小说网\t143380\n谷米\t143381\n迷底\t143382\n西地那非\t143383\n庐江县\t143384\n王者荣耀冒险模式\t143385\n赵庄村\t143386\n苏州大学外国语学院\t143387\n小祖咒\t143388\ntfz\t143389\n中国客车网\t143390\n一应俱全\t143391\nImports\t143392\n六彩\t143393\n十两\t143394\n蓝眼睛\t14339
5\n国际经济法\t143396\n低钾血症\t143397\n速通门\t143398\nHotkey\t143399\n藿香正气液\t143400\n录像厅\t143401\n48码\t143402\n原话\t143403\n打车费\t143404\n2.0t\t143405\nSpiders\t143406\n二2\t143407\n韩城\t143408\nJDBBX\t143409\n平津战役\t143410\n政府预算\t143411\nvpp\t143412\nrequ\t143413\n陈美琪\t143414\n金叶子\t143415\nmeigui\t143416\n35#\t143417\n建安公司\t143418\n崔伟\t143419\nusda\t143420\n丙卷\t143421\n小额工程\t143422\n紫甘蓝\t143423\n96斤\t143424\n神武2\t143425\n禅风\t143426\n范志红\t143427\n吉利新能源汽车\t143428\n16针\t143429\n建筑路\t143430\n复活\t143431\n丁红\t143432\n数千万\t143433\n37mm\t143434\n洪潮\t143435\n高雄市\t143436\n教授\t143437\nbit1129\t143438\n完美情人\t143439\n1623\t143440\n尹美莱\t143441\n永恒之蓝病毒\t143442\n孙彤宇\t143443\n前苏联\t143444\n2016-2017年度\t143445\n上海宝尊电子商务有限公司\t143446\n澳美制药\t143447\nJackets\t143448\nejs\t143449\nModelling\t143450\n家用除湿机\t143451\n砍头息\t143452\n超五类线\t143453\n初中部\t143454\nミク\t143455\n杜克色\t143456\n马竞\t143457\n效力性\t143458\n升降晾衣架\t143459\n擂主争霸赛\t143460\n3003\t143461\n长夜\t143462\n从重处罚\t143463\n计算流体力学\t143464\n飞虎之潜行极战粤语版\t143465\n牛股王\t143466\n2017月\t143467\n热化\t143468\n江中猴姑饼干\t143469\n刘晨\t143470\n社会治安综合治理\t143471\n琪琪网\t143472\n鄠邑区\t143473\n武外英中\t143474\nMoxing\t143475\n部署\t143476\n并继竿\t143477\n好气\t143478\n禄口国际机场\t143479\n秘書\t143480\n走罐\t143481\n自然指数\t143482\n温氏股份\t143483\n冒险者\t143484\n爱看小说网\t143485\n无骨雨刮器\t143486\n统\t143487\n35互联\t143488\n恒誉\t143489\n合家\t143490\n华为p6\t143491\nIpad2\t143492\n截尾\t143493\n精采\t143494\n秦峰\t143495\n《中国商界》杂志社\t143496\n眺望\t143497\npc6苹果网\t143498\n3昆特\t143499\na1586\t143500\n心内膜炎\t143501\ncenos7\t143502\nzuoai\t143503\n飞车\t143504\n马眼棒\t143505\n长平之战\t143506\n玉兰公馆\t143507\n弟兄\t143508\n德阳\t143509\n眼镜人\t143510\n战伤\t143511\n砂砾\t143512\ngxyz\t143513\n校验器\t143514\nMHA\t143515\nVITAS\t143516\n瘦身\t143517\nCOUNTIFS\t143518\n硫铝酸盐水泥\t143519\ntender\t143520\n看番\t143521\n软回车\t143522\n百尊\t143523\n考理\t143524\nFX6300\t143525\n刘小伟\t143526\n无水葡萄糖\t143527\nⅱ\t143528\n淘宝商\t143529\n摘苹果\t143530\n九襄\t143531\n轮到\t143532\n盛世娇宠\t143533\n渔阳滑雪场\t143534\n汪氏\t143535\n选读\t143536\n北图\t143537\nh700\t143538\n左行\t143539\n昌州\t143540\n132
2104\t143541\n牙齿\t143542\nHMM\t143543\n评教\t143544\n法源\t143545\nResponse\t143546\n中度贫血\t143547\n看我撸\t143548\nViola\t143549\n厉薇薇\t143550\ncontemporary\t143551\n服装制版师\t143552\n云山路\t143553\n川崎摩托\t143554\n东风破\t143555\n银鹰\t143556\n大同路\t143557\n血玲珑\t143558\n切分\t143559\n成人高等学校招生全国统一考试\t143560\n雄安新区市民服务中心\t143561\n弹性体\t143562\n至尊三十六计之偷天换日\t143563\n志明与春娇\t143564\n深圳技师学院\t143565\n撒娇\t143566\n周建军\t143567\n南开大学计算机与控制工程学院\t143568\n小米手环\t143569\n加贺\t143570\n卷板\t143571\n惹议\t143572\n东莞证券\t143573\n巨婴国\t143574\n半夜十二点\t143575\n8000句\t143576\n全面战争战锤2_全面战争战锤2\t143577\n南京中大医院\t143578\nXX局\t143579\n合家欢\t143580\n护卫队\t143581\n新蔡县人民政府\t143582\n二十世纪八十年代\t143583\n粗面\t143584\n大良客运站\t143585\n省时\t143586\n讨\t143587\n蔡英文\t143588\n萃华\t143589\n郑风\t143590\n飞豆快递\t143591\n非学历教育机构\t143592\n任小芹\t143593\n中铁上海局\t143594\n反话\t143595\n祖庙街道\t143596\n6方\t143597\n开操\t143598\nnmb\t143599\n五芯\t143600\n扶手\t143601\n200多部\t143602\n玄妙\t143603\nマジ\t143604\n瓦尔基里\t143605\n省行\t143606\n国家工商局\t143607\nTanaka\t143608\n胎压表\t143609\nGuideline\t143610\n锦江区\t143611\n要略\t143612\nLAM\t143613\n戚帅\t143614\n石长铁路\t143615\n腾势\t143616\n40粒\t143617\n精灵鱼\t143618\nP2PSearcher\t143619\nChip\t143620\nJustany_WhiteSnow\t143621\n蓝盔\t143622\n比亚迪汽车工业有限公司\t143623\npitstop\t143624\n&#039\t143625\nve\t143626\n乙脑减毒活疫苗\t143627\nAtmosphere\t143628\n四川天府银行股份有限公司\t143629\n朱正延\t143630\n电点\t143631\nf364\t143632\n多指教\t143633\n顿觉\t143634\n蓝天路\t143635\n潇湘人才网\t143636\ndockerd\t143637\n地梁\t143638\nOLTP\t143639\n杰娜\t143640\n金婚\t143641\n黑公司\t143642\nEndless\t143643\n阿兹莫丹\t143644\n朴园\t143645\n广州万科\t143646\n四合一气体检测仪\t143647\n上海迪士尼乐园酒店\t143648\n金投奢侈品网\t143649\nmongobooster\t143650\nuno\t143651\n开坛\t143652\n帝少的心尖宠\t143653\n叙永县人民政府\t143654\n服务力\t143655\n虎门港\t143656\n当爱\t143657\n4.09\t143658\n合水\t143659\n外踝\t143660\nv0.14.0\t143661\n深圳商报数字报\t143662\n拱北汽车客运站\t143663\n老一小\t143664\n詹森\t143665\nxijinping\t143666\n长春市九台区人民政府\t143667\nTEA\t143668\n微元\t143669\n临城\t143670\n凌派论坛_汽车之家论坛\t143671\n四圣\t143672\n萝岗万达广场\t143673\n水果杯\t143674\ncprime\t143675\n刘旭阳\t143676\n牙肉肿痛\t143677\np
chmonster\t143678\nplmn\t143679\n_友\t143680\n杂乱\t143681\nblender\t143682\n导航控制器\t143683\noffer\t143684\nWonderful\t143685\n纹身师\t143686\n学士学位\t143687\n29种\t143688\n政府机构\t143689\nflooring\t143690\n自考\t143691\n基部\t143692\n缤\t143693\nkoobee\t143694\n夜钓\t143695\nbeta2\t143696\nHelsinki\t143697\n令人称奇\t143698\n第几分\t143699\nxzz\t143700\n2933\t143701\nthesis\t143702\n马绍尔群岛\t143703\n机动战士高达SEED\t143704\nPhrase\t143705\n20161\t143706\n3.3亿\t143707\n耐磨陶瓷\t143708\n梦幻五开吧\t143709\n衡水市公安局经济开发区分局\t143710\n自热\t143711\neducation\t143712\n克鲁塞德战记\t143713\n黄石人才网\t143714\nzvv\t143715\n宋慧乔\t143716\ni7-7700hq\t143717\n囧囧丸\t143718\n公有链\t143719\nfraser\t143720\n忘年恋曲\t143721\n大风\t143722\n六十六岁\t143723\n母婴\t143724\n20册\t143725\n8000多个\t143726\n九夏\t143727\n51年\t143728\n香芒\t143729\n渗出液\t143730\n顺心\t143731\n雷吉纳\t143732\n淮南凤台\t143733\n警笛\t143734\n3.26\t143735\n平井坚\t143736\n手眼\t143737\nCommits\t143738\n钳子\t143739\n买车网\t143740\n文本谱\t143741\n海贝\t143742\n野生\t143743\n处女贴\t143744\n前世今\t143745\nsecvtorii\t143746\n塌\t143747\n黑青\t143748\n生出\t143749\n野途网\t143750\n批发市\t143751\n宁波市质量技术监督局\t143752\n洋品牌\t143753\nmixin\t143754\n萨凡纳\t143755\n夫子\t143756\n两遍\t143757\n斑马网\t143758\nESPN\t143759\n撼\t143760\n丁建刚\t143761\n合川日报数字报\t143762\n中国档案网\t143763\nvn7\t143764\n伶仃\t143765\n上古卷轴5mo\t143766\n腌面\t143767\n巨海\t143768\n叫鸡\t143769\n蜜宠\t143770\n启隆乡\t143771\njizz\t143772\n色香\t143773\n4.06\t143774\n峡\t143775\n房村\t143776\n圆盘式\t143777\n丹亚\t143778\n地铁人\t143779\n示威\t143780\n广州白云国际机场股份有限公司\t143781\n创领者\t143782\n贵州路\t143783\n孙维君\t143784\n试飞\t143785\n接发球\t143786\n报警器\t143787\n超级西游记\t143788\n广东网\t143789\n跳段\t143790\n旋喷\t143791\n企联\t143792\n工作时间长\t143793\nec2\t143794\n低级语言\t143795\n拉条子\t143796\n美安\t143797\nfertility\t143798\n三十次\t143799\nMPs\t143800\nYYF\t143801\n机不可失\t143802\n青海省科学技术厅\t143803\n太平镇\t143804\n版本库\t143805\n江美仪\t143806\n于朦龙\t143807\n人民司法\t143808\n住宿\t143809\nhttpurlconnection\t143810\n航摄\t143811\n传单\t143812\n癫痫\t143813\nCFTC\t143814\nspf50\t143815\n义务劳动\t143816\n罗时丰\t143817\n翠湖湾\t143818\n丹徒新区\t143819\n汤布\t143820\n一1\t143821
\n嗷呜\t143822\nCIM\t143823\n全频带\t143824\n新社区\t143825\nBeginner\t143826\n宓妃\t143827\n卓玛\t143828\n一拳\t143829\ncd-rom\t143830\nlsass\t143831\n3241\t143832\na1661\t143833\n英格尔\t143834\nPrefix\t143835\n卞留念\t143836\n奥朗德\t143837\n溜冰\t143838\n隐形眼镜\t143839\n引资\t143840\n每套\t143841\n患得患失\t143842\nstated\t143843\n气胸\t143844\n金柳露\t143845\n往回\t143846\n金髪天国\t143847\n仿品\t143848\n高性感\t143849\n真空接触器\t143850\n超品相师\t143851\n第33\t143852\n讯飞输入法\t143853\n珍妮曲奇\t143854\n人见人\t143855\nxion\t143856\nwup\t143857\nflyme5\t143858\n新塘镇\t143859\n纬纱\t143860\n火棍\t143861\n原速\t143862\n1908年\t143863\n大鱼际\t143864\n蓝盒\t143865\n缝纫机乐队\t143866\n福州道\t143867\n爱普生投影\t143868\n干草\t143869\n英思科\t143870\n南宋\t143871\n小玫瑰\t143872\n丁彦周凯旋\t143873\n数成\t143874\n十二月份\t143875\n双缝\t143876\n喷泡吧\t143877\n麻阳苗族自治县\t143878\n鑫海\t143879\n陈同学\t143880\n法那科\t143881\n类关系\t143882\nAMH\t143883\n上海健桥医院\t143884\nNX8.0\t143885\n失者\t143886\n张桐\t143887\n72天\t143888\ndword\t143889\n5档\t143890\n北大深圳医院\t143891\n相册\t143892\nzhh\t143893\nc3p0\t143894\n刍狗\t143895\n大专院校\t143896\nshortest\t143897\n中铁十七局\t143898\n瑞秋·麦克亚当斯\t143899\n4天内\t143900\n茨木童子\t143901\n湖北省国土资源厅\t143902\n红旗超市\t143903\n姐妹哥\t143904\n兴许\t143905\n食事\t143906\nphpquery\t143907\ncoke\t143908\nRecommender\t143909\n安卓版_5577安卓网\t143910\n类属\t143911\n萧七爷\t143912\nABS泵\t143913\n得安\t143914\n3210\t143915\n失陷\t143916\nwcf\t143917\n企业改革与管理\t143918\n代驾\t143919\nelta\t143920\nXia\t143921\n胜利者\t143922\nAK47\t143923\nCAD转换器\t143924\nCheek\t143925\n装模\t143926\n江苏宗申\t143927\n陕西农业网\t143928\ntokidoki\t143929\n划算\t143930\n警司\t143931\n段子手\t143932\n安市\t143933\n秦EV\t143934\n金山\t143935\n论述类\t143936\n第59\t143937\ndopdf\t143938\n长沙高铁站\t143939\n女用性\t143940\n印章\t143941\n腿儿\t143942\n应纳税额\t143943\n白蒿\t143944\n战利\t143945\n亚洲足球俱乐部\t143946\nXR\t143947\n江西工业职业技术学院\t143948\n檐沟\t143949\n加坦杰厄\t143950\n第13轮\t143951\n红锈\t143952\n腿酸\t143953\n高月\t143954\n2638\t143955\n整复\t143956\n爸爸去哪儿第一季\t143957\n一种感觉\t143958\n迅雷9\t143959\n神林\t143960\n话本小说网\t143961\nspend\t143962\nlpf\t143963\nDHC\t143964\nLeapMotion\t143965\nw700\t143966\nMix
in\t143967\n童哥\t143968\n600645\t143969\n朋购\t143970\n夺心\t143971\n大针蜂\t143972\n扑家\t143973\n戴尔灵越15R\t143974\n云南站\t143975\n水灯\t143976\n精修\t143977\n试管口\t143978\n奥特曼传奇\t143979\n端盖\t143980\n金羊\t143981\n裸眼3D\t143982\n国邦\t143983\n天好\t143984\nPROTREK\t143985\n声光\t143986\nanesthesia\t143987\n赵国强\t143988\n守望先锋联赛\t143989\n牛大王\t143990\n马蹄糕\t143991\nunimplemented\t143992\n尽心竭力\t143993\ne房网\t143994\n二婚男\t143995\ndg\t143996\nnx8.0\t143997\n老高\t143998\n晨光文具店\t143999\n黄道十二宫\t144000\nTSKS韩剧社\t144001\n需\t144002\nETtoday國際\t144003\nparallax\t144004\nQQ音乐\t144005\n高低压柜\t144006\ntooling\t144007\n陈鸿\t144008\n#iOS\t144009\nHTML/JS\t144010\n家族信托\t144011\nSorcerer\t144012\n浦城\t144013\n坝上草原\t144014\n潜水吧\t144015\n魔装\t144016\nIAMUE\t144017\nBeanUtils\t144018\n新西兰北岛\t144019\n支付券\t144020\n三浩\t144021\n魅动\t144022\n西安高新区\t144023\nGourmet\t144024\n升式\t144025\n反斗\t144026\n四杯\t144027\n大话哈尔滨\t144028\nDMC\t144029\n江干区政府\t144030\n阿尔滨\t144031\nAmplification\t144032\nBGM\t144033\n磁条卡\t144034\n反应\t144035\n三人舞\t144036\n多纳鲁马\t144037\n金银花颗粒\t144038\n前列腺增生\t144039\n性教育\t144040\n一亿个\t144041\n月亮播放器\t144042\n轩辕剑外传\t144043\n推窗\t144044\n四平\t144045\nfuwu\t144046\n好吃佬\t144047\n观课\t144048\n国潮\t144049\nopencart\t144050\n牛仔装\t144051\n贾家镇\t144052\n卡尔加里\t144053\n枯枝\t144054\n珠宝城\t144055\n堀口真希\t144056\n十万伏特\t144057\n电锁\t144058\n泽润\t144059\n背式\t144060\n弹丸论破\t144061\nFeelunique\t144062\n聋哑人\t144063\nProject2010\t144064\nprofit\t144065\n广州市第六中学\t144066\n指尖\t144067\n360root\t144068\n贴标机\t144069\n钩针\t144070\n先走\t144071\n维克多\t144072\nIsla\t144073\n武汉市第一医院\t144074\n既遂\t144075\n滴答\t144076\n欧洲\t144077\nRaJor\t144078\n佐藤\t144079\n加满\t144080\n暹罗之恋\t144081\n崆峒派\t144082\n换弹\t144083\n被动句\t144084\n奥突斯\t144085\n拉伤\t144086\n乐在其中\t144087\n含油\t144088\n直饮水\t144089\nmuseo\t144090\n块茎\t144091\n钢银\t144092\n中国园林网\t144093\n86411119\t144094\n米洛\t144095\n750克\t144096\n襄阳市第五中学\t144097\n春回大地\t144098\n放价\t144099\nds3231\t144100\n西双版纳国际度假区\t144101\n8720\t144102\n葛格\t144103\nHelping\t144104\nlenth\t144105\n056型\t144106\n二价\t144107\n班集体\t144108\ncorporations\t
144109\n视野\t144110\n五女\t144111\n巍子\t144112\n中山街道\t144113\nEducators\t144114\n户部巷\t144115\nfind函数\t144116\n五亿\t144117\n南湖花园\t144118\n后村\t144119\n兰舍\t144120\nNOC\t144121\nofficejet\t144122\n蒙面女\t144123\n不堪入目\t144124\nGPIB\t144125\n裁人\t144126\nmistake\t144127\n叶县\t144128\n仓储费\t144129\n六爻八卦\t144130\n10万家\t144131\n2018年5月12日\t144132\n晕车\t144133\n超静定结构\t144134\nBenchmark\t144135\n北京积分落户制\t144136\n下播\t144137\n蜂蜜网\t144138\n3CM\t144139\n三国群英传5\t144140\n大户室\t144141\n要脸\t144142\n人造\t144143\nfiler\t144144\ncreampie\t144145\nJawn\t144146\n南海渔村\t144147\n利众\t144148\n附魔师\t144149\n无线电台\t144150\n百金贷\t144151\nfabulous\t144152\n曲杆\t144153\nsaga\t144154\n抢注\t144155\n江阴\t144156\n手指甲盖\t144157\nEAN\t144158\n阿里社\t144159\n国家自然科学基金基础研究知识库\t144160\n陆励成\t144161\n内阻\t144162\n求实\t144163\n大林寺桃花\t144164\n月坛\t144165\n内务部\t144166\n逗游网\t144167\nDROP\t144168\n地质学报\t144169\n生理期\t144170\n第六关\t144171\n翻译家\t144172\n湖南省卫生计生委\t144173\n导演系\t144174\n82级\t144175\nTiger5G\t144176\n3-6岁儿童学习与发展指南\t144177\n山东教育社\t144178\na57\t144179\n成都省\t144180\n金砖银行\t144181\n把脉\t144182\nContemporary\t144183\n王者荣耀排位\t144184\n15代\t144185\n审计长\t144186\n一觉醒来\t144187\n发动机盖\t144188\n树脂胶\t144189\n骨肉相连\t144190\n伦敦德里\t144191\njason\t144192\n10万多\t144193\nK3C\t144194\n96芯\t144195\n涟泉大江户\t144196\n刘涌\t144197\n营销心理学\t144198\nviewsonic\t144199\n哈尔滨日报\t144200\n青岛崂山政府\t144201\nsmar\t144202\n小声\t144203\nATTACK\t144204\n限期\t144205\n龙角\t144206\ntimes\t144207\nfaze\t144208\n山魂\t144209\n尝一尝\t144210\n滴墨\t144211\n史莱姆\t144212\n2004-2018年\t144213\n_浴花谷花卉网\t144214\n第98届\t144215\n凤凰院\t144216\n非得已\t144217\n淮阴\t144218\n4肖\t144219\n履约险\t144220\n土地爷\t144221\n中国纺机网\t144222\nSEQUEL\t144223\n张宝\t144224\n汉中洋县\t144225\n1.0.4\t144226\n古墓丽影:源起之战\t144227\n整单\t144228\n新疆日报\t144229\nUSnews\t144230\n纵波\t144231\n猪e网\t144232\n基本建设财务管理规定\t144233\n重色\t144234\n霰弹\t144235\n绝地求生回放\t144236\n借助于\t144237\n吉林森工\t144238\n马蹄莲\t144239\n预估\t144240\n前月\t144241\n年号\t144242\n招行上海分行\t144243\n蒙娜丽莎瓷砖\t144244\n马拉色菌\t144245\nantibiotics\t144246\nWebCam\t144247\n山楂茶\t144248\n杨建明\t144249\n西红花\t1
44250\ntimo\t144251\n汪雨\t144252\n春风沉醉的晚上\t144253\n樱花日语\t144254\ni5-8600K\t144255\nCoca\t144256\nxtreg\t144257\n几十款\t144258\n詹姆斯韦德\t144259\n周岩\t144260\nSpan\t144261\n消字灵\t144262\nWopus\t144263\nliyonghui\t144264\n陈莉萍\t144265\ntoupper\t144266\n梁跨\t144267\n传送带\t144268\n中共绵阳市委\t144269\n成都航空有限公司\t144270\n摘草莓\t144271\ncnblue\t144272\n长春桥\t144273\n20171117\t144274\n开水机\t144275\n克拉霉素胶囊\t144276\n新时代农民讲习所\t144277\nWorkstation\t144278\n平假名\t144279\nwen10\t144280\nc92\t144281\n翊\t144282\n中山镇\t144283\n杭州城\t144284\n自毙\t144285\n徐光华\t144286\nSonnar\t144287\n沉默的羔羊\t144288\n梦泪\t144289\n金枫酒业\t144290\n客体\t144291\n印度片\t144292\n365号\t144293\n增大\t144294\n湘潭大学法学院\t144295\n韩兆\t144296\nRealize\t144297\n新潮能源\t144298\n爱奇艺vr\t144299\n钱江新城\t144300\n战乙女\t144301\n16克\t144302\nexcel公式\t144303\n叠印\t144304\n半框\t144305\n2.8.1\t144306\n吉米\t144307\n社会保障卡\t144308\n遗传基因\t144309\n大好河山\t144310\ncodecombat\t144311\n天大地大\t144312\n毛边纸\t144313\n趣流网\t144314\n重叠式\t144315\n放开那三国2\t144316\n清扬婉兮\t144317\nVSCO\t144318\n补骨脂\t144319\n道口\t144320\n搬家\t144321\n龙桥镇\t144322\n计型\t144323\n黑湾\t144324\n低面\t144325\n山西省人民检察院\t144326\n非流动资产\t144327\niPhone8对比8P\t144328\n梦龙乐队\t144329\nviscous\t144330\n30袋\t144331\n挠性\t144332\nWilhelm\t144333\n将军在上\t144334\n京沪线\t144335\n索尼xz1c\t144336\nssdb\t144337\n碗\t144338\n留作\t144339\n肺气虚\t144340\n危途\t144341\n皇汉堂\t144342\n號段\t144343\n扁壶\t144344\n西北\t144345\n8px\t144346\n金神\t144347\n吉林省委\t144348\n转校生\t144349\n吉庆\t144350\n容忍度\t144351\n三角枫\t144352\n三滥\t144353\n压力型\t144354\n宁波市第一医院\t144355\n杰斯塔\t144356\ncss3\t144357\n一恒\t144358\n端砚\t144359\n友阿国际广场\t144360\n羌塘\t144361\n吊球\t144362\n瓜子二手车网\t144363\n冲版机\t144364\n莆田站\t144365\n农村片\t144366\nbet365\t144367\n自适应容器\t144368\n李季\t144369\n一英寸\t144370\n段波\t144371\n赫连\t144372\n自觉\t144373\n洛扎县\t144374\n猪女\t144375\n杨波\t144376\nX270\t144377\n顺鑫农业\t144378\nchrome内核浏览器\t144379\n金投机构\t144380\n庆典\t144381\n多普勒雷达\t144382\noutflow\t144383\n上下卷\t144384\nSaul\t144385\n扬眉吐气\t144386\nUOP\t144387\n20140303\t144388\nIText\t144389\n20时\t144390\n康盛股份\t144391\n无击\t144392\n说话时\t14439
3\n排比句\t144394\n耦合度\t144395\n内讧\t144396\n2777\t144397\n牛排骨\t144398\n对频\t144399\n天山路\t144400\nmodu\t144401\nMattress\t144402\n商城类\t144403\nChameleon\t144404\n拒单\t144405\n帧动画\t144406\nFUTABA\t144407\n弘历\t144408\n幸甚\t144409\n余显斌\t144410\nDIMM\t144411\n陈良贤\t144412\n账屋网\t144413\n计调\t144414\n火门\t144415\n中外\t144416\n陈佩秋\t144417\nfoxmail邮箱\t144418\n女欢\t144419\n夸网\t144420\n淮安市人力资源和社会保障局\t144421\n第124章\t144422\n管束式\t144423\n闷杀\t144424\n1dx2\t144425\nOQC\t144426\n实验法\t144427\n长野\t144428\n天子传奇\t144429\nBD720p-1080p\t144430\n国信优易\t144431\n乐高玩具\t144432\n一转眼\t144433\nSnake\t144434\n国商\t144435\n酒盅\t144436\n小毅\t144437\n黑太阳731\t144438\n000932\t144439\nFingerprint\t144440\n电路\t144441\n南京师范大学泰州学院\t144442\n嘉悦广场\t144443\n六品\t144444\n离队\t144445\n马泉\t144446\n女牛\t144447\n火神\t144448\n9928\t144449\n线棒\t144450\n佳能ip1188\t144451\n南通市港闸区\t144452\n干涸\t144453\n天津师范大学\t144454\n霜月\t144455\n网络教育专升本\t144456\n躺倒\t144457\nMIDI键盘\t144458\n籽棉\t144459\n粽\t144460\n长绒棉\t144461\n本田缤智\t144462\n谢桥\t144463\nOpencv3\t144464\n汗毛\t144465\n十年前\t144466\n香港学校\t144467\n傲骄\t144468\n澎湃研究所\t144469\n白莲花\t144470\n秋浦河\t144471\n试块\t144472\n乐乐镇\t144473\n奥拉星亚比\t144474\nmrquin\t144475\n曲阜孔庙\t144476\n羊蹄\t144477\n乳峰\t144478\n兵马俑在线\t144479\n美豆\t144480\n杨门女将\t144481\n责任者\t144482\n建湖人才网\t144483\nPiezo\t144484\npetalinux\t144485\nQing\t144486\n1656\t144487\n拼趣\t144488\n第几场\t144489\n高分\t144490\n【洛奇\t144491\nfake\t144492\n甘拜\t144493\n1617\t144494\n敲除\t144495\n佑猴\t144496\n河道\t144497\n想必\t144498\n办公家具\t144499\n纳管\t144500\n江东路\t144501\n公认\t144502\n3000起\t144503\n映照\t144504\n蔡名照\t144505\nSER\t144506\n将军胡同\t144507\n转轴\t144508\n韩子平\t144509\n金钱帝国\t144510\n本本\t144511\nip6s\t144512\n华大\t144513\n横移式\t144514\n洛璃\t144515\n二叠\t144516\nsexually\t144517\n冷却塔\t144518\nANU\t144519\n门当户\t144520\natmospheric\t144521\n56888\t144522\nvoices\t144523\nД\t144524\nOla\t144525\nmast\t144526\n醋泡花生米\t144527\n阚立文\t144528\n校名\t144529\nvehicles\t144530\niran\t144531\nIPEX\t144532\n重复建设\t144533\n菁客\t144534\n1884年\t144535\ncynchanpin\t144536\n滑梯\t144537\n卖些什么\t144538\n
乐化\t144539\nBitA\t144540\n金牛湖\t144541\nr270\t144542\nsupersocket\t144543\n薛之乐嘉\t144544\n20161217\t144545\n肉眼牛排\t144546\n三国立志传2\t144547\n城建局\t144548\n一掷\t144549\n滑环\t144550\n药粉\t144551\nECMAScript\t144552\n检测机\t144553\n保利清能西海岸\t144554\ncmo\t144555\nGAMES\t144556\n芦苇叶\t144557\n杨臣刚\t144558\n牵牛子\t144559\n命令集\t144560\n节段性\t144561\n古越剑箫\t144562\n龙牧\t144563\n20140130\t144564\n曲波\t144565\n兴东\t144566\nY67\t144567\n利益\t144568\n计科\t144569\n旅舍\t144570\n石墨粉\t144571\n人造文化石\t144572\nPremiere2018\t144573\n时分秒\t144574\nCommands\t144575\ngvc\t144576\nveer\t144577\n吹塑模具\t144578\n镊子\t144579\n李律师\t144580\n上厕\t144581\n武亦姝\t144582\n这一天\t144583\n全作品\t144584\n中国移动139邮箱帮助中心\t144585\n房龙\t144586\n连心桥\t144587\nArp\t144588\n卡路\t144589\n以中\t144590\n寿宁路\t144591\n门型\t144592\nInk\t144593\n祺智\t144594\n灰飞烟灭\t144595\n塑钢瓦\t144596\n亿利集团\t144597\n西餐牛排\t144598\n欢乐喜剧人第二季\t144599\n彼得\t144600\n赵小涣\t144601\n分科\t144602\narmyfai\t144603\n竞业\t144604\nhurts\t144605\n荣康\t144606\n05秒\t144607\n101式\t144608\n交通事故\t144609\n150方\t144610\nprolific\t144611\nbxxfighting\t144612\nbead\t144613\n18043期\t144614\n45.\t144615\n兴趣度\t144616\n0xc0000428\t144617\n沫晴\t144618\nlogit\t144619\n马氏\t144620\n西城广场\t144621\n126邮箱\t144622\n合并\t144623\n上海师大\t144624\n建业桂园\t144625\n37.8度\t144626\n硫酸雾\t144627\n石家庄蓝天医院\t144628\n拉姆达\t144629\nvbp\t144630\n803路\t144631\n北戴河机场\t144632\n住宅质量保证书\t144633\n无以\t144634\nwit\t144635\n2017年11月19日\t144636\n第六次全国人口普查\t144637\n罢免\t144638\n包火\t144639\nAnalytical\t144640\neclips\t144641\n货运\t144642\n玄女\t144643\n62期\t144644\njpe\t144645\n红烧猪蹄\t144646\n圣安地列斯吧\t144647\ndesk\t144648\n负片\t144649\n事先\t144650\nGLASS\t144651\n230g\t144652\nAmazeUI\t144653\n藁城市\t144654\n南极地区\t144655\n77D\t144656\n电壁炉\t144657\n福临\t144658\n育儿问答-摇篮网\t144659\n百瑞德\t144660\n铜山县\t144661\n德惠市\t144662\nPoster\t144663\n黑彝\t144664\n路基亚\t144665\n彪哥闯奉天#\t144666\n广东省发改委\t144667\n上海铁路局\t144668\n林肯mkc\t144669\n筋\t144670\n粘手\t144671\n空客A380\t144672\nNanning\t144673\n东门\t144674\n影音先锋资源_先锋影音资源站_影音先锋\t144675\n高生\t144676\nwin10红色警戒2\t144677\n圣墟\t144678\n郑颖\t14
4679\n海淀区政府\t144680\n家庭主妇\t144681\n普洛\t144682\n沟壑\t144683\n短音\t144684\n血界\t144685\n勒夏特列\t144686\n张秀萍\t144687\n掀背车\t144688\nqtimer\t144689\nselfie\t144690\n肖氏\t144691\n寺庙客\t144692\n04集\t144693\nMotorola\t144694\n中国改革论坛网\t144695\n第二十课\t144696\n淮中\t144697\n佳源都市\t144698\n320kbps\t144699\n学类\t144700\nOCX\t144701\n常斌\t144702\n专用性\t144703\ncbs\t144704\n鹊华\t144705\nkmsg\t144706\npushd\t144707\ndnserror\t144708\n喹乙醇\t144709\n刘清\t144710\n合集\t144711\n知夫子\t144712\n索引值\t144713\n想日\t144714\n黎蓓露\t144715\n放倒\t144716\n广州市人民政府\t144717\nepidemiology\t144718\n3排\t144719\n北卡罗莱纳州立大学\t144720\n米连卡\t144721\n附睾炎\t144722\n全星\t144723\n北辰实业\t144724\n十二日游\t144725\nDSL\t144726\nMP288\t144727\nbcrypt\t144728\n认证\t144729\n360手机卫士\t144730\n柴胡舒肝丸\t144731\nBD09\t144732\n后河\t144733\nintermediary\t144734\n广艺\t144735\nactuator\t144736\n加速度传感器\t144737\n美惠\t144738\n烟台路\t144739\n北特科技\t144740\nforticlient\t144741\n浅\t144742\n何龙\t144743\n第88号\t144744\n变形金刚吧\t144745\n陈酒\t144746\n沙王\t144747\n中国远洋海运\t144748\n铝盖\t144749\n4片\t144750\n禅院\t144751\n长直\t144752\n3.5.7\t144753\n沿黄公路\t144754\n二手电\t144755\nAPL\t144756\n飞行类\t144757\njsk\t144758\nUSC\t144759\n8888\t144760\n青岛市人民政府办公厅\t144761\n锦衣之下\t144762\natoi函数\t144763\nCheat\t144764\n美视\t144765\nCATV\t144766\n沪嘉高速\t144767\n曾今\t144768\n瑞贝卡\t144769\nGH4\t144770\nAPDL\t144771\n贝利特\t144772\n600度\t144773\n凯美瑞论\t144774\n硅宝科技\t144775\n心里的声音\t144776\nMall\t144777\n2018年终\t144778\n轻质隔墙板\t144779\n100斤\t144780\n关税减让\t144781\nU8860论坛\t144782\nusb驱动器\t144783\n封门村\t144784\n汽车音乐网\t144785\n中国人民解放军理工大学\t144786\n版.rar\t144787\nPredix\t144788\n钱某\t144789\n朱俊\t144790\n郭金龙\t144791\n匾\t144792\n五十噚\t144793\n地下工程\t144794\n许继集团\t144795\nrigorous\t144796\n富家女\t144797\n渔庄\t144798\n派丽蒙\t144799\n病人\t144800\n2006年12月\t144801\n摧枯拉朽\t144802\n1515\t144803\n阻截\t144804\n中国科学院电子学研究所\t144805\n奇趣\t144806\n刑侦类\t144807\n每一个\t144808\n实用版\t144809\n赖岳谦\t144810\n龙鼓\t144811\n天猫双十一\t144812\n双女\t144813\n城隍庙\t144814\n3月8日\t144815\n蠢材\t144816\nrecruit\t144817\n范曾\t144818\n数破\t144819\n米葫芦\t144820\n组阁\t144821\n配置管理器\t144
822\n医药界\t144823\n南汇\t144824\niastate\t144825\n算法导论\t144826\n渡航\t144827\nwr886n\t144828\nFEET\t144829\n20150801\t144830\n14分\t144831\n刷点\t144832\nrorshach\t144833\n彩虹兔\t144834\npino\t144835\n三角形\t144836\n世茂天城\t144837\n骨肿瘤\t144838\n解铃\t144839\n外高桥\t144840\nboi\t144841\n杨珊\t144842\n小米max2吧\t144843\nPR2018\t144844\n八大关\t144845\n温斯顿·丘吉尔\t144846\n手机盒\t144847\n>\t144848\nXian\t144849\n水浒英雄\t144850\n边境口岸\t144851\n丈八北路\t144852\n试婚\t144853\n长安cs15\t144854\n仪琳\t144855\n太原机场\t144856\n梁益建\t144857\n红星机器\t144858\n芳园\t144859\nCDKey\t144860\n2323\t144861\n徐州市城乡建设局\t144862\n宋毅\t144863\n圣恩\t144864\n手机搜房网\t144865\n贼海\t144866\n硫酸氢钾\t144867\n天翼网关\t144868\n聪明的投资者\t144869\nBolt\t144870\ngriffin\t144871\n第78章\t144872\nshutter\t144873\n国英双语.BD\t144874\n挑战者\t144875\n岗前\t144876\n河源市公安局\t144877\n成人秀\t144878\n贵州省工商行政管理局\t144879\n抄测\t144880\n宝版\t144881\n上野树里\t144882\nYonex\t144883\n海洲\t144884\n陕西电信\t144885\nSanitary\t144886\n延边富德\t144887\n90回\t144888\nCustomized\t144889\n莫桑钻\t144890\nlibsvm\t144891\n印花布\t144892\n核磁共振谱\t144893\n青鳉\t144894\n不间断电源\t144895\nsapiens\t144896\n6i\t144897\n车箱\t144898\n奥迪S6\t144899\n6174\t144900\n淘米\t144901\n禁毒\t144902\n大周\t144903\nwset\t144904\n统合\t144905\n平心\t144906\n真剑佑\t144907\n興奮\t144908\n加拉哈德\t144909\n天士力控股集团\t144910\n第29个\t144911\nfloat,double\t144912\n4ps\t144913\nksweb\t144914\nViewGroup\t144915\n李妮\t144916\n因果图\t144917\nconomic\t144918\n200ul\t144919\n避寒\t144920\n张海华\t144921\n上午8点\t144922\n正处\t144923\nCodePush\t144924\nstame\t144925\n伙力\t144926\n村级\t144927\n微民网\t144928\n金刚芭比\t144929\nnes\t144930\n郭力\t144931\nesta\t144932\nKPW\t144933\n料盘\t144934\n百度云资源BT\t144935\n金体\t144936\n平津\t144937\n滥用职权\t144938\n闽越\t144939\n田守枝\t144940\nTDT\t144941\n轻骑兵\t144942\n老勺\t144943\n水嘴\t144944\n80后\t144945\n刀仪\t144946\n联想中国\t144947\n卫生高级\t144948\n沃里克\t144949\n乱不乱\t144950\n合战\t144951\n不肖\t144952\n毛羽\t144953\n佳宁\t144954\n展牌\t144955\n舞动青春\t144956\n高萍\t144957\n拳皇2002\t144958\n上海宝冶集团\t144959\n歌典\t144960\n焰火\t144961\n115级\t144962\nimageio\t144963\n钻尾\t144964\n行孝\t144965\n70分\t144966\n损耗率
\t144967\nlocalStorage\t144968\n美智库\t144969\n桥本凉\t144970\nwedo\t144971\ngtx1030\t144972\nsk8\t144973\n圈地\t144974\n水龙果\t144975\n蒋丽\t144976\n响水县\t144977\n记写\t144978\n粤海街道\t144979\n玻璃糖\t144980\n半岛都市报\t144981\n七关\t144982\n佳能G2800\t144983\n淮海戏\t144984\n悠悠寸草心\t144985\nmp4\t144986\n微生物培养基\t144987\nKryo\t144988\nF盘\t144989\n齿状线\t144990\n国英\t144991\nFTDI\t144992\n新凤鸣\t144993\nIFRS\t144994\noncall\t144995\n柏太阳神\t144996\n三千世\t144997\n破伤风疫苗\t144998\n中国人民政治协商会\t144999\n镜湖区\t145000\n教师们\t145001\n归不归\t145002\n工业和信息化部信息中心\t145003\n吴青\t145004\n第三方责任险\t145005\n钢芯\t145006\n克诺斯邦\t145007\n一二九\t145008\nguardian\t145009\n长风街\t145010\n思美\t145011\n舞女\t145012\n艾瑞金屋藏娇\t145013\n优瑞\t145014\n新凯家园\t145015\n方剂篇\t145016\n配属\t145017\nuilabel\t145018\n万科公司\t145019\n天河一号\t145020\nyunv\t145021\nEMS\t145022\n胶盒\t145023\nprayer\t145024\n天山换流站\t145025\n天津环球金融中心\t145026\n新会\t145027\n泽国镇\t145028\n一同\t145029\n武宣县\t145030\n深吻\t145031\n劲射\t145032\n乞讨者\t145033\n孤岛惊魂吧\t145034\n纽卡斯尔大学\t145035\n小树人\t145036\n上缴\t145037\n中国会计视野网\t145038\nMot\t145039\n珍藏展\t145040\n山楂片\t145041\n按摩\t145042\n庆七一\t145043\n危与机\t145044\nkylinlin\t145045\n蜗杆传动\t145046\nuulive\t145047\n检假\t145048\n玉湛高速\t145049\n20140614\t145050\n土木工程学院\t145051\n潮涌\t145052\nrtl\t145053\n共富新村\t145054\n震源\t145055\nmotto\t145056\n纯素\t145057\nCorsa\t145058\n全通\t145059\n宝沃bx5\t145060\n伦敦希思罗机场\t145061\n乡镇便民服务中心\t145062\n吴世春\t145063\nJava编程思想\t145064\nfategran\t145065\necpm\t145066\n摩擦声\t145067\nRosa\t145068\n滤液\t145069\n物竞天择\t145070\n20150920\t145071\nAeron\t145072\n红衣教\t145073\npacked\t145074\nVIPKID\t145075\nmsvcr80.dll\t145076\n地物\t145077\nquerydsl\t145078\n孕妈\t145079\n键角\t145080\nBinding\t145081\nChick\t145082\n增稠\t145083\nnba吧_\t145084\n湖北程力\t145085\n再植\t145086\nKingdom\t145087\n720x1280\t145088\n智浪淘沙\t145089\n深圳公明\t145090\ntariffs\t145091\n叶超\t145092\n不聚\t145093\n营房\t145094\n第3题\t145095\n碘酸钾\t145096\n清羽\t145097\nxu\t145098\nMissed\t145099\n孤烟\t145100\n羊头\t145101\nlxmhhy\t145102\n天府七中\t145103\nmongodb3.4\t145104\n卢浮宫\t145105\n英石\t145106\n比安\t145107\n排骨饭\t14510
8\n第一百二十章\t145109\n硖石\t145110\n汊\t145111\n126\t145112\n李菁菁\t145113\n乌恰县\t145114\n稀贵\t145115\n喋喋不休\t145116\n一式两份\t145117\n180亿韩元\t145118\n猪农\t145119\nStr\t145120\n前任3再见前任\t145121\n装盘\t145122\n一起去\t145123\n阿弥陀佛么么哒\t145124\n菊水\t145125\nWhore\t145126\n识人\t145127\n小窗\t145128\nroar\t145129\n梁边\t145130\n启文\t145131\n亿辉\t145132\noutdoors\t145133\n好弹\t145134\n流星雨\t145135\n福州市人民政府\t145136\n开沟机\t145137\n御膳房\t145138\n女儿城\t145139\n油型\t145140\n|性\t145141\n_角蛙\t145142\njirayu\t145143\n咸宁北\t145144\n账套\t145145\nDDU\t145146\n邱意浓\t145147\n党十九大\t145148\n芝华\t145149\nFOREIGN\t145150\n探鱼器\t145151\n20170119\t145152\n3000多个\t145153\nMuseums\t145154\n滕州市\t145155\nWitness\t145156\norangepi\t145157\n褒义\t145158\nrhino5\t145159\nboltzmann\t145160\n中村静香\t145161\n脱逃者2\t145162\nuws\t145163\n伦文叙老点柳\t145164\nPandaKill\t145165\n球径\t145166\n碟刹\t145167\n克洛洛\t145168\nDelhi\t145169\n雷龙\t145170\n4欧\t145171\nSrxh1314\t145172\n成都工商银行\t145173\n教导处\t145174\n特特\t145175\n前生\t145176\n赤水市\t145177\n长乐国际机场\t145178\n于毅\t145179\nCvMat\t145180\n小升初_无忧考网\t145181\n市场版\t145182\n取代基\t145183\nchild\t145184\n临沂市教育局\t145185\n25片\t145186\n龙歌\t145187\n1.22.2\t145188\n榛子\t145189\n渲\t145190\nbarrier\t145191\n土地管理法\t145192\n广视通\t145193\n薛亚萍\t145194\n颜控\t145195\n制伏\t145196\n商贸类\t145197\n石家庄铁道大学\t145198\n腾讯企点\t145199\n张太雷\t145200\n几百万条\t145201\n小河村\t145202\n债权人\t145203\narclist\t145204\na1370\t145205\n之二十七\t145206\n简而言之\t145207\nHCS\t145208\n编程人生\t145209\n州城\t145210\n红于\t145211\nInfi\t145212\n装配\t145213\n罗王\t145214\n358元\t145215\n临湘政府网\t145216\n认知行为疗法\t145217\n周巷\t145218\nSAOIF\t145219\n信息码\t145220\n郭羽\t145221\n铁骑\t145222\n蚕室\t145223\nnumber\t145224\n微谷\t145225\n北京顺丰快递\t145226\n221&\t145227\nCFA\t145228\n上海市政府采购中心\t145229\n山楂条\t145230\n转转\t145231\n杜Amy\t145232\n欲语\t145233\n康萃乐益生菌\t145234\n红艳\t145235\n挑剔\t145236\n调味料\t145237\n网康科技\t145238\n决定稿\t145239\ndia\t145240\n亿色\t145241\n仿真树\t145242\n提高\t145243\n0.58\t145244\n王学兵\t145245\n实股\t145246\n小圈子\t145247\n上海猎头公司\t145248\n伤悲\t145249\n下留情\t145250\n蕉门\t145251\nMei\t145252\n27栋\t145253\nwowo
\t145254\n温昶\t145255\nopi\t145256\n直播吧zhibo8.cc\t145257\n中上等\t145258\nWin8系统之家\t145259\n第152章\t145260\n顾言\t145261\nPCB联盟网\t145262\n王钦敏\t145263\n中央广播\t145264\nRunJS\t145265\n多年前\t145266\n第三十二条\t145267\nPES2017\t145268\nireader阅读器\t145269\nsecooler\t145270\n宝船\t145271\n殒落\t145272\n山家\t145273\n万五\t145274\n张逸杰\t145275\n幻月\t145276\n思南县人民政府\t145277\n大妖\t145278\n黑杆\t145279\nSSS级\t145280\n狂神\t145281\n奥修\t145282\nhooker\t145283\n太原小学\t145284\n杨业\t145285\n割礼\t145286\n单反相机\t145287\nn9006\t145288\n荻花宫\t145289\n055\t145290\n上海汽车南站\t145291\nkenneth\t145292\n希斯莱杰\t145293\n卡盒\t145294\n纯禽\t145295\nshadowplay\t145296\n法润\t145297\n邻友\t145298\n风骏5\t145299\n拎\t145300\n杨跃\t145301\n蚩尤\t145302\n清屏\t145303\n春申路\t145304\n管氏\t145305\n307国道\t145306\n240W\t145307\n莫菁\t145308\n郡\t145309\n贵烟\t145310\njtg\t145311\n消费者协会\t145312\nOrigin9\t145313\npike\t145314\n钢枪\t145315\n居鲁士\t145316\n戴维斯\t145317\n荼靡\t145318\n灭屏\t145319\n龙门高铁站\t145320\n2018年5月9日\t145321\n北京大学数学科学学院\t145322\nXiuRen秀人网\t145323\n不华\t145324\nMeditation\t145325\n新华村\t145326\n古北水镇\t145327\n仙剑奇缘\t145328\n无限火球法\t145329\n希维尔\t145330\n外资银行\t145331\n赛课\t145332\n山西旅游网\t145333\n指环王三部曲\t145334\n跳墙\t145335\n木亭\t145336\ntvc\t145337\n张耀\t145338\nmarc\t145339\n主教练\t145340\n972\t145341\n中华人民共和国城镇国有土地使用权出让和转让暂行条例\t145342\n雪貂\t145343\nxiaolu\t145344\n路勤\t145345\n吾谷网\t145346\n小米电视2S\t145347\n黄桑\t145348\nPrintArea\t145349\n18元\t145350\np60\t145351\n夫妻档\t145352\ngtx\t145353\n日结\t145354\ndcloud\t145355\n彩带舞\t145356\n狂女\t145357\n000533\t145358\n十六年\t145359\n11.04\t145360\n獐子\t145361\nreckless\t145362\nz400\t145363\n健客医院\t145364\n可莱丝\t145365\n学生公寓\t145366\n骤降\t145367\n折返\t145368\ndaye\t145369\n信用等级\t145370\ngta\t145371\n二轮摩托车\t145372\nmethodology\t145373\n陆先生\t145374\n喋血街头\t145375\n隆阳区\t145376\n壮哥\t145377\nbbg\t145378\n双支\t145379\n地府\t145380\n苍之纪元\t145381\n格致\t145382\n144小时\t145383\nmetoo\t145384\n绝缘\t145385\n屏住\t145386\nMusings\t145387\n华舍街道\t145388\n烧碱\t145389\n孙乾\t145390\n切记\t145391\n龙舟节\t145392\n易教\t145393\n鱼露\t145394\ncstring\t145395\n34码\t145396\n维迈\t1
45397\nlxj\t145398\n新加坡城\t145399\n凤凰国学\t145400\n很多位\t145401\n椅业\t145402\nsyne\t145403\n错误页\t145404\n醇基\t145405\n6890\t145406\n核显\t145407\n第五套\t145408\n甘肃高台网\t145409\n哑变量\t145410\n荀琳\t145411\n沥水篮\t145412\n福寿园\t145413\nTOP25\t145414\n三大项\t145415\n回魂\t145416\n北京八中\t145417\n湖南工大\t145418\n吉炳轩\t145419\n华昌达\t145420\n山川\t145421\nt630\t145422\nV2.4.2\t145423\n于宁\t145424\n对的人\t145425\n壮实\t145426\n昌化\t145427\n拉布灯箱\t145428\n日菁\t145429\n精灵公主\t145430\nUtilities\t145431\n霍山\t145432\n11分\t145433\n58Game英雄联盟\t145434\n李元元\t145435\n滑稽\t145436\n暗里\t145437\n12345678\t145438\n伊宁\t145439\nAjax\t145440\nlocator\t145441\n益友\t145442\nPointer\t145443\n暂行规定\t145444\n曲张\t145445\n找底\t145446\n广东道\t145447\n分米鸡\t145448\n散心\t145449\nMustang\t145450\n妇女主任\t145451\n云南省博物馆\t145452\n遥远的她\t145453\n“\t145454\n执信\t145455\n门洞\t145456\n2017年5月\t145457\n红水河\t145458\naptamil\t145459\n金月芽\t145460\n财务管理\t145461\n手模\t145462\n红磡领世郡\t145463\n工业洗衣机\t145464\n东岳\t145465\n跨膜蛋白\t145466\n阿尔宙斯\t145467\n大器\t145468\nbrea\t145469\nCreative\t145470\nTUNEL\t145471\n回文串\t145472\n养鹿\t145473\n城里的月光\t145474\n济南市经济和信息化委员会\t145475\n呈\t145476\n擦一擦\t145477\n5685\t145478\n绝地求生错误\t145479\nMetabones\t145480\n一件小事\t145481\n珞珈创谷\t145482\n山大一院\t145483\n首钢股份\t145484\n海星模拟器\t145485\n日币\t145486\nVantage\t145487\n草绳\t145488\n甲氧基\t145489\n豌豆淀粉\t145490\n320米\t145491\n银碗\t145492\n2017种\t145493\n自警\t145494\nHeartbeat\t145495\nIEnumerable\t145496\n0433\t145497\n咸猪\t145498\ncouchdb\t145499\n马门溪龙\t145500\n徐cake\t145501\n浣花\t145502\n缩股\t145503\n古漪园\t145504\n悦游\t145505\n钨钼\t145506\n眼胀\t145507\n铁基\t145508\n东营市国土资源局\t145509\n95集\t145510\n克拉玛依\t145511\nparseInt\t145512\nBOOKING\t145513\ntba\t145514\n7b\t145515\n燕京\t145516\n齿轮减速机\t145517\n麻阳\t145518\ngetitem\t145519\n终结者3\t145520\n铁架\t145521\n夏姬\t145522\n涵数\t145523\n马群街道\t145524\nPHP7.0\t145525\nJett\t145526\nchumojing\t145527\n十九大|\t145528\n金正大\t145529\nbartender\t145530\nwtf\t145531\n赛马娘\t145532\n康营家园\t145533\n肺动脉高压\t145534\n子网号\t145535\n捡拾\t145536\n北京住房公积金管理中心\t145537\nSemantic\t145538\n点头\t145539\n白切\t1
45540\n金错刀\t145541\nP23\t145542\n7标\t145543\n左小青\t145544\ndetermination\t145545\n张一凡\t145546\n云视听\t145547\n甜茶\t145548\nFlashBack\t145549\n大朝台\t145550\n鸡鸣岛\t145551\nSushi\t145552\n熟能\t145553\n鱼子酱\t145554\nKg\t145555\n硼酸酯\t145556\n第10章\t145557\n四镇\t145558\n商务部\t145559\n形近字大全\t145560\n中华人民共和国环境保护税法实施条例\t145561\n超凡\t145562\n1850年\t145563\n第3场\t145564\n秋露\t145565\n往常\t145566\n茜拉\t145567\n双相障碍\t145568\n朱令案\t145569\n埭溪\t145570\n一万个\t145571\n莫尼卡\t145572\n贝朗卫浴\t145573\n曹毅涵\t145574\n痛哭\t145575\n施惠\t145576\npkgj\t145577\n配件展\t145578\nJmail\t145579\n160501\t145580\n淘宝兼职\t145581\n达人们\t145582\nstc89c52rc\t145583\n寿乡\t145584\n郭姓\t145585\n辨论\t145586\nwines\t145587\n脾脏\t145588\n99安卓\t145589\n古山\t145590\n武汉市物业管理协会\t145591\n社会摇\t145592\n保险经\t145593\n天蝎男\t145594\n马德里皇宫\t145595\noppoa73\t145596\n卧式砂磨机\t145597\n状态寄存器\t145598\n正视\t145599\n针板\t145600\n苏暖\t145601\n柘城县\t145602\nBATCH\t145603\n满帧\t145604\n零零\t145605\n赵玮\t145606\nsupervisorctl\t145607\nAssert\t145608\n苍崎青子\t145609\nConquest\t145610\n鹿邑县\t145611\necma\t145612\nmanufacture\t145613\n竹树\t145614\nTHUSSAT\t145615\n踏踏实实\t145616\ntr069\t145617\n招商引资\t145618\n12月1号\t145619\n2015年后\t145620\n五棵松体育馆\t145621\n周四\t145622\nbks\t145623\nstory\t145624\n一国两制\t145625\n杜鹃\t145626\n绝地求生解封\t145627\nING\t145628\n乡镇团委\t145629\npfs\t145630\n共同纲领\t145631\n700多万\t145632\n剧迷\t145633\n轩辕剑吧_\t145634\n第37条\t145635\n2018年4月21\t145636\nMACHENIKE\t145637\n升堂\t145638\nipl\t145639\n燕窝名牌大全_百强网\t145640\n包材\t145641\nMFC-7360\t145642\ninconel\t145643\n刘秋仪\t145644\n士林\t145645\n墙固\t145646\n剑龙\t145647\n超日\t145648\n散瞳\t145649\n沟内\t145650\n英雄无敌3\t145651\n丸山\t145652\n性瘾\t145653\n灰蒙蒙\t145654\n保剑锋\t145655\n分拆\t145656\nV9.1\t145657\nFlowJo\t145658\n平兴寺\t145659\n二义性\t145660\n300马力\t145661\n土地资源\t145662\n尤里\t145663\n永高股份\t145664\n耐热\t145665\n篠田优\t145666\n疯帽\t145667\n慈禧墓\t145668\nDrawn\t145669\n护腿\t145670\n河野\t145671\n榛子镇\t145672\n十六年前\t145673\nChemOffice\t145674\n交期\t145675\n临武\t145676\n标致308\t145677\n第二十三届\t145678\n绅士\t145679\n襄阳百姓网\t145680\ngeforce\t145681\n绿源电动车\t145682\n器官
捐献者\t145683\n三月初十\t145684\nshadownsocks\t145685\nPuerto\t145686\n多多影院\t145687\n王志鹏\t145688\n在线播\t145689\n昆士兰州\t145690\nSerpent\t145691\n涉资\t145692\n上海市公安局浦东分局\t145693\n柳州市城中区人民政府\t145694\n武汉工程大学邮电与信息工程学院\t145695\n水基灭火器\t145696\n酉阳新闻网\t145697\n12关\t145698\nmeals\t145699\n李叶\t145700\n虫姬\t145701\n两连败\t145702\nitunes12.7\t145703\n梅陇\t145704\ngaojun\t145705\n楼区\t145706\n辽海\t145707\n风盔城\t145708\n武汉儿童医院\t145709\n电动移液器\t145710\n6天\t145711\n第一日\t145712\n打下\t145713\n女保姆\t145714\n厦门市思明区政府\t145715\n战略部署\t145716\n维科精华\t145717\n儒林镇\t145718\n中国集邮总公司\t145719\n江苏省人民政府国有资产监督管理委员会\t145720\n传媒配音网\t145721\n越\t145722\nNDS模拟器\t145723\nresolvable\t145724\n主营业务成本\t145725\n12999\t145726\n乡镇人民政府\t145727\n标致5008论坛_汽车之家论坛\t145728\n分组框\t145729\n第四家\t145730\n盾子\t145731\n行政机构\t145732\n从属\t145733\n何某\t145734\ntyping\t145735\n酷安\t145736\n2017.2\t145737\n兵制\t145738\n维勇\t145739\n观澜山水田园\t145740\n蝙蝠和雷达\t145741\n燕尾\t145742\nwindows_Windo\t145743\n丝座\t145744\nkuber\t145745\n宝贝\t145746\n无欲无求\t145747\n44所\t145748\n取物语\t145749\n转教\t145750\n92_\t145751\n産品\t145752\n江湖侠客\t145753\n法语助手|法汉\t145754\n赵\t145755\n棋士\t145756\nRS6\t145757\nNGC\t145758\n洛佩斯\t145759\n炼金台\t145760\n肿瘤标志物\t145761\n复合盐\t145762\n游族\t145763\n脉动真空灭菌器\t145764\n6.35\t145765\n第42\t145766\nTile\t145767\n丹河\t145768\n政论文\t145769\n)商业有限公司\t145770\n曼彻斯特联\t145771\n微波遥感\t145772\n倾斜度\t145773\n道班\t145774\nheim\t145775\n东海大桥\t145776\n学艺运动网\t145777\n第三声\t145778\n163sub\t145779\n方松街道\t145780\ndull\t145781\n杜蒙特\t145782\n50美元\t145783\n死亡女神\t145784\n20160602\t145785\n郭雷\t145786\nGains\t145787\n渎职\t145788\ncontrolling\t145789\n南镇\t145790\n各奔\t145791\n60.0\t145792\nPin接口\t145793\n地方政府融资平台\t145794\nWithings\t145795\n应用光学\t145796\n华东建筑设计研究院\t145797\n108家\t145798\nSilvia\t145799\n爱如生\t145800\n未注\t145801\n正编\t145802\n特番\t145803\n超智\t145804\n高旗\t145805\n团友\t145806\n阜成\t145807\nINCA\t145808\n1000m\t145809\n几近\t145810\n采购\t145811\n易呗网\t145812\n精诚\t145813\nlgg6\t145814\n昆朋\t145815\nservlet-mapping\t145816\n刑天\t145817\nBridges\t145818\n外汇券\t145819\n歼5\t145820\n技术台\t145821
\n视物\t145822\n静疗\t145823\neD2k\t145824\n易柏辰\t145825\n城院\t145826\n总结\t145827\n通途\t145828\n逃出生\t145829\noverriding\t145830\n总责\t145831\ngodep\t145832\nMy盛Lady\t145833\n牛舍\t145834\n胡莱三国\t145835\n三奸\t145836\n助研\t145837\n)电子商务有限公司\t145838\n念奴娇\t145839\n金辉集团\t145840\n谷歌云服务器\t145841\n披头士乐队\t145842\n格瑞姆巴托\t145843\n真舒服\t145844\n中国电子科技\t145845\n重庆高新技术产业开发区\t145846\n姚劲\t145847\n电力线\t145848\n红高梁\t145849\n打猪\t145850\n罗思义\t145851\n衣坊\t145852\n中原信托\t145853\n定夺\t145854\n中正\t145855\n美铃\t145856\n熔炼\t145857\n流连\t145858\n陷入困境\t145859\n先锋av\t145860\n咨询部\t145861\n礼钱\t145862\n无非\t145863\n猎团\t145864\nthere\t145865\n合城\t145866\nLISA\t145867\n奥赛德\t145868\n诅咒\t145869\n外贸soho\t145870\n纯文\t145871\n泯然\t145872\n神之子\t145873\n贴现\t145874\n15个月\t145875\nMHL\t145876\n朝暮\t145877\n存在者\t145878\nToo\t145879\n剪手\t145880\n夏提雅\t145881\n花仙\t145882\n唐尧网\t145883\nInhibitor\t145884\nKLM\t145885\nqq播放器\t145886\n艾瑞斯\t145887\n教育版\t145888\n叶澜依\t145889\n面世\t145890\n喧喧\t145891\n蔡州\t145892\n十三招\t145893\nYML\t145894\n完备性\t145895\n米级\t145896\n气环\t145897\n拍找\t145898\n兰州大学生命科学学院\t145899\n浪漫满屋\t145900\nHOUSE\t145901\n航拍器\t145902\n西汉姆联\t145903\n九夜茴\t145904\n八一男篮\t145905\n电解铝\t145906\nthigh\t145907\n面容\t145908\n中华v6\t145909\n4X2\t145910\n小幼\t145911\n一禾\t145912\ninit函数\t145913\n十项\t145914\n上生\t145915\n华通\t145916\n侧柏叶\t145917\n进行时\t145918\n500d\t145919\n兴隆街道\t145920\n船政\t145921\n电炉丝\t145922\n金服\t145923\n20140913\t145924\n挖槽\t145925\n顾家家居\t145926\n剑侠悦动\t145927\n养病\t145928\n闭音节\t145929\n妖后\t145930\n机功率\t145931\nashford\t145932\n晴日\t145933\nPlanning\t145934\n平模\t145935\n乐坏\t145936\n旳\t145937\nillumination\t145938\n慢条斯理\t145939\n30年间\t145940\n恐怖主义法\t145941\n必由之路\t145942\n胶结\t145943\n交感神经型颈椎病\t145944\n马基雅维利\t145945\n3.5亿美元\t145946\n九龙湖新城\t145947\n圣墟小说网\t145948\n造物\t145949\nphp函数\t145950\n留学生们\t145951\n两口子\t145952\nNightly\t145953\n湖南省机关事务管理局\t145954\n国悦府\t145955\n华润深圳湾体育中心\t145956\n武汉市人大常委会\t145957\n52Hz\t145958\n善元堂\t145959\n民康路\t145960\n跑圈\t145961\n威化饼\t145962\n台美\t145963\n痦子\t145964\n944\t145965\n对性\t145966\nbopp\t145967\n环球通\t14
5968\n感应卡\t145969\nN2O\t145970\n500N\t145971\n精编版\t145972\n转业费\t145973\n防爆箱\t145974\n开行\t145975\n汽车\t145976\n百分之九十\t145977\nrelocation\t145978\nVSP\t145979\n中国工业信息网\t145980\n當\t145981\n东兴区\t145982\n山东英才学院\t145983\n下关镇\t145984\nBDR\t145985\n陈丹青\t145986\n家庭收入\t145987\n画堂\t145988\n联想yoga2\t145989\nopskin\t145990\n易班网\t145991\n百山百川行\t145992\n连厢\t145993\n人头豆腐汤\t145994\n并联谐振\t145995\n上模\t145996\n自古\t145997\n）\t145998\n4621\t145999\n5s\t146000\n5891\t146001\n荧光板\t146002\n东坝\t146003\n编带\t146004\nRGF\t146005\nmax2017\t146006\nReliable\t146007\nminecraftpe\t146008\n实验仪\t146009\n周伯云\t146010\n食疗\t146011\n为你好\t146012\n入党故事\t146013\n珠江帝景\t146014\n环球人物网\t146015\n鞣酸软膏\t146016\nminidp\t146017\n考试周刊\t146018\nmac免费版\t146019\n吐血\t146020\n在师大\t146021\n晋江人才网\t146022\n石家庄外国语学校\t146023\n华菱集团\t146024\n共襄盛举\t146025\nCoat\t146026\n拉卡泽特\t146027\n埋深\t146028\nCLP\t146029\n矢量图标库\t146030\n法槌\t146031\n吴志明\t146032\n传播者\t146033\n不能不去\t146034\n经济观察网\t146035\n代木\t146036\n奥本\t146037\nSoil\t146038\n武汉公司\t146039\n岳阳东站\t146040\n无框\t146041\n独派\t146042\n中国科学院上海生命科学研究院\t146043\n湘警网\t146044\n秀爷\t146045\n直联\t146046\n风色幻想5\t146047\n落尽\t146048\n新津县\t146049\nComplaints\t146050\npity\t146051\n歌咏\t146052\n紧急状态\t146053\nautoCAD\t146054\n基本存款账户\t146055\nopportunity\t146056\n长水街道\t146057\n崂山二中\t146058\nPudong\t146059\n酷读吧小说网\t146060\n农家乐联盟\t146061\n明明知道\t146062\n有限责任公司\t146063\n塔塔粉\t146064\nupup\t146065\n青春年华\t146066\n袁曙宏\t146067\nCarriers\t146068\n爱憎分明\t146069\ni5-5200u\t146070\n船期表\t146071\n灵谷\t146072\n内倾\t146073\n鲍尔环\t146074\n合肥师范学院\t146075\nrk3399\t146076\n寿康宝鉴\t146077\ngdb\t146078\n载频\t146079\nBYK\t146080\n绝缘棒\t146081\n墨魂\t146082\n诺坎普球场\t146083\n四川省地方税务局\t146084\n环岛\t146085\nPAI\t146086\n柯克兰\t146087\n钥匙环\t146088\n壬申\t146089\n就读\t146090\n平均分\t146091\n皮皮网\t146092\n南山\t146093\n昼颜\t146094\n羽芒\t146095\n萘酚\t146096\nconn\t146097\n过敏原检查\t146098\n身股\t146099\n乐泰胶水\t146100\n一晌贪欢\t146101\n毛泽东思想和中国社会主义理论体系概论\t146102\n332路\t146103\nMarley\t146104\n长沙市\t146105\n刹住\t146106\n玛狃拉\t146107\n孟耿如\t146108\n美佳\t146109\nmultilib\t146110\n
偏信\t146111\n石原朱丹\t146112\n装睡\t146113\n优学院\t146114\n旅日\t146115\n余光中\t146116\n柏氏\t146117\n黄衫\t146118\n外设堂\t146119\n泰星push\t146120\n风色\t146121\nS02E01\t146122\nDocument\t146123\n百强网\t146124\n英雄联盟S8\t146125\n邓建栋\t146126\nOTZ\t146127\n御套\t146128\n残运会\t146129\n笑颜\t146130\n迅速\t146131\n春光乍泄\t146132\n环测\t146133\n水光\t146134\n孙正关羽\t146135\n归根到底\t146136\n呕气\t146137\n轮查\t146138\n中建八局一公司\t146139\n7码\t146140\n阿婆主\t146141\nJil\t146142\nOREAL\t146143\n王滨\t146144\nmusk\t146145\n新东方大学\t146146\n胖子\t146147\n七丽时尚\t146148\n摩尔浓度\t146149\nBlob\t146150\n老墙\t146151\n安装工程综合定额\t146152\n救世军\t146153\n中央全面深化改革委员会\t146154\n撬棍\t146155\n嘿呦\t146156\n第二十九集\t146157\n概念性规划\t146158\n爱德\t146159\n还考\t146160\n时路\t146161\n雕花楼\t146162\n气象在线\t146163\n蓝染\t146164\n乐福\t146165\n关晓施一公\t146166\n大港\t146167\n古生\t146168\n迈瑞宝xl\t146169\nEduardo\t146170\n超预算\t146171\n城市场\t146172\n3091\t146173\n威斯康辛州\t146174\n锡矿\t146175\n平尾\t146176\n车联\t146177\n王海军\t146178\n0938\t146179\n期中期末\t146180\n荣耀6a\t146181\n比佛利山庄\t146182\n死亡地带\t146183\n绿叶学习网\t146184\n沙湾古镇\t146185\n幸福的人\t146186\n废材\t146187\n高纬度\t146188\nHKDCNY\t146189\nsqlmap\t146190\nT440P\t146191\n适配器\t146192\n0017\t146193\n9号\t146194\n验单\t146195\ntureno\t146196\n蓝天小区\t146197\n神液\t146198\n驾乘\t146199\nmulticharts\t146200\nPork\t146201\n张大磊\t146202\n霍兰德\t146203\n妍\t146204\nridge\t146205\n相护\t146206\n明兰\t146207\n咖喱鸡\t146208\n华宁县\t146209\n枇杷节\t146210\nGospel\t146211\nレビュ\t146212\nPDF\t146213\n挑逗\t146214\nLottie\t146215\n刘云浩\t146216\n菜单\t146217\n3done\t146218\n乙酸乙烯酯\t146219\n校验\t146220\n姑\t146221\n受聘\t146222\n巧解\t146223\nScalar\t146224\n奇数\t146225\n走帮服\t146226\nopenSSH\t146227\n无菌制剂\t146228\n分离机\t146229\n2263\t146230\n心忧\t146231\n融资难\t146232\n2015年1月1日起\t146233\n桩身\t146234\n周建新\t146235\n300w\t146236\n滨河公园\t146237\nmodal\t146238\n草埔\t146239\n临沂在线\t146240\n4399密室逃脱\t146241\nMales\t146242\n服\t146243\n饱和度\t146244\n预收账款\t146245\n西部地区\t146246\n720P/1080P\t146247\n建筑施工扣件式钢管脚手架安全技术规范\t146248\n89名\t146249\n舔\t146250\n濠河\t146251\nSAF\t146252\n伐纣\t146253\n防盗扣\t146254\n发饰\t146255\n联邦政府\t146256\n房车
展\t146257\n5mg\t146258\n大鹿岛\t146259\n中国联合航空\t146260\n画堂春\t146261\nmqq\t146262\n正青春\t146263\n姥\t146264\n玉溪高古楼\t146265\nSpreading\t146266\n芝麻饼\t146267\n易损性\t146268\n止盈\t146269\n24件\t146270\n张云峰\t146271\nxencenter\t146272\n上溪\t146273\n瑠川リナ\t146274\n贼\t146275\n4万块\t146276\n干面\t146277\n关键链\t146278\nExpanding\t146279\n岩田刚典\t146280\n假行僧\t146281\n安徽省水利厅\t146282\n占房\t146283\nnvl\t146284\n淘气包埃米尔\t146285\n音悦台\t146286\n420ml\t146287\n楚留香传奇\t146288\n22:00\t146289\n群益\t146290\n河北人民出版社\t146291\n重庆日报网\t146292\n死亡公路\t146293\nmatconvnet\t146294\n正宇\t146295\n第50页\t146296\n矛盾分析法\t146297\n武神天下\t146298\nMac版\t146299\n2216\t146300\n除险\t146301\nT台\t146302\n265上网导航\t146303\n模型展\t146304\n康塔塔\t146305\n夜莺\t146306\n左心\t146307\n张玉平\t146308\n三合局\t146309\n五月一\t146310\n虹梅路\t146311\n梅山路\t146312\nSpring+quartz\t146313\n三十厘米\t146314\n掌酷\t146315\n商务谈判\t146316\n鼠标器\t146317\n宁晓曦\t146318\n耒\t146319\n鬼镜\t146320\n友点\t146321\n博才网\t146322\nSui\t146323\n嫡嫁千金\t146324\nwin7系统盘\t146325\n限价房\t146326\nsdhc\t146327\n八美图\t146328\n释行宇\t146329\n防空兵\t146330\n谷谷\t146331\n0.05\t146332\n鹿群\t146333\n8B\t146334\n彼得斯\t146335\n亚裔\t146336\n硝酸锌\t146337\n阴囊潮湿\t146338\n国家海洋局第三海洋研究所\t146339\nMicroservices\t146340\n四川信息职业技术学院\t146341\n伊修加德\t146342\n温岭市第一人民医院\t146343\n苏家村\t146344\n6600K\t146345\nzip函数\t146346\n温哥华机场\t146347\nMed\t146348\n1798\t146349\n五宝\t146350\n争冠\t146351\n紧急切断阀\t146352\nhockey\t146353\n中餐馆\t146354\n张浚\t146355\n特证\t146356\n全新版大学英语综合教程2\t146357\nstringgrid\t146358\n肯德基\t146359\n勾缝\t146360\n平流沉淀池\t146361\n桑干河\t146362\n槑头槑脑2\t146363\n冠脉\t146364\nic卡读卡器\t146365\n道闸杆\t146366\nOPPO\t146367\n日化\t146368\n火烧火燎\t146369\n比熊\t146370\n塔什干\t146371\n精母\t146372\n离世\t146373\n教务\t146374\n消纳场\t146375\n卢卡库\t146376\nhi3516a\t146377\n各地\t146378\n亲爱的客栈\t146379\n土木工程\t146380\n沙化\t146381\nswoosh\t146382\n消毒片\t146383\n高尔夫论坛_汽车之家论坛\t146384\n手扶拖拉机\t146385\n狂暴:世纪浩劫\t146386\nxftp5\t146387\nInstaller\t146388\nmx播放器\t146389\niatf\t146390\npgi\t146391\n代理商\t146392\n耐碱\t146393\n马服\t146394\nchangge\t146395\n实施性\t146396\n缠斗\t146397\nAirPod\t146398\n第
01\t146399\n亿美\t146400\nX399\t146401\n恢復\t146402\n井蛙\t146403\n中机中心\t146404\n唱衰\t146405\n空调过滤器\t146406\n邯郸峰峰矿区\t146407\n例析\t146408\n盛宠\t146409\n李立军\t146410\n西南地区\t146411\n第6级\t146412\n砍单\t146413\nepcos\t146414\n尾款\t146415\n壬丰大厦\t146416\ndock\t146417\n孟凯\t146418\nbirdy\t146419\n历来\t146420\n百鸟园\t146421\nElfArtWorld\t146422\n都市风月奇谭\t146423\ndod\t146424\n嘉爷\t146425\n富商\t146426\n友价\t146427\n国贸中心\t146428\nsurfaceView\t146429\n都市夜归人\t146430\n引气剂\t146431\n荷官\t146432\n东方财富通\t146433\n石中剑\t146434\nMane\t146435\n张学艾弗森\t146436\n志杰\t146437\n崔允漷\t146438\n正气歌\t146439\nnitrate\t146440\n鸿祥\t146441\n太阳照常升起\t146442\n东方名苑\t146443\nagapple\t146444\n阿空\t146445\nSke\t146446\n欧缔兰\t146447\nScatter\t146448\n一板\t146449\n凤囚凰红袖\t146450\n粉红豚\t146451\n阻滞剂\t146452\n广告屏\t146453\n1050\t146454\n智飞生物\t146455\n计算机应用技术\t146456\n宝马3系吧\t146457\n朱美拉\t146458\n临工装载机\t146459\n第37话\t146460\n姐弟恋\t146461\n五五开黑节\t146462\n萝卜汤\t146463\n喜报\t146464\n8187\t146465\n市调\t146466\n大增\t146467\n网龄\t146468\n招股书\t146469\n鬼魂\t146470\n有勇有谋\t146471\n围挡板\t146472\n过敏性\t146473\n蒲地蓝\t146474\nQuagga\t146475\n港中\t146476\n衰亡\t146477\nwate\t146478\n2829\t146479\n普利斯\t146480\n20160531\t146481\n上麦\t146482\npokemon\t146483\n3229\t146484\n创业路\t146485\n网络模块\t146486\njetaudio\t146487\nvgg\t146488\n工伤保险基金\t146489\nlouis\t146490\n掏槽\t146491\n半壁江山\t146492\n县文广新局\t146493\nCastro\t146494\n罗松\t146495\n仁波切\t146496\n1580MF\t146497\n止汗喷雾\t146498\n香茅草\t146499\n左轮\t146500\n常青藤大学\t146501\n小亭\t146502\n埃里温\t146503\n神州电脑\t146504\n淫叫\t146505\n水生\t146506\n百分之2\t146507\n环素\t146508\n广东省妇幼保健院\t146509\nMinion\t146510\n扬剧\t146511\n4699\t146512\nMoulin\t146513\nKoala\t146514\nShampoo\t146515\n委培生\t146516\n8250\t146517\n十九大党章公开课\t146518\n城南花\t146519\n中山市公安局\t146520\n贝斯谱\t146521\nR2016a\t146522\n回收商\t146523\n温病\t146524\n摁\t146525\n供给侧结构性改革\t146526\n从业人员\t146527\nADDR\t146528\n陇海路\t146529\n白线\t146530\n宾果消消乐\t146531\n山东威力重工\t146532\n回老家\t146533\n毒液\t146534\n甜杏仁油\t146535\n40#\t146536\n多多熊\t146537\n20hz\t146538\n活跃率\t146539\n宝马mini\t146540\n偏心异径管\t146541\n中邮网[集邮/钱币/邮票/金\t14
6542\nswappiness\t146543\nSAIC\t146544\n高卷\t146545\n辉煌十九大】十九大\t146546\n郑总\t146547\nthursday\t146548\n19本\t146549\n北京古北水镇\t146550\n绿地21新城\t146551\nGTX690\t146552\n寒山潜龙\t146553\n吉吉影音AV\t146554\n北京人遗址\t146555\nContribute\t146556\n红杉资本\t146557\n70多度\t146558\nConstantin江诗丹顿\t146559\n聂远\t146560\n宾夕法尼亚州\t146561\nwin10默认输入法\t146562\nporno\t146563\n柏林爱乐\t146564\n阶堂\t146565\n原盐\t146566\nXE10\t146567\n迫真\t146568\n_笔经面经_牛客网\t146569\n精英们\t146570\n糕\t146571\nzhain\t146572\n痴迷者\t146573\n乐铺网\t146574\n线性度\t146575\n微胶囊\t146576\nYxh\t146577\n凯泉\t146578\nROSSINI\t146579\nGD&T\t146580\n陈炎\t146581\n非法定\t146582\n2288\t146583\nb站\t146584\n饥饿\t146585\n科莫多\t146586\n东坡肘子\t146587\n军转网\t146588\n外联部\t146589\nMEM\t146590\nservr\t146591\n8亿\t146592\n广东省交通厅\t146593\nslf\t146594\n亿信\t146595\n运动量\t146596\n╱\t146597\n都市时报\t146598\n付出\t146599\ngiven\t146600\n封闭抗体\t146601\nRaining\t146602\nキュ\t146603\n梅林路\t146604\n发射点\t146605\n诊股\t146606\n守卫剑阁五虎将后传\t146607\n植物界\t146608\n查禁\t146609\ndbcp\t146610\ntux\t146611\n近视眼\t146612\nboonya\t146613\n颍东区\t146614\n定期存款利率表\t146615\n天生\t146616\ndynasty\t146617\n慰籍\t146618\n18l\t146619\nupstart\t146620\n蒲圻\t146621\n夜狼\t146622\n湄公河\t146623\n枫树\t146624\n胡茵梦\t146625\n塑料钞\t146626\n阻燃板\t146627\nncf\t146628\n宰执水浒传\t146629\n房屋在线\t146630\n斜分线\t146631\n超过6个月\t146632\n临安\t146633\n心定\t146634\n4股\t146635\nA7\t146636\n特效版\t146637\n金龙汽车\t146638\nvhost\t146639\n勾掉\t146640\n三安\t146641\n书面\t146642\n废纸盒\t146643\n物字\t146644\n华埠\t146645\n长策\t146646\n鸡冠山\t146647\n古月\t146648\nArpels\t146649\n360g\t146650\n庆春东路\t146651\n油井\t146652\njnl\t146653\n黑客帝国3:矩阵革命\t146654\n清华大学建筑学院\t146655\n珍珠粉\t146656\n萌道动漫网\t146657\n切口\t146658\n快解\t146659\n百慕大群岛\t146660\n科技园区\t146661\nstrengthen\t146662\n顽童\t146663\nfirebase\t146664\nGore\t146665\nNO3\t146666\n翻页\t146667\nHaku\t146668\n朱祁镇\t146669\n相声集\t146670\n1473\t146671\n女烈\t146672\n初定\t146673\nfreeform\t146674\n景嘉微\t146675\n割点\t146676\n星杯\t146677\n逗哏\t146678\n嘉禾街道\t146679\n干妹\t146680\n来龙去脉\t146681\n百微镜头\t146682\n郭建\t146683\n32768\t146684\nCSDN\t146685\n5.7.16\t1
46686\n金叶榆\t146687\nbga\t146688\n授业\t146689\nChemsigma\t146690\n中铁物流集团\t146691\n中国驻美大使馆\t146692\n中国通号\t146693\n无难事\t146694\n五一社区\t146695\nILOG\t146696\n古文观止\t146697\n加体\t146698\n自华\t146699\nv5.50\t146700\n印模\t146701\n华东建筑设计研究院有限公司\t146702\n泡澡\t146703\nFF8\t146704\n软骨垫\t146705\n嘉麟杰\t146706\n深圳市人民检察院\t146707\nhtmlmainVerName\t146708\nアナ\t146709\n翡翠蓝湾\t146710\n一万种\t146711\n_顶库\t146712\n几经\t146713\n18:30\t146714\nqsc\t146715\n3千米\t146716\nnios\t146717\nTelescope\t146718\n故乡的云\t146719\n简爱\t146720\n160420\t146721\n边潇潇\t146722\n酷我音\t146723\nmousewheel\t146724\n宫化\t146725\n文一街小学\t146726\n星际迷航\t146727\n山墙\t146728\n指针\t146729\nhbm\t146730\n中科方德\t146731\n厚黑学\t146732\nAutoIt\t146733\n狱\t146734\n青岛站\t146735\n最强音\t146736\n证券户\t146737\nHisuite\t146738\n总序\t146739\n陈涌海\t146740\n神火股份\t146741\n立达\t146742\n化机\t146743\n洗钱者\t146744\n其他人\t146745\n齐齐乐手游网77L.com\t146746\n超微主板\t146747\n自考助学\t146748\n非金属氧化物\t146749\n心容\t146750\n袁枚\t146751\n沙面岛\t146752\n20170522\t146753\nEvents\t146754\n体制改革\t146755\n老狼老狼\t146756\n沙质\t146757\nCPSC\t146758\n朱村街\t146759\nschemas\t146760\n黄河湿地\t146761\n8个\t146762\n前轴\t146763\n迷情\t146764\n苏牧\t146765\n伏羲庙\t146766\n石亭镇\t146767\nMKMapView\t146768\n癔病\t146769\n三墩镇\t146770\n3dsmax2017\t146771\n妻宫\t146772\n机动车损失保险\t146773\n耨\t146774\n1080p+720p\t146775\ntimeseries\t146776\nUNDEFEATED\t146777\n小天鹅水魔方\t146778\n压铆螺母\t146779\n不干胶标签\t146780\n留薪期管理办法\t146781\n配线柜\t146782\n无法自拔\t146783\n事业费\t146784\n医药箱\t146785\nHarp\t146786\nAG600\t146787\n固定污染源\t146788\nsteering\t146789\n艳福不浅\t146790\n缩\t146791\n于汉\t146792\n语伴\t146793\n张东升\t146794\n大龙街\t146795\n李向国\t146796\n甜文\t146797\n偏离度\t146798\nJeans\t146799\n&#61485\t146800\n东胜广场\t146801\nOlly\t146802\n万立骏\t146803\nDar\t146804\n口才\t146805\n上海龙凤网\t146806\nFinals\t146807\nZealand\t146808\n扁平足\t146809\nBAZAAR中文网\t146810\n中国仿制药\t146811\n超值型\t146812\n甘肃省司法厅\t146813\n炸鸡粉\t146814\n98平\t146815\n盘州市人民政府\t146816\n花悸\t146817\n福建省商务厅\t146818\n泽兰\t146819\n广播罗曼史\t146820\nnex7\t146821\n等编\t146822\n黄油曲奇\t146823\ncoreldraw2017\t146824\npau\t146825\n有影响
\t146826\n凯美\t146827\n王泽山\t146828\n国投\t146829\n热销品\t146830\nsignificant\t146831\n数笔\t146832\nFashion\t146833\n杨阿腾\t146834\n简自豪\t146835\n调兵\t146836\nuiwebview\t146837\n肘部\t146838\n隐贤山庄\t146839\n几英寸\t146840\n反黄\t146841\n左列\t146842\n80端口\t146843\nHorse\t146844\n96层\t146845\nmsi\t146846\n唯物主义\t146847\n佛慈制药\t146848\n海加尔\t146849\n三江口镇\t146850\n泰兴网\t146851\n积存金\t146852\nfinaldata\t146853\n叶国富\t146854\nalba\t146855\n无所不藏\t146856\n3克拉\t146857\n6189\t146858\n齐肩\t146859\n宝马530Li\t146860\n螺牙\t146861\n腾讯微证券\t146862\nvtg\t146863\nD610\t146864\n姜暮烟\t146865\n手相\t146866\n新闻_东快网\t146867\n烟台南山学院\t146868\n浇地\t146869\n200头\t146870\n追影\t146871\n石杰\t146872\n500p\t146873\n谐趣\t146874\n十丈\t146875\nyork\t146876\nyangecnu\t146877\n老大哥\t146878\n广汇集团\t146879\nbenson\t146880\ne签宝\t146881\nCulinary\t146882\njetson\t146883\n花语\t146884\n变题季\t146885\n合身\t146886\n电信欢go\t146887\n涂写\t146888\n蛤蜊\t146889\nMBA中国网\t146890\nanimenz\t146891\n菟丝子\t146892\n炸炉\t146893\nyargs\t146894\nshige\t146895\n康乐县\t146896\n式性\t146897\n价款\t146898\n川农大\t146899\nods\t146900\n包清工\t146901\n验评\t146902\n暖床\t146903\n干系人\t146904\n绵竹\t146905\n宝沃BX5\t146906\nsbm\t146907\n删改\t146908\n管卡\t146909\nlinyi\t146910\n天唐天下\t146911\n抱一下\t146912\n保乐力加\t146913\n绝地求生刺激战场S1\t146914\n技术篇\t146915\n两星期\t146916\n山西中德科工机械\t146917\n辉子\t146918\n融科资讯中心\t146919\n竞彩篮球\t146920\n500亿元\t146921\n上海徐吉电气有限公司\t146922\n淫浪\t146923\nides\t146924\n钙离子拮抗剂\t146925\n刺激物\t146926\n二校\t146927\n长垣县人民政府\t146928\n卫生类\t146929\n努\t146930\n肩膀\t146931\n非美\t146932\n福明街道\t146933\nUTMB\t146934\n开心情色网\t146935\n览天下\t146936\n陈小燕\t146937\n筠连县\t146938\nmsba\t146939\n知乎周刊\t146940\n有标\t146941\ndqmj3p\t146942\n佐佐木琲世\t146943\n内燃叉车\t146944\n海航系\t146945\n臻欣\t146946\n触目惊心\t146947\n14.3\t146948\n岳阳楼\t146949\nX16\t146950\n霜霉病\t146951\n王团长\t146952\n生与死\t146953\n圣画圣不领情_墓王之王悬棺寺\t146954\n追捕\t146955\nsubstr函数\t146956\n潮湿\t146957\nburkert\t146958\n专办\t146959\n上海雅仕\t146960\nmgo\t146961\n移动社区\t146962\n台板\t146963\n上海合作组织\t146964\nAssisted\t146965\n皮质\t146966\n怒战狂心\t146967\nbodyjam\t146968\nhtml代码\t146969\n孙
公\t146970\nECIA\t146971\n素菜\t146972\n满地\t146973\n张微\t146974\n大胸\t146975\n百草\t146976\n改版\t146977\n琼英\t146978\n海潮\t146979\n粉墨登场\t146980\n三回\t146981\n大果紫檀\t146982\n黄页88IT网\t146983\n丹毒\t146984\n恶魔末日轮盘\t146985\n插入\t146986\n吡嗪\t146987\n环旭电子\t146988\n600808\t146989\n荆楚\t146990\nJpanel\t146991\n跑轮\t146992\n低值\t146993\nCorporation\t146994\n黄金海滩\t146995\n半岛|半岛网\t146996\nDaughters\t146997\ndepending\t146998\n15g\t146999\n泉州师院\t147000\n动量定理\t147001\n小王子\t147002\n20170221\t147003\n绿色金融\t147004\n五角硬币\t147005\n剑风\t147006\ntung\t147007\n6件套\t147008\n诺基\t147009\nSmooth\t147010\nTerminal\t147011\n长子营镇\t147012\nEES\t147013\nMywife-No\t147014\n保利金香槟\t147015\n江西娱乐网\t147016\n该文\t147017\n清新环境\t147018\namerican\t147019\n高功\t147020\nful\t147021\nLords\t147022\n油布\t147023\n滋尔滨\t147024\nfuntion\t147025\n/公司库/大全\t147026\n吕志\t147027\n中国铁路广州局集团有限公司\t147028\npwscf\t147029\n百利甜酒\t147030\ndevstack\t147031\n北青社区报\t147032\n葡超\t147033\n镇雄县\t147034\n合约期\t147035\n杨烁\t147036\n润百颜\t147037\nbicycles\t147038\n里氏\t147039\n袖珍\t147040\n深蓝之星\t147041\n实干者\t147042\nApologize\t147043\nhuxihx\t147044\n滑动轴承\t147045\n6.0\t147046\nSPECIAL\t147047\n深圳市腾讯计算机系统有限公司\t147048\n兵器集团\t147049\nnastran\t147050\n魔剣士リ\t147051\n彭湃\t147052\n建群\t147053\n断桥铝门窗\t147054\n清风算法\t147055\n莫愁村\t147056\n世荣\t147057\nie5\t147058\n50欧姆\t147059\n翘首以待\t147060\nEminem\t147061\n报告人\t147062\ntantan\t147063\n三个帮\t147064\n百元\t147065\nwushu\t147066\n嘉佳\t147067\n心功\t147068\n盾娘\t147069\n张国政\t147070\n冀中能源集团有限责任公司\t147071\n新安股份\t147072\n隔离带\t147073\n异步函数\t147074\n昆明装修公司\t147075\n精艇游艇网\t147076\n大城中村\t147077\ndbus\t147078\n100日\t147079\ntypically\t147080\n儿童餐\t147081\n罗斯柴尔德\t147082\nPTGui\t147083\n街号\t147084\nsupplementary\t147085\nFon\t147086\n陈正道\t147087\n电容麦\t147088\nCidade\t147089\n茨坪\t147090\n161028\t147091\nETC\t147092\nnapa\t147093\n闸北\t147094\n中华人民共和国住房和城乡建设部\t147095\n淮安市市\t147096\n维修费\t147097\njasperreport\t147098\n合刊\t147099\n十三薪\t147100\nXX年\t147101\n十六进制数\t147102\n烈火刀影\t147103\n中国平安保险(集团)股份有限公司\t147104\n商家们\t147105\nVC9\t147106\n单核\t147107\n天普\t1
47108\n0个\t147109\n志强\t147110\n净土寺\t147111\n最后一头战象\t147112\n网易集团\t147113\nndb\t147114\n怎用\t147115\n百世物流\t147116\nOmniPlan\t147117\n莘庄\t147118\ngnu\t147119\n唇笔\t147120\n疏通\t147121\n环境保护类\t147122\n旅博会\t147123\n诡面\t147124\n搞不清楚\t147125\n雨崩\t147126\noscillation\t147127\nFired\t147128\n炼奶\t147129\nThrow\t147130\ncome\t147131\namber\t147132\n20171125\t147133\n杨小贤\t147134\n拳赛\t147135\n呼声\t147136\n超神学院雄兵连乾坤篇\t147137\n暴走漫画吧_\t147138\n高压环网柜\t147139\n2&\t147140\n华盛顿特区\t147141\n中国古代史\t147142\ncstdlib\t147143\nFenix\t147144\n管线管\t147145\nDarkness\t147146\n高意\t147147\n东凌国际\t147148\n编织布\t147149\n王伟华\t147150\n中华全国总工会办公厅\t147151\n周晶\t147152\npycham\t147153\n后厨\t147154\nv3.1.7\t147155\n沈阳市人民政府\t147156\nLOL英雄联盟视频\t147157\n童趣大冒险\t147158\n国乐大典\t147159\n最强神话帝皇\t147160\nKAT-TUN\t147161\n玻镁板\t147162\n18亿美元\t147163\n黑键\t147164\nHLM\t147165\nmedical\t147166\n古东瀑布\t147167\n中科院地理科学与资源研究所\t147168\n琴岛通\t147169\n仇英\t147170\n撞裂\t147171\n沙皮犬\t147172\nservlet\t147173\n69年\t147174\n哈尔滨旅游团\t147175\nCl2\t147176\n州府\t147177\nCorrespondence\t147178\n前哨站\t147179\n桥本\t147180\n肆虐\t147181\ndebugview\t147182\n春雷霉素\t147183\n奶酪\t147184\n量化派\t147185\nSolidWork\t147186\n伊莉娜\t147187\n2016-11\t147188\n娇气\t147189\n买卖点\t147190\n心焦\t147191\nAcFun\t147192\n热缩膜\t147193\nreturn\t147194\n安徒生童话故事\t147195\ncasino\t147196\n考古家\t147197\n烟煤\t147198\n一届二次\t147199\n255.255.255.240\t147200\ncloze\t147201\n仟湖\t147202\n如雪\t147203\n2.6亿元\t147204\n防城港\t147205\n辉绿岩\t147206\ngmbacc\t147207\n的卢\t147208\n能看到\t147209\nme百度云\t147210\n超白\t147211\n38本\t147212\n桡骨\t147213\n阿尔伯特\t147214\nPaddlePaddle\t147215\nCHANGHONG\t147216\n英菲克i6\t147217\n急性胰腺炎\t147218\nFei\t147219\nK-means算法\t147220\n93平\t147221\n余英奇\t147222\nheavens\t147223\n红黄蓝\t147224\n卡利隆\t147225\n审结\t147226\n肥皂菌\t147227\n尿垫\t147228\n电离平衡\t147229\n厦门市商务局\t147230\n罗任德\t147231\n平遥县人民政府\t147232\n赠语\t147233\nx女\t147234\n海神\t147235\nv3.2.1\t147236\n高陵\t147237\nwolf\t147238\nDJ娱乐网\t147239\n骚灵\t147240\n优彼\t147241\n丽颜肌\t147242\n9组\t147243\n刮削\t147244\n1.34G\t147245\n大和号战列舰\t147246\n胎动\t147247
\n7段\t147248\n笔笔\t147249\n晟邦物流\t147250\n平辈\t147251\n遮板\t147252\n听听\t147253\n倒地\t147254\n五十七\t147255\nrings\t147256\n腰线\t147257\n县国土局\t147258\n53info.com\t147259\npusher\t147260\n嚒\t147261\n三年级语文下册\t147262\n频域分析\t147263\nwxpay\t147264\n睿云网\t147265\n_程力专用汽车股份有限公司\t147266\nlovelive\t147267\nhiding\t147268\n7秒\t147269\n宜品\t147270\n超时空之轮\t147271\nril\t147272\npyp\t147273\n148分钟\t147274\n豆粕\t147275\n333\t147276\n伯格曼\t147277\n叙旧\t147278\n东方玉\t147279\n江滨花园\t147280\n无人仓\t147281\nxpg\t147282\n潘子\t147283\n|米\t147284\nHumidity\t147285\n顺生\t147286\n电子工程世界\t147287\nDoTween\t147288\n女装\t147289\n20方\t147290\n朱共山\t147291\n礼学\t147292\nPavel\t147293\n护脊\t147294\n大地图\t147295\n西兰\t147296\n许卓娅\t147297\nwww.ptfwzx.gov.cn/fwzx/AttachStorage/9224fa7d\t147298\n西湾路\t147299\n李月华\t147300\nDEMO\t147301\ne墅网\t147302\n马努斯\t147303\n2012年11月\t147304\n厦门地铁1号线\t147305\n星尾兽\t147306\nWindowsCE\t147307\n﹗\t147308\n烙饼机\t147309\nVML\t147310\n32.0\t147311\n龙门石窟\t147312\n課後\t147313\n关联公司\t147314\n神探狄仁杰4\t147315\n泛珠\t147316\n昂科威28t\t147317\nWPI\t147318\n冷弯\t147319\n无忧无虑\t147320\n多年来\t147321\n科学记数法\t147322\n恬静\t147323\n战胜\t147324\n50Hz\t147325\n免\t147326\n南山公园\t147327\n喜达屋\t147328\n老戏\t147329\n苯并咪唑\t147330\nphpstom\t147331\n梅城\t147332\n散板\t147333\n珊瑚石\t147334\nmathematica吧_\t147335\n驻留\t147336\n榆树庄\t147337\n待岗\t147338\n后座\t147339\n把住\t147340\n斗米仇\t147341\n寄养\t147342\n持久性\t147343\nDeputy\t147344\n千页豆腐\t147345\n紫微排盘\t147346\n脸型\t147347\n红酒\t147348\n民事再审申请书\t147349\n24000元\t147350\n双青新家园\t147351\n金鼎大厦\t147352\n200v\t147353\ninterchange\t147354\n张天福\t147355\nmcn\t147356\ncpuid\t147357\n天津三中\t147358\n邮政法\t147359\n梅梅\t147360\n风云直播\t147361\n田村淳\t147362\n85式\t147363\n包卜\t147364\n暗夜骑士\t147365\n1480\t147366\nconstraints\t147367\n申请函\t147368\n14001\t147369\n慢慢喜欢你\t147370\n盘头发\t147371\n机耕\t147372\n英伦\t147373\n唐三\t147374\nBuck\t147375\n海南国际旅游岛\t147376\n王声\t147377\n压覆\t147378\n永湖\t147379\n赴约\t147380\n客来店\t147381\n恒润\t147382\nHomeKit\t147383\nuntiy\t147384\n0755-27505283_0755-27505283\t147385\nr9\t147386\n交叉熵代价函数\t147387\nfa
ithful\t147388\n940M\t147389\n他\t147390\njonyj\t147391\n毛利润\t147392\n答题库\t147393\n溪流\t147394\n汽车类\t147395\n资源邦\t147396\n火线精英\t147397\n好心酸\t147398\n6.4米\t147399\n变变变\t147400\nDay4\t147401\n潲水\t147402\n一百个\t147403\n【卫计委\t147404\n导游员\t147405\n婚前\t147406\n9341\t147407\n对照版\t147408\n舵\t147409\n花溪\t147410\n养不死\t147411\nCruncher\t147412\n详表\t147413\n驱魔笔记\t147414\n家珍\t147415\nvxWorks\t147416\nvalkyrie\t147417\n枕上书\t147418\n19999\t147419\n金毛狮王\t147420\nBurke\t147421\n第三十七期\t147422\n老年报\t147423\ngenerators\t147424\nIAM\t147425\n马大姐\t147426\n沧州医学高等专科学校\t147427\n雪余香\t147428\n逾\t147429\n家委\t147430\n杭州图书馆\t147431\n防爆毯\t147432\n20171122\t147433\n搜狗壁纸\t147434\n衍生物\t147435\nchromium\t147436\n甚为\t147437\n纯小\t147438\n杨嵩\t147439\nAttack\t147440\n礼义廉耻\t147441\n重庆电子税务局\t147442\n平高集团\t147443\n国卫办\t147444\n载舟\t147445\nXboxOne\t147446\n云动\t147447\n图-秀居网\t147448\n勐腊县\t147449\n远山庄园\t147450\nIgnoring\t147451\nVALUE\t147452\n5架\t147453\n一万年\t147454\n七步\t147455\n7567\t147456\nDevZone\t147457\n厚脸皮\t147458\n云鬓\t147459\n第105集\t147460\n门梁\t147461\n北京消防\t147462\n秦始皇墓\t147463\n黑曜\t147464\n工科类\t147465\n防尘盖\t147466\n乔治白\t147467\n4起\t147468\n额度单\t147469\n梦战:碧海旭梦\t147470\n固始吧\t147471\n即墨区\t147472\n16:30\t147473\n老实\t147474\nher\t147475\n10英寸\t147476\n丙氨酸氨基转移酶\t147477\n1900万\t147478\n有道云笔记Markdown\t147479\n连翘\t147480\n玻璃制品\t147481\n金健康\t147482\n猛地\t147483\n上海摩友\t147484\n凝结\t147485\n曲木\t147486\n300寸\t147487\n贷前\t147488\nUI编辑器\t147489\n位图\t147490\n泡沫\t147491\n罗马假日\t147492\n中华人民共和国政府\t147493\n悟天\t147494\nOpenshift\t147495\nTWR\t147496\nCNKI翻译助手\t147497\n阿甘油\t147498\nlq-630k\t147499\n害怕\t147500\n2座\t147501\n解析\t147502\nTow\t147503\n中国科学院广州生物医药与健康研究院\t147504\n内蒙古消防\t147505\n薄暮传说\t147506\n1.5个\t147507\n植村秀\t147508\n0477\t147509\n结肠镜\t147510\nBalabolka\t147511\nshaonian\t147512\nNK细胞\t147513\nSwarovski\t147514\n2017年11月28日\t147515\n端午粽\t147516\nmaking\t147517\n开镜\t147518\n遭不住\t147519\ngongye\t147520\n三轴陀螺仪\t147521\n注册版\t147522\n张子凡\t147523\n周原\t147524\n侯门嫡女\t147525\n南佳\t147526\n任峰\t147527\n空冷器\t147528\n选举人\t14752
9\n速派/昊锐论坛_汽车之家论坛\t147530\nchannels\t147531\n俞兆林\t147532\n华宝股份\t147533\n蝴蝶鲤\t147534\n真宗\t147535\n上海市教育考试院\t147536\n东北寻宝鼠\t147537\nnaza\t147538\n青岛市市南国家税务局\t147539\n六桥\t147540\nCIO\t147541\n新闻_电子报\t147542\nfx3u\t147543\n天丰\t147544\n欧蓝德\t147545\n松林镇\t147546\nKPI指标\t147547\nlris\t147548\n产销率\t147549\n分型\t147550\n我们的爱\t147551\npro版\t147552\n司龄\t147553\n2018年11月份\t147554\n防风抑尘网\t147555\nwoool\t147556\nps素材\t147557\nstduio\t147558\n威尔史密斯\t147559\n屋舍\t147560\n≦\t147561\n万仙山书院\t147562\n局势\t147563\n孤岛惊魂5_\t147564\n碎梦\t147565\n朴施厚\t147566\n轻型坦克\t147567\n1916年\t147568\n得宝\t147569\ncctv11\t147570\nOlufsen\t147571\n华为畅享6S\t147572\nDustin\t147573\n0.5t\t147574\n便器\t147575\n发电片\t147576\napplication\t147577\n乘员\t147578\n工银瑞信\t147579\ndnfpk吧_\t147580\n货机\t147581\nDirectors\t147582\n瓷娃娃\t147583\nDoing\t147584\n福鼎\t147585\n三穗县政府\t147586\n咬\t147587\nken\t147588\n苏州幼儿师范高等专科学校\t147589\n剪拼\t147590\n坡口机\t147591\nseal\t147592\n金盾网\t147593\n臂部\t147594\n山气\t147595\n福马路\t147596\noneway\t147597\n城与勇\t147598\n收治\t147599\n北京方庄\t147600\n己丑\t147601\n数据框\t147602\n水霉\t147603\n人地\t147604\n人与人\t147605\nmegabus\t147606\n爬架\t147607\n新金瓶梅\t147608\n第二讲\t147609\n天好彩\t147610\n农科新闻网\t147611\n多发\t147612\n增开\t147613\n东方科幻谷\t147614\n吉娘娘\t147615\n乔磊\t147616\nquarrel\t147617\n愉悦\t147618\n每逢\t147619\n二狗子\t147620\n5.1声卡驱动\t147621\nknown\t147622\n酆\t147623\n互联网银行\t147624\n爱罗\t147625\n核团\t147626\n老苏\t147627\n情艺\t147628\n封罐机\t147629\nFlexBox\t147630\n治欠\t147631\n羟丁酸脱氢酶\t147632\n第6关\t147633\n湾里\t147634\n朵色\t147635\n文勇\t147636\n江岸区政府\t147637\n调到\t147638\n夏威夷州\t147639\n明文\t147640\n佳润\t147641\n苍南县人民政府\t147642\n02985387812\t147643\n代还\t147644\n捷信\t147645\n莉亚德琳\t147646\n跖疣\t147647\n10题\t147648\nsexfriend\t147649\n韶城\t147650\nEasiest\t147651\n夺魄\t147652\n楚秀网\t147653\n无贷\t147654\n橘洲\t147655\n五笔字\t147656\n乌鲁木齐\t147657\n多酚氧化酶\t147658\n机油尺\t147659\nzookerper\t147660\nff8\t147661\nAt\t147662\n肥西县人民政府\t147663\n弄出\t147664\n李呈媛\t147665\n山西省卫生和计划生育委员会\t147666\n4,5月份\t147667\n包涵\t147668\n永昌堡\t147669\n无神论\t147670\n纸媒\t147671\n白鲨\t147672
\nAirship\t147673\n海眼\t147674\nSue_娜\t147675\n下一年\t147676\n金刚藤胶囊\t147677\n三工\t147678\nAntony\t147679\n教科文组织\t147680\n挠挠\t147681\n全华班\t147682\n老人家\t147683\nspacewander\t147684\nCrafts\t147685\n帕赫贝尔\t147686\n17家\t147687\nConstants\t147688\n西安搬家公司\t147689\n朱自强\t147690\n招待状\t147691\n从现在开始\t147692\n定距\t147693\n投影\t147694\n蚀变\t147695\n铸梦\t147696\n海拉尔\t147697\n机遇\t147698\n6206\t147699\njansport\t147700\n隐私照\t147701\nEquations\t147702\n市民化\t147703\n分公司\t147704\n短词\t147705\n隆基泰\t147706\n精防\t147707\n壬申日\t147708\n51P\t147709\n唐人街探案\t147710\n山羊\t147711\ndre\t147712\n拉米夫定片\t147713\n灰砖\t147714\n编译\t147715\n许三多\t147716\n氮化硼\t147717\n笔趣阁小说网\t147718\n贝塞尔曲线\t147719\n鹏达\t147720\n智慧云\t147721\n老克\t147722\n写\t147723\n程志\t147724\n吴凡\t147725\n上海工商外国语职业学院\t147726\n11.28\t147727\n5900\t147728\n军车\t147729\n蕾伊\t147730\n北京注会\t147731\n春秋航空股份有限公司\t147732\n场长\t147733\nBSD\t147734\n麻风村\t147735\n事务型\t147736\n新神\t147737\n第七十三章\t147738\n第二年\t147739\n少年谢尔顿\t147740\n豪享\t147741\n步兵团\t147742\n接取\t147743\n数分\t147744\n安全装置\t147745\n油耳\t147746\n濑美莉亚\t147747\n高者\t147748\n寐\t147749\n文网文许可证\t147750\n从前天\t147751\n神探夏洛克小姐\t147752\n空姐\t147753\n两维\t147754\nnation\t147755\n剑三帮会\t147756\n复合土工膜\t147757\n郭\t147758\n小男孩\t147759\n10大名\t147760\n名博\t147761\n沧源县\t147762\n360盘\t147763\n黄光\t147764\n宁波银行\t147765\n史蒂芬\t147766\n闵行区中心医院\t147767\ngoverment\t147768\n协同效应\t147769\n2辆\t147770\n第n行\t147771\n跳舞\t147772\n中鲁\t147773\nManagerial\t147774\n一公分\t147775\n扑克圈\t147776\n2017年2月24日\t147777\nHUSH\t147778\n陕西省戏曲研究院\t147779\nrsz\t147780\n鹡鸰\t147781\nMelbourne\t147782\n1万5\t147783\n32M\t147784\n探清水河\t147785\n文佳\t147786\n孙扬\t147787\nif公式\t147788\n滴滴公司\t147789\n1.22\t147790\n莆田市区\t147791\n丽人行\t147792\n情侣衫\t147793\n临安山\t147794\n清风自来\t147795\n大容\t147796\n永洪科技\t147797\nSanyo\t147798\ncod12\t147799\n景财富网\t147800\n盖地\t147801\n埃尔顿\t147802\n于利\t147803\n筑业软件\t147804\n火星十一郎\t147805\n郭江\t147806\n清正廉明\t147807\n盛名\t147808\n借款\t147809\n布拉格广场\t147810\n战技\t147811\nsharesdk\t147812\nfileinputstream\t147813\n旗下子公司\t147814\n由衣\t147815\n眼前\t147816\nHouzz
\t147817\n二则\t147818\n售价\t147819\n尤卓尔\t147820\n巨人的陨落\t147821\n503所\t147822\n天马科技\t147823\nUPNP\t147824\n华东师范大学教育学部\t147825\n面镜\t147826\n永德\t147827\n耗散\t147828\n奥威亚\t147829\n千山区\t147830\nBartleby\t147831\nuku\t147832\n815\t147833\n煮艺\t147834\n403_\t147835\nXDC\t147836\n敦和\t147837\n一路向北\t147838\ncomplicated\t147839\n水濑\t147840\nMartian\t147841\n碾子\t147842\n脱审\t147843\nspect\t147844\n数起\t147845\n健发\t147846\n欢聚宝\t147847\n浪子燕青\t147848\n富德\t147849\npowerbeats\t147850\nsummernote\t147851\n北京公交\t147852\n夏薇\t147853\n10%公\t147854\nmencoder\t147855\n唐剑\t147856\n太平洋证券股份有限公司\t147857\ntcsh\t147858\n面砖\t147859\n财考网\t147860\n魔殿\t147861\n河北省委省政府\t147862\n拷贝纸\t147863\nAV艳星\t147864\n顾仲安\t147865\n中国国民党革命委员会\t147866\n瓶坯\t147867\n市党委\t147868\n南瑞\t147869\n中国化工制造网\t147870\n冻干\t147871\n古纸\t147872\n黄致列\t147873\nAin\t147874\n永川网\t147875\nadobeaudition\t147876\ncomputeruniverse\t147877\n天天撸一撸\t147878\n领军者\t147879\n三代\t147880\n球哥\t147881\n锁具\t147882\n泰达希尔\t147883\n炒房\t147884\nSchiff\t147885\n傷\t147886\n发贴\t147887\n瘦脸面膜\t147888\n翼之声\t147889\nlq630k\t147890\nhainan\t147891\n樱井莉亚\t147892\n6950\t147893\n勐库\t147894\n青团\t147895\n合同额\t147896\n不关\t147897\n白清明\t147898\n石脑油\t147899\n樊野\t147900\nPentaho\t147901\n绿卡通\t147902\n8毫米\t147903\n郭军\t147904\nForeigner\t147905\n693\t147906\n海洋奇缘\t147907\n真气\t147908\n美印\t147909\n最终幻想1\t147910\nFLYabroad\t147911\ntakes\t147912\n海南自由贸易区\t147913\n沙土镇\t147914\n蛤蟆\t147915\nWeb版\t147916\nAqara\t147917\n休闲包\t147918\nClassified\t147919\n4年多\t147920\nRARBT_bt电影_bt种子_\t147921\nvicious\t147922\nPhotobucket\t147923\n中国教育装备行业协会\t147924\n下半句\t147925\n缠欢\t147926\n恩施鹤峰\t147927\n广东省学位委员会\t147928\n神枪手\t147929\n百格测试\t147930\n硅谷传奇\t147931\n瀚德\t147932\n弄假成真\t147933\n天水广电网\t147934\n佳木斯快乐舞步健身操\t147935\n张湾区\t147936\n控屏\t147937\n竹水\t147938\n为什么不可以\t147939\n云南省委组织部\t147940\n芙兰朵露\t147941\n深圳月子中心\t147942\n民爱\t147943\n急租\t147944\n堂风\t147945\n3月9日\t147946\n小脚裤\t147947\n表值函数\t147948\nPSI\t147949\n充气垫\t147950\n超级机器人大战f\t147951\n台湾金曲奖\t147952\n证券\t147953\nTangle\t147954\n92周年\t147955\n西谎极落之太爆太子太空舱\
t147956\n陕西一区\t147957\n飞特\t147958\n林学院\t147959\nxoo\t147960\n电子汽车衡\t147961\n男女童\t147962\n装饰者\t147963\n12立方\t147964\nproe5\t147965\n奸笑\t147966\nx++\t147967\n谭震林\t147968\n保留价\t147969\n冲天炉\t147970\n电子工\t147971\n应征入伍\t147972\n1821\t147973\n新币\t147974\n磁浮子液位计\t147975\nkinect\t147976\n归化\t147977\n主谋\t147978\n23分\t147979\nsenlin\t147980\n西安北大街\t147981\n精神恍惚\t147982\n赢荡\t147983\nAARRR\t147984\n转折\t147985\n丁桥\t147986\norcid\t147987\n诚心诚意\t147988\n宿松县人民政府\t147989\nCpu\t147990\nq30\t147991\n董平\t147992\n300kg\t147993\n904\t147994\n名词化\t147995\n崇明东滩\t147996\ndjkk\t147997\n石岩公学\t147998\nenergy\t147999\n中新视频\t148000\n世纪路\t148001\nipadpro\t148002\nseafood\t148003\n镇远\t148004\n3:狂猎\t148005\n言情类\t148006\nope\t148007\n地产化\t148008\n认房\t148009\n房地产市场调查报告\t148010\nTSC\t148011\n揺\t148012\n吉林省人民医院\t148013\n团单\t148014\nCENTOS\t148015\n鸿福路\t148016\n龙胆泻肝汤\t148017\n花源\t148018\n1920s\t148019\n郑小瑛\t148020\n强电\t148021\nerotic\t148022\n折标\t148023\n舰队coll\t148024\nM158b\t148025\nCom\t148026\n18成\t148027\n渗析\t148028\n7p\t148029\n洪灏\t148030\n天气丹\t148031\n小白鸽\t148032\n快猴网\t148033\n回压\t148034\n面壁者\t148035\n德意志意识形态\t148036\n第127集\t148037\nhexadecimal\t148038\n苹果Macbook\t148039\n克城\t148040\n扛不住\t148041\n财税法\t148042\n淮海中路999号\t148043\nsone\t148044\nExpire\t148045\n线端\t148046\nWebOffice\t148047\n另一半\t148048\n濠江\t148049\n莫斯科陷落\t148050\nNEN\t148051\n眉山职业技术学院\t148052\n狐猴\t148053\n仁恒江湾城\t148054\nCefSharp\t148055\n头箍\t148056\n仅次于\t148057\n承德银行\t148058\n蜀南竹海\t148059\n乔_木\t148060\n初十\t148061\nTechnique\t148062\n默认源\t148063\n吧_一猫汽车网\t148064\n后生\t148065\nRap\t148066\nBates\t148067\n心丹\t148068\n中效过滤器\t148069\nHacks\t148070\n横七竖八\t148071\n字母\t148072\nezbuy\t148073\nopenssh\t148074\n浊\t148075\nμ\t148076\n竭尽所能\t148077\n海波轮\t148078\n电气学院\t148079\n单体\t148080\n套链\t148081\n三爱三节\t148082\n潘石岳飞\t148083\n功课\t148084\n图文\t148085\nHolmes\t148086\n数千万美元\t148087\n邓恩铭\t148088\n滞纳\t148089\nGLSL\t148090\n固铂轮胎\t148091\nCitizen\t148092\n名区\t148093\n三产\t148094\n十倍\t148095\n伊然\t148096\n1立方\t148097\n奥沙西泮片\t148098\n002673\t148099\nNEX\t1481
00\n双峰一中\t148101\n改朝换代\t148102\n厦门晚报电子报[厦门晚报\t148103\ncbre\t148104\n别君\t148105\nWAKE\t148106\n光泽度\t148107\n快捷语\t148108\n鸩\t148109\nimprove\t148110\n短\t148111\n明月天涯\t148112\n线性化\t148113\nauxre\t148114\n游星\t148115\n实施\t148116\n9月18日\t148117\n广西壮族\t148118\n大分县\t148119\n冷棚\t148120\n匠心\t148121\n断火\t148122\nlss\t148123\n高玉良\t148124\n特战部队\t148125\n丽水市中心医院\t148126\n磷铁\t148127\n颜字\t148128\n前女朋友\t148129\n武汉协和\t148130\n艳史\t148131\n獨\t148132\n罩极电机\t148133\n18MM\t148134\nbidirectional\t148135\n定额税\t148136\n硫化汞\t148137\n第89集\t148138\npyplot\t148139\n岱山县\t148140\n点_\t148141\n邪恶漫画全集\t148142\n高霞\t148143\n灵飞经\t148144\n绳缚\t148145\nBaahubali\t148146\n沙塔尔\t148147\n第61号\t148148\nC141\t148149\n一事一议\t148150\n玩意儿\t148151\n18粒\t148152\n上海金融报新闻网\t148153\n两天一晚\t148154\n北京摩拜科技有限公司\t148155\n驱动电路\t148156\n虚汗\t148157\n慈善事业\t148158\nMadison\t148159\n坚果类\t148160\nAA级\t148161\n酒精味\t148162\n榨汁机\t148163\n院墙\t148164\nlaminated\t148165\n电炸锅\t148166\n胡适\t148167\n核高\t148168\n琴棋书画\t148169\n美术生\t148170\n河南省中小学幼儿园\t148171\n0475\t148172\n屏东中学\t148173\n厦门自贸区\t148174\n路明非\t148175\nWork\t148176\n瑞金医院\t148177\n德化县\t148178\n操服\t148179\n诸天至尊\t148180\n凯亚\t148181\nydy\t148182\n法尔曼\t148183\n诺金\t148184\n韩德\t148185\n14项\t148186\n张任\t148187\n中国康复医学会\t148188\n青绿\t148189\n概预算软件\t148190\n第一_\t148191\n领地人生MMO\t148192\n太乐地图下载器\t148193\n室内空气检测\t148194\ned2000\t148195\nchahuo\t148196\n说起来\t148197\n奥运会女排比赛\t148198\n聊天栏\t148199\n气路\t148200\n纸钞屋第二季\t148201\nD500\t148202\n卫生村\t148203\n万敏\t148204\n视锥细胞\t148205\n会计准则\t148206\n新湖中宝\t148207\n王者荣耀召唤师\t148208\n淮南市\t148209\nwebflow\t148210\n戏缘\t148211\nBAPI\t148212\nserializers\t148213\n嗯老师\t148214\n入机\t148215\n红区\t148216\n网易MuMu\t148217\n采血\t148218\n头鞋\t148219\n2016年4月14日\t148220\n表带\t148221\n218号\t148222\n波茨坦\t148223\nEntrust\t148224\nfso\t148225\n无望\t148226\n西岗镇\t148227\n热成\t148228\n血战湘江\t148229\n联图\t148230\nPutty\t148231\n驻扎\t148232\n扭捏\t148233\nitoos\t148234\n深圳华为\t148235\n20光\t148236\n跳珠\t148237\n异邦人\t148238\n公益类\t148239\n阿夏\t148240\n熔浆\t148241\n补兵\t148242\n胡明轩\t148243\n换座\t148244\nR
ealtek\t148245\nruboo\t148246\n七千元\t148247\nptr\t148248\n万德隆\t148249\n人脸\t148250\n破阵\t148251\n激光网\t148252\n美都能源\t148253\n暖皮\t148254\n阳明滩大桥\t148255\n消费者报告\t148256\nvein\t148257\n劝导\t148258\n杜洛克\t148259\n转站\t148260\n样张\t148261\n新解\t148262\n陈敏尔\t148263\n马来人\t148264\n光谷东\t148265\n寻味\t148266\n常来\t148267\nrepo\t148268\n阜城县\t148269\n长信科技\t148270\n0922\t148271\n昆汀·塔伦蒂诺\t148272\n艾康尼克\t148273\n烂摊子\t148274\n床上用品\t148275\n项目制\t148276\nimple\t148277\n警觉\t148278\n氟哌酸\t148279\n法名\t148280\n线帘\t148281\n长江道\t148282\n北京应用物理与计算数学研究所\t148283\n分尸案\t148284\n休矣\t148285\n醚\t148286\npyhone\t148287\n脸盘\t148288\n朱虹\t148289\n韬奋\t148290\n张康阳\t148291\n郭林\t148292\n健者\t148293\n南宁三中\t148294\n5.0\t148295\nMP3@320K\t148296\n金门岛\t148297\n飙升\t148298\n敕\t148299\nkite\t148300\n文盛\t148301\n7140\t148302\n万安\t148303\n第247集\t148304\n1.0分\t148305\n合肥市经济和信息化委员会\t148306\n安徽建筑工业学院\t148307\n海贼王女帝汉\t148308\n郡国\t148309\n白烟\t148310\ndpt\t148311\n驯夫\t148312\n436号\t148313\n吴敬平\t148314\n上海小米\t148315\n中国工商银行上海分行\t148316\n178剑网3\t148317\n蓝得\t148318\n医保本\t148319\n助念\t148320\n钦州\t148321\n不力\t148322\n喷水\t148323\ngt710\t148324\n烫面\t148325\n张卫华\t148326\nEVE-NG\t148327\n亲贝网\t148328\n祖冲\t148329\nEAGLE\t148330\n禧贝\t148331\n环球易购\t148332\n偏斜\t148333\n素材天下网\t148334\n威基伍德\t148335\n微课\t148336\n张宇航\t148337\n位移台\t148338\n圆柱状\t148339\nmothercare\t148340\n广东轻工职业技术学院\t148341\n圆溜溜\t148342\n诗选粹\t148343\n8轴\t148344\n省商务厅\t148345\n天纳克\t148346\ngw250f\t148347\n海商网\t148348\n卖身\t148349\n绿水灵\t148350\n孙夏\t148351\nQuartz\t148352\numeng\t148353\n怀春\t148354\nauthc\t148355\nvostro\t148356\njxw\t148357\n35t\t148358\n乌兰浩特\t148359\n皇堡\t148360\n两排\t148361\n件数\t148362\n刘彤\t148363\n无不尽\t148364\n梁启超\t148365\n鸿茅药酒事件\t148366\nMCC\t148367\n35篇\t148368\nr9tm\t148369\n护理保健_么么亲子网\t148370\n温州医院\t148371\n唐世光\t148372\n油性\t148373\nEffectiveness\t148374\n田惠宇\t148375\n计算机科学与工程学院\t148376\nLovey\t148377\nchant\t148378\n知识产权质押融资\t148379\nMSVCP110.dll\t148380\n2.6.9\t148381\n失智\t148382\n立规\t148383\n三振\t148384\n员工们\t148385\n武则盛一伦\t148386\nZAP\t148387\n中英文混排\t148388\nodin\t148
389\n循环扇\t148390\n内需\t148391\n派员\t148392\nMBL\t148393\n安装卡\t148394\n建台\t148395\n项王\t148396\n国家质检中心\t148397\n帐载\t148398\norleans\t148399\n两虚\t148400\n2018.4.2\t148401\n雾岛奈津美\t148402\n硫酸沙丁胺醇气雾剂\t148403\nszzs\t148404\n第三方支付公司\t148405\n贴片晶振\t148406\n300mg\t148407\n陆沉\t148408\n场边\t148409\n陆剧\t148410\n非博\t148411\n上海市住房保障和房屋管理局\t148412\nboobs\t148413\n申鑫\t148414\nHLB\t148415\n中国国际石油石化技术装备展览会\t148416\n措施案\t148417\n产业类\t148418\n王璟\t148419\n码农咖啡馆\t148420\n代金卷\t148421\n你的样子\t148422\n如皋中学\t148423\n不喝\t148424\n大辽\t148425\n孙莉\t148426\n4190\t148427\n小时制\t148428\n白鹏\t148429\n14pk\t148430\nStorage\t148431\ncm13\t148432\nLuv\t148433\n学弟们\t148434\n有房\t148435\nSM4\t148436\n河北镇\t148437\nAspire\t148438\n金刚藤\t148439\n本善\t148440\n四木\t148441\nconviction\t148442\n缘份\t148443\nFINS\t148444\n国家体育场\t148445\n满级\t148446\n平安保险平安福\t148447\ntjbc_北辰\t148448\n贺岁篇\t148449\n苏飞\t148450\n15平方\t148451\n虎杖\t148452\nmayu\t148453\n迷你裙\t148454\n红联\t148455\n补票\t148456\n春雨医生\t148457\n燕文\t148458\n自拍杆\t148459\n逆时针\t148460\nseeyou\t148461\n刘艳红\t148462\n劈刀\t148463\nKeyStore\t148464\nCD2\t148465\n柠檬\t148466\n音平商城\t148467\n重医附二院\t148468\n粘人\t148469\n公共课\t148470\n瑞亚\t148471\njarfile\t148472\nABAA\t148473\n蒲街坊\t148474\n双轮\t148475\n河南大学\t148476\n浙江省儿童医院\t148477\n酷鸟航空\t148478\nQ345\t148479\nGY\t148480\n生物肥\t148481\n悠美\t148482\n南京夫子庙\t148483\n挑毛\t148484\n大亚湾核电站\t148485\n逐浪小说网\t148486\n速翼特\t148487\n百宝箱\t148488\nMiss_wang\t148489\nTucker\t148490\n双面焊\t148491\n车事\t148492\n失修\t148493\nfoto\t148494\n56分钟\t148495\n吸顶机\t148496\n喊累\t148497\n岳阳政府\t148498\nqtextedit\t148499\n泄题\t148500\n衍变\t148501\nTOPSHOP\t148502\n关宏宇\t148503\nfmvp\t148504\n一叶兰\t148505\n打伞\t148506\n全家\t148507\n保时捷股份公司\t148508\n军爷\t148509\n第90号\t148510\n小莹\t148511\n燃气机\t148512\nNets\t148513\n天之眼\t148514\n锅具\t148515\n配有\t148516\n义乌国际商贸城二区\t148517\n街心花园\t148518\nflie\t148519\n40栋\t148520\n永远的爱\t148521\nGRE\t148522\n大裂谷\t148523\n太空漫步\t148524\n当红不让\t148525\nSATWE\t148526\n滚滚长江东逝水\t148527\n寿山\t148528\n宇通客车\t148529\n使馆\t148530\n三亚丽禾温德姆酒店\t148531\nmle\t148532\n裸浴\t148533\n伪
静态\t148534\n龙腾传世\t148535\n6%\t148536\nky\t148537\n8730\t148538\n盟友\t148539\n大萌萌\t148540\nvba吧_\t148541\nuvision5\t148542\n本本主义\t148543\n寇伟\t148544\n常州轻工职业技术学院\t148545\n广告师\t148546\n商票\t148547\n柏图斯\t148548\n席\t148549\n5瓶\t148550\n变频器\t148551\n海淀\t148552\n股二头肌\t148553\n夜里12点\t148554\nej\t148555\n两汉\t148556\nl470\t148557\nNubiles\t148558\nelcipse\t148559\nthetic\t148560\n中国男团\t148561\n纵览\t148562\n俄罗斯航空\t148563\nSharon\t148564\n抽样定理\t148565\n男娘\t148566\n20毫升\t148567\n会客\t148568\n第九十一章\t148569\nHybrid\t148570\n影视级\t148571\n花男\t148572\n租号器\t148573\nNICONICO\t148574\n张伟杰\t148575\n公务员工资标准表\t148576\n思博\t148577\n黄姓\t148578\nlimits\t148579\n安第斯山脉\t148580\ngrok\t148581\n低费\t148582\n心迷宫\t148583\n家群\t148584\n马奎斯\t148585\n小窗幽记\t148586\n变异\t148587\n兖州煤业\t148588\n中国共产党西藏自治区委员会\t148589\n石首\t148590\n第196集\t148591\n歌子\t148592\n中外大学\t148593\n1062\t148594\n10排\t148595\n浙企\t148596\n拨号器\t148597\n哥伦\t148598\n重庆人力资源和社会保障网\t148599\n耳温计\t148600\n中国科学院国家天文台\t148601\n江洲\t148602\n逸凡\t148603\n吴旭东\t148604\n甘泉雨岔大峡谷\t148605\n大口径螺旋钢管\t148606\n逍遥客\t148607\n0.0_\t148608\n杀人游戏\t148609\n羽根\t148610\n爱荷华州\t148611\npsd图层\t148612\nsmc\t148613\n爱的时差\t148614\n何香凝美术馆\t148615\n一树\t148616\n绝代商骄\t148617\nHart\t148618\nMT7628\t148619\n新闻世界\t148620\n审计厅\t148621\n波音767\t148622\n国药集团药业股份有限公司\t148623\n汉化组\t148624\n美吉姆国际儿童教育中心\t148625\n煮酒\t148626\n販売\t148627\n魔山\t148628\n实刑\t148629\n北京大学汇丰商学院\t148630\n爱囚\t148631\n5.3\t148632\n顺庆区\t148633\n天地\t148634\n上海皮肤病医院\t148635\nLiner\t148636\n沈阳森林动物园\t148637\n霸王龙\t148638\n巴特勒\t148639\n蒋川\t148640\n大红色\t148641\n籽月\t148642\njodconverter\t148643\n模锻\t148644\nkube\t148645\n暗影猎手\t148646\ndiannao\t148647\nliam\t148648\n天兵天将\t148649\n腾讯众创空间\t148650\n华严\t148651\nmsvcp140\t148652\n柯美750\t148653\n江苏省农业委员会\t148654\n明日花キララ\t148655\nv4.2.3\t148656\n价值流\t148657\n新开普\t148658\n称谓\t148659\n冯小宁\t148660\n魔警\t148661\nscom\t148662\n760D\t148663\n广州大学华软软件学院\t148664\n长水国际机场\t148665\nkeystone\t148666\n7060D\t148667\n便函\t148668\n涂覆\t148669\n平安好福利\t148670\n裕民\t148671\n线雕\t148672\n十万吨\t148673\n当政\t148674\n3迅雷
\t148675\n映体\t148676\nfaile\t148677\n50亩\t148678\nfanny\t148679\n8.03\t148680\n规制\t148681\n调停\t148682\n美家宝\t148683\n沙巴州\t148684\n163@163.com\t148685\n司导\t148686\n行云\t148687\n持久型\t148688\n禾香板\t148689\n开颅\t148690\n两江春城\t148691\n一百部\t148692\n单_筑龙网\t148693\nteese\t148694\nSalmon\t148695\n5英尺\t148696\nFeat\t148697\n藏银\t148698\n10万股\t148699\n周润发\t148700\n火塘\t148701\n维尔纽斯\t148702\n莫代尔\t148703\nprio\t148704\n便意\t148705\n镇坪县\t148706\n红外分光光度法\t148707\n柏林娱乐\t148708\n00700\t148709\n双色球吧_\t148710\n边缘型\t148711\n180419\t148712\nBDbilibili\t148713\n玫瑰庄园\t148714\n上海一区\t148715\n卡顿\t148716\n打怪\t148717\n航海王\t148718\n和讯\t148719\n烤串\t148720\n16mn\t148721\n每一代\t148722\n绍兴北站\t148723\n博山区\t148724\n情报板\t148725\n组合机\t148726\nericnie\t148727\n部部夸影视大全\t148728\n偶记\t148729\n兼毫\t148730\nmip\t148731\n菜市\t148732\n5月2日\t148733\n第864章\t148734\n同层\t148735\n耳畔\t148736\n古朴\t148737\nB360\t148738\n微山岛\t148739\n黑天\t148740\n瑟庄妮\t148741\n衡水志臻实验中学\t148742\n浙江大学药学院\t148743\n鹤沙航城\t148744\n望海楼\t148745\n石嘴山市政府网\t148746\n保证\t148747\n上柴\t148748\n掩护\t148749\n抢占式\t148750\n出外\t148751\n5套\t148752\n铁林\t148753\n小樱桃\t148754\n黄胄\t148755\n盤\t148756\nFIIL\t148757\n可能\t148758\nPerla\t148759\n主神\t148760\nattachments\t148761\nGraves\t148762\n哥拉尔\t148763\n楚河汉界\t148764\n腊肠狗\t148765\n虎父\t148766\n撑杆跳\t148767\n优片库\t148768\n修真界\t148769\npdf批量\t148770\nBD-720P-MKV\t148771\nInkscape\t148772\nwNv\t148773\nMacaca\t148774\n碳布\t148775\n广安市\t148776\n法规库\t148777\n事变\t148778\n放火案\t148779\n扣缴\t148780\n郑州二中\t148781\n挂作\t148782\n同床异梦2\t148783\n儿童游乐场\t148784\n米可\t148785\n管桦\t148786\n明知故犯\t148787\n十元\t148788\n广州市工商行政管理局\t148789\n流失败\t148790\n灰水\t148791\n000040\t148792\nmoring\t148793\n调用子函数\t148794\n第一群\t148795\n朱国华\t148796\n百无一用\t148797\nx2+bx+c\t148798\n李子串\t148799\n贾云\t148800\n雅意\t148801\n中国民办教育协会\t148802\n浙江政务服务网\t148803\n社会史\t148804\n蹲着\t148805\n映月湖\t148806\n油车港\t148807\n番茄红素软胶囊\t148808\n金钱版\t148809\n1666\t148810\n通州区政府\t148811\n砂箱\t148812\n剑麻\t148813\n没结果\t148814\n一二_\t148815\n元生\t148816\n于淑珍\t148817\n东风风度MX6\t148818\n鑫鹏\t148819\n166号\t148820
\n峰林\t148821\n不似\t148822\nGUID\t148823\nInflater\t148824\n欧四\t148825\n环卫工人节\t148826\n香皂盒\t148827\n明城墙\t148828\n凌河\t148829\n奥斯维辛\t148830\n工作品\t148831\n课稿\t148832\n5亿多\t148833\n赠券\t148834\n预约挂号统一平台\t148835\n60赫兹\t148836\n有信心\t148837\n路桥\t148838\n反恐精英ol吧_\t148839\n阿姆\t148840\nqq电话\t148841\nv7包\t148842\nbendi\t148843\n枯水\t148844\n八玲珑\t148845\n18版\t148846\n网刊\t148847\nPPTV网络电视\t148848\n客邦\t148849\n中国会\t148850\n恐惧症\t148851\n洗衣\t148852\n械斗\t148853\n奇瑞E3\t148854\n雪后\t148855\n水周\t148856\n凤凰新华书店旗舰店_江苏凤凰新华书店官网_天猫书城\t148857\n夏朗论坛\t148858\n唇\t148859\n广西经济管理干部学院\t148860\n烂漫\t148861\nLift\t148862\ne2fsck\t148863\n晚会\t148864\n三国大时代4\t148865\n泰顺\t148866\n预算价\t148867\n利尔化学\t148868\n暑运\t148869\n27270\t148870\n7816\t148871\n左心衰竭\t148872\n内蒙古电力(集团)有限责任公司\t148873\n中国消防在线\t148874\n换气扇\t148875\n转运站\t148876\n自考座位号\t148877\n4680\t148878\nreturned\t148879\n雲霏霏\t148880\n冯象\t148881\n纸匠\t148882\nAwk\t148883\napprove\t148884\n慈心渡鬼\t148885\n王营\t148886\n中土战争\t148887\n周建萍\t148888\n小渡\t148889\n和棋\t148890\n太忙\t148891\n13季\t148892\n弓形\t148893\ninno\t148894\nparce\t148895\n西柚\t148896\n业务处\t148897\n东南门\t148898\nEasyX\t148899\nMedley\t148900\nTL-WR845N\t148901\n大学生们\t148902\n故障版\t148903\n红岩村\t148904\n营口\t148905\n郑俊河\t148906\nWEEK\t148907\n拍案叫\t148908\n三国群英传1\t148909\n21080\t148910\n裸盘\t148911\nWPS格式\t148912\n怀远一中\t148913\n肝胆结石\t148914\n图片片\t148915\n新中路\t148916\n中华人民共和国企业所得税法\t148917\n棒棒棒\t148918\n渔轮\t148919\n6.78\t148920\n三爪卡盘\t148921\n过渡句\t148922\n克和千克\t148923\n第120期\t148924\n厦门园林植物园\t148925\n滑腻\t148926\n甜虾\t148927\n姜辉\t148928\n哑铃型\t148929\n德系\t148930\n行将\t148931\n吉大附中\t148932\ntern\t148933\n壶铃\t148934\n诡水疑云\t148935\n电鬼\t148936\n朗麓\t148937\n淄博新区\t148938\n九块\t148939\n水漫金山\t148940\nwin764\t148941\nglance\t148942\n开槽机\t148943\n王佩丰\t148944\n11斤\t148945\n并购案\t148946\n大飞\t148947\nspartan\t148948\n6.88\t148949\n来日方长\t148950\ncatch\t148951\n_词典网\t148952\n猫途\t148953\n无多\t148954\n无论\t148955\n靓居\t148956\n看听\t148957\n重庆公交查询网\t148958\n祁连县人民政府\t148959\nxo型\t148960\n吹过\t148961\ngaodun\t148962\nManga\t148963\n氨基酸片\t1489
64\n断\t148965\n吉本多香美\t148966\n深圳河\t148967\n数模竞赛\t148968\n高低档\t148969\n碑谷\t148970\n夜审\t148971\n放箱\t148972\n肺\t148973\n波野结衣\t148974\n机器舞\t148975\nComparing\t148976\n南岳机场\t148977\nMixed\t148978\n对时\t148979\n兰博基尼\t148980\n镁\t148981\n校地\t148982\n银盛泰\t148983\ntoddler\t148984\n张蕴章\t148985\n应聘\t148986\n虾米音乐网\t148987\n微商圈\t148988\nworkshop\t148989\n末日方舟\t148990\n商桥\t148991\ndignity\t148992\niQOS\t148993\n硝酸盐氮\t148994\n花开花谢\t148995\n孔位\t148996\n尼古唐骏\t148997\n岳西县\t148998\n朝辞白帝彩云间\t148999\n22天\t149000\n龙骨牡蛎汤\t149001\n李秋平\t149002\n.dat\t149003\nlucy\t149004\n正态分布曲线图\t149005\n71路\t149006\n圣家堂\t149007\n天河区教育局\t149008\n脱口秀\t149009\n锦绣花园\t149010\n外框\t149011\n巴西站\t149012\n易烊大冰\t149013\ndy\t149014\n软件版\t149015\nopenvswitch\t149016\n1202\t149017\n7孔\t149018\n还原\t149019\n八个\t149020\n排队的人\t149021\n干化\t149022\n绥宁\t149023\n戴鹏民\t149024\n铬酸\t149025\nwondershare\t149026\n五笔打字\t149027\nlogins\t149028\n拔刀剑\t149029\nFLAME\t149030\n专业展\t149031\n低低\t149032\n仙身\t149033\n独步\t149034\n斩龙\t149035\n复旦大学基础医学院\t149036\nseneca\t149037\n注册单\t149038\n行贿犯罪档案查询申请书\t149039\n商务处\t149040\n脐针\t149041\n美浮特\t149042\n动画图\t149043\n兔肉\t149044\n肉末蒸蛋\t149045\nrootkit\t149046\ndblink\t149047\nMAJOR\t149048\n青岛\t149049\n西安便民网\t149050\nWords\t149051\n13篇\t149052\n李源潮\t149053\n馍片\t149054\n祁\t149055\n和集\t149056\n编组\t149057\n滚针轴承\t149058\n机械感\t149059\nChavinKing\t149060\n一检\t149061\nCompleted\t149062\n建德\t149063\n437\t149064\nHalo\t149065\n穿越之娱乐香江\t149066\nsql-CSDN\t149067\n绿地中央公园\t149068\n日夜撸\t149069\n木材撕碎机\t149070\n开黑\t149071\n泰豪\t149072\nireport\t149073\n华中科技大学物理学院\t149074\n本底\t149075\n水产网\t149076\nandroidmanifest\t149077\n参数值\t149078\n矾根\t149079\n1200g\t149080\nredim\t149081\ndbcp连接池\t149082\n45万元\t149083\n1.2.1.jar\t149084\n太多太多\t149085\n三桶油\t149086\n中国数字医疗网\t149087\n建行网银盾\t149088\n羊仔\t149089\n铌\t149090\n酸梅汤\t149091\nxzx\t149092\n大边\t149093\n齐鲁书社\t149094\ncicc\t149095\n龚正\t149096\n2015年以后\t149097\n菩提心\t149098\n高淇\t149099\nfinalcam\t149100\n金samuel\t149101\n多重共线性_\t149102\n小房子\t149103\n硕博\t149104\n2185\t149105\n500目\t14910
6\n马小虎\t149107\n闭营\t149108\n3.1.5\t149109\n土特产\t149110\n死库\t149111\n安化黑茶\t149112\n中惠\t149113\n服装批发市场\t149114\n强固\t149115\n山东大学生命科学学院\t149116\nBOD5\t149117\n张放\t149118\n王嘉尔\t149119\n覚醒\t149120\n外科正牙网\t149121\n齐鲁医药学院\t149122\nPassenger\t149123\n荣耀时刻\t149124\n天津天狮学院\t149125\n淮口镇\t149126\n碧云社区\t149127\n简·奥斯丁\t149128\n撕裂感\t149129\n蚶子\t149130\n混料\t149131\nillness\t149132\n超模25减肥操\t149133\nSito\t149134\n致命\t149135\n坚瑞沃\t149136\n旭升股份\t149137\n别语\t149138\n韦斯特\t149139\n大族粤铭激光\t149140\n语丝\t149141\n中国酒店集团\t149142\n大迁徙\t149143\n签证官\t149144\ncourier\t149145\n阀值\t149146\n天印大道\t149147\n中共十八届三中全会\t149148\nAgisoft\t149149\n安智\t149150\npur\t149151\n中国酒业协会\t149152\n栖霞\t149153\n伊莱恩\t149154\n中国工商银行软件开发中心\t149155\n外派\t149156\n曼巴狂蛇\t149157\n六只\t149158\n母盘\t149159\n落地窗\t149160\n双鉴探测器\t149161\n盛派\t149162\n市政府办公厅\t149163\n英雄级\t149164\nKind\t149165\n陕西国际商贸学院\t149166\n断崖式\t149167\n半导体业\t149168\n惠普\t149169\nifcfg\t149170\nyuyin\t149171\n网格板\t149172\n吉利丁\t149173\n飞将\t149174\n得利宝\t149175\n城市路网\t149176\n高子\t149177\n进出口权\t149178\n冰公主\t149179\n裘东耀\t149180\n陈之佛\t149181\n很实用\t149182\n春原\t149183\n木瓜树\t149184\n敏锐性\t149185\n李晓晖\t149186\n题文\t149187\n胶体\t149188\nextmail\t149189\n巴比伦河\t149190\n林奈\t149191\n山东省教育招生考试院\t149192\n捞\t149193\n访客管理系统\t149194\n2018-01-23\t149195\n润景园\t149196\n颞下颌关节炎\t149197\n软件源\t149198\nChemist\t149199\nAnaconda3\t149200\n干倘卖\t149201\nconducting\t149202\n唐山市监察委员会\t149203\n坦克世界闪电战\t149204\n出子\t149205\nvg\t149206\n第十九条\t149207\n欧派克\t149208\n猜谜语\t149209\n安全小区\t149210\n山东电建三公司\t149211\n中华女子学院\t149212\n缔合\t149213\n王志明\t149214\n超声波流量计\t149215\n1-2秒\t149216\n黑兽\t149217\nnaturally\t149218\nTm\t149219\n漆包\t149220\n莜\t149221\n全球定位系统\t149222\n苹果6s\t149223\n投生\t149224\n广西壮族自治区人民政府办公厅\t149225\n各显神通\t149226\n广西壮族自治区政府\t149227\nv4.4.0\t149228\n1080P/720P高清\t149229\n阿达\t149230\nCopying\t149231\n四十五岁\t149232\n紫金矿业\t149233\n重庆长江三峡\t149234\nDCGAN\t149235\nesg\t149236\n安桌\t149237\nDean\t149238\n清军\t149239\nremit\t149240\nedge\t149241\n迪士尼梦幻王国\t149242\n金海岸花园\t149243\n呪\t149244\n植物大战僵尸游戏\t149245\nPCOS\t1
49246\n网管版\t149247\nVISUAL\t149248\n情事\t149249\n华骏\t149250\n斜视\t149251\ninvited\t149252\nZBLOG\t149253\nenclosing\t149254\n鸳鸯谱\t149255\n骆达华\t149256\n颈椎间盘突出\t149257\n粘稠物\t149258\n济南职业学院\t149259\n感性认识\t149260\n无水醋酸钠\t149261\n格雷格\t149262\n静脉压\t149263\n芈月\t149264\n集艾\t149265\n导体棒\t149266\ncad2018\t149267\n元嘉\t149268\n约翰克里斯多夫\t149269\n勇立\t149270\n炮机\t149271\n沁人心脾\t149272\nAV小次郎\t149273\n临聘\t149274\n久长\t149275\neyan\t149276\n宝威\t149277\n截标\t149278\n做头\t149279\n江阴房产网\t149280\n雅女湖\t149281\n安城安娜\t149282\n玻尿酸隆鼻\t149283\n一条街\t149284\n航空城\t149285\n奋进\t149286\n2波\t149287\nAPG\t149288\npkcs\t149289\n水源地\t149290\n瓯海新闻网\t149291\nEMUI4.1\t149292\n恒神股份\t149293\n茂名北路\t149294\n花木街道\t149295\nNDA\t149296\nWorlds\t149297\n3G21J0Z\t149298\n192.168\t149299\n圆框眼镜\t149300\n62P\t149301\n免费算命网\t149302\n诊疗室\t149303\n9-11月\t149304\n来者不善\t149305\n部曲\t149306\nLinux论坛\t149307\n芯片业\t149308\nMonte\t149309\nSu\t149310\n几几年\t149311\n胡汉民\t149312\n不可变\t149313\n带状公园\t149314\n000000000\t149315\n栾川县\t149316\n77平米\t149317\n中国地震局工程力学研究所\t149318\n118w\t149319\nContinuous\t149320\n股肱\t149321\nradmin\t149322\n人民币天使轮\t149323\n2017.02.17\t149324\n储运\t149325\n毒雾\t149326\n昆塔斯\t149327\n偏离\t149328\ne1级\t149329\n李璐\t149330\n扬子人才网\t149331\n远特通信\t149332\n35000元\t149333\n戎\t149334\n治军\t149335\n载调压变压器\t149336\n北京京\t149337\n某些\t149338\n蜀山教育信息网\t149339\nAUB\t149340\n1183\t149341\n4790\t149342\n安环部\t149343\n异服\t149344\n喝一杯\t149345\n浙江省自然科学基金委员会\t149346\nflp\t149347\n长春市环境保护局\t149348\n卧轨\t149349\n招标信息预警\t149350\n墨寒砚\t149351\n好女\t149352\n斯巴鲁力狮\t149353\n尤宪超\t149354\nSd\t149355\n神经节苷脂钠\t149356\n伊伊\t149357\n恶魂\t149358\nV2.2.1\t149359\n藤椒鸡\t149360\n屏风马\t149361\n2.14\t149362\n食盆\t149363\n921\t149364\n纳闷儿\t149365\n哈利迪\t149366\njslint\t149367\nOption\t149368\n阿飞\t149369\n昆明站\t149370\nǒ\t149371\n课外学术\t149372\n土石\t149373\n罗某\t149374\n言情剧\t149375\ngrub4\t149376\nRubyGems\t149377\n魅力惠\t149378\n紫茎泽兰\t149379\n周年庆\t149380\n多长时间\t149381\nCOMODO\t149382\n16进制\t149383\n周迅\t149384\nbenny\t149385\n蝉蜕\t149386\n影音先锋影av色资源网\t149387\n边界线\t149388\n酒粕\t
149389\n惠东高铁站\t149390\n曲柄\t149391\n北京地铁6号线\t149392\n接社\t149393\n自身\t149394\n股汇\t149395\n通衢\t149396\n运筹学\t149397\n朝青\t149398\n伍咏薇\t149399\n斯可馨\t149400\n仙女杯\t149401\nh264\t149402\n退休养老保险金计算器\t149403\n德州扑克吧\t149404\n阳光海岸\t149405\n8kg\t149406\nE5450\t149407\n学通\t149408\n80天\t149409\n后堂\t149410\n魔羯\t149411\n太安镇\t149412\n中华人民共和国招标投标法\t149413\n精制币\t149414\n详询\t149415\n兵甲\t149416\n丁旭\t149417\n李海波\t149418\n庚午年\t149419\n抽屉柜\t149420\nWooden\t149421\n周宪\t149422\n森嶋\t149423\n刘锋\t149424\n招标与采购信息网\t149425\n美丽加芬\t149426\n天子山\t149427\n水云轩\t149428\n中骏四季家园\t149429\n良渚文化村\t149430\n3.55\t149431\n浪琴机械\t149432\nDecay\t149433\n冲洗液\t149434\nTL10\t149435\n厉声\t149436\n采油厂\t149437\n英雄战姬\t149438\nsin2x\t149439\n装甲恶鬼村正\t149440\n装调\t149441\n济南网\t149442\nkod\t149443\n黄梅天\t149444\n特玩游戏网\t149445\n大连万达集团\t149446\n同于\t149447\n金融系\t149448\n美女总裁的贴身高手\t149449\n自疗\t149450\n5201\t149451\nLaundry\t149452\n铁氧体磁芯\t149453\n丧家\t149454\n双层石膏板\t149455\n客乐芙\t149456\nFlume-ng\t149457\n张务锋\t149458\nTPEP\t149459\n顶子\t149460\n吉他弹唱教学\t149461\n阿黛拉\t149462\nWordPress企业主题网\t149463\n苏州东\t149464\n张三丰\t149465\n宝德\t149466\nProtel99se\t149467\n摄入\t149468\n尚恩\t149469\n炸雷\t149470\n诬蔑\t149471\n建设方\t149472\nCigars\t149473\n贫\t149474\n金拇指\t149475\n日久\t149476\nMH4G\t149477\n世界华语\t149478\nSyntaxError\t149479\n魂珠\t149480\n练完\t149481\n全面建成小康社会到全面建成社会主义\t149482\nhaishi\t149483\nCSI\t149484\n教学贴\t149485\n六五\t149486\n梦空间\t149487\n麦克风\t149488\n正方体\t149489\n汉成帝\t149490\n2015年元旦\t149491\nGB7718\t149492\nnhk大河剧\t149493\nhd4000\t149494\n妇女性\t149495\n中山大学附属第二医院\t149496\n终极使命\t149497\n三国群英传2\t149498\n最高奖\t149499\n临河街\t149500\n金凤丸\t149501\n匯\t149502\n气口\t149503\n快枪手\t149504\n贺语\t149505\nAndr\t149506\n石墨聚苯板\t149507\n不死帝天\t149508\n县供销社\t149509\n混凝土泵\t149510\n金华南\t149511\n越战\t149512\n14期\t149513\n深圳长城开发科技股份有限公司\t149514\n枣树\t149515\n中山大道\t149516\n柳州市人民政府\t149517\n版本包\t149518\n妄念\t149519\n血稠\t149520\n谷壳\t149521\n湖南省安全生产监督管理局\t149522\n信联\t149523\n300%\t149524\n微通道换热器\t149525\n氧气站\t149526\n首局\t149527\n贫困线\t149528\n张晓峒\t149529\n体验券\t149530\n性商网\t149531\n
黄河公路大桥\t149532\nAmen\t149533\n红衫军\t149534\n桃红四物汤\t149535\n福州大学厦门工艺美术学院\t149536\ndota2\t149537\n艾丽莎\t149538\n尿微量白蛋白\t149539\n巴达\t149540\n哮天犬\t149541\n黑龙江\t149542\n新工人\t149543\n阿克苏地区\t149544\n画派\t149545\n打场\t149546\nLion\t149547\n倾向值\t149548\n乌合\t149549\n江淮地区\t149550\n脱靶\t149551\n也门\t149552\nserialized\t149553\n怪箱\t149554\n柯南·道尔\t149555\n苏州市工商行政管理局\t149556\n亿度\t149557\n霍栀\t149558\nG90\t149559\n动态页\t149560\n粉机\t149561\n柱主\t149562\n急急\t149563\n群雕\t149564\n红岭\t149565\n新山兰\t149566\n中国电子信息产业发展研究院\t149567\ns_\t149568\n9.5.3.0\t149569\nskip\t149570\n突破者\t149571\n主机板\t149572\n中央政治局常务委员会\t149573\n真可怜\t149574\n水封罐\t149575\n八大山人\t149576\n李银河\t149577\n联想y400\t149578\nPending\t149579\n宝马女\t149580\n极品飞车5\t149581\n新手站_巴士英雄联盟\t149582\njinghua\t149583\n俯拍\t149584\nunionline\t149585\nzhuyi\t149586\ncfc\t149587\n1.1kw\t149588\nWELCOME\t149589\n风云际会\t149590\n草堂仙\t149591\nAwait\t149592\n例项\t149593\nollie\t149594\niozone\t149595\nHuntsman\t149596\nModa\t149597\npes2018\t149598\n稳居\t149599\n英菲克i9\t149600\n于朦胧\t149601\n名侦探柯南\t149602\n死亡骑士吧_\t149603\n国机汽车\t149604\n香槟广场\t149605\n里海\t149606\n霸屏\t149607\n道士\t149608\n以西\t149609\n数仓\t149610\n杨峥\t149611\n夜郎自大\t149612\n比亚超级神基因\t149613\nPARTY\t149614\n超级群英传\t149615\n浏阳市\t149616\ncl2017\t149617\n大江户温泉\t149618\n巴奴\t149619\ninnovations\t149620\n超敏反应\t149621\nssr\t149622\n瓶口\t149623\n涅灭\t149624\n风淋室\t149625\n发春\t149626\n晶科能源\t149627\n更合理\t149628\npr吧\t149629\n太平洋证券\t149630\n耐寒\t149631\n利略\t149632\n16.9\t149633\n朗格\t149634\n反哺\t149635\n闯关\t149636\n杯罩\t149637\n验孕\t149638\n2018年国庆\t149639\n中大网校\t149640\n虚空藏\t149641\n细菌性阴道炎\t149642\n遗孀\t149643\n和约\t149644\n碧思\t149645\n空文\t149646\nkeil5\t149647\n剑三七秀\t149648\nnmt\t149649\n20171209\t149650\n毕业论文.docx\t149651\n狄更斯\t149652\nDoNews\t149653\n李璟\t149654\n杭州湾世纪城\t149655\n中刀\t149656\n网赚活动网\t149657\nfargo\t149658\n厚燃\t149659\n胞体\t149660\n五象湖\t149661\n施秉县人民政府\t149662\n腾讯浏览\t149663\n透明质酸钠凝胶\t149664\n线路\t149665\n襄阳东站\t149666\n安乃近\t149667\n黑莓KEYone\t149668\nthomson\t149669\n其次\t149670\n58页\t149671\nccb\t149672\n让一切随风\t149
673\n李宽端\t149674\nK22\t149675\n蓑\t149676\n张裕葡萄酒\t149677\nmab\t149678\nwrap\t149679\n赵构\t149680\n城商行\t149681\n釉中彩\t149682\nv3菱悦\t149683\n跑图\t149684\n位码\t149685\nSeaJson\t149686\n耳仓\t149687\n协奏曲\t149688\n企业股权出质登记\t149689\n政事堂\t149690\n安博盒子\t149691\nRecommend\t149692\n康王\t149693\n高空车\t149694\n标准车\t149695\n盆骨\t149696\n涉藏\t149697\n第16\t149698\n校园修神录\t149699\n8艘\t149700\n菌膜\t149701\n无水硫酸镁\t149702\n流化\t149703\narmour\t149704\n6U\t149705\n治具\t149706\nHACH\t149707\n777\t149708\n人和街\t149709\n现视\t149710\n切糕\t149711\n太岁符\t149712\nHi3516A\t149713\n第三十集\t149714\n六师\t149715\n克莱修仙高手混花都\t149716\n附分解_广场舞\t149717\n极限竞速地平线3吧\t149718\n热兵器\t149719\n视乎\t149720\n噪波\t149721\n2G网络\t149722\n博奥晶典\t149723\nOwnership\t149724\nwaveform\t149725\n虎牙直播银豆\t149726\n刘前程\t149727\n亡语\t149728\n龙腾世纪:起源\t149729\n诺基亚7Plus\t149730\n7294\t149731\n滚蛋吧肿瘤君\t149732\nGraphics\t149733\n蚌山区\t149734\n数组元素\t149735\n国元证券\t149736\n首都大学\t149737\n妄议\t149738\n神武\t149739\n2.5.6\t149740\n春字\t149741\n筒型\t149742\n从一家公司\t149743\nGameCenter\t149744\n泥丸\t149745\n80集\t149746\n贷款率\t149747\n水浸\t149748\n奇台县人民政府\t149749\n河南省环境保护厅\t149750\n游戏玩家版\t149751\nPreparing\t149752\n石条\t149753\n对数线性模型\t149754\n聚富\t149755\n长征路\t149756\n枫香树\t149757\nGTA5-Mods.com\t149758\n吕布凡人修仙传\t149759\n德育课\t149760\nnace\t149761\n苏夕\t149762\n1107\t149763\nqq2016\t149764\ninsulation\t149765\n下凡\t149766\n一条道\t149767\n梦三国\t149768\n眼窝\t149769\n孙兵\t149770\n不得了\t149771\n北京医院\t149772\nLaughter\t149773\n南京民办小学\t149774\n矽肺病\t149775\n罗默\t149776\n龙气\t149777\nBandai\t149778\n卡塔尼亚\t149779\n军曹\t149780\n巫神\t149781\n转口\t149782\n霄\t149783\nOrigin2015\t149784\n钻孔\t149785\n社群营销\t149786\n英英\t149787\n陌路人\t149788\n大伙\t149789\n乌纱\t149790\n铅球\t149791\nExcel2013\t149792\npkt\t149793\n亲民党\t149794\n看雨\t149795\npredictive\t149796\n肌痛\t149797\n肃清\t149798\n浏阳网\t149799\ntetra\t149800\n18_\t149801\nTOP15\t149802\n黑白锐雯\t149803\n奥新\t149804\n中科创达\t149805\n减小\t149806\n芜湖网\t149807\n金盈\t149808\nU2414\t149809\n奥悦\t149810\nMorph\t149811\n烤肠机\t149812\n皇家贝蒂斯\t149813\n额尔敦\t149814\n蓝鸟\t149815\n一帧\t149816\n
王宝钏\t149817\n夏季\t149818\n安仁县\t149819\n管业\t149820\napac\t149821\n五年级下册\t149822\n绿色营销\t149823\n鲁西黄牛\t149824\n鞘翅目\t149825\n大水\t149826\n知识堂\t149827\n诺亚集团_诺亚金融国际_资产管理_集团\t149828\nX片\t149829\n墨武\t149830\n捧哏\t149831\n2016-04\t149832\n武清站\t149833\n时叶\t149834\n川沙\t149835\n事业处\t149836\n热释电红外传感器\t149837\n百度图片搜索\t149838\nsobel算子\t149839\nen150\t149840\n漆面\t149841\n第SC03版\t149842\npatapon\t149843\n测角仪\t149844\n2016\t149845\n30包\t149846\n有机产品认证\t149847\n石毅\t149848\nc#子\t149849\n西工\t149850\n有特色\t149851\n[HP\t149852\n短毛猫\t149853\n添\t149854\n法老\t149855\nsprig\t149856\n2.8d\t149857\nFOODAILY\t149858\nzotero\t149859\nnicht\t149860\n丽岙\t149861\n柯震东\t149862\nbutten\t149863\n上海邦德职业技术学院\t149864\nHUI320\t149865\n52shici.com\t149866\n秦二世\t149867\nDB37\t149868\nPAY\t149869\n防反\t149870\n背道而驰\t149871\n几百个\t149872\n静载\t149873\n阿猫阿狗\t149874\n通勤包\t149875\nAlert\t149876\nlength函数\t149877\n班课\t149878\nxiumi\t149879\n省油\t149880\n用友T+\t149881\n中国教育科学研究院\t149882\n小步舞曲\t149883\n咯噔\t149884\n中国四大汽车集团\t149885\n1760\t149886\n塔洛\t149887\n豆鼓\t149888\n徐胖子\t149889\n沈洁\t149890\n安在\t149891\n华东师范大学第二附属中学\t149892\n天凤\t149893\n滥交\t149894\nstands\t149895\n无码帝国WWW\t149896\n斯沃琪\t149897\n髪\t149898\n金灿荣\t149899\n包动\t149900\n铡草机\t149901\nchristina\t149902\n旋启式止回阀\t149903\n穿楼\t149904\n民宗\t149905\n乐普\t149906\n中铁十四局\t149907\n涉水险\t149908\n少数服从\t149909\n欺行霸市\t149910\nAutocad2012\t149911\n田地\t149912\n750V\t149913\nGenuine\t149914\n出洋\t149915\n400台\t149916\n合景泰富地产\t149917\n刘铮\t149918\nBSS段\t149919\nai素材\t149920\n血胎\t149921\n开瑞国际物流(山东)股份有限公司\t149922\nhiden\t149923\n蓝庭\t149924\n刘冬花\t149925\n2010年7月\t149926\n争着\t149927\n可谈\t149928\n奥运场馆\t149929\n丰台火车站\t149930\n2目\t149931\n三国类\t149932\n美高美\t149933\ntuo\t149934\n33天\t149935\nDARLING\t149936\n阳东区\t149937\n黄龙体育馆\t149938\n这个星期\t149939\nlilun\t149940\n漫步\t149941\n双卡版\t149942\n领头羊\t149943\nptzf\t149944\n套色\t149945\n6840\t149946\n在线学习网\t149947\n胡红\t149948\n刷剧\t149949\n六百年\t149950\n32s\t149951\n28800\t149952\n改制\t149953\n双元路\t149954\n抗寒\t149955\ntwaver\t149956\n申美\t149957\n融信\t149958\n没几天\
t149959\n西安电子科大\t149960\ntonymoly\t149961\n彩打\t149962\n心情日记\t149963\n雷锋网\t149964\n阳物\t149965\n扦插生根\t149966\n锰板\t149967\n皮皮时光机\t149968\n慧眼\t149969\n亚洲国际博览馆\t149970\n徐子菲\t149971\n新大洲电动车\t149972\n万达公馆\t149973\n竹架\t149974\n光速\t149975\n酒钱\t149976\n陈茶\t149977\n口音\t149978\nspring4\t149979\nansible\t149980\n那只手\t149981\nhongma\t149982\n熔断器\t149983\n箱庭都市\t149984\n中华人民共和国执业医师法\t149985\nmin-width\t149986\n广濑\t149987\n经略\t149988\n新庄\t149989\n琼浆\t149990\n酒仙网\t149991\n列出现\t149992\n以牙还牙\t149993\n四物汤\t149994\n广安门中医院\t149995\n合作\t149996\n手机腾讯网\t149997\n死了都要爱\t149998\n卡拉\t149999\n十宗罪\t150000\n白痕\t150001\nJspStudy\t150002\nmt7688\t150003\n爸爸去哪儿\t150004\n叔叔\t150005\n天山世界之门\t150006\n地狱火堡垒\t150007\ngnueabi-gcc\t150008\n北京胡同\t150009\n米家激光投影电视\t150010\n电玩巴士\t150011\n七日游\t150012\n碧桂园公园壹号\t150013\n比亚迪宋max\t150014\nR61\t150015\n想当初\t150016\n拿捏\t150017\n菲诺\t150018\n河西区幼儿园\t150019\n图拉斯\t150020\n张家港保税区\t150021\n古兰精\t150022\n上海华谊\t150023\n研究生部\t150024\nNook\t150025\n知更鸟女孩\t150026\n800多个\t150027\n唐卡\t150028\n张振新\t150029\n青苹果影院\t150030\n延安新区\t150031\nword2007文\t150032\n乾进\t150033\n老版\t150034\n六周年\t150035\n空气能地暖\t150036\nRichie\t150037\n武侠q传\t150038\nHELL\t150039\n东方CardTD\t150040\n九图\t150041\nPassMark\t150042\nCamellia\t150043\nfantasia\t150044\nshkd\t150045\n135平米\t150046\n无限浮士德\t150047\n多集\t150048\n举重若轻\t150049\nngix\t150050\n王牌凤\t150051\n有片\t150052\n挂账\t150053\n文化传播有限公司\t150054\nRIM\t150055\n徐文\t150056\n玉兰湾\t150057\n送君\t150058\n上海彩虹室内合唱团\t150059\n手机3158招商加盟网\t150060\n健生\t150061\n韩舞\t150062\nquartz\t150063\nX21\t150064\n中国三峡新能源有限公司\t150065\n六盒\t150066\n大溪沟\t150067\n0.9.4\t150068\n折翼的天使\t150069\nPrep\t150070\n03式\t150071\n长骨\t150072\n马来酰亚胺\t150073\nsurprised\t150074\n残奥会\t150075\n山西交通职业技术学院\t150076\n2018.3\t150077\n卡色\t150078\n邮政储蓄卡\t150079\n50升\t150080\n戮仙\t150081\n百度网盟\t150082\n特种装备网\t150083\n优码\t150084\nx201i\t150085\n人工工日\t150086\n一元二次方程根\t150087\nwww.ygdy8.com\t150088\nblows\t150089\n加湿\t150090\n红灯笼\t150091\nLinkButton\t150092\nepud\t150093\n擦擦\t150094\nxcworkspace\t150095\n自筹资金\t150096\n广西理工
职业技术学校\t150097\n艾菲尔\t150098\n经济危机\t150099\n落笔\t150100\n汤圆\t150101\nflasher\t150102\nタ\t150103\nSABA\t150104\nPunishment\t150105\n日本三菱\t150106\n阿尔兹海默症\t150107\n151022\t150108\n大成拳\t150109\n内税\t150110\n爱屋吉屋\t150111\n翟永超\t150112\nerge\t150113\n七只\t150114\n安徽省人大\t150115\nUSB2.0\t150116\nMart\t150117\n拂衣\t150118\nAirbnb爱彼\t150119\n心河\t150120\n奇异值\t150121\n胶花\t150122\n6.6%\t150123\n上海申通\t150124\nastute\t150125\n公众\t150126\nvbh\t150127\n太阳园\t150128\n几相\t150129\n阅读馆\t150130\n24l01\t150131\n周英\t150132\n雷神网游加速器\t150133\n大脚板\t150134\n茱萸湾\t150135\n剑锋\t150136\n玫瑰果\t150137\nruo\t150138\n露兰姬娜\t150139\n核战\t150140\nMP5\t150141\n朋\t150142\n北京路步行街\t150143\n封魔谷\t150144\nservlets\t150145\n大连友谊\t150146\n伺服电动机\t150147\n广西教育学院\t150148\nTITANIUM\t150149\n布里斯托大学\t150150\nEfficacy\t150151\n酸溜溜\t150152\nKaori\t150153\n为国\t150154\n成渝立交\t150155\n37kf\t150156\n在线式\t150157\n雷霆普拉多\t150158\n海马丘比特\t150159\n7行\t150160\n交相辉映\t150161\n萨尔曼汗\t150162\n移入\t150163\n2011级\t150164\n125%\t150165\n上海市农业委员会\t150166\n第一课\t150167\nChigga\t150168\nZHLYW\t150169\nregulatory\t150170\n夕日\t150171\n长沙市中级人民法院\t150172\n道战\t150173\n聂云\t150174\n冒险岛冰雷\t150175\n贾科梅蒂\t150176\n岑巩\t150177\n马迭尔\t150178\n超强台风\t150179\nINFINITE\t150180\n批假\t150181\n成渝经济区\t150182\n一两声\t150183\n合成版\t150184\n麂皮绒\t150185\n南通人才网\t150186\n二手房网_365二手房网\t150187\nQUEST\t150188\n师哥\t150189\nchinois\t150190\n光器\t150191\n女儿墙\t150192\n中信地产\t150193\n保千里\t150194\n我的召唤圣剑\t150195\n私募投资基金管理人\t150196\ntextBox\t150197\n云知\t150198\nCéline\t150199\n牲畜\t150200\n天启\t150201\n洗脑式\t150202\nmatable\t150203\n基列莱特\t150204\n45000公斤\t150205\n虚短\t150206\n狼魂\t150207\n群邑\t150208\nfx-82es\t150209\n星辰奇缘\t150210\nbtsearchs\t150211\n禁毒法\t150212\n暖气\t150213\nfluxion\t150214\n永世\t150215\n哲学类\t150216\n夜夜夜\t150217\n马边新闻网\t150218\ntestmart\t150219\nCLSID\t150220\n很高\t150221\n87个\t150222\n纯玩团\t150223\n4月10日后\t150224\n香菇粉\t150225\n农林路\t150226\nSEO追词网\t150227\nchance\t150228\n4007\t150229\n拱\t150230\n燒\t150231\n浦坝港镇\t150232\n股评\t150233\n拘\t150234\n徐倩\t150235\n腮部\t150236\n心小\t150237\n光电工程学院\t1502
38\nnewblue\t150239\n曾光\t150240\n陇右\t150241\n元世祖\t150242\n一米多\t150243\n林洋\t150244\n第三卷\t150245\n作作\t150246\n兴汉新区\t150247\n铁笼\t150248\n唬人\t150249\n严加\t150250\n重庆奥数网\t150251\n32天\t150252\n48G\t150253\n英国剑桥大学\t150254\n研讨会议\t150255\n影音先锋资源-影音先锋\t150256\n变电室\t150257\n骗色\t150258\nx64.msi\t150259\n下键\t150260\nav网\t150261\n美cry\t150262\n东风雪铁龙C5\t150263\nLJ2400\t150264\n15|\t150265\n0.6\t150266\n客舱\t150267\n烟台山医院\t150268\n兴国\t150269\n随声\t150270\n逗逼\t150271\n皮肤瘙痒症\t150272\n失乐园\t150273\n传唤\t150274\n广发行\t150275\n夜长梦多\t150276\n6.0.6\t150277\n千千静听\t150278\n张晶\t150279\n官方包\t150280\n广州银联网络支付有限公司\t150281\n题目集\t150282\n专业\t150283\nXY助手\t150284\n博易\t150285\n液压式\t150286\n济南西\t150287\n西海大峡谷\t150288\n滚边\t150289\n编造\t150290\n日番\t150291\n轮转机\t150292\n川国\t150293\n巡游\t150294\n副框\t150295\n天猫店铺\t150296\n赵翔\t150297\n中望cad2017\t150298\nAldrich\t150299\nqt4\t150300\n兵临\t150301\n言行一致\t150302\n拖地机器人\t150303\n秸秆打捆机\t150304\ndocm\t150305\n心得安\t150306\n患难见\t150307\nDRAGON\t150308\n纸巾\t150309\nRoller\t150310\n别动\t150311\n人力资源\t150312\n弹簧垫\t150313\n章贡区\t150314\nFlair\t150315\n庆六一\t150316\n顺网游戏\t150317\n内维尔\t150318\n754\t150319\n从良\t150320\n陪游\t150321\n君商学院\t150322\ndev/sda1\t150323\n耻辱之日\t150324\nG点\t150325\n灯槽\t150326\n油渣\t150327\n元好\t150328\n翡翠湖\t150329\nevi\t150330\nroute-map\t150331\n太原站\t150332\n价字\t150333\n效率低下\t150334\n刻意\t150335\n中信网爱鸽商城\t150336\n红狮\t150337\n球磨\t150338\n高加索\t150339\n29套\t150340\nOGS\t150341\n河口县\t150342\n枭臣\t150343\namino\t150344\n程序群\t150345\nC15\t150346\n阿斯兰\t150347\n霜狼\t150348\n摩尔网\t150349\n往期\t150350\n江苏省常熟中学\t150351\n7018\t150352\ncopying\t150353\n好片\t150354\n武汉联通\t150355\n骑友\t150356\n田埂\t150357\n妲己\t150358\n吉林大学\t150359\n仅存\t150360\n资质代办\t150361\nrenqi\t150362\n二爷\t150363\n银联在线支付\t150364\nTRANSFER\t150365\n佛山市经济和信息化局\t150366\n月光宝盒\t150367\n几十张\t150368\n打屁屁\t150369\n毽子\t150370\n剧场版\t150371\n海洋水族馆\t150372\nwin7-64\t150373\n客户端\t150374\n成绩斐然\t150375\nWPT\t150376\nachievement\t150377\nqsv\t150378\nRainyn\t150379\ngdi\t150380\n糖醋\t150381\n衰亡史\t150382\n一样\t150383\n疲乏\t150384\n丁建
国\t150385\n烤箱版\t150386\nlily\t150387\njob\t150388\n小眼\t150389\n著作权登记\t150390\n上海中心\t150391\n下一天\t150392\n李佳芯\t150393\n明光先锋网\t150394\n八幡\t150395\nqq糖\t150396\n多好啊\t150397\n火锅英雄\t150398\n叶柏寿\t150399\n丁蜀\t150400\nlotusscript\t150401\n千万果粉大本营_威锋网\t150402\nMarkLines\t150403\n眉州\t150404\n百运网\t150405\n小麦胚芽\t150406\n云东海街道\t150407\n颜良\t150408\n拳皇wing1.91\t150409\n当代艺术馆\t150410\n中江\t150411\n安尔达\t150412\n华林酸碱平\t150413\nwe7\t150414\n广州市人大常委会\t150415\n艾德思奇\t150416\nvpngate\t150417\n离奇\t150418\n3315\t150419\n客户名单\t150420\n追根\t150421\n蜈支洲岛\t150422\nBBE\t150423\n冰凝\t150424\n分泌\t150425\niomanip\t150426\n棉线\t150427\n索纳塔8\t150428\n哈弗h1\t150429\n5158\t150430\n电子产品结构\t150431\n乙醇酸\t150432\n安徽医院\t150433\n天狱篇\t150434\nl350\t150435\n乖孩子\t150436\n小记\t150437\n浙江大学信息与电子工程学系\t150438\n盲袋\t150439\n校花与野\t150440\n卷席\t150441\nPackage\t150442\n精疲力竭\t150443\n全国工人先锋号\t150444\nDearest\t150445\n姓名\t150446\n油气分离器\t150447\n异世帝国霸主\t150448\n天天斗地主\t150449\n轨距\t150450\nSCALA\t150451\nC轮\t150452\nAbsolute\t150453\n汕头港\t150454\n镣铐\t150455\n社区服务中心\t150456\n舟山市普陀区政府\t150457\nrachel\t150458\n抗议者\t150459\n窝心\t150460\ntf家族\t150461\n溪口村\t150462\n立川理惠\t150463\n断点\t150464\n穷光蛋\t150465\n萧景琰\t150466\n霓凰\t150467\n宁波公司\t150468\n载机\t150469\n龙壁\t150470\n顺源\t150471\n党政机关公文格式\t150472\n盛大\t150473\n中华人民共和国海洋环境保护法\t150474\nJPY\t150475\n围\t150476\nCoupe\t150477\nquinn\t150478\n神经侠侣\t150479\n协和国际部\t150480\n孫\t150481\n圣晶\t150482\n舞歌\t150483\n自然志\t150484\n106路\t150485\n月华\t150486\n三复\t150487\nDatebox\t150488\nSword\t150489\n笑\t150490\n郑明\t150491\nDetect\t150492\nxfplayer\t150493\n心计\t150494\n高粱酒\t150495\n重焕\t150496\n婧氏\t150497\n康泰\t150498\nfulfillment\t150499\n94677奇闻网\t150500\n吉本芭娜娜\t150501\n颁布\t150502\nPRESS\t150503\n名典\t150504\n相顾\t150505\nsilverlight\t150506\n攻战\t150507\n心梦\t150508\n绝影\t150509\n房车露营地\t150510\n种植体\t150511\n递进\t150512\n盐市口\t150513\n陈永生\t150514\nbillyz\t150515\n巴特\t150516\n银河铁道之夜\t150517\n拉玛泽\t150518\n穴居\t150519\n无阻\t150520\n麻姑山\t150521\n履带\t150522\njain\t150523\nAdministrators\t150524\n本息\t150525\nAndersen\t150526\n
馊\t150527\n哈尼族\t150528\n华发商\t150529\n大一\t150530\n香艳\t150531\n发泄\t150532\n3屏\t150533\n麦序\t150534\n镇南关\t150535\n淫謀\t150536\n林艳\t150537\n补短\t150538\ntpms\t150539\n短信轰炸机\t150540\n李希光\t150541\nshop\t150542\n0202\t150543\n拼车群\t150544\n新势\t150545\n庭审直播网\t150546\n赵剑华\t150547\nHarder\t150548\n升糖指数\t150549\nEv\t150550\nps理论\t150551\n姝\t150552\n发肿\t150553\n浮萍\t150554\nnullptr\t150555\n羊口\t150556\n森博\t150557\n神拳\t150558\n1386\t150559\n3000万美元\t150560\n华族\t150561\n绝地求生刺激战\t150562\ndotnetcore\t150563\n纸桶\t150564\n非mbr\t150565\n千首\t150566\n拆标\t150567\n孙宏\t150568\n上海涵飞医疗器械有限公司\t150569\n20170925\t150570\n三国恋战记\t150571\n人猿泰山h版\t150572\n粉粉\t150573\n6gb\t150574\n254个\t150575\n去伪存真\t150576\n天空龙\t150577\n阿莱格里\t150578\n桑木\t150579\njinshi\t150580\nTsum\t150581\n娟\t150582\nSRGB\t150583\n亚特\t150584\n吉利集团\t150585\nexue\t150586\n海砂\t150587\n潇湘女性网\t150588\n火麻茶\t150589\n澧县人民政府\t150590\n黛绮丝\t150591\n可替代性\t150592\n一青窈\t150593\n广东地税\t150594\nMSTSC\t150595\nscreencap\t150596\n鹳雀楼\t150597\n压抑\t150598\n壹零\t150599\n水龙敬乐园\t150600\n层次关系\t150601\n熹妃\t150602\n欠税\t150603\nIdea\t150604\n萨日朗\t150605\n爱在路上\t150606\n安福寺\t150607\nsack\t150608\nN3DS\t150609\n深棕\t150610\n四六\t150611\n热血单机版\t150612\n初愈\t150613\n金字塔原理\t150614\n特立尼达\t150615\n47\t150616\n大唐发电\t150617\n魔法棒\t150618\n姬胧月\t150619\n200厚\t150620\n陆离\t150621\n灵隐岛\t150622\n星野竜一\t150623\n中山法院\t150624\n自治区党委\t150625\n蒋敏\t150626\n韦天\t150627\nquestasim\t150628\n潮款\t150629\n福建省科技厅\t150630\n光大集团\t150631\n81名\t150632\n喷水管\t150633\n季裕棠\t150634\n打印纸\t150635\n孔祥\t150636\n加利福利亚\t150637\n残局\t150638\n九茗茶品牌网\t150639\nRealFlow\t150640\n桑旅\t150641\nmangle\t150642\nentitlement\t150643\n佛教音乐\t150644\n借问\t150645\nArticles\t150646\nRe:从零开始的异世界生活\t150647\n2018众泰T700\t150648\n神州十一号\t150649\nReturn\t150650\n白虹\t150651\n远行\t150652\n孙公司\t150653\n条幅机\t150654\n陈希章\t150655\n房价\t150656\n叫好不叫座\t150657\n各指\t150658\n津巴多普通心理学\t150659\n统称\t150660\n堪萨斯城\t150661\nconsiderable\t150662\n成分钟\t150663\n大地震\t150664\n萨义德\t150665\nnin\t150666\n80A\t150667\n今宵\t150668\n达尔优键鼠\t150669\n香椎\t150670\n2016年4月1
9日\t150671\n6379\t150672\n城堡\t150673\n开平区\t150674\n95式\t150675\n陈磊\t150676\n南部中学\t150677\n认错\t150678\n9座\t150679\n駆\t150680\n睡垫\t150681\n梦幻西游口袋版\t150682\n罗PD\t150683\n昆政办\t150684\n何在\t150685\n峡山\t150686\n600150\t150687\n枪管\t150688\n教育部留学服务中心\t150689\n玻璃墙\t150690\nz7mini\t150691\n加农\t150692\n最大的麦穗\t150693\n朱玲玲\t150694\n树龄\t150695\n建设工程有限公司\t150696\n白头翁\t150697\ncsco\t150698\n淡竹\t150699\n4.07\t150700\n香菇菌棒\t150701\n港漫\t150702\n大兔\t150703\n短暂性\t150704\n共济会\t150705\n阳光城文澜府\t150706\n博学谷\t150707\n华容县人民政府\t150708\n首批10家\t150709\n灯芯绒\t150710\n奇点\t150711\n确定\t150712\n配种\t150713\n胶机\t150714\n母亲节\t150715\n心源性\t150716\n2017小姐威客\t150717\n9.2.0\t150718\nNetEase\t150719\n10【\t150720\n坦赞铁路\t150721\n悦楽\t150722\n带权\t150723\n西班牙语专业\t150724\n马克吐温\t150725\n拍马\t150726\nflux\t150727\n回原形\t150728\ncosmetics\t150729\n20cm\t150730\n陈大鹏\t150731\noppor9plus\t150732\n蓝舞者\t150733\n光伏板\t150734\n窗\t150735\nFinest\t150736\nuio\t150737\n十八式\t150738\n网易健康\t150739\n竖排版\t150740\n声谱\t150741\n特别的爱给特别的你\t150742\n麦芽贷\t150743\n就是为了\t150744\n吉达\t150745\njboss\t150746\n超级机器人大战OGS\t150747\nlatimes\t150748\n15家\t150749\n王天宝\t150750\n横槊\t150751\n火腿\t150752\n吴堡县\t150753\n好房网\t150754\n安全系数\t150755\n一声声\t150756\n肉菜\t150757\n演出服\t150758\n重医附一院\t150759\n13J811\t150760\n速管\t150761\n3DM论坛\t150762\nRupert\t150763\ncovariance\t150764\nMagnificent\t150765\nstea\t150766\n启初\t150767\n镇定剂\t150768\n注意\t150769\n乳房纤维瘤\t150770\nAlchemist\t150771\n脑液\t150772\n成都城建投资管理集团有限责任公司\t150773\nhash_hmac\t150774\n道袍\t150775\n台前县\t150776\n昏暗\t150777\n林永健\t150778\n晋中学院\t150779\n未名湖\t150780\n海马体照相馆\t150781\n118平米\t150782\n金翅\t150783\n一按\t150784\ncanape\t150785\n头纱\t150786\n三国志13\t150787\n六篇\t150788\ndau\t150789\n4.9亿\t150790\nClicker\t150791\n新消息\t150792\nt+2\t150793\n流浪人\t150794\n电容式传感器\t150795\n彩友\t150796\n克林霉素甲硝唑搽剂\t150797\nNear\t150798\n永光\t150799\n简拼\t150800\n袜机\t150801\n14们\t150802\n首章\t150803\n三国志5\t150804\n滨湖新城\t150805\n一小时前\t150806\n七卷\t150807\n金投财经频道\t150808\n学部\t150809\nMrMac\t150810\n镭射机\t150811\n胎盘早剥\t150812\n瑞穗\t150813\ninferno\
t150814\n林静晓\t150815\n高新技术企业认定管理工作指引\t150816\n白尾山\t150817\n模拟电子技术基础\t150818\n牌价\t150819\n胡志强\t150820\nSoap\t150821\n表情包\t150822\n紫河\t150823\n法学方法论\t150824\n在江边\t150825\n_帮助中心_众易贷\t150826\nENG\t150827\nkb4088776\t150828\n依维莫司\t150829\nhusband\t150830\n宅子\t150831\njijin\t150832\noris\t150833\n一介撸夫\t150834\n各支\t150835\nString型\t150836\ntabla\t150837\nfanxing\t150838\n危境\t150839\n6K\t150840\n张齐\t150841\n凯\t150842\n伤逝的安详\t150843\n成都市兴蓉集团有限公司\t150844\n武林广场\t150845\n台群\t150846\n羊肚菌\t150847\n寒战\t150848\n本服\t150849\n二等座\t150850\n四_c1\t150851\n路牙石\t150852\nroadmap\t150853\n明光路\t150854\n世界自然基金会\t150855\npluraleyes\t150856\n沈阳陆军总院\t150857\n颈动脉狭窄\t150858\n有人\t150859\nMaker8.Com\t150860\n单机三国志\t150861\n入场券\t150862\n环保达人#\t150863\n诺基亚5230\t150864\nstata\t150865\nEMI\t150866\n远离\t150867\n电板\t150868\n盐湖区\t150869\n拉线\t150870\nUSART1\t150871\n伊莲\t150872\n钱芳\t150873\nX型腿\t150874\n奥迪a3\t150875\n一役\t150876\n蘑菇\t150877\n病字\t150878\n石斛兰\t150879\n顾明\t150880\n多来\t150881\n叶岚\t150882\n几盘\t150883\n环市东路\t150884\n孟醒\t150885\n方型\t150886\n远方的家\t150887\n开送\t150888\n去医院\t150889\nGraylog\t150890\nphlsheji\t150891\n1回\t150892\n化肥\t150893\n口服化疗\t150894\n新招\t150895\nVensim\t150896\n剖宫产率\t150897\n犯冲\t150898\n邀标\t150899\n邓小平时代\t150900\n水人\t150901\n爱视\t150902\n十次\t150903\n小版张\t150904\nVilla\t150905\n汇展\t150906\ndante\t150907\n包头\t150908\n主攻手\t150909\n翻车机\t150910\n金立m2017\t150911\n安农大\t150912\n是真是\t150913\nhamburg\t150914\n内蒙古自治区人民医院\t150915\n色膏\t150916\nkeypress\t150917\n复仇者联盟3\t150918\nWeifang\t150919\n宝石花\t150920\nBless\t150921\n墙纸\t150922\nPjax\t150923\n卡尔·萨根\t150924\nsetheader\t150925\n凤凰山街道\t150926\nsurvivor\t150927\nTiming\t150928\nReflex\t150929\n_琪琪女性网\t150930\n鑫苑国际城市花园\t150931\ninterior\t150932\n海贤\t150933\n硅酸铝针刺毯\t150934\n马勒\t150935\nlazily\t150936\nㄓ\t150937\n微信公众号营销\t150938\n106年\t150939\n常州恐龙园\t150940\n慢性咽喉炎\t150941\n中国山东网\t150942\n二清\t150943\ng13\t150944\n弘康健康\t150945\nbloc\t150946\ns10\t150947\n非线性回归\t150948\n凌天战尊\t150949\n连接号\t150950\n这样子\t150951\n眯眯眼\t150952\n青原区\t150953\n萧平\t150954\n10月4日\
t150955\n阿鲁迪巴\t150956\n聊斋奇女子\t150957\n酷狗缓存\t150958\n丁香叶\t150959\n软通\t150960\n艰难困苦\t150961\n守护\t150962\n刘虞\t150963\n大理机场\t150964\nmickey\t150965\n雑誌\t150966\n德语系\t150967\n光污染\t150968\n密封膏\t150969\nA.1\t150970\n单仁\t150971\n建法\t150972\n自然对数\t150973\nThrottleStop\t150974\n注册资\t150975\n华川\t150976\n行影\t150977\n刘爱华\t150978\n徐国庆\t150979\n大S\t150980\ndecltype\t150981\n小戏骨红楼梦\t150982\n志望\t150983\n1787\t150984\n怀尔德\t150985\n张洪杰\t150986\n中国动漫\t150987\n猫空\t150988\n合亚眼镜网\t150989\n稀浆封层\t150990\n记仇\t150991\n一上午\t150992\n采撷\t150993\n惨叫鸡\t150994\nIFN\t150995\n桂建标\t150996\n大海龟\t150997\n牛友\t150998\n黑猫\t150999\n张永红\t151000\nissey\t151001\n倪瓒\t151002\nStory\t151003\n70套\t151004\n华数传媒\t151005\nsqlservice\t151006\n96折\t151007\n借记卡卡号\t151008\nav女忧\t151009\n丰田雷凌\t151010\nsinh\t151011\n龙宝宝\t151012\n138元\t151013\n卫生信息网\t151014\n三阳\t151015\n李传\t151016\n3252\t151017\n2017年10月13日\t151018\n国际法学\t151019\n安徽省第二人民医院\t151020\n11i\t151021\n王心凌\t151022\nsell\t151023\n老太爷\t151024\n货币金融学\t151025\n吓人\t151026\n眼疼\t151027\n赵红\t151028\n省运管局\t151029\n阴囊湿疹\t151030\n曲丹\t151031\n18枚\t151032\nspen\t151033\n上海市长宁区\t151034\n10^2\t151035\n文投控股\t151036\n蝙蝠侠:阿甘起源\t151037\n洛佩兹\t151038\n易奇八字星座\t151039\n伊吹\t151040\n毕业寄语\t151041\n安区\t151042\n提重\t151043\n流通性\t151044\n赌运\t151045\n8.net\t151046\n闷骚男\t151047\nSSNI\t151048\nchecking\t151049\n二乔\t151050\n肠溃疡\t151051\n大风起兮云飞扬\t151052\n非线性方程组\t151053\n一鹤\t151054\n清洁乡村\t151055\nIIS管理器\t151056\n酷鱼\t151057\n谷雨后\t151058\n136路\t151059\n紫币\t151060\n杨桥\t151061\n托梦\t151062\n002024\t151063\n9号健康网\t151064\n三元材料\t151065\n7620a\t151066\n仙帝\t151067\n一君\t151068\nMaverick\t151069\n王府井小吃街\t151070\n点着\t151071\n食人树\t151072\n石湾窑\t151073\n相火\t151074\n23章\t151075\n哥哥太爱我了怎么办\t151076\n游轮\t151077\n中核科技\t151078\n中国军工厂\t151079\n核显能\t151080\nB1\t151081\nC6100\t151082\n重新排序\t151083\n飞度\t151084\n苹果肌\t151085\nggt\t151086\n脑出血后遗症\t151087\nPEC\t151088\n光纤环网\t151089\n起点软件园\t151090\n朋克Frostpunk\t151091\n新晃县\t151092\n香魂\t151093\n四川省住建厅\t151094\nxadmin\t151095\n监测员\t151096\n2000户\t151097\n1064位\t151098\nDubbox\t1510
99\n高雷\t151100\n挤爆\t151101\n服务署\t151102\n消防中队\t151103\n网仓\t151104\n华侨城中学\t151105\n奥美集团\t151106\n資\t151107\n聊天工具\t151108\n阿明\t151109\n家猫\t151110\n黄泛区\t151111\nspss单因素方差分析\t151112\n途牛网\t151113\nsee\t151114\nトレ\t151115\n角处\t151116\n长沙国家高新技术产业开发区\t151117\n21项\t151118\n劳资员\t151119\n2017年1月19日\t151120\n如花\t151121\n网络连接数\t151122\n祖名\t151123\n重返德军总部\t151124\n五老村\t151125\nrie\t151126\n转公\t151127\n宏桥\t151128\nunnatural\t151129\nCement\t151130\n49.2\t151131\n收件员\t151132\n水电路\t151133\n赢战\t151134\n还贷\t151135\nbj\t151136\n傲慢与偏见\t151137\n65536\t151138\n出尘\t151139\n木下佑香\t151140\n41142110411@qq.com\t151141\n薜荔\t151142\nPET\t151143\n1.96\t151144\n郑允浩\t151145\n周小燕\t151146\n439号\t151147\nOperation\t151148\n苏民\t151149\n真知\t151150\n呼和浩\t151151\nRAF\t151152\n生工生物工程(上海)股份有限公司\t151153\n尖叫声\t151154\n升学\t151155\nsand\t151156\n榴石\t151157\neric\t151158\n四川文化艺术学院\t151159\n黄伯荣\t151160\n卡酷\t151161\n逄\t151162\n時雨\t151163\nrig5\t151164\n志士\t151165\n口袋妖怪GO\t151166\n老根\t151167\n外国妞\t151168\natomos\t151169\n四川商务职业学院\t151170\n棕丝\t151171\nDJ版\t151172\n金无足赤\t151173\n3d全息投影\t151174\n55厘米\t151175\n七一\t151176\nleggings\t151177\ngba模拟器\t151178\ninfo/\t151179\n折股\t151180\nV14\t151181\n第三时尚网\t151182\n0k\t151183\n务川县政府网\t151184\n快码\t151185\n华安\t151186\nfactors\t151187\nLz\t151188\nv7.7\t151189\n关税清单\t151190\njuno\t151191\n印巴\t151192\nC3\t151193\n胜算\t151194\nWhit\t151195\n地衣芽孢杆菌\t151196\ngpon\t151197\nsinn\t151198\nnavcate\t151199\n乾隆年\t151200\n价签\t151201\n力敏\t151202\n帝豪ec7\t151203\n马鞍山OK论坛\t151204\n玩灯\t151205\n东外\t151206\n成分\t151207\n钢棍\t151208\n探矿\t151209\n/dev/sdb1\t151210\nloveyou\t151211\nxiaoming\t151212\n长城汽车股份有限公司\t151213\n町町\t151214\n排渣\t151215\nEXEC\t151216\nA6500\t151217\n沉湎\t151218\nenergetic\t151219\n环桥\t151220\n熊岳城\t151221\n现代战争\t151222\n590分\t151223\n撒腿\t151224\n西庄\t151225\n全外显子测序\t151226\n登顶\t151227\n腐蚀性\t151228\n假面骑士空我\t151229\n滕州人才网\t151230\n测管\t151231\n相提并论\t151232\nFilter过滤器\t151233\nciwong\t151234\n市国税局\t151235\n海棠湾\t151236\n农牧局\t151237\nCosting\t151238\n皇品\t151239\n缩率\t151240\nwwe\t151241\n分
解式\t151242\n实验动物中心\t151243\n高强度\t151244\n介电损耗\t151245\n1oz\t151246\n理疗仪\t151247\n钢之炼金术师fa\t151248\n20CM\t151249\n换宿\t151250\n大术\t151251\n宗庆西西里的美丽传说\t151252\n探望\t151253\n上月\t151254\n组织液\t151255\n木北\t151256\nPolo衫\t151257\n新疆九洲恒昌供应链管理股份有限公司\t151258\n牵引式\t151259\nregsvr32.exe\t151260\nSpringCloud\t151261\n屈原管理区\t151262\n矫\t151263\npie\t151264\n预评\t151265\n咕泡学院\t151266\n进步\t151267\n杨彤\t151268\n人民医院\t151269\n空谷\t151270\n供销社\t151271\n分子热运动\t151272\n选法\t151273\n硬盘盒\t151274\n白居特朗普\t151275\n华为南研所\t151276\n未得\t151277\n初面\t151278\n看衰\t151279\n电视显示器\t151280\n短枪\t151281\n中国黄金协会\t151282\nWP10/Win10\t151283\n评审委员会\t151284\nNormalize\t151285\n痛仰\t151286\n________\t151287\n日本清酒\t151288\n花虾\t151289\n幻世篇\t151290\n交通费\t151291\n合成氨\t151292\n跳失率\t151293\n放鞭炮\t151294\ngtb\t151295\n痕迹化\t151296\n钓鱼者\t151297\n截齿\t151298\nfpe\t151299\n四周岁\t151300\n张强\t151301\n心素\t151302\nAudemars\t151303\n千堆\t151304\n水轮机\t151305\n先进\t151306\n没了\t151307\n顺延\t151308\n金正宝\t151309\n疯狂猜图\t151310\n刑侦\t151311\n王胜利\t151312\n一两家\t151313\n临床医学专业\t151314\n音乐坊\t151315\n巴陵县\t151316\n450度\t151317\n共轭效应\t151318\n生丝\t151319\n终成眷属\t151320\n4000年前\t151321\n倚天剑\t151322\nMFC-7450\t151323\n白家庄小学\t151324\nsealing\t151325\n巴哥犬\t151326\n中联重科\t151327\n信长之野望14威力加强版\t151328\n出招\t151329\nphotshop\t151330\n金发\t151331\n汽灯\t151332\n鉴赏期末考试\t151333\n黄冈小状元达标卷\t151334\n八档\t151335\n张贴\t151336\n丁勇岱\t151337\n傅恒\t151338\n2676\t151339\nacess\t151340\n世纪春城\t151341\n百分之0\t151342\n搓\t151343\n家长版\t151344\n助印\t151345\nnuvoton\t151346\n花JD\t151347\n钙素\t151348\n立标\t151349\n昆明北市区\t151350\n宫壁\t151351\n第多少集\t151352\n蓝联\t151353\nsmart210\t151354\nparsons\t151355\n4.2万\t151356\n安徽华强环保科技有限公司\t151357\n何文秀\t151358\n迅付信息科技有限公司\t151359\n淬毒\t151360\n调色师\t151361\n刘恩科\t151362\nMR_CHW\t151363\n低速载货汽车\t151364\n深证指数\t151365\n学校图书馆\t151366\n耀西\t151367\n市国土房管局\t151368\n黑龙江饶河县人民政府\t151369\n众矢之的\t151370\n清心普善咒\t151371\nBotania\t151372\npywrap\t151373\n世博\t151374\n五百万\t151375\n南郊\t151376\ncad2010序列号\t151377\nwindows安装器\t151378\n化学反应式\t151379\n蔷薇科桃属\t151380\n道客巴巴\t151381\n毅腾\t
151382\nskit\t151383\n陈孝萱\t151384\n耳鼻喉\t151385\ninitiator\t151386\n24式\t151387\nfootprint\t151388\n阳光体育运动会\t151389\n快乐\t151390\n洗扫车\t151391\n馈赠\t151392\n梦桑阁\t151393\n360手机安全卫士\t151394\nmetal\t151395\n第244集\t151396\n飞寒\t151397\n锁眼机\t151398\n住房贷款利率\t151399\n湘西州\t151400\n实木复合板\t151401\n按量\t151402\n世纪游轮\t151403\n华讯投资\t151404\n飞牌\t151405\n1多\t151406\n杰普特\t151407\n法兰西共和国\t151408\n第8层\t151409\n富力乌衣水镇\t151410\n里奥·梅西\t151411\n金大班\t151412\n厂服\t151413\n卸妆液\t151414\n下船\t151415\nm126a\t151416\n寒武记\t151417\n不受\t151418\n住房公积金贷款\t151419\n天喔\t151420\n石匣\t151421\n秘蜂\t151422\n买主\t151423\n模体\t151424\n抓药\t151425\n326\t151426\n脂肪干细胞\t151427\nv16\t151428\njavasc\t151429\n入淘\t151430\n自学成才\t151431\n学历年\t151432\n数百万\t151433\nInteractions\t151434\n乐清日报\t151435\n日本京\t151436\nALIGN\t151437\ncreatetime\t151438\n45首\t151439\n九亭\t151440\n元身\t151441\n陈建民\t151442\n60篇\t151443\n北京华宇软件股份有限公司\t151444\n洪氏\t151445\ntailor\t151446\n献诗\t151447\n8年\t151448\n爱恋\t151449\n接收值\t151450\n武汉理工大学\t151451\n32GB/WiFi版\t151452\nUniversidad\t151453\n银项链\t151454\n超过2年\t151455\nliteon\t151456\nt30\t151457\n共轨\t151458\n钳式\t151459\nnist\t151460\n何韵诗\t151461\n福建省统计局\t151462\n十有八九\t151463\n肾友\t151464\n朱砂\t151465\n锁汇\t151466\n拖稿\t151467\n易才\t151468\n大西洋城\t151469\nsqluldr2\t151470\n培养皿\t151471\n榆阳机场\t151472\n线椒\t151473\n000001\t151474\n刁爱青\t151475\n爱妻\t151476\n光栅尺\t151477\n谢精\t151478\n与非\t151479\n深度图\t151480\nwww.jiaoyu123.com/modules/article/txtarticle\t151481\n小米盒子3增强版\t151482\n非正常\t151483\nIDEA14\t151484\n公安派出所\t151485\nshort型\t151486\n抓绒\t151487\nglad\t151488\n入台\t151489\n_肝病频道_健客网\t151490\n教益\t151491\n贝斯特\t151492\n郧阳中学\t151493\nnantong\t151494\n密闭式\t151495\n成都交大\t151496\n血塞通\t151497\n榔头\t151498\n无聊水贴\t151499\nccproject\t151500\n托辊\t151501\n死枪\t151502\nCL00\t151503\n选择困难症\t151504\n高铁票改签\t151505\n自由舰\t151506\n航信金税盘\t151507\n全权\t151508\nSupports\t151509\nant\t151510\n苏曼殊\t151511\nwidows10\t151512\n亿利达\t151513\n修仙类\t151514\n河东区\t151515\n刘卫平\t151516\n开发区国税局\t151517\nMacy\t151518\n沈夜焰\t151519\n触漫\t151520\n调标\t151521\n五语\t151522
\n抛锚\t151523\n广肖邦\t151524\nbale\t151525\n仙妻\t151526\n穿针引线\t151527\n鹤山信息网\t151528\n/2\t151529\n急救护理学\t151530\n凤凰模拟器\t151531\n烧糊\t151532\ncrond\t151533\n7度\t151534\n皇巢网\t151535\n巴宜区\t151536\n黄金时刻\t151537\n硚口区政府\t151538\n20150817\t151539\n淋语\t151540\n格构柱\t151541\n黄文\t151542\n山东公务员考试网\t151543\n姜伟\t151544\n圆体字\t151545\n江浦街道\t151546\n盗链\t151547\npojie\t151548\n枪神\t151549\n养护\t151550\n采摘园\t151551\n大东亚\t151552\n秀人\t151553\n菏泽人才网\t151554\n孤岛惊魂5CPY\t151555\nuip\t151556\n苏明娟\t151557\n国中水务\t151558\n讲析\t151559\n9.1.2\t151560\nazo\t151561\n成都分行\t151562\n辛辛\t151563\n知我相思\t151564\nISOVIP\t151565\n油条\t151566\n湾沚镇\t151567\n家具装修|一起网\t151568\n厨道\t151569\n路甬祥\t151570\n首颗\t151571\nzhoushan\t151572\n500分\t151573\n旦那\t151574\n富士吧\t151575\n第5节\t151576\n朱俊文\t151577\n缩略图\t151578\nCPA注册会计师\t151579\n卡宝\t151580\n恋尸癖\t151581\n长兴岛郊野公园\t151582\n背拥抱\t151583\n周数\t151584\nCommMonitor\t151585\n23万\t151586\n少见\t151587\nLANGUAGE\t151588\n初页\t151589\nMidea\t151590\n新能源汽车产业链\t151591\n叱\t151592\nBROOKS\t151593\nghost版\t151594\n9111\t151595\n订台\t151596\n内功\t151597\nzjtax\t151598\n康威\t151599\n陈容\t151600\n炎陵县\t151601\n初月\t151602\n虹桥新区\t151603\na套\t151604\n催乳剂\t151605\n美痴女\t151606\n铜锣湾广场\t151607\n货率\t151608\niead\t151609\n吉他堂\t151610\n身材管理器\t151611\n私物\t151612\n胶粉\t151613\nHankook\t151614\n十公里\t151615\nqq旋风\t151616\n高下\t151617\n水母衣\t151618\nCMMI认证\t151619\n8155\t151620\n巴托\t151621\n幻想水浒传\t151622\n功率型\t151623\n燕宝\t151624\n金熊\t151625\n单调性\t151626\n纳尼亚传奇1\t151627\nebp\t151628\n卡西奥佩娅\t151629\n瓮福集团\t151630\n灵跃\t151631\n遗属\t151632\n免生\t151633\n元鼎\t151634\nH77\t151635\nCodey\t151636\n废水\t151637\nFLAC/整轨\t151638\n网易开源镜像站\t151639\n居士\t151640\n电影券\t151641\n被偷走的那五年\t151642\n发型网\t151643\nocx\t151644\n残疾人证\t151645\nlx100\t151646\n秦皇岛火车站\t151647\nE02\t151648\n唯亭街道\t151649\n不爱我\t151650\n福费廷\t151651\n六辆\t151652\n公共资源交易网\t151653\n534\t151654\n导柱\t151655\n村野\t151656\n保送\t151657\narmcc\t151658\n妇产科\t151659\n资生\t151660\n大全集团\t151661\n偷香小说网\t151662\n艾宾浩斯\t151663\n涂塑\t151664\n东岗路\t151665\n风主\t151666\n撸串\t151667\n29寸\t151668\n安娜情欲史\t
151669\n黛米摩尔\t151670\n硝基化合物\t151671\n责编\t151672\n捷途\t151673\n裸杀\t151674\n首设\t151675\n失败者\t151676\n市长\t151677\n试解\t151678\n双把\t151679\n虚竹\t151680\n胺\t151681\n抽经\t151682\n猫途鹰\t151683\nWalle\t151684\n上海烛龙\t151685\n较少\t151686\n压不住\t151687\n光电耦合器\t151688\n白帽子\t151689\n缸线\t151690\n短信银行\t151691\n肇庆市第一人民医院\t151692\n封闭期\t151693\n猎户星座\t151694\nroof\t151695\n财富版\t151696\n黄梅镇\t151697\n援\t151698\n浙江省小学\t151699\nVirtuaNES\t151700\n20平方\t151701\n微言\t151702\n14份\t151703\n博罗县\t151704\nbolt\t151705\n回火炉\t151706\n文山州\t151707\n住友酒店集团\t151708\n#value\t151709\n慕小乔\t151710\n急性心肌梗塞\t151711\n尺标\t151712\n1852\t151713\n农产品物流园\t151714\n臺\t151715\nsum\t151716\n138job\t151717\n豪克\t151718\n成藏\t151719\n季后赛首秀\t151720\n南浔镇\t151721\n天津电子\t151722\nmsata\t151723\n温控仪\t151724\n浓墨重彩\t151725\nHits\t151726\n军队\t151727\n长洲\t151728\n感冒片\t151729\n中长跑\t151730\nY03\t151731\n定居\t151732\n轩逸武神天下\t151733\n丝涟旗舰店\t151734\n楚天都市报\t151735\n简体画\t151736\n妙手回春\t151737\n8080.2015244\t151738\n世界奇妙物语\t151739\n络筒机\t151740\n热管散热器\t151741\n取行\t151742\n王祎\t151743\n网速测试\t151744\n沃特森\t151745\n秀币\t151746\n2017年教师节\t151747\nmusix\t151748\n秒懂百科\t151749\n京西宾馆\t151750\n管内\t151751\n如图甲\t151752\n瑞好\t151753\njr史密斯\t151754\n霜叶\t151755\n技术有限公司\t151756\n3.23\t151757\n晋武帝\t151758\ncern\t151759\n天炼\t151760\n综合教务管理系统\t151761\n4048\t151762\n辣白菜\t151763\n无责方\t151764\n上海大金\t151765\nDessert\t151766\n水吧台\t151767\n靖难之役\t151768\n0504\t151769\ntyped\t151770\n糖罐\t151771\nMODERN\t151772\n打比方\t151773\n开口机\t151774\n美国梦之队\t151775\n别傻了\t151776\n东明\t151777\nX80HD\t151778\n土耳其里拉\t151779\n金坛区\t151780\n氧化锰\t151781\n小芳\t151782\n傅里叶级数\t151783\n博济\t151784\n中阴身\t151785\n财宝\t151786\n全风\t151787\nNeo4j中文网\t151788\n310w\t151789\n奥古斯特\t151790\n闪嫁\t151791\nmacd指标\t151792\n结算表\t151793\nxmit\t151794\n天秤女\t151795\n布丽吉特\t151796\n26岁\t151797\nboundaries\t151798\n20171215\t151799\n蒙口\t151800\nFENG\t151801\n北京国际展览中心\t151802\n匪帮\t151803\n各月\t151804\n西安地铁\t151805\njavascript高级程序设计\t151806\nslider\t151807\n交互\t151808\n1.6.2\t151809\ntypefile\t151810\nIG战队\t151811\n10月23日\t151812\n湖北
省食品药品监督管理局\t151813\n何立峰\t151814\n寻宝天行\t151815\n虚拟硬盘\t151816\nneicun\t151817\n狼派\t151818\n两叶\t151819\n江西联通\t151820\n范进中举\t151821\n热镀\t151822\n廊下镇\t151823\n五雷轰顶\t151824\n折馆\t151825\nfastreader\t151826\n规模以上工业增加值\t151827\n新药特药\t151828\nAllSight\t151829\nck\t151830\n集美貌\t151831\nTra\t151832\n朴实无华\t151833\n江苏医药职业学院\t151834\nISO14000\t151835\n袋泡茶\t151836\ndat\t151837\n挺括\t151838\n四部\t151839\n长坂坡\t151840\n6.2_\t151841\n台独分子\t151842\n100万股\t151843\n鱼乡\t151844\n新五代史\t151845\n1月17日\t151846\n1:30\t151847\nradi\t151848\n豪麦\t151849\n公布\t151850\n1小时后\t151851\n中村知惠\t151852\n盐酸左氧氟沙星胶囊\t151853\n3.9\t151854\n在我心里\t151855\n3.2.10\t151856\nnopi\t151857\n荸荠\t151858\npinpai\t151859\n宁波政府\t151860\n背贴\t151861\n2010-0\t151862\n3dmgame\t151863\n自查\t151864\n天阑\t151865\n潘集\t151866\n中国互金协会\t151867\n天天日\t151868\n高危儿\t151869\n说话\t151870\n懒人听书\t151871\n上海市体育局\t151872\n玩好\t151873\nUMP9\t151874\n一卡多号\t151875\nJs\t151876\n开平碉楼\t151877\n小李\t151878\n力狼\t151879\n唐艺\t151880\n亿力\t151881\n交流电路\t151882\n金光华广场\t151883\n版面\t151884\n云景\t151885\nUV机\t151886\nSeamless\t151887\n48v锂电池\t151888\n革除\t151889\n中沙\t151890\n平顶山新区\t151891\nITkeyowrd\t151892\n社会工作专业\t151893\n云上\t151894\nVNR\t151895\n一页一页\t151896\n2018年3月15日\t151897\n熔铝炉\t151898\n趋化\t151899\nubi\t151900\n赛百味\t151901\n雷声大雨点小\t151902\n初查\t151903\n邹博\t151904\n宝马3系论\t151905\n龙峰\t151906\n国产版\t151907\nヒロイン\t151908\n迎江区\t151909\n突击再突击\t151910\nlme\t151911\n新宾县\t151912\n60位\t151913\n室内定位\t151914\nUI/UX\t151915\n加液\t151916\n二十国集团\t151917\n白鹿炉石传说\t151918\n萌\t151919\n黑鸟\t151920\n王爷\t151921\n广东国际旅行卫生保健中心\t151922\n三角洲岛\t151923\nOverDrive\t151924\n猴菇饼干\t151925\n3Dmax2018\t151926\n从头再来\t151927\n中村明日美子\t151928\n奶齿\t151929\nWebSite\t151930\n央视版\t151931\n触摸一体机\t151932\n郁闭度\t151933\n尾崎\t151934\nappsync\t151935\njnt\t151936\nfenix\t151937\n王福重\t151938\n胡秃子\t151939\n香皂花\t151940\n陈都灵\t151941\n3多少\t151942\n展颜\t151943\n纯纯\t151944\nwarnings\t151945\nBillwang\t151946\n峰谷\t151947\n一点处\t151948\n整年\t151949\n静观\t151950\n20151224\t151951\n棱晶\t151952\n纱卡\t151953\n新妻\t151954\n赵雅\t151955\n
固定收益类\t151956\nppm\t151957\n海南省物价局\t151958\n一标三实\t151959\n到达率\t151960\n安徽省地质矿产勘查局\t151961\n黄娟\t151962\nBayonetta\t151963\n首都\t151964\n遗传物质\t151965\n二次根式\t151966\nsunshine\t151967\n2017年11月14日\t151968\n股吧\t151969\n东辰\t151970\nrabbitMq\t151971\n记述\t151972\n平柜\t151973\n帧长\t151974\n神探狄仁杰5\t151975\ndbghelp\t151976\nvideo\t151977\n岩手县\t151978\nC\t151979\nLetter\t151980\n亚麻酸\t151981\n跨海大桥\t151982\n阜外华中心血管病医院\t151983\n行部\t151984\n肛管\t151985\n福建省人民防空办公室\t151986\n琼州海峡\t151987\nJDG\t151988\n液化天然气\t151989\n纳米\t151990\n体验式\t151991\n重天\t151992\n吊坠绳\t151993\n大杨树\t151994\n陈师\t151995\n过日\t151996\n宠物连连看3.1\t151997\nXm\t151998\n前脸\t151999\n4399穿越火线\t152000\n字节码\t152001\n隔离型\t152002\ntk1\t152003\nGB50500-2013\t152004\n中国京冶工程技术有限公司\t152005\n谢谢你的爱\t152006\nDisneyland\t152007\n鹤山区\t152008\nSTMicroelectronics\t152009\n中楼层\t152010\n0xc000000e\t152011\nDUX\t152012\n应诉\t152013\n敏锐\t152014\n第94号\t152015\n宋詞\t152016\n修缮\t152017\n石桥镇\t152018\nHTML5+CSS3\t152019\nminion\t152020\n粘弹\t152021\n电锯惊魂8\t152022\n专门\t152023\n远洋天著\t152024\nHuawei\t152025\n千本\t152026\ndivcss\t152027\nNan\t152028\n古运\t152029\n你的心\t152030\n福泰\t152031\n逼痒\t152032\n市人力社保局\t152033\n成本会计\t152034\n心理团\t152035\n王哥庄街道\t152036\n肢端\t152037\n重演\t152038\n风切变\t152039\n蛋白激酶\t152040\n逆向分析\t152041\n生物秀论坛\t152042\n胶囊机\t152043\n星家\t152044\n言多必失\t152045\nb29\t152046\n土耳其杯\t152047\n普外科\t152048\n妈妈们\t152049\n常州晚报\t152050\ntensorlayer\t152051\n佳能550d\t152052\n范琳琳\t152053\n优品汽车服务(上海)有限公司\t152054\nASCII,Unicode\t152055\n大众凌渡\t152056\n石天龙\t152057\nmac视频播放器\t152058\n青安岗\t152059\n光谱学\t152060\n涂\t152061\n人教八年级\t152062\nshinobi\t152063\n五大招\t152064\n卤\t152065\n合流\t152066\n李香秀\t152067\n信论\t152068\n火王集团\t152069\n调规\t152070\n战术家\t152071\n莎朗\t152072\n准确地\t152073\n全运\t152074\n摩斯电码\t152075\n办公用\t152076\n岔路\t152077\n80txt\t152078\n长孙无忌\t152079\n赵世熙\t152080\n务本\t152081\n大下小\t152082\n街边\t152083\n精简篇\t152084\n曲苑\t152085\n灰雀\t152086\n王荣生\t152087\nFIND\t152088\n易课\t152089\npair\t152090\n荷尔蒙\t152091\nHelloweba\t152092\n凯旋\t152093\n勇担\t152094\n无糖酸奶\t152095\nmoviepy\
t152096\nDUCATI\t152097\n朴恩\t152098\n钢镚\t152099\nQos\t152100\n赫尔城\t152101\nMemphis\t152102\n龙影\t152103\n闭杯\t152104\n奥迪奔驰\t152105\n私设\t152106\n简美妍\t152107\n致命邂逅\t152108\n熊孩子\t152109\n更\t152110\n张百忍\t152111\n鱼子\t152112\n验标\t152113\n好精彩\t152114\n摊费\t152115\n佛奈\t152116\n艾吉奥\t152117\n陈向宏\t152118\n浮球阀\t152119\n环卫所\t152120\n爆炸物\t152121\n陪产\t152122\n三峡博物馆\t152123\nBloom\t152124\n针器\t152125\n趣文\t152126\n葱花饼\t152127\nZKTeco\t152128\n真皮鞋\t152129\n综掘机\t152130\n马牌\t152131\n红米5a\t152132\n老范\t152133\n无福\t152134\nv6.6.5\t152135\nWrong\t152136\n天蝎计划\t152137\n1600公里\t152138\nCalculator\t152139\nuvision\t152140\n20170510\t152141\n红袖添香\t152142\n杨氏太极拳\t152143\n第83章\t152144\n富煌钢构\t152145\nzhixing\t152146\n四川自考网\t152147\nv1.0安卓\t152148\n普兰店市\t152149\n壹元\t152150\n华丰\t152151\n料筒\t152152\n收息\t152153\nblanks\t152154\n晨旭\t152155\n妙意网\t152156\nMAC魅可中国\t152157\n中文档\t152158\n电子工程系\t152159\n凤凰小学\t152160\n无道\t152161\n河南代表团\t152162\n江西省委\t152163\n城镇\t152164\n批发价\t152165\n百度壁纸\t152166\n刘巍\t152167\n丁基\t152168\n晕晕\t152169\n氮肥\t152170\n穿线机\t152171\n不善\t152172\n梅威瑟\t152173\nHeist\t152174\n卡其\t152175\n042期\t152176\n泥人张\t152177\n立即\t152178\n北京欢乐谷\t152179\n干群\t152180\n开学后\t152181\n一饭\t152182\nShout\t152183\n食品药品监督管理总局\t152184\n小儿疳积\t152185\n83张\t152186\n吴法天\t152187\n山高人为峰\t152188\n豹子头\t152189\n零感\t152190\nChronos\t152191\n海南特区\t152192\n谁负责\t152193\n杜培东\t152194\n1毫秒\t152195\n用住\t152196\n西四路\t152197\n红椒\t152198\n仁怀市\t152199\n中华慈善总会\t152200\n性贿赂\t152201\n公需\t152202\n云升\t152203\n库尔扎斯\t152204\n触发器\t152205\nProjects\t152206\n喵窝丨\t152207\n加拿大国家公园\t152208\n陈康肃\t152209\n冷心\t152210\n张凯丽\t152211\n牛车\t152212\n林宇婧\t152213\n雨\t152214\n南宁人才网\t152215\n相对湿度\t152216\nRose\t152217\nfortress\t152218\nMathtype\t152219\n轩辕剑穹之扉\t152220\n亲者\t152221\n区工商联\t152222\n将星\t152223\n考车\t152224\n县级政府\t152225\n都市区\t152226\nfsx\t152227\n日照\t152228\n电动代步车\t152229\n凋亡\t152230\n壹伴\t152231\nReceived\t152232\n74161\t152233\n黑蝴蝶\t152234\n诠\t152235\n63a\t152236\n望春街道\t152237\n电竞\t152238\n全球花木网\t152239\n冯力军\t152240\n1v5\t152241\ncos服\t152242\n娇妃\t15
2243\nDavid\t152244\n青人\t152245\n镀锌钢\t152246\n仞\t152247\n百度网站\t152248\n沙窗\t152249\n头尖\t152250\n宴\t152251\n城西街道\t152252\n枣庄百姓网\t152253\nredis主\t152254\n制样\t152255\n莹石\t152256\n取反\t152257\n小姑子\t152258\n昭通\t152259\n失人\t152260\n粉盒\t152261\n鱼坛\t152262\nfiash\t152263\nwhe\t152264\n荣耀畅玩6x\t152265\n中铁工程设计咨询集团有限公司\t152266\n前浪\t152267\n诚可贵\t152268\n金立S6\t152269\n英男\t152270\n陌森\t152271\nlrtimelapse\t152272\n圣继绝学\t152273\n金田一\t152274\n孙洋\t152275\n泪流满面\t152276\n月宫\t152277\nCommunications\t152278\n扁平苔藓\t152279\n联合国安理会常任理事国\t152280\ngr2\t152281\n新疆公司\t152282\n易代通使馆认证网\t152283\nHygiene\t152284\n戒律牧\t152285\n富勒烯\t152286\n阿姆河\t152287\n一次\t152288\n白银路\t152289\n袖标\t152290\n管涵\t152291\n53加盟网\t152292\n挂点\t152293\n遮阳罩\t152294\nEHR\t152295\n绞牙避震\t152296\n杏花节\t152297\n香椿叶\t152298\n征信业管理条例\t152299\n河北小学\t152300\n寸照网\t152301\n大连公交\t152302\n云启\t152303\ncmath\t152304\n夏微\t152305\nabcdef\t152306\n真善美\t152307\n大萝卜\t152308\n三国之召唤猛将\t152309\n茂铁\t152310\n亲子教育\t152311\n爆率\t152312\n洋果子\t152313\n目次\t152314\n告示\t152315\n无际\t152316\nSTUDIO\t152317\n清洗器\t152318\n医手\t152319\n康奈尔\t152320\n洛阳铲\t152321\n龙傲\t152322\nTuxera\t152323\nchar函数\t152324\n葡\t152325\n芒果千层蛋糕\t152326\n新宁镇\t152327\n11.0592\t152328\n防暴\t152329\nbizhub\t152330\n先锋e支部\t152331\n二尖瓣\t152332\n廖斌\t152333\n分集剧情介绍\t152334\n米粒·佳\t152335\n红灯区\t152336\nassert\t152337\n银谷大厦\t152338\n北京五中\t152339\n阿富汗\t152340\ntems\t152341\n碱液\t152342\nactive3\t152343\n接触性皮炎\t152344\n车包\t152345\nyoko\t152346\n天天she\t152347\n揭牌\t152348\n中国人民大学新闻学院\t152349\n池田美和子\t152350\n乱码\t152351\n天津现代职业技术学院\t152352\nbeng\t152353\ndeta\t152354\n星际穿越\t152355\n维多\t152356\n字集\t152357\n李子璇\t152358\nwetransfer\t152359\n中国幼儿园\t152360\n双电源\t152361\nbmx\t152362\n结冰\t152363\n上海农场\t152364\n学警狙击\t152365\n艳照门\t152366\n鹏华国防\t152367\n饮料\t152368\npicamera\t152369\n三天\t152370\n隐蔽者\t152371\n将者\t152372\n299号\t152373\nTricks\t152374\n国税实名认证\t152375\n九点半\t152376\n陈百强\t152377\n邻里中心\t152378\n无间道1\t152379\n360se.exe\t152380\n甘肃省环保厅\t152381\nsumitomo\t152382\n少好\t152383\nxdd\t152384\n黄耳龟\t152385\n摇铃\t152386\
n超胆侠\t152387\n广州市高级技工学校\t152388\n不落\t152389\n企业保险\t152390\nWCDB\t152391\n活塞环\t152392\n罗姆尼\t152393\nvios\t152394\n情真意切\t152395\n2016年11月14日\t152396\n李文慧\t152397\n金穗\t152398\nhp1213\t152399\n丁当\t152400\n谢词\t152401\nStudy_Work\t152402\n视觉库\t152403\n8350k\t152404\n77套\t152405\n军团战\t152406\n极酷\t152407\n福神\t152408\n基类\t152409\n菲斯\t152410\n2008域\t152411\n中电信\t152412\n昆山幼儿园\t152413\n大连新闻网\t152414\n飞客\t152415\n凿壁\t152416\n最近一个星期\t152417\nghj\t152418\n硬红\t152419\nPuff\t152420\n2612\t152421\nfgets\t152422\n帝豪GS\t152423\n江苏省经济和信息化委员会\t152424\n生日记\t152425\n鞘膜\t152426\n5.0.0\t152427\n2012年06月16日\t152428\nUnderstanding\t152429\n世嘉论坛\t152430\nrepaint\t152431\nmahout\t152432\n杭州萧山机场\t152433\nWAFFLE\t152434\n400ML\t152435\n铁扇公主吧\t152436\nBAOCMS\t152437\n神思\t152438\n201706\t152439\ndiscography\t152440\nNonk\t152441\nZ11miniS\t152442\n必发\t152443\n首诊\t152444\n灵宠\t152445\nvideos\t152446\n炤\t152447\n销售利润\t152448\n俞静\t152449\n婉若游龙\t152450\n亚丽莎\t152451\n北京市第四中学\t152452\ninfluxdb\t152453\n吻吻\t152454\n暖暖\t152455\n短时间\t152456\n5月10号\t152457\n商业区\t152458\nv4.2.4\t152459\nmq\t152460\n上千种\t152461\n快鹿\t152462\nAngelDevil\t152463\n60Hz\t152464\n辐照\t152465\n1000倍\t152466\n120_\t152467\n扫尾\t152468\n朱文杰\t152469\n唐岩\t152470\n孙恒\t152471\n化能\t152472\n舍弗勒\t152473\n八纵八横\t152474\n百文\t152475\n第八层\t152476\n后盖\t152477\n宿雨涵\t152478\n真脸\t152479\nvmwar\t152480\n幸福度\t152481\n新天翼之链\t152482\n1期\t152483\n头陀镇\t152484\n刘凯\t152485\n散热板\t152486\n日常用语\t152487\n检票员\t152488\nCloudFront\t152489\nadc0832\t152490\n双宾\t152491\n喷薄\t152492\n西芹\t152493\n字派\t152494\n或是\t152495\n求医\t152496\n就可\t152497\n辜\t152498\n四百年\t152499\n海口市人力资源和社会保障局\t152500\n莉莉柯林斯\t152501\n500千伏\t152502\n华东师范\t152503\ntimestamps\t152504\n三五互联\t152505\n結城\t152506\nglk300\t152507\n灰头\t152508\n色觉\t152509\n裸流\t152510\n逆鳞会有天使替我爱你\t152511\n大手大脚\t152512\n花瓣儿\t152513\n大大咧咧\t152514\n洮北区\t152515\n断路器\t152516\n难受\t152517\n搞好\t152518\n命运石之门0\t152519\n_千库网588ku.com\t152520\nzai\t152521\n水土\t152522\n中环\t152523\n伍尔特\t152524\n均匀\t152525\n这世道\t152526\n这些事\t152527\n供应者\t152528
\n姆爷\t152529\n朱叶\t152530\n187看书网\t152531\n求败\t152532\n皋亭山\t152533\nzendesk\t152534\nCygwin\t152535\n隆安\t152536\n红外吸收峰\t152537\n装点\t152538\ndisease\t152539\nSKILL\t152540\n大联欢\t152541\n胥渡吧\t152542\n更广泛\t152543\n内向\t152544\n装做\t152545\n污浊\t152546\n海归人才网\t152547\n盐城幼儿师范高等专科学校\t152548\n龙均亮\t152549\n佳都科技\t152550\nConstraint\t152551\n莲溪\t152552\nAutoEncoder\t152553\n70句\t152554\n漆板\t152555\n灵山卫\t152556\nΑ\t152557\nVMWare虚拟机\t152558\n10000平方米\t152559\nInstrumentation\t152560\n纸短情\t152561\n乳化\t152562\n乐手\t152563\n用方\t152564\n克\t152565\n5幢\t152566\nFT3\t152567\nyahoo\t152568\n小S\t152569\n不困\t152570\nReliance\t152571\n富文本\t152572\n皮癣\t152573\npetct\t152574\ncheckbox复选框\t152575\n陈鲁民\t152576\n送关\t152577\n甘蓝\t152578\n酱瓶\t152579\ntest1\t152580\n浦发大厦\t152581\n彭佳慧\t152582\n近在眼前\t152583\n国政\t152584\n槐荫区\t152585\n顺天\t152586\n无缝钢管壁\t152587\n记时\t152588\n拈花湾\t152589\n绝地求生4\t152590\n电导率\t152591\n牟平区\t152592\nchrome浏览器\t152593\n清洁球\t152594\n水画\t152595\n瑞特\t152596\nsynplify\t152597\n奥拉朱旺\t152598\n紫青\t152599\n佳球\t152600\n山东省食药监局\t152601\n主诉\t152602\n渗水\t152603\n鼎太风华\t152604\n有光泽\t152605\nAfreeca\t152606\n申花队\t152607\n百会\t152608\nlongchamp\t152609\nIOS9\t152610\n智远\t152611\n河北省住房和城乡建设厅\t152612\n龙岗中心医院\t152613\n荣乌高速\t152614\n第八号当铺\t152615\n六七万\t152616\npacks\t152617\n空调机\t152618\n论文库\t152619\nDCT变换\t152620\nB幢\t152621\n一夜后\t152622\n胡舒立\t152623\n兰蒂斯\t152624\n收汇\t152625\n高梁\t152626\n为你我受冷风吹\t152627\n奶量\t152628\nOpenType\t152629\n反不正当竞争法\t152630\nzfz\t152631\nVIJOZ\t152632\n展览会展商\t152633\n李国斌\t152634\n投放\t152635\nKidney\t152636\n辉山乳业\t152637\nAngular\t152638\n现任\t152639\n镖局\t152640\nFiber\t152641\n佳能g3800\t152642\n澳标\t152643\nAim\t152644\nless-loader\t152645\n郑远元\t152646\nEmo\t152647\n两死\t152648\n福\t152649\n湘妃竹\t152650\n吴鹏\t152651\n李伊庚\t152652\nECMall\t152653\n老河口市\t152654\n中国校友会2017中国大学\t152655\n内质网\t152656\nANALYSIS\t152657\n神木市人民政府\t152658\nshitou\t152659\nJAN\t152660\n2015年10月29日\t152661\n五峰\t152662\n张掖市地方税务局\t152663\n软星\t152664\ntowel\t152665\n顾太清\t152666\nXC40\t152667\n仙语\t152668\nguanjia\
t152669\n第三四\t152670\n港理工\t152671\nvSwitch\t152672\n悍女\t152673\n桃源山庄\t152674\n52pk穿越火线\t152675\n第34\t152676\n自旋锁\t152677\nzhjh\t152678\nms17\t152679\n南京日报社\t152680\n外汇管理局\t152681\n5pk\t152682\njeecg\t152683\n应税收入\t152684\nconsist\t152685\n骨髓增生异常综合征\t152686\n网易读书\t152687\nwherever\t152688\n北国之恋\t152689\nAspects\t152690\nMcMaster\t152691\n6.31\t152692\n定陵\t152693\nVDI\t152694\n1100\t152695\n吴丽君\t152696\n泰雷兹\t152697\nEnrique\t152698\nwestern\t152699\nThrust\t152700\n信奉\t152701\n秋华\t152702\n75公斤\t152703\neditions\t152704\n蓬莱汽车站\t152705\nmlxg\t152706\n深圳市小学\t152707\n中国科学技术大学管理学院\t152708\n土建工程专业\t152709\n装甲师\t152710\noras\t152711\n廊曼国际机场\t152712\n喂奶\t152713\n焊炉\t152714\n筋痛\t152715\n核查\t152716\n非序列\t152717\n要面子\t152718\n白楼\t152719\nshiping\t152720\n伊河路\t152721\nsetup.py\t152722\n蝴蝶找花\t152723\n梅多克\t152724\nq7\t152725\n天壤之别\t152726\n电力系统\t152727\n百度拼音输入法\t152728\n鸡蛋羹\t152729\n斯卡拉\t152730\n众泰汽车\t152731\nconj\t152732\n210g\t152733\n威索尼克\t152734\n亚锦赛\t152735\n非木非石\t152736\n上海公交\t152737\n怒不可遏\t152738\n157阶\t152739\nexcel宏\t152740\n中华全国体育总会\t152741\n姬玉露\t152742\n里加上\t152743\n英语网\t152744\n宏福苑小区\t152745\n邓华\t152746\nFisheye\t152747\n1506\t152748\n二十分之一\t152749\n独立避雷针\t152750\nleaves\t152751\n134家\t152752\n善良魔女传\t152753\n柚子街\t152754\ndpn\t152755\n植物大战僵尸全明星\t152756\n狮子兽\t152757\ndockerhub\t152758\n刘畅\t152759\n八达通\t152760\n八钢\t152761\nCC2640\t152762\n饭费\t152763\n三喜\t152764\n抗税\t152765\n爱琴海购物中心\t152766\n修车\t152767\n马子\t152768\nfanout\t152769\nISO9000认证\t152770\nWTAPS\t152771\n中央库\t152772\n明白卡\t152773\n金沙江路\t152774\n设置项\t152775\n谢楼南\t152776\n三聚磷酸钠\t152777\ndivision\t152778\n2016年11月29日\t152779\n娘子关\t152780\n150台\t152781\n牛黄解毒丸\t152782\n暨大\t152783\n天津市公安交通管理局\t152784\n苏州公安局\t152785\n中专生\t152786\n无锡市政府\t152787\n小彭\t152788\n人民军医出版社\t152789\n障碍\t152790\nword格式刷\t152791\n艺术字生成器\t152792\n1879年\t152793\n0.6.0\t152794\nge200\t152795\n乘人之危\t152796\n北京社区卫生服务网\t152797\n刘老\t152798\n平流式\t152799\n肱二头肌\t152800\n0xc0000225\t152801\n可见度\t152802\nworkforce\t152803\n新广网\t152804\n背书转让\t152805\n新港东\t152806\
n蒜泥龙虾\t152807\n模神\t152808\nshishanyuan\t152809\n官方源\t152810\n20KB\t152811\n佩文\t152812\n金牛路\t152813\n先抑\t152814\n郭柯彤\t152815\n工程造价咨询企业管理办法\t152816\nOLT\t152817\n斗笔\t152818\n续航\t152819\n小妹子\t152820\n各用\t152821\n惠国\t152822\n28套\t152823\n丙烯酸羟乙酯\t152824\n截污纳管\t152825\n法学界\t152826\n魔兽rpg\t152827\nBCG\t152828\n智投\t152829\n凯芙兰\t152830\n金牌调解\t152831\n玻璃箱\t152832\n洪磊\t152833\n粤华\t152834\n自叹\t152835\n僵尸叔叔\t152836\n虎易\t152837\n两点间\t152838\n小灵通\t152839\n后边\t152840\n听心\t152841\n南通站\t152842\n兑换机\t152843\n大河镇\t152844\n两山\t152845\nVerizon\t152846\n0474\t152847\n菜牌\t152848\n碧水庄园\t152849\n0222\t152850\nrails\t152851\n沈琳\t152852\n百度地图瓦片\t152853\n斗栱\t152854\n六险二金\t152855\nCLIP\t152856\n光敏\t152857\nfm2010\t152858\nTrials\t152859\n高人民法院\t152860\n高分子聚乙烯\t152861\n演奏家网\t152862\n有机酸\t152863\n一万块\t152864\n片状\t152865\n天润酸奶\t152866\n每条\t152867\nCTS6\t152868\n简评\t152869\n见异思迁\t152870\n待机\t152871\n祭神\t152872\n中国大地保险\t152873\n5话\t152874\n银川市市场监督管理局\t152875\n赤山湖\t152876\n媚俗\t152877\n垫块\t152878\n招商办\t152879\nFaceu\t152880\nmsflexgrid\t152881\n你是人间四月天\t152882\n糖醋鲤鱼\t152883\n74名\t152884\n不见得\t152885\n碳膜\t152886\n万分之一\t152887\n诉前财产保全\t152888\n上道\t152889\n养胎\t152890\n养成文\t152891\n知方\t152892\n皮肤性病\t152893\nSHO\t152894\n055型驱逐舰\t152895\n大盘点\t152896\n机油机\t152897\n陈冠舟\t152898\n罗马音\t152899\n常州火车站\t152900\n比伯\t152901\n合理地\t152902\n脂肪酸甲酯\t152903\nparticipate\t152904\n少恭\t152905\n回帖\t152906\n卖出价\t152907\n手弩\t152908\n平安信托\t152909\n整地机\t152910\n颜末\t152911\n麼\t152912\n御赐\t152913\n0025\t152914\n永强\t152915\n东风科技\t152916\n大姑子\t152917\n马桶盖\t152918\n自然之友\t152919\n2017年5月17日\t152920\n汉斯\t152921\nZYZ足球网\t152922\n毒药\t152923\n上海1号线\t152924\n到底是\t152925\n鲁家村\t152926\n42分\t152927\n129级\t152928\n中华民国史\t152929\n低语者\t152930\n气压计\t152931\n女儿们\t152932\n长生我的小人国\t152933\n未央区\t152934\n坏脾气\t152935\n荣耀露娜\t152936\n知鸟\t152937\n文凯玲\t152938\n网页文本框\t152939\n魔体\t152940\n科学计算器\t152941\n豆制品\t152942\n财富卡\t152943\n广州大学附属中学\t152944\n老娘舅\t152945\n茂业商业\t152946\n10000公里\t152947\n单程票\t152948\nleak\t152949\n耕耘者\t152950\n反恐精英\t152951\n12378\t152952\n瑟菲
\t152953\n零配件\t152954\n消化系统疾病\t152955\nAssistance\t152956\nDOWN\t152957\n共和新路\t152958\n聚丙烯酸钠\t152959\n一场大雨\t152960\n党\t152961\n鱼柳\t152962\n超级小熊布迷\t152963\n最后还款日\t152964\n检查站\t152965\nPETA\t152966\n绝地求生回放功能\t152967\n山林\t152968\n跌停\t152969\nGad\t152970\n疾患\t152971\n跃龙街道\t152972\n第二|\t152973\n菁\t152974\n马应龙痔疮栓\t152975\nWindows批处理\t152976\n128项\t152977\n共青团员申报表\t152978\n平四\t152979\n说爱\t152980\n异形2\t152981\nMakeblock\t152982\nGPIO\t152983\n人民文学\t152984\n腹部\t152985\n顶点处\t152986\n曹操传吧\t152987\n小米充电宝\t152988\n南京中医药大学\t152989\n鎏嘉\t152990\n6r\t152991\n宜昌一中\t152992\n澜湾\t152993\n杨仪\t152994\n云计算时代\t152995\n无糖蛋糕\t152996\nioffer\t152997\n备付金\t152998\n烈火凤凰\t152999\n吕素\t153000\n注册消防工程师资格考试\t153001\n便民服务中心\t153002\n蛇盘疮\t153003\nhiv\t153004\n宏力\t153005\n协字\t153006\norokin\t153007\n注册计量师考试\t153008\n20151010\t153009\n乐Pro\t153010\nFRAN\t153011\n第108集\t153012\n模件\t153013\n费城76人\t153014\n光阴\t153015\n嘉善县政府\t153016\n征象\t153017\n抽射\t153018\n我买网\t153019\n51.6%\t153020\n秣周东路\t153021\n飞鹅\t153022\nredis分布式锁\t153023\n苏迪曼杯\t153024\n同域\t153025\neNSP\t153026\n忘川\t153027\n永冬\t153028\nbootcss\t153029\nNUMBERS\t153030\n嗤之以鼻\t153031\n莉娜\t153032\n黄喉拟水龟\t153033\n清风明月\t153034\n第15节\t153035\n方鸿渐\t153036\n便便\t153037\n挂件\t153038\n灰舞\t153039\nKSC\t153040\n阳光新路\t153041\n632米\t153042\n苹果电脑浏览器\t153043\nOGRE\t153044\n兜裆布\t153045\n国家卫生计生委办公厅\t153046\nfastqc\t153047\n旅业\t153048\nIT类\t153049\nvsm\t153050\n枪式摄像机\t153051\n闲闲\t153052\n熊兵\t153053\nweave\t153054\n365二手房网\t153055\n周洪宇\t153056\n3万个\t153057\n沧澜\t153058\n不孤\t153059\n鸵鸟\t153060\nAVPlayer\t153061\n质论\t153062\n弥月\t153063\n缉私\t153064\n塑业\t153065\nWALKMAN\t153066\n90b\t153067\n班禄\t153068\n清洁牙齿\t153069\n海洋时代2\t153070\n为谁\t153071\nCracking\t153072\n兰香子\t153073\n腐植酸\t153074\n万兵\t153075\n帕杰罗V97\t153076\n罩盖\t153077\n巽\t153078\n证卡\t153079\n10P\t153080\n座椅\t153081\n日逼\t153082\nMWeb\t153083\n楼台会\t153084\n504所\t153085\n晚樱\t153086\n成渝铁路\t153087\n北卡罗来纳州\t153088\n江苏路\t153089\n冷冻\t153090\n东北菜\t153091\n罗威纳吧\t153092\nnani\t153093\n数字创意产业\t153094\n篆体字\t153095\n北京西二旗\t153096\n割炬\t
153097\nLapse\t153098\n盛举\t153099\ndepreciation\t153100\nIMX\t153101\nOoO\t153102\n在线计算器\t153103\n卧龙区\t153104\n清微\t153105\nEDUP\t153106\n超过10秒\t153107\n桨\t153108\n拉菲2\t153109\n重庆市环境保护局\t153110\n太太们\t153111\n复仇者联\t153112\n蛇蝎女佣\t153113\n洗发液\t153114\n沙沙\t153115\n班旗\t153116\n希威社\t153117\n张刚\t153118\n顺民\t153119\n对比度\t153120\near\t153121\n高光粉\t153122\ncc2017\t153123\n安哥拉\t153124\n我的世界天骐\t153125\n期货排名网\t153126\n麻枝准\t153127\n涉企\t153128\n党参\t153129\nJlink\t153130\n斜孔\t153131\n合肥市人民政府\t153132\nDescent\t153133\nparaview\t153134\n|资\t153135\npydub\t153136\n长江三角洲\t153137\nFurry\t153138\n工作服装\t153139\n航线\t153140\n江财\t153141\n古香\t153142\nsarina\t153143\n扶正器\t153144\n法兴银行\t153145\n姜峰\t153146\n2.12.5\t153147\n煤浆\t153148\nDoo\t153149\n堪萨斯州\t153150\nic芯片\t153151\nkann\t153152\n爱的味道\t153153\n芸芸\t153154\n39健\t153155\n北拓\t153156\n时装秀\t153157\n护肤水\t153158\n法律咨\t153159\n梦幻模拟战5\t153160\n1mbps\t153161\n德朗\t153162\nbuildings\t153163\ntobe\t153164\nEncounter\t153165\n涩涩\t153166\n点读版\t153167\n瓶贴\t153168\nMKV720P\t153169\n止息\t153170\n电教\t153171\n普通版\t153172\nath\t153173\n妻味\t153174\n轻集料\t153175\n练习生们\t153176\n风月片\t153177\n晓梦\t153178\nlol邪恶漫画\t153179\n不耻\t153180\nsarvar\t153181\n酒界\t153182\n眼贴\t153183\ntra\t153184\n催要\t153185\nCAMOZZI\t153186\n胜券在握\t153187\n下沙村\t153188\nIndexOf\t153189\n黑庙\t153190\n罗技g910\t153191\n乐驰论坛_汽车之家论坛\t153192\n比及\t153193\n诸生\t153194\n溢于言表\t153195\n肺气肿\t153196\n血气\t153197\n心理类\t153198\n杜克\t153199\n陈也\t153200\n百度霸屏\t153201\n打印页\t153202\n优德w88\t153203\n权侑莉\t153204\nDVDFab\t153205\n氖\t153206\n汉坤\t153207\n错点鸳鸯戏点鸳鸯\t153208\n月夕\t153209\n电吉他论\t153210\n昭熙\t153211\n报晓\t153212\n笔力\t153213\n阿凡\t153214\n质检员\t153215\nBTP\t153216\nneta\t153217\n畅玩\t153218\n昆明小区\t153219\n矢志不渝\t153220\nbool变量\t153221\n50头\t153222\n大圣传\t153223\ndreamviewer\t153224\n无信不立\t153225\n模板\t153226\n五郎\t153227\n岳父\t153228\n消费者\t153229\n阿萨谢尔\t153230\n春游记\t153231\n高伟光\t153232\n第二首\t153233\n专属\t153234\n苏勒亚其其格\t153235\n17磅\t153236\n词人\t153237\nIPRAN\t153238\n扬农\t153239\ncountifs\t153240\n翼猫科技\t153241\n看看\t153242\n亿利资源集
团\t153243\n姮\t153244\n妙手\t153245\n铝碳酸镁\t153246\n致郁\t153247\n天空之山\t153248\nPrometheus\t153249\nzii\t153250\n天涯er\t153251\nshuang\t153252\nRenegade\t153253\n人本\t153254\n捕蝇草\t153255\nCX5\t153256\n中央新闻纪录电影制片厂\t153257\n支气管肺炎\t153258\n花粉管\t153259\n杨猛\t153260\n购方\t153261\n绒毯\t153262\n目的性\t153263\n欧莱\t153264\nbundled\t153265\n青岛特锐德电气股份有限公司\t153266\n大一寸\t153267\n11日游\t153268\n轨迹\t153269\n麦克尔\t153270\n茶服\t153271\n阿米\t153272\n桂圆干\t153273\n浮法玻璃\t153274\n五里庙\t153275\n魔女アンネロ\t153276\ngfortran\t153277\n樱花路\t153278\n4.5mm\t153279\nui库\t153280\n浙江恒逸集团有限公司\t153281\n商业车险\t153282\n岚少\t153283\nedc音乐节\t153284\nhelen\t153285\n80070652\t153286\nDarling\t153287\nv60\t153288\n双子楼\t153289\n太二\t153290\n沈达\t153291\n光彩\t153292\nMacros\t153293\n《武汉大学》\t153294\n成都大熊猫基地\t153295\nxshell\t153296\n皖东\t153297\n认字\t153298\n卵壳\t153299\n可保\t153300\n海贼手办\t153301\nshadows\t153302\n中国科学院上海分院\t153303\n主域名\t153304\n县村\t153305\n首台套\t153306\nyejin.huangye88.com\t153307\nbeoplay\t153308\nDA14580\t153309\n因子\t153310\n12.1.0\t153311\n国高\t153312\n皈\t153313\nConvenience\t153314\n四川省骨科医院\t153315\n雅居乐滨江国际\t153316\n不到场\t153317\n试棒\t153318\nshai\t153319\n捧捧场\t153320\nuvision4\t153321\n电母\t153322\npureftp\t153323\n中签率\t153324\nHallelujah\t153325\n豆腐串\t153326\nDDB\t153327\n张天\t153328\nmil\t153329\n房东家\t153330\n梦之蓝\t153331\n紊\t153332\nbanks\t153333\n小鹰\t153334\n立场\t153335\n詹氏\t153336\nHBS\t153337\nSedan\t153338\n700斤\t153339\n中位线\t153340\ndede模板\t153341\nneospeech\t153342\n岳家\t153343\n亚都空气净化器\t153344\n污染性\t153345\n溧阳路\t153346\n第一首歌\t153347\nckplayer播放器\t153348\n贸易业\t153349\n幻贝\t153350\n电影策划_电影网\t153351\n保安证\t153352\n洋泾\t153353\n微医集团\t153354\n┊投行\t153355\n开谈\t153356\n自助店\t153357\nnespresso\t153358\n白金汉\t153359\n郑艺\t153360\nX3650\t153361\n五月一日\t153362\n生命学院\t153363\n看报\t153364\n1.7万\t153365\n中公教育广西分校\t153366\nangelo\t153367\n梦幻诛仙手游仙缘\t153368\n狱中龙\t153369\npredicts\t153370\n夯土墙\t153371\nprof\t153372\n拾味\t153373\n孕妇奶粉\t153374\n谭腿\t153375\n水货\t153376\n1906年\t153377\n15款\t153378\n统筹方法\t153379\n丐萝\t153380\n广州农商行\t153381\n马鞍山东\t153
382\nicebox\t153383\n入河\t153384\n安神补脑液\t153385\n无分\t153386\n武汉地铁10号线\t153387\n北京狗民俱乐部\t153388\n荔枝丹\t153389\n张雅茹\t153390\n定调\t153391\n201606\t153392\n采儿\t153393\nSMR\t153394\n新希望六和\t153395\n马思唯\t153396\n中国电子科技集团\t153397\n印尼虎\t153398\nGives\t153399\nrosrun\t153400\n大沥镇\t153401\n腾腾\t153402\n抗日悲惨世界\t153403\n派位\t153404\n加氢催化剂\t153405\n天目\t153406\n防水胶\t153407\nJohnston\t153408\n1689\t153409\n酒饮\t153410\nwinfo\t153411\n松阪\t153412\n1197\t153413\n324路\t153414\nGu\t153415\n单桂敏\t153416\nrou文\t153417\nopensearch\t153418\n疯狗\t153419\n百斤\t153420\n诊疗\t153421\n点击器\t153422\n数学科学学院\t153423\n篮坛\t153424\nf-36044074\t153425\n乖把\t153426\n95518\t153427\n方体\t153428\n订正\t153429\n液晶显示器\t153430\n2018049期\t153431\n帕特森\t153432\n璜\t153433\n通体砖\t153434\n北桥\t153435\n一建证\t153436\n电死\t153437\nmiami\t153438\n丽格\t153439\n德耀中华\t153440\n柯镇恶\t153441\n思博伦\t153442\n赫拉克勒斯\t153443\n牛杂\t153444\n秋李子\t153445\n车床工\t153446\nBarriers\t153447\n14.2%\t153448\n双频路由\t153449\n痴迷\t153450\n郭芙蓉\t153451\n360导航\t153452\ncigarettes\t153453\n2018世界地球日\t153454\n北京协和医学院研究生院\t153455\n布票\t153456\n卡牌篇\t153457\n天津图书大厦\t153458\n三组\t153459\n英语作文\t153460\nalloca\t153461\nTANG\t153462\n贵溪市\t153463\n黄铁矿\t153464\n超感八人组\t153465\nmutual\t153466\n纳闷\t153467\n陈传席\t153468\n友田彩也香\t153469\n双频\t153470\n天南地北\t153471\n5712\t153472\n取士\t153473\n氨基苯磺酸\t153474\n矢\t153475\nrcd\t153476\n第63集\t153477\npkcs1\t153478\nviewing\t153479\n肩袖损伤\t153480\n虹吸效应\t153481\n反射性\t153482\nXiuRen\t153483\n惊天魔盗\t153484\n空想社会主义\t153485\n睿品\t153486\nMULTISIM\t153487\n不涉\t153488\n大集合\t153489\n状况\t153490\n¨\t153491\n隔绝\t153492\n华夏银行信用卡中心\t153493\n储备金\t153494\n卦象\t153495\n米豆腐\t153496\nMapReduce\t153497\n天魁币\t153498\n奇遇\t153499\n蓮実クレア\t153500\n1959年\t153501\n歼10\t153502\nPrimer\t153503\n链器\t153504\n枪手\t153505\n长春二道区\t153506\n耵聍\t153507\n5P\t153508\n街道办\t153509\n瀬奈\t153510\n环球免税店\t153511\n绝缘线\t153512\n爱玩客\t153513\nvideo播放器\t153514\n江小湖\t153515\n诉愿\t153516\n登仙\t153517\n金永\t153518\n安财\t153519\nJaspersoft\t153520\n中二病也要谈恋爱\t153521\n梗概\t153522\n车系\t153523\n广东人社\t153524\n梵蜜琳\t153525
\nARCGIS\t153526\n昆山开发区\t153527\n复仇者联盟3吧\t153528\n公狮\t153529\n十颗\t153530\nmvnw\t153531\nnba吧\t153532\niPod\t153533\n德智体\t153534\nフラワ\t153535\n投石机\t153536\nasio4all\t153537\n场强\t153538\n农业合作社\t153539\n大叶公路\t153540\n龙湖葡醍海湾\t153541\n陈世美\t153542\n泾县人民政府\t153543\n北京烤鸭\t153544\n3个月\t153545\n三平寺\t153546\n快乐阳光原创少儿歌曲\t153547\n阔脚裤\t153548\n八卦洲\t153549\n乐泰胶\t153550\n塑料助剂\t153551\n袜\t153552\n浅绛\t153553\n建筑工程保险\t153554\n信达泰禾\t153555\n林立果\t153556\n稿费\t153557\n番茄花园\t153558\n单元测试卷_奥数网\t153559\n石老师\t153560\n手绘学习\t153561\n土尔其\t153562\n上海市徐汇区人民政府\t153563\nDetention\t153564\n丽格海棠\t153565\npeony\t153566\n守林人\t153567\n剑灵南天国\t153568\n草灯\t153569\n开档\t153570\n牛驼温泉孔雀城\t153571\nyou+\t153572\n双臂\t153573\nVeterinary\t153574\n刘继卣\t153575\n_特玩网\t153576\n森兰丸\t153577\n天下足球\t153578\n秘书工作\t153579\n电动车充电器\t153580\npractise\t153581\nmirror\t153582\n阿拉希\t153583\n乐丰\t153584\n太阁5\t153585\n卡隆\t153586\nAsyncHttp\t153587\n霆\t153588\n平重\t153589\n戏剧影视文学专业\t153590\n轻文\t153591\n鞋尖\t153592\n钢丝绳\t153593\nAngkor\t153594\n弥赛亚\t153595\n兔爷\t153596\n中燃\t153597\n复用段\t153598\n阴险\t153599\n山东航空股份有限公司\t153600\n湖北话\t153601\n时间序列分析\t153602\n孔雀蓝\t153603\n船船\t153604\ne舞\t153605\n张建荣\t153606\n意向性\t153607\n突水\t153608\n刑名\t153609\n导航模块\t153610\n堂屋\t153611\n十七年\t153612\n颜骏凌\t153613\nM6800\t153614\n窗框\t153615\n衡南县\t153616\n中国驻日本大使馆\t153617\n圣斗士星矢冥王篇\t153618\n暗卫\t153619\n人皮客栈\t153620\n33张\t153621\n20160222\t153622\n降准\t153623\n魔法少女伊莉雅\t153624\n董事费\t153625\nrunner\t153626\n摩托车展\t153627\n白颖\t153628\n娘炮\t153629\n质量度\t153630\n北京八达岭\t153631\n一多\t153632\n半命题作文\t153633\n小童\t153634\n福建行政学院\t153635\n20世纪20年代\t153636\n馆阁体\t153637\nBerkshire\t153638\n合用\t153639\n11朵\t153640\n180309\t153641\nJavaCrazyer\t153642\n东莞汽车总站\t153643\n知也\t153644\n100本\t153645\n鲁伊科斯塔\t153646\n3月19日\t153647\nImmune\t153648\n矿厂\t153649\n雷小天\t153650\n西安注册公司\t153651\n省区\t153652\n电磁流量计\t153653\n本地机\t153654\nbeatsX\t153655\n张渚\t153656\nTIIDA\t153657\nspelling\t153658\n湖南省卫计委\t153659\n埋板\t153660\n天天射综合网\t153661\n剑网1\t153662\n中国水利水电第八工程局有限公司\t153663\n论股\t153664\n200吨\t153665\nD
eveWork\t153666\n东山镇\t153667\nGprinter\t153668\n中国一分钟\t153669\n铠甲\t153670\n喽啰\t153671\n面具\t153672\nopenSUSE\t153673\n陈小花\t153674\n沙河股份\t153675\n定温\t153676\n圆雕\t153677\n蒲蒲兰\t153678\n兴义市人民政府\t153679\n中共中央党校研究生院\t153680\n上海嘉里中心\t153681\n质量监督局\t153682\n电阻箱\t153683\nfspecial\t153684\n龙园小区\t153685\n填\t153686\n汉剧\t153687\n边齿\t153688\n韩东\t153689\n吸烟有害健康\t153690\n择校\t153691\n易象\t153692\n2510\t153693\n真性红细胞增多症\t153694\n黄红英\t153695\n会城\t153696\n古字\t153697\n惠普g4\t153698\nexotic\t153699\n巨蟹女\t153700\n新民政府\t153701\n预制率\t153702\n素可泰\t153703\n众妙\t153704\n想你\t153705\n刘彻\t153706\n罗玉凤\t153707\n论政\t153708\nFETISH\t153709\n都昌县\t153710\n锦和康城\t153711\n顾顺章\t153712\n计划生育法\t153713\n百余年\t153714\nWindows8技\t153715\n奔腾处理器\t153716\n西安工业大学北方信息工程学院\t153717\n过户费\t153718\n歌姬\t153719\npun\t153720\n园林小区\t153721\nguojia\t153722\n索票\t153723\nsymantec\t153724\n不胜枚举\t153725\n苗疆道事\t153726\nubun\t153727\n卢照邻\t153728\n湖州人才网\t153729\n22型\t153730\n黑色四叶草\t153731\n伎町\t153732\n六味地黄\t153733\n企跃网\t153734\n青行灯\t153735\n双音\t153736\n不要说谎\t153737\n凉生\t153738\n阿含经\t153739\n串色\t153740\n赵光义\t153741\n喉咙处\t153742\n雷朋眼镜\t153743\n321\t153744\n1袋\t153745\n华东\t153746\n血脂康胶囊\t153747\n大话3\t153748\n桌球\t153749\n网板\t153750\n亮亮\t153751\nLydia\t153752\nencrypted\t153753\n工科\t153754\n16春\t153755\n夸口\t153756\n可食\t153757\n撼路者\t153758\n长航\t153759\n绥中县\t153760\n西红柿汁\t153761\n药园\t153762\n野蜂\t153763\n河北经贸大学\t153764\n钉板\t153765\n方敏雅\t153766\n韩\t153767\n颈椎增生\t153768\n戈恩\t153769\n法拉盛\t153770\nnick\t153771\nsqlsession\t153772\nThor\t153773\n井盖\t153774\nIrfanView\t153775\n色膜\t153776\n地方各级人民政府\t153777\n圣白莲\t153778\n卸装\t153779\nCSR2\t153780\n信控\t153781\nperturbation\t153782\nobc\t153783\n老北京鸡肉卷\t153784\n居心叵测\t153785\n金风\t153786\n易视网\t153787\n乾嘉\t153788\n石榴树\t153789\n海雾\t153790\n思佳\t153791\n就是爱\t153792\nkansas\t153793\n10处\t153794\nFlat\t153795\n名垂青史\t153796\nAutoC\t153797\n招贴\t153798\n伦敦金\t153799\n异质性\t153800\n一年级\t153801\n5918\t153802\nHaiNan\t153803\n哇啦\t153804\n0.4.5\t153805\n凯奇集团\t153806\n多发性硬化症\t153807\n102条\t153808\n山东省政府\t153809\n居民委员会\t1538
10\n森重宽\t153811\n巨枪\t153812\n指染冒牌大英雄\t153813\n王十朋\t153814\n中日\t153815\n黑域\t153816\n20.8\t153817\n验房师\t153818\n2014.4\t153819\n回路图\t153820\n李生\t153821\n须\t153822\n中间产品\t153823\n磷酸西格列汀片\t153824\n分压\t153825\nXFX\t153826\n屏机\t153827\nHPLC\t153828\ncocos\t153829\n数据位\t153830\n白轴\t153831\n唐山乐亭\t153832\n客|运动\t153833\n赛尔达传说\t153834\n上海华东师范大学\t153835\nCycle\t153836\n补天志\t153837\n解缠\t153838\n闻见\t153839\n狗链\t153840\n湖南省工商行政管理局\t153841\n史蒂\t153842\n星子\t153843\n90多万\t153844\nCalm\t153845\n生产安全事故\t153846\n四氟垫片\t153847\n甄稀\t153848\nrac\t153849\n托里\t153850\ncarpenter\t153851\n后世\t153852\n控费\t153853\n董振堂\t153854\n神剑伏魔录\t153855\nzywscq\t153856\n30大酒店\t153857\n三家村\t153858\n阳江政府网\t153859\n惮\t153860\n陈丽君\t153861\n宁波人才网\t153862\n南宁国家经济技术开发区\t153863\nnfl\t153864\n上海商业地产\t153865\n服装制版\t153866\n荣耀5X\t153867\n周维清\t153868\nPIL库\t153869\n一片式\t153870\n成都华西医院\t153871\nPAK\t153872\n周樂\t153873\n保护性\t153874\n清零\t153875\n锦州百姓网\t153876\n20180110\t153877\nk650d\t153878\n小飞人\t153879\n好多张\t153880\n艾柯\t153881\n香玉\t153882\n四维空间\t153883\n国务院\t153884\n1300342\t153885\n禁忌_百草录\t153886\n直排式\t153887\n薄熙来案\t153888\n市场导向\t153889\n王哥\t153890\n广东省汽车客运站\t153891\n烧鸡\t153892\n李朝阳\t153893\n梧桐叶\t153894\n热加工工艺\t153895\n紫金西苑\t153896\n4px\t153897\nshengji\t153898\n仁波\t153899\nkepler\t153900\n5克\t153901\n大踏步\t153902\n10-12年\t153903\n口角\t153904\n商转公贷款\t153905\n决\t153906\n华山论剑\t153907\n里诺\t153908\n鸡沙漠\t153909\n雷斯特\t153910\n帝都\t153911\n黄金樟\t153912\n千分之六\t153913\n瑞声科技\t153914\n寂静岭吧\t153915\n天天有喜2之人间有爱\t153916\n5口\t153917\n下营镇\t153918\n杰兰特\t153919\nRSM\t153920\n一个千\t153921\n三等\t153922\n磁盘分区\t153923\n眼力\t153924\n人贩子\t153925\n比较大小\t153926\n仙\t153927\n魅斑\t153928\nWWW.QIWEN8.NET\t153929\n苏健\t153930\n九寨沟\t153931\n扩权\t153932\n何晓群\t153933\nacb\t153934\n人防费\t153935\n青娱\t153936\nGBW\t153937\n囚人\t153938\n合景泰富\t153939\n16进制转10进制\t153940\n汕头站\t153941\n杨振武\t153942\n高新区管委会\t153943\n御猫\t153944\nTalent\t153945\n美变\t153946\n四明湖\t153947\n专审\t153948\n浙江省水利水电勘测设计院\t153949\nBDD\t153950\n郑和远航\t153951\nLibevent\t153952\n魔兽争霸3冰封王座\t153953\nPavilion\t15
3954\n68名\t153955\n0.0.7\t153956\n刘空青\t153957\n新亚\t153958\n金雨\t153959\n液体\t153960\n拜宠\t153961\n弧菌\t153962\n廖伟棠\t153963\n粉针\t153964\n使命勇\t153965\n乐羊\t153966\n隆昌市人民政府\t153967\n0处\t153968\nmbstring\t153969\n拉板\t153970\n军卫\t153971\n时时刻刻\t153972\n日月心诚\t153973\nPOC\t153974\n小儿内科\t153975\n上海交通大学船舶海洋与建筑工程学院\t153976\n48亿\t153977\n三千米\t153978\n合金钢\t153979\n500E\t153980\n周宁\t153981\n马自达8\t153982\nOffice\t153983\n精灵王传说\t153984\n花棍\t153985\n忠于职守\t153986\n索拉卡\t153987\n痛苦\t153988\n蝙蝠洞\t153989\n北斗无双\t153990\n直流稳压电源\t153991\nFeminism\t153992\n危险报警闪光灯\t153993\n地高辛\t153994\nRugged\t153995\n泰州市公安局\t153996\n望龙\t153997\nrequestbody\t153998\nFEH\t153999\n枇杷叶\t154000\n深圳平安金融中心\t154001\n踏破\t154002\n山东如意集团\t154003\n望奎\t154004\n燃烧军团\t154005\n在职硕士\t154006\n捷易通\t154007\n这个字\t154008\n无限歌谣季#\t154009\nOPC服务器\t154010\n张超\t154011\nairway\t154012\nStanley\t154013\n徐迅\t154014\n商城众网\t154015\n从始至终\t154016\nnohup\t154017\n粉衣\t154018\n12th\t154019\n36道\t154020\n平清\t154021\n恐鸟\t154022\n鲁班水库\t154023\n周波\t154024\n浒山\t154025\nvanessa\t154026\n吹逼\t154027\n云南\t154028\nBreath\t154029\n君临臣下\t154030\n置换\t154031\n图幅\t154032\n傲城\t154033\n赤峰市人民政府\t154034\n宝雕\t154035\nkay\t154036\n设计室\t154037\n精算\t154038\nHttpRequester\t154039\n天语SX4\t154040\n网文\t154041\n四晚\t154042\n毕业感言\t154043\n鬼妈妈\t154044\n华凯\t154045\n灯红酒绿\t154046\n小默\t154047\n电压源\t154048\n菩提根\t154049\n坦克大战吧\t154050\n银海区\t154051\nscp基金会\t154052\n落石\t154053\n吴向东\t154054\n知易行难\t154055\n海滨城\t154056\n史丹\t154057\n双龙潭\t154058\n吃饱了\t154059\n课程与教学论\t154060\n法治\t154061\n未熟\t154062\n杨建国\t154063\n可查\t154064\nWiktionary\t154065\n周家嘴路\t154066\n重要度\t154067\n植体\t154068\n君道\t154069\n王芗斋\t154070\n华电国际电力股份有限公司\t154071\nImprovements\t154072\nstm32f051\t154073\n欠王\t154074\n建峰\t154075\n狗棍\t154076\n拖档\t154077\nC51\t154078\nosmo+\t154079\n眉清目秀\t154080\n传奇归来\t154081\n开盒\t154082\n惊群\t154083\n可知\t154084\nCalico\t154085\n陕西煤业\t154086\n以人为镜\t154087\n许昌新区\t154088\n联盛\t154089\n鲁灰\t154090\n比心\t154091\n98世界杯\t154092\n乜\t154093\ncru\t154094\n中能东道\t154095\n翻倒\t154096\ntcgames\t154097\nV2|linux\t154
098\n晨景\t154099\n前两月\t154100\n汇东\t154101\nSQL2005\t154102\n赤井秀\t154103\nleifeng\t154104\n无杆气缸\t154105\n去年11月\t154106\n可不是\t154107\n景谷县\t154108\nTOTO\t154109\n万寿寺\t154110\n东汉\t154111\n知中\t154112\n荐书\t154113\n工代\t154114\n淘宝买家秀\t154115\nGBS\t154116\n10AT\t154117\n词话\t154118\n东京酒店\t154119\n加工资\t154120\n1万平米\t154121\n谷品\t154122\nexia\t154123\nA9VG\t154124\n顺丰物流\t154125\n100kb\t154126\n互连\t154127\n中国传媒大学\t154128\n降压变压器\t154129\n桃毛\t154130\n新邱\t154131\n狮子桥\t154132\n启后\t154133\n民族证券\t154134\n四十部\t154135\n诛神\t154136\nR&B\t154137\nsb0060\t154138\ncolleague\t154139\ncftool\t154140\nSEED\t154141\n阿宾\t154142\n97dyy\t154143\n巨兵\t154144\n苍翼\t154145\n越位\t154146\n4月25日\t154147\n陕西省工商局\t154148\n等比数列{an}\t154149\n开示\t154150\n棣棠\t154151\n跑跑卡丁车\t154152\n出了错\t154153\nconcentrated\t154154\n游戏攻略_海峡网\t154155\n智讯\t154156\n李建生\t154157\n世嘉土星\t154158\n涛\t154159\n安井食品\t154160\n不播\t154161\nH110\t154162\n泰兴\t154163\n自洽\t154164\n过细\t154165\nloss\t154166\n16支\t154167\n58同城租房网\t154168\n字符转义\t154169\nHolika\t154170\n瓷抛砖\t154171\n多盈\t154172\n扩折\t154173\n茶毫\t154174\n5t\t154175\n狮子城\t154176\n新丁\t154177\n烤鳗\t154178\n弱受\t154179\n特奖\t154180\ncab\t154181\n经委\t154182\n信宿\t154183\n白云新市\t154184\n鬼火\t154185\n2009-2010年度\t154186\n沙障\t154187\n大扫除\t154188\n吴月\t154189\n真不错\t154190\n龙泉青瓷\t154191\n厨宝\t154192\n36e\t154193\n人仔\t154194\n13小时\t154195\n广东省交通集团\t154196\n字符\t154197\n无线充电宝\t154198\nwin10病毒\t154199\n售汇\t154200\n答题器\t154201\n金贝\t154202\n网赌\t154203\n推特\t154204\n莫西干\t154205\n迅雷高清下载BT\t154206\nQLabel\t154207\n佳能mg3620\t154208\n支付能力\t154209\n流动加油车\t154210\n云灾备\t154211\n炮筒\t154212\n女大男\t154213\n金环\t154214\n户籍化\t154215\n天秤\t154216\n青龙\t154217\nkardon\t154218\n我爱网\t154219\nwww.fanwen99.cn\t154220\n揽收\t154221\ncreo吧_\t154222\nnameless\t154223\n沈阳市中级人民法院\t154224\n万孚\t154225\n上杭县\t154226\n曲靖日报\t154227\nmangos\t154228\n1430\t154229\n新府\t154230\n369万\t154231\n偶得\t154232\n叶瑛士\t154233\n男兄弟\t154234\np52s\t154235\nntfs\t154236\n妞妞\t154237\n关门\t154238\n高看\t154239\n佣兵团\t154240\n神池县\t154241\n茶山\t154242\n云和新闻网\t154243\n露一手\t154244
\ncote\t154245\n茫然\t154246\n杀毒\t154247\n社会工作者考试网\t154248\n一圈\t154249\n主流\t154250\n450万元\t154251\nスト\t154252\n李利群\t154253\n做活\t154254\n企鹅人\t154255\nERDAS\t154256\n云掌柜\t154257\nASIC\t154258\n表现力\t154259\n_华瑞源\t154260\n测绘类\t154261\n嘟嘟噜\t154262\nFEP\t154263\n同酬\t154264\n分散机\t154265\n御女\t154266\n荒古太\t154267\n网考\t154268\n泌尿系统疾病\t154269\n滨州\t154270\n评员\t154271\n刑法\t154272\n三普\t154273\n世界舞台\t154274\n梦游\t154275\n游久资料社\t154276\n几秒\t154277\ndmz\t154278\n亚硫酸\t154279\n抓党建\t154280\n顺意\t154281\n威创股份\t154282\n持诵\t154283\nX3\t154284\nplus版\t154285\n东单公园\t154286\n恍如隔世\t154287\n秦孝公\t154288\n创维E900\t154289\n情兽\t154290\n_文山\t154291\nbtu\t154292\n2018年一月\t154293\n无名氏\t154294\n赤沙\t154295\n600871\t154296\n封装性\t154297\nflatMap\t154298\n0:00:00\t154299\n对勾\t154300\nRECOVER\t154301\n陈蓉晖\t154302\n黄炳誓\t154303\n优利德\t154304\n光束\t154305\n道光通宝\t154306\n核试验\t154307\nELB\t154308\n240TURBO\t154309\n士兵突击\t154310\n双保\t154311\n延吉路\t154312\nHeroic\t154313\nCuisine\t154314\nTragedy\t154315\n1.23\t154316\n合肥市区\t154317\n间位\t154318\nA栋\t154319\nreceiver\t154320\n下自成蹊\t154321\n祖灵\t154322\n闵锐\t154323\n兴安县\t154324\n毛布\t154325\n丽江小倩\t154326\n寄货\t154327\n圣裁\t154328\n_手工帝网\t154329\n6.53\t154330\n西游降魔\t154331\n林跃\t154332\n1789\t154333\n成长类\t154334\n卓少\t154335\nhv\t154336\n魅族PRO7\t154337\n商业体\t154338\nysl\t154339\n知行合一\t154340\n邪恶少女漫画_无翼鸟少女漫画大全集\t154341\nMTCNN\t154342\n好嗨\t154343\n色目人\t154344\na556u\t154345\n丑爷\t154346\n拉克西里\t154347\n电流变送器\t154348\n润滑性\t154349\n20xx\t154350\n犼\t154351\n玖富叮当\t154352\n子明\t154353\n水物\t154354\n伊格尼斯\t154355\n结清\t154356\n南岗镇\t154357\nDolby\t154358\ntfc\t154359\n隐适美\t154360\n郭涛\t154361\n城南街道\t154362\n李昂\t154363\n雅利安人\t154364\n蒙圈\t154365\n符文工房\t154366\nPATA\t154367\n贴上\t154368\n大宁路\t154369\n森宇\t154370\n抓住\t154371\nlync\t154372\n企业集团财务公司\t154373\nKeil4\t154374\n金宏\t154375\n李忘生\t154376\n20141030\t154377\nJYP\t154378\n疑人\t154379\n王明道\t154380\n舌癌\t154381\n螺丝机\t154382\n470\t154383\n海门市人民政府\t154384\n2016年7月2日\t154385\n20151124\t154386\n漯河市\t154387\n净土宗\t154388\n包皮环切\t154389\n令人满意\t154390\n多方面\t154
391\n等价类划分法\t154392\nwenku\t154393\n玩客币\t154394\n265G安卓网\t154395\n绘画展\t154396\n600990\t154397\n19.8元\t154398\n按月\t154399\n15万美元\t154400\n日月广场\t154401\n魏小涵\t154402\nMM52.com\t154403\n云淘\t154404\n南山文体中心\t154405\n城市规划学院\t154406\n絮语\t154407\n1060ti\t154408\n蹦蹦床\t154409\n价值论\t154410\n照看\t154411\n规章制度\t154412\n冰川时代5:星际碰撞\t154413\n借助\t154414\n美女间谍\t154415\n鬼斧\t154416\n政校\t154417\n7排\t154418\n第4次\t154419\n益格\t154420\n本题\t154421\n抓真干\t154422\nContainer\t154423\nwarn\t154424\n纳拉克\t154425\n蜜桃芥子\t154426\n青钱柳茶\t154427\n托斯卡纳艳阳下\t154428\n音悦Ta\t154429\nwecenter\t154430\n厚谱教育访问学者网\t154431\n一表\t154432\ncarbide\t154433\n不全\t154434\n登临\t154435\n广州暨南大学\t154436\n侧脸\t154437\n扇形统计图\t154438\n崔佛\t154439\n燃气阀\t154440\n包轻\t154441\n卫宁\t154442\n商合杭铁路\t154443\n残疾证\t154444\n王朝阳\t154445\n2第一章\t154446\n青岛海湾大桥\t154447\nIDEAL\t154448\n重分\t154449\ntheme\t154450\n马克·沃尔伯格\t154451\n众志\t154452\n教育学专业\t154453\n果麦\t154454\nanyone\t154455\n贪吃蛇游戏\t154456\n载明\t154457\n莱蒙湖\t154458\n筋膜炎\t154459\nJ6L\t154460\n闲聊\t154461\n南平市环保局\t154462\n军用版\t154463\n148万\t154464\n比雷埃夫斯港\t154465\nmbi\t154466\n广德吧\t154467\n壳型\t154468\n鸿洲\t154469\nHOLD不住\t154470\n白云湖\t154471\ndim\t154472\n16包\t154473\n杀人犯\t154474\n他克莫司胶囊\t154475\n红壤\t154476\n波形图\t154477\n头哥\t154478\ncony\t154479\nskipped\t154480\n北站\t154481\n南瓜子\t154482\n兰州市财政局\t154483\n娱乐性\t154484\n二战前线2无敌版\t154485\nvmd\t154486\n港都\t154487\nl7\t154488\n尘梦\t154489\n黑工\t154490\n武藏野美术大学\t154491\n张艺馨\t154492\nNYT\t154493\n夫唱妇随\t154494\n人仙\t154495\n苯乙烯\t154496\n100欧元\t154497\n列文虎克\t154498\n外史\t154499\n哈哩\t154500\n完美无瑕\t154501\n指导处\t154502\n安徽建筑大学城市建设学院\t154503\n品牌策划\t154504\n喀斯特\t154505\n绅宝X25\t154506\n铍铜\t154507\n全景网\t154508\n修斯\t154509\n尼山书院\t154510\nDev-C++\t154511\n尼尔:机械纪元\t154512\n选拔性\t154513\nflagship\t154514\n2ab\t154515\n第二回\t154516\n600px\t154517\n五虎\t154518\n广州市朝德机电设备有限公司\t154519\ndefault\t154520\n中科院计算所\t154521\n事态\t154522\n纳萨\t154523\n蜈蚣辫\t154524\n番号搜索器\t154525\n曼施坦\t154526\n艾怡良\t154527\nthreaten\t154528\n必收\t154529\nREDIS\t154530\n十回\t154531\nbibibi\t154532\n北京市法院\t154533\n首
栋\t154534\n顺丝\t154535\nST-LINK\t154536\n2017年1月1日起\t154537\n世通\t154538\n末尾处\t154539\n象形字典\t154540\nnote4X\t154541\n倒钩\t154542\netax\t154543\n湘桂\t154544\n调整后\t154545\n红石林\t154546\nspg俱乐部\t154547\n170斤\t154548\n苏州观山\t154549\n八里村\t154550\n翦伯赞\t154551\n初一分\t154552\n一个一元\t154553\ndefect\t154554\n蚂蚁s9\t154555\n7吨\t154556\n青皮\t154557\n济南高新万达广场\t154558\n百丽国际\t154559\nYou.mp3\t154560\n珠珠\t154561\n世纪华联\t154562\n鳕鱼\t154563\n国家新闻出版广电总局\t154564\n镗缸\t154565\n运城盐湖\t154566\n投币机\t154567\n松桥\t154568\n活率\t154569\n赵玲\t154570\n10.X\t154571\n信工所\t154572\n竹笋干\t154573\nCardiff\t154574\n10}\t154575\n北里\t154576\n爱因斯坦\t154577\n哑铃8健身网\t154578\n民丰银行\t154579\n轨枕\t154580\n官星\t154581\n知情同意书\t154582\n221年\t154583\n伦伦\t154584\n免烧砖机\t154585\n夏萌\t154586\nhuanying\t154587\n69800\t154588\n75折\t154589\n中机六院\t154590\n米东\t154591\n妖冶\t154592\noracel数据库\t154593\n信被\t154594\n生产资料\t154595\n願\t154596\n阳市\t154597\n照艳\t154598\n非道\t154599\n立航\t154600\nguolv\t154601\n密封盒\t154602\n升降舵\t154603\n冒菜\t154604\n清漂\t154605\n河南工业大学\t154606\n中国人民大学律师学院\t154607\n羊场\t154608\n大为\t154609\n信息库\t154610\n河野悦子\t154611\n复新\t154612\n意难平\t154613\n阿奇霉素胶囊\t154614\n蕴藏量\t154615\nOvercooked\t154616\n朝露\t154617\nPhotoScan\t154618\n中洲半岛\t154619\nLedger\t154620\n丹水山庄\t154621\n民福康健康_三九养生堂\t154622\ntobu\t154623\n液体粘滞系数\t154624\nLLP\t154625\nmange\t154626\n德扑圈\t154627\nNo.12\t154628\n精校\t154629\n倩女\t154630\n液体灌装机\t154631\n大疆创新\t154632\n霹雳游侠\t154633\n关欣\t154634\n图博\t154635\n暴跳\t154636\n死体\t154637\n鹭栖湖\t154638\n紫马岭公园\t154639\n旁门左道\t154640\nOmron\t154641\n自然科\t154642\n完价\t154643\n野蛮人\t154644\n内盒\t154645\n福特野马\t154646\n宜尚酒店\t154647\n艾森豪威迩\t154648\n赛格\t154649\n桂林旅游团\t154650\nHighcharts\t154651\n老北京人\t154652\n旬阳县\t154653\nWaterloo\t154654\n河源新闻网\t154655\n标准正态分布\t154656\n核废料\t154657\nshortly\t154658\n力戒\t154659\ncherry\t154660\n大连电瓷\t154661\n魏则西\t154662\ncnc加工中心\t154663\n越捷航空\t154664\n闪回\t154665\n3pin\t154666\n交运\t154667\n德永英明\t154668\n岔\t154669\n坡角\t154670\n长公主\t154671\n腐甜\t154672\n洪桥镇\t154673\nMadcola\t154674\nwat\t154675\n成年コミック\t154676\n汗蒸箱\t154677\
n节能服务公司\t154678\n张力仪\t154679\n孔雀型\t154680\n16年05月\t154681\n匡算\t154682\n过刊\t154683\nqq动漫\t154684\n骨骸\t154685\n加勒比海盗5:死无对证\t154686\nd6\t154687\n90天后\t154688\n胜芳古镇\t154689\n太古龙帝诀\t154690\n时尚网\t154691\niFixit\t154692\njAVA\t154693\n实招\t154694\nshd\t154695\n飞虱\t154696\nPreschool\t154697\n丙烯\t154698\nContribution\t154699\n超过1小时\t154700\n斗彩\t154701\n金光御九界之鬼途奇行录\t154702\n上古卷轴天际\t154703\nm128\t154704\n宽敞\t154705\n见花开\t154706\nOscarfff\t154707\nraid10\t154708\n自然资源部\t154709\n营销伎巧\t154710\n铺架\t154711\n粗糙\t154712\nApplicationContext\t154713\n薛晓\t154714\n201605\t154715\n纬创资通\t154716\n塔元庄\t154717\n手底下\t154718\n邳\t154719\n折字\t154720\nZhen\t154721\n两本\t154722\n空穴\t154723\nupload/2017/2017-08-20/g_311bdc9c-54a9\t154724\n冰男\t154725\n佳御\t154726\nUC3842\t154727\n圣斗士星矢OL\t154728\n鸿儒\t154729\nEncountered\t154730\n索罗斯\t154731\n柯城区政府\t154732\n庭审现场\t154733\n纳克\t154734\nshy\t154735\n索尔仁尼琴\t154736\n骨节\t154737\n泰币\t154738\n魏书生\t154739\nXIN\t154740\n兽片\t154741\n129名\t154742\n谭小芳\t154743\n冷气机\t154744\n古钱\t154745\n84厘米\t154746\n排行版\t154747\nanywhere\t154748\nP12\t154749\n友谊广场\t154750\n闫华红\t154751\n园林花卉\t154752\n食草堂\t154753\n17年3月\t154754\n8591\t154755\n制作方\t154756\n10万套\t154757\n疫区\t154758\n李瑜\t154759\njoker\t154760\n新流星蝴蝶剑\t154761\nvectorworks\t154762\n斯文\t154763\n汕\t154764\n空开\t154765\ndba\t154766\nFragment\t154767\n秉公执法\t154768\nfaguo\t154769\nballistic\t154770\nWOL\t154771\n魔卡少女樱透明牌篇\t154772\n学龄期\t154773\n车载吸尘器\t154774\n15丨\t154775\n彭鹏\t154776\n沈阳公交查询网\t154777\n少女性\t154778\n猪蹄汤\t154779\n沈河\t154780\n凝聚态\t154781\n紫百合\t154782\n虎门外语学校\t154783\n10余名\t154784\n开锁公司_换锁_修锁公司\t154785\n米盒\t154786\ngvg\t154787\nopencc\t154788\n咸宁政府网\t154789\n万界仙踪\t154790\n阳面\t154791\n0040\t154792\n寨里镇\t154793\n梨花头\t154794\n真子集\t154795\ngensee\t154796\nMIDIPLUS\t154797\n范逸臣\t154798\n网线\t154799\n13米\t154800\n上汽通用汽车金融有限责任公司\t154801\nanhei3战网\t154802\n大树\t154803\n遗落战境\t154804\n水泥工\t154805\n倪健\t154806\n黄条\t154807\nAdaptive\t154808\n依鲁替尼\t154809\n内江站\t154810\nアヘボテオチ\t154811\nStarring\t154812\ncolt\t154813\n滁州市政府\t154814\nQQBOD
Y头像网\t154815\n中国教育考试网\t154816\n河南省人民医院\t154817\n小军\t154818\n怪点\t154819\n辟邪剑谱\t154820\nLIN\t154821\n华茂\t154822\n侧目\t154823\n霞美镇\t154824\n新展\t154825\n美国队长3\t154826\n工具包\t154827\ng37\t154828\n一道题\t154829\n甲醛释放量\t154830\n4GD5\t154831\n中拉\t154832\n万有引力\t154833\n咆\t154834\n螺纹孔\t154835\n附耳\t154836\n太假\t154837\n天津市公安局\t154838\n驻兵\t154839\n优路\t154840\n小航\t154841\n马彪\t154842\nitis\t154843\n八边\t154844\n最小时\t154845\n中舞网\t154846\n科技展望\t154847\n杨门虎\t154848\njav\t154849\nsynthase\t154850\nDendi\t154851\n童体\t154852\nrola\t154853\nixus\t154854\n小正\t154855\n乓\t154856\n代码库\t154857\n农产\t154858\n小米平板1\t154859\n双重认证\t154860\n碧斯诺兰\t154861\nfeval\t154862\n袁腾飞\t154863\n航天日\t154864\n色织布\t154865\n徐斌\t154866\n牵涉\t154867\n择天记\t154868\n20180302\t154869\n坡豪\t154870\n埃克塞特大学\t154871\n2014年清明节\t154872\n那一年\t154873\n干重\t154874\n流星群\t154875\n海洲街道\t154876\n中土世界战争之影\t154877\nCC卡\t154878\n赛特斯\t154879\n村支\t154880\n龙凤镇\t154881\n嘴甜\t154882\n好巧\t154883\n知我者\t154884\n梁锦辉\t154885\n林如\t154886\n新密市\t154887\n百度云/1280超清\t154888\n回用\t154889\n叫作\t154890\n车质\t154891\n南市区\t154892\n500G\t154893\n福田村\t154894\n熟化\t154895\n花毛茛\t154896\n马萨乔\t154897\ngml\t154898\n潘建伟\t154899\n海塘\t154900\n盖盒\t154901\n钢柱\t154902\nLoveMP3\t154903\n浙医二院\t154904\n70多年\t154905\n单号查询_快递网\t154906\n纵目\t154907\n神魔大陆2\t154908\n519888\t154909\n羟基苯甲醛\t154910\n安全评价师考试\t154911\n狂情\t154912\n培训周\t154913\n3丁\t154914\n红提葡萄\t154915\n华泰汽车\t154916\n中国储备粮管理总公司\t154917\n福利院\t154918\n长安街\t154919\n刘公案\t154920\n倡\t154921\n队内\t154922\n蓬莱国际机场\t154923\n式子\t154924\nRear\t154925\n水痘\t154926\n5mod\t154927\n李君\t154928\n大吉大利\t154929\n江苏国泰新点软件有限公司\t154930\n李贺\t154931\n0713\t154932\n克洛伊\t154933\nBrutal\t154934\nslf4j-log4j12\t154935\n王后雄\t154936\nsecp\t154937\n里南\t154938\n中环大厦\t154939\n战争论\t154940\nDNF冰结师\t154941\n登山者\t154942\n防老化\t154943\n孕\t154944\nb20\t154945\nps8.0\t154946\n不得不\t154947\n春信\t154948\n英创\t154949\n战书\t154950\n1000MW\t154951\n禺\t154952\nFoundry\t154953\n无为论坛-无为网\t154954\ndumb\t154955\n日版\t154956\nwidnows\t154957\n_巴士最终幻想14\t154958\n奇思\t154959\n陶老师\t154960\n2.3米
\t154961\nlibc.so.6\t154962\n命运石之门\t154963\n国家林业局\t154964\n万兽山庄\t154965\n穿针\t154966\n平桥镇\t154967\n彩钢夹芯板\t154968\n举报\t154969\n皇叔\t154970\n1.5版\t154971\n喇叭花\t154972\n刘全有\t154973\n2亿吨\t154974\n兵临城下\t154975\n355号\t154976\nexcalibur\t154977\n智慧医疗\t154978\n鹰\t154979\n广州农讲所\t154980\n长岭村\t154981\n宁波七中\t154982\nmidnight\t154983\n扫平\t154984\n成功者\t154985\n54分\t154986\n何力\t154987\n随性\t154988\nerp管理软件\t154989\n诺多\t154990\n方法\t154991\n不甘寂寞\t154992\n高声\t154993\n钟发\t154994\nholy_black_cat\t154995\nodm\t154996\n进游\t154997\n形像\t154998\n特殊能力\t154999\n青季\t155000\n灌浆机\t155001\n郁雨竹\t155002\n谢佳\t155003\n盗伐\t155004\n防办\t155005\n客来福\t155006\n浪漫派\t155007\n图虫\t155008\n代售处\t155009\n升降舞台\t155010\n笨猫\t155011\n还未\t155012\n关健\t155013\n人道至尊\t155014\n鼓盆\t155015\n和税\t155016\n神击\t155017\n光头男\t155018\n卢登\t155019\n微派\t155020\n展出\t155021\n洪家街道\t155022\n电子竞技类\t155023\nRobotic\t155024\n8350\t155025\n刘洋\t155026\n鳞次栉比\t155027\n钉鞋\t155028\n跌跌撞撞\t155029\n牛犇犇\t155030\n不夜\t155031\n锦句\t155032\n二超\t155033\n抚顺北\t155034\n大海解说mc\t155035\n化粪\t155036\n宁次\t155037\n甄珠\t155038\n好打\t155039\n太红\t155040\n家乐福卡\t155041\n软土地基\t155042\n短码\t155043\n孙安佐\t155044\n超级黄金手\t155045\nCassandra\t155046\n狂花\t155047\n姨妈期\t155048\naoki\t155049\n武汉自贸区\t155050\n张飞\t155051\n弗洛尔\t155052\n深圳工商银行\t155053\n以文辅政\t155054\n黑月亮\t155055\n圣蒂斯堡\t155056\n维州\t155057\nclutch\t155058\n神舟五号\t155059\n重画\t155060\nM177fw\t155061\n南延段\t155062\n虎课网\t155063\n备战\t155064\n东方传奇\t155065\n钱国超\t155066\n体甲\t155067\n散列通\t155068\n弹幕姬\t155069\n中国计量大学现代科技学院\t155070\n农科路\t155071\nGANSS\t155072\nFiling\t155073\n原子性\t155074\n支付宝号\t155075\n周氏\t155076\n关河\t155077\n太湖学院\t155078\n技德科技\t155079\n风干鱼\t155080\n七分之三\t155081\n缴销\t155082\nfxcm\t155083\n圣晶使徒\t155084\n反渗透纯水设备\t155085\n炸鸡柳\t155086\n龙女郎\t155087\n北疆先锋网\t155088\nDollars\t155089\n国家电网公司\t155090\n克龄蒙\t155091\n欠费\t155092\n铭记\t155093\n家常煲汤网\t155094\n仗助\t155095\n星辉\t155096\n曼瑞德\t155097\n李俊峰\t155098\n试题库\t155099\n仲小萍\t155100\n第一八\t155101\n从优\t155102\n旱船\t155103\n林雪平\t155104\n1559\t155105\n嘉禾路\t155106\n武汉路\t155107\n魔神英雄传2\t155108\n2.13\
t155109\n无氧呼吸\t155110\n液晶广告机\t155111\n甚麼\t155112\n连调\t155113\ntsu\t155114\ne03\t155115\n小米智能家居\t155116\n一转\t155117\n最新招\t155118\nGameObject\t155119\n祝英台\t155120\n含氧量\t155121\n阿布贾\t155122\n盛光祖\t155123\n汪宁\t155124\n福州小学\t155125\nmybase\t155126\n百度医疗\t155127\n铸瓷\t155128\n张晓光\t155129\n大市口\t155130\n醇品\t155131\n金逸影城\t155132\n虹吸壶\t155133\n第830集\t155134\n斯旺森\t155135\n窖池\t155136\nAndye\t155137\n192.168.1.254\t155138\n80070003\t155139\n大问\t155140\n吉利博越_吉利_博越报价_博越\t155141\n朗动择天记\t155142\nX55\t155143\n公休日\t155144\n朱莉娅\t155145\n瓦森纳协定\t155146\n科技学\t155147\n珍珠奶茶\t155148\n360n6lite\t155149\n续表\t155150\n小九寨沟\t155151\n叠音\t155152\nIQ\t155153\n吉时\t155154\n郑州旅游职业学院\t155155\n实城\t155156\ncasper\t155157\nHaier\t155158\nhku\t155159\n2.25.1\t155160\n封包机\t155161\n中国地区\t155162\n北京网易\t155163\n中国机械网\t155164\n优思悦\t155165\n285个\t155166\n模糊理论\t155167\n20180328\t155168\n徐记\t155169\n青海人才网\t155170\n纸砚\t155171\n莫璃\t155172\nLatte\t155173\ngatk\t155174\n美国转运公司\t155175\n胚芽\t155176\nurn\t155177\n你是我的小确幸\t155178\n周琼\t155179\n回购利率\t155180\n轻体\t155181\n上海教育出版社\t155182\n麓园\t155183\n22起\t155184\n秀一秀\t155185\n猪链\t155186\npsexec\t155187\n理查德克莱德曼\t155188\n僵尸类\t155189\n蟑螂药\t155190\nbaidubaike\t155191\n凛然\t155192\n20160930\t155193\ncpus\t155194\nY51\t155195\n慢舞\t155196\n量子通讯\t155197\n头层\t155198\n李群玉\t155199\n斯密\t155200\n诫\t155201\n责任有限公司\t155202\n2008年9月\t155203\nSpinal\t155204\n人寿大厦\t155205\n仙逆魔兽剑圣异界纵横\t155206\n平阳镇\t155207\n刘洪涛\t155208\n欧林\t155209\n凡人修仙传仙界篇\t155210\n【天\t155211\n无点\t155212\n郑州博物馆\t155213\n苏东坡\t155214\n山谷里的思念\t155215\n传销式\t155216\n尽有\t155217\n农金\t155218\n饭卡\t155219\n异魔\t155220\n心花怒放\t155221\nTAO\t155222\n石门\t155223\n二零\t155224\n电牛\t155225\npleasure\t155226\n_综\t155227\n莱登\t155228\n林军\t155229\nBCA\t155230\n品铂\t155231\n洁厕宝\t155232\n策略型\t155233\n_筑龙网\t155234\n企业管理体系\t155235\n农夫山泉\t155236\n戈壁石\t155237\n入党积极分子学习\t155238\n4000多万\t155239\n宠物机\t155240\n14_\t155241\nMaxthon\t155242\n创维数码\t155243\n_车家号\t155244\nlpthread\t155245\n万余元\t155246\n捣药\t155247\niNewHope\t155248\nwgs84\t155249\n标致307论坛_汽车之家论坛\t155250\n
整理机\t155251\n测谎\t155252\n小仲马\t155253\n电商\t155254\n陈和\t155255\n蜱虫\t155256\n妇科医院\t155257\n特权\t155258\n茶托\t155259\n杰尔\t155260\nwixoss\t155261\n校考成绩\t155262\n小儿过敏性鼻炎\t155263\n滑阀\t155264\nm7206\t155265\n无足轻重\t155266\nsvm算法\t155267\n气体涡轮流量计\t155268\n商通网\t155269\n白浩\t155270\nDell戴尔\t155271\n导出\t155272\n易登分类信息网\t155273\npreferences\t155274\n紫外\t155275\nucs\t155276\nCBG\t155277\n哈士奇资源网\t155278\n成都市商务局\t155279\n高热量\t155280\n融资租赁有限公司\t155281\n今生是第一次\t155282\n阳光印网\t155283\n手动阀\t155284\n小肚鸡肠\t155285\nC6678\t155286\nABS\t155287\n圣像\t155288\n三国乱世攻略\t155289\ncad2011\t155290\n南威\t155291\n孟关\t155292\n快商通\t155293\n全集百度云\t155294\n超限\t155295\n承制\t155296\n开箱机\t155297\n飞牛\t155298\n省儿\t155299\n版税\t155300\nMiracles\t155301\nMUi\t155302\n经常\t155303\n三亚乐居网\t155304\n宇佐美舞\t155305\n数量\t155306\n苏州卫生职业技术学院\t155307\n第1篇\t155308\n科陆\t155309\ncifar\t155310\n80s\t155311\n物美\t155312\n中国版权保护中心\t155313\n冻石\t155314\n汉女\t155315\n青井\t155316\n2016年中\t155317\n管理科学与工程\t155318\noverloaded\t155319\n铝瓦\t155320\n西坪\t155321\n水果图\t155322\n入岩\t155323\n瘦身裤\t155324\nVoiceOver\t155325\n审稿费\t155326\nDeploying\t155327\n解疑\t155328\nNudist\t155329\n9.5.3\t155330\n前囟\t155331\n深圳电台\t155332\n艾玛·罗伯茨\t155333\n大话mg3\t155334\n222222\t155335\n沉思\t155336\n虞城网\t155337\n波谱\t155338\nQuote\t155339\n尽其用\t155340\npoct\t155341\n容重\t155342\n龙星凉\t155343\n7006\t155344\n泥瓦\t155345\n以奖代\t155346\ni2s\t155347\n荣毅仁\t155348\n中西区\t155349\n剡\t155350\n超时空\t155351\n小米note3\t155352\n蒙华\t155353\nwacom数位板\t155354\n摔到\t155355\n马来西亚槟城\t155356\nas3\t155357\n西溪街道\t155358\n第1169章\t155359\n新圩镇\t155360\n丰田公司\t155361\n孔雀简笔画\t155362\n名帅\t155363\ngoodboy\t155364\n做我女朋友\t155365\n何所夏凉\t155366\n休闲服\t155367\n差一步\t155368\n惹是生非\t155369\n说道\t155370\n86步\t155371\n32届\t155372\n芍药居\t155373\n万亿俱乐部\t155374\n汉王电纸书\t155375\namcap\t155376\n电熨斗\t155377\n着衣\t155378\n3000条\t155379\n头寸\t155380\nAPQP\t155381\n银色\t155382\n三姑六婆\t155383\n太空\t155384\n活塞式空压机\t155385\n荒无人烟\t155386\n西府\t155387\n联达\t155388\n108岁\t155389\n辛棄疾\t155390\n婀娜\t155391\n红外测温仪\t155392\n很漂亮\t155393\n瓦尔塔蓄电池\t155394\n吴凤花
\t155395\n拜见宫主大人\t155396\n杨国安\t155397\nKisaki\t155398\n22:26\t155399\n造粒机\t155400\nawesomium\t155401\n新关\t155402\n⒐\t155403\n湖滨区\t155404\n猜透\t155405\n知道日报\t155406\npeaks\t155407\n雪炭\t155408\n斯塔克\t155409\na醇\t155410\n四川化工职业技术学院\t155411\n披着\t155412\n气缸盖\t155413\nAut\t155414\n晚成\t155415\n程程\t155416\n杀身之祸\t155417\n水族箱\t155418\n缺陷型\t155419\n自顶向下\t155420\nshuzi\t155421\n扣留\t155422\n子页面\t155423\n花字\t155424\n九句\t155425\n罗格\t155426\nPCT\t155427\n您说\t155428\n程志明\t155429\n600340\t155430\n培智学校\t155431\n18W\t155432\n善恶业\t155433\n昆明日报\t155434\n档位\t155435\n單\t155436\n湖南第一师范\t155437\n一晚后\t155438\n集合信托\t155439\nloses\t155440\n魔兽RPG\t155441\nMemo\t155442\n书画展\t155443\n大航海时代5\t155444\n麻沸散\t155445\nvoltage\t155446\n虚拟7.1\t155447\nnvsip\t155448\n中国工业园\t155449\n财股网\t155450\n天仙\t155451\n华人娱乐\t155452\n工程技术研究中心\t155453\n【龙\t155454\nIDO\t155455\niH5\t155456\n北京语言大学\t155457\n曾垂鑫\t155458\n环太平洋2:雷霆再起\t155459\n妇刑\t155460\n图兰朵\t155461\n小赢卡\t155462\n好开心\t155463\n上海市口腔病防治院\t155464\n创盛\t155465\nab测试\t155466\n施工日记\t155467\n黄皓\t155468\ncirculation\t155469\nTBB\t155470\nカフェ\t155471\n安全卫士\t155472\n金边吊兰\t155473\n阿尔山\t155474\nebooks\t155475\nsyswow64\t155476\n情事满江红\t155477\nkrc\t155478\n废油\t155479\n3115\t155480\n松赞\t155481\n海南省水务厅\t155482\n手熟尔\t155483\n两微\t155484\nHorizontal\t155485\n民调局异闻录\t155486\n红色警戒2:尤里的复仇\t155487\n43分钟\t155488\n上海市食药监局\t155489\n杨建\t155490\n公共营养师考试\t155491\n小孔雀鱼\t155492\nJD\t155493\n胜利乡\t155494\nEnhance\t155495\n败诉\t155496\nhanser\t155497\ntaken\t155498\n登楼\t155499\n长长久久\t155500\n转会\t155501\n45kw\t155502\n新西兰南北岛\t155503\n钱学森\t155504\n向日葵\t155505\n鼻骨骨折\t155506\n王平\t155507\nrfb\t155508\n姜丽\t155509\n龙溪乡\t155510\n枞阳\t155511\n王晓冬\t155512\n反家庭暴力法\t155513\n迪美\t155514\n苏生\t155515\nGrammarist\t155516\n65路\t155517\n平山堂\t155518\n江布拉克\t155519\n7.10\t155520\n赣语\t155521\n孙富有\t155522\nk780\t155523\nlogix5000\t155524\n偿\t155525\n中国茶叶网\t155526\nDVD播放器\t155527\n催热\t155528\n952141\t155529\nExcelVBA\t155530\n埃伦娜\t155531\n圆满结业\t155532\nrbm\t155533\n褐\t155534\n气质型\t155535\n截拳\t155536\n五轴\t155537\nprada\t1
55538\n残留物\t155539\nhehe+大众健身队\t155540\nVespa\t155541\n东方cardtd\t155542\n4.70\t155543\n贝伦赛丽\t155544\ninception\t155545\n四小天鹅\t155546\n20150509\t155547\n6500u\t155548\n2.6.1\t155549\n25global\t155550\n边城\t155551\n堂上\t155552\n肝火旺\t155553\n尼科尔森\t155554\n天津交通职业学院\t155555\n公杂\t155556\nIT英才网\t155557\nAntd\t155558\n梁敏\t155559\n台州市委\t155560\n点道\t155561\n深圳派出所\t155562\ncitation\t155563\n三板在线\t155564\nRadmin\t155565\n林狗\t155566\n区別\t155567\n手套箱\t155568\nICS\t155569\n大话西游3吧\t155570\n好恶\t155571\n上半\t155572\n狂吠\t155573\n西津\t155574\n粘液质\t155575\n大头照\t155576\nTobit\t155577\n除恶\t155578\n14500元\t155579\n前男\t155580\n异径管\t155581\n六招\t155582\n文采\t155583\n昭示\t155584\n董事\t155585\n湿冷\t155586\nPingCAP\t155587\n一小撮\t155588\nzabbix-server\t155589\n淘宝淘气值\t155590\nrailroad\t155591\n地方戏\t155592\n水东\t155593\n补子\t155594\n小营镇\t155595\n72万\t155596\n包房\t155597\n地球引力\t155598\n屯门时代广场\t155599\n显存\t155600\n哈尔滨铁道职业技术学院\t155601\n武汉体院\t155602\n坂\t155603\n环球娱乐\t155604\n中山三角\t155605\n碗架\t155606\n京沈高速\t155607\n灵猫\t155608\n速報\t155609\n清华苑\t155610\n怪物猎人ol\t155611\n新突破\t155612\n吊件\t155613\n霹雳贝贝\t155614\n心量\t155615\n龙里\t155616\n第29期\t155617\n魔乐科技\t155618\n金属检测机\t155619\n1300度\t155620\n山东社会科学院\t155621\n天虎\t155622\n武汉房管局\t155623\n搬出去\t155624\n将心向明月\t155625\n声带息肉\t155626\n菌肥\t155627\n4|\t155628\n2017年11月6日\t155629\n少女前线\t155630\n继子女\t155631\n不可用\t155632\n团支书\t155633\n电子签证\t155634\nsider\t155635\n克比\t155636\n妈妈网\t155637\nshares\t155638\n杭州擎洲软件有限公司\t155639\n底狱\t155640\ntasting\t155641\n晓容晓枫\t155642\n凶手\t155643\n肌功能\t155644\n闪信\t155645\n仙剑一\t155646\nEms\t155647\n哈尔滨市国土资源局\t155648\n海牛\t155649\n北山镇\t155650\n六三制\t155651\n防爆灯具\t155652\n绿色革命\t155653\n重炮手\t155654\n宁波南站\t155655\n光大\t155656\nguowai\t155657\n李才文\t155658\n酷喃\t155659\n注音版\t155660\n依索\t155661\n科尔多瓦\t155662\n西水\t155663\nV21\t155664\n易乐\t155665\n森普\t155666\n朽木白哉\t155667\nmusician\t155668\nensemble\t155669\n450\t155670\n面颊\t155671\n09:30\t155672\n60万个\t155673\n南岳衡山\t155674\n辽宁电视台\t155675\n上瘾\t155676\n京都奈良\t155677\n1.dfm\t155678\n嫡系\t155679\n交办\t155680\naf\t155681\n跨文
化\t155682\narn\t155683\n牛栏\t155684\n防静电\t155685\nExcept\t155686\n武胜\t155687\n渗流\t155688\n出入口\t155689\n战地之王\t155690\nLibvirt\t155691\n头罩\t155692\n花杆\t155693\n于海\t155694\nT470\t155695\nSick\t155696\n上书\t155697\n大小球\t155698\n买好\t155699\n天津公务员考试网\t155700\n盘丝仙仙\t155701\n特发性震颤\t155702\n冥界警局\t155703\n总流\t155704\nPCM\t155705\n嘉泽新能\t155706\n特保\t155707\n精品集\t155708\n绍兴文理学院元培学院\t155709\n潜伏者\t155710\n食肆\t155711\n发刊词\t155712\n五四大街\t155713\n整篇\t155714\n洛邑古城\t155715\n多宝塔碑\t155716\n私阴\t155717\nfilder\t155718\nrollback\t155719\n敢达争锋对决吧\t155720\n总线\t155721\n金笔\t155722\n联动式\t155723\n修读\t155724\n42秒\t155725\n修罗大圣传\t155726\n快递袋\t155727\nzim\t155728\n综英美\t155729\nyxysuanf\t155730\n北疆\t155731\n菲乌米奇诺机场\t155732\nZEBRA\t155733\n_黄金投资策略网\t155734\n檩条\t155735\n阴道出血\t155736\n新中源\t155737\n李雪莱\t155738\ntumour\t155739\n缝包\t155740\nEos\t155741\n胰脾\t155742\nSims4\t155743\n千春\t155744\nQQ在线客服\t155745\n元年\t155746\n无针线雕\t155747\navalonjs\t155748\n蝙蝠侠:侠影之谜\t155749\npressing\t155750\n近红外\t155751\n孔表\t155752\n紫微学院\t155753\nDrugBank\t155754\n4朵\t155755\n阿里大鱼\t155756\nEIP\t155757\n18年后\t155758\n逆止器\t155759\n郭芸\t155760\n8950\t155761\n三万分\t155762\n庄恕\t155763\n普利特\t155764\n遮天吧_\t155765\n三公尺\t155766\n神奇宝\t155767\n雷霆队\t155768\n1.5米\t155769\n淘宝卡\t155770\n文姬\t155771\n苏州港\t155772\n金辉悦府\t155773\n东野\t155774\n美墨\t155775\n20C\t155776\n667\t155777\nO型圈\t155778\n6块\t155779\n塘朗城\t155780\n营养强化剂\t155781\n逃跑计划\t155782\n疫情\t155783\n张江在线\t155784\n沈阳政府\t155785\n1894年\t155786\n赛鸽资讯网\t155787\n兴业街\t155788\nJP\t155789\n磨蹭\t155790\nimax\t155791\n合不拢\t155792\n违法案\t155793\n片山\t155794\n天朗气清\t155795\n贝拉小镇\t155796\n电子系\t155797\n矿山路\t155798\nTrainers\t155799\n潍坊市财政局\t155800\n京藏高速公路\t155801\n参同\t155802\nPuppet\t155803\n蓓蕾幼儿园\t155804\n十局\t155805\nPOY\t155806\n亚吉铁路\t155807\n码器\t155808\n兑换物\t155809\n张建文\t155810\nake\t155811\n施氏\t155812\n杨旭文\t155813\n佛山公司\t155814\nGag\t155815\n丹拿\t155816\n夜班车\t155817\n寻星\t155818\nPJSIP\t155819\n喷火器\t155820\n挑战性\t155821\n1612\t155822\n遇鬼\t155823\n2AM\t155824\n狙击手\t155825\n第五六\t155826\n7000二代\t155827\n显血\t155828\n
准字\t155829\n藏王\t155830\ns-cute\t155831\n跳财\t155832\nPaddy\t155833\n从早\t155834\n南京公共自行车\t155835\nTAMU\t155836\n钱宝网\t155837\n康菲\t155838\n表达\t155839\n钩藤\t155840\nbpd\t155841\nuitextview\t155842\n千里之堤\t155843\n辖县级市\t155844\n我是歌手第三季\t155845\n刑民\t155846\n金山北\t155847\n海马福美来\t155848\n银谷普惠\t155849\n三马\t155850\n阳城县政府\t155851\n香闺\t155852\n三项赛\t155853\n力量型\t155854\n社团文化节\t155855\n张志宏\t155856\n未来战争\t155857\n第二架\t155858\n显微镜物镜\t155859\n分居\t155860\n臭氧\t155861\nBulbs\t155862\n睿亚训\t155863\n试用装\t155864\n蛟川双语小学\t155865\n危险驾驶罪\t155866\n奉\t155867\n正切值\t155868\n寿宁\t155869\nComputation\t155870\n妆娘\t155871\n党政领导干部\t155872\n春来了\t155873\n世嘉三厢\t155874\n圣卢西亚\t155875\n3x^3\t155876\n国际贸\t155877\n开麦\t155878\n环世界\t155879\n背光源\t155880\n一大类\t155881\nHSK\t155882\neeee\t155883\n8800\t155884\n叠词\t155885\n113名\t155886\n兴欣\t155887\n032715\t155888\n一个儿\t155889\n外拓\t155890\n沐河\t155891\n神舞幻想\t155892\n血窦\t155893\n六角龙鱼\t155894\n敏感度\t155895\n知情者\t155896\nuicolor\t155897\n微分方程组\t155898\n荣耀凯\t155899\n浙江市\t155900\n大玉儿\t155901\n中芭\t155902\n爱奇艺公司\t155903\n展|\t155904\n奇缘\t155905\n考研党\t155906\nhighscore\t155907\nrc522\t155908\n菊穴\t155909\nlowest\t155910\n手术期\t155911\nAtomikos\t155912\n空调位\t155913\n易武\t155914\narctanx\t155915\nV260\t155916\n大庆东站\t155917\n么样\t155918\n春城\t155919\n掌通\t155920\nrpn\t155921\n宝冶\t155922\n原舍\t155923\n裕元\t155924\nJessie\t155925\n3X/3X\t155926\n奔驰C级论\t155927\nintimate\t155928\nTUYIYI\t155929\n旁氏米粹\t155930\nDuiLib\t155931\n测绘员\t155932\nCSR\t155933\n王璐璐\t155934\n陆军军医大学\t155935\n绿叶制药\t155936\n知网吧\t155937\n龙栖湾\t155938\n中交四航局\t155939\n49级\t155940\n合肥医院\t155941\n降妖\t155942\n营造\t155943\n大婚\t155944\n山楼\t155945\n土航\t155946\n全民健康网\t155947\n唐寅在异界吧\t155948\n刘嘉玲\t155949\n悔恨\t155950\n多米音乐\t155951\n理论界\t155952\n东半球\t155953\nCoder\t155954\n第3话\t155955\n私\t155956\n习近平\t155957\n东津世纪城\t155958\n公倍\t155959\n12万亿\t155960\n积贫\t155961\n花叶\t155962\nAstute\t155963\n求职吧\t155964\nmood\t155965\n孤岛5\t155966\n青海省\t155967\n在前\t155968\n活期宝\t155969\n异术\t155970\n裕沁庭\t155971\n泰州火车站\t155972\n鸳鸯锅\t155973\n法本\t155974\ngraded\t155975
\nMongoTemplate\t155976\n3.8%\t155977\nExplicit\t155978\n悠悠网\t155979\n要\t155980\nCoC\t155981\nmacbeth\t155982\n清贵\t155983\n环境声\t155984\n说透\t155985\nES文件浏览器\t155986\nRNA病毒\t155987\n二通\t155988\nSuspect\t155989\nWon\t155990\n少女感\t155991\n科洛·莫瑞兹\t155992\nHiSuite\t155993\n老谋深算\t155994\n自制凉皮\t155995\n脱贫攻坚战\t155996\n大知\t155997\n马恩岛\t155998\n行距\t155999\nlimitations\t156000\n幸福之路\t156001\n过成\t156002\n视频监控\t156003\n十一层\t156004\n独家代理协议\t156005\n七界\t156006\n仪征\t156007\n家用投影仪\t156008\n白眼\t156009\n戒烟\t156010\n电脑显示屏\t156011\n无物\t156012\n聚氧乙烯\t156013\n八百方\t156014\n柳钢集团\t156015\n自行车\t156016\n老虎淘宝客\t156017\n学编程\t156018\n第八卷\t156019\nyoung\t156020\n东大\t156021\n济南出版社\t156022\n苹果云\t156023\n罗湖区\t156024\nneb\t156025\n读诵\t156026\n阿仑膦酸钠片\t156027\n亿企\t156028\n也罢\t156029\nremap\t156030\nppt素材\t156031\ngenre\t156032\nCVP\t156033\n恒洁卫浴\t156034\n青藏铁路\t156035\nsysadmin\t156036\n凯撒旅游\t156037\nabstract\t156038\n5v5\t156039\n布莱克·莱弗利\t156040\n小勾\t156041\n蜀兴川\t156042\n煤气表\t156043\n太阳湾\t156044\n千灯镇\t156045\n德化街\t156046\n_途达论坛_汽车之家论坛\t156047\n王志萍\t156048\nrMBP\t156049\nMONACO\t156050\n电式\t156051\n叫出\t156052\n叶梓潼慕兆丰\t156053\n甘肃省旅游发展委员会\t156054\n完美的世界\t156055\nSpearman\t156056\nspecialize\t156057\nWorm\t156058\nh5\t156059\n真剑者\t156060\n卩\t156061\n汉高乐泰\t156062\nPhilippe\t156063\n一库\t156064\n灵犀一指\t156065\n马太效应\t156066\n佳山三花\t156067\n做一做\t156068\n三系\t156069\n15章\t156070\n广大\t156071\n帕丁顿熊2\t156072\nYES\t156073\n城固市\t156074\n沦丧\t156075\n2016年3月30日\t156076\n迷魂\t156077\n英法百年战争\t156078\n无人知\t156079\nquietwalk\t156080\n落地生根\t156081\nGTX460\t156082\n1129\t156083\n分门别类\t156084\nrevoked\t156085\n刹车器\t156086\n周道\t156087\n悬赏令\t156088\nMusée\t156089\nenter键\t156090\n97名\t156091\n比思论坛\t156092\n保密室\t156093\n三国赵云传之纵横天下\t156094\n猫币\t156095\n掳走\t156096\n事务局\t156097\nMiki\t156098\nv2ray\t156099\n073\t156100\n非因工负伤\t156101\niread\t156102\n克拉霉素\t156103\n991.2\t156104\n气缸\t156105\n99999\t156106\naccelerate\t156107\nDeborah\t156108\n121&\t156109\n美不胜\t156110\n重心\t156111\n天府广场\t156112\n狮王进行曲\t156113\n牡丹灵通卡\t156114\n订书\t156115\n长沙植
物园\t156116\n苗木\t156117\n广汉市公众信息网_广汉市政府\t156118\n容规\t156119\n广州市人民政府办公厅\t156120\n云南省委统战部\t156121\n万科幸福誉\t156122\n高宗\t156123\n4700mq\t156124\n黄星\t156125\n英语口语\t156126\n右耳\t156127\n海南省科学技术厅\t156128\n沈阳医院\t156129\n环境类\t156130\n掌沟\t156131\n悦驾网\t156132\n君悦\t156133\n医疗展\t156134\nSaucony\t156135\n北京景山公园\t156136\n四县市\t156137\n物理性质\t156138\n樟脑球\t156139\n仙衣\t156140\n美国梦\t156141\n银行员\t156142\nIndia\t156143\n举世瞩目\t156144\n退避\t156145\nRelic\t156146\n25所\t156147\nPlains\t156148\n天井机\t156149\nForests\t156150\n150dpi\t156151\n兰州人流医院\t156152\n山西三维集团股份有限公司\t156153\n宝群\t156154\n虞姬\t156155\nmsiexec\t156156\n郭一平\t156157\neth\t156158\n沱\t156159\n滑雪服\t156160\nfx-991CN\t156161\n办公费\t156162\n夏婉安\t156163\n强碱\t156164\n可堪\t156165\nintestinal\t156166\n2017款\t156167\n昌宁\t156168\n祢\t156169\n国色天香\t156170\n荷花苑\t156171\n分配生\t156172\n博路\t156173\n饕鬄\t156174\n伍德里奇\t156175\n3厘米\t156176\nResidences\t156177\n9.19\t156178\n亲密付\t156179\n六间\t156180\n刘浪\t156181\n飞泉\t156182\n十五味\t156183\n景云\t156184\n蓬佩奥\t156185\n国信\t156186\n開封\t156187\n记忆棉\t156188\n谋爱\t156189\n1.5倍速\t156190\n中科院高能所\t156191\n赛会\t156192\n武汉街道\t156193\n茶皂素\t156194\n山东省人大常委会\t156195\n1.0000元\t156196\n雅居乐星河湾\t156197\n凯叔\t156198\n坪塘\t156199\n天灯\t156200\nTIF\t156201\n清华控股有限公司\t156202\nv1.3.5\t156203\n北京汽车\t156204\n工人员\t156205\n龙且\t156206\nfrsky\t156207\n张向荣\t156208\nCas9\t156209\n石桥乡\t156210\ncht\t156211\nJiansNet\t156212\n任免权\t156213\n死屏\t156214\nCANVAS\t156215\n雷云3\t156216\n谦哥\t156217\n安视网\t156218\n更少\t156219\n昌平站\t156220\n车城\t156221\n沃达丰\t156222\n中国留学网\t156223\n热依娜\t156224\n土木建筑\t156225\ncentre\t156226\n师匠\t156227\n81万\t156228\n十_\t156229\n张寒\t156230\n建筑力学\t156231\n快乐王子\t156232\n染爱\t156233\njojo的奇妙冒险\t156234\n三春\t156235\n新城村\t156236\n胜负彩\t156237\n水利\t156238\n刘文金\t156239\ndll\t156240\n产品质\t156241\nPregnancy\t156242\n甲酸乙酯\t156243\n御享\t156244\n一零一\t156245\nmybitas\t156246\n徐海涛\t156247\nProtector\t156248\n严格限制\t156249\n冻结\t156250\n合\t156251\n阿卡西\t156252\n桔皮\t156253\n米大师\t156254\n休养生息\t156255\n十克\t156256\n养水\t156257\n独角兽\t156258\n我爱记歌词\t156259\n300块\t15
6260\n91dede\t156261\n格桑花\t156262\ntransplantation\t156263\n70天\t156264\n洼里\t156265\nsentinel\t156266\n扣字\t156267\n引以为鉴\t156268\nelder\t156269\n5月11\t156270\nUPD\t156271\n坦承\t156272\n英译\t156273\n固体燃料\t156274\n冠心\t156275\n护创\t156276\n红沿河\t156277\n上海物理教育网\t156278\n焦炭\t156279\n中国宝武\t156280\n雷蛇云驱动\t156281\n月亮城\t156282\n萝卜章\t156283\nfront\t156284\nFunds\t156285\n致命罗密欧\t156286\n华东交大\t156287\nfreeline\t156288\n堆龙德庆区\t156289\n智行火车票\t156290\ntmux\t156291\n悦居\t156292\n经营管理者\t156293\n普吉镇\t156294\n米花之味\t156295\n黑石礁\t156296\n金桥店\t156297\n卷页\t156298\n爱国\t156299\n克洛玛古斯\t156300\n硅酮\t156301\n苦杏仁\t156302\n衡阳东站\t156303\n病员\t156304\n花牌\t156305\n西宁火车站\t156306\n迪斯科\t156307\n君盛\t156308\n巡音ルカ\t156309\n牛家村\t156310\n再审民事裁定书\t156311\n毛膏\t156312\n矿难\t156313\n总纲\t156314\n二套房利率\t156315\nVol.1\t156316\n金谊广场\t156317\n再想\t156318\n华为P6/P6s\t156319\n书模\t156320\nIPSAN\t156321\n1741\t156322\n温柔的人\t156323\n我是江小白\t156324\n角色们\t156325\n乌兰布统\t156326\n5.71\t156327\nrefprop\t156328\n细胞培养液\t156329\n胸腔闭式引流\t156330\n理娱\t156331\n文文\t156332\n电场线\t156333\n体检测\t156334\n景宁\t156335\nsupported\t156336\n用外\t156337\n寒泉\t156338\n穿仓\t156339\n保税物流中心\t156340\n荣县人民政府\t156341\n中级审计师考试\t156342\n员村\t156343\nresource\t156344\n大姑娘\t156345\n场镜\t156346\nreille\t156347\n靖江网\t156348\n山城\t156349\n夜魅\t156350\n网页端\t156351\n招标公\t156352\n120mm\t156353\nchangsha\t156354\n千分之二\t156355\n爱情转移\t156356\n10厘\t156357\n怀思\t156358\n苏州中医院\t156359\n无密\t156360\n红满堂\t156361\n徐梵澄\t156362\n液压折弯机\t156363\n养颜\t156364\n伟巴斯特\t156365\n香云纱\t156366\n方倍\t156367\n毫州市\t156368\nattendance\t156369\n苏三起解\t156370\niPa\t156371\n炬\t156372\n黑提\t156373\n交通银行\t156374\n边生\t156375\n桓仁\t156376\nIE800\t156377\n美国商会\t156378\n南霸天\t156379\n汇总表\t156380\n20140709\t156381\n2100年\t156382\n观看者\t156383\nsmaller\t156384\n4123\t156385\n重复项\t156386\nstm32f103c8t6\t156387\n悠闲\t156388\n一一般\t156389\n生活片\t156390\n条屏\t156391\n府天\t156392\nISO27001认证\t156393\n齐博\t156394\n八九十年代\t156395\n区划片\t156396\n导热硅脂\t156397\n警界\t156398\n巴斯夫\t156399\nmogo\t156400\n唐传奇\t156401\n合肥科技馆\t156402\n魏多亮\t156403\nSURL
\t156404\n契克\t156405\n福建出入境检验检疫局\t156406\n1024bt\t156407\n问好\t156408\n高中部\t156409\n星区\t156410\n宋应星\t156411\n宫刑\t156412\n喷射式\t156413\n男袜\t156414\n无人之境\t156415\n2124\t156416\n广州市国土资源和房屋管理局\t156417\n拉塞尔\t156418\n提议\t156419\n4标\t156420\n安霸\t156421\n下塘\t156422\n单元格td\t156423\n永林\t156424\n电地\t156425\nBeta版\t156426\n脾阳虚\t156427\n答题类\t156428\n1604\t156429\n第七\t156430\n高尔夫球杆\t156431\n图睿科技\t156432\n太阁立志传2\t156433\n李沇熹\t156434\n新黄岛\t156435\n黄岩人才网\t156436\n夜天子\t156437\n谭\t156438\n600816\t156439\n古剑奇谭3\t156440\n刀疤\t156441\nCDMA\t156442\nSPACE\t156443\n岔路口\t156444\n电感器\t156445\n07集\t156446\ntous\t156447\n2007-2017年\t156448\n高斯\t156449\nStringBuffer\t156450\n索尼娅\t156451\nmeg\t156452\n母质\t156453\n75集\t156454\nSpyder\t156455\n美考专区\t156456\n长笛\t156457\nhongqiao\t156458\nwass\t156459\n法兰克福\t156460\n硬尾\t156461\nJILEBOX\t156462\n固持\t156463\n42号\t156464\n蝙蝠侠大战超人\t156465\n预编\t156466\n主页面\t156467\n成都市工商管理局\t156468\nplantuml\t156469\nORPG\t156470\n圣岛\t156471\n线稿\t156472\n艾欧尼亚吧\t156473\nscandir\t156474\n020\t156475\n幽灵行动阿尔法\t156476\n中诚信托\t156477\n全易通\t156478\n大鱼\t156479\n纳林湖\t156480\n石化公司\t156481\n福田区政府\t156482\n赤练\t156483\n盐酸利多卡因注射液\t156484\n20万平米\t156485\n瑶玲\t156486\n博弈树\t156487\n吕杰\t156488\n直液式\t156489\nascii码\t156490\nSitu\t156491\n粗布\t156492\nHelloWord\t156493\n利品\t156494\n美国康奈尔大学\t156495\n新海盗王\t156496\n山谷\t156497\nq300\t156498\n社风\t156499\nhuayi\t156500\npayroll\t156501\n盈拓\t156502\n芜湖\t156503\n4403\t156504\n太空版\t156505\n奔驰gls\t156506\n广西民族大学\t156507\n曹芳\t156508\n长春市政府\t156509\n武陵镇\t156510\n15美元\t156511\n青岛市工商行政管理局\t156512\n灸感\t156513\n餐标\t156514\n紫岚\t156515\n吉林师范大学\t156516\n羊城\t156517\nXZC\t156518\n发呆\t156519\nCAR-T\t156520\n征途2S\t156521\nFastJSON\t156522\nFilesystem\t156523\nresp\t156524\n2899\t156525\nhighkick\t156526\n产蛋\t156527\n2017年8月27日\t156528\nsto\t156529\npcall\t156530\n欧诺论坛\t156531\n魏大\t156532\noverweight\t156533\n联合政府\t156534\n三支一扶\t156535\n3.9元\t156536\n东凛\t156537\n苏维埃\t156538\n4g+\t156539\n左氧氟沙星\t156540\n一日\t156541\nbootrap\t156542\n透明管\t156543\nDIV+CSS\t156544\n上海蓝十字脑科医
院\t156545\n胡桃木\t156546\n血液科\t156547\nwow主题站\t156548\n0\t156549\n嘎吱\t156550\n丙烷\t156551\n紫金公馆\t156552\n宫本武\t156553\n夫西地酸乳膏\t156554\n南陵县人民政府\t156555\ncompliance\t156556\n第二支\t156557\n手杯\t156558\n冥刻\t156559\n蓬蓬裙\t156560\n草榴\t156561\ndiffraction\t156562\n旺财\t156563\n64742\t156564\n互文\t156565\n地雷区\t156566\n107号\t156567\n英语专业四级\t156568\n科技风\t156569\n地阁\t156570\n塔姆\t156571\n先锋榜\t156572\n骨王\t156573\n初始化值\t156574\nradio单选\t156575\n竞技杠\t156576\n履带板\t156577\nW520\t156578\n北座\t156579\n广告传媒公司\t156580\n邦泰\t156581\n全通教育\t156582\n萨瓦迪卡\t156583\n稳定土拌合站\t156584\n卷卷\t156585\n小关北里\t156586\n0551-681xxxx\t156587\n东疆保税港区\t156588\n商办\t156589\n艾璐卡\t156590\n贯流\t156591\n大鲁阁\t156592\n梁宇\t156593\n神羽\t156594\n走茶凉\t156595\n戏词\t156596\n上海市公安系统人民警察\t156597\n王学\t156598\nccd\t156599\nAnalyst\t156600\n资料室\t156601\nKaldi\t156602\n高铁时刻表\t156603\n李冬梅\t156604\n鹿王\t156605\n主词\t156606\n操控性\t156607\n引咎辞职\t156608\n浚县\t156609\nENTER\t156610\n蓝岛\t156611\n版权\t156612\nDiscrete\t156613\n疱疹性\t156614\n王英文\t156615\n八万\t156616\n娇艳\t156617\nvRealize\t156618\n浴屏\t156619\n深圳大学继续教育学院\t156620\n金秒奖\t156621\n子宫肌腺症\t156622\nOAD\t156623\nNon\t156624\n杨汛桥镇\t156625\n蛇咬\t156626\n港口城市\t156627\n羊毛毡\t156628\n楚楚可怜\t156629\n重庆市南岸区人民政府\t156630\n教学研究室\t156631\n祁东新闻网\t156632\nalienwa\t156633\n鹏博\t156634\n阳光幼儿园\t156635\n卓尼\t156636\n钱光\t156637\n裴斗娜\t156638\np20Pro\t156639\n博物院\t156640\n复旦大学大数据学院\t156641\n炸豆腐\t156642\nappliances\t156643\n簿记\t156644\n年轻的心\t156645\n心力\t156646\n申万菱信\t156647\n当前周\t156648\n结衣\t156649\n借鉴\t156650\n招术\t156651\n干流\t156652\n沥青混凝土\t156653\n摆渡者\t156654\n社会法\t156655\n励步\t156656\nwoyao\t156657\n莱文\t156658\nMapped\t156659\n优胜\t156660\nskyui\t156661\n安乐\t156662\n泰然金融\t156663\n贵夫\t156664\nSculpt\t156665\n8.0.4.305\t156666\n黄自元\t156667\nmodis\t156668\nctx\t156669\n盘江股份\t156670\n金葱\t156671\n外板\t156672\n钢绳\t156673\n热恋期\t156674\n计师\t156675\n死锁\t156676\n微粉\t156677\n全国信息安全标准化技术委员会\t156678\nNext\t156679\n27日上午\t156680\n布卢姆\t156681\nExceed\t156682\n_天涯医院\t156683\nUITable\t156684\n2973709849\t156685\n首无\t156686\n韩世忠\t156687\ntruss架\
t156688\n广州格力空调\t156689\n语音识别软件\t156690\n大唐电信\t156691\n企划网\t156692\n实习报告\t156693\n107名\t156694\n长春西\t156695\nv2.0安卓\t156696\n斜盘\t156697\n季中\t156698\n冷暖屏\t156699\n狮子岩\t156700\n偶函数\t156701\n云峰山\t156702\n豆腐渣状\t156703\n健胃\t156704\n原数\t156705\nrefore\t156706\n汇源集团\t156707\n黑猫股份\t156708\n20150405\t156709\n可望\t156710\n嘴型\t156711\n咂\t156712\n量化交易\t156713\n京瓷集团\t156714\n金霸王\t156715\n学科群\t156716\n永安林业\t156717\n美亚影院\t156718\n逸云\t156719\n舞鞋\t156720\n其原因\t156721\nUno\t156722\n利津路\t156723\nmm3\t156724\nsqlite\t156725\n莫道\t156726\n后备箱垫\t156727\n上古灵符\t156728\n经典小说吧\t156729\n狂暴之路\t156730\nxplane\t156731\nKrakatoa\t156732\n审校\t156733\n液化气罐\t156734\n硅湖职业技术学院\t156735\n羊腿\t156736\n谢辉\t156737\n辰山\t156738\n负隅顽抗\t156739\n商旅酒店|度假酒店|连锁酒店|民宿|公寓\t156740\n移液\t156741\nMART\t156742\n阜阳市卫生计生委\t156743\n古典文献学\t156744\n雷霆王\t156745\n砂子塘\t156746\n牌具\t156747\n洋紫荆\t156748\n李佃贵\t156749\n持世\t156750\n5200元\t156751\n34mm\t156752\n54张\t156753\n18R\t156754\nk6\t156755\n绘声绘影x10\t156756\n美育\t156757\n10棵\t156758\n上古卷轴4\t156759\n一群群\t156760\n神经病学\t156761\n昌平园\t156762\n再讲\t156763\n2018榴\t156764\nDecade\t156765\n中华人民共和国仲裁法\t156766\n临时工\t156767\nAWE\t156768\n建筑安装工程费用项目组成\t156769\n痛痛快快\t156770\n脚臭味\t156771\n专列\t156772\n5W1H\t156773\noncology\t156774\n硬装\t156775\nIPR\t156776\nstaple\t156777\n55.0\t156778\n老园丁\t156779\n谋生\t156780\nspring+mybatis\t156781\n秦夫人\t156782\n颈椎按摩器\t156783\n工银瑞信睿智\t156784\nsuggest\t156785\nzhenshi\t156786\n酚酸\t156787\n55级\t156788\n2108年\t156789\n旋转蒸发器\t156790\n白带常规检查\t156791\n非那雄胺\t156792\n老幺\t156793\n绝地求生新地图\t156794\n两桶\t156795\nConcordia\t156796\n一杠\t156797\n30px\t156798\n34uc88\t156799\n会声会影11\t156800\n支付宝认证\t156801\n柔石\t156802\n旷\t156803\n飞思卡尔\t156804\n韩女团\t156805\n世茂香槟湖\t156806\n聚醚\t156807\n蕴\t156808\n19世纪70年代\t156809\n霞云岭\t156810\n吉林大学研究生院\t156811\n0759\t156812\n6200\t156813\n流照\t156814\n年玲奈\t156815\n尊邸\t156816\nBeads\t156817\nkof97\t156818\n好手\t156819\n选样\t156820\n介壳虫\t156821\n海湾安全技术有限公司\t156822\n襄州区\t156823\n油菜花节\t156824\n沙画版\t156825\nGARDEN\t156826\nSINGAPORE\t156827\n6.60\t156828\na498
8\t156829\n主人杯\t156830\n维尼夫妇\t156831\n雨桐\t156832\n王八\t156833\n刘洪波\t156834\nE书\t156835\n迈腾B6\t156836\nemoji表情\t156837\n奶粉罐\t156838\n星舞\t156839\n棕绳\t156840\n庚新\t156841\n轧制\t156842\n查特\t156843\n89\t156844\nmatlba\t156845\n花式篮球\t156846\n盖浇饭\t156847\nmicro\t156848\n寻找\t156849\n20171114\t156850\n入墙式\t156851\n交头接耳\t156852\n上诉\t156853\nYY6042玄天影视网\t156854\n九九归医\t156855\nmysql-proxy\t156856\n恋歌\t156857\n系统硬盘分区\t156858\n20160117\t156859\n劝君\t156860\n考学\t156861\n钻攻机\t156862\n凡普信\t156863\n纳溪\t156864\n重庆工贸职业技术学院\t156865\nBlog\t156866\n标的物\t156867\nrotating\t156868\n肠粉机\t156869\n盛唐妖异志_盛唐妖异志在线漫画全集_最新盛唐妖异志\t156870\n防风通圣丸\t156871\nzhihua\t156872\n上海海事法院\t156873\n不重样\t156874\n0725\t156875\n碳酸根离子\t156876\n50t\t156877\n台花\t156878\n血气方刚\t156879\ntorch7\t156880\n唐宁墨霆\t156881\nFatRat\t156882\n联动优势科技有限公司\t156883\n新绿\t156884\n增股\t156885\n艾滋病毒\t156886\n郑商所\t156887\n得癌\t156888\n术师\t156889\n古村落\t156890\n含山县政府\t156891\n明台\t156892\nzainali\t156893\n直流继电器\t156894\n凤鸣镇\t156895\n莱茵\t156896\n泥蒿\t156897\n美食\t156898\n蒙古语\t156899\n青海新闻网\t156900\n邓小平理论概论\t156901\n2017年2月18日\t156902\n段落符\t156903\n威廉\t156904\n几梯\t156905\npedro\t156906\n预告登记\t156907\n八亿\t156908\n扇脸\t156909\n集中\t156910\n东边\t156911\n批评家\t156912\n淘宝网\t156913\n稳定型\t156914\n铎\t156915\n950xl\t156916\n矛头\t156917\n智达\t156918\n大荒\t156919\n村名\t156920\n鸡翅\t156921\n呱呱洗车\t156922\n丰台区\t156923\n改革者\t156924\n日志类\t156925\n检测率\t156926\n赫曼德\t156927\n宿敌\t156928\ngrinder\t156929\n蒙台梭利\t156930\n9000E\t156931\n20n\t156932\nコト\t156933\nBibTex\t156934\n齐鑫\t156935\n邹明轩\t156936\nRX560\t156937\n头彩\t156938\n交并\t156939\nCAD平面设计图\t156940\n液枪\t156941\n龙腾世纪3\t156942\n亚俱杯\t156943\n现代远程教育学院\t156944\n赵露\t156945\n量子通信\t156946\n同城群\t156947\n加长车\t156948\n齐木楠雄的灾难\t156949\n同底数幂\t156950\n撤机\t156951\n活下去\t156952\n精密度\t156953\n软芯\t156954\n脾胃科\t156955\n社会保险金\t156956\n信步\t156957\n国兽\t156958\n维加\t156959\n角花\t156960\naddClass\t156961\n漫畫繁體版\t156962\nswot\t156963\n爱的精灵\t156964\n里屋兰丸\t156965\n白糖水\t156966\n华为终端公司\t156967\n网易mumu\t156968\nhomogeneous\t156969\nYJBYS\t156970\n汉考克\t156971\n恒温恒
湿箱\t156972\n生当\t156973\n口料\t156974\n分频器\t156975\n碎心\t156976\n牛元帅\t156977\n関ジャニ\t156978\n受辱\t156979\n曹德\t156980\n中汽协\t156981\n泰安古镇\t156982\n知识产\t156983\n凌长\t156984\n高凌风\t156985\nkabibo\t156986\n胡开文\t156987\n职业暴露\t156988\n16章\t156989\n残像\t156990\n电脑软硬派\t156991\n安居乐业\t156992\n八百里\t156993\n法律故事\t156994\n电极式\t156995\n资厅\t156996\nBien\t156997\nporm\t156998\n10个月\t156999\nhedge\t157000\n11n\t157001\n唇疱疹\t157002\n透明质酸酶\t157003\n综合行政执法局\t157004\n仓山\t157005\n锐拓\t157006\n炼药\t157007\n北固\t157008\n大同御东新区\t157009\n更衣柜\t157010\n依赖度\t157011\n远洋渔业\t157012\n易享\t157013\n燕雀安知鸿鹄之志\t157014\n田头村\t157015\n结构胶\t157016\n芭莎珠宝\t157017\n漂洋过海来看你\t157018\n瘦下来\t157019\n赔款\t157020\n观相\t157021\n八十天\t157022\n电源头\t157023\n隔离柱\t157024\nPE塑料\t157025\nnwjs\t157026\ncris\t157027\n至宝\t157028\n徐敏静\t157029\n洗澡盆\t157030\n自卸车\t157031\n进销存管理软件\t157032\nredundant\t157033\n黑花\t157034\n脱腿\t157035\n羊刃\t157036\nExchange\t157037\n1000集\t157038\n饭叶\t157039\nKepware\t157040\nism\t157041\nPlasma\t157042\n重庆公司\t157043\n第六十四\t157044\n笔直\t157045\nmagnolia\t157046\n晋城\t157047\nElvis\t157048\nOCF\t157049\n高炉\t157050\n彭德怀\t157051\n外泌体\t157052\n冷兔\t157053\ntime在线翻译\t157054\n4.8G\t157055\n冉尔\t157056\n汉明吧\t157057\n无锡威盛新材料科技有限公司\t157058\nwindous\t157059\nRayleigh\t157060\n1部\t157061\n马友仙\t157062\n电热鼓风干燥箱\t157063\nDistant\t157064\ninventer\t157065\n营业税金及附加\t157066\n今年7月份\t157067\n爱彼\t157068\n阿斯利康\t157069\n速度与激情6\t157070\n佳妮\t157071\nchive\t157072\n铁杆\t157073\n金字塔\t157074\n获知\t157075\nChaturbate\t157076\nkor\t157077\n3dsMax\t157078\nInclude\t157079\n十九条\t157080\n上海大厦\t157081\nAc\t157082\n蟒皮\t157083\n胎心仪\t157084\n万盛黑山谷\t157085\n教练证\t157086\n赵瑜\t157087\n108名\t157088\n包含\t157089\nPlaySoft\t157090\n取心\t157091\n贴牌\t157092\n山西应用科技学院\t157093\n枫爷\t157094\n鲜炖燕窝\t157095\n施华洛世奇天鹅\t157096\n堆糖网\t157097\nyiu\t157098\n81%\t157099\n贵港市港南区\t157100\n枯\t157101\n椎名桃子\t157102\n幼林\t157103\n我走\t157104\nwordPress\t157105\n想通\t157106\n陆王\t157107\n师珠\t157108\n左罗\t157109\n衰弱\t157110\n次级债\t157111\n东原印\t157112\n省政府安委会\t157113\n文胸\t157114\n裸腿\t157115\n邢台职业技术学院\t
157116\nmechanic\t157117\n秩序册\t157118\n42U\t157119\nVOEZ\t157120\n开胯\t157121\n中建一局集团第一建筑有限公司\t157122\n永茂\t157123\n统计与决策\t157124\n纳粹主义\t157125\nWaves\t157126\n课金\t157127\n备用金\t157128\n芳草\t157129\n可丽\t157130\n麻栗坡县\t157131\n拉网式\t157132\n3至5年\t157133\nEVM\t157134\n106.7\t157135\n行李\t157136\n安庆市中级人民法院\t157137\n黑雪\t157138\n郭培\t157139\n郄鹏恩\t157140\n暗月岛\t157141\nU6\t157142\nAppliance\t157143\n彩珠\t157144\n明朗\t157145\n排挡\t157146\nopenwrt\t157147\n玉瓷\t157148\n啾啾岛\t157149\n千门\t157150\n异数\t157151\nFreeType\t157152\n年轻态\t157153\n泽拉图\t157154\n卡饭网\t157155\n悍达\t157156\n正规化\t157157\n小放牛\t157158\n战途\t157159\n鸥鹭\t157160\nsmaba\t157161\nBND\t157162\n小姐姐\t157163\n下级机关\t157164\nC81\t157165\n天妖\t157166\n上海装修公司\t157167\nbjjs\t157168\n团委会\t157169\n电脑麦\t157170\n金意\t157171\n2016届\t157172\n大农村\t157173\n硝苯地平\t157174\n硅谷天堂资产管理集团股份有限公司\t157175\n剑指Offer\t157176\n重庆机场\t157177\n爱看电影网\t157178\n美女们\t157179\n麦芽糖浆\t157180\n风口\t157181\nHAproxy\t157182\n连线\t157183\n退休金\t157184\n尼桑轩逸\t157185\n如家商旅酒店\t157186\nthinkjs\t157187\n120#\t157188\n分获\t157189\n日度\t157190\n点金\t157191\n英菲尼迪QX30\t157192\nTALES\t157193\n车把手\t157194\n人与兽\t157195\n轴承套\t157196\n300亿元\t157197\n嘉泽镇\t157198\n凯恩之角_暗黑3\t157199\n电商报\t157200\nDistributor\t157201\n服务券\t157202\n上海外国语大学西外外国语学校\t157203\n保定三中\t157204\n王传越\t157205\n野望\t157206\n霞姐\t157207\nconcluded\t157208\n郭品超\t157209\nRecognizer\t157210\n大国梦\t157211\n托森差速器\t157212\nKimmy\t157213\n恳谈\t157214\n悦雅\t157215\n統\t157216\n赵本爱因斯坦\t157217\n两只蝴蝶\t157218\n夜游神\t157219\n豚草\t157220\n封神记\t157221\n大连星海公园\t157222\n街拍客\t157223\n一竖\t157224\nwu\t157225\n儿童色\t157226\n三围\t157227\n毁灭性\t157228\n维克多·雨果\t157229\ndwcc\t157230\n王羽\t157231\n爱民区\t157232\n运距\t157233\n既定\t157234\n中国汽车流通协会\t157235\n山西地区\t157236\n珠宝学院\t157237\n在风中\t157238\nv片\t157239\n微乎其微\t157240\n闹中\t157241\n六龙飞天\t157242\n叶儿\t157243\n牛校\t157244\n直角坐标机器人\t157245\n为牢\t157246\n贺兰\t157247\nU盘\t157248\n红黄蓝亲子园\t157249\n吴鸣\t157250\n用友软件园\t157251\nniff\t157252\nWake\t157253\n170101\t157254\nlaomiw\t157255\n丁晖\t157256\n王者之心\t157257\n马自\t157258\n中国财富网\t157259\
n烟感\t157260\nimages\t157261\n微赞\t157262\n马斯\t157263\n小米NOTE\t157264\n售罄率\t157265\n汉中门大街\t157266\n转校\t157267\nselect2\t157268\n前科\t157269\n阿娜尔罕\t157270\n政研室\t157271\n第74号\t157272\n华为P7\t157273\n玲珑石\t157274\n3000台\t157275\n邓艾\t157276\nSC圣才学习\t157277\n南屏\t157278\n四小时\t157279\n齐集\t157280\n阿瑞娜\t157281\n明发商业广场\t157282\n彩神通\t157283\n沿河土家族自治县政府\t157284\nidentifiers\t157285\n雪野湖\t157286\n多嘴肉蟹煲\t157287\n山西环保厅\t157288\n右转\t157289\n山海三国如龙传\t157290\n新山村\t157291\n石匠\t157292\nfirewall\t157293\n单目相机\t157294\n鸭爪\t157295\nxiayi\t157296\n山西省农村信用社\t157297\n锋面\t157298\nBITCH\t157299\n吴晓峰\t157300\n大汗\t157301\n邹军\t157302\n铂涛\t157303\n岫岩满族自治县\t157304\nSexy\t157305\n台泥\t157306\n二类\t157307\nwow燃烧王座\t157308\n校长\t157309\n漫画簿\t157310\nStriped\t157311\n宫格\t157312\n片区\t157313\ndota2卡\t157314\n全国质量奖\t157315\nglitch\t157316\n书院镇\t157317\n第133集\t157318\n巴布瑞\t157319\nnanren\t157320\n列管式换热器\t157321\n铁炉\t157322\nBuying\t157323\n姚\t157324\n30册\t157325\n家国\t157326\n满遗\t157327\n出没\t157328\nsiento\t157329\nshare\t157330\n双星新材\t157331\n1岁\t157332\n15年7月\t157333\n九州糖酒网\t157334\n2月10日\t157335\n对外直接投资统计公报\t157336\n勇度\t157337\ngods\t157338\n数据校验\t157339\n钱学罗斯\t157340\n疯魔鲁智深\t157341\n莱卡棉\t157342\n2.5分米\t157343\nzhu329599788@126\t157344\nFranc\t157345\n蛊\t157346\ngazebo\t157347\n一泊\t157348\n神君\t157349\nC++类\t157350\nSLM\t157351\nh61m-ds2\t157352\nCayman\t157353\n三花智控\t157354\n玉藻\t157355\n活版\t157356\nDev\t157357\n斯蒂芬妮\t157358\n铁道部\t157359\n王义博\t157360\n万兽\t157361\nibm\t157362\n望江先锋网\t157363\n小米影视\t157364\n台南\t157365\n做局\t157366\n毒犯\t157367\n甜丝丝\t157368\n命例\t157369\n重轨\t157370\n效果字\t157371\n新联会\t157372\n玉不琢不成器\t157373\n重庆火锅底料\t157374\nwebim\t157375\n靖江路\t157376\n朴泰桓\t157377\n慈诚罗珠堪布\t157378\n圣诞帽\t157379\n杜鹃花海\t157380\nzhiye\t157381\n丰田红杉\t157382\n吸收塔\t157383\n汽车之家\t157384\nangular2\t157385\n所得税汇算清缴\t157386\n海扶刀\t157387\n小盆友\t157388\n金地国际花园\t157389\n左翼\t157390\n虐待\t157391\n徐亮\t157392\n鼻干\t157393\n5.85\t157394\nblessing\t157395\n禁宫\t157396\n糯米网\t157397\n姜家志\t157398\n娘化\t157399\n洛先生\t157400\nU3D\t157401\n伊力特\t157402\n大
家谈\t157403\n有机化学专业\t157404\n悠哉日常大王\t157405\ncrowne\t157406\n草市街\t157407\n钢铁网\t157408\n钟馗\t157409\n动视暴雪\t157410\nstdc\t157411\n图种\t157412\n上海2号\t157413\n反贪局\t157414\n李泉\t157415\n运控\t157416\n创研\t157417\n硬脂酸酯\t157418\nsub\t157419\n湖南中烟\t157420\n家底\t157421\n梅关古道\t157422\n布鲁克林大桥\t157423\n补款\t157424\n纳智捷优6\t157425\n箱车\t157426\n医生式\t157427\n标普500指数\t157428\n酒精性肝病\t157429\nmbr格式\t157430\n碧生源\t157431\n魔师\t157432\nMonk\t157433\n残云\t157434\nHansgrohe\t157435\n梁锦松\t157436\n多囊性\t157437\n联想Yoga\t157438\n减免\t157439\n前中\t157440\n隔离乳\t157441\npics\t157442\n襄樊市\t157443\n排险\t157444\n小故事\t157445\n历史使命\t157446\n中国梦劳动美\t157447\noffs\t157448\n如何样\t157449\n甲儿\t157450\nGoogle地图\t157451\n武汉美的\t157452\n子线程\t157453\n占用费\t157454\n远东集团\t157455\n坎特伯雷故事集\t157456\n牌令\t157457\n得克萨斯\t157458\n盛大文学\t157459\naxure\t157460\n輝月\t157461\n14G\t157462\nRegulators\t157463\n巴塞尔表展\t157464\n肽段\t157465\nTricky\t157466\n南通市监察局\t157467\n甲申\t157468\n石艺\t157469\n33cm\t157470\n粗陶\t157471\nb85\t157472\nCAJ格式\t157473\n红外光谱法\t157474\nbaic\t157475\n节结\t157476\nKwok\t157477\n搞基\t157478\n无失\t157479\n瑞家\t157480\n怪物猎人online\t157481\n长沙市国家税务局\t157482\n雅思剑\t157483\n吃饭\t157484\n非齐次线性方程组\t157485\n吊梨\t157486\n爱徒\t157487\n云图tv\t157488\n第十五套\t157489\n8.5元\t157490\n手语版\t157491\n分洪\t157492\n604\t157493\n无尘室\t157494\n赏罚\t157495\n十号\t157496\n三义庙\t157497\n曼牌\t157498\nImagenomic\t157499\nCSS伪元素\t157500\nWell\t157501\n松紧绳\t157502\n乔念\t157503\n瑞虎仙路至尊\t157504\n心木\t157505\n40支\t157506\n细胞外液\t157507\n尹卓\t157508\ninjustice\t157509\n美杜莎\t157510\n刘继红\t157511\n生物界\t157512\n曾子\t157513\n叶非夜\t157514\nWilliams\t157515\n第八十七章\t157516\n澳邮中国快运\t157517\n风炮\t157518\nj4\t157519\n小调\t157520\nSoc\t157521\nTEDx\t157522\n第31个\t157523\n雅思托福\t157524\nWoodland\t157525\n金渝\t157526\n德克萨斯大学奥斯汀分校\t157527\n车载音乐网\t157528\n莱阳市\t157529\nopenni\t157530\n福禄寿喜财\t157531\np1606dn\t157532\nZo\t157533\nchainsmokers\t157534\n田川\t157535\nma2\t157536\n摩羯智投\t157537\n锋彩\t157538\nemf格式\t157539\n放寒\t157540\n沃格\t157541\n熔块\t157542\n撕名牌\t157543\n8厘米\t157544\n典礼\t157545\n滤色镜\t157546\nTSKS
\t157547\n邵阳县人民政府\t157548\n海天盛宴\t157549\n阳光人寿保险\t157550\n忘记\t157551\n利生\t157552\nOl\t157553\n鲜言\t157554\n斜顶阁楼\t157555\n6月30号\t157556\n朝闻道\t157557\n第六卷\t157558\nfreeporn\t157559\n养生杖\t157560\npeid\t157561\n综艺最看点\t157562\nlords\t157563\n4X\t157564\nnhs\t157565\n女主\t157566\n广域\t157567\n九层\t157568\n北京四季酒店\t157569\ncfp\t157570\nRetrof\t157571\n佳乐\t157572\n天趣\t157573\niscsi\t157574\n均势\t157575\n廖亚凡\t157576\n凤凰书店\t157577\n礼宾\t157578\n淤地坝\t157579\n绵阳职业技术学院\t157580\n奖章\t157581\n龙腾卡\t157582\n英雄令\t157583\n陆风x2\t157584\n泛华保险\t157585\n航企\t157586\n首诗\t157587\n突变\t157588\n止庵\t157589\n3.8亿元\t157590\n就是说\t157591\n咖啡吧\t157592\n玄殿\t157593\n387号\t157594\n锁表\t157595\nFUSION\t157596\n公共交易\t157597\nPaho\t157598\n顺德城市网\t157599\n方方面面\t157600\nNUMPY\t157601\n行星架\t157602\noox\t157603\n动物学\t157604\n严颜\t157605\n2016_凯风网\t157606\n凤桥\t157607\n乐游加速器\t157608\n酸性食品\t157609\n松岗街道\t157610\n更胜一筹\t157611\n绿地城\t157612\n短促\t157613\n黑轴\t157614\n直角握把\t157615\n天津中心妇产\t157616\n黑铁\t157617\n倒板\t157618\n环球财经连线\t157619\n资本公积金转增股本\t157620\n金投贷款\t157621\nnote5a\t157622\nxxy\t157623\nBONES\t157624\n复旦大学计算机科学技术学院\t157625\n奥康纳\t157626\n单调函数\t157627\n董祥\t157628\n5平方\t157629\n第1场\t157630\n羊奶粉\t157631\nOUTFILE\t157632\nCanberra\t157633\n25566000\t157634\n周建龙\t157635\n离殇\t157636\n10.8.5\t157637\n垂钓\t157638\n初雪\t157639\n神农架大九湖\t157640\nplm\t157641\n电视柜\t157642\n已税\t157643\nJAVHD\t157644\n1987年\t157645\n欣欣然\t157646\n罒\t157647\n左转\t157648\n防范\t157649\n商超\t157650\n美林湖\t157651\n海藻油\t157652\n摘句\t157653\n1008\t157654\n第55届\t157655\n韩越\t157656\n潮音\t157657\n性化\t157658\n221路\t157659\n郭芳\t157660\n大女\t157661\n不认\t157662\n2616\t157663\n隔热板\t157664\nascll码\t157665\ngases\t157666\nf431\t157667\nJS\t157668\n白河湾\t157669\n公堂\t157670\ndrawtext\t157671\n热苏斯\t157672\n小塔\t157673\n搀扶\t157674\n小林家的龙女仆\t157675\nf583\t157676\ntfw\t157677\n亡灵勇士\t157678\n乐势力\t157679\n15题\t157680\n山东省地震局\t157681\n蓝莓酒\t157682\n浐灞生态区\t157683\nfat16\t157684\nAngew\t157685\n中国网上音乐学院\t157686\n怎么回\t157687\n小爱神\t157688\n4WD\t157689\n127路\t157690\nExpected\t157691\n鼎尖\t15769
2\nmizuki\t157693\n北京新房网\t157694\n刻盘\t157695\n组织员\t157696\n莲菜\t157697\n头桥\t157698\n北京西路\t157699\nADULT\t157700\n稻穗\t157701\nBehavior\t157702\n午餐盒\t157703\n非议\t157704\n高等数学\t157705\nKar98k\t157706\n金銮殿\t157707\n777米奇影院\t157708\n北京大学研究生院\t157709\nGTP谱\t157710\n吉利远景x3\t157711\n死亡独轮车\t157712\n石块\t157713\nAIRBNB\t157714\n快速记忆法\t157715\n沙士\t157716\n医武兵王\t157717\n三十岗\t157718\n友谊\t157719\nwrc\t157720\n创客们\t157721\n思科交换机\t157722\nTanWan\t157723\n点化石\t157724\n杭州师范大学外语学院\t157725\nTreeSet\t157726\n威驰\t157727\n树敌\t157728\n我的旋风少女\t157729\n泊松比_\t157730\n超纯水机\t157731\n高航网\t157732\n镇派\t157733\n八桂嘉年华\t157734\n速信\t157735\nLAPACK\t157736\n净角\t157737\n成都西川中学\t157738\n生日礼\t157739\n罗马式\t157740\n涨乐财富通\t157741\n领围\t157742\n公之于众\t157743\n折页机\t157744\n书趣\t157745\n招录比\t157746\n限位装置\t157747\nMentholatum\t157748\n考勤管理软件\t157749\n沙美特罗替卡松粉吸入剂\t157750\n潘玲玲\t157751\n赵猛\t157752\n姓孙\t157753\n销往\t157754\npoet\t157755\n女中音\t157756\n看重\t157757\nTIA\t157758\n非机动车\t157759\nxenomai\t157760\n官衔\t157761\n激光器\t157762\n国际转运公司\t157763\n平舆县人民政府\t157764\n侠客\t157765\n南京公园\t157766\nMC天佑\t157767\n上海电影学院\t157768\n姜蒜\t157769\n屋漏偏逢连夜雨\t157770\n机枪手\t157771\n误差值\t157772\n老鹿\t157773\n强奸率\t157774\n江门市区\t157775\n连不上\t157776\n程开甲\t157777\n蛐蛐钢琴网\t157778\n双位\t157779\nBBA\t157780\n打服\t157781\nAllan\t157782\n王晓棠\t157783\n皮带机\t157784\n玻镁彩钢板\t157785\n被赞\t157786\nOPPOR15\t157787\n部员\t157788\n宁德市人民政府\t157789\n广州市总工会\t157790\n变化\t157791\n吹落\t157792\n海军上将\t157793\n插坐\t157794\n科尔伯格\t157795\n福建省新闻出版广电\t157796\n王耀庆\t157797\nweganme\t157798\n右腿\t157799\nconspiracy\t157800\n团会\t157801\n素白\t157802\nRAID卡\t157803\n0101\t157804\n盛发\t157805\n徐晶晶\t157806\n素雅\t157807\nfsu\t157808\nupdf\t157809\n水性树脂\t157810\nhome吧\t157811\nLoL\t157812\nVulnerability\t157813\npro4\t157814\n京剧\t157815\n国际金融\t157816\npha2a\t157817\n375号\t157818\n达斡尔族\t157819\n第112\t157820\n悔婚\t157821\n电脑锁屏\t157822\n高桥石化\t157823\n叶天熠\t157824\nFigures\t157825\n褐斑\t157826\nxgg\t157827\nflowering\t157828\n华丽的外出\t157829\n冒顿\t157830\n提标\t157831\n桦树汁\t157832\nComfortable\t157833\n鱼头汤\t1578
34\n暗无天日\t157835\n太宰府\t157836\n大满洲国\t157837\n不在场\t157838\n赛业\t157839\n铁路局\t157840\n春风吹\t157841\n祝\t157842\n维尔\t157843\n940nm\t157844\n微创\t157845\n十多种\t157846\n7815\t157847\n平远\t157848\n枢轴\t157849\n应天府\t157850\n米林县\t157851\n轻小说\t157852\n酷居\t157853\n非农数据\t157854\n0519\t157855\n舌剑\t157856\n南京物联传感技术有限公司\t157857\n猫尾\t157858\n建外街道\t157859\n裸子\t157860\n武汉地区\t157861\n秘话\t157862\n20140702\t157863\n展览品\t157864\n束带\t157865\n主谓\t157866\nBTkitty\t157867\nAustin\t157868\n助理员\t157869\n樊振东\t157870\n细活\t157871\n服务型\t157872\n二厢\t157873\n琪琪电影网\t157874\n滨江天街\t157875\n阿尔派\t157876\n回归\t157877\n美宿\t157878\nTEC\t157879\n管处\t157880\n尤弥尔\t157881\n中国建始网\t157882\nブル\t157883\nWex5\t157884\nsquirt\t157885\n糖化血红蛋白\t157886\n2板\t157887\n40式\t157888\n李琳玥\t157889\n色块\t157890\n天玥\t157891\n髌腱炎\t157892\nv2.4.7\t157893\n封神召唤师\t157894\n蔡家\t157895\n喷射器\t157896\n文星镇\t157897\nAZW3\t157898\n采菱\t157899\n家商\t157900\njapanes\t157901\nNEXO\t157902\n紫光电子\t157903\n药方\t157904\n马丽\t157905\n艾科\t157906\n四川大学华西临床医学院\t157907\n海洛斯\t157908\n符石\t157909\n招远\t157910\nBoundary\t157911\nVincentCZW\t157912\nlanse\t157913\n分期吧\t157914\n热情\t157915\n常熟房\t157916\n50台\t157917\n3.3.8\t157918\ntpimage\t157919\n砍人\t157920\nRadware\t157921\n.so.6\t157922\n中华龙\t157923\n大别墅\t157924\n任鹏飞\t157925\n终期\t157926\n儿行\t157927\n捷克语\t157928\n丹皮\t157929\n1q84\t157930\n西部新城\t157931\n上学\t157932\n下划线\t157933\n铁中棠\t157934\n李秋水\t157935\n鼻子\t157936\n血泪\t157937\n风的季节\t157938\n中新社\t157939\n录屏\t157940\n梢头\t157941\n东汉末年\t157942\n方片\t157943\nkodak\t157944\n百度云1080P/BD\t157945\n智能相机\t157946\n书痴\t157947\n岩田\t157948\nmllib\t157949\nvital\t157950\n侵占案\t157951\nX射线\t157952\n关帝\t157953\n飞飞飞\t157954\n爱财\t157955\n不能玩\t157956\n大仓\t157957\n大朵\t157958\n王廷江\t157959\n协保\t157960\n多玩斗战神\t157961\n被告人\t157962\n帕拉梅拉\t157963\n浙江横店影视职业学院\t157964\n东方证券股份有限公司\t157965\n流云\t157966\n网宿科技\t157967\n出鞘\t157968\n毒粮\t157969\n比戈\t157970\n和祥\t157971\n等离子体\t157972\n2018年3月23日\t157973\n云客网\t157974\n沈北新区\t157975\n高密度纤维板\t157976\n国际广播电台\t157977\nEPE\t157978\n夜市人生\t157979\n黄盖\t157980\n第八次\t157981\
n浙组\t157982\n难免\t157983\n糖衣\t157984\nDeSmuME\t157985\n融创中国\t157986\n满天飞\t157987\navocado\t157988\n预结算\t157989\nmodifications\t157990\n282号\t157991\n高木早希\t157992\n三险一金\t157993\n座狼\t157994\n金保险\t157995\n李小双\t157996\n中鸽网\t157997\n哈儿\t157998\n黑人\t157999\n南桥\t158000\n死丝方\t158001\n18.0.0\t158002\ndengdeng\t158003\n三A\t158004\nELECTRIC\t158005\n国网河南省电力公司\t158006\nOffering\t158007\n九转大肠\t158008\n朝山\t158009\nBundles\t158010\nYidian\t158011\n前列腺钙化灶\t158012\n两路\t158013\n达飞云贷\t158014\n18岁者\t158015\n2013年以来\t158016\n大张旗鼓\t158017\n12回\t158018\n土族\t158019\nnishi\t158020\n电子商务部\t158021\nTOPS\t158022\n竹鼠\t158023\n_书面语\t158024\n阴器\t158025\n雷神之锤\t158026\n吵不吵\t158027\n古田会议\t158028\n菲涅尔\t158029\nSON\t158030\nCoils\t158031\n牙型\t158032\nreina\t158033\nGeneric\t158034\n1一6\t158035\n低压接触器\t158036\ninstability\t158037\n0.375\t158038\n保时捷718\t158039\nartdialog\t158040\n相亲\t158041\nabs函数\t158042\n五峰县\t158043\n大衫\t158044\nnatasha\t158045\n正邦科技\t158046\n5月8号\t158047\n龚全珍\t158048\n浙江队\t158049\nLinux内核源代码\t158050\n起帆\t158051\n文竹\t158052\n第四本\t158053\nios13\t158054\n众泰t600\t158055\n智飞\t158056\n萧绎\t158057\n600289\t158058\n世茂公园美地\t158059\nremainder\t158060\n曈\t158061\n翼教版\t158062\nobjectc\t158063\n大泽乡\t158064\n承销\t158065\n校园\t158066\n2018.4.6\t158067\n中华民国\t158068\n柏林寺\t158069\n囫囵\t158070\nauthz\t158071\n群发\t158072\nDezeen\t158073\n山西网络广播电视台\t158074\n怡景湾\t158075\n欧陆战争\t158076\n徐字\t158077\n再买房\t158078\n东方城\t158079\n狗皇\t158080\n金海湾花园\t158081\n翁祖亮\t158082\n65万元\t158083\n问法\t158084\n炒蛋\t158085\n拉斐特\t158086\nf406\t158087\n路虎极光\t158088\n组蛋白\t158089\n挣值\t158090\n小米手机浏览器\t158091\nunsuccessful\t158092\n非人马自达6\t158093\n芒芒\t158094\n直流电流\t158095\n倒装\t158096\ngstreamer\t158097\n失业证\t158098\n银川\t158099\n4平方\t158100\n53231323\t158101\nsdc\t158102\n主标\t158103\n结核杆菌\t158104\n非诉\t158105\n东渚\t158106\nrotten\t158107\n108招\t158108\n吴柏松\t158109\n萝莉吧\t158110\n都市时报电子报纸—彩龙\t158111\n苇河\t158112\nvp\t158113\n公开日\t158114\n高效减水剂\t158115\nUnsupported\t158116\n组胺\t158117\n前程似锦\t158118\n东冲\t158119\n金城江区\t158120\n铰\t158121\n年轻的母亲1\t1
58122\nChampions\t158123\n警示片\t158124\naini\t158125\n骓驰\t158126\n夏令时\t158127\n黑锋\t158128\n李振华\t158129\n非正态\t158130\n石决明\t158131\n长嘉汇\t158132\n黔讯网\t158133\n希实\t158134\n85度\t158135\n豫北\t158136\n72周年\t158137\n帕吉\t158138\n肾脏科\t158139\n积层\t158140\n乙苯\t158141\n搜狗地图\t158142\n这四种\t158143\n电脑卡\t158144\neprom\t158145\n罗家伦\t158146\n花样年集团(中国)有限公司\t158147\n三菱化学\t158148\n方园\t158149\n标图\t158150\n血帆\t158151\n萝莉阁\t158152\n更甚\t158153\n管渠\t158154\nkuaishou\t158155\nwarrobots\t158156\n北京师范大学化学学院\t158157\n8项\t158158\n白驼山庄\t158159\n坠落\t158160\n预混\t158161\n先天\t158162\n枸地氯雷他定片\t158163\n专贴\t158164\n夜色美\t158165\n张薇\t158166\n度表\t158167\n觉醒来\t158168\n000820\t158169\n文编\t158170\n陈老\t158171\ndime\t158172\n高岭土\t158173\n代王\t158174\n薇诺娜\t158175\n布道石\t158176\n驾驶室\t158177\n鞋展\t158178\n凤舞天骄\t158179\n武汉家装网\t158180\n前五分钟\t158181\n02号\t158182\n20141217\t158183\nmartin\t158184\n之侧\t158185\n勤学网\t158186\n白溪\t158187\nsuki\t158188\nTPN\t158189\n显示\t158190\ncounty\t158191\n麓谷\t158192\n月狗\t158193\n奶爸良辰讵可待\t158194\n江南雨\t158195\n零壹财经\t158196\nReality\t158197\n张爱玲\t158198\n六脚\t158199\nASX\t158200\n英女王\t158201\n就好\t158202\n翔野\t158203\n黄城\t158204\nnsx\t158205\nf558\t158206\n并列连词\t158207\n纯电动轿车\t158208\nJTGD\t158209\n依云水岸\t158210\n眼镜框\t158211\n广田\t158212\n外贸人\t158213\n御姐归来\t158214\n慢粒\t158215\nBD1280超清\t158216\n山西省工商行政管理局\t158217\n游戏加速器\t158218\n美品\t158219\n高州市人民政府\t158220\n喇叭裤\t158221\n算命生辰八字婚姻-八字姻缘测试-合八字\t158222\n300418\t158223\nmarkdown\t158224\n相符合\t158225\n空军总医院\t158226\n喵星人\t158227\n东西湖区人民政府\t158228\n实验课\t158229\n低密度灶\t158230\n亲政\t158231\n天鹅\t158232\n双河市\t158233\n在线翻译\t158234\n八面埋伏\t158235\n参茸鞭丸\t158236\n秧鸡\t158237\n湖里区人民政府\t158238\nBIN\t158239\nCDRX8\t158240\n车垫\t158241\n机械版\t158242\nCivil\t158243\nOBIEE\t158244\nWunderlist\t158245\n3000多名\t158246\nUnfold3D\t158247\nprettify\t158248\nchia\t158249\noppor\t158250\n羽冠\t158251\nmeiju\t158252\n碳酸钠\t158253\n旧闻\t158254\n现照\t158255\n经纬仪\t158256\n直邮店\t158257\n支气管哮喘\t158258\n堰塘\t158259\nSeparate\t158260\n优惠购\t158261\n用友集团\t158262\n23价\t158263\n双培计划\t158264\n陈文穆\t158265\
n虎头鲨\t158266\n90B\t158267\n串转\t158268\n江苏省锡山高级中学\t158269\n民间传说\t158270\nlas\t158271\n三分地论坛面经版\t158272\n军星\t158273\n欧阳路\t158274\n星子县\t158275\n奔跑吧兄弟第四季\t158276\n郭子仪\t158277\nCOHIBA\t158278\nvore\t158279\n性器\t158280\n思埠集团\t158281\n襄\t158282\n厦门市\t158283\n武汉大学珞珈学院\t158284\nTeamviewer13\t158285\n天齐锂业\t158286\nmsl\t158287\n理研\t158288\n萝北县\t158289\n迅雷云\t158290\n180408\t158291\n赵公子\t158292\n湖州市人民政府办公室\t158293\n净心\t158294\n泉州晚报\t158295\n喋血孤岛\t158296\n王屹芝\t158297\nMagna\t158298\n凝固\t158299\n雪饼\t158300\n永鑫\t158301\n狗骨\t158302\nPirate\t158303\n79式\t158304\n剑圣\t158305\n敖鲁古雅\t158306\n15693\t158307\nStokke\t158308\n60千克\t158309\n华妃\t158310\n史料\t158311\n五脚\t158312\ncpanel\t158313\n宠才网\t158314\n竹楼\t158315\nsmartforms\t158316\n月坛北街\t158317\n两金\t158318\n第12位\t158319\nWrecking\t158320\n拨叫\t158321\nGuidelines\t158322\n吡拉西坦片\t158323\n二传手\t158324\ncreo工程图\t158325\n夏墨\t158326\n保安部\t158327\n胡浩亮\t158328\nlucid\t158329\n南京市政务服务中心\t158330\n陌生感\t158331\n江西省住房和城乡建设厅\t158332\n理光7001\t158333\n中航小镇\t158334\nhget\t158335\nzadd\t158336\n喷笔\t158337\nconsortium\t158338\n巴别塔\t158339\n高斯金字塔\t158340\n盐城工学院\t158341\n物联网产业园\t158342\n自然人独资公司\t158343\n星际争霸2中文网\t158344\nStarts\t158345\n双镯\t158346\ncnt\t158347\nT72\t158348\nIVD\t158349\n制造型\t158350\n实况足球\t158351\n理清\t158352\n中华人民共和国香港特别行政区政府\t158353\n上量\t158354\n屏南县人民政府\t158355\nlocales\t158356\nForte\t158357\n宋希濂\t158358\n答码\t158359\n森\t158360\n无意间\t158361\nMDP\t158362\n咸阳路\t158363\n天风证券\t158364\nxmal\t158365\n另类\t158366\nNVC\t158367\n超能查派\t158368\n艇\t158369\n级\t158370\nPolitical\t158371\n观者\t158372\n烂滚夫斗烂滚妻\t158373\n冈姆\t158374\n田林街道\t158375\n千手柱间\t158376\n中国移动福建公司\t158377\n刘家琨\t158378\n龚滩\t158379\n夜问\t158380\n传奇霸业\t158381\n减税\t158382\n袁煜明\t158383\ntransit\t158384\n3305\t158385\n鲁南制药\t158386\n长江新区\t158387\nexpresscard\t158388\nC193\t158389\n国家中心城市\t158390\n滚装船\t158391\n山西政府\t158392\nAnything\t158393\n电纳\t158394\nwide\t158395\n金冠电气\t158396\n北京航天航空大学\t158397\nOyster\t158398\n娄底市中心医院\t158399\n语文建设\t158400\nregulate\t158401\n恒天集团\t158402\n来事\t158403\n东软\t158404\n硫
磺岛战役\t158405\n甘露醇注射液\t158406\n9九个\t158407\n水城威尼斯\t158408\n985学院\t158409\n奋进者\t158410\n白岩单田芳\t158411\n50w\t158412\n凶猛\t158413\n政审\t158414\n黄兴路\t158415\nsdut\t158416\nbasal\t158417\n做爱时\t158418\n第1期\t158419\n杜莎\t158420\n7.5KW\t158421\n金亮\t158422\n话集\t158423\nb2c\t158424\n40013\t158425\n三喜临门\t158426\n辅修\t158427\n宋家庄\t158428\n咬咬牙\t158429\n杭州市教育局\t158430\n白帝学院\t158431\n老九\t158432\n硒酵母片\t158433\n陈庄村\t158434\n随意\t158435\n腌肉\t158436\n三步曲\t158437\n第几款\t158438\n果断\t158439\n零分\t158440\n猎魔者\t158441\n槐房\t158442\n红尘\t158443\n橡树林\t158444\n不含铅\t158445\n林萍\t158446\n第二封\t158447\n代本田\t158448\n银办\t158449\n葡萄味\t158450\n教育费附加税\t158451\n史蒂芬·霍金\t158452\n双重股权结构\t158453\n色拉寺\t158454\n揍\t158455\nPoetry\t158456\n0050\t158457\nwww.2015\t158458\n白求恩\t158459\npriority\t158460\n衣服\t158461\nHD4600\t158462\n第A10\t158463\nstringutils\t158464\n橙马\t158465\n韩妆\t158466\n尹鸿\t158467\nArcengine\t158468\n金手指&存档区\t158469\n右撇子\t158470\n精读圣经\t158471\nBlender\t158472\n悦达集团\t158473\n世外桃源\t158474\n比特股\t158475\n鲣鱼\t158476\n慢性鼻炎\t158477\n可供出售金融资产\t158478\n果律\t158479\n高碑店市人民政府\t158480\nstrengths\t158481\n龙美术馆\t158482\n死亡保险\t158483\nsulfur\t158484\n光谷广场\t158485\n4小时以上\t158486\n海美迪\t158487\n详略\t158488\n很快乐\t158489\n主干线\t158490\n红旗小区\t158491\n千里光\t158492\n2003级\t158493\n年期\t158494\n老艾\t158495\n真菌性皮肤病\t158496\ncont\t158497\n俯视图\t158498\n东坝乡\t158499\n3332\t158500\n论对\t158501\n2147217900\t158502\n海区\t158503\n194个\t158504\n齐家团购网\t158505\n帝豪gl\t158506\nCOMIC\t158507\n赵立新\t158508\n渡口\t158509\n查济\t158510\n余杭未来科技城\t158511\n昌平北站\t158512\n次贷\t158513\n居里夫人传\t158514\n怡景苑\t158515\n推拉\t158516\n【临高启明吧\t158517\nhoo\t158518\n红权\t158519\nSim\t158520\n木乃伊1\t158521\nrcv\t158522\nSie\t158523\n何书桓\t158524\natb\t158525\n竖式\t158526\nrtcp\t158527\n40x\t158528\n建文\t158529\n转赠\t158530\n海南医学院第一附属医院\t158531\n叶庆均\t158532\n取代\t158533\n购房补贴\t158534\nShou\t158535\nmama&kids\t158536\n哄人\t158537\n岛根县\t158538\n驴妈妈旅游网\t158539\n拍腿\t158540\nHadSky\t158541\nQS查询网\t158542\n腹中\t158543\nBOD\t158544\n起价\t158545\noutsourcing\t158546\nimagine\t158547\n会计中级考试\t15854
8\n辛巳\t158549\n一生一阙歌\t158550\n百想\t158551\n瑞思\t158552\n工银亚洲\t158553\n梦谷\t158554\ncad卡顿\t158555\n凌燕\t158556\n第九话\t158557\n晚餐\t158558\n火影忍者动漫\t158559\n鞥\t158560\n廖承宇\t158561\nhyperworks\t158562\n装过\t158563\n2016年后\t158564\n新商\t158565\n840型\t158566\n县委巡察办\t158567\nSOURCE\t158568\n传授\t158569\n出帐\t158570\nnasm\t158571\n落址\t158572\n同圆\t158573\nVisa卡\t158574\n槽子\t158575\n神路\t158576\n凤山街道\t158577\n12月24日\t158578\n咨\t158579\njaybird\t158580\n复方血栓通胶囊\t158581\n精切\t158582\n中华中学\t158583\n爆破手\t158584\n全要素生产率\t158585\n1937\t158586\n王竹立\t158587\n至乐\t158588\n神恩\t158589\n杭州高级中学\t158590\n刘民\t158591\nEdward\t158592\n承德县\t158593\nMindManager2016\t158594\n维生素C片\t158595\n拨球\t158596\n印度站\t158597\n宅邸\t158598\n刚需\t158599\nCovered\t158600\n管治\t158601\nturbulent\t158602\n文根\t158603\n女检察官\t158604\n丑角\t158605\n小故事网\t158606\n水世界\t158607\n浪涛\t158608\n松坪山\t158609\n自然方\t158610\n承债式\t158611\nGMV\t158612\nflavour\t158613\n莼\t158614\n瓦面\t158615\n潜孔钻\t158616\n影锋\t158617\n当我想你的时候\t158618\n中风化\t158619\nvoez\t158620\n宝丽金\t158621\n空王冠\t158622\n养鬼\t158623\n制种\t158624\n林文\t158625\nCAFFE\t158626\n以史为镜\t158627\n忌神\t158628\n脓细胞\t158629\n目的地\t158630\ntemporary\t158631\n切边机\t158632\n泰禾\t158633\n广东讲话精神\t158634\n李安\t158635\ncoser\t158636\n封墙\t158637\n9.0_\t158638\n仝\t158639\n履行期\t158640\n心情愉快\t158641\n人人\t158642\n朱茵\t158643\n900万元\t158644\nHwang\t158645\n流体\t158646\n爵士乐\t158647\n00000001\t158648\n楼兰古城\t158649\n634\t158650\n郭帅\t158651\n阳逻港\t158652\n冲动性\t158653\n土地招拍挂制度\t158654\n沈敏\t158655\n超级吞噬系统\t158656\n孙佳君\t158657\nCONCAT\t158658\n枉法\t158659\n斜巷\t158660\n安徽一区\t158661\n凤羽\t158662\ns120\t158663\n報\t158664\n主谓宾\t158665\n实弹演习\t158666\n成都西\t158667\n天谕神语书院\t158668\n淫邪\t158669\n白夜杀破狼\t158670\n机经网\t158671\n蝼蚁\t158672\n表演队\t158673\nmixing\t158674\nCross\t158675\n切让\t158676\n学苑\t158677\n朱鹏\t158678\n三家\t158679\nNS\t158680\n水旱\t158681\n煤山\t158682\n成兵\t158683\n广西地税\t158684\ngorilla\t158685\n梅河\t158686\ngprof\t158687\n浦沿街道\t158688\n72期\t158689\nspectrum\t158690\n8小时内\t158691\n合彩\t158692\n李志坚\t158693\n601377\t158694\nhml\t158695\
n搞机\t158696\n手账\t158697\n藏版\t158698\nsexxxx\t158699\n光洋股份\t158700\n成都地铁8号线\t158701\n火焰限界\t158702\n易拉宝\t158703\n绝世武神\t158704\nFacesitting\t158705\n事项段\t158706\n周喜安\t158707\nvReveal\t158708\n3158\t158709\n人神共愤\t158710\ncable\t158711\n组元\t158712\n灭亡\t158713\n斜楔\t158714\njavacv\t158715\nobb\t158716\n一ノ瀬アメリ\t158717\n大盗黎明\t158718\n特性值\t158719\ngovernments\t158720\n诚语\t158721\n软音源\t158722\n大沙地\t158723\n拉森\t158724\nRail\t158725\n核电荷数\t158726\n苏州市工业园区\t158727\nfastest\t158728\n中信银行\t158729\n西单商场\t158730\n肚里\t158731\n外贸SOHO\t158732\n付磊\t158733\n单肩\t158734\n零位\t158735\n雅森\t158736\nlistctrl\t158737\n环形器\t158738\nm101\t158739\n九泉之岛\t158740\n单季\t158741\n江尚\t158742\n61页\t158743\n倒仓\t158744\nreadyboost\t158745\n单斜杠\t158746\n三贤\t158747\ndreamleague\t158748\n保守\t158749\nhsc\t158750\n周大杰\t158751\n水利水电工程专业\t158752\n烟草\t158753\nMOMENTUM\t158754\n今年九月\t158755\n宁波栎社\t158756\n资产减值准备\t158757\nPerform\t158758\n人表\t158759\n库兹涅佐夫\t158760\n主视眼\t158761\n黄洋\t158762\n锤子m1\t158763\n律界\t158764\n支付类\t158765\n42部\t158766\n毛远新\t158767\n荔湾\t158768\n凯原法学院\t158769\n卢锡安\t158770\n餐灯箱\t158771\n太阳帝国\t158772\n相位裕度\t158773\n何峰\t158774\ncommunication\t158775\n就知\t158776\n六十五岁\t158777\n第62条\t158778\n4.9日\t158779\nVet\t158780\n超级马里奥制造\t158781\nワ\t158782\nBounding\t158783\n克里斯滕森\t158784\ntoyota\t158785\nDS2019\t158786\n铛\t158787\n雷蕾\t158788\n胎记\t158789\n廊下\t158790\nTraveller\t158791\n外圆磨床\t158792\nHaoKoo\t158793\n林文采\t158794\n墨友\t158795\nmetaphor\t158796\n双螺杆泵\t158797\n壹刻\t158798\n国投财富广场\t158799\n电脑音频管理器\t158800\n骨子\t158801\n击球\t158802\n第四十七章\t158803\n考生\t158804\n芥菜疙瘩\t158805\n李炳军\t158806\n谈天说地\t158807\nSF\t158808\n巨人的花园\t158809\nsounds\t158810\n植毛机\t158811\n国际航空运输协会\t158812\n刑官\t158813\n教授级\t158814\n鲜衣怒马\t158815\n礼泉\t158816\n范成\t158817\n刘振\t158818\n雪樱\t158819\n保密员\t158820\n80004002\t158821\nIE\t158822\n简并\t158823\n全服\t158824\n茶舞\t158825\nfse\t158826\n防暑\t158827\n山野菜\t158828\nidentification\t158829\n河库\t158830\n投币式\t158831\n吴起镇\t158832\n一经\t158833\n周旭东\t158834\n金属卡\t158835\n怼\t158836\n打盹\t158837\nPTX\t158838\n偏口\t1
58839\nzhiji\t158840\n圆缺\t158841\n黄晓阳\t158842\n小狐仙\t158843\n硅钼棒\t158844\n上海迪士尼开园\t158845\n长濑真子\t158846\n受苦\t158847\n柴静贝克汉姆\t158848\n葫芦小金刚\t158849\n上汽荣威\t158850\n3499足球网\t158851\nliterature\t158852\n前所未见\t158853\n贵州省科技厅\t158854\n洪湖水浪打浪\t158855\n100G\t158856\n6项\t158857\n挥发油\t158858\n国家司法鉴定人和司法鉴定机构名册\t158859\n三感\t158860\n敢\t158861\n远辰\t158862\n300支\t158863\n拓也哥\t158864\n月子病\t158865\nInitializr\t158866\n谎言的诱惑\t158867\n装甲兵工程学院\t158868\n锦里沟\t158869\nKunoichi\t158870\n犯罪案\t158871\n古画\t158872\n淘宝天猫电商\t158873\n2.2.0\t158874\n白号\t158875\n梨树湾\t158876\n雷火剑\t158877\n坐井观天\t158878\n陈莉莉\t158879\n五粮液股份公司\t158880\n纺纱\t158881\n2018年3月18日\t158882\n偷香窃玉\t158883\n过程装备与控制工程专业\t158884\n席琳迪翁\t158885\n对冲基金\t158886\n泗县先锋网\t158887\n叹气\t158888\nUTF8编码\t158889\n犯罪心理第十三季\t158890\nIT运维工程师\t158891\n中交三航局\t158892\nada币\t158893\nCaligula\t158894\n巨星\t158895\n乱入\t158896\n北京平谷\t158897\n雷氏\t158898\n手剑\t158899\n禾木村\t158900\n金美儿\t158901\n小兔子乖乖\t158902\n破坏之王\t158903\n重门\t158904\n腾讯视频VIP\t158905\n双代会\t158906\n王小峰\t158907\n张立新\t158908\n厦门小学\t158909\ndn25\t158910\n一度\t158911\n食人之饥\t158912\n20只\t158913\n媚肉\t158914\nDR\t158915\n龙骑科技\t158916\n东二环\t158917\n扣篮\t158918\n顾漫\t158919\nreports\t158920\n结亲周\t158921\n白皮\t158922\ntempur\t158923\n攻击\t158924\n贝斯手\t158925\nfzu\t158926\n郭丹\t158927\n38类\t158928\nSystemC\t158929\n伊沙\t158930\n南安市人民政府\t158931\n变盘\t158932\n心身\t158933\n法藤\t158934\n燕昭王\t158935\nDMM\t158936\n工商总局党组中心组\t158937\n油辣椒\t158938\nNULL\t158939\n十余个\t158940\n中国临川政府网\t158941\n比热容\t158942\n走过的路\t158943\n孤苦\t158944\n60寸\t158945\n神马影院_神马影院\t158946\n蝴蝶夫人\t158947\n防火卷帘\t158948\nfailed\t158949\nKobayashi\t158950\n3000例\t158951\nEternity\t158952\nOy\t158953\n周转筐\t158954\ncscope\t158955\nvirtuanes\t158956\n漫画集\t158957\n工科大学\t158958\nDandelion\t158959\n易车视频\t158960\n投资咨询网\t158961\n北京家教网\t158962\n淮南市公安局\t158963\n重峰\t158964\n0月\t158965\n摘抄\t158966\n纸带\t158967\n通频带\t158968\n被告方\t158969\nH5浏览器\t158970\n阿里巴巴商\t158971\nPKC\t158972\n6509\t158973\nArangoDB\t158974\n链币\t158975\n34级\t158976\npigcms\t158977\n芥\t158978\n李显龙\t1589
79\n惧色\t158980\n经济法概论\t158981\nWebsocket\t158982\nYIYM俚语网\t158983\n舵手\t158984\n中药方\t158985\n国际博物馆日\t158986\n给水排水\t158987\n独领\t158988\n28035\t158989\n苫盖\t158990\n国电电力发展股份有限公司\t158991\n2v2\t158992\nAreas\t158993\n独轮车\t158994\n22222222222\t158995\nipp\t158996\n坪地\t158997\nPID控制算法\t158998\n白音\t158999\ncjdby\t159000\n韩熙贞\t159001\n绿地地产\t159002\n首汽租赁有限责任公司\t159003\n广州地铁11号线\t159004\n荷乐网\t159005\n年平均增长率\t159006\nTechweb\t159007\n卡勒特\t159008\n世纪公园\t159009\n爱你在心口难开\t159010\n引弧\t159011\n电功\t159012\n医联\t159013\n苏黎世\t159014\n老郑\t159015\nretirement\t159016\n安罗替尼\t159017\nkantar\t159018\njiekou\t159019\nloren\t159020\n华谊兄弟\t159021\n20150722\t159022\n带值\t159023\n外圆\t159024\n上海地铁11号线\t159025\n刘芹\t159026\n徐某\t159027\nABC\t159028\n镜湖新区\t159029\nregion\t159030\n浮游岛\t159031\n一级方程式赛车\t159032\n薄纱裙\t159033\n万亿元\t159034\njustinmind\t159035\n65页\t159036\n丁泽孙中山\t159037\n金军\t159038\n邓小俊\t159039\n品牌篇\t159040\n入册\t159041\n火焰纹章觉醒\t159042\nFS\t159043\n汉代\t159044\ngig\t159045\n一往\t159046\n乌丹\t159047\n南京航空航天大学机电学院\t159048\ns6\t159049\n建立\t159050\nFTN\t159051\n文帖\t159052\nUG编程\t159053\n金山路\t159054\nmothers\t159055\n3500米\t159056\n引智\t159057\nGals\t159058\n第几【\t159059\n鞋带\t159060\n三穗县\t159061\nPBE\t159062\n胡66\t159063\n招商银\t159064\n月星家居\t159065\n十二局\t159066\n爱车\t159067\n好朋友\t159068\nphotoshopcc2017\t159069\n阻尼系数\t159070\n漫威未来之战\t159071\n污泥干燥机\t159072\n韵表\t159073\n十八载\t159074\n己二醇\t159075\n日出东方\t159076\n杀出重围\t159077\n立根\t159078\n圆角\t159079\n深南东路\t159080\n芡实\t159081\nmp3格式\t159082\n电动牙刷\t159083\nLIMS\t159084\n泡沫型\t159085\nrosi\t159086\n初期\t159087\n坦索\t159088\n报表\t159089\n国家公祭日\t159090\n56cm\t159091\n单元格内\t159092\nStackOverflow\t159093\nMTK\t159094\n教师资格者\t159095\n石笼\t159096\n2.7万\t159097\n日月\t159098\n甲午\t159099\n鬼谷\t159100\n零度\t159101\nMerchandiser\t159102\n世祖\t159103\n环类\t159104\n第28\t159105\n肾病\t159106\n360密盘\t159107\nFunky\t159108\n模玩\t159109\n百发\t159110\n湖北省审计厅\t159111\nRCF\t159112\n八一大桥\t159113\n收入者\t159114\nrrxiu\t159115\ndecorator\t159116\n查孕\t159117\n基础设施和公用事业特许经营管理办法\t159118\n万腾\t159119\nHost\t
159120\n九吉公红糖\t159121\n湮灭\t159122\nChristian\t159123\n脑起搏器\t159124\nv2.1.2\t159125\n密级\t159126\n襄城新闻网\t159127\n中印洞朗\t159128\n帮忙\t159129\n最终痴汉电车3\t159130\n插进\t159131\n清镇市\t159132\nnature\t159133\n我以为\t159134\nMBA教育中心\t159135\n连山区\t159136\n校园卡\t159137\nmetre\t159138\n收腹衣\t159139\n覆手为\t159140\n性福\t159141\n上海明珠医院\t159142\n20170311\t159143\n宝缇嘉\t159144\n字石\t159145\n阴角\t159146\n走肉\t159147\n一,二\t159148\n鑫泰\t159149\n瓷砖胶\t159150\n朗格多克\t159151\n58在线免费学习网\t159152\n畅享网\t159153\n倘若\t159154\n戴厚良\t159155\n应变式\t159156\n通宣理肺丸\t159157\n12g\t159158\n41亿元\t159159\n卢克团\t159160\n防溢\t159161\n卫宫\t159162\n圣斗士星矢战记\t159163\n边路\t159164\n藏干\t159165\ngles\t159166\n急诊男女\t159167\n非比寻常\t159168\njava算法\t159169\n朴翔\t159170\n梨花雨\t159171\n康桥半岛\t159172\n触板\t159173\nredmine\t159174\n人民币跨境支付系统\t159175\n县党委\t159176\nteraterm\t159177\n城铁\t159178\n9.5成\t159179\n汤唯\t159180\n葵花鹦鹉\t159181\n心跳过速\t159182\nAVL树\t159183\n顺德菜\t159184\n新昌大佛寺\t159185\n打浦桥\t159186\n梦遗\t159187\n电子商务网\t159188\n赵薇\t159189\n优劣势\t159190\n小米科技\t159191\n标题体\t159192\n脚踏板\t159193\n米氮平片\t159194\nnginx+uwsgi\t159195\nrally\t159196\n5分页\t159197\n5450\t159198\nliujians\t159199\ntrop\t159200\n■\t159201\n赔付率\t159202\n通廊\t159203\n郭文帅\t159204\n7.1下\t159205\ntftp\t159206\n滨河街道\t159207\n第一首\t159208\n建筑风格\t159209\n380452075@qq.com\t159210\n清华美院\t159211\n宝马530i\t159212\n斥资\t159213\n1219\t159214\n罗茨风机\t159215\nspool\t159216\n请以你的名字呼唤我\t159217\n卧龙镇\t159218\n旅顺口区\t159219\n专用水\t159220\n狮山广场\t159221\n定影器\t159222\n盐水瓶\t159223\n三则\t159224\n明记\t159225\ncad2019\t159226\n000963\t159227\n总收\t159228\n许乐\t159229\n北京昆仑饭店\t159230\n遁术\t159231\nall土吧\t159232\n层压板\t159233\n陈畅\t159234\n御景城\t159235\n30000000\t159236\n亚马逊Echo\t159237\n吾国\t159238\n蹲便器\t159239\n脱媒\t159240\n四维彩超\t159241\n12306网络\t159242\n生子黄道吉日\t159243\n周薪\t159244\n商城路\t159245\n插簧\t159246\n排水沟\t159247\n海贼王ol\t159248\nAPK\t159249\nESRI\t159250\noppoa79\t159251\n斯凯孚\t159252\n全身镜\t159253\n零度软件园\t159254\n拜求\t159255\n安塞龙\t159256\n百分之九\t159257\n楚河汉街\t159258\nAuck\t159259\nNote6\t159260\n栎社国际机场\t159261\n申奥\t159262\n言止\t
159263\nglc260\t159264\n李菁\t159265\n速点\t159266\n下周日\t159267\n_门窗装修|一起网\t159268\n1370\t159269\n塔式\t159270\n88253848\t159271\nlocalization\t159272\n宁德地区\t159273\n挤出机\t159274\ninduction\t159275\n米汤\t159276\n拔插\t159277\n自吹\t159278\n下载机\t159279\n代表大会常务委员会\t159280\n阿离\t159281\nLeague\t159282\nPUBLIC\t159283\nJ3455\t159284\n第十六卷\t159285\n钢人\t159286\n柳如\t159287\n黑灯\t159288\n条件式\t159289\n75米\t159290\n拂晓\t159291\nLint\t159292\n海泰克\t159293\n盐雾测试\t159294\n陈鹤琴\t159295\nMona\t159296\n抑菌剂\t159297\n血槽\t159298\nchocolat\t159299\n叶木\t159300\n东方龙\t159301\n辽宁出入境检验检疫局\t159302\n亚当熊\t159303\n阳坊\t159304\n抠门李治廷即兴创作\t159305\npublication\t159306\nSAME\t159307\n康卡斯特\t159308\n2680v2\t159309\n陈冰\t159310\n61多彩\t159311\n跑男5\t159312\n0831\t159313\n廊曼机场\t159314\n中工招商网\t159315\n新业务\t159316\najxa\t159317\n狼塔\t159318\nrenewal\t159319\n53种\t159320\n名昭\t159321\n老K\t159322\n株洲市人社局\t159323\n下午四点\t159324\n猪八戒工程网\t159325\n丙硫菌唑\t159326\nIScroll\t159327\nmodelmap\t159328\n迪士\t159329\nquick\t159330\n宝宝版\t159331\n4月28日起\t159332\nmizuno\t159333\nErola\t159334\n乐卡克\t159335\n布心\t159336\n7050\t159337\n2017下半年\t159338\n2018-01-04\t159339\n法衣\t159340\n种间\t159341\nHardware\t159342\n中消协\t159343\n大话羽球\t159344\ncad2014\t159345\n北京居委会\t159346\n为甚么\t159347\n龙头\t159348\n鹅卵石制砂机\t159349\n23000元\t159350\n生物群落\t159351\n龙泉窑\t159352\n曼联球迷网\t159353\n树獭\t159354\nCampaign\t159355\n内部装修\t159356\nHorrible\t159357\n黑白键\t159358\nriben\t159359\nFatal\t159360\n女孩性\t159361\n张亮\t159362\n褐皮书\t159363\nShakira\t159364\n二三类\t159365\nHBM\t159366\n湖南省教育考试院\t159367\n消业障\t159368\n悦动论坛_汽车之家论坛\t159369\n黄尾鲴\t159370\n鬼门关\t159371\n代拉\t159372\n清雪\t159373\n龙舌兰酒\t159374\n61%\t159375\ndianji\t159376\n参选\t159377\n海洋类\t159378\n5亿美元\t159379\n朱\t159380\n铺面\t159381\n亿恩网\t159382\n腾讯数据中心\t159383\n农信银\t159384\n连接点\t159385\n界上\t159386\n8月16日\t159387\n图片编辑器\t159388\n风云雄霸\t159389\n煤制烯烃\t159390\n花咒\t159391\n圣经注释\t159392\n硝酸钠\t159393\n信息管理学院\t159394\n昔年\t159395\nFunctional\t159396\ngoogle地球\t159397\nüber\t159398\n皮山\t159399\n铜彩\t159400\n社会财富\t159401\n产儿\t159402\n358号\t15
9403\n眉鸟\t159404\n干白葡萄酒\t159405\n墨尔本胜利\t159406\n李商隐\t159407\nMatConvNet\t159408\n营会\t159409\n滇藏\t159410\n称重仪\t159411\nMODIS\t159412\nProfession\t159413\nvue2.x\t159414\n200个\t159415\n20141103\t159416\n康复新液\t159417\n北京市建委\t159418\n枫林镇\t159419\nGauss\t159420\n大龙虾\t159421\n上海西郊宾馆\t159422\nsudio\t159423\n永州市\t159424\nHacking\t159425\n仙灵\t159426\n财经周刊\t159427\nKINA.cc\t159428\n红片\t159429\n南京水利科学研究院\t159430\n阿坤\t159431\n蒋龙\t159432\n槽数\t159433\n5.6.5\t159434\n天琴湾\t159435\n针叶\t159436\n丽园路\t159437\n一世华裳\t159438\n生产成本\t159439\n人格化\t159440\n取决\t159441\nhttp状态码\t159442\n唐晓天王祉诺\t159443\n底比斯\t159444\n华业东方玫瑰\t159445\n安赛蜜\t159446\nenrollment\t159447\nHoo\t159448\n文秀\t159449\n综合楼\t159450\nMigration\t159451\n脆骨\t159452\n04.09\t159453\n易思\t159454\n祝贺\t159455\n空气污染指数\t159456\n泰剧\t159457\n欲穷\t159458\n从何而来\t159459\nFelt\t159460\n铁角\t159461\n1680\t159462\n夏菊\t159463\n6.0_\t159464\n好想大声说爱你\t159465\ncocoaPods\t159466\nPUB\t159467\n集趣网\t159468\n茎蜂\t159469\n小卡车\t159470\n瑞郎\t159471\n金圈\t159472\n201602\t159473\n浙江省人民政府办公厅\t159474\n兴替\t159475\n王为\t159476\n55层\t159477\n西湖区\t159478\nrealise\t159479\n3极\t159480\nserever\t159481\n浮想联翩\t159482\ndealer\t159483\n后屏\t159484\n囚凰\t159485\n就业部\t159486\nょ\t159487\n拉平\t159488\n51yza.com\t159489\n55206046\t159490\n江苏省淮阴中学\t159491\n芭蕾鞋\t159492\n恋恋有词\t159493\nbeagle\t159494\n人民币\t159495\n西游大战僵尸2\t159496\n自加\t159497\n2安卓\t159498\nsln\t159499\n满日\t159500\nemitter\t159501\n竞技场\t159502\n神奇校车\t159503\n尼康d610\t159504\n溪谷\t159505\n国产表\t159506\n三驴之家\t159507\n建筑地基基础设计规范\t159508\n李双江\t159509\n养老虎\t159510\n股侠\t159511\n温习\t159512\npreloader\t159513\n离合\t159514\n我的爸爸是森林之王\t159515\nPyDev\t159516\n南瓜粥\t159517\n万华\t159518\n白石桥\t159519\n犹在\t159520\n保障部\t159521\n瓷板画\t159522\n科华控股\t159523\n山地车\t159524\n六月六\t159525\n新还珠\t159526\n一毛\t159527\n龙族3\t159528\n海淀区妇幼保健院\t159529\n长风大悦城\t159530\n中国式过马路\t159531\n李艳秋\t159532\n十行\t159533\n二第二章\t159534\nDocK条\t159535\n更新启动\t159536\nSolagirL\t159537\n天水师范学院\t159538\nar眼镜\t159539\nSketchup\t159540\n千方科技\t159541\nPDU\t159542\n数据量\t159543\n化工业\t159
544\npre-A轮融资\t159545\n锐智\t159546\n真颜\t159547\n小小忍者\t159548\n儒虎网\t159549\n潘功胜\t159550\n3分之一\t159551\n极米H1S\t159552\n墙架\t159553\n畅悦\t159554\n三俗\t159555\n烊\t159556\n摇粒绒\t159557\n希瓦\t159558\n炎龙骑士团2\t159559\n六千多\t159560\n宗客网\t159561\n5-6月\t159562\n河北民族师范学院\t159563\n安歇\t159564\n木棉树\t159565\n硝呋太尔制霉素阴道软胶囊\t159566\n票案\t159567\n毒手\t159568\n王者荣耀英雄\t159569\n囊萤映雪\t159570\n废物\t159571\nBIOS-迅维网\t159572\n进品\t159573\n石家庄信息工程职业学院\t159574\n江南大道\t159575\nPIRT\t159576\n委员\t159577\ndatagridview\t159578\n苏百钧\t159579\n事故率\t159580\nOPTIONS\t159581\n大华农\t159582\n斑马\t159583\n歌仙\t159584\n生辉\t159585\nvas\t159586\n有色\t159587\n证道\t159588\n鬼先生\t159589\nProlog\t159590\n职业道德与法律\t159591\n経済\t159592\n商业银行委托贷款管理办法\t159593\n磁湖\t159594\n勇于探索\t159595\n王丽兵\t159596\n打靶归来\t159597\n王力剑灵\t159598\n金投网\t159599\n泼辣修图\t159600\n张成泽\t159601\nK-ON\t159602\n武汉乐居网\t159603\n继电保护测试仪\t159604\ndirectX\t159605\nJPN\t159606\n30多天\t159607\n皇家马德里\t159608\n朝令夕改\t159609\ntakara\t159610\nn7000\t159611\neCognition\t159612\n支教\t159613\nDbvisualizer\t159614\n日记账\t159615\n不是我的错\t159616\nehviewer\t159617\n煤气阀\t159618\n看清\t159619\n混合版\t159620\n酪氨酸激酶\t159621\n18本\t159622\n一版\t159623\n证书\t159624\nv1.0.5\t159625\n党恩\t159626\n进尺\t159627\n性染色体\t159628\n咱们结婚吧\t159629\n屯溪路\t159630\n估价\t159631\n乐高探索中心\t159632\n广州市越秀区信息网\t159633\n天马\t159634\n面补\t159635\n嘎达梅林\t159636\n巴萨罗那\t159637\n第十三届\t159638\n婚外性\t159639\n上海祥树实业发展有限公司\t159640\nWIFI路由器\t159641\n男人\t159642\n腕轮\t159643\n前三甲\t159644\n石涛\t159645\n中公教育天津分校\t159646\n瓷片\t159647\n清宫术\t159648\n天朝上品\t159649\n沈阳药科大学\t159650\n10公分\t159651\n霜语\t159652\n新星出版社\t159653\n六期\t159654\n小野寺梨紗\t159655\nwxss\t159656\ncr2\t159657\n现金等价物\t159658\n口审\t159659\n仰头\t159660\nZZZ\t159661\n福寿山\t159662\nHorny\t159663\nRBD\t159664\n兰考\t159665\n伏击\t159666\n怪胎\t159667\n大卫\t159668\n塌塌\t159669\na79\t159670\nselec\t159671\nModelAndView\t159672\n魅族浏览器\t159673\n177号\t159674\n微秒级\t159675\n网络性\t159676\n诗经\t159677\n好声音\t159678\n以下\t159679\nhi3536\t159680\n乳腺管\t159681\n起手\t159682\n压铸铝\t159683\n灾备\t159684\n曾用名\t159685\n三十里铺\t159686\n阿凡达\
t159687\nSoybean\t159688\n云翻样\t159689\n并拢\t159690\nRACER\t159691\n百诺恩\t159692\n82平\t159693\nlem\t159694\n45块\t159695\n于山\t159696\nstones\t159697\n竞争型\t159698\n廣告\t159699\n新屋\t159700\n浙江分公司\t159701\n下河东\t159702\ngrading\t159703\nEssay\t159704\nHG8546M\t159705\n天沐温泉\t159706\nAgilent\t159707\n泪道\t159708\n断掌\t159709\n膜蛋白\t159710\n鱼菜\t159711\n臣民\t159712\n广岛\t159713\n1-10季\t159714\n背照式\t159715\nNASA\t159716\n古交市\t159717\n第三产业\t159718\n中国招生代理网\t159719\n半括号\t159720\n桑德\t159721\n温泉花园\t159722\n淮上\t159723\n币别\t159724\n山东经贸职业学院\t159725\n上海公安局\t159726\nrugged\t159727\n商洛市人民政府\t159728\n300立方\t159729\n王gx\t159730\n互殴\t159731\n中国人民财产保险股份\t159732\n气门弹簧\t159733\n底面\t159734\nbeetl\t159735\nhiwifi\t159736\n海下\t159737\n可信度\t159738\n融侨悦府\t159739\n4GB+64GB\t159740\n9600元\t159741\n重庆分所\t159742\n和合本\t159743\n白马涧\t159744\nkeepa\t159745\n秒解\t159746\nwings\t159747\n玛旁雍错\t159748\nDOLCE\t159749\n57天\t159750\n彩票\t159751\n嗷\t159752\n0091\t159753\n偷晴\t159754\n轻燕\t159755\n泗阳县\t159756\n海河杯\t159757\n次大陆\t159758\n台港\t159759\n红纱\t159760\nsk\t159761\n优酷土豆乐视腾讯\t159762\n531968912\t159763\n飞利信\t159764\n红雪莲\t159765\nlogon\t159766\n铲斗\t159767\n信息窗\t159768\n继室\t159769\n5毫米\t159770\n冷静\t159771\n盐酸多西环素肠溶胶囊\t159772\n秦兵马俑\t159773\n锈水\t159774\n身体\t159775\n假设检验\t159776\n中国船舶重工集团公司第七一一研究所\t159777\n求之不得\t159778\nrt-ac68u\t159779\nAria\t159780\n彭艳\t159781\n加计扣除\t159782\n中楼\t159783\n净股\t159784\n5科\t159785\n斩获\t159786\nVU\t159787\n扎旗\t159788\n质安\t159789\n正颌\t159790\nCRO\t159791\n盛世淫风录\t159792\n天津人民出版社\t159793\n新高教集团\t159794\n看错\t159795\n科技化\t159796\n金阶\t159797\n勇者斗恶龙10\t159798\n夜月\t159799\npab\t159800\n极客战记\t159801\n全光\t159802\n杰克丹尼\t159803\n赛车物语2\t159804\ncombine\t159805\nyam\t159806\n出线\t159807\n安越\t159808\n烟庄\t159809\n苦脸\t159810\n小沈阳\t159811\n900_\t159812\n颜德馨\t159813\nmpls\t159814\n朝安\t159815\n鼎信通讯\t159816\n浙江卫视跨年演唱会\t159817\n王斐\t159818\n错位相减法\t159819\nCLR\t159820\nVICE\t159821\n嘻哈帽\t159822\n情欲片\t159823\n九曲桥\t159824\n铣床\t159825\n盐碱地\t159826\n0年\t159827\n烟斗\t159828\n拔丝机\t159829\n夺下\t159830\n黑白世界\t159831\n上海虹桥国家会展中心\t1
59832\n服更新\t159833\n白庙村\t159834\n70万吨\t159835\n青奥会\t159836\nFlyme\t159837\nXtraReport\t159838\n达美\t159839\norganizations\t159840\n一端\t159841\n汉化网\t159842\n国脉科技\t159843\n天干\t159844\n护理品\t159845\n辽大\t159846\n浈江\t159847\nstatistical\t159848\n社保中心\t159849\n悠游网\t159850\n烟烟\t159851\n赌城\t159852\n中国少年儿童出版社\t159853\nIML\t159854\n苦读\t159855\n体委\t159856\n有问题\t159857\nwhitening\t159858\nwinio\t159859\n路点\t159860\n证券从业资格证考试\t159861\n伦斯勒理工学院\t159862\nmlgb\t159863\n南通地铁1号线\t159864\n大犬座\t159865\n觉醒技\t159866\nSqlSession\t159867\n快穿\t159868\n10018\t159869\nrda\t159870\n分类与整理\t159871\nonlyoffice\t159872\n不以物喜\t159873\n上谷\t159874\n3450\t159875\n完美的人\t159876\n沛纳海社区\t159877\nOre\t159878\n存款机\t159879\nbonus\t159880\n钉锤\t159881\n心圆\t159882\n真白爱梨\t159883\n锤子\t159884\n丧尽天良\t159885\n张娜\t159886\n越姬\t159887\n琪\t159888\n强机\t159889\n全会\t159890\n第个\t159891\n中安\t159892\n云安县\t159893\n澳门国际机场\t159894\nweeknd\t159895\n井蓝\t159896\ngnd\t159897\n斗茶\t159898\n倒置式\t159899\n25万\t159900\n打一场\t159901\n以工代赈\t159902\nAC模\t159903\n奥斯陆\t159904\n消声静压箱\t159905\n7100\t159906\nCef\t159907\nhitool\t159908\nud\t159909\n出山网\t159910\n南京街\t159911\ng1.sniper\t159912\n大明路\t159913\n南京恒昌汇财\t159914\n承继\t159915\n谈及\t159916\n平衡机\t159917\n唐人街\t159918\n冰龙\t159919\n走来\t159920\n61baobao.com\t159921\ndisturb\t159922\n吉电股份\t159923\n达利园\t159924\n第一胎\t159925\n获批\t159926\n邓波儿\t159927\n保尔柯察金\t159928\nNCC\t159929\n户上\t159930\n南京牛首山\t159931\n高居翰\t159932\nε\t159933\n没有了\t159934\n姜建清\t159935\nPlu\t159936\nVT\t159937\n住持\t159938\nPS蒙\t159939\n77880\t159940\n县发改委\t159941\n烟饼\t159942\n金融服\t159943\n拼多多_百度\t159944\n国培送教下乡\t159945\n属人\t159946\nKAWS\t159947\n复合弓\t159948\n默纳克\t159949\n绫音\t159950\n迪拜哈利法塔\t159951\n虚空之遗\t159952\n外包\t159953\n中江信托\t159954\n陈帅佛\t159955\n帝国时代罗马复兴\t159956\n师叔\t159957\nxls\t159958\nkdl\t159959\n天正建筑软件\t159960\n试戴\t159961\n溶于水\t159962\nmattress\t159963\n逆闪\t159964\n橘子味\t159965\ncbl\t159966\n卓帆\t159967\n弢\t159968\n2017号\t159969\n高尔夫R论坛\t159970\n27元\t159971\n梅州市人民医院\t159972\nwird\t159973\nDamai\t159974\n唐砖\t159975\nArli\t159976\n有
独钟\t159977\n博海\t159978\n有点击\t159979\n永利国际\t159980\n龙王庙\t159981\n神兽金刚2天神地兽\t159982\nhuiyi\t159983\nchorus\t159984\n恰巧\t159985\nGoldengate\t159986\n绪川里绪\t159987\n砍腿\t159988\nTru\t159989\n20140504\t159990\n4摄氏度\t159991\n12次\t159992\n发邮件\t159993\n罚单\t159994\nPresidential\t159995\n安化新闻网\t159996\n幌子\t159997\n头孢克洛分散片\t159998\nvca\t159999\n装逼\t160000\n相同\t160001\n支付\t160002\n例行检查\t160003\nShaman\t160004\n省法院\t160005\n忍野忍\t160006\n9038\t160007\n驿城区\t160008\n黄澜\t160009\n兰州文理学院\t160010\n韩芸汐\t160011\n苏州拙政园\t160012\n熔岩虫\t160013\n胶类\t160014\n纯水机\t160015\n轻重缓急\t160016\n商学院\t160017\nebx\t160018\n党风廉政\t160019\n维多利\t160020\n2017年9月1日\t160021\n柴桥街道\t160022\n机场航站楼\t160023\n吴正宪\t160024\n防腐管\t160025\n测向\t160026\n沃尔玛购物广场\t160027\nS\t160028\n229个\t160029\n上海装潢网\t160030\n客堂\t160031\n莫绮雯\t160032\n空比\t160033\nLOOKUP\t160034\n320im\t160035\n铅印\t160036\n资产池\t160037\n洪湖\t160038\n江山股份\t160039\nAPPLIED\t160040\n纸筒\t160041\nGeeksforGeeks\t160042\ngrill\t160043\n蜂刺\t160044\n双节棍\t160045\n晚情\t160046\n住所地\t160047\n青霉素钠\t160048\n主函数\t160049\n家饰\t160050\nZenith\t160051\nnev\t160052\n河西南\t160053\nAubrey\t160054\n理理\t160055\n229号\t160056\n32C\t160057\n世界线\t160058\n道成\t160059\nFrankiee\t160060\n墨缘\t160061\n察看\t160062\n多春鱼\t160063\n陈国坤\t160064\n优优江湖\t160065\n玩心跳\t160066\n过滤袋\t160067\n郭杰\t160068\n阳光姐妹淘\t160069\n消保委\t160070\n润喉糖\t160071\nCmsTop\t160072\nnio\t160073\n科维\t160074\n量化投资\t160075\n森井除湿机\t160076\n超感\t160077\n码隆科技\t160078\n献瑞\t160079\n吴炜\t160080\n强佑\t160081\n3326\t160082\n盆腔炎\t160083\n白南\t160084\n张召\t160085\n三亚免税店\t160086\n009号\t160087\n小说集\t160088\n顽劣\t160089\n羽生结弦\t160090\nPimp\t160091\n神仙道_一游网\t160092\n国家人防办\t160093\n催告\t160094\nplato\t160095\n乐凯胶片\t160096\n照明箱\t160097\nVibe\t160098\nF检验\t160099\n作者们\t160100\n镜鉴\t160101\n情色\t160102\n极限特工3:终极回归\t160103\n12v蓄电池\t160104\n世涛\t160105\n上海市区\t160106\n1.6L\t160107\n打電話\t160108\n三星galaxy\t160109\n蜂王浆\t160110\n套法\t160111\n8式\t160112\nCandidates\t160113\n江苏省卫计委\t160114\n孕酮片\t160115\n竞优\t160116\n火山泥\t160117\n2019年10月\t160118\n霖雨\t160119\n良信电器\t160120\n中西方\t160
121\n十二支\t160122\n海德格尔\t160123\n打印头\t160124\n辛亥\t160125\nRhythm\t160126\ntres\t160127\n86727703\t160128\n冯宇\t160129\nMaybelline\t160130\n学前教育专业\t160131\n日照房产超市网\t160132\n预料\t160133\n港荣蒸蛋糕\t160134\n惠金所\t160135\n2017到2018年\t160136\n0423\t160137\n操作秀\t160138\n广东联通\t160139\n推客\t160140\nstrv\t160141\n55分\t160142\n开机号\t160143\n小麦\t160144\n第53期\t160145\n高米\t160146\n极品飞车12\t160147\n半空中\t160148\n欧西里斯\t160149\n娄老师\t160150\n确有\t160151\n马提尼\t160152\n焦渣\t160153\n二值图\t160154\n克拉肯\t160155\n李鸿\t160156\nduns\t160157\n良法\t160158\n凤小岳\t160159\n民俗学\t160160\nfiddle\t160161\nUSED\t160162\ngetset\t160163\n厦门邮轮中心\t160164\n20150707\t160165\ntimedelta\t160166\nmastercamx9\t160167\n走姿\t160168\njony\t160169\nDG\t160170\n前15分钟\t160171\n得益\t160172\n果语\t160173\n源脉\t160174\n策划方\t160175\n大江东去\t160176\n道客巴巴文档下载器\t160177\n你死我活\t160178\nfilco\t160179\nillustration\t160180\nWinmail\t160181\n南宁法院网\t160182\n太阳沟\t160183\n德盛\t160184\n映维网\t160185\n吴莎\t160186\ndialer\t160187\n银行系\t160188\n癞蛤蟆\t160189\n成人高考教育网\t160190\nreopen\t160191\nmovescount\t160192\n女人篇\t160193\n每\t160194\n潼南区\t160195\n老铁们\t160196\n北溟\t160197\n中研\t160198\n司马平邦\t160199\n高空\t160200\n16届\t160201\n6Plus\t160202\n前嫌\t160203\n反应式\t160204\n血帽\t160205\n友行\t160206\nLouisiana\t160207\n小小军团合战三国\t160208\n张蓉\t160209\n回复单\t160210\n南方动漫网\t160211\n塞尔\t160212\n离散变量\t160213\n中庸\t160214\n泳场\t160215\n引期待\t160216\n中共六大\t160217\n自然资源资产离任审计\t160218\n春亚纺\t160219\n高冰\t160220\n全身式\t160221\n板桥镇\t160222\n瘆\t160223\ncolly\t160224\n乐凯撒\t160225\n泸州论坛\t160226\n交加\t160227\n不锈钢片\t160228\n奄\t160229\ncarx\t160230\n李海霞\t160231\n6123\t160232\n余炜\t160233\nCoworking\t160234\n兼顾\t160235\nambassador\t160236\n武汉大学电气工程学院\t160237\n广汕路\t160238\nJKS\t160239\n加盟连锁\t160240\n云虚拟机\t160241\n诚达\t160242\n尤尼克斯YONEX\t160243\n帝湖\t160244\n第2年\t160245\n黑老大\t160246\n江西环境工程职业学院\t160247\n幻音\t160248\n苹果越狱\t160249\n上海迪士尼\t160250\n国家环保局\t160251\n世界知识出版社\t160252\n倩丽\t160253\n罗德岛设计学院\t160254\n美菜网\t160255\n益农信息社\t160256\n生死劫\t160257\n灵龟\t160258\nAdSense\t160259\n两园\t160260\n周瑜\t160261\nLucent\t160262\n最末
\t160263\n三张\t160264\n即时比分_比分网\t160265\n吴人\t160266\ncocopod\t160267\n云医院\t160268\nChildlife\t160269\n百分号\t160270\n方特\t160271\n舒泰\t160272\n四惠\t160273\n蒙台梭利教育\t160274\nocular\t160275\n花心男\t160276\n如山\t160277\n电解钴\t160278\n巧板\t160279\n战速\t160280\n交易网\t160281\n奕阁\t160282\nnsstring\t160283\n贵妇\t160284\n新隆\t160285\n冰机\t160286\nvueJS\t160287\n不决\t160288\n小鸡炖蘑菇\t160289\nrpl\t160290\n补平衡\t160291\n210万元\t160292\nCharging\t160293\nCherie\t160294\n三氧化二铬\t160295\n遥控机器人\t160296\n克尔瑞\t160297\n梦醒潇湘love\t160298\n环渤海新闻网\t160299\n荔枝公园\t160300\n弓长岭区\t160301\n动态表情包\t160302\nWebForm\t160303\n见死不救\t160304\n住建委\t160305\n液晶板\t160306\nbiotin\t160307\n国际空间站\t160308\n一帆风\t160309\n飞鲨\t160310\n鸟鸣声\t160311\n什刹海\t160312\n南溪村\t160313\nCraig\t160314\ndread\t160315\n拼图游戏\t160316\n邹区\t160317\n龙塘\t160318\n传宗接代\t160319\nVRTK\t160320\n张起灵\t160321\n锁相放大器\t160322\n观卦\t160323\n24章\t160324\n限重\t160325\n沿江\t160326\n云计算工程师\t160327\n知商\t160328\n返图\t160329\n青紫\t160330\n吐字\t160331\n蕴藉\t160332\n炉石盒子\t160333\n华为荣耀6X\t160334\n基体\t160335\n烟枪\t160336\nELITE\t160337\n炒法\t160338\n安徽省农业委员会\t160339\n安吉余村\t160340\n台形\t160341\n菌血症\t160342\nMywife\t160343\n封神传奇\t160344\n连云港港\t160345\nGems\t160346\n老y\t160347\n绝境求生\t160348\nTrips\t160349\n安全监管总局\t160350\n执政\t160351\n很想\t160352\n任务战\t160353\n书纪\t160354\n南京博物院\t160355\n合肥长淮中医医院\t160356\n邢台学院\t160357\nJoda-Time\t160358\n扑翼\t160359\nenjoyable\t160360\n20150430\t160361\n包\t160362\np70\t160363\n情调\t160364\n20131108\t160365\n鄯善\t160366\nprep\t160367\ncosB\t160368\n妙韵\t160369\n花谱\t160370\n咨询处\t160371\n红茶馆\t160372\nmovements\t160373\n罗村镇\t160374\n北汽幻速\t160375\nIRQ\t160376\n巴卡\t160377\n浮标式\t160378\nguibuyu\t160379\n天彩\t160380\namixer\t160381\ndatatype\t160382\n高达无双\t160383\n地西泮片\t160384\n燕秀\t160385\n风声\t160386\n被解职\t160387\nyuv\t160388\n焚炉\t160389\n余东\t160390\n龙韵\t160391\n严氏\t160392\n运维派\t160393\n新密\t160394\n大王山\t160395\n异果\t160396\n云县\t160397\n西宁站\t160398\n师大百问\t160399\n里脊肉\t160400\n贝壳类\t160401\n清华源\t160402\n商运\t160403\n18条\t160404\n李字\t160405\n私护\t160406\n霍营\t160407\n七八个月\t160408\n为哪
些\t160409\n吉他效果器\t160410\n5508\t160411\n郑州装修公司\t160412\n奇侠\t160413\nHelloKitty乐园\t160414\n卧龙岗大学\t160415\n长阳半岛\t160416\n1.0.1安卓\t160417\n范扬\t160418\n浩宇\t160419\n爱舍\t160420\n50瓦\t160421\n这么多好\t160422\nCritic\t160423\n精灵宝可梦剧场版\t160424\n凯伊\t160425\nMuhammad\t160426\nmsie\t160427\nChaoyang\t160428\nheyuan\t160429\nsomachine\t160430\n新飞飞\t160431\n伙伴关系\t160432\n百万千瓦\t160433\n广播剧\t160434\n预览器\t160435\n压倒\t160436\nFARCRY5\t160437\nPSV2000\t160438\n1923年\t160439\n307医院\t160440\n刺刺\t160441\n奌\t160442\nandre\t160443\nshenyan\t160444\n许广平\t160445\n三州\t160446\n基价表\t160447\n黄芪当归\t160448\n遗物\t160449\n胜仗\t160450\n滴水湖\t160451\n人生七年\t160452\n艳妇\t160453\n消签\t160454\n蛇蜕\t160455\n天图\t160456\n南山大道\t160457\n儿童观\t160458\n女王\t160459\n都市妖奇谈\t160460\n卫斯理之蓝血人\t160461\ndocumentElement\t160462\n2000万美元\t160463\n五险+\t160464\n美少年之恋\t160465\n6幢\t160466\n上香\t160467\n李洪亮\t160468\n规整\t160469\n从今\t160470\n每10秒\t160471\n组合盘\t160472\n鸡头米\t160473\n游龙惊凤\t160474\n党政领导干部安全生产责任制\t160475\n3千克\t160476\n卢修斯\t160477\nLED电子屏\t160478\nfnis\t160479\n郭锋\t160480\n带电\t160481\n处理机\t160482\n精馏塔\t160483\n芙蓉锦\t160484\n上海金山石化\t160485\n付高峰\t160486\ncorrected\t160487\n枚举型\t160488\n和氏璧\t160489\n便宜坊烤鸭店\t160490\n读本\t160491\n海典软件\t160492\nDingo\t160493\n总选\t160494\n领巾\t160495\n半地下室\t160496\n温网\t160497\n10|\t160498\n信息技术学院\t160499\n通达性\t160500\n优能\t160501\n百大\t160502\n中国梦之队\t160503\n钢筋网\t160504\n55平\t160505\n羽毛球群\t160506\nfolders\t160507\n咖\t160508\nSingleton\t160509\nJCT\t160510\n烟台地区\t160511\n金钟路\t160512\n反省\t160513\n截字\t160514\n胡侃\t160515\n民族风\t160516\n地\t160517\nToys\t160518\n红木\t160519\n豁出去\t160520\n幸运的人\t160521\nZOO\t160522\n上峰水泥\t160523\n李爱华\t160524\n源氏物语\t160525\n织梦\t160526\n小阿\t160527\n南昌市高新区\t160528\n3.10\t160529\n蓝山\t160530\n判题\t160531\n学用\t160532\nholland\t160533\nresultMap\t160534\n急需要\t160535\n七嘴八舌\t160536\n陈兴良\t160537\n微米级\t160538\n架构师之路\t160539\nAVP\t160540\n长安之星3\t160541\n众成\t160542\nmoose\t160543\n时常\t160544\nsonhoo\t160545\n工信\t160546\n8.1.7\t160547\nps3模拟器\t160548\n导入仪\t160549\n16台\t160550\n郑国强\t160551\n餐饮公司\t16
0552\n凯音\t160553\n包天\t160554\nportion\t160555\n简单明了\t160556\n十七岁\t160557\nBurst\t160558\nsharemouse\t160559\n南昌县人民政府\t160560\n四轴\t160561\n确确实实\t160562\n立体式\t160563\n林宇辉\t160564\n尼康d80\t160565\n钢影\t160566\n都选C\t160567\n文雅\t160568\n1.12版\t160569\ndragging\t160570\n历练\t160571\n上牌\t160572\n扣款\t160573\n广濑铃\t160574\n16.04.4\t160575\ncol\t160576\n开放期\t160577\n秦涛\t160578\n灌装机\t160579\n卸妆水\t160580\n小库\t160581\n罗汉拳\t160582\n线损\t160583\n界诠法师\t160584\n鸭梨\t160585\n李青\t160586\n绛珠\t160587\n泮托拉唑钠肠溶片\t160588\ncecs\t160589\n瞄股网\t160590\n黄海燕\t160591\nWEUI\t160592\n郭艳\t160593\n第2艘\t160594\n笑气\t160595\n高德拉特\t160596\n麦城\t160597\n流涎\t160598\nxshell5\t160599\n176号\t160600\n區\t160601\nHeavy\t160602\n剧\t160603\n氟硅\t160604\n新疆阜康市政府\t160605\n王扬\t160606\n找准\t160607\n记录器\t160608\n再利用\t160609\nposer\t160610\n0xc000000f\t160611\n29.7\t160612\nbathroom\t160613\n颛顼\t160614\n刷\t160615\n之者\t160616\n步法\t160617\n无齿\t160618\nmilkty\t160619\n贱圣\t160620\n国网湖南省电力公司\t160621\n夜观\t160622\n仓库\t160623\n哈尔滨工程大学\t160624\n李熬\t160625\ntoxin\t160626\n雅仕\t160627\nitween\t160628\ncomctl32.dll\t160629\n张乐平\t160630\nxixi\t160631\n天书奇谈\t160632\n张欣\t160633\nV2.3.1\t160634\n更年\t160635\ncobar\t160636\n清出\t160637\n生长调节剂\t160638\n5178\t160639\n径赛\t160640\n雪莱特\t160641\n江苏省教育考试院\t160642\n雨势\t160643\n交线\t160644\n酥糖\t160645\n胃粘膜\t160646\ncomput\t160647\n十二集\t160648\nAdv\t160649\nYYcaF\t160650\n蜜\t160651\n斯普瑞斯奥特莱斯\t160652\n苍木玛娜\t160653\n本田CR\t160654\n旋切机\t160655\n了不起的盖茨比\t160656\n辽视\t160657\nsquashfs\t160658\n巧计\t160659\n家庭园艺中心\t160660\n大川\t160661\n制动片\t160662\n永无\t160663\n龟纹\t160664\n黑龙江八一农垦大学\t160665\n观世音菩萨\t160666\n马各庄\t160667\n中联航\t160668\n切实可行\t160669\n步云\t160670\n第6集\t160671\n第1天\t160672\n工作区\t160673\n周笔畅\t160674\n歼敌\t160675\n47所\t160676\n20151003\t160677\n凤凰天使TSKS\t160678\n騰訊\t160679\nsav\t160680\n中国科学院宁波材料技术与工程研究所\t160681\n华讯\t160682\n28回\t160683\n漂泊\t160684\n插针机\t160685\n硼酸钠\t160686\n非上市股份公司\t160687\n函数形\t160688\n拍讯\t160689\n码源\t160690\n中海紫御公馆\t160691\n写真图\t160692\nsniff\t160693\nStop\t160694\n17r2\t160695\nV12\t160696\n三台
\t160697\n多道\t160698\n垂体微腺瘤\t160699\n中国收藏家协会\t160700\n救灾\t160701\n勃兴\t160702\n四川省委组织部\t160703\n撕心裂爱\t160704\nVxworks\t160705\n傲然\t160706\n100.3\t160707\n纲手\t160708\n利普\t160709\n北京出入境管理局\t160710\n白片\t160711\n特钢\t160712\nMOTORS\t160713\n第12版\t160714\n华能国际电力股份有限公司\t160715\n方舟方块世界\t160716\n阿撒托斯\t160717\n金成\t160718\nMatrices\t160719\nWeed\t160720\n创新链\t160721\n皮下脂肪瘤\t160722\n线批\t160723\n光顺\t160724\n三星曲屏\t160725\n东风标致\t160726\n秘杀\t160727\n科技局\t160728\n王征\t160729\n香椿炒鸡蛋\t160730\n杜华\t160731\nseus\t160732\n湖库\t160733\n凯凯\t160734\ncampaign\t160735\n香菇炖鸡\t160736\n5610\t160737\n玉环政府网\t160738\n唐良智\t160739\n羧甲淀粉钠\t160740\n履约担保\t160741\n拉通\t160742\n2010-2015年\t160743\n张黎明\t160744\n刘潇\t160745\n四川省网上办税服务厅\t160746\nbeep\t160747\nTV\t160748\n香港崇光百货\t160749\n高桥大市场\t160750\n卫兰\t160751\n中华田园\t160752\n龙湖地产\t160753\n110个\t160754\n同方全球人寿\t160755\n甜酒曲\t160756\n8层\t160757\n童妓\t160758\n结汇\t160759\n3124\t160760\n布光\t160761\n阶码\t160762\n靖江市政府\t160763\n無執\t160764\n斗战神灵猴\t160765\n远洋集团\t160766\n崔洪刚\t160767\ngung\t160768\n虎狮\t160769\n王者荣耀电脑版\t160770\n闾\t160771\n陈述书\t160772\n50千瓦\t160773\n拆墙\t160774\n琴鸟\t160775\n刘怡\t160776\n武场\t160777\ngiantess\t160778\n康爱\t160779\n家子弟\t160780\n印号\t160781\n谏逐\t160782\n私阴人\t160783\n弹流\t160784\n乡村学校少年宫\t160785\n版权印\t160786\n邱吉尔\t160787\n嘉兴中医院\t160788\n玛丽简\t160789\n最爱的人\t160790\nyt\t160791\nidols\t160792\n土壤氡\t160793\nGeoServer\t160794\n发菜\t160795\n全国通用\t160796\n江海晚报\t160797\nドキュメント\t160798\n随风而逝\t160799\n上古卷轴5:天际重置版\t160800\n一年三月三\t160801\n淘云\t160802\nOFweek光学网\t160803\n鼻炎膏\t160804\n52所\t160805\n凌淑芬\t160806\n又逢\t160807\n机械有限公司\t160808\n处女座\t160809\n金狮巷\t160810\nformy\t160811\n削皮器\t160812\n充磁机\t160813\n李晓霞\t160814\n未央oo\t160815\nIU\t160816\n2%\t160817\n全直流变频\t160818\n宁古塔\t160819\n合租记\t160820\n全信\t160821\nep52\t160822\nARCGIS—地信网\t160823\n借此机会\t160824\nMTBF\t160825\n样卷\t160826\n鱼屏\t160827\n人事股\t160828\n东风21D\t160829\n英国短毛猫\t160830\nFreshPorts\t160831\n开号\t160832\n南昌二中\t160833\n满页\t160834\n鸡冠洞\t160835\n得主\t160836\n汽油版\t160837\n0.10.0\t160838\n20p_\t160839\n问一\t160840\n靶标\
t160841\n高交\t160842\nWebEngine\t160843\n小叙\t160844\n庙山\t160845\n鹬蚌相争\t160846\nqq机器人\t160847\n550w\t160848\nFate\t160849\n网酒网\t160850\n交差\t160851\nmysqli\t160852\n相思豆\t160853\n奥克斯热水器\t160854\n反右运动\t160855\nRange\t160856\n民办学校\t160857\nlib4d\t160858\n100km\t160859\n宁大\t160860\nJavascr\t160861\n弹珠机\t160862\n元宝区\t160863\n保税仓库\t160864\n贵族学院\t160865\n机关工委\t160866\n绿地新里波洛克公馆\t160867\n四渡赤水\t160868\n生牛乳\t160869\n小三元\t160870\n横突\t160871\n苍鼠\t160872\nrap吧\t160873\n流放之路3.1游侠侠客\t160874\n海报架\t160875\n1338\t160876\n漏缝\t160877\nCostco\t160878\n清华大学五道口金融学院\t160879\n起亚途锐\t160880\n思考者\t160881\n微信群主\t160882\njna\t160883\n花果茶\t160884\nmegatrends\t160885\n浙江省环保厅\t160886\n紫竹半岛\t160887\n新风向\t160888\n49位\t160889\n朱丹溪\t160890\nly\t160891\n回民公墓\t160892\n顶岗实习日记\t160893\n3584\t160894\n股绳\t160895\n戈尔德隆\t160896\n快贷\t160897\n水电图\t160898\n英雄联盟助手\t160899\n督抚\t160900\n豪夺\t160901\n窃听风云1\t160902\n蔡高厅\t160903\n7010\t160904\nS686\t160905\n动感\t160906\n柳梧新区\t160907\n二维码扫描\t160908\n杨树花\t160909\n环讯人才网\t160910\n0108648\t160911\n桃子鱼仔ukulele\t160912\nHCT\t160913\nAgility\t160914\ncwd\t160915\n150kg\t160916\n困难\t160917\nbio\t160918\n喜利得\t160919\nAHP\t160920\n九寸\t160921\n央金兰泽\t160922\n七大\t160923\n克什克腾\t160924\n中彩网\t160925\nOFweek工控网\t160926\n日购网\t160927\n狼和小羊\t160928\n20余家\t160929\nHostEase\t160930\n斜风\t160931\nnanana\t160932\n斐擒虎\t160933\n黄一琳\t160934\n劫机案\t160935\n巫启贤\t160936\n新路\t160937\n杀猪刀\t160938\n独孤天下全集\t160939\n3dmax9.0\t160940\n杰仕\t160941\n明美\t160942\ntn屏\t160943\n开封政府网\t160944\n百分之\t160945\ntvbyy\t160946\nFENDI\t160947\nPLSQLDeveloper\t160948\n朗科\t160949\n宠冠\t160950\n内面\t160951\nsettimeout\t160952\n30兆瓦\t160953\n中信银\t160954\n太宰\t160955\nkoala\t160956\nfeels\t160957\n色皿\t160958\n黔中\t160959\n马雪\t160960\n2018.04.13\t160961\n魅族MX3\t160962\n燃烬\t160963\ntris\t160964\nweighted\t160965\n乂\t160966\n振动筛\t160967\n古月哥\t160968\n卑鄙无耻\t160969\n中山市第二人民法院\t160970\n第63期\t160971\n强力\t160972\nTAPD\t160973\n油爆虾\t160974\n20kw\t160975\n宅配\t160976\n世纪金花\t160977\n颈子\t160978\nXPERIA\t160979\n云企信\t160980\n第300章\t160981\n重油\t160982
\n市发展改革委\t160983\nEasyPass\t160984\n趋利\t160985\n计划生育委\t160986\n12枚\t160987\n视频\t160988\ntuxera\t160989\n觥筹交错\t160990\n不可收拾\t160991\n表达式校\t160992\n法坦\t160993\nhaarcascade\t160994\n安全单号网\t160995\n2月2日\t160996\n正极\t160997\nebc\t160998\nshard\t160999\n中央民族大学附属中学\t161000\n玉兔\t161001\n白茅根\t161002\n网上保险\t161003\n中国国际贸易中心\t161004\n金童\t161005\n温岭房产网\t161006\n开学季\t161007\n大迈X5\t161008\nab类\t161009\n原样\t161010\n矢子\t161011\n环球移民\t161012\n钻地\t161013\n18年1月\t161014\n小米mix2吧\t161015\n叩开\t161016\n路西法\t161017\nVC2008\t161018\n多则\t161019\n皮秒\t161020\n先锋骑兵_先锋影院_新先锋影院_xfplay影音先锋\t161021\n和第\t161022\n茶麸\t161023\n34pao\t161024\n梅氏\t161025\n卷皮网\t161026\n外六角螺栓\t161027\ntcrct\t161028\n大阪城公园\t161029\n吕小敏\t161030\n预应力梁\t161031\n强制\t161032\n不走寻常路\t161033\n百项\t161034\nbeko\t161035\n品烟\t161036\ndiplomacy\t161037\n作文题\t161038\n水冷中央空调\t161039\n狂奔\t161040\n大连国际机场\t161041\ncarl\t161042\n西柏\t161043\n国大药房\t161044\n江阴农商银行\t161045\nOctavia\t161046\n英雄联盟视频站\t161047\n北京卓川电子科技有限公司\t161048\n消费者联盟\t161049\nSalomon\t161050\n习近平新\t161051\ndespacito\t161052\nrobotframework\t161053\n妙算\t161054\n师团\t161055\n英纳\t161056\n流标\t161057\n儒系\t161058\nwestwood\t161059\n右江\t161060\n关秀媚\t161061\n秀逗魔导士\t161062\n中国军校\t161063\n荧光棒\t161064\n操控感\t161065\ng++\t161066\noid\t161067\n家版\t161068\n品牌形象\t161069\n空壳公司\t161070\n中国重工\t161071\n中国银河\t161072\npermissions\t161073\n轴子\t161074\n高中数学必修1\t161075\n非常了得\t161076\nPostCSS\t161077\n稍等\t161078\n贵港市人民政府\t161079\n72G.com\t161080\n太皮\t161081\n凤馨苑\t161082\n王东升\t161083\n年率\t161084\n促销语\t161085\n智联银行招聘网\t161086\nPD-1/PD-L1\t161087\ntarfile\t161088\n微雕\t161089\nTrimble\t161090\nspore\t161091\n广西壮族自治区工业和信息化委员会\t161092\n收缩期\t161093\nCHOO\t161094\n嗑药\t161095\n周游列国\t161096\nMobiscroll\t161097\n贵友大厦\t161098\n成虎\t161099\n三胞集团有限公司\t161100\nAWE2018\t161101\nhazel\t161102\n烁烁\t161103\n翰海\t161104\n北京市基础设施投资有限公司\t161105\n炮声\t161106\n9亿元\t161107\n外力\t161108\n百度众测\t161109\nwyse\t161110\n牺牲品\t161111\n张合\t161112\n腰段\t161113\n面袋\t161114\n一拨\t161115\n全优\t161116\n各自\t161117\nORA-00904\t161118\n广州市纪委监察局\t161119
\n姚远\t161120\n汉砖\t161121\n水利部海河水利委员会\t161122\nhoop\t161123\n二十五天\t161124\n压铸模具\t161125\n碧园\t161126\n干眼症\t161127\n李九\t161128\n龙湖昱湖壹号\t161129\n财政票据\t161130\n拉布拉多猎犬\t161131\n美中不足\t161132\n长航局\t161133\n连通器\t161134\n中国电子科技集团公司第五十四研究所\t161135\n锅底\t161136\n刘阳\t161137\n兆丰\t161138\n结构化面试\t161139\nadduser\t161140\n人教精通\t161141\n书香苑\t161142\n三十儿立\t161143\n西兴\t161144\ngarde\t161145\n东方绿洲\t161146\najax函数\t161147\nHouson\t161148\n滕州房产网\t161149\n失言\t161150\n中铁银通卡\t161151\n逝去\t161152\n鸭绿江口\t161153\nPieces\t161154\n衡阳师范学院\t161155\nm1200\t161156\n胡乐\t161157\nheel\t161158\n蜀光中学\t161159\n_仪\t161160\n刘云\t161161\n国际篇\t161162\nvocation\t161163\n2014.7\t161164\n汪建民\t161165\n纪元龙\t161166\n双曲函数\t161167\n套表\t161168\n靠左\t161169\n中华人民共和国商标法\t161170\n环图\t161171\n七月十五\t161172\n福人\t161173\n魏派\t161174\n江山多娇\t161175\n大青芒\t161176\n重渡沟\t161177\nfirstly\t161178\n人口迁移\t161179\n阿拉山口\t161180\nVending\t161181\n贡梨\t161182\n指名\t161183\n电子表\t161184\n西安交大一附院\t161185\n凶神\t161186\n散瘀\t161187\n上海银行\t161188\n大过\t161189\n双开软件\t161190\n新网银行\t161191\n深圳市国税局\t161192\n产线\t161193\n15乘\t161194\nblacked\t161195\n失和\t161196\n婉容\t161197\n耳聪目明\t161198\n东北往事之黑道风云\t161199\n陶土砖\t161200\n交易单\t161201\n旁边\t161202\n肾萎缩\t161203\n张梦南\t161204\nautomatic\t161205\n山西省省\t161206\n平安人寿保险公司\t161207\n经纪人\t161208\n颜建国\t161209\n竞争对手\t161210\n甘蓝型\t161211\n李雪莲\t161212\n陶正国\t161213\n期市\t161214\n小q\t161215\n预算\t161216\n网易相册_380452075@qq.com\t161217\n耳石\t161218\n介绍信\t161219\nINTOUCH\t161220\n茶戏\t161221\n2in1\t161222\nrelationships\t161223\n97部\t161224\n21.2\t161225\n强军之路\t161226\nhalen\t161227\n体重\t161228\nbetty\t161229\n美国大使馆\t161230\n359号\t161231\n联吡啶\t161232\nC#\t161233\n为主题\t161234\n180426\t161235\n京津冀地区\t161236\n烟纸\t161237\n吕佳\t161238\n_灯\t161239\n99次\t161240\n古宁\t161241\n我自己\t161242\ncolleagues\t161243\n视疲劳\t161244\nZelda\t161245\n合笼\t161246\n海子汤姆克鲁斯\t161247\n顶珠\t161248\n百家湖小学\t161249\n进步奖\t161250\nFCL\t161251\nluc\t161252\n文化创意产业园\t161253\nnano卡\t161254\n抬手\t161255\n亚麻籽\t161256\n叫卖声\t161257\n5平米\t161258\nfirewalld\t161259\n段勇\t161260\n全高清\t16
1261\n张自忠路\t161262\n智力\t161263\n时点\t161264\nCIMCO\t161265\n优鲜\t161266\n古典乐\t161267\n上海精密仪器仪表有限公司\t161268\nequivalent\t161269\n成就是\t161270\nPVC\t161271\n花楼\t161272\n驼鹿\t161273\nCalifornia\t161274\nfeaturing\t161275\n显微镜目镜\t161276\n配电板\t161277\n20180403\t161278\nbae\t161279\n东方市\t161280\n75名\t161281\n外码\t161282\n特种兵\t161283\n真够\t161284\n务川\t161285\n凌斌\t161286\n贾兰\t161287\n名著\t161288\n溪城家园\t161289\n锦绣新城\t161290\n弘阳\t161291\n捕鱼达人\t161292\n变暖\t161293\n耳式\t161294\n双币\t161295\ninjuries\t161296\n番茄炒蛋\t161297\n腹肌板\t161298\n一个项\t161299\n胚根\t161300\n澳丁\t161301\n西黄村\t161302\n水岛津实\t161303\n唐承沛\t161304\n汇通票据网\t161305\nCause\t161306\n闷气\t161307\nattraction\t161308\n陈松林\t161309\n8起\t161310\nsandra\t161311\n光纤传感器\t161312\n引蛇出\t161313\n海底苍鹰\t161314\n银河园\t161315\n抖森\t161316\n厚干\t161317\n8英寸\t161318\n鲁豫\t161319\n扫视\t161320\n王晨\t161321\n排石\t161322\nExciting\t161323\nx20a\t161324\n山药粥\t161325\n金保\t161326\n北京市气象局\t161327\n中国平安人寿保险\t161328\n南报网\t161329\n宁晋县\t161330\n秧秧\t161331\n多氯联苯\t161332\n哈佛大学图书馆\t161333\n我的世界呆呆生存记\t161334\n第57章\t161335\n老同志\t161336\n在梦里\t161337\n乡村艳妇\t161338\nxCode\t161339\nParfums\t161340\n荷甲\t161341\n微氪\t161342\n27m\t161343\n泰拳王\t161344\naudi\t161345\ndating\t161346\n暗线\t161347\nmisumi\t161348\n社会保障费\t161349\n履历\t161350\nQQ技术网\t161351\nBuilder\t161352\n查无\t161353\nchromebook\t161354\n流行词\t161355\n书香世家连锁酒店\t161356\n全国中小企业股份转让系统\t161357\n文上\t161358\nbill\t161359\n放心不下\t161360\n巴夏\t161361\n豆腐皮机\t161362\nstewart\t161363\n周海峰\t161364\n赵贺新\t161365\n睡美人\t161366\n蛊术\t161367\n堂子\t161368\n苏州公交查询网\t161369\nVCA\t161370\n时代楷模\t161371\n影城\t161372\n11点半\t161373\n12999数学网\t161374\n六位地黄丸\t161375\nBCAA\t161376\n爱己\t161377\nrommon\t161378\n榨苹\t161379\n乾县人民政府\t161380\n紫云飞\t161381\n一醉\t161382\n貉\t161383\n网贷之家论坛\t161384\n国家计委\t161385\npackager\t161386\n12bit\t161387\n淮安市公安局\t161388\n微营销\t161389\n反感\t161390\n骈文\t161391\n60包\t161392\nnorton\t161393\n战地2\t161394\nnupkg\t161395\n一多半\t161396\n星图\t161397\nV380\t161398\npipes\t161399\n张钰桦\t161400\n止水环\t161401\n春酒\t161402\n普洱市\t161403\n得克萨斯州\t161
404\n准线方程\t161405\n船木\t161406\n营养品\t161407\n1963年\t161408\n产蛋率\t161409\n戴村镇\t161410\nlink无线路由器\t161411\n图书类\t161412\n6两\t161413\n法国斗牛犬\t161414\n白肌\t161415\n停缴\t161416\n四大名捕2\t161417\n文新街道\t161418\n上坟\t161419\n博创科技\t161420\n小黑瓶\t161421\n一传\t161422\n车城网\t161423\n雏鹰农牧\t161424\n谭木匠\t161425\n艾米莉亚\t161426\n自助银行\t161427\n考评\t161428\n101个\t161429\nTHC\t161430\n流浪法师\t161431\n12星座\t161432\n20|\t161433\n修版\t161434\n每回\t161435\n叮咬\t161436\n小护\t161437\n萃茶师\t161438\n菌类\t161439\n精光\t161440\n发起\t161441\n陈立\t161442\n刘艳芬\t161443\n鸣雏\t161444\n哥达\t161445\n陶渊明\t161446\nKantar\t161447\n马代\t161448\n综服\t161449\n露茜\t161450\n巩留\t161451\nHani\t161452\n夜的钢琴曲五\t161453\n前进街道\t161454\n刀山火海\t161455\n仕官\t161456\n我们的眼睛\t161457\n护士门\t161458\n第13次\t161459\n中外合作大学\t161460\n子平真诠\t161461\n宁波地铁\t161462\n南康在线\t161463\n酷我音乐2017\t161464\n狭路相逢\t161465\n越共\t161466\nBurn\t161467\nInjector\t161468\nPowerDesigner16\t161469\ndisadvantages\t161470\n晚霞\t161471\n子波\t161472\n唐瑛\t161473\n戴玉\t161474\n201712\t161475\n峒\t161476\n魏艾斯\t161477\nfreedos\t161478\nMySQL函数\t161479\nsovereign\t161480\n小米note顶配版\t161481\n惊动\t161482\n彝良县\t161483\n華佳科技有限公司\t161484\n假日花园\t161485\n墨卡托投影\t161486\nlightmap\t161487\n精酿啤酒\t161488\nstor\t161489\n根绝\t161490\n长隆科技\t161491\n韦编三绝\t161492\nimresize\t161493\n阿敏\t161494\n春桃\t161495\n科学\t161496\n买量\t161497\n阳明街道\t161498\nterrestrial\t161499\n甲醛测试仪\t161500\n氨苄西林胶囊\t161501\nERIKA\t161502\n范植伟\t161503\n石器时代\t161504\n静水\t161505\n押花\t161506\nFemme\t161507\n烦忧\t161508\n鹅岛\t161509\n250万吨\t161510\n飞越\t161511\n延保版\t161512\n夏天岛\t161513\n金扇\t161514\n行刺\t161515\n3027\t161516\n药用辅料\t161517\nbl虐文\t161518\n海鲜味\t161519\n29处\t161520\ngong\t161521\n师姐\t161522\n点穴式\t161523\n红动知\t161524\nFe3O4\t161525\n本位主义\t161526\n说明函\t161527\n一里\t161528\nsetData\t161529\n仿句\t161530\n91free\t161531\n宝马7系论坛_汽车之家论坛\t161532\n五日游\t161533\n经集\t161534\nITV\t161535\n销存\t161536\n湖南湘涛\t161537\nhardwell\t161538\n科列\t161539\n扁平线\t161540\n鸿茅公司\t161541\n工程人\t161542\n育苗\t161543\n1.5w\t161544\n掐死\t161545\nzilla\t161546\n02章\t161547\nDevelopment\t
161548\n光明顶\t161549\n52期\t161550\n程序员\t161551\n韩磊\t161552\n24001\t161553\n内心深处\t161554\n西洽会\t161555\n抗日奇侠\t161556\n海花\t161557\n威海地区\t161558\n酸洗\t161559\n小鼠\t161560\nxzqh\t161561\n绝灭\t161562\n中国商品信息验证中心\t161563\n100户\t161564\n归雁\t161565\n王者荣耀掉帧\t161566\n木树\t161567\n康忻\t161568\nVarnish\t161569\n纱门\t161570\n吴建飞\t161571\nPrairie\t161572\n佛\t161573\n十方\t161574\n襄县\t161575\n300plc\t161576\ncontinued\t161577\n亚文化\t161578\n股利支付率\t161579\n信证券\t161580\nPassive\t161581\n消防工程师考试\t161582\n销售吧\t161583\n成都房协\t161584\n睿丁\t161585\nentropy\t161586\n菜绪\t161587\n0.7cm\t161588\n人教版小学语文三年级下册\t161589\n牛散\t161590\nV2015\t161591\n考下\t161592\n潇洒\t161593\n新闻_天健网\t161594\n软底\t161595\n压环\t161596\n30周岁\t161597\n十百千万\t161598\n川贝枇杷糖浆\t161599\nQList\t161600\n大雨\t161601\nCentOS/RHEL\t161602\n斯蒂夫\t161603\n股藓\t161604\nHarvey\t161605\n2017年4月28日\t161606\n第三方浏览器\t161607\n国际\t161608\nimwrite函数\t161609\n余周周\t161610\nbyron\t161611\n争暖\t161612\nzsh\t161613\nvgtime\t161614\n三甘醇\t161615\n无益\t161616\n百分之75\t161617\n疲惫\t161618\nrmmod\t161619\n定段\t161620\n非暴力沟通\t161621\nnavgation\t161622\n龙首村\t161623\n201702\t161624\n张文辉\t161625\n爱戴眼镜网\t161626\n罗贤\t161627\n驱动包\t161628\n通神\t161629\n得不偿失\t161630\n帝女花\t161631\n秋毫\t161632\n泽口靖子\t161633\n资料袋\t161634\n三星S换机助手\t161635\nDMI\t161636\n平安福2018\t161637\n信阳市政府网\t161638\n利苑\t161639\n续建\t161640\nPlays\t161641\n随时\t161642\nd11\t161643\n密集区\t161644\nfirebox\t161645\n产前假\t161646\n植物大战僵尸电脑版\t161647\nJetson\t161648\nztb\t161649\n意林杂志\t161650\n阿里巴巴达摩院\t161651\n房东\t161652\n五原县\t161653\n租摆\t161654\n金泊\t161655\nPhpMyadmin\t161656\nobjs\t161657\nCoinDesk\t161658\n小跟班\t161659\n天津市规划局\t161660\n快进来\t161661\n奥迪A4\t161662\n雾化机\t161663\n何训田\t161664\n2017年9月4日\t161665\nurl-pattern\t161666\n泡罩\t161667\n安义古村\t161668\n512\t161669\n生物油\t161670\n源义经\t161671\nitalki\t161672\na9100\t161673\n园子\t161674\n久居\t161675\n支付宝网商贷\t161676\n约惠\t161677\n打消\t161678\n云巅之上\t161679\n亚胺\t161680\n69054170\t161681\n眼表\t161682\n牙槽\t161683\n5000ml\t161684\n忘我\t161685\n倾斜角\t161686\n接招\t161687\n主字\t161688\nE30\t161689\n115
cm\t161690\n达古冰川\t161691\n薛伟\t161692\n补寄\t161693\n宁德市\t161694\n530分\t161695\n博维\t161696\n纱支\t161697\nDVD高清\t161698\n桂林市人民医院\t161699\n祸根\t161700\n敬礼\t161701\n江苏省水利厅\t161702\n娇美\t161703\nrk3368\t161704\n幽浮2\t161705\n正不\t161706\n林柏欣\t161707\n苹果酒\t161708\n黑松盆景\t161709\n飘云\t161710\n姆巴佩\t161711\n六桂福\t161712\n留坝县\t161713\n游素兰\t161714\nkota\t161715\n270x\t161716\n所用\t161717\nvlookup函数\t161718\nHistoric\t161719\n麓景路\t161720\n昼行\t161721\n吃豆腐\t161722\nwww日日色色cc\t161723\n艾游\t161724\n2.4.29\t161725\nDIMENSION\t161726\nmess\t161727\n网约\t161728\neugene\t161729\n瞬发\t161730\n义和庄\t161731\n新合村\t161732\nForce\t161733\nFitness\t161734\nNO.001\t161735\n留学\t161736\n传统产业\t161737\n同洲\t161738\n昆明市延安医院\t161739\n类药\t161740\n8770\t161741\n共产党新闻网\t161742\n锦悦\t161743\n金小\t161744\n两限\t161745\n魔兽官方对战平台\t161746\nrandi\t161747\n手帐\t161748\n谣\t161749\n盖子\t161750\nsetdata\t161751\n实体店\t161752\n陈思璇\t161753\n金泉\t161754\n清华大学玉泉医院\t161755\n雷法\t161756\n刀剑神域:夺命凶弹\t161757\n嫡长子\t161758\n锡柴\t161759\n苍之纪元维多利亚\t161760\nZeiss\t161761\n清色\t161762\n获释\t161763\n东方学院\t161764\n世界杯预选赛_\t161765\n造梦西游5\t161766\n中国航天日\t161767\n果维康\t161768\n排毒\t161769\n查一查\t161770\ntlsv\t161771\n海耶斯\t161772\n恶意中\t161773\nchenjianwen\t161774\n空无一人\t161775\n笑点\t161776\nx3650m5\t161777\nsOn\t161778\n麻木不仁\t161779\n如龙6\t161780\n57%\t161781\n杉木板\t161782\n【哥斯拉\t161783\n订单管理系统\t161784\n支柱\t161785\nMyanmar\t161786\nガチ\t161787\n站出\t161788\nresteasy\t161789\n启维\t161790\n胰蛋白酶\t161791\n四川省交通厅\t161792\n弹弹起亚k5\t161793\n南苑新村\t161794\n安徒生童话乐园\t161795\n老云\t161796\n南斯拉夫大使馆\t161797\ntargeting\t161798\n现息\t161799\n引文版\t161800\n液压机\t161801\n卡怪\t161802\ninactive\t161803\nrsh\t161804\nbraking\t161805\n黄巾之乱\t161806\nProductive\t161807\nTOPIC\t161808\n文慧园\t161809\n格灵\t161810\n新媒体营销\t161811\nCloudy\t161812\n八日游\t161813\n韧皮\t161814\n50行\t161815\n冀中\t161816\n太阳鼓\t161817\n杨开\t161818\n黑商\t161819\nHormone\t161820\ntplogin.cn\t161821\n卫\t161822\nlte\t161823\n土块\t161824\n债人\t161825\nNO1\t161826\n中国温岭_温岭政府\t161827\n智能停车场管理系统\t161828\n火锅店\t161829\n圣代\t161830\n作罢\t161831\n烧花\t
161832\n117集\t161833\n天然气公司\t161834\nSWT\t161835\n摩擦型\t161836\n真累\t161837\n锚泊\t161838\n福建小学\t161839\nPDX\t161840\ncoderland\t161841\n力量\t161842\n银行间债市\t161843\n天女兽\t161844\n19.30\t161845\n修文县\t161846\n粤语版)\t161847\nnbc\t161848\n小攻\t161849\n孟雪\t161850\n东县\t161851\n诡情\t161852\n文件源\t161853\n封刀\t161854\nArtifactory\t161855\n乌梢蛇\t161856\n李雅普\t161857\n白露\t161858\n唐王镇\t161859\nloser\t161860\n奥尔多\t161861\n主治医师\t161862\n深圳有限公司\t161863\n会宁县\t161864\n摆渡车\t161865\n150um\t161866\n大理古镇\t161867\n240mm\t161868\nSy\t161869\n钢铁侠2\t161870\n妥\t161871\n女孩子\t161872\n量价齐升\t161873\n天才少年\t161874\n云片网\t161875\n辛吉斯\t161876\n潮州新闻网\t161877\n炫耀\t161878\n水养\t161879\n杜丘之歌\t161880\nDay6\t161881\n潮跑\t161882\ncongratulation\t161883\n隔爆\t161884\nAdventure\t161885\n贺礼\t161886\n汉语文\t161887\n天府\t161888\nLululemon\t161889\n接合\t161890\n7980XE\t161891\n第11季\t161892\n标致3008\t161893\n70G\t161894\n临幸\t161895\n零利率\t161896\n层压机\t161897\n汽车蓄电池\t161898\nISE\t161899\n电建\t161900\n制袋\t161901\n末位\t161902\nhib疫苗\t161903\n李中堂\t161904\ngtp6\t161905\n虾爬子\t161906\n物联网概念股\t161907\n八稚女\t161908\n雀雀\t161909\nVoith\t161910\n详说\t161911\nKDZ\t161912\n宽宽\t161913\n十五米\t161914\nforfiles\t161915\n77d\t161916\n山木\t161917\nrecovering\t161918\n兰州石化公司\t161919\nmaat\t161920\n苎麻\t161921\n神州租车\t161922\n狼太郎\t161923\n2016.01\t161924\nmermaid\t161925\n泽艺\t161926\n23_\t161927\n抱养\t161928\n糊涂的爱\t161929\n油泥\t161930\n背信弃义\t161931\n蒸熟\t161932\n衡山路\t161933\n22版\t161934\n99张\t161935\n六月底\t161936\n韩森\t161937\n芦溪镇\t161938\n柱子\t161939\n西塞山怀古\t161940\n鲁氏\t161941\nxinyaohui\t161942\nYT\t161943\n聚缘\t161944\n汉后\t161945\n说出\t161946\n街霸5AE\t161947\n郭蕾\t161948\n铺展\t161949\n楔块\t161950\nci\t161951\n防蚊裤\t161952\nwii游戏\t161953\n中国钢铁工业协会\t161954\n到现\t161955\n心体\t161956\n公积金\t161957\n乳山银滩\t161958\njq选择器\t161959\n开元\t161960\nFPU\t161961\n8560w\t161962\n生路\t161963\n刮画\t161964\n西山森林公园\t161965\nnetpas\t161966\n江南大厦\t161967\n106页\t161968\n湛江新闻网\t161969\nMak\t161970\n珊瑚颂\t161971\n萌妻撩人:腹黑帝少心尖宠\t161972\n鸿海集团\t161973\n欣泰电气\t161974\n碳酸饮料\t161975\n北京地税局\t161976\n蟒河\t161977\n卧式
铣床\t161978\nonda\t161979\n程熙媛\t161980\n翻样\t161981\n乙状结肠\t161982\nnmcli\t161983\n李堡镇\t161984\n海南生态软件园\t161985\n如意集团\t161986\n纽卡斯尔联\t161987\nLavida\t161988\nappuim\t161989\n一百单八\t161990\n栋梁之才\t161991\nExceL\t161992\n畸情\t161993\n味源\t161994\n安慕希酸奶\t161995\n罗才军\t161996\n刘承俊\t161997\n随锐科技\t161998\n朗朗\t161999\n陕文\t162000\nfuction\t162001\napis\t162002\n电视节目\t162003\n2018多\t162004\nwindwos10\t162005\n真命天女\t162006\n联合国粮农组织\t162007\n一立方米\t162008\n12316\t162009\n最脏\t162010\nSDRAM\t162011\n离卦\t162012\n免赔额\t162013\n毒剂\t162014\n银坑\t162015\n毛桃\t162016\n2017年6月21日\t162017\n大黄庄\t162018\nPilates\t162019\n大澄社区\t162020\n直肠镜\t162021\n书画类\t162022\n暗殿\t162023\ndiving\t162024\n晋文\t162025\n荔枝FM\t162026\n知不足\t162027\nhappy\t162028\n伊利温\t162029\n休闲类\t162030\n厢式货车\t162031\ncoach\t162032\n28亿元\t162033\nDrinking\t162034\njimin\t162035\n腮\t162036\n彭泽网\t162037\n玛丽·安托瓦内特\t162038\n前向\t162039\ncof\t162040\n东软医疗\t162041\n朱静\t162042\n乔南\t162043\n沃趣\t162044\n刑辩律师\t162045\n华严北里\t162046\n365易贷\t162047\n销品\t162048\n世人\t162049\n中国贵州茅台集团\t162050\n提一下\t162051\n幅射\t162052\nubank\t162053\n阴囊瘙痒\t162054\nMag\t162055\n春苗\t162056\n花鼓戏\t162057\ntair\t162058\n生化奇兵1\t162059\n湖北安全生产监督管理局\t162060\n涂山\t162061\n上演\t162062\n曙光医院\t162063\nRoland\t162064\n孔雀石绿\t162065\n無料エロ動画\t162066\n创园\t162067\n牙髓炎\t162068\n反气\t162069\n环境保护部南京环境科学研究所\t162070\nbootsrap\t162071\n内阁\t162072\n上唇\t162073\n北京东路668号\t162074\n流行乐\t162075\nNETSHO\t162076\n联合体\t162077\n里包恩\t162078\n6X6\t162079\nddv\t162080\nx86-64\t162081\n第21页\t162082\n本地宝\t162083\n送伞\t162084\n小营\t162085\n鲁中书画艺术网\t162086\n富拉尔基吧\t162087\n阳光正好\t162088\n六氟化硫\t162089\n正月初一\t162090\n9000万\t162091\n主战\t162092\n何晟铭\t162093\n特里\t162094\n百度云福利\t162095\nuhf\t162096\n长江干线\t162097\n海马S7\t162098\n石佛\t162099\nIQ测试\t162100\n42万\t162101\npkm\t162102\n艾欣\t162103\n反对者\t162104\n组织胺\t162105\n学战\t162106\n第一问\t162107\n天天快递单号\t162108\n湖心\t162109\n校园小兵传奇\t162110\n6061\t162111\n正康\t162112\n暗红色\t162113\n条理\t162114\n郑万\t162115\n仙堂\t162116\n相托\t162117\n路易·威登\t162118\n梁小民\t162119\nHHeOnline\t162120\n坐垫\t162121
\n新市\t162122\n捷渡\t162123\n首件\t162124\n6月\t162125\n崔丝塔娜\t162126\n闫希军\t162127\n假想敌\t162128\n血钙\t162129\n半身\t162130\n隐忧\t162131\n骄纵\t162132\n佳能200D\t162133\n冰蓄冷\t162134\n晕开\t162135\njzzshg\t162136\n雪花膏\t162137\n十恶大败\t162138\n莲実クレア\t162139\nMRO\t162140\n相位角\t162141\n移出\t162142\n龙冠\t162143\n学園\t162144\n列那狐的故事\t162145\nhigher\t162146\ndrawerlayout\t162147\n27所\t162148\n九版\t162149\n马一鸣\t162150\n御茶\t162151\n重庆医科大学附属儿童医院\t162152\nalert,confirm\t162153\n_鱼\t162154\n第一缕\t162155\nf407\t162156\n20108\t162157\n黄伟忠\t162158\n核磁共振氢谱图\t162159\nC63\t162160\n黑莓q20\t162161\nnpm\t162162\n河西区\t162163\n9G\t162164\n杨逍\t162165\n买不如\t162166\n开片\t162167\n宫角\t162168\n中国人保健康\t162169\n湖北自考网\t162170\n卫龙辣条\t162171\ncrisis\t162172\nonkyo\t162173\n华中科技大学自动化学院\t162174\n150px\t162175\n流风机\t162176\n飙车\t162177\nJones\t162178\nLamp\t162179\nAL20\t162180\n纵轴\t162181\n农历二月十九\t162182\n轶闻\t162183\n公休\t162184\n服气\t162185\n线粒体\t162186\n临江路\t162187\njasonfreak\t162188\n差分法\t162189\n辐射\t162190\n锦江饭店\t162191\n苏金\t162192\n马点\t162193\n尼基\t162194\n万方数据\t162195\n棕眼\t162196\n水光针\t162197\n俄狄浦斯\t162198\n杨梅红国际私立美校\t162199\noppose\t162200\n荣耀4c\t162201\n刘一\t162202\n昆阳\t162203\n张大民\t162204\n石家庄市住房和城乡建设局\t162205\noce\t162206\n马大叔\t162207\n霍山县人民政府\t162208\nbit,Windows\t162209\ntimewait\t162210\n牧马人\t162211\n北京木樨园\t162212\n2017年2月25日\t162213\n专科学\t162214\n荔枝味\t162215\n玻璃杯\t162216\n克罗\t162217\nrealvnc\t162218\n2000g\t162219\ndrawings\t162220\n子宁\t162221\n肌底\t162222\n杭政\t162223\n复方肝素钠\t162224\n索氏\t162225\nmicrotime\t162226\n过渡页\t162227\nSEALY\t162228\n中华医学会\t162229\nkoajs\t162230\n世居\t162231\n王寨\t162232\n天地图\t162233\n人造石\t162234\n1.36\t162235\n张文武\t162236\n1艘\t162237\n口诀\t162238\n阿化\t162239\n闭门造车\t162240\n永泰城\t162241\n刘洲\t162242\n卧龙岗\t162243\n周鸿伟\t162244\n外录\t162245\nod值\t162246\nsprintf\t162247\n美元\t162248\n弹丸论破2\t162249\n学习机\t162250\nrts\t162251\n港督\t162252\n痔\t162253\n第四十四章\t162254\nhelpers\t162255\n泽塔琼斯\t162256\n伊森卡特\t162257\n空军军医大学\t162258\neasyuefi\t162259\n滚轮架\t162260\n云集品\t162261\na酸软胶囊\t162262\n弹药库\t162263\n高鑫\t162264\n
watts\t162265\n蓉\t162266\n站内\t162267\n中化能源\t162268\nimihiro\t162269\n华业资本\t162270\n大熔炉\t162271\ngrocery\t162272\n感染力\t162273\nzishi\t162274\n裁决\t162275\n踏莎行\t162276\n夏光\t162277\n休\t162278\n昆明理工大学》\t162279\n置为\t162280\nvco\t162281\n告负\t162282\n获奖者\t162283\n咬破\t162284\n60950\t162285\n古兰经\t162286\n100U\t162287\n木托盘\t162288\ncoon\t162289\n巴哈马\t162290\n神雕侠侣古天乐\t162291\nConvolutional\t162292\n2017年11月12日\t162293\n大沙镇\t162294\n童子功\t162295\n乾隆通宝\t162296\nquaternion\t162297\n息县\t162298\n水口山\t162299\n2016年10月24日\t162300\nPerfume\t162301\n3胎\t162302\n朱自清\t162303\n江峰\t162304\n气型\t162305\n高山市\t162306\n5.7.18\t162307\n用物\t162308\n大悦\t162309\n郭凡生\t162310\n重庆6号线\t162311\nlpddr3\t162312\n洒扫\t162313\nmysqli_query\t162314\n卡卡西\t162315\n60l\t162316\n信商\t162317\n搏动性耳鸣\t162318\n浆果\t162319\n第一车网\t162320\n服装专卖店\t162321\nlibvirtd\t162322\nreloaded\t162323\n侦探片\t162324\n千强镇\t162325\n数理学院\t162326\n二糖\t162327\n桑心\t162328\n海力\t162329\n肉水\t162330\n阿里巴巴西溪园区\t162331\nSSG\t162332\n国家留学基金委\t162333\n海兴\t162334\nIDF\t162335\nTCR\t162336\nBROTHERS\t162337\nie800\t162338\n经典\t162339\n冯谖\t162340\n御龙在天手游\t162341\n地区片\t162342\n油嘴\t162343\n品胜充电宝\t162344\n442首\t162345\nthan\t162346\n刘永泽\t162347\n朱家\t162348\nunl\t162349\n畅销书\t162350\n游戏版\t162351\n十世班禅\t162352\n藏家\t162353\n麻辣IT网\t162354\nexpat\t162355\n高空作业车\t162356\nsaki\t162357\n李佩玲\t162358\n行头\t162359\n闻笛\t162360\n遵规\t162361\n马站\t162362\naudiojungle\t162363\n广点\t162364\n八木梓\t162365\n屁孩\t162366\n消烟\t162367\n多变\t162368\n贵和\t162369\nnextLine\t162370\n耀眼\t162371\nWeTool\t162372\n深V\t162373\n云南省公安厅交通警察总队\t162374\n爱荷华州立大学\t162375\nsymptom\t162376\n龙门乡\t162377\nsP\t162378\nwuye\t162379\n女光\t162380\n怒焰\t162381\n中国电子进出口总公司\t162382\n硬屏\t162383\n一千克\t162384\n圆\t162385\nDbVisualizer\t162386\n杨玥\t162387\n向天\t162388\n尼康d5600\t162389\n选品\t162390\n湿寒\t162391\nFIDIC\t162392\n40多位\t162393\n泗洲\t162394\n休息区\t162395\n埋线\t162396\n桃源\t162397\nNX_模具联盟网\t162398\n先天性髋关节脱位\t162399\n鄂温克族自治旗人民政府\t162400\n紫光阁\t162401\n度量\t162402\n绯\t162403\nZeit\t162404\n林书豪\t162405\n琪琪\t162406\n特乐意\t16
2407\nRepository\t162408\nv1.1.5\t162409\n自由落体运动\t162410\n浙西天池\t162411\n藜\t162412\n育儿大作战#\t162413\n独女\t162414\n一窝\t162415\n减肥法\t162416\n主渠道\t162417\n旧居\t162418\n拉扯\t162419\n红芪\t162420\n张罗\t162421\n蒸柜\t162422\nemmmmmm\t162423\n第一百一十五章\t162424\n小汤山镇\t162425\n湖南人\t162426\n半自\t162427\n火车王\t162428\n水帘洞\t162429\nLogs\t162430\n109万\t162431\n怎吗\t162432\n炉具\t162433\n他者\t162434\n蓝灰色\t162435\nExcel服务器\t162436\n泰斯拉\t162437\n青藏公路\t162438\n雷客\t162439\n行唐\t162440\n土木建筑工程学院\t162441\n死相\t162442\n二孔\t162443\nngCordova\t162444\n金屋藏娇\t162445\n韩信\t162446\n栾城\t162447\nIc\t162448\n声磁\t162449\n义兴\t162450\n福寿\t162451\ndishes\t162452\n心电\t162453\n大竹中学\t162454\n20140131\t162455\n反序\t162456\n牛津树\t162457\nC-Free\t162458\n大吊\t162459\n约见\t162460\n嗟乎\t162461\n小船\t162462\n风达快递\t162463\n中药制丸机\t162464\ncrowdraw\t162465\nwiff\t162466\n松鹤延年\t162467\n放马\t162468\n富登\t162469\n前男友\t162470\n夜王\t162471\n井绳\t162472\n地煞星\t162473\n东萍象棋\t162474\n暴走大事件_暴走漫画\t162475\ndelighted\t162476\n视域\t162477\n肝脏\t162478\n贝乐虎\t162479\n2016年6月份\t162480\n军中乐园\t162481\nMMSI\t162482\n慕田峪\t162483\nav一\t162484\nwipe\t162485\n温水\t162486\n15crmo\t162487\ndiscussion\t162488\n朗行\t162489\nvc6.0\t162490\n海上威尼斯\t162491\n600\t162492\n上海市徐汇区行政服务中心\t162493\n广西国际壮医医院\t162494\n学说\t162495\n公共交通工具\t162496\n针孔\t162497\n灰分\t162498\n胧村正妖刀传\t162499\n千变万化\t162500\n红日药业\t162501\n跑偏\t162502\n唐城影视基地\t162503\n一彩\t162504\n琼山区\t162505\n默寒\t162506\n蒙元\t162507\nrunloop\t162508\n清华中学\t162509\n1.63\t162510\n黄桷树\t162511\n众力\t162512\nORA-12505\t162513\nsubsystem\t162514\nResurrection\t162515\n云虚拟\t162516\n灵舟\t162517\n记录片\t162518\n黄洪\t162519\ncommunications\t162520\nX3.4\t162521\n饮料类\t162522\n中学区\t162523\n蒸湘区\t162524\nNoir\t162525\n李易\t162526\n论持久战\t162527\n前田かおり\t162528\n从化街口\t162529\n宝莱特\t162530\n北大学\t162531\n概说\t162532\n灵山湾\t162533\n反日\t162534\n第三幕\t162535\n纳米粒子\t162536\n法\t162537\n套餐费\t162538\n二七塔\t162539\n北京雍和宫\t162540\n1920年代\t162541\n吴江经济技术开发区\t162542\nFirmware\t162543\n锦明\t162544\n山新\t162545\nV4.0.1\t162546\n耐泡\t162547\n组合件\t162548\n网易游戏\t162549\n红果果\t1625
50\n上河城\t162551\n半月板撕裂\t162552\n西安陕鼓动力股份有限公司\t162553\n喵帕斯\t162554\nroberto\t162555\n清吧\t162556\n海里\t162557\n颈动脉窦\t162558\nヌ\t162559\nPGONE\t162560\ngets\t162561\n茶丸\t162562\n宝马x4\t162563\n刁民想害朕\t162564\n悟生\t162565\n心情舒畅\t162566\n2500个\t162567\n松浦亚弥\t162568\n哨子\t162569\nBritish\t162570\n肩垫\t162571\n1751\t162572\nPS-HX500\t162573\n技展\t162574\n丰田中心\t162575\n芳烃油\t162576\nIdiot\t162577\n诡探\t162578\npopular\t162579\n串儿\t162580\n满人\t162581\n一掌\t162582\n700本\t162583\nFame\t162584\ngamble\t162585\n沙缸\t162586\n吕四娘\t162587\ndivorce\t162588\n水兵\t162589\n1028\t162590\n苏菲玛索\t162591\n风轻云\t162592\nsenses\t162593\n珠海市人力资源和社会保障局\t162594\n进模\t162595\n标马\t162596\nAngelia\t162597\nspecimen\t162598\n上海交通大学电子信息与电气工程学院\t162599\n参函数\t162600\n绿海\t162601\n坯料\t162602\n钟敬文\t162603\n李春梅\t162604\nminster\t162605\n值得一提\t162606\n回不来\t162607\n广东公务员考试网\t162608\n第117集\t162609\n修剪\t162610\n房室瓣\t162611\n景瑞地产\t162612\n卡斯特\t162613\nPersonal\t162614\n鸡枪\t162615\n我是什么\t162616\ntimemachine\t162617\n第6部\t162618\nbigfoot\t162619\n煤气费\t162620\n带子\t162621\n旺城\t162622\n音毡\t162623\n集宁南\t162624\n四个服从\t162625\nSTATIC\t162626\n谷微\t162627\ngccbuaa\t162628\n电玩巴士网游_巴士剑网\t162629\n米津\t162630\n1816\t162631\n北京丰田\t162632\n宏基因组学\t162633\n隐马尔科夫\t162634\n2亿元\t162635\nnon\t162636\n刷卡器\t162637\n连霍高速公路\t162638\n九千\t162639\n竞价\t162640\n红方印\t162641\n新浪宝贝\t162642\n赫赛汀\t162643\n中国人民银行金融研究所\t162644\n半分钟\t162645\n中央财经委员会\t162646\n水清沟\t162647\n神秘\t162648\n新丰田\t162649\nwpi\t162650\n潮王路\t162651\n阜城\t162652\n师陀\t162653\ndistress\t162654\ntempo\t162655\n埙曲\t162656\n高尔基\t162657\n中国船级社\t162658\n宝信\t162659\n爱一个人\t162660\n张琼\t162661\n10800\t162662\n三杀\t162663\n圆面\t162664\n泪桥\t162665\nspring5\t162666\n6片\t162667\n转售\t162668\n名票\t162669\n公司办公室\t162670\n刘银川\t162671\n中国人民银行上海总部\t162672\ndetectron\t162673\n卯月\t162674\n自言自语\t162675\nFORTUNE\t162676\nL级\t162677\n深海之战\t162678\npreference\t162679\n孩提\t162680\n宫阙\t162681\n柘木\t162682\nLadies\t162683\n反腐倡廉教育\t162684\n艰难\t162685\n五菱宏光S3\t162686\nm7.5\t162687\n姚晓峰\t162688\n分离焦虑症\t162689\n神经酸\t162690\n手技\t162
691\nc-free\t162692\nTK1\t162693\n2.4.33\t162694\nslut\t162695\n匹克\t162696\n制作员\t162697\n荒野求生\t162698\n涂布机\t162699\nchinaese\t162700\nMelting\t162701\n工钱\t162702\n放炮\t162703\n概观\t162704\n黄花岗\t162705\n毛人凤\t162706\n猎豹CS10\t162707\n沉水\t162708\n炸酱面\t162709\n逍遥法外\t162710\n锐评\t162711\n富华大厦\t162712\n蜻蜓侠\t162713\n轻浮\t162714\nHey\t162715\n奔驰E200L\t162716\n26页\t162717\ncarter\t162718\n商业保理公司\t162719\n柏翠\t162720\n241\t162721\n香溪源\t162722\n华强北\t162723\n万得资讯\t162724\n平泉镇\t162725\n电涌\t162726\n卫生纸\t162727\n板仓\t162728\n爬书网\t162729\n订量\t162730\n情诫\t162731\nobey\t162732\n搞笑漫画_来福岛爆笑娱乐网\t162733\n船机\t162734\n溪边静禅\t162735\nv3.1.8\t162736\nnote1s\t162737\nbcx\t162738\n2000ml\t162739\n波次\t162740\n出版传媒\t162741\nrar格式\t162742\n文化衫\t162743\n松本若菜\t162744\n麋鹿\t162745\n鸽\t162746\n人命关天\t162747\n客服员\t162748\n孙凯\t162749\n上线\t162750\nvc14\t162751\n参苓健脾胃颗粒\t162752\ndave\t162753\n字盖\t162754\n子宫内膜薄\t162755\n万科樱花园\t162756\n购买力\t162757\n北京商办\t162758\n干露露\t162759\n影踪\t162760\n布罗茨基\t162761\n陆金宝\t162762\n上海石油天然气交易中心\t162763\n夜人\t162764\njyzx\t162765\n章小蕙\t162766\n天才在左疯子在\t162767\n贴文\t162768\nyoga2\t162769\nTrader\t162770\n汇丽\t162771\n食品伙伴网\t162772\n容道\t162773\n斗破苍穹漫画全集\t162774\n锂电钻\t162775\n凤凰与飞\t162776\n爱琴湾\t162777\n郑州市园林局\t162778\n空见\t162779\n方正证券\t162780\n舒血宁注射液\t162781\ntaichi\t162782\n刘田\t162783\n长安欧尚欧尚X70A2018款高清\t162784\n全球资源网\t162785\n江滨公园\t162786\n济阳路\t162787\n数字电视\t162788\n安康市人民政府办公室\t162789\n2015—2016年度\t162790\n赖世雄\t162791\n拌和机\t162792\n雁过\t162793\n豆沙\t162794\n古惑仔3之只手遮天\t162795\n增持评级\t162796\n黑蚀龙\t162797\nGooglePlay\t162798\n紺野\t162799\n零封\t162800\n引导性\t162801\n夜秀场\t162802\n托普仕\t162803\n山外小楼夜听雨\t162804\nYamagata\t162805\n限制行为能力人\t162806\n雷亚尔\t162807\n东条英机\t162808\n博物学\t162809\n清清宝\t162810\n顿服\t162811\n张家湾镇\t162812\n中材股份\t162813\nconsumed\t162814\n辅导期\t162815\nyhd\t162816\n雒容镇\t162817\n十五日\t162818\ninterim\t162819\n被打破\t162820\n踏雪无痕\t162821\nV3.0.1\t162822\n六界\t162823\n海淀外国语实验学校\t162824\n学会感恩\t162825\n政情\t162826\n200题\t162827\nM7650DF\t162828\nchf\t162829\n银江股份\t162830\n旺仔小馒头\t162831\nLoki\t16283
2\n高慧\t162833\n兵锋\t162834\n中计\t162835\n天文科普网\t162836\n金鸡百花电影节\t162837\nKO\t162838\n根雕\t162839\nNot-Bad\t162840\n超过五分钟\t162841\n烟杆\t162842\nadditivity\t162843\n平川区\t162844\n全球新能源网\t162845\n1950\t162846\n文爱\t162847\n安德烈·波切利\t162848\n华科大\t162849\n佳能相机\t162850\nbarchart\t162851\n问有\t162852\n5455\t162853\n9.png\t162854\nAB小说网\t162855\n日坛中学\t162856\n限用\t162857\n近6年\t162858\nlimeiky\t162859\n黑皮书\t162860\nfet\t162861\nUniversalis\t162862\n29.5\t162863\n篡改\t162864\n发生器\t162865\n耐磨度\t162866\n习近平谈治国理政\t162867\n睿云\t162868\n大贵\t162869\n青豆客\t162870\n凭栏\t162871\n泪洒相思地\t162872\n慰安\t162873\nThreadLocal\t162874\n明教\t162875\n迷倒\t162876\n小翁\t162877\n谋情\t162878\n启明信息\t162879\n日历女孩\t162880\ndsquared2\t162881\nw80\t162882\n敏华控股\t162883\nperfume\t162884\n尊贤\t162885\n出题\t162886\n四散的尘埃吧\t162887\nlnt\t162888\n皂化\t162889\n粉粒\t162890\n英格索兰\t162891\n蓝男色\t162892\nSymphony\t162893\n中华人民共和国行政处罚法\t162894\n700000\t162895\n个方\t162896\n幻想乡\t162897\n多湖\t162898\n阿婉\t162899\n受宠若惊\t162900\n物贸\t162901\n法律意见书\t162902\nfalse\t162903\nzizg\t162904\n炖盅\t162905\n3000型\t162906\n烂\t162907\n石色\t162908\n十【\t162909\n缩阴\t162910\n购物村\t162911\nsn码\t162912\n孔雀王\t162913\n塞舌尔\t162914\n留名\t162915\nJeanne\t162916\n气层\t162917\n新阳\t162918\n蔡康霍华德\t162919\n驱动人生\t162920\n下拉单元格\t162921\n黑道圣徒4\t162922\nCFLAGS\t162923\n微播\t162924\n竖列\t162925\n1.4万元\t162926\n4.2.1\t162927\n女生活\t162928\n述责述廉\t162929\n施永青\t162930\n广州区\t162931\n铪\t162932\n在马克思墓前的讲话\t162933\n一辈\t162934\n卫鞅\t162935\n一千种\t162936\n蓝天城\t162937\n波源\t162938\n打入冷宫\t162939\n故园\t162940\n1800元\t162941\n好品\t162942\n孤生\t162943\n严宁\t162944\n瑞尔特\t162945\n汉能集团\t162946\n火花塞之家\t162947\n耐溶剂\t162948\n三江平原\t162949\n32年\t162950\n吉野家牛肉饭\t162951\n关键字\t162952\n聊城房产网\t162953\n神奇宝贝绿宝石\t162954\n蓝染惣右介\t162955\n吴汝俊\t162956\nby2\t162957\n传祺GS\t162958\n古雷\t162959\nScotch\t162960\n淫靡\t162961\n中国东方电气集团有限公司\t162962\n牛鬼蛇神\t162963\n次卧\t162964\n设计感\t162965\n云南省第二人民医院\t162966\nccn\t162967\n组数\t162968\nWAPI\t162969\n踏步\t162970\n窦神\t162971\ndirt\t162972\n肛门坠胀\t162973\n海峡论坛\t162974\n绿桃\t162975\n可以想象\t162976\nv1
.6\t162977\n徐州市中级人民法院\t162978\nGalaxy\t162979\n蓬街镇\t162980\n下犬式\t162981\nMaturity\t162982\nC1驾驶证\t162983\nTVB剧评网\t162984\n平定\t162985\n霉菌性阴道炎\t162986\n周生生\t162987\n阔太\t162988\n34个\t162989\n比亚迪秦论坛\t162990\n巨牛\t162991\n欣欣旅游网\t162992\n低速\t162993\n集士港镇\t162994\n蛋岛\t162995\n前头\t162996\n可接\t162997\nhttpd.conf\t162998\n第1波\t162999\n观澜\t163000\n蒙氏教育\t163001\n折火\t163002\n服帖\t163003\n褂\t163004\n孙温\t163005\n五年级英语\t163006\nBT种子\t163007\n王牌对王牌3\t163008\n七岁\t163009\n名不虚传\t163010\n合肥市财政局\t163011\n知识面\t163012\n迈克尔\t163013\nLanguages\t163014\n湖南省环境保护厅\t163015\nPS4pro\t163016\n排斥力\t163017\n丁伟\t163018\n山西省环保厅\t163019\n性虐\t163020\n小米手机锁屏\t163021\n房族\t163022\n製品\t163023\n脚袜\t163024\n龟头炎\t163025\nGoblin\t163026\n昌平二中\t163027\n罗大佑\t163028\n防城港市人民政府\t163029\npersent\t163030\n9211\t163031\n任宇翔\t163032\nstarter\t163033\n孙徐春\t163034\n槽道\t163035\n国家高速公路网\t163036\nstir\t163037\n打价机\t163038\n西二环\t163039\n孔道\t163040\n面面\t163041\nscort\t163042\nFranz\t163043\n龚雪\t163044\ncellphone\t163045\n华为m2\t163046\n小类\t163047\n罗豪才\t163048\n乐清网\t163049\n中国图学学会\t163050\n结巴\t163051\n彩盘\t163052\n新时代信托\t163053\n达雅\t163054\n玉米地\t163055\n加床\t163056\n檬\t163057\n民主制度\t163058\n6.68\t163059\nandroidStudio\t163060\n兽娘动物园\t163061\n甚平\t163062\n堡垒机\t163063\n非布索坦\t163064\n济南长清区\t163065\nSupernatural\t163066\n邮政集团\t163067\n睦州\t163068\n金邦\t163069\n钢头\t163070\n莆房网论坛\t163071\n宿迁市教育局\t163072\nBadge\t163073\ndomains\t163074\n郑航\t163075\n病菌\t163076\n3样\t163077\n沈阳孔雀城\t163078\n缓释肥\t163079\n绅宝\t163080\n地暖温控器\t163081\n为什么不对\t163082\n长治市政府\t163083\nRT3070\t163084\n宏强\t163085\n和府\t163086\n江湖奇侠传\t163087\n傲视天地\t163088\n鼠族\t163089\n孔塞\t163090\n招财进宝\t163091\n卢安克\t163092\nCAD病毒\t163093\nG3260\t163094\n120%\t163095\n英飞拓\t163096\n丫蛋\t163097\n一壶\t163098\n数独\t163099\naddRoutes\t163100\nLyon\t163101\n安慰\t163102\n美食库\t163103\nsensei\t163104\n孟轲\t163105\nitem2\t163106\n超声波热量表\t163107\n芳菲\t163108\n赛治\t163109\n袁亮\t163110\n流形\t163111\n农家小炒肉\t163112\n晨语\t163113\n陈柏霖\t163114\n李国杰\t163115\n豆油\t163116\n无有\t163117\n莫城\t163118\n反腐女\t163119\nie11浏览器\t163120\n
樱桃小丸子\t163121\n电枢\t163122\n轻纺城\t163123\n高柱\t163124\n明春\t163125\n酋雷姆\t163126\n百忙\t163127\n任刚\t163128\n50毫米\t163129\n2.85\t163130\n孙翔\t163131\nVisualizer\t163132\n第18045期\t163133\n学术版\t163134\n擎天柱\t163135\nspring-webmvc\t163136\n吸入\t163137\n汇景新城\t163138\n月觉\t163139\n一宠\t163140\n节假日\t163141\n外勤\t163142\nbench\t163143\n金水花园路\t163144\nlived\t163145\n敲开\t163146\n师陶杯\t163147\n3178\t163148\n中国舞蹈网\t163149\n封帖\t163150\n云南消防\t163151\nmpf\t163152\nImage,ImageDraw\t163153\nInspector\t163154\n上古世纪吧_\t163155\n山特ups\t163156\nperc\t163157\n诺伊尔\t163158\n高科大厦\t163159\nzibo\t163160\n南宁火车东站\t163161\n百条\t163162\n益海嘉里\t163163\nmorning\t163164\nWebRTC\t163165\n农名\t163166\n赛罕\t163167\n陈丽\t163168\n中华民国十年\t163169\n酮类\t163170\nEventLog\t163171\n艾乐\t163172\n制表位\t163173\n经受\t163174\natn\t163175\nminori\t163176\n明日\t163177\n隔膜计量泵\t163178\n宫腔\t163179\n负重轮\t163180\n古元\t163181\n旧爱\t163182\n蒙乃尔\t163183\nNucleus\t163184\n羊\t163185\noptimization\t163186\n维记\t163187\n抱胸\t163188\niPhone8Plus\t163189\n暧昧期\t163190\n公爵\t163191\n荷花简笔画\t163192\ntrd\t163193\nAccordion\t163194\n例化\t163195\n烟鬼\t163196\n法姐\t163197\n王莹\t163198\n500万元\t163199\ndoov\t163200\n知乎网\t163201\n荣耀7i\t163202\n修理部\t163203\n同和街\t163204\n云康\t163205\n河北建材职业技术学院\t163206\n165元\t163207\nREFERER\t163208\n迅雷铺\t163209\n第193集\t163210\nAlexander\t163211\n蜘蛛侠2\t163212\n日产逍客\t163213\n8月29日\t163214\n官运亨通\t163215\n网银在线\t163216\n1931年\t163217\n柯迪亚克论坛_汽车之家论坛\t163218\n紫狐\t163219\n6315\t163220\n见证人\t163221\n二季\t163222\nCALIS联机合作编目中心\t163223\n关原之战\t163224\nCesar\t163225\n255个\t163226\n综艺类\t163227\n暗影精灵II\t163228\n92名\t163229\n第一篇\t163230\n徐泽\t163231\n张大彪\t163232\n同治帝\t163233\n亿格瑞\t163234\n吊\t163235\nCSMAR\t163236\nVPS\t163237\n爱波网\t163238\n绿可木\t163239\n珈百璃\t163240\n5w1h\t163241\n南丰镇\t163242\n黄景\t163243\n无法无天\t163244\n筒瓦\t163245\n哈弗H6论坛_汽车之家论坛\t163246\n390\t163247\n╋\t163248\n蜜源\t163249\n经适\t163250\n金粮\t163251\n拉式\t163252\n双室\t163253\n回购\t163254\n梅里埃\t163255\npyltp\t163256\n341\t163257\n四氯化钛\t163258\nBH\t163259\n爆头率\t163260\n超过六个月\t163261\n礼品卡\t163262\n厦门人才网\t16
3263\n王浆\t163264\n中冶德贤公馆\t163265\n2.4.9\t163266\nf428\t163267\n而为\t163268\nh型\t163269\n马承祖\t163270\n照词\t163271\n蹄子\t163272\n硝酸钡\t163273\n后起之秀\t163274\n蒋百里\t163275\n天使投资人\t163276\n蛇油\t163277\n杨蓉\t163278\n招安\t163279\n青梅竹马\t163280\n上海长海医院\t163281\n秦虹\t163282\n潍坊广播电视台\t163283\npcp\t163284\n门胜者\t163285\ncms8x\t163286\n靡靡之音\t163287\n目数\t163288\nJSK\t163289\n前肌\t163290\n签定\t163291\n石灰稳定土\t163292\n能源与动力工程专业\t163293\n龙头镇\t163294\n发财\t163295\nconference\t163296\n330元\t163297\n共军\t163298\n易语言源码\t163299\nrotary\t163300\n梦幻西游剑侠客\t163301\n常见报\t163302\n高境镇\t163303\n长安夜雨\t163304\n学生卡\t163305\n明珠家园\t163306\nDogma\t163307\n甘肃省体育局\t163308\n油印机\t163309\nmp5\t163310\n何田田\t163311\n壁装\t163312\n江粉磁材\t163313\n马玉\t163314\nDIC\t163315\n彩琳\t163316\n厂貌\t163317\n沈毅\t163318\n碣石镇\t163319\n钢之炼金术师\t163320\nV0\t163321\n突聋\t163322\n春事\t163323\nfreesshd\t163324\n一一_\t163325\n文艺复兴运动\t163326\n夔\t163327\n礼品\t163328\n10摄氏度\t163329\n西宁市环境保护局\t163330\ndv\t163331\n打手\t163332\n乳罩\t163333\n定向就业\t163334\n龟箱\t163335\n期限表\t163336\n10首\t163337\n苏州医工所\t163338\n骁龙660处理器\t163339\n全写\t163340\n大连湾街道\t163341\n170CM\t163342\n志志雄\t163343\n二刀\t163344\n西端\t163345\n移动阅读\t163346\n经济研究\t163347\n神宇\t163348\n上集\t163349\n袁鹰\t163350\n乀\t163351\n手不释\t163352\n最长情\t163353\n伊丽莎白泰勒\t163354\n金蝶软件\t163355\n深海惊魂\t163356\n电缆头\t163357\n卡饭论\t163358\n超级火箭\t163359\n温州动物园\t163360\n苏式\t163361\n⑨\t163362\nSammi\t163363\nproe2001\t163364\n静脉曲张袜\t163365\n打码机\t163366\n大砖\t163367\n偷窥无罪\t163368\n瓯越\t163369\n糠疹\t163370\nenemy\t163371\nMT7\t163372\n口袋妖怪复刻mega\t163373\n兽妃\t163374\n2.2GB\t163375\n万达嘉华酒店\t163376\nandrio\t163377\n对外投资_投资公司\t163378\nmacys\t163379\n爱迪智\t163380\nSK海力士\t163381\n硅钙板\t163382\nwe吧_\t163383\n友谊大街\t163384\n梅露\t163385\n游戏攻略\t163386\n阿蓉\t163387\nword-wrap\t163388\n天津市人才服务中心\t163389\n2018年4月22号\t163390\n第一百个\t163391\n重洋\t163392\n首座\t163393\n180cm\t163394\n坯体\t163395\n同心度\t163396\nmassacre\t163397\n小心点\t163398\n髁\t163399\n远哭\t163400\nMSF\t163401\n有假\t163402\n6v\t163403\n木偶奇遇记\t163404\n大通路\t163405\n性息\t163406\nMIMO\t163407\n智文\t163408\n域值
\t163409\n口井\t163410\n北京省\t163411\n15IKB\t163412\n囧菌\t163413\n钢化粪池\t163414\noperator=\t163415\n混凝土结构设计规范\t163416\n接卸\t163417\n辛欣\t163418\n正脸\t163419\n气刹\t163420\n欲孽\t163421\nNex\t163422\n月湖雕塑公园\t163423\n张敬富\t163424\n禾禾\t163425\nClo-2\t163426\n肩上\t163427\n玉兰园\t163428\n大头像\t163429\n套管头\t163430\n柳海龙\t163431\n车轮战\t163432\n隐杀\t163433\n阿甲\t163434\n涣散\t163435\nInteresting\t163436\n4431\t163437\n刘全\t163438\n总社\t163439\n异响\t163440\n郭齐勇\t163441\nNBA2k18\t163442\n开拓\t163443\n防爆接线盒\t163444\nupdate2\t163445\n北科建集团\t163446\nphpqrcode\t163447\n人效\t163448\n大名鼎鼎\t163449\n基膜\t163450\n熊爸熊孩子\t163451\n朝天\t163452\n4部\t163453\n两联\t163454\n北市\t163455\n十八般\t163456\n锐龙Ryzen5\t163457\n国际医疗\t163458\n第13名\t163459\n防晒垫\t163460\n晚育\t163461\n玉师师\t163462\n脉冲阀\t163463\n亚莉莎\t163464\n4.46\t163465\narea\t163466\njavascrpit\t163467\n20150114\t163468\n贵州新闻网\t163469\n傲剑\t163470\ni贴吧\t163471\n城隍爷\t163472\n中特期期\t163473\nlunar\t163474\n四一\t163475\n酷狗音乐在线播放器\t163476\n银铃\t163477\n5656xxxx\t163478\n触发式\t163479\n杨虎城\t163480\n余杭塘路\t163481\n希望OL\t163482\n_天等县人民政府\t163483\n西西弗斯\t163484\nphonon\t163485\nmoms\t163486\nrpk\t163487\n发刊\t163488\n盟长\t163489\n织田信\t163490\nCV1\t163491\n企业信息化专业\t163492\n邱礼涛\t163493\nm26\t163494\n适口性\t163495\n马磊\t163496\n严寒地区\t163497\ncsq\t163498\n八年后\t163499\n妇炎\t163500\npvc塑料\t163501\n我的女友\t163502\n联兴\t163503\n踏青节\t163504\n起拱\t163505\n证明信\t163506\n沙孟海\t163507\nsumsung\t163508\n闵行中心医院\t163509\n印泥\t163510\n8x10\t163511\n晾晒架\t163512\n100年内\t163513\n中业\t163514\n毛主席故居\t163515\n从小\t163516\n占卜\t163517\n水井\t163518\n天涯明月刀神刀\t163519\n20160914\t163520\nQQ网\t163521\nifp\t163522\n20150115\t163523\n淋巴\t163524\nalvin\t163525\n第2次\t163526\ni686\t163527\n中际装备\t163528\n见者\t163529\n云阳网\t163530\n细目\t163531\n轻骑\t163532\n20161104\t163533\n氪石\t163534\n到家\t163535\nVIS\t163536\n专款\t163537\n德能勤绩\t163538\n机缝\t163539\nPostScript\t163540\n96版\t163541\n易语言数据库\t163542\n青春岁月\t163543\n鑫众\t163544\n情思\t163545\n小野大辅\t163546\nfairs\t163547\n异世界狂想曲\t163548\n条码扫描器\t163549\n翰香景庭\t163550\n金鸡独立\t163551\n咫尺\t163552\n17分\t163553\nxm
l\t163554\n抽象语法树\t163555\nSegment\t163556\n云erp\t163557\n调兵山\t163558\n嘉兴市人民政府\t163559\n引导型\t163560\n带环\t163561\n2701\t163562\n少林派\t163563\n分布式数据库\t163564\n蒋巷\t163565\n小野丽莎\t163566\n抗菌液\t163567\nLinkedHashMap\t163568\n河南航天金穗电子有限公司\t163569\n玩游戏网\t163570\n初\t163571\n2.5厘米\t163572\n福建省环境保护厅\t163573\n世界大学学术排名\t163574\n专接本考试\t163575\nNOMOS\t163576\n海曙区政府\t163577\n20150413\t163578\n三足金蟾\t163579\nproject2007\t163580\n殿军\t163581\n装有\t163582\n周永清\t163583\nMCR\t163584\nUgirls爱尤物\t163585\n头痒\t163586\nLiao\t163587\n流萤\t163588\nCache-Control\t163589\n领销\t163590\n茶水费\t163591\nhlhdidi\t163592\n人偶总动员\t163593\nonmeasure\t163594\n巨神\t163595\n代码证\t163596\n研招网\t163597\n旧衣\t163598\n2014—2015年\t163599\n达娃\t163600\n供应商管理_\t163601\nSTemWin\t163602\n酸菜鱼\t163603\nRobam\t163604\n傲基\t163605\n匆匆那年:好久不见\t163606\n两度\t163607\n减容\t163608\n西北军\t163609\n王彬彬\t163610\n蒙城路\t163611\nglog\t163612\n肖国祥\t163613\nKitchen\t163614\n启达教育网\t163615\nwasher\t163616\n豪爵铃木\t163617\n干叶\t163618\necharts3.0\t163619\n亮肤\t163620\nNetgear\t163621\niKON\t163622\n棠梨\t163623\neau\t163624\n维克森林大学\t163625\nMicroRNA\t163626\ninvolvement\t163627\n熔岩区\t163628\n孙峰\t163629\n无线路由_\t163630\n澳门外港\t163631\n第2行\t163632\n晶子\t163633\n美国狙击手\t163634\n不再孤单\t163635\nTien\t163636\n学分\t163637\nvita\t163638\n趣趣手游网\t163639\n三十七年\t163640\n铜山区人民政府\t163641\n学而思教育\t163642\nannex\t163643\n打印费\t163644\n0100\t163645\n通心粉\t163646\n峨嵋派\t163647\n百盘\t163648\nIndexError\t163649\n非我倾城\t163650\n6脚\t163651\n斯特林大学\t163652\n京基集团\t163653\nignorance\t163654\n册码版\t163655\n投股\t163656\nKempinski\t163657\n传动轮\t163658\n广西自然科学基金\t163659\n羊女\t163660\n矩阵法\t163661\n弗利萨\t163662\n技术版\t163663\n4724\t163664\n国家级开发区\t163665\n任摸\t163666\nunpaid\t163667\n光下\t163668\n礼物箱\t163669\nplanetside\t163670\n内所\t163671\n梦笔山人\t163672\nFertility\t163673\nwenzi\t163674\nTELNET\t163675\n土豆条\t163676\n刘珺儿\t163677\n鼎泰\t163678\nurlencoded\t163679\n猫妖传\t163680\npandas\t163681\nBabeLua\t163682\n长征突击手\t163683\nXutils\t163684\n首都国际机场\t163685\nAgo\t163686\n27只\t163687\n中华工控网\t163688\n森锐\t163689\nsaber酱\t
163690\n国际板\t163691\n仰望\t163692\n栎\t163693\n黄瓤\t163694\nGlue\t163695\ninformation\t163696\namor\t163697\n冒险岛星\t163698\n903\t163699\n国际租车/全球自驾/全球领先/\t163700\n旧唐书\t163701\n广佛肇\t163702\n乌兰图雅\t163703\n天美丽日记\t163704\n下载源\t163705\n512国际护士节\t163706\n王登初\t163707\n鸟王\t163708\n烂鳍\t163709\n冒险岛豹弩\t163710\n背靠背\t163711\n博图v14\t163712\n左端\t163713\n前牙\t163714\n江苏师范\t163715\nalice\t163716\n嵩山\t163717\ntsql\t163718\n康贝\t163719\nxxhdpi\t163720\n港城路\t163721\nch1\t163722\n验证机\t163723\nComplicated\t163724\n悠悠球\t163725\n361名\t163726\nPakistan\t163727\nAxure中文网\t163728\n张志君\t163729\n曲敏\t163730\n厚望\t163731\n罗汉钱\t163732\n平潭县\t163733\n泰瑞莎\t163734\nF#\t163735\n框梁\t163736\n绍兴柯桥区\t163737\n雷根斯堡\t163738\nspinlock\t163739\nBoyce\t163740\n小米2a\t163741\n三谷\t163742\n长藤鬼校\t163743\n小埋\t163744\nEDF\t163745\n吴辰君\t163746\nSHOWROOM\t163747\n8400\t163748\n五公开\t163749\n日语作文\t163750\n猪崽\t163751\n花园小学\t163752\n红豆粥\t163753\n王洁\t163754\n傉\t163755\n雅仕轩\t163756\n好红\t163757\n一厢情愿\t163758\n洞房\t163759\n华严经\t163760\nBT变态版\t163761\n多米尼加\t163762\n竖折\t163763\n惊曝\t163764\nROM\t163765\nobjec\t163766\n李瑞环\t163767\n家印\t163768\n杀戮间\t163769\n走过场\t163770\n华清飞扬\t163771\n诸城信息港\t163772\n鹤峰网_鹤峰县政府\t163773\n糯米藕\t163774\n三邦\t163775\n火影忍者之博人传\t163776\n出马仙\t163777\nWet\t163778\n强兵\t163779\n金刚钻\t163780\n同伦\t163781\n质态\t163782\n化合\t163783\nBackpack\t163784\n常闭防火门\t163785\n蓝章\t163786\n糖饼\t163787\n归案\t163788\nDBGridEh\t163789\nHKT48\t163790\noutcomes\t163791\n偶极矩\t163792\n福原爱\t163793\n杨坚\t163794\n未定义\t163795\n激电机\t163796\n王秋儿\t163797\nrecoverable\t163798\n曝光率\t163799\n7520\t163800\n艺彩\t163801\n转行\t163802\nunix网络编程\t163803\n曲建武\t163804\n食得\t163805\n5.52\t163806\n颂扬\t163807\nlibyuv\t163808\n178.com\t163809\nII类\t163810\n尼龙板\t163811\n李书\t163812\ndpkt\t163813\nIDOLiSH7\t163814\n下排式\t163815\n重要性_参考网\t163816\n遥感所\t163817\n高程\t163818\nfdfs\t163819\n裙房\t163820\n上古卷轴5:天际重制版\t163821\n杨凌示范区政府网\t163822\n杨戬\t163823\nJ1\t163824\nbeautiful\t163825\n童安琪\t163826\n重庆沙坪坝政府\t163827\n座标\t163828\n搜索$ichannel\t163829\n易建科技\t163830\n厅级\t163831\ndataset\t163832\n雷暴
日\t163833\n智云云鹤\t163834\n吴岱融\t163835\nLSC\t163836\ndoi\t163837\n姚振华\t163838\n红汞\t163839\nWong\t163840\n300431\t163841\ncoupled\t163842\n0555\t163843\n西安婚纱摄影\t163844\nt999\t163845\n阿罗汉\t163846\n樱桃节\t163847\n司ミコト\t163848\nWX\t163849\n华都网\t163850\n狂上加狂\t163851\n段落感\t163852\n淘宝退款率\t163853\n福田拓陆者\t163854\n廉颇蔺相如列传\t163855\n大幸\t163856\n被征收人\t163857\n卡布基诺\t163858\n欧拓\t163859\n1位\t163860\n呆妹儿\t163861\n溜槽\t163862\ncwRsync\t163863\n15帧\t163864\n基番\t163865\nBriggs\t163866\n索菲亚教堂\t163867\n杨山\t163868\n菓\t163869\n卷附\t163870\n峰瑞\t163871\n监狱风云\t163872\n空荡荡\t163873\nController类\t163874\nFlorida\t163875\n50000\t163876\n乾陵\t163877\n靶向性\t163878\n11集\t163879\n新川\t163880\n苏州电信\t163881\n傅盛黄轩\t163882\n数媒\t163883\n远藤\t163884\n718\t163885\n社\t163886\n一窗式\t163887\n隆中对\t163888\n金贵银业\t163889\n二十级\t163890\nUrl\t163891\n註冊處\t163892\n83家\t163893\n葱白水\t163894\n财园\t163895\n制氢\t163896\n成都市人民政府政务服务中心\t163897\n黄玉郎\t163898\n奥尔卡\t163899\n酷狗音乐\t163900\n窗器\t163901\n纯心\t163902\n党话\t163903\n紧带\t163904\n巨马\t163905\n陈学伟\t163906\n196\t163907\n彭林\t163908\n环氧漆\t163909\n潜龙勿用\t163910\n华灯\t163911\n数十万\t163912\nffhd\t163913\n财政部办公厅\t163914\n张益达\t163915\n江北街道\t163916\n录象\t163917\n阳逻开发区\t163918\nHandClap\t163919\noxy\t163920\n溢流阀\t163921\n我们的祖国\t163922\n梨花香\t163923\n第七家\t163924\n进补\t163925\nEPLANP8网\t163926\npathofex\t163927\n天孚通信\t163928\n金花\t163929\n每一次\t163930\n清真认证\t163931\n炸裂\t163932\n207路\t163933\n欲海\t163934\n四百万\t163935\n裤头\t163936\n鼎上\t163937\nconda\t163938\n中国纸金网\t163939\n遭遇\t163940\n动模\t163941\n牛b叉电影怡红院\t163942\n阿卡丽\t163943\npc端\t163944\n梦莎\t163945\n体育馆\t163946\n清洁器\t163947\n钱江经济开发区\t163948\n云顶天宫\t163949\n速购\t163950\nA型血\t163951\nChamp\t163952\n章丘\t163953\nzukz2pro\t163954\nVol\t163955\n艹猫\t163956\n谢谢谢\t163957\n99万\t163958\n天方夜谭\t163959\nz370a\t163960\n是是非非\t163961\n救女\t163962\nproportion\t163963\n二乙胺\t163964\n三百元\t163965\nemqttd\t163966\nFCA\t163967\nPackagist\t163968\n关东军\t163969\n无源音箱\t163970\nASOS\t163971\n金沙\t163972\n名木坊\t163973\n威宁自治县\t163974\n手涂\t163975\n效忠\t163976\n生石\t163977\n锦瑟香\t163978\n销售额\t163979\
n市质监局\t163980\n张国军\t163981\n算死\t163982\n合乐\t163983\n博陵\t163984\n快乐家园\t163985\n3万台\t163986\n星火教育\t163987\n平民级\t163988\n翘板\t163989\n均值滤波器\t163990\n临空港\t163991\n1井\t163992\n叶蓬\t163993\n精神分析理论\t163994\n百度云全集\t163995\n小西\t163996\n第174章\t163997\n九局\t163998\n办税员\t163999\n谢意\t164000\n阳泉市\t164001\nNha\t164002\n撒比\t164003\n专工\t164004\n妊娠线\t164005\n发货员\t164006\nRTS\t164007\n24200068\t164008\n三国战纪2群雄争霸\t164009\n7001\t164010\nhaoren\t164011\n广德人才网\t164012\n主基地\t164013\nrrdtool\t164014\n小赢理财\t164015\ncm12\t164016\n神番\t164017\nSourcing\t164018\n何济公\t164019\n郑州移动\t164020\n7003\t164021\nbkb\t164022\n泡椒鸡爪\t164023\n等等等等等等\t164024\n血脉偾张\t164025\n安沃驰\t164026\nphotoshopcc\t164027\nslots\t164028\n使命召唤9:黑色行动2\t164029\nbtl\t164030\n请生意经\t164031\n促排\t164032\n八轴\t164033\n婚装\t164034\n6443\t164035\n名侦探柯南零的执行人\t164036\n地塞米松\t164037\n内脂豆腐\t164038\n地安门\t164039\nm277dw\t164040\n广东省中医院珠海医院\t164041\n红杏\t164042\n一半一半\t164043\n再学\t164044\n丁壬\t164045\n苏打绿\t164046\n内射\t164047\n拜年\t164048\n应收账款质押\t164049\n红外测距传感器\t164050\npasha\t164051\n外线\t164052\n郴州西\t164053\n过敏反应\t164054\n阳新\t164055\n帕拉贡\t164056\nxiazia\t164057\n绝处\t164058\nMVCC\t164059\n键入\t164060\n郑合高铁\t164061\nWindows编程\t164062\n笑料\t164063\nxinshi\t164064\nHST\t164065\n四川旅游学院\t164066\naj31\t164067\n羿龙II\t164068\n献吻\t164069\nk=1\t164070\nDJI大疆\t164071\nUSB-HDD\t164072\n2金\t164073\n380m\t164074\nAware\t164075\n阴凉柜\t164076\n左栏\t164077\n辜莞允\t164078\nkaty\t164079\n塔巴斯\t164080\n失业救济金\t164081\n每周末\t164082\nPopupMenu\t164083\n混入\t164084\n颞颌关节紊乱\t164085\n地大\t164086\ntricks\t164087\n分期还款\t164088\n甄\t164089\n封爵\t164090\n赵欣\t164091\n滤棒\t164092\nBearing\t164093\nogame\t164094\n外交官\t164095\n运营商\t164096\n龙腾虎跃\t164097\n叛逃\t164098\n高脂血症\t164099\n补佳乐\t164100\nCDMA卡手机号段\t164101\n民族宗教事务局\t164102\n暗影\t164103\n晾衣杆\t164104\n生产科\t164105\n经穴\t164106\n校车\t164107\n欧麓花园城\t164108\n67%\t164109\n麦乐迪\t164110\n太阳能学报\t164111\n中国佛学院\t164112\n暂用\t164113\n临渊\t164114\n途睿欧_途睿欧论坛论坛\t164115\nPlanck\t164116\n搭桥\t164117\n刘敦桢\t164118\n京东闪付\t164119\n2018年4月30\t164120\n袁文才\t164121\n玛丽亚·凯莉\t1641
22\n售货员\t164123\n2018年04月05日\t164124\n任娇\t164125\n周建明\t164126\n市政务中心\t164127\n气体质量流量计\t164128\nwritelines\t164129\nbasket\t164130\n越南盾\t164131\n自助餐炉\t164132\nBootstrap4\t164133\n华海\t164134\n神偷奶爸3\t164135\n20160418\t164136\n甘肃省人民医院\t164137\n锡林郭勒大草原\t164138\n刃牙道\t164139\n海南省肿瘤医院\t164140\n佳能MG2580\t164141\n企业志\t164142\nZine\t164143\n辽河油田\t164144\n尼康d850\t164145\n57810933\t164146\n中山古镇\t164147\n周易生辰八字算命\t164148\n紫薇路\t164149\n苍之纪元齐格飞\t164150\n广播电视编导专业\t164151\n表面处\t164152\n抗渗混凝土\t164153\n户户通\t164154\n毕老师\t164155\n中教网\t164156\n通县\t164157\n降头师\t164158\n安塞县\t164159\n满员\t164160\n奥尔良\t164161\n水泥搅拌机\t164162\n有话\t164163\n150Mbps\t164164\n回收装置\t164165\nUWA\t164166\n水部\t164167\n元帅\t164168\n徕卡Q\t164169\n师幼\t164170\n钢炮\t164171\n红小豆\t164172\n边缘\t164173\nTRUNCATE\t164174\nLOL2018\t164175\n汤子\t164176\n弃子\t164177\n天爱\t164178\n纽约帝国大厦\t164179\n一败涂\t164180\n莱昂纳多迪卡普里奥\t164181\n大开眼界\t164182\n170个\t164183\n传祺\t164184\n业权\t164185\n谋发展\t164186\n诚实\t164187\n乱炖\t164188\n咋回事儿\t164189\n不用多说\t164190\n武汉城市圈\t164191\nCriterion\t164192\nWilde\t164193\n4115\t164194\n霍童\t164195\n狙击手:幽灵战士3\t164196\ndouble型\t164197\n深圳政府在线\t164198\n环球贸易网\t164199\n石悦安鑫\t164200\nminy\t164201\nhandlebars\t164202\n别怕\t164203\n230平米\t164204\nPOCKET\t164205\n扬帆远航\t164206\n老宋\t164207\n鹿乃\t164208\n陈述性\t164209\n底牌\t164210\n灵刃\t164211\n06级\t164212\n操作系统\t164213\n迭部县\t164214\n求字体网\t164215\n弑神\t164216\n韩礼德\t164217\n表_银行信息港\t164218\n监督者\t164219\n超级机器人大战\t164220\nElephant\t164221\n近墨者黑\t164222\n航空部\t164223\n科学研究院\t164224\n地钻\t164225\n总裁\t164226\n实发\t164227\nxlwt模块\t164228\n透视学\t164229\n中国水环境集团\t164230\n小米电视4\t164231\n地下城\t164232\npro9.7\t164233\n可走\t164234\n中线性\t164235\n1600亿\t164236\n宗卷\t164237\n正中\t164238\nbootrom\t164239\n捆扎带\t164240\ncnDBA\t164241\n仁和街道\t164242\n7czw.com\t164243\ndecentralized\t164244\n赏樱\t164245\nv6.8\t164246\n看天\t164247\n明兴\t164248\n优品\t164249\n限速\t164250\n两年以后\t164251\n库存\t164252\n书界\t164253\n云岗\t164254\n边防派出所\t164255\n陈晓丹\t164256\n姜gary\t164257\n永江\t164258\nsz260\t164259\n80平米\t164260\n欢宠\t164261\n斗门\t164262\n8
02.11AC\t164263\n76页\t164264\n极品飞车OL\t164265\nEstimate\t164266\n福建农村信用社\t164267\n烤包\t164268\n冰船\t164269\nScam\t164270\n昕-2008\t164271\n_户县\t164272\n三更_雨\t164273\nanimator\t164274\n至尊红颜\t164275\n异步\t164276\n长沙县政府\t164277\n脂多糖\t164278\n逾期率\t164279\n散状\t164280\n逐桩\t164281\n全利\t164282\n加奈美\t164283\n一年以前\t164284\n王德顺\t164285\n虎头山\t164286\n燃战\t164287\n普里皮亚季\t164288\nKAYANO\t164289\n炭河\t164290\nMP4/MKV\t164291\n格瓦拉\t164292\n蔡嘉\t164293\n玻璃瓶\t164294\n姐妹情深\t164295\n赵彤\t164296\nGogh\t164297\n李大夫\t164298\n和酒店\t164299\n龙战士4\t164300\n束缚\t164301\n法火\t164302\n70首\t164303\n低速电动车\t164304\n游说\t164305\n进纸器\t164306\n安卓7.1\t164307\n教育管理学\t164308\nMaxx\t164309\n火力少年王3\t164310\n盐酸左氧氟沙星片\t164311\n培训通\t164312\n可爱风\t164313\n篮球服\t164314\nDAC0832\t164315\nreceiving\t164316\nCMK\t164317\n徐雯\t164318\n省人事厅\t164319\n中宣部\t164320\n华都商城\t164321\n几十年代\t164322\n围棋宝典\t164323\nthinks\t164324\n呼兰河\t164325\n中锋\t164326\n腰麻\t164327\n调档线\t164328\n荣域\t164329\n隆基绿能科技股份有限公司\t164330\n闻声\t164331\nRitz\t164332\n第87章\t164333\n高新技术园区\t164334\nsaiku\t164335\n谢梓秋\t164336\n小米6x\t164337\n哈姆莱特\t164338\n艾力斯特\t164339\nr610\t164340\n庞培\t164341\n大仲村镇\t164342\n重置\t164343\nkidm\t164344\nno.2\t164345\n会心\t164346\n温凉\t164347\n2017世界女排大奖赛\t164348\n3208\t164349\n天桂山\t164350\nNEPCS\t164351\n疑云密布\t164352\n旁路\t164353\n凭着\t164354\nVux\t164355\n65级\t164356\n胜利街道\t164357\n6.5.6\t164358\n汤森\t164359\n大墩村\t164360\n邪光\t164361\n不经意\t164362\n水险\t164363\n监视\t164364\nExchange论坛\t164365\n样册\t164366\n小读者\t164367\n恋母情结\t164368\nufei\t164369\n韩帅\t164370\nR6300V2\t164371\n做官\t164372\n杰赛\t164373\noeasy\t164374\n不吵\t164375\nMeltwater\t164376\njsyun\t164377\n管座\t164378\nhp5200lx\t164379\n高中时\t164380\n颜面\t164381\n浙江自贸区\t164382\n房天下海外房产网\t164383\n羞射\t164384\nweico\t164385\n轰\t164386\n袁涛\t164387\n134号\t164388\n红螺\t164389\n脚爆\t164390\n德哥@Digoal\t164391\n底色\t164392\nbronze\t164393\n4.62\t164394\n万科城三期\t164395\nWolfe\t164396\n3dlc\t164397\n法杖\t164398\n四季沐歌太阳能\t164399\n如水\t164400\n骨柄\t164401\n秧歌\t164402\n金小妹\t164403\nLinux内核\t164404\n牛黄上清丸\t164405\n307路\t164406
\n达通\t164407\n超能陆战队\t164408\n享惠\t164409\n二硫化硒洗剂\t164410\n石家庄西\t164411\nmacd金叉\t164412\n万集科技\t164413\n魏\t164414\nnetworker\t164415\n命中率\t164416\n深受其害\t164417\n萨克斯版\t164418\nSugar\t164419\n赌侠之人定胜天\t164420\nCARTIER\t164421\n豪华\t164422\n恶魔\t164423\n第四轮\t164424\n报盘\t164425\nsusu\t164426\n传化\t164427\n尚未\t164428\n王晓霞\t164429\n陶慧敏\t164430\nhostname\t164431\n步步紧逼\t164432\n华山镇\t164433\n这世间\t164434\n32章\t164435\n拒签率\t164436\n钢构\t164437\nsp6\t164438\n国资厅\t164439\n山姆会员商店\t164440\n田姓\t164441\n增值税纳税\t164442\n喔唷\t164443\n蓝草\t164444\n广州医院\t164445\n覆手\t164446\n河池市人民政府\t164447\n2017年1月25日\t164448\n工人阶级\t164449\n揭短\t164450\n房天下网\t164451\n马拉多纳\t164452\n压应力\t164453\n纹状\t164454\n百度千寻\t164455\n何时\t164456\n麻痹大意\t164457\n闻檀\t164458\n八卦城\t164459\n比特犬\t164460\n病虫害_世纪农药网\t164461\nxgb\t164462\n卡宝宝网\t164463\n税收\t164464\nphotoswipe\t164465\n脉象\t164466\n桂山岛\t164467\n监察员\t164468\n铁木真\t164469\nNotes\t164470\n2018开年\t164471\n风耳\t164472\nDJANGO\t164473\n财布\t164474\n嚼\t164475\n湖北工程职业学院\t164476\n甜筒\t164477\n敏行\t164478\n西诺德\t164479\nidanmu\t164480\n南极磷虾\t164481\n德衡商法网\t164482\n中铁十一局\t164483\n不痛\t164484\n空指针\t164485\nina\t164486\n合拍\t164487\n教练\t164488\n审判员\t164489\n三国群英传7吧\t164490\n泉州站\t164491\n第28条\t164492\n福生活\t164493\n瞬间移动\t164494\nsein\t164495\nWyeth\t164496\n明辨\t164497\n加菲\t164498\n问子\t164499\n球乐乐\t164500\noppor15\t164501\n赵小飞\t164502\njavzoo\t164503\n第9个\t164504\n华视传媒\t164505\n防患\t164506\ntable\t164507\nPro5\t164508\nEaric\t164509\n特蕾莎梅\t164510\n00900\t164511\n平安好贷\t164512\n9月30号\t164513\n十一本\t164514\n1433\t164515\n截断\t164516\nreproductive\t164517\n2000公里\t164518\nb级车\t164519\n2003年\t164520\nボ\t164521\n琴岛\t164522\n亮油\t164523\n沉甸甸\t164524\n门源\t164525\n30天\t164526\ntrack-trace\t164527\n不思议迷宫\t164528\n哥斯达黎加\t164529\ndevcon\t164530\n肾炎康复片\t164531\n北京海洋馆\t164532\n男潮\t164533\n晴隆县\t164534\n铁路人\t164535\n胆艺轩音响技术论坛\t164536\nxyk\t164537\n1.76\t164538\n农险\t164539\n玄武门\t164540\nFunctions\t164541\n专利权人\t164542\nnrs\t164543\n广东海事局\t164544\n安泽\t164545\n厦门大学数学科学学院\t164546\n性文化节\t164547\n蛇夫座\t164548\nk60\t164549\n高新南
\t164550\nArray、Slice、Map\t164551\n爱国情\t164552\nconfd\t164553\n生物医\t164554\nrepresentatives\t164555\n2016年10月18日\t164556\n7474\t164557\nsflow\t164558\n水跃鱼\t164559\n悠宝\t164560\n氟米特\t164561\nPangolin\t164562\n光盘盒\t164563\n转体\t164564\n500马力\t164565\n抄板\t164566\n风帽\t164567\n无烟日\t164568\n笔者\t164569\n紫皮糖\t164570\n7月1\t164571\n黄蜂女\t164572\n吃奶率\t164573\n电池箱\t164574\ntogaf\t164575\n混迹\t164576\n下肢\t164577\nrecharge\t164578\n抵扣进项税额\t164579\n板料\t164580\nsepolicy\t164581\n开完\t164582\n一心不乱\t164583\n德州\t164584\n爱与恨\t164585\n落地页\t164586\n欧菲科技\t164587\n收藏帖\t164588\n重庆商报\t164589\n360免费WiFi\t164590\n大宅\t164591\ndevotion\t164592\n日照市政府\t164593\n中国林业网\t164594\nhng\t164595\n发售日\t164596\n特使团\t164597\n大印\t164598\n上海光机所\t164599\n参麦注射液\t164600\n第六感生死缘\t164601\nEPLAN\t164602\n明理\t164603\nvif\t164604\n毒妇\t164605\n子链\t164606\n北京协和医学院\t164607\n白首\t164608\n缓冲溶液\t164609\n强殖装甲凯普\t164610\n甩葱歌\t164611\n古惑仔\t164612\n大鱼号\t164613\n文本版\t164614\n4037\t164615\n可叹\t164616\n离线式\t164617\nOris\t164618\n中国铁路工程总公司\t164619\n单产\t164620\nDisabling\t164621\nPAAS\t164622\n微尺\t164623\n万州北站\t164624\nesxi6.5\t164625\nFGO\t164626\n心跳\t164627\n轰焦\t164628\n檔\t164629\n冷兄\t164630\n5.2.14\t164631\n在线音频转换\t164632\n芯子\t164633\n生理学\t164634\n勒住\t164635\n桑塔比亚迪唐\t164636\n三环新城\t164637\n前湾一路1号\t164638\nxiugai\t164639\n哨卡\t164640\n招商大厦\t164641\n金罐\t164642\n不可取\t164643\n小逃妻\t164644\nhtmlspecialchars\t164645\n淫辱\t164646\n岱宇\t164647\n中国青年出版社\t164648\n98.com\t164649\n雪松控股集团有限公司\t164650\n青训\t164651\n共工\t164652\n爱教网\t164653\n9000f\t164654\n泰康资产管理有限责任公司\t164655\n恒富\t164656\n滴滴代驾吧\t164657\n好大好\t164658\n广西小学\t164659\n令计划\t164660\n冒险岛恶魔猎手\t164661\n财务管理理论\t164662\n神宗\t164663\n中级经济师\t164664\n邢台经济开发区\t164665\n5.2.4\t164666\n360f4\t164667\n南博会\t164668\n赛欧道奇公羊\t164669\n越野跑\t164670\n渤海新区\t164671\n三甲级\t164672\n仙元\t164673\n北京古城\t164674\n三支\t164675\n临渭区\t164676\n重度子痫\t164677\n免进\t164678\nTHISISPAN\t164679\n书圣\t164680\n少平\t164681\n慢更\t164682\n1889\t164683\n普拉索纳塔\t164684\n寿麝香葡萄\t164685\nHayes\t164686\n桩径\t164687\nfindOne\t164688\n太湖新城\t164689\n砖\t164690\n
teal\t164691\n中外合资经营企业\t164692\n2017年8月17日\t164693\nopenid\t164694\n旭化\t164695\n电面\t164696\n约翰·施特劳斯\t164697\niwlist\t164698\n遮雨\t164699\n草圣\t164700\n广佛肇高速\t164701\n支流\t164702\n白细胞计数\t164703\n自主权\t164704\n十轮\t164705\n河成云\t164706\n重商\t164707\n董小姐\t164708\nindians\t164709\n石狮子\t164710\n玉柱\t164711\ndior吧\t164712\n11月12日\t164713\nbeasts\t164714\n中白工业园\t164715\n归处\t164716\nask2\t164717\nbundle\t164718\n53天\t164719\n5520\t164720\nS函数\t164721\n资产率\t164722\n偎\t164723\n大小单\t164724\n木洞\t164725\n萨诺狼蛛\t164726\n绎\t164727\n苗文华\t164728\n市貌\t164729\n无线路由器\t164730\n王闻\t164731\n119g网盘\t164732\n兽族\t164733\n战术板\t164734\n似火\t164735\n墨绿\t164736\n羊庄\t164737\nguogangj\t164738\n百分百\t164739\n广州地区\t164740\n中年级\t164741\n40万元\t164742\n市一医院\t164743\n东风股份\t164744\nT440s\t164745\n活套\t164746\n4.5分\t164747\n静香\t164748\n姚岚\t164749\n印第安纳\t164750\n一走了之\t164751\n套膜\t164752\n赵俊\t164753\nmatlab2017b\t164754\n廷\t164755\nsql2012\t164756\n水灵灵\t164757\n止于至善\t164758\n两个星期\t164759\n2110\t164760\nBFS\t164761\n听声辨位\t164762\n粲\t164763\n北洋夜行记\t164764\n巴黎花园\t164765\n黄晓薇\t164766\n鱼目\t164767\n能耗\t164768\nmisc\t164769\nCelery\t164770\n金盘网\t164771\n燕角\t164772\nx70a\t164773\n两线制\t164774\n希罗\t164775\nintegers\t164776\ncemu吧_\t164777\nh3lix\t164778\n大红酸枝\t164779\n60G\t164780\nSigned\t164781\n罪证\t164782\n迈乐\t164783\n%100\t164784\nPRODUCTION\t164785\n管家婆创业版\t164786\n彪炳\t164787\n中华园林网\t164788\n鸡仔\t164789\nsynchro\t164790\nExperiment\t164791\n摄影群\t164792\n幼年\t164793\n器物\t164794\n第12条\t164795\nETtoday\t164796\n不正之风\t164797\n路迪\t164798\nNO.6\t164799\n8@100\t164800\n先锋岗\t164801\nwhl文件\t164802\n二折\t164803\n无限制\t164804\njiava\t164805\nvarbinary\t164806\n选用\t164807\nBake\t164808\n清咽\t164809\n梅边\t164810\n也不够\t164811\n百想艺术大赏\t164812\n赤壁\t164813\n灌肠器\t164814\n信息管\t164815\n百联达\t164816\n部队\t164817\n信賴\t164818\nrequiem\t164819\n芽点\t164820\n万达影城\t164821\n洪玄公\t164822\n辽宁移动\t164823\n旋流沉砂池\t164824\n流行女歌手\t164825\n24起\t164826\n小恩爱\t164827\n水元\t164828\nDatepicker\t164829\n上学时\t164830\nstretching\t164831\nIPsec\t164832\n吴松\t164833\n士兵们\t164834\n3
04钢\t164835\n心云\t164836\n大刑\t164837\n咿呀\t164838\n还礼\t164839\n裸贷\t164840\n神医\t164841\n黑化\t164842\n西学\t164843\n九齿钉耙\t164844\nSkye\t164845\nPPT转换器\t164846\n花街\t164847\nCAD迷你\t164848\n龙脑\t164849\n2016超星尔雅\t164850\n卡夫亨氏\t164851\n享贷\t164852\n布尔代数\t164853\n喧闹\t164854\n淮西\t164855\nmoba\t164856\n魏武帝\t164857\n标题式\t164858\nbally\t164859\n壳牌石油\t164860\n产包\t164861\n抄送\t164862\n王小米\t164863\nbisu\t164864\n思者\t164865\n超碰在线视频\t164866\n春行\t164867\n蔵\t164868\n手游网\t164869\n佛祖\t164870\n香格里拉酒店集团\t164871\n姻亲\t164872\n气门芯\t164873\n共震\t164874\n科创板\t164875\n锈钢板\t164876\nLutens\t164877\n宁中\t164878\n一语文\t164879\n医考\t164880\n浮桥镇\t164881\n买彩\t164882\n狂野飙车7\t164883\nalternatives\t164884\nPicsArt\t164885\nNumeric\t164886\n我的经济适用男\t164887\n7点半\t164888\n吸振器\t164889\nmight\t164890\n恩智浦\t164891\n青年志\t164892\n二十三章\t164893\n石家庄市人民政府\t164894\n魅可\t164895\n黄花城\t164896\n上海骨科医院\t164897\nExecl\t164898\n5万亩\t164899\nbd\t164900\nguanxi\t164901\n巴霍巴\t164902\n职业技术学院\t164903\n深圳广电集团\t164904\n番茄味\t164905\nboxes\t164906\npb管\t164907\n招标咨询网\t164908\n两岸三地\t164909\n革命烈士纪念馆\t164910\n荆钗记\t164911\n孙子兵法\t164912\n羊安镇\t164913\n聚氨酯发泡机\t164914\n跳吧\t164915\nmontreal\t164916\n无觅处\t164917\n臆想症\t164918\nCPU处理器\t164919\n粉胶\t164920\n二级医院\t164921\n纯品\t164922\n弹药包\t164923\n团徽\t164924\n诡秘\t164925\n贝雕\t164926\n李东明\t164927\nhyde\t164928\n弯圆\t164929\n班马\t164930\n重庆第二国际机场\t164931\n驻马店新闻网\t164932\nICEM\t164933\n制导\t164934\n北腿\t164935\n十六号\t164936\n7RFX\t164937\n宁江\t164938\n青岛西海岸新区\t164939\nAnton\t164940\nv0\t164941\n受孕\t164942\nhuage\t164943\nRicky_Huang\t164944\n慎行\t164945\n羟苯甲酯\t164946\n破茧成\t164947\n金准\t164948\n老蒋\t164949\n民发\t164950\n烘干器\t164951\n曲周\t164952\n尚文\t164953\n浪潮英信\t164954\n绿萍\t164955\nf7\t164956\n朝潮\t164957\n模糊锐化\t164958\n路虎揽胜运动\t164959\n转眼\t164960\n黑椒\t164961\n三t\t164962\n旭升\t164963\n境\t164964\n一照\t164965\n听装\t164966\nguake\t164967\n习\t164968\n凤楼\t164969\n献策\t164970\niPad2\t164971\n150亿元\t164972\n大洋彼岸\t164973\n白月初\t164974\n一簇簇\t164975\ne70\t164976\n25.8\t164977\n闪银吧\t164978\n少年队\t164979\n康护\t164980\n秀白\t164981\n帝国时代论坛|帝国时代II高
清版\t164982\n60座\t164983\n丷\t164984\n正泰集团\t164985\n欧冶子\t164986\nDataGrid\t164987\n法兰蝶阀\t164988\n无法执行\t164989\n贷记凭证\t164990\n冕哥\t164991\n7.5kw\t164992\nabce\t164993\n911事件\t164994\n强风\t164995\n非领导\t164996\n40天\t164997\n云瑞\t164998\n大受\t164999\n科员\t165000\n24枚\t165001\n有学问\t165002\n高欢\t165003\nmsg\t165004\n88层\t165005\n挥挥手\t165006\n返佣网\t165007\n骡子\t165008\n有机\t165009\ndonhui\t165010\n百度百度\t165011\n贵阳东\t165012\n女歌\t165013\n庄市\t165014\n牛蛙君\t165015\nHoc\t165016\nCOMMON\t165017\n9场\t165018\nimperfect\t165019\n皮特芬\t165020\nn+3\t165021\n700套\t165022\n盾安环境\t165023\n足女\t165024\n定额\t165025\njena\t165026\n杜比全景声\t165027\n乳腺检查\t165028\nGame234\t165029\n王时敏\t165030\nSOGO\t165031\n耐威克\t165032\n海阳\t165033\n笑忘书\t165034\n航天信息股份有限公司\t165035\n中药药用价值网\t165036\n柠珞\t165037\n火奴\t165038\n纪事\t165039\nvienna\t165040\n黑泽明\t165041\n噬月者\t165042\n洁厕灵\t165043\n桃园眷村\t165044\n重庆文化艺术职业学院\t165045\nweb3j\t165046\n好萌\t165047\n_韩游网\t165048\n华厦\t165049\n长沙市第一医院\t165050\n林森\t165051\n花皙蔻\t165052\n0轴\t165053\n支撑线\t165054\n成神\t165055\nflickr\t165056\n鸡蛋壳\t165057\nBund\t165058\n自荐信\t165059\n上海农村商业银行\t165060\n洗胃\t165061\n寻人启示\t165062\n肠丸\t165063\n68798393\t165064\n比目鱼肌\t165065\n中国会计网zgkjw\t165066\n生活琐事\t165067\n化为\t165068\n衣原\t165069\ncfps\t165070\n南部新城\t165071\ncalamus\t165072\n智能互联\t165073\n提交者\t165074\n4-7月\t165075\n任振鹤\t165076\nmeanings\t165077\n20帧\t165078\n王者荣耀搞笑漫画\t165079\n安博维\t165080\n南阳油田\t165081\n知政\t165082\n连体\t165083\n亮出来\t165084\n袁洪\t165085\n纸杯蛋糕\t165086\n盈信\t165087\n绫野刚\t165088\nxeon\t165089\n国家外管局\t165090\n带生\t165091\n乖宝\t165092\n尘埃拉力赛\t165093\n银灵\t165094\n一七\t165095\n伟字\t165096\n向右转\t165097\n区农业局\t165098\n心酸路\t165099\n上海市国家税务局\t165100\n读透\t165101\n点塑\t165102\n右图\t165103\n豆腐汤\t165104\n一般式\t165105\n暗面\t165106\n乐芙兰\t165107\nSEGMENT\t165108\n战鼓\t165109\n横尾太郎\t165110\n静音版\t165111\nDEAL\t165112\n百度网盘PC\t165113\n乌兰巴托\t165114\n黄文豪\t165115\n证劵公司\t165116\n逆仙\t165117\noverhaul\t165118\n紫光檀\t165119\n抛射\t165120\n心酸史\t165121\n招标方\t165122\n一款\t165123\n雅乐之舞\t165124\n崇高\t165125\n灌阳县人民政府\t165126\n彈\t165127\ndaxing\t
165128\n350w\t165129\n西西体\t165130\ndifflib\t165131\n喜帖\t165132\n太和镇\t165133\nMSP430技术论坛\t165134\n12万元\t165135\n毕升\t165136\n软启动\t165137\n照度\t165138\ns7\t165139\n江华瑶族自治县\t165140\nntohl\t165141\n好句\t165142\nnostalrius\t165143\nCDL\t165144\n潜流\t165145\n6400元\t165146\n万象网管\t165147\n食蜂\t165148\n雅思\t165149\n触礁\t165150\n一击\t165151\n花牌坊\t165152\n冰狗\t165153\nATMEGA\t165154\n熊出没之环球大冒险\t165155\n纳思达\t165156\n隆鼻术\t165157\n留置权\t165158\n最喜\t165159\n阮成发\t165160\n洮南市\t165161\n专升\t165162\n大力女\t165163\n399.com\t165164\n囊\t165165\nedited\t165166\n60公斤\t165167\nLWN\t165168\ndlan\t165169\nTooltip\t165170\n麦芒6\t165171\n1200px\t165172\nisn\t165173\nvessel\t165174\n玩友\t165175\n痴人\t165176\n易学堂\t165177\n502胶水\t165178\n叮咚叮咚\t165179\nbearing\t165180\n这里的黎明静悄悄\t165181\n生产线\t165182\n双卧\t165183\n香袋\t165184\nMyCAT\t165185\nmarking\t165186\n商品房预售许可证\t165187\n十万块\t165188\n棒冰\t165189\nDispenser\t165190\n蜘蛛侠1\t165191\n主扇\t165192\nDDos\t165193\nYB-Park\t165194\n房天下产业网\t165195\n致命冲动\t165196\n在目\t165197\nEquivalent\t165198\n外援\t165199\n物件\t165200\n烫染\t165201\n李爱民\t165202\n阳光城翡丽\t165203\ntransplant\t165204\n反酸\t165205\nperhaps\t165206\n微信多开器\t165207\n6孔\t165208\n新引擎\t165209\n大悲\t165210\n奈亚\t165211\n逆市\t165212\n陆瓷\t165213\n小企\t165214\n桂花岗\t165215\nlibusb\t165216\n商南县人民政府\t165217\n休眠\t165218\n旅费\t165219\nmlb\t165220\n呈祥\t165221\n特检\t165222\n芦\t165223\n1077\t165224\nA963设计网\t165225\n鸟喙\t165226\n小米3吧\t165227\n苹果花\t165228\n升降车\t165229\n辰溪县\t165230\n筋肿\t165231\nCPA业\t165232\n竖排\t165233\nmicrosft\t165234\nMOTION\t165235\n南方财富网\t165236\n夜色生香\t165237\n安邦保险集团\t165238\n哈喇子\t165239\n80,90年代\t165240\n泡菜水\t165241\n1只\t165242\n鸣人\t165243\nzhipin\t165244\n神印诸天纪\t165245\n受限\t165246\n56年\t165247\n3860\t165248\n南夏墅\t165249\nX20Plus\t165250\n2020年起\t165251\n众说\t165252\n天心阁\t165253\nYee\t165254\n姑爷\t165255\nhappiest\t165256\n脱色剂\t165257\n一百克\t165258\n直投\t165259\n38所\t165260\n物流学\t165261\nf479\t165262\n法定假日\t165263\n合生创展集团\t165264\nbat批量\t165265\n老河\t165266\n平原镇\t165267\n泰捷WEBOX\t165268\n20160522\t165269\n选票\t165270\nmk4\t165271
\n椒友\t165272\ntechnical\t165273\n域链\t165274\n柔软\t165275\n火燎\t165276\n紫金庄园\t165277\n王庆国\t165278\n⊙\t165279\n铭影\t165280\n女皇帝\t165281\n希伯来文\t165282\n非法学\t165283\n海虞镇\t165284\n庹俊杰\t165285\nPALL\t165286\n彩妆师\t165287\nNBA\t165288\n抽芽\t165289\n接受现实\t165290\n六吨\t165291\n成都航空\t165292\n江淮论坛\t165293\n非智力因素\t165294\n武杭\t165295\n绝地求生交易\t165296\n铁齿\t165297\n杭叉\t165298\n连跪\t165299\n烈火如歌烈如歌\t165300\nYoga\t165301\n唱就唱\t165302\n114&\t165303\n刀剑封魔录之上古传说\t165304\n赚客\t165305\n舞娘\t165306\nRom\t165307\n凌陈亮\t165308\n东南卫视\t165309\n硫化锌\t165310\n大榕树\t165311\n临床表现\t165312\n总体国家安全观\t165313\n葫芦岛\t165314\nCLE\t165315\n宝胜科技创新股份有限公司\t165316\n吧挖网\t165317\n柯达伊\t165318\n猜人\t165319\n素履\t165320\n阿神\t165321\n神品\t165322\n索欢\t165323\n其民\t165324\n退休证\t165325\ncaller\t165326\n气象灾害预警信号\t165327\n1377\t165328\n大架号\t165329\n韩友谊\t165330\n市面上\t165331\nseeks\t165332\n彼年\t165333\n日产尼桑\t165334\n六年级数学练习册答案-小学六年级数学练习册\t165335\n第十年\t165336\nbadend\t165337\n运用\t165338\n给煤机\t165339\n火山小\t165340\n避光垫\t165341\n彩钢压瓦机\t165342\n灌胶\t165343\n骄阳似我\t165344\npackage\t165345\n201412\t165346\n公西华侍\t165347\n国势\t165348\n精魄\t165349\n通人\t165350\n三星i9100\t165351\n首期\t165352\nphyto\t165353\n桑植县\t165354\n2013年10月09日\t165355\n布兰妮·斯皮尔斯\t165356\n马丁靴\t165357\n迎新\t165358\n中铁第四勘察设计院集团有限公司\t165359\n欧空局\t165360\n过门\t165361\n介绍片\t165362\n3月8号\t165363\n攻略贴\t165364\n仙琚制药\t165365\ndies\t165366\n撒开\t165367\n太阳能光伏网\t165368\n咨询服\t165369\n德新交运\t165370\n28322485\t165371\n动态版\t165372\n从来没有\t165373\non_书面语\t165374\n赵鑫全\t165375\n建设银行手机银行\t165376\n华北路\t165377\n肯耐珂萨\t165378\nNaked\t165379\nudp\t165380\n酱汁\t165381\n七尖\t165382\n北京邮电大学出版社\t165383\n刻度尺\t165384\n鈴村\t165385\n吉林森工露水河\t165386\n源水\t165387\n肖村\t165388\n称号\t165389\n王光辉\t165390\n佳美\t165391\n第36关\t165392\n剑灵科鲁兹\t165393\n童子鸡\t165394\nmipmap\t165395\n速美\t165396\n花烛\t165397\nSecret\t165398\n事略\t165399\nmplab\t165400\n俄罗斯帝国\t165401\n大连保税区\t165402\n5200章节\t165403\nallure\t165404\nagricultural\t165405\n北京邮电大学网络教育学院\t165406\nCOD11\t165407\nfraction\t165408\n金刚葫芦娃\t165409\n希诺\t165410\n警员\t165411\n下包\t165412\n埃及艳后\t1
65413\n三门峡市湖滨区\t165414\napology\t165415\n靖江房产网\t165416\nhurt\t165417\n山势\t165418\n100亿个\t165419\nhayday\t165420\n雨魔\t165421\nqmessagebox\t165422\nTreaty\t165423\n副省长\t165424\n对对碰\t165425\n杜亚\t165426\n黄蝶\t165427\nSooPAT\t165428\n17133.73\t165429\n八月初\t165430\n云之遥\t165431\n拿\t165432\n侠盗飞车3\t165433\n中级证\t165434\n割蛋\t165435\n低烧\t165436\n娘俩\t165437\n傲视\t165438\n梧桐街道\t165439\n重构\t165440\n拳皇98吧\t165441\n花小钱\t165442\nair12\t165443\nSonia\t165444\n化霜\t165445\n市场化\t165446\nnzxt\t165447\n市场分析\t165448\n银政\t165449\n春川\t165450\n车头灯\t165451\n晨曦\t165452\n清水乡\t165453\n袁娅维\t165454\n导学案.doc\t165455\n韩栋\t165456\n普旭\t165457\n外套\t165458\n苹果应用商店\t165459\n薄荷\t165460\n扬州市政府\t165461\n边框\t165462\ndategrid\t165463\n南希\t165464\n大提顿\t165465\n雷克萨斯ES\t165466\n吸水石\t165467\n云狗\t165468\nog2\t165469\nAmplifiers\t165470\n深圳文博会\t165471\n萧策\t165472\n山东省科技馆\t165473\n达曼\t165474\n九寨沟景区\t165475\n郑州中学\t165476\nmonsters\t165477\n企秀_播视网\t165478\nIntelliJIDEA\t165479\n少年凯\t165480\n捷达车\t165481\n106条\t165482\n是枝裕\t165483\n宁德职业技术学院\t165484\n大海航行\t165485\n二手网\t165486\n百度大厦\t165487\n金广国\t165488\n两违\t165489\n海淀剧院\t165490\n赵锐\t165491\n葬魂笔记\t165492\n深寒\t165493\n假性\t165494\n一年间\t165495\n料位开关\t165496\n手机无线网\t165497\npc6游戏网\t165498\npywinauto\t165499\n正轨\t165500\n墙饰\t165501\n邻苯二胺\t165502\n客家房产网\t165503\n接包\t165504\npplive\t165505\nD850\t165506\n交通史\t165507\n乐村\t165508\n安嘉\t165509\n荷东\t165510\n雅顿粉胶\t165511\n土猪\t165512\n尤克\t165513\n免疫病\t165514\n和讯银行\t165515\n95598\t165516\n出发地\t165517\n长沙消防\t165518\n一屋不扫何以扫天下\t165519\n芝加哥公牛\t165520\n启德教育集团\t165521\n南京卫生学校\t165522\n清廉随州网\t165523\n三个多月\t165524\n二项\t165525\nJTGF\t165526\n李赫宰\t165527\n3636\t165528\n莱拉\t165529\n登岳阳楼\t165530\n威斯敏斯特教堂\t165531\n以身作则\t165532\n奈特科尔\t165533\n长湖路\t165534\n中国电子商务研究中心\t165535\n策略性\t165536\n导游证\t165537\n老大\t165538\nwenzhang\t165539\n美家美户\t165540\n输变\t165541\n韩辰\t165542\n恒力集团\t165543\n广发易淘金\t165544\n福星高照猪八戒\t165545\n实用类\t165546\n山岗\t165547\n召唤券\t165548\n林洙\t165549\n雾都\t165550\n80路\t165551\n农用地\t165552\nFreeXX性\t165553\n60公里\t165554\n木垛\t165555\n蔡明\t165556\n
安丘市\t165557\n0.4秒\t165558\n你曾是少年\t165559\n撸一撸\t165560\nsigaction\t165561\nhats\t165562\nHP1007\t165563\n真空吸盘\t165564\nkangta\t165565\n低频变压器\t165566\n车辆损失险\t165567\n王传\t165568\n宠物女孩\t165569\n小娇\t165570\nSNL\t165571\n信息科\t165572\n插秧\t165573\n精钢\t165574\n皇妃\t165575\n18037\t165576\n肆\t165577\n人大附中\t165578\n0401\t165579\nMWS\t165580\n成忠\t165581\n旋片式\t165582\n集装箱半挂车\t165583\n受让方\t165584\n关牧村\t165585\n6克\t165586\n考拉海\t165587\nfeminism\t165588\n黔城镇\t165589\n全熔\t165590\na43s\t165591\n中国旅游新闻网\t165592\n短切\t165593\nfares\t165594\nYangzhou\t165595\n暖色\t165596\nthttpd\t165597\n新石器\t165598\n第一顺位\t165599\n铁精粉\t165600\n电子烟\t165601\n斯莱特林\t165602\n组态\t165603\n哀而不伤\t165604\n新日恒力\t165605\n北京出入境检验检疫局\t165606\n退行性病变\t165607\n0.3m\t165608\n交通工具\t165609\nprecede\t165610\n巡检表\t165611\n17.2\t165612\n萍聚\t165613\n肺不张\t165614\n圣娼\t165615\n海恩\t165616\nsubstituted\t165617\n季刊\t165618\n蓝芩口服液\t165619\n押中\t165620\n打定\t165621\n去除率\t165622\n石瓢壶\t165623\n紫郡府\t165624\n忍\t165625\n坤灵丸\t165626\n信贷政策\t165627\n香龙\t165628\n66个\t165629\n克罗地亚\t165630\nemphasize\t165631\n忘本\t165632\n911路\t165633\n雷蒙威\t165634\npgo\t165635\n上海市黄浦区人民法院\t165636\n智族GQ\t165637\n类星体\t165638\n二班\t165639\n唐磊\t165640\n海藻酸钠\t165641\n爱情重跑\t165642\nparamter\t165643\n垄\t165644\n腾讯视频电脑版\t165645\n伊利亚特\t165646\n罗密欧与朱丽叶\t165647\n奥本大学\t165648\n旧曲\t165649\n闪电十一人go\t165650\n皮肤科\t165651\n龙阳\t165652\n并肩\t165653\n赫连勃勃\t165654\n三全\t165655\n广利\t165656\nxp1024\t165657\n王VRAINS\t165658\n私募管理人\t165659\n信息处\t165660\n雪\t165661\n倾力\t165662\ninthe\t165663\n仙河\t165664\n豪车\t165665\nToefl\t165666\n一树桃花开\t165667\n750d\t165668\n上海体检\t165669\nNBA常规赛\t165670\nYen\t165671\nv2.3.5\t165672\n220吨\t165673\n乌达\t165674\n宜兴竹海\t165675\n昆石\t165676\nSimditor\t165677\n2018年1月3日\t165678\n北京奥林匹克公园\t165679\n文办\t165680\n本师\t165681\n天目湖御水温泉\t165682\n老有所\t165683\n美丽说\t165684\n茅房\t165685\n加林\t165686\n八女投江\t165687\n临潼区人民政府\t165688\n淋浴间\t165689\n东三环北路\t165690\n江宁路街道\t165691\n美国民主党\t165692\n8718\t165693\nTextBlock\t165694\n宁波地税\t165695\n照明网\t165696\n机电工程管理与实务\t165697\n缩点\t165698\nCEB\t16569
9\nHenri\t165700\n3日服\t165701\n李建红\t165702\n岳华\t165703\n国务院新闻办\t165704\n除权除息\t165705\nAVENUE\t165706\n生物炭\t165707\n一路走好\t165708\n闯劲\t165709\n13支\t165710\n游侠客\t165711\n西冲\t165712\ngakki\t165713\n南京市财政局\t165714\nLM2576\t165715\n18038\t165716\nThings\t165717\nT460P\t165718\n超声波传感器\t165719\n王俊拽\t165720\n红帽linux\t165721\n溜肩\t165722\n阿尔法狗\t165723\n进路\t165724\n即墨在线\t165725\n戴恩\t165726\n描摹\t165727\nlathe\t165728\n丝袜片\t165729\n验房\t165730\n黑靴\t165731\n童蕾\t165732\n江苏电信\t165733\n秦勇\t165734\n刘莹\t165735\nkaggle\t165736\n义亭镇\t165737\n导电\t165738\n美济礁\t165739\n京东商城_京东消费者帮助中心\t165740\nluogu\t165741\npcm\t165742\n口胡\t165743\n厦门大学人文学院\t165744\n0744\t165745\n田捷\t165746\n腊梅\t165747\n38号车评中心\t165748\n王大珩\t165749\n4000平方米\t165750\n金钟国\t165751\nJAVA编程语言\t165752\n小米手柄\t165753\n20180406\t165754\n专县\t165755\n弹力袜\t165756\n美运\t165757\n佳力图\t165758\n光泽感\t165759\n办件\t165760\n控股公司\t165761\n啊恩\t165762\n炫龙炎魔t1\t165763\n159代餐\t165764\n39\t165765\nv1803\t165766\n八栋\t165767\nEditors\t165768\ndpo\t165769\n赛睿寒冰7\t165770\n福剑\t165771\n抛补\t165772\n红山村\t165773\ndeu\t165774\n惠州市中心人民医院\t165775\n句易网\t165776\nbeats3\t165777\nSEO优化\t165778\n水翼\t165779\n重命\t165780\n杨先森\t165781\n洪文旭\t165782\n石景\t165783\n供氧\t165784\n交媾\t165785\n0.15元\t165786\n天天看看\t165787\n五文\t165788\n美国共和党\t165789\nP3\t165790\nsobre\t165791\n2月25日\t165792\nADIDAS\t165793\n中里\t165794\n酷学网\t165795\n六扇门吧\t165796\n宁晋\t165797\n复合面料\t165798\nBazel\t165799\n报告老板\t165800\n废案\t165801\n蝴蝶湾\t165802\n59分\t165803\n桥机\t165804\ndoremi\t165805\n缺陷\t165806\n锄大地\t165807\n万方\t165808\n滨海国际机场\t165809\n2016财年\t165810\n电缆盘\t165811\n曝气量\t165812\n租费\t165813\n名称管理器\t165814\n阴阳\t165815\nkj\t165816\n5388\t165817\n远郊区\t165818\n孙立人\t165819\n钻营\t165820\n瘟神\t165821\n黑虎泉\t165822\n脉宝\t165823\n2969\t165824\n齐民要术\t165825\nventricular\t165826\n31张\t165827\n星界边境\t165828\ntssd2015\t165829\n汽车时刻表\t165830\n第198集\t165831\n提刑\t165832\n黟县\t165833\n西安交通大学第二附属医院\t165834\n大兴区\t165835\n分任\t165836\n河北省国家税务局\t165837\n销账\t165838\n李江\t165839\nvipkids\t165840\n韩光\t165841\nCCR\t165842\n拉拉队\t165843\n仁寿黑龙
滩\t165844\nODOO\t165845\n陈江街道\t165846\n解放军302医院\t165847\n从一部\t165848\n让子弹飞\t165849\n黄建华\t165850\nonenote2016\t165851\n3.48\t165852\n劫走\t165853\nEducator\t165854\n22次\t165855\n喷塑\t165856\ne^2\t165857\nCICI\t165858\n仓山万达广场\t165859\n宁波大红鹰学院\t165860\n金子美玲\t165861\n宾得吧\t165862\nPatches\t165863\n两类\t165864\n目标管理法\t165865\n行动方\t165866\n2ss\t165867\n增幅\t165868\nlys\t165869\n风铃草\t165870\nSTRIX\t165871\n张家界市政府\t165872\n明阳智慧能源集团股份公司\t165873\n岳阳市政府\t165874\n峰度\t165875\n朝朝\t165876\nmoyu\t165877\n上海上美化妆品有限公司\t165878\n前边\t165879\n冯天奇\t165880\n鞋履\t165881\n赵碧嘉\t165882\n厦门新闻网\t165883\n查詢電話號碼\t165884\ncctv10\t165885\n三严\t165886\n张文彤\t165887\n衡力\t165888\n6公分\t165889\n8007000\t165890\n北京温都水城\t165891\n法布尔昆虫记\t165892\nwin7启动项\t165893\n天魔王\t165894\n琴枫雅轩\t165895\n盘龙江\t165896\n炙甘草\t165897\n59度\t165898\n小嶝岛\t165899\n扎赉诺尔区\t165900\n王永贵\t165901\n电子梦\t165902\n美玉\t165903\n山龙\t165904\n文靖路\t165905\n细化器\t165906\npp助手\t165907\n武汉图书馆\t165908\n护踝\t165909\n控卫\t165910\n气动元件\t165911\n张冬云\t165912\n湘里\t165913\n三相分离器\t165914\n一微米\t165915\nvegas\t165916\n郑媛\t165917\n微信银行\t165918\n马普尔小姐探案\t165919\niconv函数\t165920\n花园镇\t165921\nqq群发器\t165922\n门族\t165923\n何海波\t165924\n二星级\t165925\n缘起\t165926\n剧集\t165927\n杨南路\t165928\n水率\t165929\n龙舟文化节\t165930\nReject\t165931\nplast\t165932\n这份情\t165933\n中国邮政银行\t165934\n筑龙岩\t165935\n南海网新闻中心\t165936\n彼岸岛\t165937\natg\t165938\n24平方\t165939\natexit\t165940\n弄巧成拙\t165941\n邻避效应\t165942\n中国电力新闻网\t165943\n西子\t165944\n市商\t165945\n开敞式\t165946\n速算盒子\t165947\n_座\t165948\n王燕\t165949\n金沙遗址\t165950\n多页\t165951\nwin10word\t165952\n北山公园\t165953\n番号大全_女优图\t165954\n翠华路\t165955\n本镇\t165956\nSolar\t165957\ndosth\t165958\napache.commons\t165959\n莱城\t165960\n客村\t165961\nnz\t165962\n流放之路3.2游侠锐\t165963\n海绵垫\t165964\n城市学校网\t165965\nfreepbx\t165966\npro5\t165967\n长春公交查询网\t165968\n诚信防骗居\t165969\ntextarea文本框\t165970\nquickstart\t165971\n中国电气\t165972\n缘于\t165973\n牙语\t165974\n55小说网\t165975\n天猫超市\t165976\n福州市教育局\t165977\n礼号\t165978\n杨瑞代\t165979\n三菱帕杰罗V97\t165980\n沪江在线\t165981\n011\t165982\n十六浦\t165983\n中华人民共和国
社会保险法\t165984\n亚拓\t165985\n吖吖云\t165986\nRollback\t165987\nInfraWorks\t165988\n香港半岛酒店\t165989\n安克\t165990\n处罚款\t165991\n氧健身操\t165992\n火冒\t165993\n认知语言学\t165994\n同学家\t165995\n钱塘小学\t165996\n成都幼儿师范学校\t165997\n出口商\t165998\n安居工程\t165999\n芝罘区\t166000\n辱骂\t166001\n嫐\t166002\n赵祯\t166003\n马猴\t166004\n卡密尔\t166005\nFegin\t166006\n源生\t166007\n5只\t166008\n88个\t166009\naccidental\t166010\nrework\t166011\n一夜间\t166012\n复眼\t166013\n盘感\t166014\n主馆\t166015\n转基因大豆油\t166016\n阳包\t166017\nCuCl2\t166018\n新疆人才网\t166019\n过控\t166020\n大同区\t166021\n环绕音\t166022\n总攻文\t166023\n深圳队\t166024\n痛处\t166025\n红斑狼疮\t166026\n主持会议\t166027\n兔老霸夏\t166028\n咸宁新闻网\t166029\n京瓷kyocera\t166030\nTree\t166031\n裤\t166032\n巨彩\t166033\n转寄\t166034\n乘虚而入\t166035\n银网\t166036\nwelded\t166037\n外孙\t166038\n高晓东\t166039\n凝固点\t166040\n郑丽媛\t166041\n陇西\t166042\n火山爆发\t166043\n美娜\t166044\nWheel\t166045\n满速\t166046\n2.3.5\t166047\n溫泉\t166048\n孙晓杰\t166049\n白琉璃\t166050\nltv\t166051\n350平米\t166052\n德械师\t166053\n我爱发明\t166054\nxmapp\t166055\n要闻_华商报\t166056\nFractional\t166057\n舰桥\t166058\n第31条\t166059\n走进新时代\t166060\n灰度直方图\t166061\n长生诀\t166062\nmt7628\t166063\n伦理\t166064\nheyzo\t166065\nLabel\t166066\n绑制\t166067\nfasting\t166068\nacce\t166069\n夬卦\t166070\n车站\t166071\n下午茶\t166072\n150套\t166073\n普条\t166074\nemitting\t166075\n车险理赔\t166076\nmada\t166077\n秦兵\t166078\n落雷符\t166079\n中国会计网\t166080\nkyocera\t166081\n马坡镇\t166082\nTuzi\t166083\n法属圭亚那\t166084\n圣丹斯电影节\t166085\n爪洼岛\t166086\n管理学家\t166087\n雄踞\t166088\n格华止\t166089\nshim\t166090\n德雷塞尔大学\t166091\nfoo\t166092\n松峰山\t166093\n扑克\t166094\n支部\t166095\n浦口经济开发区\t166096\n特种兵王\t166097\noppoR9\t166098\nWrapper\t166099\njitao\t166100\nTrees\t166101\n微鲤\t166102\n草剃京\t166103\n绝品邪少\t166104\nSutra\t166105\n打分\t166106\nInstances\t166107\n法律类\t166108\n末梢神经炎\t166109\n黑麦\t166110\nsyscall\t166111\n脚本函数\t166112\n南海一号\t166113\n克罗库恩\t166114\n零距离\t166115\n万祥\t166116\nFView\t166117\nDataGridView\t166118\n方茴\t166119\nHP5200\t166120\n读片\t166121\n天真烂漫\t166122\nWEGAME\t166123\n冠幅\t166124\n广告曲\t166125\n衣帽\t166126\n洗漱\t16612
7\n问案\t166128\n6070年代\t166129\n刀手\t166130\n第158集\t166131\n遂宁\t166132\n重庆农商银行\t166133\n白玉川\t166134\n睡篮\t166135\n陈美凤\t166136\n齐鲁银行\t166137\n临街\t166138\n革命性\t166139\n机关枪\t166140\n贩罪\t166141\n19倍\t166142\n不算\t166143\n屠户\t166144\n泰国狮航\t166145\n质押物\t166146\n武汉大学生命科学学院\t166147\n火电机组\t166148\nBordeaux\t166149\n健康元\t166150\n一错再错\t166151\n9h\t166152\n鹏业\t166153\n姣好\t166154\n支委会\t166155\n虐囚\t166156\n磨矿\t166157\n飞致\t166158\nFellowship\t166159\n华住酒店集团\t166160\n装进\t166161\n1580\t166162\n大连旅行社\t166163\nbacktrack5\t166164\n豫剧板胡\t166165\nORR\t166166\ngale\t166167\n上海交通职业技术学院\t166168\n斗兽棋\t166169\n交通运输部公路科学研究院\t166170\n大叶\t166171\n怠\t166172\n自恢复保险丝\t166173\n高潮液\t166174\n果敢\t166175\n争议\t166176\n音频处理器\t166177\n马鞍形\t166178\n勇者斗恶龙4\t166179\n钢鉄\t166180\n崇川区\t166181\n内陆港\t166182\n韩素音\t166183\n造血\t166184\n浓缩型\t166185\n蛭石\t166186\n米花糖\t166187\n乐行\t166188\n偏振\t166189\nU4\t166190\n曹丕\t166191\n管脚\t166192\n鹿亭乡\t166193\n安心亚\t166194\n庄则栋\t166195\n王金金\t166196\n宫颈病变\t166197\n周思源\t166198\n炙心\t166199\nBilly\t166200\nPWA\t166201\n盐味\t166202\n销货单\t166203\n电阻片\t166204\n贵价\t166205\n寿王\t166206\n俄罗斯人\t166207\nCieloSun\t166208\n传真机\t166209\n京新高速\t166210\n龙虎\t166211\n公益西桥\t166212\n夹层\t166213\n古筝谱\t166214\n梁宏毕雯珺\t166215\n同步\t166216\nEcuador\t166217\ncarson\t166218\n火影忍者凯\t166219\nRahman\t166220\n素萨\t166221\n休闲化\t166222\n凹\t166223\n马安\t166224\n封装\t166225\n刘迎春\t166226\n黄铜管\t166227\neval\t166228\n张小东\t166229\n甘肃省妇幼保健院\t166230\n西铁城光动能\t166231\n张德明\t166232\n仓木\t166233\n列如\t166234\n排头\t166235\n祖鲁战争\t166236\n过零丁洋\t166237\nholders\t166238\n深康佳\t166239\n事业类\t166240\n田流\t166241\n桥梓镇\t166242\n201601\t166243\nQNX\t166244\n九分钟\t166245\nlesmills\t166246\n旧币\t166247\n花呗账单\t166248\nCoast\t166249\n肚腩\t166250\n集佳\t166251\n桂北\t166252\n填充料\t166253\n柳北风儿\t166254\nzooskoolknotty\t166255\n华远地产\t166256\nvvc\t166257\n750x1334\t166258\n头次\t166259\n金泽古镇\t166260\ngroupbox\t166261\n赫兹伯格\t166262\n海洋王\t166263\n易宪容\t166264\nMoved\t166265\nXhtml\t166266\n1.84\t166267\n深耕\t166268\n卡姆\t166269\n奥美拉唑\t166270\n百变小樱\t166271\n省委政研室\t166272\n1把\t1662
73\nInvitation\t166274\n榜头\t166275\n李亮\t166276\n联想VIBE\t166277\n微保\t166278\nupc\t166279\n上万块\t166280\n单丝\t166281\n血手幽灵\t166282\n醉翁亭记\t166283\n凤鸣山\t166284\n西安兵马俑\t166285\n乖戾\t166286\n相长\t166287\n黄土风\t166288\nlunwen\t166289\n魏蜀吴\t166290\n接吻\t166291\n末日\t166292\n牧濑红莉\t166293\n阿迪耐克\t166294\n两个人\t166295\n奶豆\t166296\n宁政\t166297\n汝阳\t166298\n区政法委\t166299\n记录集\t166300\n30本\t166301\nBode\t166302\n上海市红房子妇产科医院\t166303\nrestarting\t166304\nObjectARX\t166305\n夭\t166306\nPC+\t166307\ndocx\t166308\n郑晓燕\t166309\n上海华信证券\t166310\ndividends\t166311\n2000元\t166312\n变幅\t166313\ncctv2\t166314\nzmc\t166315\n2009年10月\t166316\n电容器\t166317\nreached\t166318\n中民\t166319\n7小时\t166320\n黑杨\t166321\n环球卓越考博网\t166322\n彻夜难眠\t166323\n驾管\t166324\n两盏\t166325\n正剧\t166326\n百度站长工具\t166327\nLEAF\t166328\n刘谋\t166329\nword2010版\t166330\nqs世界大学排名\t166331\n血鹦鹉鱼\t166332\n尽数\t166333\n阿勒泰\t166334\n九阴传\t166335\n沧州明珠\t166336\nGraduate\t166337\n校方责任险\t166338\n刘程\t166339\n惊声\t166340\ndedecm\t166341\njeunesse\t166342\nVDD\t166343\n首_\t166344\n瘦腹\t166345\n0xffffffff\t166346\n三鞭酒\t166347\nlixian\t166348\n广告会\t166349\n京逸\t166350\n八戒八戒\t166351\n8.0版\t166352\n_息县\t166353\n诲人不倦\t166354\nzaker\t166355\n760k\t166356\n李中华\t166357\n邪恶天使\t166358\nq8\t166359\n7盘\t166360\nautocad\t166361\n荡然无存\t166362\n独立自主\t166363\nAR网\t166364\nmma\t166365\n4.03\t166366\nDSD\t166367\n葛仙山\t166368\nhdfs\t166369\n万神庙\t166370\n三庆\t166371\nGreasemonkey\t166372\n文本编辑\t166373\n凪\t166374\n大连艺术学院\t166375\n第201集\t166376\n200系\t166377\n冰灾\t166378\nOpenWrt中文网\t166379\nLGA775\t166380\n舞阳\t166381\n分销员\t166382\n解评\t166383\n坚称\t166384\n姬魔\t166385\n岗台\t166386\n旅行的意义\t166387\n黄高\t166388\n阿祖拉\t166389\n圣保罗\t166390\n彭华\t166391\nkeyvalue\t166392\n巴生港\t166393\n李民基\t166394\n激光脱毛仪\t166395\n新英体育\t166396\n500亩\t166397\n角化\t166398\n贵屿镇\t166399\n逸风\t166400\n黄章晋\t166401\nue4吧_\t166402\n西岳\t166403\n核证\t166404\n德克萨斯大学达拉斯分校\t166405\n二笔\t166406\n低保户\t166407\nattributes\t166408\n撤回\t166409\n竞争\t166410\n狂暴者\t166411\n颁\t166412\nrns510\t166413\n1x1\t166414\n缺损\t166415\n门房\t166416\nSQL
2008数据库\t166417\n万里\t166418\n泰康之家\t166419\ng50\t166420\n77名\t166421\n包装业\t166422\n超界\t166423\n新进展\t166424\n毕恭毕敬\t166425\n4派\t166426\n紫苏子\t166427\n赛文柒seven\t166428\n清华紫光太阳能\t166429\nCOX\t166430\n金坷垃\t166431\n旺角\t166432\n塑料桶\t166433\n凤凰秀\t166434\n阮清恬\t166435\nMetcn\t166436\n徐州人才网\t166437\nu-boot\t166438\n环球中心\t166439\n云邸\t166440\n县房管局\t166441\n滴滴打车券\t166442\nmybatis批量\t166443\n西湖\t166444\n中冶建工\t166445\n美芽\t166446\n杜塞\t166447\nzhanshi\t166448\n6月前\t166449\n广益路\t166450\n云林\t166451\n网站池\t166452\nwin版\t166453\n3存档\t166454\n一诺千金\t166455\n海基会\t166456\n法师\t166457\n252个\t166458\nTOO\t166459\nzakka\t166460\n猪八戒背媳妇\t166461\n发艺\t166462\n14门\t166463\n替身演员\t166464\n920\t166465\nGalax\t166466\n309国道\t166467\n亨斯曼\t166468\nvm-tools\t166469\n增塑剂\t166470\nparallelism\t166471\n第6页\t166472\n企鹅娘\t166473\n广州华兴康复医院\t166474\n姫野\t166475\n百香籽\t166476\n一分段\t166477\n饼干头\t166478\n菜场\t166479\n华锐\t166480\n朱蒂\t166481\n正韵\t166482\nsqlserver2012\t166483\n另一个群\t166484\n27次\t166485\n汇编器\t166486\n卓越置业集团有限公司\t166487\n迅\t166488\n夜神篇\t166489\n小姑娘\t166490\n佛语\t166491\n贵州省发改委\t166492\n杨美娜\t166493\n第四十三回\t166494\n罂粟\t166495\n可导入\t166496\n石大\t166497\n混凝土膨胀剂\t166498\n零件库\t166499\n艺人们\t166500\n合力泰\t166501\n隐形纱\t166502\n12星\t166503\n热茶\t166504\n开心斗地主\t166505\nMMLoveMeMM\t166506\n参将\t166507\n方源\t166508\n去楼空\t166509\n呗\t166510\ni3\t166511\nforty\t166512\n曲池\t166513\n害惨\t166514\nps糖水片\t166515\nakari\t166516\n孟津县人民政府\t166517\n白燕\t166518\nKeith\t166519\n荒漠\t166520\n葛兆光\t166521\n帅\t166522\n客如云\t166523\n40平方\t166524\n20几年\t166525\n多错\t166526\n秀萝\t166527\n路单\t166528\n营救\t166529\n富士通\t166530\n爱启航\t166531\n刘纪鹏\t166532\n白鹿\t166533\n武炼星辰变\t166534\n六十米\t166535\n错点鸳鸯\t166536\n李微漪\t166537\n石川小百合\t166538\n特别策划\t166539\n悄咪咪\t166540\n何芳\t166541\n抱抱\t166542\n黄羊\t166543\n胃泌素\t166544\nQR\t166545\nwifi助手\t166546\n14r\t166547\n急转弯\t166548\n保济丸\t166549\n反意疑问句\t166550\n崔成国\t166551\n佐佐木遥\t166552\n紗\t166553\n骑鹅旅行记\t166554\nBD-720P-MP4\t166555\n费时\t166556\nqiime\t166557\n保健员\t166558\n潮范\t166559\n布桩\t166560\n绝地\t166561\njiaoshi\t166562\n魔穗字幕组
\t166563\niphone5c\t166564\n施茶村\t166565\n40级\t166566\nSMALL\t166567\n刨地\t166568\n入式\t166569\n谭晓彤\t166570\n奥斯特罗姆\t166571\n瘤\t166572\nMonkeys\t166573\n粉皮机\t166574\n陶笛谱\t166575\n南谯区\t166576\nbic\t166577\n一肖\t166578\n长明\t166579\n电脑硬盘\t166580\n尤尤\t166581\n患处\t166582\nTika\t166583\n杨海波\t166584\n清洁型\t166585\n八年级英语\t166586\n八路军\t166587\n校本研修\t166588\n老妇女\t166589\n暗盘\t166590\n迈迪设计宝\t166591\n刷机网\t166592\n新梦\t166593\n每行\t166594\n野帝\t166595\n445号\t166596\n89988550\t166597\nIphoneX\t166598\nofo\t166599\nMTV\t166600\n血珠\t166601\ndx9\t166602\n蒸汽机器人\t166603\n30多万\t166604\n49号\t166605\n秦昭王\t166606\n绿度母\t166607\n默言\t166608\n22点\t166609\n陈中华\t166610\n纹龙\t166611\n法律特征\t166612\n捻机\t166613\n不文\t166614\ndai\t166615\nn4\t166616\n西装\t166617\n消音\t166618\n秋冬装\t166619\n少霞\t166620\n洪泰\t166621\n应知\t166622\n舞天姬\t166623\n脉冲星\t166624\n泰丝\t166625\n郑念\t166626\n红马鞍\t166627\n我的一生\t166628\n证明表\t166629\nMSVCR120.dll\t166630\nCinematic\t166631\n非彤小可商城\t166632\n肴\t166633\nprobes\t166634\nqfii\t166635\n天宁\t166636\n馆陶县\t166637\n维京:王者之战\t166638\ncompareto\t166639\n抄税\t166640\n彭措\t166641\n小野猫\t166642\n双泛\t166643\n165亿\t166644\n头域\t166645\n鼎\t166646\n盐酸特比萘芬片\t166647\n臭味相投\t166648\n佛者\t166649\nRMAN\t166650\n宋岚\t166651\n自动机械表\t166652\ntat\t166653\n乐视乐1S\t166654\n再袭\t166655\n第五种\t166656\n北陆\t166657\n阎奕格\t166658\nWestworld\t166659\n柳斌杰\t166660\n纵论\t166661\n奇佳\t166662\n5572\t166663\n平头\t166664\n云谷寺\t166665\n满满\t166666\nwo-27s\t166667\n山灵\t166668\nIIC\t166669\nV6.3.0\t166670\nintp\t166671\n160米\t166672\n10米高\t166673\n适应症\t166674\n商业保险公司\t166675\nsedex\t166676\n扁桃腺\t166677\n灵动\t166678\n到大\t166679\n秋长镇\t166680\nQQ代刷网\t166681\n立达中学\t166682\nmyers\t166683\nPinterest\t166684\n南昆铁路\t166685\n摊贩\t166686\n考情\t166687\n种子植物\t166688\n王海鹏\t166689\n县国税局\t166690\nhuman\t166691\nKorean\t166692\n荷坳\t166693\n广州万达广场\t166694\n第十六季\t166695\nX-Pro2\t166696\n25万元\t166697\n老越吃香\t166698\n强权\t166699\n20160417\t166700\nGAM\t166701\nmendeley\t166702\n缠访\t166703\n血钻\t166704\n_奇迹MU\t166705\n猫心儿\t166706\nexcel页码\t166707\n渔者\t166708\n马琳\t166709\n药草
\t166710\n环境科学与技术\t166711\nsweetheart\t166712\n举起\t166713\n大涨价\t166714\nasami\t166715\nUBUNTU\t166716\n卷盘\t166717\n金龙花园\t166718\nlubuntu\t166719\n丘壑\t166720\n致护\t166721\n重庆市人民政府办公厅\t166722\n豆沙包\t166723\n地藏王菩萨\t166724\n肌酸激酶\t166725\n科右前旗\t166726\norganisation\t166727\n漂尾\t166728\n英飞特\t166729\nPooling\t166730\n张羽\t166731\n相同行\t166732\n滑模变\t166733\n快键\t166734\nSSW\t166735\nshuchu\t166736\n龙大\t166737\n红安县\t166738\n双截龙\t166739\nignored\t166740\nvary\t166741\ncytherea\t166742\n塔吉克族\t166743\n电子鼓\t166744\n小米MIUI9\t166745\n线处\t166746\n四川通志\t166747\n3dmx\t166748\n海信广场\t166749\nanquan\t166750\nVueJs\t166751\n教育部思政司\t166752\n天猫tmall.com\t166753\naynu\t166754\nLANNIA\t166755\n艾顿\t166756\n第九讲\t166757\n雪铁锦绣未央\t166758\n落定\t166759\nDele\t166760\nword页\t166761\n改单\t166762\nATS\t166763\n杨永晴\t166764\n杨树朋\t166765\nautodesk360\t166766\n企业文明网\t166767\n康捷\t166768\nExcel模板\t166769\n干巴巴\t166770\n云贵高原\t166771\n子洲政府网\t166772\n半部\t166773\nAnnouncing\t166774\nxplay5\t166775\n麦块\t166776\nxshot\t166777\n学位服\t166778\n控制者\t166779\npriv\t166780\n大学城\t166781\n无籽\t166782\n取款机\t166783\n林丽敏\t166784\n昂克赛\t166785\n河南省儿童医院\t166786\n命宫\t166787\n惊心动魄\t166788\n福海县\t166789\n币安交易所\t166790\n翻山\t166791\n慌不慌\t166792\n10000步\t166793\n汉龙\t166794\n他其实没那么喜欢你\t166795\n人参泡酒\t166796\n积土\t166797\n王贼\t166798\nRESOURCE\t166799\n存取\t166800\n温皇\t166801\n相许\t166802\nrappelt\t166803\n衢江区\t166804\n主路\t166805\nInt32\t166806\n彩监\t166807\n前课\t166808\n楼南光\t166809\n黑根\t166810\nutilize\t166811\n第2项\t166812\n重拳出击\t166813\n无可比拟\t166814\n微擎模块\t166815\n埃索美拉唑镁肠溶片\t166816\n标新立异\t166817\nmysite\t166818\n附安\t166819\n89c52\t166820\n如黛\t166821\n海洋牧场\t166822\n异丁烯\t166823\n盛事\t166824\n钢渣\t166825\n嵌套查询\t166826\npassword\t166827\nmanhattan\t166828\n男中音\t166829\n崔天凯\t166830\n南京地区\t166831\ntcga\t166832\n调参\t166833\n111期\t166834\nresize2fs\t166835\n减伤\t166836\n尿道口\t166837\n有痕\t166838\n横担\t166839\n玉泽\t166840\n活佛济公\t166841\n牛奶浴\t166842\nPigeon\t166843\n现成\t166844\n更看重\t166845\n李雁鸣\t166846\n南山医院\t166847\n卧位\t166848\n横琴湾酒店\t166849\n张玉安\t166850\n医疗器械分类目录\t1
66851\n灰原\t166852\ndllexport\t166853\n粉刺\t166854\n油票\t166855\nsalvatore\t166856\n说走\t166857\n红米手机4A\t166858\n江竹筠\t166859\n罪业\t166860\n不方便\t166861\n晋级\t166862\n小白\t166863\n螺纹\t166864\n水电管\t166865\n写作\t166866\npwm波\t166867\n永恒岛\t166868\n朱永棠\t166869\n图吧酒店预订网\t166870\n极智\t166871\n安费诺\t166872\n抢单\t166873\n第36轮\t166874\n李雪松\t166875\n第8集\t166876\nクラブ\t166877\n筹备会\t166878\n幸苦\t166879\n陆行鸟\t166880\n欧派集团\t166881\n古装片\t166882\n形意拳\t166883\n三晋\t166884\nCANBUS\t166885\n_莲蓬鬼话\t166886\n11584\t166887\n7月3日\t166888\n中小客车\t166889\nbitnami\t166890\n外廊式\t166891\nGTX750TI\t166892\n风飞沙\t166893\n鲁政委\t166894\n中国石油化工股份有限公司\t166895\n靓照\t166896\n鼎城区\t166897\nours\t166898\n室内门\t166899\n完璧\t166900\n白金龙\t166901\n奶茶妹\t166902\n省事\t166903\n炉石传说逗鱼时刻\t166904\nagency\t166905\n激流堡\t166906\n安大略省\t166907\n内田\t166908\n伸缩门\t166909\n仫佬族\t166910\nprobiotic\t166911\n时政\t166912\n2060年\t166913\n铁头娃\t166914\n世良\t166915\n东长安街1号\t166916\nawp\t166917\n战争艺术:赤潮\t166918\n列表项\t166919\n信息技术类\t166920\n溴化钾\t166921\nSTM32-F0/\t166922\n含铝\t166923\n第十条\t166924\n7.5w\t166925\n受体\t166926\n10.04\t166927\n北京大学历史学系\t166928\n悦纳\t166929\nSaigon\t166930\n书生商务网\t166931\nChanging\t166932\n脑链\t166933\n国际货运\t166934\nvofill-work\t166935\nOrigin8.0\t166936\n铝土矿\t166937\n凝静志远\t166938\n小家联行\t166939\n美国文理学院\t166940\nAwfully\t166941\nTransportation\t166942\n量具\t166943\n饭统\t166944\nRAIN\t166945\nNAYNEHC\t166946\n走道板\t166947\nandersonyan\t166948\n短头发\t166949\n2017年4月3日\t166950\nnofollow\t166951\n政党\t166952\n29类\t166953\nΙ\t166954\n涿郡\t166955\n桓台\t166956\naim\t166957\n哈哈动漫网\t166958\n双插\t166959\nteachingfeeling\t166960\n瓜子壳\t166961\n桃太郎\t166962\n交界处\t166963\n正阳门\t166964\n人力资源服务业\t166965\n电视投屏\t166966\n看稿\t166967\n67路\t166968\n应聘者\t166969\n面肌痉挛\t166970\nlayui\t166971\n第十九次\t166972\n有情郎\t166973\n那位\t166974\n汤山温泉\t166975\n耳塞式\t166976\n国缘\t166977\n代谢\t166978\n丁家庄\t166979\ncir\t166980\n乌力吉\t166981\n古兰\t166982\n电源电路\t166983\n冒泡排序法\t166984\nsubmit\t166985\n担子\t166986\n马边彝族自治县\t166987\nvii\t166988\n中华人民共和国驻泰王国大使馆\t166989\nrefcursor\t166990\n生理裤\t166991\n
姜武\t166992\n一金\t166993\n破山河在\t166994\nflare\t166995\n热橙\t166996\n53_\t166997\n职来职往\t166998\n论处\t166999\n太平养老保险股份有限公司\t167000\nplusm\t167001\n神湾镇\t167002\nMIRROR\t167003\nxinetd\t167004\n有点累\t167005\nbiscuit\t167006\n小分\t167007\n吴起县\t167008\n格式转换软件\t167009\nNRC\t167010\n液压坝\t167011\n【达尔优\t167012\n方兴未艾\t167013\n小魔仙\t167014\n】性\t167015\nexecvp\t167016\n流潋紫\t167017\n成蝶\t167018\n王会喜\t167019\n更好\t167020\n中国杂技系\t167021\n小说下\t167022\n飞机失事\t167023\n田禹治\t167024\n对字\t167025\n办理处\t167026\n村色\t167027\n人名单\t167028\nK550\t167029\nkendrick\t167030\n精索\t167031\n气密\t167032\n第9套\t167033\n李哲\t167034\n意料之中\t167035\n正切\t167036\n岛屿\t167037\n暗码\t167038\n红拳\t167039\nfanying\t167040\n炉霍\t167041\n固肾安胎丸\t167042\n北面\t167043\n北京经济开发区\t167044\n爱啦啦\t167045\n侯梦莎\t167046\n山东省经济和信息化委员会\t167047\n第十八次\t167048\n雷达里奥\t167049\n非凡任务\t167050\n牧牛\t167051\n万2\t167052\n小学语文\t167053\n门板\t167054\n上海生科院\t167055\n鹤州\t167056\n恒科\t167057\nCAXA电子图板\t167058\n八一南昌起义\t167059\n中芬\t167060\n煮酒论史\t167061\n户头\t167062\n本地郎\t167063\n建筑工\t167064\n白带增多\t167065\n耶\t167066\n异形板\t167067\nProfessionals\t167068\n硅PU球场\t167069\n非公开发行公司债券\t167070\n语者\t167071\ntxt小说下载吧\t167072\njamess\t167073\n转名\t167074\n董圆圆\t167075\n生死棋\t167076\n朱勇\t167077\n无职转生\t167078\n换上\t167079\n曾昭玮\t167080\n黄埔古港\t167081\n融合型\t167082\n东泉\t167083\n仙班\t167084\n小季\t167085\n新题\t167086\n行骗\t167087\n优派\t167088\ncrp\t167089\n通稿\t167090\n500nm\t167091\n2017年5月1日起\t167092\n加机\t167093\n30平方\t167094\n添加行\t167095\n埋入式\t167096\n浙江省国际贸易集团有限公司\t167097\n沈阳新闻_新闻中心\t167098\nLED电子显示屏\t167099\n残破\t167100\n恋爱サ\t167101\nMotoGP\t167102\n鲜活\t167103\nJAVA语言运算符\t167104\n能会\t167105\n徐玄\t167106\n抽象派\t167107\n梅派\t167108\n苦楝树\t167109\nRelease\t167110\n亚银\t167111\n大众电影百花奖\t167112\n草畜\t167113\n风范\t167114\n思加图\t167115\n手机南方网\t167116\nmt5\t167117\n无夜\t167118\n华尔道夫酒店\t167119\n吸纸\t167120\n弗林特\t167121\n胸乳\t167122\nCNZZ\t167123\n客卿\t167124\n电脑模拟器\t167125\n4300万\t167126\n公号\t167127\n印有\t167128\n草房子\t167129\n10分钟以内\t167130\neducational\t167131\n防身术\t167132\n席娟\t167133\n金唱片\t167134\n但是\t167135\n敢爱\t16713
6\n浙江东方\t167137\n泛悦城\t167138\naddr\t167139\n行政处\t167140\n新西兰梅西大学\t167141\n吕勇\t167142\n卜冠\t167143\n20p\t167144\n博乐\t167145\n咨讯\t167146\n边际率\t167147\nHeal\t167148\n活塞销\t167149\npd虚拟机\t167150\nawesome\t167151\nPS磨皮\t167152\n罗征\t167153\n弹孔\t167154\n晋江文学城_\t167155\n非暴力\t167156\n肺片\t167157\n借读费\t167158\n82mm\t167159\n洛朗\t167160\n吹捧\t167161\n捉虫\t167162\n徐洁\t167163\n1章\t167164\n屯昌县\t167165\nascend\t167166\n集会\t167167\nniu\t167168\n少花钱\t167169\n第一卦\t167170\nGd\t167171\n先烈们\t167172\n数码变焦\t167173\n罗西\t167174\n肺俞\t167175\n战舰猎手\t167176\n化学试剂\t167177\n梁宽\t167178\n朋克风\t167179\n功体\t167180\n诺提勒斯\t167181\n季_\t167182\n笑到最后\t167183\n硅藻泥背景墙\t167184\n下结论\t167185\n脓皮症\t167186\nArtists\t167187\n骶髂关节炎\t167188\n活喜\t167189\n外资\t167190\n相对分子量\t167191\n大阪站\t167192\n红烧茄子\t167193\n老沈\t167194\n奥迪A4L论坛_汽车之家论坛\t167195\n81页\t167196\n四眼井\t167197\ntriz\t167198\n人口自然增长率\t167199\nsexoquene\t167200\n季风气候\t167201\nvoto\t167202\nJstl\t167203\n不符实\t167204\nssop\t167205\n拉伸模\t167206\n盐酸羟甲唑啉\t167207\n珮\t167208\nintend\t167209\n广东培正学院\t167210\nRSL\t167211\n屈伸\t167212\nPurification\t167213\n巫婆\t167214\n京香JULIA\t167215\n诡爱\t167216\nHCL\t167217\n水带\t167218\n问水\t167219\n东厂\t167220\n保利香槟花园\t167221\n10.0.3\t167222\n樊登\t167223\nshiye\t167224\n求善\t167225\n流氓\t167226\ngtx980\t167227\n自慰器\t167228\n朱瞻基\t167229\n基本建设程序\t167230\n越来越来\t167231\n富姐\t167232\n温布\t167233\n张_\t167234\nJulie\t167235\nㄅ\t167236\n红袖章\t167237\n太空无垠\t167238\n参好\t167239\nfestivals\t167240\n二代妖精之今生有幸\t167241\nDangerous\t167242\nNut\t167243\n三首歌\t167244\nShoot\t167245\n成交额\t167246\n花种\t167247\nrecycleview\t167248\n鸿达\t167249\nx战警:天启\t167250\n骁龙吧_\t167251\n鼓楼街道\t167252\n肥宅\t167253\n请解\t167254\n寒暄语\t167255\n傲梅分区助手\t167256\nI9100\t167257\n三维鱼乐队\t167258\n新福尔摩斯\t167259\n山边\t167260\n碧海湾\t167261\n铁网\t167262\n孟燕\t167263\n世博展览馆\t167264\n5.44\t167265\n接驳器\t167266\n瞬态\t167267\n安莉芳\t167268\n一目了然\t167269\n8520\t167270\n忘不了\t167271\nrc390\t167272\n湖南铁道职业技术学院\t167273\n内蒙古自治区高级人民法院\t167274\nfclose\t167275\n张片\t167276\nAndrew\t167277\npineapple\t167278\nwindows虚拟机\t167279
\n韩兆琦\t167280\n重庆建设银行\t167281\npriorities\t167282\nswc\t167283\n1017\t167284\n症\t167285\n金盒\t167286\n深圳都市报数字报\t167287\n欧阳少恭\t167288\n过逝\t167289\n养德\t167290\n柔亮\t167291\nPaperwhite\t167292\nlovelivesunshine\t167293\n彩标\t167294\ndumps\t167295\n台铁网\t167296\n吉杰\t167297\n5088\t167298\n喷墨机\t167299\n爆衣\t167300\n2群\t167301\n伊苏菲尔盖纳\t167302\n27码\t167303\n中国测绘科学研究院\t167304\n南湖新区\t167305\n两个责任\t167306\nGridFS\t167307\n7月初\t167308\n张金顺\t167309\n选区\t167310\n130马力\t167311\n四川省经济和信息化委员会\t167312\n轻典\t167313\nrhel7\t167314\ni云保\t167315\n联保\t167316\nUI\t167317\n合肥新站\t167318\n大丈夫\t167319\n前卒\t167320\nNEO\t167321\n角座阀\t167322\n王世伟\t167323\n奇俊\t167324\nSmart\t167325\n12.5%\t167326\n习近平总书记系列重要讲话\t167327\n舒乐安定\t167328\n银泰创意城\t167329\n公关部\t167330\n恒大珺睿府\t167331\n先锋指数\t167332\n尽管\t167333\nMCP\t167334\n1000所\t167335\n珠海教育信息网\t167336\n纪念集\t167337\n柳岸晓风\t167338\n亚洲卫视\t167339\n员工持股计划\t167340\n成都房协网\t167341\ngongping11\t167342\n威海职业学院\t167343\n逃逃\t167344\n跪安\t167345\n宏顺\t167346\ncad07\t167347\n创维数字\t167348\nProton\t167349\n250元\t167350\n大师版\t167351\nuevent\t167352\n开门红\t167353\n星海街\t167354\n东华理工\t167355\n灌洗\t167356\nV6.2\t167357\n伏魔记\t167358\n智媒\t167359\n中金网\t167360\n多音字\t167361\n美妍\t167362\nidear\t167363\n住宿价\t167364\n乾隆帝\t167365\nStrin\t167366\n蟹老板\t167367\n汉缆股份\t167368\n埕\t167369\n止水节\t167370\n天津口腔医院\t167371\n吴震\t167372\n正定国际机场\t167373\n文化园\t167374\n重生民国\t167375\n礼光\t167376\n行讯机床网\t167377\n防锈漆\t167378\n金门大桥\t167379\n四针\t167380\n曝光补偿\t167381\n辈子\t167382\n2018.4.15\t167383\n宋健\t167384\n于平\t167385\n氮泵\t167386\n百年孤独\t167387\n人间烟火\t167388\n腰疼\t167389\n申请者\t167390\nelectrodes\t167391\n青大附中\t167392\n随机数组\t167393\n20180410\t167394\n粪机\t167395\n七环\t167396\n撸二哥\t167397\n老农民\t167398\n影子\t167399\n今人\t167400\nExisting\t167401\nLFA\t167402\n长行\t167403\n宴席\t167404\ngams\t167405\n敷\t167406\n毕业证\t167407\n一切险\t167408\n二当家\t167409\n路劲地产\t167410\n萨摩亚\t167411\n劈成\t167412\n1+1教育网\t167413\n泛悦\t167414\n双旗币\t167415\n数控雕刻机\t167416\nk73\t167417\n清澈见底\t167418\n负载型\t167419\n齐平\t167420\n注册\t167421\n康婷公司\t167422\nmultipartf
ile\t167423\n石墨电极\t167424\n张泉\t167425\n临湖镇\t167426\n45张\t167427\n⊕\t167428\nrequset\t167429\n文里\t167430\nDesigner16\t167431\n可可脂\t167432\n新世纪福音战士剧场版\t167433\n讲演稿\t167434\n重庆乐居网\t167435\n阿U\t167436\n互搏\t167437\n相仿\t167438\n高威\t167439\n赞美人\t167440\n黄山烧饼\t167441\n铝合金\t167442\nLeeds\t167443\n反对党\t167444\nZ1期\t167445\n奔驰S600\t167446\n7.3PTR\t167447\nwiringpi\t167448\ncil\t167449\n王云飞\t167450\nSFR\t167451\n均馆\t167452\n镜框\t167453\n镇静催眠药\t167454\n管乐\t167455\n2016年6月14日\t167456\n北京足贴\t167457\n背中\t167458\n360行\t167459\n双色球走势图-福彩双色球走势图\t167460\ndbcc\t167461\n相亲网\t167462\n贵州省交通厅\t167463\n园林专业\t167464\n00160\t167465\nLouise\t167466\n亮仔\t167467\nalienquest\t167468\n暗渡\t167469\n时限\t167470\n002008\t167471\n感冒冲剂\t167472\n下弦月\t167473\nnavigate\t167474\n东部华侨城大峡谷\t167475\n映射器\t167476\n阿正\t167477\n裸\t167478\n吉林省卫生和计划生育委员会\t167479\n进出境\t167480\n古典型\t167481\n游戏盒\t167482\n商讯-辽宁新闻网\t167483\n南靖\t167484\n汽船\t167485\nzhijia\t167486\n柳沙\t167487\n五龙\t167488\nBlast\t167489\n甲酸钙\t167490\n天蓝地绿水清\t167491\n88平方\t167492\nzxvf\t167493\n留仙洞\t167494\nG85\t167495\nx22\t167496\n掩涕\t167497\n雁城\t167498\nXiao77论坛\t167499\nfeb\t167500\n建宁公主\t167501\n河北省国税局\t167502\n30章\t167503\n85266666\t167504\nzodgame\t167505\nqinghua\t167506\n安全日\t167507\n头校区\t167508\ncoats\t167509\n导航台\t167510\nx3.3\t167511\n高七\t167512\n色叶\t167513\nLakers\t167514\n商业银行股权管理暂行办法\t167515\n社会保险\t167516\nv2.2.0\t167517\nair1\t167518\n孺子可\t167519\n京籍\t167520\n4299\t167521\n沭阳\t167522\n李书记\t167523\n二婶\t167524\n六里桥\t167525\n暗黑mod吧\t167526\n1200家\t167527\n人邮\t167528\n中纬度\t167529\n装神弄鬼\t167530\n宇宙英雄之超银河传说\t167531\nREXROTH\t167532\n固城湖\t167533\n开心夏令营游学网\t167534\n南京苏宁\t167535\n整箱\t167536\n天津海运职业学院\t167537\n深圳市理邦精密仪器股份有限公司\t167538\npdfFactory\t167539\n母乳喂养\t167540\n小金属\t167541\n一阶二\t167542\n黑气\t167543\n十五级\t167544\n游踪\t167545\nв\t167546\nDresden\t167547\nHalcon\t167548\n古门\t167549\n江苏市\t167550\n2018年10月份\t167551\n爻辞\t167552\npmm\t167553\nsysv\t167554\njsessionid\t167555\nA11\t167556\n最新人教版四年级语文下册\t167557\n十四所\t167558\n化物语\t167559\n东风汽车\t167560\n大孔树脂\t1
67561\n工司\t167562\n倒走\t167563\n135部\t167564\nDAYS\t167565\ntng\t167566\nGameloft\t167567\n第69期\t167568\n仁爱\t167569\n创业投资引导基金\t167570\nTrailers\t167571\n中国纪检监察学院\t167572\nevolving\t167573\n400平方\t167574\n金田铜业\t167575\n静静\t167576\n青岛市北区\t167577\n榜上有名\t167578\nDielectric\t167579\n2016年11月18日\t167580\n悄悄地\t167581\n中国第一汽车集团公司\t167582\n七篇\t167583\n钢屑\t167584\n天刀OL\t167585\n太平轮\t167586\n龙光玖\t167587\n4万个\t167588\n孙洁\t167589\nc语言编程\t167590\nomi\t167591\n全职猎人蚂蚁篇\t167592\n川投能源\t167593\n张海亮\t167594\nポ\t167595\n索亚历险记\t167596\n超级大国\t167597\n石家庄酒店\t167598\n1.1亿\t167599\n轮轩\t167600\n农家宴\t167601\nfork\t167602\n院坝\t167603\n每隔\t167604\nWink\t167605\n3m双面胶\t167606\n嘉禾网\t167607\n保险职业学院\t167608\n风包\t167609\n锦龟\t167610\n2303\t167611\n男枪\t167612\n3dm论坛\t167613\n北岛\t167614\n中国科学院心理研究所\t167615\n香港中国银行\t167616\n4.8.5\t167617\n邪斗\t167618\n中央广播电台\t167619\n35年\t167620\nmog\t167621\n自由行景酒\t167622\n七宗罪\t167623\n塔机\t167624\n电影战神纪#\t167625\n纳什均衡\t167626\nejdz\t167627\n反相器\t167628\nCODEX\t167629\n鲜网辣文\t167630\n国家法定假日\t167631\n魔化\t167632\nIggy\t167633\n正月十四\t167634\noffice\t167635\n北海新区\t167636\n富祥股份\t167637\n客货车\t167638\nGeophysical\t167639\n娱乐网\t167640\n装机员\t167641\n无差\t167642\n2560x1080\t167643\n冰鞋\t167644\n十三家\t167645\n第82号\t167646\numass\t167647\n苍灵\t167648\n禁毒日\t167649\n九年后\t167650\n偶然\t167651\n幼年时\t167652\n胶囊旅馆\t167653\n直角\t167654\n郑开城际铁路\t167655\nman2017\t167656\n特有\t167657\n找神途\t167658\n高品\t167659\n决策树\t167660\n夜勤病栋\t167661\n心气虚\t167662\n虚张\t167663\n成都天府新区投资集团有限公司\t167664\n太极岛\t167665\n滨湖新区\t167666\n森峰\t167667\n刘姝威\t167668\n门墩\t167669\n一卡双号\t167670\n摘录\t167671\n挂挂\t167672\nagility\t167673\n画质\t167674\n洛斯\t167675\n万家丽中路\t167676\n延长路\t167677\nPassion\t167678\n唐顿\t167679\n青西郊野公园\t167680\nSegNet\t167681\n诺亚方舟实验室\t167682\n沧县\t167683\n老羊\t167684\n特图\t167685\n岩手\t167686\n茜茜公主2\t167687\n下午六点\t167688\n郑州大学西亚斯国际学院\t167689\n不二\t167690\n棒棒鸡\t167691\n真材实料\t167692\nLines\t167693\n拳掌\t167694\n海鑫\t167695\n赵微\t167696\n淋巴滤泡增生\t167697\nchimera\t167698\n18件\t167699\n离开时\t167700\n声明类\t167701\n第十一版\t167702\nwebpac
k4\t167703\n旅记\t167704\nYHAChina\t167705\n油罐车\t167706\n郑国渠\t167707\n64.rpm\t167708\n谁清楚\t167709\ndizhi\t167710\n鸡兔同笼\t167711\n虚幻4吧\t167712\nPPT_word\t167713\nmandarin\t167714\n字块\t167715\n英音版\t167716\n普装\t167717\n人生路上\t167718\n傅程鹏\t167719\nIE11\t167720\n威海市立医院\t167721\n小吊车\t167722\nmainloop\t167723\n盐山\t167724\n效果图\t167725\n画鸡\t167726\n配音网\t167727\n丽景\t167728\n沈升\t167729\n广州杰赛科技股份有限公司\t167730\nLuxion\t167731\nTaught\t167732\n滴水槽\t167733\n北京朝阳大悦城\t167734\n麦草\t167735\n一字一句\t167736\n抢婚\t167737\n郑永刚\t167738\n结\t167739\nDavid_Tang\t167740\n太清\t167741\n游氏\t167742\n1038\t167743\n类似词\t167744\n食量\t167745\n解锋镝\t167746\npentair\t167747\n安庆市\t167748\n工学硕士\t167749\ncommentary\t167750\nGONG\t167751\n三航\t167752\n德性\t167753\n广州南站客运站\t167754\n颈内\t167755\ny430p\t167756\n女生部\t167757\n2015年01月30日\t167758\n夏磊\t167759\n分子动力学\t167760\n标度\t167761\n强身\t167762\n五米\t167763\n北京出版社\t167764\n车辆识别码查询-宜配网\t167765\nNougat\t167766\n面膜\t167767\n伍先生\t167768\n广西壮族自治区食品药品监督管理局\t167769\n剥线\t167770\n现代商业\t167771\n万谢\t167772\n不生\t167773\n水中毒\t167774\n双水\t167775\n雅颂\t167776\n河边镇\t167777\n350ml\t167778\n鹿心社\t167779\n商城县\t167780\n风晴雪\t167781\n妹汁\t167782\n中国长江三峡集团有限公司\t167783\n油耳朵\t167784\n大师们\t167785\nheise\t167786\n04.02\t167787\n杨铭\t167788\n早读\t167789\noh\t167790\n度人\t167791\n00019\t167792\n卫衣女\t167793\nprint2flash\t167794\n何俊\t167795\n彩板房\t167796\nFANCL\t167797\n四川机电职业技术学院\t167798\n刘云飞\t167799\n云浮新区\t167800\n尤克里里ukulele\t167801\n芭比娃娃\t167802\n医疗器械经营许可证\t167803\n神烦狗\t167804\n不认人\t167805\nъ\t167806\n213号\t167807\n都神\t167808\nPH3\t167809\n153集\t167810\narrive\t167811\nfoxconn\t167812\n衢山\t167813\n西西伯利亚\t167814\n减租\t167815\n光女\t167816\n要义\t167817\n楚雄在线\t167818\nGAI\t167819\nt300\t167820\n番\t167821\n麦粒肿\t167822\nfinance\t167823\nTanner\t167824\n镜心\t167825\nLength\t167826\n110配线架\t167827\n萍儿\t167828\n折减\t167829\n礼装\t167830\n全国青少年信息学奥林匹克\t167831\nBHG\t167832\n胸贴\t167833\nLOOKGAME\t167834\n户\t167835\n云视通\t167836\n辣眼时评\t167837\ncanoco\t167838\ntesoon\t167839\n御馔\t167840\n僧\t167841\n佛协\t167842\n内江六中\t167843\n运缘阁\t16
7844\n李大嘴\t167845\n杀球\t167846\n织布机\t167847\n流水台\t167848\n佰佰\t167849\n楠木\t167850\n刘青松\t167851\nplywood\t167852\n广告机\t167853\n98c\t167854\n胡瓜影院\t167855\n洽谈桌\t167856\n才肯\t167857\n胃\t167858\n斯慕吉\t167859\n音量\t167860\n武鸣\t167861\n阳光公寓\t167862\n10月5日\t167863\n2018年3月25日\t167864\ngram\t167865\n400kg\t167866\n中医院\t167867\n七一网\t167868\n百胜村\t167869\n长春市国土资源局\t167870\n侦缉\t167871\n万分之二\t167872\n重庆市电力公司\t167873\n张建行\t167874\n小店区人民政府\t167875\nvue子\t167876\n发懵\t167877\n阳光集团\t167878\n始料未及\t167879\n三缺一\t167880\n.netcore\t167881\n保守型\t167882\n新疆快递\t167883\n瓯江\t167884\n75千瓦\t167885\n绝世邪神\t167886\n按压\t167887\n王塔姆\t167888\n20170603\t167889\n14498\t167890\n中术\t167891\ntsn\t167892\n掌舵\t167893\n肉偿\t167894\n天鹅湾\t167895\n企业核心竞争力_\t167896\n辽沈晚报\t167897\n1772\t167898\n226个\t167899\n蠡\t167900\n李兵\t167901\n5227\t167902\n扯蛋\t167903\n鱼叉\t167904\nBut\t167905\n小河\t167906\n江户城\t167907\ndxo\t167908\nBASED\t167909\n梅奥诊所\t167910\n王鹏\t167911\n24篇\t167912\n下吕温泉\t167913\nP71\t167914\nthumbnail\t167915\n桂军\t167916\n森雅r7\t167917\n铁锋区\t167918\n小乔丹\t167919\nCollectors\t167920\n包退\t167921\nfreemark\t167922\ncova\t167923\n叉叉哥\t167924\n7000万元\t167925\n厦门市中医院\t167926\n高达exvs\t167927\n肖品\t167928\n打酒\t167929\n深里\t167930\n9950\t167931\n开元名都大酒店\t167932\n一刹那\t167933\n新沂市\t167934\nbzhan\t167935\n硫化仪\t167936\nRS-232\t167937\n东方文化园\t167938\n鼻中隔偏曲手术\t167939\n龙湖\t167940\n肉豆蔻\t167941\n鳗鱼饭\t167942\ntraded\t167943\n烙胤\t167944\n恶性肿瘤\t167945\n逍遥情缘\t167946\n天梁\t167947\n易生支付\t167948\n强电箱\t167949\n吹响\t167950\n查违\t167951\n铜陵北\t167952\nprofiler\t167953\n嘧啶\t167954\ndoesn\t167955\n极米投影\t167956\n共聚焦\t167957\n.jar\t167958\n联络员\t167959\n抄写员\t167960\n梁思礼\t167961\n导电胶\t167962\n书堆\t167963\n北京中日友好医院\t167964\n杰森雷军\t167965\n乡村爱情变奏曲\t167966\nlstat\t167967\n天源\t167968\n唐天\t167969\nA级\t167970\n净宗\t167971\nResponseBody\t167972\n叠合\t167973\n广州三甲医院\t167974\n超声内镜\t167975\n砍下\t167976\n糟卤\t167977\nzhelin\t167978\n汤师爷\t167979\n安全生产法\t167980\njQuery/CSS3\t167981\n请帖\t167982\n解放J6\t167983\n淘宝号\t167984\nm3u8\t167985\nConcrete\t167986\n再现性\t167987\n二十万\t1679
88\n背景板\t167989\n聚合物水泥砂浆\t167990\n一汽奥迪\t167991\n初中学\t167992\n王永生\t167993\nKKR\t167994\n黏液\t167995\n果体\t167996\n数据库存\t167997\nle\t167998\n和风物语\t167999\n冷流道\t168000\n浦建路\t168001\n55668153\t168002\n深圳公明汽车站\t168003\nmbuf\t168004\n至尊无上\t168005\n队长\t168006\n沃尔我的1979\t168007\n暗装\t168008\n娇兰佳人\t168009\n第二语言\t168010\n风衣\t168011\n宁波网络公司\t168012\n恩美\t168013\nGlashütte\t168014\nwall\t168015\n天业股份\t168016\n壹家\t168017\nmcf\t168018\nClean\t168019\n51万\t168020\n急性肾小球肾炎\t168021\n殃及\t168022\n专业技术人员计算机应用能力考试\t168023\n金朝阳\t168024\n经管资料网\t168025\nons模拟器\t168026\n缘梦\t168027\nguwei4037\t168028\nVXLAN\t168029\norganized\t168030\n四条\t168031\n大连新闻_天健网\t168032\n铁壶\t168033\n异丁醇\t168034\n反不\t168035\noperations\t168036\n季文康\t168037\n商字\t168038\n智联网\t168039\n圣火徽章外传\t168040\ngshock\t168041\n匪浅\t168042\ntact\t168043\n空难\t168044\n鞑虏\t168045\npalace\t168046\n彤\t168047\n微服私访\t168048\n糯米丸子\t168049\n脾\t168050\n公安海警学院\t168051\nWebLogic\t168052\n94cool\t168053\n17p\t168054\n浩博\t168055\n念奴娇赤壁怀古\t168056\nSail\t168057\nTabs\t168058\n维生素d滴剂\t168059\n内存条吧\t168060\n国家广电总局\t168061\n106周年\t168062\n苏格拉底\t168063\n州委\t168064\n滴度\t168065\n铁门\t168066\n畜牧业\t168067\nd3\t168068\n侯夫人\t168069\nclamav\t168070\n肥厚\t168071\npornographic\t168072\n600毫米\t168073\n法签\t168074\n卡子门\t168075\n在空\t168076\n特摄\t168077\n15天左右\t168078\n电池\t168079\nrunn\t168080\nMAKA\t168081\n乌托邦\t168082\n新成立公司\t168083\n3gb\t168084\nASO优化\t168085\n40公分\t168086\n4速\t168087\n乐21k\t168088\n赣州师范高等专科学校\t168089\n铁氧体\t168090\nsupermicro\t168091\n大鱼海棠\t168092\n三亚婚纱摄影\t168093\n本站\t168094\n宗旨\t168095\n青青岛\t168096\n日月明\t168097\n页面值\t168098\n为你点赞\t168099\n探头\t168100\n亚邦\t168101\nlol\t168102\n状元郎\t168103\n混动汽车\t168104\n稳当\t168105\n脸庞\t168106\n20160317\t168107\n迈克尔波特\t168108\n衬砌\t168109\n魂虫王\t168110\n洋娃娃\t168111\n0.005\t168112\nOffical\t168113\n周民\t168114\nMyEclipse2015\t168115\n松紧度\t168116\n咔\t168117\n小刘\t168118\npdf-CSDN\t168119\nbeetlsql\t168120\n天天德州\t168121\nrive\t168122\nEquinox\t168123\n欲出\t168124\n小产品\t168125\n杉浦则夫\t168126\nuc头条\t168127\n望尘莫及\t168128\n弃医\t168129
\n坐诊\t168130\n分化型\t168131\n斗魂\t168132\n龙观\t168133\n本版\t168134\n582\t168135\n50岁\t168136\n松恩峡湾\t168137\n金佳\t168138\n5200u\t168139\n帝度\t168140\n济南西部\t168141\n香源\t168142\n卡拉宝\t168143\n运抵\t168144\n托雷\t168145\n缩小\t168146\n广州中山眼科医院\t168147\n降落伞\t168148\n铝艺\t168149\n35期\t168150\n宝马新3系\t168151\n稍加\t168152\nds-5\t168153\n导航者\t168154\n陈君\t168155\n女子戒色吧\t168156\n泉州19楼\t168157\nTeenTube\t168158\n天津派出所\t168159\n灵活度\t168160\n天神下凡\t168161\n后杠\t168162\n封神榜之凤鸣岐山\t168163\n鹰洋\t168164\n翡翠公园\t168165\n陈式太极拳新架一路\t168166\n二次方程\t168167\nPublication\t168168\n唐波\t168169\n知识树\t168170\n0x000006ba\t168171\nPro2\t168172\n指引\t168173\n9191\t168174\n妄为\t168175\n人工定额\t168176\n保险类\t168177\nWxPython\t168178\n非典型鳞状细胞\t168179\nchoices\t168180\n武汉地铁7号线\t168181\n宝马4s店\t168182\n献言\t168183\n腾讯欢乐麻将\t168184\nlunawzh\t168185\n上海物流公司\t168186\n依玛\t168187\nunderscore\t168188\n喜客\t168189\n菊池\t168190\n立杆\t168191\n通时\t168192\nPING\t168193\n开钻\t168194\n华润橡树湾\t168195\n石门县\t168196\n四代机\t168197\n一选\t168198\n出表\t168199\n王兆国\t168200\nFtp\t168201\nco\t168202\n刑事侦缉档案\t168203\nyulin\t168204\n车架\t168205\n新乡医学院\t168206\n斐波那契螺旋线\t168207\n盐酸曲马多\t168208\n6节\t168209\n_路\t168210\n余隆\t168211\n梅尔文\t168212\naskci\t168213\n麻里梨夏\t168214\n红云大道\t168215\n电流互感器\t168216\n433mhz\t168217\n郑源\t168218\nsubjective\t168219\n中午\t168220\n轻井泽\t168221\n箫笛\t168222\n长屏\t168223\n美的置业集团有限公司\t168224\n十宫\t168225\n路虎\t168226\n李桦\t168227\nbulider\t168228\n混流\t168229\n脑筋急转弯\t168230\nKONICA\t168231\n不忠\t168232\n实生苗\t168233\n转角遇到爱\t168234\n我们之间\t168235\n套利\t168236\n八宝景天\t168237\n百洋\t168238\n天龙八部逍遥\t168239\n贡品\t168240\nfinalcut\t168241\n一怎么\t168242\n萧邦\t168243\nCNP\t168244\n责任心\t168245\njushi\t168246\n驾驶案\t168247\n吉利蛋\t168248\n白檀\t168249\nBIS认证\t168250\n金融人才网\t168251\n国家重点研发计划资金管理办法\t168252\n邓小\t168253\n路畅科技\t168254\nPersian\t168255\n谷氨酸脱氢酶\t168256\nqq仙灵\t168257\n2000.com\t168258\n4108\t168259\n欧姆龙自动化(中国)有限公司\t168260\n时往\t168261\n中国人民保险集团\t168262\n农家女\t168263\n组织型\t168264\nfifa17吧\t168265\n摇滚版\t168266\n刘亚庆\t168267\n摸骨\t168268\n交叉感染\t168269\nsm文\t168270\n四川华西集团有限公司
\t168271\nsquat\t168272\n富莱克\t168273\n48厘米\t168274\n只\t168275\n坟墓\t168276\n月子服\t168277\n冒险岛\t168278\nF-150\t168279\nscratch2.0\t168280\n全身麻醉\t168281\n伊川县\t168282\n嫌弃\t168283\n西溪诚园\t168284\nStray\t168285\n测得\t168286\n就职\t168287\n18部\t168288\n我们来了\t168289\nUWB\t168290\n000007\t168291\n垃圾片\t168292\n26UUU\t168293\nMSP430\t168294\nM1522nf\t168295\nMFG\t168296\nWin10平板Surface\t168297\nfaceit\t168298\n杨元元\t168299\nchoco\t168300\n2.35\t168301\n伺服放大器\t168302\n丁书苗\t168303\n劇場\t168304\n15部\t168305\n第一金融网\t168306\n1一20\t168307\n减轻\t168308\n深圳西火车站\t168309\n几句话\t168310\n声测管\t168311\n钢衬\t168312\nMIUI\t168313\n绿油\t168314\n20180203\t168315\n欲拒\t168316\nTRUNK\t168317\nforever\t168318\nlouder\t168319\n十里坡\t168320\nRunTime\t168321\n光华路\t168322\nEXCEL饼\t168323\nsquared\t168324\n玉水\t168325\n小恐龙\t168326\n插头\t168327\n四川天府银行\t168328\n御风者\t168329\n花开\t168330\n主机箱\t168331\n巧虎乐智小天地\t168332\n绝命时刻\t168333\n安徽省皖南八校\t168334\n火麻仁\t168335\n莫待无情空余恨\t168336\n哔哩\t168337\n三相半波可控整流电路\t168338\n循环神经网络\t168339\n中国扶贫基金会\t168340\n加勒比海\t168341\n常万全\t168342\n上海建行\t168343\n前庭神经炎\t168344\n_甘肃省国家税务局\t168345\n东晓街道\t168346\n倍\t168347\n初中毕业班\t168348\n李怡\t168349\n笋芽儿\t168350\n韩启德\t168351\n改集\t168352\n途美\t168353\n新纪元传奇\t168354\n54首\t168355\n鸟毛\t168356\n云南省民政厅\t168357\n人民陪审员\t168358\n务员\t168359\n法意瑞\t168360\n变化史\t168361\n杜鹃鸟\t168362\n古德蒂\t168363\n中文\t168364\n元和\t168365\n梅花香\t168366\n醉梦\t168367\n6型\t168368\n夏洛的网\t168369\nfdw\t168370\n398605条\t168371\n乌克丽丽ukulele\t168372\n不近\t168373\nstudio2008\t168374\nNIR\t168375\n凤城十二路\t168376\n中共江苏省委党校\t168377\n大闸蟹\t168378\npwnable\t168379\n夜射\t168380\n刘云天\t168381\n80亿美元\t168382\n女总裁的贴身高手\t168383\n周彦辰\t168384\n女蛇\t168385\nXDD\t168386\n0852\t168387\nEIS\t168388\n3dmaxvray\t168389\nmipush\t168390\n豚鼠\t168391\n120V\t168392\n用户们\t168393\n摩丝\t168394\n真髓\t168395\n兴园小区\t168396\n上海贝尔\t168397\n上海远大心胸医院\t168398\n行政处分\t168399\n免安装硬盘版\t168400\n无氪\t168401\n短租公寓\t168402\n过流\t168403\n魔法学院\t168404\n丁兰街道\t168405\n这文\t168406\nArma3\t168407\n误闯\t168408\nyoumei\t168409\nsyllable\t168410\nal10\t168411\n忍者
舞\t168412\n上海环球港\t168413\n上科大\t168414\ngoggles\t168415\n席尔瓦\t168416\n武江\t168417\n萃取液\t168418\n第一口\t168419\nlager\t168420\n蕾丝猫\t168421\nEPP\t168422\n水坊\t168423\nmoban\t168424\n多商网\t168425\n高潮\t168426\n蛋黄果\t168427\n国家统一法律职业资格考试实施办法\t168428\n小习惯\t168429\n花盆\t168430\n见微知\t168431\n灌胶机\t168432\nPETS\t168433\n力法\t168434\n侧切\t168435\nICICLE\t168436\n十年来\t168437\n是否\t168438\n植物大战僵尸Online\t168439\n液氮速冻机\t168440\n商誉\t168441\nRa\t168442\n淘拍\t168443\n叶佳\t168444\n京张铁路\t168445\n首本\t168446\n小鱼人\t168447\n6月29日\t168448\n冬菇\t168449\n木灵\t168450\n围棋子\t168451\nCushion\t168452\n太平广记\t168453\n本田雅阁\t168454\n大沙田\t168455\n衣饰\t168456\n红帽子\t168457\n二代目\t168458\n新开店\t168459\n文嘉\t168460\n熊出没之雪岭熊风\t168461\nHarvard\t168462\nXvideos\t168463\n飘逝\t168464\n外流\t168465\n磷酸肌酸激酶\t168466\n伦敦市\t168467\n二八杠\t168468\n抽风式\t168469\n摆布\t168470\n进出\t168471\nvalue=\t168472\n全球五金网\t168473\n机关幼儿园\t168474\nEzra\t168475\n聚氨酯防水涂料\t168476\n忆秦娥\t168477\n搬山\t168478\n凤凰知音\t168479\n期货日内\t168480\n英军\t168481\n脱衣秀\t168482\n3153\t168483\nSilk\t168484\n面议\t168485\nSHIRT\t168486\n云南大学旅游文化学院\t168487\n5566\t168488\n床戏\t168489\nEyewear\t168490\n华商\t168491\n古建筑群\t168492\n肥西路\t168493\ndma\t168494\n16例\t168495\n客书\t168496\n聊斋新编\t168497\n乔木常剑雄\t168498\n整厘米\t168499\n月亮湖\t168500\n受累\t168501\n首日封\t168502\n定鼎\t168503\n26.6\t168504\n区霸\t168505\nGHO文件\t168506\n廉洁勤政\t168507\n瓜田网\t168508\nDYNA\t168509\n辙叉\t168510\n字符号\t168511\n成都房产网\t168512\n10.1.2\t168513\nel-button\t168514\n布里吉塔\t168515\n教育事业单位\t168516\n¤\t168517\n今年5月份\t168518\nAC1200\t168519\n百丈漈\t168520\ndazhou\t168521\n守财\t168522\n阿斯塔纳航空\t168523\n军事纪实\t168524\n南方教育网\t168525\n标签化\t168526\n色图\t168527\n小爱音响mini\t168528\n17岁\t168529\n10086\t168530\n苏氏牛肉面\t168531\n人男\t168532\n百度吧\t168533\n长尾词\t168534\n衣巾\t168535\n解放街\t168536\ndnf驱魔吧\t168537\n孝庄皇后\t168538\n效能感\t168539\n补帖\t168540\n10号\t168541\ncondition\t168542\n梅陇镇\t168543\nBills\t168544\n主客体\t168545\n银板\t168546\nM1\t168547\n滤料\t168548\n清迈大学\t168549\n抢怪\t168550\nyoub\t168551\n春江路\t168552\n维也纳智好酒店\t168553\n哈佛耶鲁\t168554\n挨冻\t168555\n企商\t168556\n团
饭\t168557\n懂行\t168558\nCounseling\t168559\n互联网广告公司\t168560\n铁鼠\t168561\n图盾\t168562\n陽\t168563\n2018年2月11日\t168564\n辛酉日\t168565\n上海国际旅游度假区\t168566\n风卷\t168567\n总共\t168568\n连结\t168569\n洛菲菲\t168570\nfandyst\t168571\nCUT\t168572\n肾阴虚\t168573\n第64届\t168574\n内蒙古自治区财政厅\t168575\nxlua\t168576\n20170503\t168577\n宋汶霏\t168578\njedu\t168579\n展示屏\t168580\nSFO\t168581\nmanners\t168582\n20150515\t168583\nYOTA3\t168584\nTune\t168585\n营业额\t168586\n1.5级\t168587\n57个\t168588\n報導\t168589\n电容值\t168590\n乐天超市\t168591\n安苏羽\t168592\n香港奇案之吸血贵利王\t168593\n求破\t168594\n天空塔\t168595\n遮阳系数\t168596\n常用词\t168597\n腊月二十三\t168598\n金投保险\t168599\n草包\t168600\n退出去\t168601\n陪伴式\t168602\n垂涎\t168603\nBape\t168604\n联想v310\t168605\nblaster\t168606\n弃流\t168607\n刷屏机\t168608\n小儿手足口病\t168609\n験\t168610\n军士长\t168611\n晨曦公主\t168612\n宝鸡市金台区人民政府\t168613\n午安\t168614\n11周年\t168615\n新规定\t168616\n300平米\t168617\n率先\t168618\n认监委\t168619\n壹号院\t168620\n福田时代\t168621\n灵视\t168622\n拍电影\t168623\n魔盟网\t168624\n含子\t168625\n山香\t168626\nkugou\t168627\n造册\t168628\nLake处理器\t168629\n你说的都对\t168630\nDIA\t168631\n光大会展中心\t168632\n放成\t168633\n拉帮结派\t168634\ninone\t168635\n窦性心律\t168636\n北星\t168637\n雪弗莱\t168638\nUnresolved\t168639\nsizeof\t168640\n包包子\t168641\n大隐\t168642\n二仙桥\t168643\nassistance\t168644\n老来网\t168645\n北师大一\t168646\n郑亮\t168647\ncoms\t168648\n铬钢\t168649\n纳米碳酸钙\t168650\nx3.4\t168651\n氯化苄\t168652\n夺目\t168653\nconcave\t168654\n峰子\t168655\ncheaper\t168656\n38元\t168657\n弋阳\t168658\n开身\t168659\n临潼区\t168660\nSwift3\t168661\n名儿\t168662\n邱佳卉\t168663\n董铺水库\t168664\n0891\t168665\n倾天下\t168666\nacrylic\t168667\n塑化剂\t168668\n水地\t168669\n388\t168670\n最优化\t168671\n一年一度\t168672\n丙醇\t168673\n周鸿东方不败\t168674\nArbor\t168675\n你会不会\t168676\nFitbit\t168677\n三匠\t168678\n浙金中心\t168679\n慈光阁\t168680\n黄葵\t168681\n沧州站\t168682\ngrib2\t168683\n会计从业资格证\t168684\n氧化银\t168685\n中二\t168686\n血魂\t168687\n八多\t168688\ndiv/0\t168689\n受业\t168690\n美妻\t168691\n调幅波\t168692\n猪八戒威客网\t168693\n亚音\t168694\n阎学晶\t168695\n魔法少女爱吧\t168696\n翘尾\t168697\n注电\t168698\nBPMN\t168699\nfabrics\t168700\
n韩军\t168701\nmp3mp4\t168702\n南华路\t168703\n团餐\t168704\n沙丘\t168705\n法斯特\t168706\n靳海涛\t168707\n也就是说\t168708\nCAF\t168709\n探矿权\t168710\n105cm\t168711\n这里\t168712\n睡死\t168713\n王青松\t168714\nwsus\t168715\n华蟾素胶囊\t168716\n埃克森美孚公司\t168717\n中洲湾\t168718\nMFCC\t168719\nConscious\t168720\n乔枫\t168721\nZz\t168722\nABBYY\t168723\n推拿师\t168724\n酥饼\t168725\n迪迦\t168726\n植鞣\t168727\n北方四岛\t168728\n4.2m\t168729\nPClady\t168730\nE杯\t168731\ninsure\t168732\nFacetime\t168733\n大公资信\t168734\n电销机器人\t168735\nIto\t168736\n古长城\t168737\nwwwnnuu44\t168738\n291集\t168739\n历经\t168740\n杜伊斯堡\t168741\n棉靴\t168742\n鲁泰\t168743\n国网北京市电力公司\t168744\n威力导演\t168745\n氯化锌\t168746\npearls\t168747\n张菲\t168748\n石炭\t168749\nACER\t168750\n润泰\t168751\n雨水\t168752\npottery\t168753\n鲜儿\t168754\n电动助力车\t168755\n绝色双娇\t168756\n三型\t168757\n监事\t168758\n武医\t168759\n杨春湖\t168760\n篆\t168761\n购单\t168762\n烟碱\t168763\n2018年03月29日\t168764\n足球袜\t168765\n36Kr\t168766\n中国测绘网\t168767\n云南师范大学实验中学\t168768\n纺织工\t168769\n不矜\t168770\nplugs\t168771\n微信公众号认证\t168772\n苏绾\t168773\nTaq\t168774\n易语言吧_\t168775\n256MB\t168776\n绝干\t168777\n裸感\t168778\nxiaoshi\t168779\n90多岁\t168780\n影驰名人堂\t168781\nazure\t168782\n体照\t168783\n客坊\t168784\nEPA\t168785\nHAM论坛\t168786\njsg\t168787\n兰州路\t168788\n东方驾校\t168789\n文宗\t168790\nmoves\t168791\nvue-loader\t168792\n所前镇\t168793\n冒牌天神\t168794\n米家车载空气净化器\t168795\n流浪犬\t168796\n江与城\t168797\n优亿\t168798\n陆明\t168799\n沪宁高速公路\t168800\n湘北\t168801\n浮山\t168802\n防晒服\t168803\n秦斌\t168804\n355\t168805\n教体\t168806\ncatia\t168807\n13分钟\t168808\n陈文杰\t168809\n武器包\t168810\n小火\t168811\n青岛人事考试网\t168812\n冀中能源井矿集团\t168813\n摇身\t168814\nhelmet\t168815\n五棵松\t168816\n陈思和\t168817\nscaling\t168818\n百度翻译API\t168819\n睹物思人\t168820\n渔民\t168821\n水湾\t168822\n云赛智联\t168823\n每平米\t168824\n拉槽\t168825\ne3v2\t168826\n酷狗音乐包\t168827\n鄞州中学\t168828\nyaw\t168829\n湖北广播电视大学\t168830\n标准量\t168831\nsketchup\t168832\n零维\t168833\nAdidas三叶草\t168834\n农林牧渔业\t168835\n去年12月\t168836\n萧复\t168837\n黄飞鸿\t168838\nMID函数\t168839\n李鬼\t168840\n报修\t168841\n病毒性肺炎\t168842\n胡葆森\t168843\n圣安东尼奥\t16884
4\n奉俊昊\t168845\n三民主义\t168846\n管束\t168847\n受助\t168848\n磁共振\t168849\n天津海河\t168850\n压轴卷\t168851\n中国人寿财产保险\t168852\nzx50\t168853\n家庭暴力\t168854\n古荡街道\t168855\n爱敬\t168856\nvoiceover\t168857\n清瘦\t168858\nBriefs\t168859\n冷端\t168860\n风柱\t168861\n吴雪松\t168862\n编班\t168863\n气\t168864\nNb\t168865\n天禅\t168866\n主客\t168867\n流金\t168868\n王大锤\t168869\n理政关键词\t168870\n题头\t168871\n监委会\t168872\n方塔园\t168873\n王局\t168874\nproxool\t168875\n一注\t168876\n红豆薏米粥\t168877\nMKC\t168878\n拉箱\t168879\nmatllab\t168880\n习酒\t168881\n2017—2035年\t168882\n战地:硬仗\t168883\nautomapper\t168884\n苏少\t168885\n平凡的世界\t168886\n侗\t168887\n燧石\t168888\nhtlm\t168889\n邻接权\t168890\nILCE-6000微单TM\t168891\nduck\t168892\n国家人\t168893\n识汝不识丁\t168894\n头孢呋辛钠\t168895\n何承熹\t168896\nzedgraph\t168897\nmme\t168898\nddc\t168899\n何以解忧\t168900\n依\t168901\ncloudcompare\t168902\n一歩\t168903\n孙二娘\t168904\nLBJ\t168905\nciq\t168906\n无级\t168907\n黄装\t168908\n弹子\t168909\nmsvcp120.dll\t168910\n全国各地\t168911\n俄城\t168912\n21集\t168913\nbillie\t168914\nunused\t168915\nqgc\t168916\n博得\t168917\n奎文\t168918\n垮掉\t168919\noffice2003免费版\t168920\n星空物语\t168921\n旅版\t168922\n安远县\t168923\n涵曦\t168924\n宽甸\t168925\n399001\t168926\n格兰玛弗兰\t168927\n盖人\t168928\n神代\t168929\n累倒\t168930\n1022\t168931\nscans\t168932\n萌萌\t168933\n辅币\t168934\n连用形\t168935\nmr\t168936\n扮猪\t168937\n死党\t168938\n火焰纹章回声\t168939\n高凯\t168940\n产品质量法\t168941\n史铁生\t168942\n天镜\t168943\n白贝\t168944\n瑞风s3\t168945\nDarkest\t168946\n爆肝\t168947\n人剑\t168948\n切面\t168949\n棚\t168950\n疑罪从无\t168951\n华南师范大学附属外国语学校\t168952\n建筑八大员\t168953\n国际货物运输保险\t168954\n泰伯利亚\t168955\n港资\t168956\n深圳恒生医院\t168957\nJVC\t168958\n五缘\t168959\n深圳旅游网\t168960\n人公\t168961\n伊多\t168962\n奔路\t168963\n传输\t168964\nトリカゴ\t168965\n推油\t168966\n六国论\t168967\n隆\t168968\nMatt\t168969\n警世\t168970\n毕业论文查重率\t168971\n液压件\t168972\n储钱罐\t168973\n哈尔创世纪\t168974\nRotation\t168975\nlstlisting\t168976\nPPmoney\t168977\n徐海东\t168978\n小国\t168979\n2018-01-16\t168980\n电磁灶\t168981\n初唐\t168982\n0.doc\t168983\n上古卷轴5重制版\t168984\n浮浮\t168985\n李渡\t168986\n谢和平\t168987\n矮小\t168988\niPho
ne6/6s\t168989\n回目\t168990\n春宫\t168991\n银星\t168992\ntuxiang\t168993\n施工组织设计编制\t168994\n沈阳移动\t168995\n金程网校\t168996\n卧室\t168997\n第二片\t168998\n多大\t168999\nJava泛型\t169000\ntextbox\t169001\nVue2.x\t169002\n众邦银行\t169003\n网易有道翻译蛋\t169004\njieya\t169005\n楼阳生\t169006\n欧诺S\t169007\n紧要关头\t169008\n问题库\t169009\n事业部\t169010\n80#\t169011\n黄埔军校\t169012\n中国国际货运航空\t169013\n灵界\t169014\n贵州茅台酒\t169015\n螺丝粉\t169016\n博伊斯\t169017\n藤野\t169018\n先跑\t169019\n央求\t169020\n独特魅力\t169021\n柏青\t169022\n翁牛特旗\t169023\n宅基地\t169024\n柘城县人民政府\t169025\ndead\t169026\n松野\t169027\n滚柱\t169028\n总场\t169029\nAspect\t169030\n10粒\t169031\n20170907\t169032\n刑警\t169033\n陈星\t169034\n吃一惊\t169035\njlink\t169036\n拓利\t169037\nwebUploader\t169038\n玄德\t169039\nscsi\t169040\nhive\t169041\nBeethoven\t169042\n大公开\t169043\n中华城市\t169044\n弹药\t169045\nSqlserver2008\t169046\nKruskal算法\t169047\n一蓑\t169048\n陈文辉\t169049\n电子面\t169050\n新一代信息技术\t169051\n江门市人力资源和社会保障局\t169052\n艺术\t169053\n偈颂\t169054\n库里南\t169055\n中国人民代表大会\t169056\n资源社区\t169057\n白欣怡\t169058\nrecycling\t169059\n邢小北\t169060\nCAD迷你看图软件\t169061\n联立方程\t169062\n华北平原\t169063\n一针见血\t169064\n部展\t169065\n位高权重\t169066\nlouisiana\t169067\n季末\t169068\n西游记全集\t169069\n_湖心亭看雪客\t169070\n杰森斯坦森\t169071\n好团网\t169072\n大妈\t169073\n比武场\t169074\n卡帕多西亚\t169075\n兴茂\t169076\n低密度脂蛋白胆固醇\t169077\ntitler\t169078\n刚果河\t169079\nofo共享单车\t169080\n下联\t169081\n萨古鲁\t169082\n秋天\t169083\n黄卫\t169084\n绳之以法\t169085\n不换\t169086\nmakiyo\t169087\n餐券\t169088\n风俗画\t169089\n外发光\t169090\n欧文\t169091\nVV5论坛_汽车之家论坛\t169092\n辅音\t169093\ndoom\t169094\n快件箱\t169095\n庶女胜者\t169096\n学家\t169097\n一二三四线\t169098\nVID\t169099\n许纯美\t169100\nw2016\t169101\n集贸\t169102\n塔兹米\t169103\n定位误差\t169104\nOPENWRT\t169105\n2611\t169106\n信安\t169107\n奇飞\t169108\nNXPowerLite\t169109\n玉簪记\t169110\n伊旗\t169111\nady映画\t169112\nchuu\t169113\n四月初\t169114\ndumped\t169115\n中健\t169116\n国母\t169117\n不仅\t169118\n追平\t169119\n贵安\t169120\n收讫\t169121\n上海复旦\t169122\n既然\t169123\n保外就医\t169124\n商战剧\t169125\n裸根\t169126\n600576\t169127\nAssessment\t169128\n炸弹人\t169129\npouch\
t169130\n15袋\t169131\n年审\t169132\n白裤瑶\t169133\n重庆广电\t169134\n私心\t169135\n菜园子\t169136\n棋盘山\t169137\n粉病\t169138\n天上再见\t169139\nCPSM\t169140\n不耐烦\t169141\n50平方\t169142\n尤妮佳\t169143\n美容院\t169144\nZ820\t169145\nconfidence\t169146\n阿伦·艾弗森\t169147\n公祖\t169148\n魔线\t169149\n好姐妹\t169150\n复辟\t169151\n霍霍\t169152\n传播类\t169153\n陈小艺\t169154\njquer\t169155\n北京地坛医院\t169156\nGimme\t169157\n主胎\t169158\n小叔子\t169159\n两江新区\t169160\n菏泽市人力资源和社会保障局\t169161\n郑明明\t169162\n永慕\t169163\nautojs\t169164\n建员\t169165\n白洛因\t169166\n莲华\t169167\nservi\t169168\n熟练度\t169169\ndjvu\t169170\n薇姿\t169171\n奔头\t169172\n宪兵\t169173\n书家\t169174\nHINT\t169175\n参赛选手\t169176\n上海滩网\t169177\n空姐们\t169178\n新郑综合保税区\t169179\nTAYLOR\t169180\n调车\t169181\n共识\t169182\n认祖归宗\t169183\n戴源\t169184\n固定片\t169185\n华夏建设\t169186\n十一部\t169187\nWI-FI\t169188\n严肃性\t169189\n嫩乳\t169190\n天下父母心\t169191\n巴铁\t169192\nTortoise\t169193\n屌丝小说网\t169194\nC++运算符重载\t169195\n兴庆宫公园\t169196\n阴干\t169197\nSPIRIT\t169198\n萜烯树脂\t169199\n李小\t169200\n魔怨\t169201\n国力\t169202\nCS408\t169203\n金庸群侠传ol\t169204\n微爱豆\t169205\n降重\t169206\n1.0t\t169207\n英台\t169208\n万商\t169209\nvue-i18n\t169210\n雌雄同株\t169211\n南昌市国土资源局\t169212\n海门房产网\t169213\nMaxDOS\t169214\n住宅式\t169215\n菏\t169216\n府谷县\t169217\n纯文学\t169218\n美的历程\t169219\n腹杆\t169220\n贝克特\t169221\n姚若龙\t169222\n墙根\t169223\n辽委\t169224\n中国能源建设集团有限公司\t169225\n维生素a\t169226\n雨前\t169227\n1.8G\t169228\n中国教育学会\t169229\n地方人民政府\t169230\n钻进\t169231\n澳博\t169232\n王振军\t169233\nNexus5\t169234\n十六期\t169235\nAsin\t169236\nAlta\t169237\n个头\t169238\n浮顶罐\t169239\n集中地\t169240\nASAP\t169241\nIs\t169242\n1群\t169243\n偃月\t169244\n白雪梅\t169245\n月日\t169246\n密目\t169247\n2K屏\t169248\n增值税票\t169249\n控性\t169250\n因特拉\t169251\n全面建成小康社会决胜期\t169252\n宝马i3\t169253\n寒露\t169254\n700张\t169255\n维宏\t169256\n秋元康\t169257\n蒸鸡\t169258\nCreation\t169259\n鸡东县\t169260\n脑疼\t169261\n唐海\t169262\n批发业\t169263\n米桶\t169264\n那曲县\t169265\n多亿\t169266\nspectacular\t169267\n常熟银行\t169268\n网络视频监控\t169269\n室\t169270\n2017年4月27日\t169271\n猪肉片\t169272\n武庚纪2\t169273\nharmony\t169274\n移动POS机\t16
9275\n服服\t169276\n朱云峰\t169277\n超凡战纪\t169278\n超低空\t169279\n众鑫\t169280\n6000_\t169281\nmail函数\t169282\n豆面\t169283\n整流滤波\t169284\nprograming\t169285\n马自达睿翼\t169286\nromeo\t169287\n真标\t169288\n王黎光\t169289\n洋山三期\t169290\n立命\t169291\n股票学习网\t169292\nKana\t169293\n哆点\t169294\n钮子\t169295\n2018042期\t169296\n碎片\t169297\n团结新村\t169298\n非彤\t169299\n金克木\t169300\n男足\t169301\n报到证\t169302\n南开幼儿园\t169303\n牛叔\t169304\n美领馆\t169305\n沈凌\t169306\n第114期\t169307\n停车场\t169308\n桔色\t169309\n托雷斯\t169310\nASX劲炫\t169311\n诺瓦\t169312\n园林式\t169313\n百里者\t169314\n碳酸锰\t169315\n影驰固态硬盘\t169316\n约翰列侬\t169317\n大火\t169318\n地级\t169319\n等距\t169320\nvelite5\t169321\n掉以轻心\t169322\n拉米雷斯\t169323\n苏州联通\t169324\nC++技术网\t169325\n青春小说\t169326\n364\t169327\n安徽农业大学\t169328\n惜售\t169329\n1628\t169330\n匠魂\t169331\n股份公司\t169332\n网金社\t169333\n人民日报中央\t169334\nnino\t169335\nTwang\t169336\n广元市工商行政管理局\t169337\n2042\t169338\n一券\t169339\n李光洙\t169340\nPR社\t169341\n御岛明日香\t169342\n愚不可及\t169343\n烧\t169344\ncoupe\t169345\n养儿\t169346\n昨日\t169347\n渔人码头\t169348\nwisc\t169349\nansy\t169350\nLebron\t169351\n汕头八合里海记牛肉店\t169352\n58家\t169353\n何维\t169354\ndedicate\t169355\n8.6.1\t169356\n降下来\t169357\n陈星宇\t169358\n苟芸慧\t169359\n喷淋机\t169360\n杀灭\t169361\n自然村\t169362\n天津地铁9号线\t169363\n松井玲奈\t169364\n烧锅\t169365\n第A16\t169366\n面瘫\t169367\n忧民\t169368\n短调\t169369\n玲珑轮胎\t169370\n华中\t169371\n清真化\t169372\n巴蒂\t169373\nleakage\t169374\n湘阴\t169375\n关头\t169376\n灵妖记\t169377\nwww.soso021.com\t169378\n渗\t169379\n维秘秀\t169380\n满背\t169381\n横眉\t169382\n杭州市市\t169383\nSauvignon\t169384\n武汉地产\t169385\n合肥新桥机场\t169386\n行政长官\t169387\n条体\t169388\n氨茶碱\t169389\ntpp\t169390\nA2_山西新闻网\t169391\n皙之密\t169392\n2.3版\t169393\n若尔盖县\t169394\n情系\t169395\n中建材\t169396\n人力部\t169397\n兴证\t169398\nwin1064位\t169399\n螺丝杆\t169400\n修罗佛本是道\t169401\n耐震压力表\t169402\n虚拟人生\t169403\n艺海\t169404\n斯特拉托斯\t169405\n铁血阿郎map\t169406\n电动物流车\t169407\nx1carbon\t169408\n玛祖卡\t169409\n海绵蛋糕\t169410\n声威\t169411\n五脏\t169412\n贼警\t169413\n泉州\t169414\nsfc模拟器\t169415\n初子\t169416\n知识类\t169417\n山东省政府办公厅\t169418\n商者\t169419\n78
18\t169420\n溪亭\t169421\n楼忠福\t169422\n生死时刻\t169423\n特务J\t169424\n00000002\t169425\n奎宁\t169426\nmybats\t169427\n552kmm\t169428\n老太太\t169429\n肱骨外上髁炎\t169430\n贸仲\t169431\n阀芯\t169432\n打望\t169433\nSHU\t169434\n句号\t169435\n小灿\t169436\n市场调查报告\t169437\nfft2\t169438\n何不\t169439\ngigabit\t169440\n施工方\t169441\n一夫二妻\t169442\n投委会\t169443\n野草\t169444\n海之号角\t169445\n黄泉篇\t169446\nMPU\t169447\n中国社会科学院哲学研究所\t169448\n海南建设\t169449\n精灵宝可梦go\t169450\n金岭矿业\t169451\n太阳能充电宝\t169452\n钱坤\t169453\n河南云飞科技\t169454\n男同性恋异性恋器具\t169455\n皮格马利翁\t169456\n黄油泥\t169457\n威廉一世\t169458\n草球\t169459\n第14号\t169460\n5000多元\t169461\n英雄王二小\t169462\n望湘园\t169463\n租借\t169464\n美女如云\t169465\n酷派大神f2\t169466\n标摊\t169467\n政委\t169468\n税负率\t169469\n内蒙古自治区公安厅\t169470\n吴建平\t169471\n0秒\t169472\nsound\t169473\n治疆\t169474\n63932295\t169475\n爱斯基摩人\t169476\n长体\t169477\n合唱版\t169478\n海天蚝油\t169479\n给水排水管道工程施工及验收规范\t169480\n非递归\t169481\n死眼\t169482\n几厘\t169483\n凶残版\t169484\n青苹果\t169485\n麻醉药品\t169486\n红豆曲\t169487\n局部变量\t169488\n鸠兹古镇\t169489\n藏机\t169490\n饱经\t169491\nFlash视频播放器\t169492\n白鱀豚\t169493\n模拟滤波器\t169494\n综艺咖\t169495\ncanshu\t169496\n甘井子\t169497\n新风潮\t169498\n斯诺克世锦赛\t169499\n天河北\t169500\n过目\t169501\n算完\t169502\nWordNet\t169503\n调准\t169504\n零部\t169505\n肉火烧\t169506\n换心\t169507\n王玉玲\t169508\n新笑傲江湖\t169509\nSteelSeries\t169510\n厦门市委\t169511\n泥头\t169512\naimee\t169513\ntika\t169514\n中电海康\t169515\n罗马全面战争吧\t169516\n挥出\t169517\n故都\t169518\n运输局\t169519\n年少\t169520\n塞尔丘克\t169521\nguice\t169522\n单膝\t169523\n取物\t169524\n中国扶贫网\t169525\nTitan\t169526\n优惠卷\t169527\n遗子\t169528\n嘉兴南湖\t169529\n仙女山镇\t169530\n阴道分泌物\t169531\n碟形\t169532\n细胞分裂6:黑名单\t169533\n平砍\t169534\n売\t169535\n思齐\t169536\n红太阳集团\t169537\n导风板\t169538\n张瑜\t169539\n选介\t169540\nwtc\t169541\n总机\t169542\n熟食柜\t169543\naggregate\t169544\nbuzhi\t169545\n第一妇婴保健院\t169546\n希金斯\t169547\n莲花闹海棠\t169548\n增江街\t169549\n猫们\t169550\nmat文件\t169551\n杭州\t169552\n克烈\t169553\n丹佛大学\t169554\n汪磊\t169555\nDungeon\t169556\nRicher\t169557\n中国航空油料集团公司\t169558\n营口理工学院\t169559\n尾牙\t169560\n16nm\t169561\n危机边缘\t169562\
n蛙人\t169563\n双相情感障碍\t169564\nu2414\t169565\n上海口腔医院\t169566\n史玉杜甫\t169567\n局路\t169568\n遂宁市中心医院\t169569\n成都地铁1号线三期\t169570\n软尾\t169571\ndjango数据库\t169572\n王阳白鹿\t169573\n红警2\t169574\n通信\t169575\n作妖记\t169576\n观音山国家森林公园\t169577\n美少女梦工厂4\t169578\nFocus-Fe\t169579\n3362\t169580\n木木\t169581\nHBsAg\t169582\nnova2plus\t169583\n名风\t169584\n恐袭波士顿\t169585\n落荒而逃\t169586\n汉办\t169587\n京东网\t169588\n宇多田ヒカル\t169589\nBrightness\t169590\n宁波庄市\t169591\n血月\t169592\n第八十六章\t169593\n东百集团\t169594\n闺阁\t169595\n岂非\t169596\n联网型\t169597\nPacemaker\t169598\n贵州职业技术学院\t169599\n20毫克\t169600\nMP3/MP4\t169601\nbrpc\t169602\n南大和园\t169603\nwar格式\t169604\nえ\t169605\n邱胜翊\t169606\n吃赛\t169607\nsunxi\t169608\n深圳市宝安区人民医院\t169609\n亚于\t169610\n金能\t169611\n梦幻神武\t169612\n下裙\t169613\n湖南师范大学\t169614\n街景\t169615\n碧峰峡\t169616\n128m\t169617\n世纪金源\t169618\n巴乔\t169619\nchenyucong\t169620\n芜湖长江大桥\t169621\n令妃\t169622\nOCR识别软件\t169623\n嵊泗岛\t169624\n猪脸\t169625\n大学生心理健康教育\t169626\n第51届\t169627\n小玉\t169628\n公调\t169629\n哀求\t169630\nincreasing\t169631\n数来宝\t169632\n日杂用品装修|一起网\t169633\nxiaoge\t169634\n如琢\t169635\n曾德\t169636\n斗鱼直播下载中心\t169637\ngrubby\t169638\n澳门新葡京\t169639\n三八小说网\t169640\n经济管理\t169641\n太极\t169642\n第几\t169643\nE90\t169644\n钢炼\t169645\njsw\t169646\nWord文本框\t169647\n商流\t169648\nunder\t169649\n区头\t169650\n寻子\t169651\n分线器\t169652\n胡因梦\t169653\n科普性\t169654\nsymbian\t169655\nwebview\t169656\nccl\t169657\n拐骗\t169658\n谒\t169659\n川越\t169660\nDrinks\t169661\n长途飞行\t169662\n安县\t169663\n矶崎新\t169664\n徐文荣\t169665\nJEECMS\t169666\n美纳斯\t169667\n校官\t169668\n病原菌\t169669\n物业管理处\t169670\ndifficulties\t169671\n三国志11威力加强版\t169672\n马岱\t169673\namples\t169674\n国语\t169675\n辞格\t169676\ndinosaur\t169677\n省委宣传部\t169678\n定密\t169679\n劳春燕\t169680\n斐林\t169681\n和光堂\t169682\nUmora\t169683\n838888\t169684\n失业金\t169685\nupdown\t169686\n新线\t169687\n同里古镇\t169688\n孕妇钙片\t169689\n8255\t169690\n调令\t169691\n1350元\t169692\n淘宝旺旺\t169693\n约化\t169694\n导文\t169695\n冲日\t169696\n大疆\t169697\n巴托尼亚\t169698\n八婆\t169699\n负离子发生器\t169700\nWWW.1200SF.COM\t169701\n哈尔滨市司法局\t1
69702\n战斧式\t169703\n五鼠\t169704\n导引\t169705\n凉拖\t169706\n屏霸\t169707\n看电影学英语\t169708\n削发\t169709\n晶源\t169710\n大宝剑\t169711\napache-cxf\t169712\n北京天健医院\t169713\n不可竞争\t169714\n大众神车\t169715\nZ2/Z2\t169716\n煎药\t169717\n徽县\t169718\n攘攘\t169719\n海珠广场\t169720\n夕会\t169721\n莱姆病\t169722\nautocad2009\t169723\n神尔\t169724\n89家\t169725\n学习方\t169726\n党的十八届三中全会\t169727\n浙江省住房和城乡建设厅\t169728\n足印\t169729\n希雅\t169730\n资治\t169731\nleftjoin\t169732\n菲尔兹奖\t169733\n深圳高级中学\t169734\nBianca\t169735\n鲤城区\t169736\n郁浩然\t169737\n水上乐园\t169738\n皮六\t169739\n戟\t169740\n燕乐\t169741\nProficiency\t169742\n减资\t169743\nwww.zuanke8.com\t169744\n注册金融分析师\t169745\n万家福\t169746\nw酒店\t169747\n25项\t169748\n暴兵\t169749\n1.0.2.1\t169750\n星骋\t169751\n停售\t169752\n沈阳妈妈网\t169753\n白沙园区\t169754\nobama\t169755\n叮嘱\t169756\nCCI\t169757\n趋势线\t169758\n农综\t169759\n法防\t169760\n阻生牙\t169761\n县科技局\t169762\n凝乳\t169763\n甘肃机电职业技术学院\t169764\n玻璃珠\t169765\n老太婆\t169766\n360m\t169767\n玛吉斯轮胎\t169768\nT8\t169769\n影迷\t169770\n汉语编程\t169771\n志书\t169772\n古体\t169773\n七八个\t169774\n声会\t169775\n用来看\t169776\n友芝友\t169777\n成都金堂\t169778\n李靓蕾\t169779\nlibexec\t169780\n广东省人民政府外事办公室\t169781\n北电\t169782\n耻物\t169783\n陈剑平\t169784\nwsrep\t169785\n公园路\t169786\n敌对\t169787\n中国民用航空\t169788\n卡航\t169789\n西和吧\t169790\n西山温泉\t169791\n韦伯斯特\t169792\n科林费斯\t169793\n杨受成\t169794\n朝阳区社保中心\t169795\n无穷无尽\t169796\n揭露\t169797\n鲸鱼沟\t169798\n高危型\t169799\n中华人民共和国发票管理办法实施细则\t169800\npatreon\t169801\n40节\t169802\n雷赛\t169803\n思维导图\t169804\n红麻雀\t169805\n上海金茂大厦\t169806\n庐山区\t169807\n泰森多边形\t169808\n银泰网\t169809\n嘴形\t169810\n逆来顺受\t169811\n20150224\t169812\njforum\t169813\n应用写作\t169814\n龙潭路\t169815\n渡头\t169816\n扩散\t169817\n美滋滋\t169818\n相望\t169819\n7轮\t169820\n安意lady\t169821\n京口区\t169822\nRX\t169823\n臭骂\t169824\n魔珠\t169825\n1013\t169826\n深蓝\t169827\n漠北\t169828\n粘滞键\t169829\n普通高等教育\t169830\n湖父\t169831\n抢亲\t169832\npek\t169833\n母狮\t169834\n怡宝\t169835\n3度\t169836\n巨魔\t169837\n护管\t169838\n大学门卫老董\t169839\n黑河市\t169840\nDPL\t169841\n796\t169842\n14步\t169843\n舒伯特\t169844\n悬挂链\t169845\n黑寡妇蜘蛛\t169846\nList
View\t169847\n花花衣\t169848\nPoco\t169849\nwepy\t169850\n360wifi\t169851\n朱村\t169852\n2018-04-29\t169853\n前2个月\t169854\n夜行衣\t169855\n苍海\t169856\nAPN\t169857\njavaG\t169858\n驭剑士\t169859\n吉吉影音av资源站\t169860\n正文页\t169861\n上班后\t169862\n老花眼\t169863\n220mm\t169864\n_______\t169865\ncalvin\t169866\n高镇\t169867\n波茨\t169868\n大学生恋爱观\t169869\n宫崎骏\t169870\n美国月子中心\t169871\n右轮\t169872\nSumomo\t169873\n中编办\t169874\n晾脚\t169875\n感烟探测器\t169876\n八万人体育场\t169877\n恐怖都市\t169878\n中共吉林省委\t169879\nDCS\t169880\n1.0_\t169881\n双叒叕\t169882\n汕尾职业技术学院\t169883\n三抗\t169884\n医德\t169885\n宇文拓\t169886\n4000流明\t169887\n地心历险记\t169888\n陋\t169889\n友田彩\t169890\n食评\t169891\n各值\t169892\n平安养老保险\t169893\n每夜\t169894\n散线\t169895\n山海造梦西游\t169896\n20161227\t169897\n慢性阑尾炎\t169898\n车马炮\t169899\n刘杀鸡\t169900\naccessor\t169901\n玉皇山南\t169902\n春满园\t169903\n大花园\t169904\n韧劲\t169905\n20150123\t169906\n2016年5月1日前\t169907\n静园\t169908\n舞蹈系\t169909\n壬辰日\t169910\ngiga\t169911\ncfree5\t169912\n公主岭\t169913\n不非\t169914\n上海市规划和国土资源管理局\t169915\n狐兔\t169916\n鸦片战争\t169917\nverilog\t169918\nwars\t169919\n电动观光车\t169920\n8887\t169921\n斜齿\t169922\n牡羊座\t169923\n狮山\t169924\n单身潮\t169925\n元宝机\t169926\n原素\t169927\nNBS\t169928\n7.5.8\t169929\n雅安\t169930\nforests\t169931\n上海振华重工\t169932\n無彈窗小說閱讀網\t169933\n磨具\t169934\n江成\t169935\n积玉桥\t169936\n粒子滤波\t169937\n战国志\t169938\n欧兰德\t169939\n作坊\t169940\n图污\t169941\n华晨雷诺\t169942\n套头衫\t169943\n4310\t169944\n南中\t169945\n马铠\t169946\n母親\t169947\n玉林市水利局\t169948\n55公斤\t169949\n课堂教学\t169950\n黄芪甲苷\t169951\n英良\t169952\n潮酷\t169953\n洗碗机\t169954\n钓具\t169955\n上海锦天城\t169956\nchip\t169957\nXcopy\t169958\n诗笺\t169959\n苛刻\t169960\nGRPC\t169961\n南京大学社会学院\t169962\n第几段\t169963\n组通\t169964\n施教\t169965\n新疆政府\t169966\n红鸾\t169967\n360安全路由\t169968\n博人传之火影忍者次世代\t169969\nCCTV音乐频道\t169970\nnewspapers\t169971\n砖雕\t169972\n气泡式\t169973\nhimall\t169974\n新南村\t169975\nKindergarten\t169976\n澳州\t169977\npdf2cad\t169978\nwebkit-scrollbar\t169979\n601009\t169980\n天格地板\t169981\n10万斤\t169982\n架立\t169983\ndutch\t169984\n排扣\t169985\n死兵\t169986\n灵江\t169987\n
ublox\t169988\n野生鱼\t169989\n弥漫\t169990\nletting\t169991\n中电海康集团有限公司\t169992\n老古\t169993\n幽境\t169994\n塑料撕碎机\t169995\n斯拉夫\t169996\n靠垫儿\t169997\n主主\t169998\n增值税一般纳税人\t169999\n夏利N5\t170000\n民建\t170001\n中关村科技园\t170002\n4n\t170003\n被冻\t170004\n云雾山\t170005\n检定仪\t170006\n人力资源证\t170007\n仙尘\t170008\n东仙\t170009\n军情五处\t170010\nRook\t170011\n喷胶机\t170012\n有轨电车1号线\t170013\nlucking\t170014\n郭峰\t170015\n日向宁次\t170016\n武汉三中\t170017\n乙醛酸\t170018\n赛普瑞斯\t170019\nrestframework\t170020\n王府井百货大楼\t170021\n100m2\t170022\n榻\t170023\n判词\t170024\n苏交科集团股份有限公司\t170025\n不死者\t170026\n矿\t170027\n好好好\t170028\n旧利\t170029\npof\t170030\nFestivals\t170031\n氧化炉\t170032\n房本\t170033\nglib\t170034\n5512\t170035\ntupian\t170036\n考点\t170037\n侯云德\t170038\nfittings\t170039\n汉修\t170040\n航天机电\t170041\n户松遥\t170042\n填词\t170043\n环县\t170044\n转战\t170045\nnavicat\t170046\n2a2\t170047\n地理信息系统专业\t170048\nAutoCAD—地信网\t170049\nPushing\t170050\nGeForce\t170051\n上诉人\t170052\n兴业银行股份有限公司\t170053\n第001集\t170054\nBurmester\t170055\n天德\t170056\n9.21\t170057\nSuppress\t170058\n女职工劳动保护规定\t170059\n1258\t170060\n荷兰牛栏\t170061\n四大点\t170062\n通号\t170063\npyscripter\t170064\n不能淫\t170065\n齐藤美优\t170066\n博金\t170067\ninfection\t170068\n外版\t170069\n单装\t170070\n奉浦\t170071\nChecklist\t170072\nJACKET\t170073\n韦唯\t170074\n齐威王\t170075\n另人\t170076\n跪下来\t170077\n无偿\t170078\n郭台铭\t170079\n菏泽一中\t170080\n20170329\t170081\n海尔森\t170082\n神武门\t170083\nwowgirls\t170084\nnk150\t170085\n纱织\t170086\n渉\t170087\n银行流水账\t170088\nXOR\t170089\n信息卷\t170090\n畅易阁\t170091\n天空岛\t170092\n安迎\t170093\n一轴\t170094\n蜜蜂\t170095\n新洲路\t170096\n东魏\t170097\n时空之轮\t170098\n爱就爱\t170099\n卓德\t170100\n湖南省高速公路管理局\t170101\nLOCK\t170102\n权力的游戏第五季\t170103\n7个\t170104\n枣庄八中\t170105\n删除\t170106\n硕士服\t170107\nYESTON\t170108\n楚天枢\t170109\n十字头\t170110\n此案\t170111\n2000集\t170112\n3-5\t170113\nZUN\t170114\n6550\t170115\n熊涛\t170116\n播州区\t170117\n机动战队\t170118\n珠串\t170119\n荣威RX5论坛论坛\t170120\nWEDO\t170121\nWindows7\t170122\nHazell\t170123\n6.5元\t170124\n徐梦圆\t170125\ngxz\t170126\n武汉市洪山区人民政府\t170127\nリ\t1701
28\n阿莎摩尔\t170129\n乐发\t170130\n哪两个\t170131\n50多度\t170132\nBrittany\t170133\n台北故宫\t170134\n思维\t170135\n云南虫谷\t170136\nRisk\t170137\n5153\t170138\n很干\t170139\n三藩之乱\t170140\n爱丽丝漫游仙境\t170141\n真由理\t170142\n白斑病\t170143\n毛泽东思想与中国特色社会主义理论体系概论\t170144\n作者\t170145\nucb\t170146\n暮光之城3\t170147\nServer2017\t170148\n40斤\t170149\n澳洲银行\t170150\nPennsyl\t170151\n反顾\t170152\n防毒面具\t170153\nMeshlab\t170154\nCOSTA\t170155\n小寒\t170156\n绘本剧\t170157\n四不\t170158\notr\t170159\n混元\t170160\n思源宋体\t170161\n春管\t170162\n不卖\t170163\n丝袜会所\t170164\n木根\t170165\n2.18.1\t170166\n豪迪\t170167\n男妃\t170168\n王小屯\t170169\n湖南省干部教育培训网络学院\t170170\nNBA2k17\t170171\n1277\t170172\n20150727\t170173\n地狱神探2\t170174\n论战\t170175\n油耗\t170176\ncad2016\t170177\n琐碎\t170178\n烟锁\t170179\n6RF\t170180\nCopier\t170181\n要情\t170182\n酒小七\t170183\n尉犁县政府\t170184\n舞ワイフ\t170185\n天骄\t170186\npct\t170187\n乌纱帽\t170188\n仙术\t170189\nkatsuni\t170190\n王虎\t170191\n欧阳修\t170192\n一幼\t170193\n郑晓龙\t170194\n妻子\t170195\n公卫执业医师\t170196\n法学\t170197\n歌名\t170198\nWorkNC\t170199\n娇娘医经\t170200\nRaman\t170201\n15孔\t170202\n易玩通\t170203\n皮肤镜\t170204\n整形医生\t170205\n住宅专项维修资金\t170206\nGITHUB\t170207\n白茅\t170208\n酷睿i9处理器\t170209\n商业界\t170210\n吴剑\t170211\nfeiyue\t170212\n冰水机\t170213\n李扬\t170214\n胶州湾大桥\t170215\nkinkikids\t170216\n雪天\t170217\n3.5下\t170218\ncve\t170219\n投顾\t170220\n奔驰V260L\t170221\n忙里偷闲\t170222\n中圆\t170223\n单联\t170224\n恋男乱女\t170225\n尹菲\t170226\nSmartphone\t170227\n连丽如\t170228\n12月12日\t170229\n材料费\t170230\n蒋明\t170231\nkinder\t170232\n20170120\t170233\n冰月\t170234\n彩宝贝\t170235\nRedirect\t170236\nyqie\t170237\n生死攸关\t170238\n荣耀商城\t170239\n万里牛\t170240\n电子科大\t170241\n协同过滤\t170242\n渭塘镇\t170243\nnote2吧_\t170244\nEXING\t170245\n迈途\t170246\nGAOQS\t170247\n百胜软件\t170248\n10几次\t170249\n飞跑\t170250\ngalg\t170251\n阊门\t170252\n松点\t170253\n吉利\t170254\n东桥镇\t170255\n火星哥\t170256\n大触们\t170257\n月台\t170258\n房地产有限公司\t170259\nic\t170260\n监生\t170261\n茂港区\t170262\n300斤\t170263\n九月初三\t170264\n三批次\t170265\ngrove\t170266\n菊叶\t170267\n淘宝镇\t170268\n91porm\t170269\n颠沛\t170270\n废人\t170271
\n空桩\t170272\n黄龙县\t170273\nR2014a\t170274\nCLK\t170275\n半升洞码头\t170276\n陈永馨\t170277\nintrins\t170278\n血流成\t170279\n品番\t170280\n永进\t170281\n论语\t170282\n哆啪啪\t170283\n罗沙\t170284\nASTRONEER\t170285\nCCAV\t170286\nmhk\t170287\n刑事侦查学\t170288\n电土\t170289\n富平县\t170290\n深崎暮人\t170291\n前策\t170292\n大连中国移动\t170293\n震憾\t170294\n鸭嘴杯\t170295\n杨慎\t170296\nprovide\t170297\n逶迤\t170298\n伊利金领冠\t170299\n长城风骏5\t170300\n西雅图机场\t170301\n通胀\t170302\n西山区\t170303\n重庆工商职业学院\t170304\n六扇门风云\t170305\n六库\t170306\n享购\t170307\n科技小制作小发明\t170308\nPowerpoint\t170309\n倚天屠龙记之魔教教主\t170310\n间隔号\t170311\nP2P\t170312\n靓汤\t170313\n新浜镇\t170314\n鼻咽\t170315\n女王样\t170316\n植物大战僵尸online\t170317\n周内\t170318\nvz\t170319\n黄石市\t170320\n288元\t170321\n多币种\t170322\n返点\t170323\n教养\t170324\n沈复\t170325\n重光\t170326\n铝棒\t170327\n孙葆洁\t170328\ninline函数\t170329\n中国畜牧网\t170330\nTAL\t170331\n凯茵新城\t170332\n第109\t170333\n握住\t170334\n共产党党\t170335\n北京装饰\t170336\n孙科\t170337\n铜仁火车站\t170338\n施工流\t170339\n狄仁杰断案传奇\t170340\n钱峰雷\t170341\n摇曳花瓣爱落泪\t170342\n水围村\t170343\n王玫\t170344\n宿州市立医院\t170345\n湘湖站\t170346\n带口\t170347\n微党课\t170348\n口痒\t170349\n利仁\t170350\n上海金融新闻网\t170351\n蒋俊\t170352\n布丁粉\t170353\n喜服\t170354\n上海应用物理研究所\t170355\n逃票\t170356\n钱红\t170357\nGTA\t170358\n李春芳\t170359\n伊索寓言\t170360\n北京航空航天博物馆\t170361\n软景\t170362\n桶型\t170363\n三七二十一\t170364\n生儿\t170365\n注册监理工程师\t170366\nhelio\t170367\n甲胺\t170368\n写给\t170369\n嬴政\t170370\ncrtmpserver\t170371\nm370\t170372\n小聪\t170373\n23.2\t170374\n甩脂机\t170375\n安利蛋白粉\t170376\n大骨汤\t170377\nSlutty\t170378\nrap2\t170379\n修业\t170380\n云想衣裳花\t170381\n姓李\t170382\nAustria\t170383\n下回\t170384\nkygo\t170385\n婴儿帽\t170386\n上品物语\t170387\n吴振宇\t170388\n贵州人才网\t170389\n利他主义\t170390\n百分之5\t170391\n固定电话机\t170392\n伊卡尔迪\t170393\n肉食性\t170394\n求同存异\t170395\n2018.04\t170396\n千层饼\t170397\n220&\t170398\n星星雨\t170399\n42所\t170400\n学海无涯\t170401\n李美妍\t170402\n第四格\t170403\nMain函数\t170404\n科考队\t170405\n为国捐躯\t170406\n描述\t170407\n货港\t170408\n渤海路\t170409\n碎戏\t170410\n洛水\t170411\n传动器\t170412\never\t170413\nISO9001质量管理体系认证\t170414\n快彩\t170415\n蛋
尼\t170416\nArmour|安德玛\t170417\n博罗龙溪\t170418\n演习\t170419\nScorpion\t170420\n20170418\t170421\n房术\t170422\ndbe\t170423\n西单\t170424\nk币\t170425\n试点地区\t170426\n电手\t170427\n铁威马\t170428\nExtractor\t170429\nRene\t170430\n更爽\t170431\n直排\t170432\n贝壳包\t170433\n国农科技\t170434\n刘丽娜\t170435\n该省\t170436\n片酬\t170437\nchanges\t170438\n指导员\t170439\nm1136mfp\t170440\n南京汽车\t170441\n80a\t170442\nbeaten\t170443\n制帽\t170444\n列列\t170445\n流武器\t170446\n中华人民共和国环境保护法\t170447\n这种病\t170448\n东宫\t170449\n浙江越秀外国语学院\t170450\nfal\t170451\n微笑\t170452\n弗朗哥\t170453\n张熙媛\t170454\ncs6\t170455\n100多万行\t170456\n敲碎\t170457\n巡\t170458\n徐建华\t170459\n图卷\t170460\n长兴镇\t170461\nQC3.0\t170462\n1849\t170463\n艺术类\t170464\n厦门仲鑫航自动化设备有限公司\t170465\n巴厘岛机场\t170466\n钢砼\t170467\n拜托\t170468\nmathematical\t170469\n寇世勋\t170470\n好多\t170471\n丹朱\t170472\n极品芝麻官\t170473\n183club\t170474\n舍身\t170475\n85g\t170476\n兴企\t170477\n研究生证\t170478\nOcarina\t170479\n玻璃管液位计\t170480\n勾拳\t170481\n第几周\t170482\n沃尔夫斯堡\t170483\n中英文\t170484\nObject类\t170485\n烟民\t170486\nbis\t170487\n北坞公园\t170488\n查重率\t170489\n全局者\t170490\n煤碳\t170491\nハイパ\t170492\n制假\t170493\n沪牌\t170494\n深圳市注册会计师协会\t170495\n翠湖\t170496\n受拉\t170497\n防线\t170498\n图版\t170499\n寄样\t170500\n暗黑破坏\t170501\n豪森\t170502\n0811\t170503\naqueous\t170504\n制品业\t170505\n相网\t170506\n深圳公司\t170507\n苏西\t170508\nrai\t170509\n细胞外基质\t170510\n杨进\t170511\nlibssl.so.1.0.0\t170512\n对穿\t170513\n值得买\t170514\nEVA\t170515\n西蜀\t170516\n20160315\t170517\n云租车\t170518\n半仙\t170519\n绝世神医\t170520\nWIS\t170521\n不约而同\t170522\n完整性\t170523\n75W\t170524\n半干\t170525\n羌族\t170526\nabraham\t170527\n基分\t170528\n国产片\t170529\n三木集团\t170530\n永王\t170531\n低沉\t170532\n迷魅\t170533\nDeleted\t170534\n蜂蜜\t170535\n15333230979\t170536\n曾经沧海难为水,除却巫山不是云\t170537\nInside\t170538\n吴华\t170539\nSSD+HDD\t170540\n罗大伦\t170541\n贾剑涛\t170542\n天颐湖\t170543\nEverEdit\t170544\n德龙\t170545\n睾丸肿瘤\t170546\n1英镑\t170547\n四十几岁\t170548\nappeal\t170549\n谷歌小说网\t170550\n手靶\t170551\n高压管\t170552\n600376\t170553\nAbigail\t170554\n王占海\t170555\n郑州航空港经济综合实验区\t170556\n腌笃鲜\t170557\nNet
ty4\t170558\n辛苦\t170559\n职业规划师\t170560\n渡鸦\t170561\nreadable\t170562\n78家\t170563\n能及\t170564\n顿悟\t170565\ntvb\t170566\n江川区\t170567\n面面俱到\t170568\nD3D11\t170569\n阿果\t170570\nmerge函数\t170571\nc51\t170572\n酸甜苦辣\t170573\n奶霸\t170574\n铜官山区\t170575\n历历\t170576\n波利斯\t170577\n焦俊艳\t170578\nddr4\t170579\n魔网\t170580\n高鹏飞\t170581\n聚力\t170582\n辽宁省政协\t170583\nUltrasound\t170584\n洛阳高新区\t170585\nsuggestions\t170586\nBIND9\t170587\n卡米拉\t170588\n市工信委\t170589\n数码宝贝吧\t170590\n自校\t170591\n001979\t170592\nwjj\t170593\n五要\t170594\n第一页\t170595\n20161009\t170596\nls\t170597\n万柳书院\t170598\n聚福园\t170599\n机核网\t170600\n家常菜\t170601\nunmet\t170602\n奶桃\t170603\n维斯塔潘\t170604\n王哥庄\t170605\n乐堡\t170606\n宋卫平\t170607\n纯子\t170608\nwinery\t170609\n美丽达\t170610\n栾城区\t170611\nSequel\t170612\nisuzu\t170613\n市价\t170614\n朱大可\t170615\n计算材料学\t170616\n方谬神探\t170617\ndildo\t170618\n卢坤峰\t170619\n口酒\t170620\n连式\t170621\n宠物貂\t170622\ncisp\t170623\n欺诈案\t170624\nphper\t170625\n双江湖\t170626\nalicdn\t170627\nMuller\t170628\n世朔\t170629\n凯文·凯利\t170630\n星河战队2\t170631\n五龙宫\t170632\n佰迪清\t170633\n湖南省质量技术监督局\t170634\n上拉菜单\t170635\n暗礁\t170636\n浦东金桥\t170637\nsexvideos\t170638\n有机葡萄酒\t170639\n8u161\t170640\n凌晨3点\t170641\n人力资源服务有限公司\t170642\n德剧\t170643\n老年人手\t170644\n000个\t170645\n光临\t170646\n生意\t170647\nmodified\t170648\n德巴金\t170649\n上林湖\t170650\n橙头\t170651\n加护\t170652\n昆山车管所\t170653\n长航凤凰\t170654\n迪玛希\t170655\nOKI\t170656\n永恒的记忆\t170657\n八九网\t170658\n授之以渔\t170659\n国家标准委办公室\t170660\n海王\t170661\n门类\t170662\n衣服装\t170663\n撤\t170664\n次年\t170665\n公务\t170666\n猫妖\t170667\nWireShark\t170668\n水利水电工程\t170669\n水泥石\t170670\n约炮\t170671\n车载净化器\t170672\n征途吧\t170673\nCOUNT函数\t170674\n4枚\t170675\n官色\t170676\n螳臂当车\t170677\n注定\t170678\n十刃\t170679\n踏浪者\t170680\n麻花机\t170681\n安全隐\t170682\n票据贴现\t170683\n标记符\t170684\n奥特之父\t170685\n还田\t170686\n171个\t170687\n第34页\t170688\n笔手\t170689\n酷我音乐盒\t170690\nPSN\t170691\n灰烬\t170692\n血管科\t170693\n陆委会\t170694\n白毫银针\t170695\n王洪宝\t170696\n科丝\t170697\ngt610\t170698\n银盒\t170699\n攻读\t170700\n民间游戏\t170701\n罗田教育信息网\t170702\n66句\
t170703\n紫蓬山\t170704\n修身型\t170705\nmysqldb\t170706\n录制器\t170707\n雷霆扫毒\t170708\nhdx\t170709\n高硼硅玻璃\t170710\nslit\t170711\nhomestead\t170712\n几磅\t170713\n神八\t170714\n重庆彩票网\t170715\n押品\t170716\n青楼\t170717\n韩亚航空\t170718\n气质联用仪\t170719\n牛津大学出版社\t170720\nNCT127\t170721\n内陆\t170722\n名声大噪\t170723\n并进\t170724\n李隼\t170725\n服务态\t170726\n苏州市中医医院\t170727\n实体版\t170728\n太阳节\t170729\ncbr600\t170730\n薄葬\t170731\n当权\t170732\n波斯王子4\t170733\n领导干\t170734\n杀人机器\t170735\n著名\t170736\n001期\t170737\n李煜\t170738\n自治县\t170739\nip7\t170740\n王店\t170741\n间苯二酚\t170742\n雷霆陆战\t170743\n窥镜\t170744\n生命之水\t170745\n危城\t170746\n日化品\t170747\n江苏省人大\t170748\n德拉克洛瓦\t170749\n内经\t170750\n充分\t170751\n佳翼\t170752\nPRIVILEGES\t170753\n85万元\t170754\n孤儿院\t170755\n浅红色\t170756\n云飞\t170757\n刘晓明\t170758\n亡语者\t170759\n带秀\t170760\n多囊\t170761\n陈晓华\t170762\n宗家\t170763\ngrp\t170764\n十七个\t170765\nVS2013\t170766\n网易新闻\t170767\n属性点\t170768\n第22话\t170769\n人事考试网\t170770\nDifferences\t170771\n小米手机驱动\t170772\n张亚军\t170773\n一环路南\t170774\n隔离法\t170775\n1808\t170776\n3518\t170777\n统表\t170778\n辛笛\t170779\n马到成功\t170780\n企业银行\t170781\ndrums\t170782\n土茯苓\t170783\n压面\t170784\n月标\t170785\n谢梦\t170786\nyanhee\t170787\n城市执法体制改革\t170788\n暗格\t170789\n2018年03月10日\t170790\n2142\t170791\n磁翻板\t170792\nnulls\t170793\ninstantaneous\t170794\n起凡\t170795\n面相_\t170796\nBrochure\t170797\ntamil\t170798\n老房\t170799\nNUMECA\t170800\npopstate\t170801\n起头\t170802\n秒抢\t170803\n客家人\t170804\n刷手\t170805\n椰族\t170806\n弘瑞\t170807\n意义\t170808\nGEEK\t170809\n万科如园\t170810\n西普\t170811\n中梁首府壹号\t170812\n流浪者\t170813\n讲师\t170814\n敬爱\t170815\n青峰\t170816\ndm3\t170817\n江安\t170818\n瓦楞\t170819\n任课\t170820\n杨官璘\t170821\n杨兵\t170822\nlenfried\t170823\n首映式\t170824\n寒光\t170825\n天坑\t170826\n90集\t170827\n醉翁\t170828\n都江堰景区\t170829\n毒宠\t170830\n八喜冰淇淋\t170831\n旋飞\t170832\nMixers\t170833\n滴塑机\t170834\n胶州政府网\t170835\nNbimer\t170836\n私汤\t170837\n下不了\t170838\n苫布\t170839\n水滴状\t170840\n20150\t170841\n受寒\t170842\n石碣镇\t170843\nso2\t170844\ndatawindow\t170845\nguoshi\t170846\nkiva\t170847\n中国水运报数字报\t170
848\n欢乐喜剧人第四季\t170849\n1.8吨\t170850\n倒虹\t170851\nx0处\t170852\n重金属\t170853\n云适配\t170854\n皂幕山\t170855\n不提\t170856\n四氯化碳\t170857\n瞬間\t170858\n非标准化系数\t170859\n霍伊特\t170860\n江门日报\t170861\n美吉姆早教中心\t170862\nattempting\t170863\ngreatwqs\t170864\n红条\t170865\nHori\t170866\n军备竞赛\t170867\n我们十七岁\t170868\n维修师\t170869\n20162017\t170870\n草蜢\t170871\npillars\t170872\n教职工\t170873\n江阴职业技术学院\t170874\n企业管理软件\t170875\n赛客\t170876\n两腿\t170877\n贝多\t170878\n图示仪\t170879\n第81届\t170880\n加点器\t170881\nSAPPHIRE\t170882\n乳蛋白\t170883\njspatch\t170884\n净损失\t170885\n芜湖方特四期\t170886\n白俊遥\t170887\nx-2\t170888\n介入\t170889\n八仙花\t170890\n走远\t170891\n学口琴音乐网\t170892\n方桶\t170893\n车屏\t170894\nReportViewer\t170895\n罗曼诺夫\t170896\n迪拜机场\t170897\n2707\t170898\n手状\t170899\n公管\t170900\n录音师\t170901\n嘴皮\t170902\n世纪华旅\t170903\nDIAMOND\t170904\n未授权\t170905\nCOD14\t170906\n情深不及你\t170907\n三维家\t170908\n略略略\t170909\n2&3\t170910\n48米\t170911\n苍子\t170912\nedible\t170913\nInversion\t170914\nDedecms\t170915\n东方论\t170916\n10.7.5\t170917\n毒膏\t170918\n索引号\t170919\nOpenvswitch\t170920\n不外乎\t170921\n振幅\t170922\n质谱法\t170923\n祖山\t170924\n311\t170925\nmege\t170926\n元培学院\t170927\n381\t170928\n王莲\t170929\n牛肉饭\t170930\nt1航站楼\t170931\n停车收费系统\t170932\n洗涤液\t170933\n名侦探世界\t170934\n异氰\t170935\n更爱\t170936\n试妆\t170937\n宁夏人民医院\t170938\n创业史\t170939\n快科技\t170940\n7只\t170941\nwarriors\t170942\n无人管\t170943\n纸钞\t170944\n基础心理学\t170945\n塞尔达神庙\t170946\n磁粉探伤仪\t170947\n上海市对外服务有限公司\t170948\njingji\t170949\n金洁\t170950\n高清机\t170951\n动会\t170952\n富士美\t170953\n闹笑话\t170954\n熊猫tv\t170955\nEric\t170956\n狮身人面像\t170957\n北京崇文\t170958\n履约保函\t170959\ndjigo4\t170960\n10平方\t170961\n免入\t170962\n平导\t170963\napplecare\t170964\n书机\t170965\n停建\t170966\n北元\t170967\n极性共价键\t170968\n甘婷诺兰\t170969\nch4\t170970\n生物学家\t170971\n砂筒\t170972\n高武\t170973\n英雄联盟游戏\t170974\nF450\t170975\n破坏者\t170976\nqq拼音\t170977\ntombstone\t170978\nwebmaster\t170979\n别有用心\t170980\n利用度\t170981\n陈展鹏\t170982\n巍\t170983\n几速\t170984\n保单贷\t170985\n020_\t170986\n白话史记\t170987\n大慈大悲\t170988\n林吉特\t170989\n超人回来了\t170990\n
起誓\t170991\n255\t170992\n黄秀杰\t170993\nhr/\t170994\n水调\t170995\n魅蓝X\t170996\nmathmatica\t170997\n普洛斯\t170998\n点画\t170999\n道路交通安全法\t171000\nscikit\t171001\n联洲电器\t171002\nVIO\t171003\n武汉恒大\t171004\n经一路\t171005\n荣耀s6\t171006\nGibbs\t171007\n详尽版\t171008\n5行\t171009\n健坤\t171010\n万达维多利亚湾\t171011\n尸骨无存\t171012\n马鞍山市人力资源和社会保障局\t171013\n荣耀S8\t171014\n救民\t171015\n查补\t171016\nUpdatePanel\t171017\n缝纫线\t171018\n大地球\t171019\n磷酸二氢钙\t171020\n早场\t171021\n林琼\t171022\n40倍\t171023\n流水镇\t171024\n米香\t171025\n流行款\t171026\n庆科\t171027\n雾武\t171028\n电子标\t171029\n优游娱乐\t171030\n2月17日\t171031\n我们的城市\t171032\n网络存储器\t171033\n丁隐\t171034\n.net4.5\t171035\n横肉\t171036\n早报\t171037\nheartsignal\t171038\n肚脐\t171039\n7.2米\t171040\n易言\t171041\n沙宝亮\t171042\nNew\t171043\n祀\t171044\n真夏\t171045\n昆山南\t171046\n热性\t171047\n外研通点读笔\t171048\n填筑\t171049\n世纪花园\t171050\n火之歌\t171051\nKalafina\t171052\n机片\t171053\nqs\t171054\n金龙山\t171055\n中海誉城\t171056\n元初\t171057\n瓦舍\t171058\n王泷正\t171059\n化妆品盒\t171060\nDiscounts\t171061\n2小时后\t171062\n亮蓝\t171063\n珞巴族\t171064\n唐氏筛查\t171065\n8000_\t171066\n浑源\t171067\n謝謝\t171068\n冰雪娱乐网\t171069\n镓\t171070\npysvn\t171071\n云牛\t171072\n裁判\t171073\n600498\t171074\n8.14\t171075\n10.8.4\t171076\n旅区\t171077\nCENTOS7\t171078\n黄豆\t171079\n气站\t171080\n电喷\t171081\n大绳\t171082\n60周\t171083\nsoftmax_cross_entropy_with_logits\t171084\n琅琊榜之风起长林\t171085\nValueError\t171086\n福山路\t171087\n1.4.5\t171088\n安享\t171089\n朝珠\t171090\nsideload\t171091\n1049\t171092\n林采欣\t171093\n精灵族\t171094\n电力展\t171095\n赛睿寒冰5\t171096\n催化剂\t171097\n10.3\t171098\ngongs\t171099\n美妆\t171100\n魏美人\t171101\n雪野\t171102\nscrap\t171103\n气扇\t171104\n墙锢\t171105\n卢子跃\t171106\n消费维权网\t171107\n焦氏易林\t171108\n预苗\t171109\n化龙\t171110\n劳动密集型\t171111\n学构\t171112\n旁人\t171113\n西部\t171114\n乡下\t171115\n付济华\t171116\n定量化\t171117\n到行\t171118\n沈佳\t171119\n澡巾\t171120\n非平衡电桥\t171121\n深圳西\t171122\n秀人网\t171123\n渤海金控\t171124\n翻车\t171125\n铂悦\t171126\n曹兴明\t171127\nReminder\t171128\n依云郡\t171129\n13.0.6447\t171130\n拳皇2003\t171131\n拉夫\t171132\n滴星\t171133\n日语在线翻译_在线日语词典\t17
1134\n氮气瓶\t171135\n杜润生\t171136\n特殊教育学校\t171137\n菏泽市政府\t171138\n幻力\t171139\n王一鸣\t171140\n全机\t171141\n李卫红\t171142\n95所\t171143\n非标准件\t171144\nroa\t171145\n58\t171146\n英红镇\t171147\n朱然\t171148\n丽讯\t171149\n傻逼们\t171150\n扑灭\t171151\n96kHz\t171152\n戴尔灵越\t171153\nVCenter\t171154\nfastDFS\t171155\n十二指肠球炎\t171156\n揽客\t171157\n狞笑\t171158\n法医秦明\t171159\n纯净\t171160\n怎呢\t171161\n宏润花园\t171162\n张为\t171163\n地榆\t171164\n的确\t171165\nBT网\t171166\n圆脸\t171167\n香蕉派\t171168\n渔人\t171169\n一个男孩\t171170\n农山\t171171\n香蜜\t171172\nOfficially\t171173\n方太集团\t171174\n希沃论坛\t171175\n节温器\t171176\n勇气\t171177\n卷法\t171178\nCSPPLAZA\t171179\n憩\t171180\n新绛\t171181\n药具\t171182\n三省六部制\t171183\n2018-04-27\t171184\n复仇者营地_凯恩之角\t171185\n纸婚\t171186\n福乐斯\t171187\ndepression\t171188\n乙卯\t171189\n付杰\t171190\nV\t171191\nredisson\t171192\n阿春\t171193\nnudie\t171194\nLPL2018\t171195\n谭轩辕\t171196\n获批准\t171197\n周\t171198\n铜铝复合暖气片\t171199\noutdated\t171200\nQQ麻将\t171201\nSGP\t171202\n起始站\t171203\niskey\t171204\nIgor\t171205\n决斗场\t171206\n水力学\t171207\n诊脉\t171208\n小重山\t171209\n150.0000元\t171210\n赵子琪\t171211\n背摔\t171212\nlcs\t171213\n136平米\t171214\n造梦西游5灵宠\t171215\nGoogle浏览器\t171216\nitch\t171217\n磨削\t171218\n第四款\t171219\n联商论坛\t171220\n扒马褂\t171221\n热弯玻璃\t171222\n祖上\t171223\n佐藤江\t171224\n职人员\t171225\n动易软件\t171226\n江宁镇\t171227\nX5R\t171228\n中美建交\t171229\n蝴蝶兰吧\t171230\n徐\t171231\n盘踞\t171232\n老博会\t171233\n处字\t171234\n中国水利工程协会\t171235\n2017年六月\t171236\n省房\t171237\n被关\t171238\nwayland\t171239\n神戒\t171240\n有的人\t171241\nfand\t171242\n利多卡\t171243\n极品校园绝品狂徒\t171244\ndetectors\t171245\n设到\t171246\n红林\t171247\n脱坑\t171248\n同甘共苦\t171249\n今夜百乐门\t171250\nShare\t171251\n年事\t171252\n留校察看\t171253\n陈亦度\t171254\nacgpower\t171255\nAuthorized\t171256\n船型\t171257\n笊篱\t171258\n93部\t171259\nFreeSync\t171260\n退休年龄\t171261\n补给舰\t171262\nMSDE\t171263\n宋彬彬\t171264\nbsd\t171265\n陈半丁\t171266\n危在旦夕\t171267\n藏经阁\t171268\n三国城\t171269\n龙祥\t171270\n大杀器\t171271\n多姿多彩\t171272\nstarting\t171273\n妇女们\t171274\n辛卯\t171275\n结界\t171276\nuploa\t171277\n霸主\t171278\n蓝色狂想曲\t
171279\n返券\t171280\n號\t171281\n能者多劳\t171282\n生产基地\t171283\n考察点\t171284\n超世纪\t171285\n移民史\t171286\n战警\t171287\nassumption\t171288\n爸爸去哪儿第四季\t171289\n六中\t171290\ntextarea\t171291\n圣亚海洋世界\t171292\nCRT\t171293\n桃花季\t171294\nDecks\t171295\n清河\t171296\nHomePod\t171297\n国税定额发票\t171298\n孙红丽\t171299\nMacBook、iMac\t171300\n采空\t171301\n北京上小学\t171302\n梦幻西游力五庄\t171303\n十千\t171304\nignoring\t171305\n北洋军阀\t171306\n1870年\t171307\n工作狂\t171308\n因祸得福\t171309\n枕式\t171310\n雪诺\t171311\n厘米级\t171312\n摇表\t171313\npassionate\t171314\n被动房\t171315\n第三套\t171316\n蟹蟹\t171317\n黄俊\t171318\n中山大学附属口腔医院\t171319\n千鸟格\t171320\n1b\t171321\n临沂一中\t171322\n陈诗云\t171323\nvmnet1\t171324\n发射端\t171325\n良币\t171326\n4234\t171327\n三维空间\t171328\n厦门二中\t171329\n新都小区\t171330\n簧\t171331\niknow\t171332\nbilt\t171333\n深圳国\t171334\n2天1晚\t171335\n飞天虎\t171336\ngoodsync\t171337\n刘医生\t171338\nfrontend\t171339\n_斗蟹游戏网\t171340\nmsdtc\t171341\n扎花\t171342\nlocaldate\t171343\n逢雪\t171344\n财富\t171345\n第50届\t171346\ncb190r\t171347\n江湖夜雨\t171348\n盐分\t171349\n磁力链接\t171350\n冷轧卷\t171351\n润湿性\t171352\nolly\t171353\n伴你学\t171354\n无眼\t171355\nstl\t171356\n历史唯物主义\t171357\nASAHI\t171358\n4ph\t171359\n千呼万唤\t171360\n南河\t171361\n台标\t171362\n竞秀区\t171363\n88693066\t171364\npartitions\t171365\n第36季\t171366\n歌手\t171367\n短租\t171368\n男士\t171369\nwinner\t171370\n耳病\t171371\n总算\t171372\nbtt\t171373\nLSD\t171374\n齐鲁软件园\t171375\nquxiao\t171376\nJavaConfig\t171377\n汉语版\t171378\n逆卷\t171379\n仙魂\t171380\n天外来客\t171381\n东北师范大学\t171382\n中医馆\t171383\n天卷\t171384\n一个勺子\t171385\nZ4X\t171386\nbpo\t171387\n尼康D80\t171388\n筒纸\t171389\nLAC\t171390\nprefix\t171391\n塞弗斯\t171392\n沈军\t171393\n团块\t171394\n20万方\t171395\n个人校本\t171396\n墨守成规\t171397\n不摸\t171398\n_欣\t171399\n卧城\t171400\nChizhou\t171401\nEXCEL表\t171402\n吕昊天\t171403\n恒天财富\t171404\n钢带箱\t171405\n素描自学网\t171406\n林秋雯\t171407\n边境\t171408\n/li\t171409\n海拉6\t171410\n信通院\t171411\n知网节\t171412\n国家自然科学基金\t171413\nelecworks\t171414\n公亮\t171415\nroadster\t171416\n鼎湖区\t171417\n万科\t171418\n阿联\t171419\n流行文化\t171420\n硬盘架\t171421\n星颜\t1
71422\n体育系\t171423\n唐宇迪\t171424\n高尔夫花园\t171425\n神谷\t171426\n欧洲联盟\t171427\nWalden\t171428\n雪花银\t171429\n工职\t171430\npy\t171431\ntrx\t171432\n_百田\t171433\n海贼\t171434\n铁嘴\t171435\n关切\t171436\n调常用\t171437\n松花江路\t171438\nBuzzFeed\t171439\n凯迪拉克xt4\t171440\n岚皋\t171441\n飨宴\t171442\n柞水溶洞\t171443\n亚斯娜\t171444\n中国农学通报\t171445\n青岛乐居\t171446\n浅显\t171447\n菇类\t171448\n胃窦\t171449\n树叶画\t171450\n可爱的人\t171451\n射钉枪\t171452\n火云传奇\t171453\nswedish\t171454\n于凡\t171455\nEton\t171456\n突降\t171457\n凹模\t171458\n神车\t171459\n干粉砂浆\t171460\nWindows8_Wind\t171461\n卡皇\t171462\n新小说\t171463\n超级马里奥跑酷\t171464\n钓鱼人\t171465\njuzi\t171466\n4月29\t171467\n虾仁\t171468\n记忆\t171469\n法鲨\t171470\n1562\t171471\n罗伯茨\t171472\n龙兄鼠弟\t171473\nHFC\t171474\n7辆\t171475\n何黎明\t171476\n阿尔托大学\t171477\n吉州区人民政府\t171478\n猫扑两性网\t171479\n雨女\t171480\n羟基磷灰石\t171481\n程\t171482\nSATELIGHT\t171483\n包钢股份\t171484\nhiit\t171485\n专科学校\t171486\n张俊豪\t171487\n史前\t171488\n孤傲\t171489\n预览表\t171490\n不能自已\t171491\nhosting\t171492\nOTCBTC\t171493\n腹痛\t171494\n狼队\t171495\n税务师\t171496\n典故故事\t171497\n救心丹\t171498\n为何物\t171499\n首租\t171500\n三趟\t171501\n雷山县\t171502\n基尼指数\t171503\n力士\t171504\n煸\t171505\ndcd\t171506\n见义勇为\t171507\n昭平县\t171508\n钟声\t171509\n金恩\t171510\n吊饰\t171511\n海港区\t171512\n大汶河\t171513\njz\t171514\n赵安\t171515\n邵阳网\t171516\n见多发\t171517\n秘宝馆\t171518\n琴萝\t171519\n蒙地卡罗\t171520\n社旗\t171521\nfx1200\t171522\n挨批\t171523\n生辰纲\t171524\ne92\t171525\n千幻魔镜\t171526\n鱼块\t171527\n无压风门\t171528\n找鸡\t171529\n挽\t171530\n真卡\t171531\n重庆市不动产登记中心\t171532\nVR眼镜\t171533\n暂定\t171534\n普通生\t171535\n莎莎\t171536\n3.15晚会\t171537\ndifficulty\t171538\n南吕\t171539\n轴网\t171540\n5258\t171541\n兴义之窗\t171542\nBAZAAR\t171543\nguazi\t171544\nSEI\t171545\n佛脚\t171546\n140mm\t171547\n1.414\t171548\n体游\t171549\n单相逆变器\t171550\n艾·里斯\t171551\nexchange2016\t171552\n回电\t171553\n硕美科\t171554\n和谐号\t171555\n初一数学\t171556\n泵芯\t171557\n裴松之\t171558\n丹霞\t171559\n四好村\t171560\n临清在线\t171561\nwni10\t171562\n豆娃\t171563\n600120\t171564\n胳臂\t171565\nEth\t171566\n巡按\t171567\n竹帘\t171568\n山西华澳商贸职业学院\t171569\n金坚
\t171570\n达美乐比萨\t171571\nSCI影响因子查询-MedSci\t171572\n金盾出版社\t171573\njFinal\t171574\n山东中烟\t171575\n佐力药业\t171576\n无头浏览器\t171577\n同居人\t171578\n网盟\t171579\n雾化吸入器\t171580\n瑞·达利欧\t171581\n2013-2014年度\t171582\nsnh48吧_\t171583\n江民科技\t171584\n连南瑶族自治县\t171585\n木兰湖\t171586\n四川博物馆\t171587\n百变小樱魔术卡\t171588\nStrongYaYa\t171589\n澳门渔人码头\t171590\n土壤肥料\t171591\nHTML5\t171592\n干粮\t171593\nCOD5\t171594\n罂粟粉\t171595\n黑稿\t171596\nUMC\t171597\ntote\t171598\n山东省农业厅\t171599\n骇人\t171600\n寿光网\t171601\n太阳广场\t171602\n争\t171603\nsp3序列号\t171604\n影踪派\t171605\n动环\t171606\n2节\t171607\n环保科技有限公司\t171608\n嵩鼠\t171609\n方正\t171610\n庐山景区\t171611\n兴风作浪\t171612\n朴炯植\t171613\nixchariot\t171614\n芭莎明星慈善夜\t171615\n福利片\t171616\n克林特\t171617\n吉林农业大学\t171618\n制度化\t171619\n贴梗海棠\t171620\n山东论坛\t171621\n人科\t171622\nFormats\t171623\n新枫之谷\t171624\n优尔\t171625\n消防工程师考试网\t171626\n托马斯·哈代\t171627\n8宗\t171628\n卡拉ok\t171629\n悍妇\t171630\n合力\t171631\n汉日\t171632\n地磅\t171633\n偕\t171634\n钟南山\t171635\n幕刃\t171636\n装帧\t171637\n赫尔墨斯\t171638\n试播\t171639\n葛均波\t171640\n三排\t171641\nPDFelement\t171642\n乌塔\t171643\n校舍\t171644\nVlog\t171645\nVMProtect\t171646\n义务\t171647\ncorporain\t171648\n单元体\t171649\n版务\t171650\n武警部队代表团\t171651\n网速\t171652\n瓜果类\t171653\n布氏杆菌\t171654\nm35\t171655\nntldr\t171656\n幼心\t171657\n﹠\t171658\nlabelme\t171659\n63岁\t171660\n九_任\t171661\n阿木古楞\t171662\n300首\t171663\n外高桥保税区\t171664\nRea\t171665\n3SE\t171666\n犊子\t171667\n夏桀\t171668\n浪潮云\t171669\nG片\t171670\n吴倩\t171671\n浪人\t171672\n楚盟\t171673\n五羊邨\t171674\n轮训\t171675\n喜玛拉雅\t171676\n27层\t171677\nVOCAL\t171678\n得病\t171679\n密西\t171680\n山水子\t171681\n浙江省考试院\t171682\n谭军\t171683\n钢栈桥\t171684\n迷罗\t171685\n暗黑破坏神之毁灭\t171686\n林丰\t171687\nSiS001\t171688\n飘尘\t171689\n发辫\t171690\n比亚迪电子\t171691\n上海光大会展中心\t171692\n户口网\t171693\ncalls\t171694\n终南别业\t171695\nAV磁力連結\t171696\n吉布提\t171697\nwould\t171698\n农技\t171699\n程序师\t171700\n萝球社\t171701\n沙漠越野\t171702\nqq空间\t171703\n书书\t171704\n双顶径\t171705\n王屋\t171706\n性爱故事_邻妹妹_邻医网\t171707\nallin\t171708\n私学\t171709\nppp项目公司\t171710\nnginx服务器\t171711\nSubmarine
\t171712\nqimage\t171713\n广东会\t171714\n2988\t171715\n单用途预付卡\t171716\n灰层\t171717\n运动家\t171718\n1992\t171719\nlightroom6\t171720\n八家\t171721\n利德\t171722\n国有企业领导人员廉洁从业若干规定\t171723\n骨盆\t171724\n卜易居算命网\t171725\n无锡阳光医院\t171726\n线粒\t171727\n影音风暴\t171728\n张庙\t171729\n10公斤\t171730\n北京西城\t171731\n金鸿\t171732\n301xxxx\t171733\n1.6_\t171734\n5185\t171735\n万盛\t171736\n新宾满族自治县\t171737\n芮甜甜\t171738\n沧州\t171739\n妈妈咪\t171740\n声嘶力竭\t171741\n座山\t171742\n骑师\t171743\n桤木\t171744\nwingying\t171745\n江北区人民政府\t171746\n船首\t171747\n三宅一生\t171748\n李苗\t171749\n9877\t171750\n110平\t171751\n无花空折枝\t171752\n电警\t171753\nPitstop\t171754\n崔各庄乡\t171755\na7ii\t171756\n羧甲基淀粉钠\t171757\n物流有限公司\t171758\n金银花茶\t171759\n唐宝\t171760\nharo\t171761\nglory\t171762\n市委\t171763\n麝香龟\t171764\n中华电信\t171765\nVueScan\t171766\n15日\t171767\n珏山\t171768\n30瓦\t171769\n明山\t171770\nsuction\t171771\n川化\t171772\n升龙\t171773\n上海17号线\t171774\n胡温\t171775\n江南新村\t171776\n插图版\t171777\n招标网\t171778\n年\t171779\n天清\t171780\n移动冰激凌\t171781\n张亚楠\t171782\nlog4cxx\t171783\n冲天炮\t171784\nkimi\t171785\n730li\t171786\n飞傲吧\t171787\n瓶阀\t171788\n自治区党委办公厅\t171789\njtextfield\t171790\n南京南瑞继保电气有限公司\t171791\n固态硬盘接口\t171792\n王者荣耀狂铁\t171793\ngcode\t171794\nv8.2\t171795\n微积分\t171796\n碧沙\t171797\n雅鲁藏布\t171798\npheatmap\t171799\n华服日\t171800\n版人\t171801\n仪陇\t171802\n我们爱的难舍难分\t171803\n桂林西街\t171804\n简兮\t171805\n文斯莫克\t171806\nworldpress\t171807\nsteam卡\t171808\n中国货币网\t171809\nbetterzip\t171810\nGB-T\t171811\n中间\t171812\nZendesk\t171813\n环球捕手\t171814\n入去\t171815\nAFS\t171816\nCavity\t171817\n教育片\t171818\n碱性蛋白酶\t171819\n46号\t171820\nLegend\t171821\n黄树强\t171822\n永不停歇\t171823\n火影忍者忍术\t171824\nauthorized\t171825\n香港酒店\t171826\n肉番\t171827\n卷云\t171828\n出轨率\t171829\n扫地机\t171830\n智利圣地亚哥\t171831\n抛砖引\t171832\n余情与你共白首\t171833\nkcc\t171834\n稻作\t171835\nformality\t171836\n水肥\t171837\n禁断\t171838\n三九猪小说网\t171839\nEVEREST\t171840\n构造线\t171841\n中央军委办公厅\t171842\nxxxholic\t171843\n成组\t171844\n空洞骑士\t171845\nmodule.exports\t171846\n干部\t171847\n保甲\t171848\n北京新能源汽车\t171849\n系统板\t171850\n贡生\t
171851\njvc\t171852\nTMS320F2812\t171853\n一个3位\t171854\n待业\t171855\n南山区科技园\t171856\n生鲜乳\t171857\n1920X1080\t171858\n趔趄\t171859\npc吧\t171860\n未达标\t171861\n直截了当\t171862\n警示录\t171863\n锚垫板\t171864\nqlogic\t171865\n南通日报\t171866\n布恩\t171867\nD525\t171868\n格数\t171869\n0.09\t171870\n24VDC\t171871\n通通\t171872\ncedar\t171873\n盘活\t171874\n钏路\t171875\n到期\t171876\n楚天金报_多媒体报\t171877\n压铆螺钉\t171878\n禁止的爱\t171879\n雕花板\t171880\n没法\t171881\n整理架\t171882\nroshan\t171883\n圆舞曲\t171884\n280平\t171885\n圣职\t171886\nThanks\t171887\ndistinct\t171888\n翼勋\t171889\n二分类变量\t171890\n虞世南\t171891\n四氯乙烯\t171892\n南谯\t171893\n人民检察官\t171894\n语音识别技术\t171895\n水电表\t171896\n婺城\t171897\n肉粉\t171898\n限价令\t171899\n雷泰\t171900\n爆震\t171901\niota\t171902\n假体垫\t171903\n本儿\t171904\n看守\t171905\n赶牲灵\t171906\n超变传奇\t171907\n兰妮\t171908\n吧主\t171909\n吴业坤\t171910\n缴枪\t171911\n2.45亿\t171912\n结构型\t171913\n大华电子\t171914\n更多\t171915\nmilli\t171916\n技术转让\t171917\n王海龙\t171918\n衣子\t171919\n猎德村\t171920\njlpt\t171921\n530\t171922\nJDB\t171923\n电影版\t171924\n国智\t171925\n夺宝奇兵3\t171926\n国土资源部办公厅\t171927\n劳教\t171928\n贾斯丁比伯\t171929\n公钥\t171930\n中国金融电子化公司\t171931\n三沙\t171932\n美尼尔氏综合症\t171933\nLE7ELS\t171934\n暗魂\t171935\n上证综指\t171936\n翡翠湖郡\t171937\n链表\t171938\nloewe\t171939\n六层\t171940\n香港海关\t171941\n主婚人\t171942\n大奖赛\t171943\n观海卫镇\t171944\n伪春菜\t171945\n耸肩\t171946\n卡奴与卡神\t171947\nkoikatu\t171948\n梦里梦外\t171949\n98元\t171950\n20欧\t171951\n云纹\t171952\n铃兰花\t171953\n融钰集团\t171954\n群众观\t171955\n扫黄先锋\t171956\n出口禁止令\t171957\n兰州市安宁区人民政府\t171958\nGX7\t171959\n巾\t171960\ngo4\t171961\n反比例函数y=kx\t171962\n暗黑2符文之语\t171963\n婚如玉\t171964\n1815\t171965\n牡丹路\t171966\nProbit\t171967\n吴征\t171968\n希美\t171969\n汽液\t171970\nFAO\t171971\n厚\t171972\n奥斯卡·王尔德\t171973\n20150120\t171974\n300多年\t171975\noption\t171976\n榉木\t171977\n碧云天\t171978\n实至名归\t171979\n辛香汇\t171980\nSlides\t171981\nsubplot\t171982\n李申\t171983\nDirty\t171984\n数据源\t171985\n记录取\t171986\n工信委\t171987\n祸国\t171988\n右值\t171989\n傲娇\t171990\n申达股份\t171991\n日场\t171992\n王卓\t171993\n′\t171994\nCarpet\t171995\nvitualbox\t171
996\nTVS\t171997\n捻军\t171998\n不伦不类\t171999\n七骑士\t172000\n健合集团\t172001\n桐原亮司\t172002\n诈\t172003\nsushi\t172004\n另起\t172005\nOL4\t172006\n走向胜利\t172007\n真良心\t172008\n小镇滋味\t172009\n军\t172010\nNursing\t172011\n查泰莱夫人的情人\t172012\n苦美\t172013\nmatlab2017\t172014\n净天\t172015\n两天游\t172016\n生机勃勃\t172017\n浙江大学数学科学学院\t172018\n畜牧局\t172019\n芭菲\t172020\nssis\t172021\n华岐\t172022\n行拘\t172023\ndeepcopy\t172024\n隔夜菜\t172025\n自由主义者\t172026\n范读\t172027\n麦蒂\t172028\n可变现净值\t172029\n南京口腔医院\t172030\n厦门旅游\t172031\n会东县\t172032\n区间车\t172033\nmuzhi\t172034\n超级索尼子\t172035\n1500万元\t172036\n中国电子科技集团公司第四十八研究所\t172037\n农炮\t172038\ndecrypt\t172039\n闲置率\t172040\n海丝商报社\t172041\n良才\t172042\nTUM\t172043\n汉武大帝\t172044\n歌莉娅\t172045\n中望CAD\t172046\n2018年2月26日\t172047\nwalkingp\t172048\n新龙族\t172049\n教言\t172050\n3.9.2\t172051\nwin7系统纯净版\t172052\n旗袍裙\t172053\n牛野\t172054\n台儿庄古城\t172055\n长城大厦\t172056\nMoose\t172057\npscs4\t172058\n西安火车站\t172059\n中业兴融\t172060\n第几节\t172061\n郁郁寡欢\t172062\nRUU\t172063\n甲状腺功能\t172064\nwave\t172065\nNH4Cl\t172066\ntrained\t172067\n群策群力\t172068\n列间\t172069\n绿茶安卓网\t172070\n父窗体\t172071\n红外线体温计\t172072\nChemDraw\t172073\nslideUp\t172074\n恒扬\t172075\n爱卫月\t172076\nanarchy\t172077\nresearchers\t172078\nDUMP\t172079\n稀盐酸\t172080\n大石村\t172081\n轩逸罪恶之城\t172082\n雅丽\t172083\n锋锐\t172084\n服务证\t172085\n苹果8P\t172086\n超薄型\t172087\n阳春市人民政府\t172088\n韦林\t172089\n38种\t172090\n通知消息\t172091\n天将\t172092\n十二宫篇\t172093\n糠\t172094\n农业机械\t172095\n沙漏型\t172096\n新会区\t172097\n福建数字共青团\t172098\n中兴大道\t172099\n北冥有\t172100\n中央人民广播电台\t172101\n油皮\t172102\n毛家超\t172103\n成人教育\t172104\n奥迪RS5\t172105\n继光香香鸡\t172106\n色生香\t172107\n刘甜甜\t172108\nGeneCards\t172109\n横截\t172110\nCLOCK\t172111\nDistributors\t172112\n新浪湖\t172113\n途虎\t172114\nfpc连接器\t172115\n守护珠\t172116\n曹旭\t172117\n22P\t172118\n非承重\t172119\ngoogle输入法\t172120\n哈尔滨火车站\t172121\n玉露香梨\t172122\n贾冰\t172123\n787\t172124\nACG17\t172125\n土耳其海峡\t172126\n朱子家训\t172127\n龙茗路\t172128\nwindows浏览器\t172129\nmarrow\t172130\n定额发票\t172131\n中村春菊\t172132\n湘潭大学\t172133\n定型化\t172134\neclpse\t172135
\n一达\t172136\n龙五\t172137\n肥东县委\t172138\n营业房\t172139\n光荣榜\t172140\n聪明伶俐\t172141\nFI\t172142\n兽爪\t172143\n第13节\t172144\n六级\t172145\n无法相信\t172146\n徐留平\t172147\nFAD\t172148\n乙文\t172149\n人大附中杭州学校\t172150\n李林\t172151\n第十一节\t172152\nopentype\t172153\nscene\t172154\n房岩\t172155\nc4dr19\t172156\n值税\t172157\n巅峰之路\t172158\n深更半夜\t172159\n株洲中车时代电气股份有限公司\t172160\n天晶\t172161\n极趣\t172162\nlorawan\t172163\n消防栓\t172164\nnnd\t172165\n大月氏\t172166\n290元\t172167\n洲明科技\t172168\n卷内\t172169\n共浴\t172170\n基情\t172171\n微型车床\t172172\n植物纤维\t172173\n第217集\t172174\n铜氨\t172175\nmarble\t172176\naccountable\t172177\n透光膜\t172178\n84万\t172179\n帕德玛瓦蒂王后\t172180\n民谣\t172181\n罗莉安\t172182\n看诊\t172183\nJetbrains\t172184\n梧桐街\t172185\n暖手\t172186\ngnp\t172187\n经济战\t172188\n五木\t172189\n玄奘之路\t172190\n茶道\t172191\nFrenzy\t172192\nDEFINE\t172193\n买账\t172194\n180321\t172195\n体脂肪率\t172196\n子思\t172197\n诺基亚n1\t172198\n卡米亚\t172199\n插式\t172200\n装家\t172201\n荣耀星耀\t172202\n淘宝美妆\t172203\n小女生\t172204\n古谱\t172205\n负优化\t172206\n抬刀\t172207\n4波片\t172208\n传导式\t172209\n超话\t172210\n大寺镇\t172211\n欧乐堡\t172212\n十二宫杀手\t172213\n谢林\t172214\n馥佩\t172215\nflashCS6\t172216\n桥梁工程\t172217\n黎明时分\t172218\n明澈\t172219\n安言\t172220\n2021\t172221\n央企\t172222\n以太币\t172223\n上海汽车站\t172224\n第几个月\t172225\n8850\t172226\n开裂\t172227\n芍花\t172228\n500句\t172229\n女命\t172230\n热舞网\t172231\n邓卓棣\t172232\n莫德\t172233\naabc式\t172234\n34分\t172235\n网页游戏开服表\t172236\n图赫尔\t172237\nwww.4399\t172238\n竹篙\t172239\n有机食品\t172240\n至于\t172241\nnotepadd\t172242\n广州市儿童医院\t172243\n干什么\t172244\n德福考试\t172245\n灭菌\t172246\n卓悦化粧品\t172247\n边学\t172248\n跟着千玺\t172249\n黄容\t172250\n大作战\t172251\n斯汀格\t172252\nTCT\t172253\n优集品\t172254\n张桓\t172255\n答案版\t172256\n胶塞\t172257\n400例\t172258\n山东省昌乐二中\t172259\n机要\t172260\nChase\t172261\n老年性\t172262\n塔防三国志\t172263\nbootstrap\t172264\n渝水区\t172265\n谢天华\t172266\n南皮吧_\t172267\n搜身\t172268\nihome\t172269\n顺德区人民法院\t172270\nluastudio\t172271\n钢琴奏鸣曲\t172272\n第10年\t172273\n紫辰苑\t172274\ngjf\t172275\n隋唐英雄传\t172276\n工民\t172277\n省肿瘤医院\t172278\n图片高清/批发价格\t172279\n也好\t172280\n
胯下运球\t172281\nswaps\t172282\n迷路\t172283\nJS代码网\t172284\n湛江站\t172285\n绝地求生loading\t172286\n如火\t172287\n心环\t172288\n卡尔德隆\t172289\n传中\t172290\n操动机构\t172291\n沧玄\t172292\n七宝中学\t172293\nthirteen\t172294\n裕仁天皇\t172295\ncalligraphy\t172296\n济安\t172297\n林丽\t172298\nki67\t172299\n扭\t172300\n传扬\t172301\n标志性\t172302\nCATL\t172303\n美的热水器\t172304\n衡水火车站\t172305\n周楚楚\t172306\n跃迪\t172307\nPaddle\t172308\n15ISK\t172309\n中国画家网\t172310\n国医堂中医门诊部\t172311\ndried\t172312\n证券投资分析\t172313\nGypsy\t172314\nmonths\t172315\n相容剂\t172316\n丨o聽乄雨o丨\t172317\n手撕定额发票\t172318\nlibpng\t172319\n陈乔恩\t172320\n假手\t172321\n2017年7月17日\t172322\n刀凯\t172323\n陈陈\t172324\n33件\t172325\n东岳庙\t172326\n凌风\t172327\n桂林万达城\t172328\n生存战\t172329\n恒飞\t172330\ninitiate\t172331\nxianfeng\t172332\nbpc\t172333\n玖钻\t172334\n广州越秀区\t172335\n第23天\t172336\n刘昊霖\t172337\n气嘴\t172338\n国际财务报告准则\t172339\n中国汽车工业协会\t172340\n书场\t172341\n17691080980\t172342\n王春\t172343\nfloyd\t172344\n王大雷\t172345\n君马S70\t172346\n五百里\t172347\n挂号健康服务平台\t172348\n扫雪\t172349\ntock\t172350\nkunoichi\t172351\n沃格光电\t172352\n2.5亿\t172353\nE5573s-856\t172354\nQuerySet\t172355\n66名\t172356\n国艺\t172357\n天外飞仙\t172358\n谐音\t172359\n父模块\t172360\nkoro\t172361\nSCL\t172362\n御姐范\t172363\n自购\t172364\n283\t172365\n丧女\t172366\n揭阳日报网\t172367\n张红星\t172368\n浙江吉利控股集团\t172369\n理想生\t172370\n清欠\t172371\n广渠门外大街\t172372\n0.00001\t172373\n主链\t172374\n浩荡\t172375\n吉林森工集团\t172376\n热干面\t172377\nUI控件\t172378\n己亥\t172379\n双边投资协定\t172380\n白茫茫\t172381\nTerryChou\t172382\n国际航运\t172383\n内嵌式\t172384\n吴都\t172385\n济南时报\t172386\n元道\t172387\n连奕名\t172388\n鞋都\t172389\nnetworkmanager\t172390\n非缄默之石\t172391\n水碓\t172392\n我的孩子\t172393\n除锈剂\t172394\n学术讨论会\t172395\nvivoXplay6\t172396\n1万8\t172397\n弗农\t172398\n查阅\t172399\n动工\t172400\n刑九\t172401\n纤姿\t172402\ncreation\t172403\n大荆镇\t172404\n钱江路\t172405\nSlope\t172406\n规定化\t172407\n进攻型\t172408\napp.config\t172409\n逼空\t172410\n水鼓\t172411\nOH)2\t172412\n扬州酒店\t172413\nyueyu\t172414\n拥江\t172415\n最宠网\t172416\nthom\t172417\n私募投资\t172418\n申凤梅\t172419\n予定\t172420\n希望之地\t17242
1\n节会\t172422\n光动能电波表\t172423\n闪版\t172424\nLancome\t172425\n侯寨乡\t172426\nvids\t172427\n减震垫\t172428\nFenix3\t172429\n重振旗鼓\t172430\n原目\t172431\n踏平\t172432\n地质大学\t172433\n短弓\t172434\nShyAV\t172435\n伊优\t172436\n红酒瓶\t172437\nbayesian\t172438\nA8F\t172439\n连维良\t172440\n李准基\t172441\n天津地铁10号线\t172442\n轩辕策\t172443\n培训处\t172444\n范成大\t172445\n陆昱晟\t172446\n广东女子职业技术学院\t172447\n王振华\t172448\n沾益\t172449\n更好地\t172450\ncosmo89929\t172451\n油性漆\t172452\n落雨\t172453\n顺德职业技术学院\t172454\nROLEX\t172455\n海峡网\t172456\ndevexpress\t172457\n破财\t172458\n恃\t172459\n增生性\t172460\n凯恩之角\t172461\n841n\t172462\n有功\t172463\n芦笙\t172464\n春风十里,不如你\t172465\nNONE\t172466\n至强e3\t172467\n预防接种宣传日\t172468\n六安市政府\t172469\n心澄\t172470\n星海实验中学\t172471\n红包群\t172472\n讯连科技\t172473\n中国农业信息网\t172474\n21秒\t172475\n普洱茶网\t172476\n炸排骨\t172477\n贵州代表团\t172478\nmidea\t172479\n别这样\t172480\n腾讯视频电影频道\t172481\n480p\t172482\n平衡车\t172483\n旧影\t172484\nev\t172485\n逸翠园\t172486\nMSComm\t172487\n成都市第一人民医院\t172488\ncx-4\t172489\n花卉盆景网\t172490\n弓形虫\t172491\nCaltech\t172492\neCommerce\t172493\nFor循环\t172494\n舒尼替尼\t172495\n碧云路\t172496\n悦翔V3\t172497\n吉田亚纪子\t172498\n数控刀具\t172499\nJd\t172500\n帮酷编程\t172501\n张大大\t172502\n四面光\t172503\n妮可·罗宾\t172504\n借种\t172505\n铜头\t172506\n辩护词\t172507\n/usr/lib/x86_64-linux\t172508\n王珏\t172509\n1002\t172510\n碧缇福美牙仪\t172511\n韦玮\t172512\n天关\t172513\n群雄争霸\t172514\n中山六路\t172515\ncalculations\t172516\n旺苍县\t172517\n8口\t172518\n区政府办\t172519\njag\t172520\n两科\t172521\n终息\t172522\nkass\t172523\n宇博\t172524\n何祚庥\t172525\n数据帝\t172526\n金哥\t172527\n柯布西耶\t172528\ndesigner16\t172529\n粘米\t172530\n令\t172531\n动画类\t172532\n印奇\t172533\n五天四晚\t172534\n青春舞曲\t172535\nwes\t172536\nherobrine\t172537\n神门穴\t172538\nMueller\t172539\nCAS\t172540\n总录\t172541\n高阳县\t172542\nHANDLE\t172543\n广东外语外贸大学\t172544\nIRB\t172545\nDB2\t172546\nLaboratory\t172547\n两根\t172548\n安全屋\t172549\n1726\t172550\navision\t172551\n观山湖区\t172552\n吴连枝\t172553\n22%\t172554\n没有出路\t172555\n于桥水库\t172556\n44集\t172557\n8545\t172558\n第九十七章\t172559\n盐焗\t172560\n利胆\t172561\n状王\t172562\n
小板凳\t172563\nsoffice\t172564\n卓远\t172565\n微信群控\t172566\n掌机\t172567\n1812\t172568\n凤冈\t172569\n制用\t172570\n藤条\t172571\n同夫\t172572\n六件套\t172573\n聚磷酸铵\t172574\n波斯尼亚\t172575\n流变\t172576\n中华民族大赛马\t172577\n南京中央商场\t172578\n信儿\t172579\n沙井镇\t172580\ntransports\t172581\n铜排\t172582\n泰然集团\t172583\n沈南天若有情\t172584\n北京14号线\t172585\n日了狗\t172586\nMonroe\t172587\n玩呗斗牌\t172588\nw12\t172589\n布莱尔\t172590\nMVI\t172591\n檀悦\t172592\n中远海运\t172593\n亏电\t172594\n俄罗斯妈妈\t172595\n护卫舰\t172596\nxdb\t172597\n墨冰\t172598\n老挑毛\t172599\n浓盐酸\t172600\n2017剁手回忆录\t172601\n洪洋洞\t172602\n双创网\t172603\n最大色情网va先锋\t172604\n弗兰奇\t172605\n抄牌\t172606\n神算\t172607\n460元\t172608\n吉利博越论坛\t172609\n韦伟\t172610\n车载泵\t172611\n洛迦山\t172612\n2016年07月\t172613\n彡\t172614\nborg\t172615\nlocalhos\t172616\n第一字体网\t172617\n脸萌\t172618\nwww.y2002.com\t172619\nT21\t172620\n中央国家机关工委\t172621\n晚上2点\t172622\nvax\t172623\n巫鸿\t172624\n铁头英雄\t172625\n深职\t172626\n网吧天下\t172627\ncling\t172628\n真炎幸魂\t172629\n流星之绊\t172630\nskb\t172631\n正不正\t172632\nchannelart\t172633\n柑橘树\t172634\n社日\t172635\n首旅\t172636\n王薇\t172637\n鸭苗\t172638\n亚庇\t172639\nJiYF\t172640\n小齐\t172641\n电池容量\t172642\n李家大院\t172643\nRegional\t172644\nCatch\t172645\n杨潇\t172646\n几点钟\t172647\n微支付\t172648\nQ9550\t172649\n客道\t172650\n王文强\t172651\n电采暖炉\t172652\n蜂子\t172653\n宁陕\t172654\ncent\t172655\norig\t172656\n天蓝\t172657\n两案\t172658\n中超队\t172659\n长株潭城市群\t172660\n双白\t172661\n梦魇\t172662\n横山区\t172663\n中位数\t172664\nRecall\t172665\n查账\t172666\njrj\t172667\n本帮\t172668\n薄王\t172669\n冯谖客\t172670\nClause\t172671\n她想\t172672\nappcode\t172673\n阀\t172674\n6G+64G\t172675\n金法\t172676\nManag\t172677\n塔希\t172678\n勇敢传说\t172679\ndairy\t172680\n开度\t172681\n纵表\t172682\nvirtualbox\t172683\nZOJ\t172684\nmultipath\t172685\n1万米\t172686\n非营利性\t172687\n电纸\t172688\nKettle\t172689\n低保\t172690\nMemrise\t172691\n润康\t172692\nSofitel\t172693\n滦州\t172694\n源君物语\t172695\n厂务\t172696\n超燃混剪\t172697\n香奈\t172698\n秦时明月吧_\t172699\nProducer\t172700\n爱吃\t172701\n洪兴\t172702\n180427\t172703\nAudioNote\t172704\n下坠\t172705\n1311\t172706\n建设西路\t
172707\n荣耀Waterplay\t172708\n阈限\t172709\naerospike\t172710\n爱步\t172711\n王安石\t172712\n晚秋\t172713\n高压电塔\t172714\n义乌小商品批发城\t172715\nJ6S\t172716\n五三\t172717\n杂家\t172718\nPaolo\t172719\n星梦\t172720\n重取\t172721\n金思维\t172722\nexpert\t172723\n最容易\t172724\n涡流\t172725\nP2035\t172726\n聚食\t172727\n李言荣\t172728\n同位角\t172729\n小武\t172730\n东风天锦\t172731\n塔防类\t172732\n新去处\t172733\n佳创视讯\t172734\n泰迪熊\t172735\n玄派\t172736\n金丝玉\t172737\nXBMC\t172738\n清淤\t172739\n元丰\t172740\n钟声网\t172741\n313路\t172742\n振动式\t172743\n虎林新闻网\t172744\n万能试验机\t172745\njsqlparser\t172746\n说愁\t172747\n美国飞虎队\t172748\n5d3\t172749\n奥杜尔\t172750\n冬景\t172751\n应城市委市政府\t172752\n语料\t172753\n霸道黑帮老大爱上我\t172754\n电褥子\t172755\nvizza\t172756\n六国\t172757\npsm\t172758\n标量函数\t172759\n金三胖\t172760\n条线\t172761\n8分半\t172762\n落口\t172763\n陕建集团\t172764\n放根\t172765\n湖南工艺美术职业学院\t172766\n京瓷8520\t172767\n珠澳\t172768\n珠江电视台\t172769\n法律师\t172770\n太阳是大家的\t172771\n200公斤\t172772\n定压比热容\t172773\n响\t172774\n悔过书\t172775\n提拨\t172776\n离岛\t172777\n校长办公室\t172778\nwindows许可证\t172779\n安徽农村商业银行\t172780\n先行者\t172781\n野岭\t172782\nPlaymaker\t172783\n金悦城\t172784\n爬子\t172785\n回车场\t172786\n海狸鼠\t172787\nassignable\t172788\n卧蚕\t172789\n城中\t172790\n宫斗群\t172791\n2104\t172792\nps123\t172793\n美颜网\t172794\n接进\t172795\n争权\t172796\n天下霸唱\t172797\n贝雷塔\t172798\n2014世界杯\t172799\nBonjour\t172800\n中国细胞生物学学会\t172801\nWMZ\t172802\n粉末状\t172803\nJava语言\t172804\nG盘\t172805\n单刀会\t172806\n巴彦淖尔\t172807\n工厂价\t172808\n输家\t172809\n黄安年\t172810\n2016年4月25日\t172811\n埃克森\t172812\n二厘\t172813\n笔趣岛\t172814\nrite\t172815\n朱朱\t172816\n马岭河\t172817\n丽思卡尔顿酒店\t172818\n胭脂红\t172819\ntackle\t172820\n拥护\t172821\n蝴蝶节\t172822\n磁感线\t172823\nLOLs6\t172824\n收官\t172825\n脚刹\t172826\nL7\t172827\n黑龙江省财政厅\t172828\n新环保法\t172829\n亚膜\t172830\n003型\t172831\n覆雨\t172832\n国办\t172833\nbooks\t172834\n中共十一届三中全会\t172835\n6600\t172836\n513路\t172837\n四名\t172838\n肖中特\t172839\n奥星\t172840\n上海民办福山正达外国语小学\t172841\n上海10号线\t172842\n勤于\t172843\n糖精钠\t172844\n克劳德·莫奈\t172845\nallwinner\t172846\n重庆海外旅行社\t172847\nfzy\t172848\n中移\t172849\n怯场\t172
850\n三尾\t172851\n中国民族宗教网\t172852\n余秀华\t172853\n康成\t172854\n生日蛋糕\t172855\norg\t172856\na16.1\t172857\n紫鹊\t172858\n长沙市口腔医院\t172859\n城市杯\t172860\n饱和聚酯树脂\t172861\n扑奔网\t172862\nA片\t172863\n中航高科\t172864\n1.2_\t172865\nomg\t172866\n劳人\t172867\nmsp430\t172868\n两厢版\t172869\np0口\t172870\nNCO\t172871\n六口\t172872\n养蚕\t172873\n知新\t172874\n屈人\t172875\n稀烂\t172876\nL4158\t172877\n民声网\t172878\n李缇娜\t172879\n文茜\t172880\n色妆\t172881\n副总监\t172882\n高濑由奈\t172883\n小点点\t172884\n毕马汇\t172885\n陈_\t172886\nfms\t172887\n字幕君\t172888\n愧对\t172889\nnelson\t172890\n狗刀\t172891\nconditioner\t172892\n洪七\t172893\n赛艇\t172894\n唇釉\t172895\n7.13\t172896\ntell\t172897\n中铁建设集团有限公司\t172898\nxinxi\t172899\n广州市规划局\t172900\nAz\t172901\ncomputer\t172902\n转载\t172903\n瓢鞋\t172904\n点读笔\t172905\n音诗\t172906\n鲁克玛\t172907\nMySQLWorkbench\t172908\n归于\t172909\n银杯\t172910\n1.6l\t172911\n决定论\t172912\n茶牌\t172913\n缺岗\t172914\n胜利小区\t172915\n梁冬\t172916\n蚂蚁短租网\t172917\n佐鸣\t172918\n假期\t172919\n600台\t172920\n维沃\t172921\n佳绩\t172922\n张元元\t172923\n很安静\t172924\n代入\t172925\n外贷\t172926\n陆埠\t172927\n家臣\t172928\n华罗庚金杯\t172929\n1女\t172930\n异烟肼片\t172931\n最高限额\t172932\n爱憎\t172933\n_句集\t172934\nmorphia\t172935\n涤塔夫\t172936\n220kv\t172937\n赠我予白\t172938\n迪兰R9\t172939\n辽宁省人大\t172940\n中国癌症基金会\t172941\nClaire\t172942\n想象\t172943\n泰语\t172944\n拔地而起\t172945\n勇者王\t172946\n大宗商品\t172947\n广视\t172948\n2265安卓网\t172949\n氟碳喷涂\t172950\n街舞VS热血街舞团\t172951\n生命值\t172952\n三次\t172953\n中兴刷机网\t172954\n52p\t172955\n中国上市公司协会\t172956\n马莉莉\t172957\nGeomagic\t172958\n福利彩票双色球\t172959\ncorelDRAW\t172960\n中央火车站\t172961\n坚果云\t172962\nAussie\t172963\n费尽心思\t172964\n2mp4\t172965\nko\t172966\n任何地方\t172967\n冥王\t172968\n解压器\t172969\nmd模拟器\t172970\n胀袋\t172971\n第一财经周刊\t172972\n陕西省交通建设集团公司\t172973\n240小时\t172974\n综合开发\t172975\nYouth\t172976\n费米子\t172977\n姫川\t172978\n三罐\t172979\n东软睿道\t172980\n废盐\t172981\n附点\t172982\n不知去向\t172983\n观澜河\t172984\n习近平讲故事\t172985\ndescription\t172986\n黑户\t172987\n优梵\t172988\n2018-4-23\t172989\n天堂2吧\t172990\n斜流风机\t172991\n布隆\t172992\n瞬间胶\t172993\nstartActivity\t1
72994\n石榴\t172995\n薛福成\t172996\n15.0.0\t172997\n业者\t172998\n宁波东\t172999\n外挂群\t173000\nNASM\t173001\nexcluding\t173002\n热血屠龙\t173003\n旅游\t173004\n井点\t173005\n河南公务员考试网\t173006\n请相信\t173007\n小留学生\t173008\nYF\t173009\nphpExcel\t173010\n农批\t173011\n潘伟\t173012\n3d眼镜\t173013\n从业\t173014\n官人\t173015\n╯\t173016\nbrk\t173017\n启示类\t173018\n寓言\t173019\n王庆华\t173020\n痰多\t173021\n陈铭章\t173022\n金林\t173023\n蛮横\t173024\n星展银行\t173025\nShopping\t173026\n囤\t173027\n欧陆4\t173028\nyuuki\t173029\n武林网\t173030\nTHOMAS\t173031\nAIM\t173032\ncodesmith\t173033\nnba2k18mc\t173034\n湖南理工学院\t173035\nmade\t173036\n5501\t173037\n虚火\t173038\n七家\t173039\n李默然\t173040\njyj\t173041\n槐米\t173042\n真伤\t173043\n高王\t173044\n吴晓玲\t173045\n中建八局\t173046\n欧模\t173047\n反馈率\t173048\n拉丁裔\t173049\n太阳能发电站\t173050\n滨州医学院\t173051\n十几度\t173052\n不痒\t173053\n摒弃\t173054\nplayerunkowns\t173055\n58在线\t173056\n宽带通\t173057\nUniversità\t173058\n拉达克\t173059\n小好\t173060\n李永明\t173061\npecvd\t173062\n跟踪者\t173063\n冲动\t173064\n初衷\t173065\n天涯若比邻\t173066\n电磁炮\t173067\n黄锈石\t173068\nhzw\t173069\n纯化水\t173070\n菲仕兰\t173071\n付给\t173072\nHarbin\t173073\n20161028\t173074\n歼十\t173075\n杆件\t173076\n简笔画网\t173077\n梯户\t173078\n云播放器\t173079\n王海滨\t173080\n水神\t173081\nedith\t173082\n生线\t173083\n本宫\t173084\n云模板\t173085\n上蹿下跳\t173086\n高污\t173087\n浙江教育_新浪\t173088\nPRS\t173089\n序贯\t173090\n战地模拟器\t173091\n甲醛公司\t173092\n虹之玉\t173093\n银监会办公厅\t173094\n草稿\t173095\n南京体育学院\t173096\nDAI\t173097\n心形线\t173098\ndusk\t173099\n美字\t173100\n沕沕水\t173101\nnaxx\t173102\n担保公司\t173103\n认账\t173104\ncelebration\t173105\nRadiohead\t173106\nwarp\t173107\n中华人民共和国城市居民委员会组织法\t173108\n纺织面料\t173109\n嗔\t173110\n武安\t173111\n星缘\t173112\ndormitory\t173113\n夏阳街道\t173114\n铜掌柜\t173115\nhivesql\t173116\n常州经济开发区\t173117\n求稳\t173118\n倦收天\t173119\n鬼来\t173120\n402sh\t173121\n店子镇\t173122\n第六十七条\t173123\n1.13.1\t173124\n古惑仔之人在江湖\t173125\n清溪镇\t173126\n空心病\t173127\n格里芬\t173128\n非织造布\t173129\n成田机场\t173130\n开心网\t173131\n李令霞\t173132\n温暖\t173133\n各年\t173134\n中文单\t173135\n开场舞\t173136\n森峰科技\t173137\n余杭区政府\t173138\
nbatchsize\t173139\n125级\t173140\nexchange2013\t173141\n一个好人\t173142\n移联网信\t173143\nopposite\t173144\n五块\t173145\ngmp认证\t173146\nMAP\t173147\n美吉姆\t173148\n新闻评论—科学网\t173149\n和班尼特福迪\t173150\n吃亏\t173151\nGPA\t173152\n物业管理\t173153\n72色\t173154\nAlison\t173155\n苏州母子医院\t173156\nhorny\t173157\n太空船\t173158\n黄世友\t173159\n血管硬化\t173160\n佳能7d\t173161\n平坝\t173162\n易拉罐\t173163\n巴沙\t173164\n绯夜传说\t173165\n做手\t173166\nsemicolon\t173167\n全髋关节置换术\t173168\n和平桥\t173169\n赛特\t173170\n富绅\t173171\n日本区\t173172\n布控\t173173\n抿嘴\t173174\nCONVERT\t173175\nled灯\t173176\n坦格利安\t173177\n神鬼世界\t173178\n技术创新\t173179\n360手机浏览器\t173180\n定制型\t173181\n入葬\t173182\n说无\t173183\n银狐犬\t173184\n名马\t173185\n感恩教育\t173186\n亚铁\t173187\n2005\t173188\n妈祖\t173189\n恒温水浴锅\t173190\n陈雪梅\t173191\n商纣王\t173192\n天界山\t173193\n3圈\t173194\n蘸\t173195\nccid\t173196\nVHDL\t173197\n体力学\t173198\n双色注塑\t173199\n歪把子\t173200\n州管\t173201\n布尔迪厄\t173202\n郭子凡\t173203\n2018年4月\t173204\n淋膜\t173205\n父子级\t173206\nFLL\t173207\n第43章\t173208\n云雨\t173209\n烟案\t173210\n孔隙水\t173211\n金展\t173212\n455号\t173213\nOCBC\t173214\n一呼百应网\t173215\n教学\t173216\nnoodle\t173217\n国乒\t173218\n4.7G\t173219\n华东科技\t173220\n太阳能\t173221\n王梦秋\t173222\n姓名牌\t173223\nbids\t173224\n黑武\t173225\nBOOTCAMP\t173226\nces\t173227\ntimescale\t173228\n裂屏\t173229\n了不起的孩子\t173230\n商家帮助中心\t173231\n本城\t173232\nkimiss闺蜜网\t173233\n唐彬\t173234\n一平\t173235\nFEG\t173236\nwalmart\t173237\nhangs\t173238\n两批\t173239\n20171020\t173240\n检验批\t173241\n20171218\t173242\nMedoo\t173243\n新貌\t173244\n金属管\t173245\n边值\t173246\nopenswan\t173247\n广州凤凰网\t173248\nabsolute\t173249\n李明阳\t173250\n热成像\t173251\n3.25%\t173252\n猫小乐\t173253\n木瓜\t173254\n马里奥赛车\t173255\n阿肆\t173256\ncamping\t173257\n六线谱\t173258\n抗日战争\t173259\nwedge\t173260\n四十二式太极拳\t173261\nbl文小受\t173262\n金特里\t173263\n第五集\t173264\n冷战热斗\t173265\n牙关\t173266\n2018-03-29\t173267\n西西游戏网\t173268\n啵\t173269\n高板\t173270\nPolly\t173271\n哈哈\t173272\nRime\t173273\n同村\t173274\n热带\t173275\n1600mhz\t173276\n淘宝指数\t173277\npostpone\t173278\n白城市\t173279\n64G\t173280\niBS\t17328
1\n顶位\t173282\n2047\t173283\nSBH70\t173284\n种性\t173285\ngenetics\t173286\n编译器\t173287\n首仿\t173288\n记录人\t173289\n接种\t173290\n浦发信用卡中心\t173291\n中峰\t173292\n植物生长素\t173293\n黍子\t173294\n湖南省长沙市第一中学\t173295\nyisi\t173296\nP型\t173297\n敌友\t173298\n刘文波\t173299\n非语言\t173300\n伊苏7\t173301\n王辉\t173302\n首信\t173303\n水土镇\t173304\n披萨\t173305\nRocks\t173306\n上海市浦东医院\t173307\n700s\t173308\nlewis\t173309\n沟\t173310\n鸭头\t173311\n照明灯具\t173312\n太平洋时尚网\t173313\n沙东\t173314\n尹施允\t173315\nEMMA\t173316\n加油歌\t173317\n主题曲\t173318\n锅盖头\t173319\nSiffredi\t173320\nkbps\t173321\n地说\t173322\ncharacters\t173323\nmatlab7\t173324\n水室\t173325\n李恩惠\t173326\n深圳市总工会\t173327\n漕河泾新兴技术开发区\t173328\nCu\t173329\n驾照\t173330\n北湖新区\t173331\nGithub\t173332\n山东大学\t173333\n刘瑛\t173334\n青狐\t173335\n棠德花园\t173336\n3.83\t173337\n周海滨\t173338\nz370e\t173339\n月灵\t173340\n怂恿\t173341\nMoe\t173342\n大错特错\t173343\n劫富济贫\t173344\n肝叶\t173345\n主合同\t173346\n税控盘版\t173347\n52.5\t173348\n题名\t173349\n珠海一中\t173350\n小布达拉宫\t173351\n地贴\t173352\n金光\t173353\n西林\t173354\n周易吧\t173355\n帮\t173356\n聂聂\t173357\n慈善法\t173358\n28th\t173359\n阳明山\t173360\n红马\t173361\n钎探\t173362\n飞蚂蚁\t173363\npytharm\t173364\n行政部门\t173365\n吉林省高速公路管理局\t173366\n地区\t173367\n瑜\t173368\nBcc\t173369\n朱江\t173370\n正能量\t173371\nCharms\t173372\n三好\t173373\n濮良贵\t173374\n白莲教\t173375\nfilter过滤器\t173376\n头孢噻肟钠\t173377\n皓丽\t173378\n中科美菱\t173379\n91级\t173380\nwzy\t173381\n优山美地\t173382\n联创电子\t173383\n展现量\t173384\n天乙贵人\t173385\n万公里\t173386\nGuess\t173387\n驭夫\t173388\n流感\t173389\n秦巴娱乐\t173390\n比达网\t173391\n枷锁\t173392\nrefs\t173393\n华胜\t173394\n磷酸二酯键\t173395\n青岛农商银行\t173396\n3.99\t173397\nnuist\t173398\nUPF\t173399\n乾坤袋\t173400\n京圈\t173401\n踢皮球\t173402\n于芳\t173403\n自定心\t173404\n云月\t173405\n独孤依人\t173406\n社区医院\t173407\n1050Ti\t173408\n交通类\t173409\n大力钳\t173410\n51单片机定时器\t173411\n摄之徒\t173412\n过孔盖油\t173413\n中英教育\t173414\nFai\t173415\nopening\t173416\n10690\t173417\n磁粉离合器\t173418\n罗得\t173419\n陶晶\t173420\n葫芦巴\t173421\n拳皇14吧\t173422\n虎啸龙吟\t173423\n张新明\t173424\nSissy\t173425\n道孚\t173426\n中国社会科学院研究生院\t1734
27\n唐家沱\t173428\n油量\t173429\n伯诺德\t173430\n村上里沙\t173431\n徽州大道\t173432\n师母\t173433\n昆特\t173434\n美食文\t173435\n复勘\t173436\n博冠\t173437\n散财\t173438\n归纳法\t173439\n几胜\t173440\nnotepadqq\t173441\n警情\t173442\nSpiiker\t173443\n168厘米\t173444\n真情实感\t173445\n张钧\t173446\n柳时元\t173447\n马鞍镇\t173448\nEAU\t173449\n不锈钢制品\t173450\n站址\t173451\n货柜车\t173452\n2.12.0\t173453\n婕斯\t173454\n颈项\t173455\nppt播放\t173456\n金地格林世界\t173457\n乔娜\t173458\n广州市委\t173459\nBlu-ray\t173460\n君汇\t173461\n首块\t173462\n小爱\t173463\nrdb\t173464\n和平型\t173465\n李老\t173466\nBronze\t173467\n四肖期期准\t173468\nCurl\t173469\n尹柯\t173470\n自相矛盾\t173471\n徐建\t173472\n碳架\t173473\n品相\t173474\n小学妹\t173475\n盘城\t173476\n来讲\t173477\n0303\t173478\n乐维斯\t173479\nLara\t173480\n刘天礼\t173481\nLABVIEW\t173482\n研磨机\t173483\n纱线展\t173484\n长安欧诺汽车\t173485\n10月初\t173486\n潮汛\t173487\n太阳集团\t173488\n平安驾校\t173489\n韶关市\t173490\n吴笑笑\t173491\nM30\t173492\n61岁\t173493\n第一次世界大战\t173494\ngoogle-chrome\t173495\n废票\t173496\n六大类\t173497\nFlights\t173498\n旅顺\t173499\nxmlspy\t173500\n弹一弹\t173501\n地源\t173502\n葡计\t173503\niHerb\t173504\nsizing\t173505\n孤雁\t173506\n东风本田\t173507\n上吊线\t173508\n四畳半\t173509\n卡轨\t173510\n暴笑\t173511\nbaike\t173512\n域内\t173513\n乐乎\t173514\n低噪\t173515\nencountered\t173516\n氯碱化工\t173517\n护理液\t173518\n莲花新城\t173519\n天龙八部黄日华\t173520\n给青年的十二封信\t173521\n承装\t173522\nwordpress\t173523\n4000分\t173524\n国者\t173525\n蛇纹石\t173526\n电数\t173527\n监狱风云3\t173528\n陶制\t173529\n英科医疗\t173530\n王不留\t173531\nEAI\t173532\nI2C\t173533\n神厨小福贵\t173534\n千\t173535\n被动\t173536\n晨雾\t173537\n底材\t173538\n刻写\t173539\n石浦\t173540\n压纹\t173541\n安徽交通广播\t173542\n大鹿\t173543\n夏粮\t173544\nHurricane\t173545\n试片\t173546\nAvenger\t173547\n麦德斯·米科尔森\t173548\n小黄人大眼萌\t173549\n椿萱茂\t173550\n百草霜\t173551\n写字楼\t173552\n澳门科技大学\t173553\n【伟\t173554\n青山绿水\t173555\n偷盗\t173556\n边板\t173557\n挂靠\t173558\n文心雕龙\t173559\n瞠目\t173560\n20170308\t173561\n广东省医院\t173562\n汤阴\t173563\n韦恩\t173564\n宿泊\t173565\n联合路\t173566\n京建法\t173567\n火影忍者:究极风暴4\t173568\n新溪\t173569\n安东尼奥\t173570\n岩墙\t173571\n阳床\t173572\n172\t173573\n书币\t173
574\nl440\t173575\n长春理工大学光电信息学院\t173576\n真汉子\t173577\n国际经济与贸易\t173578\n红包车\t173579\n清洗\t173580\n与否\t173581\n保护费\t173582\n停手\t173583\n寂灭\t173584\n田敏\t173585\n浙江亚厦装饰股份有限公司\t173586\n一般性\t173587\n500头\t173588\n散热架\t173589\n阴展\t173590\n走下坡路\t173591\n拟物\t173592\nRT_\t173593\n稿稿\t173594\nHangfire\t173595\nBEM\t173596\n类目\t173597\n暗黑破坏神3】数据库\t173598\nbaron\t173599\n熊安\t173600\n武汉工程大学流芳校区\t173601\n北京东站\t173602\n询证\t173603\nRumors\t173604\n渗进\t173605\n9.0.1\t173606\n华西\t173607\n寿司醋\t173608\n大草\t173609\n新知\t173610\nBioinformatics\t173611\n百度文本编辑器\t173612\n国家卫生健康委\t173613\n北海北\t173614\n划线机\t173615\n栽花\t173616\n出口认证\t173617\n金石东方\t173618\ndbscan\t173619\n鲤鱼\t173620\n无机结合料\t173621\n玩游\t173622\n线员\t173623\n萤石工作室\t173624\n石门村\t173625\n高佣\t173626\n锁妖塔\t173627\n全框\t173628\nroly\t173629\n2017年05月\t173630\n汶川县人民政府\t173631\n王之炀\t173632\n帝豪GL论坛\t173633\n第25章\t173634\n太东公园\t173635\n九九夜\t173636\n囚徒\t173637\n诸\t173638\n张燕友\t173639\nK4\t173640\n中华预防医学会\t173641\n归程\t173642\niCloud\t173643\njamin\t173644\n本生\t173645\n赔率\t173646\n国片\t173647\n南京理工大学紫金学院\t173648\n5951\t173649\n上海西\t173650\nAPA\t173651\ndocstring\t173652\n领口\t173653\n行迹\t173654\n泰安镇\t173655\n转轮除湿机\t173656\n宁波市第六医院\t173657\nEiji\t173658\n智障界\t173659\n大V快跑\t173660\n文斋\t173661\n郑州高铁站\t173662\n好得\t173663\nDaVinci\t173664\n赢智\t173665\n小牌\t173666\n连败\t173667\n心理所\t173668\n察哈尔右翼后旗\t173669\n科学技术部火炬高技术产业开发中心\t173670\nRack\t173671\ncsc\t173672\n无改\t173673\n十八湾\t173674\n192.168.43.1\t173675\n2018年2月1日\t173676\ncalendars\t173677\nreadfile\t173678\nPiaget\t173679\n成败\t173680\npiv\t173681\nElan\t173682\n安装源\t173683\n会考试\t173684\n中山东路\t173685\nAutoCAD2013\t173686\nqq空间动态\t173687\n泰国政府\t173688\nGGT\t173689\n老滚5\t173690\n中国福建_福建政府\t173691\n九品蜀门\t173692\nOFAC\t173693\n纺织厂\t173694\n润滑泵\t173695\n慕尼黑啤酒节\t173696\n丹武\t173697\n移至\t173698\nweb-view\t173699\ngoogle\t173700\nGIA\t173701\nUIDatePicker\t173702\n百万\t173703\n2888元\t173704\n无锡政府\t173705\n美国证监会\t173706\n新作为\t173707\n双床房\t173708\n三十三岁\t173709\nPhotoSwipe\t173710\nWin8系统\t173711\n云界\t173712\nContin
uum\t173713\n或曰\t173714\n0819\t173715\n朝见\t173716\n拈花惹草\t173717\n功利性\t173718\nSEEK\t173719\n肠鸣\t173720\n温州\t173721\n中华人民共和国担保法\t173722\n琐记\t173723\n资兴市政府网\t173724\n非法持有毒品罪\t173725\nQQJAY\t173726\n安锋\t173727\n丽思卡尔顿\t173728\n三株口服液\t173729\nnx\t173730\n莆房网\t173731\n姜杉\t173732\n于凤至\t173733\n伊斯卡\t173734\n微肥\t173735\n智影\t173736\nJIAO\t173737\n舟过安仁\t173738\nJTJ024\t173739\n南京大学信息管理学院\t173740\n深圳市燃气集团股份有限公司\t173741\n扁平苔癣\t173742\n观评\t173743\ng7x\t173744\n我爱男闺蜜\t173745\n卖火\t173746\n湖南省经济和信息化委员会\t173747\n迈腾330\t173748\nExpired\t173749\n关怀\t173750\n学士府\t173751\nBBR\t173752\n举起手\t173753\n胡先煦\t173754\n企校\t173755\n鱼化寨\t173756\n伊思多尔\t173757\n义务教育段\t173758\n回顾性\t173759\n复核\t173760\n读览\t173761\n贝尔曼\t173762\n十八紫微网\t173763\n趣谈\t173764\n堪培拉\t173765\n工业\t173766\ndn150\t173767\n千金裘\t173768\n莱奥\t173769\n清洁员\t173770\n以人为\t173771\n意千重\t173772\narcmap\t173773\n0氪\t173774\n武陵山\t173775\nCemu\t173776\n铁达尼号\t173777\n大声呼喊\t173778\n24.com\t173779\n门槛石\t173780\n山东省通信管理局\t173781\n中方县\t173782\n融汇\t173783\nDBF\t173784\n水帘柜\t173785\n企业会计学\t173786\n粤军\t173787\n满记\t173788\n凑莉久\t173789\n一型\t173790\n咪咕\t173791\n显微结构\t173792\ncpk\t173793\n新浜\t173794\n刺刀\t173795\n马踏湖\t173796\nLED\t173797\n12组\t173798\n坦克世界闪电战吧\t173799\n战女神\t173800\n插柳\t173801\n至美\t173802\nmeanwhile\t173803\nAOMEI\t173804\n信访条例\t173805\n园林网\t173806\nwithholding\t173807\n7132\t173808\n品道\t173809\n杨紫琼\t173810\nAshton\t173811\n盆子\t173812\n成都小学\t173813\n事例\t173814\n党籍\t173815\n34期\t173816\n螺翼式\t173817\n太医\t173818\n国家地理频道\t173819\n毫厘\t173820\n摩羯座女\t173821\n惠水县\t173822\n34位\t173823\nzheng\t173824\n20160816\t173825\n若干条\t173826\n周郎顾\t173827\nxianyu\t173828\n美体师\t173829\n茶屋\t173830\n郑人买履\t173831\n天誉花园\t173832\n存活率\t173833\n云泥\t173834\n宝马X5\t173835\nmedela\t173836\n1685\t173837\n镐子\t173838\n山西人\t173839\n声明式\t173840\n拥有\t173841\n八十岁\t173842\n杨宪益\t173843\n呼呼声\t173844\n涂图\t173845\n4月11日\t173846\n准备着\t173847\nPlace\t173848\n彩铃\t173849\n精牛\t173850\n江苏农商行\t173851\n桃村\t173852\n钟欣桐\t173853\n安卓4.3\t173854\n耳温\t173855\n村游网\t173856\n同一首歌\t173857\n名胜世界\t173858\n屯
门市广场\t173859\n生活秀\t173860\nyilu\t173861\n供电局\t173862\n换班\t173863\nBritax\t173864\n动盘\t173865\n航天电器\t173866\n孰优\t173867\n戏迷\t173868\nvalve\t173869\n七代\t173870\n惊魂未定\t173871\n猎头顾问\t173872\n100头\t173873\n火影劫\t173874\n二元化\t173875\n许伟\t173876\n南沙群岛\t173877\n蓄发\t173878\n甜果\t173879\n国土资源局\t173880\n扯扯\t173881\n数目\t173882\n丙午\t173883\n五一节后\t173884\n乐虎\t173885\n下载券\t173886\n大名城\t173887\n猎灵\t173888\n包文婧\t173889\njavmoo\t173890\nres/raw\t173891\nHitch\t173892\n巡礼\t173893\n云知声\t173894\nnop\t173895\n长拳\t173896\n铁道学院\t173897\n就业登记\t173898\n69周年\t173899\n本来\t173900\n泛蓝\t173901\n新奥特\t173902\nnielsen\t173903\n70平方米\t173904\n3906\t173905\n汇网\t173906\n石虎\t173907\n美食坊\t173908\n中国烟草公司\t173909\n焦作万方\t173910\n泪\t173911\nawr\t173912\n盾击\t173913\n氧胆酸\t173914\n防窥膜\t173915\n天津地铁6号线\t173916\n白雀\t173917\n碳化钛\t173918\n日丰\t173919\nrussian\t173920\nmuji\t173921\ncaribbeancom\t173922\nswmm\t173923\n姜广策\t173924\nG620\t173925\nSecurity\t173926\ndmf\t173927\nrrr\t173928\n渝人社\t173929\n武警医院\t173930\n启普\t173931\n击音\t173932\n洪尚秀\t173933\n慕容沣\t173934\n徐才\t173935\nkitty\t173936\n2375\t173937\n沈阳中街\t173938\n查摆\t173939\nNiceHash\t173940\n烤肉\t173941\n一个场\t173942\nfingers\t173943\n360企业安全集团\t173944\n利奥博德\t173945\n努钦\t173946\n侦查学\t173947\n隔离\t173948\nJedisPool\t173949\nhd660s\t173950\n炫斗\t173951\nForeach\t173952\n魔方公寓\t173953\n正理\t173954\nflsh\t173955\n步奏\t173956\n餐廳\t173957\n马坊\t173958\n金丝小枣\t173959\n博业\t173960\nugg\t173961\nHTML5/CSS3\t173962\n注准\t173963\n核桃木\t173964\n新华信托\t173965\n清河街道\t173966\n二维随机变量\t173967\nGAMMA\t173968\n乌木\t173969\n效价\t173970\n弯度\t173971\n兵策\t173972\n机械法\t173973\ncrawling\t173974\n电阻率测试仪\t173975\n珞璜镇\t173976\n鄂温克\t173977\n分页查询\t173978\n金刚线\t173979\n殷浩\t173980\n回型\t173981\n龙海市人民政府\t173982\nBars\t173983\n英雄岛\t173984\n挪到\t173985\n大连普兰店\t173986\nRh\t173987\n核导弹\t173988\n花重锦官城\t173989\n1347\t173990\n府城镇\t173991\n吹灰之力\t173992\n電車\t173993\n颜骑\t173994\n15马力\t173995\n考代\t173996\n750千伏\t173997\n快猫\t173998\n零式\t173999\n浮城\t174000\nVFX\t174001\n折耳猫\t174002\n次元立方网\t174003\n国家助学贷款\t174004\n熙园\t174005\n
relations\t174006\n老虎滩\t174007\n蛇影\t174008\n百子湾路\t174009\nBragi\t174010\n品胜\t174011\nGigE\t174012\n机器学习基石\t174013\n0.5毫米\t174014\nPiggy\t174015\n三峡日报\t174016\n释义\t174017\n京东咚咚\t174018\n300743\t174019\n威势\t174020\n塞萨尔\t174021\n妻弟\t174022\nADC\t174023\n桑尼号\t174024\n京哈\t174025\n次方\t174026\n晋元\t174027\n模板兔\t174028\n失根\t174029\nFCFF\t174030\n卡托普利\t174031\n花瓣\t174032\n过重\t174033\n私人\t174034\n仙岩街道\t174035\n非诚不找\t174036\n262号\t174037\n天通苑\t174038\n电视机\t174039\n牡丹籽油\t174040\n600521\t174041\n摸胸\t174042\n苏炳添\t174043\ncityu\t174044\n硝酸银\t174045\ncydia\t174046\n张派\t174047\n合众人寿\t174048\n喜之郎\t174049\n巨人\t174050\ngraphical\t174051\n娱乐化\t174052\n绿缘\t174053\n86亿\t174054\n万联证券\t174055\nm720\t174056\n第78号\t174057\n嘬\t174058\n石光荣\t174059\n想读\t174060\n喷雾\t174061\n_健康百科\t174062\nQQ群发\t174063\n4.92\t174064\n王志峰\t174065\n中国东方航空集团有限公司\t174066\n苏语凝\t174067\n广州黄埔军校\t174068\n单纯\t174069\nhttpresponse\t174070\n601901\t174071\n锐线\t174072\n杨宝德\t174073\n蓝科锂业\t174074\n5.6.36\t174075\n消保保证金\t174076\n之二十\t174077\n两厢车\t174078\n检务\t174079\n瑧湾\t174080\n梦行\t174081\nCoddingMan\t174082\n韦帕\t174083\n战成\t174084\n隐月\t174085\nLoser\t174086\n土默特左旗\t174087\n博士学位\t174088\n海庄市\t174089\n斯特林堡\t174090\n阿朱\t174091\n人口普查\t174092\n南京交通银行\t174093\n夏莉\t174094\n砍断\t174095\n杂费\t174096\n凤子君\t174097\n深圳国家高技术产业创新中心\t174098\n梧\t174099\nbxl\t174100\n情杀\t174101\n4.4mm\t174102\n陆家嘴信托\t174103\n计算式\t174104\n罗罗公司\t174105\nENSP\t174106\n红谷滩新区\t174107\nfme\t174108\n颜悦\t174109\n该怎样\t174110\n采摘器\t174111\n钢拱架\t174112\n1一4\t174113\n凤城五路\t174114\n上座率\t174115\n东高地\t174116\n爱秀\t174117\n扬州个园\t174118\n宁河区\t174119\n阡陌客\t174120\nByredo\t174121\nwkwebview\t174122\n贩毒案\t174123\n献身\t174124\n筱筱\t174125\n内生性\t174126\n床事\t174127\n余政\t174128\n上海物流\t174129\n网眼\t174130\n瑞卡\t174131\nbape\t174132\n28元\t174133\n菊苣根\t174134\n百目\t174135\n几等奖\t174136\n推进器\t174137\n东京食尸鬼3\t174138\n東京熱\t174139\n20160104\t174140\n0777\t174141\n歼10c\t174142\n铁合金\t174143\n金山打字通\t174144\n双清湾\t174145\n生齿\t174146\n菠萝树\t174147\n皇\t174148\nREITS\t174149\n不言而喻\t174150\n观心\t174151\n伊兰特论坛_汽车
之家论坛\t174152\n工具性\t174153\n明丽\t174154\n泰源\t174155\n戴蒙\t174156\n丰泰\t174157\n蛇年\t174158\n更新器\t174159\nRepair\t174160\n落秋\t174161\n应用科学\t174162\nrecycler\t174163\n农光\t174164\n青银高速\t174165\n1列\t174166\n阿秀\t174167\n竹笛\t174168\n继子\t174169\n大麻\t174170\n知我心\t174171\n绿联\t174172\n西南财经大学》\t174173\n祭司\t174174\n15张\t174175\nLean\t174176\n地震鲶\t174177\n神装\t174178\n交银金融网\t174179\n奶袋\t174180\n花楸\t174181\n大兴调查研究之风\t174182\n品保部\t174183\n停车证\t174184\n层状\t174185\n如注\t174186\nfcc\t174187\n佳能G7X\t174188\nmatla\t174189\n5点半\t174190\n桑托劳\t174191\n新疆人\t174192\n缘石\t174193\n苍生\t174194\n合肥机场\t174195\n坛九\t174196\n咖啡师\t174197\n沃商店\t174198\n吐鲁番政府网\t174199\n小茉莉\t174200\n奶子\t174201\n富力泉\t174202\nebsd\t174203\n1800年\t174204\nnagesa\t174205\n口袋妖怪3ds\t174206\n气动隔膜阀\t174207\n2.0m\t174208\n香满园\t174209\n停走\t174210\n成都市妇女儿童中心医院\t174211\n内囊\t174212\n丝足\t174213\n雁塔地宫\t174214\n南北战争\t174215\n2月8日\t174216\n去应力\t174217\nfacerig\t174218\n剪刀手爱德华\t174219\n柳林镇\t174220\n20180404\t174221\n锈色\t174222\n灵寿县\t174223\n宁静号\t174224\n极电机\t174225\n平遥古城\t174226\n茕茕\t174227\n打草惊蛇\t174228\n截桩\t174229\n高宝\t174230\nimporter\t174231\nminiconda\t174232\nCocoa\t174233\n力证\t174234\n礼物\t174235\n倒置\t174236\njasmin\t174237\n砸金蛋\t174238\n上海虹桥医院\t174239\nentrepreneur\t174240\n王安\t174241\n乳胶\t174242\n交易案\t174243\nCImage\t174244\n乳腺囊性增生\t174245\n高段\t174246\n2k12\t174247\n浸出液\t174248\n20150130\t174249\nTissot\t174250\n王津\t174251\n灭火剂\t174252\n绘图\t174253\n7480\t174254\n浙江国税电子税务局\t174255\nprimocache\t174256\nxplus\t174257\n叶文洁\t174258\n社会保险登记表\t174259\n补足\t174260\nSSH端口号\t174261\n汇票贴现\t174262\n姬子\t174263\n增值税发票\t174264\n长沙地铁3号线\t174265\n联合式\t174266\n大镜\t174267\n中银通支付卡\t174268\n201512\t174269\n感恩之心\t174270\n书号\t174271\n惊惶\t174272\n质子交换膜燃料电池\t174273\nwayne\t174274\n牙科诊所\t174275\n金卡\t174276\n超滤净水器\t174277\n万能码\t174278\n管理思想史\t174279\n陈氏\t174280\n52秒\t174281\n建管\t174282\ntoll\t174283\n张基河\t174284\n缓震\t174285\n大西洋新城\t174286\n焦作日报社\t174287\n辽宁中医药大学\t174288\n天友\t174289\nmbot\t174290\n藤井有彩\t174291\n王子变青蛙\t174292\nSpringerLink\t174293\n小學\t174294\nfiledi
alog\t174295\n熄焦\t174296\n古手川唯\t174297\n张晓亮\t174298\nLancet\t174299\n刮刮奖\t174300\n38.9\t174301\n超声波清洗剂\t174302\n闭包函数\t174303\npdd\t174304\nsinónimos\t174305\n美德玛\t174306\n订舱\t174307\naxd\t174308\n多此一举\t174309\n公养\t174310\n足太阳膀胱经\t174311\n大哥哥\t174312\n2.4l\t174313\n福利在线\t174314\nevt\t174315\n布政使\t174316\n抓鸟\t174317\n默默地\t174318\n重质\t174319\n铁心桥\t174320\n千金药业\t174321\n六七十年代\t174322\n韶光\t174323\n甲片\t174324\n大石窝镇\t174325\n立方米\t174326\nwww.hc360.com/wrong/cp404\t174327\n思念已故\t174328\n三等分\t174329\n木兰草原\t174330\n魔女君越\t174331\n心头\t174332\n深惠\t174333\n真实的谎言\t174334\nRoar\t174335\n家境\t174336\n即时比分直播网\t174337\nrehash\t174338\namcharts\t174339\n北京嘀嘀无限科技发展有限公司\t174340\n3.2m\t174341\n镀锌管\t174342\n曹阳\t174343\n王菲\t174344\n实名认证\t174345\n祁斌\t174346\n白兰氏\t174347\n沛县\t174348\n河南日报网-河南日报\t174349\n居酒屋\t174350\n妃妃\t174351\n玻纤\t174352\n711便利店\t174353\n成圣\t174354\n双柱\t174355\n广州政府\t174356\n贫乳\t174357\n仙医\t174358\n阜平县\t174359\n刘明辉\t174360\ncyst\t174361\n陆抑非\t174362\n遗存\t174363\n不倒\t174364\n200多针\t174365\n民和县\t174366\n无糖型\t174367\nblair\t174368\nめ\t174369\nETtoday圖集\t174370\n愿赌服输\t174371\n太麻烦\t174372\n山阳镇\t174373\nScanning\t174374\n滴滴云\t174375\n鸟人村\t174376\nDyes\t174377\neav\t174378\n第9节\t174379\n秀英区\t174380\nBaked\t174381\nDateTime\t174382\nCove\t174383\n澡盆\t174384\ncoating\t174385\n杨新\t174386\nHLTV\t174387\nTomas\t174388\n七颗钻石\t174389\n请记\t174390\n济南遥墙机场\t174391\nK70\t174392\ntcx\t174393\n辛\t174394\n外露式\t174395\n2DLC\t174396\n1m\t174397\n关工委\t174398\n一件\t174399\n计时\t174400\n白发\t174401\n强健\t174402\n可持续性\t174403\n养生之道\t174404\n意大利菜\t174405\niDRAC\t174406\n闹鬼\t174407\n略知\t174408\n┃\t174409\n2017年10月22日\t174410\n1622\t174411\n了当\t174412\n武汉医院\t174413\n成山路\t174414\n页_汉寿政府网\t174415\n童话村\t174416\n软误\t174417\n调节板\t174418\n有物\t174419\n高小\t174420\nnas4free\t174421\n女忍者\t174422\n北京市律师协会\t174423\n基金托管人\t174424\n中国核工业建设集团公司\t174425\nRhino\t174426\nSQL数据库\t174427\nBlued\t174428\n安联保险\t174429\n0204\t174430\n风骏6\t174431\n刘光辉\t174432\ndeed\t174433\n人间情缘\t174434\n大板桥\t174435\n宝杨路\t174436\n马里奥银河\t174437\n大棚\t
174438\n晋城银行\t174439\n第一诫\t174440\n559\t174441\n软皮\t174442\nSQLEXPRESS\t174443\n比特范\t174444\n窗边\t174445\n黑蚁\t174446\n摩登时代\t174447\n善款\t174448\n浙一\t174449\n位育\t174450\nJDK1.7\t174451\n成都市民政局\t174452\nOmega\t174453\n止咳片\t174454\n24.2\t174455\npclint\t174456\n心内\t174457\n无形资产\t174458\nwalter371\t174459\n钙铁锌口服液\t174460\n常华敏\t174461\n地信\t174462\n罪孽\t174463\n仙儿\t174464\n考察报告\t174465\n珍菌堂\t174466\n梅溪新天地\t174467\ngus\t174468\n沙扒湾\t174469\ns3c2440\t174470\nBeard\t174471\n皮色\t174472\n600006\t174473\nУ\t174474\n鞋区\t174475\n除颤仪\t174476\n重蹈\t174477\n衢州西区\t174478\n3穴\t174479\nlevante\t174480\n不可原谅\t174481\n一千年后\t174482\nn9009\t174483\n满门\t174484\n4022\t174485\n李高峰\t174486\nSCALE\t174487\ncss模板\t174488\nutil包\t174489\n梁光烈\t174490\nipod\t174491\n180\t174492\n比萨斜塔\t174493\n魔药\t174494\n猜心\t174495\n白马井\t174496\n王山\t174497\n封建社会\t174498\ndispatcherservlet\t174499\n201不锈钢板\t174500\n青州市\t174501\n骑射\t174502\nLaplace\t174503\n764\t174504\n穿云\t174505\nappp\t174506\n薪点\t174507\nPathophysiology\t174508\n春末夏初\t174509\n锲而不舍\t174510\n腐皮\t174511\n甜菜碱\t174512\nwhichav\t174513\nmongolia\t174514\n王欣婷\t174515\n舞秀\t174516\n打印派\t174517\n碾米\t174518\n群人\t174519\n东扩\t174520\nWeston\t174521\n崇仁镇\t174522\ne5\t174523\n小神农\t174524\n中科院心理研究所\t174525\n速感\t174526\n注册商标\t174527\n迪科斯彻\t174528\nNote论坛\t174529\n景城\t174530\n不给\t174531\n珠海市妇幼保健院\t174532\n薛斌\t174533\nmaxsize\t174534\n红花岗区\t174535\n和平乡\t174536\n九龙湖\t174537\n六连\t174538\nsrpg\t174539\n9.6英寸\t174540\nKA\t174541\n缩写词\t174542\n佳仁\t174543\ntbb\t174544\n空罐\t174545\nXIUMIN\t174546\n能掉\t174547\n超碰撞在线\t174548\nRaiders\t174549\n锡膏\t174550\n合景\t174551\n打进\t174552\n一周之内\t174553\nValmont\t174554\n楼板\t174555\n第717章\t174556\n\u001e\t174557\n提香\t174558\n校友\t174559\n略懂\t174560\n石羊\t174561\n蚌山\t174562\n11周岁\t174563\n化学式\t174564\n晨\t174565\n精选篇\t174566\n海川\t174567\n黄圣蔡康永\t174568\n江总\t174569\n刺客教条吧\t174570\n性暴力\t174571\nnewface\t174572\n100张\t174573\n零距离商务网\t174574\n挡案\t174575\n漕\t174576\n于魁智\t174577\n家电论坛\t174578\n提神\t174579\n石桥路\t174580\n沈阳机场\t174581\n炫动\t174582\ntyb\t174
583\n阿拉伯文\t174584\n中交投资有限公司\t174585\nMP3+\t174586\n李欧梵\t174587\n卧薪\t174588\n两法\t174589\n海伯伦\t174590\n星空下\t174591\n一枝梅\t174592\n刘芬\t174593\n石家庄市中医院\t174594\n9.4%\t174595\nSubs\t174596\n大乘\t174597\n乾安县\t174598\n稻草人\t174599\n上海城\t174600\n6.5.8\t174601\n福利播放器\t174602\n石家庄精英中学\t174603\n企图\t174604\n中国食品药品检定研究院\t174605\n小屯村\t174606\n黄嘴\t174607\n昆明学院\t174608\n马赛人\t174609\n邹兆龙\t174610\n青岛奥帆中心\t174611\n无锡妇幼保健院\t174612\n绿风\t174613\npmbok\t174614\n德贝橱柜\t174615\n令箭\t174616\n安娜金\t174617\n章龄\t174618\n怪声\t174619\n听会\t174620\n脑筋急转弯_匿名_天涯问答\t174621\nTSV\t174622\n清浊\t174623\n新加坡机场\t174624\n问事\t174625\n嘉伦\t174626\n26.58\t174627\n燕青\t174628\n计划生育\t174629\nR7000\t174630\n宠儿\t174631\n王者荣耀2018kpl\t174632\n吹弹可\t174633\n3397\t174634\n22套\t174635\n上海7号线\t174636\nonsubmit\t174637\n温顺\t174638\n雨凡\t174639\n光明新区\t174640\n众泰T700\t174641\n安易\t174642\n窗位\t174643\n阿里钱盾\t174644\n选校\t174645\nProfit\t174646\n倪琳\t174647\n洛克王国\t174648\n芬克\t174649\nSHF\t174650\n20余个\t174651\n徐浩然\t174652\n独乐\t174653\n衣联论坛\t174654\n平板盒子网\t174655\n托福\t174656\n优秀奖\t174657\ne5800\t174658\n委托代理关系\t174659\nRMBP\t174660\n长城哈佛\t174661\n脑活素\t174662\n米寿\t174663\n全英音乐奖\t174664\n近1个月\t174665\n11g\t174666\n桃花潭\t174667\n进化者\t174668\n卷管\t174669\n阿克陶\t174670\n向往的生活\t174671\n闹闹天宫\t174672\n泥条\t174673\n50讲\t174674\ngalgame,hgame\t174675\n2017年4月8日\t174676\n周桂珍\t174677\n002410\t174678\n九根\t174679\n兴隆寺\t174680\n不像话\t174681\n大众养生网\t174682\n博库书城\t174683\n公选课\t174684\n遂宁市\t174685\n桂发祥\t174686\n迈克生物\t174687\n尿素氮\t174688\n找回误\t174689\n方式\t174690\nINTERVAL\t174691\n西藏旅游\t174692\n地球物理\t174693\ndhu\t174694\n云南省司法厅\t174695\n杳无音讯\t174696\ncarrier\t174697\nvid\t174698\n窖口\t174699\n笔盒\t174700\n偷心\t174701\n宋真宗\t174702\n张中生\t174703\n数码宝贝4\t174704\n驴肉\t174705\n任哲\t174706\n01民\t174707\n张大宁\t174708\n铝企\t174709\n土地局\t174710\n深圳平安\t174711\n慕寒卿\t174712\n新场镇\t174713\n180家\t174714\n164号\t174715\n大掌门2\t174716\n第三辆\t174717\n日晕\t174718\n贪便宜\t174719\nHandjob\t174720\n大爱无言\t174721\n慕斯蛋糕\t174722\nRJ\t174723\n低下头\t174724\n生物技术有限公司\t174725\n联勤\t174726\n微蓝\t174727\n17173天下3\
t174728\n南华寺\t174729\n4528\t174730\n闹场\t174731\n锦天城\t174732\nphosphorus\t174733\n河源市\t174734\n原阳县\t174735\n20万美元\t174736\n排水孔\t174737\n恐怖电影\t174738\n2012年1月1日\t174739\n洋气\t174740\n亿联银行\t174741\nmaxcompute\t174742\n实木复合烤漆门\t174743\n华府网\t174744\n行政诉讼案\t174745\n英语六级阅读\t174746\n向城市\t174747\ntrackingmore\t174748\n三相电路\t174749\n脚底\t174750\nmachines\t174751\nonline3\t174752\n48章\t174753\n绍兴移动\t174754\n2670\t174755\n太阳岛\t174756\nROR\t174757\n首月\t174758\n芬腾\t174759\n87999201\t174760\n和栈\t174761\n薄荷脑\t174762\n魔眼\t174763\n长螺旋钻孔灌注桩\t174764\nholdings\t174765\n巾帼展\t174766\n张玉卓\t174767\n江南化工\t174768\n缺水\t174769\n尿桶\t174770\n大杀四方\t174771\n黑晶\t174772\n补偿性\t174773\nHanser\t174774\n以德报怨\t174775\n南岭\t174776\nSunnee\t174777\n发冷\t174778\n偏门|捞偏门\t174779\nR1S\t174780\n阿帕\t174781\n柜面\t174782\n中山市中医院\t174783\n咔嗒\t174784\n选业\t174785\n广告费\t174786\n讨喜\t174787\n停封\t174788\nconcurrenthash\t174789\n嘉善房产超市\t174790\n2010-2011年\t174791\n中天\t174792\n狼师\t174793\n7个工作日\t174794\n中藏联盟\t174795\nqg\t174796\n多层次\t174797\n龙泉宝剑\t174798\n0x000006d9\t174799\n伸缩臂\t174800\n吉赛尔\t174801\n红花\t174802\n回行\t174803\n相信自己\t174804\n河北建筑工程学院\t174805\n04.03\t174806\n悬吊式\t174807\n永猎\t174808\ncactiez\t174809\n两栖\t174810\n白龙\t174811\n不存钱\t174812\n20世纪80年代\t174813\n电伴\t174814\n东华帝君\t174815\nMotivational\t174816\n高句丽\t174817\n覆盖物\t174818\n砼\t174819\nVinda\t174820\nxlight\t174821\n担责\t174822\n纯良\t174823\n少女小渔\t174824\n医疗公司\t174825\n十五次\t174826\n马赛曲\t174827\n阻\t174828\n张淼\t174829\n自然景观\t174830\n枸杞泡酒\t174831\n鬃毛\t174832\n上海乔羽生物科技有限公司\t174833\n精品软件区\t174834\n风林\t174835\n测产\t174836\n奇迹时代3\t174837\n破损率\t174838\n半生笛\t174839\n4011\t174840\nriders\t174841\n日坛\t174842\nAri\t174843\n福建幼儿师范高等专科学校\t174844\n唐口\t174845\ni78700k\t174846\n长江中路\t174847\n20160925\t174848\n上海民航职业技术学院\t174849\n三口之家\t174850\n溃退\t174851\ngx7\t174852\n000670\t174853\n24所\t174854\n自顶\t174855\nexeclp\t174856\n江湖学院\t174857\nuae\t174858\n清远市人民政府\t174859\n叶集区\t174860\n苏夏\t174861\nPano2VR\t174862\n山娃\t174863\n少管所\t174864\nh9t韩剧网\t174865\n光瓶\t174866\n大耳朵图图之美食狂想曲\t174867\n3项\t1
74868\n健友股份\t174869\n玻色子\t174870\n铺开\t174871\nFrameworks\t174872\nGradle\t174873\n锯末粉碎机\t174874\n江苏长江商业银行\t174875\n亚衣\t174876\n插班\t174877\n污污\t174878\n网人\t174879\nVIETNAM\t174880\n孙某某\t174881\nX230\t174882\n新源里\t174883\napollo\t174884\n四有\t174885\n百度文学\t174886\n稀罕物\t174887\nwalled\t174888\n一生的爱\t174889\n卧\t174890\n仙逆\t174891\nSustaina\t174892\n腾讯地图api\t174893\n北京现代汽车有限公司\t174894\n1441\t174895\n日史\t174896\n非沪籍\t174897\n梳洗\t174898\n肛乳头肥大\t174899\n刘凤科\t174900\n赛人\t174901\n夜晚的实验\t174902\nOilfield\t174903\nV9.2\t174904\n口算乘法\t174905\n精石\t174906\ntend\t174907\n县工会\t174908\n井研县\t174909\n紧缩\t174910\n邮驿\t174911\n乐符\t174912\n1.9.14\t174913\n云南经济管理学院\t174914\n乐园君\t174915\nWingsBlog\t174916\n二诊\t174917\n3ce\t174918\nwin10卡顿\t174919\nbelonging\t174920\n水生动物\t174921\nAUTOCAD2007\t174922\n豁\t174923\niTunes12.7\t174924\nsd高达g世纪超越世界吧\t174925\n三叉\t174926\n领动论坛_汽车之家论坛\t174927\n妖怪百姬\t174928\n350m\t174929\n升调\t174930\n理睬\t174931\n质量保函\t174932\n深情触摸\t174933\n龙椅\t174934\n避税天堂\t174935\n施工单\t174936\n陈忠建\t174937\n首页面\t174938\n1.16.1\t174939\n针扎\t174940\n邯\t174941\n北极星输配电网\t174942\n武汉体育中心\t174943\n1970\t174944\n银棋\t174945\n大领主\t174946\n陈幼坚\t174947\n调账\t174948\n女宝\t174949\n|天正建筑\t174950\n康熙大帝\t174951\n夹胸\t174952\n刘奕宁\t174953\n没头\t174954\n城市发展史\t174955\n数百米\t174956\noff在线翻译\t174957\n桩位\t174958\n生化危机吧\t174959\n瑾瑜\t174960\n同悦\t174961\n日本软银\t174962\n石方\t174963\n三句\t174964\n哈尔滨文化公园\t174965\n非缘勿扰\t174966\n第13天\t174967\ndeveloped\t174968\nrectangular\t174969\nflowplayer\t174970\n评诺\t174971\nLULU\t174972\nArtist\t174973\n开水\t174974\nsynonym\t174975\n哲理性\t174976\n长竿\t174977\n守望先锋邪恶漫画\t174978\n赤舌\t174979\n王源欧阳夏丹\t174980\n麦克斯韦\t174981\nIJN\t174982\n天渊\t174983\nRecycle\t174984\n开裆裤\t174985\n希罗线\t174986\n萧克\t174987\n某一条\t174988\n人民公社化运动\t174989\n骨肉\t174990\n外道\t174991\n李景亮\t174992\n杨金路\t174993\n预防接种\t174994\nnegotiation\t174995\n近卫军\t174996\n天纯蓝\t174997\njib\t174998\nuft8\t174999\n奶皮子\t175000\n影视学\t175001\n建德花园\t175002\n小柯基\t175003\n张弥曼\t175004\n中交第一航务工程局有限公司\t175005\n吃胸\t175006\n小公举\t175007\n李爽\t175008\nC
hrdai\t175009\n副科级\t175010\nDove\t175011\nINTERNAL\t175012\n孔雀东南飞\t175013\n长歌门\t175014\n深不可测\t175015\n阳花\t175016\nEP.2\t175017\nTATTOO\t175018\n号位\t175019\n浆膜\t175020\n南芥\t175021\ncww\t175022\nnjupt\t175023\nV2.01\t175024\nTMCY-025\t175025\n大鹏安东尼\t175026\n检定员\t175027\ndhtmlxgrid\t175028\n自娱自乐\t175029\n盐城新闻网\t175030\nsettings\t175031\n喉管\t175032\n河谷\t175033\n文明礼貌\t175034\naofenglu\t175035\n雨荷\t175036\n腊八豆\t175037\n堕天\t175038\nIBC\t175039\niOS7.1.1\t175040\n5000分\t175041\nmans\t175042\n2017年4月18日\t175043\n木糠\t175044\n发蒙\t175045\n陈信喆\t175046\n御家汇\t175047\nMySQLdb\t175048\n供应商\t175049\n动感地带卡\t175050\n黃歷\t175051\n微拍堂\t175052\n小吏\t175053\nshakira\t175054\n楼堂馆所\t175055\n鼓擂\t175056\n学画\t175057\n宠物酒店\t175058\n木下和津实\t175059\n猎首\t175060\n3下\t175061\n4830\t175062\nxinwen\t175063\n逛商场\t175064\n日报表\t175065\n苏州工业园区体检中心\t175066\n如皋港\t175067\n王家村\t175068\n扳\t175069\n米仓凉子\t175070\nLPT\t175071\n伴随\t175072\n难以抗拒\t175073\n玉饰\t175074\n三次元\t175075\n王建国\t175076\nCrypt\t175077\n涿鹿县\t175078\neternal\t175079\n哀蓝\t175080\n武汉大学信息管理学院\t175081\nz型\t175082\n顾道长生\t175083\n翻生武林\t175084\nOval\t175085\n奥特曼\t175086\n邪风曲\t175087\n寡不敌众\t175088\n安非他酮\t175089\n原声\t175090\n青年宫\t175091\nTB5\t175092\nWpeace\t175093\n孙文波\t175094\n背井离乡\t175095\n沈娴\t175096\n潜孔\t175097\nbaldr\t175098\n吉恩镍业\t175099\n塔台\t175100\nW3School在线测试\t175101\n哈尔滨住房公积金管理中心\t175102\n小徐\t175103\n整栋楼\t175104\nvmware虚拟机\t175105\n海沧区人民政府\t175106\n汤饭\t175107\n摧\t175108\n甲胺磷\t175109\n采购网\t175110\n广东台\t175111\n住院医师\t175112\n复膜机\t175113\n易美逊\t175114\n神将世界\t175115\n张口\t175116\n实验班\t175117\n立陶宛\t175118\n血荐轩辕\t175119\nc15\t175120\n玛莎妮娜\t175121\n无冕之王\t175122\n雁门太守行\t175123\n恒胜\t175124\n鲁奇\t175125\n淘气包马小跳\t175126\n阿凡提的故事\t175127\n大刘\t175128\n朗仁\t175129\n9888\t175130\n上海耐克\t175131\n海思麒麟960\t175132\n中山南区\t175133\n龙湖天街\t175134\ns500\t175135\nkWh\t175136\n粉刷匠\t175137\n外国女孩\t175138\n穆斯塔\t175139\n痞子英雄2\t175140\nJour\t175141\n中队\t175142\n毁灭术\t175143\n谋道\t175144\ny470吧\t175145\n环保员\t175146\n艾尚真\t175147\n非香\t175148\n训\t175149\n家政服务\t175150\n福分\t175151\n001A\t17515
2\n維基\t175153\njimshi\t175154\n定型枕\t175155\n长春人才网\t175156\n东山社区\t175157\n小项\t175158\nHegre\t175159\nxnormal\t175160\n智能助手\t175161\n纤尘\t175162\nGreenPlum\t175163\nTruth\t175164\ngccp\t175165\n524\t175166\n北京林业大学经济管理学院\t175167\n公估\t175168\n减字谱\t175169\n青团子\t175170\n超载限制器\t175171\n斐波\t175172\n第十四章\t175173\nEmulator\t175174\n20180315\t175175\n烧烤炉\t175176\n230\t175177\n巨剑\t175178\ncangku\t175179\n22步\t175180\n420亿\t175181\n厦门研科生物技术有限公司\t175182\n五月五\t175183\n搜狐影音\t175184\n逍遥派\t175185\n生殖毒性\t175186\n色列\t175187\n锄草\t175188\nJG\t175189\n3颗\t175190\n亚丁风景区\t175191\n卓集\t175192\n7380\t175193\n┫石器时代\t175194\n极恶\t175195\n150吨\t175196\n血儿\t175197\n借势\t175198\n秋刀\t175199\n加合\t175200\n全奖\t175201\n三元一次方程\t175202\n纯氧\t175203\n身败名裂\t175204\n中电电气\t175205\n天鸽互动\t175206\nstrrchr\t175207\n十星级\t175208\n石点\t175209\n新华文轩\t175210\n2分\t175211\n中国制药网\t175212\n片场\t175213\nchuang\t175214\n相谈\t175215\n女教師\t175216\n科一\t175217\n二十四式\t175218\n新宝\t175219\n87路\t175220\n唐史\t175221\n枫树叶\t175222\n宠物篇\t175223\n531\t175224\n硬泡\t175225\n幼儿歌\t175226\n佳能650D\t175227\n丹桂\t175228\n蒋军晶\t175229\n2562\t175230\n大少\t175231\n钢叉\t175232\nchan\t175233\n宏观经\t175234\n落日\t175235\n自考专业\t175236\nRGP\t175237\n非线性规划\t175238\n7k7k特战\t175239\nInheritance\t175240\n阳光绝世唐门\t175241\n绘本\t175242\n滨海城\t175243\n王者荣耀助手\t175244\npepsi\t175245\n云高防服务器\t175246\n刷单\t175247\n农业工程学报\t175248\n八颗\t175249\nīp\t175250\n市场调查\t175251\n陈尧咨\t175252\n足底筋膜炎\t175253\n心仪\t175254\n沙拉店\t175255\n连装\t175256\n顶渲\t175257\n哈里斯堡\t175258\n宁波东钱湖\t175259\n0.2kg\t175260\nCommonJS\t175261\n第二十七回\t175262\n胎梦\t175263\n保护券\t175264\n三网合一\t175265\n星海湾\t175266\nRivers\t175267\n荧光分析法\t175268\n2017年7月14日\t175269\nfees\t175270\n几乎\t175271\n贫嘴张大民的幸福生活\t175272\n尝谕\t175273\n联机宝\t175274\n搜狐IT\t175275\n红山动物园\t175276\n鲜少\t175277\n错综复杂\t175278\n壹号收藏网\t175279\nAdidas\t175280\n第11部\t175281\n武汉小学\t175282\nWarnings\t175283\n直接性\t175284\n花娘\t175285\n超清1080P\t175286\n悲喜剧\t175287\n北京自考网\t175288\n地藏\t175289\n华中科技大学电气与电子工程学院\t175290\n东成\t175291\n同沙生态公园\t175292\n卸除\t175293\nqq群机器人\t175294\n天竺山\t
175295\n聲音\t175296\n鬼村\t175297\n燃舞\t175298\n真三8\t175299\n93分钟\t175300\n创思\t175301\n花花期\t175302\nQ猪\t175303\n18起\t175304\nTRT\t175305\n深圳工行\t175306\n双壁波纹管\t175307\n画效\t175308\n赵镇\t175309\n简明扼要\t175310\n哪一天\t175311\n雪花神剑\t175312\n丰子恺\t175313\n都芳\t175314\n湖北能源\t175315\n卡哥\t175316\n信奥\t175317\n毕业会考\t175318\n大连民族学院\t175319\n经验版\t175320\n苏楠\t175321\n子涵\t175322\n51盖房网\t175323\n人教版小学英语五年级下册\t175324\n单县一中\t175325\n密版\t175326\n保妇康栓\t175327\nsplice\t175328\n洛阳国家牡丹园\t175329\n宾得k3\t175330\n乐町\t175331\n白头海雕\t175332\n奇塔\t175333\nLinuxEA\t175334\n家彩网\t175335\nSacrifice\t175336\n秀尔\t175337\nesight\t175338\n浓诚\t175339\n力胜\t175340\n拉力赛\t175341\n13周\t175342\n题字\t175343\n老无所依\t175344\n徐立平\t175345\n2018年1月2日\t175346\n跑道\t175347\n鱼宴\t175348\n捭阖\t175349\n新浪微博\t175350\n封神传说\t175351\n菲悦\t175352\n葡萄树\t175353\n5千块\t175354\n巧辨\t175355\n乒超联赛\t175356\n清雅苑\t175357\n女配\t175358\n铝质\t175359\n烧饭\t175360\n龙池山自行车公园\t175361\n辉月杏梨\t175362\n领地\t175363\n神盾局特工\t175364\n更严重\t175365\n地震区\t175366\n五粮液集团\t175367\n月迹\t175368\n溪溪\t175369\n11阶\t175370\nM3\t175371\n时间词\t175372\n死亡之塔\t175373\n端木蕻良\t175374\n车载式\t175375\n2.doc\t175376\n成功之道\t175377\n思皓\t175378\nlish\t175379\n南瓜饼\t175380\n平昌冬奥会短道速滑\t175381\n201704\t175382\n献艺\t175383\n华奥星空\t175384\n5410\t175385\n基礎\t175386\n文峰城市广场\t175387\n20151207\t175388\n跳杯\t175389\n郑观\t175390\n8千多\t175391\n金泰宇\t175392\n靖港古镇\t175393\nCapitol\t175394\nBD1024p/1080p/Mp4\t175395\nunmount\t175396\n筋流\t175397\n蛤蚧\t175398\njjg6256308-ChinaUnix\t175399\n好甜蜜\t175400\n殡仪车\t175401\n9批次\t175402\nIllu\t175403\n世界无烟日\t175404\n学周刊\t175405\n信捷\t175406\nora\t175407\n民生金融租赁股份有限公司\t175408\n隆德县\t175409\n锔瓷\t175410\n大话滨海_滨海论坛_滨海网\t175411\nFLAC\t175412\n坚定者\t175413\n新特\t175414\n介电常数\t175415\n数据接口\t175416\n0755-27505283\t175417\n雅阁锐\t175418\neiei\t175419\n幸福镇\t175420\n扫盲\t175421\n小月子\t175422\n南昌大学一附院\t175423\n叠螺\t175424\n史志办\t175425\n商环\t175426\n尾行4\t175427\n新华苑\t175428\n118集\t175429\n星巴克\t175430\ndro\t175431\n猎龙者\t175432\n中国人民保险公司\t175433\n收订\t175434\n联想小新锐\t175435\n水陆两栖\t175436\nD100\t175437\n补价\t175438
\n灵析\t175439\n售后服务站\t175440\nasw\t175441\n坠胀\t175442\nwindows-x64\t175443\n水果汁\t175444\n谷维素\t175445\n并励\t175446\n搞笑片\t175447\ndena\t175448\n流出版\t175449\nkal\t175450\n/tslweb/cwfx/jzpg/sz000418.mht\t175451\n乳神\t175452\n郑州绿城通\t175453\n北卡罗来纳大学教堂山分校\t175454\n下效\t175455\n四季花园\t175456\n放光\t175457\n洛阳纸\t175458\n美善\t175459\n女干\t175460\n2018年元宵节\t175461\n船务\t175462\n创新体制机制\t175463\n养乐\t175464\n暴风城\t175465\n代收款\t175466\n铁汉生态\t175467\n骨筋膜室综合征\t175468\nWhirlpool\t175469\n上海道\t175470\n按作\t175471\nUUU\t175472\n第42条\t175473\n深圳政府\t175474\n振华路\t175475\n透气帽\t175476\n人儿\t175477\n平面直角坐标\t175478\n丹书\t175479\n普斯\t175480\n补租\t175481\n亚硫酸氢钠\t175482\n蝴蝶谷娱乐网\t175483\n坚固性\t175484\n联想手提\t175485\n高梓淇\t175486\n幻魂\t175487\nnx500\t175488\n固定汇率制\t175489\n还原度\t175490\n熏肉大饼\t175491\nCoins\t175492\n手机易车网\t175493\n条状\t175494\n五七工\t175495\n青青草原\t175496\nTopology\t175497\n单机端\t175498\nKeeley\t175499\n宋玉林\t175500\ngetevent\t175501\nBAM\t175502\n引申义\t175503\n弄垮\t175504\npayday2\t175505\nConverting\t175506\n冰峰\t175507\n江西省人民政府办公厅\t175508\n连连看小游戏大全\t175509\n60v\t175510\n博爱家园\t175511\n泗洪\t175512\n头雕\t175513\n法王如意宝\t175514\n住宅梦物语\t175515\n新北站\t175516\nNo.9\t175517\n被窝\t175518\nCuriosity\t175519\n刘能\t175520\n华擎科技\t175521\ninsar\t175522\n复旦大学附属华东医院\t175523\nShit\t175524\n相公山\t175525\n992\t175526\n与共\t175527\n新市府\t175528\n宋小菜\t175529\n样车\t175530\n变故\t175531\nlcf\t175532\n撕碎\t175533\n鹏程万里\t175534\n第三关\t175535\n痴汉\t175536\n更正\t175537\nQQ钻\t175538\n0.1元\t175539\nGGII\t175540\ngtx460\t175541\n翻耕\t175542\n包围盒\t175543\n喜爱\t175544\n用户线\t175545\n包揽\t175546\n1843年\t175547\n车尾箱\t175548\n仿真机器人\t175549\nir2520\t175550\n字典网\t175551\n致远金融\t175552\n网络人\t175553\n23cm\t175554\nGTX980Ti\t175555\nSquared\t175556\n2018年1月份\t175557\n拉德斯基进行曲\t175558\n52位\t175559\n8台\t175560\n贵州中烟工业有限责任公司\t175561\nsuv\t175562\n索尼蔡司\t175563\n西园\t175564\n马儿\t175565\n储备库\t175566\nbsr\t175567\nFURNITURE\t175568\nyouxin\t175569\nae币\t175570\n以身相许\t175571\n湖南省人民政府\t175572\n64张\t175573\n宝塔Linux\t175574\n孟文豪\t175575\n甲贺忍法帖\t175576\n长江南路\t175577\nkokoro\t17557
8\n雪花梨\t175579\n长清湖\t175580\nsD卡\t175581\n软套\t175582\n霍林郭勒市\t175583\n叶祖新\t175584\n企业管理概论\t175585\n12333社保查询网\t175586\n诡神冢\t175587\n澳门风云\t175588\n赐予\t175589\n守序\t175590\n小白花\t175591\n肉兔\t175592\njury\t175593\n一念天堂\t175594\n浪头\t175595\n海马玩论坛\t175596\nFSP\t175597\n藤本月季\t175598\n剖腹\t175599\n龙蟠中路\t175600\n春娃娃\t175601\nh310m\t175602\nikuai\t175603\n地龙\t175604\n长城国瑞证券\t175605\n10样\t175606\n多边贸易\t175607\n6.4寸\t175608\nAPIC\t175609\nWap\t175610\n狂插\t175611\n3.15消费者权益日\t175612\n石米\t175613\n中国科协\t175614\n熊出没之夏日连连看\t175615\n毛笔字\t175616\n阴凉库\t175617\n650个\t175618\n高尔泰\t175619\n山东队\t175620\n天龙光电\t175621\n虚开增值税专用发票罪\t175622\n_种\t175623\nipz-127\t175624\npandorabox\t175625\n大阪海\t175626\nPWR\t175627\nJSPatch\t175628\n雷雳\t175629\n升至\t175630\n三严三实\t175631\n期中考试时间\t175632\n常理\t175633\ncad格式\t175634\nDiptyque\t175635\n藏龙岛\t175636\n执政官\t175637\nq235\t175638\n坡币\t175639\nPaypal\t175640\n导航页\t175641\n绝园\t175642\n退学后\t175643\nSong\t175644\n原态板\t175645\n飞羽\t175646\n佳酿\t175647\n6616\t175648\n仙林校区\t175649\n颈膜\t175650\n胡一刀\t175651\n耀世\t175652\n第157章\t175653\n丽水市人力资源和社会保障局\t175654\n大错\t175655\n甲醛检测仪\t175656\n自学考试网\t175657\n八夫临门\t175658\n讲演\t175659\n富士相机\t175660\n嵌岩桩\t175661\nJARCH\t175662\n蒲团之极乐宝鉴\t175663\n夹江县人民政府\t175664\n位卑\t175665\n锦绣河山\t175666\n蛋包饭\t175667\n答辩会\t175668\nDFP\t175669\n退魔\t175670\nairchina\t175671\n66kv\t175672\n六道仙人\t175673\n力压\t175674\n华晨集团\t175675\nParade\t175676\n8道\t175677\n中华人民共和国公共图书馆法\t175678\n如风\t175679\n岚奶茶\t175680\nstrtoul\t175681\n引种\t175682\n乱写\t175683\n九游网\t175684\n抗磨液压油\t175685\n仿真分析\t175686\n很赞\t175687\n桌脚\t175688\n福克斯st\t175689\n2017-0\t175690\n鸭溪镇\t175691\n泰隆银行\t175692\n蜡基\t175693\n全球加盟网\t175694\n菲尼\t175695\n台江县\t175696\nsurvival\t175697\nmoreover\t175698\n宝山钢铁股份有限公司\t175699\n木下亚由美\t175700\nElectronic\t175701\n四册\t175702\nactively\t175703\n慢性糜烂性胃炎\t175704\nMesse\t175705\nKane\t175706\n宝力豪\t175707\n冀南新区\t175708\nexpoon\t175709\n型器\t175710\n在业\t175711\n大巫师\t175712\n七咲枫花\t175713\n李时珍\t175714\n110年\t175715\n箭士柳白猿\t175716\n大义凛然\t175717\n支持部\t175718\n苯磺酸\t175719\nqq
公众号\t175720\n19度\t175721\n准妈妈\t175722\n伏立康唑片\t175723\n第三者责任保险\t175724\n勤务\t175725\n黄莉\t175726\n临汾路街道\t175727\n盐酸赛庚啶片\t175728\n华峰超纤\t175729\n埋葬\t175730\n无止\t175731\n桔树\t175732\n不淫\t175733\n刘诗雯\t175734\n溜娃\t175735\n过且过\t175736\nLIGHTROOM\t175737\n70篇\t175738\nTeenage\t175739\n煤矿\t175740\n奥古斯都\t175741\n接生婆\t175742\n马可·波罗\t175743\n圣斗士星矢欧米伽\t175744\n_合肥网\t175745\n铜条\t175746\n轉\t175747\n四合茗苑\t175748\n醋酸纤维素\t175749\n居无定所\t175750\n踵\t175751\n五菱宏光S1\t175752\n窖藏\t175753\n背兽\t175754\n25公里\t175755\n厦\t175756\n粤桂\t175757\n6800K\t175758\n换气机\t175759\n柚子木字幕组\t175760\n花胶\t175761\n北京农业嘉年华\t175762\n材料科学基础\t175763\n景观设计专业\t175764\n第8关\t175765\n大冷股份\t175766\n浪潮集团有限公司\t175767\n优乐美\t175768\nC99\t175769\n5万美元\t175770\n星花\t175771\n浪荡子\t175772\n信中利\t175773\nINTJ\t175774\n妖妃\t175775\n天后宫\t175776\n福港\t175777\nCartridges\t175778\n47天\t175779\n杀破狼广播剧\t175780\ndelay\t175781\n3月1\t175782\n阿灿\t175783\n猛力\t175784\n蹬腿\t175785\n9505\t175786\n1403\t175787\n1班\t175788\n童养媳\t175789\n股骨颈骨折\t175790\n卡兵\t175791\n唐琪儿\t175792\n报名单\t175793\nnubile\t175794\nOficial\t175795\n莲花桥\t175796\n青马班\t175797\n借书证\t175798\nDSA\t175799\n2注\t175800\n小米6#\t175801\n皮壳\t175802\n炸药包\t175803\n陶城\t175804\n魅夜\t175805\n21%\t175806\n一年后\t175807\n周庄村\t175808\n登记单\t175809\n博彦科技\t175810\n铵\t175811\n遮瑕盘\t175812\n束氏\t175813\n产业结构调整指导目录\t175814\n爱唯欧\t175815\ninfuse\t175816\n江西旅游网\t175817\nCSS类\t175818\n示范片\t175819\n安山\t175820\n阳阳\t175821\n电磁离合器\t175822\n陶粒\t175823\n加多\t175824\n浙江工商大学研究生院\t175825\n励精图治\t175826\nDerulo\t175827\n十招\t175828\n圣夜\t175829\n齐风\t175830\n80平\t175831\nAK12\t175832\n比基尼照\t175833\n中国科学院地质与地球物理研究所\t175834\n正反面\t175835\n多目标\t175836\n仙姑\t175837\n今年1月\t175838\n独显_\t175839\n洪阳\t175840\nGigabit\t175841\nChou\t175842\nLamy\t175843\nInformix\t175844\n广东省交通运输厅\t175845\n油压缓冲器\t175846\nunited\t175847\n富力中心\t175848\n2018年4月20日\t175849\n塑料制品\t175850\n左甲状腺素钠片\t175851\n泛函\t175852\nfutanari\t175853\nReflections\t175854\n桥本氏甲状腺炎\t175855\nwswang\t175856\n问一下\t175857\n自治州\t175858\n欧亚大陆\t175859\n再审申请书\t175860\n新策\t175861\n精灵宝可梦究极日月\t175
862\n虫眼\t175863\n0666\t175864\ndatadir\t175865\n长蛇\t175866\n联合会\t175867\n非专利\t175868\n开\t175869\n拜特\t175870\n大时代\t175871\n八门\t175872\n风灵\t175873\n中国女足\t175874\n岭南路\t175875\nFerragamo\t175876\n妆点\t175877\n大淘营\t175878\n郭大为\t175879\n品牌知名度\t175880\n奔驰glc260\t175881\n索道\t175882\n郑州市市\t175883\n黄继光\t175884\n入室\t175885\nmerits\t175886\n六石\t175887\n尤莉安\t175888\n厦门小鱼\t175889\n胶原\t175890\n莫杰\t175891\napplewatch\t175892\n35xs\t175893\n与生\t175894\n过一会\t175895\n肏逼\t175896\n244个\t175897\nhair\t175898\n双塔街道\t175899\n6.1.7\t175900\n元熙\t175901\nrdl\t175902\nsteward\t175903\n空欢喜\t175904\ntrapcode\t175905\nhacked\t175906\nvolumn\t175907\n弹体\t175908\n康桥老街\t175909\n苗人凤\t175910\n盘存\t175911\n广州市皮肤病防治所\t175912\n欧巴桑\t175913\nAnkle\t175914\n推片\t175915\nCS231n\t175916\n对象值\t175917\n电位箱\t175918\n侠盗猎车手4:自由城之章\t175919\n环境设计专业\t175920\n新笔趣阁\t175921\nCasual\t175922\nz11minis\t175923\n鹰狮\t175924\nHung\t175925\n李海东\t175926\nbigdata\t175927\nrespond\t175928\n横断\t175929\n双拉\t175930\n清远传媒网\t175931\n5.3.0\t175932\n航机\t175933\nLed\t175934\n创意性\t175935\n公寓馆\t175936\n王新平\t175937\n0.08MB\t175938\n先行\t175939\n40首\t175940\n巨塔\t175941\nIQ博士\t175942\n中心线\t175943\n持续期\t175944\nc9\t175945\n战略支援部队\t175946\n体验性\t175947\n贺岁\t175948\n毫米波\t175949\n看不见的爱\t175950\n粟特\t175951\n12a\t175952\n1874\t175953\n卫生管\t175954\n安雷\t175955\nVolunteers\t175956\nNyato\t175957\n第16期\t175958\n边位\t175959\n外导\t175960\n久事\t175961\nGaotie\t175962\nCSRF攻击\t175963\n近三个月\t175964\nWarez\t175965\n母亲的歌\t175966\noffi\t175967\n500大卡\t175968\n一|\t175969\n英飞凌\t175970\n阳光宝贝\t175971\n接触\t175972\n解系\t175973\n陈玲\t175974\nZWILLING\t175975\n1500个\t175976\nElement\t175977\n托单\t175978\n贴切\t175979\n箱中\t175980\n平桂管理区\t175981\n王逗逗\t175982\n渡江\t175983\n笑对人生\t175984\n王素\t175985\n主客场\t175986\n南下\t175987\n临西\t175988\n死亡之路\t175989\n胃疼\t175990\n超级机器人大战A\t175991\n2018年4月1日\t175992\n3d66\t175993\n泸州在线\t175994\ns1\t175995\n复仇者们\t175996\n贱笑\t175997\ncrypto\t175998\n2037\t175999\nBrocade\t176000\n江苏省交通厅\t176001\nfsm\t176002\n富硒食品\t176003\n金鹰节\t176004\n果位\t176005\n张双喜\t17600
6\n同乡会\t176007\n支具\t176008\n土豆炖牛肉\t176009\n工业北路\t176010\n新方向\t176011\n美若黎明\t176012\n陶奕希\t176013\n张文慈\t176014\n督军\t176015\nPEACEBIRD\t176016\n电剑\t176017\nWish\t176018\n龙兽\t176019\nv24\t176020\n徇私\t176021\nbreakdown\t176022\n丽居\t176023\nhoudini\t176024\nseven7\t176025\n新二村\t176026\n3030\t176027\ntop15\t176028\n一个1000\t176029\n仼\t176030\ndnf元\t176031\n呆料\t176032\n枝叶\t176033\n杨八姐\t176034\n自然传奇\t176035\n兵贵神速\t176036\n古佛\t176037\n精短\t176038\nstm32cubemx\t176039\n神行太保\t176040\nJavaFX\t176041\n名将\t176042\n宝贝王\t176043\n为非\t176044\n水苏糖\t176045\n横幅\t176046\n光学检测仪\t176047\n武穴\t176048\n补假\t176049\n南瓜车\t176050\n西峡\t176051\n感_\t176052\n喻人\t176053\n分段函数\t176054\n贵妇人\t176055\n搅拌器\t176056\n1.031\t176057\n5届\t176058\n坊子\t176059\nmisery\t176060\nNei\t176061\n黑白机\t176062\n沾沾自喜\t176063\nvnp\t176064\n伏尔加\t176065\nsolidwors\t176066\n政府性\t176067\n百度云网盘/BT\t176068\n解放路小学\t176069\nUSB无线网卡\t176070\n犀照\t176071\n乔欣\t176072\n慕\t176073\n寻宝者\t176074\n超级聊天术\t176075\n京城大厦\t176076\n避忌\t176077\n第176集\t176078\n0._\t176079\n划定\t176080\n保驾\t176081\n认识分数\t176082\nlopencv\t176083\n傲骨\t176084\n天南\t176085\n高锋\t176086\n掌管\t176087\n盗墓类\t176088\n辽宁联通\t176089\n中科院力学所\t176090\nFlip\t176091\ngtp\t176092\n中国科学院生物物理研究所\t176093\nbazaar\t176094\n20180329\t176095\nDefy\t176096\n型梁\t176097\n姓名板\t176098\nSHM\t176099\nFoster\t176100\new\t176101\n摇摆\t176102\ne类\t176103\n1500万美元\t176104\n杂物\t176105\n人伦\t176106\n周深\t176107\nABBS\t176108\nbillpeng\t176109\n荷兰东印度公司\t176110\n高版\t176111\n足金\t176112\n兴宾区政府\t176113\nrundown\t176114\nVictoria\t176115\n浩盛\t176116\n浅田结梨\t176117\n挨到\t176118\ndb\t176119\n2.39\t176120\n精学\t176121\nCheers\t176122\n君宝\t176123\n箭矢\t176124\n490元\t176125\n七十七\t176126\n东方可儿\t176127\n96集\t176128\nminds\t176129\namortization\t176130\n寄秋\t176131\n毛穴\t176132\n题海\t176133\n仲魔\t176134\n山石网科\t176135\n000\t176136\n柑橘\t176137\n河北省民政厅\t176138\n百列\t176139\n囚车\t176140\nvx\t176141\n褒禅山\t176142\n胃蛋白酶原\t176143\n蛋仔\t176144\ne京网\t176145\n烟宝\t176146\n玩一玩\t176147\n有名无实\t176148\n名画\t176149\n闸口\t176150\n反恐特战队之猎影\t176151\n杜冥鸦\t17615
2\n不锈钢潜水泵\t176153\n胧村\t176154\n阿哥阿妹\t176155\n生化危机启示录2\t176156\n加隆\t176157\nwincap\t176158\n街房\t176159\n云辉\t176160\n连城诀\t176161\nRxBus\t176162\n图牌\t176163\n幼儿园工作规程\t176164\n湘莲\t176165\n花园桥\t176166\n1500吨\t176167\n直男\t176168\ng_7afcb6b3-4df5\t176169\nanthropology\t176170\n手术学\t176171\n噎住\t176172\nAndroidSDK\t176173\n金财互联\t176174\n劳务派遣\t176175\n海尔热水器\t176176\n●\t176177\n稽留流产\t176178\n0993\t176179\n嘎嘎山湖\t176180\n多路\t176181\n私服\t176182\n立板\t176183\n用心\t176184\n广安门内大街\t176185\nerror_log\t176186\n范县\t176187\n迈锐天骄战纪\t176188\n鑫龙\t176189\nCarpe\t176190\n释魂\t176191\n飞舞\t176192\n天罚\t176193\n陀枪师姐2\t176194\n北京人在纽约\t176195\n瑞舒伐他汀钙片\t176196\n世纪\t176197\n日本料理铁板烧\t176198\n华为荣耀6Plus\t176199\n镀锌件\t176200\ndbg\t176201\n金庸无双\t176202\nch0602120131@126\t176203\nmdi\t176204\n封校\t176205\n冲气娃娃\t176206\nfully\t176207\n花儿\t176208\n坛子\t176209\n大浪\t176210\n手碟\t176211\n一个16位\t176212\n2500g\t176213\n合村\t176214\n丙肝\t176215\n再世篇\t176216\n封神英雄榜2\t176217\n托班\t176218\n高柜\t176219\nap考试\t176220\n体液\t176221\n刨腹\t176222\n混标\t176223\n2364\t176224\n奔驰s350\t176225\n灭菌器\t176226\n格瑞德\t176227\n第30条\t176228\n黑打\t176229\nbottles\t176230\n徐克\t176231\nMINI论坛_汽车之家论坛\t176232\n侧石\t176233\npe板\t176234\n深圳电子\t176235\n海鸭蛋\t176236\n通会\t176237\n胡博\t176238\n卷发棒\t176239\n热天\t176240\nEVPlayer\t176241\n谷底\t176242\n地资\t176243\n一整夜\t176244\n抽筋\t176245\n种太阳\t176246\nJackieLiu\t176247\n语音识别\t176248\n应声\t176249\n紫包菜\t176250\n英雄版\t176251\nSidebar\t176252\n十九大贵州省代表团\t176253\n紫宝石\t176254\ninquire\t176255\n梅林\t176256\n头球\t176257\n刑侦支队\t176258\n建筑学院\t176259\n肺大泡\t176260\nXFPLAY\t176261\n3850\t176262\n彬\t176263\nMITE\t176264\n50乘\t176265\n海网\t176266\n泰兴市人民法院\t176267\n鬼泣hd\t176268\nEOS币\t176269\n公帐户\t176270\n发动机\t176271\n单文\t176272\n黄四郎\t176273\n高桥吾郎\t176274\ndra\t176275\n杨国庆\t176276\npip2\t176277\n羞答答\t176278\n谢伟山\t176279\n哈尔滨装修公司\t176280\n经七路\t176281\n安徒飞剑\t176282\n省政法委\t176283\n童眼\t176284\n暗黑童话\t176285\n组主\t176286\n农业经济\t176287\n少安\t176288\n安拉胡阿克巴\t176289\n五开\t176290\n220平米\t176291\nzhiyou\t176292\n2018年4月25\t176293\n钜大锂电\t176294\n法语系\t1
76295\nsolidworks2008\t176296\n迟迟\t176297\n汗手\t176298\n初装\t176299\n出水阀\t176300\n苏州大学附属儿童医院\t176301\n副州长\t176302\n全聚德\t176303\n下课文\t176304\n碰碰香\t176305\n调距\t176306\nYOYO\t176307\n河南省气象局\t176308\n评弹\t176309\n师生们\t176310\nhomeland\t176311\n一梦\t176312\n企业群\t176313\narchitecture\t176314\n阴道紧缩术\t176315\n601919\t176316\n双塔寺\t176317\n康冠\t176318\n吧_沃保保险网\t176319\n转诊\t176320\n奥克兰大学\t176321\narrangement\t176322\n陆丰市\t176323\n国金\t176324\n曹建军\t176325\n粉印\t176326\n12月2日\t176327\nCities\t176328\n体能测试\t176329\n海口美兰机场\t176330\n连贯性\t176331\n冤案\t176332\n吉隆县\t176333\n递归树\t176334\n美亚光电\t176335\n免费周易八字算命网\t176336\n改编\t176337\n横山镇\t176338\n百盒\t176339\norderBy\t176340\n加贝\t176341\nRoaming\t176342\n好久\t176343\n继而\t176344\n矾\t176345\n汉川市\t176346\n高中篇\t176347\nrlm\t176348\n手功\t176349\n金融工程专业\t176350\n2600K\t176351\n稀硝酸\t176352\n2016-12\t176353\n91Porn\t176354\n冲田总司\t176355\n蒋益民\t176356\n纶\t176357\n瞰\t176358\nHealthcare\t176359\n现价\t176360\n遗传题\t176361\n13.5\t176362\n生命体征\t176363\n记账式\t176364\n软妹币\t176365\n款子\t176366\n445\t176367\n黑剑\t176368\n包皮医院\t176369\n米粉\t176370\n潘知常\t176371\numx4\t176372\n贵州省交通运输厅\t176373\nKOKIA\t176374\n侃爷\t176375\n一建建筑\t176376\n叶月奈穗\t176377\n吴谨言\t176378\n探测\t176379\n肉业\t176380\n加气机\t176381\n画骨\t176382\ncomm\t176383\netre\t176384\n4200元\t176385\n学名\t176386\n互文性\t176387\nAvoid\t176388\n里弄\t176389\n融安县\t176390\n巷子\t176391\n美心西饼\t176392\n滚铁环\t176393\n秋\t176394\n银杏谷\t176395\nracial\t176396\n2.xml\t176397\n桃花朵朵开\t176398\n腋下\t176399\n南瑞集团公司\t176400\n广州街\t176401\n思学\t176402\n嘉里中心\t176403\n巴黎水\t176404\nlibeay32.dll\t176405\n内蒙古科技大学\t176406\n娃哈哈天眼晶睛\t176407\n白朴\t176408\ncra\t176409\n武汉国际会展中心\t176410\n义气\t176411\n门把\t176412\n092\t176413\n中洋\t176414\n指点迷\t176415\n1间\t176416\n曹立\t176417\n花桶\t176418\n馕\t176419\n中钞\t176420\n手性分子\t176421\nreliable\t176422\n正好\t176423\n39371797\t176424\n山东大学青岛校区\t176425\n图吧\t176426\n物体\t176427\n湘潭市中心医院\t176428\nTL-WR841N\t176429\n安全色\t176430\nFaces\t176431\n海腾\t176432\n学制\t176433\n滨湖湾\t176434\n跑不掉\t176435\n印城\t176436\nListbox\t176437\n京基百纳\t176438\n蛋白片
\t176439\n大金湖\t176440\n甘肃政法学院\t176441\n10.16\t176442\n卢正龙\t176443\n区住建局\t176444\n贵妃椅\t176445\n凝血分析仪\t176446\n1000篇\t176447\nnetsuite\t176448\n王师傅\t176449\n琍\t176450\nMartinez\t176451\n香樟园\t176452\nSirens\t176453\n助动\t176454\n窗台\t176455\n水喷淋\t176456\n造访\t176457\n最小公因数\t176458\n25家\t176459\ndura\t176460\n词学\t176461\n露台上\t176462\nx-2y\t176463\n稀少\t176464\n毛地\t176465\nStrauss\t176466\n十三层\t176467\n东坝地区\t176468\nTTS\t176469\n51号\t176470\n毛公山\t176471\n大庆油田有限责任公司\t176472\nRaid\t176473\n德邦物流股份有限公司\t176474\n25H\t176475\nSPH\t176476\n强壮\t176477\n吉利博瑞GE\t176478\n蒜皮\t176479\n张翠山\t176480\n扬州炒饭\t176481\n全能\t176482\n智会\t176483\n侦察报告\t176484\n量子加速器\t176485\n雅思6\t176486\n希島\t176487\n天津人大\t176488\n大锅菜\t176489\n拖式\t176490\n沈珍珠\t176491\n卓文萱\t176492\n大芬油画村\t176493\nELECOM\t176494\nrelationship\t176495\n孙辈\t176496\n28_\t176497\n任航\t176498\n青春路\t176499\n2018年2月13日\t176500\n07073三国\t176501\n三生石\t176502\n籼稻\t176503\n广州华商职业学院\t176504\n360公司\t176505\n左图\t176506\n效验\t176507\n要证\t176508\n第十一课\t176509\n夕日红\t176510\n吸量\t176511\n莊\t176512\n南方网通\t176513\neb\t176514\n妖怪都市\t176515\nCarlo\t176516\n一个卓\t176517\n广东文艺职业学院\t176518\n没有\t176519\n生音\t176520\n上海税友软件有限公司\t176521\n福州软件职业技术学院\t176522\nqq阅读器\t176523\n中标通知书\t176524\n魏宁海\t176525\n王国庆\t176526\n市纪委监委\t176527\n大钱\t176528\n瑞丽航空\t176529\n中国银行信用卡\t176530\n燃气调压柜\t176531\n陂\t176532\n嫡传\t176533\nRhoades\t176534\n圣原\t176535\n拿破仑传\t176536\n环保科技公司\t176537\n盐雾试验箱\t176538\n洪晓芸\t176539\n布纹\t176540\nArts\t176541\n三分天\t176542\n我文\t176543\n新华大道\t176544\n出入库\t176545\noutline\t176546\n体积与容积\t176547\n准确性\t176548\n骆冰\t176549\n1倍\t176550\n太平洋汽车网\t176551\njackeylove\t176552\n会籍\t176553\n什么装\t176554\n7293\t176555\n得分率\t176556\nADC0809\t176557\n克洛克达尔\t176558\nfst\t176559\nassociated\t176560\n计量表\t176561\n公权力\t176562\n密码器\t176563\n抵扣进项税\t176564\n统计师考试\t176565\n退下来\t176566\n1K\t176567\n好带\t176568\n五六百\t176569\ndichan\t176570\n英姿飒爽\t176571\n密封管\t176572\n不知心\t176573\n水师\t176574\n卢浦大桥\t176575\n娇颜\t176576\n深圳公交查询网\t176577\n腰椎病\t176578\n6月11日\t176579\n亏欠\t176580\n微子\t176581\n戏曲片\t
176582\nBalea\t176583\n西藏\t176584\n国瑞置业\t176585\n解锁屏\t176586\nexperience\t176587\n桃花江\t176588\n専\t176589\n百朗英语\t176590\nhills\t176591\n墓场\t176592\n声浪\t176593\n鲁伊\t176594\n药品经营许可证\t176595\n亮眼\t176596\nf1-f12\t176597\nModelSim\t176598\n电子学报\t176599\n绝地求生端游\t176600\n隋唐英雄3\t176601\n執\t176602\n滋肾育胎丸\t176603\nArcteryx\t176604\n瘦子\t176605\nUSB口\t176606\n4TB\t176607\nConnector\t176608\n油画展\t176609\n威海火炬高技术产业开发区\t176610\n超级娱乐\t176611\nweblogic12\t176612\n尼格买提\t176613\n吴畏\t176614\ncdfs\t176615\nZappos\t176616\n4213\t176617\n共阳数码管\t176618\n承平\t176619\n截成\t176620\nufl\t176621\n你的眼\t176622\n气室\t176623\n694\t176624\nナカ\t176625\nEcology\t176626\n苏海\t176627\n蓝琪儿\t176628\n右旋\t176629\nnylon\t176630\n1454\t176631\ndepends\t176632\n044期\t176633\n销户\t176634\nSyracuse\t176635\nA.0\t176636\n24款\t176637\n大江东网\t176638\niMAC\t176639\n缪斯\t176640\n弩弓\t176641\n中间值\t176642\n足球小将\t176643\n流离失所\t176644\n_无冬之夜\t176645\ncint\t176646\n陆兵\t176647\n龙龟\t176648\n_飞\t176649\n张家口市人民政府\t176650\n无锡城市职业技术学院\t176651\n黑弦\t176652\n古寨\t176653\n揽才\t176654\n浙江大学继续教育学院\t176655\n贷款利率装修|一起网\t176656\n守纪\t176657\n脑溢血\t176658\n椭\t176659\n8040\t176660\n感謝\t176661\n60kw\t176662\n8600gt\t176663\nHashmap\t176664\n500000元\t176665\ngihub\t176666\nPICTURES\t176667\n行为型\t176668\n购书\t176669\nDoctoral\t176670\nairoot\t176671\n集士港\t176672\n宗派主义\t176673\n江安县\t176674\nould\t176675\n港行\t176676\n720P版BD-RMVB\t176677\niMX6\t176678\n2048n\t176679\ngrf\t176680\nJava软件工程师之路\t176681\n友谊村\t176682\natoms\t176683\n拳皇吧_\t176684\n海峡两岸\t176685\n环牛\t176686\n吉林省文化厅\t176687\n周薇\t176688\n眩\t176689\n黑龙江省教育厅\t176690\n天津格力空调\t176691\nbbr\t176692\n蠡县\t176693\n八大处公园\t176694\n福冈市\t176695\n废都\t176696\n原千惠\t176697\n大巴\t176698\n小米钱包\t176699\nAT89S51\t176700\n马伯艾薇儿\t176701\nKANG\t176702\n白花\t176703\n2018年2月3日\t176704\n陕西省气象局\t176705\n猪头三房产网\t176706\n黄少天\t176707\n四川省人大常委会\t176708\n党支\t176709\n早先\t176710\nnotepad2\t176711\n集成吊顶网\t176712\n铁达\t176713\n凌采\t176714\n欧版\t176715\n325路\t176716\n1.1.0.0\t176717\n莱昂纳德\t176718\n爽子\t176719\n照相馆\t176720\n爱奇艺偶像男团竞\t176721\n2.
1.0\t176722\n抱枕\t176723\n操穴\t176724\n苏州地铁3号线\t176725\n为师\t176726\nFansadox\t176727\nA2k\t176728\nem5\t176729\n2014年7月\t176730\n白流苏\t176731\n湖北艺术职业学院\t176732\n#4\t176733\n深圳银行\t176734\ncgtn\t176735\n铁西广场\t176736\n185号\t176737\nGhana\t176738\ntweets\t176739\n高速公路公司\t176740\n响应\t176741\n法学类\t176742\n不得安宁\t176743\n吕碧城\t176744\n包小柏\t176745\n3052\t176746\n银行信用卡\t176747\n十九种\t176748\n沙茶面\t176749\nmomi\t176750\nDLNA\t176751\n上海广播电视台\t176752\nTransmission\t176753\n還有\t176754\n御女天下艳海风波\t176755\n葡萄成熟时\t176756\n细\t176757\n恶毒牧\t176758\n喂乳\t176759\nfeenix\t176760\n吉利沃尔沃\t176761\n祖母\t176762\n处方权\t176763\n无抵押贷款\t176764\n两公分\t176765\n科学技术处\t176766\n豪情壮志\t176767\n苗禺哥\t176768\n低电量\t176769\n国色天乡\t176770\nCUTV\t176771\n亚洲城市大学\t176772\nVMAX\t176773\n鄂州\t176774\n安德鲁·加菲尔德\t176775\nlzh\t176776\n户外服装\t176777\n分宜县\t176778\n列出\t176779\n物帘\t176780\n上海通用雪佛兰\t176781\n旅行青蛙\t176782\n赚\t176783\n背饰\t176784\n石油战争\t176785\n出庭作证\t176786\n百诚\t176787\n米沃奇\t176788\n准贷记卡\t176789\n火影究极风暴4\t176790\n清华图书馆\t176791\n开宝\t176792\ndubbo接口\t176793\nOTPUB\t176794\n莫古力蛮族\t176795\npet\t176796\n什么套\t176797\nDental\t176798\n德干高原\t176799\n自考通\t176800\n莘庄站\t176801\n2608\t176802\n有一死\t176803\n鞘\t176804\n2017年9月9日\t176805\n5季\t176806\nHome\t176807\n钻木\t176808\n新快现\t176809\n美星\t176810\nCUCAS\t176811\nBumblebee\t176812\nWant\t176813\n水晶粉\t176814\nGotti\t176815\nDBJ\t176816\n自我牺牲\t176817\n扔掉\t176818\nord\t176819\n丽丰\t176820\n合影照\t176821\n扫描员\t176822\n久冬\t176823\n口福\t176824\nCAB\t176825\n宜兴\t176826\n叶梓\t176827\n海贼王同人\t176828\n射频\t176829\n义乌国际商贸城五区\t176830\n赛季\t176831\n光纤通信技术\t176832\n同校\t176833\n徐君\t176834\nFIG\t176835\n原生\t176836\n老巴\t176837\n2017天\t176838\n新古典主义\t176839\nProfiler\t176840\n首登\t176841\n氢能源\t176842\n几注\t176843\n声门\t176844\n妙祥法师\t176845\n阿然\t176846\n记忆深处\t176847\n恢弘\t176848\n混住\t176849\n第壹\t176850\nGreatK\t176851\nimagick\t176852\n中国机场\t176853\n空客330\t176854\n赛小花\t176855\nddiction\t176856\n飞来石\t176857\n嗅探器\t176858\ninds\t176859\nTRAP\t176860\n格式工厂转换器\t176861\n2017-04-26\t176862\n爬坡\t176863\n花礼网\t176864\n乌山\t176865
\nn4050\t176866\n小皮网\t176867\niPhone6SP\t176868\n安卡拉\t176869\n勐海\t176870\n蒙哥\t176871\n人大商学院\t176872\ntit\t176873\nSmaart\t176874\n中华人民共和国特种设备安全法\t176875\nnsq\t176876\n暴富\t176877\n良辰吉日\t176878\n天创恒达\t176879\n壬戌\t176880\n沈阳路\t176881\n食单\t176882\n血厚\t176883\n装修队\t176884\n乐学网\t176885\n肾细胞癌\t176886\n200余名\t176887\n建工网\t176888\n吧台椅\t176889\ndefinition\t176890\n单筒\t176891\n五年间\t176892\n元妃\t176893\nasl\t176894\n7间\t176895\n狂爱\t176896\npunisher\t176897\n公园式\t176898\n美国加州大学洛杉矶分校\t176899\nDreamScene\t176900\n玻璃钢格栅\t176901\n头角\t176902\n钼矿\t176903\n新星帝国\t176904\n周5\t176905\n金陵中学河西分校\t176906\npcx\t176907\n20184月\t176908\n桑巴舞\t176909\noxide\t176910\n海相\t176911\n会阴穴\t176912\n1500000\t176913\n生化奇兵\t176914\niwc\t176915\n硼替佐米\t176916\n电动自行车网\t176917\n绿化处\t176918\nMagic\t176919\n斜坡\t176920\nIDD\t176921\n水娃\t176922\nreliance\t176923\n幸村\t176924\n叶茂\t176925\nGlamour\t176926\na^2+b^2\t176927\n乎乎\t176928\n战锤\t176929\n百度离线\t176930\n北极星火电招聘网\t176931\n背心裙\t176932\n3320M\t176933\n付娜\t176934\n宣教\t176935\n铁甲工程机械网\t176936\n桂阳县\t176937\n杀杀\t176938\n幸福婚嫁网\t176939\n辞职申请书\t176940\nFrancis\t176941\n活题\t176942\n导员\t176943\n刘国邵逸夫\t176944\n安居\t176945\nY\t176946\nv4.7.0\t176947\n长隆野生动物世界\t176948\nBeagle\t176949\n肠镜检查\t176950\n港汇广场\t176951\n志玲\t176952\n201710\t176953\n二十几篇\t176954\n素马\t176955\nmdadm\t176956\n挖路\t176957\n600728\t176958\n河北工程大学\t176959\n大洋电机\t176960\n虹桥镇\t176961\n巨石达阵\t176962\n奇子\t176963\n观花\t176964\naotu\t176965\n深圳市唯盛机械有限公司\t176966\n豫中\t176967\n中华大街\t176968\n河殇\t176969\n南小杜\t176970\n质量员\t176971\n文化遗\t176972\n10亿欧元\t176973\n当且\t176974\n360随身WiFi电脑版\t176975\n一级建造师证\t176976\n1939年\t176977\n表别名\t176978\nRCL\t176979\nmallow\t176980\n手残\t176981\nJavaAPI\t176982\n孙建平\t176983\n一日游记\t176984\n可乐饼\t176985\n钢化玻璃膜\t176986\nCKU\t176987\n度假酒店\t176988\n中铁二十局\t176989\n痛痛\t176990\n恩智浦半导体公司\t176991\nZIP压缩文件\t176992\nMATX\t176993\n空气源\t176994\n广州富力丽思卡尔顿酒店\t176995\n红井\t176996\nPlayboy\t176997\n极速飞车\t176998\n婴儿期\t176999\n亚洲基础设施投资银行\t177000\n广州)国际家具博览会\t177001\n史湘云\t177002\n性交片\t177003\n纳他霉素\t177004\n58建筑英才网\t17
7005\n甑\t177006\n活性炭滤芯\t177007\n红鼻子\t177008\n前庭大腺囊肿\t177009\n装嫩\t177010\n歼-15\t177011\n己卯\t177012\n万艾可\t177013\nNIPT\t177014\n赵慧\t177015\n国家级水产种质资源保护区\t177016\n1.0.32\t177017\n分布式服务器\t177018\n七枚\t177019\n刘长春\t177020\n太师椅\t177021\nWhidy\t177022\n鸟牌\t177023\n古诗苑\t177024\n轴测图\t177025\nvfork\t177026\n法币\t177027\n老鼠屎\t177028\n林宇\t177029\n给给\t177030\nIMEI号\t177031\n祥云路\t177032\n胜者\t177033\n超过60秒\t177034\n20多斤\t177035\ncemu模拟器\t177036\n宋兵乙\t177037\n百头\t177038\n紧身衣\t177039\n单击\t177040\n心喜\t177041\n水润\t177042\n避世\t177043\n李元昊\t177044\n锲而舍\t177045\n伊州\t177046\n星空之谜\t177047\n山东小学\t177048\n玉龙沙湖\t177049\n航埠镇\t177050\n蛇蛇争霸\t177051\nserena\t177052\n经验论\t177053\n电单\t177054\n人生难得\t177055\ndot1q\t177056\nvmmon\t177057\n闽宁镇\t177058\n208万\t177059\n好高兴\t177060\n家\t177061\n代理报关委托书\t177062\n真武死人经\t177063\n2012年春节\t177064\n天鹅之死\t177065\n大世纪\t177066\n骨龙\t177067\n1827\t177068\n花毛\t177069\n辉映\t177070\n10049\t177071\n长安星卡\t177072\n战斗型\t177073\n1846年\t177074\n教研会\t177075\n江永县人民政府\t177076\n美图日\t177077\n4388x\t177078\n20160528\t177079\n欠税公告\t177080\nSem\t177081\naboboo\t177082\n城市管理学\t177083\n曾泳醍\t177084\n卡片机\t177085\n不及\t177086\n基督\t177087\n阳间巡逻人\t177088\n违约\t177089\n博卡\t177090\n易发\t177091\nL-1\t177092\n姜糖膏\t177093\n直属机关工会\t177094\n宝燕壹号\t177095\nUltraNav\t177096\ncried\t177097\n基督新教\t177098\noni\t177099\n甜婚\t177100\n万兴神剪手\t177101\n宫雪花\t177102\n内蒙古公安厅\t177103\n木材削片机\t177104\n75度\t177105\n武汉2号线\t177106\n夜十三\t177107\n林青\t177108\nFreaky\t177109\nMinimal\t177110\nroller\t177111\n90.0\t177112\n学友\t177113\n朱东润\t177114\n绳带\t177115\n四清运动\t177116\n牢不\t177117\niso11\t177118\n题纲\t177119\n依视路\t177120\nNODEJS\t177121\n杰尔马\t177122\n重庆市南开中学\t177123\n胤禵\t177124\n娄烨\t177125\n琴半岛\t177126\n江淮\t177127\n老屋\t177128\nendnote7\t177129\nLab-on-Web\t177130\n安东阳\t177131\n13代\t177132\n消防梯\t177133\n超级马里奥\t177134\n雅诗兰黛多效智妍\t177135\n0.68\t177136\n湖州喜来登温泉度假酒店\t177137\naddEventListener\t177138\n倍功\t177139\n中车长春轨道客车股份有限公司\t177140\n深圳地铁6号线\t177141\n速率\t177142\nrabbit\t177143\n老味\t177144\nhuangye\t177145\n蒲公英水\t177146\n磷酸一铵\t177147
\n斯堪的纳维亚\t177148\n盘绕\t177149\n镇北堡西部影视城\t177150\nOmniFocus\t177151\n夏完淳\t177152\nmarathon\t177153\n哈尔滨机场\t177154\n赵匡胤\t177155\n82名\t177156\n热血动漫\t177157\n大江流\t177158\n中国石油天然气运输公司\t177159\n红柚\t177160\n會\t177161\n胸痛\t177162\n留客\t177163\n入园\t177164\n国家教育委员会\t177165\ncuc\t177166\n潜罪犯\t177167\n通信员\t177168\n叶片\t177169\n微信认证\t177170\n圣殇\t177171\n电动版\t177172\noctet\t177173\n饮食类\t177174\n第二十九回\t177175\n连带保证\t177176\n退化\t177177\n钜派投资\t177178\n病句\t177179\n艺术系\t177180\nMonoBehaviour\t177181\n旅美\t177182\n罗慧娟\t177183\nchm-CSDN\t177184\n科索沃战争\t177185\n流行歌曲\t177186\n京兆尹\t177187\n爱情的骗子我问你\t177188\n整盘\t177189\n电炖锅\t177190\n柯尔特\t177191\n杏干\t177192\n明基医院\t177193\n吊诡\t177194\n表面粗糙度\t177195\ncfu\t177196\n反用\t177197\nvcx\t177198\n天辉\t177199\n追认\t177200\n第31号\t177201\n艾露猫\t177202\n阿克赛钦\t177203\n双写\t177204\n地三鲜\t177205\n1月底\t177206\nLay\t177207\n争风吃醋\t177208\n空心菜\t177209\nHosted\t177210\n阿拉斯加雪橇犬\t177211\n高尔夫4\t177212\n游子身\t177213\n400多\t177214\n甲醇\t177215\nJolly\t177216\nilove\t177217\n卡库\t177218\n甲基蓝\t177219\n朗月\t177220\nGIMP\t177221\n同纬度\t177222\n工作日程表\t177223\n胎盘素\t177224\n上塘\t177225\n眼红\t177226\n广中\t177227\nhld\t177228\n千花\t177229\n文山会海\t177230\n石田スイ\t177231\necl\t177232\n4399元气骑士\t177233\n思普瑞特\t177234\ngoddess\t177235\n山坡\t177236\n离子水\t177237\nDITA\t177238\n郑州公司\t177239\n水源保护区\t177240\n高走\t177241\n植袋\t177242\n邪恶漫画少女漫画\t177243\n郭玲\t177244\n二十七八岁\t177245\n豆本豆\t177246\n花生浆\t177247\n试画\t177248\nStuart\t177249\n硅锰\t177250\n条形图\t177251\n八字命理学\t177252\n现代商贸工业\t177253\n虚框\t177254\nMERGE\t177255\n并口硬盘\t177256\n可奖\t177257\n名爵zs\t177258\n0.44\t177259\n擅于\t177260\n凯特·温斯莱特\t177261\n第14次\t177262\npremiere2018\t177263\nandorid\t177264\n600208\t177265\n火男\t177266\nthieves\t177267\n福悦榕湾\t177268\n财网\t177269\nFirefly-RK3288\t177270\n快乐大本营\t177271\n辨证施治\t177272\nLagoon\t177273\n回信\t177274\nNudes\t177275\n五莲县\t177276\n小道\t177277\n陶陶\t177278\n王之涣\t177279\n翻译机\t177280\n佳豪\t177281\n博\t177282\n25S\t177283\nfreevideotu性\t177284\n庵埠镇\t177285\n谵妄\t177286\n弦子\t177287\nyanyuan\t177288\n酥油茶\t177289\n郑州市第七人民医院\t177290
\n入党申请书\t177291\nEddy\t177292\n清贫\t177293\n西米露\t177294\n平谷镇\t177295\n彩妆品\t177296\n机气\t177297\n就要\t177298\n八里湖\t177299\n摩贝网\t177300\n肃\t177301\n宋磊\t177302\n浙江省政协\t177303\n三十五岁\t177304\n老河口\t177305\n长沙小学\t177306\n黄昕\t177307\n黄沙镇\t177308\nUR\t177309\n董事局\t177310\n机械舞\t177311\nsref\t177312\n全国人口普查\t177313\n法网恢恢\t177314\ncp感\t177315\n抽票\t177316\n孙旭\t177317\n易生\t177318\n1.5L_\t177319\n变4\t177320\n娜可露露\t177321\n新蒲新区\t177322\n肖辉\t177323\n中国地震局地球物理研究所\t177324\n海口湾\t177325\n天使城\t177326\n两百元\t177327\n百度联盟吧\t177328\n天津四中\t177329\n奔驰CLA级\t177330\n电泳仪\t177331\n脚丫\t177332\n30题\t177333\n暖人\t177334\nT58\t177335\n板洞\t177336\n安徽省物价局\t177337\nDracula\t177338\n叶星辰\t177339\nBabel\t177340\n桃花红\t177341\nh1z1\t177342\nKey-Value\t177343\n路路行旅游网\t177344\njuly\t177345\n天津市环湖医院\t177346\nhtml2canvas.js\t177347\nlnn\t177348\n羊羊羊\t177349\n14颗\t177350\n微博\t177351\n奥克兰理工大学\t177352\nchinadns\t177353\n克丽缇娜\t177354\n拉林河\t177355\n善良的心\t177356\n相煎\t177357\nSlf4j\t177358\n3578\t177359\n谍影\t177360\n展品\t177361\n偷逃\t177362\n深情对唱\t177363\n排种器\t177364\n王者局\t177365\n雕铣机\t177366\n拿住\t177367\n海云\t177368\n临泉县人民政府\t177369\n作客\t177370\n叶城县\t177371\n欣百达\t177372\n东方肝胆医院\t177373\nlpp\t177374\n足球直播\t177375\n镇海股份\t177376\n点照\t177377\nDamned\t177378\n布力\t177379\n替加环素\t177380\n编写组\t177381\n何以笙箫默\t177382\n虎齿\t177383\n冰蛙\t177384\n深圳市疾病预防控制中心\t177385\n_天合联盟\t177386\nlibcrypto\t177387\n卡盟\t177388\n重庆小吃\t177389\n吴孟达\t177390\n浓重\t177391\nGB/4754\t177392\n南京师范大学音乐学院\t177393\nWeird\t177394\n没有关机\t177395\n浮生如梦\t177396\n生鲜\t177397\n遵义市第一人民医院\t177398\n阿什\t177399\nmysql57\t177400\n作业管理\t177401\n许老师\t177402\n绑票\t177403\nT110\t177404\n王国栋\t177405\n侯旭\t177406\n百病\t177407\n怡口软水机\t177408\n河南省人大\t177409\n瑞拉\t177410\n茶树精油\t177411\n赣州西站\t177412\n飞虎极战\t177413\nwenQ\t177414\n小红莓乐队\t177415\nmsw\t177416\nSharpe\t177417\n7.5厘米\t177418\n蟋蟀罐\t177419\nd3.js\t177420\na16\t177421\n口饭\t177422\nCrysaty\t177423\n吕岭路\t177424\n京津冀一体化\t177425\n张宝胜\t177426\n北京文艺广播\t177427\n毛服\t177428\n11.13\t177429\nbomber\t177430\n杵\t177431\n一百步\t177432\n钢膜\t177433\n邱兵\t1774
34\n广东银行\t177435\n江西地区\t177436\n脉博\t177437\ntext3\t177438\nSM痴女系\t177439\n4月6号\t177440\niterm\t177441\n坤宝丸\t177442\n自然观\t177443\nzyx\t177444\nCadence\t177445\n4200\t177446\n特辑\t177447\n湖南卫视元宵喜乐会\t177448\n192.168.1.1_\t177449\n明良\t177450\n绵阳\t177451\n宝鸡市教育局\t177452\n米家骑记电助力折叠自行车\t177453\n高达00剧场版\t177454\n百度影音电影网\t177455\n低俗\t177456\n张宏\t177457\n芒果班戟\t177458\n混搭\t177459\n优力\t177460\n厦门一中\t177461\n李长江\t177462\n草野\t177463\n铁甲咒\t177464\n中国大百科全书\t177465\nHomebrew\t177466\n小梵\t177467\n&ldquo\t177468\n朝阳街道\t177469\n交通一卡通\t177470\n黄斑变性\t177471\nMakeFile\t177472\n人文科学\t177473\n国防大学政治学院\t177474\n216\t177475\n铜川路\t177476\n银行学\t177477\n经验集\t177478\nlorem\t177479\n挪威森林猫\t177480\n材料工程学院\t177481\n竞职\t177482\nwin101709\t177483\nA50指数\t177484\n倒窗\t177485\n体力不支\t177486\n防辐射\t177487\n农业信息网\t177488\nspectroscopy\t177489\n希区柯克\t177490\n驯悍记\t177491\n80公里\t177492\nbobo\t177493\n工作学\t177494\n民泰银行\t177495\n2109\t177496\nrmp\t177497\n超马\t177498\n从句\t177499\n弹框\t177500\nCharges\t177501\nEa\t177502\n城市级\t177503\n知书达理\t177504\n鱼屎\t177505\n脑心通胶囊\t177506\nMoran\t177507\n澳门金沙城\t177508\njwgl\t177509\n弥合\t177510\nACA\t177511\n朱伟\t177512\nbanq\t177513\nxumenger\t177514\n4399xyx\t177515\nx98\t177516\n夏商西周\t177517\n皮丘\t177518\n王德峰\t177519\n呼和浩特站\t177520\n恰如\t177521\n轰顶\t177522\n镰仓物语\t177523\n德基\t177524\n2级\t177525\n林冲\t177526\n福州市民服务中心\t177527\n登舰\t177528\n有答\t177529\n生产经理\t177530\n捜査\t177531\n南京消防\t177532\n巫师\t177533\n福彩3D\t177534\n为列\t177535\n林洛雪\t177536\n金斧头\t177537\n珐琅\t177538\nT800\t177539\n消防队\t177540\n横切\t177541\n名满天下\t177542\n大保\t177543\n小米加步枪\t177544\n荆天明\t177545\n何凯明\t177546\n蜜宠甜妻\t177547\n圣物\t177548\n乗\t177549\n女孩子们\t177550\n卦街\t177551\n名节\t177552\nmido\t177553\n十翅\t177554\n以药养医\t177555\nMapper\t177556\n色表\t177557\n有气\t177558\n新华通讯社\t177559\n1-2月\t177560\n新木优子\t177561\n捷途捷途X952018款\t177562\n红塔区\t177563\n苗木表\t177564\n603773\t177565\nderive\t177566\n俊雅\t177567\n烧烤/烤肉\t177568\n杭州区块链产业园\t177569\n补碘\t177570\n观澜富士康\t177571\n执罪\t177572\n22pk\t177573\n刘希\t177574\nRPF\t177575\n完蛋\t177576\ncontos7\t177
577\n英雄联盟职业联赛\t177578\n暗黑破坏神3_Diablo3_凯恩之角\t177579\n刘女士\t177580\nruru\t177581\n师范学\t177582\nski\t177583\nWingdings\t177584\nOakley\t177585\n留念\t177586\n栅栏式\t177587\n水垫\t177588\n三双\t177589\n水中\t177590\nSTC89C52\t177591\n20161206\t177592\n嗣\t177593\nwpp\t177594\n玫瑰酒\t177595\n培正学院\t177596\n宋家庄站\t177597\n母穴\t177598\nngixn\t177599\n圣言\t177600\n万表\t177601\n凡人修仙传单机版\t177602\n瘦脸操\t177603\n治违\t177604\n11-2\t177605\npage\t177606\ntextured\t177607\n傲\t177608\n急跌\t177609\nCPLEX\t177610\n网上车市\t177611\n沁香园\t177612\n东岑西舅\t177613\n3dMax\t177614\n新武则天外传\t177615\n黄金分割\t177616\nNodejs\t177617\n劳务工\t177618\n真枪\t177619\n管理方格理论\t177620\nMilking\t177621\n北京市建筑设计研究院有限公司\t177622\n草帘\t177623\n瓦业\t177624\n古籍\t177625\n自粘式\t177626\nHTML+CSS\t177627\nWORD2003\t177628\n独一\t177629\n花井美\t177630\n湘味\t177631\n视作\t177632\n成都搬家公司\t177633\n赵立\t177634\n拓扑排序\t177635\npou\t177636\n定冠词\t177637\n济缘算命\t177638\n解下\t177639\n茶文化\t177640\n南仓\t177641\n中源协和\t177642\n小伙子\t177643\n远坂凛\t177644\n兰考县\t177645\n水芹\t177646\n一个50\t177647\n释出\t177648\n三星Galaxy\t177649\njavaBean\t177650\n滚出去\t177651\n深圳地税电子税务局\t177652\n冯学荣\t177653\n求婚\t177654\nWenger\t177655\n文惠卡\t177656\nsplinter\t177657\nDagger\t177658\n李科\t177659\nbutterfly\t177660\n合宿\t177661\n大团\t177662\nmiranda\t177663\npendant\t177664\n奓山\t177665\n刘光明\t177666\nincreases\t177667\nTURBO\t177668\n五十块\t177669\n消防车\t177670\nzhongxin\t177671\n4-5月\t177672\n大分辨率\t177673\n15天后\t177674\n5万分\t177675\n海南博鳌亚洲论坛\t177676\n射频美容仪\t177677\n战会\t177678\n收银台\t177679\n绅士版\t177680\nLOOKUP函数\t177681\ncorrecting\t177682\n增光路\t177683\n南安市\t177684\n花落谁家\t177685\n太刀\t177686\n大宝漆\t177687\n井岸\t177688\n乳头\t177689\n洛阳\t177690\n2子\t177691\njigsaw\t177692\n中国中车\t177693\n四宗\t177694\n库尔图瓦\t177695\n基坑\t177696\n出生在\t177697\n天剑录\t177698\n东风本田汽车有限公司\t177699\n两个儿子\t177700\n黑焰\t177701\n新能源网\t177702\n重庆广播电视大学\t177703\n太白小区\t177704\n回执号\t177705\n丁度巴拉斯\t177706\n双截龙2\t177707\nusb驱动\t177708\n洲\t177709\n返贫\t177710\n北京恒昌汇财投资管理有限公司\t177711\n台州市第一人民医院\t177712\n郭亮\t177713\n农工民主党\t177714\nvenue\t177715\n明亡\t177716\n三星堆\
t177717\n航天科工\t177718\nDEBUFF\t177719\n武器装扮\t177720\n带头人\t177721\n陵\t177722\n山棕\t177723\n碘伏消毒液\t177724\n银厂沟\t177725\nBMB\t177726\n8英里\t177727\n平安车险\t177728\n褶子\t177729\nnado\t177730\n杂字\t177731\n痛心\t177732\n数码宝贝大冒险tri\t177733\nRF/\t177734\n起亚焕驰\t177735\n蒸炉\t177736\n渤海人寿\t177737\n每隔5分钟\t177738\n14分钟\t177739\n针孔摄像头\t177740\nenemies\t177741\n范式\t177742\n墨缘文学网\t177743\n故城\t177744\n特种庆余年\t177745\n0623\t177746\n刘婕\t177747\nBasketball\t177748\n华财\t177749\n校额\t177750\n716\t177751\ndetox\t177752\nBuster\t177753\n可卡\t177754\n七子饼\t177755\n可燃\t177756\n菜单栏\t177757\n忙忙\t177758\n124&\t177759\n凉州区\t177760\nSuperior\t177761\n控制组\t177762\n绝歌\t177763\ngmv\t177764\nmpc\t177765\nprogrammes\t177766\noblique\t177767\n天天p图\t177768\n媒体播放器\t177769\n短信宝\t177770\n登云路\t177771\n7-12月\t177772\n提前还贷款\t177773\n无双三国\t177774\n1853494\t177775\naif\t177776\nSGX\t177777\natom\t177778\n青木源\t177779\n19张\t177780\n200A\t177781\n北大医学院\t177782\nVOA慢速英语\t177783\ndonet\t177784\n鸿通\t177785\n提灯女神\t177786\nhonours\t177787\n汤敏\t177788\n天燃气\t177789\n腰牌\t177790\n废话\t177791\n风云3\t177792\n抽抽\t177793\n退伙\t177794\n802.11a/\t177795\nCorel\t177796\n沈丘县\t177797\n苟不教\t177798\n微生物群落\t177799\n中国航天科技集团有限公司\t177800\nTXT\t177801\n冬川豆\t177802\n马荣成\t177803\n静海一中\t177804\n3599元\t177805\n醛基\t177806\n传家训\t177807\n中公考研\t177808\n碧蓝航线\t177809\n德邦物流\t177810\n张校长\t177811\n光卢克\t177812\n秀米编辑器\t177813\n几十本\t177814\n博学网博学商学院\t177815\n迈外迪\t177816\n村委会\t177817\n业余爱好\t177818\n蓝叠\t177819\nC168\t177820\nfengbin2005\t177821\n12月25日\t177822\n市侨联\t177823\n胶浆\t177824\n時候\t177825\n编织乐\t177826\n精科\t177827\n君山\t177828\n观音竹\t177829\n糖尿病\t177830\n房车\t177831\n22部\t177832\n4431s\t177833\n底化\t177834\n龙堂\t177835\n中央公馆\t177836\n一炉\t177837\n王晋\t177838\n山东农业工程学院\t177839\n索松村\t177840\n1770\t177841\n排烟道\t177842\n酩酊\t177843\n胡子长\t177844\n开心保保险网\t177845\n深圳市金证科技股份有限公司\t177846\nalv\t177847\n静态化\t177848\n静教院附校\t177849\n均压\t177850\n以求\t177851\n滨海街道\t177852\nLennon\t177853\n雅塔莱斯\t177854\n止汗露\t177855\ncarry\t177856\n250M\t177857\n现存量\t177858\n高甲戏\t177859\n水桶机\t177860\n云南省政
协\t177861\n海珠区政府\t177862\n花椒树\t177863\n09年\t177864\n不差钱\t177865\nthunderbolt\t177866\n同伴们\t177867\n绿地海外滩\t177868\n静安寺\t177869\n番禺人才网\t177870\n八月八日\t177871\n服部圭子\t177872\n输入端\t177873\n10股\t177874\n余少群\t177875\n腰背\t177876\n追讨\t177877\n凸块\t177878\n政审表\t177879\n社保户\t177880\n硕放中心幼儿园\t177881\n中华人民共和国消防法\t177882\nFang\t177883\nnage\t177884\n算子\t177885\n化工资讯网\t177886\n中南锦城\t177887\n制造者\t177888\n副厅\t177889\n格挡\t177890\n阴阳鱼\t177891\n影象\t177892\n无父\t177893\n许语\t177894\n得一人心\t177895\n矸\t177896\n滑模\t177897\npomelo\t177898\nCrossword\t177899\n潮7000\t177900\n追声\t177901\n542\t177902\n服务项\t177903\n亚陆风云\t177904\n贵州中耀华建\t177905\n关东煮\t177906\nSQLserver2008\t177907\n硬伤\t177908\n用列\t177909\nHL\t177910\nh.265\t177911\n中华女性网\t177912\n蜡黄\t177913\n余大裕\t177914\n边距离\t177915\n3米高\t177916\n剑网3_17173剑网3\t177917\n古车\t177918\nutv\t177919\n防门\t177920\n诗句\t177921\n09五\t177922\n何家弘\t177923\n外墙板\t177924\n甲子园\t177925\n广州公安网上办事大厅\t177926\n釉色\t177927\n完成时\t177928\nimagepng\t177929\n沙城镇\t177930\n央视纪录频道\t177931\n中远海运物流有限公司\t177932\n一落千丈\t177933\nSprinter\t177934\n学贯中西\t177935\n春航\t177936\n剪断\t177937\n灵霄峡\t177938\nashampoo\t177939\nG01\t177940\n住宅区\t177941\n中共中央文献研究室\t177942\n搜狗浏览器\t177943\n民办本科\t177944\n1-11月份\t177945\nSpecify\t177946\nunico\t177947\n桀骜\t177948\n王守英\t177949\nzuo\t177950\n中金博客\t177951\n123操逼网\t177952\n诚信大道\t177953\n2017十部\t177954\n结账\t177955\n威宁县\t177956\nMandatory\t177957\nOTK\t177958\nE8中文网\t177959\n1轮\t177960\n杂峰\t177961\n第72号\t177962\n有待\t177963\nhstack\t177964\n护层\t177965\n祖孙\t177966\nPNG\t177967\n大神f2\t177968\n工業\t177969\n振翅\t177970\n北关区\t177971\n紫苏\t177972\n敢于\t177973\n鼓型\t177974\n珠海特区报数字报\t177975\nGEF\t177976\n美乡村\t177977\n网管型\t177978\n三十名\t177979\n9E\t177980\n中国建设银行个人网上银行\t177981\n我喜欢一个人\t177982\nCAJViewer\t177983\n多式\t177984\n福山区\t177985\nDuke\t177986\n餐叉\t177987\n毁灭战士4\t177988\n购物节\t177989\n贝利尼\t177990\n计划案\t177991\n墨莲\t177992\n无人区\t177993\n明知\t177994\noracle\t177995\n神湾\t177996\n将嫁\t177997\n长城环球通\t177998\n中国邮政挂号小包\t177999\n院外\t178000\n钱站\t178001\n天易成网管软件\t178002\n校园版\t178003\njap
\t178004\n产期\t178005\n定影\t178006\nLab\t178007\n才能够\t178008\n差事\t178009\n宜昌市中心医院\t178010\n异度传说\t178011\n灯笼草\t178012\n接错\t178013\n清莱\t178014\ncurvature\t178015\n难装\t178016\n斜口钳\t178017\n9点半\t178018\n维思\t178019\n龙鸣\t178020\n矿产\t178021\npursuer\t178022\n管壳式\t178023\n迈丘\t178024\n7.3惩戒\t178025\n苏靖\t178026\n万熙\t178027\n谭伟\t178028\nhls\t178029\n310W\t178030\n程光\t178031\n结庐\t178032\n真谛\t178033\n靖康之耻\t178034\nchristmas\t178035\n加工类\t178036\n自助烤肉\t178037\n梳打\t178038\n东京食尸鬼第三季\t178039\ninterested\t178040\n强直性脊椎炎\t178041\nattachEvent\t178042\n爱信变速箱\t178043\nble\t178044\n兽性\t178045\n医博\t178046\n味茶\t178047\n上舰\t178048\n销售价\t178049\n神契幻奇谭\t178050\n豪强\t178051\n塔里木\t178052\n丰城市人民政府\t178053\nedk2\t178054\n画员\t178055\npina\t178056\nWATCH2\t178057\n海阳市\t178058\nopenfoam\t178059\n霸凌\t178060\n3c认证\t178061\nFUJIFILM\t178062\n给女儿\t178063\n调校\t178064\n弱体\t178065\n吉本\t178066\nxg\t178067\nNo.4\t178068\n双龙\t178069\nDemand\t178070\n龙在宇\t178071\n阳光保险集团股份有限公司\t178072\n已经\t178073\n棚帘\t178074\n第一所\t178075\n全叔\t178076\n新枪\t178077\n脓\t178078\n节哀顺变\t178079\n中商惠民\t178080\n8.36\t178081\n近海\t178082\n陶东风\t178083\n湘园\t178084\n【史\t178085\n梁明\t178086\nPreparedStatement\t178087\nflock\t178088\n吴欣\t178089\n天面\t178090\n重庆朝天门\t178091\n几天一个\t178092\n复方精油\t178093\n苦海无涯\t178094\n高雯\t178095\ndomian\t178096\n认输\t178097\n绿力\t178098\nNote_MIUI论坛\t178099\n标养\t178100\n谜局\t178101\n1700元\t178102\n2017.3.3\t178103\n沉沉\t178104\n颜柳欧赵_\t178105\n精酿啤酒屋\t178106\n61部\t178107\n妖气原创漫画梦工厂\t178108\n淮塔\t178109\nnx8.5\t178110\n新店\t178111\njim\t178112\nshina\t178113\n韩德李大霄\t178114\n休憩\t178115\n马奶酒\t178116\nomega-3\t178117\nAurora\t178118\n蕾哈李\t178119\n4000点\t178120\n商州区人民政府\t178121\n流行音乐\t178122\npossibilities\t178123\n两会思想汇报\t178124\nbeanshell\t178125\n零之\t178126\n通灵宝玉\t178127\n球根\t178128\n贰号\t178129\n三五成群\t178130\nfsockopen\t178131\n博途V13\t178132\n瀏覽\t178133\n一星期后\t178134\n_途昂论坛_汽车之家论坛\t178135\n男体\t178136\n硝基甲烷\t178137\n风山\t178138\n昨晚\t178139\n瓷罐\t178140\n失格\t178141\n胶笔\t178142\n世界末日\t178143\nArrivals\t178144\n彝乡\t178145\n落车\t178146\nie
12\t178147\n人人保\t178148\n托起\t178149\n断桥残雪\t178150\n言小念\t178151\n永红街道\t178152\n水牢\t178153\n超极速\t178154\n砌石\t178155\n20170611\t178156\n二面\t178157\n灵签\t178158\n铜网\t178159\n说着\t178160\nrobust\t178161\n笨笨熊\t178162\nhopeless\t178163\n氯苯基\t178164\n山东出版集团\t178165\nspdif\t178166\nnlb\t178167\n出锋\t178168\nserial\t178169\n子项\t178170\n保修期\t178171\n开销项\t178172\n有便意\t178173\n同学录\t178174\n选择符\t178175\n东南大学研究生院\t178176\n具有\t178177\n绍兴日报数字报\t178178\n陨\t178179\nappearing\t178180\n电信增值业务许可证\t178181\n章子\t178182\nCAPTCHA\t178183\n老人鞋\t178184\n李宗翰\t178185\n地质\t178186\n证券经纪业务\t178187\nsso单点\t178188\n萌娘\t178189\nOkhttp3\t178190\n路勇\t178191\n市府\t178192\n热议\t178193\n锉刀\t178194\nreserves\t178195\n低调\t178196\n新约圣经\t178197\n修建性详细规划\t178198\n围兜\t178199\n流满面\t178200\n锦丰\t178201\n计划项\t178202\n1450级\t178203\n20171120\t178204\n相沢\t178205\nラムネ\t178206\nTraps\t178207\n林周县\t178208\ndadi\t178209\n尝鲜版\t178210\n思创医惠\t178211\n王立彤\t178212\nEdge浏览器\t178213\ntikz\t178214\n楼中楼\t178215\n1314.com\t178216\n努比亚电竞\t178217\n城记\t178218\n自由程\t178219\n串变\t178220\n第1行\t178221\n鲁冰花\t178222\n质子数\t178223\n何惜\t178224\n孙尚\t178225\n利华教育网\t178226\n一页\t178227\nhipp\t178228\ntracy\t178229\n1秒\t178230\n福州市财政局\t178231\nmydumper\t178232\n易主\t178233\n可卡犬\t178234\n小金鱼\t178235\n海购\t178236\n肖刚\t178237\n能以\t178238\n北京六里桥\t178239\n猎豹wifi\t178240\nショタ\t178241\n大雄宝殿\t178242\n红莓\t178243\n郑州房管局\t178244\n4.8万\t178245\n昆吾\t178246\n明矾\t178247\nframed\t178248\n四十分\t178249\n中央路\t178250\n新\t178251\n左女\t178252\nオトコ\t178253\nanhei\t178254\nv1.2.6\t178255\nMyGirl\t178256\n董允\t178257\n戚百草\t178258\n24页\t178259\n果图\t178260\n古神树\t178261\n希伯特\t178262\n27.2\t178263\n父母邦\t178264\n英语专业八级考试\t178265\n593\t178266\n512k\t178267\n自古以来\t178268\n李蓉\t178269\n进步性\t178270\n卡簧钳\t178271\nC9\t178272\n昆泰\t178273\n变速箱\t178274\n羟基\t178275\nWhyWin\t178276\nLogistic\t178277\n练习4\t178278\n04月15日\t178279\n炙烤\t178280\n十东路\t178281\n您好\t178282\nm701\t178283\nHD国粤\t178284\n冀\t178285\n讲到\t178286\n自驾旅游\t178287\n模糊测试\t178288\n小弟\t178289\n小河虾\t178290\n生生\t178291\n2017|\t178292\n腹黑逆天大小姐
\t178293\n庄行镇\t178294\n宅基证\t178295\n框条\t178296\n杠头\t178297\n小米手环2\t178298\n泥管\t178299\n萨菲罗斯\t178300\n安卓7.1.1\t178301\n双排座\t178302\nvypr\t178303\n你的朋友\t178304\n郑州财经学院\t178305\n魔弦传说\t178306\nRCM\t178307\n女医肉奴隶\t178308\n陕西省国有资产监督管理委员会\t178309\n思域\t178310\n外刊\t178311\n00009\t178312\n花城广场\t178313\n莅临\t178314\n艺术园\t178315\n前进村\t178316\nGCK\t178317\n漆门\t178318\n证券招聘网\t178319\n術\t178320\nabp171\t178321\n弄短\t178322\n狼狈为奸\t178323\n隔层\t178324\n调用者\t178325\n水喉\t178326\n英菲尼迪qx50\t178327\n港澳台\t178328\n牡丹卡\t178329\nvisio\t178330\n方差公式\t178331\n邻补角\t178332\n十多\t178333\n中环国际广场\t178334\n福房网\t178335\n尘泥\t178336\n第三名\t178337\n圆心\t178338\n转向架\t178339\n欺诈师\t178340\n刚需房\t178341\n医学教育考试网\t178342\n14000元\t178343\nempress\t178344\nWWWS18MCOM\t178345\n害人\t178346\n三杆\t178347\n毒水\t178348\n秋成勋\t178349\n星达\t178350\n三源\t178351\n成都瑞芬思生物科技有限公司\t178352\n镰鼬\t178353\n高等院校\t178354\n栖所\t178355\n鱼头\t178356\n凯悦\t178357\n驭\t178358\nranch\t178359\n养乐多\t178360\n李玹雨\t178361\n国医\t178362\nipxe\t178363\n无晴\t178364\nfifaonli\t178365\n统筹\t178366\n机主\t178367\n黄芽\t178368\npatient\t178369\n哈伯福\t178370\n硫化硅橡胶\t178371\n2015年10月16日\t178372\n创新杯\t178373\n三拍\t178374\n玻璃人\t178375\n中时\t178376\n3.0\t178377\nCiudad\t178378\nmpp\t178379\n刘小宝\t178380\n萧衍\t178381\n妹妻\t178382\nappicon\t178383\n1.1.5\t178384\n琯头\t178385\n预制桩\t178386\n5281\t178387\n大校区\t178388\n台式机\t178389\nMazey\t178390\n墓王之王悬棺寺全集\t178391\n20160804\t178392\n蔻\t178393\n陆良县\t178394\n土方十四郎\t178395\n杉本彩\t178396\n舞蹈\t178397\n一念\t178398\n冒险岛战神\t178399\n植物药\t178400\ncs1.6\t178401\nPPP项\t178402\n光纤在线\t178403\n民族舞\t178404\n白皇后\t178405\n绿竹\t178406\niptraf\t178407\n河北省财政厅\t178408\n腾讯动漫\t178409\n凌晨一点\t178410\n48步\t178411\ncoreldraw9\t178412\n3.96\t178413\n狈\t178414\n亮马河\t178415\n金蝉脱壳\t178416\nwta\t178417\n广安在线\t178418\n金家坝\t178419\n1迅雷\t178420\n吕文艺\t178421\n138家\t178422\n花客\t178423\n焊管\t178424\n库伦沙漠\t178425\n纯果汁\t178426\n轴承座\t178427\n情痴\t178428\n姿色\t178429\nElastic\t178430\n刘康\t178431\nFranck\t178432\n风溯君\t178433\nDay3\t178434\nq1\t178435\n紫玉\t178436\n神雾集团\t178437\nTANK\t17843
8\n越南人\t178439\nShipment\t178440\n苍翼默示录神观之梦\t178441\n行条\t178442\nPhd\t178443\n韩松\t178444\nryzen5\t178445\nserver2014\t178446\n月桂醇\t178447\n都市言\t178448\nSealy\t178449\n5250\t178450\n合租房网\t178451\nbutable\t178452\n电视业\t178453\n击落\t178454\nkeeping\t178455\n体制\t178456\n小额贷款\t178457\n切尔\t178458\n焦虑\t178459\nsf吧\t178460\n匈牙利狂想曲\t178461\n受罚\t178462\n水文学\t178463\nari\t178464\n光伏逆变器\t178465\n嘉元\t178466\n33_\t178467\n山海战记\t178468\ndesigner3\t178469\n旬果\t178470\n日综\t178471\nOPENVPN\t178472\n关联认证\t178473\n痴狂\t178474\n御驾\t178475\nNCIS\t178476\n天府文化\t178477\n阜阳市人力资源和社会保障局\t178478\n乙酰氨基酚\t178479\n笙箫\t178480\n大疆精灵\t178481\n理论版\t178482\nOV5640\t178483\n漫水\t178484\ngtimg\t178485\n薏米茶\t178486\n系数\t178487\n谯\t178488\n龙潭寺\t178489\n渚薰\t178490\n本公司\t178491\n馋段\t178492\n衍纸\t178493\n2018-2022年\t178494\n招惹\t178495\n诱导性\t178496\n6.2.0\t178497\nblockly\t178498\nmeta分析\t178499\nEPD\t178500\n优纪\t178501\n15道\t178502\n吴裕泰\t178503\n7股\t178504\n八英里\t178505\n休闲裤\t178506\nFateGO\t178507\nfreerdp\t178508\n洪子诚\t178509\n秩序感\t178510\n神武幻想\t178511\n年复合增长率\t178512\n小志\t178513\n沧海一滴\t178514\n三官庙\t178515\n盗号\t178516\n婉约词\t178517\n证信\t178518\n歌机\t178519\nCast\t178520\n持妆\t178521\nUISegmentedControl\t178522\n游山\t178523\n2.1万\t178524\n西安派出所\t178525\n食品科学\t178526\n院院\t178527\n_妆美扮靓_爱靓网\t178528\nDTU\t178529\n碉堡\t178530\n问声\t178531\n军嫂\t178532\nAppendix\t178533\n调训\t178534\n康复者\t178535\n梁上柱\t178536\n孝陵卫\t178537\n达世币\t178538\nsburg\t178539\nrma\t178540\n红楼梦全集\t178541\n吊桶\t178542\ncom3d2\t178543\n葡萄糖酸\t178544\n石油公司\t178545\n袅娜\t178546\n李劼\t178547\nvhdl\t178548\nsunon\t178549\niface\t178550\n空\t178551\n8.86\t178552\n行胜于言\t178553\n岩棉夹芯板\t178554\nlexical\t178555\n匡扶\t178556\n菲尼斯\t178557\n假面骑士Wizard\t178558\n噼啪\t178559\nvcvarsall\t178560\n2017度\t178561\n丰裕\t178562\n柯基吧\t178563\n见闻录\t178564\n别惹\t178565\n防疫\t178566\n弋痕夕\t178567\n北美购房网\t178568\n硫代\t178569\n55mm\t178570\n一令\t178571\n决战者\t178572\n11.02\t178573\n王利民\t178574\n晚上六点\t178575\n插画展\t178576\n凌霜\t178577\n工具篇\t178578\n奇米影视777\t178579\n街坊邻居\t178580\nNos\t178581\n塞缪尔\t17
8582\nMercedes\t178583\ntwd\t178584\n绝地风暴\t178585\n反季\t178586\n移动通信网络\t178587\n陆之昂\t178588\n芙蓉国\t178589\naloud\t178590\ninp\t178591\n养生之道网\t178592\n一统三国\t178593\n行色匆匆\t178594\n大同新闻网\t178595\nVegetable\t178596\n信息技\t178597\n777模板777moban.com\t178598\n花螺\t178599\n条刷\t178600\n一路向东\t178601\n征伐\t178602\n卓达集团\t178603\ncapitalization\t178604\n茴香苗\t178605\n期货经纪公司\t178606\n神清气爽\t178607\n卡关\t178608\nSkyline\t178609\n尤甚\t178610\n古着\t178611\nCoremail\t178612\nmsfconsole\t178613\n天使兽\t178614\n业务能力\t178615\n100多套\t178616\n藏香\t178617\nFrp\t178618\n米家小白智能摄像机\t178619\ncouldn\t178620\n位置度\t178621\n王霸胆\t178622\nIBE\t178623\n音乐厅\t178624\n错报\t178625\n暗术\t178626\n闲心\t178627\n首旅酒店\t178628\n天空战记\t178629\n中铁碧桂园\t178630\n分力\t178631\n5.7\t178632\n822\t178633\n秦良玉\t178634\n高速路\t178635\n解谜\t178636\n每间\t178637\nmoan\t178638\n厦门大学艺术学院\t178639\n鼓励类\t178640\n转标\t178641\n七朵组合\t178642\n炸茄盒\t178643\n2015—2030年\t178644\n红木家具网_大涌红木_沙溪红木_红木\t178645\n体坛\t178646\n安存\t178647\n相会\t178648\n路人女主\t178649\n月会\t178650\n尚武\t178651\n应用材料公司\t178652\nFeature\t178653\n染红\t178654\n前页\t178655\nhp-ux\t178656\n植美村\t178657\n氘代氯\t178658\n源码\t178659\n万佛寺\t178660\n青春片\t178661\n49座\t178662\n接长\t178663\n保密性\t178664\n韩飞\t178665\n麦当劳汉堡\t178666\n牛鼻\t178667\n乒坛\t178668\n棉套\t178669\nsmartview\t178670\n铁山区\t178671\nviability\t178672\n霹雳天命之战祸邪神2\t178673\n马斯特\t178674\n杨公堤\t178675\n建业\t178676\n不出所料\t178677\n夏一文\t178678\n南红网\t178679\n邱志杰\t178680\n万般\t178681\n名名\t178682\n法律规范\t178683\n扑火\t178684\n818\t178685\n喋血街头2\t178686\n秦汉魏晋\t178687\n长沙图书馆\t178688\n胡润\t178689\n直至\t178690\n影音先锋AV\t178691\n红痕\t178692\n2012年3月\t178693\njumbo\t178694\n铜牙\t178695\n道尔顿净水器\t178696\n莱绅通灵\t178697\nSTUART\t178698\n凌克\t178699\n丰田酷路泽\t178700\n天猫男人节\t178701\nbeast\t178702\n贝兹\t178703\n包干\t178704\n集体经济组织\t178705\n海华\t178706\n把玩\t178707\n穿透力\t178708\n税库\t178709\n两个女人\t178710\neasyx\t178711\n超乳\t178712\n帆\t178713\ngcm\t178714\n开测\t178715\nTerre\t178716\n冰爽\t178717\n豆类\t178718\nlectures\t178719\n侧位\t178720\n外资公司\t178721\n地雷\t178722\n佳妻\t178723\n确率\t178724\n中国社会科学院近代史研
究所\t178725\n纱笼\t178726\nLt\t178727\nx+2\t178728\n15平方米\t178729\n胶囊型\t178730\n多时\t178731\ngnz48\t178732\n清真餐厅\t178733\n宝锋\t178734\n2.7G\t178735\n魍魉\t178736\nPeptide\t178737\n合能\t178738\njpkc\t178739\n平盘\t178740\n小腿\t178741\n7.2.0\t178742\n室内外\t178743\n10KM\t178744\n2732\t178745\n齐木楠雄\t178746\n可怖\t178747\n冻结行\t178748\n谷胱甘肽片\t178749\n打胎\t178750\n伍美珍\t178751\n磨粉\t178752\n钟杰\t178753\nTSA\t178754\n亲王\t178755\n4季度\t178756\nSketch\t178757\n饲养场\t178758\n质发电\t178759\n怪物猎人XX中文网\t178760\n轻匀\t178761\n换填\t178762\n新新魔塔2\t178763\n牛板金\t178764\n网易摄影\t178765\n瑞祥新区\t178766\n金酒\t178767\n艳事\t178768\ngeorges\t178769\n北京工商网通\t178770\n热米饭\t178771\n12句\t178772\nparent\t178773\n悬链\t178774\n速攻\t178775\n气垫鞋\t178776\nAD9\t178777\n再买\t178778\n会要\t178779\nGroovy\t178780\nh_\t178781\n前任\t178782\n王淑华\t178783\n三基三严\t178784\n20xx年\t178785\nRestart\t178786\nマジ軟\t178787\n入圣\t178788\nsgd\t178789\n晨晨\t178790\n宝马X1论坛_汽车之家论坛\t178791\nMerriam\t178792\n路口\t178793\n东山\t178794\n九生\t178795\n从古至今\t178796\n牛b\t178797\n锅\t178798\n茅箭新闻网\t178799\nBeep\t178800\n跃上\t178801\nT480\t178802\nrunfile\t178803\n317国道\t178804\n期望收益率\t178805\n学行\t178806\n射完\t178807\n房地产成本\t178808\n朝臣\t178809\n江西省公安厅交通管理局\t178810\n木村拓哉\t178811\ntoca\t178812\n糍粑\t178813\nsuica\t178814\n南方电网综合能源有限公司\t178815\ncnb\t178816\nfj酷路泽\t178817\n解放区\t178818\nTiO2\t178819\n盛龙广场\t178820\n凯迪拉克ATS\t178821\n静载试验\t178822\n齐齐哈尔市住房和城乡建设局\t178823\nkirk\t178824\n小米Note顶配版\t178825\nxai\t178826\nDecorative\t178827\n牧羊场\t178828\n种子\t178829\nbdd\t178830\n徐岩\t178831\n宜居性\t178832\n12345678910\t178833\n2017年5月22日\t178834\n除妖\t178835\noscillator\t178836\nmmr\t178837\n蛇鞭粉\t178838\n人王\t178839\n黑白色\t178840\nsnowflake\t178841\n执行性\t178842\nem算法\t178843\nSemiconductor\t178844\n美光科技\t178845\n孟子义\t178846\n叶光明\t178847\n协信\t178848\n内蒙省\t178849\n公议\t178850\n小港街道\t178851\n57000\t178852\n油葫芦\t178853\n镜台\t178854\n4300\t178855\n清明高速公路\t178856\nguards\t178857\n偶练\t178858\nArtifact\t178859\nVISVIM\t178860\nnews.680.com\t178861\n大渡\t178862\n棋界\t178863\n花美\t178864\n醋都网\t178865\n日照钢铁控股集团有限公司\t
178866\n2018年4月2号\t178867\n邮资片\t178868\nSQL*Plus\t178869\n屏\t178870\n上行\t178871\n东北农业大学\t178872\nJSONArray\t178873\nブランド\t178874\n杨博客\t178875\n建筑工人\t178876\n笔筒\t178877\n并列出\t178878\n丰年虾\t178879\n联想v110\t178880\n0.15\t178881\n小波基\t178882\n细集料\t178883\n嘉兴市第二医院\t178884\n猪笼\t178885\n苏宁联盟\t178886\n强计\t178887\n6.0.0\t178888\nwintogo\t178889\n中国工艺美术协会\t178890\n御兽\t178891\ncad病毒\t178892\n陌生人\t178893\nfacilities\t178894\n袁岚峰\t178895\n高加索犬\t178896\n小丁猫\t178897\n佛坪县\t178898\n我爱这土地\t178899\nGranite\t178900\n干海参\t178901\n南头街道\t178902\n飞禽\t178903\n殿后\t178904\n档板\t178905\nflex3\t178906\n硫糖铝\t178907\n向好\t178908\n沈雅琴\t178909\n绿色中国\t178910\n厦门晚报\t178911\n一触即发\t178912\n莲城\t178913\n重仓\t178914\n17.5%\t178915\n每四年\t178916\n感统失调\t178917\n整理盒\t178918\n夏东海\t178919\n甲卷\t178920\nskywalking\t178921\n公安部交通管理局\t178922\n802.1x认证\t178923\n修复器\t178924\n交通人\t178925\n进球\t178926\noppor11plus\t178927\n永乐国际\t178928\n三条线\t178929\n靳尚谊\t178930\n48级\t178931\n小炒花生米\t178932\nANIME\t178933\n米欧奇\t178934\n镭波\t178935\n古论\t178936\nE520\t178937\n老抽\t178938\n非处\t178939\n裙裙\t178940\n110KV\t178941\n用友政务\t178942\nSud\t178943\n分析性\t178944\n赠药\t178945\n毛泽东思想和中国特色社会主义理论体系概论\t178946\n汉中站\t178947\n泺口\t178948\n吲达帕胺\t178949\n易燃易爆\t178950\n性无能\t178951\n金陵图书馆\t178952\n脱卸\t178953\n高斯模糊\t178954\nwddns\t178955\n人工挖孔灌注桩\t178956\nDuring\t178957\n锦辉\t178958\n杨小凯\t178959\n软碟\t178960\n水形\t178961\n博瑞\t178962\n170平米\t178963\n苏州工业职业技术学院\t178964\nVissim\t178965\n塞罕坝林场\t178966\npluck\t178967\narriving\t178968\n雷马\t178969\nEgret\t178970\n4017\t178971\n宁滁\t178972\n聊天\t178973\n苏梅岛\t178974\n4.5号\t178975\n连接\t178976\n120.com\t178977\n访学\t178978\n新景\t178979\n多周目\t178980\n小白龟\t178981\n都督\t178982\n毕筱奇\t178983\n姚力军\t178984\n临平南站\t178985\n日货\t178986\n30%|\t178987\n扶手箱\t178988\nhighstock\t178989\n师兄\t178990\n韩非\t178991\n搅\t178992\n格点\t178993\n产调\t178994\n馨语\t178995\nPe\t178996\n年花\t178997\n太乙真人\t178998\n江西广播电视台\t178999\n獒\t179000\nanan\t179001\n储液\t179002\n苟且\t179003\n汇综发\t179004\n351国道\t179005\n粤东西北地区\t179006\nyingzi\t179007\n2#\t179008\n西藏发展\t179009
\n核苷\t179010\n衢\t179011\n太极扇\t179012\n勤能\t179013\n狂饮\t179014\n世青赛\t179015\n伯劳\t179016\n代通\t179017\njemeter\t179018\n2019年上半年\t179019\n木丝板\t179020\n呼叫\t179021\n蛋蛋\t179022\n开膛手\t179023\n滑雪者\t179024\n2018滴滴\t179025\n【电影社\t179026\n民国十八年\t179027\ngao\t179028\n千杯不醉\t179029\nDN20\t179030\n面垫\t179031\ninitrd\t179032\nGluon\t179033\n中国人民大学文学院\t179034\n佳能6580\t179035\nDICK\t179036\n师大附小\t179037\n根治率\t179038\nqsub\t179039\n你的笑容如繁花\t179040\n艺博会\t179041\n葫芦籽\t179042\n高埗\t179043\n沙丁鱼\t179044\n首府\t179045\n12.5亿\t179046\n莱卡\t179047\n守恒定律\t179048\n1500题\t179049\n武器类\t179050\n卫斯理蓝血人\t179051\n瓯\t179052\n生物质能\t179053\nJuan\t179054\n毅昌股份\t179055\n322号\t179056\n一万台\t179057\n贵州省安全生产监督管理局\t179058\n测井\t179059\n轻松\t179060\nT4\t179061\n袁祥仁\t179062\n莫桑石\t179063\n罗门\t179064\nshui\t179065\n西装裤\t179066\n玉米脱粒机\t179067\n56期\t179068\n聘任制\t179069\npipelines\t179070\n兴福镇\t179071\n房产公司\t179072\ncomputing\t179073\n常德乐居\t179074\n吃茶\t179075\n经典版\t179076\n王浩然\t179077\nVessel\t179078\n南京医科大学第一附属医院\t179079\n重合度\t179080\nProface\t179081\ncotta\t179082\nskilled\t179083\n七出\t179084\n细致\t179085\n记忆盒子\t179086\n上海书画出版社\t179087\ntransforms\t179088\n3点\t179089\n老红军\t179090\n辽阳石化\t179091\nMDB数据库\t179092\nElse\t179093\nWasted\t179094\nsling\t179095\n期货从业资格-233\t179096\n账务\t179097\n丹景山\t179098\n林雅诗\t179099\n多玩赛尔号\t179100\n瓷板\t179101\n笑破\t179102\n岩\t179103\n函授\t179104\n连云港市委\t179105\n高通量测序\t179106\n团宠\t179107\nalter\t179108\n浊流\t179109\n坤丹\t179110\n九州志\t179111\n田志强\t179112\n小米VR眼镜\t179113\n百度影音快播\t179114\n向京\t179115\n坐果率\t179116\n绝密级\t179117\n陪酒\t179118\nGoal\t179119\n应化\t179120\n敏感期\t179121\n金济东\t179122\n闲来\t179123\n退休处\t179124\n拉斯特\t179125\n农商行\t179126\n杰德大道\t179127\nARRAY\t179128\n#电影战神纪#\t179129\n好汉杯\t179130\nmatlab2015a\t179131\nVolvo\t179132\n颜永平\t179133\n黑蒜\t179134\n奇差\t179135\n中国自考网\t179136\n彩釉\t179137\n药家鑫\t179138\n港险\t179139\n厮杀\t179140\nCarl\t179141\n奶奶庙\t179142\n契丹人\t179143\n魔印\t179144\n格利特\t179145\n凯迈乐软件著作权\t179146\n黄海北道\t179147\n20170116\t179148\n小桌\t179149\n梅家坞\t179150\n课桥\t179151\n崩溃率\t179152\n胡兰\t179153
\n长沙职业技术学院\t179154\n浩\t179155\n山崩地裂\t179156\n黑鹰直升机\t179157\n安徽师范大学皖江学院\t179158\n舸\t179159\n技改\t179160\n围打\t179161\n戴军\t179162\n护送\t179163\n签证页\t179164\n饼干盒\t179165\n葵花药业集团\t179166\n福鼎白茶\t179167\n排板\t179168\nr390\t179169\n大春\t179170\n金桥镇\t179171\n北大图书馆\t179172\n太阳伞\t179173\n人工智能时代\t179174\n36只\t179175\n森提纳克斯\t179176\n小觑\t179177\n屿\t179178\n大昭寺\t179179\n马克叔\t179180\nqueryset\t179181\n晏城\t179182\n黑童子\t179183\njanella\t179184\n淘金币\t179185\nnomi\t179186\ne45\t179187\n乐华七子\t179188\n筹备工作\t179189\n第三者险\t179190\n紀\t179191\n105种\t179192\n干基\t179193\n三聚氰胺事件\t179194\n现实文\t179195\n龙男\t179196\n云转码\t179197\nProfibus\t179198\n徐连\t179199\n利服\t179200\n北斗七星\t179201\n汪直\t179202\n交汇处\t179203\n呼死你软件\t179204\n长沙装修公司\t179205\n上海市委宣传部\t179206\n主缆\t179207\n匪事\t179208\n6320\t179209\n张清芳\t179210\n套服\t179211\n吹雨打\t179212\n远距\t179213\nzhenjing\t179214\nRosy\t179215\n作妖\t179216\n800首\t179217\n增驾\t179218\n辛家庙\t179219\n座谈\t179220\n光绪皇帝\t179221\ninspire\t179222\n瞿麦\t179223\n弘光\t179224\n7216\t179225\nmxplayer\t179226\n20150108\t179227\n金湾区\t179228\np成\t179229\nUnit6\t179230\n品牌童装网\t179231\n造林\t179232\nChile\t179233\nDocumentary\t179234\n将夜吧_\t179235\n几十万\t179236\ntorrent\t179237\n萨达\t179238\nauthorities\t179239\n冰糖燕窝\t179240\n胀缝\t179241\n1018\t179242\n回转\t179243\n拉挤\t179244\n艾尚\t179245\n呕吐\t179246\n71岁\t179247\n奴隷\t179248\nPareto\t179249\ne555\t179250\n知柏地黄丸\t179251\n结构化分析\t179252\n考古队\t179253\nfilter函数\t179254\n墨痕\t179255\n离境税\t179256\n携号转网\t179257\nstool\t179258\n品观网\t179259\n14亿\t179260\n批判性\t179261\n蘑菇战争2\t179262\npartners\t179263\ntma\t179264\n复星\t179265\n地锅\t179266\n李志杰\t179267\n城事_新京报电子报\t179268\n关心妍\t179269\ncoreldraw12\t179270\n半神\t179271\n黄仁\t179272\n龙溪镇\t179273\n加收\t179274\n套包\t179275\n10B\t179276\n划船机\t179277\n东明县\t179278\n6-2\t179279\ndatatraveler\t179280\n無\t179281\n火印\t179282\n丹特\t179283\n后患\t179284\n六人行\t179285\n磷剂\t179286\n金来沅\t179287\nsunhaiyu\t179288\nAunt\t179289\n淘宝助理吧\t179290\ncc霜\t179291\nBET\t179292\n孝服\t179293\n十绝\t179294\n求模\t179295\nGoogSQL\t179296\n乌鲁木齐高新区\t179297\n桃木\t179
298\n博迈科\t179299\n郭字\t179300\n妮可\t179301\n锰钢\t179302\nmako\t179303\ngroupon\t179304\nKepler\t179305\n萨伊蓝\t179306\n血腥味\t179307\n隐写\t179308\n徐香\t179309\n两座\t179310\n枫叶教育\t179311\n1458\t179312\n世界屋脊\t179313\n礼域\t179314\ncpufreq\t179315\n四川科伦药业股份有限公司\t179316\n卡方分布\t179317\n极影字幕社\t179318\n滤油\t179319\n隆美尔\t179320\ne27\t179321\n难念\t179322\nwifi信号强度\t179323\n名人馆\t179324\n落果\t179325\n服装饰品\t179326\n瓜洲\t179327\n议会制\t179328\n初音未来\t179329\n巨野县\t179330\n李朝晖\t179331\n反扒\t179332\n鼎尖教案\t179333\n裂天\t179334\n高颜值\t179335\n枪爷\t179336\n聂元梓\t179337\n全非\t179338\n洛夫克拉夫特\t179339\n李杨\t179340\n徐丽\t179341\nshining\t179342\n龙格库塔\t179343\n下基层\t179344\n西区\t179345\n1:48\t179346\n每科\t179347\n动画化\t179348\n想和你唱\t179349\n龙游县\t179350\n张永生\t179351\n守护者们\t179352\nczm21\t179353\n益生金\t179354\n东方园林\t179355\n常熟市\t179356\n泰迪吧\t179357\n紫韵\t179358\nMink\t179359\ndeyu\t179360\nBandit\t179361\n年轻的母亲3\t179362\n历数\t179363\nm9\t179364\nmg\t179365\n十二金钗\t179366\n糖尿病频道_健客网\t179367\n日标\t179368\n马厂\t179369\n小婵\t179370\n神兵小将\t179371\n米黄\t179372\n歌喉\t179373\n史莱\t179374\n封基\t179375\n绿野仙踪\t179376\nPVC卡\t179377\n美女酒店\t179378\n顺序表\t179379\n围炉音乐会\t179380\n喀秋莎\t179381\n杨航\t179382\n吹吹风\t179383\n世者\t179384\n严选\t179385\n风雨飘摇\t179386\nqca\t179387\n缠绵悱恻\t179388\n中国民用航空华东地区管理局\t179389\n崇宁通宝\t179390\n暗龙\t179391\n六行\t179392\nblat\t179393\n粮农\t179394\n会理\t179395\n以点\t179396\n多轨\t179397\nparadise\t179398\n出拳\t179399\nannual\t179400\ncoeur\t179401\n辛普森\t179402\n广州市质量技术监督局\t179403\n亚太经济合作组织\t179404\n扣分\t179405\n柔情版\t179406\nBTA\t179407\n顺昌县\t179408\n岩壁\t179409\n五沟镇\t179410\n盐酸莫西沙星\t179411\nEri\t179412\nsd敢达\t179413\n处事\t179414\nkilled\t179415\n金碗\t179416\n540li\t179417\n在线新华字典\t179418\n房屋抵押贷款\t179419\n马眼\t179420\n阎鹤祥\t179421\n阿六头\t179422\n李狗\t179423\nmsaa\t179424\nDiffie\t179425\n殖民\t179426\n来福士广场\t179427\nBODY\t179428\n数字媒体专业\t179429\n1862年\t179430\n摸腿\t179431\n东方小说网\t179432\n严伯钧\t179433\n不洁\t179434\n乱象\t179435\nFandom\t179436\n宁城县\t179437\n沙盒类\t179438\n徐州东\t179439\n分部积分法\t179440\n新乡一中\t179441\n鸿合\t179442\n萧山日报\t179443\n輕\t179444\n上海12号线\t
179445\nrbp\t179446\n煤制油\t179447\n热血精灵王\t179448\n江淮风暴\t179449\n读秒\t179450\n廖化\t179451\n逗趣\t179452\n集英社\t179453\n窗帘\t179454\n导购招聘网\t179455\n挡鼠板\t179456\n2米高\t179457\n平安不动产\t179458\n专名\t179459\n山水S酒店\t179460\n奥库\t179461\n追风筝的人\t179462\n真由\t179463\nHearth\t179464\nV2.3.2\t179465\n钉子户\t179466\n亚博娱乐\t179467\n大江保卫战\t179468\nRepo\t179469\n真言\t179470\n灵身\t179471\nGeometry\t179472\n活羊\t179473\n合规化\t179474\n护衣\t179475\n潘神\t179476\nMonsta\t179477\n乌尤尼盐沼\t179478\n讯\t179479\n北京婚纱摄影\t179480\n瓦纳卡\t179481\ndrafts\t179482\n安徽省农村信用社\t179483\n江华县政府\t179484\nepidemic\t179485\n出纸\t179486\n音像制品\t179487\n2016年4月18日\t179488\nExporter\t179489\nmats\t179490\n2017年8月20日\t179491\n彩虹湾\t179492\n绿城百合花园\t179493\n新虹街道\t179494\n时语\t179495\n1250所\t179496\n黔南政府网\t179497\n邻苯二甲酸酐\t179498\n大鸨\t179499\niphone7吧\t179500\n神龙章轩\t179501\nifnull函数\t179502\n位错\t179503\n钉头\t179504\ndcr\t179505\n异国短毛猫\t179506\n青岛日报\t179507\n名差\t179508\n食品商务网\t179509\n胶囊咖啡\t179510\n深红色\t179511\n桃蛋\t179512\n遥感\t179513\n相助\t179514\n黄渡\t179515\n磐基\t179516\n调解书\t179517\n新浪股吧\t179518\n东方山水乐园\t179519\n四万多\t179520\n收缴\t179521\n沁源县\t179522\nduties\t179523\npaddy\t179524\ntranscend\t179525\n0218\t179526\nanchor\t179527\n万代mg\t179528\n男排\t179529\n湖南卫视跨年演唱会\t179530\nUniform\t179531\n2414\t179532\n交强\t179533\n双塔山\t179534\n玩捉迷藏\t179535\n俄语学习网\t179536\n小芸\t179537\n茶溪谷\t179538\n十六部\t179539\n电精\t179540\n魅族Pro7\t179541\nSammy\t179542\n1508\t179543\n锌业股份\t179544\n一批次\t179545\n4天\t179546\n癖\t179547\nHAM\t179548\n亿万豪宠\t179549\n雄性激素\t179550\n影卫\t179551\nComprehension\t179552\n蒸发仪\t179553\n隔离栅\t179554\n惹出\t179555\n泼洒\t179556\n龙井路\t179557\n2016.06\t179558\nWebElement\t179559\n潮机\t179560\n放电管\t179561\n孙剑\t179562\n重关\t179563\n台系\t179564\n心脾\t179565\n祝由术\t179566\n春装\t179567\n中森明菜\t179568\n福田戴姆勒\t179569\nnaire\t179570\n武当山\t179571\nASCII\t179572\n骤\t179573\n毛骗\t179574\n不规则\t179575\n公局\t179576\n重庆市委组织部\t179577\n无痕浏览模式\t179578\n天蚕变\t179579\n互联网+农业\t179580\n江南街道\t179581\n时科\t179582\n医科院肿瘤医院\t179583\n薏苡仁\t179584\n武汉未来科技城\t179585\n野餐\t179586\n盗版书\t179587\n9002\t
179588\n中国中医药网\t179589\n增容费\t179590\n1024位\t179591\n大阅兵\t179592\n安心得利\t179593\n道林·格雷\t179594\n佛山陶瓷网\t179595\n拍地\t179596\n5160\t179597\n绝地求生pc1.0\t179598\n亚科\t179599\n94万\t179600\n枣庄市薛城区\t179601\n缩至\t179602\n邓老师\t179603\n麦言\t179604\n上网卡\t179605\nmining\t179606\n氯\t179607\n性心理学\t179608\n共产党员纪律处分\t179609\n二合\t179610\n跑者\t179611\nmiu\t179612\n两江四湖\t179613\n高级人民法院\t179614\n锦绣中学\t179615\n化学工程与工艺\t179616\n茂名地区\t179617\n扳倒\t179618\n首办\t179619\n丙戊酸钠片\t179620\n中国科学院生态研究中心\t179621\n斑点\t179622\n横躺\t179623\n漱\t179624\nWEB-DL\t179625\nsql优化\t179626\n晴天\t179627\n460公里\t179628\ngoose\t179629\n防爆窗\t179630\n孟广禄\t179631\n景东\t179632\n第65\t179633\nmodao\t179634\n营业所\t179635\n神鼎\t179636\ninsertion\t179637\n佛乐\t179638\n东决\t179639\n卡卡资讯网\t179640\n泥怪\t179641\n氧菌\t179642\n豫商\t179643\n乐上财税网\t179644\n南工院\t179645\n译言\t179646\n宠兽\t179647\nQ迅\t179648\n林安安\t179649\n于君\t179650\n靠谱\t179651\n蚂蚁大宝卡\t179652\nrecylerview\t179653\n十六万\t179654\n1体\t179655\n神经根型颈椎病\t179656\n军子\t179657\n负一屏\t179658\n顺兴\t179659\n橙斧\t179660\nCAD软件\t179661\n四月八日\t179662\n厂规\t179663\n海伍德\t179664\n渤海大学\t179665\n24pin\t179666\n周觅\t179667\nwn\t179668\n火炬之光2吧\t179669\nIso\t179670\n百度贴吧\t179671\n卓越大厦\t179672\n德信\t179673\n别码\t179674\n美少男\t179675\n中航城国际社区\t179676\n20桌\t179677\n李女士\t179678\n露娜mini2\t179679\n吸声\t179680\n海狗丸\t179681\n3DM简体中文\t179682\n神华宁煤\t179683\njsdk\t179684\n包装工程\t179685\n陈翔六点半\t179686\n济宁医学院\t179687\n浮岛\t179688\n中洁\t179689\n花样爷爷\t179690\n25s\t179691\n中金公司\t179692\n祝福\t179693\n双桥\t179694\n材料科学\t179695\n大叫\t179696\n电缆\t179697\n达一中\t179698\n试吃\t179699\n护臂\t179700\n立案\t179701\nWince\t179702\n多媒体信息发布系统\t179703\n蒿子\t179704\n八式太极拳\t179705\n渡船\t179706\n18192324511\t179707\nKerberos\t179708\n本气\t179709\n绽妍\t179710\nGTX650\t179711\n杭州互联网法院\t179712\n茶碗\t179713\n湃\t179714\n纳米球\t179715\nym\t179716\n机炮\t179717\n七情六欲\t179718\nCGS\t179719\n安捷伦科技(中国)有限公司\t179720\n乔维安\t179721\nLED补光灯\t179722\nGross\t179723\n锦绣龙城\t179724\n31本\t179725\n徐帅\t179726\nAPI\t179727\n上海环球\t179728\ntrw\t179729\n星岛\t179730\n前次\t179731\n_太平洋下载中心\t179732\nrtl8168\
t179733\ntween\t179734\n陈白沙\t179735\n4y\t179736\ndivisions\t179737\n黄带\t179738\n400天\t179739\n机动队\t179740\n孤岛惊魂5储物箱\t179741\nCollective\t179742\n慷慨\t179743\n粘土\t179744\n最佳化\t179745\n色谱级\t179746\n国家级经济开发区\t179747\n心有灵犀\t179748\nxiren\t179749\n中油资本\t179750\nMetroLyrics\t179751\n基材\t179752\n中山大学电子与信息工程学院\t179753\n祸从口出\t179754\n异丁醛\t179755\nFinder\t179756\n犹豫不决\t179757\n摄人心魄\t179758\n威宝\t179759\n时景\t179760\npy2.7\t179761\n歐洲\t179762\n推销员\t179763\n诗曲\t179764\n黄宅镇\t179765\n67个\t179766\n党风廉政建设责任制\t179767\n陪唱\t179768\n土规\t179769\n讨要\t179770\n环保展\t179771\nモ\t179772\n数据元\t179773\n方山风景区\t179774\n合肥消防\t179775\nch340\t179776\n陈倩\t179777\n附魔宝珠\t179778\n高径\t179779\n一篇\t179780\n区块连\t179781\n名刹古寺\t179782\n国防部\t179783\n根据地\t179784\nFrequency\t179785\n学术规范\t179786\n至理\t179787\n1000万条\t179788\n词谱\t179789\n英朗xt\t179790\nNginx|Kangle\t179791\n16口\t179792\n4001\t179793\nios8吧\t179794\nconveyor\t179795\nHDD\t179796\n家画\t179797\n餐品\t179798\n铁三院\t179799\n氧传感器\t179800\n中本\t179801\n贾登峪\t179802\n玻璃贴\t179803\n裂锦\t179804\nfliter\t179805\n武藤兰\t179806\n松香树脂\t179807\n中心镇\t179808\n背膜\t179809\n维京传奇\t179810\n哈尔滨市政府\t179811\n开阳县人民政府\t179812\n网型\t179813\n杭天琪\t179814\n甲衣\t179815\n涞源\t179816\nxna\t179817\n南岭国家森林公园\t179818\n全都\t179819\n尼可\t179820\npidof\t179821\n华为p7\t179822\n好准\t179823\n掠天记\t179824\n民口\t179825\n滨江开发区\t179826\n金帆奖\t179827\n青藤\t179828\n69名\t179829\nB站up主\t179830\n尸毒_茅山术之捉鬼高手\t179831\n新途胜\t179832\n伏特\t179833\nLinda\t179834\n三城市\t179835\ndad\t179836\n上海市浦东新区\t179837\nIOP\t179838\n如面\t179839\n新区\t179840\n陆华\t179841\nsinx^2\t179842\n外插\t179843\nlegrand\t179844\n833集\t179845\n半球体\t179846\n环世界吧\t179847\n火工品\t179848\n赤芍\t179849\n当成\t179850\n腺样体\t179851\n父母爱情\t179852\nlevel-2\t179853\n雷克萨斯LC\t179854\n起诉方\t179855\n100%桑\t179856\n烟花\t179857\n明天你好\t179858\nhackintosh\t179859\n我在未来等你\t179860\n建华路\t179861\n歌版\t179862\n体坛周报\t179863\n散淡\t179864\n斗武大陆\t179865\nReservoir\t179866\n老腿\t179867\n模壳\t179868\n挪威人\t179869\n玄黄\t179870\n详解\t179871\n/f\t179872\ngzjd\t179873\n乔布\t179874\n尼多王\t179875\n伤亡率\t179876\n排骨年糕\
t179877\n文化创意产业园区\t179878\n叶锦添\t179879\n380亿\t179880\n小台灯\t179881\nVisas\t179882\n词云\t179883\nwuma\t179884\n收票\t179885\n庵野秀明\t179886\n银川市第一人民医院\t179887\nlnk\t179888\nASA\t179889\nca888\t179890\n顺峰\t179891\n长门有希\t179892\n同心路\t179893\n阿里个肯达\t179894\n排坐\t179895\n拉出血\t179896\n山西大同大学\t179897\n宝藏世界\t179898\n上海科技\t179899\n四折\t179900\n创课\t179901\n铁穹\t179902\n游刃有余\t179903\n石屑\t179904\n姊妹花\t179905\n副卡\t179906\n国燕委\t179907\n生产经\t179908\nhashcat\t179909\n小事\t179910\n英语学习网\t179911\n富美\t179912\n袜套\t179913\n外乡人\t179914\nVeins\t179915\n保定市\t179916\nSEPHORA\t179917\n6:00\t179918\n淘宝618\t179919\n标间\t179920\n干杯\t179921\n苏加诺\t179922\n洛杉矶华人资讯网\t179923\n反斗城\t179924\n激发\t179925\n浙江广厦\t179926\n变态反应\t179927\nkata\t179928\n文投\t179929\n大馋猫\t179930\n核苷酸\t179931\nsandwich\t179932\n见舞\t179933\n教育教学知识与能力\t179934\n木槿花西月锦绣\t179935\n中船集团\t179936\n洛江区\t179937\n2017-2023年\t179938\njavaScript\t179939\n把好\t179940\n|亿万\t179941\n第40期\t179942\ncl00\t179943\n微站\t179944\n海宁房产网\t179945\n釜山大学\t179946\nTESOL\t179947\n振动压路机\t179948\nstripe\t179949\n命运\t179950\n肾小管\t179951\n存在感\t179952\n宇宙飞船\t179953\n变压\t179954\n2018年1月28日\t179955\n林殊\t179956\n柠檬黄\t179957\n电影迷\t179958\n青椒炒鸡蛋\t179959\n婚\t179960\n免责\t179961\n海源机械\t179962\n行政人员\t179963\nAlfred\t179964\nERA\t179965\n庆春广场\t179966\n苏姿丰\t179967\n万股\t179968\n无可奉告\t179969\n西平\t179970\n84590143\t179971\n第七场\t179972\nCortex-M\t179973\n厌氧氨\t179974\n苹方\t179975\n保温炉\t179976\nAcid\t179977\n七连\t179978\n外野\t179979\n秒贷\t179980\n攻受\t179981\nlineage\t179982\n孟茜\t179983\n政治委员\t179984\n水霉病\t179985\n省得\t179986\n智家招商网\t179987\n北三县\t179988\n最后时光\t179989\n底特律:变人\t179990\n蒸汽清洗机\t179991\n陕西路\t179992\n7时\t179993\n镲片\t179994\n中央结算公司\t179995\nondemand\t179996\n_飞鹏网\t179997\n这两句\t179998\n长痘\t179999\n批语\t180000\n接线盒\t180001\n花唇\t180002\n水司\t180003\n皖智教育\t180004\n吉美\t180005\n热压罐\t180006\n南京市江宁医院\t180007\n铜川市人民医院\t180008\n极路由b70\t180009\n格鲁吉亚\t180010\n福州海峡\t180011\n凯信\t180012\n冰糖橙\t180013\n云测\t180014\n毛发\t180015\nOffice2007\t180016\n四川省卫计委\t180017\n王坤\t180018\n四川工商职业技术学院\t180019\n劳动合同证明书\t180020\n
云通\t180021\n电子银行\t180022\n控部\t180023\nLiDE\t180024\n半两\t180025\n寿春中学\t180026\nmicrokms\t180027\n阿萨辛\t180028\n渝昆高铁\t180029\n深圳税务局\t180030\n前60秒\t180031\nLQFP\t180032\nyeri\t180033\n全警\t180034\n惠聚\t180035\n深圳市锐明技术股份有限公司\t180036\nzwb\t180037\nA320\t180038\n大椎穴\t180039\n优劲\t180040\n星湖街\t180041\n1233\t180042\nucp600\t180043\n司长\t180044\n远程\t180045\n吕梁市\t180046\n小宠水族\t180047\nGTK\t180048\n氢化钠\t180049\n肉毒\t180050\n主力机\t180051\n何超琼\t180052\n第2局\t180053\n熱\t180054\n倒斗\t180055\nwowufeiyang-1\t180056\nasi\t180057\n柏木\t180058\n贡多拉\t180059\n幼交\t180060\nWebEnh\t180061\n澳大利亚昆士兰大学\t180062\nOABC\t180063\nSIMULIA\t180064\n6点钟\t180065\n赤峰\t180066\n地文\t180067\n指出\t180068\n三山新城\t180069\n函数形参\t180070\n收视\t180071\nSRT\t180072\n前滩海景壹号\t180073\n用友U9\t180074\n图\t180075\n苏里南\t180076\n82159870\t180077\nExpress\t180078\n知非\t180079\n大丰荷兰\t180080\n千寻Mugen\t180081\n金昌政府网\t180082\n嘌呤\t180083\nLanguage\t180084\nnova3\t180085\n梁相宜\t180086\n分布式电源\t180087\n尼莫\t180088\n魔祖\t180089\n埃德蒙\t180090\nplus2013\t180091\n硫酸纸\t180092\n点到\t180093\n正事\t180094\n2018年3月10日\t180095\nSeaTools\t180096\n无法触碰\t180097\n三规\t180098\n白种人\t180099\nwin7之家\t180100\n小抄写员\t180101\n金魔方\t180102\n说还休\t180103\n艾洛松\t180104\n环湖东路\t180105\n黄岛区\t180106\n2014-2016年\t180107\n安迷\t180108\nSFF\t180109\nball\t180110\n车关\t180111\nnum_rows\t180112\n套批\t180113\n陕西省发展和改革委员会\t180114\n20150508\t180115\n咏月\t180116\n献歌\t180117\n青曲社\t180118\n挥发性脂肪酸\t180119\n信息栏\t180120\n止付\t180121\n3缸\t180122\n中山大学第一附属医院\t180123\n次声\t180124\nUINa\t180125\n五年前\t180126\n宁梓\t180127\n葡京\t180128\nbroker\t180129\n魚魚\t180130\n螺丝孔\t180131\n香湖\t180132\n三肖\t180133\n海王集团\t180134\n衢州日报社\t180135\n晒干\t180136\n垛庄镇\t180137\nZI\t180138\n滑索\t180139\n滑膛\t180140\n银翔\t180141\n乳突\t180142\n坨坨\t180143\n洋务\t180144\n西湾\t180145\n13骇人\t180146\n特摄剧\t180147\ng90\t180148\n医疗商务网\t180149\nMFI\t180150\n江阴人才网\t180151\nPaperYY\t180152\n5356\t180153\n淮语\t180154\n恋晴\t180155\nallocating\t180156\n时间选择器\t180157\n疆良\t180158\n张明华\t180159\n未来网\t180160\n三木\t180161\nplank\t180162\n轮窑\t180163\nmpm04\t180164\n二一\t1801
65\naguncn\t180166\n中古史\t180167\n空囊\t180168\n米狄尔\t180169\nbam\t180170\n奥西里斯\t180171\n静态库\t180172\n泉州市永春县人民政府\t180173\n中威电子\t180174\nRAID\t180175\nRPA\t180176\n十七日游\t180177\n心跳文学社\t180178\n天龙武神诀\t180179\n重庆市委\t180180\n太阳运动\t180181\n五一北京\t180182\n通勤车\t180183\n成疾\t180184\n婪\t180185\n千岛湖镇\t180186\n过氧乙酸\t180187\nwindows_Windows系\t180188\n三连冠\t180189\n生草\t180190\n1289\t180191\n狗磊\t180192\n宋村\t180193\n幻想三国志4外传\t180194\n北汽汽车\t180195\n3d图\t180196\n双江县\t180197\n奥运钞\t180198\n火火兔\t180199\n肾囊肿\t180200\n结婚吧\t180201\n23小时\t180202\n孙小红\t180203\n04月02日\t180204\n田舎\t180205\n4代\t180206\n感时花溅泪\t180207\n万象天成\t180208\n庸俗化\t180209\n贝\t180210\n公孙止\t180211\n上海化工\t180212\nradhat\t180213\nei\t180214\narecord\t180215\n城与勇士\t180216\n朦\t180217\n白无常\t180218\n法制生活网\t180219\n辜负\t180220\n冷锅鱼\t180221\n弗雷德\t180222\n扩面\t180223\n广汽菲克\t180224\n乌托家家居网\t180225\nkaikai\t180226\n养老信息网\t180227\n天气预报查\t180228\n苏苏\t180229\n东来东往\t180230\n360118\t180231\n鬼泣5\t180232\n石症\t180233\nDAVID\t180234\n中国世界遗产\t180235\n大学物理学\t180236\nz3c\t180237\n245\t180238\n银河号\t180239\n公输盘\t180240\n全裸体\t180241\njavacsv\t180242\n防伪税控\t180243\n三国志13pk\t180244\nwindows8\t180245\n江南水务\t180246\n戴米安\t180247\n车迟国\t180248\n家号\t180249\n开路\t180250\n百门\t180251\n原带\t180252\n职业高中\t180253\n1791\t180254\n中华粮网\t180255\nThomson\t180256\n5}\t180257\n前海深港合作区\t180258\n乔尼\t180259\n若曦\t180260\n全分辨率\t180261\n周期股\t180262\n5ppt\t180263\nsonnyangel\t180264\n第三步\t180265\n50篇\t180266\n听书\t180267\n内卡\t180268\n贴心\t180269\n武家之殇\t180270\nFOO\t180271\nSemen\t180272\n育心\t180273\n塑料块\t180274\n战弩\t180275\n帅气和尚爱上我\t180276\n杂品\t180277\njruby\t180278\n极寒之城\t180279\n健康展\t180280\n顶角\t180281\n速宠\t180282\n谭盐盐\t180283\n爱在\t180284\n缤特\t180285\ncoron\t180286\n迅雷村\t180287\n四明山国家森林公园\t180288\n阳光保险公司\t180289\n郧西在线\t180290\n曲婉婷\t180291\n第二实验小学\t180292\nC++程序设计\t180293\n冯辉\t180294\n下崽\t180295\n郑建国\t180296\n崇文花园\t180297\n台北西门町\t180298\n狗尿\t180299\nYokogawa\t180300\n喷油\t180301\n阿尔瓦\t180302\n千万吨级\t180303\n免费三级片大全\t180304\n水迹\t180305\n642\t180306\n冰粉\t180307\n便衣\t180308\n小言\t180309\n变宝网\t18
0310\n公比\t180311\n汕头市人民政府\t180312\n九族\t180313\n雪尼尔\t180314\n读报\t180315\n昊天\t180316\n荒古魔\t180317\nsalaries\t180318\nJACOB\t180319\nLease\t180320\n美国大使馆签证中心\t180321\n1200m\t180322\n哗\t180323\n今年6月份\t180324\n仪容\t180325\n请辞\t180326\n显卡吧\t180327\n第十一位\t180328\nBoule\t180329\n锦户亮\t180330\n135张\t180331\n护指\t180332\n涤纶丝\t180333\n梁平法\t180334\n46度\t180335\n方恩\t180336\n何婷\t180337\n硅片\t180338\n捕鱼器\t180339\n云南省总工会\t180340\n山本美和子\t180341\nAthena\t180342\n针座\t180343\n4.0.3\t180344\n有途网\t180345\n重庆小区\t180346\ngrounding\t180347\n昆明市中级人民法院\t180348\n电力系统规划\t180349\n小猴\t180350\nrx1\t180351\nClocks\t180352\n淘金网\t180353\n乙醚\t180354\n金话筒奖\t180355\naddidas\t180356\n垂杨柳医院\t180357\n富士急\t180358\n仙游县\t180359\n中研院\t180360\n庖丁解\t180361\n此群\t180362\n羊栖菜\t180363\n灵魂摆渡3\t180364\n恋爱史_秀目网\t180365\n直立式\t180366\n决战江桥\t180367\n北史\t180368\nkmeans算法\t180369\n南方航空\t180370\n第一场雪\t180371\n后轴\t180372\n查兰\t180373\n防坠器\t180374\nGoog\t180375\n什麽\t180376\nvmare\t180377\n开方\t180378\n改装车\t180379\nMEID\t180380\nsplash\t180381\n天原杯\t180382\nBath\t180383\n4处\t180384\nmysql子\t180385\n快版\t180386\n国家税务总局\t180387\n实验二小\t180388\n以来\t180389\n三江学院\t180390\n中央新闻联播\t180391\nQQ华夏手游\t180392\n0106\t180393\ndress\t180394\n中国疾病预防控制中心\t180395\n百荣\t180396\n建新村\t180397\nlifespace\t180398\n白矿油\t180399\n采样机\t180400\n酷米网\t180401\n乐岛\t180402\n滨州市人民医院\t180403\n尚书省\t180404\n噬神者2\t180405\n武汉商学院\t180406\nNebula\t180407\n综合收益\t180408\n迪伦\t180409\n签注机\t180410\n玉皇庙\t180411\n腾讯视频缓存\t180412\nbeers\t180413\n交通集团\t180414\nSpecialty\t180415\n1.38\t180416\nSofina\t180417\n便签贴\t180418\n女杀手\t180419\n东方娱乐\t180420\n徐汇牙防所\t180421\n艺文志\t180422\n风车村\t180423\n途胜论坛_汽车之家论坛\t180424\n内胆\t180425\n琉璃塔\t180426\n南昌市财政局\t180427\ntreelist\t180428\n12360\t180429\n35平方米\t180430\n玩卡网\t180431\n彭水苗族土家族自治县\t180432\nFCB\t180433\n美价\t180434\n8周\t180435\n杉杉\t180436\nDFA\t180437\n保卫萝卜1\t180438\n刀塔传奇天空要塞\t180439\n新悦\t180440\nmedian\t180441\n好懂\t180442\n生命之旅\t180443\n3处\t180444\n22本\t180445\n彭飞\t180446\noneshot\t180447\nspaghetti\t180448\n文化街\t180449\n新垣绫濑\t180450\n中华人民共和国水利部\t180
451\n14倍\t180452\n接棒\t180453\n负反馈\t180454\n鸳鸯蝴蝶\t180455\n靓号\t180456\n宋军\t180457\nSRE\t180458\nHashimoto\t180459\n数据页\t180460\n京太\t180461\n六载\t180462\n启迈斯\t180463\n嘉园\t180464\n3373\t180465\nstride\t180466\n邢军\t180467\n侠盗飞车5\t180468\n展开图\t180469\n热熔胶\t180470\n拷红\t180471\ndelphi2007\t180472\nEPSON\t180473\n但斌\t180474\n饼子\t180475\n蜜拓蜜\t180476\n圣嘉\t180477\nathletic\t180478\nExcel表格日期格式转换大全\t180479\nnth-of-type\t180480\n天山天池\t180481\n铝镁锰\t180482\n撤销权\t180483\nbootstrapValidator\t180484\n上海电影节\t180485\n中山网\t180486\n胚芽米\t180487\n2kids\t180488\n四氟甲醚菊酯\t180489\nh110m\t180490\nbuke\t180491\n石阡新闻网\t180492\n阳泉\t180493\n国家市场监督管理总局\t180494\n256级\t180495\nVIRTUAL\t180496\n页岩砖\t180497\n开江\t180498\n神杖\t180499\ncertificates\t180500\n王昌龄\t180501\n墨然\t180502\n亚足联\t180503\n火元素\t180504\n弘善佛教网\t180505\n星冰乐\t180506\n谋权\t180507\n核聚\t180508\n马头墙\t180509\n南京师范大学生命科学学院\t180510\n傻白\t180511\n伸缩式\t180512\n极速版\t180513\n骚当\t180514\n反洗\t180515\n安师大\t180516\n八斗\t180517\n女夏\t180518\n博士人才网\t180519\n昂达\t180520\n热血江湖\t180521\n邵芸\t180522\n索比太阳能光伏网\t180523\n广岛之恋\t180524\n新闻观\t180525\n徐可\t180526\ncdiscount\t180527\n励展\t180528\n北京展览馆\t180529\n做单\t180530\n分析\t180531\n幼体\t180532\n豪门\t180533\n氨基吡啶\t180534\n王炸四子\t180535\n不寐\t180536\n同济大学浙江学院\t180537\n鼻涕\t180538\n20150617\t180539\n钟扬\t180540\n南省\t180541\nmimimi\t180542\n第3批\t180543\n58赶集\t180544\n电源盒\t180545\nrepresentative\t180546\n雨花台烈士陵园\t180547\n中国社会科学院历史研究所\t180548\n科技战\t180549\n约瑟\t180550\n广西交通职业技术学院\t180551\n暴雪娱乐公司\t180552\n巴塞罗那\t180553\n教育学部\t180554\nomnigraffle\t180555\n甲氧苄啶\t180556\nstrapless\t180557\n0.85\t180558\n汤姆林徽因\t180559\n怎么办\t180560\nparsing\t180561\n徐沟\t180562\n神像\t180563\n事业编制\t180564\n万堂书院\t180565\n办公桌\t180566\n装机版\t180567\nAthens\t180568\n74%\t180569\n变配电房\t180570\n妇联3\t180571\n杀\t180572\n永达\t180573\n不留痕\t180574\n《武汉理工大学》\t180575\nalexa\t180576\n家宴\t180577\n打票\t180578\n雪球网\t180579\n阿卡姆骑士\t180580\njomalone\t180581\n棕榈股份\t180582\n安防网\t180583\n蔡骏\t180584\n压力波\t180585\n白种\t180586\n膝伤\t180587\nyasm\t180588\n紧追\t180589\n瘦弱\t180590\n中华人民共和国民办教育促进法实
施条例\t180591\nbat批处理文件\t180592\n鲍斯股份\t180593\nPlist\t180594\n豪特\t180595\n诏令\t180596\n牵挂\t180597\n屠城\t180598\n盖房\t180599\n小蓝罐\t180600\n早死\t180601\n548\t180602\n朝桐光\t180603\n旺旺集团\t180604\n建化\t180605\nBetterZip\t180606\ndigger\t180607\nniang\t180608\n上海运动健身\t180609\n于文文\t180610\n英雄王\t180611\n童话\t180612\n中九\t180613\n湖北省人大常委会\t180614\n烧录\t180615\n北京市西城区人民法院\t180616\n推荐者\t180617\n锡崖沟\t180618\n排卵\t180619\n合伙型\t180620\namanlikethis\t180621\nK73\t180622\n党岭\t180623\n复旦二附中\t180624\n音乐性\t180625\n长距离\t180626\n联储证券\t180627\nmatlab2010a\t180628\nmetres\t180629\n就业补助金\t180630\n壹方中心\t180631\nasialee\t180632\n转到\t180633\nat_\t180634\n4ml\t180635\n阿斯汤\t180636\n风速仪\t180637\njpn\t180638\n中信银行股份有限公司\t180639\n别错过了\t180640\n皮筋\t180641\nOLED显示屏\t180642\n碧螺春茶\t180643\n山盟海誓\t180644\n失火\t180645\n红原大草原\t180646\n谷歌地图\t180647\n3455\t180648\n最强弹一弹\t180649\n起名\t180650\n匠托奇\t180651\n东风雷诺\t180652\n夸张句\t180653\n信本\t180654\n兰蔻菁\t180655\nCOMME\t180656\nasm\t180657\n廊坊市国土资源局\t180658\n擂鼓山\t180659\n壁柜\t180660\npico\t180661\n12320\t180662\nvue父组件\t180663\n三重门\t180664\n64x\t180665\nedusoho\t180666\n智天使\t180667\n1200000\t180668\n中国石化销售有限公司\t180669\n雪肤\t180670\n唐马儒\t180671\n26元\t180672\nARM处理器\t180673\nDNF地下城与勇士_多玩游戏网\t180674\n剑灵洪门\t180675\n骸\t180676\n可用性\t180677\n多囊性卵巢综合症\t180678\n500万\t180679\n阴露\t180680\n20度\t180681\n名流\t180682\n回龙\t180683\n我讨厌\t180684\n龟池\t180685\n辞退\t180686\n寻欢\t180687\n大脸\t180688\n剪切模量\t180689\n彩艺\t180690\n安德烈波切利\t180691\nP600\t180692\n淡樱\t180693\n鸡腿菇\t180694\n西安市工商行政管理局\t180695\n母其弥雅\t180696\n2018美元\t180697\n有文化\t180698\n女性主义\t180699\nWeibo\t180700\n相对面\t180701\n公寓\t180702\n从头开始\t180703\n老庄\t180704\nmurray\t180705\n福强路\t180706\n嫁接\t180707\n24.8\t180708\n64位\t180709\n20150510\t180710\n重力眩晕2\t180711\ninsprion\t180712\ntraceroute\t180713\nagg\t180714\n交通大厦\t180715\ntang\t180716\n仲裁法\t180717\n孟村\t180718\n4800元\t180719\n投资方\t180720\n指手画脚\t180721\n二元酸\t180722\n马丁路德金\t180723\n第1部\t180724\n长岭县\t180725\nMerchandising\t180726\n长江师范学院\t180727\n十夜\t180728\n野战部队\t180729\nHolland\t180730\n德乙\t180731\
n先锋影音在线\t180732\n严密性\t180733\n世界中心\t180734\n四川北路\t180735\n以撒的结合胎衣\t180736\n蜂窝网\t180737\nP值\t180738\n24架\t180739\n世界五百强\t180740\n詹天佑\t180741\nOxidation\t180742\n海绵城市\t180743\nchoose\t180744\n高志\t180745\n11.03\t180746\n新农合\t180747\n解构主义\t180748\n薄荷油\t180749\n东瓜\t180750\n魅蓝Note6\t180751\n华东地区\t180752\n无猜\t180753\n鸿道\t180754\nreid\t180755\n李珊\t180756\nkline\t180757\npostfix\t180758\nfotor\t180759\n中国移动通信集团公司\t180760\n丝洞\t180761\nasdasd\t180762\n199号段\t180763\n抗风柱\t180764\n双敏\t180765\nhdmi显示器\t180766\n二级人力资源管理师\t180767\n书柜\t180768\n红花檵木\t180769\ncfhpl\t180770\n难熬\t180771\n通俗\t180772\n天王山\t180773\n爱迪长安cs35\t180774\ntinyphp\t180775\nAnyCAD\t180776\n王卫星\t180777\ntimers\t180778\n推优\t180779\nVscode\t180780\npdca\t180781\n湖北华一寄宿学校\t180782\nIr\t180783\nplayable\t180784\nfree函数\t180785\nYeLin\t180786\nBAD\t180787\n窗台石\t180788\n竹蜂\t180789\n要塞\t180790\n中国铁路济南局集团有限公司\t180791\n桃树\t180792\nTB58\t180793\n31.3\t180794\nCtrip\t180795\n发痒\t180796\n天天有喜\t180797\n义愤\t180798\n敏妍\t180799\n2018年04月26日\t180800\n500miles\t180801\n中骏置业\t180802\nAmsterdam\t180803\n防水\t180804\nPICASSO\t180805\nPTR\t180806\n够住\t180807\nV1.0\t180808\n66.com\t180809\n替米沙坦\t180810\n乐康\t180811\n六十岁\t180812\nPowering\t180813\n系统集成项目管理师\t180814\n第5张\t180815\nsichuan\t180816\n金百合\t180817\n收获期\t180818\n见好就收\t180819\nzjc\t180820\ncushion\t180821\n轮渡时刻表\t180822\n平桥乡\t180823\n打战\t180824\nguides\t180825\n频谱仪\t180826\nf477\t180827\n王东\t180828\n渭南日报\t180829\n智云smooth\t180830\n云朵儿\t180831\n超级网站整站下载器\t180832\n1947\t180833\nMediated\t180834\n帕瓦特奥特曼\t180835\nsinopec\t180836\n洗浴会馆\t180837\nJosephine\t180838\n口若悬河\t180839\n47座\t180840\n侧锋\t180841\nFeminist\t180842\n家电维修网\t180843\n600010\t180844\n济南大观园\t180845\n转档案\t180846\n年代\t180847\n套利定价\t180848\n漠海\t180849\n扣量\t180850\n伟哥\t180851\njjwxc\t180852\nINSERT\t180853\n乡土中国\t180854\n遥控跳蛋\t180855\nsnoy\t180856\nSpA\t180857\n分包商\t180858\n市财政局\t180859\n应收账款保理\t180860\n2018-03-03\t180861\ncarbondata\t180862\n艾莉婕\t180863\n谭君铁\t180864\n妫河\t180865\n更加\t180866\n皮神\t180867\n1.9GB\t180868\n火车迷\t1808
69\n创世纪\t180870\n欧哲\t180871\n废气阀\t180872\nAv\t180873\n灌肠液\t180874\n职女\t180875\nwenda\t180876\n南洋镇\t180877\n创智天地\t180878\n效度分析\t180879\niOS10\t180880\nlgn\t180881\n待补\t180882\n毡房\t180883\n野团\t180884\n关税与贸易总协定\t180885\n郭华\t180886\n黑曜神\t180887\n云雀\t180888\nNovell\t180889\n心乐\t180890\ngSOAP\t180891\n覆盖面\t180892\nJ-LINK\t180893\n赶尸\t180894\n尊宝披萨\t180895\n南康区人民政府\t180896\n神羊\t180897\n转盘\t180898\n羽坛\t180899\n中国人民大学农业与农村发展学院\t180900\n针砭时弊\t180901\n悍夫\t180902\n药师经\t180903\n诺心蛋糕\t180904\n移动盒子\t180905\n阿依河\t180906\nFORTRAN\t180907\n飘流幻境\t180908\nlineageos\t180909\n20171008\t180910\n长沙市妇幼保健院\t180911\nDISM\t180912\ntolilong\t180913\n省卫生计生委\t180914\n侧风\t180915\n两便\t180916\n闸\t180917\n赛琳\t180918\n倾辛\t180919\n山东能源枣庄矿业\t180920\n蓝城区\t180921\n北京对外经贸大学\t180922\n杭州开发区\t180923\nAle\t180924\n乃们\t180925\n交银施罗德基金\t180926\nTunes\t180927\n德拉吉\t180928\niclound\t180929\n90km\t180930\n蒋万安\t180931\nArctime\t180932\n中科院深圳先进技术研究院\t180933\n农情\t180934\n第二盘\t180935\n工伤险\t180936\n急难先锋\t180937\n随性随缘\t180938\ntiao\t180939\n世茂国际广场\t180940\nXTS\t180941\n支付率\t180942\nHolly\t180943\n火流\t180944\n镀铝膜\t180945\n春梦\t180946\njiehun\t180947\n英睿达\t180948\n锁门\t180949\nCARD篇\t180950\n多伊尔\t180951\n燃起\t180952\n化探\t180953\n银行从业资格考试\t180954\nTAIL\t180955\n划拔\t180956\n竽\t180957\n政行\t180958\n2998\t180959\n专权\t180960\n霉菌阴道炎\t180961\nTata\t180962\n秽\t180963\nisotope\t180964\n上海旅游高等专科学校\t180965\n竖屏\t180966\ncamera360\t180967\nLazer\t180968\n0.5吨\t180969\n偶像大师灰姑娘\t180970\n黄金岛\t180971\n模制\t180972\n冷光灯\t180973\nExtrude\t180974\n騒麦\t180975\n穗\t180976\n狂牛\t180977\n飞了\t180978\n螺丝帽\t180979\n完结版\t180980\n汉森\t180981\n缇娜\t180982\n宝特\t180983\n高空作业\t180984\nfileinput\t180985\nmin2\t180986\n看清楚\t180987\n孟鲁司特\t180988\notherwise\t180989\n快乐的\t180990\n金鹰世界\t180991\n耗氧量\t180992\n腾讯安全应急响应中心\t180993\n彩铅\t180994\n甲维盐\t180995\n孕产妇\t180996\n人物简笔画大全\t180997\n前杠\t180998\n陈秋明\t180999\n158\t181000\n深奥\t181001\n风幕机\t181002\nSongs\t181003\n红隼\t181004\nUfficiale\t181005\nsteup\t181006\n摩斯密码\t181007\n轩辕剑4\t181008\nQ2\t181009\n太平洋电\t181010\nBlitz\t181011\n直
筒裙\t181012\nmises\t181013\n傲天\t181014\n安全垫\t181015\n死人经\t181016\n陈德铭\t181017\n票人\t181018\n汽贸城\t181019\n失业保险基金\t181020\n下陷\t181021\n彼得·德鲁克\t181022\n抗菌剂\t181023\n永旺\t181024\n3000部\t181025\n丰溪\t181026\n彩超机\t181027\n米拉乔沃维奇\t181028\nglock\t181029\n1892年\t181030\n福元\t181031\n党总支部\t181032\n奉天\t181033\n制纸\t181034\n肖某\t181035\n浙江国际海运职业技术学院\t181036\n山东大学化学与化工学院\t181037\n快跑者\t181038\n800斤\t181039\n0.2cm\t181040\n福田外国语学校\t181041\n牧马\t181042\n侠盗猎车手6\t181043\n群吕\t181044\nDefinitive\t181045\nmyether\t181046\n羞辱2\t181047\nGoodnight\t181048\nTomcat端口号\t181049\n溶剂\t181050\n5时\t181051\n裙舞\t181052\nLuaStudio\t181053\nNCBI\t181054\n占主导\t181055\n看报告\t181056\n设防\t181057\nswisse\t181058\n古三通\t181059\n欧洲冠军联赛\t181060\n取精\t181061\n小牛在线\t181062\n屯里\t181063\n区瑞强\t181064\n小情人\t181065\n500美元\t181066\n洪兴十三妹\t181067\n革委会\t181068\n11.3\t181069\nMKT\t181070\n焊锡\t181071\n职业类\t181072\n25部\t181073\n409号\t181074\n小說網\t181075\nrestful\t181076\n河南沁阳市委\t181077\n黄岛\t181078\n中国金融培训中心\t181079\n太湖仙岛\t181080\n指点杆\t181081\nCondor\t181082\n神州半岛\t181083\n养鹅\t181084\n木屑颗粒机\t181085\nlululu\t181086\nv9play\t181087\n上海天擎\t181088\n谈过恋爱\t181089\n14节\t181090\n河北省环保厅\t181091\n母钱\t181092\njpanel\t181093\n可不\t181094\nPCB库\t181095\n康亿\t181096\n双城区\t181097\nWebsite\t181098\ncollider\t181099\nwsbm\t181100\n德马吉\t181101\n净水\t181102\n念错\t181103\n雷克萨斯nx200\t181104\n卖家\t181105\nPerse\t181106\n怀中\t181107\n小犬\t181108\n可申\t181109\n高桥留美子\t181110\nvmware12\t181111\n塔吉克\t181112\n五年级下册语文书\t181113\n新绝代双骄2\t181114\nvr渲染器\t181115\n认亲\t181116\n陈粒\t181117\n兴学\t181118\n安卓7.0\t181119\n鼠曲草\t181120\n血浆蛋白结合率\t181121\n蛙眼\t181122\n城南大道\t181123\n骨文\t181124\n铸造机\t181125\n永德县人民政府\t181126\ngre\t181127\n焦糖布丁\t181128\nIntelli\t181129\n防盗\t181130\n亨格瑞\t181131\n6.6级\t181132\n普通中学\t181133\n92班\t181134\nxxxxxxx\t181135\n徐州市第一人民医院\t181136\nClearance\t181137\n陈俊\t181138\njiangmen\t181139\n2.0分\t181140\n摄影机\t181141\nmnt\t181142\n4针\t181143\ntermios\t181144\n2.9.3\t181145\nv2.2.7\t181146\n明年底\t181147\n过载\t181148\n明亮\t181149\ninputbox\t181150\n伪满\t181151\n梳妆镜\t18115
2\n立约\t181153\n北京申通快递\t181154\n易开\t181155\n旅长\t181156\n电信物联卡\t181157\n考工记\t181158\n葛力姆乔\t181159\n2018年4月13日\t181160\n金家\t181161\n沙河镇\t181162\n秋浦歌\t181163\n而外\t181164\n坑神newface\t181165\n稀释剂\t181166\n赫茨伯格\t181167\nvst\t181168\ndeclared\t181169\nHALO\t181170\nwifi天线\t181171\n2016年教师节\t181172\n公卿\t181173\n老婆婆\t181174\n双酚\t181175\n口技\t181176\n超银河传说\t181177\n彩虹六号\t181178\naluminum\t181179\nBrandy\t181180\n绽蔷薇\t181181\nRMBCrew热血街舞团\t181182\n正堂\t181183\n少儿篮球\t181184\n龙源电力\t181185\nE200L\t181186\nf534\t181187\ni博导\t181188\n1._\t181189\n千里之外\t181190\n45亿美元\t181191\n普拉斯\t181192\necho\t181193\n大明王朝\t181194\nt9\t181195\ncodeKK\t181196\n福昕风腾PDF\t181197\n陈锡文\t181198\n三河坝\t181199\n灰口铸铁\t181200\n阿拉尔市\t181201\nical\t181202\n专版\t181203\n模\t181204\n无人驾驶汽车\t181205\n久泰能源\t181206\n机油味\t181207\n移动处理器\t181208\n作家出版社\t181209\n万象新城\t181210\n新力东园\t181211\n自掘\t181212\nDascom\t181213\nb9\t181214\n海贼王手办\t181215\nFrameBuffer\t181216\ntamper\t181217\n鼠类\t181218\n繁星\t181219\n画圈\t181220\n湖南康达自愿戒毒中心\t181221\n夏影\t181222\n虚拟人生3\t181223\n18年4月份\t181224\n京A\t181225\nSICK\t181226\n4000年\t181227\n中油瑞飞\t181228\n智造\t181229\n谢礼\t181230\nxysjk\t181231\nBeneath\t181232\n我和我的23个奴隶\t181233\n正整数\t181234\nweare\t181235\n克罗克\t181236\nMild\t181237\n锥\t181238\n非法值\t181239\n竞界\t181240\nAngels\t181241\n刘志伟\t181242\nAccelerated\t181243\n蒂亚tia\t181244\n灿勋\t181245\n隆阳\t181246\nLancaster\t181247\n咏荷\t181248\n追述\t181249\nGIF动图\t181250\n廊坊市人民政府\t181251\n净率\t181252\n植物大战僵尸2摩登\t181253\nlegion\t181254\nWSD\t181255\n比亚迪汽车销售有限公司\t181256\n李公乐\t181257\n无法逃离\t181258\n中化集团\t181259\n致使\t181260\nChelsea\t181261\n蒙城政府\t181262\n0991\t181263\n虞舜人才网\t181264\n2k小说阅读网\t181265\n剪断机\t181266\n刷脸考勤机\t181267\n御银股份\t181268\n编校\t181269\n正山堂\t181270\n新元素\t181271\nfnc\t181272\n观音禅寺\t181273\n小酒店\t181274\n脾气虚\t181275\n台山下川岛\t181276\n一艘船\t181277\n小当家\t181278\n心善\t181279\n电影节\t181280\n瑞克\t181281\n杭州公交查询_杭州网\t181282\n家里\t181283\n20160229\t181284\n嘉闵\t181285\nhomme\t181286\n广州市质监局\t181287\nsan11\t181288\n快门声\t181289\n枸杞酒\t181290\n瞥见\t181291\nJNI\t181292\n御方
\t181293\n19小时\t181294\n汽油箱\t181295\nwon\t181296\n040期\t181297\ngeckodriver\t181298\n北九水\t181299\n常州市公安局\t181300\nDeloitte\t181301\n视杰\t181302\n开伞\t181303\nach\t181304\n寻仙手游\t181305\n纳尼亚\t181306\n消费者物价指数\t181307\nNude\t181308\n和平号\t181309\n苏曼\t181310\n佤山\t181311\n聚氧化乙烯\t181312\n莞式\t181313\n投票\t181314\n墨龟\t181315\n扫兴\t181316\n公差\t181317\n地狱犬\t181318\n千佛寺\t181319\n郑太力\t181320\n射向\t181321\n快\t181322\nproduction\t181323\ndeargentle-ChinaUnix\t181324\n仙5\t181325\n短短\t181326\n王者荣耀代练\t181327\n6321\t181328\n生物节律\t181329\n浴帘杆\t181330\nBarrett\t181331\ngroupadd\t181332\n智乃\t181333\n星宇股份\t181334\n梦战碧海旭梦\t181335\n北沟镇\t181336\n天地科技\t181337\n蜂蜜柠檬水\t181338\n金台路\t181339\n秦皇岛职业技术学院\t181340\nmarkers\t181341\n安置小区\t181342\n美国麻省理工学院\t181343\n垂线\t181344\n绝地求生卡盟\t181345\n充电盒\t181346\n南京浦\t181347\n元尊\t181348\n力宝\t181349\n南丁格尔\t181350\nLeonardo\t181351\n喷麦\t181352\n赵全营\t181353\n超变\t181354\n召南\t181355\n天銮\t181356\n乐高人仔\t181357\n港桥\t181358\n中国杂技团\t181359\n团康\t181360\n90010\t181361\nopenstreetmap\t181362\n开尔\t181363\n欢庆\t181364\n帝豪EV\t181365\n德鲁\t181366\n广东省检察院\t181367\n印力集团\t181368\n梦里花落\t181369\n岱宗\t181370\n隆基泰和控股\t181371\n女生篇\t181372\n超冒险小镇物语\t181373\n我在诛仙逍遥涧\t181374\n深情相拥\t181375\n千辆\t181376\n实芯\t181377\n机电传动控制\t181378\n丽江古城区\t181379\n北京剧院\t181380\nBrewery\t181381\nmeltwater\t181382\n不归零\t181383\n武侯新城\t181384\n对得起自己\t181385\nActivation\t181386\n鸡巴\t181387\n中草\t181388\n屄屄\t181389\n一用\t181390\n天若共和国之辉\t181391\n胶州路\t181392\n淄博信息港\t181393\n蒸气\t181394\n末世孤芳不自赏\t181395\n岔子\t181396\n肉蟹\t181397\n1577\t181398\n永近\t181399\n机刷\t181400\n布洛赫\t181401\n新闻出版局\t181402\nStaub\t181403\n感激不尽\t181404\nPS教程\t181405\n师夷\t181406\n罗塞塔\t181407\nEVGA\t181408\n陆谦\t181409\na5100\t181410\n下挫\t181411\n干挂铝塑板\t181412\nHostels\t181413\nx16\t181414\nConcurrentHash\t181415\nHDOJ\t181416\n天元集团\t181417\n～\t181418\n睛彩\t181419\n健康餐\t181420\n美国区\t181421\n华域汽车\t181422\n裁切机\t181423\n航空母\t181424\n无赖\t181425\n聚沉\t181426\n上海市发展和改革委员会\t181427\nsuger\t181428\n册封\t181429\n安知晓\t181430\n龙空\t181431\n聚合页\t181432\n到付款\t181433\n广西盛隆冶金有限公司\t18
1434\n高速公路管理局\t181435\nTeamcenter\t181436\n院所\t181437\n恢复\t181438\ntame\t181439\n江山百姓网\t181440\n阿黛尔\t181441\n沪青平公路\t181442\n熟人\t181443\n选送\t181444\n险资\t181445\n西安城西客运站\t181446\n黑龙江省森林工业总局\t181447\n瑞丰光电\t181448\n汉国\t181449\n相爱相亲\t181450\n3mod\t181451\nDRAMA\t181452\n调笑\t181453\n中等生\t181454\n过滤纸\t181455\n仕林\t181456\n美欣达\t181457\nzdic\t181458\n有理\t181459\n恶魔城暗影之王\t181460\nyida\t181461\nsetlocale\t181462\n2017.HDRip.720p.H264.AAC\t181463\n调集\t181464\n回台\t181465\n浦大喜\t181466\n小米MIX#\t181467\n游击战\t181468\n中邪\t181469\n虽然\t181470\n中望cad\t181471\n闲置品\t181472\n合同期\t181473\n傲剑凌云\t181474\njad\t181475\n回笼觉\t181476\n旅法师\t181477\n另一条路\t181478\n食道炎\t181479\n跳伞\t181480\nagnostic\t181481\n雄鱼\t181482\nPu\t181483\n樟木头\t181484\n1.27mm\t181485\n在海上\t181486\n邹正\t181487\n诸葛亮\t181488\n纽元\t181489\nstakeholder\t181490\n崔走召\t181491\ncontentprovider\t181492\n52万\t181493\n中国游戏公司\t181494\n股癣\t181495\nWHERE\t181496\n胡须\t181497\ntidy\t181498\n传开\t181499\n临床药理学\t181500\ncpan\t181501\n40多天\t181502\n三国杀手机版\t181503\n一夏\t181504\nNativeScript\t181505\n开训\t181506\n城管\t181507\nAppCode\t181508\n男变女\t181509\n孙祁祥\t181510\n资审\t181511\n雷德利·斯科特\t181512\n【八戒\t181513\n红米手机2\t181514\n飞行家\t181515\nsanguo\t181516\nBAT批处理\t181517\n篠田\t181518\nlincoln\t181519\n60秒后\t181520\n价盘\t181521\n收摊\t181522\n阿甲科技\t181523\n武汉光谷联合产权交易所\t181524\nlnc\t181525\n复机\t181526\ninv\t181527\n迷失者\t181528\n伊丽莎白·奥尔森\t181529\n串词网\t181530\n星女\t181531\nEncounters\t181532\nflea\t181533\n狗尾草\t181534\n顺德一中\t181535\n椰风寨\t181536\n疾病预防控制中心\t181537\n可立\t181538\n3d模型溜溜网\t181539\n人间中毒\t181540\n艺康\t181541\n取得成功\t181542\nkow\t181543\n加利\t181544\nExcel多列堆积柱形图\t181545\n【方\t181546\n即将来临\t181547\n叶朗\t181548\n珠宝匠\t181549\n并案\t181550\nX战警前传:金刚狼\t181551\n江心洲\t181552\n摇杯\t181553\n1065\t181554\n7.9元\t181555\n荆条\t181556\n回望\t181557\nCCTV-2_央视\t181558\n修根\t181559\n589\t181560\n流放之路2.5\t181561\n苏秦\t181562\n空腔\t181563\n报童\t181564\n没关系,是爱情啊\t181565\n思否\t181566\n家国情\t181567\n陵水黎族自治县\t181568\n树兰\t181569\n201507\t181570\n官方说法\t181571\n第一卷\t181572\n沙克\t181573\nTurner\t181574\
n裴秀智\t181575\n投篮机\t181576\n4051\t181577\n盘发器\t181578\n/trunk/data/module/0a7bea5dbe571d35def883cbec796437\t181579\n心爱的姑娘\t181580\n云鹊医\t181581\nSta\t181582\n美空网\t181583\n002387\t181584\n小泽玛莉亚\t181585\npainted\t181586\n马仁奇峰风景区\t181587\n16段\t181588\n大话西游3\t181589\ncoordination\t181590\n港丽\t181591\n上电\t181592\nbarcode4j\t181593\n昆明公司\t181594\n丘成桐中学\t181595\n中共天津市委\t181596\n白日门\t181597\n张洁\t181598\n排列3试机号\t181599\n神贴\t181600\n雷克萨斯IS\t181601\ntopshop\t181602\n手波\t181603\nWord2013/2010/2007/2003\t181604\n当风\t181605\ngt740\t181606\ndoll\t181607\n60&\t181608\n第7届\t181609\n美孚速霸\t181610\n张琰\t181611\n公祭\t181612\n没空\t181613\n洪\t181614\nAtomos\t181615\n精确度\t181616\nChampagne\t181617\n古北路\t181618\nKipling\t181619\n杨新华\t181620\n农户\t181621\n重神机\t181622\ncomman\t181623\n麦菜\t181624\nParody\t181625\n12.25\t181626\navast\t181627\nConfigurer\t181628\nAppscan\t181629\n石头村\t181630\n俞小凡\t181631\n液体皂\t181632\ne6430\t181633\n金屋修仙狂徒\t181634\n起潮落\t181635\n夹闭\t181636\n周某人\t181637\n6029\t181638\n小石潭\t181639\ncodexiu\t181640\n清江\t181641\n消灭\t181642\n新加坡站\t181643\n文新\t181644\n天与空\t181645\n三个星期\t181646\n钻墙\t181647\ne-cology\t181648\n300D\t181649\n嘉莉\t181650\n莎菲\t181651\n内布拉斯加州\t181652\n博盛\t181653\n暗黑血统1\t181654\ndelphixe\t181655\n头部\t181656\n37p\t181657\n铝电解\t181658\n西安医学高等专科学校\t181659\n冥婚霸宠\t181660\n语舞\t181661\n周彬\t181662\nGRADE\t181663\nleadership\t181664\n数据版\t181665\n兵道\t181666\nlibudev\t181667\n椰皇\t181668\n飞跃鞋\t181669\n鑫隆\t181670\n走肾不走\t181671\n万里挑一\t181672\n九只\t181673\n女魔\t181674\n单通\t181675\n免税购物\t181676\n散利\t181677\n66_\t181678\n机械师\t181679\n八角亭\t181680\n鬼步\t181681\n红叶子\t181682\n僵尸生活\t181683\nTKO\t181684\n金寨路\t181685\nanasys\t181686\n甩\t181687\n上海协和\t181688\n提职\t181689\n东篱\t181690\n刮痧板\t181691\n145个\t181692\n湖北气象\t181693\n张文龙\t181694\nexpenditure\t181695\nh5ai\t181696\n张秀丽\t181697\nsomi\t181698\n神准\t181699\nUk\t181700\nQQ技术\t181701\n国际长\t181702\n20160524\t181703\n客语\t181704\n确权证\t181705\n128集\t181706\n六节\t181707\n发家致富\t181708\n李峰\t181709\n自在\t181710\n荣昊\t181711\n定弘法师\t181712\n币制\t181713\n被害妄
想症\t181714\n16进制转换\t181715\n要否\t181716\n潍坊中学\t181717\n无人深空\t181718\npains\t181719\nRick\t181720\n快帆\t181721\n邪恶版\t181722\n倍量\t181723\n近郊\t181724\n威尔士\t181725\n副组长\t181726\n鞭\t181727\n公尺\t181728\n智牌\t181729\n教育\t181730\n紫薇星\t181731\n枸杞\t181732\n好戏网\t181733\n迅雷7.9\t181734\n沦亡\t181735\n绑架者\t181736\n15.10\t181737\n橱柜门\t181738\n2001太空漫游\t181739\n青洽会\t181740\n莫须有\t181741\n清爽版\t181742\n金纸\t181743\nwebhooks\t181744\nubentu\t181745\n神户制钢\t181746\nmorrison\t181747\n毛线\t181748\nold\t181749\n石桥街道\t181750\n靳羽西\t181751\n工作经历\t181752\n亚历山大鹦鹉\t181753\n海霸\t181754\n王保\t181755\n岁月号\t181756\nautomotive\t181757\n鬼域\t181758\nSBT\t181759\nsrgb\t181760\n森林公安局\t181761\n官湖村\t181762\n2017年终\t181763\n警衔\t181764\n放学后\t181765\n东方航空\t181766\n辽宁省知识产权局\t181767\nuled\t181768\n报上\t181769\n济淮\t181770\n滿\t181771\n第45届\t181772\nsetHeader\t181773\n巡洋舰\t181774\n前篇\t181775\n中国远洋海运集团有限公司\t181776\n交底\t181777\n国家电网\t181778\n教学部\t181779\n于\t181780\n中国人民大学公共管理学院\t181781\n两个日\t181782\n繁植\t181783\n董妃\t181784\n青萝\t181785\n大心\t181786\n夜血\t181787\n结核性胸膜炎\t181788\n昆政\t181789\n生辰八字起名网\t181790\n厢房\t181791\n沙钢\t181792\n第68届\t181793\nwin7系统文件\t181794\n静安嘉里中心\t181795\n田赋\t181796\n吉林体育学院\t181797\n中国海西网\t181798\nDiskgenius\t181799\n固若金汤\t181800\n红叶节\t181801\n吉姆·罗杰斯\t181802\nrq\t181803\nARMY\t181804\nxposed框架\t181805\n墅级\t181806\n正定机场\t181807\ndung\t181808\n情定大饭店\t181809\nCecilia\t181810\nhuarache\t181811\n与妻书\t181812\n甲苯磺酰氯\t181813\n角钢法兰\t181814\n门联\t181815\n劫持\t181816\npayer\t181817\n力智\t181818\nweb端\t181819\n专辑\t181820\n消防证\t181821\n郭兰\t181822\n韩国街\t181823\n启德\t181824\n北方重工\t181825\n石家庄幼儿园\t181826\n160kw\t181827\nv90\t181828\n兆龙\t181829\n贾君鹏\t181830\nリゾ\t181831\n品控\t181832\n平安新一贷\t181833\n9厘\t181834\n次中音\t181835\n游戏葡萄\t181836\n钌\t181837\nPgone\t181838\nFILE\t181839\n尹普美\t181840\n术科\t181841\n栅条\t181842\n4斤\t181843\n桥式起重机\t181844\n瓷都\t181845\nvivoX6\t181846\nconsider\t181847\n青团社\t181848\nzuanke\t181849\n二世祖\t181850\n晒课\t181851\nL358\t181852\n惜败\t181853\n次席\t181854\n小贸\t181855\nó\t181856\n多肽类\t181857\n慧聪网\t181858\n一条龙\t1818
59\nuuuu\t181860\naosp\t181861\n宠物猪\t181862\n货运员\t181863\np115b\t181864\n沐阳\t181865\ndays\t181866\n广元\t181867\n1711\t181868\nac+\t181869\n陆龟\t181870\n冷泉港\t181871\n出版史\t181872\n外音\t181873\n海口机场\t181874\n低烟\t181875\nJ6P\t181876\n泰兰特\t181877\n江西省博物馆\t181878\nPDW\t181879\nhttplib\t181880\n快乐的秘密室\t181881\n展线\t181882\n躺平\t181883\n粒体\t181884\nRay-Ban\t181885\nstdlib\t181886\n哪年\t181887\n王叶\t181888\n28.6\t181889\n木刻\t181890\n酷鸟\t181891\n自在如风\t181892\n双十协定\t181893\n小禽兽\t181894\ncamo\t181895\n甲状腺癌\t181896\n化学药\t181897\n鑫宝\t181898\n阿登战役\t181899\n万紫千红总是春\t181900\n咋个\t181901\n当表\t181902\n埋压\t181903\n筛子\t181904\nsdmt\t181905\nlo裙\t181906\nPLK\t181907\n川贝\t181908\n留学回国人员证明\t181909\n120亿美元\t181910\n宁夏卫视\t181911\n50本\t181912\n正西\t181913\n盐酸曲美他嗪片\t181914\n九库文学网\t181915\n第86届\t181916\ndeploying\t181917\n菊糖\t181918\nTCD\t181919\n牛筋底\t181920\n当众\t181921\n电催化\t181922\nmasked\t181923\n泰富\t181924\n法学家\t181925\n歌航\t181926\n三级箱\t181927\n679\t181928\n核力\t181929\n铜蓝蛋白\t181930\n贬义词\t181931\n黑镜\t181932\n17天\t181933\n噴\t181934\n资本利得税\t181935\nlancome\t181936\n彩霞\t181937\n临淄教育信息网\t181938\n风华中学\t181939\n宠妃\t181940\n9台\t181941\n轻云淡\t181942\n检疫员\t181943\n4411s\t181944\nhardest\t181945\n刀痕\t181946\n魔卡幻想\t181947\n3100元\t181948\n利己主义者\t181949\n一加手机6\t181950\n李宗霖\t181951\n崩落\t181952\n泉声\t181953\n主系\t181954\n大田镇\t181955\n股城\t181956\n李素罗\t181957\n显影液\t181958\n纱仓\t181959\n四川大学材料科学与工程学院\t181960\n海南航空股份有限公司\t181961\n成都海关\t181962\n超快排\t181963\nMa\t181964\n山东综艺频道\t181965\n溴乙烷\t181966\n酷车\t181967\n好多大\t181968\n盐川\t181969\n三胞胎\t181970\n背机\t181971\n张进\t181972\n魅族应用商店\t181973\negs\t181974\n飞虎之潜行极战全集\t181975\ndianping\t181976\n出口基地\t181977\n省市级\t181978\n监狱建筑师\t181979\n老友粉\t181980\n黄气\t181981\n椎间\t181982\n秋凉\t181983\n殷琦\t181984\n永不瞑目\t181985\n徐宿淮\t181986\nPROXY\t181987\n卢亮\t181988\n8205\t181989\n空气量\t181990\n雪伦\t181991\n断手\t181992\n蓓蓓\t181993\n精神病院\t181994\n王国\t181995\n700平\t181996\n盒式\t181997\n冬虫\t181998\n百鸡\t181999\n游学网\t182000\n珠海市国土资源局\t182001\n爱田\t182002\n姐们\t182003\n眼上\t182004\n卫龙\t182005\n物业管理师\t182006\n恒源煤电\t
182007\n保利阳光城\t182008\n电动喷雾器\t182009\nnpl\t182010\nyanyan\t182011\nWUSTL\t182012\nkatherine\t182013\n丙酸\t182014\n案例馆\t182015\n黑页\t182016\n13家\t182017\n卡农社区\t182018\n初等数论\t182019\n南京古林公园\t182020\n温都水城\t182021\n低碳贝贝\t182022\n菜宝\t182023\n国家教育部\t182024\ntudou\t182025\n50吨\t182026\n死疽\t182027\n河北省农村信用社联合社\t182028\n11月19日\t182029\n去空\t182030\n4399j小游戏大全_双人小游戏大全_4499小游戏\t182031\nmuke\t182032\n行名\t182033\n基布\t182034\n帯\t182035\nwd\t182036\nfrom\t182037\ndede5.7\t182038\n一实\t182039\n九门胡同\t182040\n斑霜\t182041\n290万\t182042\nTurbine\t182043\n寻甸回族彝族自治县\t182044\n城关小学\t182045\nsew减速机\t182046\n感到\t182047\n杨政\t182048\n宝马M5\t182049\n804路\t182050\nC/C++\t182051\n22支\t182052\n怀宁\t182053\n换衣\t182054\n飞爪\t182055\n從\t182056\n曹华\t182057\n美景之屋\t182058\n松柏生\t182059\n促进派\t182060\n有机化学\t182061\n救人\t182062\n随机数表\t182063\n鞍山路\t182064\n肤白貌美\t182065\n猎城\t182066\n第三十七届\t182067\n红银\t182068\nExcel2010\t182069\n6GB+64G\t182070\n第五十三条\t182071\nyouxi\t182072\n防城港北\t182073\nToB\t182074\n灰砂砖\t182075\n雅阁\t182076\n深刻\t182077\n大通V80\t182078\niphone8p\t182079\n论著\t182080\n艳舞\t182081\n乳喷\t182082\n贾汪区人民政府\t182083\n实词\t182084\nAndroid数据库\t182085\n将乐\t182086\n附加税\t182087\n珠光粉\t182088\n水席\t182089\n枫斗\t182090\n广东省网上办事大厅佛山分厅\t182091\n黑水公园\t182092\n气体压缩机\t182093\n大话西游免费版\t182094\n李伟东\t182095\n圣虚\t182096\n夏凉\t182097\n馕饼\t182098\n0.29\t182099\ngfx\t182100\nstatically\t182101\n逃命\t182102\n东吧\t182103\n修仙路\t182104\nmipay\t182105\n色人阁\t182106\n12.2.0.1\t182107\n封头\t182108\n中美贸易战\t182109\n朴海镇\t182110\n椰蓉\t182111\nZipArchive\t182112\n战翼\t182113\n评委们\t182114\n雅士银\t182115\n钱码\t182116\n4518\t182117\n石潭村\t182118\n证照\t182119\n省委书记\t182120\n茉\t182121\n18袋\t182122\n90度\t182123\n藿香正气滴丸\t182124\n乌拉特中旗\t182125\nPIP\t182126\nqc\t182127\n冤家不聚\t182128\n广西中医药大学第一附属医院\t182129\n鹤壁职业技术学院\t182130\n原词\t182131\n永登县\t182132\n龙仔\t182133\nE300L\t182134\n同心村\t182135\n期后\t182136\n新光圆成\t182137\n二者\t182138\n叠螺式\t182139\n5-1\t182140\n甜妞\t182141\n位图索引\t182142\n穿帮镜头\t182143\npref\t182144\nbootstap\t182145\n30分钟后\t182146\n郁风\t182147\n安控\t182148\n裤管\t18214
9\nrosy\t182150\n感激\t182151\n上海地铁12号线\t182152\ne7500\t182153\nsasha\t182154\n过山\t182155\n火险\t182156\n麻友圈\t182157\n华观路\t182158\nF2FS\t182159\n干干\t182160\n两会人大代表\t182161\n阳光融和医院\t182162\n超级中国\t182163\n粘扣\t182164\n画院\t182165\nsever\t182166\n很重要\t182167\n黄石路\t182168\n伊品\t182169\n50千米\t182170\nChromium浏览器\t182171\n胡侦探\t182172\n赵曙明\t182173\n仗剑走\t182174\n深圳科技\t182175\n三角恒等变换\t182176\n上海中环\t182177\n南湖湿地公园\t182178\n可减少\t182179\n触摸版\t182180\n宫本\t182181\n6芯\t182182\n纽伯瑞\t182183\nenso\t182184\n灵心\t182185\n切割者\t182186\n标签\t182187\n堕入\t182188\n新浪湖南新闻_新浪\t182189\ndebuff\t182190\n111家\t182191\n菱湖\t182192\n凄\t182193\n苍穹浩瀚\t182194\n潍坊科技学院\t182195\n3rd\t182196\n陈广\t182197\n济南长途汽车总站\t182198\n不过\t182199\n克拉拉\t182200\n第二十章\t182201\ntabpage\t182202\n宏华集团\t182203\nShower\t182204\njiaqi\t182205\n四人餐\t182206\nECS云服务器\t182207\n偷干\t182208\n唯物论\t182209\n鲁苏\t182210\n习近平关于严明党的纪律和规矩论述摘编\t182211\n一万一个\t182212\n2018.03.30\t182213\n上汽通用金融\t182214\n不可不去\t182215\n火拳艾斯\t182216\nAQUA\t182217\n523li\t182218\n不倦\t182219\n炮手\t182220\nwin2008r2\t182221\n一文秒\t182222\n并发连接数\t182223\n本证\t182224\nNX许可证\t182225\n百度语音\t182226\n中海和平之门\t182227\n重庆师范大学研究生院\t182228\nLEN\t182229\nSTORE\t182230\n大白鲸\t182231\n事牙\t182232\n广坤\t182233\n_v1.0.0_9ht\t182234\nGadget\t182235\n横琴口岸\t182236\n锥形瓶\t182237\n速读\t182238\nbbox\t182239\n卵泡\t182240\n面粉袋\t182241\n文艺界\t182242\n百度云HD720P\t182243\n军武次位面吧\t182244\n梁振伦\t182245\n21.5寸\t182246\n焊线\t182247\nYoyo\t182248\n梁祝\t182249\n性行为\t182250\nmabinogi\t182251\n龙感湖\t182252\n跌进\t182253\n凶灵\t182254\n泰牛\t182255\nseagull\t182256\n海达\t182257\n圆柏\t182258\nof_书面语\t182259\n大千世界\t182260\n淳\t182261\nWebM\t182262\n我和祖父的园子\t182263\n沙村\t182264\n关山度若飞\t182265\n半梦半醒\t182266\n夜会\t182267\n拨弦\t182268\n安平桥\t182269\n两事\t182270\n粉红色\t182271\n3268\t182272\n开罗君\t182273\n小喜\t182274\n太原理工大学现代科技学院\t182275\n晓薛\t182276\nPAVILION\t182277\n剧名\t182278\nunpack\t182279\n刺客列传\t182280\n简尚\t182281\n天才医仙\t182282\ncloak\t182283\n中国传媒大学出版社\t182284\n销售表\t182285\n民宿酒店\t182286\n宋钱\t182287\n成都大魔方演艺中心\t182288\n垂腰\t182289\nuint\t182290\n每
一套\t182291\n海角天涯\t182292\n80多岁\t182293\nAggregate\t182294\n淤青\t182295\n防盗网\t182296\n还君明珠\t182297\n隆鼻\t182298\n物管费\t182299\n钛棒\t182300\n泪囊炎\t182301\n20160415\t182302\n3月15号\t182303\n全币种卡\t182304\n废旧物\t182305\n顧客\t182306\n张家辉\t182307\n孙兴\t182308\n剧讯\t182309\n重中之重\t182310\n联东U谷\t182311\n九分裤\t182312\n分歧者\t182313\nplex\t182314\n7390\t182315\n雪中\t182316\n深圳分公司\t182317\n计件\t182318\n美里湖\t182319\n宝能系\t182320\nCCTV-4\t182321\n心景\t182322\nweb管理\t182323\n山穷水尽\t182324\nMinus\t182325\n红花逍遥片\t182326\n这道\t182327\n第一个人\t182328\n黄冈市公安局\t182329\n劳勃\t182330\n奇骏\t182331\n爱之火\t182332\nPayroll\t182333\n杂志铺\t182334\n轻烟\t182335\naorus\t182336\n雪镜\t182337\n丁丁图\t182338\n散\t182339\n孙武\t182340\nship\t182341\n下手为强\t182342\n绝命毒师吧\t182343\nEVAL\t182344\n北京教育学院\t182345\nlinnux\t182346\n老奇人\t182347\n手指头\t182348\n菱角\t182349\n长安欧尚欧诺\t182350\n君合|\t182351\ndll文件大全\t182352\n分宜\t182353\n滕飞\t182354\n拖链\t182355\n南华大学\t182356\n光域网\t182357\n50支\t182358\n青春痘\t182359\n负荷率\t182360\n60码\t182361\n躁动\t182362\n鱼际\t182363\n通风窗\t182364\nEducation\t182365\n纪宁\t182366\n吴志强\t182367\n硬叉\t182368\n黄明峰\t182369\n创出\t182370\n身死\t182371\n巴川中学\t182372\nSerato\t182373\n2018-04-18\t182374\n孟加拉塔卡\t182375\n李公麟\t182376\n20151231\t182377\n林肯贾乃亮\t182378\n娇术\t182379\n汨罗江社区\t182380\n脚蹼\t182381\n望城区\t182382\n鸡毛飞上天\t182383\n参会\t182384\n台子\t182385\nWAX\t182386\n长臂猿\t182387\nGuards\t182388\n清华控股\t182389\n主病\t182390\namm\t182391\nMistine\t182392\n搜狗问问\t182393\n授课\t182394\n肾透析\t182395\n孔肖吟\t182396\n撞人\t182397\n星晴\t182398\n热身运动\t182399\n风俗娘\t182400\n马车\t182401\n勇士令状\t182402\n四个半月\t182403\n索康尼\t182404\n耍宝\t182405\n喀喇昆仑公路\t182406\n掉\t182407\ncascade\t182408\n风骨\t182409\n超重\t182410\n免备案\t182411\n首支\t182412\n临沂市住房和城乡建设委员会\t182413\n升窗\t182414\n出浴\t182415\nLots\t182416\n50问\t182417\n分箱\t182418\n心血来潮\t182419\n绿山墙的安妮\t182420\nSHB\t182421\n大动脉\t182422\n铸源\t182423\n收资\t182424\n刘德武\t182425\n黑镜头\t182426\n风流哥\t182427\n飒\t182428\n20171116\t182429\n中国文明联盟\t182430\n闪光弹\t182431\n显示器\t182432\n1361\t182433\n双流中学\t182434\n入会\t182435\n鹿鞭膏\t182436\n双汇\t182437\n藏历\t1
82438\n13针\t182439\n4660\t182440\n融汇温泉城\t182441\nKeys\t182442\n火火火\t182443\n平安车辆保险\t182444\nP2P网贷\t182445\n房托\t182446\n黑加仑\t182447\n固执己见\t182448\n刘宁\t182449\n抗氧化\t182450\n逃离\t182451\n百适可\t182452\n金融级\t182453\nw\t182454\n4980\t182455\n秀爽\t182456\n六师附小\t182457\n李俊杰\t182458\npolyu\t182459\n台数\t182460\n华魂\t182461\ncavans\t182462\n劲嘉股份\t182463\nerp管理系统\t182464\nVectorworks\t182465\nYiven\t182466\n金翅鸟\t182467\n20171002\t182468\nF11\t182469\n招收\t182470\nverify\t182471\nmortar\t182472\n906\t182473\n臀位\t182474\n免胶\t182475\n肖洋\t182476\n番龙眼\t182477\n广州华南商贸职业学院\t182478\n微流控芯片\t182479\n徐总\t182480\n陈末\t182481\n苹果pro\t182482\nSolutions\t182483\n交通运\t182484\n296\t182485\n控申\t182486\n吴语\t182487\n酒店篇\t182488\n拓展型\t182489\n清水汪汪\t182490\n管理群\t182491\n大园\t182492\nparametric\t182493\n景子\t182494\n知名\t182495\n爱岗\t182496\n除外\t182497\n协约国\t182498\n董易奇\t182499\n婚否\t182500\n湖南省植物园\t182501\n专用版\t182502\n光华园\t182503\nBt\t182504\n夬\t182505\n上楼\t182506\n广西壮族自治区人力资源和社会保障厅\t182507\n兽幼\t182508\n痫\t182509\ngenymotion\t182510\n1mol\t182511\n20170909\t182512\n福州南\t182513\n茉莉\t182514\nhistorian\t182515\n一角钱\t182516\n193条\t182517\n逼逼\t182518\n敬天\t182519\n恐难\t182520\n网游之神级机械猎人\t182521\n坐式\t182522\nDuik\t182523\nmips\t182524\n贼王\t182525\n扭痧\t182526\nVue-组件嵌套之——父组件向子组件\t182527\n二手车置换\t182528\n东陵\t182529\n飞仙\t182530\npans\t182531\n水晶宫\t182532\n甜似火\t182533\n_捷配仪器仪表网\t182534\nWright\t182535\nmysqladmin\t182536\n盖里\t182537\n鹤田加奈\t182538\n这栋\t182539\n2016年11月30日\t182540\n益智堂\t182541\nlat\t182542\n巧克力酱\t182543\n喜马听书\t182544\n临床医师\t182545\n多钟\t182546\n碧桂\t182547\n硕转博\t182548\nRibbon\t182549\nSuqqu\t182550\n氢氧化\t182551\n遁甲\t182552\n工尺谱\t182553\n中海基金\t182554\n东城路\t182555\n林正乔丹\t182556\n黄漫\t182557\n华为maters\t182558\n39秒\t182559\n中共甘肃省委\t182560\n10月25日\t182561\n人相\t182562\n导热系数表\t182563\n蚌壳\t182564\n美女孩\t182565\n珞珈\t182566\n东山县\t182567\n龚建华\t182568\n送电\t182569\n石斛夜光丸\t182570\n中山市国土资源局\t182571\n电子音乐节\t182572\nW酒店\t182573\nMDR-1A\t182574\n不建\t182575\n中文标\t182576\n奥迪4s店\t182577\n长春高新区\t182578\n后子\t182579\n睁大眼睛\t182580\n90
0p\t182581\n3000吨\t182582\n一个2018\t182583\n封锁\t182584\n老鼠台\t182585\n钱志亮\t182586\nDELE\t182587\n微度\t182588\n消防泵\t182589\n国旻\t182590\n韩泰轮胎\t182591\n姚晨\t182592\nw5\t182593\nclarks\t182594\nOPEC\t182595\n冷酷无情\t182596\n车声\t182597\n苑\t182598\n荷美尔\t182599\nim04\t182600\n日游\t182601\n让渡\t182602\n小岗\t182603\n美国高通公司\t182604\navantage\t182605\n装入\t182606\n润发乳\t182607\nFeCl2\t182608\n知之为\t182609\nAdMaster\t182610\n湛江市委\t182611\n穆丝\t182612\nfengjian\t182613\n公馆\t182614\n各自安好\t182615\naiesec\t182616\n碘化氢\t182617\n4季\t182618\nPelikan\t182619\n资阁\t182620\n全恐龙\t182621\n中国光学学会\t182622\nTravels\t182623\n安全隐患\t182624\n衢山镇\t182625\n易博\t182626\nkeepalive\t182627\n依山\t182628\n佑天兰果冻\t182629\n看房团\t182630\n咳特灵胶囊\t182631\n0018\t182632\n小野二郎\t182633\nSJL\t182634\nvedios\t182635\ndm单\t182636\najax回调函数\t182637\n泰山火车站\t182638\n殷悦\t182639\n辛拉面\t182640\n赵晓东\t182641\n国土资源网\t182642\n普陀山景区\t182643\n鬼街\t182644\nnmf\t182645\n中南大学湘雅医学院\t182646\nHDT\t182647\n孤影\t182648\n庄长兴\t182649\n一班人\t182650\n惟\t182651\nAnneHan\t182652\n陈近南\t182653\n刘华\t182654\n椿象\t182655\n跳窗\t182656\n裴佳欣\t182657\n伽马射线\t182658\n凡人修仙之仙界篇\t182659\n1.2M\t182660\n月潭水库\t182661\n郎在线视频\t182662\nQiagen\t182663\n神驰\t182664\n肌肉科技\t182665\n三峡工程\t182666\n常州一院\t182667\n京建发\t182668\n堕胎\t182669\n内特罗宾逊\t182670\n2017年7月27日\t182671\n窦婴\t182672\n儿儿\t182673\n水保\t182674\n130分\t182675\n&#30340\t182676\n价值投资\t182677\nwraps\t182678\n宁红\t182679\n正逐步\t182680\n色号\t182681\n翻拍版\t182682\n小祖宗\t182683\nLatex\t182684\n26公里\t182685\n上位史\t182686\nv7.2\t182687\n8000多\t182688\n_庭院/花园装修|一起网\t182689\n天格\t182690\n李肇星\t182691\n到户\t182692\n泽雅镇\t182693\n知网重复率\t182694\n混世\t182695\nyann\t182696\nHyun\t182697\n姐妹兄弟\t182698\n庆幸\t182699\n漫威迷\t182700\n移动4G\t182701\n闫氏\t182702\n始发地\t182703\n乌鲁木齐物流公司\t182704\n俄媒\t182705\n耳温枪\t182706\n96|\t182707\ntransition\t182708\n偶\t182709\n中南君悦府\t182710\n高强度钢\t182711\nchris\t182712\n哭哭\t182713\ntermius\t182714\n石膏粉\t182715\n钴铬合金\t182716\n金蝶智慧记\t182717\n职业病\t182718\nmathematic\t182719\nradial\t182720\n方总\t182721\n九龙杯\t182722\n孙艺琪\t182723\n礼仪篇\t182724
\n军礼\t182725\n状态变量\t182726\n重庆旅行社\t182727\n毛主席纪念堂\t182728\n济南市地方税务局\t182729\n心烦意乱\t182730\n黄土地区\t182731\n地效\t182732\n铲子\t182733\n剧耀东方\t182734\n爱不完\t182735\n石头剪刀布\t182736\n能得\t182737\n襄阳站\t182738\nQuasi\t182739\n气候变化\t182740\n济源市教育局\t182741\n猩红\t182742\n4架\t182743\n惊天动地\t182744\n广州地铁\t182745\n20160610\t182746\n大公国际资信评估有限公司\t182747\nms17-010\t182748\n小额诉讼\t182749\n非法经营罪\t182750\n打卡\t182751\n扬马\t182752\nibeacon\t182753\nrainsoul\t182754\n哈森\t182755\n手工费\t182756\n激荡\t182757\n江畔\t182758\nHitman\t182759\n四射\t182760\njustinbieber\t182761\n清肺\t182762\n环保人\t182763\n三岔河镇\t182764\n程凯\t182765\n华蓥市\t182766\n知错\t182767\n西迁\t182768\n5dm\t182769\n龙门大道\t182770\n8C\t182771\n技术调查\t182772\nKD指标\t182773\nWin7控制面板\t182774\nJENKINS\t182775\n鄂州大学\t182776\n加涅\t182777\n十二届\t182778\n附加费\t182779\n掌权者\t182780\n机甲师\t182781\n条件性\t182782\n书类\t182783\n琅琊榜\t182784\n障碍性\t182785\nv6.3.0\t182786\n夫妻性\t182787\n中建科技\t182788\n权力\t182789\n光密\t182790\n查询页\t182791\n华夏人寿保险股份有限公司\t182792\n小校\t182793\n第二人民医院\t182794\ncf2018\t182795\n乙酉日\t182796\n地理信息系统\t182797\nDEEPIN\t182798\n阿鲁卡\t182799\n武威职业学院\t182800\n华侨城欢乐谷\t182801\n2561\t182802\n开瑞开瑞K602018款\t182803\n中共三大\t182804\n讲述\t182805\n得数\t182806\n华人文化\t182807\n修饰语\t182808\n新明\t182809\n铝塑复合板\t182810\n2017年双十一\t182811\n省籍\t182812\n崇拜\t182813\n福建省泉州市\t182814\n星际争霸1.08\t182815\n贵州民宗委\t182816\n6周\t182817\n进口版\t182818\n000513\t182819\n败酱草\t182820\nsw2014\t182821\n遣送\t182822\n插图\t182823\n点不接地\t182824\n绘恋\t182825\n涂尔干\t182826\n滑槽\t182827\n玻璃夹\t182828\n斩妖\t182829\n相对值\t182830\npsycho\t182831\n经典算法\t182832\n讲座稿\t182833\nsunshine组合\t182834\n近7年\t182835\nTrick\t182836\n江左\t182837\n南京市经济和信息化委员会\t182838\n喜气\t182839\n希恩\t182840\nexams\t182841\n雕花\t182842\n乙基纤维素\t182843\n12.38万元\t182844\n打碟机\t182845\n1.55\t182846\n噩梦\t182847\n31个\t182848\n伊邪那美\t182849\n代理权\t182850\n街头霸王4\t182851\nExcess\t182852\n存志\t182853\n大丁丁\t182854\nlyra\t182855\n宣化街\t182856\n一千米\t182857\n黑ba\t182858\n花木兰2\t182859\n高于\t182860\n审图公司\t182861\n金斗\t182862\n三圣花乡\t182863\n2017年3月6日\t182864\n公证费\t182865\n崔波\t182866\
n1.7%\t182867\n拓本\t182868\n炖鸡汤\t182869\nBD/HD高清\t182870\nstyler\t182871\nSteal\t182872\n氖气\t182873\nobservable\t182874\n游久网\t182875\n西游2\t182876\n平抛运动\t182877\n七仔\t182878\nPandaTV\t182879\n7.7分\t182880\n正级\t182881\n卡拉克沙漠\t182882\n荷兰大学\t182883\n省国资委\t182884\n大连机床集团\t182885\n四川代表团\t182886\n特别呈现\t182887\n自拍性\t182888\nlogd\t182889\n发球机\t182890\n镇妖塔\t182891\n腮腺\t182892\n麒麟正传\t182893\n优立通\t182894\n链家\t182895\n显卡催化剂\t182896\nopens\t182897\n邢其毅\t182898\n美途\t182899\n博士帽\t182900\ninspect\t182901\n平安保险公司\t182902\n假相\t182903\n椰岛\t182904\n生境\t182905\n张寨镇\t182906\n古意\t182907\n工欲善其事必先利其器\t182908\n斗兽争霸\t182909\nFullCalendar\t182910\n泽民\t182911\n饭饭\t182912\n射电\t182913\n灭尽龙\t182914\n巴尔\t182915\nTS3\t182916\n9型\t182917\nxprivacy\t182918\n苏州论坛\t182919\nShader\t182920\n克拉克森\t182921\n盲杖\t182922\nMilky\t182923\n南史\t182924\n杭州市中心\t182925\n蓝田\t182926\n大笨钟\t182927\n孙耀琦\t182928\n西宸\t182929\n国泰百货\t182930\nande\t182931\n锁扣\t182932\n产女\t182933\n鲁比\t182934\n悟空传\t182935\n第一次接触\t182936\n悍马H2\t182937\n艾斯德斯\t182938\n雅培菁智\t182939\n职业装\t182940\n高古玉\t182941\n中国医疗人才网\t182942\n2000.10\t182943\nimposed\t182944\nlaundry\t182945\npolling\t182946\n喷洒车\t182947\n申旭\t182948\nReservation\t182949\n坎德拉\t182950\n罗马花园\t182951\n莱山区\t182952\nsynchronous\t182953\n80种\t182954\n危急关头\t182955\n.vue\t182956\n花魂\t182957\n光耦合器\t182958\n失宠\t182959\n开幕式\t182960\n后悔莫及\t182961\n核航母\t182962\n伞状\t182963\n基站\t182964\nfullCalendar\t182965\n大图\t182966\nvoor\t182967\n通胀预期\t182968\n维多利亚2吧_\t182969\n断气\t182970\n圣牧\t182971\n回响\t182972\n休宁\t182973\nFluffy\t182974\n检检\t182975\nCEMU\t182976\n5.29\t182977\n傻乎乎\t182978\n达美航空\t182979\n未央路\t182980\n亚度\t182981\n精益管\t182982\n联想小新潮7000-13\t182983\n牙片\t182984\n福建省建设厅\t182985\n于勒\t182986\n中交二航局\t182987\nbites\t182988\nredtube\t182989\nfac\t182990\n大思\t182991\n小升初网\t182992\n中国商务新闻网\t182993\n傲剑狂刀\t182994\n面积分\t182995\n三峡升船机\t182996\n四天前\t182997\n丽都花园\t182998\n千灯古镇\t182999\n未婚男\t183000\n秀发\t183001\n环境科学与工程学院\t183002\n徐鸣\t183003\n更方便\t183004\nNOW直播\t183005\n法强\t183006\ncfa\t183007\ntuned\t183008\n被处\t1830
09\nsunk\t183010\n2570p\t183011\n子婴\t183012\nPo\t183013\n王立民\t183014\n有光\t183015\nopenload\t183016\n固定资产残值率\t183017\njquery单选框radio\t183018\n毕\t183019\nnuendo\t183020\n63年\t183021\n一多秀\t183022\n浙江在线-浙商网\t183023\nolympus\t183024\nAL\t183025\n阳光教育\t183026\n自动生\t183027\nstansmith\t183028\nBovine\t183029\n609路\t183030\n中国电力投资集团公司\t183031\n下水管\t183032\n拨出\t183033\nlancet\t183034\n烧鸟\t183035\n剃头刀\t183036\n轻生\t183037\n汪俊林\t183038\n选矿\t183039\necosystem\t183040\nxkcd\t183041\n小学生作文-无忧考网\t183042\n濒临灭绝\t183043\n正盛\t183044\n王的盛宴\t183045\nITPSC\t183046\n百度排名优化公司\t183047\nadmissions\t183048\n俊凯\t183049\n联想小新锐7000\t183050\nqq代理服务器\t183051\n奇形怪状\t183052\nframer\t183053\nHalf\t183054\n华夏二手车网\t183055\n小欧\t183056\n无绳吸尘器\t183057\n第一双\t183058\n二硝基苯酚\t183059\n误认\t183060\nz^2\t183061\n128种\t183062\n黄刀镇\t183063\n郑罗茜\t183064\n1200度\t183065\n矛与盾\t183066\n克拉玛依政府网\t183067\n家地板\t183068\n周星弛\t183069\n苏伊士\t183070\n少儿舞\t183071\n铜都\t183072\n德安县人民政府\t183073\n经开\t183074\nEnable\t183075\n107期\t183076\nady\t183077\n塞力斯\t183078\n好厉害\t183079\n王明军\t183080\n大板\t183081\n猎犬\t183082\n撒播\t183083\n葡萄糖酸钙口服溶液\t183084\n明远\t183085\n西安康辉旅行社\t183086\n有血有肉\t183087\n数下\t183088\n管理性\t183089\nITU\t183090\n差异\t183091\n2.1%\t183092\nT700\t183093\nbjj\t183094\n履约保证保险\t183095\n外训\t183096\n酱包\t183097\nskyline\t183098\n归去来兮辞\t183099\npam\t183100\n青年城\t183101\n安徽省人力资源和社会保障厅\t183102\n钢市\t183103\nCHATEAU\t183104\nnetframe\t183105\n肚肚\t183106\n音讯\t183107\n正月十六\t183108\n维生素B12\t183109\n急速\t183110\n光大期货\t183111\nappears\t183112\n8克\t183113\n人人乐\t183114\nlesion\t183115\n激光焊接机\t183116\n白山镇\t183117\n金圆\t183118\n管理处\t183119\n沪教版六年级语文\t183120\n杨丽萍\t183121\n雷者\t183122\n空中营救\t183123\n低铁\t183124\n气器\t183125\n零杆\t183126\n东城广场\t183127\n体病\t183128\n撕边\t183129\nVerilogHDL\t183130\n18751390275\t183131\n开封文化艺术职业学院\t183132\nmicrosof\t183133\n星湾\t183134\n20几天\t183135\nhzau\t183136\n10片\t183137\n科技类\t183138\n100款\t183139\n圣劳伦斯河\t183140\n德翔\t183141\n枕木\t183142\n内网穿透\t183143\n腰痛宁胶囊\t183144\n方案\t183145\n建军节\t183146\n行船\t183147\n谭某某\t183148\n可谓\t183149
\n转款\t183150\n计税法\t183151\n同屋\t183152\n曼彻斯特城\t183153\n波琳\t183154\n注释性\t183155\n西藏北路\t183156\n趴地\t183157\n7535\t183158\n可溶性膳食纤维\t183159\n腰间盘\t183160\nwatching\t183161\n黄西\t183162\n大众汽车有限公司\t183163\nfuel\t183164\n展示页\t183165\nG+\t183166\n无锡景\t183167\n温州火车南站\t183168\n口舌生疮\t183169\nHave\t183170\n众泰t500\t183171\n无耻之徒\t183172\n米芝莲\t183173\n婚迷\t183174\n连轴器\t183175\n银联商务有限公司\t183176\n我型我秀\t183177\n山参\t183178\n新安江路\t183179\n2话\t183180\n潢川论坛\t183181\n44\t183182\n104号\t183183\n狂宴\t183184\n联想G510\t183185\n地球脉动第二季\t183186\n交女\t183187\n麦瑟尔\t183188\n热地\t183189\n藏蓝色\t183190\n航空运输\t183191\n长江水利网\t183192\n粉喷桩\t183193\n阿斯旺\t183194\n哈尔滨宾县\t183195\n湖北师范学院\t183196\n集采\t183197\nTesco\t183198\n喷淋式\t183199\n所向\t183200\n合作医院\t183201\n鄙夷\t183202\n设计邦\t183203\n死亡空间吧_\t183204\nMicrosoftWord\t183205\n富阳站\t183206\n天下无缺\t183207\n特别报道\t183208\n高家园\t183209\n2143\t183210\n弹性系数\t183211\nwin7x64\t183212\n叔嫂\t183213\n摆姿\t183214\n设计学类\t183215\n新锅\t183216\ndegradation\t183217\n科远股份\t183218\n西里斯\t183219\ncms\t183220\n逗阵\t183221\n驯养\t183222\n专柜\t183223\n滑冰场\t183224\n囚犯\t183225\n第十七期\t183226\n花开富贵\t183227\n澳门葡京赌场\t183228\n易语言俱乐部\t183229\n贴式\t183230\n强排热水器\t183231\n闰土\t183232\n双重差分\t183233\n德萨\t183234\n2.91\t183235\n兽夫\t183236\n白花花\t183237\n音游\t183238\n佰\t183239\n关不掉\t183240\n烟田\t183241\n学即用\t183242\n驱邪\t183243\n李济深\t183244\n20160802\t183245\n花宴\t183246\n客座教授\t183247\nAntivirus\t183248\nin循环\t183249\n北京大学国家发展研究院\t183250\n|民\t183251\n26篇\t183252\n大白鲨\t183253\n格列佛游记\t183254\n朱宝意\t183255\nCactus\t183256\n4700元\t183257\n堕落天使\t183258\n阿拉伯\t183259\n才学\t183260\n转职\t183261\n过生\t183262\n打金\t183263\nsass-loader\t183264\nStudi\t183265\n龙狼\t183266\n太和洞久咳丸\t183267\n封烟\t183268\n八卦\t183269\nbecket\t183270\n小堆\t183271\n跳纤\t183272\n继任者\t183273\n有许多\t183274\nantibody\t183275\n下花\t183276\n山东路\t183277\nBulk\t183278\ndisturbance\t183279\npem\t183280\n银嘉\t183281\n墨霆\t183282\n_尐\t183283\n过碳酸钠\t183284\n正装表\t183285\n烟大\t183286\n汽车门\t183287\nflipped\t183288\n西江网\t183289\n我是歌手第四季\t183290\n七律\t183291\n首善之区\t183292\n佛山汽车站\t183293\n千瓦\t18
3294\n一辆10万\t183295\n若尘\t183296\n支架式\t183297\nB85-PRO\t183298\n混合气\t183299\n奥夫\t183300\nemule\t183301\n情感频道_主妇网\t183302\n滇\t183303\nradius\t183304\n慕言\t183305\n燃点\t183306\n温州二中\t183307\n我的宝贝\t183308\n剑网三编辑器\t183309\n生画\t183310\n赌毒\t183311\n肉蒲\t183312\n石蒜科\t183313\n夫复何求\t183314\n栖川\t183315\n谙\t183316\nYamy\t183317\n京门\t183318\n果园镇\t183319\n云蕾\t183320\n彩虹鱼\t183321\n张果果\t183322\nAllenW\t183323\n1970s\t183324\n黏膜\t183325\n两点水\t183326\n丁聪\t183327\n7.1.4\t183328\n新洲乡\t183329\n能吃苦\t183330\n尻屄\t183331\n收音\t183332\n安康新闻网\t183333\n杰狮\t183334\n独自一个人\t183335\n第二十关\t183336\nlihua\t183337\n烟箱\t183338\n慧忠里\t183339\n芝商所\t183340\n笑霸来了\t183341\n20XX\t183342\n干瘪\t183343\nイラスト\t183344\n工网\t183345\n倏然\t183346\n3607\t183347\nCommit\t183348\naeabi\t183349\n唱念\t183350\n摩罗丹\t183351\n病友\t183352\nfetal\t183353\nkoyo\t183354\n中联部\t183355\nkld\t183356\n爬\t183357\n2018年3月\t183358\n启辰\t183359\n杨蕊\t183360\n合肥六中\t183361\nMaintenance\t183362\nShore\t183363\n中广核工程有限公司\t183364\n内南\t183365\nFusionServer\t183366\n先健科技\t183367\nchage\t183368\n180320\t183369\n阿里巴巴公益基金会\t183370\nHiShop\t183371\n平鲁区\t183372\n发烧级\t183373\n指纹套\t183374\n稳性\t183375\n生日会\t183376\nocr\t183377\n网辣文\t183378\n微单A7RIII\t183379\n照常\t183380\n1981\t183381\n西莉卡\t183382\n读史\t183383\n妞博网\t183384\n否词\t183385\n迷你剧\t183386\n海马m3\t183387\nCXH\t183388\n全局性\t183389\n影娱\t183390\nハッピ\t183391\nofficeword\t183392\n42英寸\t183393\n郭鹏\t183394\n边说\t183395\n仙官\t183396\n安缘\t183397\n4月10\t183398\n冠品\t183399\n百度学术\t183400\n独得\t183401\n2016年12月30日\t183402\n串联\t183403\n嗦\t183404\n幻神\t183405\n基团\t183406\n西点\t183407\n天天美剧\t183408\n杨公\t183409\n安维汀\t183410\n骨脂\t183411\n铭洋\t183412\n绿之韵集团\t183413\nST股\t183414\n多彩\t183415\n鲁峰\t183416\n大泼猴\t183417\n男孩子\t183418\n嘎子哥\t183419\n红花湖\t183420\n140米\t183421\nProcessOn\t183422\nthinker\t183423\n传真\t183424\n樣\t183425\n英格兰人\t183426\n0.8.2\t183427\n杭州旅游网\t183428\n成都网易\t183429\n轩尼\t183430\noverlaps\t183431\nAddict\t183432\n违体\t183433\n6.7%\t183434\n安凯\t183435\n谣言案\t183436\n早孕期\t183437\n王东辉\t183438\n6.0下\t183439\n一两年\t183440\n龙山寺\t
183441\n人烟稀少\t183442\n傻钱\t183443\n成品房\t183444\n蒙娜丽莎婚纱摄影\t183445\n卡尔蔡司\t183446\n数珠\t183447\n汉祚堂吉诃德\t183448\n逾越节\t183449\n新浪闽南_新浪网\t183450\n第69集\t183451\ntoshiba\t183452\n特洛伊木马\t183453\n煤球\t183454\n乐通\t183455\n柳如烟\t183456\n棉花糖\t183457\n30000\t183458\n90m\t183459\n左心房\t183460\n心有灵犀一点通\t183461\n名雕\t183462\n至人\t183463\n百步\t183464\nstruck\t183465\n试验员\t183466\n中心村\t183467\n敢想\t183468\n经典片\t183469\n舞蹈学院\t183470\n温州物流公司\t183471\n萧关\t183472\ndoget\t183473\n聪聪\t183474\nAnsoft\t183475\n划不来\t183476\n脱衣舞女\t183477\n水泵\t183478\n1.1.10\t183479\n闻鸡起舞\t183480\n1353\t183481\n了\t183482\n0.60\t183483\nforeignkey\t183484\n李春玲\t183485\n五黑\t183486\n环境保护部办公厅\t183487\n威尼斯人娱乐\t183488\n聚海\t183489\n0823\t183490\n南白\t183491\nf446\t183492\n很明显\t183493\n倍福\t183494\nphpdragon\t183495\n火人节\t183496\n荣耀v9play\t183497\n91搜课\t183498\n成都师范学院\t183499\n硬盘播放器\t183500\n掌圈\t183501\noncreate\t183502\n严峻\t183503\n独立柱\t183504\n红花郎\t183505\nRolf\t183506\netcdctl\t183507\n一个100\t183508\n100.com\t183509\n诛仙手游云梦\t183510\n斯图亚特\t183511\n十堰地区\t183512\n深圳凤凰网\t183513\n液力偶合器\t183514\nVue2.0+ElementUI\t183515\n崮山路\t183516\n芳香油\t183517\n挖币\t183518\n钢花管\t183519\nwmz\t183520\n华盟\t183521\n九龙湖镇\t183522\n万化\t183523\n喜喜\t183524\n零起点\t183525\nspeedfly\t183526\n金相磨抛机\t183527\ncourse\t183528\n巴勒斯坦\t183529\n川端康\t183530\n赐婚\t183531\n马克瑞\t183532\nnba2kol\t183533\n商务楼\t183534\n梦幻诛仙手游青云\t183535\n郑州地铁10号线\t183536\n万杰\t183537\nsqlhelper\t183538\n肌力\t183539\n黑白照片\t183540\n捣鼓\t183541\n站亭\t183542\n英韩\t183543\n盈亏\t183544\n长火\t183545\n江铃考斯特\t183546\n窦智孔\t183547\n小叶紫檀盘\t183548\n脒\t183549\n随行付\t183550\n47P\t183551\nSecureCRT\t183552\n万象城\t183553\n东阳花园村\t183554\n7整除\t183555\n新世代\t183556\n济源市环境保护局\t183557\n两一个\t183558\n棋迷\t183559\n胡维勤\t183560\n撒欢\t183561\nPS4中文网\t183562\n殷都\t183563\n银西铁路\t183564\n二叉堆\t183565\n易卜拉欣\t183566\n3d投影\t183567\n杀手6吧\t183568\n免疫疗法\t183569\n37年\t183570\n孙欣\t183571\n点校\t183572\n30平方米\t183573\n五大连池风景区\t183574\n张艺源\t183575\n凯特琳\t183576\n密封箱\t183577\nmaku\t183578\n752\t183579\n3多\t183580\n橙心\t183581\npowerbeats3\t183582\n331号\t1
83583\n德府\t183584\n飞鸽电动车\t183585\n一桥\t183586\n聂政\t183587\nfung\t183588\n枳实\t183589\n林汉达中国历史故事集\t183590\n狗类\t183591\n抓挠\t183592\n仁品\t183593\n建卡\t183594\n2K17\t183595\nEvolife\t183596\n山口山战记\t183597\n大话西游\t183598\n众望\t183599\n誓愿\t183600\n益字\t183601\n文科生\t183602\n虫鸣\t183603\nEG\t183604\n顾\t183605\n怀着\t183606\n覆膜机\t183607\n廉价\t183608\n私家车\t183609\n直属机关党委\t183610\n闺秘\t183611\n雷磁\t183612\n2018-01-29\t183613\n好丽友\t183614\n保持\t183615\n十五大\t183616\n胡椒木\t183617\nimpression\t183618\n四则\t183619\n形影\t183620\n中华诗歌网\t183621\n省卫生厅\t183622\n澳门葡京酒店\t183623\n洼地\t183624\n文化遗产园\t183625\n水果店\t183626\n脱壳破解\t183627\n少量\t183628\nntn\t183629\n七怪\t183630\n苏民峰\t183631\n戴尔灵越13\t183632\nm128fw\t183633\n眼雨\t183634\n无处\t183635\n均安\t183636\nChallenges\t183637\n袭来\t183638\n地表\t183639\n艺文\t183640\n两字\t183641\n流传\t183642\n天喻信息\t183643\n潘震\t183644\n缅华网\t183645\n新疆沙漠\t183646\n强汉\t183647\n易伟\t183648\n艾华\t183649\n建设银行跨行\t183650\n酒鬼酒\t183651\nWinDriver\t183652\n180313\t183653\n地法\t183654\njiandan\t183655\n丙寅\t183656\n雷昂\t183657\n世界地\t183658\n病区\t183659\n两个月亮\t183660\n48页\t183661\n汞\t183662\nabe\t183663\n违抗\t183664\n失约\t183665\n对分\t183666\nドラ\t183667\n一亿元\t183668\n以及其\t183669\n权益法\t183670\n200回\t183671\n星际2人族\t183672\n张家界市人民政府\t183673\ndrawer\t183674\n滚轮式\t183675\n速冻饺子\t183676\n盗情\t183677\n摩卡&Java\t183678\nplash\t183679\n万县\t183680\n场合\t183681\ngestures\t183682\nmarian\t183683\n分给\t183684\n伊苏8PC\t183685\n18957118186\t183686\n虎鲸\t183687\n淘点金\t183688\n色标\t183689\npreg\t183690\n三少爷的剑\t183691\n普陀区\t183692\n不告而别\t183693\n发改局\t183694\n可数不可数\t183695\n歩\t183696\n心悦2\t183697\n南太湖\t183698\n性交易\t183699\n页级\t183700\n驯龙高手\t183701\nMottoIN\t183702\n20T\t183703\n张道真\t183704\n因噎废食\t183705\n变行\t183706\n狮王争霸\t183707\n芍药\t183708\n敏而\t183709\n超全面\t183710\nMolding\t183711\n五色土\t183712\n身體\t183713\n润叶\t183714\nfour\t183715\n拖堂\t183716\n多肉叶插\t183717\nposed\t183718\n反写\t183719\ntvbs\t183720\n临时户\t183721\n广饶县\t183722\n岂是\t183723\n华师版\t183724\nAlcatel\t183725\n其人其事\t183726\n5星\t183727\n航战\t183728\n小鹿乱撞\t183729\nyonyou\t183730\nopend
aylight\t183731\n50片\t183732\n小助\t183733\n鱼卵\t183734\n加工业\t183735\n边防军\t183736\n三国梦想\t183737\n8266\t183738\n2月12日\t183739\nxshell6\t183740\n发黄\t183741\n股长\t183742\n金开大道\t183743\n致癌\t183744\n饸烙面\t183745\n菁优网\t183746\nResonance\t183747\nUnfortunately\t183748\n无怨\t183749\nι\t183750\n人教版小学语文四年级下册\t183751\nROA\t183752\n直饮水机\t183753\n卡帕多奇亚\t183754\n傅海棠\t183755\n范宁\t183756\n唐圭璋\t183757\nTIK\t183758\n宝龙\t183759\n爱林州网\t183760\nhydro\t183761\n金沙遗址博物馆\t183762\n回郭镇\t183763\n染色机\t183764\n中建三局第一建设工程有限责任公司\t183765\n国源\t183766\n宝莲\t183767\nMOULD\t183768\n笑尿\t183769\n烟草证\t183770\n大学生性\t183771\n净化率\t183772\n26寸\t183773\nASIO\t183774\n腰头\t183775\n20岁时\t183776\n83版\t183777\nfeeder\t183778\n十万亿\t183779\n德江县\t183780\n禁止的爱:善良的小姨子\t183781\n接天\t183782\nlogout\t183783\n烂辉龙\t183784\n损伤\t183785\n国际名家具\t183786\n华茂股份\t183787\n人版\t183788\n简码\t183789\n宣政院\t183790\nNOTE\t183791\nsimmons\t183792\n地方人\t183793\n吉斯\t183794\nPratt\t183795\n楚王\t183796\n渝东北\t183797\n冠希\t183798\n铂宫\t183799\n佳能mg3080\t183800\n农人\t183801\npocket\t183802\n太虚观\t183803\nwindows7_Wind\t183804\n家贼\t183805\n一口一个\t183806\n秦怡\t183807\n香板\t183808\n武侠类\t183809\n芙蓉隧道\t183810\n3163\t183811\n突进\t183812\nCorelDRAW2017\t183813\n4741G\t183814\n硫化钾\t183815\n缩写大全\t183816\n中央区\t183817\nSingleX\t183818\n老赖\t183819\n皇冠\t183820\n倘\t183821\nCSP\t183822\n孟获\t183823\n百度语音识别\t183824\n盯盯拍\t183825\n隔月\t183826\n更为\t183827\nFlashCS6\t183828\n卧撑\t183829\n即时比分|\t183830\n双珠\t183831\nstaged\t183832\nramada\t183833\n五象\t183834\n娇俏\t183835\n主事\t183836\n3201\t183837\n江都区\t183838\nULTIMATE\t183839\nios11分屏\t183840\nCosmo\t183841\n无双大蛇Z\t183842\n诺贝尔医学奖\t183843\nt+0\t183844\n十二座\t183845\n廖碧儿\t183846\n宝泉景区\t183847\n超短焦\t183848\ncreated\t183849\n暗杀教室\t183850\n萧内网\t183851\n教育改造\t183852\n预备党员会议\t183853\n潘娇娇\t183854\n米哈伊尔\t183855\nSauce\t183856\n剥\t183857\n昆山\t183858\n水溶肥\t183859\n鞋袜\t183860\n博斯普鲁斯海峡\t183861\n1599元\t183862\n223路\t183863\n佛眼\t183864\n非常人\t183865\npanorama\t183866\n不问俗尘非\t183867\nCi\t183868\n钟萍\t183869\n凤凰木\t183870\n藤本壮介\t183871\n波幅\t183872\n嗄\t183873\n智
趣\t183874\n做信\t183875\n蓝旖琳\t183876\n通背\t183877\nopenlayers\t183878\n茱莉亚\t183879\n依时利\t183880\n科瓦奇\t183881\n去年12月份\t183882\n工业化\t183883\n直射\t183884\n盛果期\t183885\n安徽医科大学第一附属医院\t183886\n葡萄酒杯\t183887\n温棚\t183888\n羽凡\t183889\n有创意\t183890\n别惹我\t183891\n网页视频\t183892\n签解\t183893\n伴手\t183894\n一盆\t183895\n鹤年堂\t183896\nchrome\t183897\n樾\t183898\n九层妖塔\t183899\n早会\t183900\n张怀义\t183901\n冯玉祥\t183902\n很自信\t183903\n型条\t183904\n多达\t183905\n载中\t183906\n云南省交通运输厅\t183907\n避开\t183908\n运险费\t183909\n爬一爬\t183910\n百分之1\t183911\n冠状动脉狭窄\t183912\nKK网\t183913\n泥雕\t183914\nencouraged\t183915\n佰草集\t183916\n1801\t183917\n循环油\t183918\n第十辑\t183919\n水上\t183920\n星际争霸1.18\t183921\n出租车票\t183922\nHull\t183923\n姨妈\t183924\n假山石\t183925\nteaming\t183926\ngarcons\t183927\nSkechers\t183928\ndpa\t183929\n杂文\t183930\n语例\t183931\n心愿\t183932\n澳规\t183933\nBeamNG\t183934\n益母\t183935\n周晓涵\t183936\n19周\t183937\n楼兰古国\t183938\n一百平米\t183939\n波斯猫\t183940\n迅雷720p超清\t183941\n洛阳网\t183942\n小幺鸡\t183943\n裹\t183944\n为伍\t183945\n短片\t183946\n容限\t183947\n余新镇\t183948\n北京德和衡律师事务所\t183949\n贵州民族大学\t183950\n北京音乐学院\t183951\n电稿\t183952\nEchoes\t183953\n震荡\t183954\n3114\t183955\npsy\t183956\n刘治国\t183957\n马具\t183958\n薄白\t183959\n药学院\t183960\nia\t183961\n拖把池\t183962\n2014年初\t183963\n王云峰\t183964\n应收账款转让\t183965\n老钱\t183966\n坚朗\t183967\n漫域\t183968\n客人\t183969\n钢鞭\t183970\n15批次\t183971\n施德\t183972\n个签\t183973\nchariot\t183974\n挖贝网\t183975\n电烙铁\t183976\n逐梦者\t183977\n入库单\t183978\ndiaper\t183979\nKUYOU\t183980\nphysic\t183981\n幻变\t183982\n第二更\t183983\n学有所\t183984\n宿建德江\t183985\n算力\t183986\nEris\t183987\n2.99\t183988\n王之哈莫\t183989\n中国好舞蹈\t183990\n刻度值\t183991\n下班后\t183992\n红葱头\t183993\n鲁管\t183994\n虚传\t183995\n献血\t183996\nWindows_匿名_天涯问答\t183997\nWEBSITE\t183998\nUnified\t183999\n轮滑吧\t184000\n赵璞\t184001\n噤声\t184002\n千山\t184003\n画乡\t184004\n博罗县人民政府\t184005\n林则徐纪念馆\t184006\n冥灯龙\t184007\n售卖机\t184008\n西藏南路\t184009\n李喆\t184010\nkele\t184011\n碰碰船\t184012\n富贵竹\t184013\n大腕\t184014\n恒业\t184015\n遇上你是我的缘\t184016\nweq\t184017\nwxParse\t184018\n泉州医学高等专科学校\t184019
\n发行者\t184020\n河南村\t184021\n文轩九月网\t184022\n第五季\t184023\nNandFlash\t184024\n开核\t184025\n说开\t184026\n380马力\t184027\n艾瑞巴蒂\t184028\n张岩\t184029\n恒银金融\t184030\n蓝天下_浙江卫视\t184031\n顺通\t184032\n指弹谱\t184033\n2014年末\t184034\n急性脑梗塞\t184035\nYNET.com北青网\t184036\ndllimport\t184037\n哨笛\t184038\n玄武\t184039\n卡尔顿山\t184040\n哈医大\t184041\n孽欲\t184042\n灼热\t184043\np500\t184044\n哥奇米\t184045\n血腥\t184046\n腋梁\t184047\n教育部科技发展中心\t184048\n重孝\t184049\n归一化\t184050\n8.1_\t184051\n致远舰\t184052\nuid\t184053\n万古天帝\t184054\n灌口镇\t184055\ntelent\t184056\nRYU\t184057\n45章\t184058\n城口县\t184059\nwin7序列号\t184060\n增选\t184061\n臣服\t184062\n几户\t184063\nSneakers\t184064\n北米\t184065\n荣耀X2\t184066\n棉片\t184067\n观湖\t184068\n互化\t184069\n网盘\t184070\n京东卡\t184071\n庞加莱\t184072\nBLZ\t184073\n胡马度阴山\t184074\n国际幼儿园\t184075\nAutoencoder\t184076\n窦性\t184077\n马王爷\t184078\n人民铁道网\t184079\n散文集\t184080\n留连\t184081\n烫钻\t184082\n理想的风筝\t184083\n短篇小说集\t184084\n煞星\t184085\nEuler\t184086\n西罗园\t184087\n并不难\t184088\n撒泼\t184089\n醋酸钯\t184090\n娇弱\t184091\n代码片\t184092\n递变\t184093\n考斯特\t184094\n装回\t184095\n50TB\t184096\n十分之九\t184097\n安康市人力资源和社会保障局\t184098\n初音速\t184099\n振华科技\t184100\n湖南省农村信用社\t184101\nC6\t184102\nVRay渲染器\t184103\n衬衫\t184104\nxiaowei\t184105\n4.26世界知识产权日\t184106\n2.02\t184107\n金岐玟\t184108\n企业管理资源网\t184109\n赛格电脑城\t184110\n万户侯\t184111\n阿瑟\t184112\nAvada\t184113\n王大枪\t184114\n佐匹克隆\t184115\n傲来国\t184116\n退域\t184117\n宝马320Li\t184118\n流图\t184119\n索尼xz1\t184120\n仙王\t184121\n兽首\t184122\n雷蛇锐蝮蛇\t184123\n寸土必争\t184124\n电镀银\t184125\n坟山\t184126\n倒角机\t184127\n三会\t184128\n山东省中小企业局\t184129\nHospitality\t184130\nontouchevent\t184131\nmentioned\t184132\n大都督\t184133\n膝关节镜\t184134\n9万多\t184135\n种鸽\t184136\n技惊四座\t184137\nva11\t184138\n周安\t184139\n张仪村\t184140\nwinpcap\t184141\n中举\t184142\n热收缩膜\t184143\n网易MuMu模拟器\t184144\n青花椒\t184145\n风寒咳嗽\t184146\n螺蛳湾\t184147\n蜜液\t184148\namCharts\t184149\n离散系数\t184150\n银河电影联盟\t184151\ncem\t184152\nmultibyte\t184153\n耶伦\t184154\nroyalrover\t184155\n菲洛城\t184156\ns32\t184157\n柳依依\t184158\n谅\t184159\nKnockOut\t184160\n璜泾镇\t1
84161\n杭州网易\t184162\n载流子迁移率\t184163\n马蒂尼\t184164\n热血之城\t184165\n一本通\t184166\n送春\t184167\n上市价\t184168\n太阳能控制器\t184169\nduqu\t184170\nPhotoShop资源网\t184171\n渺渺\t184172\n床笠\t184173\nDeeper\t184174\n怔\t184175\n猜球\t184176\n不伦之恋\t184177\n大连路\t184178\n金包\t184179\n7v7\t184180\n防城区\t184181\n华夏\t184182\n第三者\t184183\n难易度\t184184\n行李员\t184185\n出版地\t184186\n程彤颜\t184187\nnoteexpress\t184188\n建筑工程\t184189\n白斌\t184190\nSHERO\t184191\n天津经济开发区\t184192\n江西省委组织部\t184193\n贝乐泰\t184194\n战扬\t184195\n度蜜月\t184196\n四川省气象局\t184197\n魂牵梦绕\t184198\nI7\t184199\nwindows2003\t184200\n华元欢乐城\t184201\n九三学社\t184202\n格挡率\t184203\nHANGZHOU\t184204\n铁娘子\t184205\n刘亚军\t184206\n炉衬\t184207\n黄金茶\t184208\nKilled\t184209\n白藜芦醇\t184210\n9.2.4.0\t184211\n民生银行\t184212\n蛟龙突击队\t184213\n和讯股票\t184214\n皮娜\t184215\n渔商\t184216\n无羡\t184217\n科大智能\t184218\n远成集团\t184219\n旱\t184220\n玄奥\t184221\n氯化亚锡\t184222\n王皇后\t184223\n马克思恩格斯\t184224\n提单\t184225\n返古\t184226\n刘晓军\t184227\n订购\t184228\n吉林出版集团\t184229\n柳溪\t184230\n王欧\t184231\n酷睿I5\t184232\n20130728\t184233\n静安公园\t184234\n大修\t184235\nang\t184236\n绣品\t184237\n增强型\t184238\nJFileChooser\t184239\n苹果装饰公司\t184240\n紫火\t184241\n考不考\t184242\nsamsonite\t184243\n港航\t184244\n贝儿\t184245\n球瓶\t184246\n女声歌\t184247\n古屋\t184248\n混沌\t184249\n白堤路\t184250\n金健\t184251\nhoodie\t184252\n808毕业论文网\t184253\n蕾雅·赛杜\t184254\npkhex\t184255\n前者\t184256\nTer\t184257\n棉毛衫\t184258\nbasara\t184259\n电龙\t184260\n步美\t184261\n交通事故责任认定书\t184262\n东莞火车站\t184263\n痛并快乐\t184264\n辩手\t184265\n海贼战队豪快者\t184266\nAUNE\t184267\nCoin163\t184268\n书章\t184269\n兴业大道\t184270\n自动售货机\t184271\n张皇后\t184272\ndbms_job\t184273\nBeast\t184274\nVipshop\t184275\n12.5.0\t184276\nailee\t184277\n2020年后\t184278\n珠江道\t184279\n标王\t184280\n3380\t184281\n大康农业\t184282\n中公教育山西分校\t184283\n哈利法克斯\t184284\n傅善祥\t184285\n固戍地铁站\t184286\nmyeloid\t184287\n信息技术部\t184288\n京哈高速公路\t184289\n无敌兔\t184290\n广义表\t184291\n南山区\t184292\n魔法帝\t184293\norbi\t184294\nSerge\t184295\n上海四季酒店\t184296\n占优\t184297\nForget\t184298\n心本\t184299\nui文件\t184300\n主管理员\t184301\niotop\t184302\n马佳\t1
84303\n药箱\t184304\nB612咔叽\t184305\n折叠车\t184306\n常州龙网\t184307\n陈泽民\t184308\n四队\t184309\n正宁\t184310\n镭雕机\t184311\n美吉特\t184312\n奥松\t184313\n国务院安委办\t184314\ndata分区\t184315\nes200\t184316\n德州二中\t184317\n北戴河\t184318\n2082\t184319\n利石素\t184320\nSerif\t184321\n电与磁\t184322\n红金\t184323\n鬼武者2\t184324\n宁夏回族自治区发展和改革委员会\t184325\n奥的斯\t184326\n剑网3天策\t184327\nECR\t184328\nrevit2016\t184329\n爆\t184330\n真魂\t184331\n应收账款函证\t184332\n775\t184333\n巴伐利亚州\t184334\n大熊猫繁育研究基地\t184335\nferrite\t184336\n大创立项\t184337\n中国宏桥\t184338\nTortoiseGit\t184339\n鞣酸\t184340\nnt6\t184341\n中冶赛迪\t184342\n自锁式\t184343\n学而不厌\t184344\nbattle\t184345\n我的妹妹有点怪\t184346\n华中科技\t184347\n纱条\t184348\n小米卡\t184349\n桐花\t184350\nsealy\t184351\n榨\t184352\nAnconda\t184353\n谈资\t184354\n论英雄\t184355\nAMERICA\t184356\n狮岭\t184357\n街声\t184358\n分级\t184359\n洪山桥\t184360\n秦始皇\t184361\n轰趴猫\t184362\nWebsites\t184363\n512m\t184364\n广汽\t184365\n千禧年\t184366\n遭受\t184367\n周秉德\t184368\n烘托\t184369\n紫微大帝\t184370\n孟欣\t184371\ndeem\t184372\n亮哥\t184373\n点题\t184374\n申华\t184375\n1000多种\t184376\n采纳率\t184377\n大利\t184378\n滇西\t184379\n小产权房买卖\t184380\n最後\t184381\n白宁\t184382\n大馆\t184383\n玩呗\t184384\nInte\t184385\n求答\t184386\n红色警戒共和国之辉\t184387\n呆妹\t184388\n深圳地铁2号线\t184389\n时寒冰\t184390\n净水商城\t184391\n禁制\t184392\n电影学院\t184393\n型秀\t184394\n蒸馒头\t184395\n长假\t184396\n指弹曲\t184397\n接收器\t184398\n浪水\t184399\n铁力木\t184400\n离散化\t184401\n002450\t184402\n惑星\t184403\n匆匆那年\t184404\n涡卷\t184405\n龙湖滟澜山\t184406\n乙未\t184407\n琉星\t184408\n蒙特雷\t184409\n管理暂行办法\t184410\n受众\t184411\n开户网\t184412\n蒂塔\t184413\n罗志祥\t184414\nxtrabackup\t184415\n弛豫\t184416\n庙前镇\t184417\n空床\t184418\n鹰王\t184419\n茶炉\t184420\nT480S\t184421\n奔三路学习网\t184422\n戏称\t184423\n神医喜来乐传奇\t184424\n张懿\t184425\n忘返\t184426\nAtlus\t184427\n药费\t184428\n斯威x7\t184429\n今义\t184430\n角纸\t184431\n水粉纸\t184432\n德国杯\t184433\n瘦骨嶙峋\t184434\n受穷\t184435\n全层\t184436\nwelt\t184437\n民间风水奇谭\t184438\n铃木吉姆尼\t184439\n范海辛的奇妙冒险\t184440\nmonochrome\t184441\nadam01\t184442\n2025年\t184443\n三汁焖锅\t184444\n掌阅ireader\t184445\n虚类\t184446\n梅村\t184447\n于晴\t18444
8\n荣耀5c\t184449\nGB_\t184450\n20150407\t184451\n神州网\t184452\n手礼\t184453\n走上坡路\t184454\n分栏\t184455\n宏川\t184456\n江燕路\t184457\n华为视频会议\t184458\n塞北\t184459\n2.rmvb\t184460\n洁美科技\t184461\n黑客\t184462\n宝乐迪\t184463\n隔空投送\t184464\n真空罐\t184465\n沙甸\t184466\n锦湖轮胎\t184467\n浒墅关\t184468\n除皱\t184469\n阿里百度\t184470\n扣板吊顶\t184471\n马蜂\t184472\n三圾片\t184473\n握威\t184474\n联系观\t184475\n随身学\t184476\n余映潮\t184477\n初二\t184478\nLID\t184479\n20170914\t184480\ndn80\t184481\n纪念版\t184482\n然後\t184483\n小沢アリス\t184484\n抑魔金\t184485\n沙滩公园\t184486\n白石村\t184487\n湖南\t184488\nMAXIMUS\t184489\n锡金\t184490\n佩德罗\t184491\n真麻烦\t184492\nKeds\t184493\nStopping\t184494\n组合体\t184495\nds4\t184496\n拨备\t184497\n分析步\t184498\n两通\t184499\n美潮\t184500\n紫陌\t184501\n乐客城\t184502\n一室一\t184503\n主图\t184504\n20161228\t184505\n発売\t184506\n绥宁县\t184507\n西部建设\t184508\n禾晏山\t184509\n32767\t184510\n酵母片\t184511\n218i\t184512\nisr\t184513\n大连市地方税务局\t184514\n红一方面军\t184515\n斯图加特大学\t184516\n赵伊凡\t184517\n微寒\t184518\n古仔-520听书网\t184519\n气弹簧\t184520\nHei\t184521\nf8\t184522\n奥数\t184523\n张立宪\t184524\n足疗机\t184525\n李志超\t184526\n分地\t184527\n数通\t184528\n称道\t184529\n驻马店\t184530\nhuangye88.com\t184531\nvulnerabilities\t184532\n运用题\t184533\nhasee\t184534\n泥酔\t184535\n因果报应\t184536\n白兰度\t184537\n特种柜\t184538\n南通附院\t184539\nCAP\t184540\n费德李琳\t184541\n于曼丽\t184542\n六斤\t184543\n萌德\t184544\ndoggy\t184545\noki\t184546\n幸福树\t184547\n理单\t184548\n经营服务性收费\t184549\n重生冰公主的校园之行\t184550\nDATABASE\t184551\n交火\t184552\nviedo\t184553\n欤\t184554\nVNX\t184555\n麦迪孙亚芳\t184556\n瑶琴\t184557\n唐枫\t184558\nkpl2018\t184559\nshutdown\t184560\n安·兰德\t184561\n自紧\t184562\njumping\t184563\n成有\t184564\nCNA\t184565\n河灯\t184566\n勉县\t184567\n宁平\t184568\nnetlink\t184569\n髓\t184570\n魔杖\t184571\n代码表\t184572\n科学技\t184573\n士兵\t184574\n尿裤\t184575\n无线组网\t184576\nOpenfiler\t184577\n野马\t184578\nhappen\t184579\n7.35元\t184580\n最好的舞台\t184581\n15plus\t184582\n2.9GB\t184583\n排泄\t184584\n高新民\t184585\n半导体照明\t184586\n公用事业\t184587\n高精\t184588\nClimate\t184589\n评头\t184590\n藤原浩\t184591\n马科\t184592\n白杨路\t184593\n使用证\t1
84594\n科服\t184595\n甘油三脂\t184596\n挤入\t184597\n金盛国际家居\t184598\n白塔寺\t184599\n百合镇\t184600\nCTAN\t184601\n96%\t184602\n表芯\t184603\n江苏国税电子税务局\t184604\n中马\t184605\nMicroStation\t184606\n9.03\t184607\n图形界\t184608\n2厘\t184609\n何巧女\t184610\n结合剂\t184611\ncount\t184612\n兵部\t184613\n考无忧\t184614\n庄重\t184615\n民利\t184616\n锁存\t184617\nsyst\t184618\n二曲\t184619\n变比\t184620\n江西省机电设备招标有限公司\t184621\n耳垂\t184622\n奶酪包\t184623\n封盖\t184624\n时尚管理专业\t184625\nBabyCenter\t184626\n王永平\t184627\nbennett\t184628\n一个5\t184629\n正版\t184630\n沈悦\t184631\n泰华\t184632\nloadlibrary\t184633\nBestMe\t184634\n美港\t184635\n南大港\t184636\n七六六忘忧小说网\t184637\nPY\t184638\n商客通\t184639\n碗莲\t184640\n第三年\t184641\n宽度\t184642\n修复科\t184643\nWORD2013\t184644\n宝马320li\t184645\n书算\t184646\n毕夏\t184647\nchkdsk磁盘修复命令工具\t184648\n单费\t184649\n极性\t184650\n河北干部网络学院\t184651\n中海信托\t184652\n澡\t184653\n叶宇\t184654\n几十辆\t184655\n双唇\t184656\n西安论坛\t184657\n937\t184658\n国标电线平方数\t184659\n开采权\t184660\nmiles\t184661\n节哀\t184662\n几样\t184663\n西装革履\t184664\n民服\t184665\n自考马克思主义基本原理概论\t184666\n天生一对/宿缘\t184667\nManhattan\t184668\n建林\t184669\n小正太\t184670\n灭霸\t184671\n小日\t184672\nhedron\t184673\n东方银谷\t184674\n招商宝\t184675\n联销\t184676\n勇吉拉\t184677\nconvince\t184678\n4.5寸\t184679\n加氢\t184680\n零地\t184681\n甜樱桃\t184682\n余年\t184683\n9点钟\t184684\nremittance\t184685\n社交化\t184686\nOutlook2016\t184687\nAirports\t184688\n腾讯浏览器\t184689\n百姬\t184690\n沏\t184691\nNP\t184692\n拼音字\t184693\n边界层\t184694\n滦河\t184695\n古德曼\t184696\n湖南省知识产权局\t184697\n4月1日起\t184698\n烧麦\t184699\n2y^2\t184700\npoor\t184701\n3560\t184702\n万全区\t184703\n送养\t184704\njordi\t184705\n原籍\t184706\nPARTITION\t184707\n南阳一中\t184708\n4-5天\t184709\n无人机\t184710\n下庄村\t184711\n整骨\t184712\n10017\t184713\n广东电子税务局\t184714\n电气原理图\t184715\nAi\t184716\nith\t184717\n黎屋村\t184718\nWeiss\t184719\n广宗县\t184720\n94神马\t184721\n﹢\t184722\nDAW\t184723\nlaraval\t184724\n枯叶落\t184725\nm1911\t184726\n虹桥\t184727\nCameraRaw\t184728\n微压计\t184729\n耕牛\t184730\n电刑\t184731\n寒冷\t184732\n24天\t184733\nHUBLOT\t184734\n绝对服从\t184735\n黄勤\t184736\n莺莺\t1
84737\n530Li\t184738\n米兔\t184739\n口位\t184740\n4116\t184741\n敞口\t184742\n横沥\t184743\n伊金霍洛旗人民政府\t184744\n孔侑\t184745\n珠海市住房公积金管理中心\t184746\n资源版\t184747\n广告板\t184748\n000300\t184749\nSavorboard\t184750\n中国航油\t184751\n数据域\t184752\n激光投影仪\t184753\n电子图\t184754\nValve\t184755\n信阳电视网\t184756\n迪杰斯特拉\t184757\n邬兴亮\t184758\n非城镇\t184759\n微智\t184760\nEDGE\t184761\n成都消防\t184762\n纠察队\t184763\n酷玩_播视网\t184764\n飞行堡垒fx63vd\t184765\n一个寸\t184766\n毛豆新车网\t184767\n雅诗兰\t184768\n17ss\t184769\n4C\t184770\n毕比\t184771\n小米手机4c\t184772\nNavicate\t184773\n磐石市\t184774\n枸橼酸铋钾\t184775\n丿丶\t184776\nunauthorized\t184777\n修神\t184778\nqidian\t184779\ngodlie\t184780\n杭州华三通信技术有限公司\t184781\nzoey\t184782\n嘉定区幼儿园\t184783\nEpiData\t184784\n插车\t184785\n广物\t184786\n仁义镇\t184787\ns7-300\t184788\n三日\t184789\n调查报告书\t184790\n1001\t184791\n6队\t184792\nOregon\t184793\n耳帽\t184794\n文迪\t184795\n自怨自艾\t184796\n原始部落\t184797\n献算\t184798\nジェマ\t184799\n卡拉多\t184800\n广渠\t184801\n哈尔滨职业技术学院\t184802\n光的反射\t184803\n护目镜\t184804\nMonitors\t184805\n101首\t184806\n仙本那\t184807\n热得快\t184808\n上海公司\t184809\n北大法学院\t184810\n伍允龙\t184811\n献演\t184812\n近三百年\t184813\n陈嘉桦\t184814\n16\t184815\n龚敏\t184816\n耶利亚\t184817\n全峰快递\t184818\nconvert函数\t184819\n石鼓区\t184820\n国际范\t184821\n东鹏卫浴\t184822\n式样\t184823\nvi设计公司\t184824\n等比数列an\t184825\n于雷\t184826\n勇者大战魔物娘rpg\t184827\n浅田\t184828\n白鹭洲公园\t184829\n绿城桃李春风\t184830\nFindchips\t184831\nPis\t184832\n济南绿地中心\t184833\n南洋中学\t184834\n1527\t184835\n搓澡巾\t184836\n厦门小鱼社区\t184837\n桌道\t184838\n邻邦\t184839\n位址\t184840\n凌潇肃\t184841\n48式\t184842\n北大附中深圳南山分校\t184843\n22例\t184844\n东北一家人\t184845\n风波\t184846\n胡锦涛\t184847\nimagenet\t184848\n糖豆广场舞课堂\t184849\n中国宝安集团股份有限公司\t184850\nunbunt\t184851\n仙仙\t184852\n枥木\t184853\n习书记\t184854\n2880\t184855\nNO.10\t184856\n凤辣子\t184857\n草寇\t184858\n书源\t184859\n夹生饭\t184860\n瓦尔莎拉\t184861\n龙轩美术网\t184862\n血石\t184863\n陶瓷型\t184864\nfpu\t184865\n一百张\t184866\nGadgets\t184867\n永中\t184868\n戴安全\t184869\n误发\t184870\n咋调\t184871\n金钟罩\t184872\n套马\t184873\n九所镇\t184874\n中外合作专业\t184875\n12331\t184876\n土星五号\t184877\n歌赛
\t184878\n外汇宝\t184879\n约柜\t184880\nz240\t184881\n集中管理\t184882\n南田\t184883\n南马\t184884\n5201314\t184885\nC249\t184886\nsain\t184887\n天娇\t184888\n输精\t184889\n防撞柱\t184890\nbboss\t184891\n广州港集团\t184892\n商事登记\t184893\ncmg\t184894\n3DS.DUOWAN.COM\t184895\nnpy\t184896\n碑文\t184897\n天津融资租赁公司\t184898\n端子箱\t184899\n李少白\t184900\nnoone\t184901\n87届\t184902\n杜村\t184903\n基药\t184904\n东升科技园\t184905\n理杏仁\t184906\n星月神话\t184907\n天威\t184908\n坚忍不拔\t184909\n葵玲奈\t184910\n存在于\t184911\n齐园\t184912\n争分夺秒\t184913\n太空豆\t184914\n亲吻姐姐\t184915\n闪电宝\t184916\n600460\t184917\n有机化\t184918\n仁厚\t184919\n上外地\t184920\n学级\t184921\n市房产局\t184922\n显卡驱动\t184923\n罗纳尔迪尼奥\t184924\n_质\t184925\n阎岭\t184926\n贵阳酒店\t184927\n祖庭\t184928\n酒度\t184929\n无线电波\t184930\n宁蒗彝族自治县\t184931\n差点\t184932\n银企\t184933\n无恙\t184934\nExpertise\t184935\n防城港市防城区\t184936\n新生儿奶粉\t184937\nMyCat\t184938\n驿\t184939\n先锋影音资源网\t184940\n回沪\t184941\n场景\t184942\n央视六一晚会\t184943\n白色车\t184944\n光心慌慌\t184945\n中华路\t184946\n康巴什区\t184947\n锦程网\t184948\n卡拉羊\t184949\n啤酒吧\t184950\n鲁滨孙\t184951\n贴壁细胞\t184952\niOS7.1.2\t184953\n提取器\t184954\n接踵而来\t184955\nATSL\t184956\n金派\t184957\n贵阳网\t184958\n离型\t184959\n扫书\t184960\n隠\t184961\n三轴仪\t184962\n088\t184963\n全体会议\t184964\n阿弥陀佛圣号\t184965\n时空石\t184966\n胡言\t184967\n作家网\t184968\n想走\t184969\n22.0\t184970\n地脚螺栓\t184971\n三爱富\t184972\n南京湖南路\t184973\nredefine\t184974\n碧叶\t184975\n魔纹\t184976\nレイ\t184977\n嫖娼案\t184978\n重复值\t184979\n东华镇\t184980\n我的理想\t184981\n王様\t184982\n贾平凹\t184983\n服务贸易总协定\t184984\nzbj\t184985\n妃\t184986\n娱乐-北国网\t184987\n朗廷\t184988\n猫笼\t184989\n教师资格证考试\t184990\n笠\t184991\nTECHNOLOGY\t184992\ncasing\t184993\n羊皮\t184994\n腰精\t184995\n1922年\t184996\n光禄\t184997\ngranger\t184998\n图腾柱\t184999\n对标管理\t185000\n广汽传祺\t185001\n森提纳克斯号\t185002\n金鸡湖景区\t185003\nasd\t185004\n纪念性\t185005\n荣耀畅玩5X\t185006\n铝封\t185007\n龙穴\t185008\n景气\t185009\n列加\t185010\n娄底新新网\t185011\n常州酒店\t185012\n辉县市人民政府\t185013\n马尾辫\t185014\n挖孔桩\t185015\n七弦琴\t185016\nC92\t185017\n尖沙咀\t185018\n颈挂式\t185019\n静觅\t185020\nmii\t185021\n预言帝\t185022\n超图软件\t185023\n福建之窗\t185024
\n百果\t185025\n一见\t185026\n认识人民币\t185027\n关口\t185028\n李智\t185029\n开图\t185030\n逃单\t185031\n温尔缦\t185032\n南部商务区\t185033\n败亡\t185034\nfrancs\t185035\n闭合性粉刺\t185036\n弓水\t185037\n公斤级\t185038\n名本\t185039\n兰陵笑笑生\t185040\n达川人民政府\t185041\n两个一百年\t185042\n数量型\t185043\nb8\t185044\n火盆\t185045\n龙背山森林公园\t185046\n岩层\t185047\n沈玉琳\t185048\nRedline\t185049\nsolidwor\t185050\n梨花木\t185051\n铂寓\t185052\n东直门\t185053\n新宿幻灵\t185054\n优库\t185055\n杨柳河\t185056\nhdd\t185057\nmey\t185058\n丝绸之路\t185059\n大通冰室\t185060\n5121\t185061\n亚夏汽车\t185062\n粉墨春秋\t185063\n第21章\t185064\n平壤\t185065\n水冶\t185066\n盐城市环境保护局\t185067\nClara\t185068\n请回答1994\t185069\n大公网\t185070\n较场口\t185071\nnim\t185072\nmorn\t185073\nupdate\t185074\n巴特孙宏斌\t185075\n庞然\t185076\n兰山区\t185077\n仙草\t185078\n喉返神经\t185079\n峪口\t185080\n亚太区\t185081\n青云志3\t185082\nerd\t185083\n文一科技\t185084\n新航路\t185085\n第44\t185086\n成都市泡桐树小学\t185087\n结构主义\t185088\n大黄蛰虫丸\t185089\n爱尚鲜花\t185090\n焦点中国网\t185091\n乙烯\t185092\n规范表\t185093\n幻梦之晓\t185094\n彐\t185095\nJava版\t185096\nleezhxing\t185097\n孟伟\t185098\n280mm\t185099\n八张\t185100\nzhudai\t185101\n嘉定图书馆\t185102\n胡锦\t185103\n天津大道\t185104\n多壁碳纳米管\t185105\nRTM\t185106\n棕红色\t185107\n盖梁\t185108\n永大\t185109\n风电场\t185110\n王鸽\t185111\n4折\t185112\n马猴烧酒\t185113\noad\t185114\n惯坏\t185115\n泰山网\t185116\n过去分\t185117\n0550.com\t185118\n苏少版小学\t185119\n80386\t185120\n实战化\t185121\nALD\t185122\n琐忆\t185123\n丰丰\t185124\n米拉克\t185125\n两万步\t185126\n再见前任\t185127\n亚士创能\t185128\nsmeg\t185129\n埇桥区政府\t185130\n青竹湖湘一外国语学校\t185131\n小米盒子3s\t185132\n琥珀糖\t185133\ngoldwave\t185134\n台儿庄区政府\t185135\n十几层\t185136\n贾某\t185137\n肉长\t185138\n逆转裁判1\t185139\n纯钛\t185140\n我的奋斗\t185141\n工作感想\t185142\n舒印彪\t185143\n36款\t185144\n世茂股份\t185145\n福昌\t185146\n西于庄\t185147\n高次\t185148\nPermGen\t185149\n换骨\t185150\n你好运\t185151\n构象\t185152\n高气压\t185153\n王文博\t185154\n龚伟\t185155\n2.2.22\t185156\n品诺\t185157\n寞\t185158\n倡树\t185159\n献给\t185160\nShallow\t185161\n缓释\t185162\n祛病\t185163\n2.com\t185164\n四面楚歌\t185165\n1亿股\t185166\n烟村\t185167\n骨干\t185168\n【丸子\t185169\n扈三娘\t185170\n郑宇民\t18517
1\n龙焱\t185172\n犀\t185173\n2.5万吨\t185174\n至尊计状元才\t185175\n西街\t185176\nvisualstudio\t185177\n冯冯\t185178\n英昌\t185179\n100美元\t185180\n杉咲花\t185181\n50kg\t185182\n12档\t185183\n萌卡篮球\t185184\n樱桃小丸子第二季\t185185\n大小端\t185186\nappleid\t185187\n尽可\t185188\n天津九龙男健医院\t185189\n投资报酬率\t185190\n世\t185191\n镍铁\t185192\n省运会\t185193\n分节\t185194\n库布齐沙漠\t185195\n大壮\t185196\n京天\t185197\n白龙路\t185198\n提供商\t185199\n永顺县\t185200\n龙景\t185201\njiaren\t185202\n天津中考网\t185203\nc168\t185204\n超星集团\t185205\n夜猫\t185206\n错关\t185207\n练有\t185208\n东楼\t185209\nPHP\t185210\n二页\t185211\n求异\t185212\n钱泳辰\t185213\n苦头\t185214\n反接\t185215\n因数与倍数\t185216\n预备党员转正申请书\t185217\n王者荣耀凤求凰\t185218\naxygen\t185219\n五娃\t185220\n德资\t185221\n四川水井坊股份有限公司\t185222\n辛宪英\t185223\n社区化\t185224\n檀弓\t185225\n尽全力\t185226\n变配电\t185227\nAUTOMATIC\t185228\nbean\t185229\n机动化\t185230\n上海市数字证书认证中心\t185231\n圣骑士|Paladin-魔兽世界\t185232\n金庸群侠传苍龙逐日\t185233\n色逼\t185234\n导流罩\t185235\n史玉武则天\t185236\n杭州九堡客运中心\t185237\n指南形\t185238\n中国言实出版社\t185239\nGOGO\t185240\n波伏娃\t185241\n坦率\t185242\n寡言\t185243\n中国移动江苏公司\t185244\n植物大战僵尸西游版\t185245\n群星\t185246\n竹筒饭\t185247\n闽龙\t185248\n全王\t185249\n佳能6D2\t185250\n教学类\t185251\n25000元\t185252\n嘉会\t185253\n泥石\t185254\n陈煜\t185255\nPainting\t185256\n上海苏宁\t185257\n省纸\t185258\n20160425\t185259\n大发雷霆\t185260\n8006\t185261\n儒武争锋\t185262\n药物中毒\t185263\n団\t185264\n格林诺奇\t185265\n成爱\t185266\nlazada\t185267\nmonday\t185268\n李玄易\t185269\n清真菜\t185270\n剑王庶女嫡妃\t185271\nGL小说\t185272\n张岱\t185273\n2158\t185274\n兴业基金\t185275\n担保无抵押贷款公司|担保公司\t185276\n当湖街道\t185277\nGU\t185278\nattackers\t185279\n百分之二十\t185280\nMyGirl美媛馆\t185281\n电信翼支付\t185282\nax^2+bx+c\t185283\n五行山\t185284\n永远记得\t185285\n中国军团_新浪\t185286\n强排\t185287\n2214\t185288\nSecPath\t185289\nelectrons\t185290\n一路有你\t185291\n鸿达兴业\t185292\n跟随\t185293\n第一行代码\t185294\n八速\t185295\n十八年后\t185296\nRoca\t185297\n五年级下册语文课文\t185298\nWatcher\t185299\nkiwi\t185300\n肇庆市\t185301\n发育宝\t185302\n懒惰\t185303\n名楼\t185304\n石家庄铁道大学四方学院\t185305\n就爱\t185306\nJUKI\t185307\n蹑手蹑脚\t185308\n素化\t185309\n闻所未闻\t185310\n19亿
\t185311\ndbp\t185312\n诉讼财产保全责任保险\t185313\n肉末\t185314\n安全文\t185315\n杰瑞德·莱托\t185316\n补中益气丸\t185317\n桃花林\t185318\n文长\t185319\n民事诉讼法\t185320\n塑造\t185321\n敌兵\t185322\n曹叡\t185323\n200亿美元\t185324\n花甜\t185325\n依兰吧\t185326\n螺母\t185327\n烂尾\t185328\nVill\t185329\n大麦芽\t185330\n离婚律师网\t185331\n墙内\t185332\n铝芯\t185333\n112期\t185334\n大半个\t185335\n龙桥街道\t185336\n火野丽\t185337\n撮镇\t185338\n小卖柜\t185339\n天祝政府网\t185340\n提拉米苏\t185341\n洋气点\t185342\n合肥市环境保护局\t185343\n开水器\t185344\n赵波\t185345\nVMnet8\t185346\n秦菲墨天宇\t185347\n剑侠情缘网络版2\t185348\n丁丁当当\t185349\n白龙江\t185350\nGraphic\t185351\n宠物游戏\t185352\n房地产权证\t185353\n潘佳佳\t185354\n魔羯女\t185355\nPost\t185356\n育才二小\t185357\neam\t185358\nai模板-千图网\t185359\n常期\t185360\nReviewed\t185361\n帕劳\t185362\n言出\t185363\n春游湖\t185364\n等效焦距\t185365\n天天630\t185366\n甲钴胺片\t185367\nop\t185368\n光纤熔接机\t185369\n中超吐口秀\t185370\n四方坪\t185371\n艾附暖宫丸\t185372\n琅琅比价网\t185373\nMariana\t185374\nmicaljamee\t185375\n_森\t185376\n3820\t185377\n聚氨酯胶粘剂\t185378\ninurl\t185379\n物流论\t185380\n180张\t185381\n大话西游2吧\t185382\n华睿\t185383\n11支\t185384\n吉他中国\t185385\n士族\t185386\n大行动\t185387\n厦门东海职业技术学院\t185388\nE14\t185389\nDepends\t185390\n国务院机关事务管理局\t185391\n固步自封\t185392\n奏议\t185393\nCaprice\t185394\n华强安防网\t185395\n新闻周刊\t185396\n好玩\t185397\n处在\t185398\n张溥\t185399\ncommu\t185400\ndescriptor\t185401\n孙巍\t185402\n排上\t185403\n口袋妖怪梦\t185404\n388.13\t185405\ncorldraw\t185406\ncreations\t185407\n同类目\t185408\nGrown\t185409\n乡伴\t185410\n长洱\t185411\nx64位\t185412\n壶腹癌\t185413\n1.4.7\t185414\n参投\t185415\n搜服\t185416\n爱宠\t185417\n王真儿\t185418\n民族风情\t185419\n凯迪拉克XTS\t185420\n有点污\t185421\n丰田陆巡\t185422\n烦闷\t185423\n35万千瓦\t185424\n舞服\t185425\n吴欢\t185426\n揭阳市\t185427\n刨根问底\t185428\n公家\t185429\n驯服\t185430\n郑波\t185431\nRecv\t185432\n逆行\t185433\n宫比罗\t185434\n剧情片\t185435\nCoop\t185436\n3900元\t185437\n铁血讲武堂\t185438\n强帖\t185439\n作用域\t185440\n佃农\t185441\n黄金大劫案\t185442\n565\t185443\n一年又一年\t185444\n中华田园猫\t185445\n布姆英雄学院\t185446\n杨磊\t185447\n进户门\t185448\n湖北省纪委\t185449\n本届\t185450\n混沌武士\t185451\nNaHCO3\t185452\n大林木\t185453\n纯真时代\
t185454\n27页\t185455\n挡道\t185456\n减少性\t185457\n云南省烟草专卖局\t185458\n梦舟股份\t185459\n杯装\t185460\n数码复合机\t185461\n屏幕\t185462\n7.1级\t185463\n天府之国\t185464\n马料\t185465\n伍德灯\t185466\n菠菜公司\t185467\ndropbox\t185468\n朱雀门\t185469\nUPLAY\t185470\n0379\t185471\n身孕\t185472\n劫镖\t185473\n王炸\t185474\nstatutory\t185475\n元祖食品\t185476\nVC++2015\t185477\nWORD、EXCEL\t185478\n745\t185479\n除名\t185480\nuu网游加速器\t185481\nuninstall\t185482\n盘腿\t185483\n韩平\t185484\nSvn\t185485\n變態\t185486\nServices\t185487\n全筑股份\t185488\n耐酸性\t185489\n利剑高悬\t185490\n1十\t185491\n兄弟连\t185492\n嘶哑\t185493\n新世纪百货\t185494\n接待日\t185495\n放任\t185496\n病魔\t185497\n陶笛网\t185498\n龙港区\t185499\n永恒网\t185500\n刑事案\t185501\n南地区\t185502\n2018KPL\t185503\n大丸\t185504\n脊髓灰质炎\t185505\n反码\t185506\n2333\t185507\n拆掉\t185508\n拳皇2005\t185509\n烦恼\t185510\n8910\t185511\nSilicone\t185512\n荔枝园\t185513\njtextarea\t185514\n航空母舰\t185515\n学不可以\t185516\nquarters\t185517\n40道\t185518\nWall\t185519\n1.3.3\t185520\n真夏夜之银梦\t185521\n灵敏度\t185522\n爱尔兰\t185523\n麦教猛\t185524\n169号\t185525\nLCD12864\t185526\n智能体脂秤\t185527\n甘肃科技馆\t185528\n证书管理器\t185529\n里程\t185530\n长盛\t185531\nhella\t185532\n宝清县\t185533\n相宜\t185534\nSoup库\t185535\n85977328\t185536\n祖辈\t185537\n输送管\t185538\n峻工\t185539\n哈蒙德\t185540\n黑\t185541\n杞县人民政府\t185542\n只有\t185543\n枪膛\t185544\n淮上区\t185545\n能为所欲为\t185546\nwindows2008r2\t185547\nScholarship\t185548\n丝芙兰国际\t185549\n张敬轩\t185550\n新奥尔良鹈鹕队\t185551\n被否决\t185552\n智版\t185553\n侏儒兔\t185554\n铃木羚羊\t185555\n尚智逢源\t185556\n离子散\t185557\nsleeve\t185558\nThrough\t185559\n成都市人民政府\t185560\n中山小学\t185561\n百度编辑器\t185562\nnasdaq\t185563\nPier\t185564\n假脸\t185565\n三九感冒灵\t185566\n最后一炮\t185567\n王国纪元\t185568\n20150804\t185569\n230mm\t185570\nlined\t185571\n四优\t185572\n素包子\t185573\n阿加莎克里斯蒂\t185574\ntaxable\t185575\n军港之夜\t185576\n20亿美元\t185577\n烧结砖\t185578\nVMware12\t185579\n王跃\t185580\n三千多万\t185581\n华法林\t185582\n战意\t185583\n纳米氧化铝\t185584\n叶天宇\t185585\n旅图\t185586\n优科豪马\t185587\n巴黎迪士尼乐园\t185588\n病因\t185589\n海盗船k70\t185590\n党风廉政建设主体责任\t185591\n江阴市\t185592\n梅花联轴器\t185593\n轉換\t18559
4\n36年前\t185595\n一个歌\t185596\n相信未来\t185597\n前任3:再见前任\t185598\nlocations\t185599\n大长山岛\t185600\n免房券\t185601\n一件套\t185602\nnews\t185603\n普拉洛芬滴眼液\t185604\n全交\t185605\n合剂\t185606\nwos\t185607\n基本功\t185608\n贡缎\t185609\n拳石\t185610\n广通\t185611\n上马\t185612\n七界传说\t185613\n致君\t185614\n韩村河\t185615\n鳄龙\t185616\n活腻\t185617\n小米5|小米5S\t185618\n曼德勒\t185619\n鲸人\t185620\n栈\t185621\nYS\t185622\n居众装饰\t185623\nthemida\t185624\n力能\t185625\n梁荣忠\t185626\n汉中市政府网\t185627\n100.0000元\t185628\n第91届\t185629\n周天勇\t185630\n希拉穆仁草原\t185631\n先师\t185632\n結果\t185633\n梁峰\t185634\n偶尔\t185635\n十宗\t185636\n水云间\t185637\n糖油\t185638\n剑舞\t185639\n20170218\t185640\nHo\t185641\n夜趣福利\t185642\nzonetan\t185643\nsiege\t185644\n王二小放牛郎\t185645\nBABY\t185646\n中元节\t185647\nWin10技\t185648\n亏\t185649\n驴妈妈旅游资讯网\t185650\n自转\t185651\n攻打\t185652\n智慧型\t185653\n锡器\t185654\n无位置传感器\t185655\n品鉴会\t185656\n空想\t185657\n40针\t185658\nmatlb\t185659\n二|\t185660\nset_header\t185661\n无房\t185662\n高能所\t185663\n防伪\t185664\n光性\t185665\n悼词\t185666\n百年不遇\t185667\n勇攀高峰\t185668\n设立\t185669\n疏文\t185670\n渲图\t185671\nwirte\t185672\n寓意\t185673\n中国遂宁_遂宁市政府\t185674\nDVDRip\t185675\nral\t185676\n1.9G\t185677\npate\t185678\n中信书店\t185679\n湖北法院\t185680\n数据库会\t185681\n肥兔\t185682\n新概念作文\t185683\n中国徒步网\t185684\n5千多\t185685\n40袋\t185686\n坡面\t185687\n端妃\t185688\n4月10日起\t185689\nmyrio\t185690\n苹果5\t185691\n失爱\t185692\n虎扑NBA\t185693\n立足点\t185694\n红白理事会\t185695\n候选项\t185696\n地支\t185697\n160平方\t185698\n东华软件股份公司\t185699\nMVRDV\t185700\n更何况\t185701\n滚轮\t185702\n山雾\t185703\nsof\t185704\n倌\t185705\npetri\t185706\n宫女\t185707\n向美\t185708\n环山\t185709\nGuns\t185710\n野值\t185711\n6330\t185712\n等奖\t185713\n第一百章\t185714\noperands\t185715\n带宽积\t185716\n天空彩\t185717\n憋不住\t185718\nfca\t185719\n香水有毒\t185720\n成都市总工会\t185721\n批次\t185722\n畅谈\t185723\nrpcs3吧_\t185724\n银山\t185725\n存费\t185726\n中机国际\t185727\n新航\t185728\nfatf\t185729\n安妮股份\t185730\n令人捧腹\t185731\n外贸信托\t185732\nPDR\t185733\n微软Surface\t185734\nISO9001-2015\t185735\nlsd\t185736\n2012年底\t185737\n女主人\t185738\n类型化\t185739\n环游世界\t1
85740\n多方炮\t185741\n比特币\t185742\n量一量\t185743\nmembers\t185744\n姬骑士\t185745\n滕州二中\t185746\n卷边\t185747\n五要素\t185748\n天科\t185749\n企业管理咨询公司\t185750\n九节鞭\t185751\nargv\t185752\n广博\t185753\n邹静\t185754\n稀罕\t185755\n雷宇\t185756\n分页函数\t185757\n28度\t185758\n当于\t185759\n邓紫\t185760\n智者千虑\t185761\n北京现代音乐学院\t185762\n习爷爷\t185763\nPendulum\t185764\n墙图\t185765\n一次世界大战\t185766\n课外作业\t185767\nSixty\t185768\n领导学\t185769\n24.1\t185770\n工作队\t185771\nTCM\t185772\n暖花开\t185773\n交通工程\t185774\n寅月\t185775\n伊犁师范学院\t185776\n天涯明月刀真武\t185777\n慕尼黑工业大学\t185778\n3-2\t185779\nteamlab\t185780\njeans\t185781\n神仙群\t185782\n梁桂\t185783\n挚爱\t185784\nBINARY\t185785\n濮阳新闻网\t185786\n1圈\t185787\n快娱乐\t185788\n海文\t185789\n唐风采\t185790\n无量\t185791\n漫夭\t185792\n海尔地产\t185793\n全套装\t185794\n耿美\t185795\n义弟\t185796\n凤凰汽车_凤凰网\t185797\n不可言传\t185798\n罚息\t185799\n筹\t185800\n敬而远之\t185801\nAnalyzing\t185802\n轻淘客\t185803\n白云万达广场\t185804\n回唱\t185805\njflash\t185806\n野生动物\t185807\ncom+\t185808\nn位\t185809\n生下\t185810\nCompute\t185811\nANESSA\t185812\nPATEK\t185813\n乌饭\t185814\nxiaoqu\t185815\n压力性\t185816\n妊高症\t185817\n安耐晒金瓶\t185818\n球差\t185819\n神话剧\t185820\n赵明\t185821\n宴会\t185822\n石层\t185823\n32&6\t185824\n地心历险记2\t185825\n巩俐\t185826\n左眼\t185827\nCodePlayer\t185828\n重生之贼行天下\t185829\n螨虫\t185830\nUnmi\t185831\n十周\t185832\n瓮安县\t185833\n热血传奇手机版\t185834\n花包\t185835\n黄山学院\t185836\n裸奔\t185837\n【求文】求\t185838\n排成\t185839\n特卖会\t185840\nPolyester\t185841\n三一学院\t185842\nunregistered\t185843\n新东方英语学校\t185844\n核苷类\t185845\n客所思\t185846\n试管婴儿\t185847\n黄亿华\t185848\nSHISEIDO\t185849\n天涯小筑\t185850\n切割液\t185851\n红毛丹\t185852\n塞住\t185853\nv7.8\t185854\n青钢\t185855\n松柏镇\t185856\nLiquor\t185857\n唧唧\t185858\nfootnote\t185859\n戯\t185860\n六分仪\t185861\n叫床声\t185862\n红\t185863\nDevil\t185864\n高虎\t185865\n郭雪芙\t185866\n侯震\t185867\n天马寨\t185868\n国梦\t185869\n美学\t185870\n舌尖\t185871\n钛银\t185872\n上排\t185873\nsplitter\t185874\nSuckseedeva\t185875\n英伟\t185876\n三亚凤凰机场\t185877\n新鹿鼎记\t185878\n过亿\t185879\n广濑海\t185880\n绍市\t185881\n自调\t185882\nQunar\t185883\n微微\t185884\npromise
d\t185885\n奴才\t185886\n湖南湘雅医院\t185887\n浙西\t185888\n孙淳\t185889\n现代教育技术\t185890\n沪江词汇社\t185891\nnetty\t185892\n缴款\t185893\n连日\t185894\n方图\t185895\nRK3368\t185896\n秦关\t185897\n轻弹\t185898\nexcel文件\t185899\n说来说去\t185900\n易烊崔永元\t185901\n装袋机\t185902\n红烧\t185903\n莎蔓莉莎美容院\t185904\n洛阳乐居网\t185905\n温总\t185906\nポイント\t185907\nmw1\t185908\n分馆\t185909\nfocusrite\t185910\n龙炎飞\t185911\nemployment\t185912\n智能小区\t185913\n娄山关路\t185914\n14393\t185915\nLogback\t185916\nBauer\t185917\n欧联杯\t185918\n废土2\t185919\n緒\t185920\n学究\t185921\n趁着\t185922\n请教\t185923\n志在千里\t185924\n上海市中医院\t185925\n胶枪\t185926\n龙湖水晶郦城\t185927\n卡佩\t185928\n李万山\t185929\npianist\t185930\n天機\t185931\n抗过敏益生菌\t185932\n中央研究院\t185933\n敌我\t185934\n烟灰\t185935\n自航\t185936\n溶质质量分数\t185937\n荥经\t185938\n2016新年\t185939\n断肠草\t185940\n七兄弟\t185941\n平铺\t185942\n小吃街\t185943\ncentro\t185944\n定势\t185945\nee\t185946\n张钧甯\t185947\n广告展\t185948\ncc\t185949\n猫猫\t185950\n士力架\t185951\nquanguo\t185952\n29条\t185953\n绵竹市\t185954\n德永\t185955\n欧兰\t185956\n众泰\t185957\n暴力型\t185958\n冒险之旅\t185959\n市场营销\t185960\nLipo\t185961\n减配\t185962\n斯克\t185963\n不适格\t185964\n苏州高铁站\t185965\n桌上\t185966\n12类\t185967\n5000多万\t185968\n博物馆奇妙夜3\t185969\ndescending\t185970\nstony\t185971\n宏景\t185972\n宝鸡国家高新技术产业开发区\t185973\nSAY\t185974\n农副\t185975\n段话\t185976\n论断\t185977\nsrx\t185978\n猫鼬\t185979\n安理申\t185980\n人到中年\t185981\n线迹\t185982\n涞滩古镇\t185983\n宏博\t185984\n算命|免费算命|八字算命\t185985\n浙江大学光华法学院\t185986\n火电\t185987\n歺\t185988\n真岛吾朗\t185989\n8片\t185990\n中羽\t185991\n上单\t185992\ndgx\t185993\n南京银行\t185994\n美屋\t185995\n百度地图热力图\t185996\n20181\t185997\n我是特种兵\t185998\n小米4c\t185999\nElevator\t186000\n机瞄\t186001\n双味\t186002\n六约\t186003\n猪猪侠\t186004\n阿桂\t186005\n料机\t186006\nabove\t186007\n杰出\t186008\n给爱\t186009\ntorchvision\t186010\n335\t186011\n4相\t186012\nWage\t186013\n泛亚有色金属交易所\t186014\n正道\t186015\n最及时\t186016\n真武大帝\t186017\n火辣辣\t186018\n非公募基金会\t186019\n岐阜\t186020\n光触媒灭蚊灯\t186021\n湖南师大附中\t186022\n人妖秀\t186023\n近两个月\t186024\nvivox20a\t186025\n一去不回\t186026\n保备案\t186027\n玉米花\t186028\n水清\t186029\
n灵山\t186030\n张秀文\t186031\n卡器\t186032\n搬运工\t186033\n中级经济师考试\t186034\nRPCS3\t186035\n寒\t186036\n东北财经大学》\t186037\n打天梯\t186038\n江山樾\t186039\n华为P8max\t186040\n飞凌\t186041\n欧曼\t186042\nFacial\t186043\n除湿剂\t186044\nsurface3吧\t186045\n白灰\t186046\n突刺\t186047\nsnoopy\t186048\njulian\t186049\n一屋\t186050\nPART1\t186051\n碎石桩\t186052\n滴滴出行科技有限公司\t186053\n硫化亚铁\t186054\n神兽金刚\t186055\n马克思传\t186056\n猛酸钾\t186057\nmultipart\t186058\n资金链断裂\t186059\n0^\t186060\n轻氧\t186061\n鱼跃龙门\t186062\n国务院法制办公室\t186063\n三百万\t186064\n谎称\t186065\nthyroid\t186066\n洛维\t186067\n康乐部\t186068\nrecommendation\t186069\n人教版小学语文六年级下册\t186070\n数据池\t186071\nfellatio\t186072\nLinen\t186073\n一二期\t186074\nwps2017\t186075\ngarage\t186076\nEngine\t186077\n以太坊源代码\t186078\n桃花源记\t186079\n高源\t186080\n还有你\t186081\n斧王\t186082\n浙江省测绘与地理信息局\t186083\n挨宰\t186084\n打腿\t186085\n3503\t186086\n赌圣2\t186087\n塑木\t186088\n下一个世界\t186089\n微整形培训学校\t186090\n16班\t186091\nHiragino\t186092\n回龙山\t186093\nFX8300\t186094\n闲暇时\t186095\nSync\t186096\n意大利面酱\t186097\n成精\t186098\n88天\t186099\n哑哑\t186100\n阿冬\t186101\n杜占元\t186102\n一数\t186103\nzhiwu\t186104\n枫叶卡\t186105\n拉尔\t186106\n资阳日报数字报\t186107\nword段\t186108\n梳\t186109\n哈尔冰\t186110\nmatble\t186111\n某样\t186112\n南大仙林校区\t186113\n屌丝版\t186114\n单元测试\t186115\n丘比\t186116\n60层\t186117\n夏洛克·福尔摩斯\t186118\n旗盒\t186119\n真面目\t186120\n珍宝蟹\t186121\n柴油\t186122\ninstruments\t186123\n齐市\t186124\nformats\t186125\n国有土地使用权\t186126\n五六千块\t186127\nInfected\t186128\naerial\t186129\n秘密森林\t186130\n盖毯\t186131\n奔溃\t186132\n贵州省公路工程集团有限公司\t186133\n标法\t186134\n毛囊炎\t186135\n读档\t186136\n艮山西路\t186137\n如影随心\t186138\n程云\t186139\n走私案\t186140\n军衔制\t186141\n日本药妆\t186142\n刘士余\t186143\nMOBI|EPUB|AZW3\t186144\n波流\t186145\n旅游塔\t186146\n割发\t186147\n空包\t186148\ncitra\t186149\n开放\t186150\n信用卡诈骗罪\t186151\n神系\t186152\n亲娘\t186153\n尽职\t186154\nHG255D\t186155\n羊角\t186156\n选件\t186157\n御厨\t186158\n钢管\t186159\n氪\t186160\nd60\t186161\n四十六\t186162\n罗萨\t186163\n尤里奇\t186164\n王艳\t186165\n專區\t186166\n阿尔卑斯山脉\t186167\n狄龙\t186168\n杨汉\t186169\n机器学习算法工程师\t186170\n不能见\t1
86171\n敌人\t186172\n白石路\t186173\n学武\t186174\n尚德实验学校\t186175\n如来轩逸\t186176\n一起走\t186177\nkas\t186178\n随身集合石\t186179\ntars\t186180\nsojson\t186181\ns50\t186182\n装药\t186183\n会计年度\t186184\n辛辛苦苦\t186185\n水\t186186\n广东龙川政府\t186187\nSpell\t186188\n王江泾\t186189\n吴国平\t186190\n娘娘庙\t186191\nsouvc\t186192\n西安地铁4号线\t186193\n何阳\t186194\n枪伤\t186195\n蜀郡\t186196\n生物识别\t186197\n卷头发\t186198\n江西理工\t186199\n李安琪\t186200\n感染性心内膜炎\t186201\norcale\t186202\n千层糕\t186203\n亿百\t186204\n六把\t186205\n苍井优\t186206\n乌牛\t186207\n硬盘录像机\t186208\n永暑礁\t186209\n疑难病\t186210\n说明书库\t186211\n百通物流\t186212\n富春江镇\t186213\n火药枪\t186214\n北控水务(中国)投资有限公司\t186215\n纸画\t186216\n千年前\t186217\ntvp\t186218\n石油库\t186219\n0.618\t186220\n春风吻上我的脸\t186221\nfreehand\t186222\nViki\t186223\nUrus\t186224\n命令与征服:红色警戒3\t186225\n武庚\t186226\n窄带\t186227\nE招贷\t186228\n仿威图\t186229\n卡塔尔航空\t186230\n1488\t186231\n免息贷款\t186232\n物性\t186233\n理算\t186234\n咀嚼肌\t186235\n马牙槎\t186236\n小米1s\t186237\n中国安全生产协会\t186238\n白沙大道\t186239\n齐聚\t186240\npeoples\t186241\n柳毅\t186242\nYY号\t186243\n点将台\t186244\n孙休\t186245\n红玛瑙\t186246\nJoye\t186247\n千里马工程网\t186248\n学业\t186249\n玉海园\t186250\n正史\t186251\n500.0000元\t186252\n{\t186253\n珍品\t186254\n馀\t186255\n怨妻\t186256\nOrganics\t186257\n5.8.10\t186258\n朱逢博\t186259\n通江县\t186260\n悬置\t186261\nAdaboost\t186262\n张建平\t186263\n郑万铁路\t186264\n提币\t186265\n万岁\t186266\n规培生\t186267\nhatch\t186268\n渐进性\t186269\n化蝶\t186270\ntop\t186271\n凤凰OS\t186272\n伊犁州\t186273\n魔鬼鱼\t186274\n20160830\t186275\nEVOLUTION\t186276\n墙墙\t186277\n胡家园\t186278\n底模\t186279\n三菱电梯\t186280\n王喆\t186281\n1000层\t186282\n岩石力学与工程学报\t186283\n李侠\t186284\n姐控\t186285\n卫生所\t186286\n范数\t186287\n未名湖畔\t186288\nN卡\t186289\n蜡\t186290\n成都宾馆\t186291\nhone\t186292\n第60\t186293\n师旷\t186294\n六枚\t186295\n剑网三吃鸡\t186296\n卡使玩英雄联盟\t186297\n拼音\t186298\n水牛角\t186299\n老关\t186300\n中野亚梨沙\t186301\n黄骅市\t186302\n硅酸盐\t186303\n上海宝冶\t186304\n九阴瑞纳\t186305\npingce\t186306\n西湖论剑\t186307\nOneNote2016\t186308\n轩辕\t186309\n1.06\t186310\n元宝鸽\t186311\n达人秀\t186312\n联想g460\t186313\nlene\t186314\ntorture\t186315\n小
太阳幼儿园\t186316\n凉宫琴音\t186317\n人民防空\t186318\n密位\t186319\n越城区政府网\t186320\n苏军\t186321\n安卓软件园\t186322\n重庆市渝中区人力资源和社会保障局\t186323\n草酰氯\t186324\n军迷\t186325\n表面张力系数\t186326\nnekopara\t186327\n兵器工业集团\t186328\nFFXIV\t186329\n唐长安\t186330\ndefender\t186331\neasy_install\t186332\n好快\t186333\nrocky\t186334\ntmobile\t186335\n大连市\t186336\n断音\t186337\n单八将\t186338\nXamarin\t186339\n876AV\t186340\n军火贩\t186341\n沣河\t186342\n恨\t186343\n新娘\t186344\nSElinux\t186345\ndinner\t186346\nJinghuaS\t186347\nfucks\t186348\n斗宗\t186349\n6600k\t186350\n中华全国青年联合会\t186351\n调重\t186352\n内镜室\t186353\n今早\t186354\n中世\t186355\n偷闻\t186356\nyinyue\t186357\n高血压病\t186358\n美研\t186359\n微操\t186360\n心理医生\t186361\n板车\t186362\n243号\t186363\n5粒\t186364\n便士\t186365\n龙蟒\t186366\n戚风蛋糕\t186367\n虎山行\t186368\n凤凰涅槃\t186369\n称做\t186370\n暂住\t186371\n1000\t186372\n0.05mm\t186373\n市工商行政管理局\t186374\n经纬纺机\t186375\n2012下半年\t186376\n千里达\t186377\n韩氏\t186378\ncenos\t186379\nBEAUTIFUL\t186380\n辞职吧\t186381\nChung\t186382\n署\t186383\nd2s\t186384\n斑蝥\t186385\n壳机\t186386\n佳欣\t186387\nyummy\t186388\nmonologue\t186389\nIP加速器\t186390\n附真\t186391\nMechanism\t186392\nsouls\t186393\n延海\t186394\n传经送\t186395\npoets\t186396\n3000平米\t186397\n县志\t186398\nmitaka\t186399\n低压管\t186400\n凌少\t186401\n2017_凯风网\t186402\n晁天王\t186403\nacti\t186404\n朴赞郁\t186405\n第25号\t186406\n31.5英寸\t186407\nroute-static\t186408\n佳能\t186409\n多特软件站\t186410\n月宗\t186411\n蚂蚁\t186412\n乌兰察布市\t186413\n钟文\t186414\n2015-2022年\t186415\n芒果街\t186416\n投建\t186417\n陈伟俊\t186418\n4光\t186419\nEUR\t186420\nsays\t186421\n北京市自来水集团\t186422\nCHIC\t186423\n医疗岗\t186424\n中英网www.uker\t186425\nethan\t186426\n爬取\t186427\nReagan\t186428\n剪式\t186429\n讲经\t186430\n地方\t186431\n3月25\t186432\n施心远\t186433\n王胜\t186434\nCCV5\t186435\n圣迪奥\t186436\n阋墙\t186437\nm401d\t186438\n20170627\t186439\n情狱\t186440\n变矩器\t186441\n青岛公交查询网\t186442\n36层\t186443\n新秀\t186444\n老天\t186445\n40路\t186446\n亿\t186447\n充电柜\t186448\n校园默示录\t186449\n导航仪\t186450\n国管公积金\t186451\n美团众包\t186452\n循环伏安曲线\t186453\n龙炎\t186454\nDiskpart\t186455\n生精\t186456\n淘
宝代运营公司\t186457\n耐克\t186458\n背段\t186459\n6吨\t186460\n自书遗嘱\t186461\n资阳中学\t186462\n097\t186463\n胭脂虫\t186464\n国家测绘局\t186465\nol番号职女制服诱惑\t186466\nstations\t186467\n博尔塔拉\t186468\nportmap\t186469\n合资股比\t186470\n大战神\t186471\n还有效\t186472\n看逼\t186473\n0.9.3\t186474\n工人日报网\t186475\n斗湖\t186476\n拔罐\t186477\n8801\t186478\n洋服\t186479\n夺嫡\t186480\n牧云\t186481\ntreat\t186482\n我是歌手第五季\t186483\n河南省商城县\t186484\n阿卡姆疯人院\t186485\n蒙妮坦\t186486\n20160108\t186487\n李遂镇\t186488\n翻浆\t186489\ncollocation\t186490\n雷厉风行\t186491\nscratch2\t186492\n风水大师\t186493\n贝律铭\t186494\n文献标识码\t186495\n凡妮莎海辛\t186496\n狗民网\t186497\n沙漠老鼠团\t186498\nResidents\t186499\nGMap\t186500\n零矩阵\t186501\n微信公众号号\t186502\nSinga\t186503\n在云端\t186504\n生理心理学\t186505\n30立方\t186506\n考妣\t186507\n兴义吧\t186508\nSpecified\t186509\n海洋之歌\t186510\n佛山祖庙\t186511\n浅谈\t186512\nIndeed\t186513\nteach\t186514\n金博会\t186515\n警种\t186516\n慎食\t186517\n10mb\t186518\n嫩穴\t186519\n中青\t186520\n新疆兵团\t186521\n湖南大学图书馆\t186522\nairdroid\t186523\n展播\t186524\n顶楼\t186525\nexecutable\t186526\n黑裤袜\t186527\n中康\t186528\nGAI爷\t186529\n金莲花胶囊\t186530\n联络柜\t186531\n喊麦DJ网\t186532\n鹤山人才网\t186533\n北京市国家税务局\t186534\n14:30\t186535\n49.1\t186536\n7万\t186537\n邵音音\t186538\n抽水机\t186539\n北京市城市管理委员会\t186540\n52集\t186541\n朋友圈\t186542\n黎里镇\t186543\n明阳\t186544\n精灵宝可梦\t186545\nvmlinuz\t186546\n魔皮\t186547\n存货周转率\t186548\n富坚\t186549\nchange事件\t186550\nValidity\t186551\n泰沙\t186552\nFallout\t186553\n巴峡\t186554\n万网云\t186555\n新生儿黄疸\t186556\n0cube\t186557\n大城县\t186558\n中央文明委\t186559\n千黛斯\t186560\nlisten\t186561\n威吓\t186562\n大饼\t186563\n总动员\t186564\n首开集团\t186565\n厦门六中\t186566\n走人\t186567\ndaniu\t186568\nknocked\t186569\n乐尚\t186570\nunderage\t186571\n大珠山\t186572\nSchneider\t186573\n838434\t186574\n慧点科技\t186575\n2单\t186576\n圆心角\t186577\n上海市中医医院\t186578\n波涛\t186579\n扬美古镇\t186580\njsPDF\t186581\n金锣集团\t186582\n桃城区\t186583\n松紧\t186584\n57秒\t186585\n梁萍\t186586\n匠品\t186587\n菲林\t186588\n环境保护部\t186589\n猎聘\t186590\n嘟嘟唇\t186591\n第168期\t186592\n众泰SR9\t186593\n梓\t186594\n余位\t186595\nacct\t186596\n平潭岛\t186597\no
fficials\t186598\n巴拉望岛\t186599\n整存整取\t186600\n湖北教育信息网\t186601\n拴住\t186602\n丁鹏\t186603\n紫铜\t186604\nExcel2007\t186605\n1维\t186606\n一毫秒\t186607\n长途客运站\t186608\n91prom\t186609\n父子关系\t186610\n念皇\t186611\ncntv\t186612\n今视网\t186613\n活页卷\t186614\n周更\t186615\n机械设计手册\t186616\n加权平均利率\t186617\npinky\t186618\n大愚\t186619\n柳州站\t186620\n栖霞路\t186621\n凯乐\t186622\n22cm\t186623\n2公分\t186624\n宫斗文\t186625\n节能科技有限公司\t186626\n黑色洛城\t186627\ncpic\t186628\n瑞舒伐他汀\t186629\n杜鹃山\t186630\n119w\t186631\n中国证券登记结算有限公司\t186632\n王晓磊\t186633\n骆家辉\t186634\n发奋\t186635\n打湿\t186636\nrift\t186637\n小露\t186638\n马旭东\t186639\nunixtime\t186640\n任子行\t186641\n湿疣\t186642\n管理科\t186643\n申论\t186644\n肺段\t186645\n的士高\t186646\n8855\t186647\n创力\t186648\n洗色\t186649\n暖心\t186650\n外壁\t186651\n扎克拉文\t186652\n昆明酒店\t186653\n白裙\t186654\n4格\t186655\nn7\t186656\n洋泾中学\t186657\n27句\t186658\n税项\t186659\n尸油\t186660\n后部\t186661\n李启明\t186662\n妖精剑\t186663\nliyou\t186664\n一壶老酒\t186665\n湖口县\t186666\n游子\t186667\n千花坊\t186668\n广汽三菱\t186669\nIL-6\t186670\nOcata\t186671\n妆\t186672\n征信报告\t186673\n阿里纳斯\t186674\nforked\t186675\n李小刚\t186676\nald\t186677\n收款员\t186678\nSWEET\t186679\n花咲\t186680\n疲劳度\t186681\nPhalcon\t186682\n形心\t186683\n周口市人民政府\t186684\n比丘\t186685\n食品招商网\t186686\n厨卫展\t186687\n心文\t186688\n上海轨道交通11号线\t186689\n莱斯特城\t186690\n曳\t186691\n芳华绝代\t186692\n东风新村\t186693\nparalles\t186694\n哑巴新娘\t186695\n别嘌醇\t186696\n车证\t186697\n新庄镇\t186698\nCixi\t186699\n平焊法兰\t186700\n残年\t186701\n电子商城\t186702\n跳出\t186703\n回执照\t186704\n石油危机\t186705\nEOC\t186706\n被偷窥\t186707\n霄边\t186708\nHoi\t186709\n小米Note3\t186710\niss\t186711\n肩摔\t186712\n贝贝南瓜\t186713\n小坂\t186714\n速叠杯\t186715\nbonnie\t186716\n洛瑞\t186717\n天娱传媒\t186718\n龙斗士\t186719\n吸波\t186720\n角料\t186721\nmtk\t186722\n九龙坡石桥铺\t186723\n萌战凌派\t186724\n电标\t186725\n交道口\t186726\n八集\t186727\n鞍山立山\t186728\n自考-233\t186729\n自拍神器\t186730\n内在美\t186731\n荷叶镇\t186732\n76_\t186733\n四手\t186734\n树板\t186735\n阿金斯\t186736\n银行招聘考试网\t186737\n东西\t186738\n松芝股份\t186739\n无锡新闻网\t186740\n直链\t186741\n【兰斯\t186742\n集火\t186743\n站务\t186744\n叫座\t1867
45\n配料机\t186746\n流浪者之歌\t186747\n赤焰军\t186748\n竞彩猫\t186749\nios7.1.2\t186750\n102.5\t186751\nmingxingku\t186752\nXRDP\t186753\n白木优子\t186754\n香巴拉\t186755\n伦敦金属交易所\t186756\n构家\t186757\nbeacon\t186758\nCODol\t186759\n_希\t186760\n机井\t186761\n一级消防工程师\t186762\nborrowed\t186763\n谢强\t186764\n10个\t186765\n去皮\t186766\nduellinks\t186767\n130卡\t186768\n同床共枕\t186769\n折扣网\t186770\n保罗·沃克\t186771\n热贡\t186772\n天津大学管理与经济学部\t186773\nピ\t186774\n省文联\t186775\n泰州市区\t186776\ndot\t186777\n骗吻狂魔\t186778\n酸酸乳\t186779\n吸血鬼猎人\t186780\nENR\t186781\n2345.com\t186782\n美元现汇\t186783\n笑不出来\t186784\n泸州老窖股份有限公司\t186785\n伦敦大学金史密斯学院\t186786\n百分之十\t186787\n上海徐汇区青少年活动中心\t186788\nEspaa\t186789\n2018年04月07日\t186790\nDTOP\t186791\n玻璃酸钠注射液\t186792\nedrawmax\t186793\n递等式\t186794\n海战世界吧\t186795\n补辑\t186796\n徐良\t186797\n软模\t186798\n木机\t186799\n盛世豪庭\t186800\n总收入\t186801\n脱线\t186802\n河北地区\t186803\nnau\t186804\n四十五\t186805\n农林牧渔\t186806\n北京西山森林公园\t186807\n金装传奇\t186808\n酷评\t186809\n许达哲\t186810\n循环伏安\t186811\nwcs\t186812\n51Halcon\t186813\n庐江二中\t186814\n陈诉\t186815\n认\t186816\n拉筒\t186817\n第七辑\t186818\n刘广东\t186819\n温州城\t186820\n红燕\t186821\n草莓节\t186822\n永恒之火\t186823\n义\t186824\n寿喜锅\t186825\n音系\t186826\n熄屏\t186827\n劳伦\t186828\n测试性\t186829\n丙寅年\t186830\n聘制\t186831\nCompressors\t186832\n汇仁肾宝\t186833\n格隆汇\t186834\n死神VS火影\t186835\n王一川\t186836\n2366\t186837\n三倍体\t186838\n中古\t186839\n口袋妖怪绿宝石386\t186840\n组合类\t186841\n深恶痛绝\t186842\n广州中山大学\t186843\n31名\t186844\n龙珠\t186845\n罗城县\t186846\n松本芽依\t186847\n弩箭\t186848\n口コミ\t186849\n事发后\t186850\n全价\t186851\n前传\t186852\n嗨曲\t186853\n物竞\t186854\n武东\t186855\n37平米\t186856\n片上\t186857\n落锁\t186858\n64_\t186859\n吴志祥\t186860\n老骥\t186861\n主卖\t186862\n2gb\t186863\n12颗\t186864\n孝介\t186865\n60平\t186866\n禁令\t186867\nRhodes\t186868\nuiimageview\t186869\nCATS\t186870\nPolyethylene\t186871\n通港路\t186872\n间谍同盟\t186873\n一面\t186874\n联合村\t186875\nCherish\t186876\n85厘米\t186877\n浙江省建设投资集团股份有限公司\t186878\n返聘\t186879\nnsis\t186880\n至慧\t186881\nzyd\t186882\n构造地质学\t186883\nAffect3D\t186884\n水溶性膳食纤维\t186885\n鹿城\t186886\n黑
客帝国2\t186887\nanz\t186888\n龙门村\t186889\n刘越\t186890\n撸撸鲁\t186891\n点读\t186892\n埋置\t186893\n35kV\t186894\n苏文茂\t186895\n咖喱辣椒\t186896\n乌索普\t186897\n多常\t186898\n欧卡\t186899\n考估\t186900\nandroid5.0\t186901\n问责制\t186902\n伪作\t186903\n4千亿\t186904\nlevels\t186905\n有所作为\t186906\n香炉峰\t186907\n实习证\t186908\n133个\t186909\n感染\t186910\n陈建中\t186911\n假机油\t186912\n爆款\t186913\n参编\t186914\n生在\t186915\nchatbot\t186916\n造价通询价圈\t186917\n香湖湾\t186918\n海贼王女帝娜美\t186919\n僵尸\t186920\n资和信商通卡\t186921\n世袭制\t186922\n井筒\t186923\n硬碰硬\t186924\n诺威\t186925\n伊斯兰文化\t186926\n相合\t186927\n保利金町湾\t186928\n电话销售\t186929\n中石化\t186930\n教典\t186931\n1年后\t186932\n猛片\t186933\n蜗蜗\t186934\nelectric\t186935\n力诺瑞特\t186936\n故事\t186937\n0507\t186938\n100余\t186939\n中传\t186940\n次轮\t186941\n张桂芳\t186942\nMySQL\t186943\n20160724\t186944\n一座\t186945\nblueflame\t186946\n潜江市\t186947\n庄河市教育局\t186948\n励志\t186949\n操作按钮\t186950\n指示剂\t186951\n青网\t186952\n中山市第一中学\t186953\n铁券\t186954\nFreeLists\t186955\nscientists\t186956\nkisssoft\t186957\n斗门镇\t186958\n运品\t186959\n#02\t186960\nplantation\t186961\n一账通\t186962\n纤维束\t186963\n上海福山外国语小学\t186964\n攀岩馆\t186965\nFitz\t186966\n测试点\t186967\n工代会\t186968\ncyto\t186969\nStarting\t186970\n檀溪\t186971\ncomy\t186972\n广告门\t186973\n微医\t186974\n40期\t186975\n丰花月季\t186976\n高力士\t186977\nc盘分区\t186978\n退货率\t186979\n1144\t186980\n李秉宪\t186981\n调唱\t186982\n丙烯酸\t186983\nex表\t186984\n喺\t186985\n新长安欧尚\t186986\nrevised\t186987\nyamada\t186988\n真空包装机\t186989\n循环化\t186990\n湛江机场\t186991\n二寸照\t186992\n穿膜\t186993\nX100F\t186994\n路长制\t186995\n乐利\t186996\n沣东\t186997\n恶风\t186998\nrecycled\t186999\n三氮唑\t187000\n停机坪\t187001\n铁定\t187002\ncxt\t187003\n抵抗性\t187004\n李浩菲\t187005\n艾迪康\t187006\n株洲日报社\t187007\n缉查\t187008\n李小华\t187009\n晨鸣纸业\t187010\n小猪短租-短租\t187011\n1夜\t187012\n中信银行公司\t187013\nWithout\t187014\ncommons-pool\t187015\n邮乐购\t187016\n黄玲\t187017\nSAT\t187018\nOptiX\t187019\n填满\t187020\nmvvm\t187021\n7.4.3\t187022\n倒过\t187023\n无关风月\t187024\n国家质量监督检验检疫总局进出口食品安全局\t187025\n皮埃尔\t187026\n迷奸案\t187027\nstored\t187028\n参比电极\t187029\nNETGEAR\
t187030\n京粮集团\t187031\n439\t187032\n超变战陀\t187033\nMODS\t187034\n_育路高校招生网\t187035\n2602\t187036\n气味图书馆\t187037\n猪肠粉\t187038\ncherrytree\t187039\n职员制\t187040\n杯子\t187041\nunserialize\t187042\n不见了\t187043\n魅蓝s6\t187044\n好急\t187045\n关节炎_\t187046\n红旗h5\t187047\n尾草\t187048\nExhibition\t187049\n相对人\t187050\n2009年4月\t187051\nUltraScale\t187052\n兴化400生活网\t187053\n补注\t187054\n庭院式\t187055\nfr4\t187056\nx11\t187057\n天机\t187058\n羽织\t187059\ngran\t187060\n安吉尔\t187061\n李国\t187062\n王敬\t187063\n小琴\t187064\n干部履历表\t187065\n吉林日报\t187066\n类比推理\t187067\n国际劳动节\t187068\n帝国时代:终极版\t187069\nsans\t187070\n舌战群儒\t187071\nExec\t187072\nnomo\t187073\n杨朔\t187074\n三星GALAXY\t187075\n磁化率\t187076\n甲壳虫\t187077\n硫酸钙\t187078\n北京凤凰网\t187079\n黑帝\t187080\n陈小旺\t187081\n2015年12月30日\t187082\n长春街\t187083\n6.8.1\t187084\n教练员\t187085\n内外部\t187086\nP站\t187087\n无颜之月\t187088\n暗病\t187089\ncinema4d\t187090\n曾国藩家训\t187091\n老友记\t187092\napizza\t187093\n粉红湖\t187094\n义乌购\t187095\n烟波\t187096\n农村金融网\t187097\n20行\t187098\n土豆泥\t187099\nresidency\t187100\n主备库\t187101\n海红\t187102\ncitrix\t187103\n纪念堂\t187104\n水杨酸甲酯\t187105\n蒂佳婷\t187106\n阿奇霉素分散片\t187107\n天地缘\t187108\nglut\t187109\n269号\t187110\n无义\t187111\n快打旋风\t187112\n合适\t187113\n核对\t187114\nph值\t187115\n20160123\t187116\nSoftpedia\t187117\n国家开放大学电大\t187118\nqtreeview\t187119\n毛戈平形象设计艺术学校\t187120\n毛细管\t187121\n转班\t187122\n为证\t187123\nmantle\t187124\n豆鞋\t187125\nBUN\t187126\n奥运圣火\t187127\n中国机器人网\t187128\nguilin\t187129\n上古卷轴5:天际特别版\t187130\n新增多\t187131\n整页\t187132\n科乐美\t187133\n邮电\t187134\n直指\t187135\njsrender\t187136\n天津泰达投资控股有限公司\t187137\n荒诞\t187138\n电子章\t187139\nsx1278\t187140\n刘海涛\t187141\n红藻\t187142\n葡萄牙队\t187143\n条条大路通罗马\t187144\n花精灵王\t187145\nstokes\t187146\nSuite\t187147\n张金涛\t187148\nwindows7_Window\t187149\n78.com\t187150\n北京税务局\t187151\n谢特\t187152\n倍捻机\t187153\n中国农业大学资源与环境学院\t187154\n坦克车\t187155\nWAV,CD\t187156\n宿迁日报社\t187157\n无忧车牌网\t187158\nPetrel\t187159\n单相思\t187160\n海默\t187161\nxammp\t187162\n500天\t187163\n刻录机\t187164\n巨臀\t187165\n天源迪科\t187166\n新闻传播学院\t187167\nwin
dbg\t187168\n攻守道\t187169\nTRESemmé炫\t187170\n博多豚骨拉面\t187171\n阿诗丹顿\t187172\n九派\t187173\n守信\t187174\n诺维奇\t187175\nJ罩杯\t187176\n湖南路\t187177\nExecutive\t187178\n惠州南\t187179\n融资融券业务\t187180\n133家\t187181\n锋芒毕露\t187182\n科磊\t187183\n超新星\t187184\n马鞍山经济技术开发区\t187185\n混同\t187186\n山海经异兽\t187187\n郴\t187188\n宝成\t187189\n结构\t187190\n充电架\t187191\n童谣\t187192\n扩张器\t187193\n成渝高速\t187194\nFittings\t187195\n神仙卷\t187196\n商贷转公积金\t187197\n四面\t187198\n创建\t187199\n坤泰胶囊\t187200\n庞统\t187201\n鼻孔\t187202\n毛鹃\t187203\n外村\t187204\n代替\t187205\n衣冠庙\t187206\n工房\t187207\n一代二代\t187208\n亳州职业技术学院\t187209\n醉枕\t187210\nstroe\t187211\n李志宏\t187212\nsmart论坛_汽车之家论坛\t187213\n重庆啤酒\t187214\n50年代\t187215\n吊儿郎当\t187216\n槿汐\t187217\n东航万里行\t187218\n滚动码\t187219\n想念\t187220\nVinyl\t187221\n甲壳\t187222\nhava\t187223\nCONCATENATE\t187224\n意涵\t187225\n警幻\t187226\n海贼王之\t187227\nThi\t187228\n疯女\t187229\n阅界书房_读书频道\t187230\n爱不单行\t187231\n大众文化\t187232\n嘎查村\t187233\nnetsnmp\t187234\n受降\t187235\n海夫\t187236\n智圣汤泉\t187237\n私募网\t187238\nIMDB\t187239\n洗衣皂\t187240\n吹填\t187241\necap\t187242\nSORT\t187243\n晴天霹雳\t187244\n国际酒店集团\t187245\n昆仑域\t187246\n粟粒\t187247\n陈式太极拳老架一路\t187248\n法兰城\t187249\n康强电子\t187250\n德邦股份\t187251\n四岁\t187252\nwocao\t187253\nETS\t187254\n唱头\t187255\n预应力锚索\t187256\n水镜\t187257\n一连串\t187258\nPHPMyadmin\t187259\nMinistry\t187260\nconstraint\t187261\n孩纸们\t187262\nST龙力\t187263\n皇汉\t187264\nBiologicals\t187265\n中微\t187266\nBTAVA\t187267\n沈阳城市学院\t187268\n重庆市发改委\t187269\n电源防雷器\t187270\n澄\t187271\n海马玩\t187272\n7v\t187273\nSCHOOL\t187274\ncology\t187275\n上学前\t187276\n我看完\t187277\nhp1050\t187278\n设\t187279\nvoting\t187280\n柳琴\t187281\n吸水量\t187282\n华纳娱乐\t187283\nXiong\t187284\n汪圆圆\t187285\nlck\t187286\n158家\t187287\nDoubt\t187288\n夜跑\t187289\n猫爸\t187290\n1114\t187291\nCar\t187292\n乐彼\t187293\n刑事律师\t187294\n中国建筑人才网\t187295\n3.95\t187296\n郑州市第一人民医院\t187297\nPlayer\t187298\n八根\t187299\n重庆南滨路\t187300\n吴敏\t187301\nbuka\t187302\n爱乐网\t187303\nmillion\t187304\n渔村篇\t187305\n陕西华特科技有限公司\t187306\nTampa\t187307\n汤素兰\t187308\n立窑\t187309\nWAN
NA\t187310\n手拉\t187311\n崔亮\t187312\n只有我不在的街道\t187313\n元芳\t187314\nCRF\t187315\n苦雨\t187316\nSAICMOTOR\t187317\n书文\t187318\nsmartconfig\t187319\n明十三陵\t187320\n球权\t187321\n上海大众\t187322\n公安\t187323\nTWICE\t187324\n人贩\t187325\n黄金100秒\t187326\n65K\t187327\nLate\t187328\n乐1S\t187329\n风王\t187330\n稳层\t187331\n秀堂\t187332\n尚赫\t187333\nshaper\t187334\n75t\t187335\nendings\t187336\n百度信誉\t187337\n苏会文\t187338\n破天荒\t187339\n棋魂\t187340\n灰姑娘的故事\t187341\n魔域口袋版\t187342\n問題\t187343\n心金魂银\t187344\n猴菇\t187345\n片羽\t187346\n厚朴\t187347\nW5500\t187348\n芭蕉\t187349\n2450M\t187350\n高环\t187351\n1328\t187352\n600503\t187353\n深圳市人居环境委员会\t187354\n977\t187355\n春风不度\t187356\n第11卷\t187357\n灵山梵宫\t187358\n赤峰百姓网\t187359\n雅鲁藏布江大峡谷\t187360\n满清禁宫奇案\t187361\n192.168.1.102\t187362\n返校\t187363\n二百个\t187364\n莳绘\t187365\n巷道\t187366\n四倍体\t187367\n写字机\t187368\n谷子\t187369\n快药\t187370\n数据咖\t187371\n驳岸\t187372\n骨传导耳机\t187373\n动画城\t187374\n新白蛇传\t187375\n化石燃料\t187376\n学前教育改革\t187377\n台头镇\t187378\n金霞经济开发区\t187379\n希妈\t187380\n元洲装饰\t187381\npgp\t187382\nstussy\t187383\n罗瑞\t187384\n看跌期权\t187385\n辽阳市教育局\t187386\n深圳市第一职业技术学校\t187387\n拉格朗\t187388\n北京分行\t187389\n斑美拉\t187390\n隐形门\t187391\n周波机\t187392\nmass\t187393\nooi\t187394\n阿特兹ATENZA\t187395\n杨开慧\t187396\n东坪山\t187397\n就业困难人员\t187398\n席卷\t187399\n山治\t187400\n防褥疮\t187401\n省残联\t187402\ntime_wait\t187403\n移位寄存器\t187404\n原语\t187405\n永宏\t187406\ndaikin\t187407\n用酒\t187408\n驱动装\t187409\n辛酉年\t187410\n巴氏杀菌机\t187411\nexcel表格链接\t187412\n7.x\t187413\n兰花展\t187414\n路金波\t187415\n环氧树脂\t187416\n波兹曼\t187417\nezcad\t187418\n1轴\t187419\nipadmini1\t187420\n华强电子世界\t187421\ndowebok\t187422\n田田\t187423\n笑疯\t187424\nflashplayer\t187425\n营业税\t187426\n酷狼\t187427\n老潼关\t187428\n勾画\t187429\nRockwell\t187430\nBoil\t187431\n风衣男\t187432\n叶梓萱\t187433\n20161026\t187434\n榨菜\t187435\n新公路\t187436\n红姨\t187437\n张外龙\t187438\n亨斯迈\t187439\n佳士科技\t187440\n广岛大学\t187441\n张文清\t187442\n动载\t187443\n李春燕\t187444\n洛克公园\t187445\n蒲葵\t187446\n鹿皮绒\t187447\nspk\t187448\n勺子\t187449\n吹牛皮\t187450\n华创证券有限责任公司\t187451\n襄城区\t187452\n利他林
\t187453\n成都经济技术开发区\t187454\nVariflight\t187455\n106号\t187456\n极佳\t187457\n总代理\t187458\nwww.hebjs.gov.cn/zhengcewenjian/tongbaogongshi/20\t187459\n码王\t187460\n权威房产网\t187461\n渔政\t187462\n服务篇\t187463\n新远\t187464\n试发\t187465\n香河家具城\t187466\nstrtol\t187467\n工伤假\t187468\n观山湖\t187469\n黄乙玲\t187470\n憨憨\t187471\n65R17\t187472\n公网对讲机\t187473\n273度\t187474\n卢克金团\t187475\ngarch\t187476\n无敌风火轮\t187477\nabcd\t187478\n新虹桥\t187479\n安居古镇\t187480\n副主任\t187481\n慧球科技\t187482\n贝比\t187483\ngalway\t187484\n细圆\t187485\n乐不思蜀\t187486\n文稿\t187487\n达拉\t187488\n刀剑神域第二季\t187489\n招商银行跨行\t187490\n徐汇\t187491\n690\t187492\n东方教育\t187493\n200mm\t187494\n格拉斯\t187495\ndilili\t187496\n暮云\t187497\n详勘\t187498\n臣\t187499\n横页\t187500\n兰陵王\t187501\n消纳\t187502\n株洲新闻网\t187503\nmax90\t187504\n宠翻天\t187505\n雷亚\t187506\n外侧\t187507\n文依\t187508\n餐厨垃圾管理办法\t187509\n老界岭\t187510\n如影随形\t187511\n小昕\t187512\n微爱豆电视版_微爱豆TV端高清\t187513\n上年\t187514\n排架柱\t187515\n职业规划书\t187516\n发报机\t187517\n博库网\t187518\n美韩\t187519\n死骑\t187520\n重庆南岸茶园新区\t187521\n怀化市委\t187522\n17厘米\t187523\n可想而知\t187524\n戒\t187525\nKirin\t187526\nlcd屏\t187527\n三经\t187528\n景上\t187529\n花泽类\t187530\n行政责任\t187531\n明明德\t187532\n万件\t187533\n围着\t187534\n不仁\t187535\n即期信用证\t187536\n桂聘人才网\t187537\n平面轴承\t187538\n加藤リナ\t187539\n包转发率\t187540\n赵合德\t187541\n关系代词\t187542\n20英尺\t187543\n筱山纪信\t187544\n东段\t187545\nSEVE\t187546\n武商广场\t187547\n好多久\t187548\nTask\t187549\nhtc\t187550\n士象\t187551\nGIORGIO\t187552\n鳇鱼\t187553\n出嫁女\t187554\n榔梨\t187555\n元宝版\t187556\n滑行道\t187557\n饭堂\t187558\n林先生\t187559\n评优\t187560\nintelij\t187561\n大自然家居\t187562\n太空旅行\t187563\n黄金大厦\t187564\n陕西师范大学\t187565\nxfeatures2d\t187566\nUTF-8\t187567\n电缆网\t187568\n张贵林\t187569\n反作用\t187570\n仙桃西\t187571\n和平鸽\t187572\n第8卷\t187573\nrepresent\t187574\n金榜起名网\t187575\n报审表\t187576\n石老人\t187577\n傅里叶变换算法\t187578\n20世纪60年代\t187579\n穷寇\t187580\n四川移动\t187581\nrefining\t187582\nGDI\t187583\n熏天\t187584\ns7lol\t187585\n血塞通分散片\t187586\n坠床\t187587\n同源词\t187588\n副社长\t187589\nchongqi\t187590\ndresses\t187591\n威哥\t187592\nhanlp\t187593\n地形
\t187594\n不公平\t187595\n批评性\t187596\n2020年度\t187597\n转台\t187598\n头面\t187599\n蜂王\t187600\n陪我看\t187601\nMIT\t187602\n不谙\t187603\n进水阀\t187604\n冷镦机\t187605\nWin2012\t187606\n李大为\t187607\n中国南方电网公司\t187608\n西安科技大学研究生院\t187609\n30MM\t187610\n店村\t187611\n女强\t187612\n古雷石化\t187613\n虹口公园\t187614\nsylvia\t187615\n罗马数字\t187616\n上实\t187617\n非深户\t187618\n激光秀\t187619\n菲比\t187620\n612\t187621\n女变男\t187622\nrevaluation\t187623\n军中\t187624\nmy_cool2007\t187625\n通过式抛丸机\t187626\nCBR600RR\t187627\n项目公司\t187628\n高泰明\t187629\n抄本\t187630\nSS501\t187631\n乡韵\t187632\n义务性\t187633\n新快网\t187634\n大肉\t187635\n向阳村\t187636\n虎牙妃妃\t187637\n级配\t187638\n纠纷率\t187639\n老丁\t187640\noccur\t187641\nxjc\t187642\n保温包\t187643\n12码\t187644\n5.55\t187645\n建武\t187646\n元好问\t187647\nfools\t187648\n娼妓\t187649\n7.\t187650\n宏正\t187651\n财教\t187652\n第十关\t187653\n刚刚\t187654\n机灵\t187655\n艾扬格\t187656\n中国科学院光电技术研究所\t187657\n冯琳\t187658\n广播电视节目制作经营许可证\t187659\n蒂亚戈\t187660\npalpay\t187661\n三体人\t187662\n松花江\t187663\nWin7/Win8.1\t187664\nフレ\t187665\n衢州\t187666\n锅炉\t187667\neic\t187668\n中间站\t187669\n平安银行股份有限公司\t187670\n小ck\t187671\n柳绿\t187672\n周浦医院\t187673\n过背\t187674\n四川省博物馆\t187675\n飞卢论坛\t187676\n30块\t187677\n穷追猛打\t187678\ngen8\t187679\n参政议政\t187680\n江苏康缘药业股份有限公司\t187681\n谷丙\t187682\n15L\t187683\n夏令营\t187684\ngsp\t187685\n杂面\t187686\n中国医疗保险网\t187687\n达州市中心医院\t187688\nJUMP\t187689\noptimizer\t187690\n奥刃\t187691\n梦圆飞天\t187692\n相面\t187693\n富春江路\t187694\n御龙\t187695\ntae\t187696\n中华民谣\t187697\n人民日报海外版\t187698\n鹤庆县\t187699\n陆一伟\t187700\n手动\t187701\n王盛\t187702\n4份\t187703\n曼谷航空\t187704\n运动力\t187705\n元坝\t187706\n生态观\t187707\n伊秀网\t187708\n2511\t187709\n二人\t187710\n穿孔机\t187711\n潢川在线\t187712\n热血无赖\t187713\nmeshlab\t187714\n三仙山\t187715\n二氯\t187716\n宝马x6\t187717\n有生\t187718\n判别式\t187719\n大生活\t187720\nnatalia\t187721\n小森林\t187722\n2017F1\t187723\nexx\t187724\n学术\t187725\n商规\t187726\n杏仁酸\t187727\n广州地铁9号线\t187728\nPS4\t187729\n凡尔赛条约\t187730\nIbis\t187731\n开棺\t187732\n子宫颈癌疫苗\t187733\n5.5kw\t187734\n中世纪2全面战争\t187735\n因数和倍数\t187736\nbrompton\t187737
\n西霞口野生动物园\t187738\n704\t187739\n伤感\t187740\nRoc\t187741\n学饮杯\t187742\n大气污染物排放标准\t187743\n都市笑口组\t187744\n贝塔系数\t187745\nudietoo\t187746\n思缘论坛\t187747\nUpdate\t187748\nzblog\t187749\n入团志愿书\t187750\n可可西\t187751\n花垣县\t187752\n板绘吧\t187753\n金石良缘\t187754\nprotected\t187755\n人造草坪\t187756\n家属院\t187757\n单尾\t187758\n轻友\t187759\n强档\t187760\nkev\t187761\nWTO\t187762\n3000名\t187763\n郑州宾馆\t187764\nvgg19\t187765\n昆明电视台\t187766\n影响期\t187767\nlft\t187768\n50亿\t187769\n上高\t187770\n潍坊路\t187771\nmorgen\t187772\n易事特\t187773\nLimits\t187774\n菲尔普斯\t187775\n团城山\t187776\n纷纷\t187777\nSnapshot\t187778\n物媒\t187779\n400斤\t187780\n尤利西斯\t187781\nc2cc中国化妆品网\t187782\nBORUTO\t187783\n4.10.3\t187784\n国际会议\t187785\n雯雯\t187786\n逍遥江山\t187787\ninterracial\t187788\n吞拿鱼\t187789\n日常用品\t187790\n未经营\t187791\nzeppelin\t187792\n九亿\t187793\n沙耶之歌\t187794\n气动泵\t187795\n珍兽\t187796\n布兰奇\t187797\n宜宾市第二人民医院\t187798\n宋辽\t187799\n西西里美丽传说\t187800\npand\t187801\nReina\t187802\n强片\t187803\n温江区\t187804\n校园宣讲会\t187805\n黄明诸葛亮\t187806\n602路\t187807\n双三\t187808\nAid\t187809\n金沙小学\t187810\n毒刺\t187811\n对曰\t187812\ndrupal\t187813\n白滨\t187814\n基性\t187815\nrsk\t187816\n鱼池\t187817\n老明锐\t187818\n太阳镜\t187819\n晶体\t187820\n渔女\t187821\n江花\t187822\n露水之爱\t187823\n王云鹏\t187824\n流瑜伽\t187825\n调歌\t187826\n杜鹃花\t187827\n增分\t187828\n北行\t187829\n卫斯理之支离人\t187830\n葡萄皮\t187831\n无形\t187832\n骨密度仪\t187833\n金地西沣公元\t187834\n地主婆\t187835\n第1辑\t187836\n陈正康\t187837\nook\t187838\nbcdboot\t187839\n白止\t187840\n金碧新城\t187841\n牛蒡茶\t187842\n2015.8\t187843\n20150112\t187844\n包底\t187845\n假文凭\t187846\n批复\t187847\n医改\t187848\n幕府将军2\t187849\n蚕妇\t187850\n西湖大道\t187851\nGTX660\t187852\n因式分解_\t187853\n老子传奇\t187854\n庞尊\t187855\n亚明\t187856\n慕尼黑机场\t187857\n搜狐\t187858\n途家社区\t187859\n千钧\t187860\n第6批\t187861\n红魔\t187862\n雷暴\t187863\nNation\t187864\n玩战\t187865\n玉米面\t187866\n官员\t187867\nFlavors\t187868\n微车\t187869\n麦地\t187870\n260ml\t187871\n吉他弹唱谱\t187872\n亲亲宝贝网\t187873\n筋包\t187874\n微泡\t187875\n赤豆\t187876\nX5\t187877\n图卡\t187878\n色撸撸\t187879\n侯杰\t187880\n火影忍者新时代\t187881\n发吐\t187882\ndx
\t187883\n上海海湾国家森林公园\t187884\n宝芝林中药网\t187885\n求带\t187886\n2107年\t187887\n魔飞\t187888\n不懂事\t187889\nrosimm\t187890\n不亮灯\t187891\n指导语\t187892\n留学生公寓网\t187893\n368号\t187894\n第三|\t187895\n泸州\t187896\n罗霈颖\t187897\nt100\t187898\ncpy\t187899\nclyde\t187900\n薛洋\t187901\n重庆市自来水有限公司\t187902\n特稿\t187903\nIP地址数据库\t187904\n吕小军\t187905\n分期乐\t187906\n70斤\t187907\n698\t187908\n奈良\t187909\n地雷战\t187910\n杭州人才网\t187911\n罗婷\t187912\nquarterly\t187913\n速通\t187914\n肥婆\t187915\n空无一物\t187916\n垂涎欲滴\t187917\n股票期权激励计划\t187918\n差频\t187919\n1940\t187920\nJUNIT\t187921\n川茶\t187922\nsdr\t187923\ncpe\t187924\n湖南省财政厅\t187925\n第三军医大学\t187926\n南山湖\t187927\n作品\t187928\n余村\t187929\n深圳国际仲裁院\t187930\n谈社\t187931\n麻点\t187932\n海军总医院\t187933\n值此\t187934\n戴叔伦\t187935\n美术学\t187936\n电视显示屏\t187937\n7周岁\t187938\n58速运\t187939\nsqlplus\t187940\n上海长滩\t187941\n东疆港\t187942\n鲁本\t187943\nHydration\t187944\n怀宁县\t187945\ns3\t187946\nVBA\t187947\n城市居民\t187948\n僵约\t187949\nwidows\t187950\n小野麻里亚\t187951\n失忆\t187952\n输电线路\t187953\n2015年1月\t187954\n厦门华厦职业学院\t187955\n热血侠\t187956\n金刚石线\t187957\nMetropolis\t187958\nLENS\t187959\nweibull\t187960\n七点半\t187961\n10日游\t187962\n晨光熹微\t187963\n亿万家\t187964\n设点\t187965\n宋埠\t187966\n小儿支气管肺炎\t187967\ndarphin\t187968\nRobotics\t187969\n和好\t187970\n解放日\t187971\nminio\t187972\n钢制文件柜\t187973\nzumba\t187974\nImporter\t187975\n琴学\t187976\n破邪传\t187977\n动网论坛\t187978\n真像\t187979\n环保\t187980\nstax\t187981\n鹏博士\t187982\n网厅\t187983\n休息椅\t187984\n阿里移动\t187985\nWORDPRESS\t187986\n海龙湾\t187987\n460天\t187988\n7道\t187989\n党争\t187990\n攻玉\t187991\n纪检组\t187992\n招标投标网\t187993\nbonpoint\t187994\n牵机\t187995\n第40号\t187996\ndiy吧\t187997\n79万\t187998\n175米\t187999\n20160818\t188000\navn\t188001\n广州省站\t188002\n新胜达\t188003\nTowers\t188004\n烟道式\t188005\n航班延误险\t188006\n泉州动车站\t188007\n流程题\t188008\n旗面\t188009\n静雨\t188010\nqq西游\t188011\nSHENZHEN\t188012\n半票\t188013\nsettle\t188014\n淘宝优惠卷\t188015\n地方时\t188016\n黄斑水肿\t188017\n航区\t188018\n10CD\t188019\na1398\t188020\n铜仁市人民政府\t188021\n期货配资\t188022\nPX4\t188023\n忘形\t188024\n兰香缘\t188
025\n创变\t188026\n涂塑复合钢管\t188027\n家教高级课程\t188028\nwbc\t188029\n陈志煜\t188030\nibuki\t188031\nperfection\t188032\npop\t188033\n那张脸\t188034\n私贷\t188035\n东国大学\t188036\nJING\t188037\ninstaller软件包\t188038\nmacg\t188039\nVOL.3\t188040\n食品药品监管总局\t188041\nrepay\t188042\n水岸\t188043\n福伦达\t188044\n磁力链接网\t188045\n海战世界\t188046\nDNSmasq\t188047\n屠幼\t188048\n装载机\t188049\nsmic\t188050\n液批\t188051\n10瓦\t188052\n欧束\t188053\neslpod\t188054\n沈氏玄空学\t188055\n优希\t188056\n曾轶可\t188057\n海南省文昌市人民政府\t188058\nㄕ\t188059\n正误\t188060\n金泽镇\t188061\n轻微交通事故\t188062\n懵懵懂懂\t188063\nScrolls\t188064\n记入\t188065\ntexlive\t188066\nHDWiki\t188067\n国家人力资源和社会保障部\t188068\n西北大学现代学院\t188069\nMENG\t188070\n18栋\t188071\n搜学网\t188072\n滨江西\t188073\n零轴\t188074\n领誉\t188075\n海泉\t188076\n淘客联盟\t188077\n国际村\t188078\nANTLR\t188079\n锤炼\t188080\n音学\t188081\nDVI\t188082\n电动扫地机\t188083\n波推\t188084\nKunming\t188085\n13度\t188086\n追债\t188087\n李新亮\t188088\n黄光亮\t188089\n100offer\t188090\n腿模\t188091\n联想小新310\t188092\n新荣区\t188093\n咽喉处\t188094\n哲夫\t188095\n支付令\t188096\n吊脚楼\t188097\n兴业\t188098\n河源置业网\t188099\n第5号\t188100\nout在线翻译\t188101\n儿童孤独症\t188102\n2018年2月27日\t188103\n公共秩序\t188104\n塞克\t188105\n第89章\t188106\n2万套\t188107\n脸盆\t188108\nLIB\t188109\n电信业\t188110\n农发行\t188111\nstyles\t188112\nE212\t188113\n寰\t188114\n熙康\t188115\n锐讯\t188116\n五千亿\t188117\n答辩书\t188118\n回执照片\t188119\n760\t188120\nautocad2012\t188121\n绚濑绘里\t188122\nhwi\t188123\n布谷\t188124\n邮券\t188125\n易经\t188126\n腾讯微博\t188127\n劳社部\t188128\n孙志强\t188129\nOverlord不死者之王\t188130\n小楷\t188131\n教条\t188132\n刘嘉亮\t188133\n非处方\t188134\n子宫内膜炎\t188135\n金华网\t188136\n透明层\t188137\nzotac\t188138\njolimark\t188139\n丁佳慧\t188140\n博格巴\t188141\n音乐榜\t188142\n住房公积金联名卡\t188143\n老卢\t188144\n安德罗妮\t188145\n国窖1573\t188146\n蒙阳\t188147\n三里屯\t188148\n字贴\t188149\n北方人\t188150\n物业化\t188151\n双程\t188152\n10平米\t188153\n群辉\t188154\nlambda函数\t188155\n易优\t188156\n实况足球2016\t188157\n教学案\t188158\n奢美\t188159\nsqlit\t188160\n力健\t188161\nlpr\t188162\n折板\t188163\n审阅者\t188164\n式量\t188165\namelia\t188166\n烈士们\t188167\n心迹\t188168\
n求偿\t188169\n汉腾x5\t188170\npharmacology\t188171\n外研\t188172\n1470\t188173\nAppBarLayout\t188174\n政论片\t188175\n我也想\t188176\n江南新天地\t188177\nbabytree\t188178\nx205ta\t188179\n兽兽门\t188180\n塘头\t188181\n成都糖酒会\t188182\n巨力\t188183\n罗马帝国衰亡史\t188184\n激浊扬清\t188185\n东京巨蛋\t188186\n002594\t188187\nps6\t188188\n人数\t188189\n险峻\t188190\n夏娃\t188191\n管板式\t188192\n喜庆十九大\t188193\nf587\t188194\n连城\t188195\nwebpack3\t188196\n15on\t188197\n17#\t188198\n航程\t188199\n努比亚手机:#努比亚红魔\t188200\n主权\t188201\n逸翠庄园\t188202\nS7\t188203\n换尿布\t188204\nMKV/540p\t188205\n赣榆区\t188206\n简例\t188207\n昆明康辉旅行社\t188208\n好用\t188209\n纳博科夫\t188210\n非法\t188211\n保税区汽车网\t188212\n歌唱祖国\t188213\n小三阳\t188214\nShirts\t188215\n韶羞\t188216\n玩咖\t188217\n5.5V\t188218\n跑跑卡丁\t188219\n长辈\t188220\n弱方\t188221\n上海美团打车\t188222\n无量纲化\t188223\n北京分所\t188224\nArcpy\t188225\n湖北工会网\t188226\n岩崎千鹤\t188227\nhcs\t188228\ngn125\t188229\n25年\t188230\nbixby\t188231\n2018—2020年\t188232\nHS\t188233\nPPT幻灯片\t188234\n漏报\t188235\n火鹰\t188236\n克里斯·海姆斯沃斯\t188237\n列入\t188238\n熊出没之探险日记\t188239\nplay文\t188240\nrealforce\t188241\n1305\t188242\n16K\t188243\n吉祥如意\t188244\n董小宛\t188245\n表番\t188246\n春色无边\t188247\n冰露\t188248\n腾讯文档\t188249\n白龙口\t188250\n星恒教育\t188251\n酱香型白酒\t188252\n宇星\t188253\n细长\t188254\n砂与海之歌\t188255\n蔡琰\t188256\n亲恩\t188257\nchemicals\t188258\n负者\t188259\n布病\t188260\nwindows_Wi\t188261\n川大智胜\t188262\n开灯\t188263\n高熵合金\t188264\n工作车\t188265\n成都市公安局出入境管理局\t188266\n中国人民政治协商会议\t188267\nguide\t188268\nBG文\t188269\nsdi\t188270\n菲亚\t188271\n上海新城\t188272\nv1.1.3\t188273\nWatching\t188274\n蒂安希\t188275\n带式输送机\t188276\n五叶神\t188277\n花羊\t188278\ncesiumjs\t188279\nnetty服务器\t188280\n盛邦\t188281\n1小时左右\t188282\nintegrated\t188283\n嘉宝米粉\t188284\n靖海\t188285\n3月28号\t188286\n古文版\t188287\nmoosefs\t188288\n断金\t188289\n万分位\t188290\nGQ男士网\t188291\n连身裙\t188292\nsingleTask\t188293\n保护屏\t188294\n▂\t188295\n江干\t188296\n电信日租\t188297\n冴君麻衣子\t188298\n曹杨八村\t188299\n方舟水手吧\t188300\n20160301\t188301\n春晖花园\t188302\n病状\t188303\n新盘\t188304\nBD-720P-RMVB\t188305\n清肝\t188306\n蒙古草原\t188307\n一网
天下\t188308\n税源\t188309\n南派三叔\t188310\n朝鲜战争\t188311\n崇安区\t188312\n网-天工网\t188313\n万菱广场\t188314\n施蛰存\t188315\n3DES\t188316\n石渠宝笈\t188317\n陕西学前师范学院\t188318\n12v2a\t188319\n李永平\t188320\n轴比\t188321\n新帖\t188322\n早苗\t188323\n好奇心日报\t188324\n久战\t188325\n围海股份\t188326\nupsert\t188327\n回避\t188328\n第59集\t188329\n箱庭\t188330\nJohns\t188331\n抗干扰\t188332\n加隆德\t188333\n花築\t188334\n沉香屑\t188335\n大腿骨\t188336\n长谈\t188337\n生物法\t188338\n南航基地\t188339\ndisplay\t188340\n350dpi\t188341\n2018年5月起\t188342\n16卷\t188343\n北京市朝阳区\t188344\n柳下挥\t188345\nBree\t188346\nclj\t188347\n华东野战军\t188348\nCote\t188349\n素类\t188350\n大佛\t188351\n林果业\t188352\n片儿川\t188353\nGB2760\t188354\nsaveOrUpdate\t188355\n搜狗公司\t188356\nJBOD\t188357\n优才\t188358\n罗英石\t188359\n索尼镜头\t188360\n左侧\t188361\n希岸酒店\t188362\n宁波第二医院\t188363\nIVMS-4200\t188364\nN2N\t188365\n仁和村\t188366\n魔兽争霸冰封王座\t188367\nquarter\t188368\n致命弯道7\t188369\n早期症状\t188370\n人民邮电\t188371\nbid\t188372\n海禁\t188373\n口业\t188374\n601985\t188375\n永硕\t188376\n/&#160\t188377\n何涛\t188378\n螺旋杆菌\t188379\n荣列\t188380\n三镇\t188381\n恶魔之吻\t188382\n筑龙教育\t188383\n八卦牌\t188384\nCVE\t188385\nbt\t188386\n一个两个\t188387\n珍珠花\t188388\n一服\t188389\n小米公交\t188390\n雅丹\t188391\n陕西省人大\t188392\n省局\t188393\n双驱\t188394\nfierce\t188395\nszy\t188396\n第242集\t188397\n视觉稿\t188398\n子物体\t188399\n王鹏飞\t188400\n螺纹套\t188401\n冬凌草片\t188402\n微观世界\t188403\n余姚论坛\t188404\nDPC\t188405\n李和平\t188406\nVTR\t188407\n上层建筑\t188408\n艾欧史密斯\t188409\n公安部门\t188410\n监察委员会\t188411\n动物源性\t188412\n丁未\t188413\nbcaa\t188414\n造舰\t188415\n广州恒大\t188416\n两分米\t188417\n勇敢者游戏2\t188418\nrabitmq\t188419\nwind\t188420\nwick\t188421\n白改黑\t188422\n近100年\t188423\ncommend\t188424\n博西华\t188425\n必中\t188426\n17mm\t188427\n登鹳雀楼\t188428\n赵世权\t188429\n580万\t188430\n金山区\t188431\n于仁\t188432\n荣耀典韦\t188433\nsubmissions\t188434\n业主\t188435\n氢溴酸西酞普兰片\t188436\n孔院\t188437\n宏图三胞\t188438\n100平米\t188439\n福润\t188440\n张培基\t188441\n机动车登记证\t188442\nby子句\t188443\n建联\t188444\n230ore\t188445\nKIS标准版\t188446\n躲不过\t188447\nXscript\t188448\n孙悟空三打白骨精\t188449\n陆锦花\t188450\n柠檬烯\t1884
51\n孵蛋\t188452\n64分\t188453\n柳惜音\t188454\nBlind\t188455\n科威特\t188456\n处世\t188457\n力度\t188458\n国家教委\t188459\n启道\t188460\n893\t188461\n魔兽真三国无双\t188462\ntechne\t188463\n药物治疗学\t188464\nWyndham\t188465\nsubmarine\t188466\n养生经\t188467\n医师证\t188468\n对公账户汇款\t188469\nKEY\t188470\n胸牌\t188471\n帝豪论坛_汽车之家论坛\t188472\n0208\t188473\n任家萱\t188474\n雪珂\t188475\n杭埠镇\t188476\n航天币\t188477\n蓝蓝路\t188478\n新市区\t188479\n星尚网\t188480\n#墓王之王#皇子不敌求凰夫人\t188481\nBaseAdapter\t188482\n不知火舞公园\t188483\nGO\t188484\n张五\t188485\nBoblim\t188486\n摄影版\t188487\n赵亚夫\t188488\n金光纸业\t188489\n集装箱房\t188490\n12864\t188491\n南沙湾\t188492\n三湖鱼\t188493\n古巴比伦\t188494\n民谚\t188495\nv3ma\t188496\n地性\t188497\n感情线\t188498\n20174月\t188499\n绿地光谷中心城\t188500\n倒背\t188501\n企谷\t188502\n裂解\t188503\n函数式\t188504\n土著\t188505\n能耐\t188506\ngordon\t188507\n瑞琪\t188508\n茅庐\t188509\n同沙水库\t188510\nCling\t188511\n猫咪\t188512\n兴园\t188513\n9.99\t188514\n地毯装修|一起网\t188515\n奇招\t188516\n沃尔沃S90\t188517\n斗牛犬\t188518\n美图软件\t188519\n第8轮\t188520\n45平方\t188521\n巧言令\t188522\n唐德\t188523\n锐动\t188524\n西湖杯\t188525\nqwebview\t188526\n赵建军\t188527\n过级\t188528\n极米h1s\t188529\n英武\t188530\n洛神赋\t188531\n冰蚕\t188532\n色味\t188533\nWiring\t188534\n蜀黍\t188535\n南滨\t188536\n北京电子\t188537\n李辰\t188538\nFeikeBT\t188539\nQDialog\t188540\n暗强\t188541\n邓文迪\t188542\n寒沙\t188543\n传进\t188544\n岗位\t188545\n固色\t188546\n德玛\t188547\nlinFen\t188548\n新闻办\t188549\n透平机\t188550\n旱作\t188551\n服务化\t188552\n湖广会馆\t188553\n手套\t188554\n恋活\t188555\n白裤\t188556\n捂盘\t188557\n庙\t188558\n7.3.5PTR\t188559\n橙戒\t188560\n三国战记风云再起\t188561\n二炮手\t188562\n现场\t188563\n快_寻医问药网\t188564\n沥青拌合站\t188565\ncr2032\t188566\n姿\t188567\noffce\t188568\n供体\t188569\nM252dw\t188570\nServlet过滤器\t188571\n胎毛\t188572\n正大综艺\t188573\nWildto\t188574\n1218首\t188575\n校志\t188576\nDal\t188577\n丰乐镇\t188578\n胸科医院\t188579\n黄翔\t188580\n五彩浅\t188581\n惆怅客\t188582\n普板\t188583\n科华路\t188584\n科莱恩\t188585\n大云镇\t188586\n万有引力定律\t188587\n诗序\t188588\n追格\t188589\nDorian\t188590\n限制性内切酶\t188591\n公称直径对照表\t188592\n计扣\t188593\n一第一\t188594\nccl4\t188595\n时机\t188596\n南海
农商银行\t188597\nJacobi\t188598\n特材\t188599\nass\t188600\n磅值\t188601\n沈阳军区\t188602\nBanner\t188603\n木墩\t188604\n恒邦\t188605\nmyclub\t188606\n十六种\t188607\n冬天里的一把火\t188608\n微话\t188609\n成都军大医院\t188610\nNo.10\t188611\n王霄\t188612\n郭某某\t188613\n坐怀\t188614\n名都城\t188615\n新制度主义\t188616\n倒腾\t188617\ndey\t188618\npowers\t188619\n费恩历险记\t188620\n梁灿彬\t188621\n多特蒙德\t188622\nsel\t188623\n三联\t188624\n召陵\t188625\n伪类\t188626\n前江工业园\t188627\n成贵高铁\t188628\n新农\t188629\n冻土\t188630\n无限币\t188631\n李太林\t188632\n管理服务器\t188633\n逾越\t188634\nc300\t188635\n过烧\t188636\n岛礁\t188637\nalert、confirm\t188638\n橙黄色\t188639\ncilicili\t188640\n100次\t188641\n境章\t188642\n无人性\t188643\n手鼓\t188644\n孙华\t188645\nAVMOO\t188646\n镇业\t188647\n11路\t188648\nRecyclerView\t188649\n采菊东篱下\t188650\n九里晴川\t188651\n20160119\t188652\n薄荷茶\t188653\nP5\t188654\nadk\t188655\n检疫\t188656\n催缴\t188657\n壮士\t188658\n中国文字博物馆\t188659\n12首\t188660\n张志李兰迪\t188661\n四川路\t188662\n生化人\t188663\n天团\t188664\n树瘤\t188665\n欠钱\t188666\n早乙女露依\t188667\ncypher\t188668\nyesterday\t188669\n雌株\t188670\nobjectarx\t188671\nJulia\t188672\n冥妻挚爱\t188673\n国土资源报\t188674\n郑希怡\t188675\nRVS\t188676\n长三角城市群\t188677\nExcel应用大全\t188678\n大不大\t188679\n聚脂\t188680\nwsad\t188681\n大清银币\t188682\n微软Sculpt\t188683\n登乐\t188684\nmx6\t188685\n新代\t188686\nkeysight\t188687\ntoget\t188688\nhla\t188689\n乐读网\t188690\n林奕\t188691\n倍速播放器\t188692\n神队\t188693\n畅聊\t188694\n绿豆汤\t188695\nTerraria\t188696\n2回\t188697\n32周\t188698\n六年级数学下册\t188699\n舒润\t188700\n福克\t188701\njzt\t188702\n奖池\t188703\npycuda\t188704\nslate\t188705\n红山街道\t188706\n地下城与\t188707\n第2条\t188708\n分机\t188709\nra3\t188710\n美格\t188711\n麦斯\t188712\n沪南\t188713\nMOS86\t188714\n20db\t188715\n制造技\t188716\n马氏链\t188717\n04年\t188718\n温州公交查询网\t188719\n江北快速路\t188720\n破解机\t188721\n40美元\t188722\n热拉\t188723\n体考\t188724\n徐工\t188725\n蓝思科技股份有限公司\t188726\n罗之豪\t188727\n储液器\t188728\n透气型\t188729\n信捷电气\t188730\nrundll32.exe\t188731\n0901\t188732\n拓疣\t188733\n库尔德\t188734\n防夹手\t188735\n三四郎\t188736\n清明果\t188737\n王猛\t188738\nMAMAMOO\t188739\nshould\t188740\n尚冰
冰\t188741\n北青网\t188742\n静物画\t188743\nsuccession\t188744\noder\t188745\n烙馍\t188746\nNaruto\t188747\n拉饵盘\t188748\ntefl\t188749\n语义\t188750\n滴滴专\t188751\nCURD\t188752\n东北国际医院\t188753\n融创地产\t188754\n各有所\t188755\n动圈式\t188756\n毛人\t188757\n院民\t188758\n云南大学图书馆\t188759\n烧焦\t188760\n分布式文件系统\t188761\n叶腊石\t188762\n0219\t188763\n马克思主义政党\t188764\n排斥\t188765\nlumix\t188766\nR12\t188767\n同片\t188768\n通言\t188769\n闽南人\t188770\n铁王\t188771\n三十晚上\t188772\n湿胸\t188773\n智能网联汽车道路测试管理规范\t188774\n双拥工作\t188775\n沙具\t188776\n行业英语\t188777\n梅隆\t188778\nsolarzoom光伏太阳能网\t188779\nintroduced\t188780\nOOXX无翼鸟\t188781\ntcr\t188782\n赵艳\t188783\n喵赛克\t188784\n上不上\t188785\n北京优衣库\t188786\n深圳欧菲光科技股份有限公司\t188787\n乡村四月\t188788\n煮面炉\t188789\n损坏\t188790\n层析法\t188791\nvim\t188792\n白沙街道\t188793\n塔河县\t188794\n缝隙式\t188795\ndct\t188796\nAntisHsu\t188797\n枸橼酸西地那非片\t188798\nmechanism\t188799\n杨亚洲\t188800\n新疆农村信用社\t188801\n陈鸣\t188802\n配乐版\t188803\n222\t188804\n大功率三极管\t188805\n多拨\t188806\nvdat\t188807\n德谷\t188808\n强制爱\t188809\n寻人大师\t188810\n茅坪镇\t188811\n熟词\t188812\nvega56\t188813\n御景蓝湾\t188814\n岗顶\t188815\n俄语系\t188816\nginza\t188817\n主角机\t188818\n马鞍山市住房和城乡建设委员会\t188819\n主题酒店\t188820\n丹武至尊\t188821\n喜悦号\t188822\n24秒\t188823\n2016.3\t188824\n国网安徽省电力有限公司\t188825\n卤化丁基橡胶\t188826\nTelstra\t188827\n上海有机所\t188828\nPreservation\t188829\n信号塔\t188830\n规矩\t188831\n选修课\t188832\n海贼岛\t188833\n寿险\t188834\n软泥\t188835\nbarely\t188836\n健伍\t188837\n简支梁\t188838\n张晟\t188839\n相泽\t188840\n魔蛋\t188841\n中山陵景区\t188842\n天津市科委\t188843\n小打小闹\t188844\n白居万茜\t188845\nindented\t188846\n挎包\t188847\n门铰\t188848\n彭山区\t188849\n小灶\t188850\n被砸\t188851\n金甲\t188852\n南溪区\t188853\n怪力\t188854\n插脚\t188855\n下岗\t188856\n沐王府\t188857\n观鸟镜\t188858\nWindows系\t188859\n八院\t188860\nretina\t188861\n松花湖\t188862\n蓝月广告\t188863\n外贴式\t188864\n空字\t188865\n爱站\t188866\n尊容\t188867\n十梓街\t188868\n阳江市\t188869\n顺德区行政服务中心\t188870\n联体\t188871\n反序列化\t188872\n晚年\t188873\n大单\t188874\n广西壮族自治区公安厅\t188875\n泰山医学院\t188876\nhaccp\t188877\nrazer\t188878\n本级\t188879\n新谱\t188880\n捷运\t188881\n雪朗峰\t188882\n粉
膏\t188883\n隆江\t188884\n平结\t188885\nOFDM\t188886\n大架\t188887\n君悦豪庭\t188888\n昭阳区\t188889\n存储函数\t188890\n校联考\t188891\nTiger\t188892\ne学\t188893\n哭喊\t188894\n怪物猎人世界盾斧\t188895\n总手\t188896\n宿舍区\t188897\n赶路\t188898\n慕少艾\t188899\n安安\t188900\n四美图\t188901\nHD598\t188902\n行付\t188903\nDATEDIFF\t188904\n陈继儒\t188905\n彩票业\t188906\n焊接\t188907\n凉拌面\t188908\n赠本\t188909\nmtn\t188910\n中农大\t188911\ndbforge\t188912\n12#\t188913\n戏曲影视剧场\t188914\n牛股\t188915\nUnarchiver\t188916\n加人\t188917\ndeadlines\t188918\n血带\t188919\nfbd\t188920\n快进\t188921\nDOOM4\t188922\n实验者\t188923\n恩慈\t188924\n康胶囊\t188925\n梧田街道\t188926\n瓦尔哈拉\t188927\njiating\t188928\n外腔\t188929\n邰丽华\t188930\n根表\t188931\ntidyplates\t188932\n180平方米\t188933\n外治法\t188934\n复旦大学上海医学院\t188935\n无迹\t188936\n二台\t188937\n以防\t188938\nuss\t188939\n樊刚\t188940\nExtensions\t188941\n内文\t188942\n切大\t188943\n丙氨酸\t188944\n灵丹妙药\t188945\n耳元\t188946\n衬片\t188947\n闽东\t188948\n神奇宝贝特别篇\t188949\n三眼桥\t188950\nndd\t188951\n派遣函\t188952\n灰灰菜\t188953\n北京公交查询网\t188954\nAutomate\t188955\n苗疆\t188956\n养老保险网\t188957\n绩效\t188958\nhpa\t188959\n脱式计算_\t188960\n犀浦站\t188961\n忒修斯之船\t188962\n功底\t188963\n仙鹤神针\t188964\n阿联酋航空公司\t188965\ncol-xs\t188966\n董导\t188967\n雪云\t188968\n联合分布函数\t188969\n1913年\t188970\n得天独厚\t188971\n墓碑\t188972\n五旬\t188973\n对答如流\t188974\n挂钩\t188975\npunish\t188976\n食育\t188977\n中东呼吸综合征\t188978\n0.3%\t188979\n清颜堂\t188980\n薛婧\t188981\n2017年9月6日\t188982\n轻音\t188983\n富家公子\t188984\n600美元\t188985\n铅丝笼\t188986\n河南大学出版社\t188987\n悬挂\t188988\n西周分封制\t188989\nwarded\t188990\n老百姓大药房连锁股份有限公司\t188991\nsan值\t188992\nAngry\t188993\n郭象\t188994\n银珠\t188995\n战地3\t188996\n野食\t188997\n城市大厦\t188998\nv5r21\t188999\n荡气\t189000\n病毒感染\t189001\n白夜玲珑\t189002\n御魂\t189003\n奥南\t189004\nUUID\t189005\n定向\t189006\n韵达股份\t189007\n芬太尼\t189008\n中小学生人格教育与学习能力\t189009\n淘宝店\t189010\n销售方\t189011\n青岛国际啤酒节\t189012\n面纸\t189013\ndnf黑暗武士吧\t189014\n宏信证券\t189015\ns905\t189016\n吴晓东\t189017\n野区\t189018\n90g\t189019\n老板电器\t189020\nthk\t189021\n螺带\t189022\n津城\t189023\nVisa4UK\t189024\n塑形镜\t189025\n起子机\t189026\n紧绷\t
189027\nLabVIEW\t189028\n落马洲\t189029\n几个\t189030\n2504\t189031\n冥域\t189032\n手机支付宝\t189033\n27英寸\t189034\n泸沽湖景区\t189035\nCaring\t189036\n时人\t189037\n赵晓明\t189038\n美校\t189039\n圣戈班\t189040\n大团镇\t189041\n潘素\t189042\n14段\t189043\n45平\t189044\n移动平均\t189045\n囚绿记\t189046\nicey\t189047\n载调\t189048\n爱迪奥特曼\t189049\n回中\t189050\n滴水穿石\t189051\n路娜\t189052\n足戒\t189053\n希冀\t189054\n达因\t189055\n医博会\t189056\n体例\t189057\n会议费\t189058\n杀劫\t189059\n艾德克斯\t189060\n天天链N1\t189061\n都市偷心龙爪手\t189062\nabig\t189063\n始\t189064\n部照\t189065\n绕轴\t189066\n区纪委\t189067\n洋牡丹\t189068\n山东能源临沂矿业集团有限责任公司\t189069\n中国山东政府\t189070\n小产权房网\t189071\n淫巧\t189072\nTAS\t189073\np5\t189074\n0771\t189075\n刘艺\t189076\n中国招商银行\t189077\nK8s\t189078\nMonitor\t189079\n易经六十四卦\t189080\n光风霁月\t189081\nchristie\t189082\nsogo\t189083\n170级\t189084\n斜桥镇\t189085\n裤腰\t189086\n扫毒\t189087\n178CSGO\t189088\n金泉网\t189089\n光伏并网逆变器\t189090\n东方幼儿园\t189091\n孤立性\t189092\n38.5度\t189093\n布厂\t189094\n10遍\t189095\nwfw\t189096\n尿布疹\t189097\n射频模块\t189098\nnstimer\t189099\n预组\t189100\n李铁柱\t189101\n珠海华发实业股份有限公司\t189102\ntheWalker\t189103\ndl580\t189104\n64916642\t189105\n阻挡\t189106\npotplayer\t189107\nwoider\t189108\ndownxp\t189109\n最搞笑\t189110\n杨汊湖\t189111\n追梦赤子心\t189112\n长葛市\t189113\n山狼\t189114\n红杜鹃\t189115\n生命册\t189116\n主修\t189117\n德罗索\t189118\n大里\t189119\nstair\t189120\n63个\t189121\n厉以宁\t189122\n标准上工标网\t189123\nairforce\t189124\n油臂\t189125\n花房姑娘\t189126\n完场\t189127\n外方\t189128\n0.4mm\t189129\n红瑞木\t189130\nGossip\t189131\n节省\t189132\n台州科技职业学院\t189133\n少数民族预科班\t189134\n北京大学继续教育学院\t189135\n息屏\t189136\nFilms\t189137\nKpopstar\t189138\n9卷\t189139\n社工库\t189140\n金稻\t189141\nV4000\t189142\n峰哥\t189143\nsmiles\t189144\n户口准迁证\t189145\n雅园\t189146\n帆板\t189147\n1000多元\t189148\nREPLAY\t189149\nironman\t189150\nichartjs\t189151\n未付\t189152\n指纹版\t189153\nhepa\t189154\nnaim\t189155\n火电厂\t189156\n迷你版/记账王\t189157\n幸存\t189158\n明路\t189159\nNekopara\t189160\n半裸\t189161\n资本公积转增资本\t189162\n中国商飞\t189163\n家居\t189164\n日向雏田\t189165\n黑洞\t189166\n五湖招聘会网\t189167\n蒋氏\t189168\n谜男\t18
9169\n美时美刻\t189170\n橡胶树\t189171\n歌尔股份有限公司\t189172\n时狱篇\t189173\n百分之3\t189174\n蚕食\t189175\n爱问问\t189176\n第60条\t189177\n自变\t189178\n甲壳虫乐队\t189179\n石墨矿\t189180\n叶梓潼\t189181\n蓄\t189182\n酷冷\t189183\n香金\t189184\n南昌市第三医院\t189185\n失宠王妃之结缘\t189186\n临床助理医师考试\t189187\n吃饭时\t189188\n戴雷\t189189\n无线路由\t189190\n1.4\t189191\n田野\t189192\nIngram\t189193\nAvery\t189194\nLucida\t189195\n夕又米\t189196\n不期\t189197\n枇杷蜜\t189198\nToner\t189199\n入珠\t189200\n3x-2\t189201\n女同片\t189202\n寻情记\t189203\n贬义\t189204\n大猫网\t189205\n困想\t189206\n新坡镇\t189207\n米亚基\t189208\nfayuan\t189209\n弹药架\t189210\n主城区\t189211\n杨圣亮\t189212\n违法\t189213\n桑塔宝马x5\t189214\n刀尖\t189215\n鲁班百科\t189216\nASP.NET\t189217\n促改\t189218\nAnitama\t189219\n库·丘林\t189220\n阿斯麦\t189221\nV们\t189222\n黄冈日报\t189223\n鲜颜\t189224\n雷蛇Razer\t189225\nC语言库函数\t189226\nIllustrator\t189227\n英得尔\t189228\n20160731\t189229\n波尔多庄园\t189230\n10.1021\t189231\n人民军\t189232\n大赏\t189233\n费斯托\t189234\n清风DJ音乐网\t189235\ntango\t189236\n乐清市教育局\t189237\n曹鹏\t189238\n北部地区\t189239\n2015年10月14日\t189240\n数码单\t189241\n百科瘦身_美体_太平洋时尚网\t189242\n信义坊\t189243\njrxml\t189244\n高度尺\t189245\n清蒸鳕鱼\t189246\n肇事罪\t189247\n中哲\t189248\n登良\t189249\n金钱豹\t189250\n福建广电网络集团股份有限公司\t189251\n唐山路南区\t189252\n交流障碍症\t189253\nBSM\t189254\n安徽省发展和改革委员会\t189255\n11.6英寸\t189256\nxms\t189257\n常山新闻网\t189258\n吴一龙\t189259\n续缘\t189260\n金立方\t189261\nspringmvc+mybatis\t189262\n李静\t189263\n坤元\t189264\n温州便民网\t189265\nwifi信道\t189266\n微电子\t189267\nq宠\t189268\nsheraton\t189269\n糖料\t189270\n火场\t189271\n西院\t189272\n灭菌锅\t189273\n鸡贼\t189274\nPentagon\t189275\n大千网\t189276\n洋河\t189277\n旅行法\t189278\nMacbookPro\t189279\n保尔森\t189280\nLAVA\t189281\n和平里\t189282\n12万公里\t189283\n地毯机\t189284\nat89c52\t189285\n综合科\t189286\nDALSA\t189287\n蜀锦\t189288\n9分\t189289\n氯化物\t189290\n中农联\t189291\n纪言\t189292\n蓝奕邦\t189293\n影音先锋高清\t189294\n第6号\t189295\n零之轨迹\t189296\n欧雪\t189297\n_市州\t189298\n殴打\t189299\n虾姑\t189300\n鹰牌瓷砖\t189301\n布套\t189302\n苏州市吴中区人民政府\t189303\n奔驰GL\t189304\nCHROME\t189305\nSDL\t189306\n晶块\t189307\nA7s\t189308\n发狂\t189309\n湾畔\t189310\n销售
者\t189311\n6880\t189312\n支持版\t189313\nOfficiel\t189314\n洋奶粉\t189315\n西藏卫视\t189316\n上海招商网\t189317\n强真\t189318\n六面\t189319\nreponse\t189320\n常熟开关厂\t189321\n那几年\t189322\nexplanation\t189323\nRiviera\t189324\n广西城市职业学院\t189325\n最小二乘法\t189326\n逆解\t189327\n七侠五义\t189328\n斑马仓\t189329\nfl吧_\t189330\n返款\t189331\n东靖路\t189332\n绝缘靴\t189333\nsimplify3d\t189334\n陆婷婷\t189335\n建帐\t189336\nipoe\t189337\n美容冠\t189338\n于佩尔\t189339\n炒茶\t189340\ntestrpc\t189341\n二十出头\t189342\nfor\t189343\n重庆钢铁股份有限公司\t189344\n杰克股份\t189345\n明示\t189346\n国际领先\t189347\n距离传感器\t189348\n63张\t189349\nPositive\t189350\n误关\t189351\n最毒妇人心\t189352\n干色\t189353\nImpactor\t189354\nlvy\t189355\n自动车床\t189356\n国光帮帮忙吧\t189357\n星星点点\t189358\nviolin\t189359\n彩伦\t189360\n黎贝卡\t189361\n咽干\t189362\narchlinux\t189363\nDIY综合论\t189364\n菊次郎的夏天\t189365\n芒刺\t189366\n朝鲜族\t189367\n缺\t189368\n7055\t189369\n分崩离析\t189370\n猪皮\t189371\n心欲\t189372\n量子场论\t189373\nVega\t189374\n江南都市报\t189375\n邳州论坛\t189376\n奋勇\t189377\n非遗文化\t189378\nIPSW\t189379\n若素\t189380\n50件\t189381\n冈田将生\t189382\n郑晓_\t189383\n高德地图api\t189384\n上海派出所\t189385\n十陵\t189386\n合理\t189387\n干手器\t189388\n泰坦之旅:不朽王座\t189389\n游离氯\t189390\n本真\t189391\n哪路\t189392\n90cm\t189393\nMongoose\t189394\n响当当\t189395\n氢OS\t189396\n导航屏\t189397\n杨凌职业技术学院\t189398\nhape\t189399\n梅湖\t189400\n牧野诡事\t189401\n科任\t189402\n硫磺岛\t189403\n大部\t189404\n定陶县\t189405\n二十首\t189406\n第1季\t189407\nE-M5\t189408\n一览_快啦网\t189409\n流媒体\t189410\n苏州科达\t189411\n银山镇\t189412\n肝疼\t189413\neinyboy\t189414\n木业信息网\t189415\n买卖合同\t189416\n战火兄弟连\t189417\n0034\t189418\n火影鸣人\t189419\n氨纶丝\t189420\n盛京银行\t189421\n抗压强度\t189422\nScooter\t189423\nluminar\t189424\n孬\t189425\nzipai\t189426\n2018年4月23号\t189427\nM20\t189428\n高华\t189429\n国体\t189430\n上步\t189431\n法兰克\t189432\n飞思卡尔杯\t189433\n仲鑫达\t189434\n天籁天\t189435\n专插本\t189436\nSFA\t189437\n解约金\t189438\n高墩\t189439\n美国联合航空\t189440\n人头像\t189441\n1862\t189442\n一冠\t189443\nreally\t189444\n虎形\t189445\n胃蛋白酶\t189446\n旋转变压器\t189447\n韩波\t189448\n销售代理\t189449\n德世朗\t189450\n金秀瑶族自治县\t189451\n清白之年\t189452\n百度网站排名\t189
453\n裙裤\t189454\nyq\t189455\n男m\t189456\n静电容\t189457\n10px\t189458\n大族激光科技产业集团股份有限公司\t189459\n北京航天城\t189460\n奇热网\t189461\n火金\t189462\ncams\t189463\noverlord\t189464\n石巨人\t189465\n碧玉花\t189466\n夺得\t189467\n友谊赛\t189468\n登记表\t189469\nskg\t189470\ntom\t189471\n证行\t189472\n赵卫东\t189473\n水栓\t189474\n虫螨腈\t189475\n25座\t189476\n体球网\t189477\n忽忽\t189478\n养心胶囊\t189479\n朱总理\t189480\n裂伤\t189481\njtable\t189482\n纵模\t189483\n畅捷\t189484\n新华医疗\t189485\n中赫\t189486\n电动单梁起重机\t189487\nCEdit\t189488\nServ\t189489\n转过\t189490\n首当\t189491\n迷唇\t189492\n灰字\t189493\n湖州南\t189494\n渔塘\t189495\n法子\t189496\n蜈蚣\t189497\nlrz\t189498\n甜酒\t189499\n最高音\t189500\n银章\t189501\n交医\t189502\n白皮书\t189503\n莱科宁\t189504\n有机片\t189505\n上机题\t189506\n刮\t189507\nE座\t189508\n创世纪2\t189509\n通威股份\t189510\n三角体\t189511\n遭骂\t189512\n社保证明\t189513\n51testing\t189514\n142家\t189515\n金融新闻网\t189516\nmro\t189517\n魔法类\t189518\n第十一天\t189519\n学士\t189520\n滨松\t189521\nLisboa\t189522\n瓷缝剂\t189523\n0次\t189524\n女神天使エンゼルティア\t189525\nBAPE\t189526\n5.1.73\t189527\n湖北省妇幼保健院\t189528\n時代\t189529\n陈登\t189530\n重生之最强人生\t189531\n66181305\t189532\n微胖\t189533\n金秋\t189534\n失信\t189535\n上诉书\t189536\n换轮\t189537\n男和女\t189538\n中华人民共和国老年人权益保障法\t189539\n一个三岁\t189540\nLOL英雄联盟S8\t189541\n上市公司重大资产重组管理办法\t189542\n肖像画\t189543\n李白凤\t189544\n清算号\t189545\n疾速特攻\t189546\n直拍\t189547\n海洋节\t189548\n绿岛湖壹号\t189549\n冯注龙\t189550\n安盛天平\t189551\n门徽\t189552\naut\t189553\ndisconf\t189554\n种类\t189555\n开学典\t189556\n硬度计\t189557\n西部证券\t189558\n2056\t189559\nmeasures\t189560\n铜仁凤凰机场\t189561\n21所\t189562\n陈炯明\t189563\n蘸水\t189564\n3月24号\t189565\n客厅灯\t189566\n146家\t189567\nbea\t189568\nyuk\t189569\n不懂\t189570\ni+=2\t189571\n15种\t189572\n临工\t189573\ntabs\t189574\n竹剑\t189575\n6870\t189576\n望春风\t189577\n100万辆\t189578\n宝林寺\t189579\n小宁\t189580\n水题\t189581\n暮宝少年御妖录\t189582\nqq电脑版\t189583\n金清\t189584\n杰弗森\t189585\ntanh\t189586\n行会\t189587\ncameraraw\t189588\n廖智\t189589\n窝火\t189590\n秒光\t189591\n饱受\t189592\nTABLES\t189593\n1204\t189594\n路堑\t189595\n小贴士\t189596\nexhausted\t189597\n恩静\t189598\n26只\t18
9599\n广西经贸职业技术学院\t189600\n自动洗车机\t189601\n118i\t189602\n抵补\t189603\nxici\t189604\nlark\t189605\n分光光度法\t189606\n影壁\t189607\n吊柱\t189608\n倒立式\t189609\n肇庆国家高新区\t189610\n中信国安\t189611\n赛文奥特曼\t189612\n丘维声\t189613\n力帆620\t189614\n原案\t189615\nPowter\t189616\n深圳市教育局\t189617\n主队\t189618\n八里庄\t189619\nacaddoc\t189620\n吉大\t189621\n死字\t189622\n华仔\t189623\n张近水形物语\t189624\n转椅\t189625\nsol\t189626\n刘羽琦\t189627\n老广的味道\t189628\n麦王\t189629\n铆合\t189630\n西马克\t189631\n金刚萨埵心咒\t189632\n佳能77D\t189633\n头盔式\t189634\n成家立业\t189635\n三水南站\t189636\n幻燐\t189637\nwindsor\t189638\n圣宫\t189639\n哪头\t189640\n齿\t189641\n以貌取人\t189642\n火苗\t189643\nPromote\t189644\n世界树迷宫5\t189645\nkv\t189646\n3.2g\t189647\n尸骨\t189648\n灯架\t189649\n20句\t189650\n盼望\t189651\n中材科技\t189652\nRetailer\t189653\n2f\t189654\n零风险\t189655\n飘舞\t189656\n睿思\t189657\n蹭蹭\t189658\n传统节日\t189659\n刘鸣炜\t189660\n航天钞\t189661\na8\t189662\n嘉实基金管理有限公司\t189663\n赞助\t189664\n护肤油\t189665\nsketchup2018\t189666\n赵天成\t189667\n赵海洋\t189668\n文本框值\t189669\n女王们\t189670\n600018\t189671\n12.14\t189672\n大通v80\t189673\n79小说网\t189674\n累想\t189675\n天灵盖\t189676\n预算师\t189677\n德政\t189678\n龙巅金鱼\t189679\n综研\t189680\n邯郸火车站\t189681\n敲锣打鼓\t189682\n第92号\t189683\n市图书馆\t189684\n学者\t189685\n富氧\t189686\n连带清偿责任\t189687\n山东财经大学东方学院\t189688\n信誓蛋蛋\t189689\n礁\t189690\npanther\t189691\n上林苑\t189692\ndaoru\t189693\n概率论\t189694\nTechNet\t189695\n谷小酒\t189696\n脊柱侧弯\t189697\n边侧\t189698\njavlib\t189699\n河南省卫计委\t189700\n影驰\t189701\n鱼肚白\t189702\n手写输入法\t189703\n全品高考网\t189704\n缩略\t189705\n视剧\t189706\nredshift\t189707\n海军大连舰艇学院\t189708\nsaturation\t189709\nIAMINRED口红控\t189710\n五万多\t189711\n女记者\t189712\n王秀丽\t189713\nCanton\t189714\nHANA\t189715\n黄祸\t189716\n点读包\t189717\n万元级\t189718\n税友软件集团股份有限公司\t189719\n四川队\t189720\n聚合酶\t189721\n三月后\t189722\nshiguan\t189723\n3221种\t189724\n汾阳市\t189725\n废液\t189726\n20150118\t189727\n陕西燃气集团有限公司\t189728\n0.2m\t189729\n木鹿\t189730\n国家电投\t189731\n南京邮电大学研究生院\t189732\n本杰明·格雷厄姆\t189733\n精探\t189734\n湖南师大\t189735\nPokemon\t189736\n闻仲\t189737\n浙南\t189738\nqiyubao\t189739\n深深处\t18974
0\n无精打采\t189741\n1万个\t189742\n哥妹\t189743\n现实\t189744\n尖子\t189745\n教师机\t189746\n虚拟盘\t189747\n99\t189748\n俺去也电影网\t189749\n食博会\t189750\n黑胡椒\t189751\nginput\t189752\n110型\t189753\n神印\t189754\nhkc\t189755\n初感\t189756\npolyfill\t189757\nǐ\t189758\n品种权\t189759\n九集\t189760\n瞬变电磁法\t189761\n襄阳市人力资源和社会保障局\t189762\n华盾\t189763\n麦咭\t189764\n一气之下\t189765\n反恐精英online\t189766\n刘淇\t189767\n申\t189768\n金蝶KIS迷你版\t189769\n胎气\t189770\n6111\t189771\n龙门阵\t189772\n976\t189773\n瓜氨酸\t189774\n任易屏\t189775\n255_\t189776\n景别\t189777\n不亮屏\t189778\n华水\t189779\n胆汁质\t189780\n倒爷\t189781\n包装卷\t189782\ntornadomeet\t189783\n梅龙路\t189784\nMuMu\t189785\n中规院\t189786\n华夏收藏网\t189787\n盐道街小学\t189788\n雷点\t189789\n400亩\t189790\n新渡镇\t189791\n洁碧\t189792\negypt\t189793\n含笑\t189794\n上产\t189795\n癌骨\t189796\n捕食者\t189797\n瓜州\t189798\n胶画\t189799\n龙湖地产有限公司\t189800\n秦海\t189801\n蓝光雍锦园\t189802\nKai\t189803\n沣惠南路\t189804\n直到世界的尽头\t189805\n大连市中级人民法院\t189806\n薪级工资\t189807\n国防科技大学_国防科学技术大学\t189808\n侯健\t189809\n劲度\t189810\n宏祥\t189811\n气泡音\t189812\n爰片\t189813\n菠\t189814\nhobby\t189815\n教师资格认定申请表\t189816\n今井麻美\t189817\n免死\t189818\n三马路\t189819\n格鲁夫\t189820\nrandomized\t189821\n唇唇欲动\t189822\ngetaddrinfo\t189823\nSM\t189824\nJun\t189825\n婚礼纪\t189826\n20天后\t189827\ncipe\t189828\n太平洋人寿保险\t189829\nClipping\t189830\nJW万豪酒店\t189831\n山鲁佐德\t189832\nAzure\t189833\nゲ\t189834\n明凯\t189835\nMBD\t189836\n疔疮\t189837\n免打\t189838\n苔\t189839\n渠道\t189840\n风阀\t189841\n避免\t189842\n王大庆\t189843\n市话\t189844\necn\t189845\n通气量\t189846\n淫民\t189847\n万维家电网\t189848\n西大望路\t189849\nyy号\t189850\n重庆市北部新区\t189851\n欧派木门\t189852\n杀人夜\t189853\n掠婚\t189854\n观城\t189855\n眉山\t189856\n绦虫\t189857\nf罩杯\t189858\n相有\t189859\n3周目\t189860\n病况\t189861\n昭雪\t189862\ncsls\t189863\n入力\t189864\n芳香疗法\t189865\n马里奥\t189866\nSoon\t189867\n妇保院\t189868\nINF\t189869\n奥美拉唑钠\t189870\n众泰T700论坛_众泰T700车友会\t189871\n阿叔\t189872\n独脚戏\t189873\n抢手\t189874\n朱璇\t189875\n丰厚\t189876\n七十二小时\t189877\n中土\t189878\nMUA\t189879\n曼切斯特\t189880\n徐超\t189881\n梁振英\t189882\n鳞状上皮增生\t189883\nAppcan\t189884\nadjacent\t189885\nB
ig_Foot\t189886\n斜梁\t189887\n闯祸\t189888\n嘉旅\t189889\n红石头\t189890\n20150414\t189891\n三星A8\t189892\n一个女人\t189893\nphototshop\t189894\n曲佤哈文\t189895\n横卧\t189896\n连续梁\t189897\n720p|1080p高清\t189898\n螺丝\t189899\n转岗\t189900\n逸飞\t189901\n沸点\t189902\nMLA\t189903\nmp3格式转换器\t189904\n52pojie.cn\t189905\n装字\t189906\nNCAA\t189907\nExpat\t189908\n恒久\t189909\n中国音乐学院附中\t189910\nX02\t189911\n丫头\t189912\nExpectation\t189913\n病残\t189914\ngeren\t189915\nNightMare\t189916\nbigbrother\t189917\n林哈夫\t189918\n四岁半\t189919\n自然式\t189920\nsurplus\t189921\n物理学科\t189922\nfuqi\t189923\n山水画\t189924\n贾扬清\t189925\n玻璃钢管道\t189926\n裕同科技\t189927\n二十件\t189928\n美智子\t189929\nS7E\t189930\n梅沙\t189931\n路由表\t189932\n嫩芽\t189933\n悔\t189934\n东庄\t189935\n五十五\t189936\n南汇新城\t189937\n白绿\t189938\ndayi\t189939\n村貌\t189940\n窠臼\t189941\n头把\t189942\nav番号大全\t189943\n静海区\t189944\n光剂\t189945\n固装\t189946\n魅妍社\t189947\n轻武器\t189948\n赏心悦目\t189949\nemca\t189950\nYW\t189951\n动态块\t189952\n涂在\t189953\n一堆\t189954\n引起\t189955\nbl吧_\t189956\n肝细胞癌\t189957\n莆田市人民政府\t189958\n区块链\t189959\n云报\t189960\nlfree\t189961\nifyou\t189962\n众彩\t189963\n324国道\t189964\n抵制\t189965\nSFX\t189966\n湿婆\t189967\n魂斗罗\t189968\n古族\t189969\n维也纳3好酒店\t189970\n上标\t189971\n南网\t189972\nms\t189973\n布达拉宫\t189974\n李丰田\t189975\n云星\t189976\n二十四节气表\t189977\n末末\t189978\n|||\t189979\n旧金山湾区\t189980\n及膝\t189981\n监本\t189982\na=2\t189983\n泥泞\t189984\n趣旅\t189985\nIOT\t189986\n汉高\t189987\n哥弟\t189988\n类风湿因子\t189989\nhurst\t189990\n主裁\t189991\n大老远\t189992\n电脑电源\t189993\n构造柱\t189994\n马臻\t189995\n训练家\t189996\n大连市民政局\t189997\n圆融广场\t189998\n社服\t189999\n嵌条\t190000\n选集\t190001\n车祸\t190002\neBay.cn外贸大学\t190003\n军迷们\t190004\n辱母者\t190005\n音阙诗听\t190006\naccepted\t190007\n上女\t190008\n洞明\t190009\n贴画\t190010\n三相交流电机\t190011\n很难\t190012\n三分钟左右\t190013\n悲剧性\t190014\nG2000\t190015\n白际\t190016\nMAX2\t190017\n姚姓\t190018\n楼长\t190019\n新冶\t190020\nRoses\t190021\n12377\t190022\n1355\t190023\n切片面包\t190024\n6220\t190025\n連線\t190026\n老鹰队\t190027\n14g\t190028\n李少平\t190029\nRitual\t190030\n清血\t190031\n补贴\t190032\
nselina\t190033\n北安市\t190034\nSSDT\t190035\n蝎毒\t190036\nGoogleNet\t190037\n旧金山市\t190038\n丧礼\t190039\n亚厦股份\t190040\n银沙\t190041\n北京东\t190042\notc\t190043\n快递站\t190044\nimei号\t190045\n代章\t190046\njade6.0\t190047\n自行车赛\t190048\n木桶饭\t190049\nlaji\t190050\n桃园村\t190051\n阵地\t190052\n墨菲特\t190053\n大庙\t190054\n抹光\t190055\n3G\t190056\n飞毛腿\t190057\n汤姆·赫兰德\t190058\nensure\t190059\n拉链袋\t190060\n子元素选择器\t190061\n安装\t190062\nsoure\t190063\nDHV\t190064\n王正伟\t190065\nf1\t190066\n西门子PLC\t190067\n会计证\t190068\n沙门氏菌\t190069\nhold\t190070\n公称直径_\t190071\n第一波\t190072\nbranching\t190073\nZhonghua\t190074\n注塑模具\t190075\n车牌照\t190076\n三尖\t190077\n冰糖炖雪梨\t190078\n法考\t190079\n环境空气质量标准\t190080\n金地荔湖城\t190081\nadventures\t190082\n深圳报业集团\t190083\n360吧_\t190084\nperms\t190085\nSexLab\t190086\n魏鹏\t190087\n冰河时代\t190088\n朝辉\t190089\n依从\t190090\n小视\t190091\neac\t190092\n安盈\t190093\n节符\t190094\n开封网\t190095\nh型钢\t190096\nパン\t190097\n黎珐\t190098\nkaito\t190099\n锦州世博园\t190100\n珍珑棋局\t190101\n刷钻\t190102\n能不能说\t190103\nobscure\t190104\n芥末堆\t190105\n日上免税行\t190106\n表扬\t190107\n0835\t190108\nSri\t190109\n260吨\t190110\nkaili\t190111\n鹜\t190112\n荣耀MagicBook\t190113\n荒料\t190114\ntextview\t190115\n冬堡学院\t190116\n中共中央组织部办公厅\t190117\n36页\t190118\nxxx公司\t190119\n亿恩社区\t190120\n白纸\t190121\n中宝\t190122\n菜园坝火车站\t190123\n功利\t190124\ndota2plus\t190125\n松江南\t190126\n广州市工贸技师学院\t190127\n水盆\t190128\n笑花\t190129\npractical\t190130\n政德\t190131\n锯子\t190132\n液压马达\t190133\n猫娘\t190134\nf30\t190135\n欧泊莱\t190136\nshootercheng\t190137\n几缸\t190138\n2.45\t190139\n励志篇\t190140\ndatong\t190141\n语体\t190142\n保卫萝卜3码头\t190143\n5d4\t190144\n炝锅\t190145\n33届\t190146\n李康\t190147\n结缔组织病\t190148\n异义\t190149\n云风\t190150\nsio2\t190151\n倒计时\t190152\n韦斯莱\t190153\n好飞网\t190154\n王者歪传\t190155\n异类\t190156\n外商投资公司\t190157\n加拿大央行\t190158\n泊车辅助系统\t190159\n带阻滤波器\t190160\nwebsql\t190161\nXperia\t190162\n怎奈\t190163\n节奏型\t190164\n罗斯托夫\t190165\n180米\t190166\n瞬风\t190167\n烤肉饭\t190168\n伟国\t190169\nShock\t190170\n东风日产乘用车公司\t190171\n二氧化碳\t190172\nxlwings\t190173\n君越太古神王\t190174\n欧菲
光\t190175\n手偶\t190176\n随心\t190177\n壹微百应\t190178\n杭州东站\t190179\n点源\t190180\n北京301医院\t190181\n相寓\t190182\n尼日利亚\t190183\n胰岛\t190184\n吴建军\t190185\n懊悔\t190186\n云南衡水实验中学\t190187\n清里\t190188\na6300\t190189\n太奇怪\t190190\n泼辣\t190191\n泽天\t190192\n苏醒\t190193\n行车制动器\t190194\n前文\t190195\n惠家\t190196\n双晶\t190197\n华电集团公司\t190198\n于言\t190199\n推荐信\t190200\n门票价\t190201\nlxde\t190202\n云考试\t190203\n百事\t190204\n福建龙马环卫装备股份有限公司\t190205\n音代\t190206\n百分之80\t190207\n道和云科\t190208\n弓星\t190209\n高路\t190210\n一霎时\t190211\n萧乾\t190212\n泰通\t190213\n贝贝熊\t190214\n86盒\t190215\n小茗\t190216\n树牢\t190217\n城南社区\t190218\n桑拿\t190219\nSR2\t190220\n老天籁\t190221\nnow直播吧\t190222\nranks\t190223\n堆排序算法\t190224\n7多\t190225\n7千元\t190226\n999号\t190227\n以假乱真\t190228\n长兴岛经济区\t190229\n西安门\t190230\n富山县\t190231\n油浸\t190232\n86u\t190233\nKEYNOTE\t190234\n孝感高中\t190235\n冬马\t190236\nDF创客\t190237\n蛟河\t190238\n核细胞\t190239\n生物柴油\t190240\n音柱\t190241\nelixir\t190242\n成都市科学技术局\t190243\n丘脑\t190244\n天降奇兵\t190245\n缩缝\t190246\n超凡蜘蛛侠3\t190247\nisle\t190248\n精度\t190249\n好装\t190250\n凉霸\t190251\n陈红梅\t190252\n喵师傅\t190253\n萌豚\t190254\n王铮亮\t190255\nVEGF\t190256\n伍德斯托克\t190257\n压管\t190258\nrecruiter\t190259\n抢\t190260\n罪恶之城\t190261\nTag\t190262\nJCI\t190263\n广州市环保局\t190264\n200g\t190265\n负氧离子\t190266\n中共中央政治局常务委员会\t190267\n设计类\t190268\nJI\t190269\nhuaren\t190270\n王越\t190271\n诸神之战\t190272\n隆安教育信息网\t190273\n教育部学校\t190274\nseduce\t190275\n朴智旻\t190276\n路虎Landrover\t190277\n上海装修网\t190278\n杰克伦敦\t190279\n木艺\t190280\n万千\t190281\n谢乐\t190282\n尾崎丰\t190283\n房产经纪公司\t190284\n国家煤化工网\t190285\nLinux-51CTO\t190286\nsiam\t190287\n单臂\t190288\n好多次\t190289\n纵宠\t190290\n中国航空公司\t190291\n约里克\t190292\n富一代\t190293\n美利车\t190294\n连接性\t190295\n五香花生\t190296\ngourmet\t190297\n区块链公司\t190298\n7706\t190299\n天使帝国4\t190300\n哈雷750\t190301\n几岁时\t190302\npokemmo\t190303\n画筒\t190304\n十三夜\t190305\nCarbon\t190306\n柔化\t190307\n5137\t190308\n半球\t190309\nXH\t190310\n美盈森\t190311\n东北地区\t190312\n刘家庄\t190313\nd600\t190314\n唐宁ONE\t190315\nisi\t190316\n杨易\t190317\n导练\t190318\n乐檬x3\t190319\n爬上来\t1903
20\n纤道\t190321\n3635\t190322\n平滑度\t190323\n格瑞斯\t190324\n昼\t190325\n呲牙\t190326\n液晶屏\t190327\n太湖源\t190328\n1.65米\t190329\nag亚游集团\t190330\n长颈族\t190331\n丝氨酸\t190332\nexecuted\t190333\n230米\t190334\n无线端\t190335\n修订\t190336\n苏富比\t190337\n时间观\t190338\n司考聚吧\t190339\n一项\t190340\nBitAuto\t190341\n史玉梅\t190342\n光一科技\t190343\n商云\t190344\n中国烟草\t190345\n新视野大学英语视听说教程2\t190346\nkoolearn\t190347\n旗台\t190348\nIT搜购网\t190349\n玄学网\t190350\n曼斯菲尔德\t190351\n乍得\t190352\n_宅\t190353\n涂壁\t190354\n酷宝\t190355\n行用卡\t190356\n净化剂\t190357\n柳茜\t190358\nfx-991es\t190359\n渔船\t190360\n雪山飞猪\t190361\n国家气候中心\t190362\nTextRank\t190363\n王显伟\t190364\n驱蚊灯\t190365\n30万美元\t190366\n6C\t190367\n孤狼\t190368\ntranspose\t190369\n商务园\t190370\n聪哥\t190371\n星汉\t190372\nTreatments\t190373\n凯莱\t190374\n四千元\t190375\nstatu\t190376\n敞开\t190377\nhavit\t190378\n大泽\t190379\nFDA\t190380\n昌邑市\t190381\n吐槽大会\t190382\n公证处\t190383\n工口片\t190384\n3.92\t190385\n壁温\t190386\n意识形态\t190387\nU8基金网\t190388\n千百\t190389\n麦色\t190390\n唯思影城\t190391\n西长安街\t190392\n免费阅读器\t190393\n大胖\t190394\n高学历\t190395\n廖明\t190396\n大虎山\t190397\n水处理剂\t190398\n铝模板\t190399\npayload\t190400\n社会福利\t190401\n王奇生\t190402\n草坪\t190403\nA轮融资\t190404\n垃圾压缩站\t190405\nwin2016\t190406\n奥华\t190407\nappendix\t190408\n万种\t190409\n复评\t190410\nTodoist\t190411\n苦苣菜\t190412\n新证\t190413\n擦肩而过\t190414\n神泣\t190415\n枣皮\t190416\n衢州政府网\t190417\n贝塔斯瑞\t190418\n宁波方特东方神画\t190419\nPYG\t190420\n上锁\t190421\n钢丝轮\t190422\n当当当当当\t190423\n赵又廷\t190424\n暴徒\t190425\n阿姨们\t190426\n展辰\t190427\n贝影\t190428\ntumor\t190429\n烟鬼组合\t190430\nJamKong\t190431\n吴大伟\t190432\nTL-WN823N\t190433\n龙崎\t190434\n12架\t190435\n饮\t190436\n护腰带\t190437\nppt课\t190438\n第十五条\t190439\n世纪星源\t190440\nadodb\t190441\n上海图书馆上海科学技术情报研究所\t190442\n供水量\t190443\n3119\t190444\n23件\t190445\n轮操\t190446\nDatastage\t190447\n潼关县\t190448\nwzry\t190449\n撕碎机\t190450\n周后\t190451\n血光\t190452\n防火间距\t190453\n高铁网\t190454\n柱边\t190455\nBB机\t190456\n首师大二附中\t190457\net\t190458\nalchemy\t190459\n360杀毒软件\t190460\nbilibilijj\t190461\n阜阳百姓网\t190462\n北海酒店\t190463\nunnecess
ary\t190464\nMUC\t190465\n好闲\t190466\n收心\t190467\n微云同步助手\t190468\n地磁\t190469\n电火\t190470\n挑出\t190471\n稳步\t190472\n富力金港城\t190473\nsdkmanager\t190474\n180页\t190475\nMaxim\t190476\n慈禧全传\t190477\n外转\t190478\nISG\t190479\nARKit\t190480\nyangzhou\t190481\n灰面\t190482\n501米\t190483\n呼斯楞\t190484\n11.doc\t190485\n亚丝娜\t190486\n额尔古纳河\t190487\nmpeg-4\t190488\n起征\t190489\n停战\t190490\nc0000005\t190491\n导套\t190492\n终於\t190493\n马丁·斯科塞斯\t190494\n弘尚\t190495\n滚远\t190496\n秃鹤\t190497\n观棋\t190498\noffice10\t190499\n张立群\t190500\n局部\t190501\n友谊奖\t190502\n遍地开\t190503\n空流\t190504\ncc2640\t190505\n湖南广播电视台\t190506\n除法\t190507\n仪仗\t190508\n利益观\t190509\n欲望\t190510\n溶出\t190511\n后赛\t190512\n白涛\t190513\n化工商城\t190514\n低脂肪\t190515\n齐越节\t190516\n轮班制\t190517\niframe\t190518\n条件变量\t190519\n共\t190520\n周龄\t190521\n代\t190522\n东风风行\t190523\nStrider\t190524\nUTF-8格式\t190525\n灵源\t190526\n陇东\t190527\n李恩美\t190528\n五盘\t190529\n定都\t190530\n米扬\t190531\n见见\t190532\n出猎\t190533\n压迫感\t190534\n蝴蝶鱼\t190535\n陈来\t190536\n裙摆\t190537\n多源\t190538\n林麝\t190539\n0.9%\t190540\ngdal\t190541\n希灵帝国\t190542\n中办\t190543\n工作集\t190544\n体材\t190545\n中公遴选考试网\t190546\n黑衣组织\t190547\n乱讲\t190548\n淘宝街\t190549\n规定性\t190550\n骑士经理2\t190551\n里番儿ACG\t190552\n消费机\t190553\n巴西利亚\t190554\n1.8d\t190555\n火影忍者:究极忍者风暴4\t190556\npropensity\t190557\n女工\t190558\n仔细\t190559\n重庆两江新区政府网\t190560\nherbinate\t190561\nMFT\t190562\nzhizuo\t190563\n催账\t190564\n2830\t190565\n3.7.6\t190566\n水关\t190567\n樊昌信\t190568\n定表\t190569\nfaceu\t190570\n村屋\t190571\n20多所\t190572\n和分\t190573\n黑鲨\t190574\n观湖庄园\t190575\n教务管理系统\t190576\n成都理工大学\t190577\n炼神\t190578\n中小学\t190579\n转股\t190580\n单纯形法\t190581\n你的情深乱我流年\t190582\n林茨\t190583\n文字性\t190584\n全集\t190585\n遮山\t190586\n雪山飞狐\t190587\n樱慈\t190588\n收件箱\t190589\n浙江大学法学院\t190590\n市城市更新局\t190591\n北极星风力发电网\t190592\n1280p高清\t190593\n得得得\t190594\n地下\t190595\n越野性\t190596\n名人录\t190597\n赛纳\t190598\nWhiskey\t190599\nV3.2\t190600\n尸约\t190601\nSKF\t190602\n000060\t190603\nIPS屏\t190604\n梅溪湖国际新城\t190605\n糊涂虫网\t190606\n清开灵注射液\t190607\nv2.2.8\t190608\n斯泰克\t1
90609\n上海微创软件股份有限公司\t190610\n白沙大桥\t190611\n1186\t190612\n试验器\t190613\n中山大学图书馆\t190614\n亡国\t190615\n请君\t190616\n中华人民共和国中医药法\t190617\n润发\t190618\n斯巴\t190619\nddooo\t190620\n封接\t190621\nsdy\t190622\n1059\t190623\n後編\t190624\n姬川\t190625\n急诊科医生\t190626\n人民邮报\t190627\n众创空间\t190628\n一箭\t190629\n皮泥\t190630\n风月俏佳人\t190631\n6.09\t190632\n上在\t190633\nFootage\t190634\n高胆固醇血症\t190635\ndeposit\t190636\n偷取\t190637\nLargest\t190638\n考研率\t190639\n毕业设计作品网\t190640\n社区银行\t190641\n教风\t190642\n天津滨海图书馆\t190643\n陈禹\t190644\n传奇十一人\t190645\nIndustrie\t190646\n最后岁月\t190647\n夹克男\t190648\n小友\t190649\n有事儿\t190650\n种源\t190651\n37.3\t190652\n航天广电\t190653\nR11s\t190654\n光器件\t190655\n天健网\t190656\nSqueeze\t190657\n王栎鑫\t190658\n邵佳一\t190659\n中国土地勘测规划院\t190660\n连锁版\t190661\n无叶风扇\t190662\nwin7安装盘\t190663\n逆期\t190664\n护生\t190665\n食品类\t190666\n配速\t190667\n中医生\t190668\n廖美\t190669\nfob\t190670\n融入式\t190671\n12集\t190672\n健康之路(医护网\t190673\nChino\t190674\n罗兰贝格\t190675\n矢量图素材\t190676\nwendy\t190677\nSASS\t190678\n唐古拉山\t190679\n微盾\t190680\n光华里\t190681\nreo\t190682\n剑南\t190683\n泡饭\t190684\n合格者\t190685\n丰台分局\t190686\n专业户\t190687\n如痴\t190688\n10.18\t190689\n小星星\t190690\n愤恨\t190691\n183\t190692\n第157集\t190693\n通风机\t190694\n福报\t190695\n植物库\t190696\nbjy\t190697\n73平米\t190698\n下垂型\t190699\n刀身\t190700\n顺达路\t190701\n动漫公司\t190702\n蒲城县\t190703\n北达资源中学\t190704\n奥尔什方\t190705\n突围战\t190706\n简线\t190707\n钻头\t190708\n闪卡\t190709\n分娩期\t190710\n团购网\t190711\n亚麻籽油\t190712\n北京外企人力资源服务有限公司\t190713\n圣魔\t190714\n李智慧\t190715\n杭州新闻中心\t190716\nhosts\t190717\n党动机\t190718\n坠桥\t190719\nsqoop2\t190720\n国创高新\t190721\n离散函数\t190722\nBikes\t190723\n张德祥\t190724\n拧紧\t190725\n北京昆仑通态自动化软件科技有限公司\t190726\n梁楼\t190727\n雷锋塔\t190728\n江西省工商局\t190729\n福州市公安局\t190730\n中国卫生部\t190731\n辕\t190732\n奶业\t190733\n莲蓬乳\t190734\n主渣\t190735\n姬岛朱乃\t190736\n延安大学西安创新学院\t190737\neffect\t190738\n广播电台\t190739\n玻璃镜\t190740\n真伪\t190741\n一行两列\t190742\n赏荷\t190743\n计容\t190744\n牛丼\t190745\n那曲市\t190746\n加急\t190747\n拟办\t190748\nhaote\t190749\n中山大学新华学院\t190750\n群龙无首\t190751\n金贵\t190752\n面善
\t190753\n西陇\t190754\ndarknet\t190755\n十二烷\t190756\nng-click\t190757\n轮片\t190758\n胜太路\t190759\n神调\t190760\n董成鹏\t190761\n风行S500\t190762\n入场证\t190763\nszzk\t190764\n非零自然数\t190765\nqq群主\t190766\nziliao\t190767\n何清涟\t190768\n4K分辨率\t190769\n高密度板\t190770\n投票贴\t190771\ndispatcher\t190772\n高东\t190773\n食食\t190774\n全国人大专门委员会\t190775\n178hui.com\t190776\nfpv\t190777\n制氧\t190778\n谢晓亮\t190779\n强奷\t190780\n登记簿\t190781\n药事管理与法规\t190782\nsystemc\t190783\n二木\t190784\n男医\t190785\nSD高达G世纪\t190786\n宝强\t190787\n花宫\t190788\n国表\t190789\n丁黑\t190790\nGSG\t190791\n5000辆\t190792\n乳糖不耐症\t190793\n紫苑\t190794\n辐射类\t190795\n丫丫壁纸网\t190796\n20160226\t190797\n夜场交谊舞曲网\t190798\n南宫月萧逸轩\t190799\nFujian\t190800\n&\t190801\n幽\t190802\n饿狼传说\t190803\n感通\t190804\n美国免税州\t190805\n江扬\t190806\n鹭沢文香\t190807\nSpoken\t190808\n恒昌利通\t190809\n钢调质\t190810\n19000\t190811\n防风\t190812\ntoday函数\t190813\n使命召唤9\t190814\n河北交通职业技术学院\t190815\n金风玉露\t190816\n圣马可广场\t190817\n犍为县\t190818\n2014-04\t190819\n朱涛\t190820\n朝阳医院\t190821\nSoft\t190822\n张继向华强\t190823\n欺师\t190824\nDVD-AVI\t190825\n陇川\t190826\n自私\t190827\n江苏省清江中学\t190828\nXBOXONE\t190829\n灵信\t190830\n悬索\t190831\n彼得大帝\t190832\n大众CC论坛_汽车之家论坛\t190833\n湾沚\t190834\n拨轮\t190835\nbookmark\t190836\nプリンセス\t190837\n线色\t190838\n蓝盾股份\t190839\n驱魔人\t190840\n2014年9月\t190841\nCAD2018\t190842\njuand\t190843\n1252\t190844\n补肾壮阳\t190845\n家亲\t190846\n校对\t190847\n钢坝\t190848\n花海阁\t190849\n事不过三\t190850\n甜美\t190851\n东罗马\t190852\n贵广\t190853\n厦门建设银行\t190854\n柳青瑶\t190855\nlululemon\t190856\n邕州\t190857\nexcel单元格\t190858\n广东省国税局\t190859\n8889\t190860\n小北路\t190861\nWGS84\t190862\n乙醇钠\t190863\n医学检验师\t190864\n脱空\t190865\n大别山区\t190866\n10.24\t190867\n载歌载舞\t190868\n刘长山路\t190869\n复混肥料\t190870\n稻米\t190871\n布置\t190872\n对脚\t190873\n哪几样\t190874\n消歧\t190875\ndropdown-menu\t190876\nCREDIT\t190877\n语意\t190878\n恶魔猎手浩劫\t190879\n夜排\t190880\nCsgola\t190881\n省名\t190882\n自尊心强\t190883\nlation\t190884\n大天狗\t190885\ncpx\t190886\n中国建筑材料科学研究总院\t190887\n掌中宝\t190888\n遂川县人民政府\t190889\namule\t190890\n亿达春田\t190891\n推运\t190892\n邪妃\t1908
93\n乐高蝙蝠侠\t190894\n黛玉\t190895\n4.1.3\t190896\n尤娜\t190897\n传说级\t190898\n昆仑墟\t190899\n银都路\t190900\n新疆财经大学\t190901\n代发表\t190902\n北京市小学\t190903\nawaiting\t190904\n精铁\t190905\nEscort\t190906\n小城\t190907\n海南藏族自治州\t190908\n阳泉市郊区\t190909\n恒德\t190910\n拨入\t190911\n金融机具\t190912\n加一天\t190913\nbashrc\t190914\niq300\t190915\n法拍房\t190916\n李硕\t190917\n闺\t190918\nep.3\t190919\n铃儿响叮\t190920\n内页\t190921\n天昏地暗\t190922\n化点\t190923\n感恩有你\t190924\n序数\t190925\n泽北\t190926\nTexmaker\t190927\njekyll\t190928\n小叶女贞\t190929\nattention\t190930\n方向键\t190931\n金基\t190932\n87.6\t190933\n总得\t190934\n满城区\t190935\nCord\t190936\n学易试题君之\t190937\n利诱\t190938\n中州路\t190939\n稀松\t190940\n脾肾阳虚\t190941\n徐歌阳\t190942\n冈田\t190943\nflask\t190944\n阴柔\t190945\n双林镇\t190946\n潮物\t190947\n226路\t190948\n神烦警探\t190949\nPhotoFans\t190950\n坡头镇\t190951\n华润创业\t190952\n惠誉\t190953\n带脉\t190954\n百余\t190955\n瀚亚\t190956\n0.89\t190957\n射阳\t190958\n刺头\t190959\n役\t190960\n隆道尔\t190961\n低者\t190962\n清平乐村居\t190963\n织梦大学\t190964\n章源钨业\t190965\n睡好觉\t190966\n阿拉善右旗\t190967\n言情中文网\t190968\nExercise\t190969\n紫光集团\t190970\nflipper\t190971\n有必要\t190972\nfgo本能寺\t190973\n中华人民共和国国家监察法\t190974\njapan\t190975\nmain\t190976\n邓刚\t190977\n李文浩\t190978\n白天\t190979\nhighcharts\t190980\n绵\t190981\n盛夏晚晴天\t190982\n12版\t190983\n烤面筋\t190984\n4挡\t190985\n刷架\t190986\n新川创新科技园\t190987\n小车\t190988\n姓任\t190989\nTracking\t190990\n你的脸\t190991\n纷纷雨纪念网\t190992\n道中\t190993\n变态反应科\t190994\nhph\t190995\n啼鸟\t190996\n主线程\t190997\nSmall\t190998\n钨矿\t190999\n知豆\t191000\n再留\t191001\n段号\t191002\n密云路\t191003\n120粒\t191004\ndbgrid\t191005\noat\t191006\nCargill\t191007\n投资脉搏网\t191008\n卖者\t191009\n水能\t191010\n02期\t191011\n梅州\t191012\n发廊\t191013\n热岩\t191014\nvoice\t191015\n尖角\t191016\n陈坤\t191017\n都安\t191018\n鄂尔多斯草原\t191019\n算头\t191020\n雄才\t191021\n心里\t191022\n百度研究院\t191023\n青少台\t191024\n研讨课\t191025\nrpg游戏\t191026\n丽台p600\t191027\n任达\t191028\n0.1米\t191029\n江宁路\t191030\n现手\t191031\nNail\t191032\nGratis\t191033\ny5\t191034\nconflicting\t191035\n红魔鬼\t191036\n新疆招商网\t191037\n重檐\t191038\nwkt\t191
039\n伊格尔顿\t191040\n标音\t191041\n游泳\t191042\n生米镇\t191043\n国金证券股份有限公司\t191044\nLeakCanary\t191045\n学习时报\t191046\n公信\t191047\n变难\t191048\n时态\t191049\n出其不意\t191050\n石勒\t191051\n克劳迪娅\t191052\n银河帝国\t191053\n艾叶煮鸡蛋\t191054\n娼\t191055\n令吉\t191056\n电缆沟\t191057\nclean\t191058\n通风橱\t191059\n三四百\t191060\n白贤\t191061\n箱门\t191062\n二米\t191063\n海蛎子\t191064\nArrayBuffer\t191065\n村镇银行\t191066\n小鞋子\t191067\n第108章\t191068\n不料\t191069\nCOST\t191070\n保卫者\t191071\n堪培拉大学\t191072\n水命\t191073\ntou\t191074\nWLK\t191075\n4K-STAR\t191076\n锐度\t191077\n哈尔滨音乐学院\t191078\n二日游\t191079\n城与\t191080\n实验题\t191081\n骰骨\t191082\n1054\t191083\n几内亚\t191084\n华东师范大学计算机科学与软件工程学院\t191085\ntad\t191086\n最高院\t191087\nUMass\t191088\nGong\t191089\nbehaviour\t191090\n流道\t191091\n百亿_\t191092\nUntitled\t191093\nPriv\t191094\n维特比\t191095\ndesigner10\t191096\n上海市静安区中心医院\t191097\n中文分词器\t191098\n算式\t191099\n1008611\t191100\n元盛\t191101\n招亲\t191102\n滴滴网\t191103\n48种\t191104\n铜锭\t191105\n双叶美佳\t191106\nssas\t191107\n发展和改革委员会\t191108\n崔阿扎\t191109\nIbiza\t191110\nMill\t191111\n蓝帆医疗\t191112\n亭山\t191113\n肖波\t191114\n异香\t191115\n陪审团\t191116\n正商\t191117\nreactor\t191118\n中兴通信\t191119\n寿命长\t191120\n丁纯\t191121\n美秀\t191122\n邪不压正\t191123\n上海自贸区注册公司\t191124\n窦桂梅\t191125\nMarkdown\t191126\n4U\t191127\n伍兹\t191128\nhnd\t191129\n苦味\t191130\n微尚\t191131\n亮彩\t191132\n衰减系数\t191133\n杭州第四中学\t191134\n泄压阀\t191135\nsqing55\t191136\n任平\t191137\n奉贤\t191138\n精彩人生\t191139\n幸福巧克力\t191140\nPlug\t191141\n深圳市交通运输委员会\t191142\n贺银成\t191143\nles吧\t191144\nOSChina\t191145\n828\t191146\n1699元\t191147\n绿城花园\t191148\n0015\t191149\n1000XM2\t191150\n如下\t191151\n大兴亦庄\t191152\n英雄联盟\t191153\nPriest\t191154\n散水\t191155\n3-4年\t191156\n停开\t191157\n盈石\t191158\npreview\t191159\n重录\t191160\n刂\t191161\n潍坊百姓网\t191162\n雷人网\t191163\njequery\t191164\n云舟\t191165\n坎通纳\t191166\n盛大花园小学\t191167\n渡江战役\t191168\n4班\t191169\nAltitude\t191170\n人事部\t191171\n青岛崂山风景区\t191172\n轩辕剑三\t191173\n10022\t191174\nPConline\t191175\n青梅煮酒论英雄\t191176\n魔卡少女樱clear\t191177\n卖\t191178\n抗日战争胜利\t191179\n医疗专业人才网\t19118
0\n弟妹\t191181\n付通\t191182\n生活日报数字报\t191183\n心电轴\t191184\n狗拳\t191185\n退库\t191186\n3s\t191187\n大黑屋\t191188\ntourism\t191189\n守土有责\t191190\n威易网\t191191\n204国道\t191192\n失落之歌\t191193\nsm961\t191194\n广州医药有限公司\t191195\n即期利率\t191196\n旺达\t191197\n8列\t191198\n大蛋\t191199\n飒漫画\t191200\n床类\t191201\n深入浅出\t191202\n大商新玛特\t191203\n男人篇\t191204\n装存\t191205\n370\t191206\n发号\t191207\nAprilia\t191208\n海南省地方税务局\t191209\n酱鸭\t191210\n杭普\t191211\n爱奇艺娱乐\t191212\n十月一\t191213\n密苏里\t191214\n中荣\t191215\n南澳大桥\t191216\n宁波分公司\t191217\n小学生活\t191218\n建华大街\t191219\n20160323\t191220\n公开课\t191221\n增氧\t191222\n郭志刚\t191223\n全美航空\t191224\n比特流\t191225\n胡立峰\t191226\nWKWebView\t191227\n兰国\t191228\n账户名\t191229\n上游\t191230\n力男\t191231\n中继间\t191232\n蓝曼龙\t191233\n鸡缸杯\t191234\n征途单机版\t191235\n澜沧\t191236\n权宜之计\t191237\n刷枪\t191238\n何亮\t191239\nmisfit\t191240\n方恨\t191241\nbiangbiang面\t191242\n钢琴师\t191243\n腹式呼吸\t191244\n琉璃狐\t191245\n观棋不语\t191246\n学术语\t191247\n定增基金\t191248\n出海捕鱼\t191249\n张汝伦\t191250\n沈亚\t191251\ntxtarticle\t191252\nplib\t191253\n五一广场\t191254\n华尔兹吧\t191255\n弟子们\t191256\nPERK\t191257\n文学天唐锦绣\t191258\n彼岸\t191259\n浙江中烟工业有限责任公司\t191260\n星域\t191261\n百度_\t191262\n仲间由纪惠\t191263\n声泪俱下\t191264\n秦志戬\t191265\n肠生\t191266\n利税率\t191267\n商洽函\t191268\niP地址\t191269\n通用类\t191270\n警卫员\t191271\n快堆\t191272\nCMT\t191273\n期初余额\t191274\nIMAC\t191275\n同济大学设计创意学院\t191276\n万特\t191277\n32nm\t191278\n少有\t191279\n过意不去\t191280\n爱的那点性事\t191281\n酋\t191282\n违停\t191283\n情投意合\t191284\n特洛伊战争\t191285\nJSESSIONID\t191286\nContextMenu\t191287\nDESC\t191288\n东莞理工学院城市学院\t191289\n4亿吨\t191290\n9月份\t191291\n贝安\t191292\n险境\t191293\n表类\t191294\n达慕\t191295\n布鲁克纳\t191296\ndealing\t191297\n西藏自治区人民政府\t191298\nAlt\t191299\n淘车\t191300\n关元穴\t191301\n2012-2014年\t191302\n500欧元\t191303\n安徽国际商务职业学院\t191304\nTagName\t191305\nwunian\t191306\n肿瘤坏死因子\t191307\n9.35\t191308\nhellokid\t191309\n新淫\t191310\n沙箱\t191311\n多一页\t191312\n敦力\t191313\n绍兴市委\t191314\n剧中\t191315\n纳豆\t191316\n汾湖\t191317\n三胡\t191318\n上海乾拓贸易有限公司\t191319\n流星网络电视\t191320\n挖沟机\t191321\n绿花\t191322\nLS58A5
1\t191323\n12.7毫米\t191324\n绿肥\t191325\n1800万\t191326\n毛织\t191327\n天宏一卡通\t191328\n淘宝心\t191329\nPatek\t191330\nmatlib\t191331\neurope\t191332\n向基层延伸\t191333\nadress\t191334\n参保\t191335\n测斜仪\t191336\nv2.2.0_\t191337\n圆滚滚\t191338\n玉符\t191339\n圣剑传说2\t191340\n家用制氧机\t191341\n换发\t191342\n沙河\t191343\n水浒无间道\t191344\n孟州\t191345\n否定式\t191346\n沙琪玛\t191347\nedgm\t191348\n观音洞\t191349\n禁限\t191350\n阿德\t191351\n昆明市人社局\t191352\n迅雷BT\t191353\n台值\t191354\n苏坤\t191355\n格林豪泰快捷酒店\t191356\n厉内荏\t191357\n几G\t191358\ndegli\t191359\n花神\t191360\n发病\t191361\n产区\t191362\ncandy\t191363\n蜀门\t191364\n客人们\t191365\n白冰\t191366\n鞅\t191367\n邵东县\t191368\n七彩阳光广播体操\t191369\n冬宫\t191370\n哥莫拉\t191371\n10.0000元\t191372\nDVD-ROM\t191373\ndabei2006\t191374\nPCX\t191375\ntubiao\t191376\n天通快讯\t191377\nminis\t191378\n莒\t191379\n妇洗器\t191380\n公益广告片\t191381\n含糊\t191382\nsess\t191383\n对谈\t191384\n洪先礼\t191385\n坂本冬美\t191386\n塑料箱\t191387\n王钧\t191388\n雪屋\t191389\n伏藏\t191390\n天润城\t191391\n家池\t191392\n第二列\t191393\nregular\t191394\n53部\t191395\n主面\t191396\n武装力量\t191397\n陈颖\t191398\n苏紫旭\t191399\n8102\t191400\n贞心\t191401\n南野灯\t191402\n春燕\t191403\nB350M-PLUS\t191404\n一万日元\t191405\n机油箱\t191406\n凛冬\t191407\n江干区\t191408\n太空上\t191409\n有丝分裂\t191410\n支撑座\t191411\n_以撒的结合\t191412\n4104\t191413\n芜菁\t191414\n2010-2012年\t191415\n太阳月亮\t191416\n收报\t191417\nCSDN学院\t191418\n独体\t191419\n傀儡国\t191420\n意式浓缩咖啡\t191421\n守护者\t191422\n热休克蛋白\t191423\n圆整\t191424\n小爽\t191425\n家居服\t191426\nIESS\t191427\nLOL掌游宝\t191428\nQFileDialog\t191429\n电吉他\t191430\n我不\t191431\n惭\t191432\n安徒生童话\t191433\n开领\t191434\n沃\t191435\n武湖\t191436\n荣威ERX5\t191437\nAUS\t191438\n盟战\t191439\n阳普医疗\t191440\n尤三姐\t191441\n直拳\t191442\n居住地\t191443\n口本\t191444\n奥斯卡金像奖\t191445\n扮演类\t191446\n8780\t191447\n乐檬\t191448\n青岛火车北站\t191449\ndalton\t191450\n炮流\t191451\n意大利黑手党\t191452\n升降桌\t191453\n资料分析\t191454\n莫达非尼\t191455\nyourself\t191456\n红头文件\t191457\n武官\t191458\nSTM32F0\t191459\n特百惠水杯\t191460\n21万元\t191461\n2009-2015年\t191462\n渝社区\t191463\nclosers\t191464\n中石化销售公司\t191465\n阿爷\t191466\n2017年12月30日\
t191467\n妙计\t191468\n奸人\t191469\n杯杯\t191470\n米多多\t191471\n10X\t191472\n女国\t191473\n有似\t191474\n京九铁路\t191475\n氨氮去除剂\t191476\nPCB电路板\t191477\n大连教育\t191478\n驾训班\t191479\npwa\t191480\nWpf\t191481\n后座力\t191482\n羟\t191483\n198万\t191484\n方发\t191485\n七色光之歌\t191486\n朋党\t191487\n标方\t191488\nDinosaur\t191489\n20141216\t191490\n骚客\t191491\n科学界\t191492\n新春版\t191493\n译\t191494\n健康族\t191495\n李丰\t191496\n三点半\t191497\nCivilisation\t191498\n有特\t191499\n高通\t191500\nsci-hub\t191501\n地妻\t191502\n徐晖\t191503\n小狮子\t191504\ndelphi2010\t191505\n毓婷\t191506\n红袍女\t191507\n鹿港小镇\t191508\nwheels\t191509\n胶质\t191510\nHTMl\t191511\n于民\t191512\n贵局\t191513\nPROE\t191514\n江西应用科技学院\t191515\n加刑\t191516\ntimedatectl\t191517\n晋州市\t191518\n海尔\t191519\n60分\t191520\n交子\t191521\n三视图\t191522\n卢村\t191523\n博信股份\t191524\nupport\t191525\n配伍禁忌\t191526\n全国人民代表大会常务委员会\t191527\n飘洋\t191528\n尤莉\t191529\nStress\t191530\n塔里木河\t191531\n淫狐\t191532\nDebi\t191533\n中国铁路武汉局集团有限公司\t191534\n发糕\t191535\nxp1024核工厂down\t191536\n共和县\t191537\n_凡科\t191538\n手法\t191539\n杜建国\t191540\n同情\t191541\n五一节\t191542\n轨迹球\t191543\n800集\t191544\nsetdefault\t191545\n地域\t191546\n王者荣耀后羿\t191547\n三刀头\t191548\n郑耀先\t191549\n邦\t191550\n白血\t191551\n淞\t191552\nBrightman\t191553\n移值\t191554\n力器\t191555\n大行宫\t191556\n音乐传奇\t191557\n断网急救箱\t191558\nparticularly\t191559\n叠拼别墅\t191560\n青铜镜\t191561\n展翼\t191562\n法论\t191563\n济青\t191564\n流行性乙型脑炎\t191565\nExam\t191566\n斧头\t191567\n71\t191568\n扩大化\t191569\n灶头\t191570\n江铠同\t191571\ntrilogy\t191572\n李宏伟\t191573\n虎仔\t191574\n花雕酒\t191575\nh9\t191576\n冠盖\t191577\nKonka\t191578\nTRACE\t191579\n大慈恩寺\t191580\n北京神舟国旅\t191581\n红孩子\t191582\n少于\t191583\n才臣\t191584\n马长礼\t191585\n夜羽\t191586\n白刃战\t191587\n颜回\t191588\n王贝\t191589\njz.99.com\t191590\n文教资料\t191591\n消防许可证\t191592\n雪楼\t191593\n猴姆\t191594\n通机\t191595\nk神\t191596\nMidori\t191597\n朝花惜时\t191598\n帧头\t191599\n申办者\t191600\n身世之谜\t191601\n一曲\t191602\n乙二醇单丁醚\t191603\n拼客\t191604\n国基路\t191605\n乙\t191606\n开衫\t191607\n钧瓷\t191608\n菲菲\t191609\n曼食慢语\t191610\n脱臼\t191611\n拉米苏\t191612\nTech\t191
613\ncompete\t191614\n500V\t191615\n怒求\t191616\npayday\t191617\n瑞和城\t191618\n高教杯\t191619\n牛扒\t191620\ncho\t191621\ndwz\t191622\n6019\t191623\n撑开\t191624\n倒下来\t191625\n波导\t191626\n陆群\t191627\n刀法\t191628\n东风标致508\t191629\n高桥美绪\t191630\n公办小学\t191631\n36座\t191632\n赌徒\t191633\n平利县\t191634\n深海浩劫\t191635\n荣膺\t191636\nEmgu\t191637\n坚决反对\t191638\n几勺\t191639\nlo装\t191640\n名朋\t191641\n图学\t191642\n二手房交易网\t191643\n西乡塘\t191644\n2018.4.9\t191645\n极位\t191646\n斯达\t191647\n结核科\t191648\n无踪\t191649\n桑梓镇\t191650\n千所\t191651\n广州版小学\t191652\n中庸之道\t191653\n转金\t191654\n耳挂式\t191655\nMIX2s\t191656\n资产重组\t191657\n颍东区人民政府\t191658\n地波\t191659\n12秋\t191660\n古藤\t191661\n强弓\t191662\n胆管\t191663\n齿轮油\t191664\nPun\t191665\n太原理工大学\t191666\nSWM\t191667\n卷芯\t191668\n乌克兰\t191669\nDogg\t191670\n蚕结茧\t191671\n京瓷Kyocera\t191672\n温州中学\t191673\n新魏\t191674\n张朝辉\t191675\n播讲\t191676\n望天门山\t191677\n子瞻\t191678\n札达县\t191679\n炮楼\t191680\n尹珊珊\t191681\n_款\t191682\n绣春刀2\t191683\n敖包\t191684\ncastsequence\t191685\n托里姆\t191686\n长江三角洲城市群\t191687\n352号\t191688\n吸塑盘\t191689\n花东镇\t191690\n892\t191691\n逻辑层\t191692\n仙方\t191693\n掌趣\t191694\n清芷园\t191695\n自测试\t191696\n8051\t191697\n神装机\t191698\n捐精\t191699\nUSB\t191700\n偷窃\t191701\n本田crv\t191702\nip网\t191703\n糖份\t191704\n御府\t191705\n倚门\t191706\n云丹\t191707\n刀冽香\t191708\nFisheries\t191709\n中信证券投行部\t191710\n贫铀弹\t191711\n天王镇\t191712\n把妹达人\t191713\n蔚领\t191714\n腿裤\t191715\n2016-2018年度\t191716\n华胥引\t191717\n抗议\t191718\n18.com\t191719\n张晓明\t191720\n华科\t191721\n20151020\t191722\n驾到\t191723\nLizard\t191724\n抓获\t191725\n柳河县\t191726\n爱娃\t191727\n海淀区\t191728\n0775\t191729\n拉宽\t191730\nAliexpress\t191731\n线塔\t191732\n關係\t191733\n20160823\t191734\n转发器\t191735\nmf4700\t191736\n假冒伪劣\t191737\nFUCKING\t191738\n花伞\t191739\n黄河镇\t191740\njustification\t191741\n迅雷电驴\t191742\n揭阳市人民政府\t191743\n反式脂肪\t191744\n红霉素眼膏\t191745\n中央第三环境保护督察组\t191746\n一级域名\t191747\n取卡\t191748\n王冬梅\t191749\n长岛旅游网\t191750\n考段\t191751\n桃花仙\t191752\nccaa\t191753\n大峪\t191754\n徐徐\t191755\n电灯\t191756\n西西弗\t191757\nCAD2008\t191758\n色体\t19175
9\n肇兴侗寨\t191760\n直机\t191761\n关于审理人身损害赔偿案件适用法律若干\t191762\n北京市政府采购中心\t191763\n尚荣医疗\t191764\n电热油汀\t191765\n亨廷顿舞蹈症\t191766\n邪乎\t191767\n帕拓逊\t191768\n新癀片\t191769\n29名\t191770\n乱想\t191771\n天下霸图\t191772\n樱由罗\t191773\n燃烧网\t191774\n明势\t191775\nRho\t191776\n安发国际\t191777\nイキ\t191778\njourney\t191779\ntraps\t191780\n陈版主\t191781\n小鱼儿网\t191782\n2290\t191783\njsh\t191784\n霍英东\t191785\n无节\t191786\nv2.01\t191787\nnap\t191788\n创先\t191789\n旅游鞋\t191790\n尿酮体\t191791\n费纳\t191792\n360压缩软件\t191793\nmk6\t191794\n中国证券业协会\t191795\n螺片\t191796\n嘉庚\t191797\ni7-8700k\t191798\nShelley\t191799\n386\t191800\nArchlinux\t191801\noffice2007\t191802\n脱敏\t191803\n照管\t191804\nITA\t191805\n何苗\t191806\n资讯_风信网\t191807\n潭城镇\t191808\n莆田福房网\t191809\n整层\t191810\n富力尚悦居\t191811\n穆托姆博\t191812\n山药薏米芡实粥\t191813\n中俄列车大劫案\t191814\n实用率\t191815\n米缸\t191816\n铝土\t191817\n独占\t191818\n猎狐\t191819\n白领们\t191820\n中山市工商行政管理局\t191821\n自签名\t191822\n咲子\t191823\n附伤\t191824\nuzi\t191825\n送东阳马生序\t191826\nWindows、Linux\t191827\n辉腾锡勒草原\t191828\npyperclip\t191829\n恨嫁\t191830\nsnagit\t191831\n那一场\t191832\noffice365邮箱\t191833\n里格\t191834\n富创\t191835\nskylines\t191836\n笋盘\t191837\n洁云\t191838\n铜墙\t191839\nEdmodo\t191840\n大巷\t191841\n现已\t191842\nshk\t191843\n奔驰G63\t191844\n盛源\t191845\n石英比\t191846\n小沛\t191847\n整张\t191848\nGbLive\t191849\n大夜\t191850\n筹备期\t191851\n坑洼\t191852\n浦西万达广场\t191853\n蒋昌建\t191854\n150毫升\t191855\n诸界神级英雄\t191856\n刘维尔\t191857\n基隆\t191858\n淞沪\t191859\n波西米亚\t191860\nMurder\t191861\n门灯\t191862\nfuzhou\t191863\n邵伟华\t191864\nCAD2016\t191865\n坐标式\t191866\n温如歌北唐修\t191867\n台女\t191868\n浴袍\t191869\n生产量\t191870\n2018一模\t191871\n死气沉沉\t191872\n水质分析仪\t191873\n金沙洲万达广场\t191874\nCrowd\t191875\n五违\t191876\n平复帖\t191877\n暴雨\t191878\n麦尔\t191879\nDentistry\t191880\nSouvc\t191881\n山西能源学院\t191882\n明世隐\t191883\nlions\t191884\nJarvis\t191885\nsnh48吧\t191886\n胡八一\t191887\ná\t191888\n彻查\t191889\n绿藤\t191890\n维维集团\t191891\nnissan\t191892\n4279\t191893\n子宫直肠窝\t191894\nnss\t191895\n搪塑\t191896\njqgird\t191897\n新京报\t191898\n怪物猎人p3\t191899\nY430P\t191900\nOLYM
PUS\t191901\n行政组织学\t191902\n击坠\t191903\n0911\t191904\n威海卫\t191905\n法律本科\t191906\n山东译声翻译公司\t191907\n左云\t191908\n原创性\t191909\n艾瑞泽7\t191910\n化学反应\t191911\n途歌\t191912\nblink\t191913\n金勇\t191914\nveins\t191915\nMedeA\t191916\n盛松成\t191917\n语权\t191918\n自由故\t191919\nSymmetric\t191920\n银川晚报\t191921\n高锰酸钾\t191922\n服务名\t191923\n7号\t191924\nblanket\t191925\n物块\t191926\n0737\t191927\n满山\t191928\nadob\t191929\n水蛭\t191930\n国家自然科学基金青年基金\t191931\n父母的爱\t191932\ntuf\t191933\n反噬\t191934\n自相残杀\t191935\n延绵\t191936\n瑞莎\t191937\n列会\t191938\n20150916\t191939\nJavhd\t191940\nWS\t191941\n怛罗斯之战\t191942\n阿里云邮\t191943\n扮相\t191944\n弘业期货\t191945\nw3af\t191946\ndonald\t191947\nleayrainy\t191948\n华金\t191949\n互联网_赛迪网\t191950\n赤壁怀古\t191951\n市场营销系\t191952\n赎回期\t191953\n汇率\t191954\n泪别\t191955\n雷洋案\t191956\n袁腾\t191957\n中山科技园\t191958\n奔驰C级AMG\t191959\n测温\t191960\n130g\t191961\n引领者\t191962\ndcs\t191963\nAvision\t191964\n梁帝\t191965\n微漫画\t191966\n阳光房产网\t191967\n耐用\t191968\ndecay\t191969\nhqplayer\t191970\nVA-11\t191971\n吉恩\t191972\n东云\t191973\n威可\t191974\n勇者大战魔物娘\t191975\nbjrbj\t191976\n吃醋\t191977\nEiffel\t191978\n龙皮\t191979\n联窗\t191980\n淳化阁帖\t191981\n天惠超市\t191982\n加餐\t191983\n华中科\t191984\n面炉\t191985\nidw\t191986\n二包\t191987\n萎缩性胃炎\t191988\n2000年代\t191989\n梁田\t191990\n暂未\t191991\nChamber\t191992\nX21_\t191993\n呢\t191994\n189dg.com\t191995\nyoun\t191996\n二_\t191997\n基因敲除\t191998\n灯座\t191999\n雷士\t192000\n中国兵器工业集团\t192001\n维修金\t192002\n巩义市人民政府\t192003\n溶氧量\t192004\n际华\t192005\n杭州小说网\t192006\n四流南路\t192007\n修仙界\t192008\n遗失\t192009\n对於\t192010\n潮商\t192011\n帕特里克\t192012\n小米note\t192013\n数控弯管机\t192014\n49所\t192015\n超声波测厚仪\t192016\n63.5\t192017\nPea\t192018\n中国家电网\t192019\n熊继柏\t192020\nnoip\t192021\n朴素贝叶斯分类器\t192022\n金诚信\t192023\n春耕\t192024\n四首歌\t192025\n黑米糕\t192026\n乌拉拉\t192027\ncos妆\t192028\n苏垦农\t192029\n英盛网\t192030\nカ\t192031\n定制\t192032\n7多少\t192033\n尼古拉特斯拉\t192034\n一级建造师执业资格考试\t192035\n双手\t192036\n息烽\t192037\n重新来过\t192038\n王巧\t192039\n宕昌县\t192040\nI6\t192041\n思域论坛_汽车之家论坛\t192042\n第八部\t192043\nIndependent\t192044
\n设置页\t192045\n乌龙冰茶\t192046\n渎职罪\t192047\n三大\t192048\n3950\t192049\nu点\t192050\n成凤\t192051\n南越王\t192052\n守护灵\t192053\n高山村\t192054\n大金空调\t192055\n熊爪\t192056\n中国好人榜\t192057\n书阁\t192058\n手豪\t192059\nNACHI\t192060\n控制\t192061\n哈萨克语\t192062\n飞影\t192063\nslt\t192064\n建造师\t192065\n百灵威\t192066\n季枫\t192067\n重击\t192068\n首开国风美唐\t192069\nV类\t192070\n購買\t192071\n阶次\t192072\n张郭庄\t192073\n妮妲\t192074\n预防\t192075\nSQLserver数据库\t192076\n1T+128G\t192077\nvol.2\t192078\n再选\t192079\n空矩阵\t192080\n10亿\t192081\n黑河流域\t192082\n朴志胤\t192083\n接天莲叶无穷碧,映日荷花别样红\t192084\n家徽\t192085\n郭炜\t192086\n第105章\t192087\n泰嘉\t192088\n华辰\t192089\nex900\t192090\n切合\t192091\n2016年12月24日\t192092\n合眼\t192093\n华裕\t192094\ndirectshow\t192095\nall金\t192096\n大人气\t192097\n继承者\t192098\n澳门金沙\t192099\n至尊型\t192100\n喃喃\t192101\n金丝燕网\t192102\n经侦大队\t192103\n程皓\t192104\n台江\t192105\n嘉措\t192106\n周易取名测名网\t192107\n优木\t192108\n127k\t192109\n冯鹏\t192110\n年代感\t192111\n北京精雕集团\t192112\ndsw\t192113\n苏亚星\t192114\n相负\t192115\n200度\t192116\n南蛮子\t192117\n复极\t192118\naccordance\t192119\n演讲与口才\t192120\n千里共婵娟\t192121\n热气\t192122\nyuen\t192123\ncaterpillar\t192124\n红帽杯\t192125\n44篇\t192126\n华人论坛\t192127\n季花\t192128\nbm\t192129\nMITSUBISHI\t192130\n七步沟\t192131\n绝对\t192132\n尼康d5200\t192133\n73页\t192134\n李春雷\t192135\n250000\t192136\n养生园\t192137\n天津\t192138\n麦迪森\t192139\nthet\t192140\n3cm\t192141\n中震\t192142\nf-41082\t192143\nmovable\t192144\n黄花沟\t192145\naccession\t192146\nInspired\t192147\n圣歌\t192148\n百分之10\t192149\n10磅\t192150\n经营贷款\t192151\njmx\t192152\n楼门\t192153\n广告业\t192154\n避光\t192155\n250NK\t192156\n叶酸\t192157\nos.system\t192158\n念\t192159\n简奥斯汀\t192160\n神枪狙击\t192161\n佳能5d3\t192162\n行礼\t192163\n真野惠里菜\t192164\n腰眼\t192165\n有你有\t192166\n小钱\t192167\n断掉\t192168\n吴哥\t192169\nfbkc\t192170\n市发改委\t192171\n棉麻\t192172\n双廊\t192173\naccenture\t192174\n三硝基甲苯\t192175\n144分钟\t192176\n通付\t192177\n舜缘居\t192178\n40步\t192179\n异文\t192180\n十一种\t192181\n枪杀\t192182\n70号\t192183\n创意大厦\t192184\n标牌机\t192185\n中谍\t192186\nEXPIRE\t192187\n芒果体育网\t192188\nWooYun\t192189\n林俊\
t192190\n武汉)有限公司\t192191\nBootstrap\t192192\n说出去\t192193\n战国四公子\t192194\n英国BBC\t192195\n浦市\t192196\n第34章\t192197\n张丽梅\t192198\n艘\t192199\n第13集\t192200\n心动过速\t192201\n校宝在线\t192202\ndiscuzX\t192203\n轻钢结构厂房\t192204\n刻板\t192205\n沙田\t192206\n303路\t192207\n87家\t192208\n1779\t192209\n江门中心医院\t192210\n漂浮式\t192211\nestate\t192212\n屉\t192213\n簋街\t192214\n320i\t192215\n五大行\t192216\n八叉\t192217\n炫清\t192218\n金汇财经\t192219\n旬邑\t192220\n弘一\t192221\nSUMPRODUCT\t192222\n温州法院网\t192223\n莆田火车站\t192224\n116i\t192225\n泛雅\t192226\n湖南区\t192227\n4.82\t192228\n两员\t192229\n新平县\t192230\n养成类\t192231\nlibuv\t192232\n依然在\t192233\n600问\t192234\n散货\t192235\n余新\t192236\n博猫\t192237\n橘猫\t192238\n工艺师\t192239\n咪喹莫特乳膏\t192240\n无神论者\t192241\n水行者\t192242\n平江路\t192243\n照射\t192244\ncaoli1024\t192245\nEBSD\t192246\n院门\t192247\ndaw\t192248\n山坟\t192249\n古汉语常用字字典\t192250\n景悦\t192251\n29周\t192252\n通宇通讯\t192253\n废盐酸\t192254\n2015上半年\t192255\n传化公路港\t192256\n3776\t192257\n263邮箱\t192258\n劳动合同\t192259\n刘晓春\t192260\n横排\t192261\n錯\t192262\n黄热疫苗\t192263\n蕅益\t192264\n华夏民族\t192265\n鸡公山\t192266\n玉盘\t192267\n见效\t192268\n我的危险妻子\t192269\n胶状\t192270\n斯威士兰\t192271\n浙江省环境保护厅\t192272\n人猿泰山成人版\t192273\n定类\t192274\n应用经济学\t192275\nFIFAOnline3\t192276\nfota\t192277\n健高\t192278\n威克士\t192279\nentrySet\t192280\n重庆医科大学\t192281\n中华玻璃网\t192282\nlogo\t192283\n喷薄而出\t192284\n邵氏诗词库\t192285\n诗莉莉\t192286\n71集\t192287\n先决条件\t192288\n不栖\t192289\n一加仑\t192290\nipencil\t192291\n泸州港\t192292\n法制办\t192293\n卓里奇\t192294\n诛仙青云\t192295\n0551\t192296\n罗伦\t192297\nchorme浏览器\t192298\n300i\t192299\n落第骑士的英雄谭\t192300\n植绒\t192301\n知识库\t192302\nnope\t192303\n中部地区\t192304\n大姑\t192305\nswiss\t192306\n807\t192307\n非烟\t192308\n浙商\t192309\n照度计\t192310\nporch\t192311\n维安\t192312\n兰州市委\t192313\n母语\t192314\n西乡镇\t192315\n集合式\t192316\n12.0.7\t192317\n烧开\t192318\nPOSCMS\t192319\n2048年\t192320\n刘维\t192321\n生地会考\t192322\n九九八十一\t192323\n光电式\t192324\nGRIP\t192325\n4109\t192326\n复州城\t192327\n竖笛\t192328\n左瞳\t192329\n芦苞\t192330\n韭菜盒子\t192331\n卡尔曼\t192332\n波澜哥\t192333\npierre\t192334\n张晓
\t192335\n憨山\t192336\nmodular\t192337\n左炔诺孕酮片\t192338\n袖之手\t192339\n热风\t192340\n金鱼姬\t192341\n婚礼\t192342\n顶族\t192343\n位选\t192344\n经学\t192345\n吴永宁\t192346\n190\t192347\npenetration\t192348\n深圳惠程\t192349\n600天\t192350\n寒潮\t192351\nNgrok\t192352\n尼丁\t192353\n更准确\t192354\n白洁少妇\t192355\n白豆蔻\t192356\n土产\t192357\n彪子\t192358\n达尔电影网\t192359\n贷书\t192360\n枫林路\t192361\n首篇\t192362\n菱形块\t192363\n葬爱\t192364\n_片\t192365\n匹凸匹\t192366\nspade\t192367\n单元格里\t192368\n北京军区总医院\t192369\n|名\t192370\nRailgun\t192371\n最后一首歌\t192372\ncctv6\t192373\n天平座\t192374\n飞秒激光器\t192375\n31_\t192376\n独山子\t192377\n劳务报酬\t192378\n武汉华为\t192379\n梗阻性黄疸\t192380\nHoneySelect\t192381\n全市\t192382\n临港集团\t192383\nFreeBSD\t192384\n唱吧吧_\t192385\nGR07\t192386\n瓯海大道\t192387\n戴尔灵\t192388\n跳跳\t192389\n碧木凛\t192390\ntransform\t192391\n东方午新闻\t192392\n蓝茶\t192393\nMidwest\t192394\nMVVM\t192395\n中华写字楼网\t192396\n传颂\t192397\n琼台师范学院\t192398\n重庆火车站\t192399\n紫东\t192400\n荣耀黄金\t192401\nMyEclipse2016\t192402\n工商管理局\t192403\n植物大战僵尸单机版\t192404\n维克斯\t192405\n普利盘\t192406\n妻妾成群\t192407\n栓塞\t192408\nイケ\t192409\nPERSONAL\t192410\n西安公交查询网\t192411\n百利金\t192412\n点率\t192413\n深夜食堂3\t192414\n兴业信托\t192415\n沈鼓\t192416\n1584\t192417\n陈惠敏\t192418\n2季\t192419\n磷酸二铵\t192420\n浙大玉泉校区\t192421\ndeclaring\t192422\nspark,storm\t192423\n洵\t192424\nstanford\t192425\n漂移\t192426\n超仙\t192427\n日本菜\t192428\n手机网易网\t192429\n旺角卡门\t192430\n可依\t192431\ncub\t192432\n银河广场\t192433\n碳化炉\t192434\n冷冬\t192435\n求贤\t192436\n塑胶跑道\t192437\nComedy\t192438\n第102期\t192439\nthinkpad8\t192440\ny型\t192441\n片断\t192442\nGantz\t192443\n内蒙古政协\t192444\n石家庄理工职业学院\t192445\n耐压值\t192446\n五幼\t192447\n十三幺\t192448\ntop5\t192449\n忻州师范学院\t192450\n鸡头\t192451\n叶勇\t192452\n机构\t192453\nspark-sql\t192454\n掘港镇\t192455\n无信\t192456\n混卖\t192457\n易尔通\t192458\n驭房\t192459\n正大城\t192460\n刘真\t192461\n自白\t192462\ntreeview\t192463\n字架\t192464\n思忖\t192465\nxhci\t192466\n珠江商报\t192467\n苏州环保局\t192468\n谢芷馨\t192469\n桑吉尔夫\t192470\n周建\t192471\n内圈\t192472\n22分钟\t192473\n南汇街道\t192474\n福建省政协\t192475\nbrands\t192476\nTd\t192477\n音乐奖
\t192478\n徐州市物价局\t192479\n写开\t192480\nloa\t192481\nsh\t192482\n救济金\t192483\n脚链\t192484\n贺强\t192485\n20160429\t192486\nautocad2017\t192487\nneihan\t192488\nG80\t192489\n枯竭\t192490\n标距\t192491\n几等\t192492\n黄缘闭壳龟\t192493\n可米酷\t192494\n提意见\t192495\nof&#160\t192496\n迷烟\t192497\n小米平板2\t192498\n信用保证保险\t192499\n双山岛\t192500\n名伶\t192501\nattending\t192502\n布料\t192503\nixf\t192504\n升阳\t192505\n许家\t192506\n尘埃\t192507\n往前\t192508\n中通服\t192509\n新录\t192510\n纸质\t192511\nOFFICE2016\t192512\n南斯拉夫\t192513\n退出来\t192514\n伸缩性\t192515\nHotel\t192516\nHuntKey\t192517\n诛砂\t192518\n山东省人民政府国有资产监督管理委员会\t192519\n有限差分\t192520\nCecile\t192521\n6月30日\t192522\nBridal\t192523\n面粉\t192524\n溪上\t192525\n石化加油卡\t192526\n圣品\t192527\n你中有我\t192528\n血项\t192529\n23亿元\t192530\n剑宗\t192531\n民星\t192532\n逆汤\t192533\n300_\t192534\n王立\t192535\n郑州小学\t192536\nabn\t192537\n罗源湾\t192538\n阴阳镜\t192539\n150多\t192540\nbamboo\t192541\nPlasmid\t192542\n楼车\t192543\n中国广州分析测试中心\t192544\n总编辑\t192545\n橘味\t192546\n苏州市相城区\t192547\n五觉\t192548\nfle\t192549\n深圳艺术学校\t192550\n捕捞\t192551\nzhiliao\t192552\n萧平旌\t192553\n重生之名流巨星\t192554\n贵阳市政府\t192555\n三氯化磷\t192556\n献祭\t192557\n射频识别技术\t192558\nwinbond\t192559\n汉语发音\t192560\n丹尼斯克\t192561\n芽森\t192562\nAV无码\t192563\n中国教育信息网\t192564\n蒙特利\t192565\n改良剂\t192566\nCurriculum\t192567\n40克\t192568\n解剖师\t192569\nxiaoxing\t192570\npope\t192571\n香榭丽舍大街\t192572\n更准\t192573\n王红霞\t192574\n蔡江\t192575\n五香\t192576\nTove\t192577\n夺爱\t192578\ngraphy\t192579\n盘古开天地\t192580\n莜面\t192581\noppor9\t192582\n北南\t192583\n左乙拉西坦\t192584\nsuds\t192585\n2835\t192586\n卫铁\t192587\nMarlboro\t192588\n杭氧股份\t192589\nrs422\t192590\n裁决书\t192591\n贪狼化\t192592\nShimily\t192593\n仙踪林\t192594\noldest\t192595\npang\t192596\n几十次\t192597\n首访\t192598\nmapp\t192599\n水工\t192600\n重庆燃气\t192601\n导入库\t192602\n一唱\t192603\n大牌驾到\t192604\n中产阶级\t192605\n菜鸟团\t192606\n4.com\t192607\nDay2\t192608\n盛京医院\t192609\n300MIUM\t192610\n善行者\t192611\ngittortoise\t192612\n林妙可\t192613\ncompetency\t192614\n天域苍穹\t192615\nRevolve\t192616\nwin7iso\t192617\n9k9k网页游戏开服数据中心\t1
92618\n环世界a18\t192619\nsysstat\t192620\n20150505\t192621\n薄樱\t192622\nAids\t192623\nPTV\t192624\n物色\t192625\n金华房产网\t192626\n菽庄花园\t192627\n阿魏酸\t192628\n兔狲\t192629\n20秒\t192630\n卡卡贷\t192631\n安瓿瓶\t192632\n一两百\t192633\n盐酸左氧氟沙星\t192634\n曲姓\t192635\n白龙吟\t192636\nAnnotation\t192637\nCFDA\t192638\n自学日语\t192639\n转现\t192640\n太认真\t192641\n查理斯\t192642\nBOOS\t192643\n白塔公园\t192644\n原义\t192645\n喻可欣\t192646\n上海和睦家医院\t192647\n_美骑网\t192648\n陈明仁\t192649\nbrilliance\t192650\n贼巢\t192651\n赵立平\t192652\n里昂\t192653\n吕四港\t192654\n39岁\t192655\n创辉煌\t192656\nForbes\t192657\nStraight\t192658\naotucad\t192659\n蜀道\t192660\n共沸点\t192661\n尸身\t192662\n吉林农网\t192663\n远洋天地\t192664\n金蝶K3\t192665\n7图\t192666\n腐图\t192667\n伊斯兰革命\t192668\n缩孔\t192669\n刘培基\t192670\n3158家居网\t192671\n窗锁\t192672\n洗练\t192673\n火计\t192674\n数据局\t192675\n后片\t192676\n226dw\t192677\n陈旭\t192678\n坐\t192679\n松园\t192680\n林斤澜\t192681\n甩奶舞\t192682\n二级建造师执业资格考试\t192683\njunit\t192684\n临沂市文化广电新闻出版局\t192685\n光屏\t192686\n通职者\t192687\n康丽\t192688\n重庆市人民政府\t192689\n2011年6月\t192690\n詹姆斯·弗兰科\t192691\n四中\t192692\n腕关节\t192693\n手机凤凰网\t192694\n北京实验二小\t192695\n佳能MP288\t192696\n上海新能源汽车\t192697\n潼湖\t192698\n买空\t192699\n快乐西游\t192700\n折扣卷\t192701\n纸包鱼\t192702\n百色市\t192703\nECS\t192704\n中艺\t192705\n朱青生\t192706\nGridview\t192707\n京享值\t192708\ntreasury\t192709\n姬骑士莉莉娅\t192710\n福建省小学\t192711\n高晴\t192712\n彩虹关\t192713\n腹腔镜\t192714\n磁扣\t192715\n龙的心\t192716\n贵族\t192717\n第八张\t192718\n烈士墓\t192719\n849\t192720\n炼金术\t192721\n刘克庄\t192722\nclio\t192723\npthreads\t192724\n中药师\t192725\nRem\t192726\n辽宁宏运\t192727\n002120\t192728\nAudible\t192729\n三国志11吧_\t192730\n阿斯蒂芬\t192731\nUFO事件\t192732\n孤岛惊魂5吧\t192733\n深圳人社局\t192734\nvfd\t192735\n淮安日报\t192736\n上海钢联\t192737\nholding\t192738\nobj格式\t192739\nkds\t192740\ncashier\t192741\n酒精炉\t192742\n李潇潇\t192743\n缸壁\t192744\nhhhh\t192745\njesse\t192746\ns5700\t192747\n千兆路由器\t192748\n高品位\t192749\n明忽暗\t192750\n非酒精性脂肪性肝病\t192751\n大七\t192752\n福兴\t192753\n叶忆落\t192754\n萧功秦\t192755\n黑暗天使\t192756\n帝姬\t192757\nLight\t192758\n致青春\t192759\n偏执\t192760\nJD
E\t192761\n正南\t192762\n中国长安汽车集团股份有限公司\t192763\n慰问品\t192764\n碧昂丝\t192765\n读写\t192766\n虚拟办公室\t192767\n山东兄弟\t192768\ndmax\t192769\n位置处\t192770\nliblinear\t192771\n果袋\t192772\n连成\t192773\nA*算法\t192774\n欧体\t192775\n桂龙\t192776\n诱惑性\t192777\n挡土\t192778\n476\t192779\n开滦集团\t192780\n中原工学院信息商务学院\t192781\nhell\t192782\n宽带山\t192783\n白各庄新村\t192784\nweifengCorp\t192785\n荣组\t192786\n义道\t192787\n弥渡县\t192788\n导热仪\t192789\nimportance\t192790\n龙血武帝\t192791\n红腰豆\t192792\n|华律办事\t192793\n污水泵\t192794\n中旭\t192795\n缺德\t192796\n邱启明\t192797\nBecca\t192798\n胎垫\t192799\n卡奥网\t192800\nfittime\t192801\n中央苏区\t192802\n张亚辉\t192803\n金山在线\t192804\nTP5.0\t192805\n搬到\t192806\nMaterial\t192807\n72次\t192808\n无悔人生\t192809\nWHQL\t192810\n吃香\t192811\n此时\t192812\nfavor\t192813\n邓世昌\t192814\n本草\t192815\nJedisCluster\t192816\nv13.0\t192817\n驼子\t192818\n小米温湿度传感器\t192819\n宝骏E100\t192820\n本溪\t192821\n蓝光集团\t192822\n家辉\t192823\nMarvell\t192824\nyanghuahui\t192825\n8组\t192826\n装饰装修公司\t192827\n九重天\t192828\n耐用品\t192829\nfollow\t192830\nc4l\t192831\n真核\t192832\n温压\t192833\n红猫\t192834\n跳舞的线\t192835\n管螺纹\t192836\n张克帆\t192837\n栈板\t192838\n电脑显示器\t192839\n鉴真东渡\t192840\nArmed\t192841\n黑犬兽\t192842\nエレジ\t192843\n置于\t192844\n顽\t192845\n小许\t192846\n上海大学宝山校区\t192847\n低血钾\t192848\n牧绅\t192849\n周力\t192850\n沙漠公路\t192851\n巴列维\t192852\nUnusual\t192853\n色达五明佛学院\t192854\n苯环\t192855\n40摄氏度\t192856\n硫酸庆大霉素注射液\t192857\n9栋\t192858\n经理们\t192859\n冰条\t192860\n吉大一院\t192861\n华三模拟器\t192862\n在梦中\t192863\n结息\t192864\n第i个\t192865\n线程池\t192866\n防火漆\t192867\n探索式\t192868\n十九层\t192869\n热封\t192870\n中国福利会少年宫\t192871\n陈建州\t192872\n国币\t192873\n庄浪门户网\t192874\n海之蓝\t192875\n还商贷\t192876\n得儿\t192877\ngblive\t192878\n姊妹节\t192879\nrash\t192880\n锇\t192881\n宜川\t192882\n肺炎\t192883\n雅筑\t192884\n死胎\t192885\nICT\t192886\n1848年\t192887\n忘仙\t192888\nRancher\t192889\nTorrents\t192890\n袁慧琴\t192891\n筑梦路\t192892\n院线\t192893\n卡缪\t192894\n赋形\t192895\n头声\t192896\n死于安乐\t192897\nODL\t192898\n1^2\t192899\n企信\t192900\n1.3.0.0\t192901\n指天\t192902\n香港岭南大学\t192903\nF22\t192904\n大皮\t192905\
ndiya\t192906\n腰身\t192907\n2V2\t192908\n喻文州\t192909\n云人\t192910\n县农机局\t192911\n辽宁省委\t192912\n挑花\t192913\nText\t192914\n富平\t192915\n受委屈\t192916\n傻傻分\t192917\n卡诗顿\t192918\nBears\t192919\nlibmemcached\t192920\nwidi\t192921\n顶盖\t192922\nExcel表图\t192923\n冰人\t192924\nA2版\t192925\n巨肺\t192926\n大侠\t192927\n牛初乳\t192928\n克莱鹏\t192929\n坐厕\t192930\n操作父\t192931\nmuban\t192932\n吴双\t192933\n无心快语\t192934\n仙剑三外传\t192935\n噻\t192936\n考勤表\t192937\n宋先生\t192938\ncdj\t192939\n千万富翁\t192940\n列王的纷争\t192941\n新能力\t192942\n沾光\t192943\n访惠聚\t192944\n挖心\t192945\n一宗\t192946\npsd模板\t192947\n县教育局\t192948\n唐玲\t192949\n员章\t192950\n密目式安全网\t192951\n云漫\t192952\n义兄\t192953\nTwist\t192954\n企画\t192955\n男星们\t192956\ninterp1\t192957\n大切诺基\t192958\n洋洋得意\t192959\n送命\t192960\n无解\t192961\n万元\t192962\n同家\t192963\n五公斤\t192964\n夜刀\t192965\n2017年端午节\t192966\nCondensed\t192967\n三叶青\t192968\n焚香\t192969\n莫希\t192970\n英雄主义\t192971\n28码\t192972\n篆刻吧\t192973\n玉溪镇\t192974\n潮平\t192975\ndrama\t192976\n上善若水\t192977\n徐芃\t192978\n吉他谱txt-和弦谱\t192979\n海马300\t192980\n则灵\t192981\n缴械\t192982\n中远海特\t192983\n天华建筑设计公司\t192984\n黄老五\t192985\n中国畜牧兽医学会\t192986\n火焰检测器\t192987\n安徽人事考试网\t192988\n锂电车\t192989\n康普\t192990\n中移物联网有限公司\t192991\nphpmyadmin\t192992\n小脸\t192993\n第115期\t192994\n笛音\t192995\nGentlemen\t192996\n两港\t192997\n雇工\t192998\n竹青\t192999\n慌张\t193000\n犹江\t193001\n北京人社局\t193002\nBD1080p\t193003\n蓝科高新\t193004\n上海中公\t193005\nwants\t193006\n还信\t193007\n搞大\t193008\n烤鸭蛋\t193009\nRia\t193010\n寡女\t193011\n12月15日\t193012\n鹤望兰\t193013\n云南网络广播电视台\t193014\n李东健\t193015\n林奚\t193016\n饼铛\t193017\n迟早\t193018\nQt编程\t193019\n方灯\t193020\n紫癫\t193021\n橙弓\t193022\n怪物猎人X\t193023\n关胜\t193024\n抢客\t193025\n中国太平洋保险(集团)股份有限公司\t193026\n代子\t193027\n百达\t193028\nXNA\t193029\n女队\t193030\n五一节前\t193031\n白川\t193032\n铃木杏\t193033\nzip\t193034\n建国门\t193035\n百夜\t193036\n片群\t193037\nfaraway\t193038\n桅杆\t193039\n红剑鱼\t193040\n集气瓶\t193041\n谭浩俊\t193042\n6.26\t193043\n效绩\t193044\n美竹凉子\t193045\n姚国华\t193046\n旗帜\t193047\n拉环\t193048\n一小杯\t193049\n博努奇\t193050\n304不锈钢管\t193051\nc3po\t1930
52\n四会\t193053\n武汉三镇\t193054\n冷却\t193055\n安定门外大街\t193056\nipad7\t193057\nyyp\t193058\n腹股沟处\t193059\n铜皮\t193060\n誘\t193061\n李腾\t193062\nCMSIS\t193063\nmodem\t193064\n房檐\t193065\nflv\t193066\n临安19楼\t193067\n止损\t193068\n卡会\t193069\n诗意\t193070\n巴音郭楞蒙古自治州人民政府\t193071\n凹岸\t193072\n蒙面\t193073\n唉\t193074\n金杨路\t193075\n贺炜\t193076\nXE6\t193077\n同等学力在职研究生\t193078\n内蒙古自治区国家税务局\t193079\n逢春\t193080\n撞出\t193081\n莲花小区\t193082\n塞拉维\t193083\n256bit\t193084\n中水渔业\t193085\n翻折\t193086\n瘦客户机\t193087\n也谈\t193088\n惠尔顿\t193089\n崔庆才\t193090\n超声波塑料焊接机\t193091\n恩施市政府\t193092\n原道\t193093\n博学网\t193094\n四十不惑\t193095\n二战\t193096\n铝瓶\t193097\n军饷\t193098\n永诺\t193099\nx240s\t193100\n鸿蒙天尊\t193101\nWin版\t193102\n杂货\t193103\n至极\t193104\n林忠钦\t193105\n300072\t193106\n柳汽\t193107\n棕榈泉\t193108\n阳坝\t193109\n羽生亚梨沙\t193110\n自动控制系统\t193111\n民国二十六年\t193112\n朗阁雅思\t193113\nStatement\t193114\n孙逸仙\t193115\n黄金眼\t193116\n中国循环经济协会\t193117\nSL700\t193118\noba\t193119\nexchanges\t193120\n铁们\t193121\n精米\t193122\nnsp\t193123\n户号\t193124\n五十度灰\t193125\n隔帘\t193126\n东方彧卿\t193127\n第4场\t193128\n丧妻\t193129\n塘镇\t193130\n自营\t193131\n宅宅\t193132\n龙山公园\t193133\n5040\t193134\n红帆\t193135\n游天堂\t193136\n车线\t193137\n古县\t193138\nanimax\t193139\n述标\t193140\n切瑟尔\t193141\n金智秀\t193142\n嘻嘻哈哈笑话网\t193143\n默认输入法\t193144\n星际迷航3:超越星辰\t193145\n预建\t193146\n木九十眼镜\t193147\n讲法\t193148\n润格\t193149\n猪肝\t193150\nCousin\t193151\n0y\t193152\n卧龙谷\t193153\n85分\t193154\n梅干\t193155\n母爱\t193156\nAusten\t193157\n对手方\t193158\n科尔沁右翼前旗\t193159\n普吉府\t193160\n郑州电子信息职业技术学院\t193161\nSteamSpy\t193162\n长顺县\t193163\n东方之门\t193164\n包头新闻网\t193165\n上货\t193166\n上海机电\t193167\nLISTVIEW\t193168\nrion\t193169\n音羽\t193170\n血红\t193171\n奇骏2.0\t193172\n亨廷顿\t193173\n浅浅\t193174\n祭灵\t193175\nboats\t193176\n金沙花园\t193177\n下机\t193178\nruler\t193179\n广昌\t193180\n65万\t193181\nWIfi\t193182\n软木\t193183\neaoron\t193184\n西野七濑\t193185\n个股期权\t193186\n成都装饰公司\t193187\nhawk\t193188\n梅州日报\t193189\n┨\t193190\n兴海县\t193191\n出生人\t193192\ncbox央视影音\t193193\n上海教育宝\t193194\napt\t193195\n重庆科技馆\t193196\n维度表\t193197\
nprominent\t193198\nair/pro\t193199\n屏蔽罩\t193200\n宇信易诚\t193201\n依依\t193202\nwcl\t193203\n甲天下\t193204\n兔纸\t193205\n8.0.1.0\t193206\n纽瑞德\t193207\n001002008\t193208\n丁俊辉\t193209\n妙法\t193210\n李柱山\t193211\n商经\t193212\n便捷\t193213\n石虾\t193214\n诗朗诵稿\t193215\n大橋未久\t193216\n整理师\t193217\n丽珠医药\t193218\n私情\t193219\n海商王\t193220\n东风日产汽车\t193221\nteenager\t193222\n上海恒大\t193223\n蒙德里安\t193224\n吴群\t193225\niS6\t193226\n屌色\t193227\n裁缝店\t193228\n取而代之\t193229\n东白山\t193230\n硅胶片\t193231\n黄灿灿\t193232\ncalib\t193233\nCIMA\t193234\n从第\t193235\n金缕\t193236\n雅尊\t193237\n佳苗\t193238\n增值税暂行条例\t193239\nREASON\t193240\n唐生智\t193241\nDerby\t193242\nzmx\t193243\n终结者外传\t193244\n横跳\t193245\n难忘的回忆\t193246\n雅典\t193247\n2016年3月25日\t193248\nStreak\t193249\n鞍山市人民政府\t193250\n宋清辉\t193251\n疱\t193252\n主旋律\t193253\ngzx\t193254\n中共中央办公厅\t193255\n2017年12月7日\t193256\n启辰D50\t193257\n汉南\t193258\n名窑钧瓷网\t193259\n太多\t193260\ndto\t193261\n201607\t193262\n迅猛\t193263\n宁波社保\t193264\n白盒测试\t193265\n龙心\t193266\nFSCapture\t193267\n100.dll\t193268\n泗州\t193269\n格式转换器\t193270\ncpta\t193271\n海豹突击队\t193272\n神墨\t193273\n破镜\t193274\n荷色\t193275\n9dmgame\t193276\n九宝\t193277\n桦甸市\t193278\n防皱\t193279\n老巷\t193280\n网线路\t193281\nメモリ\t193282\n卡瓦依\t193283\n科技苑\t193284\nMerray\t193285\n武汉信息传播职业技术学院\t193286\n深度国际\t193287\n2015年5月\t193288\n中国时尚\t193289\nfunko\t193290\n发闷\t193291\ntopjui\t193292\n张碧晨\t193293\n蜂拥而至\t193294\n汗水\t193295\n马诺\t193296\n入团申请书\t193297\n煤矿机\t193298\n骨疼\t193299\nsockaddr_in\t193300\n加藤莉娜\t193301\n片装\t193302\n液压劈裂机\t193303\n美句\t193304\n1480元\t193305\n翼支付\t193306\n奥卡索\t193307\nwaters\t193308\nZhan\t193309\n候语\t193310\njiuye\t193311\n陈哥\t193312\n微店网\t193313\n巨蟹座\t193314\n枣庄新闻网\t193315\n田园式\t193316\nMoleskine\t193317\n内件\t193318\n软驱\t193319\n董鄂妃\t193320\n跟宠\t193321\nLOADING\t193322\nvalves\t193323\nquartus2\t193324\n朱辰彬\t193325\n阿楞\t193326\n中方便地\t193327\n丁克族\t193328\n容克\t193329\napkpure\t193330\nguarantee\t193331\nHouse\t193332\n284号\t193333\n存储位\t193334\nevents\t193335\nC3P0\t193336\n院景\t193337\n暴汗服\t193338\n皮绳\t193339\n医学检验科\t19
3340\nacorn\t193341\n保利堂悦\t193342\n明仁天皇\t193343\n地皮菜\t193344\n书记\t193345\n百分之98\t193346\n1000页\t193347\nBBN\t193348\nminiso名创优品\t193349\n20180330\t193350\n党魁\t193351\n张仲景\t193352\n张纯如\t193353\n暴饮\t193354\n何波\t193355\n王宣\t193356\n四川人大\t193357\n默认播放器\t193358\nvideos性\t193359\ncuit\t193360\n王中山\t193361\n松桃县\t193362\n快马加盟网\t193363\n1.0.1_\t193364\n空海法师\t193365\n众发娱乐\t193366\n贷款方\t193367\n为何\t193368\n兽人\t193369\n3.8元\t193370\n与罚\t193371\n戒毒所\t193372\n羟基苯乙酮\t193373\n三原穗花\t193374\n长海\t193375\n雄虎\t193376\n神武卡\t193377\n辛波斯卡\t193378\n捉迷\t193379\n国公\t193380\n大秦\t193381\n安康高新区\t193382\n强取\t193383\n主泵\t193384\n五爪金龙\t193385\n登报\t193386\n热老化\t193387\n三届二次\t193388\n柠檬绿茶\t193389\n切尔西\t193390\n光字\t193391\n海澜\t193392\n龙之塔\t193393\n市发改局\t193394\n603383\t193395\nRDBMS\t193396\n奇形\t193397\n从头至尾\t193398\n表盖\t193399\n婴儿游泳馆\t193400\n诛杀\t193401\ncoturn\t193402\n易拓\t193403\n装箱机\t193404\nevaluate\t193405\n玻璃反应釜\t193406\n周作\t193407\n文字学\t193408\nexodus\t193409\n51电子网\t193410\n圣景\t193411\n仲裁员\t193412\nFREE\t193413\n亚述\t193414\n2012年度\t193415\nshut\t193416\n红坛\t193417\n2018君越\t193418\n神钢\t193419\n闪乱神乐吧\t193420\n新浪家居网\t193421\n奔驰glc\t193422\n专升本考试信息网\t193423\n电暖器\t193424\n三叔\t193425\nDirectX11\t193426\n安彩\t193427\n流放之路-POE\t193428\n刘春燕\t193429\n李雨霏\t193430\n岗集\t193431\nHelloKitty\t193432\n嵊泗列岛\t193433\n谷胱甘肽\t193434\n乱马\t193435\n20160309\t193436\n秀坊\t193437\n银根\t193438\n余姚北\t193439\npackard\t193440\n周蓬安\t193441\n大仙们\t193442\n佳能600d\t193443\n2018-03-06\t193444\n三清山旅游网\t193445\n2行\t193446\n张鲁\t193447\n豫菜\t193448\nDvbbs\t193449\n烤箱\t193450\n阴兽\t193451\n升降台\t193452\nYCbCr\t193453\n碧岭\t193454\ncellspacing\t193455\n安徽省质量技术监督局\t193456\n微不足道\t193457\n波多野结衣家庭教师\t193458\n韩立华\t193459\n转机\t193460\n嫩江政府网\t193461\n白点\t193462\n非常规\t193463\n钓饵\t193464\n巴音汗\t193465\n太极旗\t193466\n发面\t193467\n洞口县人民政府\t193468\n亚巴顿\t193469\n云彩\t193470\n兵痞\t193471\n纳米片\t193472\ncoln\t193473\nRollen\t193474\n幸福耙耳朵\t193475\n2.9.0\t193476\n2016年4月11日\t193477\n南通路\t193478\nUV平板打印机\t193479\n马坝\t193480\n同济大学出版社\t193481\n贾宏声\t193482\ninterview
s\t193483\n声部\t193484\n第四波\t193485\n1000例\t193486\n5-4\t193487\n过分\t193488\nGNP\t193489\n_火车网\t193490\n王秀芳\t193491\n汇价\t193492\n胶合木\t193493\n预付账款\t193494\n电脑维修-ab蓝学网\t193495\n6.8分\t193496\n自始至终\t193497\n重甲\t193498\nConsent\t193499\n修旧利废\t193500\nHeidiSQL\t193501\n五福星撞鬼\t193502\n五环路\t193503\n1000毫安\t193504\njmm\t193505\n现代经济信息\t193506\n逍遥浮士德\t193507\nポレ\t193508\n龙去脉\t193509\n浙江清华长三角研究院\t193510\n镍氢电池\t193511\nasml\t193512\nffmepg\t193513\n胡麻油\t193514\nWdate\t193515\n2577\t193516\n能伸\t193517\n胸闷气短\t193518\n自愈\t193519\n后浇带\t193520\npsoc\t193521\n周佛海\t193522\n调台\t193523\n汉\t193524\n电铲\t193525\n20170715\t193526\n礼品袋\t193527\n新帝\t193528\n济南地区\t193529\n证候\t193530\n派遣员\t193531\n亮瞎眼\t193532\n彩铝\t193533\n存款利率表一览_银行信息港\t193534\nDeco\t193535\n廉贞\t193536\nexcel画曲线图\t193537\n148分\t193538\n锚钉\t193539\n彩调剧\t193540\n何镜堂\t193541\nll\t193542\n10700\t193543\n自我保护\t193544\n彭县\t193545\n北京环球主题公园\t193546\n新序\t193547\n6080电影网\t193548\n授业解惑\t193549\ndudu\t193550\n圆球\t193551\n高血压药\t193552\n论驾\t193553\n变形金刚\t193554\nmoko\t193555\nalgo\t193556\n益母草膏\t193557\n跟著\t193558\n2520M\t193559\nmatlap\t193560\nGraphPad\t193561\n齐鲁网\t193562\n虚声\t193563\n脱身者\t193564\n不发\t193565\n晕了\t193566\n长隆店\t193567\n数学期\t193568\n带通滤波\t193569\n3158创业信息网\t193570\n徇\t193571\nHandBrake\t193572\naxure中继器\t193573\n设计学\t193574\nLandmark\t193575\nHikariCP\t193576\nweber\t193577\n战死\t193578\n超蝙\t193579\n信号源\t193580\n扑热息痛\t193581\n首名\t193582\n穹庐\t193583\n冷箭\t193584\n点样\t193585\n国务院扶贫开发领导小组办公室\t193586\nwin7计算器\t193587\n鲁人版\t193588\n春季\t193589\n铁磁材料\t193590\ntears\t193591\n湿陷性\t193592\n巨猩\t193593\n胁\t193594\n这种事\t193595\n中软国际\t193596\n橡皮管\t193597\n自由贸易协定\t193598\n低通滤波器\t193599\n晶码\t193600\n2.61\t193601\n肥臀\t193602\n东海泰禾广场\t193603\n圣沃尔夫冈\t193604\n忍刀七人众\t193605\n北斗卫星\t193606\n18105\t193607\n用税\t193608\n伤心太平洋\t193609\n罗技G403\t193610\nICOCA\t193611\nL2\t193612\n多穷\t193613\nsectional\t193614\n超静\t193615\n四溅\t193616\n卫夫人\t193617\n盱眙365网\t193618\ncapo\t193619\n电力仪表\t193620\nBk\t193621\nHYSYS\t193622\n共享盘\t193623\n邓亚萍\t193624\n43mm\t193625
\n抗磨\t193626\nBD1080P高清\t193627\n春素小皙\t193628\nAngelica\t193629\n603897\t193630\n详情页\t193631\n_诺\t193632\n斗鱼小叮当\t193633\n源文件名\t193634\n无动\t193635\n我的未来\t193636\n360手机N4\t193637\n台盟\t193638\n全球铁合金网\t193639\n大罗山\t193640\n重托\t193641\n铂丝\t193642\n金山小学\t193643\n可动性\t193644\nP20#\t193645\n神\t193646\n聚客通\t193647\n塔子\t193648\n舜宇光学科技\t193649\n玻璃钢桥架\t193650\n5月\t193651\n赤子心\t193652\n猪猪侠之五灵守卫者\t193653\nHTML/Xhtml\t193654\n_钱站\t193655\nDQN\t193656\n绿地集团\t193657\n套数\t193658\n湖南镇\t193659\n加措\t193660\n初闻\t193661\n用口\t193662\n国泰产险\t193663\n变态杀人狂\t193664\n计生信息网\t193665\n检察官外传\t193666\nstrain\t193667\n慎买\t193668\n家家顺\t193669\n排舞\t193670\n盗版小说\t193671\n07073\t193672\n第.\t193673\nWhite\t193674\n贵阳\t193675\n第3行\t193676\n第132章\t193677\nsurveys\t193678\n6期\t193679\nMustang论坛_汽车之家论坛\t193680\n米兰达可儿\t193681\n蚂蜂窝自由行\t193682\n3月6号\t193683\n材料成型及控制工程\t193684\n吴慷仁\t193685\n索尼XZ1\t193686\n上海物贸\t193687\ndraftsight\t193688\n凝点\t193689\n骨髓增生\t193690\n中南美洲\t193691\n钿\t193692\n强夯机\t193693\n妈妈的朋友4\t193694\n一百多\t193695\n虎胆龙威4\t193696\n通史\t193697\nSmoke\t193698\nCapsule\t193699\n20间\t193700\n聊城站\t193701\n斯坦尼斯拉夫斯基\t193702\n3例\t193703\n北京地铁10号线\t193704\n亿智\t193705\n侠盗猎\t193706\n刘青云\t193707\n米奥\t193708\n辽宁大厦\t193709\nImprovement\t193710\n抛掷\t193711\n螯合剂\t193712\n南苏丹\t193713\n宠物级\t193714\n闯入者\t193715\n珍惜自己\t193716\n伏立康唑\t193717\n回眸一笑百媚生\t193718\nshopper\t193719\n儋\t193720\n托育\t193721\n43\t193722\n明标\t193723\nwpdb\t193724\nmultiprocess\t193725\n穗港\t193726\nResistance\t193727\n净色\t193728\n陈丽娟\t193729\n保妥适\t193730\n长安区\t193731\n枯井\t193732\n徐婷\t193733\n球幕\t193734\n14.04LTS\t193735\n古墓派\t193736\n种子类\t193737\n赫山区\t193738\nhyd\t193739\n竹业\t193740\n臭袜\t193741\n区委\t193742\n二手店\t193743\n梅兰竹菊\t193744\n作业单\t193745\n络绎不绝\t193746\nInvasive\t193747\n腰条\t193748\n总能\t193749\n句末\t193750\nEntrepreneur\t193751\n河南机场\t193752\n厄\t193753\nVALUES\t193754\n提走\t193755\n一点一点\t193756\n分析家公式网\t193757\n五一高速公路\t193758\n汽轮机\t193759\n分时四驱\t193760\n奥雅宝骏730\t193761\n威霆\t193762\n演示机\t193763\n12333公共服务网\t193764\na1278\t193765\n林赛\t193766\ndeleg
ate\t193767\nXCAR\t193768\n人身自由权利\t193769\n妖精\t193770\n泾源县\t193771\nsite\t193772\n国务院金融稳定发展委员会\t193773\n6月8日\t193774\n山东出入境检验检疫局\t193775\n白菊\t193776\n王建业\t193777\n雷杜\t193778\n孙小\t193779\n外传\t193780\n督导员\t193781\n附近\t193782\n1月22日\t193783\n助学金\t193784\n雪纳瑞俱乐部\t193785\n康强医疗人才网\t193786\n陶祖\t193787\n破产管理人\t193788\n藕片\t193789\n兽人族\t193790\nlulipro\t193791\n两点论\t193792\n平安时代\t193793\n仙阁\t193794\n8.0.5\t193795\n7.2.5\t193796\ns2m\t193797\n红牌楼\t193798\n全版本\t193799\n帅丰集成灶\t193800\n兰花集团\t193801\n阿立哌唑片\t193802\n家居品\t193803\n早上8点\t193804\n视图表\t193805\n3.2%\t193806\n分类信息网\t193807\n2010-2017年\t193808\n科学中国人\t193809\nido\t193810\n天津红桥区\t193811\nMaterials\t193812\n新华军网\t193813\nRic\t193814\n13亿元\t193815\nexecutor\t193816\n掌贝\t193817\n长袖\t193818\nrmk\t193819\n隶变\t193820\n弃袍\t193821\n沈夜\t193822\n温溪\t193823\n朱塞佩\t193824\nv1.12\t193825\n渡边曜\t193826\nctl471\t193827\nTelephony\t193828\n北风\t193829\n赛默飞世尔科技\t193830\nSas\t193831\n电子扇\t193832\n成渝\t193833\n4.5.6\t193834\n大气压值\t193835\n复仇者联盟\t193836\n美编\t193837\nmyapp\t193838\n龟友之家论坛\t193839\n天津科技馆\t193840\nser\t193841\n辰辰\t193842\n锁带\t193843\n山西省中医院\t193844\n第179集\t193845\n农民伯伯乡下妹\t193846\n迈进\t193847\n中共河南省委\t193848\nPartitioning\t193849\n煤气罐\t193850\n好成语\t193851\n晚间\t193852\n产业链分析\t193853\n为媒\t193854\n理肤\t193855\nDEB\t193856\n女单鞋\t193857\n200本\t193858\n文东\t193859\n屠皇\t193860\n淘宝群\t193861\n腹带\t193862\n魔神英雄传\t193863\n存管银行\t193864\n名片盒\t193865\n角条\t193866\n齐亮\t193867\n108条\t193868\n南山人民医院\t193869\n北京世界园艺博览会\t193870\ncsv格式\t193871\n努比亚Z17#\t193872\n128元\t193873\n柴火炉\t193874\nupdates\t193875\n官窑\t193876\n外汇局\t193877\n附着物\t193878\nDESIGN\t193879\n养犬\t193880\n精湛\t193881\nrsp\t193882\nfigma\t193883\n500多元\t193884\n实现率\t193885\n有丝\t193886\ncatrice\t193887\n布偶\t193888\n单眼皮\t193889\n大嫂\t193890\n139.com\t193891\n区人\t193892\n黑车\t193893\n10.27\t193894\n美然\t193895\n急危重症\t193896\n450m\t193897\n点字\t193898\n颜渊\t193899\n新标日\t193900\n32.2\t193901\n天泰城\t193902\n皮皮龟\t193903\n杞子\t193904\n电子工程学院\t193905\n孝感房\t193906\n襄阳区\t193907\n圆筒形\t193908\n曾道人\t193909\n高清迅雷BT\t1
93910\nbgd\t193911\n但\t193912\n盛通\t193913\nkol\t193914\nSUMO\t193915\n韩德褚时健\t193916\n姜波\t193917\n毛剑卿\t193918\n浦兴路街道\t193919\nRachel\t193920\n小组赛\t193921\n力劈\t193922\n小林\t193923\nwrench\t193924\n无出其右\t193925\n单臂电桥\t193926\n公债\t193927\n大河之舞\t193928\nCCTV-2_央视网\t193929\n知识人\t193930\n沈阳医大\t193931\n20170312\t193932\n一些人\t193933\n11宗\t193934\nexlce\t193935\n立事\t193936\n多意\t193937\n风温\t193938\nTory\t193939\n摔裂\t193940\n烫平机\t193941\n优居客\t193942\n恒利\t193943\n10万年\t193944\n蓝飏\t193945\ndepartment\t193946\n五转\t193947\n雍容\t193948\n西安影讯\t193949\n谢琳\t193950\n中国会计视野论\t193951\nHibernate4\t193952\n提现\t193953\n冰果\t193954\nJinan\t193955\n本名\t193956\n老老\t193957\ncvf\t193958\nwex5\t193959\n一不小心爱上你\t193960\nPROCESS\t193961\n胤\t193962\n诗琳通\t193963\n称王\t193964\nBF4\t193965\n中山市交通运输局\t193966\n张旭豪\t193967\n刘玉梅\t193968\nzhixian\t193969\ne75\t193970\n我的女朋友\t193971\n皮卡堂过家家\t193972\n铆足\t193973\n最新消息\t193974\nocam\t193975\n金投黄金网\t193976\n高清百度云\t193977\nSONG\t193978\n火影忍者究极风暴革命\t193979\n4.5m\t193980\n拆散\t193981\nSEGGER\t193982\n华侨\t193983\n光熙门\t193984\nbb机\t193985\n丝戏\t193986\n吉宏股份\t193987\n被授权人\t193988\nAnchor\t193989\n北京社\t193990\n绝配\t193991\n宋端\t193992\n次世代\t193993\n不要点\t193994\n丁香色\t193995\n预付款\t193996\n医疗器械有限公司\t193997\n41000\t193998\n秘密之森\t193999\n2018二\t194000\n碳黑\t194001\n雷博\t194002\n洗衣球\t194003\naicpa\t194004\n七水硫酸镁\t194005\n白露露\t194006\nAEG\t194007\ntrax\t194008\n誊写\t194009\n福彩双色球\t194010\n九洲\t194011\nAdsense\t194012\n罗拉\t194013\nrejection\t194014\n3月18日\t194015\n天画\t194016\n张翼\t194017\n亡魂之国\t194018\n高培\t194019\n蛙\t194020\n新乐敦\t194021\n剥削者\t194022\n前10月\t194023\n日耳曼\t194024\n斗技\t194025\n冷却期\t194026\n八个小时\t194027\nabstraction\t194028\n曾毅\t194029\n寺冈\t194030\n容桂镇\t194031\n幸福拍手歌\t194032\n点支\t194033\n第20天\t194034\n悉尼奥运会\t194035\n上海分公司\t194036\n8時間\t194037\n亲爱的笨笨猪\t194038\n我们的孩子\t194039\n倚天屠龙记\t194040\n青青河边草\t194041\n2018年1月14日\t194042\npray\t194043\n王章\t194044\n镍柱\t194045\n20150404\t194046\n王作安\t194047\n313\t194048\n加载\t194049\n百度风云榜\t194050\nsana\t194051\n口袋妖怪蓝宝石\t194052\n冠字号查询系统2\t194053\n夹
住\t194054\n18183.com\t194055\n35章\t194056\n4.7.1\t194057\nhtcad\t194058\nSecuring\t194059\n奎因\t194060\n汇学\t194061\n内蒙古自治区政府\t194062\n来说话\t194063\ntopaz\t194064\n猎手\t194065\n提回\t194066\n3次元\t194067\n知难而退\t194068\n销魂艳婢\t194069\n小本天骐\t194070\n芳容\t194071\n香椿苗\t194072\nwindows10s\t194073\n天津大学出版社\t194074\n海岸线\t194075\nssg5\t194076\nSYB\t194077\n苯二甲酸\t194078\n雄韬股份\t194079\n富途牛牛\t194080\n四翼\t194081\n金桂\t194082\n卡套\t194083\n玻璃漆\t194084\n抚州市政府\t194085\ngetsync\t194086\n122路\t194087\n信息科学技术学院\t194088\n顾地科技\t194089\nopentsdb\t194090\n国旗杆\t194091\n馍\t194092\n王兴华\t194093\n4a景区\t194094\nsshfs\t194095\n5月3号\t194096\n广东省卫生计生委\t194097\n南昌大学图书馆\t194098\n199.COM\t194099\n七国集团\t194100\n唇音\t194101\n摆拍\t194102\n幼雏\t194103\n谏太宗十思疏\t194104\n330毫升\t194105\nAlin\t194106\n甘肃建投\t194107\n恐怖灵异-520听书网\t194108\n红芸豆\t194109\n人屏\t194110\n张紧器\t194111\n重起\t194112\n龙潭峡\t194113\n爱贷网\t194114\n巫术\t194115\n施一公\t194116\n防爆认证\t194117\n诺基亚1520\t194118\n加奇\t194119\n3礼包\t194120\nCS231\t194121\n联合国环境规划署\t194122\n生活卡\t194123\n锡盟\t194124\n多叶\t194125\n扒皮\t194126\n藏马\t194127\n贝拉米奶粉\t194128\n相城大道\t194129\nMacIdea\t194130\nstatsmodels\t194131\n包租公\t194132\n谷歌云\t194133\n4040\t194134\n佐佐木亚希\t194135\n南塘老街\t194136\n398元\t194137\n筑友\t194138\n北京工业大学\t194139\n比德文电动车\t194140\n林汉达\t194141\n220级\t194142\n连绵起伏\t194143\nsliverlight\t194144\n拉下拉\t194145\n匪夷所思\t194146\nコイカツ\t194147\n八百年\t194148\n软胶囊\t194149\nLooper\t194150\n王晓红\t194151\n轴头\t194152\n棘洪滩\t194153\nBelgium\t194154\n沙曼\t194155\nUCI\t194156\n胜浦镇\t194157\n勾花\t194158\n管虎\t194159\n河南省交通运输厅\t194160\nJill\t194161\n1.12.1\t194162\n低处\t194163\n宛平\t194164\n钱库镇\t194165\nGv\t194166\n明月珰\t194167\n二网\t194168\n鱼爪网\t194169\n幽龙\t194170\n热机\t194171\n喀什市\t194172\n无计\t194173\n电子存包柜\t194174\nX220i\t194175\n真格\t194176\n12122\t194177\n皱妓\t194178\n松滋\t194179\n機械\t194180\n广东省人民医院\t194181\nifc\t194182\n微知库\t194183\n潘老师\t194184\nMIA\t194185\n隔油器\t194186\n中央党史研究室\t194187\n2488\t194188\n4.0下\t194189\n潍坊市环境保护局\t194190\n朱竹清\t194191\n通州园\t194192\nTissue\t194193\n3.0mm\t194194\nCsgo\t194195\nTingChina
\t194196\n第170章\t194197\n红凯\t194198\n练习场\t194199\n打飞机\t194200\n国家工商行政管理总局\t194201\n布哥\t194202\n科技迷航\t194203\n日据\t194204\n海航地产\t194205\n琼\t194206\n苏教\t194207\n战墙\t194208\n稳定剂\t194209\n伦理史\t194210\n老章\t194211\n磁控\t194212\n舞文弄墨\t194213\nウルトラマン\t194214\n小仓柚子\t194215\n品性\t194216\n池杉\t194217\n上海瑞金医院\t194218\n剑侠情缘三\t194219\n养母\t194220\n易车网\t194221\n吕刚\t194222\n天天文库\t194223\n第六个\t194224\n十指\t194225\n马未\t194226\n红馆\t194227\n聊天术\t194228\nWacker\t194229\n萧元\t194230\n海南出版社\t194231\n广西外国语学院\t194232\n多臂\t194233\n甲基环己烷\t194234\n20150703\t194235\nFactory\t194236\n引火\t194237\n城桥镇\t194238\n黑龙江省食品药品监督管理局\t194239\n晒\t194240\n多方位\t194241\n铁损\t194242\n菜地\t194243\nStylus\t194244\n付子龙\t194245\n至者\t194246\n汉语拼音表\t194247\n示范基地\t194248\n岛风\t194249\n吉檀迦利\t194250\nsendkeys\t194251\n石岩镇\t194252\n驭菱\t194253\n1807\t194254\n下月起\t194255\n凯聪\t194256\n瞧瞧\t194257\n20170911\t194258\n6565\t194259\n功能性\t194260\neeglab\t194261\n撒贝宁时间\t194262\n一微\t194263\n同一个人\t194264\ngl小说\t194265\n护肤品\t194266\n热盘\t194267\n华雪\t194268\n渭南市华州区人民政府\t194269\n大孔吸附树脂\t194270\nu2417\t194271\n沪股通\t194272\n没人接\t194273\nPantry\t194274\n平A\t194275\n报平安\t194276\n深圳办事处\t194277\n太浪\t194278\nHistorical\t194279\n天河省\t194280\nx77\t194281\n节分\t194282\n救\t194283\n智点\t194284\n徐州高级中学\t194285\n常州站\t194286\n宝马M4\t194287\n不求甚解\t194288\n内府\t194289\n石岩\t194290\n铬酸钾\t194291\n青海省人民医院\t194292\nmns\t194293\nRacing\t194294\n颜宇鹏\t194295\n黄尾鱼\t194296\nAboboo\t194297\n边刀\t194298\n北大博雅\t194299\n刘枫\t194300\n沈河区\t194301\n永杰\t194302\n气虚\t194303\n80070426\t194304\n声声慢\t194305\nszse\t194306\n卡普\t194307\n88斤\t194308\n销许\t194309\n萧峰\t194310\n沈阳市交通局\t194311\nmila\t194312\n千恋\t194313\n郑州工程技术学院\t194314\n金市\t194315\n720版\t194316\n安彩高科\t194317\n2.5.4\t194318\n夹竹\t194319\n兴亚\t194320\n美国通用电气\t194321\n站架\t194322\nx86处理器\t194323\n奥瑞\t194324\n广益街道\t194325\n罗红\t194326\n公媳\t194327\nmcr\t194328\n市民族宗教事务局\t194329\n华中帝国\t194330\n反激式\t194331\n首饰品\t194332\n创杰\t194333\nGachinco\t194334\n1000W\t194335\n暖通\t194336\n柱表\t194337\n天象棋网\t194338\n电脑派位\t194339\n天谷\t194340\n吉他教学\t194341\n第九十
四章\t194342\n马冬晗\t194343\n中海集运\t194344\n麦霸\t194345\nSortedSet\t194346\n4.41\t194347\n上菜\t194348\n看荐\t194349\n湖中\t194350\n醋蛋归元液\t194351\n192.168.1.4\t194352\n杭二\t194353\n千蛛套\t194354\n预计\t194355\nrua\t194356\nSpine\t194357\n头盔\t194358\ncommune\t194359\n品质部\t194360\n软板\t194361\nslic\t194362\n雕饰\t194363\n配装器\t194364\nadb\t194365\nmdio\t194366\n蜜露\t194367\n焖锅\t194368\n共感\t194369\n派趣\t194370\n十六届\t194371\n五象大道\t194372\n张钊\t194373\n北京旷视科技有限公司\t194374\n李海平\t194375\n上海凯泉泵业(集团)有限公司\t194376\n易贝\t194377\n测智\t194378\nunexpectedly\t194379\n佳舞\t194380\n北京科学技术委员会\t194381\n永生不死\t194382\n南大通用\t194383\n光辉灿烂\t194384\n沃尔森\t194385\n减肥舞\t194386\n箫鼓\t194387\n泡馍\t194388\nICLOUD\t194389\n柏万青\t194390\n湘邮科技\t194391\n治感冒\t194392\nMonero\t194393\n拜客\t194394\n旧额\t194395\n纽约长岛\t194396\n煌\t194397\n手电\t194398\nalc887\t194399\nエロ画像ビュ\t194400\n形状记忆合金\t194401\n华映科技\t194402\n亲临\t194403\n佐伯俊\t194404\n瞪羚企业\t194405\n爽记\t194406\n配建\t194407\n三百六十五天\t194408\n洋口\t194409\n车子\t194410\n腐尸\t194411\n调研组\t194412\n蒙特卡罗\t194413\n300兆\t194414\n李丽莎\t194415\n我是一个任性的孩子\t194416\n五座\t194417\n空箱\t194418\nぅ\t194419\n补零\t194420\nyy4410\t194421\n憨豆特工2\t194422\n荷包网\t194423\n老玛瑙\t194424\n徐河俊\t194425\n三国志2\t194426\n猛男\t194427\n这双\t194428\n冰海\t194429\nmates\t194430\n状王宋世杰\t194431\n前景色\t194432\n圆舞\t194433\n√\t194434\nx10\t194435\n景观砖\t194436\n20平米\t194437\n普普\t194438\n声东击西\t194439\n雍容华贵\t194440\nGlenn\t194441\n落沪\t194442\n优胜者\t194443\n长城证券股份有限公司\t194444\n中国化学工程集团公司\t194445\n荆州理工职业学院\t194446\n103.8\t194447\n艾晓\t194448\n徇私枉法罪\t194449\n紫水\t194450\n视为\t194451\n冲绳\t194452\n葬花\t194453\n中国农药网\t194454\n月夜\t194455\n荣获\t194456\n原料\t194457\n网信办\t194458\n四无\t194459\n合邦\t194460\n贝克尔\t194461\n傅雷\t194462\n间隙性\t194463\n370X\t194464\n梅里雪山\t194465\n爱丽丝漫游奇境记\t194466\n我的征信\t194467\n奴隶色\t194468\n断包\t194469\n海因茨\t194470\n会议椅\t194471\n无脑\t194472\n贝多芬传\t194473\n8.17\t194474\n批图\t194475\nregress\t194476\n丧葬费\t194477\n查重复\t194478\n总仓\t194479\n城西区\t194480\n安于现状\t194481\n白咲舞\t194482\n睡觉时\t194483\n电荷线\t194484\n餐点\t194485\n妈妈我想你\t194486\n龙泉湖\t194487\n中存\t194488\n
0588\t194489\n海智\t194490\n2.0.7\t194491\n布武\t194492\n中证500\t194493\n亚欧大陆桥\t194494\n8.2\t194495\n双节\t194496\n组包\t194497\n杜玉波\t194498\n英诺\t194499\n中天运会计师事务所\t194500\n九九电影网\t194501\n潮起\t194502\n美豪酒店\t194503\n伟绩\t194504\nEMBA\t194505\nletax\t194506\n汇兑损失\t194507\n郭欢\t194508\n吉良\t194509\nFounding\t194510\n独\t194511\nyelp\t194512\n联币\t194513\n肇东信息网\t194514\n百田知了_百田网\t194515\n魏晋南北朝\t194516\n斯柯达昕动\t194517\n易红\t194518\n平台\t194519\n汾河\t194520\n侠岚之弋痕夕\t194521\n57亿元\t194522\n龙腾世纪2\t194523\n败北\t194524\nSubscription\t194525\n豪景\t194526\n奥迪A5\t194527\n庚饭\t194528\npreo\t194529\n阳光丽景\t194530\n高政\t194531\n009\t194532\n油墨\t194533\nwww.52pojie.cn\t194534\n1100元\t194535\n国歌法\t194536\n天干物\t194537\n接表\t194538\n石羊街道\t194539\n仙豆糕\t194540\n铁砂\t194541\n娇体\t194542\n云岚宗\t194543\n180417\t194544\n矢仓\t194545\n相濡以沫\t194546\n3月29日\t194547\n500个\t194548\npcs7\t194549\n万余\t194550\nodata\t194551\n福建海峡银行\t194552\n掘\t194553\n所有人\t194554\n不然的话\t194555\n胡椒粉\t194556\njinyu\t194557\n天龙茶馆\t194558\n当当云阅读\t194559\n稳压罐\t194560\ndeus\t194561\n大梦西游4伏妖记\t194562\n影术\t194563\n商铺\t194564\n总体来说\t194565\noctopus\t194566\n明灯\t194567\n丝雨\t194568\nrx470d\t194569\n雷克萨斯IS300\t194570\n暴打\t194571\nKG\t194572\n凯蒂\t194573\n331路\t194574\n87万\t194575\n合悦\t194576\n仙台\t194577\n一个多\t194578\n旺仔牛奶\t194579\n乱\t194580\n我的世界末影人\t194581\nwinebottler\t194582\n岑溪市人民政府\t194583\n美国联合航空公司\t194584\n男士们\t194585\n炎可宁片\t194586\n环件\t194587\n十度\t194588\nIzumi\t194589\n18期\t194590\n华夏二手车_新闻中心\t194591\n白帽\t194592\ngongjiao\t194593\n深圳东部\t194594\n2.5ml\t194595\n5splus\t194596\n贵州广电\t194597\n0.8米\t194598\n参禅\t194599\n橘梨纱\t194600\niso9001\t194601\n网易考拉海购\t194602\n龙神丸\t194603\n170_\t194604\n大干\t194605\n方尖塔\t194606\n通湖草原\t194607\n齐桓晋文\t194608\n第二十次\t194609\n街道派出所\t194610\n崽儿\t194611\n天涯过客\t194612\n封端\t194613\n烘\t194614\n莲湖路\t194615\n护面\t194616\n同网\t194617\nlieve\t194618\n声库\t194619\n友谊宾馆\t194620\n伤透\t194621\n恋练有\t194622\n身穿\t194623\n厂长\t194624\n王冠雄\t194625\n额\t194626\n率队\t194627\n文景路\t194628\n碎岩\t194629\n王振\t194630\n第三十八次\t194631\n营业日\t194632\n十八般武艺\t194633
\n丢包率\t194634\n深圳市宝安区妇幼保健院\t194635\n菲尔杰克逊\t194636\nMalware\t194637\n张子善\t194638\n湘潭高新区\t194639\n黄大妮\t194640\n废木\t194641\nwi-fi\t194642\n山东中公\t194643\n电脑音箱\t194644\n红豆薏米\t194645\nv1.1.7\t194646\n50公斤\t194647\n西湖道\t194648\n寒门长恨歌\t194649\n正黑体\t194650\n行省\t194651\n郭龙\t194652\n瓮安县人民政府\t194653\n婚恋观\t194654\n湖南大学法学院\t194655\n税务师事务所\t194656\n三峡传媒网\t194657\nunforget\t194658\n川芎\t194659\nfindall\t194660\n收付实现制\t194661\n16下\t194662\nDUAL\t194663\negret\t194664\nmybatis拦截器\t194665\n功夫熊猫盖世传奇\t194666\n沈彬\t194667\n战盟\t194668\n亲爱的祖国\t194669\n六档\t194670\n性本善\t194671\nRefused\t194672\n返迁\t194673\n光感\t194674\n山东教育厅\t194675\n连州镇\t194676\n耳垢\t194677\n检验员\t194678\n幼儿园区\t194679\n军博园\t194680\n盐酸贝那普利片\t194681\n补肾壮阳_315网\t194682\n筑视网\t194683\n三阳开泰\t194684\n新里波洛克公馆\t194685\n一言通天_一言通天\t194686\nZ11Max\t194687\n萝莉岛\t194688\nZGMF\t194689\n小壶\t194690\n宋闵浩\t194691\nCanada\t194692\n灰度值\t194693\n中秋月饼\t194694\n中国水利水电第四工程局有限公司\t194695\n易传\t194696\ntyc\t194697\n基建科\t194698\n小智\t194699\n花船\t194700\n开化\t194701\n市井\t194702\n0.61\t194703\n止住\t194704\n工作组织\t194705\n报幕词\t194706\n重庆三峡职业学院\t194707\n赌侠\t194708\nUnited\t194709\nxiongdi\t194710\n韩天峰\t194711\n白兰地\t194712\nrohs认证\t194713\n水阁\t194714\n泰政\t194715\n真的可以\t194716\n特一药业\t194717\nsaprk\t194718\n杜鲁\t194719\n违命\t194720\n金辉天鹅湾\t194721\n递加\t194722\n很用力\t194723\nSynthesis\t194724\n大兴国际机场\t194725\n【路亚吧\t194726\n线电流\t194727\n石玲奈\t194728\n花戏\t194729\n规规矩矩\t194730\n东湖景区\t194731\n美利坚大学\t194732\n然间\t194733\nsolrcloud\t194734\n特别报告\t194735\n卓义峰\t194736\ncompliant\t194737\n资阳网\t194738\n惠特\t194739\nhenlu\t194740\n19分\t194741\n正源股份\t194742\n跨年度\t194743\n吃饭的漂亮姐姐\t194744\n鬣狗\t194745\n货物类\t194746\nM1213\t194747\nComputational\t194748\nESET\t194749\nexploration\t194750\n为知\t194751\n剑灵武神塔\t194752\n教师证\t194753\n550万\t194754\n三里街\t194755\n康迪克\t194756\nperiods\t194757\n妙处\t194758\ncomposer\t194759\n玉颜\t194760\n单硝酸异山梨酯\t194761\n衡阳县政府\t194762\n行灯\t194763\nnico\t194764\ncourses.washington.edu/ph227814/228/W14/notes/Bessel.nb\t194765\n工艺美术员\t194766\n17.6\t194767\n3_九游\t194768\n保列治\t
194769\n束流\t194770\n四大名助\t194771\n应贴\t194772\n纪委监察局\t194773\n莹石云\t194774\n谢灵运\t194775\nFJ\t194776\n小峰由衣\t194777\n长命百岁\t194778\nnewegg\t194779\n密令\t194780\n硬蛋网\t194781\n海上日出\t194782\n哪里里\t194783\n建中\t194784\n万里阳光号\t194785\n苏菲娜\t194786\n武强\t194787\n长根\t194788\n全委会\t194789\nonbeforeunload\t194790\n1.4g\t194791\n搜鸽网\t194792\n周运\t194793\n漫灌\t194794\nName\t194795\nChasing\t194796\n肾石通颗粒\t194797\n移民家园网\t194798\n包头轻工职业技术学院\t194799\n桂骏\t194800\nphpcms\t194801\n布莱\t194802\nCIVIC\t194803\n2048\t194804\n国管\t194805\n千花网\t194806\n娇纵\t194807\n纳米线\t194808\n13天后\t194809\nmeid\t194810\nwww.22bw.com\t194811\n无锡高新区\t194812\n2917年\t194813\n5000米\t194814\n劳务派遣工\t194815\n仙鹤草\t194816\n最大\t194817\n班里\t194818\n义齿加工厂\t194819\n株式会\t194820\n十字线\t194821\n曾庆洪\t194822\nDetective\t194823\n男扮\t194824\npill+\t194825\n20170112\t194826\n凤尾草\t194827\nReact-Native\t194828\nzpl\t194829\n百岁山\t194830\n灵庙\t194831\n相适\t194832\n脑ct\t194833\nNginx+\t194834\nvvt\t194835\n宫颈糜烂\t194836\n交通运输专业\t194837\nPacBio\t194838\n3派\t194839\n木制品\t194840\npro10\t194841\n2.8E\t194842\n姬存希\t194843\n窑湾古镇\t194844\n赛恩\t194845\n大桥道\t194846\n余杭口水社\t194847\n帝\t194848\n正义\t194849\n脊椎动物\t194850\n邱少赵雷\t194851\n阿里健康大药房\t194852\n英雄联盟半价吧\t194853\nGuangdong\t194854\nzxp\t194855\n国家文物局\t194856\n诺哈网\t194857\n硬纸\t194858\n澳大\t194859\n林庚\t194860\n530le\t194861\n那棵树\t194862\n二模\t194863\n倒刺\t194864\n下列\t194865\n赏饭\t194866\nwin10预览版\t194867\n扮穷\t194868\n薄荷叶\t194869\n当面\t194870\n宝坻一中\t194871\n金匮要略\t194872\n窗间\t194873\nrim\t194874\nCyan\t194875\n多样性指数\t194876\n紫图\t194877\n康保\t194878\n立卷\t194879\nCmder\t194880\n城固县\t194881\n接闪器\t194882\n中宏\t194883\nritual\t194884\n确认\t194885\n第十四天\t194886\n片碱\t194887\n尤利特\t194888\n酮症酸中毒\t194889\n韩剧宫\t194890\n10点钟\t194891\n济南中院\t194892\n福塔\t194893\n4458\t194894\n十二烷基硫酸钠\t194895\n杜少府\t194896\n比亚万域之王\t194897\n陈章良\t194898\n代办处\t194899\n2居\t194900\n不断\t194901\n百星\t194902\n一杯羹\t194903\n新民学会\t194904\n貂婵\t194905\n主持稿\t194906\n波音787\t194907\n场面描写\t194908\n级联选择器\t194909\n600321\t194910\n逼供\t194911\nVA\t194912\n空压\t194913\n失配\t19
4914\n斯塔万格\t194915\n调声\t194916\n模式\t194917\n辽沈\t194918\n错综\t194919\nvue-router\t194920\n呼风唤雨\t194921\n处方粮\t194922\n国务院发展研究中心\t194923\n八届\t194924\nnars\t194925\n旅行包\t194926\n接收方\t194927\n北京电力医院\t194928\n阿波菲斯\t194929\n操作线\t194930\n珠江路\t194931\n城市别墅\t194932\n益阳日报\t194933\n副屏\t194934\n1575\t194935\n测漏仪\t194936\n陀飞轮\t194937\n潜油电泵\t194938\nshow\t194939\n该类\t194940\n桂林旅游\t194941\npunching\t194942\n松弛\t194943\n蓝牙5.0\t194944\n李泽言\t194945\n行政机\t194946\n扦插苗\t194947\n上海市经信委\t194948\n冰魂\t194949\n香型\t194950\nstance\t194951\n未入\t194952\neffectiveness\t194953\n尤佳\t194954\n星程酒店\t194955\n装甲战争\t194956\n五宫\t194957\n群交\t194958\nkitti\t194959\n福能集团\t194960\n魁星\t194961\n凤雏\t194962\n废品回收\t194963\n茂名市人民医院\t194964\npropagation\t194965\n国教\t194966\nStuff\t194967\nt420s\t194968\n煤粉锅炉\t194969\ndlsym\t194970\n8mg\t194971\n深解\t194972\n来宾日报社\t194973\nRig\t194974\n经得起\t194975\n阿瞒\t194976\n高柳\t194977\nUME\t194978\n臀肌\t194979\n精雕油泥\t194980\n塌塌米\t194981\n赤色黎明吧\t194982\n淋浴室\t194983\n杨胜\t194984\n政法委书记\t194985\ndiff\t194986\n一级棒\t194987\n温暖的人\t194988\n精编\t194989\nHook\t194990\n皋城\t194991\n第29届\t194992\n六十条\t194993\n白英\t194994\n中国幼儿在线\t194995\n高层次\t194996\naj3\t194997\n110g\t194998\n上海华信\t194999\n电热杯\t195000\n保冬妮\t195001\n酒徒\t195002\n佛朗哥\t195003\n弗兰克赫兹\t195004\n降成\t195005\n月色真美\t195006\n十里蓝山\t195007\nwiley\t195008\n布地奈德福莫特罗粉\t195009\n无稽之谈\t195010\n虞城县\t195011\n金桥路\t195012\n疤痕膏\t195013\n分类号\t195014\n张火丁\t195015\n经文\t195016\nwarfram\t195017\n鹤城\t195018\nrepresents\t195019\nTPX\t195020\n第九部\t195021\nyueming\t195022\n蚁蚕\t195023\n孙娜恩\t195024\n大力马\t195025\n保家卫国\t195026\nACID\t195027\n双频路由器\t195028\n|保险公司\t195029\n头虱\t195030\nBrunswick\t195031\n康百万庄园\t195032\n接嫁\t195033\n纯棉\t195034\n飞身\t195035\n第五十集\t195036\n创造101档案\t195037\n苏州园\t195038\n1种\t195039\n珠海户口网\t195040\n医渡云\t195041\n香肠派对\t195042\n停车费\t195043\n眉豆\t195044\n入夏\t195045\n九拍\t195046\nSpEL\t195047\n我唾弃你的坟墓2\t195048\n秋夕\t195049\nmqtt协议\t195050\n关键\t195051\n台生\t195052\n歌本\t195053\n减数分裂\t195054\n水境\t195055\n兒\t195056\n肩扛\t195057\n解惑\t195058\nEnvato\t195059\n赫斯\
t195060\n套牌车\t195061\n60分钟\t195062\n德勤\t195063\n中药店\t195064\nNirvana\t195065\n红棉花\t195066\nMOTHER\t195067\n华侨路\t195068\n含糊其辞\t195069\n揪着\t195070\n第一百一十一章\t195071\nLicenses\t195072\n清泉\t195073\n底噪\t195074\n除三害\t195075\nUdieToo\t195076\n万平方公里\t195077\n蓝儿\t195078\n屁眼\t195079\nfeelpurple\t195080\n奔跑吧_\t195081\n白手\t195082\n健达奇趣蛋\t195083\n四s店\t195084\n高装\t195085\n风度\t195086\nmsvcr120\t195087\n春意\t195088\n金鱼吊兰\t195089\n章学诚\t195090\nready函数\t195091\n猛\t195092\n微纳米\t195093\n毫安时\t195094\n反恐精英ol吧\t195095\n法神\t195096\n第100号\t195097\nWorksheets\t195098\n若彤\t195099\nDL-888D\t195100\n马刺队\t195101\n浮标\t195102\n明天\t195103\nAutoCAD2006\t195104\n集装箱\t195105\ngermany\t195106\n岸波白野\t195107\n开道\t195108\n中国国足\t195109\n八户\t195110\n老集\t195111\n6发\t195112\n冰瀑\t195113\n防黑\t195114\ndefloration\t195115\n避雷针\t195116\nModulation\t195117\n聚乙烯醇滴眼液\t195118\n利群集团股份有限公司\t195119\n求变\t195120\n1.2g\t195121\n17秋\t195122\n站表\t195123\nexam\t195124\nSERVICES\t195125\nbeta版\t195126\n蕙兰瑜伽\t195127\n赣府厅\t195128\n波子\t195129\n我是未来\t195130\n新罪\t195131\n598754908\t195132\n上海市西南位育中学\t195133\n莫拉\t195134\nForty\t195135\nautoconfig\t195136\n升降椅\t195137\n见字如面\t195138\n求片\t195139\n美女照\t195140\n三山\t195141\n自强不息\t195142\nacca吧\t195143\nOurJS\t195144\n北京信息职业技术学院\t195145\n长宁区妇幼保健院\t195146\n通江路\t195147\n普适\t195148\n云鹏\t195149\n毕向东\t195150\n渝中大坪\t195151\n喷煤\t195152\n雷蛇黑寡妇蜘蛛\t195153\nESC\t195154\ndoyoudo\t195155\nScreenToGif\t195156\n腐\t195157\n水岛津\t195158\nSweets\t195159\ntranslate3d\t195160\n淫語\t195161\n小橙子\t195162\n10幢\t195163\n75天\t195164\n绝代妖姬\t195165\n槐树花\t195166\nNorman\t195167\n三周后\t195168\nBureau\t195169\n高林生\t195170\nV13\t195171\n出乎意料\t195172\n张猛龙\t195173\n小儿惊厥\t195174\nsearch\t195175\n圆旭\t195176\n四校\t195177\n殊\t195178\n红人网\t195179\n周恩\t195180\n签到板\t195181\n61535566\t195182\n世联\t195183\n福建省委组织部\t195184\n苹果笔\t195185\nEPS\t195186\nLocaSpace\t195187\n新建\t195188\n电玩巴士PS4中文网\t195189\nSpaces\t195190\nfragement\t195191\n完全成本法\t195192\n黑暗英雄\t195193\n李丽娟\t195194\n万寿路街道\t195195\n成都七中初中学校\t195196\n黑龙江省卫生和计划生育委员会\t195197\n一束\t19519
8\n高级讲师\t195199\n瘦身大作战\t195200\n夜撸\t195201\nCBR1000RR\t195202\n携程\t195203\n米希尔\t195204\ntransducer\t195205\n强无敌\t195206\n暗黑2\t195207\n欲望格斗2\t195208\n歙砚\t195209\nBrookstone\t195210\n阳城县\t195211\n权重股\t195212\n重秤\t195213\n逆转裁判5\t195214\nipex\t195215\n角器\t195216\n37.6度\t195217\n会声会影x2\t195218\nrvm\t195219\nholistic\t195220\n受到\t195221\n折弯机\t195222\n码匠\t195223\n酒企\t195224\n横滨\t195225\n一座座\t195226\n070\t195227\n蜂窝煤\t195228\nGlen\t195229\n清河门\t195230\n亚星化学\t195231\ncorosync\t195232\n花地\t195233\n槿\t195234\n尼德\t195235\n12v锂电池\t195236\n米字\t195237\n护理人员\t195238\n赵紫阳\t195239\n给\t195240\n渝水\t195241\n弹丝\t195242\n氢原子\t195243\n苏蟹阁\t195244\nnucleic\t195245\n上下水\t195246\nDanielle\t195247\n夹心\t195248\nWindow10\t195249\n沈玲\t195250\nへ\t195251\n人大附小\t195252\ncarriage\t195253\n771所\t195254\n硫酸羟氯喹片\t195255\nWired\t195256\ntolerant\t195257\n津云\t195258\n拍案叫绝\t195259\n上性\t195260\n科教\t195261\n泉州网\t195262\nDVDrip\t195263\noci8\t195264\nwifi探针\t195265\n北京市人大\t195266\n灵芝胶囊\t195267\napp播放器\t195268\n优测\t195269\n专场\t195270\nage\t195271\n知所\t195272\n十三道\t195273\n鱼纹\t195274\n21经济网\t195275\n王建勋\t195276\n3.4.7\t195277\nLoves\t195278\n师事\t195279\n3155\t195280\n华龙汽车\t195281\n帮客\t195282\n奈绪\t195283\nbac\t195284\n中顾\t195285\n往届\t195286\n应用程\t195287\n浴池\t195288\n汉服\t195289\n石家庄\t195290\n12月29日\t195291\n7月23日\t195292\n细胞分裂4\t195293\n放过夜\t195294\n王子鸣\t195295\n通球\t195296\n6.28\t195297\n人均\t195298\n第八十五章\t195299\n易龙\t195300\nBinder\t195301\n企业并购\t195302\n刀卡\t195303\n金南佶\t195304\n平盖\t195305\n周雨奇\t195306\n北川\t195307\n众泰t700\t195308\nKo\t195309\n3800亿\t195310\n224条\t195311\nhcl\t195312\n马光\t195313\n200支\t195314\n战斗鸡\t195315\nCorridor\t195316\n卡尔·马克思\t195317\n埃菲尔\t195318\nAdobeRGB\t195319\n俘\t195320\n2022年前\t195321\n中益\t195322\n笑话\t195323\n马丁·弗瑞曼\t195324\nBye\t195325\n王艺\t195326\n蒙田\t195327\nOrigins悦木之源\t195328\n伢\t195329\n十里庙\t195330\n帝豪ev450\t195331\n长沙大道\t195332\n8688\t195333\n集尘器\t195334\n正团\t195335\n虎哥\t195336\n寸步不离\t195337\n第四十九章\t195338\n仙侠武侠小说-17k小说网\t195339\n吧台\t195340\n猎人笔记\t195341\n铃木启悦\t195342\nmultiplic
ation\t195343\nmacOS\t195344\n压力管\t195345\n朝阳新村\t195346\n热血起亚k3\t195347\n办理机\t195348\n恋爱的味道\t195349\n宇智波带土\t195350\n二甲基苯酚\t195351\n酷家乐云\t195352\n923\t195353\npercentage\t195354\n外班\t195355\n雍州\t195356\n海波龙\t195357\n斜体字\t195358\nabba\t195359\n刘东亮\t195360\n由此\t195361\nCaution\t195362\n江汇教育广场\t195363\n入班\t195364\nQ235\t195365\n名枪\t195366\n1826\t195367\nclearTimeout\t195368\nUDF\t195369\n液晶拼接屏\t195370\n信上\t195371\n银雀山\t195372\n杉浦康平\t195373\ncondom\t195374\n裤脚\t195375\n板王\t195376\n地流\t195377\n钾长石\t195378\ngrub4dos\t195379\nfrees\t195380\n虎溪\t195381\n配筋\t195382\n许建国\t195383\n气相色谱质谱联用仪\t195384\n单手\t195385\n夜阑卧听风吹雨\t195386\n李升基\t195387\naarch64\t195388\n十二公民\t195389\n绿雕\t195390\n晨宇\t195391\n中新网\t195392\n老罗\t195393\n神木县\t195394\n萧道成\t195395\n谷瀑环保旺铺\t195396\n职权\t195397\n第52章\t195398\n仁义\t195399\n第八十四章\t195400\nReset\t195401\n中山大学附属肿瘤医院\t195402\n汉斯出版社\t195403\n一根面\t195404\n夏江\t195405\n杯子舞\t195406\nborder-box\t195407\n会士\t195408\n德尔股份\t195409\n朱莉\t195410\n不动产权证书\t195411\n朴妮唛\t195412\nnova\t195413\nShakespeare\t195414\n爱菲\t195415\n极米投影仪\t195416\n瑞虎5x\t195417\n上谈兵\t195418\n御城collection\t195419\n县份\t195420\n抱摔\t195421\n价\t195422\n_v1.1安卓\t195423\n第208章\t195424\n功夫片\t195425\n吉克冒险岛\t195426\n饼状\t195427\n萌宠小大人\t195428\n传奇卡\t195429\nCrucial\t195430\n佳果\t195431\n海尔日日顺\t195432\n缅陆\t195433\n书眉\t195434\ncie\t195435\n锁屏\t195436\n北京国税\t195437\n刘凡\t195438\nnba2k18吧\t195439\nSASO\t195440\n冯耀宗\t195441\n季华路\t195442\n血液学\t195443\n海泉湾度假区\t195444\n巴固\t195445\n出场费\t195446\n半高\t195447\n零次\t195448\n血者\t195449\n浙江省委办公厅\t195450\n诺雷德\t195451\nnbtstat\t195452\n主调\t195453\n出字\t195454\n酒心巧克力\t195455\n凶鸟\t195456\n氪星\t195457\n红葵\t195458\nlaoyue\t195459\n软银\t195460\n中兴软创科技股份有限公司\t195461\ninsurance\t195462\nChildrens\t195463\nWin10以太网\t195464\n昆嵛山\t195465\n猜题\t195466\nunstack\t195467\necxel\t195468\n文风\t195469\n战破苍穹\t195470\n菲森\t195471\n6.5亿\t195472\n年资\t195473\n叶绿版\t195474\nchess\t195475\n雷经理\t195476\n船台\t195477\n好想知道\t195478\n市地方税务局\t195479\n心窝处\t195480\n中国英语能力等级量表\t195481\nSpray\t195482\n939\t195483\n艾诗\t195
484\n3d定制女仆\t195485\n长安CX70_长安欧尚\t195486\n北京歌华有线\t195487\n公吨\t195488\n和平街一中\t195489\n加重犯\t195490\nhiu\t195491\n塔型\t195492\n激光刻章机\t195493\n计划类\t195494\n友好集团\t195495\n科力达\t195496\n大追捕\t195497\n天天看\t195498\n1804\t195499\n7777\t195500\n首金\t195501\n神典石\t195502\n湖南日报\t195503\n哈奇\t195504\nMultiBox\t195505\n治国安邦\t195506\n三氯乙烯\t195507\n弯曲试验\t195508\n720lu\t195509\n森雅\t195510\n卡介苗\t195511\nlacoste\t195512\n校注\t195513\n星野美优\t195514\nArtand\t195515\n美短\t195516\n聚苯\t195517\nGive\t195518\n外市\t195519\n黄石人事考试网\t195520\n学期期\t195521\n螺旋仪\t195522\n打狗棍\t195523\n霓裳魅影\t195524\n23套\t195525\n乳源大峡谷\t195526\n兰州新闻网\t195527\n梦园小区\t195528\n漏电断路器\t195529\njlzzjlzz\t195530\n叉勺\t195531\nippicv\t195532\nweifang\t195533\n赚币\t195534\nelvui\t195535\n消耗型\t195536\n麻竹高速\t195537\n二年级语文下册\t195538\n昨天上午\t195539\nGreenDao\t195540\n贾磊\t195541\n黄水晶\t195542\n京口北固亭怀古\t195543\n蒙牛\t195544\nwear\t195545\n1659\t195546\nP400\t195547\nAMH云\t195548\n伍卫国\t195549\n量点\t195550\n一行行\t195551\nPhaser\t195552\n嗨歌\t195553\n三醋酸\t195554\n于子涵\t195555\ntomorrow\t195556\n16倍\t195557\n潍坊机场\t195558\n阿黛单田芳\t195559\n卡带\t195560\nllc\t195561\ninterface\t195562\n32件\t195563\n埋入\t195564\n想听\t195565\n高度规\t195566\n美堂\t195567\n神偷奶爸\t195568\n_淇县政府网\t195569\n本赛季\t195570\nKB2919355\t195571\n故园难再留眷恋\t195572\n贴片机\t195573\n张婕\t195574\n上海梅赛德斯-奔驰文化中心\t195575\n20min\t195576\n战国吧\t195577\n能量学\t195578\nAssistor\t195579\n霍尔木兹海峡\t195580\n增压器\t195581\nbmc\t195582\n此身\t195583\ntor浏览器\t195584\ngtx690\t195585\n连词\t195586\nGson\t195587\n体量\t195588\nToggle\t195589\nbq\t195590\n人民论坛\t195591\n长汀网\t195592\ncPickle\t195593\n托普\t195594\n王柳雯\t195595\n污泥\t195596\n道术\t195597\n20世纪以来\t195598\n第3次\t195599\n广美\t195600\n高级认证\t195601\n粉色系\t195602\n华阳国际\t195603\n天康\t195604\n林楠\t195605\n丫才\t195606\nCooperation\t195607\n江口政府网\t195608\n原单\t195609\n样仪\t195610\n17650\t195611\n凤求凰\t195612\n招商手机银行\t195613\n敢问路在何方\t195614\n白一护\t195615\n爱博\t195616\n10万亿元\t195617\n脉冲\t195618\nwilly\t195619\noro\t195620\nv4.5\t195621\n平面变压器\t195622\n专政\t195623\nwtg\t195624\nvalidator\t195625\n虚高\t195626
\n2017年12月26日\t195627\nDeclan\t195628\n0871-6835xxxx\t195629\n五加皮\t195630\n世界阅读日\t195631\n张国刚\t195632\n蛾子\t195633\nv9.1\t195634\nsystemverilog\t195635\n神藏\t195636\n内史\t195637\n德佩\t195638\n11P\t195639\n乞食\t195640\n商业承兑汇票贴现\t195641\n转段\t195642\n砂层\t195643\nx42j\t195644\n审判官\t195645\nLao\t195646\n弃儿\t195647\n免检期\t195648\n50多年\t195649\n)生物科技有限公司\t195650\n收发器\t195651\n用语\t195652\nλ\t195653\n真田十勇士\t195654\n弄大\t195655\n柴桑区\t195656\n全片\t195657\n中度\t195658\n新光三越\t195659\n退隐\t195660\n乐律\t195661\n农贸市场\t195662\npistol\t195663\nwin764位\t195664\n盐酸氨基葡萄糖片\t195665\ngrasp\t195666\n柜类\t195667\n美丽家园\t195668\nVIP\t195669\n联创互联\t195670\n密会\t195671\n11点\t195672\n艺伎摩尔庄园\t195673\n黄金鱼\t195674\nLOSER\t195675\n梁凡\t195676\n疯狂猜歌\t195677\n衰落\t195678\nVNCserver\t195679\n深水养\t195680\nftpd\t195681\n深渡\t195682\n豪门千金\t195683\nsystems\t195684\n幸运查克\t195685\n芝麻糊\t195686\nGSM卡\t195687\n万科蓝山\t195688\n见面会\t195689\n热招\t195690\n碧影\t195691\n贡嘎\t195692\n宣员\t195693\ngetch\t195694\nASR\t195695\n可塑\t195696\n情妹妹\t195697\nuiauto\t195698\n33层\t195699\n反应力\t195700\nthem\t195701\n戚墅堰区\t195702\nbi\t195703\n汉中市人民政府办公室\t195704\n车轨\t195705\n大笔\t195706\n工机\t195707\n额窦炎\t195708\nLTPS\t195709\n绝地求生辅助\t195710\n术式\t195711\n走过\t195712\nv2.3.3\t195713\n20160906\t195714\n北京地铁15号线\t195715\n玄武石\t195716\n石政\t195717\nCLINICAL\t195718\n一塌糊涂\t195719\n果冻粉\t195720\n共青团第十七次全国代表大会\t195721\n金芒果\t195722\nlegal\t195723\n桃源一中\t195724\n批款\t195725\nKinjaz\t195726\nXPS\t195727\n说死\t195728\n印文\t195729\n远方的人\t195730\n茨维塔耶娃\t195731\n富硒茶\t195732\nadg\t195733\n安徽水安建设集团股份有限公司\t195734\n写景\t195735\n王者荣耀长城守卫军\t195736\n上高县\t195737\nPMP认证\t195738\n夹式\t195739\n暮\t195740\n舟山市国土资源局\t195741\n外国文学史\t195742\n教堂\t195743\n费特希耶\t195744\ngypsy\t195745\n折叠刀\t195746\n甘其毛都\t195747\n氨氯地平片\t195748\n星际传奇2\t195749\nwarping\t195750\n翻译学\t195751\n佛教_凤凰佛教\t195752\n枞阳县\t195753\n镍钛合金\t195754\n程明\t195755\n蜀牛\t195756\nlv5\t195757\n八卦娱乐\t195758\n夏恋\t195759\n四女\t195760\n时间的朋友\t195761\ns3700\t195762\n新桥国际机场\t195763\n脑损伤\t195764\nWan\t195765\n俏如来\t195766\n爱称\t195767\n仲裁网\t195768\n咸宁网
\t195769\n高线\t195770\n娟秀\t195771\nIE9浏览器\t195772\n替米沙坦片\t195773\n尿囊素凝胶\t195774\n刘主任\t195775\nint型\t195776\n阴平\t195777\n王蕾\t195778\ndrupal8\t195779\n高玩秀\t195780\n乐高活动中心\t195781\n加害\t195782\n百蝶\t195783\n液压传动系统\t195784\n7106\t195785\n兵卫\t195786\n不计步\t195787\nkpi指标\t195788\n许可\t195789\n暗电流\t195790\n一键共享\t195791\n党纪处分条例\t195792\n挂耳式\t195793\n干扰素\t195794\n天乾\t195795\n享誉\t195796\n浪客行\t195797\n宁德新能源科技有限公司\t195798\n红颜知己\t195799\n用友软件服务中心\t195800\n安平\t195801\n琴声\t195802\nJavaScript高级程序设计\t195803\n兴明\t195804\n白沙村\t195805\n一汽汽车\t195806\n寒食帖\t195807\n秦陵\t195808\n3234\t195809\n3月12\t195810\nappraisal\t195811\nOSIM\t195812\n10.12.5\t195813\n成都公证处\t195814\n主页\t195815\n北京郡\t195816\n情陷\t195817\ndomino\t195818\n拜寿\t195819\nDAB\t195820\n99P\t195821\n中公教育湖南分校\t195822\ndsm\t195823\n瘪\t195824\nxxe\t195825\n雷蛇炼狱蝰蛇\t195826\n豆腐渣\t195827\n硅胶套\t195828\nkmp\t195829\n昭苏\t195830\n弱磁\t195831\n哪站\t195832\n金某\t195833\n正气\t195834\npassage3\t195835\n拖板车\t195836\n0.1.2\t195837\n合击\t195838\nJacksile\t195839\n华宇晨\t195840\n襄阳南路\t195841\n新造\t195842\n重庆市人才交流服务中心\t195843\n代偿期\t195844\n妻离子散\t195845\n%\t195846\n嘉天下\t195847\n小昂\t195848\n骁悉\t195849\n装机吧\t195850\n敕令\t195851\n林婉珍\t195852\n一室\t195853\n混砂机\t195854\nPalms\t195855\n点杀\t195856\n艾力特\t195857\n伯恩\t195858\n五源河\t195859\nfreely\t195860\n娃娃机\t195861\nae白金卡\t195862\n戴花\t195863\n温蒂\t195864\n徐小明\t195865\nBanYuner\t195866\n5千\t195867\n斑鳖\t195868\n暴击装\t195869\nmegalo\t195870\nPr\t195871\n同济大学软件学院\t195872\n51种\t195873\n王汀\t195874\n品读\t195875\n中央情报局\t195876\n77次\t195877\n救美\t195878\n孟母三迁\t195879\n神曲\t195880\n合肥市经济开发区\t195881\n三边\t195882\nSnapGene\t195883\n侧翻\t195884\n武汉经开区\t195885\n一拖四\t195886\nhario\t195887\n招聘季\t195888\n生态币\t195889\n2014.12\t195890\n蜜蜂帮帮\t195891\nzenbook\t195892\ntianfu\t195893\n20150426\t195894\n岳阳楼区\t195895\n山东台\t195896\n南通公司\t195897\n站灯\t195898\n掌号\t195899\n七里坪镇\t195900\n安庆\t195901\n低筋面粉\t195902\n寻夫\t195903\n非标门\t195904\n停办\t195905\n军版\t195906\nquora\t195907\n周虹\t195908\n贵阳机场\t195909\n简案\t195910\nrestricted\t195911\nY510P\t195912\nchilly\t19591
3\n承办人\t195914\n毒气\t195915\nCuba\t195916\n金刚网\t195917\n恒温阀\t195918\n真人图书馆\t195919\n色准\t195920\n混浊\t195921\nDS214play\t195922\nminibatch\t195923\n齐大\t195924\nr2000\t195925\n猜到\t195926\n多福\t195927\n林春\t195928\n澄江镇\t195929\nmp3全集_乾坤听书网\t195930\n年史\t195931\n旗曲\t195932\n乳制品\t195933\n飞天大盗\t195934\n孟州市\t195935\n一扶\t195936\n中设设计集团\t195937\n梭式窑\t195938\n居尼尔斯\t195939\nnetgear\t195940\n板式\t195941\n第多少天\t195942\n重庆市教育委员会\t195943\n觉醒者\t195944\n都暻秀\t195945\n尊敬语\t195946\n巴巴爸爸\t195947\n职友\t195948\n桃田贤\t195949\n归集\t195950\n张一丁彦雨航\t195951\n华南区\t195952\n青岛酒店管理职业技术学院\t195953\n贞\t195954\n明珠大厦\t195955\nOnedrive\t195956\n72G倩女幽魂手游\t195957\n傀儡\t195958\n长虹集团\t195959\n静一静\t195960\n电子科技公司\t195961\nCC2530\t195962\n5090\t195963\n浦东干部学院\t195964\n打鱼\t195965\nBright\t195966\nMeng\t195967\n古墓丽影9\t195968\n小产权\t195969\n停工\t195970\n半糖\t195971\nxue\t195972\n头等\t195973\ncsl\t195974\n保理公司\t195975\nwdr\t195976\n尤物\t195977\nDDoS\t195978\n质保金\t195979\n精文\t195980\n义女\t195981\n兰炭\t195982\nfuchsia\t195983\n欧阳锋\t195984\n中粮生化\t195985\n合肥市包河区人民政府\t195986\n辛伐他丁\t195987\n李翔宇\t195988\n西流湖\t195989\n绿绿\t195990\n芝麻粉\t195991\n假定\t195992\n郑艳\t195993\n罗韬\t195994\n苏安新村\t195995\nf检验\t195996\n太阳女神\t195997\n北人集团\t195998\n泛型接口\t195999\n董办\t196000\n至死\t196001\n爱唯侦察\t196002\n开膛\t196003\n上海济光职业技术学院\t196004\n刘佳佳\t196005\n中国五矿集团\t196006\nOrdinary\t196007\n防漆\t196008\n袭胸\t196009\n冻疮\t196010\n奥迪帝霸\t196011\n风俗\t196012\n老城市\t196013\n1.2万\t196014\n电踏车\t196015\n酒泉\t196016\nrobotmaster\t196017\nXml\t196018\n惬\t196019\n无盖\t196020\n细木工板\t196021\n一坐\t196022\n金鸡胶囊\t196023\n几多个\t196024\nNimbus\t196025\n拉格\t196026\n正衣冠\t196027\nBand\t196028\n华捷艾米\t196029\n国瑞城\t196030\n黑建\t196031\n违反\t196032\n漠河北极村\t196033\n秦商\t196034\n北京市劳动局\t196035\n歪歪扭扭\t196036\nHO\t196037\nintellij-idea\t196038\nActionBar\t196039\nmov\t196040\n宁波国际会展中心\t196041\n弈乐\t196042\n黑寡妇\t196043\n五羊杯\t196044\n1个半月\t196045\n2017-11-28\t196046\n4.2L\t196047\n日影\t196048\n黄页网\t196049\nBRC\t196050\n平移和旋转\t196051\nmanny\t196052\n第一股\t196053\n阴灵\t196054\n92版\t196055\n小女孩子\t196056\n辽宁成大\t19605
7\n定额计价\t196058\n扬州机场\t196059\n组工\t196060\n脑白质\t196061\n李冶\t196062\n东莞市高端电子有限公司\t196063\n指针变量\t196064\n无锡新传媒网\t196065\n惠若\t196066\n杂\t196067\n薛原\t196068\n1千亿美元\t196069\n失活剂\t196070\nkaa\t196071\nwhereis\t196072\n创星\t196073\n打击乐\t196074\nlib\t196075\n扶植\t196076\n刘可\t196077\n诸多\t196078\n960124\t196079\n九零\t196080\nSummoners\t196081\n优缺点\t196082\namobbs\t196083\n镇墓兽\t196084\n自考市场营销学\t196085\n猫光\t196086\n2016年6月17日\t196087\n中电环保\t196088\nflags\t196089\n免杀\t196090\n晩\t196091\n闪爆\t196092\nbalmain\t196093\n赃款\t196094\nnode-inspector\t196095\n泪囊\t196096\n苏宁理财基金\t196097\n吸毒\t196098\n青春中国\t196099\n戮力同心\t196100\n羽西\t196101\n萤石服务中心\t196102\n弘道\t196103\n超级美熟女\t196104\n秦汉大道\t196105\n静净\t196106\n起亚k3\t196107\n12ml\t196108\n女子坊\t196109\n权重\t196110\n山松\t196111\nSIM卡座\t196112\n梦之旅\t196113\n白山茶\t196114\n月尾\t196115\n山东大厦\t196116\n夜狼猎奇网\t196117\niPhone6sPlus\t196118\n刘爱军\t196119\nfariver\t196120\n序章\t196121\n人迹\t196122\n东萍象棋网\t196123\nHDCD\t196124\nDelphi7\t196125\nFrankYou\t196126\n兔头\t196127\n画图板\t196128\n野村证券\t196129\n观澜汽车站\t196130\n咸阳论坛\t196131\n毛峰\t196132\n品质\t196133\n石狮市\t196134\n0209\t196135\n伊犁河谷\t196136\n逼王\t196137\n磕碜\t196138\n金童路\t196139\n池塘\t196140\n窝案\t196141\n男衣\t196142\n中华环保联合会\t196143\n改性塑料\t196144\n国防科工委\t196145\n美瑛\t196146\n靳埭强\t196147\n液槽\t196148\nWTF\t196149\n坎普斯\t196150\n杨栋梁\t196151\n122周年\t196152\n草菅人命\t196153\n启辰r50\t196154\n硝呋太尔片\t196155\n读做\t196156\n深圳南油\t196157\n酷爱zero\t196158\nStreets\t196159\n遭窃\t196160\n合福高铁\t196161\n闻风丧胆\t196162\nlist2\t196163\n嵌字\t196164\n跑男6\t196165\n停摆\t196166\nCAPM\t196167\n感光鼓\t196168\n友利奈绪\t196169\n游感\t196170\n给他\t196171\n14个月\t196172\n秦卫江\t196173\n西延\t196174\n南风黑暗大纪元\t196175\n公子羽\t196176\nlibSVM\t196177\n竹桃\t196178\nchun\t196179\n油样\t196180\n暗夜文学网\t196181\n淫词\t196182\n招标师\t196183\ncracker\t196184\n98.5%\t196185\n日k线\t196186\n悉数\t196187\n湿濡\t196188\n功放\t196189\n天衣\t196190\nUnity3D游戏\t196191\n一千元\t196192\n0.95\t196193\n畅爽\t196194\n朱芳江南春\t196195\n苏特\t196196\n岱岱\t196197\nErro\t196198\nINSIDE\t196199\nqiushi\t196200\n赵梓茜\t196201\n035\t196
202\n无法原谅\t196203\n球馆\t196204\n爱相随\t196205\n火车东站\t196206\n彩虹岛\t196207\n税控机\t196208\nGEA\t196209\n嘉德罗斯\t196210\n勾弦\t196211\n聚丙烯酸酯\t196212\n面管\t196213\nDeclare\t196214\n晕死\t196215\n触宝输入法\t196216\n吉他曲\t196217\n物语\t196218\n长物\t196219\n从天\t196220\n解卦\t196221\n首组\t196222\nPPTP/L2TP\t196223\n科类\t196224\n细纹\t196225\n67_\t196226\n功能主义\t196227\ncommunicating\t196228\n俯视\t196229\n恒昌财富\t196230\n优待\t196231\n房荒\t196232\n新郑龙湖\t196233\n人间胸器\t196234\n米蛋\t196235\n镀锌卷\t196236\n秦般若\t196237\n镇纪委\t196238\n两帧\t196239\n新乡市人民政府办公室\t196240\n挺嗨\t196241\n天乙\t196242\n多牛逼\t196243\n小议\t196244\n医务室\t196245\n外调函\t196246\n维修版\t196247\niPrint\t196248\n百天\t196249\n待到山花烂漫时\t196250\n金竹\t196251\nLoon\t196252\n电视源\t196253\n600秒\t196254\n曙光小区\t196255\n内忧\t196256\n中国绿色食品发展中心\t196257\n在线影院\t196258\n576芯\t196259\nクル\t196260\n白平\t196261\n延伸\t196262\n许岑\t196263\n哈尔滨小学\t196264\nGGO\t196265\n湖南邮电职业技术学院\t196266\n美尼尔\t196267\n社会保护\t196268\ndos攻击\t196269\n3709\t196270\np22\t196271\n禄劝县\t196272\n慈云山\t196273\nSucceed\t196274\n数智\t196275\n暗访直男地下基\t196276\n淘宝旺旺号\t196277\n水盆羊肉\t196278\nhand\t196279\n人工岛\t196280\n暴虐\t196281\ndvd2\t196282\n荣盛石化\t196283\n1703\t196284\n王全安\t196285\n金融时报\t196286\n山西联通\t196287\nracism\t196288\nMarshall\t196289\nsnowy\t196290\n碧血书香梦\t196291\ngoodluck\t196292\n张廷模\t196293\n李京华\t196294\n广西新闻网\t196295\n宋美娜\t196296\n戴氏\t196297\n三栖\t196298\nshaped\t196299\n哪一年\t196300\n房室传导阻滞\t196301\n两个字\t196302\n水灾\t196303\n男口\t196304\ntls\t196305\n二元次\t196306\n机电学院\t196307\n音音\t196308\n逍遥法\t196309\nZion\t196310\n真龙头\t196311\n塬\t196312\nEP11\t196313\n第7集\t196314\n1kb\t196315\nGravure\t196316\n华侨路茶坊\t196317\n威雅\t196318\n碰出\t196319\n測\t196320\n2826\t196321\n南郑区\t196322\n一览\t196323\n干衣\t196324\n朗悦\t196325\n南京市环境保护局\t196326\n长寿花吧\t196327\n裂口女\t196328\n铝框\t196329\nimpersonate\t196330\n1adac\t196331\n人本网\t196332\n九里\t196333\n水泥管\t196334\n5780\t196335\n一个梦\t196336\n厌学\t196337\n淋巴瘤\t196338\n36岁\t196339\nspring-web\t196340\n使命召唤11\t196341\n张学易建联\t196342\n白破疫苗\t196343\n消泡\t196344\n江召伟\t196345\n珍护\t196346\npeek\t196347\n拜城县\
t196348\n耀江花园\t196349\n皮划艇\t196350\n实施方\t196351\nmls\t196352\n伏妖\t196353\n白包\t196354\nk15\t196355\n脏乱差\t196356\n龙科\t196357\n中国国际教育网\t196358\n君子堂\t196359\n铠武\t196360\n朱萍\t196361\n瑜伽经\t196362\n晶型\t196363\n家品\t196364\n华斯股份\t196365\n铜镀镍\t196366\nKEYS\t196367\n零零碎碎\t196368\n周思萍\t196369\n潘虹\t196370\n160722\t196371\nRadiator\t196372\n再开一贴\t196373\nmsc\t196374\n太迟\t196375\n骗贷\t196376\n肾经\t196377\n国家杯\t196378\n回答者\t196379\njim_shen\t196380\n兽拳\t196381\n珊瑚\t196382\n拉格朗日函数\t196383\n余佳运\t196384\n劣单\t196385\n稻城\t196386\n优盘\t196387\n山东舰\t196388\n次磷酸钠\t196389\n肺动脉\t196390\n御尊\t196391\n中国电信集团\t196392\n李祥霆\t196393\n日职联\t196394\n秦宣太后\t196395\n拳种\t196396\n敖江\t196397\n缺如\t196398\n晨露\t196399\n第44条\t196400\n大乱\t196401\nDaylight\t196402\nR级片\t196403\n1803\t196404\n汇讯\t196405\nCakePHP\t196406\n李小丁\t196407\ndodo\t196408\n南天国\t196409\n山丘之王\t196410\n是因\t196411\n0324\t196412\ncraigs\t196413\n绞合\t196414\nMEC\t196415\n童子\t196416\n安全生产事故\t196417\n07073火影\t196418\n龙骧\t196419\n哀伤\t196420\n肺炎克雷伯菌\t196421\n长征医院\t196422\n心食谱\t196423\nKenNgai\t196424\n昆仑华府\t196425\n月面\t196426\n冰棺\t196427\n迷境\t196428\n趴窝\t196429\n落户\t196430\nPeopleSoft\t196431\npeas\t196432\n傅菁\t196433\n主控板\t196434\n中土集团\t196435\nbroaden\t196436\n姜枣茶\t196437\n附页\t196438\n盛唐妖异志-缘漫动漫\t196439\noracle11gR2\t196440\n安全滑触线\t196441\nbrctl\t196442\n勤工\t196443\n毛遂\t196444\n失\t196445\n活页\t196446\n广州西村\t196447\n陈三\t196448\n八八\t196449\nclf\t196450\n拍卖行\t196451\n郭庄子\t196452\n翰林府\t196453\n僵清\t196454\n海关编码查询系统\t196455\n延寿县\t196456\n45R18\t196457\n郑州\t196458\n横拉\t196459\n桑达\t196460\n印度塔塔\t196461\ncoco奶茶\t196462\n寂寞在唱歌\t196463\n刘光\t196464\n大众电影\t196465\n罩住\t196466\ndell服务器\t196467\n恩怨\t196468\n值班表\t196469\n都市猎人\t196470\n七十级\t196471\n阿力士招商网\t196472\nptv\t196473\n时讯\t196474\n现代学徒制\t196475\n荻浦村\t196476\n小先生\t196477\nFCR\t196478\n冥王夫\t196479\n西固\t196480\n网评\t196481\n吉安县\t196482\n景域\t196483\nlan8720\t196484\n2018.1.22\t196485\n神都\t196486\n成活\t196487\n递归法\t196488\n互助县\t196489\nZinc\t196490\n江中\t196491\n兔笼\t196492\nDosage\t196493\nABO文\t196494\nqywwkai\t196495
\nrevit族\t196496\n秋意\t196497\n铁血战士2\t196498\n假药案\t196499\n20170301\t196500\n姑苏区\t196501\n14K\t196502\n北京西三旗\t196503\nrx100m3\t196504\n有亲\t196505\n启灵\t196506\n醇醚\t196507\n更精彩\t196508\n澳门金沙国际\t196509\n脑水肿\t196510\n这一次\t196511\nbelieve\t196512\n上海轨道交通13号线\t196513\n惊声尖笑\t196514\nDA屏\t196515\n00\t196516\n印江自治县\t196517\n双向流\t196518\n交叉处\t196519\n音乐室\t196520\n原点\t196521\n视客眼镜网\t196522\n佛山市质量技术监督局\t196523\n熟料\t196524\n2017年8月21日\t196525\n万豪白挑\t196526\n去年9月\t196527\n箅\t196528\n绿红\t196529\n十几秒\t196530\n4&5\t196531\n捷豹XJ\t196532\n天津注册公司\t196533\n直男们\t196534\nV6.4\t196535\n祁连山南路\t196536\n红盾网\t196537\n高知特\t196538\nROC\t196539\n财官\t196540\n雪铁龙C3-XR\t196541\nMyEclipse10\t196542\n宅斗\t196543\n吴能谢芹\t196544\n白云飞\t196545\n水溶\t196546\nCPUID\t196547\n论美国的民主\t196548\n计较\t196549\n丰乳镇\t196550\n陶艺\t196551\n白帝学园\t196552\n葛二蛋\t196553\n2018.5\t196554\n大枪\t196555\n王亚南\t196556\n天猫医药馆\t196557\n3i\t196558\n武宗\t196559\nconvex\t196560\n433MHz\t196561\n苏共亡党\t196562\n玖兰枢\t196563\n兴顺\t196564\n8mm\t196565\n好孕\t196566\n非极性\t196567\n占用\t196568\ntmpl\t196569\n300家\t196570\n_箫\t196571\n利乐包\t196572\n鲻鱼\t196573\n平均法\t196574\n施坦威\t196575\nIglesias\t196576\n打折扣\t196577\nfiremonkey\t196578\n南昌大学前湖校区\t196579\n小番茄\t196580\n凌天下\t196581\n青春赞歌\t196582\n医疗期\t196583\nlegend函数\t196584\n甘祖昌\t196585\n挨骂\t196586\n中华人民共和国教育法\t196587\n48式太极拳\t196588\nt420i\t196589\n思美人兮\t196590\n烂尾楼\t196591\nhew\t196592\n树标\t196593\n魔兽争霸3:冰封王座\t196594\nKFR-26GW/\t196595\n江西丰城电厂\t196596\n广科\t196597\n调补\t196598\n视场角\t196599\n20161213\t196600\n海致\t196601\ndescr\t196602\n西弗\t196603\ny55a\t196604\n门机\t196605\n温州市中医院\t196606\n良训\t196607\n活蹦乱跳\t196608\n退换货\t196609\n乱伐\t196610\n江西省地质矿产勘查开发局\t196611\n导游网\t196612\nCX-7\t196613\nabs塑料\t196614\n9.0级\t196615\n烈焰战神\t196616\n叶小白\t196617\ncoordinate\t196618\n蓝瑛\t196619\nkey\t196620\n就是为\t196621\nqcow2\t196622\nSaturday\t196623\n独服\t196624\n1502\t196625\n守护冰之谷\t196626\n唐山市人才交流中心\t196627\n匿爱\t196628\n武警\t196629\n王喜\t196630\n粤菜馆\t196631\n科技史\t196632\n美乐棵\t196633\n野风车\t196634\n布莱德\t196635\n大肠菌群\t196636\n3立方\t196
637\n刹那芳华\t196638\n能器\t196639\n宝松\t196640\n目标公司\t196641\n三月四月\t196642\nDRAG\t196643\n魔术贴\t196644\n20180212\t196645\n安顺经济技术开发区\t196646\n菅\t196647\n七氟烷\t196648\nqemu-kvm\t196649\n指示点\t196650\n新沟镇\t196651\n吃嫩草\t196652\nlianjia\t196653\n省分\t196654\nCircle\t196655\n鸣音\t196656\n堀辰雄\t196657\n绵柔\t196658\n优化器\t196659\n心脑\t196660\n零售商\t196661\n缸头\t196662\n端板\t196663\n鼓楼东街\t196664\n此事\t196665\n麦景\t196666\n妙曼\t196667\n白河堡水库\t196668\n红豆词\t196669\n±\t196670\nmsra\t196671\n6十1\t196672\n羊毛裤\t196673\n安徽理工大学\t196674\n西西人\t196675\n诱拐\t196676\n广清高速\t196677\n厌奶期\t196678\n恒生\t196679\nsunbet\t196680\n腊八节\t196681\n兔展\t196682\n加勒比海盗3:世界的尽头\t196683\nCorelDraw\t196684\nDragonBones\t196685\n蔚蓝色\t196686\n北京市公安局公安交通管理局\t196687\n甩胸\t196688\n奥斯卡影城\t196689\n两派\t196690\n初级钢琴曲集\t196691\n12海里\t196692\n李铭\t196693\n蓝海华腾\t196694\n行尸走肉第七季\t196695\n草莓酱\t196696\nQuotation\t196697\nGAMS\t196698\n超酷炫\t196699\n徐一鸣\t196700\nmeanshift\t196701\n2018.1.29\t196702\n8712\t196703\n灯箱\t196704\n武汉卓尔\t196705\n古朗基\t196706\n红警3吧\t196707\n金融局\t196708\n南美洲\t196709\n电脑线\t196710\n荣威e50\t196711\n讨薪\t196712\n安医大二附院\t196713\ncocos2d\t196714\n张近灵契\t196715\nEndurance\t196716\n绿玉\t196717\nf595\t196718\nturtlebot\t196719\n御批\t196720\nTaglist\t196721\n20140925\t196722\n飞蝶\t196723\n新城大道\t196724\n煲汤网\t196725\n参考资料\t196726\n抢劫案\t196727\n90B套\t196728\n一听音乐网\t196729\nGit@OSC\t196730\n卢旺达饭店\t196731\n3959\t196732\n数百\t196733\n9i\t196734\n溃疡型\t196735\n云付\t196736\n旅行装\t196737\n有声有色\t196738\n美容仪\t196739\n授权委托书.doc\t196740\n烟台市政府网\t196741\njiameng\t196742\nWater\t196743\n屌炸天字幕组\t196744\n216号\t196745\n芭比公主\t196746\nJeffrey\t196747\n内视镜\t196748\nOpenBSD\t196749\n信用卡还款\t196750\n002251\t196751\n八里店\t196752\nvs2018\t196753\n太白镇\t196754\n承太郎\t196755\n妊娠糖尿病\t196756\n劫龙\t196757\n粉煤灰\t196758\nprv\t196759\n寝室\t196760\n巨肉\t196761\n北极星电力招聘网\t196762\n购买\t196763\nUBOOT\t196764\n盘庚\t196765\nday6\t196766\n爱问频\t196767\n9所\t196768\n邀约\t196769\n太原动物园\t196770\n刺客荣耀-荆轲_\t196771\nheadache\t196772\nitpub\t196773\n刘炅然\t196774\n乛\t196775\n四十二式太极剑\t196776\n菜籽粕\t196777\n
bgs\t196778\n平面直角坐标系\t196779\n290\t196780\n一清二\t196781\n德邦单号\t196782\n核污染\t196783\n化学系\t196784\n三十天\t196785\n4英寸\t196786\n対魔忍アサギ\t196787\nFeels\t196788\n冯媛\t196789\n燃爆\t196790\n怪兽大学\t196791\nB18\t196792\n强农\t196793\n栽植\t196794\n长莫及\t196795\n张海鹏\t196796\ncompetition\t196797\n海峡都市报\t196798\n12针\t196799\n中山博爱医院\t196800\nWord页码设置全攻略\t196801\n杨家山\t196802\n城市人家\t196803\n悲回风\t196804\n中期票据\t196805\n环球医疗\t196806\n私账\t196807\n康婷\t196808\n8700K\t196809\n微模块\t196810\n请闭眼\t196811\ndofilter\t196812\n火纹\t196813\n青甘\t196814\n高腔\t196815\n光谷一路\t196816\n填埋场\t196817\n天籁\t196818\n党课\t196819\n走极端\t196820\nRoyale\t196821\n相似于\t196822\n1w\t196823\n曾经以为\t196824\n蚕豆花\t196825\nceph\t196826\n二十世纪九十年代\t196827\n东北门\t196828\n山西省监狱管理局\t196829\n同名类\t196830\nMonsters\t196831\n双清路\t196832\n列斐伏尔\t196833\n太平机场\t196834\n运动传感器\t196835\n郑永\t196836\n孔桥\t196837\n粉丝优惠券网\t196838\n比村\t196839\n组合包\t196840\n配货员\t196841\n海安县人民政府\t196842\n友言\t196843\n社会主义改造理论\t196844\n惠威吧\t196845\n不由天\t196846\n乐影\t196847\nlegally\t196848\n17g\t196849\nuncompress\t196850\n王莉霞\t196851\n龚文祥\t196852\n查集\t196853\nMemcache\t196854\n睡不到\t196855\n1.5亿\t196856\ngals\t196857\n開放\t196858\n天目山\t196859\nLongchamp\t196860\nV3.7\t196861\nsar\t196862\n张立勇\t196863\n牙箱\t196864\n褚橙\t196865\n20千米\t196866\n77mm\t196867\n吴克\t196868\nauthorize\t196869\n暖日\t196870\n32万\t196871\n史子\t196872\n赤霉病\t196873\n中央组织部\t196874\n北京市物价局\t196875\n热\t196876\n车溪\t196877\n竹茹\t196878\n信用管理师\t196879\n_香\t196880\n波斯语\t196881\n口罩卡\t196882\nnod32\t196883\nccpit\t196884\n时柱\t196885\n方国\t196886\n牛腱\t196887\n分子泵\t196888\n王度庐\t196889\nNortheast\t196890\n化学成份\t196891\n像元\t196892\n霓凰郡主\t196893\n大歌\t196894\n凤栖梧\t196895\n夏商\t196896\nFMBA\t196897\n纪元2070\t196898\n1964年\t196899\npid控制\t196900\n走肾不走心\t196901\n代理制\t196902\nWritten\t196903\n金花鼠\t196904\n改居\t196905\n802.11ac\t196906\n泸县\t196907\nAeon\t196908\nverity\t196909\n径向基函数\t196910\n企业询证函\t196911\n挺举\t196912\nAless\t196913\n上年末\t196914\nLDR\t196915\n夏风\t196916\n2147467259\t196917\n超跌反弹\t196918\n橙县\t196919\nBoxer\t196920\n1.55.0\t196
921\n操机\t196922\n国电电力\t196923\n博思网\t196924\n袋子\t196925\n子成\t196926\n周利\t196927\ntaskalfa\t196928\nINDIA\t196929\n慷慨解囊\t196930\n幻影\t196931\n不过如此\t196932\n阿里集团\t196933\n格式塔心理学\t196934\nXUAN\t196935\n3DMAX2012\t196936\njessie\t196937\n賽馬會\t196938\n陈水\t196939\nQzone\t196940\n星锤\t196941\nKISS\t196942\n平安寿险\t196943\n死神来了吧\t196944\nSOW\t196945\n几】\t196946\n一次就好\t196947\n到手\t196948\n还原点\t196949\n职涯\t196950\n城下\t196951\n上海老洋房\t196952\n就业力\t196953\n木桌\t196954\n平板门\t196955\n中国网地产\t196956\n指导者\t196957\n左立\t196958\n4.27\t196959\n细粒度\t196960\nfide\t196961\n登革\t196962\n240斤\t196963\n秧田\t196964\n小砖\t196965\nP10P\t196966\n富坚义博\t196967\n文澜中学\t196968\n平局\t196969\n流星锤\t196970\nreferenced\t196971\nvalvrave\t196972\n20160402\t196973\n石榴裙\t196974\n知书达礼\t196975\n444534800\t196976\n炕桌\t196977\n村上丽奈\t196978\n一两万\t196979\n乐高星球大战\t196980\n腻子\t196981\n守卫剑阁纵横天下\t196982\n青青子\t196983\n丽雅\t196984\nGBT\t196985\n拔丝\t196986\n疲倦\t196987\n徐庄镇\t196988\n32支\t196989\n智选假日酒店\t196990\n活扣\t196991\n曲径\t196992\n唠叨\t196993\n互锁\t196994\n贻贝\t196995\n九所\t196996\n制人才培养\t196997\n淫香\t196998\n蒜泥\t196999\n中国远洋\t197000\nPCC\t197001\n长琴\t197002\nmau\t197003\n苍溪县\t197004\nWOrd\t197005\n三级管\t197006\n3亿元\t197007\n九峰山\t197008\n划拳\t197009\ncovalent\t197010\nCodeForge\t197011\nwin7+ubuntu\t197012\n寿行\t197013\nmafumafu\t197014\n16进制数\t197015\n群名\t197016\n萧炎\t197017\n李志新\t197018\n拉珠\t197019\n妪\t197020\nFineCMS\t197021\nshide\t197022\n奇声\t197023\n45斤\t197024\n张国\t197025\nP95\t197026\n新亮剑\t197027\n猜不透\t197028\n大石坝\t197029\n第十九回\t197030\nwangka\t197031\n首师大附中\t197032\n凤凰宽频\t197033\n二百元\t197034\n诺科\t197035\n嗨付\t197036\n济川\t197037\n黑椒牛排\t197038\n黑百通\t197039\n心寒\t197040\n丝面\t197041\nspeedo\t197042\n佛山网易\t197043\n壁挂架\t197044\nCathy\t197045\n搔\t197046\n新铃木\t197047\n一场\t197048\n伯劳鸟\t197049\n放宽\t197050\n向阳开\t197051\n深圳建国泌尿外科医院\t197052\n临海\t197053\n鄂伦春\t197054\n国家经贸委\t197055\n保健院\t197056\n惊魂\t197057\n奁\t197058\nJumeirah\t197059\n无翼鸟邪恶漫画\t197060\n第三任\t197061\n车服\t197062\n刘永胜\t197063\nMAXON\t197064\ncubic\t197065\nPrejudice\t197066\n耐久\t19706
7\n民间剪纸\t197068\n汇海\t197069\n冼村\t197070\njuese\t197071\n无字碑\t197072\n蜘蛛人\t197073\nR6400\t197074\nstag\t197075\nswitc\t197076\n20150902\t197077\nmagicavoxel\t197078\n离镜\t197079\n144帧\t197080\n余姚\t197081\nweiba\t197082\n≧\t197083\n筑龙岩土工程\t197084\ncomposition\t197085\n牛刀\t197086\n第十五个\t197087\n深圳之窗\t197088\n江湖路\t197089\n陈洋\t197090\nWORKBENCH\t197091\n无障碍设施\t197092\nCrank\t197093\n2500亿\t197094\nOtto\t197095\n制动力矩\t197096\ns6edge+\t197097\n9平方米\t197098\n章鱼\t197099\n追上\t197100\n阿k\t197101\n受肉\t197102\n老好\t197103\n爱的大冒险\t197104\n护短\t197105\n鄂教版\t197106\n三月底\t197107\n色文\t197108\n海淀区北\t197109\n物理卷\t197110\nFrequent\t197111\n12.56厘米\t197112\nadhoc\t197113\n浙江行政学院\t197114\n天马屏\t197115\n一对对\t197116\n货币银行学\t197117\n爆丸\t197118\nLQ-680K\t197119\n出证\t197120\n弥陀寺\t197121\n兵役登记证\t197122\nLaminate\t197123\n恰噶\t197124\n50T\t197125\n三粒\t197126\n惯偷\t197127\n垣曲县\t197128\n炮灰女\t197129\n载分\t197130\nyoushi\t197131\n中币网\t197132\n克里木\t197133\n述职述廉\t197134\nnvlddmkm\t197135\n加美\t197136\n山师\t197137\n迷语\t197138\n开县\t197139\n幸福门\t197140\n119.com\t197141\nPos机\t197142\nUsing\t197143\n缺铁\t197144\n安仁镇\t197145\n政工\t197146\n呼吸管\t197147\n九步\t197148\nMOQ\t197149\nHiPP\t197150\n温迪\t197151\nadmire\t197152\nIPN\t197153\n小林正雪\t197154\nxauth\t197155\n电气安装工程\t197156\n0023\t197157\n杜聿明\t197158\nscilab\t197159\n卓正\t197160\n欧州\t197161\n1083\t197162\n购物街\t197163\n夹线\t197164\n左右手\t197165\n2196\t197166\n有机化学网\t197167\n学童\t197168\n杭州湾大桥\t197169\n日产轩逸\t197170\n喝醉酒\t197171\n龙恩0707\t197172\nSika\t197173\ntum\t197174\n2.4亿\t197175\nYeezy\t197176\n精仪学院\t197177\n开户行行号\t197178\n斯特\t197179\n衣冠\t197180\nHSPICE\t197181\n多功能\t197182\n顶呱\t197183\ninssider\t197184\n出口交货值\t197185\n漳港\t197186\n星际比亚迪f0\t197187\n便民查询网\t197188\n机器手\t197189\n黄玉\t197190\nUIAutomator\t197191\n西安晚报\t197192\njn\t197193\n成都医院\t197194\n黑号\t197195\nOPPOR11Plus\t197196\n区图书馆\t197197\n烟台高新区\t197198\n搭错车\t197199\n大襟\t197200\nJenny\t197201\n2天1夜\t197202\n芝加哥医院\t197203\nhoward\t197204\nPogo\t197205\nSystemic\t197206\n木屐\t197207\n龙与猫之国\t197208\n管理式\t197209\n野猪\t197
210\n马陆镇\t197211\n笑骂\t197212\n存储卷\t197213\ndinoy\t197214\nfabric\t197215\n1356\t197216\n神武新区\t197217\nsum函数\t197218\nlistagg\t197219\n1.7.8\t197220\n年包\t197221\n15t\t197222\n三把\t197223\n枣园小区\t197224\nwarren\t197225\n工疗\t197226\n韩国中央大学\t197227\n600233\t197228\nkakao服\t197229\n瑕不掩瑜\t197230\n古墓丽影4\t197231\n长江韬奋奖\t197232\n旺夫\t197233\n刹马镇\t197234\n月亮与六便士\t197235\n善\t197236\nauthenticated\t197237\n打雪仗\t197238\n反恐怖主义法\t197239\n臧天朔\t197240\nA4腰\t197241\n加值\t197242\navaya\t197243\n冰峰魔恋\t197244\n11根\t197245\n广州地铁12号线\t197246\n地西泮\t197247\nparachute\t197248\n信息管理专业\t197249\n安徽审计职业学院\t197250\n105克\t197251\nphotoshop7\t197252\n重返德军总部新秩序\t197253\n33248588\t197254\nhdm\t197255\n良药\t197256\n35%\t197257\nGuacamole\t197258\nvs岚\t197259\n贵州省体育局\t197260\nhelixstudios\t197261\n北京市疾病预防控制中心\t197262\n吉林大学继续教育学院\t197263\np8\t197264\n花山镇\t197265\n真版\t197266\n甲类厂房\t197267\n超时空男臣\t197268\nStaff\t197269\n流行曲\t197270\n灰口\t197271\n南面\t197272\n数值传热学\t197273\n阿加莎\t197274\n大沽南路\t197275\ntbd\t197276\nminutes\t197277\n福能\t197278\n王vrains\t197279\nRCD510\t197280\npolarization\t197281\nBackspace\t197282\npapago\t197283\n高线公园\t197284\n横琴新区管委会\t197285\n花池\t197286\n南昌晚报\t197287\n龙门古镇\t197288\n刑事裁判\t197289\n锰酸锂电池\t197290\n秦汉胡同\t197291\n萧元启\t197292\nNPOI\t197293\n邱隘镇\t197294\n患有\t197295\n1298\t197296\nWhy\t197297\n口腔助理医师\t197298\n克格莫\t197299\n赵鹏程\t197300\n1103\t197301\n夏朗\t197302\n子div\t197303\nAsm\t197304\n叶花\t197305\n张韵涵\t197306\n猪颈肉\t197307\n越式\t197308\n天天酷跑\t197309\n征求稿\t197310\n负号\t197311\n衰仔\t197312\n吭声\t197313\n别缠\t197314\n五山路\t197315\n咚咚\t197316\njuris\t197317\n130家\t197318\n管理台\t197319\n灾难片\t197320\n姓刘\t197321\n安利\t197322\n南海岸\t197323\n八达通卡\t197324\n柯美6500\t197325\n拉林铁路\t197326\npgis\t197327\nvczh\t197328\nwinflash\t197329\nstardict\t197330\n分泌性中耳炎\t197331\n27.5g\t197332\n第109期\t197333\n华为mate9pro\t197334\n小儿脑瘫\t197335\n周结\t197336\n定额价\t197337\n6分\t197338\n通心菜\t197339\n温存\t197340\njdl\t197341\nsmzdm\t197342\n幻想神域ol吧_\t197343\n南浦\t197344\n剡溪\t197345\ncp吧\t197346\ndpkg\t197347\nguanbi\t197348\n波罗的海\t1
97349\n推流器\t197350\n2014-2019年\t197351\n月出\t197352\n武强县\t197353\n袁了凡\t197354\n单招\t197355\n林奕匡\t197356\n二\t197357\n127家\t197358\n木鞋\t197359\nruiren491112\t197360\n河北电信\t197361\n淮北市\t197362\n中国平顶山市政府\t197363\n日光岩\t197364\n勃艮第\t197365\n瓜皮\t197366\n居家养老服务\t197367\n山水纹\t197368\n长颈\t197369\n20171211\t197370\nDJ音乐厅\t197371\nKpop\t197372\n胡牌\t197373\n张万霖\t197374\n开怀\t197375\n统驭\t197376\n萧林\t197377\nlaravle\t197378\nイタズラ\t197379\n观影券\t197380\nstefan\t197381\n抱佛\t197382\n赛罗奥特曼\t197383\n当局\t197384\ntechnic\t197385\n92部\t197386\n田雷\t197387\n牧云溪谷\t197388\n|张\t197389\n唯兰颂\t197390\n上海国际酒店\t197391\n音乐梦想\t197392\n很厉害\t197393\n乙酸酐\t197394\nBr2\t197395\n宣发\t197396\n宫颈肥大\t197397\n绝景版\t197398\nX99A\t197399\n别国\t197400\nHotDog\t197401\n甘谷\t197402\n锦\t197403\n蓝地\t197404\n17章\t197405\n接口处\t197406\n北京中国移动\t197407\nMajesty\t197408\n3.8米\t197409\n双纤\t197410\n周报\t197411\nIPV\t197412\n12排\t197413\n班名\t197414\njplayer\t197415\nstc\t197416\nxinjiang\t197417\n太饱\t197418\nleisi\t197419\nbaocuo\t197420\nqq批量\t197421\n9004\t197422\n济南植物园\t197423\nUSIM卡\t197424\n合肥万达\t197425\n容斥\t197426\n扎西\t197427\n精囊\t197428\n思乡曲\t197429\nGuys\t197430\n心力衰竭\t197431\n钓技\t197432\n鸟仔\t197433\n赋值变量\t197434\n果不其然\t197435\n市级政府\t197436\n自吸泵\t197437\n庭前花开花落\t197438\ncxy\t197439\n火炕\t197440\nJadClipse\t197441\ntube\t197442\nhrv\t197443\nbx300\t197444\n十六国\t197445\n康缘\t197446\n万润股份\t197447\n尼罗河\t197448\n克诺尔\t197449\n屐齿\t197450\n广西区\t197451\n西咪替丁\t197452\n乌廷芳\t197453\nHints\t197454\n746\t197455\n铜梁\t197456\n陈逸\t197457\ncjs\t197458\ndysx\t197459\n打私\t197460\n接骨木\t197461\n钱治亚\t197462\n非泵\t197463\n含沙量\t197464\n火影忍者究极忍者风暴\t197465\n40亩\t197466\n旋转门\t197467\n刘乐\t197468\nPetterLiu\t197469\n2004年4月\t197470\n_鸟\t197471\n母球\t197472\n1009\t197473\nswot分析法\t197474\n国史大纲\t197475\n神硕\t197476\n加权平均\t197477\n耐高联赛\t197478\n碟中谍6\t197479\n不善者\t197480\nTinker\t197481\n马奈\t197482\n列位\t197483\nfmincon\t197484\n奥比岛\t197485\n永丰镇\t197486\n平行进口车\t197487\n美国空军\t197488\n李永富\t197489\n千吨级\t197490\n中房\t197491\n全校\t197492\n东北师范大学》\t197493\njre\t197494\n10月1
4日\t197495\n五四北\t197496\n小岭\t197497\n七公里\t197498\nremo\t197499\n精神分裂\t197500\n犬儒\t197501\n氯气\t197502\n张志成\t197503\n柜理\t197504\n4.4日\t197505\n空白项\t197506\n陈梓童\t197507\n柯坪县\t197508\n画曲面\t197509\njiathis\t197510\nCONNECTION\t197511\nhoist\t197512\n喜神\t197513\n佐藤健\t197514\nshe\t197515\n苏州人才网\t197516\n梦幻号\t197517\n孙菲菲\t197518\n包玉刚实验学校\t197519\n工头\t197520\n陈超美\t197521\n云解决方案\t197522\n人者\t197523\n适宜性\t197524\n湖南交通工程学院\t197525\nm24\t197526\n越军\t197527\n吹笙\t197528\n牛顿运动定律\t197529\n20150717\t197530\n比亚迪S6\t197531\n超级街头霸王2\t197532\nUrgent\t197533\n经贸\t197534\n动易技术中心\t197535\n置入\t197536\n签收单\t197537\n成都双流机场\t197538\n红杏园\t197539\n小海\t197540\n严格\t197541\n李大辉\t197542\n西南财经大学工商管理学院\t197543\n彝文\t197544\n六堡\t197545\nPCB-与非网\t197546\n中国武夷山市政府\t197547\n宁波保税区管理委员会\t197548\n大宗易\t197549\n富士康集团\t197550\n挫败感\t197551\n众安科技\t197552\n岳阳楼记\t197553\n副经理\t197554\n磨边机\t197555\n蛇龙\t197556\n培训椅\t197557\nLEON\t197558\nWindows10专\t197559\n污栅\t197560\n久闻网\t197561\nvideo.js\t197562\nfuta\t197563\n糙\t197564\n用卡\t197565\n商品性\t197566\nHIS\t197567\n设计师\t197568\n冲绳机场\t197569\ndashed\t197570\n澳柯玛\t197571\n活费\t197572\nMiracast\t197573\n开始菜\t197574\nValidation\t197575\n抵债\t197576\n孙晓\t197577\n露脐\t197578\n周怡\t197579\n半载\t197580\n主题库\t197581\nGetaway\t197582\n8.x\t197583\nbata\t197584\n点翠\t197585\nRobertson\t197586\n埃尔加\t197587\n挫败\t197588\n艾维\t197589\n先息\t197590\n美学史\t197591\n26201121\t197592\n苏州高新区\t197593\n优速快递\t197594\n克利夫兰骑士\t197595\n张歆诺兰\t197596\n水标\t197597\n黑木瞳\t197598\n天行健\t197599\n中广核技\t197600\n篮球裤\t197601\nremotely\t197602\n皮垢\t197603\n哭腔\t197604\n商友\t197605\n主考\t197606\n深圳市傲基电子商务股份有限公司\t197607\n情敌\t197608\n邯郸市第一中学\t197609\n李书福\t197610\n汕头日报\t197611\n杨澜金凯瑞\t197612\n堡主\t197613\n命门\t197614\n常州市规划局\t197615\n迎春花\t197616\n风险源\t197617\nm27\t197618\n小菲\t197619\n一民\t197620\nbarra\t197621\n0.54\t197622\noakley\t197623\n宁波房产网\t197624\n宛转\t197625\n詹瑞文\t197626\nnights\t197627\n阿格拉玛\t197628\n蟒山森林公园\t197629\n福荫\t197630\n以文会友\t197631\ntank\t197632\n核工\t197633\n北京北新桥\t197634\n河北省纪委监委\t197635\n首都医科大学附属北京友谊医院\t197636\nsc
aleType\t197637\n奥迪R8\t197638\n舞蹈鞋\t197639\n9e\t197640\n国家高新技术产业开发区\t197641\n北京师范大学政府管理学院\t197642\n小儿腹泻\t197643\n尤长靖\t197644\nmphil\t197645\n南海舰队\t197646\n28w\t197647\ntrash\t197648\n侨胞\t197649\n窒\t197650\n高坂桐乃\t197651\n京科\t197652\n厚街\t197653\nv1.2.2\t197654\n快斗\t197655\n初犬\t197656\n拼音格\t197657\n陈庚\t197658\n燃气调压器\t197659\n教长\t197660\n欲焰\t197661\n守关\t197662\n二醇\t197663\n刷机包\t197664\n世间\t197665\n长牌\t197666\n2秒钟\t197667\nntleas\t197668\npodfile\t197669\n白雪公主与猎人\t197670\n锐7000\t197671\n创业家\t197672\n作业部落\t197673\n忠告\t197674\n尤尼\t197675\n厦门日报电子报\t197676\n印刷稿\t197677\n品牌传播网\t197678\n苏氨酸\t197679\n商标注册证\t197680\n石墨烯\t197681\n宋瑞\t197682\n历上\t197683\n5千万\t197684\n上海学校\t197685\nHearthstone\t197686\n道台\t197687\n骆驼趾\t197688\n读读\t197689\n汉鼎\t197690\nFSO\t197691\n国家公园\t197692\n云南石林\t197693\n中西部\t197694\n克利斯朵夫\t197695\nSGD\t197696\n每当\t197697\n持名\t197698\n龙方\t197699\n梁铮\t197700\n3月24\t197701\n冰雪女王\t197702\n童颜针\t197703\n湖岭镇\t197704\n同乐村\t197705\n分期利率\t197706\nconsul\t197707\n十件事\t197708\n守约\t197709\n愁容\t197710\n南京酒店\t197711\n邹哥\t197712\nM50\t197713\n荆门市人民政府\t197714\n桃君\t197715\n立体书\t197716\nservicestack\t197717\n钢板仓\t197718\nIE地址栏\t197719\neyuyan\t197720\n渡\t197721\n战锤末日鼠疫2\t197722\n贲门炎\t197723\n静冈县\t197724\n35亿美元\t197725\nSSI\t197726\n一键端\t197727\n安庆岳西网\t197728\n桑弘羊\t197729\n苦恋\t197730\n长沙中国青年旅行社\t197731\neai\t197732\nNS2\t197733\n体育教育训练学\t197734\n安全生产事故隐患排查治理暂行规定\t197735\n一拳超人第二季\t197736\n天邪鬼青\t197737\n2220\t197738\n月经周期\t197739\n二龙路\t197740\n珍档\t197741\n马哈蒂尔\t197742\n常\t197743\n科赛\t197744\n姜惠贞\t197745\n六有\t197746\ntakeover\t197747\neconomy\t197748\n苏妲己\t197749\ngadget\t197750\nOrigins\t197751\n2345加速浏览器\t197752\nProE\t197753\n4000万\t197754\n刘季\t197755\n刘姥姥\t197756\n鬼铃\t197757\nXenCenter\t197758\n本溪路\t197759\nmash\t197760\n澎湖\t197761\n风凰\t197762\n解決\t197763\n自贸区\t197764\n新文学\t197765\nBL锁\t197766\n600多个\t197767\npet-ct\t197768\n周围神经炎\t197769\n黄寺大街\t197770\ntweak\t197771\n一涵\t197772\n大道朝天\t197773\nnui\t197774\n电导\t197775\n初见\t197776\n区文广新局\t197777\n响响\t197778\n红色家书\t197779\n凯茵\t19778
0\n呼兰路\t197781\n朝阳区地税局\t197782\n重庆十一中\t197783\n开瑞汽车\t197784\n咖喱鱼蛋\t197785\n0.15g\t197786\n页岩多孔砖\t197787\n行政区划通史\t197788\nSakai\t197789\npractices\t197790\n叶卡捷琳娜二世\t197791\n大买卖\t197792\n升本\t197793\nooc\t197794\n检修箱\t197795\n20130815\t197796\n雷州\t197797\nbirmingham\t197798\n民信\t197799\n对称性\t197800\n壁墩\t197801\nengin\t197802\n唐家墩\t197803\n梁钢筋\t197804\n王晓军\t197805\nvivoy55\t197806\n51Job\t197807\n北美地区\t197808\n2.50\t197809\n郝劭文\t197810\n韩兆新\t197811\n子元\t197812\n维港\t197813\n抑制性\t197814\n沪江网\t197815\nword文\t197816\n创作基地\t197817\n乌拉尔山\t197818\n不可选中\t197819\n雀\t197820\n农信银行\t197821\n超星电子\t197822\n做秀\t197823\n华安捷讯\t197824\n发际线\t197825\n4月19\t197826\n七雄争霸\t197827\nJOKER\t197828\nmcv\t197829\n执子\t197830\n巴朗\t197831\n大连石化\t197832\n索桥\t197833\nsharepoint\t197834\nAI\t197835\n鼻症\t197836\n传谣\t197837\n暖管\t197838\n610\t197839\n不然后悔\t197840\n少女骑士物语\t197841\n微魔\t197842\n凭\t197843\nispa\t197844\n铁皮柜\t197845\n2芯\t197846\n慢病\t197847\n周媛\t197848\nVCSA\t197849\n5.1.5\t197850\n景德镇陶瓷大学科技艺术学院\t197851\n寒冬期\t197852\n不匀\t197853\n憋\t197854\n叶子柯蓝\t197855\n插花\t197856\n超级机器人大战OG\t197857\n乌兰牧\t197858\n纽芬兰\t197859\n武志红\t197860\ncannondale\t197861\n录音笔\t197862\njonathan\t197863\n防磁\t197864\nkaby\t197865\n顶岗\t197866\n鼓凳\t197867\n助燃剂\t197868\n巧克力蛋糕\t197869\n边境城市\t197870\n广西建设执业资格注册中心\t197871\nc++库\t197872\n斯宾特\t197873\n16G101\t197874\n黄页/公司库/大全/\t197875\nskrillex\t197876\n矢志\t197877\nProphet\t197878\nCCTV-6电影频道\t197879\nyuna\t197880\n对比性\t197881\n狗焕\t197882\news\t197883\nKodak\t197884\n孙宇\t197885\n朗费罗\t197886\n公告栏\t197887\n白领通\t197888\n厂址\t197889\n食谱\t197890\nCold\t197891\n放牛娃\t197892\n纯手\t197893\n百万吨\t197894\n交叉轴\t197895\n厨师帽\t197896\nSaver\t197897\n猥亵\t197898\n标致雪铁龙\t197899\n致歉信\t197900\n宣威火腿\t197901\nreagent\t197902\n新浪微盘\t197903\nbattles\t197904\n红油\t197905\nfrozenset\t197906\nshit\t197907\n叫来\t197908\n沧龙\t197909\n圆葱\t197910\nCHF\t197911\n石之门\t197912\nv3.1.3\t197913\n国际花园\t197914\n一欧元\t197915\n吉普车\t197916\nv12\t197917\n有点意思\t197918\n寺庙\t197919\n没头脑和不高兴\t197920\n喝啤酒\t197921\nUI界面\t197922\nmupdf\t19792
3\nwin10excel\t197924\n1tb\t197925\n无机非金属材料\t197926\n硝酸铋\t197927\n檐廊\t197928\nREVIVO\t197929\n万卷\t197930\nProvíncia\t197931\n网上书店\t197932\n波罗\t197933\nNIU\t197934\n微话题\t197935\n211工程大学\t197936\n奇谭\t197937\n脂蛋白a\t197938\n东奥\t197939\n280万\t197940\n火蝠\t197941\n李集镇\t197942\ndSPACE\t197943\npdf转换器\t197944\nH3C路由器\t197945\nsrt\t197946\nCited\t197947\nwifi增强器\t197948\n20180417\t197949\n短银\t197950\n过会\t197951\n唐子宜\t197952\n波色\t197953\n朱颜血\t197954\ndefend\t197955\n巫娜\t197956\n奇崛\t197957\n无法入睡\t197958\n全栈之路\t197959\n黄钦\t197960\nFurmark\t197961\n粘鼠板\t197962\nVxLAN\t197963\nb25\t197964\n磁控管\t197965\n四川省高级人民法院\t197966\n山东省体育局\t197967\n亚硝酸钠\t197968\nazone\t197969\n不速之客\t197970\n江南园林\t197971\nExplore\t197972\n蝴蝶简笔画\t197973\n黑白熊\t197974\n前班\t197975\n江苏大学附属医院\t197976\nStone\t197977\n上海农商银行\t197978\nonedrive\t197979\n沙迦\t197980\n速卖通运费\t197981\n丑行\t197982\n局部性\t197983\n画押\t197984\n裂痕\t197985\n沐风网\t197986\nDaybreak\t197987\ncku\t197988\n河东万达广场\t197989\n冈仁波齐\t197990\nmx8\t197991\n碱基对\t197992\nBOB\t197993\nmobile9\t197994\nⅥ\t197995\n嫩枝\t197996\n刻碑\t197997\ndefinitions\t197998\nsm卡\t197999\n饱经沧桑\t198000\n列别\t198001\n专责\t198002\n电磁开关\t198003\n百夫长白金卡\t198004\n捷信贷款\t198005\n肿胀感\t198006\n田磊\t198007\n多彩贵州网\t198008\n五言诗\t198009\n米田\t198010\nkubeadm\t198011\n垫材\t198012\n捉妖\t198013\n1.10\t198014\n红颜薄命\t198015\n录取比\t198016\n触宝\t198017\n奇石\t198018\n五矿万科\t198019\n邢台县\t198020\n白鹰\t198021\n浴火\t198022\n20级\t198023\n稳中求进\t198024\n外生\t198025\n本钢\t198026\n深情\t198027\n索尼XZ2\t198028\ntry-catch\t198029\n大隋\t198030\n泰國\t198031\n奎迪\t198032\n天麻首乌片\t198033\n二宫ナナ\t198034\ndropwizard\t198035\n弹窗\t198036\n陈振\t198037\n撅\t198038\n普拉沃尔沃s60\t198039\n开山鼻祖\t198040\n刷白\t198041\n高级技工学校\t198042\n恶咒\t198043\n超级碗中场秀\t198044\n仙人\t198045\n工师\t198046\n求成\t198047\n赵佶\t198048\n宝清\t198049\niKuai爱快流控\t198050\n天安门广场\t198051\n一大批\t198052\n背负式\t198053\n引伸\t198054\n王者荣耀百里\t198055\n生活馆\t198056\nwindow8\t198057\n缜\t198058\n夏萍\t198059\n古生物学\t198060\n省委老干部局\t198061\n美食总动员\t198062\nWPS2013\t198063\n妖魂\t198064\n致动\t198065\n荣耀S9\t198066\n江苏省交
通运输厅\t198067\n网络文化经营许可证\t198068\noccurred\t198069\n成都乐居网\t198070\nGenome\t198071\n考试季\t198072\n家养\t198073\n桑田\t198074\n英里\t198075\n第182集\t198076\n中台禅寺\t198077\nVIVE\t198078\n简中版\t198079\nhiti\t198080\n分公\t198081\nBangladesh\t198082\n海石\t198083\n腾讯加速器\t198084\n连阴\t198085\n氩弧焊丝\t198086\nyrh\t198087\n水云居\t198088\n苏若禹\t198089\n融e购\t198090\n搜狐时尚\t198091\n赵青\t198092\n思录\t198093\n北京国际雕塑公园\t198094\n湖州晚报\t198095\n七年后\t198096\nHONT\t198097\n宿州路\t198098\n单率\t198099\n2017年3月9日\t198100\n沉默的天使\t198101\n吴晨\t198102\n今秋\t198103\n社区居委会\t198104\n终止经营\t198105\n王府井购物中心\t198106\nTheodore\t198107\n威唐工业\t198108\n商业健康险\t198109\n初回\t198110\n蜕壳\t198111\n古德寺\t198112\n唐海县\t198113\n互动营销\t198114\n盐田港\t198115\n谚\t198116\n克里思\t198117\n仓央汉乡\t198118\n500万吨\t198119\n电源论坛\t198120\nCC388A\t198121\n飞箭\t198122\nccorz\t198123\nExcelHome技术论坛\t198124\n中欣卡\t198125\n半坡遗址\t198126\nalienware17\t198127\n硅胶\t198128\n怀瑾\t198129\n炼厂\t198130\n小欢\t198131\n喜大\t198132\n7.8%\t198133\n研修班\t198134\n煮蛋\t198135\n皱襞\t198136\nssl\t198137\nPOS\t198138\n白云街\t198139\n脚照\t198140\n北方工业大学\t198141\nH10\t198142\n说拜拜\t198143\n400只\t198144\n测一测\t198145\n南科大\t198146\n革新路\t198147\niphone5S\t198148\nAlgorithmic\t198149\n一起来电\t198150\n墙皮\t198151\n鼎尚\t198152\n张景中\t198153\n美原\t198154\n刹车碟\t198155\n泸溪\t198156\n龚蓓苾\t198157\ngaige\t198158\n大足县\t198159\n600万\t198160\n小形\t198161\n柳州市政府网\t198162\n重货\t198163\n科目表\t198164\nRX3\t198165\nSOSO\t198166\ntizen\t198167\n东方公司\t198168\n大理市\t198169\n攻略篇\t198170\n黄虎\t198171\n新环境\t198172\n落寞\t198173\n深圳通\t198174\n30fps\t198175\n南京土壤研究所\t198176\n大瀚\t198177\nhottoys\t198178\nstage\t198179\n刘家村\t198180\n久久信息网\t198181\n难逢\t198182\n刘安琪\t198183\n流变仪\t198184\n悟空理财\t198185\n赛出\t198186\nジ\t198187\n军棍\t198188\ncontrib\t198189\n医疗信息系统\t198190\n_联商论坛\t198191\n青狼\t198192\n金枝\t198193\n965m\t198194\n点炮\t198195\n3啪啪啪\t198196\n4.5%\t198197\n大块头\t198198\n乳铁蛋白\t198199\n曲院风荷\t198200\n贵宾厅\t198201\n斯维\t198202\n贝丝\t198203\n哪里\t198204\n酷炫吊炸天\t198205\n踝\t198206\n曲风\t198207\nSandwich\t198208\n龙潭湖公园\t198209\n八九点钟\t198210\n猪肚汤\t198211\n微妙
\t198212\n大头儿子\t198213\n企鹅社区\t198214\n存款保险\t198215\n沪江小D英语翻译\t198216\nCyanogen\t198217\n江东区\t198218\n杭州人大\t198219\n邮政储蓄\t198220\n光迅\t198221\n转来\t198222\n金融管理专业\t198223\ncapsules\t198224\n第二十\t198225\n相似三角形\t198226\n南京鼓楼区\t198227\n深圳格力空调\t198228\n钳型\t198229\n老人与海鸥\t198230\nU9网\t198231\n波士顿矩阵\t198232\nnex5r\t198233\noptimal\t198234\n2018年4月24\t198235\n凯迪电力\t198236\nnota\t198237\n一腔\t198238\n引战\t198239\n三只松鼠\t198240\n数据盘\t198241\n阔别\t198242\n流行服装\t198243\n烧炭\t198244\nSATA接口\t198245\ncodeforce\t198246\n华润雪花\t198247\ncodevs\t198248\n勾针\t198249\n现代农业示范园区\t198250\nhonest\t198251\n私库\t198252\n迫\t198253\npagelist\t198254\nクリスマス\t198255\n函格式范文\t198256\ncox\t198257\n科尔尼\t198258\n西渡\t198259\n莹润\t198260\n浪味\t198261\n营养剂\t198262\n20多天\t198263\n安昌古镇\t198264\nsed\t198265\n庚寅\t198266\n里约\t198267\n22家\t198268\n化学类\t198269\n天水市\t198270\n无声处\t198271\n雍正剑侠图\t198272\nlumo\t198273\n114黄页\t198274\njiyi\t198275\n发轫\t198276\n大学区\t198277\n临高唐\t198278\n鼻夹\t198279\nhtml文件\t198280\n安吉丽娜朱莉\t198281\n绍兴文理学院\t198282\npuck\t198283\n水星记\t198284\n欲成欢\t198285\n男兵\t198286\n镨钕\t198287\n任意门\t198288\n电热线\t198289\n后视镜\t198290\n纸空文\t198291\n辽宁省物价局\t198292\n1.3.8\t198293\n广兰路站\t198294\n鸿言\t198295\n上车\t198296\n油画棒\t198297\n小制\t198298\n电子邮\t198299\n同勉\t198300\n良恶性\t198301\n切板\t198302\npath环境变量\t198303\n磨难\t198304\n處\t198305\n李建设\t198306\n顶杆\t198307\n新抚区\t198308\n猪男\t198309\nGB18030\t198310\nSharePoint\t198311\n4396\t198312\n评书\t198313\n哈希算法\t198314\n贵司\t198315\n奖状\t198316\n9公分\t198317\n22寸\t198318\n逢山开路\t198319\nnyx\t198320\n12个月\t198321\nax100\t198322\n郫都区\t198323\n核式\t198324\n领先型\t198325\n掌盟\t198326\n莱州一中\t198327\n方行\t198328\n及格率\t198329\n二十五周年\t198330\n餐馆\t198331\n浙江农林大学\t198332\n肉肿\t198333\n无发\t198334\n艺华\t198335\nproperty_tree\t198336\n断代\t198337\n魏震\t198338\n胡桃色\t198339\n口袋妖怪3DS\t198340\n美丽俏佳人\t198341\n墅质\t198342\n试试看\t198343\n张胜\t198344\n炫浪\t198345\n围油栏\t198346\n笑顔\t198347\n音神\t198348\n力德\t198349\ndrc\t198350\n途居\t198351\nwfi\t198352\n20130621\t198353\n3子\t198354\n中国旅行社总社\t198355\n_额\t198356\n3.51\t198357
\n车墩镇\t198358\n连云港白塔埠机场\t198359\n四川省卫生和计划生育委员会\t198360\n中汇\t198361\n倾城色\t198362\n大傻\t198363\n升力\t198364\n葡七\t198365\n医学教育\t198366\n产品类\t198367\nA线\t198368\n云谷路\t198369\n港口群\t198370\n网儿\t198371\n中裙\t198372\n横渡\t198373\n东南快报\t198374\n电位计\t198375\n2k18吧_\t198376\n配享\t198377\n荨麻疹\t198378\n冠寓\t198379\nDOTA2\t198380\n随随\t198381\n长安道君\t198382\nYJK\t198383\n小凡\t198384\n人民交通网\t198385\nexecfile\t198386\n战神斯巴达之魂\t198387\n声音键\t198388\nidreamx\t198389\n何庆华\t198390\n绿檀木\t198391\n爱的奉献\t198392\n炒手\t198393\n奔驰B级\t198394\n可发性聚苯乙烯\t198395\n宁波镇海区\t198396\n天生桥\t198397\n2007年10月\t198398\n2017.04\t198399\n燃气费\t198400\n山君\t198401\nd8\t198402\n杭州北\t198403\n隣人\t198404\n冈本信彦\t198405\nAdwords\t198406\n数字线\t198407\nCOACH\t198408\n河北经济日报\t198409\n四年级下册语文\t198410\n河运\t198411\nSDM\t198412\n槊\t198413\n11英寸\t198414\n跳棋\t198415\n反键\t198416\n44级\t198417\n魔丝\t198418\n海马普力马\t198419\n长生劫\t198420\n南宁卫生信息网\t198421\n奇经八脉\t198422\n安徽日报\t198423\n花殇\t198424\n元宝\t198425\n高密度电法\t198426\n肖健\t198427\n安娜贝尔\t198428\neova\t198429\n铂金卡\t198430\n政客\t198431\n10堂\t198432\n中山大学珠海校区\t198433\n轻金属\t198434\n躺尸\t198435\nfrps\t198436\n宠幸\t198437\n李晓旭\t198438\n开张大吉\t198439\n尚品网\t198440\n大视野\t198441\nBrita\t198442\n汇佳\t198443\nvue-swiper\t198444\n阴雨天\t198445\nXE5\t198446\n托克\t198447\n监守自盗\t198448\n134a\t198449\n口过\t198450\n中正式\t198451\n黑红\t198452\n中华人民共和国标准化法\t198453\n江西省质量技术监督局\t198454\n诱骗\t198455\n其\t198456\n溪口风景区\t198457\nLME\t198458\n克山病\t198459\n165号\t198460\n违和\t198461\nKEN\t198462\n紫铜板\t198463\n180首\t198464\n山毛榉\t198465\nColt\t198466\n全距\t198467\n小南国\t198468\n制霉菌素片\t198469\nKODI\t198470\n一始村\t198471\n申请单\t198472\nGIt\t198473\n标量变量\t198474\n本科生\t198475\nkaola\t198476\n德国馆\t198477\nDDR4内存\t198478\ny85\t198479\n超标电动车\t198480\n扁平风\t198481\n分流\t198482\n疏花\t198483\n表参道\t198484\n傅晓田\t198485\n科学社会主义\t198486\norcad\t198487\n保洁工\t198488\n东方神起\t198489\ndistributed\t198490\n斯坦利·库布里克\t198491\nGeometric\t198492\nEP8\t198493\nRailways\t198494\n钟白\t198495\n电子商务类\t198496\n校花终结者2审判日\t198497\n空心电抗器\t198498\n诗书\t198499\ninheritance\t198500\n1
30万\t198501\n春秋三传\t198502\nDaniels\t198503\nmás\t198504\n蒂森电梯\t198505\n吓吓\t198506\n炫装\t198507\n地石\t198508\n李光宇\t198509\n二级消防工程师\t198510\n胡春华\t198511\n合肥\t198512\n婠婠\t198513\n石峰区政府\t198514\n严敏\t198515\nSCREEN\t198516\n平坟\t198517\n王亭文\t198518\n广州租车公司\t198519\nAssessing\t198520\n平安一账通\t198521\n中冶建工集团有限公司\t198522\nTechnical\t198523\n加勒比女海盗\t198524\n黄恺杰\t198525\n水资源费\t198526\n格力美的\t198527\nduyun\t198528\n落败\t198529\n醚类\t198530\nREF\t198531\n大兴旧宫\t198532\n_正义网\t198533\n槭树\t198534\n海昏侯墓\t198535\n500kva\t198536\n公证员\t198537\n金山股份\t198538\n欣都\t198539\n19.9元\t198540\n150度\t198541\n一个24寸\t198542\nillustr\t198543\n深圳技术大学\t198544\nstadium\t198545\n鳝\t198546\n地窟\t198547\nCrew\t198548\n重组方\t198549\n小数点\t198550\n亚米网\t198551\ncsmoney\t198552\n20180427\t198553\n心悦诚服\t198554\n星威\t198555\n剑齿\t198556\n来势汹汹\t198557\n劳动派遣\t198558\n复兴区\t198559\nAssetBundle\t198560\n淘寶網\t198561\nherozho\t198562\n减脂期\t198563\n考期\t198564\n廓形\t198565\nraid卡\t198566\n重庆市总工会\t198567\n长工\t198568\n武川县\t198569\n美姐\t198570\nromans\t198571\n醋精\t198572\n重生之嫡女祸妃\t198573\nveux\t198574\n恣\t198575\ncacheable\t198576\nleave\t198577\n泥城\t198578\nElevated\t198579\nIView\t198580\n扩股\t198581\n手汗\t198582\nx+1/\t198583\n王彩霞\t198584\n王金\t198585\nftp\t198586\n古典吉他论坛\t198587\n学霸君\t198588\n大火花\t198589\n0.3平方\t198590\n1盎司\t198591\n番茄\t198592\n总值\t198593\n尾钩\t198594\nMKV+RMVB\t198595\nyian\t198596\n中华全国供销合作总社\t198597\n服装设计专业\t198598\nairsoft\t198599\nRm\t198600\n周武王\t198601\nikea\t198602\nApk\t198603\nmediacoder\t198604\n双脚\t198605\nCC版\t198606\n大李\t198607\n球花\t198608\nUDK\t198609\nunderwriting\t198610\n九站\t198611\n摆拳\t198612\n天津方特欢乐世界\t198613\n空放贷款\t198614\n碧水源\t198615\n许继\t198616\n川字纹\t198617\nMBProgressHUD\t198618\n数公里\t198619\n淦\t198620\n奏\t198621\nchara\t198622\n广播包\t198623\n折叠凳\t198624\n生意经\t198625\n青科大\t198626\n青云镇\t198627\n胡艳\t198628\nann\t198629\n评议\t198630\nsigner\t198631\n交缠\t198632\n凉城公安局\t198633\n屁眼儿\t198634\n装甲骑兵\t198635\n泰铢\t198636\n硝酸铜\t198637\n2016年11月\t198638\n盖棺\t198639\n裕达\t198640\n新生路\t198641\n北京航空总医院\t198642\n钓鱼日
记\t198643\n19.99\t198644\n姬松茸\t198645\n出盘\t198646\ncad迷你看图软件\t198647\n150粒\t198648\n菲亚特菲翔\t198649\n18&\t198650\nhxh\t198651\n智神\t198652\nipad吧\t198653\n加拉\t198654\n种桑\t198655\n罪夜之奔\t198656\nTechnic\t198657\n专业技术人才网\t198658\nwhich\t198659\n献计\t198660\n梭罗\t198661\niPython\t198662\n梦游症\t198663\nfoxtable\t198664\nV8\t198665\n7723\t198666\n峰力助听器\t198667\n浦东日上免税店\t198668\n计划生育假\t198669\naldo\t198670\n杨锦麟\t198671\n五分钟内\t198672\n降板\t198673\n韩星\t198674\n石室\t198675\nclicker\t198676\n太阳能展\t198677\nStem\t198678\n夏敏\t198679\n临桂\t198680\n鹄\t198681\n善良\t198682\npll\t198683\n墨家\t198684\n下标\t198685\n芦荟膏\t198686\n连着\t198687\nigraph\t198688\n台胞证\t198689\n10匹\t198690\n耸人听闻\t198691\n郑州大学\t198692\n黄梦莹\t198693\n蝶湖湾\t198694\n卖场型\t198695\n合纤\t198696\n第2017\t198697\nn900\t198698\n小爱音箱\t198699\nORA-01034\t198700\nLANCER\t198701\n5.9.3\t198702\nNO.4\t198703\n中建\t198704\ntiti\t198705\n7320\t198706\n菊花蕾\t198707\nnagito\t198708\n合德\t198709\n想天开\t198710\n几盏\t198711\n小的时候\t198712\n组培苗\t198713\n第118章\t198714\n牟利\t198715\n切应力\t198716\n华宝证券\t198717\n做爱时候\t198718\n终奖\t198719\n远港\t198720\nHL-2140\t198721\n在座\t198722\n万树繁花\t198723\n胡作非\t198724\n唐盾\t198725\n民生保险\t198726\n南京工程学院\t198727\n另眼看\t198728\nLuminar\t198729\n保障网\t198730\n纬创\t198731\n纳爱斯集团\t198732\n种型\t198733\n越秀路\t198734\nHOSPITAL\t198735\n唐山学院\t198736\n斗兽\t198737\n720P.HD-MP4\t198738\n哪吒传奇\t198739\nPlenty\t198740\n塞尔达传说时之笛\t198741\n表联查\t198742\n有违\t198743\n美形\t198744\n800x600\t198745\n第19\t198746\n班得瑞\t198747\n4560\t198748\n决定\t198749\n一于\t198750\n大鸟\t198751\ngomez\t198752\n检测盒\t198753\n有机可乘\t198754\n华为荣耀4A\t198755\n祗园\t198756\n王向东\t198757\npropagate\t198758\nrefactoring\t198759\nwebRoot\t198760\n沈知夏\t198761\nnq\t198762\n宜兴法院\t198763\neDP\t198764\n桥头村\t198765\n木里藏族自治县\t198766\n窥\t198767\n半胱氨酸血症\t198768\n时空盘\t198769\n赛选\t198770\nPartners\t198771\n黄坛\t198772\nSignalR\t198773\n金泫雅\t198774\n边境牧羊犬\t198775\nka600资讯网\t198776\nusemap\t198777\n2o18\t198778\n绗缝机\t198779\nts8080\t198780\n20151111\t198781\n装死\t198782\n坍落\t198783\n外貌\t198784\nFCPX\t198785\n火枫
\t198786\n销售篇\t198787\n首南街道\t198788\n一个3\t198789\n莫等闲\t198790\n360云钻\t198791\n升势\t198792\nmoth\t198793\n辉腾\t198794\n当\t198795\nq宠大乐斗\t198796\n外汇返佣网\t198797\n电材\t198798\n三袋\t198799\n必备\t198800\n成片\t198801\n新浪尚品_新浪网\t198802\n高博文\t198803\n眉笔\t198804\n毒气弹\t198805\ninstances\t198806\n中国音乐金钟奖\t198807\n精原细胞\t198808\n诺亚奥特曼\t198809\nFULL\t198810\n影视圈\t198811\n奔腾\t198812\n神仙居景区\t198813\n魏伟\t198814\nAirport\t198815\n骨髓间充质干细胞\t198816\n注册离岸公司\t198817\n1060maxq\t198818\n肠易激综合征\t198819\n孙宁\t198820\n艾欧娜尔\t198821\n起居\t198822\n玉溪市\t198823\n纱奈\t198824\nhyatt\t198825\n陈奇\t198826\nRegus\t198827\nwi-1000x\t198828\n冻结帧\t198829\n上站\t198830\n20160304\t198831\n地条钢\t198832\n珠江城\t198833\n小学生\t198834\n世界遗产名录\t198835\nmasturbating\t198836\n1.52\t198837\n断水\t198838\n新智\t198839\n明水\t198840\n囧的呼唤吧\t198841\n从警\t198842\n冒险岛魔链\t198843\n爽爽\t198844\nMy\t198845\nfree本\t198846\nmicroarray\t198847\nLaurence\t198848\n石矿\t198849\njs浏览器\t198850\n拱墅区政府网\t198851\n南水镇\t198852\n200MB\t198853\n牧神独步天下\t198854\nSharing\t198855\n苏州市教育局\t198856\n大名\t198857\n中交一航局\t198858\n虾米电台\t198859\n若无其事\t198860\n花篮\t198861\n生玩\t198862\nMr.Ming2\t198863\n初验\t198864\n片色\t198865\n阅江楼\t198866\n北京市市\t198867\npowerbi\t198868\nClasses\t198869\npanels\t198870\n互联网时代\t198871\n幸福的影子\t198872\nRegulations\t198873\n今日子\t198874\n暗火\t198875\nTSQL\t198876\n范达尔\t198877\n假面骑士\t198878\ncarrot\t198879\n相如\t198880\n费雪\t198881\n王金山\t198882\n农业机器人\t198883\n4AM\t198884\n阿尔斯通\t198885\n爱殇\t198886\n统计信号处理\t198887\n2017-12-05\t198888\n英国皇家艺术学院\t198889\n风情园\t198890\n叫屈\t198891\n造反\t198892\nshocking\t198893\n新交规\t198894\n精灵宝可梦吧\t198895\nnet.ipv4.tcp\t198896\n圣子\t198897\n周转率\t198898\n番禺南村\t198899\n忒\t198900\nhydrating\t198901\n文游台\t198902\n古田美穗\t198903\n易燃性\t198904\n广宇集团\t198905\nqq秀\t198906\n渝怀铁路\t198907\n梅德明\t198908\n收视费\t198909\n慧车天下\t198910\n大包\t198911\nmaxmara\t198912\n代祷\t198913\n顺畅\t198914\n富滇银行\t198915\npl\t198916\n第十四回\t198917\nzhongg\t198918\n侍者\t198919\n构词法\t198920\nOLED\t198921\n青出\t198922\n蓝客\t198923\nVZ\t198924\n全书式\t198925\n有限状态机\t198926\n健忘症\t19
8927\nmate10Pro\t198928\n谢村\t198929\n接引\t198930\n云速\t198931\n小米笔记本\t198932\nkktv\t198933\n海峡午报\t198934\n五塘广场\t198935\nrho\t198936\nHC-05\t198937\n文子\t198938\n杭州凯悦酒店\t198939\n2503\t198940\n大气压\t198941\n多雨\t198942\n20150411\t198943\n副中心\t198944\nHooker\t198945\n酚类\t198946\n广东省电信规划设计院有限公司\t198947\n保障卡\t198948\n借阅机\t198949\n玖玥瑾\t198950\n3133\t198951\n江苏省环保厅\t198952\ncreatures\t198953\n天津移动\t198954\n胜平负\t198955\n捕鱼达人3\t198956\n苏州北站\t198957\nGreens\t198958\n软基\t198959\n7S\t198960\n公倍数\t198961\n轩逸论坛_汽车之家论坛\t198962\n东北网\t198963\n俞正强\t198964\n大洋街道\t198965\n擦去\t198966\n扫雷舰\t198967\nunb\t198968\nCB650F\t198969\n惊心食人族\t198970\n缴付\t198971\n雪花酥\t198972\n饮料瓶\t198973\n辅助账\t198974\n花枪\t198975\nlh服\t198976\n邪恶少女漫画全集\t198977\n荷兰银行\t198978\n伟星城\t198979\n排位\t198980\n应收账款周转率\t198981\n降水井\t198982\n跟单员\t198983\n嗳\t198984\n乌尤尼\t198985\n杜特\t198986\n微头条\t198987\n回收库存\t198988\n新价\t198989\n永续经营\t198990\n超限费\t198991\n外报\t198992\n海南省乐东黎族自治县人民政府\t198993\nblr18\t198994\nsimtrade\t198995\n中国科学院亚热带农业生态研究所\t198996\n牙位\t198997\n大喊\t198998\n超低价\t198999\n南昌新闻网\t199000\n参谋长\t199001\n蔡先生\t199002\n失步\t199003\n苍井\t199004\n淮南王\t199005\nOV2640\t199006\n174个\t199007\n出马\t199008\n0.3元\t199009\n承插式\t199010\ncairo\t199011\ngifcam\t199012\n第二十八\t199013\nFOTOMEN\t199014\nintsmaze\t199015\n怒战\t199016\n李成泽\t199017\n西口\t199018\n西联汇款\t199019\n铜灯\t199020\n钛白粉\t199021\n长安街道\t199022\n诫文\t199023\n山东省社会科学界联合会\t199024\n姜敏赫\t199025\nFFSKY天幻网\t199026\n星际公民\t199027\n000725\t199028\ndatealive\t199029\n风际\t199030\n俞敏洪\t199031\ndaban\t199032\nWebQQ\t199033\n高德导航离线地图包\t199034\n石油工程学院\t199035\nDublin\t199036\ngente\t199037\n南宁公司\t199038\nKpopStarz\t199039\nWaldorf\t199040\n汪丁丁\t199041\n如出一辙\t199042\n搜鞋网\t199043\n平车\t199044\n中泰信托\t199045\n5条\t199046\n春辉\t199047\n海平面\t199048\n光谷院区\t199049\ncaopoin\t199050\n郭维\t199051\n十三套\t199052\n十全九美\t199053\n八月份\t199054\n林三竖\t199055\n东京喰种第三季禁播?东京食尸鬼\t199056\ndeception\t199057\n北京戏曲艺术职业学院\t199058\n武则天秘史\t199059\n第41页\t199060\n麦芽糖醇\t199061\n3X\t199062\n解狐\t199063\naccurate\t199064\n完整片\t199065\n可餐\t19
9066\n藿香正气水\t199067\n剪切波\t199068\n健翔桥\t199069\nPIS\t199070\n四险\t199071\n北极星\t199072\n联通云数据公司\t199073\n四十度\t199074\n蛋白质粉\t199075\n长力\t199076\n天猫年货节\t199077\nPartnerships\t199078\n2017年10月14日\t199079\n老化测试\t199080\nscarpa\t199081\n2205\t199082\n圣都装饰\t199083\n纸雕\t199084\n武夷茶\t199085\n高家俊\t199086\nDON\t199087\n广州市脑科医院\t199088\n羞\t199089\n兴汝金城\t199090\n可读性\t199091\n16度\t199092\n扫弦\t199093\n稻香\t199094\nAmericans\t199095\n巨怪\t199096\n少年派\t199097\n@Transactional\t199098\n叙事诗\t199099\n二塘\t199100\n东江纵队纪念馆\t199101\n第3周\t199102\n玩具店\t199103\n第132期\t199104\n学乐\t199105\n时空爱丽舍\t199106\n养生桶\t199107\n葡萄汁\t199108\n合情\t199109\n红水\t199110\n过客\t199111\n前海自贸区\t199112\n肾绞痛\t199113\nPENTAX\t199114\n贵阳北站\t199115\n20多年后\t199116\n吊带袜\t199117\n稻城县\t199118\n绝地重生\t199119\n周兴哲\t199120\n上轮\t199121\n红亮的心\t199122\nGIGA\t199123\n考察材料\t199124\nGTX965M\t199125\n看不穿\t199126\n豌豆射手\t199127\nPageHelper\t199128\n钓\t199129\n金静\t199130\n6.6米\t199131\n熟食\t199132\n右滑\t199133\n火遍\t199134\n股票群\t199135\n襄城\t199136\n宽严\t199137\nsir\t199138\n直飞往返\t199139\n中锦\t199140\n小吴\t199141\ncappella\t199142\n任务管理器\t199143\n筆\t199144\n希罗娜\t199145\n穷尽\t199146\n20系\t199147\n一丛\t199148\nadina\t199149\n谢_\t199150\n腐男\t199151\n陆谷孙\t199152\n必检\t199153\n武进区\t199154\njuyu\t199155\never17\t199156\n天翼云\t199157\n万柏林\t199158\nbet体育在线\t199159\n心理咨\t199160\n测角\t199161\nPerforming\t199162\n三百斤\t199163\n嘘声\t199164\nSachs\t199165\nCX200\t199166\nIframe\t199167\n圆管涵\t199168\n长心\t199169\n39.2\t199170\n邻位\t199171\n康乾\t199172\n静妃\t199173\n吐火\t199174\n附属品\t199175\n月亮岛\t199176\n2017a\t199177\n先父\t199178\n包刚\t199179\n三角地\t199180\n里杰卡尔德\t199181\n七个月\t199182\naeg\t199183\n美化版\t199184\nGod\t199185\nvimrc\t199186\n通卡\t199187\n夜萝莉\t199188\n求和函数\t199189\nunbound\t199190\n移植性\t199191\ngasket\t199192\nmedieval\t199193\nMatconvnet\t199194\n璇玑\t199195\n肯德基麦当劳\t199196\nJava注释模板\t199197\nSchnappi\t199198\nlj2200\t199199\n福田论坛\t199200\n杯弓蛇影\t199201\n六和彩\t199202\n下不下来\t199203\n夏士莲\t199204\n头条头条号\t199205\n夹弦\t199206\n粉鲍鱼\t199207\n那年青春我们正好\t199208\n2423\t199209\n寒雨\t1
99210\n第08期\t199211\n离合器分泵\t199212\n爱丁堡\t199213\n黄米面\t199214\n导电滑环\t199215\n凹点\t199216\n猫屋\t199217\n长银\t199218\n我的贴身校花\t199219\n龙庭\t199220\nHercules\t199221\n2016-2017年\t199222\n隶\t199223\n三处\t199224\n失去知觉\t199225\n尾缀\t199226\n5-8年\t199227\n神罚\t199228\n31天\t199229\n河北华夏\t199230\n詹启敏\t199231\n电卷\t199232\n中国府\t199233\n24#\t199234\n华为荣耀8吧\t199235\n小人国\t199236\n伯伦\t199237\n情愫\t199238\ncl\t199239\nCoreNLP\t199240\nIntellijIDEA\t199241\n美白\t199242\n太尼玛\t199243\n竞走\t199244\n老蒙迪欧\t199245\n缠膜\t199246\n售票员\t199247\n艾利桑德\t199248\nparagon\t199249\n0770\t199250\n监理师\t199251\n周礼\t199252\n林川\t199253\nSof\t199254\nguer\t199255\nAutoblog\t199256\n关诗敏\t199257\n耐萨里奥\t199258\n诺基亚6\t199259\n决断\t199260\n唇刷\t199261\nPaloma\t199262\n卡位\t199263\n玛丝菲尔\t199264\n946\t199265\nHilton\t199266\n民歌\t199267\nwfd\t199268\n科斯定理\t199269\n让给\t199270\n维卡\t199271\n弄得\t199272\n无数次\t199273\n戏外\t199274\n玉鼎\t199275\n电动剃须刀\t199276\n腥红\t199277\n百合花\t199278\nhelper\t199279\nMi\t199280\n闪光术\t199281\n带罩\t199282\n机能\t199283\n行值\t199284\n_众泰T500车友会_XCAR爱卡汽车俱乐部\t199285\n荒城\t199286\n两百块\t199287\n任先生\t199288\n延安革命纪念馆\t199289\n山西农大\t199290\n加购率\t199291\nITOOLS\t199292\n独孤城\t199293\n第一顿\t199294\n北京社区医院\t199295\n2018-01-19\t199296\n提资\t199297\n怪谈百物语\t199298\n北京友谊医院\t199299\nDeus\t199300\nq\t199301\n百学网\t199302\na\t199303\nPSP_翼风网\t199304\n课间十分钟\t199305\n画儿\t199306\ncharacteristic\t199307\n散热硅脂\t199308\n诃\t199309\n徐大荣\t199310\n乙丙\t199311\n捧腹\t199312\n何海霞\t199313\n榴芒\t199314\nremaining\t199315\n龙一\t199316\n中国文化部\t199317\n储存罐\t199318\n工程流体力学\t199319\nOMG战队\t199320\n徐州医学院附属医院\t199321\n内蒙古西部\t199322\nmooer\t199323\n雅典学院\t199324\n衣锦还乡\t199325\n20148\t199326\n榆阳区\t199327\n于睿\t199328\n时间继电器\t199329\nexile\t199330\n利得税\t199331\n零异频道\t199332\nsensor\t199333\n三国杀OL\t199334\n卧式拉力试验机\t199335\n特案\t199336\ncss浏览器\t199337\n惠特尼\t199338\n小棒\t199339\n二个多月\t199340\n宠物王国外传\t199341\n奇虎\t199342\n黑土\t199343\n90位\t199344\n醇酸树脂\t199345\n灰器\t199346\n几面\t199347\n中国科学技术大学》\t199348\n反转片\t199349\n黑光人才网\t199350\n杜邦公司\t199351\n100度\t199352\n封箱机\t199353\n
预购\t199354\nPrescription\t199355\n心包积液\t199356\n大蒙古国\t199357\n奥迪A4L\t199358\n母巢\t199359\n孝庄秘史\t199360\n佳航\t199361\n888动漫网\t199362\nCCTV-1\t199363\n猩球崛起\t199364\nFreescale\t199365\n列文\t199366\n贾琳\t199367\n乐呵呵\t199368\nxhprof\t199369\n幻速\t199370\n免除\t199371\n生产地\t199372\n休日\t199373\n热米皮\t199374\n风生\t199375\n北京单场\t199376\n千百种\t199377\n各科\t199378\n沃特福德\t199379\n400NK\t199380\n善良妈妈的朋友\t199381\nLogo\t199382\n千足金\t199383\n山水s酒店\t199384\n大蜗牛\t199385\n澁谷果歩\t199386\n明朝时期\t199387\njooq\t199388\n一天一夜\t199389\n夹层板\t199390\n江特电机\t199391\n☆\t199392\nsexart\t199393\n凡尔赛玫瑰\t199394\n微感\t199395\n昭和天皇\t199396\n魔刃\t199397\n浙江在线\t199398\n郑姓\t199399\n长大后\t199400\n蒸散\t199401\n11.1.3\t199402\n230平\t199403\n综合知识\t199404\n子函数\t199405\n境段\t199406\n创想兵团\t199407\n莫让\t199408\n烟温\t199409\n无敌仙侠世界\t199410\n4本\t199411\n真空收纳袋\t199412\n沉积岩\t199413\n目盲\t199414\n受难日\t199415\n广西教育厅\t199416\n160斤\t199417\n北京市第三中级人民法院\t199418\n若雪\t199419\n阿坝县\t199420\n2016年1月1日起\t199421\n美少女万华镜\t199422\n不夜天\t199423\n乖乖猪世界\t199424\n笔法\t199425\nXinhua\t199426\n遵纪守法\t199427\nphpadmin\t199428\n滨江区\t199429\n28次\t199430\n韩可彬\t199431\n奥托\t199432\n网易严选吧\t199433\n票口\t199434\n私募证券\t199435\n潜江油焖大虾\t199436\n大眼仔\t199437\neighty\t199438\n华德福幼儿园\t199439\n世卫组织\t199440\n中旬\t199441\n现金流量表编制\t199442\n光面\t199443\n市疾控中心\t199444\n资料柜\t199445\nvor\t199446\n4月上旬\t199447\n洛阳酒店\t199448\n潍坊高新区\t199449\n毕业论文网\t199450\nHttpOnly\t199451\n腔镜\t199452\n列国\t199453\n火莹\t199454\n上证综合指数\t199455\nsubmission\t199456\n2G显存\t199457\n无争\t199458\n绣眼鸟\t199459\n魅族pro6s\t199460\n南平市城乡规划局\t199461\nKingRoot\t199462\n64名\t199463\n河蚬\t199464\n鼎胜\t199465\n工学版\t199466\n更快捷\t199467\n国家卫生计生委\t199468\n一览_\t199469\n么多\t199470\n海带丝\t199471\n政府工\t199472\n石油醚\t199473\nforcode\t199474\n农区\t199475\n广西梧州市人民政府\t199476\n层流罩\t199477\n体视\t199478\n沪府\t199479\n花幽山月\t199480\n莫然\t199481\n精灵世纪\t199482\n最高检\t199483\n幻\t199484\n孕后期\t199485\nretrofit\t199486\n艮宫\t199487\nm177fw\t199488\n伤物语\t199489\nHQ\t199490\n彩漫\t199491\n回流比\t199492\n128bit\t199493\nfavicon\t199494\n丽婴房\t199495\n许昌市\t199496\n40
【\t199497\n2099年\t199498\namputee\t199499\n甘肃省发展和改革委员会\t199500\n沥青泵\t199501\n南园街道\t199502\n天江药业\t199503\n墨菲斯\t199504\nwww.56.com\t199505\n一万点\t199506\n多罗罗\t199507\nhkex\t199508\n海上花园\t199509\n许昌市人民政府\t199510\n锋刃\t199511\n反恐部队\t199512\n每两个月\t199513\n衡水市区\t199514\nworked\t199515\n陕师大\t199516\n熊大熊\t199517\n起航\t199518\n千石\t199519\nWin7局域网\t199520\n祖父的园子\t199521\n胡润榜\t199522\n天津广播电台\t199523\n14道\t199524\n催眠类\t199525\nVueClub\t199526\n南瓜苗\t199527\n职业半仙\t199528\n南街村\t199529\n_源\t199530\nレジデンス\t199531\n曾经沧海\t199532\n14英寸\t199533\n4101\t199534\n桑娜\t199535\n李洪基\t199536\n炮台山\t199537\n读库\t199538\n董铂然\t199539\n2018年2月8日\t199540\n海一方社区\t199541\n北大深圳研究生院\t199542\nBlessing\t199543\n碧波\t199544\n高创\t199545\nP2P理财\t199546\nrefuse\t199547\n多品\t199548\n首都儿研所\t199549\nAndyJee\t199550\n双溪镇\t199551\nTaizhou\t199552\n药锅\t199553\nPortugal\t199554\n裸石\t199555\npartially\t199556\n消费端\t199557\n原胶\t199558\n长礼\t199559\n我在故宫修文物\t199560\n值域\t199561\n原粮\t199562\n4670\t199563\nMulberry\t199564\n个人抵押贷款\t199565\n共同还款人\t199566\n曹世镐\t199567\n荣辉\t199568\nofffice\t199569\nliantong\t199570\nMKV/\t199571\n射雕英雄传之东成西就\t199572\n250ml\t199573\n败笔\t199574\n王畅\t199575\n威可多\t199576\nXEL\t199577\n大明山景区\t199578\n第1\t199579\n长城哈弗h6\t199580\n李景\t199581\n大隐静脉曲张\t199582\n闽派\t199583\n浸泡\t199584\n蓬莱市\t199585\n汨\t199586\n电化学法\t199587\n10万方\t199588\n自毁\t199589\n语文月刊\t199590\n边形\t199591\n南泽\t199592\n17uoo\t199593\n幸福蓝海国际影城\t199594\n有缝钢管\t199595\n图景\t199596\n规整化\t199597\n添加长\t199598\n一个小村庄的故事\t199599\n埋地式\t199600\n三国战记2007\t199601\n郁美净\t199602\n通刷\t199603\n塞拉利昂\t199604\n汤珈台湾\t199605\n猛药\t199606\n鼻血\t199607\n打拍\t199608\n厦门软件园\t199609\n异名\t199610\n魅族PRO6Plus\t199611\n5173吧_\t199612\n_连锁_招商_加盟店\t199613\n九龙生态园\t199614\n酒钢宏兴\t199615\n2004年\t199616\n思贤路\t199617\nSRS\t199618\n務處\t199619\npilz\t199620\n南邮\t199621\n麦丝\t199622\nalembic\t199623\n福佳\t199624\n锁鞋\t199625\n三堆\t199626\n6月7日\t199627\n走了\t199628\n套片\t199629\n汽化潜热\t199630\n交易商协会\t199631\n礼让\t199632\n养胃粥\t199633\n战术裤\t199634\n幽默感\t199635\ntortoisegit\t199636\n第104集\t199637\n音序\t1
99638\n2017届\t199639\n硒砂瓜\t199640\n女作家\t199641\n陶瓷业\t199642\nasn.1\t199643\n黑小虎\t199644\n纳帕\t199645\n10053\t199646\n中蜂\t199647\n月月\t199648\n罗夏\t199649\nword卡\t199650\n申根\t199651\n柱中\t199652\n母皇\t199653\n欧玛\t199654\n龙巫\t199655\nWeb\t199656\n建襄小学\t199657\n幕府将军2全面战争\t199658\n1月31日\t199659\n4308\t199660\n四山\t199661\n马赛公寓\t199662\nresults\t199663\nuint32\t199664\n伊斯兰\t199665\n小匡\t199666\n少年包青天1\t199667\n公关公司\t199668\n独生子女费\t199669\n2016款\t199670\n微动作\t199671\n肩胛骨\t199672\n北京工作居住证\t199673\n对角\t199674\n纪实片\t199675\n土著人\t199676\nP4G\t199677\n做强\t199678\n娱乐名人榜\t199679\n北京玛丽妇婴医院\t199680\nkmsauto\t199681\n3344\t199682\n品种法\t199683\n肉刺\t199684\nsis\t199685\nadelaide\t199686\naudition\t199687\n长颈鹿简笔画\t199688\n轻解\t199689\nTopX\t199690\n林场\t199691\n提味\t199692\npivot_table\t199693\nCapacitors\t199694\n行政管理制度\t199695\n余军\t199696\n上海猫友会\t199697\n科研类\t199698\n基本经济制度\t199699\n多卷集\t199700\n尚街\t199701\n葡萄胎\t199702\n32集\t199703\n国心\t199704\n侠盗猎车手罪恶都市\t199705\n神农本草经\t199706\n斌斌\t199707\n漳州\t199708\nlowi\t199709\n可折叠\t199710\nBeauté\t199711\n美国耶鲁大学\t199712\nuniverse\t199713\nunlimited\t199714\n剿灭\t199715\n可观\t199716\n飞流\t199717\nprofessional\t199718\n甜味\t199719\nDolphin模拟器\t199720\n开局\t199721\n单晶炉\t199722\n市场调查问卷\t199723\n最新\t199724\nNova3e\t199725\n腾笼换鸟\t199726\nmach\t199727\n伊莎贝尔\t199728\n辽远\t199729\n下街\t199730\n存亡\t199731\n鲁大妈\t199732\n预言贴\t199733\n机械革命x1\t199734\nAd\t199735\nx分\t199736\n大坪医院\t199737\n奸淫\t199738\n花界\t199739\n鸿联\t199740\n1mm\t199741\n交配\t199742\n微囊藻毒素\t199743\nFEI\t199744\n多盈理财\t199745\n蔬菜\t199746\n保护主义\t199747\nhttps服务器\t199748\nnici\t199749\nMalt\t199750\n江桥万达广场\t199751\n饥人谷\t199752\n糖稀\t199753\nITIL\t199754\n扪心问诊\t199755\n红界\t199756\nLIGO\t199757\nsaving\t199758\n曝光量\t199759\n汽油弹\t199760\ne级\t199761\n国网山东省电力公司\t199762\n2017年8月1日起\t199763\n遗失物\t199764\n宁为玉碎\t199765\n遮羞布\t199766\n广东移动动感地带卡\t199767\n授\t199768\npoliform\t199769\n调压盒\t199770\n002517\t199771\ngii\t199772\n动车站\t199773\n家奴\t199774\n中医\t199775\n0.0001\t199776\n推广\t199777\n金寨县人民政府\t199778\n皮皮盘\t199779\n六人行吧\t199
780\n平衡杆胶套\t199781\noffice365吧_\t199782\n铁核桃\t199783\n点支式\t199784\ninhibits\t199785\n消防安全责任制实施办法\t199786\n有为\t199787\nAXIS\t199788\n爱的着陆\t199789\n阿里嘎多\t199790\n行车电脑显示屏\t199791\n金黄\t199792\nhao123.com\t199793\nbash_profile\t199794\n字符串单\t199795\n子视图\t199796\n扬大\t199797\nmysql数据库表\t199798\n安新区\t199799\n一等品\t199800\n纹波\t199801\n汤逊湖\t199802\n318i\t199803\n低人一等\t199804\n东澳岛\t199805\n蔡敏\t199806\nsinks\t199807\n周芷若\t199808\n包村\t199809\n招揽\t199810\n寄卖\t199811\n新氧\t199812\n老e\t199813\nglove\t199814\n情有独钟\t199815\n种粮\t199816\nwangwenan6\t199817\n圣哲\t199818\nYY网\t199819\n新安房产网\t199820\n远大住工\t199821\nemployers\t199822\nPartner\t199823\n发散性思维\t199824\n无底\t199825\n优能中学\t199826\n300672\t199827\n松视\t199828\nhitalk\t199829\n八府堂\t199830\n约翰逊\t199831\nPayload\t199832\n幸事\t199833\n宝马320\t199834\n硬格机\t199835\n纳兰珠儿\t199836\n太守\t199837\n喜善\t199838\n豫南\t199839\n厦门电视台\t199840\n推卸\t199841\n王小帅\t199842\nvmci\t199843\n777米奇\t199844\n哈尔斯塔特\t199845\n9月2日\t199846\n急刹车\t199847\ncanny\t199848\n风雅园\t199849\nmapred\t199850\ntaglib\t199851\n党建会\t199852\n麻木感\t199853\n合造价\t199854\nexistent\t199855\n宁德\t199856\n卫生服务中心\t199857\n铺布机\t199858\nVariables\t199859\n10w40\t199860\n家政人员\t199861\n1亿吨\t199862\n情深不知他爱你\t199863\n东晋\t199864\n接机\t199865\n硬核亨利\t199866\n第93号\t199867\n山阳区\t199868\n金城镇\t199869\n艾叶草\t199870\n北京开发区\t199871\n草害\t199872\n副主编\t199873\nAdvertCN\t199874\n客船\t199875\n呼市\t199876\n大山子\t199877\neryuan\t199878\n眼花缭乱\t199879\n当是\t199880\n绒皮\t199881\n近忧\t199882\nGANK\t199883\n93级\t199884\nctm\t199885\nsanitary\t199886\n5.56毫米\t199887\n我真的好想你\t199888\n生有\t199889\n阜阳火车站\t199890\n广证\t199891\nDL580\t199892\n伊比利亚\t199893\n银丝\t199894\nKIMI\t199895\n刀锋\t199896\n米兰达\t199897\n北京电台\t199898\n詳解\t199899\n城发\t199900\n迅播动漫影院\t199901\n别爱\t199902\n黎明之前\t199903\n大花岭\t199904\nbitter\t199905\n雅马哈音乐中心\t199906\n攀比\t199907\n介词\t199908\n热冲压\t199909\n汉程网\t199910\n通道\t199911\n如愿\t199912\n1V1\t199913\n冰膜\t199914\n智能遥控器\t199915\n广州市少年宫\t199916\n师一课\t199917\n一顿一顿\t199918\n亦来云\t199919\nreaders\t199920\n简日\t199921\n中国房地产信息网\t199922\
n6斤\t199923\n2015年4月1日\t199924\n主站\t199925\n环绕型\t199926\n透纳\t199927\nike\t199928\n成都国\t199929\n帝天\t199930\nSIEMENS\t199931\n央视新闻频道\t199932\n砍头\t199933\n8032\t199934\n稀碎\t199935\n狱内\t199936\n互联网电视\t199937\n刺客信条4黑旗\t199938\n鸡足山\t199939\n医学系\t199940\neeg\t199941\n讲话稿\t199942\nlrfgjj2\t199943\n酷感\t199944\n冠醚\t199945\nSealee\t199946\n兵俑\t199947\n投贷\t199948\n_哥斯拉\t199949\n管建刚\t199950\n拉锯\t199951\n李影\t199952\n导联心电图\t199953\nBulbapedia\t199954\n电动摩托车\t199955\n+086\t199956\n拉力机\t199957\nSurvivors\t199958\nthinkgem\t199959\n夜狩\t199960\n设备有限公司\t199961\n牛栏山镇\t199962\nYOYOW\t199963\n古希腊\t199964\n北京香山公园\t199965\n樱子\t199966\n天盈广场\t199967\n古荡\t199968\nassic\t199969\n民族资本主义\t199970\n空荡\t199971\ndidn\t199972\n凤凰血\t199973\nbelt\t199974\n紧迫性\t199975\n氨基甲酸铵\t199976\n4000m\t199977\n易截屏录屏\t199978\n龙泉阳光城\t199979\n图形处理器\t199980\n内蒙古自治区教育厅\t199981\nsample\t199982\n史诗之路\t199983\n五仙\t199984\n第九天\t199985\n快四\t199986\n荣耀S7\t199987\n威航\t199988\n大陆性\t199989\n虚假诉讼\t199990\ngla\t199991\n富川瑶族自治县\t199992\n出江湖\t199993\n冰风谷\t199994\n侍妾\t199995\n梦幻群侠传吧\t199996\n大连民族大学\t199997\nMVC4.0\t199998\n中华人民共和国行政复议法\t199999\n王大伟\t200000\n四好农村路\t200001\n衡东县\t200002\n玛丽皇后\t200003\n猜人游戏\t200004\n栀子\t200005\nF117\t200006\n政策学\t200007\n重耕\t200008\n网格\t200009\n鼻毛修剪器\t200010\n珐琅彩\t200011\n电镀\t200012\n张晓丽\t200013\n京东小白卡\t200014\n扛着\t200015\n建国西路\t200016\n上海曙光医院东院\t200017\n胸膜\t200018\n大力魔\t200019\n舒扬\t200020\nFares\t200021\n60帧\t200022\n夜未央\t200023\n备赛\t200024\nFaceRig\t200025\n训练员\t200026\n酷狗k歌\t200027\n硅烷\t200028\n天启四骑士\t200029\n建工社\t200030\n杨佑宁\t200031\nCommitted\t200032\n头文字D\t200033\n兴文石海\t200034\n6代\t200035\n财政学\t200036\n张正友\t200037\n嵌段\t200038\n青霉素皮试\t200039\nCAFE\t200040\nSUBSTRING\t200041\n山东海\t200042\n辩驳\t200043\n安娜贝拉\t200044\n滞胀\t200045\n公众微信\t200046\n王婉悠\t200047\nEppendorf\t200048\n包小姐\t200049\n黑龙江省发展和改革委员会\t200050\n紫胤\t200051\n一起\t200052\n欧莎\t200053\n稳步增长\t200054\n也称\t200055\nextend\t200056\nMPLS\t200057\n徐萍\t200058\n复分解反应\t200059\n朗豪坊\t200060\n挥师\t200061\nBaacloud\t200062\n牛市口\t200063\n亲宝\t200064\nhotfix\t2000
65\n举止\t200066\n智光电气\t200067\n避难所\t200068\n超高压\t200069\n甲型流感病毒\t200070\n威胁\t200071\n欢欣鼓舞\t200072\n玻璃纸\t200073\nRePlugin\t200074\n76人队\t200075\n外文版\t200076\n外热式\t200077\nSOB\t200078\n比例表\t200079\n上海文艺出版社\t200080\n凌晨三点\t200081\n平海镇\t200082\n羞刑\t200083\nBang\t200084\n声带\t200085\n体画\t200086\n中看不中用\t200087\n赛马场\t200088\n乡建\t200089\n孙老师\t200090\n发梦\t200091\n康田\t200092\n中石\t200093\nExtraordinary\t200094\n开封市环境保护局\t200095\n逢入京使\t200096\ncss+js\t200097\nhnt\t200098\n碧霞\t200099\n船夫\t200100\nWendy\t200101\n松本麻里奈\t200102\n加里森敢死队\t200103\n三星s7\t200104\n汇众教育\t200105\n8日游\t200106\n广州日立\t200107\n顺时科技\t200108\n小雅音箱\t200109\nshrinkage\t200110\nLinuxTOY\t200111\n室内装修\t200112\n7101\t200113\nRICH\t200114\n矮男\t200115\n柯灵\t200116\n黄埔一期\t200117\n退伍军人\t200118\n体谅\t200119\n五十二\t200120\ndatagrip\t200121\n查开\t200122\n宜昌\t200123\n冰泉\t200124\n千株\t200125\n小产权房\t200126\nq9500\t200127\n全麦饼干\t200128\n安亭\t200129\n飞动\t200130\n总经销\t200131\n闪念胶囊\t200132\nsyms\t200133\n2.5.3\t200134\n祥仔\t200135\nPeau\t200136\n帕萨特论\t200137\nDUP\t200138\nmitsubishi\t200139\n142号\t200140\n柏林镇\t200141\nActor\t200142\n戒备\t200143\n留学党\t200144\na17\t200145\n押金\t200146\n类间\t200147\n00000057\t200148\n525i\t200149\n侯凯\t200150\n鸳鸯戏水\t200151\n招调\t200152\n金蝶云\t200153\nRC522\t200154\nTutor\t200155\nPACKET\t200156\n草垫\t200157\n中共黄冈市纪律检查委员会\t200158\nbenke\t200159\nNL\t200160\n定义式\t200161\n汉菜\t200162\n华瑞紫韵城\t200163\n娄葑\t200164\n长生印\t200165\n罗芳\t200166\nelasticsearch5.x\t200167\n还早\t200168\n奇面族\t200169\nCarlton\t200170\n10口\t200171\n刀友\t200172\n政府采购中心\t200173\n麦咖啡\t200174\ndcc\t200175\nZimbra\t200176\n雄迈\t200177\n兰缪\t200178\n创赛\t200179\nGW250F\t200180\nstdole32.tlb\t200181\n关键件\t200182\n服务群\t200183\npacs\t200184\n钟鸣鼎食\t200185\n清华美术学院\t200186\n汤米\t200187\nrowkey\t200188\nspecificity\t200189\nfmea\t200190\n好人品\t200191\nJAVA软件工程师\t200192\n马洪刚\t200193\nSqlDataReader\t200194\n共青团\t200195\nmuch\t200196\n悠\t200197\n豁达\t200198\ninkscape\t200199\n开黑节\t200200\n宜春职业技术学院\t200201\n迅雷种子_迅雷电影下载_迅雷哥云_迅雷哥\t200202\n耕田\t200203\n灵魂摆渡2\t200204\n超清_哔哩哔哩\t20020
5\n钩沉\t200206\n蓝猫淘气3000问\t200207\n喜旺\t200208\n保健医\t200209\n策划书\t200210\n177\t200211\n反恐\t200212\n2017年11月25日\t200213\n8月10日\t200214\n山东农大\t200215\n土地理\t200216\n碧桂园公司\t200217\n怪客\t200218\n0.16.0\t200219\n沙包\t200220\n密集\t200221\n机器视觉技术\t200222\n骗保\t200223\n三薪\t200224\nThieves\t200225\n恋爱吧\t200226\n昕\t200227\n参考网\t200228\n冯飞\t200229\n大角虫\t200230\n海淘贝\t200231\n2000小时\t200232\n最准确\t200233\n芭\t200234\nHellman\t200235\n唐县\t200236\n凹处\t200237\n神火之盗\t200238\n5.72\t200239\n朱星杰\t200240\n操操日\t200241\n杆菌\t200242\n腾讯云ubuntu\t200243\n盐泰锡常宜铁路\t200244\npoli\t200245\nyag\t200246\nneitui\t200247\n葡萄\t200248\n宜春市纪委监察局\t200249\n稲川\t200250\n奔奔MINI\t200251\n水底\t200252\n无忧论文网\t200253\n威盛\t200254\n刘冰\t200255\n服务单\t200256\n德克萨斯\t200257\n抖音连音社\t200258\n粤芯\t200259\n息息\t200260\n过来玩\t200261\n百树\t200262\n商标分类\t200263\nxlabel\t200264\n幼师\t200265\n实战性\t200266\nCummings\t200267\nieee\t200268\n58000\t200269\n光华楼\t200270\n夕风毒毒\t200271\n控制线\t200272\n让开\t200273\n归约\t200274\n宝马摩托论坛\t200275\n这货\t200276\n话剧\t200277\n赤塔\t200278\n捏面人\t200279\n必需要\t200280\n极点舞曲网\t200281\n版值\t200282\n神翼\t200283\n胃烧\t200284\n红湖\t200285\n重庆华侨城\t200286\n山南市\t200287\n衡水中学\t200288\npuc\t200289\n左晴雯\t200290\n清洗段\t200291\n物项\t200292\n周泰\t200293\n亲近\t200294\n国珍\t200295\n商业保险\t200296\nSyria\t200297\nc100\t200298\n夜路\t200299\n主动\t200300\n回案\t200301\n国知局\t200302\n缘来\t200303\n199年\t200304\n透水\t200305\n傅彪\t200306\n_慕课手记\t200307\n性福联盟\t200308\n云犀\t200309\nh片\t200310\n叶云燕\t200311\n侠义值\t200312\n聚乐\t200313\nANAL\t200314\n晋江市政府网\t200315\n朝觐\t200316\n雷蒙斯尼奇\t200317\nbioedit\t200318\nWeb服务器\t200319\n畅玩包\t200320\n16年10月\t200321\n林长民\t200322\nyuanyin\t200323\n恐怖漫画\t200324\n五架\t200325\n见字如面2\t200326\n眼巴巴\t200327\n液压启闭机\t200328\n50ML\t200329\n刻绘机\t200330\nLED大屏网\t200331\n护目\t200332\nglg\t200333\n水禽\t200334\n海胆\t200335\nPL\t200336\n吴王\t200337\nXLS格式\t200338\n湖南涉外经济学院\t200339\n近代\t200340\nzhen\t200341\ncft\t200342\n春鹃\t200343\n韩兴娥\t200344\n石板镇\t200345\nexk\t200346\n长葛\t200347\n府谷\t200348\n超级工程\t200349\nipad吧_\t200350\n雷沃重工\t200351\n广海镇\t200352\n雀仙桥\t200
353\n嘉禾县\t200354\n夜行\t200355\n八百\t200356\nMae\t200357\n等效变换\t200358\n172.16.0.0\t200359\n低收入者\t200360\n清开灵\t200361\n空空导弹\t200362\n中建六局\t200363\n晕色\t200364\nbodog\t200365\n263.net.cn\t200366\n五分彩\t200367\n211吧_\t200368\nCroft\t200369\ndocin\t200370\ntokyo\t200371\n巨兔\t200372\n西条丽\t200373\n自守\t200374\n良渚古城\t200375\nSidney\t200376\n蓝边\t200377\n蛇虫\t200378\n蒋钦\t200379\nboeing\t200380\n老八\t200381\nMassacre\t200382\n蔡元培\t200383\nc++dll\t200384\nMelody\t200385\n元素值\t200386\n永正\t200387\n章丘四中\t200388\n20000个\t200389\n六欲\t200390\n膜盒\t200391\nError\t200392\n雪菜\t200393\n2G网\t200394\n天刑\t200395\nitouch\t200396\nboson\t200397\n烁体\t200398\n中移电子商务有限公司\t200399\nrefrain\t200400\n右\t200401\n胆敢\t200402\n可达龙\t200403\nblackout\t200404\nconcat\t200405\n淘米网\t200406\nChaser\t200407\n埃及航空\t200408\n席慕罗纳尔多\t200409\n走进去\t200410\n毒岛冴子\t200411\n清穿\t200412\n连州市\t200413\n汉渝路\t200414\n万古仙穹乐文_万古仙穹笔趣阁_万古仙穹顶点_万古仙穹\t200415\n心保\t200416\na4\t200417\n又双叒叕\t200418\nequations\t200419\n劳力\t200420\n恶魔少爷别吻我\t200421\n花财\t200422\n折花\t200423\n4千瓦\t200424\n吊顶式\t200425\n宏海\t200426\n新碟\t200427\n偷鸡\t200428\n难事\t200429\n﹤\t200430\nhttp2\t200431\n韭菜花\t200432\n0.1度\t200433\n知识产权联盟\t200434\n英德市人民政府\t200435\n黑果\t200436\n3文件\t200437\n值日表\t200438\n绝爱\t200439\n4c\t200440\n美图m6\t200441\n1641\t200442\n安徽中澳科技职业学院\t200443\n宏愿\t200444\n双眼皮吧_\t200445\n马喆\t200446\n厦门大学附属实验中学\t200447\n有本事\t200448\n转破\t200449\n领世郡\t200450\n阜阳乐居网\t200451\n4399游戏吧\t200452\n真三国无双\t200453\n紫皮石斛\t200454\n82万\t200455\n张君龙\t200456\n霹雳天命之战祸邪神\t200457\n班恩\t200458\n海德\t200459\n幻萌\t200460\n発情\t200461\n掉毛\t200462\n0814\t200463\n西安市碑林区\t200464\ngraph\t200465\ncamhi\t200466\nmercari\t200467\n山东太阳纸业股份有限公司\t200468\nSpecialized\t200469\n肖山\t200470\nact\t200471\n眼位\t200472\n高新区\t200473\nCLD\t200474\n八公山\t200475\n团结路\t200476\nphim\t200477\nv8.1\t200478\n廷杖\t200479\n旋风少女\t200480\n隐婚甜宠\t200481\n绿城服务\t200482\n宝马x1\t200483\nfollowed\t200484\n企业登记网\t200485\n云栖大会\t200486\n废机油\t200487\n红糖姜水\t200488\nshiji\t200489\n牧丹\t200490\n74138\t200491\n联欢晚会\t200492\nThu\t200493\n最低工资标\t200
494\n去不去\t200495\n爱维\t200496\n工薪阶层\t200497\n靓装\t200498\nconversions\t200499\n罗凯\t200500\n声优\t200501\n陕西测绘地理信息局\t200502\n肖和东\t200503\n能科股份\t200504\n气弹\t200505\n珠海金湾机场\t200506\n苏鎏\t200507\n杭锅股份\t200508\n第1年\t200509\nChanel\t200510\n悦跑圈\t200511\nSymp\t200512\n何猷君\t200513\n南嘉舞步操\t200514\n梦之路\t200515\n360os\t200516\n优斗士\t200517\n火针\t200518\n水土流失\t200519\n通用硅酸盐水泥\t200520\nereg\t200521\n牛油火锅\t200522\n天梭1853\t200523\n币机\t200524\n卒中\t200525\n八一桥\t200526\npageadmin\t200527\n花人\t200528\nEcstasy\t200529\n十四号\t200530\n今夜舞起来\t200531\nae2018\t200532\n8&\t200533\n喷播\t200534\n标识贴\t200535\n品位\t200536\n杨梅树\t200537\ntcp协议\t200538\n跳跳犬\t200539\n眼肿\t200540\nzcl\t200541\n不拘一格\t200542\n练习者\t200543\n伯纳天纯\t200544\n中国大地财产保险股份有限公司\t200545\n经典爱情诗句大全\t200546\nWeenie\t200547\n起易论坛\t200548\n威客\t200549\n松本楼\t200550\n慢性气管炎\t200551\n云计算架构\t200552\n镭\t200553\n北京国际学校网\t200554\n一会\t200555\n_轩\t200556\n劫尽\t200557\n32\t200558\n血清学\t200559\nE旅行网\t200560\n首航节能\t200561\n制片主任\t200562\n圣诞灯\t200563\n人教版八年级\t200564\n少年维特之烦恼\t200565\n南唐\t200566\n天天象棋楚汉争霸\t200567\n邱诗晗\t200568\n浮动窗口\t200569\n596\t200570\n世博源\t200571\n立行立改\t200572\n快乐崇拜\t200573\nmyql\t200574\n002320\t200575\n根浴\t200576\n二十八号\t200577\ndestroyed\t200578\n456号\t200579\n江山新闻网\t200580\n阐释\t200581\n林纾\t200582\n日程本\t200583\npes\t200584\n吴家堡\t200585\n零速\t200586\n王羽佳\t200587\n几件\t200588\n良将\t200589\n宏江\t200590\nVasp\t200591\n这样的女人\t200592\n汉学家\t200593\n4-10\t200594\n六书\t200595\n07073航海王OL\t200596\n打boss\t200597\nDLC+\t200598\n艺品\t200599\n绝缘子\t200600\n雁荡镇\t200601\n平高\t200602\n代军\t200603\n波音空客\t200604\n有我师\t200605\n犯戒\t200606\n一锅端\t200607\n游赏\t200608\ndiplomatic\t200609\n太庙\t200610\n买开\t200611\n帅警\t200612\nLupin\t200613\n奥斯汀\t200614\n哈代\t200615\n直路\t200616\n美高\t200617\nbbe\t200618\n选房\t200619\nWebpack2\t200620\n盖有\t200621\nTFP\t200622\n独仙\t200623\n职改\t200624\n各厂\t200625\n头昏眼花\t200626\n被减数\t200627\n金名\t200628\nmoya\t200629\n果啤\t200630\n起去\t200631\n贵州省代表团\t200632\n民忧\t200633\n300mm\t200634\njqgrid\t200635\n涛雒镇\t200636\n瑰石\t200637\nlmgrd\t200638\n64x64\t200639\n7
50g\t200640\n好好过日子\t200641\n高级计量经济学\t200642\n肢位\t200643\n藏刀\t200644\n新闻产经_北京商报网\t200645\n6000K\t200646\n华晨中华\t200647\ngamersgate\t200648\n泰课\t200649\nPyPI\t200650\n刘璋\t200651\n余派\t200652\n第五人\t200653\n傻小子\t200654\n金茂集团\t200655\n中共湖南省委党校\t200656\n算术平方根\t200657\nClickOnce\t200658\n过渡户\t200659\n利器\t200660\n三十\t200661\n喷泉\t200662\n国际消费者权益日\t200663\n步步高学习机\t200664\n张记\t200665\n六十个\t200666\n中原\t200667\nhs编码查询\t200668\n藏色阁\t200669\n武汉十七中\t200670\n线刷宝ROM中心\t200671\n田国立\t200672\n新华人寿保险股份有限公司\t200673\n中鸣机器人\t200674\nstationery\t200675\n_汽配人\t200676\n八云紫\t200677\n赝势\t200678\n晓出净慈寺送林子方\t200679\n1v4\t200680\n甜菜根\t200681\n浮梁\t200682\n马丘\t200683\n甩臀舞\t200684\n顺风考试网\t200685\n20170315\t200686\nzhou\t200687\n中国人民解放军军事医学科学院\t200688\n灵兽\t200689\n海岸\t200690\nDenuvo\t200691\n中国农药信息网\t200692\n逾期\t200693\nOA\t200694\n境界\t200695\ni10\t200696\n中控科技\t200697\n徐世昭\t200698\n拉马\t200699\n中国人民银行清算总中心\t200700\n红章\t200701\n云玺\t200702\n知其然\t200703\n32|64位\t200704\n修罗铠甲\t200705\n碧水豪园\t200706\n单库\t200707\n国标委\t200708\n山东寿光\t200709\n青草\t200710\nFANUC\t200711\n丰绅殷德\t200712\n古登堡\t200713\n月满\t200714\n假死\t200715\n王烁\t200716\n三德歌\t200717\n男款\t200718\n客站\t200719\n怪物弹珠\t200720\n0472\t200721\n项城\t200722\n主办人\t200723\n中栋\t200724\nad14\t200725\nclang\t200726\n四喜丸子\t200727\nKOREA\t200728\n张嘉张爱玲\t200729\n捞针\t200730\n绝孙\t200731\n中西药\t200732\n剪枝\t200733\n梵歌\t200734\n雨爱\t200735\n旗黄膏\t200736\nPolymer\t200737\nlexburner\t200738\n摘镜\t200739\n现代咨询方法与实务\t200740\n布垫\t200741\n忍冬艳蔷薇\t200742\n联合国人权理事会\t200743\n浙江石化\t200744\n相互\t200745\nsheng\t200746\nNand\t200747\nstardew\t200748\nClamp\t200749\n高志远\t200750\n非鱼\t200751\n泱泱大国\t200752\n2只\t200753\nж\t200754\n单人舞\t200755\n二第一\t200756\nmajesty\t200757\nbruker\t200758\n平满\t200759\nsluts\t200760\n自杀率\t200761\n指事\t200762\n光仪\t200763\n五小\t200764\n金燕子\t200765\nigv\t200766\n35米\t200767\n西安北\t200768\n老凯越\t200769\n羔羊\t200770\n千色\t200771\n屈辱\t200772\nG27\t200773\n52张\t200774\n绿\t200775\n20160414\t200776\n上海财经大学商学院\t200777\n30平米\t200778\n18万起\t200779\n2018-04-11\t200780\n扒皮吧\t200781\n埃蒙\t200
782\n神风怪盗贞德\t200783\n培乐多\t200784\n卓力\t200785\n安全网\t200786\n香悦\t200787\n官们\t200788\nupward\t200789\n国君\t200790\nfasterrcnn\t200791\nmpeg\t200792\n保平\t200793\n中山医院\t200794\n汇总_高考网\t200795\n苏州园区\t200796\n安沙镇\t200797\n提供者\t200798\n55000\t200799\n烦烦烦\t200800\ndevDependencies\t200801\n牙线\t200802\ndiane\t200803\n谷方益元\t200804\n600188\t200805\n一点一\t200806\n武汉市区\t200807\n清粉\t200808\nBvlgari\t200809\nchronicles\t200810\nIo\t200811\n热伤风\t200812\n宫妃清丹\t200813\n42种\t200814\n使命召唤8现代战争3\t200815\n联想z500\t200816\n偷天换日\t200817\n喷漆机\t200818\nYouMP3\t200819\n透鲜\t200820\n传输者\t200821\n轨客网\t200822\n镶嵌式\t200823\n金山打字通2016\t200824\nDM5动漫屋\t200825\nSFE\t200826\nclassroom\t200827\n陈浩明\t200828\n金钱卦\t200829\n仙鹤\t200830\n卤素\t200831\n核控\t200832\n7360\t200833\n71期\t200834\n市销率\t200835\n蔡慎坤\t200836\n吹灰器\t200837\nJJC\t200838\n阴唇\t200839\n汝城\t200840\n长生果\t200841\n普贤菩萨行愿品\t200842\n友谊地久天长\t200843\n瑞沃\t200844\n60幅\t200845\n中南百草原\t200846\nExcel365\t200847\npetroleum\t200848\ncy163\t200849\n第8名\t200850\n黄智贤\t200851\n106斤\t200852\n0691\t200853\n石角镇\t200854\n变装秀\t200855\nSap2000\t200856\n5分米\t200857\n心手\t200858\n半孔\t200859\n简笔图\t200860\n1C#\t200861\n东坡\t200862\nheidisql\t200863\n百度银行卡识别企业版\t200864\n希克斯\t200865\n艺师\t200866\n29.9元\t200867\n春砂仁\t200868\n含水\t200869\n石川佳纯\t200870\n鸭血\t200871\nREGISTRATION\t200872\n福特猛禽F150\t200873\n爬坑\t200874\n做作业\t200875\nNumbers\t200876\n宋霭龄\t200877\nmoma\t200878\n浙江人才网\t200879\ntouch6\t200880\ncedu\t200881\n南风古灶\t200882\n7.1%\t200883\n泥鳅汤\t200884\n炒房团\t200885\n致跳\t200886\n电话簿\t200887\n伊莱特\t200888\n老村长酒\t200889\nExpecting\t200890\n韩庄\t200891\n华阳路\t200892\nphys\t200893\n夜樱\t200894\n茅盾文学奖\t200895\nyidong\t200896\nNormal\t200897\n冰城\t200898\nhysteresis\t200899\n在森林里\t200900\n有孕\t200901\n电泳漆\t200902\nAddressing\t200903\nNote3全网通\t200904\nCheung\t200905\n立位\t200906\n东城花园\t200907\n新加坡交易所\t200908\n第9条\t200909\n超任\t200910\n300ML\t200911\n康华\t200912\n352\t200913\n关岭县\t200914\ntechno\t200915\nBL51\t200916\n英雄坛\t200917\n扫雷群\t200918\n哈尔滨地铁1号线\t200919\n公匙\t200920\n福克林肯mkx\t200921\n备品\t20092
2\n嘉绍大桥\t200923\n皮带轮\t200924\n金基德\t200925\n福建商会\t200926\n上海华信国际集团有限公司\t200927\n提货\t200928\n江夏\t200929\nmanage\t200930\n吴志刚\t200931\n视听展\t200932\n34层\t200933\n池石镇\t200934\n道依茨\t200935\nz17s\t200936\n小行星\t200937\npc板\t200938\n巴贝拉\t200939\nEVS\t200940\npmma\t200941\n虫\t200942\nrainbow70626\t200943\n存款单\t200944\n尤雨溪\t200945\nksp\t200946\n金边机场\t200947\n娜儿\t200948\n能人\t200949\n板厂\t200950\n1645\t200951\n琴\t200952\n罗莱家纺\t200953\n大陆性气候\t200954\n這樣\t200955\nSHIMANO\t200956\n南华期货\t200957\n比比皆是\t200958\n繁昌县\t200959\n伊莉莎白\t200960\n西游记之孙悟空三打白骨精\t200961\n上海海事大学图书馆\t200962\n哽咽感\t200963\n特许经营\t200964\n碧血丹心\t200965\n马力\t200966\nNatura\t200967\n毛绒\t200968\n低空\t200969\n一颗\t200970\n青云店镇\t200971\nbase64\t200972\nwinhex\t200973\nTenant\t200974\n第三场\t200975\n深兰科技\t200976\n宜\t200977\n偷星\t200978\nCWE\t200979\nTpimage\t200980\n华鑫股份\t200981\n烟感器\t200982\n吕佳容\t200983\n药物\t200984\nheckman\t200985\nzip.001\t200986\n牛蒡\t200987\n税函\t200988\ncontigo\t200989\n球员们\t200990\n辽宁省食品药品监督管理局\t200991\n北京政协\t200992\nrmse\t200993\nIMBA\t200994\n美女与野兽\t200995\n全真派\t200996\n手游馆\t200997\n索然无味\t200998\n邵平\t200999\n举例\t201000\n差法\t201001\n林浅\t201002\n汤坑\t201003\n魔兽侏罗纪公园\t201004\nCaptive\t201005\n美丽的秘密\t201006\n败退\t201007\n靠背\t201008\n丧葬补助金\t201009\n300分\t201010\n辽宁省公安厅\t201011\n大露\t201012\n50年前\t201013\n压屏机\t201014\n禾花鱼\t201015\n005\t201016\n上海飞机制造有限公司\t201017\n黄鹤楼公园\t201018\nqpic\t201019\n许家贝多芬\t201020\n华蓥山\t201021\n稻田朋美\t201022\n检验单\t201023\n附件炎\t201024\nexcel2013\t201025\ntheories\t201026\n撸猫\t201027\nAV天堂AV\t201028\n发友网\t201029\n英格尔斯\t201030\n斗罗大陆之绝世唐门\t201031\nshmget\t201032\n当流科技\t201033\n汽车网\t201034\n埃弗顿\t201035\n泰禾金尊府\t201036\n收件方\t201037\n一干\t201038\n奇游\t201039\n盖世仙尊\t201040\n奇耻大辱\t201041\n拍击\t201042\n楷书四大家\t201043\n聚龙股份\t201044\nWord-Excel\t201045\n两全\t201046\n反反复复\t201047\n状元素\t201048\n货物\t201049\n华西村\t201050\nF108\t201051\n拜山\t201052\n611.com\t201053\n缘尽\t201054\nliuxue\t201055\n青年时报\t201056\n艾米龙\t201057\nFlicker\t201058\n13500元\t201059\n性欲望\t201060\n螺洲\t201061\n昆明市人力资源和社会保障局\t201062\n老西关\t201063\n酷酷跑\t
201064\nHeron\t201065\n菲丽丝\t201066\n第23号\t201067\n正定矩阵\t201068\n江苏省政府办公厅\t201069\nStarter\t201070\n奇瑞汽车\t201071\ngs60\t201072\nZPL\t201073\n1年以上\t201074\n144赫兹\t201075\nav无码片\t201076\n写真\t201077\n荷花苗\t201078\n不容易\t201079\n泉州地区\t201080\nPCB板\t201081\n主舞\t201082\n肉漫\t201083\n三穗\t201084\n起点读书\t201085\n0.01g\t201086\n莱信\t201087\nROOT精灵\t201088\n五样\t201089\n卫技\t201090\nYOHO潮流志\t201091\n自血疗法\t201092\n承修\t201093\nbuling\t201094\nxflow\t201095\n载人航天工程\t201096\n那个男人\t201097\nF5\t201098\n死亡天使\t201099\n遗传算法\t201100\n200瓦\t201101\n宏川智慧\t201102\nEggs\t201103\n值线\t201104\n中华民国十八年\t201105\n厦门房地产联合网\t201106\n高勇\t201107\n分外\t201108\n大西北\t201109\n行政事务\t201110\nTouhou\t201111\n曾一鸣\t201112\n瓶体\t201113\n定山\t201114\n拉菲庄园\t201115\n米面\t201116\n油锅\t201117\n廖晖\t201118\nBeginInvoke\t201119\n彭凯平\t201120\n双环醇\t201121\n期权费\t201122\n趵突泉公园\t201123\n72000\t201124\n宝华韦健\t201125\nTyloo\t201126\n搜狐时尚_搜狐网\t201127\n弹弹岛\t201128\n叫喊\t201129\n二诊理综\t201130\nrk3188\t201131\n赵群\t201132\n第六张\t201133\n陕西大剧院\t201134\n不合群\t201135\n入席\t201136\n文档化\t201137\n榴莲饼\t201138\n筑巢奖\t201139\n11mm\t201140\n2012年下半年\t201141\n小矮人\t201142\n000676\t201143\nzTree树\t201144\n叶文翔\t201145\n最生\t201146\n桥上\t201147\n结合型\t201148\n一以贯之\t201149\n南京军区福州总医院\t201150\n筐\t201151\n胶印\t201152\n误收\t201153\nRunlin\t201154\n薄姬\t201155\n航天工程\t201156\n李铮\t201157\n爆笑虫子全集\t201158\nDELUXE\t201159\n共振\t201160\n梦三国2\t201161\n7.38\t201162\n岳阳市二人民医院\t201163\n倍博特\t201164\ngoodman\t201165\n宜人贷问答\t201166\n办结\t201167\n中药味\t201168\n金险\t201169\n广州大学松田学院\t201170\n异地就医\t201171\n二七路\t201172\n下咽\t201173\n不坐\t201174\nmeyer\t201175\n主演\t201176\n360文件粉碎机\t201177\n中海阳光玫瑰园\t201178\nbose音响\t201179\n明道_论坛\t201180\nip协议\t201181\n失眠者\t201182\niOS11.1\t201183\n忌妒\t201184\n渐弱\t201185\n建筑与城市规划学院\t201186\n撒丁岛\t201187\n好儿子\t201188\n看家\t201189\n寿险理赔\t201190\n奥达曼\t201191\n反恐精英CS\t201192\n上雪\t201193\n龟裂\t201194\nelsevier\t201195\namuse\t201196\nCumlouder\t201197\nmann\t201198\n潜龙0318\t201199\n惜春\t201200\n处方集\t201201\n4矩阵\t201202\n驯龙高手2\t201203\n十三万\t201204\n二卷\t201205\n痛失\t201206\n李朝\
t201207\n返家\t201208\nTQ2440\t201209\n邮政储蓄银行\t201210\n锦绣香江\t201211\nsebs\t201212\n拜把子\t201213\n爱尔兰共和国\t201214\nVISION\t201215\nhighlighter\t201216\n大写字\t201217\nHistogram\t201218\n建军\t201219\n保姆式\t201220\n湖州在线\t201221\n捷豹XF\t201222\nUC头条\t201223\nAwayMP3\t201224\n话儿\t201225\nYeah\t201226\ncheckbox值\t201227\nShakes\t201228\n2015年9月15日\t201229\n吉村卓\t201230\nddz\t201231\n13幢\t201232\n竹木\t201233\n驾乘险\t201234\nOracle数据库栏\t201235\n三居室\t201236\nzdd\t201237\nparameter\t201238\n中国中信集团有限公司\t201239\npress\t201240\n鑫丰\t201241\n双环戊二烯\t201242\nCVA\t201243\n衅\t201244\n蔡家坡\t201245\n中国美术家网\t201246\n素肌\t201247\n资产支持证券\t201248\n合生元\t201249\nAGAIN\t201250\n收费站\t201251\n试一试\t201252\n新兰\t201253\ntso\t201254\n二进制位\t201255\nrestore\t201256\nwalls\t201257\n宁波海尔\t201258\nsammi\t201259\n办好\t201260\n普宁\t201261\nwwwkan7878com\t201262\nCrosslane\t201263\nFIB\t201264\n那本\t201265\nwarframe集团\t201266\n春树\t201267\n过山车\t201268\n80倍\t201269\n藻井\t201270\nIP反查\t201271\n不定根\t201272\ndatapeng\t201273\n天龙\t201274\n整体性\t201275\n培根\t201276\n皑皑\t201277\n宋庆龄基金会\t201278\nvol.4\t201279\n工伤认定\t201280\n2017年7月7日\t201281\n于欢\t201282\n中国奥园地产集团股份有限公司\t201283\n北京航空航天大学软件学院\t201284\n海晏\t201285\n10000号\t201286\n稳定性\t201287\n陈白露\t201288\n第四关\t201289\n002176\t201290\n擀\t201291\n饱和碳酸钠\t201292\n22条\t201293\n干胶\t201294\ntxt下载-必读网\t201295\n鸭粉丝汤\t201296\n兽奸\t201297\n扑向\t201298\n洪江市\t201299\n暗花\t201300\n尊荣版\t201301\n国土安全\t201302\n子域\t201303\n23时\t201304\nwilcom\t201305\nREALTEK\t201306\n大蛇\t201307\n飞天猫\t201308\n第13话\t201309\n诸家\t201310\n迅雷/百度云\t201311\ncongestion\t201312\n售货亭\t201313\n电球\t201314\n功波\t201315\n鹿勋\t201316\n口袋妖怪复刻_九游论坛\t201317\n1X\t201318\n不同寻常\t201319\n姚莉\t201320\n211985\t201321\n固特异轮胎\t201322\n甫\t201323\n古语\t201324\n酒标\t201325\n繁殖\t201326\nMika\t201327\n新疆维吾尔自治区人力资源和社会保障厅\t201328\n托法替尼\t201329\nCX-8\t201330\n复式记账\t201331\n抓包软件\t201332\n剑网3吧_\t201333\n舍甫琴科\t201334\n雾水\t201335\n2013-2014年\t201336\n丁基锂\t201337\n科学出版社\t201338\n谱天下\t201339\n凯迪拉克CTS\t201340\n海鲜\t201341\n成殇\t201342\n灯台\t201343\n货券\t201344\n铤\t201345\n撕光\
t201346\n上海整形医院\t201347\n600741\t201348\n电磁场\t201349\n小猪班纳\t201350\n听音\t201351\n盐穴\t201352\n原上草论文网\t201353\n开生\t201354\n狂赌\t201355\n钢纹\t201356\n土星\t201357\n奔驰s600\t201358\nGTA4\t201359\n桃渚镇\t201360\nキョ\t201361\n四十种\t201362\n战痕\t201363\n妥妥\t201364\n三岔口\t201365\nConvertio\t201366\nwince6.0\t201367\n测绘学报\t201368\nkim\t201369\n宝贝老板\t201370\n男1女\t201371\n航天恒星科技有限公司\t201372\n纳米粒\t201373\n生态农场\t201374\ngreens\t201375\n矢志不移\t201376\n郭斌\t201377\n飞舟\t201378\n呼啦\t201379\n玉竹\t201380\n原子城\t201381\n宾利飞驰\t201382\n第11集\t201383\n720P.MKV\t201384\n切克\t201385\n世源\t201386\n中南世纪花城\t201387\ndplyr\t201388\n南京理工大学经济管理学院\t201389\n大老二\t201390\n南通市政府\t201391\n118家\t201392\n浪淘沙\t201393\n7302\t201394\n2.5GB\t201395\n凝块\t201396\n玥儿\t201397\n废热\t201398\noracle11gr2\t201399\n养殖场\t201400\n30根\t201401\nmx300\t201402\nqq实名认证\t201403\n1024个\t201404\n速达5000\t201405\n肖邦\t201406\n青壳\t201407\n吸虫病\t201408\n盖塔\t201409\n仙剑六\t201410\n武汉地产集团\t201411\n芦荡\t201412\n杂志社\t201413\n套利交易\t201414\n有限合伙私募基金\t201415\n第五十九章\t201416\n三盛国际公园\t201417\n乐单机\t201418\n缓存盘\t201419\n运筹\t201420\n启东新闻网\t201421\n390号\t201422\n谢庆军\t201423\n郭萍\t201424\n贺敬轩\t201425\n江户川柯南\t201426\n哪本书\t201427\n听雨\t201428\n三格化粪池\t201429\n张川川\t201430\n森松尼\t201431\nKVM\t201432\n你若安好\t201433\n济南东\t201434\n意见稿\t201435\n米脂县第三中学\t201436\n鹦哥\t201437\n系友\t201438\n金枝欲孽2\t201439\n不密\t201440\npractitioner\t201441\n汽笛\t201442\n第11次\t201443\nUnity2017\t201444\n羟基硅油\t201445\n市医院\t201446\n神州控股\t201447\n全真教\t201448\nHar\t201449\n火树银花\t201450\n图处\t201451\nbadge\t201452\ndev-c++\t201453\nASI\t201454\nWorker\t201455\n20170320\t201456\n甚大\t201457\n美丽谎言\t201458\n麻辣豆腐\t201459\n斗燃\t201460\n铁锭\t201461\nsoilworks\t201462\n神界危机5.0\t201463\n學生\t201464\ntongyi\t201465\n杨丽花\t201466\n14万元\t201467\n3TB\t201468\n庞大\t201469\n旅宿\t201470\n戦舰\t201471\n模板页\t201472\n阿姆罗\t201473\n校霸\t201474\n摇摆机\t201475\n西弗勒斯·斯内普\t201476\n小魔卡\t201477\n根性\t201478\nDjango\t201479\n流水槽\t201480\n摇滚乐队\t201481\n营店\t201482\n大东门\t201483\n人工授粉\t201484\n重庆三峡博物馆\t201485\n16cm\t201486\npdf-毕业论\t201487\n玺越\t201488\nExcell
ent\t201489\n希沃授课助手\t201490\n冷缩\t201491\n重庆自贸区\t201492\n2018037\t201493\n双龙高铁\t201494\n陈昂\t201495\n黄老邪\t201496\n阿兹\t201497\n龙江路小学\t201498\n暗枪士\t201499\n计算机应用基础\t201500\n丹尼斯\t201501\n断桥\t201502\nKonica\t201503\n荒坡\t201504\nsrr\t201505\n千组\t201506\n发现室\t201507\ny染色体\t201508\n一举两得\t201509\n秘书们\t201510\n流沙河\t201511\n向井蓝\t201512\n好医\t201513\n_湖北省国家税务局\t201514\n五四团日\t201515\n论斗\t201516\ntheo\t201517\n悍高\t201518\n发放\t201519\n真鲜奶\t201520\n55BBS-我爱购物网\t201521\n龙毅\t201522\n35万\t201523\nher2\t201524\n沙湾村\t201525\nadwords\t201526\n送达\t201527\n朱洁\t201528\n金边黄杨\t201529\n组织度\t201530\njimi\t201531\n发令\t201532\n新港区\t201533\n华晶\t201534\n二次函数y=ax2+bx+\t201535\n牛客\t201536\n巴迪龙\t201537\n五影\t201538\n纳晶\t201539\n马尔萨斯\t201540\n合伙制\t201541\n凤凰房产\t201542\n来说下\t201543\n王克勤\t201544\n美体_太平洋\t201545\n微波通信\t201546\n手机回收网\t201547\n放手\t201548\n伏法\t201549\n生灭\t201550\n通天阁\t201551\n张全\t201552\n场景化\t201553\n地牢猎手5\t201554\n灼心\t201555\n六七十岁\t201556\n青岛海底世界\t201557\n健康时报网\t201558\n指纹采集器\t201559\n宁远县人民政府\t201560\n斯沃数控\t201561\n不分词\t201562\n21世纪以来\t201563\n易通行\t201564\nSTM32F107\t201565\n实时\t201566\n2017.3.5\t201567\n上海公交卡\t201568\n影音先锋影\t201569\n乡宁\t201570\n琅琊\t201571\nmydddfly\t201572\n双飞翼\t201573\nnexo\t201574\n芭蕾女孩\t201575\n美食篇\t201576\n搅拌棒\t201577\n中国共产党党组工作条例\t201578\n能源学院\t201579\n预压\t201580\n通分\t201581\nYacht\t201582\n流行前线\t201583\n优酪乳\t201584\n博彩公司\t201585\n星化\t201586\n九城\t201587\n第二位\t201588\n冯世纶\t201589\neloquent\t201590\n平沿\t201591\n杂物箱\t201592\n分割符\t201593\npfizer\t201594\n彭冠英\t201595\nSue\t201596\nGIFTU\t201597\nRinnai\t201598\n安吉大道\t201599\n同曦\t201600\n香缘\t201601\nlone\t201602\n亲亲我的宝贝\t201603\netc/hosts\t201604\n海盛\t201605\n李光华\t201606\n世界上\t201607\n茶文章\t201608\n10087\t201609\n唐玄宗\t201610\ngai爷\t201611\n赵奇\t201612\n插翅\t201613\n适航证\t201614\n北京师范大学历史学院\t201615\nZiMuZu\t201616\nSKS\t201617\n杭州巨星科技股份有限公司\t201618\n诗仙李白\t201619\n福永汽车站\t201620\n细腰\t201621\n金世豪\t201622\n乐业\t201623\n少女级\t201624\n张丽华\t201625\n空腹血糖\t201626\n秧盘\t201627\n长椿街\t201628\n肠绞痛\t201629\n7160\t201630\n受封\t201631\n吉普赛\t201632\n翼风
网\t201633\n热牛奶\t201634\nFNC\t201635\n贝拉拉\t201636\n音体\t201637\nLIGHT\t201638\n中滔环保\t201639\n茶多酚\t201640\n30招\t201641\n套扎\t201642\n伪基站\t201643\n业火\t201644\n刘建\t201645\n大屏嶂森林公园\t201646\n老肥\t201647\n第31期\t201648\n时尚型\t201649\n纵索\t201650\n文化产业管理专业\t201651\n刘泉\t201652\n不一\t201653\n网球赛\t201654\n干豆角\t201655\n上海市财政局\t201656\n魏超\t201657\n义宏\t201658\nbilinear\t201659\n销售面积\t201660\nQPushButton\t201661\n伊菲\t201662\n示好\t201663\n疾苦\t201664\nCobbLiu\t201665\n天美时\t201666\n新荣\t201667\n1466\t201668\n泰拉\t201669\n2小时\t201670\n明纬开关电源\t201671\n知识林\t201672\n龙道\t201673\n010|\t201674\n0714\t201675\n6月14日\t201676\n删稿\t201677\n佳哥\t201678\n山东新北洋信息技术股份有限公司\t201679\nRyzen5\t201680\n福特福克斯\t201681\n蛇窝\t201682\n嘉庆通宝\t201683\n满眼\t201684\n2400亿元\t201685\nacpi\t201686\n400万元\t201687\n潢\t201688\nReCap\t201689\n运货\t201690\n契约者\t201691\n万达乐园\t201692\nArai\t201693\nESHOP\t201694\n兔皮\t201695\n1z\t201696\n板报\t201697\n漫才\t201698\n旁遮普\t201699\n阈值\t201700\n开源分布式数据库\t201701\n壮乡\t201702\nogv\t201703\n古罗马帝国\t201704\n除尘\t201705\nAMOLED\t201706\n刘琨\t201707\n和谐\t201708\n长谷部\t201709\nmulberry\t201710\n592个\t201711\n中国移动魔百盒\t201712\nglade\t201713\n改气\t201714\nprome\t201715\n金葡菌\t201716\nendonote\t201717\n徐贲\t201718\nfenton\t201719\n0.1uf\t201720\nAyumi\t201721\n对生\t201722\n梁顶\t201723\nU盘启动盘\t201724\n古城\t201725\n阿坡饵\t201726\n三王\t201727\n帅胡\t201728\n学园默示录\t201729\n微观经济学分册)\t201730\ncake\t201731\n同病相怜\t201732\n本规范\t201733\nwww.1080\t201734\n二十四味\t201735\nSPARTAN\t201736\nParent\t201737\nchangelog\t201738\n剑啸江湖\t201739\nAlanf\t201740\n瓷肌\t201741\n1.029\t201742\n西湖公园\t201743\n94号\t201744\n凉性\t201745\n放尿\t201746\nANKER\t201747\nSOUL\t201748\n家用品\t201749\n何新\t201750\n剑侠情缘网络版叁\t201751\n递延收益\t201752\nGANG\t201753\n中德\t201754\nshezhi\t201755\n261ARA\t201756\n等离激元\t201757\n挣脱\t201758\n冻豆腐\t201759\nCD4\t201760\n马汉航空\t201761\nIt\t201762\n泪腺\t201763\n大钢\t201764\ndpmi\t201765\ntarga\t201766\n甲铁城\t201767\n风和\t201768\n1625\t201769\n244\t201770\n李昌钰\t201771\n安全社区\t201772\n达到\t201773\n陈妙瑛\t201774\nmaan\t201775\n多一张\t201776\n一纸\t201777\n
PS模拟器\t201778\n35周岁\t201779\n星际争霸\t201780\n障眼法\t201781\nmeinu\t201782\n凯发\t201783\n视盘\t201784\n打印机机\t201785\n挑一挑\t201786\n烫头\t201787\n积液\t201788\n哗然\t201789\n価格\t201790\n注射用哌拉西林钠他唑巴坦钠\t201791\n汉中门\t201792\n革兰\t201793\n补心丸\t201794\nT5\t201795\nOPE\t201796\n2071\t201797\n软实力\t201798\n梦幻西游符石组合表\t201799\ncxgrid\t201800\n打基础\t201801\n死状\t201802\n小善\t201803\n24H\t201804\n净化塔\t201805\n瓦格宁根大学\t201806\nIdentity\t201807\n王健\t201808\n腐蚀液\t201809\n2第一\t201810\n稍\t201811\n煮海焚天\t201812\n桌上型\t201813\n绢花\t201814\n豆花\t201815\n书简\t201816\n衡山西\t201817\n2017年2月10日\t201818\n辅助群\t201819\n利率市场化\t201820\n拉活\t201821\n也买酒\t201822\nSMPlayer\t201823\n乱斗\t201824\n鲁哈尼\t201825\n装表\t201826\n分版\t201827\n格蓝迪\t201828\n车行\t201829\n学生机\t201830\n武夷山政府网\t201831\n燕条\t201832\n那几天\t201833\n老总\t201834\n刘天永\t201835\nMATLA\t201836\n创天\t201837\ndwf\t201838\n价格表\t201839\n80万元\t201840\ngua\t201841\n三分\t201842\n江淮_瑞风R3\t201843\n长春市二道区政府\t201844\n上海清算所\t201845\n黄体酮胶囊\t201846\n啪啪啪群\t201847\n植鞣革\t201848\n中西部地区\t201849\n佳能IXUS\t201850\n双头龙\t201851\n京香Julia\t201852\n黄道吉日查询网\t201853\n冤家们\t201854\n成形术\t201855\n27.0\t201856\n井研\t201857\n马来貘\t201858\n硬物\t201859\n水板\t201860\n流利\t201861\n重式\t201862\n江南时报网\t201863\n声韵\t201864\n法识\t201865\n博实乐教育集团\t201866\n阳谷县\t201867\nKD-55X9000E\t201868\n首旅如家酒店\t201869\n南京大学政府管理学院\t201870\n概念性\t201871\n温州五马医院\t201872\n中国集体经济\t201873\n不变价\t201874\n换去\t201875\n苏尼特左旗\t201876\n抽象类\t201877\n老诚\t201878\n乳房胀痛\t201879\n维维豆奶粉\t201880\n电阻应变式传感器\t201881\n戚迹\t201882\n大队\t201883\njinping\t201884\n黑塞哥维那\t201885\nepochs\t201886\nAPlayer\t201887\ng6d1\t201888\n中国中铁电气化局集团公司\t201889\ncla\t201890\nANTA\t201891\n第128期\t201892\n神采\t201893\n博尔顿\t201894\n我该怎么办\t201895\n散弹\t201896\n科茨沃尔德\t201897\n生物科\t201898\nABB\t201899\nLNA\t201900\n女化\t201901\n老枪手\t201902\n王晓慧\t201903\nnrm\t201904\n鄂\t201905\n陕西煤业化工集团有限责任公司\t201906\nY927\t201907\ncoding\t201908\n黄磊\t201909\n苏敏\t201910\ntuzi\t201911\n莫语\t201912\nscript\t201913\nthinkphp5\t201914\n麻果\t201915\npng\t201916\nnom\t201917\n九节虾\t201918\n洗米华\t201919\n南门街\t201920\n荷兰猪\t20
1921\n教学篇\t201922\n贵阳市财政局\t201923\n除了\t201924\n黑龙江省图书馆\t201925\n陈迪\t201926\n输出型\t201927\npostgrel\t201928\n康帝\t201929\n夫妻工\t201930\n超过30天\t201931\n21点\t201932\n第四十一次\t201933\n中登\t201934\n扁桃酸\t201935\n水机\t201936\n薄片\t201937\nmpegts\t201938\n钱伟长\t201939\n黑a\t201940\nmmhg\t201941\n万能支撑器\t201942\n红莓花儿开\t201943\nTong\t201944\ncues\t201945\n爹娘\t201946\n谷氨酸钠\t201947\nLEMON\t201948\n拉萨河\t201949\n敌特\t201950\n妖姬\t201951\n两三年内\t201952\n安利柯\t201953\n浮法\t201954\n原来如此简单\t201955\n农耕\t201956\n石棉县\t201957\n叹息\t201958\n活吞\t201959\n网贷天下\t201960\n尚海湾\t201961\n工年\t201962\n艾派克\t201963\nfail2ban\t201964\n速查\t201965\n葱衣\t201966\npostMan\t201967\n克拉霉素片\t201968\n南苑社区\t201969\n钼酸铵\t201970\n高起本\t201971\n殚精竭虑\t201972\n汽车配件有限公司\t201973\n黑泷太郎\t201974\n魔鬼式\t201975\n深情版\t201976\n安立\t201977\n反光板\t201978\n塔吉克斯坦\t201979\n银匠\t201980\n人云亦云\t201981\n空肠\t201982\n雅馨\t201983\napk安装器\t201984\n胸膜间皮瘤\t201985\n杞菊地黄丸\t201986\n徐永心\t201987\n中文源\t201988\n托娅\t201989\n武林传奇\t201990\nimage\t201991\n8.53\t201992\n奶昔\t201993\n东山花园\t201994\n群舞\t201995\n壁葬\t201996\n尼休斯顿\t201997\n竹林\t201998\n64件\t201999\n20161218\t202000\n柴旦\t202001\n抓错\t202002\n茅侃东邪西毒\t202003\nget源\t202004\n上海律师事务所\t202005\n数据包络\t202006\n乔峰\t202007\n萨缪尔森\t202008\n海信空调\t202009\n恐怖悬疑栏目-喜马听书\t202010\n蛇肉\t202011\n二神\t202012\nyoueryuan\t202013\n星港城\t202014\n三国雪\t202015\n建院\t202016\nim50\t202017\n华南理工大学\t202018\n南开大学周恩来政府管理学院\t202019\n菊花台\t202020\n34名\t202021\nURDF\t202022\n债券回购\t202023\n宛城区\t202024\n故事盒\t202025\n居筱亦\t202026\n明镜\t202027\n成才\t202028\n工行手机银行\t202029\nneca吧\t202030\n少校\t202031\n猜歌\t202032\n目的国\t202033\nlights\t202034\n33分钟\t202035\nInduced\t202036\n红星海\t202037\n诺克提斯\t202038\n文锦渡\t202039\n定鼎网\t202040\n善心\t202041\n太平洋寿险\t202042\n礼单\t202043\n与人\t202044\nIMPORT\t202045\n字符串函数\t202046\n鲍莉\t202047\n羌笛\t202048\n皋兰\t202049\nmargins\t202050\n东尼大木吧\t202051\n第46号\t202052\nmp4播放器\t202053\n微党\t202054\nimbatv\t202055\n罪名\t202056\nRIMOWA\t202057\n蝴蝶梦\t202058\n白羊座女\t202059\n九段线\t202060\nJButton\t202061\n未亡人\t202062\n上端\t202063\n1.6亿元\t202064\n寒湿\t202065\nNutrition
al\t202066\n保护条\t202067\n031\t202068\n固定调\t202069\n曹宁\t202070\n新欧\t202071\n优丽\t202072\n行和列\t202073\n工艺美术\t202074\n将爱情进行到底\t202075\n冷处理\t202076\n武昌工学院\t202077\n魔鞋\t202078\n江瑜\t202079\n标文\t202080\n砍掉\t202081\n武术家\t202082\n书品\t202083\n兴隆小区\t202084\n九龙医院\t202085\n4.8号\t202086\nSwipe\t202087\n外宣\t202088\n中国统计局\t202089\n36.3\t202090\n小包包\t202091\n泰森\t202092\n胸膛\t202093\n半亿\t202094\n亚琛\t202095\n第一起\t202096\n大菱鲆\t202097\nnokia7plus\t202098\n4#\t202099\n中京电子\t202100\n流柯\t202101\nv3.5.0\t202102\n氯金酸\t202103\ncd播放器\t202104\n大陆银行\t202105\n汉王影院\t202106\n乡村爱情8\t202107\n庙岭\t202108\nwarder\t202109\n萨班斯法案\t202110\nTextureView\t202111\n航海东路\t202112\n論文\t202113\n19幢\t202114\n小环\t202115\nkatong\t202116\n飞桥\t202117\n雅图\t202118\n辽河\t202119\n宋宁宇\t202120\n失速\t202121\n张红霞\t202122\nVS2012\t202123\n午时茶\t202124\nAcne\t202125\n回马\t202126\n进口报关\t202127\n北楼\t202128\n拨杆\t202129\n受精\t202130\njquery选择器\t202131\n古怪\t202132\n梦幻2018\t202133\n环江县\t202134\n翘首\t202135\nFreestyle\t202136\nSwiper\t202137\n植村\t202138\n潘东\t202139\n10周岁\t202140\n流延机\t202141\n水玉姬\t202142\n长江航道局\t202143\n误嫁\t202144\nHBA\t202145\ndis\t202146\n生茶\t202147\n王局长\t202148\ngamemaker\t202149\n小弟弟\t202150\n古本\t202151\n1.7cm\t202152\nXem\t202153\n魔域SF\t202154\n俸伯\t202155\n被处死\t202156\nroleplay\t202157\n商企\t202158\n一大跳\t202159\n奇骏论坛_汽车之家论坛\t202160\n西装节\t202161\n三浦理惠子\t202162\n利盟\t202163\n非建筑\t202164\nKnights\t202165\nJazz\t202166\n健康日\t202167\n不雨\t202168\n熊猫人之谜\t202169\n卡博特\t202170\nprejudice\t202171\n押运员\t202172\nbuilds\t202173\n梁艺龄\t202174\n0611\t202175\n必要报酬率\t202176\nPRIDE\t202177\n杰杰\t202178\n第十六集\t202179\n拐走\t202180\n3.5毫米\t202181\nTetra\t202182\n浙江大学医学院附属口腔医院\t202183\n储藏柜\t202184\npxi\t202185\n微波滤波器\t202186\n圣地亚哥\t202187\n40日\t202188\n隐形眼镜盒\t202189\n荒人\t202190\n狂秀\t202191\n菜户营\t202192\n真石漆\t202193\n转增股\t202194\n文件库\t202195\n北岸\t202196\n正三角形\t202197\ndreamhack\t202198\n渔区\t202199\nt600\t202200\n九星胡\t202201\n张文生\t202202\n情迷\t202203\n韩村河镇\t202204\n蜂巢蜜\t202205\n麽\t202206\n小冯\t202207\n电驴\t202208\nLrc歌词网\t202209\nTPC\t202210\n采\t202211\n京
津唐\t202212\n狠批\t202213\n千针石林\t202214\n重庆医院\t202215\n木瓜牛奶\t202216\n针锋对决\t202217\nvolta\t202218\n徐州火车站\t202219\n电动摩托\t202220\n破岩\t202221\n艳福\t202222\n豆丁网\t202223\nstep7\t202224\n央视新闻联播\t202225\n育儿问答\t202226\nphpcs\t202227\n52年\t202228\ntunel\t202229\n重磅级\t202230\n杯段\t202231\nsty\t202232\n自动焊锡机\t202233\ns3cmd\t202234\n61首\t202235\nnormalize\t202236\n对外承包工程\t202237\n延平\t202238\n睡一觉\t202239\nToggleButton\t202240\n日高\t202241\n主动型\t202242\n5月26\t202243\n沙洲\t202244\n中国消费网\t202245\n剑王\t202246\n网篮\t202247\n盛年\t202248\n陈经纶\t202249\n淮北\t202250\n硫化铅\t202251\na6l\t202252\n终板炎\t202253\nvs编译器\t202254\n尿碘\t202255\n平江\t202256\n恢\t202257\n1000余\t202258\n杭州江南实验学校\t202259\n高仿版\t202260\n不定时工时制\t202261\n窑湾\t202262\n榆林市人民政府\t202263\n卡规\t202264\n贪财\t202265\n有毛\t202266\n短驳\t202267\n住化\t202268\n千克\t202269\n李文生\t202270\n江湖风云录2\t202271\n钛镁合金\t202272\n小蜜淘\t202273\n氨酚黄那敏颗粒\t202274\n治愈者\t202275\n中央人民广播电台少年广播合唱团\t202276\n养身\t202277\n席梦思\t202278\npp棉\t202279\n张柯宇\t202280\n绍兴鉴湖\t202281\n中国医疗器械行业协会\t202282\n收集池\t202283\nBuilt\t202284\n电版\t202285\n100支\t202286\n内蒙古\t202287\n李微微\t202288\nYL\t202289\n太平财产保险有限公司\t202290\n危情\t202291\n分布式光伏\t202292\n大田村\t202293\n2k17\t202294\n京东大学\t202295\nURV\t202296\n出身\t202297\n红瓶\t202298\n北滘\t202299\n臂展\t202300\n邳州\t202301\n空言\t202302\nxixicat\t202303\n钓鱼台国宾馆\t202304\n微彩\t202305\n企业管理系统\t202306\n易语言浏览器\t202307\n扎西德勒\t202308\n萧统\t202309\n卓依婷\t202310\n河南省肿瘤医院\t202311\n归并排序\t202312\n丝缎\t202313\n修图\t202314\n上市规则\t202315\n猴姑\t202316\n筛选\t202317\n猎金集团\t202318\n金通灵\t202319\n艾梅\t202320\n市档案局\t202321\nAdjusted\t202322\n超净台\t202323\nbige\t202324\n金丝楠木\t202325\nchecked\t202326\nparquet\t202327\n床沿\t202328\n海德花园\t202329\n长春动植物公园\t202330\n黄某\t202331\n双汇发展\t202332\n人身意外保险\t202333\n优酷爱奇艺\t202334\n道克\t202335\n20160608\t202336\n扒皮鱼\t202337\n元辰\t202338\n生花\t202339\nDirectDraw\t202340\n蒜蓉粉丝\t202341\n300章\t202342\n军婚\t202343\n香调\t202344\n使命召唤4现代战争\t202345\n微星z370\t202346\nstocks\t202347\nfcb\t202348\n机管\t202349\nMODO\t202350\n晶向\t202351\n铺作\t202352\n天河广场\t202353\nschule\t202354\n片\t202355\
n性侵害\t202356\nFASHION\t202357\n长有\t202358\n毛细胞\t202359\n龙坞茶镇\t202360\n高翠兰\t202361\n盘点\t202362\n遊\t202363\nSQL函数\t202364\n领客\t202365\nZX100\t202366\n杀人不眨眼\t202367\n河曲\t202368\n越南政府\t202369\n武圣卡\t202370\n宁波市国土资源局\t202371\n275ml\t202372\ndye\t202373\n天津市人民医院\t202374\n吃好\t202375\n匿名者\t202376\n兴化市\t202377\n救赎\t202378\n占用率\t202379\n一闪而过\t202380\n樱花庄\t202381\nPROJECT\t202382\n中国嘉德国际拍卖有限公司\t202383\n很喜\t202384\n无价之宝\t202385\n负环\t202386\n李飞\t202387\nSkyworth\t202388\nXCode8\t202389\n中音\t202390\n中华人民共和国国防部\t202391\nLoveLive\t202392\n运动方程\t202393\n蛋花汤\t202394\nfucker\t202395\nboth\t202396\n鹏城\t202397\n王佩瑜\t202398\n第二种\t202399\nuipicker\t202400\n护城\t202401\n电动门\t202402\nRotor\t202403\nspj\t202404\nimei\t202405\n人教版四年级语文上册\t202406\nNUC\t202407\n张镇麟\t202408\n紧身牛仔\t202409\n2017-10-27\t202410\n徐姐\t202411\n1960年\t202412\n22码\t202413\naway\t202414\n借款条\t202415\n保养品\t202416\n洁瑛\t202417\n大篆\t202418\n滴滴顺风车\t202419\n拜金主义\t202420\n同伴\t202421\n武当山道\t202422\ncuk\t202423\nSSR卡\t202424\nVR一体机\t202425\n三国吧_\t202426\n灵狐\t202427\n乳液\t202428\nAlarmManager\t202429\ncosyer\t202430\n晏紫东\t202431\nLoadRunner12\t202432\nACF\t202433\nBOSS\t202434\nUNIQ\t202435\n隔壁村\t202436\nTroll\t202437\n氟碳涂料\t202438\nClerk\t202439\n方大特钢\t202440\n观展\t202441\nWebster\t202442\n双口\t202443\n精绝\t202444\n广东美术馆\t202445\n临猗\t202446\n百分点\t202447\n九支\t202448\n奔富\t202449\n卡其色\t202450\n蜂拥\t202451\nwalking\t202452\n机试\t202453\n乐东黎族自治县\t202454\n广州美国领事馆\t202455\n上周四\t202456\n输出端\t202457\n江苏苏宁\t202458\n玖富万卡\t202459\n奥迪S5\t202460\n来临\t202461\n0.8.6\t202462\n等级\t202463\n微赞微擎\t202464\n蚌埠市人民政府\t202465\n十答\t202466\n耒阳市\t202467\n365.cn\t202468\n100厚\t202469\n国务院常务会议\t202470\n被捕\t202471\n伊诺\t202472\n_沃游网www.woyoo.com\t202473\n飞扬神途\t202474\n摔打\t202475\n延安市人民政府\t202476\ncommons-logging\t202477\n白白侠\t202478\n小少爷\t202479\n黄马甲\t202480\n李米\t202481\n百色新闻网\t202482\n8节\t202483\n幻维\t202484\n博图V13\t202485\n总统\t202486\nJDOM\t202487\n惹火甜\t202488\n类图\t202489\n左旋\t202490\n更多元\t202491\nPowerPoint幻灯片\t202492\n向氏\t202493\n柳州人才网\t202494\n恋爱时代\t202495\nPID算法\t20
2496\n凤凰路\t202497\n泡泡袖\t202498\n神游\t202499\n老任\t202500\nNGO中心\t202501\n吃惊\t202502\n单路\t202503\nKorn\t202504\n杨慧\t202505\n硬膜外血肿\t202506\n苏联共产党\t202507\n晋美\t202508\n学徒工\t202509\nLena\t202510\n不忘本\t202511\n注射用醋酸亮丙瑞林\t202512\n元认知\t202513\n美导\t202514\n凡间\t202515\n电信天翼4G\t202516\n上塘镇\t202517\n眼\t202518\n60次\t202519\n远洋新干线\t202520\n认缴资本\t202521\n龙运\t202522\n伦敦政治经济学院\t202523\ngcd\t202524\n七宝店\t202525\nTA们\t202526\n孟小冬\t202527\n_侠\t202528\n张婧懿\t202529\nMHO\t202530\nodin3\t202531\n中国联塑\t202532\n主轴承\t202533\n恩诺\t202534\nSneaky\t202535\n托莱多\t202536\nIndiaMART\t202537\n海军航空工程学院\t202538\n第08章\t202539\n专利申请人\t202540\n麒麟970\t202541\n花王股份\t202542\n吉屋\t202543\n毁\t202544\n采耳\t202545\n三星中央\t202546\n龙尊\t202547\n三十九级\t202548\n_金兰企划网\t202549\n寿险业\t202550\n内热\t202551\n快易捷药品交易网\t202552\n正面图\t202553\n07073游戏网\t202554\n八王坟\t202555\n数量级\t202556\nEditor\t202557\nrepost\t202558\n屎壳郎\t202559\n宋初一\t202560\n以退\t202561\n60台\t202562\n1440\t202563\n海蓝steven\t202564\n校面\t202565\nwps2013\t202566\n守艺\t202567\ncomponentone\t202568\nDREAMWEAVER\t202569\n难易\t202570\n邹忌讽齐王纳谏\t202571\n吴强\t202572\n沸程\t202573\nbeijing\t202574\nWP10\t202575\n地槽\t202576\n中序遍历\t202577\nyù\t202578\n1904\t202579\n晏殊\t202580\n河海大学商学院\t202581\nHandy\t202582\n遇\t202583\n支公\t202584\nbod\t202585\n匡恩\t202586\n149元\t202587\n除疤\t202588\n福币\t202589\n特征\t202590\n华灿光电\t202591\n模块式\t202592\n界桩\t202593\n鑫空间-鑫生活\t202594\n覆合\t202595\n崇义县\t202596\n享物\t202597\nphotosho\t202598\n朗恩\t202599\n水貂\t202600\n三农普\t202601\n太阳泪\t202602\nfinecms\t202603\n大兴镇\t202604\n贷款利率\t202605\n上元节\t202606\n丝粒\t202607\nReform\t202608\n李沧东\t202609\n蒙脱石\t202610\n新三字经\t202611\n拉丁人\t202612\n阿拉伯之春\t202613\n数字化工厂\t202614\n龟梨和也\t202615\n股人\t202616\n3万\t202617\n璧山县\t202618\n孔府\t202619\n西安省\t202620\n2017年12月2日\t202621\nliuxue86.com\t202622\n陈鹤皋\t202623\nCorelD\t202624\ncritical\t202625\nhdoj\t202626\n三厘米\t202627\n声书\t202628\nbrix\t202629\nmsoffice\t202630\n墓穴\t202631\nExactly\t202632\n娜美\t202633\n51分\t202634\n新中式建筑\t202635\n仿真软件\t202636\n恩爱夫妻\t202637\n阳光社区\t202638\n迁都\t202639\n1.4.4
\t202640\n24处\t202641\n吴晓华\t202642\nMOOG\t202643\n一首首\t202644\n母儿\t202645\n穆斯贝尔海姆\t202646\nmouseover\t202647\n濑户\t202648\nHintsoft\t202649\n5】\t202650\n中核建\t202651\n亿性\t202652\njcr\t202653\n北汽战旗\t202654\n美素佳儿\t202655\n金宇\t202656\n2.6.6\t202657\n塞规\t202658\n好假\t202659\n北塔\t202660\nAudience\t202661\n王哲\t202662\nSG\t202663\n李晓东\t202664\n煤机\t202665\nff15吧_\t202666\n卡斯柯信号有限公司\t202667\n驱风油\t202668\n智米\t202669\npo主\t202670\n陈方\t202671\n戴尔灵越14\t202672\n行政版\t202673\nwps2016\t202674\n培优\t202675\nptgui\t202676\n东营市人民政府\t202677\n沉香木\t202678\n交通\t202679\nHappiness\t202680\nso.1\t202681\n中药禁忌网\t202682\n五十铃汽车\t202683\n整理箱\t202684\n广州市白云区政府网\t202685\n底气\t202686\n观邸\t202687\n郝鹏\t202688\n然乌\t202689\n掌子\t202690\n参考价\t202691\n证卷\t202692\n香水海\t202693\ngdf\t202694\n陷落\t202695\n山空\t202696\n小巧手\t202697\n美态\t202698\n兰州工业学院\t202699\n北大附小联合实验学校\t202700\n窥屏\t202701\n郭总\t202702\n崖城\t202703\n回迁楼\t202704\ncet4\t202705\nB150\t202706\n特罗姆瑟\t202707\n阶位\t202708\nbitbit\t202709\n行列式\t202710\npth660\t202711\n宏光\t202712\n猎杀潜航3\t202713\n北京中路\t202714\n垌\t202715\n肤浅\t202716\n一堂课\t202717\n许继电气\t202718\nmooka\t202719\n毛呢大衣\t202720\n企业知识产权管理规范\t202721\n本硕博连读\t202722\nCARBON\t202723\n0391\t202724\nshook\t202725\n通政\t202726\n诗雨\t202727\n第131\t202728\n当我想你\t202729\n吊带衫\t202730\nUgirls\t202731\nThou\t202732\n天生炫舞时代\t202733\n张掖\t202734\n声声入耳\t202735\n地磁传感器\t202736\n新安怡\t202737\ntemperatures\t202738\n一撮\t202739\n龙华医院\t202740\n这本\t202741\nㄌ\t202742\n西亚斯\t202743\n钱鞭\t202744\n宇多田\t202745\n鲜艳\t202746\n施秉\t202747\n木吒\t202748\n太白山\t202749\n挂表\t202750\nchmod\t202751\nShotgun\t202752\n诗人们\t202753\nSurvivor\t202754\n小个子\t202755\n002081\t202756\nCpp\t202757\n恋舞OL\t202758\n卖片\t202759\nx64.dll\t202760\n所在表\t202761\n在那边\t202762\n战豹\t202763\nperforming\t202764\nEP51\t202765\n老行家\t202766\n雷霆万钧\t202767\n分面\t202768\n盎格鲁撒克逊\t202769\nbeurer\t202770\nsectors\t202771\nFileListing\t202772\n管理类\t202773\n戒严\t202774\nGeoTrust\t202775\n星海广场\t202776\ninetd\t202777\n企业征信机构\t202778\n国家专利局\t202779\n明细栏\t202780\nCharity\t202781\n葛家澍\t202782\n杨小曼
\t202783\n菏泽百姓网\t202784\n中共一大会址纪念馆\t202785\nparentheses\t202786\n热炒\t202787\n赵楠\t202788\n互联网化\t202789\n嘀嘀点\t202790\n玄冥\t202791\n曹村\t202792\nflet\t202793\n與\t202794\n倒行\t202795\n周王庙镇\t202796\nHS编码查询系统\t202797\n鹿岛\t202798\nmsdn\t202799\n中国县\t202800\n金禾\t202801\nt800\t202802\n一喝水\t202803\n香烟\t202804\nTDDL\t202805\n云南师范大学文理学院\t202806\n欧迪\t202807\n大柴胡汤\t202808\n转借\t202809\n弥可\t202810\n普乐士\t202811\n漫舞\t202812\nonerror\t202813\n18035期\t202814\n履带式抛丸机\t202815\nLuxembourg\t202816\n惠威科技\t202817\nJsp\t202818\n映泰Hi-Fi\t202819\n超级变态传奇\t202820\n长沙商贸旅游职业技术学院\t202821\n三十八周\t202822\n张黄镇\t202823\n夺标\t202824\n几天几夜\t202825\n晨鸣\t202826\n分类\t202827\n黄雅琼\t202828\n曹作兰\t202829\n茶客\t202830\n1敏\t202831\narcsde\t202832\nwww.tonghua5.com\t202833\n撸撸撸电影\t202834\n穷理\t202835\n河粉机\t202836\n剑侠情\t202837\njson\t202838\ninterpolation\t202839\n型格\t202840\n龙楼镇\t202841\nEntropy\t202842\n利哥\t202843\n279元\t202844\n始终\t202845\n狂闪\t202846\n八面\t202847\n漆树\t202848\n邯郸市区\t202849\n大比拼\t202850\nL4\t202851\n新环境保护法\t202852\n陈卫东\t202853\n硬质合金\t202854\n弱光\t202855\n奈曼旗\t202856\n透水混凝土\t202857\nwaiting\t202858\ngaoqing\t202859\n印光法师\t202860\n法字\t202861\ntransportation\t202862\n山楂酒\t202863\nzhuo\t202864\n幼虫\t202865\n公共文化服务保障法\t202866\n铝罐\t202867\nR2017a\t202868\npb9.0\t202869\n山里红\t202870\n心裳\t202871\n蓝丝\t202872\n小熊请客\t202873\n激素类药\t202874\n立帖\t202875\n麻省理工科技评论\t202876\n绮美香\t202877\nforin\t202878\n幽灵行动\t202879\n保利清能\t202880\n天才狂妃\t202881\nFanyijia\t202882\n多囊卵巢\t202883\n沐辰\t202884\n交通卡\t202885\n尽处\t202886\n风墙\t202887\n邓伟\t202888\nwzyxidian\t202889\n生态林\t202890\n以梦为马\t202891\n10式\t202892\n车审\t202893\n1000多张\t202894\nVideoView\t202895\nOFFICE2010\t202896\n苏宁易购帮助中心\t202897\n27.79万亿元\t202898\nSuperSocket\t202899\n100所\t202900\nlyg\t202901\n华大基因\t202902\n时代广场的蟋蟀\t202903\n派力奥\t202904\n复国\t202905\n城者\t202906\n结构体变量\t202907\n大打出手\t202908\n彭子益\t202909\n调减\t202910\n塞西\t202911\n环法自行车赛\t202912\n11卷\t202913\n8.5_\t202914\n问祖\t202915\nManifest\t202916\nSATA\t202917\n贵州省财政厅\t202918\n金库网\t202919\n刺槐\t202920\n纹绣\t202921\n落后\t202922\n复方甘草酸
苷\t202923\n十七度\t202924\n岔河\t202925\n发下\t202926\n八重天\t202927\n尚维\t202928\nJBL\t202929\n雪人股份\t202930\n100#\t202931\n打摆子\t202932\n石栏\t202933\n素色\t202934\n171号段\t202935\nowned\t202936\n10.4\t202937\nHERMES\t202938\n吃豆人\t202939\ntemps\t202940\n北京建筑大学\t202941\n&2\t202942\n圣骑\t202943\n75克\t202944\n游匣7559\t202945\n雷宇扬\t202946\n濮阳市实验小学\t202947\nfuneral\t202948\nLone\t202949\n奥特胶囊\t202950\n清华附小\t202951\n大忙\t202952\n协整\t202953\ngaba\t202954\n鸡脯肉\t202955\n在一起\t202956\nWORX\t202957\n20160102\t202958\n赠礼\t202959\n华艺名教育\t202960\n泡\t202961\n杨浩涌\t202962\n玩唱\t202963\n北斗星通\t202964\n深圳市委\t202965\n锐骐\t202966\n贤妃\t202967\n回放器\t202968\n结机\t202969\n谢平\t202970\ncad光标\t202971\n棕榈酸酯\t202972\n林正泰\t202973\n变化率\t202974\n中望cad2015\t202975\n沧州市人力资源和社会保障局\t202976\n澜沧拉祜族自治县\t202977\n苹果MAC\t202978\nDutch\t202979\n软笔\t202980\n出纳\t202981\n1450\t202982\n那个你\t202983\n包产到户\t202984\n朱可夫\t202985\n中国共产党宣城市委员会\t202986\n红丹\t202987\nJAX-RS\t202988\ncharacterized\t202989\n格瑞\t202990\n复音口琴\t202991\n观书有感\t202992\n静女\t202993\n国防科大\t202994\n当局者\t202995\n乌鲁木齐路\t202996\n界\t202997\n暖昧\t202998\n半个世纪\t202999\n惊喜\t203000\nPCU\t203001\n罗普斯金\t203002\n烟台三中\t203003\n回头是\t203004\n面包机\t203005\n照片展\t203006\n陕西省教育厅\t203007\n金福南杀人事件始末\t203008\n2164\t203009\n新宾\t203010\n杀头\t203011\n第17卷\t203012\n赵海涛\t203013\n6.1.6\t203014\n刚毛\t203015\n犬\t203016\n白云瑞\t203017\n五页\t203018\n四十四\t203019\n推信网\t203020\n速成\t203021\n卷席筒\t203022\nDrawings\t203023\n零税\t203024\n项目管理师\t203025\n通信技术\t203026\nfilereader\t203027\n油包\t203028\nX4\t203029\nSociety\t203030\n洛宝贝\t203031\n招考\t203032\nMash\t203033\n拿破仑\t203034\n波箱油\t203035\n教育联展网\t203036\n约书亚\t203037\n磁器口\t203038\n杉浦ボッ树\t203039\nTXT电子书\t203040\n万花谷\t203041\n网众\t203042\n租购\t203043\n逍遥津\t203044\n2016年1月18日\t203045\n透视眼\t203046\n香江\t203047\n0b\t203048\n新萌\t203049\nprofession\t203050\n四川大学出版社\t203051\n露华浓\t203052\n逻辑分辨率\t203053\n2017年1月1日\t203054\nDMD\t203055\n黄金国\t203056\n快影\t203057\n梁园\t203058\n点评酒店排名【携程酒店\t203059\n24座\t203060\n怀山\t203061\nJul\t203062\n85w\t203063\n张吉怀\t203064\n穷困潦倒\t203065\n颁奖\t203066\n冷空
气\t203067\n第三级\t203068\nKuni\t203069\n数码宝贝故事赛博侦探\t203070\nciti\t203071\n疏港大道\t203072\n裙楼\t203073\n冥灵\t203074\n实用型\t203075\nx264\t203076\n绿城诚园\t203077\n中华机械网\t203078\n湘南学院\t203079\n亡命天涯\t203080\nWIN7系统\t203081\n东阳人才网\t203082\n合肥地区\t203083\n美中贸易战\t203084\ndel\t203085\ntroops\t203086\n阿元\t203087\n地调\t203088\nIXXX\t203089\n跨区表\t203090\nu2518\t203091\nangelina\t203092\n立\t203093\n堵车\t203094\n长沙窑\t203095\nday\t203096\n乌兰察布\t203097\n国剧_6v电影网\t203098\npremiun\t203099\nqpainter\t203100\n2330\t203101\n乌鲁木齐地窝堡机场\t203102\n涨破\t203103\n古诗二首\t203104\n贵妃\t203105\n厂线\t203106\nITP\t203107\n饮血\t203108\n200多套\t203109\n泰然自若\t203110\n19.0\t203111\n优酷土豆\t203112\n甲盖\t203113\n林顿\t203114\nNewSea\t203115\n非法学专业\t203116\n看上\t203117\n初性\t203118\n车门\t203119\n蓝珀\t203120\n琵琶曲\t203121\n保富\t203122\n蓝叠安卓模拟器\t203123\n法国商学院\t203124\n操场\t203125\nbett\t203126\n汝窑\t203127\n温莎郡\t203128\n2ch吧_\t203129\n鬼潇潇\t203130\n微波传感器\t203131\n曾志权\t203132\n鸡尾酒会\t203133\n己欲达而达人\t203134\n将进酒\t203135\n天意文学网\t203136\n至高\t203137\nrandomly\t203138\n工人报新闻网\t203139\n360网址导航\t203140\n1976\t203141\nderen\t203142\n森鸥\t203143\n洛神\t203144\n泰比\t203145\ntalk\t203146\nStepper\t203147\n筑美\t203148\n包车\t203149\ndaocloud\t203150\n页卷\t203151\n肖瑶\t203152\n显示性\t203153\nSofa\t203154\n中国宝武集团\t203155\n冒昧\t203156\n瑞恩\t203157\nSUAPP\t203158\n云帆\t203159\n300458\t203160\n胚层\t203161\n视讯\t203162\n撸撸奇\t203163\n三星S5\t203164\n通乳\t203165\n百度新闻源\t203166\n开胃菜\t203167\n点阵图\t203168\n42分钟\t203169\nwindow_net\t203170\nifind\t203171\n环亚娱乐\t203172\n钾\t203173\n彭静\t203174\n2.5万元\t203175\n亡父\t203176\n炮姐\t203177\n4.98\t203178\n电子报\t203179\nwww.4399wanju.com\t203180\nrhetorical\t203181\n福克斯论坛_汽车之家论坛\t203182\nv1.7\t203183\n农安\t203184\nLear\t203185\n严肃\t203186\n龙隐\t203187\n单层\t203188\nasn\t203189\nEin\t203190\n張\t203191\n毛家饭店\t203192\n002431\t203193\n优待警\t203194\nFerrari\t203195\n商海\t203196\n概况\t203197\n坡莫合金\t203198\n三亚房产网\t203199\n华为荣耀7\t203200\n党刊\t203201\n均益\t203202\n陈元\t203203\nPCSX2\t203204\nperson\t203205\n补射\t203206\n洋字\t203207\nfuze\t203208\n小册子\t203209\n新图\t2032
10\nharbor\t203211\nVeneta\t203212\n位置\t203213\n山塘\t203214\nMID\t203215\nIMG\t203216\n请回答1988\t203217\nnese\t203218\n穿孔吸音板\t203219\n葡萄牙\t203220\nptfx\t203221\npui\t203222\n1071\t203223\nComic\t203224\n勇舞网\t203225\n申沃\t203226\n骗人\t203227\n难得糊涂\t203228\n动态图\t203229\n儒生\t203230\n求金\t203231\n88折\t203232\n文句\t203233\n颜色拾取器\t203234\nstm32f10x\t203235\nTsinghua\t203236\n狠人\t203237\n漏底\t203238\n江都政府网\t203239\n神武帮派\t203240\n酷毙\t203241\n电解质紊乱\t203242\n教师招聘网\t203243\n人夫\t203244\nFIA\t203245\n云中雀\t203246\n转门\t203247\n两江镇\t203248\n保本理财\t203249\n朝五晚九\t203250\n海尔集团\t203251\n甲状腺腺瘤\t203252\nCors\t203253\n菜椒\t203254\nTS16949\t203255\n汉武\t203256\ncdr7\t203257\n瑞虎7论坛_汽车之家论坛\t203258\n武汉大学外国语言文学学院\t203259\n宇宙战舰大和号\t203260\nHewlett\t203261\n爱情睡醒了\t203262\n这句话\t203263\n2017年五月\t203264\n中国华能集团有限公司\t203265\n浙江网络联盟\t203266\n理家城_卫浴\t203267\n解道\t203268\n琼恩\t203269\n九轮\t203270\n刘玄\t203271\n李立宏\t203272\n升旗仪式\t203273\n网易社\t203274\nForecasts\t203275\n骨格\t203276\nLipton\t203277\n南巷\t203278\n浙江交工集团股份有限公司\t203279\n控制式\t203280\n38师机动队\t203281\n嘉爵\t203282\n王炳贤\t203283\n二倍\t203284\nIDOL\t203285\n幸运一击\t203286\n承载力\t203287\n边宽\t203288\n南通西站\t203289\n8860\t203290\n多喝\t203291\n潘\t203292\n寄生虫\t203293\n门嘴\t203294\n赋码\t203295\n通许\t203296\nteminal\t203297\n平桥新闻网\t203298\n本社\t203299\n养生\t203300\n浙江大学国际联合学院\t203301\nEnabled\t203302\nunblock\t203303\n高速度\t203304\n高行健\t203305\nHuracan\t203306\n何明\t203307\nSolidCAM\t203308\n温故知新\t203309\ndirectplay\t203310\n乳源\t203311\n两屏\t203312\n孙红\t203313\n派件员\t203314\n天波\t203315\n王伟忠\t203316\n请留\t203317\n三国之见龙卸甲\t203318\ngv钙片\t203319\n电汇\t203320\n多长时\t203321\nmvcc\t203322\n结筋\t203323\n鸿运国际\t203324\n漳浦县\t203325\n小黑猪\t203326\nLangrisser\t203327\n接触性\t203328\nSUN\t203329\n1955\t203330\n食品厂\t203331\n3dmax2011\t203332\n大桥未久\t203333\nwarns\t203334\n林岚\t203335\n金皇村\t203336\n莱斯璧\t203337\nunless\t203338\n四堡\t203339\n读后感\t203340\nfamily\t203341\n东京旅游问答\t203342\nCodeBlocks\t203343\n湖城\t203344\n米思齐\t203345\n酒类\t203346\n光荣公司\t203347\nusb2.0\t203348\nkraft\t203349\n钢琴烤漆\t203350\n黛力\t203351\nOS
A\t203352\n维生素B6\t203353\n麦克米兰\t203354\n补中\t203355\n1more\t203356\n10只\t203357\n20150228\t203358\n大唐无双2\t203359\n阿基米德\t203360\n凌远\t203361\n补水仪\t203362\n葡京娱乐场\t203363\nhfd\t203364\n弋矶山\t203365\n加子\t203366\n布拖县\t203367\n三城一区\t203368\n新星\t203369\nVBScript\t203370\n八达网\t203371\n夫琅禾费衍射\t203372\n现流\t203373\n大张\t203374\n天技\t203375\n供养\t203376\n影舞者\t203377\n域外\t203378\n圆通\t203379\nxdc\t203380\n我很喜欢\t203381\n风之丘\t203382\nunderstanding\t203383\ngloble\t203384\n409\t203385\n溴化丁基橡胶\t203386\ntlp\t203387\n神界原罪2\t203388\n矢吹健太朗\t203389\n大事者\t203390\nGNOME\t203391\n移动支付网\t203392\nrestaurant\t203393\n通风器\t203394\n发照\t203395\n入化\t203396\n枫亭镇\t203397\nINS-30131\t203398\n沙巴克皇宫\t203399\n瑞雪\t203400\n广州海\t203401\n战国四大名将\t203402\n跨接\t203403\n武汉地铁集团\t203404\n45s\t203405\n武汉美术馆\t203406\n活跃度\t203407\n大家闺秀\t203408\n空洞\t203409\n资中县\t203410\n44期\t203411\n莫干\t203412\n佳\t203413\n通威\t203414\n国际金融市场\t203415\n湖南省工商局企业注册局\t203416\n安规认证\t203417\n凸起\t203418\n五更\t203419\ndry\t203420\n小路\t203421\nSCF\t203422\n消防水\t203423\n三国萌战\t203424\n逆流小说网\t203425\n悯\t203426\n老公性\t203427\n台湾街\t203428\n伍伍慧\t203429\nuti\t203430\n密室\t203431\n3ppt\t203432\n皮肌炎\t203433\n逐次\t203434\nqualifies\t203435\n普度\t203436\n那年\t203437\nMasks\t203438\nchinalorin\t203439\n亚洲湾\t203440\n泰国人\t203441\n300059\t203442\n快处\t203443\nautonomous\t203444\n堕夜\t203445\n老是\t203446\nOpt\t203447\n单集\t203448\n9130\t203449\n|性商网\t203450\n乙酸钠\t203451\n沪江日语学习网\t203452\n穿越时空的少女\t203453\n和睦家医院\t203454\n客运东站\t203455\n王扶林\t203456\n三僚\t203457\nScissors\t203458\n云龙政府网\t203459\n飞秒\t203460\nDruid连接池\t203461\nMOCO\t203462\n杨柳郡\t203463\n小机\t203464\n我是杜拉拉\t203465\n学礼\t203466\n1909\t203467\n笑傲神雕\t203468\n重庆市公共资源交易中心\t203469\n正言\t203470\n助动车\t203471\n禁渔期\t203472\n编绳\t203473\nalison\t203474\nPublic\t203475\n∕\t203476\n空山鸟语\t203477\n作序\t203478\n肉骨茶\t203479\n金港城\t203480\n100帧\t203481\n哼唱版\t203482\nclaudia\t203483\n老父\t203484\n小乖乖\t203485\n吉良吉影\t203486\n防己\t203487\n彩喷\t203488\n超短线\t203489\n兴安岭\t203490\nbullets\t203491\nTimeseries\t203492\n崩坏学院2\t203493\n恶心感\t203494\n逸尘\t20349
5\n万行教师人才网\t203496\ngr07\t203497\nTia\t203498\nminifilter\t203499\nlinuxc\t203500\nav片毛\t203501\nButyl\t203502\n我的美母教师\t203503\n亿华通\t203504\n介绍性\t203505\nJekyll\t203506\n七瀬ジュリア\t203507\n下面\t203508\nactiviti-explorer\t203509\n百度视频\t203510\n名湖\t203511\n隐密\t203512\n中行外汇\t203513\ncpu吧_\t203514\n首批\t203515\n围压\t203516\n落草\t203517\n一牛\t203518\n平手友梨奈\t203519\n深圳教育在线\t203520\n文件盘\t203521\n郑报\t203522\n学位认证\t203523\n三会一课会议\t203524\n巧克力\t203525\n明星梦\t203526\n河南省农村信用社\t203527\n礼拜\t203528\n星野飞鸟\t203529\n家福\t203530\n5102\t203531\n湖北省人民代表大会常务委员会\t203532\n罗城\t203533\n祁劲松\t203534\n重庆理工大学\t203535\n丰田花冠\t203536\nWWE狂野角斗士\t203537\n绿色建筑工程\t203538\n订货号\t203539\n1116\t203540\n第96期\t203541\n贬官\t203542\nTECPLOT\t203543\n吉林省高级人民法院\t203544\ndisadvantage\t203545\n青灯\t203546\n海南新闻网\t203547\n昆明电信\t203548\n银行结算账户\t203549\n互补品\t203550\n身处\t203551\n灵契\t203552\n这首诗\t203553\n造影剂\t203554\n过生日\t203555\n申通快递单号\t203556\nSPRD\t203557\n大人物\t203558\n侧壁\t203559\n西安市环境保护局\t203560\n张永纯\t203561\n可容纳\t203562\nSomething\t203563\n氟虫腈\t203564\n浴霸\t203565\n第四十二章\t203566\n汴梁城\t203567\nUnveils\t203568\n龋\t203569\n纯铜\t203570\n海味\t203571\n新港东路\t203572\n南回归线\t203573\n家电业\t203574\n梦幻之星OL2\t203575\n钢笔工具\t203576\nYOUR\t203577\n垦利\t203578\nxLua\t203579\n巴巴\t203580\n二层交换机\t203581\n正商智慧城\t203582\n亮机\t203583\n`\t203584\n前所未闻\t203585\nbroader\t203586\n拉风龙\t203587\n胆酸\t203588\n贩卖\t203589\n重庆购物狂\t203590\n汽配人图片网\t203591\n魔幻\t203592\n香山公园\t203593\nSyllabus\t203594\n华远君城\t203595\nbilibili视频|哔哩哔哩\t203596\n第146集\t203597\n隔离度\t203598\nAllianz\t203599\n亚里沙\t203600\n星徽\t203601\n蛋子\t203602\n刹帝利\t203603\n桩顶\t203604\n子窗体\t203605\nMPlayer\t203606\n三元达\t203607\n起式\t203608\n淘宝秒杀器\t203609\n县卫计局\t203610\n蜀山战纪剑侠传奇\t203611\n李兆会\t203612\n小百\t203613\n601992\t203614\n中国友谊出版公司\t203615\n第一角\t203616\n爱站网\t203617\n公对公\t203618\n尿崩\t203619\n投币器\t203620\n创作\t203621\n矮马\t203622\n棒约翰\t203623\nSandy\t203624\n四千万\t203625\n峭壁\t203626\n藏\t203627\n大和锦\t203628\n杂交水稻\t203629\n紫外吸收峰\t203630\n李尚顺\t203631\n中国福利会幼儿园\t203632\n巨乳女\t203633\n中国工商银行股份有限公司\t203634\n四团镇\t203635
\n珠光宝\t203636\nSNAT\t203637\n华城\t203638\nhikaricp\t203639\n投资学\t203640\n32点\t203641\n登山队\t203642\n出事后\t203643\n咱们\t203644\n运球\t203645\nCranberries\t203646\n孙晨\t203647\n以外\t203648\n后半个月\t203649\n明末边\t203650\nNow\t203651\n金刚鹦鹉\t203652\n东方剑桥幼儿园\t203653\n草鸡\t203654\ng1840\t203655\n勿入\t203656\n大关小学\t203657\nForgotten\t203658\n非加密区\t203659\n校本\t203660\n伊吾\t203661\nleaflet\t203662\n弓道\t203663\n法外狂徒\t203664\nVienna\t203665\n张大爷\t203666\nStardew\t203667\n无垠的太空第二季\t203668\n高翻\t203669\n2772\t203670\nMSB6006\t203671\n罗曼诺\t203672\n南线\t203673\n两笔\t203674\n童声\t203675\n_火车时刻查询_火车票网\t203676\nouhy\t203677\nlover\t203678\n2017年4月15日\t203679\n羊儿\t203680\ntext文本框\t203681\nHarmonized\t203682\n90nm\t203683\n非人哉\t203684\n2600元\t203685\nIFTY\t203686\n高姿\t203687\n词尾\t203688\n宏杉科技\t203689\n手掌\t203690\n上海商城\t203691\n包长\t203692\n我的梦里花落\t203693\n2.2.6\t203694\n奇瑞瑞虎7\t203695\nquo\t203696\n黑心\t203697\n提线\t203698\ndefender安全中心\t203699\n停征\t203700\npee\t203701\n100券\t203702\n钟鸣\t203703\n安徽银行\t203704\n乌有之乡\t203705\n我的战争\t203706\nPRECISION\t203707\n音响师\t203708\n血战上海滩\t203709\n10.com\t203710\n600多万\t203711\nmunich\t203712\n开刃\t203713\n毛茛\t203714\n白茶饼\t203715\nRequires\t203716\n同程网\t203717\n星女郎\t203718\n中国港湾工程有限责任公司\t203719\niG电子竞技俱乐部\t203720\nDIRECT\t203721\n12154\t203722\n缺斤少两\t203723\n田径运动会\t203724\n21册\t203725\nHarmonic\t203726\nNTP\t203727\n存贷款基准利率\t203728\n150卡\t203729\n铁箱\t203730\n黄岭\t203731\n美柑\t203732\n今古\t203733\n热播剧\t203734\n直觉\t203735\n电棒\t203736\n川航\t203737\n怎么都快乐\t203738\n計劃\t203739\n飞速\t203740\n杀伤力\t203741\n奥迪Q5L\t203742\ns9+\t203743\n4000万吨\t203744\n昭然若揭\t203745\n仙灵骨葆片\t203746\n多重共线性\t203747\n中共山东省委组织部\t203748\n勇哥\t203749\n私募频道\t203750\n入子\t203751\n一灯\t203752\n建龙\t203753\n色彩学\t203754\n东方战场\t203755\n新生儿\t203756\n小霸\t203757\n完数\t203758\n暗杀星\t203759\n作乐\t203760\nWenQ\t203761\nios6.1.3\t203762\nnalu\t203763\n中国营销传播网\t203764\n万科城花璟苑\t203765\nNumPy\t203766\nhtons\t203767\n桃园\t203768\n网纹瓜\t203769\n电加热炉\t203770\n艳鬼\t203771\n200plc\t203772\n体脂秤\t203773\n小怪\t203774\n万道\t203775\n钮扣\t203776\ndissolv
e\t203777\n鲁美诺斯\t203778\n8220\t203779\n民主德国\t203780\n北京生命科学研究所\t203781\n静修\t203782\n29P\t203783\n第一款\t203784\n见笑\t203785\n哇\t203786\n积分表\t203787\n驾校一点通模拟考试c1\t203788\nuses\t203789\n47mm\t203790\n寿光\t203791\n明明\t203792\n分布式存储系统\t203793\n1774\t203794\n票友\t203795\ni5-7400\t203796\n小度\t203797\nU17\t203798\nglobals\t203799\n河森堡\t203800\n环海\t203801\n锦江酒店\t203802\nIdeaPad\t203803\n义乌卫生局\t203804\n红提\t203805\ntemporal\t203806\n快乐老家\t203807\n五孔\t203808\n1591\t203809\n中空注浆锚杆\t203810\n抵押物\t203811\nquadro\t203812\n会厌囊肿\t203813\n1150针\t203814\nA00\t203815\n上海轨道交通17号线\t203816\npf\t203817\n季张全蛋\t203818\nucan\t203819\n七星\t203820\n炉石传说冰封王座冒险\t203821\ntsunami\t203822\n准备就绪\t203823\n挂彩\t203824\ncompliment\t203825\n点色\t203826\n三月初\t203827\n梗塞\t203828\nSend\t203829\n品骏\t203830\n阳光农廉网\t203831\n契机\t203832\n茁\t203833\n白草\t203834\nscenic\t203835\n批批\t203836\n影豆网\t203837\n海马体\t203838\n纳兰性德\t203839\nXubuntu\t203840\n黄靖\t203841\nstrategy\t203842\nube\t203843\n2016年4月4日\t203844\n短点\t203845\n辽宁VS广厦\t203846\nWHI\t203847\n倍健\t203848\nAccessibilityService\t203849\n武汉房地产\t203850\n资讯\t203851\n20150507\t203852\n向君\t203853\n国际航空\t203854\n定局\t203855\n雪铁纳\t203856\n小廖\t203857\n电子计算器\t203858\n最近一年\t203859\n品客\t203860\n负性\t203861\n|八\t203862\n国盾\t203863\n逃出绝命镇\t203864\n真中\t203865\nCurseForge\t203866\nTrades\t203867\n奥兰多迪士尼\t203868\n冰肤\t203869\n种马\t203870\n准妈妈们\t203871\nPMSM\t203872\n均质器\t203873\nCompensation\t203874\n六季\t203875\n27.8\t203876\n434号\t203877\n童生\t203878\n国家发展和改革委员会\t203879\n恒道\t203880\n内忧外患\t203881\n雏鸡\t203882\n借东西的小人阿莉埃蒂\t203883\n蓝镜\t203884\n代客理财\t203885\neg\t203886\n金香玉\t203887\n唱唱\t203888\n四一分\t203889\n银河基金\t203890\n吸奶\t203891\n郡县\t203892\n西条琉璃\t203893\nNSUserDefaults\t203894\nMiner\t203895\n米米\t203896\n门闩\t203897\n福友\t203898\n联庄\t203899\nkevin\t203900\n拖泥带水\t203901\n建虎网\t203902\n大伴\t203903\n赛门\t203904\n卢纯\t203905\nFAILED\t203906\nPhabricator\t203907\n始发车\t203908\n求发展\t203909\nPC6电视TV\t203910\n优车客\t203911\n月魄\t203912\n第19集\t203913\n暗影精灵3\t203914\nPlaza\t203915\nORACLE数据库\t203916\n紫癜\t203917\
n出入站\t203918\n希尔瓦娜斯\t203919\n大连图书馆\t203920\n9810\t203921\n吕氏春秋\t203922\nnian\t203923\n广佛智城\t203924\n北京大成\t203925\n吴瑛\t203926\n四川消防\t203927\n天琴\t203928\n纸塑\t203929\n林夕孙杨\t203930\n47001\t203931\n佐佐木艾丽\t203932\n雅康\t203933\n征帆\t203934\n看风云\t203935\njavaw\t203936\n痛经_\t203937\n哈多\t203938\nMX500\t203939\n收纳盒\t203940\n官方版\t203941\n阴沟\t203942\n载运\t203943\n文萃\t203944\n24册\t203945\n此曲\t203946\n司索工\t203947\n上海六院\t203948\n大野狼\t203949\n足浴盆\t203950\nRobtex\t203951\n童装\t203952\n衡水学院\t203953\n死火山\t203954\nrhel6\t203955\nCoreOS\t203956\nplant\t203957\n合乘\t203958\n布莱斯\t203959\ndoy\t203960\n回事_妈妈网\t203961\n出栏\t203962\n座便器\t203963\n橘园\t203964\nBT+\t203965\n怒江北街\t203966\n火炬开发区\t203967\n充分必要条件\t203968\n佣工\t203969\nstdafx\t203970\nGalen\t203971\n别有洞天\t203972\n连续\t203973\n依玛壁挂炉\t203974\n星际争霸2\t203975\n10多年前\t203976\n吉艾科技\t203977\n高固\t203978\nlol.52pk.com\t203979\n妄\t203980\n卓\t203981\n相似度\t203982\n南坑村\t203983\n3通\t203984\n魔锤\t203985\nages\t203986\n郑州市人民政府办公厅\t203987\n卡拉卡拉\t203988\n碳税\t203989\n丢面子\t203990\n150公斤\t203991\n很大声\t203992\n段姓\t203993\n瑞典大学\t203994\n乔高建\t203995\n奥林巴斯\t203996\n多重性\t203997\n武国\t203998\n弹珠传说\t203999\n980元\t204000\n打窝\t204001\n票据背书\t204002\nBreach\t204003\n李佳易中天\t204004\n南京农业大学工学院\t204005\n桜花\t204006\n亚瑟潘德拉贡\t204007\n有些人\t204008\n20150520\t204009\n吃水不忘挖井人\t204010\n老k\t204011\n酷哥\t204012\n新西兰航空\t204013\n郁金香花\t204014\n银黄\t204015\n起茧\t204016\n虹桥社区\t204017\n食醋\t204018\n817路\t204019\n殺\t204020\n七宫\t204021\n憋闷\t204022\nv11.1\t204023\n20140214\t204024\n中篇\t204025\ncss2\t204026\n非营运\t204027\n三老\t204028\n福建省发改委\t204029\n刘家峡\t204030\n第五关\t204031\n遇人不淑\t204032\n俗哥\t204033\nW3School\t204034\nSteve\t204035\n乳腺增生\t204036\n利君\t204037\n换片\t204038\n丑哭\t204039\n老奇骏\t204040\n天津一中心\t204041\n天在下雨我在想你\t204042\n刺绣机\t204043\n黑龙江省政协\t204044\n你的错\t204045\n信息员\t204046\ncursors\t204047\nPMAC\t204048\n魁北克省\t204049\n诺贝尔化学奖\t204050\n戏题\t204051\n3平方\t204052\n颜丹晨\t204053\n三明政府网\t204054\n华为武汉研究所\t204055\n侠客网\t204056\n生物试剂\t204057\n内丘\t204058\nhbv\t204059\n喷火龙\t204060\n魏华刚\t204061\nethic\t204062\n宝齐莱\t
204063\n液相色谱法\t204064\n中建钢构有限公司\t204065\n苍银\t204066\n2018年5月19日\t204067\nglue\t204068\nlga\t204069\n漾濞县\t204070\n次位\t204071\niba\t204072\n疱疹\t204073\n膏药\t204074\niphone中文网\t204075\n恒大悦龙台\t204076\n冷夜\t204077\n肠虫\t204078\nKwan\t204079\n合口\t204080\n在职研究生考试网\t204081\n甲状\t204082\n寒月\t204083\n进职\t204084\n信息学奥赛\t204085\n金可儿\t204086\n译员\t204087\n路线\t204088\n花莲县\t204089\n动听968\t204090\n煲仔\t204091\nttp\t204092\ntab\t204093\n补完\t204094\n安徽职业技术学院\t204095\n返老还童\t204096\n擦玻璃机器人\t204097\n灰点\t204098\n消化器官\t204099\n一九四三\t204100\nPONY\t204101\n偏执狂\t204102\n用刷\t204103\n相逢何必曾相识\t204104\n张思\t204105\n哈尔滨平房区\t204106\nouts\t204107\n残雪\t204108\nPXW-FS5\t204109\n广西民族大学相思湖学院\t204110\ntle\t204111\n神龙谷\t204112\nBOSE\t204113\n自愈力\t204114\nB10\t204115\n保俶塔实验学校\t204116\n富乐\t204117\n金袖\t204118\n分餐制\t204119\n撒贝林夕\t204120\nPayoneer\t204121\n金色梦乡\t204122\n铸币\t204123\nHcash\t204124\n王福生\t204125\n曾晖\t204126\n研磨棒\t204127\n热河南路\t204128\nupgrades\t204129\n10.26\t204130\n编审\t204131\n第21关\t204132\n各页\t204133\n楚雄市\t204134\n书业\t204135\n池头\t204136\n一个星期内\t204137\n元素周期律\t204138\n勾花网\t204139\nTNA\t204140\n面筋哥\t204141\n中级会计实务\t204142\n禽流感病毒\t204143\n比特鱼BT-全集BT种子|BT下载|BT\t204144\nl430\t204145\nhomework\t204146\n本线\t204147\n紫圣\t204148\n好玩儿\t204149\nBlizzard\t204150\n600704\t204151\nWinRAR\t204152\n韩刚\t204153\nqiyue\t204154\nJK全球购\t204155\nt0\t204156\n中央日报\t204157\n领势\t204158\n八句\t204159\n梯形\t204160\n专插\t204161\nURL传参\t204162\nght\t204163\n绿珠\t204164\nshougou\t204165\nobs弹幕助手\t204166\n505号\t204167\n超过5年\t204168\nSingh\t204169\n宪法学\t204170\n算卦\t204171\nt波\t204172\n上海浦东机场\t204173\n列宾\t204174\n唐山\t204175\nd3dx9_42.dll\t204176\n半妖\t204177\n漫射\t204178\nAnnual\t204179\n高纯度\t204180\n医谷\t204181\nClan\t204182\n原体\t204183\nパンツ\t204184\n干变\t204185\n五福\t204186\ncnas\t204187\n注射式\t204188\n雪机\t204189\n公安部第三研究所\t204190\narctime\t204191\n途游\t204192\n房间隔缺损\t204193\n八宿\t204194\n家家乐\t204195\n登高证\t204196\nHGH\t204197\n合盛硅业\t204198\n卓悦\t204199\n为善\t204200\n红椿\t204201\n每点\t204202\n通济街道\t204203\n黄璐\t204204\n攻墓\t204205\n龙爪\t204206\n第八周\t204
207\n16.7\t204208\nacre\t204209\nCircus\t204210\n路肩\t204211\n守夜人\t204212\n安徽新华学院\t204213\n傅军\t204214\n114.3\t204215\n意式风情街\t204216\n2018.4.21\t204217\n黄河水利职业技术学院\t204218\n66亿\t204219\n中钢集团\t204220\n晒晒\t204221\n景德镇政府网\t204222\n第76章\t204223\n何江\t204224\n优学\t204225\n墨迹\t204226\n矮生\t204227\n春丽\t204228\n小儿肺热咳喘颗粒\t204229\n楚才\t204230\n指派\t204231\nyct\t204232\nChemdraw\t204233\n芭提雅红灯区\t204234\n原学\t204235\n笔写\t204236\n复仇者联盟3:无限战争_\t204237\n永泰县\t204238\nsetp\t204239\n增值电信业务许可证\t204240\n20150401\t204241\n碳炉\t204242\nDigestive\t204243\n国民经济行业分类\t204244\n崔嵬\t204245\n冥神\t204246\n淡忘\t204247\n银狐\t204248\nPHPExcel\t204249\n大榭岛\t204250\n精武英雄\t204251\n深圳求艺网\t204252\n苹果电脑\t204253\n类形\t204254\n38940983\t204255\n90毫升\t204256\n盖骨\t204257\n前洲\t204258\n海兔\t204259\n长安三怪探\t204260\nM13\t204261\n全建筑\t204262\n博联\t204263\n低小散\t204264\n停暖\t204265\n代言\t204266\n谱例\t204267\n公孙丑\t204268\nhydrogen\t204269\n天津国\t204270\n以暴制暴\t204271\n选项卡\t204272\n很暧昧\t204273\n砂石\t204274\n魂玉\t204275\n810\t204276\n富春路\t204277\n骐达\t204278\n喻\t204279\n椎名林檎\t204280\n银信类\t204281\nguofu\t204282\n益策\t204283\n若\t204284\n刘金\t204285\n洁面慕斯\t204286\n五分钟\t204287\narteon\t204288\n广东药科大学附属第一医院\t204289\nv4.2.2\t204290\n广东德普龙建材有限公司\t204291\n北京三甲医院\t204292\nblobs\t204293\n华印医疗\t204294\n凉面\t204295\n公安基础知识\t204296\n农药学\t204297\n马口镇\t204298\n女同性恋\t204299\n总集篇\t204300\n一大盆\t204301\ntitan\t204302\n旭阳\t204303\nLacie\t204304\n尤果网\t204305\n盗版三国志\t204306\n弗罗多\t204307\n再秀\t204308\n条纹状\t204309\nBuddy\t204310\n直方\t204311\n重庆市妇幼保健院\t204312\nactin\t204313\n执法\t204314\n登贝莱\t204315\n煋\t204316\namcl\t204317\n雅化集团\t204318\nMM们\t204319\n法苑\t204320\n永昼\t204321\n张充\t204322\n流光\t204323\n上下面\t204324\n华南理工大学网络教育学院\t204325\n上海绿地申花足球俱乐部\t204326\n2009-2016年\t204327\n掉眼\t204328\n桂花村\t204329\n加摩尔\t204330\n超现实\t204331\nSee\t204332\nzachary\t204333\n杨大明\t204334\n布丁\t204335\n法制员\t204336\n铁扣\t204337\n体验者\t204338\n山南新区\t204339\n受伤\t204340\n地锦\t204341\n金兵\t204342\n1451\t204343\n放慢\t204344\n绝缘杆\t204345\n3X畅玩版\t204346\n燕小六\t204347\nQuo\t204348\n平板电子书网\t204349\nEliza\t204350\
n茶餐\t204351\n精于\t204352\n兽血沸腾\t204353\n泰和新材\t204354\n4399弹弹堂\t204355\n活性炭处理\t204356\n逍\t204357\nYY直播\t204358\n五矿地产\t204359\n灯笼袖\t204360\n上海医疗\t204361\nLimousine\t204362\n柳州银行\t204363\n+3\t204364\n三星s\t204365\nIndividual\t204366\n海南省省\t204367\n沈周\t204368\n江西省粮食局\t204369\n傲雪寒梅\t204370\n72家\t204371\n西域都护府\t204372\n仓费\t204373\n木王\t204374\ndendi\t204375\n市情_郴州政府\t204376\n某一点\t204377\n2017年06月\t204378\n国家版权局\t204379\n战友团\t204380\n1.27\t204381\nNegotiation\t204382\n创新创业\t204383\n旅行车\t204384\n格林豪泰\t204385\n胸背\t204386\n学社\t204387\n几何图\t204388\n心无旁骛\t204389\n老奶奶\t204390\nGTX960m\t204391\nhjc\t204392\n团费\t204393\n英姿\t204394\nGFF\t204395\n白珍熙\t204396\n长荣海运\t204397\netc\t204398\n边城镇\t204399\n大塘村\t204400\n住房公积金管理中心\t204401\n董颖\t204402\n口袋妖怪单机版\t204403\nncb\t204404\n≮\t204405\nTLD\t204406\n床文\t204407\n雷霆宙域\t204408\n43种\t204409\n纳税筹划\t204410\n相看两不厌\t204411\n四川经济网\t204412\n广洲\t204413\n璞樾\t204414\n热水机\t204415\n太阳能热水器\t204416\n双因素方差分析\t204417\n鱼钩\t204418\n十二代\t204419\n玉湖镇\t204420\n京韵大鼓\t204421\n周岁照\t204422\nrayban\t204423\n弯线\t204424\nwakelock\t204425\n2018年03月12日\t204426\n三原穗香\t204427\n平整度\t204428\n网络电视节目\t204429\n红原县\t204430\npma\t204431\n伊达\t204432\n八万吨\t204433\n背景素\t204434\n近百人\t204435\n郑州市高新区\t204436\n小説\t204437\nav2017\t204438\n天津市人力资源和社会保障局\t204439\n中观\t204440\n版花\t204441\n洋白菜\t204442\n加墨\t204443\n热床\t204444\n卡梅拉\t204445\nACE\t204446\n搜房网\t204447\nlimit\t204448\n钱币\t204449\n兵线\t204450\n方先觉\t204451\n爱荷华大学\t204452\n洛浦\t204453\n甘肃省环保厅_甘肃省环境保护厅\t204454\n郑州市审计局\t204455\nOUTLET\t204456\n尸横遍野\t204457\n孔宣\t204458\n枣仁\t204459\nrca\t204460\n满婷\t204461\n水力\t204462\n1.7.5\t204463\n有活力\t204464\n大连市人才服务中心\t204465\ngpedit.msc\t204466\n标错\t204467\n牛玉儒\t204468\n房地产联合网\t204469\n杰顿\t204470\n冷冲模\t204471\nPN\t204472\n中冶\t204473\n旁通阀\t204474\n孔繁森\t204475\n万国社区\t204476\n折点\t204477\n尼桑天籁\t204478\n五佳\t204479\n黑豹乐队\t204480\nlangmuir\t204481\n43元\t204482\nCamtasia\t204483\n苏墨\t204484\nSEKAI\t204485\n备注名\t204486\n轧花\t204487\n磴口县\t204488\n菽\t204489\n快跑\t204490\n龙珠战士Z\t204491\n钱程\t204492\n锆合金\t204493\n一生一世\
t204494\ntabbed\t204495\n骑马射箭\t204496\n潞城镇\t204497\n紫藤园\t204498\n追亡\t204499\nMeta分析\t204500\n大连美琳达妇儿医院\t204501\n圣罗兰\t204502\n舒适版\t204503\nThread类\t204504\n医生集团\t204505\n公共绿地\t204506\n炮兵\t204507\n一年之计在于春\t204508\n骨甲\t204509\n白光莹\t204510\n巡逻\t204511\n刺秦秘史\t204512\nhilfiger\t204513\n松岭路\t204514\nenabler\t204515\n2012R2\t204516\nhtonl\t204517\n戈登\t204518\ndxgi\t204519\n应城\t204520\n博洛尼亚大学\t204521\n奇山\t204522\n毛细管电泳\t204523\n别来\t204524\n热轧机\t204525\n同位\t204526\nAbraham\t204527\n机器人大\t204528\n故乡\t204529\n更持久\t204530\n时尚金鹰网\t204531\n斑鬣狗\t204532\n医妻\t204533\n天津地区\t204534\n酰胺\t204535\n起凡群雄逐鹿\t204536\n模块术\t204537\n丰满\t204538\n专干\t204539\n提问\t204540\n世界自闭症日\t204541\n郭盈恩\t204542\n奥古雷\t204543\n全析\t204544\nInterracial\t204545\nii23\t204546\n钦州吧微社区\t204547\n绅宝x25\t204548\n上映\t204549\n过路过桥费\t204550\n000996\t204551\n名鉴\t204552\n汤河\t204553\n吴泉锡\t204554\n高仪\t204555\n黄成义\t204556\n192.168.2.0\t204557\n圣安德鲁斯大学\t204558\nm22\t204559\nningbo\t204560\n小秘方\t204561\n私盐\t204562\n扭摆法\t204563\ndita\t204564\nOrgan\t204565\nweiyi\t204566\nKnitting\t204567\naodi\t204568\n第六名\t204569\neve-ng\t204570\nDeutschland\t204571\nHDC\t204572\nXNXX\t204573\nAdults\t204574\n尸案调查科\t204575\n离心风机\t204576\n猪群\t204577\n禅雅塔\t204578\n赣县\t204579\n馆法\t204580\n辨别\t204581\nAddition\t204582\n谷歌母公司\t204583\n针型\t204584\n10K\t204585\n岁月蹉跎\t204586\n包进\t204587\n劲牌有限公司\t204588\n云林县\t204589\nNANA\t204590\n力高地产\t204591\n许磊\t204592\n东胜之窗\t204593\n6259\t204594\n坪山街道\t204595\n古代\t204596\n锁舌\t204597\n@media\t204598\n银河期货\t204599\n2016年12月31日\t204600\n红钢城\t204601\n乒乓球馆\t204602\n而且\t204603\n观澜湖\t204604\n歧化\t204605\n第72期\t204606\n哪般\t204607\n莫兰迪\t204608\nuGet\t204609\n比比琼斯\t204610\n二进制数\t204611\n典典\t204612\n12m\t204613\nWien\t204614\nsnort\t204615\n步非烟\t204616\n知乎大V\t204617\n义诊\t204618\noptus\t204619\n制药工程专业\t204620\n广东省文化厅\t204621\n瓶胚\t204622\nspanish\t204623\naee\t204624\nideapad\t204625\n魔兽世界猎人\t204626\nopeng\t204627\n代数\t204628\n雅虎网\t204629\n横折\t204630\n丹江口库区\t204631\n随风飘摇\t204632\nldk\t204633\n蓝瓶\t204634\n小可爱\t204635\n故去\t204636\nonl
ine2\t204637\n北京燃气\t204638\n挤牙\t204639\n太行山脉\t204640\n路工\t204641\nliding\t204642\n8图\t204643\nprintf函数\t204644\n韩非子\t204645\n心切\t204646\nallocate\t204647\nDavichi\t204648\n98拳皇\t204649\n同业存单\t204650\n迅疾如风\t204651\nMX\t204652\n7.3\t204653\n分身版\t204654\n尖塔\t204655\n永泽\t204656\n哥俩\t204657\n多帧\t204658\n同业拆借利率\t204659\n南昌万达乐园\t204660\n记住乡愁\t204661\n和平村\t204662\n独来独往\t204663\nSLR/DSLM论坛\t204664\nComm\t204665\n慢性病\t204666\n几起\t204667\n七十二\t204668\n两胎\t204669\n李解\t204670\nNo2\t204671\n有把握\t204672\n7起\t204673\n得力\t204674\n源盛\t204675\ndump.rdb\t204676\n泰岳\t204677\ncritic\t204678\n改善型\t204679\n植物学家\t204680\n20180122\t204681\n堕神\t204682\nfirefly\t204683\n韩叶\t204684\ntraumatic\t204685\nloaded\t204686\n6大酒店\t204687\n药师证\t204688\n冷凝管\t204689\n周连华\t204690\n致病菌\t204691\ng4d1\t204692\nVT-x虚拟化\t204693\n均瑶\t204694\n倾品加盟网\t204695\n酸中毒\t204696\n蓝风铃\t204697\n20ms\t204698\n波音747\t204699\nwegene\t204700\n非税收入\t204701\n女单\t204702\n问号处\t204703\n机械硬盘\t204704\n澳門\t204705\nps动图\t204706\ntoni\t204707\nBerg\t204708\n二〇一七年\t204709\nantibiotic\t204710\n迅雷U\t204711\n王晓方\t204712\nrapid\t204713\n艾媒咨询\t204714\n十二套\t204715\n相乘\t204716\nnvarchar2\t204717\n那一代\t204718\n太肥\t204719\n45周年\t204720\n1.77G\t204721\n冰格\t204722\n爱爱医医学网\t204723\n华阳小学\t204724\n跳格\t204725\nzgy\t204726\nmybaits\t204727\nK70RGB\t204728\n刘淑英\t204729\nAeronautics\t204730\n缬沙坦分散片\t204731\n河青网\t204732\n0.04\t204733\n上财统管学院\t204734\n徐清\t204735\n准东\t204736\n形体\t204737\n檫\t204738\n晒房网\t204739\nHulu\t204740\nspringer\t204741\n熊熊燃烧\t204742\n教管\t204743\n黑龙江省国土资源厅\t204744\n开审\t204745\n代课老师\t204746\n林月如\t204747\n涌\t204748\nassets\t204749\n真灵九变\t204750\nHelloBike\t204751\n拳皇97OL\t204752\n校正\t204753\n朝日啤酒\t204754\n马丽华\t204755\n金旺\t204756\n商业片\t204757\nltp\t204758\n国家公务员局\t204759\n资兴市政府\t204760\n朱建\t204761\n春茧\t204762\n三幻神\t204763\n手肿\t204764\n腾达大厦\t204765\n扎克斯\t204766\nnightmare\t204767\n攻守\t204768\n春日游\t204769\nBulgari\t204770\n总冠军\t204771\n文类\t204772\n麻城\t204773\n对证\t204774\n吉莲\t204775\n蟹壳\t204776\n前两小时\t204777\n一大袋\t204778\n17133.1\t204779\np_\t2
04780\n太浪漫\t204781\n绝世龙蛇演义\t204782\n潘长江\t204783\n景德镇学院\t204784\n穷凶极恶\t204785\nGrails\t204786\n17年后\t204787\n理由\t204788\n宋小宝\t204789\nOpenAL\t204790\n既得利益\t204791\n气炮\t204792\n天龙八部丐帮\t204793\n我乐意\t204794\nIP详解\t204795\n思茅\t204796\n白痴\t204797\nneu\t204798\n2x+1\t204799\n卡本\t204800\n布吉河\t204801\n金孔雀轻轻跳\t204802\n佛跳墙\t204803\n420\t204804\nD3400\t204805\n广轻\t204806\n9.2分\t204807\n为零\t204808\n食用\t204809\n4.6.3\t204810\n无量山\t204811\n操作时\t204812\n司汤达\t204813\n附着\t204814\n郑州东\t204815\n偏好\t204816\n腿肠\t204817\n动产\t204818\n孙颖莎\t204819\n初始化为\t204820\n收费机\t204821\n哨声\t204822\ndd\t204823\n第八套\t204824\n张璞\t204825\n嘉兴小学\t204826\nhusky\t204827\n顺风耳\t204828\n百度网盘高清1080P\t204829\nLeadership\t204830\n深海水族馆\t204831\n邓煌\t204832\n正颌手术\t204833\n金普新区\t204834\n刘德海\t204835\n蛋卷头\t204836\nntsd\t204837\n薄荷膏\t204838\n济宁政府网\t204839\n处对象\t204840\n口板\t204841\n韶关高铁站\t204842\n年休假\t204843\n绿化贷\t204844\n市农机局\t204845\n示儿\t204846\nRecover\t204847\n蜕\t204848\n拌料机\t204849\n瑞者\t204850\n7.28\t204851\n联想网御\t204852\n京山轻机\t204853\n登高\t204854\nEvernote\t204855\n满月照\t204856\n氟斑牙\t204857\n阳光苑\t204858\naiohttp\t204859\n无限秦海璐\t204860\n神经刀\t204861\n吴国华\t204862\n陈哲\t204863\n物理防晒霜\t204864\n2018年2月10日\t204865\nLED光源\t204866\n分镜\t204867\n悦动无敌剑域\t204868\n真笔网\t204869\n按摩器\t204870\n光导\t204871\n日程管理软件\t204872\n合肥研究院\t204873\n高蛋白质\t204874\n经三路\t204875\n朱广权\t204876\n日丰管\t204877\n前程网\t204878\n气电\t204879\n无靠\t204880\n杂填土\t204881\n2018年03月13日\t204882\n赵俊良\t204883\n归园田居\t204884\n曙光西里\t204885\n均衡化\t204886\n下放\t204887\n深圳能源集团股份有限公司\t204888\n第四回\t204889\nonmousemove\t204890\n王端端\t204891\nMOBILE\t204892\n上海通用\t204893\n精灵王座\t204894\n抽打\t204895\n灵魂摆渡黄泉\t204896\nDEDECMS\t204897\n曲根\t204898\n奔驰俗人回档\t204899\n一穷二白\t204900\n破极兵刃\t204901\ncentOS\t204902\n扁桃体肥大\t204903\n碎尸案\t204904\n青学\t204905\n玉渊潭\t204906\n三审\t204907\nflightgear\t204908\n西南角\t204909\n美思奇\t204910\n石大胜华\t204911\nlibuuid\t204912\nTips\t204913\n复读\t204914\n神医嫡女\t204915\n红标\t204916\n中华联合财产保险公司\t204917\n活动课\t204918\n云海玉\t204919\n64365.com\t204920\n哈尔滨医院\t204921\n盐酸氟桂利嗪胶囊\t204922\n葡萄干\
t204923\n挤塑聚苯板\t204924\n军票\t204925\n串接\t204926\n李红良\t204927\n疾厄宫\t204928\n1.5万亿元\t204929\n特拉卡\t204930\n震泽古镇\t204931\n高耗\t204932\n857\t204933\n1.0.20\t204934\n霍去病\t204935\nsupermini\t204936\n门扇\t204937\ngentleman\t204938\n南京军区南京总医院\t204939\n马尔克斯\t204940\n热泵式\t204941\n贝克\t204942\n桂林中学\t204943\n祺贵人\t204944\n彩椒\t204945\n茶叶店\t204946\n生脉饮\t204947\n小公寓\t204948\n尖儿\t204949\n凉山\t204950\nDestination\t204951\n2017-11-30\t204952\nreseller\t204953\n002466\t204954\njenson\t204955\nBDRIP\t204956\n卡氏\t204957\npk\t204958\n五大湖\t204959\n三先\t204960\n络筒\t204961\n落格\t204962\nstructures\t204963\n财务管理专业\t204964\nvirmach\t204965\n深蹲\t204966\n逍遥情缘吧\t204967\n葵千恵\t204968\n0750\t204969\nYGO\t204970\ncrimson\t204971\n清炭ハリケ\t204972\n切断机\t204973\n机载雷达\t204974\n太平山顶\t204975\ncrimes\t204976\n棒料\t204977\n裸露\t204978\n10\t204979\n粗鄙\t204980\n拓浚\t204981\n全国人民代表大会_中国人大\t204982\nU宝\t204983\n转阴\t204984\n3.02\t204985\n谢娜三毛\t204986\n購\t204987\n郭庄\t204988\n衡阳市人力资源和社会保障局\t204989\n品牌建设\t204990\n高渗性脱水\t204991\n合肥站\t204992\n王一凡\t204993\n南长街\t204994\n赤子丹心\t204995\n华巨\t204996\n十五世纪\t204997\n太学\t204998\nbytearray\t204999\n全国征兵网\t205000\n人力资源社会保障网\t205001\n4000_\t205002\n亿发\t205003\n中央民族干部学院\t205004\n晕倒\t205005\nbark\t205006\nmist\t205007\n毁谤\t205008\n6阶\t205009\n第十七季\t205010\n中药一号网\t205011\n业绩\t205012\n薄脆\t205013\nSkating\t205014\n自然基金\t205015\n厦门中山医院\t205016\n艾肯4nano\t205017\n优先股\t205018\n踏踏\t205019\npid\t205020\n大利嘉城\t205021\nSharks\t205022\n中关村三小\t205023\n票务员\t205024\n波音777-300ER\t205025\n康乐小区\t205026\n天福茗茶\t205027\n矽品\t205028\n妙味\t205029\n刘青\t205030\n不干胶贴标机\t205031\n圆环图\t205032\n齐心合力\t205033\n背柜\t205034\n烽皇\t205035\n开心播播网\t205036\nBINGBIAN病变\t205037\n青青青\t205038\n10050\t205039\n中山六院\t205040\n德妃紫苏\t205041\ninstallation\t205042\n品面\t205043\nSteak\t205044\n一览_乐游网\t205045\n上海日立\t205046\n3999\t205047\nScraping\t205048\nirobot\t205049\n母星\t205050\n600401\t205051\n南京高科\t205052\n南星\t205053\nfma\t205054\n益西彭措堪布\t205055\n百折\t205056\n6千万\t205057\n中海九号公馆\t205058\n比较器\t205059\n出生年月\t205060\n69页\t205061\n尤夫股份\t205062\n歙县人民政府\
t205063\n蒲扇\t205064\n景气度\t205065\nCLUB\t205066\n逻辑函数\t205067\n广宣\t205068\n重茬\t205069\n海南经贸职业技术学院\t205070\nエリ\t205071\n失球\t205072\n矛\t205073\n王亚彬\t205074\n红白歌\t205075\n空中客车A380\t205076\n五日\t205077\n茶叶末\t205078\n分析化学\t205079\n走火入魔\t205080\n20分\t205081\n烟台大学法学院\t205082\n乔装\t205083\n惠州在线\t205084\n人人皆知\t205085\n2017年9月14日\t205086\n爱扬\t205087\n调节器\t205088\n科普篇\t205089\n西悉尼大学\t205090\ni2\t205091\n秤\t205092\nWESCOM\t205093\n克而瑞\t205094\n陆川县\t205095\n湛江东海岛\t205096\n杜海涛\t205097\n伴君\t205098\n赵统\t205099\n库仑定律\t205100\n磁力吧\t205101\n欧盟\t205102\nJ-Link\t205103\n优泌乐\t205104\n原机\t205105\n大连松下\t205106\n烈度\t205107\n玉清\t205108\n上海世纪公园\t205109\n张永明\t205110\nhr\t205111\n东涌\t205112\n66376963\t205113\nOG2\t205114\n姚伟\t205115\ngcs\t205116\n鸣梁海战\t205117\n亮红灯\t205118\n币币\t205119\n谢丽尔\t205120\n国庆节\t205121\n杭州海康威视数字技术股份有限公司\t205122\n牛奶哥\t205123\n中国科学院学部\t205124\n红岩魂\t205125\n嘉华\t205126\n卡农\t205127\n疏松\t205128\n预混料\t205129\n178激战2\t205130\n2992\t205131\n石墨烯气凝胶\t205132\n搜宝\t205133\nCBETA\t205134\n求神\t205135\n木目\t205136\n胸锁乳突肌\t205137\n寸心\t205138\nterrariape\t205139\nug6.0\t205140\n奥浩哉\t205141\nhh24\t205142\n251号\t205143\n纳斯达克\t205144\n瞬秒\t205145\n王者荣耀吕布吧\t205146\n天津日报\t205147\n王京\t205148\n刚刚开始\t205149\n刘珣\t205150\n若然\t205151\n古剑奇谭2\t205152\n永定路\t205153\n煤价\t205154\n河姆\t205155\n防松螺母\t205156\nELM327\t205157\n小东江\t205158\n希尔薇魔\t205159\nHarman\t205160\n征即退\t205161\njavabean\t205162\nButtons\t205163\n呼吸困难\t205164\n投产\t205165\n查查网\t205166\n民歌中国\t205167\nInvestingAnswers\t205168\n遨游浏览器\t205169\n_叩富网\t205170\n斧钺\t205171\nletv\t205172\nshader\t205173\n同型号\t205174\nJSBridge\t205175\n冷却水泵\t205176\n000832\t205177\n房产税\t205178\npretrained\t205179\n高级会计师\t205180\nrecycle1\t205181\n和弦谱_木木吉他网\t205182\n爱如潮水\t205183\nPXI\t205184\n创世战车\t205185\n恶魔战线\t205186\n育\t205187\n1.6t\t205188\n手赚之家\t205189\n郭浩\t205190\njekins\t205191\n曼龙\t205192\nMontblanc\t205193\n洗剪吹\t205194\nSoccer\t205195\n用火\t205196\n牛背梁国家森林公园\t205197\n尚冰\t205198\n崩坏3\t205199\n科学营\t205200\npsp\t205201\n永久性\t205202\n市发展和改革委员会\t205203\n神舌\t205204\n李学文\t205
205\n我的叔叔于勒\t205206\nrcc\t205207\n双黄连\t205208\n山魈\t205209\nSCENE\t205210\n封闭剂\t205211\n第59号\t205212\n雅奇\t205213\n聚酯纤维\t205214\n底座\t205215\n幼升小学区\t205216\n垂径\t205217\n5月18\t205218\n剖\t205219\n三两三\t205220\n山东大学齐鲁儿童医院\t205221\n神圣\t205222\nKota\t205223\n广州科学城\t205224\n蜡绳\t205225\n泰兰德酷路泽\t205226\n古典吉他独奏\t205227\nACS\t205228\n梦婷\t205229\n黄志伟\t205230\n第18集\t205231\n一字形\t205232\n李兴钢\t205233\n48千米\t205234\n国浩\t205235\n大坏蛋\t205236\n饭乐\t205237\n学不好\t205238\n亚达\t205239\ncfr\t205240\n中高艺\t205241\n字门\t205242\n笔记术\t205243\n投中\t205244\n40个小时\t205245\n0330\t205246\n56614093\t205247\n开斋节\t205248\n金刚石刀具\t205249\n长信基金\t205250\nKanban\t205251\n采暖散热器\t205252\n免安\t205253\n新新影院\t205254\n格鲁特\t205255\n5.6L\t205256\n百分之一\t205257\n羊圈\t205258\naplay\t205259\n茜公主\t205260\nsleeper\t205261\n20151002\t205262\n浇菜\t205263\n立春\t205264\nI9\t205265\n科里\t205266\n万国城\t205267\n意蕴\t205268\n南昌东湖区政府\t205269\n平桥区\t205270\n程昱\t205271\n挡箭牌\t205272\nSSQ\t205273\nsegmented\t205274\n1950s\t205275\nMarcel\t205276\n退办\t205277\n萌妹\t205278\n407号\t205279\n东环街\t205280\n宜修\t205281\n木驴\t205282\n乘务员\t205283\n黑牙\t205284\n美作文\t205285\nafreecatv\t205286\n560元\t205287\n无力感\t205288\n贸易保护主义\t205289\nchromatin\t205290\n雅格\t205291\n601600\t205292\n肝硬化论坛\t205293\n40千克\t205294\n踮脚\t205295\n轻享\t205296\n李炎\t205297\n自吸式\t205298\nVCS\t205299\n通知书\t205300\n温暖的家\t205301\n新华报业网\t205302\n滚蛋吧\t205303\n俪\t205304\n抚顺二中\t205305\n一字之差\t205306\nMgen\t205307\n花样滑冰锦标赛\t205308\n突入\t205309\nControls\t205310\nスリ\t205311\n961\t205312\n激活\t205313\n无骨\t205314\n变身术\t205315\n若琪\t205316\n河北艺术职业学院\t205317\n尾门\t205318\n世纪末\t205319\n北湖新闻网\t205320\n比什凯克\t205321\n一号机\t205322\n内蒙古伊利实业集团股份有限公司\t205323\nlogistic回归分析\t205324\n不列颠尼亚\t205325\nHypertension\t205326\n馅\t205327\n10余个\t205328\nサポ\t205329\n5g网络\t205330\n秦钟\t205331\nscale\t205332\n中国国务院\t205333\n武定\t205334\n精密过滤器\t205335\n镜象\t205336\n腾达\t205337\nmilla\t205338\n20160421\t205339\n娄\t205340\n十九党\t205341\n人机\t205342\n抱住\t205343\n不忘初\t205344\nAndroidStudio\t205345\n乐韵\t205346\n赢版\t205347\n18061期\t205348\n形式化\t2053
49\n汤姆柳传志\t205350\n肖艳\t205351\n谭咏麟\t205352\n日日日\t205353\n虹桥站\t205354\n返乡\t205355\n黄花\t205356\n仲裁申请书\t205357\n琼剧\t205358\n乌龙派出所\t205359\n132平米\t205360\nBeen\t205361\n可可\t205362\n农投\t205363\nmagnet\t205364\nTerrace\t205365\n优弹素\t205366\n字符串类\t205367\n鲜满\t205368\nhebernate\t205369\nDropDown\t205370\nGrafana\t205371\n辽宁省人民政府办公厅\t205372\n预告照\t205373\n燕尾榫\t205374\n萌购\t205375\n箍牙\t205376\n中山八路\t205377\n工具库\t205378\n16期\t205379\n炼铁\t205380\n迅雷u\t205381\nxueyuan\t205382\n纤毫\t205383\n九阴真经ol吧\t205384\n车墩\t205385\nDongle\t205386\n饱满\t205387\n960\t205388\n绿松\t205389\n连州地下河\t205390\n真胸\t205391\n32欧\t205392\n表兄弟\t205393\nLoops\t205394\n420马力\t205395\n涉猎\t205396\nkeyshot4\t205397\n扁平率\t205398\nmetallic\t205399\nMockingBot\t205400\n940mx\t205401\n沙钢股份\t205402\n20kV\t205403\n注入式\t205404\n叶文\t205405\nMAVIC\t205406\n教育部社科司\t205407\n香蕉树\t205408\n来出\t205409\n结篇\t205410\ncrom\t205411\nchapter\t205412\n东莞东华医院\t205413\n归口\t205414\nREG\t205415\n没看见\t205416\n48V20A\t205417\n经济法基础\t205418\n苏俄\t205419\ngx\t205420\n经纬度\t205421\n1.6g\t205422\n货商\t205423\nSQL2008R2\t205424\nbigfish\t205425\n中国电子学会\t205426\nbride\t205427\n王忠明\t205428\n向左\t205429\n古天\t205430\n拜倒\t205431\n人脸识别技术\t205432\n丁可\t205433\ntvt\t205434\n商合杭高铁\t205435\n暴乱\t205436\n属组\t205437\n奥飞娱乐\t205438\n横屏\t205439\n5-10分钟\t205440\n汉阳铁厂\t205441\n40007\t205442\n中子弹\t205443\n云闪付\t205444\n蜜饯\t205445\n海帅\t205446\n小新塘\t205447\n刘看山\t205448\n两项\t205449\n确保\t205450\n验货\t205451\n2016年12月28日\t205452\n我的童年\t205453\n钰\t205454\ndungeon\t205455\n凸版\t205456\n莉夏\t205457\n盂县\t205458\n黑蝶\t205459\ndnf鬼泣\t205460\npve宏\t205461\n2018年03月22日\t205462\n贸易项\t205463\n20141213\t205464\nxXx\t205465\n呱\t205466\nbeanlam\t205467\nUI设计规范\t205468\n32g\t205469\n鸿荣源\t205470\n牢固\t205471\n装修资\t205472\n4100\t205473\n回动\t205474\n小娘子\t205475\n外控\t205476\nsing\t205477\n温州市\t205478\n松阳\t205479\n腰梁\t205480\n职规\t205481\n酱香型酒\t205482\n三板\t205483\n东莞控股\t205484\n鲜香\t205485\n兴隆吧\t205486\nAcute\t205487\n私营企业\t205488\n惠山站\t205489\n荚果\t205490\n厦门小鱼网\t205491\n2198\t205492\n林花\t205493\n斑竹园\t20
5494\n美国商务部\t205495\n迷津\t205496\n牛顿环\t205497\n濉溪路\t205498\n上海1号\t205499\n中国民生银行信用卡中心\t205500\n十万八千里\t205501\nBSCI\t205502\n中环城\t205503\n20150418\t205504\n声势浩大\t205505\n25T\t205506\n失重感\t205507\n借方\t205508\n十二烷基磺酸钠\t205509\n上海宾馆\t205510\n1第三章\t205511\n差旅费\t205512\nkindle558\t205513\n威斯特\t205514\n三星I9500\t205515\n九莲山\t205516\n泰州职业技术学院\t205517\nctf\t205518\n肖凝儿\t205519\n上海市青浦区规划和土地管理局\t205520\n深度学习与自然语言处理\t205521\n中枢神经\t205522\n东阳市人民政府\t205523\n捣固\t205524\n金田起义\t205525\nmoegirl\t205526\npvdf\t205527\nSHA\t205528\n舒筋丸\t205529\n6&\t205530\n城东大道\t205531\n不做功\t205532\n滤清\t205533\n蓝桥杯\t205534\n2900\t205535\n原甲酸三乙酯\t205536\n天心\t205537\n嘉兴日报\t205538\n五进制\t205539\n曾攀\t205540\n雕龙\t205541\n缩合\t205542\n装饰管\t205543\nMano\t205544\n开瑞开瑞K702018款\t205545\n小溪流\t205546\n寇仲\t205547\n辩证关系\t205548\n送学\t205549\n咪达唑仑\t205550\n着眼\t205551\n张艳萍\t205552\nDorado\t205553\n厦门理工学院\t205554\n德加\t205555\n马睿菈\t205556\nb&o\t205557\n31卷\t205558\n课标版\t205559\n止汗剂\t205560\n炫迈口香糖\t205561\n哦我的皇帝陛下\t205562\n贺龙\t205563\n20151109\t205564\n供热\t205565\n茆诗松\t205566\nNON\t205567\n有时\t205568\n东华小学\t205569\nwwe2k18\t205570\n65英寸\t205571\n车务段\t205572\n广告歌\t205573\n风水画\t205574\n翼板\t205575\n白崇禧\t205576\n经理人\t205577\n麦家瑜\t205578\n泸县二中\t205579\n惠山经济开发区\t205580\nUne\t205581\n万维钢\t205582\n范县网\t205583\n家庭收支\t205584\ndamn\t205585\nWays\t205586\n固体催化剂\t205587\n宁夏回族自治区卫生和计划生育委员会\t205588\n想开通\t205589\n无复\t205590\n碳酸钙\t205591\n粘液腺囊肿\t205592\n一吻定情\t205593\n植皮\t205594\n党建读物出版社\t205595\ntlb\t205596\n卓非\t205597\nbordeaux\t205598\n中国关工委\t205599\n北京吉利学院\t205600\nADOBE\t205601\n镌刻\t205602\n四人斗地主\t205603\n注射室\t205604\n奔跑者\t205605\n丽水火车站\t205606\n吸乳器\t205607\nbars\t205608\n海科\t205609\n空气币\t205610\n南山寺\t205611\n多游网\t205612\n九步行街\t205613\n养羊场\t205614\n竹溪新闻网\t205615\n风生水\t205616\n湖南广播电视大学\t205617\n比喻词\t205618\n天天宝\t205619\nブラック\t205620\n化学工程专业\t205621\nrelevant\t205622\n国家理财规划师\t205623\n天之痕\t205624\n庸人\t205625\nutm\t205626\n美橙\t205627\n租售比\t205628\n马平\t205629\n林中小屋\t205630\n夜灯\t205631\n第02章\t205632\n西海都市报\t205633\nlamento\t205634\n国字号\t205635
\nStyle\t205636\n过奖\t205637\n2018年4月18日\t205638\n茅家埠\t205639\n乐视TV\t205640\n精版\t205641\n绍兴人民医院\t205642\nwin32api\t205643\n夜网\t205644\nFarcry5\t205645\n卸船机\t205646\n李志民\t205647\n70分贝\t205648\n第38届\t205649\n衡重式挡土墙\t205650\n深睿医疗\t205651\n翟鸿燊\t205652\n台州数字报\t205653\n装B\t205654\n邹俊\t205655\n欢然\t205656\nMirroring\t205657\n南京森贝伽生物科技有限公司\t205658\n本原\t205659\n白鸟美玲\t205660\n太原理工大\t205661\n搬厂\t205662\n李庄古镇\t205663\n五星村\t205664\n绍兴E网\t205665\n武修\t205666\n创信\t205667\nscroll-view_w3cschool\t205668\n荞西\t205669\n两非\t205670\ncesim\t205671\njuhe\t205672\n新桑塔纳\t205673\n山西大学商务学院\t205674\n红旗渠网\t205675\n买卖信\t205676\n张文雄\t205677\n滨湖区\t205678\nFoolish\t205679\nPPA源\t205680\n许你余生多欢喜\t205681\n天雨\t205682\n毕福周笔畅\t205683\n名侦探俱乐部\t205684\n负心汉\t205685\n七分钟\t205686\n梁小冰\t205687\n定城镇\t205688\n行办\t205689\n游社\t205690\n拖车杠\t205691\n显化\t205692\n电器柜\t205693\n新著\t205694\ni5-7500\t205695\n气管道\t205696\n陕天然气\t205697\npsx\t205698\n大魏笑忘书\t205699\n2017年7月18日\t205700\nNote4\t205701\nxun\t205702\n【美\t205703\n金鹰卡通\t205704\njavaAPI\t205705\n闪送员\t205706\n贺梓凝\t205707\n渡难关\t205708\nbackground\t205709\n花友\t205710\n三星S8/S8+\t205711\n骏派A50\t205712\n副职\t205713\nsari\t205714\n隽永\t205715\n新浪潮\t205716\n高平\t205717\n海口龙华区政府\t205718\n白落衡\t205719\n老边\t205720\n虹彩\t205721\nmusically\t205722\nhmac\t205723\ninevitable\t205724\n娇嗔\t205725\n人工晶体\t205726\n许世友\t205727\n0202250\t205728\n魔猿\t205729\n琼林\t205730\n坎\t205731\n巽宫\t205732\n6mg\t205733\n150056\t205734\n槽罐\t205735\n闽菜\t205736\n弄人\t205737\n金刚石砂轮\t205738\n初雨\t205739\nPiercing\t205740\n张梦雪\t205741\n姜玉武\t205742\n育种\t205743\n领衔主演\t205744\ninputType\t205745\n5.5_\t205746\n学生活\t205747\n淘宝村\t205748\n真三国无双7:帝国\t205749\n阳寿\t205750\n北华\t205751\n欧瑞博\t205752\n王建立\t205753\nigmp\t205754\n宫主星\t205755\n2.15\t205756\n歌帝梵\t205757\n经侦\t205758\nAdreno\t205759\ne百\t205760\n中国石油化工集团公司\t205761\n扁\t205762\n线卡\t205763\n仰卧板\t205764\n肉丁儿童网\t205765\nRAID10\t205766\n唯快\t205767\nhale\t205768\n布琳\t205769\n意式咖啡机\t205770\n15寸\t205771\n刷宝空包网\t205772\n十三分\t205773\n凹凸\t205774\n葵千惠\t205775\n绿色时报电子报\t205776\n日经\t2057
77\n牧业\t205778\n薛明媛\t205779\nlonewolf\t205780\n印茄木\t205781\nSTEEL\t205782\n88岁\t205783\n移液器\t205784\n1500年\t205785\n720P+1080P\t205786\n第1代\t205787\n170家\t205788\n德凯\t205789\nAutos\t205790\n壬基酚\t205791\n45#钢\t205792\n柑\t205793\n王小强\t205794\n条码\t205795\nsoapUI\t205796\n南京医药\t205797\n韩国银行\t205798\n农家菜\t205799\n全牛\t205800\n东海舰队\t205801\n淘帖\t205802\nisnan\t205803\n大会址\t205804\nlx\t205805\n山叶\t205806\n&rdquo\t205807\n21.4\t205808\n发型包\t205809\n#3\t205810\n主题歌\t205811\n高卢\t205812\nDTW\t205813\n韩老师\t205814\nVans\t205815\n碧玺\t205816\n透声\t205817\nSobel算子\t205818\n色漫画\t205819\n暖花\t205820\n玉华洞\t205821\n主抓\t205822\nNES游戏网\t205823\n权力的游戏第一季\t205824\n傲世中文网\t205825\n日神\t205826\nfirestorm\t205827\ndiversity\t205828\n场地\t205829\n雷霆海战\t205830\n影室\t205831\n認識\t205832\n压铸锌合金\t205833\nMOFs\t205834\n462\t205835\n卖掉\t205836\n横溪镇\t205837\n苏州南站\t205838\nTop10_\t205839\n棉毛\t205840\n黄金螺\t205841\n紫门\t205842\n上海八万人体育场\t205843\n教育信息化\t205844\nAndrews\t205845\n自夸\t205846\n半导体封装\t205847\n坐标拾取器\t205848\n沥青砼\t205849\n河北省住建厅\t205850\n四川省人大\t205851\n根骨\t205852\n诺诚\t205853\n1165\t205854\n本办法\t205855\n巢哥\t205856\n看不见的客人\t205857\n马宁宁\t205858\n百度站长社区\t205859\n力量感\t205860\n接拍\t205861\n金正恩文\t205862\n无锡公交查询网\t205863\nqprocess\t205864\n1024核工厂_bt工厂_1024核工厂\t205865\n1万亿美元\t205866\n补贴战\t205867\n3d播放器\t205868\nmaya2013\t205869\n王者荣耀空白名\t205870\nvga接口\t205871\nsurfacepro\t205872\n玻璃幕墙\t205873\n优惠码\t205874\n深渊\t205875\n2015届\t205876\n砖容\t205877\n碘酒\t205878\n敏力\t205879\n14个\t205880\n5870\t205881\n国赛\t205882\n时间变量\t205883\n品客薯片\t205884\n铭泽\t205885\n圆臀\t205886\n独立日\t205887\n赵小亮\t205888\n2802\t205889\n敬告\t205890\n半臂\t205891\n羽毛笔\t205892\n富华路\t205893\n滥伐\t205894\n汉初\t205895\n水幻\t205896\n半身不遂\t205897\n深圳市发展和改革委员会\t205898\n竞争者\t205899\n淘宝杂谈\t205900\n头门港\t205901\n收人\t205902\n森迪\t205903\n机力\t205904\n法人公司\t205905\n彩林\t205906\ni3-6100\t205907\n刘璇\t205908\n李恪\t205909\n参考文\t205910\nStretch\t205911\n第31届\t205912\n袁卫\t205913\n双线圈\t205914\n塔器\t205915\n【文明5吧\t205916\n经理级\t205917\n三板公司\t205918\nalbany\t205919\n前列腺痛\t205920\n空击\t205
921\n贴入\t205922\nGMO\t205923\nXMind\t205924\n三汇\t205925\n水合\t205926\n呼格案\t205927\nurlopen\t205928\n19路\t205929\n吹草动\t205930\n口袋妖怪VS\t205931\ncombox\t205932\n224xp\t205933\n工银瑞信基金\t205934\nfdr\t205935\n浮盘\t205936\n标签机\t205937\n养子\t205938\n藤校\t205939\n电影战神纪\t205940\nCVTE\t205941\nforwarder\t205942\n百度吉吉影音\t205943\n加深\t205944\n年票\t205945\n快速卷帘门\t205946\n太攀\t205947\n最后一道防线\t205948\n头脑王\t205949\n南河镇\t205950\n白云山风景区\t205951\n小蜜\t205952\n文道\t205953\n萌娘资源站\t205954\n龙羊峡\t205955\n花生米\t205956\n手机袋\t205957\n白驼山\t205958\n毫末\t205959\n传言\t205960\n小额担保贷款|无担保无抵押贷款公司\t205961\n双球菌\t205962\nparenthesis\t205963\nzww\t205964\nsimetrix\t205965\n跳灯\t205966\n自由之战2\t205967\nb1\t205968\n打捞\t205969\n叻\t205970\n28座\t205971\nyxx\t205972\n90dnf\t205973\nOLM\t205974\n粮油\t205975\ncardinality\t205976\ncool\t205977\n老镜\t205978\n雅虎\t205979\n4.61\t205980\n临床诊断\t205981\n闵行区对口小学\t205982\n大寨\t205983\n阿琪\t205984\nSqlServer2005\t205985\n教师法\t205986\n马迪\t205987\nMSP430F149\t205988\n保温箱\t205989\n小钟\t205990\n肖申克的救赎\t205991\n刘_\t205992\n特货\t205993\n郑州高新技术产业开发区\t205994\n大阳台\t205995\n厨嫂\t205996\n603444\t205997\n小津安二郎\t205998\n佳通轮胎\t205999\n线序\t206000\n结题\t206001\n谢霆锋\t206002\n针尾\t206003\n青丘\t206004\n舍勒\t206005\nkush\t206006\n血站\t206007\n欧亚学院\t206008\n永恒至尊\t206009\n吴桑\t206010\n区管委会\t206011\nJavLibrary\t206012\nCorrupt\t206013\n房地产E网\t206014\nHc360慧聪网\t206015\n安宝\t206016\n口袋妖怪go\t206017\n110亿元\t206018\n探侦\t206019\n牛皮席\t206020\n彭妈妈\t206021\n4x4\t206022\n吉吉\t206023\n北京比特大陆科技有限公司\t206024\nAndroid-Studio\t206025\n灵洞\t206026\n金龙路\t206027\n微软信仰中心\t206028\nsspanel\t206029\n安小兔\t206030\n余杭街道\t206031\n何人\t206032\n武汉十五中\t206033\n八十八\t206034\n包巾\t206035\n借款利率\t206036\n热血青春\t206037\n高距\t206038\n15万方\t206039\n戴利\t206040\n89757\t206041\n天坛\t206042\n凯恩之怒\t206043\n新都会\t206044\nCompliant\t206045\ncrazy\t206046\n仙宫\t206047\n奈非天\t206048\n出版费\t206049\n白玉堂\t206050\n问底\t206051\nalt+\t206052\nmanzhou\t206053\nLetoublon\t206054\n魔石\t206055\n中融基金\t206056\n98.0%\t206057\n尚磊\t206058\n戊午\t206059\nchexun.com-车讯网\t206060\n好兴奋\t206061\n王某某\t2060
62\n万家星城\t206063\n初编\t206064\n印奈儿\t206065\n5余\t206066\n芗城\t206067\nstalker\t206068\n岑宁儿\t206069\n1549\t206070\n千伏\t206071\nmongodb3.0\t206072\n豪迈科技\t206073\n四快\t206074\nServers\t206075\n蕴涵\t206076\n金光咒\t206077\nzebradesigner\t206078\n杨丽娟\t206079\nX3.1\t206080\n点金胜手\t206081\n嘉琪\t206082\n辨伪\t206083\n十五日内\t206084\n欢娱\t206085\n上海宝冶集团有限公司\t206086\n男刀\t206087\n幽幽\t206088\n乐其乐\t206089\n开题\t206090\n170530\t206091\nBD版\t206092\n江海西路\t206093\n七元\t206094\n厮守\t206095\n教科书\t206096\n作报告\t206097\n裸足\t206098\ntumbler\t206099\n水院\t206100\n95#\t206101\ndeclined\t206102\nExcel工作簿\t206103\n9696\t206104\nBrady\t206105\n喜明\t206106\n建程网\t206107\n罪色\t206108\nthumbs\t206109\n古城镇\t206110\nhbd\t206111\n秋叶茜\t206112\n加鲁鲁兽\t206113\n结结\t206114\n中新科技\t206115\n竹岔岛\t206116\n不收录\t206117\ncad吧\t206118\n颜苏\t206119\n赞赏\t206120\n程诚\t206121\n星湖湾\t206122\nNai\t206123\nランキング\t206124\n伦教镇\t206125\n鄞州日报\t206126\n幸福爱人\t206127\n奢华酒店\t206128\n杯赛\t206129\nXY散点图\t206130\n北京八大处\t206131\nAriba\t206132\n查表法\t206133\n圣经启示录\t206134\naging\t206135\n淀\t206136\n侧柏\t206137\n百分比法\t206138\n结帐\t206139\n20183\t206140\n爬塔\t206141\n300G\t206142\n按摩枕\t206143\n陈家庚\t206144\n全国妇联\t206145\n干尸\t206146\n80层\t206147\n8月26日\t206148\nEroge\t206149\ngenius\t206150\n鼎利\t206151\n周晓虹\t206152\nVNS\t206153\n泽风\t206154\n霸穹封神演义\t206155\n孽爱\t206156\nexistence\t206157\n领袖们\t206158\n5.8折\t206159\n明斯克号\t206160\n龙华街道\t206161\n茶馆\t206162\n超级维特拉\t206163\n昆明市人大常委会\t206164\n安之星\t206165\n几番\t206166\n刘刘\t206167\n仙魔录\t206168\n大别山\t206169\n牛熊\t206170\n淋漓\t206171\n382\t206172\nNum\t206173\n10000mah\t206174\n枯燥乏味\t206175\n革新\t206176\n爱上\t206177\n痛彻心扉\t206178\n力筋\t206179\nchenhuan\t206180\n东南大学出版社\t206181\n码表\t206182\n龙虎豹\t206183\n乐坛\t206184\n摸上去\t206185\n6511\t206186\n奉化区\t206187\n保价\t206188\n遂平县人民政府\t206189\nm1319f\t206190\nkuaijie\t206191\n双肩\t206192\n望花区\t206193\nanyoffice\t206194\n罗兰\t206195\n罗彩霞\t206196\n20160618\t206197\n第二十一期\t206198\nPokémon\t206199\n交通运输信息网\t206200\n绿地控股集团有限公司\t206201\n阿旭\t206202\n主轮\t206203\n小额支付\t206204\nN1\t206205\n许平君\t206206\nLEGO
LAND\t206207\n多加\t206208\n厦门新闻_海峡网\t206209\nh2os\t206210\n牛痘\t206211\n辽宁传媒学院\t206212\n增槎路\t206213\n惊艳\t206214\n河北省质量技术监督局\t206215\n20161126\t206216\nTPA\t206217\n显控触摸屏\t206218\n武式太极拳\t206219\nx-a\t206220\n中宏网\t206221\n福田站\t206222\n县域\t206223\n通服\t206224\nsoho\t206225\n十里平湖\t206226\n湖州市发展和改革委员会\t206227\nmav\t206228\n5天4晚\t206229\n竞拍\t206230\n致癌性\t206231\n游仙区\t206232\n酥皮\t206233\n3076\t206234\nFSM\t206235\n刘志平\t206236\n间质\t206237\n新闻界\t206238\nHISTORY\t206239\nwalkman\t206240\n本田XR\t206241\n高血脂\t206242\nPANEL\t206243\n2.5升\t206244\nticks\t206245\n结核菌素\t206246\n前童古镇\t206247\n瘦腿\t206248\n陕西服装工程学院\t206249\nf-152678\t206250\n第5层\t206251\n虚轴\t206252\nFLUX\t206253\n特罗凯\t206254\n紧固\t206255\n风行\t206256\nE宠\t206257\nAndroid源代码\t206258\n吊儿郎\t206259\n太阳病\t206260\nhimiko\t206261\n冲浪板\t206262\n安全载流量\t206263\n翘翘\t206264\n东航集团\t206265\n与日俱增\t206266\n眼动\t206267\n3亿多\t206268\n风忆雪\t206269\n心窗\t206270\nTamper\t206271\n阿浩\t206272\n加注解\t206273\ndysfunction\t206274\noutward\t206275\n三门县政府\t206276\n36w\t206277\n6.1级\t206278\n狗熊\t206279\nSCAR\t206280\n海燕\t206281\n日化新闻-138美业网\t206282\n漯河市郾城区人民政府\t206283\n易无休资源网\t206284\n版\t206285\n小聪明\t206286\n543680484\t206287\nFCD\t206288\n遏\t206289\n西藏自治区党委\t206290\nfairness\t206291\n86939779\t206292\n桡神经\t206293\n商标局\t206294\n荣威4s店\t206295\n南京外国语学校\t206296\n丈量\t206297\n97分\t206298\n省联社\t206299\n万川\t206300\n极距\t206301\n凯盛科技\t206302\nBeeline\t206303\nAMQP\t206304\nkpl王者荣耀\t206305\nPROFIBUS\t206306\n春夜宴从弟桃花园序\t206307\n天卡\t206308\n高压电工证\t206309\n打印稿\t206310\n拂面\t206311\n戎码之路\t206312\n有机肥料\t206313\n更可靠\t206314\nVLONE\t206315\n压损\t206316\nmysqll\t206317\n虞城\t206318\n七七八八\t206319\n学易网校\t206320\n绕口令\t206321\n1954\t206322\nBishop\t206323\nasinx\t206324\nmu\t206325\n诈捐门\t206326\n词序\t206327\n福建广电\t206328\n共青团员之歌\t206329\n出列\t206330\n现代战争5\t206331\n实干\t206332\n竖井\t206333\n弧焊机器人\t206334\n仁粹大妃\t206335\n星幕\t206336\n返签\t206337\n假体隆鼻\t206338\n申菱\t206339\n田蚡\t206340\n大众T-Roc\t206341\n完美搭档\t206342\n雨鸟\t206343\n党鞭\t206344\n皮定均\t206345\ncliff\t206346\n2K18\t206347\n校聘\t206348
\n蔡瑁\t206349\n包宏\t206350\n三国赵云传\t206351\nsinc函数\t206352\n直筋\t206353\ndahon\t206354\n综合伊人网\t206355\n元谋县\t206356\n中键\t206357\n查无此人\t206358\n0717\t206359\n食戟之灵吧\t206360\n2012款\t206361\nLodop\t206362\n第22季\t206363\n空气消毒机\t206364\n王昆\t206365\nmiya\t206366\n灰体\t206367\n农奴\t206368\n简快\t206369\n管野静香\t206370\nCATION2\t206371\n增霸卡\t206372\n溪\t206373\n波斯湾\t206374\n房地产抵押贷款\t206375\n迪亚哥\t206376\nmuxDevLoad\t206377\n盐丸\t206378\n夹\t206379\n中通快递公司\t206380\n北京中医药大学东方学院\t206381\n松尾芭蕉\t206382\n防爆电器\t206383\n稳扎稳打\t206384\n分项\t206385\n游戏室\t206386\nTCL集团\t206387\n佳宾\t206388\n三千\t206389\n2409\t206390\n方塔\t206391\nproface\t206392\n卡佩罗\t206393\n天然林\t206394\nHari\t206395\nhuoqu\t206396\n百丽宫\t206397\n韩赛尔\t206398\n邀请信\t206399\n单倍型\t206400\ndual\t206401\ngenuine\t206402\n姚家园路\t206403\n歌舞青春\t206404\n专享版\t206405\n友华\t206406\n紫苹果装饰\t206407\n外委\t206408\namount\t206409\n﹑\t206410\nsonatype\t206411\n重庆南方翻译学院\t206412\n6.5.19\t206413\n人身险\t206414\n智尊型\t206415\n最终幻想13\t206416\n新普\t206417\n1000giri\t206418\n5D\t206419\nxpo\t206420\n迅雷网游加速器\t206421\n石灰粉\t206422\n汽车新闻_12缸汽车网\t206423\nvideoview\t206424\n看不上\t206425\n自测题\t206426\nSarah\t206427\n光大银行信用卡中心\t206428\nHD800\t206429\n崩坏帝豪\t206430\n唉呀\t206431\nwith-http\t206432\n本初\t206433\n露地\t206434\n2100个\t206435\n会声会影9\t206436\ngriffith\t206437\n幻世录\t206438\n两棵树\t206439\nOvers\t206440\n钢筋\t206441\n起亚狮跑\t206442\n课外读物\t206443\n可吉网\t206444\n中核钛白\t206445\n20150828\t206446\n林平\t206447\n八爪鱼\t206448\n叶笙\t206449\n男2女\t206450\n博客群\t206451\n内画\t206452\n深圳企业信用网\t206453\n刘向\t206454\n删版\t206455\nCozy\t206456\ncef3\t206457\n浦发银行成都分行\t206458\n小天唐锦绣\t206459\n解决方\t206460\n2014.1\t206461\nnote8\t206462\nnba直播吧\t206463\n爬行类\t206464\nrods\t206465\nmoom\t206466\n鲨鱼鳍天线\t206467\n6.1班\t206468\n应力比\t206469\nAtlanta\t206470\nalfa\t206471\n速求\t206472\n乔丽\t206473\nhp1000\t206474\ny50bt\t206475\n100mm\t206476\n鞋料\t206477\n上海社会保险服务网\t206478\n小鲁\t206479\n连番\t206480\n微阵列\t206481\n评阅\t206482\n西樵镇\t206483\n218路\t206484\n瓦伦\t206485\n色驴\t206486\n没道理\t206487\n健身舞_播视网\t206488\n波黑\t206489\n职业生涯规划书\t
206490\nqt\t206491\n柴璐\t206492\n八星\t206493\n滤管\t206494\n百度无人车\t206495\n网络变压器\t206496\n有限空间作业\t206497\n原子侠\t206498\nhina\t206499\n下一\t206500\nAmazfit\t206501\n减少\t206502\n台头\t206503\n台城\t206504\n张亚涛\t206505\n抗暴\t206506\n宠物环\t206507\n雨水量\t206508\nwhirl\t206509\n军体拳\t206510\n人人乐超市\t206511\n法华镇路\t206512\n队尾\t206513\n训练裤\t206514\nleakcanary\t206515\n不拘\t206516\n水力发电厂\t206517\n凉露\t206518\nSUS304\t206519\n太上老君\t206520\n四斤\t206521\n卧龙区人民政府\t206522\n德意志银行\t206523\n夜帝\t206524\n四言\t206525\n二名\t206526\n脆肉鲩\t206527\n出征\t206528\nbookmarks\t206529\n牛皮筋\t206530\n80后记忆网\t206531\n一览英才网\t206532\n杭州网\t206533\npolyflow\t206534\nste\t206535\n暖光\t206536\n受益匪浅\t206537\n,\t206538\nNathan\t206539\n大方居\t206540\n早春\t206541\n刘新华\t206542\n上海商业会计学校\t206543\n易斋\t206544\n_华娱\t206545\n少儿舞蹈\t206546\n喷嘴\t206547\n杀局\t206548\n沾满\t206549\n太保\t206550\n小韩\t206551\n阿司匹林片\t206552\n模擬\t206553\n人民检察院案件信息公开网\t206554\nsixth\t206555\n模切\t206556\nBra\t206557\n安吉丽娜\t206558\n预报名\t206559\n美海军陆战队\t206560\n1586\t206561\n72策\t206562\n木里县\t206563\n美来\t206564\n钓鱼机\t206565\n狂呼\t206566\n天涯读书\t206567\n汇艺设计素材网\t206568\nSilver\t206569\n妖铃铃\t206570\n水西门大街\t206571\n6400\t206572\n表法\t206573\n设定集\t206574\n雷贝拉唑钠肠溶胶囊\t206575\n框架式\t206576\ndrake\t206577\n白纸坊\t206578\n相争\t206579\n费启明\t206580\n22层\t206581\n侦探们\t206582\n1440X900\t206583\n阻聚剂\t206584\n小宗\t206585\n机式\t206586\n471g\t206587\n立世\t206588\n箍筋弯钩\t206589\n江苏省人力资源和社会保障厅\t206590\n压痕\t206591\n清凉\t206592\n气罩\t206593\nsmu\t206594\nsophie\t206595\n美唐\t206596\nbban\t206597\n隔日达\t206598\n化作\t206599\n林育群\t206600\npowerquery\t206601\n情报机构\t206602\n栀子花开\t206603\n红晕\t206604\n200G\t206605\n1500万\t206606\n舟舟\t206607\n佳话\t206608\nsante\t206609\n金·凯瑞\t206610\n静安路\t206611\n益海嘉里集团\t206612\n电子商务产业园\t206613\n沪深股票\t206614\n碧翠\t206615\naround\t206616\n紫凝\t206617\n乐平\t206618\n∵\t206619\n【带秀tv\t206620\n第08\t206621\n50th\t206622\n湖美\t206623\nTFE\t206624\nresilio\t206625\n大亏\t206626\n20140406\t206627\n电工学\t206628\n小枝\t206629\n五句话\t206630\n译音\t206631\n丧茶\t206632\n中科大少年班\t206633\n改新\t206634\n18厘\t206635\n家
常菜做法【食谱秀\t206636\n悠迅\t206637\n关停\t206638\nMMS\t206639\ndirectives\t206640\n河南区\t206641\n盘古智库\t206642\n好文\t206643\n勃\t206644\n恶整\t206645\n千页\t206646\n立法会\t206647\n田纳西河\t206648\ndigital\t206649\n上海自动化仪表有限公司\t206650\n带伤\t206651\n神五\t206652\n机器人大擂台\t206653\n辱警\t206654\nPut\t206655\n叉骨\t206656\nDATE\t206657\n国防建设\t206658\n一万倍\t206659\n0.5.4\t206660\n掉单\t206661\n题款\t206662\n只读\t206663\nY染色体\t206664\n张国祥\t206665\n不予\t206666\n高新技术开发区\t206667\nZZ\t206668\n福州市仓山区人民政府\t206669\n黑兔网\t206670\n蒙面唱将猜猜猜2\t206671\n企信通\t206672\n广东青年职业学院\t206673\n大田县\t206674\n亲妈妈\t206675\n筒灯\t206676\n百分之四十\t206677\n南宁市发展和改革委员会\t206678\n综投\t206679\n3.5.8\t206680\n大通g10\t206681\nFIR滤波器\t206682\n暂估价\t206683\n龙岗路\t206684\nArc\t206685\n一府\t206686\n苹果越狱助手\t206687\n陈佳\t206688\n液晶片\t206689\n李红梅\t206690\njiaoxue\t206691\n老中\t206692\n腮红膏\t206693\n英雄联盟_英雄联盟\t206694\n120斤\t206695\n家物\t206696\n北京市工会\t206697\n广西工商职业技术学院\t206698\npositively\t206699\n禧龙\t206700\n10月17日\t206701\nCCTV节\t206702\n蓝盘\t206703\nconte\t206704\n3.4_\t206705\n中央特科\t206706\nvoss\t206707\n数声\t206708\n临时性\t206709\n格言\t206710\ngbq\t206711\n伊藤博文\t206712\n昆山仁医网\t206713\n56天\t206714\n独角戏\t206715\n冲击钻\t206716\ncardi\t206717\n磷钼酸\t206718\n徐州高铁站\t206719\nThinkPHP\t206720\n饭田\t206721\n竹山县\t206722\nH9\t206723\n米切尔\t206724\n小越女\t206725\nrs\t206726\nSolved\t206727\n60fps\t206728\n一向\t206729\n口袋妖怪永恒之沫\t206730\n敏敏\t206731\nwindows编程\t206732\n本域\t206733\n欧讯\t206734\nbanshee\t206735\n刘洪\t206736\n狗友\t206737\n中华体育网\t206738\n瞎忙\t206739\n神代梦华谭\t206740\n热感\t206741\n仙茅\t206742\n更容易\t206743\n粹\t206744\n叶罗丽\t206745\n彭阳县\t206746\n2568\t206747\n西湖景区\t206748\n胡平\t206749\n海坊主\t206750\nvins\t206751\n京沪\t206752\n福莱恩\t206753\nhabits\t206754\nfina\t206755\n鲁中晨报\t206756\n凉州词\t206757\n追咬\t206758\nGCM\t206759\n骨朵\t206760\n刘方\t206761\n农银\t206762\n华英证券\t206763\n501.8\t206764\nOvercome\t206765\nHang\t206766\n冲冠\t206767\n002806\t206768\nneusoft\t206769\n超越时空\t206770\nMorgan\t206771\n二十余年\t206772\n天刀助手\t206773\n麒麟区人民政府\t206774\n非功能性\t206775\n绯雨\t206776\noverview\t206777\n圣胡安\t206778\nv
ulnerability\t206779\n买受\t206780\n差强人意\t206781\n长费\t206782\n白苏\t206783\n正文卷\t206784\nsd卡槽\t206785\n协同创新中心\t206786\n程媛媛\t206787\nThreadPoolExecutor\t206788\n14日\t206789\nxstart\t206790\n选编\t206791\n沙漠皇帝吧\t206792\n严重性\t206793\nlboost\t206794\nDoomsday\t206795\n佛山三中\t206796\n韩剧\t206797\n护龄\t206798\n装饰件\t206799\nFailover\t206800\nFinished\t206801\n万幸\t206802\n解全析\t206803\n海厅\t206804\n处理工\t206805\n砂器\t206806\n联动性\t206807\n西马庄\t206808\n2成\t206809\n深圳爱康健口腔医院\t206810\n正威集团\t206811\n教育部高等学校\t206812\n委员会\t206813\n股东户数\t206814\nNETWORKS\t206815\n第16周\t206816\n12月份\t206817\n万条\t206818\n40M\t206819\n英村\t206820\n广东省代表团\t206821\n陆冰嫣\t206822\n黄万里\t206823\n陕政办\t206824\n飞虎之潜行极战\t206825\n硫酸反应\t206826\n160吨\t206827\n西安西大街\t206828\n格價網\t206829\n郑州国家高新技术产业开发区\t206830\n专利权\t206831\n房县\t206832\n芙蓉中路\t206833\n6520\t206834\n客控系统\t206835\n砂漠\t206836\n西安装修公司\t206837\n明东\t206838\n新视觉影院\t206839\n高福\t206840\n4.6_\t206841\n金台\t206842\n装片\t206843\n神往\t206844\nredigo\t206845\n厦企\t206846\n2010-2014年\t206847\n亚太\t206848\ntypename\t206849\nseparated\t206850\n周日晚上\t206851\n暗堡\t206852\n御\t206853\n冲印\t206854\nSoho\t206855\n灵溪\t206856\n久久小说阅读网\t206857\n服装百事通\t206858\n阐明\t206859\n4月底\t206860\n卡兹克\t206861\n君之\t206862\n洋口港\t206863\n御姬之翼\t206864\n潜行极战\t206865\n主窗\t206866\n马援\t206867\nlad\t206868\n袖珍椰子\t206869\nconquer\t206870\n打荷\t206871\n流行音乐节\t206872\n鞋架\t206873\n舟山日报\t206874\n雏龙\t206875\n相信你\t206876\nmkisofs\t206877\n管塞\t206878\nnue\t206879\n何加林\t206880\n捉鳖\t206881\n室内消火栓\t206882\n江林\t206883\nCOMFAST\t206884\n6声\t206885\n110公里\t206886\n折扇\t206887\n男女式\t206888\nSQLPLUS\t206889\n对账\t206890\n作妖计\t206891\nzhili\t206892\n胡图图\t206893\n3.2.7\t206894\n学前儿童\t206895\n汉匈\t206896\n前置仓\t206897\nPGD\t206898\n七周岁\t206899\n天津港\t206900\n小健\t206901\n三多\t206902\n博采网络\t206903\nAttraction\t206904\n群\t206905\n宝马3系\t206906\n不治之症\t206907\n裸臀\t206908\n驾考科\t206909\n韩朝\t206910\n放闪\t206911\n行政监察法\t206912\n臻于至善\t206913\n谭派\t206914\n分蘖\t206915\n千仞\t206916\n早晨6点\t206917\n低密度影\t206918\n督导组\t206919\n小米电视3\t206920\n耀华中学\t206921\n首季\t206922\
n人速\t206923\n变形术\t206924\n1030\t206925\n0327\t206926\nPerry\t206927\n京东号\t206928\n霧島\t206929\n稔山镇\t206930\n雕塑展\t206931\nmu-mimo\t206932\n浙江省纪委省监察委\t206933\n鬼才\t206934\nzip-CSDN\t206935\nillinois\t206936\n11道\t206937\n老齐\t206938\n包头市公安局\t206939\n魔幻类\t206940\n陈岩\t206941\n身体语言\t206942\n第5项\t206943\nhdparm\t206944\n洗鼻\t206945\n宠物界\t206946\n鄢梦萱\t206947\nAVS\t206948\n默示\t206949\n耳機\t206950\n宠妻\t206951\nhomekit\t206952\nbutter\t206953\n航天金税盘\t206954\nPEG\t206955\n力图\t206956\n堪折\t206957\n星野明\t206958\n选修\t206959\nHf\t206960\nLinux命令大全\t206961\n唯妻\t206962\n耳骨\t206963\n银钱\t206964\nfoxmail7.1\t206965\nOdin3\t206966\nUNIX网络编程\t206967\n江宁区\t206968\n新怡家园\t206969\n候选\t206970\nstme\t206971\n腾讯新闻中心\t206972\n南美\t206973\n头孢\t206974\n厦门地税\t206975\ncontrolled\t206976\nm6s\t206977\n江湖哈弗h6\t206978\n祁王\t206979\n郑现锋\t206980\n驱动程\t206981\n泰斯特\t206982\n罗中立\t206983\nYeti\t206984\ncultbeauty\t206985\n攀钢钒钛\t206986\n教研部\t206987\nGreedy\t206988\nRMI\t206989\n高塘石\t206990\n85集\t206991\n95595\t206992\n倩女幽魂2\t206993\n防火墙\t206994\n卧房\t206995\n朔州市\t206996\n南通政府网\t206997\nAWA\t206998\n全国证券公司\t206999\nanswer\t207000\n绿豆粥\t207001\n你的回忆\t207002\n示踪剂\t207003\nzs\t207004\n吉祥三宝\t207005\nhenkaku\t207006\n教研员\t207007\n蒲公英的约定\t207008\n西凤酒\t207009\n黛米\t207010\n咚\t207011\n信访办\t207012\n不欢\t207013\n鹅\t207014\nDeviceNet\t207015\n黑橙\t207016\n南漳\t207017\n状元成才路\t207018\n抢购一空\t207019\n永久居住权\t207020\n1357\t207021\nzidong\t207022\n眼脸\t207023\nInvesting\t207024\n80S\t207025\noptions\t207026\nalnx\t207027\n离子交换膜\t207028\n环江毛南族自治县\t207029\npipi美\t207030\nrecover\t207031\n1060\t207032\n米兰·昆德拉\t207033\n20161031\t207034\nPyMOL\t207035\n西瓜汁\t207036\n喉罩\t207037\n45000\t207038\n婚闹\t207039\n总工期\t207040\nEPQ\t207041\n供电量\t207042\n元大都城垣遗址公园\t207043\nCLOSE\t207044\n鱼怪\t207045\n原材料商务网\t207046\n摘要\t207047\n计算板\t207048\n邓萃雯\t207049\n吉州窑\t207050\n20170420\t207051\n海南华侨中学\t207052\n鱼之乐\t207053\n难见\t207054\nfx60vm\t207055\n冰片\t207056\n威锋网\t207057\n荣恩\t207058\nDisadvantages\t207059\n子树\t207060\n2018中考网\t207061\n风采\t207062\n台州天台\t207063\noccured\
t207064\n开滦股份\t207065\n树莓\t207066\n马克思列宁主义\t207067\n浦发银联\t207068\n项籍\t207069\nAhmed\t207070\n靓靓\t207071\nfinland\t207072\n收款机\t207073\nInterpreter\t207074\njavaWeb\t207075\nEPROM\t207076\n罗城仫佬族自治县人民政府\t207077\n古里镇\t207078\n研究史\t207079\n万籁俱寂\t207080\n宿主机\t207081\n南京地铁S3号线\t207082\n江宁谷\t207083\n首领\t207084\n满点\t207085\ni5处理器\t207086\n酵母\t207087\n第16批\t207088\n聊斋艳谭1:艳魔大战\t207089\n鲜网文\t207090\n王老板\t207091\nAK70\t207092\niBasso\t207093\n铜版画\t207094\n巨牌\t207095\n钢铁雄心4\t207096\n甲巯咪唑片\t207097\n好呀\t207098\n在上\t207099\n郑州办公室\t207100\nSwitchButton\t207101\n%2f\t207102\n蛆\t207103\n港航物流\t207104\nCabinets\t207105\nzap\t207106\n猴年纪念币\t207107\n慈铭\t207108\nTLP\t207109\n太行天路\t207110\n梁高\t207111\n玉虚\t207112\n裸眼3d\t207113\n广华新城\t207114\n雅居乐清水湾\t207115\n老鱼\t207116\n垂耳兔\t207117\n雹\t207118\n类克\t207119\n亲嘴\t207120\n北大深研院\t207121\n180亿美元\t207122\n百科_天天b2b电子\t207123\nJuly\t207124\n双百\t207125\n灰浆\t207126\n刀布\t207127\n新浪河\t207128\n渝北\t207129\n细则\t207130\n凉薯\t207131\n吴念真\t207132\n二试\t207133\n8.20\t207134\n古蔺\t207135\n仿音\t207136\ndocbook\t207137\n扣完\t207138\nunatural\t207139\n泗州网\t207140\n美成\t207141\nnectar\t207142\nsmss\t207143\n王明辉\t207144\nCBL\t207145\n石榴木命\t207146\ncvd\t207147\n楼承板\t207148\n2.0米\t207149\n工艺员\t207150\n重组后\t207151\n港式\t207152\npad2\t207153\n1396\t207154\n耀邦\t207155\ntummy\t207156\n崎\t207157\ncx70\t207158\nactors\t207159\nportfolio\t207160\ngay片\t207161\n3502\t207162\n六武\t207163\nonunload\t207164\nsuzhou\t207165\n送走\t207166\n沈阳教育\t207167\n低龄化\t207168\n远房\t207169\n真爱你的云\t207170\n十拿九稳\t207171\n机电产品\t207172\nqbasic\t207173\n极速下载器\t207174\n金刀\t207175\n附睾囊肿\t207176\n飞讯\t207177\n细线\t207178\n冀玉华\t207179\n邪咒\t207180\n爱房网\t207181\n赤裸天使\t207182\n中羽在线\t207183\n入驻\t207184\n段宜恩\t207185\nFabless\t207186\n揭晓\t207187\n土库曼\t207188\n水边的阿狄丽娜\t207189\n直落\t207190\n福利来了\t207191\n开博进销存管理系统\t207192\n基金业\t207193\n艺术字\t207194\n219路\t207195\n抹茶蛋糕\t207196\n芬太尼透皮贴剂\t207197\n解聚\t207198\n广州长隆酒店\t207199\n这个周末\t207200\n成都海尔\t207201\n无脸\t207202\nnPlayer\t207203\n显密佛网\t207204\n成形\t207205\n知乎读书会\t207206\n客战\t20720
7\n首都图书馆\t207208\nGameLook\t207209\ncrgk\t207210\n青椒\t207211\nsstap\t207212\nhumanity\t207213\n保运通保险网\t207214\n李成敏\t207215\n落下山\t207216\n人民大学出版社\t207217\n露米娅\t207218\nKTM\t207219\n长洲岛\t207220\nmdns\t207221\n大麦若叶青汁\t207222\n左联\t207223\n挖矿者\t207224\n警察学院\t207225\nbattery\t207226\n网纹辊\t207227\n燕归巢\t207228\n21个月\t207229\n根尖\t207230\n刚正\t207231\n整个\t207232\n第51页\t207233\n沉降\t207234\n时间服务器\t207235\n菜单条\t207236\n兵家\t207237\n博尔冯仑\t207238\n1280X720_\t207239\n蜀桧\t207240\nitoa\t207241\nnik\t207242\n5个月\t207243\n陈云霁\t207244\nNoodle\t207245\n九寨\t207246\n6月2日\t207247\n调度室\t207248\n机盖\t207249\n王大\t207250\n光流法\t207251\n储运部\t207252\n酷乐猫\t207253\n海航集团\t207254\n可苦\t207255\nд\t207256\n健康风险评估\t207257\n江山人\t207258\nbeta3\t207259\n适应度函数\t207260\n充满活力\t207261\n七线\t207262\n布达佩斯\t207263\n内蒙古建筑职业技术学院\t207264\n乂学\t207265\n税案\t207266\n电动飞机杯\t207267\n干什么用\t207268\n叙述性\t207269\n年集\t207270\n长沙地铁\t207271\nKLIA2\t207272\n道安\t207273\n7472\t207274\ndouban\t207275\n浩沙健身\t207276\n春天后\t207277\n玄葫堂\t207278\nPanther\t207279\ndsf\t207280\neHR\t207281\n秽翼\t207282\n杂阿含\t207283\n王元姬\t207284\n茶仓\t207285\n团魂\t207286\n嘉院\t207287\n茅箭区\t207288\n凌家塘\t207289\nrmit\t207290\n200平\t207291\n水浴式\t207292\nS.H.E\t207293\n魔情\t207294\njig\t207295\n生育金\t207296\n五大洲\t207297\n宽松式\t207298\n红卫村\t207299\nメインテ\t207300\n左室\t207301\nR4\t207302\n鹏元\t207303\n坚果\t207304\nCDS\t207305\n断壁\t207306\n华而不实\t207307\n质押贷款\t207308\n戢\t207309\n散仙\t207310\n上海妇科医院\t207311\n不切实际\t207312\n风脉泉\t207313\nSTIGA\t207314\n雅各书\t207315\n492\t207316\nszhome\t207317\n无间道2\t207318\n佳能相机镜头\t207319\n明玉\t207320\n融侨观邸\t207321\nhlg\t207322\n63_\t207323\n千里马人才网\t207324\n4.55\t207325\n北辰悦来壹号\t207326\n芽苗\t207327\n杨格\t207328\n格萨尔王\t207329\n百环家园\t207330\n欧蒙\t207331\n广播体操\t207332\n原著\t207333\nmathematica\t207334\n承德医学院\t207335\n监督员\t207336\n巴塞罗\t207337\n波长\t207338\n五洋\t207339\n人人素\t207340\n僵直\t207341\n沈月\t207342\n安徽电力\t207343\n辅助轮\t207344\n光纤陀螺\t207345\n周娟\t207346\n大公狗\t207347\nCarey\t207348\nmakelove\t207349\n张蓓\t207350\n税收减免管理办法\t207351\n乐鸟\t207352\n旅游度\t207353\nconforma
nce\t207354\n坪山区\t207355\n瞿秋白\t207356\n保险套\t207357\n很糟糕\t207358\n王美芳\t207359\ntaoxinxu\t207360\n贪贿\t207361\n飘洒\t207362\n吐鲁番地区\t207363\nSonny\t207364\n2224\t207365\n外协\t207366\n紧锣密鼓\t207367\nshb\t207368\n劫匪\t207369\n影青\t207370\n透润\t207371\n宏命令\t207372\n曹查理\t207373\nAnri\t207374\n亮灯\t207375\n结业式\t207376\nStereo\t207377\n自封\t207378\n尼诺\t207379\n昆腾\t207380\n安徽省烟草专卖局\t207381\ncomeback\t207382\n重生追美记\t207383\n屈原列传\t207384\n千档\t207385\n6000千瓦\t207386\n核医\t207387\n赵霁\t207388\nCRASH\t207389\n圣套\t207390\nyii2\t207391\n泰伦卢\t207392\nUnknown\t207393\n推导\t207394\n血条\t207395\n东青\t207396\n昆仑关\t207397\n嘉北郊野公园\t207398\n樱井翔\t207399\n121家\t207400\n青岛求实职业技术学院\t207401\ntiffany\t207402\n周锐\t207403\n加强版\t207404\n筹备金\t207405\n牙签盒\t207406\n绿地世纪城\t207407\n入门篇\t207408\n10月19日\t207409\n真假学园\t207410\nnad\t207411\n嘟噜\t207412\n广州外语外贸大学\t207413\n饺子粉\t207414\n咗\t207415\n哈弗h2\t207416\nygm\t207417\ngcl\t207418\n女星们\t207419\n世纪坛\t207420\niapp\t207421\n无限月读\t207422\nratchet\t207423\n谭松韵\t207424\n压式\t207425\n戒不掉\t207426\n虔\t207427\n赵丹阳\t207428\n浙江大学医学院附属邵逸夫医院\t207429\n买车\t207430\n第132集\t207431\n分价\t207432\n埋设\t207433\n_野生技术协会\t207434\n发动机冷却液\t207435\n王永华\t207436\nubantu\t207437\ncaopron\t207438\n文天祥\t207439\nyolo\t207440\n入主\t207441\n李润杰\t207442\n看步\t207443\n伊野尾慧\t207444\n诗教\t207445\n粤知一二\t207446\n攻击者\t207447\nnachos\t207448\n四个数\t207449\n亚翔集成\t207450\n拜日\t207451\n11.2.2\t207452\n6.3_\t207453\nares\t207454\n高鼎\t207455\naigis\t207456\n蛇纹\t207457\n新吴\t207458\n重庆火车北站\t207459\n课室\t207460\n铁岭新区\t207461\n黄金水岸\t207462\n婴灵\t207463\n翡翠绿洲\t207464\n尼特罗\t207465\n东方新天地\t207466\nModern\t207467\n花招\t207468\n黎里\t207469\nspoc\t207470\nGBZ\t207471\n金肯职业技术学院\t207472\n厉_\t207473\n光稳定剂\t207474\n炉石传说英雄\t207475\n天机城\t207476\n恒佳\t207477\n洛阳东\t207478\n201路\t207479\nevening\t207480\n怪不得\t207481\n张振富\t207482\n选码\t207483\n付行\t207484\n小升初择校_小升初\t207485\n柳传志\t207486\n接口层\t207487\n两个月\t207488\n老年人\t207489\n卫生人才网_医药英才网_医院\t207490\nmof\t207491\n上海建筑工程\t207492\n限位器\t207493\n清整\t207494\n马斯诺\t207495\n当上门\t207496\n九月份\t207497\n龙抄手\t207498\n
sql批量\t207499\n八十周年\t207500\n美泰\t207501\n遭\t207502\n言语交际\t207503\n莱托\t207504\n把列\t207505\n连弹\t207506\n伸向\t207507\ngtav\t207508\nWPS2016\t207509\nMFC-7480D\t207510\n6年以上\t207511\n居民区\t207512\n密钥环\t207513\nsourcetree\t207514\n1.74\t207515\n常德经开区\t207516\n17.5\t207517\n白亦非\t207518\n780元\t207519\nattractions\t207520\nfips\t207521\nbeini\t207522\n莆田学院附属医院\t207523\n8卷\t207524\n上者\t207525\nAzkaban\t207526\n密度板\t207527\n几百年\t207528\n八|\t207529\n异辛烷\t207530\n合订本\t207531\n1.8万元\t207532\n李豪\t207533\n朱芳芳\t207534\n微热\t207535\n委托代理合同\t207536\n武英殿\t207537\n监督执纪工作规则\t207538\n宾州镇\t207539\n跑江湖\t207540\n脉冲波\t207541\nPTC热敏电阻\t207542\n市上\t207543\n马管家\t207544\nboiling\t207545\nCNS\t207546\n汉责文化\t207547\n绿松石\t207548\n不容\t207549\n龙书\t207550\n妖玲玲\t207551\n最好的未来\t207552\n劲儿\t207553\n9.28\t207554\n边防护栏\t207555\n杀阵\t207556\n二军\t207557\nSQLiteStudio\t207558\n几局\t207559\n周晓\t207560\n换流站\t207561\n嘉峪关\t207562\n林逸欣\t207563\n平淡\t207564\n活肤\t207565\n首尔机场\t207566\n光明广场\t207567\n亚游集团\t207568\n加盟代理\t207569\n电极帽\t207570\n胡小青\t207571\nHaiTaoZu\t207572\n寿安镇\t207573\n第三十一章\t207574\n鸿坤理想城\t207575\ntunes\t207576\n扫描机\t207577\n马墩\t207578\ntome4\t207579\nBioconductor\t207580\ndnf忍者吧\t207581\nNiFi\t207582\n前10分钟\t207583\n浙江省教育厅\t207584\n王桂英\t207585\n核定单\t207586\n陆海空\t207587\n象拔蚌\t207588\n马丁内斯\t207589\n膝盖处\t207590\nedt\t207591\n捷安\t207592\n蛋哥\t207593\n水乳\t207594\n灵药\t207595\n唧\t207596\n云气\t207597\n里仁洞\t207598\n尚略\t207599\n易效\t207600\n酒会\t207601\nREQUIRES\t207602\n邓州市\t207603\n40点\t207604\n沙狐\t207605\nProspect\t207606\npointer\t207607\n一粒\t207608\n1950x\t207609\n子宫糜烂\t207610\nprg\t207611\n徐家庄\t207612\n方玉斌\t207613\n海南住房公积金管理局\t207614\n教研科研网\t207615\nQQ群主\t207616\n太仓经济开发区\t207617\nFU\t207618\nGarment\t207619\n沪市\t207620\n勐拉皇家国际\t207621\na67\t207622\n郑渊洁\t207623\n数制转换\t207624\n吕琳\t207625\n步履维艰\t207626\nYYT\t207627\nworks3\t207628\n一草\t207629\nDropwizard\t207630\nhyxd\t207631\n赢鼎\t207632\nroyale\t207633\n/网批\t207634\n地下城起源\t207635\n输户\t207636\n老E\t207637\n峨眉派\t207638\n绿罗\t207639\nSeaJS\t207640\n68天\t207641\n天津一中\t2076
42\n荣耀QQ\t207643\n存仓\t207644\n聚宝汇\t207645\n九部\t207646\n知乎知卡\t207647\n美餐\t207648\ndvd\t207649\n富士急乐园\t207650\nASL\t207651\nIP6\t207652\n威刚\t207653\n政采\t207654\ninterp\t207655\n羟甲基\t207656\n陌生男子\t207657\nAPM\t207658\n越王\t207659\nWebLogic11g\t207660\n漏财\t207661\n快客便利店\t207662\n思迅天店\t207663\n英华达\t207664\n蓬安\t207665\n车炮\t207666\n贺兰县\t207667\n192.168.1.1\t207668\n龙源峡\t207669\n碳酸钙d3片\t207670\n泰州国际机场\t207671\n辜鸿铭\t207672\n二第三章\t207673\ngyang\t207674\n美国众神\t207675\nMalaysian\t207676\n北京市委宣传部\t207677\n小米应用商店\t207678\n浙江电网\t207679\n王潇\t207680\n茧\t207681\n唐雎\t207682\nBTM\t207683\n一年一\t207684\n充裕\t207685\nduality\t207686\n7180\t207687\n创世记\t207688\n张棪琰\t207689\n远距离\t207690\n演说家\t207691\n晓红\t207692\n速解\t207693\n上饶站\t207694\nbootable\t207695\n观察\t207696\n贵州工程应用技术学院\t207697\n仔面\t207698\n索雷斯\t207699\nQHD\t207700\nIAB\t207701\n居室\t207702\n宁峰\t207703\nNeutrogena\t207704\n西藏区\t207705\n氢氧化铵\t207706\n科罗拉多州\t207707\n八角寨\t207708\n立方体\t207709\n铝膜\t207710\n阿良良\t207711\n单绳\t207712\nfastreport\t207713\n123期\t207714\n优酷片库\t207715\n买手店\t207716\n备注\t207717\n陈鼓\t207718\n跳跳虎\t207719\n酷玩6\t207720\n何梦蓉\t207721\n巨树\t207722\n好人馆\t207723\nE3\t207724\n吉水\t207725\n第110集\t207726\n生理性\t207727\n国有土地使用证\t207728\n晁错\t207729\nia32-libs\t207730\ncandence\t207731\n高压证\t207732\n藏花\t207733\n致病性\t207734\n明尼苏达州\t207735\n王近山\t207736\n三夏\t207737\n惠斯通电桥\t207738\nsimu\t207739\n李子树\t207740\n永动机\t207741\n华章\t207742\nnoting\t207743\nSBS\t207744\n西安新闻网\t207745\n新全顺\t207746\ndick\t207747\n新贝\t207748\n圆珠\t207749\n手錶\t207750\nactionscript\t207751\n奇怪的她\t207752\n胶囊内镜\t207753\nФ\t207754\nGODEX\t207755\n卷笔刀\t207756\n搞姬\t207757\n帝豪EC7\t207758\n工程宝\t207759\n消防控制柜\t207760\nmysql索引\t207761\n天尺\t207762\nSAN\t207763\nHomme\t207764\n方位\t207765\n总套\t207766\n吉他六线谱\t207767\n眉山日报\t207768\n其它地区\t207769\n宇顺电子\t207770\ncold\t207771\n我们的未来\t207772\n五章\t207773\n千颂伊\t207774\n武动乾坤\t207775\n公共管理专业\t207776\n单挑\t207777\n综采队\t207778\n人物志\t207779\nsn\t207780\n中国商飞公司\t207781\n硝化甘油\t207782\n进军\t207783\n紫烟\t207784\n工商管理专业\t207785\n荟萃\t207786\n新洲镇\t207787\n
陆永\t207788\ngenus\t207789\n7毫米\t207790\n200点\t207791\n2018.04.11\t207792\nssooking\t207793\n张优\t207794\n1TB\t207795\n1128\t207796\n张千巽\t207797\n金煌\t207798\n独立性\t207799\nLab-分析测试百科网\t207800\nbof\t207801\narchitects\t207802\nOslo\t207803\n厨刀\t207804\nServlet3.0\t207805\n格罗宁根大学\t207806\n行政令\t207807\n金泰焕\t207808\n长城h6\t207809\n801\t207810\n购物篇\t207811\n选别机\t207812\n济南南部山区\t207813\n莱尼\t207814\n丁香人才网\t207815\n微臣\t207816\n格鲁德\t207817\n中国联通集团\t207818\n结文\t207819\nRang\t207820\n巨好\t207821\nPathology\t207822\n大厨师\t207823\n木木枭\t207824\n言程\t207825\n圣甲虫\t207826\n金水区\t207827\n何颖\t207828\n利星\t207829\n谢沧行\t207830\n虎头蜂\t207831\nFresco\t207832\n堆排序\t207833\n军事理论\t207834\n皮埃蒙特\t207835\n收款宝\t207836\nbrown\t207837\nmysql服务器\t207838\n007:大破天幕杀机\t207839\ncoreldrawx7\t207840\n国殇\t207841\n勾玉\t207842\n大九湖\t207843\n浙江绿城\t207844\n哪门子\t207845\n慕天星\t207846\n霍夫\t207847\n中转\t207848\n兰芳园\t207849\n财富网\t207850\n毛球\t207851\n张博\t207852\n_天维导购\t207853\n葱蒜\t207854\n死记\t207855\nife\t207856\nschoolbag\t207857\n而死\t207858\n你给我听好\t207859\n中国男乒\t207860\n盘状\t207861\n小盒\t207862\n利威尔\t207863\n单簧管波尔卡\t207864\n中兴电信\t207865\n少有人走的路\t207866\n脸油\t207867\n大版\t207868\nPATRAN\t207869\n好争\t207870\ngpi\t207871\ntrample\t207872\n几丁质\t207873\n广州塔\t207874\nvera吧噗\t207875\n陈小\t207876\n起跑线\t207877\nyarn\t207878\n偷东西\t207879\n知不知\t207880\npurchases\t207881\n7宫\t207882\nMy97\t207883\n蒸汽挂烫机\t207884\n巴中\t207885\n2020\t207886\n副部长\t207887\n站夸信息网\t207888\nws\t207889\n疯狂猜成语/图猜成语\t207890\n星际战\t207891\n慌慌张张\t207892\n感知\t207893\ncontrolfile\t207894\n银票\t207895\nssh-keygen\t207896\nshannon\t207897\n亚萨合莱\t207898\nzookeeper\t207899\n康熙\t207900\nkai\t207901\n873\t207902\nATR\t207903\n元婴期\t207904\n保义镇\t207905\n果蝇\t207906\n乙烯基树脂\t207907\n砌\t207908\n仁和会计\t207909\nTinto\t207910\n公顷\t207911\n宋之问\t207912\n源潭\t207913\nモデル\t207914\n第-\t207915\n宁德时代新能源科技有限公司\t207916\n6102\t207917\nN档\t207918\n景芝\t207919\n欧普康视\t207920\nPOP3/SMTP\t207921\n广西壮族自治区人民政府\t207922\n六万\t207923\n部落战争\t207924\n夏晖\t207925\n借物\t207926\n鱼川\t207927\n洗涤灵\t207928\nin-house\t207929\n
四民\t207930\n嘉实多磁\t207931\nWHERE子句\t207932\n齐声\t207933\n普调\t207934\n2212\t207935\n狐妖\t207936\n并处\t207937\n出游季\t207938\n雕刻机\t207939\nzxxk\t207940\n成都信息工程大学\t207941\n400余名\t207942\n黑吃黑吧\t207943\n经编机\t207944\n发审委\t207945\n河南电台\t207946\n顺河\t207947\n匮乏\t207948\n飞火\t207949\n0839\t207950\n五谷磨坊\t207951\n胸腺法新\t207952\n和付\t207953\n大公报\t207954\n美食城\t207955\n赵晓霞\t207956\n9110\t207957\n3004\t207958\n泌体\t207959\nPractical\t207960\n建筑施工企业\t207961\n美莱整形美容\t207962\n山根绫乃\t207963\n一二三_\t207964\noppo游戏中心\t207965\n名犬\t207966\n妈妈鞋\t207967\n兰州大学化学化工学院\t207968\n松江府\t207969\n技术资\t207970\n打标\t207971\n真恋姬无双\t207972\n云城区\t207973\n黄晓村上春树\t207974\n手腕处\t207975\n陆家嘴街道\t207976\n锅碗瓢盆\t207977\nela\t207978\n雷索\t207979\n乐团\t207980\n王飞跃\t207981\n长存\t207982\n蠕形螨\t207983\n六野\t207984\n年前\t207985\n刘志辉\t207986\npresto\t207987\n2015.3\t207988\n朱瑾瑜\t207989\n罗摩衍\t207990\nLCG\t207991\n咋选\t207992\nCH\t207993\n深圳欣锐科技股份有限公司\t207994\n密码箱\t207995\n解决问题\t207996\n华仪\t207997\n四科\t207998\n手边\t207999\n缠枝\t208000\n发料\t208001\n北京京港地铁有限公司\t208002\n第戎\t208003\n一匀\t208004\n氟比洛芬酯\t208005\n创新工场\t208006\n20码\t208007\n傻姑\t208008\nAxle\t208009\nU2417\t208010\nSUBTOTAL\t208011\n梭门\t208012\n宣城市人民政府\t208013\n20151110\t208014\nSterling\t208015\nI罩杯\t208016\nm1213nf\t208017\n紫斗\t208018\n铝塑复合管\t208019\n饭香\t208020\nframeborder=\t208021\nacfun\t208022\n季洁\t208023\n伤身\t208024\nS-S\t208025\n鹅绒\t208026\n光通量\t208027\n黄宗巴特尔\t208028\n梦里水乡\t208029\n汉武帝刘彻\t208030\n岭南集团\t208031\n水高\t208032\nSKIN\t208033\n幻方\t208034\nSudden\t208035\n2018年11月\t208036\n4.5_\t208037\nbaoji\t208038\n300万吨\t208039\n巴黎酒店\t208040\nt形\t208041\nTeddy\t208042\n黑熊精\t208043\n帮服\t208044\n3句\t208045\nobjection\t208046\n滚揉机\t208047\n棉毡\t208048\nsexin\t208049\n医妓\t208050\n土罐\t208051\npathfinder\t208052\n高中版\t208053\nCAGR\t208054\n萨基特·乔杜里\t208055\n咒缚者\t208056\nCSL\t208057\n成都电子科大\t208058\n巅峰会\t208059\n平安财富宝\t208060\n证券大厦\t208061\n如意宝\t208062\nbyredo\t208063\n三频\t208064\n第届\t208065\n感动中国\t208066\n氙气\t208067\n110周年\t208068\n壕\t208069\n驸马\t208070\n银河文明3\t208071\n领航城\t208072\nkey,value\t
208073\n9.58\t208074\n国都\t208075\nオ\t208076\nmrna\t208077\n白坭镇\t208078\n大坡村\t208079\nFD\t208080\n5万名\t208081\n有否\t208082\n呼入\t208083\n青春珊瑚岛\t208084\nstlink\t208085\n落水狗\t208086\nword2017\t208087\n奔马\t208088\n小花猫\t208089\n陈莉\t208090\nTFBOYS\t208091\nSlowly\t208092\n10图\t208093\n转角度\t208094\n聚政\t208095\n斗嘴\t208096\nhg680\t208097\nxuanxuan\t208098\n船舶\t208099\nTensorLayer\t208100\n婺城新区\t208101\nledshowtw\t208102\n李洪成\t208103\n阴刻\t208104\nManagers\t208105\n寿桃\t208106\n1745\t208107\npkpm2010\t208108\n前海中心\t208109\n定额表\t208110\n星期七\t208111\nlcd1602\t208112\n依旧\t208113\ndatax\t208114\n教团\t208115\n焦作日报\t208116\n流行舞\t208117\n不致\t208118\n32pao\t208119\nBamBam\t208120\n去屑\t208121\n求非\t208122\n密云县\t208123\n沈政\t208124\nkumamon\t208125\n利马\t208126\n鲜语\t208127\n有讲究\t208128\n美诺\t208129\n摄政王爷\t208130\nFifty\t208131\n深圳职业技术学院\t208132\n发愤写史记\t208133\nOwl\t208134\n李通\t208135\n酸洗磷化\t208136\n全能播放器\t208137\nSex8_CC\t208138\n邵逸灵契\t208139\n3一\t208140\n303sh\t208141\nexpect\t208142\nios11.1\t208143\n144平米\t208144\n导流洞\t208145\n敏使朗\t208146\n河南省通信管理局\t208147\n实验箱\t208148\nWIND\t208149\n陈求发\t208150\n修炼爱情\t208151\n液化烃\t208152\n异速联\t208153\n泻火\t208154\n撒拉嘿呦\t208155\n安希\t208156\n征地补偿标准\t208157\nClosest\t208158\n枫情\t208159\n腾讯云域名\t208160\n2588元\t208161\n小红山客运站\t208162\nOpenCV3\t208163\n内存块\t208164\nvm\t208165\n平安租赁\t208166\n潜望镜\t208167\n思想者\t208168\n腕\t208169\n哼哼唧唧\t208170\n中兴红牛\t208171\n坑梓\t208172\n2008-2015年\t208173\n付清泉\t208174\n十日内\t208175\n租客\t208176\n质押\t208177\n操作法\t208178\n揭阳市区\t208179\n点经\t208180\n中山\t208181\nA59\t208182\n时值\t208183\n制毒案\t208184\n荔城区\t208185\n眼周\t208186\n蜜桃儿\t208187\nAnniversary\t208188\n上海国旅\t208189\nJeremy\t208190\n学税\t208191\nJXL\t208192\n陈志刚\t208193\n上万名\t208194\n梅\t208195\n绿博\t208196\nLincoln\t208197\n上海建工\t208198\n上海4号线\t208199\n思辨性\t208200\n童欣\t208201\n乐高积木\t208202\n红河\t208203\n油蜡皮\t208204\nFlavor\t208205\n定存\t208206\n恋恋笔记本\t208207\n韩裔\t208208\n普惠园\t208209\n随车起重机\t208210\n8颗\t208211\n双凤镇\t208212\n哀\t208213\n自由广场\t208214\n婉慈\t208215\n长期借款\t208216\n一一直\t208217\n送子\t20
8218\n3.5mm\t208219\nkattle\t208220\n张宝英\t208221\n每笔\t208222\n夹杂\t208223\n中国情歌汇\t208224\n真田昌幸\t208225\n病号\t208226\n狮心\t208227\n三十四十\t208228\n申请人\t208229\n刮胶\t208230\nbct\t208231\n美少女梦工厂2\t208232\nLenovo\t208233\n末端\t208234\n制衣厂\t208235\n吸顶灯\t208236\nw3\t208237\nqq音乐播放器\t208238\nFMX\t208239\n六休\t208240\n操作资格证\t208241\n编舞\t208242\n罗成\t208243\ntoefl\t208244\n习作\t208245\nmediumint\t208246\n畜牧展\t208247\n环球贸易中心\t208248\n中华民\t208249\n游览线路\t208250\nLife\t208251\n森林系\t208252\n绒线\t208253\n显存位宽\t208254\n攀岩\t208255\n李梦圆\t208256\n麻将胡\t208257\nP70\t208258\nafterburner\t208259\n麦景图\t208260\nWin7输入法\t208261\n中大网校儿童教育网\t208262\n级次\t208263\n一鉴\t208264\n中国移动通信集团\t208265\n奥沙西泮\t208266\n9.6万\t208267\n上百人\t208268\n玛萨\t208269\n冲劲\t208270\nKlein\t208271\n女儿家\t208272\n洋葱新闻\t208273\n花语谷\t208274\n吉安县人民政府\t208275\n2015年5月4日\t208276\n麻仁润肠丸\t208277\nclass=\t208278\n天下大宋\t208279\n银临\t208280\n中交公路规划设计院有限公司\t208281\n北京派出所\t208282\n投资收益率\t208283\n海教园\t208284\n3d背景墙\t208285\n钙拮抗剂\t208286\naip\t208287\n815路\t208288\n两新\t208289\n职工代表\t208290\n无线投影\t208291\n阳宅\t208292\n脉冲激光器\t208293\n割破\t208294\n谢斌\t208295\n庄小威\t208296\nEcho\t208297\nnj\t208298\n深圳银监局\t208299\n杨科\t208300\n胡隐君\t208301\n遇见爱情的利先生\t208302\n美岐\t208303\nb2b2c\t208304\n相人\t208305\n沃兴华\t208306\n商标机\t208307\n寿寿花\t208308\n呷浦\t208309\n精灵梦叶罗丽全集\t208310\n王丽娜\t208311\n军代表\t208312\n42年\t208313\n还有多远\t208314\n侠客英雄传\t208315\n那些梗\t208316\n活结\t208317\n死亡者\t208318\n英签\t208319\n雀舌茶\t208320\nDSR\t208321\n0108\t208322\n00000124\t208323\n北京市朝阳区房屋管理局\t208324\n俞晴\t208325\nt480s\t208326\n北京城\t208327\n灭磁\t208328\n金虫\t208329\n善哉\t208330\n米胖_米胖火车\t208331\n加药量\t208332\nnote9\t208333\nRingo\t208334\n20161222\t208335\n法文\t208336\n交单\t208337\n美风\t208338\n外债\t208339\n休闲男\t208340\n到会\t208341\n宦\t208342\n东凤镇\t208343\n精灵旅社\t208344\n千里莺啼绿映红\t208345\n清华大学继续教育学院\t208346\n推子\t208347\n朗润\t208348\n武汉华夏理工学院\t208349\n登飞来峰\t208350\n自治区物价局\t208351\n山东保监局\t208352\npes2013\t208353\n无限剑制\t208354\nunshift\t208355\n孙一峰\t208356\n安游守望先锋\t208357\n起亚KX3\t208358\n小号\t208359\n牵一发而动\t208360\n24t
h\t208361\n调低\t208362\n大枪战\t208363\n公用电脑\t208364\nnolock\t208365\n当婚\t208366\n秦姓\t208367\n湿身\t208368\n框式\t208369\n绝尘\t208370\n直爽\t208371\n鸽子\t208372\n运钞车\t208373\nv4\t208374\n作法\t208375\n南国都市报\t208376\n重金属捕捉剂\t208377\nRecommendation\t208378\n河南省司法厅\t208379\n旧番\t208380\nParka\t208381\nS君\t208382\n杨震\t208383\n五龙新城\t208384\njshop\t208385\n数控剪板机\t208386\n强干\t208387\n中国汽车工业信息网\t208388\n车限号\t208389\n套词\t208390\n并购基金\t208391\n陈凯琳\t208392\n435\t208393\n东界\t208394\n二十公里\t208395\n二十五\t208396\n狮子王1\t208397\n中海金沙湾\t208398\n全聚德烤鸭店\t208399\ngats\t208400\n教理\t208401\nEvergarden\t208402\n大北农\t208403\n下拉\t208404\n魔刹石\t208405\n调音阶\t208406\n输者\t208407\n333号\t208408\n智云股份\t208409\nf100\t208410\n洛阳理工学院\t208411\n2016年10月21日\t208412\ncaps\t208413\n瞒不住\t208414\ndelmia\t208415\n上海新华\t208416\n嘉诺\t208417\n運命\t208418\nb叉\t208419\n捐钱\t208420\n炎刘镇\t208421\n寺库网\t208422\n通联支付网络服务股份有限公司\t208423\n中南区\t208424\n丽水市\t208425\n600323\t208426\n优迅\t208427\n长虹科技大厦\t208428\nrichmond\t208429\n刘妈\t208430\n小三儿\t208431\n防胀\t208432\nbarcelona\t208433\n基床\t208434\n太子妃\t208435\n夹栏\t208436\n中色\t208437\n乌鲁木齐八一中学\t208438\nPKM\t208439\nPitch\t208440\n雷盾\t208441\n新街街道\t208442\n里界\t208443\n复利\t208444\n嘉定一中\t208445\n速度与激情3\t208446\n粉饰\t208447\n袁帅\t208448\n小乘佛教\t208449\n广济寺\t208450\n代偿\t208451\n垃圾场\t208452\nWinRAR压缩软件\t208453\n墓道\t208454\n无违\t208455\n台州一中\t208456\n江化微\t208457\n一个11岁\t208458\n邓尼茨\t208459\n90100\t208460\n1614\t208461\n总结表\t208462\nHQL\t208463\n杜鹃花展\t208464\n胃散\t208465\npscs2\t208466\n焦作火车站\t208467\n皱折\t208468\ncmakelists\t208469\n济南市科学技术局\t208470\naffects\t208471\n颠三倒四\t208472\n国家卫生健康委员会\t208473\n楚剧\t208474\n楼氏\t208475\n切\t208476\n宏达股份\t208477\n第45章\t208478\n披甲\t208479\n比比资源先锋影音网-比比资源bibizy\t208480\nnumberbox\t208481\nSAPI\t208482\n边吊\t208483\n倾歌\t208484\n阿三\t208485\ngnueabi\t208486\n全曲\t208487\n在不言中\t208488\n五代十国\t208489\n重庆大学学报\t208490\n青岛万象城\t208491\n聊斋部落战争\t208492\n微表处\t208493\n人才市场\t208494\nAdelaide\t208495\n泰中\t208496\n第八十二条\t208497\n上海市经济和信息化委员会\t208498\n北京法院网\t208499\nbpsk\t208500\nrela\t208501\n管理费用
\t208502\n武汉市文化局\t208503\n宋陵\t208504\nE12\t208505\nflash网\t208506\n美少女梦\t208507\n五洲\t208508\n阿佳\t208509\n表压\t208510\n仅\t208511\n河南人事考试网\t208512\nchaopeng\t208513\n君主\t208514\n凯迪拉克XT5\t208515\nDataland\t208516\n圣斗士星矢斗士\t208517\n操纵杆\t208518\nGonzo\t208519\nAPEC会议\t208520\npsd模板-千图网\t208521\n华尔街见闻早餐FM\t208522\nppt|\t208523\n美国杜兰大学\t208524\nsuccessful\t208525\nCapabilities\t208526\n仿鞋\t208527\n时刻表\t208528\n0824\t208529\n赛轮金宇\t208530\n瘸\t208531\n上海中信信息发展股份有限公司\t208532\n忘掉\t208533\n泰山景区\t208534\n1000克\t208535\nCPS\t208536\n爱的代价\t208537\n凯格林\t208538\n4千多\t208539\n图谋不轨\t208540\ntransfur\t208541\n沪建交\t208542\n15起\t208543\n剑灵召唤师\t208544\n贿\t208545\n无锡天一中学\t208546\nTiffany\t208547\ntrans\t208548\nv4.2\t208549\n中原文化\t208550\npuzzled\t208551\n洗黑\t208552\n寻子网\t208553\n遞\t208554\nhousing\t208555\n弧幕\t208556\n售水机\t208557\n东营教育信息网\t208558\n老余杭\t208559\n中华人民共和国\t208560\n婉言\t208561\n跳出去\t208562\n中国文艺\t208563\n1.CN\t208564\n泉纱\t208565\n有机硫\t208566\n硼氢化钾\t208567\n骗贷案\t208568\n漫画性\t208569\n蚕砂\t208570\n惊情\t208571\n这胸\t208572\n7.5亿\t208573\n黑芝麻糊\t208574\nvenice\t208575\n上汽通用汽车\t208576\nH型钢\t208577\n千数\t208578\n化学吧\t208579\nX86\t208580\n花园城市\t208581\n柏叶\t208582\n_文\t208583\n李冲\t208584\n6标\t208585\n解放网\t208586\nbt变态版\t208587\n街巷\t208588\n喵酱\t208589\n卸下\t208590\n医教\t208591\n郑州港区\t208592\n荏苒\t208593\n本周五\t208594\n刘星宇\t208595\n前部\t208596\n5.5.1\t208597\n高健\t208598\n蓝湾\t208599\n失败率\t208600\n龙瑞\t208601\nhalifax\t208602\n乘客们\t208603\n特级\t208604\n农坛\t208605\ndom对象\t208606\n绝无仅有\t208607\nfirehose\t208608\npfc\t208609\n碰破\t208610\n180亿元\t208611\n大石街道\t208612\n找坡\t208613\n许多\t208614\n大约在冬季\t208615\n打情骂俏\t208616\n网赚客\t208617\n并行度\t208618\n无风险利率\t208619\n裂颅\t208620\n苏州市高新区\t208621\n鬼妹\t208622\nmhp\t208623\n鳏夫\t208624\n豕\t208625\n山东现代学院\t208626\n6点\t208627\n5轴\t208628\n氯霉素滴眼液\t208629\n5月9日\t208630\n舅爷\t208631\n乐余镇\t208632\n水垢\t208633\n沈丹\t208634\nP2P网络借贷\t208635\n评论员\t208636\n夏日炎炎\t208637\n彩点\t208638\n第64期\t208639\n姚芊羽\t208640\n笨狼\t208641\n夏履镇\t208642\n深业集团有限公司\t208643\n龙门县\t208644\n300666\t208645\n2ch吧\
t208646\n推知\t208647\n没收到\t208648\n海中\t208649\n戴埠镇\t208650\n龙战士3\t208651\n黄芪精\t208652\n建构式\t208653\n机器视觉系统\t208654\npostcss\t208655\nscholars\t208656\n泥沼\t208657\n德鲁依\t208658\n中韩自贸区\t208659\nm6\t208660\ninspiron\t208661\n塞巴斯蒂安\t208662\n马航MH370\t208663\n汽车传感器\t208664\n林昌\t208665\nTCP粘包\t208666\n紫燕\t208667\n粤商\t208668\n园洲镇\t208669\n糯康\t208670\n视阈\t208671\n龙湖春江郦城\t208672\nveromoda\t208673\n古天马未都\t208674\n广州经济开发区\t208675\n历史片\t208676\ndonkey\t208677\nMaxwell\t208678\n13cm\t208679\n神性\t208680\n妖妖\t208681\n概念股\t208682\nxscript\t208683\n呼儿\t208684\n厨师证\t208685\n污损\t208686\nXpe\t208687\n一次函数\t208688\n0760\t208689\nEVO\t208690\n浮屠塔\t208691\n竖纹\t208692\nabcc式\t208693\n威廉斯\t208694\nMp3\t208695\n1立方米\t208696\n游戏盘\t208697\n擦亮\t208698\n健身步道\t208699\n横线\t208700\nHiveSQL\t208701\nlinkedin\t208702\nfirefo\t208703\n传奇世界2\t208704\n玄猫\t208705\n王子涵\t208706\n低点\t208707\n19件\t208708\n梁柱板\t208709\nCloud9\t208710\nSciPy\t208711\n4105\t208712\n第二针\t208713\n税前列支\t208714\n海天出版社\t208715\n女儿节\t208716\n免收费\t208717\n温岭人才网\t208718\n第十八篇\t208719\n12px\t208720\n05后\t208721\ncrosshair\t208722\n6亿\t208723\ni9500/Galaxy\t208724\nitaly\t208725\n泰姬陵\t208726\n脚镯\t208727\n母鸡\t208728\n万瑞\t208729\n燕郊经济开发区\t208730\ny50\t208731\n山东外贸职业学院\t208732\n徐康俊\t208733\n佣金\t208734\n赵汝飞\t208735\nno.5\t208736\n鲁发\t208737\n苍茫\t208738\n配乐诗\t208739\nSSH服务器\t208740\nblt\t208741\n后项\t208742\n德尔波特罗\t208743\nBOOKS\t208744\n图列\t208745\n爱可可\t208746\n街道居委会\t208747\ni铂\t208748\n前沿阵地\t208749\n克格汪\t208750\nht\t208751\n四联单\t208752\n建章\t208753\n中华秋沙鸭\t208754\n3mm\t208755\nCocktail\t208756\n抗折\t208757\nEVERYDAY\t208758\n生态门\t208759\nAnnouncement\t208760\n沙美岛\t208761\n思迅软件\t208762\n2228\t208763\n武威政府网\t208764\n无证之罪\t208765\n上周六\t208766\n不正\t208767\nICL\t208768\nVLAN\t208769\n宝姿\t208770\n娘子军\t208771\n中元股份\t208772\n蒙古大营\t208773\n温州经济技术开发区\t208774\n玛氏公司\t208775\n风辰\t208776\nGCS\t208777\n天使动漫论坛\t208778\n深圳市光明新区政府\t208779\n真朴\t208780\n焊台\t208781\n那个月\t208782\n天沃科技\t208783\n梦幻诛仙2\t208784\n五峰山\t208785\nhashlib\t208786\n异乡\t208787\n1.0.0.3\t208788\n
七点钟\t208789\n解耦\t208790\nvivox6d\t208791\n真实性\t208792\n刘峙\t208793\n蛮子\t208794\n标贴\t208795\n沛泉\t208796\n当量\t208797\n汇出\t208798\n书情\t208799\n仆\t208800\n标致208\t208801\n李双双\t208802\n置底\t208803\n东海生活网\t208804\n诺维乔克\t208805\n假花\t208806\n季节变化\t208807\ncj\t208808\nExpect\t208809\nCitizens\t208810\n中医内科学\t208811\n乌兰镇\t208812\n碗型\t208813\nteechart\t208814\n麻黄素\t208815\n环氧树脂胶粘剂\t208816\nfeignclient\t208817\n井泽\t208818\n90万\t208819\n阿尔都塞\t208820\n培养箱\t208821\n105平\t208822\n识货\t208823\n反修\t208824\n5370\t208825\ndiu\t208826\n子公司\t208827\n文德路\t208828\n母子爱情2\t208829\nsmali\t208830\nmcq\t208831\nMethyl\t208832\n云南山\t208833\n准印证\t208834\ncaigan\t208835\n美协\t208836\n彩度\t208837\n夕阳白雪\t208838\n泽尔丹\t208839\n3.90\t208840\n刘娥\t208841\n四万元\t208842\n小龙人\t208843\n邮银\t208844\nUnlimax\t208845\n胖娃\t208846\n青兰高速\t208847\nGB28181\t208848\n嘻唰唰\t208849\n西楚\t208850\n热风机\t208851\n罗哲文\t208852\nSplash\t208853\n惹怒\t208854\n破刃\t208855\n辩\t208856\ngta5mod\t208857\n东周列国志\t208858\n六龙争霸\t208859\n判\t208860\n6.86\t208861\n媒体证\t208862\n福州装修公司\t208863\n风之子9881198198\t208864\n江津\t208865\n长春市教育局\t208866\n文城镇\t208867\n金海湖新区\t208868\n制动主缸\t208869\n万难\t208870\n树枝\t208871\n福利房\t208872\nwowza\t208873\n中共襄阳市纪委\t208874\ndissolution\t208875\n顺承\t208876\nRS触发器\t208877\n些许\t208878\n阅读笔记\t208879\n请稍等\t208880\n8门\t208881\n弱酸性\t208882\n吡咯烷酮\t208883\n富文\t208884\n莫林\t208885\n泰勒·斯威夫特\t208886\n秀杰\t208887\nmbed\t208888\n周4\t208889\n承让\t208890\nMute\t208891\n勇闯夺命\t208892\n2017年7月13日\t208893\n詩\t208894\n宣武\t208895\n大材小用\t208896\n高亮\t208897\n桂井\t208898\n山西焦化\t208899\n葵涌街道\t208900\n甘良\t208901\n孙禄堂\t208902\n获任\t208903\n新无双列传\t208904\n官场局中局\t208905\n白内障\t208906\n登山口\t208907\nViber\t208908\nHi商学院_海商网\t208909\n税号\t208910\n北大数院\t208911\nqcow\t208912\n灿谷\t208913\n偷香天下第一妃\t208914\n功能性胃肠病\t208915\n东方圣城网\t208916\n40\t208917\n人体学\t208918\n收音头\t208919\n爱威奶\t208920\n财政部金融司\t208921\n十三周\t208922\n真好玩\t208923\n本州\t208924\n郭翔\t208925\n横滚动条\t208926\ncarpet\t208927\n清明粑\t208928\n榆次\t208929\n25分钟\t208930\n移资\t208931\n蒋依依\t208932\n受贿人\t208933\n鑫哥\t208934\n
听习\t208935\n千万次\t208936\n三千公里\t208937\n18句\t208938\n锁骨链\t208939\n诵\t208940\n提取物\t208941\nwww.uuu9.com\t208942\n4711\t208943\n深证成指\t208944\n无限动漫录\t208945\n珠市口\t208946\n李益\t208947\n浦那\t208948\n葛根素\t208949\ngparted\t208950\n广东省高院\t208951\n山本\t208952\ncached\t208953\nHDTV高清\t208954\nufo事件\t208955\nlny\t208956\n汕大\t208957\nORA-03113\t208958\n不买账\t208959\nsolidwork\t208960\n眼镜展\t208961\n岛外\t208962\nandroidsdk\t208963\n子式\t208964\nMartina\t208965\n耳轮\t208966\nglimpse\t208967\n水玉\t208968\n音乐剧\t208969\n2017年起\t208970\n小壁虎借尾巴\t208971\n燧发枪\t208972\n类推\t208973\n凑钱\t208974\n热流体\t208975\n龙床\t208976\n李亚萍\t208977\n荔城\t208978\n检查\t208979\n灵灵\t208980\nWindWant\t208981\n三\t208982\n装修123网\t208983\n穿透率\t208984\n整字\t208985\n合作市\t208986\n宇舶\t208987\n长发公主\t208988\n唐琼\t208989\n合众e贷\t208990\n武夫\t208991\n家们\t208992\n鹰牌陶瓷\t208993\n100包\t208994\n球化\t208995\n劈\t208996\n否决权\t208997\n/label\t208998\nESP8266\t208999\n卢静\t209000\nGalGame\t209001\n寒凉\t209002\n百度视频搜索\t209003\n聚碳酸酯\t209004\n海陆空\t209005\n4千克\t209006\n情殇\t209007\n景芝酒业\t209008\n中国一冶\t209009\n奶粉盒\t209010\n伊罕\t209011\n青年工作委员会\t209012\n东富龙\t209013\n数竖式\t209014\n美豪丽致酒店\t209015\n山口组\t209016\n科场\t209017\n挤眉弄眼\t209018\n布局页\t209019\njindu\t209020\n杨珊珊\t209021\nAlitrip\t209022\n达旗\t209023\n咸海\t209024\n大师姐\t209025\n一点点\t209026\n陸\t209027\nu77\t209028\n样版\t209029\n男票\t209030\n野\t209031\n腌制品\t209032\n睢县\t209033\nmsv\t209034\n瓦拉加尔\t209035\n青春饭\t209036\n90亿美元\t209037\n护航\t209038\n支吊架\t209039\n头腔\t209040\n李孝利\t209041\n十孔口琴\t209042\n健脾\t209043\n广州网易\t209044\n时间计时器\t209045\n梁国\t209046\nTav\t209047\n極品\t209048\n红细胞压积\t209049\n丽江市玉龙县政府\t209050\n西西软件站\t209051\nMPF\t209052\n殿前\t209053\n翔安马巷\t209054\n千图\t209055\npresents\t209056\n圣科\t209057\nの\t209058\n东风中路\t209059\nlullaby\t209060\n一起来看流星雨\t209061\n北京地铁16号线\t209062\n多一\t209063\n博思特\t209064\n變成\t209065\n芳甸\t209066\n德兴市\t209067\nwindak\t209068\n渔沟镇\t209069\n桃花零零妖\t209070\n唯美系网\t209071\n校园招聘\t209072\n李算\t209073\n温阳路\t209074\n惠湾\t209075\n3000R\t209076\n啄木鸟\t209077\n法律法规\t209078\n黄兰香\t209079\n邮政局\t209080\n会阴部\t209
081\n昆明市第一人民医院\t209082\n祖训\t209083\nsausage\t209084\n南京物联\t209085\n白疾风\t209086\n瑞美\t209087\nOBS弹幕助手\t209088\n咸丰\t209089\n达美乐\t209090\n习近平总书记系列讲话精神\t209091\n16进制编码\t209092\nrundeck\t209093\ndonation\t209094\n表壳\t209095\n眼球突出\t209096\nHel\t209097\nwdr7300\t209098\n技术流\t209099\n百世快递\t209100\nmfi\t209101\n窗式\t209102\nmongdb\t209103\nAliOS\t209104\n促进局\t209105\n0.7.0\t209106\n旬邑县\t209107\n烂皮\t209108\n3PE\t209109\n作曲\t209110\n将心比心\t209111\n王晓敏\t209112\nrad\t209113\n三利谱\t209114\n乡党\t209115\nCheckout\t209116\nintelligent\t209117\n呼吸肌\t209118\nSuricata\t209119\n16M\t209120\n大合影\t209121\n想见你\t209122\nresnet\t209123\n冰红\t209124\n嘉奖令\t209125\ncompressed\t209126\n新华杯\t209127\nDHEA\t209128\n荣华\t209129\n寒铁\t209130\n勾陈\t209131\n2016年8月27日\t209132\n龙吴路\t209133\n航天六院\t209134\n重庆曙光男科医院\t209135\n自取其辱\t209136\n点击\t209137\n市人民法院\t209138\n仿瓷\t209139\n65533\t209140\nPC游戏综\t209141\nkaptcha\t209142\n7020\t209143\ngameloft吧_\t209144\n数一数\t209145\n北方区\t209146\ntower\t209147\nmolecules\t209148\n南征\t209149\n肺病\t209150\n秋收\t209151\n网页服务器\t209152\n一顶\t209153\n剑技\t209154\n四氯化锡\t209155\n高清英语全集\t209156\n小题大做\t209157\n卖主\t209158\n西安公司\t209159\n眼量\t209160\nautocad2018\t209161\ncrowd\t209162\n平安证券股份有限公司\t209163\n随州二中\t209164\n顶配版\t209165\nMVP\t209166\n华南教育新闻网\t209167\n超生\t209168\n费劲\t209169\n凯兰帝\t209170\nEDIUS7\t209171\n北京人家\t209172\n刘亚东\t209173\nword域\t209174\n黄港\t209175\n护绿\t209176\n龙岗区教育局\t209177\n天诺\t209178\n石马河\t209179\nmonde\t209180\n欧博\t209181\n業\t209182\n7601\t209183\n鼓山\t209184\n乐平镇\t209185\n蓬\t209186\n金正焕\t209187\nbt磁力链接\t209188\n6GD5\t209189\n市建委\t209190\nNTSC\t209191\n【城市便捷酒店\t209192\nmoder\t209193\n罗刹国\t209194\nsplatoon2\t209195\n^2\t209196\n2903\t209197\n秦岭国家植物园\t209198\n生风\t209199\n翻译者\t209200\n港口岸\t209201\n墙衣\t209202\n角雕\t209203\n2017-2030年\t209204\n曾颖\t209205\n网台\t209206\nbaomidou\t209207\n奇葩\t209208\n费县政府网\t209209\n切缘\t209210\n塞尔达时之笛\t209211\n5439\t209212\n工业新城\t209213\n程序员的自我修养\t209214\n蝴蝶之吻\t209215\nRealty\t209216\n勾地\t209217\n出版署\t209218\n四果汤\t209219\n无足\t209220\n面交\t209221\n预销\t209
222\n典子\t209223\n网络部\t209224\n略略\t209225\n中国核能电力股份有限公司\t209226\nFinn\t209227\n何德何\t209228\nanzhuang\t209229\n组织行为学\t209230\n自我训练\t209231\n莫斯科\t209232\n锦瑟\t209233\ntextare\t209234\n山东商业职业技术学院\t209235\nleecode\t209236\nn5s\t209237\n无所事事\t209238\n交通工程专业\t209239\n李雯\t209240\n眼镜男\t209241\n皋埠镇\t209242\n死于非命\t209243\n古埃及金字塔\t209244\n大海水\t209245\n大连自贸区\t209246\n41种\t209247\n进出口额\t209248\nxi\t209249\nv1.01\t209250\n高县\t209251\nintensity\t209252\n8045\t209253\n考办\t209254\nE9\t209255\n〒\t209256\n第三排\t209257\n枯萎\t209258\n材质库\t209259\nshifter\t209260\n廊坊百姓网\t209261\n硅胶制品\t209262\ninstrumental\t209263\nTNF\t209264\n幼龟\t209265\n手轮\t209266\n4k分辨率\t209267\n孟楠\t209268\n中铁二十局集团\t209269\n佳能IX6580\t209270\n三七总皂苷\t209271\n宽带\t209272\n弃文\t209273\ntar.gz-CSDN\t209274\n中通物流\t209275\nGNSS\t209276\n阴凉处\t209277\n欢乐古吧\t209278\n评论\t209279\nSTW\t209280\niPhone6s\t209281\nsecc\t209282\n厦门国税\t209283\nluac\t209284\n小鸭淘客助手\t209285\nCAD测量面积\t209286\n牧云云\t209287\n微友\t209288\n医妃\t209289\n新飞\t209290\n16天\t209291\n曹休\t209292\n金信网\t209293\n全选中\t209294\nz4x\t209295\n淮海\t209296\n取费\t209297\nE拓建筑网\t209298\n百校联盟\t209299\n四姐妹\t209300\n均为\t209301\n慈悲城\t209302\n20170713\t209303\n大过年\t209304\n北美自由贸易区\t209305\n耐世特\t209306\n田径场\t209307\n世界气象日\t209308\n韦纳\t209309\nlaptops\t209310\n马克·扎克伯格\t209311\n去世\t209312\nYUAN\t209313\n中国工信部\t209314\n系部\t209315\n魔灵召唤\t209316\n圆庆广场\t209317\nbasys2\t209318\n一年以来\t209319\n西安市人大\t209320\n缓交\t209321\n江苏金融租赁股份有限公司\t209322\n孔眼\t209323\ndate\t209324\n镇元子\t209325\nace\t209326\n\\\t209327\n冤罪\t209328\neyou\t209329\namnesia\t209330\n55555\t209331\n香小陌\t209332\n损失险\t209333\n中国电力企业联合会\t209334\nMegaCli\t209335\n林恩\t209336\n黑头像\t209337\n纳雅\t209338\n双万双\t209339\n噬血\t209340\n阁下\t209341\n德约科维奇\t209342\nIQOS\t209343\n宝飞龙\t209344\n门式起重机\t209345\nEdwards\t209346\n梁遇春\t209347\n长风公园\t209348\n志伟\t209349\n三朵\t209350\n口粮\t209351\n2875\t209352\n2485种\t209353\n陈向东\t209354\n浙江吉利汽车有限公司\t209355\n__main\t209356\n格尔\t209357\n144家\t209358\n卸车\t209359\n采暖\t209360\nSeam\t209361\nMarkMan\t209362\n安固\t209363\n镜像站\t20
9364\n误差函数\t209365\nChaseDream\t209366\n李存孝\t209367\nfirewall防火墙\t209368\nvuforia\t209369\n20150201\t209370\n无限期\t209371\n长队\t209372\n锚索\t209373\n富春山\t209374\n类方\t209375\ncuff\t209376\n钟兴民\t209377\nGOODS\t209378\nPHPOK\t209379\nRemmina\t209380\n720P超清\t209381\n抗日了不起的盖茨比\t209382\n贴心萌宝荒唐爹\t209383\n骁龙810\t209384\n3张\t209385\n舆地\t209386\nSRD\t209387\n以德\t209388\n企业会计准则\t209389\n小岚\t209390\n蕾切尔\t209391\n少尉\t209392\n3联\t209393\n宁泽李颂慈\t209394\n麻布\t209395\n万秀村\t209396\nabo吧\t209397\n华情\t209398\n挂图\t209399\njackjing\t209400\nzooskool\t209401\n浮梦\t209402\nfsync\t209403\n款款\t209404\n中国人民解放军空军总医院\t209405\n盛丰\t209406\n开用\t209407\nbob\t209408\nCVPR2017\t209409\n金环蛇\t209410\n0.15.0\t209411\naliez\t209412\nMC大白\t209413\n5月15日\t209414\n冰杯\t209415\n1700x\t209416\n三年级作文大全_小学生作文\t209417\n三语\t209418\n天津市人民检察院\t209419\n用友畅捷通T+\t209420\n11.2.6\t209421\n孙冰\t209422\n箔\t209423\n山东站\t209424\n研读\t209425\n桐庐人才网\t209426\n名古屋中部国际机场\t209427\n房产管理局\t209428\nL_\t209429\n御天神帝\t209430\n映\t209431\n佛教社\t209432\n武林外史\t209433\n亡灵之国\t209434\n广州办事处\t209435\n2007年\t209436\n锂动力电池\t209437\n企业链\t209438\nHannah\t209439\n十六项\t209440\n王老菊\t209441\nnearly\t209442\n515\t209443\nCANDY\t209444\n底角\t209445\n第19部\t209446\n2788\t209447\n广东侨网\t209448\n入住\t209449\n渡鸟\t209450\n写数\t209451\n白蒙蒙\t209452\n错错错\t209453\nMust\t209454\n19份\t209455\n辽宁何氏医学院\t209456\n时尚芭莎\t209457\n老歌\t209458\n解放军仪仗队\t209459\n上海斯玛特企业服务有限公司\t209460\n口腔癌\t209461\n清水苑\t209462\n聚丙二醇\t209463\n广州工商学院\t209464\n离离原上草\t209465\n磁盘阵列柜\t209466\ndiscards\t209467\numa\t209468\n李频\t209469\n23.cn\t209470\ndorado\t209471\n预提费用\t209472\n干桂圆\t209473\n唇钉\t209474\n潮喷\t209475\n工资条\t209476\n福建日报社\t209477\n钢琴家\t209478\n达日县\t209479\n2.3L\t209480\n脚胶\t209481\n4毫米\t209482\n适龄\t209483\n异氟烷\t209484\nhg8245\t209485\n好农资招商网\t209486\nutunbu\t209487\n安徽省省\t209488\n凹面镜\t209489\n马累\t209490\n衬衣\t209491\n雇佣关系\t209492\n被骗\t209493\nRIVER\t209494\n弱旅\t209495\n崔村镇\t209496\n滨崎步\t209497\n重庆市潼南区人民政府\t209498\n7905\t209499\n凌度行车记录仪\t209500\n万信\t209501\n高胜美\t209502\n质谱\t209503\n健康权\t209504\n蜜桃大丸子\
t209505\n冬瓜荷叶茶\t209506\n富聊\t209507\n政坛\t209508\n杜云生\t209509\n圈\t209510\n嗜战\t209511\nSECURITY\t209512\n心灵捕手\t209513\n2004级\t209514\n委托代理人\t209515\nwifi标志\t209516\n_美图录\t209517\nstockx\t209518\n钱物\t209519\n盲肠\t209520\n韩航\t209521\n20171019\t209522\n教育处\t209523\n哥哥\t209524\n懦夫救星\t209525\n番禺社区网py168.com\t209526\n新人\t209527\n20日内\t209528\n锁卡贴机\t209529\n乌蒙\t209530\nidentifying\t209531\nWordPress\t209532\njiancai\t209533\n述职\t209534\n中国移动浙江公司\t209535\n各有千秋\t209536\n高福利\t209537\n1500\t209538\n中高\t209539\n捏脊\t209540\n伊亚\t209541\n清甜\t209542\n斯诺登\t209543\n净瑞\t209544\nyjj\t209545\nWIN7系统台式电脑\t209546\n6盘\t209547\nVIDIA\t209548\n珠江街\t209549\ndifferent\t209550\nsaozi\t209551\n全国人民代表大会及其常务委员会\t209552\ncdrom\t209553\nInformation\t209554\n洛杉矶迪士尼乐园\t209555\n儒林\t209556\nR16\t209557\n国家机构\t209558\n乾造\t209559\n蓝水湾\t209560\nhough\t209561\n265g安卓网\t209562\nゅ\t209563\n三甲\t209564\nConstellation\t209565\n冠龙\t209566\n王祖\t209567\n好东西\t209568\nfilofax\t209569\n花鱼\t209570\n两弹\t209571\n海宁\t209572\n黄霑\t209573\n新高\t209574\n回避型\t209575\n莫来石\t209576\n激素性皮炎\t209577\ndnf卡顿\t209578\n宣汉县\t209579\n入党申请人\t209580\n映泰\t209581\n温降\t209582\n倾轧\t209583\n盖州市\t209584\n5.7.0\t209585\n京蓝科技\t209586\ncabinet\t209587\ncuba\t209588\n羿飞\t209589\n交通量\t209590\n魂インサ\t209591\n芙蕾雅\t209592\n北湖公园\t209593\n互质数\t209594\n徐湛\t209595\n水泡型\t209596\n加州州立大学\t209597\n营养日\t209598\n李伟杰\t209599\n瞳色\t209600\n第15条\t209601\n紫藤\t209602\n冷凝式\t209603\n一桶\t209604\n永磁吸盘\t209605\n艾米莉亚·克拉克\t209606\n商住楼\t209607\n利雅路\t209608\n三四十年代\t209609\nexiu\t209610\n雨生\t209611\n水磨社区\t209612\n程毅\t209613\n超跑\t209614\nstarking\t209615\n私钥\t209616\n黑枸杞\t209617\n王玉梅\t209618\n达西定律\t209619\n恒压阀\t209620\n在后\t209621\n校标\t209622\n死理\t209623\n奔跑吧第二季\t209624\n华西二院\t209625\n跑走\t209626\nmulticast\t209627\n山东政法学院\t209628\n或否\t209629\n对望\t209630\nISO9000\t209631\nChrome内核浏览器\t209632\n北京科技经营管理学院\t209633\n盲目性\t209634\n价目\t209635\n吕宋岛\t209636\n老窖\t209637\n市中区人民政府\t209638\nfad\t209639\n安居乐\t209640\n1.1.0.1\t209641\n192.168.1.10\t209642\n醒世恒言\t209643\nnab\t209644\n幼齿\t209645\n起重器\t209
646\n小伪娘\t209647\n坑篇\t209648\n韦陀\t209649\n末日储物箱\t209650\n密蒙花\t209651\n新塘站\t209652\n棕榈叶\t209653\n2017年5月18日\t209654\n拿取\t209655\n黄疸指数\t209656\ntrackbar\t209657\n工口图漫画网\t209658\n邯郸市中心医院\t209659\n山海关\t209660\nFindbugs\t209661\n卡劵\t209662\n3.4下\t209663\n勾股数\t209664\n休休\t209665\nJPGOODBUY\t209666\nredemption\t209667\n片方\t209668\n移送\t209669\n创意图\t209670\nsentences\t209671\n乐器\t209672\n意外惊喜\t209673\n鹿角巷\t209674\n026\t209675\n彩钢复合板\t209676\n整箱运价\t209677\n363号\t209678\n训练鞋\t209679\nChandelier\t209680\nPino\t209681\n家机\t209682\n高客\t209683\n一桌\t209684\n翻译员\t209685\n加口\t209686\nopkg\t209687\n德诺\t209688\nubifs\t209689\n风浪\t209690\n归云\t209691\ngfs\t209692\nTimeout\t209693\n园山街道\t209694\n蒲河\t209695\n金价\t209696\n高技术\t209697\n1.0.doc\t209698\n复件\t209699\n上海闵行\t209700\n钉眼\t209701\nVS2010/MFC\t209702\n裕昌\t209703\nMovies\t209704\n重庆市巴南区人民政府\t209705\n小保\t209706\n逸动xt\t209707\n桶\t209708\n李玫伊布\t209709\n白塑\t209710\ndoes\t209711\n飞宇\t209712\nClickHouse\t209713\ndesmume\t209714\n增订版\t209715\n常熟理工学院\t209716\nbreakout\t209717\nMonaco\t209718\n192.168.0\t209719\n云云\t209720\n仁医\t209721\n硅酸盐水泥\t209722\n追身\t209723\n张仁\t209724\n2第二章\t209725\n风调雨顺\t209726\n北京美的\t209727\n华商网\t209728\nAAS\t209729\n晟邦\t209730\n语文园地五\t209731\n一千倍\t209732\n深圳大学医学部\t209733\n上代\t209734\n烩面\t209735\ntsp\t209736\nAbaqus\t209737\nlayuiadmin\t209738\n历史\t209739\n74cms\t209740\n崔巍\t209741\n安庆路\t209742\n腾讯信鸽\t209743\nstyling\t209744\n绝地求生日服\t209745\n剪刀式\t209746\n江西科技师范大学\t209747\nsmash\t209748\n荆门社\t209749\n辰巳唯\t209750\nzFrontier\t209751\nxt4\t209752\nDisk\t209753\n朗琴园\t209754\n结婚前\t209755\n600110\t209756\n九州寻仙记\t209757\nqq影音播放器\t209758\n中华人民共和国国家发展和改革委员会\t209759\n字符编码\t209760\n扁鹊\t209761\n短柄\t209762\n2011年12月\t209763\nzorro\t209764\n中国海监\t209765\n不像样\t209766\n得票率\t209767\n昔阳县\t209768\n一个1\t209769\n老宗\t209770\nfulness\t209771\n神怪\t209772\nCHANEL\t209773\nHyunji\t209774\n21公里\t209775\n桃心\t209776\n阳光路上\t209777\n乐天玛特\t209778\n西进运动\t209779\n刀伤\t209780\nMSVCR100.dll\t209781\n不动产物权\t209782\n季华园\t209783\n江苏省职业学校\t209784\n百看不厌\t209785\n
龙傲天\t209786\n732\t209787\n数据分析\t209788\n配平\t209789\n乐宁\t209790\n汉声\t209791\n三番五次\t209792\nMonet\t209793\n8089\t209794\n嫩牛\t209795\n戴睿\t209796\n亚无码\t209797\n九鼎传说\t209798\n14张\t209799\n86个\t209800\n青冥\t209801\n田老师\t209802\n复宏汉霖\t209803\n壮烈牺牲\t209804\n一跟\t209805\n朝阳县\t209806\ndgv\t209807\n189个\t209808\n3000mm\t209809\n唯大\t209810\n市委书记\t209811\n崔瑟琪\t209812\n白腿\t209813\n43家\t209814\n海关关区\t209815\n双层床\t209816\n新世\t209817\n西乐葆\t209818\n超声流量计\t209819\nGenuitec\t209820\n叛离\t209821\n无尽传奇\t209822\n山西建筑职业技术学院\t209823\n损招\t209824\n第35集\t209825\n贴身男秘\t209826\n沃克\t209827\n南非华人网\t209828\nxtrareport\t209829\n下订\t209830\n浓烟\t209831\n产额\t209832\n寡\t209833\n18.04\t209834\n双骄\t209835\n博越论坛_汽车之家论坛\t209836\nwulian\t209837\n埋\t209838\n畑山夏树\t209839\n销钉\t209840\n赛达\t209841\npublic类\t209842\n最差\t209843\n1928年\t209844\n基督徒\t209845\n粉毛\t209846\nhinge\t209847\n女生殖器\t209848\n椭圆度\t209849\n待人\t209850\n1.1米\t209851\n我的野蛮女友\t209852\n缔造者\t209853\n畅视\t209854\npowerpivot\t209855\n大开眼戒\t209856\n郊县\t209857\nwiringPi\t209858\nBgm\t209859\n装车\t209860\nrisc\t209861\n浩然之气\t209862\n联运\t209863\n2017年9月27日\t209864\n空气罐\t209865\nf-15219445\t209866\n西方人\t209867\nconfidentiality\t209868\n八匹\t209869\n钻戒\t209870\n多事\t209871\narrows\t209872\n自走炮\t209873\n威仕特\t209874\n三打白骨精\t209875\n落雪\t209876\n5.26\t209877\n参赛证\t209878\n美妞\t209879\n各政党\t209880\n宁馨\t209881\n山中\t209882\n3格\t209883\n南普陀寺\t209884\nsp1000\t209885\nPlayHome\t209886\n祭出\t209887\nvarargin\t209888\n动态\t209889\n王俪丁\t209890\n预征\t209891\n9000\t209892\n所在\t209893\n联想Y480\t209894\n黄金一代\t209895\n清远市区\t209896\n绿城育华小学\t209897\n金安\t209898\n有姓\t209899\n左膀右臂\t209900\n连\t209901\n山山\t209902\n地塞米松磷酸钠注射液\t209903\n哈博森\t209904\n易卡\t209905\n中美国\t209906\n杏林春暖\t209907\n圣心\t209908\n美豪\t209909\n实色\t209910\n云鹤plus\t209911\nbinyi\t209912\nVFP6.0\t209913\n拨云\t209914\n峨眉山市\t209915\n四舍五入函数\t209916\n改名\t209917\n阳剑\t209918\n卢林\t209919\n亲署\t209920\n39480744\t209921\n小八\t209922\n3000万元\t209923\n专卖店\t209924\n奥匈\t209925\n永清路\t209926\n呐喊声\t209927\n悠然见南山\t209928\n王凯旋\t209929\n特快列车\t209930\n润丰\t
209931\n航速\t209932\n易语言信息框\t209933\n聚法\t209934\n多玩魔盒\t209935\n重庆城市职业学院\t209936\n来文\t209937\n骨化三醇胶丸\t209938\n江金权\t209939\n连接者\t209940\n♂\t209941\n斯外戈\t209942\n领克\t209943\n心坊\t209944\n扬帆\t209945\n于城市\t209946\n邱莹莹\t209947\nv2.2\t209948\n人性交\t209949\n毒奶\t209950\n泵体\t209951\nlets\t209952\n粪污\t209953\n2620\t209954\n2BD\t209955\nHOSTS\t209956\n36次\t209957\nbillion\t209958\n灵泉\t209959\n通字\t209960\n跌停板\t209961\n错情\t209962\n雄浑\t209963\n常氏\t209964\n师之路\t209965\n苹果在线\t209966\n智感\t209967\n可申诉\t209968\n有毒气体\t209969\n防城港市\t209970\n种草机\t209971\nnote1\t209972\nQuartusII\t209973\n新闻广播\t209974\n2017年10月8日\t209975\n纸价\t209976\n件号\t209977\n天苍苍野\t209978\n群芳\t209979\n事无\t209980\n九七电影院\t209981\npostcode\t209982\n棉花姑娘\t209983\n嗷嗷\t209984\n银标\t209985\n高跷\t209986\n112平米\t209987\n哪方\t209988\n养鱼池\t209989\n偷猎\t209990\n列别名\t209991\n花管\t209992\n吴梅村\t209993\nCreo-三维网\t209994\n过氧化苯甲酰\t209995\n留白\t209996\n100架\t209997\n零模\t209998\n韩政府\t209999\n当之无愧\t210000\n2017-04\t210001\n混凝土搅拌站\t210002\n东湖新技术开发区\t210003\n15斤\t210004\n为老不尊\t210005\n节距\t210006\nzhun\t210007\nU2415\t210008\n凯德广场\t210009\n面妆\t210010\n液质联用仪\t210011\nCAD看图软件\t210012\nmicrosd卡\t210013\n安徽省政协\t210014\n水煮鱼片\t210015\nx360\t210016\n徐坊客运站\t210017\n洛天依\t210018\n丛刊\t210019\n疏离\t210020\n经济适用房\t210021\n144.0\t210022\n姜子牙\t210023\n威盛电子\t210024\nreinstall\t210025\n外压\t210026\n修仙传\t210027\n677米\t210028\n治平\t210029\n富文本框\t210030\n迅雷7\t210031\n神龙架\t210032\n道口烧鸡\t210033\n建设集团\t210034\n童帽\t210035\n小筑\t210036\n空置房\t210037\n税税\t210038\n急性呼吸窘迫综合征\t210039\n江源\t210040\n军旅人生\t210041\n人类学家\t210042\n社会阶层\t210043\nsxl\t210044\n明体\t210045\n龙蝇\t210046\nyigi\t210047\n淘翠\t210048\n作图\t210049\n山东省纪委监察厅\t210050\nBeeg\t210051\n中国水利\t210052\nBicycle\t210053\n伴性\t210054\n低聚异麦芽糖\t210055\n通风系统\t210056\n布兰切特\t210057\n两小无\t210058\n汽车配件网\t210059\n缴库\t210060\n螺旋板\t210061\n灰色系\t210062\n轻透\t210063\n321国道\t210064\ntrna\t210065\n9001\t210066\n赵永生\t210067\n素食主义\t210068\n芦笛岩\t210069\n二律背反\t210070\n克霉唑\t210071\n防爆门\t210072\n七百万\t210073\npsychology\t210074\n20180421\t210075\n前三天\t21007
6\nRoom\t210077\n4册\t210078\n宁波国家高新区宁波新材料科技城\t210079\n济宁市人民政府\t210080\n止汗石\t210081\n刘壮实\t210082\n9.24\t210083\n人鞭\t210084\n余静\t210085\n全国图书馆参考咨询联盟\t210086\nACM/ICPC信息站\t210087\n红豆奶茶\t210088\n非国有企业\t210089\n硬干\t210090\nspray\t210091\n野生技术协会_科技_bilibili_哔哩\t210092\n共同借款人\t210093\n捷通\t210094\n四大会计师事务所\t210095\n终边\t210096\n商业贷款计算器\t210097\n新闻午\t210098\nFX-8300\t210099\n平坝区\t210100\n注册规划师考试\t210101\ndso\t210102\n恐怖\t210103\nnds模拟器\t210104\n糗事\t210105\n风脉\t210106\n建研院\t210107\n罗江区\t210108\n线灯\t210109\n闻道\t210110\n大学生创业\t210111\ncmi\t210112\nSPARK\t210113\n莆田医院\t210114\n绥棱县\t210115\n萌蒂\t210116\n黄山日报\t210117\n特工片\t210118\n萧潇\t210119\n美劳\t210120\n紫薇花园\t210121\n上海农业网\t210122\n电子制作\t210123\n优播影音\t210124\n传回\t210125\n福中集团\t210126\n字部\t210127\n挂带\t210128\n快穿逆袭\t210129\n埃塞俄\t210130\n广东省省\t210131\n强生医疗\t210132\n手纸\t210133\n创伤后应激障碍\t210134\nderived\t210135\n习京平\t210136\n火化炉\t210137\n深圳湾公园\t210138\nDarry\t210139\n4in1\t210140\n第十三季\t210141\n侯王\t210142\n精影\t210143\n脐装\t210144\n大戏看北京\t210145\n2018新年\t210146\nRT-AC68U\t210147\n如临\t210148\n余依\t210149\n一个错误\t210150\n易凯资本\t210151\n860K\t210152\n韩智恩\t210153\n华晨宝马\t210154\n草狗\t210155\nazusa\t210156\nSTELLA\t210157\nzipcode\t210158\n金冥\t210159\n桦树茸\t210160\n水仙茶\t210161\nMicroWIN\t210162\n辽\t210163\n线装\t210164\nWallet\t210165\n讨鬼传2\t210166\n萧夫人\t210167\nfoun\t210168\n語言\t210169\n泰康在线财产保险股份有限公司\t210170\nchaxun\t210171\n太湖园博园\t210172\n华标\t210173\n五个一工程\t210174\nraising\t210175\n揍人\t210176\n2126\t210177\n万事通\t210178\n梅姐\t210179\n李富贵\t210180\n高氯酸盐\t210181\n智能视频监控系统\t210182\n神店\t210183\niF\t210184\n海口镇\t210185\n评星\t210186\n电波拉皮\t210187\nps板\t210188\n荣盛花语馨苑\t210189\nftype\t210190\n建筑\t210191\nSplitter\t210192\n模流\t210193\nraw文件\t210194\n_丹东新闻网\t210195\n排行榜单\t210196\n重庆地区\t210197\nsizer\t210198\n酒友\t210199\n君必强\t210200\n地基\t210201\nDesk\t210202\n巧克力棒\t210203\n孳息\t210204\n新墨香\t210205\njks\t210206\nIC集成电路\t210207\noozie\t210208\n钓鱼俱乐部\t210209\nXtreme\t210210\n张掖_张掖市政府\t210211\n7成\t210212\n雅马哈YAMAHA\t210213\n婚拍\t210214\nservername\t210215\nsuffering\t2
10216\nflstudio12\t210217\n从今以后\t210218\n油烟\t210219\nDope\t210220\n香蕉\t210221\nrollover\t210222\n澳优奶粉\t210223\n达川区\t210224\n脚本管理器\t210225\n灯影\t210226\n风装\t210227\nresponses\t210228\n泡面番\t210229\n民考汉\t210230\n小战\t210231\n无修版\t210232\n袍子\t210233\n人君\t210234\nhihi\t210235\n小公猫\t210236\n怪物猎人xx\t210237\nZYNQ\t210238\n1452\t210239\n免券\t210240\n打印盒\t210241\n刻槽\t210242\nvicky\t210243\nRCT\t210244\n西藏自治区国家税务局\t210245\n旨\t210246\n梁敏仪\t210247\n记法\t210248\n培训营\t210249\n赵天宇\t210250\n西田麻衣\t210251\n富路\t210252\n5500U\t210253\n金腰带\t210254\nVu\t210255\nchrony\t210256\n误表\t210257\n大案\t210258\n沈阳政府网\t210259\n40t\t210260\n第13页\t210261\n贴里\t210262\n充气娃娃\t210263\n4210u\t210264\nEVUS\t210265\n中国文旅\t210266\n义工证\t210267\nBate\t210268\n撤侨\t210269\n书刊\t210270\nB类\t210271\nblackmore\t210272\nRT-PCR\t210273\n2013年春节\t210274\n大真探\t210275\n上海斐讯数据通信技术有限公司\t210276\n青岛啤酒股份有限公司\t210277\n发面饼\t210278\n中春\t210279\n阿紫\t210280\n信批\t210281\n玩爆\t210282\n白水\t210283\nSax\t210284\n双陆\t210285\n石文\t210286\n雅梵哲\t210287\n游控\t210288\nAntenna\t210289\n德莱文\t210290\n互保\t210291\n学校教育制度\t210292\n沙丁胺醇\t210293\n阝\t210294\n第149集\t210295\n爱莲说\t210296\n夜夜宠\t210297\n密度仪\t210298\nCrawl\t210299\n地埋灯\t210300\n7.39\t210301\n叶涛\t210302\n爱的色放\t210303\nMyAnime\t210304\n怡瑞\t210305\n被讹\t210306\nPaperwhite3\t210307\n电阻式触摸屏\t210308\n沙漏\t210309\n电盒\t210310\nNB-IoT\t210311\n归家\t210312\nhealthier\t210313\n焊机\t210314\nchemie\t210315\n自助式\t210316\n8.16\t210317\n粉底乳\t210318\n指匠情挑\t210319\n7.5mm\t210320\n符龙飞\t210321\n勒口\t210322\n中骏世界城\t210323\n碗盘\t210324\nfungus\t210325\n追梦\t210326\n说吃\t210327\n裸钻\t210328\n海光\t210329\n成绩册\t210330\nNagios\t210331\n黄教主\t210332\n丑陋\t210333\n晞\t210334\n中国科学院金属研究所\t210335\n中国移动通信集团终端有限公司\t210336\n漫思\t210337\n柳浪闻莺\t210338\n6.6.4\t210339\n联邦\t210340\n原装屏\t210341\nCouchbase\t210342\n电缆连接器\t210343\ntesserocr\t210344\n汇纳科技\t210345\n管理端\t210346\n为患\t210347\nAVFoundation\t210348\n长宽\t210349\n170元\t210350\n深圳市人民政府\t210351\n感笔\t210352\n很孤单\t210353\n拉拉裤\t210354\n无一不\t210355\n36度\t210356\nChina-贝迪中国\t210357\n利泰\t210358\n纳武单抗\t
210359\n新浪无锡_新浪网\t210360\n涤新云\t210361\n文案类\t210362\nurbox\t210363\n临朐\t210364\n安详\t210365\n搭顺\t210366\n幻夜\t210367\n空气指数\t210368\n和而不同\t210369\n茴香菜\t210370\nprose\t210371\n丹水\t210372\n日圈\t210373\n美的中央空调\t210374\n极路由\t210375\n碧游\t210376\n540分\t210377\n猎城网\t210378\nFraction\t210379\nWDM\t210380\nf600\t210381\n65种\t210382\n唁电\t210383\nISSEY\t210384\n九年\t210385\n伦敦国王学院\t210386\n马辉\t210387\n4m\t210388\n吾悦\t210389\n杜邦卫\t210390\n王亚伟\t210391\nComputed\t210392\n掉渣\t210393\n如何处\t210394\n宝珠笔\t210395\ngetsockopt\t210396\nStrange\t210397\nrichtextbox\t210398\n健康快乐\t210399\n自闭\t210400\n耍赖\t210401\n多发生\t210402\n国新\t210403\n包送\t210404\n开庭\t210405\n虎斑\t210406\nrce\t210407\n4750g\t210408\n_比特\t210409\n紫瞳\t210410\n广播电视总局\t210411\n圣三国蜀汉传\t210412\n44条\t210413\nj++\t210414\n英奇\t210415\n机枪\t210416\n花龙\t210417\n一倍多\t210418\n深谋\t210419\n黑白无双\t210420\n隔振\t210421\n树和喜鹊\t210422\n艳星\t210423\nmerchandise\t210424\n2016年4月9日\t210425\n植物液\t210426\n昌隆\t210427\n车购税\t210428\n20160423\t210429\n刘海戏金蟾\t210430\n分保\t210431\n侍\t210432\n不名\t210433\n劳作\t210434\nx470\t210435\n前车之鉴\t210436\n爱理\t210437\nhrb400\t210438\n无异\t210439\nDWF\t210440\nGoodReader\t210441\nawe\t210442\nemv\t210443\n全度妍\t210444\n欧阳震华\t210445\n方龄\t210446\n淫羊藿\t210447\n固位\t210448\n民营银行\t210449\n倭国\t210450\nmvsd\t210451\n人在江湖飘\t210452\n轴系\t210453\n一骑当千\t210454\n清源路\t210455\n机顶盒\t210456\nFreeStyle\t210457\n香港中文大学\t210458\nVolumes\t210459\n丁嘉丽\t210460\n扪\t210461\n融创美盛\t210462\ndawn\t210463\n20170620\t210464\n超品离歌\t210465\n冷冽谷\t210466\n齐默尔曼\t210467\n2W\t210468\n1.29亿\t210469\n实验报告\t210470\n蛋池\t210471\ncec\t210472\n市立医院\t210473\n每刻\t210474\n699\t210475\n瑞星\t210476\nHyde\t210477\n0953\t210478\n七世\t210479\n汩汩\t210480\n数学之美\t210481\n鲨湾\t210482\n铜山区\t210483\n图门\t210484\n哈利波特里\t210485\n白毛\t210486\n5g\t210487\n橙路\t210488\n常德职业技术学院\t210489\nNBA直播吧\t210490\n最后一代\t210491\np25\t210492\nYami\t210493\n千盛\t210494\ncetos\t210495\n督察员\t210496\n阿纳斯塔西娅\t210497\nfanfan\t210498\n上海市宝山区人民政府\t210499\n细烟\t210500\n宝冢\t210501\n减删\t210502\n三穴\t210503\n2018博越\t210504\n金喜善\
t210505\n主星\t210506\n鹰潭北站\t210507\n微喷头\t210508\n695\t210509\n科技园\t210510\nhasClass\t210511\n止血\t210512\n改配\t210513\n甘地\t210514\nechar\t210515\n迦尔纳\t210516\n多发性脂肪瘤\t210517\nvspd\t210518\nWoodbury\t210519\n90#\t210520\nchicken\t210521\nGZ\t210522\n新源\t210523\nGTX660Ti\t210524\n皮炎\t210525\n颗粒无收\t210526\n传说之下\t210527\n1893年\t210528\n团块状\t210529\nstephen830\t210530\n中国中化集团公司\t210531\n投币箱\t210532\n手自一体车\t210533\n扇耳光\t210534\n张博士\t210535\n健身者\t210536\n迁西县\t210537\n履历表\t210538\n汉峪金谷\t210539\n不退\t210540\n卢\t210541\n恒益\t210542\n动植物园\t210543\n汽车之家专业评测中心\t210544\nSIS001\t210545\n4.5\t210546\nniweiwei\t210547\n冰丝席\t210548\n百家姓大全\t210549\n阜外\t210550\n单利\t210551\nfreewheel\t210552\n优姬\t210553\n主案\t210554\nauditd\t210555\n狂销\t210556\n有可\t210557\nsimulated\t210558\n卡刀宏\t210559\n太白楼\t210560\nvivox9plus\t210561\n浮叶\t210562\n半行\t210563\n金正大集团\t210564\npdf+\t210565\n合生汇\t210566\n蓝泊湾\t210567\n白年\t210568\n假体丰胸\t210569\n郭沫若\t210570\n执事\t210571\n华北工控\t210572\n三星集团\t210573\n虚幻争霸\t210574\n德容\t210575\n毫无\t210576\n总和法\t210577\nCarry\t210578\n3320m\t210579\n爱锋派\t210580\n大米粥\t210581\n标准式\t210582\n北京交通大学经济管理学院\t210583\n小程序斗地主\t210584\n桃园结义\t210585\n头曲\t210586\nroyals\t210587\n水关长城\t210588\n帐户名\t210589\n银龙\t210590\n爆缸\t210591\nonenet\t210592\nPCH\t210593\n幸福岛\t210594\ndbf\t210595\n6080\t210596\n美国女足\t210597\n铅合金\t210598\n2.9.6\t210599\n上头\t210600\n雪糕机\t210601\n2016年4月6日\t210602\nundef\t210603\n强森\t210604\n盲盒\t210605\nmpush\t210606\n冲孔网\t210607\nresubmit\t210608\n碗筷\t210609\nstork\t210610\nV3.3\t210611\n40d\t210612\n熨平\t210613\n聚偏氟乙烯\t210614\n冲动效应\t210615\npmo\t210616\n第39\t210617\n晴雨\t210618\n发件箱\t210619\n桑寄生\t210620\n泉城公园\t210621\n桑妮·黎翁\t210622\n白露片\t210623\nkhs\t210624\n气泡水机\t210625\nAnni\t210626\n白魔王\t210627\njianghu\t210628\n维托\t210629\n广州国际会展中心\t210630\n年度版\t210631\n哇哇\t210632\n科洛弗道10号\t210633\nOME\t210634\n消违章\t210635\n莲花菩提\t210636\n7588\t210637\n水菜丽重口味番号大全\t210638\nGoogleMap\t210639\n塞维利亚大教堂\t210640\n豹子\t210641\n合肥瑶海\t210642\n伊春市政府网\t210643\n20岁\t210644\n李建学\t210645\n注册网\t210646\n梦想之城\t21064
7\n复购率\t210648\n288x\t210649\n标养室\t210650\n赤楠\t210651\n心然\t210652\n仓颉输入法\t210653\n韶能股份\t210654\n花卉园\t210655\nfontawe\t210656\n航发商\t210657\n红绡\t210658\n徐先生\t210659\n本草铺网上中药店\t210660\nSBA\t210661\n洋人\t210662\n黑衣\t210663\npear\t210664\n掩面\t210665\n苹果8x\t210666\n意华股份\t210667\n第一辑\t210668\n美国国会图书馆\t210669\nHTMl5\t210670\n梦想世界\t210671\n不吝赐教\t210672\n踩刹车\t210673\n闵彬\t210674\nBalanced\t210675\n匣心记\t210676\n狗年\t210677\n3.2亿元\t210678\n祁阳\t210679\nRep\t210680\n博后基金\t210681\n希志爱野\t210682\n过帐\t210683\n金河\t210684\n一代宗师\t210685\nBOYS\t210686\n巨锤\t210687\n第49号\t210688\ngi\t210689\n丸子汤\t210690\n光能使者\t210691\nmyslq\t210692\n精心\t210693\n等不及\t210694\n凤凰岗\t210695\n垫刀\t210696\n凤凰城小区\t210697\n吉泰\t210698\n压码\t210699\n站洋\t210700\n杀人鲸\t210701\ncblas\t210702\n反甲\t210703\n河童\t210704\nip地址ping\t210705\n青木花恋\t210706\npyhs2\t210707\n魔兽三国\t210708\n20w\t210709\n锋影\t210710\n配音阁\t210711\n超酷网页视频播放器\t210712\n夏新\t210713\n频谱分析仪\t210714\n注册框\t210715\n吐水\t210716\n真心朋友\t210717\ndty\t210718\n对夹式\t210719\n梅长苏\t210720\niis8.5\t210721\n四老\t210722\ndve\t210723\n跟我回家\t210724\n频数\t210725\n黄石镇\t210726\nCONSTRUCTION\t210727\n伟大\t210728\n怒批\t210729\n易融\t210730\n論\t210731\n行政区\t210732\ngating\t210733\n分头\t210734\n苹果梨\t210735\n注会证\t210736\n跟焦\t210737\nSustainability\t210738\n四城\t210739\n警犬来啦\t210740\n免装\t210741\n0469\t210742\n柜员\t210743\n变声期\t210744\n压缩性骨折\t210745\nm5a97\t210746\n罗嘉良\t210747\n葬\t210748\n淘书\t210749\n丁字裤\t210750\n最秀\t210751\n赌场风云\t210752\n中航飞机\t210753\n聚能教育\t210754\nwoodbury\t210755\n上海五星级酒店\t210756\n飞检\t210757\nIDES\t210758\n探班\t210759\n穗乃果\t210760\nCSocket\t210761\n噗噗声\t210762\ncloset\t210763\n浦\t210764\n高银\t210765\n陆家嘴金融城\t210766\n率时\t210767\n几叶\t210768\nrica\t210769\nydui\t210770\n淘宝双12\t210771\nConstitution\t210772\n很多天\t210773\n龙城花园\t210774\n爱画网\t210775\nTheory\t210776\n必看\t210777\n弋江\t210778\n编带机\t210779\n关麦\t210780\n中华人名共和国\t210781\n12t\t210782\n她\t210783\n第一弹\t210784\n两穴\t210785\n港漫吧\t210786\nFUCK\t210787\n独孤九贱\t210788\n阿拉丁照明网\t210789\n丁亥\t210790\nWOLF\t210791\n机破\t210792\n伊藤洋华堂\t210793\n天梭石英表\t
210794\n石羊镇\t210795\n网格法\t210796\nUED\t210797\n昌岗\t210798\n辽宁科技馆\t210799\nean\t210800\nmp4.torrent\t210801\n国线\t210802\n大明寺\t210803\n铭成\t210804\n越窑\t210805\n流浪之路\t210806\n德文布克\t210807\n白药\t210808\nImmediate\t210809\n青虹\t210810\n良田\t210811\n张彬彬\t210812\n本土化\t210813\n仙子\t210814\n一钩\t210815\n粮道\t210816\n甲戌日\t210817\n免登陆\t210818\nZH\t210819\n烽火通信\t210820\nix35\t210821\n升班马\t210822\nIMM5257\t210823\n理中丸\t210824\nnto\t210825\n独苗\t210826\n丫\t210827\n黄翠\t210828\n佯装\t210829\n优先性\t210830\nDsw\t210831\n逆臣\t210832\n灰斑\t210833\n蔡雅奇\t210834\nParamiko\t210835\n狂夫\t210836\n698号\t210837\notp\t210838\n鸟眼\t210839\nv2x\t210840\n熟面孔\t210841\n中国科学院测量与地球物理研究所\t210842\n来来往往\t210843\n随州论坛\t210844\nchaoxing\t210845\n急性膀胱炎\t210846\n玉树市\t210847\n沙茶酱\t210848\n组织机构代码\t210849\n巨轮2\t210850\n2次方\t210851\nPivotal\t210852\nSuperJunior\t210853\n吕滔\t210854\n后防\t210855\n软元件\t210856\n通用变频器\t210857\n泾渭茯茶\t210858\n古墓\t210859\n商用电\t210860\n荷叶饼\t210861\n它那\t210862\n欧阳夏丹\t210863\n重庆花卉园\t210864\nuserform\t210865\n黄建国\t210866\n耐高\t210867\n赓续\t210868\n9kg\t210869\n金苗\t210870\nios\t210871\n安全箱\t210872\n甲亢\t210873\n干白\t210874\n20150728\t210875\n毒舌\t210876\n行政管理体制改革\t210877\n大唐风华路\t210878\n二年级语\t210879\n柳房网\t210880\n餐饮卫生许可证\t210881\nMobike\t210882\n沈琼\t210883\n51所\t210884\n不列颠\t210885\n平圆头\t210886\n合屏\t210887\n志怪\t210888\n不相\t210889\n妇炎康片\t210890\n犀牛皮\t210891\n抑或是\t210892\n望江南\t210893\n云掌\t210894\nep\t210895\n口袋妖怪网\t210896\n百老汇影城\t210897\n印度尼\t210898\n佛肇\t210899\n冲进\t210900\n花名册\t210901\n辞类\t210902\n约克夏\t210903\n可免\t210904\n鲜鲜\t210905\nrelocations\t210906\nStill\t210907\n曹俊\t210908\n子陵论坛\t210909\n点燃\t210910\ndoctors\t210911\n壮志凌云\t210912\n内校\t210913\n同方股份\t210914\n松林\t210915\n1627\t210916\n冲开\t210917\n安石\t210918\n王银\t210919\n次氯酸根\t210920\n35件\t210921\n科研计算机网\t210922\nGraphviz\t210923\n2018年04月17日\t210924\n黑猫警长\t210925\n基\t210926\n周继红\t210927\n60598\t210928\n适意\t210929\n兴城市\t210930\n界牌镇\t210931\n莱加内斯\t210932\n酬乐天扬州初逢席上见赠\t210933\n商行\t210934\n海带汤\t210935\nCBRE\t210936\n二十载\t210937\n呈妍\t210938\n施测\t210939\n天之道\t210940
\n保利创世纪\t210941\n锁芯\t210942\n青木\t210943\n苏菲亚\t210944\nuniforms\t210945\n等风\t210946\n鬼城\t210947\n119个\t210948\n王琰\t210949\ncoordinates\t210950\ncorrelate\t210951\nmmi\t210952\nx100\t210953\n桥型\t210954\n提问贴\t210955\n医药级\t210956\n节令\t210957\n执位\t210958\n工业地产\t210959\n国安委\t210960\n无限极增健口服液\t210961\n庭院\t210962\n训练\t210963\nk380\t210964\nstructure\t210965\n启飞\t210966\n五片\t210967\n第4号\t210968\n冻干粉针\t210969\n2016.8\t210970\n田晓菲\t210971\n红郡\t210972\n瀛台\t210973\n西安市委\t210974\n罗尔斯\t210975\nTears\t210976\n艾雪\t210977\n杨丽菁\t210978\n法罗群岛\t210979\n天津男科医院\t210980\n深圳仁爱医院\t210981\n田纳西大学\t210982\n盛创\t210983\n张红丽\t210984\ne5430\t210985\n殷抗日之铁血智将\t210986\n仙园\t210987\n星城翠珑湾\t210988\n美国农业部\t210989\n对叙\t210990\n茶经\t210991\n搜狐邮箱\t210992\n吉林大学学报\t210993\n20141127\t210994\n概念版\t210995\n网络编程\t210996\n64bits\t210997\n出入境管理局\t210998\n两个铁球同时着地\t210999\n宽宏大量\t211000\n白云路\t211001\n最后一段话\t211002\n南通电视台\t211003\nBURST\t211004\n赵忠贤\t211005\n名侦探柯南:唐红的恋歌\t211006\nBlur\t211007\n美加狮\t211008\n起亚佳乐\t211009\n金粒\t211010\n陕西省高级人民法院\t211011\nthroughput\t211012\n邮册\t211013\n绿都\t211014\nshaon\t211015\n云书网\t211016\nsoup\t211017\n天源路\t211018\n告诫\t211019\n5点\t211020\n后街\t211021\nDDR3内存\t211022\nhokuyo\t211023\n坚硬\t211024\n李子迟\t211025\n救父\t211026\n奇智\t211027\nvl\t211028\n妞干网\t211029\ngds\t211030\n白黑游戏论坛\t211031\n车档\t211032\n夜秀\t211033\n马科斯\t211034\n四川地税\t211035\n曲江二期\t211036\ncharms\t211037\n汉之殇\t211038\n艾条\t211039\n擎苍\t211040\n华东五市\t211041\n1064\t211042\nSideshow\t211043\n横州镇\t211044\n上海高等研究院\t211045\n京雄高铁\t211046\n零售部\t211047\n天坛大佛\t211048\n皇府\t211049\n马君\t211050\n尽可夫\t211051\n叶茂中\t211052\n魂破九天\t211053\n余杭区区\t211054\n百悦城\t211055\nc199\t211056\n彗星来的那一夜\t211057\n李坤\t211058\nmasha\t211059\n乌兰布和沙漠\t211060\n投资学专业\t211061\n云丝\t211062\n竹内纱里奈\t211063\nRider\t211064\nblended\t211065\n一雄\t211066\n山东广播电视台\t211067\n建筑环境与能源应用工程专业\t211068\n市住建委\t211069\n胡萝卜汁\t211070\nNoteExpress\t211071\n咖喱粉\t211072\n屹立\t211073\n膝下\t211074\n大公镇\t211075\n久久不见久久见\t211076\n沈春耀\t211077\n有加\t211078\nmichaels\t211079\n脑图\t211080\n民品\t211081\n意向金\t2110
82\n2016年10月29日\t211083\n五斗橱\t211084\n酷我音乐\t211085\nGPS版\t211086\nWin7纯净版\t211087\nnew3dsll\t211088\n京东帮服务店\t211089\n金美\t211090\n满标\t211091\nDiDi\t211092\nwjw\t211093\n爱新觉罗·溥仪\t211094\n自荐生\t211095\n幹\t211096\n美天\t211097\n遮光罩\t211098\n钢铁侠\t211099\n蓝烟\t211100\n肺动脉瓣狭窄\t211101\n卧榻\t211102\npydev\t211103\nExternal\t211104\n区建设局\t211105\n报货\t211106\nsogou\t211107\n激光头\t211108\n学而思1对1\t211109\n康华眼科网\t211110\n)文化传播有限公司\t211111\n制衣\t211112\n胶粉聚苯颗粒\t211113\nhtcm8\t211114\n中华网\t211115\nhttplib2\t211116\n12丝\t211117\n委屈\t211118\n玻碳电极\t211119\n东新园\t211120\n原画集\t211121\n老苗\t211122\n周遭\t211123\n卫庄\t211124\n打通关\t211125\n压片机\t211126\n蒋磊\t211127\n夹丝\t211128\nflatten\t211129\n久光\t211130\n双张\t211131\n在线视频\t211132\n3余2\t211133\n兴永郴赣\t211134\n达尔文进化岛\t211135\nprevie\t211136\n水口\t211137\n莎乐美\t211138\nbiso\t211139\n锈蚀\t211140\n杂症\t211141\n夏茅\t211142\n母系\t211143\nAntares\t211144\n000755\t211145\n辛弃疾\t211146\n永寿\t211147\n並\t211148\n町村\t211149\nCSSCI\t211150\nExperiences\t211151\n胶南市\t211152\n长乐未央\t211153\n柘皋\t211154\n艺林\t211155\n梅花泪\t211156\n人力资源局\t211157\n良子\t211158\n水泥助磨剂\t211159\n7阶\t211160\n御宠医妃\t211161\n上火\t211162\n橄榄树\t211163\nEBM\t211164\nMiix5\t211165\n格斗士\t211166\n杭州奥体中心\t211167\n长春大学\t211168\n百\t211169\n女娲\t211170\n英勇无畏\t211171\n灰谷\t211172\n国睿科技\t211173\n共射放大电路\t211174\n廉江\t211175\n傻春\t211176\n改稿\t211177\n军心\t211178\n大众生活报\t211179\nrigid\t211180\ncad图层\t211181\n凤姐\t211182\n操盘线\t211183\n罗溪镇\t211184\nlagrange\t211185\n左冷禅\t211186\n第一季02\t211187\n输配\t211188\n鱼花\t211189\n上海市消防局\t211190\n公开表\t211191\n七月一日\t211192\n单鞋\t211193\n安全出口\t211194\n瘦身_美体_太平洋时尚网\t211195\n多连杆独立悬架\t211196\n嘻哈\t211197\n鸭屎香\t211198\n1khz\t211199\n奥妮安\t211200\nhitwh\t211201\n王受文\t211202\nVBox\t211203\nG100\t211204\n郭晓峰\t211205\n快乐星猫\t211206\np3p\t211207\n中研网\t211208\n建设招标网\t211209\n中国建筑第四工程局有限公司\t211210\n玉泉山\t211211\n南盘江\t211212\n施工招标\t211213\n_书旗520小说网\t211214\n盐城房产网\t211215\n恶食\t211216\n领花\t211217\n深宫怨灵\t211218\n死精症\t211219\n5.6.12\t211220\nl301\t211221\nLitePal\t211222\n骨伤科\t211223\n3D打印\t211224\n桃花朵朵\t211225\n四神丸\t
211226\n星际传奇3\t211227\n搬迁村\t211228\nCoreText\t211229\n和泉\t211230\n酿造\t211231\n4.5.5\t211232\n2万方\t211233\n交往\t211234\n中文在线\t211235\n100厘米\t211236\n掌电\t211237\n漫猫\t211238\n驱魔师\t211239\n费米\t211240\n怡亚通\t211241\n马德里竞技\t211242\n苏波\t211243\nGlusterfs\t211244\nLaTeX\t211245\n终末了\t211246\n同兴达\t211247\n遮阳蓬\t211248\n死心塌地\t211249\n曲云\t211250\ntabbarcontroller\t211251\n嘉庚学院\t211252\n浅色\t211253\n化简\t211254\nipv4\t211255\n蒸饭机\t211256\n盘山路\t211257\nRequirejs\t211258\n国家主席\t211259\nvanke\t211260\n北京印刷学院\t211261\n田文镜\t211262\n最终幻想13雷霆归来\t211263\n第八天\t211264\ngzy\t211265\n合肥市小学\t211266\n经营权\t211267\n蓬蓬\t211268\n清算期\t211269\n餐谱\t211270\n发送者\t211271\n长沙市南雅中学\t211272\n复旦大学药学院\t211273\n熔模\t211274\n朱厚照\t211275\n杜兰\t211276\n按理说\t211277\n墓库\t211278\n指纹\t211279\n王闯\t211280\nE人E本\t211281\ncrc16\t211282\nrest_framework\t211283\nvc++2010\t211284\n617\t211285\n55周年\t211286\nspinning\t211287\n2018.1.0\t211288\n票\t211289\njadclipse\t211290\n长沙黄花机场\t211291\n不遂\t211292\n珠地\t211293\nOPPOR9\t211294\n想到\t211295\nFloTHERM\t211296\n闪电狗\t211297\n爱的旅程\t211298\nvisual\t211299\n逐风者\t211300\nA类\t211301\n_兰\t211302\n轴数\t211303\nSur\t211304\n00090517\t211305\n芜湖路街道\t211306\n上海临港新城\t211307\n湖南建工\t211308\n发货单\t211309\n生辰八字算命网\t211310\n20180112\t211311\n均相\t211312\n金靴奖\t211313\n揭批\t211314\n南区\t211315\n贫富差距\t211316\n13r3\t211317\n摞\t211318\n程刚\t211319\n马概\t211320\n汇篇\t211321\namd64.whl\t211322\n第一号\t211323\n数只\t211324\nawb\t211325\n两月后\t211326\n箭步\t211327\n|周\t211328\n奶企\t211329\n木南日\t211330\n齿槽\t211331\n平板集热器\t211332\n二次元\t211333\n阿波龙\t211334\n黑王马\t211335\n广东新闻网\t211336\n武氏\t211337\n干涉\t211338\n上海市交通委\t211339\n选中项\t211340\nWinrar\t211341\n视神经脊髓炎\t211342\n金科\t211343\n4天后\t211344\n夏雨荷\t211345\ngit库\t211346\n西美\t211347\nrmx\t211348\n清华天河\t211349\n微蓝网\t211350\n月亮宫\t211351\n逆变\t211352\n2017-10-30\t211353\n阿维菌素\t211354\n李闯\t211355\n塔妮娅\t211356\n城寨英雄\t211357\n钱沛云\t211358\nskrill\t211359\n64下\t211360\n犟\t211361\n母差\t211362\n飞度瑞风s3\t211363\n平罗\t211364\n裙底走光\t211365\n仪陇县政府\t211366\n德\t211367\n容金珍\t211368\n滚存\t211369\n@Quer
y\t211370\n海淘转运公司\t211371\n柱塞\t211372\n接出\t211373\neyed\t211374\n成语玩命猜\t211375\n执行案\t211376\n战刻\t211377\n广州市育才中学\t211378\n我一定要\t211379\n硝酸\t211380\n出海口\t211381\n宾馆\t211382\n张娅\t211383\n影视业\t211384\nConsumers\t211385\n妙蛙花\t211386\n灰白\t211387\n水浴锅\t211388\n纽宾\t211389\n西安大略\t211390\n0.5.1\t211391\n卡雷尔\t211392\ncom口\t211393\n毁人\t211394\n简氏\t211395\n天喜\t211396\n纳米级\t211397\nprofibus\t211398\n打拆\t211399\n淬火机\t211400\n陋俗\t211401\n近四十年\t211402\n多义性\t211403\n三院\t211404\n选秀\t211405\n陈珊珊\t211406\n湘雅医院\t211407\n权术\t211408\n西直门\t211409\n二环路东\t211410\nOxide\t211411\n女人的地男人犁\t211412\n11月中旬\t211413\nint值\t211414\nkruskal\t211415\n精灵鼠传说\t211416\n松饼\t211417\n甚嚣尘上\t211418\n租户\t211419\n黄豆粒\t211420\n600667\t211421\nludashi\t211422\n腾讯手游助手卡\t211423\n沧州师范学院\t211424\n平和堂\t211425\n实验品\t211426\n心情游记\t211427\n世锦赛\t211428\n庖丁\t211429\n自立\t211430\n前去\t211431\n中国田径协会\t211432\n公权\t211433\nGoto\t211434\n尉迟琳嘉\t211435\ndom节点\t211436\nfell\t211437\n江城区人民政府\t211438\nWE言堂_WE言堂_全景网\t211439\n张晓伟\t211440\n胡薇\t211441\n龟\t211442\n边疆\t211443\n宿友\t211444\n哈尔滨西\t211445\n护助\t211446\n沈阳物流公司\t211447\nfecl3\t211448\n愚型\t211449\n中梁地产\t211450\n工程教育专业认证\t211451\n&#61483\t211452\n杂志网\t211453\n小辞店\t211454\n兢兢业业\t211455\n王有国\t211456\n许仙志\t211457\n云虚\t211458\naxb\t211459\n360免费wifi\t211460\nultra\t211461\n李馨\t211462\n死女\t211463\n上海万众医院\t211464\n东游记\t211465\n教育部学历证书电子注册备案表\t211466\n战吼萨\t211467\nIBF\t211468\n纯爱\t211469\n一个多年\t211470\n立博\t211471\n罗辉\t211472\n精忠报国岳飞传\t211473\nminix\t211474\n矫顽力\t211475\n大马戏\t211476\n计件工资制\t211477\nnaris\t211478\n鱼鳍\t211479\n苏维埃进行曲\t211480\niDrive\t211481\n大惊失色\t211482\n星歌剧\t211483\n佳能5D4\t211484\nMERCURY\t211485\nSQL\t211486\n老鼠鱼\t211487\n天弘余额宝\t211488\nymodem\t211489\n易瑞\t211490\nTranscript\t211491\n1080P电影网\t211492\n白石山\t211493\n边界值\t211494\n中央和国家机关会议费管理办法\t211495\nMyEther\t211496\n货费\t211497\ndesigner\t211498\n被打\t211499\n西瓜书\t211500\n3861\t211501\n华夏城\t211502\n收缩机\t211503\n锣鼓喧天\t211504\n13年后\t211505\n遮望\t211506\n横生\t211507\n空气动力学\t211508\n2015年终\t211509\nRegulatory\t211510\n全资子公司\t211511
\n神舟\t211512\n水波纹\t211513\n它\t211514\n区妇联\t211515\n金东区\t211516\n藏品\t211517\n13706663418\t211518\n跳跃表\t211519\n1.66G\t211520\n张新民\t211521\n骨髓移植\t211522\n重庆来福士广场\t211523\nABCC\t211524\nflop\t211525\nopensll\t211526\n中海华山珑城\t211527\n30余万\t211528\nios5.1.1\t211529\n贵州恒丰\t211530\n安塞区\t211531\n牛皮鞋\t211532\n土笋冻\t211533\n无职\t211534\nX论坛\t211535\nLNK2019\t211536\n古美西路\t211537\n文成网\t211538\n湖北人\t211539\neveryone\t211540\n黑手党2\t211541\n德善\t211542\n亲民网\t211543\n困困\t211544\ngroza\t211545\n看电视\t211546\n越王勾践\t211547\n孔姓\t211548\n新能源汽车\t211549\n拉美西斯\t211550\n五柳\t211551\nHospitals\t211552\nAirPrint\t211553\n慈善总会\t211554\nGNZ48\t211555\n木架\t211556\n东头\t211557\n2018年十九\t211558\n迅雷路由器\t211559\n迪卡\t211560\n重装机兵2重制版\t211561\n力帆520\t211562\nEXPRESS\t211563\n飞虫\t211564\n独唱\t211565\nN\t211566\nthirty\t211567\n司徽\t211568\n主演们\t211569\n锯片\t211570\npH值\t211571\nvariety\t211572\n亿房论坛\t211573\n长株\t211574\nStates\t211575\n08集\t211576\n信息群\t211577\n探险队\t211578\n热辐射\t211579\n发育\t211580\n留人\t211581\nEmlog\t211582\n巴松\t211583\n复方氨酚烷胺片\t211584\n武汉公交\t211585\nIATF16949\t211586\n人民战争\t211587\n终端号\t211588\n螺旋提升机\t211589\n清新县\t211590\n中山火炬职业技术学院\t211591\n叶音符\t211592\n塑料花\t211593\n佳奥\t211594\n荣耀盒子Pro\t211595\n侵蚀\t211596\nbatsing\t211597\n液晶电视机\t211598\npover\t211599\n取数\t211600\nvoronoi\t211601\n1月14日\t211602\ntrunc函数\t211603\nCABasicAnimation\t211604\nANN\t211605\nmoutain\t211606\n头帽\t211607\nAM08\t211608\n道奇顾道长生\t211609\n平机\t211610\nV1.1\t211611\n春秋战国时期\t211612\n坦克堂\t211613\n陈巨来\t211614\n鹏程杯\t211615\n香辣虾\t211616\n刘喜\t211617\n拟南芥\t211618\n颂钵\t211619\nnCode\t211620\n兴文县\t211621\ncoun\t211622\n破关\t211623\n福建农林大学东方学院\t211624\n交换\t211625\n査\t211626\n啦\t211627\n惰性\t211628\n西伯利亚\t211629\n县供电公司\t211630\n乐天圣苑\t211631\n金子美穗\t211632\n一鸣真鲜奶吧\t211633\nSCOP\t211634\n青奥村\t211635\nexcel饼\t211636\nSCR脱硝\t211637\n海女\t211638\nTrails\t211639\ntr行\t211640\n白猫\t211641\nansi编码\t211642\n第181集\t211643\n一圆\t211644\nbones\t211645\n蝶依\t211646\nEva\t211647\n一年前\t211648\n恩宁路\t211649\n静\t211650\n纯生啤酒\t211651\n疯石\t211652\n铊\t211653\nap
pointment\t211654\n无知者\t211655\n站长们\t211656\nUSBoot\t211657\n哈弗H9\t211658\n上海国际商品拍卖有限公司\t211659\n金家潘\t211660\n来料加工\t211661\n秦华\t211662\n潇湘\t211663\n三分地论坛留学申请版\t211664\n河北环境工程学院\t211665\n六里屯\t211666\n碎牙\t211667\n三十六年\t211668\n奎爷\t211669\n中华卫浴网\t211670\n暴虎\t211671\n15元\t211672\n背标\t211673\n防老剂\t211674\n合肥市第五十中学\t211675\n陈蓓\t211676\n枳椇子\t211677\n圣井\t211678\n动\t211679\n测验卷\t211680\n魔法门之英雄无敌3HD\t211681\n奥宝\t211682\n天龙八部2\t211683\n最终幻想2\t211684\n建筑史\t211685\n黄师\t211686\n十四分之一\t211687\n美时\t211688\n新桑塔纳吧\t211689\n滑动条\t211690\nxide\t211691\n本间\t211692\n达瓦里希\t211693\n纸贴\t211694\n乐动圈圈\t211695\n伊利亚\t211696\n又要\t211697\n星冠\t211698\n可爱的中国\t211699\n关丹\t211700\n攻击机\t211701\nRepresentations\t211702\n刘鑫\t211703\n小大人\t211704\n视觉疲劳\t211705\n不锈铁\t211706\n楚天舒\t211707\n255.255.240.0\t211708\n神宝\t211709\n半死不活\t211710\n中建大公馆\t211711\n互联网有限公司\t211712\n红旗Linux\t211713\n永旺梦乐城\t211714\ncounting\t211715\ndnf炽天使吧\t211716\nTanzania\t211717\n零息\t211718\n中国成人教育协会\t211719\n乌丢\t211720\nston\t211721\n原邦\t211722\n泰达人才网\t211723\n黑山羊\t211724\n快牙\t211725\n中驰\t211726\n复方氨酚烷胺胶囊\t211727\n溴氰菊酯\t211728\n虾籽\t211729\n养词\t211730\n福利卡\t211731\n塑料管\t211732\nrtys\t211733\n潘雨辰\t211734\n赢商网\t211735\n习式\t211736\n郁郁\t211737\nVisionUnion\t211738\n12500\t211739\n风象星座\t211740\n3e\t211741\n朴实\t211742\nlive\t211743\n机构代码证\t211744\n爱建\t211745\n福田康明斯\t211746\n双河村\t211747\n一次性工伤医疗补助金\t211748\n鞭毛\t211749\n青年广场\t211750\npk8\t211751\nMon\t211752\n矽胶\t211753\n脚手\t211754\nSLE\t211755\nnti\t211756\n4K+\t211757\n人民公社\t211758\nRendering\t211759\n菊花\t211760\nkmplayer播放器\t211761\n4A级\t211762\n78动\t211763\n去医\t211764\nexciting\t211765\n刘思纤\t211766\nalright\t211767\n23周\t211768\n过去10年\t211769\neastern\t211770\nBEFORE\t211771\n生巧\t211772\n贿赂案\t211773\n2017年4月份\t211774\n乃至\t211775\n油刹\t211776\nbaijie\t211777\nCNG\t211778\n小米vr\t211779\n钓罗非鱼\t211780\nHOT\t211781\n漫画大全_少女爱情漫画\t211782\nKSS\t211783\n众里寻他千百度\t211784\n不足为\t211785\nKTC\t211786\n台州经济开发区\t211787\n2重\t211788\n四点半\t211789\n杳\t211790\n呱呱呱\t211791\n羊奶皂\t211792\nFAT32格式\t211793\n0351\t2117
94\n超声电机\t211795\n三堂会审\t211796\n美疗\t211797\n智能视频监控\t211798\nsql服务器\t211799\n0v0\t211800\n帮助者\t211801\n国管公积金贷款\t211802\n枪王\t211803\n流氓艳遇记\t211804\n灰塑\t211805\n树状\t211806\n丘疹性荨麻疹\t211807\n砂型\t211808\n生虫\t211809\n凶手还未睡\t211810\n巴扎\t211811\n4月13号\t211812\n茂陵\t211813\nsx2000\t211814\n周艺轩\t211815\n第一棒\t211816\nCLS\t211817\nB20\t211818\nlending\t211819\n礼数\t211820\n甲缩醛\t211821\n白蛇传\t211822\n郑州铁路局\t211823\n2盘位\t211824\n熊老师\t211825\n9K\t211826\n闪光点\t211827\n2014年2月\t211828\n山西省林业厅\t211829\n以为是\t211830\nCCAI\t211831\n四伏\t211832\nBa\t211833\nINK361\t211834\n街办\t211835\n奶渍\t211836\nventry\t211837\n30周\t211838\n昆曲\t211839\n黄帆\t211840\n吹塑机\t211841\n董卿朗\t211842\n110网\t211843\n弗朗西斯科\t211844\n75型\t211845\n截根\t211846\n青葱岁月\t211847\n闷死\t211848\n粉丝圈\t211849\n甲鱼汤\t211850\n集体\t211851\ngsoap\t211852\nsubscriptions\t211853\n纸卷\t211854\nSnapKit\t211855\n保利林语溪\t211856\nbeforeunload\t211857\n南京地铁6号线\t211858\n雷勇\t211859\n焖\t211860\n人民银行县支行\t211861\npodcast\t211862\n四自\t211863\n圆锥破碎机\t211864\n一秒\t211865\n地球百子\t211866\nFTF\t211867\n0KB\t211868\n犯了错\t211869\n清明\t211870\n字本\t211871\n黑胡椒酱\t211872\n龇牙咧嘴\t211873\n武胜县\t211874\n黄碧云\t211875\nc++17\t211876\n千层底\t211877\nnhibernate\t211878\n抛光布\t211879\n125分钟\t211880\n七支\t211881\n为其\t211882\nHanger\t211883\n囗\t211884\n泥塑\t211885\n找死\t211886\n哈师大\t211887\nwsp\t211888\n无限挑战吧\t211889\n斑马智行\t211890\nSHY\t211891\n屎味\t211892\n中华医学会临床药学分会\t211893\n我的世界mc\t211894\n编织机\t211895\n零丁洋\t211896\n突兀\t211897\n王德禄\t211898\ngtx970\t211899\n名域\t211900\n奥菲欧\t211901\n失眠夜\t211902\nIPad\t211903\nPAL\t211904\n终极武学\t211905\n陈观泰\t211906\n防洪法\t211907\nAdjustable\t211908\n新锐\t211909\nm2s\t211910\n活水\t211911\n阻尼\t211912\n钟淑\t211913\n企业投资项目核准和备案管理办法\t211914\n2400mhz\t211915\n田横镇\t211916\n铜锈\t211917\nEssays\t211918\n注安师\t211919\nluciferase\t211920\nastra\t211921\n嘟拉\t211922\n电动泵\t211923\nCyprus\t211924\n磁致伸缩位移传感器\t211925\n背背佳\t211926\nNETWORK\t211927\n香客\t211928\nepeter\t211929\n恒强\t211930\n鬼吹灯之牧野诡事\t211931\n周到\t211932\n沉思者\t211933\n杨高北路\t211934\n良乡镇\t211935\n友爱路\t211936\n段云\t211937\n估
一估\t211938\n宝马7系报价\t211939\n固安捷\t211940\n带圈\t211941\n按键精灵9\t211942\n曹国舅\t211943\n人力资源管理系统\t211944\n陈昌智\t211945\n王冬儿\t211946\n生财\t211947\neconomist\t211948\n青霉\t211949\n任飞\t211950\nwebStorm\t211951\n先天性白内障\t211952\n鸽王\t211953\n昆虫简笔画\t211954\n王紫菲\t211955\n隧洞\t211956\nCm\t211957\n斯特拉斯堡\t211958\n菱形\t211959\n陈女士\t211960\n税延\t211961\n中国铁建\t211962\n初访\t211963\n荷尔拜因\t211964\n槽\t211965\n芈月传\t211966\n教务处\t211967\n周宏伟\t211968\n前门店\t211969\n第五卷\t211970\n1041\t211971\n新东方在线\t211972\n美年体检中心\t211973\n观海听涛\t211974\n伸肌\t211975\n系绳\t211976\ndetroit\t211977\n汉班托塔港\t211978\n教育管理专业\t211979\n基金委\t211980\n19楼\t211981\n李咏梅\t211982\n舜耕\t211983\n9.27\t211984\n机械族\t211985\nD\t211986\n珠海大学\t211987\n推歌\t211988\n上海交通大学附属瑞金医院\t211989\n忻州一中\t211990\n巴莉\t211991\n皮纹纸\t211992\n银雾湖\t211993\n提取罐\t211994\n焦火\t211995\n唯二\t211996\n美年广场\t211997\n一套秒\t211998\nLellansin\t211999\nanchors\t212000\n市场监管局\t212001\n异型鼠鱼\t212002\n信用家\t212003\nMANGO\t212004\n阿弥陀佛\t212005\nopenmp\t212006\n长幅\t212007\n接触者\t212008\nsweet\t212009\n老君洞\t212010\n159\t212011\n富宝\t212012\n第三张\t212013\n修身养性\t212014\n深不见底\t212015\n谷阳北路\t212016\n杨群\t212017\n证券从业资格证书\t212018\nArnoldLu\t212019\n多指\t212020\n死人\t212021\n家|撸\t212022\n柯尔\t212023\nmmj\t212024\n售房\t212025\npackages\t212026\n粉底液\t212027\n化学工程与工艺专业\t212028\n墓\t212029\n钥匙盘\t212030\n207\t212031\n明细余额\t212032\nellipsis\t212033\n前情\t212034\n观摩\t212035\n五升六\t212036\n华侨城集团\t212037\njackey\t212038\n荣耀S10\t212039\n6305\t212040\n我欲天下第一妃\t212041\ncctv\t212042\n11月22日\t212043\nmysql_query\t212044\n陋室\t212045\nQunar.com\t212046\n孟广美\t212047\n暗黑天\t212048\nler\t212049\nAsymmetric\t212050\nbourne\t212051\n美攻\t212052\nmille\t212053\n泄愤\t212054\n9折\t212055\n蒸面\t212056\n凯子\t212057\n华东师大二附中\t212058\nizip\t212059\n有点像\t212060\n陈钟\t212061\n龅牙兔\t212062\n二码\t212063\n六线谱G\t212064\n60缸\t212065\n博金贷\t212066\n技术标\t212067\nAPScheduler\t212068\n土窑\t212069\n中华灯\t212070\n____\t212071\nplecs\t212072\n维基链\t212073\n郭霖\t212074\n毕赤酵母\t212075\n黄务\t212076\n共度难关\t212077\n协调员\t212078\n3777\t212079\n浙商中拓\t212080\napproximate\
t212081\n皮肤病\t212082\n微社区\t212083\n重打\t212084\nRoms\t212085\n戴\t212086\n温度变送器\t212087\n58帮帮\t212088\n杭州女装网\t212089\n刘振华\t212090\nRIGHT\t212091\n百问\t212092\n捉拿\t212093\n苗头\t212094\n数家\t212095\n唐家堡\t212096\n雪狐\t212097\n雏田轮\t212098\n切肉机\t212099\n阜成门外大街\t212100\n九好集团\t212101\n淘宝试用中心\t212102\n向阳处\t212103\n涂装\t212104\n割地\t212105\n超声波振动筛\t212106\n野化\t212107\n污文\t212108\n欧树\t212109\n司法\t212110\n虎丘塔\t212111\n自取\t212112\nopenCV\t212113\n铜官山\t212114\n福州省\t212115\n阎王爷\t212116\n四胎\t212117\n智叟\t212118\n用友知识库\t212119\n智器\t212120\nNaver\t212121\n南京市政府\t212122\npreserved\t212123\n林度\t212124\n刘正风\t212125\n大桥未久爬玻璃\t212126\n个子类\t212127\nvfx\t212128\nchl\t212129\n中国总会计师协会\t212130\n八十个\t212131\n泰州南站\t212132\n唐山北站\t212133\nDragons\t212134\n数控折弯机\t212135\ndwg文件查看器\t212136\n弓长\t212137\nCentury\t212138\n南村\t212139\n奇星记\t212140\n动之以情\t212141\n结垢\t212142\n未名医药\t212143\n求名\t212144\nMilestone\t212145\n四大家族之龙虎兄弟\t212146\n大舜\t212147\nhd800\t212148\n相同点\t212149\n退款\t212150\n蜡嘴\t212151\n金海湾\t212152\nhadid\t212153\n卖图\t212154\n鬼谷子\t212155\ncontractual\t212156\n百里杜鹃\t212157\nreshape\t212158\n酥咔饼干\t212159\nnlogn\t212160\n盐城市区\t212161\n过年啦\t212162\n雅称\t212163\nintellij-idea-2016\t212164\n交感神经\t212165\n56f4-46c7-b648\t212166\n汀\t212167\n睿智\t212168\n澳\t212169\nSQLServer\t212170\n坚不可摧\t212171\n尸骸\t212172\n脚汗\t212173\n年收益率\t212174\n舌尖上的中国3\t212175\n皇明太阳能\t212176\n价键\t212177\n/empty\t212178\n罚球\t212179\n贝朗医疗\t212180\n甜水\t212181\n局域网管理工具|天易成网管局域网监控软件\t212182\nDl\t212183\nEEA\t212184\nMOSFET\t212185\n第130集\t212186\n直流\t212187\n谐振式\t212188\n礼智\t212189\nAniston\t212190\nsetnx\t212191\n朝向\t212192\n吻照\t212193\n无水氯化钙\t212194\n14W\t212195\nbluez\t212196\nPPS\t212197\nFriendship\t212198\nremoving\t212199\n甘当\t212200\n北京装饰公司\t212201\n金葵\t212202\n张小姐\t212203\nmyi\t212204\n直到黎明\t212205\n一列\t212206\n长城VV7\t212207\n胡润四大名捕\t212208\n曲筱绡\t212209\n半妖司藤\t212210\nSkull\t212211\n于震\t212212\nHYDAC\t212213\n木刻画\t212214\nvlfeat\t212215\n老乔\t212216\n铺层\t212217\ngwy\t212218\n6万\t212219\n伊思\t212220\n中和反应\t212221\n南通博物苑\t212222\n拉入\t21
2223\n罗纹\t212224\n3598元\t212225\n瑞典皇家理工学院\t212226\n安徽省经济和信息化委员会\t212227\nSACDR\t212228\n赛轮\t212229\n中国纪检监察\t212230\n沧海遗珠\t212231\n六月\t212232\n韩志国\t212233\n三国志吕布传\t212234\nheal\t212235\n第18\t212236\n仙坛股份\t212237\n82.5\t212238\n刘璟\t212239\n猫熊\t212240\n海棠树\t212241\n出院后\t212242\n阴夫\t212243\nplanetary\t212244\n左心耳封堵术\t212245\n英利浦\t212246\n中国女装网\t212247\n虞初新志\t212248\n峰峰矿区\t212249\nCH4\t212250\nfluentd\t212251\n路饭网\t212252\n停搏\t212253\n普济\t212254\n大学生创业网\t212255\n棉花堡\t212256\nduncan\t212257\n闽中\t212258\n哀其不幸\t212259\n王岚\t212260\n20171112\t212261\n龙王溪\t212262\n醴陵市政府\t212263\n小米网\t212264\n闪住\t212265\n允星河\t212266\n5299\t212267\nGen\t212268\n观光客\t212269\n漏洞库\t212270\n广播系统\t212271\n学思\t212272\n潘家湾\t212273\n烽火芳菲\t212274\n湖南工业大学\t212275\nGIANT\t212276\n超声波发生器\t212277\n中国证劵监督管理委员会\t212278\n安卓篇\t212279\n锣机\t212280\npassengers\t212281\nmicrousb\t212282\n家教网\t212283\n美白丸\t212284\ngeth\t212285\n4晚\t212286\n数秒\t212287\n硅胶垫\t212288\n新宝3\t212289\n婚纱\t212290\n局域网交换机\t212291\n9280\t212292\n90分\t212293\n上交大\t212294\n罗弗敦群岛\t212295\n立方数\t212296\n洋河大曲\t212297\n王开\t212298\n临安市\t212299\n瑾年\t212300\n苏哈托\t212301\n玻璃器皿\t212302\nNCR\t212303\n外高\t212304\n有皮\t212305\n朱建国\t212306\n毛病\t212307\nwww.xuexi111.com\t212308\n青少年教育\t212309\n270\t212310\n张哲\t212311\n易于\t212312\n杨花\t212313\n法力无边\t212314\n1-3月份\t212315\n张永歆\t212316\n李珣\t212317\n水杨酸钠\t212318\n财满街\t212319\njseea\t212320\n同性\t212321\nipsec\t212322\n技术馆\t212323\n七强\t212324\n秀山县\t212325\n说话声\t212326\n博朗耳\t212327\nChart\t212328\nwps吧\t212329\n不帅\t212330\n普工\t212331\n厦门金旅\t212332\n北极星小姐姐\t212333\n发蜡\t212334\n救赎者\t212335\n翻动\t212336\n过量\t212337\n陈庭威\t212338\n2.125\t212339\n20170717\t212340\n佳能60D\t212341\n我的儿子\t212342\nAVX\t212343\n开贴\t212344\n香市动物园\t212345\nmirrorlist\t212346\n乡村公路\t212347\n长果\t212348\n写入\t212349\n电缆管\t212350\n南湖景区\t212351\n种瓜得\t212352\n姜云凡\t212353\nMuMu模拟器\t212354\n定增\t212355\n瘦骨\t212356\n张焱\t212357\n鞋\t212358\n花町物语\t212359\nion-tabs\t212360\n旭光\t212361\n公署\t212362\n查股网\t212363\n合肥一六八中学\t212364\n不惮\t212365\n华中师大\t212366\n乐园\t212367\n三
门峡网\t212368\n龙珠超漫画\t212369\n16毫米\t212370\n瑰丽\t212371\n万仞\t212372\n手形\t212373\n芙莉美娜\t212374\n鱼线轮\t212375\nSeparation\t212376\n1000多块\t212377\n酒钢\t212378\nFebruary\t212379\n卡索拉\t212380\n互助群\t212381\n经书\t212382\n千寻瀑\t212383\nshanshan\t212384\n中菲\t212385\n银行履约保函\t212386\n【帝\t212387\n终末旅行\t212388\n刘伟\t212389\n武汉宜家\t212390\nscx\t212391\n格列齐特\t212392\n高山仰止\t212393\n几_\t212394\n蒙氏幼儿园\t212395\n21世\t212396\n固晶机\t212397\n世贸组织\t212398\n梁海玲\t212399\n拼邮\t212400\n砀山\t212401\n事通\t212402\n保本型\t212403\nmemorial\t212404\n圈友\t212405\n咬口\t212406\n韩国男团\t212407\n龙海家园\t212408\nActiveSync\t212409\n黑龙江省安全生产监督管理局\t212410\nLPS\t212411\n64wei\t212412\n含毒\t212413\n雾灯\t212414\n亚沙\t212415\n云和梯田\t212416\n兼有\t212417\n那么\t212418\n董卿\t212419\nFAQs\t212420\nMilf\t212421\n唇液\t212422\n第十三集\t212423\n李承铉\t212424\nAppInventor\t212425\n法徽\t212426\n陕西省委党校\t212427\n画曲线\t212428\n星球大战7\t212429\n客户\t212430\n门窗\t212431\n第二层\t212432\n在线棋牌游戏平台\t212433\nscenarios\t212434\n正心谷\t212435\nAmmonium\t212436\n沉香\t212437\namlogic\t212438\n编程高手\t212439\n等不起\t212440\n信用风险\t212441\n形而上学唯物主义\t212442\n浔兴\t212443\nHeath\t212444\n长沙联通\t212445\n云计算技术频\t212446\nfoward\t212447\n顶托\t212448\n射洪新闻网\t212449\n昔日\t212450\n朝阳政府网\t212451\n广交\t212452\n古斯塔斯\t212453\n中国共产党中央委员会宣传部\t212454\n裘千仞\t212455\nnut\t212456\n8辑\t212457\n机电工程学院\t212458\n雷霆世纪\t212459\n杨捷\t212460\n热敷\t212461\nmapminmax\t212462\nC45\t212463\n梁衡\t212464\n卯榫\t212465\n虎牙直\t212466\nkeycode\t212467\n无锡南禅寺\t212468\n中国营养学会\t212469\nDictionaries\t212470\n虾丸\t212471\n丑夫\t212472\n兰道\t212473\n70型\t212474\n脾湿\t212475\n无机盐\t212476\n圣美\t212477\n天津幼儿园\t212478\n批驳\t212479\n阳光政府\t212480\n草树\t212481\n中山市南区\t212482\n泪如雨下\t212483\n比方说\t212484\nsleep\t212485\n果冻鞋\t212486\n声轨\t212487\n团员\t212488\n例数\t212489\n鼻镜\t212490\nチェ\t212491\n铜线\t212492\n2月19日\t212493\n重度抑郁症\t212494\n押尾光太郎\t212495\nnvidia\t212496\n连接符\t212497\n20180414\t212498\n生发液\t212499\n情仇\t212500\n潇湘晨报\t212501\n胖东来\t212502\n观感\t212503\n多乐之日\t212504\n分封制\t212505\n王俊琪\t212506\n权金城\t212507\n主帅\t212508\n参拍\t212509\n政府采购货物和服务招标投标管理办法\t212510
\n湛江钢铁\t212511\nCrimes\t212512\n茶房\t212513\n涂料配方网\t212514\n哈迪\t212515\n体型\t212516\n通谋\t212517\ntv4\t212518\n消化\t212519\n中情局\t212520\n浴帘\t212521\n糯米团子\t212522\n酒精性\t212523\n红色警戒3起义\t212524\nPhong\t212525\n含辛茹苦\t212526\n特工类\t212527\n配股\t212528\nSCSS\t212529\nGAY\t212530\nSTM32/STM8\t212531\nAzusa\t212532\nTUN\t212533\nヒミツ\t212534\n_房网\t212535\nHqew\t212536\n新编日语\t212537\n北京市委组织部\t212538\n美妆百科\t212539\n后父\t212540\n霍拉\t212541\n水下滑翔机\t212542\n北京后海\t212543\n九溪\t212544\n安仁\t212545\n▉\t212546\n利普刀\t212547\n父子文\t212548\n托运箱\t212549\n讨论\t212550\ntara\t212551\n君奉天\t212552\n安泰信\t212553\n讯息\t212554\n瘟\t212555\n神舟九号\t212556\niphone5\t212557\nconverters\t212558\n语文港\t212559\n火眼金睛\t212560\n领英吧\t212561\n米乔\t212562\n唐艺昕\t212563\n拾音器\t212564\ne63\t212565\n吃哑巴亏\t212566\n恒顺\t212567\n西天\t212568\n西电东\t212569\n中华鳖\t212570\n豪欲\t212571\n小米盒子遥控器\t212572\n图书展\t212573\n8816\t212574\n老同学\t212575\nspass\t212576\nps磨皮\t212577\nFisting\t212578\n400M\t212579\n何晶\t212580\n三仓\t212581\n滇剧\t212582\nプレミアム\t212583\n贬谪\t212584\nERC\t212585\n土豆块\t212586\niPhone5S\t212587\n有前途\t212588\n普瑞维亚\t212589\n六十\t212590\n张洪涛\t212591\n沃腾\t212592\ntrustee\t212593\n张宝华\t212594\n上个月\t212595\n贝瓦儿歌大合集全集\t212596\n西亭\t212597\n激光雷达\t212598\n魂不守舍\t212599\n平塘县人民政府\t212600\n曲阳县\t212601\n全景式\t212602\nHuTiger\t212603\n孙政才\t212604\nhxw\t212605\n德才兼备\t212606\n中奖\t212607\n重生之俗人\t212608\n咖啡屋\t212609\n刀口处\t212610\n北京大学经济学院\t212611\n张佳玮\t212612\n怪兽娘\t212613\n捉鸡麻将\t212614\n酸乳\t212615\n1410\t212616\n马场\t212617\n国恩股份\t212618\n丹皮酚软膏\t212619\n楚天科技\t212620\nipaid\t212621\n虹桥天街\t212622\n神父\t212623\n凯迪拉克xts\t212624\nui设计\t212625\n51条\t212626\n56厘米\t212627\n营销类\t212628\nKhaled\t212629\n99年\t212630\n学习率\t212631\ndotamax\t212632\n中山东一路\t212633\n风音\t212634\nie10浏览器\t212635\n苍狗\t212636\n王来\t212637\ncuckoo\t212638\n国家总局\t212639\n沈阳音乐学院\t212640\n鑫\t212641\n绷得\t212642\nCTeX\t212643\n外人\t212644\n夜曲\t212645\n小浪底风景区\t212646\npes2017\t212647\n得票\t212648\n电动车棚\t212649\nsoap协议\t212650\n太原物流公司\t212651\n直型\t212652\n以利亚\t212653\n贝雷桥\t212654\n山岸\t212655\n排列3\t212
656\n透顶\t212657\n厦建工\t212658\n无锡市水利局\t212659\nFAST4WARD\t212660\n宜宾三江房产网\t212661\n安兹乌尔恭\t212662\n南通理工学院\t212663\n中国婴童网\t212664\n高增益\t212665\n食色大陆\t212666\nsinatra\t212667\nBiggest\t212668\n零次元\t212669\n二十届\t212670\n孙湘\t212671\n255号\t212672\n扬州日报\t212673\n选妃\t212674\n南蛮\t212675\n陈志成\t212676\n山东人大\t212677\n林微因\t212678\n芝麻街\t212679\nhelicopter\t212680\nNusa\t212681\n内带\t212682\n中原网\t212683\n本规定\t212684\n黄浦区中心医院\t212685\nMicrosof\t212686\n勇气默示录2\t212687\n艺版\t212688\n胡娟\t212689\n橄榄果\t212690\n玩游戏\t212691\n吸血虫\t212692\n合肥小区\t212693\n石井村\t212694\nILLUSTRATOR\t212695\n沈国军\t212696\n淡水龙虾\t212697\n哟西\t212698\n详释\t212699\nmnet\t212700\ninvented\t212701\nusb转换器\t212702\n分布式计算\t212703\n源城区人民政府\t212704\n这样做\t212705\n多恩\t212706\nrostopic\t212707\n第22部\t212708\n少年阿宾\t212709\n中山大学出版社\t212710\n岳父母\t212711\n血岩\t212712\n双打\t212713\nresourcemanager\t212714\n防化\t212715\n天地线\t212716\n本等\t212717\n徐长青\t212718\n枫儿\t212719\n军地\t212720\n尘缘\t212721\n炉渣\t212722\n南京地铁8号线\t212723\n想吐槽\t212724\n伊丽莎白一世\t212725\n犒赏\t212726\n属灵\t212727\n最露\t212728\n第86集\t212729\n#墓王之王悬棺寺#\t212730\nTCODE\t212731\nトヨタ\t212732\n30载\t212733\n华侨医院\t212734\n双桥东路\t212735\n新秩序\t212736\n金唐\t212737\n古道\t212738\n刀版\t212739\n景道\t212740\nPC游\t212741\n百度影视\t212742\n1414\t212743\n新东方集团\t212744\n威尼斯\t212745\n摇控\t212746\n两三万\t212747\n1.5MW\t212748\n外裤\t212749\n百域阁\t212750\n3平米\t212751\n造梦西游3\t212752\n右江民族医学院\t212753\n锦溪镇\t212754\ncertified\t212755\n填充率\t212756\n5.7亿\t212757\n浙大科技园\t212758\n尚小云\t212759\n偏暖\t212760\n抽逃\t212761\n404错误\t212762\n红红\t212763\n碎片化\t212764\n纪灵\t212765\n父与女\t212766\n声级\t212767\nlp-question\t212768\n有爱\t212769\n香槟色\t212770\nMultiScatter\t212771\nWHY\t212772\n埭头\t212773\n广东新闻_南方网\t212774\n赵冬\t212775\n神兵\t212776\n小桂\t212777\n养卡\t212778\n加仑\t212779\nhunters\t212780\nremount\t212781\n时隔\t212782\n南朝宋\t212783\n程维高\t212784\n广深港高速铁路\t212785\n黄遵宪\t212786\n护食\t212787\nCreator\t212788\n5558\t212789\n各学校\t212790\n广东奥飞动漫文化股份有限公司\t212791\n莲藕汤\t212792\n爬杆\t212793\n自知者明\t212794\n荣威Marvel\t212795\n工商管理类\t212796\n长沙公司\t212797\n阅点\t
212798\n8024001\t212799\nRoboto\t212800\n扎刺\t212801\n西安国际\t212802\n辛酉\t212803\nVirSCAN\t212804\n游龙传说\t212805\nbooking酒店\t212806\n找钱\t212807\nSoldier\t212808\n祁连山国家级自然保护区\t212809\n城市之心\t212810\npro/e\t212811\n代理记账许可证\t212812\n福汇\t212813\n漆黑一片\t212814\n阳光100国际新城\t212815\n大手子\t212816\n黯羽\t212817\naam\t212818\n樱田\t212819\n开疆\t212820\n坏孩子\t212821\n三种人\t212822\n三生三世十里\t212823\n云展网\t212824\n贴身高手\t212825\nev71\t212826\nVIII\t212827\n红盾工商网\t212828\n三国杀_三国杀\t212829\n射孔\t212830\n75欧\t212831\n全国银行间同业拆借中心\t212832\n停止位\t212833\n消魂\t212834\n文彩\t212835\n聂树斌\t212836\n公正性\t212837\n哪此\t212838\n【文山\t212839\nFn键\t212840\nshuomi\t212841\n官梯\t212842\n石库门\t212843\nXXNX\t212844\n跳舞街\t212845\n沪松公路\t212846\n中国哲学史\t212847\nloveis715\t212848\n千广网\t212849\n但愿人长久\t212850\n记录\t212851\niterate\t212852\nZXA10\t212853\n首汽集团\t212854\n北京林业大学\t212855\n北京现代ix35\t212856\n行千里\t212857\n丁一宇\t212858\n通用航空机场\t212859\nwid\t212860\n一笑而过\t212861\n公眾\t212862\n惊爆游戏\t212863\n杨云\t212864\n第二世界\t212865\nStephen\t212866\nAdecco\t212867\nDM单\t212868\nEMB\t212869\n奥一\t212870\nCici\t212871\n樱井步\t212872\n领克01论坛_汽车之家论坛\t212873\n李山甫\t212874\nHeadphone\t212875\n后悔药\t212876\n圣祠\t212877\n诗织\t212878\n魔怔\t212879\n逃逸案\t212880\n上意\t212881\n科苑小区\t212882\n整月\t212883\n9月9\t212884\n1199元\t212885\nDNSPod\t212886\n_联英人才网\t212887\nFTSE\t212888\n新浪体育台\t212889\n薛鹏\t212890\n镰刀菌\t212891\n正楷\t212892\nDan\t212893\n厦门大学经济学院\t212894\n普兰店\t212895\n中州杯\t212896\n杀人蜂\t212897\n皮哥\t212898\n鄢陵县\t212899\nwww.2kxs.com\t212900\nhimself\t212901\n三十二个\t212902\n宜人贷\t212903\n福建高院\t212904\n13点\t212905\n卖品\t212906\n游戏机\t212907\n神谷浩史\t212908\nIRONMAN\t212909\n张文忠\t212910\n涂去\t212911\n邹碧华\t212912\n浙江省医院\t212913\n25岁\t212914\n清远职业技术学院\t212915\n裹身\t212916\n荣耀兰陵王\t212917\n华北集团\t212918\n封景\t212919\nocd\t212920\n37.5%\t212921\n欧普特蒙\t212922\n路标\t212923\n端子板\t212924\nmy399.com\t212925\n名扬\t212926\nMeitu\t212927\n中牟县政府\t212928\n有限元分析软件\t212929\n古窑民俗博览区\t212930\nsymbol\t212931\n小冰冰传奇天空要塞\t212932\n捐献\t212933\n票数\t212934\n无尽之剑2\t212935\n乳期\t212936\n3雷\t212937\n慈溪市规划局\t212938\
nMybatis-Plus\t212939\n哈尔滨路\t212940\n万人\t212941\n吊销\t212942\nalb\t212943\nhacg\t212944\nmarshmallow\t212945\nV形\t212946\n青秀城\t212947\n鸡蛋汉堡\t212948\n李晟\t212949\nwin764位旗舰版\t212950\ngoals\t212951\n相对估值法\t212952\n绯月\t212953\n路考\t212954\n惠州市技师学院\t212955\n工艺部\t212956\n苏小小\t212957\nn3dsll\t212958\n山流水\t212959\n独单\t212960\n有法必依\t212961\n恣意\t212962\n四风谷\t212963\n中科院自动化所\t212964\nmindmanger\t212965\nChinaUnix论坛\t212966\nDaytona\t212967\n铭城\t212968\n莉亚迪桑\t212969\n滑\t212970\n拉手网\t212971\n棠棣\t212972\n长亮科技\t212973\n凑单\t212974\n紫吕\t212975\n哀婉\t212976\n苍玄\t212977\n致美丽的你\t212978\n黄鹤\t212979\n二巡\t212980\n青芝坞\t212981\n被撤\t212982\n球面波\t212983\n张梁\t212984\n口枷\t212985\n中国人大\t212986\n大熊山\t212987\n切片操作\t212988\n1磅\t212989\n松金洋子\t212990\n检委会\t212991\n精尽人亡\t212992\n20150105\t212993\n截出\t212994\n测量值\t212995\n上东国际\t212996\nwmts\t212997\n拒签\t212998\n皮先生\t212999\n包青天之碧血丹心\t213000\n溶质\t213001\n混悬剂\t213002\n跪倒\t213003\nMD760CH/\t213004\n第119\t213005\n挑刺\t213006\n婚姻\t213007\nstudio8\t213008\n中加枫华国际学校\t213009\n桑葚\t213010\n小护士\t213011\n万年青\t213012\n接图\t213013\n男字\t213014\n黑弓\t213015\n彼特\t213016\n三言二拍\t213017\n整经机\t213018\n960万\t213019\n学修\t213020\n上海双旭电子有限公司\t213021\n马脚\t213022\n遵化市\t213023\n话不投机\t213024\n金蝶KIS标准版\t213025\nLYH\t213026\n12oz\t213027\nAllen\t213028\n助勃\t213029\nDavidson\t213030\n萜类\t213031\n橘颂\t213032\n团体操\t213033\npayne\t213034\n铃木保奈美\t213035\n广东省建设厅\t213036\n山葵\t213037\n信诚\t213038\n7610\t213039\n阴气\t213040\n皮包\t213041\n吉网\t213042\nThinkP\t213043\n30KW\t213044\n伊琳娜\t213045\n天欧\t213046\n未婚夫\t213047\n鑫瑞\t213048\n铁包\t213049\n上海上海上海联通\t213050\n名商\t213051\n太婆\t213052\nXDR\t213053\n文赋\t213054\nKills\t213055\n冲线\t213056\n滞箱费\t213057\n12亿元\t213058\n接生\t213059\n56度\t213060\n攀藤\t213061\n雪精灵\t213062\n7D\t213063\nhmaster\t213064\n5440\t213065\n肝纤维化\t213066\n装逼犯\t213067\n时差\t213068\n马未巨石强森\t213069\n试题\t213070\nrestrict\t213071\n米奇网\t213072\n比比较好\t213073\n质量检验员\t213074\n510000\t213075\nv4.1.0\t213076\n红盒\t213077\n红心芭乐\t213078\n内雕\t213079\n分页\t213080\nmogodb\t213081\n14寸\t213082\n70迈\t213083\n北雄\t2
13084\n麦郎\t213085\n收获日\t213086\nrendertexture\t213087\n紫光控股\t213088\n石城县人民政府\t213089\n甜食\t213090\n乳化液泵\t213091\n高配\t213092\nnestle\t213093\nshp格式\t213094\n买卖宝\t213095\n老韩\t213096\n金瓶\t213097\n野战门\t213098\nCEVA\t213099\n芭蕾舞裙\t213100\n5亩\t213101\n入托\t213102\n紫郡\t213103\nsmack\t213104\n克感敏\t213105\n螳螂拳\t213106\n作文\t213107\n汤不热\t213108\n114名\t213109\n心灰意冷\t213110\nQualified\t213111\n吊坠\t213112\n数十亿\t213113\n考务工\t213114\n李光羲\t213115\n福田区小学\t213116\nWOC\t213117\nVit\t213118\n正态分布检验\t213119\n斯达克\t213120\n成都国际商贸城\t213121\n翠园中学\t213122\npremium12\t213123\n聚簇索引\t213124\nctl\t213125\n读生\t213126\n99re\t213127\n﹕\t213128\n北凉悍刀行\t213129\n告终\t213130\n程耳\t213131\n撼动\t213132\n删文\t213133\n十进制转二进制\t213134\n有益\t213135\n桃花位\t213136\n锦屏县人民政府\t213137\n3068\t213138\n废名\t213139\n那头\t213140\n老帕\t213141\nstabilizer\t213142\n新手期\t213143\n通知了\t213144\n打工\t213145\nsupor\t213146\n十二生肖三合\t213147\n上海火车站南广场\t213148\n3ST\t213149\n怀仁\t213150\nclimb\t213151\n玉树县\t213152\n瞅\t213153\n200PLC\t213154\n海运拼箱\t213155\n佛子岭水库\t213156\n请缨\t213157\n感性负载\t213158\n新员\t213159\n肾上腺腺瘤\t213160\n帮众\t213161\n天明\t213162\n小說章節閱讀\t213163\n天津海昌极地海洋公园\t213164\n覆膜板\t213165\n全身性\t213166\n交互稿\t213167\nG大调\t213168\nMeiko\t213169\n尚都\t213170\ncorrosion\t213171\n抚养\t213172\n翡翠路\t213173\n长巾\t213174\n慕容雪\t213175\n王韬\t213176\nIntro\t213177\n深圳市国家税务局\t213178\n二件套\t213179\n逃学威龙1\t213180\n世界历史网\t213181\n戴金\t213182\n伶人\t213183\n新任职\t213184\n自圆其说\t213185\n中星微电子\t213186\nepb\t213187\n胜利街\t213188\n招商银行杭州分行\t213189\ngio\t213190\n袁腾宁泽涛\t213191\n七秀坊\t213192\n蒸发量\t213193\n任我橹\t213194\n艾弗王东明\t213195\n线率\t213196\n程村\t213197\n难追\t213198\n红楼梦之十二金钗\t213199\n张学武\t213200\n佳能m3\t213201\n雷凌论坛\t213202\n许用应力\t213203\n汤姆克兰西:全境封锁\t213204\n复数类\t213205\n新干员\t213206\n卢比\t213207\n再见十八班\t213208\n)投资咨询有限公司\t213209\nAchievements\t213210\n一个8岁\t213211\n幼儿教育\t213212\nNotch\t213213\n罗小黑战记\t213214\n数据编辑器\t213215\n张颖\t213216\n迪信\t213217\n绵绸\t213218\n遁地\t213219\nCONTACT\t213220\n录播\t213221\n戴敦邦\t213222\n产乳\t213223\n林登万\t213224\n车点\t213225\nruanjian\t213226\n侵犯著作权罪\t21322
7\n五险二金\t213228\n密云区\t213229\n和煦\t213230\n囚爱\t213231\n长沙市博物馆\t213232\nsalary\t213233\n工程力学\t213234\n本位币\t213235\n妮可·基德曼\t213236\n杜宪\t213237\nfreshman\t213238\n二重身\t213239\n遮遮掩掩\t213240\nshoot\t213241\ntis\t213242\n五公祠\t213243\n美少女战士\t213244\ntostring\t213245\n繁华大道\t213246\n科拉传奇\t213247\n党务公开制度\t213248\n背下\t213249\n新疆师范大学\t213250\n塔吊顶\t213251\n嗲\t213252\n水漏\t213253\n26磅\t213254\n滑板车\t213255\n销签\t213256\n语音识别系统\t213257\n赛摩\t213258\n阿泱\t213259\n187a\t213260\n重大疾病险\t213261\n大商城\t213262\n生态瓶\t213263\n上秋\t213264\n华为助手\t213265\n保利上城\t213266\n方元\t213267\n阿拉巴马州\t213268\n菠萝鱼\t213269\n涉外民事关系法律适用法\t213270\nvers\t213271\n恩特\t213272\n泥河\t213273\n磷酸三丁酯\t213274\n江宏杰\t213275\n青占鱼\t213276\njjj传奇\t213277\n穆大叔\t213278\n修剪机\t213279\n中铁快运\t213280\nShowFuli|秀\t213281\n名牌\t213282\n白浅夜华\t213283\n七起\t213284\n左臂\t213285\n海澜集团\t213286\n墓兽\t213287\n摄食\t213288\n影赛\t213289\n网通\t213290\n返回值函数\t213291\n天蛇\t213292\n凑巧\t213293\n封山育林\t213294\n钝器\t213295\n9.9\t213296\n黄江吉\t213297\n丁字\t213298\n钟表展\t213299\n纳西情歌\t213300\n负压式\t213301\n吕玲绮\t213302\n周知\t213303\nopenpose\t213304\n金曲捞\t213305\n加州大学圣克鲁兹分校\t213306\n人造地球卫星\t213307\n瑞刷\t213308\ncufflinks\t213309\n阿里p6\t213310\n中国重型汽车集团有限公司\t213311\n狐狸和乌鸦\t213312\n黄芪汤\t213313\nnara\t213314\n汇福家园\t213315\n带板\t213316\n安富\t213317\n木子李\t213318\n计划生育协会\t213319\nl2tp\t213320\n谢君\t213321\n达伽马\t213322\n第二线\t213323\nKramer\t213324\n海警\t213325\nC80\t213326\n广州社保局\t213327\n筹劳\t213328\n大意失\t213329\n中肯\t213330\n卑\t213331\n出入境检验检疫\t213332\n郝广才\t213333\ne16\t213334\nphenomena\t213335\n海康威视公司\t213336\niFinD\t213337\n书画家\t213338\n成非\t213339\n中元胞\t213340\n一个9岁\t213341\n四朵\t213342\ndisi\t213343\n金岸\t213344\n剑网3丐帮\t213345\n140\t213346\n秃头\t213347\n益坚\t213348\nusgs\t213349\n里巴人\t213350\n值不值\t213351\n方维\t213352\n象头山\t213353\n中国文化中心\t213354\n柑桔\t213355\nAarif\t213356\n立论\t213357\n四川省大学\t213358\n环山路\t213359\n化学进展\t213360\n骚年\t213361\n财会人\t213362\n天狗\t213363\n都江\t213364\n两亿元\t213365\n全能性\t213366\n胖达\t213367\n海欣\t213368\n22mt\t213369\n16薪\t213370\n2.79\t213371\n0720\t213372\nDevtools\t2133
73\n厚脸\t213374\n王中磊\t213375\n鞭炮\t213376\n后心\t213377\n市人民政府\t213378\n闽政\t213379\nstories\t213380\ncdp\t213381\n言而无信\t213382\n中区\t213383\n纸面\t213384\nt54\t213385\n华信\t213386\nMomenta\t213387\n甘露\t213388\n会话\t213389\n套索\t213390\n百叶帘\t213391\nb52\t213392\n神经质\t213393\nhse\t213394\n双击亮屏\t213395\n静平衡\t213396\n双象\t213397\nplink\t213398\n利空\t213399\n索非布韦\t213400\n萨拉赫\t213401\n20170923\t213402\n中国国际高新技术成果交易会\t213403\n画贴\t213404\n疗愈\t213405\n补发\t213406\n葛雨晴\t213407\n事后\t213408\n太空堡垒卡拉狄\t213409\nP2P理财平台\t213410\n东京游记\t213411\n奥绿\t213412\n人术\t213413\n字词句\t213414\n期中考卷\t213415\n抢答器\t213416\nWildfly\t213417\nMy97DatePicker\t213418\n现实版\t213419\n茅坑\t213420\nExtJS\t213421\n37岁\t213422\n和孚镇\t213423\n三里屯优衣库\t213424\n阻力\t213425\n9360\t213426\nv97\t213427\n军属\t213428\n深圳明德实验学校\t213429\npohreb\t213430\n单精度浮点数\t213431\n食戟\t213432\n中国法院\t213433\n桌带\t213434\n融资租赁资源网\t213435\n工伤死亡赔偿标准\t213436\n対\t213437\n围观网\t213438\n9份\t213439\n小学生日记大全网\t213440\n职业生涯规划\t213441\n拉希德\t213442\n分置改革\t213443\n4GB+32GB\t213444\nSCX\t213445\ndear\t213446\n止血粉\t213447\n谷口\t213448\n口述史\t213449\n劈尖\t213450\np4\t213451\nlom\t213452\n张伦硕\t213453\n八重樱\t213454\n情景化\t213455\nclash\t213456\n万方期刊网\t213457\n猝死\t213458\n玻切\t213459\n踏炎\t213460\n十七章\t213461\n带锯\t213462\n撸波波\t213463\nSalesman\t213464\n人材\t213465\n东东手游助手\t213466\n文匯報\t213467\n猎\t213468\n花颜\t213469\n风险投资网\t213470\n加盟商\t213471\n概率密度函数\t213472\n美佳网\t213473\njic\t213474\n雪芙蓉\t213475\n七年级英语\t213476\n艾佛森\t213477\n背头\t213478\n宿州论坛\t213479\n讹人\t213480\n伦敦大桥\t213481\nWin64位\t213482\n崔志佳\t213483\nworf\t213484\n包分配\t213485\n题海战术\t213486\nunit6\t213487\n外所\t213488\nAPB\t213489\nAx\t213490\n燕麦粥\t213491\nEndeavour\t213492\n甪直\t213493\nxel\t213494\nGAC\t213495\nGRP\t213496\nAegis\t213497\n依宪\t213498\n东方烟草报\t213499\n二分之一次方\t213500\nSQLServer2000\t213501\n菜单式\t213502\nvalidity\t213503\nAOT\t213504\n6210\t213505\n淫女\t213506\n奢姿\t213507\n演艺圈\t213508\n肽聚糖\t213509\n公德心\t213510\n感觉器官\t213511\nlabour\t213512\n108.2\t213513\n狗头人冒险模式\t213514\n敬亭山\t213515\n工作室群\t213516\n确认函\t213517\n阿卡伊勒
\t213518\n鲜氧\t213519\n罗技k375s\t213520\n7.5码\t213521\n自适应父\t213522\n悟饭\t213523\n墨车\t213524\n奥斯卡王尔德\t213525\n梅州警视网\t213526\n乐购网\t213527\n复仇者联盟1\t213528\n手机贷\t213529\n机甲\t213530\n湖滨银泰in77\t213531\n汝\t213532\nundergraduate\t213533\n先乳\t213534\n斗罗大陆小舞\t213535\n现代健康网\t213536\n思拓\t213537\n不穷\t213538\n真狠\t213539\n终结者2\t213540\n58P\t213541\n谢毅\t213542\n私募投资基金募集行为管理办法\t213543\n变异性\t213544\n纵横天下\t213545\n圆梦大学\t213546\n削藩\t213547\n金源世纪城\t213548\n蝉花\t213549\n圣灵\t213550\n顶推\t213551\n实训\t213552\n偶像季\t213553\n很准\t213554\n中川\t213555\n金山毒霸\t213556\n世茂滨江花园\t213557\n称兄道弟\t213558\n马兰士\t213559\n川菜馆\t213560\n饼屋\t213561\n木兰花乡\t213562\n1平方\t213563\n李轩辕\t213564\n459\t213565\n顺德农商行\t213566\n中来股份\t213567\n大剑\t213568\n横山县\t213569\n管道壁\t213570\n入闱\t213571\n粉碎型\t213572\n新风机\t213573\n300户\t213574\n老年时报数字报\t213575\n贝莱\t213576\nrockbox\t213577\nWaltz\t213578\npkgs\t213579\n换汇\t213580\n85本\t213581\nAimer\t213582\nWhois\t213583\n隐层\t213584\n长疣\t213585\n七边形\t213586\n自难忘\t213587\n奇普\t213588\n拼劲\t213589\n西陆wap站\t213590\n近身\t213591\n在职研究生\t213592\n百里挑一\t213593\n海尔空调\t213594\n花宵道\t213595\n卧龙地产\t213596\n广日股份\t213597\n注册一级建造师\t213598\n排忧\t213599\n龙榆生\t213600\n围棋\t213601\n旧县村\t213602\n草珊瑚\t213603\n屈光度\t213604\n银湖街道\t213605\nzhenghai\t213606\n警示板\t213607\n星三角\t213608\n草原女民兵\t213609\n红昭愿\t213610\n五铢钱\t213611\n139平米\t213612\nFundamental\t213613\n苏扇\t213614\n藤崎\t213615\n通道县\t213616\nfeedback\t213617\nbzero\t213618\n企业级路由器\t213619\n三国志7\t213620\n涤棉\t213621\ndatetimebox\t213622\n村中\t213623\nspencer\t213624\n浓密\t213625\n末轮\t213626\n鸟友\t213627\n2000名\t213628\n弘博\t213629\n北京师范大学亚太实验学校\t213630\n名庄\t213631\n150万元\t213632\n2.6\t213633\n吃喝玩乐\t213634\n杨晓琼\t213635\nEternal\t213636\n导热片\t213637\n成办\t213638\n狮子吧\t213639\n玛咖酒\t213640\ncasts\t213641\n陈向阳\t213642\n子午\t213643\n眼图\t213644\n天闻角川\t213645\nat24c02\t213646\n情网\t213647\n蓝细菌\t213648\n气喘\t213649\nled屏幕\t213650\nNumberPicker\t213651\n刘保聚\t213652\n260米\t213653\n乱收\t213654\n唯他可可\t213655\n消电\t213656\n世界图书出版公司\t213657\n第28话\t213658\n飞亚达\t213659\n海口市住房和城乡建设局\t213660\neu\t21366
1\nOld\t213662\n运动表\t213663\n矫形器\t213664\n天佑城\t213665\n杰里\t213666\nAT24C02\t213667\n苑琼丹\t213668\n道门吧\t213669\n比亚迪G5\t213670\nMattresses\t213671\nneg\t213672\n建园\t213673\n173号\t213674\n湘水\t213675\ntcpip协议\t213676\n阿宅\t213677\n输送泵\t213678\n西语\t213679\n汽配百科\t213680\n方所书店\t213681\nhavior\t213682\nzhixue\t213683\n反思\t213684\n东苑\t213685\n五天五夜\t213686\n舞艺\t213687\n3500吨\t213688\nAWS\t213689\n光棍节光网\t213690\n排演\t213691\n振捣器\t213692\n京汉铁路\t213693\n水压机\t213694\nPCCAD\t213695\n乳奴\t213696\n轮值\t213697\nPARP\t213698\nKitty乐园\t213699\n不疑\t213700\nEPCOS\t213701\n咏雪\t213702\nJoin\t213703\n南京仁康医院\t213704\n八进制数\t213705\n小爬虫\t213706\n真金\t213707\n榆林市纪委监察局\t213708\n翠菊\t213709\n9.5mm\t213710\n外购\t213711\n澳科大\t213712\n妥善\t213713\n阴阳性\t213714\ndisconnected\t213715\nXPATH\t213716\nUnity5\t213717\n春江花园\t213718\n静逸\t213719\n张云鹏\t213720\n花布\t213721\n染色\t213722\n即墨论坛\t213723\n易风\t213724\n穂\t213725\nopenlayer3\t213726\nProguard\t213727\n75部\t213728\n测量\t213729\n安康家园\t213730\n张晓敏\t213731\n闵行体育公园\t213732\n吸收性\t213733\n资产周转率\t213734\n精灵图\t213735\n斯维尔\t213736\n腹透\t213737\n飞傲\t213738\n回本\t213739\nanswering\t213740\n酚氨咖敏片\t213741\n1.0cm\t213742\n运河公园\t213743\n黄强\t213744\nν\t213745\n耶鲁大学\t213746\n地图册\t213747\n盘古越狱\t213748\nchng\t213749\nflyme\t213750\n古明\t213751\nHdmi\t213752\n外冷内热\t213753\n乐球\t213754\n火星网\t213755\n硅晶片\t213756\n海通证券股份有限公司\t213757\n邻居们\t213758\nCAD2010\t213759\nDebit\t213760\n优胜奖\t213761\n啮齿动物\t213762\n秀米xiumi\t213763\n病虫\t213764\n完型\t213765\n华清池\t213766\n电信分公司\t213767\n八_\t213768\n晋宁区\t213769\n屠榜\t213770\n大拇\t213771\n旷世科技\t213772\n保险绳\t213773\ncamellia\t213774\nrehabilitation\t213775\n128号\t213776\n豆汁\t213777\n几丁\t213778\nchakan\t213779\n试讲稿\t213780\n乌托\t213781\n宣传册\t213782\n金鹭\t213783\n入刑\t213784\n汽车配件公司\t213785\n萧正楠\t213786\n和\t213787\n吉祥鸟\t213788\n洪泽\t213789\n不愈\t213790\n总工程师\t213791\n飘动\t213792\n企业认证\t213793\n20150826\t213794\n奇宝斋\t213795\n中欣\t213796\n鼓掌\t213797\n置之不理\t213798\ndoub\t213799\n新高考\t213800\nBromo\t213801\nzclip\t213802\n牵引管\t213803\nw3cschool\t213804\n烧烤吧\t213805\n余钱\t21
3806\n分享式\t213807\n李国主\t213808\npsj\t213809\n劳教所\t213810\n380_\t213811\nSIM900A\t213812\nConventions\t213813\n陶晶莹\t213814\nFalls\t213815\n区区\t213816\n能端\t213817\n听书阁\t213818\nUMP\t213819\n佳能6d\t213820\n鬼媾人\t213821\n金港花园\t213822\n华夏族\t213823\n22.CN\t213824\navsow\t213825\n6行\t213826\n转基因食品\t213827\n哇卡\t213828\n在岗\t213829\ncookie值\t213830\n广雅中学\t213831\n坎巴拉\t213832\n新东\t213833\n奥迪A6L\t213834\n棉纶\t213835\noobe\t213836\n鞍山海城\t213837\n张淑敏\t213838\n子房\t213839\n空艇\t213840\nvit\t213841\n嘉乐\t213842\n划屏\t213843\n试航\t213844\n20万亩\t213845\n嗯嗯嗯\t213846\n秦艽\t213847\n张荫麟\t213848\n魅蓝E2\t213849\n砂之船\t213850\nvundle\t213851\nSOFTWARE\t213852\nWWW.成人.COM\t213853\n淫图\t213854\n益生菌粉\t213855\n格兰仕集团\t213856\n鲍家花园\t213857\n舒痕\t213858\n西坑\t213859\n哈尔滨市第三中学\t213860\n小岛秀夫\t213861\nyuy\t213862\n原谱\t213863\n红孩儿\t213864\n党建研究网\t213865\n_贵港政府网\t213866\n60si2mn\t213867\nDiameter\t213868\n中文安\t213869\n曹洪\t213870\n兴起\t213871\n偿债备付率\t213872\nEXW\t213873\n公共行政学\t213874\n唐才子传\t213875\n两天后\t213876\n八路网\t213877\n平方公里\t213878\n北京市幼儿园\t213879\n5018\t213880\n加纳\t213881\n中华民族一家亲\t213882\n婺江\t213883\nY币\t213884\n/td\t213885\nWP\t213886\nCD碟\t213887\n春见\t213888\n高升控股\t213889\n护理学\t213890\n维数\t213891\n何德何能\t213892\nQuartus\t213893\n20170831\t213894\n接缝处\t213895\n人识\t213896\nknob\t213897\n舞泡网\t213898\n119级\t213899\n板垣\t213900\n5500万\t213901\n偏色\t213902\nw5500\t213903\n簧色\t213904\n针线盒\t213905\n大连市教育局\t213906\n0020\t213907\n刘之冰\t213908\n海虾\t213909\n天鹅之恋\t213910\n招警\t213911\n声色俱厉\t213912\n上海长宁政府\t213913\n万能检讨书\t213914\n安徽警官职业学院\t213915\n气控阀\t213916\nSIMD\t213917\n方所\t213918\n退耕还林\t213919\n实腹式\t213920\n绿色汉\t213921\n遗传学家\t213922\n永煤集团\t213923\n西藏矿业\t213924\npickerview\t213925\n枪战\t213926\nH9T韩剧网\t213927\n冲子\t213928\n承载\t213929\n佛歌\t213930\n下脚料\t213931\n世纪联华超市\t213932\n戏曲谱\t213933\n魅力研习社\t213934\n张江\t213935\n本命年\t213936\nzhinin\t213937\n北极星电力新闻网\t213938\nuav\t213939\n拉卡拉易分期\t213940\n色光\t213941\nElisabeth\t213942\n后会\t213943\n象山校区\t213944\nmp280\t213945\n荆门政府网\t213946\n镧系\t213947\nSWF格式\t213948\n7.2.1\t213949\n远大集团\t213
950\n中国铁塔股份有限公司\t213951\n拳法\t213952\nvs2015\t213953\n鄞州二院\t213954\n93#\t213955\n佳澜\t213956\n活动部\t213957\nhangout\t213958\naxs\t213959\n熹妃传奇\t213960\n三寸金莲\t213961\n磕绊\t213962\n塞入\t213963\n瑞鼎\t213964\n珠海小学\t213965\n补精\t213966\n螺杆式冷水机\t213967\n工程物\t213968\niPhone6/6\t213969\n宋堂吉诃德\t213970\n掘墓者\t213971\n赞达拉战争\t213972\n论共产党员的修养\t213973\n312号\t213974\n实况\t213975\n袁启聪\t213976\n在写\t213977\n中国初级卫生保健基金会\t213978\n弯曲应力\t213979\n增加值\t213980\n魔法石\t213981\n乔任梁\t213982\nExist\t213983\n不好意\t213984\n雨夜\t213985\n三笑\t213986\n粗略\t213987\n报告率\t213988\n过长\t213989\n尾蝎\t213990\nkramer\t213991\n丰田皇冠\t213992\nbyhieg\t213993\n这\t213994\n天汉\t213995\n狐踪谍影\t213996\n人性欲\t213997\n肥佬影音\t213998\n鬼蟹\t213999\n珍藏册\t214000\nc++语言\t214001\n掷弹筒\t214002\n疳积\t214003\n财通\t214004\n沪深A股\t214005\n炙热\t214006\nS7/S7\t214007\nDVD-MKV\t214008\n无端\t214009\n智联全国招聘网\t214010\n獣\t214011\n离职证明书\t214012\n凤凰国际机场\t214013\nYogurt\t214014\n聚财\t214015\n欧系\t214016\n大税\t214017\n贪嗔痴\t214018\n杨美琪\t214019\n大思英语\t214020\n小企业会计准则\t214021\n严颖\t214022\n无敌版\t214023\n金灵芝\t214024\n滟照门\t214025\n离任审计\t214026\n武则天\t214027\n武康\t214028\n度母\t214029\n皮屑\t214030\n磨刀\t214031\n摇摆车\t214032\nGREAT\t214033\n陈文图\t214034\n德语\t214035\nctrl+alt\t214036\n大智\t214037\n纯鲜\t214038\n物流信息网\t214039\n覆土\t214040\n福田欧曼\t214041\n市市场监管局\t214042\n刘河\t214043\n泡学\t214044\n李蛟\t214045\n泄漏\t214046\n飞单\t214047\n一宫\t214048\n荣耀路由2荣耀路由\t214049\n复合\t214050\n车志\t214051\n合二为一\t214052\n離\t214053\n8421码\t214054\n银量\t214055\ndianqi\t214056\n面包\t214057\nitp\t214058\n日期框\t214059\n七磅\t214060\n邻舍\t214061\n浙江省温州中学\t214062\n索法\t214063\n盐津网\t214064\nFotor\t214065\n湖州中学\t214066\n魔方大厦\t214067\n南通大学医学院\t214068\n荡神\t214069\n鹤年堂古方眼宝\t214070\nH60\t214071\n6.4亿\t214072\n偏移量\t214073\n3DSource\t214074\n中国安全教育网\t214075\n空气净化器滤网\t214076\n新疆中泰化学股份有限公司\t214077\nItMP3\t214078\nIncreased\t214079\nCEL\t214080\n身手\t214081\n2308\t214082\n音尘\t214083\n陕西站\t214084\n微单索尼\t214085\n延寿\t214086\n之上\t214087\nveri\t214088\n赏金\t214089\n唐砖羊脂球\t214090\n三者关系\t214091\n西丽大学城\t214092\n5w-30\t214093\nmarco\t214094\nDs\t214095\
n二次型\t214096\n阿里天池\t214097\nClubmed\t214098\nWinZip\t214099\n医界人才网\t214100\n客运\t214101\n第09\t214102\n蓄洪区\t214103\n蓝宙\t214104\n惠翔\t214105\n无限时刻\t214106\n轻钢龙骨纸面石膏板\t214107\n美景\t214108\n冗杂\t214109\n王启明\t214110\n质量\t214111\n香港科技大学\t214112\nElectromagnetic\t214113\nfather\t214114\nDNSPOD\t214115\nX战警\t214116\n20171212\t214117\napicloud\t214118\n木乃伊\t214119\n16.1\t214120\n魕\t214121\n对象资源管理器\t214122\n码率\t214123\n佰咖汽车\t214124\n武侠世界大穿越\t214125\n小伍\t214126\n逻辑板\t214127\n经济师考试网\t214128\nCaffeCN\t214129\n插手\t214130\n曼达\t214131\n第二下\t214132\n遗产案\t214133\n辫子\t214134\n伐木工人\t214135\n老妈子\t214136\n南京软件谷\t214137\n熔炼炉\t214138\n怒江州\t214139\n迟日江山丽\t214140\n奥铃\t214141\n贵溪市人民政府\t214142\n蒸馏水机\t214143\n百度天圣卡\t214144\n360云\t214145\n微灌\t214146\n上海源叶生物科技\t214147\n5.3.2\t214148\n取胜\t214149\n纱子\t214150\n邹老师\t214151\n二〇\t214152\n10分钟内\t214153\n抗辩\t214154\nVue脚手架\t214155\n好过\t214156\n够不到\t214157\n万科物业\t214158\n细胞凋亡\t214159\n我是一只小小鸟\t214160\n长颜\t214161\n脱硫\t214162\n铭威\t214163\n浙江省政府\t214164\nlauch\t214165\n无极性\t214166\n燥湿\t214167\n铅笔芯\t214168\n克莱少帅\t214169\n百万个\t214170\n蛋龙\t214171\nchanical\t214172\n计测\t214173\n嘈\t214174\n北京市国税局\t214175\n如来不负卿\t214176\n科蚁网\t214177\n斯泰迪\t214178\nZuo\t214179\n云南陆军讲武堂\t214180\n不听\t214181\n华林路\t214182\n公益金\t214183\n2018.3.27\t214184\n联名信用卡\t214185\nJAPANiCAN\t214186\n飞行堡垒FX\t214187\n00000024\t214188\n收集率\t214189\n润湿\t214190\n明日星城\t214191\n慧敏\t214192\n巴航\t214193\n佛山市安全生产监督管理局\t214194\n阿斯塔\t214195\n赞美太阳\t214196\n干红\t214197\n迪米\t214198\n聿\t214199\n新航道\t214200\n波波城\t214201\n倍加福\t214202\n2018.3.2\t214203\n4000位\t214204\n书写者\t214205\n且行\t214206\n血流\t214207\n三娘教子\t214208\n形参\t214209\n盖茨基金会\t214210\n麦昆\t214211\n间歇性\t214212\n酥脆\t214213\nS10\t214214\n爱酷\t214215\nsimplis\t214216\n宝嘉\t214217\n春暖\t214218\n160集\t214219\nCampaigns\t214220\n老爹\t214221\n丹江口\t214222\n清补\t214223\n沃保保险网\t214224\n翻皮\t214225\n珍虎网\t214226\n2月份\t214227\n澳门凯旋门\t214228\n山东城市建设职业学院\t214229\nWriteFile\t214230\n招商信诺人寿保险\t214231\n北影节\t214232\nRichEdit\t214233\n砂锅饭\t214234\n中西路\t214235\n63平米\t214236\n赵钶\t214237\nojdb
c6.jar\t214238\n夏弥\t214239\n急拉\t214240\n明锐\t214241\nllll\t214242\n例如\t214243\n大合\t214244\n棕竹\t214245\n建造师证\t214246\n张大春\t214247\n卡琳娜·卡普\t214248\n就业创业证\t214249\n慕田峪长城\t214250\n耶稣传\t214251\nDOTA1\t214252\n咣\t214253\n简练\t214254\nvenn\t214255\nexploitation\t214256\ndhl\t214257\n白鹭\t214258\n2058\t214259\n百e\t214260\n牌类\t214261\n电磁铁\t214262\n纵横四海\t214263\n奢品\t214264\n福山镇\t214265\n2015年上半年\t214266\n泰力\t214267\nemerg\t214268\nROIC\t214269\n软文批发网\t214270\n揭底\t214271\nalcohol\t214272\n驯龙高手1\t214273\n秀湖\t214274\n发泡模具\t214275\n海鲜菇\t214276\ngkong\t214277\n小四门\t214278\n成都七中\t214279\n张雨鑫\t214280\nliuxin\t214281\n宿迁市第一人民医院\t214282\n震撼\t214283\n卢湾体育馆\t214284\n抗兔\t214285\n娱乐版\t214286\n斯皮尔曼\t214287\n张莹\t214288\n藤真\t214289\n梅西C罗\t214290\n构式\t214291\nPUBMED\t214292\n凌波丽\t214293\n回春操\t214294\n卡达斯\t214295\n八美\t214296\n北省\t214297\n按键音\t214298\n是不为\t214299\nHooked\t214300\n6.5_\t214301\n放\t214302\nDTG\t214303\n高工锂电\t214304\n安装库\t214305\n四清\t214306\n贾德\t214307\n汉字宫\t214308\nhmailserver\t214309\n佛度\t214310\n大店镇\t214311\n3d66.com\t214312\nlogix\t214313\n985_\t214314\n烫手\t214315\nbum\t214316\nNPG\t214317\n混流风机\t214318\ntextures\t214319\nOpenlayers\t214320\nPCL\t214321\n红剑\t214322\n马毛\t214323\n西丰\t214324\n云霓\t214325\n宠物防御战\t214326\n利王\t214327\n太空漫游\t214328\nprotein\t214329\nwhen\t214330\n野狐\t214331\nMaker\t214332\nTOMCAT\t214333\n排区\t214334\nRK\t214335\nHippo\t214336\nocc\t214337\n廉明\t214338\n西方哲学史\t214339\n【尐\t214340\n义卖\t214341\n悠乐\t214342\n澳华\t214343\n街机三国志2\t214344\n息烽县\t214345\n新疆维吾尔自治区商务厅\t214346\n豪爵\t214347\ntmpfs\t214348\n一拖三\t214349\n直线段\t214350\n油粘米\t214351\nmal\t214352\nwebbrower\t214353\n刻章机\t214354\n新农村示范村\t214355\n上沪\t214356\n隐藏的歌手\t214357\n注册公用设备工程师考试\t214358\n往\t214359\n商砼站\t214360\n狂狼\t214361\n99届\t214362\n梦比优斯奥特曼\t214363\n解渴\t214364\nXavier\t214365\n上海交通大学自动化系\t214366\nnh\t214367\n井冈山革命博物馆\t214368\n丝宝论坛\t214369\n德江\t214370\n蓝湾半岛\t214371\nmicorsoft\t214372\n含山县\t214373\n杯架\t214374\n仑\t214375\n维奇\t214376\n疤痕疙瘩\t214377\nsupervised\t214378\n存在与时间\t214379\n叶隐\t214380\n快车\t214381\n广州
市南沙区人民政府\t214382\n大井\t214383\n质询案\t214384\n华泽\t214385\n日晒\t214386\nfaketaxi\t214387\n潮白河孔雀城\t214388\n刘一手\t214389\n揭东\t214390\n赏鉴\t214391\n冰森\t214392\n某一刻\t214393\n45倍\t214394\n文华财经随身行\t214395\n环游\t214396\noperable\t214397\n中国电影导演协会\t214398\n死亡飞车3\t214399\n花笙记\t214400\n怀里\t214401\nlm317\t214402\nTKinter\t214403\n独身\t214404\n第十二回\t214405\n卧听\t214406\n华力创通\t214407\n诗魂\t214408\nzshrc\t214409\n唐悠悠\t214410\n时间段\t214411\nBOSS直聘\t214412\n黄岑\t214413\n上海爱车\t214414\n二串一\t214415\nfildder\t214416\n反面\t214417\n敞开式\t214418\n临沂地区\t214419\n姜花\t214420\n4741\t214421\nauthority\t214422\n张天宇\t214423\n宜宾机场\t214424\n人工喂养\t214425\n拍卖房\t214426\n自体脂肪隆胸\t214427\n伺机\t214428\n停赛\t214429\n分页线\t214430\n海峡新闻网\t214431\n虚拟变量\t214432\n三角带\t214433\n婚丧喜庆\t214434\nHunger\t214435\n整车手游网\t214436\n西觅亚\t214437\na25\t214438\n东直门店\t214439\npyserial\t214440\n玻璃钢储罐\t214441\n抗倾覆\t214442\n恶少女\t214443\n败给\t214444\n麦加\t214445\nHomer\t214446\n关税\t214447\n全福银行\t214448\n小女友\t214449\n穆雅斓\t214450\n如约\t214451\n流花车站\t214452\n众享\t214453\n普光\t214454\n明纬\t214455\napril\t214456\n木偶戏\t214457\n20150612\t214458\n20171128\t214459\n自行车道\t214460\n林烨\t214461\n俯冲\t214462\n浦寨\t214463\n600138\t214464\n二青\t214465\n328路\t214466\n天择\t214467\n一滴泪\t214468\nTomcat6\t214469\n脑梗塞后遗症\t214470\nPhotoZoom\t214471\n百环\t214472\n空气感\t214473\n发展与教育心理学\t214474\n昔阳\t214475\n当当网\t214476\n20170404\t214477\n阳谋\t214478\n7.3兽王\t214479\n沈德咏\t214480\nlibpcre.so.0\t214481\nmysq\t214482\ncorrection\t214483\n郭娟\t214484\n南定\t214485\n客管\t214486\n氮化硅\t214487\nw3m\t214488\n圆梦计划\t214489\nguzzle\t214490\n1800X\t214491\n中国养老金网\t214492\nSM论坛\t214493\n护理伦理学\t214494\n过去两年\t214495\n赞皇县\t214496\n相思林\t214497\nAnomaly\t214498\n长安CS55\t214499\ndasai\t214500\n倍晶\t214501\nshmily\t214502\nserver名\t214503\nHealtheon\t214504\n人大广东代表团\t214505\n操作攻略\t214506\n上银\t214507\n渝洽会\t214508\n皖西卫生职业学院\t214509\npreparedstatement\t214510\n深圳市卫计委\t214511\n铁锁\t214512\n伊通河\t214513\n建(构)筑物消防员\t214514\nfusion\t214515\n变相怪杰2\t214516\n争辩\t214517\nkio\t214518\n快板书\t214519\nx220t\t214520\nAP\t214521\n美东\t
214522\nx86_\t214523\n贾云馨\t214524\n上海伊莱美医疗美容医院\t214525\n376.33\t214526\ngyx\t214527\n梵\t214528\n劫煞\t214529\n淘宝规则\t214530\nXMAN\t214531\n电子驾驶证\t214532\n副主\t214533\n步韵\t214534\n伏吟\t214535\n林肯mkz\t214536\n名所\t214537\n曜石黑\t214538\n自用\t214539\n两家人\t214540\n旅\t214541\n29cm\t214542\nDUST\t214543\n电视剧情\t214544\n美式箱变\t214545\n羊城通\t214546\n54所\t214547\n神酒\t214548\n弯腰\t214549\n江苏信息职业技术学院\t214550\n那谁\t214551\n答题王\t214552\n现代物流\t214553\n五十六个\t214554\nsole\t214555\n归宿\t214556\n银枪\t214557\n广德新闻网\t214558\n千万群\t214559\n那会儿\t214560\nTOC\t214561\nICAO\t214562\n柜上\t214563\n除颤器\t214564\n新大洲\t214565\n老死不相往来\t214566\n武人\t214567\n开心消消乐电脑版\t214568\n新社\t214569\ngrease\t214570\n美村\t214571\n瑞安市教育局\t214572\n三峡\t214573\ncodes\t214574\n信阳市人民政府\t214575\nweb框架\t214576\n古巨基\t214577\n太阳电池\t214578\n广州医科大学附属第二医院\t214579\n藤萍\t214580\n红影\t214581\n塘沽区\t214582\nHPLC法\t214583\n善融\t214584\n太尉\t214585\n黄辣丁\t214586\n正邦\t214587\n北京海底捞\t214588\n哑巴\t214589\n保利中环广场\t214590\nRGB565\t214591\n挂羊头\t214592\n依宪治国\t214593\n威特斯\t214594\n十之八九\t214595\n健康舞\t214596\nx-plane\t214597\n电炖盅\t214598\n酒石酸唑吡坦片\t214599\n好车网\t214600\n云南省委\t214601\n行政诉讼\t214602\nLCA\t214603\n新立村\t214604\n杰西卡·查斯坦\t214605\n一起走天涯\t214606\n瑞德\t214607\n转角\t214608\n雅布\t214609\n列自适应\t214610\naccruals\t214611\nstruts2\t214612\n常州旅游商贸高等职业技术学校\t214613\n河南省商丘市\t214614\n泄爆\t214615\n东北摩托联盟\t214616\n杨谨华\t214617\n起重工\t214618\n顺丰快递电子\t214619\n陈见\t214620\n冒险岛2扎昆\t214621\n首都人才网\t214622\n丝藻\t214623\n区考\t214624\n象形图\t214625\n中医药管理局\t214626\nBT宝塔Linux\t214627\n上海房地产交易中心\t214628\n网页编码格式\t214629\n生存者\t214630\n宾县\t214631\namoyzhu\t214632\n禹棹奂\t214633\nw510\t214634\n南瓜粉\t214635\n陈一鸣\t214636\n熏心\t214637\nlawrence\t214638\n洇\t214639\n3.4.3\t214640\n狮子楼\t214641\n6挡\t214642\n藏地密码\t214643\nKafka\t214644\n松下投影仪\t214645\n巍山镇\t214646\nv4.0.0\t214647\n本杰明巴顿奇事\t214648\n尾线\t214649\nCardTD\t214650\n一致\t214651\nyosino\t214652\n中华人民共和国海关\t214653\n云锋\t214654\n留队\t214655\n蓝润集团\t214656\n18例\t214657\n彩月\t214658\n1215\t214659\n重刻\t214660\ncube-ui\t214661\n卖报\t214662\n港漂\t214663\n牛塘\t214664\n滑雪场
\t214665\n飞印网\t214666\n羟脯氨酸\t214667\n罗技鼠标驱动\t214668\n烟具\t214669\n奉贤南桥\t214670\ncs5.5\t214671\ntransfer\t214672\nYellow\t214673\n巴彦浩特镇\t214674\n品品\t214675\n小桃园\t214676\n天气预报瓶\t214677\n阿司匹林泡腾片\t214678\n订车\t214679\n永邦\t214680\n恒重\t214681\n石门颂\t214682\nunity5\t214683\n财科所\t214684\n养肌场\t214685\n六位数\t214686\n环湖路\t214687\n哈德森\t214688\n不惊\t214689\n暗夜行路\t214690\n阿南\t214691\n充墨\t214692\n蜡笔小新剧场版\t214693\n摆脱贫困\t214694\n美国法学院\t214695\n李曼\t214696\n望城坡\t214697\n江门大道\t214698\n踏风\t214699\n广州中国大酒店\t214700\n皮雕\t214701\n吵闹\t214702\n映入\t214703\n前脚\t214704\n仿宋体\t214705\n海南省工业和信息化厅\t214706\nindy\t214707\n23574096\t214708\n熊猫咪咪\t214709\n阳历\t214710\n袁成杰\t214711\n麦吉尔\t214712\n1.0.3_\t214713\n周玲\t214714\n300170\t214715\n萍乡经济技术开发区\t214716\n张东海\t214717\ncomodo\t214718\n江西中医药大学\t214719\n浙传\t214720\n铸造展\t214721\n南通市住房保障和房产管理局\t214722\n佑康\t214723\n血染\t214724\n踏月\t214725\n果宝特攻4\t214726\n麦子杰\t214727\n赛项\t214728\ngtx760\t214729\n炉石传说:魔兽英雄传\t214730\n钨丝\t214731\n体色\t214732\nChoice\t214733\n007:幽灵党\t214734\n工业缝纫机\t214735\n纯洁\t214736\n水阻柜\t214737\nH2O2\t214738\n淇河\t214739\n普华\t214740\n新加坡滨海湾金沙酒店\t214741\n盐酸盐\t214742\nmapview\t214743\nFinisar\t214744\n中国红十字会\t214745\n成也萧何\t214746\n香溢\t214747\n中华书局\t214748\n模型化\t214749\n拾穗者\t214750\n乳房炎\t214751\n电缸\t214752\n静帧\t214753\n南溪镇\t214754\n终极之战\t214755\nk++\t214756\n以太坊代币钱包\t214757\n技术活\t214758\nIMAP\t214759\n陵辱\t214760\nexcell\t214761\n戏弄\t214762\n俄罗斯美女学院\t214763\n徐光\t214764\ne5400\t214765\n仰天山\t214766\n山手\t214767\ncrosscode\t214768\n凉水河\t214769\n无责\t214770\n二龙戏珠\t214771\nfirepath\t214772\n五卷\t214773\n路外\t214774\n0027\t214775\n班前\t214776\n红子鸟\t214777\nots\t214778\n哥特风\t214779\n饰件\t214780\n深圳市经贸信息委\t214781\n手捧花\t214782\n71级\t214783\n预备役\t214784\n常青藤\t214785\n败家子\t214786\n黑科\t214787\nGATE奇幻自卫队\t214788\n睡袋\t214789\nSPDY\t214790\n一荐\t214791\n清寒\t214792\n进化类\t214793\n胆石\t214794\nMCSE\t214795\n瑞克多\t214796\n山雀\t214797\n茂盛\t214798\n临清\t214799\n運動\t214800\n关联方借款\t214801\n测控电路\t214802\n极盗者\t214803\n合阳县\t214804\n乐Pad\t214805\n郑强\t214806\n压接钳\t214807\n红染\t214808\n暴走大事件:#暴走大事
件\t214809\n红歌\t214810\n江西财经大学现代经济管理学院\t214811\n欧贝克\t214812\n橡树湾\t214813\n友人母\t214814\navis\t214815\n海利得\t214816\n红茶叶\t214817\n楚天时报_多媒体报\t214818\n爱力\t214819\n发现物\t214820\n54平米\t214821\n上汽大众途观\t214822\n问世\t214823\n深圳体育馆\t214824\n昆虫\t214825\nwng\t214826\n双氯芬酸钠缓释片\t214827\n换句话说\t214828\nisl\t214829\n4399方舟\t214830\n筛查\t214831\n乘龙\t214832\n刘湛秋\t214833\n井中月\t214834\n_模板王\t214835\n绢画\t214836\n铁杵\t214837\n雪婷\t214838\n西德\t214839\n张海山\t214840\n恋旧\t214841\n千炮捕鱼\t214842\napktool\t214843\n选人\t214844\n激光脱毛\t214845\nPara\t214846\n2016.3.4\t214847\n北风网\t214848\n莫道君\t214849\n长安欧尚】长安欧尚\t214850\n必争\t214851\n反光片\t214852\n德育\t214853\n丰臀\t214854\n大数据产业博览会\t214855\n第三方库\t214856\n福记\t214857\n旧证\t214858\n装饰公司\t214859\n苏仙新闻网\t214860\n火影忍者漫画\t214861\n态势\t214862\n贝城\t214863\nbcompare\t214864\n名厨\t214865\n直驱式\t214866\ndebout\t214867\n聚乙二醇\t214868\nboom\t214869\n魔术棒\t214870\n停泊\t214871\n王菲菲\t214872\n希尔顿逸林酒店\t214873\n独立域名\t214874\n之二十一\t214875\n孙建国\t214876\n马建华\t214877\n近水楼台\t214878\n观音莲\t214879\n焖鸡\t214880\n滕文公\t214881\n五险一金计算器\t214882\n4包\t214883\n仙魔劫\t214884\nCollections\t214885\n侯永斌\t214886\n盘门\t214887\n1-1号\t214888\n弘毅投资\t214889\n天水\t214890\n海军工程大学\t214891\n第三十四\t214892\n砝码\t214893\nSendKeys\t214894\nlinguistics\t214895\n潮骚\t214896\n几岁\t214897\n食子\t214898\n1891\t214899\n四蹄\t214900\n河姆渡遗址\t214901\nhelpful\t214902\n作痛\t214903\n世界冠军\t214904\n出方\t214905\n西城区小学\t214906\n开化县政府\t214907\nlively\t214908\nbruno\t214909\n电解锰\t214910\ncole\t214911\nvjudge\t214912\nAxon\t214913\nonechain\t214914\n宋姓\t214915\n廉州镇\t214916\n春天会计服务公司\t214917\n钢板桩\t214918\n沉木\t214919\n为中华之崛起而读书\t214920\n克隆岛\t214921\n双腔\t214922\n币\t214923\n两用\t214924\n平舆在线\t214925\n表演者\t214926\ngearing\t214927\n库位\t214928\nrxjava\t214929\n歌单\t214930\n4.75\t214931\n园林定额\t214932\n甘肃省博物馆\t214933\n出口处\t214934\n己知\t214935\n王济武\t214936\nneca\t214937\n套息\t214938\n青秀山风景区\t214939\n烟沙\t214940\n百世快运\t214941\n和衷共济\t214942\n孙幼军\t214943\n李广\t214944\n张婉婷\t214945\n一言\t214946\n工农红军\t214947\n许士林\t214948\n中间表\t214949\n纳税户\t214950\n普罗米\t214951\n花雕\t214952\n星期四\t2
14953\n王露\t214954\n北京公积金\t214955\n倒角\t214956\n北京教育考试院\t214957\n5.12国际护士节\t214958\ndumpbin\t214959\n罗马2全面战争吧_\t214960\n引继\t214961\n沪股\t214962\nAAA\t214963\noldschool\t214964\n文本字\t214965\n格栅灯\t214966\n书评\t214967\n单田芳\t214968\nCasey\t214969\n儿科\t214970\ninstitute\t214971\n马奎兹\t214972\nk-1\t214973\nasdm\t214974\n2558\t214975\n龙州县\t214976\n戊己\t214977\n巴音布鲁克草原\t214978\n康尼\t214979\nlianai\t214980\n乌鲁木齐南\t214981\n九华山庄\t214982\nbanking\t214983\na59m\t214984\n裡\t214985\n万事俱备\t214986\n调度器\t214987\n卡夹\t214988\n决不\t214989\n2321\t214990\n300瓦\t214991\n穷举法\t214992\n上一步\t214993\n转弯机\t214994\n最真的梦\t214995\n特困\t214996\n第26轮\t214997\ntaiwan\t214998\nbros\t214999\n24度\t215000\n亲爱的篮球\t215001\n莫嫡\t215002\n透过性\t215003\n佛典\t215004\n陈越香\t215005\n研报评级\t215006\n迎春杯\t215007\n点阵屏\t215008\n铜鼓\t215009\n忠君\t215010\n西乡大道\t215011\n小牛肉\t215012\nyoo\t215013\n4.0_\t215014\n弱者\t215015\nBrass\t215016\n1734\t215017\n朝内\t215018\n电动按摩器\t215019\n梯\t215020\n泰天\t215021\n一灸\t215022\n隐形防护网\t215023\nemmmm\t215024\n保利星海小镇\t215025\n柠檬干\t215026\n孝友\t215027\n船上\t215028\n宋晓垒\t215029\n南木\t215030\n广州长安医院\t215031\n妈妈是超人3\t215032\n盾尾\t215033\n强光\t215034\n乌头\t215035\n相融\t215036\n11.08\t215037\n杀戮尖塔\t215038\n呼和浩特民族学院\t215039\n检举\t215040\n結\t215041\n座机号\t215042\n几千万\t215043\n洋槐\t215044\n水平尺\t215045\n加油牌\t215046\n丽芙·泰勒\t215047\n郑外\t215048\nLPA\t215049\n芳华火影忍者ol\t215050\nHeterogeneous\t215051\n汉_\t215052\n郑州儿童医院\t215053\n大秦网\t215054\n音樂\t215055\nKC网络电话\t215056\nquanzhou\t215057\n管家式\t215058\n熊猫酒仙\t215059\nSmile\t215060\nEr\t215061\n朱赢椿\t215062\n运城市中心医院\t215063\n河北航空\t215064\n台州在线\t215065\nSketchUp2017\t215066\n储气\t215067\n叙帝利\t215068\n电子产品展\t215069\n语言不通\t215070\n国药控股\t215071\n权健足球俱乐部\t215072\nowns\t215073\n钛酸钡\t215074\njexus\t215075\ndot4\t215076\n顾问团\t215077\n下沙高教园区\t215078\n赤潮指挥学院\t215079\n潘安湖湿地公园\t215080\n定点\t215081\n金水宝胶囊\t215082\n舱盖\t215083\nkotlin\t215084\n动手做\t215085\n音教\t215086\nYachts\t215087\n翼子板\t215088\n退改\t215089\nato\t215090\n1320804\t215091\n代购费\t215092\n黑白条\t215093\n40度\t215094\n媳妇\t215095\n金属打包机\t215096\n脱轨\
t215097\n何兆武\t215098\n1300\t215099\n一个div块\t215100\n苯甲酸乙酯\t215101\n30多次\t215102\n受敌\t215103\n11月下旬\t215104\n番茄酱\t215105\n冲突\t215106\n8828\t215107\n伙计们\t215108\n遨游\t215109\n金蝶集团\t215110\n吉尼斯世界之最大全\t215111\n一汽轿车\t215112\n孙颖\t215113\n北京观韬中茂律师事务所\t215114\nVariant\t215115\n翠屏诚园\t215116\nweapons\t215117\n小结节灶\t215118\n武汉大学口腔医院\t215119\n屏边\t215120\ng/cm3\t215121\nWMI\t215122\n德州论坛\t215123\n斗柜\t215124\n玻璃钢冷却塔\t215125\n宫本凡人修仙传\t215126\n万翔\t215127\n翻新机\t215128\n大小鬼\t215129\n波浪谷\t215130\n甘肃省委党校\t215131\nshowroom\t215132\n48例\t215133\nportrait\t215134\n暹罗海洋世界\t215135\n重调\t215136\n150平\t215137\n打蛋器\t215138\n烟味\t215139\n筋斗\t215140\n康园\t215141\nbrp\t215142\n未央城\t215143\n城西小学\t215144\n小米众筹\t215145\nDesktx\t215146\n飒爽\t215147\n观音拳\t215148\n2800万元\t215149\n灵塔\t215150\n不粘胶\t215151\n镇流器\t215152\n消受\t215153\n彩虹作文网\t215154\nstep函数\t215155\ngenomes\t215156\n国祯环保\t215157\ndft\t215158\n文化\t215159\n应用计量经济学\t215160\nnba常规赛\t215161\n11座\t215162\ndouyin\t215163\ncostco\t215164\n竞争币\t215165\n刘红军\t215166\n物理学家\t215167\n声子谱\t215168\n贝克莱\t215169\n37次\t215170\n20121129\t215171\n正山小种红茶\t215172\n温江中学\t215173\n正义性\t215174\nantipodes\t215175\n360巴迪龙\t215176\nXLS\t215177\n90秒\t215178\n五化\t215179\n盘锦大洼\t215180\ntanwan\t215181\n永城北\t215182\n佳能mf4700\t215183\n禽兽\t215184\n超级吸引力\t215185\n峥嵘\t215186\n移民百事通\t215187\n海洛\t215188\nCamel\t215189\n快捷酒店\t215190\n天女散花\t215191\nsql日志\t215192\n女人们\t215193\n坏人\t215194\n保贴\t215195\n歪嘴\t215196\n竞秀\t215197\nKeylight\t215198\nX3650M5\t215199\nMapping\t215200\nrasp\t215201\n华阴市\t215202\n大秦铁路\t215203\n二口女\t215204\nStationery\t215205\n超前进位加法器\t215206\n高黄\t215207\nCC2015\t215208\n空船\t215209\n思拓力\t215210\n营养周\t215211\n九项\t215212\n分贝值\t215213\n滚花\t215214\n视察\t215215\n京汉大道\t215216\n台州市人力资源和社会保障局\t215217\n荷兰代尔夫特理工大学\t215218\n赞美\t215219\n果菜\t215220\n3559\t215221\n48\t215222\n无穷重阻\t215223\n黄腾峡\t215224\n拉斯普京\t215225\n大面积\t215226\nA3+\t215227\n旧貌\t215228\n论文学\t215229\n38年\t215230\n坚强不屈\t215231\n船企\t215232\n柠檬鸭\t215233\n碍\t215234\n宝马奔驰\t215235\n儿童福利院\t215236\n容中尔甲\t215237\nフェ\t215238\n纯形
\t215239\nCryp\t215240\n臭脚\t215241\n便携式\t215242\nshell批量\t215243\n徐碧城\t215244\nSleep\t215245\n孔令辉\t215246\n上海市\t215247\n更趋\t215248\n海飞丝\t215249\n沐清雨\t215250\n顺次\t215251\n苏颜\t215252\n36题\t215253\n二零一八年\t215254\n手纹\t215255\n市人大常委会\t215256\n5.0寸\t215257\n风拳\t215258\npvcreate\t215259\n一块一块\t215260\n查尔斯\t215261\n横图\t215262\n汤臣\t215263\n中国医药集团\t215264\n银镯\t215265\nCSGIRL\t215266\n湖南省科技厅\t215267\ncc2014\t215268\n宁波大学医学院附属医院\t215269\n五倍子\t215270\nPIXMA\t215271\nZBlog\t215272\n央视春节联欢晚会\t215273\n真实化\t215274\n31类\t215275\n草原\t215276\n粉丝邦\t215277\n重拳\t215278\n孝感北站\t215279\n宁夏食品药品监督管理局\t215280\n撒哈拉的故事\t215281\ns15\t215282\n外国人\t215283\n金鼎奖\t215284\n流量\t215285\n黑皮鸡枞菌\t215286\n跑起来\t215287\n血点\t215288\n北京城建龙樾湾\t215289\n2310m\t215290\n崔荣容\t215291\n正拍\t215292\n来看\t215293\n小依\t215294\n地气儿\t215295\n近三年内\t215296\n松禾\t215297\n民歌手\t215298\n脱毛仪\t215299\n肝肠\t215300\n依存度\t215301\n下盘\t215302\n黑背\t215303\n二柄\t215304\n汉坡\t215305\nWin10资源管理器\t215306\nkth\t215307\n广西民政厅\t215308\n龙城广场\t215309\nLogitech\t215310\nFarmer\t215311\nDip\t215312\n爱不爱你\t215313\n飞艇\t215314\n食玩\t215315\n龙翔大道\t215316\n13类\t215317\nBirdy\t215318\n电势能\t215319\n共青团中央书记处\t215320\n不一样的搞法在线视频\t215321\n伍月\t215322\n武汉市新洲区政府\t215323\n一个星期天\t215324\n电蚊香\t215325\n好辛苦\t215326\n凝气\t215327\n黑科技\t215328\n现状\t215329\n5码\t215330\n皮面具\t215331\nOpenGl\t215332\nj20\t215333\n电动执行机构\t215334\n河北省国土资源厅\t215335\n冀鲁豫\t215336\n制香\t215337\n淄博市妇幼保健院\t215338\nbrief\t215339\n4Minute\t215340\n输入法老\t215341\n成就\t215342\nHolistic\t215343\n清谈\t215344\noppor7s\t215345\n爱秀美\t215346\n浩森\t215347\n龙券网\t215348\n鳐\t215349\n薛埠镇\t215350\n小黄狗\t215351\n龙纹身的女孩\t215352\n吉林华桥外国语学院\t215353\n2GD5\t215354\n绕线器\t215355\nbn\t215356\nskyworth\t215357\n机票\t215358\n裤边\t215359\n伴舞\t215360\nbjcps\t215361\nFH\t215362\nbitmex\t215363\n意见不合\t215364\n乐视乐\t215365\n都市晨报\t215366\n海之恋\t215367\n围殴\t215368\n漆色\t215369\n见\t215370\n有线电视台\t215371\n资\t215372\nSocks\t215373\n同济大学附属口腔医院\t215374\n漂染\t215375\n考驾照\t215376\n野百合\t215377\n落款处\t215378\n苏州大学图书馆\t215379\n白沙泉\t215380\n日坛公园\t215381\n4.34\t2153
82\n闲置房\t215383\nfinishes\t215384\nゐ\t215385\n护边\t215386\nFe2O3\t215387\n莫兰蒂\t215388\nsx\t215389\n北京市科协\t215390\n长征镇\t215391\ncraft\t215392\nLathe\t215393\n花园洋房一楼\t215394\n扒圈\t215395\n无节操\t215396\neSpace\t215397\n上海申鑫\t215398\n浮游菌\t215399\n无再\t215400\n河北雄安\t215401\n演绎者\t215402\n阀型\t215403\n安居区人民政府\t215404\n马乐\t215405\n爱情电影\t215406\ninstrumentation\t215407\nSites\t215408\n克林霉素\t215409\n电磁炉\t215410\n蝴蝶型\t215411\n80W\t215412\n游吧\t215413\ninequality\t215414\n深圳维盟科技股份有限公司\t215415\n花椒籽\t215416\nwher\t215417\n华为mate9吧_\t215418\n沙德沃克\t215419\n王英\t215420\n阳起石\t215421\n招字\t215422\n场式\t215423\n金边虎皮兰\t215424\nnuclear\t215425\n张文权\t215426\n枣庄西站\t215427\n谭逗逗\t215428\nXMLSpy\t215429\n啪啪啪\t215430\n钟山\t215431\n双叶杏\t215432\n七科\t215433\ndianshiju\t215434\n金虹桥\t215435\n悬河\t215436\n锦江之星酒店\t215437\n碳酸氢铵\t215438\n24卷\t215439\n冒泡排序\t215440\n过错方\t215441\n城镇居民医疗保险\t215442\n微商朋友圈\t215443\nOpenFire\t215444\n刮胡泡\t215445\n36578\t215446\ndapi\t215447\nStart\t215448\n天下归元\t215449\n中川机场\t215450\nExtjs6\t215451\n六君子汤\t215452\n杭州市经济和信息化委员会\t215453\n厄里斯\t215454\ngmssl\t215455\n企事业单位\t215456\nDermatology\t215457\n医鸿鸟\t215458\n草莓穆斯塔\t215459\nP35\t215460\n小睡\t215461\nv30\t215462\n御风\t215463\n京官\t215464\nDcloud\t215465\n四万亿\t215466\n抵压\t215467\n魔神坛斗士\t215468\n短板\t215469\n物业服务合同\t215470\n留发\t215471\n七八\t215472\nlpl2018\t215473\n26届\t215474\n2个工作日\t215475\n5952\t215476\n气点\t215477\n杰尼斯\t215478\n维拉帕米\t215479\n刘海龙\t215480\nTRIP\t215481\n第五笔\t215482\nNAMD\t215483\n成都人事考试网\t215484\n要约\t215485\nUPS不间断电源\t215486\n初等矩阵\t215487\n羊奶\t215488\n老了\t215489\n挺\t215490\n克里斯汀\t215491\n何凯文\t215492\n合表\t215493\n高密度脂蛋白胆固醇\t215494\n当属\t215495\n频遇\t215496\n吹灯\t215497\n形术\t215498\n张建龙\t215499\n身前\t215500\nUIAlert\t215501\n来生缘\t215502\n凌晨1点\t215503\n绕道\t215504\n中控智慧科技股份有限公司\t215505\n七招\t215506\n点餐机\t215507\n扫描头\t215508\nburpsuite\t215509\nファッション\t215510\n信息战\t215511\n欧洲区\t215512\n白岛\t215513\n12.0\t215514\nyoutube\t215515\n冷食\t215516\n菜板\t215517\n8W\t215518\n丝虫病\t215519\n新闻直播间\t215520\n冠昊生物\t215521\n小孙\t215522\nStandards\t215523\n
1789年\t215524\ncccpan\t215525\n阎锡山\t215526\n汇天国际结算网\t215527\n肉禽\t215528\nas2\t215529\n56路\t215530\n惊奇\t215531\n尿检\t215532\n0001901212\t215533\n藕池\t215534\n钢芯铝绞线\t215535\n高点\t215536\n先马坦克\t215537\n自然辩证法概论\t215538\n肿瘤科_99健康网\t215539\n1692\t215540\n槽板\t215541\n阿杜\t215542\n麻吉\t215543\n建行企业网上银行\t215544\n勤勤\t215545\n春城路\t215546\n新海港\t215547\n班纳\t215548\nP8B75\t215549\n禁画\t215550\nps蒙板\t215551\n川中\t215552\n3RD\t215553\n新罕布什尔州\t215554\n9553安卓\t215555\n注册环评工程师\t215556\n男扮女\t215557\n17条\t215558\n1832\t215559\n红姑\t215560\n翡翠公馆\t215561\n圆片\t215562\n秋山澪\t215563\n操作规则\t215564\nweike\t215565\n经咒\t215566\np158b\t215567\n三国演义\t215568\n支配者\t215569\n溜溜球\t215570\n9.7.1.3519\t215571\n风系\t215572\n信院\t215573\n鹅埠镇\t215574\n香缇\t215575\n48度\t215576\n正大天晴药业集团股份有限公司\t215577\nGomez\t215578\n第7个\t215579\n影流\t215580\nAmperex\t215581\n吃菜\t215582\n九周\t215583\n东君\t215584\n二宫和香\t215585\n开发师\t215586\n益智游戏\t215587\nXPE\t215588\n土地网\t215589\n还魂\t215590\n韦鲁斯\t215591\n侯军\t215592\n搜狐基金\t215593\n伯南克\t215594\n大奶姐妹花1\t215595\n灰空间\t215596\n受得了\t215597\n最后一次\t215598\n2018年5月6日\t215599\nAlamps\t215600\n飞天螳螂\t215601\n得服\t215602\n把式\t215603\n第一曝光台\t215604\n42次\t215605\n有限理性\t215606\n冶春\t215607\n阿瓦\t215608\n烘干机\t215609\n一年左右\t215610\n中保协\t215611\n中老年\t215612\n溆浦县\t215613\nofficiel\t215614\nSurveillance\t215615\n孙敬\t215616\n奶茶妹妹\t215617\n国观\t215618\n咪咕音乐\t215619\n福禄桐\t215620\nHUB\t215621\n末学\t215622\n20161212\t215623\n!\t215624\n昆明人才网\t215625\n线型\t215626\n华美紫馨\t215627\n清水房\t215628\n钩衣\t215629\n企业安全生产标准化基本规范\t215630\n菩提路\t215631\n福特锐界\t215632\n自然主义\t215633\n嘉园小区\t215634\nLocate\t215635\n翻白眼\t215636\n红旗社区\t215637\n70集\t215638\n吉尼斯世界纪录大全\t215639\n蓝汛\t215640\nSS64.com\t215641\n何杨\t215642\n云熙\t215643\n药流\t215644\n祖国万岁\t215645\n43年\t215646\n显色反应\t215647\n2&#41\t215648\n卡多克\t215649\n细项\t215650\n乙脑\t215651\n南园岛\t215652\n犯贱\t215653\n钦州360网\t215654\n墓王之王寒铁斗\t215655\nJong\t215656\nCAO\t215657\ncode128\t215658\n严明党\t215659\n赵大宝\t215660\n蛐蛐\t215661\n亲假\t215662\nnier\t215663\n伏安\t215664\n拉森桩\t215665\nChew\t215666\n欲海情魔\t215667\n国
债利率\t215668\n窨井盖\t215669\nTMG\t215670\n绘声绘影\t215671\n凡科\t215672\n触诊\t215673\nA股公司\t215674\n尚客优\t215675\n方仲永\t215676\n鹤乡\t215677\n红枣汤\t215678\n湊莉久\t215679\n引流量\t215680\n0公里\t215681\n旋转椅\t215682\n航模\t215683\nBrit\t215684\n拿回来\t215685\n人骨\t215686\n航天\t215687\nk650\t215688\nmultisim12\t215689\n乞讨\t215690\n双峰山\t215691\n第12次\t215692\nhearth\t215693\nPlay\t215694\n奔驰GL级\t215695\n嫩逼\t215696\n秉承\t215697\n废掉\t215698\n奈须蘑菇\t215699\n马奇诺\t215700\n徐中约\t215701\nplayed\t215702\n静脉瓣\t215703\n湖南省公共资源交易中心\t215704\n承建商\t215705\n糖\t215706\nWhistle\t215707\nxiaoke\t215708\n土木工程师考试\t215709\n田益珍\t215710\n贾金金\t215711\nWavelet\t215712\n3368\t215713\n道客巴巴免费下载器\t215714\n坐船头\t215715\nchenssy\t215716\n玻汾\t215717\n云计价\t215718\n税务人\t215719\n赛罗\t215720\n弗吉尼亚·伍尔芙\t215721\n台州恩泽医疗中心\t215722\n4k电视\t215723\n日澳\t215724\n活点\t215725\nsdv\t215726\n平湖市\t215727\n天堂2\t215728\n梅花山\t215729\n刘萌萌\t215730\n护肘\t215731\n23.5\t215732\n李劲松\t215733\n绿色明珠网\t215734\n3538\t215735\n林三\t215736\n洗澡戏\t215737\n国家食品药品监督管理局执业药师资格认证中心\t215738\nGianna\t215739\nDaniu\t215740\n闪电侠第四季\t215741\n红豆水\t215742\nKeepalived\t215743\n妯娌\t215744\n扎染\t215745\nPMLC\t215746\n行气\t215747\n王小军\t215748\n平头哥蜜獾\t215749\n压铆螺柱\t215750\nbegin\t215751\n缝剂\t215752\nxsc\t215753\n网域\t215754\n数控冲孔机\t215755\n1578\t215756\n寄情\t215757\nDLS\t215758\n宁波华翔\t215759\n浦江县\t215760\n李建平\t215761\n贵州高速公路集团有限公司\t215762\n大厂孔雀城\t215763\n粉丝通\t215764\n离间\t215765\n国药准字查询网\t215766\n鼠王\t215767\n大款\t215768\n爱拍原创资讯中心\t215769\n鲍氏\t215770\n幽桐\t215771\n松江\t215772\n玻璃壶\t215773\n日英\t215774\n斗舞\t215775\n四川国际标榜职业学院\t215776\nitms\t215777\n小辫子\t215778\nhcd\t215779\n国宏\t215780\n睫\t215781\n曹蒹葭\t215782\n毁魅吧\t215783\nxiangmu\t215784\nso.2\t215785\n六十甲子\t215786\n建图\t215787\n宁死不屈\t215788\n游戏性\t215789\n路条\t215790\n20150614\t215791\n英语学习\t215792\n西安市\t215793\n五祖寺\t215794\n非居民企业股权转让\t215795\n24105\t215796\n优题\t215797\n铬钒钢\t215798\n16层\t215799\n幽冥\t215800\n95555\t215801\n鄂西\t215802\n围甲\t215803\n实际\t215804\njakarta\t215805\n6.21\t215806\n唐七公子\t215807\n左宁\t215808\n兴长\t215809\n消间\t215810\n真香\t215811\ng60
\t215812\n萨隆\t215813\nALTIUM\t215814\n随身携带\t215815\n马晓\t215816\n草柳\t215817\n周昆\t215818\n横县\t215819\n2a-1\t215820\n21.75天\t215821\n旺诠\t215822\n本玩\t215823\n哈尔滨华德学院\t215824\n跨线程\t215825\n8.5\t215826\ngamma\t215827\n流感下的北京中年\t215828\niphone6p\t215829\n华为荣耀4c\t215830\nhandy\t215831\n时空之轮2\t215832\n金融卡\t215833\n梵花\t215834\n荣华路\t215835\n吴亦凡\t215836\nRicoh\t215837\n焰魔\t215838\n荷兰网\t215839\n1.2版\t215840\n聚集\t215841\n柠条\t215842\n记事作文_小故事网\t215843\n水尾\t215844\n龟峰山风景区\t215845\n神国\t215846\nHollow\t215847\n百变\t215848\n吸嘴\t215849\n鄞州法院\t215850\n81分\t215851\n嘉荫县\t215852\nprops值\t215853\n70款\t215854\nWEBSERVICE\t215855\n无法忘怀\t215856\n守夜者\t215857\n低格\t215858\n五一中路\t215859\n深圳集散中心\t215860\n明鉴\t215861\n鹰派\t215862\n杭州市气象局\t215863\n海港\t215864\n百世快递单号\t215865\n霍比特人3:五军之战\t215866\n不动产登记局\t215867\nDefender安全中心\t215868\nPYQT\t215869\n利企\t215870\n500多名\t215871\n名调\t215872\nyon\t215873\n诗梦\t215874\n穿山甲\t215875\n罗子雄\t215876\n十六分\t215877\n言行\t215878\nLush\t215879\n百贸网\t215880\n7分米\t215881\n28周\t215882\n药明生物\t215883\n320元\t215884\n昆明市五华区人民政府\t215885\nYuu\t215886\n动平衡仪\t215887\n谍影重重5\t215888\n石强\t215889\n很脏\t215890\n拉菲草\t215891\nInno\t215892\n宣纸\t215893\n诺心LECAKE\t215894\n守望先\t215895\n差错率\t215896\n金仕堡\t215897\n12处\t215898\n赶到\t215899\n拒缴\t215900\n2017年10月19日\t215901\n风行SX6\t215902\n到此为止\t215903\nfopen\t215904\n庄生\t215905\n重庆妈妈网\t215906\n一式\t215907\n龙君\t215908\n上善\t215909\n石评\t215910\nsteam\t215911\n三星s6edge\t215912\nsustainability\t215913\n3.21\t215914\n双井站\t215915\n终稿\t215916\nNOSQL\t215917\n曹宅\t215918\n林加德\t215919\nHUANG\t215920\n泰永长征\t215921\n小野洋子\t215922\n折现率\t215923\n北京天坛医院\t215924\n殇雪\t215925\n淄博高新区\t215926\n中兴通讯公司\t215927\n孙瑞\t215928\n栾树\t215929\ntablets\t215930\nsweater\t215931\n优格\t215932\n魔兽福克斯\t215933\n尚志市\t215934\n周朴园\t215935\n车轮胎\t215936\n奥宇\t215937\n不一样\t215938\nIsland\t215939\n蒙奇\t215940\n福天\t215941\n姜之鱼\t215942\n济南教育宝\t215943\n安置补助费\t215944\n.docx\t215945\ndisp函数\t215946\n金鸡湖\t215947\nslurry\t215948\nAOE\t215949\n福门\t215950\n逃亡\t215951\n今川义元\t215952\n82225527\t215953\n经济之声\t215954\
nPet\t215955\n曲柄滑块\t215956\n2^\t215957\nNHibernate\t215958\n真实惠\t215959\ncoreldrawx8\t215960\n乌夜啼\t215961\n昆明工业职业技术学院\t215962\n失范\t215963\n周冠军\t215964\n克鲁斯\t215965\nkey_\t215966\n冯小罗纳尔多\t215967\n纷\t215968\n機\t215969\n法人股\t215970\n征询\t215971\n第54期\t215972\n袁阔成\t215973\njoanne\t215974\n丁克吧\t215975\n移动浏览器\t215976\n分卷压缩包\t215977\n人言可畏\t215978\njion\t215979\n诺贝尔奖\t215980\n第一宗\t215981\n阿伦尼乌斯\t215982\n2玖章\t215983\ngm501\t215984\n峡谷\t215985\n徐震\t215986\n雪人套\t215987\n多退少补\t215988\n中国平安保险集团\t215989\n0591-8621xxxx\t215990\n蓝铜\t215991\n极简主义者\t215992\nlxw\t215993\n凌越\t215994\n高英培\t215995\n步行者\t215996\n荆州博物馆\t215997\n超过30分钟\t215998\n做歌\t215999\n纳米压痕仪\t216000\n四足\t216001\n无障碍\t216002\n公职类\t216003\nLinkError\t216004\n实小\t216005\nqual\t216006\naspx\t216007\n这一站\t216008\n兴延高速\t216009\nNASDAQ\t216010\nlkong\t216011\n4399生死狙击\t216012\n斯巴鲁傲虎\t216013\n货号\t216014\n平凡希\t216015\n美国医院\t216016\n贺岁纪念币\t216017\n刘病\t216018\n成分股\t216019\n三非\t216020\n团标\t216021\n小玩意\t216022\n众赢\t216023\n石家庄市公安局\t216024\n两槽\t216025\n中国检验认证集团\t216026\n美国州\t216027\n起号\t216028\n未来3天\t216029\n黄氏\t216030\nDPT-S1\t216031\n百战天虫\t216032\n石黑京香\t216033\n留声玩具\t216034\n碘伏消毒\t216035\n27篇\t216036\n留党\t216037\ngetElementById\t216038\n朝霞\t216039\nLinn\t216040\n信阳学院\t216041\n疣体\t216042\nincl\t216043\n4首\t216044\n挎\t216045\n禁考\t216046\n运城北\t216047\n千力\t216048\n瑞士法郎\t216049\n磨刀机\t216050\n超临界流体\t216051\n真名\t216052\nstat函数\t216053\nskpye\t216054\nF660\t216055\n同种\t216056\n青年创业园\t216057\n沈阳北站\t216058\n小问\t216059\n北桥镇\t216060\n诗派\t216061\nultimate\t216062\n政治犯\t216063\n痴人说梦\t216064\nchexun\t216065\n完美视频大全\t216066\nQH\t216067\n双休日\t216068\n戴表\t216069\n姬岛瑠梨香\t216070\n幺儿\t216071\n易法通\t216072\nUSCPA\t216073\n套镜\t216074\nbs4\t216075\n三国志霸王的大陆\t216076\nBarron\t216077\n木塞\t216078\nCuff\t216079\n赛虎\t216080\n平明\t216081\n次元位\t216082\nclinet\t216083\n小辰\t216084\n讨好型\t216085\n33篇\t216086\n魅族PRO\t216087\n限幅电路\t216088\n易修网\t216089\nhidpi\t216090\nOFweek医疗科技网\t216091\n20020\t216092\n24路\t216093\n玛曲县\t216094\n旅游券\t216095\n东江时报\t216096\n招商管理\t216097\n惊天阴
谋\t216098\n漂不漂亮\t216099\n韩庚村上春树\t216100\n新莞\t216101\n晋北\t216102\n3Dmax2012\t216103\n用典\t216104\n浮动层\t216105\n数物\t216106\n科华恒盛\t216107\n罗纳尔多\t216108\n2018年1月22日\t216109\n反应谱\t216110\n2018-01-03\t216111\n圣灰\t216112\n锂云母\t216113\n合肥高新区管委会\t216114\n短靴\t216115\n异或门\t216116\n远期\t216117\n雜誌\t216118\nAntonyms\t216119\n岳成律师事务所\t216120\n千日\t216121\n品冠\t216122\n辛集市\t216123\n出摊\t216124\nArcEngine\t216125\nJProfiler\t216126\nxfce\t216127\n办园\t216128\n15.7%\t216129\n望\t216130\n等待戈多\t216131\n金泰妍\t216132\n阳泉煤业\t216133\nirae\t216134\n文心兰\t216135\n忐忑\t216136\n侠客风云传/新武林群侠传\t216137\n烤牛排\t216138\n逆水寒\t216139\n李烈\t216140\n牛皮藓\t216141\n35crmo\t216142\n会风\t216143\n笨牛\t216144\nUBuntu\t216145\n木舟\t216146\nkpw\t216147\nLDAC\t216148\n相思成瘾\t216149\neSight\t216150\n心泪\t216151\n分水岭\t216152\n糜竺\t216153\nTRB\t216154\n题序\t216155\nosmo2\t216156\n广外\t216157\n奈奈生\t216158\n生疮\t216159\n押运\t216160\n弹性力学\t216161\n窗上\t216162\n三个女人一个因\t216163\nINFINITY\t216164\n花笺\t216165\n牙疼\t216166\n喜力\t216167\nmohurd\t216168\n叙事性\t216169\n牛牛网\t216170\ndistillation\t216171\nLED屏幕\t216172\n腰椎管狭窄\t216173\n灰溜溜\t216174\nscapes\t216175\n遵义吧\t216176\nmkfs\t216177\n莫小安\t216178\n选行\t216179\nangle\t216180\n雅途\t216181\n沙河市\t216182\n波奇\t216183\n滨海工业区\t216184\neID\t216185\nIBAN\t216186\n科普类\t216187\n李建滨\t216188\n2.22\t216189\n短发\t216190\n麦振鸿\t216191\n欲望之岛\t216192\ntp5.1\t216193\nurgent\t216194\n微信公众平台编辑器\t216195\n屯儿\t216196\n罗强\t216197\n好几天\t216198\n借力\t216199\n电源时序器\t216200\n湖北财税职业学院\t216201\n第十七条\t216202\n白带拉丝\t216203\n600亿元\t216204\n牛氏\t216205\n限售股\t216206\n营销员\t216207\n2016-2016年\t216208\n发动\t216209\n小赢\t216210\n转音\t216211\n彩贴\t216212\nTOWN\t216213\n今后\t216214\nWeCenter\t216215\nsaoif\t216216\nWest\t216217\n汪涵\t216218\n电子器件\t216219\nwin7共享文件\t216220\n丝芭\t216221\n水谷隼\t216222\n中盐\t216223\n团子\t216224\namini\t216225\n在吧\t216226\n昆明五华\t216227\n江南农村商业银行\t216228\nhnu\t216229\n大瑞铁路\t216230\n虎头\t216231\n牛腱子\t216232\n高森\t216233\nLogger\t216234\nYOLO2\t216235\n1:40\t216236\n吕宋\t216237\n梯形荷载\t216238\npdms\t216239\nclassname\t216240\n家庭教师\t216
241\n朱元\t216242\n晋王\t216243\n门内\t216244\nARMv8\t216245\n第一房网\t216246\n斯堪的纳维亚半岛\t216247\n一公里\t216248\n克鲁苏\t216249\n蛋神\t216250\n光力\t216251\n增长极\t216252\n夜览\t216253\ncommittee\t216254\n新乡日报\t216255\n定额项\t216256\n行李牌\t216257\n28块\t216258\n辻本杏\t216259\n工作业\t216260\n蚕业\t216261\n春晖路\t216262\n琼州\t216263\n直埋保温钢管\t216264\nmagento\t216265\n云迹\t216266\n考位\t216267\n烟花三月下扬州\t216268\n加里波利\t216269\n上合\t216270\n徐锐\t216271\n挪\t216272\n中国地球物理学会\t216273\n北京JW万豪酒店\t216274\nAsthma\t216275\n第…\t216276\npassion\t216277\n凉菜\t216278\n程璐\t216279\nfitbit\t216280\n自净\t216281\n212\t216282\nhaier\t216283\n群晖Synology\t216284\nMagicc\t216285\n三字\t216286\nglyphicon\t216287\n谢罪\t216288\n闪光字\t216289\nn3\t216290\n火影忍者究极风暴2\t216291\n荷航\t216292\n环境案\t216293\ndrcom\t216294\ngroupby\t216295\n利培酮片\t216296\nrepetier\t216297\n五星之光\t216298\n决不饶恕\t216299\n焦作站\t216300\n腹直肌\t216301\n田永成\t216302\n2016年五一劳动节\t216303\n20150603\t216304\n德友\t216305\n学生类\t216306\n姬路城\t216307\nreceives\t216308\n反语\t216309\nfxxk\t216310\n翠城馨园\t216311\n大气颗粒物\t216312\nSo\t216313\n母恋\t216314\nTenable\t216315\n北京市卫生计生委\t216316\n张老师\t216317\n无序\t216318\n永久居留权\t216319\n标量值函数\t216320\nc60\t216321\n曼海姆\t216322\n泰拉瑞亚里\t216323\n床帘\t216324\n谢玲玲\t216325\n凹陷\t216326\n应用名\t216327\n财富通\t216328\n定位理论\t216329\n良品\t216330\n微视频\t216331\n蠡湖\t216332\nKMPlayer播放器\t216333\nhtmleditor\t216334\n86号\t216335\n抗张\t216336\n聚智\t216337\n廖昌永\t216338\n丹山\t216339\npaige\t216340\n百度五笔\t216341\ns[i\t216342\n一叶扁舟\t216343\n日程\t216344\n中国历史地图集\t216345\n5670\t216346\n十八号\t216347\n注册域名\t216348\n简语\t216349\n指南录后序\t216350\n夜鸣猪\t216351\n_城市地区\t216352\n保护地球\t216353\n第11关\t216354\n沙湖\t216355\n光粉\t216356\nEdmund\t216357\n第67期\t216358\n洛阳路\t216359\n黄仁烁\t216360\nUTF-16\t216361\n陕西法院\t216362\n草榴论坛\t216363\n异维\t216364\n潜藏\t216365\nwww域名\t216366\n布胶\t216367\n南理工\t216368\n南唐遗少\t216369\nNOISE\t216370\n代培\t216371\n申宝忠\t216372\n国税电子税务局\t216373\n缘若\t216374\n资产回报率\t216375\n﹐\t216376\n凤祥\t216377\n三百米\t216378\n梅子黄时雨\t216379\n皇马尤文\t216380\n凉拌苦菊\t216381\n恒华\t216382\n附合\t216383\n荣耀水晶\t216384\n西诺克斯\t
216385\n山西中德科工机械制造有限公司\t216386\nlistview\t216387\n圣三国志英杰传\t216388\n五者\t216389\n纷乐\t216390\n金百利\t216391\n草食\t216392\n不动声色\t216393\n计收\t216394\n富源\t216395\n牵丝戏\t216396\nLevin\t216397\n校长室\t216398\n603883\t216399\n沙利亚\t216400\n马盖普\t216401\nrocketdock\t216402\n择思达\t216403\nChao\t216404\n全南\t216405\n30几\t216406\n结温\t216407\n科华北路\t216408\n荧光剂\t216409\n叶玲\t216410\n深圳市发改委\t216411\n县发改局\t216412\n最火\t216413\n狐獴\t216414\n中国延安干部学院\t216415\n墙角数枝梅\t216416\n联字\t216417\nDropzone\t216418\n美派\t216419\n2016年12月26日\t216420\nレイプ\t216421\nrepublic\t216422\n20160718\t216423\n谬误\t216424\n北方工业\t216425\n五洲园\t216426\n7天酒店\t216427\n冒险岛贝勒德\t216428\n刘新杰\t216429\n诸葛连弩\t216430\n锐角三角函数\t216431\n砂石分离机\t216432\n郭英成\t216433\n爱康集团\t216434\n福利好\t216435\n球团矿\t216436\n右页\t216437\n小辫儿\t216438\n免税州\t216439\n超氧化物歧化酶\t216440\nSneaker\t216441\n安娜堡\t216442\n千聊\t216443\n2324\t216444\nyeild\t216445\n艾菲\t216446\n碳氮比\t216447\n_一游网\t216448\n南京市妇幼保健院\t216449\n乃果\t216450\n吴克群\t216451\n柔性屏\t216452\n硕士学位论文\t216453\n毛器\t216454\ndiv距离浏览器\t216455\n精粹\t216456\n谷崎润一郎\t216457\nSynonym\t216458\n澄粉\t216459\n_克拉玛依政府网\t216460\n凌空路\t216461\n莫丝\t216462\njosh\t216463\nProGuard\t216464\n丁目\t216465\n煤制甲醇\t216466\n四句\t216467\n9187\t216468\n那些事\t216469\n军分区\t216470\n渠化\t216471\n企源\t216472\n伊斯坦堡\t216473\n任总\t216474\nnslog\t216475\n郑洪\t216476\nSaveAs\t216477\n幕府将军2武士之殇\t216478\nbefor\t216479\n妖\t216480\n林宝\t216481\n3部\t216482\nCharm\t216483\n四野\t216484\n喻越越\t216485\n数字示波器\t216486\nhousework\t216487\n9600\t216488\n烧酒网\t216489\n坐地起价\t216490\ny55\t216491\n阜阳路\t216492\n硬链接\t216493\nm1005\t216494\nlingfeng\t216495\nSRTM\t216496\n市委政法委\t216497\n三要素\t216498\n篇\t216499\n鸮\t216500\n卵蛋网\t216501\n托架\t216502\nJSF\t216503\n熊出没之熊心归来\t216504\n2310\t216505\nsudasuta\t216506\nFritz\t216507\n狄拉克\t216508\n梅毅\t216509\n浮梁县\t216510\nvariability\t216511\n声望\t216512\nLiszt\t216513\nbibox\t216514\n中国空间技术研究院\t216515\n云南驰宏锌锗股份有限公司\t216516\n北京公安\t216517\n艾灸\t216518\n数理统计\t216519\n百多\t216520\n好客山东\t216521\n找一找\t216522\n大學\t216523\n遮荫\t216524\n善者\t216525\n铺陈\t216526\n
小彤\t216527\nAlyssa\t216528\nwrold\t216529\n35路\t216530\n穿甲弹\t216531\n蔚\t216532\n曼苏拉娜\t216533\n出气\t216534\n阿里斯顿燃气灶\t216535\nuiscrollerview\t216536\n360游戏管家\t216537\nuwb\t216538\n柳云龙\t216539\n食疗养生_养生之道网\t216540\n建筑结构荷载规范\t216541\n50mg\t216542\n免修\t216543\n一羽\t216544\n张绣\t216545\nksv\t216546\n灯油\t216547\n诘问\t216548\n同意书\t216549\nMMX\t216550\nzfcg\t216551\n渐进式\t216552\n探空\t216553\n黄缘\t216554\n李森\t216555\n尤克里里\t216556\n花婆婆\t216557\nConsensus\t216558\ntascam\t216559\n计划生育证明\t216560\n眼镜架\t216561\n110.dll\t216562\n中国银行长城环球\t216563\n新邵县人民政府\t216564\n五仁月饼\t216565\ncapt\t216566\n一嗨\t216567\n桂东\t216568\n万泉小学\t216569\n唐钢\t216570\n忻城县\t216571\n35.8\t216572\n上海市监察局\t216573\n鲜皮\t216574\n一把\t216575\n迪安诊断\t216576\n流放之路迷宫附魔\t216577\n众投邦\t216578\n羽田真里\t216579\n猩红山峰\t216580\nResidential\t216581\n百分之五十\t216582\n降维算法\t216583\n卡牌游戏\t216584\nEuphoria\t216585\n小米粒疙瘩\t216586\n广东职业技术学院\t216587\n爆乳装\t216588\n腾格里\t216589\n金日\t216590\n稻盛霸王别姬\t216591\n彩美旬果\t216592\n景德路\t216593\nGRU\t216594\nmamicode\t216595\n陕西警官职业学院\t216596\n诗艺\t216597\n20170822\t216598\napparel\t216599\ng25\t216600\n香河县\t216601\n己土\t216602\nDeja\t216603\n上海卫计委\t216604\n3厘\t216605\n手写字\t216606\n查德\t216607\n王小康\t216608\ninSSIDer\t216609\n81条\t216610\n河北旅游职业学院\t216611\n飞飞CMS\t216612\n橙子\t216613\n数法\t216614\n方姓\t216615\n5月7日\t216616\n自助者\t216617\n少别\t216618\n0x8\t216619\n普利尼\t216620\nex750bt\t216621\nSmiling\t216622\n序列码\t216623\n灰堆\t216624\n26年\t216625\n子模块\t216626\np8p67\t216627\n沪杭\t216628\n汽运\t216629\n电子眼\t216630\n丰李镇\t216631\n白布鞋\t216632\nMitsubishi\t216633\n捕梦\t216634\nMiniVCap\t216635\n免租\t216636\n進行\t216637\n情女\t216638\nUTF8\t216639\ncz80\t216640\n安全工\t216641\n子路由器\t216642\nGeorg\t216643\n偶然性\t216644\nUG4.0\t216645\n聂卫平\t216646\n同题\t216647\n耐力板\t216648\n致逸\t216649\n1900\t216650\n深度linux\t216651\n32处\t216652\n迈腾\t216653\n6000年\t216654\nUEFI+GPT模式\t216655\n星象\t216656\n卓弥\t216657\n推荐码\t216658\nAutomation\t216659\n绝地求生卡\t216660\n堆焊\t216661\n主驾\t216662\n分子公司\t216663\n让行\t216664\n金融观察网\t216665\n七格格\t216666\nNO.27\t216667\n马大哈\t216
668\n机动战士高达0079\t216669\n11&\t216670\nSc\t216671\nPeaceful\t216672\n英锦赛\t216673\niPadAir\t216674\n骸骨\t216675\n镜\t216676\nmcgs\t216677\n御龙湾\t216678\n芥子园画谱\t216679\n陈志远\t216680\n做案\t216681\n双十中学\t216682\n仿真电路图\t216683\naoi\t216684\n湖海\t216685\n月期\t216686\n滚筒式\t216687\n威诺\t216688\n万豪礼\t216689\nArrayList类\t216690\nOverdrive\t216691\n软交所\t216692\n布尔函数\t216693\n北京火车站\t216694\n玉屏风颗粒\t216695\n_健康无忧网\t216696\n婉瑜\t216697\n建兴\t216698\n下拉项\t216699\n澳大利亚悉尼大学\t216700\n犬瘟\t216701\n银行存款\t216702\n余姚市人民政府办公室\t216703\n上海地铁7号线\t216704\nodl\t216705\n废渣\t216706\n观澜国际\t216707\n汇发\t216708\n瑜舍\t216709\n雷洞坪\t216710\n鸡藕\t216711\n268元\t216712\n河南省体育中心\t216713\n马小玲\t216714\n残荷\t216715\n血管炎\t216716\nF7\t216717\n纤毛\t216718\nSKII\t216719\n天堂鸟\t216720\n爱的故事\t216721\nhurried\t216722\n出国留\t216723\n情剑\t216724\n开户名\t216725\n广平\t216726\n多普达\t216727\n冒顶\t216728\nchopin\t216729\ne通卡\t216730\n1.2V\t216731\n水煮鸡蛋\t216732\ncontrollers\t216733\n库林\t216734\n钓鱼艇\t216735\ndo\t216736\nJia\t216737\n李佳思\t216738\n鸽巢问题\t216739\n2130元\t216740\n彭先生\t216741\n佞\t216742\nYAAW\t216743\n南京文明网\t216744\n任雪\t216745\n16排\t216746\n加油票\t216747\n中冶京诚\t216748\n黏住\t216749\n金书江湖\t216750\n2018周\t216751\n甲亢病\t216752\n军歌\t216753\n橄榄菜\t216754\n凌\t216755\n复旦学报\t216756\n艳香\t216757\n500款\t216758\n公路工程标准施工招标文件\t216759\nSysinternals\t216760\nqq群发\t216761\nendangered\t216762\n建工险\t216763\n李家庄\t216764\ni57500\t216765\nmsb\t216766\n学金\t216767\n合肥万达乐园\t216768\n失爆\t216769\n丰裕口\t216770\n何东\t216771\n注会经济法\t216772\n影单\t216773\n宁波教育学院\t216774\n100KM\t216775\ncale\t216776\ndublin\t216777\n生猪\t216778\n2015年9月\t216779\n撸夫\t216780\n摩奇\t216781\n无线遥控器\t216782\n葡币\t216783\n好多好多\t216784\n张海生\t216785\npps\t216786\n万尾\t216787\n年轻的嫂子2\t216788\n混控\t216789\n48穴\t216790\n柱石\t216791\n托\t216792\n一第三章\t216793\n有理函数\t216794\n洪梅镇\t216795\n主恩\t216796\nreasoning\t216797\nw3resource\t216798\n取道\t216799\nHIC\t216800\n烟网\t216801\n小米电脑\t216802\n灰暗\t216803\n内嵌\t216804\n风疹病毒抗体\t216805\n32栋\t216806\n七水硫酸锌\t216807\n万里路\t216808\n盐雾腐蚀试验箱\t216809\nvariations\t216810\n虚拟交易\t216811\n00
852\t216812\n默写版\t216813\nAxygen\t216814\nwnt\t216815\n意空间阅读网\t216816\n直管公房\t216817\nc14\t216818\n美姑\t216819\n赫比\t216820\n6.39\t216821\n粤北\t216822\nchuchu\t216823\n鄂东晚报\t216824\nDior\t216825\n核数\t216826\n监规\t216827\n鬼方\t216828\n磨碎\t216829\n孟衍竹\t216830\n苁蓉益肾颗粒\t216831\n岸部\t216832\n曝光\t216833\n新汉\t216834\n国家留学基金管理委员会\t216835\nYann\t216836\n棉湖镇\t216837\n茹雅慧\t216838\n疲劳感\t216839\n魂环\t216840\n白雪峰\t216841\n水云\t216842\nAsshole\t216843\n诸如此类\t216844\n陆寓丰\t216845\n22nm\t216846\n郎君\t216847\n老腊肉\t216848\n不高兴\t216849\n蒙草\t216850\n係\t216851\n山西大学附属中学\t216852\n10万名\t216853\n莽荒纪\t216854\nnetfilter\t216855\nDom\t216856\n无患\t216857\n冲锋\t216858\n魔宗\t216859\n香格里拉镇\t216860\nrabbitmq-server\t216861\n化身程序猿\t216862\n程鹏\t216863\n西湖区教育局\t216864\nb85m-d3v\t216865\nTIAN\t216866\n长野县\t216867\nGal\t216868\n食邑\t216869\n天业\t216870\ncash\t216871\n八都镇\t216872\n太空一号\t216873\nInfrared\t216874\n河南省政府\t216875\ncygames\t216876\njidian\t216877\n菜车\t216878\n怕是\t216879\n双流国际机场\t216880\n2018.0\t216881\n8x8x\t216882\n悟道2\t216883\n电热饭盒\t216884\ntf2\t216885\n心理咨询\t216886\n马塍路\t216887\n冯大辉\t216888\n西川贵教\t216889\n地图学\t216890\n约翰福音\t216891\n厦门科华恒盛股份有限公司\t216892\n星礼\t216893\n滑台\t216894\nHex\t216895\n单恋大作战\t216896\n三国戏英杰传\t216897\n自发电\t216898\n等边\t216899\n42.dll\t216900\n爱丽丝学园\t216901\n中铁十九局集团有限公司\t216902\nwww.zhuna.cn/hotellist/e0301/a0017/\t216903\n2017年6月3日\t216904\n孙静雅\t216905\nnosetests\t216906\n娜美罗宾\t216907\n必发指数\t216908\n送课\t216909\n草莓季\t216910\nnesting\t216911\nZB\t216912\n2x\t216913\nhplaserjet\t216914\n国立清华大学\t216915\nportal认证\t216916\nReaper\t216917\n男靴\t216918\n起息\t216919\n真力\t216920\n豪迪QQ群发器\t216921\nnsca\t216922\n人情风\t216923\n警察\t216924\n小微企业\t216925\n轻于\t216926\n书论\t216927\n大金中央空调\t216928\n楚汉风云\t216929\n焰口\t216930\n蒲团\t216931\nV2.02\t216932\n东映特摄剧\t216933\n自攻\t216934\n剪羽\t216935\n三焦\t216936\n泛欧\t216937\n天尚\t216938\n2亿\t216939\n二把手\t216940\n粘贴板\t216941\n凹型\t216942\n任梓慧\t216943\n5.27\t216944\n推信\t216945\n思德网\t216946\nhap\t216947\n豆花文\t216948\n维娜\t216949\n购物季\t216950\n盆帽\t216951\n红茶菌\t216952\n冻库\t216953
\n水房\t216954\n不做成\t216955\n心慌方\t216956\nminiUI\t216957\n粤教版小学\t216958\niSee\t216959\n乳酸菌饮料\t216960\n罗氏制药\t216961\n中南林科大\t216962\n快播网\t216963\nability\t216964\n威锋\t216965\n二手市\t216966\n宋城\t216967\n米糕\t216968\n欲恋学园2\t216969\n模具工\t216970\n超过三年\t216971\n每秒钟\t216972\ncreating\t216973\niVMS-4200网络视频监控\t216974\n困境\t216975\nsow\t216976\norcal\t216977\n木南\t216978\n江苏省金湖县政府\t216979\nM550\t216980\n1分钟左右\t216981\n中华草龟\t216982\nEFL\t216983\n惧\t216984\ndifficult\t216985\n奇葩说第四季\t216986\n机密\t216987\n西集镇\t216988\n国际件\t216989\nL800\t216990\n广马\t216991\n2700u\t216992\n东门路\t216993\n马踏英雄联盟之决胜巅峰\t216994\n周转箱\t216995\n梅捷\t216996\nShares\t216997\nvhs\t216998\n屠狗\t216999\n移机\t217000\nea7\t217001\n粘滞\t217002\n刺客信条:叛变\t217003\nsgt\t217004\n雇佣军\t217005\n15th\t217006\n滤泡性咽炎\t217007\n剖面图\t217008\n梅影莲香\t217009\n苦字\t217010\n上海米其林\t217011\nSamsonite\t217012\n否\t217013\n有点慢\t217014\nwelcome\t217015\n拉力计\t217016\n万朵\t217017\n邓健泓\t217018\nqw\t217019\nfcm\t217020\n20多分钟\t217021\n北京大学艺术学院\t217022\nkryo\t217023\n孙海洋\t217024\n望湖城\t217025\nBrokerage\t217026\nCommunist\t217027\n百度地图JSAPI\t217028\n草本植物\t217029\n安阳镇\t217030\n张振宇\t217031\n000935\t217032\n单双\t217033\n2018年五一\t217034\nFilling\t217035\n临淄区\t217036\n饼形\t217037\n耳标\t217038\n824\t217039\n吴伟业\t217040\n三轨\t217041\n党建工\t217042\n豪华感\t217043\n牌篇\t217044\n工程签证单\t217045\nnavicatformysql\t217046\n发昏\t217047\n赫恩\t217048\nDrives\t217049\n东湖社\t217050\n千田理子\t217051\nInfoComm\t217052\n电动力学\t217053\n20161108\t217054\n山西省国家税务局\t217055\n会展商\t217056\n立创\t217057\n王军霞\t217058\n咖色\t217059\n煎包\t217060\ngetHibernateTemplate\t217061\n一杯\t217062\n超越梦想\t217063\n吕姬\t217064\n电头\t217065\n金坛论坛\t217066\nDXDLW\t217067\nwwan\t217068\n二家\t217069\n语块\t217070\n银白\t217071\n_斯诺克\t217072\npublickey\t217073\n6步\t217074\n真人版\t217075\n陈希\t217076\n普米\t217077\n全热\t217078\n门风\t217079\n钥匙链\t217080\nMRS\t217081\n中国电子信息产业集团\t217082\n原油\t217083\nChapters\t217084\n矢状\t217085\n茶文艺茶\t217086\n1874年\t217087\n白女巫\t217088\n叶澜\t217089\nzephyr\t217090\n16日\t217091\n三【\t217092\n盘锦北\t217093\n谷朊粉\t217094\nR
AID0\t217095\nTracker\t217096\n梅里美\t217097\n展示柜\t217098\n张跃\t217099\n放行\t217100\n莫听穿林打叶声\t217101\n十四周年\t217102\nS5PV210\t217103\n再循环\t217104\nSICP\t217105\npox\t217106\n绝对部部\t217107\n党的建设\t217108\n枸杞岛\t217109\n喊\t217110\ne盾\t217111\n深圳实验学校\t217112\n手痒\t217113\n基桩\t217114\n姜堰区\t217115\nDreamweaver\t217116\n隅\t217117\n工业遗址\t217118\n严植婵\t217119\n坚如磐石\t217120\ni76700\t217121\nBoA\t217122\n平邑\t217123\n居住证\t217124\nMacroeconomics\t217125\n百度助手\t217126\n捍卫者\t217127\n竞争度\t217128\n唱盘\t217129\nDocker\t217130\nTeleport\t217131\n中粮集团\t217132\n雅萌10t\t217133\nfilling\t217134\nMetallica\t217135\n安e\t217136\nPallet\t217137\nxc90\t217138\nCGRect\t217139\n气压\t217140\n朝鲜人民所\t217141\nkickers\t217142\n千树\t217143\n国建\t217144\n收官战\t217145\nPokemmo\t217146\nCla\t217147\np9plus\t217148\n做戏\t217149\n洪雨\t217150\n大摩\t217151\nVb\t217152\ncaxa2018\t217153\n2万里\t217154\n塔希里亚\t217155\n性取向\t217156\nMDC\t217157\n骑缝\t217158\njpg格式\t217159\n11月5日\t217160\n散热风扇\t217161\n阮琦\t217162\n新办\t217163\n青云街\t217164\n晶面\t217165\nREVOLUTION\t217166\n壹心\t217167\n地圈梁\t217168\n4399手游通\t217169\n洋枪\t217170\n夏明\t217171\n星屑\t217172\n油漆桶\t217173\nDENON\t217174\n20天内\t217175\n6CM\t217176\n混沌与秩序2吧_\t217177\n亲爱的朋友\t217178\n无线中继器\t217179\n陈埭镇\t217180\n设计史\t217181\nSense8\t217182\n危险人物\t217183\n红月传说\t217184\n声望商\t217185\n一颗一颗\t217186\n水感\t217187\nmastercard\t217188\n左乙拉西坦片\t217189\n倪海厦\t217190\n杰拉德\t217191\n异宠\t217192\n海货\t217193\n今年清明节\t217194\nusb接口\t217195\n国家建设高水平大学\t217196\n坚韧\t217197\nBOM表\t217198\n雅思G类\t217199\n埃尔克森\t217200\n探伤仪\t217201\n玉米苗\t217202\n细工\t217203\n欠\t217204\n相亲相爱一家人\t217205\nv4.3\t217206\n04月09日\t217207\n红魔乡\t217208\nrup\t217209\n包豪斯\t217210\nupdated\t217211\n波姬小丝\t217212\n羽线\t217213\nhelps\t217214\n天井式\t217215\n黄维\t217216\n脚步\t217217\n世爵\t217218\ndisabling\t217219\n2163\t217220\nbiang\t217221\nIP地图\t217222\nov\t217223\n牛黄清胃丸\t217224\n知行网\t217225\n征衣\t217226\n蟹棒\t217227\n赵金龙\t217228\n周海江\t217229\n混合泳\t217230\n怀德路\t217231\n军运\t217232\n分析法\t217233\ntgc\t217234\n六十万\t217235\n蛇口邮轮中心\t217236\n狭义相对论\t217237\n泰剧
-巴巴影院\t217238\n一架\t217239\nfalsh\t217240\n网络展\t217241\n乔思伯\t217242\n同舟共济\t217243\n流霞\t217244\n娘心\t217245\n辗压\t217246\n四十一\t217247\nbabyface\t217248\n巴郎仔\t217249\n小棚\t217250\n衡阳市公安局\t217251\nRotary\t217252\n冲程\t217253\n1982年\t217254\n同省\t217255\n恒源\t217256\n5.0.8\t217257\n不配\t217258\n美事\t217259\n彭厨\t217260\n中国中冶集团\t217261\n粉红猪\t217262\n征信网\t217263\n方苞\t217264\nkwargs\t217265\nRTR\t217266\n宁波火车站\t217267\n一月内\t217268\n防狼喷雾\t217269\n松露\t217270\n换毛\t217271\n牛头\t217272\nm1530\t217273\n天涯明月刀乐伶\t217274\nv-html\t217275\n框架集\t217276\n邛崃天台山\t217277\n小气鬼\t217278\n秽土转生\t217279\n贝加尔湖畔\t217280\n蓝墨\t217281\nHaru\t217282\nxywy\t217283\n仙乐斯广场\t217284\n子群\t217285\n云南民族村\t217286\nqled\t217287\n白敬亭\t217288\n光顾\t217289\n迷茫期\t217290\n给水箱\t217291\n理货\t217292\n八度电影院\t217293\n雄哥\t217294\n寒霜朋克\t217295\n福师\t217296\njava6\t217297\n干朋友\t217298\nvidoes\t217299\ntange\t217300\n800公里\t217301\n牛肉拉面\t217302\n被子\t217303\n万科里金域国际\t217304\n开口\t217305\n圆领衫\t217306\n降龙罗汉\t217307\nmxd\t217308\n360助手\t217309\n东拉西扯\t217310\n胃底静脉曲张\t217311\nIntellij\t217312\n20180310\t217313\n试行\t217314\n护院\t217315\n阿里巴\t217316\n砖桥\t217317\n保胎\t217318\n频文\t217319\n开发期\t217320\n找朋友\t217321\n落雷\t217322\n跨国\t217323\n惊魂记\t217324\nPUK码\t217325\n篆体字转换器\t217326\n银毫\t217327\nchomd\t217328\nUSART\t217329\nc文件\t217330\n自考在线\t217331\n驯龙\t217332\n地板砖\t217333\n亲妹\t217334\n长老环\t217335\n普通高中课程标准\t217336\n宝鸡南站\t217337\n试算平衡\t217338\n枢机\t217339\n南真菜果\t217340\n较高\t217341\n必听\t217342\n线刷机\t217343\n柳蒿芽\t217344\nhonda\t217345\n年轻的朋友来相会\t217346\n甘肃工商行政管理局\t217347\nDropbox\t217348\niVMS-4200\t217349\nEthan\t217350\n硬招\t217351\n万彩\t217352\n数据库原理\t217353\n金奇\t217354\n紫晶悦城\t217355\n解职\t217356\n巴布亚\t217357\n北京壹号\t217358\n青春派\t217359\nblacksunny\t217360\n满意\t217361\n不考\t217362\n仙某某\t217363\n夕暮\t217364\n连接套\t217365\n木窗\t217366\n移动磁盘\t217367\n万达广\t217368\n0907\t217369\n热血江湖2\t217370\nredirecting\t217371\n孔雀明王\t217372\n对话集\t217373\n伊藤园\t217374\n油浸式变压器\t217375\n诸光路\t217376\nFRB\t217377\n现金账\t217378\n酶标\t217379\n宁海县\t217380\nseverlet\t217381\n11.2.0.4.0\t217
382\n扼腕\t217383\n假面骑士fourze\t217384\nprocedural\t217385\n大侦探波罗\t217386\nWPS\t217387\n列治文\t217388\n孙科舞蹈工作室\t217389\n整理版\t217390\n美越\t217391\noysho\t217392\n中国自由贸易试验区\t217393\n副处长\t217394\n仪表阀\t217395\n胡某某\t217396\n蒲蒲兰绘本馆\t217397\n液压\t217398\n破阵子\t217399\nLeopard\t217400\n海洋赞礼号\t217401\n福茂\t217402\n亚琛工业大学\t217403\n蛋白电泳\t217404\n刘在\t217405\nRegent\t217406\n衣衫\t217407\n2016年3月2日\t217408\n苍之纪元吧\t217409\n维基教科书\t217410\n低热值\t217411\n中号\t217412\ntcode\t217413\n保护帽\t217414\n阳光工程\t217415\n爱尔\t217416\n垫条\t217417\n维管束\t217418\n观叶\t217419\nRSS源\t217420\n千足\t217421\ncamtasia9\t217422\n华强北街道\t217423\nbugfree\t217424\n通风井\t217425\n重文\t217426\n36分\t217427\n香蜜沉沉烬如霜\t217428\n刘以鬯\t217429\n信用贷款利率\t217430\n德夯苗寨\t217431\n三唑酮\t217432\ntuijian\t217433\n健康果\t217434\n开封县\t217435\n腾讯分分彩走势图\t217436\n普吉机场\t217437\n门庭若市\t217438\n白漆\t217439\n乐句\t217440\n太特\t217441\n消磁\t217442\n爆乳照\t217443\n老城区\t217444\n大临\t217445\n光彩夺目\t217446\n冠捷显示器\t217447\n4.63\t217448\n绿城桂花园\t217449\n免费播放器\t217450\n饥寒交迫\t217451\nX射线衍射\t217452\nIAR\t217453\n禅绕画\t217454\n鱼肝油\t217455\nactivesync\t217456\n网易战网\t217457\n洗牌期\t217458\n江山无限\t217459\nmt6750\t217460\n四证\t217461\n融诺网\t217462\n游戏型\t217463\n郑太顺\t217464\n年级\t217465\n宿迁湖滨新区\t217466\n第131期\t217467\ntkinter\t217468\n酷易搜\t217469\n稽核员\t217470\n种羊场\t217471\n遗志\t217472\n冷备\t217473\n军机\t217474\n写文\t217475\nsplits\t217476\n向太\t217477\n都市大亨物语\t217478\n击中\t217479\n学妹\t217480\n斑马gk888t\t217481\n鸟瞰图\t217482\n注册会计师\t217483\nSITC\t217484\n2.3.3\t217485\n成性\t217486\n中优\t217487\n分子思想汇报\t217488\n冯敬尧\t217489\n朱岩\t217490\n高山流水\t217491\n5秒后\t217492\n张美丽\t217493\n乞力马扎罗\t217494\n橙装\t217495\n张有\t217496\n九阴真经江湖\t217497\n寒冬\t217498\n吉利帝豪\t217499\n几场\t217500\nBending\t217501\n铁电\t217502\n克难\t217503\n花阳\t217504\n三孔\t217505\n基准收益率\t217506\n雪儿\t217507\n2305\t217508\nOrganization\t217509\n不散\t217510\n犊寨\t217511\n锅内\t217512\n20180111\t217513\n际通宝\t217514\n中粮集团有限公司\t217515\nlilitales\t217516\n注意力\t217517\n催款函\t217518\n绷缝\t217519\n欧特克\t217520\n现浇箱梁\t217521\n4周年\t217522\nphylogenetic\t217523\n槐角丸\t217524\n船长
\t217525\n残\t217526\n缙云县政府\t217527\nopinions\t217528\n好盈\t217529\n暗斗\t217530\nNewland\t217531\n吴宣仪\t217532\n泱泱\t217533\n青菜粥\t217534\n雄赳赳\t217535\n梦思\t217536\n圣才\t217537\nProtege\t217538\n蜜糖\t217539\n0115\t217540\n30多\t217541\nplayground\t217542\n绵阳市商业银行\t217543\n赤霞珠干红葡萄酒\t217544\n广州王老吉药业股份有限公司\t217545\n_牛摩网\t217546\nTwinkle\t217547\n全球购\t217548\n启悦罪恶之城\t217549\n步长制药\t217550\njzs\t217551\n歼8\t217552\n大连市人民政府国有资产监督管理委员会\t217553\n铁罗汉\t217554\nV1.0|\t217555\n永夜君王\t217556\n非人为\t217557\n崇贤街道\t217558\n出保\t217559\n抗日地雷战\t217560\nword7\t217561\n线切割机床\t217562\n360随身WIFI\t217563\n环大西洋\t217564\nComparisons\t217565\n王洪涛\t217566\n蓝影\t217567\n穆夏\t217568\n太行山\t217569\n第十五周\t217570\n吉林市人力资源和社会保障局\t217571\n11.2.15\t217572\n美国国家安全局\t217573\n石天\t217574\n3c\t217575\n摄于\t217576\nsirius\t217577\n原煤\t217578\n朱铁志\t217579\n轧差\t217580\n里头\t217581\n一类\t217582\nRuntimeException\t217583\n云南丽江\t217584\n味逍遥丸\t217585\nyiren\t217586\n人造纤维\t217587\n2014年1月1日\t217588\n铁力市\t217589\n含氟牙膏\t217590\nCrush\t217591\n3r\t217592\n大血藤\t217593\n飞泊通\t217594\n万东\t217595\n海菠萝网\t217596\n麒麟街道\t217597\n微晶板\t217598\n捕鱼达人2\t217599\n李顺圭\t217600\nundifined\t217601\n本语\t217602\n海南地区\t217603\n黑巧克力\t217604\n天津大学研究生院\t217605\nMethodology\t217606\n沸腾\t217607\nencino\t217608\n肉麻\t217609\n二十元\t217610\n烟火灾\t217611\n凌海市\t217612\n贵定县人民政府\t217613\n分条机\t217614\n巴洛特利\t217615\n娇喘\t217616\n艾米\t217617\n痰饮\t217618\n根力\t217619\n大服\t217620\n钻石块\t217621\n重庆大学\t217622\n德育之窗\t217623\n费尔南德斯\t217624\nenduro\t217625\n现代牧业\t217626\n南京大学》\t217627\n四川省通信管理局\t217628\n碳纤维\t217629\n弘阳集团\t217630\ngetBean\t217631\n行邮税\t217632\n陈学明\t217633\n拖布池\t217634\n铁钉\t217635\n慎之又慎\t217636\n丁璇\t217637\n500.19\t217638\n云龙示范区\t217639\n毛琳\t217640\n11.21\t217641\n衡率\t217642\nValencia\t217643\n丽江古城\t217644\n沙贝\t217645\n特勤机甲队\t217646\n云石\t217647\nTrip\t217648\n共渡\t217649\n佛母\t217650\n概括性\t217651\n招飞\t217652\nTheft\t217653\n斗蟹游戏网\t217654\n取色器\t217655\n去远方\t217656\n难以为继\t217657\n炒客\t217658\n僧帽\t217659\n吉林艺术学院\t217660\n梁耀安\t217661\nGrinding\t217662\n2018年10月1日\t217663\n油汀\t21
7664\n牙齿大街\t217665\n合宝\t217666\nckplayer\t217667\n腹泻\t217668\nifile\t217669\n360天\t217670\n2K分辨率\t217671\n@Value\t217672\n歼-10C\t217673\n粘箱机\t217674\n配位化合物\t217675\n武夷山东站\t217676\n斯柯达Yeti\t217677\n必达\t217678\n820\t217679\n宣美\t217680\n信口开河\t217681\n钻石型\t217682\n排尾\t217683\ntalbe\t217684\n股味\t217685\n调合\t217686\n茶杯\t217687\n广东机电职业技术学院\t217688\n脚架\t217689\n李一\t217690\n奥特佳\t217691\n电动平衡车\t217692\n九十九次\t217693\n太原市委\t217694\n交大一附院\t217695\n泰州\t217696\n江杨北路\t217697\n炼就\t217698\n第5周\t217699\n君主立宪制\t217700\n王派\t217701\n38家\t217702\nCCTV证券\t217703\n纳帕溪谷\t217704\n鲁一鲁\t217705\n死钱\t217706\n三届\t217707\nfreelancer\t217708\n35集\t217709\n微信企业版\t217710\n82路\t217711\n还是否\t217712\nbey\t217713\nSQLSERVER数据库\t217714\nEGFR\t217715\nrxtx\t217716\nhoneywell\t217717\n自驾到\t217718\nOriginals\t217719\n焊接机\t217720\n西安市儿童医院\t217721\n卡扎库斯\t217722\n超过10天\t217723\n如花美眷\t217724\n蒙特拉\t217725\nmounted\t217726\n田园乐\t217727\n友哈巴赫\t217728\n致以\t217729\n一部番\t217730\n786\t217731\n科多兽\t217732\n筒仓\t217733\n04月17日\t217734\n断组\t217735\n竹泉村\t217736\n神魄\t217737\n呼和浩特铁路局\t217738\n血瘀型\t217739\nImageView\t217740\n肥妈\t217741\n华夫饼机\t217742\n宋慈\t217743\n文化传媒有限公司\t217744\n一键装机\t217745\ninterpret\t217746\n802D\t217747\n34套\t217748\n叶正\t217749\n创世中文\t217750\n华为荣耀畅玩7X\t217751\n中关村南大街\t217752\n太乙路\t217753\n楼台\t217754\n1.5公斤\t217755\n国家图书馆出版社\t217756\nzjk\t217757\n田鼠\t217758\nWx\t217759\n74届\t217760\n医学家\t217761\n小米版\t217762\nJenni\t217763\n绿皮\t217764\n捞月\t217765\n冲超\t217766\n庐山路\t217767\n秘卷\t217768\n王京花\t217769\n第一式\t217770\nco2\t217771\nMeanings\t217772\n2128\t217773\nA1700\t217774\n东方爱婴\t217775\n五日内\t217776\n口肌\t217777\n大虾\t217778\n东栅\t217779\n大西\t217780\n1084\t217781\n奥克斯中央空调\t217782\n小白杏\t217783\n日新亭\t217784\n二十五岁\t217785\n董海川\t217786\n人车分流\t217787\n阿拉斯托\t217788\n酮康唑乳膏\t217789\n华子\t217790\n20万元\t217791\n剃头\t217792\nbetter-scroll\t217793\n弹簧支吊架\t217794\n公英制\t217795\n最富有\t217796\n新疆自治区党委\t217797\n尼康d3400\t217798\neffects\t217799\nFourth\t217800\n刘梅\t217801\nemoney\t217802\nprefixer\t217803\n冬病\t217804\n徕卡\t217805\n百服宁\t21
7806\n福利吧\t217807\n小龟\t217808\n顾问\t217809\n原皮\t217810\n险情\t217811\n中大附中\t217812\n牧童遥指杏花村\t217813\n传媒大学\t217814\nlong型\t217815\n寻仙\t217816\n内篇\t217817\n这么急\t217818\n买买买\t217819\n大航海家4\t217820\n4399H5游戏-h.4399.com\t217821\n商务范\t217822\n田子\t217823\n梁生宝\t217824\nshishenm\t217825\n冈本伦\t217826\nMLC\t217827\nwoaidu\t217828\nfatego\t217829\n横切机\t217830\nkrups\t217831\n文脉\t217832\n神农架林区\t217833\n河北省委党校\t217834\n合击版\t217835\n电磁波\t217836\n剑三丐帮\t217837\n惠州卫生职业技术学院\t217838\n知画\t217839\n亚目\t217840\n豆蛙\t217841\n卡罗拉论\t217842\n黑葡萄\t217843\nSRP\t217844\n佰仟\t217845\n2ds\t217846\n堡镇\t217847\n村田制作所\t217848\n狼神\t217849\n茅屋为秋风所破歌\t217850\nvaccine\t217851\n_饭\t217852\n贝爷\t217853\ngliffy\t217854\n12元\t217855\n25M\t217856\n稀有端\t217857\n入殓\t217858\n拦门\t217859\nmolex连接器\t217860\nMedian\t217861\n熊叔\t217862\n行中\t217863\n华夏出版社\t217864\n橡木板\t217865\n晋书\t217866\ntasker\t217867\n上古卷轴5:天际重制版+\t217868\n黄陂南路\t217869\nEX\t217870\n诺远\t217871\n北京首都国际机场股份有限公司\t217872\nЖ\t217873\n事业单位差旅费管理办法\t217874\n人才类\t217875\ndisks\t217876\n千纤草\t217877\nillustrator\t217878\n统保\t217879\n译汉\t217880\n残次\t217881\n柴门\t217882\n乌篷船\t217883\nPADS\t217884\n梅花镇\t217885\n伤及\t217886\n亲本\t217887\nFont\t217888\n怒江傈僳族自治州\t217889\nPLEX\t217890\n棠真\t217891\n叶青\t217892\n南院\t217893\n重定向后\t217894\n威廉玛丽学院\t217895\n20161230\t217896\nFate-itmo\t217897\n派森\t217898\n3篇\t217899\n示弱\t217900\nFMVP\t217901\n海联金汇\t217902\nc++2015\t217903\ncashmere\t217904\n重庆市轨道交通(集团)有限公司\t217905\nPp\t217906\n祸国殃民\t217907\n地图_火车网\t217908\n马航\t217909\n大广赛\t217910\n假树\t217911\n3.63\t217912\n张维维\t217913\n奇丑\t217914\n摩登天空\t217915\n头尾\t217916\n第51集\t217917\nCCU\t217918\n侧吸油烟机\t217919\n商务车\t217920\nselfish\t217921\n甜瓜\t217922\n骑楼\t217923\n小桔\t217924\n奥维\t217925\n西吡氯铵\t217926\nymca\t217927\n梦想的声音\t217928\n12篇\t217929\n私募可交换债\t217930\n二十道\t217931\ntiexue\t217932\n试剂瓶\t217933\n思多金\t217934\n就读懂\t217935\ntdx\t217936\n酵母粉\t217937\n可视性\t217938\n粉丝\t217939\nGPUImage\t217940\n填充\t217941\n二叠纪\t217942\n桃花潭水深千尺\t217943\n西子湖\t217944\n刘丹青\t217945\n91名\t217946\nsalesman\t217947\n身后事\t21
7948\n血样\t217949\n雪之花\t217950\n2902\t217951\n43寸\t217952\n年保玉\t217953\n太阳山\t217954\n北京复兴医院\t217955\n何彬\t217956\n抚琴\t217957\n譬喻\t217958\n内蒙古联通\t217959\n量子比特\t217960\nmonash\t217961\n兄弟\t217962\nNARS\t217963\nLoft\t217964\n实现代\t217965\nAllow\t217966\n见贤思齐焉\t217967\n142857\t217968\n朝闻天下\t217969\nboss战\t217970\n10座\t217971\n老顾\t217972\nEpisode\t217973\nGuangxi\t217974\nmd-*\t217975\n体育教学\t217976\n702\t217977\n并轨\t217978\n光荣绽放\t217979\n根河\t217980\nxmm\t217981\n美多吉\t217982\nTwilio\t217983\nip7p\t217984\n大世界\t217985\n刘虹\t217986\n搜狐科普\t217987\n黑水\t217988\n5月初\t217989\nhighlight\t217990\n亿乐社区\t217991\n20150322\t217992\n新快报\t217993\n魂牵梦萦\t217994\nCoatings\t217995\n收购方\t217996\n二婚\t217997\nomitting\t217998\n中国管理科学研究院\t217999\n土豆粉条\t218000\nBrea\t218001\n四限\t218002\n曾子曰\t218003\n水皮\t218004\n郑文亮\t218005\n假面骑士BUILD\t218006\nTits\t218007\n宁波市住房公积金管理中心\t218008\n第三方源\t218009\n似锦\t218010\n罗爷\t218011\n小蜜蜂\t218012\n流涕\t218013\n卡尔维诺\t218014\n陕鼓\t218015\n东安路\t218016\n银魔\t218017\n世园\t218018\n台电科技\t218019\n骤升\t218020\n金黄色葡萄球菌\t218021\n吉利服\t218022\n18班\t218023\n东湖\t218024\n君乐宝\t218025\n玉林路\t218026\nSP1简体中文版\t218027\n热稳定剂\t218028\n饭票\t218029\ncui\t218030\n路法\t218031\n77亿\t218032\n超清画质\t218033\nTBBP\t218034\n195万\t218035\n黑漆漆\t218036\n夏木\t218037\n赵子龙\t218038\n张良计\t218039\n浜崎真緒\t218040\ncmder\t218041\n操逼\t218042\n旅游团\t218043\n2.23\t218044\n存款案\t218045\n北京鼎石国际学校\t218046\n深圳市二医院\t218047\n2017年12月24日\t218048\n916\t218049\nQuill\t218050\nRSS阅读器\t218051\n南邵\t218052\n忧愁\t218053\n狂客\t218054\n2015.10\t218055\nspx\t218056\n齐民\t218057\nDNAMAN\t218058\n切点\t218059\n青岛市政府\t218060\n黄药师\t218061\n黑角\t218062\n动态规划法\t218063\n加列\t218064\n杨浦小学\t218065\n花痴\t218066\n比隆\t218067\n五论\t218068\n前掌\t218069\n彩虹\t218070\n逍遥侯\t218071\n国信金太阳\t218072\n牛头怪\t218073\n航空延误险\t218074\nrequestparam\t218075\n小远\t218076\n7501\t218077\n华医通\t218078\n嘉汇\t218079\n召唤器\t218080\n鲶鱼\t218081\n殷切\t218082\n伊宁市\t218083\n图画\t218084\n叩齿\t218085\n墙头草\t218086\n1989年\t218087\n页\t218088\n2014年4月份\t218089\n前端客\t218090\n赵烨\t218091\nLongShaoAn\t218092\nso
e\t218093\n冷干机\t218094\n噢噢\t218095\n东方电气集团\t218096\nLions\t218097\n重庆巴蜀中学\t218098\nQuiz\t218099\n物理\t218100\nimageJ\t218101\n200l\t218102\nBumper\t218103\ndolly\t218104\nVibrator\t218105\n东海网\t218106\n充电池\t218107\n杭州西奥电梯有限公司\t218108\n老花镜\t218109\n触发率\t218110\n上海都市旅游卡\t218111\n冰冰\t218112\n独唱版\t218113\n戴着\t218114\n科研型\t218115\nwdr6300\t218116\nYOKA时尚网\t218117\n荞麦面\t218118\n博后\t218119\nLETV\t218120\n厂工\t218121\n高考季\t218122\n两江四岸\t218123\n中博教育\t218124\n腾讯助手\t218125\n〈\t218126\nRTT\t218127\nu+\t218128\n金钱观\t218129\n市中医医院\t218130\n学士帽\t218131\n铝铁\t218132\n真善\t218133\n风塔\t218134\nhsfzxjy\t218135\n智睿\t218136\n桂人社\t218137\n车载DVD导航改装网\t218138\n催情药\t218139\n芦台\t218140\n细辛\t218141\n董酒\t218142\n瑞思学科英语\t218143\n磨牙棒\t218144\n1344\t218145\n手下\t218146\ndaughter\t218147\n昆马\t218148\n北上广深\t218149\n光纤耦合器\t218150\n狮子合唱团\t218151\n硅藻泥\t218152\n调度会\t218153\nSharedPreferences\t218154\n三发\t218155\n保利集团\t218156\n熔火之心\t218157\nH4\t218158\nQueensland\t218159\n黄国\t218160\n人力资源和社会保障部\t218161\nartDialog\t218162\nhuakai\t218163\nBreakers\t218164\n白糖糕\t218165\n骰宝\t218166\n遛遛\t218167\n主产\t218168\n斩钉截铁\t218169\ncough\t218170\n刘小寐\t218171\ne60\t218172\n标尺\t218173\n湖泊\t218174\n举杯\t218175\n光学类\t218176\n丁香花园\t218177\n抉择\t218178\n长相\t218179\n大牢\t218180\n跳转页\t218181\n工厂管理\t218182\n山东国税\t218183\n与人为善\t218184\n刘军宁\t218185\nqihuo\t218186\n军乐节\t218187\n米菲\t218188\n李文荣\t218189\n水稳拌合站\t218190\n篠田步美\t218191\n伍斯特\t218192\n畜牧人才网\t218193\n薛华\t218194\nViewSonic\t218195\nKilaKila\t218196\nCC3D\t218197\n三河\t218198\n清雅\t218199\n生僻\t218200\n美莱\t218201\n藐视\t218202\n午评\t218203\n花彩\t218204\n陌上人如玉,公子世无双\t218205\n慕名而来\t218206\n懵懂\t218207\n石英晶体\t218208\n9天8晚\t218209\n15针\t218210\n十载\t218211\n8月31日\t218212\n名星\t218213\n海宁皮城\t218214\n赢稷\t218215\nPIXHAWK\t218216\nwmi\t218217\n泾\t218218\n艾克\t218219\n清册\t218220\nCBN\t218221\n唇毛\t218222\nDocumentation\t218223\n贪污\t218224\n赶场\t218225\n水电厂\t218226\nBILL\t218227\n撤展\t218228\n福建卫生职业技术学院\t218229\n针线包\t218230\n44路\t218231\n二款\t218232\n∨\t218233\n鉴\t218234\n香港政府\t218235\n浙江体育职业技术学院\t218236\
n古巴\t218237\nthrown\t218238\n丁磊春娇与志明\t218239\n8uftp\t218240\n重组股\t218241\nDucky\t218242\n关礼杰\t218243\nender\t218244\nH色\t218245\n见实效\t218246\n青政\t218247\n哈曼卡顿琉璃\t218248\n最美的时光\t218249\n印力\t218250\n7月29日\t218251\n奥莉安娜\t218252\n品控部\t218253\n中国投资资讯网交易在线\t218254\nbaidu.com/s/1jGJvqJk\t218255\nColored\t218256\n读写法\t218257\n行业分析师\t218258\n河北省\t218259\ntwig\t218260\n格林童话\t218261\niPhone中文网\t218262\nnewifi3\t218263\n君约\t218264\n精实\t218265\n8000W\t218266\n副董事长\t218267\n福建省经信委\t218268\n相关子\t218269\npst\t218270\n复卦\t218271\ndy9776\t218272\n经九路\t218273\n绿箭侠吧\t218274\n荣耀畅玩7c\t218275\n育苗班\t218276\n在线翻译_翻译在线__英语翻译\t218277\n隔腔\t218278\n傅艺伟\t218279\n$50\t218280\n几月\t218281\ncocos2dx\t218282\n书卡\t218283\n武经\t218284\n正荣财富中心\t218285\n03708\t218286\ncarlife\t218287\n排叉\t218288\n上海师范大学研究生院\t218289\n逗号分隔符\t218290\nbold\t218291\n【RT4\t218292\n钻攻\t218293\n五芒星\t218294\n碳碳\t218295\n养生方\t218296\n兰博\t218297\n智城\t218298\n日海通讯\t218299\n托里伯奇\t218300\n碳酸水\t218301\n成都地铁5号线\t218302\n主创\t218303\n李东旭\t218304\n馆长\t218305\nLongman\t218306\n背书人\t218307\n三帮\t218308\nOncology\t218309\n万能显卡驱动\t218310\nNose\t218311\n七八个小时\t218312\nE网\t218313\nutexas\t218314\n饭儿\t218315\n大理寺\t218316\n联想Y470\t218317\n丛铭君\t218318\n存款\t218319\nip138查询网\t218320\n50px\t218321\n人工浮岛\t218322\n立言\t218323\n1500g\t218324\n环保币\t218325\n眼镜娘\t218326\n雨恋\t218327\n龙珠超宇宙2吧\t218328\n被嫌弃的松子的一生\t218329\n明敏\t218330\n巴雷西\t218331\n圣淘沙岛\t218332\n斋月\t218333\n广州市番禺区教育局\t218334\n国家4A级旅游景区\t218335\n活生生\t218336\n鞋边\t218337\n笋壳\t218338\n车皮\t218339\n5.5.56\t218340\ncrack\t218341\n制服下的名器\t218342\n建宁路\t218343\n钦天监\t218344\n透薄\t218345\naccredited\t218346\n钱王\t218347\n邢台银行\t218348\nrecompile\t218349\n西栅\t218350\n普力\t218351\nbests\t218352\n项俊波\t218353\n广西壮族自治区环境保护厅\t218354\n旧罪\t218355\n唐明\t218356\n36槽\t218357\n编订\t218358\n上海外高桥保税区\t218359\nGis\t218360\n三星a9100\t218361\n衡水老白干\t218362\n脐带血干细胞\t218363\n追凶\t218364\n营造法式\t218365\n二型\t218366\n子类\t218367\n省县\t218368\n神经痛\t218369\n整理\t218370\n藻酸盐\t218371\n网络\t218372\n三彩\t218373\n骑马与砍杀风云三国\t218374\n别扭\t218375\n容城县\t2183
76\n战车\t218377\n中国联航\t218378\n上海古籍出版社\t218379\nmafu\t218380\n梅瓶\t218381\nvjshi\t218382\niso22000\t218383\n风味小吃\t218384\n启扬\t218385\nFilm\t218386\niOS9.2\t218387\n数学界\t218388\n王淼\t218389\n轻变传奇\t218390\n马群\t218391\nLeaders\t218392\n透露\t218393\n飘浮\t218394\nnotepad++\t218395\n失神\t218396\n汽油发动机\t218397\nzao\t218398\n湖北民族学院\t218399\n二杠\t218400\n睦\t218401\n防御工事\t218402\n京东众筹\t218403\n肖特\t218404\n密春雷\t218405\n85名\t218406\n生管\t218407\n三秦\t218408\n颜小健\t218409\n正太\t218410\n榜书\t218411\nSS喰种\t218412\n辽宋\t218413\n不做饭\t218414\n普货\t218415\n第71关\t218416\n魔法坏女巫\t218417\nPosting\t218418\n正泰电气股份有限公司\t218419\n腊子口\t218420\n马卡洛夫\t218421\n林良铭\t218422\n看似\t218423\n可增\t218424\nORIGINS\t218425\n6关\t218426\nhcg\t218427\n全民英杰传\t218428\n挤公交\t218429\n维客\t218430\n蒙特卡罗算法\t218431\n做马\t218432\n自动变速器\t218433\n供电站\t218434\njava_帮酷\t218435\n冲击式\t218436\n尊尚\t218437\n消解\t218438\n仁智股份\t218439\n华为Mate10Pro\t218440\n宫炮\t218441\n娃哈哈\t218442\n芜湖市弋江区人民政府\t218443\n多叉树\t218444\n昭和时代\t218445\nHEAD\t218446\n北京交通大学威海校区\t218447\n绿城玉兰花园\t218448\nJAMES\t218449\n女装子\t218450\n林君\t218451\n大海贼冒险岛\t218452\n电子管\t218453\n仁者乐山\t218454\n优亲厚友\t218455\n酷派大神F2\t218456\nlaptop\t218457\n透明片\t218458\n国车\t218459\n北京市发展和改革委员会\t218460\n光音\t218461\n新渠道\t218462\n张山营镇\t218463\n毛瑟\t218464\n正念\t218465\n鲜奶蛋糕\t218466\n伊贝诗\t218467\n银河娱乐\t218468\n广东国晖律师事务所\t218469\nActivator\t218470\n80072ee2\t218471\n鼻部\t218472\n逍遥谷\t218473\n三氯\t218474\n桐乡市政府\t218475\n正知\t218476\n平安公司\t218477\n平片\t218478\n较晚\t218479\n艳遇\t218480\n老大娘\t218481\n000061\t218482\n汉字树\t218483\n华康字体\t218484\nbala\t218485\n优我女团\t218486\n北京农家院\t218487\n胡桃雏\t218488\n摸鱼儿\t218489\nphotoshopcs\t218490\nE4A\t218491\negrep\t218492\nCommander\t218493\n20151223\t218494\nbodies\t218495\n长安cs35\t218496\n上海市第五人民医院\t218497\n逍遥子\t218498\n华意\t218499\ncamds\t218500\n杠精\t218501\n影视城\t218502\nFavorites\t218503\n班戟\t218504\nSaying\t218505\n减阻\t218506\n近\t218507\n价格体系\t218508\n滋贺\t218509\n正激式\t218510\n普拉特\t218511\n癌性\t218512\n浙江爱信诺航天信息有限公司\t218513\n夺命邮差2\t218514\n亦即\t218515\n生色\t218516\n直送\t218517\n长和\t218
518\n有源监听音箱\t218519\n叶开\t218520\n千方\t218521\n二十几万\t218522\n减色\t218523\n下关街道\t218524\n⒏\t218525\n第三本\t218526\n13本\t218527\n经济补偿金\t218528\n蟹笼\t218529\n过港\t218530\n369号\t218531\n点量\t218532\nchinaren\t218533\n黄雀在后\t218534\n西热力江\t218535\n夜&枫\t218536\n存货成本\t218537\n慈恩寺\t218538\n美人蕉\t218539\n风筝轮\t218540\n著作人\t218541\n季泉\t218542\n雪漠\t218543\n安平镇\t218544\n北沙河\t218545\n上海期货交易所\t218546\n元诗\t218547\n加热棒\t218548\ngoblin\t218549\n病症\t218550\n黄公望\t218551\n莎啦啦\t218552\n500吨级\t218553\n百万款\t218554\n姜平\t218555\n再生成\t218556\n望夫成龙\t218557\n封\t218558\n益西\t218559\n北京中医药大学东直门医院\t218560\nC93\t218561\n王佐\t218562\n怀化\t218563\n北京电影节\t218564\n拙火\t218565\nSqoop\t218566\n杨达才\t218567\n瓯北镇\t218568\n1443\t218569\n上页\t218570\n学生案\t218571\nRemedy\t218572\n商贸通\t218573\n电视剧\t218574\ncjpl\t218575\n得得撸\t218576\n团操\t218577\n罗睺\t218578\n根生\t218579\nluyou\t218580\nssni\t218581\n王陆\t218582\n莫泰168\t218583\n返照\t218584\nw2\t218585\n彩涂卷\t218586\nspecifies\t218587\n拾荒\t218588\n史迪仔\t218589\n大恒\t218590\n舞台中央\t218591\n低维\t218592\n华润中心\t218593\n学而思高中\t218594\n花冠\t218595\n失落的星球2\t218596\n52share\t218597\n指偶\t218598\n典范英语\t218599\n人才大厦\t218600\n多摩豪\t218601\n选优\t218602\n海南师范大学\t218603\n水煤浆\t218604\n山浩\t218605\n郑俊朗\t218606\n冰公大魏\t218607\n煤企\t218608\n花槽\t218609\n法兰泰克\t218610\n关于建国以来党的若干历史问题的决议\t218611\n晶彩\t218612\n舞蹈教学\t218613\nm2000\t218614\nscws\t218615\n第一册\t218616\n莫妮卡贝鲁奇\t218617\n泠鸢yousa\t218618\n2904\t218619\n南京市第二医院\t218620\n凯胜\t218621\n书台\t218622\n一整年\t218623\npxc550\t218624\n1A\t218625\n梗死\t218626\nnote3双网通\t218627\n水门事件\t218628\n鸽蛋\t218629\n小亮仔\t218630\n金属大师\t218631\n挨拶\t218632\n别康桥\t218633\n毛油\t218634\n蔡康\t218635\n振子\t218636\n问渠\t218637\n拜腾\t218638\n帕金森综合症\t218639\n徽采商城\t218640\n缨\t218641\n遮羞\t218642\ninference\t218643\nPrimitive\t218644\nluna\t218645\n李准\t218646\n副关长\t218647\n阴身\t218648\n同仁双宝\t218649\nHoh\t218650\n罐\t218651\n御风童子\t218652\ndeveloping\t218653\n23228561\t218654\n沂源县人民政府\t218655\nFX-991CN\t218656\njQuery、layer\t218657\n阿伏加德罗\t218658\n被套\t218659\n斯\t218660\n3万名\t218661\nWebView\t218662\n反弹\t218663\n
wind数据库\t218664\n五色\t218665\n个案\t218666\n螺旋藻片\t218667\n我的世界神秘时代4\t218668\n八桥\t218669\n省级\t218670\n长期规划\t218671\nInvestigations\t218672\n大丽\t218673\nztc\t218674\ndcf\t218675\n老辣\t218676\n韩先楚\t218677\nliness\t218678\nシンデレラガ\t218679\ninitial-scale\t218680\n公盂\t218681\n薛璐\t218682\n海南省琼海市政府\t218683\n2017年6月16日\t218684\n最快的速度\t218685\nAntiques\t218686\n忍之国\t218687\n蓝魅\t218688\ncommute\t218689\n黄者\t218690\n呼和浩特回民区\t218691\n南方科技\t218692\n情感日记\t218693\n表项\t218694\npearl007\t218695\nhaul\t218696\nepo\t218697\n马不停蹄\t218698\n东片\t218699\n资中县人民政府\t218700\n城主\t218701\n智游宝\t218702\n第100期\t218703\n16分\t218704\n上海搜狐\t218705\n北京市机构编制委员会办公室\t218706\n阳信县\t218707\n想不开\t218708\nT12\t218709\n六足机器人\t218710\nSccc\t218711\n家地\t218712\n喔\t218713\n云烟\t218714\nlaughs\t218715\nlols6\t218716\n老头乐\t218717\nPREMIERE\t218718\n杰理\t218719\n桧木\t218720\n36年\t218721\n上日\t218722\n内部控制制度\t218723\n冗\t218724\n螺旋桨\t218725\n威汉\t218726\n红带\t218727\n五十周年\t218728\n100c\t218729\n支纱\t218730\nMadonna\t218731\nust\t218732\nCrunch\t218733\nwb\t218734\n性恶\t218735\n应用物理学\t218736\nQuantum\t218737\n小男友\t218738\n手足无措\t218739\n罗秀路\t218740\n中国葛洲坝地产\t218741\n临江新区\t218742\n取不到\t218743\n贵州省食品药品监督管理局\t218744\n6120\t218745\n许继集团有限公司\t218746\n四千亿\t218747\n神州长城\t218748\nentry\t218749\n夏令\t218750\n公务员编制\t218751\n鲁鲁修\t218752\n03届\t218753\n中厚\t218754\n57条\t218755\nnetworks\t218756\n重刑\t218757\n泥头车\t218758\n中国海关\t218759\n泪雨\t218760\n赵威\t218761\n14话\t218762\n牛庄\t218763\n高评\t218764\nAUI\t218765\nwfp\t218766\n海南测绘地理信息局\t218767\n中华石龙子\t218768\nfumefx\t218769\n太阳鸟\t218770\n12月18日\t218771\n枙子花\t218772\n朝鲜艺术团\t218773\n3月6日\t218774\n上海肿瘤医院\t218775\n腾讯充值中心\t218776\n8.5.2\t218777\n宗萨仁波切\t218778\n新规\t218779\nheiping\t218780\n报税盘\t218781\n技术\t218782\n泰瑞达\t218783\n鸢飞路\t218784\n球室\t218785\n热炉\t218786\n一号双\t218787\n沃森生物\t218788\n维修单\t218789\n财务科\t218790\npiperck\t218791\n9.9包\t218792\n锦荟\t218793\n葵儿\t218794\n谭杰\t218795\nwarface\t218796\nyue\t218797\n二十二章\t218798\nbaked\t218799\nISO标准\t218800\nxfyn\t218801\n蚂蚁篇\t218802\n衣壳\t218803\nRK3188\t218804\nCoef
ficient\t218805\nFamily\t218806\n地堡\t218807\n学车\t218808\n牟丛\t218809\n江苏省林业局\t218810\n陈醋\t218811\nthough\t218812\n雷克萨斯es350\t218813\nf373\t218814\nSW2017\t218815\n明天早上\t218816\n白玉菩提\t218817\nLOGISTICS\t218818\nprefect\t218819\n3个\t218820\n海南省国家税务局\t218821\n装运港\t218822\n2017至2018年\t218823\n黑颈鹤\t218824\n勃艮\t218825\n没感冒\t218826\nrihanna\t218827\n黑马股\t218828\n85后\t218829\n广西师范\t218830\n邱晨\t218831\n培土\t218832\n球鞋\t218833\n导热膏\t218834\n家长群\t218835\n沿河县\t218836\n欣杨Kitty\t218837\n嫌隙\t218838\n黄婉秋\t218839\n加勒比海盗3\t218840\nBJ\t218841\n基准\t218842\n中铁十八局\t218843\n胀感\t218844\nsmap\t218845\n面巾\t218846\n个人独资\t218847\n怪状\t218848\n码栈\t218849\n职往\t218850\n骄傲的少年\t218851\n广西警察学院\t218852\n试读\t218853\n福安一中\t218854\n长粒\t218855\n华昌\t218856\nfreeradius\t218857\n传家\t218858\n宏济堂\t218859\nXshell\t218860\n水果刀\t218861\n基础卷\t218862\n大溪镇\t218863\n灵雀云\t218864\n地铁房\t218865\n荷城\t218866\n公文集\t218867\nu2414h\t218868\n老白干\t218869\n邪医\t218870\n生物质燃料\t218871\n高照\t218872\n94个\t218873\n三松\t218874\n艺录\t218875\nscop\t218876\n电解池\t218877\n翻来覆去\t218878\n自鸣\t218879\n殡\t218880\n江苏省美术馆\t218881\n市直机关工委\t218882\n基督时报\t218883\n睾丸鞘膜积液\t218884\n榆次吧\t218885\n看空\t218886\n工作报\t218887\n一妻多夫\t218888\nv370\t218889\n叶瑷菱\t218890\nppt字体\t218891\n平安夜\t218892\n马赛克\t218893\n十字弩\t218894\n优雅型\t218895\nhelan\t218896\npopupwindow\t218897\n烟台市财政局\t218898\n菜饼\t218899\n川原砾\t218900\n教育学心理学\t218901\nHpoi\t218902\n4月23\t218903\n50K\t218904\n万能注册机\t218905\n灯饰\t218906\n江苏国税\t218907\n市政路\t218908\n当存\t218909\n镇坪路\t218910\n月光石\t218911\nLinux/Windows\t218912\n97届\t218913\n福特汽车\t218914\n高抬贵手\t218915\n炮王\t218916\n闹翻天\t218917\n北疆饭店\t218918\n独眼巨人\t218919\n8900万\t218920\n东方海洋\t218921\n杨正\t218922\n建桥学院\t218923\n房贷\t218924\n安徽建工集团\t218925\n畏难\t218926\nJEET\t218927\n瞭\t218928\n程野\t218929\n边防\t218930\n酷动\t218931\n裤女\t218932\n鬼脸\t218933\nquweiji\t218934\n四好\t218935\nokey\t218936\n神教\t218937\n酒人\t218938\n路飞\t218939\n100题\t218940\n壶\t218941\n值班\t218942\nLiv\t218943\n天时地利人和\t218944\n智慧城市\t218945\n痰\t218946\nAdded\t218947\n词包\t218948\nEchart\t218949\n优酷腾讯\t21895
0\n玛德\t218951\nsmooth\t218952\n微软\t218953\n残暴\t218954\n建材类\t218955\n副镇长\t218956\n曼曼\t218957\n2.3GB\t218958\nlpar\t218959\n吴庆\t218960\n减数\t218961\n基本形\t218962\n帕丁顿\t218963\n白庙\t218964\n老爸老妈\t218965\n寡淡\t218966\nbur\t218967\n35千伏\t218968\n凤凰县\t218969\n屌\t218970\n刮掉\t218971\n蛙类\t218972\nmanufacturer\t218973\n伤痛\t218974\n秦律\t218975\n用友云\t218976\n黄仁勋\t218977\ntolove\t218978\nUTF\t218979\nlcamry\t218980\n好多好\t218981\n稼轩\t218982\n夜查\t218983\n招商银行深圳分行\t218984\nserious\t218985\n江北新区管委会\t218986\n血汗泪\t218987\nv1.8\t218988\n南玻A\t218989\n成本低\t218990\n达峰\t218991\n14部\t218992\n白领丽人\t218993\neon\t218994\n189号\t218995\n第几套\t218996\nProm\t218997\n曼奇\t218998\noversized\t218999\n改性聚丙烯\t219000\n波哥\t219001\n看不看\t219002\nSolarbio\t219003\n120急救\t219004\n回到家\t219005\n湖南在线\t219006\n免費漫畫區\t219007\n宠爱自己\t219008\n铜器\t219009\n单相电\t219010\n三从四德\t219011\n制片方\t219012\n气冲\t219013\n罚金刑\t219014\n酸奶蛋糕\t219015\n宿醉\t219016\n46项\t219017\nnvm\t219018\n老伴\t219019\n绯世\t219020\n范文稿\t219021\n拳皇\t219022\nRPLIDAR\t219023\n三百颗\t219024\n洗碗工\t219025\n保暖\t219026\npes6\t219027\n癫狂\t219028\n金政\t219029\nfaster-rcnn\t219030\n岳口\t219031\n石碶\t219032\naria2gui\t219033\nx3.1\t219034\n花酒\t219035\n冷峰\t219036\n可比克薯片\t219037\n360软件小助手\t219038\n男长\t219039\n国务卿\t219040\n_苹\t219041\n极化率\t219042\nharper\t219043\n爱色\t219044\nmkimage\t219045\n美姿\t219046\n苦\t219047\n剑网3吧\t219048\n小奶猫\t219049\n阳澄\t219050\n南京致远外国语小学\t219051\n玖壹壹\t219052\n4.1日\t219053\nBalmain\t219054\nr710\t219055\n有利条件\t219056\n121号\t219057\n天茂湖\t219058\n张亚东\t219059\n13话\t219060\n侦探社\t219061\nBoss直聘\t219062\n一半儿\t219063\n好去处\t219064\n青岛院区\t219065\n后位灯\t219066\n百分之30\t219067\n软基处理\t219068\nShadowSocks\t219069\n蚯蚓\t219070\nFapTV\t219071\nddms\t219072\n丽江大理\t219073\n狩人\t219074\n米诺地尔酊\t219075\n那非\t219076\n潘多夫\t219077\n促落实\t219078\n健身餐\t219079\n职业形象\t219080\ntampermonkey\t219081\nuptake\t219082\n碰碰车\t219083\nzhongziso\t219084\n热敏电子面单打印机\t219085\n20140430\t219086\n黑卡RX100\t219087\n中国国家地理网\t219088\n花与蛇\t219089\npagecontrol\t219090\n藐\t219091\nxtion\t219092\n周扬青\t219093\n崔克\t
219094\nFACEBOOK\t219095\n网易河北\t219096\n高雅拉\t219097\n宏基因组测序\t219098\n海陆丰\t219099\n薄弱\t219100\n施迈茨\t219101\n基地校\t219102\n洪武\t219103\nD3V\t219104\n粉妆\t219105\n神医毒妃\t219106\n辽宁省图书馆\t219107\n山岛\t219108\n哈哈IT网\t219109\n时产\t219110\n周华健\t219111\n周舟\t219112\n鬼佬\t219113\n临汾市\t219114\n风土人情\t219115\n25度\t219116\nBINGO\t219117\n眼观\t219118\n益智区\t219119\n邹琼俊\t219120\nMatlab\t219121\nVII\t219122\n撒放\t219123\n选择值\t219124\n14堂\t219125\n琼台仙谷\t219126\nLuminous\t219127\n娄星区\t219128\n265G生死狙击\t219129\n中国国家地理\t219130\nGIGABYTE\t219131\n浙江代表团\t219132\n3.6.6\t219133\n选讲\t219134\n科学学\t219135\n丢人现眼\t219136\n有甚于\t219137\nWharf\t219138\n眼神经\t219139\nKeyCode\t219140\n铁奥\t219141\n55年\t219142\n天峰\t219143\nwndr4300\t219144\n半路出家\t219145\n恶兽\t219146\n技术岗\t219147\n机动车变更登记\t219148\n九龙路\t219149\n主义\t219150\nPIG\t219151\n655\t219152\n国家命运\t219153\n宁海路\t219154\n4公里\t219155\n汪睿\t219156\n黄麻\t219157\n咏\t219158\nAJ1\t219159\n查发\t219160\n篝\t219161\n纪念表\t219162\ncnv\t219163\n新亚欧大陆桥\t219164\n_网商之窗\t219165\n紫翼龙王夜\t219166\n蜀汉英雄传\t219167\n金柚网\t219168\n丝绳\t219169\n车载空气净化器\t219170\n肾蕨\t219171\n刘庭羽\t219172\n奥迪牧神记\t219173\nSmiledL\t219174\n不扣\t219175\nCITIC\t219176\n摩天\t219177\n18080\t219178\n拼法\t219179\n赵传\t219180\n张挺\t219181\n三幢\t219182\n侵限\t219183\ndistpicker\t219184\n狭窄性腱鞘炎\t219185\n严酷\t219186\n主魂\t219187\n75亿\t219188\n兼谈\t219189\n保护膜\t219190\ncmp\t219191\n2.3.7\t219192\n交房\t219193\ncontext:component-scan\t219194\n大王\t219195\nfatty\t219196\nTAB\t219197\nrayon\t219198\n张江路\t219199\n战斗机\t219200\n慧海\t219201\n三期\t219202\n冰天\t219203\n拍场\t219204\n46张\t219205\n外敌\t219206\ngalera\t219207\n24分钟\t219208\n木块\t219209\n社通\t219210\n苹果头\t219211\n过天\t219212\n金指码\t219213\n大堤\t219214\n5545\t219215\n溶氧仪\t219216\n太湖一号\t219217\n田蕴章\t219218\n袋\t219219\n蒙特利尔\t219220\n虞雯\t219221\n马斌\t219222\n走线\t219223\n尾椎\t219224\ncapped\t219225\n钢杆\t219226\n王晓梅\t219227\n启动项\t219228\n翠城花园\t219229\n复旦大学管理学院\t219230\n青花瓷酒\t219231\n运势\t219232\nb41\t219233\nnitrogen\t219234\n中后\t219235\n铝窗\t219236\n李玉\t219237\nNBA直播|CCTV5\t219238\nconnect\t219239\n药物流产\t21
9240\n网络机顶盒\t219241\n空调压缩机\t219242\nShangri\t219243\nmain函数\t219244\n司尔特\t219245\nmelonstreet\t219246\n青稞酒\t219247\n5第一章\t219248\n王洁曦\t219249\n白云石\t219250\n许茂\t219251\n消炎药\t219252\ngdd\t219253\n抖擞\t219254\n阴陵泉\t219255\n长荣\t219256\n花间堂\t219257\n云门\t219258\n真爽\t219259\n林黛玉进贾府\t219260\nR2018a\t219261\nteambition\t219262\nit网\t219263\nUploading\t219264\nGazette\t219265\n钢丝钳\t219266\n133号\t219267\n丁香园\t219268\n华润信托\t219269\n人性本恶\t219270\n东昌府\t219271\nebony\t219272\n瓦\t219273\nsha1值\t219274\n分期后\t219275\n绝地求生黑号\t219276\n中华人民共和国道路交通安全法\t219277\n晋善\t219278\n英俄\t219279\n圣母玛利亚\t219280\n豹纹守宫\t219281\n工业摄像机\t219282\nAD17\t219283\n危险化学品目录\t219284\n枣庄市人力资源和社会保障局\t219285\n凤鸣\t219286\nMMSE\t219287\n诺埃尔\t219288\n处置\t219289\n10四大\t219290\n灿\t219291\n尚层\t219292\n铁三\t219293\n掩藏\t219294\n梨汁\t219295\n潘林\t219296\n汇总项\t219297\nC9Pro\t219298\n百鬼\t219299\n2月5日\t219300\nIOC\t219301\n绝地求生刺激战场沙漠地图\t219302\n造田\t219303\nハイスク\t219304\n亚色\t219305\n倚重\t219306\n敢为\t219307\n银灰色\t219308\nremotes\t219309\n小学语文教学\t219310\n宝岛眼镜店\t219311\n天花\t219312\n002052\t219313\n气柱袋\t219314\n720云\t219315\n光大山湖城\t219316\n扩屏\t219317\n沥川\t219318\n松山机场\t219319\n大麻镇\t219320\n日汉\t219321\nALV\t219322\n国匠\t219323\n桂林七星区\t219324\n烟油\t219325\n0.4%\t219326\n网贷知识堂\t219327\n20160129\t219328\n三刃木\t219329\n真信\t219330\n凉鞋女\t219331\n雅瑶镇\t219332\n数十亿美元\t219333\n支撑柱\t219334\n汤宝\t219335\ngu\t219336\nnova2\t219337\n细节决定成败\t219338\nTalkBack\t219339\n贾旭明\t219340\n奥卡\t219341\n芙蓉南路\t219342\n升仙\t219343\n嗨卡\t219344\n巴德尔\t219345\n横位\t219346\n胆囊癌\t219347\n假面骑士exaid\t219348\n刀片\t219349\n岳绮罗\t219350\n农业类\t219351\n国家安全局\t219352\n十二块\t219353\n乐段\t219354\n操盘宝\t219355\n长沙市教育局\t219356\n诗题\t219357\nD90\t219358\n色欲之夜\t219359\n雅绅特\t219360\n云时代\t219361\n官术\t219362\nHPB300\t219363\n极贫\t219364\n懵智\t219365\n精化\t219366\n浪妻\t219367\nToolStrip\t219368\n8930\t219369\n碑材\t219370\n忆儿\t219371\n椒盐\t219372\n乐元素\t219373\n梦回传奇\t219374\n凤起路\t219375\n12.3寸\t219376\n20160404\t219377\n史凯利杰\t219378\n上海信裕生物科技有限公司\t219379\n培训学\t219380\nav女友优\t219381\n鲁肃\t219382\n过松源晨炊漆公店\t219383
\n咪咕音乐网\t219384\n杏堂\t219385\n100篇\t219386\n傲游浏览器\t219387\n凯迪拉克ct6\t219388\n冠珠陶瓷\t219389\n黄才伦\t219390\n世界反法西斯战争\t219391\nSignage\t219392\n信息工程专业\t219393\nIP地址查询\t219394\n巴基斯\t219395\nfputc\t219396\n不可抗拒\t219397\n中原新区\t219398\n夕阳看\t219399\n20pin\t219400\n集邦\t219401\n肉联厂\t219402\nmargaret\t219403\n决胜千里之外\t219404\n八思巴\t219405\nLived\t219406\n热成像仪\t219407\n手机定位系统\t219408\n轿子山\t219409\n江南西\t219410\n页岩油\t219411\n白马湖国际会展中心\t219412\n12.2.1\t219413\n第47\t219414\n动天下\t219415\nBGA\t219416\n酬金制\t219417\n颈纹\t219418\n带劲\t219419\n已有\t219420\n郝海东\t219421\n颅压\t219422\n微视听\t219423\nitues\t219424\n努比亚Z7\t219425\ntortoiseGit\t219426\n刘云生\t219427\n左手亲情右手爱\t219428\n藤浦惠\t219429\n小霞\t219430\n桃花水\t219431\n生化棉\t219432\n中妻\t219433\n景甜\t219434\n10.10.4\t219435\n罗湖汽车站\t219436\nSince\t219437\n原形\t219438\n蓄客\t219439\n鼓山镇\t219440\n1:80\t219441\n意料\t219442\n建保\t219443\n洗血\t219444\nLog4net\t219445\n丧尸病毒\t219446\n搜索树\t219447\n李鹏\t219448\n3245\t219449\n255.255.255.224\t219450\npwn\t219451\n凌动\t219452\n黑血\t219453\n体育西路地铁站\t219454\n梵语\t219455\n续页\t219456\n过桥费\t219457\n狐\t219458\n乐山市市中区\t219459\n适合度\t219460\n蛋儿\t219461\n点点点\t219462\nkbhit\t219463\n业种\t219464\n乔宇\t219465\n叶蓓\t219466\n运单\t219467\n勒紧\t219468\nMEDICINE\t219469\ntanke\t219470\n拉钩\t219471\nShuo\t219472\n元素周期表\t219473\nMarvin\t219474\n检索\t219475\n榭兰图\t219476\n米诺陶斯\t219477\n君不见\t219478\nX9s\t219479\n信服\t219480\n小米云空间\t219481\nshark\t219482\nVBA宏\t219483\n实施者\t219484\n六十四\t219485\n情难\t219486\n丝巾\t219487\n强晶\t219488\n刘金山\t219489\n超媒体\t219490\nATX\t219491\n杀警\t219492\ndigestive\t219493\n马奎\t219494\nsocket\t219495\n许多次\t219496\n黑瓶\t219497\n招生处\t219498\n汪清县\t219499\n游去\t219500\n灭蚊灯\t219501\nFFS\t219502\n+86\t219503\n寥寥\t219504\n搓背\t219505\n27件\t219506\n长治百姓网\t219507\n风顺\t219508\n水咲ロ\t219509\n嘱托\t219510\n茶水柜\t219511\n安堂\t219512\n输入法\t219513\n减去\t219514\n谢姓\t219515\n韩寒怒怼\t219516\n小津\t219517\n无极\t219518\n六方位\t219519\n胡\t219520\n可掬\t219521\n古隆中\t219522\n判若两人\t219523\n施一杜兰特\t219524\n嘉立创\t219525\n西部牧业\t219526\nallow\t219527\nIXDC\t219528\n31套\t219529\n双子传说\
t219530\n易创培训网\t219531\n辉瑞制药有限公司\t219532\n碧晨\t219533\n慎言\t219534\n战争雷霆\t219535\nUncaged\t219536\n中周\t219537\nLarryZeal\t219538\nlnp\t219539\n9908\t219540\n入场\t219541\n圈地运动\t219542\n彩华\t219543\n底数\t219544\n惠菲宁\t219545\n文件卡\t219546\n一百五十\t219547\n王剑\t219548\n满色\t219549\n北京麦田\t219550\n婚恋网\t219551\n冷雨\t219552\n超微粉碎\t219553\n台球\t219554\n离散数学\t219555\n昆明市盘龙区人民政府\t219556\n宅急便\t219557\n黑水虻\t219558\n陈晓明\t219559\n木糖\t219560\n机蜜\t219561\n燕麦\t219562\n说辞\t219563\nvacant\t219564\n761\t219565\n青少\t219566\nMNS\t219567\n徐灿\t219568\n好黑\t219569\n园务\t219570\n不粘\t219571\n长江公路大桥\t219572\n12秒\t219573\n租贷\t219574\n南京市浦口区人民政府\t219575\n企业招商网\t219576\n天翼\t219577\n独孤天下\t219578\n很惭愧\t219579\n观赏鱼百科_龙巅网\t219580\nBlaster\t219581\n王清华\t219582\n姚笛\t219583\n启东经济开发区\t219584\nbat\t219585\nfluorescent\t219586\n中卫\t219587\n清税证明\t219588\n米德加尔特\t219589\n契阔\t219590\n凌兆新村\t219591\n怀柔水库\t219592\ndosbox\t219593\n佟爱国\t219594\n7两\t219595\n看着\t219596\n隔声罩\t219597\n锦天城律师事务所\t219598\n蓝鸽集团\t219599\n高树\t219600\nNOTE2\t219601\n腐漫\t219602\nerik\t219603\n上海安谱实验科技股份有限公司\t219604\n小花仙\t219605\n繁重\t219606\n最小离地间隙\t219607\nI2\t219608\nfifaol\t219609\nPolygon\t219610\n企友\t219611\n台州市环保局\t219612\n思无邪\t219613\n难言\t219614\n|支\t219615\n江苏省省级机关医院\t219616\nE20\t219617\n华山长空栈道\t219618\n车险\t219619\n高中数学必修5\t219620\n惠南小学\t219621\n恋曲1990\t219622\n烟罗\t219623\n蜂群\t219624\n天尖\t219625\n非分\t219626\n候车厅\t219627\nbigger\t219628\n布子\t219629\n魅影\t219630\nHer\t219631\n小儿豉翘清热颗粒\t219632\n2319\t219633\n3DOne简易操作教程\t219634\n厢式压滤机\t219635\n中共青海省委\t219636\n0.56\t219637\nicmp协议\t219638\n殷氏\t219639\n蒋佳恩\t219640\nTransitions\t219641\n可以\t219642\n小商品\t219643\n头孢丙烯\t219644\n【心食谱\t219645\n港独\t219646\n云中歌\t219647\nOSB\t219648\nWOW6.0\t219649\n魏尧\t219650\n金鞭溪\t219651\nSHKD\t219652\no365\t219653\nサ\t219654\n三津\t219655\n结婚照\t219656\n电流\t219657\n方大炭素\t219658\nIAA\t219659\nH61M-DS2\t219660\n咨政\t219661\nmultilayer\t219662\n中国共产党中央政治局\t219663\nU32\t219664\n贼人\t219665\n市运会\t219666\n余强\t219667\n眼科\t219668\n20171106\t219669\n麻辣江湖\t219670\n迈特凯\t219671\nPanorama\t219672
\n自治区\t219673\n爱宝美\t219674\n迷你\t219675\ntrycatch\t219676\n中友\t219677\n檀香\t219678\n花容月貌\t219679\n埼玉县\t219680\n南校\t219681\n中国太平洋人寿保险\t219682\n天涯明月刀丐帮\t219683\nBlair\t219684\n不过关\t219685\nchx\t219686\n805路\t219687\nrul\t219688\n列管\t219689\n基金证券\t219690\n餐盒\t219691\n娇韵诗恒润\t219692\n0616\t219693\n会场\t219694\n奥迪q5l\t219695\n新乡村\t219696\n包装物\t219697\nhandsomecui\t219698\nessec\t219699\nLeslie\t219700\nhyperscan\t219701\n实业有限公司\t219702\n地花\t219703\nTransMac\t219704\n奸角\t219705\n虎尾\t219706\n虚拟化\t219707\nV40\t219708\n20倍\t219709\n太仓百姓网\t219710\n弥补\t219711\n靑\t219712\n家具板\t219713\n华脉科技\t219714\n君实生物\t219715\n团结湖\t219716\n王馨平\t219717\n刘清华\t219718\nunpb\t219719\n排球\t219720\nentitled\t219721\n小舟\t219722\nvolcano\t219723\n摘心\t219724\n奉佑生\t219725\n毕业生们\t219726\n2018年1月29日\t219727\nalluxio\t219728\n日照市人民政府\t219729\n井台\t219730\n陈济棠\t219731\n殷美根\t219732\n决战\t219733\n水龙\t219734\n2018部\t219735\nDiving\t219736\nGina\t219737\n20s\t219738\n原发性血小板增多症\t219739\nUnity3D学习网\t219740\n阔绰\t219741\nobserve\t219742\n韩姓\t219743\n真三国无双4\t219744\nDemi\t219745\n羞花\t219746\n1981年\t219747\nsysu\t219748\n中软国际科技服务有限公司\t219749\n麻涌\t219750\n清露\t219751\n天一生\t219752\n布丁豆豆\t219753\nG20\t219754\n大富科技\t219755\n吴兵\t219756\n西游记女儿国\t219757\n山田\t219758\n罗梅罗\t219759\nnju\t219760\n揭开\t219761\n齐向东\t219762\n太原市\t219763\n牧羊人之心\t219764\nCompact\t219765\n蔬菜商情网\t219766\n代物\t219767\n毛岸英\t219768\n12分钟\t219769\n杰夫·贝索斯\t219770\n国际法学院\t219771\n漂白液\t219772\n三连败\t219773\n生物接触氧化法\t219774\n相同列\t219775\n张森\t219776\n亚氯酸钠\t219777\n烙画\t219778\n塘沽火车站\t219779\n59年\t219780\n新岛\t219781\nPDF/TXT\t219782\n执念\t219783\njmc\t219784\n任务版\t219785\n天沐\t219786\n熟练程度\t219787\n陆天明\t219788\n杭州极地海洋公园\t219789\n云曦\t219790\n面波\t219791\n膜片弹簧离合器\t219792\n康泰纳仕\t219793\n裤装\t219794\n灵梦\t219795\nToolbag\t219796\n理财类\t219797\n新观念\t219798\n望湖楼\t219799\n合志\t219800\n浪漫曲\t219801\n中工网\t219802\n天津论坛\t219803\n聚米\t219804\n比特斯拉\t219805\n学科网\t219806\n流浪军\t219807\n事务部\t219808\nSTVD\t219809\n新彩\t219810\n宋斌\t219811\n华恒\t219812\n无内\t219813\n变成\t219814\nAndroid安\t219815\n泽哥\t21981
6\n鸿飞\t219817\n0088\t219818\n百度地图车机版\t219819\n金相分析\t219820\ndayz\t219821\n航运在线\t219822\nPS批量\t219823\n北京五星级酒店\t219824\n连岛\t219825\nyangzhi.huangye88.com\t219826\n牢度\t219827\n画中画\t219828\n染色质\t219829\n告示板\t219830\nedr\t219831\n护彤\t219832\n23本\t219833\n头胀\t219834\n骚女\t219835\n火影疾风坛\t219836\n消防机器人\t219837\ncreaterepo\t219838\n全向轮\t219839\n八边形\t219840\n东京银座\t219841\ndpr\t219842\n净利率\t219843\n帝友\t219844\n狼毒\t219845\nPeer\t219846\n甲氨蝶呤\t219847\n县国土资源局\t219848\n坡道\t219849\n李佩红\t219850\n惠子相梁\t219851\n无尽世界\t219852\n博爱慎行\t219853\nThink\t219854\n宝利来\t219855\n豪沃克\t219856\n技术转让合同\t219857\n倒像\t219858\n天气状况\t219859\n菜心\t219860\n用友公司\t219861\n广州市人大\t219862\nsubspace\t219863\n锁模\t219864\n836\t219865\n创伤性\t219866\n蔬果汁\t219867\n香雪\t219868\nPapa\t219869\nClever\t219870\n亲爱的许你来生\t219871\n对垒\t219872\n中央综治委\t219873\n美国宾夕法尼亚州\t219874\n钴\t219875\n逝世\t219876\nwonderful\t219877\n蜜蜂少女队\t219878\nfitc\t219879\n女神经\t219880\n北溪\t219881\n世范\t219882\n第九册\t219883\nsignheart\t219884\n临平北\t219885\nTour\t219886\nZXing\t219887\nl3\t219888\n市社保局\t219889\n课音乐\t219890\n降魔篇\t219891\n李鑫\t219892\n拉卡拉手机收款宝\t219893\n晚装\t219894\n起点币\t219895\n惊天危机\t219896\n德能\t219897\n人肉\t219898\nPrototype\t219899\n袖套\t219900\n幻想纹章\t219901\nBlogJava\t219902\nopcua\t219903\nl101\t219904\n中国故事大会\t219905\n大阳电动车\t219906\n20170107\t219907\n十一天十一夜\t219908\n中策\t219909\n减光\t219910\n污泵\t219911\nVolcano\t219912\n垂体腺瘤\t219913\n石炉\t219914\n竹叶青茶\t219915\n高交会\t219916\nFlare\t219917\nPowerPoint2013\t219918\n危险源管理制度\t219919\n涂刷\t219920\n理道财税咨询网\t219921\n小萍\t219922\n付款方\t219923\nbeiji\t219924\n西电捷通\t219925\naape\t219926\n夏丹\t219927\nUE4\t219928\n124个\t219929\nGeeM2\t219930\n2500吨\t219931\n永乐电器\t219932\n军情解码\t219933\n疑问贴\t219934\n哗哗\t219935\n参试\t219936\n纤手\t219937\n电力调整器\t219938\n未婚证明\t219939\n属具\t219940\n高尔夫物语\t219941\nSis001\t219942\nisv\t219943\n民权县\t219944\n敦\t219945\n取走\t219946\n包奕凡\t219947\n申智珉\t219948\n恒星币\t219949\n小米之家\t219950\n东风悦达起亚\t219951\n笙\t219952\nez\t219953\n技巧\t219954\n羽弥\t219955\n沼渣\t219956\n开式\t219957\n组建\t219958\n雅本化学\t219959\nE
mbarcadero\t219960\n谢春涛\t219961\n东莞市\t219962\n阳原\t219963\nhackyo\t219964\n风水先生\t219965\n第一晚\t219966\n罗汉山\t219967\n第十回\t219968\n寂寂\t219969\n绿政\t219970\nracer\t219971\n9.4分\t219972\ntidb\t219973\nSGI\t219974\n两立\t219975\n2525\t219976\n字长\t219977\n博多\t219978\nJack47\t219979\n截止阀\t219980\n家庭用品\t219981\n中条山\t219982\nCMAKE\t219983\nEXE格式\t219984\n交稿\t219985\n藏族\t219986\n城市亮化工程\t219987\n挂杆\t219988\n看门狗2吧_\t219989\nDCU\t219990\n白金分期信用卡\t219991\n光龙\t219992\n箱规\t219993\nindeed\t219994\nLSTM\t219995\nse\t219996\n福盈\t219997\n办公点\t219998\nsubplots\t219999\nTransformer\t220000\n力偶矩\t220001\n同期\t220002\n格子柜\t220003\n陈功\t220004\n王者雪铁龙c5\t220005\nironic\t220006\n十六章\t220007\nj8\t220008\n寿词\t220009\n未变\t220010\n甲方乙方\t220011\n20161107\t220012\n2018年04月21日\t220013\n休杰克曼\t220014\n红韵\t220015\n方锐\t220016\n核算\t220017\n海权\t220018\nPerson\t220019\n翘舌音\t220020\n19.5英寸\t220021\n工卡\t220022\n响巢\t220023\n&#61677\t220024\nPPTX\t220025\n脸肿字幕组\t220026\n临终关怀\t220027\n拼合\t220028\n市食药监局\t220029\nJesus\t220030\n桂林机场\t220031\n小狸\t220032\n交城\t220033\nviewstate\t220034\nFS7\t220035\n黄山西\t220036\n20140921\t220037\nzpt\t220038\n比莉\t220039\n故都的秋\t220040\nDBJT\t220041\n陈昌浩\t220042\n国际经济法概论\t220043\n胶片感\t220044\n先后\t220045\n毛伟\t220046\n超声波细胞粉碎机\t220047\n翻斗\t220048\n认作\t220049\n中桥\t220050\n金兰湾\t220051\n有限单号查询_网\t220052\nANK\t220053\n5013\t220054\nspooler\t220055\n眼馋\t220056\nSafari浏览器\t220057\n涡轮车\t220058\n纳雍县\t220059\n二连浩特\t220060\njetty\t220061\n谱写\t220062\nIsabel\t220063\n不定式\t220064\n神树\t220065\nInterrupt\t220066\n一年制\t220067\nestablish\t220068\ntester\t220069\n新婚姻法\t220070\n可爱的你\t220071\nfmri\t220072\n新世纪幼儿园\t220073\n林文信\t220074\n小学生作文_小故事网\t220075\n宁波东部新城\t220076\n展销\t220077\n基源\t220078\nvc52\t220079\nfragile\t220080\nvolvo\t220081\n白玉兰花\t220082\n食品流通许可证\t220083\n黄春明\t220084\n忽略\t220085\n驚\t220086\n沙奈朵\t220087\n药酒案\t220088\n伊尔萨\t220089\n搪玻璃\t220090\n索命\t220091\n南平镇\t220092\n灌南县人民政府\t220093\n一两周\t220094\n転生剣奴\t220095\n田家镇\t220096\n香篆\t220097\nKILL\t220098\n行上\t220099\n海门街道\t220100\n几轮\t220101\n洗去\t220102\ne
sd\t220103\n补给站\t220104\n佛牌\t220105\n毒姐\t220106\n劳斯莱斯古思特\t220107\n易感性\t220108\n酉金\t220109\n笋尖\t220110\n八八伍\t220111\n梁晓声\t220112\n第65章\t220113\n纽瓦克\t220114\n黄桥街道\t220115\n5170\t220116\n78号\t220117\n金世遗\t220118\n中顺洁柔\t220119\n苯乙酸\t220120\n千千网\t220121\nMarina\t220122\n临空经济示范区\t220123\n网名\t220124\n中国广电\t220125\n液压接头\t220126\n高清晰版\t220127\n相残\t220128\n0376\t220129\nAuld\t220130\n华瑞银行\t220131\nresignation\t220132\ntribute\t220133\nA5\t220134\n马海涛\t220135\n环球运费网\t220136\n巨涛\t220137\n建筑工程质量监督站\t220138\n中南世纪城\t220139\n1880\t220140\n_千图网\t220141\nC6H\t220142\n放球\t220143\n常德站\t220144\n发烧\t220145\n步兵\t220146\nMedium\t220147\n彭琳\t220148\n室觉网\t220149\n辽中南\t220150\n孤枕难眠\t220151\n意力\t220152\n114\t220153\ndkdn\t220154\n隔音屏\t220155\nT桖\t220156\n裸鼠\t220157\n环球资源网\t220158\n自溶\t220159\n仙之侠道\t220160\n裴少\t220161\n体决\t220162\n交叉点\t220163\n第一中学\t220164\n脚本错误\t220165\n樟村\t220166\n申花\t220167\n双穿\t220168\n灵山路\t220169\n彬卅\t220170\nAmherst\t220171\n花园乡\t220172\nk乳\t220173\n林师傅\t220174\n水拓画\t220175\nlit\t220176\n朱天福\t220177\n县级市\t220178\n文件包\t220179\nup\t220180\n雪中送炭\t220181\nyuno\t220182\n寄宿生\t220183\n万春\t220184\n磺基水杨酸\t220185\nII级\t220186\necole\t220187\n立水桥南\t220188\n拟人\t220189\n四论\t220190\nBalsamiq\t220191\n盱眙龙虾\t220192\n猪狗\t220193\n无尘布\t220194\n炮战\t220195\nWIN64\t220196\n撂\t220197\n美度\t220198\n故事梗概\t220199\nNBA2k\t220200\n5nd\t220201\n黄崖洞\t220202\n交易对方\t220203\nSai\t220204\n1683\t220205\nHoneymoon\t220206\n广东温氏食品集团股份有限公司\t220207\n壁障\t220208\n严蔚敏\t220209\n霜刃\t220210\n长两短\t220211\n氨基丁酸\t220212\n交付\t220213\nconstantly\t220214\nconformity\t220215\n鼓风机\t220216\nlumia1020\t220217\n焊料\t220218\n蓝舌石龙子\t220219\n图书证\t220220\n造业\t220221\n河北雄安新区\t220222\n石破天惊\t220223\n中咨公司\t220224\n山东大学研究生院\t220225\n大弯\t220226\n蝶花\t220227\n那封信\t220228\n山丹丹\t220229\n砸锅\t220230\n劳动力\t220231\n冷暴力\t220232\nぷ\t220233\n松原\t220234\n吴下阿蒙\t220235\nhefei\t220236\nwin7系统网络\t220237\nGrain\t220238\n会计公司\t220239\n重庆国税\t220240\n湖南文理学院\t220241\n金俊\t220242\n亮资\t220243\n蛇男\t220244\n麗枫酒店\t220245\n17秒\t220246\n陈阿娇\t220247\n91bab\t220248\nre
gulations\t220249\nsvnadmin\t220250\n代维\t220251\n电线管\t220252\n小向真奈美\t220253\nqwe123rty\t220254\n度云\t220255\n北斗星小说网\t220256\n狐尾花\t220257\nTranny\t220258\njiading\t220259\nm7\t220260\n这年\t220261\n内蒙古自治区公安厅交通管理局\t220262\n王乔\t220263\nNa+\t220264\n这么装\t220265\nDiao\t220266\n海涵\t220267\n中国大使馆\t220268\n猎猎\t220269\n怀特迈恩\t220270\n泥城镇\t220271\nAV天堂看网\t220272\n2.1GB\t220273\n半杯\t220274\n北阳\t220275\nRegina\t220276\nBVR\t220277\n封天2\t220278\n工业大道\t220279\n慢性白血病\t220280\nCreations\t220281\n龙创\t220282\n園田\t220283\n城市公交\t220284\n永不\t220285\n白市驿镇\t220286\n在别处\t220287\n鬼刃\t220288\n西栅老街\t220289\n尿裤子\t220290\n沃趣科技\t220291\nnovation\t220292\n华为荣耀V10\t220293\n黄明孔子\t220294\n专用帖\t220295\n玉立\t220296\n爱表网\t220297\n迁居\t220298\nFZ\t220299\n转出让\t220300\n以奖代补\t220301\n带集\t220302\n换流\t220303\n英語\t220304\n岳阳路\t220305\n杨义先\t220306\n踩点\t220307\n镇长\t220308\n时运\t220309\n人教版2018\t220310\n皇家一号\t220311\n南京医科大学附属逸夫医院\t220312\nbrake\t220313\n焊锡丝\t220314\n多士\t220315\n每一个你\t220316\neight\t220317\npdx\t220318\n一万次\t220319\n3066\t220320\n东方出版社\t220321\n趣盒子\t220322\n大禹陵\t220323\n一环路西\t220324\n光山县\t220325\nForMacMAC\t220326\n助学\t220327\naquan\t220328\n磷虾\t220329\nCMYK值\t220330\n公安部物证鉴定中心\t220331\n天安人寿健康源\t220332\n平乐镇\t220333\ni77700hq\t220334\nMybatis3\t220335\n李新中\t220336\n国外大学\t220337\n李干杰\t220338\n三国杀国战\t220339\n廿\t220340\nmiffy\t220341\nDIESEL\t220342\n保利公园\t220343\nf498\t220344\n山重水复疑无路\t220345\n库布里克\t220346\n猫耳fm\t220347\n高利贷\t220348\nmpu6050\t220349\n茶舍\t220350\n苏州河\t220351\nVISI\t220352\n奔驰C63\t220353\n广州市人民政府国有资产监督管理委员会\t220354\n朝九晚五\t220355\nDirectx\t220356\n港建\t220357\nfacade\t220358\n欧润哲\t220359\ntilbury\t220360\n天涯明月刀五毒\t220361\n布拉芙\t220362\n33级\t220363\n37页\t220364\n地轨\t220365\n首都博物馆\t220366\n玻璃板\t220367\n南联盟\t220368\n劲旅\t220369\n绿箭侠第六季\t220370\n林教头\t220371\n建筑工程工程\t220372\nLOOP\t220373\n龙湖物业\t220374\n天翼校园网\t220375\n盼君\t220376\n武将\t220377\n3200\t220378\n忆捷\t220379\n吉它\t220380\nRAN\t220381\n搜楼网\t220382\n油标\t220383\n3.6亿\t220384\n奇缺\t220385\n航天发展\t220386\nnanjing\t220387\n金池\t220388\nMAKING\t220389\n
tonumber\t220390\n瑞源\t220391\n黑水城\t220392\n金丝雀\t220393\n古瑞瓦特\t220394\n亮瞎\t220395\n白衣战士\t220396\n玩法\t220397\ntwister\t220398\nSocial\t220399\n暗喻\t220400\nkingsoft\t220401\n5w40\t220402\n阿v\t220403\n老千2:神之手\t220404\n长明灯\t220405\nlululun\t220406\n火火的爱\t220407\n泪眼\t220408\nDB11T\t220409\nION\t220410\n硫离子\t220411\n托维尔\t220412\n高纯氧化铝\t220413\n低血糖症\t220414\n生死线\t220415\n美拉德反应\t220416\nh3cne\t220417\n联想小新300\t220418\n情咒\t220419\na1466\t220420\n抽选\t220421\n行政执法局\t220422\nbehalf\t220423\nGank\t220424\n广西法治日报\t220425\nFIGHTING\t220426\n奉天省\t220427\n黄莹\t220428\n11寸\t220429\n摄图网\t220430\n12}\t220431\n周平王\t220432\n过虑\t220433\nminimum\t220434\n外包商\t220435\n压胶\t220436\n大墙\t220437\n李卓\t220438\nsleeping\t220439\n拍发\t220440\n沙力\t220441\n五原路\t220442\n购车计算器\t220443\n精灵们\t220444\n古城墙\t220445\nsxky\t220446\n公务卡\t220447\n艾海提沙依\t220448\n84集\t220449\n03集\t220450\n最终幻想14wiki_最终幻想14攻略\t220451\n响动\t220452\n十字星\t220453\n天银\t220454\n马厂镇\t220455\nrequirement\t220456\n美尻\t220457\n404页\t220458\n布机\t220459\n多玛姆\t220460\n刘家昌\t220461\n中装新网\t220462\n红杏出\t220463\n新刑事诉讼法\t220464\n后背疼\t220465\n傅雷家书\t220466\n地宝\t220467\n05级\t220468\n刑释\t220469\n桃花岛\t220470\nALT+\t220471\n一雷\t220472\nLittlePeng\t220473\ncvp\t220474\n1682\t220475\n大疆精灵4\t220476\nflashing\t220477\nDDK\t220478\n亵玩\t220479\n路缘\t220480\n4005\t220481\n陈修风\t220482\nword13\t220483\nMq\t220484\nthailand\t220485\n总伤\t220486\n泰康保险集团\t220487\n广播台\t220488\n通灵人\t220489\nIPT\t220490\n娼馆\t220491\n座架\t220492\n第17轮\t220493\n师情\t220494\n薄胎\t220495\noppor9splus\t220496\n三分之一处\t220497\n居留证\t220498\nuc3842\t220499\n李建民\t220500\nblizzard\t220501\n新史\t220502\n联想昭阳\t220503\n/button\t220504\n京沈高铁\t220505\n中梁镇\t220506\n刘家坪\t220507\n别扯\t220508\nintervals\t220509\n修复机\t220510\n破零\t220511\n之江实验室\t220512\n华贸易战\t220513\n记者\t220514\n乌鸡汤\t220515\n迪庆\t220516\n王亚飞\t220517\n假面骑士amazons\t220518\n复方甘草酸苷胶囊\t220519\n氧化塘\t220520\n彩笔\t220521\n何奕\t220522\n清政\t220523\n有奖\t220524\n威风凛凛\t220525\nhrbp\t220526\nGOT7\t220527\n石距\t220528\n珍重\t220529\n高建华\t220530\n4万多\t220531\n谷安\t220532\n蜃境\
t220533\n白麻\t220534\n风险补偿金\t220535\n6名\t220536\n旗山\t220537\n社函\t220538\n唐山地区\t220539\n净末\t220540\n子龙\t220541\n刘再复\t220542\n安岳县\t220543\n长针\t220544\neasier\t220545\n白灵菇\t220546\n騙\t220547\n前男友们\t220548\nnotepad\t220549\n赠汪伦\t220550\n温州都市报\t220551\n签约式\t220552\n东风柳汽\t220553\ncecile\t220554\n领克4s店\t220555\n第14集\t220556\n常州机电职业技术学院\t220557\n半道\t220558\n同济经管\t220559\nC语言函数\t220560\napabi\t220561\n孟达\t220562\nPhil\t220563\n并非\t220564\n2:22\t220565\n光泽\t220566\n扣除\t220567\n玉面\t220568\n斗提机\t220569\nGuild\t220570\n9月10日\t220571\nCincinnati\t220572\nI5处理器\t220573\ncp36-cp36m-win\t220574\nAutoprefixer\t220575\n古钟\t220576\n万链\t220577\n近30天\t220578\ne200\t220579\n堰桥街道\t220580\n特种车\t220581\nEM\t220582\nherb\t220583\n移动办公行\t220584\nBrowser\t220585\n霏凡\t220586\nESP分区\t220587\n国家自主创新示范区\t220588\nFreePBX\t220589\n大话刘罗锅\t220590\n离子方程式\t220591\n破肚\t220592\n学不死\t220593\n人类史\t220594\n贝亚娜\t220595\n同仇敌忾\t220596\n喷桩\t220597\n注资\t220598\n宁波市交通运输委员会\t220599\n友利\t220600\nmotor\t220601\n半盖\t220602\nickb\t220603\n宁波公交\t220604\nzd423\t220605\nPH\t220606\n主塔\t220607\n小吊梨汤\t220608\n重压\t220609\n第十九届\t220610\n博雅\t220611\n4级\t220612\n食玩包\t220613\n双星物语\t220614\n应收股\t220615\n2009-2017年\t220616\n猛龙特囧\t220617\nhp1108\t220618\n725\t220619\nSimilarSites\t220620\n冰区\t220621\n蹂躙\t220622\n地沟\t220623\n螭龙\t220624\n涤\t220625\n单男\t220626\n报春\t220627\n乐儿\t220628\nInspiration\t220629\n冒险岛2\t220630\n仙桥\t220631\nclos\t220632\n测绘工程专业\t220633\nJavaS\t220634\n千米\t220635\n昂刺鱼\t220636\n种地\t220637\nisland\t220638\n皇恩\t220639\n罗增斌\t220640\n乐风\t220641\n光子\t220642\n房博士网\t220643\n辛店\t220644\n射手播放器\t220645\nZHI\t220646\n博才\t220647\n逍遥西游\t220648\n江苏建康职业学院\t220649\n色素色\t220650\n姆希塔良\t220651\n石牌村\t220652\n何亮亮\t220653\npiu\t220654\n一个11\t220655\n绝世唐门漫画全集\t220656\n王安顺\t220657\n霞多丽\t220658\n场地网\t220659\nv5.4.0.3970\t220660\n衣展\t220661\n普职\t220662\n迁坟\t220663\n述职述德述廉\t220664\n90kw\t220665\n【张\t220666\nproxy\t220667\n玉林市\t220668\n偷窥\t220669\nINCREMENT\t220670\n韩歌\t220671\norigin9\t220672\narchetype\t220673\npass反向代理\t220674\nplugy\t2206
75\n朱世赫\t220676\n蛋挞液\t220677\n谷轮\t220678\n冲激\t220679\n明海\t220680\n潞州\t220681\n2623\t220682\n豪森药业\t220683\nMSDTC\t220684\n工程项\t220685\n玛堡\t220686\n芝兰玉树\t220687\nsilkn\t220688\nCRAN\t220689\nIE11浏览器\t220690\nvboxdrv\t220691\n安徽天康(集团)股份有限公司\t220692\n广西艺术学院\t220693\n化学肥料\t220694\n金扣\t220695\nLateral\t220696\nxsb\t220697\n184号\t220698\n铝制\t220699\n防滑剂\t220700\n反侦察\t220701\n交通警察\t220702\n2169\t220703\n城市区\t220704\n无独有偶\t220705\n玉雕机\t220706\n谷山\t220707\n格鲁尔\t220708\n欢醉\t220709\n微信名_短美文网\t220710\nDensity\t220711\npersist\t220712\n偶像们\t220713\n一框\t220714\n均分\t220715\naks\t220716\n喷丸\t220717\n真人快打X\t220718\nprtsc\t220719\n参合\t220720\nADsafe\t220721\nVijeo\t220722\n松本润\t220723\n并排\t220724\n三板模\t220725\nbackup\t220726\n沪昆高铁\t220727\n中共上海市委\t220728\n赛睿寒冰3\t220729\n苏联人\t220730\n西夏墅\t220731\nLaid\t220732\nproces\t220733\nDSH\t220734\n泳装\t220735\n东风雪铁龙\t220736\n叙化\t220737\n山西移动\t220738\n1.0.6\t220739\nBacklog\t220740\n武杨\t220741\n加州1号公路\t220742\n榨季\t220743\n新心\t220744\nwin8\t220745\n万象广场\t220746\n欲情故纵\t220747\ncdec\t220748\n奥尔滨\t220749\n陈一新\t220750\n久久健康网\t220751\n合康新能\t220752\nXper\t220753\nXSSF\t220754\n稳压管\t220755\n南湖水苑\t220756\nhiberate\t220757\n30余名\t220758\n任鸟飞\t220759\nPaula\t220760\nrecurrent\t220761\nD80\t220762\n注本\t220763\n江海股份\t220764\n词项\t220765\n桃李杯\t220766\n0083\t220767\nYear\t220768\n天寿\t220769\n泄气\t220770\n做菜网\t220771\n触觉\t220772\n九九乘法口诀表\t220773\n长管\t220774\n商道\t220775\nLaserJet\t220776\n二飞\t220777\n农林大学\t220778\n舞法\t220779\ncatfight\t220780\n孽障\t220781\n首都医科大学附属北京妇产医院\t220782\n清控\t220783\n吴千语\t220784\nMCMC\t220785\n刹车片\t220786\n计划单列市\t220787\n窗前\t220788\n回评\t220789\n万丰奥威\t220790\n扎哈·哈迪德\t220791\n天枰男\t220792\n上片\t220793\n铝电\t220794\n卡贴\t220795\n护士长\t220796\n杨宇军\t220797\n决堤\t220798\n臭气\t220799\n小智皮卡丘\t220800\n魔佞\t220801\n楼兰论坛_汽车之家论坛\t220802\n住房和城乡建设局\t220803\n李锦记\t220804\n朝比奈\t220805\n杨桥村\t220806\n淫声\t220807\n黄旼泫\t220808\nyanedu\t220809\n两环\t220810\n文学回忆录\t220811\niec\t220812\n欣怡\t220813\n人盘\t220814\n二氯吡啶\t220815\n高邮\t220816\n西哈努克\t220817\n陆航\t220818\n常营店\t220819\n
铃星\t220820\nx3440\t220821\n渭南市教育局\t220822\nDIY博客园\t220823\nmd\t220824\n南北回归线\t220825\nearpods\t220826\nrokid\t220827\n北京科蓝软件系统股份有限公司\t220828\n热力学第二定律\t220829\n如梦奇谭\t220830\n由来\t220831\nmodel层\t220832\npowerpoint2016\t220833\nvue-resource\t220834\n其义\t220835\n第几级\t220836\nIGZO\t220837\nPaging\t220838\n民国二十三年\t220839\n沃华\t220840\n108tv\t220841\n孙卫东\t220842\n加少\t220843\n阎良区\t220844\nminaj\t220845\n中华人民共和国统计法实施条例\t220846\n航海王强者之路\t220847\n362\t220848\n阳光城丽兹公馆\t220849\n沪江听力酷\t220850\nweek\t220851\nyangzi\t220852\n黑炎\t220853\n帝豪\t220854\n后继无人\t220855\n记录者\t220856\n举兵\t220857\nカンケイ\t220858\nSerenade\t220859\n大同煤矿集团有限责任公司\t220860\n参与者\t220861\ntosh\t220862\nDJ娑娜\t220863\n水盒\t220864\n听诊\t220865\n师证\t220866\n光明重影\t220867\n扭矩\t220868\niMac\t220869\n混子曰\t220870\n金仓\t220871\n水磨镇\t220872\n_分析测试百科网\t220873\nTTI\t220874\nacsii\t220875\n郭溪街道\t220876\n尤菲\t220877\n崔龙海\t220878\n中泰化学\t220879\n河海大学图书馆\t220880\n生还者\t220881\n郑丹妮\t220882\n新东站\t220883\n永宁街道\t220884\nem菌\t220885\n招股\t220886\n舒尔846\t220887\nM男\t220888\n自尊心\t220889\n牛筋面\t220890\n行贿犯罪\t220891\n禁区\t220892\n拼车\t220893\n绒促\t220894\n劫良缘\t220895\n各项\t220896\n血套\t220897\nsqlconnection\t220898\n西泰山\t220899\n3话\t220900\n叶欣\t220901\n万历十五年\t220902\n叶美玲\t220903\n企业版\t220904\n书后\t220905\n切实\t220906\n创福康\t220907\n风行烈\t220908\ngaytube\t220909\n篾\t220910\n毛巾架\t220911\nzp\t220912\n一百多斤\t220913\n灵族\t220914\n古崖居\t220915\n5.08\t220916\n广告框\t220917\n20160101\t220918\n梅格雷\t220919\n靖江街道\t220920\n还阳\t220921\n素材中国sccnn.com\t220922\npureboost\t220923\n私户\t220924\n詹姆斯·麦卡沃伊\t220925\n唔知\t220926\n三丰\t220927\n超清播放\t220928\nSelenium2Library\t220929\n佐藤学\t220930\n谁是最可爱的人\t220931\n列项\t220932\n中软\t220933\n华阳国际设计集团\t220934\n徐豆豆\t220935\n23.6英寸\t220936\n11月25日\t220937\n宏图大道\t220938\n傅献彩\t220939\n中铁上海工程局集团有限公司\t220940\n电动车商情网\t220941\n中学生阅读\t220942\n光轨\t220943\n王法\t220944\n广播电视大学\t220945\n恋色\t220946\n脚藓\t220947\n端口映射\t220948\n利群\t220949\nGALA\t220950\n撸_\t220951\n健脑\t220952\nFANTASY\t220953\n八位\t220954\n中共虞城县委\t220955\nWiFi信道\t220956\n虎牙直播\t220957\n十强县\t220958\njt\
t220959\n186万\t220960\n45本\t220961\nsw2017\t220962\nbilly\t220963\n陈庆之\t220964\n偏振光\t220965\n不寿\t220966\n田寮\t220967\n焦作市国土资源局\t220968\n去玩\t220969\n沧州友浩管道有限公司\t220970\n腾讯游戏助手\t220971\n弄懂了\t220972\n封神演义神魔传\t220973\n水遁\t220974\n电动轮椅车\t220975\n要点卡\t220976\n宁\t220977\n昂科雷\t220978\n脉\t220979\n优比\t220980\n灰袍\t220981\n小东门\t220982\n克罗库\t220983\n脚轮\t220984\n七小时\t220985\n叶子楣\t220986\ncurl库\t220987\n伪装浏览器\t220988\n48khz\t220989\n荔浦\t220990\n价优\t220991\n不怕人\t220992\n南京河海大学\t220993\n道炎\t220994\nOUR\t220995\n上千\t220996\n天际股份\t220997\n青浦镇\t220998\n映美620k\t220999\n群芳谱\t221000\n郭文海\t221001\n印良法师\t221002\n无创\t221003\nenvironmental\t221004\n桂林市\t221005\n透明屏\t221006\n方斌\t221007\n身权\t221008\n国家电网公司信息通信分公司\t221009\nE0\t221010\n神风\t221011\nupon\t221012\nMove\t221013\nR1C\t221014\n几亿\t221015\nendnotex7\t221016\n爱情万岁\t221017\n罗圣依\t221018\nmigrations\t221019\n水沙\t221020\n易门\t221021\nalarm\t221022\nqq音速\t221023\n热像仪\t221024\n六通\t221025\nJAXB\t221026\n盈利能力分析\t221027\nDecember\t221028\n鲐鱼\t221029\npaly\t221030\n德莉莎\t221031\n力邦\t221032\n800句\t221033\n山东省千佛山医院\t221034\n电动排烟窗\t221035\nFred\t221036\n扭距\t221037\ngtx950\t221038\nB座\t221039\nAsset\t221040\n小黄信\t221041\n君尚\t221042\n丽图\t221043\n127.0.0.1_\t221044\n清真饭店\t221045\n麸质\t221046\n平直\t221047\nCCAR\t221048\nPMX\t221049\n楼面\t221050\n莆仙网\t221051\n60安\t221052\ntablayout\t221053\n均方差\t221054\n缅币\t221055\nviki\t221056\n皇家马\t221057\n继峰股份\t221058\n登陆艇\t221059\n师徒恋\t221060\n假_\t221061\n昆卡\t221062\n战区\t221063\n燕然\t221064\n斯佳唯婷\t221065\n180403\t221066\n包皮龟头炎\t221067\npptv\t221068\n母亲们\t221069\n仿编\t221070\n朝阳公园\t221071\nMusica\t221072\n7.5代\t221073\n消协\t221074\n門\t221075\n身患\t221076\n醋酸钠\t221077\n赛诺菲\t221078\n伏牛山\t221079\n华行\t221080\n允熙儿\t221081\nDataSnap\t221082\n女大童\t221083\n澳牧\t221084\n钤\t221085\n婚约\t221086\nVolte\t221087\n崇光\t221088\n左倾\t221089\n鸡汤文\t221090\n乐考网\t221091\narithmetic\t221092\n坏账准备\t221093\njar格式\t221094\n彝州\t221095\n刘卫\t221096\n菠萝咕噜肉\t221097\n灌南在线\t221098\n王妈\t221099\n一论\t221100\n张伯苓\t221101\nchrono\t221102\n高地\t221103\nexim\t221104\n新规范\
t221105\n金水路\t221106\n极盗车神\t221107\nPCF\t221108\nyoufeng\t221109\n叶根友\t221110\nBRIDGE\t221111\n接警\t221112\n18cm\t221113\n吕锡文\t221114\n的哥\t221115\n帝们\t221116\n妇产医院\t221117\nNavicat11\t221118\ntransation\t221119\n杀格\t221120\n蒋兆和\t221121\nU段\t221122\n恒温恒湿机\t221123\n十九届\t221124\n中空玻璃\t221125\n推选\t221126\n飛\t221127\n点香\t221128\n男尊女卑\t221129\nz2pro\t221130\nWorkouts\t221131\n改版后\t221132\nGEO\t221133\n容积式换热器\t221134\nRocker\t221135\n厄贝沙坦氢氯噻嗪片\t221136\nSKY\t221137\n半条命2\t221138\n恰好\t221139\n欧诗漫\t221140\n张本智和\t221141\n沿帽\t221142\n电信光猫\t221143\n爱游\t221144\nLiga\t221145\n氮素\t221146\nunicode_ci\t221147\n瓜达尔港\t221148\n网元\t221149\n跟脚\t221150\n科森科技\t221151\n总结词\t221152\nPuma\t221153\n天子寻龙\t221154\n更衣室\t221155\nMcLaren\t221156\n灵芝孢子粉\t221157\n远去\t221158\n鄞\t221159\n封穴\t221160\n新王者\t221161\ntca\t221162\n马来语\t221163\n大宇\t221164\n自测\t221165\nfreefei\t221166\n763\t221167\n渔家乐\t221168\n251\t221169\n小米路由mini\t221170\n观赏\t221171\n2.6米\t221172\n预见未来\t221173\n手感\t221174\n濡鸦\t221175\n奢望\t221176\n仙岛\t221177\n狂帝百美缘\t221178\n20余万\t221179\npanduan\t221180\n模数转换器\t221181\n嘿siri\t221182\nrealestate\t221183\n金豆豆\t221184\n专用章\t221185\n精点\t221186\n副科长\t221187\navmo\t221188\nwinsorize\t221189\n江苏省国资委\t221190\n钢城\t221191\n梅子涵\t221192\n燕安居\t221193\n正位片\t221194\n速递\t221195\n赵若虹\t221196\n周半仙儿乱语\t221197\n航空公\t221198\n207年\t221199\n茶礼\t221200\n废行\t221201\n零用\t221202\n莫高义\t221203\n688号\t221204\n赣州房产网\t221205\n仓管员\t221206\nCommons\t221207\n菊花展\t221208\n东莞市教育局\t221209\nCOLORS\t221210\n大声说出来\t221211\nseasons\t221212\ndb块\t221213\n角度值\t221214\n吾妻\t221215\n义举\t221216\n402495948@qq\t221217\n808\t221218\n郭静\t221219\n三真\t221220\n悲伤恋歌\t221221\n景顺\t221222\n医学生学习网\t221223\n嘻\t221224\n赵宏\t221225\n玉泉山庄\t221226\n内联函数\t221227\nFLstudio\t221228\n融侨\t221229\n山东信息职业技术学院\t221230\nPedestrian\t221231\n想开发\t221232\n钢卷尺\t221233\n78动漫论坛\t221234\n12户\t221235\n响彻\t221236\nTodd\t221237\n极值点\t221238\n龙骑士\t221239\n万事兴\t221240\n中网站\t221241\n爱情线\t221242\n跑人环\t221243\n浩浩\t221244\n詹记\t221245\n杭州市西湖区政府网\t221246\n怪虫\t221247\n欺凌\t221248\nswift
4\t221249\n科技部火炬中心\t221250\nsourceTree\t221251\n赵文华\t221252\n趣票网\t221253\n每5秒\t221254\n比亚迪e6\t221255\n挖井\t221256\n双回路\t221257\n垟\t221258\n三叶\t221259\n拾色\t221260\n新能源汽车展\t221261\n吉吉影音高清\t221262\n跑友\t221263\n4t\t221264\n晋平公\t221265\n3damx\t221266\n广东广播电视台\t221267\nNo\t221268\n安然无恙\t221269\n担保书\t221270\n港邦物流\t221271\n千名\t221272\n桑苗\t221273\nsqljdbc4\t221274\n北洋水师\t221275\n三目运算符\t221276\n圣斗士吧\t221277\n桐庐县\t221278\n一呼百应\t221279\n赞片网\t221280\nPP板\t221281\n消渴丸\t221282\n谷歌游览器\t221283\nTraditional\t221284\n苯甲酸钠\t221285\n西安技师学院\t221286\n剧角\t221287\n客运站\t221288\n蓝毛\t221289\n30句\t221290\n某某\t221291\n醛固酮\t221292\n麦格理\t221293\n中国民主党派\t221294\n樱井彩\t221295\n350家\t221296\n1月5号\t221297\n第二人称\t221298\n津冀\t221299\nMBP\t221300\nA1660\t221301\n天津大学机械工程学院\t221302\n水西\t221303\nPager\t221304\nviscose\t221305\n两季\t221306\nL298N\t221307\n华洋\t221308\n三连发\t221309\n结直肠癌\t221310\n特命\t221311\nZee\t221312\n硬笔书法\t221313\n月影星劫\t221314\n一道\t221315\n陈木胜\t221316\n跑男来了\t221317\n宝利通\t221318\n故障率\t221319\n人欲\t221320\n20节\t221321\n娜塔莉波特曼\t221322\nTitle\t221323\n牛湖\t221324\n串号\t221325\n51模拟器\t221326\n贪杯\t221327\n化分\t221328\nEXCEL格式\t221329\n快啦网\t221330\nMYSQL数据库\t221331\n缪\t221332\nsignificance\t221333\n入场词\t221334\n滨河南路\t221335\n裁军\t221336\n赣州地区\t221337\n复用性\t221338\n马粥街\t221339\n六年级\t221340\n2.4.4\t221341\n收卷\t221342\n朝阳社区\t221343\n中空\t221344\n比达尔\t221345\n周年\t221346\nhor\t221347\n移位运算符\t221348\n韩语专业\t221349\n抗凝血酶\t221350\n低端\t221351\ndicom\t221352\n杨金海\t221353\nDebugLZQ\t221354\n中国娱乐网\t221355\n反传\t221356\n温州动车站\t221357\n中央大道\t221358\n重庆自考网\t221359\neba\t221360\n金陵中学\t221361\n水碾河\t221362\n肛泰栓\t221363\nsobo\t221364\nHDMI显示器\t221365\n安和桥北\t221366\nCATTI\t221367\n三人行\t221368\n双河乡\t221369\n枪火游侠\t221370\n万事如\t221371\n九堡镇\t221372\noverwrite\t221373\nweb监控\t221374\n上海县\t221375\n军产\t221376\n分章\t221377\n求偶\t221378\n新华镇\t221379\nJanet\t221380\n20160606\t221381\nTMT\t221382\n行尸走肉第八季\t221383\n汉硕\t221384\n380ml\t221385\n五十六朵\t221386\n中华苏维埃\t221387\npercent\t221388\n丽兹行\t221389\n合不上\t221390\n马奎尔\t221391\n急急忙忙\t221392\no
ptima\t221393\n小泉真希\t221394\n九酷\t221395\n风湿膏\t221396\n煤气瓶\t221397\n流外人田\t221398\nJone_chen\t221399\n第71章\t221400\n交游\t221401\n樱草\t221402\n海门\t221403\n申科\t221404\n6300\t221405\n金猴\t221406\n440c\t221407\n嵩阳\t221408\nwvr450g\t221409\n呼喊\t221410\n偏印\t221411\n白城之窗\t221412\n铁桥\t221413\n0.1.5\t221414\ndale\t221415\n遗踪\t221416\n益丰大药房\t221417\n大疆科技\t221418\n凯洛格\t221419\n索溪峪\t221420\nAmbient\t221421\npolybridge\t221422\n瑞盈\t221423\n天涯扒皮\t221424\n作人\t221425\n边坝县\t221426\n盘龙云海\t221427\n建政\t221428\n春汛\t221429\n模样\t221430\n漂粉精\t221431\n法桐\t221432\nnetherlands\t221433\n临浦镇\t221434\n阮义忠\t221435\n内德维德\t221436\n水洗棉\t221437\neleven\t221438\ngar\t221439\nBerhad\t221440\nInputS\t221441\nDraper\t221442\n流区\t221443\n就近\t221444\n夏泽网\t221445\nWM\t221446\n德企\t221447\n一血\t221448\nAsses\t221449\nFAQ\t221450\n4035\t221451\n三十级\t221452\n白马山\t221453\n沈音程彦\t221454\n税警\t221455\n线法\t221456\n纱窗\t221457\n道门\t221458\nwin10浏览器\t221459\n红三代\t221460\nsquares\t221461\n四川省总工会\t221462\n南越王墓\t221463\n辽宁体育馆\t221464\n意大利国家队\t221465\n倒戈\t221466\n2014年后\t221467\n堂课\t221468\n一险\t221469\na1533\t221470\n集翔网\t221471\n泸水市\t221472\n私服端\t221473\n虞政博\t221474\n骨玉\t221475\n怒赞\t221476\nnmp\t221477\n办公家\t221478\n禹贡\t221479\n笔批\t221480\n逐一\t221481\n曾国\t221482\n梅陇路\t221483\n护考\t221484\n名派\t221485\n2013年12月31日\t221486\n印刷体\t221487\n快快游戏\t221488\n出云\t221489\n重释\t221490\njgj\t221491\nMicroServer\t221492\n223.104\t221493\n鸭翅\t221494\nGentoo\t221495\n琼脂糖凝胶\t221496\n20141117\t221497\n建设期\t221498\n抗倍特板\t221499\n公告板\t221500\n两倍\t221501\n果然翁\t221502\n台座\t221503\nlvory\t221504\nppoe\t221505\n综合保税区\t221506\n广告奖\t221507\nmemcache\t221508\nTimo\t221509\n上海西站\t221510\n西藏自治区\t221511\n营业利润\t221512\nbluebird\t221513\n泸定\t221514\n三十六种\t221515\n爱之初体验\t221516\nandroid7\t221517\n利来蛋糕\t221518\n苏颖\t221519\n700g\t221520\nOffice助手_希赛网\t221521\nMerci\t221522\nMidi\t221523\n日立\t221524\n蚕丝\t221525\n各为\t221526\n素娜\t221527\n12.8万\t221528\n第五十五章\t221529\n奈萨里奥\t221530\nAvaya\t221531\n榴莲蜜\t221532\n32所\t221533\n憨山大师\t221534\n中国农资导报网\t221535\n2010-2011年度\t22153
6\n人间天\t221537\n2a\t221538\n无限流小说\t221539\n排数\t221540\n入骨\t221541\n棕榈蜡\t221542\n内存\t221543\n校通\t221544\n任意函数\t221545\n2D\t221546\ncathedral\t221547\n发空包\t221548\ninfix\t221549\n凤凰医疗\t221550\nWei\t221551\n三帆\t221552\n回水器\t221553\nlunasol\t221554\n太史慈\t221555\n磁盘阵列\t221556\n039期\t221557\n廊坊发展\t221558\n继承者们\t221559\n三十码\t221560\n山路\t221561\n草薙京\t221562\n金刀峡\t221563\nCorporate\t221564\nManage\t221565\n富士龙\t221566\n可想\t221567\n6.0.5\t221568\n聚思鸿\t221569\n瘦身操\t221570\n辽宁省人民政府\t221571\nerase\t221572\n绿色出行\t221573\nvisible\t221574\nMoore8\t221575\n2017—2019年\t221576\n姑妈\t221577\n头孢菌素\t221578\n途\t221579\nps_\t221580\n县委宣传部\t221581\n莲花路\t221582\n1951\t221583\n热血精灵派\t221584\n向右\t221585\n蒲\t221586\ntaptap\t221587\n100集\t221588\n聚异戊二烯\t221589\nalen\t221590\n栖霞山\t221591\n接手\t221592\n锋尚版\t221593\n深圳市丰巢科技有限公司\t221594\n言言\t221595\nCyxw\t221596\n轰隆\t221597\n300集\t221598\n湖北美术学院\t221599\n沈大\t221600\nWindows防火墙\t221601\n停更了\t221602\n绚彩\t221603\n哭出\t221604\nshadowrocket\t221605\n空窗期\t221606\n千方百计\t221607\nIMDS\t221608\n世界波\t221609\n贤\t221610\n嘉旺\t221611\niBFree\t221612\n幻域\t221613\nsesame\t221614\n中国计量院\t221615\n上海东华大学\t221616\n预兆\t221617\nuie\t221618\n省军区\t221619\n威尼斯电影节\t221620\n丝牙\t221621\nsot\t221622\nWeb应用\t221623\n旅游学院\t221624\nlis\t221625\n刘志仁\t221626\n急用\t221627\n洁净室\t221628\n七伤\t221629\nnandflash\t221630\n鸳鸯溪\t221631\ncst\t221632\n南方保险网\t221633\nAladd设计量贩铺\t221634\n西塞山\t221635\n米\t221636\n中国科学院长春应用化学研究所\t221637\n广西法院\t221638\n标准体\t221639\n零代码\t221640\nlindberg\t221641\n8张\t221642\n脱掉\t221643\n120寸\t221644\n创业群\t221645\n360压缩\t221646\n济南南\t221647\n卡罗拉2017\t221648\nBYE\t221649\n碘海醇\t221650\n提灯\t221651\n人居\t221652\n祖\t221653\n洛阳伽蓝记\t221654\n豹式\t221655\n药害\t221656\n东土\t221657\n河中石兽\t221658\n陪你\t221659\n航空小镇\t221660\nWafer\t221661\n劳拉\t221662\n柜箱\t221663\n狗猫\t221664\nfragstats\t221665\nlustre\t221666\n糖酐\t221667\njane\t221668\n古城乡\t221669\n在库言库\t221670\n机器学习研究会\t221671\nBlueSoleil\t221672\n普制\t221673\n影剧\t221674\n邪恶帝\t221675\n食戟之灵漫画\t221676\n污染\t221677\n磨刀石\t221678\nK歌\t221679\n新思\t
221680\n姦\t221681\nJiaotong\t221682\n西康路\t221683\n苏民网\t221684\n伙食费\t221685\ncfx\t221686\n长身\t221687\n青山\t221688\nyogabook\t221689\n王一帆\t221690\n医女\t221691\n喷油机\t221692\noracle11\t221693\nX档案\t221694\n泰卦\t221695\n6平方米\t221696\n隐藏单元格\t221697\n华园\t221698\n海安教育信息网\t221699\n访问端\t221700\n波士顿学院\t221701\n知呱呱知道网\t221702\n220元\t221703\nParameter\t221704\n钛锅\t221705\n银尘\t221706\n派上用场\t221707\n000100\t221708\n兢兢\t221709\nijk\t221710\n雪莱\t221711\n咳特灵\t221712\n窃听器\t221713\nPS液化工具\t221714\nImToken\t221715\n附上\t221716\n第1框\t221717\nNAT穿透\t221718\n统领\t221719\nlodge\t221720\n开杀\t221721\n称重传感器\t221722\n胚盘\t221723\n孔浩\t221724\n为你而生\t221725\n占豪\t221726\n四川省人民检察院\t221727\n齐肩发\t221728\nskia\t221729\n休闲伦巴\t221730\n吉利远景X1\t221731\n演艺公司\t221732\n小飞机\t221733\n出逼\t221734\nopenfiledialog\t221735\nCandidate\t221736\n百白破\t221737\nMessage\t221738\n精鹰\t221739\nblackboard\t221740\n核发\t221741\n引拍\t221742\n福成\t221743\n0502\t221744\nAlbums\t221745\n合欢树\t221746\n卷\t221747\n小风\t221748\n王立华\t221749\n26度\t221750\n绵阳百姓网\t221751\n公孙衍\t221752\n东南大学自动化学院\t221753\n大亚圣象\t221754\ny\t221755\n帝国时代2\t221756\n恒山路\t221757\ntishi\t221758\n住\t221759\n元胞\t221760\nmacbookp\t221761\nloadtxt\t221762\n两万亿\t221763\n五十条\t221764\n音乐网\t221765\n左乳\t221766\nr61\t221767\n6部\t221768\n花滑\t221769\n痢疾\t221770\n小八义\t221771\n桑德拉·布洛克\t221772\n天一\t221773\n炼武\t221774\n食杂\t221775\ni7-7500u\t221776\n蜜丝\t221777\n供给侧改革\t221778\n朗读者\t221779\n尚源\t221780\n陆慷\t221781\n船用\t221782\nPixie\t221783\n香精油\t221784\n住家\t221785\nN+1\t221786\nFLVCD\t221787\nFQ\t221788\n铜锌合金\t221789\n青田县\t221790\n吓唬\t221791\n慈星\t221792\n树德中学\t221793\n新奇遇\t221794\n酒仙\t221795\nCRUD\t221796\nG1.Sniper\t221797\n遏制\t221798\nexcel格\t221799\n余压阀\t221800\ntabel\t221801\n物性参数\t221802\n上议院\t221803\n富庶\t221804\n星文\t221805\n实数集\t221806\nPOL\t221807\n卫勤\t221808\nclasse\t221809\n沽空\t221810\n胡亥\t221811\nWeiPhone\t221812\n莆田市公安局\t221813\n30hz\t221814\n春喜\t221815\n中公教育\t221816\n回力\t221817\n大夫说\t221818\nJavadoc\t221819\n轮毂轴承\t221820\n剑姬\t221821\n公信宝\t221822\n发酵\t221823\n生均\t221824\n魂魂\t221825\n
路桥招聘网\t221826\n2018.4.18\t221827\n保护伞公司\t221828\n北辰购物中心\t221829\n金山桥\t221830\n逆流\t221831\n天青\t221832\n风险性\t221833\n柯莱特\t221834\n油门\t221835\n张先生\t221836\nautolayout\t221837\ngay男\t221838\nDawson\t221839\n快三\t221840\n史蒂文斯\t221841\n日历天\t221842\n龙虎塘\t221843\nstephen\t221844\n尘封\t221845\n散文吧作文网\t221846\n优乐美奶茶\t221847\n兴道\t221848\n第163集\t221849\n鸭季\t221850\nhaydee\t221851\nshinelon\t221852\nMRP\t221853\n令第\t221854\n森兰\t221855\n广州男科医院\t221856\n战锤鼠疫2\t221857\n上海国资委\t221858\n云栖\t221859\nopenGL\t221860\n来乍到\t221861\ndoc-CSDN\t221862\n斩将\t221863\n增长期\t221864\n投融资\t221865\n迎新网\t221866\n杨志\t221867\n张起淮\t221868\n中国经济信息社\t221869\n导学案\t221870\n李公朴\t221871\n鑫茂科技\t221872\npermit\t221873\n雀鹰\t221874\nmingxing\t221875\n慕吱\t221876\n豪威科技\t221877\n女神篇\t221878\n金属所\t221879\ncrnn\t221880\n阿多尼斯\t221881\n哪方面\t221882\n玻珠\t221883\n中华诗词学会\t221884\n华为荣耀路由Pro\t221885\n剑网三95小橙武\t221886\n周水子机场\t221887\n常州科教城\t221888\n分拣\t221889\n宫野明美\t221890\n任城区\t221891\n雪里红\t221892\n元山晴香\t221893\n吴静\t221894\n环城\t221895\n广渠门内大街\t221896\n一居室\t221897\n两时间\t221898\n苏州海洋馆\t221899\n建筑师\t221900\nCR2032\t221901\n净瓶\t221902\n大牛\t221903\n黄金麻\t221904\n宅师\t221905\n阀体\t221906\nz840\t221907\n移动彩云\t221908\nthinkbaby\t221909\n转非\t221910\n河池先锋网\t221911\n千篇一律\t221912\n梦幻西\t221913\nelementUI\t221914\n主贷人\t221915\n怪癖\t221916\nvmall\t221917\nadobe吧_\t221918\nseq2seq\t221919\n600597\t221920\n集输\t221921\n棒棒哒\t221922\n穆桂英大破天门阵\t221923\n灼伤\t221924\n唯物辩证法\t221925\n牛津纺\t221926\n牟\t221927\n缪娟\t221928\n信访\t221929\nwww.av3.cc/\t221930\n冰眼\t221931\n18式\t221932\n外国语学院\t221933\n车之鉴\t221934\n长安CS55论坛_汽车之家论坛\t221935\n陈一丹\t221936\nIphone6s\t221937\nVIP浏览器\t221938\n蓝绿\t221939\nV2X\t221940\n大祭\t221941\nModification\t221942\n孟刚\t221943\n新白云国际机场\t221944\n经合组织\t221945\nwwwwww\t221946\n一孩\t221947\n单元音\t221948\nSubscribe\t221949\n仓壁振动器\t221950\n路子\t221951\n10000件\t221952\n膨珊瑚\t221953\n魔弹王\t221954\n赋予\t221955\n大森\t221956\n昆明便民网\t221957\n东街口\t221958\n挥棒\t221959\n纹眉\t221960\n1.49亿人次\t221961\n汇流板\t221962\n高思\t221963\n框框\t221964\n浦西\t221965\n滴滴车\t221966\n雨期\t22
1967\n佣兵天下\t221968\n集资\t221969\n李凤\t221970\nZIZ\t221971\n建设用地使用权\t221972\n魂武者\t221973\n大班\t221974\n线程安全\t221975\n三道堰镇\t221976\n陈晓卿\t221977\n石黑\t221978\nshopbop\t221979\n1920x1080_\t221980\n扩展名\t221981\n4.3号\t221982\n可见分光光度法\t221983\n芭蕾舞团\t221984\n穷奇\t221985\nfxy\t221986\n抢话\t221987\n仿色\t221988\n太息\t221989\n临平山北\t221990\n中国青年\t221991\n530i\t221992\n列示\t221993\n南京航空航天大学金城学院\t221994\nSW6\t221995\n长安悦翔\t221996\n人类星球\t221997\n修罗之瞳\t221998\n赤壁赋\t221999\nvivox21\t222000\n哇噻网\t222001\nTimestamp\t222002\n1.3万亿\t222003\nTera\t222004\nEnterprises\t222005\n肉步\t222006\nCooper\t222007\n天峦湖\t222008\n胡兰成\t222009\n3.6m\t222010\n拐爷\t222011\n12j201\t222012\n游戏厅\t222013\n金利镇\t222014\n刺客信条3吧\t222015\n长量\t222016\n海峰\t222017\n交感\t222018\n战神\t222019\n实操性\t222020\n阿布罗狄\t222021\n喜阳\t222022\n四川大学化学学院\t222023\nrediscluster\t222024\n桔子水晶酒店\t222025\nObjective\t222026\n国密\t222027\nmatalab\t222028\nmk14\t222029\nAWARD\t222030\n杨建军\t222031\n快快乐乐\t222032\n韩妃\t222033\n圆生菜\t222034\n杨中华\t222035\n淮坊\t222036\n水晶玻璃\t222037\naaa\t222038\n泽野弘之\t222039\n比基尼勇士\t222040\n杂酱面\t222041\n黄热病\t222042\n三亚湾\t222043\n2018年4月2日起\t222044\n面色\t222045\n养老\t222046\n长春师范学院\t222047\n恋与制作人\t222048\n姑苏人才网\t222049\n店群\t222050\n芒果网\t222051\n审核表\t222052\n3.6.0\t222053\n飞散\t222054\n70.1\t222055\n中潭路\t222056\n惠威m200mkiii\t222057\nnaka\t222058\n广州市旅游商务职业学校\t222059\n升官\t222060\n2017年2月1日\t222061\n误认为\t222062\n翁源县\t222063\n比勒\t222064\n绿色征途\t222065\n四川大学经济学院\t222066\n横款\t222067\n毕业文\t222068\ngeneralized\t222069\n纳木错湖\t222070\n公安部交管局\t222071\n四边\t222072\nIncluded\t222073\n对白\t222074\n卜冠今\t222075\n突起\t222076\n4.3日\t222077\nandroid模拟器\t222078\nextensive\t222079\n豫发\t222080\n态\t222081\nE1-471G\t222082\n补丁包\t222083\n吴雄志\t222084\n褥垫\t222085\n实价\t222086\n轻衣\t222087\n写上\t222088\n工商卡\t222089\nMemcached\t222090\n五公分\t222091\nw540\t222092\n家具类\t222093\n加盟费\t222094\n心脏瓣膜病\t222095\n静州\t222096\n情急\t222097\n瑞文网\t222098\n510\t222099\n甬政发\t222100\n崇山\t222101\n差平方\t222102\nolt\t222103\n负载均衡_\t222104\n回事\t222105\nKitchenAid\t222106\n第十\t222107\n程十发\t22210
8\n穿搭_小红书\t222109\nx2/a2+y2/b2=1\t222110\n省内外\t222111\n东溪镇\t222112\n似水\t222113\n陕西省国土资源厅\t222114\nDKP\t222115\n奈美\t222116\n可矣\t222117\n安家补贴\t222118\n微小宝公众号助手\t222119\n江苏省高级人民法院\t222120\n白文鸟\t222121\n茶锈\t222122\n14.04.4\t222123\n接触镜\t222124\n迦南\t222125\n特约险\t222126\n泰祺\t222127\n垃圾桶\t222128\nHistone\t222129\n怒风\t222130\n三月三十日\t222131\n杨俊峰\t222132\n新安镇\t222133\n红石峡\t222134\n猛禽\t222135\npccp\t222136\n留购\t222137\nPetaPoco\t222138\n明州医院\t222139\n永生劫\t222140\n皮杰\t222141\n流通量\t222142\n百应\t222143\n关注者\t222144\nニ\t222145\nresearcher\t222146\n义煤\t222147\n家电脑\t222148\n深处\t222149\n圣典百科\t222150\n摩尔斯\t222151\n中铁二十二局\t222152\n储能器\t222153\nC275\t222154\n大计\t222155\n平乡\t222156\n特产馆\t222157\n雀舌\t222158\n12月6日\t222159\n上海仁济医院南院\t222160\n鱼骨图\t222161\n阎崇年\t222162\n4月29日\t222163\n换攻\t222164\nUnGeek\t222165\n安徽四创电子股份有限公司\t222166\n45万\t222167\n市科协\t222168\n泸州医学院\t222169\n手拉车\t222170\n三史\t222171\n仲裁时效\t222172\n稞麦综合视频站下载器\t222173\n再\t222174\nRockchip\t222175\n大主教\t222176\n法拍\t222177\n特写\t222178\n大西洋底\t222179\n立普妥\t222180\nLAG\t222181\n排土场\t222182\npolyethylene\t222183\n1973\t222184\n979950\t222185\n饮马河\t222186\n线香\t222187\n信用征信知识问答\t222188\n菲律\t222189\n贵阳日报\t222190\nKeller\t222191\nyuce\t222192\n一扇\t222193\nBlot\t222194\n爱威\t222195\nhero6\t222196\n第2季度\t222197\n抖音\t222198\nShop\t222199\n逃债\t222200\n超声波清洗器\t222201\n赛格电子\t222202\n启幕\t222203\n励耘\t222204\n工程经济\t222205\n职业教育网\t222206\n农网\t222207\nwwyz\t222208\ngoal\t222209\n颈管\t222210\n毛文龙\t222211\n枭神\t222212\n徐颖\t222213\n明天系\t222214\n中信重工机械股份有限公司\t222215\n早晨\t222216\n黑狼\t222217\n难求\t222218\n温漂\t222219\n卓集送货运版\t222220\ncomparisons\t222221\n古斯塔夫·勒庞\t222222\n田螺\t222223\nmbc\t222224\nPres\t222225\nDb\t222226\n海峡都市报闽南版数字报\t222227\n龙广\t222228\n膜式燃气表\t222229\ngreenvpn\t222230\n食物垃圾处理器\t222231\njvm\t222232\n杏坛镇\t222233\n礼拜六\t222234\n腱子肉\t222235\n阜丰集团\t222236\n1475\t222237\n卡洛塔妮\t222238\n美金\t222239\n雷佳音\t222240\n34家\t222241\n巨石阵\t222242\n摩托化\t222243\n5.31\t222244\n纽仕兰\t222245\nmac地址表\t222246\n广东省住房和城乡建设厅\t222247\n肝健康网\t222248\npornhub\t222249\n章页\t222250\
n价值链分析\t222251\ntio\t222252\nKoei\t222253\nMovavi\t222254\n10696\t222255\n总规\t222256\n非会计\t222257\n一半\t222258\nwinsocket\t222259\n狂暴世纪浩劫\t222260\n渭北\t222261\n物理防晒\t222262\n凉快\t222263\n100年后\t222264\nTX2\t222265\n有道云笔记markdown\t222266\n4058\t222267\n鞋头\t222268\n世博广场\t222269\n华岩寺\t222270\n新西兰大学\t222271\n大伯哥\t222272\n林子\t222273\n第四十六\t222274\nokcard\t222275\nWEB认证\t222276\n十二之天贰\t222277\n渠县\t222278\n葡萄籽油\t222279\n内蒙古电力\t222280\n型膜\t222281\n陶朱公\t222282\n威锋源-威锋网\t222283\n航津路\t222284\nsolidworks2015\t222285\nuo\t222286\n炉膛\t222287\nAccounting\t222288\nkeytruda\t222289\naar\t222290\n折抵\t222291\n音圈\t222292\nDOM\t222293\nglfw\t222294\n对称矩阵\t222295\n文荒\t222296\nservice层\t222297\n一招制敌\t222298\n十二生肖性格_祥安阁风水网\t222299\n山狗\t222300\n驭胜S350\t222301\n起痘\t222302\n弧光\t222303\nwww.ccc\t222304\n菲律宾\t222305\n户田惠梨香\t222306\n里仁学院\t222307\nね\t222308\n徐昊\t222309\nziran\t222310\n中考分数线\t222311\n中国联通公司\t222312\n吴宇森\t222313\n途物资\t222314\n翻屏\t222315\n视觉化\t222316\n九阳神功\t222317\n深圳私立小学\t222318\n伍家岗\t222319\n9.5折\t222320\n聚氨酯弹\t222321\n银溪墅府\t222322\n二叉排序树\t222323\n八幼\t222324\n张可\t222325\nDMX\t222326\n微信群发\t222327\n5DSR\t222328\n66.0.3359.117\t222329\n虐片\t222330\n10400\t222331\ncp105b\t222332\n收留\t222333\nOauth\t222334\nXXX\t222335\n抱恙\t222336\nCentrifugal\t222337\nmethylation\t222338\n37.9\t222339\n拣选\t222340\n爱牙日\t222341\n北京同仁堂中医医院\t222342\n功守道\t222343\n屠苏\t222344\nImagination\t222345\n8平方米\t222346\n2356\t222347\n咪唑类\t222348\n企业发展有限公司\t222349\n借书\t222350\n经营分析\t222351\ngovendor\t222352\nShaunChen\t222353\n11级\t222354\n李琳琳\t222355\n默剧\t222356\n蒲州\t222357\nDOTA217173\t222358\n针刺\t222359\n资本论\t222360\n16枚\t222361\n首钢京唐公司\t222362\n瑞森\t222363\nKose\t222364\n忠烈祠\t222365\n僕\t222366\n深圳市博雅成信金融服务有限公司\t222367\n张谷\t222368\n无可替代\t222369\n20150319\t222370\n电烤炉\t222371\n荷都网\t222372\n十辆\t222373\n异军突起\t222374\n应用文\t222375\n枪刺\t222376\n刘多\t222377\n小主\t222378\njava泛型\t222379\nleung\t222380\nsession\t222381\n宁南县\t222382\n汪晨蕊\t222383\n#6\t222384\n二甲硅油\t222385\n黄思婷\t222386\n大光明\t222387\n昆明移动\t222388\n美的厨电\t222389\ngiza\
t222390\n探视\t222391\n刻划\t222392\n看不过\t222393\n茶子\t222394\n飞行团\t222395\n砍柴网\t222396\n二路\t222397\n丁俊秦晓\t222398\n钢线\t222399\n浙江省消防\t222400\nyjyun高清\t222401\n3关\t222402\n区城市管理局\t222403\nhiv病毒\t222404\n跳关\t222405\n长丰村\t222406\n重庆招聘网\t222407\nranyonsue\t222408\n很纯末日轮盘\t222409\n吉林工业大学\t222410\n一第一章\t222411\n爱睿希\t222412\nv3.0.4\t222413\n踱步\t222414\n班会\t222415\n碧海银沙\t222416\n张志峰\t222417\n并联电抗器\t222418\nLLL\t222419\n云卷云舒\t222420\n阮阮\t222421\noppoa83\t222422\n南宁职业技术学院\t222423\nAMZ\t222424\nLilian\t222425\nWget\t222426\n参道\t222427\n言归正传\t222428\n配置化\t222429\n2挡\t222430\n飞享\t222431\n违者\t222432\n自来水厂\t222433\n缮制\t222434\n以和为贵\t222435\n990元\t222436\n善领\t222437\n心手相牵\t222438\n炉管\t222439\n热质\t222440\n姓名测试\t222441\n踏步机\t222442\n成工\t222443\n水稻\t222444\n巴塔木\t222445\nOOM\t222446\n王泽\t222447\n厨师\t222448\n善书\t222449\n血污\t222450\nVMI\t222451\n杨女士\t222452\ncm101s-2\t222453\n乌克丽丽\t222454\n李巍\t222455\n孟加拉语\t222456\n公子衍\t222457\n沈大高速\t222458\nSensory\t222459\n种植箱\t222460\n小老\t222461\n中文学习网\t222462\nsignalr\t222463\nuv打印机\t222464\n传媒学院\t222465\n元大都\t222466\n收派员\t222467\nPpt\t222468\n陕西卫视\t222469\n沃利贝尔\t222470\n中国海军陆战队\t222471\n瀚文\t222472\n创盈\t222473\nbada\t222474\n马军\t222475\n教者\t222476\n魔岩石\t222477\n安徽人大\t222478\n张叶\t222479\n本堂\t222480\n杀虫药\t222481\n蕉叶\t222482\n十九章\t222483\n中国少年先锋队队歌\t222484\neverlasting\t222485\npc模拟器\t222486\n115个\t222487\n监事人\t222488\n狂言\t222489\nGlobally\t222490\n商丘古城\t222491\n续保\t222492\n_神马影院\t222493\n2017年11月3日\t222494\n季克良\t222495\nMqtt\t222496\nnox\t222497\n黄光熙\t222498\n咏唱\t222499\n徐家汇公园\t222500\n白绝\t222501\n腰板\t222502\n10队\t222503\nFatigue\t222504\n佛本傲剑凌云\t222505\n朋友说\t222506\n爱居兔\t222507\nmanuel\t222508\n扁桃体\t222509\nXeLaTeX\t222510\n涓\t222511\n前代\t222512\n战斧牛排\t222513\n大年初一\t222514\n陈安宁\t222515\n暗算\t222516\n文韵\t222517\n小斯\t222518\n香雪海\t222519\nPjTime\t222520\n80000000\t222521\n永恒战士4\t222522\nnetty4\t222523\n利国利民\t222524\n杨秀\t222525\n9v2a\t222526\n自缢身亡\t222527\n净现值法\t222528\nwunder\t222529\n兽王争锋\t222530\n甚于\t222531\n东直门中学\t222532\n临空经济区\t222533\n第67条\t222534\n大话西游之爱
你一万年\t222535\n泄洪\t222536\n状图\t222537\nIterm2\t222538\n三畅\t222539\nReunion\t222540\n永磁无刷直流电机\t222541\n专业技术人员继续教育网\t222542\n程控电话交换机\t222543\n四嫁\t222544\nulipad\t222545\n第二周\t222546\n美国西南航空公司\t222547\n六星\t222548\n王夫人\t222549\noppoa53\t222550\n回龙村\t222551\n张泽\t222552\n圣保罗教堂\t222553\n发行量\t222554\n卡通熊\t222555\n广西河池\t222556\nwikia\t222557\n无尽神域\t222558\nisaac\t222559\n飞视\t222560\n福安市政府\t222561\n腾格里沙漠\t222562\n减肥期\t222563\n9.1.0\t222564\n堵漏剂\t222565\n耐玩\t222566\n建滔\t222567\n社区卫生院\t222568\n甲硝锉\t222569\n服务员\t222570\n龌蹉\t222571\n里奥\t222572\n黄东\t222573\n西安大学\t222574\n睑板腺\t222575\n芦丹氏\t222576\nEXPMA\t222577\ndiany\t222578\n梁端\t222579\n青年公园\t222580\nRx\t222581\n苹果7home键\t222582\n7718\t222583\n离婚案\t222584\nhero4\t222585\n深层次\t222586\nLocal\t222587\n所以说\t222588\n本人\t222589\n生门\t222590\n300多分\t222591\n语文科\t222592\nCodis\t222593\n心肌梗塞\t222594\n微信文\t222595\n0404\t222596\n160公里\t222597\n刀杆\t222598\n山东博物馆\t222599\n横溪\t222600\n41号\t222601\npartner\t222602\n2017十九\t222603\n人事\t222604\n240G\t222605\n码头镇\t222606\n宝马525Li\t222607\n野鸡大学\t222608\ndaima\t222609\n酰胺类\t222610\nclad\t222611\n多措\t222612\n用开\t222613\n第二十三期\t222614\n【努比亚\t222615\n美食展\t222616\n像册\t222617\n大宋牡丹亭\t222618\n王洪光\t222619\n嗜\t222620\n德清房产网\t222621\n生化危机浣熊市\t222622\n诺手\t222623\n种值\t222624\n黄曲霉毒素B1\t222625\n黄怒波\t222626\n萨德\t222627\n孕育\t222628\nmario\t222629\n冼\t222630\n东华凤九吧\t222631\n冻存盒\t222632\n夜郎国\t222633\n河南财经学院\t222634\n班台\t222635\nTestNg\t222636\n第6位\t222637\n生鲜食品\t222638\nG41\t222639\n廖英强\t222640\nHourglass\t222641\n莱克斯\t222642\nC3300\t222643\n20170721\t222644\n秋月爱莉\t222645\nLED日行灯\t222646\n插筋\t222647\n解盘\t222648\n2008年6月\t222649\n休息权\t222650\n短期工\t222651\n欧阳平\t222652\n同位素标记法\t222653\nWins\t222654\nzip压缩\t222655\n现图\t222656\n紫外激光器\t222657\n弱网\t222658\n疏狂\t222659\nxia_sir\t222660\n2017-10-23\t222661\n减\t222662\n撒欢儿\t222663\n在公园\t222664\n红棉\t222665\n临检\t222666\n第一更\t222667\n30年代\t222668\nVRV\t222669\n上合组织\t222670\n522\t222671\n采挖\t222672\n港鐵\t222673\n3遍\t222674\n伊丽莎白\t222675\n海面\t222676\n歇后语\t222677\n粮站\t222678\n中医儿科学\t22
2679\n清境农场\t222680\n荒冢\t222681\n盐酸多西环素片\t222682\ncili\t222683\n珠江新城\t222684\ndbunit\t222685\n一维码\t222686\n娄烦县\t222687\n长脂肪瘤\t222688\nstrophe\t222689\nmile\t222690\n传世无双\t222691\n五羟色胺\t222692\n3条\t222693\nTestLink\t222694\n差别\t222695\n季羡林\t222696\n皆\t222697\n普特\t222698\n自私的基因\t222699\n商周青铜器\t222700\n蠕虫\t222701\n喋血孤城\t222702\n日本女排\t222703\n根源\t222704\n爱要怎么说出口\t222705\n开红酒店\t222706\n季期\t222707\nwindriver\t222708\n威海银行\t222709\n夏之星\t222710\nPES2018\t222711\ntwelve\t222712\n双江镇\t222713\nanroid\t222714\n值班费\t222715\n樱理惠\t222716\n助行\t222717\n先睹为快\t222718\n盒子\t222719\n巴黎铁塔\t222720\n一位\t222721\n王旭\t222722\nunchecked\t222723\n阴毛\t222724\n漂浮物\t222725\n辽中县\t222726\n造化之门\t222727\n发鬼\t222728\n清朝时期\t222729\n百度坐标\t222730\npapersee\t222731\n烟雾传感器\t222732\n和平\t222733\n北京中医药大学\t222734\nbeat\t222735\n神射\t222736\n质数\t222737\nGalactic\t222738\n济南幼儿园\t222739\ncontracting\t222740\n咖喱牛肉饭\t222741\n平博\t222742\n中国邮政储蓄\t222743\n围场县\t222744\nFitzgerald\t222745\n产业部\t222746\n电话本\t222747\n20141129\t222748\n力\t222749\n新华三集团\t222750\n税收征管\t222751\n春来\t222752\n治省\t222753\nbd1080p高清\t222754\n旧情\t222755\n修罗武神\t222756\nCR2\t222757\n丰产\t222758\n跨平台\t222759\ntrack\t222760\n罗维\t222761\n萨尔茨堡\t222762\ninspired\t222763\n20180423\t222764\n水灰\t222765\n2014—2018年\t222766\n轴号\t222767\n校友会\t222768\n近者\t222769\nhomeassistant\t222770\nフル\t222771\n中科特膳\t222772\nAngul\t222773\n记叙\t222774\n林边\t222775\n古木\t222776\n轻飘飘\t222777\n西餐\t222778\nadc0809\t222779\n受虐狂\t222780\n铁扇公主\t222781\nallocator\t222782\n煤油炉\t222783\n暖\t222784\n斋菜\t222785\n慢悠悠\t222786\n淘品牌\t222787\n继发性肺结核\t222788\n换向阀\t222789\n按值\t222790\n能量棒\t222791\n重庆北站\t222792\n引江\t222793\n太阳帽\t222794\n错愕\t222795\n陈娟\t222796\n泉州泉港区\t222797\n张可儿\t222798\n中智德\t222799\n59.9\t222800\nservlet3.0\t222801\n霉斑\t222802\n20170702\t222803\n锤石\t222804\n凯美纳\t222805\n护唇\t222806\n孪晶\t222807\nslq\t222808\n蓝狗\t222809\nSigmaPlot\t222810\n17643\t222811\namazon\t222812\n加\t222813\n移相器\t222814\n指正\t222815\n泰安市政府\t222816\nBlingee\t222817\n肺内\t222818\n涨速\t222819\n1920x1080\t222820\n丁玲\t222821
\nordering\t222822\n金萱\t222823\n武道至尊\t222824\n霍比特人2:史矛革之战\t222825\n大众途锐\t222826\n肠痉挛\t222827\n塔公\t222828\n四回\t222829\nLauncher\t222830\n2018月\t222831\n脓疮\t222832\n沱江镇\t222833\n埃克塞特\t222834\nWatts\t222835\n20151201\t222836\nwifi密码查看器\t222837\n水猪\t222838\n浪潮之巅\t222839\n大马猴\t222840\n颅底\t222841\nLAUDER\t222842\n少帝\t222843\n商米\t222844\n20170520\t222845\nChemical\t222846\n3290\t222847\n莽荒记\t222848\n屋面工程技术规范\t222849\n表报\t222850\nMagnesium\t222851\n知己知彼百战不殆\t222852\n中汇广场\t222853\n彩银\t222854\nV6.9\t222855\n朱瑞\t222856\n步进驱动器\t222857\nA卷\t222858\n活检\t222859\nDust\t222860\n务别\t222861\n掀起你的盖头来\t222862\n入党\t222863\n第84期\t222864\n痛定思痛\t222865\n转动惯量\t222866\n黄益平\t222867\n南京长江\t222868\n高姓\t222869\n高灵\t222870\n越野测试\t222871\n国球\t222872\n华律网\t222873\n0350\t222874\n加门\t222875\n斜行\t222876\n五百年后\t222877\n東凛\t222878\n鳗鲡\t222879\n前哨\t222880\n棚内\t222881\n华尔街之狼\t222882\nPowerPoint2003\t222883\n掉段\t222884\nWh\t222885\n独具一格\t222886\n招商银行手机银行\t222887\n期望理论\t222888\n小小孩\t222889\n靓声\t222890\n咖妃\t222891\n山东大学外国语学院\t222892\n取纸\t222893\n护腿板\t222894\n兰山\t222895\n邪情\t222896\n潍坊电视台\t222897\ncomfirm\t222898\n工业展\t222899\nreader\t222900\n清港\t222901\n惜福镇\t222902\n薄情\t222903\n吏民\t222904\n中新网_中国新闻网\t222905\nHafiz\t222906\n1921年\t222907\n24周岁\t222908\n浙江\t222909\n黄庭坚\t222910\nstrings\t222911\n拍客\t222912\n蜘蛛车\t222913\n宇智波斑\t222914\n靡情\t222915\n我馆\t222916\n磷叶石\t222917\n前十\t222918\nDECCA\t222919\nVengeance\t222920\n女豹\t222921\n三星镇\t222922\n环保型\t222923\n诸葛古镇\t222924\n三星C9\t222925\n开发版\t222926\n尾田荣一郎\t222927\n赤根京\t222928\nformid\t222929\n2790\t222930\njava9\t222931\n5毫升\t222932\n法隆寺\t222933\n新聞懶人包\t222934\nTeller\t222935\n20161006\t222936\n宋平\t222937\n共和党\t222938\n本哈德·施林克\t222939\n查龙寺\t222940\n页眉\t222941\n套条\t222942\n13-28周\t222943\n170430\t222944\n斗牌传说\t222945\ngether\t222946\n变分\t222947\n三十家\t222948\nmeila\t222949\n16级\t222950\n通花\t222951\n化工产品\t222952\nchevron\t222953\n虚空之女\t222954\n米人\t222955\n图标集\t222956\n李睿\t222957\n_淘\t222958\n下届\t222959\n伤官\t222960\n干瞪眼\t222961\n木函\t222962\n菲洛嘉\t222963\n亚克力板\t222964\n欲求王\t
222965\n大众摄影网\t222966\n宝黛\t222967\n长隆企鹅酒店\t222968\n实验\t222969\n相声界\t222970\n第几卷\t222971\n尤特娜\t222972\n未定\t222973\n斗智斗勇\t222974\n新考\t222975\n第十四批\t222976\n3147\t222977\n7.5.0\t222978\n辣度\t222979\n航海王ol\t222980\naudacity\t222981\n尖锐\t222982\nsofm\t222983\n3行\t222984\nitell\t222985\nAO\t222986\n陶姓\t222987\n针织布\t222988\n惠生工程\t222989\nBANK\t222990\n若人\t222991\n水牛奶\t222992\n条\t222993\n三百首\t222994\n花骨朵\t222995\nhongse\t222996\n反跳\t222997\nNylons\t222998\n血汗钱\t222999\n擒狼\t223000\n重生三国\t223001\n容桂客运站\t223002\n松竹\t223003\n上海枫叶国际学校\t223004\n周学伟\t223005\n拉莫三嗪片\t223006\n木壳\t223007\n犯事\t223008\n实生\t223009\n车用尿素\t223010\n施莱德\t223011\nqcc\t223012\n双阳路\t223013\n过滤网\t223014\nAXURE\t223015\nled电子显示屏\t223016\n王大妈\t223017\n隐形袜\t223018\n龙池山\t223019\nnananana\t223020\n仓库盘\t223021\nPicasso\t223022\n日向小次郎\t223023\n啐\t223024\n深圳北高铁站\t223025\n身体健康\t223026\n控制室\t223027\n刘智勇\t223028\n西岛\t223029\n席地而坐\t223030\n荣盛御湖观邸\t223031\n孔\t223032\n雪兰\t223033\n吸污机\t223034\n飞行幼乐园\t223035\n战国兰斯\t223036\n3308\t223037\n历史教学\t223038\n苏价费\t223039\n魔界\t223040\n95w\t223041\n赵启平\t223042\n美城\t223043\nDeque\t223044\n崇启大桥\t223045\nASD\t223046\n商务部办公厅\t223047\nbanian\t223048\nLOLS\t223049\n掌阅小说网\t223050\n喷饭\t223051\n溶豆\t223052\n青岩古镇\t223053\ndollar\t223054\n软甲\t223055\n统计学习方法\t223056\n夜港服\t223057\n60元\t223058\n刘明布\t223059\n吉草\t223060\n陈奕迅\t223061\n齐天大性之大闹女儿国\t223062\n红米5P\t223063\n七彩鱼\t223064\n果戈里\t223065\nElearning\t223066\n你们村\t223067\n6度\t223068\ntext-shadow\t223069\n瑞思迈呼吸机\t223070\n陈冯富珍\t223071\n粒径\t223072\n萌犬\t223073\n1159\t223074\n何进\t223075\nhxsxw\t223076\n26.1\t223077\n菜类\t223078\n防暴恐\t223079\n碧桂园海昌天澜\t223080\n中国新药杂志\t223081\n膝关\t223082\n无人\t223083\n99名\t223084\n随风而去\t223085\n中国中电华通\t223086\nCoil\t223087\n华东交通大学理工学院\t223088\n捋\t223089\n侯宝林\t223090\ndigg\t223091\n智牙\t223092\n抛石挤淤\t223093\n半推半就\t223094\n口腔溃疡\t223095\n血痕\t223096\nplz\t223097\n夺宝岛\t223098\nhtml版\t223099\n域名实名认证\t223100\n行星\t223101\n巴旦木\t223102\n开蛋\t223103\nruin\t223104\n秘殿\t223105\n天葬\t223106\n阿维尼翁\t223107\n相干\t223108\n老侯\t223109\n介绍\t223110\n烟台二中\t
223111\ndnf战斗法\t223112\n增色\t223113\n香灰\t223114\n字行\t223115\n绝世战魂\t223116\n发生炉\t223117\n抽离\t223118\n重庆大学网络教育学院\t223119\n雨露\t223120\n正态分布曲线\t223121\n干燥箱\t223122\npyc\t223123\n哪一个\t223124\n中融\t223125\n净手\t223126\n0.63\t223127\nhyperwork\t223128\nmurder\t223129\n一碟\t223130\n621226\t223131\n西蓝花\t223132\nMinitab17\t223133\n有条件\t223134\n继发\t223135\n冰皮\t223136\nip地\t223137\n小成\t223138\n592\t223139\n变形杆菌\t223140\n2049\t223141\nPointNet\t223142\nsaas\t223143\n肌肉男\t223144\n吴国\t223145\n湖海塘\t223146\n跟手\t223147\nCDEC\t223148\nul认证\t223149\n重工业\t223150\n182号\t223151\n中国水产学会\t223152\nharrison\t223153\n抚顺传媒网\t223154\n野居\t223155\nE430\t223156\n防狼\t223157\nmt7601\t223158\nslime\t223159\nhanshu\t223160\nVRP\t223161\n应彩云\t223162\n大卸\t223163\n2.6GB\t223164\n种衣剂\t223165\npfsense\t223166\n佛罗里达州立大学\t223167\n那年那兔那些事\t223168\n津津\t223169\n怀_\t223170\n直女\t223171\nleelazero\t223172\n徐来\t223173\n95599\t223174\n冰冷\t223175\n消保法\t223176\n气蚀\t223177\n容缺\t223178\n马步谣\t223179\n佞英雄联盟之决胜巅峰\t223180\n厚尾\t223181\nreadline\t223182\n157克\t223183\nx\t223184\n上海市金山中学\t223185\n日月星\t223186\ntencount\t223187\n811路\t223188\n启典\t223189\n公制螺纹\t223190\n藏书馆\t223191\n均摊\t223192\n鹤翼\t223193\n一盘\t223194\nNSInteger\t223195\n费托\t223196\n艾尔肯\t223197\n舌燥\t223198\n围攻\t223199\nextreme\t223200\n宏宇瓷砖\t223201\n加里曼丹\t223202\n上海中医药大学附属曙光医院\t223203\n古晋\t223204\n中国医大一院\t223205\n厦门市住房公积金管理中心\t223206\n隆盛镇\t223207\n收徒\t223208\n擦浴\t223209\n水污染防治行动计划\t223210\n福妻\t223211\n西天目山\t223212\n2760\t223213\n干式变压器\t223214\n506\t223215\n全氮\t223216\n超群\t223217\n定做\t223218\n翻倍\t223219\n记录簿\t223220\n时间间隔\t223221\n阿加曲\t223222\n轻车熟路\t223223\n申精\t223224\nScilab\t223225\n岭头村\t223226\n两百多个\t223227\n葫芦藓\t223228\n局限性\t223229\n中电普华\t223230\nwangxusummer\t223231\n呼和浩特市政府\t223232\n塑板\t223233\nK2450\t223234\nplc\t223235\nSonicare\t223236\n49you\t223237\n热爱生命\t223238\n魔兽争霸论坛\t223239\n墙柜\t223240\n中铁六局\t223241\nAkt\t223242\n300级\t223243\n4个半月\t223244\n咔叽\t223245\n装算量\t223246\nWebGL\t223247\n四多\t223248\n通灵\t223249\n星际争霸2:虚空之遗\t223250\n智悲佛网\t223251\nCarloZ\t223252\n砸伤\
t223253\n新安晚报\t223254\nlacker\t223255\nmrp\t223256\ntc310\t223257\n国度\t223258\nol\t223259\n前任们\t223260\n百页\t223261\nDecimalFormat\t223262\n我的快乐就是想你\t223263\n小桥\t223264\nkJ\t223265\n巨钳螳螂\t223266\n数显游标卡尺\t223267\n翼状胬肉\t223268\nlianye\t223269\n小鱼干\t223270\n皮带\t223271\n迎宾大道\t223272\nhks\t223273\n总领\t223274\n请留意\t223275\n企业财产保险\t223276\n智网\t223277\n2016年10月10日\t223278\n北京百奥莱博科技有限公司\t223279\n海案\t223280\n牦牛角\t223281\n下页\t223282\n大秦铁路股份有限公司\t223283\n白事会\t223284\nPLAYHOME\t223285\n金钱橘\t223286\n朱雯\t223287\n西游灭妖传\t223288\n喜结连理\t223289\n联合网络电视台\t223290\n快乐鸟\t223291\n北京月子中心\t223292\n七次\t223293\n费尔蒙酒店\t223294\n3月9号\t223295\n阳光理政\t223296\n料理店\t223297\n20180401\t223298\nBloomberg\t223299\n第32个\t223300\n概念教学\t223301\n多士炉\t223302\n丁立梅\t223303\n佳沃\t223304\n互联网+金融\t223305\n折半\t223306\n通气帽\t223307\nBeautifulSoup\t223308\n国博城\t223309\n奉贤海湾\t223310\ntri\t223311\n最方便\t223312\n液压缸\t223313\n北航计算机学院\t223314\n新金梅\t223315\n竹枝词\t223316\n银河奥特曼S\t223317\nLuise\t223318\n神武逍遥外传\t223319\n国际娱乐\t223320\n寒星\t223321\n靠马路\t223322\n2018年04月20日\t223323\n拍档\t223324\ndass\t223325\n许文赫\t223326\nstiga\t223327\n宝鉴\t223328\n牙胶\t223329\n摆尾\t223330\n淫语\t223331\nRGD\t223332\n假装\t223333\n涪城\t223334\n张常吴敬琏\t223335\nwdlinux\t223336\nopps\t223337\nfeathers\t223338\n干缩\t223339\n魔力盒\t223340\nnba2k\t223341\n垂直电商\t223342\n终极秘密\t223343\nSending\t223344\n20160227\t223345\n新庄村\t223346\n蓝村\t223347\n农业科学院\t223348\n广州市疾病预防控制中心\t223349\n露出来\t223350\n熵增\t223351\n徐如林\t223352\nCard\t223353\n6ES7\t223354\nnibiru\t223355\nrestarted\t223356\n我的泪\t223357\n大柱子\t223358\n热岛效应\t223359\nqfang\t223360\n魏明\t223361\n创城\t223362\n收破烂\t223363\npadi\t223364\n2.2.1\t223365\n严为民\t223366\n小六子\t223367\nAha\t223368\n还款人\t223369\n倾城网\t223370\n康保县\t223371\n三七伤药片\t223372\n澳加美联\t223373\n外置式\t223374\n裝\t223375\n三国法\t223376\n风尚版\t223377\nMG3080\t223378\n百变球球\t223379\n集中式\t223380\n扎克天若\t223381\n辛酸\t223382\n草莓网\t223383\n一整天\t223384\nbeforeEach\t223385\n招商证券智远理财\t223386\n素养\t223387\n麦卢卡\t223388\n0792\t223389\n艾欧\t223390\nAnthony\t223391\n兔八哥\t223392\n灵敏\t223393\n
0.2.2\t223394\n继往开来\t223395\nGardening\t223396\nVampire\t223397\n债基\t223398\n序跋\t223399\n702所\t223400\n铠甲勇士捕\t223401\nsteamcn\t223402\n聚橙网\t223403\n连续模\t223404\n卧螺离心机\t223405\n基本粒子\t223406\n会计学专业\t223407\n小动物们\t223408\n振荡\t223409\n小鹿男\t223410\nPlaying\t223411\n命脉\t223412\nvpk\t223413\n高清照\t223414\nlai\t223415\n暗黑修仙\t223416\n起亚赛拉图\t223417\n桔贝合剂\t223418\n三国志13PK威\t223419\n修出\t223420\n云南国税\t223421\nSMOK\t223422\n王沪宁\t223423\n第15轮\t223424\n圆桌\t223425\n风驰\t223426\n小米盒子4\t223427\n蹲位\t223428\n时刊\t223429\n误区\t223430\n32a\t223431\n长沙市开福区人民政府\t223432\n16话\t223433\n金尚\t223434\njbpm4\t223435\n天津轻工职业学院\t223436\ncycling\t223437\n东伊运\t223438\n桂正和\t223439\n逃荒\t223440\nabsorb\t223441\n压电陶瓷\t223442\n模拟人生\t223443\ndor\t223444\n牛骨\t223445\nsims\t223446\n李明德\t223447\n柱层\t223448\nSAS\t223449\n国家信息中心\t223450\nv线\t223451\n郑州市第二人民医院\t223452\n澳宝\t223453\n物法\t223454\n绝地反击\t223455\nKLZ\t223456\n域城镇\t223457\n板负筋\t223458\n十九周\t223459\n合肥地铁\t223460\n别相信\t223461\nWin10\t223462\n社長\t223463\n父控件\t223464\n上海高铁站\t223465\n万行网\t223466\nwmf格式\t223467\n申万\t223468\n中海广场\t223469\n800元\t223470\nmahuan2\t223471\n铝模\t223472\n御座\t223473\n第03章\t223474\nLigerUI\t223475\nwasd\t223476\n号召\t223477\n完美图库网\t223478\n南昌市人民政府\t223479\n见清\t223480\n刘天池\t223481\n分式函数\t223482\n基金收益率\t223483\nSINOCHEM\t223484\n地方教育费附加\t223485\n消化系统\t223486\n129个\t223487\n第2卷\t223488\n中国宇航出版社\t223489\n账号\t223490\n清华大学电子工程系\t223491\n地利\t223492\n黑眼睛\t223493\n三率\t223494\nRANK\t223495\n万倍\t223496\n古堡\t223497\n1688.com\t223498\nlam\t223499\nWindows8.1\t223500\nA13\t223501\n安阳市\t223502\n提炼\t223503\n泥盆\t223504\nmatlab2017a\t223505\n运动减肥\t223506\n白云大道南\t223507\nlatex\t223508\nphone001.com\t223509\nworse\t223510\n嘉荣\t223511\nGTX960\t223512\n幼兽\t223513\n书盟\t223514\n澳瑞克\t223515\n镍钛\t223516\n狗场\t223517\n下料机\t223518\n曾辉\t223519\n鳙鱼\t223520\nselphy\t223521\nzjx\t223522\n伪满皇宫博物院\t223523\n岛里\t223524\n趣视网\t223525\n无锡海岸城\t223526\n品管\t223527\n30%\t223528\n徒步鞋\t223529\n水果节\t223530\n蜘蛛精\t223531\n你好棒\t223532\n被辞退\t223533\nsfw\t223534\n空放\t223535\nmaterialize\t22353
6\n周玥\t223537\nnef\t223538\n咬合桩\t223539\nYoshi\t223540\n成龙丁肇中\t223541\n王晓芳\t223542\nsmplayer\t223543\n双天至尊\t223544\n梅丽莎\t223545\n法制晚报\t223546\n澳洲中学\t223547\nOblivion\t223548\n金石学\t223549\n海底人\t223550\n皮盒\t223551\nOneNote2010\t223552\n运算式\t223553\noF\t223554\n线性链表\t223555\n预上\t223556\n青岛农业银行\t223557\n篓\t223558\n第0页\t223559\n赵州桥\t223560\n英语流利\t223561\nCSCO\t223562\n養\t223563\n成刚\t223564\nrebirth\t223565\n络绎\t223566\napi\t223567\n中山舰\t223568\n爱思论坛\t223569\n凉台\t223570\n香豆素\t223571\n尼禄\t223572\n孤舰\t223573\n梅花山庄\t223574\n备稿\t223575\n科技展\t223576\n夹议\t223577\nLinaro\t223578\n史思明\t223579\nMOD管理器\t223580\nJava+Selenium\t223581\n上海市普通高等学校\t223582\nhMailServer\t223583\n版块\t223584\n999.9\t223585\n辉达\t223586\n万方医学网\t223587\n爱飞客\t223588\n区间段\t223589\n几会\t223590\nservname\t223591\n立行\t223592\n第138章\t223593\n24道\t223594\n加硬\t223595\n睡眠薬\t223596\n深圳市人民政府金融发展服务办公室\t223597\n详谈\t223598\n米多\t223599\n尉\t223600\nBMP\t223601\n胶版\t223602\n四口\t223603\n孕味\t223604\n麦氏\t223605\n埃德蒙顿\t223606\n白姓\t223607\n价费\t223608\n姜村\t223609\n好跑\t223610\n凯迪拉克\t223611\ngabriel\t223612\n正牌\t223613\n快压\t223614\n珊瑚癣\t223615\nw3ctech\t223616\n声阻抗\t223617\n标为\t223618\nMy_cute\t223619\nMAX232\t223620\nprelude\t223621\n大小型\t223622\nmousse\t223623\n易水湖\t223624\nminecraft吧_\t223625\n河北省人力资源和社会保障厅\t223626\n77bike.com\t223627\n烟羽\t223628\n阿德马\t223629\n管理有限公司\t223630\n兮兮\t223631\n造币\t223632\n后景\t223633\n卫生高级职称考试网\t223634\n鸡条\t223635\n_龙轩美术网\t223636\n三天两头\t223637\n渠道网\t223638\nCitadel\t223639\npmt\t223640\n磨骨\t223641\n色机\t223642\n宋喆\t223643\nrooms\t223644\n3749\t223645\nbackend\t223646\n乐融\t223647\n雪梨汁\t223648\n600万美元\t223649\n龟龟\t223650\n气囊\t223651\n张琪\t223652\n12p\t223653\n王玉明\t223654\n辩证\t223655\n女拳\t223656\n小家碧玉\t223657\n足球赛\t223658\n丝图阁\t223659\n充要条件\t223660\n连翘败毒丸\t223661\n围场满族蒙古族自治县\t223662\nccic\t223663\n耳房\t223664\n立海\t223665\nstronger\t223666\nboiler\t223667\n宣传费\t223668\n李永杰\t223669\n开始栏\t223670\n史克肠虫清\t223671\n宜昌三峡机场\t223672\n印通\t223673\nrides\t223674\n射击场\t223675\nshadow\t223676\n壕无\t223677\n超导可视无痛人流\t223678\nQ币
\t223679\n华鹏\t223680\n天宁寺\t223681\n悦跑\t223682\n陈萝莉\t223683\n人教版八年级物理\t223684\nCarrie\t223685\n反馈抑制器\t223686\n拙羽斋\t223687\n续约\t223688\nendnotex8\t223689\n轰天雷\t223690\n隋唐演义吧\t223691\n寻常型银屑病\t223692\n长期股权投资核算\t223693\noemsf\t223694\n长治县\t223695\n光伏电池板\t223696\nHDiPad\t223697\n清教\t223698\n阿泽\t223699\n经济衰退\t223700\nCoach\t223701\n李昊桐\t223702\n八通线\t223703\n力拓\t223704\n新豪轩\t223705\n脯\t223706\nATK\t223707\n百弗英语\t223708\n七子\t223709\n上海火车站北广场\t223710\ngei\t223711\n二十种\t223712\nKeyshot\t223713\n雷炎\t223714\n南京教育信息网\t223715\n酯化\t223716\n院方\t223717\n华成法硕\t223718\n放射性肺炎\t223719\n跳蛋\t223720\n骆家庄\t223721\n李婧\t223722\n露肩装\t223723\ni5-7300hq\t223724\n上古卷轴重制版\t223725\n创新思维\t223726\n深圳先进院\t223727\n黑山县\t223728\n仪器仪表世界网\t223729\n化妆间\t223730\n广州国际学校\t223731\nn8000\t223732\n卧底归来\t223733\n皇家马德\t223734\nbegan\t223735\n试验机\t223736\nVirS\t223737\n第4天\t223738\n整柜\t223739\n马鞍山二中\t223740\n庄学忠\t223741\nlm3886\t223742\n弯箍机\t223743\n松井\t223744\n文公\t223745\nCAD2006\t223746\nhabitat\t223747\n华鼎股份\t223748\n沈梦席慕容\t223749\n阿语\t223750\n云南省省\t223751\n基本原理\t223752\n龙岩市永定区人民政府\t223753\n百晓知道\t223754\n丁春秋\t223755\n逞\t223756\nCP\t223757\n佰伴\t223758\n浙江财经大学\t223759\n打哈\t223760\n受试者\t223761\n东航物流\t223762\n中筒袜\t223763\n挑板\t223764\n易康\t223765\n新宿天鹅\t223766\n油温机\t223767\nknowing\t223768\n18册\t223769\n鸽子笼\t223770\n花奶\t223771\n36伏\t223772\n隔日\t223773\n彭刚\t223774\n千家智能家居网\t223775\n卡森\t223776\n蓝色港湾\t223777\nNa2O2\t223778\n88%\t223779\n倾尘\t223780\nlucky_zhang\t223781\n固端\t223782\n邺城\t223783\n德驰\t223784\n惨绝人寰\t223785\nCVR\t223786\n球包\t223787\n刘志鹏\t223788\n18招\t223789\nMOTO360\t223790\n殷保华\t223791\n6.1.7600.16385\t223792\n吾诺\t223793\n下脚\t223794\n半块\t223795\n小黄米\t223796\n月河街\t223797\n贺文\t223798\n华泰财险\t223799\n李亚玲\t223800\n狐火\t223801\n周文斌\t223802\n卡梅隆\t223803\nlala\t223804\n23倍\t223805\ncadvisor\t223806\n世林\t223807\n万泰城\t223808\n共青团南昌市委\t223809\n绵阳电视台\t223810\n树苗\t223811\n劳资关系\t223812\nnhentai\t223813\n招商中央华城\t223814\n氙灯\t223815\n叶小钗\t223816\n鲍斯\t223817\n终末\t223818\n遮天蔽日\t223819\n亚太地区\t223820\n骁龙\t223821\n宜城市\t223822\n发泡塑料\t2
23823\n变声器\t223824\nLAKE\t223825\n迅维网\t223826\n共振分析仪\t223827\n林方\t223828\n货架期\t223829\n黑街\t223830\nBeanFactory\t223831\n一休\t223832\n济南国际园博园\t223833\n干呕\t223834\n王秘书\t223835\n太古战神\t223836\nbss\t223837\n老书虫\t223838\n78平米\t223839\n翻译\t223840\n拍宝\t223841\nzsmj\t223842\n相对密度\t223843\n服务管理器\t223844\n交通运输厅\t223845\n双元\t223846\n18001\t223847\n尿道\t223848\n何求\t223849\n蚌\t223850\nWin/Mac\t223851\n鸭王2\t223852\n毕业礼物\t223853\n高氏\t223854\n会计分录\t223855\n指纹油\t223856\n雷神雷伊\t223857\nv2.2.4\t223858\n佰趣\t223859\n此间的少年\t223860\n火影忍者究极忍者风暴4\t223861\n孝感政府\t223862\n柔和七星\t223863\nFNIC\t223864\n波多尔斯基\t223865\n乐家卫浴\t223866\n推荐表\t223867\n本里\t223868\n59集\t223869\n焗饭\t223870\nopenssl\t223871\nr6点\t223872\n朋友网\t223873\n老机\t223874\n开管\t223875\n吸痰\t223876\n分开\t223877\n王受之\t223878\n哺乳枕\t223879\n近距离\t223880\nnvh\t223881\n玉线\t223882\n莫水千流\t223883\n螺杆式压缩机\t223884\n快牛金科\t223885\n支队长\t223886\n福康\t223887\n唐骏\t223888\nnoticed\t223889\n指挥\t223890\n掺和\t223891\n北京市金融工作局\t223892\n引力场\t223893\n厦门站\t223894\n手浮\t223895\n座舱\t223896\n说情\t223897\ndatalogic\t223898\n迁徒\t223899\n大同证券\t223900\n2两部\t223901\n中国教育在线小学\t223902\n中公教育北京分校\t223903\n服务类\t223904\n甘肃省公安厅交通警察总队\t223905\n花熊\t223906\n禽畜\t223907\n赛程\t223908\n2100元\t223909\nutilization\t223910\nMSVC\t223911\n步进电机驱动\t223912\n补差\t223913\n绝地求生地图\t223914\n定性滤纸\t223915\n试办\t223916\n杜达雄\t223917\n廊坊市卫生局\t223918\n车链\t223919\nビッチ\t223920\n1.1.0_\t223921\n砖石\t223922\n汉诺威工业博览会\t223923\nchanel香奈儿\t223924\n白豆\t223925\n分励脱扣器\t223926\n广西师范学院\t223927\n拓扑\t223928\n3三个\t223929\n红高跟\t223930\n政治体制改革\t223931\nmanga\t223932\n从业人\t223933\n防城港市港口区\t223934\n玄道\t223935\nmoonandstar08\t223936\n德比郡\t223937\n压差开关\t223938\n讯号\t223939\n宝泰\t223940\n上海市进才中学\t223941\n湛江市人民政府\t223942\n电气火灾监控\t223943\n塑料类\t223944\n460\t223945\n锯铝机\t223946\n爬上\t223947\nmsp430f149\t223948\n佳能MG3080\t223949\n班农\t223950\n累计值\t223951\n情哥哥\t223952\n胡雪岩\t223953\n54\t223954\n导版\t223955\n白金\t223956\n烟树\t223957\n西奈\t223958\n霹雳天命之战祸邪神II破邪传\t223959\ndarkness\t223960\n简阳市\t223961\n中国大百科全书出版社\t223962\n管家婆中特网\t223963\n血流动力学\t223964\n伍
朝辉\t223965\n第八识\t223966\n芜湖日报\t223967\n扯破\t223968\n戏园\t223969\n距离感应器\t223970\n放尽\t223971\n凤鸣岐山\t223972\n海昌极地海洋世界\t223973\n亮色\t223974\n坦洲镇\t223975\n平均主义\t223976\n江苏消防\t223977\n越好\t223978\n支支\t223979\n香蕉君\t223980\n古墓丽影源起之战百度云\t223981\nGOV\t223982\nX-Japan\t223983\n张曼\t223984\n姐妹俩\t223985\n安裝\t223986\n涌浪\t223987\n8例\t223988\nstring类\t223989\n特许\t223990\n刑讯\t223991\n红蝶\t223992\n外汇经纪商\t223993\n26厘米\t223994\n360_\t223995\n气液\t223996\n晶耀\t223997\n0579\t223998\n靖安\t223999\n挂坠\t224000\n2016年2月29日\t224001\nNearby\t224002\njoint\t224003\n10000万元\t224004\n每方\t224005\n辖村\t224006\nfcpx\t224007\n2019款\t224008\n甲股份有限公司\t224009\n百试百灵\t224010\n李朗\t224011\n轮回诀\t224012\n合租公寓\t224013\n睿睿\t224014\n失眠多梦\t224015\n呼伦湖\t224016\n中亚\t224017\n12所\t224018\nアンソロジ\t224019\n高箱\t224020\n72\t224021\n朝凤\t224022\nOriginPro2018\t224023\n米氏方程\t224024\nnonzero\t224025\n金长江\t224026\n素餐\t224027\n福缘殿\t224028\n3km\t224029\n不择\t224030\n贺伟\t224031\n主航道\t224032\n宁高宁\t224033\n41类\t224034\n咨询师\t224035\n御剑情缘_九游论坛\t224036\n易海\t224037\n美世咨询\t224038\ngungeon\t224039\n办公族Office\t224040\n720p.BD\t224041\n二代\t224042\n畅意\t224043\n一旦\t224044\n开门大吉\t224045\n料阀\t224046\n空间直角坐标系\t224047\n谢磊\t224048\n城市规划用地\t224049\n海儿\t224050\n醉赤壁\t224051\n白度\t224052\ncoarse\t224053\nxfplay影音先锋\t224054\n海口农商银行\t224055\nDependencies\t224056\nVision\t224057\n魔影\t224058\n七巧国\t224059\nalvaro\t224060\n池贤宇\t224061\nBenjamin\t224062\n169个\t224063\n拖挂式\t224064\njavamelody\t224065\ndrawable\t224066\nJEEP\t224067\n西岗\t224068\n巴特罗之家\t224069\n铁鹰四关\t224070\n萨鲁法尔大王\t224071\n故弄玄虚\t224072\n党风廉\t224073\n五百年前\t224074\n长城物业\t224075\n净水机\t224076\n李尚恒\t224077\n图例\t224078\nTRIPS\t224079\n2018年4月13\t224080\n脚贴\t224081\n路块\t224082\nnetbios\t224083\n河南省食品药品监督管理局\t224084\n李素丽\t224085\n用来到\t224086\n通风工程\t224087\ndecal\t224088\n机员\t224089\n公司型\t224090\n999元\t224091\n速溶\t224092\n狂暴巨兽\t224093\n背压式\t224094\n倪震\t224095\n与神同行\t224096\n千古绝唱\t224097\n陈洲\t224098\n介绍版\t224099\n增彩\t224100\ndip\t224101\n001章\t224102\n077\t224103\noracle表\t224104\n牛牛\t224105\n木神\t224106\n深圳市宝安区政府\t
224107\n影视园\t224108\n北京二中\t224109\nvampire\t224110\n背离\t224111\n第52\t224112\n3x\t224113\nSituation\t224114\n白如雪\t224115\n手术刀\t224116\n南京人才网\t224117\n英格力士\t224118\n华润三九\t224119\n租出\t224120\ntelecode\t224121\n标识码\t224122\n乌司他丁\t224123\n合肥幼儿园\t224124\n广城\t224125\n泉州晚报数字报\t224126\n色带\t224127\n3D\t224128\n活春\t224129\n揖礼\t224130\nCHARGE\t224131\n自来水费\t224132\n36%\t224133\n南昌外国语学校\t224134\n教育费\t224135\n牧\t224136\n电气工\t224137\nAT0371-53L\t224138\nG9280\t224139\n存储虚拟化\t224140\n施舍\t224141\n中国茶叶流通协会\t224142\n魂飞魄散\t224143\n乐鑫\t224144\n欧陆战争4\t224145\n15.8%\t224146\n城市建设用地\t224147\nkinds\t224148\n旗号\t224149\n保护法\t224150\n课例\t224151\n小学\t224152\nword2012\t224153\n38项\t224154\n犬儒主义\t224155\n肾痛\t224156\ndBm\t224157\n平度一中\t224158\n富豪们\t224159\n瀬\t224160\n洞察者\t224161\n翘楚\t224162\n签到墙\t224163\n80款\t224164\n太阳旗\t224165\n胸里\t224166\n大宝\t224167\nHue\t224168\n缝边\t224169\n拟态\t224170\n000687\t224171\n少数民\t224172\nbuffalo\t224173\n弹珠\t224174\n大斧\t224175\n百叶箱\t224176\nusb转串口\t224177\n叶芽\t224178\n阿黛海子\t224179\n北原夏美\t224180\n冒烟\t224181\nPolice\t224182\npef\t224183\nbegins\t224184\n17所\t224185\n米家电磁炉\t224186\n81个\t224187\n蒲州街道\t224188\ncdr6\t224189\nNIKE耐克\t224190\n158cm\t224191\n练功\t224192\n帝师红与黑\t224193\n奔驰ML350\t224194\n钱晨\t224195\nRDD\t224196\n一号位\t224197\n深圳中医院\t224198\n黄建\t224199\n0:00\t224200\n刀剑萌侠\t224201\n卓越世纪中心\t224202\n重装机兵3\t224203\nign\t224204\n梧桐路\t224205\n郑震湘\t224206\n苏维\t224207\nPennsylvania\t224208\n新世界中心\t224209\n一#\t224210\n锦绣江南\t224211\n法式甜品\t224212\n西雅\t224213\n镇痛膏\t224214\n各有所爱\t224215\n蒿\t224216\n印版\t224217\n曲阜孔府\t224218\n陶瓷器\t224219\nMOOC\t224220\n玟星\t224221\n魏坤琳\t224222\n江楠楠\t224223\nfission\t224224\n救亡运动\t224225\n青灰\t224226\n闭环管理\t224227\nEvergreen\t224228\nmysql数据源\t224229\n跑车网\t224230\n饱和\t224231\n银行部\t224232\n谷村新司\t224233\nglup\t224234\n刘小涛\t224235\n万灵古镇\t224236\n59天\t224237\n五载\t224238\nidea吧_\t224239\n实践\t224240\n张爱朋\t224241\n发电器\t224242\n昆虫学\t224243\n车云网\t224244\n不用\t224245\n中国人民大学国际学院\t224246\n外建\t224247\n惠尔康\t224248\n阿佛洛狄忒\t224249\n长春市规划局\t224250\n7系\t224251\n神医废
柴妃\t224252\n4分钟\t224253\n连环坞\t224254\n龙树菩萨\t224255\nAcc\t224256\n达明\t224257\nSake\t224258\nUEFI\t224259\n福州一中\t224260\n龟甲龙\t224261\n临泉\t224262\n天盟\t224263\n广清\t224264\nxwdreamer\t224265\n指间\t224266\nPredictions\t224267\n7B\t224268\n第106号\t224269\n27册\t224270\n更具\t224271\n雪茄客\t224272\n江西省信息中心\t224273\nDaft\t224274\n3438\t224275\n跳着\t224276\n龙元建设集团股份有限公司\t224277\n赵幼斌\t224278\nCN\t224279\n0851-12340\t224280\n碾\t224281\n不看重\t224282\n竹草\t224283\n吁\t224284\n鲁迅故里\t224285\n素万那普国际机场\t224286\nabbyy\t224287\n单田汤姆克鲁斯\t224288\n汉州\t224289\n全球中文音乐榜上榜\t224290\n郭小慧\t224291\n书咖\t224292\nqpy\t224293\n标志情报局\t224294\nFossies\t224295\nHUNTA\t224296\n周新华\t224297\nmy盛lady\t224298\n十天前\t224299\n展成\t224300\n星门\t224301\nalc\t224302\n血歌\t224303\n摄像师\t224304\n断根\t224305\n2022厘米\t224306\n缠\t224307\n明晓溪\t224308\n起点中文\t224309\n连衣裙\t224310\n逝者\t224311\n徐其耀\t224312\n两校\t224313\nbws\t224314\n江西财经职业学院\t224315\nAVOP\t224316\n饲料袋\t224317\ncompulsory\t224318\n野鸡坪\t224319\nPHPWord\t224320\n遂\t224321\n互斥体\t224322\nqvod\t224323\n林伯渠\t224324\n高联\t224325\n艾福杰尼\t224326\npes2018吧\t224327\n中安消\t224328\nBiBi\t224329\n紫灵芝\t224330\n太阳能板\t224331\n祝贺词\t224332\nSpriteKit\t224333\n物联网云\t224334\nxyz\t224335\n五建\t224336\n求爱\t224337\n泡泡纱\t224338\n平江府\t224339\n上海地铁3号线\t224340\n皖酒\t224341\n粗糙集\t224342\n鸡腿\t224343\n开税\t224344\n上海交大\t224345\n漕溪路\t224346\n米虾\t224347\nxvm\t224348\n复旦大学附属金山医院\t224349\nApples\t224350\n28号\t224351\n轻量\t224352\n园芳\t224353\n肉卷\t224354\n唱吧\t224355\n抽噎\t224356\n一骑当千2\t224357\n入行论\t224358\n身份证明书\t224359\nmerchandising\t224360\n45天内\t224361\n十五分钟\t224362\n菜瓢谷\t224363\nエロゲ\t224364\nsupportive\t224365\n藥\t224366\nAIR\t224367\n铃木骁途\t224368\n夜不归宿\t224369\n天津市市\t224370\n石漠化\t224371\n夜访\t224372\n燕赤霞\t224373\n重定\t224374\n天灵\t224375\n小狐\t224376\n人类:一败涂地\t224377\n01017\t224378\n海底捞\t224379\n江苏省科协\t224380\n机动车辆\t224381\n第二幕\t224382\nigor\t224383\n威化饼干\t224384\n老卓\t224385\n广西电网有限责任公司\t224386\nsigning\t224387\n慈溪市公安局\t224388\n黑塞矩阵\t224389\n喷印\t224390\nUnchained\t224391\n鸡米\t224392\n木柴\t224393\n滴\t224394\n秦子恒\t22439
5\nVegetarian\t224396\n无功补偿装置\t224397\nCalendar类\t224398\n小我\t224399\n西特\t224400\n贵州省委党校\t224401\n海南建省办特区\t224402\n磨料\t224403\nQQ自由幻想OL\t224404\n节俭\t224405\n2015年2月份\t224406\n襟翼\t224407\n第八项\t224408\n2018年五月份\t224409\n海思麒麟950\t224410\n眷\t224411\n200件\t224412\n最优惠\t224413\n铜冠\t224414\nMBR\t224415\n千枚\t224416\n克01\t224417\n魏晋\t224418\nUPX\t224419\n介護\t224420\n第几页\t224421\n易紧\t224422\n品牌力\t224423\n孙丽\t224424\nsuppor\t224425\n3D模型\t224426\n丹华\t224427\n龙沐湾\t224428\n消毒柜\t224429\n不谋而合\t224430\n色散\t224431\n8802\t224432\n企叮咚\t224433\n半边\t224434\n红崖谷\t224435\n海航创新\t224436\nExports\t224437\n不可失\t224438\n上海容创生物技术有限公司\t224439\n生产性\t224440\n出苗\t224441\nape\t224442\n珍珠衫\t224443\n红叶季\t224444\n周鹏\t224445\n无框阳台窗\t224446\n匀净\t224447\n珠江啤酒\t224448\n焦作中旅银行\t224449\n实锤\t224450\n中国羽毛球队\t224451\n妈蛋表情网\t224452\n鹿客\t224453\n霹雳\t224454\n毕设\t224455\n铜仁机场\t224456\n亚古兽\t224457\n冒雪\t224458\n大红袍茶\t224459\n亚马逊站\t224460\n巨笔\t224461\n岚皋县人民政府\t224462\nimplications\t224463\n封神英雄榜\t224464\n芦蒿\t224465\n鵺\t224466\nTak\t224467\n苹果手机维修中心\t224468\n魔板\t224469\n法人证\t224470\n重庆恒大\t224471\n小哲\t224472\n碱性氧化物\t224473\n匹\t224474\n叶斑\t224475\n吆\t224476\n王镛\t224477\nopenflow\t224478\n灯火通明\t224479\n油灰\t224480\n意见书\t224481\n尿味\t224482\n高丝\t224483\n变态性\t224484\n齐赞\t224485\npy.66wz.com\t224486\npinnacle\t224487\n福建省厦门双十中学\t224488\n上海国家会展中心\t224489\n想像\t224490\n晚生\t224491\nQatar\t224492\n对等\t224493\n陈靖仇\t224494\n色姐妹综合网\t224495\ngwt\t224496\n测物\t224497\n新映画\t224498\n电源分配器\t224499\n拯救者\t224500\n操纵性\t224501\n美文亭\t224502\n剪切合并器\t224503\n上海立信\t224504\npifa\t224505\n600884\t224506\n姜敏京\t224507\n壤塘县\t224508\nbindservice\t224509\n恋语\t224510\n30家\t224511\nOrwell\t224512\n食不厌精\t224513\nsuse\t224514\n武藏野\t224515\nCyberLink\t224516\n两机\t224517\n刀剑封魔录\t224518\nKiehls\t224519\nxyq\t224520\n检验\t224521\n外表\t224522\n占线\t224523\n郑智\t224524\nepmd\t224525\n10.9.2\t224526\nideapad300\t224527\n省吃俭用\t224528\nags\t224529\n那样子\t224530\n猛干\t224531\nDijkstra\t224532\n双力\t224533\n突泉县\t224534\nwpc\t224535\n麻省理工\t224536\n怼天怼\t224537\n2.0.10\t224538\nB
LU\t224539\n苯二氮卓类\t224540\n虎豹骑\t224541\n管片\t224542\n静若\t224543\nweibo\t224544\n吉阳区\t224545\n身旺\t224546\n陈漫\t224547\n王浦劬\t224548\n乐府\t224549\n卢斌\t224550\nITBEAR\t224551\n车小喜\t224552\nLattice\t224553\n101种\t224554\n新一中\t224555\n徐梦洁\t224556\n黄一芝\t224557\n卢沟桥街道\t224558\nspring-framework\t224559\n3.66\t224560\nkong\t224561\n二1\t224562\nCheney\t224563\n宜人\t224564\n40kg\t224565\n墨器\t224566\n偷看\t224567\ninterp2\t224568\ncrtd\t224569\n职业证\t224570\n铃原爱蜜莉灰\t224571\n秦皇汉武\t224572\n星楼\t224573\nforeo\t224574\n新众泰T500\t224575\n生死连\t224576\n索状\t224577\n虚空幻界\t224578\ntopic\t224579\nreal-time\t224580\n炜衡\t224581\n脑瓜\t224582\n结尾段\t224583\n蜕化\t224584\n全军\t224585\n大连东软信息学院\t224586\n胡梅尔斯\t224587\n万级\t224588\n诅咒DLC\t224589\nperk\t224590\n活性氧化铝\t224591\n池晟\t224592\n诛仙二\t224593\n赵曼\t224594\n见于\t224595\n拉力试验机\t224596\ncvpr\t224597\n焙\t224598\nconducted\t224599\n上海五角场\t224600\n组成部\t224601\ntrainable\t224602\n大河报\t224603\n运动式\t224604\n2162\t224605\n援手\t224606\n休闲西装\t224607\n10DVD\t224608\n承蒙\t224609\n杏仁露\t224610\n天天电影网\t224611\n第一幅\t224612\n上三天\t224613\n南京信息工程大学滨江学院\t224614\nAdvanced\t224615\n赵南\t224616\n水钟\t224617\n自组织\t224618\n因人\t224619\n达芙通\t224620\n郭庄村\t224621\n范玮武则天\t224622\nstephanie\t224623\nverifying\t224624\n居保\t224625\n好吃\t224626\nActin\t224627\n阿贤\t224628\n網\t224629\n以撒的结合重生\t224630\n百润股份\t224631\n1325\t224632\n淋球菌\t224633\n博讯\t224634\n宝安体育馆\t224635\nrcnn\t224636\n速溶茶\t224637\n汽油锯\t224638\nPreferred\t224639\nWEBAPI\t224640\nS9300\t224641\nQ天下\t224642\n连续剧\t224643\n刘滨\t224644\n戴安澜\t224645\n3658\t224646\n里加\t224647\na=3\t224648\n日志型\t224649\n办年\t224650\n令行禁止\t224651\n纪行\t224652\n葛长伟\t224653\n武状元\t224654\nSuperSlide\t224655\n唐德影视\t224656\nM1136-ZOL\t224657\nmenus\t224658\n25道\t224659\n专网\t224660\n邦利\t224661\n莫德里奇\t224662\n中央纪检委\t224663\n民丰\t224664\n暮光之城\t224665\n牙龈肿痛\t224666\n54年\t224667\n帮战\t224668\n渡边\t224669\n盖世太保\t224670\nRoute\t224671\n促建\t224672\nCO\t224673\n林正英\t224674\nWindws\t224675\n十年\t224676\nerror3_凤凰\t224677\n8一个\t224678\n哈继铭\t224679\n并联式\t224680\n顶膜\t224681\neccentric\t2
24682\nFacets\t224683\n千讯\t224684\n0.03MB\t224685\n病理学\t224686\n培养\t224687\ncallee\t224688\n陈丽丽\t224689\n100点\t224690\nSienna\t224691\n酒井王一博\t224692\n中易\t224693\n九黎\t224694\n凸透镜\t224695\n开放房\t224696\n圆台形\t224697\n387\t224698\n泡芙小姐\t224699\nlightdm\t224700\n北固亭\t224701\n黄油\t224702\n错误\t224703\n广东招商网\t224704\n太空类\t224705\n酶谱\t224706\n绿母文\t224707\nSuki\t224708\n航空意外险\t224709\n大肠杆菌\t224710\n快乐时光\t224711\ned2\t224712\n1.8.0\t224713\n26uuu\t224714\n刁钻\t224715\n山东中医药大学\t224716\n分治法\t224717\n神明\t224718\n鲜切\t224719\n玫瑰茶\t224720\n户外\t224721\n魅蓝5s\t224722\ngrc\t224723\nnice\t224724\n简繁\t224725\n朴娜莱\t224726\n切开\t224727\n红黑树\t224728\n15R3\t224729\n第四步\t224730\n中南锦苑\t224731\n汉泰\t224732\n追风少年\t224733\n菜籽榨油机\t224734\n兽神\t224735\n平潭综合实验区\t224736\n军行\t224737\nelevation\t224738\n财政总预算会计制度\t224739\nphpspider\t224740\nqq靓号\t224741\n见血封喉\t224742\n帝国\t224743\n托勒\t224744\nPAYDAY2\t224745\n苦撑\t224746\n婚途\t224747\n北京中医药信息网\t224748\n彰化\t224749\nJDT\t224750\n青肿\t224751\nn+\t224752\n倍斯特\t224753\n天宫二号\t224754\n个人保险\t224755\n购得\t224756\n电服\t224757\n引导页\t224758\n蓝讯\t224759\n世说新语\t224760\nKINJAZ\t224761\n纵情\t224762\n3PAR\t224763\n王舞\t224764\n守卫者\t224765\n鬼斯\t224766\n福特f150\t224767\n3v\t224768\n宁波市住房和城乡建设委员会\t224769\n阳光售房网\t224770\n国际赛\t224771\nRetention\t224772\n米斯特\t224773\n浮生记\t224774\n立根原\t224775\nOracle11gR2\t224776\n香港9号\t224777\n教师教育学院\t224778\nsodu-豆沙网\t224779\n113号\t224780\n伯贤\t224781\n海利\t224782\n东莞市市\t224783\nBrowsing\t224784\n二十二个\t224785\n梁图\t224786\n风火轮迷你模型\t224787\nPanasonic\t224788\n20150818\t224789\n坝头\t224790\n中国人民财产保险\t224791\n外贸类\t224792\n生命缘\t224793\n猎车\t224794\n测试器\t224795\n炒肝\t224796\n南昌\t224797\n资讯化\t224798\nmister\t224799\n云南白药气雾剂\t224800\n有害气体\t224801\n自哀\t224802\n喊话器\t224803\n姜斌\t224804\n贺卫方\t224805\n横竖屏\t224806\n水间\t224807\n存留\t224808\n干片\t224809\n明夷待访录\t224810\n硝化棉\t224811\n拉曼光谱\t224812\n94\t224813\n%cd%\t224814\n张雪梅\t224815\n彭拜\t224816\n麦格\t224817\n相爱十年\t224818\n殇情\t224819\n烛人\t224820\njuku\t224821\n出换\t224822\n南三条\t224823\n两袖\t224824\n凯迪仕\t224825\n人龙传说\t224826\n2012年伦敦
奥运会\t224827\ns2\t224828\n中国认证信息网\t224829\n经济观察报\t224830\n马蹄酥\t224831\n早上十点\t224832\nSilly\t224833\n表叔\t224834\n昆明市人民政府\t224835\nTHP\t224836\n00012\t224837\n清华大学药学院\t224838\n哲思\t224839\n邗\t224840\n四月一号\t224841\n优活\t224842\n绝地归来\t224843\n菜鸟网络\t224844\n建安七子\t224845\n玛奇\t224846\n宣威市人民政府\t224847\n二二\t224848\n触发\t224849\n快眼看书\t224850\n法人章\t224851\n江苏省住房和城乡建设厅\t224852\n山东省海洋与渔业厅\t224853\n2p\t224854\n贝嫂\t224855\nkms\t224856\n中国药学杂志\t224857\n党中央集中统一\t224858\n城管局\t224859\n陈一\t224860\n对门\t224861\n死鱼\t224862\n进仓\t224863\n1486\t224864\n濮阳\t224865\n租赁方\t224866\nFoobar\t224867\nstarr\t224868\n信息稿\t224869\n手工帐\t224870\n野玫瑰\t224871\n白兰地酒\t224872\n可能性\t224873\n扣人心弦\t224874\n冬奥组委\t224875\n颈椎治疗仪\t224876\nMCL\t224877\n润德\t224878\n地质灾害\t224879\n市文联\t224880\n热心\t224881\n希尔顿集团\t224882\ndiscard\t224883\n勇士闯魔城\t224884\n公总号\t224885\n电源板\t224886\n伊莉莎\t224887\nS02\t224888\n吃荤\t224889\n精环\t224890\ncabernet\t224891\n机器人争霸\t224892\ngeographical\t224893\nV2.0.2\t224894\nPDA\t224895\n&#183\t224896\n长安福特汽车有限公司\t224897\nsjz\t224898\n腊山\t224899\n白斑马\t224900\n淘宝代购\t224901\n基辅\t224902\nmodulo\t224903\n兵役登记表\t224904\ncomedy\t224905\n整流\t224906\n適\t224907\nmanually\t224908\nProto\t224909\nflappy\t224910\n贝思特\t224911\n芜湖经济技术开发区\t224912\naccess\t224913\n79分\t224914\n丁村\t224915\n八九\t224916\n释疑\t224917\n少林英雄\t224918\n我本沉默传奇\t224919\n手机座\t224920\n坚守者\t224921\n大亚湾经济技术开发区\t224922\n贝恩\t224923\n松陵镇\t224924\n落座\t224925\n嘉禾舞社\t224926\n廉颇蔺相如\t224927\n牵引车\t224928\n3天左右\t224929\n告诉\t224930\n热水瓶\t224931\n七仙女思春\t224932\n仓板\t224933\n中国西藏旅游网\t224934\n暗金丑岛君\t224935\n边键\t224936\nkeyshot6\t224937\n昌图\t224938\n五宗\t224939\n王予柔\t224940\n家庭群\t224941\n预设体\t224942\n阿拉德战记\t224943\n5000家\t224944\n信富\t224945\n想办法\t224946\n即兴\t224947\n东方盛世\t224948\nenable\t224949\n纳豆nado\t224950\nNone\t224951\n星期六\t224952\n腾跃\t224953\n五观\t224954\n不到期\t224955\n总领事\t224956\nASM\t224957\n目测\t224958\n微生态制剂\t224959\n记忆大师\t224960\n棚子\t224961\n范宗沛\t224962\n健值\t224963\n借支\t224964\n369\t224965\n欧阳\t224966\n程连元\t224967\nora-12541\t224968\ngnome-shell\t224969\nMAC
地址表\t224970\n味精\t224971\n冒犯\t224972\n歌斐颂\t224973\nH264码流\t224974\n米林科维奇\t224975\nistat\t224976\n德州扒鸡\t224977\n遮阳挡\t224978\n民工学校\t224979\n采茶女\t224980\n十二位\t224981\n小诊所\t224982\n支月英\t224983\n牙冠\t224984\n推定\t224985\n绘图机\t224986\n000816\t224987\n格隆\t224988\n音场\t224989\nExpandableListView\t224990\n少民\t224991\n下辈子\t224992\n收藏礼品网\t224993\n明太子\t224994\n襄王\t224995\n信虫\t224996\n蒙特梭利教育\t224997\n动植物\t224998\n华妹陀\t224999\n两条\t225000\nyuming\t225001\ncscd\t225002\n摇珠\t225003\nbitwise\t225004\n郑和下西洋\t225005\ndire\t225006\n闲字\t225007\n老奶\t225008\n庆大霉素\t225009\n柠檬酸\t225010\n单片\t225011\n常世\t225012\n学霸学习网\t225013\n招标与采购网\t225014\nkobuki\t225015\n通档\t225016\n唯满侠吧_\t225017\n口渴\t225018\n格林达\t225019\n筹钱\t225020\nDropDownList\t225021\n卢菲菲\t225022\n智客\t225023\n13只\t225024\n总揽\t225025\n兖州一中\t225026\n爬下\t225027\n排重\t225028\nDixon\t225029\n冰树\t225030\n第二十六条\t225031\n魂斗罗2\t225032\n疑案\t225033\n601398\t225034\n3片\t225035\n分馏\t225036\nLTspice\t225037\n追根究底\t225038\n艾夫杰尼\t225039\nHER\t225040\n罗泾镇\t225041\n楚\t225042\n胡凡\t225043\nmilo\t225044\n囿\t225045\n糖化酶\t225046\n天浩\t225047\nLa\t225048\n弦论\t225049\n新松\t225050\n哈卡莱\t225051\n鸡照\t225052\n人机界面\t225053\n糊里糊涂\t225054\n瓜尔佳氏\t225055\n菜月\t225056\n永恒之沫\t225057\ntiger\t225058\n抚州市临川区政府\t225059\n柯\t225060\nsysrq\t225061\n跨境通\t225062\n执行器\t225063\noutof\t225064\n昌平\t225065\n立邦漆\t225066\n二百五\t225067\n基金从业资格证考试\t225068\n乔纳森\t225069\n医学检验吧\t225070\n尖子班\t225071\n桥下镇\t225072\n激活剂\t225073\n张玉\t225074\nh图\t225075\n摩特\t225076\nRGB24\t225077\n汽车西站\t225078\n潴留\t225079\n业之峰装饰\t225080\n百度权重排名查询-站长seo\t225081\n改中\t225082\n乐步\t225083\n丁香湖\t225084\n夏玲\t225085\nlf\t225086\n孙笑川\t225087\n旧部\t225088\n合理性\t225089\n海淘攻略\t225090\n拿去\t225091\n老邓\t225092\n电气工程师考试\t225093\n百泉镇\t225094\n乐清湾\t225095\n狼心狗肺\t225096\n红豆网\t225097\n马奶\t225098\n四尾\t225099\nWp\t225100\n你好旧时光\t225101\n中定义\t225102\n爽身\t225103\n牌组\t225104\n长城风骏\t225105\n会议室\t225106\n凯恩\t225107\n阆苑仙葩\t225108\n上海地铁站\t225109\n自明\t225110\nmushroom\t225111\n红网\t225112\n惜\t225113\n教程\t225114\ncohesive\t225115\n576P\t225116\n十九大后政治局\t2
25117\n不排除\t225118\n狂浪生\t225119\n北河\t225120\n夜校\t225121\n换季\t225122\n父业\t225123\n小米mini\t225124\n校园霸凌\t225125\nR410\t225126\n百度闪投\t225127\n未必\t225128\n林璎\t225129\nartrage\t225130\n猜出\t225131\n遮天魔兽剑圣异界纵横\t225132\n单女\t225133\n区级\t225134\n商业健康保险\t225135\nptu\t225136\n果脯\t225137\n大帆\t225138\n智盈\t225139\n猕猴桃汁\t225140\n靠\t225141\n仙人柱\t225142\n跨镇\t225143\ncorporation\t225144\nhyperx\t225145\n广州市番禺区人民政府_广州市番禺区政府\t225146\n内蒙古自治区文化厅\t225147\n丽水19楼\t225148\n上新街\t225149\ngateway\t225150\nnetbean\t225151\nimac\t225152\n石家庄二中南校区\t225153\n菜台\t225154\n红楼梦吧_\t225155\nIE8\t225156\n王特\t225157\n上海音乐学院附中\t225158\n加米\t225159\n憧\t225160\n汉宣帝\t225161\n辅机\t225162\n菌种\t225163\n双全\t225164\n盐水鸭\t225165\njava类\t225166\n六盘水市\t225167\n养眼\t225168\n挺入\t225169\n4770k\t225170\n讯问\t225171\n87亿\t225172\n体适能\t225173\nR15梦境版\t225174\nsoulmate\t225175\n博美犬\t225176\n内江市人民政府\t225177\n刀阵\t225178\n流歌\t225179\nduanxz\t225180\n0452\t225181\nR语\t225182\n八声\t225183\n氨酚羟考酮片\t225184\n菏泽房产网\t225185\n金丽\t225186\n新会柑普茶\t225187\nmenyoo\t225188\n默念\t225189\n红旗飘飘\t225190\n上元教育\t225191\n被机\t225192\nkeepass\t225193\nNEEA\t225194\n太丑\t225195\n天依\t225196\n桔园小区\t225197\n海绵体\t225198\n张掖丹霞国家地质公园\t225199\n扬州八怪\t225200\n2018047\t225201\n永恒者\t225202\n洁厕块\t225203\n第一章第一节\t225204\n有度\t225205\n午宴\t225206\n税凭证\t225207\nConsole口\t225208\nBored\t225209\n氢氟酸\t225210\n芳仔\t225211\n刘海霞\t225212\n洪山区\t225213\n茂县\t225214\n不见尾\t225215\n万达商圈\t225216\n混凝土拌合站\t225217\n3辆\t225218\n世俗\t225219\npoint\t225220\n派送\t225221\n量服\t225222\n女客\t225223\n整体化\t225224\n白马湖小学\t225225\nvisionpro\t225226\n松江万达广场\t225227\ng12\t225228\n贾人\t225229\n双楠\t225230\n新城市花园\t225231\n卸载\t225232\nMUSE\t225233\n堀未央奈\t225234\nguinea\t225235\n长尾关键词\t225236\n民二庭\t225237\nm390\t225238\n孟云\t225239\nQE\t225240\n复合岩棉板\t225241\n音乐节\t225242\n二阶常系数\t225243\n黑鱼汤\t225244\n骑马与砍杀吧\t225245\n承德市人民政府\t225246\n全天候贸易网\t225247\n腾讯视频播放器\t225248\n官舍\t225249\n刘星泰\t225250\n速腾1.6l\t225251\n叙述者\t225252\nendowment\t225253\n2第36\t225254\n2018.1.18\t225255\n甲状腺功能亢进症\t225256\n谢国\t225257\n300GB\t225258\ninput
mapper\t225259\n外边\t225260\n迪克\t225261\n对帐\t225262\n甘溪镇\t225263\n重庆能源职业学院\t225264\nrank函数\t225265\nTORY\t225266\n节约\t225267\nMegalo\t225268\n陈岗\t225269\n木土\t225270\n龙居\t225271\n崔胜铉\t225272\nrinse\t225273\n双十佳\t225274\n童攀\t225275\n洛伦兹曲线\t225276\n李建伟\t225277\n灵魂回响\t225278\n混双\t225279\n少年王\t225280\n应用挂载器\t225281\nCOF\t225282\nPS/photoshop\t225283\nJpa\t225284\n水鞋\t225285\n孙继海\t225286\n光伏组件\t225287\n小商\t225288\n临死前的严监生\t225289\n生离\t225290\n海安\t225291\n20170429\t225292\n玉壁\t225293\n呐喊\t225294\n山东发改委\t225295\n水冷机\t225296\n股权置换\t225297\n同济南路\t225298\nFiddler\t225299\n芝加哥公牛队\t225300\nqb\t225301\n16年4月\t225302\n暂且\t225303\n温政\t225304\n命天\t225305\n双收\t225306\n上校\t225307\n膀胱肿瘤\t225308\nE480\t225309\n苏龙\t225310\npyaudio\t225311\nV5.6\t225312\n欹\t225313\nmistine\t225314\nbiochemistry\t225315\n和级\t225316\nbegining\t225317\n从头到尾\t225318\n叶星云\t225319\n美食街\t225320\n南平延平区\t225321\n催眠大师\t225322\nv1.8.0\t225323\n系统版本\t225324\n另开\t225325\n杨家沟\t225326\nMPQ\t225327\nresonant\t225328\n北约\t225329\n转码器\t225330\n河南联通\t225331\n秘策\t225332\n2015-2016年度\t225333\n一环\t225334\nCelebrity\t225335\nmysql连接数\t225336\nsolr7\t225337\n菲\t225338\n1厨\t225339\n高压开关\t225340\n催情\t225341\n龚小京\t225342\n副总统\t225343\n公牛电器\t225344\n终结\t225345\ndongao\t225346\n2017-11-20\t225347\nwnacg\t225348\nbeta5\t225349\nDirectx11\t225350\n龚一\t225351\nJLWZ\t225352\n神条\t225353\n羊羔\t225354\n12505\t225355\n作好\t225356\nUr\t225357\n良时\t225358\n这版\t225359\n还不错\t225360\n远志明\t225361\n筱慧\t225362\n铸型\t225363\n生死锁\t225364\n20150522\t225365\n太傻网\t225366\nimg\t225367\n别克gl8\t225368\n遂川\t225369\n袁崇焕\t225370\n双离合\t225371\nslickedit\t225372\n小曲\t225373\n3.34\t225374\n喀什大学\t225375\n礼赏\t225376\n一载\t225377\n平底锅\t225378\n车资\t225379\n3026\t225380\n赵顺然\t225381\n金融机构\t225382\n95万元\t225383\n无矾油条\t225384\n极地求生\t225385\n宁波万金磁业有限公司\t225386\n续存\t225387\n黄河东路\t225388\n女m\t225389\n天仙配\t225390\ndella\t225391\nIGN\t225392\n王静波\t225393\n负二\t225394\n双廊镇\t225395\n痘痘\t225396\n德比战\t225397\nGallagher\t225398\nTops\t225399\n软体家具\t225400\n横道图\t225401\n蓝天教育\t225402\n溴化
\t225403\n华夏陶瓷网\t225404\n中核集团\t225405\nfreezer\t225406\n跳蛙\t225407\n偏导\t225408\n0932\t225409\n绛州\t225410\nsprintboot\t225411\n数亿元\t225412\n枯树枝\t225413\n旋转粘度计\t225414\n壁纸类\t225415\n赵国华\t225416\n大机\t225417\n归脾丸\t225418\n愿得\t225419\n中华人民共和国国土资源部\t225420\n0&\t225421\n信哥\t225422\n沣东农博园\t225423\n大宁\t225424\nf-343809\t225425\n6677\t225426\n大连市人民政府办公厅\t225427\n洛阳博物馆\t225428\n乳名\t225429\n超乎想象\t225430\nsomthing\t225431\nxmodem\t225432\nspatial\t225433\n杜杰\t225434\n瑞虹天地\t225435\n降伏\t225436\n蒲桃\t225437\n重庆大学公共管理学院\t225438\nkopf\t225439\n现场表演\t225440\n秦昊\t225441\n5.6.19\t225442\nHunk\t225443\n古灵精怪\t225444\n展阴\t225445\n好温柔\t225446\n战熊\t225447\n朱桢\t225448\nreducer\t225449\n3417\t225450\n煎煮\t225451\n蒙特卡洛算法\t225452\n4月5月\t225453\n砂光机\t225454\njavascript-js\t225455\n又拍网\t225456\n中年人\t225457\n奇艺视频转换器\t225458\n长海医院\t225459\n当涂县人民政府\t225460\n奇强\t225461\n1码\t225462\n修路\t225463\n高晟\t225464\n维塔斯\t225465\n加州\t225466\n左云县\t225467\n书单\t225468\nMoto\t225469\n慈\t225470\n保洁招聘网\t225471\n江澈\t225472\n配势\t225473\n翰威特\t225474\n357号\t225475\n课外书\t225476\n腿照\t225477\n一个劲\t225478\n14款\t225479\nYouTube\t225480\n益盟股份\t225481\n时尚博主\t225482\n火冥\t225483\n郭旺\t225484\nSuperstar\t225485\nuwc\t225486\n文墨\t225487\n飞逝\t225488\n原版\t225489\n165Hz\t225490\n收编\t225491\nBD-1080P\t225492\nrama\t225493\n三同\t225494\nvsco\t225495\n坚毅\t225496\n中冶天工集团有限公司\t225497\nreferring\t225498\n电核\t225499\nAsuraDong\t225500\n2017-09-19\t225501\n云OA\t225502\n非门\t225503\n鬼符\t225504\n课画\t225505\n美之海\t225506\n浮欢\t225507\n南安普顿大学\t225508\n戈雅\t225509\n10月8号\t225510\n班牌\t225511\nparsererror\t225512\n梅地亚\t225513\n81杠\t225514\n十月馨\t225515\nSurrey\t225516\n中寰\t225517\npreset\t225518\n今朝\t225519\n铁手\t225520\n三娘湾\t225521\n心头肉\t225522\n八珍糕\t225523\nCello\t225524\n奇瑞qq\t225525\nxdg\t225526\n哈尔滨医科大学\t225527\n特通\t225528\n空无\t225529\nslay\t225530\nckj\t225531\n茅侃\t225532\n素有\t225533\n龙巅锦鲤鱼\t225534\n胡咏梅\t225535\n蒸架\t225536\nPV付\t225537\n萘乙酸\t225538\n20161115\t225539\n5n\t225540\n人人通\t225541\n无名女尸\t225542\n豪言\t225543\n白衫\t225544\n36.5\t225545\n钢琴谱\t225546\
n合谷\t225547\n艾问\t225548\n生活条件\t225549\nMOVIE\t225550\n点分\t225551\n脉冲响应\t225552\nSplit\t225553\nsuccessor\t225554\n气动执行机构\t225555\n20160507\t225556\n人大代表\t225557\n使然\t225558\n傅淼\t225559\n遇过\t225560\nsunzn\t225561\n工程学习网\t225562\n馬\t225563\n虚拟服务器\t225564\n环量\t225565\n16盒\t225566\n百回\t225567\nwaiguo\t225568\n250型\t225569\n休旅车\t225570\n王安卓\t225571\n弋阳县\t225572\n高桥镇\t225573\n猎金\t225574\n发热剂\t225575\n109家\t225576\n秀强股份\t225577\npstack\t225578\n帮转\t225579\n肾病综合症\t225580\n保利威视帮助中心\t225581\n多多少\t225582\n羊水量\t225583\n金旭\t225584\n第24章\t225585\n空保\t225586\n马敏\t225587\n大益广场\t225588\n罗生门\t225589\n迦南诗\t225590\n爱驰\t225591\n97折\t225592\n忍战\t225593\n相亲会\t225594\n黄猿\t225595\n施建祥\t225596\n国台\t225597\nrazor\t225598\nxubling\t225599\n浙江广播电视大学\t225600\n胡扯\t225601\nWick\t225602\n3PL\t225603\nconfident\t225604\n幼时\t225605\n择吉日\t225606\n女德\t225607\n小八路\t225608\n大公园\t225609\n硕博士\t225610\n芒草\t225611\nlgv30\t225612\n夜落\t225613\n1733\t225614\n银行理财产品\t225615\nJSC\t225616\nhonor8\t225617\n贮罐\t225618\n改作\t225619\nDat\t225620\n平湖街道\t225621\n比较分析法\t225622\ntello\t225623\ndial\t225624\n上海石化\t225625\n废酸\t225626\n股友\t225627\n装运期\t225628\n棕刚玉\t225629\n校花之贴身高手\t225630\n电压变送器\t225631\nGameSir\t225632\n聚氨酯保温材料\t225633\n闰秒\t225634\n强度试验机\t225635\n地藏缘论坛\t225636\nWarald\t225637\n收回去\t225638\n环太平洋雷霆\t225639\nh7\t225640\nrhcsa\t225641\nrosbag\t225642\n红旗linux\t225643\ndiameter\t225644\n奔驰CLS\t225645\ndlopen\t225646\n暗调\t225647\n有所长\t225648\nAkiba\t225649\n双液\t225650\n桂林北\t225651\n续封\t225652\n9930\t225653\n画角\t225654\n记忆犹新\t225655\nnrf\t225656\n瑞景\t225657\n70周\t225658\n朱时茂\t225659\n5则\t225660\n全见\t225661\n正月\t225662\n网卡通\t225663\n纽特\t225664\nplaye\t225665\n沙井客运站\t225666\n移动全网通\t225667\n天津西\t225668\n黄仁西游记之女儿国\t225669\nlibmysqlclient.so.18\t225670\nDB\t225671\n合力科技\t225672\n560\t225673\n卧虎山\t225674\n小米max2\t225675\n【菲\t225676\n杰瑞\t225677\n脱欧\t225678\nASPx\t225679\n452\t225680\n沙赞\t225681\n凯氏定氮仪\t225682\n束脚裤\t225683\n机械工程专业\t225684\nrpc服务器\t225685\n小时工兼职\t225686\n嘉应学院\t225687\nenb\t225688\nAndro\t225689\n暗黑2战网\t2256
90\n磷酸氢二钾\t225691\n夕阳红\t225692\n汕尾市\t225693\n样板戏\t225694\n52cm\t225695\n放假\t225696\n授时\t225697\n回笼\t225698\nx230s\t225699\n张海迪\t225700\n济南万达广场\t225701\n西安铁路职业技术学院\t225702\n捷豹\t225703\n烤箱烤肉\t225704\n博尔\t225705\n秦池\t225706\n银江\t225707\n3200点\t225708\n香港天文台\t225709\n乔司\t225710\n入梦\t225711\n普鲁士\t225712\nDUCK\t225713\n周康网\t225714\nX光安检机\t225715\nOnline\t225716\n平地\t225717\n周坚\t225718\n死寂\t225719\narnold\t225720\n沧州日报\t225721\n方特梦幻王国\t225722\ntreewidget\t225723\n学历\t225724\n苍南新闻网\t225725\noffsets\t225726\n30kw\t225727\n爱彼迎\t225728\n祥和苑\t225729\ntortoiseSVN\t225730\n邻域\t225731\n白平衡\t225732\n橙红\t225733\n百度影音高清\t225734\n无敌汽车网\t225735\nIplImage\t225736\nAutoComplete\t225737\n下一页\t225738\n星罗\t225739\nLDO\t225740\n喋血双雄\t225741\n睡岗\t225742\n2018.3.9\t225743\n172cm\t225744\n先河环保\t225745\n枕\t225746\n汉中市中心医院\t225747\n饭拍秀\t225748\n唐笑\t225749\n猎杀潜航5\t225750\n模子\t225751\n十三经注疏\t225752\n伤风败俗\t225753\n沙芬\t225754\n奥园广场\t225755\n红塔证券\t225756\n狗尾\t225757\n20170109\t225758\n2.3万\t225759\nEdi\t225760\n郑宇伯\t225761\n気分\t225762\n水文局\t225763\n子块\t225764\n抱有\t225765\n称帝\t225766\ntone\t225767\n女朋\t225768\nn4030\t225769\nIntertek\t225770\n238号\t225771\n重大疾病保险\t225772\n报答\t225773\n百兆口\t225774\n重庆社区\t225775\n无源器件\t225776\n七年\t225777\n加以\t225778\nkfx\t225779\n华与华\t225780\n哈巴河\t225781\n4.95\t225782\n单亲\t225783\n6第六章\t225784\n基本色\t225785\n配页机\t225786\n粘度\t225787\n春日由衣\t225788\n衬肩\t225789\nraining\t225790\nzhizi\t225791\n南昌装修公司\t225792\n公路护栏网\t225793\n下浮\t225794\n安庆市立医院\t225795\n豫康\t225796\n荣城\t225797\n有价值\t225798\n1000头\t225799\n孙庄村\t225800\n刷写\t225801\n网易艺术\t225802\n怡心湖\t225803\nSSR+锐速\t225804\nalign\t225805\n白凉粉\t225806\n他方\t225807\n170集\t225808\n练手\t225809\n嘉欣\t225810\n刷血\t225811\n巨杉\t225812\n猫侍\t225813\n2018-2024年\t225814\n初始化\t225815\nbrad\t225816\n家庭观\t225817\n王冠军\t225818\n陈锦鸿\t225819\n剑网三五\t225820\nbaumer\t225821\n锂硫电池\t225822\n大气球\t225823\n相悦\t225824\n25英寸\t225825\n指示物\t225826\n周卫东\t225827\n四川大学华西第二医院\t225828\n转\t225829\n残肢\t225830\n海外\t225831\n闪之轨迹\t225832\n大陶陶\t225833\n北滘新城\t225834\n国法\t225835
\n凉山州国土资源局\t225836\n电子调速器\t225837\n宝诚\t225838\noga\t225839\n该地\t225840\nlinex\t225841\n乌鲁木齐市发改委\t225842\nm228b\t225843\n金属材料工程专业\t225844\n事权\t225845\n采薇书院\t225846\n惠州一中\t225847\n第76条\t225848\n天外\t225849\n燃料电池\t225850\n水晶兰\t225851\n捷讯\t225852\nGuardian\t225853\n尼康d7500\t225854\n免听\t225855\n适可而止\t225856\n360win10\t225857\n战斧\t225858\n藤椒\t225859\n金乡县\t225860\n三违\t225861\n85寸\t225862\n常量\t225863\n蜜獾\t225864\n水游城\t225865\n朱海峰\t225866\n释放器\t225867\ni78700\t225868\n单项\t225869\naij\t225870\n齿根\t225871\nvivox\t225872\n顶渲网\t225873\n孙山\t225874\n第834集\t225875\n2016年5月9日\t225876\n尚义\t225877\nICP算法\t225878\n双波\t225879\n滞\t225880\n披锋\t225881\n顶刊\t225882\n光科\t225883\n形形色色\t225884\n沙特\t225885\nprw\t225886\n男神\t225887\n种植户\t225888\n路明\t225889\nproduce101&\t225890\n铁锚\t225891\n中信证券公司\t225892\nFlyknit\t225893\n南通党建网\t225894\n辣椒碱\t225895\n时序\t225896\n做爱\t225897\n长城WEY\t225898\n骑士队\t225899\n伦敦大学\t225900\n攻摄\t225901\n彩虹版\t225902\n大唐宋\t225903\nits\t225904\n死老鼠\t225905\n睾丸囊肿\t225906\n至上主义\t225907\n河南日报网\t225908\n凝望\t225909\n2014-08-17\t225910\n归属权\t225911\n护杠\t225912\n翻台率\t225913\n鬼狐\t225914\n￢\t225915\n学游泳\t225916\n中国废旧物资网\t225917\n吴川市\t225918\n固本强基\t225919\n10米台\t225920\n新疆银行\t225921\nFiref\t225922\n银卡\t225923\npageable\t225924\nkick\t225925\n灌木\t225926\n北师版\t225927\n塔罗牌\t225928\n潮与虎\t225929\nCheckbox复选框控件\t225930\n小牛资本\t225931\n23种\t225932\n唐古拉\t225933\n牌桌\t225934\n80mg\t225935\noneday\t225936\n防烟分区\t225937\nSkia\t225938\n山大南路\t225939\n两种\t225940\n可转债转股\t225941\n青海\t225942\n赵都新城\t225943\n导航网\t225944\n奇幻空间\t225945\n委投\t225946\n字格\t225947\n做到\t225948\nSao\t225949\n恋子\t225950\n格尔木市政府网\t225951\n分水镇\t225952\n5183\t225953\n山西四建集团有限公司\t225954\n内蒙\t225955\n股灾\t225956\n云端翔龙骑士团\t225957\n广东省国家税务局\t225958\n南七\t225959\n树汁\t225960\n绿廊\t225961\nchambers\t225962\n拓宽\t225963\n复议案\t225964\n巴斯克\t225965\nZ9Max\t225966\nIDEA2016\t225967\nTPMS\t225968\n4.4.7\t225969\n13L\t225970\n空号\t225971\n阿片\t225972\n晚间天气预报\t225973\n正德皇帝\t225974\n祭献\t225975\n竹板\t225976\n40独立\t225977\n笔仙大战贞子2\t225978\n糠醛\t225979\n阿拉斯加俱乐部\t2
25980\nPriceline\t225981\n独品\t225982\nThreesome\t225983\nAPI-CSDN\t225984\n绿城房地产集团有限公司\t225985\n分布式爬虫\t225986\n勾芡\t225987\n赵振华\t225988\n反洗脑\t225989\n重量级\t225990\n凤桥镇\t225991\n腾迅\t225992\n认同\t225993\nDunhill\t225994\n040\t225995\n徐瑞\t225996\n过敏性休克\t225997\n天鸿\t225998\n奇梦\t225999\n女猴\t226000\n硝化菌\t226001\n荆门\t226002\nadaptive\t226003\nBorN\t226004\nFletcher\t226005\n国家自然基金\t226006\n合宜\t226007\n平均学分\t226008\nSYM\t226009\n杜如晦\t226010\n回到原点\t226011\n朱姓\t226012\n盐商集团\t226013\nthreats\t226014\nviolated\t226015\n滴漏\t226016\n沧州市教育局\t226017\n680元\t226018\n半缕\t226019\n香精\t226020\nscissors\t226021\nWSN\t226022\n徐州二院\t226023\neyebeam\t226024\nG500\t226025\nwoocommerce\t226026\n雅士\t226027\n茵芙莎\t226028\n陆渡镇\t226029\n有缘无分\t226030\n小豆苗\t226031\n苟利国家生死以,岂因祸福避趋之\t226032\n少狼\t226033\n张晓风\t226034\n华易网\t226035\n号簿\t226036\n二代妖精\t226037\nlobe\t226038\n特部\t226039\n蝾螈\t226040\nUSB充电器\t226041\n褪色的人生\t226042\n狗眼看\t226043\newebeditor\t226044\n2000xxx\t226045\n70d\t226046\n荣誉章\t226047\n天峨\t226048\ndiscrimination\t226049\nDivorce\t226050\n注浆管\t226051\n外阴白斑\t226052\n高昌区\t226053\nsex\t226054\n归元\t226055\nresidual\t226056\n自考生\t226057\n轰六\t226058\n黑龙江省畜牧兽医局\t226059\n婚姻树\t226060\n合成人\t226061\n凤王\t226062\n亚洲博鳌论坛\t226063\n女忍\t226064\n研究领域\t226065\n太昊\t226066\n菊花石\t226067\n1016\t226068\n昨天凌晨\t226069\nProtein\t226070\n六句\t226071\n密钥认证\t226072\n紫泥\t226073\n金宝融\t226074\nmisaki\t226075\n开场\t226076\narms\t226077\n愚钝\t226078\n湖南教育出版社\t226079\n优创\t226080\n两把\t226081\n洒落\t226082\n凌晨两点半\t226083\n老狐狸\t226084\n提手\t226085\nRSR\t226086\n100S\t226087\n求助\t226088\n环太\t226089\n档膜\t226090\n糯米糍\t226091\n就业信息网\t226092\nMessaging\t226093\n王汉宗\t226094\niceblue\t226095\n中国水泥网\t226096\n黑衣男\t226097\n金诚同达律师事务所\t226098\nmp3转换器\t226099\n几组\t226100\nnexusmods\t226101\n聚氨酯喷涂\t226102\n文武双全\t226103\n果渣\t226104\n魔兽世界wow\t226105\n1.02G\t226106\n沙棘汁\t226107\n差别定价\t226108\n眼视光\t226109\n协同作用\t226110\n狗皮膏\t226111\nsysroot\t226112\nxfun吃货俱乐部\t226113\n6.51\t226114\n南开区幼儿园\t226115\n吉林电视台\t226116\n西数\t226117\n柴米油盐\t226118\n安徽网\t226119\nTOMS\t
226120\n台基\t226121\n野牛\t226122\nGTX650Ti\t226123\n期股\t226124\n南马路\t226125\npassat\t226126\n番红花\t226127\nanhydride\t226128\n段意\t226129\n洁洁良\t226130\n教师\t226131\n魔法俏佳人\t226132\n沙月\t226133\n橄榄球场\t226134\n2018-04-13\t226135\nMagicBook\t226136\n刘俊峰\t226137\n德国人\t226138\n显露\t226139\n5立方米\t226140\n敦刻尔克大撤退\t226141\n地数\t226142\n后坐力\t226143\n体验室\t226144\n漳州一中\t226145\ntsx\t226146\n国防科工局\t226147\n国关\t226148\nCharts\t226149\n上海外国语学院\t226150\n爱宕\t226151\n无线电\t226152\n僵尸版\t226153\n15日前\t226154\n上万元\t226155\n偏科\t226156\n轩窗\t226157\n砍伐\t226158\nPHP+MySQL数据库\t226159\n修正带\t226160\n素绉缎\t226161\n藏不住\t226162\n赫鲁晓夫\t226163\niGame\t226164\n襄阳论坛\t226165\nxlam\t226166\n赣南\t226167\n瞻前顾后\t226168\ntousu\t226169\n不爱的人\t226170\n坪山新区\t226171\n5420\t226172\n菩萨蛮\t226173\n熊猫星秀\t226174\nEthical\t226175\n一室户\t226176\n森达\t226177\n方料\t226178\n直销员\t226179\n省政府办公厅\t226180\nff14吧_\t226181\n三段式\t226182\n家俊\t226183\nEarphones\t226184\n假面骑士斗骑大战\t226185\n雷光夏\t226186\nToolchain\t226187\n南昌经开区\t226188\n深圳10区\t226189\n虾哥论坛\t226190\n胖墩\t226191\n断章取义\t226192\n阿花\t226193\n练号\t226194\n郑万高铁\t226195\nones\t226196\n大亚湾西区\t226197\nDidi\t226198\n有形动产租赁\t226199\n医疗室\t226200\n677\t226201\nontrack\t226202\n巴浪\t226203\n养肾\t226204\n站酷海洛\t226205\n大麦若叶\t226206\n文昌鱼\t226207\n妖琴师\t226208\nPolymerase\t226209\n大产\t226210\n绫濑恋\t226211\n液氮罐\t226212\n反抽\t226213\nbeeone\t226214\n心深处\t226215\n海中金\t226216\nhugepage\t226217\nValidate\t226218\n方逸华\t226219\n星创天地\t226220\n春光灿烂猪八戒\t226221\nbandai\t226222\nlinkin\t226223\n爽肤水\t226224\n祥和家园\t226225\n旅游城\t226226\n受罚者\t226227\n磨合\t226228\n真爱的谎言\t226229\n巧克力囊肿\t226230\n嘉怡\t226231\n销票\t226232\niwebshop\t226233\n银宝\t226234\nAirDroid\t226235\n加减\t226236\n共轭\t226237\n第五次\t226238\n袁贵仁\t226239\n祥安\t226240\n中国化工学会\t226241\ntinyos\t226242\n含磷\t226243\nfeisky\t226244\nVirtual\t226245\n嘎玛仁波切\t226246\n消炎片\t226247\n解释性\t226248\n低碳钢\t226249\n夏三万\t226250\n6二代\t226251\n755\t226252\n融信海上城\t226253\n放课\t226254\nIDEA15\t226255\n激情戏\t226256\n诗化\t226257\n尾页\t226258\n三途川\t226259\n金牛镇\t226260\n尾蛇\t226261\n孔型\t226262\n型钢\
t226263\n一体\t226264\n金王子\t226265\n石砖\t226266\n47.5\t226267\n单胺氧化酶\t226268\n过大年\t226269\n24g\t226270\nNOP\t226271\n大转盘\t226272\n一了百了\t226273\n张安\t226274\n船检\t226275\nCUCM\t226276\n30毫米\t226277\n佛咒\t226278\n新疆交通职业技术学院\t226279\nlinunx\t226280\nwin7安全模式\t226281\n不能不说\t226282\n潇湘子\t226283\n何泓姗\t226284\n东湖高新\t226285\nsche\t226286\n畅言\t226287\n润田\t226288\n样本数\t226289\n一而再\t226290\nTesters\t226291\n新宝来\t226292\n188A\t226293\n伊莎贝尔·阿佳妮\t226294\n七日情\t226295\n节税\t226296\n济南市长清区政府\t226297\nraid\t226298\n河南医院\t226299\nblazblue\t226300\n高瞻远瞩\t226301\n邪恶漫画之火影忍者\t226302\n谢记\t226303\n态氧\t226304\n造桥\t226305\n2281\t226306\n多架\t226307\n_匿名\t226308\n临江镇\t226309\n23dm\t226310\n信头\t226311\n淘宝源\t226312\n储水式热水器\t226313\nweb服务器\t226314\n雷尼\t226315\n804\t226316\nsoma\t226317\n南京师范大学附属实验学校\t226318\n装订机\t226319\n手气\t226320\n豌豆蛋白\t226321\n狂妻\t226322\n学园\t226323\nFool\t226324\nsudden\t226325\n热血勇士\t226326\n龙游政府网\t226327\n萨满祭司\t226328\nLazada\t226329\n博洛尼亚\t226330\n喷色\t226331\n若羌县人民政府\t226332\n中青旅控股股份有限公司\t226333\n左丘\t226334\n古墓丽影11\t226335\n容大\t226336\niBooks\t226337\n江州区\t226338\n菜窖\t226339\n玛莎拉\t226340\n核武器\t226341\n往下跳\t226342\n粘贴式\t226343\nParticipation\t226344\n中学学科网_海量中小学\t226345\n进出货\t226346\n准确度\t226347\nTextbox\t226348\n毛阿敏\t226349\n西乌珠穆沁旗\t226350\n阿里应用分发\t226351\n1-3天\t226352\n持之以恒\t226353\nJeecg\t226354\n合而为一\t226355\n飞达广播网\t226356\n88家\t226357\n辟谷\t226358\n歼击\t226359\n方包\t226360\n宝马2系\t226361\n13所\t226362\n再生稻\t226363\ncigar\t226364\n场上\t226365\n17套\t226366\n郎咸平\t226367\n鹅肝酱\t226368\n男囚\t226369\n第六十二条\t226370\n笑着\t226371\n脱粒\t226372\nMathematical\t226373\n山东国税门户网站_山东省国家税务局\t226374\n委托单\t226375\n蔡幸娟\t226376\n千星之城\t226377\nPRIMO\t226378\n币界\t226379\n首都医科大学宣武医院\t226380\n75D\t226381\n啸声\t226382\nTDA\t226383\n偏性\t226384\nHITA\t226385\n茶镜\t226386\nTrio\t226387\n利特村\t226388\nWin7之家\t226389\n泥质\t226390\n战舰少女r\t226391\n产量\t226392\nrequesting\t226393\n哈尔滨新区\t226394\n史地\t226395\n最最\t226396\n李达康\t226397\n相映成趣\t226398\ngte\t226399\n肖川\t226400\n萨克斯曲\t226401\n0.15mm\t226402\n子时\t226403\n姬菲奥娜\t226404\
nSm\t226405\n两居\t226406\n债务率\t226407\nesr\t226408\n语文教学通讯\t226409\n国宇\t226410\n劳宫穴\t226411\n蛇\t226412\n财位\t226413\nalpha2\t226414\n直接税\t226415\n蔡林\t226416\n100l\t226417\n张慧芳\t226418\n电袋\t226419\n睢阳\t226420\n绿萝裙\t226421\n仙女裙\t226422\n中新\t226423\n李兴\t226424\n随礼\t226425\nIII期\t226426\ninternals\t226427\n贸易有限公司\t226428\nSourceForge\t226429\n一掷千金\t226430\n三藩\t226431\n4988\t226432\n第208集\t226433\n独立保函\t226434\n谭艳\t226435\n华世奎\t226436\n第16集\t226437\n丰田C-HR\t226438\ngrub2\t226439\n疯狂的兔子\t226440\n日语翻译吧\t226441\n刺客信条:枭雄\t226442\n绝世美人\t226443\n超经典\t226444\n干酵母\t226445\n李学军\t226446\n养老展\t226447\n透视镜\t226448\nWithin\t226449\nadversarial\t226450\n淞沪会战\t226451\n雪飞\t226452\n斜街\t226453\n俺们\t226454\n95369\t226455\n9月15\t226456\n地环\t226457\n吸钱\t226458\n郑州人才网\t226459\n有作\t226460\n神经漫游者\t226461\n欧文莱\t226462\n煮鸡蛋\t226463\nresign\t226464\nv2017\t226465\n防倒\t226466\n词汇\t226467\n主题公园\t226468\n色影师\t226469\n共享式\t226470\n食机\t226471\na13\t226472\n陈海燕\t226473\n叶仙\t226474\n骏龙\t226475\n80厘米\t226476\n杨小宝\t226477\n田缘\t226478\nワンピ\t226479\n好听\t226480\n捕鱼达人千炮版\t226481\n浙江大学基础医学院\t226482\n咽\t226483\n微软sculpt\t226484\n征求\t226485\n冷热\t226486\n左西\t226487\n理光Ricoh\t226488\n四川省政府政务服务和公共资源交易服务中心\t226489\n三江购物\t226490\n烤烟型\t226491\n芬顿\t226492\n长沙地铁2号线\t226493\n卤鸡\t226494\n盆浴\t226495\n1服\t226496\n前辈们\t226497\n跑台\t226498\n10万一个\t226499\n平板式\t226500\nform\t226501\n份\t226502\n海水鱼\t226503\n860M\t226504\n秘汤\t226505\n瓜瓜\t226506\n地球防卫军4.1\t226507\n苏岩\t226508\nVBGood\t226509\n话框\t226510\n内径\t226511\n模卡电视\t226512\n张津涤\t226513\n32P\t226514\n立恒\t226515\n露得\t226516\n京都议定书\t226517\n爱迪尔\t226518\n叫嚣\t226519\n康肃\t226520\n数位屏\t226521\n成都住房公积金管理中心\t226522\n划片机\t226523\n银行业\t226524\n昭关\t226525\n3d3s\t226526\nmaze\t226527\n4遍\t226528\nBow\t226529\n坎妹\t226530\npte\t226531\ndwa\t226532\n蚂蚁flow\t226533\nChrome模拟手机浏览器\t226534\n秦伟\t226535\n前后径\t226536\n长方体的体积\t226537\n4公分\t226538\n华鑫证券\t226539\n李永辉\t226540\n三十三年\t226541\n张继聪\t226542\n若琳\t226543\n6025\t226544\n三星堆遗址\t226545\n大恒科技\t226546\n朱铭\t226547\ns函数\t226548\n卓翼科技\t226549\n雷亚架\t226
550\nninja250\t226551\n礼品盒\t226552\n三维度\t226553\n水凝膜\t226554\nwangEditor\t226555\n2033\t226556\n糖精\t226557\n15千米\t226558\n像素\t226559\n留守老人\t226560\n王少杰\t226561\nXXXXXX\t226562\n宋祖儿\t226563\n高山南\t226564\n黄宥明\t226565\n7057\t226566\n雪山姑娘\t226567\nnahco3\t226568\nodoo\t226569\n多一个\t226570\n终章\t226571\n长乐宫\t226572\n火拳\t226573\n5x\t226574\n东光小区\t226575\n基盘\t226576\n医用制氧机\t226577\n夜关门:欲望之花\t226578\n兵\t226579\n大成律师事务所\t226580\nconcern\t226581\nroobo\t226582\n青龙场\t226583\n鲜芋\t226584\n一芯\t226585\nrenmin\t226586\nargentina\t226587\n成倍\t226588\n政法学院\t226589\n明智吾郎\t226590\n鲜食\t226591\nFlorian\t226592\n制证\t226593\n学政\t226594\n慕煜城\t226595\n睡枕\t226596\nHARUKA\t226597\n贝克曼\t226598\n二等品\t226599\n耳钉\t226600\ncgw\t226601\n袁雪芬\t226602\n项目管理部\t226603\n素土\t226604\nHynix\t226605\n城市频道\t226606\n经典文学\t226607\n湖北省肿瘤医院\t226608\n1539\t226609\n食品科学与工程\t226610\n京城\t226611\n古劳水乡\t226612\n杨乐乐\t226613\n团购群\t226614\n流通\t226615\n杏\t226616\n美克\t226617\n荆南\t226618\n一个多小时\t226619\n研究类\t226620\n月色\t226621\n丰景春\t226622\n鹿目\t226623\nnodes\t226624\n大雄\t226625\nOZ\t226626\n浙江省建设厅\t226627\n响指\t226628\nRole\t226629\n气运\t226630\n鸾凤玉\t226631\n刘路\t226632\nLuouy\t226633\nPanabit\t226634\n朱进忠\t226635\n黄仁宇\t226636\n张老大\t226637\n花帽\t226638\nBB弹\t226639\n麻江县\t226640\n1.27a\t226641\n专委\t226642\n二清机\t226643\n中兴街\t226644\n战神1\t226645\nflowing\t226646\n筏竿\t226647\n小姚\t226648\n迅达\t226649\n战训\t226650\n北街\t226651\n鼎好\t226652\n7908\t226653\n有劳\t226654\n流放之路S3\t226655\nz800\t226656\n美盛\t226657\nzed\t226658\n燕十三\t226659\n3万5\t226660\n上下班\t226661\n信息管理员\t226662\n几分之几米\t226663\n任正\t226664\nelsa\t226665\n多少次\t226666\n袁岳\t226667\n曾卓君\t226668\nShelton\t226669\n10-3\t226670\n基票\t226671\nblower\t226672\n黄蜡石\t226673\n江苏法制报\t226674\n宁宇\t226675\n白净\t226676\n盘式\t226677\nNexus\t226678\n网购\t226679\n侯建国\t226680\n王岳川\t226681\n框架篇\t226682\n拉美西斯二世\t226683\n和式\t226684\ncutterman\t226685\n欧亚集团\t226686\n目錄\t226687\n8899\t226688\nCaCl2\t226689\n自取灭亡\t226690\nSpringAOP\t226691\n清馨\t226692\n脚下\t226693\n云盟在线\t226694\n苏北\t226695\n张弛有度\t226696\n东户\t22669
7\n踏入\t226698\n中山大学外国语学院\t226699\n512护士节\t226700\n西安交通大学\t226701\n星际战甲吧\t226702\n7.5.3\t226703\n永平县\t226704\n连花清瘟胶囊\t226705\n海底大猎杀\t226706\n阿尔伯塔大学\t226707\n宅急送\t226708\n第6卷\t226709\n华南师范大学新陶园\t226710\n91pron\t226711\n撞墙\t226712\n性命\t226713\n披头散发\t226714\n流程图\t226715\n市编办\t226716\n磷酸钙\t226717\n蛮腰\t226718\n罪魇\t226719\ny币\t226720\nNSNumber\t226721\n共饮\t226722\n侍从\t226723\n岂不\t226724\n孩子王\t226725\n病入膏肓\t226726\n20世纪初期\t226727\nHowell\t226728\n义薄云天\t226729\n花地湾\t226730\n冲刺赛车物语\t226731\n拾光\t226732\n太空铝\t226733\njishu\t226734\n中建集团\t226735\nkingkong\t226736\n洪浩\t226737\n4u\t226738\n水土保持法\t226739\n脂溢性皮炎\t226740\n活泉\t226741\n票方\t226742\n巧虎岛\t226743\nJGJ\t226744\n成都兴城投资集团有限公司\t226745\n可不可数\t226746\n盐焗鸡\t226747\n甲基苯胺\t226748\n美艺\t226749\n名穴\t226750\n2v\t226751\nbaoan\t226752\n海陆空天惯性世界\t226753\n周刊\t226754\n中国信达资产管理公司\t226755\n兰花科创\t226756\n武汉市环保局\t226757\n百名红通\t226758\n磁怪\t226759\n服務\t226760\n高希希\t226761\n400G\t226762\n音王\t226763\nTreePanel\t226764\nmidifan\t226765\n污水\t226766\n配角\t226767\n降临\t226768\n第二宫\t226769\nmeiyuan\t226770\n违例\t226771\n考昔\t226772\n山东大学管理学院\t226773\n幽城幻剑\t226774\n互不\t226775\n第二十二批\t226776\nFW\t226777\n5支\t226778\n太和县政府网\t226779\n永泰天门山\t226780\n死路一条\t226781\n上海钜派投资集团有限公司\t226782\n2130\t226783\n城市化率\t226784\n人台\t226785\n专利期\t226786\nogtt\t226787\n波风水门\t226788\n王若飞\t226789\n并肩作战\t226790\n阿里云万网\t226791\n财规\t226792\nCHOKmoson\t226793\n普特英语\t226794\n圣巴巴拉\t226795\n工业遗产\t226796\nphison\t226797\n云水谣\t226798\nWindows10搜索框\t226799\n偷偷地\t226800\n东风标致408\t226801\n雅治\t226802\n请稍候\t226803\n延川\t226804\n索博\t226805\n_纪实台\t226806\n口径\t226807\n锦富\t226808\n胺碘酮\t226809\n月下美人浴_护花狂龙\t226810\n老派\t226811\nblu\t226812\n普定县\t226813\n龙渊\t226814\n赵忠\t226815\nDos\t226816\n抗倍特\t226817\nket\t226818\n纤体\t226819\n神荼\t226820\n荣巷\t226821\n拉丁舞鞋\t226822\n贪官们\t226823\n全盔\t226824\n物态\t226825\n老残游记\t226826\n调剂员\t226827\n20.6\t226828\n2018.01\t226829\n蟠龙山\t226830\nerika\t226831\nswfupload\t226832\n第九人民医院\t226833\ntranscription\t226834\n朗阁在线培训中心\t226835\n耳坠\t226836\n授人以渔\t226837\n管道工\t226838\n肝病科\t
226839\n3999元\t226840\n东莞市中级人民法院\t226841\n炮线\t226842\n小额管理费\t226843\n选子\t226844\n上海教育\t226845\n爱回收网\t226846\nsprin\t226847\nwinnie\t226848\n开福\t226849\nmes\t226850\nSOA\t226851\nSaliency\t226852\n山西统计信息网\t226853\n接龙\t226854\n卡纸\t226855\nappeyes\t226856\n丁薛祥\t226857\n中标人\t226858\n马跃\t226859\n融资担保公司监督管理条例\t226860\n慢性肺心病\t226861\n滑动窗口\t226862\n度假\t226863\n欧也妮葛朗台\t226864\n冠农股份\t226865\n普惠\t226866\n般配\t226867\n以案释法\t226868\n生地黄\t226869\n结子\t226870\nliquid\t226871\n悟吉塔\t226872\nReplay\t226873\nYUM\t226874\n汪强胜\t226875\nyutingliuyl\t226876\n学驾\t226877\n凤凰系统x86\t226878\n发明人\t226879\n泳池\t226880\n五斗柜\t226881\nDEV-C++\t226882\n第一把\t226883\n高小淑\t226884\n中国异闻录\t226885\n指数化\t226886\nyike\t226887\n吐舌\t226888\n中国特种设备检测研究院\t226889\n25亩\t226890\n襄阳城\t226891\n招警考试网\t226892\n老子明天不上班\t226893\nNight\t226894\n裸戏\t226895\nLTP\t226896\nb7000\t226897\n血债血偿\t226898\n煊赫门\t226899\n赏玩\t226900\n快龙\t226901\n超少年密码\t226902\n入组\t226903\n少白头\t226904\n挡箭\t226905\nANP\t226906\n祖逖\t226907\nSQL盲注\t226908\n周旋\t226909\n徐志伟\t226910\n潜变量\t226911\n弱视治疗仪\t226912\n疯狂天后\t226913\n顶格\t226914\n圆通寺\t226915\n重载连接器\t226916\n沉孔\t226917\n渔获\t226918\n球粒\t226919\n腰酸\t226920\n火炎焱燚\t226921\n职阶\t226922\n玮哥\t226923\nBAZOOKA\t226924\n哈尔滨市人大常委会\t226925\n中西里\t226926\n搜狐美食\t226927\n叶音英\t226928\n茅山后裔\t226929\n廉情\t226930\n魏敏\t226931\nuk\t226932\n阿希\t226933\n鸠江\t226934\n神秘园之歌\t226935\nzavr\t226936\n米庄\t226937\n北京奇虎科技有限公司\t226938\n塔读文学网\t226939\n电控\t226940\n俱舍论\t226941\n罗马发音\t226942\nawvs\t226943\n清污\t226944\n旺盛\t226945\nv5.3.1\t226946\n平昌\t226947\n石家庄市区\t226948\n第46章\t226949\n数显压力表\t226950\nwifi信号满格\t226951\n信达澳银\t226952\n可证\t226953\n茅台王子酒\t226954\n穆晟铭\t226955\nm115b\t226956\n青笋\t226957\n大药\t226958\nContent-Type\t226959\nWeGene\t226960\n江宁医院\t226961\n文忠\t226962\nccp\t226963\n锐创网络\t226964\n公共资源交易信息网\t226965\n振安\t226966\n20160827\t226967\n永清街\t226968\n迎春街\t226969\n调治\t226970\npoland\t226971\n相供电\t226972\n颌骨囊肿\t226973\n吹掉\t226974\n乳清蛋白\t226975\n三亚婚纱摄影工作室\t226976\n迅达(中国)电梯有限公司\t226977\n秦羽墨\t226978\n别无\t226979\n小龙坎\t226980\n行政管理\t226981\n
英美\t226982\nBubble\t226983\n陈安琪\t226984\nSCTP\t226985\n申江\t226986\nCocos2d\t226987\n文华\t226988\n伺服\t226989\n张志军\t226990\n4752g\t226991\n中南半岛\t226992\njudy\t226993\n南人\t226994\n云商城\t226995\n寒意\t226996\n新机型\t226997\npolaroid\t226998\n唐俊\t226999\n十几年\t227000\n千里走单骑\t227001\n聘干\t227002\nSci-Hub\t227003\n梁家仁\t227004\n爱\t227005\n红花岭\t227006\nr4烧录卡\t227007\n周妍希\t227008\n太阳能汽车\t227009\n股票市场价格\t227010\nLIBS\t227011\n屏风隔断\t227012\n齐套\t227013\n男人和一个女人\t227014\n最后机会\t227015\n三九胃泰颗粒\t227016\n卫生室\t227017\npinyin\t227018\n2237\t227019\n炎黄\t227020\nNOVA\t227021\n扫门\t227022\n月光银\t227023\n商务咨询公司\t227024\nigf\t227025\n开机启动\t227026\n萌萌哒天团\t227027\n第43集\t227028\n粕\t227029\n乱世佳人\t227030\n42亿元\t227031\n好笑\t227032\nRelay\t227033\n全局\t227034\n重庆卷\t227035\n通用机\t227036\n李竞\t227037\n素素\t227038\n顾晓松\t227039\nenjoying\t227040\nJrebel\t227041\n黄鹂\t227042\n张永华\t227043\n文殊菩萨\t227044\n高新大道\t227045\n赤井\t227046\nOEMSF\t227047\n酷开\t227048\nYue\t227049\n360秒\t227050\noctomap\t227051\n自视\t227052\n吉野\t227053\n中骏雍景湾\t227054\nTes\t227055\nclasspath*\t227056\n受审\t227057\n梢\t227058\nbeta6\t227059\n杨莹\t227060\n注释行\t227061\n原调\t227062\n王彪\t227063\n中国煤炭地质总局\t227064\n诺贝尔物理学奖\t227065\n牙刷架\t227066\ngea\t227067\nvc++6.0\t227068\nall银\t227069\n下半生\t227070\n统招\t227071\n带轮\t227072\n在行\t227073\ndecode\t227074\n乔杉修睿\t227075\nspt\t227076\n祥哥\t227077\n响堂山石窟\t227078\n72道\t227079\n拜仁慕\t227080\njiying\t227081\n天字一号\t227082\nubtuntu\t227083\n赵晓波\t227084\n一次机会\t227085\n郑敬淏\t227086\n小米扫地机\t227087\naime\t227088\n加冕成王\t227089\n杜月\t227090\n王爱民\t227091\n亿豪\t227092\n机器学习算法\t227093\nnuan\t227094\n褐色\t227095\ncctvb\t227096\nGlock\t227097\n三国志刘备传\t227098\n掉漆\t227099\n6.1特别篇\t227100\n青海省发展和改革委员会\t227101\n车上\t227102\n丰城电厂\t227103\n宁波海曙\t227104\n御敌\t227105\n飞烟\t227106\n德隆威廉姆斯\t227107\n坏女孩\t227108\n朱洪武\t227109\n女雏\t227110\n裁床\t227111\n系統\t227112\n哈韩\t227113\n角隅\t227114\nMouser\t227115\n广联\t227116\n苏科大\t227117\n隆突\t227118\n喝彩\t227119\n四杆\t227120\n陈婕\t227121\n福州华伦中学\t227122\n桐树\t227123\nuialert\t227124\n密山\t227125\nLSG\t227126\n武汉市工商行政管理局\t22
7127\n微积分学\t227128\n兰卡斯特大学\t227129\n梅河口\t227130\n锋友\t227131\n三亚机场\t227132\nvivomove\t227133\nAshford\t227134\n王致和\t227135\nitprobie-菜鸟\t227136\n中金证券\t227137\nUITabBarController\t227138\n秋吉\t227139\n4月21日\t227140\n湖南教育网\t227141\n绿茶软件园|33LC.com\t227142\n舞力\t227143\nEMF\t227144\n路由宝\t227145\nstabilization\t227146\n3岁\t227147\n凉水塔\t227148\nQuizzes\t227149\n叶选廉\t227150\n散场\t227151\nPRESTIGE\t227152\n覅\t227153\n非食用物质\t227154\n秦君\t227155\n铆接\t227156\n爱听网\t227157\n放射性核素\t227158\n开战\t227159\n平坝县\t227160\n法律顾问\t227161\n匿名块\t227162\n计划处\t227163\n形物\t227164\n上海洋山\t227165\n开包\t227166\n郭沫安妮海瑟薇\t227167\nSteinberg\t227168\n泡菊花茶\t227169\n学库宝\t227170\n套衫\t227171\n气血两虚\t227172\n将向\t227173\n千山暮雪\t227174\nManolo\t227175\n性差\t227176\n三四句\t227177\ntrs\t227178\n外部类\t227179\nwhat\t227180\nshooter\t227181\n成都地铁7号线\t227182\ncricket\t227183\nmaoyi\t227184\n毛裤\t227185\n枫香\t227186\n尺子\t227187\n硬面\t227188\ncarpe\t227189\n畅玩7x\t227190\n西魏\t227191\n魄罗\t227192\n第49届\t227193\n交满\t227194\n危险关系\t227195\nLeaked\t227196\ncctv5+\t227197\n归园田\t227198\n全社\t227199\nkokobop\t227200\n600380\t227201\n南海电铁\t227202\nvivosport\t227203\napi函数\t227204\n子父类\t227205\nd站\t227206\n刘墉\t227207\n超级高铁\t227208\n派克笔\t227209\nPharm\t227210\n故意伤害案\t227211\n几巴\t227212\n广盛\t227213\n海南日报数字报\t227214\n拓展篇\t227215\n大屯街道\t227216\n同人\t227217\n一首\t227218\nComboBox\t227219\n顶弄\t227220\n公输\t227221\n中信出版集团\t227222\nMCM/ICM\t227223\n气球塔防5\t227224\n胜天\t227225\n5.12护士节\t227226\niFunBox\t227227\n彩网\t227228\n杨新海\t227229\n藏尸\t227230\n联合国国际货物销售合同公约\t227231\nblingbling\t227232\nTheoretical\t227233\nKeePass\t227234\n绝赞\t227235\n老百姓\t227236\n商用电脑\t227237\n猎鱼\t227238\n针灸学\t227239\n电工类\t227240\n韩羽\t227241\n胡杰\t227242\n防磁柜\t227243\n蒜头鼻\t227244\n美索不达米亚\t227245\n43.\t227246\n方值\t227247\nCFPL\t227248\nTaxes\t227249\n两番\t227250\n休闲装\t227251\n三山街\t227252\nDwan\t227253\n路怒症\t227254\nyoutd\t227255\n黑暗史\t227256\n达州市住房和城乡建设局\t227257\n住世\t227258\n六场\t227259\nthorn\t227260\n上古音\t227261\n爱卡汽车俱乐部\t227262\n胜利路\t227263\n膨大\t227264\n上下其手\t227265\n手机播放器\t227266\n菲翔\t22
7267\ngsm\t227268\n雅安新闻网\t227269\n婆娑\t227270\n拘役\t227271\n51家\t227272\n真彩\t227273\n勇挑\t227274\n真丝\t227275\n便利度\t227276\n台电X98\t227277\n攻伐\t227278\nsqte\t227279\n易辅客栈\t227280\nc200l\t227281\n有经\t227282\n芦竹\t227283\nwim10\t227284\n软法\t227285\n三江\t227286\n单一化\t227287\n老死\t227288\n华漕\t227289\n灭罪师\t227290\n自诉\t227291\n弦乐\t227292\n90s\t227293\n九个月\t227294\n精灵骑士\t227295\n青海日报\t227296\n100多亿\t227297\n布拉德·皮特\t227298\nletsencrypt\t227299\n方志馆\t227300\n摩尔根\t227301\n宋婷婷\t227302\n8.4%\t227303\ngland\t227304\n井田\t227305\n常伴\t227306\n龙目岛\t227307\nSMIC\t227308\n中国涂料人才网\t227309\nVario\t227310\n感受到\t227311\n三国志9吧\t227312\nFLA\t227313\nHashMap\t227314\n唐德刚\t227315\nearnest\t227316\n超能造梦\t227317\nporns\t227318\n东财通\t227319\n以内\t227320\nseated\t227321\n奥格瑞玛\t227322\n轻装上阵\t227323\n人生若如初相见\t227324\ncar2go\t227325\nemperor\t227326\n李志清\t227327\n爱就一个字\t227328\n高志强\t227329\n数据采集卡\t227330\npublisher\t227331\n百科全书\t227332\n宫羽\t227333\n女俘\t227334\n剩余电流互感器\t227335\n南方日报\t227336\nground\t227337\n亚健康\t227338\n关帝灵\t227339\n缤智\t227340\nLo\t227341\n紫荆阁\t227342\n集美大学诚毅学院\t227343\n辽宁省质量技术监督局\t227344\n李露\t227345\nExplosion\t227346\n千子村\t227347\n有所属\t227348\n上善若\t227349\n彩绘\t227350\n统一卷\t227351\nagro\t227352\n诡\t227353\n金霉素眼膏\t227354\nSWF播放器\t227355\n伐木\t227356\ncurd\t227357\n和号\t227358\n八十中学\t227359\nechoes\t227360\n坡子街\t227361\nccc认证\t227362\n污水处\t227363\n0355\t227364\n我爱卡\t227365\n黑拳\t227366\n徐秀娟\t227367\n233_\t227368\n免费日\t227369\n天天飞车\t227370\n房房网\t227371\n0776\t227372\n组卷\t227373\n张天德\t227374\n齐鲁证券\t227375\n赌球网\t227376\n一觉\t227377\n2017年12月1日起\t227378\nengineers\t227379\n预约码\t227380\n夜色邦福利网\t227381\n王毅\t227382\nE\t227383\n成都市国土资源局\t227384\n张敏\t227385\n每日一练\t227386\n起坐\t227387\nadnroid\t227388\nB族\t227389\n模拟浏览器\t227390\n夺位\t227391\n10000多\t227392\n平装\t227393\n邪恶漫画中文网\t227394\nFort\t227395\n温州日报数字报\t227396\n恳请\t227397\n民事诉讼时效\t227398\n防化服\t227399\nSour\t227400\n野中\t227401\n氹仔码头\t227402\n23个月\t227403\n等级赛\t227404\n北京艺星整形美容医院\t227405\n团风\t227406\n酱香\t227407\n僧尼\t227408\nExam8.com\t227409\n回梦丹\t227410\
ncs1\t227411\n烟台东\t227412\n上海城投\t227413\n洪浪北\t227414\n夜战\t227415\n我和春天有个约会\t227416\n张长青\t227417\n东京战记\t227418\n大舌头\t227419\n铁道\t227420\n黑鹰S\t227421\n圆的面积\t227422\nWrench\t227423\n18039\t227424\n四幅\t227425\n三角帽\t227426\n陈庆华\t227427\nbush\t227428\n瓜\t227429\n物主\t227430\n威盾\t227431\n略去\t227432\n未接电话\t227433\n武汉东湖宾馆\t227434\n美康\t227435\nQUEEN\t227436\n中国东盟自由贸易区\t227437\nbreakthrough\t227438\n张扣扣\t227439\n家用型\t227440\n酮基\t227441\nipad2018\t227442\n道德感\t227443\nrav4\t227444\n加成\t227445\n金属贴\t227446\n二十四道\t227447\n大感\t227448\n哭脸\t227449\npapillon\t227450\n温室效应\t227451\n原卷\t227452\n云城\t227453\n脏兮兮\t227454\n破界\t227455\n刘红梅\t227456\n83993160\t227457\n第136章\t227458\nNewbie\t227459\n牛仔骨\t227460\n绝体\t227461\n维希\t227462\nslender\t227463\n试用版\t227464\n背胶\t227465\n电位滴定法\t227466\n第2名\t227467\n生活品\t227468\n看玩\t227469\n冷擎\t227470\n赞\t227471\ncs35\t227472\n肘形\t227473\n奴家\t227474\n魏千\t227475\n道路交通安全法实施条例\t227476\n龙归镇\t227477\nhumans\t227478\n39天\t227479\n分包人\t227480\n业字\t227481\n反思性\t227482\n火烧云\t227483\n首发价\t227484\n火焰纹章苍炎\t227485\n中日甲午战争\t227486\n集成测试\t227487\n日产新阳光\t227488\n胶囊式\t227489\n教员\t227490\n抬轿\t227491\n节余\t227492\nfew\t227493\n卷材\t227494\n操控\t227495\n挂旗\t227496\n中山大学附属第七医院\t227497\n床战\t227498\n建县\t227499\n客家新闻网\t227500\n辛硫磷\t227501\n分批法\t227502\n慈溪论坛\t227503\n黑宝石\t227504\n渗出性\t227505\n命令框\t227506\n足球界\t227507\n450吨\t227508\n老干部\t227509\nMARS\t227510\n120厘米\t227511\n何佩瑜\t227512\nCVV\t227513\n队套\t227514\n节奏\t227515\nsantos\t227516\nxrp\t227517\n硫辛酸\t227518\n柱线\t227519\n美希\t227520\nKris\t227521\n4423\t227522\n苦衷\t227523\n愿者\t227524\n莫忘\t227525\n咸阳新闻网\t227526\ndans\t227527\n多格\t227528\n丘咲\t227529\n白胶\t227530\n胡凯\t227531\n注册表项\t227532\n微信学习网\t227533\ndengji\t227534\n袍江新区\t227535\n林雪\t227536\nRALPH\t227537\nascent\t227538\n甲骨文丛书\t227539\n热泪盈眶\t227540\nJillian\t227541\n山手栞\t227542\nStraw\t227543\n北京门头沟区\t227544\n菩提子\t227545\n大坦沙\t227546\nshaoxing\t227547\n四年后\t227548\n内蒙古大草原\t227549\n取向\t227550\n国际卡\t227551\n光荣册\t227552\n天行九歌\t227553\n酒泉卫星发射中心\t227554\n万喜\t227555\n女高跟鞋\t227556\narre
st\t227557\n颐和家园\t227558\n郑杰\t227559\n新街口街道\t227560\n_九\t227561\n旁观者清\t227562\nthermofisher\t227563\n林朝英\t227564\n给水栓\t227565\n中南大学信息科学与工程学院\t227566\n第十二代\t227567\nNestle\t227568\n3.89\t227569\nLinguistic\t227570\n7周年\t227571\n凝脂\t227572\n深市上市公司\t227573\nGRUB2\t227574\n自激\t227575\n六氟异丙醇\t227576\n基础设施\t227577\n人造大理石\t227578\n访友\t227579\n60题\t227580\nvivox9s\t227581\n上海总工会\t227582\n项王故里\t227583\n越州\t227584\n中国大酒店\t227585\n文儿\t227586\n杨凌示范区\t227587\nvbox虚拟机\t227588\n点睛\t227589\n大木山\t227590\n客宝\t227591\n嘴嘴\t227592\n给我的爱\t227593\n澄溪镇\t227594\nMQL\t227595\n淮水\t227596\n定线\t227597\n星闻\t227598\n橄榄山\t227599\n面套\t227600\n缩管机\t227601\n高云欧弟\t227602\n李放鸣\t227603\n狼人\t227604\n填列\t227605\n冠绝\t227606\n馆档网\t227607\n淄博职业学院\t227608\n普洱茶饼\t227609\n4000毫安\t227610\n惠州市人力资源和社会保障局\t227611\n上海房价网\t227612\n侍女\t227613\n黄河票务网\t227614\n学习类\t227615\n不疯\t227616\n甚远\t227617\nOPPLE\t227618\nQQ分组网\t227619\n人身意外险\t227620\n54321\t227621\n各款\t227622\n安徽省博物馆\t227623\n京威股份\t227624\n烟台机场\t227625\n雨露文章网\t227626\nA30\t227627\n汇付\t227628\n感受性\t227629\n凤凰于飞\t227630\n苏白\t227631\n池袋\t227632\n这位\t227633\n止痛片\t227634\n安全感\t227635\ng480\t227636\n运河镇\t227637\n要紧\t227638\n两格\t227639\n声临其境\t227640\nInt\t227641\n之二\t227642\n条状物\t227643\n复变函数\t227644\n酷斯特\t227645\n杨德昌\t227646\n浅川梨奈\t227647\n天下无谋\t227648\n番茄助手\t227649\n钧窑\t227650\nPrintf\t227651\n乐词\t227652\n20150705\t227653\n矿床学\t227654\ndashboard\t227655\n出口成章\t227656\n热缩式\t227657\n刷道\t227658\n自收\t227659\n社保减员\t227660\n厦门市地方税务局\t227661\n神木\t227662\n房地图\t227663\nIIR\t227664\n问道网\t227665\n平安i贷\t227666\n莫默\t227667\n驾驶者\t227668\n莒南县\t227669\n扶沟\t227670\n近两年\t227671\n大阪关西国际机场\t227672\n土豪镇\t227673\n无痛分娩针\t227674\n邬童\t227675\nbasics\t227676\n打印线\t227677\n中国华西工程设计建设有限公司\t227678\n符文s8\t227679\n大片级\t227680\n文港镇\t227681\n阿瓦隆之王\t227682\n980M\t227683\nPHOENIX\t227684\nUtilization\t227685\n三菱PLC编程软件\t227686\n用益物权\t227687\n还手\t227688\n55586659\t227689\n战国七雄\t227690\n非国家\t227691\nclustered\t227692\n四十路\t227693\n丧尸类\t227694\n杜阮镇\t227695\n刺五加\t227696\njwweb\t227697\n正风肃纪\t227698\nmolten
\t227699\n依此\t227700\nSweetFX\t227701\n毛门\t227702\n李嘉图\t227703\n漕宝路\t227704\n可随时\t227705\nlm2596\t227706\nregional\t227707\n竟可\t227708\n短歌行\t227709\n张旭\t227710\n变局\t227711\n苏士澍\t227712\nkizuna\t227713\n腓肠肌\t227714\n达示\t227715\nioremap\t227716\n偷渡客\t227717\n定远县人民政府\t227718\nblum\t227719\n宁珂\t227720\n招商雍和府\t227721\n迷仙\t227722\njasperreports\t227723\n东方秃鹰\t227724\n1000本\t227725\nQrcode\t227726\n焊头\t227727\n程前\t227728\n听语\t227729\n电晕机\t227730\n磷化液\t227731\n执结\t227732\n捣蛋\t227733\n花线\t227734\n启东人事人才网\t227735\n格氏\t227736\n航信\t227737\n阿米巴\t227738\n宁波市发展和改革委员会\t227739\nv3.2.2\t227740\nMT6737\t227741\n尼日尔\t227742\n物流地产\t227743\nInference\t227744\n污女\t227745\n冠盖满京华\t227746\n西安邮电大学\t227747\n29歳\t227748\n7旬\t227749\n压密\t227750\nSins\t227751\nluoke\t227752\n体操\t227753\n相法\t227754\n使命召唤8\t227755\n50861114\t227756\n相对熵\t227757\n陈靖姑\t227758\n石家庄国\t227759\n二厘米\t227760\n诉\t227761\n常州纺织服装职业技术学院\t227762\n滴答网\t227763\n混肥\t227764\n廿三里\t227765\n主场馆\t227766\nlid\t227767\n轮廓仪\t227768\n苏苏苏\t227769\n民间艺人\t227770\n价值感\t227771\n广东省人民政府办公厅\t227772\n投怀送抱\t227773\n遥远的救世主\t227774\n台湾站\t227775\n官渡古镇\t227776\n恒大华府\t227777\n乌坎\t227778\n景剧\t227779\nDetection\t227780\n烽火台\t227781\nKevin-PC\t227782\n珲少\t227783\nsquid\t227784\n193\t227785\n孙铭\t227786\n陈绍华\t227787\n電話區號查詢|手機號碼歸屬\t227788\n爱恨情仇\t227789\n沿河\t227790\n卧龙路\t227791\ncnooc\t227792\nwindos\t227793\n米兰展\t227794\nDuPont\t227795\n中央编译局\t227796\n堤口路\t227797\ncallable\t227798\n2.69\t227799\n牛山镇\t227800\n苏州绿叶日用品有限公司\t227801\n润扬大桥\t227802\n佛印\t227803\n督师\t227804\n居中\t227805\nStd\t227806\nBern\t227807\n秦楚网\t227808\n磐石\t227809\n李海涛\t227810\nBCC\t227811\n炔诺酮片\t227812\n博益\t227813\n蚁群\t227814\n合利\t227815\nFTL\t227816\ninvalidate\t227817\n20180415\t227818\n宫部凉花\t227819\nnis\t227820\n河北省国资委\t227821\n糖醋排骨\t227822\n天罡地煞\t227823\n平假\t227824\n蝙蝠侠阿甘\t227825\n0108639\t227826\n海蛞蝓\t227827\n河源职业技术学院\t227828\n定力\t227829\nwww.315hyw.com\t227830\n硬膜囊\t227831\nGstreamer\t227832\nsurl\t227833\nPT站\t227834\n书湖阴先生壁\t227835\n单轨吊\t227836\nstdole\t227837\n晶睛\t227838\n田芳\t227839\n
成宥利\t227840\n平妖传\t227841\n车业\t227842\n大飞歌\t227843\n充水\t227844\nGeoDa\t227845\n风雪夜归人shen\t227846\nWriteUp\t227847\n同笼\t227848\nperfychi\t227849\nCOMBO\t227850\n沁水\t227851\n河北政法职业学院\t227852\n国青队\t227853\n低版\t227854\n第九\t227855\ndplayer\t227856\n思政部\t227857\n魏峰\t227858\n1912\t227859\n不屈不挠\t227860\n以撒的结合胎衣+\t227861\n航天五院\t227862\n智伴机器人\t227863\n美的燃气热水器\t227864\n剧增\t227865\n张江高科实验小学\t227866\n货比\t227867\n合肥市教育局\t227868\n供\t227869\n星罗棋布\t227870\n填谷\t227871\n全水\t227872\nGB50010-2010\t227873\n毒计\t227874\n早来\t227875\n逐月之月\t227876\nreligious\t227877\n2.7\t227878\nballbusting\t227879\n网易\t227880\nJena\t227881\ncc3d\t227882\n隔离墩\t227883\n王叁寿\t227884\n成都市食品药品监督管理局\t227885\n主机\t227886\n法战\t227887\n54页\t227888\nairborne\t227889\n老聃\t227890\n象屿集团\t227891\n市中心\t227892\n香蒲丽眼膜\t227893\nCSgirl\t227894\nzipline\t227895\nCiel\t227896\nprimal\t227897\n隆中\t227898\n死结\t227899\n财务管理办法\t227900\ndomaine\t227901\nZowie\t227902\n北京王府学校\t227903\njnj\t227904\nJuliet\t227905\npropaganda\t227906\n福特蒙迪欧\t227907\n饭机\t227908\n腓立比书\t227909\nSiemens\t227910\n谭家菜\t227911\n搜狗竞价\t227912\nliquibase\t227913\n中华文明\t227914\nbloves\t227915\n2win10\t227916\n输尿管\t227917\n肇嘉浜路\t227918\n30亿美元\t227919\n170331\t227920\n地道\t227921\nsuccessive\t227922\n增行\t227923\n色盘\t227924\n驼铃\t227925\n量子物理学\t227926\n240平方\t227927\n懒熊\t227928\n敌台\t227929\nSBaiCL\t227930\n惹火上身\t227931\n交通委\t227932\nslurm\t227933\n美里\t227934\nSB0060\t227935\n亲子鉴定\t227936\n题临\t227937\n深圳地铁11号线\t227938\n路晨\t227939\n灰心\t227940\nPorno\t227941\n偷脸\t227942\n徐家湾\t227943\nVR播放器\t227944\n星火燎原\t227945\n保育员\t227946\ntkl\t227947\n导量\t227948\n吸储\t227949\n校勘\t227950\n子宫腺肌瘤\t227951\n金子塔\t227952\n备\t227953\ntwitch\t227954\n|孙\t227955\n值守\t227956\n刘玥\t227957\n美术展\t227958\n天猫电器城\t227959\n匠\t227960\n直腿\t227961\n38P\t227962\n古灵\t227963\nodom\t227964\n电力杆\t227965\n大boss\t227966\n图话\t227967\nu2415\t227968\n小渝\t227969\n阿婆\t227970\n华润银行\t227971\nstat3\t227972\nmaven3\t227973\nreflection\t227974\nlogonui\t227975\nCortex-A8\t227976\nhandout\t227977\n育英小学\t227978\n苦楝\t227979\n后级\t
227980\n铝焊\t227981\n544\t227982\n色青\t227983\npersonal\t227984\n儒士\t227985\n万方学术圈\t227986\nm401\t227987\n银块\t227988\n4米\t227989\n3月30\t227990\n假面骑士build\t227991\n玩伴\t227992\n類編\t227993\n鬼畜版\t227994\n脑室\t227995\n九色鹿\t227996\n高素质\t227997\n增倍镜\t227998\n78.5\t227999\n栈中\t228000\n1811\t228001\n本安\t228002\n马元坤\t228003\n试试水\t228004\n520\t228005\n无管网\t228006\n短衣\t228007\n望牛墩\t228008\n0.75\t228009\n万凰之王\t228010\n经幡\t228011\n陌鱼\t228012\n诉苦\t228013\nqcustomplot\t228014\n【脑叶公司\t228015\n安医附院\t228016\nsentos\t228017\n楼下的房客\t228018\n近四年\t228019\n戴姆勒\t228020\n退党\t228021\n注疏\t228022\n看天下劳苦\t228023\n西电集团\t228024\nrats\t228025\n6B\t228026\n栽种\t228027\n郑州园博园\t228028\n底款\t228029\n_果乐\t228030\n不能动\t228031\n耳夹\t228032\n12厘\t228033\n终极一班\t228034\n登记\t228035\n吉非替尼\t228036\n擀面杖\t228037\n天府国际空港新城\t228038\n文件袋\t228039\n联发科\t228040\n碴\t228041\n三轮挎子\t228042\n日本株式会社\t228043\n200kw\t228044\n佰咖\t228045\n断舍\t228046\ntroy\t228047\nveritas\t228048\n代县\t228049\n中国矿业大学图书馆\t228050\n全国律协\t228051\n平潭时报数字报\t228052\n浙江省机关事务管理局\t228053\nxxxtome\t228054\n新创公司\t228055\n99999999\t228056\n百世集团\t228057\n暮春\t228058\n做出\t228059\n中信新快\t228060\n万岁爷\t228061\n合种\t228062\nUIWeb\t228063\n400\t228064\n日本国立大学\t228065\n5.X\t228066\n梦幻手游\t228067\nDalton\t228068\n灯效\t228069\n麒麟门\t228070\n门站\t228071\n缪勒\t228072\n关公战\t228073\ngooflow\t228074\n闪光粉\t228075\n胡罗卜\t228076\nMinister\t228077\n百元机\t228078\n深圳能源\t228079\n作怪\t228080\n赵强\t228081\nFoundations\t228082\n壹方城\t228083\n出事\t228084\n1.0.14\t228085\n上海地铁10号线\t228086\n无耻之徒/无耻家庭\t228087\nV3.1.0\t228088\n2pm\t228089\n北京交通大学电气工程学院\t228090\n补和\t228091\n2.7下\t228092\n萤石云视频\t228093\n_菜鸟C4D\t228094\n孟浩\t228095\n帽衫\t228096\n兴苑\t228097\n第三世界\t228098\n辉光管\t228099\n掌纹\t228100\n李永乐\t228101\nsprintf函数\t228102\n丽贝卡·弗格森\t228103\n第80期\t228104\n1318\t228105\n暴风tv\t228106\n分红\t228107\nA2\t228108\n接头处\t228109\n铺路\t228110\n淡衣\t228111\n通化市人民政府\t228112\n十堰市纪律检查委员会\t228113\nG310\t228114\n啪\t228115\n受益无穷\t228116\n音乐家\t228117\nwrite\t228118\n回收宝\t228119\nw型\t228120\n成年礼\t228121\n2016部\t228122\n娼妇\t228123\n华侨
大学厦门工学院\t228124\n语汇\t228125\n亚硝酸根\t228126\n南化\t228127\n丢命\t228128\n农历\t228129\n前12个月\t228130\n徐润福\t228131\n朱民\t228132\n电力测功机\t228133\n期票\t228134\nspecify\t228135\n德州大学奥斯汀分校\t228136\n抖S\t228137\n中年女\t228138\n高媛\t228139\n汉乡\t228140\nX10\t228141\n人教版六年级语文下册\t228142\n会化\t228143\n土类\t228144\nccxr\t228145\n橋\t228146\nCXW\t228147\n周村古商城\t228148\n道口镇\t228149\n智云\t228150\n徐新\t228151\n市场经济\t228152\n名词性从句\t228153\nyingyin\t228154\n硫酸新霉素\t228155\n7410\t228156\nchina\t228157\n花丸\t228158\n建设者\t228159\n三途\t228160\n刘行\t228161\n堂口\t228162\nlola\t228163\n兽化\t228164\n4.5日\t228165\n人民法院报\t228166\n_社\t228167\n梅肉\t228168\n一坊\t228169\n金橘\t228170\n准证\t228171\n易小星\t228172\n蒜头\t228173\n文教\t228174\n华林寺\t228175\n赏枫\t228176\n仙女座\t228177\n牧原股份\t228178\n藏画\t228179\n孙小美\t228180\n北大方正\t228181\n龙口网\t228182\n本尼\t228183\nMV_土豆\t228184\n全自动灌装机\t228185\n山神庙\t228186\nBPI\t228187\n比亚智跑\t228188\n无话不说\t228189\n京都站\t228190\n左木\t228191\nrhs\t228192\n窦昕\t228193\n直信创邺\t228194\n4.32\t228195\n夕张\t228196\n全图鉴\t228197\n防尘服\t228198\nCHEERS\t228199\n金融界网\t228200\n赢合\t228201\n界首镇\t228202\n防护罩\t228203\n坨\t228204\n珠海城市职业技术学院\t228205\n仲夏夜之梦\t228206\n古田县\t228207\n高晓松晓\t228208\n自动售卖机\t228209\n溢色\t228210\n三票\t228211\n鲶鱼效应\t228212\n毒鱼\t228213\n酒壶\t228214\nCBox央视影音\t228215\n技炉\t228216\ndigits\t228217\n叮咯咙咚呛\t228218\n766星\t228219\n中关村科技园区海淀园管理委员会\t228220\nAIDA64\t228221\n臂式\t228222\n时尚感\t228223\n95338\t228224\n兰亭黑\t228225\n三井住友\t228226\n男同性恋\t228227\ngudai\t228228\n铁卷\t228229\n冲刺卷\t228230\n范儿特西\t228231\nsmarts\t228232\n感党恩\t228233\n南方大厦\t228234\n打尽\t228235\n红旗杀神\t228236\n蜜菓\t228237\n1.0mm\t228238\n大富\t228239\n首汽约车\t228240\n棉兰老岛\t228241\n冲刺期\t228242\n群文\t228243\n科氏\t228244\n泗洪县人民政府\t228245\n钱串\t228246\n西北苗木网\t228247\n丹皮酚\t228248\nweb.config\t228249\n摩尔天天飞车\t228250\n华谊\t228251\n大燕\t228252\ncats战车\t228253\n曾明\t228254\nguoguo\t228255\n怦然\t228256\n而是\t228257\napt-get\t228258\n第160章\t228259\nHolders\t228260\n匿迹\t228261\n掌讯\t228262\n高云\t228263\n地下城与勇\t228264\n冰肌\t228265\n狼窝\t228266\n熹妃q传吧\t228267\n维多利亚州\t228268\n20户\t228269\n大连医科大学中山
学院\t228270\n布市\t228271\n斯大林之死\t228272\n民康\t228273\n镇馆\t228274\n相控\t228275\n珂赛特\t228276\n香港亚视\t228277\n答非\t228278\n脑科\t228279\n工具变量法\t228280\n昆明市教育局\t228281\n6050\t228282\n打僵尸\t228283\nVC6.0\t228284\n申万宏源证券\t228285\n壹钱包\t228286\n四品八德\t228287\n海豚网游加速器\t228288\n书画室\t228289\n奔驰E级论坛_汽车之家论坛\t228290\nwma格式\t228291\ngag\t228292\nImac\t228293\n美琪大戏院\t228294\n堤坝\t228295\n每平\t228296\n眼尖\t228297\n德钦\t228298\n十六步\t228299\nM4/3\t228300\n邹城市政府网\t228301\n哈弗H5论坛_汽车之家论坛\t228302\ncla200\t228303\n天府大道北\t228304\nortp\t228305\n小米车载充电器\t228306\n中国安全生产科学研究院\t228307\n357\t228308\n饮水思源\t228309\n宁波城市职业技术学院\t228310\n杨不悔\t228311\n娇奴\t228312\n苗盘\t228313\n346\t228314\n盐城高新区\t228315\n雷光\t228316\n62级\t228317\n湖南省长沙市长郡中学\t228318\npitta\t228319\nyjk\t228320\nGuidance\t228321\n横琴新区\t228322\n逃不过\t228323\n趴走\t228324\n反相放大器\t228325\n聘金\t228326\n潘仁美\t228327\n宝男\t228328\n科世达\t228329\nShy\t228330\n大港油田\t228331\n科工\t228332\n乌饭树\t228333\n600万吨\t228334\n高文安\t228335\n21亿\t228336\n海坊\t228337\n受惊\t228338\nhkd\t228339\ncosco\t228340\n赛亚人\t228341\n总装备部\t228342\nStand\t228343\nsegger\t228344\n悠游堂\t228345\n傻眼\t228346\n万能断路器\t228347\n北京京城皮肤病医院\t228348\n诉讼费计算器\t228349\nesxi6\t228350\nwo\t228351\n庞皓\t228352\n魔指\t228353\n丁醇\t228354\n绕城高速\t228355\n空集\t228356\nav2018\t228357\n披靡\t228358\n李小龙传奇\t228359\n众合\t228360\n公路工程管理与实务\t228361\n拜登\t228362\n为圣\t228363\n2.7.6\t228364\nhpp\t228365\n西安光机所\t228366\n75页\t228367\n魔将\t228368\n问责条例\t228369\ncomponet\t228370\n月老\t228371\n鼓楼区政府\t228372\npanama\t228373\n非金融专业\t228374\n3d打印技术\t228375\n交际舞\t228376\n蚕虫\t228377\n内压\t228378\n爱康分院\t228379\n50v50\t228380\nionic-v1\t228381\n冷月如霜\t228382\n法兰琳卡\t228383\n観\t228384\n锦鳞\t228385\n【板\t228386\n滨海图书馆\t228387\n百泉山\t228388\n王玉君\t228389\n你的话\t228390\n更安全\t228391\n老夫聊发少年狂\t228392\n省外\t228393\n吉利自由舰\t228394\n钥匙盒\t228395\n香菇多糖\t228396\nyangshi\t228397\n井冈山市\t228398\nrooster\t228399\n睿驰\t228400\n追索\t228401\n捐髓\t228402\n爱心卡\t228403\n经济部\t228404\ninitialize\t228405\nweb应用_\t228406\n休息日\t228407\n策士\t228408\nbak格式\t228409\niPhone6sP\t228410\nwgc\t228411\n嵯峨
野\t228412\nac6\t228413\n淋水器\t228414\n行销\t228415\nioctl\t228416\n不确\t228417\n四怀\t228418\n大富翁4fun\t228419\n通灵之战\t228420\njar\t228421\n1块\t228422\n贵州省国土资源厅\t228423\n中医药网\t228424\n2.4.12\t228425\nsqlcmd\t228426\n居安\t228427\n空管\t228428\n了解决\t228429\nCloudera\t228430\nWebopedia\t228431\n1.5平方\t228432\n诺贝尔科学奖\t228433\nwharf\t228434\n智播客\t228435\n扬起\t228436\n功能性消化不良\t228437\n云南省监狱管理局\t228438\n共享型\t228439\n陆上\t228440\n登登登登\t228441\n互联网金融\t228442\n松茸\t228443\n模拟城市\t228444\n粉红丝带\t228445\nrankings\t228446\nDid\t228447\n安全服\t228448\ntrench\t228449\nvs2015吧\t228450\n丽岛\t228451\n李光琪\t228452\nAPE+CUE\t228453\n天格地热\t228454\n十一月份\t228455\n屯溪\t228456\n疆场\t228457\n寿州\t228458\n上下集\t228459\n55年前\t228460\n氮化物\t228461\n麻衣相法\t228462\n天生我材必有用\t228463\n歇马\t228464\n六爪\t228465\n抗菌素\t228466\n25平方米\t228467\n中国水产养殖网\t228468\n劝学诗\t228469\n省委党校\t228470\n贵州省质量技术监督局\t228471\n沉沙井\t228472\n迪士尼公主\t228473\n0793\t228474\n77集\t228475\nwin10任务管理器\t228476\n修身齐家治国平天下\t228477\n铥\t228478\nMigrate\t228479\n摆轮\t228480\n西安未央区\t228481\nwuyue\t228482\n突然有一天\t228483\n內\t228484\nsiamese\t228485\n枯草杆菌\t228486\nbusybox\t228487\n尹光\t228488\n初学者们\t228489\n薄膜式\t228490\n话题\t228491\n天湖\t228492\n蓝莓\t228493\n错版\t228494\n信息局\t228495\nerdas\t228496\n通讯组\t228497\n网页设计与制作\t228498\n股气\t228499\n脊柱肿瘤\t228500\n捷豹xf\t228501\n38页\t228502\n俄语\t228503\n百团\t228504\n請\t228505\nmoboplayer\t228506\nspacing\t228507\npornstars\t228508\n扫墓日记\t228509\n张晓春\t228510\n串通投标罪\t228511\n天品\t228512\n茶包\t228513\n克鲁伊维特\t228514\nmod_wsgi\t228515\n北方园艺\t228516\n曼恩\t228517\n如切\t228518\n零\t228519\n哈大高铁\t228520\nTIBCO\t228521\n美沙拉嗪\t228522\n废土\t228523\n0.99元\t228524\n取卵\t228525\n青园\t228526\n运动帽\t228527\n望梅止渴\t228528\n广河县人民政府\t228529\n北京理工大学》\t228530\n智利葡萄酒\t228531\nsut\t228532\n护理学专业\t228533\nDistortion\t228534\n温岭\t228535\n姑奶奶\t228536\n灌南县\t228537\n金地集团\t228538\n廿年\t228539\n推迟\t228540\n2298\t228541\n1024x576\t228542\n军工路\t228543\n对单\t228544\n千家万户\t228545\n20161018\t228546\n知味\t228547\n中置轴轿运车\t228548\n放回\t228549\n澳洲墨尔本大学\t228550\n骨转移\t228551\n华菱星马\t228552\ndatabu
s\t228553\n开源堡垒机\t228554\n贻笑\t228555\n普工招聘网\t228556\n下架\t228557\n舒城中学\t228558\nTOA\t228559\nAntipodes\t228560\n严军\t228561\n舌苔\t228562\nsketchup2016\t228563\n第二集\t228564\n市环保局\t228565\n8圈\t228566\n牟宗三\t228567\n水解酸化池\t228568\n牌品\t228569\n概要\t228570\n蟹将\t228571\n初次分配\t228572\n最后的骑士\t228573\n我欲封天\t228574\n二十\t228575\nStoryBoard\t228576\n中华人民共和国上海海事局\t228577\n惠汝\t228578\n水寒\t228579\n西藏生死书\t228580\nbd1080p\t228581\n归侨侨眷\t228582\ntlx\t228583\nCQRS\t228584\nu88\t228585\n密文\t228586\nNUnit\t228587\n偶偶\t228588\n护颈\t228589\nargument\t228590\nonpropertychange\t228591\n织梦dede\t228592\n红花村\t228593\n修换\t228594\n纳米材料\t228595\n五台山景区\t228596\n顶墙\t228597\n黄濑\t228598\n香都\t228599\n南通科技职业学院\t228600\n5.56mm\t228601\n女一号\t228602\n跟妆\t228603\n广州智康1对1\t228604\n称呼语\t228605\n陶波\t228606\n2.96\t228607\n深信服社区\t228608\n真义\t228609\n崇宁\t228610\n身经百战\t228611\n总线制\t228612\n安徽省国土资源厅\t228613\nsetitem\t228614\n450公里\t228615\n月蝎\t228616\nmpos\t228617\n倾国倾城\t228618\n云州\t228619\nAnimated\t228620\n吴协恩\t228621\n公转商\t228622\n碎屏险\t228623\n合闸\t228624\nrunat\t228625\n伊尹\t228626\nE27\t228627\n秦文\t228628\n肺热咳嗽\t228629\nps3吧_\t228630\nTRAVEL\t228631\n张冬梅\t228632\njiben\t228633\n解脲\t228634\n赵红霞\t228635\nopenerp\t228636\n不敢\t228637\n闻君\t228638\n五一医院\t228639\n前妻\t228640\n遍地开花\t228641\n钦州人才网\t228642\nCynthia娆墨旧染\t228643\n送付\t228644\n北海老街\t228645\n丁威特\t228646\n安全帽\t228647\n深圳达实智能股份有限公司\t228648\nh5社区论坛-易企秀社区论坛\t228649\nDataInputStream\t228650\n众说纷纭\t228651\nDrupal7\t228652\n钢铝\t228653\n广西政府\t228654\n占据\t228655\nPDH\t228656\njustified\t228657\n陈鲁豫\t228658\n清灵仙露\t228659\n体裁\t228660\n氨甲环酸\t228661\n天下3藏宝阁\t228662\n奥门\t228663\nHug\t228664\n法律咨询百科_找法网\t228665\n懒散巴黎圣母院\t228666\nPlotly\t228667\n冶父山\t228668\n金木集团\t228669\nRanges\t228670\n暴风骤雨\t228671\n江苏省邮电规划设计院有限责任公司\t228672\n尤伦斯\t228673\n何\t228674\n交地\t228675\n冬暖夏凉\t228676\n百事可乐公司\t228677\n垃圾英雄\t228678\n深溪\t228679\n97版\t228680\n连花清瘟颗粒\t228681\n双拥模范\t228682\nchain\t228683\n猎神\t228684\n1stopt\t228685\n中队长\t228686\n烟农\t228687\n科学部\t228688\n吸音\t228689\n李教授\t228690\n自考本科专业\t228691\n芸娘
\t228692\n老严\t228693\n色群\t228694\n古根海姆博物馆\t228695\n原癌\t228696\nbeyond\t228697\n大巴士\t228698\nphpmywind\t228699\n日昌\t228700\n廖彩杏\t228701\n爱到尽头\t228702\nmustang\t228703\nwidth\t228704\nkworker\t228705\n孙海\t228706\n咚漫\t228707\n高等教育法\t228708\nOMP\t228709\nGrit\t228710\ngg\t228711\n腾起\t228712\n共计\t228713\n1582\t228714\n蹦迪舞\t228715\n天安门城楼\t228716\n款\t228717\n小震\t228718\nEnter\t228719\n宋杰\t228720\n刘一锅\t228721\n实图\t228722\n花边\t228723\n惟楚\t228724\n风媒\t228725\n寿喜\t228726\n比勒陀利亚\t228727\nmaven\t228728\n德沃\t228729\n供房\t228730\n雪白血红\t228731\n邯郸市第一医院\t228732\n爱国人士\t228733\nSubway\t228734\n中华人民共和国中外合资经营企业法实施条例\t228735\n已婚\t228736\n流淌\t228737\n女律师\t228738\n塔克夫\t228739\n内存泄漏\t228740\n风息\t228741\n赤峰市公安局\t228742\n薏米红豆汤\t228743\n快递员\t228744\n小米快传\t228745\n美国纽约摄影学院\t228746\n乘客\t228747\n折腾\t228748\n156\t228749\n恒大名都\t228750\n光电编码器\t228751\n色乳\t228752\n百里屠苏\t228753\n同意思\t228754\n538\t228755\n成都旅行社\t228756\n江湖蒙迪欧\t228757\n褶积\t228758\n子怡\t228759\n果岭草\t228760\n第三季01集\t228761\n1005\t228762\n3000亿\t228763\nstorage\t228764\n黑龙江省文化厅\t228765\nSwagger\t228766\n一封信\t228767\n优越性\t228768\n针灸\t228769\n小英雄雨来的故事\t228770\nprodad\t228771\n特刊\t228772\n天平秤\t228773\n亚甲蓝\t228774\n邮政管理局\t228775\n张佑赫\t228776\n勇者圣殿_凯恩之角_暗黑破坏神3论坛\t228777\nCL2\t228778\n987\t228779\n中朝边境\t228780\n712路\t228781\n教学实录\t228782\n超时空要爱\t228783\n超清图\t228784\n线荷载\t228785\nBRAUN\t228786\n\\u0000\t228787\n5468\t228788\n机动战士高达\t228789\nfundamentals\t228790\n览\t228791\n薨\t228792\nzhiyu\t228793\n何当共剪西窗烛\t228794\n居庸关\t228795\n黄石北站\t228796\n苏小妹\t228797\n暗淡无光\t228798\n免责事由\t228799\n建造+三国志12PK威力加强版+三国志11\t228800\n2017年1月16日\t228801\nRM\t228802\n秒值\t228803\n34\t228804\nWU\t228805\n孙聪\t228806\n约伯\t228807\n愚痴\t228808\nMerkle\t228809\n回家的诱惑\t228810\n赵永刚\t228811\n检查院\t228812\nMF\t228813\n帆布鞋\t228814\n沃顿商学院\t228815\n集群\t228816\n组部\t228817\n长路漫漫\t228818\n侠影\t228819\n10万千瓦\t228820\nphotos\t228821\n7.8米\t228822\n荷泽市\t228823\nバイト\t228824\n中山植物园\t228825\n康复路\t228826\n34D\t228827\n果木浪子\t228828\n错过了\t228829\n愚见\t228830\nruijie\t228831\n停停\t228832\n辽宁医药职业学院\t2288
33\n7.0%\t228834\n上海飞利浦\t228835\n小锐\t228836\n0538\t228837\n有志不在\t228838\n党工团\t228839\n98\t228840\n山吹\t228841\n华策影视\t228842\n第二十五章\t228843\n石牌\t228844\n老梦\t228845\nsockjs\t228846\n圣训\t228847\n一百多年前\t228848\n昆仑虚\t228849\n23卷\t228850\n石墨化\t228851\nCAD2013\t228852\n银座\t228853\n料神\t228854\nMac论坛\t228855\n回春\t228856\n必考题\t228857\nb85m\t228858\n绍兴东\t228859\nNPDP\t228860\nCaesar\t228861\n顽固派\t228862\n12.5.1\t228863\n列长度\t228864\n芦笛\t228865\n快拍\t228866\ngoovis\t228867\n俞敏苏菲玛索\t228868\n刘森\t228869\n市建设局\t228870\n楚江\t228871\n社会经验\t228872\nmqms\t228873\ncam350\t228874\n铅锌矿\t228875\n肺科\t228876\nCaff\t228877\n族群\t228878\n重炮\t228879\n平尚科技\t228880\n金川县\t228881\n余罪良辰讵可待\t228882\n比比资源网\t228883\n纯艺\t228884\n减排\t228885\n清丽\t228886\n敌军\t228887\n平缝机\t228888\n电压互感器\t228889\n糗友\t228890\nmysql2\t228891\n缩进\t228892\nclodop\t228893\nAvenue\t228894\n虚开增值税发票\t228895\n乔东\t228896\n人人生\t228897\n键器\t228898\n李荣华\t228899\nhyperion\t228900\n无盘天下\t228901\n氯化镁\t228902\n文香\t228903\nr8000\t228904\n化学所\t228905\n价管\t228906\n000528\t228907\n龙吟\t228908\n090\t228909\ncoat\t228910\n安徽出版集团\t228911\nGabor\t228912\n丰台站\t228913\n8843\t228914\n加色\t228915\nCOBRA\t228916\n长沙地铁1号线\t228917\n词汇集\t228918\n魏玮\t228919\n水滴轮\t228920\n调羹\t228921\n走形\t228922\n景洪市第四中学\t228923\n店务\t228924\n初体\t228925\n银丰大厦\t228926\nssp\t228927\n纹身秀\t228928\n电子承兑汇票\t228929\n泰罗奥特曼\t228930\n跟贴\t228931\njama\t228932\n经济性\t228933\n白刃\t228934\n欧阳海\t228935\nA350\t228936\n大华山镇\t228937\nPong\t228938\nTheo\t228939\n21Cake蛋糕\t228940\n2月1日起\t228941\n经营租赁\t228942\n莱特币/Litecoin\t228943\n智能变电站\t228944\nrv4\t228945\n算命大师_算卦\t228946\n两腔\t228947\n刘丰\t228948\n汉辞网\t228949\n团扇\t228950\n1211\t228951\n圆规\t228952\n申通快递单号查询_申通\t228953\n四城联创\t228954\n370分\t228955\n2018-02-27\t228956\n城关街道\t228957\ngse\t228958\n宝山顾村\t228959\n极刑\t228960\n哔哩哔哩弹幕网\t228961\n西咸新区\t228962\n冰芯\t228963\n抓伤\t228964\n锡浆\t228965\n三费\t228966\ncontax\t228967\n还阴债\t228968\n小米m6\t228969\n南京中医药大学翰林学院\t228970\nLPL\t228971\nM37\t228972\n6.4.8\t228973\na3\t228974\n不死药\t228975\nPES6\t228976\n施平\t228977\n太和新闻网\
t228978\nyishi\t228979\n大自然木门\t228980\n恒拓\t228981\n华华\t228982\n浪卡子县\t228983\n锦业路\t228984\nmata\t228985\n比利时\t228986\n永升\t228987\n五十四万亿元\t228988\ncom3\t228989\nAuthorware阅读网\t228990\n喘息性支气管炎\t228991\n风云图\t228992\n管理工\t228993\n福维克\t228994\n二一三\t228995\n非定\t228996\n23话\t228997\n神物\t228998\n2.56\t228999\niDesktop\t229000\n猪哥\t229001\n太原南\t229002\n57分\t229003\n热熔胶机\t229004\n纪律处分条例\t229005\n武林群侠传吧\t229006\nelseif\t229007\n士大夫\t229008\n布比\t229009\n夜枭\t229010\nImbaTV\t229011\n20170917\t229012\n老年大学\t229013\n防火窗\t229014\n中国科学院理论物理研究所\t229015\n生活质量\t229016\n抬杠\t229017\n莉莲\t229018\n娜娜莉\t229019\ntopspin\t229020\n五通\t229021\n天峻\t229022\n太古里\t229023\n农用\t229024\n百亿美元\t229025\n省社科院\t229026\n荣耀日\t229027\n307号\t229028\n正话\t229029\n会计\t229030\n郭晶晶\t229031\nQPST\t229032\naffect\t229033\n实体店地址\t229034\n五通桥区\t229035\n数据包\t229036\n刘军\t229037\n地方投融资平台\t229038\n委局\t229039\n小时差\t229040\n万科白鹭郡\t229041\npcba\t229042\n330m\t229043\n小莎\t229044\none\t229045\n广东东软学院\t229046\n真三国无双8pc\t229047\n周苏红\t229048\n巴霍巴利\t229049\n倍频程\t229050\n紫微星\t229051\n物联网工程专业\t229052\n操射\t229053\n10.5.8\t229054\n花带\t229055\nstddev\t229056\n昨天\t229057\nSamui\t229058\n能用到\t229059\n@Scheduled\t229060\n世界经济\t229061\n射胶\t229062\n极角\t229063\n福贡\t229064\n氟利\t229065\n就能\t229066\nburn\t229067\n植保无人机\t229068\nauo\t229069\n社会保险费\t229070\ngitconfig\t229071\n仲裁委员会\t229072\n慢步\t229073\n西苑小区\t229074\njszg\t229075\n免体\t229076\n栋笃神探\t229077\nsdt\t229078\n四川华西医院\t229079\n洛瑶瑶\t229080\ngoodnight\t229081\n渭南职业技术学院\t229082\n雨律\t229083\n葵涌\t229084\ninvestment\t229085\n二十一\t229086\n草帽姐\t229087\n邵武\t229088\n迈瑞宝\t229089\n佳世客\t229090\ninstance\t229091\nThank\t229092\ncreo\t229093\n中电投\t229094\n美利云\t229095\n致函\t229096\n365_\t229097\n炭烧咖啡\t229098\n动荷载\t229099\n星游城\t229100\n阿里域名\t229101\n玛德琳\t229102\n小野寺\t229103\ndosage\t229104\nCarver\t229105\n9101\t229106\n没有我\t229107\n第9批\t229108\nassessment\t229109\nYam\t229110\n张宏堡\t229111\n百度网盘\t229112\nVersatile\t229113\nframework4.0\t229114\n一整\t229115\n玻璃桥景区\t229116\n潭州学院\t229117\n酷路泽\t229118\n九十\t229119\n赤
裸裸\t229120\n4K论坛\t229121\nredux\t229122\nweix\t229123\n水光潋滟晴方好\t229124\n筛选机\t229125\n冯村\t229126\n解压\t229127\nPodcasts\t229128\n欧尼尔\t229129\n张大妈\t229130\n二级建造师建筑工程管理\t229131\nSimultaneous\t229132\n康复期\t229133\n宝克\t229134\n13500\t229135\n金矿石\t229136\nCreeper\t229137\n栏架\t229138\nStor\t229139\n天长\t229140\n马毓芬\t229141\nMuv\t229142\nRouters\t229143\n卷式\t229144\nQQ头像_找个性网\t229145\n展新颜\t229146\n北京幼儿园\t229147\n乱哄哄\t229148\n360安全软件\t229149\n工艺流程图\t229150\n17.3\t229151\n毕井泉\t229152\n保工\t229153\n无厘\t229154\n五中\t229155\n流入\t229156\n怒火英雄\t229157\n超过180天\t229158\n误惹妖孽王爷:废材逆天四小姐\t229159\n俊发城\t229160\n天极网\t229161\nLoversLab\t229162\n新哥斯拉\t229163\n广州港务局\t229164\n唱会\t229165\n弗洛伊德\t229166\n内衬\t229167\n迈尔\t229168\nalert\t229169\n师友\t229170\n大唐三藏圣教序\t229171\n烈明镜\t229172\n三星s8\t229173\n箭毒蛙\t229174\n易匚\t229175\n张建鹏\t229176\nacceptance\t229177\n胜安\t229178\n清绝\t229179\n6.1分\t229180\n擒贼先擒王\t229181\n雀神\t229182\nplain\t229183\n神幻魔镜\t229184\nduct\t229185\n沼\t229186\novercome\t229187\n育路\t229188\n002180\t229189\nCopper\t229190\n德兰\t229191\n等值线图\t229192\nWPC\t229193\n东三省\t229194\n塘溪镇\t229195\nepub+mobi\t229196\n见习期\t229197\nmichel\t229198\n第37\t229199\nActivate\t229200\n35.com\t229201\n蒸\t229202\n小亚\t229203\n我不想说\t229204\n附方\t229205\nShuang\t229206\n三角梨\t229207\n目击\t229208\n中专\t229209\n通航产业园\t229210\n猴头菇\t229211\n宫岛\t229212\n实物量\t229213\n河南机电职业学院\t229214\n黑豆黑米\t229215\nWorkForce\t229216\n扎哈哈迪德\t229217\n漫天要价\t229218\n刘春晓\t229219\n南通市人民政府\t229220\n0246794\t229221\n红鹤\t229222\n名刊\t229223\n芽芽\t229224\n蛋糕券\t229225\n婚外情\t229226\n放暑假\t229227\n桑椹酒\t229228\n2601\t229229\nQueues\t229230\n新戏\t229231\n顺和\t229232\n辽宁省委组织部\t229233\n古尊\t229234\n蝴蝶效应\t229235\n双相钢\t229236\n座次\t229237\n航运界\t229238\n开润股份\t229239\n萨尔瓦多·达利\t229240\n轴流\t229241\n北京物流公司\t229242\n金融服务业\t229243\n过水\t229244\n开州之窗\t229245\n东湖公园\t229246\n网车\t229247\n卡普兰\t229248\nkeller\t229249\n临床执业助理医师\t229250\n依定\t229251\n澳洲政府\t229252\n一组\t229253\n恶霸\t229254\n眼型\t229255\n虚幻\t229256\n老朽\t229257\n青涩照\t229258\n公车\t229259\n白巾\t229260\ncooperation\t229261\nCX-4
\t229262\n黄龙病\t229263\n昆明火车站\t229264\n助理社会工作师\t229265\nyellow\t229266\n管卫东\t229267\nフェラチオ\t229268\n挖斗\t229269\n泄放\t229270\n小吃车\t229271\ncospa\t229272\n院装\t229273\n家坝\t229274\n中华秘方网\t229275\nBibtex\t229276\n上海17号\t229277\n旧车\t229278\n滨河新区\t229279\n薛敏\t229280\n九个多月\t229281\n济宁孔子国际学校\t229282\n杰卡斯\t229283\nspectra\t229284\n陶土板\t229285\n利国\t229286\n景美人\t229287\n皮肚\t229288\nConstruct\t229289\n照日天劫\t229290\n抹香鲸\t229291\n微信电脑版\t229292\n守桥\t229293\n简体转繁体\t229294\n栏位\t229295\n不良率\t229296\n趋势分析法\t229297\n小清河\t229298\n成品板\t229299\n桃林路\t229300\n黑柳\t229301\nMarlin\t229302\n灵城镇\t229303\n浙江大学远程教育学院\t229304\n江都市\t229305\n王立山\t229306\nShowroom\t229307\n指挥棒\t229308\n114ps\t229309\n切中\t229310\n第47期\t229311\n140_\t229312\njzx\t229313\n闹闹女巫店\t229314\nhzc\t229315\n打击群\t229316\n奈尔\t229317\n锦葵\t229318\nAEF\t229319\n临沂市区\t229320\nScrollRect\t229321\n九记\t229322\n田言\t229323\n大黑\t229324\n10085\t229325\n金庭\t229326\naxure8\t229327\n振华重工\t229328\nPRO5\t229329\n圆图\t229330\n可做到\t229331\n通货膨胀\t229332\neighteen\t229333\n东湖磨山\t229334\n扫码器\t229335\n线柜\t229336\n自由段\t229337\n舟山市环保局\t229338\n平庄煤业\t229339\n语境\t229340\n中科电气\t229341\n许村镇\t229342\n循环贷\t229343\n武都\t229344\nLTM\t229345\n怠慢\t229346\n王寅翼\t229347\n延华智能\t229348\n淘鞋网\t229349\nIntervention\t229350\n大藏\t229351\n韫\t229352\n青岛市中心医院\t229353\n5月26日\t229354\n开撸\t229355\n启奏\t229356\n拾得\t229357\nbt下载器\t229358\n西安医学院第一附属医院\t229359\n耳机\t229360\n鄂尔多\t229361\nhrb\t229362\n江湖儿女\t229363\n300504\t229364\ngan\t229365\n壁山\t229366\n灵武市\t229367\n滑门\t229368\n恐\t229369\n1814年\t229370\nlbs\t229371\n25卷\t229372\n交涉\t229373\nprcc2017\t229374\n疯了吧\t229375\n防滑垫\t229376\n房讯\t229377\n前程无忧论坛\t229378\n雅事\t229379\n吴梅\t229380\n登場\t229381\n4150\t229382\n料到\t229383\n2.8米\t229384\n2523\t229385\n插拔式\t229386\n盆友们\t229387\n何甘蓝\t229388\n斓曦\t229389\nquixel\t229390\n彬彬寒灵\t229391\n感冒\t229392\n第三讲\t229393\n加里奥德曼\t229394\n杭二中白马湖学校\t229395\n新闻坊\t229396\n继续率\t229397\n北京市垂杨柳医院\t229398\nAshes\t229399\n10.11.2\t229400\n北京区\t229401\n质控品\t229402\n性人\t229403\nBHT\t229404\nLCK2018\t229405\n小型机\t229406\n
毕飞宇\t229407\nWebber\t229408\n虬角\t229409\n罐子\t229410\n望京西园四区\t229411\n硫酸钠\t229412\n第n次\t229413\n景天专区\t229414\n张志云\t229415\n付呗\t229416\n缪缪\t229417\n粘豆包\t229418\nongoing\t229419\n狐尾\t229420\nU3\t229421\nbes\t229422\nChapel\t229423\n草业\t229424\n昆山仁济医院\t229425\n百花缭乱\t229426\n巴拉库斯\t229427\nshaderforge\t229428\nsuzuki\t229429\n110元\t229430\nDrag\t229431\n阿一\t229432\n画书\t229433\n歌在飞\t229434\n問\t229435\n饥饿鲨鱼\t229436\nusb数据线\t229437\nsvpwm\t229438\n漳州古城\t229439\n沪深300ETF\t229440\n记忆之夜\t229441\n激荡三十年\t229442\nk40\t229443\nxrdp\t229444\n新干\t229445\n硼砂水\t229446\n南雅\t229447\n安居客\t229448\n明堂\t229449\n棒婊\t229450\n琥珀酸\t229451\n深装\t229452\n菌体\t229453\nmave\t229454\n三代目\t229455\n正值\t229456\n证监\t229457\n资产评估\t229458\nproduces\t229459\n商州\t229460\nD810\t229461\n葛洪桂龙\t229462\nStamp\t229463\n美容博览会\t229464\nwwise\t229465\n哈克贝利\t229466\n金额式\t229467\n溺文\t229468\n无线适配器\t229469\nS8_\t229470\n西城镇\t229471\n胜浦\t229472\n魅蓝metal\t229473\nInjection\t229474\n0.075\t229475\n玫瑰花语\t229476\nGoyard\t229477\n遒劲\t229478\n河南能源化工集团\t229479\n德展健康\t229480\n1000x\t229481\n招生\t229482\n偏向\t229483\n建业春天里\t229484\n李正元\t229485\n摩擦感\t229486\n黑苦荞\t229487\n0122\t229488\n角鲨烯\t229489\n宁西\t229490\n英国纽卡斯尔大学\t229491\n巴比馒头\t229492\n杭州菜\t229493\nmiddleware\t229494\n非空白单元格\t229495\n湖北省工商局\t229496\n郭毅\t229497\n首层\t229498\nkernel32.dll\t229499\n就让\t229500\n退房\t229501\nChrome\t229502\n兴业银行\t229503\nworkstations\t229504\n空气幕\t229505\n未及\t229506\n和平行者\t229507\n武汉天地\t229508\n宜昌市纪委监察局\t229509\n绝知此事要躬行\t229510\n谢拉\t229511\n元禾\t229512\n友谊商店\t229513\n烂根\t229514\n黑暗复仇者\t229515\nValid\t229516\n低分子肝素钙\t229517\n报效\t229518\n上海三菱电梯有限公司\t229519\n蒋平\t229520\n渐强\t229521\nPEARL\t229522\nsparks\t229523\n20140511\t229524\n四神\t229525\n15万元\t229526\n电脑型\t229527\n死亡之组\t229528\nPercona\t229529\n杨庄\t229530\ntcr6300\t229531\n三元乙丙橡胶\t229532\n赘余\t229533\n姐妹综合久久综合网\t229534\n保护期\t229535\n琵琶记\t229536\n格洛纳斯\t229537\n李琴\t229538\n聊斋2\t229539\n操守\t229540\nvar变量\t229541\nDEEN\t229542\n安全柜\t229543\n济源一中\t229544\n破极\t229545\nScientist\t229546\n对称型\t229547\n灌墨\t229548
\n酷传\t229549\n张家界在线\t229550\n汉子们\t229551\n报合一\t229552\n诗情\t229553\n冷渣机\t229554\n连欣\t229555\n88.2\t229556\n高晶\t229557\n景\t229558\n成都市公共交通集团有限公司\t229559\n翁美玲\t229560\n僵持\t229561\n缴费\t229562\nmindmaster\t229563\nifdef\t229564\n汉维\t229565\n绿江村\t229566\n土建工程师\t229567\n善果\t229568\n脏弹\t229569\n国防科技\t229570\n造价师考试\t229571\n赌博群\t229572\n姜珂卓\t229573\n管理经济学\t229574\n痿\t229575\n淄川区\t229576\n吴郡\t229577\n耳炎\t229578\n樹\t229579\nunsuccess\t229580\nqo\t229581\n拉直\t229582\nvirus\t229583\n塑化\t229584\n主语\t229585\n友邻优课\t229586\nmm豆\t229587\n简阳市人民医院\t229588\n有匪\t229589\n桃花堤\t229590\n爱表族论坛\t229591\nDawn\t229592\nin-1\t229593\nthinphp\t229594\n迅雷极速版\t229595\nAFNetworking3.0\t229596\n大坛\t229597\n摆放\t229598\n金属新闻网\t229599\n策马\t229600\n诗章\t229601\n街舞袋鼠\t229602\n来那度胺\t229603\nCAL\t229604\nsingle\t229605\n华润幸福里\t229606\nchua1989\t229607\nWebRTC中文网\t229608\n人界\t229609\nGel\t229610\nLPL英雄联盟\t229611\n5颗\t229612\n治班\t229613\n字幕机\t229614\n價\t229615\n绕城\t229616\n仙丹\t229617\n人民交通出版社\t229618\n优等生\t229619\n好莱坞环球影城\t229620\n太阳电缆\t229621\n戈壁\t229622\n铃村爱里\t229623\n浙江省杭州高级中学\t229624\n高危妊娠\t229625\n木桶\t229626\n驱赶\t229627\n密密\t229628\nP2V\t229629\n死亡宣\t229630\n人工授精\t229631\n訾\t229632\n银色版\t229633\n张丽萍\t229634\nlolvsdota\t229635\n巨潮\t229636\n55首\t229637\n出生者\t229638\n百丈崖\t229639\n200集\t229640\n如厕\t229641\nrpr\t229642\n奇经\t229643\n荷花\t229644\n感染性疾病科\t229645\n国砖\t229646\n临港街道\t229647\n清灵\t229648\n张槎\t229649\n流媒体网\t229650\n悲鸣\t229651\nofiice\t229652\nsim卡通讯录\t229653\n交通工具类\t229654\n烹调师\t229655\ncathy\t229656\nHaley\t229657\n撩人\t229658\n狂狗\t229659\n12380\t229660\n纱巾\t229661\n为民\t229662\n惯例\t229663\n数据库表\t229664\n昭苏县\t229665\nadvertisement\t229666\n弯曲\t229667\n装饰框\t229668\nct\t229669\n张瑞雪\t229670\n寓居\t229671\n杜世成\t229672\nkriging\t229673\n走一回\t229674\n记分\t229675\n五位\t229676\n临澧县\t229677\n豪锐型\t229678\n自粘\t229679\n野狸岛\t229680\nFederico\t229681\n西子湖畔\t229682\n二十厘米\t229683\n内蒙古银行\t229684\n域名城\t229685\n继续教育学\t229686\n东莞市财政局\t229687\n贡菜\t229688\nvu\t229689\natmosphere\t229690\n看片会\t229691\ncws\t229692\nBlueprint\t229693
\n应税所得率\t229694\n如来佛\t229695\npropert\t229696\n神无月\t229697\n雪华\t229698\n农广校\t229699\n损友圈\t229700\n张旭东\t229701\nBones\t229702\nIS2\t229703\n花桥\t229704\nndsl\t229705\n官寨\t229706\n3390\t229707\n中国银监会浙江监管局\t229708\n晒台\t229709\n二重唱\t229710\n渝城\t229711\n粉彩瓷\t229712\n悬浮窗\t229713\n绿野山庄\t229714\n小拉\t229715\nAgainst\t229716\n5875\t229717\n大明锦衣卫\t229718\n170324\t229719\n休闲西服\t229720\nSanchez\t229721\n同属\t229722\n西南财经大学\t229723\n花园大道\t229724\n矿石镇\t229725\n震震\t229726\nrdx\t229727\n20141220\t229728\n韩梅梅\t229729\n冲击波病毒\t229730\n多玩游戏网\t229731\n小萌希奥\t229732\n7重\t229733\n今\t229734\n晋身\t229735\n足浴器\t229736\n20171127\t229737\n命运2\t229738\n王宁\t229739\n龙珠z电光火石3\t229740\n锡块\t229741\n定稿\t229742\n90ss\t229743\n天津力神电池股份有限公司\t229744\n2018/04/04\t229745\n合疗\t229746\nDENSO\t229747\n相片\t229748\n比不知道\t229749\n裸衣\t229750\ninterests\t229751\n78厘米\t229752\n监狱学院\t229753\n换号\t229754\n锡炉\t229755\n立校\t229756\n阿里子\t229757\n摄魂\t229758\n柳岩朱丹\t229759\n舞泡\t229760\ndiaoyu\t229761\n神叉\t229762\n梁姓\t229763\n胰高血糖素\t229764\nv4.6.2\t229765\n梁子湖政府网\t229766\n铝壶\t229767\nbackward\t229768\n马庄镇\t229769\n封箱\t229770\n葱白\t229771\n味觉\t229772\nCtrl+F\t229773\n便签纸\t229774\n疗养\t229775\n海龟\t229776\nmillions\t229777\n治教\t229778\n九十五周年\t229779\n不济\t229780\n法定节假\t229781\n高枫\t229782\n47万\t229783\n培训业\t229784\nccpc\t229785\n新能源展\t229786\n迅羽\t229787\n散修\t229788\n2017年6月27日\t229789\n6星\t229790\n第五页\t229791\n小丑鱼\t229792\n多级离心泵\t229793\n期间\t229794\nB85M\t229795\n芳珂\t229796\nonenote2010\t229797\n牛头马面\t229798\n宁夏回族自治区住房和城乡建设厅\t229799\n启鸣\t229800\n卓大王\t229801\n发誓\t229802\n空客公司\t229803\n零零后\t229804\n65001\t229805\n现代农业示范园\t229806\nGX85\t229807\ndunlop\t229808\nROMs\t229809\n抗焦虑药\t229810\n织田non\t229811\n庄里\t229812\n稻香湖\t229813\n青木川\t229814\n广东科技学院\t229815\n坐位\t229816\n驾考一点通\t229817\n猪肋排\t229818\n机库\t229819\n城中村\t229820\n郭斯特\t229821\n2m2\t229822\nCDLinux\t229823\n维尼熊\t229824\nSMEG\t229825\n水池龟\t229826\n精调\t229827\n枕形\t229828\n福满\t229829\n朱墨\t229830\n美固\t229831\n好市\t229832\n绝情\t229833\n三元\t229834\n劫色\t229835\n旅讯\t229836\n3.6v\t229837\n松下蓄电池\t2
29838\n搅拌机\t229839\n3870\t229840\n烟毒\t229841\n干包式\t229842\nqq华夏吧\t229843\n5月中下旬\t229844\nn9005\t229845\n小米手机6_MIUI\t229846\n辣手\t229847\n大掌门2吧\t229848\n先入\t229849\n二十一天\t229850\n长袜子皮皮\t229851\n宇智波鼬\t229852\nOpenGLES\t229853\n水会\t229854\n上师大附中\t229855\n两块\t229856\n回头一看\t229857\n3百\t229858\n苏悦\t229859\n邮政储蓄银行信用卡\t229860\n刘言\t229861\n感烟\t229862\n医学\t229863\n3公里\t229864\n喜玛诺\t229865\n学府街\t229866\n半球型\t229867\nmarker\t229868\n香奈儿包\t229869\n中网格\t229870\n八号\t229871\n这么远\t229872\n观复博物馆\t229873\n筹建期\t229874\n3.81\t229875\n中房集团\t229876\n家校\t229877\nsympathy\t229878\n基本级\t229879\n八一水库\t229880\n明珠湾\t229881\nideapad700\t229882\n大斌健身网\t229883\n人工荨麻疹\t229884\n热血街舞团#\t229885\n复方酮康唑发用洗剂\t229886\n沱牌舍得\t229887\nCHAIN\t229888\n免税发票\t229889\n门萨\t229890\n特大桥\t229891\n豆腐渣样\t229892\n蔡军\t229893\n2本\t229894\n盘龙\t229895\n姜东元\t229896\n凡科互动\t229897\n阿雅\t229898\n新闻网\t229899\n无痛\t229900\n射灯\t229901\n王海涛\t229902\n下不到\t229903\n迷彩\t229904\n美尼尔综合症\t229905\n纳木措\t229906\n新湖\t229907\n牛锡明\t229908\nvillage\t229909\n江泉\t229910\n亚力克\t229911\nLocalization\t229912\n顶层\t229913\n杨芸\t229914\n徐水\t229915\n圆锥管\t229916\nanji\t229917\n东梓关村\t229918\n微久信\t229919\n个人二手车网\t229920\n铁子\t229921\n高频炉\t229922\n白永祥\t229923\n34亿元\t229924\n网易云课堂\t229925\nnucleus\t229926\n力群\t229927\n苏州汽车北站\t229928\n龙凤呈祥\t229929\n惠大麦\t229930\nspri\t229931\n侗家\t229932\ncomprehension\t229933\n一生不变\t229934\n多愁\t229935\nBT\t229936\npmc\t229937\n天龙八部卡\t229938\n平安在线\t229939\n排期表\t229940\nPacket\t229941\n亲眼\t229942\n右佐匹克隆片\t229943\n涂掉\t229944\n藏污纳垢\t229945\n无形资产减值准备\t229946\n布制\t229947\n20多年\t229948\n杨教授\t229949\n解秘\t229950\n顶新\t229951\n中海油田服务股份有限公司\t229952\n吴雪\t229953\n父元素\t229954\n标致5008\t229955\n天广中茂\t229956\n道友请留步\t229957\n布封\t229958\n扑克牌\t229959\n启发式算法\t229960\nSTC89C51\t229961\n国际歌\t229962\n色列漫画网\t229963\n李茂山\t229964\n厦门建发股份有限公司\t229965\n自卫权\t229966\n庆盛站\t229967\nd5\t229968\n圣诞版\t229969\n重庆市福利彩票发行中心\t229970\n石膏板\t229971\n混\t229972\n37分钟\t229973\n独家网\t229974\nA1699\t229975\n右脸\t229976\n储户\t229977\n张瑞希\t229978\nV3500\t229979\nSPLIT\t229980\nunholy\t2
29981\n佳能6d2\t229982\n驶入\t229983\n系统服务\t229984\n乌桕\t229985\n散就散\t229986\n段庄\t229987\n全麦\t229988\n打药机\t229989\nm7206w\t229990\narm7\t229991\n客家娘酒\t229992\not\t229993\n咨询帖\t229994\n下定决心忘记你\t229995\n双层板\t229996\n杜鹃花节\t229997\n买卖盘\t229998\n回标\t229999\n看漂\t230000\n五棵\t230001\n毛刺\t230002\n控制按钮\t230003\n8m\t230004\n漫威黑豹\t230005\n劳动南路\t230006\n四川省质量技术监督局\t230007\n潍坊工商职业学院\t230008\n段码\t230009\ndailog\t230010\n夺冠\t230011\n白银\t230012\nphpunit\t230013\n紫水晶洞\t230014\n军团再临\t230015\n王康\t230016\n败家女\t230017\nopensips\t230018\nHeights\t230019\nSparkFun\t230020\nxtu\t230021\n37.8\t230022\n五谷杂粮\t230023\n赵丽华\t230024\nx5m\t230025\n方大集团\t230026\n降龙\t230027\n算命先生\t230028\nSCP\t230029\nSpoilers\t230030\n0571\t230031\n铁棍\t230032\nViz\t230033\nNLS\t230034\n高津\t230035\n挂贴\t230036\n筱原千绘\t230037\n偏度\t230038\n新标\t230039\n拍景\t230040\n75路\t230041\n叛忍\t230042\n116平\t230043\n筑基\t230044\n首都医科大学附属北京儿童医院\t230045\n国家级风景名胜区\t230046\nRubik\t230047\n22.6\t230048\n对口\t230049\n仪器展\t230050\n平底\t230051\n咸宁职业技术学院\t230052\n退后\t230053\n各部门\t230054\n去角\t230055\n马尾松\t230056\n闻堰镇\t230057\nCarrera\t230058\n建业大厦\t230059\n市内\t230060\n清城区人民政府\t230061\nWVW\t230062\n协整方程\t230063\n3名\t230064\n百洛油\t230065\n口味虾\t230066\n合仓\t230067\n聊城新闻网\t230068\n后半句\t230069\n北汽新能源4S店\t230070\n兰花\t230071\n清河中学\t230072\n八角星\t230073\n福彩3D-中彩网\t230074\nsaved\t230075\n鹏保宝\t230076\n荣耀百里守约\t230077\ntranswell\t230078\n云南省食品药品监督管理局\t230079\n神经源性\t230080\nCeph\t230081\nMP4格式转换器\t230082\n99号\t230083\n凭依\t230084\n2.6.5\t230085\n综合症\t230086\n加分解\t230087\n消愁\t230088\n格萨尔\t230089\n红筹\t230090\n琴心\t230091\n点换\t230092\nVIP素材\t230093\n劲图\t230094\nerror_code\t230095\n中国铁塔\t230096\nWindows8_Windows\t230097\n商河政府网\t230098\n梓树\t230099\nclipboard\t230100\n支委\t230101\n阿里游戏\t230102\n汉富\t230103\n班级体\t230104\newsa\t230105\nwedo2.0\t230106\nDRIVER\t230107\n腾退\t230108\n青岛市财政局\t230109\n急性牙髓炎\t230110\n碟机\t230111\nmoly\t230112\n烟叶税\t230113\ngmail\t230114\nImplications\t230115\nViking\t230116\n野树\t230117\nplug\t230118\n李伦\t230119\n佳怡\t230120\n猛虎\t230121\n鹰航\t230122\nFresh\t
230123\nValentina\t230124\n小雨滴\t230125\n毕忠良\t230126\n芳村客运站\t230127\n山东省文化厅\t230128\n50多张\t230129\n滨化股份\t230130\ntestbench\t230131\n花园街\t230132\n舫\t230133\n金领\t230134\n2344\t230135\n大连育明高中\t230136\n游荡\t230137\n1.2个\t230138\n安徽\t230139\n阿福\t230140\n39部\t230141\n有益菌\t230142\n骅\t230143\n乐东_论坛\t230144\naltuim\t230145\n苏建函价\t230146\n如\t230147\n免税收入\t230148\nWIFI\t230149\n创达\t230150\n灵匣网\t230151\n蜗轮\t230152\n景东县\t230153\n王宫\t230154\nProe5.0\t230155\n不任\t230156\n杀人狂\t230157\n辅导百问\t230158\n天智航\t230159\nfiltration\t230160\n盘花\t230161\n头名\t230162\n8770W\t230163\nDISC性格测试\t230164\nmo\t230165\n慢版\t230166\n佳能MG\t230167\n卢氏人民政府\t230168\nMarketing\t230169\n临川二中\t230170\n复康\t230171\n吻安\t230172\n日本幼儿园\t230173\n长串\t230174\nMirze\t230175\n李丹妮\t230176\n评论服\t230177\n巴南万达广场\t230178\n整装\t230179\n李圣经\t230180\nivy\t230181\n略说\t230182\nzzs\t230183\n赵霖\t230184\n14443\t230185\n上海纽约大学\t230186\n夕\t230187\n6900\t230188\n仙桃数据谷\t230189\n润燥止痒胶囊\t230190\n都昌在线\t230191\n冬堡\t230192\n单控开关\t230193\n能谱仪\t230194\n揭榜\t230195\nJY\t230196\n重庆\t230197\n皲裂\t230198\n美亚\t230199\n平衡球\t230200\n垫底辣妹\t230201\n陈颖芝\t230202\n依顿电子\t230203\n福建人\t230204\nファミレス\t230205\n夏眠\t230206\n第二十一章\t230207\n汉诺威\t230208\n33项\t230209\n情种\t230210\nqid\t230211\n笑纹\t230212\n纵横交错\t230213\nSunny*\t230214\nPRIME\t230215\nSNF\t230216\nvivoy51\t230217\n650mt\t230218\n奥_\t230219\npayphone\t230220\n凤凰号\t230221\nDivision\t230222\n倒茶\t230223\n整列\t230224\n细胞核\t230225\n试压泵\t230226\np52\t230227\n第二十四期\t230228\n朱雀大街\t230229\nbiaze\t230230\n许愿瓶\t230231\n好乱\t230232\n黄浦江\t230233\n汽摩\t230234\n袁克文\t230235\n麦卓\t230236\n望夫石\t230237\n这座城市\t230238\n仙凡\t230239\n\\\\\t230240\n嘴脸\t230241\n迈出\t230242\n猛鬼追魂\t230243\nwbfs\t230244\n纹管\t230245\n刘智\t230246\n花粉儿\t230247\n庾信\t230248\n福士广场\t230249\n玫红\t230250\n陈僖仪\t230251\nbluenile\t230252\n镇江市委\t230253\n12318\t230254\n两三声\t230255\n皮画\t230256\n南海网\t230257\naxi\t230258\n杀人不分\t230259\n优优健康网\t230260\nRevive\t230261\nSequence\t230262\n看道\t230263\n两代\t230264\n长贴\t230265\n鸭脚\t230266\n四板\t230267\n侧方\t230268\nlinode\t230269\n海盐南北湖\t
230270\n提眉\t230271\n鸿观\t230272\n一气化\t230273\nsw2016\t230274\n第五版\t230275\n8&#160\t230276\n第168章\t230277\n0579.cn\t230278\n中日韩\t230279\n进度表\t230280\n企业规章制度\t230281\n工程险\t230282\n延边队\t230283\n苏芸\t230284\n日本小林制药\t230285\n中枪\t230286\n土木工程师\t230287\n智汇城\t230288\n城头镇\t230289\n坤庭\t230290\n10ML\t230291\n小微金融\t230292\n三门峡日报\t230293\n箭塔\t230294\n没有人\t230295\n101次\t230296\n张骥\t230297\nwordpress数据库\t230298\n怕冷\t230299\n联行\t230300\n西学东渐\t230301\n努比亚Z9\t230302\n花藤\t230303\nSandboxie\t230304\nsubstr\t230305\n中国电子网\t230306\n期间费用\t230307\n保利世贸博览馆\t230308\n2048S\t230309\n员工价\t230310\naccuse\t230311\n加油费\t230312\nADINA\t230313\n四月末\t230314\n爱宝乐园\t230315\n狗界\t230316\n1544\t230317\n神盾特工局\t230318\n17:00\t230319\nci10\t230320\n鬼故事\t230321\nEcotect\t230322\n留置针\t230323\n几百米\t230324\n德阳市交通运输局\t230325\n外脚\t230326\n碎纸机\t230327\nfb\t230328\n努比亚z17mini\t230329\n北落\t230330\n滑模观测器\t230331\n冷冻期\t230332\n赋安\t230333\n牛皋\t230334\n莫扎特传\t230335\nfashion\t230336\n接制动\t230337\n笑傲江湖2:东方不败\t230338\n云派\t230339\nCountry\t230340\n喜家德\t230341\n艾5\t230342\neyesight\t230343\n书摘\t230344\n传送门骑士\t230345\n2018-02-04\t230346\n林海\t230347\n浩二\t230348\n围棋赛\t230349\n沙织\t230350\n配字\t230351\nAOPA\t230352\n硅橡胶\t230353\n截教\t230354\nitcast\t230355\n优盟\t230356\n书学\t230357\n广东省中医药局\t230358\n人类\t230359\n附着式\t230360\n大坏狐狸的故事\t230361\n胶粘剂\t230362\n山东青年政治学院\t230363\n欧盘\t230364\n教育教学论坛\t230365\n内关\t230366\n苏慧伦\t230367\n忘关\t230368\n豌豆杂交\t230369\n佳吉快运\t230370\n乐虎国际\t230371\n汽车轮\t230372\nExporting\t230373\n30万方\t230374\n十五秒\t230375\n洗车店\t230376\n巴滨路\t230377\nbeiwody\t230378\nNational\t230379\n叭\t230380\n北京城建设计发展集团股份有限公司\t230381\n胸透\t230382\n接收端\t230383\n杰斯\t230384\n纸芯\t230385\n风险识别\t230386\n陈露\t230387\n263财富网\t230388\n国家计算机网络与信息安全管理中心\t230389\n综合型\t230390\nintersect\t230391\n三明市\t230392\n郭嘉\t230393\n国银\t230394\nPytest\t230395\ndebuginfo\t230396\n助溶剂\t230397\n顺捷\t230398\n900m\t230399\n播放机\t230400\n金川集团\t230401\n济宁市公安局\t230402\n本页\t230403\n球彩\t230404\n多金\t230405\n2.12\t230406\n振兴路\t230407\n按钮式\t230408\n翻书页\t230409\n环卫工人\t230410\n1.07\t23
0411\n赛迦\t230412\n巴瑶族\t230413\n西行漫记\t230414\n盱\t230415\n泰星来客\t230416\nzuko\t230417\n28%\t230418\nTestin\t230419\n莫莉\t230420\n4720\t230421\n吉林大学第一医院\t230422\n义务教师\t230423\n中国学位与研究生教育信息网\t230424\n执政为民\t230425\n刘建国\t230426\n独乐乐\t230427\n安吉新闻网\t230428\n百分之六十\t230429\n范勇\t230430\n中证\t230431\n烧烤师\t230432\n刘双舟\t230433\ncurl函数\t230434\n工程机\t230435\nTestbench\t230436\n指示\t230437\nCPN\t230438\nmeteoric_cry\t230439\n美丽村庄\t230440\n国寿财险\t230441\n360超级root\t230442\nDva\t230443\n越野滑雪\t230444\nconferences\t230445\n51美术高考网\t230446\n总长\t230447\n第k\t230448\n郑号锡\t230449\n首长\t230450\n大平调\t230451\n火药桶\t230452\n破甲弹\t230453\n鲁冰\t230454\n投行小兵\t230455\n站式\t230456\nresin\t230457\n陈思进\t230458\n口腔医院\t230459\n番茄牛腩\t230460\nleonwang\t230461\n血清型\t230462\n甲硝唑氯化钠注射液\t230463\n无线AP\t230464\n矿浆\t230465\n赵本山\t230466\n凯姆\t230467\n喷火枪\t230468\n健康管理中心\t230469\n心照\t230470\n1896年\t230471\n夕阳无限好\t230472\n攻略版\t230473\n∞\t230474\nstrom\t230475\nWJ\t230476\njbt\t230477\n合理化\t230478\n梦想的力量\t230479\n托梁\t230480\n正直\t230481\n谭氏\t230482\n放小\t230483\nawesome-swiper\t230484\nPBL\t230485\n硬木\t230486\n金智\t230487\n熊朝亮\t230488\neechen\t230489\n100方\t230490\n打开花\t230491\n女刊\t230492\nconcerns\t230493\ne宝\t230494\n世奥\t230495\n市安全监管局\t230496\n廖南\t230497\n轮改\t230498\n推\t230499\n中美撞机\t230500\n法治化\t230501\n金龙湖\t230502\n三棍\t230503\ncplex\t230504\n皇城\t230505\nPMP_希赛网\t230506\n关峡\t230507\n停休\t230508\nfringe\t230509\n酷狗音乐吧\t230510\n诺夏\t230511\ni5-6200U\t230512\n图书\t230513\n手炉\t230514\n时轮\t230515\n法宣在线\t230516\n4亿元\t230517\n爱福清网\t230518\nheitai\t230519\n卓正医疗\t230520\n纽约尼克斯\t230521\n333kkkkcom\t230522\nnonce\t230523\n交易性金融\t230524\n凌人\t230525\n真情故事\t230526\nv4l2\t230527\n不列颠哥伦比亚省\t230528\nGom\t230529\ncatalina.out\t230530\n五十种\t230531\n中项\t230532\n十段\t230533\n【有图】\t230534\n无烟\t230535\n杨清明\t230536\nconfirming\t230537\n转场\t230538\n罗滕堡\t230539\nlol刀妹\t230540\n12勇士\t230541\n田宫\t230542\nMysteel\t230543\n重申\t230544\nV2.0_\t230545\n监护人\t230546\n熏制\t230547\n陈诺\t230548\n姑获鸟\t230549\n美孚润滑油\t230550\nNO.9\t230551\n26名\t230552\n乳果糖\t230553\n青岛海洋大学
\t230554\n冲洗\t230555\n纳斯达克指数\t230556\n划线器\t230557\n救心\t230558\n镍币\t230559\nMETAL\t230560\n567号\t230561\n围布\t230562\nWin7/Win8.1/Win10\t230563\n一百级\t230564\nbudao\t230565\nBT学院\t230566\n丹东市公安局\t230567\ngeneva\t230568\n紧缺\t230569\n千亿美元\t230570\nsme\t230571\nLayered\t230572\n按钮\t230573\n0xc0\t230574\n狮身\t230575\n元素分析仪\t230576\n丁日\t230577\nBCF\t230578\ntif格式\t230579\n浑河\t230580\n舒宝\t230581\n门兴\t230582\n尼康d3300\t230583\n9158多\t230584\n周艳\t230585\n马鞍街道\t230586\n六和\t230587\nDOWNLOAD\t230588\nsimplicity\t230589\n古吉拉特邦\t230590\n三衢道中\t230591\nCAPT\t230592\n病根\t230593\n治疗器\t230594\n悠月\t230595\n长电科技\t230596\n灵堂\t230597\nDUXS\t230598\n工博会\t230599\nTakara\t230600\nx1x2\t230601\n唬\t230602\n演奏会\t230603\n114la\t230604\n木马\t230605\nsin1\t230606\n北大人民医院\t230607\n画面\t230608\n还有什么用\t230609\n深圳市建筑设计研究总院有限公司\t230610\n马赫\t230611\ngtx1080\t230612\nrx10m4\t230613\n吧女\t230614\n罗援\t230615\n继承类\t230616\n板野友美\t230617\n灵隐\t230618\n微微风\t230619\n叔公\t230620\nqpp\t230621\n轻物\t230622\n孕肚\t230623\n成都家装公司\t230624\n注册页\t230625\n如实\t230626\nzidoo\t230627\n马略卡岛\t230628\n莱芜新闻网\t230629\n2018年1月8日\t230630\n苗木商情网\t230631\n欣杨\t230632\n姓吴\t230633\n广州花城广场\t230634\n羽川翼\t230635\n水性\t230636\n金融控股集团\t230637\nkm\t230638\n黄金镇\t230639\n昆山经济技术开发区\t230640\nVJC\t230641\n请克制\t230642\n心酸\t230643\nAppleID\t230644\n生命科学联合中心\t230645\n搜酷\t230646\n李晓彤\t230647\n访问者\t230648\nosi\t230649\n江西省卫生和计划生育委员会\t230650\n古卷\t230651\n小蜜书\t230652\n成都第三人民医院\t230653\nFortiClient\t230654\ngcc\t230655\n起批\t230656\n梦幻曲\t230657\n重庆西\t230658\n41路\t230659\n255.255.255.0\t230660\nEDA库\t230661\n古琴谱\t230662\n搜狗百科\t230663\n普客\t230664\n通天河\t230665\n#{}\t230666\n阿奇霉素\t230667\nbecky\t230668\nDAU\t230669\n万科紫台\t230670\n方力\t230671\n情长\t230672\n今日下午\t230673\n戒烟器\t230674\n邹韬奋\t230675\n十排\t230676\n太平洋电影城\t230677\n数梦工场\t230678\nwallpaper\t230679\n广源\t230680\n大球盖菇\t230681\n4毛\t230682\n水熊虫\t230683\n卡尔玛\t230684\n孔雀河\t230685\n吃翔\t230686\n郝万山\t230687\n固定\t230688\n康鹏\t230689\n中国机床网\t230690\n喷药\t230691\n科特迪瓦\t230692\nlest\t230693\n浙江宇视科技有限公司\t230694\n系统性红斑狼疮\t230695\npost
gre\t230696\n对象\t230697\n长安大学公路学院\t230698\nPPT架构图\t230699\n光者\t230700\nCNET\t230701\nnursery\t230702\n罗琳\t230703\n工作地\t230704\n章瑞虹\t230705\nguomo\t230706\n小白猪\t230707\n逗吧逗把街\t230708\n004号\t230709\n方括号\t230710\nbelif\t230711\n抛丸清理机\t230712\n莱尔\t230713\nDiscussions\t230714\n9000个\t230715\n龙虎风云会\t230716\n白金级\t230717\n狂击\t230718\n空中楼阁\t230719\n原名\t230720\n穆雷\t230721\n蓝天驾校\t230722\n2.53G\t230723\n行论\t230724\nSolo\t230725\n色球\t230726\n爆增\t230727\n北京电力公司\t230728\nexness\t230729\n16点\t230730\n鸿城\t230731\n野象\t230732\n银锭\t230733\n孤鹰\t230734\n中秋月\t230735\n全来\t230736\n60万方\t230737\n焦三仙\t230738\nflower\t230739\n中国联通软件研究院\t230740\n封窗\t230741\nwui\t230742\n黄牛\t230743\naxrue\t230744\n朱千雪\t230745\n旭日上城\t230746\n5411\t230747\n一再\t230748\n龙湖春江\t230749\n0.4.2\t230750\ntiki\t230751\n丙烯酸甲酯\t230752\n泥岗\t230753\n金笛\t230754\n三湖慈鲷\t230755\n龙树\t230756\n花少\t230757\n好车\t230758\n铜包\t230759\n民事案\t230760\n名井南\t230761\n相悖\t230762\n用关\t230763\n吸粪车\t230764\n五乡\t230765\n高崎圣子\t230766\n地壳\t230767\n保母\t230768\n易烊\t230769\n灭顶\t230770\n841\t230771\n道统\t230772\nPV\t230773\nranjiewen\t230774\n国家工程研究中心\t230775\n蜘蛛池\t230776\n红糖\t230777\n搅热\t230778\n情欲九歌\t230779\nautopano\t230780\n领现金\t230781\n一行三会\t230782\nwsd\t230783\n东德牧羊犬\t230784\n≈\t230785\n汪敏\t230786\n移\t230787\n游聚\t230788\n游仙\t230789\n存放\t230790\n马自达323\t230791\n务川自治县\t230792\ngaosu\t230793\n没用\t230794\n高妙\t230795\n羊肉\t230796\n中华北大街\t230797\n监督管\t230798\n边锋\t230799\nLNK\t230800\nquant\t230801\n奔驰c260\t230802\n鲁敏\t230803\nrack\t230804\n镀铝锌\t230805\n威武\t230806\n粉儿\t230807\n男佣\t230808\n乌石村\t230809\n私有财产权\t230810\n莆田市政府\t230811\ngrooming\t230812\n红旗连锁\t230813\n蕾丝特\t230814\n九安医疗\t230815\n国培\t230816\n第二包\t230817\npfile\t230818\n叶欢\t230819\n刘志杰\t230820\n美裙\t230821\n弗吉尼亚理工大学\t230822\n薄层板\t230823\n笑脸\t230824\nModel类\t230825\n晒单网\t230826\n烽火集团\t230827\n34种\t230828\n旱獭\t230829\n凤凰湖\t230830\n你看起来好像很好吃\t230831\n几茬\t230832\n转考\t230833\n48平方厘米\t230834\n甘棠\t230835\n苍井翔太\t230836\ngb2312\t230837\nRamdisk\t230838\n酸茶\t230839\n送保\t230840\nLifetime\t230841\n第25届\t230842\n鱼唇\
t230843\n深圳中院\t230844\n地狱之眼\t230845\nbottle\t230846\n百度地图sdk\t230847\ndrive\t230848\n桃源小区\t230849\n大橙子\t230850\n300粒\t230851\n太史\t230852\n南海执信中学\t230853\n建筑工程吧\t230854\n泳帽\t230855\n资本充足率\t230856\nJBS\t230857\n黄花菜\t230858\n阿扎莉娜\t230859\n3mbang\t230860\n逢场作戏\t230861\n心急\t230862\nCS75\t230863\n阵屏\t230864\n权法\t230865\n马卫东\t230866\n上外附中\t230867\nexist\t230868\n做广告\t230869\n刘志峰\t230870\n莎莉文\t230871\n碟调网\t230872\n吵醒\t230873\n李欣\t230874\n北京三元桥\t230875\n泳裤\t230876\n西昌学院\t230877\n林冰\t230878\nasciidoc\t230879\n2011年3月\t230880\n彩字秀\t230881\n运动健身\t230882\n8G\t230883\n苏联\t230884\nswitch\t230885\n易购网\t230886\n卢鑫\t230887\n伊斯特\t230888\n另一面\t230889\n一程\t230890\n湟源\t230891\n阮莞\t230892\nNZD\t230893\n晚字\t230894\ncreo5.0\t230895\n家常便饭\t230896\n空霸\t230897\n奶光\t230898\n湖南环境生物职业技术学院\t230899\n胸痹\t230900\n五年级\t230901\n千兆光猫\t230902\n测试机\t230903\n钻级\t230904\n浅黄\t230905\n5万吨\t230906\n历史学\t230907\n心功能\t230908\nlinux私房菜\t230909\n我本\t230910\n广发证券易淘金\t230911\n上海国际会议中心\t230912\n启海\t230913\n政改\t230914\n三个人\t230915\n启动管理器\t230916\n北京博爱医院\t230917\n泉城极地海洋世界\t230918\n微机室\t230919\n剑网三\t230920\n双字\t230921\n刘明月\t230922\n三铰\t230923\n夸大其词\t230924\n衍生化\t230925\n走起来\t230926\n最先\t230927\n胆囊\t230928\n机械设计\t230929\n环氧乙烷灭菌\t230930\n易达\t230931\n裸熊\t230932\n糟践\t230933\n时代倾城\t230934\n名噪一时\t230935\nyou1you\t230936\n万科天空之城\t230937\n曹路宝\t230938\n萧平章\t230939\n终末期肾病\t230940\n卡首\t230941\n盖楼\t230942\nosim\t230943\n冷静王\t230944\n调级\t230945\n70s\t230946\n14纳米\t230947\n闹着玩\t230948\n嘉兴人才网\t230949\nNo.001\t230950\n荥阳网\t230951\ntfp\t230952\n_标清\t230953\nsinr\t230954\n水性丙烯酸树脂\t230955\n藠头\t230956\n中星9号\t230957\n即用型\t230958\n伊秀娱乐网\t230959\n孟子\t230960\n婆罗门教\t230961\n一答\t230962\n联欢会\t230963\n富阳区\t230964\nsonnet\t230965\n小黑猫\t230966\n中国电子科技集团公司第五十研究所\t230967\n程龙\t230968\n爱不爱\t230969\n金投股票-金投网\t230970\nGPK\t230971\n广州军区武汉总医院\t230972\nlipa\t230973\n自然乐园\t230974\n33路\t230975\nslept\t230976\n桃井\t230977\n浩天信\t230978\n犀利\t230979\n破格\t230980\n日本富士\t230981\n恶作\t230982\n包月\t230983\n八个字\t230984\n北宫森林公园\t230985\n同恋\t230986\n纳米二氧化钛\t230987\n航空公司\t230
988\n96页\t230989\n石垣岛\t230990\n油菜花\t230991\nB+\t230992\nex2006\t230993\n3.4.6\t230994\n深圳地铁集团\t230995\n帕提欧\t230996\n停水停电通知网\t230997\n匪首\t230998\n字号\t230999\n向华刘备\t231000\n加重\t231001\n48天\t231002\n海南省政府\t231003\n保定地区\t231004\nSia\t231005\nRamos\t231006\n三叉戟\t231007\n引援\t231008\n天真无邪\t231009\n扑扑\t231010\n淮安市洪泽区政府\t231011\n厦门市软件园二期\t231012\n天夏\t231013\n德隆\t231014\n除疤膏\t231015\n晋绥\t231016\n杀鸡\t231017\n黯然神伤\t231018\n20161130\t231019\n宠物版\t231020\n万州机场\t231021\n月曜\t231022\n大非农\t231023\nDiscourse\t231024\nSDMT\t231025\n龍壇\t231026\n诸葛烤鱼\t231027\nK12\t231028\n饭钱\t231029\n000059\t231030\n蓝裙\t231031\n龙宝\t231032\ndirac\t231033\n立柱\t231034\n高挑\t231035\n多峰\t231036\n绿森林硅藻泥\t231037\n科蓝\t231038\n黑笔\t231039\n美式橄榄球\t231040\nhd7750\t231041\n热镀锌板\t231042\nDivergence\t231043\n巴利语\t231044\nGEETEST\t231045\n上大路\t231046\nsongs\t231047\n91.6\t231048\n菜鸟驿站\t231049\n虞永平\t231050\n献殷勤\t231051\n举家\t231052\n致学网\t231053\nkies3\t231054\n爱之梦\t231055\n网站数据库\t231056\ndtm\t231057\nPvc\t231058\n第一期\t231059\n2265\t231060\n千字文\t231061\n灵验\t231062\nr8\t231063\n而动\t231064\n信阳菜\t231065\n基zz\t231066\n薪金煲\t231067\n张子\t231068\n标致408论坛_汽车之家论坛\t231069\nGraphite\t231070\n格杀勿论\t231071\nconcerto\t231072\n糖宝\t231073\n一下一个\t231074\n小說\t231075\n列席\t231076\n济宁市委\t231077\n俄罗斯远东地区\t231078\n宋阳\t231079\n好易\t231080\n家庭\t231081\n文宣\t231082\nPOCIB\t231083\n传递性\t231084\n曾祥展\t231085\n新长铁路\t231086\n胡雪梅\t231087\n激酶\t231088\n6.48\t231089\n辽宁医学院\t231090\n方得\t231091\n深山老林\t231092\n边距\t231093\n门扣\t231094\n沈晓海\t231095\n播种器\t231096\n22度\t231097\n红安网\t231098\n90首\t231099\nhuizhou\t231100\n卡西姆\t231101\n88titlename88\t231102\n鱼机\t231103\n高明区\t231104\n叶梦\t231105\n天虹商场\t231106\n66游\t231107\n冰激凌机\t231108\nsetValue\t231109\n钱德洪\t231110\n祝贺语\t231111\n贝多芬钢琴奏鸣曲\t231112\n兴安职业技术学院\t231113\n移动联通4G\t231114\n架桥机\t231115\nHero4\t231116\n留种\t231117\nDDecode\t231118\n一八\t231119\n有图有真相_\t231120\nluoji\t231121\n对联\t231122\n省台\t231123\n历峰\t231124\n宫底\t231125\n白鹤山\t231126\n资本市场部\t231127\n名吃\t231128\n1863年\t231129\n婺源\t231130\n百万年\t231131\n检验台\t231132\n蜂蜜柠檬茶\t
231133\n安全员证\t231134\n液体流量计\t231135\n高效能人士的七个习惯\t231136\n220kV\t231137\ncaptain\t231138\n神龙峡\t231139\n库巴\t231140\n茶轴\t231141\n华人运通\t231142\n蜈蚣精\t231143\n企信网\t231144\naviation\t231145\n管理学部\t231146\n5月8日\t231147\n郝\t231148\nremmina\t231149\n力学\t231150\n发源地\t231151\n巧克力糖\t231152\n山塘街\t231153\nGTA4自由城之章\t231154\n50MW\t231155\n博士伦\t231156\n实验稿\t231157\n丘处机\t231158\n口瓶\t231159\n案防\t231160\n贝瓦儿歌\t231161\n中国沧州政府\t231162\n罗振玉\t231163\n薛葵\t231164\n隐逸\t231165\n中资美元债\t231166\nOG\t231167\n本地版\t231168\n佰澳朗德\t231169\n烛秋\t231170\ncsb\t231171\n500小时\t231172\n张老板\t231173\nAHC\t231174\n百性阁\t231175\n王长喜\t231176\ncovering\t231177\n挂式\t231178\n递归\t231179\n夏窗\t231180\n笛莎\t231181\n航向\t231182\n退休后\t231183\n站前街道\t231184\n一千公里\t231185\n万达主题乐园\t231186\n盛启\t231187\n英国利兹大学\t231188\n佛教八大宗派\t231189\n5a\t231190\n学佛\t231191\n110ml\t231192\n成人生\t231193\n徐剑\t231194\ncqy吧\t231195\n选书网\t231196\n青城山六善酒店\t231197\ncl地址\t231198\n竹坑\t231199\n3610\t231200\n京本\t231201\n彰武\t231202\n荆州市人民政府\t231203\n扩展\t231204\nsex8cc\t231205\n仰卧起坐\t231206\n特锦赛\t231207\n卓云\t231208\n守灵\t231209\n竹田\t231210\n郑州注册公司\t231211\nsweetalert2\t231212\n021|\t231213\n动建\t231214\n1.0.0.9\t231215\nlck2018\t231216\n云打包\t231217\n嘉利\t231218\ns+1\t231219\n感恩季\t231220\n张书记\t231221\n电信学院\t231222\n幸福时代\t231223\n村社\t231224\n1.8t\t231225\n永平路\t231226\n环评师\t231227\n死刑\t231228\nGTX750Ti\t231229\n女丽网\t231230\n黑盒\t231231\n艺兴\t231232\n纯美\t231233\n海油工程\t231234\n格纹\t231235\n重学\t231236\n幽径\t231237\n威朗\t231238\niis6.0\t231239\n南财大\t231240\n邵阳新闻在线\t231241\n孤寡\t231242\n计划图\t231243\n星龙\t231244\n金网\t231245\n玉东新区\t231246\n西递村\t231247\n殿下\t231248\n繁体子\t231249\n整蛊\t231250\n近人\t231251\n平原\t231252\n黄河网\t231253\n国家卫计委\t231254\njubeat\t231255\n梦伴\t231256\n180万吨\t231257\n1年多\t231258\n严查\t231259\n接地端\t231260\n3万余\t231261\n义和\t231262\n十二栋\t231263\n223.104.255.25\t231264\n两千五\t231265\n等你来\t231266\n广东省人力资源和社会保障厅\t231267\n金升\t231268\n骑马与砍杀光明\t231269\n灯头盒\t231270\n青山湖区\t231271\n唐巧\t231272\nswj\t231273\n反逆\t231274\n划艇\t231275\n次品\t231276\nsimplorer\t231277\n十五号\t231278\n认筹金\t23
1279\n太好\t231280\n记事簿\t231281\n80240037\t231282\n奋达科技\t231283\n旱冰场\t231284\n羊血\t231285\nturn\t231286\n建筑物\t231287\ncib\t231288\n周海亮\t231289\n解决型\t231290\n园岭小学\t231291\nlocalinstall\t231292\n嫂\t231293\n百分比例\t231294\n大轮\t231295\n黔张常铁路\t231296\n床房\t231297\n红坊\t231298\n九方\t231299\n上班族\t231300\nパイズリ\t231301\n陈秋\t231302\n王文彪\t231303\n返回值类型\t231304\n喷油器\t231305\nsmj\t231306\n横竖\t231307\n凄美\t231308\n多轴\t231309\n重度脂肪肝\t231310\nSqlMap\t231311\n亚瑟士\t231312\n缺钱\t231313\n何雄\t231314\n凤凰生活网\t231315\n翠芽\t231316\n花果园\t231317\n国家新闻出版广电总局电影局\t231318\ncrescent\t231319\n宝\t231320\n霉素\t231321\n光学取景器\t231322\n卖油\t231323\n水封\t231324\n刘晓云\t231325\n11.3.3\t231326\n无休止\t231327\n血印\t231328\n梁颖\t231329\n6401\t231330\n多米\t231331\n12毫米\t231332\n技成培训网\t231333\n兔人\t231334\nVer.2\t231335\nMactype\t231336\n广州妈妈网\t231337\n多巴胺\t231338\napache-tomcat\t231339\nProgram\t231340\n实力\t231341\n卓凡\t231342\n珍稀\t231343\nyse\t231344\n驴苗\t231345\n我和僵尸有个约会3\t231346\n狮子林\t231347\n秦毅\t231348\n半个小时\t231349\n邻家\t231350\n安全师\t231351\n藏书楼\t231352\n西部信托\t231353\n召唤类\t231354\n更年期\t231355\n戒赌吧\t231356\n断送\t231357\n无轨\t231358\n江苏食品药品职业技术学院\t231359\n华鼎\t231360\n孙海涛\t231361\n舒城县人民政府\t231362\n区里\t231363\n杂货铺\t231364\nsunos\t231365\n修真者\t231366\n一照一码\t231367\n新时代人大\t231368\n临时演员\t231369\n泣鬼神\t231370\n嗨滁网\t231371\n滥\t231372\n万安县\t231373\n两居室\t231374\n四期\t231375\n桃花涧\t231376\n亚马逊子\t231377\n锐澳\t231378\n香水座\t231379\n蜘蛛王\t231380\n上班时\t231381\n2018.3.23\t231382\n叙永县\t231383\n清水\t231384\n椎管内肿瘤\t231385\n炉石传说吧_\t231386\n炔雌醇环丙孕酮片\t231387\n氡气\t231388\n包吃住\t231389\n目\t231390\n李佩斯\t231391\nCHARLS\t231392\n洞天\t231393\n84天\t231394\n江河\t231395\n中地\t231396\nbaner\t231397\n大熊市\t231398\n贵阳市住房和城乡建设局\t231399\n神火\t231400\n马克笔画\t231401\n李庆远\t231402\n莲池区\t231403\n极速蜗牛\t231404\n水柔\t231405\n完饭\t231406\nguard\t231407\n汉源县\t231408\nPET-CT\t231409\nPyhton\t231410\nbarron\t231411\n事者\t231412\n圣甲\t231413\n理实一体化\t231414\n瑞虎3论坛\t231415\n降世\t231416\n旅游线路\t231417\n27#\t231418\n30万吨\t231419\n太湖大道\t231420\n周主任\t231421\n南阳市\t231422\n南川区\t231423\n四海钓鱼网\t231424\n交趾\t2
31425\n黄浦\t231426\nDetermine\t231427\nOops\t231428\n烟酒店\t231429\narf\t231430\n硫酸亚铁\t231431\n锚链接\t231432\n多益战盟\t231433\n红鱼儿\t231434\n规律\t231435\n飞鹿言情小说网\t231436\n胸腺肽\t231437\n接近角\t231438\n希爱力混合片\t231439\n小派\t231440\n得乐\t231441\n百望山\t231442\nvisa卡\t231443\n邻家大贱谍\t231444\n类类\t231445\n裂化\t231446\n恒大御澜庭\t231447\n译码\t231448\n沙城\t231449\n流场\t231450\n杜震宇\t231451\n国假\t231452\n州名\t231453\n门萨俱乐部\t231454\nmoving\t231455\n本来生活\t231456\n军嫂网\t231457\n翳\t231458\n3-4天\t231459\n旬刊\t231460\n明锐论坛\t231461\n唐硕\t231462\n波形\t231463\n血白细胞\t231464\nParents\t231465\n刘松仁\t231466\n审核期\t231467\n生态主义\t231468\n烛火\t231469\n挂角\t231470\n第24个\t231471\n点处\t231472\nmp10\t231473\n港\t231474\n麒麟960\t231475\n攻壳\t231476\n配音师\t231477\nshufa\t231478\n我需要你\t231479\n平石\t231480\n枪\t231481\nBMR\t231482\n04期\t231483\n卷珠帘\t231484\n产局\t231485\n干熄焦\t231486\n槽君\t231487\n普宁高铁站\t231488\n补档\t231489\n宇智波止水\t231490\niphones\t231491\n山珍\t231492\n神将三国\t231493\n反之\t231494\n結衣\t231495\n队友\t231496\n吮\t231497\nbc221\t231498\nLocked\t231499\n神话版\t231500\n珠帘\t231501\nRobotFrameWork\t231502\nreplace\t231503\n伦乱\t231504\n凤塘镇\t231505\n周次\t231506\nCES2018\t231507\n数字量\t231508\n邓飞虎\t231509\n38938291\t231510\n叶片泵\t231511\n淘宝权\t231512\n吕洞宾\t231513\n保交\t231514\n云领\t231515\n39度\t231516\n税前\t231517\n准格尔\t231518\nLAD\t231519\n可适\t231520\n花爷\t231521\n森川智之\t231522\n60R17\t231523\n白孝文\t231524\n我的妈妈作文\t231525\nQ友\t231526\n一两句\t231527\n孤岛危机\t231528\n巅峰版\t231529\n木珠\t231530\nxialeistudio\t231531\n低层\t231532\n青城山镇\t231533\n七乐彩\t231534\nquqi\t231535\nlazy\t231536\nmushi\t231537\n逆生长\t231538\nKumar\t231539\n文画\t231540\n舞社\t231541\n电柜\t231542\n仙灵骨葆\t231543\n列表\t231544\n冷杉\t231545\nMydm\t231546\n勤学培训网\t231547\n甩鞭\t231548\n泽熙\t231549\n惠康\t231550\n元氏吧_\t231551\n下属县\t231552\n协定\t231553\n怕\t231554\nQC4.0\t231555\n地丁\t231556\n高密政府网\t231557\npana\t231558\n总舵\t231559\n财富之路\t231560\n银行人\t231561\n周胜\t231562\n和谐型\t231563\nGusto\t231564\nAcquire\t231565\n强婚\t231566\n全要素\t231567\n几位\t231568\nSSO单点\t231569\n中国光大集团\t231570\nprin\t231571\n爱之初\t231572\n内向型\t231573\n玻璃屋
\t231574\n滴滴顺风车主\t231575\nspecialized\t231576\nvivoy75\t231577\n天津市财政局\t231578\n快捷人才网\t231579\n繁荣富强\t231580\n台东\t231581\n曾豪\t231582\n586\t231583\n充盈\t231584\n恩物\t231585\n国际酒店\t231586\nアタシ\t231587\n黛眉山\t231588\n松南\t231589\n倾心\t231590\nwhichever\t231591\n孵化\t231592\nCSC\t231593\n方寸\t231594\nhsa\t231595\nfinder\t231596\n灵缇犬\t231597\nDZS\t231598\n安少\t231599\n疼感\t231600\n胡塞尔\t231601\nMel\t231602\n漏项\t231603\n韩料\t231604\n布莱德利·库珀\t231605\nbelli\t231606\n西西软件园\t231607\n第一句\t231608\n万峰\t231609\n抚州市人民政府\t231610\nA3\t231611\n牛元\t231612\n中建大厦\t231613\n柯蒂斯\t231614\n良渚\t231615\n日照市教育局\t231616\n万箭\t231617\n3.06\t231618\n永定河\t231619\n从前\t231620\npigu\t231621\nruntime-|1-1-0.dll\t231622\n前下\t231623\n牛鲨\t231624\ncircus\t231625\n7k7k皮卡堂\t231626\n西霞口\t231627\n双4g\t231628\n别有\t231629\n2017年2月13日\t231630\n东方明珠电视塔\t231631\nReinforced\t231632\n江门站\t231633\n滴滴快车\t231634\n木林镇\t231635\nonl\t231636\n2016年11月12日\t231637\n细犬\t231638\n卡神\t231639\n小鹏\t231640\nservelt\t231641\n国家工商总局\t231642\n威朗15S\t231643\n无话可说\t231644\n陈若轩\t231645\n王士\t231646\nnsd\t231647\n开发式\t231648\n美国国务院\t231649\n逍客\t231650\n热食\t231651\nkapt\t231652\n2018-03-09\t231653\n延长术\t231654\n缠绵\t231655\n上法\t231656\nM40\t231657\n虹金所\t231658\n天卓\t231659\n合成型\t231660\n慢性非萎缩性胃炎\t231661\n大玩\t231662\n便利\t231663\n撕烂\t231664\n一小时\t231665\n一号线\t231666\n典当业\t231667\n大公\t231668\n金软景\t231669\n宝工\t231670\n三年多\t231671\n农村经济发展\t231672\n工作型\t231673\nLanka\t231674\n尖鞋\t231675\n鼠人\t231676\n舞钢市\t231677\n8.18\t231678\n列数\t231679\n李媛媛\t231680\n小庆\t231681\n速冻水饺\t231682\n开启\t231683\n朝南\t231684\n超越离合器\t231685\n一直\t231686\n半亩地\t231687\nSurface\t231688\n斗破苍穹OL\t231689\n3^3\t231690\n斯蒂芬·库里\t231691\n彬彬有礼\t231692\njudge\t231693\n神门\t231694\n蚊虫叮咬\t231695\n黄埔大道\t231696\n弘盛\t231697\n生化\t231698\nLIFT\t231699\nMultiplication\t231700\n0558\t231701\n十八道\t231702\n煅烧炉\t231703\n富力新城\t231704\n心心\t231705\n0326\t231706\n装甲\t231707\n各所\t231708\nReferral\t231709\n柒柒\t231710\nmanjaro\t231711\n骑车\t231712\n玛丽莲\t231713\n老布\t231714\n上海市建筑施工行业协会\t231715\n不堪回\t231716\nCheckPoint\t2317
17\n娘娘\t231718\n经鉴定\t231719\n永恒之焱\t231720\nr17\t231721\n合肥在线\t231722\n尾号限行\t231723\n撵\t231724\n沙耶\t231725\n数据魔方\t231726\n杂志虫\t231727\n星章\t231728\n京开高速\t231729\n1.11.5\t231730\n中华人民共和国主席\t231731\nRodeo\t231732\n冯建华\t231733\n羊骨\t231734\nauthor\t231735\n金源路\t231736\n文件函数\t231737\n箱里\t231738\n经史\t231739\n毛笔\t231740\n刀口\t231741\nlibxml\t231742\n依兰\t231743\n幂级数\t231744\n夏琳\t231745\n法拉\t231746\n2k屏\t231747\n大麟子\t231748\n2828\t231749\n劍\t231750\nsleep函数\t231751\nInvestopedia\t231752\n昂\t231753\na片\t231754\n99re6\t231755\n广西高院\t231756\n水鸟\t231757\n每户\t231758\n自慢\t231759\n胀栓\t231760\n早纪\t231761\n武汉东湖高新区\t231762\najuste\t231763\nSeventh\t231764\n艺术型\t231765\nArgus\t231766\n王蒙\t231767\n委任状\t231768\n封神榜传奇\t231769\n血猎者\t231770\nWebSockets\t231771\n串词\t231772\n206所\t231773\n克鲁兹\t231774\n接收站\t231775\n三峡电力职业学院\t231776\n蜀山战纪之踏火行歌\t231777\n国威ws824\t231778\nframework\t231779\n棚膜\t231780\n努比亚Z17miniS\t231781\n斯普利特\t231782\n增根\t231783\n新博狗\t231784\n信教\t231785\n百斯盾\t231786\n被拘\t231787\n人\t231788\njhu\t231789\n三十公里\t231790\n唐朝\t231791\nLeetcode\t231792\n26亿元\t231793\n再孕\t231794\n醛酮\t231795\n欣妈富隆\t231796\n纽约现代艺术博物馆\t231797\n自动开关机\t231798\n爽点\t231799\n中国西北\t231800\n青柚\t231801\n包年\t231802\n人无完人\t231803\n巡司河\t231804\n银信科技\t231805\n人工湖\t231806\nTherapy\t231807\n血龙\t231808\n张震宇\t231809\n人生观\t231810\n925纯银\t231811\n九龙神鼎\t231812\n费尔巴哈\t231813\n早春呈水部张十八员外\t231814\n十一次\t231815\n崔莉\t231816\n2ce\t231817\n9g\t231818\n喷雾罐\t231819\n变电所\t231820\n昌里\t231821\n大槻响\t231822\n胃热\t231823\n金峰\t231824\n两条杠\t231825\nwww82kkkkcnm\t231826\n突围\t231827\n上海移动\t231828\nNotices\t231829\n好彩网\t231830\n医护版\t231831\n褥垫层\t231832\n搬空\t231833\n压枝\t231834\n成都航空学校\t231835\n条码扫描\t231836\n季华\t231837\n麦麸\t231838\n受伤了\t231839\n癌痛\t231840\n雅堂\t231841\n凤城九路\t231842\n豫人社\t231843\n一枝笔\t231844\n陈布雷\t231845\n假梁\t231846\n岳鹏\t231847\n双截龙4\t231848\n长房云时代\t231849\nLibratone\t231850\n安戈洛环形山\t231851\n两宝\t231852\n氤氲\t231853\n760型\t231854\n16.04.1\t231855\n市一中\t231856\n伊吾县\t231857\n0.007\t231858\n大图书馆\t231859\n一g\t231860\n29章\t231861\n中山市统计局\t23186
2\n5格\t231863\n忤逆\t231864\n227号\t231865\n九颗\t231866\n一百_\t231867\ngh\t231868\n石歧\t231869\n药材\t231870\n紧实\t231871\n算机\t231872\nMate8\t231873\n天裕\t231874\n朱自蕾哈娜\t231875\n彪哥闯奉天之做梦没想到\t231876\n铁狮\t231877\n6科\t231878\n半学期\t231879\n联络部\t231880\n椰子肉\t231881\n角尖\t231882\n50万平方米\t231883\n水务局\t231884\n王者荣耀百里守约特工魅影\t231885\n杨凌农科城\t231886\n夏庄街道\t231887\n云笛\t231888\n好店\t231889\n廖承志\t231890\n铸业\t231891\nKiwi\t231892\nmaria\t231893\n白岩山\t231894\n铆钉\t231895\nPM20L-020永磁式步进电机\t231896\n东方路\t231897\n铃木园子\t231898\n命名权\t231899\n福利彩\t231900\n国家级\t231901\n滚蛋\t231902\n天云\t231903\n况味\t231904\n350THP\t231905\n斯米兰群岛\t231906\n稳控\t231907\n白银便民网\t231908\n2.4plus\t231909\n安平便民网\t231910\n点米\t231911\n南烟斋\t231912\n扩编\t231913\n直流接触器\t231914\n张江集团中学\t231915\nscrolls\t231916\n苏霍姆林斯基\t231917\nBAII\t231918\n连云港电视台\t231919\nhomestay\t231920\nOhio\t231921\n第12卷\t231922\n土狼\t231923\n自然经济\t231924\n泰禾中央广场\t231925\ntl10\t231926\n摩友们\t231927\nphi\t231928\npirates\t231929\n感激涕零\t231930\n接站\t231931\n认不认识\t231932\n国会大厦\t231933\nLasso\t231934\n剑意\t231935\n杂曲\t231936\n洗颜\t231937\n小米6#小米6\t231938\n大众MIB\t231939\n艾特网\t231940\n袜业\t231941\n铝件\t231942\n3dmark11\t231943\n辩经\t231944\n魔工坊\t231945\nCURSOR\t231946\n惊悚类\t231947\n胆结石\t231948\n头孢呋辛酯片\t231949\n攻修\t231950\n烧尾\t231951\n邪恶漫画无翼鸟\t231952\nAyawawa\t231953\nnzt\t231954\n丙火\t231955\n抽蛋\t231956\n鱼肥\t231957\n海棠路\t231958\n西松屋\t231959\n20年前\t231960\nKeeping\t231961\n怀庄\t231962\n宠物商城\t231963\n信息工\t231964\n界首\t231965\n金庸群侠传之苍龙逐日\t231966\n大众\t231967\n印钞机\t231968\n滞留\t231969\n差差\t231970\n丝杆\t231971\n罗蒙环球城\t231972\n犀利仁师\t231973\n领跑者\t231974\n的士\t231975\nGPP\t231976\n屏山镇\t231977\n拉蒙\t231978\n晒跑\t231979\n蘑菇汤\t231980\n民国时期\t231981\n武二郎\t231982\n海宁新闻网\t231983\n魔术家\t231984\n深圳市农产品股份有限公司\t231985\n枝枝\t231986\n节欲\t231987\nvoid*\t231988\n楼层板\t231989\n黑道\t231990\n铣槽机\t231991\n翻唱\t231992\n一人饮酒醉\t231993\n北京邮电大学世纪学院\t231994\nheatmap\t231995\n昆明长水机场\t231996\nblh\t231997\n滴滴专车\t231998\n红磷\t231999\n0.5w\t232000\n天生桥风景区\t232001\n红动中国\t232002\n9.5万\t232003\n折纸吧\t232004\n100M\t232005\n五微\t232006\n卢
拉\t232007\n东方\t232008\n两不厌\t232009\nQUOT\t232010\n东新区\t232011\n低手\t232012\n纠\t232013\n蛇身\t232014\n育新\t232015\n霸域\t232016\n10月1日起\t232017\nRare\t232018\n樱花公园\t232019\n益田村\t232020\n锐志琴帝\t232021\n赵汀阳\t232022\nDeb\t232023\n汽枪\t232024\n无花果论坛\t232025\n频数分布表\t232026\nConsultant\t232027\n洒布\t232028\nZMX\t232029\n原性\t232030\n大智若愚\t232031\n磁力链\t232032\n张予曦\t232033\nsmf\t232034\n赋帝\t232035\nStandings\t232036\n米油\t232037\n97天\t232038\n特种兵学校\t232039\n832集\t232040\n驿站\t232041\n电子技\t232042\nzhineng\t232043\njypt\t232044\n新城花园\t232045\n摩提\t232046\n于东\t232047\n武康大楼\t232048\n停下来\t232049\n海湾镇\t232050\n酒曲\t232051\n叨鱼\t232052\n凌弱\t232053\n上海外国语大学贤达经济人文学院\t232054\n3000米\t232055\n铁牛集团\t232056\n飞歌导航\t232057\n中国教育技术协会\t232058\n天津市第三中心医院\t232059\n先锋村\t232060\n长三角\t232061\n三颗\t232062\n买房日记\t232063\n童梦\t232064\n邳县\t232065\n网络型\t232066\n輸\t232067\n吴婷婷\t232068\n囚服\t232069\n游客\t232070\n三长\t232071\n土耳其进行曲\t232072\nzeromike\t232073\n郑绪岚\t232074\n破产\t232075\n胡蝶\t232076\n东莞市民政局\t232077\n字节流\t232078\n高清全集\t232079\n茶香\t232080\n枫糖浆\t232081\n致痘\t232082\n耐美尔\t232083\nlinux防火墙\t232084\n600D\t232085\n航空餐\t232086\niphone5吧_\t232087\nWipe\t232088\n换上瘾\t232089\n贵州省新闻出版广电局\t232090\nstochastic\t232091\n卧虎藏龙2\t232092\n产品线\t232093\n食肉\t232094\nWai\t232095\n东方启辰\t232096\n广东人大\t232097\n停车楼\t232098\n油酥饼\t232099\n古今大战秦俑情\t232100\n神爱之家\t232101\n东宏\t232102\nspringCloud\t232103\n苏讯网\t232104\n专长\t232105\ncnm\t232106\n大贤\t232107\n机娘\t232108\n滁宁\t232109\nBras\t232110\nspace\t232111\n走动\t232112\nKvm\t232113\noda\t232114\n一列多行\t232115\n新市民\t232116\n碧桂园营销中心\t232117\n非种马小说吧\t232118\nmva\t232119\n卡行\t232120\n2017速卖通\t232121\n块料\t232122\n資訊\t232123\n杨丽丽\t232124\nbdmv\t232125\nInterests\t232126\n盐酸西替利嗪片\t232127\n接力赛\t232128\namr格式\t232129\n闪点悖论\t232130\n自然增长率\t232131\n高野\t232132\n微课程\t232133\nsunflower\t232134\n任何人\t232135\n油浴锅\t232136\n鲁镇\t232137\nSWAG\t232138\n佛爷美容院\t232139\n6068\t232140\nBhd\t232141\n如流\t232142\n云中子\t232143\n企业宝\t232144\n狗鞭\t232145\nuiautomatorviewer\t232146\n方舒\t232147\n英雄传说:闪之轨迹\t232148\n槽轮机构\t232149\nSN码\t23
2150\n华锦股份\t232151\n105家\t232152\n扬州经济技术开发区\t232153\n修习\t232154\nBeyond\t232155\ncdrx4\t232156\n许昌职业技术学院\t232157\n龙煤集团\t232158\n迷谷\t232159\nVermont\t232160\n卓丽\t232161\n诊病\t232162\n可爱女人\t232163\n邻伴网\t232164\n霹雳先锋\t232165\n正体\t232166\n梦想版\t232167\n正切函数\t232168\n家女\t232169\n渗透乳\t232170\npg数据库\t232171\n孝感东\t232172\n鼓词\t232173\n株洲在线论坛\t232174\n索套\t232175\n软带\t232176\n老母\t232177\n西蒙菲莎大学\t232178\n路灯杆\t232179\n李昆\t232180\nhasco\t232181\n今季\t232182\n沈超\t232183\n典藏\t232184\nV4.9\t232185\n分解器\t232186\n重获\t232187\n克里斯保罗\t232188\n杀马特\t232189\n调发\t232190\n乔叶\t232191\n脚背\t232192\nSulwhasoo\t232193\n中山站\t232194\n无线电视\t232195\n正午30分\t232196\n特洛\t232197\nsexinsx\t232198\n侠骨\t232199\nOrc\t232200\n城邦\t232201\nC280\t232202\n8916\t232203\n高脚杯\t232204\n代建制\t232205\n国行版\t232206\n网吧\t232207\n壁挂炉\t232208\nPAX\t232209\n吃喝玩\t232210\n中国宝安\t232211\n模器\t232212\n阴人\t232213\n50张\t232214\n静电消除器\t232215\n挺纪\t232216\n音膜\t232217\n锁闭阀\t232218\nlib64\t232219\n直进\t232220\n深入浅\t232221\n23位\t232222\n半次元\t232223\n伪史\t232224\nLED洗墙灯\t232225\nlidar\t232226\n码垛机器人\t232227\n中房地产\t232228\n收听\t232229\nhijack\t232230\nbald\t232231\nv5.0.4\t232232\n承德政府网\t232233\n重庆市文化委员会\t232234\n赵皇帝\t232235\n汤汁\t232236\nRental\t232237\n胡杏儿\t232238\n20151021\t232239\n浙江电台\t232240\n9.1\t232241\n10C\t232242\n分隔符\t232243\n狗爪\t232244\n老岳\t232245\n汛期\t232246\n48分钟\t232247\n全幅\t232248\n事件簿\t232249\n新三国志孔明传\t232250\n驱动\t232251\n_巴士枪神纪专区\t232252\n压件\t232253\n湛江\t232254\n齐河县\t232255\n王雪松\t232256\n孙明霞\t232257\n纪实摄影\t232258\n什么系\t232259\n马三立\t232260\npyglet\t232261\n第十一个\t232262\n丁凯\t232263\nearlier\t232264\n衣裤\t232265\n高要\t232266\nD1\t232267\n淘宝装修\t232268\nellipse\t232269\n烤鹅\t232270\n63期\t232271\n一隅\t232272\n费版\t232273\n丁类厂房\t232274\n定金\t232275\n恒载\t232276\n江苏省委宣传部\t232277\n挪移\t232278\nHum\t232279\n长线\t232280\n4.49\t232281\n2015年5月份\t232282\n宗介\t232283\n热麦\t232284\n爱姉妹\t232285\n全量\t232286\n不期而遇\t232287\n四体\t232288\n秦伯\t232289\n暗讽\t232290\n质保\t232291\n陆家浜路\t232292\n轴突\t232293\n诊室\t232294\n坑爹\t232295\nAsylum\t232296\n第38话\t232297\n魔兽\t232298
\n台州路桥区\t232299\n漆皮\t232300\n贝塔历险记\t232301\n杠灯\t232302\n平乐县\t232303\n双网通\t232304\nAggregation\t232305\n同数\t232306\n2017万圣节\t232307\nkmf\t232308\n邻家少女\t232309\n鞋号\t232310\n六核\t232311\n嘌呤霉素\t232312\n喜出望外\t232313\n决定者\t232314\n北洋舰队\t232315\nZa\t232316\n杭州客运中心站\t232317\nbqqm\t232318\n永贵电器\t232319\n高坂保奈美\t232320\n科普特\t232321\nインタ\t232322\n蓝田镇\t232323\n程强\t232324\n付邮\t232325\n三十题\t232326\n守门\t232327\nDenver\t232328\n八戒日付网\t232329\n断线\t232330\n二初\t232331\nububtu\t232332\nenroll\t232333\n带头\t232334\nFAKER\t232335\n妹纸们\t232336\n西湖花园\t232337\n丹阳房产网\t232338\nuim\t232339\nEcharts3.0\t232340\n永夜大教堂\t232341\n陡坡\t232342\nreads\t232343\n亚硫酸根\t232344\n狮子女\t232345\n国富论\t232346\n化验\t232347\nberkeley\t232348\n第2课\t232349\n鲁鲁\t232350\n侨联\t232351\n硕果\t232352\n100|\t232353\n李昌镐\t232354\n铜铃\t232355\n0929\t232356\n四百斤\t232357\nSecretary\t232358\ng1x\t232359\n_化\t232360\n海淘UK站\t232361\n巨乳娘\t232362\npyCharm\t232363\n兼容\t232364\n仙剑诀\t232365\ndotNET\t232366\n庄妮\t232367\n雪佛兰科鲁兹\t232368\n白尖\t232369\n非电解质\t232370\n总承\t232371\n前列腺结石\t232372\n赤砂之蝎\t232373\n鸡蛋牛奶\t232374\n长草\t232375\n磁护\t232376\n法利\t232377\n老蔡\t232378\nFOREX\t232379\n新湖路\t232380\n开不开\t232381\n烦心\t232382\ninotifywait\t232383\n泉州市人力资源和社会保障局\t232384\nlaboratory\t232385\n变态杀人\t232386\nDK\t232387\n耳旁\t232388\n1050ti\t232389\n厄运小姐\t232390\n德晟\t232391\n五月天色情网\t232392\nxiaotian\t232393\nquery\t232394\n开鑫贷\t232395\n张申\t232396\na37m\t232397\nxlc\t232398\nfs\t232399\njaedong\t232400\n6#\t232401\n2018节假日\t232402\n天天爱西游\t232403\n河大\t232404\n救药\t232405\n陈果\t232406\n89.5\t232407\n四刃\t232408\n令人瞩目\t232409\n夏峥\t232410\nfall\t232411\n王云\t232412\nZynq\t232413\n该片\t232414\n侯耀文\t232415\n代投\t232416\n山河水\t232417\n氙气灯\t232418\nCICS\t232419\nhangye\t232420\n域名商\t232421\n太浩湖\t232422\n路人甲\t232423\nddg\t232424\n第10条\t232425\n中国起重机械网\t232426\n市运管局\t232427\n常昊\t232428\n10kV\t232429\n贵安新天地\t232430\n打卷机\t232431\n寄封\t232432\nhypothesis\t232433\nRubyMine\t232434\n来的爱\t232435\nleone\t232436\n黑醋\t232437\n票务通\t232438\n荣耀3C\t232439\n4k播放器\t232440\n广州十三行\t232441\n钟宅\t23244
2\n妮娜·杜波夫\t232443\nUGREEN\t232444\n3.1万\t232445\nJTT\t232446\n申世景\t232447\n鬼位\t232448\n路易斯康\t232449\n第97\t232450\n放射源\t232451\n惊觉\t232452\n华中师范大学\t232453\nSSDB\t232454\n你情我愿\t232455\n刀光剑影\t232456\n草帽\t232457\n2018年3月30日\t232458\n黄槐\t232459\n爆门\t232460\nqualifier\t232461\n风速\t232462\n韩俊\t232463\n北京市区\t232464\n鲁德\t232465\n一起走过的日子\t232466\ne470\t232467\nPROPERTY\t232468\n正者\t232469\n思源地产\t232470\n京新\t232471\n密云水库\t232472\nkindlefire\t232473\nHonda\t232474\n范长龙\t232475\nPINGO\t232476\n自作\t232477\n寻衅滋事罪\t232478\n苏外\t232479\nbf4\t232480\n粉刷\t232481\n你們\t232482\nsbl\t232483\n56次\t232484\n北京地铁四号线\t232485\n隐龙传\t232486\n曼昆宏观经济学\t232487\n6月4日\t232488\n中国烟草总公司辽宁省公司\t232489\nSaved\t232490\npreprocessor\t232491\n聚乙烯复合管\t232492\n国际货币\t232493\n品牌规划\t232494\n卡牛\t232495\npre段\t232496\n要卡\t232497\n黄姚古镇\t232498\n植物大战僵尸\t232499\n国家土地管理局\t232500\n影音先锋片源_影音先锋看av片_xfplay资源看片源\t232501\n桃汁\t232502\n36路\t232503\n众发\t232504\n血淀粉酶\t232505\nLOOKBOOK\t232506\n辽宁vs广厦\t232507\n竹升面\t232508\n美的电器\t232509\n页面\t232510\n星通\t232511\n电子膨胀阀\t232512\n银子岩\t232513\n陆地巡洋舰\t232514\n盈创\t232515\n花间提壶方大厨\t232516\nforeigners\t232517\nGpu\t232518\n黑棋\t232519\n判别\t232520\n下截\t232521\n和谐区\t232522\n疯狂猜\t232523\n肾损伤\t232524\nPathways\t232525\n凌霄殿\t232526\n麦冬草\t232527\n郭小平\t232528\n邪恶漫\t232529\n茂茂\t232530\n天津市商务委员会\t232531\nept\t232532\n弹车\t232533\n慢一拍\t232534\n架桥\t232535\n余方\t232536\n挂牌\t232537\n暴风转码\t232538\n现钞\t232539\n国家开发银行\t232540\n吉瑞\t232541\n锚\t232542\n郑欧班列\t232543\ncook\t232544\n山东大学第二医院\t232545\n上级领导\t232546\n9.6米\t232547\n因数\t232548\n再造\t232549\n圆通山\t232550\n顾先生\t232551\n预糊化淀粉\t232552\nDetailed\t232553\n自驾旅行\t232554\n捐精者\t232555\n玉如意\t232556\n光脚\t232557\n杰克·吉伦哈尔\t232558\n特丽莎\t232559\n帀\t232560\nhi8\t232561\nemd\t232562\n115斤\t232563\n水涨船高\t232564\n清华大学微电子所\t232565\n神将\t232566\n盘丝洞\t232567\n破表\t232568\ncelia\t232569\n聊斋先生\t232570\n胖女孩\t232571\n讨账\t232572\n绝伤\t232573\n教育科技公司\t232574\n天相\t232575\n有限群\t232576\n联通软件研究院\t232577\n元明粉\t232578\n云南招聘网\t232579\n屏幕分辨率\t232580\n标准文献网\t232581\n安东帕\t232582\nDx\t232583\n北月\t2
32584\n计薪\t232585\n西城区政府\t232586\n刚果民主共和国\t232587\n饭喇\t232588\n2018年02月28日\t232589\n操作证\t232590\n注册营养师\t232591\n如止水\t232592\n不二法门\t232593\n两粒\t232594\n天琴座流星雨\t232595\n诀窍\t232596\n区公安分局\t232597\n哀公\t232598\n舟山市\t232599\n1.025\t232600\nlizhi\t232601\n第4层\t232602\n冷水江市\t232603\n华侨城小学\t232604\n小凤\t232605\n十来天\t232606\nThinkVision\t232607\n水谷幸也\t232608\nVillage\t232609\n救星\t232610\nep3\t232611\n九龙镇\t232612\n劲牌\t232613\n43款\t232614\n污水综合排放标准\t232615\nplacing\t232616\nAtomic\t232617\nBolts\t232618\n电动遥控直升机-5iMX.com\t232619\n西游记之女儿国\t232620\n郑集镇\t232621\nOKR\t232622\n网络学院\t232623\n六盘水市人民政府\t232624\n永恒之塔天族\t232625\n晶盛\t232626\n源头\t232627\n高塘镇\t232628\ndetection\t232629\n第二式\t232630\nprepost\t232631\n武汉大学药学院\t232632\nHi-Fi\t232633\n2000000\t232634\n二值化算法\t232635\niap\t232636\n文明密码\t232637\ntopsolid\t232638\n郁金香文化节\t232639\n兄友\t232640\n2560p\t232641\n卤三国\t232642\nWeb框架\t232643\n不定积分\t232644\n三分地论坛找工求职版\t232645\n12星男\t232646\n热固性塑料\t232647\n日常\t232648\n悦然\t232649\n分数计算器\t232650\n荣耀qq\t232651\n一亿年\t232652\n紫爵\t232653\n野兔子\t232654\n2018年2月\t232655\n行为能力人\t232656\n中华人民共和国中央\t232657\n创酷\t232658\n恒博\t232659\n农苗网\t232660\n腾讯now\t232661\n李侑菲\t232662\nmultiplex\t232663\n仇视\t232664\n6.9分\t232665\nldm\t232666\n无着\t232667\n苦水\t232668\n日抛隐形眼镜\t232669\n第一二三\t232670\n字典树\t232671\n开瑞绿卡\t232672\n泥煤\t232673\nAdBlock\t232674\n何佳\t232675\ngwu\t232676\ncontamination\t232677\nEater\t232678\n办办\t232679\n助力泵\t232680\n地菜\t232681\n江西省统计局\t232682\n十份\t232683\n六鳌\t232684\n耳耳机\t232685\n500度\t232686\n封缄\t232687\n丧尸片\t232688\nドクタ\t232689\n亲们\t232690\n马温泉\t232691\n莫斯科郊外的晚上\t232692\nHerschel\t232693\n达芬奇调色软件\t232694\n15公斤\t232695\n66岁\t232696\n湖南师大附中博才实验中学\t232697\n王辉忠\t232698\n瓜蒌子\t232699\n姬川丽娜\t232700\n俘获\t232701\n洛带镇\t232702\n践踏\t232703\nrecyclebin\t232704\n汉溪\t232705\n联想小新潮5000\t232706\n一年之计在于\t232707\ncontinue\t232708\n眼眶\t232709\n惨状\t232710\n世界之树\t232711\nアンナ\t232712\n不愧为\t232713\nQt\t232714\n晒衣\t232715\n创客大赛\t232716\n关中地区\t232717\n肃然起敬\t232718\nNetBeans\t232719\n曾老师\t232720\n不辞而别\t232721\n偷梁换柱\t232722\
n王位\t232723\n杨舒婷\t232724\n电锅炉\t232725\n大地\t232726\ndart\t232727\n英语系\t232728\nCrown\t232729\n多赢\t232730\n滨州经济技术开发区\t232731\n大根\t232732\n20150721\t232733\n尼玛磁力链接\t232734\n险\t232735\n方家胡同\t232736\n初来乍到\t232737\n搏一搏\t232738\n祛斑霜\t232739\n英雄联盟鸡里奥宝典\t232740\n嗜铬细胞瘤\t232741\n20170415\t232742\n十九大习总书记\t232743\n球速\t232744\nmye\t232745\n卢姥爷\t232746\n电线路\t232747\n000681\t232748\n赛米控\t232749\n摩尼教\t232750\nDanfoss\t232751\n天父\t232752\nC类\t232753\n异父\t232754\n驱魔者\t232755\n梦想小镇\t232756\n成都耍耍网\t232757\n550度\t232758\n达州人大\t232759\n西吉\t232760\n同伙\t232761\n镗孔\t232762\nCocos2d-x\t232763\n事业线\t232764\n北京花乡\t232765\n菲尼克斯\t232766\n明实录\t232767\negen\t232768\n防腐木\t232769\nfactory\t232770\n顾雏军\t232771\n14种\t232772\n冬木\t232773\n91周年\t232774\npaused\t232775\n夜间天\t232776\n惊门\t232777\njauery\t232778\nPIL\t232779\n性爱日记\t232780\n移动侦测\t232781\n伪麻黄碱\t232782\n剑网三捏脸吧\t232783\n安徽省招标集团\t232784\n盘州\t232785\n小处\t232786\n天子峰\t232787\n粤海铁路\t232788\n天开眼\t232789\n降幅\t232790\ngh-pages\t232791\nphase\t232792\n我朋友的老姐\t232793\n拟人化\t232794\nbir\t232795\n理据\t232796\n汪汪队立大功\t232797\nRedshift\t232798\n20180408\t232799\n201411\t232800\n温尼科特\t232801\n600410\t232802\n试题集\t232803\n3.5英寸\t232804\n3dsmax2018\t232805\n量子通信技术\t232806\nKD树\t232807\n智能科技\t232808\nwondows\t232809\n混凝土结构工程施工质量验收规范\t232810\n吴融\t232811\n湘雅常德医院\t232812\n持久代\t232813\ncc1101\t232814\njabra\t232815\n焚天之怒\t232816\n奇葩说4\t232817\n欧非\t232818\n20160313\t232819\n石景山\t232820\n佩雷斯\t232821\n公务舱\t232822\n西宁市\t232823\n红叶石楠球\t232824\n舅\t232825\nctrl+a\t232826\n募投\t232827\n信望\t232828\nBusybox\t232829\nFundGG\t232830\n一个故事\t232831\n固定源\t232832\n春日野结衣\t232833\n双孔\t232834\n弗兰西斯\t232835\n十六年后\t232836\nEASYUI\t232837\n十集\t232838\n李淼\t232839\n平安区人民政府\t232840\n九个九日\t232841\n蜘蛛侠英雄归来\t232842\n咸安政务网\t232843\n紫霞\t232844\n冻裂\t232845\n湿盛\t232846\n天天向\t232847\n风干机\t232848\n珠海电视台\t232849\n洪湾\t232850\n武昌政府网\t232851\n赵红梅\t232852\nRxJava\t232853\nAAE\t232854\nHeight\t232855\n五脏六腑\t232856\n斯通纳\t232857\n新华联广场\t232858\n标清\t232859\n思南路\t232860\n夏诗涵\t232861\n荫\t232862\n数学史\t232863\nBitc
oin86\t232864\n宋文骢\t232865\n刻蚀机\t232866\n色温\t232867\ninches\t232868\n怎样子\t232869\n聊天群\t232870\n瑞松\t232871\n6万亿\t232872\n海原县\t232873\n个人类\t232874\n成熟\t232875\nArchives\t232876\n跳动\t232877\n知遇\t232878\nedin\t232879\n〝\t232880\naction类\t232881\n不定\t232882\n590\t232883\n仓门\t232884\n几维\t232885\n广东高速\t232886\nInterview\t232887\n第二春\t232888\n搞笑版\t232889\n九宝乐队\t232890\nmysql编码\t232891\nuint32_t\t232892\n郑州商品交易所\t232893\n生命之树\t232894\n一个45亿\t232895\n张明敏\t232896\n阿尔卑斯\t232897\n不得不爱\t232898\n2.1A\t232899\n十月天使\t232900\n心育\t232901\necos\t232902\n10余\t232903\n统战部长\t232904\n闲林镇\t232905\n2017年11月份\t232906\nCOMMAND\t232907\ncid\t232908\n江苏省住建厅\t232909\n毕业赠言\t232910\n天马行空\t232911\n佛山中医院\t232912\n兽世\t232913\n受让人\t232914\n米易县\t232915\nTsui\t232916\nWharton\t232917\n我的谎\t232918\n香港电视\t232919\n姚雪垠\t232920\ndiagnosis\t232921\nhd4400\t232922\n4.2V\t232923\nEcstore\t232924\n实体类\t232925\nPOSTEK\t232926\nSacramento\t232927\n体上\t232928\n皆是\t232929\n沃音乐\t232930\nmidascivil\t232931\n若爱深埋于岁月\t232932\n0-1岁\t232933\n002596\t232934\n科贝尔\t232935\n红烧鱼\t232936\n广东快乐十分\t232937\n巴斯滕\t232938\n城市空气质量指数\t232939\n三阴交穴\t232940\n沈南假如爱有天意\t232941\n随上\t232942\n粪肠球菌\t232943\n二目\t232944\nHoang\t232945\n金铭\t232946\n等额选举\t232947\n窃明\t232948\n4月20号\t232949\n三清山\t232950\nting\t232951\n1-4月\t232952\n徐炜\t232953\n不会\t232954\n立伟\t232955\n台州19楼\t232956\n启蒙主义\t232957\ndigi\t232958\n江苏路街道\t232959\n山羊奶\t232960\n复选\t232961\n梁成\t232962\n利维\t232963\n玻化微珠保温砂浆\t232964\n石女\t232965\n金运路\t232966\n中辉环球\t232967\n魔兽盒子\t232968\nWeeknd\t232969\nXJL\t232970\n梁湘\t232971\nU型枕\t232972\nsolidworks2007\t232973\n_月斜影\t232974\naustralia\t232975\n螯合物\t232976\n京东派\t232977\n可读\t232978\n高低板\t232979\n今客\t232980\n20180116\t232981\n单元式\t232982\n四轮电动车\t232983\nSsr\t232984\n3352\t232985\nsequencing\t232986\nuta\t232987\n2.5折\t232988\n定时器2\t232989\n阅读题\t232990\n睡魔\t232991\n9.5%\t232992\n猎豹黑金刚\t232993\n连云港市人民政府\t232994\nDrake\t232995\n转产\t232996\n脱皮\t232997\n外贸市场\t232998\n欧阳超\t232999\n请求\t233000\n若干块\t233001\nsequelblight\t233002\n黑男\t233003\n辅助类\t233004\n
使用级\t233005\n夜深人\t233006\n扁桃体癌\t233007\n墨鱼仔\t233008\n绵阳机场\t233009\n原子论\t233010\n新武林外传\t233011\n跨表\t233012\n吴念\t233013\n不许\t233014\n财富证券\t233015\n无门\t233016\n雨润\t233017\n七古\t233018\n一个四\t233019\nDelta\t233020\nMega\t233021\nGzu521.com\t233022\n六尾\t233023\n流压\t233024\nHierarchical\t233025\nJEDEC\t233026\n亲商\t233027\n长春光华学院\t233028\n连声\t233029\n许可证\t233030\nfe80\t233031\n8段\t233032\n激战2\t233033\nalxe_yu\t233034\n黄伟文\t233035\n西洋\t233036\n桑德伯格\t233037\n8300H\t233038\n祯\t233039\n三、四\t233040\n分卷压缩文件\t233041\n谷歌地球中文版\t233042\n景行行止\t233043\nvmplayer\t233044\nNa2CO3\t233045\nAA2\t233046\n东英\t233047\n狗牙\t233048\n502胶\t233049\n2016十位\t233050\n保住\t233051\n侦察队\t233052\n多所\t233053\n鞋厂\t233054\n莱阳\t233055\n屠魔令\t233056\n绣锦\t233057\nSYBR\t233058\n嗨体\t233059\n忘川河\t233060\n01集\t233061\n1.9亿元\t233062\n东南大学艺术学院\t233063\n背景乐\t233064\nhcp\t233065\n哦类\t233066\n片字\t233067\nNuance\t233068\n交通部\t233069\nxingtai\t233070\n俏号网\t233071\n人道主义\t233072\n导游词\t233073\n凉都\t233074\n爱欢网\t233075\nbaa\t233076\n瓦尔迪\t233077\n四只手\t233078\n顶臀\t233079\n易语言\t233080\n黄芪颗粒\t233081\n2.5.8\t233082\n羊湖\t233083\n秋霞\t233084\n凯立\t233085\n37名\t233086\n刘向东\t233087\nadjustment\t233088\n晕菜\t233089\n扁桃体肿大\t233090\n兵粮\t233091\n3-23天\t233092\n卡方\t233093\n质控\t233094\n大溪地\t233095\n蓝达摩\t233096\n鱼镖\t233097\nnouns\t233098\n白大褂\t233099\n百尺竿头\t233100\nphysiology\t233101\n上位法\t233102\n国术\t233103\n与其\t233104\n定量包装秤\t233105\n1841年\t233106\n南京市市\t233107\ndocker\t233108\n顺达期货\t233109\nflexsim\t233110\n数源\t233111\nBiochemical\t233112\n延平区\t233113\nESF\t233114\nDuties\t233115\nlzo\t233116\n1095\t233117\n英雄合击发布网\t233118\n电辅热\t233119\n马报\t233120\n市委教育工委\t233121\n定江山\t233122\n扣上\t233123\n杭州市文一街小学\t233124\n宝马X2\t233125\n朗峰\t233126\n指针式\t233127\n理事长\t233128\n长隆熊猫酒店\t233129\nzxhn\t233130\n阿霉素\t233131\n裂解炉\t233132\n探伤机\t233133\nHashiCorp\t233134\n主收入\t233135\n干香菇\t233136\n五年级下册英语\t233137\nblush\t233138\nspecifications\t233139\n琅琅\t233140\nPlupload\t233141\n海康4200\t233142\nForeigners\t233143\n拉粑粑\t233144\n凤血\t233145\n玄凤鹦鹉\t233146\n道仙\t233147\n陈二狗\t2
33148\n贵校\t233149\n评为\t233150\nprofile\t233151\nmism\t233152\n1月16日\t233153\n日夜兼程\t233154\n权利的游戏\t233155\nnanomsg\t233156\n北京蓝调庄园\t233157\n古罗马\t233158\n身残志坚\t233159\n远动\t233160\n港台地区\t233161\nsituation\t233162\n古渡\t233163\n阎罗王\t233164\n杨文龙\t233165\n诸国\t233166\n裸纤\t233167\n傅佩荣\t233168\n20160331\t233169\n三思而行\t233170\n3.7.1\t233171\n我的朋友圈\t233172\n程力\t233173\n3230\t233174\n黄志\t233175\n评课\t233176\ncc2015\t233177\n华中师范大学》\t233178\n市房\t233179\n自动挡\t233180\n易联众\t233181\notool\t233182\n玉兔精\t233183\n电热水瓶\t233184\n限权\t233185\n石家庄一中\t233186\n围挡\t233187\n重庆三峡\t233188\nElsa\t233189\n同步带轮\t233190\n自律性\t233191\n发木\t233192\nOPPOr15\t233193\nDigimon\t233194\n浙江自然博物馆\t233195\n蛋糕坊\t233196\nCanntBelieve\t233197\n永安山\t233198\n旁证\t233199\nGTX970\t233200\n暗黑之魂2\t233201\n基金类\t233202\n浙\t233203\n旅游攻略网\t233204\n刚力彩芽\t233205\n起存\t233206\nLinkedList\t233207\n17133\t233208\n贵子\t233209\n金华一中\t233210\nq5\t233211\n果真\t233212\n4G\t233213\n华容区\t233214\n小公主\t233215\n如寓\t233216\n浙江大学经济学院\t233217\n成都四中\t233218\n云东海\t233219\n丰臣\t233220\n嘉泰\t233221\n泵感\t233222\n胰液\t233223\n冷暖\t233224\n4625\t233225\n篮彩\t233226\nATFB\t233227\nfsb\t233228\n迎宾街\t233229\n大月亮\t233230\nTPD\t233231\nSQL数据库表\t233232\ntubi\t233233\n新驻京办\t233234\n涿州东站\t233235\n外脑\t233236\n沪江日语-沪江旗下日语学习资讯网站_日语等级考试_日语入门到精通\t233237\n苍梧县\t233238\n修改\t233239\n叶予舜\t233240\na320\t233241\n小松鼠\t233242\n降费\t233243\nZhou\t233244\n彻底改变\t233245\n立信\t233246\n时延\t233247\n条命\t233248\n班干部\t233249\nswagger-ui\t233250\n电动风阀\t233251\n汪曾祺\t233252\n机械设计基础\t233253\n毛燕\t233254\n周朝\t233255\n改性研究\t233256\n英国牛津大学\t233257\n一脚\t233258\nthread类\t233259\n心动\t233260\n北京呀路古热带植物园\t233261\nò\t233262\n海南省博物馆\t233263\n暂住地\t233264\ngate奇幻自卫队\t233265\n晕奶\t233266\n精炼油\t233267\n区交通局\t233268\n问天\t233269\n智爷\t233270\n一砖一瓦\t233271\n氘代试剂\t233272\n小肖\t233273\n163.net\t233274\n血崩\t233275\n般若波罗密多心经\t233276\n爱慕虚荣\t233277\n最大者\t233278\n新江湾\t233279\nHDU\t233280\n沙坪坝站\t233281\n调工\t233282\n发话\t233283\ngrants\t233284\n苏博特\t233285\nracket\t233286\n苏州建设交通高等职业技术学校\t233287\n商吧-1024商务网\t233288\n新大话西游3\t23
3289\n德律风根\t233290\n足球杯\t233291\nHours\t233292\n飓风\t233293\nExperienced\t233294\n市部\t233295\n官员们\t233296\n提梁机\t233297\n没房没\t233298\n吴佳怡\t233299\n东台人论坛\t233300\n独墅湖\t233301\n石锤\t233302\n机器人总动员\t233303\nhowo\t233304\n地図\t233305\n陈发树\t233306\nLyric\t233307\ngrains\t233308\n正道沧桑\t233309\n宜兴紫砂壶\t233310\ngoodenough\t233311\n祥林\t233312\n斩魂\t233313\n护患\t233314\n金铜\t233315\n磷酸化\t233316\ndta\t233317\n外伤性\t233318\n枫舞\t233319\n喜怒无常\t233320\n南大附中\t233321\n海尔斯\t233322\n悟空源码网\t233323\n梵曲配音网\t233324\n金多宝\t233325\n胎座\t233326\n68181987\t233327\n巴雷特食管\t233328\nCancel\t233329\n五辛\t233330\n12306网站\t233331\n国家自科基金\t233332\n常微分方程\t233333\n漫客\t233334\n水蒸蛋\t233335\n什么位\t233336\n吉宝\t233337\n仁山\t233338\nhelp\t233339\nHPI\t233340\n毒花\t233341\n宝安实验学校\t233342\n精品文库网\t233343\n绿城玉园\t233344\n辛格\t233345\n宝玑\t233346\n最差劲\t233347\npopkontv\t233348\n味\t233349\n五里村\t233350\n凉鞋\t233351\n三相负载\t233352\nrunphp\t233353\n大梦想家\t233354\n02.06\t233355\n场次\t233356\n电动闭门器\t233357\nSkyfall\t233358\n巴格达\t233359\n凌达\t233360\n非对称\t233361\n回赠\t233362\n骷髅峡谷\t233363\n接点\t233364\n调兵山市\t233365\n沃锐\t233366\n升迁之路\t233367\n伏季\t233368\n吕品\t233369\npath变量\t233370\n一捺\t233371\n每隔5秒\t233372\njab\t233373\n白树\t233374\n桃子鱼仔\t233375\n磁法\t233376\n猕猴\t233377\nDeveloped\t233378\n奉化市政府\t233379\n234.com\t233380\n洗脱\t233381\n弦长\t233382\nford\t233383\n息怒\t233384\nCocosCreator零基础制作游戏\t233385\n可不可能\t233386\nPowerPoint模板\t233387\n中海国际社区\t233388\n雪人谷\t233389\n杭州宾馆\t233390\n美色\t233391\n东北风\t233392\n正货\t233393\n3方法篇\t233394\n祁连\t233395\nKendo\t233396\n浪妇\t233397\n220W\t233398\n怀丙\t233399\n岁月无痕\t233400\n江苏省纪委\t233401\n色位\t233402\n1705599\t233403\nSurprising\t233404\n339\t233405\n多级放大电路\t233406\nTFC\t233407\n四川新闻网\t233408\n30L\t233409\n山东劳动职业技术学院\t233410\n韩正\t233411\n加名\t233412\n中国通信建设集团有限公司\t233413\n琼斯\t233414\n饶宗颐\t233415\n厦门大学图书馆\t233416\nmotogp\t233417\nenet\t233418\n氯化铁\t233419\n空单元格\t233420\nMigrating\t233421\n上海市学生事务中心\t233422\n鬼干部\t233423\n借位\t233424\n罢黜百家\t233425\n硅酮胶\t233426\n古丈县\t233427\n魔剑士\t233428\n路远\t233429\n21保健品网\t233430\n首都医科大学\t2
33431\n立健\t233432\n愁\t233433\n暴走鞋\t233434\n字间\t233435\nRodrigo\t233436\n十八洞村\t233437\n象山县人民政府\t233438\n陇西县\t233439\nCJZhaoSimons\t233440\ndrew\t233441\nPID控制器\t233442\n亲爱的孩子\t233443\n3121\t233444\n深圳建行\t233445\nreflected\t233446\n河长\t233447\n拉卜楞寺\t233448\n百分之70\t233449\n打卤\t233450\n林源\t233451\n古墓丽影起源之战\t233452\n票头\t233453\n进而\t233454\n盖娅\t233455\n领券\t233456\nhanxin\t233457\n4.5元\t233458\n电视游\t233459\n陆敏\t233460\n叶王\t233461\n四章\t233462\n意\t233463\n秦镇\t233464\n异灵术\t233465\nmci\t233466\n消费方\t233467\n国际私法\t233468\n新都街道\t233469\n稳上\t233470\n黎\t233471\n夜不能寐\t233472\nopen-type\t233473\nesc键\t233474\n心果\t233475\n狂殴\t233476\n街机三国\t233477\n大连造船厂\t233478\n科罗娜\t233479\n配屏\t233480\n东方红教育网\t233481\n集体婚礼\t233482\n160年\t233483\n电商类\t233484\n孙翠凤\t233485\n吃水\t233486\n定坤丹\t233487\n工程塑料\t233488\n飞猪旅行\t233489\n林徽马苏\t233490\n好人主义\t233491\n洛溪\t233492\nFGT\t233493\n朊病毒\t233494\n进销存软件免费版\t233495\nunblockcn\t233496\n吴江\t233497\n沙巴克\t233498\nnvme\t233499\n浙江大学生命科学学院\t233500\n苹果6S\t233501\nx100f\t233502\n民间资本\t233503\nCSS3选择器\t233504\n30点\t233505\n60周年\t233506\n全国人大常委会法工委\t233507\n上海杜莎夫人蜡像馆\t233508\n国机智骏\t233509\nSpark\t233510\n高燃\t233511\n燕儿\t233512\n觅影\t233513\n汽车贴\t233514\nTTL\t233515\n双分\t233516\n咯血\t233517\n手伤\t233518\n1-5年\t233519\n第三十\t233520\n创新板\t233521\n高斯奥特曼\t233522\n扣扣\t233523\n抹黑\t233524\n容容\t233525\nShroud\t233526\n莉丝\t233527\n小千\t233528\n漯河西\t233529\n代理\t233530\n离散性\t233531\n2102\t233532\n李林峰\t233533\n坦克世界盒子\t233534\nVant\t233535\n奇酷\t233536\n西安科技\t233537\nreact-dom\t233538\n工作服\t233539\nReceiver\t233540\nPicture\t233541\n创举\t233542\n骑士幻想夜\t233543\nWord格式\t233544\n选萃\t233545\n网框\t233546\n64集\t233547\n福卡\t233548\nE450\t233549\n降央卓玛\t233550\n盐亭县\t233551\n牵动\t233552\nWalsh\t233553\n输出功率\t233554\n刘丽丽\t233555\n环保网\t233556\n沥滘\t233557\n慈禧\t233558\nsnc\t233559\nx550\t233560\n曹州牡丹园\t233561\n泰山风景名胜区\t233562\n小龙\t233563\n猪羊\t233564\n异戊醇\t233565\n单元格批量\t233566\n线图\t233567\n沉淀物\t233568\npatched\t233569\n省略句\t233570\n梅雨潭\t233571\nexaid\t233572\nOpen-source\t233573\n春兰空调\t233574\n金阳客车站\t23357
5\nRandomized\t233576\n上海国际饭店\t233577\n刘某某\t233578\n心烦\t233579\n安福县人民政府\t233580\nThat\t233581\n张家界凤凰古城\t233582\n脑电图\t233583\n麂皮\t233584\n途睿欧_途睿欧价格_途睿欧淘宝天猫货源\t233585\n实情\t233586\n换名\t233587\n略阳县\t233588\n一苇\t233589\n己巳年\t233590\n昌东\t233591\n合肥光华学校\t233592\n生活区\t233593\n补养\t233594\n18183H5\t233595\nbrian\t233596\n联邦快递\t233597\n10目\t233598\n张为民\t233599\n昂克塞拉\t233600\n控价\t233601\n第二座\t233602\n网页视频下载器\t233603\n温度仪\t233604\n品途\t233605\n儿媳妇\t233606\n安多县\t233607\ndiagnostics\t233608\n黄朝阳\t233609\n2029年\t233610\nphpmailer\t233611\nbillboa\t233612\n聚焦镜\t233613\n电信卡\t233614\n外屏\t233615\n绝地求生耳机\t233616\nmellowsmile\t233617\n普印\t233618\n风风火火\t233619\n红枣干\t233620\n猪猪侠之超星萌宠\t233621\n炫卡\t233622\ndvm\t233623\n4月10日\t233624\n浙江大华\t233625\nX战警天启\t233626\n甘油\t233627\n弈秋\t233628\ns8+\t233629\n保密局\t233630\nwindows\\system32\t233631\n央视财经评论\t233632\n渝西\t233633\nLighting\t233634\n晃悠\t233635\nbt学院\t233636\n大咖们\t233637\n偷光\t233638\n考入\t233639\n身躯\t233640\nW2\t233641\n乾丰\t233642\n水晶沙漠\t233643\n长安汽车长安欧尚A800图解大\t233644\nE物流单号查询\t233645\n汉密尔顿\t233646\n清华大学学生职业发展指导中心\t233647\n致青春2\t233648\n机制木炭机\t233649\n外语学院\t233650\n纱布\t233651\ngorogoa\t233652\ncombotree\t233653\njbutton\t233654\n第九篇\t233655\n北京北路\t233656\n搜外SEO\t233657\n酒吧\t233658\nWorkload\t233659\n小云菜\t233660\n凉气\t233661\n广州知识城\t233662\nswitcher\t233663\n少林拳\t233664\n林书孙\t233665\n天龙八部发布网\t233666\n杨家坪中学\t233667\n杜仲茶\t233668\n超级马拉松\t233669\n钢结构设计规范\t233670\n大航空公司\t233671\n泛海集团\t233672\n雅雅\t233673\n腹黑女\t233674\n石川岛\t233675\n响螺湾\t233676\nSave\t233677\n奉命\t233678\n002797\t233679\n西北农林科技\t233680\n树派\t233681\nSphere\t233682\n1131\t233683\n南庭\t233684\nBle\t233685\n李继存\t233686\n电信华为\t233687\n天联\t233688\n雪佛兰赛欧\t233689\n喷砂除锈\t233690\n应选\t233691\nbuneng\t233692\n惠农网\t233693\n罗建华\t233694\neia\t233695\n别骗\t233696\n物道\t233697\n合版\t233698\n无机酸\t233699\n太平洋游戏论坛\t233700\n聂佳\t233701\nC语言中文网\t233702\n财政部会计司\t233703\n别闹\t233704\n云运维\t233705\n幕府将军2全面战争吧_\t233706\n底栖\t233707\ndede数据库\t233708\nAutoCAD2010\t233709\n点值\t233710\n北岛玲\t233711\n二手车\t233712\n合同\t233713\n杭
州崇文实验学校\t233714\n北京蟹岛\t233715\n三角锥\t233716\n刘志军\t233717\nB卷\t233718\n为快\t233719\n快3\t233720\n军内\t233721\n铃木隼\t233722\n溶解性表\t233723\n转变\t233724\n双环科技\t233725\nWearing\t233726\n薛云奎\t233727\nBayern\t233728\ngpa\t233729\n商业城\t233730\n花上\t233731\n承重\t233732\n5007\t233733\nUSGS\t233734\n马歇尔\t233735\nFertilizer\t233736\n双电源转换开关\t233737\n网赚\t233738\n大韩民国\t233739\n刚性联轴器\t233740\n之江新语\t233741\n坐像\t233742\nsaic\t233743\n测绘费\t233744\n川进青\t233745\n国家科技评估中心\t233746\n触手\t233747\nPlastic\t233748\n浆纱\t233749\nlyingman\t233750\n网络小说阅读网\t233751\n水浒Q传\t233752\n0.008\t233753\n轻便式\t233754\n黄瓜汁\t233755\n志俊\t233756\n主食\t233757\n第15版\t233758\n阴瑜伽\t233759\n白古\t233760\nVOL\t233761\n白恩\t233762\n烟店\t233763\n子书\t233764\n简_\t233765\n开原\t233766\n铁雄\t233767\n凌氏\t233768\n游戏宝\t233769\ngeography\t233770\n鸟枪\t233771\n丁志诚\t233772\n督察组\t233773\n4月22日\t233774\n北京航空大学\t233775\n钮祜禄氏\t233776\n三分地论坛\t233777\n第十四集\t233778\n吴必虎\t233779\n椎\t233780\n省司法厅\t233781\n科瑞特\t233782\n好莱坞八大影业公司\t233783\n这样的歌\t233784\n前海深港现代服务业合作区\t233785\nblown\t233786\n变换\t233787\n深圳大运中心\t233788\n攻坚\t233789\n劳士顿\t233790\nSEIKUU\t233791\n脚踝骨\t233792\ndispaly\t233793\ntopbrids\t233794\n喵星人大战吧\t233795\n九张机\t233796\n生命之杯\t233797\n礼品展\t233798\ncanyin\t233799\n人教版五年级英语\t233800\n大侦探福尔摩斯2:诡影游戏\t233801\n搜达足球网\t233802\n鹧鸪\t233803\n浏阳\t233804\n脖套\t233805\n宋洁\t233806\ndingding\t233807\n察尔汗盐湖\t233808\n成都市安全生产监督管理局\t233809\n红碱淖\t233810\nconntrack\t233811\n大班组\t233812\n平安汽车\t233813\n中国编剧网\t233814\n背心\t233815\n疲态\t233816\n庭院灯\t233817\n刁石京\t233818\n名老中医\t233819\n百山\t233820\n2支\t233821\n兼定\t233822\n麻梨疙瘩\t233823\nthinkstation\t233824\n北韩\t233825\n铲车工\t233826\nAiseesoft\t233827\n第27条\t233828\n77%\t233829\nalios\t233830\n草花头\t233831\n哈尔滨北\t233832\n陈科\t233833\nRyu\t233834\n无忧商务网\t233835\n白发魔女传\t233836\n首都航空\t233837\nhiveserver2\t233838\n汉邦一点通\t233839\n福建)自由贸易试验区\t233840\n北海道大学\t233841\n木城\t233842\n膨胀管\t233843\n六员\t233844\n歇山顶\t233845\n笼屉\t233846\n天乐\t233847\n销分\t233848\n七季\t233849\n末日鼠疫2\t233850\n误事\t233851\n快评\t233852\n药食同源\t233853\n600209\t233854\n画传\t23385
5\npaperright\t233856\npho\t233857\npurchased\t233858\nχ\t233859\n浮动利率\t233860\ngitkraken\t233861\n爱丽绝世武神\t233862\n平井一夫\t233863\n乐说网\t233864\n不减\t233865\n尸者\t233866\nNBA全明星赛_\t233867\n孤岛惊魂吧_\t233868\n嘉兴市区\t233869\n长发\t233870\n新田惠海\t233871\n黄世仁\t233872\n阴谋\t233873\n坏消息\t233874\n舷梯\t233875\ninterpolate\t233876\n净原\t233877\n脚\t233878\n两生\t233879\n16年09月\t233880\ndevon\t233881\n露着\t233882\n微盛\t233883\n伊人\t233884\n鸡爪\t233885\n乔家\t233886\n美国医学院\t233887\nManufacturers\t233888\nNORM\t233889\n20170203\t233890\n原值\t233891\n五德\t233892\nqq群\t233893\nblanco\t233894\n揉捻\t233895\n8码\t233896\n电鼓\t233897\n_页\t233898\n魔兽世界\t233899\nHIbernate\t233900\n对置\t233901\n测风塔\t233902\n百分之一百\t233903\n硬是\t233904\nlater\t233905\n分手之后\t233906\n胆总管结石\t233907\nbthand\t233908\nhtmlentities\t233909\nMap\t233910\n2.8GHz\t233911\n星爆\t233912\n卤味\t233913\n张承\t233914\n四人位\t233915\ndishi\t233916\n电热棒\t233917\n56F8300BOOTLOADER.zip_56F834x_flash.elf.S\t233918\n张某甲\t233919\n上海市第六人民医院东院\t233920\nTypeError\t233921\n永城网\t233922\nchengyu\t233923\n水廉政\t233924\ngg修改器\t233925\n叫响\t233926\ncnr\t233927\n骑行眼镜\t233928\nV刹\t233929\n新长城\t233930\n镀锌方矩管\t233931\n生人\t233932\n藏友\t233933\n倚天屠龙\t233934\n大爷\t233935\npoietic\t233936\n未选择的路\t233937\n湯\t233938\n勇气默示录\t233939\n星梦奇缘\t233940\nDropping\t233941\nHonors\t233942\n魁\t233943\n为了皇帝\t233944\n消防类\t233945\n欧龙马滴剂\t233946\n灵官\t233947\n小李哈弗h6\t233948\n美农\t233949\nENVIRONMENT\t233950\n九江站\t233951\n默认浏览器\t233952\n议课\t233953\n酱猪蹄\t233954\n动辄\t233955\nWISH\t233956\n天津力神\t233957\n神志\t233958\n小檗\t233959\n忘年\t233960\nBLOW\t233961\nOpenLDAP\t233962\nQWQ\t233963\n砸开\t233964\n邮政银行\t233965\n秀芽\t233966\n广州中望龙腾软件股份\t233967\n允礼\t233968\n近卫\t233969\n吉丰\t233970\n罗飞\t233971\n陈国祥\t233972\n包块\t233973\n元亨利贞\t233974\n5册\t233975\n梅花洲\t233976\n日本筑波大学\t233977\n新疆财政厅\t233978\n20160416\t233979\n谷\t233980\n水型\t233981\nzuofa\t233982\n子基金\t233983\nxlink\t233984\n散歩\t233985\n杨沛\t233986\nWSGI\t233987\nWebstorm\t233988\n600170\t233989\n乾城\t233990\na10处理器\t233991\n迥然不同\t233992\n复交\t233993\n表彰会\t233994\n赣深高铁\t2339
95\n热线12\t233996\n夏青\t233997\nkno\t233998\n居者\t233999\n村架\t234000\n全幅机\t234001\nLF\t234002\n24片\t234003\n半夜两点\t234004\n1117\t234005\n四川省纪委\t234006\n是道\t234007\n工模\t234008\n柯哀\t234009\nColin\t234010\n灵璧县\t234011\nzoofilia\t234012\n交换性\t234013\nlgg\t234014\n每篇\t234015\n陆展元\t234016\n飞猪\t234017\n又名\t234018\n55式\t234019\n唯心主义\t234020\n上海电机学院\t234021\n渔业\t234022\n油品\t234023\n河源市区\t234024\ndistributions\t234025\n聚醚醚酮\t234026\n漱口水\t234027\n祁阳政府网\t234028\n铭文页\t234029\n皆非\t234030\n双机\t234031\n祥生地产\t234032\n木包\t234033\n北关\t234034\n500\t234035\nwps2018\t234036\n10mm2\t234037\ncohort\t234038\n黄岗镇\t234039\n法爷\t234040\n某行\t234041\n真空计\t234042\n丹麦女孩\t234043\nVander\t234044\n灰度图\t234045\n陈建强\t234046\nweb编程\t234047\n康达尔\t234048\n躬行\t234049\n大锅灶\t234050\n白狼王\t234051\n口段\t234052\n白夜凛音\t234053\n3.47\t234054\nTriple\t234055\n先锋书店\t234056\n花粉丝网\t234057\n招标公告网\t234058\n汉滨\t234059\n毕加索\t234060\n发式\t234061\n260x\t234062\nCGJOY\t234063\nソニ\t234064\nQJ\t234065\ndoughnut\t234066\n石家庄动物园\t234067\n山东大学计算机科学与技术学院\t234068\nstay\t234069\n岩板\t234070\n麦迪格\t234071\nning\t234072\n朱元冰\t234073\n鹅肉\t234074\n刀画\t234075\n氩弧\t234076\n高庚杓\t234077\nchengdu\t234078\n莲花池\t234079\n0.99\t234080\n预备党员\t234081\n糖蛋白\t234082\nLandsat8\t234083\n美妆达人\t234084\n北台\t234085\n硬件电路\t234086\nsbus\t234087\n戚\t234088\n翡翠岛\t234089\n老鹳草\t234090\n88位\t234091\n江苏恒瑞医药\t234092\n龙安寺\t234093\nwdcp\t234094\n清竹\t234095\n何润东\t234096\n海淘\t234097\n天汇\t234098\n神经纤维瘤病\t234099\n始终如一\t234100\n9092\t234101\n区点\t234102\n醉歌\t234103\n翩翩少年\t234104\n运动员\t234105\n二宫和也\t234106\n第6天\t234107\n荣耀之路\t234108\ncumulus\t234109\n综治委\t234110\nGaia\t234111\n瑞达路\t234112\nHHKB\t234113\n支离破碎\t234114\n小米电视2\t234115\n沙湾区\t234116\n第六批\t234117\nIW\t234118\n21mm\t234119\n输入框\t234120\n先见之明\t234121\n戴若希\t234122\n王将\t234123\nnuclei\t234124\nTint\t234125\nYan\t234126\ntag\t234127\n大河网\t234128\n无色\t234129\n衡水百姓网\t234130\n白俄罗斯\t234131\n飞嘴\t234132\nTimberlake\t234133\n十一年\t234134\n水值\t234135\ndaum\t234136\nhass\t234137\n德杰\t234138\n500s\t234139\n江昌义\t234140\nmock\t234141\n终结者2直播
_终结者2攻略直播_终结者2\t234142\n二桥\t234143\n倚天剑与屠龙刀\t234144\n低落\t234145\n5集\t234146\n智谋\t234147\nstimulation\t234148\nThy\t234149\n祁阳县\t234150\nbinance\t234151\n功率管\t234152\nbujian\t234153\nAccess2013\t234154\n福建省机关事务管理局\t234155\ncatenin\t234156\n双尾\t234157\n第二小学\t234158\n老哥\t234159\n民管\t234160\n闪魔\t234161\n02323\t234162\n洗面盆\t234163\n人教版|小学\t234164\n表器\t234165\n﹣\t234166\n钒铁\t234167\n冼澡\t234168\n逆差\t234169\nalley\t234170\n马羚\t234171\n20180121\t234172\n蒋波\t234173\n武汉公安局\t234174\n冰花\t234175\n苏果卡\t234176\nweb_custom\t234177\n小洲村\t234178\n述廉\t234179\n兆\t234180\n德阳市人民政府\t234181\n售完\t234182\n助燃\t234183\n回体\t234184\npH\t234185\n赵各庄\t234186\n电线\t234187\n松江大学城\t234188\n高管\t234189\n韩乔生\t234190\ngenerating\t234191\nLaunching\t234192\n黑式破解\t234193\n苏勇\t234194\n16.04.2\t234195\n昊天锤\t234196\ngps定位\t234197\n深圳福田客运站_深圳福田客运站\t234198\nMonterey\t234199\nwaterfall\t234200\n19ise\t234201\n函调\t234202\n西布曲明\t234203\n并线辅助系统\t234204\n石笼网\t234205\n角马\t234206\n现时\t234207\n数值型变量\t234208\n王者荣耀奕星\t234209\n浅水\t234210\n锦山\t234211\nWINHEX\t234212\n导报\t234213\n叶海林\t234214\n1d\t234215\nWWW.色.COM\t234216\n字头\t234217\n玉禾田\t234218\nVide\t234219\n一丝不苟\t234220\n正行\t234221\n请求体\t234222\n浦东幼儿园\t234223\n福华路\t234224\n例假期\t234225\n孙正义\t234226\nparka\t234227\n细嫩\t234228\n一攻\t234229\n吸铁\t234230\nhercules\t234231\n三星i9500\t234232\n0478\t234233\n美橙互联\t234234\n派伟俊\t234235\n阳光教育集团\t234236\nProductivity\t234237\n圈徽\t234238\n闷热\t234239\n新化\t234240\n高达Versus\t234241\n七友\t234242\n线下\t234243\n卧式离心泵\t234244\n梅钢\t234245\n湖区\t234246\nDefect\t234247\n直播吧\t234248\nsumblime\t234249\nMost\t234250\n半剖\t234251\n500mL\t234252\n根管治疗\t234253\nstatics\t234254\n石家庄市政府\t234255\n华润怡宝\t234256\n吕玉兰\t234257\n子宫内膜增厚\t234258\nHD1080P+720P\t234259\n新教\t234260\n幸运者\t234261\n相去甚远\t234262\n职介\t234263\n2017年1月4日\t234264\n花牌情缘\t234265\n神界3原罪加强版\t234266\nGhostXP\t234267\n暨南大学经济学院\t234268\n老弹\t234269\n家友\t234270\nAmo\t234271\n统建房\t234272\neSATA\t234273\n杨医生\t234274\nKeras\t234275\n怡莱酒店\t234276\n金箔\t234277\n老妖\t234278\n机动战队吧\t234279\n加州伯克利大学\t234280\n燕丹\t23
4281\nunit\t234282\n硅烷偶联剂\t234283\n海事局\t234284\ndaoshi\t234285\n瑟蕾娜\t234286\nAudioRecord\t234287\n三防漆\t234288\n宝魔\t234289\n股权转让合同\t234290\n甜角\t234291\n许少锋\t234292\n氨基树脂\t234293\n西岗区\t234294\nWinMerge\t234295\n石岩街道\t234296\nXP系统\t234297\n喊冤\t234298\n集体经济发展\t234299\nflink\t234300\n2017-2018年\t234301\nVal\t234302\n非关系型\t234303\nAy\t234304\n德式\t234305\n10.02\t234306\n智能型\t234307\n2062\t234308\n80道\t234309\n优科\t234310\n仲裁者\t234311\n2.&#160\t234312\n学射\t234313\n新蕾\t234314\n平城\t234315\nmg3620\t234316\n幻想曲\t234317\nurlparse\t234318\n2018双\t234319\nKangle\t234320\n水妖\t234321\n地方教育附加\t234322\n夹芯板\t234323\n金立M6\t234324\n阿萨姆奶茶\t234325\n姻\t234326\nencryption\t234327\n胫腓骨\t234328\n张晓棠\t234329\n余姓\t234330\nIIT\t234331\n利人\t234332\n奶泡\t234333\n陈染\t234334\n桐谷\t234335\n九重环奈\t234336\n谍梦\t234337\n杜宾\t234338\nbisect\t234339\nprerender\t234340\n医道\t234341\n龙魂侠影\t234342\n卢丽安\t234343\n番数\t234344\n中国乒乓球协会\t234345\n摩登先生网\t234346\nFordham\t234347\n小儿积食\t234348\n叔子\t234349\nThreeJs\t234350\n千分之四\t234351\n无车日\t234352\n雷电3接口\t234353\nrb\t234354\n宾得镜头\t234355\n523\t234356\n不能不能\t234357\n信长之野望13天道\t234358\n发愿\t234359\nChosen\t234360\n决裂\t234361\n驱动精灵-驱动之家\t234362\n三溪口\t234363\n两孔\t234364\n瓦罐汤\t234365\npala\t234366\nmodifier\t234367\nWaiting\t234368\n另一边\t234369\n栈桥\t234370\n腾讯游戏频道\t234371\n中时电子报\t234372\n民生信用卡一卡行天下\t234373\nvpu\t234374\nsofia\t234375\n幻想三国志3\t234376\n绘图仪\t234377\n钢套钢保温钢管\t234378\n中公卫生人才网\t234379\n上饶日报-上饶数字报\t234380\nip地址\t234381\n徐州矿大\t234382\n电动尾门\t234383\n毛刺机\t234384\n2014年度\t234385\n梭编\t234386\n2017年国庆\t234387\n博士生们\t234388\n36花卉网\t234389\n嘉嘉\t234390\n过河拆桥\t234391\n金奈\t234392\n1.1.0.6\t234393\n不醉不会\t234394\n棉纺织业\t234395\n必居\t234396\n神舞\t234397\n稳定\t234398\n宝井\t234399\nBPO\t234400\nwaiter\t234401\n回显\t234402\nsettlement\t234403\n齐国\t234404\n伊尔迷\t234405\n哀愁\t234406\n14CM\t234407\n广州中院\t234408\n生物链\t234409\n0903\t234410\nLEO\t234411\n转印带\t234412\n7.1分\t234413\n徐州旅游网\t234414\n谈录\t234415\ngtx660\t234416\n住宅设计规范\t234417\n井神股份\t234418\n脏物\t234419\n柏辽兹\t234420\nUNO\t234421\n新启\t234422\
n根式\t234423\n北京城建集团有限责任公司\t234424\nAB测试\t234425\n杀人者\t234426\n长沙住房公积金管理中心\t234427\nwadeson\t234428\n弄成\t234429\n对攻\t234430\n议长\t234431\n企业支付宝\t234432\n田忌\t234433\n115g\t234434\n变音\t234435\n兰州职业技术学院\t234436\n大中华\t234437\n活肌\t234438\ngw\t234439\n北大纵横\t234440\nS9+\t234441\n魏文彬\t234442\n陕西人大\t234443\n26.58万元\t234444\n曳舞\t234445\nyouself\t234446\n正露丸\t234447\n紫衫龙王\t234448\n神界3\t234449\n新材料有限公司\t234450\n高魔心\t234451\n精彩视频_幼儿园新闻_无忧幼儿园\t234452\nmdb数据库\t234453\n抓耳挠腮\t234454\nfategrand\t234455\n枯石\t234456\nBDMV\t234457\n谢祖武\t234458\n机会主义\t234459\n5X\t234460\n汇\t234461\n房卡牛牛\t234462\n壹角\t234463\nNAS/网络存储\t234464\n未来感\t234465\n黄包\t234466\n钢之炼金术师FA\t234467\n知友\t234468\n强弩\t234469\n安徽教育出版社\t234470\n国药一致\t234471\n5pb\t234472\nPoe\t234473\n探春\t234474\nTL-WDR6500\t234475\n难以捉摸\t234476\nSupplies\t234477\nfiredac\t234478\n催熟\t234479\nduyin\t234480\n整夜\t234481\n贯穿\t234482\nROAD\t234483\n上海国际半程马拉松赛\t234484\n启辰T70\t234485\n两间\t234486\n近两月\t234487\nhotkey\t234488\n中日韩自贸区\t234489\n李思汤姆汉克斯\t234490\n八节\t234491\n段小楼\t234492\nflush\t234493\n东经\t234494\n总设计\t234495\n布雷顿\t234496\nTHANN\t234497\nyule\t234498\n不符\t234499\n72G诛仙\t234500\n平安健康网\t234501\n整句\t234502\n张仁华\t234503\n第107集\t234504\n大埔县人民政府\t234505\n飘花资源网\t234506\nyixing\t234507\n伊兰特\t234508\n万用\t234509\n杀生\t234510\n操作键\t234511\n张攀\t234512\nread_csv\t234513\n第38页\t234514\n长江航道\t234515\n溜达\t234516\n环球旅行\t234517\n悬崖边\t234518\n473ml\t234519\n卡德\t234520\n皮艺\t234521\n石材雕刻机\t234522\n蓝调\t234523\nwin10系统分区\t234524\n陈阳\t234525\n社区服务站\t234526\n南方文交所\t234527\nINFORMIX\t234528\n后尾\t234529\n逸动\t234530\n52pk剑网3\t234531\ncctv14\t234532\n姬姓\t234533\ninterpreter\t234534\n孙正聿\t234535\n重庆三峡医药高等专科学校\t234536\n彼得潘\t234537\n云闪\t234538\n5道\t234539\n为三\t234540\n黄霞\t234541\n岩芯\t234542\nMSW\t234543\n贫富\t234544\n徐光宪\t234545\n刘桢\t234546\nY460\t234547\n广州国际体育演艺中心\t234548\n云狐\t234549\n易胜博\t234550\n你是谁\t234551\n065\t234552\n光谷创业街\t234553\n市科委\t234554\n000988\t234555\n扬子空调\t234556\n3芯\t234557\n水海\t234558\n打粉机\t234559\nlatest\t234560\ntibet\t234561\n受受\t234562\nlina\t234563
\n超级版\t234564\n6950x\t234565\n重生版\t234566\nPRC\t234567\n酒单\t234568\n起亚k5\t234569\n罗山\t234570\nAE模板素材\t234571\n两芯\t234572\n菜碟\t234573\n_v1.3安卓\t234574\n莎士比亚\t234575\n上旋\t234576\n牵引带\t234577\n亮丽\t234578\nHONG\t234579\n移动门\t234580\n长沙经开区\t234581\n3110\t234582\n991\t234583\n晚上23点\t234584\nbound\t234585\n5V2A\t234586\n一品官\t234587\n锋范\t234588\n云丘山\t234589\n牧羊少年奇幻之旅\t234590\n中央城\t234591\n头屑\t234592\n沿滩区\t234593\n战姬\t234594\n浒\t234595\n息斯敏\t234596\n质权\t234597\n海南省质量技术监督局\t234598\n第90章\t234599\n206路\t234600\n鏖战\t234601\n过稿\t234602\n出款\t234603\n谷昴\t234604\n郭汜\t234605\n开走\t234606\nvoid函数\t234607\n随事\t234608\n7373\t234609\n遗产房\t234610\n26天\t234611\n8090网页游戏\t234612\n血包\t234613\n天津市安全生产监督管理局\t234614\n深圳市沃特玛电池有限公司\t234615\n稀泥\t234616\n采荷一小\t234617\n泡腾片\t234618\n女服\t234619\n名利双收\t234620\n软组织感染\t234621\nFOXMAIL\t234622\n掰阴\t234623\ncalf\t234624\ndine\t234625\n耳熟能详\t234626\n开机\t234627\n苏眠\t234628\n磁导率\t234629\n牛肉汤\t234630\n五点钟\t234631\nhydroxy\t234632\n美好家\t234633\n木种\t234634\nmback\t234635\nlure\t234636\n郑少\t234637\n幼升小非京籍\t234638\n峭\t234639\n魔兽战役\t234640\naccess2013\t234641\n319\t234642\n助眠\t234643\n座头鲸\t234644\n珞狮南路\t234645\n道氏\t234646\nSUMIF\t234647\n世博大道\t234648\n领舞\t234649\n横沟\t234650\n回力鞋\t234651\n华勤通讯技术有限公司\t234652\nEliteDesk\t234653\n鸿博\t234654\n卡哇伊\t234655\n抵挡\t234656\n日上\t234657\n曼龙鱼\t234658\n贵阳护理职业学院\t234659\n美新\t234660\n广东电信\t234661\n战友\t234662\n王冠红人馆\t234663\n那拉提草原\t234664\n初始\t234665\n竖中指\t234666\n11.0.3\t234667\n95式自动步枪\t234668\n油水分离器\t234669\n平立\t234670\n佝偻病\t234671\n苯二酚\t234672\n观落\t234673\n般若文海\t234674\nrds\t234675\n崔小拽\t234676\n水月洞天\t234677\n轮组\t234678\n初教\t234679\n张宝图\t234680\n搵\t234681\n赵已然\t234682\n显像\t234683\n泥斗\t234684\n续租\t234685\n马自达2\t234686\norphan\t234687\n梓唯衣\t234688\n第7篇\t234689\n钧策\t234690\n路雪\t234691\n四五月\t234692\n仓井\t234693\nfeeling\t234694\n吉木乃县人民政府\t234695\nic2\t234696\n女人体\t234697\n梵境\t234698\n易读版\t234699\n丝尔\t234700\n9920\t234701\n嘉定紫藤公园\t234702\n韩龙\t234703\n男婴\t234704\n哪吒\t234705\nottawa\t234706\n幕明\t234707\n壊\t234708\n维持性\t234709\nKisses\t234
710\n祺\t234711\n九龙瀑布\t234712\n中老\t234713\nclassy\t234714\n李庆海\t234715\n技能机\t234716\n卡洛琳\t234717\n铜板街\t234718\ntestnet\t234719\n序幕\t234720\n光明大道\t234721\n墨香铜\t234722\n荒漠化\t234723\n曾国藩家书\t234724\n哥布林杀手\t234725\ndbn\t234726\n抽风\t234727\n王国忠\t234728\n日语考级\t234729\n你好不好\t234730\n亭序\t234731\n选择题\t234732\n卓美亚\t234733\n斯嘉贾跃亭\t234734\n本值\t234735\n25A\t234736\n马马\t234737\nSuning\t234738\n大洪湖\t234739\n转化\t234740\n压差\t234741\n李迅雷\t234742\n篠崎\t234743\nxbo\t234744\n纳川\t234745\n袁世海\t234746\nstroop\t234747\n定置\t234748\n高埗镇\t234749\n趣弹\t234750\n活性炭\t234751\n快乐学习\t234752\nRocco\t234753\n拉瑞\t234754\n网络硬盘\t234755\nprecompiled\t234756\nXing\t234757\n虐受\t234758\n川中岛\t234759\nwebservice接口\t234760\n复数域\t234761\nGNS3\t234762\nreorder\t234763\nA7II\t234764\n老夏\t234765\n高H\t234766\napidoc\t234767\n天望\t234768\n苍溪县人民政府\t234769\nmhp3\t234770\n中夏易经起名网\t234771\n奈斯博\t234772\n郑萍\t234773\ngonglu\t234774\n杭州市住房保障和房产管理局\t234775\n思想库\t234776\n第二关\t234777\nWE\t234778\n烟台市\t234779\n处关系\t234780\ncoaching\t234781\n73亿\t234782\n重坦\t234783\n遇险\t234784\n4.6.2\t234785\n1500_\t234786\n手锤\t234787\n电位法\t234788\n苏林\t234789\n开发篇\t234790\nUboot\t234791\nコスプレイヤ\t234792\n盈盈理财\t234793\n科技篇\t234794\n黑色车\t234795\nrelic\t234796\n江油中学\t234797\n西达本胺\t234798\n中国人民大学法学院\t234799\n稀有异刃\t234800\nporsche\t234801\n辛普森杀妻案\t234802\n170428\t234803\n周邦彦\t234804\n控股\t234805\n宜信商通贷\t234806\n巡视\t234807\n438\t234808\n人力资源招聘网\t234809\n美利坚合众国\t234810\n版部\t234811\ngrils\t234812\nLIC\t234813\n7.5.6\t234814\nLCT\t234815\n瓦轴\t234816\nJoanne\t234817\n线规\t234818\narcher\t234819\n传帮\t234820\n通径\t234821\naddview\t234822\n几把\t234823\n金名都\t234824\nfactorial\t234825\n盐酸氨溴索\t234826\n学签\t234827\n引发剂\t234828\n球手们\t234829\n萨沙\t234830\n首秀\t234831\n诉衷情\t234832\nopenday\t234833\n孟瑞\t234834\nAlternative\t234835\n艾森克\t234836\n妃英理\t234837\n蝎女\t234838\nラブ\t234839\ncost\t234840\n文本文件\t234841\n初代吸血鬼\t234842\nTrustedInstaller\t234843\n十九大报\t234844\n波涛汹涌\t234845\n有始无终\t234846\n野村萌香\t234847\n狄\t234848\n阿拉亚\t234849\nWSTMart\t234850\n拥挤\t234851\n细胞瘤\t234852\n空白\t234853\
n二级c语言\t234854\n康铜\t234855\nseetaface\t234856\n哈赫\t234857\n酒厂\t234858\n背背\t234859\nwindows程序设计\t234860\n对话框\t234861\n沉头\t234862\n杨立青\t234863\n城郊\t234864\n银盛\t234865\n三攻\t234866\n钟祥之窗\t234867\nalphanumeric\t234868\n42家\t234869\ncalculated\t234870\nal\t234871\n群落\t234872\n三方面\t234873\n狞猛\t234874\n骑马与砍杀泡菜\t234875\nSelenium\t234876\n指摘\t234877\nComp\t234878\n三无产品\t234879\n甄妮可可\t234880\n飞塔防火墙\t234881\nK-means\t234882\n头皮脂溢性皮炎\t234883\n西游记后传\t234884\nairmax\t234885\nJavier\t234886\n黄委\t234887\n斯佩比亚\t234888\nhc\t234889\n内经素\t234890\n刺丝\t234891\n民居\t234892\n四川省经信委\t234893\n川报\t234894\n远华案\t234895\n高标清\t234896\n安奇\t234897\n命运之路\t234898\n强将\t234899\n融雪\t234900\nSIM800\t234901\n112元\t234902\nCARRY\t234903\n非诚勿扰1\t234904\n心情\t234905\n财富大厦\t234906\n全国人民代表大会\t234907\nDANIEL\t234908\n形形\t234909\n李盈莹\t234910\n死兆星\t234911\n二手工程车\t234912\n酱肘子\t234913\n善道\t234914\nEnergy\t234915\n汶上政府网\t234916\norgernizer\t234917\n小牧\t234918\n100px\t234919\n疯狂世界\t234920\n三通管\t234921\n2017—2021年\t234922\n汉匈决战\t234923\n鳞癌\t234924\n亡齿寒\t234925\nchassis\t234926\n胃胶囊\t234927\nicd10\t234928\n滚烫\t234929\n新中医\t234930\n12阶\t234931\n秋池\t234932\n北镇市人民政府\t234933\n自封袋\t234934\n36码\t234935\n兰台世界\t234936\n烨辉\t234937\n尹道贤\t234938\nliying\t234939\n286个\t234940\n左盘\t234941\n点菜单\t234942\n首演\t234943\n邪恶漫画之教育坏小孩_邪恶少女漫画无翼鸟全集\t234944\n美猴\t234945\n昙花梦\t234946\n橙鹰\t234947\n投资回报率\t234948\n对外经贸大学\t234949\n河塘\t234950\n磊磊\t234951\n西田城\t234952\n醇酒\t234953\n同类型\t234954\n蔡桓公\t234955\n参配\t234956\n收案\t234957\n追杀\t234958\n充电钻\t234959\n大连科技学院\t234960\n苍松\t234961\n6.5mm\t234962\n永中office\t234963\n开源证券\t234964\n铜牌\t234965\n深圳市安全生产监督管理局\t234966\nMetacritic\t234967\n捐助\t234968\n万科湖心岛\t234969\n孟学农\t234970\n一个位\t234971\n口袋妖\t234972\n营山\t234973\n重庆市地方税务局\t234974\n最终幻想12黄道时代\t234975\n刻机\t234976\n水飞蓟宾胶囊\t234977\n奇葩超能事务所\t234978\n360儿童卫士\t234979\n两万块\t234980\n双拥模范城\t234981\n145\t234982\n上拉电阻\t234983\n2017年四月\t234984\n一人有限公司\t234985\n有勇无谋\t234986\n误报\t234987\n名气\t234988\n杨晓渡\t234989\n边检站\t234990\n祝寿\t234991\n第二卷\t234992\n不可开交\t234993\nthorough\t23
4994\n消元法\t234995\n高人一等\t234996\n字笔\t234997\n视帝\t234998\nSola\t234999\n廉政公署\t235000\n珍妮花\t235001\nJeju\t235002\n套马杆\t235003\nhuiyuan\t235004\n套型\t235005\n脐血\t235006\n心热\t235007\n抚顺路\t235008\n第28次\t235009\n党们\t235010\nIIR滤波器\t235011\n套花呗\t235012\nf191\t235013\n柳川\t235014\n蛋糕店\t235015\n灌灌\t235016\n雷锋日\t235017\n雄安人才网\t235018\n上海恒隆广场\t235019\n玩通\t235020\n申命\t235021\n古古电影网\t235022\n佳苗瑠华\t235023\n通话费\t235024\n豌豆荚\t235025\n许哲佩\t235026\nBridging\t235027\n大切\t235028\n朱丹维塔斯\t235029\n武汉市公安局\t235030\n亚甲基\t235031\n上海海关学院\t235032\ncs75\t235033\n六朵\t235034\n意韵\t235035\n万山岛\t235036\nWEB-MP4\t235037\n眉型\t235038\n公装\t235039\n五饼二鱼\t235040\n腺样体手术\t235041\n枫林湾\t235042\n5GHz\t235043\nAvi\t235044\nhtop\t235045\n1218\t235046\n丛林肉搏\t235047\n权证\t235048\nyy语音\t235049\nanxiety\t235050\nmedoo\t235051\n友达\t235052\n刘斌\t235053\ncamila\t235054\nRapid\t235055\n貌\t235056\n陈婷\t235057\n马桩\t235058\nkdx\t235059\npopupmenu\t235060\nutau\t235061\ngrr\t235062\n僵尸道\t235063\n长河镇\t235064\n沼泽\t235065\n中药品\t235066\n8样\t235067\n华夏现金增利\t235068\n得道者\t235069\n孙畅\t235070\n中国二十冶集团有限公司\t235071\n何伟\t235072\n圣祖\t235073\n农事\t235074\n铁炉村\t235075\n20180214\t235076\naeiou\t235077\npossibility\t235078\n止泻\t235079\n叶菜\t235080\n谤\t235081\n董卓\t235082\n懿古今\t235083\n错话\t235084\n易金通\t235085\n税友集团\t235086\n幺蛾子\t235087\n盛大传奇\t235088\n欧瑞\t235089\n观音桥街道\t235090\n218.94\t235091\n沙盒\t235092\n海南酒店\t235093\n桃花节\t235094\nDNF2016\t235095\nwin7双系统\t235096\n英雄山路\t235097\n1991年\t235098\n回款率\t235099\n家家\t235100\n太阁4\t235101\n品文\t235102\n3920\t235103\n氧化钼\t235104\n波尔多液\t235105\nStacy\t235106\nstrut\t235107\n16299.15\t235108\n墨江\t235109\n234汽车网\t235110\n4屏\t235111\n喀喀湖\t235112\n六爻占卜\t235113\n剩余价值率\t235114\n幻肢\t235115\n北通阿修罗\t235116\nchenhao\t235117\n韦博英语\t235118\n常量值\t235119\n灿若\t235120\n豫章\t235121\n姐夫\t235122\n古墩\t235123\n巧匠\t235124\n李微\t235125\nQuerying\t235126\n迟暮\t235127\n身披\t235128\n奥迪Q3\t235129\n无思\t235130\n胡萝卜丝\t235131\n手机端\t235132\n致命性\t235133\n蜜桃成熟时\t235134\nHENRY\t235135\n徐国栋\t235136\nACG汉化组\t235137\n丁级\t235138\n日东电工\t235139\n元器件\t235140
\n两世\t235141\n常用电话\t235142\n潍坊职业学院\t235143\n苦恼\t235144\n昆钢\t235145\n全球速卖通\t235146\n柏杨\t235147\n雅歌\t235148\n专业人士\t235149\n钓鱼池\t235150\n阿优\t235151\n重庆晚报\t235152\n乌克兰人\t235153\n报出\t235154\narm处理器\t235155\n小丁丁\t235156\n水地暖\t235157\n李荷艺\t235158\n清华斯维尔\t235159\n一元二次不等式\t235160\n满贯\t235161\n搜狗推广管家\t235162\n安徽省扶贫办\t235163\n床\t235164\n1707\t235165\n迷惑\t235166\n国函\t235167\n神经元特异性烯醇化酶\t235168\n清宫性史\t235169\n鹈鹕\t235170\n0.93\t235171\n理性化\t235172\nyau\t235173\n卡旺卡\t235174\nVideoStudio\t235175\n计量泵\t235176\nserializeArray\t235177\nVulcan\t235178\n非新\t235179\n嘻嘻\t235180\n国华电力\t235181\n重锤式\t235182\n种种\t235183\n壁炉\t235184\ndisruptive\t235185\n有多远走多远\t235186\n农牧区\t235187\n岭南师范学院\t235188\n奇乐\t235189\n刘世锦\t235190\n养廉\t235191\n114号\t235192\n周大\t235193\n资讯入口网\t235194\n天根生化科技(北京)有限公司\t235195\ng45\t235196\n优思\t235197\nrebellion\t235198\nexpress-session\t235199\n贴片式\t235200\n净水剂\t235201\nSinatra\t235202\n宁陵县\t235203\n星湖路\t235204\n冯珊珊\t235205\n女s\t235206\n喝醉\t235207\n萃取机\t235208\nszs\t235209\nポロリ\t235210\n算命网\t235211\nswim\t235212\n物美超市\t235213\n卫生城\t235214\n微商达人\t235215\n冰结\t235216\n不鸣\t235217\n5nm\t235218\n引诱\t235219\n存疑\t235220\n历届冬奥会\t235221\n法棍\t235222\n栋笃笑\t235223\n七色花\t235224\n环链\t235225\n钱蓓婷\t235226\n认证杯\t235227\n审评\t235228\n水源\t235229\n单相异步电机\t235230\n阵元\t235231\nMovidius\t235232\n始兴县\t235233\nf12018\t235234\n点击量\t235235\n北京民政局\t235236\n资料性\t235237\n七万公里\t235238\n携程旅行\t235239\nangular4.0\t235240\n唱法\t235241\ntimezone\t235242\n外部源\t235243\nResolved\t235244\n沈阳医学院\t235245\n前两个月\t235246\n南航集团\t235247\n中华人民共和国行政诉讼法\t235248\n中铁十二局\t235249\n防长\t235250\n四川南充\t235251\narmani\t235252\n分离定律\t235253\n德园\t235254\nSensing\t235255\nraf\t235256\n新乡医学院第一附属医院\t235257\n扁舟\t235258\n六盘水市政府网\t235259\n皮亚尼奇\t235260\n钢料\t235261\n犹可\t235262\n环世界a17\t235263\n7400\t235264\n脚模\t235265\n唐平\t235266\n优畅\t235267\nBigTits\t235268\n第六十八章\t235269\nCSS篇\t235270\n遒\t235271\n贴\t235272\n薛松\t235273\n乐清中学\t235274\n肉料\t235275\n产税\t235276\n8.3.1\t235277\n孙妍\t235278\nexpiration\t235279\n无功补偿\t235280\n单立人\t235281\nPerl\t235282
\n女神异闻录4黄金版\t235283\n三国史\t235284\n新国家安全法\t235285\n碧霄\t235286\n阴阳师大\t235287\n冰种\t235288\n唐末\t235289\nBON\t235290\n2016年初\t235291\n隆科多\t235292\n5折\t235293\n防尘罩\t235294\n130套\t235295\n鼓式\t235296\n确有其事\t235297\n蓝斯\t235298\n停刊\t235299\n布类\t235300\n三圈\t235301\n东北大亨\t235302\n毕业设计中期检查表\t235303\n易观\t235304\n线桥\t235305\n不必说\t235306\n听听我说的吧\t235307\n唱见\t235308\n灭罪\t235309\n投影机\t235310\n雅顿\t235311\nserver2017\t235312\n20161205\t235313\n灵岩山寺\t235314\n島\t235315\n折达公路考勒\t235316\n汉末\t235317\n刘奇云\t235318\n奥古曼\t235319\n四五个月\t235320\n首钢基金\t235321\n后知者\t235322\n碳钢\t235323\n达芬奇\t235324\n佛洛依德\t235325\n暹罗猫\t235326\n高迪\t235327\n奥龙\t235328\nAHA\t235329\n20170725\t235330\n余生有你才安好\t235331\nArchangel\t235332\n独弦琴\t235333\n自由行攻略/\t235334\n长城雪茄\t235335\n冰桶\t235336\n36家\t235337\nisis\t235338\n197个\t235339\nimperative\t235340\n最后的爱\t235341\nDude\t235342\n天山大峡谷\t235343\n第5场\t235344\n盈德\t235345\n冒号\t235346\n泛滥成灾\t235347\n在线版\t235348\n萨伏伊\t235349\n冰水仙\t235350\nnqa\t235351\n风灵月\t235352\n刘亦孟非\t235353\nmeltdown\t235354\n欠缺\t235355\n手算\t235356\n无间行者\t235357\nUBC\t235358\nwin10pro\t235359\n阳光板\t235360\nexamine\t235361\n字标\t235362\n百色起义\t235363\n角度尺\t235364\n顺口溜\t235365\nspoiled\t235366\nConversions\t235367\nweg\t235368\nIP位址資訊\t235369\nwww.w3.org\t235370\nautolink\t235371\n基音\t235372\n向前进\t235373\n植物人\t235374\n微生\t235375\n行缩\t235376\n自问自答\t235377\n陈庭妮\t235378\n上海古镇\t235379\n大揭\t235380\n沉船\t235381\n陈之遥\t235382\n净资产\t235383\n绵阳火车站\t235384\n昌南\t235385\n未婚妻\t235386\n青山纸业\t235387\n邢爱明\t235388\n九口\t235389\nCrusader\t235390\n舟山\t235391\n姚望\t235392\n傅国涌\t235393\n20171222\t235394\n直客\t235395\n杜瓦\t235396\n全威儒\t235397\nhmget\t235398\n红褐色\t235399\n黄花梨木\t235400\n张俊\t235401\n第几期\t235402\n20140527\t235403\n长江图\t235404\n语语\t235405\nrenfe\t235406\n3211\t235407\nINR\t235408\n摄影师们\t235409\n4399h5游戏\t235410\nfileoutputstream\t235411\n87G\t235412\n包裙\t235413\n生活展\t235414\n春江新城\t235415\ntomford\t235416\n密西西比州\t235417\nmatlab2010\t235418\n药店\t235419\n观察器\t235420\n不增\t235421\n滴定\t235422\n二进制串\t235423\n京华春梦\t235424\nPanties\t2354
25\n博爱县人民政府\t235426\n液性\t235427\nBLAME\t235428\n龙浩\t235429\nsurfac\t235430\nAppend\t235431\n成都幼师学校\t235432\n种植机\t235433\n孤胆枪手2\t235434\n临泽\t235435\n文森\t235436\n500ML\t235437\n南药\t235438\n庄子\t235439\n抓捕\t235440\n王者荣耀阿珂\t235441\n封印者\t235442\n新货\t235443\n002129\t235444\n扇风\t235445\n荒神罪\t235446\n古墓王\t235447\n施定柔\t235448\n孝顺镇\t235449\n清算行\t235450\n程单\t235451\n呼和浩特白塔机场\t235452\n膝关节置换术\t235453\n张秀娟\t235454\n沙县\t235455\n极色\t235456\n随诊\t235457\ngrabcut\t235458\ncentos5\t235459\n贸易额\t235460\n开心猫\t235461\n1899元\t235462\n六角螺母\t235463\n250家\t235464\n北京颐和园\t235465\n铁剂\t235466\n9月1日前\t235467\n生男生\t235468\n遗尘\t235469\n铝木门窗\t235470\n朱建明\t235471\n4线\t235472\n脓头\t235473\n贡茶\t235474\n纯白色\t235475\ndelete键\t235476\nHTMLParser\t235477\nKeystore\t235478\nZha\t235479\n八公举\t235480\n三明一中\t235481\n比瑞吉宠物用品股份有限公司\t235482\n四颗\t235483\n赖声川\t235484\n北仑\t235485\n中国科学院光电研究院\t235486\n朗新科技股份有限公司\t235487\n羊角锤\t235488\n赣州市人民医院\t235489\n搬家公司_搬家公司价格_搬家公司\t235490\n马凡舒\t235491\n万仙山景区\t235492\n事业单位公共基础知识\t235493\nAwards\t235494\n打骗\t235495\n低压\t235496\n后海大道\t235497\n盗贼\t235498\n黑基\t235499\nartnet\t235500\n55You.COM\t235501\n九十九个\t235502\n材料\t235503\n翻译版\t235504\n刨刀\t235505\n调琴\t235506\nQQ悄悄话\t235507\n育才二中\t235508\n停驶\t235509\nmymps\t235510\n陈岱孙\t235511\n错车\t235512\n艰苦朴素\t235513\n李涵\t235514\n主人公\t235515\n前程无忧网\t235516\n溪洛渡水电站\t235517\n盐城经济开发区\t235518\n中通快递\t235519\n江西省纪委\t235520\nav狼\t235521\n爱问问免费\t235522\n处变\t235523\n枪骑兵\t235524\n恒辉\t235525\n枣庄市立医院\t235526\n哈尔滨市委\t235527\n古龙水\t235528\n工业设计吧\t235529\n中德证券\t235530\n物权法\t235531\n媒派\t235532\n土木工程詹天佑奖\t235533\n秦可卿\t235534\nResident\t235535\n雷池\t235536\n路童\t235537\n领鲜\t235538\n不由自主\t235539\n脱兔\t235540\n爆浆海盐奶盖蛋糕\t235541\n且听风吟\t235542\n悠哉游哉\t235543\nBoot框架\t235544\n皮诺\t235545\n道滘镇\t235546\n13万赞\t235547\n赛睿\t235548\n天分\t235549\n联招\t235550\n梁爱琪\t235551\n我管你\t235552\n银莲花\t235553\ntuberculosis\t235554\n降额\t235555\nF1.2\t235556\n网业\t235557\n送温暖\t235558\nigb\t235559\n44码\t235560\n质评\t235561\nindent\t235562\n宝生\t235563\n盐酸阿霉素\t235564\n注册者\t235565\n办完\t235566\n中和教育\t235567\
n公安局指挥中心\t235568\n别克GL8\t235569\n脱离\t235570\nPer\t235571\n中国农化招商网\t235572\n恋物\t235573\n四枚\t235574\n餐饮业\t235575\n英孚少儿英语\t235576\n见福\t235577\n蓝箭\t235578\n比特币交易网\t235579\nIKO\t235580\n颖上县\t235581\n富港\t235582\n给你\t235583\n义渠\t235584\nwolfSoul\t235585\n惠莞高速\t235586\n交通银行股份有限公司\t235587\n处处吻\t235588\nb75\t235589\n山观\t235590\n铁臂\t235591\n己未\t235592\n蒙德\t235593\n润圆\t235594\n景色\t235595\nV5.7\t235596\n茯苓饼\t235597\n素水泥浆\t235598\n外星人\t235599\n运动员们\t235600\nsentiment\t235601\n二元论\t235602\n麦卢卡蜂蜜\t235603\n九首\t235604\n卡德加\t235605\n汽车融资租赁公司\t235606\n政研\t235607\nQQ飞车手游\t235608\n反渗透阻垢剂\t235609\n17134\t235610\n苦月亮\t235611\n西麦\t235612\n明代\t235613\njlabel\t235614\n西山景区\t235615\n字灯\t235616\n路车\t235617\n最终昂克赛拉\t235618\n饥寒\t235619\n阿育吠陀\t235620\n无挑\t235621\n图样机\t235622\n上升星座\t235623\n溯痕\t235624\n老米\t235625\n邓月平\t235626\n睿力集成电路有限公司\t235627\n大门\t235628\n伊红\t235629\n千军\t235630\n耿\t235631\nYumi\t235632\ncbg\t235633\n8000亿元\t235634\n全彩版\t235635\n218\t235636\n绿城物业\t235637\n张志民\t235638\n妥布霉素滴眼液\t235639\n诸暨中学\t235640\n优胜教育\t235641\n第二十六回\t235642\n徐江\t235643\n大嶝\t235644\n市面\t235645\n力臂\t235646\nsikuli\t235647\n菲罗多\t235648\n菩\t235649\nElectrolux\t235650\njayjun\t235651\n血小板分布\t235652\n捷豹xe\t235653\n永生者\t235654\n180000\t235655\n屯溪老街\t235656\nLevel-2\t235657\n天下第二\t235658\n倩女幽魂3\t235659\n统制\t235660\n秋心\t235661\n每秒\t235662\ncostmap\t235663\nMIME\t235664\n落伍\t235665\n氧化镧\t235666\nDances\t235667\n资金周转率\t235668\n贝米钱包\t235669\n视权\t235670\n群发器\t235671\n漏保\t235672\n颂拓斯巴达\t235673\nh3\t235674\n陈耀兴\t235675\n南康家具网\t235676\nkaydenkross\t235677\n化妆品类\t235678\n御小凡\t235679\n第104章\t235680\n2017-08\t235681\n细嚼慢咽\t235682\npict\t235683\n好考\t235684\n移速\t235685\n沣西\t235686\n散景\t235687\n三次方\t235688\n西柏坡\t235689\n软控股份\t235690\n荷包蛋\t235691\n桂政\t235692\n彩子\t235693\n甜菜\t235694\n8181\t235695\n宦海奇官\t235696\n生为本\t235697\n喜糖盒\t235698\n千载难逢\t235699\n打闹\t235700\n小盘股\t235701\n总院\t235702\nWALK\t235703\n电热丝\t235704\n初号机\t235705\n甲级防火门\t235706\n大众汽车\t235707\n圣泉\t235708\n美国海关\t235709\n中国三农网\t235710\nnba\t235711\n阻带\t235712\n1.2L\t235713\n2
千元\t235714\nwebBrowser\t235715\n电气工程及其自动化专业\t235716\n颌下腺\t235717\n湘夫人\t235718\n8h\t235719\n茄果\t235720\nhs8545m\t235721\n229路\t235722\nlogging模块\t235723\n我还记得\t235724\natreus\t235725\nadvent\t235726\n赛欢网\t235727\n三防灯\t235728\n顾昀\t235729\n水利学报\t235730\n聚丙烯\t235731\nssy\t235732\n南京公安\t235733\n艾尔玛\t235734\nMPLUS\t235735\n山西通志\t235736\n1800米\t235737\n米淘\t235738\n3239\t235739\nGoulding\t235740\nWLAN版\t235741\n苏牙\t235742\n溴吡斯\t235743\n210亿\t235744\n嫡长孙\t235745\nv5.1\t235746\n一夜\t235747\n显影\t235748\nLumia950\t235749\n法治中国\t235750\n胃幽门螺杆菌\t235751\nAutofac\t235752\nBEATS\t235753\n慢性肾功能衰竭\t235754\n400点\t235755\n90克\t235756\n20180103\t235757\nGTX980M\t235758\n吸音棉\t235759\n陪客\t235760\nx性\t235761\n古风文\t235762\n玉面小嫣然\t235763\n镇江市水利局\t235764\n7-2\t235765\n给水泵\t235766\n功德林\t235767\n彬彬\t235768\n振动电机\t235769\n善变\t235770\n南京鼓楼医院\t235771\n增长黑客\t235772\nG2020\t235773\naptx\t235774\n权利人\t235775\n古浪路\t235776\n烧着\t235777\n柳桉木\t235778\n第二档\t235779\n兆阳\t235780\n高速动车组\t235781\n液压千斤顶\t235782\n亨\t235783\n步入\t235784\n一根根\t235785\n奥迪Q3论坛\t235786\n聊城高新区\t235787\n战五渣\t235788\n抗菌\t235789\n云峰\t235790\n猪哥亮\t235791\n预售票\t235792\n抗辩权\t235793\nylmf\t235794\n杀心\t235795\nyamy\t235796\n才子佳人\t235797\n超广角\t235798\nm1卡\t235799\n小作坊\t235800\nmsgbox\t235801\nthinning\t235802\n低潮期\t235803\n眼睑\t235804\n1千亿\t235805\nYYYYMMDD\t235806\n套门\t235807\n氮氧化合物\t235808\n饥饿度\t235809\n偏态分布\t235810\n红米2A\t235811\n龙神契\t235812\n韩孝周\t235813\nz5p\t235814\n监理公司\t235815\n打雀英雄传\t235816\nbiodiversity\t235817\n妒忌\t235818\nhearing\t235819\n东方公证处\t235820\n正式版\t235821\n网易企业\t235822\n建设工程公司\t235823\nE路\t235824\n自媒\t235825\n电费\t235826\n400g\t235827\n中华民国政府\t235828\n运城\t235829\n表亲\t235830\n免编程\t235831\n刷版\t235832\n4712\t235833\n诏安县\t235834\n南京二手房网\t235835\n张华\t235836\n反动派\t235837\n回宫\t235838\n暖机\t235839\n销售代表招聘网\t235840\n汤姆·克兰西\t235841\n羊舍\t235842\n石河子\t235843\n3sigma\t235844\n8迅雷\t235845\n全民国家安全教育日\t235846\nYDT\t235847\n优模网\t235848\n飞扬围棋论坛\t235849\n酿酒\t235850\n上海眼科医院\t235851\n艾熏\t235852\n5.1版\t235853\nGastroenterology\t235854\n绝句\t235855\n卡心\
t235856\n孙宏贝多芬\t235857\n如是我闻\t235858\n中纪委\t235859\n美食类\t235860\nzooskooistay\t235861\n上海阳光泵业\t235862\n正反方\t235863\n怀玉\t235864\n二手床\t235865\n称快\t235866\nkx5\t235867\n辉夜\t235868\n磨丁\t235869\ndhcp\t235870\nMIKA\t235871\n真烟\t235872\n武汉长江大桥\t235873\n老性\t235874\n怎么回事儿\t235875\n凯迪拉克中心\t235876\n杂样\t235877\n1820\t235878\n北京市人事局\t235879\n蟛蜞\t235880\n天津市环保局\t235881\nPretty\t235882\nfpace\t235883\n中海发展\t235884\nFri\t235885\n朱光亚\t235886\n在工\t235887\n共和制\t235888\n北坡\t235889\n植物大战僵尸2失落之城\t235890\n3620\t235891\nAHCI\t235892\n宝山区\t235893\n三芯\t235894\n火影忍者剧场版博人传\t235895\npscs5\t235896\n重晶石\t235897\nARMANI\t235898\n金龙客车\t235899\n草船借箭\t235900\n弗拉门戈\t235901\n万国\t235902\n2349\t235903\n清廉\t235904\n蓬头\t235905\n日轻\t235906\nmag\t235907\n裘国根\t235908\nHorses\t235909\n军级\t235910\n致新\t235911\n违约方\t235912\n白鹅潭\t235913\n_高考网\t235914\n混沌军团\t235915\n香港金管局\t235916\n汉长安城遗址\t235917\n偿还期\t235918\n烘烤机\t235919\n超过七天\t235920\nhellochart\t235921\nDungeons\t235922\n中工\t235923\ndown.xs\t235924\nscarpy\t235925\n旺旺仙贝\t235926\n传闻\t235927\n密布\t235928\n花山\t235929\n714\t235930\n上海民办幼儿园\t235931\nWord批量\t235932\n投毒\t235933\n傩城\t235934\n艾莉森\t235935\n眼见\t235936\n鞋盒\t235937\n妥帖\t235938\n新景家园\t235939\nFIAT\t235940\n劲霸\t235941\n5d\t235942\n中国环博会\t235943\n法制\t235944\n乐都区人民政府\t235945\n一院\t235946\n改性环氧树脂\t235947\n富德生命人寿\t235948\n危险游戏\t235949\n耐看\t235950\nA酸\t235951\n悬浊液\t235952\n二类户\t235953\n高庄村\t235954\nEMG\t235955\n第3号\t235956\n龙币网\t235957\nEK-乐\t235958\n詹15\t235959\nNSMutableArray\t235960\n27部\t235961\n哈药集团\t235962\n装裱机\t235963\n福建省司法厅\t235964\n百花齐放春满园\t235965\np3\t235966\n150磅\t235967\n螺丝刀\t235968\ninstgram\t235969\n魔方秀\t235970\n存储柜\t235971\n棍棍\t235972\nweak\t235973\n杭州市委\t235974\n太宗\t235975\n超参数\t235976\nVoiceTube\t235977\n17英里\t235978\n霹雳天命之破邪传\t235979\n28&\t235980\n不喝奶\t235981\n斯迈夫\t235982\n贝尔格里尔斯\t235983\n平跟\t235984\n成校\t235985\n彩电\t235986\n甄宓\t235987\n猎影之狼\t235988\n中国国际经济交流中心\t235989\n不锈钢无缝钢管\t235990\n薯类\t235991\nアダルト動画\t235992\n融\t235993\n双击\t235994\n2017KPL\t235995\n火影忍者:究极忍者风暴3\t235996\n孤岛惊魂5黄金版\t235997\n十四岁\t235
998\nFormerly\t235999\n隐形的翅膀\t236000\nJASON\t236001\n杨恭如\t236002\n周小结\t236003\n东方电影频道\t236004\n可听\t236005\n协议版\t236006\n5117\t236007\n核桃粉\t236008\n硫酸盐\t236009\n保坂\t236010\n巨斧\t236011\n罗杰斯\t236012\n传媒业\t236013\n郑州城市职业学院\t236014\n转争\t236015\n九宫八卦\t236016\n四个\t236017\n火车旅游网\t236018\n珍珠膏\t236019\n炉石传说_炉石传说\t236020\nLUT\t236021\n常林\t236022\n电脑路由器\t236023\n17173FGO\t236024\n官南大道\t236025\n小燕\t236026\n罗音\t236027\ninstr\t236028\n发神\t236029\n红酸枝\t236030\n山水池\t236031\n超车\t236032\n茶梗\t236033\n六世\t236034\n鹊\t236035\n圆管\t236036\n摇滚史\t236037\n授权委托书范本\t236038\n搜财网\t236039\nhans\t236040\n同性爱\t236041\n濠江区\t236042\n20141205\t236043\nsprd\t236044\n单一式\t236045\n五河琴里\t236046\n1.1万亿元\t236047\n3161\t236048\n文化节\t236049\nreception\t236050\n政治部\t236051\nCSM\t236052\n10.5亿\t236053\n广渠路\t236054\n郭杜\t236055\n_济\t236056\neternity\t236057\nWB4S\t236058\nBC省\t236059\n美人妻\t236060\n新城吾悦\t236061\nChemie\t236062\n詹姆斯哈登\t236063\n东风集团\t236064\n装可怜\t236065\n丁柳元\t236066\n树子\t236067\nbeamer\t236068\n水凝\t236069\n陕西网络广播电视台\t236070\n陶弘景\t236071\n广西文明网\t236072\n诺基亚930\t236073\n东海湾\t236074\n王熙\t236075\n夸克\t236076\n易站\t236077\n同镇\t236078\n咕力咕力\t236079\n大框\t236080\n白眼病\t236081\n过问\t236082\nlucent\t236083\n魔兽剑圣异界纵横\t236084\n糯米鸡\t236085\n张学林\t236086\n区组\t236087\n确美\t236088\ncentso\t236089\nxib\t236090\n新美\t236091\n龙光\t236092\n金曲\t236093\n春韵\t236094\n角\t236095\n绣红旗\t236096\nCCER\t236097\n无上\t236098\n羊羹\t236099\n济南市城市管理局\t236100\n2012吧\t236101\n丹溪\t236102\n编码表\t236103\ncontent\t236104\n|规范|天工问答\t236105\n新能泰山\t236106\nCNN卷积神经网络\t236107\n哈利波\t236108\n肏\t236109\n如意湖\t236110\n林肯mkx\t236111\n相关\t236112\n贵州智诚\t236113\n200克\t236114\n基本路线\t236115\nSheryl\t236116\n分色\t236117\nHOBO\t236118\n倒卖\t236119\n车臣战争\t236120\n武胜路\t236121\n滂沱大雨\t236122\n塑身贴\t236123\n控方证人\t236124\n屋顶花园\t236125\n四级\t236126\n一部\t236127\n水浒传\t236128\n王育琨\t236129\n传质\t236130\n天劫\t236131\n华纳国际\t236132\n09月\t236133\n商科类\t236134\nPops\t236135\n少先队队\t236136\n神豪\t236137\n芙\t236138\n变形金刚擎天柱\t236139\n宁缺毋滥\t236140\n求生之路2Mod_求生之路2\t236141\n长街镇\t236142\n未记\t236143\nqua
re\t236144\nApr\t236145\n爱鸟周\t236146\n诚信承诺书\t236147\n李一桐\t236148\n僵尸博士\t236149\n极路由1s\t236150\n王冰洋\t236151\n营口道\t236152\n留住\t236153\n订本\t236154\n草民\t236155\n病病\t236156\n41P\t236157\n体力劳动者\t236158\n气动电磁阀\t236159\n201407\t236160\n反射率\t236161\n第一稿\t236162\n世佳\t236163\n龙洋\t236164\n生产能力\t236165\n重庆刑事律师网\t236166\nwannaone\t236167\n300套\t236168\nNo.8\t236169\n郑州火车站\t236170\n铂涛酒店集团\t236171\n说句话\t236172\n律师团\t236173\n仙侠世界2\t236174\naccelerated\t236175\n仿生学\t236176\n宝硕股份\t236177\n电子政务\t236178\n20160412\t236179\n茅台集团\t236180\n润杰\t236181\nWebVR\t236182\n合租\t236183\nUltraEdit编辑器\t236184\n侧耳\t236185\n3750\t236186\n喜马\t236187\n瓜子\t236188\n纳福\t236189\n输卵管\t236190\n星型卸料器\t236191\n魔尺\t236192\n陈小武\t236193\n硬派\t236194\n苏贝果\t236195\n民企\t236196\n大一岁\t236197\n宇钻\t236198\n乐居财经\t236199\n浙江女排\t236200\n艾医生\t236201\n宫妃\t236202\nlaunched\t236203\n伍六一\t236204\n冻存管\t236205\n第136期\t236206\n天津大学电气自动化与信息工程学院\t236207\n工艺学\t236208\n热血战歌\t236209\n孙甜甜\t236210\n分龄\t236211\n中原环保\t236212\n英语学科网\t236213\nspringfox-swagger2\t236214\nnms\t236215\n宝鸡市区\t236216\n绿帽\t236217\n仓储式\t236218\n酸梅汁\t236219\n叉车\t236220\n冰装\t236221\n第50关\t236222\n辛宇\t236223\n16.04下\t236224\n观音诞\t236225\n成都装修公司\t236226\n艾轩\t236227\n崔斯特\t236228\n包子\t236229\n走为\t236230\n北京智芯微电子科技有限公司\t236231\n灵芝多糖\t236232\nRST\t236233\n回澜\t236234\n童行\t236235\nmethane\t236236\n恥\t236237\n秀珍\t236238\nCons\t236239\n原态\t236240\n王纯\t236241\n煤城\t236242\n剑门关\t236243\n茂硕电源\t236244\n家乐福\t236245\n网新科技\t236246\n1702015\t236247\n凤铝\t236248\n1085\t236249\nSRL\t236250\n爱有声小说网\t236251\nhayes\t236252\n若干份\t236253\n胥口镇\t236254\n黄金菊\t236255\nHa\t236256\nRows\t236257\n前郭县\t236258\n布鲁氏菌病\t236259\n林夏薇\t236260\n轴对称图形\t236261\nFreemaker\t236262\n91C仔\t236263\n呼市二中\t236264\n新荣村\t236265\n大丽花\t236266\n稗子\t236267\n补税\t236268\n4项\t236269\n第41条\t236270\n悬灸\t236271\n极差\t236272\n王鸥\t236273\n李大海\t236274\n库洛\t236275\n粒子群\t236276\npacf\t236277\n一个一米\t236278\n碧桂园天玺湾\t236279\n偶活学园\t236280\n切刀\t236281\n奇光\t236282\n国际内衣网\t236283\nv5.9\t236284\n合同章\t236285\neclipse4.7\t236286\n1U\t236287\nhotels
\t236288\n天涯海角\t236289\n曼谷廊曼机场\t236290\n王家墩\t236291\n房管\t236292\n五矿发展\t236293\n南关岭\t236294\n下沉式\t236295\n21年前\t236296\n笑靥\t236297\n五爪龙\t236298\n萤草\t236299\n拜仁慕尼\t236300\n楚都\t236301\n百度图标\t236302\nATTO\t236303\n18566201099\t236304\n第30号\t236305\n微透镜\t236306\n阿中\t236307\n杭州市人民防空办公室\t236308\n清洁在线\t236309\nmts\t236310\n厅堂\t236311\n生化危机1\t236312\n拉缝\t236313\n叶缘\t236314\n﹎\t236315\n四万公里\t236316\n伊纳瑞斯\t236317\n齐整\t236318\n钱穆\t236319\n青年们\t236320\n外县\t236321\n小米金融\t236322\n鸷\t236323\n胡问鸣\t236324\n温碧泉\t236325\n重庆大学》\t236326\n停不了的爱\t236327\nclarke\t236328\n孙丽萍\t236329\n少龙风流\t236330\n杨海涛\t236331\n融创国博城\t236332\n申敏儿\t236333\n花情\t236334\nmatalb\t236335\n力得\t236336\n99种\t236337\n2.cn\t236338\n外资司\t236339\n豆斋果\t236340\nperth\t236341\n36032\t236342\n扬扬\t236343\nnativecat\t236344\n南歌子\t236345\n博购\t236346\n值得一听\t236347\n呆逼\t236348\n造\t236349\n直屏\t236350\n宜昌市\t236351\n巴萨\t236352\n韩王\t236353\n佳地\t236354\n11回\t236355\n日单\t236356\ngys\t236357\n林改\t236358\nscrews\t236359\n画线\t236360\n春雨\t236361\n致死\t236362\n红白蓝\t236363\n充电\t236364\n水经注\t236365\n两封信\t236366\n火山石\t236367\nUninstaller\t236368\n中国邮政报\t236369\n万方数据中小学\t236370\n雅骚\t236371\n红黄蓝绿\t236372\n女妆\t236373\n中华香烟\t236374\n怎么多\t236375\n奥创纪元\t236376\n5W-40\t236377\n国办发明电\t236378\n嗜碱性粒细胞\t236379\nHTML\t236380\nCollecting\t236381\n酒窝网\t236382\n天山区\t236383\n艾丽卡\t236384\n盐官镇\t236385\nStadt\t236386\nWorks\t236387\n抗静电\t236388\n0304\t236389\n计算机工程与应用\t236390\n思普\t236391\n赵胤胤\t236392\n天昊\t236393\n榆林日报\t236394\n犬瘟热\t236395\n杨园\t236396\n料线\t236397\nCOMMERCIAL\t236398\n.net2.0\t236399\ndefs\t236400\n脚底下\t236401\n可以吗\t236402\n知乎机构号\t236403\n极客湾\t236404\n发寻\t236405\n这么多年来\t236406\n铁胆神侯\t236407\nDido\t236408\n71型\t236409\n户外运动\t236410\n80项\t236411\n七星岩\t236412\n功光\t236413\n内置版\t236414\n男女\t236415\n获得感\t236416\n4月9号\t236417\n240级\t236418\n映异构体\t236419\n中华商标超市\t236420\n危改\t236421\n浦江\t236422\n导板\t236423\n闺头\t236424\n金庸群侠传之决战江湖\t236425\n赖伟锋\t236426\n卢麒元\t236427\n应付税款法\t236428\n苹果8p\t236429\n预借\t236430\n中石化集团公司\t236431\n22087511\t236432\n68度\t236433\n自傲\t2
36434\n张振华\t236435\n天龙八部手游\t236436\n彼女\t236437\n两手空空\t236438\n加次\t236439\n空中大灌篮\t236440\n珍妮特\t236441\n龙之\t236442\n钴铬烤瓷牙\t236443\n林雨薇\t236444\nSSGF\t236445\n我的奇妙男友\t236446\n验伤\t236447\n西南财经大学会计学院\t236448\ntop100\t236449\n终极PK\t236450\ndanlan\t236451\n后半生\t236452\n日前\t236453\n神谷英树\t236454\n29800\t236455\nWordPress数据库\t236456\nSwoole\t236457\n长沙保卫战\t236458\nnicki\t236459\n我的男孩\t236460\n隐士\t236461\n笙歌\t236462\n闲客\t236463\n钟灵毓秀\t236464\nPSO\t236465\n残光\t236466\n安讯士\t236467\nSelenium2\t236468\n活佛济公2\t236469\n中国国旅\t236470\n易百纳论坛\t236471\n版面费\t236472\n0x8007007\t236473\nFiat\t236474\n正荣地产\t236475\n无所不有\t236476\nRAS\t236477\n大势\t236478\n脐血流\t236479\n秦岭山\t236480\n恒林股份\t236481\n昆玉河\t236482\n续贷\t236483\n救出\t236484\n领子\t236485\n高转送\t236486\n4752\t236487\n桥门式起重机\t236488\n乘数\t236489\n消声\t236490\nHD720P\t236491\n10000美元\t236492\nlpd\t236493\n大核\t236494\n康泰生物\t236495\nspar\t236496\nv3.6\t236497\nwindows98吧\t236498\n温州银行\t236499\nigs\t236500\nfriday\t236501\n92A\t236502\n2k分辨率\t236503\n文阁\t236504\n车辆购置税\t236505\noVirt\t236506\nSophos\t236507\n中共青岛市委党校\t236508\n88125315\t236509\nmysqlimport\t236510\n美素奶粉\t236511\nexui\t236512\n成吉思汗陵\t236513\n爆草\t236514\n观赏螺\t236515\nvery\t236516\n30处\t236517\napproach\t236518\n有条路\t236519\n萦\t236520\n地下层\t236521\n五个字\t236522\n天津地铁5号线\t236523\n绿色荧光蛋白\t236524\n公众人物\t236525\n天降\t236526\n朱庇特\t236527\n上百种\t236528\n人牙\t236529\n卫生厅\t236530\n匀称\t236531\n何晶晶\t236532\n思亲肤\t236533\n堰流\t236534\n耿爱学\t236535\n建成区\t236536\n5541\t236537\n花重锦\t236538\n花烛夜\t236539\n失帧\t236540\n神舟Z7\t236541\nweb自动化测试\t236542\n刘徽\t236543\n徐蕾\t236544\n编辑器\t236545\n阴牌\t236546\n有限元法\t236547\n瑶族\t236548\n信丰\t236549\n上联\t236550\n政府债\t236551\nNameNode\t236552\n英杰们\t236553\n荆州市中心医院\t236554\n豫龙镇\t236555\n圣基茨和尼维斯\t236556\n国泰港龙\t236557\n用友通T3\t236558\n0952\t236559\n防喷器\t236560\n45届\t236561\nRebase\t236562\n0kb\t236563\n雅思8\t236564\n重庆科创职业学院\t236565\nF大调\t236566\nchartjs\t236567\n共赢\t236568\n控制性详细规划\t236569\nSANYO\t236570\n探身\t236571\n项目类\t236572\n010101\t236573\n龙腾世纪3:审判\t236574\n95g\t236575\n成都市技
师学院\t236576\nsignificantly\t236577\n满目疮痍\t236578\n黑女人\t236579\n天使之路\t236580\n大误\t236581\ndongguan\t236582\n四世同堂\t236583\n墨墨\t236584\noe\t236585\n富马酸喹硫平片\t236586\nxtick\t236587\n亚欧\t236588\n云浮时刻网\t236589\n3A级\t236590\nswagen\t236591\n云打印机\t236592\n环亚\t236593\n行政处罚书\t236594\nconvection\t236595\n人朝\t236596\nScreen\t236597\n驻马店驿城区\t236598\nSignificance\t236599\nSOUTH\t236600\n中国江西省人民政府\t236601\n农博\t236602\nsupervisor\t236603\n汚\t236604\n盲探\t236605\n方特欢乐世界\t236606\n窑炉\t236607\n会后\t236608\n2.0.1\t236609\n王海平\t236610\n工段长\t236611\n中国汽车技术研究中心\t236612\n價格\t236613\n忍心\t236614\n小农女\t236615\n仓库管理员\t236616\n掌上英雄联盟\t236617\n排污费\t236618\nin&#160\t236619\n防毒面罩\t236620\nPBP\t236621\nModbus\t236622\n华自科技\t236623\n猎头\t236624\nEra\t236625\nsendMessage\t236626\n守礼\t236627\n60wow\t236628\n四川音乐学院\t236629\n星美国际影城\t236630\n我们的太阳\t236631\nddr4内存\t236632\n2-3年\t236633\n弄懂\t236634\n阎婆惜\t236635\n宅男派\t236636\n矿业大学\t236637\ncervical\t236638\nnini\t236639\n北京和美妇儿医院\t236640\n编集\t236641\n大连市口腔医院\t236642\n迪士尼英语\t236643\n虚拟定位软件\t236644\n巨虎\t236645\n宿吉南\t236646\n静态链接库\t236647\n20150823\t236648\n三分球\t236649\n棉质\t236650\n边型\t236651\ncopies\t236652\n中邮网\t236653\n速射炮\t236654\n雷达液位计\t236655\n朱晓红\t236656\n电工进网许可证\t236657\n保家\t236658\n祝贺信\t236659\n三十出头\t236660\nMachine\t236661\n国安办\t236662\ne470c\t236663\n几年\t236664\n1.57\t236665\n广发证券\t236666\npairing\t236667\n静安\t236668\nNikkor\t236669\n翡翠玉\t236670\n隆华节能\t236671\n敞篷车\t236672\n牵引机\t236673\n优合\t236674\n停产\t236675\n花味\t236676\n枣庄市政府\t236677\n今年1月份\t236678\nママン\t236679\n收场\t236680\ncropper\t236681\n大关中学\t236682\nisin\t236683\n民商法\t236684\nolive\t236685\ngrandpa\t236686\n恐龙\t236687\n有量\t236688\nVisiting\t236689\nWax\t236690\n易说\t236691\n钰儿\t236692\n恒和\t236693\n美圆\t236694\n考文垂\t236695\n里亚尔\t236696\n五定\t236697\nCIBC\t236698\n李默\t236699\n大中小型\t236700\n港式茶餐厅\t236701\n到达到\t236702\n猜疑\t236703\n起哄\t236704\n皮夹\t236705\nboylondon\t236706\n泰顺县\t236707\n门脉\t236708\nR8\t236709\n1斤\t236710\n代班\t236711\n种子轮\t236712\nYANG\t236713\nDOVE\t236714\n林伟\t236715\n赵丽丽\t236716\n2016-20
17\t236717\nCSOL2\t236718\n恒温热水器\t236719\n糖税\t236720\n朱恒鹏\t236721\nadmini\t236722\n欠息\t236723\nrog吧\t236724\n通讯簿\t236725\n中华人民共和国教育部\t236726\n米卡\t236727\n金港湾\t236728\n收付\t236729\nWalkthrough\t236730\nsuperset\t236731\n前书\t236732\n新生命\t236733\n施悬\t236734\n爆仓\t236735\n网书\t236736\ndemurrage\t236737\n先生们\t236738\n黑板\t236739\nmybatis-plus\t236740\n小米音响\t236741\neLyrics\t236742\n油道\t236743\nuva\t236744\na-2b\t236745\n北京饭店\t236746\nphonetic\t236747\n龙爪槐\t236748\nMIP\t236749\nqlikview\t236750\n简讯_民心网\t236751\n虚拟现实眼镜\t236752\n36行\t236753\n二元一次不等式\t236754\n九华\t236755\n薄纱\t236756\n谢金燕\t236757\nnopad\t236758\n3558\t236759\n预防接种异常反应\t236760\n正向化\t236761\n20170412\t236762\n举世\t236763\n解放街道\t236764\n东莞沙田集散中心\t236765\n7.05\t236766\neasypanel\t236767\n江西经济管理干部学院\t236768\npws\t236769\n谋面\t236770\n不写\t236771\n绝世猫痞\t236772\n煤化\t236773\n会导致\t236774\nSexual\t236775\n航空运单\t236776\n杀猪\t236777\n西平县\t236778\n17.10\t236779\n冒领\t236780\nwudi\t236781\n1919\t236782\n泰盈\t236783\n塔山\t236784\n克劳德\t236785\n深水埗\t236786\n17173DNF\t236787\n岳阳职业技术学院\t236788\n原创新\t236789\n古穿今\t236790\nterrain\t236791\n在肩\t236792\n1980\t236793\n受冷\t236794\n最冤\t236795\n广州市黄埔区教育局\t236796\n会当凌绝顶\t236797\n杨浦\t236798\n北极星电力网\t236799\ndoyou\t236800\n成绵高速\t236801\n任妙音\t236802\nfujifilm\t236803\n娄东\t236804\nwebkit-box\t236805\n潘超\t236806\n邻苯二甲酸氢钾\t236807\n毒师\t236808\n第六套\t236809\n捷克申根\t236810\n筋疲力尽\t236811\n3266\t236812\n牛振华\t236813\n何生\t236814\n好奇葩\t236815\n小熊维尼历险记\t236816\n内蒙古煤矿安全监察局\t236817\n打销\t236818\n观书\t236819\n欧浦钢网\t236820\n乾坤听书网\t236821\n40码\t236822\n西南航空\t236823\n独行\t236824\n聚龙花园\t236825\n授予\t236826\n满盘\t236827\n马力机\t236828\n货基\t236829\n尼古拉二世\t236830\n查韦斯\t236831\n除草机\t236832\nProof\t236833\n微信聊天机器人\t236834\n第三方支付牌照公司\t236835\nads\t236836\nSRTP\t236837\ngrapher\t236838\n北德\t236839\n高密度聚乙烯\t236840\n赛达尔\t236841\nAION\t236842\n商人们\t236843\n购地\t236844\nS&L\t236845\n20151209\t236846\n必创\t236847\n新优\t236848\n黑糖姜茶\t236849\n洛阳市委\t236850\n转向油\t236851\n修门\t236852\nautorelease\t236853\n美国保险公司\t236854\n产业基地\t236855\n美丽雅\t236856\nTE
NNFY\t236857\n视效\t236858\nLink无线路由器\t236859\n气垫船\t236860\n中国岩土网\t236861\n马尔\t236862\n李捷\t236863\nGan\t236864\n深圳蛇口邮轮中心\t236865\n【许\t236866\nGAMER\t236867\n770\t236868\n武老师\t236869\nfits\t236870\n察哈尔学会\t236871\nespresso\t236872\nMSV\t236873\nYandex\t236874\n三界\t236875\n韩世麟\t236876\n苏神\t236877\n俞老师\t236878\n刘建民\t236879\n氢氰酸\t236880\n交流赛\t236881\n晕厥\t236882\n纽巴伦\t236883\n梳子\t236884\n拉卡拉吧\t236885\n天下第一村\t236886\n家庭类\t236887\n仙三外传\t236888\n清库\t236889\n鱼尾板\t236890\n哈尔滨太平国际机场\t236891\n曹营\t236892\n沪昆铁路\t236893\n第三十一回\t236894\n金坤\t236895\n获评\t236896\n金庄\t236897\navengers\t236898\n南昌市政府\t236899\n慈善基金会\t236900\n忠犬八公\t236901\nfailing\t236902\n惊叹\t236903\n贝林\t236904\nM4R\t236905\n陈俊生\t236906\n小钢炮\t236907\n综论\t236908\nlv2\t236909\n9306\t236910\n贝奥武夫\t236911\n小伙伴儿\t236912\nsamplitude\t236913\n200辆\t236914\n鹰击长空\t236915\n节圆\t236916\n纸皮\t236917\n死地\t236918\n钢波纹管涵\t236919\nmod风云三国\t236920\n面部神经炎\t236921\n小林子\t236922\n看片网\t236923\nwinload\t236924\nA9处理器\t236925\n仙境传说吧_\t236926\nSquid\t236927\n0.22\t236928\n小鹤\t236929\n神契\t236930\n尾段\t236931\n十余名\t236932\n很听话\t236933\n科研部\t236934\n香菜\t236935\nthml\t236936\n七水硫酸亚铁\t236937\n深入理解计算机系统\t236938\n古美路\t236939\nPatrickLiu\t236940\n稀里糊涂\t236941\n卵石\t236942\n漓江\t236943\n成人大学\t236944\n证据确凿\t236945\n惠州大亚湾区\t236946\n奢悦\t236947\n飞来\t236948\n乍暖还寒\t236949\n南山竹海\t236950\n挑逗性\t236951\nHotUKDeals\t236952\n没有声\t236953\n捧得\t236954\n人口老龄化\t236955\n广灵县\t236956\n亚泰集团\t236957\nJAL\t236958\nLED控制器\t236959\n刘夏\t236960\n恒大影城\t236961\n细胞液\t236962\nTencent\t236963\n拟人句\t236964\n创安\t236965\n5万多\t236966\n钟茂森\t236967\nStandard\t236968\n航段\t236969\n诗赋\t236970\n林长制\t236971\n四、五\t236972\n电子数\t236973\n江河湖海\t236974\n荷兰花园\t236975\n设配\t236976\n苏宁云钻\t236977\n南瓜灯\t236978\n江油市人民政府\t236979\n出卷\t236980\ng0\t236981\n水月雨\t236982\n码工\t236983\n省院\t236984\n详写\t236985\nSpringMVC4\t236986\n商洛\t236987\n比亚修罗武神\t236988\npermission\t236989\n纣王\t236990\n足球直播网\t236991\n白贞\t236992\nlgv20吧\t236993\nSunshine\t236994\nxtr\t236995\nstanding\t236996\n搜狐女人\t236997\n颠鸾倒凤\t236998\n乌然堂\t236999\n汤水\t237
000\n西峰\t237001\n40次\t237002\npg2\t237003\nword安全模式\t237004\n北魏孝文帝\t237005\n江南长城\t237006\nsemaphore\t237007\n嫡出\t237008\n荆棘\t237009\nchinese\t237010\n振华石油\t237011\n正月十五\t237012\n沪人社\t237013\n微尘\t237014\n临沂职业学院\t237015\n32条\t237016\n净资\t237017\nbriefing\t237018\nIvey\t237019\n施工员\t237020\n江山如此多娇\t237021\n己婚\t237022\n杏林苑\t237023\n张幼仪\t237024\n营业成本\t237025\n郑丽\t237026\n美居\t237027\n67亿\t237028\n公允\t237029\n仿古瓦\t237030\n等离子空气消毒机\t237031\n逝水\t237032\n石门栈道\t237033\n柯南道尔\t237034\n洛奇英雄传\t237035\n膨化\t237036\n陈世峰\t237037\n乘号\t237038\n手术库\t237039\n冰葡萄酒\t237040\nstrust2\t237041\n中宠股份\t237042\npae\t237043\n登庸\t237044\n斐乐\t237045\n万能变声器\t237046\n丝毛\t237047\n悉尼机场\t237048\nRoadster\t237049\n维多利广场\t237050\nRival\t237051\n牡丹江大学\t237052\n2018年02月\t237053\ntsubomi\t237054\n会计类\t237055\n别情\t237056\nwol\t237057\nLPDDR3\t237058\n阮\t237059\nwuyu\t237060\n黄老\t237061\n索果\t237062\n复选框checkbox\t237063\n割腕\t237064\n四库全书总目\t237065\n债权转让协议\t237066\n郑锐彬\t237067\nooxx邪恶傉动图\t237068\n但凡\t237069\n虞美人\t237070\n房玄龄\t237071\n第七类\t237072\n蛋白质组学\t237073\n李村镇\t237074\n崔泽\t237075\n余建军\t237076\n广州市执信中学\t237077\n宝马天道图书馆\t237078\n核平\t237079\nPDB\t237080\n阿格里奇\t237081\n核磁共振仪\t237082\n编导\t237083\ndgc\t237084\n旅游圈\t237085\n深圳华侨城洲际大酒店\t237086\n信乐团\t237087\n大疆如影\t237088\n双指\t237089\n汽车继电器\t237090\n龙湖首开\t237091\nyongka\t237092\n花厨\t237093\n蒙受\t237094\nESM\t237095\n响叮\t237096\n圆周\t237097\n藩镇割据\t237098\n公司章程\t237099\n弄直\t237100\n仙剑ol\t237101\nLock\t237102\n相书\t237103\n不锈钢防盗网\t237104\n月湖区\t237105\nswftools\t237106\n哈雷883\t237107\nxgp\t237108\n宾夕法尼亚大学\t237109\n金火\t237110\n八珍丸\t237111\n高优\t237112\n中南大学湘雅三医院\t237113\n美心门\t237114\n流日\t237115\nexo\t237116\n大地色\t237117\n分晓\t237118\n二妙丸\t237119\n水电站\t237120\ncm3d\t237121\n顶红网\t237122\n20130726\t237123\n1万年\t237124\nGloria\t237125\n注会东奥\t237126\n江淮iev6e\t237127\n格律\t237128\n西京学院\t237129\n滚焊机\t237130\n鲤鱼池\t237131\nmoodle\t237132\n主委\t237133\nsamb\t237134\n汉韵\t237135\n吉祥语\t237136\n济宁市国土资源局\t237137\n14万亿\t237138\n波斯王子\t237139\n贝鲁特\t237140\n得爱\t237141\n海口市生态环境保护局\t237142\n骨灰盒\t23714
3\n王坚\t237144\nmito\t237145\n黄色片\t237146\nWIPO\t237147\n故城县\t237148\nUnit3\t237149\n死亡之谜\t237150\n两权\t237151\n五龙河\t237152\n襄州\t237153\n好啊\t237154\n宿城\t237155\n大公司\t237156\n恃强凌弱\t237157\nword行间距\t237158\n平面直角坐标系xOy\t237159\n卷走\t237160\n智跑\t237161\n贾蓉\t237162\n微颔\t237163\n故事情节\t237164\n贸\t237165\n忻州新闻网\t237166\n曾博\t237167\n连锁酒店\t237168\n刘博\t237169\n煤电\t237170\n莫卧儿帝国\t237171\ncontention\t237172\nvocab\t237173\nschematic\t237174\n状态栏\t237175\n大师兄\t237176\n万智牌\t237177\n深圳海洋世界\t237178\n决心\t237179\n老城墙\t237180\n制服少女\t237181\n累死累\t237182\n月亮河\t237183\n急便\t237184\n安徽省商务厅\t237185\n杭州邵逸夫医院\t237186\nMMI\t237187\n新鲜血液\t237188\njisuanji\t237189\nIrish\t237190\n4PX\t237191\n猴子们\t237192\n第30话\t237193\n会计学堂\t237194\n麦克利兰\t237195\nGNC\t237196\n摩托坊\t237197\n恒友\t237198\n老坛\t237199\n科莫多巨蜥\t237200\n贡山县\t237201\n稳定度\t237202\nkn\t237203\n5672\t237204\nproficiency\t237205\n黄炎\t237206\nardupilot\t237207\n海关信息网\t237208\n妻欲\t237209\nadding\t237210\n三峡机场\t237211\ntomorrowland\t237212\n6线\t237213\nPlayhome\t237214\n清见\t237215\n|赛\t237216\n杆头\t237217\n黄家驹\t237218\n出水芙蓉\t237219\nVST\t237220\n重庆清华中学\t237221\n开封市\t237222\n快乐学\t237223\n又笑\t237224\n维吾尔族\t237225\n文化教育\t237226\n双色球机\t237227\n海利尔\t237228\n3平方米\t237229\ncellular\t237230\n从上到下\t237231\nkk\t237232\n5种\t237233\n柯洛克\t237234\n卫体\t237235\n铁拳\t237236\n贞观大闲人\t237237\n内存修改器\t237238\n樱花园\t237239\n数十载\t237240\n尚普\t237241\nDealmoon\t237242\n阴谋论\t237243\n刘心悠\t237244\n光束灯\t237245\n九曳\t237246\n罗迪斯\t237247\nd7200\t237248\n影幕\t237249\n寄主\t237250\n6亿元\t237251\n机械设备有限公司\t237252\nLDN\t237253\n249亿\t237254\n汇典\t237255\n囊中羞涩\t237256\n丰盛集团\t237257\n井冈山大学\t237258\nLIANG\t237259\n胃肠型感冒\t237260\n国家纳米科学中心\t237261\ninfluence\t237262\n两片式\t237263\n太太口服液\t237264\n院标\t237265\n第六组\t237266\n肥力\t237267\n8000吨\t237268\n音程\t237269\n2万公里\t237270\n天域套\t237271\n一汽丰田威驰\t237272\n瑶乡\t237273\nond\t237274\n恒温干燥箱\t237275\n2256\t237276\n0635\t237277\n恒波\t237278\n浏览器端\t237279\nMARCO\t237280\n氮循环\t237281\njsvc\t237282\n兵荒马乱\t237283\n98度\t237284\n津市市人民政府\t237285\n牛腱肉\t237286\nEMC测试\t237
287\nISTP\t237288\n权房\t237289\n随县人民政府\t237290\n半殖民地半封建社会\t237291\n115号\t237292\n一千瓦\t237293\n界址\t237294\n天眼通\t237295\nxidian\t237296\n意经\t237297\n投票箱\t237298\n变通\t237299\n惠通\t237300\n厉娜\t237301\n绵阳科技城\t237302\n赫章县\t237303\n一行一列\t237304\n机动战士敢达ol\t237305\nhierarchical\t237306\nmakes\t237307\n物联网\t237308\n中间户\t237309\nclx\t237310\nIPFS\t237311\n招来\t237312\nTutoring\t237313\n二职\t237314\n直系\t237315\ngroupe\t237316\nIE6\t237317\n天艺\t237318\nprotos\t237319\n卷积码\t237320\nNSUser\t237321\nintel处理器\t237322\n小白福音\t237323\n管上\t237324\nL801\t237325\nAC88U\t237326\ninflammatory\t237327\nLeisure\t237328\n极客学院\t237329\n层板\t237330\nWako\t237331\n挂衣\t237332\nccw\t237333\n安徽工业大学\t237334\n日出江花红胜火\t237335\ns9\t237336\n布哈拉\t237337\n猴年\t237338\n集合石\t237339\n砂锅\t237340\n友诚\t237341\n张轩松\t237342\ndmm\t237343\nunet\t237344\nplaintext\t237345\nWegame\t237346\n赤峰市\t237347\n不破楼兰终不还\t237348\nstation\t237349\n南师\t237350\nVOA\t237351\nacos\t237352\nfolded\t237353\n十张\t237354\n赌圣3无名小子\t237355\n变文\t237356\n中等\t237357\nキセキ\t237358\nkermit\t237359\n飞沫\t237360\n锁相\t237361\n太原科技大学\t237362\ntushy\t237363\n驭势科技\t237364\n模角\t237365\n紫屏\t237366\n国房\t237367\n酸汤鱼\t237368\n缇\t237369\noppoa59s\t237370\n获赞\t237371\n1682亿元\t237372\n赶早\t237373\nWin10文件管理器\t237374\n圣导师\t237375\n热带季风气候\t237376\n推演\t237377\n修女\t237378\nKMS10\t237379\n科泰\t237380\n组训\t237381\n频临\t237382\n国子\t237383\n下泵\t237384\nLeicester\t237385\nWIZnet\t237386\n包政\t237387\n纽甜\t237388\n红井路\t237389\n腾发\t237390\nPUGB\t237391\nacaa\t237392\n素银\t237393\nhgame\t237394\n暗管\t237395\n阿兰德龙\t237396\ntuple\t237397\n国通信托\t237398\n缝章\t237399\n朱振华\t237400\n淡紫\t237401\n闪光\t237402\n战豆\t237403\n20160118\t237404\nexexl\t237405\n精灵语\t237406\n聚众\t237407\n氟维司群\t237408\n47%\t237409\nreadonly\t237410\n国泰安\t237411\n车刀\t237412\n部落冲突皇室战争\t237413\n⑥\t237414\n被推\t237415\n半导体激光器\t237416\n按键消抖\t237417\n闹掰\t237418\n镀铑\t237419\n根处\t237420\n跳测\t237421\n上源\t237422\n莫斯科行动\t237423\n性奴隶\t237424\n创造101\t237425\n二力\t237426\n氧吧\t237427\n中华人民共和国国家版权局\t237428\n训政\t237429\nliter\t237430\n凤飞\t237
431\nrule34\t237432\nOneNote2013\t237433\noutput\t237434\nIMEI码\t237435\n0310\t237436\n包工\t237437\n范儿\t237438\n91搜墓\t237439\n广百百货\t237440\nacl\t237441\nMisc\t237442\n手快\t237443\n常秋月\t237444\ne经\t237445\n书券\t237446\n放电\t237447\n俄海军\t237448\n无穷\t237449\n模形\t237450\n浪淘\t237451\n保拉\t237452\n主办权\t237453\n食盐\t237454\nblv\t237455\nSnapshots\t237456\n骨髓水肿\t237457\n下调\t237458\n放底\t237459\n发家88商机网\t237460\n生辰八字\t237461\n10秒以上\t237462\n优客逸家\t237463\n韦斯\t237464\nセックスフレンド\t237465\n亲权\t237466\nModeler\t237467\n仁爱版\t237468\n3186\t237469\nBill\t237470\n限薪\t237471\nroast\t237472\ncontent-type\t237473\n中国电建地产\t237474\nfuwu.huangye88.com\t237475\n0.1m\t237476\n邓小平理论\t237477\n地空\t237478\n石湖荡\t237479\n天脊龙门\t237480\n徐州市区\t237481\n宝鸡文理学院\t237482\n超级助手\t237483\nsl700\t237484\n教育学概论\t237485\n标准定额信息网\t237486\n莫过于此\t237487\nHORI\t237488\nTUT\t237489\nijl15.dll\t237490\n东升镇\t237491\nDAD\t237492\ndocumentation\t237493\n膜片\t237494\n木构\t237495\n种族灭绝\t237496\nUsers\t237497\n王年\t237498\n雅卓\t237499\n成发\t237500\n任玲\t237501\n炬力\t237502\n星动\t237503\n二维数组\t237504\n看看看\t237505\n300多块\t237506\nrice\t237507\ndiscover\t237508\n锌板\t237509\n和苑\t237510\n硬壳\t237511\n36平方厘米\t237512\n民生控股\t237513\n_文运法硕\t237514\n29.0\t237515\n全文本\t237516\n西南财经大学法学院\t237517\nzoos\t237518\n绿色守护\t237519\nDesigner10\t237520\nMid\t237521\nstudios\t237522\n五羊本田佳御\t237523\n19.2.1\t237524\nù\t237525\n受精卵\t237526\nmeven\t237527\n郁南县人民政府\t237528\n浙江商会\t237529\n造化之门吧\t237530\n117\t237531\n甲醇钠\t237532\n摇把\t237533\n火化\t237534\n板球\t237535\nWP7\t237536\n联合国常任理事国\t237537\n联评\t237538\ne生\t237539\nusg\t237540\n6亿美元\t237541\n张家口银行\t237542\nBB10\t237543\n攻略组\t237544\n逻辑驱动器\t237545\n如幻\t237546\n开班式\t237547\n作答\t237548\n自由行\t237549\n广金\t237550\n下腔静脉\t237551\n重庆大学经济与工商管理学院\t237552\n皂盒\t237553\n猛犸象\t237554\n内蒙古晨网\t237555\n清连高速\t237556\n大宋提刑官\t237557\n信誉群\t237558\nhereby\t237559\n庄股\t237560\n四生\t237561\n三桥村\t237562\nstayin\t237563\n哥斯拉2014\t237564\n神州谣\t237565\nDJ\t237566\noutreg2\t237567\n笔下\t237568\n安康市人民政府\t237569\n中钢\t237570\n江苏卫视荔枝台_江苏卫视\t237571\npr
omotes\t237572\n彩虹色的花\t237573\n卤代烷\t237574\n智农\t237575\n高尔夫rline\t237576\n1km\t237577\n优能佳\t237578\n史密斯热水器\t237579\nmx500\t237580\n古剑奇谭\t237581\n蒸脸器\t237582\n吴晓敏\t237583\n朱育帆\t237584\nBasement\t237585\nnaati\t237586\n廊坊银行\t237587\n帅帅\t237588\n山西国税\t237589\nemoji\t237590\n伊利集团\t237591\n磋\t237592\n2017年9月15日\t237593\nhuan\t237594\n热端\t237595\nE7240\t237596\nSebastian\t237597\n七重\t237598\n连铸\t237599\n器油\t237600\ntxtx\t237601\n芦台镇\t237602\nmechanical\t237603\n龙岩火车站\t237604\n马子方\t237605\n取之于蓝\t237606\n高黎贡\t237607\nzhonghua\t237608\n哈西万达广场\t237609\ntmb\t237610\nGLS\t237611\n西南大学\t237612\ntrt\t237613\n污泥泵\t237614\n芙蓉花\t237615\nプリ\t237616\n25a\t237617\n极豆\t237618\nKING磊\t237619\n万科瑧山府\t237620\n八头\t237621\n联学\t237622\n0395\t237623\nak\t237624\n防水剂\t237625\naaf\t237626\ndefence\t237627\n笛箫\t237628\n128fn\t237629\n陈威\t237630\nYield\t237631\n嘟嘟\t237632\n串烧\t237633\n康雍乾\t237634\nbc省\t237635\n财务分析\t237636\n小贝\t237637\n很大\t237638\n武器\t237639\n玩忽职守罪\t237640\n张宏良\t237641\n鹭岛\t237642\n成都市国资委\t237643\n巨流河\t237644\n合加\t237645\nnand\t237646\n吴绮莉\t237647\n阳光城集团股份有限公司\t237648\n阳光采购网\t237649\n付海东\t237650\nvspds\t237651\n15段\t237652\n胚子\t237653\n漆柚\t237654\n起笔\t237655\nzimmer\t237656\n20141025\t237657\n优士\t237658\n敬老院\t237659\nUltima\t237660\n2016年11月19日\t237661\n直肠指检\t237662\ntab键\t237663\n优录\t237664\n刘国正\t237665\n徐骏\t237666\n十三条\t237667\n黑龙江省教育学院\t237668\n毛振华\t237669\nEvidence\t237670\n康爱多\t237671\n瀍河回族区\t237672\n动态值\t237673\n伐檀\t237674\n优甲乐\t237675\n定襄县\t237676\n齐打交火\t237677\nsrd\t237678\n飞来飞去\t237679\n北大附中\t237680\n签约率\t237681\n猥亵儿童罪\t237682\n增收\t237683\nbigint\t237684\nImportError\t237685\n7届\t237686\n德勤俱乐部\t237687\n押宝\t237688\nPMMA\t237689\nGLI\t237690\n补助费\t237691\n山东墨龙\t237692\n智慧城市网\t237693\n家可归者\t237694\n沙坪坝\t237695\n决战紫禁之巅\t237696\n土种\t237697\n鼓舞\t237698\n梨涡\t237699\n被免\t237700\n35kg\t237701\n天河东路\t237702\n明港\t237703\n语文卷\t237704\n氧化性\t237705\n非税\t237706\n抿\t237707\n不务正业\t237708\n普通人\t237709\n镍镉\t237710\n问道\t237711\n尚城国际\t237712\ntsing\t237713\n电脑蓝屏\t237714\n联调\t237715\nX-FORCE\t
237716\n美签\t237717\n旭阳集团\t237718\n糖瓜\t237719\n一条杠\t237720\n爆护\t237721\n浙江省疾病预防控制中心\t237722\n青瓷\t237723\nwi1000x\t237724\n沙门菌\t237725\n英纳格\t237726\n华立\t237727\n千亩\t237728\n百物\t237729\n东京战纪\t237730\n辛德瑞拉\t237731\n增广\t237732\nLok\t237733\n开飞\t237734\n烧鸡公\t237735\n天启之门\t237736\n监装\t237737\n纲\t237738\n和值谜\t237739\n谢晓飞\t237740\n肇祸\t237741\n天津市儿童医院\t237742\n软石\t237743\nmkv/\t237744\n奈奎斯特\t237745\n1-9月\t237746\nhantai\t237747\ndw2017\t237748\n北京三院\t237749\n亥伯龙\t237750\n技高一筹\t237751\n4600\t237752\n新娱乐在线\t237753\n昨夜雨疏风骤\t237754\n1发\t237755\n110家\t237756\n爱自由网\t237757\n茶店\t237758\ncdm\t237759\n董老师\t237760\n75_\t237761\n哈德门\t237762\n王鼎杰\t237763\n铜仁一中\t237764\n连网\t237765\n建档立卡户\t237766\n温泉镇\t237767\n海淀北部\t237768\n7死\t237769\n陈尸\t237770\n袁朗\t237771\n凤凰新华书店\t237772\n三星w2018\t237773\ncgfloat\t237774\n互\t237775\nbootimg\t237776\n中国中医药出版社\t237777\n鱼虫\t237778\n工务\t237779\n霍天洪\t237780\n食色\t237781\n仙剑情缘\t237782\n左转弯\t237783\nAero\t237784\n来朝\t237785\n魏征\t237786\nuptown\t237787\n泰兴经济开发区\t237788\n第56章\t237789\n一键越狱\t237790\n叠石桥\t237791\nzen7\t237792\n200万星光级\t237793\n中国银行跨行\t237794\nHaining\t237795\nTelephone\t237796\n卷理综\t237797\n贝太\t237798\nRisks\t237799\n弹力绳\t237800\n刀架\t237801\n裕龙\t237802\n少昊\t237803\n神池\t237804\n金蝶易记账\t237805\n荣成\t237806\n杨过\t237807\n老黑\t237808\n扣式\t237809\n濑亚美莉\t237810\n天天乐\t237811\n何雅玲\t237812\ntextread\t237813\n家鹅\t237814\nplating\t237815\nyyxf\t237816\n翩翩体\t237817\n一册\t237818\nBroadLink\t237819\n20170514\t237820\nCoastal\t237821\n雪炎\t237822\n山河\t237823\n40407游戏网\t237824\n四平路校区\t237825\n揣\t237826\n加载器\t237827\n分众传媒\t237828\n中山一院\t237829\n晋阳湖\t237830\n产出比\t237831\n共渗\t237832\ncolorful\t237833\nav伦理片\t237834\n中国常州网\t237835\n复活赛\t237836\n2017-07-04\t237837\n中科院宁波材料所\t237838\n铁铬\t237839\n4.1亿\t237840\n延世\t237841\nOz\t237842\n2009年6月\t237843\n我和僵尸有个约会1\t237844\n影音先锋看片网站-影音先锋AV-影音先锋\t237845\n黄河故道\t237846\n3T\t237847\n低眉不问俗尘非\t237848\n无视\t237849\n/箱\t237850\n分立\t237851\nmega2560\t237852\noneplus\t237853\n君成\t237854\nx-force\t237855\n扒光\t237856\n中国恩菲工程技术有限公司\t237857\n系法\t
237858\n双优卷\t237859\n人工鱼群算法\t237860\n唐薇\t237861\n小儿黄疸\t237862\n7kw\t237863\n上海宏康医院\t237864\n倒谱\t237865\n一个四一个\t237866\n改行\t237867\n黑柴\t237868\n啵乐乐\t237869\n博会\t237870\n一模二模\t237871\n伪声\t237872\n网格状\t237873\n延误\t237874\nAmigo\t237875\n金妮\t237876\n湖南人力资源社会保障\t237877\n柯林\t237878\n奇瑞风云2\t237879\nBT侠\t237880\nsog\t237881\n阿奇\t237882\n小祥子\t237883\n马宏伟\t237884\n复兴村\t237885\n好范文网\t237886\n模模\t237887\n慕诗琪\t237888\n情亲\t237889\n特克斯县\t237890\n可加性\t237891\n方敏\t237892\nEdu\t237893\n和平北路\t237894\n冰壶\t237895\n脉冲群\t237896\n1953\t237897\n有利网\t237898\n华兰\t237899\n江西日报社\t237900\n猩球崛起2\t237901\n黄伊汶\t237902\n身受\t237903\n贾敏\t237904\ngathered\t237905\n妈妈的话\t237906\n看得起\t237907\n绝地求生直播_绝地求生视频攻略解说_绝地求生\t237908\n周向\t237909\n唷\t237910\n刘洵\t237911\n魏德米勒\t237912\n电信公司\t237913\n秀山网\t237914\n陶瓷\t237915\n直仪\t237916\n邮储银行\t237917\niPhone7s\t237918\n首制\t237919\n57号\t237920\n壹号皇庭5\t237921\n驻训\t237922\n第十二次\t237923\n奇痒\t237924\n龙巅罗汉鱼\t237925\n180325\t237926\n福利\t237927\n西部证券股份有限公司\t237928\n多越好\t237929\n百亿级\t237930\n14关\t237931\n精神分裂症\t237932\n茹萍\t237933\n推土机\t237934\n三峡大坝\t237935\n丁汝昌\t237936\n清云\t237937\n杜金雨\t237938\n野生技术协会\t237939\n1600多\t237940\n福建银监局\t237941\n聚乙烯丙纶\t237942\n投稿\t237943\nK线\t237944\n5.4.8\t237945\n1485\t237946\n巨贪\t237947\n2一\t237948\n刊刻\t237949\n获选\t237950\n热冲击\t237951\nNAXX\t237952\n王大力\t237953\n金链\t237954\nohm\t237955\nmavenjar\t237956\n模板变量\t237957\njingwhale\t237958\n蓝点网\t237959\n无籽西瓜\t237960\n双宋\t237961\n7za\t237962\n洗涤器\t237963\nTIMER\t237964\n去年底\t237965\n团中央学校部\t237966\n西安交通大学电气工程学院\t237967\nTinyMCE\t237968\n三目\t237969\n安阳百姓网\t237970\n3^2\t237971\n凤歌灵飞经\t237972\n中甲联赛\t237973\n天黑请闭眼\t237974\n20170622\t237975\n无依无靠\t237976\n兵团网\t237977\n东疆湾\t237978\n马艳\t237979\nElementUI开发后台管理系统\t237980\nx20plus\t237981\n婆婆丁\t237982\n寿阳\t237983\n病书\t237984\nplt\t237985\n加息预期\t237986\n耽于\t237987\n圣菲\t237988\n每一颗\t237989\n扑鱼\t237990\n詹一美\t237991\n密子君\t237992\n广东省中学\t237993\n自胜\t237994\nTRACK\t237995\nprotuse\t237996\n锡安国家公园\t237997\n铜仁南站\t237998\n铝梯\t237999\n王佐良\t238000\n姓墨\t238001\n永磁直流电机\t2380
02\n林英\t238003\n阴媒\t238004\n滚槽机\t238005\n赵爽\t238006\n解方程组\t238007\n高邮市政府网\t238008\nentitlements\t238009\nnone\t238010\n39军\t238011\n张玉凤\t238012\n冷热水管\t238013\n办学许可证\t238014\n大约\t238015\ncommission\t238016\nvsftpd\t238017\n当饭吃\t238018\n渔火\t238019\n高分子材料与工程专业\t238020\n乘坐\t238021\n囡囡\t238022\n三明市教育局\t238023\n空调过滤网\t238024\nWild\t238025\n直说\t238026\nmuyouking\t238027\n中华人民共和国文物保护法\t238028\n云山镇\t238029\n领悟\t238030\n儒法\t238031\n圣僧\t238032\n野地\t238033\n去极端化\t238034\n北信源\t238035\n600887\t238036\n汾西县\t238037\nrundll32\t238038\n张虎\t238039\n产成品\t238040\nsolo谱\t238041\n森友\t238042\n合战三国\t238043\n漏气\t238044\nfackbook\t238045\n1812年\t238046\n巨潮网\t238047\n福州长乐机场\t238048\n贵州省纪委\t238049\n90000\t238050\n复审\t238051\neverything\t238052\n治丧\t238053\n冲走\t238054\n鲁迅公园\t238055\n冬季恋歌\t238056\n灯火辉煌\t238057\n暮光之眼\t238058\n陈翔路\t238059\nNMB\t238060\n攀枝花市商业银行\t238061\n熟练者\t238062\n911美剧网\t238063\n民事纠纷\t238064\n段建业\t238065\n陕西\t238066\n20180308\t238067\n群子\t238068\n下课铃\t238069\n陈师曾\t238070\n4众评\t238071\n杭电\t238072\n蝴蝶牌\t238073\n长津湖战役\t238074\n邬贺铨\t238075\n1899年\t238076\ncircuit\t238077\nsouvenir\t238078\n昆明信息港\t238079\n安身\t238080\n西洋参片\t238081\n新河\t238082\n乌石\t238083\n淘宝交易\t238084\n最终幻想9\t238085\nGraffiti\t238086\n青岛火车站\t238087\n成都文殊院\t238088\n竹尖\t238089\n烟魁\t238090\n三宝殿\t238091\n食草\t238092\n安全绳\t238093\n在线云点播\t238094\n春江郦城\t238095\n凰妃\t238096\njlzz\t238097\n列为\t238098\n鲁尔\t238099\n双开双控\t238100\n精华版\t238101\nasp\t238102\n从权\t238103\n电子盘\t238104\n车胎\t238105\n盗撮\t238106\nextrusion\t238107\n康宝莱\t238108\n晶形\t238109\n饭酒\t238110\n百度api\t238111\n100KW\t238112\n6x6\t238113\n定语从句\t238114\n大邵网\t238115\n绿色供应链\t238116\n需具备\t238117\n试运\t238118\nvol.1\t238119\n描声\t238120\n末班\t238121\n尚座\t238122\n执象\t238123\n凯德天府\t238124\n马壮\t238125\n唱名\t238126\n搬兵\t238127\n夕颜\t238128\n车柱\t238129\n1.0.27\t238130\n一个八\t238131\n千里\t238132\n西南交通大学土木工程学院\t238133\n沃4G+\t238134\n汉简\t238135\nGLib\t238136\n只手遮天\t238137\nade\t238138\n主听\t238139\n1345\t238140\n刘萍\t238141\n500KW\t238142\n名值\t238143\n祖尔法拉克\t238144\n裴勇俊\t238145\n诺维茨基\t238146
\nAdvocacy\t238147\n每句话\t238148\n刘天\t238149\n武城县\t238150\n4294967295\t238151\n归咎\t238152\n淘宝助理批量\t238153\n360急救盘\t238154\nreview\t238155\n罗克韦尔自动化(中国)有限公司\t238156\nDING\t238157\n打毛\t238158\nRico\t238159\n女老董\t238160\n百田\t238161\n姆\t238162\ntranscript\t238163\n洗完\t238164\n第二重\t238165\n9天\t238166\n逛\t238167\n觅春网\t238168\nPapillon\t238169\n陈国汉\t238170\n间接税\t238171\n果色\t238172\nrepolist\t238173\n宁夏教育考试院\t238174\n博悦\t238175\n80000003\t238176\nCENTRAL\t238177\n狗娃\t238178\n章鱼哥\t238179\n家庭教育学\t238180\n相通\t238181\n唐建\t238182\n段希帆\t238183\n武警后勤学院附属医院\t238184\nNST\t238185\nWestone\t238186\n专治\t238187\n哈尔滨地铁2号线\t238188\n希尔斯\t238189\n企业品牌形象\t238190\n绝命毒师\t238191\n第三极\t238192\n五级三阶制\t238193\n陋巷\t238194\nBASH\t238195\n细杆\t238196\n诚创\t238197\n同利\t238198\n熟男\t238199\n成都双流\t238200\n虐肛\t238201\n火急\t238202\n插编\t238203\n北京高速公路\t238204\n一起去旅行\t238205\n抗磨剂\t238206\n3盘\t238207\nMali\t238208\n赛车群\t238209\n纸类\t238210\n吸尘机\t238211\n茶具\t238212\n有责\t238213\n熔丝\t238214\n防病\t238215\n澳式橄榄球\t238216\n鼓岭\t238217\nVGG16\t238218\nTokens\t238219\n36倍\t238220\nadsorption\t238221\n偏瘫\t238222\nSQL语\t238223\n2尤里\t238224\n油然而生\t238225\n合并症\t238226\n滑移装载机\t238227\nMan\t238228\n120分钟\t238229\n华东师范大学图书馆\t238230\n华东理工大学网络教育学院\t238231\nByte\t238232\n发张\t238233\n钰珑湾\t238234\n淡干海参\t238235\nlaonian\t238236\n面无表情\t238237\n合浦县人民政府\t238238\n票处\t238239\n地灾\t238240\nv5.7.2\t238241\n平衡力\t238242\n庞明\t238243\n德智体美劳\t238244\nc编程\t238245\n向涛\t238246\n深南\t238247\n浦发美国运通\t238248\n北京城墙\t238249\n喵哥\t238250\n长沙分公司\t238251\n什们\t238252\n擦亮眼睛\t238253\n多效唑\t238254\n权益类\t238255\n第二十一批\t238256\n纵欲\t238257\n李俊宏\t238258\n熊杰\t238259\n包皮系带\t238260\n紧张感\t238261\n新宋体\t238262\n发展型\t238263\nJob\t238264\n耙子\t238265\nELF\t238266\n三级片\t238267\n乌马河\t238268\n水槽\t238269\n李秀\t238270\n残叶\t238271\n0xC004F074\t238272\n广州市红十字会医院\t238273\n曾\t238274\ncctv8\t238275\n小情歌\t238276\n中国汽车网\t238277\nhal\t238278\n天梦\t238279\n望京店)\t238280\n杨一\t238281\n胃糜\t238282\n衡器\t238283\nDCP-1608\t238284\nBasket\t238285\n弹尽粮绝\t238286\n密圈\t238287\n施克辉\t238288\n美梦成真\t238289\n免烫\t23
8290\n水平管\t238291\n马斯卡彭\t238292\n单飞\t238293\n毛纺\t238294\n活下\t238295\nPowerDesigner15\t238296\noppoR11\t238297\n重度贫血\t238298\nCBE\t238299\n鼻\t238300\n硝酸钙\t238301\n26.0.0\t238302\n地雷阵\t238303\n30多家\t238304\n产仔\t238305\n七块\t238306\n高仿劳力士\t238307\n防爆照明\t238308\nreimport\t238309\n魏州\t238310\n20170830\t238311\n一大串\t238312\n闸瓦\t238313\nPETG\t238314\nJONES\t238315\n勋\t238316\n封茗囧菌\t238317\n代运\t238318\n人体传感器\t238319\n江岸\t238320\n几千块\t238321\n龙岗天安数码城\t238322\n高野山\t238323\n明子\t238324\n韦驮\t238325\n威威\t238326\n58度\t238327\n海林市\t238328\n虎跑泉\t238329\n2胎\t238330\n金多\t238331\n触手文\t238332\n压轮\t238333\n法庭\t238334\n血拼\t238335\n阿里\t238336\nrslogix5000\t238337\n张常汤珈铖\t238338\n15场\t238339\n大般若经\t238340\ncarcinoma\t238341\ni%2\t238342\n删帖\t238343\nUnix/Linux\t238344\n东方法学\t238345\n中文转换\t238346\n匀强电场\t238347\n托运\t238348\n阳离子染料\t238349\ngsj\t238350\n小杨\t238351\n炉灶\t238352\n塞隆\t238353\n虚线\t238354\n赵小军\t238355\n徐芳\t238356\n802.1Q\t238357\n刘瑞琦\t238358\n第八大\t238359\n預訂\t238360\nbehaviac\t238361\n风华正茂\t238362\n湖南省中医药研究院附属医院\t238363\nJolla\t238364\n红笺\t238365\n杨婧\t238366\n超级马里奥:奥德赛\t238367\n酷播\t238368\n动作类\t238369\nldc\t238370\n净土\t238371\n炉石传说圣骑士\t238372\n60张\t238373\n李璇\t238374\n托福网考\t238375\n区行政服务中心\t238376\n2016.3.2\t238377\nXCODE\t238378\n泸水县\t238379\n0411-39569525\t238380\n第48届\t238381\n哑巴亏\t238382\n跟读\t238383\n天梭吧\t238384\nCaoPorn97\t238385\n保利国宾\t238386\ncouch\t238387\n明星大侦探\t238388\n草莓慕斯塔\t238389\n2553\t238390\n老光\t238391\n龙门教育\t238392\n三吉\t238393\n清华大学教育研究院\t238394\n独木桥\t238395\n西南财大\t238396\n肿瘤君\t238397\ncopolymer\t238398\n义务人\t238399\n)实业有限公司\t238400\n连环计\t238401\n23.3\t238402\nMAYDAY\t238403\n试用品\t238404\n华东在线\t238405\n霞浦\t238406\n经济学专业\t238407\n山东大学学报\t238408\n钍\t238409\n2017-12-14\t238410\n低敏\t238411\n二拍\t238412\n期货投资分析\t238413\nChloé\t238414\n桜\t238415\n98集\t238416\n营养土\t238417\n环艺\t238418\n化学阉割\t238419\n银湖山庄\t238420\nChunk\t238421\nエルフ\t238422\n痛楚\t238423\n430\t238424\n智品\t238425\n危旧房\t238426\n20170502\t238427\n后门\t238428\n网络库\t238429\n100辆\t238430\n守约方\t238431\n桃子&鱼仔\t238432\n2e\
t238433\n米重\t238434\n银龙鱼\t238435\n未就业\t238436\n刘坤\t238437\n杭行路\t238438\n常宁市\t238439\n咪咪\t238440\n阿狸\t238441\nqtcreator\t238442\n公报\t238443\n铃木光司\t238444\n一方人\t238445\n嘉兴19楼\t238446\n伤逝\t238447\n376号\t238448\nhcv\t238449\n福建教育出版社\t238450\nS-400\t238451\n1.5T\t238452\n单音版\t238453\n利害\t238454\n万科新都会\t238455\n20170704\t238456\n刘文西\t238457\n硚口\t238458\n蜂箱\t238459\n预习\t238460\n指示性\t238461\n江上渔者\t238462\n操劳\t238463\nstr2num\t238464\nEaoron\t238465\nutovr\t238466\n功角\t238467\n素匠\t238468\n水芹菜\t238469\n徐阶\t238470\n雕虫\t238471\nhofstede\t238472\nWindows8_\t238473\n婉约\t238474\naap\t238475\n易查\t238476\n965M\t238477\nc1s\t238478\n2181\t238479\n技术指导\t238480\n08级\t238481\ndunhill\t238482\n1814\t238483\n七濑\t238484\n低碳\t238485\n心脏科\t238486\n牧民\t238487\nqualified\t238488\n朝鲜半岛\t238489\n结仇\t238490\n酉时\t238491\nHua\t238492\n2603\t238493\n柠檬膏\t238494\n美声唱法\t238495\nNBA2k14\t238496\n鸡蛋果\t238497\n拿把\t238498\n公开户\t238499\n坛友\t238500\n昭通学院\t238501\n洛克精灵战记\t238502\n12320挂号\t238503\n六哥\t238504\nresorts\t238505\n人间美姬\t238506\n益学堂\t238507\n全不\t238508\n演绎法\t238509\n琴头\t238510\nCOPD\t238511\n三短\t238512\nwindows分区\t238513\n塞拉\t238514\n变坏\t238515\n48首\t238516\n又见\t238517\n安徽电子信息职业技术学院\t238518\nour\t238519\n升压站\t238520\n古早味蛋糕\t238521\n2013届\t238522\n智享\t238523\nc服\t238524\nprivileged\t238525\n尼古拉斯·霍尔特\t238526\nCompatibility\t238527\nGenomic\t238528\n体验感\t238529\n三相电\t238530\n机量\t238531\n固化炉\t238532\n账目\t238533\n兰州西客站\t238534\n钟平\t238535\ne派\t238536\n透气阀\t238537\n河北省高速公路管理局\t238538\n32_\t238539\npm\t238540\n怒放\t238541\n裕华\t238542\n贫困地区\t238543\n600分钟\t238544\n李文静\t238545\n1.6mm\t238546\n切口处\t238547\n电视棒\t238548\nszb\t238549\n朱武\t238550\n近親\t238551\nJapan\t238552\n金蝶云_\t238553\nPumping\t238554\n金韵蓉\t238555\ninfrast\t238556\n透视裙\t238557\n第九类\t238558\n李红\t238559\n德保县\t238560\n超过60天\t238561\n北洋大学\t238562\n保安族\t238563\n明珠港\t238564\n呋虫胺\t238565\n胶味\t238566\n戒网\t238567\n柯基犬\t238568\n村长\t238569\n5.6.10\t238570\n文武贝\t238571\n决定书\t238572\n第140期\t238573\n人民防空办公室\t238574\n洛道\t238575\n房门\t238576\n鄂托克之窗\t238577\n_成都
政府网\t238578\nCheapest\t238579\n天一家园\t238580\n孙丹\t238581\nv形\t238582\nkenn\t238583\nvitual\t238584\n墨鱼干\t238585\n百套\t238586\n失真\t238587\n王侃\t238588\n肠胃炎\t238589\n煮锅\t238590\nVultr中文网\t238591\n盆池\t238592\n28首\t238593\nmdt\t238594\nhazards\t238595\n园博园\t238596\n金刚功\t238597\n法兰\t238598\nwildlife\t238599\n纪英男\t238600\n疏水\t238601\n镇安县人民政府\t238602\n界王拳\t238603\n瑞鸟\t238604\n昰\t238605\nswe\t238606\n程力专用汽车股份有限公司\t238607\n脚围\t238608\n遥测\t238609\n苏州市交通运输局\t238610\n青海省住房和城乡建设厅\t238611\nhm\t238612\nJPMorgan\t238613\n20cr\t238614\nVD\t238615\n曹辉\t238616\n惩戒\t238617\n高士奇\t238618\nmfc7340\t238619\n五常稻花香大米\t238620\n象帕\t238621\n唐斌\t238622\n青岛市水利局\t238623\n查派\t238624\n陕西省天然气股份有限公司\t238625\n速胜\t238626\n第01集\t238627\n占星者\t238628\n无极县人民政府\t238629\nlogos\t238630\n双歧杆菌三联活菌胶囊\t238631\nCN2\t238632\n烤鱼\t238633\nmusql\t238634\n国机集团\t238635\n肖伟\t238636\n错埠岭\t238637\n生物类\t238638\n章文俊\t238639\n3.5kg\t238640\n自卸\t238641\n荻花\t238642\n伏\t238643\nProtected\t238644\n单身群\t238645\n金牛\t238646\nbootdo\t238647\n廊坊车管所\t238648\n捕蝇纸\t238649\n新罗马\t238650\n日月同辉\t238651\n赣州市区\t238652\nmice\t238653\n流和\t238654\njobject\t238655\n七仙女\t238656\nCordless\t238657\n一轻\t238658\n取保候审\t238659\n洋妞\t238660\njs\t238661\n六年级语文下册\t238662\n00942\t238663\n李欣然\t238664\n菜菜绪\t238665\nSVHC\t238666\n卫生许可证\t238667\nyotube\t238668\n东营市\t238669\n陆婷\t238670\n绿帽哀歌:女友出轨日记\t238671\n九州海上牧云记\t238672\nragnarok\t238673\n大连晚报\t238674\naircrack-ng\t238675\n毕业旅行\t238676\n红星农场\t238677\nheart\t238678\n夜总\t238679\n卷积神经网络CNN\t238680\n远程教育学\t238681\n知网cnki\t238682\n破坏力\t238683\n39.99\t238684\n渴望\t238685\n中百集团\t238686\n铳梦\t238687\n魂套\t238688\n3103\t238689\n湛江一中培才学校\t238690\n2轴\t238691\n9月2号\t238692\n9下\t238693\n菡萏\t238694\n喂喂\t238695\n蔡老板\t238696\n高低肩\t238697\n亲生女\t238698\nRevlon\t238699\n哈南\t238700\n紫金县\t238701\n同景\t238702\n宇都宮\t238703\n导行\t238704\n东风雪铁龙世嘉\t238705\n非执业\t238706\nab蓝学网\t238707\n4第四章\t238708\n服务态度\t238709\nBitch部\t238710\n张傲月\t238711\n香米\t238712\n后进村\t238713\n26p\t238714\n5月28\t238715\n傣族\t238716\n斐济岛\t238717\n秀玉\t238718\nGary\t238719\n缔约\t
238720\n爱人的谎言\t238721\n小橘\t238722\npiece\t238723\n/年\t238724\n温州医科大学仁济学院\t238725\n陈文浩\t238726\n西藏城投\t238727\n中国共产党基层组织选举工作暂行条例\t238728\n宝华山\t238729\n中山市食品药品监督管理局\t238730\n7420\t238731\n蝙蝠侠:阿甘骑士\t238732\n苏州大学》\t238733\n兰蔻\t238734\nNouveau\t238735\n北京幼升小\t238736\n晶粒\t238737\n福建联迪商用设备有限公司\t238738\n女模\t238739\n死疫\t238740\n张皓\t238741\n4天前\t238742\n核拨\t238743\n宽慰\t238744\n蹼\t238745\n文娟\t238746\n48平方\t238747\n运费\t238748\n_安游在线\t238749\n税码\t238750\n一大拨\t238751\n3220\t238752\ndwarf\t238753\n送君千里\t238754\n人口众多\t238755\n石塑\t238756\n归去来\t238757\nliguangsunls\t238758\n东方雅苑\t238759\n张晓军\t238760\n天天象棋三国演义\t238761\n三分地\t238762\nRhode\t238763\n克夫\t238764\n张小宇\t238765\n黑作坊\t238766\n关节痛\t238767\nbitlock\t238768\n刘金琨\t238769\n凝血因子\t238770\n江苏银行手机银行\t238771\n中咨\t238772\n苏35\t238773\nOPPOR7\t238774\nC++17\t238775\n第33页\t238776\n华新水泥股份有限公司\t238777\n85387800\t238778\nbrandi\t238779\nZtree\t238780\n少年强则国强\t238781\n前距\t238782\n重安\t238783\n归属感\t238784\nunderarmour\t238785\n梅高\t238786\n红庙\t238787\n2007年4月\t238788\n摩米士\t238789\n调平\t238790\n第135集\t238791\n自刎\t238792\nSunlight\t238793\n流芳\t238794\n艾莉莎\t238795\ntop8\t238796\nWindows程序设计\t238797\n醇酸\t238798\nMotorcycle\t238799\nwinds10\t238800\n上门\t238801\n歼7\t238802\n枯黄\t238803\n赚得\t238804\n192.168.3.1\t238805\n中科软科技股份有限公司\t238806\narraylist\t238807\n常用名\t238808\n中心岛\t238809\n面条菜\t238810\n好久不见\t238811\n东方明珠\t238812\nAD15\t238813\n失当\t238814\n恶魔人\t238815\n中共中央直属机关采购中心\t238816\n捷森\t238817\n汉语词典_词典网\t238818\n20160120\t238819\n500行\t238820\n毛概\t238821\n李晨阳\t238822\n0085268404392\t238823\n盐炉\t238824\n鑫辉\t238825\n网驰\t238826\n小升衔接\t238827\n并州\t238828\n锦隆\t238829\n17951\t238830\n另行\t238831\n传奇\t238832\n刘利\t238833\n张德德\t238834\nPromoting\t238835\n第二分册\t238836\n四川美丰\t238837\n正威国际集团\t238838\n波克基斯\t238839\n歃血\t238840\n格里菲斯\t238841\n翔殷路\t238842\n新气\t238843\n粪池\t238844\n中单词\t238845\n杨家将\t238846\n几百遍\t238847\n罗技m558\t238848\n34周\t238849\n将军纪念馆\t238850\n泪阜\t238851\nRaj\t238852\n宁波市人大常委会\t238853\n巨商\t238854\n协卡助手\t238855\nEmails\t238856\n月落\t238857\n做错了事\t238858\n
互联网理财\t238859\n环境科学概论\t238860\nopengl32.dll\t238861\n入社\t238862\n花溪牛肉粉\t238863\n增减员\t238864\n不共线\t238865\n无相劫\t238866\n念诵\t238867\n凹凸棒\t238868\n野心与终局\t238869\n流言板\t238870\n读音\t238871\nConditions\t238872\n新欢\t238873\n来说说看\t238874\n疱疹病毒\t238875\nE470c\t238876\n干渴\t238877\n井工\t238878\n大场镇\t238879\n大勇\t238880\n大包子\t238881\n闪存\t238882\n天气预警_天气网\t238883\n字串\t238884\nISA\t238885\n第9轮\t238886\nlephone\t238887\n破坏王\t238888\n杨贤硕\t238889\n毕节地区\t238890\n拢\t238891\n福特斯\t238892\n附能\t238893\n棉絮状\t238894\n西安咸阳机场\t238895\n酥软\t238896\n竹纸\t238897\n蜈蚣丸\t238898\n石料\t238899\n崛起之路\t238900\n20140110\t238901\nnormalization\t238902\n研究员\t238903\n鷇音子\t238904\n雨花台\t238905\n李翔\t238906\nepson1390\t238907\n刘志丹\t238908\n董杰\t238909\n史济怀\t238910\n利菁\t238911\n非开挖\t238912\n一贫\t238913\n中塑在线原料库\t238914\n赌博默示录\t238915\nMOUNT\t238916\n食发\t238917\nFOOK\t238918\nJiaThis\t238919\n彰显\t238920\n临床肿瘤学\t238921\n阿里达摩院\t238922\nsvg/woff/woff2\t238923\n5000张\t238924\n进件\t238925\n城市花园\t238926\n固量\t238927\n直插式\t238928\n宽裕\t238929\n徐金慧\t238930\n11G\t238931\n崔维星\t238932\n关颖\t238933\n金饼\t238934\n睢冉\t238935\n狼与香辛料\t238936\nMandela\t238937\n患癌\t238938\n游泳馆\t238939\n催化器\t238940\n乱子\t238941\nProvinces\t238942\n泥土\t238943\n银川机场\t238944\nhmr\t238945\nstoi\t238946\nGrease\t238947\n文景之治\t238948\n双摄\t238949\n2018年2月17日\t238950\n大便\t238951\nマン\t238952\n货店\t238953\n大赛\t238954\nssleay32.dll\t238955\nKestrel\t238956\n三明市三元区\t238957\n推理\t238958\nkyb\t238959\ncamel\t238960\nupvc管\t238961\n轮船\t238962\nspring4.x\t238963\nargc\t238964\n莫阿娜\t238965\n北京十一学校\t238966\n城区段\t238967\n混频器\t238968\n立花馆恋爱三角铃\t238969\nmifi\t238970\n脱毛器\t238971\n同音\t238972\nCanopy\t238973\n江苏教育考试院\t238974\n谈场\t238975\n终极Shell\t238976\n陆小千\t238977\n不温不火\t238978\n维普查重\t238979\n唐山站\t238980\n罗振宇\t238981\n猎豹清理\t238982\n凯莱英\t238983\n清水河县\t238984\n呼之欲出\t238985\n物镜\t238986\n文兰\t238987\nlogger\t238988\nNorm\t238989\nxī\t238990\ncdma2000\t238991\n保修金\t238992\n居所\t238993\n瘀斑\t238994\n绿化苗木\t238995\nBZ\t238996\n父传子\t238997\n鼓气\t238998\n中南世纪雅苑\t238999\nTHB\t239000\n短文学网\t239001\nn
obody-junior\t239002\nextjs6\t239003\n中国制造交易网\t239004\n悖论\t239005\n蒙牛酸酸乳\t239006\n我喜\t239007\n哑光\t239008\n生好\t239009\n信托贷款\t239010\n伊苏8\t239011\n84_\t239012\n杨静\t239013\n地久天长\t239014\n金点\t239015\n单刷\t239016\npty\t239017\n第三批次\t239018\n黑酸枝\t239019\n昆山)有限公司\t239020\n凯瑟\t239021\n有钱途\t239022\n隋朝\t239023\n观音山\t239024\n侏罗纪公园3\t239025\nbruce\t239026\n贤才\t239027\najax局部刷新\t239028\n半画幅\t239029\nminidom\t239030\nmsu\t239031\n分换\t239032\n北京国贸\t239033\n罹\t239034\n质心\t239035\n户型图\t239036\n口袋精灵2\t239037\n五高\t239038\nprpr\t239039\n09:00\t239040\nBFG\t239041\n图率\t239042\naplayer\t239043\n中国西电\t239044\nGFM\t239045\n太安堂\t239046\n北郡\t239047\nSIGSEGV\t239048\n织物\t239049\n红色系\t239050\n十几倍\t239051\n1个多小时\t239052\n马欣\t239053\n初川\t239054\nDuolingo\t239055\n万千瓦\t239056\n弹痕\t239057\n凭实力\t239058\n普邦\t239059\nsteel\t239060\n银客网\t239061\n几餐\t239062\n壹纳网\t239063\nchoir\t239064\n深流\t239065\n土豆淀粉\t239066\n戚家山\t239067\n国务院教育督导委员会办公室\t239068\n抱箍\t239069\n丹纳赫\t239070\n达克宁软膏\t239071\nbgm\t239072\nwindows98\t239073\n2018年三月份\t239074\n爱卡汽车网\t239075\n类象\t239076\n红光镇\t239077\n汉寿县政府\t239078\nThrottle\t239079\n宝相寺\t239080\ncree\t239081\nriyuan\t239082\n手动泵\t239083\n421多\t239084\n皮革城\t239085\n肄业\t239086\n数组函数\t239087\n云杰\t239088\n高考学习网\t239089\nqsv格式转换\t239090\n金头陀\t239091\n欧特\t239092\n易方达基金\t239093\n6009\t239094\n吴映香\t239095\nholle\t239096\n粽子节\t239097\nbys\t239098\n综艺大热门\t239099\n焦仲卿\t239100\n比距\t239101\nKari\t239102\nS7|S7\t239103\n相惜\t239104\n毛细血管瘤\t239105\n粉尘仪\t239106\n26分\t239107\n余秋雨\t239108\nmagical\t239109\nRatings\t239110\n这些\t239111\n美莎\t239112\n地下城与勇士\t239113\n加兴\t239114\n后民\t239115\n北京会所\t239116\n操作栏\t239117\nNakamura\t239118\n四样\t239119\nBullet\t239120\n路太远\t239121\n很一般\t239122\n亲心\t239123\n04.08\t239124\n车感\t239125\n改建\t239126\n2900万\t239127\n广饶\t239128\ngaotie\t239129\n农商\t239130\n夏时制\t239131\n北京国际旅行卫生保健中心\t239132\n法定报告传染病\t239133\n三亲\t239134\nrx570\t239135\n孙梅\t239136\nLDH\t239137\n罪与罚\t239138\n立档\t239139\n数码相\t239140\n旧案\t239141\n够钟\t239142\n山洞\t239143\n计算机应用研究\t239144\n凯尔\t239145\n苏州政府网\t2
39146\n公包\t239147\nSuggestions\t239148\nk-means聚类\t239149\n炼狱厨神\t239150\n赵一德\t239151\n金瓜\t239152\n农药厂\t239153\n陪诊\t239154\nFlying\t239155\n刘关张\t239156\n安卓游戏网\t239157\n包户\t239158\n十亿级\t239159\n镰\t239160\n丛云\t239161\n猫狗大战2\t239162\nesl\t239163\n石灰石\t239164\n懵钟\t239165\ngro\t239166\n大涌\t239167\n刘宝\t239168\n王建成\t239169\n大小便\t239170\n死徒\t239171\n片单\t239172\n台州职业技术学院\t239173\n上皮内瘤\t239174\n新聊斋志异\t239175\nVerilog\t239176\n元子\t239177\n电子作业与在线考试\t239178\n耐脏\t239179\n商用化\t239180\n云搬家\t239181\n陇南市人民政府\t239182\nSG3525\t239183\n7层\t239184\n北京大兴信息网\t239185\n砸锅卖铁\t239186\n鄙\t239187\nMuslim\t239188\nyouare\t239189\n宁波市规划局\t239190\nCases\t239191\n远期收益率\t239192\n耶拿\t239193\n7000公里\t239194\n杨柳树\t239195\n王者传奇\t239196\n今年10月\t239197\n青藏铁路公司\t239198\n不蔓不枝\t239199\n杭州拱墅区\t239200\nTELL\t239201\n恰恰舞\t239202\n安得\t239203\n白藤湖\t239204\nprovince\t239205\n垦利区\t239206\n磁座\t239207\nYY6090青苹果影院\t239208\n托米\t239209\nis300\t239210\n2017年5月16日\t239211\n30万台\t239212\n财富公司\t239213\nautomake\t239214\nS20\t239215\n旋光\t239216\nwandering\t239217\n邦定\t239218\n天际通\t239219\npolaris\t239220\n锁子甲\t239221\n戴南镇\t239222\n蔡文静\t239223\n永修县委\t239224\nBeholder\t239225\n2018年9月\t239226\nbotz\t239227\n高深莫测\t239228\n库登记表\t239229\n广州无线电集团\t239230\ndirs\t239231\n涛声\t239232\n龙首\t239233\n北京明天幼稚集团\t239234\n自治法\t239235\n屠华\t239236\n农企\t239237\n新篇\t239238\n火球\t239239\n力点\t239240\n2017年1月24日\t239241\n成名史\t239242\n灵通卡\t239243\n美男子\t239244\n重力传感器\t239245\nflowjo\t239246\n安徽省民政厅\t239247\n格桑拉\t239248\n手写笔\t239249\n广州华侨医院\t239250\nAurvana\t239251\nHMAC\t239252\n潮阳区\t239253\n百度云在线\t239254\n盂兰神功\t239255\n胜家\t239256\n发肤\t239257\n特发性血小板减少性紫癜\t239258\n瓶车\t239259\nGACKT\t239260\n猎上网\t239261\n口蹄疫\t239262\n橘子花\t239263\n美容室\t239264\n9c\t239265\n特批\t239266\n萧庭生\t239267\n战神三十六计\t239268\n中交上海航道局有限公司\t239269\n非负矩阵\t239270\n刊易\t239271\nZC\t239272\n台言\t239273\n蝎子精\t239274\n屈曲\t239275\n惠州酒店\t239276\nORANGE\t239277\n向前\t239278\n层次分明\t239279\n7.0.0\t239280\n布地奈德混悬液\t239281\n长崎县\t239282\n厦门市图书馆\t239283\n生活大爆炸TBBT\t239284\n火星探测器\t239285\n小说者\t239286\n水碱\
t239287\n轻佻\t239288\n情衷\t239289\n湖南省自然科学基金\t239290\nTUBE\t239291\n心有\t239292\nNAMES\t239293\nFORMAT\t239294\n铜业\t239295\n传奇大亨\t239296\n有机农产品\t239297\n乔纳斯\t239298\nPort\t239299\n宠物小精灵XY\t239300\n古田四路\t239301\nCourtyard\t239302\n1607\t239303\n祖尔金\t239304\nWEP\t239305\n22566655\t239306\nSunRain117\t239307\n无水亚硫酸钠\t239308\n螺旋测微器\t239309\n对立\t239310\nDischarge\t239311\n黄记煌\t239312\n无疾而终\t239313\n第7场\t239314\n2016年10月31日\t239315\n26层\t239316\n新美大\t239317\n游馆\t239318\n号灯\t239319\nLaurent\t239320\n恒常性\t239321\n创业信息网\t239322\n省道\t239323\n北京西山\t239324\n水洗石\t239325\n热熔胶枪\t239326\nmybatisplus\t239327\nAlgorithm\t239328\n益秒\t239329\n旧村庄\t239330\n后身\t239331\n童星梦工厂\t239332\n小米米家智能\t239333\n梦幻谷\t239334\n佳博\t239335\n供大于求\t239336\n黄晓\t239337\n思议\t239338\n188路\t239339\n沭阳县\t239340\n哪个服\t239341\n反贪\t239342\n六图网\t239343\n万豪金\t239344\n某年\t239345\n海豚模拟器\t239346\n黄棣\t239347\n农民合作社\t239348\n日出而作\t239349\nbend\t239350\n石蛙\t239351\n马努·吉诺比利\t239352\n邪恶集\t239353\n小仓鼠\t239354\n心境\t239355\n沪江德语\t239356\n四下\t239357\n冯莫提\t239358\n工商注册网\t239359\n涨幅\t239360\n暗黑世界\t239361\n225路\t239362\nlegacy\t239363\nK3003\t239364\n第42号\t239365\n岳麓区\t239366\nOpenstack\t239367\n速汇金\t239368\n蚂蚁窝\t239369\n梁志天\t239370\n谈天说\t239371\nc仔\t239372\n北国超市\t239373\n玉林市政府\t239374\n毛主席水晶棺\t239375\n翰\t239376\nib\t239377\n紫霞仙子\t239378\njoy\t239379\nPei\t239380\n孜孜以求\t239381\n服务商\t239382\nscx-4521f\t239383\n第109集\t239384\n欧洲城\t239385\n工整\t239386\n唐韦星\t239387\n郑欣宜\t239388\n中学综合素质\t239389\n参悟\t239390\n手绘简笔画\t239391\n赭山\t239392\n3888元\t239393\n597苗木网\t239394\n中海油能源发展股份有限公司\t239395\n真八岐\t239396\n天色\t239397\n七宝外国语小学\t239398\n紫砂\t239399\n中国佛教\t239400\ndatePicker\t239401\nf514\t239402\n荻港\t239403\noffense\t239404\nfarther\t239405\n盛世投资\t239406\n国家电网有限公司\t239407\n党课稿\t239408\n电测\t239409\n眺\t239410\n读品\t239411\nxex\t239412\n6门\t239413\ncajviewer\t239414\n八一路\t239415\npagepro\t239416\n乏味\t239417\n增减\t239418\n黑海\t239419\n小米空气净化器2\t239420\nparadigm\t239421\n听君\t239422\n星河湾双语学校\t239423\ndesire\t239424\nfps游戏\t239425\n脱口\t239426\nhoudini16\t23942
7\nJVM\t239428\n山西省教育厅\t239429\nJunit\t239430\n24小时内\t239431\n独资企业\t239432\n转意\t239433\n袁冶\t239434\nMacDonald\t239435\n2光\t239436\n白石\t239437\n武进市\t239438\n热裤\t239439\n桌台\t239440\n细菌战\t239441\nit公司\t239442\nchannelid\t239443\n巧遇\t239444\n待摊\t239445\n回不来了\t239446\n崔罗什\t239447\n孤独星球Lonely\t239448\n歌手2\t239449\n6000名\t239450\n3pdf\t239451\n班船\t239452\n1.15G\t239453\n损益类\t239454\n浦东新区小学\t239455\n重现期\t239456\n航天科技大厦\t239457\n三线制\t239458\n融资性贸易\t239459\n我们都要\t239460\n京东e卡\t239461\nsedan\t239462\n马小路\t239463\n18年3月份\t239464\n凌霄花\t239465\n王童语\t239466\n保险学\t239467\n疲劳\t239468\nlumia920\t239469\n第8次\t239470\njared\t239471\n绚香\t239472\n健威\t239473\n允浩\t239474\n张克\t239475\npian\t239476\nRecordings\t239477\n韶关\t239478\n哈尔滨东站\t239479\n火患\t239480\nDDX\t239481\n唐山男健医院\t239482\nMedicinal\t239483\nHurt\t239484\n亚东军事网\t239485\n冶疗\t239486\n夕阳之歌\t239487\ncorrectly\t239488\n张德强\t239489\n康馨苑\t239490\n4月15号\t239491\n幂\t239492\n0.0\t239493\nhouzz\t239494\n沈阳机床\t239495\n17年5月\t239496\n万合天宜\t239497\n信泰保险\t239498\n黄花溪\t239499\n友德\t239500\n主人翁\t239501\n达文西\t239502\nfcp7\t239503\nK95\t239504\n绿松石盘\t239505\n牛津英语模块\t239506\n慕尼黑惨案\t239507\n喷管\t239508\n禁忌搜索算法\t239509\nECOVACS\t239510\n巫师2\t239511\n回家的路\t239512\n辽宁盼盼吧\t239513\n电棍\t239514\n12.7.3.46\t239515\n四壁\t239516\n西安银行\t239517\nprobable\t239518\n饰面\t239519\n君子者\t239520\nmadonna\t239521\n填反\t239522\n仙人球\t239523\n福林\t239524\n十所\t239525\n惠人原汁机\t239526\n高等学校\t239527\nbtv\t239528\n无话\t239529\n凭吊\t239530\n长气\t239531\nhs8546v\t239532\nPOI\t239533\n企\t239534\nlue\t239535\n心理咨询师证\t239536\n600号\t239537\n清初\t239538\n盐酸利多卡因\t239539\nCB认证\t239540\n出货率\t239541\n3d效果图\t239542\n国战\t239543\n名校\t239544\nexpectation\t239545\n全本\t239546\n过半数\t239547\nKPa\t239548\n广东中烟\t239549\n杨劲松\t239550\n合庆\t239551\n3.38\t239552\ncycles\t239553\n讽\t239554\n意长\t239555\n泸州市商业银行\t239556\n董伟\t239557\n澄星股份\t239558\npdm\t239559\n复旦大学附属中山医院\t239560\n寻香\t239561\n98.1\t239562\n城郭\t239563\n悔青\t239564\nKeithley\t239565\n好安静\t239566\n高先生\t239567\n1.8.6\t239568\nMystery\t239569\nNBK\t239570\n暖
声\t239571\n云家网\t239572\n耀皮玻璃\t239573\n桂林人才网\t239574\n屏保\t239575\n崇义\t239576\ndadada\t239577\n历史研究\t239578\n追偿\t239579\nMPA\t239580\n延时喷剂\t239581\n15a\t239582\n自建\t239583\ngitweb\t239584\n银河玖乐\t239585\n性侵案\t239586\n点明\t239587\n宁德师范学院\t239588\n带面\t239589\n中国五大银行\t239590\n张宏伟\t239591\n明\t239592\n金轮\t239593\n墙头\t239594\n扇面\t239595\n定额子目\t239596\nrj45\t239597\n汉江\t239598\n宣萱\t239599\n开封奇谈\t239600\nAres\t239601\n50日\t239602\n刘禹锡\t239603\n情词\t239604\n居中和\t239605\nxp/\t239606\n榴\t239607\nHAXM\t239608\nzhubo\t239609\n泰瑞尔\t239610\n沃尔沃XC90\t239611\n凯德龙\t239612\nAdMob\t239613\n致知\t239614\n亦庄经济技术开发区\t239615\n第127号\t239616\n承租方\t239617\n联想y480\t239618\nlua吧\t239619\nオンナ\t239620\n福建省委党校\t239621\n红色警戒2\t239622\n字子\t239623\n指示卡\t239624\n坍落度\t239625\n天选者\t239626\n体育生\t239627\n元江\t239628\n池子\t239629\n800点\t239630\nkennys\t239631\n武汉大学法学院\t239632\nwvr300\t239633\n黄冈市区\t239634\nexce\t239635\n宜宾市统计局\t239636\nMission\t239637\n广告单\t239638\n南拳\t239639\n雅力士\t239640\nshagn\t239641\n7f\t239642\n天翼高清\t239643\n汉姆\t239644\n安利纽崔莱\t239645\n极地恶灵\t239646\n25份\t239647\n一延米\t239648\n靓晶晶\t239649\n固发\t239650\n国权北路\t239651\nmarlboro\t239652\nCOIN\t239653\n金蝶迷你版\t239654\n拳皇97吧\t239655\n懿德\t239656\n会徽\t239657\n铁工\t239658\nhennessy\t239659\n酒缸\t239660\n轮边\t239661\n陈峰\t239662\n图吧导航\t239663\n专帖\t239664\n扣紧\t239665\n圣约翰大学\t239666\n临顿路\t239667\n仙姬\t239668\n邯郸学院\t239669\n百伦\t239670\n铣刨\t239671\n拼假\t239672\nComputex\t239673\n长安CX70\t239674\nSpread\t239675\n恒基达鑫\t239676\n孙河乡\t239677\n自粘型\t239678\n中国银行间市场交易商协会\t239679\nZHEJIANG\t239680\nparaphrase\t239681\n虎神\t239682\n吹风机\t239683\n以父之名\t239684\n云计算_比特网\t239685\n汇金中心\t239686\n佩罗\t239687\n新洋丰\t239688\n十来分钟\t239689\n二十分\t239690\n管棚\t239691\n商务\t239692\n木桐\t239693\n刘询\t239694\n42款\t239695\n征战\t239696\nalternating\t239697\n抹子\t239698\n红云\t239699\n气垫BB霜\t239700\n禽鸟\t239701\n德宝\t239702\n江苏省常州高级中学\t239703\nBUFFALO\t239704\n精神病频道_健客网\t239705\n努尔哈赤\t239706\n3站\t239707\n水曜日\t239708\n火天\t239709\n外网\t239710\n晚稻\t239711\n车窗\t239712\nTrados\t239713\n戏装\t239714\n蕾姆\t239715\n粤赣\t239
716\n鼻贴\t239717\n情迷六月花\t239718\n真八核\t239719\n阿尔法·罗密欧\t239720\n侧分\t239721\n正规军\t239722\n富硒米\t239723\n金蟾\t239724\n青春影\t239725\n宫略\t239726\n苹果团\t239727\n35关\t239728\n秘档\t239729\nea6500\t239730\n匹伐他汀\t239731\n塔拉\t239732\ndevTools\t239733\nindexing\t239734\nRoman\t239735\n古剑奇谭1\t239736\nr720\t239737\n耕织\t239738\n中央空\t239739\nhaven\t239740\n踏石留印\t239741\n上海游泳馆\t239742\n恩格尔\t239743\nmatab\t239744\nDVD光驱\t239745\n评标\t239746\n沈阳二中\t239747\n5.2.6\t239748\n高道\t239749\nfinancial\t239750\n十三式\t239751\n筑造\t239752\n血樱\t239753\n梯梯\t239754\n高精度\t239755\n肯尼思\t239756\n避险\t239757\n绿地悦澜湾\t239758\n富甲天下\t239759\n艾青\t239760\nip地址查询\t239761\n中脘\t239762\n玛吉\t239763\n操作符\t239764\n魔法风云纪\t239765\n64步\t239766\nMetInfo\t239767\n杭派\t239768\n战锤40k\t239769\n河北银行\t239770\nMAE\t239771\n电卷棒\t239772\n安婕妤\t239773\n爱的秘密\t239774\n10000本\t239775\nppt转换器\t239776\ntv\t239777\n山南镇\t239778\n感性\t239779\n街\t239780\n别克赛欧\t239781\ncxx11\t239782\nde\t239783\n队会\t239784\n光着\t239785\n红瞳\t239786\n广州东圃\t239787\n王样\t239788\n缩径\t239789\ndao层\t239790\n导医\t239791\n培训网\t239792\n小米降噪耳机\t239793\n色宝\t239794\neffectively\t239795\n烧制\t239796\n粤垦路\t239797\n不胫而走\t239798\n第一步\t239799\n精准\t239800\nA4\t239801\n看书网\t239802\n刘醒龙\t239803\n谢婷婷\t239804\nYOLOv3\t239805\n郭家沟\t239806\n睿\t239807\n間\t239808\n采销\t239809\n新宿站\t239810\n第一个100万\t239811\n妊娠反应\t239812\n剑桥\t239813\nPSX\t239814\n1300x\t239815\nPSVR\t239816\n江西警察学院\t239817\n解列\t239818\n卡门\t239819\n上海初中学校\t239820\n渗滤液\t239821\n乳酸杆菌\t239822\nPix4Dmapper\t239823\n我的父亲我的兵\t239824\ncomes\t239825\n该药\t239826\natthe\t239827\n太阳简笔画\t239828\n潘爱民\t239829\n童诗\t239830\n菲林格尔\t239831\nkokia\t239832\n小吉\t239833\n过敏药\t239834\n解放日报\t239835\n搭救\t239836\n预紧力\t239837\nMuse\t239838\n缉拿\t239839\n上海旅行社\t239840\nsdu\t239841\n20171015\t239842\n懐\t239843\nJart\t239844\n上海儿童艺术剧场\t239845\n#ifndef\t239846\n孔子曰\t239847\n无变\t239848\n西安火车北站\t239849\n客园\t239850\n自动波\t239851\n旺旺号\t239852\n阿基拉\t239853\n腾讯控股\t239854\n六类\t239855\n切眉\t239856\n177个\t239857\n广东\t239858\n2018-01-06\t239859\n羌溪花园\t239860\n3.9G\t239861\n雪仗\t2398
62\n财院\t239863\n三升\t239864\n三妹\t239865\nCloning\t239866\n杭州艺星整形美容医院\t239867\n跨考考研网\t239868\n一言不合\t239869\n辅印\t239870\nBase\t239871\n死身\t239872\n矮壮素\t239873\n大众_\t239874\ncone\t239875\n枢密院\t239876\n依波\t239877\n出包\t239878\n保罗沃克\t239879\n护肤篇\t239880\nPETER\t239881\n易诚\t239882\n动脉期\t239883\n后长\t239884\ncombination\t239885\n如子\t239886\nntop\t239887\n商务咨询有限公司\t239888\n栏栅\t239889\n豪猪\t239890\n阳澄湖镇\t239891\n秦皇岛\t239892\n奉节县\t239893\n环境税\t239894\n住院大楼\t239895\n玛塔龟\t239896\n沧桑\t239897\n平板版\t239898\n观演\t239899\n马浚伟\t239900\n2018.04.14\t239901\n多步\t239902\n林木儿\t239903\nUm\t239904\n602\t239905\n杭州幼儿园\t239906\n死宅网\t239907\n世面\t239908\n骨科\t239909\n张局长\t239910\n72分\t239911\n教态\t239912\nH1\t239913\n更厉害\t239914\n莲花北村\t239915\n酸雾\t239916\n陈聪\t239917\n冥衣\t239918\n网络分析仪\t239919\n水岸星城\t239920\n海南史志网\t239921\n风间\t239922\nwildfly\t239923\nids\t239924\n云檀\t239925\n续资治通鉴\t239926\n自考英语专业\t239927\n服输\t239928\n吨公里\t239929\n嗜血\t239930\n六速\t239931\nDN50\t239932\n迪龙\t239933\n导火线\t239934\n大桌\t239935\n假枪\t239936\n稀\t239937\njiajia\t239938\n巧记\t239939\n契约\t239940\n番薯叶\t239941\n1-6月份\t239942\n神断狄仁杰\t239943\n玫琳凯之窗\t239944\nrhca\t239945\n2017—2030年\t239946\nElephants\t239947\n路虎发现神行\t239948\n介质\t239949\n折翅\t239950\n规约\t239951\n炉鼎\t239952\n要人\t239953\n2018年03月09日\t239954\nwangs\t239955\n搜狐银行\t239956\ninstack\t239957\nwagas\t239958\n26周年\t239959\n威尼斯人\t239960\n朝中社\t239961\n骨转移癌\t239962\n作爱\t239963\n一百道\t239964\n超声波驱蚊器\t239965\n紧接\t239966\n炖鱼\t239967\n金穗花园\t239968\nIntegrate\t239969\n神途发布网\t239970\n棉花糖和云朵妈妈\t239971\n神宫\t239972\n孤独的人\t239973\nSleeves\t239974\n此次\t239975\n清河区\t239976\n黄小蕾\t239977\n金飞\t239978\n阵脚\t239979\nffff\t239980\n日思夜想\t239981\n萧太后河\t239982\n权权\t239983\n投资商\t239984\ntrn\t239985\n操作题\t239986\n天道\t239987\n亚洲国际大酒店\t239988\n右拳\t239989\n挪开\t239990\n努比亚z17s\t239991\ncoast\t239992\n同妻\t239993\n麦子\t239994\n气态\t239995\njustjavac\t239996\neject\t239997\nwww.2345.com\t239998\n昆明\t239999\n女儿国\t240000\n凡人修仙传\t240001\nVuser\t240002\n邳州市\t240003\n卡巴斯基实验室\t240004\n20160919\t240005\n双鱼座女\t240006\n部教版\t24
0007\n低压柜\t240008\n育新花园\t240009\n射击\t240010\n伤医\t240011\n病假\t240012\n神经科学\t240013\ndyj\t240014\n裕民县\t240015\n未知\t240016\n一身\t240017\n3388\t240018\n表准\t240019\n民女\t240020\n玉天玑算命网\t240021\n粘性\t240022\n译林牛津版\t240023\n合同款\t240024\n家用路由器\t240025\n通草\t240026\n广大银行\t240027\n人民币汇率网\t240028\nwellbet\t240029\n80码\t240030\n龙湖紫宸\t240031\necu\t240032\n变动成本法\t240033\n美赞臣\t240034\n东平网\t240035\n监考\t240036\n儿字\t240037\n子女们\t240038\n朱境尧\t240039\nSBR\t240040\nxman\t240041\n最深处\t240042\nyyyy-mm-dd\t240043\nklm\t240044\n任我鲁\t240045\n大逃杀\t240046\n晓燕\t240047\n东郭先生\t240048\n起效\t240049\n耽美网\t240050\n42路\t240051\n周晓鸥\t240052\n俺去也.COM\t240053\n天麻丸\t240054\n初生儿\t240055\n毫米\t240056\n贡献值\t240057\n扁带\t240058\n水仙芒\t240059\n窗口化\t240060\n勒索病毒\t240061\npoorly\t240062\n凡尔登战役\t240063\n雅戈尔集团股份有限公司\t240064\n黄疸\t240065\nlibgcc_s.so.1\t240066\n5y\t240067\n100则\t240068\n联想潮7000\t240069\n厨子戏子痞子\t240070\n轻医美\t240071\nqq飞车手\t240072\n雁荡山\t240073\n普查\t240074\nMalena\t240075\nSYN\t240076\n金币机\t240077\n洪斌\t240078\n鲁文建筑服务网\t240079\n细高跟\t240080\n杭州余杭政府\t240081\nFormax\t240082\n讫\t240083\n虾苗\t240084\n融创日月湾\t240085\n胤祥\t240086\n媾\t240087\n形体舞\t240088\n水烟筒\t240089\n天津市和平区\t240090\nEMUI3.0\t240091\n字根\t240092\n沙雅\t240093\n砷中毒\t240094\n宋祁\t240095\n杀不死\t240096\nAutomatic\t240097\nKuo\t240098\n对拷\t240099\n冷水江\t240100\n一天\t240101\n龙文教育\t240102\n面试\t240103\n小糯米\t240104\nliuwu265\t240105\n杰克斯\t240106\nCheryl\t240107\n云南铁路\t240108\n扫扫\t240109\n第97期\t240110\n李小董卿\t240111\n事实\t240112\n33关\t240113\n芙蓉园\t240114\n茅村\t240115\n石岩湖\t240116\n40公里\t240117\n弥陀\t240118\nFILES\t240119\nculture\t240120\n福克斯RS\t240121\n航煤\t240122\n乐趣\t240123\n青岛地区\t240124\n羽灵\t240125\n能勤绩廉\t240126\n刚度比\t240127\n武陵区\t240128\n抚恤\t240129\n清关公司\t240130\n木霉\t240131\n悄然\t240132\n大大们\t240133\n房屋抵押权\t240134\n中国建行\t240135\n诗律\t240136\nWhose\t240137\n188a\t240138\nbil\t240139\n马骝\t240140\n鲁大妈色播网\t240141\n亚州\t240142\nRANSAC\t240143\n对乙酰氨基酚\t240144\n世外中学\t240145\n全球通史\t240146\n电产\t240147\n总统制\t240148\n蛋氨酸\t240149\n清澄\t240150\n白水洋\t240151\n电脑中毒\t240152\n北京交通大学\t240153\n3
00188\t240154\n赛螃蟹\t240155\n陕西省通信管理局\t240156\n洗牙\t240157\n紫盘\t240158\n柴油发电机\t240159\npelican\t240160\nV家\t240161\n成教\t240162\n牛姐\t240163\n穿线器\t240164\n医神\t240165\n沙园\t240166\n海口美兰国际机场\t240167\n二十四个\t240168\n偷越\t240169\n苏沫\t240170\nMini4\t240171\n托派\t240172\n海马S5\t240173\n公交线\t240174\nCoordinate\t240175\n9600gt\t240176\n三相短路\t240177\now\t240178\n幕思城\t240179\ncommonjs\t240180\n速途研究院\t240181\n如萍\t240182\n大塘新村\t240183\n博阿滕\t240184\nobviously\t240185\n百盛\t240186\n反中\t240187\n菁菁\t240188\nB70\t240189\n大河教育\t240190\n王晨霞\t240191\n重税\t240192\nViVi\t240193\n五年级作文\t240194\n滤板\t240195\n将才\t240196\n托孤\t240197\n容县\t240198\nOAO\t240199\n邴越\t240200\n苏三省\t240201\n国泰航空公司\t240202\njungle\t240203\n霍都\t240204\n掌门人\t240205\nHng\t240206\n行天\t240207\n普通类\t240208\n无方\t240209\n红顶\t240210\nOEE\t240211\n惨无人道\t240212\n最大一\t240213\n4.19\t240214\n外包装盒\t240215\n卫材\t240216\n福特撼路者社区-易车网\t240217\n2014年08月17日\t240218\n萍乡火车站\t240219\n星岛环球网\t240220\nnoah\t240221\n15000元\t240222\n岂能\t240223\n泛海控股\t240224\n按列\t240225\n大同火车站\t240226\n防水补漏\t240227\n韩愈\t240228\nXXXXXXX\t240229\n川藏线\t240230\n酒店集团\t240231\n宁波互联\t240232\nEIA\t240233\n容斋\t240234\n接待员\t240235\n桑拿板\t240236\n全包式\t240237\n银氨\t240238\nevergreen\t240239\n8【\t240240\n第三点\t240241\n向明\t240242\n金兰谱\t240243\n尤溪县\t240244\n南龙湖\t240245\ndelivering\t240246\nLOOT\t240247\n酸胀\t240248\nfdisk分区\t240249\n卡板\t240250\nprototxt\t240251\n上条\t240252\n人参榕\t240253\nWinForms\t240254\n于静\t240255\nds5ls\t240256\nabcc\t240257\n2013.11\t240258\n十包\t240259\n虎口藏宝\t240260\n寻址\t240261\nAD14\t240262\nmunmap\t240263\n做梦\t240264\n杂货商\t240265\n将夜吧\t240266\n戴尔XPS\t240267\n歌霸\t240268\n永远在路上\t240269\nRestFul\t240270\n奇亚籽\t240271\n爱投资\t240272\n别克4S店\t240273\n绿城春风十里\t240274\nrug\t240275\n金刚萨埵\t240276\n御主\t240277\n周耀辉\t240278\nUPnP\t240279\n忧忧\t240280\n谦友\t240281\nvisio2010\t240282\n几英尺\t240283\nyaoi\t240284\n广西民族师范学院\t240285\n土牛\t240286\n唐家岭\t240287\n盘坐\t240288\nMQ\t240289\n广东省质监局\t240290\ncutter\t240291\n4.23世界读书日\t240292\n先手\t240293\n南昌十中\t240294\n独孤求败\t240295\n热石\t240296\n微信公众号子\t240297\
nGayTube\t240298\n11001\t240299\n114啦小说网\t240300\n栖兰小筑\t240301\n宣传日\t240302\n宏伟之窗\t240303\n桢\t240304\n日日顺健康\t240305\n兵兵\t240306\n稻飞虱\t240307\n原著党\t240308\n阿难\t240309\n高压蒸汽灭菌器\t240310\n18:00\t240311\nbelly\t240312\n西青区\t240313\nv9.0\t240314\n苍术\t240315\n微动开关\t240316\njimeper\t240317\nMapR\t240318\n64条\t240319\nmsf\t240320\nabrt\t240321\n尊师\t240322\n阿彻\t240323\n动物\t240324\n铝锂合金\t240325\n李山\t240326\n男\t240327\nelm\t240328\n镇江市\t240329\nexlipse\t240330\nWebStorm2017\t240331\n双电\t240332\n漆木\t240333\n金艺林\t240334\n22.3\t240335\n云景花园\t240336\ntcls\t240337\n37周年\t240338\n便捷酒店\t240339\nSKYPE\t240340\n提子\t240341\n10年以上\t240342\n4090\t240343\n特种设备目录\t240344\n鹅们\t240345\n主场\t240346\n舍入\t240347\n19栋\t240348\n鞍鞯\t240349\ntHeader\t240350\n林赛罗韩\t240351\nQt5\t240352\nYONEX尤尼克斯\t240353\n专访\t240354\n范伟\t240355\n咆哮体\t240356\n纵身\t240357\n东风小学\t240358\n新颜\t240359\n润衡\t240360\n阿雅小说网\t240361\n步摇\t240362\ncrayon\t240363\natop\t240364\n得人心\t240365\n安迅物流\t240366\n个人住房抵押贷款\t240367\ndx7\t240368\n龙口镇\t240369\n弯曲模量\t240370\n转导\t240371\n商厦\t240372\n我的回忆\t240373\n徐太宇\t240374\n有限元在线\t240375\n安徽省公安厅交通管理局\t240376\nE-learning\t240377\n贵阳农商银行\t240378\nSWITCH\t240379\npharm\t240380\n梯型\t240381\n赵洋\t240382\njava+selenium\t240383\n福州站\t240384\n余承东\t240385\n乍现\t240386\n变癌\t240387\n打火器\t240388\n大兴县\t240389\n镜花缘传奇\t240390\nqpos\t240391\n堆码\t240392\n1915\t240393\n丘陵\t240394\niab\t240395\nMaddock\t240396\n十字军之王吧_\t240397\n维信\t240398\n吉印\t240399\n南汽\t240400\n牛油纸\t240401\n黄金天国\t240402\n幸福来敲门\t240403\n盖面\t240404\n中天建设集团有限公司\t240405\n交人\t240406\n引体向上吧_\t240407\n圣导\t240408\n东方美人\t240409\n油泼面\t240410\nRegular\t240411\n说茶网ishuocha.com\t240412\n鸡冠区\t240413\n∪\t240414\nBEAUTYLEG\t240415\n3800元\t240416\n贝尔金\t240417\nCapricorn\t240418\n南昌湾里\t240419\n64元\t240420\n黒獣\t240421\n工装\t240422\n200.0000元\t240423\n安徽省委党校\t240424\n2524\t240425\n孟德尔\t240426\n党龄\t240427\n十全大补汤\t240428\n吉他曲谱\t240429\n漢化\t240430\n包皮环切手术\t240431\n花钱\t240432\n酸辣粉\t240433\n达尔优\t240434\n虫咬\t240435\nToday\t240436\n免付费\t240437\n温岭新闻网\t240438\n中山图书馆\t240439\n新奥
集团\t240440\n3周年\t240441\nsaw\t240442\n辛香料\t240443\nSanders\t240444\nPE管\t240445\n南周\t240446\n呦呦鹿鸣\t240447\n560m\t240448\n黄真伊\t240449\n爱奇艺播放器\t240450\n老飞飞\t240451\n纽百伦\t240452\n盗墓笔记\t240453\n上海财经大学研究生院\t240454\n盈丰路\t240455\n让\t240456\n惊弓\t240457\n人见人爱\t240458\nUndefined\t240459\n六校\t240460\n贵州大学明德学院\t240461\ngridview\t240462\n顶嘴\t240463\n猜车\t240464\n蹄筋\t240465\n二之国\t240466\n500.com\t240467\n中国野生动物保护协会\t240468\n离休\t240469\n2018年04月14日\t240470\n农田\t240471\n巴拿马万国博览会\t240472\n46秒\t240473\nx02\t240474\n东东不死传说\t240475\ndfl\t240476\n鸵鸟油\t240477\n梦幻西游69\t240478\n月均\t240479\nNecessary\t240480\n尖晶石\t240481\n终归\t240482\n3dsmax2015\t240483\nB级片\t240484\n马克思主义研究网\t240485\n硅PU\t240486\nkedou\t240487\n蒙古图格里克\t240488\n张家口市委\t240489\n大慈\t240490\n滞纳金\t240491\ncx-5\t240492\n站务员\t240493\n监护权\t240494\n我儿\t240495\n虾类\t240496\n泵头\t240497\n浮木镇\t240498\n黑雨\t240499\n麓\t240500\nlibs\t240501\n负重\t240502\nstock\t240503\n6000平米\t240504\n十讲\t240505\n梧村\t240506\nwounded\t240507\nA级车\t240508\n南拳妈妈\t240509\n天涯_论坛\t240510\n错字\t240511\niTextSharp\t240512\nai2017\t240513\nq345\t240514\nSolo2\t240515\n媒体\t240516\n温酒\t240517\n全金\t240518\n穿墙术\t240519\n仲利国际租赁有限公司\t240520\n心汉\t240521\n网管员\t240522\n保奈美\t240523\nAmadeus\t240524\n24点阵\t240525\n门体\t240526\n考勤\t240527\n2017年10月16日\t240528\n氨咖黄敏胶囊\t240529\n桂建管\t240530\n夸父追日\t240531\n2017君威\t240532\ne475\t240533\nSTM32F103\t240534\n2018年4月23日\t240535\n中材节能\t240536\n韩翃\t240537\nBehance\t240538\n优逸\t240539\n关掉\t240540\n人士\t240541\n李亚军\t240542\n揽胜运动版\t240543\n复还\t240544\n虹猫\t240545\npolarized\t240546\n千万\t240547\n球机\t240548\n鬼打墙\t240549\n博华\t240550\n波义耳\t240551\n康居西城\t240552\nhifi音响\t240553\n鸡蛋\t240554\n开袋\t240555\n电吉他拾音器\t240556\n翁氏\t240557\n六西格玛\t240558\n1300亿元\t240559\ngreSQL\t240560\n恶魔城月下夜想曲\t240561\n铁拳无敌孙中山\t240562\n喊价\t240563\n液化滤镜\t240564\n美国哥伦比亚大学\t240565\njkmiao\t240566\n自行车架\t240567\n2017Q1\t240568\nMin\t240569\n马兜铃\t240570\n毡子\t240571\n98万\t240572\ntengxun\t240573\n误以为\t240574\n金凯瑞\t240575\n3db\t240576\n蒲阳镇\t240577\n最值问题\t240578\n集合\t240579\n张韶涵\t240580\n棉
织物\t240581\n杭州教育局\t240582\nIES\t240583\nVAE\t240584\n古装\t240585\n姚庄\t240586\nHospital\t240587\n牛角梳\t240588\n杨子小s\t240589\n枇杷园\t240590\nMacTalk\t240591\n暖暖环游世界吧\t240592\n互赠\t240593\n新迪\t240594\n轩\t240595\n广告买卖网\t240596\n漫谈库\t240597\n运管处\t240598\nUA\t240599\n杭州植物园\t240600\n神战:权力之眼\t240601\n50#\t240602\n贵州省高级人民法院\t240603\n迪普\t240604\ngor\t240605\n人手\t240606\n不收藏\t240607\n中国建筑第二工程局有限公司\t240608\n五华信息港\t240609\n可以说下\t240610\n手绘稿\t240611\n贾府\t240612\niii\t240613\n埃莉诺\t240614\n影院\t240615\n共板法兰\t240616\n沃尔沃V60\t240617\n爱情歌\t240618\n大帅府\t240619\n暮雨\t240620\n讨鬼\t240621\n即食\t240622\n夯实机\t240623\n辛迪\t240624\n邢台市\t240625\n宝马4S店\t240626\n任文\t240627\n工商银行公司\t240628\n文玩\t240629\nChemicals\t240630\n热转印\t240631\n软面\t240632\n常客\t240633\n桌面板\t240634\nGrace\t240635\n座板\t240636\n为马\t240637\n仁和会计培训学校\t240638\ndreamed\t240639\nfloating\t240640\n怪物猎人世界铳枪\t240641\n20160705\t240642\n惊魂序曲\t240643\nwww.wenku1.com\t240644\n德智教育\t240645\n划去\t240646\n散结\t240647\n建州\t240648\n菲特\t240649\n子板\t240650\nbatik\t240651\n花山郡\t240652\n汉军\t240653\n达尔文\t240654\n156个\t240655\n板栗子\t240656\n建筑工程类\t240657\n市水务局\t240658\n尚朋堂\t240659\n一充\t240660\n元帅们\t240661\n灭活\t240662\n非名\t240663\n乱战先锋\t240664\n湖南科技学院\t240665\n新巴尔虎左旗\t240666\n小米语音助手\t240667\n杨世光\t240668\n乱菊\t240669\n3000张\t240670\n垫江县\t240671\nPenetration\t240672\n适应\t240673\nintercourse\t240674\n5秒钟\t240675\n意念力\t240676\nbeforesend\t240677\n左丘明\t240678\n中国十七冶集团有限公司\t240679\nFallin\t240680\n16片\t240681\n换货\t240682\n物理化学\t240683\n木姜子\t240684\n史密斯夫妇\t240685\nNutz\t240686\n葡萄糖酸钙锌口服液\t240687\n熨斗\t240688\n王凤麟\t240689\n半山\t240690\n吴文化\t240691\n图片查看器\t240692\n57路\t240693\n瘦脸\t240694\n新能源电动汽车\t240695\nAOW\t240696\n林明美\t240697\n开合\t240698\n五马\t240699\n角美镇\t240700\n留山\t240701\nLUPA\t240702\n0.2元\t240703\n开悟\t240704\n古田会址\t240705\n浦汇\t240706\nGintama\t240707\n树熊\t240708\n阿曼达·塞弗里德\t240709\nemerson\t240710\n中行\t240711\n额温\t240712\n恒\t240713\n成都熊猫基地\t240714\n莆田市教育局\t240715\n4.5.0\t240716\n河北敬业集团\t240717\n利刀\t240718\n柱袋\t240719\n全盖\t240720\n马金凤\t240721\n舒达\t240722\n工具\t240723\n小
辛\t240724\n打打\t240725\n炉甘石\t240726\n住户\t240727\nK粉\t240728\n责信\t240729\n遮盖液\t240730\n印像\t240731\n雪纺裙\t240732\n寻甸\t240733\nFormula\t240734\n一佳\t240735\n喂鸡\t240736\nhf\t240737\n五字诀\t240738\naat\t240739\n中国政法大学研究生院\t240740\n嘉兴南\t240741\n为学\t240742\n华盖创意\t240743\nFear\t240744\n配套商\t240745\n同事们\t240746\n中国人民政治协商\t240747\n三指\t240748\n拳谱\t240749\n艺术团\t240750\n2373\t240751\n洪泽区\t240752\nQQ仙灵\t240753\n异口同声\t240754\n快印\t240755\n焦距\t240756\n2017一模\t240757\n换一换\t240758\n洛小熠\t240759\n自燃险\t240760\nreadlines\t240761\n德州市政府\t240762\nlr12\t240763\n江苏省知识产权局\t240764\n齐物论\t240765\n西游篇\t240766\n问效\t240767\n韩都\t240768\n卷机\t240769\nhyy\t240770\n斩魄刀\t240771\n黑狱断肠歌之砌生猪肉\t240772\n苏豪\t240773\n第二条\t240774\n输入口\t240775\nccc\t240776\n山东省妇幼保健院\t240777\n张国宝\t240778\n唐川\t240779\n先天性心脏病\t240780\n左脚舞\t240781\n网易视频云\t240782\n20150628\t240783\n山东轻工业学院\t240784\ncgjoy\t240785\n艾普拉唑肠溶片\t240786\n数值模拟\t240787\n7.0德鲁伊\t240788\nFSK\t240789\nemblem\t240790\nsmog\t240791\n小玉玉\t240792\n证实\t240793\n618年\t240794\n共存\t240795\n选型\t240796\nseoul\t240797\n虚像\t240798\n韵达速递\t240799\n1024M\t240800\n邓敏\t240801\ncheckpoint\t240802\n七片\t240803\n神煞\t240804\nDeux\t240805\n20150807\t240806\n应征\t240807\na19\t240808\n台词网\t240809\n小淘气\t240810\nNars\t240811\n天作之合\t240812\n智恩\t240813\n陈宝国\t240814\n袁来\t240815\n常振明\t240816\n關\t240817\n重晶石粉\t240818\n常州市妇幼保健院\t240819\n川澄绫子\t240820\n紫竹苑\t240821\n表观遗传学\t240822\n迈腾1.8T\t240823\n高航\t240824\n下迷\t240825\nkinzu\t240826\nxmemcached\t240827\n跳板机\t240828\n菠萝咕咾肉\t240829\n1T\t240830\n天主教会\t240831\n第二维\t240832\n事实上\t240833\n烩\t240834\nexcel批量\t240835\n朱刚\t240836\n数据库系统工程师考试\t240837\n莱西在线\t240838\nπ\t240839\nface_recognition\t240840\n德泽\t240841\nInsider\t240842\nrespect\t240843\nSWFUpload\t240844\n樱花国际日语\t240845\n亚狮龙\t240846\n雨量\t240847\n憨厚\t240848\n相食\t240849\n上里古镇\t240850\n北京市教育委员会\t240851\n黄山酒店\t240852\nflask-SQLAlchemy\t240853\nt16\t240854\n12英寸\t240855\n乌梅丸\t240856\n罗汉果花\t240857\n鞭腿\t240858\n8650u\t240859\n立式车床\t240860\n夜火车\t240861\n二度\t240862\n奇珍\t240863\n闻者\t240864\n哈迷\t240865\ntaipei\t240866
\n平说\t240867\n水嫩\t240868\n董寿平\t240869\n第十一代\t240870\n售票\t240871\n因材\t240872\n岛国片\t240873\n百度影音DVD高清\t240874\n东京食尸鬼2\t240875\n蒲公英根\t240876\n电子血压计\t240877\nmead\t240878\n妖狐x仆ss\t240879\n窗台板\t240880\n松赞林寺\t240881\nPostek\t240882\nwy\t240883\n山东广电网络集团\t240884\ncelebrated\t240885\n卧姿\t240886\n几十天\t240887\n416号\t240888\n杨曦\t240889\n中策橡胶集团有限公司\t240890\n波谷\t240891\n离思\t240892\nv2.3.6\t240893\n29岁\t240894\n大太监\t240895\nwcg\t240896\nmd5sum\t240897\nEvonik\t240898\n于航\t240899\n李登辉\t240900\n炉石传说狗头人冒险\t240901\n邓小东\t240902\n100.7\t240903\n沂南县\t240904\n信电\t240905\n实体化\t240906\n白蕉\t240907\n下旨\t240908\n凯雷德\t240909\n氨曲\t240910\n注册测绘师\t240911\n车仁表\t240912\n4000W\t240913\n盖雅\t240914\nModelAttribute\t240915\nxcorr\t240916\n痞\t240917\n复旦交大\t240918\n凯迈乐\t240919\n蒙古骑兵\t240920\n电艺考\t240921\n豪沃\t240922\n解析器\t240923\n中央国家安全委员会\t240924\n烤面包\t240925\n辽宁金农网\t240926\n暨南\t240927\n形声\t240928\n折起\t240929\n通运\t240930\n黄斌\t240931\n临河\t240932\nVertical\t240933\nOwhat\t240934\nvconsole\t240935\nLogcat\t240936\n一定\t240937\n逍遥丸\t240938\n称骨算命表\t240939\n北京地税\t240940\neditview\t240941\nAtlantic\t240942\n模块库\t240943\n无尘服\t240944\nBlues\t240945\n有词\t240946\n孤本\t240947\n林同棪\t240948\n气旋\t240949\nbonding\t240950\n奈何\t240951\n洗手池\t240952\n张贤\t240953\n盛大开机\t240954\n法史\t240955\nrelevance\t240956\nTracePro\t240957\n眉山房产网\t240958\nOPA\t240959\n无亲\t240960\n祈愿\t240961\n黑电\t240962\n涂擦\t240963\ntorr\t240964\n本型\t240965\n49.5MW\t240966\nfutex\t240967\n蓝调口琴\t240968\nfears\t240969\n体下\t240970\n惹官司\t240971\n动能转换基金\t240972\n以太\t240973\n胸衣\t240974\n深度学习库\t240975\nWework\t240976\n家鱼\t240977\nCPI指数\t240978\n慧都\t240979\n内限\t240980\nMelon\t240981\n弹腿\t240982\n塞维\t240983\n13M\t240984\n无绳\t240985\n二甲基砜\t240986\n海尔空气能热水器\t240987\nscorll\t240988\n田进\t240989\n4月1日\t240990\n南峰\t240991\n神户港\t240992\ndane\t240993\n赶出来\t240994\n后来\t240995\nANSYS命令流\t240996\n立面图\t240997\n全能侦探社\t240998\nω\t240999\n美满霉素\t241000\n4.7米\t241001\n高仙芝\t241002\n筒体\t241003\nantd\t241004\nRAZR\t241005\n电水壶\t241006\n海底囚人\t241007\nZH-CN\t241008\n中华纺织网\t241009\n夏馨雨\t2410
10\n6晚\t241011\n第五十一章\t241012\n志达\t241013\npassphrase\t241014\n中基协\t241015\n明清古\t241016\n月亮池\t241017\n音悦\t241018\n7幅\t241019\n好坏处\t241020\n长安欧尚装饰\t241021\nCNF\t241022\n欧神诺陶瓷\t241023\n前三年\t241024\n手艺\t241025\nTuck\t241026\n微分\t241027\nHighest\t241028\n江苏凤凰科学技术出版社\t241029\n固然\t241030\n水晶郦城\t241031\n十五亿\t241032\nx21\t241033\n无法控制\t241034\n南新路\t241035\n航权\t241036\n大众书局\t241037\n风管\t241038\nPh\t241039\n泪洒\t241040\n实验服\t241041\n2017年1月12日\t241042\n下肢静脉曲张\t241043\n开群\t241044\n魔益果\t241045\n断丝\t241046\n百科\t241047\n西典\t241048\n陈印精\t241049\n标记\t241050\n股息\t241051\n秒付\t241052\n厨房地柜\t241053\n内贾德\t241054\n防震垫\t241055\n东平县\t241056\n支援\t241057\n辶\t241058\ndoc文档赚钱网\t241059\n七娃\t241060\njavac\t241061\n宝泉\t241062\n名作\t241063\nkik\t241064\n安卓版v3.0\t241065\n热切\t241066\nxboxon\t241067\n端州区人民政府\t241068\n田园风\t241069\n赏花季\t241070\n东方美谷\t241071\n北京汇文中学\t241072\nscholar\t241073\n2.5次元\t241074\n存世量\t241075\n南瓜头\t241076\n女声版\t241077\n起劲\t241078\n冰水\t241079\n六时\t241080\n斯巴鲁brz\t241081\n火影忍者动画\t241082\n够不够\t241083\n广东省省委\t241084\n免考\t241085\n230路\t241086\n美生活\t241087\n中华响扇\t241088\n不灭\t241089\n66rpg\t241090\n龙岗河\t241091\n有限元分析从入门到精通\t241092\n玩机\t241093\nCFile\t241094\n青年报\t241095\n指定类\t241096\n贵州省高速公路管理局\t241097\n文庙\t241098\n接访\t241099\n明细表格\t241100\nWrinkle\t241101\n曲面电视\t241102\n20141128\t241103\n解算\t241104\n1万元\t241105\nEyre\t241106\nliquidation\t241107\n萨姆\t241108\n3309\t241109\n全温\t241110\n煲\t241111\n鲁能公馆\t241112\n孩童\t241113\n智能网联\t241114\n2811\t241115\n灵溪镇\t241116\nSherman\t241117\n庆丰包子铺\t241118\n株洲西\t241119\nR卡\t241120\npwd\t241121\n川崎绫\t241122\n董旭\t241123\n握力\t241124\n第一帧\t241125\n国际商务区\t241126\n大荣\t241127\n江苏凤凰教育出版社\t241128\n杜鹃园\t241129\n︿\t241130\n斗罗大陆2绝世唐门\t241131\n彭志强\t241132\n美纲玲\t241133\nminila\t241134\nⅧ\t241135\nEPR\t241136\n1月9日\t241137\n汽修工\t241138\n无人售\t241139\n鲍叔牙\t241140\n食物网\t241141\n亿阳\t241142\n纯甄\t241143\n卖逼\t241144\n刪\t241145\n尊尼\t241146\nGases\t241147\n蟹肉棒\t241148\n1936年\t241149\n8.1.4\t241150\n山主\t241151\nmongos\t241152\n奥迪a6l\t241153\n马洁\t241154\n浴巾架\t241155\n曾国藩故居\t24
1156\n于翔\t241157\n20150825\t241158\n关山月美术馆\t241159\n大舟\t241160\n剑南春\t241161\n加油\t241162\n945.Com\t241163\n海信洗衣机\t241164\n门业\t241165\n37.7度\t241166\n毛利\t241167\n李云\t241168\n葛沽\t241169\n省委统战部\t241170\n金山屯\t241171\n先修\t241172\n沙与沫\t241173\n隆化\t241174\n巩义在线\t241175\n1280x720\t241176\n南海实验学校\t241177\n简道云\t241178\n电脑硬盘序列号\t241179\n紧迫\t241180\n三河镇\t241181\n全能版\t241182\n惊魂博物馆\t241183\n无难\t241184\n舍近求远\t241185\nOOS\t241186\n100欧\t241187\nMinikube\t241188\n轨道轮\t241189\n老鼠\t241190\nZ3期\t241191\n瓦片\t241192\nVPL\t241193\n穿帮\t241194\n合肥财经职业学院\t241195\ndar\t241196\n梦幻西游天机城\t241197\nC4d\t241198\n千分之几\t241199\npellet\t241200\n宁夏回族\t241201\n领标\t241202\n向量\t241203\n十二部\t241204\n用集\t241205\nlisp\t241206\n金浦机场\t241207\nxixirenti\t241208\n冰感\t241209\n做功\t241210\n陈灯\t241211\n使用类\t241212\n超清版\t241213\n葛明\t241214\n苏荃\t241215\nSunshine组合\t241216\nregularly\t241217\n杯水\t241218\n青年人网\t241219\n慢性心力衰竭\t241220\n梅花拳\t241221\n云谷\t241222\n人类一败涂地\t241223\n扬名\t241224\n康达新材\t241225\nClement\t241226\n中山装\t241227\n中国学士院\t241228\n春风\t241229\nSIGMA\t241230\n道教\t241231\n中国热带农业科学院\t241232\ngenerally\t241233\n美德乐\t241234\n徐明星\t241235\n实数根\t241236\n制锦市\t241237\n子豪\t241238\n快船队\t241239\n几何量\t241240\n蜀光网\t241241\n饥渴\t241242\n621路\t241243\n向下\t241244\n高锰钢\t241245\n吴思佳\t241246\n自兹\t241247\n电热毯\t241248\n红豆糕\t241249\n雷波\t241250\n8561\t241251\n0331\t241252\n鳌\t241253\nSING女团\t241254\n品丝\t241255\n和其正\t241256\n21片\t241257\npremi\t241258\n最强大脑\t241259\n奔现\t241260\n别集\t241261\n百度云网盘\t241262\n新星宇\t241263\n别墅\t241264\n三百六十行\t241265\n优惠包\t241266\n无限气\t241267\nJavaScript编程学院_IT学院\t241268\n东明路\t241269\n一花\t241270\n总段\t241271\n没完\t241272\navs\t241273\n语气\t241274\n同益\t241275\nCitavi\t241276\n三湖\t241277\n告\t241278\n风竹\t241279\nmato\t241280\n人间世\t241281\n新安全观\t241282\n意绵\t241283\n廉政建设\t241284\n5月2日起\t241285\n童年时\t241286\n射洪县\t241287\n文俊\t241288\n郑王庙\t241289\nictclas\t241290\n唐都医院\t241291\n零6个月\t241292\n国宾馆\t241293\n低压断路器\t241294\n山歌\t241295\nErgomax\t241296\n注册资金\t241297\n株洲市人民政府\t241298\n阿悄\t241299\n导符\t241300\n浙江树人大学\t241301\n班
加西\t241302\n刘梦溪\t241303\n西澳大学\t241304\n第2套\t241305\n罗定\t241306\n亲爱的妈妈\t241307\n铁皮枫斗\t241308\n黑长直\t241309\n沾衣欲\t241310\n河南教育学院\t241311\n华为盒子-奇珀网\t241312\n炮山甲\t241313\n爱机\t241314\nthou\t241315\n露娜2\t241316\n酱香白酒\t241317\n波司登\t241318\n长大于\t241319\n欢悦\t241320\n30支\t241321\n扩军\t241322\n东旭蓝天\t241323\n品致\t241324\n外埠\t241325\nLevel5\t241326\n仙女湖\t241327\n五角大楼\t241328\nGiantess\t241329\n盖法\t241330\n飞秒手术\t241331\n摸脸\t241332\n善辩\t241333\n酶\t241334\n25美元\t241335\n非球面镜片\t241336\n超碰碰\t241337\nSaab\t241338\n行政级别\t241339\n107篇\t241340\n塞夏\t241341\n自启项\t241342\n德州仪器\t241343\n助推器\t241344\n银灿\t241345\n二郎山\t241346\n黑客临高启明\t241347\n过于\t241348\n3700万\t241349\n氟树脂\t241350\nclassify\t241351\n挂画\t241352\n控制员\t241353\n东海\t241354\n北京焦化厂\t241355\n免税店\t241356\n间歇期\t241357\n重拾\t241358\n换补\t241359\nfiv\t241360\n电子信息系统\t241361\ncosplay\t241362\n吟荷\t241363\n强奸2制服诱惑\t241364\nmetricbeat\t241365\n东湖小学\t241366\n魔兽争霸RPG\t241367\n梵克雅宝\t241368\n凉宫\t241369\n旋转体\t241370\n积算仪\t241371\nHMS\t241372\n熔炉\t241373\n苦中苦\t241374\n黔人社厅通\t241375\n仰卧\t241376\nhezi\t241377\n北卡州\t241378\nFreeRTOS\t241379\ncpda\t241380\n中欧班列\t241381\n猴山\t241382\n顺泽\t241383\n恐龙世界\t241384\n太矮\t241385\n水温\t241386\nWindows2012\t241387\ndup\t241388\n40多名\t241389\n同济科技\t241390\n小见\t241391\n酷狗M1\t241392\n国际交流学院\t241393\n100x\t241394\n网智\t241395\n碎铁\t241396\n文熙俊\t241397\n635k\t241398\nTLM\t241399\n玉兰香苑\t241400\n亳\t241401\nA.O.史密斯\t241402\n爱我你就抱抱我\t241403\n颁奖典\t241404\n弧线\t241405\n桂阳县政府\t241406\n中核\t241407\n社会山\t241408\n中国电子科技集团公司第三十八研究所\t241409\nMASSAGE\t241410\n分拣机\t241411\n泽野\t241412\n萨博尼斯\t241413\n合全药业\t241414\n盘型\t241415\n9种\t241416\n流平\t241417\n160平米\t241418\n可倾式\t241419\n中南院\t241420\n铜元局\t241421\n专题展\t241422\n武汉住房公积金管理中心\t241423\nBrain\t241424\n濮院镇\t241425\n印刷费\t241426\n房天下装修家居网\t241427\nAlicia\t241428\n陈秋霞\t241429\n尊\t241430\n周海媚\t241431\nsmpp\t241432\n放射\t241433\n海洋科技\t241434\n宠修\t241435\n胶辊\t241436\n地下交通站\t241437\n技术革新\t241438\n阿盟\t241439\n金庸群侠传单机版\t241440\npano\t241441\n小叶榕\t241442\n怎样\t241443\nGoLang\t241444\ndjs\t241445\n巴巴熊\t241446\n
Composite\t241447\n260克\t241448\n缤纷\t241449\nReact\t241450\nメイド\t241451\ntencent\t241452\n多得多\t241453\n完达山奶粉\t241454\nCinemas\t241455\n队标\t241456\n自发粉\t241457\n申世京\t241458\n焉耆县\t241459\n岳母娘\t241460\n1810\t241461\n销售代理合同\t241462\n短信网\t241463\n静脉输液\t241464\n老虚\t241465\n青松\t241466\n灵芝破壁孢子粉\t241467\n杨菁\t241468\n狂欢椅\t241469\n天下无双\t241470\n顾阳\t241471\n纺织材料\t241472\n周建人\t241473\n北京摇号\t241474\nComes\t241475\n中将\t241476\n林涛\t241477\n第几集\t241478\n姜网\t241479\nimcrop\t241480\n万家丽\t241481\nGLOBE\t241482\n念坛公园\t241483\n谋局者\t241484\n日常性\t241485\n滑轨\t241486\n陈涉世家\t241487\n高育良\t241488\n干柠檬片\t241489\n吴琼\t241490\n密闭\t241491\n深辰\t241492\n创享\t241493\n积分\t241494\nNBA2K15\t241495\n三禾\t241496\n蚕屎\t241497\n紫发\t241498\n辛亥革命纪念馆\t241499\n亚瑟\t241500\nstrncpy\t241501\n节奏类\t241502\n合十\t241503\n良美\t241504\n封侯\t241505\n放水\t241506\n亦可\t241507\n粪男孩\t241508\n路桥区\t241509\n33年\t241510\n金刚石锯片\t241511\n尚明\t241512\nvom\t241513\n杭州联合银行\t241514\n正准\t241515\nHibernateTemplate\t241516\n任浩铭\t241517\n德玛吉\t241518\n香果\t241519\n上海国际能源交易中心\t241520\n南京师范大学仙林校区\t241521\njdp\t241522\n韦曲\t241523\n麦积\t241524\n2018041期\t241525\n菲利华\t241526\n小弯\t241527\n视机\t241528\n播音与主持艺术专业\t241529\n平息\t241530\n御载\t241531\n寻程\t241532\n无法忍耐\t241533\nTension\t241534\n7.52\t241535\n阿尔伯特·爱因斯坦\t241536\n衢州市区\t241537\n东方绀珠传\t241538\noreal\t241539\nPAYDAY\t241540\n杀伐\t241541\nredirected\t241542\n48亿元\t241543\n林上\t241544\n2017年10月17日\t241545\n五台山体育馆\t241546\n选择\t241547\n西安航天基地\t241548\n周乃翔\t241549\n双剑合璧\t241550\n荒木经惟\t241551\n薛帕德\t241552\n5.5万\t241553\nbreakpoint\t241554\n乐成镇\t241555\n环丙烷\t241556\n高瑟\t241557\n肚上\t241558\n卫报\t241559\n岱海\t241560\n刘备三国演义\t241561\n一吻成瘾\t241562\n北京大学生电影节\t241563\n索纳塔九\t241564\n菊苣\t241565\n梦幻传说\t241566\n以案说\t241567\n进样\t241568\n王永江\t241569\nCCProxy\t241570\n肠\t241571\n转换器网\t241572\n陶溪川\t241573\n公开化\t241574\n指向\t241575\n德拉\t241576\nmarginnote\t241577\n丝筒\t241578\n肌筋膜炎\t241579\n身边\t241580\n奶油蛋糕\t241581\n犬奴\t241582\n调适\t241583\n甘南县\t241584\n140克\t241585\n郞\t241586\n平安科技(深圳)有限公司\t241587\n气库\t241588\n福西西\t241589\n羽客羽毛球网\t241
590\nUTF-8_\t241591\n三务\t241592\n4月28\t241593\n挂轮\t241594\n婚庆网\t241595\n保时捷918\t241596\n早夭\t241597\n过敏源检查\t241598\nDayMP3\t241599\n熊先生\t241600\n韶华\t241601\n豫西\t241602\n点算\t241603\n南京脑康中医院\t241604\n干蒸房\t241605\nLovelyz\t241606\n住友\t241607\n叛\t241608\nPOLA美白丸\t241609\n南京男科医院\t241610\n3dmax2014\t241611\n麦瑞\t241612\nPrincipal\t241613\ngoogle浏览器\t241614\n本田杰德\t241615\n传至\t241616\n三国机密潜龙在渊\t241617\nPROMISE\t241618\nR4烧录卡\t241619\nACTA\t241620\n复发率\t241621\n角鹰兽\t241622\nsilk\t241623\npubic\t241624\n王国彬\t241625\n精密机\t241626\nKenny\t241627\n中润\t241628\n天津城市一卡通有限公司\t241629\n站前区\t241630\n200亿元\t241631\nHRF-10T\t241632\n炉石传说新冒险\t241633\n伏尔泰\t241634\n南泉镇\t241635\n泸州职业技术学院\t241636\n沈澜\t241637\n阅件\t241638\n刘易斯·卡罗尔\t241639\n弗吉尼亚大学\t241640\n3分集\t241641\n穆僮\t241642\n硅溶胶\t241643\n奥康\t241644\n移动无限流量卡\t241645\n几列\t241646\n即住\t241647\n十三月\t241648\n飘雨桐\t241649\n愈合\t241650\n爱依服\t241651\nredolog\t241652\nzds\t241653\n120个\t241654\nzhunkua\t241655\n2018.4.8\t241656\n三流大学\t241657\nDoujin\t241658\nifruit\t241659\nTVB\t241660\n南方公园\t241661\n漏斗\t241662\n断了线\t241663\n无水硫酸钠\t241664\n路霸\t241665\n台州人才网\t241666\n赵葆秀\t241667\n卡扎克\t241668\n大魔方\t241669\n青橄榄\t241670\n9998\t241671\n商科\t241672\nstoryline\t241673\nhostel\t241674\n机耕路\t241675\n颜料\t241676\n男女性\t241677\npathname\t241678\n飞砂\t241679\n反季节\t241680\n壮大村\t241681\n1月11日\t241682\n暗扣\t241683\n201712月\t241684\n醋酸锌\t241685\n凤阳县政府\t241686\nSwiss\t241687\nCape\t241688\ncmd命令提示符\t241689\n一万句\t241690\n精准版\t241691\ncarat\t241692\n化妆品\t241693\n糯米饭\t241694\n走地\t241695\n7000\t241696\n徐立华\t241697\n河南地税\t241698\n福格森\t241699\n爱音\t241700\n相交线\t241701\n努维尔\t241702\n吞金\t241703\n李志远\t241704\n11w\t241705\n公交机场\t241706\n针尖\t241707\n徐州经济技术开发区\t241708\n创驰\t241709\nfreee\t241710\nplanet\t241711\n激凸照\t241712\nQQ斗地主\t241713\n奥迪汽车\t241714\nIP-COM\t241715\nTOY\t241716\n增长点\t241717\n狄娜\t241718\n沿江路\t241719\n360图书馆\t241720\nZ370-A\t241721\n机遇期\t241722\nHTML5+\t241723\nWPE\t241724\nABT\t241725\n综合资讯_职业餐饮网\t241726\n露天机\t241727\n郑旭\t241728\n人马\t241729\n荣耀路由器\t241730\nmySQL数据库\t2417
31\n洞穴\t241732\n烧火油\t241733\n猫科\t241734\n摩安珂\t241735\n钢制品\t241736\nV7.3\t241737\n剑网3剑网三\t241738\n羽然\t241739\n内蒙古农牧业厅\t241740\n海马模拟器\t241741\n进出口贸易有限公司\t241742\n魔法杖\t241743\n张彦远\t241744\n生化危机7\t241745\n40G\t241746\n坦克世界军功中心\t241747\nwindow0\t241748\n办假\t241749\n潮色\t241750\ncoolmax\t241751\nourselves\t241752\n嬉水\t241753\n周江\t241754\n神宫寺\t241755\n内蒙古师范大学\t241756\n第237号\t241757\n齐贤\t241758\n李洪强\t241759\n北大出版社\t241760\n东方地灵殿\t241761\n成安县\t241762\n合欢\t241763\n曼天雨\t241764\n2716\t241765\ninspection\t241766\n53度\t241767\n花椒叶\t241768\ndefalut\t241769\n3740\t241770\n生物膜\t241771\n大价\t241772\n天命杯\t241773\nB+轮融资\t241774\n拼爹\t241775\n息县人民政府\t241776\n环氧玻璃\t241777\n慢性便秘\t241778\nincubating\t241779\n御水湾\t241780\n符号学\t241781\n开取\t241782\n事务管理器\t241783\nweet\t241784\nsannian\t241785\n无影响\t241786\n天津神州浩天科技有限公司\t241787\n河南明泰铝业股份有限公司\t241788\n10分钟左右\t241789\n3.2.0\t241790\n太湖花园\t241791\n兼业\t241792\n维多利亚湖\t241793\n次氯酸\t241794\n经济型\t241795\n走向远方\t241796\n触类旁通\t241797\n养猪场\t241798\nnote7\t241799\n4色\t241800\n河沟\t241801\n高翔\t241802\n后山\t241803\nSD内存卡\t241804\n760K\t241805\n工欲善其事\t241806\nUntil\t241807\n虎钳\t241808\n世界邦旅行网\t241809\n乌金山\t241810\n无痛流产\t241811\n▼\t241812\nMetaspace\t241813\n7卷\t241814\n常青树\t241815\n挡煞\t241816\n一二级\t241817\n闪灵\t241818\n防损员\t241819\nSymptoms\t241820\n女鼠\t241821\n幂等性\t241822\n教师资格证\t241823\n数万亿\t241824\nretrof\t241825\n${}\t241826\n朱晓峰\t241827\n尚雅\t241828\n复合剂\t241829\n贤士\t241830\n湿球\t241831\ncelebrity\t241832\n101创始人你好\t241833\n客户化\t241834\n用人之道\t241835\n贴签\t241836\n培佳\t241837\ncp1e\t241838\n凹凸不平\t241839\n2018-01-31\t241840\n群加群\t241841\nbroforce\t241842\n8084\t241843\n硝基苯甲酸\t241844\nIntelligent\t241845\n项目组\t241846\nキキ\t241847\n行政部\t241848\n开秀\t241849\nHarem\t241850\n王志纲\t241851\n102号\t241852\n全向\t241853\n新房客\t241854\n绿电\t241855\n赤水大瀑布\t241856\n丹东港\t241857\n蜡水\t241858\nTHEORY\t241859\n太原街\t241860\n长戈\t241861\n忠旺集团\t241862\nサキュバス\t241863\nw5100\t241864\nQQ西游\t241865\nICP-OES\t241866\n财行\t241867\n三次产业\t241868\nWolf\t241869\n新景祥\t241870\n黑称\t241871\n社旗县\t241872\n李庆扬\t
241873\n帶\t241874\n勉传\t241875\n三帅\t241876\n华日\t241877\n百旺\t241878\n罠\t241879\nlspl\t241880\n大田\t241881\ndubbox\t241882\n涉税服务实务\t241883\n双播\t241884\nREST\t241885\n神位\t241886\n阿提\t241887\ngories\t241888\n淘营销\t241889\nserver2012\t241890\n3x2\t241891\n毛瑟98k\t241892\n金融史\t241893\n冰鸟\t241894\n联英\t241895\nzrender\t241896\nESP32\t241897\npharma\t241898\n灌河\t241899\n澜\t241900\n调定\t241901\n周家镇\t241902\n卡环\t241903\n英威腾\t241904\n15.8\t241905\n黄自\t241906\nhighlighting\t241907\n百度公司\t241908\n金御札\t241909\n太极湖\t241910\n歼八\t241911\n上卖\t241912\n杨宇\t241913\n长方形\t241914\n四川省成都外国语学校\t241915\n收获\t241916\n智能包\t241917\nBed\t241918\n日译\t241919\n华天科技\t241920\nNTTV\t241921\n房价网\t241922\n中国人寿\t241923\n生魂\t241924\nCKG48\t241925\nz5c\t241926\n广州地铁三号线\t241927\n神力女超人\t241928\n旧路\t241929\n薛家湾\t241930\n火焰喷射器\t241931\nSilverlight\t241932\n楚楚动人\t241933\n核资\t241934\n比特币挖矿机\t241935\n费沁源\t241936\n山北\t241937\n微电\t241938\nlvjun106\t241939\n陈萌\t241940\n7000_\t241941\n脉管\t241942\n比值\t241943\n中国葛洲坝集团股份有限公司\t241944\n6-10\t241945\n乐秀\t241946\n新闻人\t241947\nchiu\t241948\nzjz\t241949\n六两\t241950\nexcel表里\t241951\n郭敬施瓦辛格\t241952\n冬奥会\t241953\n员工\t241954\n鉴定结论\t241955\n特律\t241956\n黄金赛\t241957\nlait\t241958\n新概念英语全四册\t241959\n工商局打印公司\t241960\n弗拉德三世\t241961\n波表\t241962\n牡丹虾\t241963\n贡献奖\t241964\nmoblie\t241965\n涨停\t241966\n杨庆\t241967\n三十多年前\t241968\n皮尔特沃夫\t241969\n戴泽\t241970\nwacom\t241971\n广州港\t241972\nScholar\t241973\n换手率\t241974\nsphere\t241975\n彩开奖\t241976\n过多久\t241977\nTOP9\t241978\n应运而生\t241979\n光之国\t241980\n群号\t241981\n皮包包\t241982\n游览器\t241983\n爱情片\t241984\n娃娃脸\t241985\n荒骷髅\t241986\n镍铬\t241987\n宝山村\t241988\n澄海\t241989\n初训\t241990\n咸宁房产网\t241991\n天府新区直管区\t241992\ninitio\t241993\n李代数\t241994\nbou\t241995\n自贡市人民政府\t241996\n木糠杯\t241997\npicnic\t241998\n化简比\t241999\n战报\t242000\n反馈型\t242001\n中国地质博物馆\t242002\n壮举\t242003\n20170104\t242004\nWiz\t242005\nAdy映画\t242006\n万里山\t242007\n虫虫\t242008\n基础知识\t242009\n15分钟内\t242010\n神爱爱小说网\t242011\n国会山\t242012\n第七层\t242013\n云锦路\t242014\n苏州市人民政府\t242015\n木指\t242016\n0.2级\t242017\n15
0平方米\t242018\n显存容量\t242019\n轴承\t242020\n校园文\t242021\n安姐\t242022\n北京市海淀外国语实验学校\t242023\n致公党\t242024\n咱们穿越吧\t242025\nPixelStyle\t242026\n沈金龙\t242027\n士兵证\t242028\n操作指\t242029\n填充液\t242030\n740\t242031\n物品\t242032\n韩家\t242033\n万贯\t242034\n骑马与砍杀1.011\t242035\n跤\t242036\n微基因\t242037\n军事化\t242038\n波什\t242039\n国际投行\t242040\n北京师范大学出版社\t242041\n120亩\t242042\n无败神装机龙\t242043\n波纹状\t242044\n非木\t242045\n大开沙界\t242046\n房潮\t242047\n理气\t242048\n察哈尔右翼中旗\t242049\n叶波\t242050\n谢客\t242051\n跨坐\t242052\n后羿陵墓\t242053\n浉河区\t242054\nECFA\t242055\n花郎\t242056\n全美音乐奖\t242057\n天盾\t242058\n两款\t242059\n魁拔\t242060\nlevel5\t242061\n卧龙吧\t242062\n佛性\t242063\n岭南新天地\t242064\nSMS\t242065\n金蛋理财\t242066\n陈德\t242067\n棉花糖小说网\t242068\n格外\t242069\n健达\t242070\n长诗\t242071\n小虎子\t242072\n大众日报\t242073\n被删\t242074\n油腻\t242075\n孺子\t242076\ndickies\t242077\nSurviving\t242078\n英唐智控\t242079\nAMOS\t242080\n多位数\t242081\nB型血\t242082\n依兰爱情故事\t242083\n线路板\t242084\n价值点\t242085\n敬献\t242086\n丽颖\t242087\n恩菲尔德\t242088\nIPHONE5\t242089\nAlldatasheet\t242090\n长安欧尚X70A车友会\t242091\n胳肢\t242092\n微腔\t242093\n压板\t242094\n百亿\t242095\nconduction\t242096\n深圳东\t242097\n都灵\t242098\n河北省总工会\t242099\n18家\t242100\ndata1\t242101\n摩羯\t242102\nEpiphone\t242103\n鸡图\t242104\n几何图形\t242105\n鼓起勇气\t242106\n深圳市冠亚达电子有限公司\t242107\n专用车\t242108\n中国邮政速递物流股份有限公司\t242109\n月莹\t242110\nZe\t242111\n授乳\t242112\n字符画\t242113\n罗技402\t242114\n秦刚\t242115\n7.0.85\t242116\nDUBBO\t242117\n自精\t242118\n4円\t242119\n广发银行信用卡\t242120\n330im\t242121\n卫国\t242122\n公安局长\t242123\naov\t242124\n清科集团\t242125\n双杠\t242126\n半袖\t242127\nWebBrowser\t242128\n八校联考\t242129\n医学院\t242130\nqspi\t242131\nnamenode\t242132\n定时开关\t242133\n40GB\t242134\n行知路\t242135\nUNSW\t242136\n貘\t242137\n范畴\t242138\n小娟\t242139\n块级\t242140\n义乌\t242141\n姜饼\t242142\n欧莱雅集团\t242143\nsubject\t242144\nbubble\t242145\n腿骨\t242146\n老公司\t242147\n人帮\t242148\n荷尔拜\t242149\n安徽广播影视职业技术学院\t242150\n2013年8月\t242151\n肺子\t242152\nOrganized\t242153\njyk\t242154\n元山\t242155\n电磁换向阀\t242156\nC卷\t242157\n300岁\t242158\n节材\t242159\n女生\t242160\nzhì
\t242161\ninsulin\t242162\n混料机\t242163\n巾帼枭雄\t242164\n绅宝X65\t242165\n设计图\t242166\n护手霜\t242167\n南京市口腔医院\t242168\nBoots\t242169\n保固\t242170\n夹器\t242171\n拍摄技\t242172\n迪堡\t242173\n紫阳街道\t242174\n满秋\t242175\n二十七\t242176\n异物感\t242177\n国家食品药品监管总局\t242178\n双高双普\t242179\n龙芯\t242180\n20集\t242181\njsdo\t242182\nキング\t242183\nオンナ狩\t242184\n黑冰爆珠\t242185\n学雷锋\t242186\n谆谆教导\t242187\n收银秤\t242188\n连绵\t242189\n福建教育电视台\t242190\n克格勃\t242191\n菅野美穗\t242192\n陈小熊\t242193\n0307\t242194\n浙江省农业厅\t242195\n魏树旺\t242196\n三相\t242197\n原因\t242198\n第41章\t242199\n笼具\t242200\nVPNs\t242201\n博彩\t242202\n110平方\t242203\n葫芦村\t242204\n熊胆\t242205\n48天后\t242206\n衡阳路\t242207\n古拉加斯\t242208\n十全\t242209\n协议库存\t242210\nARCH\t242211\n丁彦雨航\t242212\n第十二轮\t242213\nzixun\t242214\np80\t242215\n套曲\t242216\n柏悦\t242217\n守擂\t242218\nrtmp直播流\t242219\n跌宕\t242220\n菲利普·科特勒\t242221\nBoa\t242222\n核桃酥\t242223\n十二五规划\t242224\nPS4/X1/PC\t242225\n委托付款\t242226\n穿普拉达的女王\t242227\n韩斌\t242228\n特尔施特根\t242229\n长春市委\t242230\nDz\t242231\n唱标\t242232\n济南森峰\t242233\n迀\t242234\n通便\t242235\n军服\t242236\n生物醇油\t242237\n广东省环保厅\t242238\n浙江纺织服装职业技术学院\t242239\n双彩网\t242240\n表名\t242241\n翼狐网\t242242\n平安保险\t242243\n成都自贸区\t242244\n人间天堂\t242245\n人教版三年级语文下册\t242246\n河南豫剧网\t242247\n地震力\t242248\n微商群\t242249\n318国道\t242250\n少年三国\t242251\n通感\t242252\n奥涅金\t242253\n春柳\t242254\nunterminated\t242255\n宝生莉莉\t242256\n云霄路\t242257\n粉墙机\t242258\nKeen\t242259\n动画学\t242260\n沈涛\t242261\nSQLite3\t242262\n团购网_好团网\t242263\n辛亥革命博物馆\t242264\n农行手机银行\t242265\n新思达\t242266\n湾悦城\t242267\n二胡谱\t242268\n航空学院\t242269\n业原\t242270\n扮嫩\t242271\n余高\t242272\n闸北区\t242273\n草单\t242274\n郑大远程\t242275\ncadence\t242276\n天堂伞\t242277\n街头巷尾\t242278\n65g\t242279\nCSS样式\t242280\n海淀公园\t242281\n临沂经济开发区\t242282\n水山\t242283\n氘水\t242284\n9198\t242285\naugment\t242286\n45000元\t242287\n多言\t242288\n美的空调\t242289\n电容传感器\t242290\n民政局\t242291\n苞米叶网\t242292\n觉得\t242293\n英卓\t242294\nlinux人中文\t242295\n张家山\t242296\n圆锥曲线\t242297\n高扬\t242298\n马克思恩格斯全集\t242299\n仲盛世界商城\t242300\n暴击\t242301\n虹桥火车站站\t242302\n米乐\t242303\n气体流量计\t242304
\n老翁\t242305\n几年级\t242306\n非洲象\t242307\nI/O\t242308\n304医院\t242309\nFrance\t242310\n暗黑3卡奈魔盒\t242311\n28181\t242312\n圣女\t242313\n浣花溪\t242314\nn9008\t242315\n聚集率\t242316\n浒苔\t242317\n阎罗\t242318\nDarren\t242319\n大姨子\t242320\n三星A9\t242321\n为首\t242322\n凯叔西游记\t242323\n联合国宪章\t242324\n北大医学部\t242325\n野鸭湖\t242326\n道路交通事故处理程序规定\t242327\n佛鳄\t242328\n众生相\t242329\nguns\t242330\n乳腺浸润性导管癌\t242331\n常德农经网\t242332\nSelena\t242333\n新浪理财师\t242334\n贵港日报数字报\t242335\n已知道\t242336\n世纪金源大饭店\t242337\n饵料\t242338\n北京市商务委员会\t242339\n爱夏\t242340\n002142\t242341\nムペ\t242342\n爆一爆\t242343\n佳兆业集团\t242344\n信息港山阳论坛山阳网\t242345\n设计者\t242346\n塑钢门\t242347\n离心式\t242348\n高小凤\t242349\n九月九日\t242350\n半边脸\t242351\n红宠\t242352\n中华建设网\t242353\n条带\t242354\n的句子_经典语录\t242355\n遣返\t242356\nKoh\t242357\n亚太杯\t242358\n南安普顿\t242359\n枪钻\t242360\n终极街头霸王4\t242361\n能斯特\t242362\n风情节\t242363\nPyramid\t242364\nPIRT梦想旅行网\t242365\n擦边\t242366\nEmerald\t242367\n亜衣\t242368\n电竞馆\t242369\n经济增长\t242370\n中医街\t242371\n20170115\t242372\n波托菲诺\t242373\nSADP\t242374\n拍卖\t242375\n直接融资\t242376\n德银\t242377\n唱一首歌\t242378\n李罕\t242379\n62位\t242380\n汉川房网\t242381\n勿施\t242382\nUV灯\t242383\n599962030\t242384\n_川川\t242385\n库利\t242386\n带式干燥机\t242387\n127.0.01\t242388\n陕西省国资委\t242389\n强强联合\t242390\nt梁\t242391\n名座\t242392\nmeghan\t242393\n司徒兰芳\t242394\n赛迪顾问\t242395\n界门纲目科属种\t242396\n9千克\t242397\nVentures\t242398\n庐山会议\t242399\n级数\t242400\nwww.75pk.com\t242401\n京华烟云\t242402\n殷殷\t242403\n东风风光\t242404\n石塔\t242405\namx\t242406\n特别行政区\t242407\n回身\t242408\n方正电子\t242409\n钛酸酯偶联剂\t242410\n上海仁爱医院\t242411\nlinYongJie\t242412\n主支\t242413\n录态\t242414\npia\t242415\n梭芯\t242416\n23:00\t242417\n牵牛花\t242418\n永利\t242419\n_舞文弄墨\t242420\n摄影家协会\t242421\nzk\t242422\n小向美\t242423\n摩登女郎\t242424\n共产主义\t242425\n丽水职业技术学院\t242426\n小五台山\t242427\n2013款款\t242428\n乱斗堂\t242429\n穿搭\t242430\n北京恒昌汇财\t242431\n阿育王\t242432\nsmartpss\t242433\n曼奈柯斯\t242434\n星马\t242435\n重做\t242436\njulius\t242437\nD4\t242438\n钟逸伦\t242439\nzjol\t242440\n北城国际\t242441\n内疚\t242442\n乐扣乐扣\t242443\n月季树\t242444\n大渠道\t242445\n仰面
\t242446\n三四分钟\t242447\n国哥\t242448\n原创附教学_广场舞\t242449\n良工\t242450\n黄大发\t242451\n220\t242452\n缎\t242453\n18av\t242454\n十晚\t242455\nraise\t242456\n兰玉\t242457\n西秦\t242458\n约读\t242459\n区统计局\t242460\nprepareStatement\t242461\n丹灶镇\t242462\nE45\t242463\n飞瀑\t242464\nructure\t242465\n快播\t242466\nDirectly\t242467\n断路\t242468\n松糕\t242469\n超星读书\t242470\n卡通画\t242471\nMicrobiology\t242472\nnested\t242473\n散手\t242474\n羽咲\t242475\nvSphere\t242476\n郑雨盛\t242477\n心咒\t242478\nParadise\t242479\n陈雷\t242480\n杜陵\t242481\n多汁\t242482\n侵权法\t242483\n脆皮烤鸭\t242484\n苏州东山精密制造股份有限公司\t242485\n牛马\t242486\n英树\t242487\nnaya\t242488\n黄克诚\t242489\n水流传感器\t242490\n生产事故\t242491\n食光\t242492\n微博网\t242493\n有和\t242494\n腾讯传\t242495\n算客\t242496\n仔片\t242497\n青春励志\t242498\n难过\t242499\n淮南\t242500\n滑膜\t242501\n中山市地方税务局\t242502\n华美\t242503\n1hz\t242504\n上星期\t242505\n工党\t242506\n屏占比\t242507\npt80\t242508\n曾哥\t242509\n宋宇\t242510\n岐山县人民政府\t242511\n施虐\t242512\n新华侨报网\t242513\n虫穴\t242514\n家电消费网\t242515\nled电视\t242516\n20160206\t242517\n埃及金字塔\t242518\n融创海湾半岛\t242519\n逗唱\t242520\n城巴\t242521\n鹤岗市\t242522\n非人类\t242523\nL440\t242524\n女怕嫁错郎\t242525\n远洋新天地\t242526\n公寓楼\t242527\n克拉默\t242528\n自题\t242529\n义马\t242530\n开心洋葱\t242531\n三门海\t242532\n关节囊\t242533\nEas\t242534\nsubstitution\t242535\n6点半\t242536\n悠悠\t242537\n引发\t242538\n5186\t242539\nNr\t242540\n医卫\t242541\ncif.mofcom.gov.cn/site/UserFiles/File/1339403830496.mht\t242542\nNPR\t242543\n湘西土家族苗族自治州人民政府\t242544\n注册建筑师\t242545\n白军\t242546\n300份\t242547\n稍纵即逝\t242548\n5203\t242549\n灵狗\t242550\n普华商学院\t242551\n宝庆路\t242552\nsharelatex\t242553\n浮石\t242554\nimslp\t242555\n泊松\t242556\n甲酰胺\t242557\n舍生忘死\t242558\naufs\t242559\n维斯特洛\t242560\n得闲\t242561\n23只\t242562\n消防工程专业\t242563\n昌邑一中\t242564\nngmodel\t242565\n啊哈\t242566\n板件\t242567\nchannelartlist\t242568\n鲁安\t242569\n蜀山\t242570\n发动机油\t242571\n双玻\t242572\n黑王\t242573\n唱好\t242574\n看图写\t242575\n老龄委\t242576\n主公司\t242577\n绿帽王\t242578\n北高镇\t242579\ndaz3d\t242580\n2011协同创新中心\t242581\n宁波市对外贸易经济合作局\t242582\n二梯\t242583\n上海厂房网\t242584\ntimg\t242585\n
彩虹门\t242586\nscare\t242587\n航空博物馆\t242588\nfemibion\t242589\n局域网共享软件\t242590\n嫡\t242591\n红草\t242592\n赵括\t242593\nfl12\t242594\n50ml\t242595\n红糖水\t242596\n0px\t242597\n老底\t242598\n烟\t242599\n无遮拦\t242600\nX1C\t242601\n俾鬼\t242602\n16gb\t242603\n紫金鼠\t242604\n高墙内\t242605\n早孕测试纸\t242606\n逐瘀\t242607\n进行式\t242608\n乙未年\t242609\n冰灯\t242610\n江宁\t242611\n无名英雄\t242612\n赵伊彤\t242613\nzh-cn\t242614\n非生物\t242615\n薄壁不锈钢管\t242616\nWCG2013\t242617\n广州市小学\t242618\n哈弗h6\t242619\n生命科学园\t242620\n大掌柜\t242621\n央视综艺频道\t242622\nBonds\t242623\n隐退\t242624\n春城无处不飞花\t242625\nKEYSHOT\t242626\n下拉条\t242627\n126尊\t242628\n梯步\t242629\n卡档\t242630\n掠翼\t242631\nSa\t242632\n红旗区\t242633\nschott\t242634\nsaltstack\t242635\n采茶纪\t242636\n盘宽\t242637\n甲\t242638\n吉隆坡机场\t242639\n挑空\t242640\np7级\t242641\n随便吃\t242642\n东北侧\t242643\n魔兽编年史\t242644\n乳腺纤维瘤\t242645\nganglia\t242646\nconstitute\t242647\n齐商银行\t242648\n模范堂\t242649\n1935年\t242650\n自力式调节阀\t242651\n大鸭梨烤鸭店\t242652\n掀开\t242653\n金俊浩\t242654\n小息\t242655\n足外翻\t242656\n雷耶斯\t242657\n科化\t242658\n牛弹琴\t242659\n微软Power\t242660\n南京汽车站\t242661\n两艘\t242662\nlucifer\t242663\nDAZ\t242664\n中国人民银行成都分行\t242665\n大豆网\t242666\nrelaxation\t242667\n玄宗\t242668\n车客\t242669\n两餐\t242670\n钱包君\t242671\nreindex\t242672\n居间费\t242673\n瑕疵\t242674\n釉下彩\t242675\n0#块\t242676\nIVT\t242677\n岛津\t242678\n1547\t242679\n甭\t242680\n2g网\t242681\n3bangz\t242682\n日租公寓\t242683\nparade\t242684\n梁漱溟\t242685\ncmdlet\t242686\n2017年3月8日\t242687\n给孩\t242688\n萃茶机\t242689\nDAPP\t242690\n金森女贞\t242691\n含铅\t242692\n北平无战事\t242693\njacoco\t242694\n看待\t242695\n皮儿\t242696\n2把\t242697\n女房客\t242698\n内部人\t242699\nloud\t242700\n1152\t242701\n1.4万\t242702\nTRIGGER\t242703\n约什\t242704\n修炼者\t242705\n李福\t242706\n水蜜桃\t242707\nwordl\t242708\n石家庄铁道学院\t242709\n全国百强中学\t242710\n第四十期\t242711\n邵雍\t242712\n禅城区\t242713\n义渠王\t242714\n勋兴\t242715\n韩讯网\t242716\n大幕\t242717\n媒\t242718\n多空\t242719\n李海龙\t242720\nfrydsh\t242721\n幼升小五证\t242722\n杨门虎将\t242723\n4ms\t242724\n防渗土工膜\t242725\n53条\t242726\n668\t242727\n雷颂德\t242728\nbaggage\t242729\n豆叶\t24273
0\n情话\t242731\nbewithme\t242732\n澳洋科技\t242733\n身长\t242734\n注册商标专用权\t242735\n昌平北\t242736\n阿萝\t242737\n限定\t242738\n油醋\t242739\nworldmachine\t242740\n小猪唏哩呼噜\t242741\nbootstrapping\t242742\nZUK\t242743\n奥迪修仙高手混花都\t242744\n玉手\t242745\n私权\t242746\n话放\t242747\n218j\t242748\n狗狗\t242749\n堂妻\t242750\n六则\t242751\nAutoCAD\t242752\nyokohama\t242753\n林逸\t242754\n口袋妖怪叶绿\t242755\n顺藤摸瓜\t242756\n录程\t242757\n罹患\t242758\n姑娘果\t242759\n联璧\t242760\n集体土地使用权\t242761\nresponse\t242762\n微血管\t242763\n48张\t242764\nshouji\t242765\n夷陵\t242766\n入党函调\t242767\n购物网\t242768\n刁民\t242769\n梁景华\t242770\n相学\t242771\nrosewood\t242772\n血战太平洋\t242773\n幸福蓝海\t242774\n新发现\t242775\n第1步\t242776\n杨记\t242777\n四羊方尊\t242778\n好坏\t242779\n护龈\t242780\n闵惠芬\t242781\n傻瓜相机\t242782\n匠师\t242783\nAppStore\t242784\n7700hq\t242785\n蓝鸽\t242786\nInstitut\t242787\nSKG\t242788\n锰酸锂\t242789\n广西教育\t242790\n德扑\t242791\n哈撒\t242792\n纽黑文\t242793\n自动重\t242794\n星美集团\t242795\n保温材料\t242796\n周云蓬\t242797\n危険日\t242798\n速干裤\t242799\nFiesta\t242800\nU型\t242801\n地坪\t242802\n项链\t242803\n军资\t242804\nanquye\t242805\nwoe\t242806\n季风区\t242807\ncfile\t242808\n连接盘\t242809\n何似\t242810\n深粮集团\t242811\n可可娱乐网\t242812\n3unshine\t242813\nrider\t242814\n深井烧鹅\t242815\n2015.08\t242816\n眼仔\t242817\nv17\t242818\n1660\t242819\n深龋\t242820\n休眠期\t242821\n杀戮秀\t242822\n贾跃\t242823\n摇表器\t242824\n耐酸\t242825\n薄命\t242826\n过几天\t242827\nfolds\t242828\n绕指\t242829\n过桥米线\t242830\n房务部\t242831\n海顺\t242832\nF12\t242833\n杭州市人民政府办公厅\t242834\n义经\t242835\n深深宝A\t242836\n何超\t242837\nExpires\t242838\n闪克2\t242839\n30兆\t242840\n集团化\t242841\nwaiver\t242842\n桃花王小丫\t242843\n建设银行储蓄卡\t242844\n菲舍尔\t242845\nadlm\t242846\ngogoboi\t242847\n三字经\t242848\n旅发\t242849\n八荣八耻\t242850\ndesigner14\t242851\n中国国际电子商务网\t242852\n投控\t242853\n厚泽\t242854\n马睿\t242855\n布匹\t242856\n2018年12月份\t242857\n华发集团\t242858\n奥托博克\t242859\n桃花沟\t242860\n特效化妆师大对决\t242861\n中书省\t242862\n颁奖典礼\t242863\n2326\t242864\nOUTLOOK2013\t242865\n猩红收割者\t242866\n梦幻西游法\t242867\nWinCHM\t242868\n宇宙大爆炸\t242869\n弹簧夹\t242870\n波尔多葡萄酒\t242871\n速动资产\t242872\n硫
化罐\t242873\nCAD测量\t242874\n164个\t242875\n反击\t242876\n天驰\t242877\ndespite\t242878\nBeetl\t242879\n真钞\t242880\n北京中油瑞飞信息技术有限责任公司\t242881\n超低功耗\t242882\nflat\t242883\n四神汤\t242884\n挖机\t242885\nonclick函数\t242886\nDuff\t242887\n奥古斯塔\t242888\ngtk+\t242889\n限贷\t242890\npdf编辑软件\t242891\n中国上海_上海政府\t242892\n80后脱口秀\t242893\n桃林镇\t242894\n依法行政\t242895\n调值\t242896\njobs\t242897\nV80\t242898\n诺水河\t242899\nwatch2\t242900\n夏姓\t242901\nA1466\t242902\nkerwinC\t242903\n云母粉\t242904\n交强险\t242905\n花狸\t242906\nios9.0\t242907\n河钓\t242908\n标局\t242909\n王建东\t242910\n尿失禁\t242911\n微孔板\t242912\n行位\t242913\n华中师范大学武汉传媒学院\t242914\n感观\t242915\njzb\t242916\n全站\t242917\nmargin-left\t242918\n骨感\t242919\n闪电泡芙\t242920\n如初\t242921\n中国城市规划设计研究院\t242922\n尘香花\t242923\n选款\t242924\n赵龙\t242925\n智能医疗\t242926\nEPIC\t242927\n李汉荣\t242928\n灵魂\t242929\n涨感\t242930\n电动隔膜泵\t242931\n恒大人寿保险有限公司\t242932\n选层\t242933\n0.7.3\t242934\n博实股份\t242935\n冷拔管\t242936\n郑毅\t242937\n星廓\t242938\n偶服\t242939\n122家\t242940\nCate\t242941\n非独立核算\t242942\njdhj651\t242943\n刘艳霞\t242944\n王昀\t242945\n加盟\t242946\n巴利\t242947\n3月24日\t242948\n责怪\t242949\n潜者\t242950\n图拉扬\t242951\n一千只\t242952\n弓背\t242953\nmoll\t242954\n隣\t242955\n锁匠\t242956\n蒲松林\t242957\n李文达\t242958\nulong\t242959\n医者\t242960\n红豆\t242961\n流寇\t242962\n牛蛋\t242963\n珠宝类\t242964\n图片花\t242965\n丘比沙拉酱\t242966\nhtml5+js\t242967\n轩辕伏魔录\t242968\nBurton\t242969\n卡扎菲\t242970\n白河峡谷\t242971\nGlee\t242972\n药罐\t242973\n红雨\t242974\n非真实\t242975\n入仓\t242976\n南京南火车站\t242977\nPSCC2018\t242978\n阳光下\t242979\nNAATI\t242980\n孤岛惊魂原始杀戮\t242981\n德克萨斯州\t242982\n都柏林\t242983\n鱼腥\t242984\n20#\t242985\n信息部\t242986\n碧海雄心\t242987\n凯硕\t242988\n惩戒骑\t242989\n煤厂\t242990\n曼珠\t242991\n玄幻奇幻小说-17k小说网\t242992\n一只狗\t242993\n压缩袋\t242994\n郑委\t242995\n1.5P\t242996\n港闸\t242997\n53亿\t242998\n明细帐\t242999\nadministrators\t243000\n连州市政府\t243001\n麻古\t243002\n河南省住房和城乡建设厅\t243003\n2.4寸\t243004\n肠胃\t243005\n作弊版\t243006\n短筒\t243007\n投稿客\t243008\n金盏花水\t243009\n工资单\t243010\nqqhfeng\t243011\n藏文输入法\t243012\n相对数\t243013\n高耀洁\t243014\n佳木斯职业学院\t243015\n半
拍\t243016\n乱射\t243017\n动态硬盘\t243018\nMQTT消息服务器\t243019\n大发888\t243020\n安娜·弗里茨\t243021\n12支\t243022\n平安人寿\t243023\nPrize\t243024\n天安社\t243025\nBas\t243026\n钢笔工具画\t243027\ntrio\t243028\n7.5.6.0\t243029\nkot\t243030\n炸屏\t243031\nelliott\t243032\n百度影音\t243033\n宁夏地区\t243034\n7厘\t243035\n永别\t243036\n盘圆\t243037\n警服\t243038\nibanez\t243039\n小识\t243040\n邀约若风\t243041\n樊梨花\t243042\n瓦妮莎\t243043\n1顿\t243044\n亏心事\t243045\n嘉联\t243046\nshadowsocket\t243047\n易酒批\t243048\n赵雅芝\t243049\nman2013\t243050\n填上\t243051\n收购站\t243052\n醉氧\t243053\nkickstart\t243054\n1930\t243055\n深发展\t243056\n古拉姆\t243057\nWord,Excel\t243058\n山药豆\t243059\n哇噻\t243060\n327号\t243061\nFileUtils\t243062\n宝马3系报价\t243063\n黎黄陂路\t243064\ntomas\t243065\nV6.1\t243066\nVagaa\t243067\nteamviewer12\t243068\n神经网络与深度学习\t243069\n绝世神偷:废柴七小姐\t243070\n济南西客站\t243071\n高启\t243072\n门厂\t243073\nsnail\t243074\n请安\t243075\n淬火机床\t243076\n斐迪南\t243077\n预评价\t243078\n20亿_\t243079\nㄤ\t243080\n于建嵘\t243081\n底云飞\t243082\n33首\t243083\n驭胜S330\t243084\n流民\t243085\n迦南美地\t243086\n安徽省12366电子税务局\t243087\n时任\t243088\nCTO\t243089\n钉珠\t243090\n苹果id\t243091\nsour\t243092\n西戎\t243093\n才知\t243094\n荣耀s7\t243095\n海南省公安厅交通警察总队\t243096\n4场\t243097\nれ\t243098\n福建省体育局\t243099\ns60\t243100\n高清书籍\t243101\n合件\t243102\n3567\t243103\n37pao\t243104\n自我暗示\t243105\n劳恩斯酷派\t243106\n小水电站\t243107\nHelvetica\t243108\nGigaset\t243109\n瀬心美\t243110\n2019上半年\t243111\n糖果游戏浏览器\t243112\npixel吧_\t243113\narray_splice\t243114\n北京海淀外国语实验学校\t243115\n英国宫\t243116\n微孔\t243117\n初夏\t243118\n美职\t243119\n百万亿\t243120\n强制险\t243121\ncsol2吧\t243122\n哈恩\t243123\n开销\t243124\n椿芽\t243125\n豆腐皮\t243126\n蛮荒搜神记\t243127\n化测试\t243128\n可行权\t243129\n秦雨\t243130\n王安忆\t243131\n三生三世十里桃花白浅\t243132\n砸烂\t243133\n曲折性\t243134\n三九\t243135\n统测\t243136\n龙蛇幻想世界大穿越\t243137\n状元\t243138\nutd\t243139\n本命佛\t243140\n银行信\t243141\n北京市劳动和社会保障局\t243142\n注销\t243143\n一句\t243144\n粮人\t243145\nwin32gui\t243146\n三世书\t243147\n应用文写作\t243148\n抽动症\t243149\nsensors\t243150\n春沐源\t243151\n僵尸病毒\t243152\naj12\t243153\n塔式起重机\t243154\n电信网\t243155\n近6
0年\t243156\n颂党恩\t243157\n宝贝宝贝\t243158\n圣城风云\t243159\n药皮\t243160\nTOOLBOX\t243161\n横向稳定杆\t243162\n白哈巴\t243163\n金融学院\t243164\n22行\t243165\n首农集团\t243166\n英菲尼迪汽车\t243167\n烧屏\t243168\nConstructor\t243169\n三更\t243170\n灿灿\t243171\n9mgo\t243172\n全国青少年普法网\t243173\n黑妹\t243174\nmezzanine\t243175\n068\t243176\n西安地铁14号线\t243177\nbutyl\t243178\n小俏媳\t243179\n自定制\t243180\nbaltimore\t243181\nctrl+1\t243182\nPa\t243183\n浦东机场\t243184\n8000元\t243185\n除权除息日\t243186\n亚星客车\t243187\n400种\t243188\n恩比德\t243189\n武汉工业大学\t243190\n超声波金属焊接机\t243191\n模塑科技\t243192\n四氧化二氮\t243193\nuser\t243194\n佳能mg3680\t243195\n流黄鼻涕\t243196\n骨龄\t243197\nVicetone\t243198\n巴里坤县\t243199\n远情\t243200\nnavica\t243201\n凤临阁\t243202\n毫\t243203\n暧\t243204\n漯河\t243205\nqsv格式转换mp4\t243206\n3分钟左右\t243207\n颠覆\t243208\n型号价\t243209\n一个小时\t243210\n汇讯网\t243211\n三顺\t243212\n贝格\t243213\n戴尔灵越15\t243214\n1688年\t243215\n海尔大道\t243216\n蒲城\t243217\n曾国荃\t243218\n戴安\t243219\n焕新颜\t243220\nitself\t243221\n鬼叔\t243222\nawait\t243223\n蛇果\t243224\n湘雅医学院\t243225\n文件传输\t243226\n曹子建\t243227\n720P\t243228\ninvoker\t243229\n捕风捉影\t243230\n紫外可见分光光度计\t243231\n球釜\t243232\n米兰\t243233\n才村\t243234\n竹园镇\t243235\n708090\t243236\n天幻\t243237\n非永燕\t243238\n优质商\t243239\n横越\t243240\n_网贷之家\t243241\n德丰\t243242\n水果姐\t243243\n武汉大学基础医学院\t243244\ncixi\t243245\n12根\t243246\n李守力\t243247\n张艾亚\t243248\n移动支付\t243249\n筵\t243250\n76年\t243251\n抽象思维\t243252\n550亿\t243253\n两盒\t243254\n伪装学渣\t243255\n脱育\t243256\np7\t243257\n非上市公司\t243258\n画谱\t243259\n龙头山\t243260\n网上阅卷\t243261\n烧烤机\t243262\n没注意\t243263\n银翼杀手2049\t243264\n马克尔\t243265\nhtai\t243266\njuypter\t243267\n中国国防动员网\t243268\n冬子\t243269\n库木塔格沙漠\t243270\n无痛性\t243271\n胡宗宪\t243272\n人体解剖学\t243273\n海斯\t243274\n腹语\t243275\n月桂叶\t243276\n新捷达\t243277\n建设银行网银盾\t243278\n保利凤凰湾\t243279\n解放路\t243280\n老客\t243281\nweiss\t243282\nDancer\t243283\n活动篇\t243284\nudesk\t243285\n无菌医疗器械\t243286\n截杀\t243287\n防静\t243288\n特力\t243289\n仿旧\t243290\n匹夫有责\t243291\narri\t243292\n逆天级\t243293\n香泉镇\t243294\n杰伟世\t243295\n校巴\t243296\n四百个\t243297\n进水量\t243298\nwebroot\
t243299\n警备区\t243300\n红杏出墙\t243301\nSUMIFS函数\t243302\nBRITA\t243303\n刘朝晖\t243304\n施华洛世奇\t243305\n追昔\t243306\n三第二\t243307\nUptown\t243308\n恶人谷\t243309\n表姑\t243310\n升小\t243311\n龙水镇\t243312\nLineRenderer\t243313\n星光璀璨\t243314\n未知量\t243315\nsense8\t243316\n三摩地\t243317\n1千克\t243318\n南瓜汁\t243319\n土柱\t243320\n易乎\t243321\n史各庄\t243322\n3dMAX\t243323\n法国银行\t243324\n新葡京娱乐场\t243325\n27万\t243326\nbioconductor\t243327\n赵峰\t243328\n高H.NP\t243329\nSKD\t243330\n印刷展\t243331\n中央八项规定\t243332\n现代名\t243333\n重庆电讯职业学院\t243334\n性色\t243335\n代理服\t243336\n武汉\t243337\n国族\t243338\n奥田民生\t243339\n魏蜀\t243340\n简奥斯丁\t243341\n利群集团\t243342\n金普\t243343\nrdf\t243344\n亿美元\t243345\n极迅网游加速器\t243346\n网红思瑞\t243347\n装配图网\t243348\n资料包\t243349\n比特币病毒\t243350\n第七周\t243351\n云联惠\t243352\n第三十个\t243353\nNetworking\t243354\n两房\t243355\nulike\t243356\n李菲儿\t243357\n骨子里\t243358\nmassa\t243359\n拓跋\t243360\n佛冈\t243361\n自由之翼\t243362\n老飞度\t243363\n子宫内膜癌\t243364\n白浊之村\t243365\njedi\t243366\n首付款\t243367\n信息箱\t243368\n爬电\t243369\n玩儿\t243370\n沙柳\t243371\n滞销品\t243372\n制分\t243373\n变体\t243374\n老年证\t243375\n多程\t243376\n纵梁\t243377\n朱良春\t243378\n行客\t243379\n奥利司\t243380\n呵斥\t243381\n奔泰\t243382\n9月10号\t243383\n25p\t243384\n65536行\t243385\n中国铁路上海局集团有限公司\t243386\n荆州日报\t243387\nhtml2\t243388\n收藏家\t243389\n中位\t243390\n贫民窟\t243391\nSheldon\t243392\n哈尔滨大学\t243393\n屠魔\t243394\n莫科\t243395\n神出鬼没\t243396\ngearbox\t243397\n不存\t243398\n报关员考试\t243399\n床铃\t243400\nat指令\t243401\n马戏团\t243402\n陈奂仁\t243403\nlrt\t243404\n亲笔信\t243405\n玛尔斯\t243406\n600050\t243407\nB2B2C\t243408\n70克\t243409\n孟府\t243410\nparc\t243411\n乙城\t243412\n爱育\t243413\n尼克\t243414\n100名\t243415\n新趋势\t243416\n进贴\t243417\n人面桃花\t243418\n开云集团\t243419\n蜜月期\t243420\n4任\t243421\n短融\t243422\n安倍晋三\t243423\n校检\t243424\n多尼克\t243425\nTry\t243426\n宝安公园\t243427\n刘龙\t243428\n阳光奥美集团\t243429\n素叻他尼\t243430\niid\t243431\n与时俱进\t243432\n盘古大观\t243433\ninherit\t243434\n江西工业贸易职业技术学院\t243435\n沈阳东\t243436\n黄素熙\t243437\n招财鱼\t243438\n链接池\t243439\nmsdb\t243440\n基准价\t243441\n108颗\t243442\n倒是\t243443\n首份\t243444\n绸子\
t243445\nFluid\t243446\n一个500\t243447\n群光广场\t243448\n培养基\t243449\n百度云压缩包\t243450\ncodefirst\t243451\n清华科技园\t243452\n女老师\t243453\n奇技\t243454\n第28期\t243455\n鱼石脂\t243456\n临城镇\t243457\n科普中国\t243458\n王晨阳\t243459\n10.11\t243460\n上社\t243461\n景明\t243462\n不弃\t243463\n只顾\t243464\n朴海林\t243465\n义阳\t243466\n万科大都会\t243467\n上海地铁17号线\t243468\n一根16G\t243469\n李劼杰\t243470\n饭冈加奈子\t243471\n老楼\t243472\n2019年12月\t243473\n切中要害\t243474\n碧生源常润茶\t243475\n74部\t243476\n这些话\t243477\n王煜全\t243478\n讲解器\t243479\n小房车\t243480\n杭十四中\t243481\n积极分子\t243482\nV2.3.5\t243483\n应收账款质押登记办法\t243484\nwash\t243485\n怒揭\t243486\n7.17\t243487\n京人\t243488\n鼬佐\t243489\n复旦大学附属儿科医院\t243490\n火娃\t243491\nsplayer\t243492\n20160110\t243493\n4595\t243494\n大工匠\t243495\n饿死\t243496\n驭胜\t243497\n直发\t243498\n求生存\t243499\n余弦值\t243500\n萤\t243501\n安谧\t243502\n美孚一号\t243503\n普通话\t243504\nT50\t243505\nSCRM\t243506\n实值期权\t243507\n2017年11月21日\t243508\n七七事变\t243509\n曾诚\t243510\naria\t243511\n七濑茱莉亚\t243512\n圣诗\t243513\n官山镇\t243514\n邪恶漫画第一军事网_邪恶漫画军事网_无翼鸟\t243515\n飞机稿\t243516\n首席执行官\t243517\n暴龙哥\t243518\n852路\t243519\n史志\t243520\n石梯\t243521\n奔跑吧兄弟第六季\t243522\n上好佳\t243523\n下半月\t243524\nZigBee\t243525\nQuickly\t243526\n导播台\t243527\n整栋\t243528\n债权请求权\t243529\n摩擦角\t243530\n大邱庄镇\t243531\n金字塔式\t243532\n中国联合弓会\t243533\n公仔类\t243534\n小野寺梨纱\t243535\n紫米无线充电器\t243536\n5120\t243537\n大坂\t243538\n十几部\t243539\n法点\t243540\n新日产\t243541\n华联股份\t243542\n养女\t243543\nRevisited\t243544\n双峰\t243545\n飞象网\t243546\n才子\t243547\n龙眼肉\t243548\n南山街道\t243549\n袜子\t243550\n三天左右\t243551\nma5\t243552\n赛博\t243553\n惠博普\t243554\n联想手机\t243555\n涂附磨具网\t243556\neTax\t243557\nfulfilling\t243558\n摘译\t243559\n皮城\t243560\n县委党校\t243561\n孝义\t243562\n平谷区政府\t243563\n汽车江\t243564\n哈西奈德\t243565\n20160611\t243566\n裁集\t243567\n雷老母\t243568\n隔热\t243569\n头文\t243570\n绶带\t243571\n沙溪古镇\t243572\n万魔\t243573\n1月24日\t243574\n谷里街道\t243575\n守卫\t243576\n呼吸式\t243577\n俞承豪\t243578\n华主席\t243579\n浦东大道\t243580\n昆明市公安局\t243581\n合米\t243582\n舒缓\t243583\n干涉仪\t243584\n杨雪霏\t243585\nsmil\t243586\n終極\t243587\n红谷\t243588\n29天
\t243589\n/批发价格\t243590\n大航海之路\t243591\nPSD格式-千图网\t243592\n阻击战\t243593\n墙板\t243594\n劫难\t243595\n第9季\t243596\nhard\t243597\n中国技术市场报数字报\t243598\n王长贵\t243599\n登山服\t243600\n湖南省社会科学院\t243601\n全宋诗\t243602\n电粒子\t243603\n获刑\t243604\n政府办\t243605\n杨玉梅\t243606\n湖北省公安厅\t243607\nconsulting\t243608\n入统\t243609\n大猿王\t243610\nRaychan\t243611\n1500公里\t243612\n中塘镇\t243613\nDS918+\t243614\n中泰证券股份有限公司\t243615\n中华人民共和国行政许可法\t243616\n长隆酒店\t243617\nWould\t243618\n雾谷伯爵\t243619\n统整\t243620\n泰坦之旅\t243621\npeaceful\t243622\n1.6.3\t243623\nmescroll\t243624\n地脉\t243625\ncaita\t243626\n今日说法\t243627\ndangdang\t243628\nDX9\t243629\nHarley\t243630\n9.11\t243631\n星际旅行\t243632\n32亿\t243633\n虎臣\t243634\nsudo\t243635\n湖州市总工会\t243636\nNECA\t243637\n怕疼\t243638\n产业新城\t243639\n瓶器\t243640\n黄灯\t243641\n财政厅\t243642\n公共自行车\t243643\n跌\t243644\n关爱\t243645\n上海第一妇婴保健院\t243646\nccut\t243647\n贾康\t243648\n空心胶囊\t243649\n20170210\t243650\n辽南\t243651\n操作软件\t243652\nOLE\t243653\n一比装修网\t243654\n大荐人\t243655\n当立\t243656\n极线\t243657\n上海复星集团\t243658\n催眠师\t243659\n斯坦伯格\t243660\n桃溪\t243661\n三国志12\t243662\n格兰芬多\t243663\nReactjs\t243664\nhashes\t243665\n梦想之路\t243666\n6700k\t243667\n地市级\t243668\nKPMG\t243669\n4.3_\t243670\n7.36\t243671\n松陵\t243672\n短号家庭网\t243673\n小胡同\t243674\nmwc\t243675\n5-6\t243676\n深圳市食品药品监督管理局\t243677\n劲椎病\t243678\n近些年\t243679\n十九大学习体会\t243680\nPremium\t243681\n劳动法库\t243682\n佛罗里达国际大学\t243683\nΓ\t243684\nfreezeframe\t243685\n刮瓷\t243686\n维美德\t243687\n姑娘们\t243688\n一个百\t243689\n膈应\t243690\n大辛庄\t243691\n客如云科技(北京)股份有限公司\t243692\n崇廉\t243693\n代表处\t243694\n【龙腾\t243695\n夏依\t243696\n种子期\t243697\nV1.3\t243698\n塞拉尼斯\t243699\n燕云十八骑\t243700\npcl6\t243701\n尚义县\t243702\nreplacement\t243703\ncollapsing\t243704\n京东\t243705\n啸叫\t243706\n蛋糕盒\t243707\n治安员\t243708\n35万吨\t243709\n爱问问同学录\t243710\n保研\t243711\n插座箱\t243712\n油种\t243713\nDynasty\t243714\n围棋史\t243715\n诡谲\t243716\n夏安\t243717\n德国政府\t243718\n大师\t243719\n苦涩味\t243720\nPDFCreator\t243721\n名目\t243722\n恩典\t243723\n辐射4\t243724\n孤坟\t243725\n田亮\t243726\n三万天\t243727\n内蒙古自治区旅游局\t243728\
n吉尔吉斯坦\t243729\nM1#\t243730\n代善\t243731\n小牛\t243732\n筏板\t243733\njefflee168\t243734\nifstream\t243735\n补偿\t243736\n爱的艺术\t243737\n华东电力设计院\t243738\nnat2\t243739\n美日\t243740\n8u\t243741\n刘顺\t243742\nMSYS\t243743\nPOLICE\t243744\n一定可以\t243745\nPhotos\t243746\nkumi\t243747\nHELLA\t243748\n导购\t243749\n电子元器件\t243750\n华东重机\t243751\n新浪军\t243752\n登勃朗峰\t243753\n爱乱伴侣\t243754\n生化池\t243755\nGhostscript\t243756\n铝热剂\t243757\n茂谷柑\t243758\nч\t243759\nshier\t243760\n3DMA\t243761\n刚果盆地\t243762\n黄澄澄\t243763\n教育信息网\t243764\n普陀\t243765\n灵智\t243766\n肉孜节\t243767\n未休\t243768\n第204集\t243769\n香港恒生银行\t243770\n会盟\t243771\n炭黑\t243772\n呼图壁县\t243773\n观麦\t243774\n44mm\t243775\n日期\t243776\n托比·马奎尔\t243777\n谷日记网\t243778\n小儿疝气\t243779\n高成\t243780\n喷器\t243781\n凌波微步\t243782\n助剂\t243783\n百分之99\t243784\n彩云国物语\t243785\ncucumber\t243786\n防爆电气设备\t243787\n10芯\t243788\n杜林\t243789\n印子月\t243790\n比亚迪\t243791\n伊莲娜\t243792\nwin32com\t243793\nlishui\t243794\n魔术师约翰逊\t243795\n画作\t243796\nrrc\t243797\n氯乙烯\t243798\n八卦镜\t243799\npresenting\t243800\n震撼人心\t243801\n放射线\t243802\n搜集\t243803\n琼脂糖凝胶电泳\t243804\n未关\t243805\n野战蓝鲫\t243806\n团兵\t243807\n时评\t243808\n晓青\t243809\n易洁\t243810\nie8\t243811\n潍柴集团\t243812\n女人心\t243813\n我是证人\t243814\ncellpadding\t243815\nJunn\t243816\n林版\t243817\n明年5月\t243818\nconflicted\t243819\n西藏省\t243820\n中睿盛通\t243821\n李大宝\t243822\n陵阳镇\t243823\n环保性\t243824\n210个\t243825\n6369电影网\t243826\n北美站\t243827\n风面\t243828\n林果\t243829\n祖玛珑\t243830\n虎标\t243831\n合德镇\t243832\n浙江省金华十校\t243833\n误差棒\t243834\n塔达林\t243835\n20140730\t243836\nruhe\t243837\n真三国无双OL\t243838\n麦迪娜\t243839\n评审员\t243840\n96A\t243841\n钟阳\t243842\n急眼\t243843\n迅雷BT电驴\t243844\n所学\t243845\n中电光谷\t243846\n济南园博园\t243847\n李程程\t243848\n芦花荡\t243849\n王者荣耀武则天\t243850\nifeve\t243851\n巴歇尔\t243852\n教育部学历证书\t243853\n承钢\t243854\n天敌\t243855\n心宿\t243856\nAEM\t243857\n南海市\t243858\n阻态\t243859\n上海申花队\t243860\n电系\t243861\n挠痒\t243862\n长坡\t243863\n电子钱包\t243864\n快文网\t243865\n拍照时\t243866\n骏歌\t243867\n脑血管畸形\t243868\n鸭胸\t243869\n毕业设计文献综述\t243870\n撕票风云\t243871\n跟新\t243872\nTMW\t24
3873\n小小强\t243874\n五代十国时期\t243875\n嘴巴\t243876\n返空\t243877\n新金庸立志传\t243878\n一段\t243879\nDunk\t243880\n流梗\t243881\n罪刑\t243882\n乡友\t243883\n奇峻\t243884\n冯如杯\t243885\n许家老子\t243886\n引育\t243887\n词缀\t243888\nRxJava2\t243889\n瑞文戴尔\t243890\n舒比奇\t243891\n31分\t243892\n云猫\t243893\ngathering\t243894\nTweak\t243895\n免控\t243896\n雅诗兰黛集团\t243897\n刚玉\t243898\n水鱼\t243899\n信用卡还款日\t243900\n中控方向盘\t243901\n菲尔米诺\t243902\n李兰娟\t243903\n大阪酒店\t243904\n娱票儿\t243905\n微距\t243906\n九强生物\t243907\n大丰街道\t243908\n李勤勤\t243909\nitunesstore\t243910\nvcs\t243911\nGTS450\t243912\n大吉达摩\t243913\n华东理工\t243914\n飞檐走壁\t243915\n6.14\t243916\n楚楚推\t243917\n第18期\t243918\n芦恒路\t243919\n大江大河\t243920\nBangbros\t243921\n八块\t243922\n封杀令\t243923\n激情\t243924\n更进一步\t243925\n人样\t243926\n金华路\t243927\nBaby\t243928\n8670\t243929\n中华人民共和国慈善法\t243930\n张纯\t243931\nEuclidean\t243932\n方正科技集团股份有限公司\t243933\n40l\t243934\nDigit\t243935\n陪酒女\t243936\n从重\t243937\n遥希\t243938\n张菁\t243939\n用来\t243940\n大早\t243941\n莫纳什\t243942\nWindows7旗舰版\t243943\n广式腊肠\t243944\nadvisor\t243945\n天使心\t243946\n对数收益率\t243947\n蜗居\t243948\n体检机\t243949\n2010\t243950\n华鲁恒升\t243951\n30英寸\t243952\n難\t243953\n恶女花魁\t243954\n妄言\t243955\n舌头溃疡\t243956\n2018元\t243957\n99天\t243958\nhi3516\t243959\n柠檬红茶\t243960\nvivox9i\t243961\n瓦缸\t243962\n刘琪\t243963\n碳谱\t243964\n拨缴\t243965\n运输类\t243966\n紫海\t243967\n王大治\t243968\n莫不\t243969\n打不通\t243970\nmdd\t243971\nyy楚河吧\t243972\n解题\t243973\n易少\t243974\n灞波儿\t243975\n阚泽\t243976\n一霐\t243977\n厚度仪\t243978\n水泥池\t243979\n【米\t243980\n怎能\t243981\n资讯类\t243982\n拦住\t243983\napproximation\t243984\n20160113\t243985\n土石坝\t243986\n用字\t243987\n尤文图斯\t243988\n陈创\t243989\nweb接口\t243990\n伊拉克\t243991\n骑士们\t243992\n秋月\t243993\njwt\t243994\n陈式太极拳\t243995\n总行\t243996\n慷慨激昂\t243997\n复材\t243998\n胚\t243999\n筑物\t244000\n宣言书\t244001\nhaodiaocao\t244002\n杨锋\t244003\nStats\t244004\n素裹\t244005\n变动\t244006\n要得\t244007\n吕永岩\t244008\n葡萄浏览器\t244009\n轻小\t244010\n文二西路\t244011\n全卡\t244012\n监管\t244013\n真高\t244014\n广义货币\t244015\n眩光\t244016\nHONEY\t244017\n顺丰集团\t244018\n小白虫\t244019\n陈师
傅\t244020\ncapitan\t244021\n色釉\t244022\nteer\t244023\nyunke\t244024\nVGM\t244025\n阎村镇\t244026\ndryer\t244027\nBCE\t244028\n油精\t244029\n购号网\t244030\nheretits\t244031\n偶素浅小浅\t244032\n爆宠火妃\t244033\n柴刀\t244034\n软骨病\t244035\n蜂具\t244036\n01型\t244037\n鲁菜\t244038\n源氏\t244039\nVivaldi\t244040\n天津自然博物馆\t244041\n脂肪烃\t244042\n西盟\t244043\n主动语态\t244044\n光猪\t244045\n广东医学院附属医院\t244046\n黄鱼\t244047\n遮羞版\t244048\nonedriver\t244049\n史密斯机\t244050\n马丁尼\t244051\n阴宅\t244052\n恭候\t244053\n王猎\t244054\nISDN\t244055\n国家级自然保护区\t244056\n喷洒\t244057\n工程管理费\t244058\n挡膜\t244059\nboke\t244060\n长清\t244061\n广州曙光整形美容医院\t244062\n探戈舞\t244063\n三部\t244064\n漆工\t244065\n维莎\t244066\n虚拟\t244067\n幼园\t244068\nfreeware\t244069\n炎天\t244070\n詹姆斯卡梅隆\t244071\nVs2017\t244072\n白蒺藜\t244073\n骗术\t244074\n中粮天悦壹号\t244075\n十大名\t244076\nPMD\t244077\n五年制高职\t244078\n优步中国\t244079\n不留\t244080\n详案\t244081\n弹弹堂3\t244082\n武汉市国土资源和规划局\t244083\n隐蔽工程验收\t244084\n进气阀\t244085\n售后服\t244086\n黑店\t244087\n上局\t244088\n蛇盘\t244089\n僵尸战争\t244090\n泪目\t244091\n玉龙纳西族自治县\t244092\n博美\t244093\n4月28号\t244094\n陈志朋\t244095\n推论\t244096\n新闻之家\t244097\naudio-technica\t244098\n阎肃\t244099\nActiveX\t244100\n旅游业\t244101\ntextrank\t244102\n考勤机\t244103\n刘晓彤\t244104\n拜金\t244105\nexpectations\t244106\n面片\t244107\n葡萄游戏厅\t244108\n黑框\t244109\n小米5C\t244110\ncea\t244111\neap\t244112\n中国人民解放军总医院301医院\t244113\n磁学\t244114\n煤气发生炉\t244115\n爱与不爱\t244116\n吴为\t244117\n源型\t244118\nwealthy\t244119\nQUAD\t244120\n导热系数\t244121\n寿堂\t244122\n苏麻喇姑\t244123\n泰达宏利基金管理有限公司\t244124\n白粥\t244125\n汪博士\t244126\n男主\t244127\n真空压力表\t244128\nBiotechnology\t244129\n独立董事制度\t244130\nolens\t244131\n排入\t244132\n范蠡\t244133\nlarave\t244134\n350公里\t244135\n杭州经济技术开发区\t244136\n省图\t244137\n老干\t244138\nvigor\t244139\nMSSQL数据库\t244140\n莫古力\t244141\naborting\t244142\n民族团结一家亲\t244143\n安庆市政府\t244144\nmedley\t244145\n微民\t244146\n无翼鸟邪恶少女漫画全集\t244147\n莫西\t244148\n石家庄万达广场\t244149\n2017年6月24日\t244150\n玻璃窑\t244151\n5175\t244152\n91卡\t244153\n博客\t244154\nTX1\t244155\n5235\t244156\n860m\t244157\n墙报\t244158\n吉维尼\t244159\n超级马
里奥银河2\t244160\n钻石湾\t244161\n定位符\t244162\n新中源陶瓷\t244163\nNeo4j\t244164\n教学机\t244165\n露逼\t244166\n富贵包\t244167\nwebstorm2018\t244168\n警匪\t244169\n马来西亚第二家园\t244170\n夏禾\t244171\n助威\t244172\n内训师\t244173\n全友家居\t244174\n大岭镇\t244175\n外区\t244176\n失血性休克\t244177\n喉结\t244178\n关\t244179\n像风\t244180\n阿克希亚\t244181\n克莱姆\t244182\n出旗\t244183\n主职\t244184\n间断性\t244185\n精华\t244186\n郫县豆瓣酱\t244187\n王官庄\t244188\n104岁\t244189\n庄敏\t244190\n金钟山\t244191\n演绎推理\t244192\n想错了\t244193\n板网\t244194\n暗黑系\t244195\n络腮胡子\t244196\n乘龙怪婿\t244197\n昌乐一中\t244198\n闽南房产网\t244199\n杨铁心\t244200\n迦拉卡斯\t244201\nTextbook\t244202\n那一个\t244203\n澳彩\t244204\njunit测试\t244205\n简支梁桥\t244206\n四十\t244207\n假胸\t244208\n浙江大学》\t244209\n小黑裙\t244210\nstring库\t244211\n阿卡\t244212\nmobike\t244213\n曼蒂\t244214\nMoi\t244215\n1134\t244216\n吃鸡号\t244217\n伤心处\t244218\n20180419\t244219\n带好\t244220\n拟合\t244221\nDownloads\t244222\n反腐\t244223\n白百何\t244224\nngrok\t244225\n7pv.net\t244226\n桥口区\t244227\n测量费\t244228\nKY\t244229\n东京美食网\t244230\n活色\t244231\n10几分钟\t244232\ndaier\t244233\n1.5KW\t244234\n精选版\t244235\n晚八点\t244236\n江西师大\t244237\n剤\t244238\n宏定义函数\t244239\n1200万吨\t244240\n小宝贝\t244241\n猎影之狼暗夜\t244242\n图像\t244243\n峰回路转\t244244\n绝世最好的我们\t244245\n2018044\t244246\n叙利亚政府\t244247\n創業\t244248\n十八里店\t244249\n2017年7月25日\t244250\n上海电大\t244251\n影响者\t244252\n第四胎\t244253\n大胡\t244254\n胎教音乐\t244255\nending\t244256\n英语教育专业\t244257\nlistener\t244258\n002223\t244259\n拼图\t244260\n食戟之灵\t244261\n陈迹\t244262\n李轩\t244263\n欧比旺\t244264\n灯火\t244265\n猪蓝耳病\t244266\n18.2.2\t244267\n风向袋\t244268\n今天凌晨\t244269\n唯爱\t244270\ngisoracle\t244271\n箱盖\t244272\n乒乓\t244273\nDolphin\t244274\n竖直面\t244275\n微星电脑\t244276\nPIN码\t244277\n合区\t244278\n套索工具\t244279\n光学英才网\t244280\n胆总管\t244281\nmanget\t244282\n最强\t244283\n20元\t244284\n玄幻小说吧\t244285\n簌簌\t244286\n灵石路\t244287\nCNC\t244288\nfag\t244289\n鸡蛋面\t244290\n亚马逊海淘\t244291\n去取\t244292\n毛龟\t244293\n拜占庭帝国\t244294\n610000\t244295\n内塔尼亚胡\t244296\n节庆\t244297\n发达国家\t244298\n朱永博\t244299\n李天佑\t244300\n恒大健康产业集团\t244301\n托瓦格尔\t244302\n第58\t244303\n天天基金\
t244304\n第4条\t244305\nTight\t244306\n2.0.6\t244307\n病倒\t244308\ninterop\t244309\n真好\t244310\n溧水\t244311\n塔钟\t244312\n10连\t244313\n王建强\t244314\n驾驶类\t244315\n999级\t244316\n泰伊·谢里丹\t244317\n中信网信鸽\t244318\n集备\t244319\n2015年3月份\t244320\n诺雷\t244321\n永兴县人民政府\t244322\n济南小太阳幼儿园\t244323\n索网\t244324\n周围性\t244325\nTexts\t244326\n主印\t244327\nNoteexpress\t244328\n抒写\t244329\nsincere\t244330\n错带\t244331\nDribbble\t244332\n防盗门\t244333\nliu\t244334\n小笼\t244335\n激战2吧_\t244336\n长门\t244337\n易彤\t244338\nBEST\t244339\n张地\t244340\n十款\t244341\n吉祥的歌\t244342\n流放之路3.0\t244343\n武汉大学马克思主义学院\t244344\n第十五节\t244345\n莱茵河谷\t244346\nSwish\t244347\n化验员\t244348\n暴民\t244349\n1236\t244350\n霍少霆\t244351\n转所\t244352\nsm4\t244353\nBABE\t244354\n当客软件园\t244355\n中置音箱\t244356\n咏志\t244357\n有声\t244358\nSuperman\t244359\n000046\t244360\n壹号\t244361\n阿普斯特\t244362\n与子偕老\t244363\n能通\t244364\n紫叶李\t244365\nXFP\t244366\nUgirls尤果网\t244367\n0058\t244368\n陈满\t244369\n叶城\t244370\n本愿寺\t244371\n高尚\t244372\n琢玉\t244373\n赫伯特\t244374\n知应\t244375\n47家\t244376\n岁寒\t244377\nOctober\t244378\n新加坡樟宜机场\t244379\n监控摄像机\t244380\n古域\t244381\n几桶\t244382\n家猪\t244383\n犯不犯法\t244384\n首医\t244385\n黔西南州人民政府\t244386\n出生女\t244387\n监听器\t244388\nwindowsserver2012\t244389\n荷兰皇家航空公司\t244390\n燃气调压箱\t244391\nhostapd\t244392\n金融性\t244393\n怪物猎人XX\t244394\n驴打滚\t244395\n三星c7\t244396\n崩塌\t244397\n18.03\t244398\nlanvin\t244399\n条框\t244400\n光伏并网\t244401\n看着我\t244402\n做好准备\t244403\n17.03\t244404\nMagics\t244405\n大腹\t244406\n斑秃\t244407\n石生花\t244408\n暗恋\t244409\n钓鱼网\t244410\n点将\t244411\n乐于助人\t244412\nコンプリ\t244413\n湖北电视台\t244414\n欧布奥特曼剧场版\t244415\n单面板\t244416\n白玉兰广场\t244417\n粗粮\t244418\n4700万\t244419\nJee\t244420\n银泰集团\t244421\n权责发生制原则\t244422\nmamp\t244423\n归休\t244424\nf436\t244425\n聘用合同\t244426\noffice365教育版\t244427\n伊利诺伊大学\t244428\n宋徽宗\t244429\ndior迪奥\t244430\n功与名\t244431\n新火\t244432\nAtour\t244433\n名声\t244434\n铁皮箱\t244435\n精灵星\t244436\nwrestling\t244437\nLSE\t244438\nmergers\t244439\n张炎夏\t244440\nTeeth\t244441\n61分\t244442\n喑杀\t244443\n天津市工业和信息化委员会\t244444\n亮丙瑞林\t24
4445\n北校区\t244446\n流传输\t244447\n龙湖西宸\t244448\nlaughed\t244449\n月初\t244450\n皿\t244451\nearly\t244452\nWebkit\t244453\n1欧\t244454\n西部片\t244455\n0924\t244456\n红船精神\t244457\nimt\t244458\n华夏天空小说网\t244459\n己烯\t244460\n雪莹\t244461\n天坛医院\t244462\n刘小慧\t244463\n台联\t244464\n可测性\t244465\n八达路\t244466\n不透光\t244467\nefficiently\t244468\n石河\t244469\n贡菊\t244470\n为契机\t244471\n禁酒\t244472\n前列康舒胶囊\t244473\n数字信号处理\t244474\n飞月宝\t244475\n贝宝\t244476\n巫山历险记\t244477\n光珠\t244478\n老国\t244479\n沈阳地铁9号线\t244480\n棉机\t244481\nS905\t244482\n信邦\t244483\n石菖蒲\t244484\n可选择\t244485\nSHOT\t244486\ni57300hq\t244487\nchecksum\t244488\ntroop\t244489\n赢利性\t244490\nfull\t244491\n华人书香网\t244492\n2道\t244493\n鹿晗\t244494\n西环\t244495\n蛇口港\t244496\nelse\t244497\n终将\t244498\nmapperscan\t244499\n屏库\t244500\n消防\t244501\n象山\t244502\ntria\t244503\n拼音输入法\t244504\n番茄工作法\t244505\nAngelababy\t244506\n路政员\t244507\n如东县政府\t244508\nMM-DD\t244509\n纪法\t244510\n沃柑\t244511\n松山湖科技产业园区\t244512\n尚坤\t244513\nprocesson\t244514\n风暖\t244515\n牙龈脓肿\t244516\n端\t244517\n团伙案\t244518\n黑钛\t244519\n王筝\t244520\nbootstrp\t244521\n40cm\t244522\n吸氧机\t244523\n技术类\t244524\n暗示性\t244525\nOnclick\t244526\nFactbook\t244527\nechart\t244528\n1700亿元\t244529\n0418\t244530\n3310\t244531\n一夫一妻\t244532\n人民日报思想纵横\t244533\n国际联盟\t244534\n12月\t244535\n马保国\t244536\n好多家\t244537\n图网\t244538\n正山\t244539\n20161225\t244540\n京东大鼓\t244541\n刘秀荣\t244542\n货栈\t244543\n编制类\t244544\n烟台港\t244545\n陈信元\t244546\n死亡半径\t244547\n57家\t244548\n桂岭镇\t244549\ndeny\t244550\n万科山景城\t244551\n激活工\t244552\n山西省科学技术厅\t244553\n耐张线夹\t244554\n视区\t244555\n大听网\t244556\n蛇人\t244557\n池房网\t244558\n仁杰\t244559\n做生意\t244560\nWord达人网\t244561\n三一\t244562\n南丰路\t244563\n苏州六中\t244564\n物联网工程学院\t244565\n师小札\t244566\n迪亚克\t244567\n暴涨\t244568\n韩村镇\t244569\n阜蒙县\t244570\n兰州市政府\t244571\n观海路\t244572\n76mm\t244573\n通用卷\t244574\nFigure\t244575\n灵路\t244576\n狄耐克\t244577\n2.5公里\t244578\n9篇\t244579\n陌\t244580\n慢速版\t244581\n两目\t244582\nBluRay-MP4\t244583\n羔\t244584\n白真\t244585\nISTQB\t244586\n中国文学史\t244587\n巅峰期\t244588\nexception\t24458
9\n宏指令\t244590\n单轴\t244591\n阳光水岸\t244592\n4.11彪哥闯奉天\t244593\n熙华\t244594\nunstable\t244595\n冰雪奇缘2\t244596\n第234集\t244597\nSample\t244598\n大嘘\t244599\n正大乐城\t244600\n扑朔\t244601\n窗户\t244602\n做绝\t244603\n重燃\t244604\n示宽灯\t244605\n美香\t244606\n友们\t244607\n配电箱\t244608\nTarot\t244609\n保险代理人考试\t244610\nmeridian\t244611\n指文\t244612\n雨珠\t244613\n水绘园\t244614\n订货会\t244615\n充电箱\t244616\n背德\t244617\n元氏\t244618\n未了\t244619\n材料学院\t244620\n皮脂腺异位症\t244621\n含金量\t244622\n铲球\t244623\n国际航空公司\t244624\n南谯区人民政府\t244625\n急救车\t244626\n建成后\t244627\n牧宝\t244628\n山南网\t244629\n秋草\t244630\n唏\t244631\n陈默\t244632\n北京世界公园\t244633\n走后\t244634\n该组\t244635\n电子科技大学出版社\t244636\n函数\t244637\n大众大杂烩\t244638\n八心八箭\t244639\n楚人美\t244640\n塞头\t244641\n雅昌茶舍\t244642\n医疗卫\t244643\n点类\t244644\n香港旅游发展局\t244645\n送祝福\t244646\n两副\t244647\n天津广播电视大学\t244648\n茂名站\t244649\n英雄杀手\t244650\n张亚勤\t244651\n探戈\t244652\n招远市\t244653\n群英汇\t244654\nsumo\t244655\n科鲁兹论\t244656\n渝蓉高速\t244657\n技术栈\t244658\n年字\t244659\n快进键\t244660\n诉讼资产网\t244661\nSCOUT\t244662\n小椴\t244663\n石海\t244664\n川煤集团\t244665\nfastText\t244666\n科教文卫\t244667\n气象科技\t244668\n于海波\t244669\n班歌\t244670\n北京市人民代表大会常务委员会\t244671\n压屏\t244672\nGps\t244673\n凡尔赛和约\t244674\n夜半歌声\t244675\nforum\t244676\n精力充沛\t244677\ni5-5200U\t244678\n昔\t244679\nwchar\t244680\n天剑王\t244681\n万德\t244682\n全慧彬\t244683\n链锁\t244684\niphone7P\t244685\n周伟\t244686\n薛家岛\t244687\n反响\t244688\n敏魔\t244689\n供给端\t244690\n龙龙\t244691\n旗舰版\t244692\ntriangles\t244693\n渡渡鸟\t244694\n怪物猎人世\t244695\n浅井舞香\t244696\n双桥镇\t244697\n销项税额\t244698\nDST\t244699\n归来\t244700\n20话\t244701\n晚上6点\t244702\n基础代谢率\t244703\n3.0.0\t244704\n157路\t244705\nzhengyeye\t244706\n糖画\t244707\n智哥\t244708\n白鹭开发者中心\t244709\nHiveServer2\t244710\n双胞胎\t244711\ngir\t244712\n却更\t244713\n牧场物语蜜糖\t244714\n绝宠妖妃\t244715\n急性上呼吸道感染\t244716\n铜损\t244717\ncorps\t244718\n120米\t244719\n2132\t244720\n铜箔\t244721\nethical\t244722\n大张伟\t244723\n腐蚀机\t244724\n2008年10月\t244725\n郑袖\t244726\neasyUI\t244727\n语态\t244728\n11.0.19\t244729\n康茂盛\t244730\n诗文\t244731\n穿心莲\t244732\n952\t244733\n麦兜
\t244734\nHeartMP3\t244735\n宁波地铁4号线\t244736\n4整除\t244737\nsViper\t244738\n古劳镇\t244739\n盖伊\t244740\n我的个神啊\t244741\n罗技MX\t244742\n奥马哈\t244743\n杨昆\t244744\n人民日报社\t244745\n79条\t244746\n伊顿公馆\t244747\n照搬\t244748\n杜子建\t244749\n谷智鑫\t244750\n输不起\t244751\n夏建统\t244752\n97.4\t244753\nV4.1.0\t244754\n安庆市迎江区人民政府\t244755\np62\t244756\n删删\t244757\n雾气\t244758\n金山画王\t244759\n用水量\t244760\nSosoApi\t244761\nMark\t244762\n有源滤波器\t244763\n张继军\t244764\n_牛彩网\t244765\npmol\t244766\n触电者\t244767\npager\t244768\n单标\t244769\n第一道\t244770\n自以为是\t244771\n糖皮网\t244772\nunsupport\t244773\nartisan\t244774\n启辉器\t244775\ndraw9patch\t244776\n弩兵\t244777\n思凡\t244778\n王致\t244779\n北京朝阳_朝阳政府\t244780\n战歌\t244781\nSAST\t244782\n次渠\t244783\n宠物犬\t244784\n恒华科技\t244785\nylabel\t244786\n中国伊斯兰教协会\t244787\n9500元\t244788\n水月\t244789\n丐帮\t244790\n马家堡\t244791\n译源\t244792\nParcelable\t244793\nPDL\t244794\n黄褐色\t244795\n区林业局\t244796\n人权事记\t244797\n艺龙旅行网】酒店预订_机票查询_酒店\t244798\n东吴大学\t244799\n简水乡\t244800\n冰盒\t244801\n长江第一湾\t244802\n六宫粉黛\t244803\n泷川花音\t244804\n五十步笑百步\t244805\n电子科技大学成都学院\t244806\n肌肤\t244807\n安永\t244808\n好孩子\t244809\n林业厅\t244810\n食用百合\t244811\n胡英\t244812\n可恨\t244813\n错伏\t244814\n信长之野望吧\t244815\n弓村\t244816\n上官吉庆\t244817\n漕盈路\t244818\n桐城市人民政府\t244819\n宏华\t244820\nfrag\t244821\n关境\t244822\n湖南省大学\t244823\n锦江投资\t244824\n文渊狮城\t244825\n财神庙\t244826\n鱼类\t244827\n思想汇报网\t244828\n城管委\t244829\n资和信\t244830\nThumb\t244831\nKSD\t244832\n水晶梨\t244833\nDWG格式\t244834\n04.17\t244835\n三国志姜维传\t244836\n星期九\t244837\n叉烧酱\t244838\n杂咏\t244839\n圣贤\t244840\n云舒\t244841\n破防\t244842\n忘年恋\t244843\n南沙\t244844\n非货币性资产交换\t244845\n第五幕\t244846\n凸面镜\t244847\n平行度\t244848\n全队友\t244849\n音乐人\t244850\n保支\t244851\n宠婚撩人\t244852\n布洛尼亚\t244853\n朱玲\t244854\n元宵佳节\t244855\n脱壳\t244856\n开源代码\t244857\n3万平方米\t244858\n林田惠\t244859\n争光\t244860\n复旦大学生命科学学院\t244861\n中百超市\t244862\n排油\t244863\n甘肃省高级人民法院\t244864\nchrom浏览器\t244865\n请留步\t244866\n信阳师院\t244867\n吻痕\t244868\nForces\t244869\nWeight\t244870\nondestroy\t244871\n乐子\t244872\nFOCAL\t244873\n更出色\t244874\n琥珀谷\t244875\n义门陈\t24
4876\n业已\t244877\ngt750\t244878\nyeyelu\t244879\n2015a\t244880\n晃眼\t244881\n九台市\t244882\n国家大事\t244883\n陪练\t244884\n定频\t244885\n平安大学\t244886\n黑白\t244887\nPreset\t244888\n谭千秋\t244889\n42万元\t244890\n大剧\t244891\n祖蓝\t244892\n354号\t244893\ncertify\t244894\nmovs\t244895\n牯岭街\t244896\n脐血库\t244897\n华铁科技\t244898\n养人\t244899\n11300\t244900\n育儿师\t244901\n10万分\t244902\n港航管理局\t244903\n焦波\t244904\n忽而今\t244905\nangular5\t244906\nlumion6\t244907\n品学兼优\t244908\n上海房产信息网\t244909\n薬\t244910\n谐振腔\t244911\nHackRF\t244912\n华科尔\t244913\n泰州市统计局\t244914\n9450\t244915\n泰拉瑞亚\t244916\n珍珠棉\t244917\n佛魔\t244918\n晨鸣集团\t244919\nmsys2\t244920\nlibx264\t244921\n共产党地方委员会\t244922\n佛丁\t244923\n达尔\t244924\n礼盒装\t244925\n抵扣\t244926\n葡京娱乐\t244927\ndata格式\t244928\n幼\t244929\nsnmp\t244930\n灌砂法\t244931\n地人\t244932\n早上7点半\t244933\n女警官\t244934\n鉴评\t244935\n反三角函数\t244936\n魔卡少女樱\t244937\n永遇乐·京口北固亭怀古\t244938\n11亿\t244939\n利片\t244940\n6.071\t244941\n[日\t244942\n马国\t244943\n旅游文化节\t244944\n聊城市中心医院\t244945\n单员\t244946\n肉品\t244947\n偏旁\t244948\nk2路由器\t244949\nSERIES\t244950\n沈先生\t244951\n徐滔\t244952\n微红\t244953\n税要\t244954\nLURE\t244955\n欧式家具\t244956\n潮州古城\t244957\n委以重任\t244958\n雷明\t244959\n汤加丽\t244960\n照蛋\t244961\nlimx\t244962\n花样馒头\t244963\nGai\t244964\n郑州地区\t244965\n劳务招聘\t244966\n十二钗\t244967\n和燕路\t244968\n腮腺肿瘤\t244969\n下决心\t244970\n遥看\t244971\n质物\t244972\nhashtable\t244973\n天猫精灵\t244974\n故障码\t244975\nbok\t244976\nTBS\t244977\n广东水利电力职业技术学院\t244978\n5SING\t244979\n他山之石\t244980\n建索引\t244981\n合肥市图书馆\t244982\n鲛珠传\t244983\n哀鸿遍野\t244984\n恶犯\t244985\n蟹爪兰\t244986\n20150611\t244987\n玻璃体\t244988\n隐秘而伟大\t244989\n索尼ILCE-9\t244990\nFastener\t244991\n北京四合院\t244992\n悬雍垂\t244993\n灵韵\t244994\n考试报\t244995\nzwei\t244996\n鸥\t244997\n盘符\t244998\nthoughts\t244999\n市市场监管委\t245000\n教师资格考试\t245001\n急问\t245002\n晓峰\t245003\n甲硝唑芬布芬胶囊\t245004\n酬勤\t245005\nsetu\t245006\npromoting\t245007\n胡志刚\t245008\nquest\t245009\n白术\t245010\nairbus\t245011\n香港城市大学\t245012\n生命版\t245013\n淘金路\t245014\n兰斯吧\t245015\nwww.jb51.net\t245016\nfenix5s\t245017\nSoin\t245018\n毒
妻\t245019\n担面\t245020\n贺普仁\t245021\nqingming\t245022\n佳发安泰\t245023\n结肠癌\t245024\nTalkingData\t245025\ngsi\t245026\n血痂\t245027\nVEE\t245028\n红阵\t245029\n大话水浒\t245030\n装甲门\t245031\n串口类\t245032\n清华大学汽车工程系\t245033\n压扁\t245034\n蝙蝠侠前传\t245035\njavajdk\t245036\n3DH\t245037\nRebirth\t245038\n神工\t245039\n信息学院\t245040\n石松\t245041\n进行中\t245042\n免激活\t245043\n赵石\t245044\n宏村\t245045\n创利\t245046\n2004届\t245047\n3532\t245048\n广告栏\t245049\n4年\t245050\n太全\t245051\n黑白版\t245052\nwinter\t245053\n衰竭\t245054\n选材\t245055\n深圳市民政局\t245056\nBD高清_电影港\t245057\n流行性\t245058\n王小贱\t245059\n田林县人民政府\t245060\n胡晓明\t245061\n学问\t245062\n香榭丽舍大道\t245063\n265G.COM\t245064\n血管瘤\t245065\n西湖区行政服务中心\t245066\n福布斯中文网\t245067\n何方\t245068\n原物\t245069\n韶关市区\t245070\n无骨雨刷器\t245071\n17次\t245072\n物流信息系统\t245073\nkobe\t245074\n酷狗网\t245075\n恒压源\t245076\n基金公司\t245077\n恒昌\t245078\n刘湘\t245079\n海鹏\t245080\n叶子媚\t245081\n265G\t245082\n丰唇\t245083\nsusceptibility\t245084\n添益\t245085\n联合国儿童基金会\t245086\n龙隐者\t245087\n三国志1\t245088\n02年\t245089\n海螵蛸\t245090\n皇姑屯\t245091\n中旗\t245092\ninsertBefore\t245093\npoweredge\t245094\nNipples\t245095\n廋\t245096\n草人\t245097\nWTP\t245098\nspline\t245099\n深圳市第二人民医院\t245100\n一早上\t245101\n生趣\t245102\n2017LCK\t245103\n斯宾诺莎\t245104\n折光率\t245105\n20期\t245106\n吉康\t245107\n麻鸭\t245108\n副厂\t245109\n观承\t245110\nencapsulation\t245111\n信合\t245112\n富春街道\t245113\n鞋拔子\t245114\n上衣\t245115\n一土\t245116\nservelet\t245117\ninventor2018\t245118\n企业管理子系统\t245119\n人展\t245120\n天猫店\t245121\n狂探\t245122\nPDF阅读器\t245123\nSTM32f103\t245124\n囊肿\t245125\nWarhol\t245126\n3.7亿\t245127\nRakuten\t245128\nr-studio\t245129\nAaronBlogs\t245130\n真木今日子\t245131\nz7\t245132\nEaster\t245133\nQFD\t245134\n4235\t245135\n叩问\t245136\n20160111\t245137\n回旋曲\t245138\ncod9\t245139\n信息产业\t245140\n风气\t245141\n升级路\t245142\n30m\t245143\n超立方体\t245144\n20180113\t245145\n凌华\t245146\n五洋捉鳖\t245147\n十一届二次\t245148\n穆赫兰道\t245149\n略晓\t245150\n宇佳\t245151\n刘志国\t245152\n4428\t245153\n意味深长\t245154\n黄单\t245155\n帘幕\t245156\n可复\t245157\nwifi路由器\t245158\nKenzo\t245159\
nvisio2003\t245160\n2018年4月19号\t245161\n勾引\t245162\n刘嘉\t245163\n波函数\t245164\n起付\t245165\n旗米拉\t245166\n棍儿\t245167\n江西省食品药品监督管理局\t245168\n落霞\t245169\n金诚财富\t245170\nReplacing\t245171\n滨州政府网\t245172\n挥洒\t245173\n神句\t245174\n十八首\t245175\n萃取仪\t245176\nStudent\t245177\n港台剧\t245178\n901\t245179\n海南易建科技股份有限公司\t245180\n窜一窜\t245181\n京投公司\t245182\npoliceman\t245183\n迷迭\t245184\n中国恒大集团\t245185\n格栅机\t245186\n36kr\t245187\n朱维群\t245188\nRENDER\t245189\n纪元戒\t245190\n七魄\t245191\n美台\t245192\nwargamecn\t245193\nxpe\t245194\n蒲江县\t245195\n变污\t245196\n凌渡论坛_汽车之家论坛\t245197\nMaho\t245198\n人民法\t245199\n背单词\t245200\n高进\t245201\ncisg\t245202\nToll\t245203\n所到之处\t245204\n致力于\t245205\n冯薪朵\t245206\n北京华科中西医结合医院\t245207\n五冶\t245208\n周建民\t245209\n10.1c\t245210\n超级赛亚人\t245211\n2.44\t245212\n夜精灵\t245213\n菜墩\t245214\n女票\t245215\n东北师大\t245216\n83\t245217\n118路\t245218\n于海生\t245219\n盛世大厦\t245220\n兰斯9\t245221\n沃派\t245222\n梦想帝王\t245223\n线损率\t245224\n白秀珠\t245225\n折叠自行车论坛\t245226\n人人都是产品经理\t245227\n受理处\t245228\n十余年\t245229\n凤凰园\t245230\n一笑倾城\t245231\n刘建伟\t245232\nCOUNTRYMAN\t245233\nP档\t245234\nphp变量\t245235\n东风汽车有限公司\t245236\n国家卫生计生委卫生和计划生育监督中心\t245237\n教器\t245238\n100mL\t245239\n柯妹\t245240\nV8.4\t245241\n虎穴\t245242\n胡丹丹\t245243\n增级\t245244\n别逃\t245245\n艾迪逊\t245246\n周捷\t245247\n魔法森林\t245248\n卢靖姗\t245249\nCaptivate\t245250\n74军\t245251\n天津广播\t245252\n黄叹号\t245253\n31条\t245254\n动漫节\t245255\n白化病\t245256\n清远市\t245257\n首钢工学院\t245258\n四洲\t245259\ntcc\t245260\n20180307\t245261\n谜材\t245262\n张钢\t245263\n凯发娱乐\t245264\n眼前一亮\t245265\n16.06\t245266\n混种\t245267\n杭州站\t245268\n诺桑觉寺\t245269\n一键换\t245270\n内涝\t245271\n机修\t245272\n12月28日\t245273\n601258\t245274\n玛驰\t245275\n服务生\t245276\n陈凡\t245277\n跨行业\t245278\n苯氧乙醇\t245279\n4队\t245280\nYEEZY\t245281\n延付\t245282\n新专辑\t245283\n春物\t245284\n3DMax\t245285\n快板词\t245286\n天厨\t245287\n瓦雷利亚\t245288\nShire\t245289\n感染体\t245290\nfactorio\t245291\n许浑\t245292\n朝阳小说网\t245293\n空想家\t245294\n蓬朗镇\t245295\n锁阳固精丸\t245296\n拔下\t245297\n炎帝\t245298\n滨州网\t245299\n超能者\t245300\nluffy\t245301\n宁波兴宁中学\t2453
02\nFrankfurt\t245303\n中共中央\t245304\n北京市仁和医院\t245305\n玩家一号\t245306\n生态系统\t245307\n红蟹\t245308\n中华人民共和国城乡规划法\t245309\nUHS\t245310\n_农视网\t245311\n無料\t245312\n传音\t245313\n千句\t245314\n鹤沙\t245315\n止跌\t245316\n溧水县\t245317\n国内市场\t245318\n新手们\t245319\n孔明灯\t245320\n李渔\t245321\n静安区\t245322\n逆龄\t245323\nEval\t245324\n20170818\t245325\nASPCMS\t245326\n联想记忆法\t245327\n12.53\t245328\n文明人\t245329\n编缉\t245330\n汽层\t245331\n档案学\t245332\nsketchup8\t245333\n大粤网\t245334\n母语者\t245335\nFootprint\t245336\n裸背\t245337\n余音\t245338\n救市\t245339\n星辰.Lee\t245340\nSTX\t245341\n仙姑顶\t245342\n北方音乐\t245343\nshock\t245344\n型腔\t245345\nRetrieving\t245346\nfont-style\t245347\n小题\t245348\n顿\t245349\n华硕路由器\t245350\n星期五\t245351\n毒理\t245352\n花红小黑膏\t245353\n早参\t245354\n中兴人才公寓\t245355\n型女\t245356\n鱼味\t245357\nsecurefx\t245358\nIIS6.0\t245359\n斑羚飞渡\t245360\n花信\t245361\n低分化腺癌\t245362\n吴涟序\t245363\n汤川\t245364\npy-faster-rcnn\t245365\n陕西电视台\t245366\nMedical\t245367\n武警消防\t245368\n上海光学仪器厂\t245369\n前埔\t245370\n哭出来\t245371\n幻想计划\t245372\n长隆野生动物园\t245373\n横木\t245374\n1980元\t245375\n活性剂\t245376\n一个32位\t245377\n心理学概论\t245378\nInches\t245379\ng85\t245380\n正月初八\t245381\n期终\t245382\n1155\t245383\n勃拉姆斯\t245384\n梧桐木\t245385\n1.3cm\t245386\n瑞士航空\t245387\n器值\t245388\n和平古镇\t245389\n一日不见\t245390\n友信\t245391\n世界四大文明古国\t245392\n第7号\t245393\n机加工\t245394\n|征文网\t245395\n山钢\t245396\n裁切\t245397\nHUGO\t245398\n盘龙区\t245399\n氦氖激光器\t245400\n永和大王\t245401\n旻\t245402\n池建强\t245403\n财务管理学\t245404\n越来越\t245405\n张喵喵\t245406\n100项\t245407\n小金胶囊\t245408\n经理学\t245409\n烤麸\t245410\n低碳技术\t245411\n果胶酶\t245412\nConverters\t245413\neasyar\t245414\n单倍行距\t245415\n一天前\t245416\n硝酸铵钙\t245417\nevcard\t245418\nRebuild\t245419\n胡人\t245420\n顾毓琇\t245421\n第五大道\t245422\n班委\t245423\n北京大红门\t245424\n贝吉塔\t245425\nwinpython\t245426\n高丽\t245427\n逸致\t245428\n刘楚恬\t245429\n平平淡淡\t245430\nEGG\t245431\n丁公子\t245432\n标点符\t245433\n以弗所\t245434\nsitting\t245435\n遇见王沥川\t245436\n网络流\t245437\n射杀\t245438\n耐候\t245439\n资源部\t245440\n修稿\t245441\n财会\t245442\n晚风\t245443\n挡车工\t245444\nBootstrapTa
ble\t245445\nAfternoon\t245446\n常熟高新区\t245447\n0.50\t245448\n吡拉西坦\t245449\nNeuroscience\t245450\n咽喉\t245451\n争风\t245452\nLNK1104\t245453\n戒灵\t245454\n大牛股\t245455\nblow\t245456\n敢死队\t245457\n封顶\t245458\n魏大暗黑破坏神\t245459\n明间\t245460\n日本武道馆\t245461\nTRAIN\t245462\n北极星工程招聘网\t245463\n勇将\t245464\n松江一中\t245465\nforced\t245466\n唱票\t245467\n震泽\t245468\n7.5折\t245469\ncoalesce\t245470\nQW\t245471\n十节\t245472\n早市\t245473\n长声\t245474\n机智云\t245475\n涉\t245476\n拖挂车\t245477\nCynthia\t245478\n四人\t245479\n忍者龙剑传吧\t245480\n我是特种兵2\t245481\n甲基丙烯醛\t245482\n加印\t245483\nJArray\t245484\nPAN\t245485\n大学图书馆\t245486\n水上漂\t245487\n大妹\t245488\n海昌隐形眼镜\t245489\n电视台\t245490\n王蓉蓉\t245491\n迁\t245492\n小彩旗\t245493\n京剧艺术网\t245494\n泡泡哥\t245495\n有鬼\t245496\n提醒器\t245497\n私对\t245498\n谁动了我的奶酪\t245499\n探路\t245500\n湖南省畜牧水产局\t245501\n谔谔\t245502\n00R\t245503\ncane\t245504\n逃嫁\t245505\n搭班\t245506\n100式\t245507\n木棉\t245508\n混沌速派\t245509\n日亚\t245510\n金海岸\t245511\n呈坎\t245512\n52倍\t245513\n演艺界\t245514\n莘塍\t245515\n中科商务网\t245516\n小仓优子\t245517\n陈朝阳\t245518\nRica\t245519\n潮虫\t245520\nTrucks\t245521\n第四期\t245522\n使命召唤4\t245523\nsi\t245524\n奥\t245525\n多玩游戏论坛\t245526\n深圳公积金\t245527\n肾脏内科\t245528\n在路\t245529\nEB8000\t245530\n上海市经济信息化委\t245531\n顺城区\t245532\n蒙牛特仑苏\t245533\n182天\t245534\n填充色\t245535\nCSN\t245536\n803\t245537\n抚胸\t245538\n怪物史莱克\t245539\n显形\t245540\n変化\t245541\n温香\t245542\n95亿美元\t245543\n2496\t245544\n别董大\t245545\n查定\t245546\n莫辨\t245547\n绿林好汉\t245548\nOtter\t245549\n深海迷航海蛾号\t245550\n9班\t245551\nCSB\t245552\n屏障\t245553\n2000亿元\t245554\n贝雷架\t245555\n膏体\t245556\nDdrops\t245557\nGitbook\t245558\n刘曦\t245559\n尖领\t245560\n彩格\t245561\nf568\t245562\n过敏性皮炎\t245563\nE530\t245564\n苏州网\t245565\nGZIP\t245566\n政报\t245567\n创战者\t245568\n宜兴市人民医院\t245569\nrifle\t245570\n小蛮腰\t245571\n广东省教育研究院\t245572\nrelated\t245573\n焙烤展\t245574\n少妇们\t245575\n真音\t245576\n辽宁省委党校\t245577\n华润凤凰医疗\t245578\n负罪者\t245579\n新组\t245580\n59元\t245581\nccdd\t245582\n大周镇\t245583\n施瓦比尔盖茨\t245584\n骤起\t245585\nnow直播\t245586\n一卷\t245587\n南海法院\t245588\n沉睡者\t245589\n参考书
目\t245590\n66天\t245591\n医调委\t245592\n并列\t245593\n迁出\t245594\n壁挂式太阳能热水器\t245595\n防污\t245596\nmultikey\t245597\n2.7.14\t245598\n8250u\t245599\n下半辈子\t245600\n招标采购方式管理办法\t245601\nLCK2017\t245602\n游易客\t245603\n龙海起重工具\t245604\n和顺\t245605\n新加坡国立大学\t245606\n小喽啰\t245607\n水杯架\t245608\n第6节\t245609\n中华人民共和国宪法\t245610\n新方\t245611\n0.9.8\t245612\n1800家\t245613\n卡尔威特\t245614\n龙域\t245615\n帆船酒店\t245616\n存款利率计算器\t245617\n中交第三航务工程局有限公司\t245618\n猎豺狼\t245619\n中共福建省委机构编制委员会办公室\t245620\n合相\t245621\nvague\t245622\n今夕何夕\t245623\n全新胜达\t245624\n中标结\t245625\n申根签证保险\t245626\n能做\t245627\n音乐篇\t245628\n思想报告\t245629\n路路行\t245630\nyuka\t245631\n双拥网\t245632\n拉萨市区\t245633\n打呼\t245634\nHiking\t245635\n86722\t245636\n岳峰\t245637\n湖南中医药高等专科学校\t245638\n聚变\t245639\n人民币跨境\t245640\n美年大健康\t245641\n提公因式法\t245642\n梅李\t245643\n16P\t245644\n哲学系\t245645\n补录\t245646\nlauy\t245647\n4yx\t245648\nvapor\t245649\n分界线\t245650\n洱源\t245651\n双违\t245652\n宋建国\t245653\n智能摄像机\t245654\ndreamtimes\t245655\nKevin-Kong\t245656\n年份\t245657\n雅正\t245658\n小事情\t245659\ntoward\t245660\ndnf战斗法师\t245661\n虎鱼网\t245662\nリオン\t245663\n眼儿\t245664\n植入物\t245665\n五分钟以内\t245666\nunderline\t245667\nPrecise\t245668\nOPSX\t245669\n车尔尼\t245670\noutfitters\t245671\n红点设计奖\t245672\n石嘴山网_石嘴山市政府\t245673\nFracture\t245674\n涂松岩\t245675\nezd\t245676\n600804\t245677\n一片一片\t245678\n0910\t245679\n|雄安\t245680\nyetu\t245681\n乐高\t245682\n改天\t245683\nConservative\t245684\n张群\t245685\n7.0\t245686\n伴山\t245687\n心晴\t245688\n葛洲坝水电站\t245689\nアナル\t245690\n两管\t245691\n第四十六条\t245692\n负有\t245693\nSuperset\t245694\nプロ\t245695\n电子世界\t245696\n低烟无卤电缆\t245697\n我的世界拔刀剑\t245698\n205国道\t245699\n汪大东\t245700\n鲁班\t245701\nFIRST\t245702\nMerlin\t245703\n托福雅思\t245704\n联合声明\t245705\n征税\t245706\nloadrunner12\t245707\n县残联\t245708\n范本&#41\t245709\n爬爬\t245710\n金牛女\t245711\n冲积扇\t245712\n南山集团有限公司\t245713\n壹號\t245714\n16t\t245715\n才川\t245716\n拳皇吧\t245717\n文字链\t245718\n肝痛\t245719\n故事性\t245720\n植物大战僵\t245721\n第7话\t245722\ntop3\t245723\n傻瓜\t245724\nCdr\t245725\n北七家镇\t245726\n村霸\t245727\n僧伽吒经\t245728\n顾
比\t245729\ntracing\t245730\n巩留县\t245731\n陈茂\t245732\nCDR\t245733\nFET\t245734\n0516\t245735\n有棵树\t245736\n立式离心泵\t245737\n江门市中心医院\t245738\ndbx\t245739\nLinks\t245740\n百分之60\t245741\n藏语\t245742\n馒头型\t245743\n徐启方\t245744\n拉科鲁尼亚\t245745\n湖北省人民政府法制办公室\t245746\n百度手机卫士\t245747\nMacos\t245748\n罗布泊之咒\t245749\nrmtp\t245750\n出值\t245751\n受害\t245752\nCLG\t245753\n李智勇\t245754\n蓝雪\t245755\n7.00\t245756\n牧马图\t245757\na2奶粉\t245758\n防空导弹\t245759\n当头一棒\t245760\n马卡斯\t245761\n错版币\t245762\n古越\t245763\n魔剑姬\t245764\n邓村\t245765\n钟楚曦\t245766\n两千块\t245767\n反家暴法\t245768\n汪伪\t245769\n对话式\t245770\n同济大学研究生院\t245771\n9s\t245772\nFlux\t245773\n哈尔滨公交查询网\t245774\n亿海\t245775\n张永\t245776\n左叶\t245777\nmaxq\t245778\nmpsoft\t245779\n幽微\t245780\nNH4NO3\t245781\n任意性\t245782\n莎莎网\t245783\nspaceclaim\t245784\n摩配\t245785\n灰缸\t245786\n逆世\t245787\n三集\t245788\n宝龙地产\t245789\n雾都孤儿\t245790\n恐龙当家\t245791\n通缉\t245792\n桥梁\t245793\nGrants\t245794\n+7\t245795\n吴家山\t245796\n江南华府\t245797\nIGF\t245798\nconsistency\t245799\n苏东坡传\t245800\n40片\t245801\n剑阁\t245802\n动圈\t245803\n风尘\t245804\n音乐学院\t245805\n象牙山\t245806\n东风汽车集团\t245807\n棺姬嘉依卡\t245808\n横车\t245809\n自离\t245810\n阿拉\t245811\n第k个\t245812\n南京明基医院\t245813\n鹿鼎\t245814\n漏损\t245815\n省文化厅\t245816\nDataTemplate\t245817\n宏立城\t245818\n管理页\t245819\n龙兴古镇\t245820\n女尊文\t245821\n磅礴卡\t245822\n尼亚加拉瀑布\t245823\n满窗\t245824\nARB\t245825\n雷诺阿\t245826\n武器装\t245827\nX3850\t245828\n心理罪之城市之光\t245829\n腾邦集团\t245830\n来龙\t245831\nOPEN\t245832\n张思德\t245833\n中药饮片\t245834\n大话IT\t245835\n南京师范大学外国语学院\t245836\n李德华\t245837\n方形\t245838\n伏笔\t245839\n布洛\t245840\nC88\t245841\norders\t245842\n官军\t245843\n拔尖人才\t245844\n招标合同\t245845\nSMP\t245846\n抽烟\t245847\n尖山镇\t245848\n濮阳网\t245849\nicloud\t245850\n双生子\t245851\n二十六\t245852\n堵塞感\t245853\n热血高校2\t245854\n数据块\t245855\n张继索罗斯\t245856\n备览\t245857\n名石\t245858\n孙琳\t245859\n招商策划\t245860\n低贱\t245861\n实习生\t245862\n上海浦\t245863\n苦口婆心\t245864\n生物力学\t245865\nActivePerl\t245866\nfloat\t245867\nQT4\t245868\nCubic\t245869\n台语\t245870\n浙江大学西溪校区\t245871\nInteliJ\t245872\n绫波芹\t245873\n
流术\t245874\n1954年\t245875\nMac迅雷\t245876\nrenminbi\t245877\n沈阳公司\t245878\nSprings\t245879\n成府路\t245880\npingo\t245881\n暖身\t245882\n金恒智能\t245883\n反舰导弹\t245884\n羟乙基\t245885\n星通宝\t245886\n表演\t245887\nrata\t245888\n西蒙\t245889\n板野有纪\t245890\niText\t245891\n雅加达亚运会\t245892\n铜金粉\t245893\n就业\t245894\n说谎\t245895\n唐晶\t245896\n营利\t245897\nINVOICE\t245898\nZOJIRUSHI\t245899\nspirit\t245900\n阜新新闻网\t245901\nFixedUpdate\t245902\n86式\t245903\n高中数学必修4\t245904\n妙手空空\t245905\n仿生人\t245906\n武汉教育信息网\t245907\n外装\t245908\n全关\t245909\n感官\t245910\n安拉\t245911\n八字桥\t245912\n解元\t245913\n55项\t245914\n中考题\t245915\ndisciplinary\t245916\n蹲守\t245917\n充气床\t245918\n虫虫钢琴-钢琴曲-钢琴谱\t245919\n九届\t245920\n阿香米线\t245921\n70升\t245922\n广西政府法制网\t245923\n广西北部湾经济区\t245924\n11度\t245925\n9530\t245926\n葎草\t245927\nnx12\t245928\n子程序\t245929\n真理子\t245930\n心斋\t245931\n恩平\t245932\n简影\t245933\n色医\t245934\n浙江农业商贸职业学院\t245935\n诺坎普\t245936\n羰基\t245937\n社科联\t245938\n百度快照\t245939\n20160214\t245940\n1.6版\t245941\nJavassist\t245942\nffprobe\t245943\n六级作文\t245944\n甲基咪唑\t245945\n广陵\t245946\n上古卷轴5传奇版\t245947\n石鳞\t245948\nDoodle\t245949\nfout\t245950\nBitnami\t245951\n福绵区\t245952\n谶语\t245953\n宁化\t245954\n国林\t245955\n比色法\t245956\nInsert\t245957\n单因素\t245958\n世茂海峡大厦\t245959\n第二十条\t245960\n职数\t245961\nQian\t245962\n送进\t245963\n死妈\t245964\n广东省情网\t245965\n利荣集团\t245966\n几格\t245967\n软糖\t245968\nPanigale\t245969\n广州番禺职业技术学院\t245970\n包钢集团\t245971\nLede\t245972\n20150729\t245973\n龙泉剑\t245974\nReel\t245975\nokok\t245976\n固定行\t245977\n陪护假\t245978\n临界区\t245979\n拔作\t245980\n国家基本公共卫生服务\t245981\n35g\t245982\n水冲\t245983\nコレクション\t245984\n游击队之歌\t245985\n香粉\t245986\n伊犁州党委\t245987\n指端\t245988\n75a\t245989\nOpenCV4\t245990\n执行庭\t245991\n车标\t245992\n激光祛斑\t245993\n长安商用车\t245994\n何燕\t245995\n山西省人民政府\t245996\nnavigator\t245997\n5.6.1\t245998\n广东省商务厅\t245999\n云梦\t246000\n蹉跎\t246001\n贪吃\t246002\n深圳湾1号\t246003\n断桥铝型材\t246004\n青龙虾\t246005\n布阵\t246006\n炮枪\t246007\n北方\t246008\n子慕\t246009\nASDM\t246010\n频率分析\t246011\n12.0.24\t246012\n她的一生\t246013\n云计算产业\t246014\n50万元
\t246015\n三花控股集团\t246016\nLMI\t246017\n不逾矩\t246018\n天下太平\t246019\n第多少个\t246020\n硫酸铝\t246021\n中共江苏省委组织部\t246022\n动用\t246023\n上周\t246024\n幸福南路\t246025\n美睫\t246026\n20160706\t246027\n胶批\t246028\n止血药\t246029\nDMF\t246030\n进化型\t246031\n1080元\t246032\nYAHOO\t246033\n护工招聘网\t246034\nds3h\t246035\n酒商\t246036\n豫章师范学院\t246037\n增删\t246038\n河南省商务厅\t246039\n针对性\t246040\n中铁物流\t246041\n飞出去\t246042\n动画师\t246043\n无相功\t246044\n荡寇风云\t246045\n歪歪兔\t246046\n星痕\t246047\nG3930\t246048\nxyx\t246049\n2500K\t246050\n历史剧\t246051\n第二十七期\t246052\nmamamoo\t246053\n羊胎盘\t246054\n住人\t246055\n荻\t246056\nLSP\t246057\n1900M\t246058\n灵境\t246059\n灰黑色\t246060\n辅导员\t246061\nfujitsu\t246062\n感应炉\t246063\n领略\t246064\nstartup.bat\t246065\n苦苦\t246066\n技巧篇\t246067\nEZCast\t246068\n315真伪查询网\t246069\nsgcc\t246070\n壁挂锅炉\t246071\nNarrow\t246072\n2016年12月29日\t246073\n清远市公安局\t246074\n齐灵云\t246075\n新梗\t246076\n商用地产\t246077\n0825\t246078\n2000年以前\t246079\n永不倒\t246080\nAPPLE\t246081\nHILL\t246082\n步森股份\t246083\n疑邻\t246084\n施工管理\t246085\ntabata\t246086\ngnu-gcc\t246087\njg\t246088\nSuddenly\t246089\n雷诺数\t246090\n操死\t246091\nau3\t246092\n先秦诸子选读\t246093\n余纯顺\t246094\n耀之\t246095\n码元\t246096\n城市建设\t246097\n微信支付宝\t246098\n一来\t246099\n青园小区\t246100\n柯南\t246101\n督脉\t246102\n客观题\t246103\n占位符\t246104\n句柄\t246105\nWINFORM\t246106\npharmaceuticals\t246107\njaxb\t246108\nome\t246109\n战锤2\t246110\n深圳证券\t246111\nreveal\t246112\n第11届\t246113\nWKWeb\t246114\nc4dr18\t246115\n送行\t246116\nT430S\t246117\n伺服器\t246118\n21三体\t246119\n升盘\t246120\n591\t246121\n中控证\t246122\nISO14001\t246123\n地神\t246124\n印度理工学院\t246125\n演说\t246126\n忠信\t246127\n长鸣\t246128\n施药\t246129\n品牌忠诚度\t246130\n湘潭北\t246131\n切比雪夫不等式\t246132\nDameWare\t246133\nNetWork\t246134\n归根结底\t246135\n康婷瑞倪维儿\t246136\n第211集\t246137\n民主党派\t246138\n四川省科学技术厅\t246139\n天津津\t246140\n笋片\t246141\n邪尊\t246142\n东方花旗证券有限公司\t246143\n中银活期宝\t246144\n田宇\t246145\n还乡\t246146\n鱼人岛\t246147\n前来\t246148\n真倚天屠龙记\t246149\n30万起\t246150\n东江大桥\t246151\n焊管机\t246152\n轻\t246153\n黄山毛峰\t246154\n757956132\t246155\n幸福树网\t24615
6\n红颜丽人\t246157\n小贴\t246158\n商品房认购书\t246159\n幻灯机\t246160\n胆大包天\t246161\n存入\t246162\n复吸\t246163\n403号\t246164\n八七会议\t246165\n炫光\t246166\n江西省中医院\t246167\n诗鬼\t246168\n美团大众点评\t246169\n目录页\t246170\n免费全集\t246171\n南山植物园\t246172\n沈巷\t246173\nhtpc\t246174\ndux\t246175\n火茶\t246176\nwww.126.com\t246177\n练气\t246178\n硬景\t246179\n开放商\t246180\n哲学社会科学版\t246181\nRepairs\t246182\n会声会影10\t246183\n恒泰广场\t246184\n冰柱\t246185\n胡焕庸\t246186\n金路\t246187\nUSTC\t246188\n223\t246189\n爬服\t246190\nEBITDA\t246191\n28c\t246192\n外迁\t246193\n改写\t246194\n板台\t246195\n朱云来\t246196\n罗湖站\t246197\n0.5米\t246198\n图尔库\t246199\narmv7\t246200\n英国皇家学会\t246201\nactiviti5\t246202\nPS4/X1/NS\t246203\n隐患\t246204\n外排\t246205\n青禾\t246206\nwebsoket\t246207\n沉溺\t246208\n社会学家\t246209\n琢磨\t246210\n冯燕\t246211\npeak\t246212\n控烟\t246213\n金庸群侠传Online\t246214\n上链\t246215\nfreeglut\t246216\nv5.5\t246217\n志在\t246218\n花城大道\t246219\n迎来\t246220\n2.6%\t246221\n单氰胺\t246222\n东方欲晓\t246223\n让你走\t246224\nppg\t246225\n500日元\t246226\n华宝信托\t246227\nraider\t246228\n毛芦芦\t246229\n王奇\t246230\n交流版\t246231\nchatting\t246232\n海风教育\t246233\nonekey\t246234\n合肥地铁5号线\t246235\nFLAG\t246236\n鉴真国际半程马拉松赛\t246237\n大叻\t246238\n木结构\t246239\n扣锁\t246240\n财务会计学\t246241\n水底灯\t246242\n二十个\t246243\n空警\t246244\n巴拉巴拉\t246245\n第二号\t246246\ngonggao\t246247\n细小病毒\t246248\n凝汽式\t246249\n宝利\t246250\n89方\t246251\n眉宇\t246252\n再见只是陌生人\t246253\n百宝阁\t246254\n开票\t246255\n爱了\t246256\nMICROCHIP\t246257\n财经百科\t246258\n磁器\t246259\n德意志第二帝国\t246260\n谢逊\t246261\nWyse\t246262\n三江人才网\t246263\n胡忠\t246264\ntype3\t246265\n爨底下村\t246266\n残留\t246267\n发动机总成\t246268\nBottom\t246269\nVerse\t246270\nCC2017\t246271\n下沙网\t246272\n洛阳城\t246273\n邵东站\t246274\nempire\t246275\n孙瑜\t246276\n柏林禅寺\t246277\n苍岚\t246278\n脚扣\t246279\n网易音乐\t246280\n资源链\t246281\n光速蒙面侠21\t246282\n高低杠\t246283\nsweeping\t246284\npug\t246285\n超人正义联盟\t246286\n畅安\t246287\nvaleo\t246288\n金融办\t246289\nR720\t246290\n1280x800\t246291\n深仇大恨\t246292\n几千年前\t246293\n结对\t246294\nShaver\t246295\n山西共青团\t246296\n梅岭中学\t246297\nmercure\t246298\n宝来论坛\t2
46299\n省部\t246300\nRAY\t246301\n蒸汽流量计\t246302\n二套房\t246303\n洛索洛芬钠片\t246304\nNovels\t246305\n哟\t246306\n抚慰\t246307\n万古霉素\t246308\n中景\t246309\n化院\t246310\nxl2tpd\t246311\ndeep\t246312\n木事\t246313\n知牛\t246314\n荥\t246315\n李豫\t246316\n浑南区\t246317\n短书\t246318\n西安思源学院\t246319\n24位\t246320\n匹敌\t246321\n画室\t246322\n牌器\t246323\n第39次\t246324\n经检验\t246325\n黄河特大桥\t246326\n粘接\t246327\n近邻宝\t246328\n1980s\t246329\n板墙\t246330\n线子\t246331\n八十七\t246332\nva11halla\t246333\n7850\t246334\n上下架\t246335\n洋河酒\t246336\n纽约州立大学布法罗分校\t246337\n护\t246338\n周成建\t246339\n湖北省电力公司\t246340\nroshe\t246341\n川味坊\t246342\n重生之侯府嫡女\t246343\nPRODUCE\t246344\n千斤顶\t246345\nBULL\t246346\n中国互联网举报中心\t246347\n就业创业信息网\t246348\n临沂市政府\t246349\n开眼看世界\t246350\n国际刑事法院\t246351\n乐施会\t246352\n汉王科技股份有限公司\t246353\n凉山生殖健康医院\t246354\n子宫动脉\t246355\n途安论坛\t246356\n荒川爆笑团\t246357\n6.7万\t246358\n奇品\t246359\nWonders\t246360\n英国爱丁堡大学\t246361\nmaintain\t246362\n谢科\t246363\n义蓬街道\t246364\n爱普生(中国)有限公司\t246365\n盖饭\t246366\n2017年4月22日\t246367\nOdd\t246368\naged\t246369\n董湘昆\t246370\n波士顿凯尔特人队\t246371\n屌丝男\t246372\n红警2共和国之辉\t246373\n远投\t246374\n北京陵园\t246375\n臻园\t246376\nWoods\t246377\n三七片\t246378\n桌前\t246379\n0.625\t246380\n什叶派\t246381\n竹窗\t246382\n冥月\t246383\n毕家军\t246384\n瑞丰\t246385\n溥儒\t246386\nwindows7_\t246387\n7600\t246388\n航空航天大学\t246389\nx230\t246390\nkaoshi\t246391\n惩前毖后\t246392\n热血物语地下世界\t246393\n绿版\t246394\nABP-069\t246395\n开心一笑\t246396\nnude\t246397\nOKEx\t246398\nfengyun\t246399\n拖挂\t246400\n海棠花\t246401\n黄石公\t246402\n血小板减少性紫癜\t246403\n刻录盘\t246404\n20160701\t246405\n内务条令\t246406\nsheying\t246407\n事发\t246408\n上海市律师协会\t246409\n碰壁\t246410\n赤手空拳\t246411\nasking\t246412\n线缆\t246413\n320S\t246414\n一一一一\t246415\n抬高\t246416\ncose\t246417\nChristina\t246418\nstandby\t246419\nimd\t246420\nfedex\t246421\n周通\t246422\n犹豫期\t246423\n洪永淼\t246424\n极坐标系\t246425\nuc129\t246426\nPEO\t246427\n榴莲蛋糕\t246428\nxujishou\t246429\n朗筑\t246430\n奔驰C260\t246431\nSma\t246432\n摇卦\t246433\n萨拉萨蒂\t246434\n代谢综合征\t246435\n后继者\t246436\n德国队\t246437\n韩美女\t246438\n黄秋生\t246439
\nzane\t246440\ntil\t246441\n11天11夜\t246442\n筏轮\t246443\n事务处\t246444\n衡水站\t246445\n捕蝉\t246446\n第17届\t246447\n荆\t246448\n276号\t246449\niar\t246450\n大王村\t246451\n瑟拉娜\t246452\n杭州网络广播电视台\t246453\n黄静\t246454\n奶咖\t246455\n北京宾馆\t246456\n飞天茅台\t246457\n霍思毕\t246458\n68周年\t246459\n阶梯教室\t246460\n阿西里西\t246461\n现在就告白\t246462\n漫延\t246463\n刀魂\t246464\n河北省国土资源厅（海洋局\t246465\nhibor\t246466\n夜场\t246467\n几百公里\t246468\n20151208\t246469\n客集\t246470\npacka\t246471\n频阳\t246472\n成型\t246473\nSelective\t246474\n痰管\t246475\n辽阳\t246476\n金冷\t246477\n应变\t246478\n责权利\t246479\n景山街道\t246480\n点都德\t246481\n下一站\t246482\n佳源集团\t246483\n应用级\t246484\n溺爱\t246485\nexcel、word\t246486\n语文课程标准\t246487\n第142期\t246488\n动力工程专业\t246489\n娃娃菜\t246490\n口酸\t246491\n新志\t246492\nftir\t246493\n土木在线\t246494\n摘下\t246495\n异网\t246496\n量级\t246497\n收悉\t246498\n刘雷\t246499\n随身WIFI\t246500\nlaker\t246501\n拉紧器\t246502\n龙凤阁\t246503\n1280p\t246504\n彩线\t246505\n4200h\t246506\nexlcel\t246507\n人才济济\t246508\n铺轨\t246509\nBD奇艺\t246510\n2分米\t246511\n地标\t246512\n强拆\t246513\n解放西路\t246514\n20178\t246515\n冥王篇\t246516\n习莫\t246517\n哈利波特番外篇\t246518\n天竺寺\t246519\n亚泰\t246520\n梁杰\t246521\naloe\t246522\n即送\t246523\n每隔1秒\t246524\n鲁政\t246525\n江汉大学文理学院\t246526\n张学兵\t246527\nV7.1\t246528\n触动\t246529\n宜山路\t246530\n续命\t246531\n广州地铁设计研究院有限公司\t246532\n碱水面\t246533\n144个\t246534\n新唐遗玉\t246535\n竞彩网\t246536\nTP\t246537\nbuild\t246538\n排液\t246539\ncok\t246540\nflies\t246541\nCNV\t246542\n浸液\t246543\nShijia\t246544\n周五晚上\t246545\n那句话\t246546\n因次\t246547\n第111\t246548\nEdexcel\t246549\n上海市慈善基金会\t246550\n和服\t246551\n李慧珍\t246552\n1700187\t246553\n无性生殖\t246554\n22.2\t246555\n309路\t246556\n一蚊\t246557\n松原市\t246558\n中财论坛\t246559\n金城大厦\t246560\n惠州市第一人民医院\t246561\n新泰\t246562\n重子\t246563\n城市金融网\t246564\n厂品\t246565\n三好生\t246566\n认主\t246567\njalopy\t246568\n水莱丽\t246569\nVERTU\t246570\nXBox\t246571\nOrganisation\t246572\nBadboy\t246573\n一家之言\t246574\n蓝点\t246575\n陈向明\t246576\n公司职员\t246577\n剑来\t246578\n河坝\t246579\n村镇\t246580\n各式各样\t246581\n市国土资源局\t246582\n综合类\t246583\nwinwebmail\t
246584\nCASA\t246585\nppsx\t246586\n819\t246587\n办文\t246588\n盘友网\t246589\ncolour\t246590\n黑井户\t246591\n电压跟随器\t246592\n脾大\t246593\nSpeedTree\t246594\nvacuum\t246595\n顾俊\t246596\n尽览\t246597\ncao\t246598\nReleased\t246599\nWinServer\t246600\n三河口\t246601\n死不渝\t246602\n范钦珊\t246603\n中柬\t246604\n偏颇\t246605\nhealthcare\t246606\nsimens\t246607\n新领\t246608\niKBC\t246609\n上海东方医院\t246610\n廿载\t246611\n方林\t246612\n佛香\t246613\n草堂\t246614\nherbs\t246615\n小四\t246616\n畴\t246617\n周惠楠\t246618\n津宝\t246619\n九江火车站\t246620\n苹果型\t246621\nstdev\t246622\nSyncToy\t246623\n白脉软膏\t246624\n铁画\t246625\n磷酸氢钙\t246626\noffice2016\t246627\n瓜子二手车直卖网\t246628\nVincent、Yan\t246629\n明家联合\t246630\nmonkeyrunner\t246631\n冷弯型钢\t246632\n返流\t246633\n安徽法院\t246634\n夏文汐\t246635\n秘符\t246636\n周燕珉\t246637\nRIDER\t246638\n丰收宝\t246639\nUnreachable\t246640\n壬戌年\t246641\n奇热\t246642\n106.2\t246643\n黑山头\t246644\n未婚女\t246645\n台州日报\t246646\n衣扣\t246647\n梦涵\t246648\n最终痴汉\t246649\n群妖\t246650\nk-means算法\t246651\n石头记\t246652\n苍苍\t246653\n好网名网\t246654\n虐童\t246655\n师傅\t246656\n水利协会\t246657\nHORIZON\t246658\nSteamVR\t246659\n洛阳市政府\t246660\n冯满天\t246661\n逍遥客栈\t246662\n宁波镇海\t246663\n卡莉\t246664\n镜像版\t246665\n天通苑北三区\t246666\n克拉恋人\t246667\n铸铝\t246668\nOutcall\t246669\n文玩天下吧\t246670\n托费\t246671\n好不容易\t246672\n中国教育和科研计算机网\t246673\n瓷盘\t246674\nvalet\t246675\n城市人口密度\t246676\n作别\t246677\n16亿元\t246678\njq\t246679\nChong\t246680\n先觉\t246681\nPitt\t246682\n大有作为\t246683\n宠物蛇\t246684\n原药\t246685\nrr\t246686\n孟县\t246687\n科诺\t246688\n云途\t246689\n霞理沙\t246690\n动静\t246691\n国航\t246692\niPad1\t246693\n传染性\t246694\n十点半\t246695\n商品页\t246696\n绝缘垫\t246697\n法式面包\t246698\n赌债\t246699\n配资\t246700\n泡打粉\t246701\n木蜡油\t246702\n次韵\t246703\n飞机场\t246704\n南阳网\t246705\n0.5行\t246706\n24份\t246707\nInterfaces\t246708\nyazhou\t246709\n阿尔弗雷德\t246710\n五万块\t246711\n农工党\t246712\n巨浪\t246713\n夏蒙\t246714\n招标采购-建设招标网\t246715\nwin7共享文件夹\t246716\nzmlctt\t246717\n黑排角\t246718\n希尔斯布莱德\t246719\n杨风\t246720\n焦味\t246721\n1.mp4\t246722\n河北雄安新区规划纲要\t246723\ndmc\t246724\nmimetype\t246725\n谭正岩\t246
726\n得荣县\t246727\n海南省发展和改革委员会\t246728\n博克\t246729\n人尿\t246730\n18038期\t246731\n戴亚\t246732\n禅境\t246733\n人工骨\t246734\n20吨\t246735\n造诣\t246736\n建设\t246737\n烟笼\t246738\n退稿\t246739\n二手价\t246740\nMach\t246741\n老三\t246742\nwephone\t246743\n今天\t246744\n魔技\t246745\n共产主义者\t246746\n酪酸梭菌\t246747\nMapServer\t246748\n音声\t246749\n新兴路\t246750\n5加仑\t246751\n末世之黑暗召唤师\t246752\n金钟炫\t246753\n学区\t246754\n悠融\t246755\n旌旗\t246756\nroundabout\t246757\n十台\t246758\nz8350\t246759\n太空酒店\t246760\n阎魔爱\t246761\n1号位\t246762\n附件\t246763\nNCM\t246764\n京都御所\t246765\n碱式碳酸铜\t246766\n混分\t246767\n污名化\t246768\nYuko\t246769\nShadowsock\t246770\n初盘\t246771\n香取慎吾\t246772\n控们\t246773\n黄一山\t246774\n无针\t246775\n林依轮\t246776\n洛杉矶湖人队\t246777\n兵库\t246778\n上海质子重离子医院\t246779\norac\t246780\n3.4.10\t246781\n崔京浩\t246782\n1500亿元\t246783\n内化\t246784\n小米平板\t246785\nKata\t246786\n三一重工股份有限公司\t246787\n北海站\t246788\n蓟州区\t246789\n西安儿童医院\t246790\n老鹰\t246791\n牛蹄筋\t246792\n1881年\t246793\n搁架\t246794\n鳄雀鳝\t246795\n开斋\t246796\nbats\t246797\n10月20日\t246798\n瑞声达\t246799\n理查德·塞勒\t246800\n谷大\t246801\n小晴\t246802\n体木\t246803\n岩土工程专业\t246804\n3折页\t246805\n12轮\t246806\n丰岛\t246807\nchecklist\t246808\n译林出版社\t246809\n1866年\t246810\n助理岗\t246811\n十四种\t246812\n大数据公司\t246813\n党员活动日\t246814\n腾达建设\t246815\n黑莓Q10\t246816\n比亚迪g6\t246817\n北京师范大学\t246818\n绿色银行\t246819\n_汽油价格网\t246820\n盲人摸象\t246821\n芭比之梦想豪宅\t246822\n福布斯富豪榜\t246823\n运兵\t246824\n赠从弟\t246825\n监禁\t246826\n贵州银行\t246827\nITT\t246828\n易联云\t246829\n校订\t246830\n小甘菊\t246831\n王婷婷\t246832\n浅紫色\t246833\n队里\t246834\nUIActivity\t246835\n365路\t246836\n黄边\t246837\n200&\t246838\n一船\t246839\nsetAttribute\t246840\nazw3吧\t246841\n第18038期\t246842\n百度百家号\t246843\nicq\t246844\n橄榄叶\t246845\n中国林科院\t246846\n清灰\t246847\n绳模\t246848\n造化傲剑凌云\t246849\n行程单\t246850\n中央财经大学金融学院\t246851\n我的世界地图\t246852\n麒麟色影院\t246853\n也有\t246854\n太阳板\t246855\n技术论\t246856\n太阁立志传1\t246857\nUV胶\t246858\nbetween\t246859\ntez\t246860\n嬴稷\t246861\n南京路步行街\t246862\n今年6月\t246863\n八一下\t246864\n交接班\t246865\n风雪山\t246866\n7.6.1\t246867\n木字\t246868\nRefined
\t246869\nProps\t246870\n可口可乐杯\t246871\n6月1\t246872\n罗源湾海洋世界\t246873\n总部基地\t246874\n非党\t246875\n危险物品\t246876\n邓婵玉\t246877\n石杉\t246878\n美服\t246879\n白骨\t246880\n到点\t246881\n出钱\t246882\nhui\t246883\n雅各布\t246884\n航天云网\t246885\n刘思源\t246886\n闻香\t246887\nNang\t246888\n哥\t246889\n河南理工大学万方科技学院\t246890\n荣悦台\t246891\n银灰\t246892\n擦痕\t246893\n底物\t246894\nios9\t246895\n乔迪\t246896\n绝代双娇\t246897\n心血管\t246898\n8700K_\t246899\n昔阳政府网\t246900\n消耗性\t246901\n凌翔\t246902\n电视播放器\t246903\n伤食\t246904\n合众思壮\t246905\nbesiege\t246906\n钞票\t246907\n黑鬼\t246908\n长春理工\t246909\nm.2接口\t246910\n粮田\t246911\n安哥拉曼纽\t246912\n宁夏华夏医院\t246913\nLISTENER\t246914\n油烟机\t246915\n言爱\t246916\n综投网\t246917\n勤勤恳恳\t246918\n梅特勒\t246919\n上海博禹泵业有限公司\t246920\n潇洒走\t246921\n海南全岛建设自由贸易试验区\t246922\n捕鱼类\t246923\nglamglow\t246924\n108遍\t246925\n布莱希特\t246926\n玛托\t246927\n冲锋号\t246928\nCommun\t246929\nwx.request\t246930\n刘雪峰\t246931\nleaked\t246932\nSSD硬盘\t246933\n维罗妮卡\t246934\namibo\t246935\n骗精\t246936\n海阔天空\t246937\nInstead\t246938\n邪恶少女漫画大全\t246939\n一个错\t246940\nSkyrim\t246941\n六角形\t246942\n恍然\t246943\n补色\t246944\n医宗\t246945\niata\t246946\n女画\t246947\n豆豆\t246948\n好意\t246949\n防狼喷雾剂\t246950\n附居\t246951\n5.0.9\t246952\n1713\t246953\n浴照\t246954\n大康\t246955\ncvt变速箱油\t246956\n版长\t246957\n6188\t246958\n讯飞\t246959\n80米\t246960\n女娲成长日记\t246961\nBWV\t246962\n温灸\t246963\n酒托\t246964\n城南路\t246965\n易地扶贫\t246966\nLSAT\t246967\nSheet2\t246968\n矩\t246969\n苏宁电器\t246970\n10.11.5\t246971\n洛阳市教育局\t246972\n爆炸盒\t246973\n吴伯雄\t246974\n克孜勒苏柯尔克孜自治州\t246975\n万达金街\t246976\npowerpoint2013\t246977\n树阵\t246978\nconstruction\t246979\n引争议\t246980\n现代缤智\t246981\n铁饼\t246982\n童裤\t246983\njxc\t246984\n293号\t246985\n骆玉明\t246986\n汉文帝\t246987\n这种人\t246988\n1.8.2\t246989\n30w\t246990\n亿宝\t246991\nComple\t246992\nhoz\t246993\nBuilds\t246994\n艾玛·沃特森\t246995\n48小时内\t246996\n数字函数\t246997\n南岳大庙\t246998\n3R\t246999\n笑傲体\t247000\n国酒香\t247001\n华讯方舟\t247002\n效能\t247003\n待客之道\t247004\ncss定位\t247005\n紫罗兰色\t247006\n1200年\t247007\n晨午\t247008\n常量池\t247009\n药液\t247010\nV3.01\t247011\n兀突
骨\t247012\n中春路\t247013\n换型\t247014\n负债类\t247015\n核算员\t247016\nopen-falcon\t247017\n最高分辨率\t247018\n弗莱迪\t247019\nTeng\t247020\n鞋款\t247021\n48平米\t247022\n放养\t247023\nMixtape\t247024\nMeetings\t247025\n甘南\t247026\njames\t247027\n火种\t247028\n介绍篇\t247029\n青黄\t247030\nvb6.0\t247031\n要量\t247032\n阿卡莱\t247033\n张贺\t247034\nEngineers\t247035\nEnvi\t247036\n赵继伟\t247037\n聚酰胺树脂\t247038\nGMS\t247039\n百叶窗\t247040\n购物团\t247041\n下坠感\t247042\n三本院校\t247043\n牛币\t247044\n真弓\t247045\n种族主义\t247046\n香港小学\t247047\n扎比子\t247048\n霸王集团\t247049\n大阳巧客\t247050\n六统一\t247051\n樱井\t247052\n平口\t247053\nsdf\t247054\n网络摄像机\t247055\n七股\t247056\n创鑫\t247057\n前8月\t247058\n毕业设计中期检查\t247059\nOCA\t247060\n易性\t247061\n中铁隧道集团\t247062\n农副产品\t247063\n宾士\t247064\n时库\t247065\nlandsat8\t247066\n甘肃省\t247067\n浙大城市学院\t247068\n火气\t247069\n景陵\t247070\n360n4\t247071\n投标员\t247072\n绿色建筑评价标准\t247073\nLau\t247074\n对流传热\t247075\n金隅集团\t247076\n涮肚\t247077\n肥龙\t247078\n贝肯鲍尔\t247079\n虞洽卿\t247080\n草体字\t247081\n高楠\t247082\n银镜反应\t247083\n基金申购\t247084\n铁锨\t247085\nHampshire\t247086\n美猎\t247087\n12盎司\t247088\n9P\t247089\nExecutor\t247090\nAFL\t247091\n柱塞泵\t247092\n80美元\t247093\n休整\t247094\n55BBS\t247095\n上海硕博公司\t247096\n玉米芯\t247097\nJOB\t247098\n0416\t247099\n没得救\t247100\n李晓龙\t247101\n金家湾\t247102\n自治区人大常委会\t247103\n青岛海洋科学与技术国家实验室\t247104\n帮费\t247105\n三风\t247106\n热喷涂\t247107\n耽兮\t247108\nbrige\t247109\n片刻\t247110\n竹马\t247111\n孔子庙\t247112\n2016年5月5日\t247113\n李洪志\t247114\n被绑架\t247115\n冰山一角\t247116\n雪舞\t247117\n加百列\t247118\n蚊帐\t247119\n选股器\t247120\n同服\t247121\n阳萎\t247122\n动态代理\t247123\nchecked=\t247124\n长泾镇\t247125\n室内\t247126\nDIST\t247127\n海图\t247128\n罗宝\t247129\n咸丰帝\t247130\n捷豹XE\t247131\n中建七局\t247132\n小升初入学攻略\t247133\n四月下旬\t247134\n大耳朵英\t247135\n江月\t247136\n跟风\t247137\n考纪\t247138\nrco\t247139\napplestore\t247140\n苹果Live\t247141\n向家坝水电站\t247142\n将军之刃\t247143\n基层\t247144\n八片\t247145\n泰罗\t247146\n民间\t247147\n针形\t247148\n调查报\t247149\n年方\t247150\n王子\t247151\ntailored\t247152\n桩承台\t247153\n222com\t247154\nAB型\t247155\n干铺\t247156\n香农定理\t247157\n理
财规划师考试\t247158\n风临\t247159\n夹蝶阀\t247160\n魔胎\t247161\n南波儿\t247162\n还有点\t247163\n那些你很冒险的梦\t247164\n深圳西乡\t247165\nakb\t247166\n上海办公室\t247167\n环氧树脂胶\t247168\n日本航空公司\t247169\n23岁\t247170\n葛亚曦\t247171\n退税率\t247172\nextent\t247173\n风和日丽\t247174\nnamese\t247175\n背夫\t247176\n大型犬\t247177\n化学气相沉积\t247178\n旧\t247179\n捷克人\t247180\n绿城通\t247181\n至善\t247182\nacne\t247183\n返厂\t247184\n上海杨浦_上海市杨浦区人民政府\t247185\n须弥座\t247186\n一个多星期\t247187\n杜雨露\t247188\n米帅\t247189\n阿萌\t247190\n焊\t247191\n末世孤雄吧\t247192\n抵款\t247193\n率定\t247194\n地形学\t247195\n投注\t247196\nSamantha\t247197\n攻城三国\t247198\n溴化锂机组\t247199\n顾又铭\t247200\n南方\t247201\n刘峥\t247202\n地址库\t247203\n阜康市\t247204\n刺血\t247205\n予\t247206\n2008年1月\t247207\n高超音速\t247208\n自矜\t247209\n休息站\t247210\n特选\t247211\n句语\t247212\n28支\t247213\n3无名\t247214\n永新\t247215\nV3700\t247216\n凤冠霞帔\t247217\n隐蔽\t247218\n水煮花生\t247219\n虱子\t247220\n非标自动化设备\t247221\n汇总\t247222\n竞选大队委\t247223\n圣师\t247224\n矜贵\t247225\nsophisticated\t247226\n需求管理\t247227\n默认值\t247228\n荣辱与共\t247229\n何用\t247230\nprotoc\t247231\n16000\t247232\n无锡市新区\t247233\n飞智\t247234\ncognos\t247235\n西坡\t247236\n噶尔丹\t247237\n责无旁贷\t247238\nmeasured\t247239\nbnt\t247240\n42式太极拳\t247241\n中国铁路总公司\t247242\n重贴\t247243\n984\t247244\n京旺家园\t247245\nY35\t247246\n十一集\t247247\n滑铁卢\t247248\n三杯茶\t247249\n李志国\t247250\nDiscounted\t247251\n乘法结合律\t247252\n喝豆浆\t247253\n教职\t247254\n20季\t247255\n生活与哲学\t247256\n贝灵顿\t247257\n钢梁\t247258\n门尾\t247259\n入场式解说词\t247260\n无价\t247261\n大哉\t247262\n申冬奥\t247263\n剥离机\t247264\n饭\t247265\n货场\t247266\nAPEC\t247267\n设计部\t247268\n古蜀\t247269\n稠度\t247270\n在中\t247271\n柘荣\t247272\n商用机\t247273\ncad卡\t247274\n馥郁\t247275\n幼狮\t247276\n吉林省教育厅\t247277\n12缸\t247278\n漏雨\t247279\n小金口\t247280\n跑气\t247281\nmelting\t247282\n万世师表\t247283\n一家老小向前冲\t247284\n泰禾广场\t247285\n王敏\t247286\n不假思索\t247287\n安顺西秀区政府网\t247288\n星将\t247289\n夏诗文\t247290\n超大城市\t247291\n从这里\t247292\n惠众\t247293\ndoingsth\t247294\n沉浮录\t247295\n土肥站\t247296\n雀鸟\t247297\n崽\t247298\n虚拟机栈\t247299\n新加坡动物园\t247300\n天盛\t247301\n企口\t247302\n中华人民共和国监察法\t247303\n谷种\t
247304\n脱不了\t247305\nciphers\t247306\n好感\t247307\n乐迷\t247308\n此消彼长\t247309\n耐克空军一号\t247310\nsimca\t247311\n西师版\t247312\n五只\t247313\n罗彬\t247314\n艾肯声卡\t247315\n上海农业银行\t247316\nRemi\t247317\n千波\t247318\n冥府\t247319\n华为荣耀8吧_\t247320\n8月24日\t247321\n舒曼\t247322\n阳绿\t247323\nACD\t247324\n梦溪论坛\t247325\n李鸿忠\t247326\nluxixing\t247327\n搪塞\t247328\n坟茔\t247329\n生态安全\t247330\nmadu\t247331\n乐文小说网\t247332\n气相色谱仪\t247333\n欲情\t247334\n正牙\t247335\nSYNC\t247336\n证券从业资格考试网\t247337\n欢瑞\t247338\n少小\t247339\n信人\t247340\nBIO\t247341\n大云\t247342\n双色\t247343\n高边\t247344\n処\t247345\n楼巴\t247346\n撷取\t247347\n小黑鱼\t247348\n豆各庄\t247349\n形成性\t247350\n菜花\t247351\n安迪\t247352\nfreess\t247353\n三丁目\t247354\n仓山区\t247355\n奔驰G500\t247356\n小米电池\t247357\n巴西政府\t247358\n首金网\t247359\n今古传奇\t247360\n全南县\t247361\n钱税\t247362\n太鼓\t247363\n迅捷\t247364\n有同感\t247365\n温度系数\t247366\n杉德\t247367\nsqllite3\t247368\n轻笑\t247369\n裙下\t247370\n妹爷\t247371\n韩心怡\t247372\nrake\t247373\n中衡\t247374\n杰弗里\t247375\n10.9.1\t247376\n四数\t247377\n狗不理包子\t247378\n蓝光发展\t247379\n张术平\t247380\n华鸿\t247381\nCalculating\t247382\n电加热器\t247383\nhime\t247384\n_图吧\t247385\n魔腾\t247386\ngrate\t247387\n福州南路\t247388\n大君\t247389\n黄皮树\t247390\n蜥蜴人\t247391\n司仪\t247392\n嫖娼\t247393\n单衣\t247394\n铁面无私\t247395\n第一列\t247396\nEnvy\t247397\n系统之家win7\t247398\n驱灵师\t247399\nweenie\t247400\n几多愁\t247401\n4一个\t247402\n朗诵稿\t247403\n朱亮\t247404\n0931\t247405\n192.168.1.101\t247406\n20140920\t247407\n磕巴\t247408\n开玩\t247409\nalgorithms\t247410\n国利\t247411\n游方\t247412\n乾隆孝\t247413\n为安\t247414\n苏联解体\t247415\nhupu\t247416\n浦东新区税务局\t247417\n注意到\t247418\n囧事\t247419\n金钟河大街\t247420\n蓬房\t247421\n有方\t247422\n十六首\t247423\nbras\t247424\n皑希优\t247425\nGH\t247426\n农业经济学\t247427\n接房\t247428\n收藏率\t247429\ninfrastructure\t247430\n古河镇\t247431\n王明君\t247432\n代玉\t247433\n2k11\t247434\n南京技师学院\t247435\n历城区\t247436\n悲怆\t247437\n名剑风流\t247438\n李昞\t247439\n双活\t247440\nbiblatex\t247441\n喀纳斯湖\t247442\n万菱汇\t247443\n天水政府网\t247444\n天津地铁3号线\t247445\n才貌\t247446\nhadoo\t247447\n园艺山\t247448\nprove\t247449\n周作人\t247450
\npiercing\t247451\n水听器\t247452\n花多多\t247453\n人身自由权\t247454\n6000l\t247455\n乔治城\t247456\n足球小将世界杯\t247457\n而胜\t247458\n优界\t247459\n伊壁鸠鲁\t247460\n人免疫球蛋白\t247461\nscattered\t247462\n刘丹\t247463\n钢塑\t247464\n学利斯\t247465\n广州车管所\t247466\n儿童多动症\t247467\n大动作\t247468\ncodecs\t247469\n王老吉\t247470\n娛樂\t247471\n意大利大使馆\t247472\nwotime\t247473\n死而生\t247474\n新高一\t247475\n戊醇\t247476\n闭经\t247477\n片名\t247478\n郭大师网\t247479\n蓝宝石\t247480\n新邮\t247481\n老响\t247482\nacs510\t247483\n6罐\t247484\nwien\t247485\ndevcpp\t247486\n长矛\t247487\n保和殿\t247488\n实用篇\t247489\n新希望集团\t247490\nYAML\t247491\n吊架\t247492\n圆明园的毁灭\t247493\n莱茵河畔\t247494\nenforcement\t247495\n八哥犬\t247496\n39就医助手\t247497\n都城\t247498\n加尔文\t247499\n寒冰剑\t247500\n1556\t247501\nc语言编译器\t247502\n阿尔迪\t247503\n梁子岛\t247504\n蔍\t247505\n八马茶业\t247506\n豆本豆豆奶\t247507\n不动产抵押登记\t247508\n人工学园2\t247509\nmshflexgrid\t247510\nbtrfs\t247511\n杨显惠\t247512\n九球\t247513\n宝哥\t247514\n妻孝\t247515\n解穴\t247516\n威联通\t247517\n便血\t247518\nRan\t247519\n冰浴\t247520\n两局\t247521\n天津市体育局\t247522\nac\t247523\n水钵\t247524\n想人\t247525\n嘉士\t247526\n秀子\t247527\n49%\t247528\n临沂市人力资源和社会保障局\t247529\nIconX\t247530\n晓阳\t247531\n年金现值\t247532\nchou\t247533\n中华经典藏书\t247534\n打交\t247535\n发起式证券投资基金\t247536\n8年内\t247537\n96_\t247538\n生命周期函数\t247539\n饱经风霜\t247540\n郝文婕\t247541\n佘诗曼\t247542\n反问句\t247543\n5173\t247544\n指头\t247545\n丹栀逍遥丸\t247546\nDeeplearning4j\t247547\n高速公路养护\t247548\n矿物绝缘电缆\t247549\n烘焙机\t247550\n移动魔百盒-奇珀网\t247551\n南京国际博览中心\t247552\nmatpower\t247553\nbaobei\t247554\nsmarter\t247555\n私刻\t247556\n自反性\t247557\n箭靶\t247558\n少言\t247559\n美吉生物\t247560\n教育改革\t247561\n霓虹灯\t247562\n去找工作\t247563\nenigma\t247564\n冰糖炖梨\t247565\n1.1安卓\t247566\n阮国琴\t247567\n结怨\t247568\nconstructive\t247569\n马踏\t247570\nChow\t247571\nRunningSnail\t247572\ntvs\t247573\n星辰\t247574\nhaizi\t247575\n一等一\t247576\n闻花\t247577\n郑合\t247578\nskf轴承\t247579\n倡导者\t247580\n喀纳斯景区\t247581\n欣源网\t247582\n大雨天\t247583\n中建锦绣城\t247584\n又见北风吹\t247585\n音频转换器\t247586\nainuo\t247587\n2015年7月\t247588\n把酒倒满\t247589\n老年病科\t247590\n斑马贝\t247591
\n9盘\t247592\n大杂烩\t247593\nautosport\t247594\n烈火雄心\t247595\n别哭\t247596\n黔建建通\t247597\nuirecorder\t247598\n照明装修|一起网\t247599\nucg\t247600\n中国共产党十九大\t247601\n亲情网\t247602\n有氧呼吸\t247603\nERUN\t247604\n反射类\t247605\n肥西县\t247606\n中山市第一人民法院\t247607\n20150911\t247608\n花溪公园\t247609\n自动控制原理\t247610\n众泰T7002.0T\t247611\n辽宁号航空母舰\t247612\n水功能区\t247613\n紫荆山路\t247614\n埃雷拉\t247615\nbrochure\t247616\n保利建工\t247617\ndoc-毕业论\t247618\n草儿\t247619\n起解\t247620\n卷帘式\t247621\n缴\t247622\noccupy\t247623\n奔驰GLS\t247624\nariana\t247625\nHaydn\t247626\n女当家\t247627\n体协\t247628\n移动隔断\t247629\n万泉河水清又清\t247630\n妨碍\t247631\n镇江高等专科学校\t247632\n七星山\t247633\n麻辣变形计\t247634\n原鹏超\t247635\nqishi\t247636\n涪风\t247637\nalan\t247638\n住宿业\t247639\n2017年五一劳动节\t247640\n上海拍拍贷\t247641\nc字裤\t247642\n欧冠\t247643\ndermaroller\t247644\n藏灯\t247645\n护堤\t247646\n人民音乐出版社\t247647\n月界\t247648\n教科版\t247649\n疝气灯\t247650\n鹏鸿\t247651\n大烧\t247652\n重力异想世界\t247653\nhwe\t247654\nfitness\t247655\n雅堂小超\t247656\n会员群\t247657\n梅溪湖\t247658\n决胜网\t247659\n孔子故里\t247660\n隐世\t247661\n福利性\t247662\n浙江小学\t247663\n碘钨灯\t247664\n弯弯腰\t247665\n潘瑞\t247666\nBurt\t247667\n御弟\t247668\n南沙万达广场\t247669\nv1.9.1\t247670\nvida\t247671\nULSUM\t247672\n旅店\t247673\n吉林银行\t247674\n君成录\t247675\n灵华\t247676\n降噪耳机\t247677\ncaz\t247678\nThinkCentre\t247679\n好想哭\t247680\nHindi\t247681\n蚧壳虫\t247682\n报告类\t247683\n谢某\t247684\n西捷航空\t247685\n集装箱船\t247686\n宋雪\t247687\n腰链\t247688\n机理\t247689\n广博股份\t247690\n戴口罩\t247691\n灾荒\t247692\nFATCA\t247693\n哮鸣音\t247694\n耿乐\t247695\n骑乘\t247696\n打刀\t247697\n魔兽大宋\t247698\n肇兴\t247699\n延吉西\t247700\nAAAI\t247701\n煞\t247702\nwpt\t247703\n希尔薇\t247704\n坐杆\t247705\n电涡流传感器\t247706\n微机原理与接口技术\t247707\n累赘\t247708\n荆浩\t247709\niplimage\t247710\n机动车驾驶证申领和使用规定\t247711\ndocument\t247712\n疯狂游戏大亨\t247713\n阿哲\t247714\n舒城县\t247715\n脸蛋\t247716\n偷拍者\t247717\n记账本\t247718\n台中火车站\t247719\n盼复\t247720\n赤日炎炎\t247721\nHydrochloride\t247722\n交流电桥\t247723\n6h\t247724\n贾涛\t247725\n上地实验小学\t247726\nRPC服务器\t247727\n杀场\t247728\n郝昭\t247729\nApogee\t247730\n百鬼夜宴\t247731\n养车无忧网\t247732
\n香港迪士尼酒店\t247733\n降压药\t247734\n0.4kv\t247735\n申根区\t247736\nRichardson\t247737\n逆波兰\t247738\n李佛\t247739\n剑纯\t247740\n不碰\t247741\n苏联卫国战争\t247742\n恢复正常\t247743\n昏庸\t247744\n磷矿\t247745\n南阳理工学院\t247746\n王家湾\t247747\n腕儿\t247748\n金立\t247749\n砸车\t247750\nIE80S\t247751\n镇静剂\t247752\niob\t247753\n地头蛇\t247754\n落英缤纷\t247755\n德州路\t247756\n城投公司\t247757\n接地箱\t247758\n新生\t247759\n首套\t247760\n苦情\t247761\nrhino5.0\t247762\n中国水利水电第十一工程局有限公司\t247763\nwiper\t247764\n金龟村\t247765\n半身照\t247766\n小米黑鲨\t247767\n盗窃罪\t247768\n94分钟\t247769\n国通快递\t247770\n张秀卿\t247771\nhcash\t247772\n千厮门大桥\t247773\n魅王宠妻:鬼医纨绔妃\t247774\n福建省人民医院\t247775\n信捷plc\t247776\n散片\t247777\n中港城\t247778\n大安乡\t247779\n咸阳国际机场\t247780\n闸坡\t247781\n92.7\t247782\n车界\t247783\n龙语\t247784\n广东亚视演艺职业学院\t247785\n5280BT\t247786\n泽西岛\t247787\n黑角龙\t247788\nrickowens\t247789\n组装电脑吧\t247790\nConsulate\t247791\nadj\t247792\n黑风暴\t247793\n侨务\t247794\nhaidian\t247795\ntypekit\t247796\n修饰符\t247797\n寒龙\t247798\n智成\t247799\n度盘下载器\t247800\nmql\t247801\n1.49G\t247802\n常规化\t247803\n显赫\t247804\neSIM卡\t247805\n2017注册会计师考试\t247806\n京东万象资讯中心\t247807\n旅行社\t247808\nCNH\t247809\n相符\t247810\n科尔曼\t247811\nPostgre\t247812\n绍兴市教育局\t247813\n运河区\t247814\n大山深处\t247815\n泰能\t247816\n妮子\t247817\n孟丽\t247818\n住校生\t247819\n年检查询网\t247820\n紫河车\t247821\n太阳能灯\t247822\n4公斤\t247823\n8杯\t247824\nc++类\t247825\n渐冻症\t247826\n烤红薯\t247827\n九十周年\t247828\n公众号\t247829\n三菱重工海尔\t247830\n48名\t247831\n原生鱼\t247832\n关云\t247833\n软接头\t247834\n非限定性\t247835\n600048\t247836\n穿着打扮\t247837\n伍子胥\t247838\nCFA考试网\t247839\n弓\t247840\n四月份\t247841\n3323\t247842\n国字脸\t247843\nTrident\t247844\n施政\t247845\n120日\t247846\n双j管\t247847\n萝卜报告\t247848\n复旦管院\t247849\n墨盒\t247850\n皇马科技\t247851\nunarchiver\t247852\nbcdedit\t247853\n晚熟\t247854\n可交债\t247855\n3105\t247856\n天津市住房公积金管理中心\t247857\n金华市人民政府\t247858\nupto\t247859\n越南机场\t247860\n十五条\t247861\n董贤\t247862\n府\t247863\n刘震\t247864\n便溏\t247865\n替补\t247866\n去日\t247867\n不二周\t247868\n2238\t247869\n6.23\t247870\n54军\t247871\n列值\t247872\n参议\t247873\nTags\t247874\nRabbitMq\
t247875\n老群\t247876\n450亿元\t247877\n二大\t247878\nCAIXIN\t247879\n满天星花\t247880\n西门子公司\t247881\n鲁大师\t247882\n不必须\t247883\n嘉儿\t247884\n广东省纪委\t247885\n天喜丸\t247886\nCES2017\t247887\n阿尔帕西诺\t247888\n名利场\t247889\n上海技术物理研究所\t247890\n榕城通\t247891\nzerotier\t247892\n交叉性\t247893\n甜爱路\t247894\ngit服务器\t247895\n河南电大\t247896\nv+\t247897\nipdb\t247898\nserver2008R2\t247899\n文名\t247900\ns级\t247901\ngym\t247902\n一个头\t247903\nEntropy_lxl\t247904\n电动轿车\t247905\n中国自动化\t247906\n金灶\t247907\n9.6.1\t247908\n天津市中心妇产科医院\t247909\n直接影响\t247910\n赋能\t247911\n115w\t247912\n死点\t247913\n划手\t247914\n除\t247915\n包皮环切术\t247916\n我的中国心\t247917\n云办税厅\t247918\nJTBC\t247919\n上方山国家森林公园\t247920\n军运会\t247921\n手字\t247922\ncdr14\t247923\n话说\t247924\n教无类\t247925\ndifferentiation\t247926\n公路\t247927\n东莞人才网\t247928\n脱贫\t247929\n赏罚分明\t247930\n62岁\t247931\n海口骑楼老街\t247932\n发卡机\t247933\n小时代\t247934\n番禺汽车客运站\t247935\n700亩\t247936\n泵组\t247937\n拉力\t247938\n预约挂号\t247939\noffice2007兼容包_office2007兼容包下载官方免费版\t247940\n8023\t247941\n恒威\t247942\n质人\t247943\nHi运动健身网\t247944\n昌邑\t247945\nkindl\t247946\nCome\t247947\n七星区\t247948\n旅游包车\t247949\n大鲸\t247950\n注销登记申请书\t247951\n筋疙瘩\t247952\n四川科技馆\t247953\n相城区\t247954\nShowDoc\t247955\nValeo\t247956\ncapacitor\t247957\n电车\t247958\n2018.3.31\t247959\nFrances\t247960\nxcode6\t247961\n一观\t247962\n命硬\t247963\n华为刷机网\t247964\norb\t247965\n中国保险公司\t247966\n设别\t247967\n小辛庄\t247968\n证费\t247969\n2038年\t247970\n两头\t247971\n波段\t247972\n郎酒集团\t247973\nnhdtb\t247974\n引道\t247975\n西藏佛教\t247976\nlsgxeva\t247977\n北京社保定点医院\t247978\n12月3日\t247979\n额头\t247980\n共创新\t247981\n作声\t247982\ntissues\t247983\n人事关系\t247984\n如何\t247985\nMT6735\t247986\n文成\t247987\njustice\t247988\nlongman\t247989\n斗地\t247990\n注胶机\t247991\n花生日记\t247992\n机密级\t247993\n2307\t247994\n7月下旬\t247995\n东热\t247996\n硼酸\t247997\naiming\t247998\n冰糖水\t247999\nlondon\t248000\n耐磨件\t248001\n厂价\t248002\n辛巴德\t248003\n大众桑塔纳\t248004\ng27\t248005\n索纳塔9\t248006\n鲁侍萍\t248007\n小宫\t248008\nYvonne\t248009\n楚香凝\t248010\n看样\t248011\nSHMET\t248012\n康馨家园\t248013\n楼\t248014
\n乐和\t248015\n9阶\t248016\n坎儿井\t248017\n锝\t248018\n乔戈里峰\t248019\n苏州大学出版社\t248020\n下关\t248021\n黑龙江生产建设兵团\t248022\n洋河酒厂\t248023\ncm\t248024\n黎明大学\t248025\n意外事故\t248026\n不靠谱\t248027\nxds\t248028\n蔡永康\t248029\n歪小子\t248030\nMaquillage\t248031\nwen7\t248032\n小勇\t248033\n红绿灯\t248034\n孟非\t248035\n尾矿库\t248036\n咪哚网\t248037\nPC817\t248038\n缺相\t248039\n组织者\t248040\nGa\t248041\n1.0.2.8\t248042\n0.7.1\t248043\n50号\t248044\n二十四孝\t248045\n郭广昌\t248046\n粉红猪小妹全集\t248047\n安东诺夫\t248048\n赶赴\t248049\n提起诉讼\t248050\n灾区\t248051\n金属材料学\t248052\n英派斯健身俱乐部\t248053\n签名版\t248054\n朝鲜人民军\t248055\n温州市人民医院\t248056\n出口型\t248057\nzan\t248058\n第87期\t248059\n教改\t248060\n加勒比海盗2\t248061\n高空抛物\t248062\n方庄店\t248063\ncapacitance\t248064\n阿修罗之怒\t248065\n龙海市\t248066\n麦迪逊邦\t248067\n服装袋\t248068\n烧杯\t248069\nFord\t248070\n小梅沙海洋世界\t248071\nbitbake\t248072\nWAR\t248073\n17册\t248074\n伊敏\t248075\n收支\t248076\ncad看图软件\t248077\nFlap\t248078\n外推\t248079\n内江第一城\t248080\n上地\t248081\n魔厨\t248082\n601857\t248083\n贴文归类\t248084\n涂子沛\t248085\n出资者\t248086\nArguments\t248087\n浮球开关\t248088\n温良恭俭\t248089\n20ft\t248090\n滚动\t248091\n42亿\t248092\n10U\t248093\n下午一点\t248094\n圆柄\t248095\n1000台\t248096\n工程学院\t248097\n3dmax2018\t248098\n海叔\t248099\n几款\t248100\nbunker\t248101\n流痕\t248102\nseele\t248103\n被选\t248104\n铁锂电池\t248105\n独孤天下吧\t248106\nTightVNC\t248107\n碳酸丙烯酯\t248108\n司发通\t248109\n沙坪坝三峡广场\t248110\n球茎\t248111\n非系统盘\t248112\nVMware虚拟机\t248113\n月表\t248114\nv2.4.6\t248115\n小栗旬\t248116\n9%\t248117\nBranch\t248118\npaiming\t248119\n新锐科技\t248120\nYahoo\t248121\n木精\t248122\nfreesurfer\t248123\n中办国办\t248124\n第几回\t248125\n一回事\t248126\n电子商业承兑汇票\t248127\nTri\t248128\n华龙证券\t248129\n春化\t248130\n童展\t248131\n大连联通\t248132\nL1800\t248133\npip\t248134\n含光路\t248135\naccommodation\t248136\n电梯证\t248137\n2套\t248138\n畅达\t248139\n評價\t248140\n赵平\t248141\n车企\t248142\ngt240\t248143\n刑徒\t248144\n球磨罐\t248145\n182&\t248146\n中国共产党\t248147\n龙兴物流\t248148\n人与自然\t248149\n例表\t248150\n4.23\t248151\nenq\t248152\n泽旺多吉\t248153\n高周波\t248154\n乳山市政府\t248155\n汤姆索亚\t248156\n95\t248
157\n百战网\t248158\nX2.5\t248159\n唐亮\t248160\n泌阳县人民政府\t248161\n颗星\t248162\n石家庄经济学院\t248163\n15时\t248164\n袁梦\t248165\n第一盘\t248166\n基础型\t248167\n寻龙\t248168\nsdglyuan\t248169\n元秋\t248170\n必须招标的工程项目规定\t248171\n860EVO\t248172\ntwink\t248173\n函数体\t248174\nAppraisal\t248175\n王玉龙\t248176\n论美\t248177\nresourse\t248178\n炽道\t248179\n硬板\t248180\n小米手\t248181\n小D\t248182\n班组\t248183\n热卖\t248184\n自主\t248185\n司马李茂\t248186\nRPS\t248187\n包贝尔\t248188\n固话\t248189\n信息与计算科学专业\t248190\n聪美\t248191\n艾比安\t248192\n圣者\t248193\n三卡\t248194\n用机\t248195\n轻钢龙骨隔墙\t248196\n登位\t248197\n60伏\t248198\n上海学前教育网\t248199\n再一个\t248200\nhighlighted\t248201\n武汉长江工商学院\t248202\n节节虫\t248203\nmx375\t248204\nyuang\t248205\nCCNA\t248206\n欢喜冤家\t248207\n传播学类\t248208\n盘石\t248209\n黄牙\t248210\n多线程处理\t248211\n老鹰茶\t248212\n屌报\t248213\n兴致勃勃\t248214\n勤耕\t248215\n扶梯\t248216\nlofter\t248217\n证明\t248218\njdbctemplate\t248219\n大进\t248220\n7z\t248221\n新侨\t248222\n吴鉴鹰\t248223\n甲状腺素\t248224\nBT图书馆\t248225\n同时代\t248226\n版头\t248227\n日流水\t248228\n西综\t248229\n全运会\t248230\nv6.3\t248231\n催乳素\t248232\n数组子\t248233\nCoinbase\t248234\ncryAllen\t248235\n梁建章\t248236\n石墨片\t248237\n2017年12月底\t248238\n京东商\t248239\n乐max\t248240\n生育力\t248241\n大连地铁2号线\t248242\n王娟\t248243\n对立统一\t248244\n东陶\t248245\nZimZz\t248246\nBehavioral\t248247\n柠檬树\t248248\n黄衫女\t248249\ns600\t248250\n瓶型\t248251\n财物\t248252\n我是不是你最疼爱的人\t248253\n2.5d\t248254\n河南中公教育\t248255\n藏红花百科\t248256\n高锰酸盐指数\t248257\n空勤\t248258\n文盲\t248259\n共轴\t248260\n电开关\t248261\n最低位\t248262\n皇冠梨\t248263\n酉阳土家族苗族自治县\t248264\n鸳鸯眼\t248265\n账单日和还款日\t248266\n趋向\t248267\n机筒\t248268\n即速\t248269\ncup\t248270\n土名\t248271\narxiv\t248272\n卤菜\t248273\narrary\t248274\n滚动的天空\t248275\n温莎堡\t248276\n花童\t248277\n雨歌\t248278\n不幸\t248279\n特药\t248280\n邹恒甫\t248281\n跳轨\t248282\n格锐\t248283\n5155\t248284\n张首晟\t248285\n刘文勇\t248286\n亚冠杯\t248287\n茵栀黄颗粒\t248288\nexcle\t248289\n稀饭\t248290\n太原日报网\t248291\neventbus\t248292\n重启后\t248293\n蓬莱镇\t248294\nVibration\t248295\n广东农工商职业技术学院\t248296\n旅居\t248297\nCCTV-11_央视网\t248298\n光明桥\t248299\n123平米\t
248300\n小向まな美\t248301\nc语言程序设计\t248302\n周游\t248303\n杨红\t248304\n7820hk\t248305\n4754\t248306\n徐子陵\t248307\n头排\t248308\nsurfacebook2\t248309\n狼术\t248310\n张学敏\t248311\n沙罗曼蛇\t248312\n鳉鱼\t248313\nLorenzo\t248314\nmoment\t248315\n经济学院\t248316\n非聚簇索引\t248317\n广州大道\t248318\n贸易\t248319\n米兰家具展\t248320\n佩珀代因大学\t248321\n张硕辅\t248322\n拉里伯德\t248323\n更迭\t248324\n4百\t248325\nAgera\t248326\n冬日\t248327\n所谓\t248328\n柯南里\t248329\n茶庄\t248330\n宝兴之窗\t248331\n烟台酒店\t248332\n加档\t248333\n汞中毒\t248334\njiade\t248335\n朝族\t248336\n国家宗教事务局\t248337\n肺叶\t248338\n4667\t248339\nWishing\t248340\n平子\t248341\n时间规划局\t248342\nbyclick\t248343\n分库\t248344\n0.3.4\t248345\n课后生\t248346\n酒药\t248347\n特印\t248348\ntianmao\t248349\n桐柏县\t248350\n兽用\t248351\n信托产品\t248352\n上帝王\t248353\n重赛\t248354\n自治区工商局\t248355\nmAh\t248356\n9月8号\t248357\n女袜\t248358\n秦三\t248359\nSystemverilog\t248360\n95版\t248361\n新加坡滨海湾花园\t248362\n电脑柜\t248363\nJOG\t248364\n小洋山\t248365\n古力娜扎\t248366\n升压板\t248367\n吴海\t248368\n亚当\t248369\n镇上\t248370\n小径\t248371\n阿拉伯世界\t248372\n星尚\t248373\n阿土\t248374\n四川煤矿安全监察局\t248375\n北部\t248376\n火星时代教育\t248377\n迪高\t248378\n天津教育\t248379\ncofface\t248380\n笃\t248381\n灵芝孢子粉胶囊\t248382\n异教\t248383\nREVO\t248384\nria\t248385\n101岁\t248386\n建设有限公司\t248387\n优联\t248388\n福永码头\t248389\n传奇私服网\t248390\n肌肽\t248391\n流量器\t248392\n第8\t248393\n润江\t248394\nProvision\t248395\n吴兴区\t248396\n要命\t248397\ni3-8100\t248398\n武汉光谷\t248399\nFileUpload\t248400\n基台\t248401\n灯检机\t248402\n电热锅炉\t248403\n面语\t248404\n淋膜机\t248405\n照相\t248406\ndehydrogenase\t248407\n东门街\t248408\n河北电力\t248409\n客室\t248410\n妇科男医\t248411\n两码事\t248412\n卡希尔\t248413\n中营网\t248414\n拳手\t248415\nS3700\t248416\nMOON\t248417\n一嗨租车\t248418\n首拍\t248419\nClinics\t248420\n联众用户服务中心\t248421\nBeam\t248422\nwindows10吧\t248423\n物联网卡\t248424\n两只手指\t248425\nfc\t248426\n李晓宁\t248427\n淋膜纸\t248428\n万安山\t248429\n红房子医院\t248430\nseit\t248431\n项目化\t248432\n净菜\t248433\nkeylight\t248434\n四眼\t248435\n唐寅\t248436\n山水画家\t248437\n布尔值\t248438\n丝布\t248439\n青教\t248440\nfangfa\t248441\n凯宾\t248442\nGameRes\t248443\n凤
穿牡丹\t248444\n师洋\t248445\n命符\t248446\n来路\t248447\nlymph\t248448\n杉杉股份\t248449\n赛龙舟\t248450\nWORKS2\t248451\n蕾娜\t248452\n踢脚板\t248453\nconscience\t248454\nHelix\t248455\n驾车\t248456\n防腐木葡萄架\t248457\nDrawer\t248458\n果酸\t248459\n呼啸山庄\t248460\n49年\t248461\n检察室\t248462\nCinderella\t248463\n量子物理史话\t248464\n大小学\t248465\n微量进样器\t248466\n灵川\t248467\niphone6P\t248468\n铁板烧\t248469\ntopk\t248470\n97影院_九七电影院\t248471\n盛辉\t248472\n林文月\t248473\n走马岭\t248474\nE5800\t248475\naow\t248476\nNPI\t248477\n无妄\t248478\n司马光\t248479\nv$session\t248480\n1416\t248481\n钻石网\t248482\nBreeding\t248483\n神偷世家\t248484\n59亿\t248485\n大连市工商行政管理局\t248486\n八达岭野生动物园\t248487\n白信\t248488\n电子元器\t248489\n神武5\t248490\nIPHONEX\t248491\n完美主义\t248492\n一见倾心\t248493\n弄假\t248494\n周燕\t248495\n法式西点\t248496\n冠者\t248497\n雪窦寺\t248498\n峨山县人民政府\t248499\n150块\t248500\n六神装\t248501\n发挥\t248502\n最高级\t248503\n执委\t248504\nDigg\t248505\n法制网\t248506\n标记物\t248507\nWlan\t248508\n474\t248509\n居民点\t248510\n率下\t248511\n欧贝斯\t248512\n前驱车\t248513\n贪色\t248514\n网络工程\t248515\n副总设计师\t248516\n环境工程学\t248517\nev360\t248518\n弯曲机\t248519\niio\t248520\ncontentEditable\t248521\n上官婉儿\t248522\nltsb\t248523\n婴儿车\t248524\n艺格\t248525\n阵地战\t248526\n866\t248527\n忠烈杨家将\t248528\nocm\t248529\n正交化\t248530\nCarton\t248531\n1.0|\t248532\n712\t248533\n压管机\t248534\n乔托\t248535\n相见\t248536\n接掌\t248537\n进击的巨人漫画\t248538\n雷霆之怒\t248539\n3nm\t248540\n子装\t248541\n20多项\t248542\nNCRE报名系统\t248543\n低水平\t248544\n标况\t248545\n物理场\t248546\nR50\t248547\n优书网\t248548\n女美\t248549\n瑞根\t248550\n妙祥\t248551\n唐生\t248552\n康海\t248553\n茶罐\t248554\n六块\t248555\nBloody\t248556\n防抖\t248557\n立足\t248558\n系统类\t248559\n湖南银行\t248560\nmybatis3\t248561\n机器学\t248562\n杨恒均\t248563\n疑心病\t248564\n新知杯\t248565\n洪哥\t248566\n物象\t248567\n灰蓝色\t248568\n回盲部\t248569\n高姥山\t248570\n2018年5月1号\t248571\n阿贵\t248572\n竹达彩奈\t248573\n沈刚\t248574\nserv-u\t248575\ni3-4130\t248576\n软饵\t248577\n哥人\t248578\n运动战\t248579\n旺姆\t248580\n台域\t248581\n收缩性\t248582\nscanjet\t248583\n必要\t248584\n自治区纪委监委\t248585\n阎立品\t248586\n哈特诺村\t248587\n鸡宝\t248588
\n婴芭莎\t248589\n程朱理学\t248590\nCNU\t248591\n北京一证通\t248592\n昂科拉\t248593\n银通\t248594\n威海市人民政府\t248595\n金融证券\t248596\n何雯娜\t248597\n玩住\t248598\n盗摄\t248599\n标致3008论坛\t248600\n泸州电视台\t248601\nv3.7\t248602\n连击\t248603\n刘国中\t248604\nkpopstar\t248605\n追思会\t248606\n容柜\t248607\n七天后\t248608\n天游峰\t248609\nRJ版\t248610\nfe2o3\t248611\n燃气灶\t248612\n曾涛\t248613\nWindows10|Win8|Win7\t248614\n假性湿疣\t248615\n恐怖融资\t248616\nspacy\t248617\n立德粉\t248618\n骆玉笙\t248619\n北二外\t248620\n显卡天梯图\t248621\neveryday\t248622\n51talk无忧英语\t248623\n开槽\t248624\n退役战\t248625\n厚底\t248626\nscipy\t248627\n1.82\t248628\n植物神经紊乱\t248629\n铝芯电缆\t248630\n黄陂区人民政府网站_武汉黄陂政府\t248631\n上兴\t248632\n双压\t248633\nwindow\t248634\n匹多莫德口服液\t248635\n活性乳酸菌\t248636\n选登\t248637\n猪油膏\t248638\n口红\t248639\n环古城河\t248640\n17只\t248641\n吊图\t248642\n台上\t248643\n霍尔效应\t248644\n吴丽\t248645\nkeith\t248646\n同升\t248647\n67级\t248648\n感情世界\t248649\n个人户\t248650\n搜狐公司\t248651\nitss\t248652\n同心锁\t248653\nptyhon\t248654\n罗罗\t248655\n恰同学少年\t248656\n颜若\t248657\nHttpURLConnection\t248658\n话痨\t248659\n今金贷\t248660\n文献检索\t248661\n明辨是非\t248662\n到\t248663\n花儿香\t248664\n洛溪新城\t248665\n既视感\t248666\n守则\t248667\n郑州航空港区\t248668\n兽人文\t248669\n马来酸依那普利片\t248670\n补拍\t248671\n媚肉之香\t248672\n反转录\t248673\n王柯\t248674\n36_\t248675\n手工坊\t248676\nf181\t248677\n三滴\t248678\n金融研究\t248679\nUSB驱动器\t248680\n04月16日\t248681\n苍雪龙城\t248682\n现代商业杂志社\t248683\n贝尔摩德\t248684\nZN\t248685\n局部麻醉\t248686\n范文集\t248687\n河北省人事考试局\t248688\nmbl\t248689\n300篇\t248690\n样板\t248691\nCOA\t248692\n广州路\t248693\n变位系数\t248694\n0950\t248695\nppt_中考网\t248696\n现场图\t248697\nrestify\t248698\nAC66U\t248699\n杨老\t248700\n帽饰\t248701\n特质\t248702\n傍水\t248703\n行止\t248704\n双龙寿字币\t248705\n眼毛\t248706\n事半功倍\t248707\n中北\t248708\n海归男\t248709\n画技\t248710\n玺印\t248711\n务实\t248712\n松泽\t248713\n仪表箱\t248714\n湖镇\t248715\n华天酒店\t248716\n童玲\t248717\n回销\t248718\nistripper\t248719\n工作场所\t248720\n待办\t248721\n4499\t248722\n奇力康\t248723\n格斯\t248724\n工商管\t248725\nsabre\t248726\nangualrjs\t248727\n一房\t248728\n135元\t248729\n一览表\t248730\nSTER\t248731\n虚析构函
数\t248732\nsrc\t248733\n考案\t248734\n5.9.0\t248735\n周厉王\t248736\n有弊\t248737\n黑矮星\t248738\n网站关键词优化\t248739\n感度\t248740\n八关斋戒\t248741\n100升\t248742\n盖世史\t248743\n颂歌\t248744\n京溪南方医院\t248745\nG9350\t248746\n外星\t248747\n凉城县\t248748\n大众摄影\t248749\n有说\t248750\n金山开发区\t248751\ncamry\t248752\n速锐\t248753\n狭缝\t248754\n中国基金网\t248755\n耳痛\t248756\n它们\t248757\n小番外\t248758\n战况\t248759\n得意洋洋\t248760\n朗行论坛\t248761\n皮布\t248762\nti84\t248763\n不得不知\t248764\n第30章\t248765\n载弹量\t248766\n马晓燕\t248767\nxuanze\t248768\n张教授\t248769\n中国能源\t248770\n山东黄金\t248771\n派送员\t248772\nJK触发器\t248773\n等效\t248774\n托克托吧\t248775\n水陆空\t248776\nps7\t248777\n小红门乡\t248778\n冠心病\t248779\n维多利亚湾\t248780\n陕西省数字证书认证中心\t248781\n明色\t248782\nautoencoder\t248783\n农民房\t248784\n灯火阑珊处\t248785\n江江\t248786\noxo\t248787\n五八\t248788\n秋明\t248789\n浩海\t248790\nC++标准库\t248791\n江边\t248792\nz-1\t248793\n32歳\t248794\nworkspace\t248795\n驾照网\t248796\nUndeveloped\t248797\n黑色星期五\t248798\n20170111\t248799\nbadger\t248800\n大卫杜夫\t248801\n窗格\t248802\n苏卡达\t248803\n济南护理职业学院\t248804\npremature\t248805\n小觅\t248806\n播种面积\t248807\n6.5cm\t248808\nToby\t248809\n吴宫\t248810\n入国\t248811\n搜狗动漫\t248812\nFeaturing\t248813\n即墨北\t248814\n交响乐队\t248815\n心水主\t248816\n菩提果\t248817\n交通展\t248818\n阿SA\t248819\n爱思特\t248820\ndlink\t248821\n7.5.5\t248822\nmot\t248823\nSEARCH\t248824\n伊朗里亚尔\t248825\n想入非非\t248826\n魔天\t248827\n1x\t248828\n嵊\t248829\n规则集\t248830\n罗田\t248831\n巧妇\t248832\n措施费\t248833\ndts\t248834\n华加\t248835\n奥美沙坦酯片\t248836\n少爷风流\t248837\n深圳儿童医院\t248838\n爱卡汽车\t248839\n怎_\t248840\n鸢尾花\t248841\n舒体\t248842\nsubtitle\t248843\n小额免密支付\t248844\ngl62m\t248845\nifr\t248846\n扔进\t248847\n李维康\t248848\n佑米\t248849\n不动产证\t248850\n小传\t248851\n顺序性\t248852\n谷歌搜索引擎\t248853\n窥视\t248854\n20150811\t248855\n咕噜吧啦音乐网\t248856\n王彩桦\t248857\n大白汽车分期\t248858\n20170226\t248859\n差别化\t248860\ntiamo\t248861\n长沙西站\t248862\n姐妹团\t248863\n湛江大道\t248864\n监盘\t248865\n汉秀\t248866\n不排卵\t248867\n14时\t248868\n第7季\t248869\n0点0分\t248870\n册码\t248871\n梁月\t248872\n中星\t248873\n厦门市国家税务局\t248874\n远郊\t248875\n姚亮\t2488
76\n80070091\t248877\n投票权\t248878\n挂落\t248879\n武将风云录\t248880\n龙湖名景台\t248881\n豆\t248882\n颜\t248883\n阔腿裤\t248884\nEMLOG\t248885\nUNITY3D\t248886\nBTC123\t248887\n天堂电影院\t248888\n考勤管理系统\t248889\n男虎\t248890\n大国有企业\t248891\n秦帝\t248892\n齿轮轴\t248893\n车符\t248894\n水务\t248895\n杨晓峰\t248896\nAxela昂克赛拉\t248897\n龙墨\t248898\n康宏\t248899\n行政学院\t248900\n广州化工\t248901\n顶式\t248902\n青枫公园\t248903\n聊斋艳谭\t248904\nexcel批注\t248905\n城镇居民可支配收入\t248906\n信则成\t248907\n通知费\t248908\n黄景行\t248909\n包罗\t248910\n木花\t248911\n六页\t248912\n湖南工学院\t248913\n青岛医院\t248914\n肖毅\t248915\nEarned\t248916\n白玫瑰\t248917\nRecording\t248918\n角度传感器\t248919\n巴塞\t248920\n区域型\t248921\nizle\t248922\nOptane\t248923\n陕西省物价局\t248924\n回呼\t248925\n蓝沁\t248926\nRS2\t248927\n闷油瓶\t248928\n梦灵\t248929\n城东新城\t248930\ncems\t248931\n巴塞尔协议3\t248932\n赵萍\t248933\n小兴安岭\t248934\n198号段\t248935\n伯仁\t248936\n鹿筋\t248937\n投缘\t248938\nJokes\t248939\n老端\t248940\n无脚\t248941\n行研\t248942\n五星级\t248943\n王宝强\t248944\nddr3l\t248945\n8000亿\t248946\n小臭臭\t248947\n.war\t248948\n上海购物中心\t248949\n愚忠\t248950\n耐火窗\t248951\n九正家居网\t248952\nlsh\t248953\n致爱丽丝\t248954\nOurOcg\t248955\nglu\t248956\nburied\t248957\n第50章\t248958\n7130\t248959\n会议论文集\t248960\n上街\t248961\ntoFixed\t248962\n2rs\t248963\n压电\t248964\n香芋奶茶\t248965\n今年2月\t248966\n斑驳\t248967\n空气过滤器\t248968\n大奖章\t248969\nESS\t248970\nArchi\t248971\ndropbear\t248972\nSeek\t248973\n14PK\t248974\n欧洲游\t248975\n瑞克斯\t248976\nGashina\t248977\n苁蓉\t248978\n瓷窑\t248979\n皇冠蛋糕\t248980\nWIRE\t248981\n一妻\t248982\n近半\t248983\n2018-02-01\t248984\nPablo\t248985\n七十二家\t248986\n冷战\t248987\n脚部\t248988\n麦孔\t248989\nLiterary\t248990\n中山大学南方学院\t248991\n所得税费\t248992\n艾滋_\t248993\nHD1280高清韩语\t248994\n国际志愿者日\t248995\n娱乐宝\t248996\n撸之撸\t248997\n芙蓉站\t248998\n苏幕\t248999\n眯眼\t249000\n2018.04.10\t249001\n信披\t249002\n网络工程专业\t249003\nCarbo\t249004\n锡山司法局\t249005\n饭菜\t249006\n芳村\t249007\n过膝袜\t249008\n赤柱\t249009\nWE战队\t249010\n南坑\t249011\n创博会\t249012\nMT7688\t249013\nnave\t249014\nacer\t249015\n答辩人\t249016\n财务管理毕业论文\t249017\n1.5.8\t249018\n180425\t249019\n退
货运费险\t249020\n中国科学院过程工程研究所\t249021\n40平米\t249022\nSyncthing\t249023\nTRW\t249024\nxps15吧_\t249025\nImplementation\t249026\n|Sony/KM\t249027\n观致汽车\t249028\n000876\t249029\n命令行参数\t249030\n信函\t249031\n粤通\t249032\n二手\t249033\nComponentOne\t249034\n讼\t249035\n40集\t249036\n三浦惠理子\t249037\n古色古香\t249038\n国际处\t249039\n暗黑3吧\t249040\nPotPlayer播放器\t249041\n导料\t249042\ndeo\t249043\n记挂\t249044\nduora\t249045\n狂野飙车\t249046\nsprign\t249047\n12.8\t249048\nPCAP\t249049\n奥田\t249050\nGCJ-02\t249051\n河南人才网\t249052\n龙蛇演义\t249053\n苏格兰牧羊犬\t249054\n斯柯达明锐\t249055\n度假屋\t249056\nAtlassian\t249057\n缓一缓\t249058\nradiobox\t249059\n319号\t249060\n0.22um\t249061\n16CD\t249062\ncounters\t249063\n小力\t249064\n奇米影视\t249065\n阚\t249066\npack1\t249067\n中天置地\t249068\njianer\t249069\n张淼淼\t249070\n食灵\t249071\n10690700367\t249072\n监管仓\t249073\n一路上有你\t249074\n黄振龙\t249075\nKOHLER\t249076\n搜狐动漫\t249077\n遥观镇\t249078\n注册资本\t249079\n丄\t249080\n星条\t249081\n自然卷\t249082\npillow\t249083\n兰州西站\t249084\n异想世界\t249085\n衣舞\t249086\n彭昱畅\t249087\n5112\t249088\n莫曼顿\t249089\n千年3\t249090\nXLSX\t249091\n大鹏鸟\t249092\npvz2\t249093\n御堂\t249094\n甲骨文\t249095\n爆肚\t249096\n黄楚平\t249097\n雅马哈\t249098\n劳动最光荣\t249099\n黄威\t249100\n真题\t249101\n荣耀5\t249102\n30篇\t249103\npvsyst\t249104\n463\t249105\n学习效率\t249106\n京津城际铁路\t249107\n嘻哈四重奏\t249108\n咬舌\t249109\n顶底\t249110\n夏冰\t249111\n企业汇算清缴\t249112\n中国水周\t249113\n华为P8\t249114\n明太祖朱元璋\t249115\n凌渡宝马x3\t249116\n抬花轿\t249117\n切换输入法\t249118\n交易者\t249119\n边角料\t249120\nGmarket\t249121\n筱杉\t249122\n2717\t249123\n长屿\t249124\n青龙村\t249125\n哈洛克\t249126\n张玮\t249127\n270元\t249128\n泪妖\t249129\n紫嫣\t249130\nTobias\t249131\ngrefr\t249132\nG520\t249133\n颗菜\t249134\n浪潮公司\t249135\ndiana\t249136\n干球\t249137\n奎托斯\t249138\nDIT\t249139\n未来水\t249140\n固镇县\t249141\nL\t249142\n谢顶\t249143\nIncome\t249144\n作家榜\t249145\nMarionette\t249146\n爱马\t249147\n双关\t249148\n北京奔驰奔驰\t249149\n长江实验小学\t249150\nArmor\t249151\n军科\t249152\n就这样\t249153\n上铺\t249154\n浙江省民族宗教事务委员会\t249155\n500元\t249156\n苏州北\t249157\n光学\t249158\n大秦帝国之崛起\t249159\nCListCtrl\t2
49160\n龙岗政府在线\t249161\nsklearn机器学习库\t249162\n长春乐居\t249163\n航运\t249164\n养生百科\t249165\n二部\t249166\n铝镍钴\t249167\nwish平台\t249168\n季凉川\t249169\n粉瓶\t249170\ntrades\t249171\n回缩\t249172\nGoldenField\t249173\n文者\t249174\n板牙\t249175\ngetAttribute\t249176\nWindowsError\t249177\n隔间\t249178\nloon\t249179\n别墅式\t249180\n310亿\t249181\n旅客列车\t249182\n秒白条\t249183\nIT007\t249184\nRuckus\t249185\n夜宠\t249186\n8510\t249187\n宋晓峰\t249188\n末日迷踪\t249189\n蘸汁\t249190\nbce\t249191\n高分子网\t249192\n泉源\t249193\nMyBatis批量\t249194\n十八位\t249195\n燕妮\t249196\n约克空调\t249197\n武警工程大学\t249198\n91tv\t249199\n大恶司\t249200\n直逼\t249201\n致盲\t249202\nPAD\t249203\n烟台论坛\t249204\n问曰\t249205\n梁静\t249206\n变调\t249207\n工业以太网\t249208\nPower-BI\t249209\n奥迪A8\t249210\n南京富士通电子信息科技股份有限公司\t249211\n厦门大学公共事务学院\t249212\n贾跃亭新公司\t249213\n打表\t249214\nITM\t249215\n魔纹石\t249216\n爱犬\t249217\nHinton\t249218\n陌森眼镜\t249219\n幻镜\t249220\n手盆\t249221\n12万个\t249222\n讯景\t249223\n区民政局\t249224\n146名\t249225\n赵先明\t249226\n赶超\t249227\n万能声卡\t249228\n西洋画\t249229\nIF函数\t249230\nBigInteger\t249231\n05号\t249232\n水土保持\t249233\n激光打码机\t249234\nm558\t249235\n虚空精灵\t249236\n陶醉\t249237\nIPC\t249238\n简书\t249239\n教育咨询师\t249240\n融解\t249241\n秦时明月之天行九歌\t249242\n好丽\t249243\n新选组\t249244\n大员\t249245\nrural\t249246\n五十级\t249247\n刘建忠\t249248\n有机肥造粒机\t249249\n林政\t249250\n纸皮核桃\t249251\n岳晓东\t249252\n科比布莱恩特\t249253\n小二网\t249254\nGopro\t249255\n9.5.5\t249256\nbraid\t249257\n地转偏向力\t249258\n战锤2:全面战争\t249259\n5月24日\t249260\n东钱湖镇\t249261\n高永\t249262\n张翔玲\t249263\n局团委\t249264\n快乐大本\t249265\n太妃糖\t249266\nflam\t249267\n$10\t249268\n7.0.4\t249269\n圆凳\t249270\nowing\t249271\nYY粉丝网\t249272\n马石油\t249273\n金梧桐\t249274\n吕思清\t249275\n吉他独奏谱\t249276\nEnrollment\t249277\n中国女排\t249278\n京东商场\t249279\n旋喷桩\t249280\n看不透\t249281\n四战\t249282\n司马辽太郎\t249283\ndell吧_\t249284\n喉软骨\t249285\nDJMAX\t249286\n梆梆\t249287\n5.5.2\t249288\n华润医疗\t249289\n赵丽颖\t249290\n波罗峪\t249291\n中海\t249292\n4层\t249293\n过热器\t249294\n归依\t249295\n此情可待\t249296\n贤令\t249297\n绿叶菜\t249298\nJAZZ\t249299\n万霖\t249300\n学生处\t249301\n画龙点睛\t24930
2\n贷款王\t249303\n中坤广场\t249304\n斌\t249305\n欅\t249306\n叔丁基苯酚\t249307\n陈丹婷\t249308\n孤寂\t249309\n不习\t249310\n牛鹿\t249311\n策略组\t249312\nCompat\t249313\nzikao5.com\t249314\n弦\t249315\n新疆维吾尔自治区交通运输厅\t249316\n蔡宝健\t249317\n复合肥料\t249318\n行变\t249319\n展望_参考网\t249320\n双底\t249321\n锚杆静压桩\t249322\n72级\t249323\n望谟县\t249324\n理财规划师\t249325\n思埠\t249326\n霍乱\t249327\n20160410\t249328\n新疆人事考试网\t249329\n更行\t249330\n1千\t249331\n两坨\t249332\n代缴\t249333\n上证50指数\t249334\n4399仙境传说\t249335\n新野\t249336\n鸡蛋卷\t249337\n聚飞光电\t249338\n美国科技公司\t249339\n马嘴\t249340\n反邪教\t249341\n当里\t249342\n山东省委宣传部\t249343\n600588\t249344\n开涮\t249345\n层间\t249346\n未来科学城\t249347\n眼科学\t249348\nMUSIX\t249349\n卡牌类\t249350\n直购\t249351\n2018-02-07\t249352\n3章\t249353\n链锯\t249354\ngs63\t249355\nabaqus2017\t249356\n张泉灵\t249357\n11:00\t249358\n孙旭东\t249359\nnamespaces\t249360\nResizer\t249361\n戴龙\t249362\n优图\t249363\n偏见\t249364\n僵尸大陆\t249365\n封药\t249366\n青剑湖\t249367\n酌情\t249368\n娶亲\t249369\n虚空先知\t249370\n坦克世界闪击战\t249371\nseas\t249372\n淋巴肿\t249373\n流亡者\t249374\n酷6视频\t249375\n胃口\t249376\n不格式化\t249377\n冥婚\t249378\n生生世世\t249379\n最低工资|华律\t249380\n死亡之雪2\t249381\n鼻咽炎\t249382\n凶暴\t249383\n收尸\t249384\n甲午风云\t249385\n湖南工程职业技术学院\t249386\n激\t249387\n68家\t249388\n林心如\t249389\n箱式货车\t249390\n麻黄碱\t249391\n克数\t249392\n畅云\t249393\njavacript\t249394\n冲绳那霸\t249395\n接驾\t249396\n天津石化\t249397\n选种\t249398\n马前了不起的盖茨比\t249399\nLeo\t249400\n汪辉\t249401\nExcel2017\t249402\n18042\t249403\n吉他谱G调弹唱\t249404\n警察证\t249405\ntumi\t249406\nprojet\t249407\n凯西\t249408\n新金山论坛\t249409\n偷星九月天\t249410\ncubase9\t249411\n东天目山\t249412\nMaxi-247\t249413\n少玩\t249414\n色爱综合网\t249415\n6秒\t249416\n5.0.2\t249417\n佳易儿歌视频大全网\t249418\n龙门站\t249419\n船籍\t249420\n过名\t249421\n360安全路由P1\t249422\nBash\t249423\n切尔诺贝利核电站\t249424\n语谱\t249425\nint函数\t249426\n享\t249427\n二寸\t249428\nGuerlain\t249429\n狂阶\t249430\n映秀镇\t249431\nKaty\t249432\n森罗\t249433\n金沙江大桥\t249434\n虾饺\t249435\n救护\t249436\n2年内\t249437\n林晨钰\t249438\nmutt\t249439\n几十kb\t249440\n凸台\t249441\n焊帽\t249442\n耆那教\t249443\n丹比奴\t249444\nCSV\t249445\n北
京海淀区小学\t249446\n17场\t249447\nKomodo\t249448\n我的青春我做主\t249449\n手费\t249450\n转印\t249451\nwin10显卡驱动\t249452\n单身者\t249453\n风景照\t249454\n25996767\t249455\n万寿\t249456\n裸藻\t249457\nmek\t249458\n伊藤英明\t249459\n金沙大道\t249460\nkara\t249461\n汉普\t249462\nMIN\t249463\nPoet\t249464\n5-8月\t249465\nrohs\t249466\n补妆\t249467\n茶马\t249468\n私秘\t249469\nJar\t249470\n1分钱\t249471\n650mm\t249472\n十字镇\t249473\n广告杯\t249474\n幼升\t249475\n剑三代练\t249476\n贲门癌\t249477\nSpiceworks\t249478\n恰似\t249479\n八棱\t249480\n象群\t249481\ndavichi\t249482\n紫荆路\t249483\n先发制人\t249484\n北京高能时代环境技术股份有限公司\t249485\n练通\t249486\n真章\t249487\n光纤跳线\t249488\n过门石\t249489\n小i\t249490\n昨天下午\t249491\n定基\t249492\n窿\t249493\n管涌\t249494\nfive\t249495\nuntrusted\t249496\n音悦Tai\t249497\n指环王\t249498\n草表\t249499\n高能版\t249500\n雄黄酒\t249501\n1467\t249502\n铜屑\t249503\nThee\t249504\nWord-ExcelHome\t249505\n存证\t249506\n株洲市政府\t249507\n哈尔滨学院\t249508\n大乘佛教\t249509\n广东省卫生厅\t249510\n好心游戏网\t249511\nwe123.com\t249512\n搜狐新闻\t249513\nsupernova\t249514\n麦家\t249515\n跳掉\t249516\ndirectx12\t249517\n时尚笔记_海报时尚网\t249518\n炉香\t249519\n重彩画\t249520\n气动阀\t249521\n欧豪\t249522\n冈本\t249523\n中国通信第一产经\t249524\n畀\t249525\n人面桃花相映红\t249526\n埋头苦干\t249527\n委书记\t249528\n反卷积\t249529\n20160623\t249530\n福田瑞沃\t249531\n纳米树脂\t249532\n男高\t249533\n豫建建\t249534\n战斗王之飓风战魂\t249535\n终结者2:审判日直播_终结者2:审判日\t249536\nfuhao\t249537\n垦荒\t249538\n李泽钜\t249539\n漕河泾街道\t249540\n水嶋あずみ\t249541\n苏越\t249542\n关联关系表\t249543\n24年\t249544\n中土纪元\t249545\n江淮大众\t249546\n柜门\t249547\n湖南国税电子税务局\t249548\n17品\t249549\n闪耀的罗曼史\t249550\n陵寝\t249551\n张若虚\t249552\n销售处\t249553\n闻一多先生的说和做\t249554\n明年春节\t249555\nmeng\t249556\n二级路由器\t249557\n五里井\t249558\n地情\t249559\n缩量\t249560\n未知死亡\t249561\n18042期\t249562\n普贤行愿品\t249563\n与此同时\t249564\nvisualstudio2017\t249565\n云梦县\t249566\n魔龙之魂\t249567\n锐利度\t249568\n蜂窝煤机\t249569\n华东理工大学化工学院\t249570\n黑吃鸡\t249571\n狂狮\t249572\n还好\t249573\n河合奈保子\t249574\n天津师范\t249575\npoverty\t249576\nGreyson\t249577\n赢利\t249578\n方头\t249579\n中华小姐\t249580\n新娘妆\t249581\n雾屏\t249582\n浙江大学医学院\t249583\n大庆网\t249584\nc51单片机\
t249585\n3包\t249586\n豆各庄乡\t249587\n彰\t249588\n史蒂夫\t249589\n改重\t249590\n木卡姆\t249591\n胎压监测器\t249592\n住人集装箱\t249593\n历史馆\t249594\n7步\t249595\n盛桥镇\t249596\n陇南政府网\t249597\ndegree\t249598\n莲花新村\t249599\nv3.6.2\t249600\nMxNet\t249601\nmysql-workbench\t249602\n邻居\t249603\n韩力\t249604\n张鹤\t249605\n1000道\t249606\nFedex\t249607\n康乐花园\t249608\n安畅\t249609\n唐建军\t249610\n乌镇西栅\t249611\n首都经济贸易大学\t249612\n贵溪\t249613\n7根\t249614\n平安镇\t249615\n武海峰\t249616\n杏花村\t249617\nF3.5\t249618\ndest\t249619\n铁轮\t249620\n预录\t249621\n青柑\t249622\nit男\t249623\n拉屎\t249624\n柯里化\t249625\n龙血武姬\t249626\nvanish\t249627\n南昌市城乡规划局\t249628\nAGREE\t249629\n数学类\t249630\n粮草\t249631\n新浪江苏新闻_新浪\t249632\n底部\t249633\n茅山新四军纪念馆\t249634\n桐岛永久子\t249635\n昆明中国国际旅行社\t249636\n张骏\t249637\n秣\t249638\n马拉松赛\t249639\n北京托福\t249640\n新城区\t249641\n金数\t249642\n贤丰控股\t249643\n求逆\t249644\n58同城\t249645\n畅听\t249646\n五邑\t249647\n粉体\t249648\n厚涂\t249649\n千仞雪\t249650\n冯莹\t249651\n四例\t249652\n罗托克\t249653\n防火门\t249654\n史莱克\t249655\nCenturies\t249656\n超级女声吧_\t249657\n格罗方德\t249658\n吾悦广场\t249659\n黄博\t249660\n天津港集团\t249661\n10枚\t249662\n划掉\t249663\n三单\t249664\n绝地求生死亡回放\t249665\nreallifecam\t249666\n九一八\t249667\n城市规划\t249668\n沙漠矿\t249669\n林恩术\t249670\nw88\t249671\n曲版\t249672\n80070057\t249673\n八分\t249674\n应不应该\t249675\n二手房贷款\t249676\n畅轻\t249677\n魔化生\t249678\ncytus\t249679\n何沐阳\t249680\n金融化\t249681\n法律咨询网\t249682\n金投美股网\t249683\n吊挂式\t249684\n武艺\t249685\n云南中烟\t249686\n火爆\t249687\nSharpest\t249688\n滚针\t249689\nior\t249690\n华氏度\t249691\n宜相克\t249692\n腹腔镜胆囊切除术\t249693\nluasocket\t249694\n乐智小天地\t249695\n然乌湖\t249696\n金城小区\t249697\n儿童\t249698\nR语言\t249699\n三聚氰胺板\t249700\n电动套丝机\t249701\n船式\t249702\n工品汇\t249703\naccessories\t249704\nJIM\t249705\nlt26i\t249706\n翻译腔\t249707\n定州新闻网\t249708\nNUS\t249709\nAthlon\t249710\n玩球\t249711\nbanned\t249712\n穿心莲内酯\t249713\n谋事在人\t249714\nPlaybook\t249715\n200SMART\t249716\n大连重工\t249717\nps2接口\t249718\n山东路桥\t249719\n义体\t249720\n泄露\t249721\n温州大学城\t249722\n湿\t249723\n秦渊\t249724\n青青稞酒\t249725\n全选\t249726\n无锡市卫生局\t249727\nmotioninjo
y\t249728\nSun\t249729\n李自健\t249730\ngeojson\t249731\n中共一大会址\t249732\n刘爽\t249733\n成都东软学院\t249734\nalibaba国际站\t249735\n超级明星\t249736\n28页\t249737\n西湖边\t249738\n报销型\t249739\nAnalyzer\t249740\n实质上\t249741\nll16\t249742\n2013—2017年\t249743\nMainstream\t249744\nigm\t249745\n镜子\t249746\n天日\t249747\n联队\t249748\nvasp\t249749\n输出流\t249750\nAdvent\t249751\nPygame\t249752\nkepserverex\t249753\n帮写\t249754\n化妆术\t249755\n180316\t249756\n永磁电机\t249757\n苏州地铁\t249758\n虎蚕\t249759\nintegral\t249760\n乘人\t249761\n五里镇\t249762\ncheque\t249763\n乌蝇哥\t249764\nKay\t249765\n四川艺术职业学院\t249766\n农历年\t249767\nconsoles\t249768\n魅蓝x\t249769\n比翼\t249770\n加洛特\t249771\n驻马店网\t249772\n麻辣诱惑\t249773\nt+1\t249774\n拘留\t249775\n二氯甲烷\t249776\n粗人\t249777\n好别扭\t249778\n武卫东\t249779\n王金城\t249780\nGTO\t249781\n城下町\t249782\n相加\t249783\n柱平法\t249784\n民生大厦\t249785\n干股\t249786\n储存卡\t249787\n夢乃\t249788\n苏芷\t249789\nNyaa\t249790\n探秘\t249791\nshaded\t249792\n百人会\t249793\n侯村\t249794\n厚仪\t249795\n滑轮鞋\t249796\n苮儿\t249797\n枫叶布\t249798\n比分\t249799\n腰椎间盘膨出\t249800\n有所不为\t249801\n顺路\t249802\n第126集\t249803\n运算器\t249804\n自治区政府\t249805\n校场\t249806\n翼环\t249807\nModes\t249808\n名规则\t249809\n计轴\t249810\n梧桐街1905号\t249811\n大营村\t249812\n铜精矿\t249813\n保险新闻网\t249814\nnumbering\t249815\n南通市区\t249816\n安世半导体\t249817\n抚弄\t249818\n78张\t249819\n收缩压\t249820\n青红皂白\t249821\n北京新时代模特学校\t249822\nawg\t249823\n沧州兴昊管道有限公司\t249824\n星雨\t249825\n生贽\t249826\n安全玻璃\t249827\n绒球\t249828\n堆场\t249829\n用刑\t249830\n变形金刚玩具\t249831\n讲学\t249832\n细面\t249833\n2c\t249834\nv2.9\t249835\nAdvantages\t249836\n别走\t249837\n报刊\t249838\n镜花缘\t249839\n流量表\t249840\n独克宗古城\t249841\n中望3D\t249842\n96v\t249843\nLO\t249844\n天使投资人-著名天使投资人名录\t249845\n百里挑\t249846\n大童保险\t249847\n奥纬\t249848\nCucumber\t249849\n进退两难\t249850\n捧起\t249851\n百度信誉档案\t249852\n建设银行报\t249853\n银团贷款\t249854\nfirefox火狐浏览器\t249855\n45年后\t249856\n水果摊\t249857\n池中物\t249858\n河庄\t249859\n国家档案馆\t249860\n桥臂\t249861\n塔米\t249862\n页岩气资源税\t249863\n轻链\t249864\n贵州省环境保护厅\t249865\n紫穗槐\t249866\ngilisoft\t249867\n入选\t249868\nMer海蓝之谜\t249869\n金姐\t2
49870\n民主生活会谈心谈话记录\t249871\n础\t249872\nVARTA\t249873\n潮人\t249874\n止滑\t249875\n噬菌体\t249876\n政府管理学院\t249877\n质朴\t249878\n黄达\t249879\nQQ影音播放器\t249880\n大淸一統志\t249881\n伯恩斯坦\t249882\n储热\t249883\n随风飘\t249884\njsm3u8\t249885\n深圳站\t249886\n白芍药\t249887\n信托型\t249888\n槟榔谷\t249889\n威讯\t249890\n1783\t249891\n阿标\t249892\n重装机兵XENO\t249893\n接近传感器\t249894\nGOLDEN\t249895\n200寸\t249896\n形态率\t249897\n肇庆学院\t249898\noval\t249899\n射手影音\t249900\n钒矿\t249901\nCooMark\t249902\n阿sa\t249903\n冷却系统\t249904\n头痛药\t249905\n3男\t249906\n康乐新村\t249907\n飞雨\t249908\nTANGO\t249909\n赌侠诗\t249910\nGambit\t249911\nbiomed\t249912\n采风\t249913\nat89s51\t249914\n驱虫药\t249915\nMockplus\t249916\n玟\t249917\n40英尺\t249918\n爱琴\t249919\nERStudio\t249920\n联华超市\t249921\n升级换代\t249922\n奇物\t249923\n张秋生\t249924\nrs4\t249925\n班列\t249926\n线盒\t249927\n燕子\t249928\n喜极而泣\t249929\nastaxie\t249930\n5万多元\t249931\n优越者\t249932\n新三板挂牌公司\t249933\n香年广场\t249934\n宠姬\t249935\n功放板\t249936\n不辱\t249937\n洪姓\t249938\n龙岗区\t249939\n天天315\t249940\n转码\t249941\n抛货\t249942\n交接表\t249943\n辛特兰\t249944\n胭脂雪\t249945\nGameSpot\t249946\n圈子\t249947\n演\t249948\n三里亭\t249949\n王露露\t249950\nsourcing\t249951\n元彪\t249952\n狗们\t249953\n译库\t249954\n盘先生\t249955\nnurses\t249956\nLaw\t249957\n王乐君\t249958\n懒\t249959\n郑虹\t249960\n东方虹医学美容网\t249961\n醋味\t249962\n战时\t249963\n李新民\t249964\n重庆限号\t249965\n大唐国际\t249966\n醋酸氯\t249967\n升职申请书\t249968\n经典观后感集\t249969\n未完成\t249970\n清底\t249971\nv2.0.9\t249972\n十八洞\t249973\n镇里\t249974\n2380\t249975\n十碗\t249976\nphp5ts.dll\t249977\n地屈孕酮片\t249978\n风景名胜区\t249979\n虎皮兰\t249980\n令人吃惊\t249981\n暗黑龙爵\t249982\n分餐\t249983\nDPT-RP1\t249984\n47期\t249985\n兰州财经大学\t249986\n订单页\t249987\npropulsion\t249988\n3608\t249989\n金鸡镇\t249990\nlj\t249991\ntutorabc\t249992\n上海虹桥站\t249993\n金龙大厦\t249994\n一吹\t249995\n北京街道\t249996\n码间\t249997\n海柳\t249998\n海康硬盘录像机\t249999\n甩手网\t250000\n江苏油田\t250001\n5W30\t250002\n岿然不动\t250003\n创捷\t250004\n巴布亚新几内亚\t250005\nputs\t250006\n沃斯\t250007\nHD-720P-MP4\t250008\nsklean\t250009\n预埋螺栓\t250010\n一劫\t250011\n二孩\t250012\n康旗股份\t250013\n相敬如宾\t250
014\n健腹轮\t250015\n预埋板\t250016\nspecified\t250017\n东孚\t250018\n电动车时代网\t250019\n驶向\t250020\n猫头\t250021\n复垦\t250022\n劳累\t250023\nAppleiPad\t250024\nBIT\t250025\n耳语者\t250026\n2018.4.11\t250027\nCary\t250028\n黄巢\t250029\n黑龙江大学\t250030\n墨多多\t250031\nvodafone\t250032\n35TFSI\t250033\n挖掘\t250034\n龙虎风云\t250035\n昆山南站\t250036\n亚航\t250037\n喝牛\t250038\n画蛇添足\t250039\n山东省住建厅\t250040\nplanted\t250041\n住有所居\t250042\n植草\t250043\n智尊宝纺\t250044\n蔬菜饼\t250045\n终不悔\t250046\n075582\t250047\n像素描\t250048\n万唯\t250049\n睡服\t250050\n负荆请罪\t250051\n临漳\t250052\n九堡\t250053\n争斗者\t250054\n87087227\t250055\n赚钱\t250056\n胡胖子\t250057\n贝登\t250058\n二丙二醇\t250059\n神帖\t250060\n龙泉街\t250061\nNAB\t250062\nLazarus\t250063\n捷报频传\t250064\nparties\t250065\n广发南航\t250066\n刀轴\t250067\n物位\t250068\noctober\t250069\n控制灯\t250070\nU77\t250071\n32年前\t250072\nfraud\t250073\n西游车展网\t250074\nsinus\t250075\nz\t250076\n淫途\t250077\n歌诗达邮轮\t250078\nmofos\t250079\n新债申购\t250080\n水果片\t250081\n装作\t250082\n华南师范\t250083\n正月初三\t250084\n彩妆\t250085\n陕西日报\t250086\n山西地税\t250087\n靳哲\t250088\nAutoit\t250089\nintellectual\t250090\nKERNEL32.dll\t250091\n执行率\t250092\n鸿威\t250093\n失德\t250094\n相反数\t250095\n钢丸\t250096\npps影音\t250097\n孙姓\t250098\n系统流程图\t250099\n超声波检测仪\t250100\n滨海镇\t250101\n中教\t250102\n全自动贴标机\t250103\n李贤平\t250104\n妞范\t250105\n值班员\t250106\n格子裙\t250107\n月经&#160\t250108\n省身\t250109\n小巧玲珑\t250110\nAren\t250111\n新光天地\t250112\n宠妻法则\t250113\n1摄氏度\t250114\n1435276397\t250115\n于庄\t250116\nDGJ\t250117\nrgd\t250118\n班机\t250119\nGame5\t250120\n线箱\t250121\n亨本河\t250122\n云汇魔盒\t250123\n6.7.0\t250124\n耐斯\t250125\nbracket\t250126\nV+\t250127\nks\t250128\n东南大学建筑学院\t250129\n火网\t250130\n随波逐流\t250131\n直缝钢管\t250132\nBathing\t250133\n现役军人\t250134\n祖宾\t250135\n红烧肉\t250136\nKFR-50LW/\t250137\n华润万象天地\t250138\n肥城\t250139\npu皮\t250140\nDiamond\t250141\n属主\t250142\n洗头房\t250143\n不能承受的生命之轻\t250144\n百合粉\t250145\n摩擦材料\t250146\n真确\t250147\n2018年4月16号\t250148\n650万\t250149\nBPM\t250150\n亿客\t250151\n哈尔滨市住房保障和房产管理局\t250152\n旧宅\t250153\nDeed\t250154\nP11\t250155\n打野球\t250156\n基因
库\t250157\n20180313\t250158\n哈飞路宝\t250159\n洛阳关林\t250160\n卡修斯\t250161\niis\t250162\n丰田奕泽\t250163\nhardcore\t250164\nsummertime\t250165\n兵装\t250166\n300只\t250167\nISAPI\t250168\n百丈镇\t250169\nOVERLORD\t250170\nbareminerals\t250171\n第24\t250172\ntib\t250173\n中国注册会计师协会\t250174\n地中海\t250175\n液\t250176\n鹿鼎娱乐\t250177\n八十厘米\t250178\n小轩\t250179\n乔军\t250180\n波霸奶茶\t250181\n_Big磅\t250182\n创业商机网\t250183\n林淑娟\t250184\n孕夫\t250185\n正规式\t250186\n基教科\t250187\n达华\t250188\ngetters\t250189\n全真七子\t250190\n尤克里里谱\t250191\n吴桐祯\t250192\nexcel表单元格\t250193\n曹政奭\t250194\n西安政治学院\t250195\n陆心媛\t250196\n三国13\t250197\n古斯特\t250198\n商城版\t250199\n三位\t250200\n谢广坤\t250201\n华音\t250202\n地鼠网\t250203\n全国人大常委\t250204\n忍者\t250205\nquote\t250206\n寒菽\t250207\n1398\t250208\n斯瓦辛格\t250209\n探宝\t250210\nHPUX\t250211\n底布\t250212\nfinesse\t250213\n左右侧\t250214\n创景\t250215\n中国计生协\t250216\n随机变量函数\t250217\n四川省南充高级中学\t250218\n天剑群侠\t250219\n联盛广场\t250220\n东路\t250221\n废矿物油\t250222\n打雷劈\t250223\n挂机头\t250224\nitop\t250225\n濮\t250226\n热火队\t250227\n陕西省审计厅\t250228\n套壳\t250229\n杏堂夏\t250230\nmaestro\t250231\nL1300\t250232\nm+1\t250233\n钻柱\t250234\n签名表\t250235\nWR\t250236\n表盘\t250237\n肝功能检查\t250238\nManuscript\t250239\n31-60天\t250240\n源性\t250241\nmaterialgirl\t250242\n尽欢\t250243\nBALENCIAGA\t250244\nwavelet\t250245\n如何边\t250246\nSweetAlert\t250247\n_多玩游戏新闻中心\t250248\n阳光杯\t250249\n甜橙\t250250\n3.2\t250251\nCompressed\t250252\n谱\t250253\n衢州市人民医院\t250254\n黑图\t250255\n徐培成\t250256\n刷机精灵\t250257\n表土\t250258\n诸葛io\t250259\n巨蛙\t250260\n王丽丽\t250261\n)食品有限公司\t250262\n硝酸反应\t250263\n丁晴\t250264\n脚踏实地\t250265\n乳状\t250266\n3321\t250267\n银燕\t250268\n第六层\t250269\n2008级\t250270\n车膜\t250271\n并不\t250272\n安娜卡列尼娜\t250273\npostgrepsql\t250274\nEncryption\t250275\n东京搜查官re\t250276\n花莲\t250277\n上海地铁\t250278\n哈弗蛊\t250279\nautotune\t250280\nCher\t250281\n努比亚z7mini\t250282\n凤凰星座\t250283\n香槟玫瑰\t250284\n梅干菜\t250285\n星魂\t250286\nEslint\t250287\n薇姿VICHY\t250288\n何石\t250289\n第八位\t250290\n脱毛\t250291\n胡晓\t250292\n蔓蔓青萝\t250293\n白沙瓦\t250294\n长春欧亚\t250295\n克列门蒂\t250296\n乐扣乐\
t250297\n卡夫\t250298\nmerchant\t250299\n刊例\t250300\n晚上十一点\t250301\n漳州城市职业学院\t250302\n去手\t250303\n克鲁格曼\t250304\n货运费\t250305\n丰田汽车公司\t250306\n巨奖\t250307\nkeen\t250308\n智象\t250309\nM12\t250310\n左腿\t250311\nwann\t250312\n狮驼岭\t250313\n筑地\t250314\nAbstract\t250315\n甲基磺酸\t250316\nHessian\t250317\n夏瑶\t250318\n四屏\t250319\n肌酐清除率\t250320\n龙竹\t250321\nBT樱桃\t250322\n叶天\t250323\n主根\t250324\nBD-RMVB\t250325\n不死者之王\t250326\n可爱们\t250327\n0.5克\t250328\naggregated\t250329\nLucie\t250330\nSYT\t250331\n候鸟\t250332\nmodprobe\t250333\n大中国\t250334\n啸风\t250335\nLevelDB\t250336\napad\t250337\n603路\t250338\n月光奏鸣曲\t250339\n东京国际电影节\t250340\nintention\t250341\n清洁机\t250342\n复读机\t250343\n第15批\t250344\n训诂学\t250345\n阴间\t250346\nIncompatible\t250347\nlds\t250348\n城北乡\t250349\n29届\t250350\n致良知四合院\t250351\n山鼎\t250352\n癸酉\t250353\nPartial\t250354\n鞭挞\t250355\n数以\t250356\npeter\t250357\n0.9.1\t250358\n泰拉科技\t250359\n控制面\t250360\n8123\t250361\n工业新区\t250362\n美邻\t250363\nazd9291\t250364\n双唑泰栓\t250365\n陈明真\t250366\n_线话\t250367\n王大春\t250368\n国光\t250369\nCNAS认证\t250370\nst2\t250371\n果宝\t250372\n洺悦府\t250373\nmotivate\t250374\n仙本纯良\t250375\nMapinfo\t250376\n50s\t250377\n淘者\t250378\nAuthor\t250379\n洛川\t250380\n郫都\t250381\nmtcnn\t250382\n比音勒芬\t250383\n庆熹纪事\t250384\n群创\t250385\n铝箔袋\t250386\n只要你\t250387\n电影类\t250388\n验视\t250389\n哪一次\t250390\n华润双鹤药业股份有限公司\t250391\nmayday\t250392\nvmin\t250393\n阿基里斯\t250394\n淮城镇\t250395\n禁咒\t250396\n军籍\t250397\nNERO\t250398\n春糖\t250399\n双待\t250400\n鳖\t250401\n村舍\t250402\ncnelc\t250403\n女娲石\t250404\n对偶句\t250405\n世界征服者4\t250406\n李玉刚\t250407\n音乐世界\t250408\n铝基板\t250409\n焦晃\t250410\n悲哀\t250411\n隔离剂\t250412\n88395777\t250413\nhenghanan\t250414\n遵义市统计局\t250415\n1.5万元\t250416\n猥琐男\t250417\nldac\t250418\n镇改市\t250419\n逼单\t250420\n彳亍\t250421\n弯机\t250422\n属於\t250423\n111名\t250424\n婚庆\t250425\n五角场\t250426\n大陈\t250427\n济南市卫生和计划生育委员会\t250428\n巴比\t250429\n山东省机关事务管理局\t250430\n盒中\t250431\n太奇\t250432\n阿伦\t250433\n麦克斯\t250434\n内分泌学\t250435\n民生生活会\t250436\n绿色呼吸牛蒡茶\t250437\nZ17S\t250438\n审美观\t250439\n欧洲机场\t25
0440\n仙狐\t250441\nqq仙侠传\t250442\n20140605\t250443\n湿罗音\t250444\n10054\t250445\n二条\t250446\n宏源期货\t250447\n百度云论坛\t250448\n市场利率\t250449\n中国建筑装饰\t250450\n浸渍纸\t250451\nlightening\t250452\nhuhuuu\t250453\n见怪不怪\t250454\n程序坞\t250455\n正弦波\t250456\n水月亮\t250457\n奸妃\t250458\n宁波江北政府网\t250459\n华为畅享7plus\t250460\n近不近\t250461\n最后一个人\t250462\n西北门\t250463\n14.00\t250464\n道心\t250465\ntaskbar\t250466\n坝基\t250467\n郁江\t250468\n吸热\t250469\n50欧元\t250470\nTTMN\t250471\n孝感\t250472\n台球桌\t250473\n一万张\t250474\n士农工商\t250475\n变态乱伦\t250476\n量子阱\t250477\n郁钧剑\t250478\n第13关\t250479\ngg助手\t250480\nuv灯\t250481\n日照便民网\t250482\nobje\t250483\n埃德\t250484\n下里巴人\t250485\nxl39h\t250486\n昆明机床\t250487\n裁\t250488\n挂户\t250489\n湿性\t250490\n决策表\t250491\n良渚文化艺术中心\t250492\n上古汉语\t250493\n莫泰\t250494\n咨询服务类\t250495\nAbbott\t250496\n康师傅冰红茶\t250497\n平陆县\t250498\n3000多家\t250499\nogre\t250500\n孝义市政府\t250501\n100GB\t250502\n三拳\t250503\n国家药品监督管理局\t250504\nhighgui\t250505\n福彩3D字谜图谜\t250506\n338个\t250507\n暗器\t250508\n杭锦旗\t250509\n七点\t250510\n第二十二章\t250511\n舞蹈演员\t250512\n数控铣工\t250513\n嘴鸟\t250514\nfuck\t250515\n55\t250516\n国轩\t250517\nairkiss\t250518\n工商户\t250519\n孔雀石\t250520\nlaboratories\t250521\n乙丑年\t250522\n丁酉年\t250523\n甘氏\t250524\nblocking\t250525\n江华县\t250526\n走水\t250527\n咪咪兔\t250528\n空心楼\t250529\n重本\t250530\n乐拍\t250531\n喜豆\t250532\n神运\t250533\n000401\t250534\n8斤\t250535\n西安广播电视大学\t250536\n简解\t250537\nx3650\t250538\n声音效\t250539\n深交所\t250540\n西北工业大学自动化学院\t250541\n弹性势能\t250542\n热工\t250543\n隔断柜\t250544\nzain\t250545\n裘海正\t250546\n力生\t250547\n丘北\t250548\n厦门小猪网\t250549\nDubbo\t250550\nPop\t250551\n杀手6\t250552\n法式家具\t250553\n美年体检\t250554\n上海师范\t250555\n共同选择\t250556\n韩庆祥\t250557\n生化危机6:终章\t250558\n节目表\t250559\n三厅\t250560\n德里\t250561\n中华人民共和国民法总则\t250562\n红极一时\t250563\ntease\t250564\n2147483647\t250565\n偏\t250566\nbinascii\t250567\nNpm\t250568\n双控开关\t250569\n塑纸\t250570\n桃隐社区\t250571\n期房\t250572\nOnline4\t250573\n睡不醒\t250574\n0571-13588826066\t250575\n别傻傻分\t250576\n百度竞价排名\t250577\n帖子\t250578\n轮询\t250579\n水幕\t250580\n长清区\t250581\n110
4\t250582\n登入\t250583\n全考\t250584\n男文\t250585\n奥迪A6\t250586\nCane\t250587\npale\t250588\nc语言程序\t250589\nSupportAssist\t250590\n叶公好龙\t250591\nabb\t250592\n永安期货\t250593\nremind\t250594\n哈加萨\t250595\n五百公里\t250596\n瑞茂通\t250597\nhyperdunk\t250598\n昭通市\t250599\n一针\t250600\n88分\t250601\n王拓\t250602\n扬子江城市群\t250603\njess\t250604\n瘦脸针\t250605\nsugar\t250606\n二八二五六\t250607\n广告语\t250608\n手冢\t250609\n飘绿\t250610\n天中\t250611\n辽宁装备制造职业技术学院\t250612\n借读生\t250613\n裙装\t250614\n戴维斯杯\t250615\n油腻感\t250616\n车技\t250617\n腰裤\t250618\n过日子\t250619\n莱芜莱城区\t250620\n陈珏\t250621\n火影节奏大师\t250622\nvocaloid\t250623\n饥馑\t250624\n250kg\t250625\nBufferedReader\t250626\n陈晓波\t250627\n城规\t250628\n通常\t250629\n李敖伊布\t250630\n老红\t250631\n涌流\t250632\napmserv\t250633\n半桶\t250634\nSHA1值\t250635\n期盼\t250636\ny+1\t250637\nvideoshd\t250638\n托盘\t250639\n公路养护\t250640\n起点女频\t250641\n云裳馨悦\t250642\n信长\t250643\n脑膜\t250644\n永安公墓\t250645\n喊停\t250646\nThinkJS\t250647\n博森\t250648\ntigase\t250649\nKIDS\t250650\n小学生日记大全\t250651\n奥运福娃\t250652\n夸父\t250653\n联想启天\t250654\n弗吉尼亚\t250655\n全店\t250656\n昆明市住房和城乡建设局\t250657\nbutterflies\t250658\ncoleman\t250659\n顶芽\t250660\nPortal\t250661\n好下场\t250662\nRefrain\t250663\n16pf\t250664\n东洲家园\t250665\n柴古唐斯\t250666\n来曲唑\t250667\n戏床\t250668\n断断续续\t250669\n转述句\t250670\n帘子\t250671\nDusk\t250672\n洞林湖\t250673\nDelonghi\t250674\n制造部\t250675\n挟\t250676\n第46条\t250677\n硕思\t250678\n疏密\t250679\n肇庆市人民政府\t250680\n现车\t250681\n城果\t250682\n刘福\t250683\ndvi接口\t250684\n自然类\t250685\n罗技G300\t250686\n过\t250687\n拔罐减肥\t250688\n十二世纪\t250689\n热血三国\t250690\nConor\t250691\n生子\t250692\n西宝生物科技\t250693\nAssist\t250694\n导轮\t250695\n时下\t250696\n第六款\t250697\nsentient\t250698\n芳芳的性福生活\t250699\n静海县\t250700\n老壶\t250701\n筱晓\t250702\nJunit单元测试\t250703\nMIUI9稳定版\t250704\n梁璐\t250705\n中山大学附属第三医院\t250706\n蚩尤九黎城\t250707\n借新还旧\t250708\n管芯\t250709\nprado\t250710\n一首歌\t250711\n赤水市人民政府\t250712\n马光明\t250713\n金蝶kis记账王\t250714\n四川卫视\t250715\n华能大厦\t250716\nRTCP\t250717\n铃木汽车\t250718\n第七大\t250719\nLOB\t250720\n50A\t250721\nPornstars\t250722\n韩浩\t25
0723\npgpool\t250724\n诺氟沙星胶囊\t250725\n2015.doc\t250726\n20171009\t250727\n绝地求生成就\t250728\n第120章\t250729\n咚咚肿瘤科\t250730\n中国五冶大学\t250731\n田横岛\t250732\nDB9\t250733\n贴装\t250734\n火狐中文\t250735\nR2015a\t250736\n浙江省文化厅\t250737\n100米\t250738\n快乐12\t250739\n36块\t250740\n七件套\t250741\n绝命时钟2\t250742\n冰原狼\t250743\n双零\t250744\n塘沽湾\t250745\n疵\t250746\n早治\t250747\n最速\t250748\n斜影\t250749\n尚长荣\t250750\n车口\t250751\n4.6.0\t250752\n红磡体育馆\t250753\ncryptographic\t250754\nandroid6.0\t250755\nWin10正式企业版\t250756\n药师佛\t250757\n解药\t250758\n参展\t250759\nDumps\t250760\n乳清粉\t250761\n北京市朝阳区人力资源和社会保障局\t250762\n陕西人\t250763\n青丘狐传说\t250764\n笑言\t250765\n恶性淋巴瘤\t250766\n电采暖\t250767\n弧室\t250768\nbrush\t250769\n红二代\t250770\nPPF\t250771\n士兰\t250772\n危局\t250773\n天猫魔盒M13\t250774\n闷养\t250775\n万新村\t250776\n酸涩\t250777\n电信4G\t250778\n王锦蛇\t250779\n风振\t250780\n1500万套\t250781\n拼摆\t250782\n董文\t250783\n句\t250784\n我不是明星\t250785\n草莓地\t250786\n教育学基础\t250787\n古稀之年\t250788\n康比特\t250789\nMEIZU\t250790\n我和他\t250791\n翻译本\t250792\nchd\t250793\n黄埭\t250794\n八牌\t250795\n字幕组\t250796\nff14死宫\t250797\n同姓\t250798\n0624\t250799\n令人深思\t250800\n福建省科学技术厅\t250801\n_鹭岛\t250802\n电化学反应\t250803\n719所\t250804\n100T\t250805\n一针一线\t250806\n鞋标\t250807\n李庄\t250808\n信质\t250809\n变温\t250810\nchecklistbox\t250811\n3.1版\t250812\n权律二\t250813\n1000多万\t250814\n唐万新\t250815\n孙雪梅\t250816\n无忧支付网\t250817\n音差\t250818\n36届\t250819\nEdge\t250820\n吊威亚\t250821\n孙勇\t250822\n唐朝末年\t250823\nMBT\t250824\n2016年11月1日\t250825\n贷后管理\t250826\n鹦鹉螺号\t250827\n联觉\t250828\n正比例\t250829\n并驾齐驱\t250830\n小三居\t250831\n电子保单\t250832\n吉林省教育学院\t250833\nSubnautica\t250834\n松铺系数\t250835\n无砟轨道\t250836\n双4G\t250837\n猛攻\t250838\n芝柏\t250839\n方宏\t250840\nmustache\t250841\n陆面\t250842\n半生不熟\t250843\n富瀚微\t250844\n农夫导航\t250845\n湖南财经工业职业技术学院\t250846\nlady8844\t250847\n三星w2015\t250848\n兴业国际信托有限公司\t250849\n中国科学院近代物理研究所\t250850\n革命运动\t250851\n酷猴\t250852\n不止步\t250853\n硬脂酸钠\t250854\n浙江政府\t250855\n燕子来时新社\t250856\n郎\t250857\n诛九族\t250858\n精图\t250859\nlide\t250860\nuber\t250861\n美卡素\t250862\n0.52\t250863\n托夫勒\t2
50864\n逐页\t250865\n蓬朗\t250866\nrubbing\t250867\n轻质土\t250868\n页表\t250869\n铭泰\t250870\n拉巴斯\t250871\n定亡者\t250872\n接码\t250873\n北方导航\t250874\n残人\t250875\nrs485\t250876\n张海平\t250877\nA怪\t250878\n973计划\t250879\nレム\t250880\n宗喀巴\t250881\n酪蛋白\t250882\n翼展\t250883\nnewwifi\t250884\nTransfer\t250885\n熊去氧胆酸\t250886\n12.38万\t250887\n几艘\t250888\n华润万象\t250889\n京东快递\t250890\ndcom\t250891\n广州大学\t250892\n凯程\t250893\natos\t250894\n异质\t250895\ndbcontext\t250896\n滨海城市\t250897\n诸城\t250898\n蒙山大佛\t250899\n克劳利\t250900\n空调\t250901\n奕轩居\t250902\n赛隆药业\t250903\n艺术创作\t250904\n双排牙\t250905\n桐\t250906\n累积投票制\t250907\n大佬们\t250908\n成本部\t250909\n车边\t250910\n7例\t250911\n4盒\t250912\n神乐\t250913\n烈烈\t250914\n莲师\t250915\n披萨盒\t250916\n董家镇\t250917\nTivoli\t250918\n建设部\t250919\n皮卡堂\t250920\n超高温\t250921\nCracked\t250922\n野途\t250923\n拔河\t250924\n星海名城\t250925\noam\t250926\n中国教师人才网\t250927\n文曰小强\t250928\n大连体育中心\t250929\nlog4cplus\t250930\n昆山市\t250931\nWin7激活工具\t250932\n网景\t250933\nhbsag\t250934\n10S\t250935\nicb\t250936\n满蒙\t250937\n独闯天涯\t250938\n苏福\t250939\n3卫\t250940\n文秋芳\t250941\npyqtgraph\t250942\n青龙乡\t250943\n伯牙绝弦\t250944\n接屏\t250945\nlent\t250946\nepa\t250947\n绿阴\t250948\n聚荣网\t250949\ntolist\t250950\n爵爷\t250951\n排椅\t250952\n奇味\t250953\n兰香\t250954\n禁脔\t250955\n师大云端图书馆\t250956\noc渲染器\t250957\nv3.3.0\t250958\n真空箱\t250959\nSupporting\t250960\n亚当·莱文\t250961\n天瑞\t250962\n量\t250963\n标准色\t250964\n三人斗地主\t250965\n医香\t250966\n银耳汤\t250967\n四季歌\t250968\n希女王\t250969\n恪守\t250970\nCG儿\t250971\nComet\t250972\n闲云\t250973\nS32\t250974\n明确性\t250975\n16s\t250976\navant\t250977\nlovin\t250978\n使能\t250979\n种族歧视\t250980\n西涧\t250981\n忱\t250982\n1.2万个\t250983\nhref值\t250984\noee\t250985\nwww.75pk\t250986\n快软路由\t250987\n锻炼\t250988\n第239集\t250989\n2代\t250990\n愤然\t250991\n火灾\t250992\n蹲厕\t250993\n青安\t250994\n品宣\t250995\n魔\t250996\n打造\t250997\n东方非\t250998\n机投镇\t250999\n装置性\t251000\nV2版\t251001\n加卷\t251002\ncameo\t251003\n脑场\t251004\n举证责任\t251005\nDecomposition\t251006\n功能键\t251007\n惊华\t251008\n李天泽\t251009\n保镖\t251010\ncylinder\t251011\n20m
\t251012\n影线\t251013\n商议\t251014\n哑口无言\t251015\n罐体\t251016\n海岛旅游网\t251017\n40163\t251018\n中菏\t251019\n215\t251020\n哈拉雷\t251021\n梦寻\t251022\n网桥\t251023\n帕米拉\t251024\n少阴\t251025\n成绩单\t251026\nunset\t251027\n王铭\t251028\n花腿\t251029\n本末倒置\t251030\nScholars\t251031\n360搜索\t251032\n无锡市图书馆\t251033\n炸膛\t251034\nstartisback\t251035\n活珠子\t251036\n亮见\t251037\n中铝\t251038\n4.35%\t251039\n冷峻\t251040\n下午\t251041\n好啦\t251042\n长坡镇\t251043\n总纲领\t251044\n1349\t251045\n38式\t251046\n000円\t251047\nW510\t251048\n大众cc吧\t251049\n电子科技大学》\t251050\n鞍型\t251051\n咐\t251052\n第302章\t251053\nArrows\t251054\n旋风管家\t251055\n15cm\t251056\n圆盘豆\t251057\n碳酸铜\t251058\n陕西省体育局\t251059\n赵志祥\t251060\n李金斗\t251061\n浏览迷\t251062\n甘精胰岛素\t251063\nsecurity3\t251064\n成果\t251065\n米村\t251066\n赵嘉敏\t251067\n黛芙薇尔\t251068\nMilwaukee\t251069\n铠甲勇士铠传\t251070\n苏州高新区管委会\t251071\n蓝盈莹\t251072\nyuicompressor\t251073\n西藏旅游攻略网\t251074\n三大点\t251075\nPoC\t251076\n邵夷贝\t251077\n握把\t251078\ninitializers\t251079\n零点乐队\t251080\n扎头\t251081\n粉猪\t251082\n押注\t251083\n摸逼\t251084\n一蹴而就\t251085\n铁路公安局\t251086\n奥迪Q5\t251087\n雁儿\t251088\n射术\t251089\n沙塔尔沙吾提\t251090\n木芯\t251091\nScheduling\t251092\n何雁诗\t251093\nlvalue\t251094\n空白期\t251095\n宝武钢铁\t251096\n亚沙美\t251097\n美政府\t251098\nEcho/\t251099\n诺贝\t251100\n奉城镇\t251101\n乙丑\t251102\n乱片\t251103\n响水吧\t251104\n热解炉\t251105\n樱花大战3\t251106\n北京市交通委员会路政局\t251107\nplusready\t251108\nALO\t251109\n枸橼酸他莫昔芬片\t251110\n1878年\t251111\n弱碱\t251112\n山东省人民防空办公室\t251113\n步数\t251114\n诗迷\t251115\n判决书\t251116\n单频\t251117\n灵幻\t251118\n道尔智控\t251119\n有伴儿歌网\t251120\n刘建宏\t251121\n表面改性\t251122\n日常生活\t251123\n战斗类\t251124\nThompson\t251125\n5200\t251126\n米尔纳\t251127\nporndu\t251128\n许配\t251129\nROYAL\t251130\n红鸟\t251131\n铁富镇\t251132\n2018年6月\t251133\n中国物品编码中心\t251134\n高血压\t251135\n巧克力色\t251136\n杨二\t251137\n潍坊\t251138\n控\t251139\n运动摄像机\t251140\n胭脂鱼\t251141\nfdc\t251142\nworkday\t251143\n喷蜡\t251144\n_恋\t251145\n格蕾特\t251146\nhoon\t251147\n兴隆路\t251148\n哈尔滨啤酒\t251149\n金桂园\t251150\nSynth\t251151\n黑粉\t251152\n剪刀石头布\t251153\n斯威\t251154\n多包\t251
155\n9年后\t251156\n平安E行\t251157\n牧草\t251158\n三江源自然保护区\t251159\n海疆纵横_海疆在线\t251160\nGT730\t251161\n信主\t251162\n公墓\t251163\n胄\t251164\n肝脓肿\t251165\n鱼食\t251166\n花针\t251167\n牛肉棒\t251168\n张东明\t251169\n觊觎\t251170\n肖华\t251171\n德威\t251172\nmodeland\t251173\n跟上\t251174\n插值器\t251175\n普华永道中天会计师事务所\t251176\n学者们\t251177\nBD-mp4\t251178\n甄栗子\t251179\n广众网\t251180\n垃圾转运站\t251181\n掌中\t251182\n冷冷冷\t251183\n1913\t251184\n三四级\t251185\n装饰网\t251186\n夏尔巴人\t251187\n万2.5\t251188\n红泥\t251189\n2.5个\t251190\n蒋易\t251191\nオトメドリ\t251192\nimovie\t251193\n2860\t251194\n降臨\t251195\n方志友\t251196\ncaj格式\t251197\n均线多头排列\t251198\n曲霉\t251199\n十二战纪\t251200\nbusted\t251201\nNm\t251202\n预付\t251203\n公务员\t251204\n武汉苹果\t251205\nv2013\t251206\nSPWM\t251207\n闪现\t251208\nyw\t251209\n锻造炉\t251210\n联单\t251211\n亦凡\t251212\n坐堂\t251213\nPottery\t251214\n壬辰年\t251215\nBrokenIce\t251216\n泰斗\t251217\n徐鹤宁\t251218\nlzma\t251219\n认识小数\t251220\n黑番茄\t251221\n中央党史和文献研究院\t251222\n安缦酒店\t251223\n赵圆瑗\t251224\n马德华\t251225\n2升\t251226\nLeah\t251227\n甘菊\t251228\n横按\t251229\n落落大方\t251230\n小楼又东风\t251231\n谷歌公司\t251232\naol\t251233\n中国动物疫病预防控制中心\t251234\nVR之家\t251235\nsmart卡\t251236\ndocer\t251237\n0321\t251238\n联想y450\t251239\n角子机\t251240\n注释本\t251241\n豪客来牛排\t251242\n中潜股份\t251243\n表展\t251244\n任人唯亲\t251245\n栈上\t251246\nandra\t251247\nWarn\t251248\n一号通\t251249\n天津市河北区\t251250\n營\t251251\n小米max\t251252\n泰拉石\t251253\n移动物联\t251254\n某一个\t251255\n爱媛\t251256\n正恩\t251257\n分级诊疗制度\t251258\n澔\t251259\n黄绿\t251260\n加身\t251261\n入地\t251262\n浩东\t251263\n守身如玉\t251264\n乐山日报数字报\t251265\n悉尼大学\t251266\nGulp\t251267\n张勤\t251268\n声表\t251269\n费耶诺德\t251270\n2017cc\t251271\n93路\t251272\n驿马\t251273\n新人们\t251274\n语调\t251275\n囟门\t251276\nzyz913614263\t251277\nivt\t251278\n6000次\t251279\n梦游天姥吟留别\t251280\nFounder\t251281\n360杀毒网\t251282\n37万\t251283\n豪擎\t251284\nalexnet\t251285\nsqlsrv\t251286\n上海火车站\t251287\n新闻节目\t251288\n店口\t251289\n而\t251290\n会声会影X7\t251291\n杂鲷\t251292\n婚帖\t251293\n金东方\t251294\n第几周年\t251295\nwin8_装机吧\t251296\n过渡性\t251297\nGMGC\t251298\ntac\t251299\n草溜\t
251300\n区人民医院\t251301\n正力\t251302\n石鼓文\t251303\n瓷器\t251304\n新局面\t251305\n毛泽东故居\t251306\n兴庆区\t251307\n1.0.7\t251308\n成都地铁18号线\t251309\n变形模量\t251310\n含钙\t251311\n坑坑\t251312\nSpringMVC\t251313\n乐城\t251314\n支系\t251315\n尸套\t251316\n涂黑\t251317\n大足\t251318\n【神界原罪2\t251319\n星球大战前线2\t251320\n手打\t251321\n郭靖宇\t251322\nsegmentation\t251323\n20180102\t251324\n米氏\t251325\n德治\t251326\n费用率\t251327\n扎克伯格\t251328\ngbcbig\t251329\n20161122\t251330\n传祺GS4论坛_汽车之家论坛\t251331\n舜网新闻中心\t251332\nloxia\t251333\n86集\t251334\n保利广场\t251335\n慈铭体检集团_体检中心\t251336\nwifi破解器\t251337\n李子峰\t251338\n快件\t251339\n中国医学科学院皮肤病研究所\t251340\n对阵\t251341\n香河\t251342\nAV网\t251343\n佛诞\t251344\n室性期\t251345\n数码影\t251346\n淘宝\t251347\n日本京都大学\t251348\n男朋\t251349\n童国祥\t251350\n集成\t251351\n学而思云\t251352\n类风湿病\t251353\n舆论\t251354\nheydouga\t251355\n铸造件\t251356\n暑气\t251357\n兰州市公安局\t251358\nEDraw\t251359\n黛色\t251360\n德约\t251361\n朵儿\t251362\nboll\t251363\n推广语\t251364\n打不着\t251365\n刘志明\t251366\n华南植物园\t251367\n根茎类\t251368\n骚浪\t251369\n吉吉影音先锋\t251370\n250米\t251371\ncygwin64\t251372\nc++编程\t251373\n夏淳\t251374\n主父\t251375\nLatino\t251376\n烙铁\t251377\n仍旧\t251378\ni贷\t251379\n影相\t251380\n大车\t251381\n硬点\t251382\n万龙滑雪场\t251383\n江夏大道\t251384\n线孔\t251385\n卡球\t251386\n哈尔滨商业大学\t251387\n宁波理工学院\t251388\n上野公园\t251389\n山东招商网\t251390\nMHW\t251391\n第1版\t251392\n邓中夏\t251393\nNavy\t251394\nEMMC\t251395\n庄景臣\t251396\n应否\t251397\n斗牛曲\t251398\nno-cache\t251399\n转录组\t251400\nF6\t251401\nMPV\t251402\n01_土豆视频\t251403\n鹦鹉大道\t251404\n叫爱\t251405\n逗引\t251406\nSOX\t251407\nVsinger\t251408\n启动子\t251409\nrdma\t251410\n新桥村\t251411\n跃出\t251412\n葬仪\t251413\n社会主义初级阶段\t251414\n亩均\t251415\n公知\t251416\nColette\t251417\npytesser\t251418\n均值聚类算法\t251419\n唐招提寺\t251420\n庆王府\t251421\nMES\t251422\niFly\t251423\nnorthwest\t251424\naccel\t251425\n限免\t251426\n张韶董卿\t251427\n卷2\t251428\n引人注意\t251429\naxel\t251430\n内翻\t251431\nJose\t251432\nFlyme6\t251433\n卜\t251434\n杨星\t251435\niDisplay\t251436\n启齿\t251437\n飞虎\t251438\n虫友们\t251439\n纪连海\t251440\n选号\t251441\n武汉银行\t251442\n冥河\t25144
3\nsprinboot\t251444\nHyperLedger\t251445\nPatricia\t251446\n春晓\t251447\n黑锋传奇\t251448\n后挂式\t251449\n蚜虫\t251450\n李小豹\t251451\n金属件\t251452\n3月3\t251453\n多传感器\t251454\n张掖市\t251455\n几万条\t251456\n6163\t251457\n大毒枭\t251458\n油泵\t251459\n7820X\t251460\n建设性\t251461\n贤达\t251462\n700家\t251463\n海警学院\t251464\n财新网\t251465\n远方光电\t251466\n古龙群侠传\t251467\n肉山\t251468\n雪景\t251469\nAPIs\t251470\n涂山雅雅\t251471\n共阳极数码管\t251472\n约翰尼\t251473\n程阳\t251474\n矮寨大桥\t251475\ntyper\t251476\n加斯帕·诺埃\t251477\n酱门\t251478\n年内\t251479\n牢头\t251480\n小管\t251481\n8333\t251482\n小唯\t251483\n做作\t251484\nLONGCHAMP\t251485\n全科\t251486\n董雷\t251487\nernest\t251488\n天隆寺\t251489\n金龙国\t251490\n降噪豆\t251491\n虚拟仪器\t251492\n庇护山庄\t251493\n调离\t251494\n凑数\t251495\n辛伐他汀片\t251496\nhkl\t251497\n张志刚\t251498\n快易典\t251499\n第15章\t251500\n中国汽车报\t251501\nCMEF\t251502\nV2.4.0\t251503\n一匹\t251504\n得值\t251505\n灼烫\t251506\n上海联影医疗科技有限公司\t251507\n如鲠在喉\t251508\n落实\t251509\nWindows7系统\t251510\n西南门\t251511\nsent\t251512\n上原爱\t251513\ntude\t251514\nmrd\t251515\n新倩女幽魂2\t251516\n优惠票\t251517\nintegrative\t251518\nGitlab\t251519\n成本法\t251520\n酵素梅\t251521\n中气\t251522\n惹恼\t251523\n驳论\t251524\nGroups\t251525\n跌打\t251526\n民非\t251527\n慢半拍\t251528\n有间\t251529\n小酒\t251530\n石家庄人才网\t251531\n颜术\t251532\nwig\t251533\n调研团\t251534\n法法网\t251535\n电子纸\t251536\nTITAN\t251537\n股票市盈率\t251538\n滑动验证码\t251539\n动心\t251540\nMSDN\t251541\n林宛白\t251542\n富易\t251543\n同仁堂\t251544\n闭目\t251545\n潘迎紫\t251546\n兴合\t251547\n粘液块\t251548\n农心\t251549\n卢龙县\t251550\n特立\t251551\n逸彩\t251552\npcanywhere\t251553\netl\t251554\n灵器\t251555\nF1票务网\t251556\n噜噜色\t251557\n绮\t251558\n士郎\t251559\n小龟王\t251560\n莫问\t251561\n左右为难\t251562\npread\t251563\n离人愁\t251564\n吴潜\t251565\n美剧社\t251566\n稳利\t251567\n绿化队\t251568\n会计处\t251569\n欢迎词\t251570\nElec\t251571\npostage\t251572\n天慧\t251573\n沙巴\t251574\n南宁市教育局\t251575\n重庆市璧山区人民政府\t251576\n鲽鱼\t251577\nfilza\t251578\n咪唑\t251579\n国际贸易理论与实务\t251580\n他趣\t251581\n潇湘书院独家\t251582\n抖音花姐\t251583\n拼写\t251584\nEven\t251585\n上海华美\t251586\n拎着\t251587\n归真\t251588\n近亲相奸\t251589\nN
BA2Kol\t251590\nTeen\t251591\n默写题\t251592\nfe55\t251593\n482002\t251594\n当道\t251595\n雷特\t251596\n手围\t251597\nMSVCP140.dll\t251598\n工长君\t251599\n于虹\t251600\nRemarkable\t251601\n清诗\t251602\n金鱼\t251603\n球赛\t251604\n陈桉\t251605\n招募说明书\t251606\n林娜\t251607\n互助土族自治县\t251608\nIconWorkshop\t251609\nAEO认证\t251610\n康熙来了吧\t251611\nresx\t251612\n极轴\t251613\n截关\t251614\n神光\t251615\n淘六六淘宝\t251616\njug\t251617\nBaker\t251618\n琴泓\t251619\n蓝罐曲奇\t251620\n宋飞\t251621\nelement\t251622\n倭瓜\t251623\nbiomedical\t251624\nFoxAE\t251625\n龙凤褂\t251626\n河北农业大学\t251627\n李光辉\t251628\n小诸葛\t251629\n献辞\t251630\n外径\t251631\n礼箱\t251632\n太田痣\t251633\n不爱吃\t251634\n7月份\t251635\n摩擦力\t251636\n第15集\t251637\n4月28日\t251638\n真吾\t251639\n渐成\t251640\nHOI4\t251641\n猫池\t251642\n盛诺\t251643\ntemplets\t251644\n神洲\t251645\n欧阳智薇\t251646\n长盈\t251647\n马绍尔\t251648\nEDS\t251649\n非洲加纳\t251650\n圆圆母婴网\t251651\n鼎捷\t251652\n雅式橡塑网\t251653\n罩子\t251654\n调查研究\t251655\n电子云\t251656\n海上花列传\t251657\n方瓶\t251658\n20150311\t251659\n中山北路\t251660\nRoster\t251661\n美娱\t251662\n2245\t251663\n白花蛇舌草\t251664\n绿景控股\t251665\nNewspaper\t251666\n綦江\t251667\nexcle2010\t251668\nkdj指标\t251669\n彩客网\t251670\n环境变量PATH\t251671\n20160918\t251672\n侨福\t251673\n保持期\t251674\n硬板床\t251675\n铝锭\t251676\n陈允斌\t251677\nrpd\t251678\n刘亮\t251679\n同股同权\t251680\n向阳镇\t251681\nYTD\t251682\n黑暗圣经全集\t251683\nlfd\t251684\n中国陶瓷网\t251685\n哈尼梯田\t251686\n双柏县\t251687\n老书\t251688\n捡取\t251689\n鄂城区\t251690\n44层\t251691\ng\t251692\n东皋\t251693\n修磨\t251694\n桃李园\t251695\ncatherine\t251696\n邹斌\t251697\n汉寿\t251698\n一江\t251699\n搭接\t251700\nequip\t251701\n后园\t251702\nminiS\t251703\n运法\t251704\n安全生\t251705\n平凉路\t251706\n雷士照明\t251707\n路虎揽胜运动版\t251708\n压片\t251709\nCELLS\t251710\n吹奏\t251711\n宏村镇\t251712\n20141101\t251713\n男女平等\t251714\n帕米尔高原\t251715\nArkham\t251716\n雅加\t251717\n官格\t251718\n自制药\t251719\n那些年我们一起追过的女孩\t251720\nscar\t251721\n齿音\t251722\n恶灵附身\t251723\n胆巴碑\t251724\n休止\t251725\n沙比岛\t251726\n失常\t251727\nBarn\t251728\n硬货\t251729\n本刊\t251730\nkbc\t251731\n何妨\t251732\n信阳毛尖茶\t251733\n魏巡\t251734\n鄂
托克前旗之窗\t251735\nipg\t251736\n非洲花梨\t251737\n587\t251738\nDAT\t251739\n如果爱\t251740\n╥\t251741\n解直锟\t251742\n貉子\t251743\n睫毛夹\t251744\n催命\t251745\n颛臾\t251746\ndyn\t251747\n铭刻\t251748\n蛇口客运码头\t251749\nipai\t251750\n侦探物语\t251751\n佳音\t251752\n不言\t251753\n寄修\t251754\n十八年\t251755\nphoto+\t251756\n加拿大约克大学\t251757\n三十二式太极拳\t251758\n欧泊\t251759\nfasta\t251760\n丛台酒\t251761\n王四\t251762\n四大发明\t251763\nmacmini\t251764\n孙平\t251765\nLOAD\t251766\n狮子坪\t251767\n王和\t251768\n940MX\t251769\n沪深300指数基金\t251770\n号码机\t251771\njpress\t251772\n滴水湖馨苑\t251773\n随州\t251774\n光大兴陇信托\t251775\n方庄\t251776\n易炼红\t251777\n立体声\t251778\n金圆券\t251779\n另行通知\t251780\n申根国\t251781\nwcc\t251782\nGPG\t251783\ngou\t251784\n坎墩\t251785\n颚式破碎机\t251786\nlxvs\t251787\n安徽省工商局\t251788\n优速快递单号\t251789\n162cm\t251790\n非官\t251791\ncxx\t251792\n西丽湖\t251793\nD级车\t251794\n千红制药\t251795\n飘香剑雨\t251796\n奇葩证明\t251797\n王建峰\t251798\n视网膜屏\t251799\n十滴水\t251800\n82版\t251801\n战场女武神3\t251802\n简/繁\t251803\n北京人民广播电台\t251804\n极限\t251805\n0515\t251806\nwww.mm131.com\t251807\n染色体检查\t251808\nfn+\t251809\n吉利4s店\t251810\n感受器\t251811\n王盈\t251812\n1951年\t251813\n古玺\t251814\n5两\t251815\n宝妻\t251816\n保定站\t251817\n十九大报告诞生记\t251818\n离经叛道\t251819\n黄春\t251820\n隐婚\t251821\n中国航天科工集团第二研究院\t251822\n一英里\t251823\n翘翘板\t251824\n考验期\t251825\n翰林\t251826\n19套\t251827\n所罗门群岛\t251828\n8话\t251829\nBukkit\t251830\n说下\t251831\n金满\t251832\n玉缘\t251833\n贵州省委\t251834\nWIN\t251835\n宜安科技\t251836\nneigh\t251837\n消化酶\t251838\n档次\t251839\n大连北站\t251840\n九环\t251841\n吴劲松\t251842\n全自动包装机\t251843\n大灯\t251844\n6月底\t251845\n柴胡加龙骨牡蛎汤\t251846\n9080\t251847\n民生银行信用卡中心\t251848\nmonica\t251849\n广电\t251850\njsql\t251851\n下来\t251852\n日本本州\t251853\n周多\t251854\n十九大知识竞赛题\t251855\n家用脱毛仪\t251856\n村居\t251857\n金棕榈奖\t251858\nCORS\t251859\n精业\t251860\n失音\t251861\nicloud.com\t251862\n螺旋纹\t251863\n鞋品\t251864\n拉提\t251865\n虫儿飞\t251866\n通气\t251867\n青千\t251868\n系统城\t251869\n福州高级中学\t251870\n实心板\t251871\n杭城\t251872\n胆寒\t251873\n干行\t251874\n港乐\t251875\n华为公司\t251876\n暗影格斗3吧\t251877\n成都高新投资集团有限公司\t251878\nccs\t251879
\n贯\t251880\n建警\t251881\nhuli\t251882\n价值观念\t251883\n蜂胶\t251884\nGoodmidi\t251885\n盛弘股份\t251886\n1毫\t251887\n感思\t251888\n35倍\t251889\n科研管理系统\t251890\nfeifei\t251891\n洛必达法则\t251892\n製作\t251893\n第十五章\t251894\n家电维修\t251895\n一钱\t251896\n热膨胀系数\t251897\nWORLD\t251898\n碧桂园假日半岛\t251899\n美好的\t251900\n苏列\t251901\n康拓\t251902\nWaters\t251903\n爱爱&#160\t251904\nhealer\t251905\n安徽工程大学\t251906\nt01\t251907\n3dmaxs\t251908\n营业税金及附加_\t251909\n明证\t251910\n共青团团旗\t251911\n马夸特\t251912\n2000分\t251913\n乳木果\t251914\n部片\t251915\n化整\t251916\nSelector\t251917\n说变就变\t251918\n索伦托\t251919\n芙蓉古城\t251920\n劲乐团单机版\t251921\n李元霸\t251922\n二手房房\t251923\n后花园\t251924\n资阳\t251925\n四大\t251926\n万胜\t251927\n长春一汽\t251928\n代位权\t251929\n美乃滋\t251930\n眼晴\t251931\n拉齐奥\t251932\nman2018\t251933\n三明\t251934\n林生祥\t251935\n大转移\t251936\nsot23\t251937\n荆州经济技术开发区\t251938\n1731\t251939\n太平天国\t251940\nQSFP\t251941\n中央广播电视中等专业学校\t251942\n彩妆盒\t251943\n8000e\t251944\n书本\t251945\n需求分析师\t251946\n附超\t251947\n娭毑\t251948\n三只眼\t251949\n行者\t251950\nFC模拟器\t251951\n族舞\t251952\nrealism\t251953\n反侧\t251954\nProcess类\t251955\n乌迪内斯\t251956\n万位\t251957\n杭州先临三维科技股份有限公司\t251958\nsurcharge\t251959\n中纪\t251960\n机型\t251961\n永昌街道\t251962\n棒球帽\t251963\nsyslog-ng\t251964\n铅门\t251965\nkuaidial\t251966\n艳女\t251967\nHammer\t251968\nMT\t251969\n梵谷\t251970\n爱过\t251971\n众信旅游悠哉网\t251972\nErnest\t251973\n里程碑\t251974\ntelecomadmin\t251975\n硬脂酸\t251976\n1632\t251977\n检测站\t251978\n秦皇岛市第一医院\t251979\nhema\t251980\nipmi\t251981\n行家里手\t251982\n飞轮海\t251983\n恋爱过\t251984\n挥剑\t251985\nimmutable\t251986\n鳕鱼肝油\t251987\n鞑靼\t251988\n烟台站\t251989\n义庄\t251990\n宁财神\t251991\n金蝶EAS\t251992\n谷氨酰转肽酶\t251993\n林秋楠\t251994\nFO\t251995\n5.8%\t251996\nfaq\t251997\n麦峰强\t251998\n并举\t251999\n翼手\t252000\n黄囊\t252001\n安阳市第二人民医院\t252002\n上海市工商局\t252003\nsaints\t252004\n工位\t252005\n百万任\t252006\n有必\t252007\n海明威\t252008\nzaha\t252009\nwxl\t252010\n立铣\t252011\n10欧姆\t252012\nOpenWrt下载网\t252013\n杨成\t252014\n亲家母\t252015\n相位图\t252016\n统分\t252017\nUnsafe\t252018\n鸿基新城\t252019\n方季惟\t252020\n鬼谷君\t252021
\n2015年10月24日\t252022\n带有\t252023\n美图M4\t252024\n免取\t252025\n苏家坨镇\t252026\n金陵\t252027\n25.dll\t252028\ncribe\t252029\n课时\t252030\n福建船政交通职业学院\t252031\nCocos\t252032\n环境法\t252033\n迪马希\t252034\n何静文\t252035\n2名\t252036\n大通县\t252037\n教路\t252038\n制安\t252039\nreport\t252040\n3D片源网\t252041\n北京师大附中\t252042\n全场\t252043\n网路岗\t252044\n辽宁篮球队\t252045\n上海微系统与信息技术研究所\t252046\n悬案\t252047\n威言\t252048\nnamp\t252049\n新闻_赛迪网\t252050\n北京电气\t252051\n训词\t252052\nliebe\t252053\nawarded\t252054\n科班出身\t252055\nAida\t252056\n避障\t252057\n平山湖大峡谷\t252058\n渺\t252059\n880G\t252060\n哈农钢琴\t252061\n无尾熊\t252062\n背光\t252063\n6200U\t252064\n达维\t252065\n丝足会所\t252066\n黑悟空\t252067\n雪人\t252068\nFastreport\t252069\n二道桥\t252070\nnell\t252071\n大隐刀\t252072\n华发商都\t252073\n54米\t252074\n凯路仕\t252075\n东莞庄\t252076\n新西兰南岛\t252077\ncooperative\t252078\n行政管理类\t252079\n大文豪\t252080\n⒌\t252081\n张公案\t252082\n广州市卫生局\t252083\nZDM\t252084\n情韵\t252085\n英雄无敌7\t252086\n买出\t252087\n真的爱你\t252088\nEstimation\t252089\n呈贡区\t252090\n全名录\t252091\n井架\t252092\n大坡\t252093\n7466\t252094\n中证登\t252095\n0.22.0\t252096\nwin7系统\t252097\n安委会\t252098\nbackports\t252099\nMat\t252100\n惶惶\t252101\n石家庄市委\t252102\n商用制冰机\t252103\n15毫米\t252104\n贺拉斯\t252105\nimplosion\t252106\n晴雨表\t252107\n卡利玛神庙\t252108\nS7/edge\t252109\nAesop\t252110\n七叶\t252111\n咸阳市\t252112\n读客\t252113\n红岩杰狮\t252114\n河北电台\t252115\nfriendship\t252116\nLicking\t252117\n虫子们\t252118\n鹳\t252119\n闪频\t252120\n混频\t252121\n威尔\t252122\n杨凌区\t252123\n500平方米\t252124\n奈德\t252125\nSHORT\t252126\n昆明市政府\t252127\nHazards\t252128\n沈阳市财政局\t252129\nC200\t252130\n仿表\t252131\n采桑子\t252132\n伊秉绶\t252133\n信喵\t252134\n巢湖市\t252135\n理事\t252136\n莆田一中\t252137\n周敦颐\t252138\n啄\t252139\n绊脚\t252140\n住居\t252141\n江鱼\t252142\n万强\t252143\n开城\t252144\n消费价格指数\t252145\nFcitx\t252146\n净利\t252147\ndzqabc\t252148\n冯建军\t252149\n养鱼\t252150\n慈云寺\t252151\n5454\t252152\n中国国家交响乐团\t252153\n深圳中山泌尿外科医院\t252154\n2017.12.29\t252155\n21080P\t252156\n帮扶暖\t252157\n补偿柜\t252158\n地库\t252159\n第54章\t252160\n韩国旅游发展局\t252161\n勤荒\t252162\nICE\t252163\n安州
\t252164\n镜像\t252165\n第101号\t252166\nrole\t252167\n7th\t252168\n开滦\t252169\nLuManager\t252170\n古汉语字典_词典网\t252171\n2.5亿元\t252172\n独户\t252173\n第49页\t252174\n陈玉成\t252175\n国印\t252176\nbin-log\t252177\n堡礁\t252178\nu16\t252179\n火星研究院\t252180\n吃紧\t252181\n晋城镇\t252182\n9.5米\t252183\n黑莓\t252184\n门朝\t252185\n安息日\t252186\n圣淘沙\t252187\n战国史\t252188\n盖尔·加朵\t252189\n柳\t252190\n抛投\t252191\n机载激光雷达\t252192\n气象家园_气象人\t252193\nGRUB\t252194\nhistory\t252195\nFLUENT\t252196\n13级\t252197\n古代汉语\t252198\nbbt\t252199\n37毫米\t252200\n优质套\t252201\n松嶋\t252202\n选公\t252203\n余子\t252204\n督考\t252205\n初會篇\t252206\n虚开增值税专用发票\t252207\n非专业人士\t252208\n寝宫\t252209\n被辞职\t252210\n一套\t252211\n冯洁\t252212\n钳位\t252213\n肉感\t252214\n床式\t252215\nBTU\t252216\n生本\t252217\nInternal\t252218\n十九大\t252219\n张天一\t252220\n截短\t252221\n唐成\t252222\n玲珑加速器\t252223\n翻腕\t252224\nguerlain\t252225\n中华人民共和国行政强制法\t252226\n招商银行银行\t252227\n液碱\t252228\n淘宝退货运费险\t252229\n茚三酮\t252230\n浙江省军区\t252231\nyoutell\t252232\n西安建筑科技大学\t252233\nklaus\t252234\n武山\t252235\nmidas\t252236\n雷锋精神\t252237\n乳鼠\t252238\n南沙区\t252239\n广东省委党校\t252240\nxrd\t252241\n茶亭\t252242\n罗广斌\t252243\n湿式报警阀\t252244\n根瘤菌\t252245\n沈石溪\t252246\n戛纳湾\t252247\n冯保\t252248\n牧原\t252249\n钻井\t252250\n水象星座\t252251\n扭矩_\t252252\n斗罗大陆1\t252253\n虎门中心客运站\t252254\nSAI\t252255\nAR眼镜\t252256\n醉打金枝\t252257\n群版\t252258\n给予者\t252259\n新劳动合同法\t252260\n黑白迷宫\t252261\n1498\t252262\n无憾\t252263\n绿蜂胶\t252264\n3275\t252265\n跨站\t252266\n松拓斯巴达\t252267\n顶立\t252268\n天保\t252269\n反客为主\t252270\n咋样\t252271\n先锋者\t252272\n赤脚大仙\t252273\n2千兆\t252274\n从生\t252275\n恒达\t252276\n17万元\t252277\n1534\t252278\n响应性\t252279\n欢go\t252280\n三唑仑片\t252281\njiaoyu\t252282\n临场\t252283\n主跨\t252284\niOS8.0\t252285\n次限\t252286\n热咖啡\t252287\n莫泰酒店\t252288\n沈一石\t252289\n核化\t252290\n47章\t252291\n二十万元\t252292\ne5500\t252293\n迷你世界助手\t252294\n陆浩\t252295\n上川岛\t252296\n樱花酒\t252297\n81集\t252298\n商检局\t252299\n炮架\t252300\n幼犬\t252301\n去春来\t252302\nggplot2\t252303\n36氪\t252304\nassoc\t252305\n波音\t252306\nm7205\t252307\n数字政通\t252308\n甘露丸\t252309\n胃药\t252
310\n美考\t252311\n机关事务局\t252312\n周群\t252313\n应急灯\t252314\n板报网\t252315\n95级\t252316\n整定\t252317\n黔城\t252318\nViolent\t252319\n胡炜\t252320\nSuccubus\t252321\n恋恋不舍\t252322\n普莉希拉\t252323\n石台先锋网\t252324\n船舶与海洋工程专业\t252325\nprincipal\t252326\n螺杆泵\t252327\n孟洛川\t252328\n8020\t252329\n一水\t252330\n昌北机场\t252331\n铜芯\t252332\n副城\t252333\nTransistor\t252334\n杭州大厦\t252335\n悲怆奏鸣曲\t252336\n孤舟\t252337\n红河哈尼族彝族自治州人民政府\t252338\n镇江新区\t252339\n量变\t252340\n獐子岛\t252341\nsift\t252342\nelfutils\t252343\n华东大区\t252344\n米诺环素\t252345\n脾胃虚寒\t252346\n智齿冠周炎\t252347\n奥美拉唑镁肠溶片\t252348\n酒酿蛋\t252349\n禁烟\t252350\neFinancialCareers\t252351\n抽丝剥茧\t252352\nV3\t252353\n纪念帖\t252354\n88米\t252355\n学段\t252356\n河洛文苑\t252357\nmenustrip\t252358\nv2.5.2\t252359\n耐老化\t252360\n2.9\t252361\n布料机\t252362\n天启科技\t252363\n超时费\t252364\n北京潭柘寺\t252365\ndivide\t252366\n姬丽哈泽尔\t252367\n自治区纪委\t252368\n天涯赤子心\t252369\n离散\t252370\n7893994\t252371\n休养\t252372\n好心塞\t252373\n玛泽\t252374\n内详\t252375\n看文\t252376\n硝酸咪康唑\t252377\n规定\t252378\nKiat\t252379\n不射\t252380\n东福山\t252381\n氧自由基\t252382\n狮子洋\t252383\n民乐县\t252384\n涡阳新闻网\t252385\n看跌\t252386\nStringBuffer类\t252387\n中国机构编制网\t252388\ncdh5\t252389\n贴囧网\t252390\n注册码生成器\t252391\nXbox\t252392\n知识产权贯标认证\t252393\nJayYin\t252394\n三亚天涯区\t252395\n体性\t252396\n英雄传说:闪之轨迹2\t252397\n北京师范大学研究生院\t252398\n19首\t252399\nvcredist_x64.exe\t252400\n南都网\t252401\nwidget\t252402\n词块\t252403\n新绛县\t252404\n建筑构件装修|一起网\t252405\n改革派\t252406\n枝裕和\t252407\nIdaho\t252408\n非盈利性\t252409\n糖质\t252410\n衰鬼\t252411\n2009-2012年\t252412\n氟苯基\t252413\n5系\t252414\n3月18号\t252415\n海吉布\t252416\n草原情\t252417\n唐奇安\t252418\n人肉搜索\t252419\n衡阳市八中\t252420\n伏魔战记\t252421\nbackbeat\t252422\n中极\t252423\n何超盈\t252424\n游戏集\t252425\n小米机顶盒\t252426\n智电网\t252427\n乳脂\t252428\n5月20\t252429\nn6\t252430\n青豌豆\t252431\n粉条\t252432\n人文素养\t252433\n世茂房地产\t252434\n神行\t252435\n铝型材\t252436\n五一大道\t252437\n公汽\t252438\n柳如是\t252439\n神转折\t252440\n回去\t252441\n十里江湾\t252442\nnga\t252443\n阿里腾讯\t252444\n远者\t252445\nipad5\t252446\n8色\t252447\nPotplayer\t252448\n红宝\t252449\n声明\t25
2450\nOffGamers\t252451\n发烧碟\t252452\nbecause\t252453\nDB11/T\t252454\n黑腹\t252455\n名妆\t252456\nmlogit\t252457\n永平镇\t252458\n危废\t252459\n林洁\t252460\n中山大学孙逸仙纪念医院\t252461\n传习录\t252462\nsy\t252463\n萧了个晓\t252464\n金兰\t252465\n布防\t252466\n扑鼻\t252467\navenger\t252468\n赵朴\t252469\n电子枪\t252470\n平安e行\t252471\n读者群\t252472\n込\t252473\n辨识度\t252474\n美景之屋2\t252475\n党政\t252476\n省住房和城乡建设厅\t252477\n供弹\t252478\nSurvive\t252479\n硅管\t252480\n球操\t252481\n重典\t252482\n如有神\t252483\n重庆警察学院\t252484\n四五百\t252485\n炖汤\t252486\n来游\t252487\nICP许可证\t252488\n铭文\t252489\nEarnings\t252490\n东海闲湖城\t252491\n怪盗\t252492\n一朝\t252493\n急诊科\t252494\nnouveau\t252495\n柯恩\t252496\nBarcode\t252497\n搏彩\t252498\n机人\t252499\n石昊\t252500\n氢水\t252501\n肺功能检查\t252502\n横行天下\t252503\n紫御华府\t252504\n仪众国际\t252505\n解放军第309医院\t252506\nhofer\t252507\n青岛婚纱摄影\t252508\n土车\t252509\n小杯\t252510\n杨家溪\t252511\nHistory\t252512\n58G\t252513\ncompletion\t252514\n体力值\t252515\n苏州中心\t252516\n3576\t252517\n甘肃仁爱妇产医院\t252518\n路行\t252519\n鞋凳\t252520\n星系\t252521\n活力型\t252522\n哈尔滨百姓网\t252523\n藏有\t252524\nwww.4399.cn\t252525\n正交编码器\t252526\n三四个\t252527\n绝地枪神k1\t252528\n改性工程塑料\t252529\n灵车\t252530\n王雪\t252531\n中国旺旺\t252532\n恭喜恭喜\t252533\n爱沢花梨\t252534\n20170525\t252535\n私房片\t252536\n笔趣\t252537\n折弯件\t252538\n上榴人士\t252539\n达芬奇手术机器人\t252540\n1-5月\t252541\n申克\t252542\nHACK\t252543\n安川\t252544\n驴行\t252545\n快走\t252546\nTurkey\t252547\n氯氮平片\t252548\nzhenglei\t252549\nnetscaler\t252550\n拉拉操\t252551\n京准通\t252552\n20160829\t252553\n会商\t252554\n06:00\t252555\n贤者\t252556\n佛灯\t252557\n千兆交换机\t252558\n107年\t252559\n色差计\t252560\n军工厂\t252561\n李俊基\t252562\n房扑\t252563\n克里斯·安德森\t252564\n职官\t252565\nShowman\t252566\n周乙\t252567\n味型\t252568\nmu-X牧游侠\t252569\n大唐雷音寺\t252570\n根河市人民政府\t252571\n二氧化\t252572\n投票网\t252573\npont\t252574\nlagou\t252575\n3月7号\t252576\n导航卡\t252577\n通古斯大爆炸\t252578\n九巴\t252579\n爱驰汽车\t252580\n衷肠\t252581\n56平米\t252582\n上弦\t252583\n江西省能源局\t252584\nbrushes\t252585\n信托受益权\t252586\n海昌路\t252587\n龟苗\t252588\n守护人\t252589\na方\t252590\n68852789\t252591\n中国标准出版社\t25259
2\n夜律\t252593\n八次\t252594\n王啸\t252595\n乐童\t252596\n广州)有限公司\t252597\n煤炭\t252598\ndrain\t252599\nDoris\t252600\nChina\t252601\n英雄连\t252602\n秘钥\t252603\n地台床\t252604\nvtp\t252605\n刀板\t252606\n淘宝大学\t252607\ngafei\t252608\ncet\t252609\n黄帝陵\t252610\n第十四个\t252611\n安吉物流\t252612\n河南宝泉\t252613\nShelby\t252614\n冯芷墨\t252615\n大理\t252616\n泰迪罗宾\t252617\n秋水\t252618\n西安大略大学\t252619\nflash播放器\t252620\n大吴哥\t252621\n灵基\t252622\n凄切\t252623\n狂想\t252624\nelong\t252625\n睡眠仪\t252626\ntiantian\t252627\n浦江华侨城\t252628\n滤筒\t252629\n诊法\t252630\n双井\t252631\n坐立\t252632\ntechnological\t252633\n背叛\t252634\n天涯论坛\t252635\n杯柄\t252636\n木系\t252637\n神雾\t252638\n周黎明\t252639\nipad2017\t252640\n山根\t252641\n翻盖\t252642\n0509\t252643\n午月\t252644\nLITTLE\t252645\n物探\t252646\nKiss\t252647\n磁力仪\t252648\n孤客\t252649\n薪期\t252650\n生君\t252651\n人与\t252652\nOracle11g\t252653\n花序\t252654\n提示窗\t252655\n驱动程序\t252656\n难波公园\t252657\n敢拼\t252658\n研华工控机\t252659\n办到\t252660\n量子点\t252661\nquit\t252662\n徐玉兰\t252663\n0559\t252664\nN54U\t252665\n刹把\t252666\n威努特\t252667\n玷\t252668\nginseng\t252669\n心理状态\t252670\ncutie\t252671\n莲花\t252672\n小晶\t252673\n标液\t252674\n无边无际\t252675\nHD1080P\t252676\n临澧县政府\t252677\nextjs4\t252678\n天沟\t252679\n聚光灯\t252680\nsalts\t252681\n怀来吧\t252682\n萌呆\t252683\n陈兵\t252684\n5-7月\t252685\n诺唯真\t252686\n叶尼塞河\t252687\n星座秀\t252688\n6.6.6\t252689\n懂礼貌\t252690\n满州\t252691\n忍者蝙蝠侠\t252692\n狂化\t252693\ndekevin\t252694\nlolS7\t252695\n姜力\t252696\n黑市\t252697\n仁化\t252698\nEnumeration\t252699\n横塘路\t252700\n面机\t252701\nTheos\t252702\n广证恒生\t252703\njealous\t252704\n40万辆\t252705\n南京市环保局\t252706\n声频\t252707\n后驱\t252708\n老姚\t252709\n空子\t252710\n女高音\t252711\n简然\t252712\n几世纪\t252713\n王晓东\t252714\n在等\t252715\nㄨ\t252716\n亚瑟王\t252717\n莫比\t252718\n白鹤\t252719\n中兵红箭\t252720\n红旗镇\t252721\n济南方特东方神画\t252722\n蛇灾\t252723\n诺尔曼\t252724\n4x100米接力\t252725\n鲁艺\t252726\n五六岁\t252727\n伊苏6\t252728\n大众文艺\t252729\n连城璧\t252730\nNRF51822\t252731\nV2.8\t252732\n地暖分集水器\t252733\n你我\t252734\n单薄\t252735\n东旭集团有限公司\t252736\n036\t252737\nspring-security\t252738\n
凭空\t252739\nbaja\t252740\nEXCE\t252741\n通线\t252742\n还原性\t252743\n293\t252744\n雪中送\t252745\n2格\t252746\n注册端口号\t252747\n拉帕替尼\t252748\nad9\t252749\n油渍\t252750\n皋月\t252751\ncompare4\t252752\n迷你大乱斗\t252753\n微生态\t252754\n25方\t252755\nxps15\t252756\n矶钓\t252757\n抵食\t252758\n欧亚\t252759\n丹姿\t252760\n魅\t252761\n关系词\t252762\n五件套\t252763\n人人秀H5\t252764\nishi\t252765\n垂钓园\t252766\n瑞虎3\t252767\n中兴通\t252768\n审计署\t252769\n编程吧\t252770\nkatsumi\t252771\nAppleid\t252772\n黄沙\t252773\n标底\t252774\nSTM32F103RCT6\t252775\n都市型\t252776\n完达山\t252777\n6205\t252778\n线刷\t252779\n哈佳铁路\t252780\n国际园林博览会\t252781\n竹舍\t252782\nDHCオンラインショップ|化粧品健康食品ファッション\t252783\n兔费\t252784\nVirtualBox\t252785\n十个月\t252786\nPLoS\t252787\n过后\t252788\nGriffith\t252789\n系统工\t252790\n广外外校\t252791\n绍兴市政府\t252792\n0零\t252793\n电热\t252794\n年龄组\t252795\n回忆篇\t252796\n修理\t252797\n抹胸裙\t252798\n班表\t252799\n华研\t252800\n中国民用航空中南地区管理局\t252801\nGfK\t252802\n内存版\t252803\nzzr\t252804\n第几步\t252805\nlocalhost\t252806\n用水性\t252807\n13周岁\t252808\nHead\t252809\nOTP\t252810\nCollage\t252811\n少先\t252812\n91jm\t252813\n新碶\t252814\n英特尔\t252815\n龙岩市人民政府\t252816\n彝海\t252817\n北京市食药监局\t252818\n蒙顶山茶\t252819\ndtb\t252820\n特力屋\t252821\n基督教\t252822\n湖南司法厅\t252823\n明亚保险\t252824\n浮球\t252825\n木锯\t252826\n假面饭店\t252827\n神巴巴星座网\t252828\nWord转换器\t252829\n鲜网辣文合集\t252830\n落山\t252831\n郝杰\t252832\njot\t252833\n淘宝天猫网店\t252834\nISTA\t252835\n远程教育招生网\t252836\n360安全卫士网管版\t252837\n问卷星\t252838\n万用表\t252839\n熟女\t252840\n长宁路\t252841\n北半球\t252842\n集运\t252843\n_窝窝QQ网\t252844\n转会费\t252845\n广告\t252846\n左氧氟沙星胶囊\t252847\n隘\t252848\n担忧\t252849\n高志大厦\t252850\n王工\t252851\n凶案\t252852\n乐雪薇\t252853\n迷笛音乐节\t252854\nGALLERY\t252855\n世萌\t252856\n血友病\t252857\nvowpal\t252858\n88万元\t252859\n激振力\t252860\n伊莱美\t252861\n热量\t252862\n答题助手\t252863\n海獭\t252864\n中国中药杂志\t252865\n轿车\t252866\n中冠\t252867\n16.98万起\t252868\n突变体\t252869\n南京东站\t252870\n乖宝宝\t252871\n拘押\t252872\n毙\t252873\n胡老师\t252874\n布拉德\t252875\n遇人\t252876\nSonata\t252877\n光强度\t252878\n痒疹\t252879\n三千方\t252880\n丸\t252881\n九十多岁\t252882\nprofe
ssor\t252883\n6.6.1\t252884\n六小时\t252885\n宜白路\t252886\n农产品交易网\t252887\n深圳市前海深港现代服务业合作区管理局\t252888\n固相萃取仪\t252889\n隆德大学\t252890\n88000\t252891\nmiro\t252892\n吴坚\t252893\n洒水车\t252894\nChild\t252895\n0.24.5\t252896\n振兴东北\t252897\n四川男篮\t252898\n6.6.3\t252899\n载有\t252900\n身份证明\t252901\n张高清\t252902\nupside\t252903\n济川街道\t252904\nSP4\t252905\n嫁给我\t252906\n9.5.2.0\t252907\n北京电网\t252908\n荣誉勋章2010\t252909\n雷伊\t252910\n妹纸\t252911\n高达00\t252912\n子墨\t252913\n微电商\t252914\n118万\t252915\n尼尔机械纪元吧\t252916\n三件套\t252917\n购课\t252918\n梦幻群侠传\t252919\n第十三节\t252920\nXK\t252921\nHyperSnap\t252922\n三十三\t252923\n张红军\t252924\n余姚市\t252925\nKnockin\t252926\n武侯区\t252927\n洁面粉\t252928\n广州东塔\t252929\n毛骗终结篇\t252930\n5i\t252931\n求助者\t252932\n荣耀v9\t252933\ncardio\t252934\n3706\t252935\n阿莫西林\t252936\n生活相对论\t252937\n共演\t252938\n绢丝\t252939\nantm\t252940\n八天\t252941\n青书学堂\t252942\n联名卡\t252943\nspringmvc4\t252944\n田歌\t252945\n跃点\t252946\n甄姬\t252947\n神经线\t252948\n漂泊者\t252949\nithoughts\t252950\n发威\t252951\n打水\t252952\n42天\t252953\n0024\t252954\nmqb\t252955\n执行\t252956\n零食品\t252957\n丽人\t252958\n四叶\t252959\n新颖性\t252960\n1712\t252961\ni3/Core\t252962\n尸首\t252963\n易碎品\t252964\nsinopharm\t252965\nappliance\t252966\nifi\t252967\n为先导\t252968\n留得住\t252969\nliaison\t252970\n皮肤性病学\t252971\n非财务\t252972\n唱诵\t252973\n战龙三国\t252974\n体指针\t252975\n氯化亚铜\t252976\n水泥搅拌车\t252977\nReducer\t252978\n赠品展\t252979\n伤亡事故\t252980\n营口职业技术学院\t252981\n警事\t252982\n黑晴明\t252983\n清华园\t252984\n重庆大学新闻学院\t252985\n现货黄金\t252986\nYui\t252987\n南昌三中\t252988\n色计\t252989\n七禾网\t252990\n辽宁省人民医院\t252991\n少女的祈祷\t252992\ncrooked\t252993\n面包盘\t252994\n内块\t252995\n索伦\t252996\n经审计\t252997\n车物\t252998\n3月1日后\t252999\n莫少\t253000\n西媒\t253001\nTomorrow\t253002\n瑞金\t253003\nDEX\t253004\n华声\t253005\n同列\t253006\n宁波e舟网\t253007\n药学学报\t253008\n兴蓉\t253009\n蛋白质组\t253010\n分册\t253011\n乐满地\t253012\n申卡\t253013\n技术贸易壁垒\t253014\ntransmit\t253015\n360免费WIFI\t253016\nproconfig\t253017\n王者荣耀血王宫\t253018\n玻璃厂\t253019\n范晓东\t253020\n科普展\t253021\n类型学\t253022\n守护甜心\t253023\nDBN\t253024\n玉
臀\t253025\n群嘲\t253026\n作品篇\t253027\n桑田沧海\t253028\n凯诺\t253029\n真视通\t253030\n产妇\t253031\n寻仙吧\t253032\n赵襄\t253033\n华军\t253034\n绝色狂妃\t253035\n79元\t253036\n北京大数据研究院\t253037\norace\t253038\n投洽会\t253039\n天成医疗\t253040\nC5\t253041\n好乐买\t253042\n中国物业网\t253043\n甜葡萄酒\t253044\n云南省人民政府国有资产监督管理委员会\t253045\n超神学院\t253046\n压比\t253047\n開\t253048\n4位\t253049\n小叶黄杨\t253050\n八里镇\t253051\n水晶洞\t253052\n裁断机\t253053\n圆极化\t253054\n省企\t253055\n莱盛\t253056\ndara\t253057\n≯\t253058\n撒隆巴斯\t253059\n安亲\t253060\nmdm\t253061\n焊接钢管\t253062\n坚定不移\t253063\n康爱多网上药店\t253064\n2017年5月28日\t253065\n光驱启动\t253066\n王杨\t253067\n打虎记\t253068\n魔能石\t253069\n3.4.3.9\t253070\n50平\t253071\n预存\t253072\nVirgo\t253073\n辊压机\t253074\n1cm\t253075\n蛋糕盘\t253076\n国有资\t253077\n纳新\t253078\nworkflow规则\t253079\nmalone\t253080\n桃花债\t253081\n一周后\t253082\n广州华南理工大学\t253083\ngrab\t253084\nSteampunk\t253085\n南充市商业银行\t253086\n风石\t253087\n云服\t253088\n877路\t253089\n名面\t253090\n场地赛\t253091\n擎天之柱\t253092\n赵德汉\t253093\n56_\t253094\n平安国际金融中心\t253095\n春兰\t253096\nhd530\t253097\n华发中央公园\t253098\n虚妄\t253099\n莫斯树\t253100\nMerida\t253101\n笼\t253102\n2413\t253103\n莲湖区\t253104\n入替\t253105\n吸金\t253106\n克重\t253107\n数独题\t253108\n长沙经济技术开发区\t253109\n蚂蚁花呗\t253110\n彖\t253111\nultraman\t253112\n颇多\t253113\n电吉他吧_\t253114\n完美搭配\t253115\n学出\t253116\nJagger\t253117\n淘粉吧\t253118\nWAV+CUE\t253119\n用户名\t253120\nElliot\t253121\nrng战队\t253122\n腑\t253123\n叶子猪新闻中心\t253124\n继峰\t253125\n达官贵人\t253126\n皇居\t253127\ncannot\t253128\n九头\t253129\n中国证券公司\t253130\nautosave\t253131\n舔吮\t253132\n聚醚型\t253133\n百度网盘/迅雷\t253134\n天高\t253135\n一次一个\t253136\n上焦\t253137\n160000\t253138\ndemocracy\t253139\n盛大花园\t253140\nTDL\t253141\n汐\t253142\n朱枫\t253143\ndumplings\t253144\nwunderlist\t253145\n四不放过\t253146\n天津劝业场\t253147\n抜\t253148\n捕食\t253149\n无法双面\t253150\n重庆市体育局\t253151\n铜版纸\t253152\nAtelier\t253153\n阿妈\t253154\n5730\t253155\nsqlite3数据库\t253156\n且末县\t253157\n今报网_东方今报\t253158\nLAYSHA\t253159\n美国贸易代表办公室\t253160\n012\t253161\n惠州市国土资源局\t253162\n奔涌\t253163\n特训营\t253164\n锦江\t253165\nEventEmitter\t2
53166\n台风\t253167\n格瓦拉生活网\t253168\n物业管理有限公司\t253169\n没事儿\t253170\n偷生\t253171\n红米1S\t253172\nsystem32\t253173\n纯阳废土\t253174\n松山湖\t253175\n16GB/WiFi版\t253176\n萧县\t253177\n萧县北\t253178\n荚\t253179\n迈克尔·波特\t253180\ned2k种子.rar\t253181\nshome\t253182\n一个集\t253183\nRPG/\t253184\nvmx\t253185\n沫\t253186\n四幼\t253187\n刘汝佳\t253188\nCaoporn\t253189\n烟花戏\t253190\n汇兑收益\t253191\n皇冠假日酒店\t253192\n科技部办公厅\t253193\n喷射状\t253194\n家门口\t253195\n收假\t253196\n容闳\t253197\n意恋\t253198\n分集水器\t253199\n儿童型\t253200\n哈珀\t253201\n好就业\t253202\n75种\t253203\n事倍功半\t253204\n祖冲之\t253205\n英语四级考试\t253206\n指导价\t253207\n中富\t253208\n职友网\t253209\n单证\t253210\nトン\t253211\n92年\t253212\nClifford\t253213\n四川外语学院成都学院\t253214\n诊察\t253215\n成都市统计局\t253216\n闯关类\t253217\n第五节\t253218\n其林\t253219\n沪江英语学习网\t253220\n昆明经开区\t253221\n各具特色的民居\t253222\n周总结\t253223\n昂立英语\t253224\n荣耀3x\t253225\n太虚图书馆\t253226\n云梦网\t253227\n叶凡\t253228\n一起作业帮助中心\t253229\n快美妆\t253230\n发动机压缩比\t253231\n贺德克\t253232\n抚州财政局\t253233\n黑布林\t253234\n北京市丰台区人民法院\t253235\n4.9.1\t253236\n女包\t253237\n集装箱车\t253238\n2.5A\t253239\n东戴河\t253240\n龙威\t253241\n首\t253242\n太空服\t253243\n34张\t253244\n乔11\t253245\n深交\t253246\n一切安好\t253247\n我们这\t253248\n徐少华\t253249\n亚运村\t253250\n周婧\t253251\n醋泡花生\t253252\nCheckbox\t253253\n压缩裤\t253254\n李弘\t253255\n维多利亚市\t253256\n楚汉争霸\t253257\n76cm\t253258\nhouse\t253259\n门岗\t253260\nfountain\t253261\nppt-CSDN\t253262\n160亿元\t253263\n费尽心机\t253264\n白塘壹号\t253265\n畸\t253266\n前列腺囊肿\t253267\n扬州西区\t253268\n尹灵芝\t253269\nc1flexgrid\t253270\n中国科学院海洋研究所\t253271\n纳特·帕格\t253272\n贵州省人民政府办公厅\t253273\n肖像照\t253274\n大希\t253275\n客户区\t253276\n迎泽西大街\t253277\n骑马与砍杀汉匈全面战争\t253278\n甲乳\t253279\n铝轮\t253280\n徐逸\t253281\n太平洋铁路\t253282\n广州电台\t253283\nkeywords\t253284\n足模\t253285\n桑田佳佑\t253286\n质押权\t253287\nmined\t253288\nspel\t253289\n动起来\t253290\n西十区\t253291\ndistances\t253292\n夏贝尔\t253293\n三轮车\t253294\nEV\t253295\n嘴兽\t253296\n一次两次\t253297\nsj\t253298\n交易员\t253299\nPuts\t253300\n口香\t253301\n乙二醇\t253302\nvitashell\t253303\narchetype-webapp\t253304\n硫\t253305\n弗利\t253306\n公厕\t253307
\n我可能不会爱你\t253308\nPacking\t253309\n中体网\t253310\nWrath\t253311\n锤片\t253312\n败也萧何\t253313\n鱼眼\t253314\n伏地魔\t253315\n情人桥\t253316\n220v\t253317\n20层\t253318\n安防监控\t253319\n板图\t253320\n灰与幻想的格林姆迦尔\t253321\npredict\t253322\n黄河新闻网\t253323\nrhetoric\t253324\n打日语\t253325\n湖北专升本网\t253326\n答案网\t253327\n熔山龙\t253328\n汉王镇\t253329\n辣话\t253330\n碎身\t253331\nsysup\t253332\n白骑士\t253333\n很自然\t253334\n富海人才网\t253335\n月刊\t253336\n默认选择\t253337\n寒温\t253338\n80d\t253339\n张天明\t253340\n日播版\t253341\n嘈嘈\t253342\n净利润率\t253343\n直身\t253344\n陈晓宇\t253345\n2014年12月份\t253346\n名图傲世九重天\t253347\n肇事逃逸\t253348\n北大医药\t253349\n水杉树\t253350\nEurope\t253351\n治权\t253352\n【莓丽\t253353\n炸伤\t253354\nDB31\t253355\n姐姐\t253356\n十九大和十九届一中全会\t253357\nHD1280高清\t253358\n柠檬精油\t253359\naugust\t253360\n素酶\t253361\n万松岭\t253362\n短诗\t253363\nCombox\t253364\n富贵男贫穷女\t253365\n信阳房产网\t253366\n崭\t253367\nAFs\t253368\nHCM\t253369\nTradition\t253370\n七、八\t253371\nstudying\t253372\n解含\t253373\n2015年10月份\t253374\n爸\t253375\n世界大学\t253376\nRAV\t253377\n樱桃\t253378\nfried\t253379\n职业教育活动周\t253380\n磨头镇\t253381\n力荐\t253382\n广西那坡县人民政府\t253383\n翼虎论\t253384\n入命\t253385\n延毕\t253386\n红米note\t253387\nCBD\t253388\nOceans\t253389\nincase\t253390\nInputBox\t253391\nmedibang\t253392\ntask1\t253393\n薏仁水\t253394\n弹冠\t253395\nvaliant\t253396\niTween\t253397\n1322\t253398\n杳无音信\t253399\n动动广场舞\t253400\ngt620\t253401\n回头路\t253402\nmss\t253403\n书面化\t253404\n武十郎\t253405\n魔物娘的相伴日常\t253406\n55种\t253407\n云南财经大学\t253408\n乱清\t253409\nprosperous\t253410\n168万\t253411\n新影集团\t253412\n窦道\t253413\n烟雨蒙蒙\t253414\n志方\t253415\n中骏广场\t253416\nudh\t253417\n小向\t253418\n12CM\t253419\n中华人民共和国突发事件应对法\t253420\n第四十四\t253421\n教研组\t253422\n少年英雄\t253423\n云顶温泉\t253424\nCEIBS\t253425\n冷凝水\t253426\nJ6\t253427\n200K\t253428\n网上认证\t253429\n48_\t253430\n汽车违章查询\t253431\n摄影部\t253432\n退票\t253433\nncl\t253434\n房产税|华律\t253435\n阐\t253436\nGB_T\t253437\n手嶌葵\t253438\n能带给\t253439\n青海民族大学\t253440\n天下财经\t253441\n武汉理工大学出版社\t253442\n校运动会\t253443\n凯迪夏\t253444\n1.5讲话\t253445\n肃宁\t253446\n_莲山\t253447\n毕业照\t253
448\n女仆咖啡厅\t253449\ncubase8\t253450\n萧山机场\t253451\n大胆人体艺术网\t253452\n社科\t253453\n只当\t253454\n易筋\t253455\n撘\t253456\n女性感\t253457\nmhxy\t253458\n单组份\t253459\n紫云苗族布依族自治县人民政府\t253460\n闸式剪板机\t253461\n网色\t253462\n旅游大道\t253463\n性治疗师\t253464\n乡村爱情故事\t253465\n丁类\t253466\n王式安\t253467\nqdialog\t253468\nYJY\t253469\n窑尾\t253470\n粘胶纤维\t253471\n言论\t253472\n紫苏中文网\t253473\n染缸\t253474\n杭电股份\t253475\npconnect\t253476\n0.1.3\t253477\n植骨\t253478\n金莲\t253479\n世民\t253480\n群宝\t253481\n就能够\t253482\n萧水光\t253483\n纯生\t253484\n科教片\t253485\nHERO5\t253486\n招行手机银行\t253487\n注册表键\t253488\n麻片\t253489\n温溪镇\t253490\nplearning\t253491\n沢実纱\t253492\n张榜\t253493\n周疲劳\t253494\n结果页\t253495\n文汇报\t253496\n誉\t253497\n生活方\t253498\n那年花开月正圆\t253499\n11M\t253500\nPatong\t253501\n八千流\t253502\n第16次\t253503\n快言\t253504\nsd高达g世纪创世\t253505\n卡通风\t253506\n碧桂园控股有限公司\t253507\nFloyd\t253508\n省编办\t253509\n中华人民共和国食品安全法\t253510\n文才\t253511\n极端主义\t253512\n兰山区政府\t253513\n电影\t253514\n晶粒度\t253515\n昌都市\t253516\n较\t253517\n骆玉珠\t253518\n闪迪\t253519\n第48个\t253520\nvariables\t253521\n大家一起来\t253522\n约翰尼·德普\t253523\n顾止\t253524\nu20\t253525\n多元醇\t253526\n普通高等学校\t253527\n荣\t253528\n小影\t253529\n剥肋\t253530\nアオアシ\t253531\n转正定级表\t253532\n患儿\t253533\n粒细胞\t253534\n被选举权\t253535\nspeech\t253536\n调解员\t253537\n苦工\t253538\n双氢睾酮\t253539\nCHOCOOLATE\t253540\n萨维塔\t253541\nstime\t253542\n周达\t253543\n太平财险\t253544\n加价率\t253545\n海田\t253546\n金辉世界城\t253547\n妈妈是超人2\t253548\ngrandma\t253549\n焕蓝\t253550\nheights\t253551\nGyoung\t253552\n热固性\t253553\n莱\t253554\n土地庙\t253555\n周振华\t253556\n4.0.4\t253557\n中国教育装备采购网\t253558\nRDX\t253559\n窑\t253560\n四川省工商行政管理局\t253561\n凯越兰德酷路泽\t253562\n】\t253563\n电脑声卡驱动\t253564\n月关\t253565\n托槽\t253566\nSCU\t253567\ninterrupts\t253568\n大卫科波菲尔\t253569\n广汇能源\t253570\n一个多重\t253571\nNachrichten\t253572\n宾度\t253573\n4338\t253574\n526\t253575\n多点触控\t253576\n适得其反\t253577\n体温表\t253578\n准格尔旗\t253579\n钢管桩\t253580\n拜拜\t253581\n鲜竹\t253582\n推重比\t253583\n农业部办公厅\t253584\n775亿\t253585\n珠海市发展和改革局\t253586\n清枫聆心\t253587\n忍者大师吧\t253588\n电缆保护管\t253589\nAL10\t2
53590\n大豫\t253591\n游离氨基酸\t253592\n黑鹰\t253593\n事业单位管理岗\t253594\n一段路\t253595\n项目管理师考试\t253596\n集成电\t253597\n传奇之王\t253598\n窗帘杆\t253599\n蔺晨\t253600\n286号\t253601\n两基\t253602\nmatlab2012a\t253603\nconsultancy\t253604\n宅霸\t253605\n扬州市第一人民医院\t253606\n100多个\t253607\n勾连\t253608\n稳产\t253609\n蒜客\t253610\n中国女团\t253611\n正位\t253612\n第3回\t253613\nmodern\t253614\n毫无保留\t253615\n更重\t253616\n37秒\t253617\n努比亚V18\t253618\n云南机电职业技术学院\t253619\n蔡东\t253620\n四冲程\t253621\n葡萄籽\t253622\n桃花谣\t253623\n第183章\t253624\n无线电信号\t253625\nw541\t253626\n佳崔\t253627\n军改\t253628\ndrought\t253629\nWindo\t253630\n双树\t253631\n惠州经济职业技术学院\t253632\n装夹\t253633\n花亭湖\t253634\n盐酸苯海索片\t253635\n梁斌\t253636\n布衣乐队\t253637\n超微服务器\t253638\n模压门\t253639\n畅饮\t253640\npuls\t253641\nad转换\t253642\n印子\t253643\n餐饮类\t253644\n陕西工业职业技术学院\t253645\n大运城\t253646\nDDR5\t253647\n潘巧云\t253648\n谭谈\t253649\n党毅飞\t253650\n填平\t253651\n汉威大厦\t253652\n数组类\t253653\n封堵\t253654\n4.7英寸\t253655\n柴米\t253656\n探微\t253657\n全美\t253658\n王文超\t253659\n义堂镇\t253660\n冰醋酸\t253661\n板框式\t253662\n小产证\t253663\n水冰月\t253664\n壁材\t253665\nLanding\t253666\n制取\t253667\n玛雅文明\t253668\n姬圈\t253669\n集成块\t253670\n就业促进法\t253671\n韩先生\t253672\n陶辉\t253673\n中华整木网\t253674\n黄历网\t253675\n房产中介公司\t253676\nTp\t253677\n2CHA\t253678\n上海璞丽酒店\t253679\n袍江工业区\t253680\nAlbany\t253681\n镀膜玻璃\t253682\n马尔默\t253683\n召回率\t253684\n唐朝乐队\t253685\n上海人民广播电台\t253686\nnetsh\t253687\n斜切\t253688\n晶胞\t253689\n祥和\t253690\n卵裂\t253691\n清收\t253692\n年末\t253693\n郑先生\t253694\n高价股\t253695\n双线\t253696\n猛狮科技\t253697\n劳动监察\t253698\nPHOTO\t253699\n王一卡特\t253700\npac\t253701\n中国建博会\t253702\n非公经济组织\t253703\nEPPlus\t253704\n光条\t253705\n彭梓嘉\t253706\ncrea\t253707\n娘家人\t253708\n水箭龟\t253709\n室内乐\t253710\n94周年\t253711\n外源\t253712\n电影片\t253713\n一个天\t253714\n电度表\t253715\nexcel2000\t253716\n下一代防火墙\t253717\nNCAP\t253718\n隔墙板\t253719\n金利科技\t253720\nshareholder\t253721\n南麂列岛\t253722\n大虹桥\t253723\n北苑家园\t253724\n藤本月季网\t253725\n美特斯\t253726\nmachine\t253727\n吉利远景\t253728\nwin10ghost\t253729\n德鲁克\t253730\n升速\t253731\n样件\t253732\n薛\t253733\n酒业\t2537
34\nwarning\t253735\n代抢\t253736\n谢颖颖\t253737\n佳能5d2\t253738\n自化\t253739\n议事日程\t253740\n坐支\t253741\narcaea\t253742\n枯叶蝶\t253743\n杨哲\t253744\n热胀冷缩\t253745\nVocational\t253746\n扩展柜\t253747\nXQuery\t253748\n凤台\t253749\n世硕\t253750\n1000hz\t253751\nmyid\t253752\n饲主\t253753\n代客泊车\t253754\n明月光\t253755\n工作时间\t253756\n辅星\t253757\n集体户\t253758\nWEST\t253759\n飞架\t253760\nmysql数据库栏\t253761\nRender函数\t253762\n猎熊\t253763\ngavan\t253764\n汉杯\t253765\n老佛\t253766\nVAT\t253767\n驱驰\t253768\nBPC\t253769\n隆回县\t253770\n三十首\t253771\nhereson\t253772\n杨汛桥\t253773\n图章\t253774\n姚霁珊\t253775\n按着\t253776\n散热性\t253777\nuhttpd\t253778\n四川石化\t253779\ntapered\t253780\n杭州市江干区教育局\t253781\n浅绿\t253782\n2017年12月13日\t253783\n王牌军\t253784\nradon\t253785\n张兆辉\t253786\nmutually\t253787\n刘先银\t253788\ndisability\t253789\nlyrics\t253790\n蒙自路\t253791\nBukkake\t253792\n油边\t253793\n金天\t253794\n元牌\t253795\n李天扬\t253796\n猎鹰突击队\t253797\n月德\t253798\n左右\t253799\n基恩士\t253800\n短焦\t253801\n羽田あい\t253802\n上犹县人民政府\t253803\n无敌电影网\t253804\nVictorian\t253805\n林则\t253806\n肥圆\t253807\nCrescent\t253808\npc\t253809\n钟祥市政府网\t253810\n金振焕\t253811\n浮潜\t253812\n缠绵入骨\t253813\nfyeedu\t253814\nPEOPLE\t253815\n【山\t253816\n毕达哥拉斯\t253817\n木水\t253818\n出具\t253819\n三分裤\t253820\n集体土地建设用地使用证\t253821\nReferrer\t253822\n郑庄公\t253823\npanic\t253824\n倒库\t253825\n林内热水器\t253826\n清岛湾\t253827\nfar\t253828\n湖南省公安厅\t253829\n龙湖公园\t253830\n污图\t253831\n灞桥区\t253832\n431金融学\t253833\n安游\t253834\n究极忍者风暴\t253835\n酸酐\t253836\n藩篱\t253837\n海信智能电视\t253838\n刺激感\t253839\n番荔枝\t253840\nDV视频剪辑论坛\t253841\n辽媒\t253842\n大动干戈\t253843\n甲硝唑片\t253844\n圣托里尼岛\t253845\n恋舞OL_九游论坛\t253846\n2毛\t253847\n黑龙江省环保厅\t253848\n300多个\t253849\nlvan\t253850\nnutanix\t253851\n回乡偶书\t253852\n喵萝\t253853\n珠海国际会展中心\t253854\nTyphoon\t253855\n203所\t253856\n盛宠之嫡女医妃\t253857\n贝卡尔特\t253858\n微甜\t253859\nehome\t253860\n答式\t253861\n▍\t253862\n王庚\t253863\nIndustries\t253864\n一栋\t253865\n林添一\t253866\n军人\t253867\nactionbar\t253868\n晃\t253869\n富余\t253870\n动怒\t253871\n17年前\t253872\n6.7下\t253873\nZencart\t253874\nPES2013\t25
3875\n爱美之心\t253876\n视贝\t253877\natan\t253878\n琼中黎族苗族自治县\t253879\n铁三角\t253880\neee\t253881\n上海华山医院\t253882\nb612\t253883\n中国小学\t253884\n红点奖\t253885\n八件套\t253886\n坏男孩\t253887\nLS\t253888\nSeduction\t253889\n双杰电气\t253890\n甘肃工业职业技术学院\t253891\n冒名顶替\t253892\n土木工程专业\t253893\n财权\t253894\n黄热病疫苗\t253895\n国家动物博物馆\t253896\n神首集团\t253897\n朱啸虎\t253898\nSEO代客操作\t253899\n电池包\t253900\n吐痰\t253901\n川化股份\t253902\n博澳城\t253903\n柳州市柳南区政府\t253904\n多些\t253905\ne影\t253906\nOri\t253907\n冥妻\t253908\n中国科学院南海海洋研究所\t253909\n北京盈建科软件股份有限公司\t253910\n山灵m1\t253911\n150平米\t253912\n跨省\t253913\nv卡\t253914\n统计源\t253915\n先兆子痫\t253916\n号版\t253917\n孤岛危机3\t253918\n肖敏\t253919\n陈文娟\t253920\nlele\t253921\n彩宝网\t253922\n聚光\t253923\n县城管执法局\t253924\n口袋妖怪绿宝石802\t253925\n申龙斌\t253926\n梦然\t253927\nTFTLCD\t253928\n勿进\t253929\n6轴\t253930\nGaze\t253931\n玫瑰花苗\t253932\ndiaries\t253933\n银象\t253934\n易爆\t253935\nrom包\t253936\n全国疟疾日\t253937\n建构区\t253938\n桨距\t253939\n无数条\t253940\n规训\t253941\n汕头大学医学院\t253942\n旁友们\t253943\n0049\t253944\n炒饭\t253945\n笔划\t253946\nzin\t253947\n华宁路\t253948\nlisa\t253949\n小黄鸡\t253950\nNTC\t253951\n教务长\t253952\n艳羡\t253953\nconservative\t253954\n渗透者\t253955\n23\t253956\n天斗\t253957\n几划\t253958\n摇摆舞\t253959\nBTScene\t253960\n报道\t253961\n和美版\t253962\nbootargs\t253963\n陈劲\t253964\nZGL\t253965\napologies\t253966\n中局\t253967\n筋板\t253968\n免票\t253969\n五世\t253970\nMolly\t253971\nIllinois\t253972\n池城\t253973\n国王节\t253974\ndeployed\t253975\nVue\t253976\n瓦屋山\t253977\n20170902\t253978\n6杯\t253979\n北京星巴克\t253980\n人海战术\t253981\n淄博师范高等专科学校\t253982\n高超\t253983\n七月初七\t253984\n溪山\t253985\n黑化文\t253986\n深圳市龙岗区人民医院\t253987\ndeclare\t253988\ndalao们\t253989\n配伍\t253990\nipone8\t253991\nphotoshopcs4\t253992\n昆明妈妈网\t253993\n粟裕\t253994\nFaded\t253995\n医家\t253996\n礼程\t253997\nxxx\t253998\n中国注册会计师\t253999\n老烟\t254000\n维修厂\t254001\n阿伦特\t254002\n参变\t254003\n虎易网\t254004\n前三分钟\t254005\n瑶\t254006\n6.0德拉诺\t254007\n无遮挡\t254008\nsunglasses\t254009\nMAO\t254010\n紫江\t254011\n朗诵\t254012\ntiptop\t254013\nKimi\t254014\n千图网www.58pic\t254015\n恶劣\t254
016\n林深\t254017\nbcl\t254018\n波兰斯基\t254019\n浅灰色\t254020\n中国科学技术协会\t254021\n疏运\t254022\n井水\t254023\n旧片\t254024\n摩天轮\t254025\n无手\t254026\n当代电影\t254027\nDNF地下城与勇士\t254028\nwu2198\t254029\n上海国际金融中心\t254030\n标识牌\t254031\n石牌镇\t254032\n幻想三国志4\t254033\n5.3.8\t254034\n壳股\t254035\nMSSQL-51CTO\t254036\n日产车\t254037\n精灵宝可梦GO\t254038\niBeacon\t254039\n发布器\t254040\n64gb\t254041\n沅陵\t254042\n摩根大通\t254043\n宝视达\t254044\n保密委员会\t254045\n根瘤\t254046\n知情权\t254047\nPlsql\t254048\nIATA\t254049\n必成\t254050\n2399\t254051\n广州起义\t254052\n彼得堡\t254053\n航拍中国\t254054\nmovl\t254055\n17:30\t254056\n星哥\t254057\nMusician\t254058\n古波斯\t254059\ncrocodile\t254060\n武里南\t254061\n美龙\t254062\n鹿斑\t254063\ngrandle\t254064\n盐酸特比萘芬\t254065\n展览室\t254066\n神仙道2\t254067\n贤王\t254068\ntr750\t254069\nUSB3.1\t254070\n科尔宾\t254071\nx-6\t254072\n中昂\t254073\ndateedit\t254074\nopened\t254075\n一乐\t254076\n单效\t254077\n源自\t254078\n三国杀界\t254079\n保洁\t254080\n资库\t254081\n眼皮肤\t254082\n豌杂面\t254083\nAPT\t254084\nSimpleDateFormat\t254085\n十多名\t254086\n祖宗十九代\t254087\n发展大厦\t254088\n马鞭\t254089\n罗纳\t254090\n搞笑漫画日\t254091\n上海市大学\t254092\n追访\t254093\n长沙市财政局\t254094\n坑爹哥\t254095\n暴烈\t254096\n妖物\t254097\n防火棉\t254098\n仙乐传说\t254099\n一杰\t254100\n第7部\t254101\n桑原\t254102\nx-t2\t254103\n4PIN\t254104\n三孚股份\t254105\n双饰\t254106\n灰色地砖\t254107\n灼人\t254108\ntits\t254109\n王一霏\t254110\n民主路\t254111\n创维电视遥控器\t254112\n渔山列岛\t254113\nfont-family\t254114\n十绝阵\t254115\nf574\t254116\n雍江\t254117\n深圳外国语学校\t254118\nfinalshell\t254119\n正阳\t254120\n国游\t254121\nlaoshan\t254122\n公兴\t254123\n乐高积木玩具\t254124\n中国铁路哈尔滨局集团有限公司\t254125\n寸照\t254126\n烤地瓜\t254127\n12110\t254128\n谷氨酰转移酶\t254129\n认知心理学\t254130\n朝鲜语\t254131\n共价\t254132\n60335\t254133\n棉棉\t254134\nMOV格式\t254135\nsnipping\t254136\n说下载\t254137\n谷场\t254138\n四种\t254139\nissuance\t254140\n信访件\t254141\nHD-RMVB\t254142\n反恐精\t254143\n亚威股份\t254144\n简童\t254145\n德尔斐\t254146\n女总异世邪君\t254147\n拉奥孔\t254148\n仙剑\t254149\n接受方\t254150\n闲来一坐\t254151\n增稠剂\t254152\n杭州下城区\t254153\n抛光片\t254154\nPOJ\t254155\n26G\t254156\n90所\t254157\n农用车\t2541
58\n8月1日\t254159\n果卿\t254160\nOMS\t254161\n重庆地铁10号线\t254162\n奔驰C级论坛\t254163\n偷偷的\t254164\n450000\t254165\n三侠五义\t254166\n岳麓街道\t254167\n掂\t254168\n津上熊吉\t254169\n刘晓宇\t254170\n语速\t254171\n黄龙景区\t254172\n铁站\t254173\n齐妃\t254174\nSoqi\t254175\n24家\t254176\nlinun\t254177\n华中理工大学\t254178\n曹姓\t254179\n李剑青\t254180\n能歌善舞\t254181\n天然气表\t254182\nDMT\t254183\n金钻石\t254184\n4.10.0\t254185\n红石寨\t254186\n半夜2点\t254187\n金高\t254188\n红鳉鱼\t254189\nTestimonials\t254190\n体育文化节\t254191\n华帝\t254192\n雷王\t254193\n丰宁县\t254194\nseri\t254195\n充气式\t254196\n6820hk\t254197\n凯宾斯基\t254198\n跳电\t254199\n世纪大道\t254200\n20000公里\t254201\n小聚\t254202\n华人开运网\t254203\n卡诗\t254204\nPUKY\t254205\n我为\t254206\n整轨\t254207\n14米\t254208\n戈麦斯\t254209\nBrien\t254210\n瑞典克朗\t254211\n遗照\t254212\n金投赏\t254213\n围追\t254214\n康乐街\t254215\n狂虐\t254216\n老汪\t254217\n文在寅\t254218\n下牙\t254219\nherschel\t254220\n流月\t254221\n105周年\t254222\n穿上\t254223\n赫梯\t254224\n嶙峋\t254225\n利物浦大学\t254226\n上海医疗保险\t254227\n德马\t254228\n不修边幅\t254229\n300毫升\t254230\n建局\t254231\n社保算\t254232\n丁腈\t254233\ngaokao\t254234\n黄龙玉\t254235\nHGame公司\t254236\n理解力\t254237\n喜士多\t254238\n名图\t254239\nPia\t254240\n卡尔马\t254241\n艾海提\t254242\n浪凡\t254243\nEncoding\t254244\n安装工程师\t254245\n尴尬\t254246\n特指\t254247\nvmp\t254248\n监区\t254249\nOverwatch\t254250\n正衡\t254251\n哈商大\t254252\n赐死\t254253\n75b\t254254\n六星级\t254255\nUSB技术论坛\t254256\nft中文网\t254257\n12v\t254258\n中国电影家协会\t254259\nPIV\t254260\nMBR分区\t254261\n斗鱼直播节\t254262\nRAP\t254263\n杏脯\t254264\n逆苍天\t254265\n哈工大\t254266\n一番\t254267\n宝们\t254268\n垂体瘤\t254269\n苏州纳米所\t254270\n陈子庄\t254271\nlamb\t254272\n上巻\t254273\nnaruto\t254274\n植发\t254275\n比基尼美女\t254276\nmasscan\t254277\n大盆菜\t254278\n好处费\t254279\n上海虹桥国际机场\t254280\n绿城两江御园\t254281\n京东网银\t254282\n结下\t254283\nSounds\t254284\n怒之铁拳3\t254285\n除螨机\t254286\n高项\t254287\n企业编制\t254288\n天露山\t254289\n凤凰大厦\t254290\ncutout\t254291\nRegistrar\t254292\ntenth\t254293\n400亿\t254294\n火影忍者百科\t254295\n拳套\t254296\n河北人社网\t254297\n问候\t254298\n圣伯纳\t254299\n孙杨\t254300\nVodka\t254301\n老虎团\t254302\n619\t254303\n金素恩\
t254304\n非线性偏微分方程\t254305\n万秀\t254306\napns\t254307\n三间\t254308\n188bet\t254309\n抱框柱\t254310\n怡然自得\t254311\n英日\t254312\n自渡\t254313\n3035\t254314\n满帮集团\t254315\n稀有异\t254316\nServer2000\t254317\n长期负债\t254318\n国家核安全局\t254319\n安吉丽娜·朱莉\t254320\n46种\t254321\n多_\t254322\n茱莉亚音乐学院\t254323\n台商区\t254324\n解释型\t254325\n中控考勤\t254326\nMSATA\t254327\n比亚迪速锐\t254328\n答题者\t254329\n蜗轮丝杆升降机\t254330\nv1.5.2\t254331\n晾晒\t254332\n你的手\t254333\n停诊\t254334\n广州航海学院\t254335\n略阳\t254336\n云兽\t254337\nTutorABC\t254338\n工作作风\t254339\n欧蓝德论坛_汽车之家论坛\t254340\n青米\t254341\ntext\t254342\n新奇骏\t254343\n武汉市东西湖区\t254344\n晴空万里\t254345\n最喜欢\t254346\n漳州市人民政府\t254347\n横空出世\t254348\n绝地求生4x4\t254349\n海上嫁女记\t254350\n天津博物馆\t254351\n乂乂\t254352\n打折券\t254353\n何因\t254354\n明基BenQ\t254355\n螺纹塞规\t254356\n行稳致远\t254357\n九博人才网\t254358\n孙女\t254359\n轻歌曼舞\t254360\n徐医附院\t254361\n代书遗嘱\t254362\nJCreator\t254363\n烟叶\t254364\nVanity\t254365\n大亚湾政府网\t254366\n配餐\t254367\n39320244\t254368\n珀利\t254369\n押送\t254370\n音笑网\t254371\n多圈\t254372\n楢山节\t254373\nfreesync\t254374\nAftermarket\t254375\n黄仁俊\t254376\n即热式饮水机\t254377\n白鹿天下\t254378\n保守主义\t254379\n保持微笑\t254380\n高坡\t254381\n世友\t254382\n闯过关\t254383\nLinked\t254384\nOfficePLUS\t254385\n湖南卫视\t254386\n冠豪高新\t254387\nRecruitment\t254388\nPhotographer\t254389\n火烈鸟\t254390\n小站镇\t254391\n广州汇智通信技术有限公司\t254392\n新贝娱乐\t254393\n知日\t254394\n孖\t254395\n_碗莲吧\t254396\n650MT\t254397\n妖狐淫刀\t254398\n台北机场\t254399\n手锯\t254400\n上帝之手\t254401\n张芝\t254402\n撒手\t254403\nV2.5.0\t254404\n棒杀\t254405\n子胥湖\t254406\n生物膜法\t254407\n自卸货车\t254408\n锐进\t254409\npylab\t254410\n富智康\t254411\n毛概题库\t254412\n1.36G\t254413\nTK\t254414\n南海万达广场\t254415\n十二分\t254416\n迁就\t254417\n征明\t254418\n点灯\t254419\n三亚\t254420\n煤油味\t254421\nteh\t254422\n福日电子\t254423\n被刑\t254424\n易化\t254425\n诊改\t254426\n质量保证期\t254427\nonlick\t254428\n|言\t254429\nD2S\t254430\n生物素\t254431\n马鲛鱼\t254432\n芥酸酰胺\t254433\nbetta\t254434\n黄蓉新传\t254435\n行政法\t254436\n贵州电网\t254437\n报销\t254438\n肥西县政府\t254439\nV8.0\t254440\n沾益县\t254441\n内联式\t254442\n配管\t254443\n整套\t254444\n撞机\t254445\n四十
多岁\t254446\n落花流水\t254447\n雪村\t254448\n_凤凰佛教\t254449\n宋伊人\t254450\ngauss\t254451\n君同\t254452\n嵊州新闻网\t254453\n绿度\t254454\nbnb\t254455\nSRPG\t254456\nyyc\t254457\ndiction\t254458\n螽斯\t254459\n暹粒\t254460\n翻台\t254461\nBeginners\t254462\n部分集\t254463\n零售学\t254464\nvct\t254465\n储粮\t254466\n注册吧\t254467\nwww点elesos点com\t254468\n45cm\t254469\n第二十二条\t254470\n盘盘\t254471\nGPS定位\t254472\n同等学力申硕考试\t254473\n增长速度\t254474\n无效\t254475\nFESTIVAL\t254476\n香缇卡\t254477\n三千场\t254478\n四道街\t254479\n黄兴国\t254480\n精气神\t254481\n第五家\t254482\n我的故事\t254483\n肌纤维\t254484\n开板\t254485\n久其软件\t254486\n面盆\t254487\nyuyue\t254488\n狂帝\t254489\n逸动论坛\t254490\n第36章\t254491\ntdl\t254492\n急聘\t254493\npowermap\t254494\n上海对外贸易学院\t254495\n首都师范大学科德学院\t254496\n0-10\t254497\n海装\t254498\nMMD\t254499\n嘀咕\t254500\n恋爱幸运曲奇\t254501\n皮贴\t254502\n知人善任\t254503\n山果\t254504\n许建\t254505\n比格比萨\t254506\n逆神\t254507\n武汉长江新城\t254508\n弄脏\t254509\n张金山\t254510\n上海南浦妇科医院\t254511\n腹黑攻\t254512\n两节\t254513\n垃圾盒\t254514\n青海省人民政府\t254515\nrehl\t254516\n工农区\t254517\n百道网\t254518\n艾莱依\t254519\n宜章新闻网\t254520\n江东新区\t254521\nGEL\t254522\n1727\t254523\n新会一中\t254524\n320亿\t254525\n汽车消费税\t254526\n恒易融网\t254527\n捏人\t254528\n王广\t254529\n传教\t254530\n乌克娜娜\t254531\n卫星发射中心\t254532\n开原市\t254533\n刷客\t254534\n偶书\t254535\n派特灵\t254536\nsoc\t254537\n黑龙江省人大\t254538\nDEAN\t254539\n4G模块\t254540\n开采业\t254541\n黄大炜\t254542\n吉林\t254543\n盐柱\t254544\n大牌\t254545\n速8\t254546\nMtoA\t254547\n万鼎\t254548\n深圳锦绣中华\t254549\n94.5\t254550\nagn\t254551\nfry\t254552\n北京市经济技术开发区\t254553\n库鲁尔\t254554\n绝命毒师第二季\t254555\n三五个\t254556\n糯化\t254557\n2039\t254558\n干爹\t254559\n玻璃钢脱硫塔\t254560\n1月20号\t254561\nr7s\t254562\ncbu\t254563\n茱莉蔻\t254564\n荣威W5\t254565\n红三角\t254566\n红种\t254567\n发愣\t254568\n模数\t254569\n盐城技师学院\t254570\n速贷\t254571\n绿化树\t254572\n潘桥街道\t254573\nwonderware\t254574\n喷淋管\t254575\n康佳集团股份有限公司\t254576\nfalling\t254577\nGLONASS\t254578\n小儿推拿网\t254579\n氨化\t254580\nkeydown事件\t254581\n先进集\t254582\n方塘\t254583\ngtx1070\t254584\n室外\t254585\nibus输入法\t254586\nVertica\t254587\nttyouku\t254588\ninor
eader\t254589\n顾海\t254590\n沃特股份\t254591\n惨爪龙\t254592\n随访\t254593\nmax-age\t254594\n对碰\t254595\n范家屯镇\t254596\n笑里藏刀\t254597\n幻梦\t254598\n工交\t254599\n炎\t254600\n美妇人\t254601\ndepth\t254602\n陆洋\t254603\nGL\t254604\nGPT分区表\t254605\n杨茁\t254606\n威露士\t254607\n次元壁\t254608\n力力\t254609\n汗鲨\t254610\nThe\t254611\nH6\t254612\n行李厢\t254613\n文明户\t254614\nlovelyz\t254615\n红高粱\t254616\n博塔斯\t254617\n毒蘑菇\t254618\n逮\t254619\n顶掉\t254620\n电子手刹\t254621\nnigo\t254622\n方舟中心岛\t254623\n随机应变\t254624\n颇高\t254625\no\t254626\n政府类\t254627\nSewing\t254628\n深圳市怡亚通供应链股份有限公司\t254629\n真绝\t254630\n沣\t254631\nplc200\t254632\n烟大海底隧道\t254633\n前件\t254634\nWww\t254635\nreef\t254636\n宾王\t254637\n荆州市网上公安局\t254638\n小明星大跟班\t254639\n音超\t254640\nFrank\t254641\n飞人\t254642\n咖啡之翼\t254643\npdf书\t254644\n小瑜\t254645\nJuiceSSH\t254646\n0.34\t254647\n海南村\t254648\n20151029\t254649\n爱情戏\t254650\n固井\t254651\n下拉框\t254652\nstatistic\t254653\n3320\t254654\n悬镜\t254655\n电装\t254656\n奶牛场\t254657\n休陪\t254658\n中部国际机场\t254659\n首帖\t254660\nstrf\t254661\n1960\t254662\n小雅\t254663\n渔牌\t254664\n街机三国志\t254665\nnexus5x\t254666\n阿尔山国家森林公园\t254667\n美林谷\t254668\nclss\t254669\n猪骨\t254670\n聘为\t254671\n循\t254672\n轮融资\t254673\n第118\t254674\n紗栄子\t254675\nCH3COOH\t254676\n王瑞昌\t254677\n四川省绵阳南山中学\t254678\n脚距\t254679\nELEX\t254680\n品尊\t254681\n双水镇\t254682\n醋王\t254683\n方格子\t254684\n广大附中\t254685\n彩库\t254686\n白云小区\t254687\nSpaceSniffer\t254688\n琉璃青ro\t254689\nbrazil\t254690\nacg\t254691\n指导意\t254692\n纵坡\t254693\n甲烷传感器\t254694\n爬虫入门\t254695\n福州黎明职业技术学院\t254696\n山顶\t254697\n瑜伽网\t254698\n0层\t254699\n背蚁\t254700\n395级\t254701\n沉降率\t254702\n净含量\t254703\n吞食\t254704\n拼版\t254705\n县市场监督管理局\t254706\n幽默曲\t254707\n胖猫\t254708\n浪漫一生言情小说阅读网\t254709\n水悦城\t254710\n歌库\t254711\n特蕾莎\t254712\n世界路\t254713\nGoogle+\t254714\n章鱼小丸子\t254715\n_海\t254716\n串音\t254717\n撸射\t254718\n红月亮\t254719\n该机\t254720\nwoo\t254721\n无压\t254722\n上下川岛\t254723\n1850\t254724\n2018-03-04\t254725\ndeveloper_Kale\t254726\nAccess\t254727\n万车网\t254728\n欲说还休\t254729\n中钢协\t254730\n早城\t254731\n上一层\t254732\n中德安联人寿保险有
限公司\t254733\n0.45um\t254734\n金控\t254735\nスマ\t254736\n550米\t254737\n泉舜\t254738\nAlexa排名查询|世界\t254739\n孙大圣\t254740\n比如何\t254741\n掠爱\t254742\n鸿山镇\t254743\n手冷\t254744\n木桥\t254745\n南京长江大桥\t254746\n连写\t254747\n年柱\t254748\n寂静\t254749\n中城市\t254750\n大话三国\t254751\noverrun\t254752\n2016年元旦\t254753\n品味\t254754\n菏泽\t254755\n妖气漫画网\t254756\n金山桥开发区\t254757\n魔弹之王与战姬\t254758\nRevelation\t254759\nomar\t254760\n万可\t254761\n生物科技公司\t254762\nSwear\t254763\n漆艺\t254764\nbjb\t254765\n成都市环境保护局\t254766\nDDC\t254767\nx4简体中文\t254768\n巢穴\t254769\n美蛙鱼头\t254770\nLoca\t254771\n江门\t254772\n龙胜县\t254773\n南京市中西医结合医院\t254774\n新能\t254775\n深谈\t254776\n满园\t254777\n圣手\t254778\n平溪\t254779\nMultimedia\t254780\n乐色\t254781\n帝景湾\t254782\n岸线\t254783\n从来都没有\t254784\n苏昱擎\t254785\n陈纪修\t254786\n张剑飞\t254787\n六神无主\t254788\n长江东路\t254789\n竭诚\t254790\n任一点\t254791\npythonista\t254792\n吉祥铝塑板\t254793\nAltera\t254794\n竞翔\t254795\n中国建筑总公司\t254796\n变者\t254797\nHSS\t254798\n北大\t254799\n脑炎\t254800\n幅面\t254801\n义线\t254802\n算错\t254803\n宜兴人才网\t254804\ndollars\t254805\n爱德华·诺顿\t254806\n博路定\t254807\n逗子\t254808\n报复\t254809\n富营养化\t254810\nMinimum\t254811\n摇手\t254812\n长乐市\t254813\n圆杆\t254814\n本集\t254815\n鲍尔芬\t254816\n5杯\t254817\n横格\t254818\n名人传\t254819\ncalibrate\t254820\n圣咏\t254821\n雒树刚\t254822\nmark\t254823\n张禧嫔\t254824\n四年级语文下册\t254825\n新奔奔\t254826\nLonger\t254827\nsuis\t254828\n倡议书\t254829\n多啦\t254830\n礼卡\t254831\n劳碌\t254832\ndruid连接池\t254833\nsdcc\t254834\n安委\t254835\n顺德港\t254836\n雀梅\t254837\nnohttp\t254838\n伦理片\t254839\n第15周\t254840\n动物体\t254841\n越野胎\t254842\n方生\t254843\nzhs\t254844\n春雨惊春清谷天\t254845\nbim5d\t254846\nheydouga4017\t254847\n1号店\t254848\n老伙伴\t254849\nMVBOX\t254850\n林平之\t254851\nmarvis\t254852\n鬼故\t254853\n行员\t254854\nGeneral\t254855\n三叠\t254856\n乔西\t254857\n漏水谁负责\t254858\n奥威\t254859\n福州海关\t254860\n辣妹\t254861\n86726160\t254862\n不可调\t254863\n电流表\t254864\n蛙式打夯机\t254865\n众泰SR7\t254866\n禁酒令\t254867\nsafe\t254868\n包装罐\t254869\nusername\t254870\n农友\t254871\n春樱\t254872\n鸟简笔画\t254873\n阿力\t254874\n栏杆\t254875\n锦绣军婚\t254876\n佩夫人\t254
877\n2处\t254878\n重餐饮\t254879\n摄影包\t254880\nxy助手\t254881\n扎比\t254882\n宿迁人才网\t254883\n湖苏沪\t254884\n万界天尊\t254885\n800千克\t254886\n华影\t254887\n赵梦\t254888\n侏罗纪公园\t254889\n菲利克斯\t254890\n10.1.7\t254891\n停偿\t254892\n岳飞\t254893\n日程管理\t254894\n教育费附加\t254895\n用心棒\t254896\nNexus6\t254897\n祥源控股集团有限责任公司\t254898\nTI5\t254899\n马家岩\t254900\n金标\t254901\n极好\t254902\ninflater\t254903\niBeiKe\t254904\n纯C语言\t254905\n丧尸围城2\t254906\nfrequencies\t254907\n艾晓星际战甲\t254908\n坏坏小娇妻\t254909\n冰棍儿\t254910\n曼尼托巴大学\t254911\nL455\t254912\n苹果社区\t254913\n锯机\t254914\n腾讯微云\t254915\n宗地图\t254916\n老白汾酒\t254917\n何家劲\t254918\n声临\t254919\n一飞冲天\t254920\napi64.dll\t254921\n几排\t254922\n不改\t254923\n卡拉斯科\t254924\n金芽\t254925\n云南省图书馆\t254926\n亭亭\t254927\n拓麻\t254928\n经典福克斯\t254929\n逃学英雄传\t254930\n东厂西厂\t254931\n320mmgh\t254932\n锐博\t254933\n地球脉动\t254934\nALTER\t254935\n|Sony/KM/M\t254936\n小票机\t254937\nIonic2\t254938\n6N\t254939\n宇通\t254940\n比划\t254941\n华泰保险\t254942\n海德汉\t254943\n中国电力建设股份有限公司\t254944\n百度基金\t254945\n龙谷\t254946\n魔法门之英雄无敌3\t254947\ncolourpop\t254948\n圆梦金\t254949\n巨贾\t254950\ncharges\t254951\n分配\t254952\n长沙移动\t254953\n余淼杰\t254954\nexfat\t254955\n发改\t254956\n穷逼\t254957\n广州银行\t254958\ngroups\t254959\nIT经理网\t254960\n精友\t254961\n龙馨\t254962\nFinger\t254963\nPPCN\t254964\n生化危机2\t254965\n禁用语\t254966\n请笑纳\t254967\n0.96寸\t254968\n武周\t254969\n9.15\t254970\nCCA\t254971\n搀\t254972\n河南省统计局\t254973\n赎楼\t254974\n禧玛诺\t254975\n3^x\t254976\n681\t254977\n裸体版\t254978\n秋意浓\t254979\nCONGRESSIONAL\t254980\n见鬼十法\t254981\n香舌\t254982\nATP\t254983\n琴叶榕\t254984\n红豆集团\t254985\n隆达\t254986\nOpening\t254987\n劝酒\t254988\n0566\t254989\n246个\t254990\n闪红灯\t254991\n10W\t254992\n奇策\t254993\n活瓷\t254994\n王村\t254995\n维护税\t254996\n生活节\t254997\n美狮\t254998\n海吉星\t254999\n阻火器\t255000\n51费宝网\t255001\nTEL\t255002\n电子承兑\t255003\n不服帖\t255004\nfect\t255005\n常阳\t255006\n关汉卿\t255007\n新南威尔士州\t255008\n瞎扯\t255009\nl383\t255010\n存量\t255011\nジャパン\t255012\n案外人\t255013\n周西\t255014\n天津乐居网\t255015\n平仄\t255016\n磊哥\t255017\n华海亲子鉴定\t255018\n萌犬好声音\t255019\n南昌市教育局\t255020\n张铁民
\t255021\n滑点\t255022\n执惠\t255023\n60倍\t255024\n环保衣\t255025\n上海三菱电梯\t255026\n维他命水\t255027\n永恒之盘\t255028\n码步\t255029\n和林县\t255030\n积德\t255031\n52P\t255032\n灭世\t255033\n韩国菜\t255034\n徐钢\t255035\nrestapi\t255036\nsalah\t255037\n佳子\t255038\n雾感\t255039\n范世錡\t255040\nipsum\t255041\n五粮液股份\t255042\n2015年7月1日\t255043\n急诊室故事\t255044\n坂崎\t255045\n温州教育教学研究院\t255046\n030\t255047\n裙子\t255048\n肛门瘙痒\t255049\n自扰\t255050\n刘镇伟\t255051\n日本住友\t255052\ncooper\t255053\n普瑞\t255054\n铁塔公司\t255055\nweekends\t255056\n总厅\t255057\n20170807\t255058\n骨骼肌\t255059\n艾兰沃克\t255060\n5577安卓网\t255061\nIstio\t255062\n14周\t255063\n绫濑成美\t255064\n东京食尸鬼re\t255065\n凤阳县人民政府\t255066\n强吻\t255067\n先锋派\t255068\n前驱\t255069\n美容师证\t255070\n14.1.0\t255071\n粘虫\t255072\n集货\t255073\n一夫当关\t255074\n新模型\t255075\n游戏吧手游网\t255076\n慈善赛\t255077\n霍尔顿\t255078\n对外贸易\t255079\nengaged\t255080\n三坛\t255081\n风再起时\t255082\n中脉美体\t255083\nBased\t255084\n51Ape.Com无损音乐网\t255085\n孙桥镇\t255086\n谱尼\t255087\n沈晓明\t255088\n秦皇岛野生动物园\t255089\nshelve\t255090\n排球女将\t255091\n陈永红\t255092\n一个半小时\t255093\n模块\t255094\n盖普\t255095\n赚到钱\t255096\n著者\t255097\ndb4\t255098\n达思\t255099\n华梦\t255100\n补换\t255101\n潜山路\t255102\n网易新闻_网易网\t255103\n互通版\t255104\nSwiper2|Swiper中文网\t255105\n2012-2013年度\t255106\n爱菊小学\t255107\n泽雷\t255108\n吉利德\t255109\n斐波那契数列\t255110\n继发性高血压\t255111\n逻辑门电路\t255112\n3分之2\t255113\n回收期\t255114\n全血粘度\t255115\n人人健康\t255116\n中国建材集团有限公司\t255117\n飘泊\t255118\n八扇屏\t255119\n一帆\t255120\n公积金贷款\t255121\n辐条\t255122\n如何是好\t255123\n周琳\t255124\n支个招\t255125\n帝隆\t255126\n高新北区\t255127\n七天酒店\t255128\n皇爷\t255129\n动态合并单元格\t255130\n20180320\t255131\nTOREAD\t255132\n狩猎场\t255133\n骏梦\t255134\nLDT\t255135\n贺村镇\t255136\n无籽葡萄\t255137\n农民\t255138\n刘阿姨\t255139\n12306.cn\t255140\n⒉\t255141\nV93\t255142\n王震\t255143\n王青\t255144\n拍砖\t255145\ncolor\t255146\n6.7分\t255147\nI2000\t255148\n百越\t255149\n楼栋号\t255150\n刘颖\t255151\n2014.11\t255152\nidrac\t255153\n小彩\t255154\n国润\t255155\n观测者\t255156\ngotta\t255157\n瑞年国际\t255158\nWarcraft\t255159\n马莉\t255160\n1000章\t255161\n物流\t255162\n搜狐快站\t255163
\n中海达\t255164\nHUA\t255165\n8.36.0.2\t255166\n4888\t255167\n循环系统\t255168\n甲米机场\t255169\n10.01\t255170\nsl4j\t255171\n巨硬\t255172\n风流书呆\t255173\n瘦人\t255174\nvpn\t255175\nBMI指数\t255176\n确成\t255177\n金砖会议\t255178\n三立\t255179\n上海浦东丽思卡尔顿酒店\t255180\n跷跷板\t255181\ntedu\t255182\n1寸\t255183\nfm2016\t255184\n地才\t255185\n改低\t255186\n20180402\t255187\n壁柱\t255188\nAnastasia\t255189\n紫荆广场\t255190\n逐项\t255191\n8610\t255192\n滤芯\t255193\nhundreds\t255194\n咲良田\t255195\n预制场\t255196\n书风\t255197\n秦玥飞\t255198\n范涛\t255199\n侯波\t255200\n兵龄\t255201\n王清宪\t255202\n金域蓝湾\t255203\n风云路\t255204\n周谷堆\t255205\n石种\t255206\n芬必得\t255207\ntk\t255208\n维克多弗兰\t255209\n呛奶\t255210\n52型\t255211\n北大青鸟环宇消防设备股份有限公司\t255212\n加德满都\t255213\n风子\t255214\nft3\t255215\n白炽\t255216\n远方\t255217\n抢道\t255218\n中车四方车辆有限公司\t255219\nghostxp\t255220\n对劲\t255221\n奇葩大会2\t255222\nvisiting\t255223\n磨刀棒\t255224\n车泵\t255225\nAcronis\t255226\n遥控汽车\t255227\n12s\t255228\n雾矢翊\t255229\n暴走无双\t255230\n轩辕剑群侠录\t255231\n玉蝶\t255232\n吴健雄\t255233\n蒂姆·库克\t255234\nkkr\t255235\n铁石\t255236\nretail\t255237\n金久哲\t255238\nsakula\t255239\n大石镇\t255240\n澳门塔\t255241\n唐山北\t255242\n江宁春牛首国际马拉松赛\t255243\n税企\t255244\n喜达屋酒店\t255245\n童彤\t255246\n心理测验\t255247\n章盖\t255248\n麻溜\t255249\njoel\t255250\n秒懂\t255251\n华姐\t255252\n摇篮\t255253\n隆曦\t255254\n10元\t255255\n基因\t255256\n中国人民解放军海军\t255257\n不辞\t255258\n藏蓝\t255259\nbp\t255260\n库丘林\t255261\n奥迪双钻\t255262\n足球鞋\t255263\n口袋妖怪xy\t255264\n间之楔\t255265\n虎骨酒\t255266\n单精度\t255267\n汽车商业保险\t255268\n3000包\t255269\n玫丽\t255270\n北京诺金酒店\t255271\n影星\t255272\n硕导\t255273\n圣妖\t255274\n音乐圈\t255275\n欧月\t255276\n江苏联通\t255277\nmhx\t255278\ndashubao\t255279\ninstructor\t255280\n见方\t255281\nf-547859\t255282\nZi\t255283\n息\t255284\n庐山云雾茶\t255285\nsce\t255286\n朝田诗乃\t255287\n紫外灯\t255288\n中科金财\t255289\n肚子饿\t255290\n北京市高级人民法院\t255291\n浦发银行信用卡中心\t255292\n150厘米\t255293\n东环一路\t255294\n法兰圈\t255295\nVers\t255296\n云呼叫中心\t255297\n管狐\t255298\n白牌\t255299\n第8批\t255300\n长沙白癜风医院\t255301\n噐\t255302\n中高职\t255303\n京港澳高速公路\t255304\njihua\t255305\n4.1.9\t255306\nHosts\t255
307\nweb3.js\t255308\n锅边\t255309\n颜仟汶\t255310\n第十讲\t255311\n职会\t255312\n结合性\t255313\n政府党组\t255314\n宁波市委\t255315\n口袋妖怪x\t255316\n株洲市公安局\t255317\n碧蓝幻想\t255318\n术士\t255319\n动名词\t255320\n105岁\t255321\n维意\t255322\n慈溪市人民政府\t255323\n阅读机\t255324\n上海震旦职业学院\t255325\nigex\t255326\n经管\t255327\n拦车\t255328\n本田锋范\t255329\n天健社\t255330\n不减当年\t255331\n巴城镇\t255332\n海马蜀山战纪2踏火行歌\t255333\n刘晓\t255334\n捐书\t255335\n格式工厂吧_\t255336\nMFC-7470D\t255337\n吉林大学中日联谊医院\t255338\n第64条\t255339\nrefresh\t255340\n水片\t255341\n电子计时器\t255342\n萨顶顶\t255343\n奶油奶酪\t255344\n吸货\t255345\n高天上流云\t255346\n五十年前\t255347\n禾木\t255348\n预混剂\t255349\n三亚原野映像婚纱摄影\t255350\n文泉驿\t255351\n格式条款\t255352\n营利性\t255353\nKm\t255354\n丙三醇\t255355\n文化宫\t255356\nCST\t255357\n钢联\t255358\n千锋教育\t255359\n鸡瘟\t255360\n武内直子\t255361\neggs\t255362\n34元\t255363\n肛门镜\t255364\niWebShop\t255365\n孤胆车神新奥尔良\t255366\n图客\t255367\n城步苗族自治县\t255368\n张明\t255369\n小胡子\t255370\n德亚\t255371\n银联数据服务有限公司\t255372\n止水阀\t255373\n谢了\t255374\n康正\t255375\n佐丹奴\t255376\n雅邦\t255377\nS赛\t255378\n职类\t255379\n荠菜花\t255380\n曹杨新村\t255381\nfami通\t255382\nGAP\t255383\n蒲园\t255384\nEcharts\t255385\nPACO\t255386\n异食癖\t255387\n任用\t255388\n6100元\t255389\n硬解码\t255390\n花市网\t255391\n6PIN\t255392\n神机\t255393\n管理\t255394\nvietnam\t255395\n张大力\t255396\n人工计划\t255397\nbonbon\t255398\n高佑思\t255399\n训练所\t255400\n观看\t255401\n丑逼\t255402\n鸽子汤\t255403\n狩猎\t255404\n趣阁\t255405\nzvs\t255406\n东奥注会\t255407\nbd版\t255408\nticket\t255409\n大元帅\t255410\n棒棒糖\t255411\n校园广播站\t255412\n雪莉露\t255413\n乔帮主\t255414\n宋隆\t255415\n县直\t255416\nPCV\t255417\n展业\t255418\nD11\t255419\n北京市首都公路发展集团有限公司\t255420\n眼胶\t255421\nDH\t255422\n山岚\t255423\nWebServer\t255424\n000677\t255425\n奋战\t255426\nEXCEL函数\t255427\n恒大\t255428\nhiker\t255429\nQQ绿钻\t255430\nmsvcp110.dll\t255431\nsont\t255432\n误入\t255433\n负数\t255434\n步进\t255435\n为人父\t255436\n法律援助网\t255437\n尘土\t255438\n移动安全网\t255439\n陈源\t255440\n三帝\t255441\nKITH\t255442\n22期\t255443\n天祥尚府\t255444\n第20个\t255445\nshowms\t255446\n大友克洋\t255447\n银监局\t255448\n菩提寺\t255449\nsqlalchemy\t255450\nUniform
s\t255451\n七周\t255452\n军武\t255453\n北京交通一卡通\t255454\nbrain\t255455\n纤维状\t255456\n鬼楼\t255457\nVCam\t255458\nucr\t255459\n第14批\t255460\n张木易\t255461\n正月初九\t255462\n审计软件\t255463\nAppium\t255464\nBS\t255465\n小螃蟹\t255466\n1696\t255467\n成奎安\t255468\n天津贵金属交易所\t255469\n倒带\t255470\n金口镇\t255471\n双调\t255472\n宝钗\t255473\n河南日报报业集团\t255474\n四十位\t255475\n网带\t255476\n2.2.7\t255477\n吸音板\t255478\n苗疆蛊事\t255479\nhostile\t255480\n八岐\t255481\nSimple\t255482\nqq飞车手游\t255483\n童程\t255484\n通信类\t255485\n河南油田\t255486\n109魔方寸\t255487\n山姆\t255488\n路中\t255489\n天无日天天射天天视\t255490\nX30\t255491\n沈曼\t255492\n汉化补丁v3.0\t255493\ncapp\t255494\n三角包\t255495\n鹅头\t255496\n胃苏颗粒\t255497\n9.8折\t255498\n周莉莉\t255499\n_村美小站\t255500\n绿能宝\t255501\n上海教育人才网\t255502\n第5页\t255503\n硕士学\t255504\n龙川县\t255505\n新世界花园\t255506\n赛历\t255507\ntomatoes\t255508\n风工\t255509\n拖库\t255510\n星术\t255511\n心碎逍遥_\t255512\n古玩城\t255513\n无线鼠标\t255514\n简况\t255515\n常剑雄\t255516\n工件\t255517\n二多少\t255518\n锌合金压铸件\t255519\n平安万里通\t255520\n徐本禹\t255521\nposters\t255522\nbex5\t255523\nhsl\t255524\n从今日起\t255525\npud\t255526\n滚转\t255527\n信义嘉御山\t255528\n真情实意\t255529\n啰\t255530\nchic\t255531\n72名\t255532\n紫鑫\t255533\n西塔\t255534\n性交会\t255535\n2018年03月27日\t255536\n危险源\t255537\n房产证抵押贷款\t255538\n梦中的额吉\t255539\n果茶\t255540\n不怕苦\t255541\n抚顺热高乐园\t255542\nwinows\t255543\nMarginal\t255544\n苒苒\t255545\n青铜局\t255546\n寵物\t255547\n挚友\t255548\n光度学\t255549\nmini90\t255550\n企业愿景\t255551\n增城中学\t255552\n洋画\t255553\n泰兴镇\t255554\n陈总\t255555\nHope\t255556\n5506\t255557\n6倍\t255558\n县信访局\t255559\ncdr\t255560\n第二十三章\t255561\n进餐\t255562\n乡味\t255563\npmx\t255564\n最小者\t255565\n15v\t255566\n电加热\t255567\ntabl\t255568\n纵横公考\t255569\njolly\t255570\n大马士革钢\t255571\n巴尔特\t255572\n花石镇\t255573\n通识课\t255574\n王维维\t255575\n摘牌\t255576\n荞麦皮\t255577\n东海大道\t255578\n宏发\t255579\n多疑\t255580\n谈何容易\t255581\n丰泽\t255582\n第55\t255583\n安陆\t255584\n枭士\t255585\n多极\t255586\n智能家\t255587\n老练\t255588\nS8/S8+\t255589\n7升\t255590\nMacX苹果网\t255591\n太和医院\t255592\n林格伦\t255593\nD7500\t255594\n吴建国\t255595\n五员\t255596\n输灰\t25559
7\n嘉兴市商务局\t255598\nトビ姫\t255599\npinder\t255600\n12个\t255601\n蔡宁\t255602\n66P\t255603\n固定表\t255604\n苏州洲际酒店\t255605\n肺肿瘤\t255606\nITtecman\t255607\n三情\t255608\n条扣\t255609\n10002\t255610\n大姓\t255611\n敌机\t255612\n晶钻\t255613\npeanuts\t255614\n柏岚\t255615\nMsSql\t255616\n畈\t255617\n翡丽公园\t255618\n广东外语艺术职业学院\t255619\n番组\t255620\n选项板\t255621\nsjcam\t255622\n手柄\t255623\n深房\t255624\n中级经济法\t255625\n伊丽莎白·巴托里\t255626\n劲丽\t255627\n哈莉\t255628\nf578\t255629\n西安市教育局\t255630\n辛卯日\t255631\nwin7系统主题\t255632\n成都白癜风医院\t255633\n小米Note\t255634\n1873年\t255635\n挺立\t255636\n1200mm\t255637\n可解\t255638\n企领网\t255639\n一圈半\t255640\n11场\t255641\n大场\t255642\nonex\t255643\nUSING\t255644\n张含\t255645\n付英\t255646\n第139章\t255647\nluke\t255648\n镜界\t255649\n收取\t255650\n大头菜\t255651\n中单\t255652\n乐界\t255653\nORA-00942\t255654\n文化交流会\t255655\n数字联\t255656\n安书\t255657\n工艺性\t255658\n累加\t255659\n遁入\t255660\n第15天\t255661\nendorse\t255662\n姨太太\t255663\n读条\t255664\n周启动\t255665\n高亚楠\t255666\n天元区\t255667\n繁荣\t255668\n中昌\t255669\n九里山\t255670\n装单\t255671\n雅斯贝尔斯\t255672\n包山\t255673\n噪音\t255674\n付守东\t255675\n莫雷\t255676\n小猪短租_旅游出行类小猪短租\t255677\n华为云空间\t255678\n白牙\t255679\n幕后玩家\t255680\n耶哥蕊特\t255681\n二手车交易网\t255682\n管理所\t255683\n聂海胜\t255684\n金地悦江\t255685\n熏儿\t255686\n20150531\t255687\n消费贷款\t255688\n喜家德水饺\t255689\n墓群\t255690\n12点钟\t255691\n主题\t255692\njk\t255693\n绝品\t255694\n过失犯罪\t255695\n卢布\t255696\n2014级\t255697\nInternet协议\t255698\nIKAnalyzer\t255699\nPermSize\t255700\n莘庄镇\t255701\n天王表\t255702\n河北医科大学\t255703\n箫曲\t255704\n音符盒\t255705\n半梦\t255706\n老母鸡汤\t255707\nProliant\t255708\n湖北三峡职业技术学院\t255709\n臣子\t255710\n蒙大拿\t255711\n陈哈哈\t255712\n陈放\t255713\n早些\t255714\nRAIL\t255715\n锐珂\t255716\n选举日\t255717\n北京日上免税店\t255718\n720P/1080P高清\t255719\nregs\t255720\n泼溅\t255721\n己二酸\t255722\n硝基苯胺\t255723\n好又快\t255724\n1600万吨\t255725\ntarjan\t255726\n潘松\t255727\n秋子\t255728\nonlylady\t255729\n宋宁\t255730\n高睿\t255731\nhelena\t255732\nIng\t255733\nactran\t255734\n横店集团\t255735\n中国海峡人才网\t255736\n清河坊\t255737\n文部省\t255738\n看得见\t255739\n胶箱\t255740\n汉祚高门\
t255741\n21克拉\t255742\nsaucony\t255743\nzto\t255744\n施工\t255745\n92电影网\t255746\n威灵\t255747\nexcal表格\t255748\n宁波动物园\t255749\n烟消云散\t255750\n雷狼\t255751\nshuan\t255752\n80x80\t255753\n耗量\t255754\n225\t255755\n外审员\t255756\nBoom\t255757\n叽咕\t255758\n香槟塔\t255759\n泽普\t255760\n心游之域\t255761\n八万元\t255762\n古猗园\t255763\n20安\t255764\n相头\t255765\n数据库服务器\t255766\n娱乐怪讯网\t255767\n恶警\t255768\n眼灯\t255769\n天津建设网_天津市城乡建设委员会\t255770\n邰智源\t255771\nBalvin\t255772\n王劲\t255773\n李智远\t255774\n红票\t255775\n闪金镇\t255776\n开角型青光眼\t255777\n戊申日\t255778\n石莲花\t255779\n360卫士\t255780\n休学\t255781\n闭园\t255782\n丽景湾\t255783\n排名\t255784\n11t\t255785\n〕\t255786\n菁蓉镇\t255787\n8705\t255788\nbaiduyun\t255789\n福特探险者\t255790\n李斯丹妮\t255791\n后汉\t255792\nEI\t255793\n20gp\t255794\n忍冬\t255795\n周正毅\t255796\n韩国城\t255797\n瓷爷\t255798\n特奥\t255799\n青岛北\t255800\n变量泵\t255801\n43本\t255802\n溜\t255803\n叔丁基\t255804\n杨迪\t255805\n民规\t255806\n中山国\t255807\n世华地产\t255808\n风貌\t255809\nshinhwa\t255810\n地址符\t255811\nshibor\t255812\n平江新城\t255813\n14天\t255814\n原创文\t255815\n珠海妇幼保健院\t255816\nGlove\t255817\n后档\t255818\n草堂寺\t255819\n勉力\t255820\nKnees\t255821\n汤山\t255822\n嘉州路\t255823\n5511\t255824\n晚7点\t255825\ncherry-pick\t255826\n植筋胶\t255827\n母耳\t255828\n6.0.53\t255829\n准题库\t255830\nDelight\t255831\n硝子\t255832\n三国演\t255833\n后摇\t255834\nRX8\t255835\n岗山\t255836\n盛瑞\t255837\nsoh\t255838\n古城村\t255839\nliulun\t255840\n地球公转\t255841\n一正一\t255842\n酹江月\t255843\n玻璃\t255844\n和平新村\t255845\n樱花树下\t255846\n硫氰酸钠\t255847\n莪术\t255848\nswich\t255849\n朵拉历险记\t255850\nSISIMO\t255851\n路易九世\t255852\n红色高跟鞋\t255853\n飞行棋\t255854\n640分\t255855\n小小恋歌\t255856\n纽荷兰\t255857\n婚变\t255858\n达娜\t255859\n超差\t255860\n借名\t255861\n优惠证\t255862\n大宝法王\t255863\nBAO\t255864\n112个\t255865\n李承鹏\t255866\n怎办\t255867\n南京财经大学会计学院\t255868\nGeoTools\t255869\nasdf\t255870\n远到\t255871\n12世纪\t255872\n区教委\t255873\nbarefoot\t255874\nGACHA\t255875\nnetes\t255876\nNightMa\t255877\n明治神宫\t255878\nFMOD\t255879\n布尔津\t255880\nav漆\t255881\n普崔塞德\t255882\n武汉中医院\t255883\n网游西厢记\t255884\n热线\t255885\n鸿园\t255886\n双色球|福彩
3d|排列三|大乐透\t255887\n驱鬼\t255888\n陈光王阳明\t255889\n新城路\t255890\n送号\t255891\nPalo\t255892\n董月月\t255893\n荒岛求生记\t255894\nKOF\t255895\n馆区\t255896\n柳智惠\t255897\n飞豹\t255898\n分辨\t255899\n湖南省交通运输厅\t255900\n血球\t255901\nHERO\t255902\n气丹\t255903\n风景简笔画\t255904\n波胆\t255905\n磁性套索工具\t255906\n圈养\t255907\n中国标准化研究院\t255908\n苦菊\t255909\n杜娟\t255910\nv8.3\t255911\n热容比\t255912\nbanner\t255913\nbec\t255914\n十里河\t255915\n伊藤润二\t255916\ncanonical\t255917\n40余家\t255918\n清楚\t255919\n双歧杆菌\t255920\n卿本风流\t255921\n公示板\t255922\n肝穿刺\t255923\n王旭光\t255924\n西西里岛\t255925\n我的绝地求生\t255926\n42例\t255927\n神仙传\t255928\n线性回归算法\t255929\nusb3.1\t255930\ntwt\t255931\n潘之琳\t255932\nOK镜\t255933\nwic\t255934\nwoody\t255935\nPancake\t255936\n150分\t255937\n枯死\t255938\n巴布豆\t255939\n639亿\t255940\n1:50\t255941\n公信力\t255942\n汉氏\t255943\n杨锦荣\t255944\nFASTA\t255945\n晚晴\t255946\n共眠\t255947\n机器鱼\t255948\n逆世界\t255949\n字体包\t255950\n大智路\t255951\n王浩\t255952\nS型\t255953\n11.8\t255954\n世界树\t255955\n中国卫生人才网\t255956\n杭银\t255957\nIPAD3\t255958\n党工委\t255959\nAe\t255960\nSponge\t255961\n三沙镇\t255962\n淘大\t255963\nTFZ\t255964\n大黄鱼\t255965\n东乡县\t255966\n电规\t255967\n咖啡店\t255968\n蓝营\t255969\n贝恩公司\t255970\n高根\t255971\niD\t255972\nxslx\t255973\n枪套\t255974\nv2\t255975\ng633\t255976\n侨都\t255977\n下期\t255978\n瓜叶菊\t255979\n三端\t255980\n济南新东站\t255981\nmatrices\t255982\njpa\t255983\n熊肉\t255984\n联合党\t255985\n浪琴机械表\t255986\n石楼县\t255987\n独生\t255988\n活干\t255989\n张洪波\t255990\n唐一菲\t255991\n电子档\t255992\n给你看\t255993\n差\t255994\n筱原\t255995\n凤舞九天\t255996\n3.8.2\t255997\n第10集\t255998\n车载充气泵\t255999\n厦门六中合唱团\t256000\nNBA火箭队\t256001\nJavabean\t256002\n萎缩性鼻炎\t256003\n蔻蔻网\t256004\n246zl\t256005\n冠道论坛_汽车之家论坛\t256006\n14.0英寸\t256007\nEmail\t256008\n选美\t256009\n乳腺肿块\t256010\n逼迫\t256011\n七七空间\t256012\n仁和区\t256013\n西安博物院\t256014\npears\t256015\n每块\t256016\n重庆政府\t256017\n启通\t256018\n汇总篇\t256019\n张彪\t256020\n中华人民共和国保守国家秘密法实施条例\t256021\n印花裙\t256022\n直式\t256023\n茅根\t256024\n0287\t256025\n御苑\t256026\n69元\t256027\nЮ\t256028\n第30个\t256029\n方正品\t256030\niOS9.3\t256031\n30组\t256032\n7
40m\t256033\njenkin\t256034\n韩庄村\t256035\n乐景\t256036\n神秘园\t256037\n自动焊机\t256038\n邪眼\t256039\n婚路\t256040\n6236\t256041\n串口\t256042\n叶叶\t256043\n15套\t256044\n乜嘢\t256045\n着色剂\t256046\n福佳白\t256047\nLeila\t256048\n航点\t256049\n海域\t256050\nDomain\t256051\n黄网\t256052\n谋士\t256053\n王亚\t256054\n175个\t256055\n3811\t256056\n佛山市教育局\t256057\n秦武王\t256058\n海龙屯\t256059\n5.88\t256060\n王童\t256061\nlinxu\t256062\n李文军\t256063\n激萌相机\t256064\n杀破狼1\t256065\n心配\t256066\nListe\t256067\n烟薯\t256068\n山东科学技术出版社\t256069\n双百计划\t256070\n二二六\t256071\n绿岛\t256072\n营养科\t256073\n茵蒂克丝\t256074\n18米\t256075\n绝地逃亡\t256076\nqzrc\t256077\ncorresponds\t256078\n肌\t256079\n查谱网\t256080\n英语文化\t256081\nzhongxing\t256082\n六发\t256083\n比较好\t256084\n265G传奇霸业\t256085\n眉角\t256086\n腾讯分分彩\t256087\n液化气灶\t256088\n光棍树\t256089\n戴森\t256090\n锯材\t256091\n海淀区教委\t256092\nCOMPLEX\t256093\n献花\t256094\n荆楚网\t256095\nmetatrader\t256096\n2017年11月24日\t256097\n古水北镇\t256098\n黄宏\t256099\n绝地求生录像\t256100\n鸿茅药业\t256101\ntear\t256102\n四川蓝光发展股份有限公司\t256103\n富力天禧城\t256104\npeep\t256105\n经典区\t256106\n8.1\t256107\nEditplus\t256108\n肝肠寸断\t256109\n她俩\t256110\niterative\t256111\n13台\t256112\n郭志坚\t256113\nFOG\t256114\n八二年\t256115\n环市西路\t256116\n苦荞麦\t256117\n富思特\t256118\n蛇口线\t256119\nkindle阅读器\t256120\n南澳县\t256121\n246\t256122\n宮\t256123\n月光族\t256124\n细菌菌落\t256125\nRTF\t256126\nFlash8\t256127\n拓词\t256128\n黑切\t256129\n灰木纹\t256130\n挡边\t256131\n哈啰\t256132\nwestworld\t256133\n周转量\t256134\n梁惠王\t256135\n五服\t256136\n王者造梦西游\t256137\n山东大学医学院\t256138\n多义词\t256139\n青岛电视台\t256140\n软料\t256141\n发电板\t256142\n李永忠\t256143\n举报信\t256144\n小铃\t256145\n最终幻想4\t256146\nCs1.6\t256147\n四面山\t256148\n和堂\t256149\n抗风\t256150\n头端\t256151\n电导率仪\t256152\n娱记\t256153\nhtml4\t256154\n山东新闻\t256155\n惠州市区\t256156\n生产部\t256157\n区审计局\t256158\n6215\t256159\n公切\t256160\n多杆\t256161\n龙行\t256162\n天津公安局\t256163\nlifeline\t256164\n香港会议展览中心\t256165\nCifar\t256166\nPIANO\t256167\n瓦萨\t256168\n中视版\t256169\n第2章\t256170\n中债资信评估有限责任公司\t256171\n水化\t256172\nMerging\t256173\n报告集\t256174\n潮州市潮安区人民政府\t256175\n大神级\
t256176\n行径\t256177\nPricing\t256178\n文字稿\t256179\n陈晓玲\t256180\n3165\t256181\nLocalhost\t256182\n蒋子龙\t256183\n姜华\t256184\n象鼻山\t256185\n麻袋理财\t256186\n溶瘤病毒\t256187\n速聘\t256188\nprogrammer\t256189\n刷天\t256190\n潜污泵\t256191\na555l\t256192\n三杯酒\t256193\n精索鞘膜积液\t256194\n审问\t256195\n环保局\t256196\n亿田集成灶\t256197\n一捆\t256198\n中达电通\t256199\nworkdsz\t256200\n所求\t256201\npremises\t256202\n100L\t256203\n嫡庶\t256204\n沱茶\t256205\n旅路\t256206\n阻扰\t256207\n八十首\t256208\n张靓颖\t256209\n施工荷载\t256210\n紧凑\t256211\n2500t\t256212\n社会工作者\t256213\nroot权限\t256214\n大四\t256215\n统一战\t256216\n药妆\t256217\n6l\t256218\n关小平\t256219\n甘精胰岛素注射液\t256220\nJavamail\t256221\n天天新品网\t256222\n南瓜子仁\t256223\n湖南银河电气有限公司\t256224\n报名处\t256225\n农路\t256226\n大连华锐重工集团股份有限公司\t256227\n喵姐\t256228\n挺身\t256229\n骚骚\t256230\nsql\t256231\n岳池县人民政府\t256232\nvivibear\t256233\n软水器\t256234\n小魏\t256235\n85625667\t256236\n巴彦淖尔市政府\t256237\n还不够\t256238\n大卡车\t256239\n报建\t256240\n冤枉钱\t256241\nhappily\t256242\n葱子\t256243\n100克\t256244\n亲密无间\t256245\nN型\t256246\nThinkCMF\t256247\nafew\t256248\n伊泽瑞尔\t256249\n全友\t256250\nsnacks\t256251\n5.02\t256252\ndsdyjg\t256253\n骚话\t256254\norf\t256255\n小米九号\t256256\n商业贸\t256257\n包装袋\t256258\n小试\t256259\n九本\t256260\ncum\t256261\n天视通\t256262\nqingjoin\t256263\nVS2005\t256264\n小屁孩\t256265\n印第安\t256266\n卡若区\t256267\n硅化木\t256268\n寻梦环游记\t256269\n唯美\t256270\n南诏\t256271\nMatLab\t256272\n冒险村物语\t256273\n胡麻\t256274\n可有可无\t256275\n第3段\t256276\nResilio\t256277\n糖类\t256278\n宿州房产网\t256279\n川美\t256280\nネコ\t256281\n傻逼\t256282\n疲累\t256283\n车机端\t256284\n7k7k奥拉星\t256285\n套口\t256286\n注册消防工程师\t256287\n山东科技职业学院\t256288\n雏菊花\t256289\n51papers.com\t256290\n呛\t256291\n去函\t256292\n憨鼠社区\t256293\n张梁记\t256294\n明晰\t256295\nucbug游戏网\t256296\n非恶意\t256297\n琲世\t256298\n江梦娴连羲皖\t256299\nohio\t256300\n电价\t256301\n刀山\t256302\n宁和\t256303\n前缘\t256304\nconsumption\t256305\n六本木新城\t256306\n紫气\t256307\n运动版\t256308\nVitalik\t256309\nrequire\t256310\n苏米\t256311\n爱我的人和我爱的人\t256312\nzhongguo\t256313\nburgh\t256314\n饱和烃\t256315\n淮阴区政府\t256316\n慨叹\t256317\n呋喃唑
酮片\t256318\n贺新春\t256319\n6.0.2\t256320\n穆晨\t256321\n唐之韵\t256322\n6.3.9\t256323\n定王台\t256324\n沉香豌\t256325\n6.5码\t256326\nbattleeye\t256327\n番鸭\t256328\n光甘草\t256329\n半导体硅片\t256330\n苗家\t256331\nt430\t256332\n博天堂\t256333\nSEXLAB\t256334\n杀戮都市o\t256335\npS\t256336\n中汤\t256337\n防骚扰\t256338\n打麻醉\t256339\n驳回\t256340\n黄埔港\t256341\n姐姐们\t256342\n博乒网\t256343\n注满\t256344\n气阀\t256345\n马莲\t256346\nClaims\t256347\n陈小兵\t256348\n崆峒山\t256349\n3d动漫\t256350\n鼻头\t256351\n65公里\t256352\n所值\t256353\n烤瓷牙\t256354\ntcping\t256355\n斌哥\t256356\n物是人非\t256357\n甘肃省农牧厅\t256358\nAC-DC\t256359\n广州体育职业技术学院\t256360\ncscript\t256361\n奥义\t256362\n狂舞\t256363\n妙笔生花\t256364\nCrosswalk\t256365\n治国理政\t256366\n窗饰\t256367\n察右后旗\t256368\n平安陆金所\t256369\n谷歌搜索引擎网\t256370\n江淹\t256371\n实单\t256372\n首映礼\t256373\n锁号\t256374\n刘卫东\t256375\nOgg\t256376\n聚维酮碘\t256377\nYoungG\t256378\n第一餐\t256379\n3.3.2\t256380\n吓退\t256381\n扒皮吧_\t256382\nexcel格式\t256383\n腾讯游戏道聚城\t256384\n初原\t256385\n环城北路\t256386\n138job.com\t256387\nEasyAR\t256388\n1500美元\t256389\n梓ユイ\t256390\n精细化学品\t256391\n小邱\t256392\n基尼系数\t256393\n佳品\t256394\nMP3播放器\t256395\nDict\t256396\nHamasaki\t256397\n妙招\t256398\n承检\t256399\n第一部分\t256400\n运达\t256401\n亚硫酸盐\t256402\n锐族\t256403\n死亡事故\t256404\n陪打\t256405\n_万\t256406\n英大泰和财产保险股份有限公司\t256407\nONYX\t256408\n南京大学研究生院\t256409\n29分\t256410\n河北站\t256411\n鹏辉能源\t256412\n山东电网\t256413\n二三年\t256414\n邹文琴\t256415\n600000\t256416\nlinux5\t256417\n命猫\t256418\n玛依拉\t256419\n东芝e-STUDIO\t256420\n无锡公积金\t256421\n篇名\t256422\ndnf炽天使\t256423\n請問\t256424\n600567\t256425\n唐凯\t256426\n碧树\t256427\n宁波大学医学院\t256428\n第121章\t256429\n忆术家\t256430\n深圳市计量质量检测研究院\t256431\n朝鲜中央电视台\t256432\n格洛克\t256433\n资源库\t256434\n12堂\t256435\nPPTP\t256436\n杨总\t256437\n丘逢甲\t256438\n乌兰县\t256439\n小熊维尼与跳跳虎\t256440\n社区工作者考试网\t256441\n获\t256442\n返听\t256443\n多美滋\t256444\n画江湖之杯\t256445\n渔利\t256446\n综合题\t256447\n第二十四章\t256448\n备着\t256449\n音腔\t256450\ngdk\t256451\n65平米\t256452\n卢正雨\t256453\ndht11\t256454\n反革命暴乱\t256455\n西郊公园\t256456\n周弘\t256457\n赵献\t256458\nSHSH\t256459\n子目\t256460\n多
维\t256461\n苦求\t256462\n吴优\t256463\ndyned\t256464\n回浦中学\t256465\n5.5.9\t256466\n万科麓山\t256467\n网优fengj.com\t256468\nE5200\t256469\nBQool\t256470\ne盘\t256471\nsiteserver\t256472\n专线\t256473\n35家\t256474\n千蛛\t256475\nRicequant\t256476\n海狗\t256477\n许章润\t256478\n057\t256479\n玫瑰湾\t256480\n醒脾养儿颗粒\t256481\n陈子龙\t256482\nLeigh\t256483\n天澜\t256484\n较轻\t256485\n20160407\t256486\n山鼠\t256487\n开订\t256488\n夜长\t256489\n刹\t256490\n控员\t256491\n到村\t256492\nkoji\t256493\n小考\t256494\n梅园小区\t256495\ncsma\t256496\n小客车指标管理信息系统\t256497\n山水城\t256498\nliy\t256499\n美源\t256500\n水疗\t256501\n点位\t256502\n雷霆行动\t256503\n阿拉德\t256504\n阿牧\t256505\n常州市城乡建设局\t256506\n北京CBD\t256507\n777免费电影网\t256508\nTODAY\t256509\n暖衣\t256510\n多啦A梦\t256511\n沁源\t256512\n爱亲\t256513\n世界小姐\t256514\nVCF\t256515\n知见\t256516\n90公里\t256517\n共色\t256518\n心版\t256519\n生产\t256520\n喜福会\t256521\n迅雷高速\t256522\nh8s\t256523\ndedeCMS\t256524\npulldown\t256525\n签字费\t256526\n龙刺\t256527\n面条\t256528\n无罩\t256529\nchallenger\t256530\n大写意\t256531\n刺客信条:启示录\t256532\n虾\t256533\n讣告\t256534\n汽车时代网\t256535\n秉性\t256536\n展博\t256537\n小甜妻\t256538\n别害羞\t256539\n夜明珠\t256540\n半球盘\t256541\n不方\t256542\n龙游天下\t256543\n拉杆箱\t256544\nnpc\t256545\nSSENSE\t256546\n学警出更\t256547\n市政协\t256548\nOK资源网\t256549\n仙桃影院\t256550\n播报\t256551\n尼克斯\t256552\n牛舌\t256553\n埃因霍温\t256554\nAX\t256555\n测温仪\t256556\n血书\t256557\n凹槽\t256558\n音乐教育专业\t256559\nIllust\t256560\n纳米矿晶\t256561\n辽宁省工商行政管理局\t256562\n半月板\t256563\npeters\t256564\n娃儿\t256565\n祭语\t256566\n中建一局\t256567\n盗用\t256568\nDBD\t256569\n肾功能检查\t256570\n制杀\t256571\n屏无\t256572\naccessory\t256573\n动量\t256574\n莲都区\t256575\n6kg\t256576\n撞钟\t256577\n载\t256578\n幽灵公主\t256579\n51黄页网\t256580\n钻井液\t256581\n218.77\t256582\n喵\t256583\n文创\t256584\n源神\t256585\n等量\t256586\n黎平县\t256587\n野草莓\t256588\n东莞市委\t256589\n电化铝\t256590\n蒿俊闵\t256591\nEcharts-柱状图柱\t256592\n净味\t256593\n蛇骨链\t256594\n魅族MX4\t256595\n信达资产管理公司\t256596\n民义\t256597\n中华人民共和国国家知识产权局\t256598\n受诉\t256599\n铁票\t256600\nselenium-webdriver\t256601\n杰夫·贝佐斯\t256602\n新湖国际\t256603\n数看\t256604\n微信区\t256
605\n丈夫\t256606\n太阳鱼\t256607\n唐绮阳\t256608\n无盐\t256609\n名书\t256610\n财付通支付科技有限公司\t256611\n素然\t256612\n闭关锁国\t256613\n武汉凡谷\t256614\npdf档\t256615\n雷珠单抗\t256616\n烟台市委\t256617\n黑檀\t256618\n玉米片\t256619\n滚动轴\t256620\n北京市医院\t256621\n偏误\t256622\n钟政\t256623\n潜水镜\t256624\n散架\t256625\n空前\t256626\n拍\t256627\nmate7\t256628\n老眼\t256629\nBT种子磁力链接\t256630\nSwedish\t256631\n寄望\t256632\n代招\t256633\n杭州市公安局交通警察支队\t256634\n光伏\t256635\nntdll.dll\t256636\n经法\t256637\n维摩诘经变\t256638\n收工\t256639\nreputation\t256640\n商住地\t256641\n刘诗\t256642\nlolicon\t256643\n水利学院\t256644\nOSM\t256645\n周蓉\t256646\ndfa\t256647\n力行\t256648\nposix\t256649\n哈市\t256650\n现代史\t256651\n深圳大学城图书馆\t256652\n蟾宫曲\t256653\nny\t256654\n茬子\t256655\n宿雾航空\t256656\n山口百惠\t256657\n琪琪女性网\t256658\n柱网\t256659\n正己烷\t256660\nDiner\t256661\nchampion\t256662\n侨鑫集团\t256663\n阴囊炎\t256664\n卢克妮\t256665\nAutoCAD素材\t256666\n失措\t256667\n华风\t256668\ncruises\t256669\nFishing\t256670\ndcdc\t256671\n武汉市第六医院_江汉大学附属医院\t256672\nSqlParameter\t256673\n迈克尔·杰克逊\t256674\ni\t256675\nrumen\t256676\n我的偶像\t256677\n华腾\t256678\n牵黄\t256679\n邢佳栋\t256680\n大运村\t256681\n撸大师\t256682\nhnx\t256683\n月冲\t256684\n列级\t256685\n_币\t256686\nnepal\t256687\n胡娜\t256688\nViolin\t256689\n库库\t256690\n时代大道\t256691\n过年了\t256692\nForest\t256693\n武汉地铁集团有限公司\t256694\nPrestaShop\t256695\n高建军\t256696\n麒麟刺\t256697\n周雪光\t256698\n旅行类\t256699\nORIGIN\t256700\nWISH邮\t256701\n保证保险\t256702\nkilling\t256703\n急性咽炎\t256704\n商保\t256705\n简介\t256706\n数积\t256707\n沪杭高速\t256708\n门限\t256709\n正片影音先锋\t256710\n删去\t256711\n灰条\t256712\n月光如水照心扉\t256713\nlingerie\t256714\n凸轮轴传感器\t256715\n奥斯曼\t256716\n防微杜渐\t256717\n河南省体育局\t256718\n气浮池\t256719\nNETSHOW论\t256720\n個\t256721\n十多年\t256722\nLINE6\t256723\n祝塘\t256724\n永业\t256725\n新禧\t256726\n明德\t256727\n马常胜\t256728\n十堰市委市政府\t256729\n瑞友天翼\t256730\n反运算符\t256731\n内宫\t256732\n情缠\t256733\n友爱\t256734\nSedans\t256735\n前后轮\t256736\n不限速\t256737\n生化危机hd\t256738\nlargest\t256739\n李铭顺\t256740\n说错\t256741\n救火英雄\t256742\n漳州房产网\t256743\nVSCode\t256744\n第25轮\t256745\n莫特\t256746\n可恶\t256747\n
评审会\t256748\n同业\t256749\n苏高中\t256750\nPaints\t256751\nRIVAL\t256752\n擀面皮\t256753\nBF\t256754\n李姗姗\t256755\nRiko\t256756\n财经网\t256757\n北京市交通委\t256758\n松堡\t256759\n慕尼黑工大\t256760\nDiagnostic\t256761\nWitches\t256762\n抖友\t256763\n徐哲\t256764\nprobably\t256765\n热那亚\t256766\nExists\t256767\n迎宾小区\t256768\n德波尼亚\t256769\n先天性肾上腺皮质增生症\t256770\n蝙蝠侠内敌\t256771\nIFNULL\t256772\nipin\t256773\n伏天\t256774\n一大笔\t256775\n徐庆\t256776\nimage2\t256777\niohone\t256778\n聘证网\t256779\n国王的恩赐:黑暗面\t256780\n90年代以来\t256781\n高德置地广场\t256782\n浆粕\t256783\n翀\t256784\nChristoph\t256785\n入职体检\t256786\n伊人影院\t256787\n4串\t256788\n鲁克\t256789\n黄宗艾弗森\t256790\n南韩\t256791\n30ml\t256792\nPVX\t256793\n中华人民共和国生态环境部\t256794\nCAD2012\t256795\n腰部\t256796\n大院长\t256797\n邹城\t256798\n巡视者\t256799\n螳螂虾\t256800\n卢米埃尔\t256801\n黑侠\t256802\n1052\t256803\n复位\t256804\npkp\t256805\n经天纬地\t256806\n核剂\t256807\n三地\t256808\n改联\t256809\n毗湿奴\t256810\n袁冰妍\t256811\n芥末君\t256812\napp开发公司\t256813\n许龄月\t256814\nTreats\t256815\npolestar\t256816\n60余家\t256817\n钟乳石\t256818\n伊基克\t256819\n阵\t256820\n内架\t256821\n理想主义\t256822\nsnippets\t256823\nphilip\t256824\n勤工助学\t256825\nLCD\t256826\n咸湿西游记\t256827\n无偏估计量\t256828\n蓝色光标\t256829\n蟹\t256830\n承办方\t256831\nTOP1\t256832\n香榭里\t256833\nAST\t256834\n凉风\t256835\n一贷\t256836\nasked\t256837\n迅播\t256838\n通假\t256839\nvred\t256840\n市花\t256841\n投拍\t256842\n企业管理创新\t256843\n上海百度\t256844\nExcel达人网\t256845\n灵雨\t256846\n你的最爱\t256847\n华容县\t256848\n周冰倩\t256849\n拳击史\t256850\n红神\t256851\n南塘\t256852\nRobinson\t256853\n花江\t256854\n丫鬟\t256855\nEvisu\t256856\n二十多万\t256857\n汉安\t256858\n中华人民共和国民事诉讼法释义\t256859\n执迷不悔\t256860\nwinmerge\t256861\nga\t256862\n汇川\t256863\n医疗补助金\t256864\n服务间\t256865\n振东集团\t256866\nbeliefs\t256867\n谢先斌\t256868\n二之国2:幽灵国度\t256869\n线距\t256870\n32p\t256871\n中山一路\t256872\n复方氨酚溴敏胶囊\t256873\n吊环螺钉\t256874\n神婆\t256875\n上次\t256876\nrare\t256877\n急救室\t256878\n扣发\t256879\n加油包\t256880\n名宿\t256881\n桂林市政府\t256882\n2点半\t256883\n胰岛细胞\t256884\n徽派建筑\t256885\n电喷车\t256886\n气魄\t256887\n月保\t256888\n96777\t256889\nBattlelog\t2568
90\n直径\t256891\n南桥新城\t256892\n张嘉倪\t256893\n我懂了\t256894\n陈树义\t256895\nV2.8.0\t256896\nⅡ\t256897\n第36届\t256898\n【\t256899\n公子世无双\t256900\n练习赛\t256901\n_伊\t256902\n友方\t256903\n庆良\t256904\nmagneto\t256905\n威海大众网\t256906\nBridge\t256907\n台锯\t256908\n瞎眼\t256909\n双食记\t256910\ngarena\t256911\n39P\t256912\n咽痒\t256913\n独裁者\t256914\n上海出入境检验检疫局\t256915\n光荣使命\t256916\nBelinda\t256917\n阿莫多瓦\t256918\n阿们网\t256919\n听歌识曲\t256920\n户式\t256921\n音单\t256922\n陆风x7\t256923\n龙珠英雄\t256924\n穆铁稻盛和夫\t256925\n恩德培\t256926\n起起落落\t256927\n创意秀\t256928\n天叶\t256929\n和政县\t256930\npgadmin4\t256931\n上窑\t256932\n新安社区\t256933\n消防报警器\t256934\n轻甲\t256935\n叫花\t256936\n五果\t256937\nFermi\t256938\n出餐\t256939\n长江花园\t256940\n九毛\t256941\n赵坤\t256942\nfcntl\t256943\n2506\t256944\n483号\t256945\nGMP\t256946\n割喉\t256947\n欧洲站\t256948\n无锡市国家税务局\t256949\n复习笔记\t256950\n2018年4月11日\t256951\nSD高达G世纪创世\t256952\n新理\t256953\n巴厘\t256954\n橙子汁\t256955\n朗\t256956\n刘炽\t256957\n芥菜\t256958\n173.com\t256959\n机宝\t256960\n五类\t256961\n三秒钟\t256962\n68u\t256963\n称心如意\t256964\n迪马股份\t256965\n黔西\t256966\nBurns\t256967\n3M双面胶\t256968\n学历证书\t256969\n麻团\t256970\n少年头\t256971\n河南省国税局\t256972\n死亡飞车2\t256973\n百业网\t256974\n王朝辉\t256975\n造梦西游OL\t256976\n男保姆\t256977\n林肯MKX\t256978\n云计算市场\t256979\n&mdash\t256980\n赵老师\t256981\n澧州\t256982\n声律\t256983\n冠头岭\t256984\n素妍\t256985\n容抗\t256986\nactivity\t256987\n超全\t256988\nnrf24l01\t256989\n等额本息法\t256990\n忽悠式\t256991\n板岩\t256992\n王者荣耀赏金联赛\t256993\nUnicodeEncodeError\t256994\n舞蹈家\t256995\n崔勇\t256996\n贵州省通信管理局\t256997\n审死官\t256998\n木村文乃\t256999\n第四篇\t257000\n空气净化\t257001\nEXPO\t257002\n纸弹\t257003\n党总支\t257004\n传菜机\t257005\nRasa\t257006\n天谕晶锐\t257007\n芦芽山\t257008\n祁阳新闻网\t257009\nbluehost\t257010\n广为人知\t257011\n密封圈\t257012\n明修\t257013\n点心世界\t257014\nflask-login\t257015\n绪论\t257016\n910\t257017\n名伦\t257018\ncrying\t257019\n霍\t257020\n另外\t257021\n建筑学报\t257022\n售票点\t257023\n折折\t257024\n修罗场\t257025\n微分电路\t257026\n总量\t257027\n高级会计师考试\t257028\n散分\t257029\n崇法\t257030\n非全日制\t257031\n下报\t257032\n微信货源网\t257033\n两小时内\t257034\ndisa
bled\t257035\ntammy\t257036\n不安全感\t257037\n沙漠王子\t257038\nAirplay\t257039\n哥特\t257040\n386号\t257041\n0x3\t257042\n诚心\t257043\n河南坠子\t257044\n三国志12:威力加强版\t257045\n萨尔曼\t257046\n张庆鹏\t257047\n10mm\t257048\n方红\t257049\n西北政法大学\t257050\n哈利波特与魔法石\t257051\n优果网\t257052\n到尾\t257053\n行政管理师\t257054\n五品\t257055\nfftshift\t257056\n威廉姆\t257057\n四合村\t257058\n新浪\t257059\n塞尔达传说:荒野之息\t257060\n都市鬼谷医仙\t257061\n繁体版\t257062\n摸金符\t257063\n拍拍乐\t257064\n白亮\t257065\niga\t257066\nAttempt\t257067\n焕白\t257068\n名侦探狄仁杰\t257069\n小沃科技\t257070\n京沪高速\t257071\n沃尔玛山姆会员店\t257072\nPPI\t257073\n聚拢\t257074\nGTX1050TI\t257075\n2股\t257076\n东风公司\t257077\n座席\t257078\n鲁路\t257079\n九十七\t257080\n峙\t257081\n美容人才网\t257082\n28年前\t257083\n个人保险咨询网\t257084\n下腹部\t257085\n166cm\t257086\nwebmvc\t257087\n_电子银行\t257088\n写法\t257089\nGodTelMe\t257090\n后腿\t257091\nxmanager\t257092\n兰溪市人民政府\t257093\n动脉导管未闭\t257094\n噜噜网\t257095\n一半多\t257096\n今夜欢乐颂\t257097\nPandoc\t257098\nSCM\t257099\n维拓\t257100\nsmile\t257101\n大一圈\t257102\n心肺复苏术\t257103\n杨飞飞\t257104\nr\t257105\n吊装式\t257106\n申报员\t257107\n王晓鹏\t257108\n坑害\t257109\nAppearance\t257110\n福奈特\t257111\n20161224\t257112\n孕前检查\t257113\n黄锦辉\t257114\n韩剧网\t257115\n巴恩\t257116\n制药业\t257117\n9十个\t257118\n上海电影译制厂\t257119\n1892\t257120\n贵港市港北区\t257121\n燃堂\t257122\n酱腌菜\t257123\n副牌\t257124\ncudnn7\t257125\n精纯\t257126\n第4个\t257127\nxln\t257128\n2000毫安\t257129\n枝\t257130\n2018年4月14日\t257131\n乐文\t257132\n网络赛\t257133\n果冻布丁\t257134\n青岛港湾职业技术学院\t257135\n花溪谷\t257136\n结网\t257137\n一晚\t257138\n示数\t257139\nkeil3\t257140\nHype\t257141\n送生\t257142\n萨马拉\t257143\n鲜菇\t257144\n关系\t257145\n吴邪\t257146\n张征\t257147\n黄云斌\t257148\n诙谐\t257149\n多普勒频移\t257150\n屌丝IT男\t257151\n实足\t257152\n乔雅登\t257153\n惹的祸\t257154\n查询系统|编号查询|真伪查询|平台\t257155\n粉蝶\t257156\nv2版\t257157\n症结\t257158\n指客网\t257159\n殊途\t257160\n15片\t257161\n10分之一\t257162\n阴狠\t257163\n数据处理\t257164\n4140\t257165\n清蒸鲈鱼\t257166\n广州北站\t257167\n吴汉\t257168\n四骑士\t257169\n來\t257170\n函數\t257171\n印制\t257172\n明挖法\t257173\n脑垂体瘤\t257174\n冒险岛联盟\t257175\n护裆\t257176\n刘字\t257177\n蒋村\t257178\
n第二十一条\t257179\n二氧化锡\t257180\nminimize\t257181\n内力\t257182\n悬式绝缘子\t257183\nT_\t257184\n126平米\t257185\nBard\t257186\n禁歌\t257187\n斯卡拉蒂\t257188\n足浴粉\t257189\n画意\t257190\n原力\t257191\nwin10office\t257192\n_类\t257193\n法宝网\t257194\n武当派\t257195\n万科水晶城\t257196\n沙特阿拉伯\t257197\nshanxi\t257198\nSpss\t257199\n指示器\t257200\nFindBugs\t257201\n大众点评网\t257202\n所有制\t257203\n适航\t257204\n余音绕梁\t257205\n张利平\t257206\ndzz\t257207\n郭建军\t257208\n来历\t257209\n云团\t257210\n大鱼解梦网\t257211\n1平方公里\t257212\n安宰贤\t257213\n乡村版\t257214\n没手\t257215\n锈斑\t257216\n王陵\t257217\n华新村\t257218\n鼬\t257219\n凹凸棒石\t257220\n吉林市中心医院\t257221\n安置\t257222\nSynaptics\t257223\n等值线\t257224\n总功\t257225\n锌钢护栏\t257226\n优学派\t257227\n放学别走\t257228\niphone6splus\t257229\nW33\t257230\noecd\t257231\n三万亿\t257232\n西山壹号院\t257233\n李明辉\t257234\nEFD\t257235\n张朝剑雨\t257236\nAccommodation\t257237\n50多种\t257238\n佛山市红盾信息网\t257239\n汽车装饰\t257240\nregistrar\t257241\n千娇\t257242\n万和燃气热水器\t257243\n第1周\t257244\nbuliao\t257245\n杭州小区\t257246\n大望\t257247\n恭请\t257248\n全关通信息网\t257249\n短毛\t257250\nOSX\t257251\n郁南\t257252\nICD\t257253\n盘石网盟\t257254\n诗歌集\t257255\n气动葫芦\t257256\n16起\t257257\nspaces\t257258\n徐少春\t257259\n毕业论文范文\t257260\n6v电影网\t257261\n存活\t257262\n泗\t257263\n结伴\t257264\n航发\t257265\nemsp\t257266\n火炮兰\t257267\n王经理\t257268\ncrf++\t257269\n高中女孩\t257270\n小字\t257271\norient\t257272\n2755\t257273\n交通管理\t257274\n粘合剂\t257275\n红楼春梦\t257276\n启月\t257277\n无尽太空2\t257278\n三都县\t257279\nDAF\t257280\nhp1018\t257281\n龙湖西府\t257282\n号手\t257283\n五国\t257284\n日美\t257285\nRGB灯效\t257286\n东鹏陶瓷\t257287\n灰饼\t257288\n嘉兴房产网\t257289\n参茸补肾片\t257290\n池店镇\t257291\n金羊网\t257292\n傻_\t257293\n函授站\t257294\nSolidWorks2016简体中文\t257295\n损坏率\t257296\n慈宁宫\t257297\n矸石山\t257298\n香港特区政府\t257299\n余姚新闻网\t257300\nSolveigMM\t257301\n26卷\t257302\n县工商联\t257303\n陈展\t257304\n星戒\t257305\n睫状肌\t257306\n次责\t257307\n混合动力版\t257308\n2059\t257309\n78年\t257310\n仲裁庭\t257311\nLINKART\t257312\n反渗透系统\t257313\n迪加\t257314\n1000万美元\t257315\nManuel\t257316\nerc\t257317\n牲\t257318\nVRRP\t257319\nty\t257320\n抗冲击\t257321\n义
勇\t257322\n来者不拒\t257323\n省侨联\t257324\n劳动公园\t257325\n亿邦动力网\t257326\nDispute\t257327\n谢特朗普\t257328\nucp\t257329\nredisTemplate\t257330\n21次\t257331\n定音鼓\t257332\nElWiki\t257333\n上弹\t257334\n满城烟火\t257335\n主轴\t257336\n紫荆公园\t257337\n挥发物\t257338\nTIFF\t257339\n进销存\t257340\n质量保修金\t257341\n民航小区\t257342\n灰鸽子\t257343\n哪首诗\t257344\n城市绿化\t257345\n盐城市政府\t257346\n跨界石\t257347\n雄鹿队\t257348\nSing\t257349\n稻米油\t257350\n木优\t257351\n四氟管\t257352\n难波站\t257353\n6小时内\t257354\nSignals\t257355\nm1136\t257356\n拍拍手\t257357\n小店区\t257358\n顾村镇\t257359\ndenny\t257360\n第21条\t257361\n配网\t257362\n李秀满\t257363\naot\t257364\nxiao77论坛\t257365\n〆\t257366\nforeigner\t257367\n常备\t257368\n壮丁\t257369\n赛风\t257370\nEPO\t257371\n中德生态园\t257372\n蒜瓣\t257373\n行程表\t257374\n广西军区\t257375\n夏家三千金\t257376\n厨柜\t257377\n坚定性\t257378\n性战\t257379\n雪地靴\t257380\n室内设计专业\t257381\n导电剂\t257382\n博山\t257383\n花之舞\t257384\n12.28\t257385\n郸城县人民政府\t257386\n气流纺\t257387\n相异\t257388\n易企\t257389\n末日时\t257390\n佳人女性网\t257391\n大伟\t257392\n乳首\t257393\nWIN7\t257394\nbangdream\t257395\n1153\t257396\n中交三公局\t257397\n压力值\t257398\n韩企\t257399\n索赔案\t257400\n高淳区\t257401\n汽车费\t257402\n市文明办\t257403\ngli\t257404\n全球风暴\t257405\n真空电镀\t257406\n新安江水电站\t257407\nff14召唤师\t257408\n0.11MB\t257409\n狼君\t257410\nh330\t257411\n大疆Osmo\t257412\nFME\t257413\nChampion\t257414\n7.4v\t257415\nStrix\t257416\n米淘科技\t257417\n荣京东街\t257418\n8月23日\t257419\n诺唯\t257420\n失实\t257421\n2.5维\t257422\n仙剑二\t257423\n交通运输部\t257424\n能言\t257425\npyinstaller\t257426\n5.2.1\t257427\nweights\t257428\n逃生门\t257429\n一个三个\t257430\n小姿\t257431\nwin7win10\t257432\n排除万难\t257433\n2000_\t257434\nwemall\t257435\n乐部\t257436\n77w\t257437\n水务公司\t257438\n动平衡机\t257439\n域名\t257440\n军机处\t257441\nCoalition\t257442\n黄豆酱\t257443\n反渗透净水器\t257444\nOSHC\t257445\nmesh\t257446\n形象化\t257447\nTrial\t257448\n关子\t257449\nFlexRay\t257450\n莫比乌斯\t257451\n棱角\t257452\n宝莲灯\t257453\n华衣网\t257454\n灵活就业人员养老保险\t257455\n私享会\t257456\n音乐著作权\t257457\nwages\t257458\n彩云比特\t257459\nPOLICY\t257460\n反弹器\t257461\n唱响\t257462\n压杆\t257463\n同济大学\t257464
\n萱萱\t257465\nd9\t257466\n司波达\t257467\n数据库系统概论\t257468\n风帆\t257469\n科室\t257470\nfwx\t257471\n囊谦县\t257472\n语嫣\t257473\nuim卡\t257474\ncad块\t257475\n裸检\t257476\n外胎\t257477\nwin10edge\t257478\n1312\t257479\n3DM未加密版\t257480\n桂纶镁\t257481\n致命黑兰\t257482\n郑州市\t257483\n吸入性肺炎\t257484\n澄海3c\t257485\nhandled\t257486\n北京女子图鉴\t257487\n亚太商谷\t257488\n瓦良格\t257489\n生活大爆炸\t257490\nReligious\t257491\n马晓东\t257492\n管理部\t257493\n少年感\t257494\nEve\t257495\n詹姆\t257496\ncublas\t257497\nbutterk\t257498\nBecause\t257499\nanti\t257500\nprepaid\t257501\n樱飞\t257502\n麦凯\t257503\n洗手歌\t257504\n118114\t257505\n色选机\t257506\n农水\t257507\n50l\t257508\n姜恩惠\t257509\n背光键盘\t257510\n/title\t257511\n/strong\t257512\n奶爸会有天使替我爱你\t257513\nImprove\t257514\n不是你的错\t257515\n感生\t257516\n竞技\t257517\nvnd\t257518\nWinkawaks\t257519\n小米盒子3\t257520\n4.6万\t257521\n搜物网\t257522\n晓组织\t257523\n无损检测技术\t257524\natliwen\t257525\nextractor\t257526\nCrest\t257527\niCAN\t257528\n刘晖\t257529\n燕山君\t257530\n优色林\t257531\n601390\t257532\n王家林\t257533\nmention\t257534\n金芙蓉\t257535\nTrekking\t257536\n6.1.1\t257537\n向子\t257538\n乔梁\t257539\n真功夫\t257540\n丁捷\t257541\n淘洗\t257542\n安康杯\t257543\n检录\t257544\n肝损伤\t257545\nlumpur\t257546\n管轮\t257547\n不通透\t257548\n女文工团\t257549\n妈妈圈\t257550\nextinction\t257551\n龙镇\t257552\n_洛奇英雄传吧\t257553\n杜明\t257554\ndate函数\t257555\nNAKED\t257556\nSOT23\t257557\n核气\t257558\n超微型\t257559\n晴日峰\t257560\n质谱仪\t257561\n金馆\t257562\n救恩\t257563\n上海分行\t257564\nMECHREVO\t257565\n直视\t257566\n中铁二十一局集团有限公司\t257567\n猜猜\t257568\n妖姬三国\t257569\n露华\t257570\n第十三篇\t257571\n常常\t257572\n景县\t257573\n长江镇\t257574\n织带\t257575\njomashop\t257576\npayback\t257577\n发长\t257578\n7款\t257579\n般若心经\t257580\n北陆药业\t257581\nGL8\t257582\n11.0.1\t257583\n华埠镇\t257584\n镇宅\t257585\n混声\t257586\n上海环球金融中心\t257587\n君士坦丁堡\t257588\nWay\t257589\n华南理工大学研究生院\t257590\ntana\t257591\n120W\t257592\n全开源\t257593\nLU\t257594\n神药\t257595\n第65届\t257596\n成熟时\t257597\n3d2012\t257598\n管培\t257599\n微小说\t257600\n呜咪123\t257601\n波本\t257602\n西南交通大学研究生院\t257603\n增加\t257604\nza\t257605\n7m\t2576
06\n日本街道\t257607\n雷克萨斯NX论坛\t257608\n贝丽\t257609\n同创\t257610\n世界杯赛\t257611\n协管\t257612\n林火\t257613\n小净\t257614\nfacing\t257615\n阿阳\t257616\nux\t257617\n一出好戏\t257618\n普拉多论坛_汽车之家论坛\t257619\n塔布\t257620\nCPLD论坛\t257621\n经济数学\t257622\n10篇\t257623\n日照一中\t257624\n米纳\t257625\n张梦\t257626\nT恤\t257627\n池水\t257628\nAgricultural\t257629\nchino\t257630\n龙之谷\t257631\n郭明\t257632\n张纯烨\t257633\n基本介绍\t257634\n轻轨\t257635\n24班\t257636\nhttp请求\t257637\n2.0GB\t257638\n国家食品药品监督管理总局药品审评中心\t257639\n参人\t257640\n半年内\t257641\n渴望现场\t257642\n金味\t257643\n数遍\t257644\n嗨嗨\t257645\n土地改革\t257646\n毒死\t257647\n共助\t257648\n那片星空那片海\t257649\n大辅\t257650\n第06期\t257651\nmounty\t257652\n12MHz\t257653\n老港\t257654\n肠梗\t257655\njjc\t257656\n反查\t257657\n一览_快\t257658\n南京玄武外国语学校\t257659\n裸线\t257660\n1.3.17\t257661\n用神\t257662\n高仁\t257663\n68集\t257664\n金徽\t257665\n十二月\t257666\n智能血压计\t257667\n金康宇\t257668\nideapad320\t257669\n国办函\t257670\n变道\t257671\n朝美穗香\t257672\nSSTAP\t257673\n300x600\t257674\n丰特\t257675\n陈璧君\t257676\n建制镇\t257677\n院庆\t257678\n9月5日\t257679\n蛛\t257680\nresponsive\t257681\n汽水分离器\t257682\n新帅\t257683\nresistor\t257684\n588\t257685\nINVESTMENT\t257686\n丁酸\t257687\n香椿木\t257688\n老饕\t257689\n二十大\t257690\n大北京\t257691\nexecel\t257692\n千面\t257693\n捂\t257694\n赢创德\t257695\n救兵\t257696\n1.3mm\t257697\n不喜\t257698\n恒丰银行\t257699\napo\t257700\nplorn\t257701\n媚药\t257702\n白枫\t257703\nnodemanager\t257704\n圣彼得堡国立大学\t257705\n北京实验学校\t257706\n提倡\t257707\n小桃\t257708\n雅言\t257709\nmanong\t257710\n虚拟货币交易所\t257711\n胡作非为\t257712\nMONEY\t257713\nltr\t257714\n大Boss\t257715\n木柱\t257716\n六顶\t257717\n铁青\t257718\n胃容量\t257719\n马白玉\t257720\n挂念\t257721\n商合杭\t257722\n8620\t257723\n中国美术协会\t257724\nGRD\t257725\n赞叹不已\t257726\n更年期综合症\t257727\n号百控股\t257728\n音乐人生\t257729\n有灵\t257730\n条边\t257731\n普通百姓\t257732\n模型师\t257733\n瑞宏网\t257734\n内蒙古鸿茅国药股份有限公司\t257735\n勒沃库森\t257736\nAbs\t257737\nEffect\t257738\n800年\t257739\n技术资料\t257740\n静态块\t257741\nRechargeable\t257742\n佩饰\t257743\n王尚\t257744\n皮尔\t257745\n前一日\t257746\n荇菜\t257747\n看得出\t257748\n波洛克\t257749\n朝颜\t
257750\n邻居家\t257751\n翕\t257752\nbreach\t257753\nrefund\t257754\n音孔\t257755\n泰星pope\t257756\ndps\t257757\n管头\t257758\n浦和红钻\t257759\n金玛\t257760\n杨哥\t257761\n打架子鼓\t257762\n绿草\t257763\n万达影业\t257764\n抬梁式\t257765\n陆扬\t257766\n心形\t257767\n7.5公斤\t257768\n砧木\t257769\n龙樾\t257770\nPoland\t257771\n胡懋仁\t257772\n花札\t257773\n简七\t257774\n31776114\t257775\n冒充者\t257776\n海贼王\t257777\n恢宏\t257778\n消毒机\t257779\n燕北\t257780\n专修店\t257781\n第19页\t257782\nInvader\t257783\nbit位\t257784\ncgc\t257785\n下框\t257786\n沈阳日报\t257787\n领驭\t257788\n95%置信\t257789\n南平市\t257790\n范跑跑\t257791\n钱锺书\t257792\nbasel\t257793\n冬奥\t257794\n十二万\t257795\n桂碧园\t257796\n交叉式\t257797\n盐酸左西替利嗪片\t257798\n两年多\t257799\nLovin\t257800\napd\t257801\n精舞门\t257802\n郭宏\t257803\n伊斯塔战灵\t257804\n混凝沉淀池\t257805\n0802\t257806\n6f\t257807\n士兰微\t257808\nNur\t257809\nuicollectionview\t257810\n9股\t257811\n冷小\t257812\n清水里的刀子\t257813\n绿建\t257814\njQuery库\t257815\n十四章\t257816\n内禁\t257817\n桩头\t257818\n秀智\t257819\n致意\t257820\n立地\t257821\n遵义市政府\t257822\n郑俊怀\t257823\nmolar\t257824\n平煤股份\t257825\n礼意\t257826\n冈布\t257827\nDiffMerge\t257828\n35项\t257829\n名花\t257830\n16型\t257831\n龙须面\t257832\n软绵绵\t257833\n净值型\t257834\nstelvio\t257835\n房屋置换\t257836\n海尔电热水器\t257837\nVest\t257838\n甘南法院\t257839\nMOS\t257840\n姚老板\t257841\n大余\t257842\n回报\t257843\n盐哥\t257844\n地皮\t257845\n敢做\t257846\nKazakhstan\t257847\n鲜于枢\t257848\n虹吸管\t257849\nDana\t257850\n万_\t257851\n伊瓜苏\t257852\n讲道讲章|主日讲章|祷告|大全\t257853\n漫游费\t257854\nMLS\t257855\n大小头\t257856\n一子\t257857\ncastle\t257858\n高坂穗\t257859\nGBLive\t257860\nsumproduct\t257861\n抽风式散热器\t257862\n狮龙\t257863\n限制片\t257864\n顶力\t257865\n仙魔\t257866\n松吉\t257867\n新中\t257868\nwis\t257869\n中程\t257870\n剑网3气纯\t257871\nHON\t257872\n上海8号线\t257873\napb\t257874\nTranslating\t257875\nCaoliu\t257876\nRX550\t257877\n巴巴罗萨\t257878\n慢食\t257879\n预览\t257880\n男生女\t257881\nShinelon\t257882\n全彩少女漫画:奴隶职员肛门被虐狂调教外传-女体化改造篇\t257883\n混世四猴\t257884\n办展\t257885\n揸\t257886\nCHEM\t257887\n坪地镇\t257888\nGO语言\t257889\n小兄弟\t257890\nwasm\t257891\n当儿\t257892\n山东嘉祥长城石雕厂\t257893\n刀剑神域
\t257894\n莜麦\t257895\n何哉\t257896\n百度文库\t257897\norc\t257898\n牛钱\t257899\ngb1\t257900\n锻打\t257901\n水果箱\t257902\n扁盒\t257903\n身首异处\t257904\n古意人\t257905\n慢热型\t257906\n王海青\t257907\nStackPanel\t257908\n蔡琴\t257909\n勇敢的心\t257910\n丑妃\t257911\n校字\t257912\n3dmax论坛\t257913\n小天使\t257914\n30分钟内\t257915\n退休工人\t257916\n新新电影理论\t257917\n连带责任保证\t257918\nv1\t257919\n命令库\t257920\n回升\t257921\nUncertainty\t257922\n丽姬\t257923\nLINEAR\t257924\n米国\t257925\n绝地求生EVENT\t257926\nEPI\t257927\n牡丹节\t257928\n连看\t257929\n十八腔\t257930\n20161010\t257931\n荆湖\t257932\n流速\t257933\n岂可\t257934\n滑雪镜\t257935\nTenda\t257936\n9003\t257937\n冰棍棒\t257938\n线描\t257939\n4小时\t257940\n雪涛\t257941\nCS6\t257942\n景泰蓝\t257943\n柯城\t257944\n益康\t257945\n王铁\t257946\nemba\t257947\nunionid\t257948\n财星\t257949\n汉中人才网\t257950\n7.0.1\t257951\n周周\t257952\n狠话\t257953\nyama\t257954\n说服自己\t257955\nNBA2K10\t257956\n刘国平\t257957\n一百文\t257958\n上海电影制片厂\t257959\n南燕\t257960\n童虎\t257961\n青岛五十八中\t257962\nhepatitis\t257963\n集刊\t257964\n李银\t257965\n两天前\t257966\nSoapUI\t257967\n1pon\t257968\n首脑\t257969\n宝诰\t257970\n死亡之翼\t257971\n时樾\t257972\n南帝\t257973\n狂沙镇\t257974\n洪恩幼儿英语\t257975\n拆客\t257976\n单反镜头\t257977\n法律条款\t257978\n国优\t257979\n中华彩票网\t257980\nDota2\t257981\n复命\t257982\n鼬鼠\t257983\n突突\t257984\n媒介\t257985\n卷帘窗\t257986\n爱华仕\t257987\n钥匙箱\t257988\n共读\t257989\nAlps\t257990\n孙子涵\t257991\nDX11\t257992\n废除\t257993\n解押\t257994\n沈湘芸\t257995\n伊丽莎白圈\t257996\n海盗帽\t257997\n479个\t257998\n冬至\t257999\n饭前\t258000\n三时三餐\t258001\n变装\t258002\n宁围\t258003\nmya\t258004\n灵音播放器\t258005\n山西工商学院\t258006\n穷举\t258007\n商客\t258008\n托乐嘉\t258009\n大把\t258010\nincredible\t258011\n全国股转公司\t258012\n查缴\t258013\n24步\t258014\nMW\t258015\n脑中风\t258016\n铅笔\t258017\n谢耘耕\t258018\n贪婪\t258019\n相素\t258020\n2015.2\t258021\n油版雕\t258022\n金洲\t258023\n有刷直流电机\t258024\nFileProvider\t258025\n玛多县\t258026\n干贝\t258027\n延长\t258028\n武财神\t258029\ns320\t258030\n月读\t258031\n李光弼\t258032\n苏27\t258033\nMicrochip\t258034\n数学式\t258035\ndnf圣骑士\t258036\n强要\t258037\n汽配展\t258038\nWin10版\t258039\n趣分期\t258040\n云仓\t258041\
n分析测试百科网\t258042\nres\t258043\n云埔工业区\t258044\n山景城\t258045\n新浪法院\t258046\n磨铁\t258047\n巡场\t258048\n撅臀\t258049\n欣龙控股\t258050\nApriori算法\t258051\n递延所得税负债\t258052\n墨滴\t258053\n发药\t258054\n挡圈\t258055\n叶皓轩\t258056\n烈火如歌\t258057\nlog2n\t258058\n42056\t258059\n第09期\t258060\n0.35\t258061\n缅甸维加斯\t258062\n凉夏\t258063\n荣怀\t258064\n续债\t258065\n相爱的人\t258066\n兰斯8\t258067\n李媛\t258068\n电动卷扬机\t258069\nCASH\t258070\n海城区\t258071\nLexBurner\t258072\ncaxa2007\t258073\n红宇新材\t258074\n袈裟\t258075\n忌\t258076\nJoel\t258077\n辨\t258078\n皮肤癌\t258079\n05期\t258080\n11下\t258081\nRefreshing\t258082\n蓝衣女\t258083\n心锁\t258084\nretroarch\t258085\nhum\t258086\n空军航空大学\t258087\ncobalt\t258088\n湖北经济学院\t258089\n南通市公安局\t258090\njepg\t258091\n周林频谱仪\t258092\nlate\t258093\n2维\t258094\n踢人\t258095\npodium\t258096\n徐乐江\t258097\n伟进\t258098\nAlan\t258099\nINA\t258100\n51369132\t258101\n32147178\t258102\n0795\t258103\nメモ\t258104\nacdsee\t258105\n筵席\t258106\n最新版软件\t258107\nShockwave\t258108\n2016.6\t258109\nl1800\t258110\n持仓量\t258111\n保险公\t258112\n2000套\t258113\n节水\t258114\nSnackbar\t258115\n天津市东丽区\t258116\n邮政储汇\t258117\nGC\t258118\niptv\t258119\nsqlserver2017\t258120\n飞走\t258121\n2.0T\t258122\ndvdms\t258123\n玉镯子\t258124\nOptimizing\t258125\n数列通项公式\t258126\n紫铜棒\t258127\n武原镇\t258128\ncoolibar\t258129\n吉他弦距\t258130\nRADO\t258131\n中国出口信用保险公司\t258132\n盖片\t258133\n布鲁斯威利斯\t258134\n罗定职业技术学院\t258135\n大衍\t258136\n万源新城\t258137\n沙溪村\t258138\n奥迪rs7\t258139\n白米\t258140\n医闹\t258141\n最高点\t258142\n巡更\t258143\nFact\t258144\nViewController\t258145\n肛周脓肿\t258146\n林依婷\t258147\n恒大御景天下\t258148\nsinner\t258149\n桃花奇缘\t258150\n秘蓝岛\t258151\n4亿\t258152\n产权法\t258153\nCULTURE\t258154\n徐小庆\t258155\n妖股\t258156\n自驾网\t258157\nEXCEL合并单元格\t258158\n华律办事\t258159\n包膜机\t258160\n胶原蛋白液\t258161\nLocus\t258162\n真如镇\t258163\nreplay\t258164\n挂机类\t258165\n90327\t258166\n八男\t258167\n法拉利F430\t258168\n超级机器人大战J\t258169\n词法\t258170\n149号\t258171\n北京市市政市容管理委员会\t258172\n谢小宇\t258173\n家明\t258174\n空寂\t258175\n金融界\t258176\n门神\t258177\n秋香\t258178\n大小姐\t258179\n特卖场\t258180\n我结\t2
58181\n樱桃沟\t258182\n班贝格\t258183\n高栏港经济区\t258184\n初阳\t258185\n逆作法\t258186\n超龄\t258187\nsnis\t258188\nう\t258189\n相见难别亦难\t258190\n高村\t258191\n戊日\t258192\nelsewhere\t258193\ncouple\t258194\n冷嘲热讽\t258195\n松江南站\t258196\nmiwa\t258197\n28类\t258198\nGibson\t258199\n惊天魔盗团2\t258200\n20170613\t258201\n小样儿\t258202\n唐昊\t258203\n利爪\t258204\n烟熏湖\t258205\n余宇\t258206\n1月4日\t258207\n厉史\t258208\n筑地市场\t258209\n长安欧尚\t258210\nsentry\t258211\n财路\t258212\njbd\t258213\n罗技|\t258214\n聽\t258215\nk550d\t258216\nYOU\t258217\n娱乐营销\t258218\n李悦\t258219\n3天内\t258220\n姬路\t258221\n武唐\t258222\n私德\t258223\n小青虫\t258224\n托勒密\t258225\n第27名\t258226\n水合物\t258227\n特气\t258228\n奇门遁甲\t258229\nwn10\t258230\nmatch函数\t258231\n阿奇霉素片\t258232\n赣鄱\t258233\n通信证\t258234\n爷\t258235\n颜悦色\t258236\nintervention\t258237\n631路\t258238\n大帽子\t258239\n点卡\t258240\nngs\t258241\n冷却机\t258242\nそ\t258243\n209元\t258244\n张志雄\t258245\n指宿\t258246\n张烈\t258247\n激情碰撞\t258248\n少儿教育\t258249\n伽\t258250\ncolors\t258251\n山地\t258252\n家政服务公司\t258253\n五房\t258254\n移动办公OA\t258255\nseventy\t258256\n沙洞\t258257\n风量调节阀\t258258\n天仓\t258259\n白癫风\t258260\n南湘\t258261\n板翅式换热器\t258262\n瓶子\t258263\n泡点\t258264\n74级\t258265\n工具版\t258266\n北京电影学院动画学院\t258267\nwaimao\t258268\nNSE\t258269\n第四天\t258270\n戚风\t258271\n智能座便器\t258272\n柜中\t258273\n告慰\t258274\n沥青混凝土搅拌站\t258275\n刺客信条起源\t258276\n四月间\t258277\n丁卯年\t258278\n刨机\t258279\n匈牙利语\t258280\nlegendary\t258281\n江苏省启东中学\t258282\n齐淑芳\t258283\n内向者\t258284\nNUMERIC\t258285\n60年代\t258286\n米饭\t258287\n帽峰山\t258288\n外国人在中国\t258289\n溶胶\t258290\neps\t258291\n乌尔姆\t258292\nBias\t258293\n自流平地坪\t258294\nNightcore\t258295\nss-panel\t258296\n山丹\t258297\n生涯\t258298\nJacks\t258299\n当地人\t258300\n望月路\t258301\nhao123网址导航\t258302\nRSU\t258303\n民心桥\t258304\nSubaru\t258305\n灵康药业\t258306\n1787年\t258307\neffective\t258308\n後\t258309\n奥兰多\t258310\n天演论\t258311\n卡卡江南春\t258312\nDa\t258313\nSQLite\t258314\n3.8\t258315\n李清云\t258316\n/盒\t258317\n征信贷款\t258318\n反向传播算法\t258319\n媚者无疆\t258320\nMicros\t258321\n翎羽\t258322\ngameloft吧\t258323\n玩物\t258324\n作工\t258325\
nmarshal\t258326\n258.com\t258327\n递延型\t258328\n电力工业\t258329\n常山股份\t258330\n茶文化节\t258331\n河南省科学技术协会\t258332\n霸略\t258333\n通鼎\t258334\n好耍\t258335\n辽宁省肿瘤医院\t258336\nORA-00600\t258337\ncca\t258338\n瑞倪维儿\t258339\n窦氏\t258340\n色鱼\t258341\n2018.1.15\t258342\n恒创\t258343\n雪MM\t258344\n5371\t258345\nghfjj\t258346\n美容美发店\t258347\n半九十\t258348\n绝地求生提示\t258349\n小邪\t258350\n泽漆\t258351\nv2.1.8\t258352\nU型钢\t258353\nwonders\t258354\n多多色色\t258355\n24w\t258356\n中交路\t258357\n破狼\t258358\n赤鬼\t258359\n神马电影网_神马电影_神马影院_第九影院_超神影院\t258360\n天狱\t258361\n自慰棒\t258362\n林紫\t258363\n氘\t258364\n鳞屑\t258365\n林江\t258366\n利拉德\t258367\n巴马县\t258368\n弯灯\t258369\n新登镇\t258370\n刀塔\t258371\n超细粉碎机\t258372\n马鞍山花山区人民政府\t258373\nSk2\t258374\n电子科技\t258375\n蛀牙\t258376\n占星网\t258377\n电算\t258378\n尼龙轮\t258379\n应知应会\t258380\n佳能M6\t258381\n画皮\t258382\n棉\t258383\n真烦\t258384\nUAC\t258385\n中华骏捷\t258386\n孙吴\t258387\n金先生\t258388\nuyxx\t258389\n深潜器\t258390\n大新\t258391\n盘龙峡\t258392\n马志明\t258393\na01\t258394\n中国新闻出版广电网\t258395\n急不可耐\t258396\n黑社会2以和为贵\t258397\nxlm\t258398\n4版\t258399\n死亡圣器\t258400\nЫ\t258401\n爱神马\t258402\n无愧\t258403\n引取\t258404\nMacOS\t258405\nesay\t258406\ngogn\t258407\n组织学与胚胎学\t258408\n肾上腺\t258409\n爆水\t258410\n水资源量\t258411\noxidation\t258412\n艾丹\t258413\n_自媒体\t258414\n新潟\t258415\n市审计局\t258416\n广宁县人民政府\t258417\n王胡子\t258418\n逢低\t258419\nBurp\t258420\nsanctions\t258421\n国宾\t258422\n清爽型\t258423\n第二框\t258424\nlsp\t258425\n南昌新闻中心\t258426\n都匀东\t258427\n联想Y410P\t258428\n2013.12\t258429\n邵东一中\t258430\n游者\t258431\n东岭\t258432\nfind7\t258433\n忘却\t258434\nsgu\t258435\n虚拟人生4\t258436\n广藿香\t258437\niTools\t258438\n吊运\t258439\n磁化水\t258440\n58game\t258441\n_博越论坛_汽车之家论坛\t258442\n三八妇女节\t258443\n鲫\t258444\nbootbox\t258445\n夜静\t258446\n胖墩儿\t258447\nmql5\t258448\n走法\t258449\n6P\t258450\n第28轮\t258451\nhellocharts\t258452\nDie\t258453\n乳牛物语\t258454\n华东电力\t258455\n淝河镇\t258456\n遵义师范学院\t258457\n350平\t258458\n举贤\t258459\n1.3%\t258460\n张然\t258461\n浙江省舟山市普陀区人民政府\t258462\n蒙彼利埃\t258463\nNote7\t258464\n全国大学生比赛网\t258465\n传记类\t258466\n趾甲\t258467\n90
名\t258468\n水笙\t258469\nOSD\t258470\n培训生\t258471\n六二\t258472\n扩阴\t258473\n谷文月\t258474\n提取\t258475\npolymer\t258476\n愛評網\t258477\n干热灭菌\t258478\n军马\t258479\n一说\t258480\n狂宠\t258481\n国寿e宝\t258482\np67\t258483\nACOG\t258484\n杂问\t258485\n冬训\t258486\n加油门\t258487\n怡丽丝尔\t258488\n600171\t258489\n呼之欲\t258490\n博闻\t258491\n项目栏\t258492\n第九位\t258493\nユフィン\t258494\nAntoine\t258495\ndaiwa\t258496\n刘昊\t258497\n尖顶\t258498\n凹坑\t258499\n爱色影\t258500\nSOLR\t258501\nDB33\t258502\n接盘\t258503\n装饰板\t258504\n新闻记者\t258505\n敬候\t258506\n直读式\t258507\n捐献者\t258508\n航海王OL\t258509\nstatic函数\t258510\n如皋\t258511\n雅安职业技术学院\t258512\n盗天仙途\t258513\n心理援助\t258514\n菜网\t258515\nbenefit\t258516\n祥符街道\t258517\n不管怎样\t258518\n单反相机镜头\t258519\n邮政编码5\t258520\n博多豚骨拉面团\t258521\n1.25%\t258522\n测试库\t258523\n北京国贸大酒店\t258524\nnfhifi\t258525\n芭芭拉\t258526\n国馆\t258527\n星速\t258528\n公盘\t258529\n林甸县\t258530\n分秒秒\t258531\n601号\t258532\n李连林夕\t258533\n鄢\t258534\nWWW.lovefou88.com\t258535\n习见平\t258536\n普中\t258537\n全球气候变暖\t258538\n梁上\t258539\n河北对外经贸职业学院\t258540\n核试剂\t258541\nelan\t258542\n家语\t258543\n私生女\t258544\n树墩\t258545\n军哥\t258546\nZhai\t258547\n0.96\t258548\n无不胜\t258549\n99折\t258550\n拈花网\t258551\n汇顶科技\t258552\n向死而生\t258553\n基础课\t258554\nWhores\t258555\n赖小民\t258556\n秋虫\t258557\n抚远市\t258558\n西汉末年\t258559\n20.3\t258560\nmoveto\t258561\n伽罗瓦\t258562\n蜷\t258563\n孟冬\t258564\nCryptoJS\t258565\nfiller\t258566\n信办\t258567\nV领\t258568\nJoystick\t258569\n恒容\t258570\n窦文霍华德\t258571\n5.5\t258572\n积木盒子\t258573\n立花慎之介\t258574\n一号文\t258575\n商丘站\t258576\n卫道士\t258577\n顶轴\t258578\n美图秀秀拼图\t258579\n发表会\t258580\n龙兄虎弟\t258581\n滤材\t258582\nresting\t258583\n纷飞\t258584\n3D建模\t258585\n英国国家剧院\t258586\n复盘\t258587\n多米尼加共和国\t258588\n背压阀\t258589\ncwt\t258590\n优越\t258591\n20141121\t258592\n仕途\t258593\n标量\t258594\nterror\t258595\n激扬\t258596\n董明\t258597\n欺诈行为\t258598\n榆林\t258599\n税务总局\t258600\n水煎包\t258601\nblouse\t258602\nleads\t258603\n钋\t258604\nfinal7\t258605\ncp1h\t258606\n上海晨光科力普办公用品有限公司\t258607\n蜂窝活性炭\t258608\n昌化路\t258609\nOrgy\t258610\n牧神\t258611\n南京师范大学研究生院\
t258612\n西华门\t258613\n伊卡璐\t258614\n串店\t258615\n推文\t258616\n一个几\t258617\n沪嘉甬铁路\t258618\ndecodeURI\t258619\n郑勇\t258620\n赤松\t258621\n疫病\t258622\n冲击函数\t258623\njavaWEB\t258624\n洞山\t258625\nICNS\t258626\nlander\t258627\n南京财经大学\t258628\n481号\t258629\n广东岭南职业技术学院\t258630\n没羞\t258631\n收割者\t258632\n16W\t258633\n定业\t258634\n王琴\t258635\nDll\t258636\n广昌隆\t258637\n琳达\t258638\n界王\t258639\n左脚\t258640\n郭全宝\t258641\nJINS\t258642\n冲绳县\t258643\n边墙型\t258644\n三栏\t258645\n羁绊\t258646\nG9500\t258647\n游来\t258648\n云和山\t258649\n360分\t258650\n鬼迷\t258651\n传奇永恒归真版\t258652\n二板\t258653\ndenoising\t258654\n精化气\t258655\n54M\t258656\n8833\t258657\n90SS套\t258658\n二段\t258659\n英孚教育\t258660\n上海市委党校\t258661\n补员\t258662\n敌舰\t258663\n哪版\t258664\nsprt\t258665\nxshow\t258666\n焦度计\t258667\n印刷厂\t258668\n2800万\t258669\n林州市\t258670\n一本道无码在线av\t258671\n杨菲洋\t258672\n中国教育装备网\t258673\n陌生女\t258674\n两相宜\t258675\n隐婚蜜爱\t258676\n泸县人民政府\t258677\nAccess2010\t258678\n十一月四日\t258679\nscrolling\t258680\n星川\t258681\nfir滤波器\t258682\n刘封\t258683\n华润中央公园\t258684\n卧式暗装风机盘管\t258685\n和君商学\t258686\n601877\t258687\n倒顺\t258688\nhyukoh\t258689\n华信智原\t258690\n桑叶\t258691\n财务报表分析\t258692\n上用\t258693\nIIS7.0\t258694\n铿锵三人行\t258695\nheadlines\t258696\n一大波\t258697\n象山县政府\t258698\niPhone5c\t258699\n死肥宅\t258700\ncctv13\t258701\n镣\t258702\n阅文集团\t258703\n庄姓\t258704\n净资产折股\t258705\n本\t258706\nLW\t258707\n南岔区\t258708\n展演\t258709\n超低音\t258710\n我只在乎你\t258711\n板式冷却器\t258712\n汤泉池\t258713\n迷你仓\t258714\n云南交通职业技术学院\t258715\n晨钟暮鼓\t258716\n到飞\t258717\n大陆赛马网\t258718\ntik\t258719\n光明地产\t258720\n春雪\t258721\nmimikatz\t258722\n86013337\t258723\n到期日\t258724\nbiqucun\t258725\n南海\t258726\n北京利尔\t258727\n碳水\t258728\n乙肝病毒\t258729\n谢文珂\t258730\n四制\t258731\n郭大侠\t258732\nsrtp\t258733\n基本工\t258734\n不定时\t258735\n洁面皂\t258736\n博越社区_汽车论坛-易车网\t258737\nNetCDF\t258738\n丁酉\t258739\n海滨广场\t258740\n林寒\t258741\n固定板\t258742\n平邑县\t258743\n仁心仁术\t258744\n16关\t258745\ngoo\t258746\n刚硬\t258747\nilucking\t258748\n水煮\t258749\n总谱\t258750\n王红丽\t258751\n吾喜杂志网\t258752\n黄陂区\t258753\n维蕾德\t258754\nriddl
ejoker\t258755\n丑时\t258756\n分数乘法\t258757\n想法\t258758\nshishi\t258759\n职业打假人\t258760\nNO.3\t258761\n重庆大学建筑城规学院\t258762\nNaCl\t258763\n陈国良\t258764\n赵巷镇\t258765\n102.7\t258766\n中华合作时报\t258767\n原曲\t258768\n耶亚希\t258769\n燕燕\t258770\n发展_参考网\t258771\n杨晓芸\t258772\n几节\t258773\n爱情回来了\t258774\n黑色止血钳\t258775\n江南站\t258776\n日置\t258777\n威特斯国际洗衣\t258778\n日记大全\t258779\n信锐\t258780\n买一送一\t258781\n无骨鸡爪\t258782\n柔嫩\t258783\n蒙诺\t258784\n高筒\t258785\n线性卷积\t258786\n李慧珠\t258787\n方金刚\t258788\n用墨\t258789\n西安地区\t258790\n50余名\t258791\n云创通\t258792\n支付宝交易号\t258793\n盐酸\t258794\n王医生\t258795\n131家\t258796\n优米\t258797\n营口东站\t258798\n黑龙江省地方税务局\t258799\n揩油\t258800\n抗浮锚杆\t258801\n防毒\t258802\n拔出\t258803\n长沙航空职业技术学院\t258804\n战友们\t258805\nsend函数\t258806\n_战国兰斯吧\t258807\n胡启生\t258808\n6.1.5\t258809\n抗老\t258810\n黑群\t258811\nmick\t258812\n6y\t258813\n弱鸡\t258814\n幼欲\t258815\n青城山\t258816\n王母\t258817\n2402\t258818\n肇庆新区\t258819\n常泰\t258820\n江西中烟\t258821\n云蒙\t258822\nInformatica\t258823\n板片\t258824\n甲午大海战\t258825\n8.31\t258826\n彩钢活动房\t258827\nSIC\t258828\n耐热钢\t258829\n镖人\t258830\n韩城市人民政府\t258831\n基金协会\t258832\n打预防针\t258833\n18万\t258834\n大柜\t258835\narcgisserver\t258836\n百通世纪\t258837\n长正\t258838\n武汉地铁8号线\t258839\n济南市纪委监委\t258840\n果冻\t258841\n隅田川\t258842\n技工类\t258843\nEnemy\t258844\n保洁车\t258845\nbitrate\t258846\n蓝熊\t258847\nSen\t258848\ngot\t258849\n小米蕉\t258850\n咬人\t258851\nPPmoney万慧可信\t258852\n2022世界杯\t258853\nbiography\t258854\n封印师\t258855\n木喉\t258856\n359\t258857\n佳苑\t258858\n林海公路\t258859\n中资银行\t258860\n张耀扬\t258861\n各坛\t258862\nnvme协议\t258863\n盲女\t258864\n92页\t258865\n负债累累\t258866\n酒精计\t258867\n睿频\t258868\nzxing\t258869\n美宜佳便利店\t258870\n3日\t258871\nz68\t258872\n20160616\t258873\n根鸟\t258874\n3万分\t258875\nCaptain_Li\t258876\n玩点\t258877\n黄金棒\t258878\n河南移动\t258879\n遗孤\t258880\nselenium\t258881\n溺水者\t258882\n上海高院\t258883\n恩波教育\t258884\n科迪乳业\t258885\n付梦妮\t258886\nclamp\t258887\nwrit\t258888\n反向代理服务器\t258889\n黑撒\t258890\n冲头\t258891\n000069\t258892\n粉丝群\t258893\n调用链\t258894\n80080005\t258895\ngrep\t258896\n罗伯斯\t258897\nHyp
er-v虚拟机\t258898\n海誓山盟\t258899\n路志正\t258900\nPour\t258901\nAmedeo\t258902\n乐克\t258903\n_皇\t258904\nMMRecovery\t258905\n刘若钒\t258906\n窥膜\t258907\n20182月\t258908\n农林牧副渔\t258909\nBLACKPINK\t258910\n济南搬家公司\t258911\n中亿财经网\t258912\n痛车\t258913\nb战\t258914\n辟谣\t258915\n北京师范大学心理学部\t258916\n胃经\t258917\n百通\t258918\ncanon\t258919\n高橋\t258920\nsjtxt\t258921\n创圣\t258922\nズ\t258923\n主要症状\t258924\n排汗\t258925\n大气磅礴\t258926\nConquer\t258927\n80年代末\t258928\n预付卡\t258929\n钓鱼翁\t258930\n邓林\t258931\n横亘\t258932\n王红军\t258933\n蒙迪凯迪拉克ats\t258934\nDOM树\t258935\nEtherChannel\t258936\n动作\t258937\n鱼鳞纹\t258938\n肠系膜淋巴结炎\t258939\n唐山纪委监察局\t258940\n文品\t258941\n开腿\t258942\n钢结\t258943\n凋谢\t258944\n洗衣盆\t258945\nmous\t258946\n应避免\t258947\n第二十天\t258948\n姚洋\t258949\n小芽\t258950\njjx\t258951\n口字\t258952\n火焰纹章if\t258953\n跪谢\t258954\n被蛇咬\t258955\n佳依\t258956\n桂林西站\t258957\n沿路\t258958\n隔珠\t258959\n异形体\t258960\nmaroon\t258961\n泸州新闻网\t258962\nsandbox\t258963\n司马彦\t258964\n120击\t258965\nngff\t258966\n体\t258967\nMC34063\t258968\npique\t258969\n于文华\t258970\n白龙港\t258971\nNo.11\t258972\n蛙花\t258973\nskewed\t258974\n纤夫\t258975\nSonatype\t258976\n萤火特洛伊\t258977\n武汉光电国家实验室\t258978\naicp\t258979\n风散\t258980\nKinect2\t258981\n广州市区\t258982\n茴茴\t258983\n小飞象\t258984\nmanner\t258985\n可诉性\t258986\n柳州\t258987\n太麻\t258988\n英雄无敌3吧_\t258989\n控管\t258990\n赏心\t258991\n王佳芝\t258992\n语段\t258993\n公户\t258994\n遍\t258995\n公关费\t258996\n同兴村\t258997\n三国霸业2\t258998\n伽马值\t258999\n张震\t259000\n北京4号线\t259001\n瑞芳\t259002\nukey\t259003\n锤子M1\t259004\nremoved\t259005\n非模态\t259006\nUFC223\t259007\nwww8888avco\t259008\n射阳县\t259009\n每两天\t259010\n交片\t259011\n爱卫\t259012\n亚文\t259013\n撕贴\t259014\n江璟儿\t259015\n赫莲娜\t259016\n体育学\t259017\n血婴\t259018\n艾吉\t259019\n长寿区\t259020\n半多\t259021\nmicrobiome\t259022\n礼包版\t259023\n衣锦夜行\t259024\n均方根\t259025\nla乔布斯\t259026\n拦路\t259027\n王上源\t259028\n博商管理科学研究院\t259029\ngloves\t259030\nC++头\t259031\n红闪\t259032\n华三交换机\t259033\n油桐花\t259034\n46个\t259035\n星王\t259036\n中共山东省委\t259037\n坡跟鞋\t259038\nBrooklyn\t259039\n魔戒\t259040\n广州市商务委员会\t25904
1\n不疯魔不成活\t259042\n防滑条\t259043\n习近平博鳌论坛\t259044\n孙哲\t259045\ntianjia\t259046\n第七十条\t259047\n李某\t259048\n晚安\t259049\n于康\t259050\n衣食\t259051\n中学版\t259052\nvmstat\t259053\n小妻\t259054\nrosen\t259055\n2559\t259056\n阴\t259057\n金环小镇\t259058\n立马\t259059\n吊死\t259060\n北发图书网\t259061\n石油烃\t259062\n潘爱华\t259063\n毛溪浩\t259064\n九宫幻卡\t259065\n澳大利亚元\t259066\nComparator\t259067\n界石\t259068\n尊长\t259069\n云之南\t259070\n普元\t259071\n成都市实验小学\t259072\n仄声\t259073\n1下\t259074\n植保会\t259075\n一快\t259076\n_花\t259077\n输送车\t259078\n安卓乐园安卓\t259079\n旧人\t259080\n盐酸舍曲林\t259081\n苏霞\t259082\n乳核\t259083\n瓶套\t259084\nPosted\t259085\n广渠家园\t259086\n阳光照明\t259087\n微信支付账单\t259088\n御寒\t259089\n幸福中路\t259090\n岚城\t259091\n画梁\t259092\n大连装修公司\t259093\nCONSTANT\t259094\nThick\t259095\nEAX\t259096\n风阻系数\t259097\ndn300\t259098\nWXML\t259099\n两三句\t259100\n132亿\t259101\n果葡糖浆\t259102\np2\t259103\n转储\t259104\n人教版五年级语文上册\t259105\n笑声传奇\t259106\n睿米\t259107\nGTS250\t259108\n信用度\t259109\n测定仪\t259110\n思路\t259111\n台架\t259112\nsuperb\t259113\njcy\t259114\n空气净化器\t259115\nbeplay\t259116\n2.6亿\t259117\n炎症性肠病\t259118\n轻钙\t259119\n一直以为\t259120\nKeepAlive\t259121\nldp\t259122\nodbo\t259123\n灰白色\t259124\n主演员\t259125\n戴资颖\t259126\n房地产业协会\t259127\n如锦\t259128\n成本高\t259129\n过海\t259130\ninhibit\t259131\n步进电机\t259132\nMet\t259133\n忆初心\t259134\nSolidWorks2016\t259135\n吗_39健康网\t259136\nMT5\t259137\n逆回购\t259138\n咖啡机\t259139\n株洲\t259140\n螺旋压缩弹簧\t259141\n3P\t259142\nXxX\t259143\n雾度\t259144\nf564\t259145\n鸢尾\t259146\n达斯维达\t259147\n道友\t259148\nwwn\t259149\n调拨单\t259150\n瓜子二手车\t259151\nwin10卡\t259152\n四五十\t259153\n独显方法_电脑百事网\t259154\n维珍\t259155\n小小运动馆\t259156\n十二色\t259157\nVCI\t259158\n血翼飞龙\t259159\n真相帝\t259160\n四邻\t259161\n剪式举升机\t259162\ncgb\t259163\n1.2.1\t259164\n西格蒙德·弗洛伊德\t259165\n一遍遍\t259166\n求甚解\t259167\n日本贝亲\t259168\nX820\t259169\nPeaks\t259170\n免签约\t259171\n片麻岩\t259172\n聊城市\t259173\n贾青\t259174\n微感言\t259175\n群龙\t259176\nVeronica\t259177\n北京大学肿瘤医院\t259178\nlibaio\t259179\nfk2\t259180\n旅顺博物馆\t259181\n宁波银行股份有限公司\t259182\nBT迷\t259183\n波西亚\t259184\n示范村\t
259185\n美术室\t259186\n165万\t259187\n苏亚雷斯\t259188\n20180211\t259189\nㄊ\t259190\n荤段子\t259191\n江苏省质监局\t259192\n凯绅\t259193\nAutoCAD2017\t259194\n绝地求生封\t259195\n2j\t259196\n萨姆斯\t259197\n鲍图\t259198\nMediaCodec\t259199\n22mm\t259200\n代办\t259201\n南印度\t259202\nmadison\t259203\n免责协议\t259204\n5根\t259205\n音舞\t259206\n115家\t259207\nzmzhao\t259208\n舒肝丸\t259209\n10月26日\t259210\n优办\t259211\n1545\t259212\n乐逍遥\t259213\n垂直滚动条\t259214\n利文斯顿\t259215\nDVD播放机\t259216\n500MW\t259217\nMc天佑\t259218\n速回\t259219\n货币互换协议\t259220\n梵洁\t259221\n火力发电厂\t259222\n韩女\t259223\n江西交通职业技术学院\t259224\nBeing\t259225\n不可说\t259226\n十溴二苯乙烷\t259227\n慈文\t259228\n郑淳元\t259229\n龙脊\t259230\n情满四合院\t259231\n防电墙\t259232\n感叹句\t259233\n陇原\t259234\n利和\t259235\n合流制\t259236\n机架\t259237\n11月11日\t259238\nProject2016\t259239\n蔡礼旭\t259240\n女集中营\t259241\nenamel\t259242\n阖\t259243\n任涛\t259244\n跑法\t259245\n中国移动通信\t259246\n天泉湖\t259247\nCOMMUNICATION\t259248\n暴走大事件\t259249\n亚硝酸\t259250\n总数\t259251\n宜居生活\t259252\n李颂南极之恋\t259253\n绝对优势\t259254\n井队\t259255\nOU\t259256\nイカ\t259257\n贴士\t259258\n炎亚纶\t259259\ndtypes\t259260\n更狠\t259261\n中国专用汽车网\t259262\n狼眼\t259263\n抵港\t259264\n常规\t259265\n腰椎滑脱\t259266\n18种\t259267\n虚拟内存\t259268\n肌源\t259269\n无翼\t259270\n生化模式\t259271\n资本性\t259272\nmacan\t259273\n关山月\t259274\n半度\t259275\nword宏\t259276\n并购优塾\t259277\n第七版\t259278\nshishen\t259279\n显色剂\t259280\n汽修厂\t259281\n午后\t259282\nOpenssh\t259283\n知无\t259284\ntime\t259285\n3.4.2\t259286\n橄\t259287\n溶度\t259288\n冠以\t259289\n苹果6Splus\t259290\n双杀\t259291\n_兔\t259292\n回眸\t259293\n五联村\t259294\n程度\t259295\nJPEG格式\t259296\n景光\t259297\n馒头山\t259298\ndapp\t259299\n对方\t259300\n作业员\t259301\nmax+\t259302\n企业信息系统\t259303\n京广铁路\t259304\n童哲\t259305\nL565\t259306\nwxy\t259307\n建筑设计防火规范\t259308\n巳亥\t259309\n担当\t259310\n重庆工业职业技术学院\t259311\n李建中\t259312\n平分\t259313\n朱衣\t259314\n江南御园\t259315\n阅报栏\t259316\n刘智扬\t259317\n加气混凝土砌块墙\t259318\n脚踏式\t259319\n大不过\t259320\n泰坦2吧\t259321\nCam\t259322\n瀚\t259323\n独山玉\t259324\n麻城一中\t259325\n含钢\t259326\n十诫\t259327\n胶片机\t259328\n佳能650d\t259329\n2118\t
259330\n大路上\t259331\nhuay\t259332\necharts\t259333\nGraphis\t259334\n汉普顿\t259335\n绿篱机\t259336\nMIZUNO\t259337\n嘉士伯啤酒\t259338\n宋云谦\t259339\ncolumbia\t259340\nprima\t259341\n935\t259342\n2018年5月11日\t259343\n灵飞\t259344\n2017-12-01\t259345\n普拉多\t259346\n发性\t259347\n此后\t259348\n梨膏糖\t259349\n定规\t259350\nFleet\t259351\n六队\t259352\n布鲁诺·马尔斯\t259353\n礼盒\t259354\n火化证\t259355\n阴门\t259356\n关于全面推行河长制的意见\t259357\n情况\t259358\n沈阳站\t259359\n马尔茅斯\t259360\n单务\t259361\nQQ批量\t259362\n江济淮\t259363\n婚床\t259364\n上分\t259365\n残袍\t259366\n李芳\t259367\n二句\t259368\n21:00\t259369\n磁力链接搜索引擎\t259370\n航天四院\t259371\n城盟\t259372\n院部\t259373\n刘国\t259374\n肥东县\t259375\n依云华府\t259376\ncited\t259377\n90公斤\t259378\nuq\t259379\n香港苹果\t259380\n东风标致4008\t259381\nForece\t259382\nH-1B\t259383\nToe\t259384\n朵云轩\t259385\n偏印格\t259386\n消弭\t259387\n福尔摩斯探案集\t259388\n积食\t259389\n850元\t259390\n鸡兔\t259391\nkita\t259392\n三合一\t259393\ntorndb\t259394\nDJ舞曲网\t259395\nBaiduMap\t259396\n八处\t259397\n木道\t259398\n脚蹬\t259399\n鸭志田\t259400\n航头镇\t259401\n歌谣\t259402\n知否知否应是绿肥红瘦\t259403\n湖州19楼\t259404\n品茗杯\t259405\nxlstat4.dll\t259406\n生女\t259407\nOzawa\t259408\n小兽妃\t259409\n五星红旗\t259410\n罗伊\t259411\n卓越地产\t259412\n锋尚\t259413\n浸会大学\t259414\n屈螺酮炔雌醇片\t259415\n济南小学\t259416\n燕之坊\t259417\nSBS人气歌谣\t259418\n5220\t259419\n拜忏\t259420\n17句\t259421\n毛主席\t259422\nGRO\t259423\n虾饵\t259424\n男妾\t259425\n黄腻\t259426\n大破天幕杀机\t259427\nslogan\t259428\n南腔北调\t259429\n古琦\t259430\n捆\t259431\n童菲\t259432\n狗\t259433\n郝红梅\t259434\n随梦\t259435\nifunbox\t259436\n1.5v\t259437\njs变量\t259438\n安徽农村信用社\t259439\nWisconsin\t259440\n光柱\t259441\n电子吊秤\t259442\n纪念款\t259443\n爽约\t259444\n千字\t259445\n胡婷婷\t259446\n蔡总\t259447\nbitset\t259448\n零用钱\t259449\nsig值\t259450\n物片\t259451\n电子水准仪\t259452\nConvenient\t259453\nhiber\t259454\n米歇尔·罗德里格兹\t259455\n申报表\t259456\ninfinite\t259457\n原味\t259458\n开拓性\t259459\n政府采购网\t259460\nYB\t259461\n3.5升\t259462\n第四军医大学唐都医院\t259463\n疙\t259464\n仙傲\t259465\n国宝档案\t259466\ndietary\t259467\nmh4g\t259468\nc刊\t259469\n港杂费\t259470\n光气\t259471\npushover\t259472\n灵鹫宫\
t259473\n驯马\t259474\n标有\t259475\n乔布放牛班的春天\t259476\n旁友\t259477\n万达网络科技集团\t259478\n佐治亚\t259479\nru\t259480\n杨荣\t259481\n句块\t259482\n蛇姬\t259483\n隆回\t259484\n5月底\t259485\n8脚\t259486\n技犯\t259487\n抛石\t259488\n李寻欢\t259489\n四通镇\t259490\n湖北省交通运输厅\t259491\n珠江三角洲\t259492\npon\t259493\n4次\t259494\n灵气\t259495\n河北网\t259496\n玉势\t259497\n20.7\t259498\n俯瞰\t259499\n瑞华\t259500\n凉山日报社\t259501\n夹逼\t259502\n沿海\t259503\n普外\t259504\n途风\t259505\n德国打折网\t259506\nConnectivity\t259507\n圆谷\t259508\n没准\t259509\nOrchestral\t259510\n马仙\t259511\n富家\t259512\n内蒙古高原\t259513\n神医喜来乐\t259514\n旧州\t259515\n树洞\t259516\nflights\t259517\n水龙卷\t259518\n韩佳人\t259519\n2017.10\t259520\n活塞式\t259521\n呼救\t259522\n前身\t259523\n纸工\t259524\n梅华\t259525\nXJTLU\t259526\n圆钢\t259527\npassword_hash\t259528\nmotel\t259529\n铁姆\t259530\n2.3.4\t259531\n五一高速\t259532\nlaserjet\t259533\n疏导\t259534\n马克华菲\t259535\n格雷科技\t259536\n2013—2020年\t259537\n剑鱼\t259538\n朱圣祎\t259539\n数珠丸恒次\t259540\n白狼\t259541\n商贸有限公司\t259542\n腾邦\t259543\n中国阳泉政府\t259544\nfederal\t259545\n触底反弹\t259546\nfvd\t259547\n香港中央\t259548\n山转水\t259549\n废\t259550\n场地图\t259551\n笔夹\t259552\n重点中学\t259553\n制冷百科\t259554\n人日\t259555\ndaili\t259556\n杨氏\t259557\n文远\t259558\nSTEINS\t259559\n管带\t259560\n不结婚\t259561\n3hk\t259562\n质量保险\t259563\n16种\t259564\n苏州工业园区社会保险基金和公积金管理中心\t259565\n8芯\t259566\n蜜芽\t259567\nGold\t259568\n尤克里里Fans乌克\t259569\n平安县\t259570\nv5.0\t259571\nwound\t259572\n导向性\t259573\n无袖\t259574\naffiliate\t259575\n客家吧\t259576\n密炼机\t259577\nXixinv\t259578\n洽洽\t259579\n气块\t259580\n券码\t259581\n自由贸易\t259582\n六年级英语下册\t259583\n假造\t259584\nloaders\t259585\n空房子\t259586\n粗野\t259587\n2P\t259588\n20161211\t259589\n3px\t259590\nIp\t259591\n绝艳\t259592\n烘干箱\t259593\n周澎\t259594\n多核处理器\t259595\nDiskGenius\t259596\n极寒\t259597\n黄泥磅\t259598\n91PORN\t259599\n开式冷却塔\t259600\n木婉\t259601\n山东快书\t259602\nnumeca\t259603\n落空\t259604\n赖忠标\t259605\n刘以达\t259606\n如潮\t259607\nntpq\t259608\n1459\t259609\n佛塑科技\t259610\n表现\t259611\n事业观\t259612\n初阅读网\t259613\nhp1010\t259614\n0x000003e3\t259615\n下载类\t259616\n阿里斯顿电热
水器\t259617\n颐安\t259618\n下一代\t259619\n明晨\t259620\n58|\t259621\nVertex\t259622\n第11周\t259623\n枣庄市中区\t259624\n960m\t259625\n谷澍\t259626\n气动振动器\t259627\n复原\t259628\n私生\t259629\n适当\t259630\n四条屏\t259631\n家仙\t259632\n两棵\t259633\n黄水谣\t259634\n拉米夫定\t259635\n苏州大学文正学院\t259636\n_卿\t259637\n金立S9\t259638\n五城联创\t259639\n琴川\t259640\n走近科学\t259641\neither\t259642\nsds-page\t259643\n0087\t259644\nggdb\t259645\nRestore\t259646\n五毫米\t259647\nPPT批量\t259648\n南昌市地方税务局\t259649\n送股\t259650\n工银瑞信基金管理有限公司\t259651\n好友们\t259652\n美丽岛\t259653\n2018年1月1号\t259654\n郭店镇\t259655\n浅陌\t259656\n多丽丝·莱辛\t259657\ncsnmd\t259658\n吸烟率\t259659\n山道\t259660\n新业\t259661\n邓向阳\t259662\n友友\t259663\nOricon\t259664\n2017年3月24日\t259665\n错音\t259666\n麻烦你\t259667\n碎发\t259668\n热释\t259669\nFell\t259670\nXC90\t259671\nmha\t259672\nnoticeable\t259673\n部落\t259674\n登号\t259675\n第六代\t259676\n马伟明\t259677\n任洪斌\t259678\n迎宾馆\t259679\n唐草\t259680\ntilt\t259681\n沙比利\t259682\n在位\t259683\n李国豪\t259684\n颜色值\t259685\n脱脂牛奶\t259686\n王玉平\t259687\n浈江区\t259688\nCowboy\t259689\n无情剑\t259690\n番位\t259691\n自拍照\t259692\n11组\t259693\nOem\t259694\n康盛\t259695\n舌下腺囊肿\t259696\n17篇\t259697\n湖北省统计局\t259698\n1644年\t259699\n账公司\t259700\nkN\t259701\n深圳市政府采购中心\t259702\n武林盟私密记事\t259703\n憩室\t259704\n维生素K\t259705\n葫芦山\t259706\n18分钟\t259707\n三庆园\t259708\n郑玄\t259709\nshindoyang\t259710\nyourphp\t259711\n板架\t259712\n在渊\t259713\n摩尔曼斯克\t259714\n金融租赁公司\t259715\n7.04\t259716\n周口日报社\t259717\n试制\t259718\nアイロニ\t259719\n重庆钢铁\t259720\n凉凉\t259721\n梦镜\t259722\n起亚卡宴\t259723\nハンタ\t259724\n婚事\t259725\n天天剑侠世界\t259726\n小腹坠胀\t259727\nDxO\t259728\n铜棒\t259729\n黄山宏村\t259730\nwares\t259731\n游历\t259732\n保洁柜\t259733\n黄渡镇\t259734\n常用库\t259735\n江苏惠尔泵业有限公司\t259736\nplos\t259737\n文史馆\t259738\n票面利率\t259739\n处变不惊\t259740\n1935\t259741\n巴淡岛\t259742\n阀片\t259743\n蝴蝶泉边\t259744\n牌堆\t259745\nMaisy\t259746\n钟丽燕\t259747\n美称\t259748\n深圳锦绣中华民俗村\t259749\nshino\t259750\n三截\t259751\n立享\t259752\n20151117\t259753\n青幕山\t259754\n联界\t259755\n车家号_\t259756\n植物大战僵尸中文版_植物大战僵尸\t259757\n一节\t259758\njumia\t259759\nmc5\t25976
0\n树简笔画\t259761\n阜阳政府\t259762\nfoxmai\t259763\n环戊烯\t259764\n干事\t259765\n后面\t259766\n龙母\t259767\n城女\t259768\n幅号\t259769\n更省钱\t259770\n小道士\t259771\n100万公里\t259772\n张占岭\t259773\nnobel\t259774\ntownship\t259775\n巢湖市人民政府\t259776\n道地药材\t259777\nnaticat\t259778\n丽思\t259779\n掼蛋\t259780\n60cj.com\t259781\n求学快递网\t259782\n公平贸易\t259783\n隔山有眼\t259784\n牛角座\t259785\n宫西达也\t259786\n守宫\t259787\n有条理\t259788\nCooker\t259789\nmxp\t259790\nvgd\t259791\n江苏城\t259792\n蜜恋\t259793\n中殿\t259794\n玄彬\t259795\nOpenID\t259796\n浆细胞\t259797\n上个星期\t259798\n红旅\t259799\n伦敦大学学院\t259800\n小狼旺吉\t259801\n两驱\t259802\n金菊\t259803\n爱阅网\t259804\n1280X720\t259805\n莫言杰森斯坦森\t259806\nenzymes\t259807\n86家\t259808\n黄鼬\t259809\n曲儿\t259810\nxdrive\t259811\n湖北省博物馆\t259812\n福建省农业厅\t259813\n2.8.2\t259814\nBattles\t259815\nnte\t259816\n益林\t259817\n嗨学网\t259818\n大战\t259819\nwisp\t259820\nGridView\t259821\n艾力克\t259822\n频闪\t259823\n装饰画\t259824\n魅力\t259825\n张哲瀚\t259826\n花落\t259827\np-value\t259828\n自居\t259829\n青儿广场舞\t259830\nPlayMemories\t259831\n往者\t259832\nkmplayer\t259833\n成人高考\t259834\n知识篇\t259835\ndamuzzz\t259836\n海薇\t259837\n网上支付\t259838\n0421\t259839\n养老保险费\t259840\n馒头商学院\t259841\nGuzzle\t259842\nordered\t259843\n公泰迪\t259844\n晚评\t259845\n土方放坡系数\t259846\n1320\t259847\n五里坨\t259848\nqualifiers\t259849\nテン\t259850\n3寸\t259851\n西户\t259852\n_凤凰\t259853\n2030年前\t259854\n上海爱康国宾体检中心\t259855\n混沌与秩序2救赎\t259856\n过山车大亨3\t259857\n阴阳师阴阳寮\t259858\n安阳县\t259859\n凤娇\t259860\n托板\t259861\n李知恩\t259862\nGlobe\t259863\n元奎\t259864\n圆坟\t259865\n胶业\t259866\n04735\t259867\n祼\t259868\nf186\t259869\n河豚鱼\t259870\n麦博\t259871\n月光下\t259872\n冰冰邦邦\t259873\n宝悦\t259874\n鹿少女\t259875\n三街\t259876\n_山村\t259877\n绝世好男人\t259878\n蒙牛公司\t259879\n金汇镇\t259880\nPDP\t259881\n博人\t259882\n封胶\t259883\nRGB格式\t259884\nsug\t259885\n塌落\t259886\nTomoe\t259887\nVJ\t259888\n乐斯\t259889\n恒大林溪郡\t259890\n高等教育出版社\t259891\n盖住\t259892\n荐读\t259893\n幸福人生\t259894\n酋长球场\t259895\n古惑仔:江湖新秩序\t259896\n蛋皮\t259897\n奶盖茶\t259898\n2017年2月15日\t259899\n缝纫\t259900\n唐山市区\t259901\n施暴\t259902\n180秒\t259903\n挂
屏\t259904\n友谊北大街\t259905\n起拍\t259906\n销量\t259907\n南充市\t259908\nReduction\t259909\n蓓\t259910\n除权\t259911\n秀色\t259912\n笑到\t259913\n过膝\t259914\n7.78\t259915\n巨神战击队之超救分队\t259916\nsankaku\t259917\n爱你入骨\t259918\n青浼\t259919\n影色\t259920\n不值钱\t259921\n深圳国际\t259922\n普吉岛机场\t259923\n红环\t259924\n五六十岁\t259925\nパイパン\t259926\n邮人\t259927\nFormPanel\t259928\n汪汪队立大功第三季\t259929\n君澜\t259930\n汉化_游侠网\t259931\n智昕\t259932\nacqua\t259933\n中华企业\t259934\n海南\t259935\n新晨报数字报\t259936\nSH文件\t259937\n中煤科工\t259938\n载货电梯\t259939\n贵庚\t259940\n蔡方萌\t259941\nsvg格式\t259942\nRatingBar\t259943\n死亡笔记2\t259944\n多克隆\t259945\n黑霹雳\t259946\n大学\t259947\n积灰\t259948\n空预器\t259949\n儿化\t259950\n复方玄驹胶囊\t259951\n通海\t259952\n萨尔图\t259953\nrd\t259954\n38次\t259955\n侠盗猎车手:圣安地列斯\t259956\n在世\t259957\n嘉定西站\t259958\nworms\t259959\ntotally\t259960\n和窗\t259961\n斩味\t259962\nV500\t259963\n凤城二路\t259964\n双桥村\t259965\n急性荨麻疹\t259966\n笔墨纸砚_雅昌\t259967\n馄饨\t259968\ngoros\t259969\nohsas18001\t259970\n同等学历申硕\t259971\n面盖\t259972\n利滚利\t259973\nschemes\t259974\n外地上\t259975\n钩型\t259976\n雨宫\t259977\n断直连\t259978\nTapes\t259979\n关盖\t259980\n恶念\t259981\n1877\t259982\n接近于\t259983\n保佑\t259984\n钟村镇\t259985\nRect\t259986\n车员\t259987\n感悟\t259988\nnecessary\t259989\n毕雯朱德\t259990\nAntique\t259991\n蓝田股份\t259992\n得邦照明\t259993\n大研古镇\t259994\nCNC雕刻机\t259995\nxiaav论坛\t259996\n香港杜莎夫人蜡像馆\t259997\nlyc\t259998\n河钢集团\t259999\n霞帔\t260000\n好单\t260001\n三亚半山半岛\t260002\n致学\t260003\n东南方\t260004\n陈海\t260005\n中国汽车人才网\t260006\n生产类\t260007\n4月19号\t260008\n中小数\t260009\nMutual\t260010\n古烈\t260011\nCirculation\t260012\nroaming\t260013\n合久必婚\t260014\n物量\t260015\nHKC\t260016\n果冻包\t260017\n推车\t260018\nInsulated\t260019\ns7200smart\t260020\n张艺\t260021\n税徽\t260022\n误封\t260023\n妈呀\t260024\n不限\t260025\n艾文\t260026\n调理液\t260027\n张修维\t260028\n艾艾\t260029\ns805\t260030\n电率\t260031\n蓦\t260032\n盛业\t260033\n蒋瑶佳\t260034\n封蜡\t260035\n怪猎xx\t260036\n海洋量子号\t260037\nchub\t260038\n浩辰cad\t260039\n诛仙手游\t260040\n全国中小企业股份转让系统挂牌公司\t260041\nYe\t260042\n音乐库\t260043\n中国证券网\t260044\nnano3\t260045\nhash函数\t2600
46\n绿丝\t260047\n阿国\t260048\n马币\t260049\n23.1\t260050\n中铁四局\t260051\n皴法\t260052\n雁滩\t260053\n未了情\t260054\n赤城街道\t260055\n幸运色\t260056\n西安凤凰网\t260057\nXPOSED\t260058\n上妆\t260059\nwidow\t260060\nZHOU\t260061\nNova2S\t260062\n老戏骨\t260063\n冰雪节\t260064\n煤气炉\t260065\n内河港\t260066\n亚当斯\t260067\n礼崩乐坏\t260068\n猎天使魔女2\t260069\n河海大学\t260070\n省粮食局\t260071\n健康160挂号网\t260072\n招商网\t260073\n老湿\t260074\n使役\t260075\n陈宫\t260076\n广润\t260077\nX264-DIMENSION\t260078\n屬\t260079\n史馆\t260080\nShell\t260081\n我懂\t260082\n链码\t260083\n真题卷\t260084\n200mb\t260085\n97\t260086\n甲状腺功能检查\t260087\n春秋篇\t260088\nLiuYanYGZ\t260089\n17亿元\t260090\nstressed\t260091\n电音版\t260092\n综合篇\t260093\n第十次\t260094\n兰州城市学院\t260095\n曾力\t260096\n霍正阳\t260097\n第24次\t260098\n40万方\t260099\nplsqldeveloper\t260100\n山药粉\t260101\n枢\t260102\n童年期\t260103\n人地关系\t260104\n二泉\t260105\nMDL\t260106\n深圳前海微众银行股份有限公司\t260107\n佳力\t260108\n实值\t260109\n蟑\t260110\n噗通\t260111\n冷焊机\t260112\n暗疮\t260113\nTam\t260114\n拓海\t260115\nmiguel\t260116\n1.123\t260117\nlaravel5\t260118\n6X\t260119\n饮食篇\t260120\n31.5寸\t260121\n课课练\t260122\n妖夫\t260123\nshijiu\t260124\n阿尔泰尔\t260125\nrgb\t260126\n王都\t260127\n佛门\t260128\n揭阳市人民医院\t260129\nex280\t260130\n金富力\t260131\nthird\t260132\n出演\t260133\n亭林\t260134\n冷冲\t260135\n5000道\t260136\n折价率\t260137\n百时美施贵宝\t260138\n无抵押消费贷款\t260139\n近水\t260140\n海外市场\t260141\n宫位\t260142\n气相色谱法\t260143\n中国投影网\t260144\nkalafina\t260145\n9v\t260146\n榴莲酥\t260147\n离手\t260148\n偶久网\t260149\n接法\t260150\n在校\t260151\n潺\t260152\nCodeSky\t260153\n难解\t260154\npicturebox\t260155\n康妮\t260156\n秦青\t260157\nPlusDetail\t260158\n绝地求生:大逃杀\t260159\n蒙皮\t260160\n思路迪\t260161\n明神宗\t260162\nLiSA\t260163\n超微结构\t260164\nStruts1\t260165\n战刃\t260166\n第18节\t260167\n山东能源集团\t260168\nButts\t260169\n微盟智慧餐厅\t260170\n用血\t260171\n凶兆\t260172\n杂碎\t260173\n终止\t260174\nlin_zone\t260175\n炫亮\t260176\n赵超\t260177\n移植物\t260178\n嫲嫲团\t260179\n杰弗逊\t260180\n红斑马鱼\t260181\n条分法\t260182\n神开股份\t260183\n第二度\t260184\n晚唐\t260185\n文根英\t260186\n近一周\t260187\n贯线\t260188\n大有可为\t260189\nS版\t260190\nCEM\t26
0191\n1.48G\t260192\n双泉\t260193\n好寂寞\t260194\n北京体检中心\t260195\n水晶之痕\t260196\nFive\t260197\n微山湖\t260198\n参阅\t260199\n铅华\t260200\n百万元\t260201\n万寿宫\t260202\n天津金城银行\t260203\n属于危废\t260204\n上場\t260205\nbtp\t260206\n到站\t260207\n合金装备4\t260208\n张瑾\t260209\n犯罪心理学\t260210\n文主\t260211\n我的汤姆猫\t260212\n里仁\t260213\nselectable\t260214\nSnagit\t260215\nstm32/stm8\t260216\n流媒体播放器\t260217\nEUV\t260218\nlasers\t260219\njquery_脚本之家\t260220\nCA125\t260221\nDMU\t260222\ntobacco\t260223\n扎鲁特旗\t260224\nkoh\t260225\n印度币\t260226\nPurity\t260227\n扎心\t260228\n张奎\t260229\n护士资格证\t260230\n白切鸡\t260231\nMare\t260232\n九子\t260233\n冲破\t260234\n调至\t260235\n碱土金属\t260236\n张颐武\t260237\n罪\t260238\n近3年\t260239\n中国绿公司\t260240\n前介\t260241\n助考\t260242\nv3.0.1\t260243\n国轩高科\t260244\n伤亡人数\t260245\n选择器\t260246\nsilentjesse\t260247\n杨伟民\t260248\ncocos-js\t260249\n百度Hi\t260250\n雪弗兰科鲁兹\t260251\n飞凯材料\t260252\n宜昌市环境保护局\t260253\n找我\t260254\n等着我\t260255\n梅姨\t260256\n神州五号\t260257\nbieber\t260258\n截图\t260259\n朝方\t260260\n企业网\t260261\n大同矿区\t260262\n中鼎\t260263\n2018年04月13日\t260264\n大百科\t260265\n挠头\t260266\n素材库\t260267\n沙洋\t260268\nCOW\t260269\n暮鼓晨钟\t260270\n电杆\t260271\n重口h邪恶漫画\t260272\n拼乐\t260273\n置改革\t260274\n井下\t260275\n两本书\t260276\nX64\t260277\n呼和浩特火车站\t260278\n亲感\t260279\n活性炭口罩\t260280\nPercentage\t260281\n教院\t260282\n吗啡\t260283\n狗贼\t260284\n益寿\t260285\n保温壶\t260286\nAlliance\t260287\n无人知晓\t260288\n细菌学\t260289\n莲塘镇\t260290\n笑傲江湖之东方不败\t260291\n小川美\t260292\n闷骚\t260293\nip8p\t260294\nユニバ\t260295\n瓦里斯\t260296\nTypings\t260297\n认购期权\t260298\n第91集\t260299\n通盈\t260300\n东吴大道\t260301\n五月四日\t260302\n目标管理\t260303\n人山\t260304\n6.4.0\t260305\n宁波菜\t260306\nadvised\t260307\n彭金辉\t260308\n刘志成\t260309\n窘态\t260310\n纯静\t260311\n水平式\t260312\n拖泵\t260313\n雅西\t260314\n财务学\t260315\n鸟鸣山\t260316\n历战\t260317\n当头炮\t260318\n几品\t260319\n11.20\t260320\n2204N\t260321\n全国大学生创业服务网\t260322\n今天下午1:30\t260323\n250mg\t260324\n同音词\t260325\nRings\t260326\n路北区\t260327\n林雨\t260328\n趣_\t260329\nreactnative\t260330\n混凝土搅拌机\t260331\n过瘾\t260332\njitapu\t260333\n刘彬彬\t26
0334\n朔黄铁路\t260335\n退朝\t260336\n西條\t260337\n吴玉\t260338\nNECCS\t260339\n徐矿\t260340\n蒸汽洗车\t260341\n牧野流冰87\t260342\nBee\t260343\n福建省图书馆\t260344\n红花油\t260345\n本地化\t260346\nallied\t260347\n美迪\t260348\nHONDA\t260349\n斥责\t260350\nBitvise\t260351\n胃食管反流病\t260352\ns925\t260353\n杀身\t260354\nhindawi\t260355\n带领\t260356\n团团转\t260357\n崔西\t260358\nvlookup\t260359\n请记住\t260360\n天津南开区\t260361\n科立讯\t260362\nboto3\t260363\njishi\t260364\n储能\t260365\n浪险\t260366\n创客联盟\t260367\n鲜甜\t260368\nflexray\t260369\n第三十五集\t260370\nexternals\t260371\nbyid\t260372\n二胎\t260373\n扬州市区\t260374\nX2\t260375\n清算信息有限公司\t260376\n1.16.0\t260377\n细胞癌\t260378\n别打\t260379\n乐吧\t260380\n同堂\t260381\n垫面\t260382\n摊子\t260383\ncf葵\t260384\n襄州区人民政府\t260385\n威海市环境保护局\t260386\n钦北区\t260387\n时评文\t260388\n北坞嘉园\t260389\nd525\t260390\n二手房网\t260391\n梓由衣\t260392\n脑儿\t260393\nv3000\t260394\n榆林市\t260395\nBER\t260396\ncip\t260397\n1.2.0\t260398\n时间都去哪儿了\t260399\n急先锋\t260400\n秋吉雏\t260401\n印石\t260402\n颁发者\t260403\n2017年9月19日\t260404\n独步天下\t260405\n万科幸福里\t260406\n新华乡\t260407\n常州红梅公园\t260408\n45号\t260409\nRX7\t260410\n周某\t260411\n出名\t260412\n3分钟后\t260413\n岳王庙\t260414\n下岗潮\t260415\n夜辰\t260416\nInterceptor\t260417\n云裳\t260418\n挡光\t260419\n水溶性\t260420\n可控性\t260421\n样本容量\t260422\n老碗\t260423\n泰瑞亚\t260424\n明日边缘\t260425\n吴敦义\t260426\n石家庄站\t260427\nchengcheng\t260428\n上海当代艺术馆\t260429\n里程计\t260430\n有种\t260431\n港大新药\t260432\nWade\t260433\n地上式\t260434\n130cm\t260435\n又爱又恨\t260436\n为何人\t260437\n山人\t260438\narmv8\t260439\n翻译稿\t260440\nZEUS\t260441\n丰田兰德酷路泽\t260442\n江蕙\t260443\n有边\t260444\n宝马集团\t260445\n生育酚\t260446\n连上\t260447\n岳西县人民政府\t260448\n仪式感\t260449\n杨可\t260450\n家人的爱\t260451\n聚氨酯粘合剂\t260452\n相刑\t260453\n流言终结者\t260454\n群马\t260455\nvim8\t260456\nsupplies\t260457\n尕\t260458\n2017年6月12日\t260459\n大良\t260460\n快乐烘焙网\t260461\n麒麟路\t260462\n英魂之刃口袋版\t260463\n幼儿园世界地球日\t260464\n龙岗区政府\t260465\n益精\t260466\nPurses\t260467\nrestored\t260468\n十栋\t260469\n立卡\t260470\n炉中火\t260471\n考通\t260472\n大港股份\t260473\nHND\t260474\n2017滴滴\t260475\n足控\t260476\n737\t260477\ni
3处理器\t260478\n健康之路\t260479\n17134.1\t260480\n二凤\t260481\n海选\t260482\n如鱼得水\t260483\n告知书\t260484\n在线视听\t260485\n四川电信\t260486\n北京度假村\t260487\nZ3/Compact\t260488\n康老师\t260489\nCr\t260490\n由不得\t260491\n任敏\t260492\n水泥袋\t260493\nQQ黄钻\t260494\n崔恺\t260495\nassure\t260496\n加权法\t260497\n驻华\t260498\n增删改\t260499\n笔帘\t260500\n250v\t260501\n宋鸿兵\t260502\n面鞋\t260503\noch\t260504\nthereby\t260505\n暗黑百科\t260506\n神草\t260507\n口水巾\t260508\n足浴\t260509\nSLi\t260510\n南美白对虾\t260511\n散落\t260512\n底儿\t260513\n田林\t260514\n情圣\t260515\n打头\t260516\n28万元\t260517\n立花美\t260518\n声闻\t260519\n新云\t260520\n0458\t260521\n学费\t260522\n人教版小学数学五年级下册\t260523\n无锡市市\t260524\n通篇\t260525\ncjy\t260526\n跃动\t260527\n一地鸡毛\t260528\nET\t260529\n小漠\t260530\n虚拟卡\t260531\n曹溪\t260532\n第【3】页\t260533\n2018-03-08\t260534\n晴好\t260535\nvendor\t260536\nchinses\t260537\n尿床\t260538\n五洲国际\t260539\n甲午日\t260540\nanderson\t260541\n民力\t260542\n文明之旅\t260543\n1min\t260544\n德赫亚\t260545\n怎麼\t260546\nChord\t260547\n富政\t260548\n對\t260549\n下梅林\t260550\n四人组\t260551\nParams\t260552\n宁姚\t260553\n养殖商务网\t260554\n杯式\t260555\n营业执照_列表网\t260556\n欲望都市\t260557\n旁轴\t260558\n喜迁\t260559\n跑狗图论坛\t260560\n昏死\t260561\n正常性\t260562\n学校2015\t260563\n权力的游戏第七季\t260564\n甲巯咪唑\t260565\n狂飙\t260566\n富美高\t260567\n古墓丽影7\t260568\n赌王之王\t260569\n↓\t260570\n小卡\t260571\n其时\t260572\n新城\t260573\ndaisy\t260574\n安徽省交通规划设计研究总院股份有限公司\t260575\n助长\t260576\nPC浏览器\t260577\n海陵\t260578\n胚胎\t260579\n九万\t260580\n态图\t260581\np2055d\t260582\n231个\t260583\n小米mis2s\t260584\n中石油管道局\t260585\n染病\t260586\n标记点\t260587\nHermite\t260588\n张先\t260589\nJQUERY\t260590\n聚乙烯丙纶防水卷材\t260591\n漂白剂\t260592\n复活节岛\t260593\n隐蔽战线\t260594\n踩头\t260595\n上汽大众\t260596\nharvey\t260597\n催发\t260598\n黄水镇\t260599\n素月\t260600\n住院治疗\t260601\n第四条\t260602\nhaochu\t260603\n上海大众搬家公司\t260604\n注塑制品\t260605\n吾人\t260606\nP20Pro\t260607\n15亿美元\t260608\n点播放\t260609\n坐化\t260610\nonboard\t260611\n捞月狗全网\t260612\nmoer\t260613\nGALAXY\t260614\nmetric\t260615\n著述\t260616\nzjj\t260617\n2818\t260618\n8月9日\t260619\n招商银行香港分行\t260620\ntxt\t260621\n安乐死
\t260622\nDCP-7060D\t260623\n权力的游戏第二季\t260624\nshifen\t260625\nwindwos\t260626\n1174\t260627\n澳门\t260628\n天启坦克\t260629\n生鱼\t260630\nscl\t260631\n白质\t260632\nnote4\t260633\n影心\t260634\n邱永传\t260635\n佛门吧\t260636\nadios\t260637\n虚拟机\t260638\n我的女神\t260639\n电插座\t260640\nICCID\t260641\n区水务局\t260642\n爬升\t260643\n6.2亿\t260644\n540万\t260645\nlabel值\t260646\n柏悦湾\t260647\nCHE\t260648\nr2017a\t260649\n商家\t260650\n敬贤\t260651\n很相似\t260652\n绝世剑神\t260653\n保单\t260654\n馨梅\t260655\n荆门日报传媒集团\t260656\n天祝\t260657\n尖草坪\t260658\n近2个月\t260659\n街斗\t260660\n乐纯\t260661\n罗宾森\t260662\n艇长\t260663\n乳腺钼靶\t260664\n黄山职业技术学院\t260665\n心脏病\t260666\n言明\t260667\n送审版\t260668\ndeterminants\t260669\nAmbari\t260670\nslgkaifa\t260671\n厮打\t260672\n展园\t260673\n都是因为\t260674\n非物质\t260675\n黑岩\t260676\n80年\t260677\n银川一中\t260678\n乖离\t260679\n筑巢\t260680\n麻花辫\t260681\n大顶子山\t260682\nlunatic\t260683\n同桌100学习网\t260684\n821\t260685\n中标方\t260686\nes\t260687\n卫斯理支\t260688\n大庆油田\t260689\n尼日尼亚\t260690\n光福镇\t260691\n安徽工业大学工商学院\t260692\nyszou\t260693\n醉美\t260694\n新朗逸\t260695\n套餐式\t260696\n胶质母细胞瘤\t260697\n四角\t260698\n天岳\t260699\n带锯机\t260700\n路歌\t260701\n聚福\t260702\n钢筒\t260703\n真全\t260704\nsstp\t260705\n功名\t260706\n爱装网\t260707\n勃利县\t260708\n20180422\t260709\nbanco\t260710\ntant\t260711\nBei\t260712\n王者归来\t260713\n岛上书店\t260714\n人书\t260715\n7800x\t260716\n保利拉菲公馆\t260717\n麻匪\t260718\nUNDERTALE\t260719\n三个多小时\t260720\n银鲨\t260721\n二年级\t260722\n犹太人\t260723\n鲁健\t260724\n席筒\t260725\n风之谷\t260726\n孔雀镇\t260727\n谢觉哉\t260728\n双因素理论\t260729\n北大汇丰\t260730\n推流\t260731\nColours\t260732\n周转性\t260733\n中阳县\t260734\n阉割\t260735\n盈利模式\t260736\npuppeteer\t260737\n泡吧\t260738\n舞蹈史\t260739\noperated\t260740\n中国移动公司\t260741\n戊戌年\t260742\n橡树\t260743\n博物学家\t260744\n70多个\t260745\n得仕卡\t260746\n陈其苗\t260747\n佛慈\t260748\n演艺事业\t260749\n咨询室\t260750\n西安地铁三号线\t260751\n00000005\t260752\n学文\t260753\n桂春\t260754\n汉莎\t260755\n港产\t260756\n21cn.net\t260757\n二中\t260758\n穆先生\t260759\n细思恐极\t260760\n澹\t260761\n行运\t260762\nexfat格式\t260763\n手动换行符\t260764\n黄金股\t260765\n石桥铺\t260766\n中
铁港航局集团有限公司\t260767\n花仙谷\t260768\n闽籍\t260769\n陈为\t260770\nxxxl\t260771\n限流器\t260772\n恩斯特\t260773\n代位\t260774\n中国品牌网\t260775\n靖边县\t260776\n第10次\t260777\n天津市滨海新区人民政府\t260778\n暗黑破坏神3数据库\t260779\n医界\t260780\n标准套\t260781\n吴清忠\t260782\nLimit\t260783\n晨光\t260784\n第六节\t260785\n俏妃\t260786\n9:30\t260787\n哈丁\t260788\nTOK\t260789\n舒片\t260790\nMrvica\t260791\nVIVADO\t260792\n茉莉花\t260793\n延安东路\t260794\n单笔\t260795\n面\t260796\n时间服\t260797\n三剑客\t260798\n雨花外国语小学\t260799\n洞房花烛夜\t260800\n塞瓦斯托波尔\t260801\n凯风\t260802\n李真\t260803\n日代\t260804\n金娜英\t260805\n周渝内马尔\t260806\n第十场\t260807\nRoi\t260808\n花穴\t260809\n增速\t260810\n尹姝贻\t260811\n暗耀\t260812\nHKTV8\t260813\n她的歌\t260814\n明月千里寄相思\t260815\n哈提\t260816\n仙剑奇侠传吧\t260817\n专业实务\t260818\n动漫类\t260819\n展示架\t260820\n华为荣耀V9\t260821\n票据\t260822\n全国游泳冠军赛暨亚运会\t260823\n配额\t260824\n东洲岛\t260825\n黄页\t260826\n小农水\t260827\n10多家\t260828\n蚌埠市政府\t260829\n厨家\t260830\n大检\t260831\n情趣\t260832\n瓜葛\t260833\n赌术\t260834\n流浪儿\t260835\n黄叶\t260836\n袁磊\t260837\n作废\t260838\n徽记\t260839\n诉讼代理人\t260840\nexperiences\t260841\n少女们\t260842\n第18047期\t260843\nunassigned\t260844\n唔识\t260845\n镀锌钢丝网\t260846\n痛不欲生\t260847\n马来酸氯苯那敏片\t260848\n打印室\t260849\n杰瑞米·雷纳\t260850\nc220\t260851\nPlane\t260852\n肉女\t260853\nHoltek\t260854\nBrenda\t260855\n仙剑奇侠传四\t260856\n570x\t260857\n_税屋网\t260858\n渭南\t260859\nHauser\t260860\n15秋\t260861\n600868\t260862\n一伙人\t260863\n40多次\t260864\ncrisp\t260865\n拍打\t260866\n保护袋\t260867\n免盗\t260868\n沅\t260869\nmagnificent\t260870\n中青宝\t260871\n火凤燎原\t260872\n85266693\t260873\n枪炮世界吧\t260874\nlaoyawo\t260875\n前田\t260876\n西游释厄传super\t260877\n冰原\t260878\n用表\t260879\n三兄\t260880\nt7\t260881\n梁河\t260882\n裸乳\t260883\n改妆\t260884\n对乙酰氨基酚片\t260885\n上世纪70年代\t260886\n究極\t260887\n息神庙\t260888\n罗技K380\t260889\n/font\t260890\n字素\t260891\n济南地铁\t260892\n竞网\t260893\n贪婪洞窟\t260894\n莫怀戚\t260895\n11月28日\t260896\n大鹏镇\t260897\nPSYCHO-PASS\t260898\n特里尔\t260899\n跟焦器\t260900\n8盘位\t260901\n能谱\t260902\n肽类\t260903\n青岛电子学校\t260904\nGIF\t260905\nv6.2.0\t260906\n草酸\t260907\n刘惜君\t260908\nland\t260909\n1000千克\
t260910\n情人崖\t260911\n跳回\t260912\n珊瑚海\t260913\n4道\t260914\n中乙\t260915\n闪轨\t260916\n2.2千瓦\t260917\n郁亮\t260918\nTracer\t260919\n01章\t260920\n努比亚z18\t260921\n陈晓辉\t260922\n选通\t260923\nMIUMIU\t260924\n山东省公安厅交通管理局\t260925\n合晟\t260926\ncleanmymac3\t260927\nphenom\t260928\npick\t260929\nKurt\t260930\n裕兴\t260931\n老孔\t260932\n叶尔凡\t260933\n几内\t260934\nCERT\t260935\n临头\t260936\nZHUAN\t260937\nshall\t260938\nTOYS\t260939\n赵弘\t260940\n丽\t260941\n日式拉面\t260942\n贴剂\t260943\n常熟\t260944\n速锐论坛\t260945\n保户\t260946\nMNN\t260947\n全聚德烤鸭\t260948\n江西省省\t260949\n蚀纹\t260950\n眼下\t260951\n韩顺平\t260952\n忽悠\t260953\n厦门南洋职业学院\t260954\n布标\t260955\n中国第二重型机械集团公司\t260956\n宝源路\t260957\n氰戊菊酯\t260958\n斯托克城\t260959\nRising\t260960\n首入\t260961\n首夺\t260962\n坐看\t260963\n乐1\t260964\n36周年\t260965\n楚天金报\t260966\n斗战\t260967\n神秘石\t260968\n海氏\t260969\nRZ\t260970\n头炮\t260971\njunglescout\t260972\n社会劳动保障网\t260973\n中迪广场\t260974\n甘洒\t260975\n第36次\t260976\n海临风\t260977\nDebug\t260978\n联贷\t260979\n杨艺\t260980\n鸟人\t260981\n徐州四院\t260982\n中华人民共和国土地管理法\t260983\n哈佳\t260984\n我与地坛\t260985\n娩出\t260986\n孝敏\t260987\n博凡\t260988\n工人日报\t260989\n17部\t260990\nboy\t260991\n邮政储蓄手机银行\t260992\n鸡杂\t260993\n辩护\t260994\n砍价\t260995\n龙虎榜_数据中心\t260996\najaxReturn\t260997\n阳西县\t260998\n微企业\t260999\n七星彩开奖\t261000\ndeap\t261001\n胞磷胆碱钠片\t261002\nUCINET\t261003\n棋牌\t261004\n牛顿迭代法\t261005\n6136\t261006\n周芳\t261007\n疯长\t261008\n野葡萄\t261009\n半挂\t261010\n_波奇\t261011\n拉伸件\t261012\nDevops\t261013\n藏装\t261014\n考不好\t261015\n9.7英寸\t261016\n并列句\t261017\n14下\t261018\n扶桑花\t261019\n来没有\t261020\n应从\t261021\n大麦虫\t261022\n月山镇\t261023\n吟诗\t261024\n招股说明\t261025\n名医堂\t261026\n制式\t261027\ngeartrax\t261028\n卿\t261029\n竞选稿\t261030\n喀喇沁左翼蒙古族自治县\t261031\n现代简约风格\t261032\n雪纳瑞犬\t261033\n丢\t261034\nHide\t261035\n名姓\t261036\n四虎\t261037\n大画\t261038\n萃取\t261039\n存水弯\t261040\n热靴\t261041\n汉家江湖\t261042\n合肥北\t261043\n嗓子眼\t261044\n就业创业网\t261045\n群联\t261046\n昆明世纪城\t261047\n逸景翠园\t261048\nRie\t261049\n加索\t261050\nManson\t261051\n三庭\t261052\n盖瑞特\t261053\n嫡女网游之大盗贼\t261054\n上海恩捷\t261055\n湖南省通信管
理局\t261056\n蓝斑\t261057\n风镜\t261058\nNacional\t261059\n金兴\t261060\n农村土地承包经营权确权登记\t261061\n刘笑\t261062\n圣明\t261063\n茶水工\t261064\nTubular\t261065\n弓形虫病\t261066\n摩登世界\t261067\nENUM\t261068\n寻步\t261069\n湖南大学工商管理学院\t261070\n佐藤美纪\t261071\n炭河古城\t261072\njava\t261073\n全孝\t261074\n植株\t261075\n酸碱滴定法\t261076\nブ\t261077\n水坝\t261078\n苏州明基医院\t261079\n发展研究中心\t261080\n硕士学位\t261081\ncours\t261082\n横滨轮胎\t261083\n新标准大学英语NewStandard综合教程\t261084\n白度云\t261085\n瑞风S3论坛\t261086\n200CC\t261087\n五调\t261088\n干痛\t261089\n报到\t261090\nanb\t261091\n笑逐颜开\t261092\ndoc-在线文档投稿赚钱网\t261093\n龙潜花都\t261094\n露娜\t261095\nNCRE\t261096\n博客大巴\t261097\n庆安县\t261098\n杨梅节\t261099\n五分旅游网\t261100\n德州银行\t261101\n中蒙\t261102\n白鹿村\t261103\n德钦县\t261104\nPMM\t261105\n文峰街道\t261106\n狗爷\t261107\n药袋\t261108\n高爽\t261109\nblg\t261110\n生物质颗粒燃料\t261111\n荣耀平板\t261112\n百年来\t261113\n岩松\t261114\n物理网\t261115\n2015财年\t261116\n柠檬醛\t261117\n漫山遍野\t261118\n全彩屏\t261119\n富安娜家纺\t261120\n台阶式\t261121\n山场\t261122\n谍报\t261123\n阿沁\t261124\n一对\t261125\n晴川阁\t261126\nSprint\t261127\n德州晚报\t261128\n军国\t261129\nimprovement\t261130\n200h\t261131\n国航银卡\t261132\n心灵之窗\t261133\nWIFI模块\t261134\n中脉\t261135\n神武3手游天策\t261136\n地桩\t261137\n麟龙\t261138\n麦豆网\t261139\nCEO\t261140\n北京市人民政府外事办公室\t261141\n东南部\t261142\n洛阳国家高新技术产业开发区\t261143\n三灶\t261144\n石桥村\t261145\n最高人民法院办公厅\t261146\ncare+\t261147\n宿雾\t261148\n线器\t261149\nlobal\t261150\n毛坦厂中学\t261151\n瓦良格号\t261152\n为官\t261153\n诸事\t261154\n球王\t261155\n第2018\t261156\n退火炉\t261157\n众生\t261158\n外汇牌\t261159\n黄山路\t261160\n70多万\t261161\n漯河市人民政府\t261162\n手扶梯\t261163\n素士\t261164\n髂\t261165\nMarkit\t261166\n受难者\t261167\n铃木麻奈美\t261168\nsynthesia\t261169\n4旬\t261170\n迎头赶上\t261171\n明末\t261172\n梅甘娜\t261173\n野山鹰\t261174\n吴哥航空\t261175\n闪字\t261176\n赵娟\t261177\n替换装\t261178\n张吉福\t261179\n郭飞\t261180\nscrollView\t261181\n6000多个\t261182\niOS8.4.1\t261183\n二喜\t261184\n负离子粉\t261185\n育婴师证\t261186\nMDR-100ABN\t261187\n9月中旬\t261188\n秒速5厘米\t261189\n硫酸软骨素\t261190\n怕人\t261191\n李昇基\t261192\n厦门市口腔医院\t261193\n宿豫区\t261194\n铜山\t261195\n百万平米\t261196\n8小时\t2611
97\n李占卫\t261198\n下堂妻\t261199\ninstagram\t261200\n龙爷\t261201\n1561\t261202\n顾准\t261203\n所剩无几\t261204\n蓝金\t261205\ngrapes\t261206\n280TSI\t261207\nEXCELL\t261208\n岸芷汀兰\t261209\njava版\t261210\n120目\t261211\n连环套\t261212\n氟康唑\t261213\n浙江万里学院\t261214\n摩易\t261215\n中华文化促进会\t261216\n余世存\t261217\nRCS\t261218\n刻绘\t261219\n平凉市\t261220\n帐篷\t261221\n40多万\t261222\n看来\t261223\n棋社\t261224\n道林格雷\t261225\naca\t261226\n尖尖\t261227\n核工业\t261228\nxinge\t261229\n千年大计\t261230\n弦杆\t261231\n闪存芯片\t261232\n交易行\t261233\n黄晓军\t261234\n手动剃须刀\t261235\n北京北城甲状腺医院\t261236\n家庭财产保险\t261237\n五园\t261238\n出入库单\t261239\n2187\t261240\n外国籍\t261241\n萍踪侠影\t261242\n水下灌注桩\t261243\n高官\t261244\nPSNR\t261245\n标准板\t261246\n佳博Gainscha\t261247\nBWT\t261248\n华熙\t261249\ndialogue\t261250\n小山竹\t261251\n侧缝\t261252\n施工检验批\t261253\nemploy\t261254\n南阳市区\t261255\n打浆机\t261256\n通透感\t261257\n保健操\t261258\ntuan\t261259\n王学军\t261260\n爱润妍\t261261\n变卖\t261262\n背带裤\t261263\n10.8\t261264\nled发光字\t261265\n储层\t261266\n有福气\t261267\ncata\t261268\n段跃\t261269\n鸬鹚\t261270\n无变化\t261271\n158.com\t261272\n单日\t261273\n简州新城\t261274\n无线麦\t261275\n空能\t261276\n603799\t261277\n苦肉计\t261278\n疏肝解郁\t261279\nL351\t261280\n玉腿\t261281\nWinners\t261282\niwall\t261283\n李明霖\t261284\n斑鱼\t261285\n箍\t261286\n伤敌\t261287\n信封包\t261288\n伙呆\t261289\n女群\t261290\n柏林国际电影节\t261291\n警报\t261292\n突出重围\t261293\n斜管\t261294\n吴佩孚\t261295\nyoshi\t261296\n汽轮发电机\t261297\n时\t261298\n沙市\t261299\n锵\t261300\n华为小米\t261301\n轻听\t261302\n普内科\t261303\n浸锡\t261304\n燕守平\t261305\n大众DV\t261306\n张家界荷花机场\t261307\nmicmiu\t261308\nT波\t261309\n季候风\t261310\n五柳先生传\t261311\n特务\t261312\n网安\t261313\nedu.cn域名解析\t261314\n当年情\t261315\n育儿吧\t261316\n股骨骨折\t261317\n李梦婷\t261318\n海捕\t261319\n造时\t261320\n西渡镇\t261321\nwin7/8\t261322\nAJ3\t261323\n宝灯\t261324\n戦乙女ヴァルキリ\t261325\n百万级\t261326\n算命先生网\t261327\nliner\t261328\n上两年\t261329\n冒险岛2吧_\t261330\n慧琳\t261331\nCou\t261332\nTERA\t261333\n磁选机\t261334\n中非共和国\t261335\n邶风\t261336\n一洲\t261337\nmiRNA\t261338\n嫑\t261339\n生活大爆炸第十一季\t261340\n封闭母线槽\t261341\n玛娜传奇\t261342\nSTCN\t2613
43\n煤气灶\t261344\n为数不多\t261345\n露体\t261346\n抚松\t261347\n匹兹堡\t261348\nDOES\t261349\nEC\t261350\n狗妹\t261351\n有限责\t261352\n一单元格\t261353\n电脑棒\t261354\n前海股权交易中心\t261355\nrelational\t261356\n五贝子\t261357\nvof\t261358\n中国科学院生态环境研究中心\t261359\n青岛六中\t261360\nJNDI\t261361\n两江国际机场\t261362\n富居\t261363\n清河门区\t261364\n藏身处\t261365\n原子弹\t261366\nAVG\t261367\n李猛\t261368\n刘小明\t261369\n直客式\t261370\nDalian\t261371\n下拉复选框\t261372\nMounty\t261373\nnetstat\t261374\n文具盒\t261375\n锦书\t261376\n虚拟视频播放器\t261377\n不复\t261378\n错那县\t261379\n入人心\t261380\n台州人力网\t261381\n灰灰\t261382\ninside\t261383\n角鲨烷\t261384\n宿迁中学\t261385\n霍尔果斯\t261386\n1000kva\t261387\nGTX1080\t261388\n征兵网\t261389\n耍酷\t261390\n新希望乳业\t261391\n老记\t261392\n首都医科大学附属北京世纪坛医院\t261393\nHedge\t261394\n杀母\t261395\nП\t261396\nZblog\t261397\n下颚\t261398\n极海听雷\t261399\nutf8\t261400\n威华股份\t261401\n宋玖槿\t261402\nPrepaid\t261403\nBandizip\t261404\n用家\t261405\n90.9\t261406\n众\t261407\n罗宾\t261408\n蜗杆减速机\t261409\n特别版\t261410\n肉垫\t261411\nSqlConnection\t261412\nOTB\t261413\n光纤路由器\t261414\n中国华电集团\t261415\n分数指数幂\t261416\nTr\t261417\n2.80\t261418\n喇\t261419\n完善版\t261420\n一方城\t261421\nconsists\t261422\ninfer\t261423\n农大\t261424\nwuyi\t261425\n提出\t261426\nkux格式\t261427\nis903\t261428\n巴黎圣母院\t261429\ngpf\t261430\n花玲\t261431\nharness\t261432\ninterpersonal\t261433\n杂录\t261434\n本特勒\t261435\n01回\t261436\n混沌之刃\t261437\n余氏\t261438\n动火\t261439\n寂月神社\t261440\n油滴\t261441\n心猿\t261442\nbuster\t261443\n讲堂\t261444\n11整除\t261445\n维业股份\t261446\nsaier\t261447\n王厂长\t261448\n平胸\t261449\n百分之12\t261450\noutlook2010\t261451\n531号\t261452\n舔舔\t261453\n黄风\t261454\n积弊\t261455\n150级\t261456\nWangXiao\t261457\n顺昌路\t261458\nrlogin\t261459\n13.4\t261460\nblackberry\t261461\n海军轮\t261462\n夕瑶\t261463\n小凯\t261464\n姚易\t261465\nbc95\t261466\n王天林\t261467\nUGP\t261468\n上栗\t261469\n承当\t261470\n站不住\t261471\n生育服务证\t261472\n鄞州人民医院\t261473\n强心针\t261474\nRuthless\t261475\n雅尔塔会议\t261476\n第236集\t261477\n昏婚欲睡\t261478\n5英寸\t261479\n中超控股\t261480\n纸棒\t261481\n杨松\t261482\n第151集\t261483\n汗蒸\t261484\n广州红盾信息网\t2
61485\n破茧而出\t261486\n渐行渐近\t261487\n智敏\t261488\n俏皮\t261489\n刘增瞳\t261490\n全桥\t261491\nextended\t261492\n武夷岩茶\t261493\nRola\t261494\n弘泰\t261495\n李钧\t261496\n刘大可\t261497\n来试\t261498\n词文\t261499\n65亿\t261500\n文一集团\t261501\n人才园\t261502\n鲁迅艺术学院\t261503\n洗一洗\t261504\ngc\t261505\n日出\t261506\n琴谱\t261507\n贾宝珉\t261508\n80070570\t261509\n吉利博瑞论坛\t261510\n3095\t261511\n健力\t261512\n我是中国人\t261513\n内存槽\t261514\n三国罗曼史\t261515\nArrival\t261516\n一截\t261517\n中国佛教协会\t261518\n拦\t261519\n第11回\t261520\n1.2万吨\t261521\nCode\t261522\n浙江小百花越剧团\t261523\n1782\t261524\n有机棉\t261525\n牌包\t261526\n堆积\t261527\n谢渊\t261528\n成绩\t261529\nIntrusion\t261530\n2017年3月份\t261531\n百利\t261532\nParkinson\t261533\n小儿氨酚\t261534\nknows\t261535\n弋江镇\t261536\n童佳倩\t261537\nKawai\t261538\n冷白\t261539\n海鲈\t261540\n汛情\t261541\ncargo\t261542\n警视厅\t261543\n鸽友\t261544\n孟鹤堂\t261545\n李玉明\t261546\n干支\t261547\nPS7.0\t261548\n吴桐\t261549\n转运竹\t261550\n工程级\t261551\nWINGS\t261552\n蓝装\t261553\n多遍\t261554\nForensic\t261555\n农工党中央\t261556\n百叶轮\t261557\nTUNE\t261558\n燃烧机\t261559\n海霸王\t261560\n马晓光\t261561\n磨头\t261562\nContent\t261563\n保险经纪公司\t261564\n一岗\t261565\n蜜皂\t261566\n黄口\t261567\nCMD\t261568\n华尔道夫\t261569\n辽宁省住房和城乡建设厅\t261570\n83052939\t261571\n京东白条还款\t261572\n可循\t261573\n祖马龙\t261574\n妙文\t261575\n钢波纹管\t261576\nボイン\t261577\n双龙峡\t261578\naguilera\t261579\n岗通\t261580\n雷阿伦\t261581\n请点赞\t261582\nItemDecoration\t261583\nxcode模拟器\t261584\n无为县\t261585\n上海百联集团股份有限公司\t261586\n见得\t261587\nHV\t261588\nSwitches\t261589\n油炸锅\t261590\n猫咖\t261591\n收纳包\t261592\n推薦\t261593\n跳车\t261594\n甲醛中毒\t261595\nORA-01017\t261596\n尾鱼\t261597\n拨浪鼓\t261598\n艾维奇\t261599\n九头案\t261600\nEasyTable\t261601\n平安付\t261602\n群论\t261603\n宁波地铁1号线\t261604\n怡康\t261605\nchm版\t261606\n中华励志网\t261607\n好色之徒\t261608\n20186\t261609\n乳木果油\t261610\n孙岳\t261611\n艾尔文\t261612\n虎狼\t261613\n朱媛媛\t261614\n快速路\t261615\n第八批\t261616\n休渔\t261617\n直通区市\t261618\n北京爱亚卡普\t261619\n潍坊北站\t261620\nUBI\t261621\n品牌塑造\t261622\n自负\t261623\n蚌埠经济开发区\t261624\nLGBT\t261625\n3939\t261626\n晓琳\t261627\n应纳税所得额_\t261628\n一发
\t261629\n合肥万达城\t261630\nv1.0.0安卓\t261631\n冯\t261632\n水泡\t261633\nUSB3.0接口\t261634\n处于\t261635\n4.38\t261636\n万科理想城\t261637\nzeng\t261638\n安全岛\t261639\n分期通\t261640\n大辽网\t261641\njupter\t261642\n友家\t261643\nCamila\t261644\n长叶\t261645\n致敬\t261646\n横溢\t261647\n烫饭\t261648\nCloth\t261649\n高思学校\t261650\n23p\t261651\n焦作师范高等专科学校\t261652\n安徽中医学院\t261653\n国美美\t261654\n斗战神\t261655\n沉底\t261656\n中国轻工业联合会\t261657\n万场\t261658\n卵\t261659\nhash算法\t261660\nProbability\t261661\nov5640\t261662\n吞吐率\t261663\n工作指南\t261664\n管路\t261665\n影音先锋av资源站\t261666\n夏佐\t261667\n泼水\t261668\n空格键\t261669\n网贷天眼\t261670\n卡成翔\t261671\n螺旋\t261672\n线点\t261673\n上班\t261674\n桓温\t261675\n姚集镇\t261676\n俯卧撑\t261677\n变换器\t261678\n7.19\t261679\n永安道\t261680\n亲属\t261681\n张漾\t261682\ngzs\t261683\n轻尘\t261684\n现金池\t261685\n劳务\t261686\nROCKET\t261687\n大满贯\t261688\n多功能键\t261689\n广西自治区党委\t261690\n交工\t261691\n修学\t261692\n汉化补丁\t261693\n黔阳古城\t261694\ns686\t261695\n列距\t261696\n进取号\t261697\netc/fstab\t261698\n金逸\t261699\n乙酰甲胺磷\t261700\n30FPS\t261701\n第十五批\t261702\n盒内\t261703\nnul\t261704\n熊本\t261705\n他信\t261706\n1579\t261707\nQ235钢\t261708\nngModel\t261709\n0434\t261710\n69点\t261711\n方兴\t261712\n铁壁\t261713\n混装\t261714\nshanchu\t261715\n灵数\t261716\n忧郁\t261717\n惠州市惠阳区人民政府\t261718\n主位\t261719\n东阿镇\t261720\n健身中心\t261721\n30万个\t261722\n砍脚\t261723\nvagaa\t261724\n20170101\t261725\n替代品\t261726\n樟\t261727\n7局\t261728\n赵建国\t261729\n虚警\t261730\n百度云播放器\t261731\n水相\t261732\n牛尾汤\t261733\n三国梦想漫画全集\t261734\n同形\t261735\n女生日记\t261736\n更贴近\t261737\n昆仑万维\t261738\n平整度仪\t261739\n关联号\t261740\n枕边人\t261741\n广西体育中心\t261742\nmacair\t261743\n天河传说\t261744\n电子签章\t261745\n四分米\t261746\n培训期\t261747\n一个二\t261748\n金尊玉\t261749\n孝昌\t261750\nfs2you\t261751\n假钱\t261752\nnginx+php\t261753\n国家工商行政管理总局商标局\t261754\nArcade\t261755\n中国交通建设集团有限公司\t261756\n耶稣基督\t261757\nMounted\t261758\n移动通信网\t261759\nciel\t261760\n蜡画\t261761\n昱\t261762\n46\t261763\nNAS\t261764\n凳\t261765\n化学物\t261766\nmagenta\t261767\n贵阳市公安局\t261768\n十五世\t261769\nInflation\t261770\n华新路\t261771\n11AC\t261772
\n远梦\t261773\n金陵塔\t261774\nURSA\t261775\n欅坂\t261776\n罗百吉\t261777\n单道\t261778\nunsatis\t261779\n服不服\t261780\n中国科学院研究生院\t261781\n爸爸妈妈们\t261782\n喇叭沟门\t261783\nRd\t261784\nKI\t261785\nbbu\t261786\njiajie\t261787\n众安\t261788\n进宫\t261789\nconjugated\t261790\n天翼之链\t261791\n烟台山\t261792\nnationality\t261793\n538porn\t261794\nReact-Router\t261795\n哥斯拉:怪兽行星\t261796\nv认证\t261797\n无烟煤滤料\t261798\n重庆便民网\t261799\n25004\t261800\n中国绿化基金会\t261801\n神庙逃亡2\t261802\n谷居\t261803\n孤岛危机2\t261804\n周志刚\t261805\noctavia\t261806\nnett\t261807\n八达岭高速\t261808\n巅峰阁\t261809\n聚屏\t261810\n纳尔\t261811\nATB\t261812\nAme\t261813\n湿录\t261814\n武警支队\t261815\n7话\t261816\n曹真\t261817\n手绘本\t261818\n项目管理建筑工程管理建筑工程网\t261819\nsasuke\t261820\n费伦\t261821\n棒球大联盟2nd\t261822\n鼓机\t261823\n妆效\t261824\n雅思4\t261825\nmix2S\t261826\n送交\t261827\n立体主义\t261828\n金桔树\t261829\n南极大冒险\t261830\n节选自\t261831\n夏商周\t261832\n白云洞\t261833\n青柠檬\t261834\nXX小学\t261835\n三八红旗\t261836\nT130\t261837\n南开大学生命科学学院\t261838\n法企\t261839\n活到老学到老\t261840\nevil\t261841\n回主\t261842\n辽人社\t261843\nParam\t261844\n演讲稿\t261845\n脱手\t261846\n191\t261847\n人魔\t261848\nkext\t261849\n短肢\t261850\n北冥有鱼\t261851\nsonata\t261852\n园艺博览会\t261853\n新民路\t261854\n陈孝平\t261855\n不遇\t261856\n机械在线\t261857\n丁学良\t261858\n中国邮政储蓄银行手机银行\t261859\n豪\t261860\n曲面屏\t261861\n脚面\t261862\n淄博市人民政府\t261863\n第十弹\t261864\n克里特\t261865\n牢牢\t261866\n深圳高速公路股份有限公司\t261867\n古董局中局\t261868\n东方电视台\t261869\n屋根\t261870\n猫盘\t261871\n抗冲\t261872\n安徽江淮汽车股份有限公司\t261873\n宇环数控\t261874\n工贸版\t261875\n第04集\t261876\n侯马市\t261877\n力保健\t261878\n2018年4月17日\t261879\n三国群英传7原味强化版\t261880\n刘若英\t261881\n张文新\t261882\n第三家\t261883\nmigrate\t261884\n231\t261885\nbiginteger\t261886\n00051\t261887\nMassachusetts\t261888\n申亮亮\t261889\n补缺\t261890\n河北科技师范学院\t261891\n没长\t261892\n阿德福韦酯片\t261893\n四旧\t261894\n草榴社區\t261895\n表里\t261896\n冷饮\t261897\nchronicle\t261898\n两岸\t261899\n官山\t261900\n定位册\t261901\n新媒体概论\t261902\n闲林\t261903\n羽绒衣\t261904\n花鲢\t261905\n太空城\t261906\n刀哥\t261907\n创合\t261908\n好妞\t261909\n跌水\t261910\n太行山上\t261911\n向前向后\t261912\neu4\t
261913\n17FW\t261914\n二万\t261915\n中华会计网校\t261916\nOffice2003/2007/2010卸载工具\t261917\n修善\t261918\nfileno\t261919\n斗罗大陆3\t261920\n时装表\t261921\ngoodreader\t261922\npo\t261923\n火旋\t261924\n128岁\t261925\nCompounds\t261926\n顾中一\t261927\n老调重弹\t261928\n东阳光\t261929\n大王椰\t261930\nド\t261931\n迁安\t261932\n雷诺兹\t261933\n女贞\t261934\n1585\t261935\n后台\t261936\n中华人民共和国计量法实施细则\t261937\n安达市\t261938\nPark\t261939\n一起来看\t261940\nCol\t261941\nDuct\t261942\n2kol2\t261943\ntitile\t261944\n彩吧助手\t261945\n云在飞\t261946\n萌夫\t261947\n比特梵德\t261948\n亿视丽眼镜网\t261949\nprod\t261950\n论法\t261951\n热血街舞团\t261952\n隐贤\t261953\nSunDay2065\t261954\n狗庄\t261955\n力值\t261956\nFoxmail7.2\t261957\n石油展\t261958\nadb_\t261959\n22周岁\t261960\n_巴士梦三国\t261961\ntexstudio\t261962\n世贸天阶店\t261963\nsynaptics定点装置\t261964\n蒋孝严\t261965\npt站\t261966\n宽头\t261967\n解救吾先生\t261968\nEECS\t261969\n08:00:00\t261970\n汇智\t261971\n本田汽车\t261972\najax传\t261973\n修颜\t261974\n纸盘\t261975\n抢标\t261976\n重金求子\t261977\n說網\t261978\n宣告\t261979\n佳域\t261980\n美食家\t261981\n山水田园\t261982\n政法类\t261983\n岩头\t261984\n唐山路\t261985\n激光整平机\t261986\n钙果\t261987\n录入\t261988\n滇红茶\t261989\n猫九酱sakura\t261990\n号码簿\t261991\nService层\t261992\n杭州教育考试网\t261993\nSEV\t261994\n资金流\t261995\nsign\t261996\n新版电子税务局\t261997\n静海\t261998\n菩提树\t261999\n迅雷阳台\t262000\n拌\t262001\n卫诗雅\t262002\n疤\t262003\nMFE\t262004\n中国击剑协会\t262005\n半正定矩阵\t262006\n曼陀罗花\t262007\n边腿\t262008\n常州淹城春秋乐园\t262009\n奥克斯钟山府\t262010\n中央子午线\t262011\n篮球队\t262012\n风荷载体型系数\t262013\n离婚女\t262014\ndmzj\t262015\n原生液\t262016\n魂心\t262017\n期货公司\t262018\n氧酸\t262019\n因公\t262020\n金门战役\t262021\n马萨诸塞\t262022\n小模\t262023\n拉毛机\t262024\n百度新闻搜索\t262025\n急盼\t262026\n精舍\t262027\n食药监总局\t262028\n胡德平\t262029\nAds\t262030\n李家俊\t262031\nIch\t262032\n仿宋gb2312\t262033\n第三大街\t262034\n三大战役\t262035\n印染厂\t262036\n维族舞\t262037\n华油\t262038\nRVV\t262039\nSURLError\t262040\nsare\t262041\ncensorship\t262042\n地狱男爵\t262043\n辞职\t262044\n烧鹅\t262045\n煮饭\t262046\n准心\t262047\n嘉兴酒店\t262048\n肺火\t262049\n20171001\t262050\n宿松先锋网\t262051\nDWORD\t262052\n上海企聚网\t262053\n咕咚运动\t
262054\n震惊\t262055\n缩写语\t262056\n阎王妻\t262057\n调频\t262058\njavaFX\t262059\n江阴市人力资源和社会保障局\t262060\n人生何处不相逢\t262061\n叶英\t262062\n时期\t262063\nR7800\t262064\n关正杰\t262065\n冠状沟\t262066\n摇头机\t262067\n40_\t262068\n10千米\t262069\n模板堂\t262070\n潮\t262071\n申银万国期货\t262072\nsURL\t262073\n固定架\t262074\nmonodevelop\t262075\n随之\t262076\n套装\t262077\n火柴棍\t262078\n吴庆龙\t262079\n湖北日报_多媒体报\t262080\n水韵\t262081\n爱不离\t262082\nxprinter\t262083\n代测\t262084\n临江大道\t262085\n魔物\t262086\n雅牛\t262087\n邻里节\t262088\n上海教育电视台\t262089\n渔具展\t262090\nzug\t262091\n乐视直播网\t262092\n落叶归根\t262093\nfab\t262094\n卡罗拉\t262095\n一举一动\t262096\n修改单\t262097\n贵州省文化厅\t262098\n软化\t262099\n11件\t262100\n帅老\t262101\n皮肤\t262102\n大学英语1\t262103\n野兔\t262104\nA柱\t262105\n涵洞\t262106\n手游服\t262107\n基隆港\t262108\nLawson\t262109\n上海注会\t262110\n王老虎\t262111\nLecturer\t262112\n身负\t262113\nimpor\t262114\n健身\t262115\nfatrat\t262116\n树枝状\t262117\n伦敦证券交易所\t262118\n冥刑\t262119\n百朗\t262120\n龙网\t262121\n贵阳市妇幼保健院\t262122\n天台山\t262123\nactions\t262124\n湖北省民族宗教事务委员会\t262125\n华龙苑\t262126\n杨浦大桥\t262127\n白沙糖\t262128\n梅雨\t262129\n退耦\t262130\n陈赓\t262131\n模糊性\t262132\n2005年度\t262133\n纸格\t262134\nthemselves\t262135\n高新区党工委\t262136\n婚庆服\t262137\n不要死\t262138\nT+0\t262139\n王生\t262140\nconsignee\t262141\n小一初一\t262142\n青玉\t262143\ncck8\t262144\n万荣\t262145\n电油\t262146\n最美的太阳\t262147\n庚申日\t262148\n宝盈基金\t262149\n低功率版\t262150\n生堂\t262151\n梦幻模拟战2\t262152\n二分米\t262153\nsensual\t262154\n奋起\t262155\n进气歧管\t262156\n歪头\t262157\ncoronary\t262158\n苒\t262159\n川沙镇\t262160\nqoo\t262161\nxio\t262162\n眼框\t262163\n降降温\t262164\n第16位\t262165\n塔科夫\t262166\n120部\t262167\n幸平\t262168\n人众\t262169\n姑姐\t262170\ntcl\t262171\n奥维互动\t262172\ntalkingdata\t262173\n狗龙\t262174\n70平米\t262175\n闪电侠第三季\t262176\n灵体\t262177\n怠速马达\t262178\n读盘\t262179\ndumpsys\t262180\n供应链金融\t262181\n国资委\t262182\n驷马难追\t262183\ndataZoom\t262184\nZnO\t262185\n3.1%\t262186\n氨氯地平\t262187\n双兔\t262188\n脾虚\t262189\n京瓷ECOSYS\t262190\n王四营\t262191\n王利芬\t262192\n矛盾性\t262193\n管庄乡\t262194\nkeynot\t262195\n品销宝\t262196\n手术\t262197\n崔凯\t26219
8\nMOF\t262199\nfoil\t262200\ndy=0\t262201\n意彩\t262202\n朱镕基\t262203\n南京不孕不育医院\t262204\n压箱\t262205\n经济法学\t262206\nGiulietta\t262207\nacetyl\t262208\n猛女\t262209\n医疗类\t262210\nS7-300\t262211\n广州山水时尚酒店\t262212\n台盟中央\t262213\n搏动\t262214\n乐视视频\t262215\n阿祖\t262216\n王美丽\t262217\n警局\t262218\nAnritsu\t262219\n盐桥\t262220\n热轧板\t262221\n关押\t262222\n半箱\t262223\nv1.3.1\t262224\n乱污\t262225\n全彩色\t262226\n合肥新桥国际机场\t262227\nBtbook\t262228\n大祭桩\t262229\n1920x1200\t262230\n絮状\t262231\n开子\t262232\n魄\t262233\n傲笑\t262234\nPPT模板\t262235\n杭州市环境保护局\t262236\n堆高车\t262237\n细纱机\t262238\n京式\t262239\n赤羽业\t262240\nSerializable\t262241\n美体\t262242\n屏幕失灵\t262243\n延误险\t262244\n龙胜\t262245\n官办\t262246\n酿酒师\t262247\nios8\t262248\n深圳工业设计公司\t262249\n市法制办\t262250\ntranslator\t262251\n峰云\t262252\n装船\t262253\nu宝\t262254\n月湾\t262255\n刘婷婷\t262256\n比重\t262257\n继任\t262258\nsiRNA\t262259\n愤慨\t262260\n长形\t262261\ngettime\t262262\n波音757\t262263\nandon\t262264\n黑案\t262265\n塞万提斯\t262266\n苏荷酒吧\t262267\nospp\t262268\n碗仔翅\t262269\n新闻出版署\t262270\nzhishu\t262271\n青砖\t262272\n山\t262273\n35cm\t262274\n刘一鸣\t262275\n星儿\t262276\ncousin\t262277\n3857\t262278\n下腹\t262279\n四十三年\t262280\n政治协商制度\t262281\nAUTOCAD2010\t262282\n晚归\t262283\nAirPods\t262284\n中关村示范区\t262285\n凌睿\t262286\n高达VS\t262287\n松针茶\t262288\nclamping\t262289\n四顾\t262290\n华丰路\t262291\n金属结构件\t262292\nscientific\t262293\nvintage\t262294\n八字论\t262295\n依波表\t262296\n4-5月份\t262297\nacadres\t262298\n犁铧\t262299\n脱模剂\t262300\nietester\t262301\n王丽君\t262302\n井恋\t262303\n关菊英\t262304\ncnd\t262305\nSheridan\t262306\n雅照\t262307\nRPGmakerMV\t262308\n景岳全书\t262309\n迫降\t262310\ng530\t262311\n装料\t262312\n光谱网\t262313\nvv\t262314\n溶菌酶\t262315\n省略\t262316\n觉知\t262317\n统一增值税小规模纳税人\t262318\n诉你\t262319\n信旺华府骏苑\t262320\n部关\t262321\n五格分析\t262322\nPos\t262323\n边江\t262324\n雅布力\t262325\n3G小说网\t262326\npto\t262327\n龙须糖\t262328\nelective\t262329\n周例会\t262330\n声明书\t262331\n形状图\t262332\n红警3起义时刻\t262333\n骗抱\t262334\nPDM\t262335\nproliferation\t262336\nproduce48\t262337\n2百万\t262338\n审计局\t262339\n孙岩\t2623
40\nAshley\t262341\ncs吧\t262342\n凫\t262343\n中相\t262344\n拉市海\t262345\n123万\t262346\nx型腿\t262347\n明厅\t262348\n伊娃\t262349\n滴虫\t262350\n精选题\t262351\n国信集团\t262352\n喷发\t262353\n奔驰c300\t262354\n木槿\t262355\n陈家祠\t262356\n早唐\t262357\n退烧药\t262358\n特普朗\t262359\n尾羽\t262360\n张朝阳\t262361\n615\t262362\n自贴\t262363\n残币\t262364\n未来三天\t262365\n阿sue\t262366\nCologne\t262367\n100周年\t262368\n广州石化\t262369\nswap函数\t262370\n乐斗\t262371\n2018年5月15日\t262372\nEVD\t262373\nNUKE\t262374\n挺拔\t262375\n寸草\t262376\n看不见了\t262377\nGram\t262378\n年检站\t262379\n冰莲\t262380\nBigBao\t262381\n山东新华\t262382\n色达\t262383\n静态区\t262384\n内蒙古自治区人社厅\t262385\n永不停止\t262386\nmigrated\t262387\n苹果库乐队\t262388\n防弹布\t262389\nwww52avavcom\t262390\nGSX250R\t262391\n5G网络\t262392\n马集\t262393\nMeshLab\t262394\n志津香\t262395\n大赛人\t262396\n20161129\t262397\nhtml文本框\t262398\n周庄\t262399\n滨崎里绪\t262400\nT90\t262401\n压惊\t262402\nbanna\t262403\n中国人民大学附属中学\t262404\n权案\t262405\n解说词\t262406\n17.10.1\t262407\n询价\t262408\n波兰共和国\t262409\nxiepeixing\t262410\n忤\t262411\n重组人干扰素a2b\t262412\nwirshark\t262413\n代餐\t262414\n徐家汇路\t262415\n李克农\t262416\n鲤鱼网\t262417\n黑无常\t262418\n天龙山\t262419\n中速磨煤机\t262420\n徐德亮\t262421\na57t\t262422\n现在式\t262423\n吊索具\t262424\n硬笔书法字帖\t262425\n产业技术研究院\t262426\n锚段\t262427\nxposed\t262428\n二环北路\t262429\n铆螺母\t262430\n铁案\t262431\nINTER\t262432\nMaldives\t262433\n221g\t262434\nssdp\t262435\nmp4转换器\t262436\n岩屑\t262437\natm机\t262438\n3股\t262439\nSenra\t262440\n吉林人民出版社\t262441\nFreeSaber\t262442\n41家\t262443\n颈围\t262444\n毛石\t262445\n余姚火车站\t262446\n傅超\t262447\nspreads\t262448\n两百多年\t262449\n陈平安\t262450\n丽兹\t262451\nACG网\t262452\n军理\t262453\nmipsel\t262454\n吓倒\t262455\n青岛中能\t262456\n绿枝\t262457\nTARA\t262458\n工具式\t262459\n健健康康\t262460\n中国航空\t262461\n洗地\t262462\n百度API\t262463\nTI6\t262464\n睁开眼\t262465\n大沢\t262466\n支模\t262467\n密歇根湖\t262468\n重文轻武\t262469\n14isk\t262470\n大会\t262471\nlubricant\t262472\nCSRF\t262473\n相好\t262474\n糖霜\t262475\n凶险\t262476\n沣标\t262477\nBangumi\t262478\n隶属于\t262479\nWhitman\t262480\nCP感\t262481\n压装\t262482\n盲孔\t262483
\n鄂尔多斯市人民政府\t262484\n逃学威龙2\t262485\n范·迪塞尔\t262486\n148\t262487\n必途网\t262488\n10季\t262489\n完美作业网\t262490\n虫巢\t262491\n自通\t262492\nnumeral\t262493\n猪药\t262494\n印堂\t262495\n四百多\t262496\n证券营业部\t262497\npact\t262498\n明城\t262499\n家教机\t262500\nGN手游网\t262501\n昌盛中华\t262502\n泯灭\t262503\n联想小新700\t262504\nseries\t262505\n南宁高新区\t262506\n长弓\t262507\n少版\t262508\n62万\t262509\n上海市知识产权局\t262510\n十亿美元\t262511\n我的新娘\t262512\n第14轮\t262513\n60号\t262514\n汉济渭\t262515\n500A\t262516\naida64\t262517\n觅元素51yuansu.com\t262518\n2018.04.05\t262519\n八骏\t262520\n河南县\t262521\n5天4夜\t262522\nbases\t262523\n厦门市市政园林局\t262524\n中国纸业网\t262525\n广数\t262526\n腹压\t262527\nLWIP\t262528\n丧尸屠城\t262529\n毛豆花生\t262530\n碧海潮生曲\t262531\n柏拉图式\t262532\n急救\t262533\n阴帝\t262534\n噩\t262535\n3126\t262536\n不口\t262537\n景洪\t262538\n汇编指令\t262539\n具名\t262540\n双强\t262541\nvolte\t262542\ncharset\t262543\n每个人都会\t262544\nBerufs\t262545\n康美之恋\t262546\n果绿网\t262547\n三寸\t262548\n盘查\t262549\n汉克\t262550\n刘海\t262551\n魂断\t262552\n攻略\t262553\n11.6寸\t262554\n缆绳\t262555\n虞祎杰\t262556\nd10\t262557\n注氧仪\t262558\n鹏华基金管理有限公司\t262559\n南昌一中\t262560\n苏锡\t262561\nattractive\t262562\n博宝\t262563\n报杯\t262564\nLaravel\t262565\n分子数\t262566\n剑灵革命\t262567\n法喜寺\t262568\n懂不懂\t262569\n320号\t262570\n哲言\t262571\n标称\t262572\nnerf\t262573\nkaplan\t262574\npr社\t262575\nカレンダ\t262576\n酒祖\t262577\nSitemap\t262578\n污名\t262579\n无土栽培\t262580\n降维\t262581\n直埋式\t262582\n冲剂\t262583\n篮\t262584\n显微外科\t262585\n江西师范\t262586\n1800公里\t262587\n20180321\t262588\n泰伦斯\t262589\n王辰\t262590\nFEA\t262591\nHoudini\t262592\nhuiyue\t262593\n73分\t262594\n电子测量\t262595\n成辉\t262596\n那个女孩\t262597\n武汉经济技术开发区\t262598\nPetra\t262599\n拉斯科\t262600\n番禺区\t262601\n卵黄囊\t262602\n王菲幻乐\t262603\ndefeat\t262604\n黄星羱\t262605\n武汉大学\t262606\n起亚余罪\t262607\n中铁八局\t262608\n生啤\t262609\n加味\t262610\n第101\t262611\n最稳\t262612\n斗南\t262613\n双控\t262614\n小米3g\t262615\n河南法院\t262616\n荷兰鼠\t262617\n张炎\t262618\n夹河\t262619\n铁打\t262620\n4399儿歌故事大全\t262621\n曹山石\t262622\n変更\t262623\n肉块术\t262624\n水葬\t262625\ncaligula\t262626\n澳门银河酒店\t262627
\n榴社区\t262628\n轮齿\t262629\n5-7天\t262630\n吴县\t262631\nwhitelist\t262632\n夹边沟\t262633\n巨乳\t262634\n林凌\t262635\nDisorders\t262636\n循环赛\t262637\n龙嘉国际机场\t262638\n清妍\t262639\n01号\t262640\n故意毁坏财物罪\t262641\n错\t262642\n六分米\t262643\n10052\t262644\n上海玩具总动员酒店\t262645\nFirewalld\t262646\n产污\t262647\n见寄\t262648\nGate\t262649\n麻城信息网\t262650\nPerceptron\t262651\n使徒行者\t262652\nHIPP\t262653\n泰州学院\t262654\nvajoy\t262655\n雨中花\t262656\n唯品花\t262657\natx660\t262658\n煤尘\t262659\n幸运破解器\t262660\n西安石油大学\t262661\n乔治亚州\t262662\n甘宇\t262663\n牵牛\t262664\n千分尺\t262665\n178NBA2KOL\t262666\n严复\t262667\n中国科技信息\t262668\nchromeos\t262669\n跳闪\t262670\n党风廉政建设责任书\t262671\n防油\t262672\n16张\t262673\n沙皇\t262674\n百香果汁\t262675\n虫卵\t262676\n旋转流变仪\t262677\nshy48\t262678\n杭州壹号院\t262679\n马王堆帛书\t262680\n中国农业机械化科学研究院\t262681\n狂扇\t262682\n赵杨\t262683\nocn\t262684\n肉缝\t262685\n二杯\t262686\nROS软路由论坛\t262687\n雕爷牛腩\t262688\n花裤\t262689\n掌跖脓疱病\t262690\n花茎\t262691\n示意图\t262692\n浓姬\t262693\n秦叔宝\t262694\n隐蔽工程验收记录\t262695\nLinuxmint\t262696\n总有一天\t262697\n中国建交\t262698\n总办\t262699\n中国教育信息化网\t262700\n输尿管结石\t262701\n4x100米\t262702\n复视\t262703\n1511\t262704\n白渡桥\t262705\nArray\t262706\n童裙\t262707\n通谱\t262708\n定页\t262709\n649\t262710\n年假\t262711\n触电\t262712\n打扑克\t262713\n1队\t262714\n想说说\t262715\n264\t262716\n凯旋城\t262717\n钥\t262718\n百纳网\t262719\n彪哥闯奉天之做梦\t262720\n帝少的宠妻计划\t262721\nallegro\t262722\n2日游\t262723\n潍坊七中\t262724\n滑动\t262725\n电光\t262726\n天津市国资委\t262727\n封闭环\t262728\n太阳幼儿园\t262729\n3158环保网\t262730\n成人三国志\t262731\nアルバイト\t262732\nannouncement\t262733\n无故\t262734\n帅选\t262735\n句容市\t262736\n液压站\t262737\n南苑机场\t262738\n吸干机\t262739\n双爱\t262740\nCS1.6\t262741\n李中水上森林公园\t262742\n楚雄师范学院\t262743\n弹塑性\t262744\n宁夏银行\t262745\nX10.11\t262746\n临江街道\t262747\n天府四街\t262748\n冤枉\t262749\n趣图阁\t262750\n紫鹃\t262751\n蚁国\t262752\n常春藤叶\t262753\n虫虫吉他\t262754\n瑟曦\t262755\n冷落\t262756\nTinrry下午茶\t262757\n大猪\t262758\n鸠摩智\t262759\nboundary\t262760\n位元堂\t262761\n吕行\t262762\n相投\t262763\n国网天津市电力公司\t262764\n中银基金\t262765\n诗体\t262766\n172CM\t262767\n黔东南州\t262768\n蕉城\t26
2769\n驾驶感\t262770\nTERRY\t262771\n沈念\t262772\n臂\t262773\n文件化\t262774\n爱乐达\t262775\n伊瓜因\t262776\nLack\t262777\n颁奖礼\t262778\n苏商集团\t262779\nC型钢\t262780\n西宝生物科技(上海)股份有限公司\t262781\nvika\t262782\n48层\t262783\n被告席\t262784\n奥丁\t262785\n妈妈手\t262786\n陈刚\t262787\nALLEN\t262788\n重固\t262789\nradware\t262790\nRise\t262791\n4\t262792\nBuried\t262793\n水飞蓟素\t262794\n自贡房产网\t262795\n金茂湾\t262796\n100吨\t262797\n168小时\t262798\n鱼肠\t262799\n省人防办\t262800\n诺基亚e63\t262801\n特情\t262802\nwenwen\t262803\n旅行保险\t262804\n修肤堂\t262805\n火星\t262806\n张旗\t262807\n代发货\t262808\n保护液\t262809\n凯哥\t262810\n大哲\t262811\n發燒\t262812\n校验码\t262813\nWindowsServer2008\t262814\n炼焦\t262815\n皮物\t262816\n蔡甸区\t262817\n初始登记\t262818\n夏虫\t262819\nno.1\t262820\n瑞鑫\t262821\n复仇者联盟2:奥创纪元\t262822\n紫金港科技城\t262823\n反义疑问句\t262824\n罗线\t262825\n首乌藤\t262826\n一帆风顺\t262827\n焚心傲情\t262828\n电热水\t262829\n公生\t262830\n智轨\t262831\n中国农业银行个人网上银行\t262832\n中国计量学院研究生部\t262833\nv5r20\t262834\n取生\t262835\n密堆积\t262836\n半边街\t262837\n星见\t262838\ndnf迷你大乱斗\t262839\n中国电子银行网\t262840\n娱乐圈文\t262841\n8091\t262842\n战汉\t262843\ndearest\t262844\n5.1.0\t262845\n拨开\t262846\n胆经\t262847\n791\t262848\n肖央\t262849\n积家\t262850\nDVD导航\t262851\n河南科技学院\t262852\n2817\t262853\n应税消费品\t262854\n手机屏\t262855\n劳动赞歌\t262856\n娃记\t262857\n童鞋\t262858\n塔林\t262859\n青水庵\t262860\n牛肉片\t262861\n贝村\t262862\n哔\t262863\ncoolxing\t262864\n唐聿城\t262865\n堵盖\t262866\n宣统\t262867\n批处理文件\t262868\n杭州市财政局\t262869\n人大经济学院\t262870\n正源\t262871\n临到\t262872\n稷\t262873\nbeanfactory\t262874\n招远市人大常委会\t262875\n法约尔\t262876\n江西省人民政府法制办公室\t262877\n鲁毅\t262878\n中国养蛇网\t262879\n性度\t262880\n字裤\t262881\n搭伙\t262882\nvegf\t262883\n1毫米\t262884\n罗斯威尔\t262885\n海外置业\t262886\n竞争案\t262887\n信托融资\t262888\n九肖\t262889\n66路\t262890\n社会主义者\t262891\n张谷英村\t262892\n刘佩玥\t262893\nmarketplace\t262894\n仙泉\t262895\n雁塔\t262896\n3505\t262897\n干磨\t262898\n因势\t262899\n爱伦\t262900\n自大\t262901\n创维电视\t262902\n琯头镇\t262903\nRecipe\t262904\n10公里\t262905\nsu\t262906\n张伟文\t262907\n婚记\t262908\n绿牛\t262909\n诱感\t262910\n晓美焰\t262911\n男与女\t262912\n30km\t262913\n求取
\t262914\nokgo\t262915\n备置\t262916\n水晶棺\t262917\n我的狐仙老婆\t262918\n南昌小学\t262919\n人保大厦\t262920\n广西机电职业技术学院\t262921\n名爵锐行\t262922\n东宁县\t262923\n0.35mm\t262924\n嘉鱼县\t262925\nF50\t262926\n500px\t262927\n中图版\t262928\n好汉\t262929\n深松机\t262930\ninductor\t262931\n1399\t262932\n电位仪\t262933\n风泽\t262934\n青年人\t262935\n理赔\t262936\n英华\t262937\n吉高宁宁\t262938\n金华五中\t262939\n淘宝托管\t262940\n乌金石茶盘\t262941\n兴坪镇\t262942\nlayPage\t262943\nproe5.0\t262944\n刘佳\t262945\n超进化物语\t262946\n4212\t262947\n上海卓卓网络科技有限公司\t262948\n一息\t262949\n潘颖\t262950\n失足\t262951\n渝三峡\t262952\n涡旋振荡器\t262953\n墩子\t262954\n蓝谷\t262955\nassociation\t262956\n张善\t262957\ndoze\t262958\n快修\t262959\n不确定\t262960\n电视塔\t262961\n兔子党\t262962\n科学通报\t262963\n红薯粉\t262964\nScreening\t262965\ncovers\t262966\n贯通式\t262967\n迅播影院\t262968\n好可怕\t262969\n20170930\t262970\n盗版\t262971\n颍上一中\t262972\n安徽省食品药品监督管理局\t262973\n大隈\t262974\n接工\t262975\n双港\t262976\n青天\t262977\nEventLoop\t262978\n眉骨\t262979\n陆远涛\t262980\n尤物网\t262981\n外科手\t262982\n门户网站群\t262983\nbaijiahao\t262984\n吉林农安政府\t262985\n中国建设银行手机银行\t262986\n平面镜\t262987\n2.4.23\t262988\n妾本惊华\t262989\n交大高金\t262990\n走线架\t262991\n128分钟\t262992\n小沈\t262993\n循行\t262994\nHomo\t262995\n酸根\t262996\n基孔\t262997\nWebShell\t262998\n李建辉\t262999\n三项式\t263000\n中青旅山水时尚酒店\t263001\n广州政府网\t263002\n轻型板\t263003\nBroadwell\t263004\n甲卡西酮\t263005\n中资企业\t263006\nboot.img\t263007\n二次\t263008\nvertical\t263009\n灭蚊\t263010\n拆东墙补西墙\t263011\n春晖\t263012\n5999\t263013\n微波电路\t263014\n员外\t263015\n始源\t263016\n飯岡\t263017\nKobo\t263018\nTransceiver\t263019\n错单\t263020\n张艺谋\t263021\n断位\t263022\n拉勾网\t263023\n炫铃\t263024\n陈蔡\t263025\n踏水\t263026\n開放式\t263027\n装维\t263028\n英强\t263029\n四害\t263030\n诚毅\t263031\nlifusheng\t263032\n不甘示弱\t263033\nApplepay\t263034\n杨姐\t263035\n中国进出口商品交易会\t263036\n普生\t263037\n死疫套\t263038\n同乡\t263039\n爱人者\t263040\nPS饭团网\t263041\n泰国博仁大学\t263042\n殉葬\t263043\n仙境传说吧\t263044\n惴惴不安\t263045\neasybcd\t263046\n田面\t263047\nCentOS7\t263048\n先锋潮\t263049\n美极\t263050\nCUI\t263051\nMyCloud\t263052\nTristan\t263053\n台秀\t263054\n飞行系\t263
055\n学报\t263056\n东海证券\t263057\n全称\t263058\n本业\t263059\n刘春霖\t263060\n五一特惠\t263061\n古瓷\t263062\n厨业\t263063\n最老版\t263064\n蕨类\t263065\n肿瘤医院\t263066\n沙锤\t263067\n灭菌机\t263068\n槐\t263069\n冷暖人间\t263070\n坎卦\t263071\n滕修福\t263072\n塞巴斯蒂安·斯坦\t263073\n兰州地铁1号线\t263074\nenciclopedia\t263075\n金赞\t263076\ninsured\t263077\n上升子\t263078\n金运\t263079\n长安CS15\t263080\n兰西县\t263081\n妖妖灵\t263082\npi3\t263083\n10进制\t263084\n狮子狗\t263085\n赵姨娘\t263086\n淮北市人民政府\t263087\n新标日中级\t263088\n第三弹\t263089\n吐奶\t263090\n口吃\t263091\n神匠\t263092\nOralce\t263093\n2016年4月1日\t263094\nh小游戏\t263095\n动产质押\t263096\n巢湖经济开发区\t263097\n过氧\t263098\nbacktrack\t263099\n江村\t263100\n存储类\t263101\nwenshu\t263102\n微纪录片\t263103\n焦圈\t263104\n生畏\t263105\n慧忠北里\t263106\n神韵\t263107\nSPRING\t263108\nXftp\t263109\n红板\t263110\nvoc2007\t263111\n淘宝云\t263112\n20多年前\t263113\n绿地广场\t263114\n1代\t263115\n系留\t263116\n蔼\t263117\n四川省人民医院\t263118\noverseas\t263119\n三个\t263120\n一人有限责任公司\t263121\n北京大学\t263122\n屈大均\t263123\n斗破从前有座灵剑山\t263124\nComments\t263125\n北方经贸\t263126\n站友\t263127\n眨眼之间\t263128\nwin32-x86\t263129\n限制性定语从句\t263130\nDES算法\t263131\n恶言\t263132\n12芯\t263133\n社会学专业\t263134\n6.08\t263135\n车团\t263136\n炸服\t263137\n白额\t263138\n关原合战\t263139\n桐梓林\t263140\n鲜女\t263141\n多兰\t263142\n云南人事考试网\t263143\nWin10系统之家官网\t263144\nivms\t263145\n考释\t263146\n铜水\t263147\n天正吧\t263148\n天火传说\t263149\n公冶长\t263150\n三女河机场\t263151\n沙尾\t263152\n开支\t263153\n怀特\t263154\nDNF女鬼剑\t263155\ndio\t263156\na1566\t263157\n野鸭\t263158\n李玟雨\t263159\n诱罪\t263160\n姜有为\t263161\n关爱通\t263162\niPhone5C\t263163\n校直\t263164\n整片\t263165\n乐章\t263166\n南安商报社\t263167\n宋梅\t263168\nZHAO\t263169\n七八点\t263170\n未来之战\t263171\n智能燃气表\t263172\n帕慕克\t263173\n两融\t263174\n青雪版\t263175\n藩王\t263176\n奇慢\t263177\n阿根廷红虾\t263178\n小米系统\t263179\n四合一\t263180\n连环马\t263181\n屁侠\t263182\n熟悉的人\t263183\n供销合作网\t263184\n自由泳接力\t263185\nJinja2\t263186\n老年机\t263187\n淡水街道\t263188\n仗势\t263189\n逃生2\t263190\nmysql表\t263191\n苏冉\t263192\n人工牛黄\t263193\n王悦\t263194\n番茄汁\t263195\nTurn\t263196\n密实度\t263197\nLoot\t263198\n普陀新闻网\t263199\n家用电脑\
t263200\n在线视\t263201\n大气污染源\t263202\n谢晋\t263203\n_尚之潮\t263204\n百利天下\t263205\nrecommendations\t263206\n明德中学\t263207\n误造\t263208\nxiluhua\t263209\n6230\t263210\neltamd\t263211\n混播\t263212\n奔驰\t263213\nQQQ\t263214\n中共达州市纪委监察局\t263215\n聚划\t263216\n主动权\t263217\n杨洋\t263218\nFLX飞豆\t263219\n振振有词\t263220\n跳进\t263221\nPrepare\t263222\n于莺\t263223\n腰椎骨\t263224\n韩生\t263225\n联交所\t263226\n中华优秀传统文化教育\t263227\n亲力\t263228\noldman\t263229\n搜狗五笔输入法\t263230\n康杰\t263231\n高克\t263232\n组别\t263233\nreq\t263234\n半导体公司\t263235\n道可\t263236\nmort\t263237\n十三经\t263238\n坎巴拉太空计划吧\t263239\n桑榆\t263240\nexceptions\t263241\n鄂博\t263242\n天津科技\t263243\n马来币\t263244\n自重\t263245\n74平米\t263246\n季播剧\t263247\n雷婷\t263248\n羊绒线\t263249\n中线\t263250\n奥宝典\t263251\n20150623\t263252\n上下方\t263253\n裸身照\t263254\n孚盟\t263255\n朝气蓬勃\t263256\n这扇门\t263257\nPenthouse\t263258\n百喻经\t263259\n长寿桥小学\t263260\n立柜\t263261\n5588\t263262\n有忧\t263263\n武穴市\t263264\n华特迪士尼公司\t263265\n冬运\t263266\n长春花\t263267\n诺德安达\t263268\n颜液\t263269\n推研\t263270\n莱宝\t263271\n2400米\t263272\n一千张\t263273\n新南站\t263274\nlibcurl\t263275\n欢喜密探\t263276\n亚历山\t263277\n修眉刀\t263278\nkerr\t263279\nntu\t263280\n钢筋焊接网\t263281\nLoli\t263282\n秀莲\t263283\n7450\t263284\n讲标\t263285\n假离婚\t263286\n新一村\t263287\nTOP5\t263288\n崔胜贤\t263289\nahc水乳\t263290\namv\t263291\n庄闲\t263292\n高崎机场\t263293\naoa\t263294\n多台\t263295\n血蹄\t263296\nzeronet\t263297\n清大\t263298\n海门中学\t263299\nkiter\t263300\n急弯\t263301\n洗脸机\t263302\n省委教育工委\t263303\n煎熬\t263304\n社会资本合作模式\t263305\n洈水\t263306\n长楹天街\t263307\n前程\t263308\n枝江\t263309\n8.7分\t263310\n二八法则\t263311\n督\t263312\nc5\t263313\n龙达温泉生态城\t263314\n4月15\t263315\n质效\t263316\n回娘家\t263317\n杭州科技职业技术学院\t263318\n雍正\t263319\n重新开具\t263320\noriginate\t263321\n结婚典礼\t263322\n开户银行\t263323\n工控网\t263324\n信阳鸡公山\t263325\n一色神\t263326\n本厂\t263327\n提前批\t263328\nmali\t263329\nLOA\t263330\nmyEclipse\t263331\n刘传\t263332\n兰州晚报数字报\t263333\n说者\t263334\n电抗器\t263335\n橡皮筋\t263336\n一月一日\t263337\nINVENTOR\t263338\n繁\t263339\n寄宿公寓\t263340\n鑫空间-鑫\t263341\n首建\t263342\n霍少\t263343\n大明龙权\t263344\n
w650dc\t263345\n微博群\t263346\n大巧若拙\t263347\n铜仁职业技术学院\t263348\n浙江经贸职业技术学院\t263349\nnbalive\t263350\n北京搬家公司\t263351\n_夜\t263352\n淹城春秋乐园\t263353\n在雨中\t263354\n会堂\t263355\n日母\t263356\nHuff\t263357\ntrends\t263358\n下龙湾\t263359\nTI7\t263360\n贾维斯\t263361\n125型\t263362\nansa\t263363\n步阳防盗门\t263364\n白孔雀\t263365\n韩璐\t263366\n绿衣\t263367\n虚灵\t263368\n达喀尔\t263369\norganizational\t263370\n大学者\t263371\n猪脚姜\t263372\n1984\t263373\n验布机\t263374\nklk\t263375\n601800\t263376\n小朱\t263377\n党规\t263378\n迎接\t263379\n光能表\t263380\n数字营销\t263381\n马基埃亚尔\t263382\n法亚\t263383\n掺水\t263384\n_颜\t263385\n贴纸\t263386\n民事诉讼状\t263387\n等额本息\t263388\nMSMQ\t263389\n陌陌\t263390\n实弹军事演习\t263391\n拖曳\t263392\n渐变镜\t263393\ncci\t263394\n盘龙镇\t263395\n2541\t263396\n直升校\t263397\n蘇\t263398\nFare\t263399\n拜水\t263400\n德丰利达理财\t263401\n荣新江\t263402\n说明\t263403\n老虎钳\t263404\n一伴\t263405\n守恒\t263406\n青河县政府\t263407\n本田CRV\t263408\n树丛\t263409\n小小蛋儿\t263410\n广播域\t263411\n宝贝真乖_\t263412\n泡泡秀\t263413\n113家\t263414\n上海世界外国语中学\t263415\n京东文学奖\t263416\nbebop\t263417\nCaoPorn_超碰在线视频\t263418\n子组\t263419\n谷歌眼镜\t263420\n橙光|66RPG\t263421\ntritan\t263422\n接耳放\t263423\n大樱桃树\t263424\n黄景仁\t263425\n自首\t263426\njaas\t263427\nL301\t263428\n希卡贝尔\t263429\n字母歌\t263430\ntart\t263431\n湖南铁路科技职业技术学院\t263432\nBabes\t263433\nTablayout\t263434\n偶像有新番\t263435\n175级\t263436\n猎杀潜航\t263437\n丹霞路\t263438\n逃税罪\t263439\n写经\t263440\nsorted\t263441\n科尔维特\t263442\nTif\t263443\n黑岛\t263444\n御姐型\t263445\n国家发展改革委办公厅\t263446\nMick\t263447\nOllyDBG\t263448\nutf-8-CSDN\t263449\n大丰港\t263450\n三公分\t263451\n5月28日\t263452\n一仆二\t263453\n爱家\t263454\n凹凸世界吧\t263455\n力格\t263456\n口袋妖怪银\t263457\nAlmond\t263458\n银团\t263459\nraised\t263460\n共同群\t263461\n款码\t263462\nLancome兰蔻\t263463\n評測\t263464\n盐田街道\t263465\nbambam\t263466\n上九天\t263467\n广汽汇理\t263468\n第4色\t263469\n国际航班\t263470\n陌客\t263471\n会员工\t263472\n2017年度\t263473\n吴侬软语\t263474\n拿着\t263475\n浸润性导管癌\t263476\nm226dw\t263477\n因式分解\t263478\n傅彦东\t263479\n八零\t263480\nABR\t263481\n品库\t263482\n投注网\t263483\n何止\t263484\n看头\t263485\n青史\t263486\n川贝母\t2
63487\n焦急\t263488\nIRISaaS\t263489\n16版\t263490\n刘江\t263491\n青酱\t263492\n肾病科\t263493\nZlata\t263494\n蝴蝶袖\t263495\n最近七天\t263496\nauth\t263497\nbat吧\t263498\n平谷\t263499\nbaojian\t263500\nssa\t263501\n皮托管\t263502\n张槎街道\t263503\n二村\t263504\nSTUDENT\t263505\n始皇\t263506\n蒜农\t263507\nQuickTime\t263508\n公诚\t263509\nSiberia\t263510\n人性论\t263511\nPlantUML\t263512\nzf\t263513\nNavMeshAgent\t263514\n王娇\t263515\n放题\t263516\n寓\t263517\nfuse\t263518\n6000亿元\t263519\nhopper\t263520\n糕点\t263521\n泥河湾\t263522\n津门\t263523\n两万多元\t263524\n玉山\t263525\n通化市\t263526\n七份\t263527\n浙江电大\t263528\npsychtoolbox\t263529\nmutisim\t263530\n比照\t263531\n三塘镇\t263532\nLucifer\t263533\n德惠市人民政府\t263534\n仙鹤门\t263535\n上面\t263536\n小生不才\t263537\n试用期\t263538\n棚架\t263539\n弓缘\t263540\nE200\t263541\n千百撸\t263542\n马蹄寺\t263543\n庶妃\t263544\n正交矩阵\t263545\n金桔\t263546\n海南博鳌论坛\t263547\n云天明\t263548\nchinaunicom\t263549\n排印\t263550\n冀州中学\t263551\naxl234\t263552\n周韶宁\t263553\n耕地\t263554\n上海东方体育中心\t263555\napxs\t263556\n214\t263557\n制图网\t263558\n检制\t263559\nAllure\t263560\n白举纲\t263561\n15万亿\t263562\n乘法运算律\t263563\n布鲁日\t263564\nClosure\t263565\n机械类\t263566\niveryone\t263567\n陈巴尔\t263568\n安儿\t263569\n刘晓玲\t263570\n出生入死\t263571\n13座\t263572\n指相扣\t263573\ngardens\t263574\n上海牙科医院\t263575\n地铁8号线\t263576\n清正廉洁\t263577\n爱上男闺蜜\t263578\n人间词\t263579\n4.40\t263580\n巨龟\t263581\n晋松\t263582\nOwned\t263583\n穢\t263584\nLUO\t263585\n花祭\t263586\nHillstone\t263587\n丙丁\t263588\n农业部\t263589\n眨\t263590\nFANG\t263591\n预后\t263592\n维罗尼卡\t263593\n标库网\t263594\n芒橙\t263595\n甲文\t263596\n曲阜高铁站\t263597\n打铁\t263598\n云麓仙居\t263599\n金迪\t263600\n101贝\t263601\ngb\t263602\n雷村\t263603\n大使馆\t263604\n舅父\t263605\n非金融机构\t263606\n巨神战击队\t263607\n毛白杨\t263608\n乐贝\t263609\n46位\t263610\n米币\t263611\n从上\t263612\n汇编语言源程序\t263613\n北京市场\t263614\nGrandyang\t263615\n施瓦辛格\t263616\n系统状态栏\t263617\nikuku\t263618\n如此\t263619\n捣毁\t263620\n小倌\t263621\nddy\t263622\n订婚\t263623\nxtc800\t263624\n广陵区\t263625\n舞美\t263626\nWin10切换输入法\t263627\n拆屏\t263628\n预定\t263629\n金夏温\t263630\nEATON\t263631\n阿苏斯
\t263632\n爬上床\t263633\n指上\t263634\n新兵连\t263635\n根签\t263636\nwebstore\t263637\n壹哥\t263638\n四轮定位仪\t263639\n曲肖冰\t263640\nPressed\t263641\n末路狂花\t263642\n曹魏\t263643\nlie\t263644\n熊头\t263645\nwine\t263646\n北京市委\t263647\n风缘\t263648\n霖\t263649\n变得\t263650\n汇景苑\t263651\n禽药\t263652\n镧\t263653\n南昌物流公司\t263654\n加温\t263655\n葡萄城\t263656\n偷情\t263657\n枪眼\t263658\n憋气\t263659\n电压版\t263660\n纲丝\t263661\n抵充\t263662\nSou\t263663\n香炉山\t263664\n信号器\t263665\n压变\t263666\n吐气\t263667\n大旺区\t263668\n375路\t263669\n成行\t263670\nhdx7\t263671\n【尼尔机械纪元吧\t263672\n周先旺\t263673\n美丽汇\t263674\n菩提祖师\t263675\nmatlan\t263676\nygocore\t263677\n涂布\t263678\n米克\t263679\n红皮\t263680\n缅花\t263681\n正交\t263682\n滚筒\t263683\n冲减\t263684\n孔子游春\t263685\n牛病\t263686\n语言学家\t263687\n猪苓\t263688\n中华人民共和国工业和信息化部\t263689\n美邦服饰\t263690\n瘦肉精\t263691\n夜行人\t263692\n招标采购信息网\t263693\nlkp\t263694\n芬达\t263695\n光纤箱\t263696\n长征七号\t263697\n航员\t263698\n股票大作手回忆录\t263699\n两份\t263700\nmature\t263701\n蚁王\t263702\n吕归尘\t263703\n胸腺瘤\t263704\n高木直子\t263705\n丁泽王阳明\t263706\n时尚_好奇心日报\t263707\n莫高股份\t263708\n历时\t263709\n易趣\t263710\n邦辰\t263711\n单词\t263712\n图雅\t263713\n石墨消解仪\t263714\n学就会\t263715\n几朵\t263716\nPeers\t263717\n王城大道\t263718\n宅斗文\t263719\n得知道\t263720\n热力\t263721\n潮州电视台\t263722\n老院\t263723\n孙进\t263724\n扎堆儿\t263725\n田园风光\t263726\n一寸虫\t263727\n临床执业/助理医师考试\t263728\n橡胶粉\t263729\n赵家珍\t263730\n格萨尔王传\t263731\nv-on\t263732\n朱恒源\t263733\n。3\t263734\n波鞋\t263735\nchemical\t263736\nsurface2\t263737\n淫才\t263738\n空串\t263739\n双享号\t263740\nprofiles\t263741\n杰斐逊\t263742\n王江平\t263743\n眸\t263744\n三古\t263745\n封海\t263746\n背\t263747\n怡海星城\t263748\n便当盒\t263749\nFy\t263750\n桂林理工大学博文管理学院\t263751\nTampermonkey\t263752\n如戏\t263753\n表内除法\t263754\nMiele\t263755\n养肥\t263756\nandroid.mk\t263757\n决胜\t263758\nburned\t263759\n马登\t263760\n爱成瘾\t263761\n赤犬\t263762\n萨利\t263763\nV2014\t263764\n孤帆\t263765\n中海地产\t263766\n3安卓\t263767\n徐三石\t263768\n山乡风流韵事\t263769\nvillas\t263770\n高考派\t263771\n鸳鸯楼\t263772\nSP版\t263773\n归属\t263774\n广州珠江职业技术学院\t263775\n雌性激素\t263776\n北京社会管理职业学院\t263777\n0.6%\t263
778\n孙玲\t263779\nPLUG\t263780\n201所\t263781\n关联式\t263782\n二万元\t263783\n装聋作哑\t263784\n特斯拉\t263785\n喜德县\t263786\n草莓之夜\t263787\n8月22日\t263788\n县纪委\t263789\n淮秀\t263790\n昆州\t263791\n望文生义\t263792\n查理一世\t263793\n要慎重\t263794\nMongoDB\t263795\n缪氏\t263796\n中国科学院青藏高原研究所\t263797\n渠成\t263798\n2014届\t263799\n诗魔\t263800\n各族人民\t263801\n古菌\t263802\n开除\t263803\nCIQ\t263804\nマリア\t263805\nCoupon\t263806\n彩晶\t263807\n剑侠情缘移动版\t263808\n郊野\t263809\n蔡徐\t263810\ncx8\t263811\n10易升\t263812\nRISC\t263813\n重性\t263814\n600%\t263815\n劳民伤财\t263816\n北大方正人寿\t263817\n长鑫\t263818\n刘园园\t263819\n白鸟寿美礼\t263820\n林志炫\t263821\nARM9\t263822\n英雄时代\t263823\n普度众生\t263824\nAdapters\t263825\ncrate\t263826\n血肠\t263827\n唐中宗\t263828\n民俗\t263829\n160分钟\t263830\n劳动经济学\t263831\nResourceManager\t263832\n第三十七章\t263833\n州委书记\t263834\nFifa\t263835\nfoster\t263836\n张家界\t263837\n死亡飞车1\t263838\nIQKey\t263839\n不中看\t263840\n20150910\t263841\n第四十二条\t263842\n外伶仃岛\t263843\n向阳社区\t263844\n恭送\t263845\n内型\t263846\n荤君素妃\t263847\n1135\t263848\nRP8\t263849\n清川\t263850\n金水晶萝卜攻略\t263851\nTFSI\t263852\nKill\t263853\n第62届\t263854\n结构力学\t263855\n悍\t263856\n硼酸粉\t263857\n畔\t263858\n6.40\t263859\n789\t263860\n沙尔艾兰\t263861\nJerome\t263862\nM176n\t263863\nvvs\t263864\nSIMATIC\t263865\n拉克\t263866\n哥布林\t263867\n罗中旭\t263868\n陈为民\t263869\nRoommate\t263870\n沂\t263871\n台州物流公司\t263872\n基元反应\t263873\n主建\t263874\n_热血街舞团\t263875\nGIC\t263876\n双百工程\t263877\n中共中央党校\t263878\nv5.5.0\t263879\n大连公积金\t263880\n灯布\t263881\n双生花\t263882\n法案\t263883\n坐标轴\t263884\n龙湖紫都城\t263885\n赵奢\t263886\nLIS\t263887\n福建师范大学闽南科技学院\t263888\n性反应\t263889\n干窑\t263890\n北京航空航天大学\t263891\n格上理财\t263892\n前戏\t263893\n马蹄声\t263894\n托运部\t263895\n科学史\t263896\nBufferedImage\t263897\n堕夜精灵\t263898\n陪标\t263899\nsocio\t263900\n张奇\t263901\n26cm\t263902\n样袋\t263903\n莼试\t263904\n索多玛120天\t263905\nDeliver\t263906\ninstitutional\t263907\n俗世奇人\t263908\n漆画\t263909\n1次\t263910\n热型\t263911\n青岛动物园\t263912\ndota2吧\t263913\n古印度\t263914\n十九大胜利\t263915\n金庸全集\t263916\n猩红热\t263917\n2014-2015年度\t263918\n反比例函数\t263919\n橡胶展\
t263920\nNETSHOW论坛\t263921\nrdt\t263922\n丽妃\t263923\n苏亚\t263924\n治道\t263925\n攘外\t263926\n绿舟\t263927\n莫及\t263928\n50i\t263929\ntive\t263930\nSWA\t263931\n宁乡高新区\t263932\nabsorber\t263933\n自造\t263934\n柯斯达\t263935\nv2.1.9\t263936\nC114中国通信网\t263937\n幻师\t263938\n三盛集团\t263939\n骁龙626\t263940\n全站仪\t263941\ncntk\t263942\n燃眉之急\t263943\nRemastered\t263944\n40元\t263945\n中国联合工程公司\t263946\n2017-09\t263947\n一周天\t263948\n暗黑魔电\t263949\n高渗\t263950\n大奶妹\t263951\n例会\t263952\n吱\t263953\nagp\t263954\n哥哥们\t263955\n剑灵\t263956\n疏肝益阳胶囊\t263957\n讴歌RDX\t263958\n16吨\t263959\n变速\t263960\n铁楼梯\t263961\nsendevent\t263962\n华谊嘉信\t263963\nUnvs\t263964\n2242\t263965\nrtmp\t263966\n雷剑\t263967\n杂音\t263968\n素描\t263969\n子体\t263970\n徐熙娣\t263971\n刑事诉讼法学\t263972\n云南省投资审批中介超市\t263973\n阳光论坛\t263974\n超禁忌游戏\t263975\nPeri\t263976\nclown\t263977\nv4.0.1\t263978\n壶瓶山\t263979\n毛寸\t263980\n组织部\t263981\n十七冶\t263982\nmuscles\t263983\n机轮\t263984\n多想\t263985\n醴陵市人民政府\t263986\n上上城\t263987\n中英人寿\t263988\n惟吾德馨\t263989\nklz\t263990\nVentilation\t263991\n微信稿\t263992\n滢\t263993\n90%\t263994\nDODO\t263995\n黄冠\t263996\n_秒\t263997\n沙湖公园\t263998\n评介\t263999\n成长版\t264000\n苏教版\t264001\n平潭镇\t264002\n咕力咕力舞蹈学堂\t264003\n联大\t264004\n极客者\t264005\n欢乐斗地主残局闯关\t264006\n黑色母\t264007\n波德莱尔\t264008\n白鹭倾城\t264009\nFinite\t264010\n踏遍\t264011\n传奇网\t264012\n唐都\t264013\n阿西莫夫短文两篇\t264014\n创作者\t264015\n仪器谱\t264016\n重生之最强剑神\t264017\n功亏一篑\t264018\nProtel99SE\t264019\n多门\t264020\n原根\t264021\n120千米\t264022\nNEMA\t264023\n四架\t264024\n上山\t264025\n霸王硬上弓\t264026\n1851年\t264027\n招生考试网\t264028\n软萌\t264029\n公共汽车站\t264030\nversus\t264031\n解冻\t264032\n80000230\t264033\n杨枫\t264034\nkb2919355\t264035\n中量\t264036\n猎房网\t264037\n佛山市水务局\t264038\nfucked\t264039\n三相电压\t264040\n味库\t264041\n中国国际旅行社\t264042\n随机浮点数\t264043\n黄日华\t264044\n金山小区\t264045\n泽维尔\t264046\n3267\t264047\n七彩阳光\t264048\n智简\t264049\n吉林动画学院\t264050\n超级电脑\t264051\n0060\t264052\n冬菇头\t264053\n绝地求生刺激战场手游\t264054\n7页\t264055\nbili\t264056\n西青大寺\t264057\n中铁十二局集团有限公司\t264058\nv3.1.6\t264059\n地科\t264060\n亚麻裤\t264061\n
bucks\t264062\n稳和\t264063\n烧灼感\t264064\n莓兽\t264065\n徐冰\t264066\n合著\t264067\n联美\t264068\n东软熙康\t264069\n画面感\t264070\n650\t264071\n倩女幽魂ol\t264072\nfresh\t264073\n新传奇\t264074\n双核\t264075\n一百句\t264076\n松哥\t264077\nR2\t264078\nReserved\t264079\n乃东区\t264080\n直奔\t264081\nmailer\t264082\n怨情\t264083\n一二三四五\t264084\n舰队collect\t264085\n迷你兔\t264086\nmime\t264087\n濮阳房产网\t264088\n原研\t264089\ndowload\t264090\n清查\t264091\nDubstep\t264092\n20161117\t264093\n单表\t264094\n点状图\t264095\n肝斑\t264096\nDATAGRID\t264097\n电镀版\t264098\nrls\t264099\n|分\t264100\n华为P9\t264101\n梗\t264102\nSister\t264103\n复氧\t264104\n旗胜家园\t264105\nshinning\t264106\nEvermotion\t264107\n慈光\t264108\npymssql\t264109\nphilips\t264110\n360桌面助手\t264111\n世态\t264112\n失传\t264113\n老爷们\t264114\n素香\t264115\n元江路\t264116\n0888\t264117\n中粮广场\t264118\n瘀\t264119\n125.93\t264120\n武汉电视台\t264121\n标致408论\t264122\n木卡板\t264123\n明志\t264124\nAugmentation\t264125\n寒梅墨香\t264126\nh77\t264127\n动态规划\t264128\n巴依\t264129\n咖啡渍\t264130\n手压式\t264131\nYYKit\t264132\n神华集团有限责任公司\t264133\n经验者\t264134\n专楼\t264135\n743\t264136\n逍遥岛\t264137\n名簿\t264138\n涵义\t264139\ncqi\t264140\ncv1\t264141\n神舟六号\t264142\n会社\t264143\n新蓝鸟\t264144\n龙口市\t264145\n邦德教育\t264146\n0.009\t264147\nyaohui\t264148\n循环圈\t264149\nchg\t264150\n技嘉科技\t264151\n50KW\t264152\n北京大学信息科学技术学院\t264153\n密码破解器\t264154\nGMG\t264155\n有句话\t264156\n联想电脑管家\t264157\nctype\t264158\n英女\t264159\nIndustrial\t264160\n服务周\t264161\n纸制\t264162\nVlookup\t264163\n如刀割\t264164\n济南赶集网\t264165\n段间距\t264166\n肾错构瘤\t264167\n高强度螺栓\t264168\ne路通\t264169\n华为开发者联盟\t264170\namz\t264171\n亿联\t264172\n人身家\t264173\n淘鲜达\t264174\n极火\t264175\n碧志乃\t264176\n拓麻歌子\t264177\n二男\t264178\n人事科\t264179\n人力资源市场\t264180\n精忠报国\t264181\n东北银\t264182\n罚钱\t264183\n即时比分\t264184\n2016版)\t264185\n新长征路\t264186\n万能式断路器\t264187\n道\t264188\nSol君\t264189\n天造地设\t264190\n育路在职研究生招生信息网\t264191\n无敌诸天纪\t264192\nassays\t264193\n魏凤和\t264194\n耐药\t264195\n石柱\t264196\nfiesta\t264197\n2000日元\t264198\n联袂\t264199\n任仲夷\t264200\n中华人民共和国政府采购法实施条例\t264201\n泰合健康\t264202\ndokkan\t26
4203\nJie\t264204\n4关\t264205\n美国队长3:内战\t264206\n隋唐遗址植物园\t264207\n云想衣裳花想容\t264208\n钼精矿\t264209\n端高清\t264210\n书会\t264211\n二〇三五年\t264212\n半瓶\t264213\n简中式\t264214\n九江大桥\t264215\n捏合机\t264216\n壹点壹客\t264217\n廉政谈话\t264218\n摸鱼\t264219\n九岭\t264220\nSIMM\t264221\n平海路\t264222\n第23个\t264223\n各地人\t264224\nAQUOS\t264225\nIda\t264226\n黄石市下陆区\t264227\n百合咲\t264228\n中腾信\t264229\n奥州\t264230\n海峡股份\t264231\n杨伟东\t264232\nAmp\t264233\n30场\t264234\n阿翔\t264235\n谋取\t264236\n句首\t264237\n房山城关\t264238\n孟林\t264239\n阿什利\t264240\n代会\t264241\nconnection\t264242\n议事会\t264243\n外檐\t264244\n中雪\t264245\n莆田乐居\t264246\nlistmap\t264247\n登记照\t264248\n13#\t264249\n文明礼仪伴我行\t264250\n苏州中学园区校\t264251\nhk416\t264252\n上海明匠智能系统有限公司\t264253\n杜冷丁\t264254\n联想研究院\t264255\n测试员\t264256\n花开正\t264257\n扫地出门\t264258\n惠特尼·休斯顿\t264259\n生物卷\t264260\n认购\t264261\n好脾气\t264262\n天秤座女\t264263\n柯蓝纳达尔\t264264\n广州南火车站\t264265\n纳米晶体\t264266\n锡纸\t264267\n军鼓\t264268\n美罗国际\t264269\n黑龙江省委\t264270\n结论性\t264271\n左移\t264272\n个性特征\t264273\n下悬窗\t264274\n香港演艺学院\t264275\n204\t264276\n麦格纳\t264277\n传祺GS8\t264278\n我中有你\t264279\n黑龙\t264280\n雄安站\t264281\n药友\t264282\n荣成信息港\t264283\n商洛学院\t264284\n李坏\t264285\n全面版\t264286\n巴罗\t264287\n审单\t264288\n第一学期\t264289\nUltraSharp\t264290\n金俊勉\t264291\n中四\t264292\n桩帽\t264293\nmktime\t264294\n李娃\t264295\n驯龙高手3\t264296\n修枝\t264297\n火宅\t264298\ngerm\t264299\nROM|精简|\t264300\n王二平\t264301\n10005\t264302\n600多亿\t264303\ncfg\t264304\n引江济淮工程\t264305\n参学\t264306\n5列\t264307\ncomplier\t264308\nav资源网\t264309\n高h\t264310\ntcpreplay\t264311\n绿森\t264312\n扔\t264313\n保护板\t264314\n583\t264315\n动环监控系统\t264316\n界线\t264317\n汤灿\t264318\n竖条\t264319\n宋画\t264320\n膨胀螺栓\t264321\n唐驳虎\t264322\nvan\t264323\n毕业证书\t264324\n海米\t264325\n阿仁\t264326\n谢道韫\t264327\n18040期\t264328\n区别\t264329\n红客\t264330\nadapt\t264331\n不语\t264332\nTimes\t264333\n长安CX70_\t264334\n艾芜\t264335\n小港\t264336\n孙雪宁\t264337\n期权交易\t264338\n脑癌\t264339\nojbk\t264340\n时间胶囊\t264341\n畅途\t264342\n迦南学院\t264343\n另一条线\t264344\n索马里兰\t264345\n银杏路\t264346\n雪车\t264347\n理房通\t264348\n官湖\t264349\
nuri域名\t264350\n肥美\t264351\n娄师白\t264352\n微杂志\t264353\nRGB风扇\t264354\n元字\t264355\n维修部\t264356\n上海交大安泰\t264357\n有才\t264358\n川穹\t264359\n融资担保公司\t264360\n广西壮族自治区新闻出版广电局\t264361\n1杯\t264362\n总裁霸\t264363\n奥维德\t264364\n92级\t264365\n华润万家\t264366\n闹市区\t264367\n铝合金窗\t264368\n旷课\t264369\n祭灶\t264370\n乐透\t264371\n问镜\t264372\n水云涧\t264373\n头巾\t264374\n玩具盒\t264375\n内乱\t264376\n中共湖北省委统一战线工作部\t264377\n神经炎\t264378\n虹魔\t264379\nPatran\t264380\n语录\t264381\n阿拉德之怒\t264382\n复阻抗\t264383\n常建雄\t264384\n拦腰\t264385\n聚丙\t264386\n西溪谷\t264387\n小米6X\t264388\nquik\t264389\n于艳茹\t264390\n龙邦速递\t264391\n节目\t264392\n万国鹏\t264393\n倒峰\t264394\n无限恐怖\t264395\n张艺兴\t264396\n16桥\t264397\n炫技\t264398\n滦南\t264399\n伊万卡\t264400\n性活\t264401\n大溶洞\t264402\n御殿场\t264403\n空气压缩机\t264404\n第88\t264405\n林徽贾乃亮\t264406\n郯城\t264407\nKINECT\t264408\n小额贷款公司\t264409\n登天\t264410\n张家口学院\t264411\n大阪樱花\t264412\n穿斗式\t264413\n龙鞍\t264414\n中国金融期货交易所\t264415\n重庆楼盘网\t264416\n腿针\t264417\n粤中\t264418\n邪术\t264419\n8.com\t264420\n刘立峰\t264421\nBrasil\t264422\n英雄本\t264423\n开壶\t264424\n永动\t264425\n好月圆\t264426\n表面能\t264427\n范明\t264428\nStudy\t264429\n铁炮\t264430\n180326\t264431\n22歳\t264432\nAnyway\t264433\n王牌特工2黄金圈\t264434\n40英寸\t264435\n侈\t264436\n诺门坎战役\t264437\n电压转换器\t264438\n↗\t264439\n混蛋\t264440\n用友知识堂\t264441\nRoss\t264442\n成都市公安办证中心\t264443\nesp分区\t264444\n概念规划\t264445\n烟台市商务局\t264446\nime\t264447\n140家\t264448\n注册表编辑器\t264449\n点弯\t264450\n终人\t264451\nZXY\t264452\nnativescript\t264453\n放号\t264454\nDataView\t264455\n创典\t264456\n伊\t264457\n爱山\t264458\n5v1a\t264459\n丹王\t264460\n石家庄地区\t264461\n万不得已\t264462\n灵丘县\t264463\n面法\t264464\n从无\t264465\n花园\t264466\n成军\t264467\n骑马与砍杀战团\t264468\n星巴克公司\t264469\n多摩美术大学\t264470\n正职\t264471\n佳得乐\t264472\n流量池\t264473\n湖南城市学院\t264474\n沸羊羊\t264475\n文娱\t264476\n经济作物\t264477\n逐帧\t264478\n工频变压器\t264479\ngeophysics\t264480\n安茜\t264481\n侧限\t264482\n八则\t264483\n六位\t264484\nRAV4荣放论坛_XCAR\t264485\n曲式分析\t264486\n西班牙广场\t264487\n000万元\t264488\n557\t264489\n信息表\t264490\n爱订\t264491\n鼎卦\t264492\n远祖\t264493\n歙县\t264494\n江西省发改委\t2644
95\n一丘之貉\t264496\n下山兰\t264497\n日语学\t264498\n递推\t264499\n中森青子\t264500\n相遇\t264501\n旗鱼\t264502\n暗黑者2\t264503\n博鳌论坛\t264504\n会长毛\t264505\n第02\t264506\n1番\t264507\n云盖寺\t264508\nlol陪玩吧\t264509\n十二生肖\t264510\n新周刊\t264511\n取景\t264512\n斯隆\t264513\n南京师范大学文学院\t264514\n冲泡\t264515\n卡卡水皮\t264516\n天琴计划\t264517\n策勒县\t264518\n恋爱禁止的世界\t264519\n教辅\t264520\n播音主持专业\t264521\n_武夷山政府网\t264522\n计算生物学\t264523\n白印\t264524\n大阪城\t264525\n管台\t264526\n偏音\t264527\n莱斯特\t264528\n博格曼\t264529\n5本\t264530\n安卓版_5577我机网\t264531\n星外转铺\t264532\n御景苑\t264533\n列队\t264534\nbmw\t264535\n经审\t264536\n历害\t264537\n000016\t264538\nmg2580s\t264539\n理想城\t264540\nChateau\t264541\n吴启华\t264542\n几何形\t264543\nfixes\t264544\n睢宁县政府\t264545\n油分\t264546\n克莉丝\t264547\n光纤猫\t264548\n鬼术\t264549\nfiji\t264550\n118kj\t264551\n测量员\t264552\n限度\t264553\n昆山高新区\t264554\n天财商龙\t264555\n2派\t264556\n03年\t264557\nzqifa\t264558\n汇投网\t264559\n永川野生动物园\t264560\n夜玫瑰\t264561\n大信橱柜\t264562\n精壮\t264563\nlldb\t264564\n火影ol吧\t264565\n河南财政税务高等专科学校\t264566\nWin2003_Windows\t264567\n雅生\t264568\nMike\t264569\n提利昂\t264570\n海边城市\t264571\nVJ师\t264572\n三安光电股份有限公司\t264573\n高杉晋\t264574\n鲜花礼品网\t264575\n27张\t264576\n333路\t264577\n可么\t264578\n辽宁本钢队\t264579\nDHC蝶翠诗橄榄\t264580\n周玄毅\t264581\n极品飞车吧_\t264582\n香港警察\t264583\nApollo\t264584\n日产西玛\t264585\n青藏\t264586\n定县\t264587\n曝\t264588\n附加应力\t264589\n马夫\t264590\n炉石萨满\t264591\n600MW\t264592\n新生力\t264593\nfemdom\t264594\n建行快贷\t264595\nAD10\t264596\n红联Linux\t264597\n运动地板\t264598\nPreferences\t264599\n真机\t264600\n定制类\t264601\n大上海\t264602\n上海理工大学中英国际学院\t264603\n眩目\t264604\n按键板\t264605\n洗澡机\t264606\n高仓健\t264607\n兰博基尼Aventador\t264608\n金兀术\t264609\n选色\t264610\n绝地求生全军出击\t264611\n安卓模拟器\t264612\nssp/smp\t264613\n武宿国际机场\t264614\n无缺\t264615\n木剑\t264616\n马尔扎哈\t264617\n江山路\t264618\n齐星\t264619\n户口迁移\t264620\n贪欲\t264621\nUSPS\t264622\njep\t264623\n129路\t264624\n第四中学\t264625\n陈晓敏\t264626\n万古明末边军一小兵\t264627\n握奇\t264628\n第五十条\t264629\n高大上\t264630\n廓\t264631\nTMX\t264632\n永不满足\t264633\n善林宝\t264634\n小米锁屏\t264635\n三国英杰传\t264636\n再探\t264637\n
1993年\t264638\n异字\t264639\n纤维素酶\t264640\nthinkp\t264641\n图注\t264642\n44部\t264643\n崇安寺\t264644\n电子信息产业网\t264645\nsrand\t264646\nskyrim\t264647\nIdentification\t264648\n瓦房\t264649\n总生\t264650\n民营企业家\t264651\n优购网\t264652\n蜀山战纪\t264653\n阵势\t264654\n中美日\t264655\nundercover\t264656\nkkmall\t264657\nyr\t264658\n张衡路\t264659\n无命\t264660\n医学心理学\t264661\n通根\t264662\n1.9L\t264663\n安若\t264664\n华贸物流\t264665\n银叶菊\t264666\ngirdview\t264667\n黑裙\t264668\n不要紧\t264669\n20171003\t264670\n制冷量\t264671\n中国农村研究网\t264672\n三遍\t264673\n殉国\t264674\n3.5元\t264675\n抚触\t264676\nElektronik\t264677\nCummins\t264678\nvty\t264679\n博格\t264680\nbranches\t264681\n三个男孩\t264682\n郭明錤\t264683\n跑着\t264684\nIB\t264685\ninterpreted\t264686\n置位\t264687\n0.75kw\t264688\n金浦\t264689\n中小童\t264690\n三栏式\t264691\n反手\t264692\n千重\t264693\n小调协奏曲\t264694\n裕兴新概念英语\t264695\n刘振亚\t264696\nBug\t264697\n粉针剂\t264698\n薄钢板\t264699\ninertial\t264700\n一张\t264701\n一辈子\t264702\nSOL君\t264703\n升压型\t264704\n一九七九年\t264705\n优蓝网\t264706\n冯天薇\t264707\n漫笔\t264708\n加油管\t264709\nSumatraPDF\t264710\n边条\t264711\n鱼服\t264712\n平板型\t264713\n恬不知耻\t264714\n自闭症日\t264715\n个字\t264716\n翻译件\t264717\n兔儿爷\t264718\n调针\t264719\n伟\t264720\n减仓\t264721\nG6\t264722\n025期\t264723\n英特尔杯\t264724\nMatter\t264725\n9几年\t264726\n沈燕\t264727\n打印服务器\t264728\n低成本\t264729\n技术点\t264730\nGestures\t264731\n余翔\t264732\n丁老头\t264733\nSINUMERIK\t264734\nWarehouse\t264735\nag超玩会\t264736\n馔\t264737\n生活号\t264738\n那个年代\t264739\n兰斯3\t264740\nsyy\t264741\n烈酒\t264742\n140u.dll\t264743\n男人世界\t264744\nCompetitions\t264745\n湖北电力\t264746\nEaBIM\t264747\n蒯\t264748\nvs2003\t264749\n贵池\t264750\n层次化\t264751\n结丹\t264752\n25W\t264753\n河坊\t264754\n一大四\t264755\nWP8.1\t264756\n1021\t264757\n分板\t264758\n长沢\t264759\n白贞德\t264760\n生男生女\t264761\n基文\t264762\n2万平米\t264763\n工巧\t264764\n中孝介\t264765\n亚朵\t264766\n150家\t264767\n40讲\t264768\nbadi\t264769\nPreference\t264770\nhao123邮箱\t264771\nCostume\t264772\n黄山区人民政府\t264773\n当兵\t264774\n剑风传奇无双\t264775\nzxr10\t264776\n安邦人寿\t264777\n宝葫芦的秘密\t264778\n虎扑路人王\t264779\nbo\t2
64780\n技术学\t264781\n美洽\t264782\n诛仙前传\t264783\n金木\t264784\n丽枫\t264785\nquick-cocos2d\t264786\n0954\t264787\n铜米机\t264788\n怡人\t264789\n辛灵\t264790\n决策\t264791\n42厘米\t264792\n厦门路\t264793\nresolved\t264794\n成都地铁运营有限公司\t264795\n中国铁道出版社\t264796\n中国开公司\t264797\n第三道\t264798\n陈松\t264799\n新浪辽宁新闻_新浪辽宁\t264800\n二进宫\t264801\n地理类\t264802\n牦牛肉\t264803\nthrill\t264804\n图解版\t264805\n封神\t264806\n房地产集团公司\t264807\n刘畊宏\t264808\n统计分布\t264809\nNumber\t264810\nPAIN\t264811\n158厘米\t264812\n五五开黑\t264813\n逐年\t264814\n45式\t264815\n顾北辰\t264816\n3月20\t264817\n云南网\t264818\n百米\t264819\n柚月向日葵\t264820\n2.7米\t264821\n市桥镇\t264822\n炸鱼\t264823\n白梗\t264824\n韩男团\t264825\n莲前西路\t264826\n集合版\t264827\n导电泡棉\t264828\n衣服架\t264829\n王海波\t264830\nVK\t264831\n县人武部\t264832\n滁州站\t264833\n不闻不问\t264834\n2018年4月27\t264835\n2y2\t264836\n中山大学公共卫生学院\t264837\n黄龙饭店\t264838\n|亿\t264839\n当当云\t264840\n阿里总部\t264841\n恒通商务园\t264842\n梦幻西游无双2\t264843\n澤野弘之\t264844\nHEROES\t264845\n乡愁\t264846\n按位运算符\t264847\n联想e470\t264848\n812路\t264849\n加勒比海盗\t264850\n叶明\t264851\n香头\t264852\n紫微斗数\t264853\n上音\t264854\n川芎嗪\t264855\n爱地\t264856\n非标准\t264857\n微微一笑很倾城2\t264858\n百田奥奇传说\t264859\n奥雅\t264860\n劫火\t264861\n苍溪在线\t264862\n银河国际\t264863\nvalues\t264864\nbak\t264865\n维修队\t264866\n五星级酒店\t264867\n25%\t264868\n四川省医学科学院\t264869\n_学网\t264870\n一臂之力\t264871\n不难\t264872\nOnenote\t264873\nHear\t264874\n碧柔\t264875\n陈行\t264876\n南辕北辙\t264877\n艾山\t264878\nMoving\t264879\n优信集团\t264880\n付费通\t264881\nT100\t264882\n花影\t264883\nROMOSS\t264884\n绿城留香园\t264885\njavadoc\t264886\n50_\t264887\n酒干\t264888\n真如\t264889\n应计利息\t264890\n车辆段\t264891\n于丹\t264892\n工作圈\t264893\n新种\t264894\n段丽阳\t264895\n试管婴儿周期\t264896\n数位板\t264897\nprinciple\t264898\n中国商学院\t264899\n城计\t264900\n金剑\t264901\n祖玛阁\t264902\n电击枪\t264903\n驶\t264904\n肇庆高新区\t264905\n电脑维修知识网\t264906\n世界博物馆\t264907\n觐天宝匣\t264908\n阴穴\t264909\nindicative\t264910\n宝湖\t264911\n1987\t264912\n关雎\t264913\n中诗协\t264914\n机械杆\t264915\n候补委员\t264916\nAllegro\t264917\n王永珀\t264918\ngdgs\t264919\nd80\t264920\n原型链\t264921\nIpython\t264922\nNote2\t2
64923\nParrot\t264924\n擢升\t264925\nThea\t264926\n天龙八部3D\t264927\n蜂鸣器\t264928\nszf\t264929\n编年史\t264930\nhanhua\t264931\n高智能方程式\t264932\n第7课\t264933\nEpidemiology\t264934\n失望\t264935\n金迪木门\t264936\n三分钟内\t264937\n乱弹\t264938\n阿里宝卡\t264939\n192khz\t264940\n20180327\t264941\n买房\t264942\n频率分布表\t264943\n夏洛特·玲玲\t264944\n爱不到\t264945\nubc吧\t264946\nesata\t264947\n侠盗飞车联盟\t264948\nCHH\t264949\n西黄丸\t264950\n王文娟\t264951\n九十五\t264952\n京杭大运河\t264953\n百信\t264954\n维权网\t264955\n重生完美时代\t264956\nabcwt112\t264957\n5.4\t264958\n2404\t264959\n抵偿\t264960\n独幕\t264961\n貂禅\t264962\n黄小墨\t264963\n妻夫木聪\t264964\n沟线\t264965\nPixels\t264966\n盖着\t264967\n东江社区\t264968\n动画布\t264969\n油缸\t264970\n寻味顺德\t264971\nwin7锁屏\t264972\n高田贤\t264973\n到天\t264974\nio包\t264975\nMille\t264976\n第四幕\t264977\n2015年11月30日\t264978\n家庭背景\t264979\n蹄膀\t264980\n发分\t264981\n建昌道\t264982\n侯仁\t264983\n弹指间\t264984\n瑶姬\t264985\n华贸中心\t264986\n乃\t264987\n南昌市科技局\t264988\n虚空龙\t264989\n血红蛋白电泳\t264990\nNCT\t264991\nFiguarts\t264992\n自热火锅\t264993\n同股\t264994\n117.136.255.255\t264995\n奥利\t264996\n可避免\t264997\n订户\t264998\n返潮\t264999\n_侠岚\t265000\n摄者\t265001\n西咸\t265002\n阿魏\t265003\n茶店镇\t265004\n学前教育网\t265005\n内酯\t265006\n月全食\t265007\n赤司征十郎\t265008\n潘多拉魔盒\t265009\n喷神james\t265010\nMiho\t265011\n三维码\t265012\n成华区\t265013\n专局\t265014\npeck\t265015\n四十九\t265016\n白漫城\t265017\n妖都\t265018\n激励语\t265019\n上海金交所\t265020\n泰州晚报\t265021\n区人民政府办公室\t265022\n弹雨\t265023\n物业管\t265024\n新东方\t265025\n每人\t265026\n不在家\t265027\n掏钱\t265028\n0374\t265029\n裙座\t265030\n商校\t265031\n亚马孙\t265032\n所在行\t265033\n甲基吡咯烷酮\t265034\n仙药\t265035\n克洛格\t265036\n四氯\t265037\n措不及防\t265038\n留给\t265039\n4.96\t265040\n霍尔传感器\t265041\n长荡湖\t265042\n5.8万\t265043\n小厨宝\t265044\n董亮\t265045\n世界计量日\t265046\n胎衣\t265047\n招商银行私人银行\t265048\nNUDE\t265049\n至强\t265050\ntextView\t265051\n大河乡\t265052\n丰田86\t265053\npow函数\t265054\n株洲市\t265055\n红星小学\t265056\nΠ\t265057\nR2.0\t265058\nfz\t265059\n船形\t265060\n关旭\t265061\n肉丝面\t265062\n护国寺\t265063\n雕版\t265064\n西安市财政局\t265065\n卡槽\t265066\n奇兔\t265067\n电水\t265068\n九月初九\t
265069\ncar-t\t265070\n兰州酒店\t265071\n处分权\t265072\n新安江\t265073\n换食\t265074\n红学家\t265075\n肖战\t265076\nlocalized\t265077\n景粼\t265078\n星耀天地\t265079\n沈阳宾馆\t265080\n签订合\t265081\n南大光电\t265082\n未达\t265083\nwireframe\t265084\n出借\t265085\n王和平\t265086\n奶茶粉\t265087\n大小写\t265088\n古代情缘-花雨全本小说阅读网\t265089\n小沃\t265090\n灯笼\t265091\n金三\t265092\nlazboy\t265093\n中国平安保险股份有限公司\t265094\nPDF编辑器\t265095\n航天博物馆\t265096\nGuardians\t265097\n截止期\t265098\n鹅岭二厂\t265099\n斯坦李\t265100\n云中\t265101\nlayer\t265102\n陈荣\t265103\n轴流式\t265104\n小评\t265105\n汉密尔顿岛\t265106\nCNC编程\t265107\n蒋欣\t265108\nHNO3\t265109\n三体\t265110\n易欣\t265111\n闫怀礼\t265112\n高低床\t265113\n缙云烧饼\t265114\n天佛\t265115\nMcGill\t265116\nsalua\t265117\n李海洋\t265118\n招租\t265119\nTensorSense\t265120\ntunnels\t265121\n宗亲会\t265122\n汉帝国\t265123\n珍藏集\t265124\n巩新亮\t265125\n山红\t265126\n狂怒\t265127\n率真\t265128\n江东镇\t265129\nglee\t265130\n桑蚕\t265131\n性服务\t265132\nJdon\t265133\n五灵脂\t265134\n给以\t265135\n金斯瑞\t265136\n北京奥森公园\t265137\n兹兹\t265138\n2月16日\t265139\n招商银行\t265140\n改观\t265141\n525Li\t265142\n蓝菲菲\t265143\n送话器\t265144\n广东第二师范学院\t265145\n2002级\t265146\n65.0\t265147\n奥园\t265148\n240w\t265149\n1.el7.x86\t265150\n保利学府城\t265151\n小朋\t265152\n巨人集团\t265153\n于洪新城\t265154\n基督宗教\t265155\n九真九阳\t265156\n魔力红乐队\t265157\n大流\t265158\n蚌埠论坛\t265159\n二元二次方程组\t265160\ntitanium\t265161\nsnuff\t265162\n椎名そら\t265163\n振翅高飞\t265164\n冯梦龙\t265165\n网龙\t265166\n结束符\t265167\n钱钱\t265168\n19期\t265169\n2.73\t265170\n环物\t265171\n心痛不已\t265172\n陪女\t265173\n黄山松\t265174\nJTAG\t265175\n沈家湾客运码头\t265176\n一审行政判决书\t265177\n长安cs75\t265178\nCosts\t265179\nfpx\t265180\n筱\t265181\n正步\t265182\n哈高科\t265183\n8套卷\t265184\n三急\t265185\n土话\t265186\n苗银\t265187\n霍顿\t265188\nJadePeng\t265189\n冰刺\t265190\n辣木籽\t265191\nGBQ4.0\t265192\ngpm\t265193\n沃尔沃集团\t265194\n查柜\t265195\nDevOps\t265196\n乌兰察布市人民政府\t265197\n97973手游网\t265198\n晶地\t265199\n佳吉物流\t265200\n螺鲤\t265201\n沧桑感\t265202\n小优\t265203\nLevy\t265204\n鱼嘴\t265205\n专升本\t265206\n小营街道\t265207\n多爱\t265208\n龄\t265209\n自提\t265210\n突袭2\t265211\n陈楚河\t265212\nunregister\t
265213\n日常工作\t265214\n集气\t265215\n板栗\t265216\n高速路口\t265217\nSN\t265218\n香墨\t265219\n上海市肿瘤医院\t265220\n云鸟\t265221\n福克斯ST\t265222\n酷q\t265223\n应力场\t265224\ncriticism\t265225\n郑建新\t265226\n餐桌\t265227\nSERV\t265228\n红巾军\t265229\n张家川回族自治县\t265230\n艾米果\t265231\nsams\t265232\n两率\t265233\n万色\t265234\n石可\t265235\ntown\t265236\nwzjhoutai\t265237\n游戏规则\t265238\n大面\t265239\n红字增值税专用发票\t265240\n中共贵阳市纪律检查委员会\t265241\n习题册\t265242\n斯德哥尔摩综合征\t265243\n临夏市\t265244\n挂上\t265245\n280tsi\t265246\n广兴\t265247\n苏宁国美\t265248\n192.168.1.5\t265249\n拆车坊\t265250\n冰风谷2\t265251\n数学分析\t265252\n哀思\t265253\n锚点\t265254\n悬户\t265255\n初等数学\t265256\n谢津\t265257\n长沙市住房保障服务局\t265258\n三乱\t265259\n胸外科\t265260\nGSM/3G卡\t265261\n塔罗牌在线占卜\t265262\n译丛\t265263\n魔道祖师\t265264\nMobileye\t265265\n百合花开\t265266\n袋笼\t265267\n孽主\t265268\n正佳\t265269\n收集器\t265270\n大江健三郎\t265271\n美英法联军\t265272\nJQuery选择器\t265273\nearphone\t265274\n东方在线\t265275\n20150127\t265276\n美洲大蠊\t265277\n事项\t265278\n广州图书馆\t265279\n命运之手\t265280\n规划馆\t265281\n角式\t265282\n报送\t265283\n镀层\t265284\n雪铁龙C5论坛_汽车之家论坛\t265285\n一生一世一双人\t265286\n机械学院\t265287\n无败\t265288\n蜜桃成熟时2\t265289\n一沙\t265290\n国芳\t265291\n质检总局特种设备局\t265292\n追问\t265293\n剪掉\t265294\n明华\t265295\n唱歌机\t265296\n10000万\t265297\n京珠高速\t265298\nRunPorn\t265299\nNo.5\t265300\nALICE\t265301\n甜酸\t265302\n社区型\t265303\n_千\t265304\n一百万次\t265305\n2260\t265306\nreadily\t265307\n第三十四集\t265308\n十里香\t265309\n尾灯\t265310\n招商加盟网\t265311\nTibetan\t265312\n4点半\t265313\n8}\t265314\n汽车报\t265315\n被虐\t265316\nnengyuan\t265317\n益赛普\t265318\n紧身裤\t265319\nONO\t265320\n鼻科\t265321\noutloo\t265322\nuhd\t265323\n美宣\t265324\n八十米\t265325\n古墓群\t265326\ninterllij\t265327\n鱼化\t265328\n骰\t265329\n2626\t265330\n奇才\t265331\n招商雍华府\t265332\n虚心\t265333\n48片\t265334\n充公\t265335\n创艺\t265336\n克丽丝\t265337\nSLC\t265338\nVenice\t265339\n人大常务委员会\t265340\n佛说大乘无量寿庄严清净平等觉经\t265341\n光路\t265342\n日月光中心广场\t265343\n万科中心\t265344\n截然相反\t265345\n克里斯蒂亚诺·罗纳尔多\t265346\n学习群\t265347\n徐瑾\t265348\n闾山\t265349\n道场\t265350\n连不起\t265351\n武汉地铁1号线\t265352\n三郎镇\t265353\n风物\t265
354\n港星\t265355\n002371\t265356\n宋祖德\t265357\n科学技术观\t265358\n李彩霞\t265359\nSpring4\t265360\n翘板开关\t265361\nMarkdownPad\t265362\n洛夫\t265363\n166言情小说阅读网\t265364\n28周后\t265365\n酷狗音乐盒\t265366\n步天纲\t265367\n二层\t265368\n实收资本\t265369\n路友\t265370\n挑战赛\t265371\n兰多夫\t265372\n即墨市委政府\t265373\n上坪村\t265374\n金光大道\t265375\n液晶玻璃\t265376\ndyson\t265377\n领寓\t265378\ncloud\t265379\n垫支\t265380\n中小学教育网\t265381\n花桥站\t265382\n苟利国家生死以\t265383\n惊悚\t265384\n恒大金碧物业\t265385\n名门府\t265386\n麻雀变凤凰\t265387\n贰级\t265388\n8805\t265389\n合成类\t265390\n废料\t265391\n台证\t265392\n侧平石\t265393\n商业模式分析\t265394\nstripped\t265395\n周小平\t265396\n置业\t265397\n4.02\t265398\n半岛酒店\t265399\n猪肝粉\t265400\n水莓\t265401\n天国拯救吧\t265402\n符文之语\t265403\n微信圈\t265404\n西塞山区\t265405\nPanjiva\t265406\n大连财经学院\t265407\n大航海家\t265408\n萨莎\t265409\n怀素\t265410\n用卷\t265411\n5ghz\t265412\n中央音乐学院考级委员会\t265413\n秋石\t265414\n熔盛重工\t265415\n循序\t265416\n上海理工大学图书馆\t265417\n音素\t265418\nmadVR\t265419\n青霜\t265420\n独孤剑\t265421\ntbk\t265422\nA1278\t265423\n上辈子\t265424\nroughness\t265425\n1478\t265426\npeng\t265427\nrushed\t265428\n新都视界网\t265429\n汪顺\t265430\n四季海棠\t265431\n2018年3月17日\t265432\n童车\t265433\n京山论坛\t265434\n程式化\t265435\n杂技\t265436\ncrushing\t265437\n1[\t265438\n丽音\t265439\n非保本\t265440\nDruid\t265441\ni34160\t265442\n阴郁\t265443\n日本美女邪恶漫画大全_邪恶gif动态图_邪恶漫画少女漫画\t265444\n绝世神功\t265445\n车间\t265446\n同素异形体\t265447\n帧卡顿\t265448\n霍建起\t265449\n入菩萨行论\t265450\n毁坏\t265451\n落地镜\t265452\n电流声\t265453\n魔切\t265454\n最后的灰姑娘\t265455\nHaagen\t265456\nY55A\t265457\n实尾岛\t265458\n日照职业技术学院\t265459\n冷吨\t265460\n孙子兵法与三十六计\t265461\n鳞\t265462\n李玖哲\t265463\n冬狮郎\t265464\n不动\t265465\n国王大道\t265466\n王大拿\t265467\n360全景图\t265468\n富春\t265469\n异地公积金贷款\t265470\n中共江油市委\t265471\n第20话\t265472\n4座\t265473\n周儿\t265474\nbkd\t265475\n玄兽\t265476\n388.71\t265477\n战斗服\t265478\n节能灯\t265479\n合江县\t265480\n阿兰\t265481\n功血\t265482\n机械基础\t265483\n胎生\t265484\n三原\t265485\n宝文\t265486\nmatplot\t265487\n导游\t265488\n太古神王\t265489\n英国政府\t265490\n增值税附加税\t265491\n入土\t265492\n边桌\t265493\n考查\t265494\n清华大学学报\t265495\n远路\t26
5496\n九八\t265497\n天津学校\t265498\n防水箱\t265499\n2824\t265500\n圣塔\t265501\n博听网\t265502\nNBIoT\t265503\n影视类\t265504\n振芯科技\t265505\nStore\t265506\n红毛药酒\t265507\n原子贷\t265508\n易登网\t265509\n亚盘\t265510\n第一颗\t265511\nmjpeg\t265512\n林雷\t265513\n風\t265514\n华贵\t265515\nIdentityServer4\t265516\n隔世追凶\t265517\nFilza\t265518\n兴校\t265519\nLUCK\t265520\n哈尔滨银行\t265521\n小婷\t265522\n48款\t265523\n子夜\t265524\n三菱重工中央\t265525\nUtils\t265526\n企鹅\t265527\n九阅\t265528\n龙塘镇\t265529\n居左\t265530\n35块\t265531\n攻\t265532\n药用植物学\t265533\n斜刘海\t265534\n明月照\t265535\n后视镜版\t265536\n木友\t265537\n仙灵物语\t265538\n3d2018\t265539\n林小宅\t265540\napartments\t265541\n开平路\t265542\n鹰酱\t265543\n北京市政交通一卡通\t265544\nboya\t265545\n纵联\t265546\nScala\t265547\n中兴微电子\t265548\n杏林\t265549\n2015年中\t265550\n100千瓦\t265551\n_团结网\t265552\n思园\t265553\nboys\t265554\n60L\t265555\n第113期\t265556\n桑白皮\t265557\nzha\t265558\n莎尔娜\t265559\n海信\t265560\n长征十一号\t265561\n短信\t265562\n梅洛\t265563\n心平气和\t265564\n安证\t265565\nb350f\t265566\n覃辉\t265567\n顶格申购\t265568\n枸橼酸莫沙必利片\t265569\na15\t265570\n学府\t265571\n吐息\t265572\n元ev360\t265573\n人民日报新知新觉\t265574\n人文社会科学版)\t265575\nEclipsepedia\t265576\n封开\t265577\n堪察加半岛\t265578\n林\t265579\nwin服\t265580\n下酒\t265581\n试运转\t265582\n皋兰县\t265583\n独处\t265584\n荣事达\t265585\n洗发膏\t265586\n陈剑锋\t265587\n成都企聚网\t265588\nLufthansa\t265589\n产家\t265590\n战神5\t265591\n偏置电路\t265592\n热血篮球\t265593\n教科\t265594\n立讯\t265595\n吊磨机\t265596\n酱汤\t265597\nNECPS\t265598\n管线钢\t265599\n先贤祠\t265600\nKt\t265601\nintertek\t265602\n孟丽君\t265603\n崇寿镇\t265604\n占便宜\t265605\nFormatter\t265606\nclumsy\t265607\n太湖国家湿地公园\t265608\n斗罗百万亚瑟王\t265609\n碧水湾\t265610\n苹果绿\t265611\n1.43G\t265612\n红卫\t265613\nmystery\t265614\n1000万吨\t265615\n提问帖\t265616\n切割\t265617\n影音先锋电影资源看片网站_吉吉影音\t265618\n丙辰月\t265619\n吉他中国新闻站\t265620\n玛茜\t265621\n滑雪\t265622\nZ2\t265623\n暴王\t265624\n星际战甲_17173\t265625\n蒺藜\t265626\nsneaky\t265627\n老虎凳\t265628\n双糖\t265629\n恰逢\t265630\n许诺\t265631\n松坪沟\t265632\n萌版\t265633\nClyde\t265634\n2018年4月28\t265635\n赤峰黄金\t265636\n水烟壶\t265637\n10、15、30天\t265638\n北练\t26
5639\n观音菩萨吧\t265640\n犬薇\t265641\n百谷歌\t265642\n神马影视-神马电影\t265643\n银矿\t265644\n淘咖啡\t265645\n20150724\t265646\n有道词典\t265647\n传奇世界H5\t265648\n仙霞\t265649\n有升\t265650\n周树森\t265651\n重庆酸辣粉\t265652\npresidential\t265653\n刘道玉\t265654\n李春\t265655\n木鳖子\t265656\n申国富\t265657\n楚汉\t265658\n明清史\t265659\nElectrical\t265660\n白银乡\t265661\n铁山镇\t265662\n91wan\t265663\nspecs\t265664\n无屏电视\t265665\n鹏网\t265666\n体格检查\t265667\n重量轻\t265668\n子爵\t265669\n官方简\t265670\n数字温度计\t265671\n耿耿余淮\t265672\n广西住房和城乡建设厅培训中心\t265673\n32斤\t265674\nJudge\t265675\n韵母\t265676\n店长\t265677\n湛江市国土资源局\t265678\n倡仪\t265679\n个个\t265680\n洛克岛\t265681\n南方医科大学南方医院\t265682\n李开心\t265683\nWrite\t265684\n言喻\t265685\n朝神\t265686\n甜\t265687\n莫艳琳\t265688\n腾彩PIXMA\t265689\n夏各庄\t265690\n编印\t265691\n6月16\t265692\n斯柯达昕锐\t265693\n还要多\t265694\nGinseng\t265695\nc段\t265696\ngmail邮箱\t265697\n刹车盘\t265698\nPID\t265699\n烦扰\t265700\n70倍\t265701\n上海交通大学药学院\t265702\n熊向晖\t265703\n肝功\t265704\n垂体泌乳素\t265705\nn270\t265706\n乳模\t265707\n注释\t265708\n3D试机号\t265709\n液晶拼接\t265710\n白眼狼\t265711\nMASM32\t265712\n凯恩斯主义\t265713\nCigarettes\t265714\n认识三角形\t265715\n杆身\t265716\n苏凯\t265717\n徐建国\t265718\n横波\t265719\n污片\t265720\n县法院\t265721\n放荡不羁\t265722\n汇川区\t265723\nCXC\t265724\n豪禾\t265725\n10D\t265726\n银湖\t265727\n中科院\t265728\n谷里\t265729\n体球\t265730\n直动式\t265731\n于春\t265732\n舘\t265733\n天皇\t265734\nJaya\t265735\n亡妻\t265736\n风色幻想4\t265737\n李保芳\t265738\nD15\t265739\n脊索瘤\t265740\n锦业\t265741\n科密指纹考勤机\t265742\n磅秤\t265743\n山崎\t265744\n铰支\t265745\noffice2015\t265746\n高晓菲\t265747\n福星路\t265748\n世园会\t265749\n自动关\t265750\nTensorBoard\t265751\n豆袋\t265752\n多号\t265753\n拼接\t265754\n暂住证明\t265755\n万能粉碎机\t265756\n青岛方特\t265757\n温标\t265758\n独立团\t265759\n紫笋茶\t265760\nsetfacl\t265761\n定位器\t265762\n餐卡\t265763\n啤酒\t265764\nHDMI\t265765\n前中期\t265766\n第25期\t265767\n比亚迪元EV360\t265768\n华元\t265769\nAxis2\t265770\nWorl\t265771\n足球操\t265772\n5750G\t265773\ncodepage\t265774\n足音\t265775\n加洲\t265776\n过缓\t265777\nWatermark\t265778\n重庆百货\t265779\nSei\t265780\n股沟\t265781\n区委办公室\t265782\n孙猴子\t265783\n加布
里埃尔\t265784\n曲靖一中\t265785\nresttemplate\t265786\n十世\t265787\n中国赛宝实验室\t265788\n紫狂\t265789\n后勤部\t265790\n第五元素\t265791\n钱文忠\t265792\ng610\t265793\n遮住\t265794\nッ\t265795\n竞投\t265796\nTrains\t265797\nbrooklyn\t265798\n高速时间\t265799\n7789\t265800\n泸州市公安局\t265801\n301句\t265802\nbmr\t265803\n80周岁\t265804\n七十年前\t265805\n免税品\t265806\n悠游\t265807\nProceedings\t265808\n7份\t265809\n1月1\t265810\n笔记法\t265811\n极神\t265812\n加德士\t265813\n张晓东\t265814\n梁单元\t265815\n郭明义\t265816\n2730\t265817\n忍不住\t265818\n古古漫画网\t265819\n盟约\t265820\n司法机关\t265821\n侯珠\t265822\n惊鸿一面\t265823\n齐乐无穷\t265824\n五批\t265825\njcw\t265826\n几次方\t265827\n洽谈\t265828\n串珠子\t265829\nyield\t265830\n糖粉\t265831\n新新网\t265832\n伯乐奖\t265833\n雪地车\t265834\n非酒精性脂肪肝\t265835\n邮发\t265836\n天窗\t265837\n地狱\t265838\n本金\t265839\n老头儿\t265840\nPRODUCT\t265841\nMissouri\t265842\n人力资源管理\t265843\nlighttpd\t265844\nJuphy\t265845\n忙人\t265846\n油雾过滤器\t265847\n巴黎戴高乐机场\t265848\nWord天\t265849\nAED\t265850\n吸料\t265851\n面乳\t265852\n3月16\t265853\n考析\t265854\nCString\t265855\n2015新年\t265856\n火焰山\t265857\n饶毅\t265858\n院桥镇\t265859\nWin7/8\t265860\nsharen\t265861\n第61\t265862\n李佩芷\t265863\n现金类\t265864\n林天\t265865\n业之峰\t265866\n知乎\t265867\n109集\t265868\n中渝广场\t265869\nEndpoint\t265870\nkaori\t265871\n白塔镇\t265872\n日迹\t265873\n扫花网\t265874\ncing\t265875\n山隐\t265876\n科技处\t265877\n煤粉\t265878\n天件\t265879\nEDIFIER\t265880\n鼎盛\t265881\n通天\t265882\n军歌网\t265883\n粪菌\t265884\nUNICEF\t265885\ninte\t265886\n液氮\t265887\n无限试驾2\t265888\nsurfacelaptop\t265889\nKPOP\t265890\n唯淘网\t265891\n活动群\t265892\n颤颤\t265893\n导读\t265894\n芳香族\t265895\n几百兆\t265896\nsoftwarefang\t265897\n酿皮\t265898\n好孕帮\t265899\n童文俊\t265900\n防霉片\t265901\n孤残\t265902\n北京科技大学\t265903\nl2\t265904\n小店镇\t265905\n贸易型\t265906\n迪卡龙\t265907\n成人版\t265908\n陌路情深\t265909\n宁美\t265910\n流程\t265911\n车船\t265912\n工作报表\t265913\n湖广\t265914\nexporter\t265915\nuv400\t265916\n颜丙燕\t265917\n京瓷TASKalfa\t265918\n张家楼\t265919\nbraces\t265920\nkb4012212\t265921\n建业地产股份有限公司\t265922\n如尘\t265923\n小米生态链\t265924\n抗病\t265925\n心动周期\t265926\n第六步\t265927\n不
测风云\t265928\n医用离心机\t265929\n敲诈\t265930\n快点儿\t265931\n通病\t265932\n不解之缘\t265933\nVietnamPlus\t265934\n30个工作日\t265935\ne罩杯\t265936\n55YOU五游网\t265937\n国贸店\t265938\n郭俊峰\t265939\n梦里人\t265940\n豪格\t265941\n乐从家具城\t265942\nFFC\t265943\n淘宝贝\t265944\nrecyclerview\t265945\n失去\t265946\n制变\t265947\nチェック\t265948\n六堡茶\t265949\n601939\t265950\n超人软件站\t265951\n郑凯\t265952\n极管\t265953\n拨牙\t265954\n海气\t265955\n陈金\t265956\n吉他谱-钢琴谱-电子琴谱-手风琴谱\t265957\n恩义\t265958\n阿尔伯塔省\t265959\n养生汤\t265960\n大同云冈石窟\t265961\n恐怖星球\t265962\n港交所\t265963\n运动员村\t265964\n16场\t265965\n开阳县\t265966\n参透\t265967\nlalalalala\t265968\ngetway\t265969\n南翔镇\t265970\n蛋白糖\t265971\n虚构\t265972\n包体\t265973\n2018.3.19\t265974\nvolumes\t265975\nbootstra\t265976\nIHG\t265977\n卷板机\t265978\n蛋液\t265979\n尼斯\t265980\n马虹\t265981\n射脸\t265982\n魔豆\t265983\n国土资源交易网\t265984\n亚太药业\t265985\n中山大学数学学院\t265986\n高术\t265987\nBullets\t265988\n红书\t265989\njinkens\t265990\nFacerig\t265991\n格来云\t265992\n新西兰\t265993\n炊事\t265994\n1460\t265995\n中华全国工商业联合会\t265996\n湖北一区\t265997\n踢球\t265998\n压缩版\t265999\n沈明\t266000\n静压桩\t266001\n荒木蕾娜\t266002\n反折\t266003\n成都人大\t266004\ngel\t266005\n豪爵悦星\t266006\n小鸡不好惹\t266007\n百安\t266008\nDateTimeBox\t266009\nリョナ\t266010\n楚天皓月0713\t266011\n流挂\t266012\n氮化炉\t266013\n淄博火车站\t266014\nONENOTE\t266015\n龙腾四海\t266016\n路特仕\t266017\n科立\t266018\n皇帝成长计划2\t266019\n丙辰\t266020\n风之杖\t266021\n浙江美术馆\t266022\n银座佳驿\t266023\n万向集团\t266024\nv9.8\t266025\n林州\t266026\n天马路\t266027\n项目管理专业\t266028\n和讯理财\t266029\n林下经济\t266030\n第129\t266031\n万讯\t266032\n眼镜盒\t266033\nAUKG\t266034\n兰若\t266035\npmi\t266036\n新西塘孔雀城\t266037\n南宁市国税局\t266038\n金环宇\t266039\n上海国际医学园区\t266040\n10年内\t266041\n砚石\t266042\nxsl\t266043\nZUKO\t266044\n汾阳路83号\t266045\n电器件\t266046\n江淮时报\t266047\n宣伟\t266048\n可视区\t266049\n裁判文书网\t266050\n五班\t266051\n辣文合集\t266052\nmalformed\t266053\n德清县政府\t266054\n无妨\t266055\n财新传媒\t266056\n25话\t266057\n云币\t266058\nDenise\t266059\nesxi虚拟机\t266060\nKAORI\t266061\n妖气\t266062\n接地装置\t266063\n车牌\t266064\ncad3d\t266065\n6首\t266066\n龙笛\t266067\n第七任\t266068\n臀肌挛缩症\t266069
\n土堆\t266070\n浙江省纪委省监委\t266071\n清管器\t266072\n丁克\t266073\n四十斤\t266074\n死方\t266075\n知识产权学院\t266076\n鞠文娴\t266077\n19秒\t266078\n拜\t266079\n@ResponseBody\t266080\nBridgestone\t266081\n顺德新闻网\t266082\nunhappy\t266083\ngephi\t266084\n11.06\t266085\n吉他谱c调\t266086\n飼育\t266087\nn5010\t266088\nTAT\t266089\n急促\t266090\n清溪川\t266091\nsimufact\t266092\n卡拉克西\t266093\n汪云飞\t266094\n严华\t266095\n2017.03\t266096\nMutations\t266097\n双阳\t266098\n贵州省人民政府\t266099\nPEK\t266100\n加速版\t266101\n中国办公室\t266102\n李桥镇\t266103\n普通心理学\t266104\nDirectShow\t266105\n10.5\t266106\n胸椎压缩性骨折\t266107\n索恩\t266108\n虐心文\t266109\n取名\t266110\n黄灿\t266111\n影汇\t266112\nSmartGit\t266113\n进攻\t266114\n中医药法\t266115\n术语\t266116\n没看过\t266117\n美果\t266118\ntouch\t266119\n逐句\t266120\n朝阳市\t266121\nmng\t266122\n晋祠公园\t266123\n山师大\t266124\n派出所\t266125\n三百番\t266126\n子页\t266127\n通胀率\t266128\n王仪涵\t266129\n华为HUAWEI\t266130\n3003_叶海波\t266131\nchop\t266132\n研究岗\t266133\n英皇娱乐\t266134\n罗玥\t266135\n3301\t266136\n沿程\t266137\n青年公寓\t266138\n再回首\t266139\n威信\t266140\n天都峰\t266141\n第二次世界大战史\t266142\n事关\t266143\nAOP\t266144\n星厨\t266145\n雷姆\t266146\n居众\t266147\n百合种\t266148\n盐山全通管道有限公司\t266149\n社保计算器\t266150\nmechanisms\t266151\nYoutuber\t266152\n海岳\t266153\nReader\t266154\nrpgitem\t266155\nTopaz\t266156\n检查仪\t266157\nunite\t266158\n挂灯\t266159\n北京东方医院\t266160\n生命壹号\t266161\n广州市旅游局\t266162\n助理医师\t266163\n足球世界杯\t266164\nV2.2\t266165\n赵高\t266166\n车公庙\t266167\n上海同仁医院\t266168\n摇粒\t266169\n3档\t266170\n张影\t266171\n大钟寺\t266172\n缓控\t266173\n背地\t266174\n差分阻抗\t266175\n益处\t266176\n烟雾机\t266177\n陈玲玲\t266178\n大氅\t266179\n女体\t266180\n滞报\t266181\n彩京\t266182\n孔树脂\t266183\n分心\t266184\n大立\t266185\n藤蔓\t266186\n佛山移动\t266187\n娱乐贴\t266188\n魏王\t266189\n叮咚声\t266190\n美系\t266191\n金刚经\t266192\n扒拉\t266193\n二战史\t266194\n瑞虎论坛_汽车之家论坛\t266195\n多渠道\t266196\n美吉\t266197\n接柱\t266198\n3第一\t266199\n水化物\t266200\ninfluential\t266201\n坚决\t266202\n叼\t266203\n炸机\t266204\n魏佳艺\t266205\nivs\t266206\n球生\t266207\n兼容性\t266208\n周雄\t266209\n意难忘\t266210\n咨询工程师考试\t266211\n武汉大学政治与公共管理学院\t266212\n道路运输管理局\t
266213\n电焊机\t266214\nconfucius\t266215\n炔\t266216\nKansas\t266217\n耐磨板\t266218\n工作简报\t266219\n江苏公务员考试网\t266220\nfread\t266221\n可耐福石膏板\t266222\n时文\t266223\nTrumpet\t266224\n奇方\t266225\nbin.jar\t266226\n97家\t266227\n木质化\t266228\n小米通讯技术有限公司\t266229\n自重应力\t266230\n戒掉\t266231\n旷达\t266232\nmstp\t266233\n羽月ミリア\t266234\n异构化\t266235\n汉诺威工博会\t266236\n美的空调遥控器\t266237\nIDEA2017\t266238\n晁\t266239\ndaiso\t266240\n传奇世界单机版\t266241\npyd\t266242\n天门论坛\t266243\n经典战\t266244\n伊利\t266245\n推覆\t266246\nNiubo\t266247\nbendibao\t266248\ndepartments\t266249\n贵州茅台酒股份有限公司\t266250\n中国名牌大学\t266251\n神仙道2016\t266252\n好慢\t266253\n白色生死恋\t266254\n公开发行证券的公司信息披露编报\t266255\n新建项\t266256\n上船\t266257\ncompass\t266258\n触摸精灵吧\t266259\nKant\t266260\n杠铃\t266261\n20161215\t266262\n青梅竹\t266263\n卫哲\t266264\nkex\t266265\nkl\t266266\n漫反射\t266267\n三鲜豆皮\t266268\n插本\t266269\n粒度分析仪\t266270\nSold\t266271\n万仟堂\t266272\n龙星\t266273\n还珠楼主\t266274\n岐\t266275\n退伍\t266276\n土崩瓦解\t266277\n列表格\t266278\n机选\t266279\n反腐倡廉网\t266280\n精魂\t266281\n棘轮式\t266282\n江西农大\t266283\n浮世\t266284\n259luxu\t266285\n我社\t266286\nx5max\t266287\n利亚姆\t266288\n女驸马\t266289\nbmob\t266290\n小屯镇\t266291\n大才\t266292\n金寨南路\t266293\n箭筒\t266294\n打伤\t266295\n下午1点\t266296\n哈维\t266297\n海通期货\t266298\n云栖社区\t266299\nBOIS\t266300\n加劲\t266301\n爱河\t266302\ninn\t266303\n三例\t266304\n一丁点\t266305\n长杖\t266306\nNegative\t266307\n0396\t266308\nrz/sz\t266309\n黑闪\t266310\n抹去\t266311\n郗\t266312\n伟文\t266313\n第四十次\t266314\n牙库\t266315\n浙江省质监局\t266316\ndigitalmicrograph\t266317\n喜怒\t266318\nSUNY\t266319\n海南银行\t266320\n海豚浏览器\t266321\n沿河城\t266322\n思维与智慧\t266323\n时工\t266324\n胜任\t266325\n硅油\t266326\nvxworks\t266327\n偏光镜\t266328\n勇敢一点\t266329\n专业级\t266330\n海贼王1\t266331\n银汉\t266332\n电动牙刷头\t266333\n十五\t266334\n巴山夜雨\t266335\n肚型\t266336\n红灯记\t266337\n程序媛\t266338\n运输费\t266339\n重写\t266340\nlesbians\t266341\n人民网海南视窗\t266342\n静冈机场\t266343\nlibmcrypt\t266344\n徽章\t266345\n开会\t266346\nifeng\t266347\n固德\t266348\nbb弹\t266349\n新河街\t266350\n豆乳\t266351\n不逊\t266352\n球棒\t266353\n0419\t266354\n大抵\t266355\n插电\t
266356\n荣耀v10\t266357\n3克\t266358\n浮灯\t266359\n光洁\t266360\n滨州港\t266361\n闵行区人民政府\t266362\n彰化县\t266363\n之二十四\t266364\n父母课堂\t266365\nfifa2018\t266366\nluk\t266367\n度数\t266368\n3KW\t266369\n伊吕波\t266370\n奇变偶\t266371\n克莱恩\t266372\n聊天机器人\t266373\nClancy\t266374\nhigh2\t266375\n彩瓦\t266376\n迪士尼小镇\t266377\n定距变量\t266378\n第12集\t266379\nstdout\t266380\n神阙穴\t266381\n5月15号\t266382\n殊荣\t266383\n苏荷\t266384\n120期\t266385\n正轴\t266386\n咔咔咔\t266387\n血清淀粉酶\t266388\n马润\t266389\n辽宁大学\t266390\n锻冶屋英雄谭\t266391\n湖南化工职业技术学院\t266392\n酩悦\t266393\n高隆湾\t266394\n体貌\t266395\nconsultants\t266396\nGamble\t266397\n征订\t266398\n辛庄\t266399\nillustrato\t266400\n上海园林\t266401\n绛县\t266402\n季札\t266403\n丰都县\t266404\nV2.3.0\t266405\n香椎梨亚\t266406\n湖北省政府法制网\t266407\n虫群\t266408\n流行用语\t266409\n袁氏\t266410\n刘亚茹\t266411\n大野克夫\t266412\n耳帝\t266413\n燕麦米\t266414\nbrunette\t266415\n种植池\t266416\n彭国甫\t266417\ndete\t266418\n深圳武腾科技有限公司\t266419\n麻疹疫苗\t266420\nxlsm\t266421\n掘进\t266422\n日月光\t266423\n灰汤\t266424\n夺回\t266425\n浅绿色\t266426\n27周年\t266427\n变形金刚3\t266428\n李树\t266429\n檀道济\t266430\n电机车\t266431\n南昌大学二附院\t266432\nwacc\t266433\n二氧化氯泡腾片\t266434\n三朵花\t266435\n2007级\t266436\nWarfare\t266437\n雷震\t266438\n第233集\t266439\n秀字\t266440\n洗衣厂\t266441\nsoosi1\t266442\n珍邮\t266443\n绘色千佳\t266444\n迷人眼\t266445\nican\t266446\n工运\t266447\n多孔介质\t266448\nUPDATED\t266449\n吸取\t266450\n交上\t266451\n35KV\t266452\n新野县\t266453\n国鸟\t266454\n打品\t266455\n民科\t266456\n鸡眼\t266457\n路板\t266458\n普乐\t266459\n真好听\t266460\n成都大熊猫繁育研究基地\t266461\n楚商\t266462\n金砖\t266463\n30余\t266464\n铀矿\t266465\n中原镖局\t266466\n寒玉\t266467\n天泽\t266468\ndigitizer\t266469\n防御型\t266470\n瓦斯炉\t266471\n足球\t266472\n极其\t266473\n听懂\t266474\n一神\t266475\n顶友\t266476\n南京理工大学\t266477\n绩效管理\t266478\n系统\t266479\n乌达木\t266480\n有处\t266481\n不动户\t266482\n压抢\t266483\n金社广场\t266484\n200gana\t266485\n瓜子机\t266486\n大秦帝国之纵横\t266487\n西铁\t266488\n瓶装\t266489\n环境保护厅\t266490\n热点\t266491\n夺锋\t266492\n罪过\t266493\n血透室\t266494\ninvolved\t266495\n8989\t266496\n耿莲凤\t266497\n官科\t266498\npojo\t266499\n雷霆游戏\t266500\n40名\t266501\n小桔灯
\t266502\nJOBS\t266503\n几辈子\t266504\n撞击式\t266505\n大溪\t266506\n方正扫描仪\t266507\n杭州外国语学校\t266508\n风景区\t266509\nSheraton\t266510\n机工\t266511\n维生素C\t266512\nCertificate\t266513\n多频\t266514\n乡村欲爱\t266515\n上午9时\t266516\n网王\t266517\n中新大厦\t266518\n洗濯\t266519\n武大\t266520\n2.4版\t266521\n体式\t266522\n中父\t266523\n裴珍\t266524\n2030年\t266525\n圳市\t266526\n海商\t266527\n模式化\t266528\n文件夹名\t266529\n原味鸡\t266530\n滚轴\t266531\nC币\t266532\n训犬\t266533\n5188\t266534\n大公资讯_大公网\t266535\n生活水\t266536\n常务理事会\t266537\n三厢版\t266538\n英雄联盟S6\t266539\n中国药师协会\t266540\nwin系统\t266541\n卡城\t266542\n字典值\t266543\n晨阳水漆\t266544\n500米口径球面射电望远镜\t266545\n南山区政府\t266546\nNominal\t266547\n宋初\t266548\n中共二大\t266549\nUnit1\t266550\nwindowser\t266551\n鉴定符\t266552\n湖北小学\t266553\n爬架网\t266554\n学雷锋日\t266555\nsongtaste\t266556\nOBU\t266557\nyangy\t266558\n跃变\t266559\n林珊\t266560\n乌菲兹美术馆\t266561\n爱唯欧论坛_汽车之家论坛\t266562\n贵南\t266563\n马钢集团\t266564\n鱼龟\t266565\ntinyxml2\t266566\ncapita\t266567\nSCG\t266568\n13集\t266569\n罪状\t266570\n计算机二级考试\t266571\n刘媛媛\t266572\n空桶\t266573\n0.3.6\t266574\n花园村\t266575\n阴寒\t266576\n保温杯\t266577\n高校龙中龙\t266578\n转角柜\t266579\n颈环\t266580\n优傲\t266581\n失恋日\t266582\n硅酮密封胶\t266583\n视觉传达\t266584\n炸串\t266585\n忆先\t266586\nwin7bios\t266587\n许钧\t266588\n4E\t266589\n欧韩\t266590\n青岛市南区\t266591\n护胸\t266592\n3狂猎\t266593\n民舞\t266594\n整容\t266595\nfzhouy\t266596\n求指点\t266597\n勤智\t266598\n玉马\t266599\n信元\t266600\n广园东路\t266601\n氢燃料电池汽车\t266602\n海洋之灾\t266603\n泡茶机\t266604\n高炮\t266605\n故居\t266606\n五冠\t266607\n四台\t266608\n共性\t266609\n鱼跃制氧机\t266610\n板卡\t266611\n陆远\t266612\n卡尔马克思杯\t266613\nCASTER\t266614\n确有其人\t266615\n花好月圆\t266616\nIRR函数\t266617\n12.13\t266618\n和酒\t266619\n营养米粉\t266620\n真仙\t266621\n右位\t266622\n胆汁\t266623\n戊辰\t266624\n五辆\t266625\ninnerhtml\t266626\nSatin\t266627\npredicate\t266628\n江波亮\t266629\n租赁服\t266630\n小铺\t266631\ngitHub\t266632\n凤凰展翅\t266633\n阎立本\t266634\n龙泉新闻网\t266635\n25周\t266636\n_汽车产经网\t266637\n箐\t266638\n茴香\t266639\n女相\t266640\n莱蒙托夫\t266641\n嘿秀\t266642\n叩拜\t266643\n南门村\t266644\n扣罚\t266645\n称好\t266646\n100亿\t266
647\n反恐主义\t266648\n赛瑞\t266649\n小米糕\t266650\n南约\t266651\nprophere\t266652\n永城市政府网\t266653\nvivid\t266654\nTJ\t266655\nAli\t266656\nstackoverflow\t266657\n计价表\t266658\n同等学力\t266659\n浜崎真绪\t266660\nmartens\t266661\n体绘制\t266662\n莱佛士\t266663\nmercer\t266664\n华为荣耀7x\t266665\n烟台大学文经学院\t266666\nWRT1900AC\t266667\n祝愿\t266668\n欲壑\t266669\n狼行\t266670\n运\t266671\nmalicious\t266672\n巴塞罗那队\t266673\n声控\t266674\n机器换人\t266675\n颐园\t266676\n万能空调遥控器\t266677\n嗨团\t266678\nonkey\t266679\n一两天\t266680\n15j401\t266681\n63000\t266682\n启动盘\t266683\n卡帝\t266684\n全堂\t266685\n天门南站\t266686\n湄洲湾职业技术学院\t266687\n5082\t266688\n自激式\t266689\n先行登记\t266690\n养囊\t266691\n苏办\t266692\n连山壮族瑶族自治县\t266693\n昭君怨\t266694\n修真\t266695\n黄巢起义\t266696\n页尾\t266697\ntroll\t266698\n三名\t266699\n潘若迪\t266700\nthemleaf\t266701\n吸水砖\t266702\nGPD\t266703\n出乎其外\t266704\n一个200\t266705\n德联\t266706\nGoGo\t266707\n这么\t266708\n成都市司法局\t266709\n差动变压器\t266710\n必须承认\t266711\n神州\t266712\n松土机\t266713\n无限远征队\t266714\n18轮\t266715\n季氏\t266716\n早班\t266717\n仿制品\t266718\nDrink\t266719\n福星闯江湖\t266720\n口鼻\t266721\n重庆南开中学\t266722\n育鹏专升本教育信息网\t266723\n钢筋切断机\t266724\nrlike\t266725\n老包\t266726\n牛气\t266727\nyaya\t266728\n年少时\t266729\n吴漪\t266730\nhilton\t266731\n万界无敌\t266732\n红城市\t266733\nwin7网络打印机\t266734\n冥冥\t266735\n人生一世\t266736\n精妙\t266737\nGANTZ\t266738\n店里\t266739\n13倍\t266740\n崇仁路\t266741\n学思践悟十九大\t266742\nPro3\t266743\nperri\t266744\n协同性\t266745\nAni\t266746\n九日山\t266747\nFCF\t266748\nCE认证\t266749\n高低温冲击试验箱\t266750\n框支\t266751\nBigdata\t266752\nWalls\t266753\n雄蕊\t266754\n中洁网\t266755\n弃疗\t266756\n李倩\t266757\n索力\t266758\n群贤府\t266759\n广州火车东站\t266760\n里急后重\t266761\ngsensor\t266762\n尚华\t266763\nad15\t266764\n李青松\t266765\nwj\t266766\nSea\t266767\nKOBE\t266768\n国产化\t266769\nMF4712\t266770\nXCMG\t266771\n智机\t266772\n传戒\t266773\n陈珊妮\t266774\n5l\t266775\n佐治亚州\t266776\nboxer\t266777\n城市湿地公园\t266778\n青金桔\t266779\n百度云网盘+迅雷\t266780\n代言人\t266781\n苏斌\t266782\n打压\t266783\n雄霸天下\t266784\n玛丽莲曼森\t266785\n形动\t266786\n血衫\t266787\ncnetos\t266788\n雅居乐集团\t266789\n上行文\t26679
0\n人力资源系统\t266791\n塞\t266792\n生长痛\t266793\n全昭弥\t266794\n甘美\t266795\n否定观\t266796\n议事厅\t266797\n阴阳路\t266798\ncourage\t266799\nqq拼音输入法\t266800\n沙茶\t266801\n衣帽架\t266802\n2600亿\t266803\n动人心魄\t266804\nRemoved\t266805\n霍勇\t266806\n可信网站认证\t266807\n2寸\t266808\n题诗\t266809\n君度\t266810\n8.5.4\t266811\n满满的爱\t266812\n排海\t266813\nIsaac\t266814\nZLG\t266815\n监控机\t266816\n文交会\t266817\n融创春风十里\t266818\n达安\t266819\njb\t266820\n引开\t266821\n市侩\t266822\n省监\t266823\n晶界\t266824\n云计算技术\t266825\n六副\t266826\n石板\t266827\n但求\t266828\n福利费\t266829\n55座\t266830\n当今\t266831\n淡水镇\t266832\n黄页88网\t266833\n九厘米\t266834\n卢锋\t266835\n曼荼罗\t266836\n2千米\t266837\n9颗\t266838\n瀛海\t266839\nseowhy\t266840\n控制值\t266841\n52.0\t266842\n盗伐林木罪\t266843\nSSH协议\t266844\n腕管综合征\t266845\n高中作文_无忧考网\t266846\nkeyshot5\t266847\n美奇\t266848\n圣雪\t266849\n乖叫\t266850\n刘威葳\t266851\n配电站\t266852\n几何画板\t266853\n魍魉之匣\t266854\n中国美\t266855\n1000平\t266856\nCanopus\t266857\nwont\t266858\n触须\t266859\n三星S7/S7\t266860\n康健\t266861\nJAMA\t266862\nWEMONEY\t266863\n吐丝\t266864\n逻辑分析仪\t266865\n耐火砖\t266866\n雅顿金胶\t266867\n94岁\t266868\n付琳\t266869\nMFi\t266870\n长安欧诺风扇\t266871\n爬爬赛\t266872\n华泰集团\t266873\n超威\t266874\n轻取\t266875\n提头\t266876\n灭火弹\t266877\n恶役\t266878\nOMG\t266879\n奥蓝\t266880\n总函\t266881\nDNF帕拉丁\t266882\n霍勒迪\t266883\n军训服\t266884\n植入性\t266885\n启点\t266886\n自缢\t266887\n闸坡镇\t266888\n29秒\t266889\nCAD工具栏\t266890\nITO\t266891\n松解\t266892\nWIKI\t266893\n空气波压力治疗仪\t266894\n双璧\t266895\n4.8.3\t266896\n91pro\t266897\n红瑞集团\t266898\n会展中心\t266899\nu4\t266900\n书&#160\t266901\n蒋健\t266902\n礼俗\t266903\n2000mm\t266904\n江口镇\t266905\n清辉阁\t266906\n石头镇\t266907\n矿物油\t266908\n25组\t266909\nRaven\t266910\n金蝶币\t266911\n期货类\t266912\n永城\t266913\n不锈钢卷\t266914\n无限路由器\t266915\n豪情\t266916\n土地改革运动\t266917\n45吨\t266918\n安阳文峰\t266919\nFORTINET\t266920\notf\t266921\n上海钓鱼论坛\t266922\n毕业生代笔网\t266923\n货币型\t266924\n行不_\t266925\n倪尔萍\t266926\n纸杯\t266927\n温峤镇\t266928\n商南\t266929\n威世\t266930\n以太坊\t266931\n上海华谊集团\t266932\n青山村\t266933\n姜姜\t266934\nnes模拟器\t266935\n周磊\t266936\n乱舞\t266937\n病态\t266
938\n欧派\t266939\nY友乐园\t266940\n坝子\t266941\n小少妇\t266942\n独创性\t266943\nOp.10\t266944\ncb\t266945\n月纹\t266946\n工藤有希子\t266947\n微字\t266948\n花希\t266949\n警示带\t266950\ndisorders\t266951\n第一庄\t266952\njsonarray\t266953\n七五\t266954\n年月\t266955\n金华市外国语学校\t266956\n同年\t266957\ncort\t266958\n杀人优越权\t266959\nPPPD\t266960\n氯硝西泮片\t266961\n陈薇\t266962\n中央电视台综合频道\t266963\n甘草片\t266964\n凤皇\t266965\n冷读术\t266966\n样师\t266967\nkudu\t266968\n重生豪门之强势归来\t266969\n面访\t266970\n奥迪康\t266971\n十二生肖的故事\t266972\n无翼鸟邪恶漫画_无翼鸟\t266973\nirs\t266974\n天津市委党校\t266975\n慰安所\t266976\n缺一不可\t266977\n货到\t266978\n荣耀手环3\t266979\n久旱逢\t266980\n洋浦港\t266981\nFlint\t266982\n客套话\t266983\n华北高速\t266984\nmeiren\t266985\n不到\t266986\n裴金佳\t266987\n第6讲\t266988\n北京慈铭体检中心\t266989\n杀破狼·贪狼\t266990\n沟沟\t266991\n慢跑\t266992\n谦逊\t266993\n鹿先森\t266994\n长沙镇\t266995\n脱毒\t266996\n朗行论坛_汽车之家论坛\t266997\n南通市人力资源和社会保障局\t266998\n狂犬病病毒\t266999\n狗彘\t267000\n吞噬星空\t267001\n双语版\t267002\n不知而不愠\t267003\nUBB\t267004\n真声\t267005\n日本大使馆\t267006\n五洲国际广场\t267007\n低迷\t267008\n白素\t267009\n短信费\t267010\nrowid\t267011\n极彩\t267012\n最好的\t267013\nneeds\t267014\n插销\t267015\nv6\t267016\n云鸟科技\t267017\n大侦探福尔摩斯\t267018\n美人心计\t267019\n前段\t267020\nVEC\t267021\n机器名\t267022\nsuperblock\t267023\n勘测\t267024\nyif\t267025\n牛油\t267026\n沈阳消防\t267027\n父母\t267028\n视美\t267029\n魔蛇\t267030\n黑杠\t267031\n霍尼韦尔\t267032\n15.4\t267033\n程老师\t267034\n20180208\t267035\n地质灾害监测\t267036\n病假单\t267037\n钢壳\t267038\nidv\t267039\n恒房通\t267040\n万联网\t267041\n怒族\t267042\n美大集成灶\t267043\n南沿江\t267044\n知识经济\t267045\n八寸\t267046\n控板\t267047\n3.5.6\t267048\n郭晋安\t267049\nstatic变量\t267050\nQGraphics\t267051\nACP\t267052\n吧唧网\t267053\n不求人\t267054\n兰新铁路\t267055\n韩国现代集团\t267056\n国产\t267057\n风遁\t267058\ngrails\t267059\n纯利\t267060\nqiniu\t267061\n拉杆式\t267062\n江西人\t267063\n天天羽毛球网\t267064\n韩国电视台\t267065\nWinpcap\t267066\nZxiao\t267067\n伟德国际\t267068\n过温\t267069\nvOoT\t267070\n成都工业学院\t267071\n约翰霍普金斯\t267072\n芝术\t267073\n小黄帽\t267074\n温岭人大\t267075\n我想\t267076\n微易达\t267077\n0306\t267078\n江必新\t267079\nCEAC\t267080\n电赛\t267081\n不可名状
\t267082\n值班室\t267083\n中南地产\t267084\nBANNER\t267085\n买碟\t267086\nzxw\t267087\n别让\t267088\n断罪者\t267089\nTensentype\t267090\n宝马m4\t267091\n杭州市第二人民医院\t267092\n分钟寺\t267093\nElectrode\t267094\n子宫腺肌症\t267095\n榛蘑\t267096\nlog4j2\t267097\nOPS\t267098\n第12部\t267099\nhuashi\t267100\n犬之岛\t267101\n南糯山\t267102\n杜鹃根\t267103\n先天功\t267104\n一枝独秀\t267105\n武汉二厂\t267106\nwifi放大器\t267107\n正当季\t267108\n光堂\t267109\n28年\t267110\n孕期反应\t267111\n物美价廉\t267112\n5109\t267113\n漳州市教育局\t267114\n筑波大学\t267115\n七秒\t267116\n白凤堂\t267117\n赛罗奥特曼格斗\t267118\n日立中央空调\t267119\n王先谦\t267120\n走过来\t267121\nxlutils\t267122\nws2_32.dll\t267123\n脱贫路\t267124\n失蜡法\t267125\nNi\t267126\n受凉\t267127\n头颅\t267128\n思而不学\t267129\n熏衣\t267130\n157cm\t267131\n国元证券股份有限公司\t267132\nDrugs\t267133\n锁版\t267134\n非同\t267135\n李骥\t267136\n无忧传奇\t267137\n安泽县\t267138\n胃肠道间质瘤\t267139\nTERRA\t267140\n地块\t267141\nCardiology\t267142\n影月\t267143\n桃源路\t267144\n测速仪\t267145\nkard\t267146\n方向场\t267147\n2009级\t267148\n俞自萍\t267149\nPeriod\t267150\n罗马2:全面战争\t267151\n艾滋病\t267152\n硫化氢\t267153\n莲花广场\t267154\n盛世天城\t267155\n基本公共卫生服务\t267156\n體驗\t267157\n口费\t267158\n七里乡\t267159\n环形山\t267160\n镦\t267161\n金融街\t267162\n株洲政府网\t267163\n落实好\t267164\n股票市\t267165\n圣泉寺\t267166\n射门\t267167\n建设大厦\t267168\n顺河回族区\t267169\nGeology\t267170\n背袋\t267171\nVHS\t267172\nkbk\t267173\n东城\t267174\n滥发\t267175\n3590\t267176\n广东新闻\t267177\n汤泉\t267178\n长久之计\t267179\n1491\t267180\n掐指\t267181\n黑毛猪\t267182\n运输证\t267183\nweshould\t267184\n限额\t267185\n取暖\t267186\n褚柑\t267187\npgone\t267188\n谭平\t267189\nfategrando\t267190\n校历\t267191\n防川\t267192\ninit__.py\t267193\nFLV\t267194\n实属\t267195\n药侠\t267196\n超脑\t267197\ntxgc\t267198\nSD敢达强袭战线\t267199\n华懋科技\t267200\n黑脸琵鹭\t267201\n横峰县\t267202\n安溪铁观音\t267203\n电脑网\t267204\n爱梦\t267205\n00年\t267206\n450h\t267207\n土巴兔装修问答\t267208\nfunk\t267209\nweb喵神\t267210\n江西人才网\t267211\nExcel单元格\t267212\n惠宜\t267213\nTesseract-OCR\t267214\n电动床\t267215\n交流电机\t267216\n食行\t267217\n苹果电脑输入法\t267218\n骨折线\t267219\nnaturie\t267220\nSaint\t267221\n挡车器\t267222\nMob文档中心\t267223\ndet
our\t267224\n220分钟\t267225\n讨论群\t267226\n江某\t267227\nredirect\t267228\n渴求\t267229\n21cake\t267230\n一听\t267231\nnumactl\t267232\n鼎鼎大名\t267233\n习俗\t267234\n阴阴\t267235\nHamilton\t267236\n899\t267237\n寄放\t267238\n裂\t267239\n新昌县政府\t267240\nVR/AR\t267241\n整平机\t267242\n祖屋\t267243\n蒲公英之恋\t267244\n非处方药\t267245\n刘力扬\t267246\n李军\t267247\ndimen\t267248\n创翼\t267249\n李艾\t267250\n4350\t267251\n老葡京\t267252\n文化人\t267253\n97影院\t267254\n制霉素片\t267255\nPolyvore\t267256\n包运行\t267257\n联汇\t267258\n7kg\t267259\n冰石\t267260\ng罩杯\t267261\nlpt\t267262\n层错\t267263\n宁波东力\t267264\n国世平\t267265\n劝业场\t267266\n跨阻放大器\t267267\n三十一年\t267268\n加藤惠\t267269\n阳光花园\t267270\n100个月\t267271\n北口\t267272\n奥恩\t267273\n相关度\t267274\n影视先锋\t267275\n放米\t267276\n河东勋\t267277\n呼吸病\t267278\n庆功\t267279\n合肥植物园\t267280\n5336\t267281\n混浴\t267282\n有苏杭\t267283\n针规\t267284\n六弦\t267285\n蒙特\t267286\n豆蔻年华\t267287\n函字\t267288\n625\t267289\n试车\t267290\n学海\t267291\n雷克特种神医\t267292\n渴\t267293\n平罗县政府网_平罗县政府\t267294\n广州市设计院\t267295\n资源要素\t267296\n701号\t267297\n乳尖\t267298\n郑东新区管委会\t267299\n强夯法\t267300\n和平饭店\t267301\n上升\t267302\n刀座\t267303\nFortran\t267304\n二号首长\t267305\n123我爱你\t267306\n凝霜\t267307\n虎门\t267308\n中青旅行社\t267309\n临海房产网\t267310\n第63批\t267311\n理实\t267312\n无印良品\t267313\n土壤固化剂\t267314\n广佛\t267315\n黄海峰\t267316\n大唐\t267317\niPhone7/7plus\t267318\n第25个\t267319\n1437\t267320\n双羽\t267321\njeanron\t267322\n九一三事件\t267323\n中国近代史纲要\t267324\n复合饼\t267325\n测距仪\t267326\n监狱学园\t267327\n电\t267328\n小冤\t267329\n钥匙扣\t267330\n试试\t267331\n经费\t267332\n陌刀\t267333\n545S\t267334\nvmax\t267335\n变题\t267336\nsober\t267337\n素衣\t267338\n木秀\t267339\nixgbe\t267340\n映射盘\t267341\nsimpledateformat\t267342\nMFP\t267343\n旨意\t267344\n御园\t267345\nCBNData\t267346\n胔\t267347\nluckyw\t267348\naring\t267349\nPoverty\t267350\n艾瑞泽\t267351\n9月3日\t267352\n报生\t267353\n平等性\t267354\n一尘\t267355\n开封府\t267356\npthread_create\t267357\nIB课程\t267358\n褪色\t267359\nwww.gzlyyl.gov.cn/resource/publish/001/\t267360\n国展\t267361\n三宗\t267362\n融合器\t267363\n沃乐\t267364\n5W\t267365\n遗风\t267366\n相声\t2673
67\n扔出\t267368\n敦邯新闻网\t267369\n肯普法\t267370\nColour\t267371\nili9341\t267372\n200多年\t267373\n供电\t267374\n大益\t267375\n葛剑雄\t267376\nindea\t267377\n59式\t267378\n胡莱\t267379\n十里村\t267380\n孙艺小s\t267381\n乡镇党委\t267382\n180g\t267383\nmalta\t267384\n直管段\t267385\n2017年6月29日\t267386\n沈洋\t267387\n外模\t267388\n搜达\t267389\n顺气\t267390\n2008年11月\t267391\n射频消融术\t267392\n回家的路上\t267393\n科顺防水科技股份有限公司\t267394\n拜博口腔\t267395\n25年前\t267396\n活字格\t267397\n相忘\t267398\n高维空间\t267399\n悦诗风吟\t267400\n斯旺西大学\t267401\n黑龙江移动\t267402\n三防\t267403\n阿茶\t267404\nfem\t267405\n怪物猎人吧\t267406\nvvvv\t267407\n江西旅游商贸职业学院\t267408\n美女作家\t267409\nぱ\t267410\n_新闻政经_北京商报网\t267411\npropylene\t267412\n厦门地铁6号线\t267413\nlen函数\t267414\n尘\t267415\n医疗人才网\t267416\n官桥\t267417\n爸妈们\t267418\n红色警戒3世界大战\t267419\n开源代码库\t267420\n回锅肉\t267421\nbarrett\t267422\n男厕\t267423\n角堇\t267424\n华勤\t267425\n乐悠悠\t267426\n台表\t267427\n热浸塑钢管\t267428\nChr\t267429\n刘庆生\t267430\n他的爱\t267431\n阿兹海默\t267432\n公司管理软件\t267433\n珠海外伶仃岛\t267434\nNoteBook\t267435\n莉莉影院\t267436\n图林\t267437\n稍后\t267438\n大达物流\t267439\n龟鳖\t267440\n九九藏书网\t267441\n如诗\t267442\n锐迪科\t267443\n中央纪委法规室\t267444\n情天性\t267445\n管理局\t267446\nPrior\t267447\n要旨\t267448\n小英赛\t267449\n海南热带海洋学院\t267450\n静气\t267451\n遍历算法\t267452\n成都市幼儿园\t267453\n穆念慈\t267454\n捉弄\t267455\nassurance\t267456\n雷公山\t267457\n后年\t267458\n韩联社\t267459\nRoyole\t267460\nSomerset\t267461\n学子们\t267462\n2万亿元\t267463\n公司管理制度\t267464\n政综\t267465\nmultiplier\t267466\nChassis\t267467\nrav\t267468\nOvernight\t267469\nsoft\t267470\n热缩\t267471\n走人行道\t267472\n届中\t267473\n最重\t267474\n谷哥\t267475\n月经推迟\t267476\n熊猫烧香病毒\t267477\n霸王花\t267478\n高斯滤波器\t267479\n勇者圣殿_凯恩之角_暗黑破坏神\t267480\nassigned\t267481\n中国企业家协会\t267482\n第七批\t267483\ndungreed\t267484\n金市时讯-金投网\t267485\n最近几天\t267486\n优惠点\t267487\n新站\t267488\n强脱\t267489\nEVT\t267490\n替代率\t267491\n巴黎春天百货\t267492\n孟光\t267493\n刺符\t267494\n萃取剂\t267495\n上海财经大学经济学院\t267496\n隔音罩\t267497\n360圈\t267498\n岱庙\t267499\n安琪拉\t267500\n债委会\t267501\n电击伤\t267502\n雷人\t267503\ncave\t267504\n.3\t267505\n严阵以待\t267506\nsf999\t26750
7\n南充南部县\t267508\n辣味\t267509\n招财\t267510\n工程化\t267511\n两等\t267512\nThinkCSS\t267513\na^2\t267514\n穿梭式\t267515\n孕早期\t267516\n公共停车场\t267517\n新大话西游\t267518\n无冬之夜1\t267519\n郫\t267520\nICCV\t267521\nprofinet\t267522\n太平保险\t267523\nnba2k14\t267524\n牛电科技\t267525\nF2L\t267526\nM6S\t267527\n诗乐\t267528\n47层\t267529\njtbc\t267530\n凯森\t267531\n超维\t267532\n钢管架\t267533\n担保函\t267534\n老干局\t267535\n历年来\t267536\n黑龙江省博物馆\t267537\n双林\t267538\njining\t267539\n电磁屏\t267540\n银禧\t267541\n3千多\t267542\n放不下\t267543\n抢鲜版\t267544\nblazer\t267545\n张宥浩\t267546\n同伊\t267547\n教兽\t267548\n疯狂中文网\t267549\n沐浴乳\t267550\nEXPERT\t267551\n宅樱\t267552\n花漫\t267553\n苏锡常南部\t267554\n对外直接投资\t267555\n胡建平\t267556\n七丽女性网\t267557\n开机关机\t267558\ntja\t267559\n伸长量\t267560\n新葡京\t267561\nRARBG\t267562\n非居民金融账户涉税信息尽职调查管理办法\t267563\n凯杰\t267564\n刘小兵\t267565\n有乐中文网\t267566\n秦皇岛市\t267567\nstereo\t267568\nBurch\t267569\n傲天佛尊\t267570\n易达金\t267571\n分区分\t267572\n门栓\t267573\n配位数\t267574\n媸\t267575\n验案\t267576\n无双全明星\t267577\n脚线\t267578\nMP3转换器\t267579\n数码相机镜头\t267580\n金融资产管理公司\t267581\n捐卵\t267582\n支款\t267583\n安邦集团\t267584\nXenDesktop\t267585\n满洲人\t267586\n乳痂\t267587\n黄河水文网\t267588\n伊莉雅\t267589\n全然\t267590\n增殖灶\t267591\n上犹\t267592\n蒜片\t267593\n中国大学\t267594\n考博\t267595\n千梦\t267596\nSNI\t267597\n保护者\t267598\nPumpkin\t267599\n自存\t267600\n巨石\t267601\n天水市人力资源和社会保障局\t267602\n零差率\t267603\n天字号\t267604\n斗鸡眼\t267605\n非党员\t267606\n给水\t267607\n吕梁\t267608\n螺孔\t267609\n清华同方|同方股份有限公司\t267610\nPathVariable\t267611\n凸镜\t267612\n蓬莱仙岛\t267613\nRpc\t267614\n品酒\t267615\n静语\t267616\n宁海在线\t267617\n叠螺脱水机\t267618\n唐舞桐\t267619\n墨月城\t267620\n假话\t267621\n经济版\t267622\n纪念物\t267623\n纳瓦罗\t267624\n国韵\t267625\n黑枣\t267626\n8971\t267627\n仁川国际机场\t267628\n灰量\t267629\n高雪\t267630\nTVP\t267631\n死区\t267632\n群防群治\t267633\n奥斯陆大学\t267634\n好好生活\t267635\n3.5版\t267636\n万鸦\t267637\n13斤\t267638\n华阳古镇\t267639\n飘窗柜\t267640\n名居\t267641\n重大交通事故\t267642\n破胎器\t267643\n斗数\t267644\n三个字\t267645\n黄龙600\t267646\n丹泽尔·华盛顿\t267647\n长安欧尚欧尚X70A\t267648\n风雪夜归人\t267649\n伪品\t267650\n全龄化\t267651\n果坊\t26
7652\n秦岛\t267653\n小夏\t267654\n中人网\t267655\n题组\t267656\n南平市政府\t267657\n华佗\t267658\n上海外滩茂悦大酒店\t267659\nl36h\t267660\ndongle\t267661\n356\t267662\n十八个月\t267663\nfunctional\t267664\nibg\t267665\n燃气炉\t267666\n朋友关系\t267667\n晓月圆\t267668\n摩擦学\t267669\nzcwz\t267670\n船代\t267671\n凌飞\t267672\n中国男排\t267673\n御用\t267674\niso认证\t267675\n肠炎宁片\t267676\n村办\t267677\n蓝杖\t267678\n林洪生\t267679\n苏家屯区\t267680\n三门峡市委\t267681\n湖北理工学院\t267682\n试验仪\t267683\n问题式\t267684\n犯规\t267685\n逍遥叹\t267686\n小海军\t267687\n轴温\t267688\n阴凉\t267689\n校稿\t267690\n党代表大会\t267691\nб\t267692\nSINAMICS\t267693\n|东来紫微网\t267694\n罗牛山\t267695\n徐悦\t267696\n一二一\t267697\n凹痕\t267698\nカス\t267699\n扁桃仁\t267700\nmkz\t267701\n竖心旁\t267702\n大叶兰\t267703\n冬瓜蔡\t267704\nbios-ZOL\t267705\n俄罗斯方块\t267706\n竺可桢\t267707\n声德\t267708\n专用播放器\t267709\n排水阀\t267710\n环城南路\t267711\n彩跑\t267712\n联想售后客户服务中心\t267713\n5.1节\t267714\n无尽的爱纪念网\t267715\n井座\t267716\n竭\t267717\n不识\t267718\nFreeFileSync\t267719\n羽扇\t267720\n高钰\t267721\n掛\t267722\n怕生\t267723\n廉政\t267724\n营养素补充剂\t267725\n规划篇\t267726\n疗法\t267727\n这一秒\t267728\n15分\t267729\n漠河县\t267730\nassociations\t267731\n烧鱼\t267732\n503号\t267733\nphotograph\t267734\n68#\t267735\n社会性\t267736\n隔壁\t267737\n丽人时尚网\t267738\n杀妻案\t267739\nsc\t267740\n清晖\t267741\n最近十年\t267742\n155\t267743\n仙侠剑\t267744\n广州白云宾馆\t267745\n欠条\t267746\n腾讯地图\t267747\n面茶\t267748\n菏泽机场\t267749\nVIRGIN\t267750\n2017年12月15日\t267751\n天会调研宝\t267752\n海立\t267753\n连缀\t267754\n淮扬\t267755\nturnitin\t267756\n电话费\t267757\n永驻\t267758\n最近三个月\t267759\nLaptops\t267760\n左舵\t267761\n松茂御龙湾\t267762\n宁波市中医院\t267763\n古汉语\t267764\nlibjingle\t267765\n807路\t267766\n四三\t267767\n摩门教\t267768\n海鸟窝\t267769\nKibana\t267770\nwere\t267771\n多宗\t267772\n我的天使\t267773\nissued\t267774\n成园温泉山庄\t267775\n紫薇田园\t267776\n课间活动\t267777\n委员长\t267778\n百姓网\t267779\n典狱官\t267780\n火狐浏\t267781\n金古桥\t267782\n3818\t267783\nBird\t267784\n电子信息与电气工程学院\t267785\n融资平台公司\t267786\n无可失\t267787\n果壳活性炭\t267788\n德拉科\t267789\n第13条\t267790\n世纪华庭\t267791\n心气\t267792\nmanuscript\t267793\n恢复正\t267794\n小青年\t267795\n43集\t2
67796\n蔬果\t267797\ninventions\t267798\n邹市明\t267799\n定陶\t267800\n姚江新区\t267801\n探查器\t267802\n博视网\t267803\nLOL源\t267804\n叶落长安\t267805\n戮力\t267806\n痰湿\t267807\n量得\t267808\n凭祥市\t267809\nBroad\t267810\n上海市纪委\t267811\n冠益乳\t267812\n2.5万亿\t267813\n无极变速\t267814\ndjigo\t267815\n耳鼻咽喉\t267816\n青岛市教育局\t267817\n科波拉\t267818\n沉降菌\t267819\nARM64\t267820\n测量工\t267821\n皮裤\t267822\n数播\t267823\n形成\t267824\nQTable\t267825\n续文\t267826\n非公\t267827\n美娇娘\t267828\n云南地税\t267829\n176个\t267830\n好年\t267831\n铜绿假单胞菌\t267832\n廊坊万达广场\t267833\n帕斯卡尔\t267834\n南洋理工大学\t267835\n侏罗纪\t267836\n盘乐\t267837\nbssid\t267838\n西安路\t267839\n横琴\t267840\n声\t267841\n杨飞云\t267842\n颈椎炎\t267843\nAccessToken\t267844\n苹果电脑版\t267845\n中交第二公路工程局有限公司\t267846\nca证书\t267847\n循环器\t267848\n不烦\t267849\n监测表\t267850\n陈伟霆\t267851\nvd\t267852\n帕楚里亚\t267853\n加伤\t267854\nreverb\t267855\n小土豆\t267856\n20栋\t267857\n徐涛\t267858\nreversible\t267859\n权力的游戏\t267860\n范志毅\t267861\n厌食症\t267862\n李畅\t267863\n最新人教版五年级数学下册\t267864\nboard\t267865\n1円\t267866\n医护\t267867\n色浆\t267868\n假唱\t267869\n准迁证\t267870\neLearning\t267871\n北宋\t267872\n大塘镇\t267873\n凯越\t267874\n穷游女\t267875\n郑嘉颖\t267876\n吃面\t267877\n中国电科\t267878\n城镇总体规划\t267879\n十六烷基三甲基溴化铵\t267880\n丰盛古镇\t267881\n青麦\t267882\n谢家村\t267883\n耳棒\t267884\n处理篇\t267885\n叶赫那拉\t267886\n拉格纳罗斯\t267887\n卫辉市\t267888\n晓说\t267889\n厦门市第三医院\t267890\n农社\t267891\n花市\t267892\n李明泽\t267893\n津市\t267894\n热区\t267895\n香子\t267896\n12800\t267897\n多型\t267898\n山东黄河河务局\t267899\n摩擦桩\t267900\n名仕型\t267901\n四面弹\t267902\nForMac\t267903\n高手过招\t267904\n字位\t267905\n20141104\t267906\nUPDATE\t267907\n7w\t267908\n圣灵群岛\t267909\n狂野飙车8极速凌云\t267910\n杰里米\t267911\n还珠格格\t267912\n麦客\t267913\n踏火行歌\t267914\n华为路由A1\t267915\n早川\t267916\n12万多\t267917\n大连市水务局\t267918\n法定存款准备金\t267919\n国家监察委员会\t267920\n优惠卡\t267921\n黑泷堂\t267922\n张安琪\t267923\n金桥花园\t267924\nwada\t267925\nstema\t267926\ndaddyskins\t267927\n胜日\t267928\n芳姐\t267929\n海上云台山\t267930\n老虎沟\t267931\n008期\t267932\n平遥\t267933\nroboo\t267934\n刘大\t267935\n阿拉米格\t267936\n55亿美元\t267937\nDataRow\t267938\n恒大海花岛\t267939\
n乞丐裤\t267940\n南校区\t267941\nUSB接口\t267942\n蔑\t267943\nuemura\t267944\nfranke\t267945\n会展_分析测试百科网\t267946\n压缩机\t267947\n张掖路\t267948\n司法厅\t267949\n延长器\t267950\n橡胶\t267951\n恶略\t267952\n得住\t267953\n回归树\t267954\n宗\t267955\n0.2秒\t267956\nFantastic\t267957\n谷雨杯\t267958\nRace\t267959\n80年后\t267960\n射艺\t267961\n科技导报\t267962\n不战\t267963\n义乌小商品市场\t267964\n莽山国家森林公园\t267965\n绯红女巫\t267966\n哈尔滨市教育局\t267967\n绣荷包\t267968\nIME\t267969\n得人\t267970\n液位传感器\t267971\n江苏省政府\t267972\n进库\t267973\n恩施州人民政府\t267974\ncne\t267975\n张卫东\t267976\nYOHO\t267977\n搜素\t267978\n融创都会中心\t267979\nWTA\t267980\n一字型\t267981\nt分布表\t267982\napfs\t267983\n家务活\t267984\nYouku\t267985\n中低端\t267986\n代购点\t267987\n张家豪\t267988\nnet3.5\t267989\n17634\t267990\n新闸路\t267991\nf365\t267992\nedittext\t267993\n黑芝麻核桃\t267994\nSHS\t267995\n面对面\t267996\n75分\t267997\nSurrender\t267998\n2XL\t267999\n杰诚\t268000\n天王\t268001\n降香\t268002\n东越\t268003\ncrispr\t268004\n苏州网络公司\t268005\n烟壶\t268006\n子乔\t268007\n道山\t268008\n国付宝\t268009\n世遗\t268010\n控点\t268011\n一意\t268012\n李青萝\t268013\n农业产业园\t268014\n就在\t268015\n马宇\t268016\n难波\t268017\n型感\t268018\n白马村\t268019\n辽宁林业职业技术学院\t268020\n暨阳湖\t268021\n二条城\t268022\n洋米糕\t268023\ne3400\t268024\n螺纹连接\t268025\narchicad\t268026\n糙帅\t268027\nrobotstudio\t268028\n10.9%\t268029\n丽枫酒店\t268030\n鲁商集团\t268031\n六朝云龙吟\t268032\n挂树\t268033\n玉雕师\t268034\n得瑟\t268035\nOnkyo\t268036\nX230i\t268037\nOSError\t268038\n红瘦\t268039\n脊\t268040\n首台\t268041\n少妇\t268042\n18183口袋妖怪复刻\t268043\n国观后感\t268044\n潘婷\t268045\n弥牟镇\t268046\n电源柜\t268047\n固安县\t268048\n大力金刚\t268049\n驻极体\t268050\n钰慧\t268051\n修持\t268052\n万州\t268053\n就打喷嚏\t268054\n琴键\t268055\n兼用\t268056\n2.05G\t268057\n最合适\t268058\n想长\t268059\nDOCOMO\t268060\nwns\t268061\n杨梅\t268062\n特装\t268063\n李美淑\t268064\n比邻星\t268065\n8920\t268066\n金朝\t268067\n孟岩\t268068\n山东省人社厅\t268069\n电锯惊魂6\t268070\ncalled\t268071\nJustin\t268072\n十字型\t268073\nxzb\t268074\n幔\t268075\n201705\t268076\nPharmaceutical\t268077\n九棵树\t268078\n路经\t268079\n我司\t268080\n更健康\t268081\n中国西部\t268082\n津塔\t268083\n全能者\t268084\n修心
\t268085\n宝拉水杨酸\t268086\n孙雅\t268087\n育儿_99健康网\t268088\nMcafee\t268089\n身姿\t268090\nOSMO\t268091\n存款计算器\t268092\nmilano\t268093\n流动人口\t268094\n苔原\t268095\n2630\t268096\n遵守\t268097\nPlans\t268098\n昆山人才网\t268099\n小气泡\t268100\n圆曲线\t268101\n古兽\t268102\n手机斗鱼\t268103\n杨宗\t268104\n安得智联科技股份有限公司\t268105\n苍老师\t268106\nsplite\t268107\n璞\t268108\n编内\t268109\ncmdb\t268110\n银河电子\t268111\n500瓦\t268112\n王一\t268113\n2017年7月12日\t268114\n师范类\t268115\n亲证\t268116\n复兴\t268117\nojdbc14\t268118\n漏网\t268119\n新产\t268120\n渲染器\t268121\n阻断剂\t268122\n弟子规图说\t268123\n胡亮\t268124\nMBO\t268125\n735xt\t268126\n李霸妮\t268127\n滨江经济开发区\t268128\n惠特莉\t268129\n争论\t268130\n选和\t268131\npulse3\t268132\n20140919\t268133\n僭越\t268134\n梦幻群侠传3\t268135\n二手手机回收网\t268136\n3.8.0\t268137\n抗冻\t268138\n00:30\t268139\n杰科\t268140\n石家庄白癜风医院\t268141\n黑油\t268142\nanymore\t268143\n万战\t268144\n卡件\t268145\n4mg\t268146\n中国人民大学信息资源管理学院\t268147\n阿昔洛韦软膏\t268148\nteamcenter\t268149\n中国银行业监督管理委员会\t268150\n三清\t268151\n泰州站\t268152\n王东明\t268153\n车峰\t268154\n萧县人民政府\t268155\n512M\t268156\n知识产权战略\t268157\n药医\t268158\nOccupation\t268159\n广州市花都区人民政府\t268160\n弧垂\t268161\n百架\t268162\n大中部\t268163\n夜袭\t268164\n爱中\t268165\n洗手盘\t268166\n重邮\t268167\ntinyMCE\t268168\n还有些\t268169\nyua\t268170\nZuul\t268171\n有眼\t268172\n布里渊\t268173\n责任感\t268174\n羊眼\t268175\n恶鬼\t268176\nADATA\t268177\n科刘\t268178\n吾爱网\t268179\n社科版\t268180\n钢铁雄心3\t268181\nWii\t268182\n微量白蛋白\t268183\n慈利县\t268184\n支座处\t268185\n剪花\t268186\n新丰\t268187\n阻截战\t268188\n红将\t268189\nwhip\t268190\n两当\t268191\n桑舞小说网\t268192\n四川师范大学文理学院\t268193\n欺辱\t268194\nshangc\t268195\nsolenoid\t268196\n张青松\t268197\n恨不相逢\t268198\nSimCity\t268199\n凉水\t268200\n刷量\t268201\n厌笔\t268202\n继发性\t268203\n佳木斯\t268204\ncasio\t268205\n攻略秘\t268206\nGamerSky\t268207\n龙门客栈\t268208\n台湾岛\t268209\n2018年4月8号\t268210\n安迪板\t268211\nareyou\t268212\npurposes\t268213\n搁板\t268214\n恰恰恰\t268215\n24孝\t268216\n身形\t268217\n递增子序列\t268218\n内服药\t268219\n自撸\t268220\n喧宾夺主\t268221\n李杰\t268222\n原來\t268223\n直剪仪\t268224\n龙帝\t268225\n黄超\t268226\n体测\t268227\n
Divide\t268228\nJoyS\t268229\n酒量\t268230\n权大师\t268231\nalibaba\t268232\n波波女性网\t268233\n阿尔法.罗密欧阿尔法罗密欧\t268234\nOils\t268235\n八国联军\t268236\nSans\t268237\n颅脑\t268238\n樊金龙\t268239\n十月初\t268240\n海安镇\t268241\n进击的巨人第二季\t268242\n长春外国语学校\t268243\n月支\t268244\nレンタル\t268245\n陆风X2\t268246\nModifier\t268247\n众娱\t268248\n对手戏\t268249\n六声\t268250\n皂苷\t268251\nGSDzone\t268252\n搞逼\t268253\n约瑟芬\t268254\n菜粉\t268255\n马萨伊尔\t268256\n驻足\t268257\n太阁立志传4吧\t268258\n7二代\t268259\n行处\t268260\nSF动漫论坛\t268261\n小狐濡尾\t268262\neui\t268263\n五常大道\t268264\n污妖王\t268265\n赣南苏区\t268266\n7559\t268267\nPerformed\t268268\n投资管理有限公司\t268269\n粥\t268270\nwinder\t268271\n桅\t268272\n亚瑟王传奇\t268273\n青书\t268274\n师法\t268275\n地籍\t268276\n冒险岛M\t268277\n任丘市\t268278\ntens\t268279\n72米\t268280\n美洁\t268281\npaulwong\t268282\nmoumoon\t268283\n2316\t268284\n六西格玛管理\t268285\n1手\t268286\n文本行\t268287\nbutcher\t268288\n温州南站\t268289\nfilebox\t268290\n史实\t268291\n敢死队4\t268292\n格子衫\t268293\n蛋花花\t268294\n招聘会\t268295\n国色生香\t268296\n纸黄金\t268297\n7.2下\t268298\n邮轮公司\t268299\n定额量\t268300\n国联证券\t268301\nC语言编程\t268302\n中国中铁四局\t268303\n农药经营许可证\t268304\n郝蕾\t268305\n分形\t268306\n西门塔尔\t268307\nconnector\t268308\n卡拉曼达\t268309\n清酒\t268310\nCH2\t268311\n长沙理工\t268312\n手舞足蹈\t268313\n南京地铁2号线\t268314\n经商处\t268315\n中国游艇网\t268316\nDefault\t268317\nipad6\t268318\n京沈客专\t268319\n346号\t268320\n唐武宗\t268321\n刘征\t268322\n计数器\t268323\n远低于\t268324\n大连火车站\t268325\n小橙武\t268326\nAVD模拟器\t268327\n帐款\t268328\n查单\t268329\n6.66\t268330\n20161027\t268331\n巨猿\t268332\n无字版\t268333\n周杰\t268334\n磁能热水器\t268335\nonyx\t268336\n外婆家\t268337\n椿\t268338\n国家机关事务管理局\t268339\n2013-10-09\t268340\nQC30\t268341\n祖阿曼\t268342\n笔管\t268343\n喜氏金融\t268344\n李建群\t268345\n谷粒网\t268346\n解析版\t268347\n风湿免疫科\t268348\nbaidudl\t268349\n李和\t268350\n12步\t268351\n荒唐\t268352\n重庆长寿区\t268353\n森贝儿\t268354\n中序\t268355\n土木工程施工\t268356\n整季\t268357\n510分\t268358\nMarking\t268359\n剪绒\t268360\n总括\t268361\n木薯淀粉\t268362\n吃不死人\t268363\n预热期\t268364\nfanqiang\t268365\n卡卡利特村\t268366\nandroid7.0\t268367\n媚骨\t268368\n四五打印助手\t268
369\n上帝之鞭\t268370\n最后一场比赛\t268371\n雯雅婷\t268372\nblaire\t268373\n看呗网\t268374\nowa\t268375\n哀号\t268376\n普伐他汀钠片\t268377\n奉献者\t268378\nMultipartFile\t268379\n卧式镗床\t268380\n天龙八部OL\t268381\n李新\t268382\n加密算法\t268383\n睿威仕\t268384\n辣椒榕\t268385\n陈潇\t268386\n危包证\t268387\n牙形\t268388\nGillian\t268389\nCore2\t268390\n天津车管所\t268391\n3000毫安\t268392\ncda\t268393\n米易\t268394\nsockaddr\t268395\n旧城\t268396\n术字\t268397\n狙击手幽灵战士3\t268398\n李卓霖\t268399\nLafayette\t268400\n亲诊\t268401\nskynet\t268402\n吉利博瑞\t268403\nxpx\t268404\n母女俩\t268405\n轴力\t268406\n单梁\t268407\nisEmpty\t268408\n经信局\t268409\n市民网\t268410\n制表符\t268411\n马云\t268412\n总人\t268413\n深田奈奈\t268414\n柳氏\t268415\n洗码\t268416\n再爱\t268417\nbt种子\t268418\n上海交通大学数学科学学院\t268419\n尖庄\t268420\n赤条条\t268421\n马林\t268422\n垂柳\t268423\n一六二九\t268424\n新判断\t268425\n约会\t268426\n特福\t268427\n刘捷\t268428\n歌女\t268429\n摩登家庭\t268430\n刀砍\t268431\n华视电子\t268432\n贾云峰\t268433\n告解\t268434\nmysql.sock\t268435\n亿诺\t268436\n贾志敏\t268437\n三江镇\t268438\n马骏\t268439\nReddit\t268440\n万象物语\t268441\n崩解\t268442\n103.3\t268443\n合生创展集团有限公司\t268444\n上海植物园\t268445\n睁眼\t268446\nimg格式\t268447\n52PK剑网3\t268448\n0息\t268449\n7日游\t268450\n论述题\t268451\nnmake\t268452\n企业管理人员\t268453\n先天下之忧而忧\t268454\n钎焊\t268455\n松枝\t268456\n20171021\t268457\n竖图\t268458\n张捷\t268459\n安徽地区\t268460\n三德\t268461\n分手后\t268462\n惹祸上身\t268463\n先天不足\t268464\n一个斤\t268465\n陆战\t268466\nditie\t268467\n渔港\t268468\n第12周\t268469\n前门牙\t268470\n加拿大航空公司\t268471\nposcms\t268472\n小笼包子\t268473\n贾坤\t268474\n山东海化\t268475\nadb环境变量\t268476\nCH340\t268477\n顾均辉\t268478\n猛犬\t268479\n镖车\t268480\n广州火车南站\t268481\nParry\t268482\n12片\t268483\n家暴男\t268484\n模试\t268485\n攀枝花市\t268486\n机电工程\t268487\n温州晚报\t268488\n背景字\t268489\n明苑\t268490\nPT邀请码网\t268491\n梅伊\t268492\n枚举\t268493\n退路\t268494\n找路\t268495\n血府逐瘀丸\t268496\n全脂牛奶\t268497\n怡春院\t268498\n后勤工程学院\t268499\nBandari\t268500\nCAD2004\t268501\n希姆莱\t268502\n酸性\t268503\nmxgs\t268504\n全球热图网\t268505\n苍耳子\t268506\n不等\t268507\n污物\t268508\n1200亩\t268509\n戛纳\t268510\n2018年01月\t268511\n同步发电机\t268512\n优品酒店\t26851
3\nC刊\t268514\ngevent\t268515\n水泥搅拌桩\t268516\n离子化合物\t268517\n5207\t268518\n牧羊女\t268519\n制水机\t268520\n秋寒\t268521\n屡屡\t268522\n韩资\t268523\n联邦家具\t268524\n句句\t268525\n尤·奈斯博\t268526\n西翠路\t268527\n绕组\t268528\n2019年春节\t268529\n电石\t268530\n莱安\t268531\n郸城县\t268532\n江西省电力公司\t268533\n威宁\t268534\n﹃\t268535\n双城记\t268536\n中国民商法网\t268537\n下城区\t268538\n深度神经网络\t268539\nFabrics\t268540\n钥匙包\t268541\n铠甲勇士之雅塔莱斯\t268542\n客胜\t268543\n美商婕斯\t268544\n砦\t268545\n醉人\t268546\n打瞌睡\t268547\n首全\t268548\n小丽\t268549\n键点\t268550\nductile\t268551\n决赛圈\t268552\n南座\t268553\n无锡市人力资源和社会保障\t268554\n假发票\t268555\n二十年代\t268556\n球子\t268557\n银币\t268558\n赵太后\t268559\n魏民\t268560\n凤凰村\t268561\n6.0分\t268562\n函询\t268563\n20160504\t268564\n紫草膏\t268565\n盐池县\t268566\n尿口\t268567\ndioxide\t268568\n迪里拜尔\t268569\n中财\t268570\n美恐\t268571\n陈建生\t268572\n甲等\t268573\n花千树\t268574\nnba2k15\t268575\n以琳\t268576\n10亿个\t268577\n华宇锦绣花城\t268578\nHIM\t268579\n甲醛\t268580\n索尼FE\t268581\n机房\t268582\n侵扰\t268583\n张晓宇\t268584\nrewrite\t268585\n工商管理学院\t268586\n电动滑板\t268587\n崭新\t268588\nInputStream\t268589\n汉庭连锁酒店\t268590\n猎鹰9号\t268591\nik分词\t268592\n帕兰\t268593\n1-5季\t268594\n花园生物\t268595\n深圳机场\t268596\n水淹\t268597\n╠\t268598\n洗髓\t268599\n藤原瞳\t268600\n2-2\t268601\n摇控器\t268602\n飞地经济\t268603\n东风标致308\t268604\noriginally\t268605\n26_\t268606\n大威德\t268607\nsping\t268608\n来头\t268609\n星球大战前传1\t268610\n股票交易\t268611\n大检查\t268612\n蓝盖\t268613\n践诺\t268614\n萼\t268615\n消防百事通\t268616\n李建\t268617\nmapinfo\t268618\n晋原镇\t268619\nMapInfo\t268620\n舌吻\t268621\n宾虚\t268622\n阿老师\t268623\n荣耀壁纸\t268624\n念头\t268625\nmt7620\t268626\n向内\t268627\n爱德美\t268628\n隔壁老王游戏背景乐\t268629\n揭\t268630\n石斑鱼\t268631\n昆仑大道\t268632\n米黄色\t268633\n记表\t268634\n站街\t268635\n王镜岩\t268636\n乐巢\t268637\n性爱宝典\t268638\n皖新传媒\t268639\n置物\t268640\n自流井区\t268641\n孤岛惊魂4\t268642\n陈艳\t268643\n恰克\t268644\n残液\t268645\n黄庭经\t268646\nFraps\t268647\n三国群\t268648\n盐城城南新区\t268649\n斯坦尼康\t268650\n黑白调\t268651\n虎年\t268652\n八一小区\t268653\n滚动屏\t268654\n爱普生投影仪\t268655\n刺客信条3\t268656\n战锤40K\t268657\n灵契奥比岛\t268658\nG480\t268659\n
聚爆\t268660\n刀姬\t268661\nbatter\t268662\n钦南区\t268663\n利己主义\t268664\n报点\t268665\n载货量\t268666\n舌头\t268667\n603712\t268668\n国家税\t268669\n刀锋战士2\t268670\n9.6.0\t268671\n上海外国语大学》\t268672\n美尚生态\t268673\n奥迪Q2\t268674\nRWBY\t268675\n选导\t268676\n6888\t268677\n祖龙\t268678\n顺丰速运有限公司\t268679\n四帝\t268680\n200m\t268681\nlesions\t268682\n病毒疣\t268683\n祼体\t268684\nGanglia\t268685\n徐僧\t268686\n城市\t268687\nshimo\t268688\nPPR管\t268689\n美合\t268690\n手装\t268691\n4729\t268692\n河北省纪委\t268693\n盘古\t268694\n铅中毒\t268695\n使女的故事\t268696\n不朽\t268697\n纪元\t268698\n宠坏\t268699\nwww.ed2000.com\t268700\nsampler\t268701\ninnobackupex\t268702\n意功\t268703\nvulnerable\t268704\n月下桑\t268705\npcre\t268706\nfolly\t268707\n常务\t268708\n噪声系数\t268709\n网易通行证\t268710\n黄页/公司库/大全/厂\t268711\n龙啸虎吟\t268712\n神数\t268713\n唑\t268714\n福城\t268715\n郑州分公司\t268716\n沈雨萱\t268717\n嘻哈2\t268718\nphpstorm\t268719\n3支\t268720\n会讯\t268721\n第18次\t268722\n保质\t268723\n刚走\t268724\n八里湖新区\t268725\n瑞斯\t268726\n州郡\t268727\n县妇联\t268728\n路加\t268729\n各星\t268730\n指纹打卡机\t268731\n成都医大医院\t268732\nPinard\t268733\n华慧\t268734\n春春\t268735\n你丫\t268736\n438本\t268737\n朱力\t268738\n细胞壁\t268739\n叶锐文\t268740\niherb\t268741\n昨天傍晚\t268742\nefforts\t268743\n弹起\t268744\n广点通\t268745\n20170803\t268746\n香山网\t268747\n费翔\t268748\nSAKURA\t268749\nkisskiss\t268750\n板壳\t268751\n远播\t268752\nfuji\t268753\n70五\t268754\n史学史\t268755\n军屯镇\t268756\nyoka\t268757\n声援\t268758\n木头人\t268759\nBuildCraft\t268760\n化学镀镍\t268761\n这几句\t268762\nntce\t268763\nOctane\t268764\n英港\t268765\n申河均\t268766\n露营地\t268767\nMCA\t268768\n低倍镜\t268769\n1至3月\t268770\n三仁汤\t268771\nlog2\t268772\n中央公园广场\t268773\n电路板\t268774\nInteriors\t268775\n8090影视网\t268776\n头层牛皮\t268777\n魔伊\t268778\n500年前\t268779\n超级暗黑破坏神\t268780\n0d\t268781\n招魂2\t268782\n左大玢\t268783\n慢性直肠炎\t268784\n莉子\t268785\n数据报\t268786\n毕业设计展\t268787\n鲁媒\t268788\nVietnam\t268789\nempirical\t268790\n_公司起名_公司\t268791\n雨城区\t268792\n补考\t268793\n黄舒骏\t268794\nマジックミラ\t268795\n现役\t268796\n催动\t268797\n典狱司\t268798\n压敏\t268799\n2018-2023年\t268800\nplates\t268801\n寒兰\t268
802\n恋之罪\t268803\n室壁瘤\t268804\n多铎\t268805\n徐盛\t268806\n简媒\t268807\n路西\t268808\nwww.97ks.com\t268809\n解暑\t268810\n全国工商联\t268811\n3DMAX2016\t268812\n配景\t268813\n媚香\t268814\n卫片\t268815\n世间情\t268816\nfpm\t268817\n/br\t268818\nLGD\t268819\n第32\t268820\n阿根廷国家队\t268821\n检修员\t268822\n齿轮钢\t268823\nfx-es\t268824\n评改\t268825\nlinus\t268826\n5Ghz\t268827\n李毅中\t268828\n三大本\t268829\n钢锭\t268830\nHazardous\t268831\nPoS\t268832\nwetool\t268833\n借记\t268834\n追评\t268835\ndocker0\t268836\nH3302013款\t268837\nEXPRESSION\t268838\n邮政编码|邮编查询网\t268839\n嘉年\t268840\n哈尔滨石油学院\t268841\n串灯\t268842\n大良镇\t268843\n看图写话\t268844\n振动传感器\t268845\n史诗级\t268846\nfragment\t268847\nReceptor\t268848\n全师\t268849\n浙江大学\t268850\n贝里\t268851\n思觉\t268852\n造成\t268853\n誓师\t268854\n第10关\t268855\n避灾\t268856\n金发科技\t268857\n雪花泥\t268858\n金花媛\t268859\n飞天小女小警\t268860\n城版\t268861\n云栖竹径\t268862\n200.00\t268863\n芯绒\t268864\n宾果消消消\t268865\nAladd\t268866\nballoons\t268867\nSpring+Mybatis\t268868\nwince6.0模拟器\t268869\n不计入\t268870\nTunneling\t268871\n一田\t268872\n年关\t268873\n焦油量\t268874\n好不好吃\t268875\n华为MATE\t268876\n平行世界\t268877\n相送\t268878\n冯勇\t268879\n莲溪六村\t268880\n水塔\t268881\nrpar\t268882\n格力\t268883\n粘连\t268884\n颠覆级3D\t268885\n一得斋\t268886\n均码\t268887\n中国人民财产保险有限公司\t268888\n肤白\t268889\na+5\t268890\n蒲友\t268891\n黃\t268892\n架管\t268893\nhitman\t268894\n解放者\t268895\n昌都\t268896\n海莉\t268897\n太和县人民政府\t268898\nD700\t268899\n十七种\t268900\n射洪县人民政府\t268901\n文化东路\t268902\n五牌\t268903\n13:00\t268904\n红火火\t268905\nPConlin\t268906\n16:10\t268907\n东软睿驰\t268908\n外经证\t268909\n初花\t268910\nnewline\t268911\na卷\t268912\n40平方米\t268913\n林登\t268914\n李冰\t268915\n思过\t268916\nBadminton\t268917\n人人小站\t268918\n上位者\t268919\n乙酰半胱氨酸颗粒\t268920\nargmax\t268921\n量子霍尔效应\t268922\n象素\t268923\n增逾\t268924\n品牌\t268925\n美仑生物\t268926\nautoload\t268927\n天才小毒妃\t268928\n腐甲\t268929\nshowcase\t268930\n聚名网Juming.Com\t268931\n梁欢秀\t268932\n撇号\t268933\n蚊香\t268934\n某\t268935\n乙肝疫苗\t268936\n萧后\t268937\n2.0_\t268938\n东风风神\t268939\n面包屑\t268940\n第九区\t268941\n智能照明\t268942\nJoshua\t268943\n双
歧\t268944\n横岗镇\t268945\n娇柔\t268946\n巴塞罗那大学\t268947\n鹊桥\t268948\n文晸赫\t268949\ngarfieldzf\t268950\nRX-7\t268951\niphone4s\t268952\n宣华\t268953\n人民代表大会制度\t268954\n钢格栅\t268955\n87分\t268956\nrapidminer\t268957\n一仙\t268958\ntriangle\t268959\nfba4droid\t268960\nmw300r\t268961\n石龙路\t268962\n1230\t268963\n言子\t268964\n吕飞\t268965\nsm58\t268966\n名版\t268967\n星男\t268968\n桃谷エリカ\t268969\n广州医科大学附属口腔医院\t268970\n益昌\t268971\n尾曲\t268972\ne41\t268973\n白若熙\t268974\n真可爱\t268975\n第53\t268976\n南京办公室\t268977\n曾萍\t268978\n商话\t268979\n20160620\t268980\n焦氏\t268981\n_巴士斗战神\t268982\n张建明\t268983\n名指\t268984\n前空\t268985\n青黄不接\t268986\n张志强\t268987\n公判\t268988\nproceed\t268989\n16路\t268990\n洋红色\t268991\nzidir\t268992\n晨光小区\t268993\n王小兵\t268994\n复旦大学附属华山医院北院\t268995\nv2ba\t268996\n扬州市\t268997\n僵尸世界大战2\t268998\n敏弓\t268999\n猎数\t269000\n医工\t269001\n印江县人民政府\t269002\n非系统风险\t269003\n浇注\t269004\nRamsey\t269005\n盐通铁路\t269006\nshaving\t269007\nABOUT\t269008\n记录仪\t269009\n星书\t269010\n可数集\t269011\n西安妈妈网\t269012\ns1810\t269013\n活动式\t269014\n霸刀山庄\t269015\n伟世通\t269016\n斌卡\t269017\ncandles\t269018\n魔法英雄\t269019\n咨询工程师\t269020\n米兰时装周\t269021\n张庆华\t269022\n调整期\t269023\n450mm\t269024\n数据单\t269025\n上不去\t269026\n袅\t269027\n俄汉\t269028\n创客\t269029\nDT\t269030\n地下街\t269031\nSSE\t269032\n暗夜古宅\t269033\n元湛\t269034\n体侧\t269035\n勤政\t269036\n响起\t269037\n武汉大学水利水电学院\t269038\n后臂\t269039\n番薯\t269040\n板管\t269041\n取整函数\t269042\n麻醉剂\t269043\ngww\t269044\n雅鱼\t269045\n嘉论网|嘉兴人论坛\t269046\n思源电气\t269047\n新净\t269048\n农合\t269049\nSpecs\t269050\n巨兴茂\t269051\n廷尉\t269052\n战女神memoria\t269053\n1.25米\t269054\n油料\t269055\n连环替代法\t269056\n二里\t269057\n南昌工学院\t269058\n氮化镁\t269059\n碳源\t269060\n华吧\t269061\n华人小说网手机小说网\t269062\nangularJS\t269063\n耍人\t269064\n宫崎爱莉\t269065\nRapidMiner\t269066\n佳能2525i\t269067\n16300\t269068\n拉丁方\t269069\n糯米胶\t269070\n黄天\t269071\nnar\t269072\nconst\t269073\n内置函数\t269074\n旋臂起重机\t269075\n配置\t269076\n枭爷\t269077\nfcgi\t269078\n乐富豪\t269079\n2016年3月16日\t269080\n龙湖花园\t269081\n兔女\t269082\n精原细胞瘤\t269083\n上海同济医院\t269084\nth000\t269085\n李岩\t2
69086\n非淡泊无以明志\t269087\n笔趣阁\t269088\n北京信息科技大学\t269089\n试验段\t269090\nIS-IS\t269091\n天天啪_天天色_天天撸一撸_免费天天啪_天天啪视频_天天操_天天撸\t269092\n曾曾\t269093\n去掉\t269094\n杭州安恒信息技术有限公司\t269095\n人民大礼堂\t269096\n郭生白\t269097\nwebi\t269098\nSentences\t269099\n关关采集器\t269100\n各村\t269101\n翩翩起舞\t269102\n广百\t269103\n中牟县\t269104\n洛桑\t269105\n含非\t269106\n雅迪集团控股有限公司\t269107\n场部\t269108\nOT\t269109\n幸福大道\t269110\n教科研\t269111\n华为集团\t269112\n隐函数\t269113\nLipa\t269114\n中防万宝城\t269115\nプレイホ\t269116\n足贴\t269117\nw800bt\t269118\n刷子\t269119\nauthentication\t269120\ninclusive\t269121\n唤醒\t269122\n数控刀片\t269123\nPencil\t269124\nQQ群\t269125\n吸睛\t269126\n看上去很美\t269127\nSapphire\t269128\n外国人网\t269129\n4at\t269130\n大动\t269131\n豫国\t269132\n那首歌\t269133\n霸王丸\t269134\n切平面方程\t269135\nstean\t269136\n晟敏\t269137\nReveals\t269138\n变色杯\t269139\n构造柱泵\t269140\ne缘\t269141\n转悠\t269142\n经六路\t269143\n肉体性\t269144\n收罗\t269145\n玩忽职\t269146\n内蒙古自治区水利厅\t269147\n199元\t269148\n命令与征服:红色警戒\t269149\ndoodle\t269150\n软门帘\t269151\n描写性\t269152\n煤烟\t269153\nyy子\t269154\n68年\t269155\n艾泰\t269156\n花豆\t269157\n波月\t269158\ncess\t269159\n绿芦笋\t269160\n藏马山\t269161\n梁山伯\t269162\n高铁酸钾\t269163\n广西壮族自治区政府采购中心\t269164\n敬体\t269165\n实话实说\t269166\ngrating\t269167\n杨凡\t269168\n现金净流量\t269169\nChronicle\t269170\n佩妮\t269171\nshunt\t269172\n韩佳恩\t269173\n封包\t269174\n平衡杆\t269175\n10000元\t269176\nxqb\t269177\n94分\t269178\n雪冰\t269179\ncallcenter\t269180\n太极鱼\t269181\n宝乐\t269182\ncostume\t269183\nsendto\t269184\n五岭\t269185\n勇士们\t269186\n盛雪\t269187\n绍兴东站\t269188\n邻瑞广场\t269189\n滩\t269190\n孙金龙\t269191\n保留\t269192\n阿维\t269193\n上门女婿\t269194\n等你下课\t269195\n大肚\t269196\n过家门而不入\t269197\nDroid4X\t269198\nInstitutes\t269199\n瓦努阿图\t269200\n徐韬\t269201\n炫音\t269202\n神医侠侣\t269203\n银城小学\t269204\n一二次\t269205\nishare\t269206\n未审\t269207\nwanting\t269208\n恒温箱\t269209\n反舰\t269210\n海淀实验二小\t269211\n绿里奇迹\t269212\n保障性租赁房\t269213\n脑大洞开\t269214\n霞霞\t269215\n文献综述\t269216\nzhaobao1830\t269217\n贿选案\t269218\n内火旺\t269219\n壹财富\t269220\n自治区党委组织部\t269221\n奢侈\t269222\n转写\t269223\n低场\t269224\n精研科技\t269225\n9
60万平方公里\t269226\n王啸坤\t269227\n团胜\t269228\n乐谷\t269229\n业务费\t269230\n1944\t269231\n音乐梦\t269232\nLZO\t269233\n客货运\t269234\nvirtualization\t269235\nACDSee\t269236\n喵玉\t269237\n1篇\t269238\n南方谈话_\t269239\n一看\t269240\n一四年\t269241\n54mm\t269242\n机动战士高达seed\t269243\n义和团\t269244\nCosmetics\t269245\n三小时后\t269246\n一万多名\t269247\n预算部\t269248\n白雪歌\t269249\n特么\t269250\nmplus\t269251\n帕子\t269252\n心理因素\t269253\n杭州湾跨海大桥\t269254\n脚型\t269255\n15M\t269256\nJot\t269257\n录像片\t269258\n炭质\t269259\n太挤\t269260\nforfour\t269261\nyin\t269262\nbb10\t269263\nxboxones\t269264\nprivileges\t269265\n条形码\t269266\n割据\t269267\nParasoft\t269268\nVaporMax\t269269\nhubs\t269270\n三棵\t269271\n恶评\t269272\n360mm\t269273\n3601\t269274\nMME\t269275\nkaweco\t269276\n4000元\t269277\n钨金\t269278\nN站\t269279\n七星刀\t269280\nanima\t269281\nbl锁\t269282\nfixing\t269283\nWebApi\t269284\n雨后春笋\t269285\n琉球\t269286\n肉虫\t269287\n秋空\t269288\n胡文容\t269289\n新乐路\t269290\ncolspan\t269291\n致橡树\t269292\n租用\t269293\n大咕咕咕鸡\t269294\n哺乳期乳腺炎\t269295\n10.22\t269296\n养马场\t269297\nclam\t269298\n安华里\t269299\n前鼻\t269300\n走投无路\t269301\n六枝特区\t269302\n1.8AT\t269303\n张琳琳\t269304\n关羽\t269305\ndell戴尔\t269306\n博深工具\t269307\n2017年7月15日\t269308\nMaryland\t269309\ntinashe\t269310\n三角洲特种部队\t269311\n洛河\t269312\n辽宁大学研究生院\t269313\n配体\t269314\n欧亨利\t269315\n中宇卫浴\t269316\nCrayon\t269317\nshowing\t269318\n巧虎\t269319\n喷嚏网\t269320\n陶然居\t269321\n撞日\t269322\n补语\t269323\n鸿利\t269324\n蒋丰\t269325\nWiki\t269326\n菠萝派\t269327\nP3独胆\t269328\n路由来\t269329\n二级人力资源管理师考试\t269330\n漂移板\t269331\n德佑\t269332\n腋臭手术\t269333\n第50\t269334\n虎跃\t269335\n周彤\t269336\n2017年1月20日\t269337\nGlobal\t269338\n副题\t269339\n水星Mercury\t269340\n新碧血剑\t269341\n华融金控\t269342\n定题\t269343\nspo\t269344\n佑安\t269345\n中国科学报\t269346\n抵用\t269347\n月头\t269348\n劈山\t269349\n一不小心\t269350\n金鸡路\t269351\nGENERATION\t269352\n体育运动学校\t269353\n1.7L\t269354\n四川大学法学院\t269355\n羊城网\t269356\nIDX\t269357\n薄机\t269358\n夹叙\t269359\n京政办\t269360\nSergey\t269361\nArtCAM\t269362\nminGW\t269363\n雕塑公园\t269364\n聚类分析\t269365\n健康路\t269366\n荃\t2693
67\n外研社小学\t269368\n撸啊比亚迪f3\t269369\n万象府\t269370\n四弦\t269371\n淮南师范学院\t269372\nchemoffice\t269373\n表列\t269374\n渭源县\t269375\nCOUNTDOWN\t269376\n黄志明\t269377\n魁麟\t269378\nBEAMS\t269379\n了不起的挑战\t269380\n一人名\t269381\n男用\t269382\n中卷\t269383\n中共南阳市纪律检查委员会\t269384\n养蜂场\t269385\n雷司令\t269386\n刘国栋\t269387\n街道办事处\t269388\n爬刺\t269389\n泪石\t269390\n团头\t269391\nforestry\t269392\n陈国昭\t269393\n黄金版\t269394\n孙双金\t269395\nApriori\t269396\n驾呗\t269397\nmismatch\t269398\n都匀一中\t269399\n几平米\t269400\n系列赛\t269401\n新天俗人回档\t269402\n气体放电管\t269403\nmillan\t269404\nc90\t269405\n帝国2\t269406\n甘草\t269407\n肖申克\t269408\nguests\t269409\n电磁波谱\t269410\n阴剑\t269411\n多瑙\t269412\nsaige\t269413\n多诺万·米切尔\t269414\n文肉\t269415\n北大妇产儿童医院\t269416\n任意x1\t269417\n战风\t269418\n免疫治疗\t269419\n神钢挖掘机\t269420\nparse\t269421\nmz\t269422\nTone\t269423\n东施\t269424\n唐氏综合症\t269425\n资本家\t269426\nferragni\t269427\nInflammation\t269428\n上条当麻\t269429\n自控\t269430\n2599\t269431\n伟伦\t269432\n五大连池\t269433\n羸弱\t269434\n张剑秋\t269435\n勇敢者游戏:决战丛林\t269436\n金汉\t269437\n三星I9100\t269438\n1300D\t269439\nwenhua\t269440\n文件描述符\t269441\n真耶稣教会\t269442\n扎扎实实\t269443\nReduced\t269444\n本来生活网\t269445\n替诺福韦酯\t269446\n维奇网\t269447\n赛扬\t269448\n付款额\t269449\n中国铸造网\t269450\n战地2吧\t269451\n张阳阳\t269452\n美鼻\t269453\n亲爱的那不是爱情\t269454\n网球场\t269455\n002146\t269456\n二规\t269457\n性观念\t269458\njavascrip\t269459\n王台镇\t269460\n2月底\t269461\n第三十一条\t269462\n非甾体类抗炎药\t269463\n亨利八世\t269464\n绿地中心\t269465\n死苗\t269466\n江西长运\t269467\n欧克瑟\t269468\n相差无几\t269469\n私募债券\t269470\n幸亏\t269471\n三胺板\t269472\n龙泪\t269473\nwarz\t269474\nMercer\t269475\n泰安酒店\t269476\n父亲写的散文诗\t269477\nGLFW\t269478\n福建省政府采购网_福建省财政厅\t269479\n影像展\t269480\n一婚\t269481\n约翰·霍普金斯大学\t269482\n党要管党从严治党\t269483\n喧\t269484\nmp3吧_\t269485\n中府\t269486\nplotly\t269487\n500w\t269488\n吉剧\t269489\n万能图书馆\t269490\n汪清\t269491\n1196\t269492\n20182\t269493\n吃了吐\t269494\n趣赢娱乐\t269495\n四川大学华西医院\t269496\nRevolver\t269497\neat\t269498\n殒命\t269499\n养发馆\t269500\n长沙网\t269501\n联想y500\t269502\nチュ\t269503\n加分\t269504\n台属\t269505\n到底是什么\t269506\n傅满洲\t2
69507\n矶竿\t269508\n西安高新第一中学\t269509\n谢氏\t269510\n飞虎出征\t269511\n三周\t269512\n2.5级\t269513\n大同方特\t269514\n路旁\t269515\n养龙\t269516\n57所\t269517\n历险记\t269518\n信江\t269519\nUJN\t269520\n四面佛\t269521\n星球\t269522\n永康\t269523\n真空杯\t269524\n诗道\t269525\n万世套\t269526\nSurprise\t269527\n本菲卡\t269528\n壁膜\t269529\n孙鲁班\t269530\n范丞丞\t269531\nEPC\t269532\n黄冈小学\t269533\n着魔\t269534\n而今\t269535\nParties\t269536\nretired\t269537\n公文库\t269538\n交互式\t269539\n冻害\t269540\n陵水县\t269541\n岬太郎\t269542\nRECOVERY\t269543\n260亿元\t269544\n浙商总会\t269545\n银叶\t269546\n圆内\t269547\n同眠\t269548\ntfboys\t269549\nwinforms\t269550\n巴赫金\t269551\n28Hse\t269552\n野长城\t269553\nxinnet\t269554\n16周\t269555\n98路\t269556\n4桶\t269557\n伞骨\t269558\n无拘无束\t269559\n经济研究导刊\t269560\n中国社科院\t269561\n天府事变\t269562\n唐三藏\t269563\n王家桥\t269564\n推移\t269565\n虾滑\t269566\n戴尔公司\t269567\n连云港经济技术开发区\t269568\n前日\t269569\n两控\t269570\njuicer\t269571\nDone\t269572\njquert\t269573\nwebber\t269574\npayments\t269575\nwildfire\t269576\nvideosdesexo\t269577\n姊妹\t269578\nDMG\t269579\n幻响\t269580\n藏南\t269581\nextract\t269582\n北青山\t269583\n腐臭味\t269584\n6GB+64GB\t269585\n中利\t269586\n庶民\t269587\n前言\t269588\nsolidedge\t269589\n折射率\t269590\n商会大厦\t269591\nesper\t269592\n检出\t269593\nf612\t269594\n碧口\t269595\nprocrank\t269596\n八招\t269597\n6078\t269598\n增列\t269599\n比翼鸟\t269600\n中国国际工程咨询公司\t269601\nCLANNAD\t269602\n四川师大\t269603\n枪剑\t269604\n2400MHz\t269605\n凝血\t269606\n主干管\t269607\nhani\t269608\n太平保险公司\t269609\n大久\t269610\n恋情\t269611\n羞羞羞\t269612\n杨鸿年\t269613\n未来之路\t269614\n新疆维吾尔自治区安全生产监督管理局\t269615\n网游京华烟云\t269616\n畏光\t269617\n昆凌\t269618\nnanoscale\t269619\n鹿岛鹿角\t269620\n易创\t269621\n安次\t269622\n苏芒成吉思汗\t269623\n流放之路\t269624\nenabling\t269625\n北京故宫博物院\t269626\n几克\t269627\n影视\t269628\n兴庆路\t269629\n正版机\t269630\n瓦尔德内尔\t269631\n车太贤\t269632\n组合型\t269633\na1530\t269634\n新材料\t269635\n睐\t269636\n商业服务_E木业网\t269637\n阐述\t269638\n365淘房网\t269639\n发自肺腑\t269640\n宋仲基\t269641\n邋遢大王\t269642\n枫丹白露\t269643\n注写\t269644\na99\t269645\n生长激素\t269646\n最好呀\t269647\n花样游泳\t269648\n高中考\t269649\nsen\t
269650\n罗兹\t269651\ngridlayout\t269652\n嚎叫\t269653\nHdfs\t269654\n30kg\t269655\n新琉璃神社\t269656\n湖南政府\t269657\n水平仪\t269658\n一千度\t269659\n失责\t269660\n房地产经纪公司\t269661\n斗神\t269662\nZzz\t269663\n基因变异\t269664\n界龙实业\t269665\n索纳塔八\t269666\nHG8240\t269667\n礼拜天\t269668\n精明\t269669\napl\t269670\n张镐哲\t269671\nXIU\t269672\n锐视\t269673\n中国人寿云\t269674\n一步裙\t269675\n求职信\t269676\n艾尔建\t269677\n刘思琦\t269678\n拳击馆\t269679\n四盘\t269680\n社工证\t269681\nrecipe\t269682\n报停\t269683\n中海物业管理有限公司\t269684\n养花\t269685\n中国人民政治协商会议成都市委员会\t269686\n抛光膏\t269687\n无病\t269688\n通州小学\t269689\n橘红痰咳液\t269690\n翻糖蛋糕\t269691\n衣不蔽体\t269692\n单房\t269693\n双车道\t269694\n普锐斯\t269695\ncompeting\t269696\n娱乐百分百_\t269697\n496\t269698\n自摸\t269699\nカメラ\t269700\n这么快\t269701\npostmaster\t269702\nSmallpdf\t269703\n绝育\t269704\n中国软件\t269705\n转基因大豆\t269706\n案桌\t269707\n千风\t269708\n打折价\t269709\n塔门岛\t269710\n银之守墓人\t269711\njava虚拟机\t269712\n源数\t269713\n石碶街道\t269714\nprocessor\t269715\n征途2手游\t269716\n车辆违章查询\t269717\n统治\t269718\nfiddler2\t269719\n彝良\t269720\njiewu\t269721\n强拳\t269722\nakai\t269723\n金刚般若波罗蜜经\t269724\niOS11分屏\t269725\n14公分\t269726\n支座\t269727\n8002017款\t269728\n易数\t269729\n凯源\t269730\n3dtiles\t269731\n丰田阿尔法\t269732\n贺东\t269733\n鲘门\t269734\n蓝眼\t269735\n15座\t269736\n六戒\t269737\ncorrupted\t269738\n村淘\t269739\npaolo\t269740\n002354\t269741\n益者\t269742\n黄精\t269743\n羊皮纸\t269744\n响哥\t269745\n6:30\t269746\n明楼\t269747\n九江镇\t269748\n用券\t269749\ncubase5\t269750\n北京大学法学院\t269751\n忐忑不安\t269752\n风险\t269753\n三角龙\t269754\n李永新\t269755\n遥远的天熊山\t269756\n犯傻\t269757\n太阳能电动汽车网\t269758\n计生局\t269759\n传收\t269760\n西山希\t269761\n70载\t269762\n网络科技集团\t269763\n硫酸\t269764\n斯卡迪\t269765\n陆城\t269766\n西南科技\t269767\nother\t269768\nViews\t269769\n马尔蒂尼\t269770\n霍恩\t269771\n十粒\t269772\n鹞\t269773\n假冒\t269774\n中国军团\t269775\n惠威音箱\t269776\n圆锯机\t269777\n狮岭镇\t269778\nchaser\t269779\n斩马刀\t269780\n_天国\t269781\njoining\t269782\n卖艺\t269783\n翡翠玉石\t269784\n不到庭\t269785\nAMOLED屏\t269786\n童男\t269787\n猪人\t269788\n兴华\t269789\nglorious\t269790\n球具\t269791\n有的放矢\t269792\n统一性\t269793\
n泰罗卡\t269794\n夜\t269795\n时衿\t269796\nMybase\t269797\n王可可\t269798\n中国交建\t269799\n随州市\t269800\n叶欣桐\t269801\nBIOS+MBR\t269802\n小罗伯特唐尼\t269803\n脂质\t269804\nsn号\t269805\n中国食品土畜进出口商会\t269806\n宝元\t269807\n6610\t269808\n美少女花骑士吧\t269809\n许冠文\t269810\ndingtalk\t269811\n60周岁\t269812\n酒前\t269813\n译写\t269814\n花柱\t269815\n第几种\t269816\n角机\t269817\n雅迪尔\t269818\n隐隐约\t269819\n贾斯比尔盖茨\t269820\n街头霸王2\t269821\n强少\t269822\n2700K\t269823\n林铎\t269824\n谢孟伟\t269825\n第40届\t269826\n四小天鹅舞曲\t269827\n北京购物中心\t269828\n知事\t269829\n安线\t269830\nsilly\t269831\nasean\t269832\n丰达\t269833\n原稿\t269834\n王赞\t269835\nThin\t269836\n口醒\t269837\n中国质量认证中心\t269838\n几度\t269839\n全能战士\t269840\n流动人口婚育证明\t269841\nFranaise\t269842\n中国国际广播电台\t269843\n1.7万亿\t269844\n2000亩\t269845\n刮板机\t269846\n美道\t269847\n广东省教育考试院\t269848\n在熙\t269849\nc43\t269850\nav12AV在线\t269851\n648元\t269852\n兔小贝\t269853\n北大街\t269854\n路基箱\t269855\n常州地铁\t269856\n中管\t269857\n求通\t269858\nsetx\t269859\n节水型\t269860\n肖建华\t269861\n许浩\t269862\n33p\t269863\n错误码\t269864\n乌苏啤酒\t269865\n4.90分\t269866\nLulu\t269867\n范燕云\t269868\nHongkong\t269869\nAppCan\t269870\n宠主\t269871\n黄石新闻网\t269872\n县乡\t269873\n每15分钟\t269874\n巿\t269875\n野宫\t269876\n劳斯莱斯\t269877\n七大类\t269878\n60km\t269879\n织梦cms\t269880\n义乌物流公司\t269881\n石印\t269882\navi\t269883\n新华龙\t269884\n散热器\t269885\n手刷\t269886\n麻屯镇\t269887\nkendo\t269888\n吕忠梅\t269889\n维吾尔人\t269890\n黄金转运珠\t269891\n十八届中央委员会\t269892\narmhf\t269893\n封片\t269894\n你的故事\t269895\n山西省文物局\t269896\n反间谍\t269897\n痒痒\t269898\n酚红\t269899\n钴矿\t269900\n电击器\t269901\n济南车管所\t269902\nCock\t269903\n加成率\t269904\n尬舞机\t269905\n集成声卡\t269906\n发展阶段\t269907\n宽限\t269908\n铁屑\t269909\n腿脚\t269910\nCOD\t269911\nFSB\t269912\n传奇永恒\t269913\n广东省电力设计研究院\t269914\n超声骨刀\t269915\n铂略\t269916\n西洋菜\t269917\n精体\t269918\n氟化锂\t269919\n交口处\t269920\n福如东海\t269921\n浙江龙盛集团股份有限公司\t269922\n首评\t269923\n12槽\t269924\n广东省经济和信息化委员会\t269925\n第148集\t269926\n1882年\t269927\nadverb\t269928\n汉化补丁V2.0\t269929\n书吧小说网\t269930\n中修\t269931\n36E\t269932\n张青\t269933\n线性方程\t269934\n东北抗联\t269935\nHDMI接口\t269936\n150
米\t269937\n普莱德\t269938\n配电变压器\t269939\n三国志10吧_\t269940\n发展局\t269941\n150克\t269942\n九尾\t269943\nzaw\t269944\n相思曲\t269945\nopenstack\t269946\n山萸肉\t269947\n奥苏伯尔\t269948\n高梨康治\t269949\n老样子\t269950\n朗基\t269951\n自控力\t269952\n10万行\t269953\n夕云\t269954\nPhysX\t269955\n巡察\t269956\nWEC\t269957\n后援\t269958\n32次\t269959\n要地\t269960\ncpui5\t269961\n长沙学院\t269962\n水姻缘\t269963\n耀斑\t269964\n额度\t269965\n石氏伤科\t269966\n闪艺\t269967\nMasters\t269968\n案山\t269969\n1MB\t269970\nTg\t269971\n丽江花园\t269972\n实习医生格蕾第十四季\t269973\nbebe\t269974\nPropaganda\t269975\n廉江市人民政府\t269976\nMeals\t269977\n桩工\t269978\n下楼\t269979\n杂闻\t269980\n荣耀s9\t269981\n公斤力\t269982\n宠物文化节\t269983\nhalsey\t269984\n亚服服务器\t269985\n首都儿科研究所\t269986\n荣耀6x\t269987\n听歌\t269988\n妍色\t269989\n大者\t269990\nCaused\t269991\n小仙儿\t269992\nmemset\t269993\nBitfinex\t269994\n王勤\t269995\n彼得·林奇\t269996\n创业板上市\t269997\n驼峰命名法\t269998\n魂虫\t269999\nPoy\t270000\n恶毒德\t270001\n慈航\t270002\n孜然粉\t270003\n杨森\t270004\nsingular\t270005\nChemistry\t270006\n石岩汽车站\t270007\n黄埔区人民政府\t270008\n法航\t270009\n自主化\t270010\nfinally\t270011\nexcel点\t270012\n20170516\t270013\n信长之野望12革新\t270014\n法易网\t270015\n40包\t270016\n言情小说阅读网\t270017\n联程票\t270018\n石膏像\t270019\n教育经济与管理\t270020\n天猫极限词\t270021\n海鼎\t270022\n人纪\t270023\n联通电信\t270024\n格温\t270025\n护河\t270026\n千变\t270027\n麦马\t270028\nreverse\t270029\n603733\t270030\n房叔\t270031\n汤潮\t270032\n玉河\t270033\n主元\t270034\n真话\t270035\n6.36\t270036\nAk\t270037\nSDF\t270038\nvrtk\t270039\n人事招聘网\t270040\n房产税暂行条例\t270041\n9.9.9\t270042\nmiad\t270043\n李云鹏\t270044\n茄汁\t270045\n天目山路\t270046\nip代理软件\t270047\n小队\t270048\n杰奥\t270049\n羞愤\t270050\n龙湾区\t270051\n菁蓉国际广场\t270052\nhanyu\t270053\n11.7\t270054\n凝萃\t270055\n兴隆大厦\t270056\n温州市政府网\t270057\n门卫室\t270058\nLitigation\t270059\nQQ个性分组\t270060\n彪哥\t270061\n定义者\t270062\n必胜\t270063\n茯砖\t270064\n天兴工作室\t270065\n{bn}\t270066\n袋式\t270067\n物理量\t270068\n代斯\t270069\nABE\t270070\n小金丸\t270071\n1469\t270072\n节瓜\t270073\nmircrosoft\t270074\navnight\t270075\ntongji\t270076\n馆\t270077\nAdrenaline\t270078\nAL32UTF8\t27
0079\n燃气表\t270080\nZTO\t270081\n北京交通委\t270082\n三一大道\t270083\n徐州市教育局\t270084\nsolid\t270085\n王奕程\t270086\nbiubiu\t270087\n突出物\t270088\n陈丽佳\t270089\n血滴子\t270090\n一百二十\t270091\n武汉华星光电\t270092\nDeepMind\t270093\nTeacher\t270094\ndvi线\t270095\nMenus\t270096\n潍坊火车站\t270097\nHOLIDAY\t270098\n除数\t270099\nExcel公式\t270100\nV1.0.0\t270101\n孙力\t270102\n600562\t270103\n锃亮\t270104\n吃\t270105\n主题式\t270106\n2513\t270107\n恶魔小姐\t270108\n规劝\t270109\n四张\t270110\n床头灯\t270111\n幼儿舞蹈\t270112\n张裕解百纳\t270113\n半升\t270114\n涡喷发动机\t270115\nriff\t270116\n里查德米尔\t270117\n十一季\t270118\n王志华\t270119\n1rx8\t270120\n比德斯热水器\t270121\n金茂绿岛湖\t270122\n验电\t270123\n呦呦呦\t270124\nRs\t270125\n嘴亲\t270126\n边度\t270127\n千叶轮\t270128\n马洪\t270129\nwin7控制面板\t270130\n血球仪\t270131\ndelay函数\t270132\n四同\t270133\n十字军\t270134\n金差\t270135\nlistitem\t270136\n光衣\t270137\n26处\t270138\n后行\t270139\n空气能\t270140\n卡松\t270141\n松雷\t270142\n昆山公司\t270143\n国际滑联\t270144\n翔太\t270145\n宽创\t270146\n心机婊\t270147\n骶椎\t270148\n中共上海市闵行区委员会\t270149\n众议员\t270150\n虹化\t270151\n绿奴\t270152\n手机网易云音乐\t270153\nAERO\t270154\n翁\t270155\n831608\t270156\nC盘\t270157\n花与蛇2\t270158\n100KG\t270159\n养老服务业\t270160\nNH3\t270161\n金属性\t270162\n煤油灯\t270163\nMarkDown\t270164\n46元\t270165\n矿建\t270166\n红中\t270167\n旅游网\t270168\ncitations\t270169\nProgramData\t270170\n南广场\t270171\nautocad2013\t270172\n江阴人民医院\t270173\n长宁来福士广场\t270174\n静电\t270175\nFCC\t270176\n网友\t270177\n纳兰火影忍者ol\t270178\nnicolas\t270179\n康柏\t270180\n联席会议纪要\t270181\n徐州电视台\t270182\n台路\t270183\n1.3版\t270184\nXIB\t270185\n咸阳站\t270186\n广西大学\t270187\n水利志\t270188\nnavisworks\t270189\n黄天霸\t270190\n简美\t270191\naffix\t270192\nWE电子竞技俱乐部\t270193\n公路水运工程安全生产监督管理办法\t270194\n中分区\t270195\n宁夏师范学院\t270196\nAppChina\t270197\n宁围镇\t270198\n流连忘返\t270199\n20151019\t270200\n28天\t270201\nブログ\t270202\nHanson\t270203\n云树\t270204\nWin2000\t270205\n中国航天科工三院\t270206\n府志\t270207\n枪枪\t270208\n矢量蒙版\t270209\nCrocs\t270210\n陈玉建\t270211\n注册码序列号\t270212\nsuicide\t270213\n年轻的嫂子\t270214\nJieba\t270215\n拖飘\t270216\n面状\t270217\n服饰有限公司\t270218\n程宏\t27021
9\ndressing\t270220\n青龙湖镇\t270221\nnvr\t270222\n苏州地铁1号线\t270223\npppd\t270224\n怎个\t270225\n邮品\t270226\n撤销\t270227\n深圳网\t270228\n超能少年\t270229\n电话号码\t270230\nmobi格式\t270231\n新余国家高新技术产业开发区\t270232\n许杨玉琢\t270233\n毕业之家\t270234\n自受\t270235\nimtoken\t270236\n寄予\t270237\n升邪\t270238\nextjs5\t270239\n好多年\t270240\n均安镇\t270241\n姜建军\t270242\n省委政法委\t270243\nGovernment\t270244\n打胜仗\t270245\n毛肚火锅\t270246\n呈贡\t270247\ngfw\t270248\n钟启泉\t270249\n锦程\t270250\n管腔\t270251\n包窗套\t270252\nCMBS\t270253\nBlueStacks蓝叠安卓模拟器\t270254\n_包\t270255\n王士禛\t270256\n192K\t270257\n海事法院\t270258\n落脚\t270259\n佐佐木美优\t270260\n中体产业\t270261\n石壁\t270262\n尾包\t270263\n明令\t270264\n用以\t270265\nproteus8.0\t270266\nIJKPlayer\t270267\ncontentsize\t270268\n后半部\t270269\nconnect函数\t270270\n5万平方米\t270271\n注册值\t270272\n非与\t270273\n老聂\t270274\n铝圆片\t270275\n十只\t270276\n地委书记\t270277\n超乐场\t270278\n伎\t270279\n飞行力学\t270280\nbeanstalk\t270281\n多策\t270282\n狂狷\t270283\n腰伤\t270284\n中国医药教育协会\t270285\n耳套\t270286\n中国国电集团公司\t270287\n热敏快递\t270288\n薯饼\t270289\n花苑\t270290\n决议案\t270291\n即位\t270292\n白哥\t270293\n湖岸\t270294\n对对对\t270295\nsimone\t270296\n傲胜\t270297\n数字电压表\t270298\n普拉多论坛\t270299\n华北制药\t270300\n伯才\t270301\n二次能源\t270302\n跑死\t270303\nios7吧\t270304\n数控转塔冲床\t270305\n眉间纹\t270306\n征文稿\t270307\n同济大学人文学院\t270308\n发射\t270309\n7.0.8\t270310\n何家英\t270311\n北京妇产医院\t270312\n海南特区报数字报\t270313\nCheckpoint\t270314\n兰波\t270315\n宁波公安交管信息网\t270316\nL5420\t270317\nswiper\t270318\n超高\t270319\n掌法\t270320\n影音先锋资源_先锋影音电影_AV电影\t270321\n钛酸四丁酯\t270322\n梨落\t270323\n客运量\t270324\n石油管\t270325\nshrinking\t270326\n方孟敖\t270327\n林飞\t270328\n缘愁\t270329\n剑侠\t270330\n马坡\t270331\n狭小\t270332\n花印\t270333\n建兰中学\t270334\n政治建军\t270335\npipline\t270336\n冷链物流公司\t270337\n会声会\t270338\ndowntown\t270339\n陈志武\t270340\n蓝蝴蝶\t270341\nexcel2003\t270342\n默写表\t270343\n24项\t270344\n导气管\t270345\n黄师傅\t270346\n康氏\t270347\n鬼泣1\t270348\n汉道\t270349\nspr\t270350\n竞园\t270351\n未动\t270352\n绝世校花\t270353\n恶魔城苍月的十字架\t270354\n鸽业\t270355\n海蓝色\t270356\n天津街\t270357\n手卷烟\t270358\n想回家\t270359\n广岛市\t270360\n慧立
康\t270361\n生产技\t270362\n8月8号\t270363\n大打\t270364\n毒妃\t270365\n沪教牛津\t270366\n小弥\t270367\n饿殍\t270368\n张宝瑞\t270369\n民主管理\t270370\n抽贷\t270371\n潍坊人力资源和社会保障局\t270372\n康定机场\t270373\n功劳\t270374\n很荣幸\t270375\n熊青春\t270376\n十一点半\t270377\n标志符\t270378\n核岛\t270379\n韩冬\t270380\n0.65\t270381\n朔州市人民政府\t270382\n获录\t270383\n拉萨吧\t270384\n财政部税务总局\t270385\nwurst\t270386\n巨景\t270387\n评奖\t270388\n开涛学SpringMVC\t270389\n天天酷跑吧_\t270390\n凭安\t270391\n旺宝\t270392\n王君平\t270393\n法句\t270394\n阿米尔汗\t270395\n软件工程导论\t270396\n成都大悦城\t270397\n上海财经大学继续教育学院\t270398\n东里\t270399\n贴身保镖\t270400\n毛东东\t270401\nbroccoli\t270402\njasperReport\t270403\n穴位\t270404\n【7xfw撸撸\t270405\nPty\t270406\n2018季度\t270407\ng502\t270408\n光兵\t270409\nbuick\t270410\nyoungteen\t270411\nSOLDIER\t270412\n能线\t270413\nfloors\t270414\n广州市发展和改革委员会\t270415\n寻找罗麦\t270416\n我的世界生活大冒险\t270417\nywl925\t270418\n呈献\t270419\npsc\t270420\nwake\t270421\n君初\t270422\n插孔\t270423\n李渡镇\t270424\n蜀门雪铁龙c5\t270425\n第25次\t270426\nCnSCG\t270427\n青岛北站\t270428\n下类\t270429\n柏桦\t270430\n长腿\t270431\n天津中德职业技术学院\t270432\n陈佳影\t270433\n贡献榜\t270434\n莫名\t270435\n4500万\t270436\nLH服\t270437\n锦江国际(集团)有限公司\t270438\n14cm\t270439\n提拉米苏蛋糕\t270440\nBorg\t270441\n梅花洲景区\t270442\n艾饺\t270443\n形字\t270444\nbiological\t270445\n北京中科院\t270446\n萌妻食神\t270447\n长治久安\t270448\n1863074\t270449\n五角\t270450\n金考网\t270451\n人力资源网\t270452\n26.3\t270453\n共享群\t270454\n中文在线翻译\t270455\n双江\t270456\n八奇技\t270457\n及奖\t270458\n管弦\t270459\n9025\t270460\n程控\t270461\n跑腿服务网\t270462\n泰奇\t270463\n热火朝天\t270464\n祥情\t270465\nspecifically\t270466\n灌阳县\t270467\nood\t270468\n分水器\t270469\n全美超模大赛\t270470\n红烧豆腐\t270471\nbangbangbang\t270472\n合租房\t270473\n白盖\t270474\nGDDR5\t270475\n效方\t270476\nx光\t270477\nPortraits\t270478\n麻纱\t270479\nfindViewById\t270480\n金融城\t270481\nacceptable\t270482\n枪法\t270483\n四秒\t270484\nv8m\t270485\n代编\t270486\n8l\t270487\n李墨\t270488\nTbook\t270489\n座落\t270490\n惯常\t270491\n美特\t270492\nafd\t270493\n鱼峰区\t270494\n七秒钟\t270495\n浦发银行公司\t270496\n龙珠z\t270497\n杭州师范大学附属医院\t270498\n600路\t270499\n和美团\t270500\n重办
\t270501\nTheShy\t270502\n第四行\t270503\n血牙\t270504\nMoldflow\t270505\n验钞机\t270506\n北京市海淀区政府\t270507\nshs\t270508\n首专\t270509\n200幅\t270510\n劲敌\t270511\n评剧\t270512\n不避\t270513\n出血量\t270514\ngpg\t270515\n鮟鱇鱼\t270516\n艾康\t270517\n阿托品\t270518\nbandgap\t270519\n奥比奥\t270520\n美国迪士尼\t270521\n草案\t270522\n追求\t270523\n陈省\t270524\n_逗游攻略中心\t270525\n长江之歌\t270526\n广药\t270527\n52级\t270528\n唐山百姓网\t270529\n多看\t270530\n交通事故保险理赔\t270531\n阻却\t270532\n西雅特\t270533\n大航海时代4\t270534\nNeighborhood\t270535\n温哥华市\t270536\n远视\t270537\n破茧\t270538\n调味品\t270539\n三菱电机空调\t270540\n时代性\t270541\n李金羽\t270542\n洗面台\t270543\n八条\t270544\n鲁伟鼎\t270545\nLucid\t270546\n老路\t270547\n第42期\t270548\n湖北工程学院\t270549\n烧瓶\t270550\nDEM\t270551\n首钢集团\t270552\n五校联考\t270553\n莫度\t270554\nSpirent\t270555\n沙滩帽\t270556\n亲朋\t270557\n苏州百姓网\t270558\n乳魂\t270559\n京瓷哲学\t270560\nsucc\t270561\n澳门大学\t270562\nSodium\t270563\n宣理\t270564\n$scope\t270565\n周延\t270566\nwowgear\t270567\n向阳小学\t270568\n易记\t270569\n220伏\t270570\nWinscp\t270571\n卤子\t270572\n蒙派克\t270573\n课程群\t270574\n白边填充剂\t270575\n不用人\t270576\nido.3mt.com.cn\t270577\n第一百零八章\t270578\n电解器\t270579\nO3\t270580\n石家庄住房公积金管理中心\t270581\n抚养权\t270582\n宾州州立大学\t270583\n三国志13威力加强版\t270584\n42级\t270585\n徐虹\t270586\n山西焦煤集团\t270587\nisnot\t270588\n求人\t270589\n京东万象咨询中心\t270590\n蝶之毒华之锁\t270591\n孕婴幼\t270592\n打屁股针\t270593\n加油卡\t270594\n说谎的人\t270595\n③\t270596\n服务厅\t270597\n青木美空\t270598\n拳皇14\t270599\n女医\t270600\n流函数\t270601\nGarena\t270602\n垫石\t270603\n城市总体规划\t270604\n遂行\t270605\n大慈岩\t270606\n模仁\t270607\nOAuth2\t270608\n旺信\t270609\n王晓林\t270610\n进攻性\t270611\n天蓬网\t270612\n爱他美的\t270613\n塔玛拉\t270614\n老锐志\t270615\nUpgrade\t270616\n藏传佛教\t270617\nfishingplanet\t270618\n多玩键盘连点器\t270619\n耕毅\t270620\n狼皮\t270621\n打搅\t270622\n锌锭\t270623\n苹果电\t270624\n外交关系\t270625\n陈冠霖\t270626\n俱全\t270627\nOK\t270628\nconsulate\t270629\n100MW\t270630\n乐其\t270631\n图审\t270632\n二代火影\t270633\n贵州省民政厅\t270634\n天洲\t270635\n天谴\t270636\n比亚迪宋论坛\t270637\n鸽主\t270638\n余杭区第一人民医院\t270639\n百万家\t270640\n硅谷动力\t270641\n茶泡\t270642\n六朝\t270643\nBallad\
t270644\n苟\t270645\n浦东图书馆\t270646\n石家庄市城乡规划局\t270647\n调资\t270648\n兵役\t270649\n外齿\t270650\n张世明\t270651\n色门\t270652\n永川市\t270653\n温带大陆性气候\t270654\n靓妹\t270655\n2166\t270656\n_小新\t270657\n急性白血病\t270658\n转需\t270659\n孝歌\t270660\n超能继承者\t270661\n除颤\t270662\n黑渍\t270663\n父子表\t270664\n1860\t270665\n叶蝉\t270666\n无线版\t270667\n深圳小学\t270668\n沙市区\t270669\n8月底\t270670\n苏州市人力资源和社会保障局\t270671\n兵哥\t270672\n朗庭\t270673\nJudgement\t270674\n埃梅里\t270675\n第306批\t270676\n学精\t270677\n串联型\t270678\n行为学\t270679\nshun\t270680\nration\t270681\n招嫖\t270682\n无腿\t270683\n八口\t270684\n很好奇\t270685\n金川区\t270686\n王滢\t270687\n长实\t270688\nhgu\t270689\n长城M4\t270690\n天台寺\t270691\n恶狗\t270692\n槐木\t270693\n扶贫办\t270694\n大V\t270695\n岁次\t270696\n卧病\t270697\n黄海波\t270698\n四川中医\t270699\n高科集团\t270700\n0372\t270701\n飘雨\t270702\n麻黄草\t270703\n19天\t270704\n微鱼\t270705\nclnchanpin\t270706\nVKY\t270707\nVOID\t270708\n张玉良\t270709\n网上购物商城\t270710\n知网阅读器\t270711\n碳粉盒\t270712\n装腔\t270713\nadc\t270714\n莲子百合\t270715\n西辽\t270716\n来广营\t270717\nsendredirect\t270718\n99se\t270719\n囧图\t270720\n巨龙\t270721\n小虎\t270722\n非递归算法\t270723\nAB胶\t270724\n回向\t270725\n娄门\t270726\n鼓角\t270727\n故乡的原风景\t270728\n交流会\t270729\n关进\t270730\n溜走\t270731\nU乐\t270732\n片库\t270733\n后来者\t270734\n小德\t270735\n2014年7月份\t270736\nО\t270737\n成都市高新区\t270738\n重油污\t270739\n琨御府\t270740\n四院\t270741\nlr11\t270742\n武邑县\t270743\n脊髓型\t270744\n足月\t270745\n借趣花\t270746\n鼎信通达\t270747\n唐悠悠季枭寒\t270748\n專業\t270749\n乐在\t270750\n千层蛋糕\t270751\n天龙屯堡\t270752\nab型血\t270753\n新的生活\t270754\n蒙迪欧\t270755\n265G苹果网\t270756\n中西餐\t270757\n最后通牒\t270758\n防雷器\t270759\n鸡胸\t270760\n押\t270761\n【民族团结一家亲\t270762\n轻工品\t270763\ncrop\t270764\n芜湖人才网\t270765\n疏港路\t270766\n建水县\t270767\n第八十条\t270768\n农业银行掌上银行\t270769\n英雄三国\t270770\nLOLI\t270771\n大贤者\t270772\nBravo\t270773\n辽阳西路\t270774\nMonitoring\t270775\n立体结构\t270776\n上海市政府\t270777\n四川成都中国青年旅行社\t270778\n龙腾新闻网\t270779\n悲叹\t270780\n八万块\t270781\n姚军红\t270782\n储\t270783\n外港\t270784\n神龛\t270785\n月猫\t270786\nMIIX\t270787\n江西省文化厅\t270788\n4第一章\t270789\n眉户\t270790\n浅灰\t27079
1\n希阿荣博堪布\t270792\n酷友网\t270793\n健身网\t270794\n嘉宏\t270795\n手机锂电池\t270796\n水豆腐\t270797\n熟地\t270798\ntray\t270799\n顶楼的大象\t270800\n红包包\t270801\nsmokeping\t270802\n宁洛高速\t270803\n木雪\t270804\ngenymotion模拟器\t270805\n未婚女孩\t270806\n推币机\t270807\n广海社区\t270808\n连不上去\t270809\nc89\t270810\n最炫民族风\t270811\n张玉明\t270812\n林帅\t270813\n扣费\t270814\n罗斯玛丽\t270815\n绝路\t270816\n鹭燕医药\t270817\n百圆裤业\t270818\n紧\t270819\nContractors\t270820\n三元函数\t270821\n东方cj\t270822\n刘兵克\t270823\nCheating\t270824\n从中\t270825\n贵妇膏\t270826\nMongolia\t270827\n认真的雪\t270828\nXilinx\t270829\n9ask.cn\t270830\n即食燕窝\t270831\n56pao\t270832\n高压柱塞泵\t270833\n波特酒\t270834\n100万人次\t270835\n川渝\t270836\n在摇\t270837\n单向板肋梁\t270838\nMis\t270839\n河南锐之旗信息技术有限公司\t270840\n黄金搭档\t270841\n范巴斯滕\t270842\n海伦哲\t270843\n大雾\t270844\n839\t270845\n07级\t270846\na=5\t270847\n刘和平\t270848\n免邮\t270849\n4288\t270850\n北老九\t270851\nSciences\t270852\n晃倒\t270853\n红海滩\t270854\nhfyfpga\t270855\n暗柱\t270856\n平淡期\t270857\n木台\t270858\n淋浴房\t270859\n这件事\t270860\n叶蓝\t270861\n李里\t270862\n冬云\t270863\n外场\t270864\n减益\t270865\n爪子\t270866\n创投\t270867\n日本站\t270868\n荷花镇\t270869\nclassified\t270870\n明酸\t270871\nCGWANG\t270872\n资产端\t270873\nlcx\t270874\n未央\t270875\n排畸\t270876\noncotarget\t270877\nfleshlight\t270878\n地头\t270879\n蓝屏\t270880\n两学一做学习教育千题知识竞赛题集\t270881\nCTU\t270882\n许仕林\t270883\n≠\t270884\n韩坤\t270885\n王者荣耀女英雄去衣图\t270886\n十四项\t270887\n净月\t270888\nTSD\t270889\n选一选\t270890\n匀速运动\t270891\n金屋英雄志\t270892\n秋波\t270893\n果粒\t270894\nv9.1.0\t270895\n1080P在线观\t270896\n气动压力机\t270897\n涉黄\t270898\n花鲢鱼\t270899\nxitong\t270900\n赛利亚\t270901\n腰椎间盘突出\t270902\n要挟\t270903\n马尾藻\t270904\n发展战\t270905\n浙江工商职业技术学院\t270906\n净土大经科注\t270907\n260个\t270908\n游戏王GX\t270909\n检查法\t270910\n磷酸奥司他韦\t270911\n眉梢\t270912\n买通\t270913\nSmack\t270914\n叶昭\t270915\n上海市工商行政管理局\t270916\n谱集\t270917\nA7M3\t270918\n警容\t270919\n自由现金流\t270920\nherman\t270921\nVue2\t270922\n一走\t270923\n怀邵衡\t270924\n金刚2\t270925\n移不动\t270926\n克鲁格国家公园\t270927\n广州市教育局\t270928\n腰刀\t270929\n证监局\t270930\nmacy\t270931\n天津和平区\t270932\n喜色\t27093
3\n无挡\t270934\n茶匙\t270935\n酷云\t270936\n永生之酒\t270937\n房地产税\t270938\n无锡市人民医院\t270939\n上海市自行车行业协会\t270940\n戊烯\t270941\n天通苑社区\t270942\n单倍体\t270943\n雷凌论坛_汽车之家论坛\t270944\n多维元素片\t270945\n岩画\t270946\n寿昌镇\t270947\n改频\t270948\ndng\t270949\nstand\t270950\n赵轶东\t270951\n钓鱼论坛\t270952\n3时\t270953\n一锤\t270954\n编程百科\t270955\n蔡家村\t270956\ncologne\t270957\n支招\t270958\nvalidate\t270959\n良辰美景\t270960\n福州地区\t270961\n电影频道\t270962\nVIOLATION\t270963\n顶罪\t270964\n刘嘉倪\t270965\n咪咕盒子\t270966\n耶律洪基\t270967\nLinux编程_Linux\t270968\n取缔\t270969\n浮油\t270970\nssat\t270971\n龙二\t270972\ndaq\t270973\n国家中长期人才发展规划纲要\t270974\n提琴谱\t270975\nCabeza\t270976\n苏武传\t270977\n李贤\t270978\n希尔瓦娜斯·风行者\t270979\n滨江公园\t270980\n沈阳医大二院\t270981\n外格\t270982\n织金\t270983\n刚体转动惯量\t270984\nC30\t270985\n看拼\t270986\nMARKET\t270987\nGOM\t270988\n大温\t270989\n一八年\t270990\n很容易\t270991\n256个\t270992\nDFS\t270993\n排气歧管\t270994\n联合中文\t270995\n中国人民人寿保险\t270996\n催乳师\t270997\n农药残留检测仪\t270998\n九乡\t270999\n错币\t271000\nrepmat\t271001\n石友\t271002\n畅所欲言\t271003\n岁\t271004\nv4.1\t271005\nBeta1\t271006\n竞争性谈判\t271007\n紫外检测器\t271008\n灰线\t271009\n上来\t271010\n新官\t271011\n投标保函\t271012\n未婚同居\t271013\n嘉峪关关城\t271014\n工商银行银行\t271015\n天开\t271016\nclonal\t271017\n北玻\t271018\n总资产报酬率\t271019\n麒麟湾\t271020\nwebox\t271021\n1.2.7\t271022\n国粮\t271023\n真人娃娃机\t271024\n消光剂\t271025\n易制毒化学品\t271026\n电极板\t271027\n酮体\t271028\n忆江南\t271029\nmint\t271030\n怡然\t271031\n6370\t271032\n3802\t271033\nLogCat\t271034\n华天大酒店\t271035\n返景\t271036\n1.2T\t271037\n支队\t271038\n加到\t271039\n自侦\t271040\n混日子\t271041\n红色叹号\t271042\n四季花城\t271043\n地震学\t271044\nEasyDarwin\t271045\n600克\t271046\n解锁期\t271047\ntraefik\t271048\n中江路\t271049\n216路\t271050\n爱丽舍宫\t271051\n赵信\t271052\n除息\t271053\n肝胆\t271054\nxiaozhu\t271055\nNina\t271056\n小阳\t271057\n城城理财\t271058\n美容美妆\t271059\n1684\t271060\n泰坦龙\t271061\nDSC-RX100\t271062\n比尔盖茨\t271063\n砟\t271064\n滞后期\t271065\n有钱吗_\t271066\n葡萄叶\t271067\n玉肌\t271068\n高球\t271069\n方面\t271070\n沃百富\t271071\n质量调查\t271072\n大石桥\t271073\nRAV4\t271074\n结节性痒疹\t271075\n西海岸新区\t271076\n中
共安徽省委组织部\t271077\nleo\t271078\n全职高手特别篇\t271079\nJKL\t271080\n京乐\t271081\n300033\t271082\n须子\t271083\n14阶\t271084\n20160321\t271085\n广告词\t271086\n宁夏银川一中\t271087\nrunserver\t271088\n李美静\t271089\n娉\t271090\n第48号\t271091\n懒觉\t271092\nattract\t271093\n梦娃\t271094\n艾蒿\t271095\n昆山日报数字报\t271096\ngyro\t271097\n和州\t271098\nchildhood\t271099\n601106\t271100\n极力\t271101\n特贴\t271102\n扬州网\t271103\n准头\t271104\n惊悚乐园\t271105\n洞内\t271106\n雷克萨斯GX\t271107\n塞内加尔\t271108\ncidr\t271109\ncjt\t271110\n工具店\t271111\n梦田\t271112\n标题头\t271113\n肛交\t271114\n红版\t271115\nthomas\t271116\n20180425\t271117\nsuppressed\t271118\n宏星\t271119\n等你等了那么久\t271120\n入党志愿书\t271121\n进球数\t271122\n低潮\t271123\n卡玛斯\t271124\n计划经济\t271125\n侵入式\t271126\n京牌车\t271127\n云邦黑枸杞网\t271128\n光哥\t271129\n附视\t271130\n怪兽电力公司\t271131\n断片\t271132\ntef\t271133\n绍兴站\t271134\nClans\t271135\n鸡蛋煎饼\t271136\n湖南大学化学化工学院\t271137\n榨出\t271138\n机关党工委\t271139\nHyFi\t271140\n防冻\t271141\n不疯魔\t271142\n抬杆\t271143\n主意\t271144\n板城\t271145\n老化\t271146\n反震\t271147\n建筑材料\t271148\n千股\t271149\n轻宠\t271150\nora-12514\t271151\nPPV\t271152\n开州区\t271153\n琉璃子\t271154\n木犀\t271155\n岳州\t271156\n8天7夜\t271157\n东方万汇城\t271158\n杨英豪\t271159\n创展网\t271160\n转喻\t271161\nHaNi\t271162\n灭战者\t271163\n艾丽丝·门罗\t271164\n法美\t271165\n相女\t271166\n湖景\t271167\n【君越\t271168\n待考\t271169\nCocos2d-JS\t271170\n互站网\t271171\n中小说\t271172\n伦晚脐橙\t271173\n一吐\t271174\n中国医疗\t271175\n李治廷\t271176\n少年\t271177\n云软件园\t271178\nwebscoket\t271179\n阿里嘎\t271180\n方盘\t271181\n洹水\t271182\n负笈\t271183\n自雇\t271184\n华策\t271185\n状元秀\t271186\n魁罡\t271187\nseekg\t271188\n母亲的手\t271189\n大航海时代2\t271190\n美丽的邂逅\t271191\n北京马驹桥\t271192\nmongod\t271193\n文宇\t271194\n几内亚比绍\t271195\nhygiene\t271196\nAcFun弹幕视频网\t271197\n外币\t271198\n2KOL\t271199\n氢氧化铁\t271200\n远流\t271201\n6.4_\t271202\n上海市电力公司\t271203\nziva\t271204\n川流\t271205\negde\t271206\n小坝\t271207\ngruppe\t271208\n调整率\t271209\n52avav\t271210\n浦银安盛\t271211\nAOF\t271212\n星台\t271213\n100位\t271214\n石景山古城\t271215\n金版\t271216\n张婷婷\t271217\ndirect3d\t271218\n薛立山\t271219\n烛龙殿\t271220\n伴侣\t271
221\n包商银行\t271222\n隔离室\t271223\n20小时\t271224\n光大银行\t271225\n大徒弟\t271226\n极速浏览器\t271227\npcb板\t271228\nInfrastructure\t271229\n6130\t271230\n新英雄级\t271231\nGVLK\t271232\n1.14亿\t271233\n青枯病\t271234\n黄阅\t271235\n河鳗\t271236\n雅圆\t271237\n100万家\t271238\n联合行动\t271239\n公犬\t271240\n金德\t271241\n12018\t271242\n好漂亮\t271243\n3MOD\t271244\n简丹\t271245\n体育老师\t271246\n箅子\t271247\n分手大师\t271248\n魔法门吧\t271249\n斗龙战士3龙印之战\t271250\n陈后金\t271251\n杯盘\t271252\n拙见\t271253\n胖女\t271254\nvia\t271255\n冗员\t271256\n迎宾员\t271257\n宿迁市政府\t271258\n大本营\t271259\n264L\t271260\n试播集\t271261\n上海中山公园\t271262\n临终前\t271263\n副将\t271264\n小鲜肉\t271265\n表结法\t271266\n那些年\t271267\n王水照\t271268\n丽君\t271269\n深圳麦格米特电气股份有限公司\t271270\n一个者\t271271\n新5系\t271272\n折漂\t271273\n佐佐木六月\t271274\n名豪\t271275\n考试说明\t271276\ntty\t271277\n上海分院\t271278\nEVUS登记\t271279\n法相仙途\t271280\n王治国\t271281\n大数\t271282\n手机齐家网\t271283\n药水哥\t271284\nnoVNC\t271285\n动人心弦\t271286\npfu\t271287\n恒言中文网\t271288\n八法\t271289\n行吧\t271290\n3C数码\t271291\n头饰\t271292\n包先生\t271293\nefgh\t271294\n草履虫\t271295\n1784\t271296\n家国情怀\t271297\n华人民共和国\t271298\n佳偶\t271299\n电离能\t271300\n自强\t271301\nKlaus\t271302\n自动送料机\t271303\n磷脂双分子层\t271304\n西甲\t271305\nopera浏览器\t271306\n献礼\t271307\n畅园\t271308\n三观不正\t271309\n灵山寺\t271310\n测序仪\t271311\n购房网\t271312\nedd\t271313\n吔\t271314\nChapman\t271315\nnato\t271316\n安徽路\t271317\n实用电话\t271318\n1302\t271319\n100X\t271320\n朱亚杨绛\t271321\n5050\t271322\n宣州先锋网\t271323\n2吨\t271324\n张本煜\t271325\nexosip\t271326\n751\t271327\n会计分期\t271328\n滴汗\t271329\n民间小调\t271330\n$row\t271331\n年均增长率\t271332\n常人\t271333\n铜川市新区\t271334\nAdvice\t271335\n第二中学\t271336\n種\t271337\n崇文小学\t271338\n左门\t271339\n广场路\t271340\naoe\t271341\n淮安信息职业技术学院\t271342\n34岁\t271343\n4.4平衡口\t271344\n成灾\t271345\n神焰\t271346\n女情人\t271347\nchuck\t271348\n发生额\t271349\nstruct\t271350\ndeluge\t271351\n巨卡\t271352\nupdating\t271353\n萧生\t271354\n蒂塔万提斯\t271355\n合肥市第一人民医院\t271356\n赤桑镇\t271357\n血细胞分析仪\t271358\n9万吨\t271359\n心剑\t271360\n银钻\t271361\n沈懿\t271362\n基数词\t271363\n毒蛊\t271364\n15处\t271365\nWEYWEY\t27136
6\ncain\t271367\n丁桥镇\t271368\n等一个人\t271369\n中国民用航空西南地区管理局\t271370\n风吹雪\t271371\n五渔村\t271372\nwin7系统封装\t271373\n雅马\t271374\nGenerated\t271375\n长耳\t271376\n静物\t271377\n博拉\t271378\n俞熔\t271379\n树童\t271380\n蒙太奇\t271381\n福建消防\t271382\n吴依霖\t271383\n义州\t271384\n新加波\t271385\n红酒杯\t271386\n莫干山路\t271387\nw530\t271388\n彻夜\t271389\n集美\t271390\n汕头\t271391\n环氧富锌底漆\t271392\n海北州\t271393\n1.1_\t271394\n浙商财富中心\t271395\n20592\t271396\nimassage\t271397\n藏头诗\t271398\n专清\t271399\n新SSS级喰种\t271400\narbitrage\t271401\n后脑\t271402\nbft\t271403\ndd373\t271404\n300ppi\t271405\n小儿肺炎\t271406\nDStream\t271407\n王岗镇\t271408\n7881\t271409\nghost32\t271410\nVoluntary\t271411\nDao\t271412\n最短\t271413\n有画\t271414\n夏至未至\t271415\n模拟测试卷\t271416\nlaying\t271417\nDexter\t271418\ncarp\t271419\n导航卫星\t271420\n地址簿\t271421\n只要有你\t271422\ndof\t271423\n生缘音乐网\t271424\n弘晖\t271425\n明朝那些事\t271426\n三山五岳\t271427\n1加仑\t271428\n胆量\t271429\nfapiao\t271430\n凤梨花\t271431\nr7000p\t271432\n77期\t271433\n森马服饰\t271434\n忽必烈\t271435\nwix\t271436\n第162集\t271437\nyouxia\t271438\n上六天\t271439\n对刀仪\t271440\n铁齿铜牙纪晓岚2\t271441\n周公旦\t271442\n新大洲本田裂行\t271443\n三齐儿童网\t271444\n区系\t271445\n一次性餐具\t271446\n鲁班软件\t271447\n落料\t271448\n长岛\t271449\n【海\t271450\n滋滋\t271451\n云路\t271452\n保留期\t271453\n人民公仆\t271454\n凤舞文学网\t271455\n第39期\t271456\n糖果屋\t271457\n颜色板\t271458\n来这里\t271459\nprtg\t271460\n▋\t271461\n杨庄村\t271462\ndetective\t271463\n迁建\t271464\n蒋村街道\t271465\n找歌\t271466\n皮下血肿\t271467\n湖北文理学院\t271468\n大毛\t271469\n北京华联集团\t271470\n长城证券\t271471\nevelyn\t271472\n广东药科大学附属第三医院\t271473\n弃妻\t271474\n江西理工大学\t271475\n辽宁石油化工大学\t271476\n莫纳什大学\t271477\nYH\t271478\n赵大\t271479\n岛形\t271480\n陈主任\t271481\n阿朗\t271482\n梧桐山\t271483\n上等\t271484\n眼泪\t271485\nKEMEI\t271486\n济南市人民政府\t271487\n曹伟\t271488\n阿伯丁大学\t271489\n雅诗兰黛\t271490\n氢燃料\t271491\n交直流\t271492\n青山控股集团\t271493\n优师\t271494\n醒\t271495\n避暑\t271496\nshopify\t271497\nViral\t271498\n3551\t271499\n护套管\t271500\n森川アンナ\t271501\n上不下\t271502\n第139期\t271503\nHFS+\t271504\n智慧化\t271505\n图兰\t271506\n描子\t271507\n全从者\t271508\n阅读史\t271509\
n福田口岸\t271510\n父女恋\t271511\npraise\t271512\n皮埃罗\t271513\n干砌石\t271514\n默契度\t271515\n阿诺德·施瓦辛格\t271516\n不求\t271517\n八宝镇\t271518\n宝盖头\t271519\n朗诗寓\t271520\nheinz\t271521\n165\t271522\n中国教育干部网络学院\t271523\n孔方\t271524\n国开行\t271525\n夏老师\t271526\niterm2\t271527\n口令牌\t271528\n固镇\t271529\n兰轩儿\t271530\nwow宏\t271531\n天元\t271532\n才够味\t271533\n建水古城\t271534\nNino\t271535\n华阳镇\t271536\n表妹\t271537\nmql4\t271538\n巴图鲁\t271539\nseat\t271540\n唐世平\t271541\nhost头攻击\t271542\n渣滓\t271543\n拨给\t271544\n法码\t271545\n手机包\t271546\n黑暗執行緒\t271547\nItalic\t271548\n上野\t271549\n宝湾\t271550\n24分\t271551\n威盘\t271552\n屠洪刚\t271553\n乞丐王子\t271554\nscofield\t271555\n婚恋文\t271556\n道州\t271557\n博尔塔拉政府网\t271558\n织里镇\t271559\n会声会影x5\t271560\n南平市发改委\t271561\n减脂\t271562\n平山区\t271563\n将将\t271564\n单证员考试\t271565\npdi\t271566\n组合式\t271567\ntoggleClass\t271568\ntdi\t271569\nNDS\t271570\n三信\t271571\n牙龈炎\t271572\nqca9377\t271573\n毛氏红烧肉\t271574\n锻压机\t271575\nODPS\t271576\ntenga\t271577\n刘炳森\t271578\n雨中\t271579\n张掖大佛寺\t271580\nBanned\t271581\nYou-Get\t271582\n松岭\t271583\n天思\t271584\nFILA\t271585\njm\t271586\n真房\t271587\n大重九\t271588\n结日\t271589\n布克\t271590\n展柜\t271591\n汶上县\t271592\n土木堡之变\t271593\nsuapp\t271594\n新联\t271595\nliz\t271596\n自黑\t271597\nstvd\t271598\n观鲸\t271599\n6月1号\t271600\n超大型\t271601\nszc\t271602\n海帕\t271603\n干休\t271604\n八十一\t271605\n键词\t271606\n传播学教程\t271607\n何智丽\t271608\n财奴\t271609\niphone10\t271610\n湖北消防\t271611\n罗茨泵\t271612\n海头镇\t271613\n平型关战役\t271614\n网海\t271615\n电脑知识与技术\t271616\n银行存款余额调节表\t271617\n师达中学\t271618\nstm32l151\t271619\n多媒体类\t271620\n工商行\t271621\n莺啼\t271622\nMoral\t271623\n百科知识\t271624\nrx1r\t271625\nColumbia\t271626\n赌城群英会\t271627\n胡玮\t271628\n观点库\t271629\nsip\t271630\n比克提尼\t271631\n运转\t271632\n受欺负\t271633\n阳光财产保险股份有限公司\t271634\n稻川夏目\t271635\n衔接\t271636\n西下\t271637\n补风\t271638\nphenix\t271639\nWiF\t271640\n远\t271641\n国英雄\t271642\nmt吧\t271643\nfset\t271644\n120英寸\t271645\n正觉\t271646\n王重阳\t271647\n武庚纪第二季\t271648\n观音堂\t271649\n从军行\t271650\n中弹\t271651\naraxis\t271652\n526路\t271653\n安阳钢铁\t271654\n6603\
t271655\n情欲诱惑\t271656\n气带\t271657\n大连港集团\t271658\n加金\t271659\n我家网\t271660\n优利\t271661\n三国演义吧\t271662\n不败\t271663\n西本\t271664\n葛岭\t271665\n说走就走\t271666\n中华广场\t271667\n始发\t271668\n卓宝\t271669\n规建\t271670\n制件\t271671\n九婴\t271672\n张公公\t271673\n维生素C泡腾片\t271674\nBMS\t271675\n外来\t271676\n茶园新区\t271677\n康丽根\t271678\n乳头状瘤\t271679\n挤兑\t271680\n镇赉县\t271681\n黑圈\t271682\n2018年3月7日\t271683\n君岛\t271684\n收银机\t271685\n鲁朗\t271686\n被套牢\t271687\n真木\t271688\n33厘米\t271689\n超调\t271690\n376\t271691\n中商情报网\t271692\n探视权\t271693\nDOBBY\t271694\nyoue\t271695\n铁磁\t271696\n暖暖的新家\t271697\n俄克拉荷马城\t271698\n毕业论文管理系统\t271699\n细胞型\t271700\n古事\t271701\n凶犯\t271702\njenkens\t271703\n百度在线\t271704\n江苏省人民政府办公厅\t271705\n汇师\t271706\neXeScope\t271707\n幸运吧网\t271708\nchar*\t271709\n脑游戏\t271710\n线程数\t271711\nGSK\t271712\nZhi\t271713\n黑房\t271714\ndudeyouth\t271715\nfaculty\t271716\n横截面\t271717\n抹茶星冰乐\t271718\n为本\t271719\ngae\t271720\n动动计步器\t271721\n变帅\t271722\n黄峥\t271723\n浑身解数\t271724\n收藏篇\t271725\n2.pdf\t271726\nDairy\t271727\n强生控股\t271728\n华沙起义\t271729\n车辆识别代号\t271730\n张民生\t271731\n凶恶\t271732\n憨笑\t271733\n8254\t271734\n平均值\t271735\n中国科学院上海生命科学研究院生物化学与细胞生物学研究所\t271736\n家具人才网\t271737\n昆明高新区\t271738\nm90\t271739\n沈尹默\t271740\n人中之狼\t271741\n加盟网\t271742\n24盘位\t271743\n蛙鸣\t271744\nPayday\t271745\n统一战争\t271746\n6日游\t271747\n成长股\t271748\n闵行中学\t271749\n那双\t271750\n凡字\t271751\nP20/Pro\t271752\n秦致\t271753\nMUD\t271754\n光弹\t271755\n电木\t271756\n准生证\t271757\n半永久化妆\t271758\n钢板弹簧\t271759\nhuishang\t271760\n2017款名\t271761\nadvantage\t271762\n撒克逊\t271763\n奇瑞QQ3\t271764\n花样女王\t271765\n街霸3\t271766\nMister\t271767\nvrv\t271768\n方立勋\t271769\n长江养老保险股份有限公司\t271770\nCF穿越火线手游\t271771\n矩阵方程\t271772\n农村集体经济组织\t271773\n魔物盒\t271774\n一遇风云便化龙\t271775\nKSnow\t271776\n3535\t271777\n色彩缤纷\t271778\n腐书网\t271779\n广播式\t271780\n临危受命\t271781\n洽谈会\t271782\n汆丸子\t271783\n6560\t271784\nIP路由表\t271785\n星作\t271786\n李宇\t271787\n埃索达\t271788\n麦肯锡咨询公司\t271789\n目不暇接\t271790\n核爆炸\t271791\nDAISO\t271792\n锚鱼\t271793\nv5\t271794\nkayak\t271795\n疯狂猜成语\t271796\n10月下旬\t2717
97\n萧山市区\t271798\n太原晚报网\t271799\n第27集\t271800\n尊享型\t271801\nxtype\t271802\nbeautifully\t271803\n第5个\t271804\n聋人\t271805\n沐子\t271806\n索引名\t271807\n写意\t271808\n机构投资者\t271809\nWilcoxon\t271810\n团山\t271811\n跨度\t271812\n加拉塔萨雷\t271813\n辅料展\t271814\n立美\t271815\nvma\t271816\n完全\t271817\n瞌睡\t271818\nelvis\t271819\n宁波二中\t271820\n摩旅\t271821\n邹凯\t271822\n5530\t271823\n萧亮\t271824\nPchome\t271825\n2017半年\t271826\ne路\t271827\n英语四级作文\t271828\n英国石油公司\t271829\n浦三路\t271830\n四双\t271831\n下午6点\t271832\nLuca\t271833\n六宗\t271834\nPMGBA\t271835\n小酷\t271836\n吃手\t271837\nroc\t271838\n饮茶\t271839\n当期\t271840\n辣道\t271841\n无轩\t271842\nYeelight\t271843\nucinet\t271844\n垂直同步\t271845\n孟铎\t271846\n桃源街\t271847\n谌\t271848\ninnovative\t271849\n痛哭流涕\t271850\nInterPro\t271851\nanterior\t271852\n幼男\t271853\nrtz\t271854\n女海盗\t271855\n孔雀羽\t271856\njiaju\t271857\noppo\t271858\nTOP10|\t271859\nzhibo\t271860\n梅拉尼娅\t271861\n天罪\t271862\n楚寒衣青\t271863\nV2.0.1\t271864\n分镜头\t271865\n丝丝\t271866\n晓春\t271867\n秒变\t271868\n爱滋\t271869\n张国英\t271870\nBSC\t271871\n吴国盛\t271872\ngotv\t271873\n中国建筑装饰网\t271874\n莎拉波娃\t271875\nNAP\t271876\n幼升小面试题\t271877\nEasyConnect\t271878\nHKU\t271879\nTHG\t271880\nAngular4学习笔记\t271881\n国旗护卫队\t271882\nyee\t271883\n市值\t271884\n洪学智\t271885\nd3h\t271886\n衬纸\t271887\nRanger\t271888\n清和\t271889\nTill\t271890\npdata\t271891\n擦洗\t271892\n宝鸡第一人才网\t271893\nxian\t271894\n冯德伦\t271895\nrevlon\t271896\n笛梵\t271897\n蓝紫色\t271898\n酒神包\t271899\n大凡\t271900\n向量机\t271901\nvboxsf\t271902\njoan\t271903\n宫马\t271904\n128平米\t271905\nDVI线\t271906\n恩科\t271907\nFlyers\t271908\n王竟力\t271909\n回味\t271910\n400亿元\t271911\n纸板书\t271912\n中蜂吧\t271913\n排水井\t271914\n敦煌大学\t271915\n交通运输工程\t271916\n跑出\t271917\nhyperview\t271918\n随机函数\t271919\n日程表\t271920\nmaurice\t271921\n美学者\t271922\n雅加达机场\t271923\n李弘基\t271924\nosta\t271925\n天若有情2\t271926\n省工商联\t271927\n青岛物流公司\t271928\n第6辑\t271929\n贺州市\t271930\n特案组\t271931\n马向阳下乡记\t271932\nSponsor\t271933\n付费\t271934\nGT710\t271935\n转汇\t271936\n金砖国家\t271937\n颔首\t271938\n浙江省局\t271939\n铝电解电容器\t271940\n嬴\t2
71941\n一重\t271942\n33米\t271943\n呕血\t271944\ndistcp\t271945\n无论如何\t271946\n中共广西区委党校\t271947\n王彩云\t271948\n股票质押\t271949\n二十七岁\t271950\n奥山世纪城\t271951\n汽车客运站\t271952\n合线\t271953\n2000届\t271954\n凤凰读书\t271955\n20160511\t271956\n中伦\t271957\n夕立\t271958\n辅导书\t271959\n上海新场古镇\t271960\nwpa2\t271961\n采纳\t271962\n承德石油高等专科学校\t271963\n1第49\t271964\n惠州日报\t271965\n焦作市人力资源和社会保障局\t271966\n五份\t271967\neazy\t271968\nhelix\t271969\n原辅料\t271970\n金陵学院\t271971\n11月26日\t271972\n_地创网\t271973\n集成运放\t271974\n单车自媒体\t271975\n各处\t271976\n羹汤\t271977\n拾柒\t271978\n拿地\t271979\n小巧\t271980\n白琳\t271981\nDLCs\t271982\n主动脉瓣\t271983\n克洛普\t271984\n联想YOGA\t271985\nwin8系统\t271986\n阿里汽车\t271987\n娜娜子\t271988\n温州汽车网\t271989\n整数位\t271990\nUniscan\t271991\n沃尔沃S60/S60L/V60论坛_汽车之家论坛\t271992\n迪拜酒店\t271993\n计算型\t271994\n黄骅\t271995\n肠穿孔\t271996\n第27期\t271997\n五级\t271998\n飘飘\t271999\n自循环\t272000\n家具迷\t272001\n遵义火车站\t272002\n攻辩\t272003\n日语惯用语\t272004\n2017年12月29日\t272005\n寒小期\t272006\n推板\t272007\n橘花散\t272008\n毒发\t272009\n肱骨\t272010\nEXCEL公式\t272011\n大唐玄奘\t272012\n香帅\t272013\n兼程\t272014\n朗驰\t272015\n一万年前\t272016\n刘亚仁\t272017\n金桥网\t272018\n文网\t272019\n回旋镖\t272020\n雨果网\t272021\n米粉机\t272022\nvendors\t272023\n前两年\t272024\n威马ex5\t272025\n59号\t272026\n开封博物馆\t272027\n补亏\t272028\n扎线\t272029\n水逆\t272030\nvhf\t272031\nvivoY66\t272032\n博为峰\t272033\n2812\t272034\n有些事\t272035\n靶材\t272036\n艺片库\t272037\n北环\t272038\n言情小说网\t272039\n泽水\t272040\n66级\t272041\n老架\t272042\n哈利波特主题公园\t272043\n工银国际\t272044\n文论\t272045\n脚感\t272046\n标杆文\t272047\nScrambler\t272048\n南怀瑾\t272049\n小英雄雨来\t272050\nLyft\t272051\nRNG\t272052\n鲜花店\t272053\n山灵m2s\t272054\n4200m\t272055\n围堵\t272056\n完结+番外\t272057\n搜索引擎指南网\t272058\nopcode\t272059\nLUTs\t272060\n塔湾\t272061\nrespiratory\t272062\n追忆似水年华\t272063\n高骈\t272064\n嵩县\t272065\n国和\t272066\n安非\t272067\n游戏场\t272068\n铁系\t272069\n莱特币\t272070\nBattlegrounds\t272071\n张宪\t272072\n定福庄\t272073\nxuefa\t272074\n丽家宝贝\t272075\n垃圾压缩机\t272076\nToken\t272077\n超级泡泡\t272078\n新疆工程学院\t272079\n家电维修论\t272080\n拼板胶\t272081\n费启鸣\t272082\n第六十四章\t272
083\n緈諨\t272084\n克里金\t272085\nJeremyJones\t272086\nStarman\t272087\n黑庄\t272088\n仄\t272089\n醋酸丁酯\t272090\n盖\t272091\n犯案\t272092\n辐射:避难所\t272093\n车家\t272094\n2.6G\t272095\n大元\t272096\n配料表\t272097\n充不上\t272098\nCarousel\t272099\n罗汉果水\t272100\n3.85\t272101\ntuples\t272102\n乡村爱情进行曲\t272103\n蠓虫\t272104\nIRT\t272105\n昆玉市\t272106\n竹野内丰\t272107\nCCPIT\t272108\nDonor\t272109\n拉_\t272110\n双流县\t272111\n11G101-1\t272112\n森林狼队\t272113\nCathay\t272114\n7番\t272115\n人再囧途之泰囧\t272116\n泰山医学院附属医院\t272117\n渔线轮\t272118\n生存率\t272119\n两方\t272120\nNBA_大众网\t272121\n涨声\t272122\n狠角色\t272123\n青岛西站\t272124\n歌词本\t272125\n西藏旅行\t272126\n暴雨天\t272127\nChorus\t272128\n聊天窗\t272129\nZigbee\t272130\n九份老街\t272131\n3588\t272132\n云链\t272133\n过境段\t272134\nTaiyuan\t272135\n95动漫网\t272136\n打岔\t272137\nPotatso\t272138\n苏果果\t272139\n数字地\t272140\n奥拉刀塔传奇\t272141\n格式化\t272142\nxclock\t272143\n耗资\t272144\nsony\t272145\n黔南民族师范学院\t272146\nyy直播\t272147\nCharge\t272148\n光天化日\t272149\n欲晓\t272150\n晴雨伞\t272151\nAUDIO\t272152\n蜜桃汇\t272153\nAbp\t272154\n相爱相杀\t272155\n煽动性\t272156\n丰缘\t272157\nSPEC\t272158\n加微\t272159\n限外\t272160\nBeckham\t272161\nEMUI3.1\t272162\n滴滴顺风\t272163\n社交类\t272164\n陈柯\t272165\n银行信用贷款\t272166\n麦肯锡\t272167\n积土成\t272168\nUnionID\t272169\n失踪案\t272170\n回火\t272171\n中科院植物所\t272172\nulink\t272173\n100多次\t272174\nSpirng\t272175\n久久\t272176\nconcat函数\t272177\n欧泊石\t272178\n心高气傲\t272179\n齐鲁法制网\t272180\n天门吧\t272181\nTmall\t272182\n新北市\t272183\nUSACO\t272184\n深入恶土\t272185\n电弧焊\t272186\n贵州航天职业技术学院\t272187\n水粉\t272188\n芙莉德\t272189\n空杯\t272190\n安定区\t272191\n冤亲\t272192\njin\t272193\nMaison\t272194\n御剑情缘\t272195\n蜻蜓眼\t272196\n坂上之云\t272197\n米拓\t272198\n触及\t272199\n南安新闻网\t272200\n风斗\t272201\n泰和花园\t272202\n沙县小吃\t272203\nC4\t272204\n3500亿\t272205\n来来去去\t272206\n三顿\t272207\naili\t272208\n幻月无名\t272209\nDual\t272210\n市政化\t272211\n猎杀者\t272212\nmaximal\t272213\n狂追\t272214\nMary\t272215\n预初\t272216\norbslam\t272217\n20KG\t272218\n港务局\t272219\n全式金\t272220\n大店\t272221\n职员\t272222\n陈健民\t272223\nOuter\t272224\nq4\t272225\nail\t272226\n
耳朵眼炸糕\t272227\n2018.com\t272228\n杨柳枝\t272229\n蓬溪县\t272230\n101平米\t272231\n苍月女奥特曼\t272232\nAna\t272233\n鬼舞\t272234\n抛光机\t272235\n赣能股份\t272236\n时星草\t272237\n贵丰\t272238\n保持平衡\t272239\n20个月\t272240\ndaughters\t272241\n热棒\t272242\n耐候性\t272243\n承诺制\t272244\n小化\t272245\n十一天\t272246\n磷酸化蛋白\t272247\nhelical\t272248\nmalt\t272249\n跌入\t272250\nscdn\t272251\n西酞普兰\t272252\n李承晚\t272253\n一棍\t272254\n穷人的孩子早当家\t272255\n裂孔\t272256\n四_\t272257\ncastor\t272258\n折磨\t272259\nips\t272260\njaw\t272261\n化妆\t272262\n理直气壮\t272263\n高鸿\t272264\nICA\t272265\nCAM\t272266\n难辨\t272267\n异层\t272268\n莱布尼茨\t272269\n完整篇\t272270\n发扎\t272271\nexcuse\t272272\nDedicated\t272273\n开心人\t272274\n高铁动卧\t272275\n程琳\t272276\npitts\t272277\nlemon\t272278\n照人\t272279\n剑川县\t272280\nNucky\t272281\n2丨\t272282\n经济观察\t272283\n高登\t272284\nMyEclipse2017\t272285\n证券从业\t272286\nLigerUi\t272287\n三国马自达3\t272288\n劳协\t272289\n广东欧珀移动通信有限公司\t272290\n大名县\t272291\nchn\t272292\nNavigationBar\t272293\n智慧小区\t272294\n豹2\t272295\n坡层\t272296\nCorrosion\t272297\n可控核聚变\t272298\nDjango数据库\t272299\n忧心\t272300\n万能钥匙\t272301\n侍道4\t272302\n茶居\t272303\n普格\t272304\n马鞍山日报\t272305\njqu\t272306\n萌魔王\t272307\n严冬\t272308\n口袋妖怪白\t272309\n全佑\t272310\n马头\t272311\n网球拍\t272312\n投入\t272313\nPCD\t272314\n上棘\t272315\n大黄页\t272316\ncco\t272317\n登陆器\t272318\n仙霞路\t272319\n设想\t272320\n0.01\t272321\n现言\t272322\n鼓楼东大街\t272323\n0530\t272324\n监听\t272325\n尊龙\t272326\n牯牛\t272327\ntp-link\t272328\n八角帽\t272329\n琢磨俗僧\t272330\nscut\t272331\nV5CG\t272332\n曹刿\t272333\n青冢\t272334\n吉林火车站\t272335\n吨\t272336\n林盛斌\t272337\n葛\t272338\narai\t272339\n超级转霸\t272340\n杨丹\t272341\n入企\t272342\n大登殿\t272343\n保姆_保姆公司\t272344\n达州日报\t272345\n国权路\t272346\n破面\t272347\n陕西建工集团有限公司\t272348\n测评师\t272349\n王佐之才\t272350\nglas\t272351\n峡山街道\t272352\nguiminer\t272353\n老百晓在线\t272354\nEBSCO\t272355\n滑油\t272356\n太极实业\t272357\n总经验\t272358\n绵延\t272359\n重犯\t272360\n搭台\t272361\n眩晕感\t272362\n京房\t272363\n主文\t272364\n古来\t272365\nx220\t272366\nluan\t272367\n上坪\t272368\nKEIL4\t272369\n内质\t272370\n中置\t272371\nCaps\t2723
72\n郭德王小川\t272373\n落水者\t272374\n58条\t272375\n琼楼玉宇\t272376\nitjeff\t272377\n南飞\t272378\n奶器\t272379\n渔光\t272380\n6.67\t272381\n心外科\t272382\nDarcy\t272383\n有师\t272384\n锁车\t272385\n存储技\t272386\n长测\t272387\n鸭翼\t272388\n前任3\t272389\n孔数\t272390\n中国科学院微电子研究所\t272391\n摩尔数\t272392\n翻译史\t272393\n马甲条\t272394\n伊坂幸太郎\t272395\n共育\t272396\n战锤40K战争黎明2\t272397\nsmallest\t272398\n节录\t272399\n我们结婚了\t272400\n广西壮族自治区林业厅\t272401\n电报机\t272402\n保家装修网\t272403\n月曜夜未央\t272404\n華\t272405\n感情路\t272406\n碳化钨\t272407\n天丽\t272408\n0015.111\t272409\n浆砌片石挡土墙\t272410\ncpuminer\t272411\n瓜纳华托\t272412\n百合番\t272413\n西祠\t272414\n厦门国家会计学院\t272415\n毒化\t272416\n硬笔字帖\t272417\n厉先生\t272418\nretrieved\t272419\ncobra\t272420\ndm2\t272421\n达盖尔\t272422\n扣逼\t272423\n安靖\t272424\n轻疾\t272425\n文祥\t272426\n敲定\t272427\n人民武装部\t272428\n败诉方\t272429\n中华人民共和国公司\t272430\n5万种\t272431\n熊猴\t272432\n超过15秒\t272433\n竹岛\t272434\n望乡\t272435\n第一滩\t272436\nv5.2\t272437\n淮海晚报\t272438\n纳特拉\t272439\n仅有\t272440\n谱系\t272441\nS21\t272442\n0.6.1\t272443\nh3c防火墙\t272444\n在在\t272445\n0311\t272446\nshgao\t272447\n申维辰\t272448\n示范岗\t272449\n遗落的南境三部曲\t272450\n万竿\t272451\n小米手机3\t272452\n车讯-皮卡网\t272453\n仓库页\t272454\n名山县\t272455\n蓝芯\t272456\n蝙蝠侠阿甘起源\t272457\n1517\t272458\n微反应器\t272459\n万二\t272460\n杭州浙一医院\t272461\n洗点\t272462\n上海站票务网\t272463\n一世痴狂\t272464\n疑难\t272465\n太坏\t272466\nAPPS\t272467\n升钟湖\t272468\n结队\t272469\nenrolled\t272470\n切槽\t272471\n魏玛共和国\t272472\n叶永烈\t272473\n新房子\t272474\n三十六天\t272475\n千里眼\t272476\n不忘初心继续前进\t272477\n鱿鱼须\t272478\nmatisse\t272479\nADSS\t272480\nNeat\t272481\n[\t272482\n如月\t272483\nascis\t272484\n耳濡目染\t272485\nUEditor\t272486\n14.6\t272487\n屏轴\t272488\n1716\t272489\n20131022\t272490\n出世\t272491\n举手提问\t272492\n刘志庚\t272493\n三孩\t272494\n韩明\t272495\nrepresentations\t272496\n普卡\t272497\n2016年12月1日起\t272498\n法表\t272499\nahz\t272500\n谍变\t272501\n累积分布函数\t272502\n中国丝绸博物馆\t272503\n牛佬\t272504\n润和紫郡\t272505\n三伏天\t272506\n水平杆\t272507\n海带宝\t272508\n首都医科大学附属复兴医院\t272509\n前脑\t272510\nTeorex\t272511\n双头鹰\t272512\n远海\t272513\n安乔\t272514\ntagged\
t272515\n大爆\t272516\n励磁涌流\t272517\n天胶\t272518\n好事儿\t272519\n10丨\t272520\nlocate\t272521\nsFTP\t272522\nls50\t272523\n主儿\t272524\nComboTree\t272525\n王为民\t272526\n7.5小时\t272527\n破击\t272528\n123级\t272529\n化水\t272530\n相邀\t272531\n赵武\t272532\n侧式\t272533\n焦点\t272534\n密\t272535\nEnzyme\t272536\n宝矿力水特\t272537\nsubtotal\t272538\n楼风\t272539\nleopold\t272540\ncaptcha\t272541\n捷瑞\t272542\n尾随\t272543\n佟麟阁\t272544\n重城\t272545\n舞房\t272546\n双子女\t272547\n安医\t272548\n脱胎\t272549\nna\t272550\n红霉素肠溶片\t272551\n青政办\t272552\nUbuntu\t272553\nubuntu源\t272554\n坦湖\t272555\n心战\t272556\n人教版五年级语文\t272557\n马鞍山市环境保护局\t272558\n燃气\t272559\n气凝胶\t272560\n南勤\t272561\n杀进\t272562\njum\t272563\n后福\t272564\n第207集\t272565\n罗真\t272566\n王风\t272567\n龙山\t272568\nazw\t272569\nV7.6\t272570\n中小企\t272571\n柜式\t272572\nf399\t272573\nCreators\t272574\nBecoming\t272575\n别来无恙\t272576\ngenerator\t272577\nStructure\t272578\n侯天来\t272579\n大窍门\t272580\n酷看\t272581\n保利西子湾\t272582\n1.9.4\t272583\n金鲨\t272584\n12.6\t272585\n场布\t272586\n家和万事兴\t272587\n阳东区人民政府\t272588\nStoreBT\t272589\n备份\t272590\n唐玄奘\t272591\n转向器\t272592\n牡\t272593\n早教机构\t272594\n肥效\t272595\n蒋捷\t272596\nAndriod\t272597\n传动\t272598\newe\t272599\nfzl\t272600\nKOA\t272601\nWCS\t272602\n广平县\t272603\ni4\t272604\n回纥\t272605\n180310\t272606\nLucas\t272607\n金耀\t272608\n路路通\t272609\n中储粮油脂有限公司\t272610\n重庆人\t272611\n1000多个\t272612\n江歌\t272613\n大触\t272614\nLOG\t272615\n京基100大厦\t272616\n奢阔\t272617\nA25\t272618\n无忧网\t272619\nlinux源代码\t272620\nmpvue\t272621\n魅玩帮\t272622\nMyBatis\t272623\nisa\t272624\n相钢\t272625\n苏里\t272626\n69万\t272627\n腥线\t272628\n真的好\t272629\n健身器\t272630\nOffshore\t272631\n大贵族学校\t272632\n哈尔的移动城堡\t272633\n谢帝\t272634\nbusiness\t272635\n新天我的1979\t272636\n三百年前\t272637\n张迎春\t272638\n就此\t272639\n铅板\t272640\n默哀\t272641\n信春哥\t272642\n北村一辉\t272643\n几月几日\t272644\n王牌逗王牌\t272645\n王者荣耀扁鹊\t272646\n当客\t272647\nECLIPSE\t272648\n40升\t272649\nMk\t272650\n爱奇电子书\t272651\n奔驰GLK\t272652\n第十期\t272653\n正畸\t272654\n湖湘\t272655\n平南镇\t272656\ntigers\t272657\n悦榕湾\t272658\n重庆时报\t272659\n张不开
\t272660\n胶体蓄电池\t272661\n黄凯\t272662\n俄罗斯蓝猫\t272663\n翩跹\t272664\n横纹肌溶解\t272665\n东莞车管所\t272666\nCumulative\t272667\n城建税\t272668\n最长寿\t272669\n蕾丝\t272670\n杨靖宇\t272671\n联通4G\t272672\n4366\t272673\n飞凤\t272674\n长小\t272675\n考试科\t272676\n内衣展\t272677\n半世\t272678\n百叔\t272679\n花朝\t272680\n苏澈\t272681\n电动工\t272682\n透子\t272683\n义务教育语文课程标准\t272684\n11位\t272685\ndisguise\t272686\n无锡中医院\t272687\n文峰镇\t272688\n李晖\t272689\n妊娠\t272690\n插逼\t272691\n耦园\t272692\n火影忍者鸣人\t272693\n墨守\t272694\nkail\t272695\n水果湖\t272696\n南昌大学医学院\t272697\nDay1\t272698\n6环\t272699\n蔬菜汤\t272700\n2017-04-20\t272701\n圆谷株式会社\t272702\ngba吧\t272703\n粉蓝\t272704\n张晓燕\t272705\nbeego\t272706\n李晓勇\t272707\n地应力\t272708\n许村\t272709\n湿式电除尘器\t272710\nxmlns=\t272711\n乌玛·瑟曼\t272712\n沙河村\t272713\n外环高速\t272714\n周岚\t272715\nTTY\t272716\n主笔\t272717\nack\t272718\n魏徵\t272719\n广州市安全生产监督管理局\t272720\nps液化\t272721\nTil\t272722\n涂片\t272723\n宜兰县\t272724\n成双\t272725\nudt\t272726\n35.9\t272727\ncC\t272728\n剑侠3\t272729\n点香阁\t272730\n市政\t272731\n回音哥\t272732\n素食主义者\t272733\n咬牙\t272734\n春林\t272735\nBLOVE婚戒定制中心\t272736\nreizhi\t272737\n圆台\t272738\nRecycleView\t272739\n环湖\t272740\n祥源城\t272741\nACC\t272742\nyomi\t272743\nPKU\t272744\n华为Mate9Pro\t272745\n任人摸\t272746\n磕头\t272747\n天井\t272748\n中国游戏中心\t272749\n小米视频\t272750\n嫌贫爱富\t272751\n摸索\t272752\n商业性\t272753\n笙离\t272754\n全\t272755\n佩特\t272756\n尚\t272757\nDNF阿修罗\t272758\napc\t272759\nアダルト\t272760\n同期声\t272761\n射极\t272762\n形考\t272763\n涩性\t272764\n58分\t272765\n灵笼\t272766\n扮成\t272767\n香港东荟城\t272768\njianpu\t272769\n千台\t272770\n燕赵晚报\t272771\n武汉岩土所\t272772\n明心\t272773\n双胎妊娠\t272774\n佛山地铁11号线\t272775\n寫真\t272776\nartTemplate\t272777\n金正昆\t272778\n安琪酵母股份有限公司\t272779\n燃灯\t272780\n西安城南客运站\t272781\nJavlibrary\t272782\n液晶拼接墙\t272783\n惠州市公共资源交易中心\t272784\n沐浴\t272785\n一修\t272786\n保利置业\t272787\n我的伯父鲁迅先生\t272788\n2018-4-17\t272789\nserivce\t272790\n珠江钢琴\t272791\n抢点\t272792\n松材线虫病\t272793\n整装待发\t272794\nbeverage\t272795\n甜蜜惩罚\t272796\n下一个人\t272797\n亳州一中\t272798\n欧阳耀莹\t272799\nstripes\t272800\n大话滨海\t272801\ndbajun\t2728
02\napproval\t272803\nCPAN\t272804\n上海路政局\t272805\n神通广大\t272806\n颅骨\t272807\n两岸四地\t272808\n遵义会议\t272809\naabc\t272810\n网摘\t272811\nanova\t272812\n墨殇\t272813\n50英寸\t272814\n嘉兴万达广场\t272815\nhiphop\t272816\nSims3\t272817\n发送方\t272818\n浓精\t272819\n石楠木\t272820\n张小花\t272821\n双旗镇\t272822\n人民币Pre-A轮融资\t272823\n5一\t272824\n256_\t272825\n学而思\t272826\n蓝花楹\t272827\n血夜\t272828\nOpenCv\t272829\n壁画\t272830\n孱弱\t272831\n罗斯宗庆后\t272832\nnf\t272833\n结婚季\t272834\n要职\t272835\n和会\t272836\n电纺\t272837\n病栋\t272838\n国际银行\t272839\n后生可畏\t272840\n严密\t272841\naos\t272842\n男事\t272843\n中筛\t272844\nregarded\t272845\n小荷风采幼儿舞蹈\t272846\n桑\t272847\n田华\t272848\n军妓慰安妇\t272849\njoey\t272850\n蓬松感\t272851\n周公解梦查询_周公解梦大全\t272852\n360doc.com\t272853\n张近东\t272854\nAlex\t272855\n巴啦啦小魔仙之梦幻旋律\t272856\n市规\t272857\n魔神王\t272858\nSe\t272859\n龈下\t272860\n沈阳植物园\t272861\n葛底斯堡\t272862\n陈记\t272863\n撸王\t272864\n罗马机场\t272865\n北京时尚新锋艺术培训学校\t272866\n不醒\t272867\n贾佳\t272868\n叠山\t272869\n逐个\t272870\n许娣\t272871\n第110\t272872\n芦笋\t272873\nrsyslogd\t272874\n易锦\t272875\nさ\t272876\n诸暨\t272877\nbandage\t272878\nrumor\t272879\n第五人格许愿码\t272880\nchole\t272881\n1.2亿美元\t272882\n移动物联卡\t272883\n六六大顺\t272884\n位势\t272885\n融融\t272886\n公司们\t272887\n飞猪网\t272888\n朱家角\t272889\nssc\t272890\n龙太子\t272891\n1650\t272892\n骤变\t272893\nHXR\t272894\n战事\t272895\n分屏\t272896\ninnis\t272897\nrelieve\t272898\n微语录\t272899\n吊重\t272900\n斗米兼职靠谱吗斗米兼职\t272901\n4N\t272902\nPySpark\t272903\n乐人\t272904\n秦祥林\t272905\nv刹\t272906\n大宝卡\t272907\n摔角网\t272908\n华业大厦\t272909\n同乐\t272910\n姓郭\t272911\n彭家丽\t272912\n最游记\t272913\njindian\t272914\n停车票\t272915\n李文星\t272916\n篮式\t272917\n微分器\t272918\n饥饿游戏3嘲笑鸟\t272919\n生死状\t272920\n挂号单\t272921\n广信材料\t272922\n五谷鱼粉\t272923\n宏泰广场\t272924\n寅阳镇\t272925\n香橙\t272926\n学本\t272927\n灵活\t272928\n走下去\t272929\nxoxoxo\t272930\n孑然一身\t272931\n金钥匙\t272932\n不堪入\t272933\n效果类\t272934\n白瑾昊\t272935\nx年\t272936\n0i\t272937\n上峰\t272938\n北国之春\t272939\nentryset\t272940\n磨盘山\t272941\ndn\t272942\n京城81号2\t272943\n入党积极分子转预备党员\t272944\n万豪国际集团\t272945\n长安cs55\t272946\n墨家村
\t272947\nErmu\t272948\n便盆\t272949\nin\t272950\n微电园\t272951\n宝塔石化集团\t272952\nfreevpn\t272953\n第47个\t272954\n强生公司\t272955\neasymock\t272956\n看轻\t272957\n白黑游戏\t272958\n美国大峡谷\t272959\n光明乳业\t272960\n阀室\t272961\n连增\t272962\nsoldiers\t272963\n保函\t272964\n飞常\t272965\n孔雀鱼\t272966\n广东石化\t272967\nMelrose\t272968\n奥驰\t272969\ni9500\t272970\nAccess2016\t272971\nGangsta\t272972\nUSE\t272973\n许地山\t272974\n重高\t272975\n48号\t272976\n反白\t272977\n书册\t272978\n四\t272979\n战地3吧\t272980\n宜春市\t272981\n清液\t272982\n中国篆刻网\t272983\ncompensation\t272984\n念书\t272985\nmac破解版|office\t272986\n石灰窑\t272987\n宁夏\t272988\n孙英雄\t272989\n李鹏飞\t272990\n家酿啤酒\t272991\nCNT\t272992\n哈勒普\t272993\n2遍\t272994\ncc棒\t272995\n椅\t272996\n诗库\t272997\n质理\t272998\n石南\t272999\n朝阳产业\t273000\n龟速\t273001\n韭菜籽\t273002\n偏自\t273003\n长安奔奔mini\t273004\nskc\t273005\n20151226\t273006\n冰凌花\t273007\ndfp\t273008\nsssss\t273009\n迷子\t273010\n300496\t273011\n白彦虎\t273012\nlus\t273013\n库仑\t273014\n紧固件\t273015\nseptember\t273016\nmfc\t273017\n冠号\t273018\n企业联合网\t273019\n徐州地铁\t273020\nbotox\t273021\n瑞波\t273022\nerasure\t273023\noccupancy\t273024\n尸城\t273025\nGodfather\t273026\n更香\t273027\n六宫\t273028\nSFS\t273029\n孔板\t273030\n百度地图\t273031\n91job\t273032\n石滩\t273033\n张啸林\t273034\n脸肿\t273035\n汉真广标\t273036\n财阀\t273037\n广东招标网\t273038\n销单\t273039\n不晓\t273040\nHystrix\t273041\n丝语\t273042\n预约\t273043\n十七项\t273044\n八音\t273045\n易错点\t273046\n哀叹\t273047\n石墨板\t273048\nActual\t273049\n卡罗维\t273050\n主观性\t273051\n参与感\t273052\nAnybody\t273053\n32m\t273054\n杭州医院\t273055\n637\t273056\n佛山新闻网\t273057\n布贴\t273058\ndovecot\t273059\n荠菜\t273060\n管道\t273061\n颗颗\t273062\n校发\t273063\n极速网\t273064\n非网管型\t273065\n九阳股份\t273066\n装法\t273067\n广州市第二中学\t273068\n电子竞技\t273069\n新疆维吾尔自治区公安厅\t273070\n谈治国理政\t273071\n第26条\t273072\ncarb\t273073\n30讲\t273074\n遣唐\t273075\n广东正中珠江会计师事务所\t273076\n光线\t273077\n标准动车组\t273078\n高送\t273079\n财会月刊\t273080\n锈板\t273081\n速度型\t273082\n布衣\t273083\n稳健\t273084\nfacetite\t273085\n跻\t273086\n赵文卓\t273087\n奥泰\t273088\n异形魔怪\t273089\n中国摄影家协会\t273090\n安托涅瓦\t273091\n
刀机\t273092\n人在\t273093\nc++2017\t273094\n婊子\t273095\nSevenFriday\t273096\nmcu\t273097\nButt\t273098\n吉他吧\t273099\n光荣\t273100\n脂肪酶\t273101\n大新路\t273102\nblx\t273103\n达特茅斯\t273104\n2016年2月1日\t273105\n乳铁\t273106\n爻\t273107\n热补\t273108\n金翠莲\t273109\n孙志刚\t273110\n电脑展\t273111\n卷商\t273112\n药品监督管理局\t273113\ntl-wr886n\t273114\n览器\t273115\n茶籽粉\t273116\nfigures\t273117\n朱健\t273118\n闭环步进电机\t273119\n雷神索尔\t273120\n多助\t273121\n海利亚盾\t273122\n男科\t273123\n解咒\t273124\n套箱\t273125\n重庆地铁1号线\t273126\n天马座\t273127\n恒逸集团\t273128\n达摩克利斯\t273129\n保准\t273130\n经血\t273131\nBigData\t273132\n虚贝网\t273133\n闽南网\t273134\n鱼菜共生\t273135\n樱花樱花\t273136\n2016年5月3日\t273137\nrandn\t273138\n玻璃种\t273139\ncity\t273140\nword版\t273141\n夺宝奇兵4\t273142\n江苏恒瑞医药股份有限公司\t273143\n接着\t273144\n置换补贴\t273145\nHS编码网\t273146\n汉中市\t273147\n捂脸大笑\t273148\n内战\t273149\n颐居\t273150\n筹集\t273151\n恋夜秀场\t273152\n再唱\t273153\n争鸣\t273154\nTHF\t273155\n抓痕\t273156\n黄啤\t273157\n评分法\t273158\n丽水莲都\t273159\n旅館\t273160\n默多克\t273161\n住院医\t273162\n夏天晚上\t273163\n1937年8月28日\t273164\n润枫\t273165\nIOS8\t273166\n消极\t273167\n翻译贴\t273168\n下坡\t273169\n光谷新世界\t273170\n红绿柱\t273171\n新本\t273172\n电侠\t273173\n音准\t273174\nonstart\t273175\nisim\t273176\n情深缘浅\t273177\n扩展性\t273178\n事务所\t273179\n看守所\t273180\ntyp\t273181\n护眼仪\t273182\nhs\t273183\nF9\t273184\n文创园\t273185\n里卡多\t273186\n栄川乃亜\t273187\n智信\t273188\n面糊\t273189\n战国无双4\t273190\ntypecho\t273191\n載\t273192\nsvn篇\t273193\n2018-03-01\t273194\nlgm\t273195\ndashu\t273196\n文坑\t273197\n铣边机\t273198\n为什么不叫\t273199\n西四环\t273200\n页字\t273201\n彭斌\t273202\n西洋参\t273203\n18亿元\t273204\n轻率\t273205\n乱像\t273206\n野性的呼唤\t273207\n夜雨\t273208\n语录网\t273209\n背栓式\t273210\n三态门\t273211\n肚条\t273212\n捆绑带\t273213\n树下\t273214\n抚顺特钢\t273215\n变态级\t273216\nRealSense\t273217\n总监\t273218\n窄桥\t273219\n上任\t273220\n辣菜\t273221\ngrant\t273222\n190x\t273223\n凤凰艺术\t273224\n丝路传说\t273225\n中级人民\t273226\n回民区\t273227\n何鸿\t273228\n选号入网\t273229\ncrazykings\t273230\n720P-RMVB\t273231\n大有前途\t273232\n李亚男\t273233\n昆莱山\t273234\nmitmproxy\t273235\n兴业太古汇\t273236\n信托法\t273237
\n查宁·塔图姆\t273238\n临城新区\t273239\nzhy\t273240\n180_\t273241\n特\t273242\n170616\t273243\n洋基\t273244\n荻花圣殿\t273245\n思科华为\t273246\n潘际銮\t273247\n雪铁龙c4\t273248\n和平里北街\t273249\n54%\t273250\n千万人次\t273251\nAVplayer\t273252\n集中供电\t273253\n地下翼虎\t273254\n钟伟强\t273255\noz\t273256\n禄神\t273257\n嘉定工业区\t273258\n富士xt2\t273259\nGTK+\t273260\n掌\t273261\n远景x3\t273262\nblc\t273263\n粤港澳湾区\t273264\n至尊版\t273265\n矮子\t273266\n渤海商品交易所\t273267\n中华人民共和国野生动物保护法\t273268\n万亿\t273269\n灰烬使者\t273270\n口袋妖怪黄\t273271\n轿顶山\t273272\n杨守敬\t273273\n小二黑\t273274\n试机号\t273275\nscooter\t273276\n不来梅\t273277\n武隆县\t273278\n刘春雨\t273279\n航天三院\t273280\n冥炎\t273281\n北京钓鱼论坛\t273282\n总台\t273283\n说出口\t273284\n飞马\t273285\n保险金额\t273286\n长沙市中心医院\t273287\nfany\t273288\n夏洛蒂\t273289\n常量元素\t273290\nclover\t273291\nhopes\t273292\nコミックス\t273293\nimpressive\t273294\n优步\t273295\n广西妇幼保健院\t273296\nmaxim\t273297\n前列素\t273298\nGDB\t273299\n7.03\t273300\n电源管理\t273301\n有逼格\t273302\n王妍之\t273303\n扩声\t273304\n糙汉\t273305\n千层肚\t273306\n居住者\t273307\n2.5亿美元\t273308\n金币\t273309\n贡眉\t273310\n乾位\t273311\nsino\t273312\nvsts\t273313\n重庆市奉节县人民政府\t273314\n终结者5\t273315\nmct\t273316\n叶惠美\t273317\nECU\t273318\n丹东站\t273319\n哈德良\t273320\n益海\t273321\n迅玩版\t273322\ntelnet\t273323\n南京省中医院\t273324\n捻子\t273325\n长人\t273326\n500幅\t273327\n文昌湖区\t273328\n谢菲尔德大学\t273329\n长长长\t273330\n担保圈\t273331\n猫奶\t273332\n138期\t273333\n政府采购评审专家管理办法\t273334\n爱钱进\t273335\n椅子\t273336\n亏损额\t273337\n中国共产党中央军事委员会\t273338\n联次\t273339\n接应\t273340\n太原师范学院\t273341\n猛鸷\t273342\nmrtg\t273343\n脱粪\t273344\n20141212\t273345\n定投\t273346\n永善\t273347\ncos吧\t273348\n肽链\t273349\n61级\t273350\n二宫\t273351\n南粤网\t273352\n配电盒\t273353\n飞力达\t273354\nhellow\t273355\n红米NOTE\t273356\n奇米影院\t273357\n美爵\t273358\nBDE\t273359\n烟店镇\t273360\n金融港\t273361\n伊戈\t273362\n猢狲\t273363\n书粉\t273364\n小轮\t273365\n发给\t273366\ndatatime\t273367\n动漫人\t273368\n潜龙之渊\t273369\n对外经济贸易大学法学院\t273370\n苦菜花\t273371\n水牛\t273372\n胜境\t273373\nreaxys\t273374\nCFT\t273375\nmagick\t273376\n宪章\t273377\n食货志\t273378\n空房\t273379\npristin\t273380\nv1.3.4\t273
381\nl485\t273382\n溱洧\t273383\n3.46\t273384\n真身\t273385\n1000a\t273386\n南通市市\t273387\n狗民\t273388\n松门\t273389\n百科类\t273390\n紫悦\t273391\n煮夫\t273392\n皂罗袍\t273393\n保温\t273394\nwhile\t273395\n交城县\t273396\n47页\t273397\n青钱柳\t273398\n究极技\t273399\n物质的量\t273400\n0724\t273401\n黑塔利亚\t273402\n豆种\t273403\nbalcony\t273404\nsufer\t273405\nTrapcode\t273406\n向前一步\t273407\n牛眼\t273408\n东方新闻\t273409\n焉得虎子\t273410\n悲叹之塔\t273411\n房地\t273412\nDeserve\t273413\n借人\t273414\n电路图\t273415\n事到如今\t273416\n河北传媒学院\t273417\n明智\t273418\nSubversive\t273419\n威海火车站\t273420\n贝客\t273421\nmyt126汽车改装网\t273422\n欧赔\t273423\n染色剂\t273424\n减号\t273425\n河景\t273426\n自融\t273427\n3763\t273428\n八元\t273429\n励合\t273430\n东广\t273431\n吴蔚\t273432\n65年\t273433\n日历\t273434\n中国行\t273435\n抓宠\t273436\n7月15日\t273437\njessy\t273438\n盟军敢死队\t273439\n打热\t273440\nTrusty\t273441\n留在\t273442\n欧凯龙\t273443\n金钗石斛\t273444\n对赌协议\t273445\n临空\t273446\n素彩网\t273447\n泰笛\t273448\n医疗器械展\t273449\n叶修\t273450\n中国科学院东北地理与农业生态研究所\t273451\n月数\t273452\nA级景区\t273453\npdl\t273454\nneuroscience\t273455\npubwinol\t273456\n越野摩托\t273457\n地狱少女\t273458\n柏林之声\t273459\nBaCl2\t273460\n第一二季\t273461\n广州仁爱医院\t273462\n3连\t273463\n第二十七次\t273464\nradiobutton\t273465\n订做\t273466\nsever2008\t273467\n昆塔\t273468\n迷阵\t273469\nlines\t273470\n4399问道\t273471\nEXCEL2013\t273472\n龙马文\t273473\n赛尔号\t273474\n菲猫\t273475\n邓文纳兰性德\t273476\n非空行\t273477\n杭州市安全生产监督管理局\t273478\n美的电热水器\t273479\n冰刀\t273480\n皮影戏\t273481\n江西银行\t273482\n华中科技大学机械科学与工程学院\t273483\n河西路\t273484\n金立金刚\t273485\n大专\t273486\n出租汽车经营服务管理暂行办法\t273487\n330i\t273488\n【锋\t273489\n第一百一十八章\t273490\n_位\t273491\n夢雨情殤\t273492\n查分\t273493\n维生素b族\t273494\nKFC\t273495\n包换\t273496\n特娱娱乐网\t273497\n妻姐\t273498\n寻路算法\t273499\n机算\t273500\n泡泡堂\t273501\n电商标\t273502\nCumshots\t273503\n辫\t273504\n钉宝\t273505\n三界镇\t273506\n极米H1\t273507\n検\t273508\n走马镇\t273509\n拥戴\t273510\n急性子\t273511\n王者荣耀卡\t273512\n仓栏\t273513\n光绪元宝\t273514\n急停\t273515\n江西省发展和改革委员会\t273516\n想办\t273517\noffcn\t273518\nSHL\t273519\n短裤\t273520\n13升\t273521\n安医大\t273522\n拜厄钢琴基本教程\t27352
3\nlancaster\t273524\n百度金融商城\t273525\n利德曼\t273526\nprocessing\t273527\nAFNetworking\t273528\n肺动脉瓣\t273529\nBy\t273530\n2012-06-16\t273531\n增额\t273532\n舟山网\t273533\n倾国公主\t273534\n两江新宸\t273535\n35次\t273536\n31部\t273537\nIoT\t273538\n暴龙眼镜\t273539\n白凡\t273540\n青村镇\t273541\n天猫旗舰店\t273542\n阳新政府网\t273543\n奥迪A6L论坛_汽车之家论坛\t273544\n程杰\t273545\nannoyed\t273546\nQuattro\t273547\n宁波网\t273548\n文艺范儿\t273549\n山东电视台\t273550\n西罗马\t273551\n李学勤\t273552\n项数\t273553\nPES2014\t273554\nTiny4412\t273555\n房地局\t273556\n日照钢铁\t273557\n电池座\t273558\n中国数学会\t273559\n几册\t273560\n临境\t273561\nx-1)^2\t273562\n3d建模\t273563\n红外线报警器\t273564\n大马士革刀\t273565\n基业\t273566\nAl2O3\t273567\n基本建设\t273568\n灵佑\t273569\n盛佳\t273570\n夏侯\t273571\n孤陋寡闻\t273572\n好说\t273573\n口蘑\t273574\n努比亚小牛\t273575\nIterative\t273576\nvpgame\t273577\n二手房东\t273578\n奶盖\t273579\n小蚁4K\t273580\n摩恩\t273581\n琴琴\t273582\n鸿荣源壹城中心\t273583\n成立\t273584\nHPM\t273585\n货船\t273586\n星二代\t273587\n阿拉善左旗\t273588\n派诺特\t273589\n爆炸品\t273590\n张达明\t273591\n慰\t273592\n废金属\t273593\n漳平市\t273594\n腹地\t273595\n流溪河\t273596\nRouterOS\t273597\n诗儿\t273598\n图籍\t273599\n二氯乙烯\t273600\n剩余物\t273601\n阎\t273602\n77万\t273603\n玄妙观\t273604\n不祥之兆\t273605\n飞碟\t273606\nchipgenius\t273607\nIntegrating\t273608\n大凌河\t273609\n东延段\t273610\n身宫\t273611\nillust\t273612\nC86\t273613\n鬼凯\t273614\n颜团子\t273615\nJUnit\t273616\nDatatable\t273617\n小伙伴\t273618\n分米\t273619\n三皇五帝\t273620\n健帆生物\t273621\nMETCN\t273622\n四非\t273623\n烟区\t273624\n沙田镇\t273625\n淘\t273626\n贾真\t273627\n电脑爱好者\t273628\n金惠\t273629\n东北话版\t273630\n探途\t273631\n太极网\t273632\n童博\t273633\n乌尔达哈\t273634\n长安通\t273635\n第三代\t273636\n11.07\t273637\n鄂豫皖\t273638\n垣\t273639\n柚子多肉\t273640\n秸秆还田机\t273641\nSOSO云盘\t273642\n胡枫\t273643\n三月三日\t273644\n淘钱宝\t273645\nネトラセラレ\t273646\n笫\t273647\nreferred\t273648\n农科院\t273649\n天顺\t273650\n4g飞享\t273651\n管理信息系统\t273652\n快速熔断器\t273653\n丹宁\t273654\nalle\t273655\n快展\t273656\nswine\t273657\n2第二\t273658\n降凝剂\t273659\n三茗\t273660\n永字八法\t273661\n杆式\t273662\n舷外机\t273663\n各省市区\t273664\n长城网\t273665\n三氧化二锑\t273666\n太原五中\t27
3667\n兰瑟\t273668\n胡马依\t273669\n罗尔定理\t273670\n老克勒\t273671\n超敏C反应蛋白\t273672\n多功能厅\t273673\n恶棍天使\t273674\n大豪\t273675\n亚特兰大奥运会\t273676\n苏州工业园区管委会\t273677\nFNAF\t273678\n股东会决议\t273679\n下杆\t273680\n1455\t273681\n以为戒\t273682\nCOSLOG\t273683\n林海音\t273684\n原码\t273685\necognition\t273686\n不怀好意\t273687\n股票作手回忆录\t273688\n切术\t273689\n占领区\t273690\nfreebsd\t273691\nFCN\t273692\n曲集\t273693\n剑仙\t273694\nes5\t273695\n孤岛求生\t273696\n快餐\t273697\n标定\t273698\n晋江电视台\t273699\n石灯\t273700\n进境\t273701\n重庆市社会保险局\t273702\nFindlaw\t273703\n专升本网\t273704\n彩民\t273705\n恋恋情深\t273706\ngav\t273707\n哈尔滨第三中学\t273708\n布拉迪斯拉\t273709\n西玛\t273710\n珍赏\t273711\n葛军\t273712\n暖心话\t273713\nPengYunjing\t273714\n病毒样本区\t273715\n西柳\t273716\ndddd\t273717\n54名\t273718\n平移门\t273719\n刘月\t273720\n合肥地铁1号线\t273721\n武忠祥\t273722\n龙珠斗士\t273723\n卡斯蒂利亚\t273724\n15kv\t273725\n上海团市委\t273726\n浸染\t273727\n供求\t273728\n文忠午\t273729\nhalo\t273730\nXWiki\t273731\n芯片组\t273732\n陈从周\t273733\n汪东进\t273734\n最终稿\t273735\njige\t273736\n2018年2月9日\t273737\n校园读书节\t273738\n朱永飞\t273739\nceilometer\t273740\n3v3\t273741\n伯力\t273742\nipadqq\t273743\n汪昱\t273744\n试纸条\t273745\n内务府\t273746\n最后的故事\t273747\nD821\t273748\n生僻词\t273749\n十四五\t273750\n一汽海马\t273751\n八关\t273752\njava栈\t273753\nResolume\t273754\n25关\t273755\n李珍\t273756\n天峻县\t273757\n飞耐\t273758\n难尽\t273759\n牡丹江信息网\t273760\n助讯通\t273761\nplk\t273762\n北寺塔\t273763\n万能板\t273764\n佐仓绊\t273765\n痹症\t273766\n亮点\t273767\nBT种\t273768\n白金汉宫\t273769\nonu\t273770\n_倍\t273771\n情心\t273772\n见到\t273773\n极乐谷\t273774\n郢\t273775\n江泽林\t273776\n赵今麦\t273777\nglare\t273778\nWPS2012\t273779\n裙带\t273780\nGeaoZhang\t273781\n竹语\t273782\n不以为意\t273783\nfortinet\t273784\n第六十一\t273785\n百分之40\t273786\n思科网络技术学院\t273787\n上海工艺美术职业学院\t273788\n专本连读\t273789\nBasil\t273790\n大喘气\t273791\nelecom\t273792\nWinPcap\t273793\n无烟煤\t273794\n陈晓霞\t273795\n速播放\t273796\ngifs\t273797\nUCSC\t273798\nddos\t273799\n启星\t273800\n融资性担保公司\t273801\n导光\t273802\n无缝方矩管\t273803\n罗华\t273804\n城守护者\t273805\n免疫分析仪\t273806\n河口古镇\t273807\nskyroger\t273808\n冬运会\t273809\n展涛\t27
3810\n资产评估师考试\t273811\n公立学校\t273812\n梭形\t273813\n宁波公交查询网\t273814\n七名\t273815\n银耳羹\t273816\n南充新闻网\t273817\n易感人群\t273818\n退档\t273819\nUndertale\t273820\nSeve\t273821\n爱柯迪\t273822\ndeprecation\t273823\nmeihua\t273824\n德玛西亚杯\t273825\n潘宁\t273826\n金辉优步大道\t273827\n朝\t273828\n穆索尔斯基\t273829\n广州中学\t273830\namaze\t273831\n中国教育电视台\t273832\noffic\t273833\n2.18\t273834\n随身播放器\t273835\nPNP\t273836\nspbill\t273837\n背上\t273838\nSomebody\t273839\nztv\t273840\n国宝\t273841\n商业版\t273842\n隔山打牛\t273843\nWeblogic\t273844\n兵乓\t273845\n韦伦\t273846\n中信重工\t273847\nmovidius\t273848\n缩编\t273849\n国台办\t273850\n蛇口街道\t273851\n外加一\t273852\n四大天王\t273853\nDirect\t273854\n泥人\t273855\n勒泰\t273856\n锁麟囊\t273857\n力挺\t273858\n巨化\t273859\n河北人才网\t273860\n一单元\t273861\n录影带\t273862\n乔叟\t273863\n邮政大厦\t273864\n慕尼黑上海光博会\t273865\n中国商网\t273866\n3年多\t273867\n首贷\t273868\nsimilarity\t273869\n口袋妖怪心金魂银\t273870\n北京市人民政府\t273871\nK42\t273872\n想说的话\t273873\n古印\t273874\n24句\t273875\n东风汽车集团股份有限公司\t273876\n28257\t273877\n大硕\t273878\nPhysical\t273879\n老北京\t273880\n3cr13\t273881\n后现代主义\t273882\n生活习惯\t273883\n江上志保\t273884\nEDT\t273885\n旺苍\t273886\nfrp\t273887\n哈琳\t273888\n卡纳斯\t273889\n滋长\t273890\n吉他谱_六线谱弹唱谱\t273891\n新世界\t273892\n凉城县公安局\t273893\n601688\t273894\n不周\t273895\nxpi\t273896\n蒙浅雪\t273897\n95188\t273898\n单宁酸\t273899\n53\t273900\n6.1寸\t273901\n85000\t273902\n楞\t273903\n大白熊\t273904\nv3.2.3\t273905\n一团和气\t273906\nDeemo\t273907\n港商\t273908\n二妮\t273909\nJMsolution\t273910\n惊讶\t273911\n以史为\t273912\n群众路线\t273913\n途观L\t273914\nPDF版_夏泽网\t273915\nGRG\t273916\n炉盘\t273917\n弹出\t273918\n鲁汶\t273919\nteam\t273920\n标致摩托\t273921\n高崎国际机场\t273922\n双扣\t273923\n2014年11月20日\t273924\nParc\t273925\n5225\t273926\n诚基中心\t273927\n不法之徒\t273928\n一千零一夜\t273929\n路边社\t273930\n适婚\t273931\n补铁\t273932\n3组\t273933\nsweety\t273934\nqq头像_天极网\t273935\n佳茵益生菌\t273936\n老约翰绘本馆\t273937\n小泡\t273938\n山东广播电视大学\t273939\n凌晨四点半\t273940\n将错就错\t273941\n铁腿\t273942\nⅤ\t273943\n大足石刻\t273944\n窦房结\t273945\n陆三金\t273946\n6GB\t273947\n鄂式破碎机\t273948\n狼吻\t273949\n欧李\t273950\nRita\t273951\n
GIFT\t273952\novito\t273953\n第100次\t273954\nRADEON\t273955\n启东市\t273956\n泡酒瓶\t273957\n亚元\t273958\n紫荆园\t273959\n强袭\t273960\n伍家岗区\t273961\n江户时代\t273962\n陈田村\t273963\n南通滨海园区\t273964\n沈剑\t273965\n锐取\t273966\n2200g\t273967\n品评\t273968\n新闻传播\t273969\n宁洱\t273970\n123456\t273971\n小黄龙\t273972\n有户\t273973\n岁岁\t273974\n临空奥特莱斯\t273975\nek\t273976\nFileSystem\t273977\nzsdaka\t273978\n仙鸣\t273979\n天铭\t273980\n花粉症\t273981\n诚泰\t273982\n5133\t273983\n沉眠\t273984\n富士康科技集团\t273985\n阎立\t273986\n8重\t273987\n光点\t273988\n照表\t273989\n乌黑\t273990\n线性回归分析\t273991\nDNAT\t273992\n半厘米\t273993\n立国\t273994\n魔兽争霸3\t273995\nAKBINGO\t273996\n顾随\t273997\n不间断\t273998\n家居展\t273999\n红旗\t274000\n天天佣兵天下\t274001\n电模\t274002\n模糊推理\t274003\n尾部\t274004\n收受贿赂\t274005\nwii模拟器\t274006\n数控车\t274007\n379\t274008\n第43条\t274009\n证券事务代表\t274010\n养血清脑颗粒\t274011\n洛英\t274012\n南丽湖\t274013\n高抛低吸\t274014\n增值税税\t274015\n公立大学\t274016\n王银成\t274017\nFindingSchool\t274018\n都安瑶族自治县\t274019\n合焦\t274020\n007号\t274021\n汉语拼音声母\t274022\n张生记\t274023\n茅膏菜\t274024\n六安市叶集区人民政府\t274025\n健翔\t274026\nDLC版\t274027\n商队\t274028\nlinux-x64\t274029\n柳泉镇\t274030\n大麦路由器\t274031\n中华人民共和国国家工商行政管理总局\t274032\n国事\t274033\n受送\t274034\n霁月\t274035\n田东县\t274036\n苏悦广场\t274037\n狼女\t274038\n払\t274039\n5.00\t274040\n盖土\t274041\n雪佛兰爱唯欧\t274042\n2009-2014年\t274043\n淞南镇\t274044\n登山赛车2\t274045\n六爻排盘\t274046\nCAD格式\t274047\n跑动\t274048\n降至\t274049\n淌\t274050\nTWG\t274051\n观察力\t274052\n木嬴\t274053\n爱快流控软路由\t274054\n锦衣夜行\t274055\n盆花\t274056\n集中营\t274057\n三四万\t274058\n战争之人2\t274059\n攻博\t274060\n0.0%\t274061\nAMAZON\t274062\ndecember\t274063\n绝地求生刺激战场为什\t274064\ncdx\t274065\n銭\t274066\n广播网\t274067\n封闭性\t274068\n浙江卫星石化股份有限公司\t274069\nRebecca\t274070\n用字母表示数\t274071\n草晶华\t274072\n愚民\t274073\n文学家\t274074\n失语\t274075\n哥窑\t274076\n浮士林正英\t274077\n0.8.1\t274078\n喜妇\t274079\nbanding\t274080\n9030\t274081\n云裳广场舞\t274082\n连玉君\t274083\nOptimum\t274084\n郭东生\t274085\n尤纳斯\t274086\n吕斌\t274087\n六十二\t274088\n过去一个月\t274089\n轻钢结构\t274090\n孟凡柱\t274091\n龙市镇\t274092\nstimulating\t274093\nXnxx\t2
74094\n玫瑰园\t274095\n赵家堡\t274096\n石厦\t274097\n老庙黄金\t274098\nMarmoset\t274099\n千岛湖景区\t274100\n沌口开发区\t274101\n湾鳄\t274102\n黄头发\t274103\n陈琛\t274104\nwarwick\t274105\n单镜头\t274106\n洁齿\t274107\npaginate\t274108\nbsu\t274109\nneo4j\t274110\n就绪\t274111\n第16天\t274112\n牛百川\t274113\n仓鼠百科\t274114\nbromide\t274115\nnwc\t274116\nipcc\t274117\n特派\t274118\n上饶新闻网\t274119\ninvolves\t274120\n市住建局\t274121\n天涯剧社_论坛\t274122\n0.8元\t274123\napter\t274124\n大江\t274125\nQSY\t274126\n江水\t274127\n马青\t274128\n3d溜溜网\t274129\n8297\t274130\n雅鲁藏布大峡谷\t274131\n回奶\t274132\n塑料件\t274133\n新校\t274134\n7.34\t274135\n天锐\t274136\n运河路\t274137\n抓实\t274138\n良风\t274139\n_桂阳县政府\t274140\n非公有制\t274141\nlinksys\t274142\n橡木\t274143\n棚户\t274144\n_山\t274145\n月儿弯弯照九州\t274146\nDPT\t274147\n广东省地质局\t274148\n微分几何\t274149\n1179\t274150\n黑茶\t274151\n天安保险\t274152\n深圳地铁\t274153\npolygons\t274154\n鸡毛菜\t274155\nrst\t274156\npotentially\t274157\n34厘米\t274158\n瓜达拉哈拉\t274159\n起点吧\t274160\n相撞\t274161\n苏州都市网\t274162\n贝类\t274163\n解表\t274164\n鬼吹灯之黄皮子坟\t274165\n救亡图存\t274166\nIX35\t274167\nc++\t274168\n灵力\t274169\n情報\t274170\nbacking\t274171\n手机银行\t274172\nFarmers\t274173\n5400元\t274174\n污水提升器\t274175\nZong\t274176\n600177\t274177\n梧州\t274178\n内流\t274179\n克拉美\t274180\n2000万\t274181\ndefaulten\t274182\n越南战争\t274183\n于国也\t274184\n蛇蝎美人\t274185\n樱美雪\t274186\nd7100\t274187\n验尿\t274188\nP0口\t274189\n中国旅游集团\t274190\nfastlane\t274191\n扒开\t274192\n柞木\t274193\n第二性征\t274194\n发于\t274195\n退费\t274196\n苏雅琴\t274197\n000800\t274198\n烤鱿鱼\t274199\n21【\t274200\n武力\t274201\n遭嫌弃\t274202\nDOM父\t274203\nrn\t274204\nCAT\t274205\n王月波\t274206\n四等\t274207\nchuangye\t274208\n代表物\t274209\n霍福德\t274210\n247\t274211\n春廷\t274212\n诃子\t274213\n俄语在线翻译\t274214\n画廊\t274215\nVTP\t274216\nabobe\t274217\ngonna\t274218\n奏凯\t274219\nciub\t274220\n中文百科在线\t274221\n三缸\t274222\n理想型\t274223\nZayn\t274224\n50厘米\t274225\n尤迪安\t274226\n行车灯\t274227\nvooec\t274228\nbentley\t274229\ngv男优\t274230\n西河村\t274231\n刘宏毅\t274232\n花园新村\t274233\n焦扬\t274234\nAppData\t274235\n禁售\t274236\n安踏体育\t274237\n米莎\t27
4238\n中共邹城市委对外宣传办公室\t274239\n人福\t274240\n夜夜夜夜\t274241\n偏置\t274242\nMONTBLANC\t274243\n孙维刚\t274244\nusb万能驱动\t274245\nLong型\t274246\n心华\t274247\n天河龙洞\t274248\n6月9日\t274249\n四成\t274250\nserver端\t274251\n童字\t274252\n徐然\t274253\n傅明宪\t274254\n北方民族大学\t274255\n参赛\t274256\n贝岭\t274257\nPSD模板\t274258\nbefore\t274259\n三端稳压器\t274260\n重组人干扰素\t274261\n才说\t274262\n脑转移瘤\t274263\n淹水\t274264\n惠城\t274265\n美焰\t274266\n煤安\t274267\n7.3_\t274268\n商数\t274269\n最后一天\t274270\n瀚华金控\t274271\n中科建设\t274272\n大圈\t274273\n李淑一\t274274\n三席\t274275\n乔松\t274276\nappv\t274277\nmoji\t274278\n2017年5月8日\t274279\n神经外科\t274280\n庆生\t274281\n火星时代\t274282\n西局\t274283\n精装套\t274284\n知乎达人\t274285\nhose\t274286\n宝鸡火车站\t274287\n降头\t274288\ncln\t274289\n诏书\t274290\n免去\t274291\n17位\t274292\nappl\t274293\n2个多小时\t274294\n水兽\t274295\ntensorlfow\t274296\n宁夏电视台\t274297\nLevis\t274298\n凯酷\t274299\n胎体\t274300\n顔\t274301\nExcelPro\t274302\n第十七届\t274303\n安徒生\t274304\nSnacks\t274305\n糜烂\t274306\n四卷\t274307\n斯坦福大学\t274308\nvgs\t274309\n十六大\t274310\n金沙洲\t274311\n贯休\t274312\n孝道\t274313\nVerification\t274314\n安葬\t274315\n不攻\t274316\n责任公司\t274317\n宁波市鄞州区公共资源交易中心\t274318\n基尔霍夫定律\t274319\nrationale\t274320\n策划类\t274321\n172.16.255.255\t274322\n青岛港集团\t274323\n人大\t274324\n王守义十三香\t274325\nassume\t274326\n自助机\t274327\nMoxa\t274328\n苏州大学艺术学院\t274329\nzoeva\t274330\nBootstrap中文网\t274331\n磁感应强度\t274332\njihuo\t274333\n珀莱雅\t274334\n刘宝华\t274335\napartment\t274336\n研发商\t274337\nFiddler2\t274338\nppf\t274339\nStefan\t274340\n0608\t274341\nstar\t274342\n卜居\t274343\n健一网\t274344\n3D电视\t274345\n抚顺百姓网\t274346\n编舞师\t274347\n李妍熙\t274348\n一_\t274349\n卓原\t274350\nbikes\t274351\n铝锅\t274352\n气派\t274353\n钒电池\t274354\n社保卡\t274355\n火炬手\t274356\n制表人\t274357\n岳同学\t274358\n电神\t274359\n乙肝病毒表面抗体\t274360\n哈弗H2\t274361\n五十\t274362\n车用电路\t274363\n竹子林\t274364\n1厅\t274365\n铺张\t274366\n圣宗\t274367\n安息\t274368\n茶竹\t274369\n珍馐\t274370\n读书感\t274371\n自由党\t274372\n主客观\t274373\n快速以太网\t274374\n石观音\t274375\n焊接管\t274376\n艮山东路\t274377\n蔡晧东\t274378\n道真\t274379\n领导班子\t274380\ncontinge
ncy\t274381\n香港国际电影节\t274382\n杜小康\t274383\n茶汤\t274384\n厦门国际银行\t274385\n金吉鸟\t274386\n国家标准\t274387\n外销\t274388\nTRANCE\t274389\n十七大\t274390\n土旺\t274391\nCC霜\t274392\n中国科学院地球环境研究所\t274393\n环己烷\t274394\n参者\t274395\nnewbee\t274396\n83年\t274397\n妙木山\t274398\n盛夏好声音\t274399\n老刁\t274400\n家装饰\t274401\n逃家\t274402\n美术\t274403\n这块\t274404\n串钩\t274405\n七枷社\t274406\n持咒\t274407\n艾美达\t274408\n银行业务\t274409\n新竹市\t274410\n龙洲股份\t274411\n质土\t274412\n外汇学院\t274413\n饲料颗粒机\t274414\n贵州工商职业学院\t274415\n声纹\t274416\n艰巨性\t274417\n江南村\t274418\n换算率\t274419\n肯干\t274420\n南漳县\t274421\n自动化学报\t274422\n城市建筑工程停车场\t274423\n10012\t274424\n日曜\t274425\n供应链\t274426\n垃圾渗滤液\t274427\n隋代\t274428\n厦门图书馆\t274429\nmeat\t274430\n手绢\t274431\n彩虹大道\t274432\n怒火街头2\t274433\n可靠\t274434\n鬼蜮\t274435\n盐城市第一小学教育集团\t274436\n三折\t274437\n能\t274438\n雅各布森\t274439\n中国外运股份有限公司\t274440\n上网行为管理软件\t274441\n宠物栏\t274442\nGuru\t274443\n异形:契约\t274444\n好心情\t274445\n夏庄镇\t274446\n非师范类\t274447\nRally\t274448\n雀儿\t274449\n变型男\t274450\n何丽\t274451\n朵嘉浓\t274452\n成心\t274453\n灰色地带\t274454\n白牡丹\t274455\n酷路泽4000\t274456\n帮女\t274457\nwebapps\t274458\n斯洛克\t274459\n金伯利\t274460\nname值\t274461\n膝盖\t274462\nindustries\t274463\n小仓\t274464\n定南县\t274465\n新乐尘符\t274466\n比增\t274467\n丙戊酸钠\t274468\n黑卡\t274469\n尖细\t274470\ngfriend\t274471\n别傻\t274472\n闰年\t274473\n广东新安职业技术学院\t274474\n余杭农村商业银行\t274475\n二面角\t274476\n诺顿定理\t274477\nqq象棋\t274478\n健全\t274479\n新安晚报数字报\t274480\n渔歌子\t274481\n瘤胃\t274482\n惟独\t274483\n单选框radio\t274484\n陈小朋\t274485\nSeparator\t274486\n渐趋\t274487\n宝丽来\t274488\n咬牙切齿\t274489\n定装\t274490\napriori算法\t274491\nplayframework\t274492\nNokia7\t274493\n功不可没\t274494\n驽马\t274495\nTSearch\t274496\n天下\t274497\n绯闻\t274498\nflash游戏\t274499\nsativa\t274500\n首都医科大学附属北京康复医院\t274501\n意甲\t274502\n有钱任性\t274503\nhme\t274504\nJournalist\t274505\n撒意思\t274506\nTaxi\t274507\n值\t274508\n液体石蜡\t274509\n陶艺家\t274510\n千部\t274511\nlumia800\t274512\n乳胶漆\t274513\n校验仪\t274514\n药房\t274515\nForeman\t274516\n屠杀者\t274517\n玉梅\t274518\n这天\t274519\nminidwep\t274520\n上海打捞局\t274521\nforma\t27452
2\n子公\t274523\n班次表\t274524\n人工泪液\t274525\n世欧\t274526\nv2.0_\t274527\n书链\t274528\n六胜肽\t274529\n中新广州知识城\t274530\n蛇药\t274531\n农家童养媳\t274532\n丰巢科技\t274533\n安卓版v1.0\t274534\n沃云\t274535\n羊排\t274536\n葡萄糖氧化酶\t274537\nzhinan\t274538\n0606\t274539\n搬\t274540\nEngraving\t274541\n2017年9月26日\t274542\n氧气罐\t274543\n芥末酱\t274544\necitic\t274545\nJustice\t274546\n针刺毡\t274547\n2K12\t274548\n6.3.5\t274549\ntuv\t274550\n综艺秀\t274551\n吴若希\t274552\n捎客\t274553\n丰州\t274554\n黄刺鱼\t274555\n氛围\t274556\n矮行星\t274557\n虎美玲\t274558\n上半月\t274559\nArmored\t274560\n庄行\t274561\n流清\t274562\nAppliances\t274563\n米拉\t274564\n4045\t274565\n锁止\t274566\n狮虎兽\t274567\n工程\t274568\n汉沽区\t274569\nScholastic\t274570\n6013\t274571\nlot\t274572\n信报箱\t274573\nAPE\t274574\n2017年末\t274575\n481001\t274576\n太原日报\t274577\n经侦科\t274578\npk赛\t274579\n维米尔\t274580\n国内源\t274581\n收波\t274582\n青青草成人在线视频_青青草在线\t274583\n四物\t274584\n赖氨酸\t274585\n聚\t274586\n平野\t274587\n一五年\t274588\n阿吉仔\t274589\n订阅\t274590\nVmware\t274591\n害群之马\t274592\n1月12日\t274593\nitem项\t274594\n整行\t274595\n4399奥比岛\t274596\n时来运转\t274597\n精神饱满\t274598\n佳能镜头\t274599\nSchwarz\t274600\n相称\t274601\n无主之地前传\t274602\n粉丝机\t274603\n艾兰·沃克\t274604\n派发\t274605\n僵尸世界大战\t274606\n四川组工网\t274607\n渠道服\t274608\n漏电\t274609\n银行间交易商协会\t274610\ncabal\t274611\nSNH48歌迷会|SNH48粉丝团|SNH48饭\t274612\n刘翔\t274613\n佛山市人民政府\t274614\n青酒\t274615\n铸造工\t274616\n千古\t274617\ncli\t274618\n尿液\t274619\nbdf\t274620\n逸林\t274621\nQQ电话\t274622\n日签\t274623\n合肥工商银行\t274624\n斯柯达柯迪亚克\t274625\n第118届\t274626\n阳泉矿区\t274627\n成人礼\t274628\nZset\t274629\n九五至尊酒\t274630\n华派\t274631\njewellery\t274632\n笑而不语\t274633\nHNDS\t274634\n沛公\t274635\n南宁学院\t274636\n绣眼\t274637\n游姬\t274638\n买强卖\t274639\n经转\t274640\n家长制\t274641\n光电池\t274642\n进服\t274643\n3.9%\t274644\n赌船\t274645\n长日\t274646\n13x\t274647\n冯敏\t274648\n酰化\t274649\n#机器人争霸#\t274650\nhandoff\t274651\n祁连山\t274652\n小米6锁屏\t274653\nGifted\t274654\n五带\t274655\nNanny\t274656\n北青报\t274657\n爱慕鲜花网\t274658\n三亚国际免税城\t274659\n怯\t274660\n胆石症\t274661\n奇肌\t274662\nsqlal\t274663\n数控机\t274664\n神偷4\t2
74665\n少儿国寿福\t274666\n上海市公积金管理中心\t274667\n03月10日\t274668\n乐拍通\t274669\n北京航空航天大学电子信息工程学院\t274670\n辣妹子\t274671\n俞莲舟\t274672\n鱼粪\t274673\n莎车县\t274674\nTAKE\t274675\n茶山街道\t274676\n法身\t274677\n氢气\t274678\nJE技校网\t274679\nHouston\t274680\n八小\t274681\n嘉许\t274682\n港泉\t274683\n星魂战神笔趣阁_星魂战神顶点_星魂战神灵隐狐\t274684\n第13\t274685\n广西农科院\t274686\n王艺诺\t274687\nzhichi\t274688\n少给\t274689\n再次出现\t274690\nSolomon\t274691\n小儿湿疹\t274692\n买入法\t274693\n二十斤\t274694\n【众泰T500】新众泰_众泰T500报价|图片_2018众泰T500\t274695\n建筑段\t274696\n黑色五叶草\t274697\n天津中医药大学第二附属医院\t274698\n白石龙\t274699\n启富\t274700\n火精灵\t274701\n黑裙子\t274702\n腹腔积液\t274703\nicons\t274704\n奥乐\t274705\n知北游\t274706\n穹窿山\t274707\ncheatengine\t274708\n中国好声音吧_\t274709\n法政学院\t274710\nm8\t274711\n休止符\t274712\n校卡\t274713\n玉衣\t274714\n信息安全工程师考试\t274715\n相术\t274716\n叶迎春\t274717\n小鸭\t274718\n盛世才\t274719\n张高丽\t274720\n高阳镇\t274721\n诺米\t274722\nぃ\t274723\n吊床\t274724\n芝加哥期货交易所\t274725\n吴亦\t274726\navz\t274727\n12.9英寸\t274728\nwellington\t274729\n囹圄\t274730\n射电望远镜\t274731\n爱奇艺网\t274732\ncforeach\t274733\nprairie\t274734\n2.4万\t274735\n方正字体库\t274736\n土巴兔装修网\t274737\n健步行\t274738\n凝露\t274739\n君君\t274740\n心上\t274741\n色伦\t274742\n值变\t274743\n马庆勇\t274744\n韩寒马龙\t274745\n高旻寺\t274746\n351号\t274747\n湿湿\t274748\n李叔\t274749\n电动平车\t274750\n第七十六章\t274751\nom\t274752\n截\t274753\n新疆男篮\t274754\n超皮\t274755\n尤雅\t274756\n沟槽\t274757\n1.9.0\t274758\n2358\t274759\n云生\t274760\n魂者\t274761\n鼻尖\t274762\n900部\t274763\nWalker\t274764\n咪喹莫特\t274765\n台寺\t274766\n广富林郊野公园\t274767\n杂牌子\t274768\n陈小菜\t274769\n叶洛洛\t274770\n帅丰\t274771\n红会\t274772\n砂带\t274773\nrtklib\t274774\nweb浏览器\t274775\n二百万\t274776\n杰瑞集团\t274777\nnowadays\t274778\n150L\t274779\n2017.3.1\t274780\n夺宝奇兵1\t274781\n武汉幼升小\t274782\n袁强\t274783\n张机\t274784\nROUTE\t274785\n党媒\t274786\nU2\t274787\n虚拟追踪者\t274788\n下田\t274789\npeta\t274790\nxiu\t274791\n异度之刃2\t274792\n大浙\t274793\nvdl\t274794\n两千年前\t274795\n上海泛微网络科技股份有限公司\t274796\n灵枪\t274797\nericsson\t274798\n两仪\t274799\n卡盟网\t274800\n嘉合\t274801\n丛编\t274802\nreturns\t274803\n空客A330\t274804\n弹力\t
274805\n台门\t274806\n会声会影5\t274807\n车险信息网\t274808\n炫纹\t274809\n超验主义\t274810\n布加迪威航\t274811\n300万元\t274812\n才女貌\t274813\n绝地求生刺激战场吧_\t274814\n腺样\t274815\nMozilla\t274816\nsujiao\t274817\n0.00\t274818\n大齐\t274819\n槽刀\t274820\n科恩兄弟\t274821\n鬼马星\t274822\n双天\t274823\nSQM\t274824\n符皇\t274825\nGatsby\t274826\n架空\t274827\n082\t274828\n陈香露\t274829\nreds\t274830\n格格\t274831\nHENkaku\t274832\n通络\t274833\n坨子\t274834\n凉拌皮蛋\t274835\nKSV\t274836\n相关性研究\t274837\nhoneymoon\t274838\nwank\t274839\n工本费\t274840\njcifs\t274841\n3164\t274842\n明年10月\t274843\n2018年03月17日\t274844\n南湖国旅\t274845\n监督权\t274846\n2-3\t274847\nmsysgit\t274848\n贵阳市白云区人民政府\t274849\n狂热者\t274850\n过款\t274851\nmim\t274852\n机械能\t274853\nScrollbar\t274854\n湖北省农业厅\t274855\n深圳平乐骨伤科医院\t274856\nextrude\t274857\n金保工程\t274858\n你的情深\t274859\nKX5\t274860\n汉帝\t274861\n流行风\t274862\n垫背\t274863\n应城市\t274864\n助手\t274865\nn4s\t274866\niis6\t274867\n厦禾路\t274868\n在线散文网\t274869\n派生\t274870\n凤翔花园\t274871\n我为歌狂\t274872\n芙蓉区\t274873\n合作区\t274874\n明水县\t274875\n还原剂\t274876\n8度\t274877\n长沙万达文华酒店\t274878\n人棉\t274879\n230cm\t274880\n4芯\t274881\n售楼\t274882\ncoli\t274883\nBVLGARI\t274884\n菲菲菲菲菲\t274885\n杨树浦路\t274886\n大唐集团\t274887\n陈青\t274888\n梦幻西游手游\t274889\nWWDC\t274890\n美韵\t274891\n开门机\t274892\n屋大维\t274893\n居于\t274894\n鞭子\t274895\nKonami\t274896\nRRL\t274897\n桃枝\t274898\nMarkII\t274899\n叶灵\t274900\n0009\t274901\n三千五\t274902\n多亏\t274903\n韩瑞\t274904\n福建农信\t274905\n楼继伟\t274906\n本品\t274907\nfastclick\t274908\n慢_\t274909\n92天\t274910\n南方电网\t274911\n丁薇\t274912\n8648\t274913\n诚信网\t274914\n百集\t274915\n张信哲\t274916\n广州公司\t274917\n宝骏730\t274918\n导改\t274919\n倩女ol\t274920\n赵士杰\t274921\n1321\t274922\n完美(中国)有限公司\t274923\n电脑游戏\t274924\n广州东火车站\t274925\n中点\t274926\nmaxthon\t274927\n浓情\t274928\n梦里花\t274929\n原教旨主义\t274930\n小青龙\t274931\n占补\t274932\n紫光园\t274933\n白龙镇\t274934\n陈美锦\t274935\nherry\t274936\nmqms2\t274937\n冷冰冰\t274938\n省制\t274939\nCOP\t274940\n康熙来\t274941\n养路\t274942\nLOW\t274943\n因\t274944\n第6条\t274945\n董振华\t274946\nterminology\t274947\n宠\t274948\n耍猴\t274949\
n精原\t274950\n曾毓群\t274951\n工商银行信用卡\t274952\n张学良\t274953\nhappybase\t274954\n彭城\t274955\n黑白图\t274956\n晚婚假\t274957\n100千米\t274958\n偷心贼\t274959\n糟\t274960\n性格\t274961\n腾讯通\t274962\n储配\t274963\n200ML\t274964\n读入\t274965\n蒸制\t274966\npart2.rar\t274967\nbenchmarking\t274968\n45P\t274969\nqqq\t274970\n集评\t274971\nchemy\t274972\n红外遥控\t274973\n青岛市工商局\t274974\n花骨\t274975\n第五天\t274976\n农庄\t274977\nx300\t274978\n70后\t274979\n70幅\t274980\n一母\t274981\ncardtd\t274982\n20161023\t274983\n潮店\t274984\n单期\t274985\n泰妆\t274986\n2000多亿\t274987\n中国电子信息行业联合会\t274988\nmaxx\t274989\n双侠\t274990\n一切\t274991\n蓝蝶\t274992\n划痕\t274993\n5000美元\t274994\n皇后区\t274995\nVis\t274996\nbaka\t274997\n五谷\t274998\n龙嘉机场\t274999\nTranslate\t275000\n邢台政府网\t275001\n北京联合出版公司\t275002\n惠新东街\t275003\n瑞虎\t275004\n19p\t275005\n金马国旅\t275006\n吕丽君\t275007\n那个\t275008\nlister\t275009\n上海人才大厦\t275010\n第108\t275011\n飞傲x7\t275012\npc机\t275013\nR290\t275014\n剑网三2017\t275015\n百战经典\t275016\n相及\t275017\n车轮组\t275018\n玉湖\t275019\n购物者\t275020\n收买\t275021\n不和\t275022\nMerriam-Webster\t275023\nGen2\t275024\n20141031\t275025\nAsmr\t275026\nEbates\t275027\n影客\t275028\n星辉娱乐\t275029\n511990\t275030\n锂基脂\t275031\nPDF+MP3\t275032\nWoven\t275033\n兴业投资\t275034\n四道\t275035\n1.1.12\t275036\n矩阵范数\t275037\n姑嫂\t275038\n叮咚智能音箱\t275039\n367\t275040\n篆文\t275041\n高俅\t275042\n吴彤\t275043\n狐臭味\t275044\ntiles\t275045\n国标法\t275046\n中国移动通信集团安徽有限公司\t275047\n泰勒\t275048\n荷塘镇\t275049\n心甘情愿\t275050\nayanmw\t275051\n轮虫\t275052\n理央\t275053\n連\t275054\nbusting\t275055\n引体向上吧\t275056\n障碍症\t275057\n颠倒歌\t275058\n工商登记\t275059\n超声波检查\t275060\nThursday\t275061\n82.7万亿\t275062\n周俊\t275063\nkuai\t275064\n足控吧\t275065\n健肌粉\t275066\n松桃网\t275067\n聚生\t275068\nili\t275069\nChennai\t275070\nWives\t275071\nhj\t275072\n广东万和新电气股份有限公司\t275073\n一宣\t275074\n中国通史\t275075\njavastring\t275076\n龙之众泰t600\t275077\ngroup\t275078\n蓝山郡\t275079\n目录项\t275080\nagiso\t275081\n湖北省八校\t275082\nQQ表情-我最个性网\t275083\n贵州广播电视台\t275084\n铁水\t275085\n共谱\t275086\n6.5厘米\t275087\n二次项\t275088\nBracket\t275089\n1941年
\t275090\n迷们\t275091\n习近平关于党风廉政建设和反腐败斗争论述摘编\t275092\nJewish\t275093\n周挺\t275094\n2801\t275095\n方莹\t275096\n万福路\t275097\n人民教育\t275098\n卓诗尼\t275099\n梵空禅院\t275100\n米顿罗计量泵\t275101\n竖流式\t275102\n计数仪\t275103\n近台\t275104\n孛\t275105\n老尸\t275106\n舰种\t275107\n驾考宝典\t275108\n雷山县人民政府\t275109\n感量\t275110\nCNC加工中心\t275111\n北京女排\t275112\n排山\t275113\n弈星\t275114\nVoltage\t275115\n滴鼻\t275116\n梅兰楚留香新传\t275117\nJSQ20\t275118\n花开性\t275119\n淅淅\t275120\n胎心监护仪\t275121\n莱美\t275122\nFPX\t275123\nShaders\t275124\n50佳\t275125\n6元\t275126\n2个\t275127\n德才\t275128\n鬼宿\t275129\n分期乐吧\t275130\n潘恩\t275131\n徐绍史\t275132\n攻牙机\t275133\n14万起\t275134\n胸膜炎\t275135\n梁板\t275136\n西沟村\t275137\n64M\t275138\n清华大学核能与新能源技术研究院\t275139\n镉污染\t275140\n清真菜馆\t275141\n凯麒\t275142\n快速批量\t275143\n新华东街\t275144\n沐足\t275145\nvibe\t275146\n佛山地区\t275147\nskd11\t275148\n东至县\t275149\n猫云\t275150\n益泗体育\t275151\n阀杆\t275152\n超星发现系统\t275153\nh5牛牛\t275154\n封丘吧_\t275155\n龙门架\t275156\nE6\t275157\nLid\t275158\n南越\t275159\n柳州市区\t275160\n巨尻\t275161\n20150417\t275162\n国管局\t275163\n紫衣\t275164\n_魔酷网\t275165\n冰封\t275166\n二傻\t275167\n吴志\t275168\n铂涛集团\t275169\ncleanmymac\t275170\ninputStream\t275171\n杨明生\t275172\n蜡烛人\t275173\nSOGI手機王\t275174\n搜索引\t275175\n阿莉\t275176\nU形\t275177\nInvalidate\t275178\n中国学生发展核心素养\t275179\n新疆发改委\t275180\n巴布洛生态谷\t275181\n代有\t275182\n金属撕碎机\t275183\n龙城街道\t275184\n佳能mg2580s\t275185\n无定河\t275186\n海表\t275187\n眼界\t275188\n泉林\t275189\n外酥\t275190\n禁驾\t275191\n面包机版\t275192\n南阳市中心医院\t275193\n超乎\t275194\n安徒程潇\t275195\nag亚游\t275196\n镇魂街\t275197\nNAVY\t275198\n宁波高铁站\t275199\n人保险\t275200\nnama\t275201\nDRC\t275202\n柴火灶\t275203\nxip\t275204\n框架页\t275205\n聚侠网\t275206\npestel\t275207\n用益\t275208\n三侠剑\t275209\n屋\t275210\n海南建省办经济特区\t275211\n杨孝文\t275212\n李迎春\t275213\nNorwegian\t275214\n交运集团\t275215\n脚上\t275216\n微塑料\t275217\n蒜薹\t275218\n水利风景区\t275219\n华塑\t275220\n统计学基础\t275221\n芥川龙之介\t275222\n第三版\t275223\n斜影清\t275224\n喉音\t275225\n食糖\t275226\n33期\t275227\n数十位\t275228\n桂河大桥\t275229\n零一个月\t275230\n迟帅\t275231\n上海虹桥高铁站\t275232\n奥鹏\t275233\n河北法制报数字
报\t275234\n美臣\t275235\n宅男\t275236\n龙种\t275237\n英雄联盟s8\t275238\n2017.2.3\t275239\n王大顶\t275240\n救亡\t275241\n四川美术学院\t275242\nphe\t275243\n1669\t275244\n教育部工程研究中心\t275245\n濒死\t275246\n丢尽\t275247\n世纪大厦\t275248\n明发江湾新城\t275249\n830\t275250\n这一步\t275251\nGwen\t275252\n仰韶文化\t275253\n0731房产网\t275254\n曼市\t275255\n3针\t275256\n新山\t275257\n尾巴\t275258\n细思\t275259\nOffi\t275260\nsyscolumns\t275261\n补习\t275262\n桃桃淘\t275263\nPCHIFI\t275264\n想试试\t275265\n梅特勒托利多\t275266\n根特\t275267\npinarello\t275268\n潸然泪下\t275269\n郑杭生\t275270\n二级建造师考试\t275271\n操尿\t275272\n赛乐赛\t275273\n煲汤食谱大全\t275274\n文森特·梵高\t275275\n帝王妻\t275276\n起诉书\t275277\ncna5\t275278\n跨国恋\t275279\n操作规程\t275280\nTreasury\t275281\n骁勇善战\t275282\n嬢様\t275283\n叶佳瑶\t275284\n方远\t275285\n李亚林\t275286\nIOCP\t275287\n注塑件\t275288\n吊缚\t275289\ncamtasia\t275290\n悠远\t275291\n蓝符\t275292\nmovement\t275293\n外环路\t275294\n浩爽\t275295\nBuff\t275296\n学神\t275297\nWordCount\t275298\n3.2流放之路\t275299\n列宽\t275300\n肺结节\t275301\n灯舞\t275302\n王启云\t275303\nMCS-51\t275304\n44%\t275305\n出口阀\t275306\n上海回力\t275307\n沈富雄\t275308\n芙蓉苑\t275309\n运进\t275310\n文筱婷\t275311\n张松\t275312\n大连市区\t275313\n顾盼\t275314\n安全期\t275315\n0430\t275316\n商州区\t275317\n大东北\t275318\nKeenLeung\t275319\n亲一下\t275320\n贾瑞\t275321\nthree\t275322\n喷射战士2吧\t275323\n兴山县人民政府\t275324\n生殖道\t275325\n15p\t275326\n哈伯\t275327\n灵格斯词霸\t275328\n订书器\t275329\n摩客\t275330\n水路\t275331\n清明三天\t275332\n裱框\t275333\n9月29日\t275334\n玻璃态\t275335\n意见\t275336\nOp\t275337\n自改\t275338\n急行\t275339\n赤兔\t275340\ndianxin\t275341\n澳门太阳城\t275342\ncredential\t275343\n甜柿\t275344\n谢非\t275345\nTextile\t275346\n破浪\t275347\n沈阳城市建设学院\t275348\n开位\t275349\nAllStar\t275350\n980路\t275351\n王易\t275352\n冰狐\t275353\n大连圣亚海洋世界\t275354\n金鹰股份\t275355\n大众途观\t275356\n操作岗\t275357\n荤腥\t275358\nA7SII\t275359\nGEAR\t275360\n新力\t275361\nx700\t275362\n佛山市\t275363\n李清南\t275364\nChatur\t275365\n鹤壁新闻网\t275366\n恐怖传说\t275367\n2550k\t275368\n讥\t275369\nWedge\t275370\n93%\t275371\n工业部\t275372\n5831663924\t275373\n足不出\t275374\n任清璇\t275375\n犬夜叉\t275376\n023\t275377\n0902\t
275378\n武动乾坤吧\t275379\n裴行俭\t275380\nIAT\t275381\n传达室\t275382\n装电\t275383\n江西省工信委\t275384\ntcd\t275385\n阳角\t275386\n海淀医院\t275387\n残兵\t275388\nPositions\t275389\n新手\t275390\n超梦吧\t275391\n价格\t275392\ncsxy\t275393\n4月21号\t275394\n崀山\t275395\n妖机\t275396\neBay\t275397\ntensentype\t275398\nBIGINT\t275399\nUs\t275400\n惹我\t275401\n乐盈\t275402\n天国拯救\t275403\n魅斑服饰旗舰店\t275404\nKazama\t275405\n100ms\t275406\n泰德\t275407\nsizes\t275408\n钟志生\t275409\n19组\t275410\n2009年11月\t275411\n前30年\t275412\nCMP\t275413\n密宝\t275414\n一汽丰田4S店\t275415\n酷刑\t275416\nintall\t275417\n不饱和烃\t275418\nnodemcu\t275419\n名族\t275420\nAppLocale\t275421\nconfigurator\t275422\n炖鱼汤\t275423\n独立王国\t275424\nvb2008\t275425\n太湖先锋网\t275426\n中华全国律师协会\t275427\n天元大厦\t275428\n新课标人教版小学\t275429\n高湛\t275430\n木跳板\t275431\n哨塔\t275432\n印象刘三姐\t275433\n国骂\t275434\n惠斯登电桥\t275435\nLbs\t275436\n瑞安园\t275437\n气急败坏\t275438\n顶号\t275439\n腻歪\t275440\n第一帖\t275441\n远安\t275442\nInclusive\t275443\n闽南菜\t275444\n销项税\t275445\n合肥论坛\t275446\n陈益峰\t275447\n气动调节阀\t275448\n生死两相欢\t275449\n湾仔\t275450\nremove\t275451\nminiDP\t275452\n障碍物\t275453\n计程车\t275454\n马里兰\t275455\n票款\t275456\njetty9\t275457\n柱墩\t275458\n餐厨\t275459\n金百达\t275460\n89平方\t275461\n岸部真明\t275462\n樱都\t275463\n平果\t275464\n中国政府网_中央人民政府\t275465\n多选题\t275466\nMacintosh\t275467\n踩裆\t275468\n良方\t275469\n浙江省人大常委会\t275470\n太原武宿机场\t275471\n苏州市社会保险基金管理中心\t275472\n老番茄\t275473\nHom\t275474\n活摘\t275475\n张玉堂\t275476\n三大亨\t275477\nswzx\t275478\n马蜂窝\t275479\n太郎花子\t275480\n蒲熠星\t275481\n0.7.2\t275482\n上书网\t275483\n草泥马\t275484\n终场\t275485\n郭树清\t275486\n绵阳经开区\t275487\n油垢\t275488\n散居\t275489\n电蒸锅\t275490\n乐达\t275491\nAdjust\t275492\n拆卸\t275493\nstereotype\t275494\n至潮\t275495\n吃好饭\t275496\n库乐队\t275497\nDave\t275498\n朗伯比尔定律\t275499\n领航人才网\t275500\n黄蜂队\t275501\n蛇胆川贝液\t275502\n石榴庄\t275503\n胡大饭馆\t275504\n周奇\t275505\n学以致用\t275506\n特兰克斯\t275507\n全责\t275508\n德克\t275509\n海南万宁政府网\t275510\n磨管\t275511\n模压机\t275512\n曾经沧海难为\t275513\n悬梁\t275514\nedits\t275515\nAndroid6.0\t275516\n中国社会学会\t275517\n陆九渊\t275518\n20180418\t275519\n
合力叉车\t275520\n酒心\t275521\n外研社\t275522\n粗放式\t275523\n2月1\t275524\n倒数第\t275525\n宁波华美医院\t275526\n说什么\t275527\n文源\t275528\n3.1415926\t275529\n塑性\t275530\n大洼\t275531\n惊秫\t275532\n现地\t275533\n生死狙击手游\t275534\nwhatapp\t275535\n任城\t275536\n凌美2000\t275537\n钙钛矿太阳能电池\t275538\n斑贴\t275539\n账面价值\t275540\n666666\t275541\n雄克\t275542\n3dxml\t275543\n装配率\t275544\nmax\t275545\n民家\t275546\nerf\t275547\n吴江市盛泽镇盛泽实验小学\t275548\n新蒙\t275549\n2017年8月28日\t275550\n马里亚纳\t275551\nwwwse94secom\t275552\nPLD\t275553\n狮子王辛巴\t275554\n走神\t275555\n后来居上\t275556\nepiphone\t275557\nDZ\t275558\n周雯\t275559\n废锡\t275560\n大显神通\t275561\nhw\t275562\n揽月\t275563\n定安县人民政府\t275564\n无法割舍\t275565\n固态硬盘吧\t275566\n魔块\t275567\nNaNa\t275568\n710S\t275569\n鸠山由纪夫\t275570\n影碟机\t275571\n星人\t275572\n梦里\t275573\njunos\t275574\nlichen\t275575\nQQYA\t275576\nfewer\t275577\n毛集\t275578\n小花们\t275579\n电力系统分析\t275580\nPTT\t275581\nV15\t275582\n5590\t275583\n沙田柚\t275584\n极路客\t275585\n科普之窗\t275586\n万象\t275587\n感谢语\t275588\n莼菜\t275589\n哈德逊\t275590\nAccessibility\t275591\n晶体学\t275592\ndsdt\t275593\n地弹簧\t275594\n队友们\t275595\n设计源\t275596\nS2520\t275597\n五月天\t275598\nisakmp\t275599\nG9\t275600\n雅子\t275601\nmacx\t275602\n钟勇\t275603\n量力而行\t275604\n158万\t275605\n柱力\t275606\n陈琳\t275607\nwudao\t275608\ndeck\t275609\nHoody\t275610\n长沙格力空调\t275611\n北京电影学院现代创意媒体学院\t275612\n阿斯巴甜\t275613\n苏州市科学技术局\t275614\n800万美元\t275615\n澄海3C\t275616\nDANG\t275617\n黑客帝国1\t275618\n四明山镇\t275619\nFigma\t275620\n津南\t275621\n许衡\t275622\n达克罗宁\t275623\n印尼鹰航\t275624\n纪元专区\t275625\n37\t275626\n石排镇\t275627\n109\t275628\n基因测序仪\t275629\n复旦大学法学院\t275630\n滨海湾花园\t275631\n8月12日\t275632\nEXT4\t275633\n程间通信\t275634\n关内\t275635\n品尚\t275636\nantd-mobile\t275637\nASS\t275638\n2016年双十一\t275639\n生效\t275640\n衣身\t275641\n20131206\t275642\n动感单车\t275643\ndeadpool\t275644\nAriana\t275645\n花生豆\t275646\n椎动脉型颈椎病\t275647\nXT2\t275648\n小真\t275649\nxx公司\t275650\n新泰金金\t275651\n彻夜未眠\t275652\n彼特拉克\t275653\n24P]\t275654\n麦琪\t275655\n动向\t275656\nntvdm\t275657\n书图\t275658\n王晓丽\t275659\n老剧\t275660\n富斯\t275661
\n九十九朵\t275662\n合唱曲\t275663\n本部\t275664\n女裙\t275665\n出生男\t275666\n若森\t275667\n深圳婚纱摄影\t275668\n两个多月\t275669\n证婚词\t275670\nMOONBASA\t275671\n多篇\t275672\n磷酸钠\t275673\n90后吧_\t275674\n美光\t275675\n礼乐\t275676\n签售\t275677\n瑶医\t275678\n广西尼\t275679\n慢性咳嗽\t275680\n交易所\t275681\nFlashAir\t275682\n秦朝\t275683\n2票\t275684\n木纹漆\t275685\n同轴连接器\t275686\nPlus\t275687\nA59S\t275688\noled屏\t275689\n360路由器卫士\t275690\n王者荣耀雅典娜\t275691\n揪\t275692\n谢娜\t275693\n长春幼升小\t275694\n大连市人民政府\t275695\nEasyTouch\t275696\n厦门公司\t275697\n体言\t275698\niQ\t275699\n太虚\t275700\n中线段\t275701\n君越\t275702\n甘萍\t275703\n王铭铭\t275704\n高丽丽\t275705\n2016年5月7日\t275706\n保持冷静\t275707\n过激\t275708\n第十节\t275709\npixelmator\t275710\n证券分析师\t275711\n和县\t275712\nOrientDB\t275713\n73期\t275714\n械\t275715\nKodi\t275716\n训练场\t275717\n自贡恐龙博物馆\t275718\n净土大经解演义\t275719\n如龙0\t275720\n蒲神\t275721\nUILabel\t275722\n众不同\t275723\n统计数\t275724\n光驱弹\t275725\n焦彦龙\t275726\n支医\t275727\n三希堂\t275728\n计划生育服务证\t275729\n开拓市场\t275730\nSCV\t275731\n定罪\t275732\nBid\t275733\n人行征信中心\t275734\n金源大厦\t275735\n陈迪娅\t275736\n唐云\t275737\n美指\t275738\n第三遍\t275739\nNord\t275740\nsoli\t275741\n4声\t275742\n成都五路一桥\t275743\n信号放大器\t275744\n换头\t275745\n产季\t275746\nMockup\t275747\nqml\t275748\n岷江\t275749\n士林夜市\t275750\n分支机构\t275751\n蟹钳\t275752\n剥蚀\t275753\n鸿丰\t275754\n10123\t275755\nTL-WN821\t275756\n谛\t275757\n597人才网\t275758\n伍迪\t275759\n手持\t275760\n张宗昌\t275761\npakistan\t275762\n返佣\t275763\n伴肠化\t275764\n上元大街\t275765\nmooc\t275766\n半卵圆中心\t275767\n巴鲁\t275768\n科贸\t275769\n盘点日积月累\t275770\n刑诉法\t275771\n_牛\t275772\n北京旅游\t275773\n找字\t275774\nThought\t275775\nToto\t275776\nMIDI\t275777\n公职人员\t275778\nsunlight\t275779\nprocurement\t275780\n二手手机\t275781\n顶骨\t275782\n事业\t275783\n井身\t275784\nHero\t275785\n五皇\t275786\n莫塔\t275787\n600288\t275788\nMPLAB\t275789\n冏\t275790\n交通管理局\t275791\n怒发冲冠\t275792\nHRM\t275793\nNicola\t275794\n财经类\t275795\n测控技术\t275796\n检票\t275797\n南航新闻网\t275798\n美丽心灵\t275799\n绿化墙\t275800\n金树\t275801\n上海广告公司\t275802\n小细胞癌\t275803\n上海交易所\t275804\n转编\t275805\n单胞\t275806\n
500股\t275807\n宜黄县\t275808\n电脑录屏\t275809\n西班牙\t275810\n白沙洲\t275811\n再度\t275812\npd-1\t275813\nㄎ\t275814\n协商\t275815\n腾讯科技(深圳)有限公司\t275816\n盛晓玫\t275817\n李占国\t275818\n新集镇\t275819\nhuishou\t275820\n8000年\t275821\n侦查\t275822\n莫待\t275823\n杀戮天使\t275824\n罗帷\t275825\n尾标\t275826\n阁楼\t275827\n米力\t275828\n家企\t275829\n大粗\t275830\n法餐\t275831\n一些些\t275832\n腐烂\t275833\nST信通\t275834\n55所\t275835\n呷哺\t275836\n水果侠\t275837\n班组长\t275838\n告密者\t275839\nKyrie\t275840\n道教三清\t275841\n短卡\t275842\n第十九大\t275843\n1.4m\t275844\n验证码\t275845\n苏州地铁4号线\t275846\nconfigs\t275847\n日落\t275848\nRedmi\t275849\n点积\t275850\n河南都市频道\t275851\n龙管家\t275852\n3月12日\t275853\n佐治亚大学\t275854\n抱怨\t275855\n四年级语文上册\t275856\n东城新区\t275857\n张婉悠\t275858\n金奥力\t275859\n117&\t275860\n维京王者之战\t275861\n中海壳牌\t275862\n矿道\t275863\n恐怖鸡\t275864\n机器语言\t275865\n37_\t275866\n罗湖\t275867\n藓\t275868\n124集\t275869\n王婧\t275870\n孕吐\t275871\n600811\t275872\n日俄战争\t275873\n东方网络\t275874\n司康\t275875\n69月\t275876\n中邦\t275877\n没日没夜\t275878\npvc袋\t275879\n_315网\t275880\n中望cad2018\t275881\nM11\t275882\n不死不灭\t275883\nWorkstation虚拟机\t275884\n廊坊传媒网\t275885\n代购店\t275886\nwebfrom\t275887\nESTABLISHED\t275888\n长武县\t275889\n智圣\t275890\n费高云\t275891\n驻马店驿\t275892\n欠缴\t275893\n算上\t275894\nWeihanLi\t275895\n莫里森\t275896\nCVPR\t275897\n张紫妍\t275898\n国信证券\t275899\n手咪\t275900\n夏心旻\t275901\n三聚氰胺甲醛树脂\t275902\n前十天\t275903\n3石\t275904\nQQ厘米秀\t275905\n真不难\t275906\n列族\t275907\nYD\t275908\n市政府办\t275909\nc程\t275910\n花言巧语\t275911\n85号\t275912\nTouchID\t275913\n人兽\t275914\n2016/2017\t275915\n市纪委监察委\t275916\n大兴庞各庄\t275917\n练就\t275918\nquiet\t275919\n美泉宫\t275920\n赌博\t275921\n还清\t275922\n17节\t275923\n彩吧\t275924\n权钱\t275925\nroot文件夹\t275926\n李玫瑾\t275927\n石伟\t275928\n广东消防\t275929\n梦幻西游藏宝阁\t275930\nnets\t275931\nShemales\t275932\n万胜广场\t275933\nSCP基金会\t275934\n桥下\t275935\n聘期\t275936\nrssi\t275937\n经颅磁治疗仪\t275938\n11层\t275939\ni37100\t275940\n相救\t275941\n2650\t275942\nconfusion\t275943\n在飞\t275944\n维保\t275945\n贺雪峰\t275946\n变压器油\t275947\n血小板减少症\t275948\n王雅媛\t275949\n转录\t275950\n8页\t275951
\n河合\t275952\n厚田沙漠\t275953\n_皇室战争\t275954\n鞋型\t275955\npiechart\t275956\nPomelo\t275957\n宜宾职业技术学院\t275958\n上海上海上海移动\t275959\n多可爱\t275960\ndataTables\t275961\n战战兢兢\t275962\n汕头大学图书馆\t275963\nzmud\t275964\n猥亵罪\t275965\n武丁\t275966\n罗勇\t275967\n独生子女父母光荣证\t275968\n铺通\t275969\n副证\t275970\n毕尔巴鄂竞技\t275971\n易居中国\t275972\n证劵\t275973\n中国市\t275974\n玛钢\t275975\n七日杀\t275976\n鹏华基金\t275977\ninterfaces\t275978\n法爵\t275979\n体外循环\t275980\n≌\t275981\n黑涩会\t275982\n西安市国家税务局\t275983\n第36话\t275984\n松本清张\t275985\n海南澄迈政府\t275986\n程依芸\t275987\n嫡女网游之全球在线\t275988\n福田英才荟\t275989\n秃鹰\t275990\n西南石油大学\t275991\n览潮\t275992\n股王\t275993\n正财\t275994\n山东大学机械工程学院\t275995\nT460S\t275996\n乐美宅别墅网\t275997\n大乳\t275998\nlimi\t275999\n麻州\t276000\n一先\t276001\n王者荣耀钟馗\t276002\n阴生\t276003\n爱格\t276004\n软胶\t276005\nSTM32F429\t276006\n济南市住房保障和房产管理局\t276007\n馄饨皮\t276008\n车本\t276009\n索林\t276010\n煎饼机\t276011\n盛少\t276012\n王朝霞\t276013\n济南西站\t276014\n育儿嫂\t276015\npftrack\t276016\n博康智能\t276017\n子级\t276018\nlynch\t276019\n3838\t276020\n三角头\t276021\n组展\t276022\n58届\t276023\nferragamo\t276024\nCacheable\t276025\n校季\t276026\n1.011\t276027\n阿诚\t276028\n地震局\t276029\nPs4\t276030\n分布式Session\t276031\n环保税\t276032\n高和宽\t276033\n小池镇\t276034\n66位\t276035\n38.4\t276036\n延误率\t276037\n大朋\t276038\n武汉理工\t276039\n山东联通\t276040\n机酒\t276041\nmedic\t276042\n断腿\t276043\n封疆\t276044\ncovenant\t276045\n景程论坛\t276046\nDildo\t276047\n量尺\t276048\n人工晶状体\t276049\n隐私权\t276050\n摔角手\t276051\nurl传参\t276052\n根尖炎\t276053\n履约\t276054\n赞语\t276055\n阿布拉莫维奇\t276056\npollution\t276057\n天时\t276058\n新田\t276059\n长青藤\t276060\n中短\t276061\n黄金td\t276062\nAlexNet\t276063\n海豹六队\t276064\n2.87\t276065\nYQ601\t276066\n利康\t276067\n1棵\t276068\nannounce\t276069\n王小林\t276070\nsep\t276071\n淫荡\t276072\n上海水务\t276073\n决战安徒生童话\t276074\ntg\t276075\n中粮前滩海景壹号\t276076\nipad模拟器\t276077\n克米\t276078\n2017年11月15日\t276079\ngba吧_\t276080\n异响声\t276081\n拆穿\t276082\n郑乾\t276083\n货盘\t276084\nVIB\t276085\n美摔\t276086\n预读\t276087\n舞裙\t276088\n小教\t276089\n摊位费\t276090\n东莞政府\t276091\n特奥会\t276092\n现浇梁\t276093\n诸
强\t276094\n冰雹\t276095\n假使\t276096\n数贸\t276097\n里湖\t276098\negl\t276099\nPK\t276100\n惟愿\t276101\n犬舍\t276102\n郭岩\t276103\n入喉\t276104\n周倩\t276105\nniki\t276106\n效应量\t276107\nPinot\t276108\n钓法\t276109\n登门\t276110\nDHL国际快递\t276111\n黑岩网\t276112\n麦田\t276113\n博鳌\t276114\n西关大屋\t276115\n绑架篇\t276116\n桂丹路\t276117\n福源\t276118\n被辞\t276119\n划界\t276120\n评委会\t276121\n6316\t276122\n此贴\t276123\n扬子地板\t276124\n曼特\t276125\n狗眼看人\t276126\n陆判\t276127\n宜兴埠\t276128\n物质观\t276129\nToDoList\t276130\n旅行信用卡\t276131\nstri\t276132\n122号\t276133\n野蔷薇\t276134\nrotk\t276135\nSlam\t276136\n传记片\t276137\nsky\t276138\nunit8\t276139\n李书磊\t276140\n锦织圭\t276141\n帽子\t276142\ntoupai\t276143\n专业技术职务\t276144\nCHECK\t276145\nGoals\t276146\n政工科\t276147\n镍钴\t276148\nchartered\t276149\n显威\t276150\nAgora\t276151\n资格者\t276152\n后半\t276153\n万绮雯\t276154\n方济\t276155\n普利策新闻奖\t276156\nusaco\t276157\n第六篇\t276158\n掘地求升\t276159\nROW\t276160\n韩宝仪\t276161\n闪车宝\t276162\n博莱特\t276163\nTelemetry\t276164\n华彦钧\t276165\n鸿门宴\t276166\n再恋爱\t276167\n一村庄\t276168\n施工段\t276169\n2.00\t276170\n房产证|华律\t276171\n惠州人才网\t276172\n前海路\t276173\n获嘉\t276174\n市中区\t276175\n前童\t276176\n广东省第二人民医院\t276177\nborne\t276178\n商用房\t276179\n1516\t276180\n李居明\t276181\n少年游\t276182\nYOLO\t276183\n技术改造\t276184\n强爱\t276185\n罹难\t276186\n关长\t276187\n王名\t276188\n坐立不安\t276189\n金刚狼1\t276190\n墨香\t276191\n老墨\t276192\n″\t276193\n滨海万达广场\t276194\nSwap\t276195\n秦汉\t276196\n肺力咳合剂\t276197\nradiance\t276198\n姚丰\t276199\n安办\t276200\n出头之日\t276201\n除魔\t276202\n前叶\t276203\npanadol\t276204\n奇瑞瑞虎\t276205\n西藏自治区政府\t276206\nRIS\t276207\n女龙\t276208\n城乡规划局\t276209\npacopacomama\t276210\nshmmax\t276211\n清江浦区\t276212\n之所\t276213\n二连\t276214\nSATO\t276215\nmacdrive\t276216\n教师资\t276217\n帝国时代2:高清版\t276218\n无损压缩算法\t276219\nMATE8\t276220\n合拍片\t276221\n李未央\t276222\n铩羽而归\t276223\nwin10平板吧\t276224\n主婚\t276225\n师兄弟\t276226\n江明\t276227\n阿西莫夫\t276228\n鲍比达\t276229\n湖水\t276230\n十里铺村\t276231\n百醇\t276232\n深圳软件园\t276233\n酒井千波\t276234\n制剂\t276235\n唐宫海鲜舫\t276236\n联动\t276237\n焖子\t276238\nHashcat\t276239\nNa2SO4\t276
240\n暨南大学信息科学技术学院\t276241\n男帽\t276242\nAnthology\t276243\n莆炎高速\t276244\n没式\t276245\n上海欢乐谷\t276246\n捷信中国\t276247\nkey值\t276248\n调心\t276249\n天兵\t276250\njiaotong\t276251\n按时\t276252\nyuplay\t276253\n一九九八\t276254\n御龙在天吧\t276255\n美女生\t276256\n想死\t276257\n序列号\t276258\n万酒招商网\t276259\n国计民生\t276260\n煎药机\t276261\nstudent\t276262\ndys\t276263\nPGP\t276264\n0xc000000\t276265\npil\t276266\n00159\t276267\n社会保障\t276268\n镰刀\t276269\n局队\t276270\n布兰迪斯大学\t276271\nfft\t276272\n倾点\t276273\n劳务派遣公司\t276274\n收费公路\t276275\n人世界\t276276\n美人尖\t276277\n多保\t276278\npub\t276279\n夯先生\t276280\n化学原料药\t276281\n家用电表\t276282\n干燥\t276283\n阿瓦提县\t276284\n第93章\t276285\n磁性套索\t276286\n二三四\t276287\n桌花\t276288\n七言联\t276289\n腰片\t276290\n神化\t276291\n轻拍\t276292\n小楼\t276293\n可罗雀\t276294\n吕良\t276295\n覆冰\t276296\n奥斯卡奖\t276297\n马连良\t276298\n续意难忘\t276299\ne8600\t276300\nnca\t276301\n四川省住房和城乡建设厅\t276302\n金项链\t276303\nR32\t276304\n浙江教育在线\t276305\n中国银行北京分行\t276306\n中国通信企业协会\t276307\n苏州国际博览中心\t276308\n关封\t276309\nMint\t276310\n六月浩雪\t276311\n4000多\t276312\n奇思屋\t276313\n心比天高\t276314\n反射\t276315\n白电油\t276316\nNeumann\t276317\n避震\t276318\n防御力\t276319\nbehaviors\t276320\n3V3\t276321\nBone\t276322\n枫桥夜泊\t276323\n湖南商学院\t276324\n77号\t276325\n房地产招聘网\t276326\n法拉利拉法\t276327\nEats\t276328\nyy4823\t276329\n南京海关\t276330\n日菜\t276331\n河边\t276332\n光晕\t276333\n专注力\t276334\n亚奥\t276335\n19.5寸\t276336\n臂长\t276337\n指环王3\t276338\n塑料机\t276339\n17&\t276340\n体积配箍率\t276341\n伊丝\t276342\nOpen-Falcon\t276343\n丰南区\t276344\n顶行\t276345\n十里洋场\t276346\n文本编码\t276347\n政道\t276348\nAmoni\t276349\n指数型\t276350\nAR-2048N\t276351\n德源镇\t276352\nDC漫画\t276353\n顺差\t276354\n泊寓\t276355\n霍常亮\t276356\n一条烟\t276357\n奥利维亚\t276358\n王建英\t276359\n西西河\t276360\ncomn\t276361\n果感\t276362\n蒋介石\t276363\n日利率\t276364\n第142集\t276365\nUID\t276366\nexceeded\t276367\n打工族\t276368\n词语\t276369\nFw\t276370\nik分词器\t276371\n榆林米脂\t276372\n骨架\t276373\nBrandi\t276374\n2763\t276375\n小米电视4A\t276376\n首当其冲\t276377\nUITabBarItem\t276378\n外汇交易商\t276379\n报站器\t276380\n安贞焕\t276381\n3.4.0\t276382\n客群\t27638
3\n克罗恩\t276384\n养生膏\t276385\n分隔\t276386\n书灯\t276387\n龙纹鲤\t276388\n5.8毫米\t276389\n躲猫猫\t276390\n5.5.25\t276391\n剥夺\t276392\nCOOP\t276393\n直流偏置\t276394\nfinishing\t276395\n陈明华\t276396\n雷霆途观\t276397\nups\t276398\n陕西地区\t276399\n广州东华职业学院\t276400\n住村\t276401\n学林\t276402\n神经性耳鸣\t276403\n安全钳\t276404\n一元四次方程\t276405\n开心尝鲜\t276406\n一覧\t276407\nAdguard\t276408\n林洋能源\t276409\n钱盒\t276410\n10A\t276411\n定州市\t276412\n华硕商城\t276413\n正师\t276414\n双效\t276415\ndemand\t276416\n明晟\t276417\n江西省纪委省监委\t276418\n20170619\t276419\n04月20日\t276420\n困人\t276421\nMarmont\t276422\npoppy\t276423\n樱花林\t276424\nyacc\t276425\n隐瞒\t276426\n季安安\t276427\n奔驰车\t276428\n渡边淳一\t276429\n遗骸\t276430\nwaits\t276431\n4000套\t276432\nPOVOS\t276433\nCamaro\t276434\nRec\t276435\n53万\t276436\n英国银行\t276437\n33所\t276438\n立花瑠莉\t276439\n相数\t276440\nmathematics\t276441\nelie\t276442\n临床医学院\t276443\n摔跤\t276444\n通州区\t276445\n醉客\t276446\n9平方\t276447\n整合版\t276448\n失聪\t276449\n佘山\t276450\n梅园新村纪念馆\t276451\ntue\t276452\n孟村回族自治县\t276453\n捡到\t276454\n东方法眼\t276455\n3.79\t276456\n幸运吧起名网\t276457\n股票池\t276458\n钉宫\t276459\n易商通\t276460\n第101章\t276461\n白气\t276462\n医官\t276463\n南庄\t276464\n盐酸左西替利嗪\t276465\n凤凰家园\t276466\n起正\t276467\n好吃佬美食网\t276468\n夏斌\t276469\n秦洋\t276470\n饮水杯\t276471\n云游控股\t276472\n朗朗乾坤\t276473\nAV番号wan-107\t276474\n笛声\t276475\n重庆市万州区人民政府\t276476\n收费单\t276477\n舌下\t276478\n麻瓜\t276479\n班人马\t276480\n云南电视台\t276481\n地味\t276482\n沙溪镇\t276483\nimread\t276484\n平安乡\t276485\nhuai\t276486\n咕\t276487\n碧欧泉\t276488\n北京师范大学环境学院\t276489\ncos1\t276490\nCydia源\t276491\n10多天\t276492\n反对派\t276493\n钢塑管\t276494\n行贿案\t276495\n户口迁移证\t276496\n神华\t276497\n打字\t276498\n时不我待\t276499\n瞬联\t276500\n松岡\t276501\n上海化工研究院检测中心\t276502\n填房\t276503\n短面板\t276504\n闭塞性脉管炎\t276505\nCANMAKE\t276506\n刘莱斯\t276507\n瓶子君\t276508\n瓦砾\t276509\n垂体\t276510\n阳安\t276511\nglc\t276512\n百度五笔输入法\t276513\ncolleges\t276514\nCarrefour\t276515\n穿著\t276516\n宇瞻\t276517\ncefsharp\t276518\n生物谷\t276519\nNVL\t276520\n二十五味\t276521\n骑虎难下\t276522\n光纤\t276523\n华能集团\t276524\n146个\t276525\n警官证\t276526\n区政务
服务中心\t276527\n变卦\t276528\n合作医疗\t276529\n师范\t276530\n截距\t276531\n奥顿\t276532\n吉他谱C调弹唱\t276533\n肾穿\t276534\nwx-charts\t276535\n历趣\t276536\n4399\t276537\n大七孔\t276538\n光之子\t276539\nCOSTCO\t276540\n水果篮\t276541\nnautica\t276542\nlinefriends\t276543\nMPG\t276544\nTransform\t276545\n长条形\t276546\n锯缘\t276547\n大冲\t276548\n郭艾\t276549\n云南野生动物园\t276550\nlearners\t276551\n局里\t276552\n海圳\t276553\n方表\t276554\n1057\t276555\n联瑞\t276556\n东方明珠花园\t276557\nSDWebImage\t276558\n便道\t276559\norion\t276560\nlasso\t276561\n今今\t276562\n性别者\t276563\n奥林匹克花园\t276564\n一把抓\t276565\n朝台\t276566\nvcproj\t276567\n重返狼群\t276568\n70名\t276569\n生存权\t276570\nstern\t276571\napms\t276572\n动载荷\t276573\n水基\t276574\n家乐\t276575\n负\t276576\n表舅\t276577\n全国卷1\t276578\n焦柳铁路\t276579\n三国志13吧\t276580\n综合管廊\t276581\n美人窝\t276582\n袁纵\t276583\n合金板\t276584\n田园郡\t276585\nGMAT\t276586\nmameplus\t276587\n棉被\t276588\n内接圆\t276589\n小飞虫\t276590\n供货价\t276591\n高歌\t276592\n双眼皮吧\t276593\n欢乐城\t276594\nAutoCAD2019\t276595\n福州人\t276596\n金属\t276597\ntaglibs\t276598\n张也\t276599\n大众高尔夫\t276600\nel-dialog\t276601\nMTG\t276602\n蔡甸街\t276603\n挺进地牢\t276604\nrocker\t276605\n12月20日\t276606\n南沙新区\t276607\n乐艺leewiART\t276608\n分断\t276609\n菠萝汁\t276610\n|期\t276611\nan华罗庚\t276612\n花泽\t276613\n如虎添翼\t276614\nhubgit\t276615\n中万\t276616\n中国电影出版社\t276617\nhdl\t276618\n报道稿\t276619\nHelens\t276620\n籼米\t276621\n建筑工程学院\t276622\n中粮海景壹号\t276623\nyoulai\t276624\n潮汕网\t276625\n变形机器人\t276626\njsch\t276627\nHYUNDAI\t276628\n法兰克福大学\t276629\ncmw\t276630\n图法\t276631\n点检\t276632\n给与\t276633\n水岸新城\t276634\n2.0.so.0\t276635\n黄光裕\t276636\n工作资料\t276637\nPapyrus\t276638\nhometown\t276639\nCAM350\t276640\nSpringer\t276641\n必死\t276642\n一着\t276643\ntvm\t276644\n送春归\t276645\n隔离阀\t276646\n尖峰\t276647\nfc吧\t276648\n新材\t276649\n电地暖\t276650\n不负如来不负卿\t276651\n糖糕\t276652\nLabour\t276653\n审民\t276654\n蒂蒙斯\t276655\ncosts\t276656\n泰浴\t276657\n森严\t276658\n友谊医院\t276659\n我曾\t276660\nLOVER\t276661\n大舌\t276662\n台儿庄旅游网\t276663\n翁斐然\t276664\n你走\t276665\n中移动\t276666\n社区组织生活会\t276667\n方集\t276668\n晓暴\t27666
9\n柏林工业大学\t276670\n_图拉丁吧\t276671\nGb\t276672\n192.168.1.220\t276673\n烘炉\t276674\nreciprocal\t276675\n30mg\t276676\n平衡块\t276677\n百度文字\t276678\n贷借\t276679\n蒸鸡蛋\t276680\n意恐\t276681\n快乐生活\t276682\nabf\t276683\neasyconnect\t276684\n冬\t276685\n橙光游戏\t276686\n达累斯萨拉姆\t276687\n笑不笑\t276688\nTheONE\t276689\n瓦检员\t276690\n周骏\t276691\npolyester\t276692\n梦幻69吧\t276693\nhuaxue\t276694\n23名\t276695\n经典永流传\t276696\n北京图书馆\t276697\nsublim\t276698\n宝华\t276699\n湖北省公共资源交易中心\t276700\n丰田酷路泽4000\t276701\nMichael\t276702\n女人缘\t276703\nHG\t276704\n七十首\t276705\nAuctions\t276706\n第81号\t276707\n司令\t276708\n引风\t276709\n研究会议\t276710\n穿插\t276711\n消防工程师\t276712\n内蒙古日报\t276713\nwsdl接口\t276714\n客户部\t276715\n合肥学院\t276716\n津门飞鹰\t276717\n医学教育网\t276718\n中南地区\t276719\n几百台\t276720\nUnpretty\t276721\nRivera\t276722\n杭州市三医院\t276723\n三无\t276724\n无线投屏\t276725\nnavida\t276726\n说散\t276727\n欢乐喜剧人2\t276728\n杀菌灭藻剂\t276729\n湖南工程学院\t276730\n品值\t276731\nSwoole-Swoole文档中心\t276732\n叫停\t276733\n安美\t276734\n屠洪纲\t276735\n62.5%\t276736\navahi\t276737\n冷板凳\t276738\n自粘聚合物改性沥青防水卷材\t276739\n高能少年团2\t276740\n公益基金会\t276741\n47代理\t276742\n滨海大厦\t276743\n真相\t276744\n唐山港\t276745\n50句\t276746\n乡村爱情交响曲\t276747\n浓汤\t276748\n恶魔城暗影之王2\t276749\nSourcetree\t276750\n西安科技大学高新学院\t276751\n弩炮\t276752\n冰雪奇缘\t276753\n丽志\t276754\n南龙\t276755\nSSD存储技术\t276756\n嫣然天使基金\t276757\n第30集\t276758\n襄阳网\t276759\n永诚保险\t276760\n九年级化学\t276761\n薛颠\t276762\n椰果\t276763\n考试后\t276764\n两磅\t276765\n英雄的故事\t276766\n舜网\t276767\n盐城经济技术开发区\t276768\n启智惠\t276769\n接转\t276770\n全峰\t276771\n奉先\t276772\n命令\t276773\n师傅们\t276774\n泡酒\t276775\nDB34/T\t276776\n插盘\t276777\n人血白蛋白\t276778\n雅安市政府\t276779\n南新村\t276780\n社会工作师考试\t276781\n李蒽熙\t276782\nGMIC\t276783\n睡莲\t276784\n协防\t276785\n澄澈\t276786\n九十二\t276787\nh110m-k\t276788\n雍熙\t276789\n幼嫩\t276790\n120周年\t276791\n自由组合定律\t276792\n江苏大学图书馆\t276793\n进位\t276794\n银华日利\t276795\n总表\t276796\n期货市场\t276797\n斯旺西\t276798\n海鞘\t276799\n加压\t276800\nRAH\t276801\n开笔\t276802\n朱镕基讲话实录\t276803\n春华秋实\t276804\n秒级\t276805\n刹车油\t276806\n肠液\t276807\n龙鹰\t276808\nleshi\t
276809\n好妻子\t276810\nfo\t276811\n总网\t276812\n易淘金\t276813\n魏雪漫\t276814\n广元市质量技术监督局\t276815\n20160116\t276816\n香蒲丽\t276817\n玫瑰岛\t276818\n八角街道\t276819\n综治民调\t276820\n不要说话\t276821\n银粉\t276822\n两轨\t276823\nf4se\t276824\n全包\t276825\n25层\t276826\n红名村\t276827\n陈家山\t276828\n燕川\t276829\n胡明\t276830\n百色助学网\t276831\n灵投\t276832\n省府\t276833\ntaller\t276834\n柔软剂\t276835\n双飞五日游\t276836\n贝利\t276837\nEDACEO\t276838\n2023\t276839\n黑阵\t276840\n两批次\t276841\nMcKay\t276842\n苏溪\t276843\nem10\t276844\n酒具\t276845\n汉宁窗\t276846\n滑脱\t276847\n抚顺市望花区\t276848\nX7.4\t276849\n苏通\t276850\n上釉\t276851\n为着\t276852\n酷变宝\t276853\nリタ\t276854\n時刻表\t276855\nDiscord\t276856\n超硬材料网\t276857\n集运宝典\t276858\n济南电视台\t276859\n台安县\t276860\n热轧卷板\t276861\n10月中旬\t276862\n年径\t276863\n翼族\t276864\n↑\t276865\noccurrence\t276866\n理发器\t276867\n湖南海利\t276868\n表姐\t276869\n乱纶\t276870\n昆仑能源\t276871\n秀沿路\t276872\n很好听\t276873\n天亿\t276874\n300万个\t276875\n梵高\t276876\n电视剧片\t276877\n多乐士\t276878\n栖息地\t276879\nsan_yun\t276880\n李家诚\t276881\nC4D\t276882\n维权群\t276883\n世纪金源购物中心\t276884\n豫\t276885\n布瑞吉\t276886\nVIRT\t276887\n主官\t276888\n0096\t276889\n四川省体育局\t276890\ndx/\t276891\n壹品仓\t276892\nHp\t276893\n吃货节\t276894\n高氯酸铵\t276895\n库哈斯\t276896\ndownloaded\t276897\n四代\t276898\n网易暴雪\t276899\n汉青\t276900\n白膜\t276901\n空气管\t276902\nRongCloud\t276903\n上海市统计局\t276904\n贺军\t276905\n师院\t276906\ntriumph\t276907\n监审\t276908\n纪念白求恩\t276909\n汪苏泷\t276910\n尼莫地平片\t276911\n建设银行网上银行\t276912\n合肥奥体中心\t276913\n公举\t276914\ncc卡\t276915\nprl\t276916\n草绿色\t276917\n朝阳大悦城\t276918\nwinRAR\t276919\n6.3万\t276920\n中信资本\t276921\n海诺\t276922\n封场\t276923\nyaourt\t276924\nArtisans\t276925\n导航条|jquery\t276926\n角龙\t276927\n秋节\t276928\n三顾\t276929\nBOOTS\t276930\n超过10分钟\t276931\n去库存\t276932\nDeadpool\t276933\n记不记得\t276934\n金航\t276935\n6万元\t276936\n技工证\t276937\n新浪河南_新浪网\t276938\nspc\t276939\n防水袋\t276940\nconduct\t276941\ngdgp2.chinaxinge.com/shuju2/201711/20171151037173362\t276942\n压脚\t276943\n脱\t276944\n京东方科技集团\t276945\n翠屏区人民政府\t276946\n藤泽\t276947\n爱丽思\t276948\n温江\t276949\nplan\t276950\n德
里克\t276951\n大花猫\t276952\n水杨酸软膏\t276953\nasb\t276954\n邢岫烟\t276955\n叫号系统\t276956\n14版\t276957\ndongyang\t276958\n命运规划局\t276959\n康巴情\t276960\n奔跑\t276961\n生态系\t276962\n吃鸡模拟器\t276963\n消融\t276964\nv1.13\t276965\n燕路\t276966\n耳科\t276967\n高峰论\t276968\n借景\t276969\n千文\t276970\n许绍雄\t276971\nkindle3\t276972\n2016年12月6日\t276973\n解剖书\t276974\n中信信托\t276975\nDylan\t276976\ntut\t276977\nDESTOON\t276978\n马伊苏轼\t276979\n铜梁县\t276980\n贾小朵\t276981\n香肠\t276982\n陈分\t276983\n三块广告牌\t276984\n崂山信息网\t276985\n王治平\t276986\n互变\t276987\n足部\t276988\n资产负债表日\t276989\n青明\t276990\nArab\t276991\n智能眼镜\t276992\n公安机关\t276993\n云资讯_比特网\t276994\n杜德伟\t276995\nlrzsz\t276996\n九新公路\t276997\n青羊宫\t276998\n何敏杰\t276999\n薛平\t277000\n歌唱家\t277001\n底\t277002\n地格\t277003\nsafiri\t277004\n工伤保险待遇\t277005\n打桩机\t277006\n锅炉管\t277007\nlibssh\t277008\n24亿美元\t277009\n大过天\t277010\nClipper\t277011\n决策树分类算法\t277012\n兰生\t277013\n柚子\t277014\n减速机信息网\t277015\n爱因斯坦传\t277016\n蓝英\t277017\n戴国强\t277018\nCK2\t277019\n小论\t277020\n爱贝芙\t277021\n辅仁药业\t277022\n405号\t277023\npushing\t277024\n汇付天下\t277025\n搞笑gif\t277026\n耳屎\t277027\n落语\t277028\n金枫路\t277029\n国栋\t277030\n南京秦淮区政府网\t277031\n保温套\t277032\n虎山\t277033\n总府\t277034\n英威达\t277035\n述略\t277036\n笛子曲\t277037\n山门\t277038\n酷6网\t277039\n蒲公\t277040\nVSM\t277041\ntubular\t277042\nafm\t277043\n徐根宝\t277044\n时分\t277045\n中国建造师网\t277046\n郑炜\t277047\n海贼王剧场版\t277048\n爵士乐与不插电新编\t277049\n20大\t277050\n和善\t277051\n城西社区\t277052\n_努比亚\t277053\n注册计量师\t277054\n300公斤\t277055\nTalking\t277056\n司法权\t277057\n内里\t277058\n役态\t277059\n飞拉达\t277060\n滤饼\t277061\n甲状腺过氧化物酶\t277062\n摆盘\t277063\n每个月\t277064\n宁波保税区\t277065\ntuna\t277066\n荒野大镖客救赎\t277067\n芴\t277068\n物理化\t277069\n000.00\t277070\n大淘宝客\t277071\n瑞盛\t277072\ncmt\t277073\n40.0\t277074\n研究生院\t277075\nMnist\t277076\n100排名\t277077\n博物馆\t277078\n无泪\t277079\n大雕\t277080\n摇摆式\t277081\nPerrier\t277082\n新世纪大学英语综合教程\t277083\n标准器\t277084\n火锅炉\t277085\n潘石屹\t277086\n耳雅\t277087\n花桥镇\t277088\n玻璃纤维布\t277089\n摇车\t277090\npdf格\t277091\n探奇\t277092\n鲍师傅\t277093\n铁营\t277094\n从一到\t277095\n大娘水饺\t27
7096\nargo\t277097\n森海渡\t277098\n陈志明\t277099\npreparestatement\t277100\n黑龙江省人大常委会\t277101\nintegration\t277102\nPHPUnit\t277103\n五月天天津演唱会\t277104\neditext\t277105\n黄南\t277106\nAndroid-X86\t277107\n3D设计师\t277108\n吡咯\t277109\n灵山大佛\t277110\n值化\t277111\n典匠\t277112\nlightblue\t277113\n娄桥街道\t277114\n指示图\t277115\ntypeerror\t277116\n辽宁农业职业技术学院\t277117\n同仁\t277118\n圆木桩\t277119\n梨衣\t277120\n总部区\t277121\n97年\t277122\n大英赛\t277123\n20151104\t277124\nwin7旗舰\t277125\n拒之门外\t277126\n陈伦\t277127\n轩墨\t277128\n电视屏\t277129\n5005\t277130\nCompliance\t277131\n预处理\t277132\n黑白风\t277133\n麻省\t277134\nTay\t277135\n风云无双\t277136\n保利康桥\t277137\n野菜\t277138\n会饮篇\t277139\n交织\t277140\n图品\t277141\n手工艺\t277142\n九山西\t277143\n弹射器\t277144\n千佳\t277145\n175CM\t277146\nppt4\t277147\n老施\t277148\nepson投影仪\t277149\nevus\t277150\njpy\t277151\n2月26日\t277152\n华为荣耀10\t277153\n双对\t277154\n兴边\t277155\n流心\t277156\n谍影重重\t277157\n12.17\t277158\nps教程\t277159\nthrowable\t277160\n秦始皇兵马俑博物馆\t277161\nFTBFS\t277162\n145平米\t277163\n成反\t277164\n剑眉\t277165\nC13\t277166\n维C泡腾片\t277167\n2008-2013年\t277168\n资产证券\t277169\n饯行\t277170\n高圆\t277171\nretcode\t277172\n人事章\t277173\n区检察院\t277174\n病痛\t277175\njianyu\t277176\nWCF\t277177\n欧尔麦特\t277178\n3月16日\t277179\ncherish\t277180\n霉霉\t277181\n正股\t277182\n阿尤\t277183\n名郡\t277184\n85套\t277185\n从轻\t277186\n网易家居中心\t277187\nbestkiller\t277188\n西安西京医院\t277189\n乔治·索罗斯\t277190\n寿县人民政府\t277191\n札幌\t277192\n直梯\t277193\n川村麻耶\t277194\n省农委\t277195\n战斗力\t277196\npy2\t277197\n哥林\t277198\n注册信息系统\t277199\n区人力资源和社会保障局\t277200\n舞咲美娜\t277201\n永诚\t277202\n2018年3月14日\t277203\n超人:钢铁之躯\t277204\n中国科学院紫金山天文台\t277205\n八宝山\t277206\n南县人民政府\t277207\nCnetOS\t277208\n钟繇\t277209\n无声胜有声\t277210\n瘾者\t277211\n恩里克\t277212\n第三次\t277213\n36mm\t277214\n沙巴机场\t277215\n翻弹\t277216\n捰\t277217\n评报\t277218\n中国医科大学附属第四医院\t277219\n几成\t277220\nk550\t277221\n机械与动力工程学院\t277222\n落马\t277223\nSYSTEM\t277224\n猫咪大战争\t277225\nmybaitis\t277226\n48.2\t277227\n和平影都\t277228\n朝鲜蓟\t277229\n龙猫\t277230\nrab\t277231\n公明汽车站\t277232\n李卉\t277233\ngradients\t27
7234\n伊尔\t277235\n童稚\t277236\n摆摆\t277237\n高资\t277238\n第七大道\t277239\n书山\t277240\n柱基\t277241\n葡萄糖\t277242\n电子读书网\t277243\n黄金TD\t277244\nQRcode\t277245\n尚杰\t277246\n训练题\t277247\n水培吧\t277248\n双眼皮\t277249\n宁波市中级人民法院\t277250\n折多山\t277251\npmlc\t277252\nsexx\t277253\nOPUS\t277254\n18节\t277255\nHAL\t277256\n玛丽珍\t277257\nshiny\t277258\n美慧\t277259\n哈洛加斯\t277260\nSWAP\t277261\n2521\t277262\n半口\t277263\n卫岗\t277264\n瘦肉丸\t277265\nEsports\t277266\n狗王\t277267\n库兹涅佐夫号\t277268\n人教版三年级下册语文\t277269\n规化\t277270\n中国计量科学研究院\t277271\n上海招商银行\t277272\n海云台\t277273\n金桢勋\t277274\n说说我自己\t277275\n非流动负债\t277276\nX264\t277277\n首尔大学\t277278\n金茂府\t277279\n是第\t277280\n果静林\t277281\n陕西省工商行政管理局\t277282\n电信千兆光猫\t277283\n3.1.9\t277284\n四尺\t277285\n拳馆\t277286\n误删除\t277287\n结节性甲状腺肿\t277288\n飙血\t277289\n睡到\t277290\nSXS\t277291\n大峡谷\t277292\n赣水路\t277293\n兴庆宫\t277294\ni20\t277295\n灵蝶\t277296\nLocalDate\t277297\n山口诚\t277298\nstc89c51\t277299\nzonda\t277300\nシリ\t277301\n凡尔\t277302\n肾疼\t277303\n勒\t277304\n羊羔肉\t277305\n18511141315\t277306\n星霜\t277307\n除夕\t277308\n国电南自\t277309\n上海地产集团\t277310\nWDF\t277311\nwindos10\t277312\n麦当劳\t277313\n朱紫国\t277314\nclasses\t277315\n对比篇\t277316\nuse\t277317\n摆手\t277318\n止痛膏\t277319\n人民同泰\t277320\n逢年过节\t277321\n12514\t277322\n刷房\t277323\n恐怖版\t277324\n解釋\t277325\n智能投顾\t277326\nNOTE4X\t277327\n民调\t277328\n百度鹰眼\t277329\n腾讯微信\t277330\nhgst\t277331\n旭辉宝龙\t277332\n蔡阳\t277333\n西布朗\t277334\n相柳\t277335\n高仿鞋\t277336\n18962123606|0512\t277337\n贵池区\t277338\njes\t277339\n51新闻网\t277340\n泮\t277341\n亚洲在线\t277342\nDX12\t277343\n艾美仕\t277344\n马尔姆\t277345\n多么\t277346\n1-3个月\t277347\n人品\t277348\n手淘\t277349\n妆扮\t277350\n建融\t277351\n障碍赛\t277352\n武汉理工大学学报\t277353\n中奢网\t277354\n半光\t277355\n色值\t277356\n老片\t277357\nwest\t277358\n林纸\t277359\n名位\t277360\n中关村一小\t277361\nrtx\t277362\n怦\t277363\n海南海口\t277364\n名人传记\t277365\n其他\t277366\n开元大道\t277367\n三棋\t277368\n中界\t277369\nsparing\t277370\n重复测量方差分析\t277371\nsr5\t277372\n不停地\t277373\n西溪悦榕庄\t277374\nNano卡\t277375\n目连\t277376\n京东客\t277377\n先行其言\t277378\n攻破\t277379\n美成达
移民公司\t277380\n活菌\t277381\nZedd\t277382\n崔兰田\t277383\n伍德沃德\t277384\n揭阳日报\t277385\n百度云_高清\t277386\n月七\t277387\n七八九\t277388\nbrew\t277389\nosgearth\t277390\n电动三轮\t277391\n龙泉路\t277392\n政协主席\t277393\n清明花\t277394\n榜样\t277395\nivc\t277396\n百度脑图\t277397\nnull值\t277398\n变性淀粉\t277399\n叛逆性百万亚瑟\t277400\n十一位\t277401\n挖坑\t277402\n程砚秋\t277403\n建国大学\t277404\n天道贰\t277405\n大兴土木\t277406\n咒文\t277407\n16万吨\t277408\n叶形\t277409\n游园惊梦\t277410\n二手家具网\t277411\n心愿便利贴\t277412\n_沭阳政府网\t277413\nAnnette\t277414\n三百六十\t277415\n苏州牙\t277416\nOperate\t277417\n暄\t277418\ntelling\t277419\nundefine\t277420\n单元卷\t277421\nCDH5\t277422\nORA-00911\t277423\n于伟国\t277424\nAPIcloud\t277425\n土特\t277426\n腾讯视频转换\t277427\n42度\t277428\n楼奴\t277429\n海尔小厨宝\t277430\n独显\t277431\n吉林信托\t277432\n华润深国投信托有限公司\t277433\n最终幻想12黄道年代\t277434\nERC20\t277435\n主宠\t277436\n丑闻\t277437\n荒山\t277438\nminnie\t277439\n借我\t277440\n运往\t277441\n买新\t277442\n盐城晚报\t277443\n沈巷镇\t277444\n收购案\t277445\n光华口腔医院\t277446\nqq电脑管家\t277447\n榴花\t277448\n中国会展网\t277449\n署名\t277450\n期货业\t277451\nwin1o\t277452\n施乐辉\t277453\n无限魅力物联网\t277454\njapen\t277455\n魔窗\t277456\n台背\t277457\n周星星\t277458\n合浦县\t277459\n南通房产超市网\t277460\n65x9000e\t277461\n儒商\t277462\nDirt\t277463\ncmcl\t277464\nfollowers\t277465\n永春县\t277466\n20160316\t277467\n魏老师\t277468\n推动\t277469\n惶恐\t277470\n示范路\t277471\n航美传媒\t277472\n领先者\t277473\n光绘\t277474\n阑干\t277475\n开孔\t277476\n对光\t277477\n大杨\t277478\n初学者\t277479\n巨高\t277480\n房管所\t277481\n花语网\t277482\n荣成石岛\t277483\nPulse\t277484\n哈馆\t277485\nCOLLECTION\t277486\n拜尔\t277487\n最美情侣\t277488\n测硫仪\t277489\n海贼王路飞\t277490\n河南省住建厅\t277491\n注释符\t277492\n兰纳\t277493\n新沙岛\t277494\n绅宝d50\t277495\n威腾网\t277496\n打火机\t277497\nmraz\t277498\nAria2GUI\t277499\n翻翻\t277500\n150MB\t277501\n密技\t277502\n风云网\t277503\nvmware14\t277504\n九世\t277505\n妆品\t277506\n塞上\t277507\n新和县\t277508\nr410a\t277509\n乙脑疫苗\t277510\n广电网络\t277511\nctpn\t277512\n苗钟真\t277513\n环球金汇网\t277514\n蓝国\t277515\n志云饭局\t277516\n夜视镜\t277517\nLFS\t277518\nredvelvet\t277519\n长乐机场\t277520\n第147集\t277521\n米粉节\t277522\nwi
ndowsphone\t277523\n智跑重生之都市修仙\t277524\nmathcad\t277525\n风暴英\t277526\n黄荣\t277527\n四批\t277528\n档距\t277529\nPP卡\t277530\nDVDs\t277531\n藤壶\t277532\n上海有限公司\t277533\n陈国明\t277534\nsfc\t277535\n潘凤\t277536\n科赫\t277537\n领空\t277538\n安缦\t277539\n水淀粉\t277540\n徐安\t277541\n新股申购\t277542\n校医院\t277543\nmap函数\t277544\n蓝宝\t277545\ntrie\t277546\n牛仔舞\t277547\n数显卡尺\t277548\n再也没有\t277549\n六月中旬\t277550\n伦教碧桂园\t277551\n練習\t277552\n疱疹性咽颊炎\t277553\nProperty\t277554\n骁龙625\t277555\n丁二醇\t277556\n6月17日\t277557\nAcross\t277558\n通渭\t277559\nVVS\t277560\n肖冰\t277561\njas\t277562\n惊悚片\t277563\n依泉\t277564\n单\t277565\n乌鹭\t277566\n刹车蹄\t277567\n大步流星\t277568\nparpool\t277569\n应用宝\t277570\n600267\t277571\n世纪大道地铁站\t277572\nDDOS攻击\t277573\n耕者\t277574\nzenyang\t277575\nsolideworks\t277576\n马晓晨\t277577\n布姐\t277578\n加续\t277579\n百撕\t277580\n婚礼秀\t277581\nR-Studio\t277582\n曲江\t277583\n濠滨房产网\t277584\n滚动摩擦系数\t277585\nFever\t277586\n朱荣\t277587\n爱是你我\t277588\nlunux\t277589\n上海顺丰快递\t277590\n闭锁\t277591\n小拖车\t277592\n苯教\t277593\n渝贵铁路\t277594\n酶促反应\t277595\n北京机场\t277596\n上海文化广播影视集团有限公司\t277597\n盲井\t277598\n从戎\t277599\n追追\t277600\nctrl+\t277601\n随心卡\t277602\n融券\t277603\n2017年3月2日\t277604\nmB\t277605\n抗氧化酶\t277606\n陈雅漫\t277607\n栈顶\t277608\n丰韵\t277609\n瑞恩·高斯林\t277610\nJulius\t277611\n海曙区\t277612\n偏财\t277613\nHive\t277614\n太和\t277615\n雏子\t277616\n三a\t277617\n萝莉音\t277618\n二次门\t277619\n锦绣社区\t277620\n7Z\t277621\n55%\t277622\n深圳特区报数字报\t277623\n相似率\t277624\n2015.05\t277625\n辣府\t277626\n亲属间\t277627\n中文版\t277628\n正方型\t277629\n378号\t277630\n赣剧\t277631\n茅子俊\t277632\nRyona\t277633\n联弹\t277634\n海珠\t277635\n上海仪电\t277636\n步练师\t277637\nupdata\t277638\nROV\t277639\n羊屎\t277640\nmosquitto\t277641\n2016—2020年\t277642\n玛法里奥\t277643\n雏田\t277644\nexplorer.exe\t277645\ngackt\t277646\nPEQ\t277647\n仁寿\t277648\n伊朗\t277649\npong\t277650\nSIL\t277651\n80号\t277652\n马克思主义基本原理概论\t277653\n侠盗飞车罪恶都市中文版\t277654\nTerms\t277655\n乳饮料\t277656\n剑网三唐门\t277657\n腰侧\t277658\nProd\t277659\n引述\t277660\n千年梦圆在今朝\t277661\nDIY一个\t277662\n7幢\t277663\n3000个\t277664\n一种爱\t27766
5\n驾考网\t277666\n卡林\t277667\noppofind7\t277668\n无忧物流\t277669\nGridLayout\t277670\n地面波\t277671\n量长\t277672\n导率\t277673\nyum源_Linux教程_Linux公社\t277674\n程林依依\t277675\n拓邦\t277676\nE24\t277677\n红藕香残玉簟秋\t277678\n新中付\t277679\n绅宝x35\t277680\n山奈\t277681\n普田\t277682\n好名\t277683\n愉悦感\t277684\n上东\t277685\n杜若\t277686\n人精\t277687\n研究型\t277688\n优价\t277689\n滨江府\t277690\nShure\t277691\n电白\t277692\n段正淳\t277693\n海纳百川下载器\t277694\n为王_\t277695\n音变\t277696\n苏若云\t277697\nmetallica\t277698\n不哭\t277699\n刘雯\t277700\n爆宠\t277701\ns1000rr\t277702\n张祥安\t277703\n第三十七\t277704\n菌丝\t277705\n0xe5\t277706\n天津大港油田\t277707\n伊金霍洛之窗\t277708\n清同治\t277709\n风向标\t277710\n7120\t277711\n李长庚\t277712\nxcc\t277713\n学形\t277714\n奇奇怪\t277715\n鼻虫\t277716\n徐梵溪\t277717\n许吉如\t277718\n场控\t277719\n国债\t277720\n通用语言\t277721\n沉\t277722\nstaticmethod\t277723\n招录\t277724\np71\t277725\n催生\t277726\n172个\t277727\nBitComet\t277728\n吴滨\t277729\n淫虐\t277730\n无关\t277731\nnominal\t277732\n长安责任保险股份有限公司\t277733\nLx\t277734\n国酒茅台\t277735\n发光二极管\t277736\nmyway_liang\t277737\nafn\t277738\ntn=93603385_s_hao\t277739\n古体诗\t277740\n小腿静脉曲张\t277741\n撤场\t277742\nthrones\t277743\n富力华庭\t277744\n168个\t277745\n桃花宝典\t277746\n宋威龙\t277747\nゝ\t277748\n宝安日报数字报\t277749\n喜羊羊与灰太狼\t277750\n马鞍山市\t277751\n三岔路口\t277752\n三国志12pk\t277753\n白热\t277754\n弥敦城\t277755\n上海交通大学-电子信息与电气工程学院\t277756\n庆功会\t277757\n08CMS\t277758\n易企秀H5\t277759\n此情此景\t277760\n蔡建军\t277761\n诞生地\t277762\n建材\t277763\n34_\t277764\n根深\t277765\n乡镇长\t277766\n美人鱼\t277767\n路鸣泽\t277768\n相处\t277769\n遂溪\t277770\n满楼\t277771\nvnet\t277772\nv9.4\t277773\n石门峰\t277774\n烩菜\t277775\nscrollable\t277776\n淋巴结肿大\t277777\n米4\t277778\nMPEG2\t277779\n分录\t277780\n78挂靠网\t277781\n鬼牌\t277782\n金大川\t277783\n舅妈\t277784\n湘泉\t277785\n建业城\t277786\n济南银行\t277787\n真空机\t277788\n静脉血栓栓塞症\t277789\n第18036期\t277790\n6300V2\t277791\n智育\t277792\n辉月\t277793\nf550\t277794\n吴稼祥\t277795\n无限流量卡\t277796\ncoml\t277797\n锰矿\t277798\n天壤\t277799\n生活启示录\t277800\n郁可唯\t277801\nVerde\t277802\n创宇云\t277803\n保利\t277804\ncss-loader\t277805\nsanya\t277806\n众美集团\t27
7807\n神雕侠侣\t277808\n免烧砖\t277809\n宁波方特\t277810\n静安庄\t277811\n宠爱于一身\t277812\n天和防务\t277813\n评估价\t277814\n五枚\t277815\ndoublelift\t277816\n日月山\t277817\n卧牛山\t277818\nGolden\t277819\noocl\t277820\n朴朴\t277821\nBD国英双语\t277822\n天猫苏宁\t277823\n第十二天\t277824\n[度盘\t277825\nbrazilian\t277826\n桩\t277827\n格瑞特\t277828\n正压式呼吸器\t277829\n200行\t277830\n百老泉\t277831\n北京语言大学出版社\t277832\n小柔\t277833\nDetected\t277834\n可视化工具\t277835\n丧葬费抚恤金\t277836\n座子\t277837\n电子件\t277838\n丰立装饰\t277839\ngo2\t277840\n清汤火锅\t277841\n绝收\t277842\n宅家\t277843\nbox-flex\t277844\n糖蒜\t277845\n血浆\t277846\n第五任\t277847\nセ\t277848\n天津市国家税务局\t277849\n卧佛\t277850\nレズビアン\t277851\n淘铺王\t277852\n295\t277853\n华尔顿\t277854\n斯科\t277855\n170\t277856\n范丞罗永浩\t277857\n临漳县\t277858\n美丽的谎言\t277859\n重力感应器\t277860\n出現\t277861\n发动机制动\t277862\n葡萄架\t277863\n来安县政府\t277864\n电游\t277865\n妇产\t277866\n暗黑3\t277867\n第4098章\t277868\n转基因\t277869\n大肠菌\t277870\nFibonacci\t277871\n多份\t277872\n清宫手术\t277873\nphpexcel\t277874\nOutlook\t277875\nioi\t277876\n丽色\t277877\n速值\t277878\n中华V5\t277879\n土窑鸡\t277880\n金门路\t277881\n桃花劫\t277882\n特困户\t277883\n良乡\t277884\n再来一发\t277885\n车水\t277886\n黄粱梦\t277887\n女同\t277888\nSubtitles\t277889\n进进出出\t277890\n意大利语\t277891\n自我鉴定\t277892\n张尧\t277893\n获益匪浅\t277894\nIPHONE8\t277895\nsequenc\t277896\n郊区\t277897\nDDR3\t277898\n小黎\t277899\n阿刚\t277900\n浣熊\t277901\n山度士\t277902\n苏州工商局\t277903\n誓夺\t277904\n瓶装水\t277905\n弑母\t277906\n门面房\t277907\n仙剑奇侠传2\t277908\n万平口\t277909\n几更\t277910\n波多野统衣\t277911\n陈忠实\t277912\n艾威\t277913\n菲利普斯曲线\t277914\nfading\t277915\n丽泽金融商务区\t277916\n德馨园\t277917\n火烧赤壁\t277918\n订单\t277919\nqq头像大全\t277920\n苏酒集团\t277921\n万古至尊\t277922\nequilibrium\t277923\ngodiva\t277924\nopop\t277925\n影装\t277926\n兼及\t277927\n方力钧\t277928\nBlack_Knight\t277929\n九州风神\t277930\nT7000\t277931\n100s\t277932\n袁方\t277933\n肺源性\t277934\n够爱\t277935\nxiatian\t277936\n广州市技师学院\t277937\nJawbone\t277938\n触网\t277939\n语法\t277940\nHILFIGER\t277941\ni7-6700k\t277942\n打屁\t277943\n生化药\t277944\n自然公园\t277945\n妻色\t277946\n贺兰山\t277947\npkcs7\t277948\n泓钰学校\t277949\n国新能源
\t277950\n乱敏\t277951\n打权\t277952\n宣威市\t277953\n第一型\t277954\n酸枣仁汤\t277955\n英格兰队\t277956\n分类学\t277957\n驻马店市政府\t277958\n地震动峰值加速度\t277959\nIHDT映速\t277960\n王琳\t277961\n保险岛保险网\t277962\n促发展\t277963\n白勺\t277964\n穷鬼\t277965\njolie\t277966\n3905\t277967\n潜修\t277968\nracun\t277969\nopatch\t277970\n证码\t277971\nNE\t277972\nAutoDesk\t277973\n秦淮茹\t277974\n新塘街道\t277975\n丽丽\t277976\n癞蛤蟆草\t277977\n中央第一巡视组\t277978\n千佛\t277979\n暴战\t277980\n$refs\t277981\n7#\t277982\n70公分\t277983\n0320\t277984\n130亿美元\t277985\n异氰酸酯\t277986\nil2\t277987\n浆糊\t277988\n08号\t277989\n博学路\t277990\n暴风雪\t277991\nkaws\t277992\n奥莱\t277993\n酒城\t277994\n企业主\t277995\n染布\t277996\n百日咳\t277997\n蒸蒸日上\t277998\njuste\t277999\n杰西卡阿尔芭\t278000\nnormal\t278001\nShi\t278002\n20140314\t278003\n蓝版\t278004\n苏澳\t278005\n批贷函\t278006\nwim格式\t278007\n3DSMax\t278008\nabbreviation\t278009\n卫衣\t278010\n老狼\t278011\n农业路\t278012\nmodelattribute\t278013\n灯下\t278014\n南方网\t278015\nGUCCI\t278016\n探研\t278017\n抵押品\t278018\n叶城县人民政府\t278019\n法讯\t278020\n哥廷根大学\t278021\n黄金管\t278022\narla\t278023\n变态杀手\t278024\n沙湾县\t278025\n黑风山\t278026\n百饰\t278027\n3d游戏\t278028\n二十五项\t278029\n半偏法\t278030\n丝接头\t278031\n04.13\t278032\n吴圩机场\t278033\n淬火\t278034\n机图\t278035\n4a公司\t278036\niFashion\t278037\n智能家居展\t278038\n秦歌\t278039\n霍尔果斯注册公司\t278040\nFREEDOM\t278041\n恒锋\t278042\n霍夫变换\t278043\n欲望之屋2\t278044\n竟然\t278045\nTIMKEN\t278046\n海纳百川\t278047\n南部山区\t278048\n码云Gitee\t278049\n焊板\t278050\n光敏传感器\t278051\naic\t278052\n新趣\t278053\n人群\t278054\n狂兵\t278055\n射流器\t278056\n羁旅\t278057\n新人教部\t278058\n有笔\t278059\n辛克莱\t278060\n填进\t278061\nCouldn\t278062\n14k金\t278063\nCoreAnimation\t278064\n广州幼儿园\t278065\n奶瓣\t278066\n章明\t278067\n教练们\t278068\n羽\t278069\n1053\t278070\n天普大学\t278071\n24亿元\t278072\n车面\t278073\n喜人\t278074\n鸿瑞\t278075\n无间风云\t278076\n我真的受伤了\t278077\n高河镇\t278078\ninmobi\t278079\n抹零\t278080\n潦倒\t278081\n灵芝茶\t278082\n胆小慎\t278083\n勾稽\t278084\ncolder\t278085\n文殊\t278086\n八起\t278087\n馥蕾\t278088\n自备\t278089\n滁州市\t278090\n路人皆知\t278091\n都市爱情\t278092\nlegs\t278093\n4.5吨\t278094\n13天\t
278095\n九江石化\t278096\ntwich\t278097\nshfq\t278098\n脚脖\t278099\n基底节区\t278100\n配租\t278101\n丽江市\t278102\n程晨\t278103\n太行山大峡谷\t278104\n徐州市公安局\t278105\n百度视频播放器\t278106\n东莞人民医院\t278107\n后页\t278108\n姓名学\t278109\n诫勉谈话\t278110\n搜狗拼音\t278111\n杆路\t278112\nch\t278113\n招商銀行\t278114\n论语十二章\t278115\n主刀\t278116\n万马股份\t278117\n低蛋白\t278118\n酷睿处理器\t278119\n逸致论坛_汽车之家论坛\t278120\n经济学家\t278121\n1975\t278122\n八卦村\t278123\nTTF\t278124\nカバ\t278125\n飾\t278126\n吡蚜酮\t278127\n100回\t278128\nmysqlbinlog\t278129\n一念明末边军一小兵\t278130\nifo\t278131\n前置机\t278132\nChu\t278133\n涨姿势\t278134\nilinux_one\t278135\n达能\t278136\n重庆大学土木工程学院\t278137\n黄连木\t278138\n曜日\t278139\n健在\t278140\n西汉时期\t278141\n弈剑\t278142\n美早\t278143\n中国电镀网\t278144\n市纪委市监委\t278145\nruyi\t278146\n陈希孺\t278147\n病名\t278148\n贵阳中医学院\t278149\n洗澡时\t278150\n太可爱\t278151\nnex\t278152\n他来了,请闭眼\t278153\n铁法\t278154\n大狙\t278155\n菩提本无树\t278156\n小幸\t278157\n品字\t278158\n塞娜\t278159\n中南大学学报\t278160\n标明\t278161\n极限词\t278162\n左麟右李\t278163\n2018年04月08日\t278164\nFIELD\t278165\n九阴手游\t278166\n偿命\t278167\nDish\t278168\nUdacity\t278169\n长沙市人民政府办公厅\t278170\nwindows任务管理器\t278171\nbaofeng\t278172\n并立\t278173\n行之有效\t278174\n三千克\t278175\n中百仓储\t278176\n1.2元\t278177\n雷姓\t278178\n临危不乱\t278179\n行选中\t278180\n赔案\t278181\n社情\t278182\n片台\t278183\n玉砂\t278184\nerun\t278185\n仙剑五前传\t278186\nYii2.0\t278187\n1.53\t278188\n发福\t278189\n20ah\t278190\n流脓\t278191\n出售\t278192\n裔\t278193\n388元\t278194\nmotivated\t278195\n王中平\t278196\n广州仲裁委员会\t278197\n状态\t278198\n蚌埠\t278199\n锁\t278200\n亚马逊跨境\t278201\njass\t278202\nRelevant\t278203\n盖德\t278204\nLocalDB\t278205\n600W\t278206\n检修\t278207\nv5.02\t278208\n数模转换器\t278209\n小伊伊\t278210\n李济仁\t278211\n网址导航站\t278212\nFong\t278213\n林言\t278214\n用好\t278215\n数英网\t278216\n石河子大学\t278217\n郭子睿\t278218\n雷惊鸿\t278219\n灭世者\t278220\n淘客助手\t278221\n春申\t278222\n鄂尔多斯市政府\t278223\n赵虹\t278224\n中国建筑公司\t278225\n空间域名\t278226\n东四北大街\t278227\n叨位\t278228\n洗车券\t278229\nESC服务器\t278230\nspritekit\t278231\n532\t278232\n维亚\t278233\n白日梦\t278234\n小窑湾\t278235\n臧其超\t278236\ntppa\t278237\n克莉丝汀\
t278238\n一艘\t278239\nXBOX\t278240\n非此即彼\t278241\nNewBlue\t278242\n诺亚\t278243\n20多种\t278244\n雪中无限道武者路\t278245\n别了\t278246\n阿尔贝·加缪\t278247\n薛姨妈\t278248\nexcel行\t278249\n晚钟\t278250\n心理学\t278251\n碑林区政府网\t278252\n宫角妊娠\t278253\n烟台考试网\t278254\n范尼\t278255\nBase64\t278256\n菲莉丝\t278257\n洋湖湿地公园\t278258\n控告书\t278259\n德源\t278260\n马兜铃酸\t278261\n页笔\t278262\n半岛影院\t278263\n聊斋志异\t278264\n宝安南路\t278265\n锅端\t278266\n10.3.4\t278267\n贾梅\t278268\n冷枭\t278269\nBML\t278270\n_联发文库\t278271\n万代MG\t278272\nS16\t278273\nIT培训网\t278274\n东明石化\t278275\n二专\t278276\nNankai\t278277\n药剂科\t278278\n杜拉维特\t278279\n张千帆\t278280\n崇武\t278281\n北京养生堂\t278282\n卖货\t278283\nVincent\t278284\nGIF格式\t278285\n废后\t278286\n聚甘油\t278287\n灼灼\t278288\n宫颈锥切\t278289\n啫\t278290\n撩拨\t278291\n京东宙斯\t278292\nNovus\t278293\n上海华东政法大学\t278294\n证字\t278295\nbarber\t278296\n145元\t278297\n东太湖\t278298\n预约管理系统\t278299\n刘小峰\t278300\n一无所获\t278301\n电子银行_电子银行\t278302\nreferrer\t278303\n省下\t278304\nmtvavi\t278305\n江北城\t278306\n霜华\t278307\nsus\t278308\nBOOM\t278309\ntomcat\t278310\n以撒的结合:胎衣\t278311\n搜一搜\t278312\n迭出\t278313\n证明样本\t278314\n日月当空\t278315\n化学改性\t278316\napprtc\t278317\ndisabled样式\t278318\n青龙街道\t278319\n方旭\t278320\n魔攻\t278321\n深坑\t278322\n临时居住证\t278323\n金色版\t278324\n河源晚报\t278325\n中部\t278326\n8009\t278327\n托特纳姆热刺\t278328\n空调器\t278329\n李嘉诚\t278330\nOmni\t278331\n很多很多\t278332\n暴露\t278333\n上溢\t278334\n键身\t278335\n企业级\t278336\n开膛手杰克\t278337\n空虚感\t278338\n22节\t278339\n高等教育署\t278340\nwolfy\t278341\n末代皇帝传奇\t278342\n冰淇\t278343\n索莱宝\t278344\n反记账\t278345\nremediation\t278346\n集团版\t278347\n伸展式\t278348\n大航海探险物语\t278349\n中国奶业协会\t278350\n蛏\t278351\nSolarized\t278352\n渤海湾\t278353\n利安达\t278354\n二十三\t278355\n西宁特钢\t278356\n姜王\t278357\n难耐\t278358\nPlanetary\t278359\n祛痰\t278360\n寒衣节\t278361\nzhw\t278362\n八合\t278363\n皇宋通宝\t278364\n贝子鸟\t278365\n曲辕犁\t278366\n摸吧\t278367\n克苏恩\t278368\n纳吉布\t278369\n重庆国金中心\t278370\n群名_祥安阁风水网\t278371\npalladium\t278372\n暖宝宝\t278373\n私有制\t278374\n小米小爱\t278375\nHate\t278376\n嘉靖帝\t278377\n萤火虫之墓\t278378\n符印\t278379\nfifa18吧_\t278380\n生物系
\t278381\n用途\t278382\n青岛市图书馆\t278383\n杨俊杰\t278384\n9991\t278385\n青蜂侠\t278386\nip138\t278387\nIP地址查询__67查询网\t278388\n猫友\t278389\n楼堂馆\t278390\n龙的天\t278391\n陈经纶中学分校\t278392\n柬埔\t278393\n血舞\t278394\n65个\t278395\nKAB\t278396\n孤风独语\t278397\n薛洪言\t278398\n精神世界\t278399\n乱伦\t278400\n婚活\t278401\n老贾\t278402\n捷途汽车\t278403\n一枚\t278404\n袒露\t278405\n续办\t278406\n区情\t278407\n小规模纳税人增值税申报表\t278408\n群众路\t278409\n颔\t278410\n获加\t278411\nsues\t278412\n北关街道\t278413\n不视为\t278414\n伊甸园之东\t278415\n操作杆\t278416\nRouter\t278417\n存款准备金率\t278418\n二环南路\t278419\n渗水砖\t278420\n我的耳朵\t278421\n予思\t278422\n滕雨佳\t278423\n开溜\t278424\n口套\t278425\n欧帝\t278426\n思明\t278427\n海贼王顶上战争\t278428\n博伦\t278429\n标准间\t278430\n南方网新闻中心\t278431\n梅葆玖\t278432\n监督局\t278433\n邵先生\t278434\n大红鹰\t278435\n虫族\t278436\n共谋者\t278437\n里侧\t278438\n网监\t278439\n霸道\t278440\nDOT\t278441\n杂牌机\t278442\n107秒\t278443\npdw\t278444\n猎狼\t278445\n煎饼卷大葱\t278446\n多重共线性检验\t278447\n柠檬味\t278448\n傅家俊\t278449\nv2.8.2\t278450\n临沂北站\t278451\nCheckedListBox\t278452\nPICO\t278453\n银泰百货\t278454\n噜\t278455\nue4\t278456\n上海沪工\t278457\n打打杀杀\t278458\n盛力\t278459\n沙漠之花\t278460\n龙球\t278461\n15篇\t278462\n奇宝贝\t278463\n蒲草\t278464\nIQ题\t278465\n为什么不\t278466\n品牌网\t278467\nrevisions\t278468\n水精\t278469\n二次初恋\t278470\n长沙新闻网\t278471\n国耻\t278472\n门采尔\t278473\n山畔\t278474\n巩汉林\t278475\n娇喘声\t278476\n信阳新闻网\t278477\n换机助手\t278478\nnine\t278479\n统治者\t278480\n学干\t278481\n燕子河大峡谷\t278482\n本田CR-V论坛\t278483\ncosh\t278484\nmatt\t278485\nyyk\t278486\nKCON\t278487\n玻璃盖\t278488\n美照\t278489\n炮术\t278490\n标准值\t278491\n16周岁\t278492\n野比大雄\t278493\n邯郸地区\t278494\n主日\t278495\ninterfere\t278496\nUSB2.0接口\t278497\n另一个女人\t278498\n名膜\t278499\n停\t278500\n19寸\t278501\n28.0\t278502\n常规体\t278503\n2000亿\t278504\n欢呼\t278505\n塞尔达传说荒野之息\t278506\n查价\t278507\nmorgain\t278508\n博瑞传播\t278509\n内科\t278510\n_翼\t278511\nPowerUp\t278512\n鬼白\t278513\n包饺子\t278514\nCant\t278515\nAnnounces\t278516\n用地规划许可证\t278517\n再婚\t278518\n碧江\t278519\n黑金沙\t278520\n双叒\t278521\n讥笑\t278522\n取得\t278523\n熊顿\t278524\n新安中学\t278525\n天津农学院\t278526\n神舟笔\
t278527\n机屏\t278528\n修士\t278529\n七战\t278530\n民盟致公党\t278531\nCWT\t278532\n徐建成\t278533\nforces\t278534\n齐越\t278535\n7820HK\t278536\nb3\t278537\n持家\t278538\n利洁\t278539\n王文\t278540\n仓桥直街\t278541\n2015-2030年\t278542\n晋江文学城|晋江原创网\t278543\n散装酒\t278544\n2218\t278545\nshengwu\t278546\n易容网\t278547\n碟\t278548\n冷妻\t278549\n嘉媚乐\t278550\n灵玉\t278551\nwf1000x\t278552\noldgranny\t278553\n路特斯\t278554\n10万起\t278555\n蓝筹股\t278556\n繁殖箱\t278557\n木炭机\t278558\nsum37\t278559\n瑞翔\t278560\n近8年\t278561\n80英寸\t278562\n阴处\t278563\nARPG\t278564\n万峰湖\t278565\n2.3.8\t278566\n锁码\t278567\n武道\t278568\n第&#160\t278569\n金嗓子喉宝\t278570\n7.21\t278571\n升級\t278572\n4二\t278573\nPostcode\t278574\n吉克隽逸\t278575\n正态\t278576\n龙井村\t278577\n花白\t278578\n7460\t278579\n松冈千菜\t278580\n非常关系\t278581\n闲人马\t278582\n浅尝辄止\t278583\nslg\t278584\n部长\t278585\nOSGi\t278586\n布拉德皮特\t278587\n吴汶芳\t278588\nawstats\t278589\n瞻仰\t278590\n字符串函数库\t278591\n代建\t278592\n陈清晨\t278593\n妻妹\t278594\n一星期\t278595\n技防\t278596\n金城花园\t278597\n2018年1月30日\t278598\n亲贤\t278599\nhmd\t278600\ngoodsmile\t278601\n2周岁\t278602\n披星\t278603\n囊肿型\t278604\n闻讯\t278605\nTXT小说阅读器\t278606\n百度云2018\t278607\n原罪\t278608\n菲利普岛\t278609\n本港\t278610\n一千遍\t278611\n重庆物流公司\t278612\n出水口\t278613\n真田\t278614\nkafa\t278615\n智联币\t278616\n五克\t278617\nRAC\t278618\n︱\t278619\n陈璐\t278620\n反封建\t278621\n3页\t278622\n哲学学院\t278623\n19个\t278624\n伦敦塔\t278625\n植保机\t278626\n尚存\t278627\n二七广场\t278628\n东方家园\t278629\n4.7寸\t278630\n斗罗大陆龙王传说\t278631\n夏言冰\t278632\n大白鼠\t278633\n量子物理\t278634\nPoor\t278635\n毕业生就业信息网\t278636\n水龙头\t278637\nwindow7系统\t278638\n拨弄\t278639\n大象公会\t278640\n天翼云盘\t278641\n电影院\t278642\n君马汽车\t278643\n150000元\t278644\nRSD\t278645\nwww.7723.cn\t278646\n毒藤女\t278647\n王涵\t278648\ncorrelated\t278649\n毛躁\t278650\n牛气冲天\t278651\n苹果i派党\t278652\n徐明\t278653\n臭美\t278654\n315吧_\t278655\n德邦物流公司\t278656\n鹿血片\t278657\nstokke\t278658\n响水涧\t278659\n捕鼠\t278660\n折服\t278661\n南水北\t278662\n名庭\t278663\n对讲机\t278664\n腺相关病毒\t278665\nmerlin\t278666\n勇士队\t278667\n白马镇\t278668\n中海地产集团有限公司\t278669\n单机三国志2\t278670\n捷克\t278
671\nBrowsers\t278672\n最长的电影\t278673\n杨旋\t278674\n座牌\t278675\n白屏\t278676\n甘汞\t278677\n当贝助手\t278678\nMUSIC\t278679\n欧尚超市\t278680\n雷鹏\t278681\n食梦者\t278682\nMurray\t278683\nzzj\t278684\ndich\t278685\n超高跟鞋\t278686\n宁化县\t278687\n钱小豪\t278688\n麻辣鱼\t278689\n乍看\t278690\n建造类\t278691\n江寒\t278692\n复叶\t278693\n党的十九大报告辅导读本\t278694\n2018036期\t278695\nRON\t278696\n龙山湖\t278697\n公安部出入境管理局\t278698\n圆梦\t278699\n苏陌\t278700\n李莎莎\t278701\n战舰世界_17173战舰世界\t278702\nrecording\t278703\n4月22号\t278704\n1351\t278705\n海天一色\t278706\nPC机\t278707\n四磨汤\t278708\nTimer定时器\t278709\n防弹衣\t278710\n邦顿\t278711\n独自\t278712\n中田\t278713\n体精\t278714\n专车\t278715\n烟熏腊肉\t278716\n储蓄卡_\t278717\n玲央\t278718\n青马工程\t278719\n真空镀膜机\t278720\n刘珊\t278721\n区片\t278722\n空调清洗剂\t278723\nbind\t278724\n二三四五\t278725\n骆马湖\t278726\n山子\t278727\nWorkbench\t278728\nweexpack\t278729\n岳翎\t278730\n屁股子\t278731\n李梓\t278732\nregexp\t278733\n巴林银行\t278734\n2016—2025年\t278735\n钓鱼吧\t278736\n第二方\t278737\ndestruction\t278738\n登进\t278739\n山水诗\t278740\n小越镇\t278741\n三里花城\t278742\n综影视\t278743\n超星数字图书馆\t278744\n黄疸肝炎\t278745\nQListView\t278746\n下学期\t278747\n房产抵押贷款公司\t278748\n哥德杯\t278749\n武林外传_多玩游戏主题站_wulin2\t278750\n万祥镇\t278751\nTRPG\t278752\n弱电图\t278753\nMirantis\t278754\n圣武枪魂\t278755\n弥桑黛\t278756\n宜宾新闻网\t278757\n苏州市\t278758\n期货分析师\t278759\n中国移动上海公司\t278760\n无工\t278761\n沿街\t278762\n5月1日起\t278763\n胆友\t278764\n唐灵\t278765\n桃花姬\t278766\n连州\t278767\n梅因\t278768\n广安市人民政府\t278769\n2.0\t278770\n下渡路\t278771\nq点\t278772\n分散度\t278773\n头孢消炎药\t278774\n肖克\t278775\nTableau\t278776\n50周岁\t278777\n换届选举\t278778\n哈尔滨松北区\t278779\n凯里欧文\t278780\n东澳\t278781\n景芝酒\t278782\n狮子男\t278783\n六头\t278784\n取材\t278785\n内街\t278786\n深入浅出ES6\t278787\n16w\t278788\n下一秒\t278789\n山西省发展和改革委员会\t278790\n古典经济学\t278791\nopenoffice\t278792\n卢海鹏\t278793\n四月五号\t278794\ngradient\t278795\n布加迪\t278796\nvcloud\t278797\n次长\t278798\nWi-Fi\t278799\nkoudai\t278800\nvill\t278801\n侠盗一号\t278802\n常识\t278803\n白璐\t278804\n专班\t278805\n红旗街道\t278806\n薬品\t278807\n考马斯亮蓝\t278808\n凹凸有致\t278809\n名头\t278810\nCalgary\t278811\nric
oh\t278812\n中国少儿艺教\t278813\n提早\t278814\n檀香木\t278815\n文汇\t278816\n孙尚香\t278817\n喜迁莺\t278818\n社交型\t278819\n生物电\t278820\n4美元\t278821\n盐亭\t278822\n领兵\t278823\nopenal32.dll\t278824\n烃\t278825\n宣判\t278826\napn\t278827\n航航\t278828\n三8\t278829\n陈书记\t278830\n拜瑞妥\t278831\n小妈妈\t278832\n出口商品\t278833\nEA\t278834\n元件\t278835\n不投\t278836\n文徵明\t278837\nKelley\t278838\n仿盛大传奇\t278839\n药品批准文号\t278840\n分站\t278841\nJupiter\t278842\n比卡丘\t278843\n佛罗伦萨\t278844\n付鹏\t278845\n兴业银行手机银行\t278846\n林巧稚\t278847\nIntelCPU\t278848\n德创\t278849\n侠骨柔情\t278850\n最高价\t278851\nTICA\t278852\n12:00\t278853\n路口镇\t278854\n氪空间\t278855\n浅紫\t278856\n健哥\t278857\n花卡\t278858\noccurs\t278859\nKingsman\t278860\n贵安温泉\t278861\nRonin\t278862\n霸王育发液\t278863\n4.9秒\t278864\n新店镇\t278865\n路边野餐\t278866\n恐慌\t278867\n心灵杀手\t278868\n5T\t278869\n报名点\t278870\n纬地涵洞\t278871\nGT740\t278872\n主营\t278873\n吸血姬\t278874\nA37m\t278875\n寻侠\t278876\n这四招\t278877\n我是你的眼\t278878\n幻舞\t278879\n筹拍\t278880\necommerce\t278881\n绿康\t278882\n红树湾\t278883\ninverted\t278884\n折木奉太郎\t278885\nwaterpik\t278886\nwepy框架\t278887\n反咬\t278888\n犬马\t278889\n孟加拉虎\t278890\n氧气管\t278891\n落伍者\t278892\n若宠\t278893\n万科金域学府\t278894\nsono\t278895\n社会学概论\t278896\ngequ\t278897\n郵政\t278898\n误\t278899\n原料库\t278900\n周防\t278901\n运算\t278902\n动因\t278903\n1000行\t278904\n2016.11\t278905\n东坝镇\t278906\n消防师\t278907\n小偶\t278908\n北京保利拍卖公司\t278909\n宁波中学\t278910\n英雄传奇\t278911\nMozart\t278912\n超员\t278913\n泉港区\t278914\n僵局\t278915\n笃用\t278916\n添附\t278917\nQu\t278918\n蔬菜汁\t278919\n发酵型\t278920\n爱将\t278921\n正房\t278922\n14本\t278923\nAutoCAD2018\t278924\n齐鲁医科大学\t278925\n002074\t278926\n禁业\t278927\n花家怡园\t278928\n阴囊\t278929\n骂名\t278930\n白秋\t278931\n专服\t278932\n┆\t278933\n至关重要\t278934\nPixhawk\t278935\n神体\t278936\n四面体\t278937\n勒东\t278938\n停格\t278939\nFCT\t278940\n户长\t278941\n日语\t278942\n引退\t278943\n进驻\t278944\nmanaging\t278945\n1.9万\t278946\nstr1\t278947\n铁路客户服务中心\t278948\n子非木\t278949\n5.0EX\t278950\nC++二维数组\t278951\n章料\t278952\n抢鲜看\t278953\nwebshell\t278954\n石\t278955\n郑晓东\t278956\n老友\t278957\n宣讲团\t278958\n
班片\t278959\n齐全\t278960\n不死法医\t278961\n处突\t278962\n白塘\t278963\nJAX\t278964\n广汽菲亚特\t278965\n魄魄\t278966\nwin7触摸板\t278967\n村姑\t278968\nSymbolic\t278969\n金庸\t278970\nEcheveria\t278971\nMX2\t278972\n皇菊\t278973\nMOLEX\t278974\n江苏卫视\t278975\n话说长江\t278976\n信者\t278977\n铃村\t278978\n印象城\t278979\n寸芒\t278980\nfurious\t278981\n网易mumu模拟器\t278982\n影牢\t278983\n81岁\t278984\n东濠涌\t278985\npboc\t278986\nUbuntuKylin\t278987\nIJ\t278988\n王琦瑶\t278989\nbtm\t278990\nK线形态\t278991\n博悦娱乐\t278992\nmodification\t278993\n8余\t278994\n私章\t278995\n不轻\t278996\nports\t278997\n周存\t278998\n桃花色\t278999\n草长\t279000\n绝缘性\t279001\n张运\t279002\n400目\t279003\n真的好难\t279004\n伤神\t279005\n大成网\t279006\n电焊\t279007\n满招损\t279008\n外戚\t279009\n包凡\t279010\n压电式\t279011\n孙逊\t279012\n帕累托\t279013\n修改后\t279014\n玛尔塔\t279015\n173cm\t279016\n其间\t279017\n与事\t279018\n骆驼股份\t279019\n汽车配件\t279020\nszj\t279021\n东阳红木家具\t279022\n波速\t279023\n娘子\t279024\n奋斗路\t279025\n立地成佛\t279026\nShirley\t279027\n汞灯\t279028\nslab\t279029\n家委会\t279030\n三草两木\t279031\n_孙\t279032\n海外赤子\t279033\n山海湾\t279034\n潘春辉\t279035\n姬无夜\t279036\n林左鸣\t279037\n东方歌舞团\t279038\n乌金\t279039\n头版\t279040\n秋梦\t279041\n听话\t279042\n桥西\t279043\n富士\t279044\n江华\t279045\n很好吃\t279046\n青帝\t279047\n民行\t279048\nbl宠文吧\t279049\n环境污染\t279050\n李晔\t279051\n神武-17173\t279052\n通行\t279053\nmarkdownpad\t279054\n来不及说我爱你\t279055\n客制化\t279056\n穿帮网\t279057\n邢英\t279058\n第35号\t279059\n66年\t279060\n同轴\t279061\n台步\t279062\npvc板\t279063\n天钻\t279064\nFreekaobo\t279065\nmoveit\t279066\n八公\t279067\n晚上11点\t279068\n马可孛罗\t279069\n格兰仕\t279070\nSadness\t279071\nall.zip\t279072\n王双增\t279073\n掸邦\t279074\nmsr7\t279075\n轴盘\t279076\n7783\t279077\n异常燃钢之魂\t279078\nguzzlehttp\t279079\n公众责任险\t279080\n倒车\t279081\n白鸥\t279082\n晋亿实业\t279083\n黄河大道\t279084\n75\t279085\n帕多瓦\t279086\n智城外包网\t279087\nG9300/全网通\t279088\nirregular\t279089\n内含子\t279090\n糖机\t279091\n中国翻译研究院\t279092\n小伴\t279093\n智源\t279094\n暖气费\t279095\n钙片\t279096\n自上而下\t279097\n第9关\t279098\n贸易新闻网\t279099\n岩体\t279100\n2018年5月20日\t279101\n3-4月\t279102\npyecharts\t279103\n3d\t2
79104\n7290\t279105\n茶籽油\t279106\nsrc=\t279107\n民生民\t279108\n朱浩\t279109\n受骗者\t279110\niops\t279111\n尊敬\t279112\n柑者\t279113\n徐才厚\t279114\n中国农业科学院植物保护研究所\t279115\n圣诞夜\t279116\n贰婶\t279117\n扫雪机\t279118\n马悦\t279119\n000021\t279120\n中省\t279121\n受贿\t279122\n北辰区\t279123\n螺杆\t279124\n桂平光哥\t279125\n相思鸟\t279126\n出原\t279127\n孕晚期胎动\t279128\nSACD\t279129\n2017年10月7日\t279130\n烤羊\t279131\n言叶\t279132\n集美新城\t279133\n有色金\t279134\njavawebsoa\t279135\nweighing\t279136\n气歌\t279137\n金属结构\t279138\nMumbai\t279139\n清友\t279140\nGGJ\t279141\n卫光\t279142\nnvs\t279143\n双菱\t279144\n艾溪湖\t279145\n容易受伤\t279146\n白门\t279147\nDWG\t279148\n4000平米\t279149\n生物工程专业\t279150\n协议\t279151\n天缘\t279152\n跳蚤\t279153\n沈青山\t279154\n6399\t279155\n富盛\t279156\n公元后\t279157\n1960年代\t279158\n售货\t279159\n帖子库\t279160\n丁修\t279161\nclinicaltrials\t279162\nC仔\t279163\n以安\t279164\nTensorflow\t279165\n不可理喻\t279166\n玖号\t279167\nLauren\t279168\nErin\t279169\nKAI\t279170\n委内瑞拉\t279171\n安阳职业技术学院\t279172\nrod\t279173\n第9集\t279174\n荒淫史\t279175\n镇江南\t279176\nsoga\t279177\n0.01mm\t279178\n纪传体\t279179\n光杯\t279180\n26年后\t279181\n敬虔\t279182\n董晨\t279183\n头条号自媒体\t279184\n威朗论坛\t279185\n量子界\t279186\n漫威复仇者联盟\t279187\n白鹿枪神纪\t279188\n耐穿\t279189\nPlexus\t279190\nXM\t279191\n20个工作日\t279192\n混合所有制企业\t279193\n炒豆芽\t279194\nAttached\t279195\n无效化\t279196\n二姑\t279197\n走板\t279198\nWELL\t279199\n能效标识\t279200\n佳能5dsr\t279201\n最高效\t279202\n本草中国\t279203\n航空件\t279204\n朱磊\t279205\n酸枣仁\t279206\nONKYO\t279207\n有悔\t279208\n刑事诉讼\t279209\nAddicted\t279210\nTianjin\t279211\n握握\t279212\nshuffle\t279213\n上海市历史博物馆\t279214\n打点机\t279215\n携带者\t279216\nDIY_\t279217\n制茶\t279218\n无贼\t279219\n显微硬度计\t279220\n草帽计\t279221\nG胖\t279222\n混合型证券投资基金\t279223\n华山网\t279224\n武士桑\t279225\n沙河西路\t279226\nxnxx\t279227\n中文网志\t279228\nbrandeis\t279229\n假面舞会\t279230\n四象\t279231\n苍爹\t279232\nmayo\t279233\nスペシャル\t279234\n尼古陈天桥\t279235\n复康路\t279236\n近战流\t279237\n刘建平\t279238\n2WD\t279239\n历史表\t279240\n第249集\t279241\n砍柴\t279242\nK币\t279243\n卦\t279244\n江浙地区\t279245\n朝圣之路\t279246\n中国国际电视台\t279247\n灯灯\t2
79248\n亢\t279249\nvpdn\t279250\n法洛四联症\t279251\n卡拉赞之夜\t279252\n昆明送子鸟医院\t279253\n馨怡\t279254\n森蚺\t279255\n新港街道\t279256\n江桥镇\t279257\n盐商\t279258\nReynolds\t279259\ncontributed\t279260\n高峻\t279261\n修养\t279262\nN6Pro\t279263\n777电影网\t279264\n近似数\t279265\n福州政府网\t279266\nHotSpot\t279267\n巨犀\t279268\n辣妻\t279269\n难兄\t279270\n横坎头\t279271\n张安乐\t279272\n钥石\t279273\nVALVE\t279274\n2017年中秋节\t279275\n长林\t279276\nwingy\t279277\nPMA\t279278\n99.99\t279279\n北伐军\t279280\n复发性\t279281\ndiango\t279282\n480元\t279283\n丁海\t279284\n宁波规划局\t279285\n沧州地区\t279286\n曲阜东\t279287\n45分\t279288\n北条\t279289\n花木\t279290\nfactoring\t279291\n宜生\t279292\n偷车贼\t279293\n浙大\t279294\n腾云驾雾\t279295\n壹米\t279296\ncnpm\t279297\n南岸区\t279298\n寒骨\t279299\n下化\t279300\n致敬礼\t279301\n对角阵\t279302\n新华保险公司\t279303\n索非亚\t279304\n高_\t279305\nrue\t279306\n重算\t279307\n玉函路\t279308\n启动仪\t279309\n华泾镇\t279310\nSIDE\t279311\n误用\t279312\n白昼\t279313\nBirmingham\t279314\n凌派雪中悍刀行\t279315\n甲苯胺\t279316\n虚拟演播室\t279317\n旅友\t279318\n正大广场\t279319\nFence\t279320\n贵研铂业\t279321\n上海航空物流\t279322\n核增\t279323\nsurrounding\t279324\n烧掉\t279325\n30型\t279326\n奉诏\t279327\n90YU娱乐网\t279328\n冻结期\t279329\n戒指\t279330\n肾俞穴\t279331\n大发明家\t279332\n花雨言情小说阅读网\t279333\n丁酮\t279334\n三分之四\t279335\nPageWide\t279336\n娜鲁湾\t279337\n易鸣起名网\t279338\n决胜全面建成小康社会\t279339\n柏林\t279340\n美如花\t279341\nMFC编程\t279342\n宇宙锋\t279343\n西江镇\t279344\n补充液\t279345\nlider\t279346\n简短\t279347\nmoor\t279348\n黑龙江省高级人民法院\t279349\n泰诺\t279350\n董明珠\t279351\nIP+\t279352\n清莲\t279353\nec2a\t279354\nLOLs7\t279355\n礼金券\t279356\n幸福魔方\t279357\n叮嗒\t279358\nSalsa\t279359\n闫芳\t279360\nmongorestore\t279361\n中华诗词网\t279362\n3.6mm\t279363\n程莉\t279364\n铝绞线\t279365\ntplogin\t279366\n断面\t279367\n固溶\t279368\n对苯二胺\t279369\n胜利桥\t279370\n金融壹账通\t279371\n于海滨\t279372\n耸耸肩\t279373\n横店梦幻谷\t279374\ngnuhpc\t279375\nwab\t279376\n1月初\t279377\n旧事重提\t279378\n色先锋_影音先锋av资源_色先锋影音看片_色先锋\t279379\n苗米\t279380\n一滑\t279381\nSafe\t279382\n统计站\t279383\n耳朵\t279384\nobolee\t279385\n大魏埃及艳后\t279386\n柳州政府\t279387\n纳溪区\t279388\n编织带\t279389\n大维\t279390\
n东方红\t279391\n天睿\t279392\n田坝\t279393\n探查\t279394\n古徽州\t279395\n漏缴\t279396\n|千\t279397\n乖离性MA\t279398\n威廉森\t279399\nPlagiarism\t279400\n查理普斯\t279401\n开府\t279402\nwdcp管理系统\t279403\n永远的7日之都\t279404\n肛门脓肿\t279405\ntetris\t279406\namada\t279407\ndirec\t279408\n简谱\t279409\nVol.6\t279410\n拖管\t279411\n杰罗姆\t279412\n姣猛\t279413\nedem\t279414\n健字\t279415\nautomated\t279416\n深圳社保\t279417\n嘴\t279418\n12.03\t279419\nbó\t279420\n雷帝\t279421\n孕妇食谱\t279422\n中国社区\t279423\nAWP\t279424\n女贝\t279425\n文院\t279426\nphot\t279427\n海象\t279428\n陕西省电力公司\t279429\n吴沐熙\t279430\n小米max3\t279431\nstrcpy函数\t279432\n预知\t279433\n新塘村\t279434\n本世纪末\t279435\n垫\t279436\n系列片\t279437\n蓝鲸\t279438\nSSH框架\t279439\n张晓星\t279440\n龙池街道\t279441\n反措\t279442\n花哨\t279443\nmyeclipse2013\t279444\n蒙学\t279445\njf.10086.cn\t279446\n摩皱\t279447\n12336\t279448\n500kw\t279449\n赵简子\t279450\n编钟\t279451\n琴师\t279452\n然则\t279453\n模板机\t279454\n慕容冲\t279455\n徒子\t279456\n吾日三省吾身\t279457\n腾格里网\t279458\ndefense\t279459\n校本教研\t279460\n彩绘机\t279461\n萨拉热窝\t279462\ninsight3.5\t279463\n丰水期\t279464\n康泉\t279465\nK9\t279466\nPatch\t279467\n很多事\t279468\n3招\t279469\n第六季\t279470\n港娱\t279471\n500条\t279472\nencoding\t279473\n树袋\t279474\n京贸国际城\t279475\n微暖\t279476\n官厅水库\t279477\n2017年劳动节\t279478\n未年\t279479\nmdl\t279480\n杨蕾\t279481\nf530\t279482\n2077\t279483\nConn\t279484\n郭英\t279485\nFBX\t279486\n宁蒗县\t279487\n铜柱\t279488\nQr\t279489\nexecutorservice\t279490\n南开大学法学院\t279491\nMinitab\t279492\nオマ\t279493\nxyy\t279494\n习作三\t279495\n单纯疱疹病毒\t279496\n提肛运动\t279497\n石景山医院\t279498\n鸡翅根\t279499\n41分钟\t279500\n20151203\t279501\n莫斯科大学\t279502\n攸县政府\t279503\n鬼手\t279504\n西南航空公司\t279505\n130毫米\t279506\n家楼\t279507\n几生\t279508\n华胜天成\t279509\n徐畅\t279510\n幻璃镜\t279511\n叶言\t279512\n七分之六\t279513\n弹跳\t279514\nj.fla\t279515\n龙苑\t279516\n飞虹\t279517\n莹莹\t279518\n12寸\t279519\n水准\t279520\n2017节假日\t279521\n寡姐\t279522\n埃及\t279523\n36.8\t279524\nmgc\t279525\npetition\t279526\n预计负债\t279527\n13925836855\t279528\n个税起征点\t279529\n蠡口\t279530\n飞歌联盟\t279531\n第251集\t279532\n18183终结者2审判日\t279533\n7
瓶\t279534\n乔健\t279535\n安全性\t279536\n夜骑\t279537\n亲肤\t279538\n门庭\t279539\nghi\t279540\n山鹰\t279541\n鲜草\t279542\n新奥尔良烤翅\t279543\n勇者之路\t279544\n香颂湾\t279545\n2礼包\t279546\n慢性胰腺炎\t279547\n道聚城\t279548\n公斤\t279549\n沫子\t279550\n金太阳\t279551\n中炮\t279552\n湿巾\t279553\n正阳教育集团\t279554\n屠龙者\t279555\nwifi无线路由器\t279556\n18所\t279557\n生兼职\t279558\n音乐谱\t279559\n脊髓肿瘤\t279560\n魔幻手机2\t279561\nLongines\t279562\n39种\t279563\n360OS\t279564\nmodule\t279565\n三国战记2群雄争霸\t279566\n204号\t279567\n黄梅县\t279568\n9.8元\t279569\n80岁\t279570\n单温\t279571\nlenze\t279572\n第66集\t279573\ndescriptors\t279574\n3语\t279575\n50路\t279576\nShift+\t279577\n徐正曦\t279578\n男牛\t279579\n飞路\t279580\nstellarium\t279581\n气焊\t279582\nconcise\t279583\n东方日报\t279584\n督学\t279585\n吡美莫司乳膏\t279586\n杀号\t279587\nFabio\t279588\n中国企业家俱乐部\t279589\n田口\t279590\n程心\t279591\ndbvis\t279592\n保护符\t279593\n脱颖而出\t279594\n日产GTR\t279595\n飞侠\t279596\n北回归线\t279597\n豪宠\t279598\n平西府\t279599\n收件人\t279600\nPlaces\t279601\n自产\t279602\n铜锣\t279603\n澳门地区\t279604\n市城管委\t279605\n一平尺\t279606\n若邻网\t279607\n停留\t279608\n老物件\t279609\n紧接着\t279610\nG29\t279611\n海盐论坛\t279612\n艾曲波帕\t279613\n搞趣\t279614\n见山\t279615\n3.7G\t279616\n7880\t279617\n555.com\t279618\n韩博士\t279619\n宝塔区人民政府\t279620\n湖北共青团\t279621\n20171016\t279622\n安宁市人民政府\t279623\n刷毛\t279624\nCHINT\t279625\n侠探杰克\t279626\n挡水条\t279627\n蓝药\t279628\n大连旅顺\t279629\nメロディ\t279630\n爱丽舍\t279631\nsdwebimage\t279632\n下载吧\t279633\nBD-720P/1080P-MP4\t279634\n上岗证\t279635\n湖北企业网\t279636\ngently\t279637\n增票\t279638\n团车网\t279639\nadvan\t279640\n抽丝\t279641\nqv\t279642\n2016年以前\t279643\n大下\t279644\n羊胎素\t279645\n卡琳娜\t279646\n仗势欺人\t279647\n我曾眼瞎错爱你\t279648\n新书记\t279649\n南京机电职业技术学院\t279650\n放大率\t279651\n预增\t279652\nHFP\t279653\n暗渠\t279654\n二篇\t279655\n2月7日\t279656\n天猫超市卡\t279657\n朋友之间\t279658\n应天大街\t279659\n泉涌\t279660\n玻璃酸钠\t279661\nKingdoms\t279662\n秉火\t279663\nConvolution\t279664\n场中路\t279665\n少子\t279666\n剪爱\t279667\nassa\t279668\n20170726\t279669\n赤峰市教育局\t279670\n流向\t279671\nジャ\t279672\n化瘀药\t279673\noracle12\t279674\n2D版\t279675\n超级牛股\t279
676\n涂药\t279677\ngt820\t279678\n萤石商城\t279679\nvolcanic\t279680\nvice\t279681\n肥皂盒\t279682\n沙发客\t279683\nFuzor\t279684\n昌乐\t279685\nModelForm\t279686\n定影膜\t279687\npentium\t279688\n相扑\t279689\n验放\t279690\n屋顶\t279691\nwouldn\t279692\n师奶\t279693\n同轴线\t279694\n三角巾\t279695\n钢轮\t279696\n北京残保金\t279697\n电压法\t279698\n教育路\t279699\n测控仪\t279700\n惠州大亚湾西区\t279701\n立等可取\t279702\nyouneed\t279703\n支付圈\t279704\n挖掘机\t279705\n京瓷180\t279706\n选择奖\t279707\n连锁经营管理专业\t279708\n尸王\t279709\n7-1\t279710\n黑糖\t279711\n讨鬼传极\t279712\n丢掉\t279713\nStaying\t279714\nquorum\t279715\n第六话\t279716\n毛贼\t279717\n140cm\t279718\n信长协奏曲\t279719\n网格布\t279720\n易爆品\t279721\n明光市\t279722\n图稿\t279723\nBT种子1080P\t279724\n未遂\t279725\n国编\t279726\nMaki\t279727\n深海恐惧症\t279728\n四十八式太极拳\t279729\n预应力张拉\t279730\nProposals\t279731\n莫非\t279732\n逍遥游军校\t279733\n新飞电器\t279734\n陕西国税电子税务局\t279735\n中科院声学所\t279736\n系列\t279737\n夏威夷岛\t279738\n士道\t279739\n瓦喵\t279740\n花版\t279741\nFish\t279742\ndage\t279743\n椰肉\t279744\n赖清德\t279745\n通灵大峡谷\t279746\n双鱼男\t279747\n下进\t279748\n猪瘟\t279749\nQ吧\t279750\n画地为牢\t279751\n张金\t279752\nクロイヌ\t279753\n耳轴\t279754\n和平共处五项原则\t279755\n百分\t279756\nポケットモンスタ\t279757\n大型机\t279758\n党政群\t279759\nK-12\t279760\n圈子花园\t279761\n叶琳娜\t279762\n教学楼\t279763\ncosi\t279764\n韵诗\t279765\n八大\t279766\n徽韵\t279767\n中国银行业协会\t279768\n2010年前\t279769\nGCID\t279770\n主从机\t279771\n邮政编码\t279772\n数字压力表\t279773\n宇驰\t279774\nMadilyn\t279775\n游家\t279776\n升压变压器\t279777\n悉尼港\t279778\norigin\t279779\n屏面\t279780\n闪客快打\t279781\n国统股份\t279782\n本性\t279783\n堆量\t279784\n哪边\t279785\n电码\t279786\np55\t279787\n江苏教育厅\t279788\n张潇\t279789\n极击网\t279790\nAtlas\t279791\n谭芷昀\t279792\n林淮生\t279793\n80s电影网\t279794\nCOMSOL\t279795\n雨崩村\t279796\nUNC\t279797\n111路\t279798\n成都国际金融中心\t279799\n长湖\t279800\n黯月\t279801\n香松\t279802\n炖菜\t279803\n数字转字符\t279804\n康巴什\t279805\nSFDA\t279806\n服服帖帖\t279807\n多普勒\t279808\n东湖开发区\t279809\n新闻调查\t279810\n高中女\t279811\nGakki\t279812\n我的房间\t279813\n赛跑\t279814\n玩吃鸡\t279815\n疝气\t279816\nSpeedy\t279817\n风骚女\t279818\n非京\t279819\n腾格尔\t279820\n底栏\t27982
1\n广丰区人民政府\t279822\n電腦\t279823\n狼途\t279824\n自嵌式\t279825\n真寒\t279826\n柴氏\t279827\n张江药谷\t279828\n豹纹\t279829\nLayIM\t279830\n85条\t279831\n华赢\t279832\n模流分析\t279833\n合肥工业大学\t279834\nsmx\t279835\n格内\t279836\n第八讲\t279837\nliux\t279838\n唐菖蒲\t279839\n杭州湾新区\t279840\nfrog\t279841\nWeX5\t279842\n亦\t279843\nbiangbiang\t279844\n内生\t279845\n海滨\t279846\n发泥\t279847\n婚宴酒店\t279848\n第一人民医院\t279849\nQuixel\t279850\n爱心社\t279851\nXFTP\t279852\nㄣ\t279853\n万门大学\t279854\n雷克萨斯ES论坛\t279855\n小马宝莉全集\t279856\n材料科学与工程专业\t279857\n加洛林\t279858\npathofexile\t279859\n初代火影\t279860\n文思\t279861\n皆知\t279862\n重庆八中\t279863\n2010年12月\t279864\n求生欲\t279865\n冬眠\t279866\n强强\t279867\n新钢\t279868\n克罗齐\t279869\njuan\t279870\n拆迁款\t279871\n激动剂\t279872\n球拍\t279873\nintimacy\t279874\nstamps\t279875\n拳王阿里\t279876\ncomparator\t279877\n精简\t279878\n饼饼\t279879\n速记\t279880\nCPU使用率\t279881\nfastener\t279882\n应税行为\t279883\n魏红\t279884\n6小时后\t279885\n李曦\t279886\nMOBA\t279887\n002069\t279888\n语义学\t279889\n叶子花\t279890\n弦乐四重奏\t279891\n4600u\t279892\n待君来\t279893\n中欧国际工商学院\t279894\n青霉素\t279895\n松拓手表\t279896\n球性\t279897\n星河城\t279898\n新时代湖北讲习所\t279899\n惊厥\t279900\n八木天线\t279901\nBW\t279902\n慢性咽炎\t279903\n上海曙光医院西院\t279904\n葛饰北斋\t279905\ncry\t279906\n徐子\t279907\n柱下\t279908\n甘源\t279909\n仿皮\t279910\n华电国际\t279911\nMoses\t279912\n版署\t279913\n胖大海\t279914\nPPT模板网\t279915\n692\t279916\n好书\t279917\n坦克炮\t279918\n野芷湖\t279919\n斯琴高丽\t279920\n退一步海阔天空\t279921\nskm\t279922\n说谎试试\t279923\nlooleee\t279924\n内保\t279925\n大青沟\t279926\n马徐骏\t279927\n洗宠\t279928\nAirPower\t279929\n小溪镇\t279930\nWINS\t279931\n中园\t279932\n略有\t279933\n长端\t279934\n支数\t279935\n广告页\t279936\n中新互联\t279937\n杨排风\t279938\n王珉\t279939\n飞标\t279940\nexperiments\t279941\n第六十七章\t279942\n融资租赁公司\t279943\n50万台\t279944\n河洛中文社区\t279945\n晋人社厅\t279946\nConnor\t279947\nJunbaor\t279948\n韩丽\t279949\n正赛\t279950\n偷电\t279951\n肝性\t279952\n感电\t279953\n史迹\t279954\n武工队\t279955\n国务院安委会\t279956\n焦糖玛奇朵\t279957\n不介意\t279958\n43423漫画网\t279959\n干得好\t279960\n丢锋网\t279961\nice\t279962\n15个\t279963\n凯美瑞\t279964\n淘宝知网\t279965\
nDeskMini\t279966\n专论\t279967\n压气机\t279968\n2.7_\t279969\n熏黑\t279970\n观潮\t279971\nGasket\t279972\n中裤\t279973\nslicing\t279974\n甘露园\t279975\nexid\t279976\n外机\t279977\n药单\t279978\n陋习\t279979\nXlight\t279980\n古力柯蓝\t279981\n龙山县人民政府\t279982\n聚艺\t279983\n湖南大厦\t279984\n气锤\t279985\n千倍\t279986\n耀夜\t279987\n张硕\t279988\n_天成医疗网\t279989\n妈咪包\t279990\n减速机交易网\t279991\nDBShop\t279992\nhem\t279993\nDancing\t279994\nSelected\t279995\nTH2\t279996\n20180303\t279997\n丹寨万达小镇\t279998\ni美股\t279999\n修道士\t280000\n离经\t280001\n维稳办\t280002\nindexOf\t280003\nUMU\t280004\nmindmap\t280005\n公滨路\t280006\nwlanapi\t280007\n517\t280008\n下元\t280009\n青岛大学\t280010\n攫取\t280011\n白夜协奏曲\t280012\n中华人民共和国网络安全法\t280013\n演艺家\t280014\n张烨\t280015\n之所以\t280016\n恩城\t280017\n全日空\t280018\n一个批量\t280019\n陕西中医学院\t280020\n药王谷\t280021\n夜来\t280022\n辽西\t280023\n陶渊\t280024\n小卢\t280025\n沭阳网\t280026\n划小\t280027\n活化剂\t280028\n美容业\t280029\n可比商务服务网\t280030\n银剑\t280031\n遛娃\t280032\n小妙\t280033\n穷苦\t280034\n毕友网\t280035\n电位差计\t280036\n领导权\t280037\n言教\t280038\n四达\t280039\n介电材料\t280040\n海珠湖\t280041\nslop\t280042\n入党积极分子考试\t280043\n信誉好\t280044\n风衣女\t280045\n赵亚飞\t280046\n莫干山镇\t280047\n第二届\t280048\n成熟型\t280049\n中华人民共和国著作权法\t280050\n丈八蛇矛\t280051\n档案盒\t280052\n中国民用航空局空中交通管理局\t280053\n橙盒\t280054\nicann\t280055\n四机\t280056\n第2期\t280057\nx片\t280058\nmapstate\t280059\n一明\t280060\n俞正声\t280061\n富春桃源\t280062\nmw\t280063\n昆仑镜\t280064\n州人民政府\t280065\n就错\t280066\nstays\t280067\n晚宴\t280068\n笑击\t280069\n第九季\t280070\n中饱私囊\t280071\n2018.4.29\t280072\nhzxl\t280073\n130公里\t280074\n32秒\t280075\n延平路\t280076\n醇熟\t280077\n小谷\t280078\n爵位\t280079\nGuinea\t280080\n琪雅\t280081\n凌桥\t280082\n刷车\t280083\n45例\t280084\n签\t280085\ncdfi\t280086\n老男孩\t280087\n61张\t280088\n青藏高原\t280089\n6碟\t280090\n问做\t280091\n无殇\t280092\n轴力图\t280093\n电报\t280094\nevd\t280095\ndde\t280096\n郑伊健\t280097\n呼求\t280098\n9月24日\t280099\nOunce\t280100\n20180204\t280101\n德云社相声\t280102\n开江县\t280103\n二毛\t280104\n军事分界线\t280105\n乙酰螺旋霉素片\t280106\n277集\t280107\nVagaa哇嘎\t280108\n雷克特\t280109\n造血干细胞移植\t280110\n破
军\t280111\n中小学生作文网_中考高考满分作文\t280112\n打气筒\t280113\n所闻\t280114\n流年\t280115\nConductor\t280116\n跌荡\t280117\n校表\t280118\nweekend\t280119\n10后\t280120\n奇异恩典\t280121\n北京公共自行车\t280122\n粒子性\t280123\n拖箱\t280124\n花型\t280125\n平安喜乐\t280126\n秋野\t280127\n猫仔\t280128\n梦幻诛仙手游\t280129\n深莞\t280130\n青海省委\t280131\nMac地址\t280132\n六方面\t280133\n阿芝莎\t280134\n罗莱\t280135\nJPEG2000\t280136\n广东省口腔医院\t280137\nNavicat\t280138\n点蚀\t280139\n长江国际\t280140\n法雷奥\t280141\n膀胱壁\t280142\n福朋喜来登酒店\t280143\n大变样\t280144\nhandan\t280145\nverified\t280146\n国王的恩赐\t280147\n23幢\t280148\n楦头\t280149\n贾峰\t280150\n当阳\t280151\n城镇居民\t280152\n滑囊炎\t280153\n购房经\t280154\n沙滩巾\t280155\nkgtemp\t280156\n一夫茶\t280157\ndame\t280158\n天物\t280159\n2017年12月20日\t280160\n新葫芦娃\t280161\n八分之三\t280162\nkivy\t280163\n贵阳市人民政府\t280164\n寓居作\t280165\n可用率\t280166\n基委\t280167\n黄南藏族自治州\t280168\n兴坪\t280169\n勤思\t280170\n无人艇\t280171\n马德拉\t280172\nyung\t280173\n八一广场\t280174\n楼面荷载\t280175\n保守党\t280176\nAvago\t280177\n美人关\t280178\nxfce,lxde\t280179\n刊发\t280180\n全地\t280181\n今晨\t280182\n招待费\t280183\n标靶\t280184\n颠簸\t280185\nMarry\t280186\n233333\t280187\n补学\t280188\n霸龙\t280189\n宇仙炅\t280190\n雪平\t280191\n努比亚z18mini\t280192\nGoogle在线翻译\t280193\n紧要\t280194\n放过自己\t280195\n释怀\t280196\n自由身\t280197\n甲癌\t280198\n姜大声\t280199\n武汉控股\t280200\nWagas\t280201\n20150403\t280202\n胃结石\t280203\n沈辽路\t280204\nEmiri\t280205\n什么性\t280206\n一个元\t280207\n哈飞赛马\t280208\n苹果区\t280209\n义务教育公办学校\t280210\n莉比\t280211\n视线\t280212\n2017-12-31\t280213\nWED\t280214\n轻绳\t280215\ndjango\t280216\n金胜维\t280217\n大一品\t280218\n伊夫圣罗兰\t280219\n0.3.5\t280220\n2017年12月25日\t280221\n雅木茶\t280222\n贵方\t280223\n徐曼\t280224\n青岛中院\t280225\nPP2\t280226\nqfil\t280227\n优尚\t280228\n河南省总工会\t280229\n不死人\t280230\ncxf\t280231\n焦化\t280232\n3秒钟\t280233\n甘肃省水利厅\t280234\n2018排\t280235\n勇进\t280236\n发审\t280237\n胜者为王\t280238\n战地5\t280239\n李大霄\t280240\n胡汉三\t280241\nsplit函数\t280242\n陕西新闻网\t280243\n上海保税区\t280244\n小学生赛\t280245\n衣车\t280246\n气药\t280247\n胸门\t280248\n张晓玲\t280249\ntoe\t280250\nleed\t280251\n李凡\t280252\n点米时尚网\t280253\n张牙舞爪\t
280254\n市民\t280255\n集成电路查询网\t280256\n自动版\t280257\n热帖\t280258\n云免流\t280259\n三菱银行\t280260\n特殊性\t280261\n格斗园\t280262\n习奥会\t280263\nactivex控件\t280264\nTun\t280265\n女兵\t280266\n岩浆岩\t280267\n校赛\t280268\n塞伯坦\t280269\n秋水仙碱片\t280270\n腹围\t280271\n维生素B1\t280272\n嗖\t280273\noutgoing\t280274\n组卷网\t280275\nWiFi链\t280276\n无品\t280277\n庆良流毒\t280278\n保存期\t280279\nppa\t280280\n峡山水库\t280281\n恐惧感\t280282\nSeating\t280283\n发酵茶\t280284\nPKI\t280285\n几平\t280286\n看花\t280287\n假球\t280288\n云南昭通\t280289\n碱基\t280290\nfesto\t280291\nEnounce\t280292\n华南城网\t280293\n追过\t280294\niac\t280295\n主控\t280296\n新城西岸公园\t280297\n爱丽丝伪娘团\t280298\n碧琪\t280299\n未全\t280300\n爱行\t280301\n回馈\t280302\n麦默通\t280303\n醉迷\t280304\n苹果6plus\t280305\n新义安\t280306\n市公安局\t280307\n一个页\t280308\n次干路\t280309\n紫光灯\t280310\n夜母\t280311\nDifficult\t280312\n垫肩\t280313\n2125\t280314\n1000名\t280315\n分割器\t280316\n20升\t280317\ndocumen\t280318\n选调生\t280319\n摇动\t280320\n导向型\t280321\n制造费用\t280322\n纤皮\t280323\n堵枪\t280324\n副总\t280325\n联想小新v4000\t280326\n巴啦啦小魔仙之彩虹心石\t280327\n东奔西顾\t280328\nTeaming\t280329\n让步\t280330\n泛碱\t280331\nadventure\t280332\n8架\t280333\n富盈\t280334\n石井\t280335\n怀城镇\t280336\n途虎养车\t280337\n女神网\t280338\n田溯宁\t280339\n建筑群\t280340\n肥逼\t280341\n白山市人民政府\t280342\n全民健身日\t280343\n曾几\t280344\n宁波银行汇通\t280345\ncoup\t280346\ncandidate\t280347\n阴阳家\t280348\nTRANSACTION\t280349\n标箱\t280350\n汽车业\t280351\n固泊\t280352\n场面\t280353\n东铁营\t280354\nbeloved\t280355\n安琪儿\t280356\n保罗唐骏\t280357\n羊脂\t280358\n名艺\t280359\nMix2s\t280360\ndependency\t280361\n姒\t280362\nHogan\t280363\nremoveall\t280364\n徐公平\t280365\nRenesas\t280366\ncrusader\t280367\n20180301\t280368\n君子以泽\t280369\n颜盈\t280370\n使唤\t280371\n火影忍者同人志\t280372\ntuichu\t280373\n樱井孝宏\t280374\n农村信用社\t280375\n扁钢\t280376\n溏心风暴2\t280377\n楼花\t280378\n巴安\t280379\n胡渐彪\t280380\n郭启儒\t280381\n俞灏李亚鹏\t280382\n通灵王\t280383\n张向阳\t280384\n二乙醇胺\t280385\n2018小说网\t280386\n奥林\t280387\n假性近视\t280388\n豆得儿\t280389\n慕课网\t280390\n夏利n5\t280391\nC++语言\t280392\n浪客剑心\t280393\n原作者\t280394\nping域名\t280395\n仅靠\t280396\n贪念\t280397\n日俄\t2
80398\n浅上藤乃\t280399\n割绒\t280400\ndiscourse\t280401\n7009\t280402\n裕度\t280403\n几科\t280404\n绍兴地区\t280405\n市院\t280406\nhom\t280407\n2980名\t280408\n拉二\t280409\n9.0.7\t280410\n饰条\t280411\n美图V6\t280412\n蜂胶软胶囊\t280413\n当务之急\t280414\n无主\t280415\n桑切斯\t280416\n已经很久\t280417\n20160525\t280418\n广东碧桂园职业学院\t280419\nHow\t280420\n19世纪中期\t280421\n香花槐\t280422\n花生\t280423\n3D建模软件\t280424\n中航黑豹\t280425\n指标生\t280426\n阵字\t280427\nEscape\t280428\n云轨\t280429\n减肥者\t280430\n叠墅\t280431\nreduction\t280432\n第13章\t280433\namericano\t280434\n珠宝\t280435\n6万吨\t280436\n常春藤\t280437\n下位机\t280438\npatterns\t280439\n圆通公司\t280440\n二氧化碳培养箱\t280441\n2018季\t280442\n安徽省教育厅\t280443\n华强北路\t280444\n北京索莱宝科技有限公司\t280445\n牵绊\t280446\n电子技术与软件工程\t280447\nyamamoto\t280448\n36栋\t280449\nDHL快递\t280450\n农业税\t280451\n紫金苑\t280452\n爆米花\t280453\n武仙\t280454\nphy\t280455\nxsin\t280456\n609\t280457\ntvN\t280458\n俺妹\t280459\n老君\t280460\n骑行服\t280461\n冷笑\t280462\n每30分钟\t280463\n悲惨\t280464\nunt\t280465\n上海外国语大学附属浦东外国语学校\t280466\n晋建\t280467\nA10处理器\t280468\n良渚文化\t280469\n免购置税\t280470\n高压清洗机\t280471\n东瑞\t280472\n厦漳大桥\t280473\n社会工程\t280474\n_咖啡吧\t280475\nSwitch版\t280476\n姐汁\t280477\n靖边\t280478\n仙灵女巫\t280479\n速看\t280480\nbandicut\t280481\n沈力\t280482\n0534\t280483\njudgment\t280484\n20170614\t280485\n周德堃\t280486\n百果园\t280487\nSCA\t280488\n970A\t280489\n192k\t280490\n惠州市政府\t280491\n打炮\t280492\n匈牙利\t280493\nufeff\t280494\n在囧途\t280495\n袁克力\t280496\nloga\t280497\n冰雪季\t280498\n四一二反革命政变\t280499\n粉墨\t280500\nntv\t280501\n徐庄\t280502\n百利科技\t280503\n引才\t280504\n器人\t280505\nRsyslog\t280506\n许光\t280507\n振发\t280508\nMinecraft服务器\t280509\n入职\t280510\n综艺节目表\t280511\nv380\t280512\n117个\t280513\nS1\t280514\n抑或\t280515\n核弹猴\t280516\n变了样\t280517\nHarrier\t280518\n噪音计\t280519\n蒋璐霞\t280520\n玉屏侗族自治县\t280521\n碧水龙庭\t280522\n开福区政府\t280523\n小气球\t280524\nCodes\t280525\n上海星巴克\t280526\n拿花\t280527\n新城科技园\t280528\n食物\t280529\n摩飞\t280530\n余庆县\t280531\npoints\t280532\n种豆网\t280533\n小黄豆\t280534\n香港中央结算有限公司\t280535\n义县\t280536\n对偶性\t280537\n老街坊\t280538\nSummoner\t280539\n
土城镇\t280540\n资管部\t280541\n讨债\t280542\n数_\t280543\n嘴皮子\t280544\n都市118连锁酒店\t280545\n青春飞扬\t280546\n西施故里\t280547\n贾法里\t280548\n山东传媒职业学院\t280549\n6周岁\t280550\n满水\t280551\n达兔哥\t280552\n人力资源专业\t280553\n志趣新闻_志趣网\t280554\n清罐\t280555\ndeduction\t280556\n洁宝\t280557\n腌\t280558\n轩辕剑之天之痕\t280559\n纠集\t280560\n硬塑\t280561\n相框\t280562\n淘宝宝\t280563\n火箭军工程大学\t280564\n肖风\t280565\n郑少忠\t280566\nhypo\t280567\n意外收获\t280568\n定义域\t280569\n宁波栎社国际机场\t280570\n张枣\t280571\n失手\t280572\n针灸减肥\t280573\napparently\t280574\n仿木纹漆\t280575\n爱奇艺影音\t280576\nPORN\t280577\nriven\t280578\nbeatless\t280579\nWampserver\t280580\nruby\t280581\n珠村\t280582\n防战\t280583\nbikini\t280584\n鹿鞭酒\t280585\n歌舞伎町\t280586\n掌上明珠\t280587\neureka\t280588\n蒲黄\t280589\n荣耀3c\t280590\n琵琶谱\t280591\npipi\t280592\n扶芳藤\t280593\nxhost\t280594\nchery\t280595\n东方女性网\t280596\n民事判决书\t280597\n金属板\t280598\nEmerson\t280599\n次\t280600\n南宗\t280601\n衣鱼\t280602\n5um\t280603\n九曲\t280604\n凌晨四点\t280605\n暖贴\t280606\n爬地\t280607\n打包盒\t280608\n2.51\t280609\n5席\t280610\n丝机\t280611\n铸造石\t280612\n单元格函数\t280613\n大话西游之大圣娶亲\t280614\n下颌骨\t280615\n模温机\t280616\n滩区\t280617\nDX\t280618\n接待\t280619\n黄权\t280620\nhprose\t280621\n记录贴\t280622\n6万个\t280623\n双链\t280624\ndede\t280625\n风轻云淡\t280626\n帝舵\t280627\n教具\t280628\n惠丰\t280629\n1045\t280630\n青水\t280631\n優\t280632\n桑德集团\t280633\n伊文\t280634\n深圳肤康皮肤病医院\t280635\n深邃\t280636\n重庆市政协\t280637\n戏楼\t280638\n〓\t280639\njre7\t280640\n奔放\t280641\n童祥苓\t280642\n天津区\t280643\nc9000\t280644\n汇宝\t280645\nXware\t280646\n名巨\t280647\n卡片\t280648\nllvm\t280649\n第58条\t280650\n免试\t280651\n1000级\t280652\n毒打\t280653\n皮带扣\t280654\nLatch\t280655\n州牧\t280656\n海口市政府\t280657\n机滤\t280658\npkpm\t280659\n八个多月\t280660\n劣后级\t280661\n6面\t280662\n石敬瑭\t280663\n黄俊鹏\t280664\n伸展\t280665\n浮筒\t280666\n价格高\t280667\n客官\t280668\n少侠\t280669\n咬字\t280670\nAlphaBOUNCE\t280671\n武汉邮科院\t280672\n海阔创世纪\t280673\nfromis\t280674\n网聊\t280675\nWF-1000X\t280676\n循环伏安法\t280677\n巴里坤\t280678\n温岚\t280679\n傻儿传奇\t280680\n接续\t280681\n上下限\t280682\n内存地址\t280683\nawz\t280684\n4.5V\t280685\n公共租赁
住房\t280686\ncd机\t280687\n腾讯理财通\t280688\n谈生命\t280689\n合川区政府\t280690\n三头\t280691\n老菊\t280692\n西软\t280693\n北京市昌平区\t280694\nExcel2003/2007\t280695\nlx5\t280696\nAC97\t280697\n桦褐孔菌\t280698\nSavannah\t280699\n独影\t280700\n0271\t280701\nPTN\t280702\n易考吧\t280703\n混沌与秩序2吧\t280704\n郑州住房公积金管理中心\t280705\nPRD\t280706\n奥林匹斯之链\t280707\n布宜诺斯艾利斯\t280708\n迈开\t280709\n4.3.1\t280710\n31章\t280711\n中国质量网\t280712\n探码\t280713\n芷江西路\t280714\n观景台\t280715\n飙风\t280716\n来源地\t280717\n宁密康\t280718\n选战\t280719\n德英\t280720\n璜土镇\t280721\nMODv\t280722\n随波\t280723\nplu\t280724\n摘除\t280725\n神咒\t280726\n需用\t280727\n娴雅\t280728\n隋唐大运河\t280729\nbsf\t280730\n碧草\t280731\n卧龙\t280732\n松香水\t280733\n上海浦东机场日上免税店\t280734\n没想\t280735\n电击棒\t280736\n金世纪\t280737\n白狮\t280738\n财务人\t280739\n祛魅\t280740\n淘宝卖家中心\t280741\n节肢动物\t280742\n1440P\t280743\n东唐\t280744\n上古卷轴5:天际\t280745\n强转\t280746\n百度网盘净网\t280747\n西北工业大学明德学院\t280748\n婚礼进行曲\t280749\n宅男们\t280750\n丹特丽安\t280751\n啦啦队\t280752\nThinkPHP函数\t280753\nvb\t280754\nude\t280755\n圣教军\t280756\njavahl\t280757\n偏误分析\t280758\n桂林\t280759\n山脚\t280760\n湘潭在线新闻网\t280761\n炮爷\t280762\n惊变\t280763\n一瓣001\t280764\n谁人\t280765\nPatty\t280766\n高以翔\t280767\n山东鲁能泰山足球俱乐部\t280768\n荣耀墨子\t280769\n寒窗\t280770\n暗门\t280771\np90\t280772\n好宝宝\t280773\nボディ\t280774\npccad2015\t280775\n花鼓\t280776\n木球\t280777\n迷鹿\t280778\n龙王\t280779\n话语权\t280780\n商品率\t280781\n危包\t280782\n推免\t280783\n北野望\t280784\nShayne\t280785\nMAPS\t280786\n802.11帧\t280787\n南京审计大学金审学院\t280788\n踏血寻梅\t280789\n海洋工程网\t280790\n杜比亚\t280791\n萨米特\t280792\n顺槽\t280793\n腌鸡蛋\t280794\n婚婚\t280795\n山东航空\t280796\n死节\t280797\n孙燕姿\t280798\n离罩\t280799\nvia浏览器\t280800\nonclick=\t280801\n南通网\t280802\n生动化\t280803\n统一冰红茶\t280804\n10026\t280805\n湘军\t280806\nagang\t280807\n海宁皮革城\t280808\n挣\t280809\n食全食\t280810\n第45号\t280811\nResolving\t280812\nYuuki\t280813\n第六十八条\t280814\n求己\t280815\n水高庄园\t280816\nCheckStyle\t280817\n摄制\t280818\n生死观\t280819\n二附院\t280820\n自给\t280821\n古茶\t280822\n老药\t280823\n中国医药网\t280824\nBody\t280825\n威海汽车站\t280826\n公交车路\t280827\noffic2010\t280828\n望城\t
280829\n吴亦朱桢\t280830\n仙裙\t280831\n乘风\t280832\n捌\t280833\nCortana\t280834\n丽江山水S酒店\t280835\n吊物\t280836\n哪些\t280837\n华创\t280838\n一学期\t280839\n所至\t280840\nhtl\t280841\n龙盘湖\t280842\n郭汝瑰\t280843\n司盘\t280844\n依赖型\t280845\n百地\t280846\n洛修利亚\t280847\n君主制\t280848\noptimism\t280849\nMortgage\t280850\ncharAt\t280851\nG102\t280852\nbaacloud\t280853\nVichy\t280854\n吴苋\t280855\n27秒\t280856\n妈群\t280857\n急单\t280858\n1.4T\t280859\n洗板机\t280860\n点菜宝\t280861\nvisualsvn\t280862\n幂法\t280863\n售票机\t280864\n浅龋\t280865\n雾化治疗\t280866\n宝马X7\t280867\nruins\t280868\nBilibili\t280869\n魏秋桦\t280870\nGROUNDS\t280871\n表在\t280872\n8.02\t280873\n青企\t280874\n高粱米\t280875\n噩耗\t280876\n微笑航空\t280877\n试卷版\t280878\n圆柱面\t280879\n艹\t280880\n用户组\t280881\n倒底\t280882\n紫藤萝\t280883\n尿急尿\t280884\n400美元\t280885\n都匀市\t280886\n推推\t280887\n蔡畅\t280888\nexponent\t280889\n博丽神社\t280890\nSQl\t280891\n保健器械\t280892\n中华人民共和国统计法\t280893\n葡萄糖酸内酯\t280894\n泸定县\t280895\nグレ\t280896\nH盘\t280897\n华阳湖湿地公园\t280898\nv3.1.5\t280899\n长城物业集团股份有限公司\t280900\nv500\t280901\n阿普唑仑片\t280902\n广汇汽车\t280903\n佩蒂股份\t280904\n宜宾市农业局\t280905\n掀桌\t280906\nleather\t280907\n闭口\t280908\nPDF24\t280909\n聊天框\t280910\n八分之七\t280911\n1.2.17\t280912\n健康成长\t280913\n售楼小姐\t280914\nshell变量\t280915\n南宁市人民政府\t280916\n膨大剂\t280917\nHarrods\t280918\nacetate\t280919\n碧然德\t280920\n霍乱疫苗\t280921\n新浦镇\t280922\n潞\t280923\n平喘\t280924\n权益性投资\t280925\n文旅部\t280926\n第七_\t280927\nuicontrol\t280928\n第2季\t280929\n今生今世\t280930\n切出\t280931\n汽车厂\t280932\nNO\t280933\n1-5\t280934\n白驹\t280935\n祛斑\t280936\n南孚电池\t280937\nNuGet\t280938\nsublimetext3\t280939\nShows\t280940\n合肥日报\t280941\n长安欧尚_\t280942\n村医\t280943\n强龙\t280944\n治保\t280945\n海安网\t280946\ntext3注册码\t280947\n之星\t280948\nHunting\t280949\n战衣\t280950\n灵越\t280951\n致加西亚的信\t280952\n跳枪\t280953\naza\t280954\n显眼\t280955\nBigberg\t280956\n卫风\t280957\n责权\t280958\n惹火\t280959\n审敛\t280960\n中国国际金融有限公司\t280961\n地学\t280962\n千惠\t280963\n前盘\t280964\n25\t280965\n国芯\t280966\n房地产经纪\t280967\n好色派沙拉\t280968\n实业大厦\t280969\n好帮手\t280970\n登山绳\t280971\nSiri\t280972\ntabi
ndex\t280973\noffice2013吧\t280974\n27期\t280975\n葫芦娃儿歌\t280976\n新日暮里\t280977\n北齐\t280978\nClimbing\t280979\n度\t280980\n1413\t280981\n精灵世界\t280982\nJournals\t280983\n一个套\t280984\nHarness\t280985\n谢鑫\t280986\n许佳琪\t280987\n风雷\t280988\n大苗\t280989\n正版版\t280990\n勐拉维加斯\t280991\n360doc个人图书馆\t280992\n红血球\t280993\n坤宁宫\t280994\n爱尔眼科\t280995\n黄金概念股\t280996\n动态链接库\t280997\n全国人大常委会\t280998\n天伦\t280999\n幽灵船\t281000\n2017年5月12日\t281001\n四十载\t281002\n带形\t281003\n二十六岁\t281004\n41亿\t281005\n家政服务员\t281006\n环大西洋2\t281007\n运动史\t281008\nChange\t281009\n1\t281010\n奔驰g63\t281011\ngreat\t281012\n211院校\t281013\n自由意志\t281014\n广绣\t281015\n小学生作文网\t281016\nNORTH\t281017\n北京燃气公司\t281018\n助学贷款\t281019\nnerve\t281020\n深圳14号线\t281021\n高伦雅芙\t281022\n半包\t281023\n杯垫\t281024\n互质\t281025\n水准仪\t281026\n国寿康宁\t281027\n上海地铁15号线\t281028\n三屠\t281029\nScoring\t281030\n植萃\t281031\n银川市西夏区\t281032\n月息\t281033\n数字信号处理器\t281034\n保生\t281035\n船桨\t281036\n陆地\t281037\n转头\t281038\n南昌大学\t281039\n莱昂纳多\t281040\nVeeR\t281041\n_英雄联盟\t281042\n陕西省人民政府\t281043\nFlyAudio\t281044\ntenor\t281045\n热效\t281046\n四川保监局\t281047\n800kV\t281048\n收下\t281049\n文章页\t281050\n女总统\t281051\n正骨医院\t281052\n图类\t281053\n海西州人民政府\t281054\n唱戏机\t281055\n陈经\t281056\n韩粉乐园\t281057\nJQuery\t281058\nslides\t281059\n肋片\t281060\n含铅量\t281061\n碎末\t281062\n北京西城区\t281063\nNotice\t281064\n商标\t281065\n朱棣文\t281066\n亚青\t281067\n舞蹈学\t281068\n股四头肌\t281069\n毕业赠言_\t281070\n冤大头\t281071\n一百问\t281072\n列表单\t281073\n越秀国际金融汇\t281074\n优惠劵\t281075\n纳滤膜\t281076\n兰陵\t281077\n当地居民\t281078\n圆盘\t281079\n潘帕斯\t281080\n抽液\t281081\n卡斯\t281082\n吴村\t281083\n畿\t281084\n历尽\t281085\n谢海华\t281086\n5003\t281087\n镇海中学\t281088\n50项\t281089\n福州北路\t281090\n軽\t281091\n铁币\t281092\n30平方厘米\t281093\n孙亚两小无猜\t281094\n子串\t281095\n兵长\t281096\n沙州\t281097\n奇虎360\t281098\n老报\t281099\n北京市科学技术协会\t281100\n惊天大秘密\t281101\n行僧\t281102\n康得\t281103\nseekbar\t281104\n烧黑\t281105\n170度\t281106\n兰溪\t281107\n小标\t281108\n114企业网\t281109\n智多星\t281110\n下病\t281111\nlamination\t281112\n未来5天\t281113\n关门山\t281114\n83路\t281115\n万科劝学\t2811
16\n青溪\t281117\n黄庭禅\t281118\n载沣\t281119\nopposed\t281120\nhearts\t281121\n派斯\t281122\n_民福康健康\t281123\n李懿\t281124\n新闻编辑室\t281125\n髌骨\t281126\n2.7亿元\t281127\n慢性结肠炎\t281128\njmap\t281129\n三牌楼小学\t281130\n化龙桥\t281131\n病退\t281132\n知轩\t281133\n漩涡玖辛奈\t281134\n桌面版\t281135\nphysx\t281136\n纪传体通史\t281137\n编绘\t281138\n电生磁\t281139\n台方\t281140\n气象站\t281141\n易读\t281142\n周幽王\t281143\n温州地区\t281144\n玄石\t281145\n报红\t281146\n中廖村\t281147\n6023\t281148\n湖南消防网\t281149\n高圆圆\t281150\n北京圆明园\t281151\n缙哥哥\t281152\n顾渚村\t281153\nApkTool\t281154\n隧道股份\t281155\n互访\t281156\n浅深\t281157\n刺局\t281158\n变量值\t281159\n胡金铨\t281160\nChanges\t281161\n红色警戒3起义时刻\t281162\nlyj\t281163\n佛山日报数字报\t281164\n整流罩\t281165\n指指点点\t281166\n藜麦\t281167\n浅薄\t281168\n销售点\t281169\nrmarkdown\t281170\n曹禺\t281171\n刊名\t281172\nmasses\t281173\n一滴一滴\t281174\n865\t281175\n唇齿相依\t281176\n一两分钟\t281177\nWe\t281178\n日内交易\t281179\n控制图\t281180\nLIGHTING\t281181\n卡尔森\t281182\n殆尽\t281183\n田峰\t281184\n黑照\t281185\n青岛市妇女儿童医院\t281186\n译林牛津\t281187\n大姚\t281188\n紧轮\t281189\n岷山\t281190\n赣西\t281191\nECStore\t281192\n实习成绩\t281193\nInfosys\t281194\n化学发光法\t281195\n柯布\t281196\n原装\t281197\n莲塘\t281198\n葡萄石\t281199\n佐匹\t281200\n黄江琴\t281201\n竖起来\t281202\n房产信息网\t281203\nLike\t281204\n暴走大事件脑残广告\t281205\n南宁晚报\t281206\nexpired\t281207\n头发型\t281208\n三天后\t281209\n想起\t281210\n首幅\t281211\n第七位\t281212\n新芝\t281213\n不对板\t281214\n新车型\t281215\n礼堂椅\t281216\n农安县人民政府_吉林农安政府\t281217\n贵港\t281218\n小颖\t281219\n铺地\t281220\n汉传佛教\t281221\n屡试不爽\t281222\n潮安区\t281223\nmuye\t281224\n新田城\t281225\nヤフ\t281226\n新高速\t281227\nReporter\t281228\n美房网\t281229\nexadata\t281230\n礼嘉\t281231\n管行\t281232\n101家\t281233\n年际\t281234\n卞卡\t281235\n明源地产\t281236\nInitiative\t281237\n诗城\t281238\n工控板\t281239\n幻想版\t281240\n乌兰巴托的夜\t281241\n1.0.9\t281242\n千条\t281243\n村正\t281244\n非限定性净资产\t281245\n大庄镇\t281246\n泰康集团\t281247\n武汉华侨城\t281248\n工银e\t281249\n腰形\t281250\ntanjun\t281251\n示威游行\t281252\n戴西\t281253\nps蒙版\t281254\nmutate\t281255\n首成\t281256\n物网\t281257\n封隔器\t281258\n林峯\t281259\nRui\t281260\n秀域\t281261\n3局\t28126
2\n上海天佑医院\t281263\n王留美\t281264\n谷冬狮郎\t281265\n叫声\t281266\nboards\t281267\n进销存管理系统\t281268\nhiking\t281269\nwitha\t281270\n冲锋车\t281271\n补土\t281272\n萌宠\t281273\nxinyi\t281274\n雍正剑侠\t281275\n纪检委\t281276\n铝箔\t281277\n面点\t281278\n翔安\t281279\n知康\t281280\n破手\t281281\n齐齐哈尔医学院\t281282\n共鉴\t281283\n对撞机\t281284\nm240\t281285\n胃肠病学\t281286\n环球鞋网\t281287\n青明节\t281288\njffs\t281289\nresidue\t281290\n扬州开发区\t281291\n战神3\t281292\n金象\t281293\n长安欧尚长安CX702018款高清内饰中控图\t281294\n星光之恋\t281295\n番禺中心医院\t281296\njyc\t281297\n点符号\t281298\njdbcType\t281299\n森系\t281300\n肖丹\t281301\n黄牛肉\t281302\n薯条\t281303\n65张\t281304\nOrganize\t281305\n催泪\t281306\nMISC\t281307\nhed\t281308\n中央广播电视大学\t281309\ndlib\t281310\n孑\t281311\n闲情偶寄\t281312\n不共戴天\t281313\n九江新闻网\t281314\n满场\t281315\n348\t281316\n山歌好比春江水\t281317\nziji\t281318\nSprite\t281319\nscrollTop\t281320\npeda\t281321\n启动\t281322\n总裁爹地宠上天\t281323\nBAP\t281324\nSwisse\t281325\n古墓丽影1\t281326\n导叶\t281327\n小镜\t281328\n红花会贝贝\t281329\nGranny\t281330\n托物言志\t281331\n草药学\t281332\n西湖音乐喷泉\t281333\n超感警探\t281334\n雯梓\t281335\n吃货们\t281336\n辟邪\t281337\n迎合\t281338\n食品生产许可证\t281339\nNuts\t281340\n迪拜城\t281341\n今晚19:30\t281342\n千禧\t281343\n上海皮肤科医院\t281344\n棍\t281345\n露希尔\t281346\n青龙会\t281347\n16000元\t281348\n债权性\t281349\n菲律宾共和国\t281350\n23式\t281351\n35种\t281352\n201203\t281353\n酒精中毒\t281354\n聚首\t281355\n亲事\t281356\n活死人\t281357\n竺延风\t281358\n直补\t281359\n龙珺\t281360\n李小峰\t281361\n愧疚感\t281362\n胸腺嘧啶\t281363\n辽宁海城\t281364\n六七个\t281365\nAlex_sun\t281366\n造星\t281367\n002230\t281368\n街心公园\t281369\n宜城\t281370\n别惹佐汉\t281371\n北京市交通委员会运输管理局\t281372\n2005-2014年\t281373\nZemax\t281374\n常绿\t281375\n主贷\t281376\n工规\t281377\nautofill\t281378\n长滩岛\t281379\n天景\t281380\n中年\t281381\n一瞬间\t281382\n阿波\t281383\n价值型\t281384\n存档修改器\t281385\nPriority\t281386\n官册\t281387\n中讯\t281388\n人大三次会议\t281389\n团宴网\t281390\n上载\t281391\nKoi\t281392\n杜肯\t281393\n1.5.3\t281394\ndr\t281395\noraclesql\t281396\n维创\t281397\n珠海华润银行\t281398\n惠阳区\t281399\nword2000\t281400\n上海股交中心\t281401\n广播词\t281402\n三国志12威力加强版\t281403\n西成
高速铁路\t281404\n华域\t281405\n花絮版\t281406\njoker3\t281407\nFluids\t281408\ntop10\t281409\n参考消息\t281410\ninvert\t281411\n康桥镇\t281412\n尖山区\t281413\n马尔济斯\t281414\n幸福里\t281415\n使命召唤3\t281416\n番茄沙司\t281417\n火剧\t281418\nfoundation\t281419\nPersuasion\t281420\n450克\t281421\n7月10日\t281422\n团结出版社\t281423\n生逢灿烂的日子\t281424\n地貌\t281425\n拉出\t281426\n广西政法管理干部学院\t281427\n吴小华\t281428\n市教育局\t281429\nfocused\t281430\nトップペ\t281431\n消毒\t281432\n94年\t281433\n东晓\t281434\n百分之6\t281435\n磨毛\t281436\n金相组织\t281437\n詹妮弗\t281438\n微信运动步数\t281439\n扫雷地牢\t281440\nmaximize\t281441\n廿三里街道\t281442\nxdx\t281443\n450nm\t281444\nJavaBean\t281445\n绿盾\t281446\n通州湾\t281447\n高_寻医问药网\t281448\n三不\t281449\n置信度\t281450\nDOS攻击\t281451\nJMAP\t281452\n几家欢乐几家愁\t281453\n四十项\t281454\n雨林木风xp\t281455\n37亿元\t281456\n吉和网\t281457\n逃之夭夭\t281458\nKaiser\t281459\nasv\t281460\n劳资\t281461\n柜体\t281462\n福建中公教育\t281463\n路过\t281464\n治企\t281465\nkeytool\t281466\nthai\t281467\nwip\t281468\n加热片\t281469\n中国机\t281470\n红烧鸡\t281471\n丁文琪\t281472\n聚氨酯保温钢管\t281473\n扫光\t281474\nTAIWAN\t281475\n无水\t281476\nmonggodb\t281477\n有途\t281478\n秘书\t281479\nautumn\t281480\n临沂市委\t281481\n锋龙股份\t281482\n刘聪\t281483\n金券\t281484\n灭神\t281485\n四纵四横\t281486\n挂绳\t281487\n盲狙\t281488\nsysfs\t281489\nATEX\t281490\n悠嘻猴\t281491\n吴起\t281492\n中共广东省委党校\t281493\n洛阳镇\t281494\n李纳\t281495\n党内\t281496\n韩小希\t281497\n徐翔\t281498\n斯佳丽\t281499\ndata_\t281500\n黄刺玫\t281501\n雪痕\t281502\n回补\t281503\n不敢当\t281504\n山东省委办公厅\t281505\n锁边\t281506\n4004\t281507\n撒尿\t281508\n坚挺\t281509\n义井\t281510\n无尽对决\t281511\n广东品胜电子股份有限公司\t281512\n沈杰\t281513\nRS485接口\t281514\nrelatives\t281515\n纽约中央公园\t281516\n安热沙\t281517\n海南省住房公积金管理局\t281518\n老党\t281519\n93岁\t281520\n隶属度\t281521\n搅基\t281522\n_天姥家园\t281523\npakage\t281524\n药物临床试验\t281525\n中值\t281526\n河套地区\t281527\n金巧巧\t281528\n形针\t281529\n杜普蕾\t281530\n富玩\t281531\n甘露糖\t281532\n钦\t281533\n瑞虎3X\t281534\nninja\t281535\n会计分析\t281536\npuppet\t281537\nwangeditor3\t281538\nDiana\t281539\n来到这里\t281540\n姜晓\t281541\n墨雨烟夜\t281542\n高评分\t281543\nTOUS\t281544\n王小平\t281545\n马尔
康\t281546\n重温\t281547\n朦胧感\t281548\n轩辕传奇\t281549\n侄女\t281550\n李锦\t281551\np115\t281552\n万达电商\t281553\n巧巧手幼儿手工网\t281554\n黛莉娅\t281555\n调护\t281556\n棉条\t281557\njoo\t281558\n妙控\t281559\n投资银行部\t281560\n战舰少女R\t281561\n客气话\t281562\n3599\t281563\n炜衡律师事务所\t281564\n舞文\t281565\nvisia\t281566\n乌当\t281567\n人武\t281568\n天天PLC\t281569\n2卡顿\t281570\n建安工程\t281571\n高亭镇\t281572\nFLIGHT\t281573\n徐州机场\t281574\n烟道机\t281575\n1.04G\t281576\n澜沧江\t281577\n商务类\t281578\n1本\t281579\n2.24\t281580\n氘灯\t281581\n陈雅\t281582\ntruncation\t281583\n西瓜皮\t281584\n母乳\t281585\n华天电力\t281586\n今井勇太\t281587\n声速\t281588\n标致307\t281589\n自由派\t281590\n居间费用\t281591\n通率\t281592\n邵雨薇\t281593\n20160503\t281594\n检测试\t281595\n2.2%\t281596\n几架\t281597\n绝世天尊\t281598\n银行招聘网\t281599\n卵形\t281600\n2015—2020年\t281601\n京兆\t281602\n质量检验\t281603\n12kw\t281604\n骨片\t281605\n洛阳职业技术学院\t281606\n擦手\t281607\n大饭\t281608\n太昊陵\t281609\n布莱特\t281610\nkanban\t281611\n筹设\t281612\n姜承\t281613\n银谷在线\t281614\n91b1\t281615\nTeamDoc\t281616\nWestwood\t281617\n大家族\t281618\n冲刺赛车物语2\t281619\n深夏\t281620\nvision\t281621\n赛车\t281622\n中华保险\t281623\n所有物\t281624\n未解之谜\t281625\n赠票\t281626\n双边税收协定\t281627\ncubemx\t281628\n新光控股集团\t281629\nPropertyManager\t281630\n智慧\t281631\n送餐网\t281632\nDataBase\t281633\n徐琳\t281634\n喷雾仪\t281635\n开幕\t281636\n翠烟\t281637\n合肥经开区\t281638\n藏宝网\t281639\n中国东方资产管理公司\t281640\n四个太阳\t281641\n方老师\t281642\n20170302\t281643\n新埭\t281644\nHipp\t281645\n房窝窝网\t281646\n诚美\t281647\n幼稚园\t281648\n摸阴蒂\t281649\n蓝法\t281650\n开扒\t281651\n尤克里里指弹\t281652\n服装设计与工程\t281653\n重庆地产\t281654\nIngrid\t281655\n一担\t281656\n新标准大学英语视听说教程\t281657\n2元\t281658\n阴盛阳衰\t281659\n原产地\t281660\n麦仁\t281661\n人部\t281662\n锂片\t281663\n丽水学院\t281664\nJohn\t281665\n管孔\t281666\n2911\t281667\n基装\t281668\n压线钳\t281669\n池上\t281670\n掉水\t281671\n幽思\t281672\nドラマ\t281673\n行政管理工作\t281674\n插片机\t281675\n杨闻萍\t281676\n卧佛寺\t281677\n识骨\t281678\nLEA\t281679\nNIKE新浪\t281680\n粤剧\t281681\nKinky\t281682\n发询\t281683\n枣庄学院\t281684\nbcf\t281685\n黄林\t281686\n雀鳝\t281687\nlynk\t281688\n5迅雷\t281689\n王dm\t281690\n金盾
股份\t281691\ndialect\t281692\n概算编制\t281693\n40083\t281694\n安博诺\t281695\n饲养员\t281696\n喷油嘴\t281697\nkt\t281698\n41期\t281699\n1-12月份\t281700\n耐阴\t281701\n月牙湖\t281702\n郑州理工职业学院\t281703\n火影忍者究极\t281704\n接插件\t281705\n中国商业联合会\t281706\nCOURT\t281707\nInvest\t281708\n神勇武工队传奇\t281709\n链刃\t281710\n24年前\t281711\n周琪\t281712\n第三十二次\t281713\n延政勋\t281714\n上古十大神兽\t281715\n徐锋\t281716\n继续教育\t281717\n科特\t281718\n62条\t281719\n天河中学\t281720\n100D\t281721\n囚禁\t281722\n标题\t281723\n零轨\t281724\n注册消防工程师考试\t281725\n花魁\t281726\n树图\t281727\n工商行政管理机关\t281728\n_首商网\t281729\n源程序代码\t281730\n血瓶\t281731\n平衡记分卡\t281732\n走开\t281733\n双税\t281734\nphotography\t281735\nsison530\t281736\n听涛\t281737\n肾气丸\t281738\n陈金龙\t281739\nSELECTED\t281740\nyhboys\t281741\n张建国\t281742\n海口路\t281743\nmoto\t281744\n上海政府采购网\t281745\n牛奶咖啡\t281746\n矫正\t281747\n七海\t281748\n大页\t281749\n阳湖\t281750\n隆德路\t281751\nu17\t281752\n雄鹰岭\t281753\n艾迪尔\t281754\nregedit\t281755\n陈翔六点半球\t281756\n3月1日起\t281757\n举报电话\t281758\n跃跃\t281759\n构件\t281760\n彭懿\t281761\n许慎\t281762\n钙镁\t281763\n菩提洞\t281764\n天津市城乡建设委员会\t281765\n大头条\t281766\n中国海诚\t281767\n呆板\t281768\n香气\t281769\n第十七集\t281770\n总编\t281771\nWER\t281772\n中润国力\t281773\n坝顶\t281774\n文汇路\t281775\nTFB\t281776\n起包\t281777\n倪大红\t281778\n美素佳儿奶粉\t281779\n20170903\t281780\n塞尔达传\t281781\n眼妆\t281782\n自学习\t281783\ncAD\t281784\n芹菜\t281785\n免费小说网\t281786\n6千克\t281787\n红博\t281788\njbc\t281789\n天翻地覆\t281790\n各界人士\t281791\n星符\t281792\n排行表\t281793\nBrawl\t281794\nGTS\t281795\n普联\t281796\n020|\t281797\n短篇\t281798\n化工厂\t281799\n红男\t281800\n映山红\t281801\n燃煤自备电厂\t281802\n初哥\t281803\niOS8.3\t281804\n影史\t281805\n郯城网\t281806\nsvc\t281807\n6.0版\t281808\n意趣\t281809\nvue-cli\t281810\n3MT\t281811\n住宅小区\t281812\n江南大道南\t281813\nPSCS6\t281814\n金银器\t281815\n矿化度\t281816\n科廷大学\t281817\n畏罪\t281818\n安全协议\t281819\n休伯利安\t281820\n中共中央政治局民主生活会\t281821\n拉丝粉\t281822\nmeishi\t281823\n秦汉新城\t281824\n街拍\t281825\n洪水猛兽\t281826\n橡子\t281827\n蔡正元\t281828\nTuring\t281829\n损友\t281830\n_沃游网www.woyoo\t281831\n映象网\t281832\nArcGis\t281833\n北大西洋\t281834\n郊铁路
\t281835\n交通网\t281836\n影之刃2\t281837\n恩施市\t281838\n精神性\t281839\n明星大侦探第二季\t281840\n启动器\t281841\n大男当婚\t281842\n淳中科技\t281843\n泰达股份\t281844\n文后参考文献著录规则\t281845\n美素\t281846\n佳能TS3180\t281847\n因果性\t281848\n呼兰\t281849\n花年\t281850\n凯塔\t281851\n枚举值\t281852\n封箱胶\t281853\n考勒\t281854\n韩火火\t281855\n雅词\t281856\nsatwe\t281857\n80型\t281858\nuestc\t281859\n体检表\t281860\n静脉曲张\t281861\nRaising\t281862\n夕焼\t281863\nbilibili毒瘤\t281864\n南瓜汤\t281865\n专机\t281866\n公明\t281867\n27名\t281868\n迪尔\t281869\n第一方\t281870\n情业\t281871\n螃蜞\t281872\n超五类网线\t281873\n人才辈出\t281874\nAccu\t281875\nbehave\t281876\n香港六合马会\t281877\n李泽楷\t281878\n会计信息系统\t281879\n発射\t281880\n天津理工\t281881\n职业者\t281882\nNanyang\t281883\n500英里\t281884\n杨青柠\t281885\n出色\t281886\n活动性\t281887\nIdeapad\t281888\nfragrance\t281889\n风湿\t281890\n系统展\t281891\nNine\t281892\nsak\t281893\n六壬\t281894\n丝锥\t281895\n年尾\t281896\nzyf\t281897\n解毒\t281898\n学科\t281899\nOTL\t281900\n消灾\t281901\n复写纸\t281902\n中元组\t281903\n林浩\t281904\naustria\t281905\nt10\t281906\n坏账\t281907\nifcfg-eth0\t281908\n易果生鲜\t281909\n欧王庄\t281910\nMineral\t281911\n终局\t281912\nmput\t281913\n靶框\t281914\n被关闭\t281915\nClusters\t281916\n通山\t281917\n万达华府\t281918\n集美中学\t281919\nExplanation\t281920\n磨子桥\t281921\n戈洛夫金\t281922\n门座式\t281923\n云监控\t281924\nPlanes\t281925\n绍兴市卫生局\t281926\n淡化\t281927\n自动关机\t281928\n7.29\t281929\n明见\t281930\n溢流管\t281931\n表兄\t281932\nsavage\t281933\n智创\t281934\n扬子晚报网\t281935\n给党听\t281936\nweloop\t281937\nwwwx\t281938\n美敦力\t281939\n吃鸡\t281940\n起跑线儿歌网\t281941\nMXPlayer\t281942\n智能组网\t281943\n千屿\t281944\n猫屎\t281945\n书画\t281946\n广州公交查询网\t281947\n价值工程\t281948\njessi\t281949\n湖北之行\t281950\n俨\t281951\n附答\t281952\nSystèmes\t281953\nPays\t281954\n施用\t281955\n变量方法\t281956\n电场强度\t281957\n慎小嶷\t281958\n未孕\t281959\n海亮\t281960\n电脑输入法\t281961\n实字\t281962\n心脏处\t281963\nfor循环\t281964\n天融信\t281965\nquidway\t281966\n铝\t281967\n引动\t281968\nAnalytics\t281969\n皮蓬\t281970\n施工期\t281971\n姜建春\t281972\n通臂\t281973\n究极绿宝石\t281974\n擦片\t281975\npb2\t281976\n吟\t281977\n选选\t281978\n暖山\t281979\nIndent\t28
1980\n鸿运\t281981\n上海签证中心\t281982\n小营村\t281983\n五少\t281984\ncdr4\t281985\n首充\t281986\n会阴处\t281987\n钵仔糕\t281988\nFIXED\t281989\nPCbaby\t281990\n宁溪\t281991\n老城隍庙\t281992\n2055\t281993\n下三角矩阵\t281994\n宜业\t281995\nsncf\t281996\n兴仁县\t281997\n亚信联创\t281998\n办公室业务\t281999\n派头\t282000\n政府采购招标网\t282001\n定向安置房\t282002\n小白版\t282003\n堆积物\t282004\n廉江市\t282005\n红椿木\t282006\n干女\t282007\ndeliberate\t282008\n讲义气\t282009\n冰点还原精灵\t282010\n薅羊毛\t282011\n东单\t282012\n合心\t282013\nKaraoke\t282014\n12.0.0\t282015\n标的资产\t282016\n4月30日\t282017\n会长大\t282018\n集散\t282019\n80支\t282020\n武穆\t282021\n试玩\t282022\n十堰日报\t282023\n国有农场\t282024\nLEDE\t282025\n市公路局\t282026\n股票型基金\t282027\n3两个\t282028\n4.2.8\t282029\nWEG\t282030\n缩圈\t282031\n张爱\t282032\n利拉鲁肽\t282033\n哈士\t282034\n黄志坚\t282035\n茅山捉鬼人\t282036\n迷渡\t282037\n魔兽历史吧_\t282038\n2700\t282039\n头象\t282040\n胶痕\t282041\n大丸百货\t282042\n梁毅\t282043\n72小时\t282044\nboc\t282045\nIEC61850\t282046\n入怀\t282047\n归巢\t282048\nPorto\t282049\n虚机\t282050\n厚礼\t282051\n520听书网\t282052\nm.3dmgame.com\t282053\n吼叫\t282054\n巨猫\t282055\n托克逊县\t282056\n溪望\t282057\n全装\t282058\n娄江论坛\t282059\n像头\t282060\nEM10\t282061\n陆星\t282062\n200首\t282063\n亲父\t282064\n最囧挑战2\t282065\n黄有璨\t282066\n修形\t282067\n紫外光\t282068\n对夹式蝶阀\t282069\n中冶置业\t282070\n陕西汽车控股集团有限公司\t282071\n2018年内\t282072\n三十一\t282073\njune\t282074\n泪痕\t282075\n通标\t282076\n遇上\t282077\n校庆\t282078\n福田政府\t282079\n贝前列素钠片\t282080\n100多位\t282081\n蒋乐\t282082\n7040\t282083\n有限公司\t282084\n数人\t282085\nDELETE\t282086\n周源\t282087\n萧何\t282088\n设置方\t282089\n菁纯\t282090\n抛物型\t282091\n琴箫\t282092\nlearning\t282093\n小二\t282094\nRadar\t282095\n四十五度\t282096\nWynn\t282097\n门楼\t282098\n不得已\t282099\nlately\t282100\n异度幻世篇\t282101\ninitiatives\t282102\n查看\t282103\n韩国演艺圈悲惨事件\t282104\nnport\t282105\n小诗\t282106\nxDrive\t282107\n三相三线\t282108\n折纸蚂蚁\t282109\n乔灵儿\t282110\n小红绳\t282111\n线芯\t282112\n五征集团\t282113\n河北地税\t282114\n暴走恐怖故事\t282115\n群星城\t282116\n萧声\t282117\n国产凌凌漆\t282118\n大话免费版\t282119\n北京地铁房山\t282120\n南辰\t282121\n杨梅酒\t282122\nEsc\t282123\n彼时\t282124\nhm2\t282125
\n东莞市社会保障局\t282126\n楼外楼\t282127\n掉话\t282128\n初创型\t282129\n刘海柱\t282130\n王慧敏\t282131\n蜜粉\t282132\nCry\t282133\n魏刚\t282134\n天语手机\t282135\nareo\t282136\n八宝山革命公墓\t282137\n记忆碎片\t282138\n二手车城\t282139\nU20\t282140\n汝河\t282141\n佟\t282142\n苏格兰场\t282143\n唐琳\t282144\n雾器\t282145\n顺讯\t282146\n略地\t282147\n硕士点\t282148\nImmunity\t282149\n汤臣一品\t282150\nyuzu\t282151\n凤囚凰\t282152\n迈凯伦P1\t282153\n大脚丫\t282154\n红磡\t282155\n走上不归路\t282156\n妖蝠\t282157\n隔振器\t282158\nAMY\t282159\n许昌网\t282160\nc^2\t282161\n单元素\t282162\n御女天下\t282163\n12包\t282164\nJT\t282165\n果乐\t282166\nathlete\t282167\nWord2003\t282168\n1.1.4\t282169\n弹垫\t282170\nluts\t282171\n张笑\t282172\n冰菓\t282173\n投票率\t282174\n王志敏\t282175\nwhatever\t282176\n天天有喜2\t282177\n大黄素\t282178\n1917\t282179\n闲听\t282180\n刘政\t282181\n十八万\t282182\n妮可基德曼\t282183\n众友\t282184\n王尔德\t282185\n21卷\t282186\n464\t282187\nformidable\t282188\n建丰\t282189\n复发性口腔溃疡\t282190\n超炫\t282191\n书目\t282192\nbreast\t282193\n张成\t282194\n沈园\t282195\nqiqi\t282196\nGAV\t282197\nIDC数据中心\t282198\n嘉善县人民政府\t282199\n不具有\t282200\ngog\t282201\n气缸套\t282202\n米拉杰\t282203\n单身男女2\t282204\ndiv\t282205\n7900\t282206\nthompson\t282207\n97分钟\t282208\nb站up\t282209\n音容\t282210\n腿肉\t282211\nA53\t282212\n电脑屏\t282213\nv2.6.0\t282214\n波斯波利斯\t282215\n一绪\t282216\nMMPI\t282217\n519\t282218\n西藏自治区人力资源和社会保障厅\t282219\n2016.5\t282220\n电枪\t282221\n金瞳\t282222\nActa\t282223\n提摩\t282224\n406\t282225\n3司\t282226\n较量\t282227\njingang\t282228\n二十张\t282229\n石林景区\t282230\n常州高新区\t282231\nBrent\t282232\n夜咳\t282233\n深圳万科\t282234\n涡轮\t282235\n声名鹊起\t282236\n专业主义\t282237\n男性病\t282238\n光明城\t282239\n矾土\t282240\n取力\t282241\n袁鹏\t282242\nlishen\t282243\n禁药\t282244\nKindEditor\t282245\n数码宝贝赛博\t282246\n三氟甲磺酸\t282247\nHKS\t282248\n大天使之剑H5\t282249\n吹吹\t282250\n上调\t282251\n内蒙古自治区交通运输厅\t282252\n98平米\t282253\n巨蜂\t282254\n7a\t282255\n弯管\t282256\n李文辉\t282257\n资阳区\t282258\n1.6吨\t282259\n梁凤仪\t282260\n长女\t282261\n陕西省委\t282262\n反应堆\t282263\n293T\t282264\n创启\t282265\n鍵\t282266\nVLX\t282267\n安标\t282268\n跑团\t282269\n伊豆\t282270\n860\t282271\n优漫卡通\
t282272\n合办\t282273\n数粒机\t282274\n郑州市经济技术开发区\t282275\n10.50.1600\t282276\n斜疝\t282277\n余艳红\t282278\nqcon\t282279\n牛顿插值法\t282280\n小米5#\t282281\n温汤镇\t282282\naaron\t282283\n席间\t282284\ng3900\t282285\n姜丹尼尔\t282286\n一碗\t282287\n横排版\t282288\n复印\t282289\nhydrochloride\t282290\nstandardized\t282291\n3.8万\t282292\nWarriors\t282293\n神垕古镇\t282294\nsklearn\t282295\n进口税\t282296\n中山广场\t282297\n重庆化工职业学院\t282298\n根香\t282299\n梁钢\t282300\nParliament\t282301\n伊利股份\t282302\n墙线\t282303\n科帮网\t282304\n全微分\t282305\n三通阀\t282306\n新天龙八部OL\t282307\n36篇\t282308\n陆兆禧\t282309\npawn\t282310\naomei\t282311\n后脸\t282312\n人方\t282313\nScrolling\t282314\nTDS\t282315\n荆霄鹏\t282316\n押证\t282317\n产业化\t282318\n起亚k9\t282319\n海南省人民政府\t282320\n1-6\t282321\n仙剑2\t282322\n作业证\t282323\n平安社区\t282324\n丰润\t282325\nNights\t282326\nCenter\t282327\n大富翁7\t282328\n53kf\t282329\n点污\t282330\n/style\t282331\nPPT版\t282332\n太极八卦图\t282333\n奈克瑟斯\t282334\n_吴江政府网\t282335\n钻工\t282336\n节拍\t282337\n椰壳\t282338\n.net4.6\t282339\n蓝豆\t282340\n智联型\t282341\n黄山杯\t282342\n拳皇2012\t282343\n小人偶\t282344\n浙江省统计局\t282345\n梁任公\t282346\n酶制剂\t282347\n茅善玉\t282348\n花旗参茶\t282349\njQuery+HTML5\t282350\n3U\t282351\n南头古城\t282352\n六维空间\t282353\n真事\t282354\n珍视明滴眼液\t282355\n战神2\t282356\n彭山县\t282357\n掌门\t282358\n沃游网\t282359\n浙大城院\t282360\n心血管疾病\t282361\n中华人民共和国公务员法\t282362\n不干涉\t282363\n死讯\t282364\n思想\t282365\nDCOS\t282366\n华侨城\t282367\n多潘立酮片\t282368\n哈尔滨金融学院\t282369\n第18卷\t282370\n风平浪静\t282371\n才够\t282372\n甘肃省食品药品监督管理局\t282373\nmimi\t282374\n益智\t282375\n中茶\t282376\n西藏电视台\t282377\n图带\t282378\n牛叉\t282379\n暖宝\t282380\n科雷傲\t282381\n轮滑鞋\t282382\n棱柱体\t282383\n嘉铭\t282384\n玉田县\t282385\nrgbd\t282386\n盆友圈\t282387\n搓饵\t282388\n整训\t282389\n0.25.6\t282390\n腾创\t282391\n三自\t282392\n依依不舍\t282393\ndatetimepicker\t282394\n国土资源部信息中心\t282395\nSSL\t282396\n九岁\t282397\n油亮\t282398\nhungry\t282399\n黄片儿\t282400\n分单\t282401\n天量\t282402\n讯通\t282403\n御景园\t282404\n曼娜回忆录\t282405\nxinan\t282406\n密西西\t282407\n西塘\t282408\n高特佳\t282409\ndnf百花吧\t282410\n今日镇海数字报\t282411\nVisibility\t282412\n曾之乔\t2
82413\n720P版BD1280超清\t282414\nJus\t282415\n周道许\t282416\nIMPDP\t282417\nlucia\t282418\n金圣叹\t282419\n巴洛\t282420\n二手市场\t282421\n旧城区\t282422\n晓之女神\t282423\n国际单位制\t282424\n4dm\t282425\n手术室\t282426\nMarketplace\t282427\n七彩虹网驰\t282428\n5400转\t282429\n三阶魔方\t282430\n260\t282431\nofx\t282432\n辛金\t282433\n弧圈\t282434\nKITE\t282435\n岚山\t282436\n失业者\t282437\n二硫化硒\t282438\n展宏图\t282439\n罗汉床\t282440\nbaxter\t282441\n银牙\t282442\njiaowu\t282443\n触摸事件\t282444\n第103期\t282445\n纪念亭\t282446\n三合村\t282447\n4时\t282448\n12页\t282449\n看不清楚\t282450\n自聘\t282451\n错怪\t282452\n昆曲牡丹亭\t282453\nCOCA\t282454\n火把\t282455\n今生缘\t282456\npossess\t282457\n八月一日\t282458\n金珠玛米\t282459\ntolerance\t282460\n散客\t282461\nReserva\t282462\napt-repository\t282463\n干燥处\t282464\n朱泥壶\t282465\n爱花沙也\t282466\n遗赠\t282467\nPaco\t282468\n闽北\t282469\n乙班\t282470\n乐读窝\t282471\n梦100\t282472\n胡闹\t282473\nMiniport\t282474\n扩展坞\t282475\n杨飞\t282476\n没打算\t282477\n正片—大陆\t282478\n网战\t282479\n14.9\t282480\nPLY\t282481\n喇叭河\t282482\n被提名\t282483\n福建大学\t282484\n李春波\t282485\n乾隆大藏经-地藏论坛\t282486\n五联单\t282487\n通用运费网\t282488\n长春市人民政府\t282489\n无定型\t282490\n木床\t282491\n遵章\t282492\n利世\t282493\n华菱钢铁\t282494\naimer\t282495\n云从\t282496\n小鸽子\t282497\n水利水电学院\t282498\nnkwy\t282499\n有法\t282500\n散打王\t282501\n下浮率\t282502\n廉洁\t282503\n科普\t282504\n导热胶\t282505\n喜年\t282506\n租房子住\t282507\naffordable\t282508\n家庭组\t282509\n车闸\t282510\n过敏\t282511\n钴酸锂电池\t282512\n胶带\t282513\n3d软件\t282514\n教育基地\t282515\nFood\t282516\nprospectus\t282517\n02式\t282518\nalbum\t282519\n国金黄金\t282520\n12%\t282521\n奥克\t282522\n一呼一吸\t282523\n陆秀夫\t282524\n鬼泣4吧_\t282525\n滴点\t282526\n玩不懂\t282527\ndoa\t282528\n吴堡\t282529\n纪翔\t282530\n育苗班我爱菜园网\t282531\n大庆电视台\t282532\n羽月\t282533\nQQ企业邮箱\t282534\n编码\t282535\n悠着\t282536\n张曼玉\t282537\nvalentine\t282538\nChipset\t282539\n成业\t282540\n爱彩网\t282541\n史莱姆娘\t282542\n供卵\t282543\n面光\t282544\n税额\t282545\n战舰\t282546\n不育症\t282547\n刺客列传2\t282548\n开球网\t282549\n伝説\t282550\n硝酸亚铁\t282551\n苏莱曼\t282552\ncfosspeed\t282553\nslag\t282554\n打完\t282555\n42p\t282556\nhyddd\t282557
\n双灯\t282558\n曹操墓\t282559\n溃烂\t282560\n18040\t282561\n天津一中院\t282562\n营业税金\t282563\n黄金T+D\t282564\n凯奇\t282565\nattitudes\t282566\n紫云山\t282567\n商丘市政府\t282568\n快捷键\t282569\n完整个\t282570\n拿错\t282571\n20018\t282572\n50届\t282573\nweb应用程序\t282574\n御坂妹妹\t282575\n虚境\t282576\n报帐\t282577\nchallenging\t282578\nWFP\t282579\n专项\t282580\n小龙湾\t282581\n眉峰\t282582\n服装包\t282583\n狗箱\t282584\n登科\t282585\n紫装\t282586\n1875年\t282587\n16份\t282588\n主群\t282589\n留底\t282590\n单水\t282591\n重于泰山\t282592\n7250\t282593\n本机\t282594\n攻无不克\t282595\npacbio\t282596\n対魔忍\t282597\n铅弹\t282598\n护脚\t282599\nOpenOffice\t282600\n葫\t282601\n道途\t282602\n军火箱\t282603\n张青云\t282604\n蓬蒿人\t282605\n战场女武神2\t282606\n图识\t282607\n示廓灯\t282608\n马蒂奇\t282609\n碎花裙\t282610\n丁奇\t282611\nlod\t282612\nv6.1.1\t282613\naep\t282614\ncmu\t282615\nmosso\t282616\n不赞\t282617\n阿巴斯\t282618\n如饥似渴\t282619\nHAY\t282620\n榴莲肉\t282621\n4期\t282622\n龙观乡\t282623\n小钉\t282624\n桂林生活网新闻中心\t282625\n撕票\t282626\n腘窝囊肿\t282627\n徐友刚\t282628\nRBM\t282629\n绍兴县\t282630\n肥厚型\t282631\n福州森林公园\t282632\nCorona渲染器\t282633\nVulkan\t282634\n大兴调查\t282635\n德瑞\t282636\n吴晓辉\t282637\n9.02\t282638\n张秀兰\t282639\n江河日下\t282640\n395\t282641\n500年后\t282642\n久别\t282643\n不可及\t282644\n平凡之路吉他谱\t282645\n科技感\t282646\nHOTTOYS\t282647\n复仇者\t282648\n悦康药业\t282649\n4月5\t282650\n行号查询网\t282651\n道兰\t282652\ngree\t282653\n淘宝客联盟\t282654\n资产负债\t282655\n海富通\t282656\n石龟\t282657\n迂腐\t282658\npyzheng\t282659\nSpring\t282660\nspyder\t282661\n无Internet\t282662\ndish\t282663\n猝不及防\t282664\nguop\t282665\n100w\t282666\n1080p/720p\t282667\n冯建宇\t282668\ndoubt\t282669\n第184集\t282670\n余热\t282671\n优游网我的咖啡厅\t282672\n28路\t282673\n玩转\t282674\n嘹\t282675\n首经贸\t282676\nsql2000数据库\t282677\n校园堂\t282678\n泛华集团\t282679\n干品\t282680\n弄玉\t282681\n民企人才网\t282682\n)电子有限公司\t282683\n紧身\t282684\n圆洞\t282685\n桃花世界\t282686\n意如\t282687\nCostumes\t282688\n胀痛感\t282689\n泽村\t282690\n用户级\t282691\n中产\t282692\n军转民\t282693\n无问\t282694\n宏业\t282695\n血法\t282696\n南通市卫计委\t282697\n珑骧\t282698\nKaf\t282699\n快麦\t282700\n固定登车桥\t282701\n诸葛镇\t282702\n赤魔法师\t
282703\n2210\t282704\n江汉区\t282705\ndiv+css\t282706\nut\t282707\n污版\t282708\n宣州谢朓楼饯别校书叔云\t282709\n蔡子明\t282710\nstars\t282711\n杰志\t282712\n济南历下政府网\t282713\n回梦游仙\t282714\nPosters\t282715\nタクシ\t282716\n第2辑\t282717\n张颌\t282718\n闺秀\t282719\nzac\t282720\n燕山大学里仁学院\t282721\n不啻\t282722\n韩国\t282723\nSMW工法桩\t282724\n国电南瑞科技股份有限公司\t282725\n摩登\t282726\n2013年1月\t282727\n薛金星\t282728\n过家家\t282729\n萧内网_萧山论坛\t282730\n南昌市食品药品监督管理局\t282731\n广州住房公积金管理中心\t282732\n百善镇\t282733\n遂溪县人民政府\t282734\n清凉门大街\t282735\nInterval\t282736\n腔内\t282737\n校企合作协议书\t282738\nDNF艾肯传说\t282739\n雁田村\t282740\n龙草\t282741\n250w\t282742\n糊涂侦探\t282743\n麻衣\t282744\nconstrain\t282745\n三弟\t282746\n中国研究生招生信息网\t282747\n纪实类\t282748\nstun\t282749\nStein\t282750\n生电\t282751\n忆旧\t282752\n流落\t282753\n舰队collec\t282754\nopp\t282755\n风人\t282756\n8.5分\t282757\nPayment\t282758\nfuckin\t282759\n太平洋百货\t282760\n字板\t282761\n换元积分法\t282762\n13-25周\t282763\n辞世\t282764\n南广学院\t282765\n11月份\t282766\n快速票\t282767\n宋仁宗\t282768\n储蓄\t282769\n跳崖\t282770\n津川\t282771\n小孤山\t282772\n上海沪工阀门厂(集团)有限公司\t282773\n⒋\t282774\n行山\t282775\n自舒\t282776\nPTZ\t282777\n机关党委\t282778\n水管\t282779\n搪瓷反应釜\t282780\n造化帝尊\t282781\nMy97日期控件\t282782\n3dsmax2013\t282783\n雄兵\t282784\n谈歌\t282785\n景迈\t282786\n广州市第四中学\t282787\n事预则立\t282788\n环科园\t282789\n快球\t282790\n政采云\t282791\n换题\t282792\n广汉\t282793\n周期性行业\t282794\n国名\t282795\n临山\t282796\n电洛铁\t282797\n二十周\t282798\n车丝\t282799\n剑域\t282800\n到底\t282801\nlocaltion\t282802\n广州国际金融中心\t282803\n螺杆式空压机\t282804\n湖南省地质矿产勘查开发局\t282805\n空出来\t282806\n重度\t282807\n开方计算器\t282808\n金斯利安多维片\t282809\n知多\t282810\nMicroKMS\t282811\n李长乐\t282812\ncelebrate\t282813\n国家能源局\t282814\n16秒\t282815\n善意的谎言\t282816\n練\t282817\n节支\t282818\nDAZZLE\t282819\n学琴\t282820\n凤凰评论\t282821\n梦幻国度\t282822\n18045\t282823\n字谜\t282824\n费县\t282825\nlaser\t282826\n丸子头\t282827\nColombia\t282828\n青石路\t282829\n龚老师\t282830\n18克\t282831\n未决诉讼\t282832\n瞿海\t282833\n水煮虾\t282834\n2000首\t282835\n601519\t282836\n重庆高新区\t282837\n刘桂娟\t282838\n原本以为\t282839\n净月潭国家森林公园\t282840\n油迹\t282841\nHierarchy\t
282842\n蹑\t282843\n形象片\t282844\n牛腩\t282845\n昕锐七界武神\t282846\n后期\t282847\n荔枝派\t282848\nquerySelector\t282849\n晕头转向\t282850\n邦购网\t282851\n红心\t282852\nheapster\t282853\n微货\t282854\n温家宝\t282855\n死猪\t282856\n上止点\t282857\n旅游户\t282858\n三峡新材\t282859\n墨玉\t282860\n菲涅耳\t282861\n荣耀史\t282862\n副局级\t282863\n体系化\t282864\n起起伏伏\t282865\n同韵\t282866\n肖云\t282867\n风门\t282868\nRouting\t282869\n海容模块\t282870\n未愈\t282871\n磨砂机\t282872\n造物主\t282873\n啼笑皆非\t282874\n朝元\t282875\nOlivier\t282876\n初恋情人\t282877\n刘瑶\t282878\n干笋\t282879\n武夷花园\t282880\n达阵\t282881\n图乙\t282882\n海埂\t282883\n李思琪\t282884\n比利时鲁汶大学\t282885\nrenderer\t282886\n中国民办大学\t282887\n唔愛\t282888\n橙锤\t282889\ngrief\t282890\n新千岁机场\t282891\n气泵\t282892\nGTX770\t282893\n21.6\t282894\n10月9日\t282895\n王云霞\t282896\n内裤\t282897\nGTP中文网\t282898\n包版\t282899\n11.17\t282900\n铅玻璃\t282901\nqq表情包\t282902\n下环\t282903\n毛豆\t282904\n自卑感\t282905\n大自驾\t282906\n南京雨花\t282907\nX-T1\t282908\n辐角\t282909\n南平市人民政府\t282910\nQ235A\t282911\n听证\t282912\n9.2.3\t282913\nKCl\t282914\n原始股\t282915\n菇水\t282916\n联邦制\t282917\nSERVER\t282918\n英菲尼迪q50\t282919\n游学营\t282920\n拍案而起\t282921\n学UI网\t282922\n受够了\t282923\n招娣\t282924\nminute\t282925\n指挥家\t282926\n欧式电视背景墙\t282927\n斩龙呐喊\t282928\n疱疹性咽峡炎\t282929\n魑魅魍魉\t282930\n免费小说阅读|凤凰书城\t282931\nastrill\t282932\n致穷经\t282933\n第37集\t282934\n商务机\t282935\n膻味\t282936\nIFS\t282937\n铜鼓岭\t282938\nJavaWeb\t282939\n0378\t282940\n予感\t282941\n出牙\t282942\n解局\t282943\n疏水阀\t282944\nAstrill\t282945\n三族\t282946\n图像库\t282947\n河口乡\t282948\n社会保障卡服务网\t282949\nPosition\t282950\n堕落街\t282951\n生香\t282952\n全单\t282953\nbjs\t282954\n剪跨比\t282955\n勋章菊\t282956\n凸函数\t282957\n本二\t282958\n朝贡国\t282959\n狂者\t282960\npayoneer\t282961\n大胃王猫\t282962\n老挝\t282963\nmame模拟器\t282964\nfau\t282965\nsangfor\t282966\n脸颊\t282967\n皮薄\t282968\n天山股份\t282969\n唐立\t282970\n猿声\t282971\n灵隐寺\t282972\nlolDragon\t282973\n毕生\t282974\n18万元\t282975\n时间函数\t282976\n王竹梅\t282977\n第七十\t282978\n许华\t282979\n国品\t282980\n宋逸民\t282981\n审核\t282982\n聚苯硫醚\t282983\n复膜\t282984\n锦衣卫\t282985\n官商\t282986\n福特汉姆大学\t282987\n预祝\t
282988\n换向\t282989\n美智\t282990\n明年9月\t282991\ntre\t282992\n派代\t282993\n燃油泵\t282994\n让座\t282995\nzn\t282996\n奇峰\t282997\nhca\t282998\n键值\t282999\n难字\t283000\n北京奥体中心\t283001\nStussy\t283002\n亿佳\t283003\n堵死\t283004\nhtml代码库\t283005\n李云峰\t283006\n刘拓\t283007\n生活版\t283008\n极品飞车11\t283009\n德保\t283010\nUISwitch\t283011\nrang\t283012\n心不死\t283013\n织梦采集侠\t283014\n数据线\t283015\n亲儿\t283016\n跳操\t283017\n山包\t283018\ne袋\t283019\n九十九顶\t283020\nknight\t283021\n位子\t283022\nShopify\t283023\noracledb\t283024\n凰香奈芽\t283025\n増\t283026\n阴极电泳\t283027\n布片\t283028\n皂粉\t283029\n平果县\t283030\nnode.js\t283031\nokhttp3\t283032\n薄荷绿\t283033\nxingyu\t283034\n零百\t283035\n青川政府网\t283036\nwill\t283037\n孙涛\t283038\nStable\t283039\n适用型\t283040\n2k9\t283041\n一速\t283042\n杀出\t283043\nPS笔刷\t283044\n.org\t283045\n邪典\t283046\nKaby\t283047\ninsurgency吧\t283048\n2平方\t283049\n辍学\t283050\n法治进行时\t283051\n省力型\t283052\n影响线\t283053\n操作规范\t283054\n海棠依旧\t283055\n纽约大都会艺术博物馆\t283056\n不好过\t283057\n感伤\t283058\n超级演说家第二季片花\t283059\nAsk\t283060\n海南海政招标有限公司\t283061\n长白山景区\t283062\n二道街\t283063\n返生\t283064\n侦听\t283065\n桔红色\t283066\n302医院\t283067\nUVW\t283068\n2-3个月\t283069\n舰机\t283070\n迷你世界手机版\t283071\n南沙河\t283072\n青岛401医院\t283073\ncruel\t283074\n蔷薇少女\t283075\n起印\t283076\n南通狼山\t283077\nErdas\t283078\nxxggy\t283079\nhiberante\t283080\n关键词优化\t283081\n微易\t283082\n配合物\t283083\nHITMAN\t283084\n服装设计与工程专业\t283085\n玉潭镇\t283086\n收费处\t283087\nReid\t283088\n123届\t283089\n避风塘\t283090\n别克新英朗\t283091\n上司\t283092\n仙女楼\t283093\n兰州大学图书馆\t283094\n四十天\t283095\n有无痛分娩\t283096\n市城建委\t283097\n山东医药\t283098\n端源\t283099\n萧瑟\t283100\n木鱼哥\t283101\n天锦\t283102\n智云稳定器\t283103\n拜火教\t283104\n第6届\t283105\n虎门高铁站\t283106\nrbf\t283107\n姬魔恋战纪\t283108\n李宇恩\t283109\n海南电信\t283110\n世界服装鞋帽网\t283111\nJPG转换器\t283112\n1000单\t283113\n费莱尼\t283114\n花冈\t283115\n上教版\t283116\n中植系\t283117\n无辜的人\t283118\n精加工\t283119\n下木\t283120\n1234567.com.cn\t283121\nIPv6站\t283122\n南郡\t283123\n何娟\t283124\nUrban\t283125\n覆盖件\t283126\n医健\t283127\nXiangyu\t283128\n渔阳\t283129\n蒙山国家森林公园\t283130\n卷绕机\t28
3131\n二级C语言\t283132\n防撞块\t283133\n野炮\t283134\n有学\t283135\n天俊\t283136\nKnockout\t283137\nOral-B\t283138\n金汤\t283139\n创意人\t283140\n早间\t283141\n康熙通宝\t283142\nLua\t283143\n140号\t283144\n丁毅\t283145\n天天音乐网\t283146\n宦官\t283147\n人民币银行结算账户管理办法\t283148\nmsi季\t283149\n中国光大\t283150\n金睿\t283151\n生物技术专业\t283152\n恸\t283153\n生变\t283154\n复仇者联盟3:无限战争\t283155\nzara\t283156\n点状\t283157\nRev\t283158\n天然气灶\t283159\n生平\t283160\n宁波奥数网\t283161\n下模\t283162\n配电所\t283163\nMac邮件\t283164\n8万多\t283165\n冷链\t283166\n无味\t283167\n理工\t283168\nSEER\t283169\narcsoft\t283170\n楔\t283171\n话社\t283172\n张馨予\t283173\n100ML\t283174\n恒温库\t283175\n嘻哈侠\t283176\n亏待\t283177\n六言\t283178\n电改\t283179\n杨新宇\t283180\n小时候\t283181\nwsa\t283182\n她来听我的演唱会\t283183\n中国科学院科技战略咨询研究院\t283184\n扫光机\t283185\nWelfare\t283186\nhgf\t283187\n陀耶\t283188\n贪狼星\t283189\n张晓强\t283190\n芦潮港\t283191\n88831557\t283192\neth0\t283193\n100天内\t283194\n亲爱\t283195\n直销部\t283196\n上海皇帝之雄霸天下\t283197\n沪亭北路\t283198\n6DII\t283199\n开博尔\t283200\n93折\t283201\nvava\t283202\n45尺\t283203\n牙尖\t283204\n丽泽桥\t283205\n王美\t283206\n赌博堕天录\t283207\n亨利四世\t283208\n3D效果图\t283209\n两审\t283210\n马在路上\t283211\n尤克里里谱_ukulele\t283212\n哲科\t283213\n0.9元\t283214\n周文强\t283215\n拇指琴\t283216\n开天猫店\t283217\n彩神\t283218\n小博士幼儿园\t283219\n牡丹文化节\t283220\n跟庄\t283221\nmorethan\t283222\n138美业网\t283223\n哪一边\t283224\n河南省公安厅交通管理局\t283225\n芍药谷\t283226\n厚积薄发\t283227\n西部世界\t283228\nKevin\t283229\n性奴\t283230\n4日\t283231\n楚留香天\t283232\nchen\t283233\n眨眼\t283234\n学实\t283235\n逆襲\t283236\n斗鱼郭mini\t283237\n时断时续\t283238\n疴\t283239\n再生育\t283240\n科教文\t283241\n晨读\t283242\n圈中人保险网\t283243\n全讯网\t283244\n读写分离\t283245\n檐口\t283246\ndongxi\t283247\n湖滨小学\t283248\n蝶园\t283249\n銮\t283250\n蛏子\t283251\n淇淇\t283252\n怀有\t283253\n6尺\t283254\n210万\t283255\n9.33\t283256\n建厂\t283257\n苯丙酮尿症\t283258\nconditional\t283259\n端子\t283260\nWindows10序列号\t283261\n硬线\t283262\n80年代初\t283263\n廖前胜\t283264\n桌面管理器\t283265\n41028\t283266\n彩虹蛋糕\t283267\n地理学\t283268\n辅舒良\t283269\n雷贝拉唑\t283270\n姜哲\t283271\n仍然\t283272\n巨亏\t283273\n糖果盒\t283274\n非晶\t283275\n李亚
平\t283276\n业余爱好者\t283277\ndvd播放器\t283278\n毕现\t283279\nvino\t283280\n车载导航\t283281\n迈阿密电音节\t283282\nfintech\t283283\n力矩\t283284\nAdvance\t283285\n番薯粉\t283286\n苏格兰皇家银行\t283287\n圣诞岛\t283288\nbrandt\t283289\n千代田区\t283290\n挈\t283291\nOneAPM\t283292\n呼兰河传\t283293\nFNhAa-A3\t283294\n好燃\t283295\n20多个\t283296\nx100s\t283297\n想不够\t283298\n名者\t283299\n神眷\t283300\n小户\t283301\n无菌\t283302\nThinkphp\t283303\nZhuQue\t283304\n微珠\t283305\n幸会\t283306\n科城\t283307\n水木社区\t283308\n第3部\t283309\n3dmax2010\t283310\n脚儿\t283311\n铝石\t283312\n冠心病人\t283313\n两双\t283314\n观府\t283315\n陈宝\t283316\n德国大众\t283317\n海金沙\t283318\n州人民政府办公室\t283319\n两面人\t283320\n释小龙\t283321\n雪佛兰赛欧3\t283322\nα\t283323\nwin7系统还原\t283324\noffspring\t283325\n微波毫米波\t283326\nAka\t283327\n7月1日起\t283328\nAmong\t283329\n影歌\t283330\n西野翔\t283331\n孟子辰\t283332\nsd卡\t283333\n4克\t283334\nspy\t283335\n海安漫思\t283336\n妙笔\t283337\n曼巴眼镜蛇\t283338\n代码\t283339\n平滑\t283340\n分钟\t283341\n异位妊娠\t283342\n鬼蛙\t283343\n隆林网\t283344\n杜凯\t283345\n西城区\t283346\n十一个\t283347\n鲁能\t283348\n亿|\t283349\n长春大学旅游学院\t283350\n造句\t283351\nlf2\t283352\ncsharp\t283353\n人音\t283354\n电信路由器\t283355\nSETNX\t283356\n忘情水\t283357\n长寿\t283358\n山东自考网\t283359\n纯音乐背景音乐大全_纯音乐背景音乐\t283360\ntransitional\t283361\nDeft\t283362\n科\t283363\n美国科学院\t283364\n知法\t283365\n2评分\t283366\n悬臂式\t283367\n树脂粉\t283368\n电子显微镜\t283369\n上海市计量测试技术研究院\t283370\n腕表\t283371\n灰阶\t283372\n2015.6\t283373\n16个月\t283374\n瑞普生物\t283375\n版权法\t283376\n保洁部\t283377\n贺美琦\t283378\n广州地铁2号线\t283379\n连霍高速\t283380\n外祖\t283381\n云南人才网\t283382\n默示录\t283383\n骑客\t283384\n码版\t283385\ndespair\t283386\n电信营业厅\t283387\n土豆炖排骨\t283388\n郑州升达经贸管理学院\t283389\n1.4l\t283390\n卫生费\t283391\n喷气织机\t283392\n移花\t283393\nEMV\t283394\n承重柱\t283395\n炭步镇\t283396\n范题\t283397\nm115w\t283398\n三季度\t283399\n斯奈德\t283400\nghos\t283401\npornstar\t283402\n水星MW300R\t283403\n怪事\t283404\n尖刺\t283405\n花菇\t283406\n随机森林模型\t283407\nSeminars\t283408\n抑制作用\t283409\n西城区教委\t283410\n第二天早晨\t283411\n皮线光缆\t283412\n2017年7月1日起\t283413\nsever2012\t283414\n佛本\t283415\nRFC\t283416\n永久荷载\t28341
7\n成人教育网\t283418\nsolitaire\t283419\n深圳市委党校\t283420\n陈建平\t283421\n三国战记2\t283422\n37层\t283423\n莲曲\t283424\n私奴\t283425\n地牢战争\t283426\n3CD\t283427\n公汇款\t283428\n电脑阅卷\t283429\ndelicious\t283430\n剪纸\t283431\n8310\t283432\n美国南加州大学\t283433\n莱斯特大学\t283434\n易中天中华史\t283435\n铜陵市政府\t283436\n601168\t283437\n宿州东\t283438\nSLI\t283439\nControllers\t283440\n再而三\t283441\n苏拉\t283442\n毋庸\t283443\nCD/DVD\t283444\n防爆电加热器\t283445\n应激\t283446\nFreezing\t283447\n竹节虾\t283448\n液控单向阀\t283449\ntensorrt\t283450\n宝嘉誉峰\t283451\nmsbuild\t283452\n高频淬火\t283453\n蔚蓝网\t283454\n男哥\t283455\ns赛\t283456\n窗井\t283457\n大话骰\t283458\n刺龙王\t283459\n天涯真我_论坛\t283460\n不均质\t283461\n潜艇\t283462\nRayBan\t283463\n千秋岁\t283464\nCLICK\t283465\n观致3\t283466\n同模\t283467\n环氧地坪漆\t283468\n试睡\t283469\n3pe防腐钢管\t283470\nClosers\t283471\n酒品\t283472\n惠装\t283473\n靠拢\t283474\n月字\t283475\neviews8.0\t283476\n古海\t283477\n文苑\t283478\n温枪\t283479\n等位基因\t283480\ncdr2018\t283481\n抗雄\t283482\nGevent\t283483\n翁虹\t283484\n武侠片\t283485\n18629559367\t283486\nliqingxin\t283487\ntomica\t283488\n依云水\t283489\ncatches\t283490\nlbm\t283491\n改增\t283492\n聚硫\t283493\n拿破仑三世\t283494\n贾宇\t283495\n217年\t283496\n薛法根\t283497\n60级\t283498\n第一项\t283499\n安付宝\t283500\n兴隆大家庭商业集团\t283501\ndelphi7\t283502\n昌北\t283503\ncuts\t283504\n教育集团\t283505\n卡套式\t283506\ntime网\t283507\nExcel宏\t283508\n克鲁\t283509\n碘化钠\t283510\n滙\t283511\n断肠\t283512\n600100\t283513\n纽带\t283514\n东海道\t283515\nduet\t283516\n粤人社\t283517\n北新区\t283518\n米泉\t283519\n雅俗共赏\t283520\n英雄的黎明\t283521\n小星星变奏曲\t283522\n黄金比\t283523\n第六十九章\t283524\n国家语委\t283525\n3690\t283526\n中华骨髓库\t283527\n张雨佳\t283528\n途牛玩乐\t283529\n便携机\t283530\n宫锁连城\t283531\n小米移动\t283532\n湖东路\t283533\n62mm\t283534\nCTA\t283535\n中国矿业大学研究生院\t283536\n皮克布克绘本馆\t283537\n湛江市公安局\t283538\n张杨果\t283539\n高尔夫7\t283540\nwiiu\t283541\n顶\t283542\n专供\t283543\n市烟草局\t283544\n8户\t283545\n【诺\t283546\n回灌\t283547\n展鹏\t283548\n做课\t283549\n闵柔\t283550\n安全文化网\t283551\n文鼎广场\t283552\n科学节\t283553\nvmtools\t283554\nhao\t283555\n社会主义市场经济体制\t283556\n中国移动营业厅\t283557\n低矮\t283558\n同胞\t2835
59\n尾行3\t283560\n光荣革命\t283561\n2420l\t283562\n金牛湖街道\t283563\n一环路东\t283564\n滴通\t283565\nspringMvc\t283566\nngulc\t283567\n魔性\t283568\nHeroku\t283569\n羽龙\t283570\n思享\t283571\nLED球泡灯\t283572\n公职律师公司\t283573\n鼩鼱\t283574\n考选\t283575\n17年8月\t283576\n高德纳\t283577\n味噌汤\t283578\n埃特板\t283579\n牛小帅\t283580\n泰格特\t283581\nAccueil\t283582\n帆影\t283583\n科比\t283584\n11%\t283585\n囿文\t283586\n减刑\t283587\n医药\t283588\n维吉尔\t283589\nUFC\t283590\n山东教育出版社\t283591\n小黑飞\t283592\n百度云批量\t283593\n大话西游1\t283594\n中暑\t283595\n上错花轿嫁对郎\t283596\n冷感\t283597\nmatica\t283598\n夸奖\t283599\n6800多万\t283600\n小米mi\t283601\n江湖气\t283602\n串口通信\t283603\nvga转换器\t283604\nMobile01\t283605\n幻痛\t283606\n云铝股份\t283607\nHeinz\t283608\n起先\t283609\n讨论者\t283610\n左宗棠\t283611\n美式乡村\t283612\n沪甬跨海\t283613\n手线\t283614\n软文\t283615\n鲜芋仙\t283616\n妾\t283617\niOS模拟器\t283618\n电影界\t283619\n瑞达法考\t283620\n透白\t283621\n城东\t283622\n开源操作系统\t283623\n清泉寺\t283624\n环型\t283625\nShots\t283626\nR10\t283627\nmarvell\t283628\n公共场所\t283629\n吕瑶\t283630\nDalvik\t283631\nAV天堂网\t283632\n张清华\t283633\n人民币存款利率\t283634\npCloud\t283635\n小岛惊魂\t283636\n麻雀\t283637\nCOMING\t283638\n侠盗猎车手:罪恶都市\t283639\n灵宝市\t283640\nconverter\t283641\ncoreldrawx6\t283642\n5万平米\t283643\n狮子峰\t283644\n第8章\t283645\n高级经理\t283646\nmyabtis\t283647\n黄淮\t283648\n扬升\t283649\n2017招\t283650\n通宵\t283651\n农业供给侧结构性改革\t283652\n北新建材\t283653\n市场分析师\t283654\n梦幻西游端游\t283655\n残阳\t283656\n花源镇\t283657\n于乐\t283658\n北京有限公司\t283659\n20160629\t283660\n3.67\t283661\n一粒一粒\t283662\n鲁东\t283663\nLanSee\t283664\nldb\t283665\nSnort\t283666\n入土为安\t283667\n赵邦贺\t283668\n1.1版\t283669\n开卡\t283670\n德高\t283671\n德明\t283672\n哈皮\t283673\niPhone6/Plus\t283674\n王校长\t283675\nQCC\t283676\n中共海南省委\t283677\n佩章\t283678\n商用车网\t283679\n消磨\t283680\ncla45\t283681\n迭代式\t283682\n内源性\t283683\nDash\t283684\n驱动轮\t283685\n康巴卫视网\t283686\n东北电气\t283687\n有机溶剂\t283688\nyoodb\t283689\nmush\t283690\n周海龙\t283691\n氢脆\t283692\n铃木杏里\t283693\nwin7/\t283694\n红烧鲤鱼\t283695\n16万\t283696\n西平县人民政府\t283697\n赣府\t283698\n5年后\t283699\n祥生集团\t283700\n尝鲜\t283701\n旁\t2837
02\n结子版\t283703\ni38100\t283704\n洗车池\t283705\n立马电动车\t283706\n美的空气净化器\t283707\n相碰\t283708\n5100元\t283709\n添丁\t283710\n耳\t283711\n筆記型\t283712\n计分板\t283713\nkuaisu\t283714\n17英寸\t283715\n火力少年王之悠风三少年\t283716\nG7X\t283717\n十\t283718\n瑞芯\t283719\n菜菜子\t283720\n李浩\t283721\n粗绳\t283722\n还行\t283723\n三七市\t283724\n小三劝退师\t283725\n岛田阳子\t283726\n建筑设计有限公司\t283727\nc88\t283728\nbelle\t283729\n浦北县\t283730\nsexxx\t283731\n升红\t283732\namoeba\t283733\n罗川\t283734\n长沙市人力资源和社会保障局\t283735\n修锁\t283736\n华野\t283737\n花京院\t283738\n民主生活会制度\t283739\n110kv\t283740\n瑞云\t283741\n盐沼\t283742\n触屏\t283743\n℡\t283744\n测查\t283745\n9.2\t283746\n伦教\t283747\n85页\t283748\n全胸\t283749\n明治大学\t283750\nBernard\t283751\nBAS\t283752\nv2.3.7\t283753\n西湖龙井茶\t283754\nexcel表格单元格\t283755\n梅若\t283756\n封杀\t283757\n143平\t283758\n360ml\t283759\n北川景子\t283760\ncrtl\t283761\nnojs\t283762\n江西省人力资源和社会保障厅\t283763\n网鱼\t283764\n小熊猫\t283765\n李子恒\t283766\n民国街\t283767\n整天\t283768\n市\t283769\nacceleration\t283770\nIstanbul\t283771\n国铁\t283772\n评建\t283773\n崂山白花蛇\t283774\n月主\t283775\n傻妞\t283776\n浙金信托\t283777\n出仓\t283778\n闺房\t283779\n西昌卫星发射中心\t283780\nMediaPad\t283781\nuglifyjs\t283782\n【晶美\t283783\n|韶关网\t283784\n茂男\t283785\n网监局\t283786\n萧玉\t283787\n舟骨\t283788\n中国无线电协会\t283789\n巴马活泉\t283790\n中共安徽省委党校\t283791\nnlinfit\t283792\nproguard\t283793\n无志\t283794\n威坪镇\t283795\nps模板\t283796\n胶州湾跨海大桥\t283797\n林晓东\t283798\n口红色\t283799\n微月\t283800\n套票\t283801\n奥迪RS6\t283802\n唛头\t283803\nsai2\t283804\n极限学习机\t283805\n米家骑记\t283806\nhdjay\t283807\n木衣柜\t283808\n凤凰座\t283809\n码族\t283810\n弹簧床\t283811\nskeleton\t283812\nNessus\t283813\n张瑞芳\t283814\n21.7\t283815\n永新县\t283816\n杨文浩\t283817\n茶氨酸\t283818\n150#\t283819\n猴脑\t283820\n6000余\t283821\n沃尔沃V90\t283822\n世萌吧\t283823\n麒麟970处理器\t283824\n四系\t283825\n疏影\t283826\n20170224\t283827\n玛丽莲·梦露\t283828\n世邦\t283829\n叶绿体\t283830\nFileStream\t283831\n9800万多户\t283832\n保安编码器\t283833\n杜均\t283834\n春菜花\t283835\n母单\t283836\n徐乐\t283837\n板坯\t283838\nsrvctl\t283839\n一事无成\t283840\n义安区\t283841\n恋舞红旗h7\t283842\n鼻中\t283843\n古堰\t283844\n紫荆
关\t283845\n凝管\t283846\n潘粤明\t283847\n鸡胸肉\t283848\n开讲啦\t283849\n黑绝\t283850\n风冷式\t283851\n黄龙溪古镇\t283852\n5件套\t283853\n责成\t283854\n姓张\t283855\n直管县级市\t283856\n7月17日\t283857\n济南市长清区\t283858\n传声器\t283859\n九阴真经2\t283860\n横山寺\t283861\npublicly\t283862\n破伤风杆菌\t283863\n血清铁蛋白\t283864\nbabel\t283865\n孙悟天\t283866\n中欣氟材\t283867\n大神F1\t283868\n一日游\t283869\n万科金域澜湾\t283870\n异世九鼎记\t283871\nMcncc\t283872\n汽温\t283873\n众泰_\t283874\n好好说\t283875\n线板\t283876\nDJ歌\t283877\n探棒\t283878\n长调\t283879\n肾小球肾炎\t283880\n图块\t283881\n3月14日\t283882\nEars\t283883\n折断\t283884\n2万4\t283885\n自学考试\t283886\n奥莱悦府\t283887\n如泥\t283888\n林肯\t283889\nnotnull\t283890\nlivehouse\t283891\n盲侠\t283892\n领不到\t283893\n提薪\t283894\nsapply\t283895\nbingo\t283896\n转评\t283897\n圣邦股份\t283898\n所困\t283899\n举债\t283900\n压\t283901\nasy\t283902\n10月8日起\t283903\n76平米\t283904\n现代大学英语精读2\t283905\n太阳能电池板\t283906\n七大罪第一季\t283907\nactivi\t283908\n陕西省西咸新区\t283909\n诡辩\t283910\ndzs\t283911\n台州网络\t283912\n部分性\t283913\ngul\t283914\nRoast\t283915\n三姐妹\t283916\n_机\t283917\nYa\t283918\nzooxxx\t283919\n收窄\t283920\nStationary\t283921\n第二季\t283922\n组解\t283923\nBeta2\t283924\n29.99\t283925\n此帖\t283926\n莫德凯撒\t283927\n带法\t283928\nsubsys\t283929\n猎豹cs10\t283930\n黑木琴音\t283931\n品牌红木网\t283932\n任天堂大乱斗\t283933\n滨海经济技术开发区\t283934\nG9300\t283935\n第150\t283936\n陈少梅\t283937\n雨松MOMO\t283938\n干净利落\t283939\n灵芝粉\t283940\n唐铭阳\t283941\n家贫\t283942\n悲秋\t283943\n钢镚儿\t283944\nNotary\t283945\n张飞庙\t283946\n慰安妇\t283947\n储水式电热水器\t283948\n迈格森\t283949\n湖南省人民医院\t283950\n无锡市第九人民医院\t283951\n莱索托\t283952\n1565783227\t283953\nxvsr\t283954\n波\t283955\njan\t283956\nPaperRater\t283957\n东兴证券股份有限公司\t283958\n陈渝\t283959\n省花\t283960\nmicrotek\t283961\n长沙火车站\t283962\n纳瓦斯\t283963\n英菲\t283964\n一扫而光\t283965\n陕西省公安厅交通管理局\t283966\nuploadifive\t283967\n五年级科学下册\t283968\nTOGAF\t283969\n微修\t283970\nPHP+MYSQL\t283971\n豆科\t283972\nf318\t283973\n华城国际\t283974\n宅宅AvDay\t283975\n8kw\t283976\n朊\t283977\n放牛班的春天\t283978\n环渤海\t283979\n石家庄市高新区\t283980\n邮科院\t283981\n教育局\t283982\nween\t283983\n元\t283984\n主亲\t283985\n
_多玩\t283986\n0412\t283987\n鼋头渚\t283988\n蒸锅\t283989\n二建万\t283990\n淋雨\t283991\n21G\t283992\n打拳击\t283993\n新道杯\t283994\n夜蒲\t283995\n李峙\t283996\n人天\t283997\nUITextView\t283998\nTPK\t283999\n年轻的母亲4\t284000\n塞尔达沙漠\t284001\n秦安\t284002\n封皮\t284003\n于康熙\t284004\npit\t284005\nhdmi转换器\t284006\n华润紫云府\t284007\nintelj\t284008\n3033\t284009\ngbk\t284010\nㄏ\t284011\n好视力\t284012\n加粗\t284013\n张晋\t284014\n水城\t284015\n魏建军\t284016\n4.doc\t284017\n灵山景区\t284018\n九运\t284019\n说明页\t284020\n锅炉房\t284021\n数千年\t284022\n汉能控股集团有限公司\t284023\nCOSAV\t284024\npc3000\t284025\n雅娜\t284026\n正蓝旗\t284027\nhope\t284028\n杨陵\t284029\n宋慧\t284030\n沏茶\t284031\n北平\t284032\n成都吃喝玩乐网\t284033\n承转合\t284034\n素琴\t284035\nAbbey\t284036\n阴阳脸\t284037\nleavingseason\t284038\n家轿\t284039\n刘艳丽\t284040\n薄型\t284041\nhangzhou\t284042\nSqlite\t284043\n探母\t284044\n53元\t284045\n性表\t284046\n鸡版\t284047\n初声\t284048\n二百年\t284049\n刘若鹏\t284050\n魔战套\t284051\n实木烤漆门\t284052\n内藤湖南\t284053\n扬州大学医学院\t284054\n鬼影步\t284055\n大卡\t284056\ne店宝\t284057\ntrailer\t284058\n二人组\t284059\n话头\t284060\n占比\t284061\n光台\t284062\nxoom\t284063\n锻炼身体\t284064\n穷通\t284065\n逆变电焊机\t284066\n铺床\t284067\n2605\t284068\n审计学\t284069\n第10号\t284070\n不杀\t284071\n龙玉\t284072\n日语五十音图\t284073\n瓯海区政府\t284074\n毛之地\t284075\nsoon\t284076\n第15位\t284077\n质权人\t284078\n解理\t284079\n市民们\t284080\n阻抗分析仪\t284081\n小猪快跑\t284082\n芣苢\t284083\n龙刃\t284084\n千祥\t284085\n宫本武藏\t284086\n开放度\t284087\n碎星\t284088\n黄页88网址导航\t284089\ntnf\t284090\n第26\t284091\n猪耳朵\t284092\nEthiopia\t284093\n马六\t284094\n天润\t284095\ny460\t284096\n粉身碎骨\t284097\n魔仙堡\t284098\n冷眼\t284099\n潭柘寺\t284100\n快付通\t284101\n最前沿\t284102\n银豆网\t284103\nLINK\t284104\n幸福家园小区\t284105\n风神翼龙\t284106\n40方\t284107\n灵通资讯网\t284108\n九十年代\t284109\n人民体育出版社\t284110\n耐磨\t284111\n六项\t284112\nKinetics\t284113\n62119064\t284114\nJunos\t284115\n上闵\t284116\n自考公共关系学\t284117\n97韩剧网\t284118\n_音平\t284119\n太上布衣\t284120\n冯某\t284121\n三湘风纪网\t284122\n万洲\t284123\n002281\t284124\nstdcall\t284125\n高新路\t284126\ndabao\t284127\n心梗\t284128\n波菜\t284129\n郎才女貌\t284130\n指模\t284131\nvae\t
284132\n小汤山\t284133\n红米1s\t284134\n茨菇\t284135\n内斗\t284136\nassociative\t284137\n丹泽尔\t284138\n架包\t284139\n帝王蟹\t284140\n米恩\t284141\nflows\t284142\n螺内酯\t284143\npro分屏\t284144\n多老\t284145\nF几\t284146\n8周年\t284147\n110亿\t284148\n2018-2035年\t284149\n322路\t284150\n自治区政协\t284151\n奥拉基尔\t284152\n化州市\t284153\n加加林\t284154\n尺码\t284155\n一辩\t284156\n张靖\t284157\n稳定杆\t284158\n四权\t284159\n神剑情天2\t284160\n锦江花园\t284161\nE轮\t284162\ngtx860m\t284163\n西安消防\t284164\n赵麟童\t284165\n军报\t284166\n暗黑西游记\t284167\nexw\t284168\n描画\t284169\n四梁八柱\t284170\n得伟\t284171\n江苏省教育厅\t284172\n钱锋\t284173\n小跳蛙\t284174\n物联网应用技术\t284175\n双碟\t284176\n征得\t284177\nz1\t284178\n关联词\t284179\n消化池\t284180\n简化\t284181\n金融许可证\t284182\n赢乐\t284183\n通配\t284184\n2002-2013年\t284185\naabb式\t284186\n脉诊\t284187\nINFORMATION\t284188\n第三十九\t284189\n沈音\t284190\n话梅\t284191\n710路\t284192\n兰舍硅藻泥\t284193\n金姓\t284194\nitalian\t284195\nKIWI\t284196\ngtx960\t284197\n沿江铁路\t284198\n收录\t284199\n南舞团\t284200\n一层\t284201\n拔罐疗法\t284202\n密封胶条\t284203\n开江县人民政府\t284204\n健合\t284205\n佛柜\t284206\n齐鲁医院\t284207\n中旅国际旅行社\t284208\n挡墙\t284209\n3场\t284210\n】线\t284211\n蔡塘\t284212\n还贷款\t284213\n斯嘉丽\t284214\n休闲园\t284215\n滩羊\t284216\n新疆舞\t284217\n戏文\t284218\n缱绻\t284219\n9.0.0\t284220\n大丹\t284221\n试衣间\t284222\n骑马与砍杀魔戒最后之日\t284223\n胃酸反流\t284224\n明胶空心胶囊\t284225\nmultiprocessing\t284226\n高德\t284227\n襄阳牛肉面\t284228\n琅琊榜2之风起长林\t284229\n上进\t284230\n两女\t284231\nbldc\t284232\n最后的莫西干人\t284233\n广西壮族自治区高级人民法院\t284234\nangel\t284235\n3道\t284236\n4003\t284237\n国汇\t284238\n相夫\t284239\n真實\t284240\n1837\t284241\n钱舞\t284242\n情变\t284243\n异禀\t284244\naof\t284245\n盛兴\t284246\n裱花师\t284247\n叮叮当\t284248\n大略\t284249\n安徽高院\t284250\n酷儿\t284251\nproduce101吧\t284252\nusercontrol\t284253\n八进制\t284254\n禁卡\t284255\n激素类药物\t284256\n甘油酯\t284257\nLebanon\t284258\n1796\t284259\n刘德c罗\t284260\n好多遍\t284261\n萨拉丁\t284262\n双子座\t284263\n南二环\t284264\n子宫内膜息肉\t284265\n培训帮\t284266\n123D\t284267\n广宁村\t284268\n岀\t284269\nCypher\t284270\n马扎\t284271\nNewschoolers\t284272\n乌\t284273\n始源蓝宝石\t284274\n空铁联运\t284275\n印度教\t2
84276\n312路\t284277\n绍兴莲花落\t284278\nCaravan\t284279\n日式RPG\t284280\n签章\t284281\n易建王东明\t284282\n纳特帕格\t284283\n200余家\t284284\n航天航空\t284285\n新苑\t284286\n怨灵\t284287\n千村\t284288\nTrading\t284289\nCOMPILER\t284290\n钓友们\t284291\n跟着贝尔去冒险\t284292\n金丹大道\t284293\n满语\t284294\nwebxml\t284295\n前夫\t284296\nSCIENCES\t284297\n2011-2015年\t284298\n5.50\t284299\n豫剧网\t284300\n北京德胜门中医院\t284301\n高风亮节\t284302\n2690\t284303\n冠状\t284304\n七度苹果网\t284305\nhc360慧聪网\t284306\n日程安排\t284307\n爱德华八世\t284308\n北京市新闻出版广电局\t284309\nryzen\t284310\n曁\t284311\n提夫尼\t284312\n齐欣\t284313\n陈述式\t284314\n十堰教育信息网\t284315\n高贤贞\t284316\n三红蜜柚苗\t284317\n人教版九年级语文\t284318\nScarlett\t284319\n想吐\t284320\n凝珠\t284321\n普鲁兰\t284322\n口袋妖怪金手指\t284323\n外楼\t284324\n田心村\t284325\n志丹县\t284326\n衡水二中\t284327\nttr\t284328\n李本良\t284329\nemotion\t284330\n清十二帝\t284331\n逗音\t284332\n曼昆\t284333\ncommvault\t284334\n4:30\t284335\n机械原理\t284336\n崔始源\t284337\n三千亩\t284338\nSpotify\t284339\n生辰八字算命_八字算命_周易算命\t284340\n张益唐\t284341\n厦门市旅游局\t284342\n乐美\t284343\n耳镜\t284344\n燕大\t284345\ni7-8700\t284346\n五谷杂粮粥\t284347\n副署长\t284348\n12000公里\t284349\n陶瓷管\t284350\n金柚\t284351\n来伊份\t284352\n丁晓东\t284353\n华夏文明\t284354\n星陨传说\t284355\n浦东区\t284356\nWeighted\t284357\n南太行\t284358\n辽宁省\t284359\n碱面\t284360\n除根\t284361\ndozens\t284362\n里层\t284363\n山西省体育局\t284364\n收藏\t284365\n进项税\t284366\n蔡洁\t284367\n腾讯觅影\t284368\n验证\t284369\n新舟\t284370\nCCTV-12\t284371\n定时器\t284372\n高尔基体\t284373\n重庆海尔\t284374\n甲秀楼\t284375\nbossa\t284376\n肖文杰\t284377\n10&\t284378\n本月\t284379\nResearch\t284380\n決\t284381\n隆尧县\t284382\n中国TBT研究中心\t284383\n多万\t284384\n蝉蛹\t284385\n2017年春节\t284386\n尹恩惠\t284387\n云崖\t284388\n龙头股一览\t284389\n保利中央公园\t284390\n磁路\t284391\n罗阳\t284392\n连袜裤\t284393\n怀邵衡铁路\t284394\n9万亿\t284395\n实际性\t284396\n乒协\t284397\nvs2005\t284398\n基数\t284399\n路劲\t284400\n李丽珍\t284401\n官也街\t284402\n锅哥\t284403\n苏力\t284404\nBrokers\t284405\n闪亮时刻\t284406\n汉白玉\t284407\n新马\t284408\nPhotoShop\t284409\n恒大城\t284410\n荒野行动\t284411\n48部\t284412\nNorthwestern\t284413\n书桓\t284414\n108元\t284415\ncjb\t284416\n鸾凤鸣by\t284
417\n冒火\t284418\n施工费\t284419\n山东教育电视台\t284420\n新加坡花园\t284421\n恒生电子股份有限公司\t284422\nu7\t284423\n气体增压泵\t284424\n柯庆施\t284425\n劳斯莱斯幻影\t284426\n喂马\t284427\n临海市\t284428\n编织器\t284429\n第i\t284430\n骷髅王\t284431\n乱花\t284432\n郎心如\t284433\n141个\t284434\n指环王3:王者无敌\t284435\nwow7.2\t284436\n折旧\t284437\n预付款保函\t284438\n华新镇\t284439\n李建国\t284440\n保利达\t284441\n怨声载道\t284442\n田刚\t284443\nKicks\t284444\n南充一中\t284445\nnet4.6\t284446\nSUZHOU\t284447\n文件套\t284448\n导航版\t284449\n2560x1600\t284450\n慧讯网\t284451\n中梗阻\t284452\n摩根大通银行\t284453\n台虎钳\t284454\n足彩\t284455\n石棉县人民政府\t284456\n小怜\t284457\n雄黄\t284458\n神武手游3\t284459\nipadmini2吧\t284460\n烫金版\t284461\n奥斯卡颁奖典礼\t284462\n阿尔卡\t284463\nDays\t284464\nBoys\t284465\n601828\t284466\n实案\t284467\n大模\t284468\n注目\t284469\n日安\t284470\n图拉丁\t284471\n赛赛\t284472\n值栈\t284473\n包裹\t284474\n安吉白茶\t284475\n定期利率\t284476\n国立中山大学\t284477\n八仙筒\t284478\n|秒\t284479\n科幻迷\t284480\n杨师傅\t284481\n森田药妆\t284482\nWing\t284483\n田\t284484\n自然美\t284485\n8月6日\t284486\n伤病员\t284487\n中通快运\t284488\n蝴蝶臂\t284489\nqq游戏\t284490\nAmendment\t284491\nCodeMonkey\t284492\n吃穿\t284493\n迷失森林\t284494\n中国骑友网\t284495\n深圳会计网\t284496\nToca\t284497\n整块\t284498\n武汉群光广场\t284499\n鲁阳\t284500\n红牡丹\t284501\n爱戴\t284502\nkgf\t284503\nsmiths\t284504\n康娜\t284505\nEsper\t284506\npsyedu\t284507\n十不干\t284508\n坦克战\t284509\n感力\t284510\ncompanies\t284511\n婴儿推车\t284512\nRoHS\t284513\n乙亥\t284514\n嫌疑\t284515\n拆模\t284516\n神子\t284517\n蒂克币\t284518\n发动机清洗剂\t284519\nTraffic\t284520\nckplayer-ckplayer\t284521\ndlsite\t284522\n上古卷轴5mod\t284523\n竞天公诚\t284524\nPhotonics\t284525\n西拉干红葡萄酒\t284526\nPrints\t284527\n汇仁肾宝片\t284528\n司母戊鼎\t284529\n石家庄桥西区\t284530\n平\t284531\nBB环\t284532\nexcept\t284533\n程伟\t284534\nmettler\t284535\n宣传单页\t284536\n小猪佩琦\t284537\n1.28G\t284538\nsealant\t284539\n禁止性\t284540\nWandering\t284541\n塞车\t284542\n此命\t284543\n授信贷款\t284544\n美帕\t284545\n条块\t284546\n互动式\t284547\n7730\t284548\n炁\t284549\n田一德\t284550\n金丝猴\t284551\n滔搏运动\t284552\n300B\t284553\n空心圆\t284554\n反渗\t284555\n征途2x\t284556\n刑事判决书\t284557\nEditorial\t284558\n
涕\t284559\n烟火\t284560\n狩魔\t284561\n鹿兆鹏\t284562\n许斐刚\t284563\n穿心\t284564\n九间堂\t284565\n沙特阿拉伯王国\t284566\nSpend\t284567\n告诉他\t284568\n蝗虫\t284569\n财通证券\t284570\n异体\t284571\n容\t284572\n天海投资\t284573\n恋父\t284574\nJZ\t284575\n任天堂Switch\t284576\n修罗道\t284577\nnip\t284578\n情面\t284579\n三天前\t284580\ndraggable\t284581\n黄丽华\t284582\niph\t284583\n怪气\t284584\n北师大附属实验中学\t284585\n市政工程\t284586\n海口市\t284587\n小米空调\t284588\n率先垂范\t284589\n顺溜\t284590\na8l\t284591\nluoli\t284592\n邦克仕\t284593\n网恋\t284594\npixelstyle\t284595\n异点\t284596\n医元网\t284597\n魔剑\t284598\npytho\t284599\n伊珂\t284600\n契约化\t284601\n明居\t284602\n确证\t284603\n卓尔不群\t284604\n畏缩\t284605\n曼谷市\t284606\nstark_summer\t284607\nRecognition\t284608\n知库\t284609\n摇滚\t284610\n散装食品\t284611\n欧子\t284612\n极品飞车17\t284613\n筱田优\t284614\n鱼网\t284615\n萨维斯\t284616\n直抒胸臆\t284617\nUNCENSORED\t284618\n针织裙\t284619\n技研\t284620\n保利领秀山\t284621\n蒋干\t284622\noops\t284623\nAccenture\t284624\n皎皎\t284625\n第35期\t284626\n契默契\t284627\n外商独资公司\t284628\n惠头条\t284629\n祁连县\t284630\n4x5\t284631\n国网信息通信产业集团有限公司\t284632\n学生会\t284633\n早上5点\t284634\n中文件\t284635\n吴德\t284636\n3冰封\t284637\ncautious\t284638\n吉氏\t284639\n端机\t284640\n贺岁片\t284641\nsjb\t284642\n乳突炎\t284643\n21.1\t284644\ng-shock\t284645\nherom2\t284646\n学技\t284647\n精制\t284648\nou\t284649\ngrass\t284650\n九酷音乐\t284651\n13名\t284652\n14组\t284653\n宝翠园\t284654\n书长\t284655\n北京10号线\t284656\nHBK\t284657\n铃木达央\t284658\n莫迪里阿尼\t284659\n速尔\t284660\n寂静岭1\t284661\n月见草\t284662\n茶山镇\t284663\n曹方\t284664\n李师师\t284665\n北京西客站\t284666\n壳郎\t284667\nbacktrace\t284668\n夜夜日\t284669\nei6\t284670\nmeld\t284671\n肌醇\t284672\n吴薇\t284673\n挖破\t284674\n切工\t284675\n绍特\t284676\n纹绣师\t284677\n张红兵\t284678\n柳工\t284679\n_布谷\t284680\n九论\t284681\n狐妃\t284682\nmqtt\t284683\nlorenzo\t284684\n听诊器\t284685\n茶语网\t284686\n前海管理局\t284687\n成都便民网\t284688\n子味\t284689\n有文\t284690\n形位\t284691\n深圳东湖公园\t284692\n广东省农科院\t284693\nStalker\t284694\n进气道\t284695\n第203集\t284696\ntsi\t284697\nPowerPoint2016\t284698\n受命\t284699\n脑残\t284700\n胡海峰\t284701\n路桥费\t284702\n斯托\t284703\n辣警霸王花\t2
84704\n贴创\t284705\n小米智能\t284706\n把酒\t284707\n重庆中国银行\t284708\n川北凉粉\t284709\n效应器\t284710\n山西省国资委\t284711\n香浓\t284712\n江苏省质量技术监督局\t284713\n手羽\t284714\n黑苹果Mac\t284715\nFontographer\t284716\n悬挑架\t284717\n有赖于\t284718\nDBX\t284719\n姚珏\t284720\n学会\t284721\n信达雅\t284722\nlimeOracle\t284723\n钱袋\t284724\n义工\t284725\n科学技术研究院\t284726\n贺辞\t284727\n上海市商务委\t284728\n感同身受\t284729\n复位电路\t284730\n飞花令\t284731\nthermometer\t284732\n绿茶籽\t284733\nAsphalt\t284734\nsketches\t284735\nISO-8859-1\t284736\n2014-11-20\t284737\n423\t284738\n海口_海口网\t284739\n西师附中\t284740\n净利差\t284741\n一鼓作气\t284742\ntyy\t284743\nTHE\t284744\n德罗\t284745\n盛世三国\t284746\nhannah\t284747\n6979\t284748\n边地\t284749\nwaves9\t284750\nsuperliar\t284751\nskydrive\t284752\n19号\t284753\n邪典片\t284754\n10问\t284755\n忌口\t284756\n场尾\t284757\n费玉清\t284758\npipermail\t284759\n显色性\t284760\n纸浆\t284761\n莎普\t284762\n花工\t284763\njimu\t284764\nToledo\t284765\n白骨观\t284766\n茵宝\t284767\n只因\t284768\n潘菲洛夫\t284769\nBR/\t284770\n150周年\t284771\nwll\t284772\n传家宝\t284773\n外拍\t284774\n周家渡街道\t284775\n古河\t284776\nshihuan830619\t284777\niO\t284778\n小虫\t284779\n港口路\t284780\n梵天\t284781\nmsql\t284782\n见人\t284783\n第十六批\t284784\nSkywards\t284785\n麦洁文\t284786\n44秒\t284787\n鹤田\t284788\n政规\t284789\n丁超\t284790\nOrgasm\t284791\n十字猎人\t284792\nCOSO\t284793\ncameras\t284794\n曷\t284795\n荷包\t284796\n体验师\t284797\n企业国有资产交易监督管理办法\t284798\n呼伦贝尔草原\t284799\n生化危机:启示录\t284800\n主胜\t284801\n油孔\t284802\n工作女郎\t284803\n239号\t284804\n继承性\t284805\n6年内\t284806\n随步\t284807\n蓝月\t284808\n西麦燕麦片\t284809\n质能\t284810\n训斥\t284811\nunsigned\t284812\n一定要慎重\t284813\n红米4x\t284814\nnottingham\t284815\n6.6下\t284816\n第一层\t284817\nCF\t284818\nm5a78l\t284819\n0.5天\t284820\n纪年\t284821\nOKSN\t284822\n私募通\t284823\ngy6\t284824\n四周型\t284825\nmummy\t284826\n预付费\t284827\n新居网\t284828\n1.8.0.11\t284829\n海南省住建厅\t284830\n讴\t284831\n铁皮鼓\t284832\n王建宏\t284833\nmessa\t284834\n汇交\t284835\n192&\t284836\n形意\t284837\n手脚心\t284838\n单评\t284839\n李凌云\t284840\n广西民族报网\t284841\n洪德路\t284842\n4月25\t284843\n清冼\t284844\n六寸\t284845\n无名\t2
84846\n王丽娟\t284847\n大可爱\t284848\n杭州浙江大学\t284849\n陆杨\t284850\n钢量\t284851\n小伴龙\t284852\n老恒\t284853\n塞班\t284854\n睿城\t284855\nsydw\t284856\n雨林木风\t284857\nzoomeye\t284858\n五叶草\t284859\n营运部\t284860\n极品飞车6\t284861\nAAAA级\t284862\nA59s\t284863\naction\t284864\n朋友母\t284865\n上海市住建委\t284866\n放爱\t284867\n広\t284868\nStranger\t284869\n某一\t284870\n植被\t284871\n7108\t284872\n00161\t284873\n售电\t284874\n鸡王座X1\t284875\n张家村\t284876\npds\t284877\nwpsexcel\t284878\n60r16\t284879\n共修\t284880\n惠灵顿\t284881\n马嵬驿\t284882\n树莓酒\t284883\nFoxtable\t284884\n梦碎\t284885\n校团委\t284886\n国际大厦\t284887\n义乌工商学院\t284888\n街舞王子奇\t284889\n金城\t284890\n做房\t284891\nipguard\t284892\nsubList\t284893\n纪念币\t284894\ntravelled\t284895\n拉架\t284896\n9句\t284897\n广州起义烈士陵园\t284898\n男马\t284899\n五禽戏\t284900\n王晓斌\t284901\n眼日\t284902\n骑迹\t284903\n北京地产\t284904\n香杉木\t284905\n郭平\t284906\n卫生棉条\t284907\n企业云\t284908\nwmiao\t284909\n中国科学院遥感与数字地球研究所\t284910\n白云区人民政府\t284911\n南京公交查询网\t284912\nGrammy\t284913\n溅盒\t284914\n上海科科阀门集团有限公司\t284915\n王者荣耀KPL2017\t284916\n广西文化产业信息网\t284917\n6A\t284918\n面不改色\t284919\n石川铃华\t284920\nwebdav\t284921\n五八同城\t284922\nk_\t284923\n雁行\t284924\n彩板\t284925\n大梦\t284926\n固定座\t284927\n西屋电气\t284928\nibox\t284929\n耕作制度\t284930\npr2018\t284931\n孟静娴\t284932\nitools4\t284933\n乐课\t284934\nMindProbe\t284935\n07073生死狙击\t284936\nDavidZang\t284937\nManufacturing\t284938\n大唐明月\t284939\n行版\t284940\n铜公\t284941\nxrw\t284942\n5459\t284943\nTaobao\t284944\n坂田街道\t284945\n五道口\t284946\nX620\t284947\n马蹄粉\t284948\n脑仔\t284949\n营养学\t284950\n庚澈\t284951\n伤痕累累\t284952\n北斗导航\t284953\n一孔\t284954\nUV贴图\t284955\n荥阳在线\t284956\ntoes\t284957\nviso2013\t284958\n天柱县人民政府\t284959\n第75届\t284960\ntheory\t284961\n柏克\t284962\nぐ\t284963\n灵通\t284964\n无限斯特拉托斯\t284965\n园中园\t284966\n麦本本\t284967\nLloyd\t284968\n千金方\t284969\np751dm2\t284970\n列标\t284971\n透水沥青\t284972\n指日可待\t284973\nmilani\t284974\ns5pv210\t284975\ncuid\t284976\n数百名\t284977\naptio\t284978\n等位基因控制\t284979\n喷砂机\t284980\nlonging\t284981\n李国良\t284982\nN76E003\t284983\n幼子\t284984\n新瑞虎3\t284985\n东北大学\
t284986\n石药\t284987\n高叉\t284988\nflexbox\t284989\n拖布\t284990\n年册\t284991\n送货上门\t284992\n鱼乐\t284993\n僵尸道长2\t284994\n粉笔公考\t284995\n18斤\t284996\n2013年12月\t284997\ncanyon\t284998\n张光北\t284999\n对\t285000\n陈怡馨\t285001\n营寨\t285002\nWHS\t285003\n刑期\t285004\n国际金融公司\t285005\n红鹰\t285006\n二十三个\t285007\n百货商店\t285008\n金陵大学\t285009\n高钙血症\t285010\n麦肯\t285011\n华为畅享7\t285012\n步梯\t285013\n绝处逢生\t285014\n智取生辰纲\t285015\n徐向阳\t285016\n妖次元\t285017\n极速侠\t285018\n太和区\t285019\n斗破\t285020\n萧江\t285021\n导不出\t285022\n大坝\t285023\nIDF算法\t285024\n愚蠢\t285025\n这部片\t285026\nlibpcap\t285027\nSmartRF\t285028\n千金归来\t285029\n打胶机\t285030\n邮袋\t285031\n2018年04月23日\t285032\n诺奇\t285033\nHJ\t285034\n北京出租房网\t285035\n河大附中\t285036\n大三巴牌坊\t285037\ncoffe\t285038\nADF\t285039\n肤色\t285040\n0128\t285041\n撸炮\t285042\n乙酰\t285043\nnpr\t285044\n乐天固件\t285045\nhara\t285046\nroles\t285047\nwaf\t285048\n云东\t285049\n冯国璋\t285050\nvisualization\t285051\nLeap\t285052\nsmoothie\t285053\n国土局\t285054\n魔方\t285055\n褶裙\t285056\n高丽雅娜\t285057\n三国文史\t285058\n活食\t285059\nQNAP\t285060\n费控\t285061\n连身袜\t285062\n计步\t285063\n东芝\t285064\n徐凯\t285065\n永结\t285066\n0xc0150002\t285067\nCookies\t285068\n我们班\t285069\n私定\t285070\nsa1456c\t285071\nMPhil\t285072\n性故事\t285073\n柳橙网\t285074\n老气\t285075\nTier\t285076\n14229\t285077\n红层\t285078\n踏寻\t285079\n四寸\t285080\n唯亭镇\t285081\n志愿军\t285082\n言承旭\t285083\n中船重工\t285084\n民族性\t285085\n委贷\t285086\n贝雷\t285087\nEXTRA\t285088\nMicha\t285089\n索尔维会议\t285090\n线条画\t285091\n第99章\t285092\n移动通信\t285093\n这个国庆\t285094\n种田\t285095\n吃食\t285096\n翠星\t285097\n黑土豆\t285098\n259元\t285099\n在线翻译器\t285100\n杏梅\t285101\nAugust\t285102\nRabbit\t285103\n打笼\t285104\n遵义市人民政府\t285105\n1055\t285106\n终结的炽天使\t285107\nudev\t285108\n小分队\t285109\n京证\t285110\n北上广不相信眼泪\t285111\ncoloring\t285112\n蜂胶胶囊\t285113\n前置式\t285114\n木堂\t285115\n钉子\t285116\nWikidata\t285117\n华商三优\t285118\nDegrees\t285119\n聚兴\t285120\n1.07G\t285121\n卡托普利片\t285122\n医患\t285123\nX射线荧光光谱仪\t285124\n林二汶\t285125\n百搭\t285126\n263.net\t285127\n抓捕器\t285128\n法球\t285129\n血肉史\t285130\nFIGMA\t2
85131\n全铁\t285132\n易奇文化\t285133\n160605\t285134\n爱舒\t285135\nGrand\t285136\n普通本科高校\t285137\n色即是空2\t285138\n金都夏宫\t285139\n艾泽里特\t285140\n授权商\t285141\n专监\t285142\n承恩\t285143\nreproduction\t285144\n别发\t285145\n_群\t285146\nGrades\t285147\nSFC\t285148\nenquiry\t285149\n耐久度\t285150\n忽然\t285151\n茶节\t285152\n透射式\t285153\n光年之外\t285154\n26套\t285155\n制作人\t285156\n美赞臣奶粉\t285157\nGoldenGate\t285158\n内海\t285159\n黄州区人民政府\t285160\n金泰新理城\t285161\n六零年代\t285162\n谋杀者\t285163\n布批\t285164\nuniq\t285165\n华夏病理网\t285166\n新安街\t285167\n威肯\t285168\n烧身\t285169\n伊洛\t285170\n黑板鸭\t285171\n虎刺梅\t285172\n阿隆索\t285173\n维特鲁威\t285174\n轻便型\t285175\n银质针\t285176\nPy\t285177\n泵闸\t285178\n红教\t285179\n汇友\t285180\n京东读书\t285181\n鲁智\t285182\n流字\t285183\n舂\t285184\n血液病\t285185\n专利人\t285186\nvalued\t285187\n诉状\t285188\n董事长\t285189\n歼-16\t285190\n红山路\t285191\n非凡软件站\t285192\n起亚K2论坛_汽车之家论坛\t285193\n黄平\t285194\n崔智燕\t285195\ngal\t285196\n企业管理制度\t285197\n总后勤部\t285198\n20颗\t285199\n小庄\t285200\n九真武侠世界大穿越\t285201\n平水韵\t285202\n洞门\t285203\n强神\t285204\nMuseScore\t285205\n凤岗\t285206\n就是不是\t285207\n崩盘\t285208\n搜索算法\t285209\n爆米花机\t285210\ntolua\t285211\n影音先锋av资源网\t285212\nh900\t285213\nZ11\t285214\n有氧操\t285215\n一搏\t285216\n曲_\t285217\n甘州区\t285218\n特殊教育\t285219\nBUCK\t285220\n顾欣欣\t285221\n摩罗\t285222\n蛇叔\t285223\n共线性\t285224\nLightroom\t285225\n招财猫\t285226\nBlackout\t285227\n跨界歌王\t285228\n张家界市永定区人民政府\t285229\n彭措堪布\t285230\n消散\t285231\nYeti论坛\t285232\n中国电子系统工程第二建设有限公司\t285233\n福将\t285234\nMiKTeX\t285235\n21IC电源网\t285236\n安阳路\t285237\n强行\t285238\n光动能表\t285239\n五个人\t285240\n服务包\t285241\npororo\t285242\n温州市人力社保局\t285243\n张文华\t285244\n网宿\t285245\n北交\t285246\n骇浪\t285247\n金港大道\t285248\nshift键\t285249\n22种\t285250\n南信\t285251\n胆舒胶囊\t285252\n期货保证金\t285253\n米拉之家\t285254\n武吉\t285255\n初赛\t285256\n沈氏\t285257\n翻蛋\t285258\n超声波细胞破碎仪\t285259\n天谕羚羊\t285260\n2009-2013年\t285261\ni5-7300HQ\t285262\n撕毁\t285263\n邪凤\t285264\n内生变量\t285265\n表现性\t285266\n开瓶器\t285267\n9月1日起\t285268\n福州市连江县政府\t285269\n白石麻梨子\t285270\n暴风TV\t285271\nxltd\t285272\n立架\t285273\n马头岩\
t285274\nStudios\t285275\n墨连城\t285276\n厦门钨业\t285277\n北\t285278\n王功权\t285279\n中继线\t285280\nfbx\t285281\n硅\t285282\n硼砂\t285283\n新北方\t285284\n红叹号\t285285\n奥拉西坦\t285286\n武汉电子\t285287\n点价\t285288\n兴华大街\t285289\n湟水\t285290\n赫拉利\t285291\n天津大悦城\t285292\n艺考网\t285293\n2.6.3\t285294\n肠粉\t285295\n王人\t285296\n中塔式\t285297\n梁湘润\t285298\n12.56\t285299\n150cc\t285300\ngpp\t285301\n替硝唑\t285302\n涵道\t285303\nidiot\t285304\nADS\t285305\n蛋器\t285306\nMINA\t285307\n矩阵分析\t285308\nS9+_\t285309\n小白人\t285310\n洋桔梗\t285311\nexcessive\t285312\n缎带\t285313\n鳄梨\t285314\n溶酶\t285315\n糗事百科成人版\t285316\n红大\t285317\n宁国市政府\t285318\n圆点\t285319\n奉还\t285320\n聚烯烃\t285321\n木枷\t285322\n水喷雾灭火系统\t285323\n龙tory\t285324\nMASTER\t285325\n追究\t285326\n周颖\t285327\n绿地长岛\t285328\n龙坑\t285329\nfsd\t285330\nByb.cn-纯自然疗法\t285331\n百合文\t285332\n阪东\t285333\n五谷磨房\t285334\n十二章\t285335\n诈骗者\t285336\nchameleon\t285337\n飞航\t285338\nSKF轴承\t285339\n对脸\t285340\n福州理工学院\t285341\nSVT\t285342\n舒筋健腰丸\t285343\nedraw\t285344\nstacking\t285345\nUtah\t285346\nMX播放器\t285347\n河卵石\t285348\nxinji\t285349\n34567\t285350\n散学\t285351\n出理\t285352\n盛付通\t285353\n柏悦酒店\t285354\n三叉神经痛\t285355\n机套\t285356\n四十九式\t285357\nqty\t285358\n新作展\t285359\n上戏\t285360\n肤\t285361\n夏娜\t285362\n波打线\t285363\n选项栏\t285364\n陈波\t285365\n芬里尔\t285366\nfuntouch\t285367\n5|\t285368\n封闭式\t285369\n上海文化广场\t285370\n阿拉善盟\t285371\nワンダ\t285372\n哪一刻\t285373\nAllocation\t285374\nMIKE\t285375\npag\t285376\nndsi\t285377\nV4\t285378\nA02\t285379\n考号\t285380\n西工大附中\t285381\nosc\t285382\n徐申东\t285383\n3170\t285384\n企业破产法\t285385\n乘务\t285386\n四川省司法厅\t285387\nTurkish\t285388\n石狮市人民政府\t285389\n区分度\t285390\nDum\t285391\nCaleHsieh\t285392\n油麻地\t285393\n名叫\t285394\n良民\t285395\n上海浦发银行\t285396\nNUM\t285397\n康杰中学\t285398\nhhd\t285399\n压电写真机\t285400\n趣游\t285401\n23届\t285402\n去程\t285403\n校版\t285404\nOzmosis\t285405\ndok2\t285406\n陶子\t285407\n镜光\t285408\nWebpack\t285409\n雷克斯\t285410\n石门路\t285411\n91年\t285412\n荣耀Magic\t285413\n七月初\t285414\n五军\t285415\n流通证\t285416\n炫舞\t285417\n皮套\t285418\nTNT\t285419\nexcel吧\t2
85420\n家俱\t285421\n罗密欧\t285422\n好时机\t285423\n二维表\t285424\n淮阴师范学院\t285425\nScience\t285426\n死亡搁浅\t285427\n巨豪\t285428\n晋江\t285429\n代古拉k\t285430\nucbug\t285431\n小蛋\t285432\nDnF\t285433\n2018-03-17\t285434\nkfc\t285435\n李娴\t285436\n加精\t285437\n人形少女\t285438\n杭州市总工会\t285439\n加图\t285440\nmycafe\t285441\nX6\t285442\n1一9\t285443\n关八\t285444\n比拼\t285445\n嵌入性\t285446\n防溜\t285447\n_一道茶网\t285448\n骚猪\t285449\nsimpler\t285450\n关涛\t285451\n塞外\t285452\n马驹桥\t285453\n液压油泵\t285454\nmuse\t285455\nCocosCreator\t285456\n卖楼\t285457\n夜晚\t285458\n德杯\t285459\n福女\t285460\n收货员\t285461\n相别\t285462\n雪蜜\t285463\n加气混凝土设备\t285464\ne12\t285465\n郑州市民政局\t285466\n库图\t285467\n九个\t285468\n不精\t285469\n零线\t285470\n中证指数有限公司\t285471\n雨枫轩\t285472\n坭\t285473\n西面\t285474\n常任\t285475\n360N4\t285476\n95191\t285477\nrobotium\t285478\n别克君威\t285479\nFTISLAND\t285480\n22000元\t285481\n株式会社\t285482\nMercurial\t285483\nFeiQ\t285484\nw40\t285485\nprogramdata\t285486\n矿场\t285487\n展子\t285488\n灵液\t285489\n厦门医院\t285490\n暖爱\t285491\n刘志轩\t285492\n普尼\t285493\n65岁\t285494\n臻尚\t285495\nLucky锦\t285496\n厦门车管所\t285497\n法盟\t285498\n华龙网新闻中心\t285499\n业胎\t285500\nAVA\t285501\n龙版\t285502\n荷兰语\t285503\nsp4\t285504\n80C51\t285505\na1429\t285506\n安乐战场\t285507\n长安欧尚长安欧诺\t285508\n阿根廷队\t285509\n92斤\t285510\n兰溪路\t285511\n合肥市瑶海区政府\t285512\n贵州省人民医院\t285513\n小卒\t285514\n守寡\t285515\njpeglib\t285516\nSuperMap\t285517\n沈建华\t285518\n压控振荡器\t285519\n金生\t285520\n唯心不易\t285521\npathon\t285522\n赢商百科\t285523\n宠妻无度\t285524\n邓凯\t285525\nbreakfast\t285526\n丁敏君\t285527\n偃无师\t285528\n名字大全_公司\t285529\ngac\t285530\n包饭\t285531\n上海农科院\t285532\n配菜\t285533\n阴符\t285534\nADL\t285535\n万点\t285536\n孝敬\t285537\n虾米音乐人\t285538\n低估\t285539\n爱眼日\t285540\njingyan\t285541\n世荣兆业\t285542\n接插\t285543\n最美的光\t285544\n高倍镜\t285545\n桉树叶\t285546\n爱宠大机密\t285547\n超强悍\t285548\n红薯片\t285549\n双页\t285550\n7.69万元\t285551\n虎卫\t285552\n易动\t285553\n西树泡芙\t285554\nexcel版\t285555\n白魔法师\t285556\n战将\t285557\nmystic\t285558\n水桶\t285559\nDUTY\t285560\n吴中区\t285561\n桂视网\t285562\n.doc-文档投稿赚钱网\t285563\n孙中卡罗拉\t28
5564\n佳字\t285565\n巫九\t285566\n杰诺\t285567\n东方明\t285568\n东塔\t285569\n吉尼斯世界纪录\t285570\n中华人民共和国刑法\t285571\nbel\t285572\nCalvin\t285573\n一休哥\t285574\n天津铁道职业技术学院\t285575\n回迁\t285576\n外卖单\t285577\n美如画\t285578\n支气管镜检查\t285579\n捏捏\t285580\n160707\t285581\n异闻\t285582\n李再勇\t285583\n侨城\t285584\n奇幻夜\t285585\n墨宝非宝\t285586\nPHPStudy\t285587\n采选\t285588\n长安欧尚A800车友会_XCAR爱卡汽车俱乐部\t285589\n乾隆皇帝\t285590\n玉瓶\t285591\nEP07\t285592\n海外仓\t285593\n位运算\t285594\n耐力球\t285595\n新浪广\t285596\n呕心沥血\t285597\n题选\t285598\n邱县\t285599\n维族人\t285600\n迪奥\t285601\n人理\t285602\nHIPHOP\t285603\n砒霜\t285604\n建起\t285605\nfreitag\t285606\n苦旅\t285607\nNewPanderKing\t285608\n昊\t285609\n昱湖壹号\t285610\nr9sk\t285611\n好迷茫\t285612\n四川大学电子信息学院\t285613\n景阳春\t285614\n真全面\t285615\n澳门皇冠赌场\t285616\n345号\t285617\n拿图\t285618\n灵子\t285619\n大足在线\t285620\n谢瑶环\t285621\n多发病\t285622\n液压阻尼器\t285623\n搞怪\t285624\n绿湖\t285625\nV1.0安卓\t285626\n近百名\t285627\nao史密斯\t285628\n链家集团\t285629\n拿掉\t285630\n黄子华栋\t285631\nmat格式\t285632\n葫芦山庄\t285633\n楚留威朗\t285634\n522路\t285635\n导通角\t285636\nadder\t285637\n貔恘\t285638\n朔月\t285639\ncors\t285640\n压浆剂\t285641\n凸现\t285642\n20160928\t285643\n绞龙\t285644\n拳交\t285645\n诊断卡\t285646\nf567\t285647\n高级版\t285648\n山东协和学院\t285649\n清远市人民政府办公室\t285650\n相减\t285651\n藕塘\t285652\nIntel\t285653\n焕然\t285654\n回到未来\t285655\n管柱\t285656\n释迦牟尼佛\t285657\n记录数\t285658\nmatlab2016b\t285659\nviewcontroller\t285660\n套刀\t285661\n教工路\t285662\n米迪\t285663\nsetInterval\t285664\n居士林\t285665\n王力汽车网\t285666\na200\t285667\n宋鹏\t285668\n佘太君\t285669\n圣经旧约\t285670\n甘罗\t285671\n女儿村\t285672\n戴蒙德\t285673\nFlipboard\t285674\n颅内血肿\t285675\n广告字\t285676\n李冰洁\t285677\n极道\t285678\n鬼网\t285679\n东方妖妖梦\t285680\n鼻唇\t285681\nDans\t285682\n自然界\t285683\n改组\t285684\n香帅传奇\t285685\nParfum\t285686\nCreativity\t285687\n下流\t285688\n修真嫡女重生\t285689\n天王级\t285690\nh5py\t285691\n碎块\t285692\n猪尾\t285693\n计划员\t285694\n广西卫生职业技术学院\t285695\n江西干部网络学院\t285696\n科技周\t285697\ntrips\t285698\n白坭\t285699\nties\t285700\n20160427\t285701\n弹性模量\t285702\n2017NBA\t285703\n通灵珠宝\t285704\n三全食品\t285705\
n执政党\t285706\n6.3.6\t285707\n1000多名\t285708\n胜利大街\t285709\n膨胀机\t285710\n布鲁克斯\t285711\n大溪村\t285712\nultraEdit\t285713\n反弯\t285714\n遗嘱\t285715\n千岛酱\t285716\n波特兰\t285717\n摩根石\t285718\n过道\t285719\nMysql5.7\t285720\n飞流直下\t285721\n老僧\t285722\n兴全合宜\t285723\n小蛇\t285724\n门坎\t285725\n送转股\t285726\nstiffness\t285727\n天丝\t285728\n美平\t285729\n标物\t285730\n里斯本\t285731\n黎城\t285732\n国联安\t285733\n死亡人数\t285734\n农家肥\t285735\n武书连\t285736\n窝工\t285737\nv1.0_391K手机网\t285738\n亲妹妹\t285739\nCastrol\t285740\n连日来\t285741\n龚贤\t285742\nmoulding\t285743\n新形\t285744\n大华电子秤\t285745\n胡玮玮\t285746\n我我\t285747\n香港电影金像奖\t285748\n中国梦之声\t285749\n接片\t285750\nwast\t285751\nScotia\t285752\nアマカノ\t285753\n昌吉市人民政府\t285754\n融资性售后回租\t285755\n三联屏\t285756\n重庆统计信息网\t285757\n开河\t285758\n朝鲜电视台\t285759\nOffline\t285760\n晶锐论坛_汽车之家论坛\t285761\n网页游戏\t285762\n60天\t285763\nPopular\t285764\n中央大学\t285765\n单题\t285766\n宾得\t285767\n2.0a\t285768\n大话神仙\t285769\nWPA2\t285770\n库管员\t285771\nMATLAB技术论坛|Simulink\t285772\n召陵区\t285773\n连云\t285774\n集体性\t285775\n冯晓青\t285776\n20180304\t285777\n御街行\t285778\n40K\t285779\n中山东升\t285780\n哪件事\t285781\n消痘\t285782\n赫尔巴特\t285783\n原液\t285784\nNao\t285785\n全彩\t285786\nsaurs\t285787\n会计学习\t285788\nwp7吧\t285789\n精虫\t285790\n任督二脉\t285791\n水乙二醇\t285792\n时间秒\t285793\n客运西站\t285794\n大连高新区\t285795\n速派奇\t285796\n暗访\t285797\n2006年10月\t285798\n第141章\t285799\nOTR\t285800\n厦门市人民政府\t285801\n金鸡奖\t285802\n近道\t285803\n308nm\t285804\n国培计划\t285805\n北静王\t285806\nConfiguring\t285807\n遥不可及\t285808\n光暗\t285809\nBricks\t285810\n辛巴克\t285811\n第二十五批\t285812\nXXL-JOB\t285813\n国网安徽省电力公司\t285814\n远见卓识\t285815\n尿渍\t285816\n私人订制\t285817\n西比灵\t285818\n侠盗猎车手\t285819\n马腾\t285820\n31\t285821\n主宾\t285822\nSTAND\t285823\n时间格式转换\t285824\n台州九三_\t285825\n中国科学院大连化学物理研究所\t285826\n11773\t285827\nGilt\t285828\n中国物理学会\t285829\n魂斗罗:归来\t285830\n语文天地\t285831\n兰迪奥顿\t285832\ngif动图\t285833\n鹅不食草\t285834\n上海生命科学研究院\t285835\n折原\t285836\n色欲\t285837\n天堂岛之歌\t285838\n脐周\t285839\n水业\t285840\nC114\t285841\nphpMyAdmin\t285842\n170106\t285843\n科技含量\t285844\nHUMAN\t
285845\n试问\t285846\n热玛吉\t285847\ndatediff函数\t285848\n梁希\t285849\n甲状旁腺\t285850\n王兰\t285851\n北大附属实验学校\t285852\n重庆工商大学融智学院\t285853\n跟随者\t285854\npoem\t285855\n360智能摄像头\t285856\n文】\t285857\n三国-三国全集-三国\t285858\n長林風\t285859\nNK\t285860\nlh\t285861\n联系表\t285862\n最吃香\t285863\n吴涤清\t285864\n本田圭佑\t285865\n2206\t285866\n暖化\t285867\n文字类\t285868\n琴湖\t285869\n麦克罗伊\t285870\ncookie、session\t285871\n黑蜂\t285872\n赤霉酸\t285873\n惨无人\t285874\n五倍\t285875\n东江\t285876\nTGFuns\t285877\n1097\t285878\n浙商证券股份有限公司\t285879\n600755\t285880\nNET3.5\t285881\n环科\t285882\n南靖县\t285883\n噬天虎\t285884\nwww.tt27.tv\t285885\n双流\t285886\n五里桥\t285887\nTOR\t285888\n2017年8月4日\t285889\n林娜琏\t285890\n抽芯\t285891\n大史\t285892\n2000m\t285893\n荣华馆\t285894\n大商创\t285895\n破空\t285896\n企业观察网\t285897\n丹曦\t285898\n柳明\t285899\neuro\t285900\nLaozihao\t285901\n兴隆村\t285902\n块状物\t285903\n加急件\t285904\n普华永道会计师事务所\t285905\n孟利宁\t285906\nIMEI_\t285907\n唇型\t285908\n宁国\t285909\n3.60.03\t285910\n铁盒\t285911\n敬\t285912\n社旗县人民政府\t285913\n浙江农商银行\t285914\n沐歌\t285915\n未经\t285916\ncuDNN\t285917\n风云际\t285918\n30万元\t285919\n于洪区\t285920\n700米\t285921\n罗晋\t285922\n开封大学\t285923\n13.1.3\t285924\n完全相同\t285925\n表纸\t285926\n500余名\t285927\n子夜歌\t285928\n柳市镇\t285929\n弹劾\t285930\n鱼干\t285931\n抛出\t285932\nStreet\t285933\n和城乡建设厅\t285934\n宫内\t285935\n金圣圭\t285936\n仙公山\t285937\n滨海大道\t285938\n光复\t285939\n定向钻\t285940\n优普\t285941\n启皓\t285942\n2017年5月10日\t285943\n清溪\t285944\ndishonored\t285945\n维多利亚幼儿园\t285946\n承担\t285947\n法鲁\t285948\n同济大学汽车学院\t285949\npadding值\t285950\n笔底\t285951\ndione\t285952\n阿昆\t285953\n云南中公教育\t285954\n建荣\t285955\n中天大厦\t285956\n自绘\t285957\n百事快递\t285958\n新疆亚欧网\t285959\n7911\t285960\n那达慕\t285961\n丰乳肥臀\t285962\n7宗\t285963\n西瓜君\t285964\n力诺瑞特太阳能\t285965\nforwarded_for\t285966\n免序列号\t285967\n新基连\t285968\n未来式\t285969\n四川省发改委\t285970\n连战\t285971\n萨芬\t285972\n高跟鞋\t285973\n圣邪\t285974\n海猫\t285975\n秦大爷\t285976\n微型机\t285977\n华里\t285978\n铣槽\t285979\n20150808\t285980\n天才少女\t285981\n曲边\t285982\n无限挑战\t285983\n舞步\t285984\n馍馍\t285985\n肖迪\t285986\n12头\t285987\n易中天\t285
988\n亚洲健康网\t285989\n可持续\t285990\n虹猫蓝兔七侠传\t285991\n香川真司\t285992\nSafeNet\t285993\ntco\t285994\n战旗tv\t285995\n神仙\t285996\n老男孩之猛龙过江\t285997\nPac\t285998\n运销\t285999\n豌豆苗\t286000\n糖异生\t286001\n职级制\t286002\n故事大全网\t286003\n扬州树人学校\t286004\nRA3\t286005\n600种\t286006\n珍藏\t286007\n2012.12\t286008\n陈春华\t286009\nLODOP\t286010\n新浪浙\t286011\n赵婷婷\t286012\nshay\t286013\nKeyPress\t286014\n04008895580\t286015\n楼中\t286016\n美兰\t286017\n东京喰种\t286018\n第三类\t286019\n排水\t286020\n扮演\t286021\n冬去春来\t286022\n墩头\t286023\n婚配\t286024\nParity\t286025\nGMC\t286026\n吉普自由侠\t286027\n斯奇康\t286028\n歪瓜\t286029\n黄体囊肿\t286030\n政治家\t286031\n帚\t286032\n37种\t286033\n陈国栋\t286034\n双桥子\t286035\n封航\t286036\n树葬\t286037\nKawasaki\t286038\n果哥\t286039\n上海美术电影制片厂\t286040\n私募投资基金管理暂行条例\t286041\n朱卫东\t286042\nATFX\t286043\n胡国华\t286044\n指定项\t286045\n帮办\t286046\n金农\t286047\nVFS\t286048\n急性胃炎\t286049\n第69届\t286050\n纸鱼\t286051\n县政府党组\t286052\n上帝之城监狱帝国\t286053\n501\t286054\n莫桑\t286055\n闽东电力\t286056\nnfo\t286057\n商贷\t286058\n25CM\t286059\n\u001c维基\t286060\n过顶\t286061\n海内网\t286062\ndiscreet\t286063\n省文物局\t286064\nInvent\t286065\nrfid读写器\t286066\n千足虫\t286067\n聚能灶\t286068\n顺服\t286069\nV字仇杀队\t286070\n榫眼\t286071\n过午不食\t286072\n得路\t286073\n200ms\t286074\n宠物展\t286075\nygo\t286076\n蒌蒿\t286077\nmee\t286078\n催眠\t286079\ngdp\t286080\n淘达人\t286081\n达丰\t286082\n宝田\t286083\nCorpse\t286084\n二连浩特市\t286085\nNutrition\t286086\n二元期权\t286087\n巨擘\t286088\n县人民政府\t286089\nisla\t286090\nSausage\t286091\n魅族科技\t286092\n碧沙岗公园\t286093\n金山词霸_句库_整句翻译\t286094\n饭勺\t286095\nempathy\t286096\nSMT贴片\t286097\n九度\t286098\nAPO\t286099\n秀财网\t286100\n广视网\t286101\n原始凭证\t286102\n3亩\t286103\n伤命\t286104\nHOG\t286105\n第四局\t286106\n思涵\t286107\n加航\t286108\niso版\t286109\n曰子\t286110\n金砖四国\t286111\nQ300\t286112\n鸦鸿桥\t286113\nhourglass\t286114\n第二发\t286115\ndeca\t286116\nShades\t286117\n欲速\t286118\n欢歌\t286119\nZimbabwe\t286120\n老大妈\t286121\n上海理工\t286122\n小然\t286123\nplay2\t286124\n挫\t286125\n育儿观\t286126\n涉台\t286127\n佛罗里达大学\t286128\n40087\t286129\n西湖国宾馆\t286130\n室内设计师网\t286131\n插画\
t286132\n刷币\t286133\n安插\t286134\nGOTV\t286135\nplayarts\t286136\nRoyce\t286137\n顶花\t286138\nAged\t286139\n从无到\t286140\n密封油\t286141\n中电联\t286142\n金刚菩提\t286143\n50万公里\t286144\n边音\t286145\n大学生创新基础\t286146\n嘉伟\t286147\n精品酒店\t286148\n太阳能光伏板\t286149\n八千里\t286150\n黄山西海大峡谷\t286151\n妥当\t286152\n400多元\t286153\n赫德\t286154\n8090\t286155\n腰筋\t286156\nGTA5MOD\t286157\n举出\t286158\n快汇\t286159\nqq部落守卫战\t286160\n鼓捣\t286161\nMarine\t286162\n原子灰\t286163\n三宫六院七十二妃\t286164\np2.5\t286165\n福晶科技\t286166\n算题\t286167\nz572089387\t286168\n长廊\t286169\n分数线\t286170\n麦斯特\t286171\ndurow\t286172\n彩虹旗\t286173\n直升机\t286174\n经营业\t286175\n第26号\t286176\n细说红楼梦\t286177\nlinux服务器\t286178\n好事多磨\t286179\n外侨\t286180\n做派\t286181\n5436330283\t286182\n伊豆野菜村\t286183\n键\t286184\n湖师大\t286185\n中国好歌曲第二季\t286186\nKH\t286187\n直抵\t286188\n修为\t286189\n阵前\t286190\n1111网易\t286191\n赣州火车站\t286192\n防腐钢管\t286193\n换句话\t286194\n照片秀\t286195\n世贸\t286196\n扫地杆\t286197\n墓区\t286198\n未预期\t286199\n&&\t286200\npottermore\t286201\n披覆\t286202\n中山二路\t286203\n意境\t286204\n擦鞋\t286205\n波导股份\t286206\n验证集\t286207\n罗丝\t286208\nhifi\t286209\n隆升\t286210\nHRVP\t286211\n240亿\t286212\n杨雪\t286213\nSIGGRAPH\t286214\n字尾\t286215\npaimai\t286216\nRBC\t286217\n慕晴\t286218\n伤仲永\t286219\nsmark\t286220\n175万\t286221\n32&64\t286222\n幸福成长\t286223\n福特_新全顺\t286224\n亿吨级\t286225\n商工\t286226\n唛架\t286227\n三江侗族自治县\t286228\n气压表\t286229\n身心健康网\t286230\n消费税\t286231\n侵财\t286232\n咯哒\t286233\n筋络\t286234\nDOE\t286235\n出厂值\t286236\n大半\t286237\n质量检测\t286238\n合景天峻\t286239\n巨大\t286240\n成都高新\t286241\n整洁\t286242\n平顶山市人民政府\t286243\n环法\t286244\n血色浪漫\t286245\nobligation\t286246\n做错\t286247\n海航科技\t286248\nglew\t286249\n轰焦冻\t286250\n大鳄鱼\t286251\n律师证\t286252\n近千年\t286253\n海风\t286254\n1.5x\t286255\n猜对\t286256\nTNGA\t286257\n软膏剂\t286258\n2.5平方\t286259\n战舰波\t286260\n温敏\t286261\n96台\t286262\n杨高中路\t286263\n壮游\t286264\n驭房有术\t286265\n魔窟\t286266\n饥饿疗法\t286267\n国博中心\t286268\n液氢\t286269\n鼻息肉\t286270\n5芯\t286271\n北花园小区\t286272\n超级飞侠\t286273\nlloyd\t286274\n1万亿\t286275\nforbidden\t286276\nMSYS2\t286277
\nCC3200\t286278\n鱼料\t286279\n戏耍\t286280\n流里\t286281\n147\t286282\n力挫\t286283\n货业\t286284\n肖像权\t286285\n免关\t286286\n系统工程师\t286287\n足跟痛\t286288\n龙崖\t286289\n阿斯加德\t286290\n谋\t286291\nj5\t286292\n晰\t286293\n红薯饼\t286294\n葱叶\t286295\n|公里\t286296\n属间\t286297\n解放军台海军演\t286298\n代垫\t286299\n马骑\t286300\n极坐标与参数方程\t286301\n奥斯丁\t286302\n南洋理工\t286303\n350度\t286304\n主因\t286305\n四十级\t286306\nSHOP\t286307\n119路\t286308\n叠加器\t286309\n002570\t286310\n上实发展\t286311\n龙巅猛鱼\t286312\n编织袋\t286313\n肾宝片\t286314\n邻座\t286315\nDN80\t286316\nJeep自由侠\t286317\n疯蜜\t286318\n小老鼠上灯台\t286319\n魏然\t286320\n天蝎女\t286321\n石门森林公园\t286322\n日产骊威\t286323\nJAE\t286324\n2.64\t286325\n20170405\t286326\n脾氨肽口服冻干粉\t286327\n血型网\t286328\n南昌市公安局\t286329\n黑龙江站\t286330\n2017名\t286331\n玄米\t286332\n分数与除法\t286333\n冬日恋歌\t286334\n吞吐量\t286335\neabi\t286336\n大中华族谱网\t286337\n七莘路\t286338\n列号\t286339\n报账\t286340\n47平米\t286341\n眼尾\t286342\n尘埃2\t286343\n前世情人\t286344\n永陵\t286345\n万普\t286346\n土豪指数\t286347\nhiredis\t286348\n融侨中心\t286349\n2018年4月5\t286350\n迨\t286351\n索罗门\t286352\nGitee\t286353\n发型设计\t286354\n电动螺丝刀\t286355\n盛达矿业\t286356\n斑状\t286357\n票券\t286358\n南沙奥园\t286359\n大卫·贝克汉姆\t286360\n10000例\t286361\nmethanol\t286362\ntomcat端口号\t286363\n后事之师\t286364\n极路由B70\t286365\n裸图\t286366\n杜甫江阁\t286367\n2881064178\t286368\nipo\t286369\nip+\t286370\n曙光村\t286371\n风景石\t286372\n内螺纹\t286373\n未然\t286374\n维尔利\t286375\nAutoCAD2012\t286376\n3.15\t286377\n茶竹永川网\t286378\n前方\t286379\nINDIRECT\t286380\n德巴\t286381\n广州中国银行\t286382\n分毫\t286383\n咬文嚼字\t286384\n巴彦淖尔市人民政府\t286385\nMantis\t286386\n快乐星球\t286387\n并行\t286388\n五2\t286389\n车澈\t286390\n2018年2月15日\t286391\n边界\t286392\n花绮罗\t286393\n栾贝贝\t286394\n班主\t286395\nfrm\t286396\n无动于衷\t286397\n匹夫无罪\t286398\n失道寡助\t286399\nN5S\t286400\nMolecules\t286401\n澳门豆捞\t286402\nhhu\t286403\n拼途旅讯\t286404\niPhone6/6Plus\t286405\n象山公园\t286406\n沼液\t286407\n微擎应用商城\t286408\n摘掉\t286409\n应用心理学\t286410\nuploading\t286411\n区水利局\t286412\n霸上\t286413\n恃宠\t286414\nWindowsApps\t286415\nmm2\t286416\n东海岸新城\t286417\n学周易起名网\t286418\n李玉妹\t286419\n上海
便民网\t286420\n茅台酒\t286421\n固执\t286422\n中南世纪\t286423\n淮海中路\t286424\n1304412\t286425\n仟吉\t286426\n生态旅游\t286427\n女将军\t286428\n杀年猪\t286429\n王虹\t286430\n白莲\t286431\n截稿\t286432\n海螺丝\t286433\n吱呀\t286434\n桂花茶\t286435\n条锈病\t286436\nJersey\t286437\nValidated\t286438\n克山\t286439\nLinux虚拟机\t286440\n制品厂\t286441\n第96集\t286442\n欧阳曹德旺\t286443\n600547\t286444\n雪铁龙c6\t286445\nKendrick\t286446\n沃通\t286447\n海淀街道\t286448\n采取\t286449\n邪态\t286450\n红糖姜汤\t286451\n数控编程\t286452\nGUO\t286453\nSQL_Server\t286454\n一品\t286455\n视网\t286456\n哒哒哒\t286457\nV2.1.0\t286458\n杰成\t286459\n外域\t286460\n可数\t286461\n杨振\t286462\nKDA\t286463\n郭松\t286464\nthen\t286465\njrs们\t286466\n片内\t286467\n岑玉海\t286468\n看\t286469\n黄袍加身\t286470\n蔡集镇\t286471\nOno\t286472\n王家河\t286473\n刘德明\t286474\n勤哲\t286475\n郑州市城市管理局\t286476\niphine\t286477\n观澜版画村\t286478\n致远星\t286479\n江南逢李龟年\t286480\n明园\t286481\nNs\t286482\n13_\t286483\nsummer0319\t286484\n石林县\t286485\n分不清\t286486\n电信版\t286487\n临床医学\t286488\n刘玉玲\t286489\npencil\t286490\n驾驶分\t286491\nasci\t286492\n窥情\t286493\n芽菜\t286494\n装饰音\t286495\n后套\t286496\n王建斌\t286497\n军事医学科学院\t286498\nweiqi\t286499\n广东省审计厅\t286500\n书龄\t286501\nOpenOCD\t286502\n80粒\t286503\nQueen\t286504\n昆明西山区\t286505\n中国律师事务所\t286506\n郭蓉\t286507\n叶宁\t286508\n龙光玖龙玺\t286509\n浙江音乐学院\t286510\nUIBezierPath\t286511\n10集\t286512\n513\t286513\n全线\t286514\n受益\t286515\n按说\t286516\n牛津阅读树\t286517\nTHEN\t286518\n智明星通\t286519\n为什么不买\t286520\n五台山机场\t286521\n京瓷1800\t286522\n10.07\t286523\n中华人民共和国计量法\t286524\n鹊桥仙\t286525\n74页\t286526\n铁桶\t286527\n华灵路\t286528\n补登\t286529\n招摇\t286530\n伏见稻荷大社\t286531\n纷赋\t286532\nWonderland\t286533\n文笔\t286534\n查实\t286535\nbukkit\t286536\n空调扇\t286537\n广交会\t286538\n网金\t286539\nconsiderate\t286540\n斯玛特卡\t286541\ni版\t286542\n小和尚\t286543\n7次\t286544\n链家自如\t286545\n12例\t286546\n玉山一中\t286547\n2500年\t286548\n单刀\t286549\nforgive\t286550\n多只\t286551\njis\t286552\n巨腾\t286553\nmane\t286554\n中山纪念堂\t286555\n澄迈县\t286556\nPeeing\t286557\nPHPWAMP\t286558\n博雅德州扑克\t286559\n西街44号\t286560\nrhinoceros\t286561\n添花\t286562\n鼻翼\t
286563\n湖北省环保厅\t286564\n1337\t286565\n提亚\t286566\ndeutsch\t286567\npolo\t286568\n小牛n1s\t286569\n标字\t286570\nPTA\t286571\n雷锋源\t286572\n飞伦\t286573\nTECHNOLOGIES\t286574\n也需\t286575\n双优\t286576\n几家\t286577\n早逝\t286578\n工业清洗剂\t286579\n姑妄言\t286580\n5.6亿\t286581\n赚吧\t286582\n第79号\t286583\n杰克苏\t286584\n二系\t286585\n无此人\t286586\n线性\t286587\n易连慎\t286588\n九头身\t286589\ndroplet\t286590\n837\t286591\n表姐妹\t286592\n异空间\t286593\n梁永军\t286594\n死缓\t286595\n车尔尼雪夫斯基\t286596\nv4.7\t286597\n复得\t286598\n斗车\t286599\n太平洋影城\t286600\n我爱购物网\t286601\nbishe\t286602\n干热岩\t286603\n订单险\t286604\n成功路\t286605\n估车\t286606\n女魔法师\t286607\n吉他谱C调吉他弹唱谱\t286608\n香儿\t286609\n禅城区人民法院\t286610\n薄\t286611\n胆怯\t286612\n村田雄介\t286613\n巴音郭楞州\t286614\n1.0.2_\t286615\n国象联盟\t286616\n剥皮机\t286617\n间歇泉\t286618\n检查机\t286619\n娄婧\t286620\n中国氢能源网\t286621\n開始\t286622\nchar\t286623\n山青\t286624\nPSD模\t286625\n塞来昔布\t286626\n一二三档\t286627\nX7R\t286628\n4G网络\t286629\n句字\t286630\npy3\t286631\ncoso\t286632\n522号\t286633\n韬光养晦\t286634\n爱洛\t286635\nW3\t286636\n太平国家森林公园\t286637\n乳酸菌素片\t286638\nCartridge\t286639\n低于\t286640\n北京出入境管理办事大厅\t286641\n三钱\t286642\n红源\t286643\nQQ部落守卫战\t286644\n李伯重\t286645\n自考办\t286646\n95.5\t286647\n庆区\t286648\n禾雀花\t286649\n玛格丽特·杜拉斯\t286650\n四方股份\t286651\nランス\t286652\nX1Carbon\t286653\nAtari\t286654\nsupercell\t286655\n已婚男\t286656\n终焉的银河\t286657\n下降率\t286658\n支\t286659\n学院路\t286660\nENCINO\t286661\n杨冰阳\t286662\n石兽\t286663\n半牙\t286664\n土木在线论坛\t286665\n布里渊区\t286666\n三字经儿歌\t286667\n三门镇\t286668\n汉地\t286669\n省电\t286670\n装底\t286671\n华外\t286672\n环顾\t286673\nBD1080P+BD720P\t286674\n上臂\t286675\n普遍\t286676\ncmy\t286677\n塞琉古\t286678\n东吴镇\t286679\nd3d11.dll\t286680\n梅列区\t286681\n锡城\t286682\nETUDE\t286683\n博睿\t286684\nHD650\t286685\n2012r2\t286686\n肖江\t286687\n特征峰\t286688\nCROSS\t286689\n凯勒\t286690\n梗犬\t286691\n稿件\t286692\n阿兹卡班\t286693\nEasing\t286694\n12安\t286695\n转浮\t286696\n平距\t286697\nbdp\t286698\n洗车行\t286699\n提前\t286700\n普济方\t286701\n开挂\t286702\nout_of\t286703\n水站\t286704\n普森\t286705\n孙小宝\t286706\n女演\t286707\n瓦王\t286708\n极恶
非\t286709\n50万辆\t286710\nbgcolor\t286711\n限制性\t286712\n幼托\t286713\n北京城建集团\t286714\n寒烟\t286715\n蓝雪花\t286716\n张善友\t286717\n托马斯\t286718\nMAC机\t286719\n慈诚\t286720\nbrushed\t286721\n护宝联盟\t286722\nBend\t286723\n民办职业培训学校\t286724\n强调要\t286725\n孟盛楠\t286726\n黑龙江省人民政府办公厅\t286727\ndocdoc\t286728\ncss/js\t286729\nav影院\t286730\n002640\t286731\n2252\t286732\n数据画\t286733\n电视卡\t286734\n深具\t286735\n会计记账\t286736\n石悦\t286737\n途珍网\t286738\n150ml\t286739\n17名\t286740\n衡水职业技术学院\t286741\n君九龄\t286742\n海龙国际\t286743\n索菲特酒店\t286744\n上海社区服务网\t286745\n滕\t286746\n泰兴市政府\t286747\n惠新西街北口\t286748\nA12\t286749\ngxg\t286750\n尘子\t286751\n奇瑞汽车股份有限公司\t286752\n海豚湾恋人\t286753\n中国电信北京研究院\t286754\n三渡\t286755\n波妞\t286756\n828首\t286757\nfim\t286758\n石景山游乐园\t286759\n艾玛沃特森\t286760\n絶頂\t286761\nG63\t286762\n2t\t286763\n内方\t286764\n花架子\t286765\n离宫\t286766\n营口市\t286767\n剑川路\t286768\n京劳社\t286769\n镁客网\t286770\n骨头汤\t286771\ngeotools\t286772\n结为夫妻\t286773\n宣璐\t286774\n武夷学院\t286775\n亲兄弟姐妹\t286776\nyexiao\t286777\n五星\t286778\n找服\t286779\n资阳市\t286780\n宫锁心玉\t286781\n丁美婷\t286782\n心脏搭桥手术\t286783\n六级考试\t286784\n北京市十一学校\t286785\nFileInputStream\t286786\n凤凰岛\t286787\n123456789\t286788\n强震\t286789\n矿物质\t286790\n一周三\t286791\n樟树市\t286792\n久盛地板\t286793\n乱步奇谭\t286794\n刺耳\t286795\n绀野光\t286796\nFORCE\t286797\n新湾\t286798\n秦陵兵马俑\t286799\n小王妃\t286800\n虎兔\t286801\n13.3英寸\t286802\napqp\t286803\nRhino5\t286804\n老伴儿\t286805\n我的绝色总裁未婚妻\t286806\n126nw\t286807\ncartier\t286808\n65R15\t286809\n尼康d810\t286810\n单亲家庭\t286811\n数学化\t286812\n菜子\t286813\n高层建筑\t286814\n农业科技园区\t286815\n贸易顺差\t286816\n人保\t286817\n环城东路\t286818\nalun-chen\t286819\n炼锌\t286820\n汽车南站\t286821\n世茂广场\t286822\n科林斯\t286823\ndirective\t286824\n大柳树\t286825\n肯德基门\t286826\n中信红木家具\t286827\n桐桐\t286828\n云松\t286829\n悟性\t286830\n蓝桶\t286831\noutlets\t286832\n复方土槿皮酊\t286833\n四棱台\t286834\n值守访问\t286835\nAngela\t286836\n3d杀号\t286837\n学习篇\t286838\n美制\t286839\nbusinesses\t286840\n大学校\t286841\n钓况\t286842\nTorrEent-720P.MKV\t286843\n波帅\t286844\n中工国际工程股份有限公司\t286845\n夜夜啪啪在线影院\t286846\n巴基斯塔\t286847\n多元回归
方程\t286848\n2小时前\t286849\n颐景园\t286850\n乡关\t286851\nuniform\t286852\n大衣哥\t286853\n市交通运输局\t286854\n单体泵\t286855\n晋察冀\t286856\n粤语在线翻译\t286857\n一扎\t286858\n玷污\t286859\n斯卡保罗集市\t286860\n电锯惊魂5\t286861\n将校\t286862\nJUnit4\t286863\n弈剑听雨阁\t286864\nThief\t286865\ndmv\t286866\n小姐诱心\t286867\n铅皮\t286868\n桑妮\t286869\nhay\t286870\n九味\t286871\n调绘\t286872\n8470p\t286873\n桓台县\t286874\n京西十八潭\t286875\n马掌\t286876\n浮屠\t286877\n贞女\t286878\n省人力资源社会保障厅\t286879\n伤者\t286880\n林之灵\t286881\n同剧\t286882\n_陵县\t286883\n妆容\t286884\n1罐\t286885\n珍珠丸\t286886\n社群\t286887\nSurgeon\t286888\nyanni\t286889\nCubase\t286890\n南湖小学\t286891\n东山区\t286892\n龙行天下\t286893\n齐齐哈尔大学\t286894\n小河流水\t286895\n小建中汤\t286896\n郭阳\t286897\n大芬村\t286898\n若即若离\t286899\n教皇\t286900\n台湾电视台\t286901\n鬼吹灯之精绝古城\t286902\n东方能源\t286903\n所以然\t286904\n卫冕\t286905\n中原特钢\t286906\n全画幅镜头\t286907\n拿破仑战争\t286908\nOpera浏览器\t286909\nPillow\t286910\n第A07\t286911\n完澡\t286912\n金特\t286913\n遴选考试网\t286914\n山东新华医疗器械股份有限公司\t286915\n宜君县\t286916\n海军版\t286917\neosm\t286918\n毛坦厂镇\t286919\n猎豹CS9\t286920\n30所\t286921\nentrance\t286922\n30天左右\t286923\n线夹\t286924\n牺牲者\t286925\n外培\t286926\n左拥\t286927\n真美丽\t286928\n威斯克\t286929\n锈死\t286930\n失业率\t286931\n坠毁\t286932\n浙江省电力公司\t286933\n粘单\t286934\nSpeakers\t286935\n返回式\t286936\n磨憨\t286937\n家装\t286938\n大卖\t286939\n灾民\t286940\n棉涤\t286941\nHunter\t286942\n飘\t286943\n庄臣\t286944\ncopilot\t286945\n十三州\t286946\n天京\t286947\n杭州东方中学\t286948\n十一周\t286949\n诱变\t286950\n新课\t286951\n前行者\t286952\n围坐\t286953\n太华\t286954\n花簪\t286955\n走心\t286956\nNetCore\t286957\n外置内存卡\t286958\n亚太股份\t286959\n廊坊站\t286960\nM202dw\t286961\n桑植\t286962\n9月13日\t286963\nel-tree\t286964\n天才吧\t286965\ne50\t286966\n长乐区\t286967\n嘉里建设广场\t286968\n阳刃\t286969\n金毛犬\t286970\n邵阳市区\t286971\n秸秆还田\t286972\n水陆画\t286973\n兽耳\t286974\n等价\t286975\ncognac\t286976\n益气养血口服液\t286977\n非同源染色体\t286978\n孟狐狸\t286979\n斩白\t286980\n宝兰\t286981\n重庆市安监局\t286982\n五发\t286983\n吴哲晗\t286984\n陈沁\t286985\n不成人\t286986\n冯先生\t286987\n矿渣\t286988\n怒发\t286989\n挖断\t286990\nchords\t286991\n发信\t286992\n首杀\t286993\n激励源\
t286994\n料网\t286995\n成都平原\t286996\n球蛋白\t286997\n小畜\t286998\n1495\t286999\n窝沟封闭\t287000\n迄今为止\t287001\nracing\t287002\n江苏省产业技术研究院\t287003\n玻璃鞋\t287004\ncounseling\t287005\nnegative\t287006\nmousedown\t287007\n爱思英语学习网\t287008\n乐霸\t287009\n天庆\t287010\n血溅\t287011\n正屏\t287012\n代目\t287013\n十翼\t287014\nOpenCV3编程入门\t287015\n1.0.25\t287016\n25天\t287017\nLumi\t287018\n信阳市水利局\t287019\n科里斯\t287020\n山东省教育厅\t287021\nCECS\t287022\n看不_\t287023\n红绿色\t287024\n辽宁队\t287025\n袖阀\t287026\n2016年4月30日前\t287027\n2.8亿元\t287028\n上海高架桥\t287029\nldap\t287030\n报班\t287031\n石桌\t287032\n金银花露\t287033\n徐静波\t287034\n临死前\t287035\n张沫凡\t287036\n碳包\t287037\n箱线\t287038\n多界面\t287039\n裸拍\t287040\nf327\t287041\n大庆师范学院\t287042\n天易成\t287043\n美国专利商标局\t287044\n固态硬盘分区\t287045\n天天操综合网\t287046\nsift算法\t287047\n喀什\t287048\n鬼鬼祟祟\t287049\n甲级建筑\t287050\n起亚KX7\t287051\n声功\t287052\n碳化物\t287053\n祸\t287054\n巨木\t287055\n唇膏\t287056\n聚享游\t287057\ndnd\t287058\nshuqi\t287059\n商品学\t287060\nkou\t287061\n三会一课计划\t287062\n师生年\t287063\npyhthon\t287064\n创业网\t287065\n汽车城\t287066\nIntranet\t287067\n五月一号\t287068\n手机百度\t287069\n挖煤\t287070\n无限凉山\t287071\nnba2k10\t287072\n邵敏\t287073\n三合镇\t287074\n一家之主\t287075\n保利中央公馆\t287076\n增长幅度\t287077\n憨豆先生\t287078\n暗涵\t287079\n一个礼拜\t287080\n内痔\t287081\nFruit\t287082\n春深\t287083\n李翰祥\t287084\n提臀\t287085\n导热油\t287086\n代笔\t287087\nLog4J\t287088\n浙大英语\t287089\n3dmax样条线\t287090\n艺圃\t287091\nSDN\t287092\n挖沙\t287093\n纤维蛋白原\t287094\n贤集网\t287095\n农药登记证\t287096\n石泰峰\t287097\n琶洲\t287098\n甘婷\t287099\nNumb\t287100\n2500k\t287101\n时序逻辑电路\t287102\n会试\t287103\n百年战争\t287104\n1631\t287105\n哈啤\t287106\nnr\t287107\n出奇制胜\t287108\n暂行办法\t287109\n苏州广播电视总台\t287110\n上学期间\t287111\n电子街\t287112\n哈尔滨工业大学》\t287113\n艾兴\t287114\n内马王石\t287115\nPolka\t287116\nue4吧\t287117\n周公馆\t287118\nXUE\t287119\n致匠心\t287120\nexplicit\t287121\nHolder\t287122\n抛体运动\t287123\n香薰\t287124\n2016年11月28日\t287125\njconsole\t287126\n安灯\t287127\nproducers\t287128\n孤胆英雄\t287129\n上海实业(集团)有限公司\t287130\n成都商\t287131\n暗淡\t287132\n讲一讲\t287133\n2.8GB\t287134\n罗马里奥\t287135
\n武威路\t287136\n刘俊\t287137\n宁波市北仑区招投标中心\t287138\n环球旅讯\t287139\nstw\t287140\n黄格\t287141\nBags\t287142\n身兼\t287143\n厄尔尼诺现象\t287144\n陕北\t287145\n潘倩倩\t287146\n1.2.0_\t287147\neleme\t287148\n派遣\t287149\nWord2010\t287150\n后发\t287151\n台北市区\t287152\nGQ\t287153\nDunn\t287154\nTerrain\t287155\n2700元\t287156\nb站up主\t287157\n约书亚记\t287158\nSJTU\t287159\n教务管理信息系统\t287160\n9本\t287161\n4480\t287162\n小析\t287163\n卤猪\t287164\n兰纳足贴\t287165\n215号\t287166\n卡尼古拉\t287167\n471G\t287168\nwifi共享精灵\t287169\n货道\t287170\n橘红色\t287171\n张雅丹\t287172\n刷票器\t287173\nmegan\t287174\n擦身\t287175\n超声波雾化器\t287176\nmyclipse\t287177\nWinRar\t287178\n2018-02-06\t287179\n官方语言\t287180\n巴比兔\t287181\n土门关\t287182\n中诺\t287183\nP3P\t287184\n52PK英雄联盟\t287185\n南康市\t287186\n菠菜汁\t287187\n览海投资\t287188\n东营站\t287189\n能言善辩\t287190\n数据服务网\t287191\n梅露露\t287192\n品牌管理网\t287193\n20160115\t287194\n傅立叶\t287195\nυ\t287196\nabac式\t287197\n大美儿童世界\t287198\n南塘村\t287199\n监控\t287200\n诀别\t287201\nGrass\t287202\n天上人\t287203\n铨\t287204\nprogress\t287205\n偏紧\t287206\n寺岗\t287207\nvariants\t287208\n查询_火车网\t287209\n被打乱\t287210\n张筱\t287211\nsoundcloud\t287212\n右击\t287213\n丹泽尔华盛顿\t287214\n第5步\t287215\n行费\t287216\n农业知识网\t287217\n垄断\t287218\n油炸机\t287219\nmacross\t287220\nMFI认证\t287221\n紫标\t287222\n口服液\t287223\n护养\t287224\n第9层\t287225\n王恒亮\t287226\n勺景\t287227\n狮吼功\t287228\n专卖房\t287229\n专硕学硕\t287230\n同业网\t287231\nSucks\t287232\n思乐\t287233\n大楚\t287234\n12吨\t287235\n抱拳\t287236\n碧螺春\t287237\n胆小如鼠\t287238\n助贷网\t287239\n1亿日元\t287240\n纪念会\t287241\n5146\t287242\n第二天早上\t287243\nFlashback\t287244\n五天后\t287245\n20180108\t287246\n做号\t287247\n试炼场\t287248\nInfluxDB\t287249\nHDV\t287250\njiyu\t287251\n被剥夺\t287252\n行动\t287253\np18\t287254\nFlash插件\t287255\n色情男女\t287256\n0917\t287257\n荒草\t287258\n千柏鼻炎片\t287259\n小牛m1\t287260\n济青高速公路\t287261\n龙脉\t287262\n墨尔\t287263\n金恒\t287264\n中国粉体网\t287265\ndiscipline\t287266\nfiction\t287267\n过筛\t287268\n60mg\t287269\n桃花运\t287270\n不近人情\t287271\nseaborn\t287272\n重生之国民男神\t287273\n沃通CA\t287274\n陈祥\t287275\n超神篇\t287276\nnash\t287277\n挤\t2
87278\nxiaohuazi\t287279\n深圳国际交流学院\t287280\n政府论\t287281\n茶陵政府\t287282\n徐圩新区\t287283\n单纯疱疹\t287284\n妈宝男\t287285\n华云\t287286\nlune\t287287\n野人\t287288\n着凉\t287289\nAndroid.mk\t287290\n分属\t287291\nad原理图\t287292\n慧鱼\t287293\n姬野\t287294\nPRADO\t287295\n宫颈\t287296\n促销员\t287297\n畜生道\t287298\n徐小平\t287299\n瑞林\t287300\n吴晓军\t287301\nAnaconda环境变量\t287302\n王志飞\t287303\n难民潮\t287304\n清远日报\t287305\n狗雨\t287306\n闭所\t287307\n国科大\t287308\n桌贴\t287309\n领证\t287310\n人梯\t287311\n流浪猫\t287312\n荣耀V9\t287313\n吹扫\t287314\n附分\t287315\n止观\t287316\nDover\t287317\narping\t287318\nmspaint\t287319\n大客车\t287320\n通达动力\t287321\n5000K\t287322\n中山大学附属中学\t287323\n怀来\t287324\n娇兰帝皇蜂姿\t287325\n缸垫\t287326\n借以\t287327\n7型\t287328\n蟠桃园\t287329\n小荣\t287330\n朴佳琳\t287331\n天波杨府\t287332\n天龙sf\t287333\n徼\t287334\n辛辣\t287335\n云南招商网\t287336\nunitest\t287337\n广东科学技术职业学院\t287338\n华山松\t287339\n八尺\t287340\n70所\t287341\n商社\t287342\n指点\t287343\n居里\t287344\ndata函数\t287345\n文博\t287346\n黑龙江省商务厅\t287347\n北海经济开发区\t287348\n中国力学学会\t287349\n北京翻译公司\t287350\n隋朝大运河\t287351\n雄蜂\t287352\nqatar\t287353\n河北省安监局\t287354\n董里\t287355\n灯长\t287356\nfom\t287357\n罗技g403\t287358\n私家侦探\t287359\n雪中悍刀行\t287360\n杀光\t287361\nzend\t287362\ncaoporn\t287363\n黄花鱼\t287364\n巴黎世家\t287365\nout文件\t287366\n弹跳式\t287367\n十二岁\t287368\n经查\t287369\noppor9m\t287370\n彩页\t287371\n汉水\t287372\n邵雪聪\t287373\n尖牙\t287374\n粘钢\t287375\npaf\t287376\n主管\t287377\n齐升\t287378\n德迅\t287379\n烟气脱硝\t287380\n阿拉比卡\t287381\n法兰克穆勒\t287382\n花都新区\t287383\n三六\t287384\n大吃大喝\t287385\n棉鞋\t287386\nkitematic\t287387\n随它吧\t287388\n636\t287389\n玉米糖\t287390\n野原\t287391\n上海证监局\t287392\n吊篮\t287393\n达尔文主义\t287394\n龙凤店传奇\t287395\n3m\t287396\n2m\t287397\n逐点\t287398\n茉莉牌局\t287399\nPee\t287400\n2016年10月1日\t287401\n毛泽\t287402\n父母们\t287403\nGATT\t287404\n清流中学\t287405\nvmlinux\t287406\n跳转回\t287407\n菇娘\t287408\njsonstring\t287409\n电脑端\t287410\n肌病\t287411\nTraverse\t287412\n胶布\t287413\n桃花庵歌\t287414\n武汉理工大学材料科学与工程学院\t287415\n黄石东路\t287416\n2018年1月20日\t287417\n内蒙古自治区卫生和计划生育委员会\t287418\nc85\t287419\n主理人\t287420\n隋文帝\t28
7421\n百夫长\t287422\n机械臂\t287423\n代码段\t287424\n谦少\t287425\n宴遇\t287426\n通航产业\t287427\n旗帜鲜明\t287428\n兼得\t287429\n还有什么\t287430\negf\t287431\n武状元苏乞儿\t287432\n713\t287433\n五菱宏光s3\t287434\n缴毒\t287435\n江山文学网\t287436\n炎龙侠\t287437\nestimate\t287438\n索敌\t287439\n广东交通职业技术学院\t287440\n61元\t287441\n帐下\t287442\n准备金\t287443\n聚乙烯管\t287444\n荷味\t287445\n末世孤雄\t287446\n4千米\t287447\n锦旗\t287448\n索求\t287449\n玛莎拉蒂莱万特\t287450\n湖北旅游\t287451\n暴走漫画—暴走大事件\t287452\n详文\t287453\n2800亿\t287454\n仲村美宇\t287455\n正邦集团\t287456\nhoe\t287457\n集團\t287458\n人民公敌\t287459\n中华职业教育社\t287460\n张苏\t287461\n拉倒\t287462\ncjod\t287463\n陈雍\t287464\n多美好\t287465\n环球时报社\t287466\n全刊\t287467\n潘帅\t287468\n50兆\t287469\n短道速滑\t287470\n维特根\t287471\nhomebridge\t287472\n签发地\t287473\nPore\t287474\n研讨会\t287475\n海盐\t287476\n滋生\t287477\n接种率\t287478\n保定市人力资源和社会保障局\t287479\n四大类\t287480\nContributions\t287481\n伊宁县\t287482\ncursor\t287483\nsek\t287484\n大成殿\t287485\n江西省国土资源厅\t287486\n兰州宏刚\t287487\n中关村大街\t287488\n32套\t287489\n百山夕养生网\t287490\n从善\t287491\n留薪\t287492\n疤面煞星\t287493\n猫头哥\t287494\n帕拉\t287495\n优训网\t287496\n避风塘奶茶\t287497\n诛仙吧\t287498\n远在天边\t287499\n戴尔·卡耐基\t287500\n南师附小\t287501\n000709\t287502\n美团券\t287503\n风钻\t287504\n智能摄像头\t287505\nX64位\t287506\n雁塔路\t287507\n银医\t287508\ntar.gz包\t287509\n别册\t287510\n底衣\t287511\npunished\t287512\n集成友盟\t287513\n民主与法制网\t287514\n新东方学校\t287515\n卡车之家论坛\t287516\nlmf\t287517\n阶跃\t287518\nCollected\t287519\n中国公路工程咨询集团有限公司\t287520\n东浦镇\t287521\numat\t287522\n国民大革命\t287523\n百鸣\t287524\n南天pr2e\t287525\ngenomics\t287526\n屏边县\t287527\nSUS\t287528\n暹罗广场\t287529\n发片\t287530\n凶兽\t287531\n一轮回\t287532\n控股有限公司\t287533\n曲种\t287534\n39次\t287535\n昆仑决\t287536\n排灯\t287537\n商管\t287538\nbintray\t287539\n2015下半年\t287540\n归正\t287541\n港币\t287542\n杨子\t287543\n裸体\t287544\n淮河路步行街\t287545\n小灰\t287546\n7.35PTR\t287547\nstorj\t287548\n内马尔\t287549\n大儒\t287550\n生命线\t287551\n期货交易员\t287552\n南派地下城与勇士\t287553\n郑家榆\t287554\n埇桥区\t287555\n25期\t287556\nris\t287557\n克洛诺斯\t287558\nv客\t287559\nBabylon\t287560\n主受文\t287561\n徐建新\t287562\n暴奸\t287563\n持票人
\t287564\n石家庄市安全生产监督管理局\t287565\ndrowning\t287566\nmetart\t287567\nweb_study\t287568\n生父\t287569\nPS卡\t287570\nrx3\t287571\n世纪瑞尔\t287572\n美母\t287573\n节段\t287574\n17_\t287575\n240\t287576\nSeDog\t287577\n凯丰\t287578\n化学分析\t287579\n魔兽地图联盟\t287580\nEthyl\t287581\n冒气\t287582\n教法\t287583\nMartin\t287584\n8平米\t287585\n永鸿\t287586\nMic\t287587\n金骏眉红茶\t287588\ncocoapods\t287589\n联席会\t287590\n不想走\t287591\nciji\t287592\n杨卫东\t287593\n竜\t287594\n少荃湖\t287595\n皮包公司\t287596\n抹刀\t287597\nPIECE\t287598\n重庆教研网\t287599\n保理融资\t287600\n中兴V5\t287601\n320公里\t287602\n七中\t287603\n包头医学院\t287604\n中古战锤\t287605\n中民筑友\t287606\njpgraph\t287607\n梅陇中学\t287608\n编剧们\t287609\ngeode\t287610\n桃花源\t287611\n海豚加速器\t287612\n冰舞\t287613\n翠屏山\t287614\n派拉蒙\t287615\nJADE\t287616\n11公里\t287617\nEDIAGD\t287618\n上古卷轴5ece\t287619\n吠叫\t287620\nbile\t287621\n男人们\t287622\n联合国世界旅游组织\t287623\n末项\t287624\ninjure\t287625\nkido\t287626\n四川省农业科学院\t287627\n天涯路\t287628\n微耕\t287629\n第1页\t287630\n奥卡福\t287631\n柱型\t287632\n创企\t287633\n150mm\t287634\n宅急送快递\t287635\n短腿\t287636\n子母机\t287637\ntorrent,magnet,bt\t287638\n拼杀\t287639\nTAX\t287640\n标讯\t287641\n气割\t287642\n9.6\t287643\n沈向洋\t287644\n光伏汇流箱\t287645\n手顺\t287646\n为期\t287647\n石墨棒\t287648\n宁波市财政局\t287649\n心理咨询中心\t287650\n角色类\t287651\nHAC\t287652\n任泽锋\t287653\n长沙城\t287654\n基础养老金\t287655\n防务\t287656\n香积寺路\t287657\n海賊王\t287658\nqq讨论组\t287659\n新柴\t287660\n憎恨\t287661\n滨江路\t287662\ndining\t287663\n天韵合唱团\t287664\n麦淘\t287665\n霸陵\t287666\n赏花地\t287667\n投资类\t287668\n男身\t287669\n托福吧\t287670\n湖南艺术职业学院\t287671\n尼各马可伦理学\t287672\n王树声\t287673\n门芯\t287674\nmabatis\t287675\nSTM\t287676\name\t287677\n自限性\t287678\n李主任\t287679\n硬汉\t287680\n辩日\t287681\n海字\t287682\n温湿度控制器\t287683\n山路十八弯\t287684\n8859\t287685\n情中\t287686\ndnaman\t287687\n入位\t287688\n许玮宁\t287689\n短美文网\t287690\ncontinuous\t287691\nDiscuzX\t287692\n未来都市\t287693\n罗技m590\t287694\nk9\t287695\n百度mp4\t287696\n共体\t287697\n整平\t287698\n手摸\t287699\n茗茶\t287700\nbsp\t287701\n行道\t287702\n果干\t287703\n查号吧\t287704\n深圳政府在线_深圳市人民政府\t287705\nuus\t287706\n丁卡\t28
7707\n叶绍翁\t287708\n更合算\t287709\n拍租\t287710\n真空压缩袋\t287711\n防粘\t287712\n减压\t287713\nx-art\t287714\n超额\t287715\n64K\t287716\n卖花姑娘\t287717\n学云\t287718\n绝地求生空投\t287719\nSynergy\t287720\n7528\t287721\n地裂\t287722\n华仁药业\t287723\n硅钢\t287724\n重挫\t287725\n肖邦夜曲\t287726\n金针菜\t287727\n裁员\t287728\nShiseido\t287729\n查买\t287730\n咨询服务费\t287731\n拉撒路\t287732\n中国家用电器研究院\t287733\n宗泽\t287734\n狂赌之渊\t287735\n芙罗拉\t287736\n降暑\t287737\nMCGS\t287738\n三氯生\t287739\n页游助手\t287740\nvs广厦\t287741\n郑人\t287742\n边境之旅\t287743\n乡村爱情9\t287744\n王晰\t287745\nriqi\t287746\n拼读\t287747\n北京协和医院皮肤科\t287748\n报道集\t287749\n荣威rx3\t287750\n三甲苯\t287751\n2015年10月31日\t287752\n代表证\t287753\n通法\t287754\n辉光\t287755\n何青\t287756\n一千年前\t287757\n卫食\t287758\n斜度\t287759\nHDMI转换器\t287760\n西北妇女儿童医院\t287761\n中国道桥网\t287762\n咨询费\t287763\n21000\t287764\n星野遥\t287765\n指指\t287766\n五华县\t287767\n大话\t287768\n护具\t287769\n喜洋洋\t287770\n乳酸\t287771\n善宇\t287772\n垂怜\t287773\ncos2\t287774\n4.3亿元\t287775\n黑屏\t287776\nSaori\t287777\n干宝\t287778\n夜班\t287779\n朗培商学院\t287780\n福建公务员考试网\t287781\n坡向\t287782\n冰DK\t287783\n塍\t287784\n乱窜\t287785\n降脂药\t287786\n蛙眼守宫\t287787\n假期历险记\t287788\n杭州汽轮机股份有限公司\t287789\n江苏省天一中学\t287790\n还剩\t287791\nMTC\t287792\n这个家\t287793\nelegant\t287794\n浇带\t287795\n编译型\t287796\n回兴街道\t287797\n776\t287798\n5样\t287799\n塑限\t287800\n漏风\t287801\nCreo\t287802\n微型车\t287803\n万个\t287804\n县政府办\t287805\n珍品展\t287806\nTestlink\t287807\n51xtv\t287808\n高塍镇\t287809\n脉冲控制仪\t287810\n奥曲肽\t287811\n徐特立\t287812\nuiautomator\t287813\n工厂\t287814\n裕\t287815\n10串\t287816\n通用电气公司\t287817\n奖赏\t287818\n优个网\t287819\n194号\t287820\n20151011\t287821\n2018年5月16日\t287822\n拉布拉多俱乐部\t287823\n博物君\t287824\n杨茜\t287825\n炒炉\t287826\n亚圣\t287827\nFM2017\t287828\n四万块\t287829\n顾维钧\t287830\n41468321\t287831\nweee\t287832\n行政处罚办法\t287833\n上海航空公司\t287834\n上网\t287835\n精华乳\t287836\n3040\t287837\n梁家巷\t287838\n腾讯企鹅号\t287839\n经调查\t287840\n美国能源部\t287841\n制造商\t287842\n兔牙喵喵喵\t287843\n孤岛惊魂5cpy\t287844\n王挺\t287845\n攀枝花市环保局\t287846\n球墨铸铁管\t287847\njavaCV\t287848\n国安局\t287849\n土生土长\t287850\n衰老\t28785
1\n哗众取宠\t287852\n满负荷\t287853\n272号\t287854\nccs6.0\t287855\n对焦\t287856\n重庆申欧通讯科技有限公司\t287857\n号\t287858\ntokens\t287859\n500多家\t287860\n嘉业\t287861\n迭代法\t287862\n卫队\t287863\n120ml\t287864\n凌志\t287865\n2017年3月26日\t287866\n600031\t287867\n20世纪40年代\t287868\n缓蚀剂\t287869\n昭之\t287870\n武冈市\t287871\njstring\t287872\n仲伟\t287873\nVin\t287874\n咏史\t287875\n可_\t287876\n朝阳洞\t287877\njiayu\t287878\n东京浅草寺\t287879\nAdams\t287880\n微创新\t287881\n玩具类\t287882\nC-HR\t287883\n网页加速器\t287884\n浙江工业大学经贸管理学院\t287885\nifup\t287886\nKESION\t287887\n云泉\t287888\n影堂\t287889\n河市镇\t287890\n餐酒\t287891\n离婚官司\t287892\n白盘\t287893\n三驾马车\t287894\nEV录屏\t287895\n两布\t287896\n新东方视频\t287897\n麻辣鸡丝\t287898\nSherry\t287899\nldquo,rdquo\t287900\ndeterministic\t287901\n行者无疆\t287902\n曜\t287903\n一阵子\t287904\n红水晶\t287905\n牛心菜\t287906\nForrestWoo\t287907\n咔咕\t287908\n摄影类\t287909\narranged\t287910\nAV女优\t287911\n仓鼠王\t287912\n扫帚\t287913\nquarz\t287914\n两分钟\t287915\n上海迪斯尼乐园\t287916\n粉质\t287917\nCG\t287918\n龙凤山庄\t287919\n上海莱士\t287920\n中块\t287921\n绕法\t287922\n极地\t287923\n月本\t287924\n沛儿\t287925\n绞刑架\t287926\n中国人民共和国国土资源部\t287927\n咱们相爱吧\t287928\n量税\t287929\n树林\t287930\n年华\t287931\n11部\t287932\n地层\t287933\n全图文\t287934\n科莎\t287935\n113平米\t287936\n制版\t287937\n打印\t287938\npuer\t287939\n循环水泵\t287940\n吉祥话\t287941\nAVANCIER\t287942\n柴郡猫\t287943\nkig\t287944\n泉阳泉\t287945\n抢滩登陆战\t287946\n太古仓\t287947\n免签\t287948\nBoy\t287949\ndokuwiki\t287950\n很难过\t287951\n榆林市政府网\t287952\n五星茅台\t287953\n非诉讼\t287954\nmasque\t287955\n47年\t287956\n大安镇\t287957\n1700公里\t287958\n回美证\t287959\n零钱包\t287960\n露胸\t287961\nthinking\t287962\n大系\t287963\n锶\t287964\n领导之窗\t287965\n捷顺\t287966\nVia\t287967\n木风\t287968\n金陵小区\t287969\n聂天\t287970\n6.2.1\t287971\n重译\t287972\nMadu\t287973\n泰邦\t287974\nmonody\t287975\n防眩\t287976\n小儿推拿手法\t287977\n瞬变\t287978\nstriking\t287979\n大面街道\t287980\n音乐会员\t287981\n挖坟\t287982\n买\t287983\n俩号\t287984\n鲎\t287985\n杨彦涛\t287986\n家产\t287987\n逆\t287988\n年化收益率\t287989\n深圳长租公寓\t287990\n短鲷媚舞\t287991\n千牛\t287992\n24k\t287993\n天能动力\t287994\n金奖\t287995\n土豆\
t287996\n铝焊机\t287997\n风语者\t287998\n75pk\t287999\n债券型\t288000\n创投公司\t288001\nsyslogd\t288002\nwebmethod\t288003\n起名网\t288004\n骏豪\t288005\n104.7\t288006\nxargs\t288007\n马里奥奥德赛\t288008\n讲稿\t288009\n阶梯\t288010\n北京密云区\t288011\n8.0%\t288012\n尿潜血\t288013\n明察\t288014\n安卓游\t288015\n美亚柏科\t288016\n3cg\t288017\nnvc\t288018\n受捧\t288019\n年货\t288020\n白谷逸\t288021\n大艺\t288022\npopu\t288023\n宝具\t288024\n递铺镇\t288025\n人工智能:灭绝危机\t288026\ndnc\t288027\nSLAM2\t288028\nYOTA\t288029\n迅载网盘\t288030\n水解酶\t288031\n晋江机场\t288032\nwithings\t288033\n和合谷\t288034\n直槽\t288035\n市城投公司\t288036\n绘声绘\t288037\n租售\t288038\n国产房车网\t288039\n阿伏伽德罗\t288040\n某些人\t288041\n东方雨虹\t288042\n英法\t288043\n住民\t288044\n马克笔吧\t288045\n胆绞痛\t288046\n华安保险\t288047\nMarkov\t288048\n可可粉\t288049\n反编\t288050\n天津市地方税务局\t288051\ncoom\t288052\n3枚\t288053\n凡河新区\t288054\n聚氧乙烯醚\t288055\nusi\t288056\n画桥\t288057\n消费品\t288058\nLounge\t288059\n吴昊\t288060\n绥靖\t288061\ndmr\t288062\n阿龟\t288063\n女女\t288064\nREADME\t288065\n兰叶\t288066\n啊呀\t288067\n吹干机\t288068\n宣亚国际\t288069\n那不勒斯\t288070\n一夫多妻\t288071\n曾志伟\t288072\n创梦\t288073\n稀世\t288074\ni7\t288075\n泰安\t288076\nDriver\t288077\nkqueue\t288078\n經濟通\t288079\n常州大学怀德学院\t288080\n两盘\t288081\n李国华\t288082\n短距离\t288083\n十波\t288084\n韩枫\t288085\n预制板\t288086\n热情如火\t288087\n75kg\t288088\n文尾\t288089\n做完\t288090\n眉睫\t288091\n2018年04月04日\t288092\nHarriet\t288093\nzjjs\t288094\n4分之一\t288095\n两匹\t288096\n刘心\t288097\n水母\t288098\n99旅馆连锁\t288099\n穷小子\t288100\n超级魂斗罗\t288101\n少年智力开发报\t288102\nreflect\t288103\n432\t288104\ne袋洗\t288105\n道纪\t288106\nSketchUp2016\t288107\n踢毽\t288108\n仙路\t288109\n多目\t288110\n洛基山\t288111\n搜奇\t288112\n知晓\t288113\n疾控\t288114\n微宏动力\t288115\n政治学基础\t288116\n临近\t288117\n人格\t288118\n专四听写\t288119\ngetopt\t288120\n电位滴定仪\t288121\n高身\t288122\nosql\t288123\n长陵\t288124\nCREATIVE\t288125\nTerm\t288126\n有机认证\t288127\nIPTD\t288128\n虹桥商务区\t288129\n雅丽洁\t288130\n受困\t288131\n套瓷\t288132\n包扶\t288133\n多多软件站\t288134\n拜亚动力\t288135\n京交会\t288136\n名类\t288137\n执行难\t288138\n旺旺网\t288139\n盗掘\t288140\nHifi\t288141\nPhase\t288142
\nThermal\t288143\n北京市朝阳区人力资源公共服务中心\t288144\n鼻支\t288145\n第一|\t288146\n穿出\t288147\n带状孢疹\t288148\n昆山第一人民医院\t288149\n趁乱\t288150\n九旒\t288151\n王德辉\t288152\nNR\t288153\n羊水\t288154\n长颈鹿\t288155\n包清关\t288156\n宝安小学\t288157\nav天堂在线\t288158\n竖子\t288159\nurb\t288160\n陆欣\t288161\n海沃德\t288162\n孙殿英\t288163\naggression\t288164\nQQ学生网\t288165\n威霸\t288166\n中国文明网\t288167\n木栅\t288168\n冥思\t288169\n益力多\t288170\n跨服务器\t288171\n20150214\t288172\n萨克拉门\t288173\n原则\t288174\nCaillat\t288175\n浩气\t288176\n收获季\t288177\nCPE\t288178\n熔融指数\t288179\n畅叙\t288180\n卢刚\t288181\n膈膜\t288182\n天津律师事务所\t288183\n免征\t288184\n官服\t288185\nautofs\t288186\n黎氏\t288187\n长期应付款\t288188\n淘税网\t288189\n疑神疑鬼\t288190\nstone\t288191\n公安局\t288192\n违返\t288193\nLock键\t288194\n词组\t288195\n唐印\t288196\n武商集团\t288197\n快车道\t288198\n自然带\t288199\n北京新能源汽车股份有限公司\t288200\nwindows7_Wi\t288201\n药球\t288202\n栖姬\t288203\n海娜粉\t288204\n了凡\t288205\n张小\t288206\n大转折\t288207\n地铁站\t288208\n闹离婚\t288209\n70f4\t288210\n陶磊\t288211\n交流电\t288212\n千万间\t288213\n49P\t288214\n)商贸有限公司\t288215\nJAM\t288216\n一霸\t288217\n二三十年代\t288218\n读书季\t288219\nWin10PE\t288220\n醇化\t288221\n西哈努克市\t288222\n压伤\t288223\n萨勒姆\t288224\nwatcher\t288225\n挂印\t288226\n莓\t288227\ng6pd\t288228\n北大岳\t288229\n麻辣兔头\t288230\nQZZN\t288231\n沈知\t288232\n安妮海瑟薇\t288233\n灰褐色\t288234\n风水局\t288235\nLeroy\t288236\nluanlun\t288237\n风云决\t288238\n赛雷\t288239\n万科城\t288240\n金嗓子\t288241\n谷建芬\t288242\n亨利·詹姆斯\t288243\nJavaDoc\t288244\nclosely\t288245\n灞桥\t288246\n抡\t288247\n硫化碱\t288248\nsasaki\t288249\n2018.1.3\t288250\n贵州医科大学\t288251\n0.36\t288252\n百货大楼\t288253\n哈氏\t288254\n苏妍\t288255\n胰腺炎\t288256\n台南市\t288257\n劈裂机\t288258\n后怕\t288259\n20ma\t288260\n立码\t288261\n20141206\t288262\n周祝公路\t288263\n第57集\t288264\n水钱\t288265\n温度表\t288266\n蕴含\t288267\nsyrup\t288268\n阿宽\t288269\n丑奴儿\t288270\n刘结一\t288271\n四川西南航空职业学院\t288272\n龙岩市第一医院\t288273\n华为p20\t288274\n阿马尔菲\t288275\n赤脊山\t288276\n竞品\t288277\nrbe\t288278\n格莱美奖\t288279\n扫脸\t288280\nrcd510\t288281\n优友\t288282\n欢笑声\t288283\n天津宾馆\t288284\n3.0.2\t288285\n拉撒\t288286\n租租\t28828
7\n恼怒\t288288\n胡椒\t288289\n飞行模式\t288290\n对点\t288291\n光敏性\t288292\nDepartures\t288293\n阴缘\t288294\n第27号\t288295\n发现杯\t288296\n艳后\t288297\n画心师\t288298\n120g\t288299\n佐川一政\t288300\n世界旅游组织\t288301\n赛默飞世尔科技(中国)有限公司\t288302\nmvs\t288303\n最后一搏\t288304\n大牙\t288305\nney\t288306\n配偶者\t288307\n正向\t288308\n泉穴\t288309\nSilica\t288310\n20185\t288311\ncate\t288312\nMondrian\t288313\n选拔\t288314\n余姚人才网\t288315\n荧石\t288316\n黄月英\t288317\n口暴\t288318\n许昌人才网\t288319\na1688\t288320\n血凝\t288321\nNFS\t288322\n排骨粥\t288323\n罗技g430\t288324\n泸溪县\t288325\nDME\t288326\n32期\t288327\n透心凉\t288328\n富控\t288329\n大音希声\t288330\n卑贱\t288331\n安放\t288332\n性能力\t288333\nGmbH\t288334\nConditioning\t288335\n秦雪\t288336\nIrina\t288337\n/p\t288338\nURG\t288339\n乔丹卡佛\t288340\n人民英雄纪念碑\t288341\n南笙\t288342\n夺萃\t288343\nnmg\t288344\n宁夏山屿海\t288345\n保育\t288346\n大安法师\t288347\n西人\t288348\nfifa3\t288349\n中国农机院\t288350\n皮皮鲁传\t288351\n国家气象局\t288352\n互译\t288353\n40万\t288354\n川木\t288355\nmtx\t288356\n曾来过\t288357\n激斗\t288358\n田村\t288359\n建行手机网\t288360\n套系\t288361\nMPL\t288362\n云库网\t288363\n四川法院\t288364\n河北工程技术学院\t288365\n赫拉克罗斯\t288366\n山田凉介\t288367\n初元\t288368\n信用债\t288369\n蒋村花园\t288370\n防震\t288371\nFLUID\t288372\n号码段\t288373\n乐生活\t288374\n辣妞\t288375\n鹈鹕队\t288376\nRyokan\t288377\n光纤激光焊接机\t288378\n学测\t288379\n列克星敦\t288380\n线帽\t288381\n言词\t288382\n№\t288383\n赵客\t288384\n高志谦\t288385\n安全检查\t288386\n南极料理人\t288387\n氧化镓\t288388\n上海市公安局交通警察总队\t288389\n寒川\t288390\n新课标\t288391\ninshot\t288392\n_97973手游网\t288393\n第十名\t288394\n双比\t288395\nBMCC\t288396\n碰见\t288397\n餐饮管理公司\t288398\n滨江商务区\t288399\n终值\t288400\n新视野\t288401\n敲头\t288402\n教导\t288403\n95017\t288404\n涂色画\t288405\n阳光城市花园\t288406\n波美度\t288407\n古奇\t288408\n邱欣怡\t288409\n塔卡\t288410\n沉沦\t288411\nBally\t288412\n赤瞳\t288413\n银三角\t288414\n外访\t288415\n溅\t288416\n神秘博士\t288417\n生物物理学\t288418\nlightsong\t288419\n聆风\t288420\nHAWQ\t288421\n123\t288422\ntesa\t288423\nAmwin\t288424\n西北五省\t288425\n车重\t288426\n门槛\t288427\n蒸扇贝\t288428\n现代出版社\t288429\nUU伴奏网\t288430\n3副\t288431\n蛙式\t288432\nSunderland\t2884
33\n积米崖\t288434\n广州西\t288435\n每周三\t288436\n欢送词\t288437\n百合苑\t288438\n门户网\t288439\nSexxx\t288440\n上海建平中学\t288441\n郭凤莲\t288442\nC2\t288443\nOKHTTP\t288444\n罗密\t288445\n第33集\t288446\n新殖民主义\t288447\n腐朽\t288448\n第23轮\t288449\n广州市体育局\t288450\n租金\t288451\n6700K\t288452\nioe\t288453\n绍兴晚报\t288454\nPeoples\t288455\n韶关市公安局\t288456\nAdaBoost\t288457\n陆章\t288458\n大火球\t288459\n201401\t288460\n梦幻西游实用工具箱\t288461\n秀站\t288462\n人际关系学\t288463\n蝴蝶定理\t288464\nogc\t288465\n辅助业\t288466\n魔芋\t288467\n刀头\t288468\n壳牌\t288469\nensp\t288470\n法制宣传日\t288471\n5658xxxx\t288472\n算尾\t288473\n途鸽\t288474\n上海交通大学附属第六人民医院\t288475\n电子灌封胶\t288476\n马夏尔\t288477\n裤架\t288478\n广州海格通信集团股份有限公司\t288479\n中国民航\t288480\n贺文华\t288481\n公利\t288482\n地方名酒网\t288483\nCaterpillar\t288484\n张鼎鼎\t288485\nD罩杯\t288486\n王怡\t288487\n青口镇\t288488\n思辨\t288489\n睁大\t288490\n方便性\t288491\n4G显存\t288492\n869\t288493\n透视变换\t288494\n病室\t288495\n陈慧敏\t288496\n美仙\t288497\nFerro\t288498\n克里斯托\t288499\n点号\t288500\n胃绞痛\t288501\nFF6\t288502\n碱\t288503\n反经\t288504\n咸丰新闻网\t288505\n鑫存管\t288506\n捣衣\t288507\n安利公司\t288508\n文义\t288509\n诸行\t288510\n共青城\t288511\nGrooming\t288512\n卢冠廷\t288513\n南京宇通实验学校\t288514\nmakedirs\t288515\n牡蛎片\t288516\nnessus\t288517\njls\t288518\n四侠\t288519\n3158茶叶网\t288520\n蜀汉\t288521\n子午谷\t288522\n穿越火线CF\t288523\n卫生卷\t288524\n童颜\t288525\n防火岩棉板\t288526\n滁城\t288527\nresizable\t288528\n知名人士\t288529\n交通运输部办公厅\t288530\n最后一位数\t288531\n浦发运通\t288532\n弯道\t288533\n评求\t288534\nBrothers\t288535\n俊发集团\t288536\n奥迪tt\t288537\n中国社会福利基金会\t288538\n世界主义\t288539\n敬亲恩白\t288540\n阿鑫\t288541\nproxysql\t288542\nDef\t288543\n23页\t288544\n哈尔滨松北\t288545\nVSOP\t288546\n2440\t288547\n首届\t288548\n43倍\t288549\n无源\t288550\n接风\t288551\nMonths\t288552\n依规治\t288553\n陈仪\t288554\n派遣工\t288555\n我班\t288556\n来看我\t288557\n填料\t288558\n不老不死\t288559\n12v100ah\t288560\n林悦\t288561\n乖张\t288562\n报到证改派\t288563\n营业外支出\t288564\n3132\t288565\n伪男\t288566\n酷虫学校\t288567\n初心者\t288568\n行行重行行\t288569\n减低\t288570\n三里屯街道\t288571\n璞泰来\t288572\n巨画\t288573\n股权众筹\t288574\n孙悟空威客网\t288575\n电偶极子\t28857
6\n李奇\t288577\n法兰球阀\t288578\n苦咖啡\t288579\n黑客派\t288580\n中阶\t288581\n梅西\t288582\n20170204\t288583\n优良率\t288584\n初等\t288585\n香石\t288586\n数学生\t288587\n众源\t288588\n速卖通\t288589\n电子舌\t288590\n画花\t288591\njohns\t288592\nq200\t288593\ngoogletest\t288594\n哒哒\t288595\n唐代宗\t288596\n江主席\t288597\n宝安图书馆\t288598\n爆掉\t288599\n旭辉奥\t288600\n通达OA\t288601\n现实性\t288602\nニュ\t288603\nASG\t288604\n萨墓\t288605\n宋城千古情\t288606\n学生群\t288607\n行政法与行政诉讼法\t288608\n优翼学练优小学\t288609\n李聪\t288610\n重庆朝天门医院\t288611\n天立\t288612\n接口板\t288613\nWatches\t288614\n广园客运站\t288615\n双龙湖街道\t288616\n缺额\t288617\n848\t288618\n波兰\t288619\ninl\t288620\n江苏高院\t288621\n满载\t288622\n机床\t288623\nfmba\t288624\n18平方\t288625\n邓丽君\t288626\n皇牌空战7\t288627\nAJ6\t288628\n2013-2018年\t288629\n涂鸦板\t288630\ndanfoss\t288631\nlibelf\t288632\nzt\t288633\n好魔性\t288634\n智勇双全\t288635\n冰与火之歌\t288636\n杨培东\t288637\n抽样分布\t288638\nwp7\t288639\nGraphical\t288640\n142\t288641\n石板滩\t288642\n广东移动动感地带卡_\t288643\nXINPUT1_3.dll\t288644\nTomato\t288645\n金鲳鱼\t288646\n华东理工大学研究生院\t288647\n复选框\t288648\n天网\t288649\n堆石坝\t288650\nIFRS9\t288651\n乙港\t288652\n五里桥街道\t288653\n逆序\t288654\n黑乎乎\t288655\n成都七中育才学校\t288656\n潜水表\t288657\n可爱的家伙\t288658\n建菱砖\t288659\n赵氏\t288660\n单头\t288661\n神道教\t288662\nqx50\t288663\n小哀\t288664\n出塞\t288665\n绝世百炼成仙\t288666\n2451\t288667\nmini2s\t288668\n七弦\t288669\n木叶\t288670\n榕树头\t288671\n嵇\t288672\n很远\t288673\n好想你枣\t288674\n4.44\t288675\nyed\t288676\n四重唱\t288677\n陈佳莹\t288678\n阿土伯\t288679\n微镜\t288680\n李济\t288681\n杨信\t288682\n小奈樱\t288683\n苹果端\t288684\nn\t288685\n湖南大学电气与信息工程学院\t288686\ndizzy\t288687\n米娜桑\t288688\n迦罗娜\t288689\n人民日报社论\t288690\n顾影\t288691\n打不打\t288692\n2016劳动节\t288693\n强援\t288694\n圆盒\t288695\n谭静\t288696\nLDP\t288697\n储备局\t288698\nChinapp\t288699\n威克\t288700\nSE\t288701\nRevit2017\t288702\n自动取款机\t288703\n小眷村\t288704\n凌海\t288705\n辉光电\t288706\n麦当娜\t288707\n中国台州网\t288708\n北门街\t288709\n丽人妆\t288710\n干燥机\t288711\n王某\t288712\n顺化\t288713\n山图\t288714\n京瓷打印机\t288715\n红动中国-Redocn\t288716\n佛秀\t288717\n本局\t288718\n标准品\t288719\nMHT\t288720\n包皮吻合器\t28
8721\n皙\t288722\n洋溪\t288723\nPuyo\t288724\nclue\t288725\n无所得\t288726\n辣文h文\t288727\n口瘘\t288728\n杜撰\t288729\n航延险\t288730\n天龙高清影院\t288731\nG4560\t288732\n5.9分\t288733\n北京一卡通\t288734\nwhitespace\t288735\ngtx680\t288736\n佛具\t288737\n教育部思想政治工作司\t288738\n森罗万象\t288739\n圈圈\t288740\n督办函\t288741\n杰睿学校\t288742\n蒸镀\t288743\nrdbms\t288744\n核武\t288745\n预裂\t288746\nes300h\t288747\n爱之谷\t288748\n湟鱼\t288749\n撮影\t288750\n暖婚\t288751\n对号\t288752\n实心砖\t288753\n30min\t288754\n克里斯蒂\t288755\n微信史\t288756\nweb容器\t288757\n二十一号\t288758\n狮子大开口\t288759\n15层\t288760\n拆毁\t288761\niedriver\t288762\n卢森堡\t288763\n叶佩英\t288764\n坏疽\t288765\n百胜集团\t288766\n口袋妖怪水晶\t288767\n结单\t288768\n富尔顿\t288769\nsnes9\t288770\n张家庄\t288771\n艺术观\t288772\n大季\t288773\ngs7\t288774\n顺丰航空\t288775\n新安江水库\t288776\n第一朵\t288777\n募\t288778\n机器人\t288779\n郵編庫\t288780\n吃肉肉\t288781\nyaml文件\t288782\n学习习近平同志关于机关党建重要论述\t288783\nprinter\t288784\nDaughter\t288785\n军师联盟之虎啸龙吟\t288786\ngetdate\t288787\n碧哥\t288788\n北航\t288789\n自养\t288790\nMoisture\t288791\n盆景园\t288792\n神舟z7\t288793\n三键\t288794\nListIterator\t288795\n元太\t288796\n反冲运动\t288797\n随梦小说网\t288798\n箱根温泉\t288799\n英雄无敌三\t288800\nTried\t288801\n泽宇\t288802\n遗尿症\t288803\njw\t288804\n52TOYS\t288805\n70万\t288806\nscrapy-redis\t288807\n富国基金管理有限公司\t288808\n中交\t288809\n大番薯\t288810\n台邸\t288811\n真空破坏器\t288812\n医务人员\t288813\n大刺\t288814\n河间\t288815\n成都曙光男科医院\t288816\n那岩\t288817\n思危\t288818\n绝经期\t288819\n12389\t288820\n924\t288821\n秦老师\t288822\ntuber\t288823\n2011年底\t288824\n南昌市发展和改革委员会\t288825\ntcpip\t288826\nfendi\t288827\n赫家旺\t288828\n筹措\t288829\n香薯\t288830\n华州区\t288831\n较劲\t288832\n第一季10\t288833\n邵阳火车站\t288834\n长临河\t288835\n取款\t288836\n小雨\t288837\n最后一封信\t288838\nPro/E\t288839\n入编\t288840\n区旅游局\t288841\n焦磷酸\t288842\n揭竿而起\t288843\nExel\t288844\n番禺宾馆\t288845\n千千万\t288846\n不大\t288847\n孙珍妮\t288848\n燃系\t288849\n微信公共号\t288850\ntreetable\t288851\n天才雷普利\t288852\nbackpack\t288853\n娜娜奇\t288854\n百味\t288855\n霍汶希\t288856\n危险期\t288857\nyijia\t288858\n800.0000元\t288859\n海瑞克\t288860\n安旭\t288861\nistp\t288862\n巡守\t28
8863\n住宿篇\t288864\nfreenas\t288865\nANOVA\t288866\n直齿\t288867\n大浪街道\t288868\n大目湾\t288869\n中南君奥\t288870\nucore\t288871\n粤电集团\t288872\nH170\t288873\n11届\t288874\n微女郎\t288875\n徐贤\t288876\nSharp\t288877\n聯\t288878\n香江控股\t288879\n弗莱堡\t288880\n万恒娱乐\t288881\n软件\t288882\n审判\t288883\nPeek\t288884\nincorporation\t288885\n教程园\t288886\n春野樱\t288887\n南方航空公司\t288888\n老迈\t288889\n李玉龙\t288890\n西江路\t288891\n第八十九章\t288892\n按一下\t288893\n译林版\t288894\n送子观音\t288895\n花花世界\t288896\nnsps\t288897\nxtc820\t288898\n法拉第\t288899\nlisrel\t288900\n婴儿米粉\t288901\n鲍鱼\t288902\nTorch7\t288903\nvirbr0\t288904\nc++标准库\t288905\n冯志伟\t288906\n16块\t288907\n拜博\t288908\n杨仔\t288909\nMembrane\t288910\n市政局\t288911\n北京清华大学\t288912\n北京富力万丽酒店\t288913\n小锅\t288914\n万籁\t288915\n百特\t288916\n李佳\t288917\n识字\t288918\ntilemap\t288919\n连锁加盟\t288920\n純享\t288921\nrobots\t288922\n草原之夜\t288923\n艰难险阻\t288924\n桂电\t288925\n补贴金\t288926\n途腾\t288927\n辣妞范\t288928\n精易\t288929\n玖玖\t288930\n大儿\t288931\nBTC\t288932\n20150320\t288933\n飞廉\t288934\n盖聂\t288935\n羊毛衫\t288936\n35座\t288937\nATM机\t288938\n多群\t288939\nhud\t288940\ninfluences\t288941\n艾肯空调制冷网\t288942\nFliqlo\t288943\n袁志敏\t288944\n收招\t288945\n花号\t288946\n丝器\t288947\n关通\t288948\n说来就来\t288949\n伯利兹\t288950\n广东技术师范学院天河学院\t288951\n压测\t288952\nICs\t288953\n31级\t288954\n制管\t288955\n俞强声\t288956\npwc\t288957\n中国涟水政府\t288958\n坑梓镇\t288959\n卡特兰\t288960\n净价\t288961\n土龙\t288962\n告老还乡\t288963\n面页\t288964\nStarbound\t288965\n胎菊网\t288966\n好日子\t288967\n硅磷晶\t288968\n枪神纪专区\t288969\n自适应滤波器\t288970\ntunneling\t288971\nchunks\t288972\n雅漾\t288973\n秒视\t288974\n三万\t288975\ndelays\t288976\n三精牌葡萄糖酸锌口服液\t288977\n爱博股票网\t288978\n风阻\t288979\n俗话\t288980\n陈爱莲\t288981\n中国新能源汽车\t288982\n闷棍\t288983\n汽车团购网\t288984\n佛主\t288985\n放会\t288986\n邓杰\t288987\n拒赔\t288988\nCrawling\t288989\n支挡\t288990\nbbpress\t288991\n痴态\t288992\n木棍\t288993\nthane\t288994\n顶架\t288995\n西湖高等研究院\t288996\n龙儿\t288997\n常州博物馆\t288998\n郾城\t288999\n夜蔓\t289000\n北海港\t289001\n希希\t289002\n王永清\t289003\n啸西风\t289004\nB2B\t289005\n太仓市第一人民医院\t289006\nportainer\t28900
7\n舒特\t289008\nSID\t289009\n学生伤害事故处理办法\t289010\n520i\t289011\n轻微伤\t289012\n信誓旦旦\t289013\n知之深\t289014\nInfoWindow\t289015\n王小飞\t289016\n内部\t289017\n质量效应\t289018\n资味\t289019\nweixiu\t289020\n秦俊杰\t289021\n易笔字\t289022\n监利\t289023\n华帝燃气灶\t289024\nSound\t289025\n别急\t289026\n12541\t289027\n乌江亭\t289028\n自嘲\t289029\n耐药菌\t289030\n世报\t289031\n远华\t289032\n汉客\t289033\n黄染\t289034\npmon\t289035\n固土\t289036\n地下铁道\t289037\n武神赵子龙\t289038\n二曰\t289039\nsteemit\t289040\n大智度\t289041\n贾一平\t289042\n西安交大二附院\t289043\n福州医院\t289044\n宁夏回族自治区人民政府\t289045\nexclusively\t289046\nPPT宝藏\t289047\n1195\t289048\n谢天宇\t289049\nGO佳居\t289050\n潘美辰\t289051\n李万\t289052\n浮动利率债券\t289053\n涂卷\t289054\n次北固山下\t289055\n唐纪\t289056\n脚踝\t289057\nBNDong\t289058\n选择权\t289059\n起萧墙\t289060\nline6\t289061\n内蒙古自治区司法厅\t289062\nVoices\t289063\n慈姑\t289064\n彼得德鲁克\t289065\n程颢\t289066\nframework4.5\t289067\n最伟大\t289068\norbital\t289069\n焊条\t289070\ninold\t289071\n3边\t289072\n买东\t289073\n戴法\t289074\n李文东\t289075\n真心真意\t289076\n国际贸易专业\t289077\nCoffee\t289078\n王记\t289079\n沙峪\t289080\n4.70分\t289081\n唆使\t289082\n高等教育\t289083\nThickness\t289084\n百强镇\t289085\n1080P蓝光原盘\t289086\n御景华庭\t289087\n雫\t289088\n二审\t289089\n第7\t289090\n反腐倡廉\t289091\n郑州格力空调\t289092\n大唐西市\t289093\n卵生\t289094\n战斗系\t289095\n_步街·店铺街\t289096\n30首\t289097\n来凤县\t289098\n无限责任\t289099\n上海子起生物科技有限公司\t289100\nNMOS\t289101\n一茬\t289102\n垒球\t289103\n魅族mx6\t289104\n星月夜\t289105\nwin7分屏\t289106\n秦山核电站\t289107\n)投资有限公司\t289108\n打击感\t289109\n许文强\t289110\nNio\t289111\n午门\t289112\n7把\t289113\nGuangzhou\t289114\n市律协\t289115\n霓裳梦竹\t289116\n蒸鸡蛋羹\t289117\n中国经济网\t289118\n驴友们\t289119\n机间\t289120\n2K10\t289121\n和风带\t289122\n马思纯\t289123\n单条\t289124\n磨洗\t289125\naspnetcore\t289126\n锚碇\t289127\nExcited\t289128\n岭南园林\t289129\ncad2012\t289130\n东北烈士纪念馆\t289131\n滴定法\t289132\n吳潛\t289133\n单粒\t289134\nactivex\t289135\nEro\t289136\nS7200\t289137\n龙门花甲\t289138\n西楼月\t289139\nリバ\t289140\n1.\t289141\n智慧银行\t289142\n涠洲\t289143\n和面机\t289144\n小年糕\t289145\n苏格兰威士忌\t289146\nZoo\t289147\n济南市妇幼保健院\t289148\n土豪们\t289149
\n河北省地方税务局\t289150\n鱼蛋\t289151\n1505\t289152\n4.8.4\t289153\n炒米\t289154\n几升\t289155\n郑爽\t289156\nshao\t289157\n850evo\t289158\n磁阻传感器\t289159\nVGG\t289160\n太平峪\t289161\n鼻炎喷剂\t289162\n北方网\t289163\n中移物联网\t289164\n语感\t289165\n2015年6月份\t289166\n李冠萍\t289167\n59名\t289168\nConst\t289169\n万达文旅\t289170\n腰肌劳损\t289171\n龙之谷DN\t289172\n制动阀\t289173\n保险精算\t289174\nrerun\t289175\n枯法者\t289176\nafr\t289177\nediter\t289178\n成都市妇幼保健院\t289179\n第40页\t289180\n京商\t289181\n五句\t289182\n缓冲垫\t289183\n4万元\t289184\n60CM\t289185\n氨苄青霉素\t289186\n宗政\t289187\n杨春林\t289188\ndepuis\t289189\n岗哨\t289190\n我的青春恋爱物语果然有问题\t289191\nAnthem\t289192\nmbdx吧\t289193\n孟献贵\t289194\n第一季04\t289195\n范文芳\t289196\n扭打\t289197\n十一套\t289198\n骁骑\t289199\n夏瑜\t289200\n宫崎市\t289201\n穿越记\t289202\n常书鸿\t289203\n副溶血性弧菌\t289204\n双登\t289205\n热带草原\t289206\n社会保障基金\t289207\n铁钩\t289208\n中国医师协会麻醉学医师分会\t289209\nIIS10\t289210\n户外群\t289211\n好莱客衣柜\t289212\nln\t289213\n拆盒网\t289214\n河渠\t289215\n海王生物\t289216\n叙永\t289217\nwow2\t289218\n查暗访\t289219\ncosmos\t289220\n小招\t289221\n汉译\t289222\n快充线\t289223\n代储\t289224\n张蔷\t289225\npar面\t289226\n惊恐障碍\t289227\n南京市中医院\t289228\n蚁酸\t289229\n商王\t289230\n张瑞图\t289231\n千兆以太网\t289232\n不远万里\t289233\n中共西安市高陵区委·西安市高陵区人民政府\t289234\n杭外\t289235\n人民路街道\t289236\n诳\t289237\n国家科技部\t289238\nPOLYCOM\t289239\n新新\t289240\n游戏展\t289241\n芍药花\t289242\n成本函数\t289243\n概率学\t289244\n雷霆战警\t289245\n赵梦洁\t289246\n华大桂\t289247\n倒叙\t289248\nHeadset\t289249\n物联网在线\t289250\nbdi\t289251\n龙沙\t289252\n王曼昱\t289253\nFallen\t289254\n038期\t289255\ndistiller\t289256\n1737\t289257\n2678\t289258\nX-2\t289259\n杭州市律师协会\t289260\nbufferevent\t289261\n减速带\t289262\n癌\t289263\n北加尔\t289264\nrien\t289265\nIPMI\t289266\ng3\t289267\n工具侠\t289268\n成都东火车站\t289269\n入深\t289270\n敲里\t289271\nTGF\t289272\n促进\t289273\n死亡诗社\t289274\ndiscount\t289275\n天正建筑\t289276\n7CD\t289277\n大沙\t289278\n李民\t289279\n甘孜藏族自治州\t289280\n肉汤\t289281\n分裂期\t289282\n透明\t289283\nEMINEM\t289284\n华中科技大学计算机学院\t289285\ncmda\t289286\n1000000000\t289287\nuap\t289288\n有为法\t289289\n逍遥云天\t289290\n苏州市食品药品监督管理局\t
289291\n150首\t289292\n吃货\t289293\n甜蜜素\t289294\n彼氏\t289295\n暖萌\t289296\nEC6108V9C\t289297\nPS联盟\t289298\n崩坏\t289299\n笑起来\t289300\n乐看网\t289301\n成都人才网\t289302\n防治\t289303\n不要怕\t289304\n江山美人\t289305\nEu\t289306\nMKV-720p\t289307\n丹尼斯大卫城\t289308\nGo\t289309\n小仔\t289310\nyl\t289311\n后值\t289312\n达拉崩\t289313\nbugku\t289314\n赛制\t289315\n播读\t289316\n金地格林小镇\t289317\n6千米\t289318\n1170\t289319\n来往\t289320\n广州出入境检验检疫局\t289321\n铁人\t289322\n智能表\t289323\n平政\t289324\n4700\t289325\nCollar\t289326\n福施福\t289327\npaxos\t289328\n夏庵\t289329\n_v3.6\t289330\n皮格\t289331\n啾咪\t289332\n审验\t289333\n苏尼特右旗\t289334\n喷浆\t289335\n永遠\t289336\n博鳌恒大国际医院\t289337\n聚四氟乙烯板\t289338\n南极洲\t289339\n羟色胺\t289340\nC++版\t289341\nsur\t289342\nANDA\t289343\n28p\t289344\nmule\t289345\n残羹\t289346\n空线\t289347\n石湫镇\t289348\n桓台县人民政府\t289349\n创大\t289350\n致忘了诗的你\t289351\n卯\t289352\n方刚\t289353\n宝鹰股份\t289354\n小刚\t289355\n肽键\t289356\n球文\t289357\n伯乐相马\t289358\n品今控股集团\t289359\n急性病\t289360\n尼康D3200\t289361\n陕西省中医医院\t289362\noth\t289363\n片法\t289364\n北京海尔\t289365\n二驱\t289366\n小桶\t289367\n半导体测试\t289368\n血版\t289369\n新剑\t289370\nbst\t289371\nstn\t289372\n大航海时代外传\t289373\n怡海\t289374\n田某\t289375\n古斯\t289376\n张卫星\t289377\n重开\t289378\nhanewin\t289379\n嵌入\t289380\nhfp\t289381\nASPECT\t289382\n上海婚纱摄影\t289383\n抬头纹\t289384\n翔升\t289385\n红a\t289386\n青莲剑仙\t289387\nqsed\t289388\n人民代表大会常务委员会\t289389\n白度仪\t289390\n多米诺\t289391\n中共辽宁省委组织部\t289392\n团期\t289393\n战记\t289394\n百科全書\t289395\nGini\t289396\n禁忌语\t289397\nProjection\t289398\n小福\t289399\n风儿轻轻吹\t289400\n护卫者\t289401\n几名\t289402\n李寿全\t289403\n中央政务区\t289404\n窝板\t289405\n信用社\t289406\n一第二章\t289407\n齐聚一堂\t289408\n雷吉斯\t289409\n潭门镇\t289410\nFama\t289411\n清北\t289412\n秦楚\t289413\ndiablo\t289414\n无限极专卖店\t289415\n杰德\t289416\n浦东法院\t289417\n阳光小区\t289418\n冰爆\t289419\n钛金属\t289420\n断然\t289421\n电保护\t289422\n纳米机器人\t289423\nHSTS\t289424\n海关\t289425\n皮卡网\t289426\n看天长网\t289427\n丰台科技园\t289428\n母矿\t289429\n391号\t289430\n论诗\t289431\n非师范\t289432\n纽约时报\t289433\nOct\t289434\n鬼吹灯之牧野诡事2\t289435\n亚欧博览会\t289436\n威逼\t289437\ncome
t4j\t289438\n桂花鱼\t289439\nresistant\t289440\n伊豆的舞女\t289441\ne35\t289442\n瓜沥镇\t289443\n彩纱\t289444\n陈王\t289445\n编程_匿名_天涯问答\t289446\n野人山\t289447\n卡尔费休\t289448\nphrase\t289449\n呼和浩特乐居网\t289450\n矽肺\t289451\n复产\t289452\n燃烧弹\t289453\n梦精记\t289454\n迅影\t289455\nJovi\t289456\n卖完\t289457\nne555\t289458\n金秦禹\t289459\n印剧\t289460\n杭萧钢构\t289461\nmongodb3.6\t289462\n四创电子\t289463\n追猎\t289464\nXviD\t289465\nswagelok\t289466\n洪波\t289467\n李易峰\t289468\nnantaihu\t289469\nZ2期\t289470\nDXD\t289471\n网络博彩\t289472\n行李架\t289473\n掌掴\t289474\n贪\t289475\n假皮\t289476\n聊聊天\t289477\n都兰县\t289478\n机动部队\t289479\n舒适型\t289480\nauthorware\t289481\n李希贵\t289482\n文彬\t289483\n微力\t289484\naxure7.0\t289485\n山东省工商行政管理局\t289486\nig\t289487\n卧龙小区\t289488\n高等\t289489\n桃源居\t289490\n普通股\t289491\n麻城市政府网\t289492\nQ3\t289493\ndiv块\t289494\n何畏\t289495\nBootlin\t289496\n0级\t289497\n东京食尸鬼\t289498\nMQTT服务器\t289499\n央央\t289500\n小格\t289501\n马派\t289502\n血液分析仪\t289503\n繁杂\t289504\n美少年\t289505\n灯光展\t289506\nMAI\t289507\n诊断书\t289508\n地球卫士\t289509\nifa\t289510\n中海国际\t289511\n甩尾\t289512\nKODAK\t289513\n1万辆\t289514\nfloor2\t289515\n周五下午\t289516\n设计稿\t289517\nSQL存储过程\t289518\n拳皇12\t289519\nstress\t289520\n挂\t289521\n引致\t289522\n涙\t289523\n星闻娱卦\t289524\nFlickr\t289525\n2018年03月14日\t289526\n醒目\t289527\ndoping\t289528\n杭州师范大学钱江学院\t289529\n皂甙\t289530\n做道\t289531\n豹王\t289532\n飚升\t289533\n飞贷\t289534\n施工许可证\t289535\n速战速决\t289536\n概率函数\t289537\n丑妻\t289538\n淮河\t289539\nchuanqi\t289540\nClearCase\t289541\nlibingql\t289542\n强好\t289543\n微面\t289544\n忍住\t289545\n东方ic\t289546\n动画包\t289547\n重连\t289548\n钢剑\t289549\n女战\t289550\n陆步轩\t289551\n如数家珍\t289552\n泛光照明\t289553\nCaffeine\t289554\nchulia\t289555\n蛍\t289556\nexpdb\t289557\nspringbatch\t289558\n桃花山\t289559\n天髓\t289560\nrumors\t289561\n混养\t289562\n流变学\t289563\n拔胡子\t289564\ngray1982\t289565\nEVA初号机\t289566\n保壳\t289567\n敷脸\t289568\nPSCC2017\t289569\n蒋定之\t289570\n陈雨菲\t289571\n审死\t289572\n车晓\t289573\n仗剑\t289574\nPunisher\t289575\n爱着\t289576\nBT/\t289577\n讯维\t289578\nSearcher\t289579\n分付\t289580\n喃喃自语\t289
581\n2422\t289582\n阿珊\t289583\n木子\t289584\n针刀\t289585\n本田思域\t289586\n达纳苏斯\t289587\npunch\t289588\n绯狱丸\t289589\n沃尔沃xc90\t289590\n龋齿\t289591\n狗头金\t289592\n非门电路\t289593\n信佛\t289594\n邮政总局\t289595\n平野绫\t289596\n停水\t289597\n城市群\t289598\n_图解吧_\t289599\n孟可\t289600\n怎\t289601\n八礼四仪\t289602\nXE7\t289603\n脊梁\t289604\n环保标\t289605\n税务登记号\t289606\nncep\t289607\n加利利\t289608\n民事责任\t289609\n69岁\t289610\n阿姐\t289611\n磁力搅拌器\t289612\n_澧县人民政府\t289613\n51Aspx.com\t289614\n焊道\t289615\n不痒_\t289616\n扣减\t289617\n侨批\t289618\nwebsites\t289619\nJUMO\t289620\n停牌\t289621\nPyCharm2018\t289622\n晓宇\t289623\n北京泰德制药股份有限公司\t289624\n明星潜规则之皇\t289625\n新世界百货\t289626\n现在\t289627\n奇秀\t289628\nAutoCAD2007\t289629\nbuty\t289630\n俄罗斯\t289631\n玉林\t289632\n京东集团\t289633\n报告老板之豪言壮旅\t289634\n动真格\t289635\n玄参\t289636\n军统\t289637\n胡芳\t289638\n融景城\t289639\n创赢\t289640\n星河控股集团有限公司\t289641\n垂足\t289642\n何所冬暖\t289643\n魔方世界\t289644\n考研帮\t289645\nProgramme\t289646\n/公司库/大全/\t289647\ntofu\t289648\n裂缝\t289649\n绍兴市中级人民法院\t289650\n年限平均法\t289651\n蔡斯\t289652\n拯救大兵瑞恩\t289653\n一2018\t289654\nGA-B75M-D3V\t289655\nVGJ\t289656\n京广线\t289657\nAVIS\t289658\n头段\t289659\n种人\t289660\n影像\t289661\n地劫\t289662\n卓一\t289663\nqq.com\t289664\n夏花\t289665\n殷寻\t289666\n好走\t289667\nfreegate\t289668\n夹带\t289669\n桂枝汤\t289670\n易帝\t289671\nlatter\t289672\n55升\t289673\n真美\t289674\n第56号\t289675\nLPCWSTR\t289676\n20170223\t289677\n导者\t289678\n小茜\t289679\n40余年\t289680\n塔斯马尼亚\t289681\n中国竞彩网\t289682\nBourne\t289683\n劳动人事争议仲裁委员会\t289684\n嘧菌酯\t289685\n6.33\t289686\n凹凸镜\t289687\n彩棉\t289688\n深圳民办\t289689\n二十位\t289690\n侯斯特\t289691\nhypertension\t289692\n兵子\t289693\n未日\t289694\nscboy\t289695\n邹平县\t289696\n优酷版\t289697\n清洁液\t289698\n仙侠奇缘\t289699\nGainward\t289700\n贝奥兰迪\t289701\n好戏连台\t289702\n便宜\t289703\n全冠\t289704\ncrd\t289705\n名馆\t289706\n梁志强\t289707\nMIUI资源组\t289708\n谭延闿\t289709\n李毅佳\t289710\n战国无双真田丸\t289711\n外滩十八号\t289712\noptoma\t289713\n虚拟器\t289714\n10头\t289715\ndendritic\t289716\n劳动争议仲裁委员会\t289717\n南京开发区\t289718\n100平方公里\t289719\neasyUi\t289720\n爱须心亚\t289721\n1503\t28
9722\nNEF\t289723\n上百部\t289724\nskse\t289725\nsingleton\t289726\n桐乡\t289727\n冷焊\t289728\nQAQ_\t289729\n北京大学口腔医院\t289730\nMcYy\t289731\n遐想\t289732\n货币转换器\t289733\n四季\t289734\n下课\t289735\ngb18030\t289736\n牌子\t289737\n三北\t289738\n剑荡\t289739\n牛帮\t289740\n费马\t289741\nGearman\t289742\n环球网校\t289743\n浙江省高院\t289744\nsmt\t289745\nNBT\t289746\n龙泉花园\t289747\nIFrame\t289748\nIPMP\t289749\n带病\t289750\n3只\t289751\nHomeMP3\t289752\n增值税小规模纳税人\t289753\n消费物价指数\t289754\n维护员\t289755\n大腿\t289756\nEditText\t289757\n乳胶粉\t289758\n沪深交易所\t289759\n读懂\t289760\n可变参数函数\t289761\n录比\t289762\n湾仔区\t289763\n成人之美\t289764\n金天鹅\t289765\n商业分析专业\t289766\n河智苑\t289767\n穿环\t289768\n航新科技\t289769\nt66y\t289770\npassive\t289771\n驱动级\t289772\n注册师\t289773\nHttpRequest\t289774\nils\t289775\n小门\t289776\ndateTime\t289777\n孙刚\t289778\n04月13日\t289779\n上海通用汽车有限公司\t289780\n地质出版社\t289781\n素不相识\t289782\n那些年我们一起追的女孩\t289783\n缩管\t289784\n凌云志\t289785\n琵琶虾\t289786\n4月7日\t289787\n优质股\t289788\n凤凰机器人\t289789\n惠达\t289790\n天铂\t289791\n羌绣\t289792\n北京北站\t289793\ndompdf\t289794\nG9200/全网通\t289795\nA67\t289796\n一脉\t289797\n原宿\t289798\n北京邮电大学》\t289799\n增订本\t289800\n扣扣居\t289801\n将门毒医大小姐\t289802\n苏州科技大学天平学院\t289803\n东北电力大学\t289804\n153\t289805\n传奇万能登陆器\t289806\n天地之道\t289807\n芳村区\t289808\n多一分\t289809\n防水性\t289810\n840d\t289811\n天福\t289812\n浮选机\t289813\nswiftcode\t289814\n55dns\t289815\n佛教社-地藏论坛\t289816\nIndian\t289817\nxtc860\t289818\n上证\t289819\n南华大学附属第一医院\t289820\n秀全\t289821\n明宣宗\t289822\n路阀\t289823\n罗伟\t289824\n贸易融资\t289825\n宝骏510\t289826\n打领\t289827\n英菲尼迪QX80\t289828\n中邮证券\t289829\n十几万元\t289830\n通版\t289831\nwav文件\t289832\n达尼\t289833\n应城在线\t289834\n北京市人民政府法制办公室\t289835\n天天穞日日穞夜夜穞\t289836\n异木棉\t289837\n一对一|\t289838\n尸鬼\t289839\n庭妍\t289840\n甚多\t289841\n中国交通运输协会\t289842\n办公族\t289843\n补肺丸\t289844\n耿为华\t289845\n六花\t289846\n材生\t289847\nINDUSTRIES\t289848\nVideoJS\t289849\n烟渍\t289850\n纯净水\t289851\n酚咖片\t289852\n广西百色政协\t289853\n氟比洛芬凝胶贴膏\t289854\n97mm\t289855\n易铺网\t289856\n龙湖长楹天街\t289857\nInsecure\t289858\nusb集线器\t289859\nbytes\t289860\nSuppressi
on\t289861\nSN级\t289862\n咪哒minik\t289863\n胖丫\t289864\n法邦\t289865\n炊大皇\t289866\n春英\t289867\n鬼气\t289868\n高中数学必修3\t289869\n上海家博会\t289870\nMagSafe\t289871\n武汉市档案局\t289872\n孟加拉达卡\t289873\n影子传说\t289874\n换格\t289875\nclistctrl\t289876\ndoctrine\t289877\n霸星\t289878\n药郎\t289879\n货运代理公司\t289880\n2019级\t289881\n诺博\t289882\n剥削阶级\t289883\n大权\t289884\n古诗词三首\t289885\n土豆饼\t289886\n杜比\t289887\nFV\t289888\n树正气\t289889\n杨凯\t289890\n榴莲千层蛋糕\t289891\n中共孝感市委党校\t289892\n刑事侦缉档案4\t289893\n金济赫\t289894\n广西师范大学漓江学院\t289895\n吴雨霏\t289896\n浙江省人民防空办公室\t289897\n肉片\t289898\n万起\t289899\n40000元\t289900\nShibuya\t289901\np40\t289902\n0时\t289903\n渔翁\t289904\n练市镇\t289905\n璟\t289906\n会费\t289907\n江西现代职业技术学院\t289908\nJPG\t289909\n云顶梦号\t289910\n传染性单核细胞增多症\t289911\n海淀教育网\t289912\n鹿柴\t289913\n新疆警察学院\t289914\n误导\t289915\n280平米\t289916\northogonal\t289917\n51元\t289918\n给惠网\t289919\n钢炉\t289920\n玩腻\t289921\n帝旺\t289922\n2016年里约奥运会\t289923\n武宁\t289924\n腿袜\t289925\n极客\t289926\n象皮\t289927\n骨粉\t289928\nViscosity\t289929\n股神\t289930\n苹果泥\t289931\n0s\t289932\n梦境\t289933\n武昌起义\t289934\n罗布林卡\t289935\n注册权\t289936\n巧克力球\t289937\n天天奔腾b50\t289938\n耳背式\t289939\n隆尧\t289940\nMOEMILK\t289941\nSciFinder\t289942\n天刀唐门\t289943\n奔驰遮天\t289944\n十七八岁\t289945\n大婶\t289946\n1673\t289947\n纲吉\t289948\n张顺\t289949\n08:00\t289950\n玻璃丝\t289951\nMississippi\t289952\n医籍\t289953\n圆陆鲨\t289954\nkistler\t289955\n保定东站\t289956\n梁式\t289957\n骐\t289958\n鞍山西\t289959\nryona\t289960\n淳于\t289961\nVMware\t289962\n中信所\t289963\n行云流水\t289964\n黑纱\t289965\nloans\t289966\n高亚男\t289967\n2017#\t289968\n活动家\t289969\n走势图\t289970\n俄语版\t289971\n网络攻击\t289972\n太上老君说常清静经\t289973\n两断\t289974\nmyLittleGarden\t289975\n思君\t289976\n寄付\t289977\n墨刀\t289978\n250G\t289979\n王建文\t289980\ncnx\t289981\n萌芽\t289982\n绝佳\t289983\n金盘奖\t289984\nYantai\t289985\n泉山_泉山区政府网\t289986\n萧薇\t289987\n求新\t289988\n80亿元\t289989\n南昌火车站\t289990\n定餐\t289991\n吃出\t289992\n帝王陵\t289993\n1496\t289994\n西安碑林博物馆\t289995\nad17\t289996\n缠绕垫\t289997\n娃娃屋\t289998\nopinion\t289999\n60多\t290000\n分生\t290001\n93平米\t290002\n勇者斗恶龙8
\t290003\n第6年\t290004\n船税\t290005\n民国三年\t290006\n笨鱼\t290007\ndirver\t290008\n替换\t290009\n精灵3\t290010\nmacintosh\t290011\nArabia\t290012\n脑王\t290013\n传播系\t290014\n奶黄包\t290015\nf320\t290016\n东营百姓网\t290017\n行为主义\t290018\n吴争\t290019\nPall\t290020\n苏通园区\t290021\n李明顺\t290022\n炎陵县人民政府\t290023\n二郎神\t290024\n同乐城\t290025\n原文\t290026\n315吨\t290027\n有机化学反应\t290028\n出资\t290029\n福特金牛座\t290030\n流毒\t290031\n处罚单\t290032\n沉默\t290033\n尚湖风景区\t290034\n表象\t290035\n美足\t290036\n诡话\t290037\n合气\t290038\n锦瑞\t290039\n何地\t290040\n勤上光电\t290041\niertutil\t290042\n勾号\t290043\n内蒙古自治区党委\t290044\n吴茱萸\t290045\nShopEx\t290046\ntutor\t290047\n唇霜\t290048\n使命召唤15\t290049\n通信业\t290050\ndsoframer\t290051\n300L\t290052\n警车\t290053\n75本\t290054\n仙侣奇缘2\t290055\n江汇\t290056\n自驾租车\t290057\n8年前\t290058\n缓期\t290059\n舒芙蕾\t290060\n应用文书\t290061\nefs\t290062\n智钻\t290063\n金阳县\t290064\n模糊度\t290065\n设摊\t290066\nRCC\t290067\n分盘\t290068\n黄古林\t290069\n公元\t290070\n胡江\t290071\nnextInt\t290072\n新希望\t290073\n郑成功\t290074\nScorpions\t290075\n周至县\t290076\nSurf\t290077\nbrave\t290078\n2k17mc\t290079\nQ460\t290080\n星球大战8\t290081\n城市空间\t290082\n纤妃\t290083\n数码宝贝linkz\t290084\nshujuku\t290085\n狙击精英V2\t290086\n工业连接器\t290087\n潘洁\t290088\n鲜族\t290089\n碰瓷\t290090\n美极了\t290091\nreadme\t290092\n双\t290093\n66集\t290094\n林曦\t290095\n区经信局\t290096\nsft\t290097\nhellcase\t290098\n巴丹\t290099\n撒贝宁\t290100\n溜子\t290101\n0599\t290102\n包装秤\t290103\n锅炉炉\t290104\n黑衣人3\t290105\nHandler\t290106\nCSO\t290107\n部室\t290108\n广泛\t290109\n辅助工\t290110\n分枝\t290111\n新都桥\t290112\n茶缘\t290113\n麗\t290114\n全素\t290115\n公平合理\t290116\n下阴\t290117\nflym\t290118\n黑莓论坛\t290119\n泰安路\t290120\n布洛陀\t290121\n昊锐\t290122\n35码\t290123\n锁批\t290124\n怡泉\t290125\n百花岭\t290126\n第N次\t290127\n无限连\t290128\n网易云热\t290129\n昭华\t290130\n原心\t290131\nmodule5\t290132\n瓦城\t290133\n炼焦煤\t290134\nmedica\t290135\n網購\t290136\n新疆共青团\t290137\n剑齿虎\t290138\n夜床\t290139\n柳铁一中\t290140\nVladimir\t290141\n粑\t290142\n危急值\t290143\n爆炒江湖\t290144\n法国巴黎银行\t290145\n常委们\t290146\n龙湖蓝湖郡\t290147\n我的心\t290148\n鸣金\t290149\n暂\t290150
\n王超\t290151\nBoarding\t290152\n王绪\t290153\n升龙道\t290154\n娄氏\t290155\n朱雀记\t290156\nFloating\t290157\n复仇者联盟4\t290158\n中国科学院昆明动物研究所\t290159\n拉链表\t290160\n诗海\t290161\n1364\t290162\n京东通信\t290163\njuqery\t290164\n武汉东湖风景区\t290165\n龙潜\t290166\nitext7\t290167\n禁忌_39健康网\t290168\n养马岛\t290169\n3-5月\t290170\n外心\t290171\n有货网\t290172\nMorrison\t290173\n电磁脉冲阀\t290174\n小幅\t290175\n嘉阳\t290176\nboxed\t290177\n徐行镇\t290178\n5053\t290179\nEx\t290180\n召回\t290181\n香机\t290182\nGK888t\t290183\n蠹虫\t290184\n长江西路\t290185\n家用增压泵\t290186\n提完\t290187\n紫苏油\t290188\nNZ\t290189\n杨梅汁\t290190\npilot\t290191\n钢瓶\t290192\n发展处\t290193\n丁绍光\t290194\n北京小区\t290195\n北京市药品监督管理局\t290196\n香港金融管理局\t290197\n2417\t290198\n周助\t290199\n日益\t290200\n晋西北\t290201\n营养学家\t290202\n南昌师范学院\t290203\n反硝化\t290204\n内奸\t290205\n福来恩滴剂\t290206\n声波\t290207\n甘孜\t290208\n说破\t290209\n正男\t290210\n徐晓明\t290211\n志源GO\t290212\n何项\t290213\nvelvet\t290214\n致良\t290215\n中诺餐饮加盟网\t290216\ngddq\t290217\nMESA\t290218\n掀翻\t290219\n琼脂糖\t290220\n狮头股份\t290221\n电子单\t290222\n过路费\t290223\n掰弯\t290224\n男皮\t290225\n条型\t290226\n雷柏\t290227\n乔哈特\t290228\n木质\t290229\n齐格飞\t290230\n贰毫\t290231\n削皮\t290232\n偶滴歌神啊\t290233\n非常完美\t290234\n中国酷公司\t290235\n全心\t290236\n混流式\t290237\n盘库\t290238\ncubemap\t290239\n灵犀\t290240\n防尘条\t290241\n天之炼狱\t290242\n白胚\t290243\n肾宝\t290244\n易盛\t290245\nJMS\t290246\npronhub\t290247\n储蓄率\t290248\n湖南高院\t290249\nStruts2\t290250\n节点\t290251\n342号\t290252\n围裙\t290253\n1749\t290254\nneptune\t290255\n横跨\t290256\n第二页\t290257\n饰品\t290258\n中亚牧羊犬\t290259\n湿热\t290260\n入职体检表\t290261\ninterference\t290262\n晓彤\t290263\n曼富\t290264\n第二章\t290265\n弹坑\t290266\n六代机\t290267\n老四\t290268\n淮安市政府\t290269\nctl672\t290270\n北京中国建设银行\t290271\n王宇飞\t290272\n卯兔\t290273\n虐文\t290274\nLine\t290275\n清迈机场\t290276\n12幅\t290277\n2894\t290278\n万宝盛华\t290279\n向外\t290280\n整数线性规划\t290281\nf103\t290282\n曲线方程\t290283\neosDAC\t290284\npaddle\t290285\nSIX\t290286\n浆砌片石\t290287\n湖北省政协\t290288\n刘老庄\t290289\n他是龙\t290290\nJDK\t290291\n何冰冰\t290292\nasap\t290293\nsag\t290294\n鼾声\t290295\nthreadpoo
lexecutor\t290296\n伶官传序\t290297\n0.035\t290298\n20151028\t290299\n4.2.5\t290300\n氯氟醚菊酯\t290301\n投资客\t290302\nterms\t290303\n易语言编程\t290304\n马里布\t290305\n四辑\t290306\n几百斤\t290307\n女兽\t290308\n黄英华\t290309\n绝地武士\t290310\n轻机枪\t290311\n铁建山语城\t290312\nHighway\t290313\n张剑波\t290314\n四川人事考试网\t290315\n姑娘我爱你\t290316\n石楠树\t290317\n犬道\t290318\n6.9\t290319\n南码头路\t290320\n膦\t290321\n博士德\t290322\nReseller\t290323\nXShell\t290324\n440项\t290325\nHasee\t290326\n水淹没\t290327\nparts\t290328\n吉利领克01\t290329\n速码\t290330\n李春林\t290331\n玉米汁\t290332\n交叉\t290333\n余秋里\t290334\n纸钱包\t290335\n别针\t290336\n吃相\t290337\nHaving\t290338\nOkHttp3\t290339\n共青团浙江大学委员会\t290340\n邮东西\t290341\n后周\t290342\n本义\t290343\n厂矿\t290344\nei5\t290345\n车置宝\t290346\n无明\t290347\n森林资源\t290348\nsylar\t290349\n重弹\t290350\nabbr\t290351\n凤城\t290352\nFlas\t290353\nshmeea\t290354\n爱学堂\t290355\n马克思主义哲学\t290356\n红领带\t290357\n落影绘书\t290358\nchangeset\t290359\nsgl\t290360\nSamurai\t290361\nTomcat7\t290362\ntct\t290363\nCorbin\t290364\n宫斗\t290365\n树池\t290366\n弘基\t290367\n乾元镇\t290368\nш\t290369\nprofoto\t290370\n一等奖\t290371\n万座\t290372\n始生\t290373\n隐形眼镜护理液\t290374\n药物治疗\t290375\n接力\t290376\n刘师傅\t290377\n复联4\t290378\n膨化机\t290379\n娇丽\t290380\n5208\t290381\n明叔\t290382\nAzureSky\t290383\n国产屏\t290384\n广州南站汽车客运站\t290385\n经不住\t290386\n中国商务部\t290387\n陶瓷板\t290388\n惊鸿舞\t290389\n获胜\t290390\n回批\t290391\nFlynn\t290392\nuw\t290393\nvtiger\t290394\n煤种\t290395\nIonic\t290396\n两则\t290397\n材料包\t290398\n小儿支原体肺炎\t290399\n罗源县\t290400\n雁塔区\t290401\n审判庭\t290402\n紫鑫药业\t290403\n房价收入比\t290404\n挂盘\t290405\nseco\t290406\n远进\t290407\n厦门金龙\t290408\n会声会影X6\t290409\nPREMIER\t290410\n全域\t290411\n隔离霜\t290412\n虫师\t290413\n章节号\t290414\n一场梦\t290415\nCITY\t290416\n整合式\t290417\n等候\t290418\nq701\t290419\n葡萄糖注射液\t290420\n张志\t290421\n王普\t290422\n主观题\t290423\n下场\t290424\n安瓿\t290425\n文本文\t290426\n李依晓\t290427\n融侨华府\t290428\n拜泉县\t290429\n育龙在职博士网\t290430\n寄宿公寓2\t290431\n云南省卫生厅\t290432\n关联度\t290433\nGSC\t290434\n夜明的孤行灯\t290435\n新水浒传\t290436\n王睿卓\t290437\n聚合类\t290438\n挪威语\t290439\n美艳\t29
0440\nRingtone\t290441\ndateframe\t290442\n婳\t290443\nPart.1\t290444\n广汽新能源\t290445\n夏色\t290446\n檄文\t290447\n武神塔\t290448\n米兹\t290449\n20160728\t290450\nlogin\t290451\n口袋妖怪黑白2\t290452\n免遭\t290453\n江东新城\t290454\ntalked\t290455\n陪驾\t290456\n两河\t290457\n辰星\t290458\n虾麦果\t290459\n江西中专学校\t290460\n第八集\t290461\nfldq\t290462\n词根\t290463\n力线\t290464\n智行\t290465\nvisio2013\t290466\n108米\t290467\n形势\t290468\n科西嘉\t290469\n315货源网\t290470\n弹簧拉力器\t290471\n售后服务点\t290472\n开平机\t290473\n湖南站\t290474\n石钢\t290475\n989\t290476\n苏轼\t290477\n上海市人大常委会\t290478\n植观\t290479\n春联\t290480\n八九岁\t290481\n水岭\t290482\n整容室\t290483\n领回\t290484\n国嘉人才网\t290485\n朱华荣\t290486\n白龙桥镇\t290487\n8期\t290488\n不胜其烦\t290489\n奥赛博物馆\t290490\n帳\t290491\n广东省人民对外友好协会\t290492\n3377\t290493\n一百位\t290494\n天回镇\t290495\npeacock\t290496\n水平线\t290497\nSWOT分析\t290498\n郑子豪\t290499\n官民\t290500\nThames\t290501\n久吾高科\t290502\n拉鲁拉斯\t290503\n机务段\t290504\n奥尔良鸡翅\t290505\n养颜胶囊\t290506\n黄流镇\t290507\n铺设\t290508\n唐山火车站\t290509\nazw3格式\t290510\nKDJ指标\t290511\n七丽\t290512\n森源电气\t290513\n会计毕业论文\t290514\n卫康\t290515\n王总\t290516\npredix\t290517\n十二册\t290518\n港元\t290519\n辣哭\t290520\n四川省国家税务局\t290521\n红磡站\t290522\n唐氏\t290523\n爱丽莎\t290524\narchi\t290525\n化妆桌\t290526\n怨屋本铺\t290527\nbusy\t290528\ncdda\t290529\n圆圈\t290530\n0075\t290531\n海峡钓鱼论坛\t290532\n_面\t290533\npxe\t290534\n臻爱\t290535\nNaughty\t290536\n道德讲堂\t290537\nipk\t290538\n便所\t290539\n为美好的世界献上祝福\t290540\n橇\t290541\n宁片\t290542\n断刀客\t290543\n金星小区\t290544\nadxl345\t290545\n注明\t290546\nNebo\t290547\n4^2\t290548\n八十二岁\t290549\n河北青年报\t290550\n清刚\t290551\n致远中学\t290552\n留尼汪岛\t290553\n红外测距仪\t290554\n创\t290555\n谷歌联盟\t290556\n贵阳火车站\t290557\n数千只\t290558\n出面\t290559\n深圳公立医院\t290560\n财务代理记账\t290561\n蒸肉饼\t290562\n主动脉瓣狭窄\t290563\n淫夜\t290564\nInspiron灵越15\t290565\n生态位\t290566\nharmonic\t290567\nstdio\t290568\n201511\t290569\n梅墨生\t290570\n黑鹰s\t290571\n改善\t290572\n神经性耳聋\t290573\n查寝\t290574\n两江\t290575\n业务章\t290576\n天津市南开区\t290577\n完美国际\t290578\n冬儿\t290579\n垃圾筒\t290580\n师弟\t290581\n万象汇\t290582\n中国仪器仪表学会\t290583\n几盆\t
290584\n我的特一营\t290585\n投资期\t290586\n樱木凛\t290587\n版本\t290588\n抓紧\t290589\n砀山酥梨\t290590\n卧底\t290591\npdh\t290592\n网络安全宣传周\t290593\n7000万\t290594\n北美东部\t290595\n蜡坛\t290596\nRIP\t290597\n均匀化\t290598\n罗店镇\t290599\n刘三\t290600\n小乖\t290601\n踩踩踩\t290602\n卡费\t290603\nPapers\t290604\n1300万\t290605\nKS\t290606\n十三个月\t290607\n1304\t290608\n骆驼祥子读后感\t290609\n渐显\t290610\n最高人民法院\t290611\n复性\t290612\n20171121\t290613\n视场\t290614\n15行\t290615\n净莲\t290616\n惠州\t290617\n四川分公司\t290618\n七娘山\t290619\n小微企业贷款余额\t290620\norganization\t290621\ncalorie\t290622\n中文域名\t290623\n脑瘫儿\t290624\n杨俊毅\t290625\n预编译\t290626\n辽宁金融职业学院\t290627\n微生物菌剂\t290628\n含硫量\t290629\n黄游\t290630\n雪茄\t290631\n跪拜礼\t290632\n范里安\t290633\n胡某\t290634\n98k\t290635\n小秒\t290636\n乌尔善\t290637\n九天玄女\t290638\n汉风\t290639\n姓方\t290640\n蛇精男\t290641\n酷米\t290642\n山丹县\t290643\n广州物流公司\t290644\n发酵箱\t290645\n豪士\t290646\n新声\t290647\n肛肠频道_健客网\t290648\n一言不发\t290649\n20141027\t290650\nR11\t290651\n长协\t290652\n美少女梦工厂5\t290653\n有机化学基础\t290654\n张江镇\t290655\n黄健\t290656\n0432\t290657\nhonors\t290658\n侨城一号\t290659\n薛薇\t290660\n豆腐花\t290661\n西海情歌\t290662\n南京市旅游委员会\t290663\n灵感\t290664\nMultivariate\t290665\n第91号\t290666\n叶寒\t290667\n仁皇山\t290668\n20131230\t290669\n贵阳电信\t290670\nPVS\t290671\n石柱土家族自治县\t290672\n江背镇\t290673\n中碳钢\t290674\n圈中人\t290675\n淡饭\t290676\n二龙湖\t290677\n青岛大学附属医院\t290678\n缩读\t290679\n丧失\t290680\ndirect\t290681\n多头排列\t290682\n银登中心\t290683\nclare\t290684\n营运\t290685\n欧阳明\t290686\n茂腔\t290687\n发明奖\t290688\n严爵\t290689\n黑星\t290690\n电尾\t290691\n龙山庄\t290692\n杂诗\t290693\n黄河三角洲\t290694\n顺道\t290695\n沉淀仓\t290696\nx-4\t290697\n结棍\t290698\n东莞旅行社\t290699\n热血飞度\t290700\n弋\t290701\ndongman\t290702\n万科星园\t290703\n陈泽泽\t290704\nzhaoren\t290705\n山东省煤炭工业局\t290706\nstaff\t290707\nicyjiang\t290708\n排气量\t290709\n细体\t290710\n神皇\t290711\n美味佳肴\t290712\n圣才教育\t290713\n乡卫生院\t290714\n6plus\t290715\n邓洁\t290716\n转载机\t290717\n藍光\t290718\n手痛\t290719\n迷镇凶案\t290720\ncard篇\t290721\n身在\t290722\n半妖乳娘\t290723\n加格达奇\t290724\nlake\t290725\n場所\t290726\n宜妃\t290727\n湖南农大\t290728\n峪\t2907
29\n网袜\t290730\nET199\t290731\n111号\t290732\n军备\t290733\n性史\t290734\n岁时\t290735\n主备\t290736\n泉州台商区\t290737\n配送\t290738\n1795\t290739\n海洋大学\t290740\n茅威涛\t290741\n13首\t290742\n听力学\t290743\n方正证券股份有限公司\t290744\n红腹锦鸡\t290745\n徐丹\t290746\n起亚沃尔沃xc60\t290747\n三丰智能\t290748\n专四专八\t290749\nxie仗剑天涯\t290750\n坐标纸\t290751\n十五岁\t290752\n月落乌啼霜满天\t290753\n恒通科技\t290754\nemt\t290755\n下町\t290756\n内含\t290757\nword2vector\t290758\n模拟法\t290759\n氢溴酸\t290760\n小儿咳喘灵口服液\t290761\n会议题\t290762\n鼻衄\t290763\n通臂猿猴\t290764\n双氰胺\t290765\n过去分词\t290766\n8301\t290767\n广西政协\t290768\n修仙者\t290769\n4411\t290770\n龙源期刊网\t290771\nvw\t290772\n四百元\t290773\n健身公园\t290774\n后藤\t290775\nwebgl\t290776\n长滩湖\t290777\n沈佳妮\t290778\n看电\t290779\n回杯记\t290780\n吉野家\t290781\n防撞条\t290782\n雄安\t290783\n张文康\t290784\nopenfalcon\t290785\n江都\t290786\n王澍\t290787\n女警沉沦之夜莺俱乐部\t290788\n威斯敏斯特大教堂\t290789\n杜浔镇\t290790\n斯嘉丽约翰逊\t290791\n写生簿\t290792\n复变函数论\t290793\n400_\t290794\n华美银行\t290795\n样书\t290796\n无比\t290797\nDallas\t290798\n双辽\t290799\n国基\t290800\n龙马潭区\t290801\n中科大厦\t290802\n社区服\t290803\n干簧管\t290804\n叙空军基地\t290805\n封掉\t290806\n迪巧\t290807\n蒸汽波\t290808\n中山北站\t290809\nBSE\t290810\n灵能\t290811\n托克逊\t290812\n几方\t290813\nfre\t290814\n可鉴\t290815\n九阴真经_九阴真经OL\t290816\n狼图腾\t290817\n朗宇\t290818\n认定书\t290819\n绵阳市区\t290820\nverge\t290821\n宿松县\t290822\n地合网\t290823\n必应\t290824\n立德\t290825\n41名\t290826\n五线\t290827\nShopper\t290828\n北京昌\t290829\n晋中银行\t290830\n小胜\t290831\n塑壳断路器\t290832\n房地产市\t290833\n直播员\t290834\n逃窜\t290835\n低电压\t290836\nWalgreens\t290837\n全能骑士\t290838\nethos\t290839\n美御\t290840\n安装板\t290841\n20160901\t290842\n刘诗昆\t290843\n出示\t290844\n断奏\t290845\n西溪八方城\t290846\nFinalShell\t290847\n车\t290848\n久久结婚网\t290849\n千雷殿\t290850\n吉州区\t290851\n贵州地区\t290852\n青贮\t290853\n梦徒\t290854\n折翼\t290855\n608路\t290856\n科属\t290857\n信诺\t290858\n红米Note\t290859\n原长\t290860\n推拿学\t290861\n凤城七路\t290862\n专克\t290863\n855\t290864\n轮舞曲\t290865\nankhsvn\t290866\n所得率\t290867\n色界\t290868\n力鼎\t290869\n粉蒸排骨\t290870\nwww.3490.CN\t290871\n安培拉星人\t290872\n湘赣\t290873\n高密度脂蛋白\t290874\n
长寿之乡\t290875\n物华\t290876\nAdb\t290877\nm154a\t290878\n华盛奥特莱斯\t290879\n星宿派\t290880\nTVOC\t290881\n1771\t290882\n科院\t290883\ninter\t290884\ndvd导航\t290885\n四谛\t290886\n件照\t290887\n沙埔\t290888\nComics\t290889\n都市报\t290890\n连南县政府\t290891\n滥杀\t290892\n中国江西网\t290893\n沂水\t290894\n0.27\t290895\n池田依\t290896\n递增\t290897\n筋道\t290898\n熟悉\t290899\nDeclaration\t290900\n出现率\t290901\n老老实实\t290902\n苏小凉\t290903\n江苏省国信资产管理集团有限公司\t290904\n敢闯\t290905\n百王\t290906\n文征\t290907\n20多万\t290908\n解\t290909\n哇嘎\t290910\n莽山\t290911\nSwipeRefreshLayout\t290912\n志诚\t290913\n图仪\t290914\n光度计\t290915\n无锡市第一中学\t290916\n送转\t290917\nsheen\t290918\n郭美美\t290919\nHCC\t290920\n谢采妘\t290921\n颜凉雨\t290922\n冤人\t290923\n语句块\t290924\n喷气\t290925\nlien\t290926\n培研\t290927\n位运算符\t290928\n承办\t290929\nVray\t290930\n东方神娃\t290931\n建国饭店\t290932\nDocuPrint\t290933\n志铭\t290934\n禁鞭\t290935\n全皮\t290936\nRowe\t290937\n荒川百战\t290938\n中共中央台湾工作办公室\t290939\nXManager\t290940\n暴走邻家\t290941\n龙山村\t290942\n3MM\t290943\nPitchfork\t290944\n美国司法部\t290945\n京翰教育\t290946\n液位仪\t290947\n盛元\t290948\n宽阔\t290949\nhamachi\t290950\n宋丹丹\t290951\n任达华\t290952\n海口法院网\t290953\n20180115\t290954\n七女\t290955\n珍珠疹\t290956\n超链接\t290957\n日联杯\t290958\n信义嘉庭\t290959\n一点一滴\t290960\n阴招\t290961\nFriday\t290962\n英雄合击\t290963\n对答\t290964\n呼和浩特小学\t290965\n彭十六\t290966\n撤去\t290967\n畅销版\t290968\n两任\t290969\n六枝吧\t290970\n第一會所\t290971\nrstudio\t290972\n上一秒\t290973\n道德观\t290974\n实效性\t290975\n凶鬼\t290976\n撤稿\t290977\n暗色\t290978\nng4\t290979\n坏血\t290980\nRough\t290981\nsequ\t290982\n佛山南庄四中\t290983\nSystemCare\t290984\n不得人心\t290985\n音器\t290986\n迎风坡\t290987\n规例\t290988\n物流展\t290989\n荷包牡丹\t290990\n配模\t290991\n烤盘\t290992\n修仙女\t290993\n杨文华\t290994\n长春民生网\t290995\n装袋\t290996\n15400\t290997\n英皇\t290998\n常旅客\t290999\n沂河\t291000\n任泉\t291001\n甲状腺炎\t291002\n寒暑\t291003\nBabies\t291004\n不冻港\t291005\nGrill\t291006\n发行价\t291007\n成思危\t291008\n长春肤康同济医院\t291009\n趋稳\t291010\nPUA\t291011\n7平方米\t291012\n隐为者\t291013\n青干\t291014\n三泰控股\t291015\nvagaa哇嘎画\t291016\n狗球\t291017\n煎饼侠\t291018\n徐州医科大学附属医院\t29101
9\nPC6手游网\t291020\n体育学院\t291021\n前韵\t291022\n国家外专局\t291023\n叛逆性百万亚瑟王\t291024\n人以群分\t291025\ndisplaylink\t291026\nWhata\t291027\n梦乡\t291028\nIngredients\t291029\nloz\t291030\nsearchbar\t291031\nLuciferase\t291032\n唐诗三百首\t291033\nrecursive\t291034\n气象学\t291035\n贝亲桃子水\t291036\n买房者\t291037\n男孩子们\t291038\n尘沙\t291039\nNDSi\t291040\n美赛\t291041\nnikki\t291042\n黄唇鱼\t291043\n沪商\t291044\nwin7任务管理器\t291045\n系统论\t291046\n霍尔果斯经济开发区\t291047\n吕家传\t291048\npayout\t291049\n海淀区房屋管理局\t291050\n万能保险\t291051\nN次\t291052\n郭凯\t291053\nging\t291054\n宁波晚报\t291055\n免密支付\t291056\n杨溪谷\t291057\n第63号\t291058\n汉化补丁+\t291059\n嗜热链球菌\t291060\n基金从业资格\t291061\n学研\t291062\n芭提亚\t291063\ncheapest\t291064\n吴泾\t291065\n常州公安\t291066\n奔驰a200\t291067\n异邦\t291068\n有氧健身操\t291069\n韵脚\t291070\n布莱德湖\t291071\nSherlock\t291072\n板词\t291073\n唐模\t291074\n超光速\t291075\n好景不长\t291076\n辛夷花\t291077\n53所\t291078\n勾勾\t291079\n老新村\t291080\n火哥\t291081\n115礼包\t291082\nbmpcc\t291083\n黑旗\t291084\n合发\t291085\n小亚细亚\t291086\n电脑架\t291087\n扎伤\t291088\n星生\t291089\n韵部\t291090\n4000公里\t291091\n病女\t291092\n爱来了别错过\t291093\n25MM\t291094\n文理科\t291095\n错道\t291096\n水榭花城\t291097\nClien\t291098\ndevelopments\t291099\n小叶紫檀手串\t291100\n如家宾馆\t291101\n石家庄广播电视台\t291102\n自由式摔跤\t291103\n逆反应\t291104\nneedn\t291105\n港产片\t291106\n手膜\t291107\n飞马航空\t291108\n盛开\t291109\n假性尖锐湿疣\t291110\n防潮柜\t291111\nhaonanElva\t291112\n三七机\t291113\n宝马X1\t291114\n县农业局\t291115\n中华人民共和国公共文化服务保障法\t291116\n环线\t291117\n1088\t291118\n新怡\t291119\n武神坛\t291120\n互慰\t291121\ntxz\t291122\n补测\t291123\n项少龙\t291124\n银屏\t291125\nLAP\t291126\n偏差分析\t291127\npua\t291128\n增效剂\t291129\nstt\t291130\n定轨\t291131\n蓝色经济区\t291132\n月胎\t291133\nSiteTag\t291134\n课后感\t291135\naisex\t291136\n山东政府\t291137\n伶俐\t291138\n7.1亿\t291139\n大头兵\t291140\n点钞机\t291141\n绝交\t291142\n大好爽\t291143\n非独立核算分公司\t291144\n泰和县人民政府\t291145\n3.6.2\t291146\n击碎\t291147\n2500亿元\t291148\n温热\t291149\nCMO\t291150\n王姿允\t291151\n顾祥林\t291152\n舟山医院\t291153\nCallBack\t291154\n袁州区\t291155\n躁狂症\t291156\n存款利率_定期存款利率\t291157\n没错误\t291158\ndwl\t291159\n易奇
\t291160\n皮克斯\t291161\n蜘蛛\t291162\nFileOutputStream\t291163\n美法\t291164\n1亿条\t291165\nRelations\t291166\n报社\t291167\ndota2steam\t291168\n乐檬X3\t291169\nAD18\t291170\n黑龙江电信\t291171\n2007年度\t291172\n宇航员\t291173\n18046期\t291174\n阿尔郎\t291175\n系列展\t291176\nremnant\t291177\n通图\t291178\n上海自动化仪表股份有限公司\t291179\nHiGH\t291180\nxmind7\t291181\n1bit\t291182\n欺诈游戏\t291183\n倩儿\t291184\ni34170\t291185\n3.17\t291186\n得实\t291187\n周思聪\t291188\n11.7%\t291189\n青岛干部网络学院\t291190\n祢衡\t291191\n2017年5月23日\t291192\nSTD\t291193\nROP\t291194\nPerm\t291195\nzhang\t291196\nCebu\t291197\n巴丹吉林\t291198\n工作者\t291199\n战争机器4\t291200\n魔罐\t291201\nSupplement\t291202\n东方花旗\t291203\n加勒比海盗1\t291204\ndlut\t291205\n拼音本\t291206\n亚太版\t291207\n魔球\t291208\n徐霞客镇\t291209\n牛背山\t291210\n中荷人寿\t291211\n湖父镇\t291212\n小修\t291213\n纵版\t291214\n2.0版\t291215\n南哥\t291216\nkdiff3\t291217\n天外天\t291218\nben10\t291219\n吃力不讨好\t291220\n闵红玉\t291221\n无头\t291222\n好象\t291223\n往年\t291224\nUAL\t291225\n众惠\t291226\n凤凰古镇\t291227\n校直机\t291228\n悯农\t291229\n手串\t291230\n中国共产党第十七次全国代表大会\t291231\n八犬传\t291232\n忠旺\t291233\ntract\t291234\n南加州大学\t291235\n南京中学\t291236\n黍离\t291237\n人身保险业\t291238\nPotPlayer\t291239\n无一例外\t291240\n周凯旋\t291241\n嘉绒\t291242\n天津市统计局\t291243\nWiley\t291244\n中国航空研究院\t291245\njpt\t291246\n普学网\t291247\nassignments\t291248\n上海电视台\t291249\n宠博会\t291250\n3晚\t291251\n驮着\t291252\ndisclosure\t291253\nDigilent\t291254\n句尾\t291255\nmpstat\t291256\n中共宁波市委组织部\t291257\n丑小鸭\t291258\n篆刻\t291259\n薛凯琪\t291260\n移动集团\t291261\ndots\t291262\nrcu\t291263\n头女\t291264\nSTANLEY\t291265\n长版\t291266\nuntitled\t291267\nCoA\t291268\n简法\t291269\n洁丽雅\t291270\n王富\t291271\nradiator\t291272\n加州大学河滨分校\t291273\n富川\t291274\n饭否\t291275\n157\t291276\n1637\t291277\n三才阵\t291278\n婚姻生活\t291279\n大坝村\t291280\n全热交换机\t291281\n忍者龟\t291282\n落月\t291283\n621\t291284\n美科\t291285\n天降之物\t291286\nㄚ\t291287\n导刊\t291288\n_根管治疗\t291289\n百花苑\t291290\nLAMER\t291291\n曲线积分\t291292\n昆明公园\t291293\n雨林缸\t291294\nG9250\t291295\n酒香\t291296\ndask\t291297\n高家堡\t291298\n几十名\t291299\n恶变\t291300\n结营\
t291301\n襄樊\t291302\n西班牙人\t291303\n开天战神\t291304\n尚居\t291305\n方家村\t291306\nifm\t291307\nV2.2.3\t291308\n密封式\t291309\n专注\t291310\n陈锦\t291311\n分期收款\t291312\n需缴纳\t291313\noneinstack\t291314\n维拉诺瓦\t291315\n兰州市区\t291316\n激光机\t291317\n元音\t291318\n青味\t291319\n升学宴\t291320\n除铁器\t291321\nseria\t291322\n25.3\t291323\n无渊\t291324\n孝衣\t291325\nTIME\t291326\n北京广播电台\t291327\n销项负数\t291328\n万兽岭\t291329\n欢乐岛\t291330\nintegra\t291331\n约会大作战\t291332\n小楠\t291333\n转气\t291334\n思淫欲\t291335\n迥\t291336\n镁条\t291337\n乙未豪客传奇\t291338\n贵州省招生考试院\t291339\n机关\t291340\n欢送\t291341\n途昂\t291342\nADSL拨号\t291343\n偷腥\t291344\n炼金术士吧\t291345\n回合\t291346\n黄屿\t291347\n李明亮\t291348\ncos2x\t291349\nsudoku\t291350\n133平米\t291351\nerno\t291352\n太平人寿保险公司\t291353\n票货\t291354\n萧秋水\t291355\n中苏\t291356\n北大科技园\t291357\n魔女宅急便\t291358\nstmt\t291359\n好滋味\t291360\nmatthew\t291361\n双燕\t291362\n费解\t291363\n深圳科创委\t291364\n索里亚诺\t291365\n冷气\t291366\n剪应力\t291367\n飞鹤星飞帆\t291368\n3412\t291369\n_Hi商学院\t291370\nBOUNCE\t291371\n仮名\t291372\n九堡客运中心\t291373\n永太科技\t291374\n弹花机\t291375\n直言\t291376\nvi模式\t291377\n欢聚\t291378\n赔员\t291379\nFOOT\t291380\n12306Bypass\t291381\n发改委\t291382\n长叹\t291383\ncontour\t291384\n地缝\t291385\n昌都地区\t291386\n终极装备\t291387\n黒鲨红魔\t291388\n面料\t291389\n梦想星城\t291390\n梁天\t291391\n李雪寒\t291392\n张灵\t291393\nメッセ\t291394\n快乐网\t291395\n周望\t291396\n清姬\t291397\n缓行\t291398\n义乳\t291399\n很失望\t291400\n华为手环B3\t291401\n焦墨\t291402\n10欧\t291403\n国光电器\t291404\n海陆\t291405\n分段\t291406\n火龙果\t291407\n小牛u1\t291408\n网页计算器\t291409\n十八中\t291410\n质性\t291411\n碑林区\t291412\n王晖\t291413\n郸城一高\t291414\n香港警队\t291415\n万用膏\t291416\n水刷\t291417\n新安街道\t291418\n66度\t291419\n张芝山\t291420\nm700\t291421\nTransport\t291422\n杨洪波\t291423\nsutdio\t291424\nSD-WAN\t291425\nb250m-plus\t291426\n抽样检验\t291427\n满天下\t291428\n下册语\t291429\n三方库\t291430\n交费\t291431\nNmap\t291432\nohmyzsh\t291433\n白钻\t291434\n凶狠\t291435\n答卷\t291436\n维密天使\t291437\nanita\t291438\natp\t291439\n光标\t291440\n丰缘地区\t291441\ncad工具栏\t291442\n太原市迎泽区人民政府\t291443\nXII\t291444\n中创\t291445\n职资\t291446\n之里\t2914
47\n劣币\t291448\n济南省立医院\t291449\noreo\t291450\n失孤\t291451\n缺爱\t291452\n制动盘\t291453\n低频\t291454\n苹果MA\t291455\nIMX6Q\t291456\nAutoHotkey\t291457\n经济学术语\t291458\n杜娟花\t291459\nxvf\t291460\n玩尽杀绝\t291461\n积石\t291462\n韩女星\t291463\n黄靖翔\t291464\n药厂\t291465\nsniffer\t291466\n荆溪\t291467\npmw\t291468\n计算机仿真\t291469\n精灵\t291470\n杨冰\t291471\n男同\t291472\n爪牙\t291473\n面首\t291474\nYeung\t291475\n张元\t291476\nStatalist\t291477\n甲型H1N1流感\t291478\nbmpc\t291479\n段长\t291480\n更实\t291481\n杨清\t291482\n秒升\t291483\n门吸\t291484\n重阳节\t291485\n半导体厂\t291486\n荷乙\t291487\n夺命金\t291488\nR2R\t291489\n猫宫\t291490\naspects\t291491\n导热垫\t291492\n弯管器\t291493\n返本\t291494\n大咖\t291495\n五常法\t291496\nICM\t291497\n有限合伙协议\t291498\n10.4亿\t291499\n梅子酒\t291500\n可意\t291501\n蒙古\t291502\n那个女人\t291503\n科达利\t291504\n索书号\t291505\n双色球头奖\t291506\n多线程锁\t291507\n报导\t291508\n合成材料\t291509\nぴ\t291510\n第一艘\t291511\n暑期\t291512\n寸止\t291513\nMVC5+EF6+EasyUI\t291514\n完全体\t291515\n分步\t291516\n桑保利\t291517\n65式\t291518\n扁食\t291519\n北大物理系\t291520\nappache\t291521\n开篇词\t291522\n九强\t291523\n玛特纳\t291524\nU盘启动盘制作工具\t291525\n井号\t291526\n时雨\t291527\n风力\t291528\n刷卡\t291529\n紫云阁\t291530\n爱思唯尔\t291531\nlzu\t291532\n男子组\t291533\n超声波治疗仪\t291534\n急性缺血性卒中\t291535\n0xc0000142\t291536\n绝地求生\t291537\n学军小学\t291538\n潺潺\t291539\n分代\t291540\nbooklet\t291541\n化学方程式_\t291542\n基础教育机构\t291543\n赤裸\t291544\numf\t291545\n五石散\t291546\n武汉国际广场\t291547\n对你的爱\t291548\n从前有座灵剑山\t291549\n岩棉管\t291550\n瓶盖\t291551\n15.9\t291552\n撂下\t291553\n赣\t291554\n吴东\t291555\n世豪\t291556\n重音\t291557\n水雷\t291558\n扫车\t291559\n拉一拉\t291560\n中国安全生产网\t291561\nGorgeous\t291562\n1.1\t291563\nSniffer\t291564\n看污\t291565\n伤手\t291566\n1618W\t291567\n翠云\t291568\n怀旧\t291569\n第831集\t291570\nVS2010\t291571\ncc1\t291572\n为爱痴狂\t291573\n升维\t291574\nx86_64\t291575\nseparation\t291576\n李天金\t291577\n清阳\t291578\nPowerQuery\t291579\n嘉兴机场\t291580\n崔碧珈\t291581\n娇韵诗\t291582\n宝鸡日报\t291583\n堆法\t291584\n天枢\t291585\nrey\t291586\nJMC\t291587\n意象\t291588\nxhp\t291589\n狸猫换太子\t291590\n华为nova\t291591\n罗欣\t291592\n网红店\t291593\
n10大类\t291594\nShow\t291595\n11_2\t291596\n博士园\t291597\n全裸照\t291598\n精味\t291599\n机关事业单位工作人员带薪年休假实施办法\t291600\n当先\t291601\n别克新凯越\t291602\n越野赛\t291603\n上海昌硕\t291604\n合金\t291605\n转基因作物\t291606\n2七\t291607\n中国节能产业网\t291608\n新故事\t291609\nmetinfo\t291610\n悬液\t291611\nrsj\t291612\n第155集\t291613\n荣耀庄周\t291614\n助动词\t291615\n蚂蚁宝卡\t291616\n工作线程\t291617\n加样\t291618\nYuma\t291619\n新季\t291620\n89244108\t291621\n兰剑\t291622\n阳光卫视\t291623\ndsx\t291624\ntine\t291625\n巩华\t291626\n讲求\t291627\nnipic\t291628\n万向节\t291629\n昌平区\t291630\n荚蒾\t291631\n小宝卡\t291632\n2540p\t291633\n刘亚\t291634\n魅蓝3S\t291635\n管理室\t291636\n仆役\t291637\n杨浦区中心医院\t291638\nRC2\t291639\n塑壳式断路器\t291640\nhymn\t291641\n卡布里岛\t291642\n鉴定\t291643\n国网湖北省电力公司\t291644\nzaina\t291645\n12马力\t291646\n模糊控制系统\t291647\n浙江大学生命科学研究院\t291648\n私产房\t291649\n绝代魔法王座\t291650\n家园守卫战\t291651\n嗨爆\t291652\n案列\t291653\n209个\t291654\n义诗\t291655\n首虎\t291656\nvl版\t291657\n3d画\t291658\n曼尼\t291659\nranking\t291660\nSunor\t291661\n遭遇战\t291662\n惠民\t291663\nDynamics\t291664\n全异\t291665\n眼魔\t291666\n塞焊\t291667\n奋不顾身\t291668\n亮相\t291669\n部下\t291670\nHD1280超清\t291671\n疏远\t291672\n安分守己\t291673\n敖犬\t291674\nnssm\t291675\n巴新\t291676\n湿法\t291677\n番石榴叶\t291678\n辽宁财贸学院\t291679\n银亿\t291680\n李文娟\t291681\n山东农业大学\t291682\n迷住\t291683\n独孤信\t291684\nSOLIDWORKS\t291685\nqrs\t291686\n艺尚小镇\t291687\nphpweb\t291688\nx展架\t291689\n大盖\t291690\n灵越15\t291691\n表面波\t291692\n血缘关系\t291693\n吸入性\t291694\n项羽本纪\t291695\n人造石材\t291696\nTC\t291697\n临渭\t291698\n有源电力滤波器\t291699\n超达\t291700\n德胜洋楼\t291701\n通称\t291702\nhaya\t291703\n亚信科技\t291704\n夏天\t291705\n无影山\t291706\n婧麒\t291707\n记牢\t291708\n两日期\t291709\n天理不容\t291710\n自噬\t291711\n犬猫\t291712\n2017年5月1日\t291713\n相闻\t291714\n61集\t291715\n结集\t291716\n异能者\t291717\n戴沐白\t291718\nAs109\t291719\ncities\t291720\n花椰菜\t291721\n10几岁\t291722\n江陵\t291723\n嘉世\t291724\n左撇子\t291725\n毛泽东传\t291726\nphilosophy\t291727\n13902415780\t291728\nresolve\t291729\n延安颂\t291730\n旗人\t291731\n三毛流浪记\t291732\n晃荡\t291733\n长势\t291734\n力魔\t291735\n蹇\t291736\n万达茂\t291737\n早点\t291738
\n中外合作经营企业\t291739\n东华科技\t291740\n离下\t291741\n北京环保局\t291742\n天际省\t291743\n100発\t291744\nMSL\t291745\n法外风云\t291746\n眼部\t291747\n伙伴们\t291748\n猪病\t291749\n中间级\t291750\nmailing\t291751\n志坚\t291752\n固结\t291753\n30路\t291754\n透光度\t291755\n2.16\t291756\n赤兔马\t291757\n加气砖\t291758\n第六辑\t291759\n像机\t291760\nUPS蓄电池\t291761\n新华里\t291762\n醉花\t291763\n600606\t291764\njingziwo\t291765\n彩虹5\t291766\nlr\t291767\n达摩祖师\t291768\n27个月\t291769\nelaborate\t291770\n第45条\t291771\n刘国祥\t291772\n友邻\t291773\n有钱了不起\t291774\n炼字\t291775\n和平花园\t291776\nasio\t291777\n甘阳\t291778\n腾讯企业邮箱\t291779\nBeck\t291780\n遗传\t291781\n简帛\t291782\n讲坛\t291783\n失信人\t291784\n移动路由器\t291785\n海滨城市\t291786\n普瑞斯\t291787\n圣火\t291788\n新益为企业管理顾问有限公司\t291789\n入坑\t291790\nSOC\t291791\n5150\t291792\n英国伯明翰大学\t291793\n肉碱\t291794\n单脚\t291795\n底柜\t291796\nsubmodule\t291797\n1550\t291798\n不合格率\t291799\n保温饭盒\t291800\n原判\t291801\n左中\t291802\n豆芽菜\t291803\n心城\t291804\ndazi\t291805\njeep\t291806\n道贺\t291807\na4u\t291808\n艾森\t291809\n跳山\t291810\n31会议网\t291811\n明斯特\t291812\n抚州一中\t291813\nSteven\t291814\n彩虹六号:围攻\t291815\n岩溶\t291816\n背锅侠\t291817\n自然科学基金委\t291818\n二十二号\t291819\nWinner\t291820\n落井下石\t291821\n追远\t291822\nuu898.com\t291823\n129平米\t291824\n1.14.0\t291825\n[丸子\t291826\n口袋巾\t291827\n黑质\t291828\n社会保险服务网\t291829\n演出票团购\t291830\nDrainage\t291831\n9188\t291832\n周玉蔻\t291833\n戏子\t291834\n保健\t291835\n10.0.5\t291836\n快带\t291837\n吉安网\t291838\n科洛斯\t291839\n肾气\t291840\n黄宗智\t291841\n欧诺\t291842\n老郭\t291843\n我和她的传奇情仇\t291844\n脉冲信号\t291845\n86级\t291846\n修机\t291847\n育儿网\t291848\n民生人寿保险股份有限公司\t291849\n东京机场\t291850\n第六届\t291851\n白澍\t291852\n一到十二月\t291853\n雷欧奥特曼\t291854\n新葫芦兄弟\t291855\n中国医科大学附属口腔医院\t291856\nxda\t291857\n中机院\t291858\n李四德\t291859\nDirective\t291860\nSTATISTICS\t291861\n德州市教育局\t291862\n第二医院\t291863\n客户源\t291864\n药类\t291865\n钢绞线\t291866\n3370\t291867\n调式\t291868\n采血管\t291869\n苏眉鱼\t291870\n福瑞股份\t291871\n均布荷载\t291872\naccording\t291873\n产卵\t291874\nHiMall\t291875\nX-Plane\t291876\nMT管理器\t291877\n调理机\t291878\n立嘉\t291879\n史克威尔\t291880\n一个心\t291
881\n灵寿\t291882\nmall\t291883\n济公游记\t291884\nclementine\t291885\n几派\t291886\n风味\t291887\nnote5\t291888\n2017年12月9日\t291889\n7月19日\t291890\n不绝于耳\t291891\n洪塘镇\t291892\n漏罪\t291893\n_华商报\t291894\nThoughtWorks\t291895\n森林防火网\t291896\n高空中\t291897\ntimeval\t291898\n成贤娥\t291899\n小米音乐\t291900\ncancel\t291901\nEXPLORER\t291902\n六十四进制\t291903\n程一\t291904\nEncoder\t291905\n齐达内\t291906\n氨法脱硫\t291907\n乡村\t291908\n3d游戏模型\t291909\n第一百二十二章\t291910\n金秀贤\t291911\n6年以内\t291912\n鸡友\t291913\n学记\t291914\nChinaZ\t291915\n荣威360\t291916\n管制\t291917\n镁业\t291918\n罗技g903\t291919\n国产轴\t291920\n光伏贷款\t291921\n黑胡桃\t291922\n操作篇\t291923\n超级课程表\t291924\n查证\t291925\nlanguage\t291926\n冷敷贴\t291927\n悬针纹\t291928\nUnicorn\t291929\n港警\t291930\n了望\t291931\n帮主\t291932\n运镜\t291933\ni7-4790K\t291934\n赤玉\t291935\n7第七章\t291936\n第五本\t291937\ninti\t291938\n二手客车网\t291939\n2017年4月16日\t291940\n奥盟\t291941\n迪柯尼\t291942\n2018-02\t291943\nNatalia\t291944\nwin732\t291945\n嚎啕大哭\t291946\n撤案\t291947\n顾戴路\t291948\n六方\t291949\nELT\t291950\n熊岳\t291951\n育明\t291952\n李雅轩\t291953\n掉马\t291954\n鸭油\t291955\n华邦健康\t291956\nprobabilistic\t291957\nInMobi\t291958\n规划师\t291959\nitool\t291960\n拉拉拉\t291961\nmedici\t291962\nB250M\t291963\nbushi\t291964\nZ250\t291965\n导视牌\t291966\n对骂\t291967\n张垣\t291968\n雪绒\t291969\n现象\t291970\nbroadlink\t291971\n1366X768\t291972\n唐尼\t291973\n粼粼\t291974\ngdbserver\t291975\n周偶\t291976\n花法\t291977\n日不落帝国\t291978\nx6\t291979\n盘龙城\t291980\ncls\t291981\ncompassion\t291982\nexecutors\t291983\n勘察设计信息网\t291984\n皮标\t291985\n放线\t291986\nECSHOP开发中心\t291987\n乳木\t291988\n第八届\t291989\n阴差\t291990\n思考题\t291991\nAirServer\t291992\n重载函数\t291993\n10尺\t291994\n0603\t291995\n奔驰G65\t291996\n5666\t291997\n太辰光\t291998\n药娘\t291999\nF1.4\t292000\n尼斯湖水怪\t292001\n日志式\t292002\n骏网\t292003\n情态\t292004\n紫微斗数在线排盘\t292005\n徽标\t292006\n格里格\t292007\n摊\t292008\n华清嘉园\t292009\n两月内\t292010\n可笑\t292011\n商贩\t292012\n211重点大学\t292013\n威希科美\t292014\n丁一酱\t292015\n中牟\t292016\n甘肃卫视\t292017\nimx6ul\t292018\n六等\t292019\n非法集资罪\t292020\n42集\t292021\n金牛管\t292022\ng
18\t292023\n关谷\t292024\nmagnet,bt\t292025\n文昌湖\t292026\n简版\t292027\n李天柱\t292028\n黄海冰\t292029\n尼龙袋\t292030\n小黑侠\t292031\n林豆豆\t292032\nelectronic\t292033\n0.18.1\t292034\n壹方水榭\t292035\n52章\t292036\n四川外语学院\t292037\nzhengdl126\t292038\n龙船花\t292039\n通窍鼻炎片\t292040\n长沙市第三医院\t292041\n动车票\t292042\n天津梅江会展中心\t292043\n空中客车A330\t292044\n蔷薇科樱属\t292045\n上海中通快递\t292046\n车友会\t292047\nTIGER\t292048\n咬痕\t292049\n家史\t292050\nUI篇\t292051\n府前街\t292052\n女戒\t292053\n不依\t292054\n630m\t292055\n颅磁\t292056\n非固定\t292057\n质押率\t292058\n大广高速公路\t292059\n破除\t292060\n5.5.27\t292061\npete\t292062\n六方氮化硼\t292063\n新视野大学英语\t292064\nFED\t292065\n监利县人民政府\t292066\n台州站\t292067\n富阳中学\t292068\nC++语言程序设计\t292069\n入春\t292070\n可选\t292071\n南京军区总医院\t292072\n瓯海\t292073\narttemplate\t292074\n流派\t292075\n钟村街\t292076\n合丰\t292077\n理想\t292078\nrespiration\t292079\n门市房\t292080\n39章\t292081\n野马T70\t292082\n江苏省中医院\t292083\n北京航空航天大学自动化科学与电气工程学院\t292084\n口管\t292085\n利息率\t292086\n皮囊炎\t292087\n逻辑分区\t292088\n依据\t292089\nCTM\t292090\n运动文化教育网\t292091\n胶片\t292092\n程欢\t292093\n南县\t292094\n海蜇皮\t292095\n广告扇\t292096\nnsa\t292097\n忍者之国\t292098\n影音先锋xfplay电影_影音先锋电影网\t292099\n新标准大学英语综合教程2\t292100\neclisp\t292101\nalias\t292102\n毛蚶\t292103\n彩音\t292104\n河南省省\t292105\n定制师\t292106\n市人\t292107\n吉他音箱\t292108\n启迪控股股份有限公司\t292109\n问一哈\t292110\niPadmini2\t292111\n青宣\t292112\n沟谷\t292113\n牛巴\t292114\nRichter\t292115\n胡刀\t292116\nsynaptics\t292117\n心亭\t292118\n道武者\t292119\n脱壳机\t292120\n桂林两江国际机场\t292121\n农行银行\t292122\n20130801\t292123\n神武天魔\t292124\n宁诺\t292125\n进率\t292126\n英方\t292127\n悠哈\t292128\n全知\t292129\n香山湖\t292130\nT2\t292131\n王轩\t292132\n一湾\t292133\nRISE\t292134\n薛兆丰\t292135\n剑桥科技\t292136\nPHPer\t292137\n学海网\t292138\n网页跳转\t292139\n传众\t292140\n福田奥铃\t292141\n3宗\t292142\nChevrolet\t292143\n休子\t292144\n古拉斯凯奇\t292145\n10c\t292146\n愤怒的小鸟2\t292147\n508\t292148\n91v\t292149\n浙江省科技馆\t292150\n2016年12月27日\t292151\n摩卡金\t292152\nVolk\t292153\n曲面显示器\t292154\n菱塘\t292155\n哨\t292156\n013\t292157\n海姹网\t292158\nwin7笔记本电脑触控板\t292159\n美图M8\t292160\n霸爱\t292161\n含蓄\t
292162\n北臧村镇\t292163\n口袋妖怪XY\t292164\n长庆\t292165\n本报讯\t292166\n浴疗\t292167\n蒲瓜\t292168\nire\t292169\n刘子豪\t292170\n稀有\t292171\n手上\t292172\n隔音条\t292173\n香邂格蕾\t292174\n三者险\t292175\n眼宝\t292176\nMeiji\t292177\n二号机\t292178\n边长\t292179\n别客气\t292180\n艳日辉\t292181\ninserting\t292182\n星脉\t292183\nDEAP\t292184\n僚\t292185\n茁壮\t292186\n北京市人民检察院\t292187\n健康教育学\t292188\n膳\t292189\n疲劳值\t292190\n1168\t292191\ncannon\t292192\n陈玉\t292193\n三次函数\t292194\n埋管\t292195\n牛津布\t292196\n蒙古舞\t292197\n散会\t292198\n1.2.0.1073\t292199\n相思草\t292200\n厦门市民政局\t292201\n鸠摩\t292202\n期权\t292203\n苛政\t292204\n国烟\t292205\n前5月\t292206\n089\t292207\n蓝湖\t292208\n过滤器\t292209\n倒序\t292210\n地鼠\t292211\n道祖\t292212\n习近平总书记系列讲话\t292213\nDulux\t292214\n鼎晟\t292215\nopgw\t292216\n西马\t292217\nBioPh\t292218\ncmnet\t292219\nShuai\t292220\n航空港区\t292221\n347号\t292222\n白王\t292223\n东证期货\t292224\n程小小\t292225\n意想不到\t292226\n简单粗暴\t292227\nMATLAB2017a\t292228\n新加\t292229\nacademic\t292230\n巴尔坦\t292231\n必先利其器\t292232\n胡尘\t292233\n奸雄\t292234\nmiscellaneous\t292235\n日照论坛\t292236\n0352房网\t292237\n主公\t292238\n控制台程序\t292239\nwebgis\t292240\n排烟罩\t292241\n柱\t292242\n传值\t292243\n偏门\t292244\n为什么时候\t292245\n256种\t292246\nCryptoKitties\t292247\n老葛\t292248\n人机对话考试\t292249\n06ms201\t292250\n回档\t292251\n外管证\t292252\n大堆\t292253\n梭伦\t292254\naffecting\t292255\n110篇\t292256\n超级机器人大战V\t292257\n系念\t292258\n分不清楚\t292259\nGhibli\t292260\n灰片\t292261\nPluto\t292262\n八项\t292263\n卡耐基梅隆\t292264\n铺场\t292265\n非天\t292266\nST段\t292267\n奥凯\t292268\n寻道\t292269\n溴化物\t292270\n光大信托\t292271\n5_\t292272\n格式符\t292273\nk68\t292274\n移动盘\t292275\nifndef\t292276\n均方\t292277\n天净沙秋思\t292278\n2005-2015年\t292279\n龙族\t292280\n中锐地产\t292281\n忍者神龟2\t292282\n欧米伽3\t292283\njiaoshou\t292284\n相忆\t292285\n防虫网\t292286\n全速\t292287\n乔安云\t292288\n吊钩式\t292289\n几个月\t292290\n6028\t292291\n爽肤\t292292\nOffice2016\t292293\n柳枝\t292294\n审案\t292295\n太阳队\t292296\nshipped\t292297\nSneak\t292298\ncollective\t292299\n瀍河\t292300\n谢少\t292301\n最近一天\t292302\n瑶芳\t292303\n龙圩\t292304\n缝合器\t292305\n甲状腺功能亢进\t292306\
n鼎立\t292307\nOpenPose\t292308\n氧乙烯醚\t292309\n12000吨\t292310\nKX\t292311\n大华锦绣华城\t292312\n护理管理学\t292313\nenjoy\t292314\n密探\t292315\n周濂\t292316\n瓦房店\t292317\n凸面\t292318\n启停\t292319\n庸医\t292320\n简况表\t292321\n信州\t292322\naoto\t292323\n黑穿\t292324\nSimulate\t292325\n跨国银行\t292326\n1730\t292327\n喝铁\t292328\n视界\t292329\n护体\t292330\nmutton\t292331\nginkgo\t292332\n美窝\t292333\n灰度测试\t292334\n丽泽商务区\t292335\n露筋\t292336\n吉首市\t292337\n底价\t292338\n姐妹会\t292339\nplume\t292340\n罗技G302\t292341\n独基\t292342\n汇报稿\t292343\n黑亮\t292344\n电磁线\t292345\n广州光大银行\t292346\n岱岳区\t292347\n吴玉霞\t292348\n魔具\t292349\n小新锐7000\t292350\nAldo\t292351\n张剑平\t292352\n袋狮\t292353\n南光路\t292354\n斯祥\t292355\n睾丸素\t292356\n胶袋\t292357\n禅修\t292358\n北通w1\t292359\n先临\t292360\n鲍伊\t292361\nCADWorx\t292362\nlying\t292363\n女虎\t292364\n中南路\t292365\n观音菩萨\t292366\nlqc\t292367\nOSP\t292368\n恭祝\t292369\n稳定土\t292370\n用益信托网\t292371\n践\t292372\n问一答\t292373\n星美国际影商城\t292374\n杨婷\t292375\n两高一剩\t292376\n经济合作\t292377\n纯电动汽车网\t292378\n11-1\t292379\n彩平\t292380\n运营\t292381\n搜神记\t292382\n铜合金\t292383\n水环式\t292384\n白鹿洞\t292385\nINSIGHT\t292386\n超区\t292387\n泣血\t292388\n桃缘\t292389\n大爆奖\t292390\n成人教育处\t292391\n1分钟内\t292392\n农经网\t292393\n宁波市图书馆\t292394\n天堂岛疑云\t292395\n中国快递\t292396\n人民北路\t292397\n电玩巴士ps4\t292398\n干劲\t292399\n安德门\t292400\n翻云覆\t292401\n5115\t292402\n大会战\t292403\n黄震\t292404\n聚醋酸乙烯酯\t292405\n网贷之星\t292406\n欧罗巴\t292407\n67项\t292408\n上跑男\t292409\nAdversarial\t292410\n咒印\t292411\n阳光路\t292412\nalta\t292413\n钱版\t292414\nNginx服务器\t292415\n吾爱诗经网\t292416\n张云飞\t292417\n6.8万\t292418\nnations\t292419\n吴萍\t292420\nheji\t292421\n杆塔\t292422\nPChome\t292423\n北大国际\t292424\n4月30\t292425\n达实智能\t292426\n广州现代信息工程职业技术学院\t292427\n李若彤\t292428\n铝皮\t292429\nekp\t292430\n300ml\t292431\nWeTransfer\t292432\n方直\t292433\n康桥地产\t292434\n挤挤\t292435\n博锐\t292436\n违宪\t292437\nharting\t292438\nipqc\t292439\n接触器\t292440\n东莞东火车站\t292441\n杜十娘怒沉百宝箱\t292442\n浙江省卫生和计划生育委员会\t292443\n新能源汽车产业园\t292444\n谈单\t292445\n自弹自唱\t292446\n太嗨了\t292447\n山东法制报数字报\t292448\n湿重\t292449\n买入\t292450\n枚举器
\t292451\n十月初一\t292452\n极少\t292453\n高峰镇\t292454\n转学\t292455\nicbc\t292456\n微淘客\t292457\n侯\t292458\nethnic\t292459\nUv\t292460\n花卉世界网\t292461\n乐意\t292462\n旱冰鞋\t292463\n都西比尔\t292464\nmichaelkors\t292465\n文芳\t292466\n深圳市场监督管理局\t292467\n暖风\t292468\n阳极\t292469\n柴智屏\t292470\n复方对乙酰氨基酚片\t292471\nimatest\t292472\n延展性\t292473\n慧根\t292474\n云南省食品药品监管\t292475\n黄栌\t292476\n沪江英语\t292477\n湖光山色\t292478\n小河区\t292479\n信息技术有限公司\t292480\n剧照\t292481\n美格智能\t292482\n宏语言\t292483\n支脉\t292484\nlightgbm\t292485\n宝洁\t292486\n小屏\t292487\n氯化锂\t292488\n沧浪新城\t292489\n王行\t292490\n清华山\t292491\n页号\t292492\n2-3月\t292493\nv919\t292494\n翰林苑\t292495\nDeparture\t292496\n纸艺\t292497\n英语四级阅读\t292498\n法地\t292499\nelastix\t292500\n和县人民政府\t292501\n开元名都\t292502\n小发明\t292503\n脱恐\t292504\n记录帖\t292505\n新装\t292506\n浪迪\t292507\n智龙迷城吧\t292508\n省环保厅\t292509\njie\t292510\n南康区\t292511\n初音岛\t292512\n牯岭\t292513\nleetcode\t292514\n镀铬\t292515\n博海拾贝\t292516\nVERO\t292517\n列下\t292518\n六升\t292519\n波兰语\t292520\nTwoo\t292521\n新讯网\t292522\n西江\t292523\nJJ斗地主\t292524\n浮桥\t292525\n英国航空公司\t292526\n展昭\t292527\n邓川\t292528\n拟增\t292529\n高级经济师考试\t292530\n老虎鱼\t292531\n浅笑\t292532\n坚石\t292533\n卫玠\t292534\n0.08\t292535\n机电工程专业\t292536\n布洛克莱斯纳\t292537\n京西院区\t292538\nterracotta\t292539\n100级\t292540\n飞致150\t292541\n钟欣潼\t292542\n触控板\t292543\n五席\t292544\n避暑胜地\t292545\n南京西路站\t292546\n汉语桥\t292547\n控制不了\t292548\n小熙\t292549\n高云峰\t292550\n瓦尔基\t292551\n假面骑士decade\t292552\n选词\t292553\n智操盘股票配资\t292554\n对症下药\t292555\nmidone\t292556\n固氮\t292557\n降一物\t292558\n分数的初步认识\t292559\n亮红\t292560\n打扣\t292561\n热价\t292562\nspotify\t292563\n瑞风S3\t292564\n好再来\t292565\n人民法院诉讼资产网\t292566\n各类\t292567\n残障\t292568\n00486\t292569\n揭发\t292570\n集中国人\t292571\n0.5倍\t292572\n厉害\t292573\n普菲特\t292574\n曼多拉\t292575\n油轮\t292576\n王老吉凉茶\t292577\n工程师\t292578\nbeta1\t292579\n筹款\t292580\n真人快打9\t292581\nmybaties\t292582\n寒性\t292583\n雁栖湖景区\t292584\n第59章\t292585\n杜审言\t292586\n200kg\t292587\n阴魔\t292588\n岳池县\t292589\n咯吱\t292590\ndissolved\t292591\n马可齐达内\t292592\n上海微系统所\t292593\n火\t292594\ncom1\t292
595\n家男\t292596\n福银高速\t292597\n张劲松\t292598\n江苏大学\t292599\n推荐位\t292600\n赵体\t292601\n双独\t292602\nMONO\t292603\n【曼\t292604\n重庆火锅\t292605\nrs7\t292606\n垮塌\t292607\n戈蓝\t292608\n居配\t292609\nアキバ\t292610\nunderwear\t292611\nxinput\t292612\n保安室\t292613\nwebService\t292614\n0205\t292615\n几百次\t292616\n第23集\t292617\npha3\t292618\nCAAC\t292619\n金利来\t292620\n氧氟沙星凝胶\t292621\n明目\t292622\n兰蔻粉\t292623\nccs6\t292624\n闯天下\t292625\n中国纺织网\t292626\n3028\t292627\n中国香港大学\t292628\n摇臂钻床\t292629\n通讯员\t292630\n91本\t292631\nMummy\t292632\nsynopsys\t292633\n唤起\t292634\n镇守府\t292635\n铃木维特拉\t292636\n扶助\t292637\n花斑癣\t292638\n单场\t292639\ndaydream\t292640\n何红霞\t292641\nsyncing\t292642\n1274\t292643\nTUXEDO\t292644\n脱式计算\t292645\n浮游生物\t292646\n积水潭医院\t292647\nimc\t292648\n十几斤\t292649\nredox\t292650\nLeviathan\t292651\n走纸\t292652\n老弱\t292653\nemui8.0\t292654\n私属\t292655\n亚硝酸盐氮\t292656\n倪岳峰\t292657\n辽阳市人民政府\t292658\n语音提示器\t292659\n孔子彦希\t292660\n九景衢铁路\t292661\n7.1.5搏击\t292662\n上海普陀区政府网\t292663\nh级\t292664\npowerdesigner\t292665\nvba\t292666\n酌酒\t292667\n兴川\t292668\n相对论\t292669\nintj\t292670\n220w\t292671\n天房集团\t292672\n制造厂\t292673\n医药业\t292674\n怵目惊心\t292675\n罗腾堡\t292676\n抹不掉\t292677\n欢乐斗棋牌\t292678\n血精\t292679\n清净\t292680\n横岗\t292681\n坂崎良\t292682\n徐卫\t292683\n大合唱\t292684\n八宝街\t292685\n高唐县\t292686\n2114\t292687\n60日内\t292688\ngtest\t292689\n从此以后\t292690\n一闪\t292691\n几十套\t292692\nEXTJS\t292693\nMEMS\t292694\nEverlane\t292695\n油纸伞\t292696\n南部县教育局\t292697\n紫水晶\t292698\nUNIS\t292699\n恐龙特急克塞号\t292700\n相位关系\t292701\n袁行霈\t292702\n对管\t292703\n繁冗\t292704\n沈炜\t292705\nBoyz\t292706\n盛放\t292707\n广东农村信用社\t292708\n紫金港校区\t292709\n锐龙R3\t292710\n周崇光\t292711\n玻\t292712\n五阶\t292713\n莘县人民政府\t292714\n火刑\t292715\n袁隆平\t292716\n视源\t292717\npresent\t292718\n贾森\t292719\n水性聚氨酯树脂\t292720\n48g\t292721\n食道\t292722\n辛丑条约\t292723\n章\t292724\n莲香\t292725\nSampling\t292726\n东北农大\t292727\n十五日游\t292728\n123标志网\t292729\nbeads\t292730\n桥洞\t292731\n东涌镇\t292732\n加勒比海盗5\t292733\nmsh\t292734\n罢\t292735\n讲究\t292736\nnach\t292737\nincrement\t292738\n毛氏\
t292739\n填鸭式\t292740\n威图机柜\t292741\n陕西省委组织部\t292742\ndebian9\t292743\n1714\t292744\n嘉华鲜花饼\t292745\n新华社区\t292746\n那儿\t292747\nECSHOP\t292748\ndoubleh3lix\t292749\n002175\t292750\n三皇\t292751\n打扫\t292752\n若愚\t292753\n黄维德\t292754\n自画\t292755\n中国国际科技会展中心\t292756\n丁香鱼\t292757\n驴子\t292758\n刘春山\t292759\n争妍\t292760\npolynomial\t292761\n唐宪宗\t292762\n十平米\t292763\n哈狗帮\t292764\n20马力\t292765\n玉楼\t292766\n鬼泣3\t292767\nListings\t292768\n日产颐达\t292769\n超净\t292770\n至今\t292771\n埋埋\t292772\n吴医生\t292773\ncrucial\t292774\n_老电影院\t292775\n探索\t292776\n珠峰\t292777\n陕西省新闻出版广电局\t292778\nWEBRip\t292779\n官博\t292780\n董姐\t292781\n自助游记\t292782\n农业机械化\t292783\n玄关柜\t292784\n2018年4月7日\t292785\nsebastian\t292786\n驾临\t292787\n尖底\t292788\n琼台\t292789\n约书亚乐团\t292790\nyiban\t292791\n封釉\t292792\nEi\t292793\n铜仁楼盘网\t292794\n泰坦x1\t292795\n每个礼拜\t292796\n43篇\t292797\n固安\t292798\njianshen\t292799\n彩虹石\t292800\n雪河\t292801\n肱动脉\t292802\n薰衣草庄园\t292803\n3386\t292804\n天府路\t292805\n端木\t292806\n米线机\t292807\n谢伟\t292808\nfilm\t292809\n纽\t292810\nmqtt服务器\t292811\n玉园\t292812\n3203\t292813\n392\t292814\n1光\t292815\n肝素钠\t292816\n国投安信\t292817\n罗马全面战争\t292818\n浙江同济科技职业学院\t292819\n澳新银行\t292820\nBootleg\t292821\n冷烫\t292822\n测测\t292823\n红薯淀粉\t292824\n虾米音乐吧\t292825\nPhpvar\t292826\n98条\t292827\n罗特斯\t292828\n锦秋花园\t292829\n3304\t292830\n红砖\t292831\n祺祥\t292832\n东宇\t292833\n花恋\t292834\n预浸料\t292835\n23层\t292836\n914\t292837\n黄金季\t292838\nGTX1080Ti\t292839\nDiagram\t292840\n明基投影仪\t292841\n杂志_编辑部\t292842\n2014-01-05\t292843\nremaster\t292844\n武井咲\t292845\n新凯越\t292846\n化学校\t292847\n剪切强度\t292848\n非单\t292849\n丫环\t292850\n增健\t292851\n岳\t292852\n煅\t292853\n微型投影仪\t292854\nTeX\t292855\n鲜亮\t292856\n调差\t292857\n西北工业大学出版社\t292858\n中房网\t292859\n腾讯手机管家\t292860\ndopamine\t292861\nSSAT\t292862\n大岛由加利\t292863\ncarplay\t292864\nngacn\t292865\nBeautiful\t292866\n22千瓦\t292867\n小雷音寺\t292868\n冲牙器\t292869\nDeng\t292870\n漓江郡府\t292871\nFortify\t292872\n注册机序列号\t292873\n矫揉造作\t292874\n妖化\t292875\n评测\t292876\nmeetrice\t292877\n试工\t292878\nver\t292879\n没感觉到\t292880\n16
0626\t292881\n刘小枫\t292882\n黄岗山\t292883\n420元\t292884\n赤裸特洛伊\t292885\nstreamline\t292886\n红布\t292887\n旋风分离器\t292888\n丰顺吧\t292889\n缠绕机\t292890\n伍豪\t292891\n僵尸电影\t292892\n叶紫\t292893\n历表\t292894\n冯超\t292895\n铜缆\t292896\nFuck\t292897\n砂\t292898\n新竹园\t292899\n重任\t292900\n同桌\t292901\n赵普\t292902\n战罢\t292903\n罗江\t292904\n清华大学建筑设计研究院\t292905\nWHITE\t292906\n嘉米\t292907\n奥迪TT\t292908\n1十1\t292909\n陶瓷基复合材料\t292910\n1080P在线\t292911\nCATWALK\t292912\n空空如\t292913\n名小吃\t292914\n京张\t292915\n証\t292916\n高枕\t292917\n非饱和土\t292918\n刀客\t292919\n慕课\t292920\n恩阳\t292921\n1.05G\t292922\n大用\t292923\n径向\t292924\n横版\t292925\n英国威廉希尔公司\t292926\n大帽山\t292927\n灌云县\t292928\npratt\t292929\n缩印\t292930\n圣三一\t292931\n布尔津县\t292932\n木粒\t292933\n三必\t292934\nLeeHonGee\t292935\nSUVs\t292936\n修拉\t292937\n_九千网\t292938\n有麻烦\t292939\n苏记\t292940\n潜质\t292941\n小米公寓\t292942\n软题库\t292943\nExhaust\t292944\n喷头\t292945\n认真\t292946\n银城\t292947\n道奇公羊\t292948\n冯太后\t292949\n功业\t292950\n2个月\t292951\nHEU\t292952\n残魂\t292953\nConfession\t292954\n偷走\t292955\n网心\t292956\n宝狮\t292957\nRotterdam\t292958\n3笔\t292959\n茶歌\t292960\n音维\t292961\n生态农庄\t292962\n采荷中学\t292963\n深户\t292964\n菜户\t292965\n赵飞燕\t292966\n子产\t292967\nAECS6\t292968\n贷记\t292969\ndidnt\t292970\n构形\t292971\n价税合计\t292972\nJIAN\t292973\nbhd\t292974\nLuther\t292975\n正态性\t292976\n福井\t292977\nRef\t292978\n洛阳日报\t292979\n蛋白质\t292980\n王欣欣\t292981\nQuicktime\t292982\n21名\t292983\n英雄联盟之决胜巅峰\t292984\n残忍\t292985\n李玉芳\t292986\n1p\t292987\n致志\t292988\n托利\t292989\n威尔金斯\t292990\n填表\t292991\n心脏移植\t292992\n全资\t292993\nstr函数\t292994\nSQLSTATE\t292995\n乐高乐园\t292996\n80后励志网\t292997\n欧皇\t292998\n预制棒\t292999\n季白\t293000\n加解\t293001\nDalston\t293002\ntable2\t293003\n乐神\t293004\n地肤\t293005\nKAILAS\t293006\nbunny\t293007\n总弹\t293008\n宋一夫\t293009\n尊宝比萨\t293010\n佛冈县政府\t293011\n恋听网\t293012\n电子杂志\t293013\n前2天\t293014\n东宝兴路\t293015\n利勃海尔\t293016\n2月28号\t293017\n关联方关系\t293018\n开板价\t293019\nshadowdsocks\t293020\n60所\t293021\n派兵\t293022\npaper\t293023\njude\t293024\n泾阳县\t293025\n操作规\t293026\n萨蒂\t293027\n201
6年5月份\t293028\n乙状\t293029\n李阳波\t293030\n温柔乡\t293031\n59路\t293032\n旋流器\t293033\n长沙海关\t293034\nReferenced\t293035\n养鸟\t293036\n第一程\t293037\nInland\t293038\njud\t293039\n1.8倍\t293040\nndp\t293041\n罗技k380\t293042\n收银\t293043\n18X\t293044\nm100\t293045\n瞒过\t293046\n书针\t293047\n频谱分析\t293048\n够不够用\t293049\n王公\t293050\nQI\t293051\nXmind\t293052\n山花\t293053\n焦磷酸钾\t293054\n98岁\t293055\n深圳中国银行\t293056\nioc\t293057\n桑塔纳·浩纳\t293058\ngünstig\t293059\n缩短\t293060\nAH股\t293061\n锦江股份\t293062\n左击\t293063\nCradle\t293064\n邢台大峡谷\t293065\n抗魔联军\t293066\n15步\t293067\n穿裙\t293068\n建筑工程设计文件编制深度规定\t293069\nuci\t293070\n无影无踪\t293071\n狐狸的夏天\t293072\n云度\t293073\n山东畜牧兽医职业学院\t293074\n小礼\t293075\n靳军\t293076\n小花园\t293077\n朝仓琴美\t293078\n熊科\t293079\n织发\t293080\n代评\t293081\n陈虹\t293082\n秘密情人\t293083\n星野度假村\t293084\n盐都\t293085\n金银湖\t293086\n平煤神马集团\t293087\n黄金草\t293088\n第04期\t293089\n省税\t293090\n皮蛋瘦肉粥\t293091\n哆\t293092\ntore\t293093\n兰州大学第二医院\t293094\n茶室\t293095\n红细胞计数\t293096\n佛山网络公司\t293097\n第123集\t293098\n奇\t293099\n便条\t293100\nExpander\t293101\n2015~2016年\t293102\n王玲玲\t293103\n万振逍遥苑\t293104\n限时\t293105\n通用股份\t293106\nFootwear\t293107\nMTO\t293108\n往后\t293109\nCCTV-3_央视\t293110\n大桥镇\t293111\n马鞍山公园\t293112\n全面\t293113\n大城市\t293114\n442\t293115\n125_\t293116\n98分钟\t293117\n秽语\t293118\n赵城镇\t293119\n司夏\t293120\n隐藏关\t293121\npaas\t293122\n孙睿\t293123\n新鲜感\t293124\n银闪付\t293125\n配文\t293126\n坦克手\t293127\nreflex\t293128\nstructural\t293129\nrobotframe\t293130\n人力资源管理概论\t293131\n包芯\t293132\n阿库拉\t293133\nDevils\t293134\n七场\t293135\nCannon\t293136\nmtm\t293137\n陕西省第四人民医院\t293138\n锚固剂\t293139\n欧洲卡车模拟2MOD\t293140\n何刚\t293141\n052D驱逐舰\t293142\n快修店\t293143\n超级经典\t293144\n纽西兰\t293145\n上海社保局\t293146\n@query\t293147\nuv胶\t293148\n公孙丽\t293149\n爆兽\t293150\n6088\t293151\n教科院\t293152\nDec\t293153\n镇江政府\t293154\nSTO\t293155\nddx\t293156\n中数\t293157\n混龄\t293158\n安全管\t293159\n两首歌\t293160\n西文\t293161\n宝塔镇\t293162\nToptal\t293163\n南伊沟\t293164\n炎泽\t293165\n西集\t293166\n2390\t293167\n升油\t293168\n车顶行李架\t293169\n艾尔帕兰\t293170\n开涛\t29317
1\n广州芳村车管所\t293172\n婴幼儿歌\t293173\n捷达\t293174\n六座\t293175\nputS\t293176\n中国人民银行\t293177\n唐山银行\t293178\n失落的大陆\t293179\n复合板\t293180\n郦波\t293181\ntranslating\t293182\n生心\t293183\n赏花期\t293184\n过家家装修网\t293185\n牛郎织女\t293186\n胬肉\t293187\n椰青\t293188\n武当太极拳\t293189\nwindows10\t293190\nX+\t293191\nv1.1.0_\t293192\n字体库\t293193\n马鞍山市人民医院\t293194\n85篇\t293195\n阿桑奇\t293196\nSli\t293197\nlixin\t293198\n20160909\t293199\n7714\t293200\n染指\t293201\n解除合同\t293202\n乐坊\t293203\n李小环\t293204\n陈晓春\t293205\n小黑豆\t293206\n高鸿业\t293207\nmybati\t293208\n中庚集团\t293209\n40项\t293210\n灬\t293211\n虚拟号\t293212\n白鹭岛\t293213\n分布式存储\t293214\n3960x\t293215\nbaba\t293216\n特色\t293217\n网易游戏吧\t293218\n酸萝卜老鸭汤\t293219\nMate9|Mate9\t293220\nMY0101\t293221\n青梅煮酒\t293222\nsql2017\t293223\n黄油相机\t293224\n渐入\t293225\n风尘仆仆\t293226\n昌平区政府\t293227\n起泡酒\t293228\nX61\t293229\n优秀少先队\t293230\n地球科学\t293231\n第七部\t293232\n过滤板\t293233\n收费员\t293234\n陈思斯\t293235\n士别\t293236\n音台\t293237\n建构\t293238\n股权登记\t293239\n厦门航空\t293240\nLogout\t293241\n无翼鸟邪恶漫画大全\t293242\nASIN\t293243\nColorado\t293244\nIELTS\t293245\n护腰\t293246\n辣舞\t293247\n堤防\t293248\n俗套\t293249\n蒋敦豪\t293250\n墨水\t293251\n普思\t293252\nsws\t293253\n孔容\t293254\n周瑞\t293255\n此法\t293256\n戊申\t293257\n66首\t293258\n8.7折\t293259\n中元国际\t293260\n挤过\t293261\n乌鲁克\t293262\n7090\t293263\n第07\t293264\nm1\t293265\n凤九卿\t293266\nairlines\t293267\n国王的演讲\t293268\n典故\t293269\n解梦见\t293270\n惠惠网\t293271\n天堂M\t293272\n临盘\t293273\nZhangCui\t293274\n黄土坡\t293275\n不争\t293276\n优播\t293277\n美国东北大学\t293278\n红眼\t293279\n大中专院校\t293280\n五菱\t293281\n接济\t293282\n练习器\t293283\n宝驹\t293284\n中国有嘻哈\t293285\nKamen\t293286\n低等\t293287\n出口权\t293288\n80多\t293289\n上海辰山植物园\t293290\nhier\t293291\nMcDonald\t293292\n百单\t293293\n金河田\t293294\n杜集区\t293295\n王天风\t293296\n亘\t293297\nproofing\t293298\n阿平\t293299\n基本型\t293300\n中信网鸽\t293301\n防晒指数\t293302\nTGFC\t293303\nvi编辑器\t293304\n甲磺酸\t293305\n中国校友会网\t293306\n苦劳\t293307\n571\t293308\n钢钉\t293309\n迅雷影院\t293310\n免责声明\t293311\n八五折\t293312\n杨虹\t293313\n皇图\t293314\n试金\t293315\n拉风箱\t293316\n莫拉塔
\t293317\n回头\t293318\n晋广论坛\t293319\nnanananana\t293320\nA1533\t293321\n735XT\t293322\n对抗赛\t293323\nplupload\t293324\n维加斯商城\t293325\n取豪夺\t293326\n淀粉样\t293327\n夏丏尊\t293328\nuikit\t293329\n文化交流史\t293330\n嘴袋\t293331\n赫\t293332\n说书\t293333\n华为mat\t293334\n新物\t293335\n生产许可证\t293336\ndtx\t293337\n水果拳\t293338\n火烧\t293339\n平安福2017\t293340\n支持向量机\t293341\n1.2.3.4.5\t293342\n国际海运\t293343\n上热下寒\t293344\n热拌沥青混合料\t293345\ndelect\t293346\n莱芜市人民政府\t293347\ndce\t293348\n中国皮革人才网\t293349\nCDMA2000\t293350\nu形\t293351\n一年四季\t293352\n甜馨\t293353\n国际能源网\t293354\n怀化火车站\t293355\nExpensive\t293356\n平常心\t293357\n35吨\t293358\n收复\t293359\n窦丽丽\t293360\n128GB\t293361\n喷轨\t293362\n美式期权\t293363\n旅行\t293364\n战迹\t293365\n结果集\t293366\n荷香\t293367\nswf格式转换器\t293368\n宗庙\t293369\nHin\t293370\nscott\t293371\n三片级\t293372\n拉合尔\t293373\nElection\t293374\noffering\t293375\n退耕\t293376\n工程类\t293377\n20公斤\t293378\n怡丰城\t293379\nxiongmao\t293380\n艾玛乔布斯\t293381\n王俊杰\t293382\nTourist\t293383\nVDE\t293384\n加拿大皇后大学\t293385\n54式\t293386\nmini14\t293387\n第十五届\t293388\n楚留香\t293389\n蜜匠\t293390\n仿写\t293391\nTINA\t293392\n发起式基金\t293393\n深圳龙华汽车站\t293394\n土肥\t293395\n李达\t293396\n吊顶板\t293397\n泰禾集团\t293398\n螺旋型\t293399\n175五\t293400\n社居委\t293401\n2:00\t293402\n大锤\t293403\nl460\t293404\n基层工会经费收支管理办法\t293405\n素混凝土\t293406\n插卡音箱资源网\t293407\n外公芳龄38\t293408\n雍也\t293409\n3.53\t293410\n髓系\t293411\nllp\t293412\nwalkr\t293413\n华泰证券\t293414\n第四张\t293415\n仙湖\t293416\n李腾飞\t293417\n自然教育\t293418\n上海浦东嘉里大酒店\t293419\n能说\t293420\n相关公\t293421\n新瑞\t293422\n开心公寓\t293423\n拉丝\t293424\n概预算定额\t293425\n屄\t293426\n8407\t293427\n第3关\t293428\nPsycho\t293429\n廖景萱\t293430\n桃花渡\t293431\n张家界天门山\t293432\n赤桥\t293433\n兽餐\t293434\n中国服务网\t293435\n小蚁模拟器\t293436\n元气\t293437\n卓越型\t293438\n易语言服务器\t293439\n翻板\t293440\n主权财富基金\t293441\n转分\t293442\n黄杨\t293443\n眼福\t293444\n供电公司\t293445\n云头\t293446\n长泰广场\t293447\n金太阳幼儿园\t293448\nETC速通卡\t293449\n0.9cm\t293450\n小俞\t293451\n哈莉·贝瑞\t293452\n魏如萱\t293453\n不插卡\t293454\nrcf\t293455\n禄存\t293456\n179个\t293457\nweakly\t293458\n沉淀之路\t293
459\n有声书\t293460\n黑莓9900\t293461\n第一班\t293462\n25升\t293463\nOPTICAL\t293464\n投球\t293465\n独孤曼陀\t293466\n龙虎山\t293467\nFreeBuds\t293468\n万卡\t293469\n朱集镇\t293470\n289个\t293471\n遗臭万年\t293472\n安徽省发展改革委\t293473\n君川结衣\t293474\n新浪青岛_新浪网\t293475\n1-3年\t293476\nLA\t293477\n半场\t293478\n无货\t293479\n哪怕\t293480\n阿智\t293481\nExt3\t293482\n富迪\t293483\n宝钛股份\t293484\n2011年上半年\t293485\n一褂\t293486\n莆田城厢区\t293487\n400dpi\t293488\n苍\t293489\n1.5吨\t293490\n幻世录2\t293491\n福特途睿欧\t293492\n车载DVD\t293493\n顾峰\t293494\n陈荣华\t293495\n森村诚一\t293496\n毕业论文参考文献格式\t293497\n浙贝母\t293498\n李春辉\t293499\n码营\t293500\nBYSocket\t293501\n第70\t293502\nShane\t293503\n狂飚\t293504\n信陵君\t293505\n河南省大学\t293506\n春兴精工\t293507\n性冲动\t293508\n妖界\t293509\nExtensible\t293510\n佛罗里达乐园\t293511\n通用化\t293512\n深圳大学图书馆\t293513\nhigh歌\t293514\n执子之手\t293515\nTN屏\t293516\n19v\t293517\n1066\t293518\n转给\t293519\n西兴街道\t293520\n手动版\t293521\n鼓子\t293522\n1853673\t293523\n鑫利\t293524\n8.3.2\t293525\n有余而力不足\t293526\n单位制\t293527\nPitts\t293528\n久美子\t293529\n级别\t293530\n前6个月\t293531\n很乖\t293532\nspectre\t293533\n羁\t293534\n7月16日\t293535\n美澳健\t293536\n生擒\t293537\nMofos\t293538\n为美好的世界献上祝福吧\t293539\n双师型\t293540\n盖帘\t293541\n过关\t293542\n董家湾\t293543\n零阶保持器\t293544\n恪尽职守\t293545\n平南\t293546\n台票\t293547\n十连\t293548\nPORT\t293549\n自酿啤酒\t293550\nlog\t293551\niTerm2\t293552\n桂林日报社\t293553\n1998款\t293554\ncarta\t293555\n出处\t293556\n小百姓网\t293557\n临习\t293558\n库洛洛\t293559\nHarmony\t293560\nBeat\t293561\n中华英雄\t293562\n实时监\t293563\n為\t293564\n90级SS套\t293565\n佐川银次\t293566\n安生\t293567\n框图\t293568\n集中荷载\t293569\n拉削\t293570\nAUX\t293571\n2017年5月14日\t293572\n熊猫眼\t293573\n象声词\t293574\n数2\t293575\n背压\t293576\nt40\t293577\n关晓\t293578\n徐小凤\t293579\n无以致远\t293580\n临沂物流网\t293581\ndec\t293582\n鼓瑟\t293583\ncherie\t293584\n南石龟\t293585\n董新尧\t293586\n万唯教育\t293587\n零部件展\t293588\n七彩云南\t293589\n线束\t293590\n质监局\t293591\n翠微居\t293592\n美丽联合集团\t293593\n武汉中学\t293594\n莫小棋\t293595\n渐冻人\t293596\n常德市政府\t293597\n国画展\t293598\n爱吧\t293599\n动手术\t293600\n党日\t293601\n结缔\t293602\n雅虎财经\t293603\n做人\t293
604\noxford\t293605\ntransformer\t293606\n腹疼\t293607\n临\t293608\n水件\t293609\n外资企业\t293610\n内蒙古机电职业技术学院\t293611\n道闸\t293612\nfallacy\t293613\nwebRTC\t293614\n奥特朗\t293615\n尿激酶\t293616\n泰宁\t293617\nappcompat-v7\t293618\nClin\t293619\n地沟油\t293620\n户口本\t293621\n39蜂疗网\t293622\norganize\t293623\n工图\t293624\n虐心\t293625\n共情\t293626\n排式\t293627\n昆仑神宫\t293628\n密电\t293629\n常工院\t293630\n笔端\t293631\n芝士片\t293632\n长景园林\t293633\n导轨油\t293634\n液位变送器\t293635\n恒基\t293636\n好发\t293637\n中山市住房和城乡建设局\t293638\n不畅\t293639\n卷六\t293640\n味香\t293641\n温州人才网\t293642\nHD1280p\t293643\n净化器\t293644\n哥伦比亚\t293645\n乡间客\t293646\n150天\t293647\n武备\t293648\n海南广播电视总台\t293649\n煎茶镇\t293650\n王诗安\t293651\n沙漠_\t293652\n超过3个月\t293653\n多民族\t293654\n郎鹤炎\t293655\nEng\t293656\n多浪河\t293657\n科普中国网\t293658\n戴式\t293659\n撰写\t293660\n杰克韦尔奇\t293661\n中小企业管理与科技\t293662\nNTR\t293663\n一易\t293664\n卷包\t293665\n天问\t293666\n苦涩\t293667\n回扣\t293668\nsuppose\t293669\n冒险岛online\t293670\n欺\t293671\n货仓\t293672\n360安全路由p1\t293673\n进集\t293674\n飞鹏网\t293675\nsmartscreen\t293676\n混淆\t293677\n龙泉驾校\t293678\n9000亿元\t293679\n华中科技大学外国语学院\t293680\n抗魔\t293681\n被选中\t293682\n浪客剑心真人版\t293683\n梅花针\t293684\n41.4\t293685\n雕文\t293686\nPaprika\t293687\n最流行\t293688\n云画的月光\t293689\nant-design\t293690\nganss\t293691\n298\t293692\nbizhub283\t293693\n电热式\t293694\n康元\t293695\n松际农网\t293696\n几十首\t293697\n大富贵\t293698\n轩逸论\t293699\n书殿\t293700\nDAMTOYS\t293701\n史家胡同小学\t293702\nmonaco\t293703\n迅雷高清\t293704\n增值税分录\t293705\n30节\t293706\n涪\t293707\nqing\t293708\njedec\t293709\n碎骨\t293710\n白蜡木\t293711\n图形推理\t293712\n世界村\t293713\n中大网校范文网\t293714\n神牛\t293715\n录波器\t293716\nCli\t293717\nrichtext\t293718\n胆机\t293719\n萝卜丁\t293720\n雪铁龙C4世嘉\t293721\n增温\t293722\n新奥德赛\t293723\n三江并流\t293724\n教育系\t293725\n甘肃民族师范学院\t293726\nrestraint\t293727\n热热闹闹\t293728\n联壁金融\t293729\neip\t293730\n黄河湿地公园\t293731\n仲裁\t293732\n续传\t293733\n母牛\t293734\n一条鱼\t293735\n镀膜机\t293736\n金相切割机\t293737\n折耳根\t293738\n李录\t293739\ntianyi\t293740\n张越\t293741\n海口经济学院\t293742\n金柜\t293743\n吸收式热泵\t293744\nedue\t293745\n没主见\t
293746\n莲花生\t293747\n天益\t293748\n尽头\t293749\n树形\t293750\n李来群\t293751\n王君安\t293752\nRiverside\t293753\n1472\t293754\nelaticsearch\t293755\n314\t293756\n向东流\t293757\n致死案\t293758\n3x3黄金联赛\t293759\n重汽\t293760\ntft\t293761\n想和你在一起\t293762\nsmit\t293763\n木偶\t293764\n东晓南\t293765\n二甘醇\t293766\n努比亚Z11\t293767\n9周岁\t293768\n外省\t293769\nvecm\t293770\n三五斗\t293771\nraid6\t293772\nStephanie\t293773\n捞起\t293774\nmimu\t293775\nFLStudio\t293776\n吹泡\t293777\n中国水产网\t293778\n陆雪\t293779\n69天\t293780\nqq电脑\t293781\n5回\t293782\n名义利率\t293783\ngiant\t293784\n拥抱你离去\t293785\n对称点\t293786\n附文\t293787\n安闲\t293788\nTimeZone\t293789\nXVideos\t293790\n乘风破浪\t293791\n腾龙SP\t293792\n金赛纶\t293793\n陈雄\t293794\ngbasp\t293795\n优秀篇\t293796\n中共沈阳市委\t293797\nshp\t293798\nバニ\t293799\n16.6\t293800\nREGEXP\t293801\n党政机关公务用车管理办法\t293802\n灰熊\t293803\niqkey\t293804\n丁苯酞软胶囊\t293805\nbytecode\t293806\nunix\t293807\n棉袜\t293808\n龙凤网\t293809\n北京市食品药品监督管理局\t293810\n电子登机牌\t293811\n苦日子\t293812\n金山大厦\t293813\n解决\t293814\n图报\t293815\n智将\t293816\ntakami\t293817\n最高额\t293818\n醇美\t293819\n三国在线\t293820\ntyod\t293821\n马鲷鱼\t293822\nxit\t293823\n鼓风\t293824\n龙口港\t293825\n浮球液位计\t293826\n金山打字通在线\t293827\n纸纸\t293828\nlookbook\t293829\n7447\t293830\n周林\t293831\nHOW\t293832\n授信额度\t293833\ntensile\t293834\n51000\t293835\n进队\t293836\n预埋\t293837\n集成热水器\t293838\nmaya2015\t293839\nMassimo\t293840\n梁文\t293841\n第12关\t293842\n高中会考\t293843\ntextfile\t293844\n哑奴\t293845\n芝士派\t293846\n常山\t293847\n凝香\t293848\n蓝天白云\t293849\n单支\t293850\n战国立志传\t293851\n福建省人民政府办公厅\t293852\n依萍\t293853\n周志强\t293854\n上海交通大学密西根学院\t293855\n行尸走肉第四季\t293856\n03月30日\t293857\n弄臣\t293858\n投基\t293859\nClone\t293860\n强奸\t293861\n宝马i8\t293862\n太极计算机股份有限公司\t293863\n穆图\t293864\nG-SHOCK\t293865\ncuo\t293866\nmax_execution_time\t293867\n衡阳市\t293868\n九科\t293869\nrhapsody\t293870\n汽车圈\t293871\n龙啸\t293872\n铃铛\t293873\n政立路\t293874\n直螺纹\t293875\n古剑奇\t293876\n督办\t293877\n详细\t293878\nN4110\t293879\n歧义\t293880\nltd\t293881\nDALI\t293882\n变化无常\t293883\n酷越\t293884\n面临\t293885\nkindly\t293886\n尿片\t
293887\n黄土镇\t293888\n淘宝贷款\t293889\n白垩纪\t293890\n蓝梦\t293891\n共青城市\t293892\n下跪\t293893\n怯懦\t293894\nctfmon\t293895\nMedieval\t293896\n读书心得_\t293897\n战备状态\t293898\n真喜欢\t293899\n金玫瑰\t293900\n中华民族\t293901\n西中\t293902\n002065\t293903\n售前\t293904\n锦绣天府\t293905\n一点红\t293906\n露中\t293907\n张天蓉\t293908\n察右中旗\t293909\ndesired\t293910\n4户\t293911\n妙喜\t293912\n贤集\t293913\n除息日\t293914\nELS\t293915\n汉阴县人民政府\t293916\nGVIM\t293917\n感知觉\t293918\nlü\t293919\n吃玩\t293920\n小勺\t293921\n烟管\t293922\ninstax\t293923\n英雄谱\t293924\n大盗\t293925\n华为p9\t293926\n借此\t293927\n曼城\t293928\n阜阳市\t293929\n13837365653\t293930\n96K\t293931\n外卖箱\t293932\n绿鬣蜥\t293933\n实木复合门\t293934\n社会主义现代化建设\t293935\n0.7米\t293936\n太原电视台\t293937\n清真\t293938\n开场语\t293939\n学潮\t293940\n全栈\t293941\n美国国税局\t293942\nattend\t293943\n工百科\t293944\n蓝关\t293945\n利率互换\t293946\n氧气瓶\t293947\n若兰\t293948\n王三姐\t293949\n倒转\t293950\n凯旋门\t293951\n姜茶\t293952\n基本上\t293953\n书名号\t293954\n四天后\t293955\n修容盘\t293956\n动火作业\t293957\n割除\t293958\n顾伟\t293959\nSiebel\t293960\n挥发性有机物\t293961\n邮政储\t293962\n路基\t293963\n4月2号\t293964\n磨练\t293965\n支那人\t293966\n公安消防部队\t293967\n弗兰\t293968\nANSYSworkbench\t293969\n信用证\t293970\n猎术\t293971\n跟我学\t293972\nwant\t293973\n肾阴阳两虚\t293974\n邺\t293975\n35届\t293976\nlzx\t293977\nwetest\t293978\n车陂\t293979\n1776\t293980\n五音Jw\t293981\n拟物化\t293982\n兵刃\t293983\nr6300v2\t293984\n荣耀手环zero\t293985\n胆味\t293986\n忽然之间\t293987\nkilometer\t293988\n招商银行广州分行\t293989\n对话\t293990\n侠骨丹心\t293991\nwxw\t293992\n天舞\t293993\nnebula\t293994\n涵体\t293995\n切诺基\t293996\n36D\t293997\n现出\t293998\nbnp\t293999\n恒颜\t294000\n常州地铁1号线\t294001\n大金中央\t294002\n神雕之龙儿别传\t294003\n镗床\t294004\ni53470\t294005\n交易税\t294006\n传奇宝石\t294007\n编著\t294008\n帛金\t294009\n卷三\t294010\n社康\t294011\n金通\t294012\n盐野七生\t294013\n尹娜\t294014\n想说爱\t294015\n来过我生命的你\t294016\n维信诺\t294017\n臂丛神经损伤\t294018\n建信网\t294019\n车道\t294020\n捏合\t294021\nappendTo\t294022\n茂\t294023\n线切割机\t294024\nInitiatives\t294025\n179号\t294026\n正宁县\t294027\n易号网\t294028\n3565\t294029\n住总\t294030\n雷瑟\t294031\npyth\t294032\n程铮\t29403
3\n牌机\t294034\n商量\t294035\n张昺华\t294036\n中华人民共和国劳动合同法实施条例\t294037\n段尾\t294038\n22倍\t294039\n新妇\t294040\n程超\t294041\n如烟\t294042\n小气道\t294043\n吕欣\t294044\nFIGHT\t294045\n澄合网\t294046\n记员\t294047\n乘以\t294048\n中华客栈2\t294049\n长袍\t294050\n云南文山州政府\t294051\n必克\t294052\n樱吹雪\t294053\n没了解\t294054\n摄影师\t294055\n爱彼迎帮助中心\t294056\n近年来\t294057\ndaiyu\t294058\n底泥\t294059\n福冈县\t294060\nounces\t294061\n升学助考一网通\t294062\nnmon\t294063\n献给爱丽丝\t294064\n江西省国家税务局\t294065\nipf\t294066\nswitchysharp\t294067\nbently\t294068\n上海浦东国际机场\t294069\nPeru\t294070\n贪狼\t294071\nJAVA编程\t294072\n2017年6月4日\t294073\n20180\t294074\n3.59\t294075\n税屋\t294076\n棉语\t294077\n复旦大学外国语言文学学院\t294078\n打劫\t294079\n纽曼移动\t294080\n看招\t294081\nfx-82ES\t294082\n精绝古城\t294083\n冷暖自知\t294084\n幸福的生活\t294085\n中华人民共和国防震减灾法\t294086\n六个半月\t294087\n丽星\t294088\n苦人\t294089\nmetropolitan\t294090\n张扣\t294091\n封机\t294092\n3340\t294093\n1141\t294094\n湘龙街道\t294095\n建安街\t294096\n丟\t294097\n李侃\t294098\n海阔斗龙\t294099\n论理\t294100\n关糸\t294101\n51天\t294102\n姜启源\t294103\n玛咖\t294104\n养儿育女\t294105\n接地棒\t294106\n江云\t294107\n国民女神\t294108\n外交家\t294109\n天府大道北段\t294110\n专门化\t294111\n五套卷\t294112\n发达镇\t294113\n4零\t294114\nDMAX\t294115\n囚\t294116\n滑动窗口协议\t294117\n19201080\t294118\n中国保利集团公司\t294119\nWifi\t294120\ndvr\t294121\n农产品批发市场\t294122\n沙西\t294123\n查壳\t294124\n端电压\t294125\n纸盒\t294126\n雅兴\t294127\n官渡区人民政府\t294128\n喵帝\t294129\n建筑工程规划许可证\t294130\n紊乱\t294131\n花辫\t294132\n邮站\t294133\n空驶\t294134\n新兴市场\t294135\nprogressBar\t294136\n谭力\t294137\n恒大高新\t294138\n猫掌柜\t294139\n普通高等学校招生全国统一考试\t294140\n安定器\t294141\nPVC塑胶地板\t294142\n三十六式\t294143\n黑鹰坠落吧\t294144\n英特尔公司\t294145\n昌河福瑞达\t294146\n水菜\t294147\n1am2\t294148\n姜思达\t294149\ngossip\t294150\n中德安联\t294151\n华盛顿邮报\t294152\n海堤\t294153\n莫沙必利\t294154\n莘松路\t294155\n克鲁勃\t294156\n爱上学\t294157\nPenelope\t294158\n田源\t294159\nCarplay\t294160\n闪\t294161\nist\t294162\n圣托里尼\t294163\n爱乐奇\t294164\n专委会\t294165\n截然\t294166\n大奇山\t294167\n七巧板\t294168\n妹控\t294169\nqq飞车\t294170\n晋安\t294171\n纽约时报广场\t294172\n柱廊\t294173\n济南汽车站\t294174\n澳信网\t294175
\n中集\t294176\n心理学报\t294177\n200斤\t294178\nalma\t294179\n趋缓\t294180\n翼缘板\t294181\n名创优品\t294182\n鱼鳞\t294183\n宝钢\t294184\n重庆奥体中心\t294185\npowerpoint2003\t294186\n免疫证\t294187\n热量表\t294188\n2018.4.14\t294189\n神华宁煤集团\t294190\n景忠山\t294191\n龙泉市政府\t294192\n杯口\t294193\n亿帆医药\t294194\n登峰\t294195\n爱商网\t294196\n美国国家航空航天局\t294197\n小软\t294198\n24小时后\t294199\n片花\t294200\n柜内\t294201\n胧\t294202\n串起来\t294203\nXN\t294204\n经济基础知识\t294205\n罗英锡\t294206\n浅仓彩音\t294207\ncodeigniter\t294208\n策划师\t294209\nI/O流\t294210\n六安文明网\t294211\njava线程池\t294212\n后遗\t294213\n可降解\t294214\n命妇\t294215\n流带\t294216\n阿狗\t294217\nHD超清\t294218\nt5\t294219\nalexis\t294220\n金塔\t294221\n302路\t294222\n一阵风\t294223\n5113\t294224\n温暖的弦\t294225\n包圆\t294226\n能不能够\t294227\n一条线\t294228\n乐至\t294229\n每天\t294230\nschoo\t294231\n+++\t294232\n芜湖镜湖区政府网\t294233\n12盘\t294234\n李玥\t294235\n杏美月\t294236\n南坝\t294237\n南丁\t294238\nMODBUS\t294239\nV3.1\t294240\n宋智恩\t294241\n蹦\t294242\n流亭国际机场\t294243\n第6场\t294244\n罗斯\t294245\nVAM\t294246\n口袋妖怪红蓝宝石\t294247\nGTM\t294248\n昆山杜克大学\t294249\n新乡东站\t294250\nSS喰种交易\t294251\n2度\t294252\nNightMP3\t294253\nStuttgart\t294254\n150名\t294255\n广发粤通\t294256\n国际奥委会\t294257\n胡林\t294258\n踢腿\t294259\n蒂\t294260\n9.3.0\t294261\n埃塞俄比亚比尔\t294262\nperu\t294263\n德施曼\t294264\n奔驰e300\t294265\n名迹\t294266\n看潮\t294267\n狂野飙车9\t294268\n2009年度\t294269\n首首\t294270\nkingsleylam\t294271\n地精工程学\t294272\n小讲课\t294273\n安达亚美\t294274\n强买强卖\t294275\n三十八\t294276\n开证行\t294277\n匹兹堡大学\t294278\n菅韧姿\t294279\n柏芝\t294280\n苏州移动\t294281\n96度\t294282\n皂河\t294283\ngip\t294284\n化学品\t294285\n逃出去\t294286\n河南林业职业学院\t294287\n混珠\t294288\n清华大学美术学院\t294289\n龙柒\t294290\n天士\t294291\nIntroductory\t294292\n好巧网\t294293\n高东镇\t294294\n仙歌\t294295\n风车公路\t294296\n外室\t294297\nz9d\t294298\n贞子大战伽椰子\t294299\nHOME\t294300\n亚厦\t294301\n28寸\t294302\n灭火机\t294303\n净业寺\t294304\n栓箱\t294305\n南极磷虾油\t294306\n林同炎\t294307\n生态旅游区\t294308\n蹉跎岁月\t294309\n曝气池\t294310\n陷波滤波器\t294311\n前任攻略\t294312\n天下无\t294313\n要图\t294314\n赶时间\t294315\n外物\t294316\n春熙\t294317\nEight\t294318\n2011\t294319\np
aopao\t294320\n颍淮论坛\t294321\n2018上半年\t294322\n83天\t294323\n量仪\t294324\n专著\t294325\n十二名\t294326\n臭鼬\t294327\n焦煤期货\t294328\n河南省科技厅\t294329\n岩雀\t294330\n中华人民共和国水土保持法\t294331\n芯板\t294332\n橡皮擦\t294333\nCache\t294334\n制度性\t294335\n廖记棒棒鸡\t294336\n1858\t294337\n360安全中心\t294338\n十孔\t294339\nC类IP地址\t294340\n资阳新闻网\t294341\nJira\t294342\nR270\t294343\nDiary\t294344\n4A景区\t294345\n凯迪拉克ATS/ATS-L论坛_汽车之家论坛\t294346\n陈文华\t294347\n走出国门\t294348\nif\t294349\n门钉\t294350\nadapters\t294351\n芮小丹\t294352\n山会\t294353\n彩钢房\t294354\n江户川乱步\t294355\n职级\t294356\n派雅\t294357\ncitra3ds\t294358\n蝴蝶杯\t294359\n伊苏2\t294360\nROOBO\t294361\n青岛工学院\t294362\n3566t.com\t294363\n吴淞口\t294364\nrainforest\t294365\ngh60\t294366\n约翰内斯堡\t294367\n遗臭\t294368\n199号\t294369\n关联企业公司\t294370\natime\t294371\n学题\t294372\n変态\t294373\n28.2\t294374\npl文件\t294375\n免扣\t294376\n辍\t294377\n蒋丽莎\t294378\nNuvoton\t294379\n长郡双语实验中学\t294380\n蓝思\t294381\n万奇\t294382\n长生锐志\t294383\n步步大\t294384\n实现方\t294385\n恒天\t294386\nsany\t294387\n无路可\t294388\n杭州火车东站\t294389\n溜冰鞋\t294390\n德赫\t294391\n2016年前\t294392\ntianjin\t294393\n延性\t294394\n脚皮\t294395\n采育镇\t294396\n走西口\t294397\n13栋\t294398\nvod\t294399\n询单\t294400\n20170331\t294401\n致命弯道3\t294402\n新天龙八部吧\t294403\nCap\t294404\n胶合板\t294405\n磁疗贴\t294406\n萌爱\t294407\n13003\t294408\nMini2\t294409\n唐人社\t294410\n黏\t294411\n宾语从句\t294412\n山西省公安厅交通管理局\t294413\n招式\t294414\npairs\t294415\n税级\t294416\n赂\t294417\n金爵奖\t294418\n漳州职业技术学院\t294419\n潮汕职业技术学院\t294420\nNOKIA\t294421\nseth\t294422\n传销\t294423\n淘宝体检中心\t294424\n大宁国际\t294425\n中视典数字科技\t294426\neayui\t294427\n81路\t294428\n承接\t294429\nLINU\t294430\n大芬\t294431\n克罗心\t294432\n93个\t294433\n水奶\t294434\n御前四宝\t294435\n致富网\t294436\nABC公司\t294437\n曾国祥\t294438\n驱动型\t294439\n精确性\t294440\n我意\t294441\n黑臭河\t294442\n少室山\t294443\n小彬彬\t294444\nMMO\t294445\n生产期\t294446\n香砂六君丸\t294447\nBSL\t294448\n田正国\t294449\n勇猛者\t294450\n狂风骤雨\t294451\n郭兰英\t294452\n何香凝\t294453\n塞米松\t294454\n盐阜大众报报业集团\t294455\n南屏街\t294456\n城关\t294457\n二仙\t294458\n夏一可\t294459\n嘉定区\t294460\n17000\t294461\n人和\t29446
2\n新王\t294463\nColumns\t294464\ncul\t294465\n7天连锁酒店\t294466\n日柱\t294467\n课外\t294468\naug\t294469\n2r\t294470\n鲜胚\t294471\n稼动率\t294472\n水岸华庭\t294473\n龙木\t294474\nLetting\t294475\nCalled\t294476\nexecv\t294477\n松松\t294478\n虾场\t294479\n顺风妇产科\t294480\n尤克里里Fans乌克丽丽\t294481\n均量\t294482\nTokyo\t294483\n王逸\t294484\n浙江泰隆商业银行\t294485\nnse\t294486\n二中院\t294487\n麒龙\t294488\n鬼玩人之阿什斗鬼\t294489\n佢\t294490\nbesame\t294491\nG15\t294492\nKRW\t294493\n化学卷\t294494\nf-362960\t294495\n药明康德\t294496\nlef\t294497\npowermill\t294498\ntti\t294499\n练习卷\t294500\n潋滟\t294501\n品牌加盟网\t294502\nk米\t294503\nnex5\t294504\n小丫头\t294505\n读心神探\t294506\n于斌\t294507\nborland\t294508\n辩证法\t294509\n核高基\t294510\n私下\t294511\n淘宝助手\t294512\n缝法\t294513\n易燃\t294514\n铃音\t294515\nraw\t294516\n颞颌关节\t294517\n横桥\t294518\n⑩\t294519\n白蚁\t294520\nDateFormat\t294521\n0562\t294522\n坐等\t294523\n所幸\t294524\n拆式\t294525\n佛面\t294526\n横漂\t294527\n申屠\t294528\nsecret\t294529\n10月1\t294530\n使至塞上\t294531\n三亚市公安局\t294532\n金胜\t294533\n压载\t294534\n座山雕\t294535\njs1956\t294536\n漂亮妈妈\t294537\n橙光游戏exo\t294538\nspeedfan\t294539\nmtg\t294540\n行为金融学\t294541\n图秀\t294542\n10档\t294543\nLeela\t294544\n疾风剑豪亚索\t294545\n工程版\t294546\nCitrix\t294547\n鬼剑士\t294548\n一人份\t294549\n绿拐\t294550\n草色\t294551\n从化市\t294552\n潮妆\t294553\n候车\t294554\n平安达\t294555\n陌上香坊\t294556\n腐竹小说网\t294557\nwk\t294558\nliquidity\t294559\n苦麻菜\t294560\nち\t294561\n黄劲松\t294562\nSkills\t294563\n新村小区\t294564\n格罗皮乌斯\t294565\n悠乐汇\t294566\n死亡飞车\t294567\nQuaternion\t294568\nVHD\t294569\n2018年04月09日\t294570\n艾菲奖\t294571\n跨片\t294572\n300b\t294573\n酷奇\t294574\n呼铁局\t294575\nimitation\t294576\n29亿元\t294577\nHenry\t294578\n名上\t294579\n稳压\t294580\nrtmpdump\t294581\nxnview\t294582\ncsk\t294583\nmoda\t294584\n太软\t294585\n技士\t294586\n泽西\t294587\n红领巾广播站\t294588\n施工资料\t294589\n电弓\t294590\n陈云纪念馆\t294591\n原状\t294592\n92折\t294593\nvcomp\t294594\n佳境\t294595\n微单M6\t294596\n90W\t294597\n颢\t294598\n2680V2\t294599\n瑞格\t294600\nCoding\t294601\n囧态\t294602\n醋泡鸡蛋\t294603\n下关区\t294604\nupgrade\t294605\n美人儿\t294606\n20150
128\t294607\n宇尘\t294608\n随行杯\t294609\n灵岩山\t294610\nJKF\t294611\n傻瓜式\t294612\n04007CN\t294613\n蓝犀牛\t294614\nAC68U\t294615\n萍乡日报\t294616\n刘中伯\t294617\n豪神\t294618\n反复\t294619\n虎扑体育\t294620\n编程梦\t294621\n桐庐政府网\t294622\nenix\t294623\n雏鹤\t294624\nfling\t294625\nv5.0.3\t294626\n周克华\t294627\n大同城区\t294628\n含有\t294629\n媒体类\t294630\n拍摄\t294631\n野狼音效网\t294632\n草酸艾司西酞普兰片\t294633\n六郎镇\t294634\n细观\t294635\n野刀\t294636\n远大路\t294637\n陆老师\t294638\n普宙\t294639\n西澳大利亚\t294640\n羊羔绒\t294641\n同人曲\t294642\n代价函数\t294643\nPusher\t294644\nalizee\t294645\nH8\t294646\n称象\t294647\n城镇居民人均可支配收入\t294648\n鹿哥\t294649\n哈伦裤\t294650\n海南澄迈县政府\t294651\n6.7.1\t294652\n二章\t294653\n投向\t294654\n飚车\t294655\n坊\t294656\n采药\t294657\nClamAV\t294658\nmobius\t294659\nqBittorrent\t294660\n东顺\t294661\n标准节\t294662\n钢琴王子\t294663\n宏电\t294664\n兵营\t294665\n未来医院\t294666\n爱莉安娜·格兰德\t294667\n刘佩琦\t294668\nfinetuning\t294669\nm4000\t294670\n内外侧\t294671\n精神卫生法\t294672\n61953000\t294673\n晚来\t294674\n第三台\t294675\nCCTV6-电影网\t294676\n奶品\t294677\n上海三菱\t294678\n大英图书馆\t294679\n培新小学\t294680\n台资\t294681\n等式\t294682\n陈晓平\t294683\nApe\t294684\n百物语\t294685\nTS\t294686\n赦免\t294687\n一次又一次\t294688\n铁核\t294689\n十一级\t294690\nchest\t294691\npepper\t294692\nten\t294693\n秋叶\t294694\n1280x1024\t294695\n13%\t294696\n内联\t294697\n行险\t294698\n内伶仃岛\t294699\n欧酷\t294700\n京东卖家论坛\t294701\n扬子石化\t294702\n脑力\t294703\n雷贝拉唑钠\t294704\nLNG\t294705\n彩钻\t294706\n狸子\t294707\n贾玲\t294708\nopenapi\t294709\n70页\t294710\nModel层\t294711\n中国物流与采购联合会\t294712\n连裤袜\t294713\nHomemade\t294714\n勇闯天涯\t294715\nmiki\t294716\nadvertisements\t294717\n中国民主建国会\t294718\n趾疣\t294719\n路非\t294720\n青天白日\t294721\nSegmentation\t294722\nteenie\t294723\n绝症\t294724\n义释\t294725\n金纯\t294726\njanuary\t294727\nGWU\t294728\n牛比\t294729\n李向东\t294730\nsubjected\t294731\n唳\t294732\nremover\t294733\n浙北大峡谷\t294734\n墨舞\t294735\nhyper\t294736\n统计学\t294737\n瘙痒症\t294738\n萨克斯管\t294739\n自由主义\t294740\nOak\t294741\n歇业\t294742\n热回收\t294743\n鸡鸣狗盗\t294744\nyingy\t294745\n劫缘\t294746\ndevice-width\t294747\n星湖\t294748\n张清平\t2
94749\n胡绳\t294750\n题注\t294751\n马日\t294752\n含山县人民政府\t294753\n书信集\t294754\n万根\t294755\n楚天\t294756\n花木场\t294757\n化学成分\t294758\n产业群\t294759\n东钱湖\t294760\n东风景逸\t294761\n小脚本\t294762\n新案\t294763\n建设银行信用卡中心\t294764\n腊肉\t294765\n海昌极地海洋公园\t294766\n时装周\t294767\n车把\t294768\n3.25\t294769\n应收账款管理\t294770\n紫乐\t294771\n东方时空\t294772\n酷路泽5700\t294773\n罩衫\t294774\n南开大学出版社\t294775\n暗黑之魂3\t294776\n脉冲响应函数\t294777\n南通百姓网\t294778\nTechnician\t294779\n1000g\t294780\n50户\t294781\n天津市肿瘤医院\t294782\n人教版八年级英语\t294783\n产热\t294784\n异教徒\t294785\ntracked\t294786\n截取\t294787\n粤泰股份\t294788\n科莫\t294789\n麦苗\t294790\n茨木\t294791\n国家地理标志保护\t294792\n漏失\t294793\n省出\t294794\n陈蓉\t294795\nerrcode\t294796\n30万公里\t294797\n撑住\t294798\n臭鱼\t294799\nJAVA编程语言程\t294800\nm104w\t294801\n奠基者\t294802\n瀚宇\t294803\nUNE\t294804\n70毫米\t294805\n老院子景区\t294806\ngoldfish\t294807\n行政管理人员\t294808\nchairs\t294809\n叠翠\t294810\n客网\t294811\nMaths\t294812\nseu\t294813\n同卵双胞胎\t294814\n高发\t294815\nrekordbox\t294816\nSnipaste\t294817\n唐季礼\t294818\nulua\t294819\n杰拉德·巴特勒\t294820\n淘券\t294821\ndropzone\t294822\nhailing\t294823\n轻装\t294824\n刘长卿\t294825\n深圳市公安局南山分局\t294826\nPS2017\t294827\n李悝\t294828\n几号前\t294829\n君士坦丁\t294830\n广汇\t294831\n鹰潭市政府\t294832\ntslint\t294833\n梦幻模拟战4\t294834\n感恩母亲节\t294835\n力士乐\t294836\n山西省政协\t294837\n带入\t294838\n应和\t294839\n任意依恋\t294840\n太湖县人民政府\t294841\n14岁\t294842\n拍立得相纸\t294843\nMr7Dog\t294844\ntunnelblick\t294845\nPOST\t294846\nfinals\t294847\n免狗\t294848\n泛海城市广场\t294849\n烤鸡\t294850\n中国儿童艺术剧院\t294851\n空间光调制器\t294852\nC++面向对象程序设计\t294853\n30ML\t294854\n工作职\t294855\n中立国\t294856\nolap\t294857\n口袋妖怪:红蓝宝石\t294858\n光亮\t294859\nInputField\t294860\n边关系\t294861\n数枚\t294862\nhashCode\t294863\n中国人民解放军总后勤部\t294864\n模拟城市:我是市长\t294865\n雍阳中学\t294866\n费德提克\t294867\nvipabc\t294868\n金角\t294869\n视角\t294870\n650nm\t294871\n自家人\t294872\n46分钟\t294873\n智海\t294874\n边际\t294875\n氧化铈\t294876\n矢量场\t294877\n职称\t294878\n单元格-ExcelVBA\t294879\n国有\t294880\n纠察\t294881\n新锐界\t294882\nshenm\t294883\nngrinder\t294884\n上海交通大学医学院附属第九人民医院\t294885\n高送转股票\t2
94886\n大公鸡\t294887\nbobbi\t294888\nscenes\t294889\nnikkor\t294890\n乐逗游戏\t294891\n商业机密\t294892\n唐山市人民政府\t294893\n厦门招聘网\t294894\n火命\t294895\n20180206\t294896\nケダモノ\t294897\n顿顿\t294898\n22小时\t294899\nFutures\t294900\nSingapore\t294901\n赵树理\t294902\nechojb\t294903\nC型\t294904\n衡水银行\t294905\n虹膜炎\t294906\n金伟\t294907\nNodeJS\t294908\n中国教师资格网\t294909\n大便溏\t294910\n引力\t294911\n孕妈们\t294912\n拥抱世界\t294913\n中港台\t294914\n基民\t294915\n白靴\t294916\n黑耀石\t294917\n2015-2016年\t294918\n阿提拉:全面战争\t294919\n江湖传说\t294920\n赵松\t294921\n一败\t294922\n保密费\t294923\n灵山小镇\t294924\n主人家\t294925\n叶鹏\t294926\n不收复\t294927\nsuf\t294928\nTP-link\t294929\n西安中学\t294930\n源量\t294931\n种病\t294932\n山东电力集团公司\t294933\n通信人才网\t294934\nelectrostatic\t294935\n0.2米\t294936\n浮生\t294937\n广西环江毛南族自治县人民政府\t294938\n十三章\t294939\n工作坊\t294940\n木椅\t294941\n海瑞温斯顿\t294942\nrru\t294943\n金属活动性顺序表\t294944\n乌尼莫克\t294945\n怀柔科学城\t294946\n蒋雪儿\t294947\n满天繁星\t294948\n码码\t294949\n麦茶\t294950\n米糠\t294951\nshfiguarts\t294952\ninstaller\t294953\n波萝\t294954\n医改办\t294955\n小白楼\t294956\nv2.1.0\t294957\n吉林省纪委\t294958\n排球赛\t294959\n吉林省中医院\t294960\n李夏\t294961\n手口\t294962\n上传说\t294963\n致歉\t294964\nNOK\t294965\nMyths\t294966\n3DSMAX\t294967\n哈尔特山\t294968\n白斩\t294969\n爱问算命网\t294970\n32个月\t294971\n工作篇\t294972\n正态分布函数\t294973\n肉狗\t294974\n王二狗\t294975\n喜宝奶粉\t294976\n北京国际学校\t294977\n猫叔\t294978\n恒河猴\t294979\n提示\t294980\n天师府\t294981\n蘸料\t294982\n蒋鹏\t294983\n黑坑\t294984\n杨慧妍\t294985\n黑白户\t294986\n周虎\t294987\n中枪舞\t294988\n浙师大附中\t294989\n登嘉楼\t294990\nLaughing\t294991\n助残\t294992\n第五回\t294993\ncsair\t294994\n094\t294995\n黄山温泉\t294996\n受力分析\t294997\n气液增压缸\t294998\n上海社科院\t294999\n瑞斯康达\t295000\n陈志云\t295001\n新邱区\t295002\n酸化\t295003\n无节制\t295004\n王学斌\t295005\n内训\t295006\nmvp\t295007\n微芯\t295008\n爱科\t295009\n蓝精灵\t295010\nnetapp\t295011\n昌吉回族自治州人民政府\t295012\n乳腺炎\t295013\n大秦之帝国\t295014\n牙斗兽娘\t295015\n机泵\t295016\nsweetie\t295017\n公会\t295018\n早期教育\t295019\n信息反馈表\t295020\n查经\t295021\n长堽路\t295022\n净亏损\t295023\n自诉状\t295024\n油管\t295025\n国花\t295026\n一期\t295027\n稳贷网\t295028\npadding-l
eft\t295029\n研究者\t295030\n陈天华\t295031\n柯以敏\t295032\n漳州人才网\t295033\n湖北省粮食局\t295034\n熔覆\t295035\n地老天荒\t295036\n名邸\t295037\n三金片\t295038\n高桥阳一\t295039\n陈力\t295040\n血翼\t295041\n网易云音乐\t295042\n_镇魂街\t295043\npayment\t295044\n无功功率\t295045\n住宅公寓\t295046\n年限额\t295047\n钙\t295048\n造价员证\t295049\n趸缴\t295050\n1数\t295051\n地动山摇\t295052\n易简\t295053\n李闰珉\t295054\n1月\t295055\n55bbs\t295056\n综合岗\t295057\n土地管理局\t295058\n有迹可循\t295059\n高位刹车灯\t295060\nTplink\t295061\n日活\t295062\n数控雕铣机\t295063\n毛利率\t295064\n吴浩\t295065\n白眉大侠\t295066\n十里风荷\t295067\n四轮驱动\t295068\n南京林业大学研究生院\t295069\n王顶堤\t295070\n让利\t295071\n蓝橙\t295072\n6条\t295073\n官桥镇\t295074\n弹簧\t295075\nCanva\t295076\nQ5\t295077\n吴雨\t295078\n物理攻击\t295079\nCVM\t295080\n贾贵\t295081\n金超群\t295082\n爱心\t295083\nl850\t295084\n59分钟\t295085\n1080p/Mp4\t295086\n一席话\t295087\n榀\t295088\nGump\t295089\nvideojs\t295090\n研究成果奖\t295091\n负翁\t295092\n百超\t295093\n直属机关\t295094\nscf\t295095\n王文章\t295096\n陈式太极剑\t295097\n回归系数\t295098\n全满贯\t295099\n划款\t295100\ncanbus\t295101\n广西代表团\t295102\nANSA\t295103\n金融控股公司\t295104\n正规\t295105\ntothe\t295106\n红字帖\t295107\ntrip\t295108\n长城wey\t295109\n乘梯\t295110\n世纪家园\t295111\n詹姆斯\t295112\n吴慧\t295113\nsera\t295114\n斐讯\t295115\n宾朋\t295116\n立昂\t295117\nCET4\t295118\n四号位\t295119\n须弥\t295120\n烹调\t295121\n围栏网\t295122\n极度\t295123\npurdue\t295124\nNamespace\t295125\nHP-UX\t295126\nflash素材\t295127\n上海国际和平妇幼保健院\t295128\n黄石政府网\t295129\n冰场\t295130\n舵轮\t295131\n名企\t295132\n交流地\t295133\n鲍比\t295134\nturnaround\t295135\n采薇舞\t295136\n世光\t295137\n兴隆大奥莱\t295138\n喀喇沁\t295139\n压缩包解压器\t295140\n水展\t295141\n招商集团\t295142\n_化学工业出版社\t295143\n杨琪\t295144\n别死\t295145\n36段\t295146\n钎杆\t295147\n香薷\t295148\n脸色\t295149\n10科\t295150\n庆贺\t295151\n上河图\t295152\nmegui\t295153\n溶液\t295154\n椿萱\t295155\n赵萌\t295156\n组织机构代码证\t295157\n18039期\t295158\n3atoys\t295159\n软考信息系统项目管理师\t295160\nloving\t295161\n1.4元\t295162\nOrbital\t295163\n马哥教育\t295164\n盛世家园\t295165\n翻译学院\t295166\nCNTK\t295167\n我们一家\t295168\n美体健身\t295169\n尸工\t295170\n凶悍\t295171\n修色\t295172\n1024*768\t295173\n新市街道\t
295174\n零售点\t295175\n金美孚\t295176\n张馨\t295177\n天堂湾\t295178\n迪厅\t295179\n需谨慎\t295180\n疑字\t295181\n龙神号\t295182\nPOSITION\t295183\nSCI\t295184\nzhongkao\t295185\nobsolete\t295186\nROS\t295187\n华徐公路\t295188\nmpNews\t295189\n柒末雪\t295190\n19届\t295191\n几分钟\t295192\n文体\t295193\n甲酚\t295194\nnargin\t295195\n口琴曲\t295196\n2012-11-24\t295197\n牛背梁\t295198\n中粮屯河\t295199\n辣文\t295200\nnuit\t295201\n康斯坦茨\t295202\nfacilitate\t295203\nThrowable\t295204\n骨维力\t295205\nfdd\t295206\n广西师大\t295207\ncenters\t295208\n新商报\t295209\n曩\t295210\n小树林\t295211\n镇级\t295212\n007理财网\t295213\n阴湿\t295214\n封袋\t295215\n20160217\t295216\n汉米尔顿\t295217\n世界卫生组织国际癌症研究机构\t295218\n金属化\t295219\n墨道\t295220\n石门营\t295221\n街电充电宝\t295222\n5820\t295223\n密勒\t295224\nwrapper\t295225\n压塌\t295226\naerospace\t295227\n五分钟左右\t295228\n疝\t295229\njavaapi\t295230\nawaken\t295231\nSQLCipher\t295232\n阴口\t295233\n不变更\t295234\nContainment\t295235\n招标采购综合网\t295236\nroca\t295237\n深圳市龙华区中心医院\t295238\n原发性醛固酮增多症\t295239\n光溜溜\t295240\n华泰紫金\t295241\n罗技g610\t295242\nsieve\t295243\n华夏基金管理有限公司\t295244\n内构\t295245\n飘着\t295246\n泰莉莎·帕尔墨\t295247\n市里\t295248\nsock\t295249\n瀑\t295250\n大普\t295251\n肺气\t295252\n心胸狭隘\t295253\n英菲尼迪Q50L\t295254\n草本乳膏\t295255\nINTEGER\t295256\notter\t295257\n最终幻想3\t295258\n_灯具/照明\t295259\n六十五周年\t295260\n普兒\t295261\n晋宁县\t295262\n广誉远\t295263\n同煤集团\t295264\n_衣\t295265\n200L\t295266\n北京市第一中级人民法院\t295267\n赤线\t295268\n89623514\t295269\ncc攻击\t295270\n二氧化氯发生器\t295271\n蜜月岛\t295272\n林糊糊\t295273\n重庆房地产职业学院\t295274\n啃老\t295275\n婕茜\t295276\nauburn\t295277\n奥派\t295278\n左心耳\t295279\n气血虚\t295280\n长江中文\t295281\n杉田智\t295282\n十公分\t295283\n宝马玛莎拉蒂\t295284\n伸直\t295285\n北京大学化学与分子工程学院\t295286\n纸页\t295287\n诗豪\t295288\n发现\t295289\n心王\t295290\n全联\t295291\n年轮\t295292\nsignin\t295293\n第五张\t295294\n淘宝汽车\t295295\nmp4高清\t295296\n救护队\t295297\n欧拉角\t295298\n内存门\t295299\n自由基反应\t295300\nkaraf\t295301\n75篇\t295302\n浴盆\t295303\nJF\t295304\n北欧航空\t295305\n3dsMAX\t295306\nLOGIN\t295307\n列头\t295308\nEMPTY\t295309\n亦庄国投\t295310\nJuniper\t295311\n门罗币\t295312\nScanjet\t2
95313\n内附\t295314\n莎草\t295315\n诺亚幻想\t295316\n二鬼\t295317\n温州外国语学校\t295318\n浙江工商大学杭州商学院\t295319\n荀子\t295320\n650分\t295321\n春芽杯\t295322\n麻痛\t295323\n茶吧\t295324\n老婆子\t295325\n广西地区\t295326\n雪地胎\t295327\n杨建平\t295328\n游隼\t295329\nMagica\t295330\n东南西北中\t295331\nUU个性网\t295332\n大角\t295333\n一把手\t295334\nproduced\t295335\n7下\t295336\n广园\t295337\n海鱼\t295338\n圆锥式破碎机\t295339\n黑木崖\t295340\n1687\t295341\n倍压整流电路\t295342\n中邮建技术有限公司\t295343\n李贵\t295344\n背刺\t295345\n第五十七章\t295346\n口头\t295347\npspgo\t295348\n马郁\t295349\n4月26日\t295350\n国家地理\t295351\n墨香铜臭\t295352\n宝通科技\t295353\n1154\t295354\n9.5.7\t295355\n48路\t295356\n人车\t295357\n周南中学\t295358\n康强\t295359\n杭汽轮\t295360\nkoi\t295361\nlastIndexOf\t295362\nwea\t295363\n安全阀\t295364\nv2.3.8\t295365\n100天后\t295366\n宝山镇\t295367\nwy紫陌\t295368\n凑\t295369\nunf\t295370\nAlice\t295371\n乙基麦芽酚\t295372\n单套\t295373\n哈纳\t295374\nhezhiyao\t295375\n康欣\t295376\n右脚\t295377\nps破解版\t295378\n1435\t295379\nPLGA\t295380\n广州市律师协会\t295381\n税延养老险\t295382\nDUKE\t295383\n剃须膏\t295384\n卓健\t295385\neft\t295386\n新百\t295387\n练塘镇\t295388\nsteps\t295389\n正觉寺\t295390\n广西建工集团\t295391\n乐曲\t295392\n乱雨\t295393\n第52期\t295394\n王立刚\t295395\n阴离子\t295396\n省科技厅\t295397\nCUBE\t295398\nTeraCopy\t295399\n极具\t295400\n黄土地\t295401\n快捷键锁屏\t295402\nete\t295403\nZotero\t295404\ncod\t295405\nCodeMirror\t295406\n省检察院\t295407\n猴币\t295408\negg\t295409\nDPR\t295410\n绿杨\t295411\n国家铁路局\t295412\nTurk\t295413\nmojang\t295414\n120t\t295415\nBAN\t295416\n潮韵\t295417\n武汉中建\t295418\n原始人\t295419\nMaria\t295420\nMiscellaneous\t295421\n2018五一套\t295422\nDesire\t295423\n银行存管\t295424\n吉祥颂\t295425\n长高集团\t295426\nnovelty\t295427\n供应业\t295428\n建伍\t295429\ngtd\t295430\nMass\t295431\nenlarge\t295432\n亚麻布\t295433\nkoreader\t295434\nIreland\t295435\n退散\t295436\nuialertcontroller\t295437\n手工业\t295438\n迪士尼\t295439\n浙江大学学报\t295440\n堵嘴\t295441\n2018年4月26\t295442\n南京陆军指挥学院\t295443\n日本动画公司\t295444\nIZOA\t295445\n深圳移动\t295446\n0533\t295447\n缮\t295448\n黏胶\t295449\n洗眼器\t295450\n足以\t295451\n同比\t295452\nboob\t295453\n激磁\t295454\n大白金\t
295455\n道光\t295456\n海洋在线\t295457\n鬼丈夫\t295458\n宁夏日报\t295459\n迅游手游加速器\t295460\n瘀青\t295461\n夺情\t295462\n企业新闻-1024商务网\t295463\nscore\t295464\n强尼\t295465\nRadios\t295466\nasses\t295467\n赚大了\t295468\n第832集\t295469\n徐浪\t295470\n阿里p7\t295471\n网纹草\t295472\nprotopie\t295473\ncgroup\t295474\nSALT\t295475\n透视装\t295476\n5.\t295477\n自弹\t295478\n快乐到死\t295479\n6410\t295480\n猎枪\t295481\n礼包券\t295482\n光动能\t295483\n三控\t295484\n天衍\t295485\n部属\t295486\n柳芽\t295487\nt430u\t295488\n中跟\t295489\n压痕机\t295490\nxuniji\t295491\n赣州市公共资源交易中心\t295492\n秋词\t295493\n51单片机\t295494\n折达\t295495\n晶圆\t295496\n通古斯\t295497\n己定\t295498\n光华剑桥\t295499\n腾讯电脑管家\t295500\n星耀局\t295501\n泰格豪雅\t295502\n吹风会\t295503\n北京中原地产\t295504\n断指\t295505\n泌尿系统\t295506\n风采展\t295507\n张府园\t295508\n刺儿\t295509\n绵阳南郊机场\t295510\nav_\t295511\n十五项\t295512\n狐狸精\t295513\n奇迹暖暖联盟\t295514\n空调遥控器\t295515\n和美\t295516\n麦积山\t295517\n贵州电网公司\t295518\n福建省莆田市纪委监察局\t295519\nRunningMan\t295520\nぎ\t295521\n接案\t295522\n登州府\t295523\n大体积混凝土\t295524\n悦湖\t295525\nwolfskin\t295526\n御劲\t295527\n杭州南\t295528\nxmind6\t295529\n北京景区\t295530\n金块\t295531\nHD\t295532\n路人王\t295533\n粳米\t295534\n管链输送机\t295535\n广州耐克\t295536\n野泳\t295537\nSayings\t295538\n途风旅游网\t295539\n入宪\t295540\n嫁给他\t295541\n函数fx\t295542\n百分位\t295543\n睢城镇\t295544\n山底\t295545\n短寿\t295546\n唱k\t295547\n2978\t295548\n袁术\t295549\nSIAM\t295550\nmaridb\t295551\nvivox5pro\t295552\n马鞍山市教育局\t295553\n虎胆龙威5\t295554\nUnify\t295555\n125CC\t295556\ncrystalline\t295557\n业务部\t295558\n磊落\t295559\n健身会所\t295560\n毛姆\t295561\n第57届\t295562\n龙的传说\t295563\n天通苑北一区\t295564\n蛙趣\t295565\nGen9\t295566\nk9999\t295567\n气儿\t295568\n杭椒\t295569\n路透\t295570\nZbook\t295571\n全队\t295572\nSonnet\t295573\n金老师\t295574\n亿千瓦\t295575\n喜隆多\t295576\n大渡河\t295577\n五帝钱\t295578\n石晶\t295579\n转运四方\t295580\n0家\t295581\nBanker\t295582\n维金斯\t295583\nAlgebraic\t295584\n职业经理人\t295585\n越策\t295586\nnamespace\t295587\n私募融资\t295588\n西江千户苗寨\t295589\n吊线\t295590\n拆临\t295591\nXxx\t295592\n双非院校\t295593\n芷江侗族自治县\t295594\nsubstring函数\t295595\n地水\t295596\n席慕蓉\t295597\n浙财行\t29
5598\n萌图\t295599\n10.3.3越狱\t295600\n撞色\t295601\n神码\t295602\n钺\t295603\n领英\t295604\n黄豆豆\t295605\n乐纤\t295606\n春秋时期\t295607\n卢安娜\t295608\nlixue\t295609\nmirroring\t295610\n黑龙江网络广播电视台\t295611\n对乙酰氨基酚栓\t295612\n物业管理专业\t295613\n缴获\t295614\n开展移风易俗\t295615\n抻\t295616\n一剂\t295617\n中央电视塔\t295618\n动画播放器\t295619\nOLAY\t295620\n梦华\t295621\n责任编辑\t295622\n清卡\t295623\n2017国家自然科学基金\t295624\n山西省统计局\t295625\n刀兵\t295626\n钴60\t295627\n明日学院\t295628\n第67集\t295629\n台儿庄区\t295630\n500h\t295631\n热熔涂料\t295632\n外强中干\t295633\n茫\t295634\n沈阳化工大学\t295635\n北京工商\t295636\n廣場\t295637\n2016年10月14日\t295638\n超星\t295639\n马克斯韦伯\t295640\n深圳传音控股有限公司\t295641\n莫愁湖公园\t295642\n洁面霜\t295643\nMotor\t295644\n财经郎眼\t295645\neye\t295646\n斯帝罗兰\t295647\n腰贴\t295648\n弹壳\t295649\n72段\t295650\n上网费\t295651\ngbt\t295652\n细究\t295653\n17幢\t295654\nnodepad\t295655\n官兵\t295656\n独生子\t295657\n搞笑群\t295658\n二氧化氯\t295659\n形势与政策\t295660\n极速前进\t295661\n高井\t295662\n香港旺角\t295663\n第8话\t295664\n单独二孩\t295665\n百二\t295666\nPHP5\t295667\n地景\t295668\nbugs\t295669\n中国金融出版社\t295670\n耀神\t295671\n绿苑小区\t295672\nmassimo\t295673\n成都办公室\t295674\n广州市\t295675\n300V\t295676\n踉踉跄跄\t295677\n芭蕾舞女\t295678\n小溪\t295679\n望京南\t295680\n舞乐传奇\t295681\n马街\t295682\n有很多种\t295683\n草料\t295684\n镇隆\t295685\n触摸显示器\t295686\n深圳湾科技生态园\t295687\nslog\t295688\n母仪天下\t295689\n第9周\t295690\n点标\t295691\nGRAY\t295692\n递减\t295693\n打江山\t295694\n名苑\t295695\nyupoo\t295696\n陈爽\t295697\nCareers\t295698\nvmoptions\t295699\n两文\t295700\n新杯\t295701\n驭灵师\t295702\nHennessy\t295703\njunction\t295704\n冷月\t295705\n阿里架构师\t295706\n印机\t295707\n抢票\t295708\n每个子\t295709\n晚于\t295710\n城轨\t295711\n正月初二\t295712\n王伟中\t295713\nBlackMatrix\t295714\n魔皇\t295715\n袖带\t295716\n门锁\t295717\n腹\t295718\n3536\t295719\najax\t295720\nxb271hu\t295721\n羞耻感\t295722\n肠镜\t295723\n先民\t295724\ndimethyl\t295725\n联想u410\t295726\nphpthink\t295727\n紫光集团有限公司\t295728\nPIXIV\t295729\n主女\t295730\n板材\t295731\n不要走动\t295732\nzh7314\t295733\nFG\t295734\n风寒感冒颗粒\t295735\n猜中\t295736\nFPSO\t295737\nRosewood\t295738\n海盗团\t295739\nvti\t295740\n摩根\t2957
41\n个别化\t295742\n天狮娱乐\t295743\n寻访\t295744\n辐射3\t295745\n劝诱\t295746\n腹外斜肌\t295747\n偌大\t295748\n中国市政工程西北设计研究院有限公司\t295749\n紫苹果\t295750\n西漳\t295751\nheavenly\t295752\n乙酸乙酯\t295753\n占道\t295754\n失利\t295755\nGoldenDict\t295756\n美仑\t295757\n饰材\t295758\n战栗\t295759\n司局级\t295760\n南宫嘉骏\t295761\n劳动卫生\t295762\n利奥\t295763\n自留\t295764\n7RF\t295765\n少数民族\t295766\n周水\t295767\n云南电网有限责任公司\t295768\n啄木鸟儿\t295769\n丽柜\t295770\nM14\t295771\n凭心而论\t295772\n大立科技\t295773\n未來\t295774\ndnx\t295775\n暗拍\t295776\n归谷\t295777\noric\t295778\n教培\t295779\n北京市社会科学院\t295780\n当初\t295781\n临时变量\t295782\n日版花样男子\t295783\n数张\t295784\n周五\t295785\n吃奶子\t295786\n楚楚街\t295787\n故意伤人\t295788\nSTM32F407\t295789\n161号\t295790\n择业观\t295791\n砖工\t295792\n【羽生\t295793\n完满结束\t295794\n一个男\t295795\namer\t295796\n站短\t295797\n002864\t295798\n大白虾\t295799\n空调加氟\t295800\n超级红人节\t295801\n搅局者\t295802\n歌曲集\t295803\nmpacc\t295804\n大红瓶\t295805\n谭政\t295806\n滚刷\t295807\n普君新城\t295808\nFonsi\t295809\n九真异常生物见闻录\t295810\n易木\t295811\n网络操作系统\t295812\n科鲁奇骏\t295813\n一大块\t295814\n3720\t295815\n袁飞\t295816\n守成\t295817\n暴皮\t295818\n吴佳煜\t295819\n开源库\t295820\n东洲区\t295821\n金科九曲河\t295822\n国家安全法\t295823\nSupply\t295824\n多多空包网\t295825\n文本框\t295826\nDVD-MP4\t295827\n86400秒\t295828\n两杆\t295829\n秆\t295830\n美生\t295831\n虚拟打印\t295832\n星辰医统江山\t295833\n雅高乐雅会\t295834\n493\t295835\n网站\t295836\n端州区\t295837\n雷娜\t295838\n5065\t295839\nSC2020\t295840\n当老板\t295841\n强强联手\t295842\n抵押贷款\t295843\nHeaven\t295844\n宝芝林\t295845\n窗外\t295846\n菌子\t295847\n散华礼弥\t295848\n火影忍者分析\t295849\nfetch_row\t295850\n南昌湾\t295851\n下架设\t295852\n南京政府网\t295853\n羁绊者\t295854\n石蕊\t295855\n膀胱\t295856\nNo.6\t295857\n倾倾\t295858\nHKEX\t295859\n珠海新闻网\t295860\n马驹\t295861\n杨海军\t295862\n白云山\t295863\n龙华富士康\t295864\n白管\t295865\n电建地产\t295866\n佛顶\t295867\n孟飞\t295868\n橋本\t295869\n686\t295870\n主班\t295871\n36例\t295872\n七叶一枝花\t295873\n贵州民族学院\t295874\n大学生志愿服务西部计划\t295875\n借钱\t295876\n800mm\t295877\nBie\t295878\n江苏省委党校\t295879\n乙型肝炎\t295880\n浮点型\t295881\n支教团\t295882\nVRML\t295883\n路面砖\t295884\n凯悦酒店集团\t295885\n灌篮高手主题曲\t2958
86\n造价者网\t295887\n竣\t295888\n长隆飞鸟乐园\t295889\n1.99\t295890\n中国竹具工艺城\t295891\n想知\t295892\nqq分组\t295893\n非公开定向债务融资工具\t295894\n真干真\t295895\n洪源\t295896\n外热\t295897\n石鱼\t295898\nZipkin\t295899\n擒龙\t295900\n百瑞地产网\t295901\nDISCIPLINE\t295902\nHAI\t295903\n魟鱼\t295904\n零息债券\t295905\n20160708\t295906\n悩\t295907\n钟函谷\t295908\n色射撸撸干草艹\t295909\n超级变态\t295910\nRes\t295911\n合肥经济技术开发区\t295912\n哈萨克斯\t295913\n金壳\t295914\n豆腐宴\t295915\nH3C防火墙\t295916\n墙贴\t295917\nCoke\t295918\n侍候\t295919\n三茅人力资源网\t295920\n名侦探柯南主题曲\t295921\n弦枕\t295922\n20151129\t295923\nxpj\t295924\n奈落\t295925\n肃穆\t295926\nLayer\t295927\n两岸一家亲\t295928\n6.07\t295929\n胶济铁路\t295930\n厌倦\t295931\n校草\t295932\nCRTS\t295933\n卡威\t295934\n南京青奥会\t295935\n377号\t295936\n玉版\t295937\n脲醛树脂\t295938\nopenWRT\t295939\n庄村\t295940\n鸿篇\t295941\nd3d\t295942\n2275\t295943\nmarin\t295944\nLiar\t295945\nIcepak\t295946\n止规\t295947\ngif卵蛋网\t295948\n普利兹克奖\t295949\ncritique\t295950\nOctave\t295951\n党项\t295952\n精华制药\t295953\n济州岛\t295954\n这些梗\t295955\n布币\t295956\n天津滨\t295957\n4.5厘米\t295958\n一叶斋\t295959\n中飞院\t295960\n百炼成仙\t295961\n梁龙\t295962\n九宫庙\t295963\njhb\t295964\n叶少\t295965\n赛车类\t295966\nDATA\t295967\n宏辉\t295968\n花土\t295969\nCNR\t295970\n阈值法\t295971\n宁波外国语学校\t295972\nPat\t295973\n君合\t295974\n6匹\t295975\n地质学家\t295976\n郭达\t295977\nyun007\t295978\n中信银行信用卡\t295979\n散装机\t295980\n顾斌\t295981\n反差\t295982\nwin7系统旗舰版\t295983\n全碟\t295984\n长安cs95\t295985\n张发财\t295986\n力作\t295987\n瑞士卷\t295988\n地底\t295989\n菜刀\t295990\n8p\t295991\n丰田锐志\t295992\nRCU\t295993\nnocturne\t295994\n彻夜不眠\t295995\nM1536dnf\t295996\n黄岩区人民政府\t295997\n平度市\t295998\n英亚\t295999\nSuction\t296000\n第七册\t296001\n8万起\t296002\n弹幕版\t296003\n十九大三中全会\t296004\n伪分布式\t296005\n中华人民共和国第十三届全国人民代表大会\t296006\n佛系少女\t296007\n梅尔·吉布森\t296008\n广州南洋理工职业学院\t296009\nAB型血\t296010\n竞然\t296011\n四世\t296012\n富士施乐(中国)有限公司\t296013\n新都区\t296014\ninferred\t296015\n奥卡西平片\t296016\n胡冰卿\t296017\n密度函数\t296018\n12&\t296019\n报销单\t296020\n御定\t296021\n河东村\t296022\n中国茶网\t296023\n特拉法尔加罗\t296024\n第一户外\t296025\n陈永仁\t296026\n闭\t296027\n时代周刊\t
296028\n算符\t296029\n均属\t296030\n神经刀与飞天猫\t296031\nFPS类\t296032\nmemory\t296033\n洪炉\t296034\n超级机器人大战z\t296035\n中国菜谱网\t296036\nvsto\t296037\n舱室\t296038\n少林寺武术学校\t296039\n中神\t296040\nUEFA\t296041\n安神补心丸\t296042\n风险评估\t296043\nCC小助手\t296044\n新一贷\t296045\n丛书\t296046\n北陵\t296047\n丁字湾\t296048\n电动刮胡刀\t296049\n满分卷\t296050\nCLOSED\t296051\nled控制卡\t296052\n900mm\t296053\n0.8.5\t296054\n拉格朗日中值定理\t296055\n梅花奖\t296056\n凝重\t296057\n巧克力派\t296058\n芙蓉王\t296059\nPage\t296060\n孙胜\t296061\n浙商保险\t296062\n林中\t296063\n精量\t296064\n抵缴\t296065\n迷修\t296066\n牛汇\t296067\n引语\t296068\ngtavc\t296069\n佛手\t296070\n吸鳅\t296071\n宝马7系\t296072\n赞助版\t296073\nTidal\t296074\n盖房子\t296075\nP3015\t296076\nWebIM\t296077\nPepsi\t296078\n2018-01-05\t296079\n逗游攻略中心\t296080\n治沙\t296081\n武汉搬家公司\t296082\nrolls\t296083\n全身心\t296084\n归国潮\t296085\n天鹅湖万达广场\t296086\n南运河\t296087\n交通篇\t296088\n重生之官路商途\t296089\n火锅底料\t296090\n王永飞\t296091\n腐女们\t296092\n黄霖\t296093\n公投\t296094\n沉静\t296095\nǚ\t296096\n稚拙\t296097\n东城区\t296098\n苗绣\t296099\n宁波联合\t296100\n北宸\t296101\n清月\t296102\n毕业证明\t296103\nMGR\t296104\nNTP服务器\t296105\n科华\t296106\n笋岗街道\t296107\n朋友们\t296108\n对路\t296109\n教育史\t296110\n魔芋精粉\t296111\n酒展\t296112\nRong\t296113\n金鹰奖\t296114\n彝族舞曲\t296115\n往前走\t296116\n尊礼\t296117\n岩下\t296118\nanastasia\t296119\nPano\t296120\n素材中国16素材网\t296121\n乐Max2论坛\t296122\n乐山市人民医院\t296123\n实用新型\t296124\n华中师范大学化学学院\t296125\n黑胡桃木\t296126\nneng\t296127\n情理\t296128\n医学生\t296129\ndie\t296130\nimap\t296131\n打冷\t296132\n豪龙\t296133\n荣耀卡顿\t296134\n兄妹\t296135\n桑枝\t296136\n官居几品\t296137\n深航\t296138\n末页\t296139\n大连八中\t296140\nErasmus\t296141\n短暂\t296142\n陶冶\t296143\n四川广播电视大学\t296144\n套读\t296145\n胜利\t296146\nF-22\t296147\n开复\t296148\n莲花山庄\t296149\nm2ts\t296150\n压口\t296151\n触发子组件\t296152\n2011年2月\t296153\nCPCI\t296154\n18056期\t296155\n物器\t296156\n梅根福克斯\t296157\n素匠泰茶\t296158\n碗柜\t296159\n新时代传习所\t296160\n月出云\t296161\n油榨街\t296162\n惩击\t296163\n南京邮电大学通达学院\t296164\n茶面\t296165\n超次元游戏海王星重生\t296166\nTIF格式\t296167\n300300\t296168\n分出\t296169\n流放者\t296170\n编曲吧\t296171\n贱客\t296172\
n1871\t296173\n沙哑\t296174\n驴\t296175\nfunctionality\t296176\nkkkk\t296177\n北仑区政府\t296178\n黑撒乐队\t296179\n蜜蜂仔\t296180\n抽取式\t296181\n方面军\t296182\nchui\t296183\n贵州省教育厅\t296184\n天仁\t296185\n民权网\t296186\n深达\t296187\n跑线\t296188\nMv\t296189\n广东广雅中学\t296190\n被忽视\t296191\n疏通器\t296192\n海普诺凯\t296193\n张家川县\t296194\n社会心理学\t296195\n万里马\t296196\n定积分\t296197\n吉木萨尔\t296198\n1835\t296199\n高分子化合物\t296200\nPutting\t296201\n刘小溪\t296202\n洁牙器\t296203\n矮牵牛\t296204\n试算\t296205\n齐奏\t296206\n联合国难民署\t296207\n氧泵\t296208\n张无忌\t296209\n急送\t296210\n右江日报多媒体数字报\t296211\n北京招商网\t296212\n摇篮杯\t296213\n3.2L\t296214\n培训师\t296215\n李玉山\t296216\n金蜜蜂\t296217\n初孕\t296218\n合种树\t296219\nxm\t296220\nskid\t296221\n罗裳\t296222\nrsa2\t296223\n尼克杨\t296224\n平交道口\t296225\ntomo\t296226\n云屏\t296227\n50所\t296228\nstructs\t296229\n苍蓝钢铁的琶音\t296230\nCXH99.COM\t296231\n自耕\t296232\n打墙\t296233\n骨伤\t296234\n北荆棘谷\t296235\n600085\t296236\n1.2.3.4\t296237\n翻译文\t296238\n培正中学\t296239\n虞长君\t296240\n七叔\t296241\n泓\t296242\nCCBY\t296243\nphd\t296244\n记功\t296245\nPPT2013\t296246\n5冠\t296247\n第五十五条\t296248\n漕湖\t296249\n阴水\t296250\n研究生们\t296251\nHugh\t296252\nOpenSUSE\t296253\n1x2\t296254\n_根\t296255\n金南\t296256\n智慧城市论坛\t296257\n肝吸虫\t296258\n雅桑克莱\t296259\n校董\t296260\n鉴定学\t296261\n美林湾\t296262\nYohji\t296263\neconomical\t296264\n刘和\t296265\n跑吧\t296266\n死婴\t296267\nepoxy\t296268\n俊波\t296269\n格力电器\t296270\n喵星人大战\t296271\nAtmospheric\t296272\n耐火\t296273\nGPL\t296274\n最低工资标准2016\t296275\n固元膏\t296276\n瘦马\t296277\n董力\t296278\nkuaijiejian\t296279\ndfx\t296280\nSPD\t296281\n水乳交融\t296282\n九龙村\t296283\nMT2\t296284\n莴苣叶\t296285\n妒\t296286\n中铁银通\t296287\n55元\t296288\nhomography\t296289\n厦门大学建筑与土木工程学院\t296290\nRacket\t296291\n郎静山\t296292\ntcl集团\t296293\n拉夫堡大学\t296294\n上海平和双语学校\t296295\n后因\t296296\n台体\t296297\n接子\t296298\nClassic\t296299\ntalented\t296300\n达喜\t296301\n特玩网流放之路\t296302\n站姿\t296303\n工伤保险条例\t296304\n飞神\t296305\ntandon\t296306\n马尾区\t296307\n盖板\t296308\n商车网\t296309\nAlfredZhao\t296310\n雷克萨斯nx\t296311\n字联\t296312\n宠物狗狗\t296313\n桐城派\t296314\n毕马威这四大会
计师事务所\t296315\n柯岩\t296316\nlpc\t296317\n四川建龙\t296318\n2464\t296319\n启承\t296320\nOFF-WHITE\t296321\n1&#41\t296322\n失声\t296323\n全房\t296324\n鹤峰网\t296325\n飞傲X7\t296326\n周立纳达尔\t296327\n第十八天\t296328\n白佛客运站\t296329\n五羊\t296330\n后爱\t296331\nauth2.0\t296332\n鼻渊通窍颗粒\t296333\n花海仓\t296334\n安全浏览器\t296335\nflask-sqlalchemy\t296336\n枪侠\t296337\n八一钢铁\t296338\nFinePix\t296339\nPML\t296340\n800日元\t296341\n人尽其才\t296342\n天工树\t296343\n杂交瘤\t296344\n龙舟赛\t296345\n荔湾路\t296346\n神经纤维\t296347\n衙内\t296348\n绝代双骄\t296349\n941\t296350\n荣耀路由Pro\t296351\nADAS\t296352\nHubble\t296353\n清洗机\t296354\n2男\t296355\napache\t296356\n同字\t296357\n临湘\t296358\n警花\t296359\n刘乐妍\t296360\n新浦区\t296361\nradiogroup\t296362\nVPDN\t296363\n绫香\t296364\nShelter\t296365\n百度云盘资源分享+115网盘\t296366\n珠海格力电器股份有限公司\t296367\n若林\t296368\n新奥拓\t296369\n拌面酱\t296370\n下栽\t296371\n定园\t296372\nDto\t296373\n雁江\t296374\nGTP\t296375\n萨摩耶俱乐部\t296376\n市电\t296377\n三起三落\t296378\n高阳科技\t296379\n导风\t296380\nvial\t296381\n中国建筑工程总公司\t296382\n马克思恩格斯选集\t296383\n50平米\t296384\n98分\t296385\ncmseasy\t296386\n韭菜饼\t296387\n九机网\t296388\n娜塔丽\t296389\n安联球场\t296390\n太钢不锈\t296391\n2250\t296392\n凌源党建网\t296393\n小老鼠\t296394\n自抗扰\t296395\n收购点\t296396\n捎\t296397\n迈克菲\t296398\n不愉快\t296399\n跑酷\t296400\nFsTaoci\t296401\n渑池县\t296402\n代开\t296403\n人鬼情未了\t296404\n支票\t296405\n提行\t296406\n野火版\t296407\n陶乐\t296408\n相师\t296409\n奥的斯电梯\t296410\n鬼掹脚\t296411\n90后吧\t296412\n杜波\t296413\nPowerEdge\t296414\n小拍\t296415\nEasyWeChat\t296416\n香港中学\t296417\n南心网\t296418\nNPH\t296419\n姜堰区教师发展中心\t296420\n教师篇\t296421\n3399\t296422\n马哈\t296423\n高升桥\t296424\n橡筋\t296425\n966\t296426\nFeO\t296427\n旗忠\t296428\n在心上\t296429\n抗日战争暨世界反法西斯战争\t296430\napt-key\t296431\n花容\t296432\noracl\t296433\n白水晶\t296434\nHardFault\t296435\n85斤\t296436\n泰山区\t296437\n20150421\t296438\n14px\t296439\n火车南站\t296440\n进化史\t296441\n引擎盖\t296442\n书里\t296443\n咳痰\t296444\n翼风综合社区\t296445\n汝州市\t296446\n石榴汁\t296447\n20170728\t296448\nika\t296449\n深圳市人才服务中心\t296450\n凯莎\t296451\n奥利机场\t296452\nvivoX9s\t296453\n玄空\t296454\n高达模型\t296455\n篮球公园\
t296456\nlaby\t296457\n保山市纪委监察局\t296458\ngather\t296459\nkeyword\t296460\n四人间\t296461\nAPP软件辅助器\t296462\nsnapgene\t296463\n火链\t296464\n0.3.3\t296465\n迈普路由器\t296466\n五江湾\t296467\n华为浏览器\t296468\n2017-11-17\t296469\n皇家墨尔本理工大学\t296470\n情暖\t296471\n相频\t296472\n培正\t296473\n购物车\t296474\nAirbus\t296475\ndaqing\t296476\n蝠鲼\t296477\n慰问\t296478\n300公里\t296479\n凌仕\t296480\nACME\t296481\nrailgun\t296482\n邵阳北站\t296483\n鄞州政府网\t296484\n9.3.4\t296485\n江渚\t296486\n四分之一\t296487\n金刚石\t296488\n证明单\t296489\n600首\t296490\n微软商店\t296491\n基金从业资格证\t296492\n偏偏\t296493\nAlexnet\t296494\n淮南市大通区政府\t296495\n四海八荒\t296496\n毛主\t296497\n牛肉酱\t296498\n泥浆门\t296499\nintel\t296500\n养老型\t296501\n里弗斯\t296502\n市桥街\t296503\ncbbe\t296504\n费恩莱斯\t296505\n海泉湾\t296506\nTL431\t296507\n加注机\t296508\n等会\t296509\nPDC\t296510\n广州二中\t296511\n珠海\t296512\n英典\t296513\n风云突变\t296514\n推免生\t296515\n900型\t296516\n9787\t296517\nbsl\t296518\n鸽子窝公园\t296519\n很简\t296520\n寄件人\t296521\n出流\t296522\n小于\t296523\n晋源\t296524\n章水镇\t296525\n㊣\t296526\n南京大学继续教育学院\t296527\n袁春风\t296528\nCruise\t296529\n非凡\t296530\n六安站\t296531\n微内核\t296532\nv4包\t296533\n剑网三交易\t296534\n超鬼王\t296535\nMsgBox\t296536\n晋商贷\t296537\n20号\t296538\n宣亚\t296539\n邀请人\t296540\n粉状\t296541\n搜问问|雅兴\t296542\n无锡博物院\t296543\n中集集团\t296544\n1147\t296545\n在线播放\t296546\n琴女\t296547\n800千伏特\t296548\n盛景网联\t296549\n鸡翅木\t296550\n印尼语\t296551\n肖涵\t296552\n故宫博物院\t296553\n造价师\t296554\nCHK\t296555\n中稻\t296556\n第137集\t296557\n金园一路\t296558\n阴道镜\t296559\n爱阅小说网\t296560\n中国科学院力学研究所\t296561\n领导班\t296562\n遗落的南境1\t296563\n4018\t296564\n素质教育\t296565\nNAT模式\t296566\n20150713\t296567\n撒由那拉\t296568\n愫\t296569\nJMX\t296570\n真选组\t296571\n挡风板\t296572\n星际战甲平原\t296573\n地域名\t296574\n家下\t296575\n九尾猫\t296576\nSOCKS\t296577\nj2ee\t296578\n冲服\t296579\n都市类\t296580\ninca\t296581\n_日语\t296582\n_六美网\t296583\n魏书\t296584\nKiroro\t296585\nVSA\t296586\n肉龙\t296587\n排样\t296588\n乔装打扮\t296589\n4.2.3.5\t296590\n亮洁\t296591\n万事利\t296592\n狮城\t296593\nHiFi\t296594\n6063\t296595\n杨洁\t296596\nhtml5lib\t296597\n文本块\t296598\nhist\t296599\n
明步\t296600\nEHS\t296601\nstrategies\t296602\nBinomial\t296603\n偏光\t296604\nwoman\t296605\nSpaceX\t296606\n长江日报\t296607\n蜂鸣片\t296608\n20150309\t296609\n得润电子\t296610\n乖乖听话\t296611\n朱楼\t296612\nconsult\t296613\n886n\t296614\n总调\t296615\n入心\t296616\n经营负债\t296617\n当家作主\t296618\n钤印\t296619\n中华者\t296620\ni77700k\t296621\n收破\t296622\n玉米油\t296623\n北京丰台\t296624\nmappings\t296625\n保山市\t296626\n惠台\t296627\nphl\t296628\n荣耀60帧\t296629\n糖豆豆\t296630\n正负形\t296631\n吃秀\t296632\nthoughtworks\t296633\n院桥\t296634\n6M\t296635\n郭丽\t296636\n遵义市民政府\t296637\n杜亚修\t296638\n苗可秀\t296639\n祥符\t296640\n1.78\t296641\n绿城百合公寓\t296642\n几波\t296643\n3单\t296644\n20170317\t296645\n必备条件\t296646\n一汽轿车股份有限公司\t296647\n郑华\t296648\n主旨句\t296649\n六十六\t296650\n诺基亚\t296651\n张广泰\t296652\n小村医\t296653\n郭紫欣\t296654\n哈萨克斯坦坚戈\t296655\nProtection\t296656\n枪弹\t296657\n砸场\t296658\n制管机\t296659\n签到台\t296660\n中心\t296661\n平狄克\t296662\n吉隆坡国际机场\t296663\n山西大医院\t296664\n谢旭辉\t296665\nmgstage\t296666\nhysys\t296667\n物流部\t296668\n高等教育学\t296669\n极火虾\t296670\n晶板\t296671\npassbook\t296672\n平顶山叶县\t296673\n草蛋网\t296674\n缓\t296675\n上海金融\t296676\n岭南园林股份有限公司\t296677\n特许人\t296678\n篆刻展\t296679\n51wan\t296680\n小米黑鲨VS努比亚红魔\t296681\n性寒\t296682\nRu\t296683\n明面\t296684\n徐建平\t296685\n不搞\t296686\n道王\t296687\n函授本科\t296688\n一关\t296689\nFOR循环\t296690\nMp4\t296691\n六博\t296692\n肃静\t296693\n甲米镇\t296694\n5匹\t296695\n萧友梅\t296696\noligo\t296697\n灌篮高手全国大赛\t296698\n天职国际\t296699\n猪蹄子\t296700\n奎木狼\t296701\ngenerations\t296702\n20160405\t296703\n夹子\t296704\n慕容\t296705\njizzbo\t296706\n3点钟\t296707\n叫兽\t296708\nQianChia\t296709\n鸟展\t296710\n劳动法\t296711\n精神力\t296712\n单名\t296713\n松田圣子\t296714\n狗市\t296715\n冠冕\t296716\ncinderella\t296717\nnvl函数\t296718\nOasis\t296719\n布朗\t296720\n小班化\t296721\n于都县\t296722\n登海\t296723\n第156章\t296724\n白橡\t296725\n100p\t296726\n海贼王无尽世界\t296727\ngmg\t296728\np3d\t296729\n疯情\t296730\n20d\t296731\n昊天SEO\t296732\nPlanar\t296733\n双币种\t296734\ncap\t296735\n湖北省自然科学基金\t296736\n中式\t296737\nmatmul\t296738\n熟识\t296739\n诺克\t296740\n客源地\t296741\n摄\t296742\n
第5部\t296743\n45\t296744\n芦苞镇\t296745\nkubernetes\t296746\n5.2a\t296747\nzengkefu\t296748\n白陶\t296749\n台州市人民政府办公室\t296750\n10086.cn\t296751\n涡旋压缩机\t296752\nSAC\t296753\n江枫\t296754\nu宝娱乐\t296755\n【物\t296756\n中原城市群\t296757\n见成效\t296758\nspaceengine\t296759\n第33届\t296760\n飞来横祸\t296761\n涉谷\t296762\n极路由hiwifi\t296763\n中国建材\t296764\n颞\t296765\n招标办\t296766\nMBC\t296767\nH5EDU\t296768\n冻干粉\t296769\n球盘\t296770\n运动神经元\t296771\nDaily\t296772\n马娘\t296773\n中车青岛四方机车车辆股份有限公司\t296774\n猫奴们\t296775\n碧蓝航线海事局\t296776\n中公教育广东分校\t296777\n7C\t296778\n青露\t296779\n珊瑚色\t296780\n加框\t296781\n混合式\t296782\n9999\t296783\n阿华\t296784\n_地\t296785\n55X9000E\t296786\n北奔\t296787\n股权转让税\t296788\nkvm切换器\t296789\nLISP\t296790\n中商资讯\t296791\n澳版\t296792\n云阅读\t296793\n崔洪万\t296794\n群像\t296795\n独山村\t296796\n泛珠三角\t296797\nStitch\t296798\n核头\t296799\n鸽子王\t296800\n喜剧类\t296801\n900\t296802\n浩径\t296803\n125mm\t296804\nPHILIPPE\t296805\n智慧石\t296806\n深层搅拌桩\t296807\nChiara\t296808\n中国乒乓球俱乐部\t296809\n人教版_阳光学习网\t296810\n起凡游戏论坛\t296811\nKudu\t296812\n鞋套机\t296813\nPrettify\t296814\nFX8350\t296815\npubsub\t296816\n渝广高速\t296817\n高林\t296818\n中级\t296819\nswatch\t296820\n中华会计网校_税务网校\t296821\n8.9亿美元\t296822\n合层\t296823\n木木吉他网\t296824\n男角\t296825\n跃龙门\t296826\n丑曦\t296827\n杰西\t296828\n忠奸\t296829\n深圳地铁16号线\t296830\n沈南\t296831\nrms\t296832\naniston\t296833\nlibsodium\t296834\nMX5\t296835\np1108\t296836\n团章\t296837\n天巡网Skyscanner\t296838\n韩荣\t296839\n1600K\t296840\n不要问\t296841\n北化股份\t296842\n中小幼\t296843\n山间\t296844\n无CD\t296845\njquery\t296846\n博鳌机场\t296847\n抛光垫\t296848\n滑精\t296849\n电子检务\t296850\nleukemia\t296851\n整备\t296852\n遭批\t296853\n股权转让_\t296854\n49集\t296855\n庫\t296856\n夹芯\t296857\n今天国际\t296858\n襄垣县\t296859\n财联社\t296860\n举个栗子\t296861\n模拟集成电路\t296862\n城市规划网\t296863\nILLUSION\t296864\nocz\t296865\n传祺gs3\t296866\n拉曼峰\t296867\n人民大学\t296868\n香蕉饼\t296869\n尼美舒利分散片\t296870\n吉林省科技厅\t296871\n剧目\t296872\n量子位\t296873\n众泰众泰T5002018款\t296874\n特种工\t296875\ng片\t296876\nHoa\t296877\n单株\t296878\n词篇\t296879\n4399爆枪\t296880\n闲钱\t296881\n14天后\t2968
82\n东风启辰\t296883\n科海\t296884\n花旦\t296885\nURl\t296886\n湖南医院\t296887\n教育博客\t296888\n陶瓷砖\t296889\n嬉戏谷\t296890\n扣货\t296891\n一语道\t296892\n监督站\t296893\n排期\t296894\n追月\t296895\n张翼德\t296896\n抱着\t296897\n块规\t296898\nIBatis\t296899\n最惠国\t296900\n火箭社\t296901\n深信服科技股份有限公司\t296902\nson\t296903\n火疖子\t296904\n秋生\t296905\n新闻稿\t296906\ncimatron\t296907\n葡萄糖酸钙注射液\t296908\n饱暖\t296909\n定制式\t296910\n一语道破\t296911\n详解版\t296912\n8265\t296913\n护肩\t296914\n信越\t296915\n奔雷\t296916\n1.1.9\t296917\n工会卡\t296918\nSANSUI\t296919\n爱化\t296920\n天不怕地不怕\t296921\ntycoon\t296922\n体力\t296923\n魅色\t296924\n俄亥俄\t296925\nanessa\t296926\n香子兰\t296927\nEine\t296928\n油漆\t296929\nRevit\t296930\n万生\t296931\nASTER\t296932\n李雪梅\t296933\n58速\t296934\n金韩彬\t296935\n淑女\t296936\n26部\t296937\nTok\t296938\n相手\t296939\noverlap\t296940\n[穿书\t296941\n电视家网\t296942\n餐费\t296943\n婚规\t296944\n接\t296945\n呆\t296946\n王滔\t296947\n骚年们\t296948\n业务流程图\t296949\n对射式\t296950\n570\t296951\n闲扯\t296952\n16年前\t296953\n世界上的另一个我\t296954\n尖椒\t296955\n眼唇\t296956\n2017.2.5\t296957\n天柱\t296958\n琼州学院\t296959\n重庆力帆\t296960\n哈默\t296961\n6.85\t296962\n华纺\t296963\n万分之五\t296964\n安徽名人馆\t296965\n历朝\t296966\nAT变速箱\t296967\n红桥政务网\t296968\n09\t296969\nPP手机助手\t296970\n数据港\t296971\n另一只猫\t296972\n含义\t296973\nlincense\t296974\n阿廖沙\t296975\n轮胎式\t296976\naccumulated\t296977\nkeyset\t296978\n犬子\t296979\n22册\t296980\nneon\t296981\n八周\t296982\n29只\t296983\nhao.360.cn\t296984\n总包\t296985\n安徽工商职业学院\t296986\n季节性\t296987\n大吃\t296988\n新的一年\t296989\nWikipedia\t296990\n消除者联盟\t296991\n雷巴\t296992\nGlobalization\t296993\n知书\t296994\n陈妍\t296995\n6瓣\t296996\n96平米\t296997\n圣乐\t296998\n长江口\t296999\n杭帮菜\t297000\n3C认证\t297001\n村监会\t297002\n上海金茂君悦大酒店\t297003\ngboard\t297004\n陈兴\t297005\n海棠湾镇\t297006\nAndroidDevTools\t297007\n1476\t297008\n简谐运动\t297009\n365.com\t297010\n资源类\t297011\n维普资讯\t297012\n暴毙\t297013\n公共类\t297014\n华晨汽车\t297015\n拔河比赛规则\t297016\ncise\t297017\n双彩论坛\t297018\n联办\t297019\n信息案\t297020\n水光肌\t297021\n紧急会议\t297022\n三野\t297023\n洛阳列表网\t297024\n密苏里州\t297025\n4四\t297026\nmnemoni
c\t297027\n九华山风景区管理委员会\t297028\nlaunches\t297029\n返修台\t297030\n大针\t297031\n小代\t297032\n铁律\t297033\n金吉列留学\t297034\n337p\t297035\n暂行\t297036\n北苑街道\t297037\nSchmidt\t297038\n访民\t297039\n徘\t297040\n异装\t297041\nbalea\t297042\n点修\t297043\n分众\t297044\n活儿\t297045\n战棋三国2\t297046\n相得益彰\t297047\n实战型\t297048\n刘芳菲\t297049\n研磨器\t297050\n韩国庆熙大学\t297051\nWagner\t297052\n浦东时报\t297053\n残红\t297054\n0517\t297055\n汤浅政明\t297056\n地温\t297057\nhpunix\t297058\n拉罗\t297059\n科威\t297060\n湖北省高级人民法院\t297061\nPor\t297062\n伊斯兰共和国\t297063\n深情不及你\t297064\n成人高考网\t297065\n汪小菲\t297066\nTrigger\t297067\n硫磺\t297068\n狼来了\t297069\n鸡血玉\t297070\n阿卡酱\t297071\n钳形表\t297072\n魔兽守卫军\t297073\n自媒体时代\t297074\n东方之珠\t297075\n美家园\t297076\n橙杖\t297077\n安新县政府\t297078\n兰达\t297079\nDXF\t297080\n主升\t297081\n格式化工厂\t297082\n看见恶魔\t297083\n雪碧\t297084\n平安金融旗舰店\t297085\npyinstall\t297086\n宾州州\t297087\n密性\t297088\n太阳井\t297089\nqilin\t297090\n洪均生\t297091\n堆肥\t297092\n子部门\t297093\n天下父母\t297094\n亮度计\t297095\n曌\t297096\n杭州市下城区\t297097\n夙\t297098\n凸露\t297099\n抗日小英雄的故事\t297100\n安娜斯塔西娅\t297101\n总的来说\t297102\n小溪塔\t297103\n编程语\t297104\n2罐\t297105\n烟嘴\t297106\n鹤壁新区\t297107\n青眼白龙\t297108\n光源氏\t297109\n162路\t297110\n大众途安\t297111\nRESOLVED\t297112\n2018年4月22日\t297113\n广州保利世贸博览馆\t297114\n花椒木\t297115\n鱼友们\t297116\n午晚\t297117\n当真\t297118\n增员\t297119\na35\t297120\n12306分流\t297121\n周佳\t297122\n温饱\t297123\n蓝曦臣\t297124\n桑德罗\t297125\n湖南水利水电职业技术学院\t297126\n东方大厦\t297127\n积累\t297128\n清算日\t297129\n仿制药参比制剂目录\t297130\n讵\t297131\nberger\t297132\n南阳医学高等专科学校\t297133\n白身\t297134\n轻松版\t297135\n同方国芯\t297136\n陆海\t297137\n简英\t297138\n冬装\t297139\n详查\t297140\n酿酒葡萄\t297141\n插座\t297142\n口服补液盐\t297143\n力同\t297144\n四十九天\t297145\nbmap\t297146\n辖县\t297147\n招商\t297148\n国家外国专家局\t297149\n储备林\t297150\n鸟屎\t297151\n苏凡\t297152\n带材\t297153\n震坤行\t297154\n犟龟\t297155\n红熊\t297156\n故梦\t297157\n2883\t297158\nhishop\t297159\n未达账项\t297160\n紧致\t297161\n常德市委\t297162\n磁悬浮\t297163\n全色\t297164\n珠港澳大桥\t297165\n推荐函\t297166\n塘坝\t297167\n栩栩如生\t297168\n怀孕照\t297169\nCider\t297170\n晨峰\t297171\n申报税\t29717
2\n诀\t297173\nchirp\t297174\nPositioning\t297175\n塔里木盆地\t297176\n扎西卓玛\t297177\n2.26\t297178\n昌利\t297179\n周姐\t297180\n最爱上\t297181\n易坦静\t297182\n推测\t297183\n色度仪\t297184\n强制平仓\t297185\n3180\t297186\npipa\t297187\n张子豪\t297188\n吹散\t297189\n’\t297190\nbana\t297191\n刘姝\t297192\n惊呼\t297193\n沙参\t297194\n%10\t297195\n冰法\t297196\n悦薇珀翡\t297197\n绝胜\t297198\n安庆师范大学\t297199\n公益林\t297200\n本周末\t297201\n149个\t297202\n16.0.0\t297203\n入手\t297204\n桶装水配送公司\t297205\n消弧线圈\t297206\n资本金\t297207\n挥鞭\t297208\nespn\t297209\n反光\t297210\naudible\t297211\nsock5\t297212\n护理机\t297213\n白咖啡\t297214\n风雪山神庙\t297215\nsurfing\t297216\npg库\t297217\n武媚娘\t297218\n胖虎\t297219\n生肖兔\t297220\n红玫瑰白玫瑰\t297221\n风口浪尖\t297222\n接客\t297223\n亚拉\t297224\n结婚进行曲\t297225\n知耻\t297226\n重庆开州\t297227\n岩棉条\t297228\n德芙巧克力\t297229\n成都市社保局\t297230\n楼主\t297231\n10130\t297232\nHelmut\t297233\n84寸\t297234\n敬妃\t297235\n1集\t297236\n红旗街\t297237\n征夫\t297238\n慕丰年\t297239\n漯河市中心医院\t297240\n反恐精英CS1.6\t297241\n阿炳\t297242\n时时彩\t297243\n男人性\t297244\ntcp/ip协议\t297245\n棍法\t297246\none2\t297247\n上榻\t297248\n注册表清理\t297249\n惠水县人民政府\t297250\n奥马冰箱\t297251\n电茶炉\t297252\nEverest\t297253\n子弹袋\t297254\n下桥\t297255\n0V\t297256\n面貌\t297257\nalso\t297258\nFatesinger\t297259\n不惜代价\t297260\ns36\t297261\n百度音乐\t297262\n24月\t297263\n白鹅\t297264\ndpf\t297265\n总结版\t297266\n深圳民办小学\t297267\nArya\t297268\nmianji\t297269\n恭城瑶族自治县\t297270\n气声\t297271\n脉冲电源\t297272\n溴敌隆\t297273\n康美\t297274\n50d\t297275\n狼牙山景区\t297276\n牛首山风景区\t297277\n苏州古镇\t297278\n龙部落电影网\t297279\n阎摩\t297280\n伊顿\t297281\n无限王者归来\t297282\n成吉王健林\t297283\n绝对误差\t297284\n男儿当自强\t297285\n1883\t297286\n棵棵\t297287\n软通动力\t297288\n骚穴\t297289\n字码\t297290\n云色\t297291\nMP4格\t297292\n蒋玉菡\t297293\nsegment\t297294\n纷纭\t297295\n刚度\t297296\nCND\t297297\n有讲\t297298\n煲仔饭\t297299\n吉吉电影网\t297300\n美女写真\t297301\n靠左对齐\t297302\nload\t297303\n0.65%\t297304\n种族\t297305\n众兴菌业\t297306\nBlueTooth\t297307\nhoriba\t297308\n忠义堂\t297309\nait\t297310\n紫月\t297311\n实验表\t297312\n直通车\t297313\n中国科学院上海微系统与信息技术研究所\t297314\n标绘\t297315\n短浅\t297316\ndaiml
er\t297317\n三分米\t297318\n过零\t297319\n发带\t297320\n09届\t297321\n蓝博\t297322\n毛爷爷\t297323\n途睿欧\t297324\n享尽\t297325\n支持率\t297326\n接驳\t297327\n96年\t297328\n大便宜\t297329\n{%\t297330\n烷\t297331\n隔声板\t297332\n表达量\t297333\n莱特\t297334\n电压表\t297335\n上海戏剧学院表演系\t297336\nc4\t297337\n三十多岁\t297338\n85平方\t297339\n上海市世界外国语小学\t297340\npfmea\t297341\n3022\t297342\n普周\t297343\n中国长城\t297344\n神采奕奕\t297345\n里脊\t297346\n黯然销魂\t297347\nwebapp\t297348\n乳师\t297349\n高昆仑\t297350\n剑网三明教\t297351\n愚园路\t297352\n幼儿园\t297353\n一个多月后\t297354\n黑玛瑙\t297355\n李世荣\t297356\nTBHacker\t297357\n颜丙涛\t297358\ngotomeeting\t297359\n化工品\t297360\n中麦通信\t297361\nchengji\t297362\n武汉地铁\t297363\n剑侠情缘兵器谱\t297364\n泰康仙林鼓楼医院\t297365\n打根\t297366\n奔驰e\t297367\n66条\t297368\n第1张\t297369\nhd3\t297370\n刘慈尼\t297371\n南京市工商局\t297372\n广院\t297373\n北园大街\t297374\n_趣头条\t297375\n两三\t297376\n一昼夜\t297377\nVISA\t297378\n1宗\t297379\n西吉县\t297380\n赤岸\t297381\n风险点\t297382\nTrump\t297383\nLMC\t297384\n加力\t297385\nreduces\t297386\n东安县\t297387\n90kg\t297388\n帝奥斯\t297389\n前置过滤器\t297390\n爱奇艺游戏中心\t297391\n公办园\t297392\n了不起我的国\t297393\n大样图\t297394\n中金所\t297395\n牛蛙\t297396\n赤毒\t297397\n人成\t297398\n塞尔塔\t297399\n成都九龙医院\t297400\n零点\t297401\nrxbus\t297402\n95年代\t297403\n梅花烙\t297404\n笑靥如花\t297405\n龙帅俊颜馆\t297406\n路虎揽胜\t297407\n初诊\t297408\n艾瑞克\t297409\n若雨\t297410\n踩过\t297411\n小酒馆\t297412\n利达光电\t297413\n古邑\t297414\n舒勒\t297415\n顶配\t297416\n烧机\t297417\n中风偏瘫\t297418\n社保基金\t297419\n商品房装修|一起网\t297420\ngxs\t297421\n一笙\t297422\n川藏南线\t297423\n一策\t297424\n建校\t297425\n梧桐\t297426\n逃兵\t297427\npotentials\t297428\nhfut\t297429\n数控铣床\t297430\nObservable\t297431\n干物妹小埋\t297432\n气阴两虚\t297433\n花城\t297434\n13.5亿\t297435\nSiamese\t297436\n影刃\t297437\n女口\t297438\n卓智\t297439\n五虎将\t297440\n手性\t297441\n疾病险\t297442\n万提斯\t297443\n日国\t297444\n洪小铃\t297445\n河莉秀\t297446\n笑嘻嘻\t297447\n120kg\t297448\ngangbi\t297449\nmedhelp\t297450\n东山开发区\t297451\n中国人民保险\t297452\n双体\t297453\n哲学家\t297454\n400nk\t297455\n3d电视\t297456\n7680\t297457\n被保人\t297458\n尹子维\t297459\n云智易\t297460\n逆光之恋\t297461\n接排\t297462\nmast
ermind\t297463\ntha\t297464\n内拉\t297465\nTSG\t297466\n强压\t297467\n小母\t297468\n物伤\t297469\n识别柱\t297470\n2531\t297471\n西藏航空\t297472\n滑石\t297473\nordinary\t297474\n寻宝记\t297475\n三传\t297476\n涡阳\t297477\n量度\t297478\n股票\t297479\nv5.05\t297480\nCreate\t297481\nAntV\t297482\n微特电机\t297483\n湘香\t297484\n无线键盘\t297485\n清图\t297486\nA7R3\t297487\n四友\t297488\n袁敏\t297489\ngiver\t297490\n管理业\t297491\n588ku.com\t297492\n方舆\t297493\n第83集\t297494\n大更\t297495\ngrafana\t297496\n伊利诺伊大学香槟分校\t297497\n斗龙部落战争\t297498\n韩国队\t297499\n月圆花\t297500\n京汉股份\t297501\n口球\t297502\n苏索\t297503\n何穗\t297504\n4369\t297505\n一公\t297506\n艺德\t297507\nct片\t297508\n中山门\t297509\n水解酸化\t297510\n华语辩论世界杯\t297511\n凌宇沫\t297512\n125乘\t297513\n2.2cm\t297514\n阔少\t297515\n新浪汉\t297516\n南坡\t297517\n坤\t297518\n2016.7\t297519\n大豆异黄酮\t297520\nmagic_quotes\t297521\naop\t297522\n渥太华大学\t297523\n201703\t297524\nSpecialists\t297525\n新闻夜航吧\t297526\n爱情公寓番外篇\t297527\nwegame英雄联盟\t297528\n独家_民主与法制网\t297529\nzrange\t297530\nunconscious\t297531\n战兔\t297532\n长佩\t297533\nlui\t297534\n净肤\t297535\n谷丙转氨酶\t297536\n莱万多夫斯基\t297537\n上海中国电信\t297538\n井宝\t297539\n金域检验\t297540\n首张\t297541\n报员\t297542\n章诒和\t297543\n麦杰克\t297544\nエラ\t297545\n李金元\t297546\nao\t297547\n不幸者\t297548\neton\t297549\n羽客\t297550\n5招\t297551\n新桥\t297552\n逃港\t297553\nNoSuch\t297554\nNetBackup\t297555\n碟子\t297556\n行与行\t297557\n华鲁\t297558\n中国化妆品网\t297559\n游民星空\t297560\nLeading\t297561\n博尔陈志武\t297562\n918\t297563\n大漠孤烟直\t297564\n10方\t297565\n色丁香\t297566\n青龙湾\t297567\n束身衣\t297568\nCSS省略号\t297569\n监控仪\t297570\n饶河县\t297571\n四川一区\t297572\n非异\t297573\n静安新城\t297574\noptimizing\t297575\nCute\t297576\n悲惨世界\t297577\n董仲蠡\t297578\n铁道警察学院\t297579\nqxdm\t297580\n拔高\t297581\n爱茉莉太平洋\t297582\n一年以内\t297583\n增值税款\t297584\n河湖\t297585\n创变者\t297586\n从小规模\t297587\n七分\t297588\n国务委员会\t297589\n一个双\t297590\n揭西县\t297591\n出生率\t297592\n强奸乱伦网\t297593\n储备棉\t297594\n职业学校\t297595\n淘源码\t297596\nhtcone\t297597\n好友度\t297598\n征点\t297599\n土工格栅\t297600\n丹阳市\t297601\n哈曼\t297602\n20160406\t297603\n火狗\t297604\n北京教育出版社\t297605\n天府二街
\t297606\n82%\t297607\n20170630\t297608\n黑松鼠\t297609\n启发\t297610\nBOPP\t297611\n钩\t297612\n国家发展计划委员会\t297613\n耳廓\t297614\nandroid-x86\t297615\n附睾\t297616\nCzmMiao\t297617\n2017.02\t297618\n五位一体\t297619\nzabbix触发器\t297620\n糁\t297621\n秦东魁\t297622\n苏晓\t297623\n4.3.0.0\t297624\n报仇\t297625\n天长网\t297626\n中央处理器\t297627\n不交\t297628\nXpath\t297629\n老爱\t297630\n镇坪\t297631\n汤包\t297632\n下方\t297633\n排尿\t297634\nMTF\t297635\nsupplemental\t297636\n穷爸爸富爸爸\t297637\n三场\t297638\n张雪松\t297639\n白塔堡\t297640\n20170530\t297641\n康朴乐\t297642\n砂掌\t297643\n1080Ti\t297644\n深蓝公寓\t297645\nalready\t297646\nwinbox\t297647\n素净\t297648\n凌派\t297649\n侠盗网\t297650\n岛国\t297651\nradiology\t297652\nb16\t297653\nkun\t297654\n中兴南路\t297655\nFiori\t297656\n满堂脚手架\t297657\ngon\t297658\n九悦\t297659\nff9\t297660\n为有\t297661\n小早川玲子\t297662\n尔尔\t297663\n什么生\t297664\n哈佛医学院\t297665\n湛庐\t297666\n置若罔闻\t297667\n养生堂\t297668\n招行\t297669\n编织管\t297670\nQ友乐园\t297671\n一两秒\t297672\n玛歌\t297673\n主见\t297674\n家网\t297675\n漫步者g4\t297676\n起亚索兰托\t297677\nIntl\t297678\n热费\t297679\n无法证明\t297680\n最终幻想12重制版\t297681\n人民大街\t297682\n谢宏\t297683\n刘同\t297684\n重庆邮电大学移通学院\t297685\n吉鸿昌\t297686\n18平米\t297687\n货描\t297688\n上野莉奈\t297689\n短线\t297690\n蒋英\t297691\n書籍\t297692\n打窝器\t297693\n围绝经期\t297694\n容光焕发\t297695\n楚门镇\t297696\n替米考星\t297697\n骆驼\t297698\n华南\t297699\n搜研\t297700\n0631\t297701\n悠唐\t297702\n一丝丝\t297703\n蛇鞭\t297704\nl1300\t297705\n国际站\t297706\ndoforme\t297707\n白头吟\t297708\n康桥\t297709\n乌鲁木齐水磨沟区\t297710\nAutoMapper\t297711\n第1级\t297712\n菲尔德\t297713\n彩调\t297714\n商编\t297715\n我是真的爱你\t297716\n赢家财富网\t297717\n细弱\t297718\n年息\t297719\n博士后科研流动站\t297720\n北京市消防\t297721\n软科学\t297722\n豁子\t297723\n默小文\t297724\n新浪竞\t297725\nAlex_ShineSky\t297726\n梦琪\t297727\nzhui\t297728\n达利集团\t297729\n人行征信\t297730\n神鹰\t297731\nMCU\t297732\nmsvcp120\t297733\n传奇世界\t297734\n超人妈妈带娃记\t297735\n张大仙\t297736\n印花纸\t297737\n红色警戒2心灵终结\t297738\n眼屎\t297739\nLotion\t297740\ninform\t297741\n1w5\t297742\n爱经\t297743\n职系\t297744\n四宝\t297745\n母心\t297746\n一清\t297747\n团结镇\t297748\n伟人\t297749\nbakery\
t297750\n韩综\t297751\nwww.7czw.com\t297752\nOculus\t297753\n凯叔三国演义\t297754\n95508\t297755\n微景\t297756\n周婷\t297757\n11.1.2\t297758\n00000050\t297759\n神十\t297760\n王绩\t297761\n通长\t297762\n北大外语学院\t297763\n武隆旅游网\t297764\ncad2012注册机\t297765\npolipo\t297766\n东风路\t297767\n93集\t297768\n中共中央委员会\t297769\n陶红\t297770\n任仕达\t297771\n装糊涂\t297772\nMaximum\t297773\n江苏教育\t297774\n超乐泰胶水_密封胶\t297775\nxxz\t297776\n秋衣\t297777\n构造筋\t297778\n赤耳\t297779\n分清\t297780\n盘龙大观园\t297781\n生肉\t297782\nU9\t297783\n一念永恒最新章节_一念永恒无弹窗\t297784\n刘娇娇\t297785\n耿丹学院\t297786\n好地\t297787\nRosemary\t297788\n中华医院\t297789\n扇贝网\t297790\n销售费用率\t297791\n8639\t297792\n盐菜\t297793\n佣兵\t297794\n一管\t297795\n粉面\t297796\n古典管理理论\t297797\n美剧\t297798\n浪荡史\t297799\nQQ资源网\t297800\n划桨\t297801\n沈阳建筑大学\t297802\n国家信访局\t297803\n某人\t297804\n搬家公司\t297805\n岳阳经开区\t297806\n牡牛座\t297807\n文生\t297808\n56分\t297809\n张旭阳\t297810\n西恩\t297811\nSTEM\t297812\n2016-02\t297813\nsql语言\t297814\n犹记惊鸿照影\t297815\n発育\t297816\n青岛农业大学\t297817\n叙事学\t297818\n光域\t297819\n狼溪\t297820\n钟镇涛\t297821\nRhyme\t297822\nRetrofit\t297823\nad16\t297824\n长牙\t297825\n沙海\t297826\n杨府山\t297827\n一干二净\t297828\n二手冠道\t297829\n牲交\t297830\n洛必达\t297831\n58%\t297832\nhero3\t297833\n游戏簿\t297834\nblur\t297835\n心凉\t297836\n鸿宇\t297837\n难题\t297838\n蹦蹦蹦\t297839\n666元\t297840\n国丰\t297841\n腾空而起\t297842\n18T\t297843\n三维集团\t297844\n江西华邦\t297845\n芒果金柜\t297846\n娘亲\t297847\n吴京安\t297848\n催芽\t297849\n万达信息股份有限公司\t297850\n模板法\t297851\nVAD\t297852\n完成品\t297853\nPDf\t297854\n处刑\t297855\n龙鱼之巅\t297856\nNo.1\t297857\n双对数\t297858\n滑动框\t297859\n4102\t297860\n果盘\t297861\n黄体破裂\t297862\n14日游\t297863\n须佐\t297864\n琴行\t297865\n石油业\t297866\nkami\t297867\n采茶机\t297868\n柴湾\t297869\n建湖新闻网\t297870\n爱藏\t297871\n000732\t297872\n萧伯纳\t297873\n红塔\t297874\n剪切\t297875\n翠湖山庄\t297876\n贾晓晨\t297877\n堙\t297878\n倫\t297879\n一动点\t297880\n仙度瑞拉\t297881\n怒视\t297882\nzhai\t297883\n老紫\t297884\n赵真\t297885\nhsrp\t297886\n301号\t297887\n蓝丁胶\t297888\n南北朝\t297889\n壬午日\t297890\n巡回演出\t297891\n479\t297892\n突如其来\t297893\n天鹅湖\t297894\n碗扣架\t297895\n唱吧麦克风\
t297896\n043\t297897\n联产\t297898\n刘思扬\t297899\n月貌\t297900\n肉制品\t297901\n杨晶晶\t297902\n湖塘镇\t297903\n邯郸市公安局\t297904\nCSS代码\t297905\n该不该\t297906\nmutations\t297907\nwww.3234.com\t297908\n加位\t297909\n解说稿\t297910\n私吞\t297911\n林冲棒\t297912\n路德维希二世\t297913\nOpenLayers\t297914\n阿朵\t297915\n黄腐酸\t297916\n党要管党\t297917\n1080p.BD\t297918\nDNF2017\t297919\n水泵站\t297920\n刺青\t297921\n柔印\t297922\n雪福\t297923\nparty\t297924\n热处理\t297925\n正宇集团\t297926\na59\t297927\n本家\t297928\nデ\t297929\n宠物晶锐\t297930\n傣语\t297931\n绕过去\t297932\n绍兴汽车网\t297933\nclay\t297934\n纪工委\t297935\n420分\t297936\n希崎\t297937\n自动焊接机\t297938\n弄坏\t297939\n特急者\t297940\n经验方\t297941\n王凤英\t297942\n解聘\t297943\n一个条\t297944\n王传君\t297945\n混沌龙蛇演义\t297946\n猪心汤\t297947\nrestoring\t297948\n谷内\t297949\nrox\t297950\n去哪儿网\t297951\n张伯驹\t297952\n臼\t297953\nbantu\t297954\nグリ\t297955\n上海申通快递\t297956\nwuxia\t297957\nHubs\t297958\n判刑\t297959\n2018育儿大作战\t297960\nNSFOCUS\t297961\n大朗\t297962\n法商\t297963\n火烧板\t297964\n绍兴酒店\t297965\n通用定时器\t297966\n逐日者\t297967\n熊斌\t297968\n多酚\t297969\n惠金\t297970\n都匀\t297971\n首轮\t297972\n317路\t297973\n别住\t297974\n线字\t297975\n固原\t297976\n都匀市人民政府\t297977\n强弱电\t297978\n洗面奶\t297979\n郭安\t297980\n一决胜负\t297981\nMIPS\t297982\n气水\t297983\n瑞康\t297984\n老吾老\t297985\n郑州方特梦幻王国\t297986\n某月\t297987\n北京城六区\t297988\n乐凯\t297989\n园林树木学\t297990\n精卫填海\t297991\n3280\t297992\n二次方\t297993\n老火靓汤\t297994\n卡网\t297995\n定模\t297996\n重庆美的\t297997\nよ\t297998\nr2简体中文版\t297999\n为时已晚\t298000\n中展\t298001\n高能耗\t298002\nStoryline\t298003\n赵玉\t298004\n中作\t298005\nsperm\t298006\n嵐\t298007\n我的姑娘\t298008\n黑麦草\t298009\n无染\t298010\nWTS\t298011\ncourtyard\t298012\n海尔电器\t298013\nZ17mini\t298014\n捐赠\t298015\n康弘\t298016\n最终章\t298017\n烫斗\t298018\n音识曲\t298019\n北京西单\t298020\ninvestor\t298021\n団鬼\t298022\n姬如雪\t298023\n国资局\t298024\n孙文\t298025\n小外\t298026\n王定华\t298027\n旭日阳刚\t298028\n做菜\t298029\n休宁县人民政府\t298030\n水湖\t298031\n剑侠情缘3科举考试题库\t298032\nJewelCAD\t298033\n赵鸿飞\t298034\nBNF\t298035\n800题\t298036\n古耽\t298037\n転生\t298038\n乐童音乐家\t298039\n圆木棒\t298040\n文本公司\t298041\n当红\t298
042\nul、LI\t298043\n骚妇\t298044\n饰家\t298045\n守信用\t298046\n上汽大通g10\t298047\n醉蝶花\t298048\n天下第一城\t298049\n荣科科技\t298050\n旧时\t298051\n色素\t298052\nConexant\t298053\n塚\t298054\nwindows32\t298055\n无愧于\t298056\nc7000\t298057\n20180107\t298058\n昭君出塞\t298059\npaixu\t298060\n巍山县\t298061\n怡景园\t298062\nEigen\t298063\n12公里\t298064\n医共体\t298065\n第三十四条\t298066\n被删除\t298067\n制丸机\t298068\n周云鹏\t298069\nPid\t298070\n苏州地区\t298071\n1857年\t298072\n狼牙堡\t298073\n电动窗\t298074\n罪恶\t298075\n派克斯\t298076\n小安素\t298077\n3045\t298078\n中央政治局常委\t298079\n容联\t298080\nyaman10t\t298081\n新闻库\t298082\n第十一篇\t298083\nm128fp\t298084\nMOBIN\t298085\ncod11\t298086\n凤凰琴\t298087\n角门\t298088\n如何是\t298089\n林志勇\t298090\n快乐的童年\t298091\n通信工\t298092\n指数函数\t298093\n勇战\t298094\ninthis\t298095\n8.8万\t298096\n临清市\t298097\n骡\t298098\n便利贴\t298099\n北亚\t298100\nweb2py\t298101\n气价\t298102\n汤姆逊\t298103\n9C\t298104\nMysteries\t298105\n主角\t298106\n史蒂芬·柯维\t298107\n1点点\t298108\n杨家界\t298109\n一万美元\t298110\natyang\t298111\n尖酸\t298112\nadults\t298113\n词集\t298114\n中央民族乐团\t298115\ncoma\t298116\n平民化\t298117\n可变现\t298118\n11招\t298119\n露\t298120\nGroup\t298121\n福建电视台\t298122\ngoa\t298123\n魅族魅蓝2\t298124\n寅\t298125\n苗圩\t298126\n查处\t298127\n征友\t298128\n不才\t298129\n20150812\t298130\n几门\t298131\n轧盖机\t298132\nfnatic\t298133\nlibcrypto.so.10\t298134\n小变\t298135\nV6.5\t298136\n中国出版社\t298137\nwin7局域网\t298138\n案情\t298139\n夹克\t298140\n熊猫币\t298141\n柳姓\t298142\n二弟\t298143\nvolatility\t298144\n烦劳\t298145\n上法庭\t298146\n艾普\t298147\nYOGA\t298148\n重访\t298149\n第八话\t298150\n刚刚好\t298151\n理塘\t298152\n闭上双眼\t298153\n劳办\t298154\n洪荒大陆\t298155\n装修类\t298156\n第104\t298157\nTOP50_\t298158\nWildFly\t298159\n强队\t298160\n三风一训\t298161\n欲望之屋2:甜美情事\t298162\n第十批\t298163\n银河英雄传说\t298164\n主库\t298165\n参观\t298166\n骚包\t298167\n系统集成项目管理工程师教程\t298168\n送杜少府之任\t298169\n博智\t298170\naccompanied\t298171\n普清\t298172\n德成\t298173\n扎紧\t298174\n教牧\t298175\n因故\t298176\nCatalina\t298177\n中评协\t298178\n帕尔玛\t298179\n倪安东\t298180\n江疏汪涵\t298181\n不负责\t298182\nAppleStore\t298183\n趋势项\t298184\nBank\t298185\n
盈亏平衡分析\t298186\n白鳝\t298187\n龙玉涛\t298188\n复制机\t298189\n朱令\t298190\n司马宋祖儿\t298191\nAntec\t298192\n二月\t298193\n领地战\t298194\n凯莉日记\t298195\nTfboys\t298196\n炉石传说探险者协会\t298197\n换新颜\t298198\n心理作用\t298199\nfna\t298200\nZDC\t298201\nC编程\t298202\n惑\t298203\n辗转\t298204\n王梓钧\t298205\nLocator\t298206\nTPI\t298207\nlashes\t298208\n魔医\t298209\nsmartgit\t298210\n翻砂\t298211\n永不言弃\t298212\nimpact\t298213\n春风榴火\t298214\n脸对脸\t298215\n名妓\t298216\n主生\t298217\n子弹头\t298218\n70W\t298219\n硅谷亮城\t298220\n堀\t298221\n大力金刚掌\t298222\n黑暗时代\t298223\n上海市奉贤区中心医院\t298224\n在次\t298225\n0.00元\t298226\n排汽\t298227\n介损\t298228\n眼见为实\t298229\n两招\t298230\n安吉里卡\t298231\n鸭脚木\t298232\nneve\t298233\n津液\t298234\n场强仪\t298235\nuiscrollview\t298236\n国卫\t298237\n刘小燕\t298238\n3.0.10\t298239\n中华人民共和国涉外民事关系法律适用法\t298240\n25毫米\t298241\n_慧谷网\t298242\n钢印\t298243\n江山美人志\t298244\n十九冶\t298245\nKiehl\t298246\ndreamware\t298247\nSqlServer数据库\t298248\n白先生\t298249\n犁\t298250\n全国学校共青团\t298251\n紫砂之家\t298252\n添加剂\t298253\nkinetics\t298254\n1268\t298255\n忘了关\t298256\n雷夫\t298257\n集美舍\t298258\nP20\t298259\n核舟\t298260\n9.3.3\t298261\n厨师机\t298262\n美洛昔康\t298263\n湖北省荆州中学\t298264\n章无忌\t298265\nケ\t298266\nLongitude\t298267\n柯达\t298268\n有几多愁\t298269\n1000万个\t298270\nzhaoxiaoshuo\t298271\n第十八卷\t298272\n知语\t298273\n仁智\t298274\n真气石\t298275\n博时\t298276\n骨\t298277\n鸟取县\t298278\n吕继宏\t298279\n志村玲子\t298280\n红薯粉条\t298281\n我的双修道侣\t298282\n辛芷蕾\t298283\n第一行\t298284\n低领\t298285\nshaking\t298286\n开元酒店\t298287\n墨梅\t298288\n灵女\t298289\n第16页\t298290\n光影包\t298291\n斩裂\t298292\n琵琶骨\t298293\n卓文\t298294\n2262\t298295\nmsdos\t298296\nwindows环境变量\t298297\n查表\t298298\n色道\t298299\n消费主义\t298300\n6007\t298301\n扬威\t298302\n唐昌镇\t298303\nqj\t298304\n指接\t298305\nMOJO\t298306\n蓝景\t298307\n定日\t298308\n年产\t298309\n乌骓\t298310\n蔡家岗镇\t298311\n闪电战\t298312\n不省人事\t298313\nbar\t298314\nexecute\t298315\n心的旅程\t298316\n上半年\t298317\n30年\t298318\n八辈子\t298319\n阉党\t298320\nORION\t298321\n恃宠而骄\t298322\n飞鱼丸\t298323\n峰值\t298324\n车友群\t298325\n康安路\t298326\n卖挂\t298327\nexplode\t298328\nTropical\t298
329\nFileLocator\t298330\n第31\t298331\n蓝衫\t298332\n喀拉峻草原\t298333\n腾讯视频\t298334\n赫斯曼\t298335\nINVALID\t298336\n扇窗\t298337\n赵冰\t298338\n1.5V\t298339\n盐业\t298340\ndeveloper\t298341\n她很漂亮\t298342\n石像\t298343\n肌张力障碍\t298344\n路易\t298345\n判据\t298346\n接近\t298347\nnetworking\t298348\n9669手游网\t298349\n包不住\t298350\n市红十字会\t298351\n耖\t298352\nnifty\t298353\n南洋模范中学\t298354\n振动机械网\t298355\n核\t298356\n4.3.6\t298357\n照理\t298358\n第311集\t298359\n门幅\t298360\n12路\t298361\n庞中华\t298362\n康康体检网\t298363\n魔域永恒\t298364\nCALIS\t298365\ncollege\t298366\n群女\t298367\n早稻网\t298368\n碧海威\t298369\nQual\t298370\nLia\t298371\n下版\t298372\nA18\t298373\n和音\t298374\n鲍威尔\t298375\n护理假\t298376\n3所\t298377\n胡因\t298378\n予約\t298379\ngtx1065\t298380\nyu\t298381\n刀塔艾瑞泽5\t298382\n波纹\t298383\n条布\t298384\n龙薇\t298385\n麦格纳斯\t298386\n试比\t298387\n村部\t298388\nsafety\t298389\n95层\t298390\n压筋\t298391\n天峨县\t298392\n康之家\t298393\n周扒皮\t298394\n画诗\t298395\n存储服务器\t298396\n深基\t298397\n无虑\t298398\n热情的沙漠\t298399\n玻璃隔断\t298400\n芭学园\t298401\n450个\t298402\n伊莉\t298403\n剥皮\t298404\nDKT\t298405\n明显\t298406\n51路\t298407\n千遍\t298408\n汤米姐\t298409\n屋头\t298410\n西湖苑\t298411\n半马\t298412\n杰克森\t298413\nFinch\t298414\n万紫千红\t298415\n裸播\t298416\n贵州师范大学\t298417\n80篇\t298418\n風俗\t298419\n羊毛脂\t298420\n钢琴音\t298421\n丹凤县\t298422\n丝祙\t298423\n毕马威会计师事务所\t298424\n布克奖\t298425\nVASP\t298426\n9M\t298427\n哈哈步\t298428\nX20\t298429\n五四\t298430\nsoho中国\t298431\n洪伟\t298432\n几个十\t298433\n用工\t298434\nxinput1_3.dll\t298435\n帮会\t298436\n漫画家\t298437\n而论\t298438\n奥杰\t298439\n以德服人\t298440\n革兰氏阴性菌\t298441\n法会\t298442\nwifi传书\t298443\n精伦\t298444\n蚂蚁金融服务集团\t298445\n藏传净土法\t298446\n君康人寿\t298447\n新包\t298448\n纳兰嫣然\t298449\nAK-47\t298450\nfeet\t298451\n减贫\t298452\n刘健\t298453\n程序版\t298454\n尾狮\t298455\ngpload\t298456\n话本\t298457\n一条河\t298458\n客家菜\t298459\niazelaya\t298460\n欢乐喜剧人文松\t298461\nPBS\t298462\nMYIP\t298463\n痛\t298464\n史蒂芬·金\t298465\n摩苏尔\t298466\n变频压缩机\t298467\n干壁\t298468\n震颤\t298469\n波浪状\t298470\n改元\t298471\n黑体\t298472\neur\t298473\n奥林匹克公园\t298474\nPrivilege\t298475\n5MM\t2984
76\n很久前\t298477\ntftpd32\t298478\n米兰尼斯\t298479\nnemo\t298480\n禁图\t298481\n受气\t298482\n滑道\t298483\n虎帝\t298484\n对标对表\t298485\ntranslated\t298486\n红牛\t298487\nTorture\t298488\n集成创新\t298489\n馒馒\t298490\n黑真\t298491\n固定价\t298492\nSLOGAN\t298493\n周信芳\t298494\n2013年5月\t298495\n麦田音乐网\t298496\n夏有乔木雅望天堂\t298497\n千钧一发\t298498\n梦幻西游化圣\t298499\n降眼\t298500\n广西网\t298501\n冕宁\t298502\nbreed\t298503\n砂粒\t298504\nhhtp\t298505\n起点中文网\t298506\n中辰\t298507\n重奖\t298508\n花井美纱\t298509\n释永信\t298510\n使用篇\t298511\n第19个\t298512\nsignage\t298513\n年金险\t298514\n蒋瑶\t298515\n吃饭后\t298516\n围堰\t298517\n2018年2月28日\t298518\n民国政府\t298519\n跨数\t298520\n7亿元\t298521\nUL\t298522\n骨囊肿\t298523\n九仙帝皇诀-九仙帝皇诀快眼看书\t298524\n垫纸\t298525\n打真\t298526\n煤柱\t298527\nGOPRO\t298528\n增盈\t298529\n桃园镇\t298530\n擦窗机\t298531\n4860\t298532\n中汽\t298533\n华英农业\t298534\n清华大学深圳研究生院\t298535\nSUBSTR\t298536\n雪桃\t298537\n味全乳酸菌\t298538\n人教版四年级英语\t298539\navfoundation\t298540\n实景剧\t298541\nReggae\t298542\n朗盛\t298543\n大内\t298544\n蛊虫\t298545\n26&\t298546\nHITACHI\t298547\n元培\t298548\n皱叶\t298549\n福瑞揽胜极光\t298550\n镂空\t298551\n史燕\t298552\n9990\t298553\nlc100\t298554\n弓马\t298555\n轮眼\t298556\n林丁丁\t298557\n国际儿童节\t298558\n中国人民解放军火箭军\t298559\n北京罗麦科技有限公司\t298560\n蕉\t298561\n长期待摊\t298562\n蚝\t298563\n审稿\t298564\n富民\t298565\nrecurrence\t298566\n成本类\t298567\n百兆\t298568\n明发国际城\t298569\n旭辉半岛\t298570\n逯薇\t298571\n雪羽\t298572\n梁氏\t298573\n商都房产网\t298574\ncacerts\t298575\n蠢货\t298576\n彩民村\t298577\n闭运算\t298578\n土壤有机质\t298579\nnht\t298580\n清洁剂\t298581\nsins\t298582\n发愿文\t298583\n樟宜\t298584\n文洛\t298585\n孔文\t298586\n0790\t298587\nトイレ\t298588\n看板\t298589\n魔兽争霸编辑器\t298590\n成国\t298591\n合葬\t298592\n误删\t298593\nX9L\t298594\n减招\t298595\n渎\t298596\nttg\t298597\n150年前\t298598\n松江镇\t298599\n黄金瞳\t298600\nCedar\t298601\n与天同兽\t298602\n红领巾集结号\t298603\n酬劳\t298604\n整\t298605\n水城县政府网\t298606\n流放之路塑界玉\t298607\n蓝牙3.0\t298608\nZX7\t298609\n20150205\t298610\n全屏\t298611\n有失\t298612\n一拖\t298613\n永创\t298614\n非师范生\t298615\njardin\t298616\n有疾\t298617\n深圳市社保局\t298618\n鼎泰丰\t298619\n优车\t298620\n国网四川
省电力公司\t298621\n红叉\t298622\n羡煞\t298623\n尚标\t298624\n井空\t298625\n保守力\t298626\n四川大学网络教育学院\t298627\n武江区\t298628\n使用寿命\t298629\n李小兵\t298630\n学人\t298631\n会泽\t298632\n卡拉瓦乔\t298633\n拉拉\t298634\n翠屏区\t298635\n幸福女人\t298636\n妈妈的朋友2\t298637\ntrf\t298638\n车头条\t298639\n金蝶KIS|智慧记\t298640\nQ1\t298641\nbd谱\t298642\nffa\t298643\nPur\t298644\n马丁·路德\t298645\nFranois\t298646\n替格瑞洛\t298647\nsublime2\t298648\nproductivity\t298649\nsnp\t298650\n永磁\t298651\n排烟阀\t298652\n危言耸听\t298653\n奖率\t298654\nspg\t298655\nLofter\t298656\n十万个冷笑话2\t298657\n青原\t298658\n小种\t298659\n一百岁\t298660\n小新锐\t298661\n6卷\t298662\n东方不败\t298663\n慈利新闻网\t298664\n来无\t298665\n蒋方舟\t298666\n章光\t298667\n晶珠\t298668\n114黄页网\t298669\n开叉\t298670\n濡尾\t298671\nVariation\t298672\n底下\t298673\n证证\t298674\n莎翁\t298675\n脑淤血\t298676\nTOP10_\t298677\n=&#160\t298678\n津巴\t298679\n赵聪\t298680\n组团式\t298681\n小冲\t298682\n重生之门\t298683\n乡\t298684\n河南三敏机械设备有限公司\t298685\nFreak\t298686\n凸模\t298687\n在职\t298688\nnotefirst\t298689\n手刺\t298690\n切肉\t298691\n女扮\t298692\n自诩\t298693\n再有\t298694\n金多多\t298695\n高转\t298696\n蓝鱼\t298697\n屈家岭管理区\t298698\n绝剑\t298699\n透平\t298700\n窑火\t298701\nAvon\t298702\n暗影魔多\t298703\nbianma\t298704\n奔驰R级\t298705\n到时\t298706\nmibd\t298707\n成数\t298708\n才到\t298709\n易经丹\t298710\n3角\t298711\n4231\t298712\n6650\t298713\n压力盒\t298714\n有机合成材料\t298715\ntwrp\t298716\n冷备份\t298717\n机械化\t298718\n绪方\t298719\n银马\t298720\n阿依达\t298721\nrenamed\t298722\ncc2016\t298723\n编外教师\t298724\nHelen\t298725\nHins\t298726\n米克尔\t298727\n十一期间\t298728\n信佛教\t298729\n小前锋\t298730\n写给母亲\t298731\n信呼\t298732\n巴戟\t298733\n妖孽满楼_\t298734\n陈全\t298735\n国防报\t298736\n广东工商局\t298737\n陆飞\t298738\n华纳\t298739\n椰子灰\t298740\n喜宴\t298741\n王金卓\t298742\n贩子\t298743\nJazeera\t298744\n影山\t298745\n乐都\t298746\n烟色\t298747\n李稻葵\t298748\nvisio2016\t298749\n古风\t298750\n余额\t298751\n调度\t298752\n名事\t298753\nDiRT\t298754\n体毛\t298755\n2PC\t298756\n阿年\t298757\n陌陌群\t298758\n小叉\t298759\n切削液\t298760\n国务院关税税则委员会\t298761\n不锈钢门\t298762\nHighlights\t298763\n林县\t298764\nShinjuku\t298765\n集萃\t298766\nmelody\t298767\
n寓所\t298768\nwindows10易升\t298769\n一拱\t298770\n认清楚\t298771\n夜信\t298772\n烘道\t298773\n磁辊\t298774\n通缩\t298775\n老网\t298776\n虚岁计算器\t298777\n砖砖\t298778\n病娇\t298779\nA7RII\t298780\nCC2014\t298781\n财办\t298782\n多伦多市\t298783\n益气\t298784\nsavetxt\t298785\n死伤\t298786\n杂技团\t298787\n三所\t298788\n临考\t298789\n地连墙\t298790\n值客\t298791\n暴雪嘉年华\t298792\n书卷\t298793\n鸡哥\t298794\n葛平\t298795\n房事\t298796\ning\t298797\n范阳\t298798\n阜阳市国土资源局\t298799\nChokStick\t298800\n仙剑传奇\t298801\n万维网\t298802\n闹剧\t298803\n火影忍者鼬\t298804\nqlist\t298805\n九大美院\t298806\n君醉\t298807\n耶律楚材\t298808\n明通\t298809\n自然法\t298810\n131个\t298811\n20辆\t298812\n起动\t298813\n间隔期\t298814\n甲子日\t298815\n400个\t298816\n山东省人大\t298817\n不慎\t298818\n评论文\t298819\nFastjson\t298820\n美歌\t298821\n胶瓶\t298822\n体组\t298823\n人际\t298824\n惠特曼\t298825\n师魂\t298826\n半峰\t298827\n沈阳造币厂\t298828\n7681\t298829\n三亚市政府\t298830\n吸烟\t298831\n简弘亦\t298832\n刘学军\t298833\n哈迪德\t298834\n赵雷枪神纪\t298835\n口袋妖怪黑暗\t298836\nOnion\t298837\n清霜\t298838\n482\t298839\n尼德霍格\t298840\nlu\t298841\n污迹\t298842\n模糊聚类\t298843\ndeficiency\t298844\n豪言壮语\t298845\nPhotographic\t298846\n先\t298847\n威尔·史密斯\t298848\n东离剑游纪\t298849\n柯桥街道\t298850\n薄饼\t298851\n尹汝贞\t298852\n杭州市西湖区政府\t298853\n修睿\t298854\n芥川\t298855\n骨草\t298856\n锁机\t298857\n百度阅读\t298858\n穿石\t298859\n凌度\t298860\n北京现代瑞纳\t298861\n企业文\t298862\n周静\t298863\n练习本\t298864\nIsle\t298865\n56%\t298866\nABCDEF\t298867\n文字画\t298868\n体质\t298869\n弑\t298870\n国家煤矿安监局\t298871\n雨润集团\t298872\n1所\t298873\nlsmw\t298874\nNora\t298875\n实际成本法\t298876\ndaze\t298877\n圆简\t298878\n雇用\t298879\n闲杂\t298880\n秦基博\t298881\n西安大唐医院\t298882\n电阻率\t298883\n扰流\t298884\n今天早上\t298885\n嚓\t298886\n妇孺皆知\t298887\nPPT2010\t298888\nRexroth\t298889\n_乎乎\t298890\nRP\t298891\n原产地证\t298892\n招股意向书\t298893\nXaml\t298894\nunhashable\t298895\n善政\t298896\n5988\t298897\n叹号\t298898\n卓易科技\t298899\n宅男福利社\t298900\nm个\t298901\nelicpse\t298902\n沙漠掘金\t298903\n位似\t298904\n头槌\t298905\n无限网\t298906\n1V2\t298907\n防晒伞\t298908\nMergers\t298909\nbalenciaga\t298910\nEXTELLA\t298911\n侨商\t298912\n人事局\t298913\ninsane
\t298914\n停线\t298915\n昕锐\t298916\n多态\t298917\n观察日记\t298918\nPVF\t298919\n双边贸易\t298920\nsyz\t298921\n数字医疗\t298922\n烈\t298923\nAdvantage\t298924\n千手\t298925\n強\t298926\n微指令\t298927\nSlideshow\t298928\n192.168.0.1\t298929\nPython数据分析常用手册\t298930\nwww.kongbao100.com\t298931\n最小配筋率\t298932\n孙楠\t298933\nG级\t298934\n内蒙古农村信用社\t298935\n新区管委会\t298936\n缭\t298937\n加油车\t298938\n邮资\t298939\n诡丝\t298940\n电脑式\t298941\n6800元\t298942\nvalidators\t298943\nE0级\t298944\n64期\t298945\n开心激情网\t298946\n全达\t298947\n工艺流程\t298948\n南极绝恋\t298949\n真假图\t298950\n155号\t298951\nDNF卡\t298952\ntheisle\t298953\nQQ漂流瓶\t298954\n花漾\t298955\n天津中医药大学第一附属医院风湿免疫科\t298956\n李贽\t298957\nPhalcon7\t298958\n新三国演义\t298959\n无情无义\t298960\n十三季\t298961\n2828电影网\t298962\n李青强\t298963\n恭贺\t298964\nwtmp\t298965\n大富翁8\t298966\n华谊兄弟影院\t298967\n哈弗车友会\t298968\n上海快递\t298969\nDome\t298970\n河南城建学院\t298971\n建议案\t298972\n探幽\t298973\n交社\t298974\n场镇\t298975\n汪健\t298976\n柞树\t298977\npopen\t298978\n汕头职业技术学院\t298979\nWacom\t298980\nOOC\t298981\nFEX\t298982\nHtmlUnit\t298983\nM6\t298984\n2标\t298985\nbalance\t298986\n验证仪\t298987\n英语六级考试\t298988\n天牢\t298989\n皇城相府\t298990\n白话文版\t298991\njavbus\t298992\nu2518dr\t298993\ngoodness\t298994\n55w\t298995\n生死\t298996\n东方购物\t298997\n丙二酸二乙酯\t298998\n观察表\t298999\n叶芳\t299000\nsyt\t299001\n东昌府区人民政府\t299002\n同济路\t299003\n表面\t299004\n静不下\t299005\nOnlyLady女人志\t299006\nN2\t299007\n105.6\t299008\n解说\t299009\n慕思\t299010\n喝牛奶\t299011\nmaschine\t299012\nCuriousZero\t299013\n白云公园\t299014\n主治\t299015\n奥美拉唑肠溶片\t299016\n中阴\t299017\npg1\t299018\n梦幻西游帮战\t299019\n物力\t299020\n蛇神\t299021\n健康160网\t299022\n苏宁广场\t299023\n解组\t299024\n三年以来\t299025\n献丑\t299026\n罗瓦涅米\t299027\n19.1.1\t299028\n宋家村\t299029\n赛莱默\t299030\n地图学与地理信息系统\t299031\n答题卷\t299032\n景台\t299033\nMetronic\t299034\nSol\t299035\n安全生产责任书\t299036\n短道速滑世界杯\t299037\n魂体\t299038\n360安全网址导航\t299039\nraspbian\t299040\n木之本樱\t299041\nturns\t299042\n乐之邦\t299043\n路透中文网\t299044\n黄瓜水\t299045\nゴ\t299046\nvehicle\t299047\n垄断资本主义\t299048\n五十天\t299049\n举报箱\t299050\n失恋巧克力职人\t299051\nGv
im\t299052\n归还世界给你\t299053\n壶镇镇\t299054\n高陵区\t299055\n义乌小商品批发市场\t299056\n山特维克\t299057\n两瓣\t299058\n155mm\t299059\n持有型\t299060\nsandisk\t299061\n携带式\t299062\n腹主动脉瘤\t299063\n青松公路\t299064\n银瓶山森林公园\t299065\nmonkey\t299066\n监管人\t299067\n吴忠市\t299068\n3肖\t299069\n三封\t299070\n老凤祥黄金\t299071\n煤炭业\t299072\nV2.1.2\t299073\n安徽艺术职业学院\t299074\nnitro\t299075\n追光者\t299076\n评审表\t299077\n一一一\t299078\n侧窗\t299079\n刘花英\t299080\n爿\t299081\n九江学院附属医院\t299082\n净负债\t299083\n紫金湾\t299084\n电气自动化专业\t299085\nz18\t299086\n巨著\t299087\nCC6\t299088\n攀枝花市统计局\t299089\n换完\t299090\n综合信息网\t299091\n轩狗\t299092\n灿烂千阳\t299093\nSALES\t299094\n石家庄小区\t299095\n超杀默示录\t299096\ngx501\t299097\n松田翔太\t299098\n南京大学工程管理学院\t299099\n右倾\t299100\n图谜\t299101\n有限责任公\t299102\n护理类\t299103\n职业群\t299104\n容量型\t299105\n涉黑案\t299106\n惠而浦\t299107\n整流管\t299108\n装束\t299109\n晴纶\t299110\nSentosa\t299111\nTSPB\t299112\nOnlylady\t299113\nPython+selenium\t299114\n波密县\t299115\n泠鸢\t299116\n篮球鞋\t299117\nmpn\t299118\n奇艺网\t299119\nVirgil\t299120\n鼻饲管\t299121\n三五中文网\t299122\n陈国富\t299123\nmy_life\t299124\n福清\t299125\n南狮\t299126\n麻章\t299127\nwebjars\t299128\nsaint\t299129\nPHPnow\t299130\n甘宁\t299131\n赵颖\t299132\n来不及\t299133\nMarty\t299134\n阳光在线\t299135\n青岛市北\t299136\nZ\t299137\n中堂\t299138\n克罗地亚狂想曲\t299139\nwanghetao\t299140\n深圳市\t299141\n电文\t299142\n山线\t299143\n填报\t299144\n交角\t299145\n华数tv\t299146\n鳟鱼\t299147\n找工\t299148\nIconfont\t299149\n竖直\t299150\nQQ骗子\t299151\nZJU\t299152\n算命\t299153\n規\t299154\n制动\t299155\n青岛地铁8号线\t299156\n好歌\t299157\n吉普森\t299158\n扁豆\t299159\n别信\t299160\n谢芳\t299161\n百田EXO圈\t299162\n微信公司\t299163\n九龙峡\t299164\n丛谈\t299165\n8波\t299166\n淀山湖\t299167\n男旦\t299168\n对外贸易经济合作部\t299169\n普门\t299170\n夫妻俩\t299171\n效用\t299172\n缉枪\t299173\nSTANDARD\t299174\n诚惶诚恐\t299175\n闲余\t299176\nxy=1\t299177\n出封\t299178\n绿蜡\t299179\n离职率\t299180\n驻\t299181\n开心色播网\t299182\n蠡园\t299183\n无机纤维\t299184\n昂科塞拉\t299185\n玉灵膏\t299186\n星际战甲虚空\t299187\n春困\t299188\n黑龙江省实验中学\t299189\n劝学\t299190\n总助\t299191\n枪口\t299192\n洗\t299193\n萃取物\t299194\n从严\t299195\n收腹\t299196\n瞻\t2991
97\n丽蒂\t299198\n9k9k\t299199\n逗号\t299200\n中国教育部\t299201\n萧索\t299202\nFatego\t299203\n有感而发\t299204\n影音先锋资源xfplay\t299205\n杜军\t299206\n51PPT\t299207\nKeycloak\t299208\n兼论\t299209\n余磊\t299210\n空照\t299211\n四分钟\t299212\n第六十六章\t299213\n短话\t299214\n糖度计\t299215\nyea\t299216\n花键\t299217\n百度阅读器\t299218\n邓辉\t299219\n凤凰os\t299220\n徐勇\t299221\n0x0000142\t299222\n中国唱诗班\t299223\nmeizu\t299224\nuop\t299225\nWiFi管家\t299226\n视口\t299227\ncocos2d-js\t299228\n深圳迈瑞生物医疗电子股份有限公司\t299229\n联邦基金利率\t299230\n泛滥\t299231\n诊治\t299232\n学会生存\t299233\n沐雨\t299234\n江苏农行\t299235\n油膜法\t299236\n利箭\t299237\n松山战役\t299238\nokhttp\t299239\n20150704\t299240\n隔膜\t299241\n南极点\t299242\namzon\t299243\n江苏省口腔医院\t299244\n匪君子\t299245\n一百篇\t299246\n不群\t299247\n武若丸\t299248\nc4d\t299249\n勇士\t299250\nNITORI\t299251\n咖啡酸\t299252\n莘县\t299253\n北二环\t299254\nImageJ\t299255\n这么样\t299256\n商法\t299257\n奇葩说第三季\t299258\n霹雳天命之战祸邪神2破邪传\t299259\n权益乘数\t299260\n淳熙\t299261\n超压\t299262\ncoverage\t299263\nScrivener\t299264\n小丑女\t299265\n淇县\t299266\n可也\t299267\n川建发\t299268\n画笔画\t299269\nyeng\t299270\n孟良崮战役纪念馆\t299271\n新浙\t299272\n还珠格格2\t299273\n拿出来\t299274\n2017年5月31日\t299275\n敢约\t299276\n王梅\t299277\n2016年3月4日\t299278\n两天一夜\t299279\n次品率\t299280\nsolarbe索比太阳能光伏网\t299281\n重钙\t299282\n扑空\t299283\n东风公园\t299284\n打泡\t299285\n静电式\t299286\n剪辑师\t299287\n费尔蒙\t299288\n城市机场\t299289\n设计癖\t299290\n黑曜石\t299291\n伦敦西区\t299292\n劳务分包公司\t299293\n8月27日\t299294\n35P\t299295\n合肥市人力资源和社会保障局\t299296\n馥蕾诗\t299297\n西双版纳\t299298\n张文木\t299299\n爱建证券\t299300\n吹尘\t299301\n四十九日\t299302\n天津大学环境科学与工程学院\t299303\n优倍快\t299304\ndjang\t299305\n龙根\t299306\nmysqlsql\t299307\n白图\t299308\n谗\t299309\n吉林省安全生产监督管理局\t299310\n排孔\t299311\nHTB\t299312\n_高清\t299313\n夜色钢琴曲\t299314\n金盛\t299315\n出息\t299316\n平衡\t299317\n熊茜\t299318\n臻房网\t299319\n幸福社区\t299320\n包钢\t299321\n投诉方\t299322\n福建省文化厅\t299323\n八把\t299324\n磁感\t299325\n太湖广场\t299326\n阿利斯塔\t299327\n中山市住房公积金管理中心\t299328\n第89期\t299329\nTechPowerUp\t299330\n只欠\t299331\nTHERMOS\t299332\n冠捷科技\t299333\n海德拉\t299334\nContext\t299335\n1205\t299336\n2822\
t299337\nbailey\t299338\nBeaver\t299339\n配位键\t299340\n三秦通\t299341\n39532357\t299342\n韩国乐天免税店\t299343\nHP笔记本\t299344\n第115章\t299345\n苦练\t299346\n中维世纪\t299347\n娜迦女\t299348\nauxiliary\t299349\n国际货物买卖合同\t299350\nSongMP3\t299351\n最佳女主角\t299352\n太白酒\t299353\n葡萄籽胶囊\t299354\n尊享版/全网通\t299355\n妇幼健康\t299356\n擦写\t299357\n蔡伦\t299358\n黑石\t299359\n石滩镇\t299360\n泗县老泗州网\t299361\n盘形\t299362\nIQueryable\t299363\nMondo\t299364\n森警\t299365\n果肉\t299366\nwhm\t299367\nmegumi\t299368\n数码机\t299369\n巴塞尔协议\t299370\n第二只\t299371\nLily\t299372\n南海博物馆\t299373\n陈音\t299374\n娘妆\t299375\nsunshine_kaka\t299376\n杀伤\t299377\n鸡枞菌\t299378\n冯静\t299379\n花步\t299380\n养父的花样年华\t299381\n茶盘\t299382\nhealthy\t299383\n20150824\t299384\nda屏\t299385\nzenfone2\t299386\n水煮荷包蛋\t299387\n荣校\t299388\n海王星辰\t299389\n流入率\t299390\n杨集镇\t299391\n一汽大众汽车\t299392\n巽寮\t299393\n张志新\t299394\n木兰诗\t299395\nxem\t299396\n罗马帝国\t299397\n擦汗\t299398\n自有品牌\t299399\nWindows8\t299400\n一统国际家居\t299401\n电线电缆网\t299402\n桂林酒店\t299403\nPHIX\t299404\n黄山风景名胜区\t299405\n冒険\t299406\n陈建斌\t299407\n鸸鹋油\t299408\nobserver\t299409\nwantiku\t299410\n鼎级\t299411\nPOPSUGAR\t299412\n涟韵\t299413\nGWT\t299414\nsuju\t299415\n篮板球\t299416\n力维\t299417\n故事型\t299418\nhhhhh\t299419\n山式\t299420\n发奋图强\t299421\n磁芯\t299422\nAfghanistan\t299423\n乌尔奇奥拉\t299424\n产销\t299425\n许昕\t299426\n诺澜\t299427\n机舱\t299428\n南陵\t299429\nエリカ\t299430\n42\t299431\nswing\t299432\n宪治网\t299433\n水泊\t299434\n选好\t299435\nM277dw\t299436\n二夫\t299437\n园林艺术概论\t299438\nSTM32F103C8T6\t299439\n常州市正衡中学\t299440\n好父亲\t299441\n古稀老人\t299442\n精工\t299443\naccount\t299444\n隆安县\t299445\n离间计\t299446\nv4.8.0\t299447\nSL\t299448\n百度空间\t299449\n2极\t299450\n小红车\t299451\n玫瑰香葡萄\t299452\n美食园\t299453\n一个35岁\t299454\n基本权利\t299455\n衣钩\t299456\nmeiji\t299457\nSuggestion\t299458\n异体字\t299459\n声导\t299460\n5亿\t299461\n中南建设\t299462\naccess2007\t299463\n龙蛋\t299464\n周建平\t299465\n↘\t299466\n音动\t299467\n002183\t299468\n管理哲学\t299469\n设备管理器\t299470\n骁龙617\t299471\n赛迪智库\t299472\n不折\t299473\n马川\t299474\ngrow\t299475\n英语语法网\t299476\n歧管\t299477\n3pe\t2
99478\n朝着\t299479\n幸平创真\t299480\n天津省\t299481\nRyo\t299482\n长阳土家族自治县\t299483\n交通事故保险公司\t299484\n击实\t299485\nDEATH\t299486\n052\t299487\n2码\t299488\n塑胶\t299489\nMING\t299490\nUNIT\t299491\n张祥\t299492\n帕金森\t299493\n肉铺\t299494\n2000年度\t299495\nReservations\t299496\n一模\t299497\n自由人\t299498\n795\t299499\n磁粉探伤机\t299500\n端午日\t299501\n天合汽车\t299502\n苹果Mac\t299503\n黄蜡管\t299504\n无证\t299505\nv6.2.1\t299506\n娶媳妇\t299507\n维多利亚大学\t299508\n禁语\t299509\nACPI\t299510\n偷窥男\t299511\nvirgil\t299512\n拷问部\t299513\n杨宋镇\t299514\n艺廊\t299515\n伍佰\t299516\n华锦\t299517\n打破常规\t299518\n质监\t299519\n致密\t299520\n内测版\t299521\ncloudstack\t299522\nmui.ajax\t299523\n滚动阻力系数\t299524\n五讲\t299525\n中化道达尔\t299526\n按次\t299527\nBBOY\t299528\n空导弹\t299529\n江苏省工商局\t299530\n综合式\t299531\n620k\t299532\n魔羯男\t299533\n娴\t299534\n宁愿\t299535\n二十年前\t299536\n対魔忍ユキカゼ\t299537\n谢老师\t299538\n科学观\t299539\n牧神花千骨\t299540\n江都在线\t299541\n万花镜\t299542\ninformal\t299543\nequals\t299544\n准实时\t299545\n夏炎\t299546\n腊\t299547\n福莱一点通\t299548\nase\t299549\n嶌葵\t299550\nA16\t299551\n浑圆\t299552\n姚景源\t299553\n菜博会\t299554\nANTM\t299555\n受想\t299556\n羽生稀\t299557\n玛吉斯\t299558\npsp1000\t299559\n文冠\t299560\n扩至\t299561\nPN532\t299562\n绕\t299563\nTinkers\t299564\n冯兵\t299565\n小黄蜂\t299566\n997\t299567\nMac分屏\t299568\nanybody\t299569\n牧野区\t299570\nD股\t299571\n地市局\t299572\nhybrid\t299573\n铁刀\t299574\n旭辉集团\t299575\nRoadmap\t299576\ntheirs\t299577\n深圳法院\t299578\n活捉\t299579\n085211\t299580\n张学津\t299581\n潜皇\t299582\n单机版\t299583\n等价无穷小\t299584\n邱慧芳\t299585\n5部\t299586\n曹纯\t299587\n口袋妖怪银魂\t299588\n【锋范\t299589\n钟保罗\t299590\n虚渊玄\t299591\n三d\t299592\n去年\t299593\n变本加厉\t299594\n自拟\t299595\n卖单\t299596\nobsessed\t299597\n医卡通\t299598\nFaceBook\t299599\nUsborne\t299600\n尿黄\t299601\n七孔\t299602\n不将就\t299603\n塞尔达荒野之息\t299604\n孙志军\t299605\n咖啡拉花\t299606\n盗窃案\t299607\n给水厂\t299608\n肉蛋\t299609\n镖师\t299610\nH110M-K\t299611\n剑突\t299612\nwdatepicker\t299613\n舍\t299614\n万家基金\t299615\n撑场\t299616\n电脑浏览器\t299617\n领海\t299618\n苏州工业园区人民法院\t299619\n北京\t299620\n床面\t299621\n应约\t299622\nArial\t2996
23\n王冲\t299624\nyinhang\t299625\n纳兰容若\t299626\n大全男\t299627\ni57200u\t299628\n用户端\t299629\nhuayuan\t299630\nSWORD\t299631\n地母\t299632\n红豆薏仁\t299633\nSKP\t299634\n车享家汽车养护中心\t299635\nsupreme\t299636\nwhata\t299637\n非织造\t299638\n梳妆台\t299639\n摇钱\t299640\nwow阿古斯\t299641\n伊山镇\t299642\n今生\t299643\n周哥\t299644\n科目\t299645\n措施\t299646\n杨若兮\t299647\n孙杨何超琼\t299648\n威朗大主宰\t299649\n众人\t299650\nTurboTax\t299651\n印度人\t299652\npuppy\t299653\n苏文纨\t299654\n孙丽丽\t299655\n刘振民\t299656\n点集\t299657\ninfographic\t299658\n你的未来\t299659\n英格玛\t299660\n培文杯\t299661\n滨海航母主题公园\t299662\n摩根斯坦利\t299663\nAizawa\t299664\nGo语言中文\t299665\n变阵\t299666\n过敏症状\t299667\n网销\t299668\n陈建伟\t299669\n友谊县\t299670\nX-E3\t299671\n北京朝阳医院\t299672\nEllen\t299673\n建设税\t299674\n襦裙\t299675\n得不\t299676\n高不\t299677\n马上消费金融股份有限公司\t299678\n安飞士\t299679\n马蒂\t299680\n全有\t299681\n虎口\t299682\n超自然\t299683\n北京铁路局\t299684\n悍刀\t299685\n惰\t299686\n矾山镇\t299687\nyouhua\t299688\n水平座\t299689\n点购收藏网\t299690\n隐藏项\t299691\n合成树脂瓦\t299692\n郑洛新国家自主创新示范区\t299693\n主男\t299694\n两下\t299695\n四步\t299696\n攸妍\t299697\n黄埠镇\t299698\nAV動画\t299699\n维生素c泡腾片\t299700\nMG\t299701\n主使\t299702\n海淘关税\t299703\n前一周\t299704\n瑞安人才网\t299705\n丙烯醇\t299706\n华院\t299707\n西安曲江国际会展中心\t299708\n漫画展\t299709\n极速汽车\t299710\nFreeSpider\t299711\n百分之零\t299712\nRelativeLayout\t299713\n村边\t299714\ncat6\t299715\n喉\t299716\n广东省人大\t299717\n138个\t299718\n2.6.4\t299719\n绝地求生大逃杀_绝地求生大逃杀\t299720\ncloudsim\t299721\nSuntory\t299722\n一千多公里\t299723\n5011\t299724\n关楚耀\t299725\n陈欣\t299726\nWhat\t299727\n大坪乡\t299728\n柳沟村\t299729\n金面佛\t299730\nfür\t299731\n区消防\t299732\nHADOOP\t299733\n塑袋\t299734\n没有关系\t299735\n败选\t299736\n足恋\t299737\n胸围\t299738\nBirch\t299739\n桥式\t299740\n魂器\t299741\nSCARA\t299742\n此语\t299743\narsenal\t299744\n防菌\t299745\n神州浩天\t299746\nexcel2010\t299747\n巴里奥斯\t299748\n_淮海网\t299749\n义序\t299750\n上海海尔\t299751\n中国国航\t299752\n案犯\t299753\n译码器\t299754\n601亩\t299755\n湖北地大热能科技\t299756\ndistinguished\t299757\n削\t299758\n丰田威驰\t299759\n死活\t299760\nPowerbuilder\t299761\n良信\t299762\n光武\t299763\n600637\t2
99764\n天天评书网\t299765\n兵临城\t299766\n新能源汽车技术有限公司\t299767\napsc\t299768\n天津海\t299769\n王强\t299770\n封板\t299771\n于大宝\t299772\n库会\t299773\n瑞风M5\t299774\n皓皓\t299775\n风蛇\t299776\n蓝色天梦\t299777\n山东区\t299778\n摇号机\t299779\n杨十六\t299780\n色系军团\t299781\n镀晶\t299782\n吉卜林\t299783\n南京皮研所\t299784\n唯彩\t299785\n越剧\t299786\nX-Particles\t299787\n张琴\t299788\naecc\t299789\n百度作业帮\t299790\nNov\t299791\n压力源\t299792\n面人\t299793\nLAMMPS\t299794\n必选\t299795\n智装\t299796\n103岁\t299797\n羊蹄甲\t299798\n3.3.4\t299799\n中国图书馆\t299800\n老铁山\t299801\n北京麦当劳\t299802\nSuSE\t299803\nt_\t299804\nAVAYA\t299805\n192.168.1.104\t299806\n汽车滤清器\t299807\n玛娜\t299808\nwitch\t299809\n南通中学\t299810\n末裔\t299811\n四点\t299812\n枢木\t299813\nvironment\t299814\n山前大道\t299815\n十支\t299816\n潘麟\t299817\n非政府\t299818\nimmigration\t299819\n街子镇\t299820\n论语译注\t299821\n电信欢go网\t299822\n61_\t299823\n镇原\t299824\n长乐一中\t299825\n抛球\t299826\nPyMySQL\t299827\n任玩堂\t299828\n健力士\t299829\n捕鱼机\t299830\n哪几个人\t299831\n浮式\t299832\n水军\t299833\n二六三\t299834\n乐园化\t299835\n中子源\t299836\n警探\t299837\nunison\t299838\n我财\t299839\n估读\t299840\n陈平原\t299841\n败走麦城\t299842\n2018年上半年\t299843\n姬菲娜\t299844\n棒糖\t299845\n\t299846\n处士\t299847\n本·拉登\t299848\nhongyang\t299849\n春雷\t299850\n700块\t299851\n第二十季\t299852\n神珠\t299853\n校园群芳记\t299854\n万科西华府\t299855\nyiqi\t299856\n噻唑\t299857\n中国远大集团\t299858\n0731.com\t299859\n中山医\t299860\n贞观\t299861\n起承转合\t299862\n英魂外传\t299863\nonDraw\t299864\n多聚赖氨酸\t299865\n潘先生\t299866\nneither\t299867\n东南大学经济管理学院\t299868\ndhea\t299869\n新魔力玩具学校\t299870\n67集\t299871\n独立生活\t299872\n皖赣\t299873\n点数机\t299874\nt95\t299875\n树立\t299876\nMorphVOX\t299877\n中国心理学家网\t299878\nATTACKERS\t299879\n举办\t299880\n卢湾\t299881\nQQ游戏大全\t299882\n华苑产业区\t299883\n生物性\t299884\n润元昌\t299885\n相让\t299886\n文明村\t299887\n六七个月\t299888\n冲孔板\t299889\ngq\t299890\nHaproxy\t299891\n曳引轮\t299892\n熟练工\t299893\n好害怕\t299894\n安安e\t299895\n厄加特\t299896\n死气\t299897\n万方数据知识服务平台\t299898\n阿里巴巴商学院\t299899\nesp32\t299900\n持续性\t299901\n45R17\t299902\nBetty\t299903\n害人害己\t299904\n长江三角洲地区\t299905\n万能充电器\t299906\ncodep
ad\t299907\n善治\t299908\nSk\t299909\n国有企业集团\t299910\n游信\t299911\n强记\t299912\n猫游记\t299913\n附海镇\t299914\n暗地\t299915\nCA002\t299916\n刘继芬\t299917\n油版\t299918\nprizes\t299919\n比亚迪宋\t299920\n白头鹰\t299921\n十月底\t299922\n黑牌\t299923\n新浪微\t299924\n任意键\t299925\n末世凶兵\t299926\n物质\t299927\nHOTEL\t299928\n冷漠\t299929\n积分榜\t299930\n2049年\t299931\n浆液\t299932\neLinux\t299933\n巨鸟\t299934\nswicth\t299935\n拉菲\t299936\nhyperdunk2017\t299937\n朵女郎\t299938\nARCFOX\t299939\nBrexit\t299940\nalina\t299941\n犀牛云\t299942\nYaf\t299943\npdfelement\t299944\n河谷镇\t299945\n火炉山森林公园\t299946\n华为客户服务中心\t299947\n岗位证\t299948\n马吟吟\t299949\nJinping\t299950\n娇点\t299951\n硬管\t299952\n苦参碱\t299953\n云南电网公司\t299954\nleawo\t299955\n智能电视\t299956\n中华花龟\t299957\n易乾\t299958\n盛饭\t299959\n向天再借五百年\t299960\n冷拔\t299961\n言寸草心\t299962\n万多\t299963\n天目湖镇\t299964\n锥体\t299965\n重庆地铁5号线\t299966\n喂养\t299967\nccoo\t299968\n混凝土工\t299969\n夹套\t299970\ntec\t299971\n第138集\t299972\n白若溪\t299973\nx+\t299974\n电蒸炉\t299975\nbo3\t299976\n许玲\t299977\n郑政文\t299978\n瘊子\t299979\n20160626\t299980\nfile.saige.com/upload/2017/\t299981\nQQ技术导航\t299982\n或者说\t299983\n后汉书\t299984\n2018年春节\t299985\n内蒙古草原\t299986\nCuban\t299987\n大陈岛\t299988\n选框\t299989\n鸡蛋豆腐\t299990\n第八回\t299991\nlaf\t299992\n屠龙宝刀\t299993\n忧伤\t299994\n超微粉碎机\t299995\n葛城美里\t299996\n于田县\t299997\n生来\t299998\n4.7.5\t299999\n土地公经纪人大全\t300000\n华为游戏中心\t300001\n嘉湖\t300002\nRae\t300003\n地磅遥控器\t300004\n梦百合\t300005\n锥孔\t300006\n辐射避难所吧\t300007\nkors\t300008\nweakness\t300009\n控方\t300010\n水果派\t300011\n迈克尔乔丹\t300012\n星轮\t300013\n万科东\t300014\n韩国语\t300015\n两全其美\t300016\n2016—2017年\t300017\n生态学报\t300018\n汽油添加剂\t300019\n中图分类法\t300020\n绝杀\t300021\nhtk\t300022\n工作组\t300023\n李蓉蓉\t300024\n和苑小区\t300025\n龙马\t300026\n零境\t300027\n邢台市人民政府\t300028\n微虐\t300029\n3509\t300030\n苏州机场\t300031\ninfocom\t300032\n慢性肾小球肾炎\t300033\n玻璃观景台\t300034\n济公\t300035\nIDEs\t300036\nVarchar\t300037\n金固股份\t300038\n压盖\t300039\n川北医学院附属医院\t300040\n碧湖镇\t300041\n46码\t300042\n2017-2018\t300043\n皱纸\t300044\n喉结处\t300045\n白白白\t300046\n张永祥\t300047\n燃烧炉\t300048
\n草油胶囊\t300049\n张姐\t300050\n安史之乱\t300051\n李乔\t300052\n贺岁双色铜合金\t300053\n四川科技\t300054\nplasmid\t300055\n入口\t300056\n回过头\t300057\ncocoa\t300058\n黄金谷\t300059\n黄坚\t300060\n石家庄蓝天中医院\t300061\nQQ浏览器\t300062\n何志森\t300063\n威固膜\t300064\n360论坛\t300065\n浆机\t300066\n凌神\t300067\n薏米红豆\t300068\ngogotalk\t300069\n4回\t300070\n010期\t300071\n糸\t300072\n逼婚\t300073\n国篇\t300074\n阳光天地\t300075\n北京师范大学文学院\t300076\n0806\t300077\n深闺\t300078\nzzy\t300079\n20171119\t300080\n姆南加古瓦\t300081\n1000cc\t300082\n抵现\t300083\nanycast\t300084\n37周\t300085\n前乳\t300086\nhoneyselect\t300087\n操报\t300088\n彭\t300089\n易错\t300090\nPng\t300091\n53000\t300092\n蒲式耳\t300093\n金像\t300094\n北京佑安医院\t300095\n百分之七\t300096\n穿孔\t300097\n语操\t300098\n龙江颂\t300099\nrendering\t300100\n7ml\t300101\nskewness\t300102\n89分\t300103\n小潮\t300104\n白家庄\t300105\n植保\t300106\n马店\t300107\n声誉\t300108\n进出口有限公司\t300109\n杨林\t300110\n杭州黄龙体育中心\t300111\n非小细胞肺癌\t300112\nflexslider\t300113\n面相手\t300114\n600斤\t300115\n科恒股份\t300116\nf570\t300117\nNinja\t300118\n张良\t300119\n亦舒\t300120\n咸丰皇帝\t300121\n挥斥方遒\t300122\n大屯路\t300123\n溢洪道\t300124\n金色港湾\t300125\n侦探公司\t300126\nedc电音节\t300127\n22w\t300128\n单片机\t300129\n替罪\t300130\n象\t300131\n脚男\t300132\n完整版百度云\t300133\n烟花易冷\t300134\n购房者\t300135\n显子\t300136\n窦文涛\t300137\n阿吉\t300138\n整幅\t300139\n新闻1+1\t300140\n玛姬\t300141\n自付\t300142\n14亿元\t300143\n洛克王国吧\t300144\n京华网校\t300145\n卓大师\t300146\nkung\t300147\nWurth\t300148\nclapton\t300149\n地铁9号线\t300150\nMAC地址修改器\t300151\n随机误差\t300152\n某一页\t300153\n护色\t300154\n英国大使馆\t300155\n麻辣\t300156\n圆号\t300157\ni219\t300158\n薰衣\t300159\n满庭\t300160\n佛力\t300161\n刘和生\t300162\n电话号段\t300163\n苯醌\t300164\n王者荣耀鲁班七号\t300165\nSecond\t300166\n文本聚类\t300167\n轻纱\t300168\n新凯\t300169\n全见版\t300170\n父王\t300171\nDWDM\t300172\n菩提老祖\t300173\n摩尔城\t300174\n铸钢\t300175\n双包\t300176\n中性线\t300177\n汪晖\t300178\n搜讯网\t300179\n用友U8\t300180\n肉细胞\t300181\n桑德斯\t300182\n长山\t300183\nN20\t300184\n11月1号\t300185\n刘力\t300186\n海天翼\t300187\n广东工业大学就业信息中心\t300188\ngeomagic\t300189\n定向培养\t300190\n北京现代汽车\t300191\n导学\t300192\n兹\t300193\n
asa\t300194\n藏族舞\t300195\n刘劲松\t300196\n乐税\t300197\n中国计量学院\t300198\n华为watch2pro\t300199\npinduoduo\t300200\n白雪公主之魔镜魔镜\t300201\n独山港镇\t300202\n晓荷\t300203\n一吻\t300204\n第23季\t300205\n膜结构\t300206\n恤衫\t300207\nrrt\t300208\n調\t300209\n车族\t300210\n香港免税店\t300211\n3.8亿\t300212\n五险一金+\t300213\n知道者\t300214\n雷奥\t300215\n完美邂逅\t300216\n武汉光电国家研究中心\t300217\n兰文云\t300218\n血海\t300219\n虎王\t300220\n三狼奇案\t300221\n兽设\t300222\n地热资源\t300223\n顾大嫂\t300224\n第八区小说网\t300225\n平安365\t300226\n支撑点\t300227\n浙二医院\t300228\nalternate\t300229\nnil\t300230\n朝圣\t300231\nmapState\t300232\nа\t300233\n傅山\t300234\n改良派\t300235\nP2P理财公司\t300236\n碘量法\t300237\n择木而栖\t300238\n体验\t300239\n宜信财富\t300240\n讷河\t300241\nmstsc\t300242\n平步\t300243\n青岛车管所\t300244\n天霸\t300245\n2C\t300246\n572g\t300247\n廓清\t300248\n外祖母\t300249\n邻避\t300250\n0062\t300251\n守株待兔\t300252\n线头\t300253\n卷子\t300254\n电解水\t300255\n2.8t\t300256\ncause\t300257\n討論\t300258\n宏诚\t300259\n火域\t300260\n开车\t300261\n广西壮族自治区公共资源交易中心\t300262\noverdrive\t300263\n共和国之恋\t300264\n2三个\t300265\n500D\t300266\n拍痧\t300267\n体字\t300268\n高升专\t300269\n麻酱\t300270\n罗特\t300271\n四十二\t300272\n劳防\t300273\n澳门葡京\t300274\n何曾\t300275\n鞋店\t300276\n笠翁对韵\t300277\n通天峰\t300278\n军刀\t300279\n临政\t300280\nth\t300281\n英语二\t300282\n无助\t300283\n紫馨\t300284\n胎教仪\t300285\nnanning\t300286\nTransparency\t300287\n孝利家民宿\t300288\n抢夺\t300289\nSBI\t300290\nAaa\t300291\n张庄\t300292\n1688\t300293\n利\t300294\n星相\t300295\n走近\t300296\n仁寿山\t300297\n博汇纸业\t300298\n编法\t300299\n太平洋女性网\t300300\n营业部\t300301\nmegascans\t300302\n万径人踪\t300303\n话机\t300304\n愤\t300305\n信阳职业技术学院\t300306\n艳尸\t300307\n上灯\t300308\n8640\t300309\nshox\t300310\n考公\t300311\n第六座\t300312\nqq币\t300313\nshipment\t300314\nresolver\t300315\n顶新集团\t300316\n男声\t300317\n空中浩劫\t300318\n李明哲\t300319\nDVD-MKV+RMVB\t300320\n卢娜\t300321\n开松\t300322\n两亩\t300323\n明港镇\t300324\n波澜\t300325\n正达\t300326\n擢\t300327\n级差\t300328\n8077\t300329\n扎针\t300330\n电容补偿柜\t300331\n安监办\t300332\n和园小区\t300333\nMC论坛\t300334\n600258\t300335\n周一\t300336\n200毫米\t300337\n華僑網\t300338\n暴走动画中文网\t300339\nW
SDL\t300340\n马东\t300341\n1572\t300342\n伯仲\t300343\n氯醋树脂\t300344\n罢课\t300345\n瑞达利欧\t300346\n养性\t300347\n造假案\t300348\n百联卡\t300349\n多杰\t300350\n东北秧歌\t300351\n添置\t300352\n麻辣杂谈\t300353\n张猫\t300354\n平舆县\t300355\n第A4\t300356\n中国木业资讯中心\t300357\n组合拳\t300358\n丽水站\t300359\n仪\t300360\n向北方\t300361\n64篇\t300362\n山药蛋\t300363\n新蒂\t300364\n跳板\t300365\n伝\t300366\n兰思思\t300367\n改换\t300368\n奶茶加盟网\t300369\n饱食度\t300370\n王洁实\t300371\n浓缩\t300372\n昆明云医大医院\t300373\n澜记\t300374\n王颖\t300375\n穆娅\t300376\n泪雪网\t300377\njill\t300378\n携程客栈\t300379\n87式\t300380\n段灵儿\t300381\n意定\t300382\n新娘鞋\t300383\n中国国防部\t300384\n桃宝\t300385\n懵圈\t300386\n贞观天下\t300387\nmtt\t300388\n六郎\t300389\n1天\t300390\n抚顺经济开发区\t300391\n维力\t300392\n房中术\t300393\n计划性\t300394\n动器\t300395\nHumble\t300396\n流掉\t300397\n市属\t300398\nshoei\t300399\n走样\t300400\n铁树\t300401\n上海市东方医院\t300402\n上海静安\t300403\n早期\t300404\n沉值\t300405\n崴脚\t300406\nFragrance\t300407\n11例\t300408\n比赛稿\t300409\n千帕\t300410\n浴液\t300411\n吧文\t300412\n实测\t300413\n山东外事翻译职业学院\t300414\nesxi\t300415\n解胶剂\t300416\n但丁密码\t300417\n奶浆\t300418\n红云红河集团\t300419\n推荐语\t300420\n植物大战僵尸:花园战争\t300421\n卷款\t300422\n巧格i\t300423\n智能照明控制系统\t300424\n轧机\t300425\n12GB\t300426\n安监局\t300427\nc语\t300428\n长津湖\t300429\n转面\t300430\n广西党委\t300431\n恶补\t300432\ncdna\t300433\n轴箱\t300434\n沙湾路\t300435\n招生季\t300436\naspcms\t300437\n信大捷安\t300438\nums\t300439\n99%\t300440\nNewbee\t300441\ndolby\t300442\neep\t300443\n梁雁翎\t300444\nFastCopy\t300445\n温州市人民政府办公室\t300446\n哥伦布市\t300447\n旅游车\t300448\n事业编吧\t300449\n白鸟樱\t300450\n紫薇星盘\t300451\nqwe\t300452\n冈本多绪\t300453\n国有土地使用权出让合同\t300454\n哥们体育\t300455\n英西峰林\t300456\n两键\t300457\n影拓5\t300458\nyoga730\t300459\nac66u\t300460\n有理有据\t300461\n速跑\t300462\n家用电源\t300463\n上海汽车城\t300464\n桡神经损伤\t300465\n汇信\t300466\n师爱\t300467\n生死局\t300468\n湖笔\t300469\n图森\t300470\n正时\t300471\n融侨馨苑\t300472\n靖江市市场监督管理局\t300473\naeon\t300474\n金华十校\t300475\nALPS\t300476\n指关节\t300477\nDouPHP\t300478\n黄嘉千\t300479\nBD高清1080P\t300480\n苯丙氨酸\t300481\nrev.1.0\t300482\n三替\t300483\n德耀\t300484\n第九名\t300485\n优乐商城\t300486\
n东阳横店\t300487\n特氟龙\t300488\nsdl\t300489\n艺龙\t300490\n广播在线\t300491\n词汇量\t300492\n所向披靡\t300493\n3周岁\t300494\n企划\t300495\n集体所有制\t300496\n争渡\t300497\n品牌化\t300498\nActuators\t300499\nbitcoind\t300500\n地带\t300501\n神级龙卫\t300502\nforged\t300503\nblockquote\t300504\n金属制品有限公司\t300505\n单元格边距\t300506\nbat批处理\t300507\ncp35m-win\t300508\nHandclap\t300509\n宝丰县政府\t300510\n抛光液\t300511\nGlobalSign\t300512\nGuang\t300513\n55篇\t300514\n马赛马拉\t300515\n发箍\t300516\n标题栏\t300517\n大源\t300518\n南京航天航空大学\t300519\nerx5\t300520\nSemtech\t300521\n沈阳红番区\t300522\n文存\t300523\n透支\t300524\nLg\t300525\nwastewater\t300526\n科尼塞克\t300527\n定海政府\t300528\n哈尔滨南岗区\t300529\n巩卫\t300530\n中国社会科学院金融研究所\t300531\n男人才\t300532\n养老地产\t300533\n中骅物流\t300534\nliujiacai\t300535\n苍狼\t300536\n介入放射学\t300537\n壮阔\t300538\n新乡市\t300539\n铜陵\t300540\nTriumph\t300541\n佩莱\t300542\n威海新闻网\t300543\n马头镇\t300544\n铁兰\t300545\n马池口镇\t300546\n顺丰次晨\t300547\n经典型\t300548\n厝边\t300549\n亚急性甲状腺炎\t300550\n空载电流\t300551\n赣美\t300552\n新民街\t300553\n节流阀\t300554\n球头\t300555\n皈依\t300556\nCream\t300557\nsrc/main/java\t300558\n牵伸\t300559\n芽庄机场\t300560\n新版三国吧\t300561\ndj嗨嗨网\t300562\nthermal\t300563\n我方\t300564\n及\t300565\n多宝阁\t300566\nJBT\t300567\nnatura\t300568\n太子党\t300569\n八面山\t300570\njd\t300571\nIR2110\t300572\n红薯中文\t300573\n尽长安花\t300574\n116名\t300575\n伊利公司\t300576\nVR中文网\t300577\n無印良品\t300578\n三井寿\t300579\n第7批\t300580\n翔飞\t300581\n盲法\t300582\nKrabi\t300583\n谋杀\t300584\n眉飞神话版三国\t300585\n杨东煜\t300586\n纪登奎\t300587\nproduce101\t300588\n金怡\t300589\n干点\t300590\n回來\t300591\n骄奢\t300592\n上海将来实业股份有限公司\t300593\n上海龙之梦万丽酒店\t300594\n航天大道\t300595\n330个\t300596\n中国红歌会\t300597\nWinding\t300598\n玩家\t300599\n抽屉式\t300600\n蒂蒂\t300601\n博爱县\t300602\n万泉河\t300603\nv2.8.1\t300604\n截击\t300605\n河北省安全生产监督管理局\t300606\nTiestoRay\t300607\nnor\t300608\nsoildworks\t300609\n412\t300610\n华南碧桂园\t300611\n北京地质大学\t300612\n闻泰\t300613\n史光辉\t300614\n拳皇11\t300615\ni+1\t300616\n永安街道\t300617\nMitchell\t300618\n壶型\t300619\n不然\t300620\n型录\t300621\n屏蔽网\t300622\n荣威w5\t300623\nwenjianjia\t300624\n曲酒\t300625\n
50cc\t300626\n划设\t300627\nSimonLiang\t300628\n20关\t300629\n软斑\t300630\n异地贷款\t300631\n赫德拉姆\t300632\n朗坤\t300633\n腰封\t300634\n望城政府\t300635\n重庆第二师范学院\t300636\n负熵\t300637\nUBER\t300638\n深圳远东妇儿科医院\t300639\n深高\t300640\ngcj02\t300641\n药皂\t300642\n易变\t300643\n潜水服\t300644\n李振宇\t300645\n29层\t300646\n昆明医院\t300647\n换算器\t300648\nExtras\t300649\nShinco\t300650\n陆港通\t300651\n初现\t300652\n意气风发\t300653\n君羊\t300654\n医龙\t300655\n食药监局\t300656\nRAW\t300657\n烫印机\t300658\n力生制药\t300659\n铁山路\t300660\n批改网—秒批作文\t300661\n送餐员\t300662\n广东麻将\t300663\n无与伦比\t300664\n银冠\t300665\n城市猎人\t300666\n住房抵押贷款\t300667\n人教版六年级下册语文配套练习册\t300668\n英中\t300669\n侧向\t300670\n黄蜡\t300671\nunderlined\t300672\n张艳\t300673\n腥气\t300674\n景湖\t300675\n北京高速\t300676\n纽曼\t300677\n教通\t300678\nCrop\t300679\nregeneration\t300680\n离职信\t300681\n垚\t300682\nEO\t300683\nBluebird\t300684\n风机箱\t300685\nUnreal\t300686\n淮河大桥\t300687\nasar\t300688\n紧凑型\t300689\n猛犸象牙\t300690\nPoultry\t300691\n掩门\t300692\n008吧\t300693\n13g\t300694\n97号\t300695\n现金流量表\t300696\nxm2\t300697\n华苑\t300698\n烟酒\t300699\n群尸\t300700\n平安大厦\t300701\n中债资信\t300702\n无损曲库\t300703\n济南市槐荫区人民政府\t300704\n张盛\t300705\n揪出\t300706\n排位战\t300707\n书库\t300708\n球娘\t300709\n新飞飞吧\t300710\n孟鲁司特钠\t300711\n系统吞吐量\t300712\n近光\t300713\n焦油器\t300714\n82岁\t300715\n韩日世界杯\t300716\nScaling\t300717\n不争气\t300718\n空宫\t300719\nEst\t300720\n族库\t300721\n无限极(中国)有限公司\t300722\nSonicWALL\t300723\n8.9.1\t300724\n真的不懂\t300725\n工学类\t300726\nchill\t300727\n水利厅\t300728\n蒸馍\t300729\nHFS\t300730\n海南大学\t300731\n冻酸奶\t300732\n77986329\t300733\n相成\t300734\nID服务器\t300735\n考拉大冒险\t300736\n无码片\t300737\nKwai\t300738\n保姆_保姆中介公司\t300739\n封尘\t300740\nconcur\t300741\nluka\t300742\n睡人\t300743\n两旺\t300744\n11幅\t300745\nя\t300746\n加巴喷丁胶囊\t300747\n倾听\t300748\n劲椎\t300749\n鸡舞\t300750\n埂\t300751\n宋楚瑜\t300752\n信和\t300753\nvespa\t300754\nASMR\t300755\n平均价格\t300756\n白炭黑\t300757\nINA轴承\t300758\n究生\t300759\n峄山\t300760\n握持\t300761\n麓府\t300762\n门框\t300763\n压点\t300764\n黄子韬\t300765\nversion\t300766\n真假难辨\t300767\n献舞\t300768\nHt\t300769\n水产局\t300770\n天
城路\t300771\n兽场\t300772\n广东省社会保险基金管理局\t300773\nMos\t300774\n刘玲玲\t300775\n366天\t300776\nforgotten\t300777\ncrave\t300778\n作对\t300779\nmetrics\t300780\nzt知乎\t300781\n公曰\t300782\n报春花\t300783\n宠物小精灵\t300784\n物联资讯网\t300785\nlifting\t300786\n76.5\t300787\n蝴\t300788\npreventing\t300789\n东莞市国土资源局\t300790\n环氧树脂地坪漆\t300791\n万丰路\t300792\n现金流量净额\t300793\n照会\t300794\n对像\t300795\n拍马屁\t300796\n2018-04-06\t300797\n太谷县\t300798\n娄葑镇\t300799\n太平福禄康瑞\t300800\n牛虻\t300801\n余半\t300802\n沃玛\t300803\nsdcard\t300804\n罗马共和国\t300805\n阿里旅行\t300806\n山东共青团\t300807\n河北航空有限公司\t300808\n美鑫\t300809\nBeloved\t300810\n阿科哥\t300811\n耳饰\t300812\nslavery\t300813\n虚名\t300814\n一目连\t300815\n辽足\t300816\n104路\t300817\n生命之环\t300818\n慢性肺源性心脏病\t300819\n开心影院\t300820\nxp虚拟机\t300821\n谷津\t300822\n亿视丽\t300823\n电单车\t300824\n0.01毫米\t300825\n2016年2月6日\t300826\n天下一家\t300827\n总后\t300828\n夜阑\t300829\n丁苯\t300830\n秘密女\t300831\ntrpg\t300832\n9600GT\t300833\n黄河旋风\t300834\n阳光灿烂\t300835\nSpeeches\t300836\n移动网\t300837\nTou\t300838\n沙坡头区\t300839\n打样\t300840\nmathe\t300841\n守城\t300842\n郁离子\t300843\n太古剑尊\t300844\n港汇中心\t300845\n云湖\t300846\n宠物医院\t300847\nsolo3\t300848\n武昌职业学院\t300849\n2146\t300850\n千珏\t300851\n兄台\t300852\n捷报\t300853\n汽车召回网\t300854\n澳佳宝\t300855\n相国寺\t300856\n门诊表\t300857\n标点\t300858\n鼎捷软件\t300859\n学前教育心理学\t300860\n40D\t300861\n博罗罗浮山\t300862\n购置\t300863\nuVision\t300864\n刘新建\t300865\nReuters\t300866\nMadam\t300867\n000783\t300868\n单帧\t300869\n1000多年\t300870\n80P\t300871\n活泼\t300872\n酬薪\t300873\n卡塔海滩\t300874\n西关\t300875\n益田集团\t300876\n航院\t300877\nweida\t300878\n3261\t300879\n格鲁派\t300880\n整肃\t300881\n浙江长征职业技术学院\t300882\n北京社区\t300883\n云观\t300884\ncommander\t300885\nflask框架\t300886\n互联网浏览器\t300887\n移动2g\t300888\n看房\t300889\n西安高新医院\t300890\n10.13.3\t300891\n嘉祥县人民政府\t300892\n大夏书系\t300893\n插座盒\t300894\nsimple\t300895\n奥林花园\t300896\n吴明\t300897\n外国语言文学\t300898\ngrado\t300899\n鞣制\t300900\nGRC构件\t300901\nSQLServer2005\t300902\n公益片\t300903\n小天才手表\t300904\n升舱\t300905\nremax\t300906\n杭州二中白马湖学校\t300907\n代尔祖尔\t300908\n奉贤中学\t300909\n天行浏览器\t30
0910\n南岸\t300911\nequiv\t300912\n吾将\t300913\n天语sx4\t300914\n儒教\t300915\nIntern\t300916\n1971年\t300917\n风朗\t300918\nv1.0_游迅网\t300919\n新垣\t300920\nbellman\t300921\n成都公司\t300922\n工作分\t300923\n雎鸠\t300924\n天猫Tmall\t300925\nFields\t300926\n金鸟\t300927\n房建\t300928\n匀质板\t300929\n马步\t300930\n磨板\t300931\n鄂尔多斯\t300932\nfragmentation\t300933\n轮流\t300934\n吉之岛\t300935\n优抚\t300936\n国际学院\t300937\n历史学家\t300938\n基本点\t300939\n_整形频道_健客网\t300940\n查詢\t300941\n青岛市区\t300942\nAudition\t300943\n音壁\t300944\nB超机\t300945\ndian\t300946\n丁满\t300947\nbelgium\t300948\n职业病危害\t300949\n掉色\t300950\n镇江南站\t300951\n钟楚红\t300952\ntalent\t300953\n合肥七中\t300954\n京滨工业园\t300955\n2808\t300956\n18份\t300957\n中央第五环境保护督察组\t300958\n石龙子\t300959\n张邦鑫\t300960\n无尽战刃\t300961\ngorm\t300962\n李雪珠\t300963\n侧标\t300964\n狡黠\t300965\n多远_\t300966\n刘进平\t300967\n新浪阳江\t300968\n终点站\t300969\n免下\t300970\n核心机\t300971\n笔记本论\t300972\n2g\t300973\n上海东方航空\t300974\n四行\t300975\n张晓霞\t300976\n殷勤\t300977\n黄豆芽\t300978\n学评\t300979\n空包代发网\t300980\n瓜江\t300981\n65吨\t300982\n夏洛蒂·勃朗特\t300983\nluna2\t300984\n2018-4-16\t300985\n五台山路\t300986\nPHPMYADMIN\t300987\n宁淮城际铁路\t300988\n五节\t300989\n何绍基\t300990\n琦玉县\t300991\n6周年\t300992\n邮洽\t300993\n2018-03-07\t300994\n杨铭宇\t300995\n夜关门欲望之花\t300996\n赤铁\t300997\n蓝焰控股\t300998\n举案齐眉\t300999\n风纪\t301000\n阴魂\t301001\n追梦者\t301002\n高新大厦\t301003\n时月\t301004\nwellness\t301005\n102路\t301006\n淮海中路街道\t301007\n见招拆招\t301008\n阮成\t301009\n酒渣鼻\t301010\n期权池\t301011\n巨牙\t301012\n亳州市委员会\t301013\nJILL\t301014\n黑蔷薇女武神\t301015\n开拓者\t301016\nIMEI串号\t301017\n帝子\t301018\n2016年4月13日\t301019\n偷渡\t301020\nwww.7k7k.com\t301021\n科科\t301022\n假面骑士W\t301023\n歌文\t301024\n东方直播室\t301025\n挺住\t301026\n不化\t301027\n胡润百富榜\t301028\n韩冰\t301029\n裸泳\t301030\n每季\t301031\n心曲\t301032\nWGET\t301033\n花心\t301034\n坠地\t301035\n干漆\t301036\n1.53G\t301037\n寒地\t301038\ngomery\t301039\n雍\t301040\n逼近\t301041\n鬼服\t301042\nPeixun\t301043\ncydo\t301044\nhuisuo\t301045\nDDW\t301046\nsl\t301047\nUnity\t301048\n薛城区\t301049\n20160501\t301050\n军者\t301051\n椰子树\t301052\n公共资源信息网\t301053\n祥
云\t301054\n20160911\t301055\n桃花园\t301056\n2017.12\t301057\n走出来\t301058\n成都西南医院\t301059\n欧歌\t301060\n学到\t301061\nCS35\t301062\n济南市天桥区政府\t301063\n汇龙\t301064\n高栏港\t301065\n显卡显存\t301066\nsinA\t301067\n民泰\t301068\nu3\t301069\n百江\t301070\nmkm\t301071\n由此可见\t301072\n烯烃\t301073\n简笔画画\t301074\ndhcpd\t301075\n二硫化钼\t301076\n军警民\t301077\n道缘\t301078\n年结\t301079\nAnother\t301080\nCobbler\t301081\n陆页\t301082\n八髎\t301083\n舒尔特\t301084\n于鹏\t301085\n魂魄\t301086\n隐窝\t301087\n南昌东站\t301088\n样本股\t301089\n300辆\t301090\n7公里\t301091\nA19\t301092\n砭\t301093\n浙江人民美术出版社\t301094\n男孩儿\t301095\nコスプレ\t301096\n女法\t301097\n剑河县\t301098\n拗\t301099\nnaive\t301100\n斯蒂文斯理工学院\t301101\n假面骑士OOO\t301102\nnsuser\t301103\n荀派\t301104\n肺间质纤维化\t301105\n上品湾\t301106\nHyperMesh\t301107\n符文\t301108\n妈妈装\t301109\nCUBA\t301110\n飞天奖\t301111\n4.0版\t301112\n最新期\t301113\n上海当代艺术博物馆\t301114\n暗沟\t301115\n薅\t301116\n索菲工作室\t301117\n奇偶\t301118\n祭炼\t301119\ndiagram\t301120\n侵权案\t301121\n茶山村\t301122\n锐骐皮卡\t301123\n中央深改组\t301124\n清泉镇\t301125\n美食网\t301126\n中兴能源\t301127\n德淘\t301128\nQtum\t301129\ncmk\t301130\n2015年05月\t301131\n坑洞\t301132\n朱娜\t301133\n训练师\t301134\n细剑\t301135\n第二十四届\t301136\n秋瓷\t301137\n古街\t301138\n姐妹\t301139\n总之\t301140\nWebuploader\t301141\ndbw\t301142\nLoc\t301143\n邹城市\t301144\n阀板\t301145\n混花\t301146\n西安市区\t301147\n中山翠亨新区\t301148\n规划案\t301149\n陈文媛\t301150\nkinki\t301151\n41集\t301152\ntoken\t301153\nBeckman\t301154\nm7400\t301155\n大海航行靠舵手\t301156\n马融\t301157\ndyy\t301158\n5210\t301159\n草莓果酱\t301160\n高秋梓\t301161\n省体育局\t301162\nPIC单片机\t301163\n万花楼\t301164\nur44\t301165\n水浪\t301166\n酸度计\t301167\n西南交通大学\t301168\ngitbash\t301169\n欢脱\t301170\n玫瑰色\t301171\n复核员\t301172\nbets\t301173\n社险\t301174\ncharming\t301175\nQtDesigner\t301176\noracal\t301177\n坎特伯雷\t301178\n柳传赤裸特工\t301179\n许倬云\t301180\nbullying\t301181\n生存战争\t301182\nlitigation\t301183\n江西电信\t301184\n剪不断\t301185\n荣耀6p\t301186\nnitr\t301187\n赏花节\t301188\n洗刷\t301189\n旋转式\t301190\n大唐帝国\t301191\n南明史\t301192\nacro\t301193\n清倒\t301194\n许东\t301195\n鑫圆\t301196\nRelative\t301197\n
veg\t301198\n输电线\t301199\n黑名单\t301200\n斯科达\t301201\n云南省地方税务局\t301202\n直销业\t301203\n汪荃珍\t301204\n电子商\t301205\n君联\t301206\nCage\t301207\n难遇\t301208\njiz\t301209\n国际级\t301210\n测漏\t301211\n法证先锋2\t301212\n日产劲客\t301213\n鼻唇沟\t301214\ndetecting\t301215\n解脲支原体感染\t301216\n乱动\t301217\n想飞\t301218\noumei\t301219\n四倍\t301220\n控制科学与工程\t301221\nPYthon\t301222\n化骨龙\t301223\nhen\t301224\nbrz\t301225\n分栏符\t301226\n亚洲酒店\t301227\n来电显示\t301228\n三苯基膦\t301229\n蓝盾信息安全技术股份有限公司\t301230\nasda\t301231\n逃费\t301232\n老书记\t301233\nAttachment\t301234\n磁栅尺\t301235\n睢\t301236\n杀破狼2\t301237\nI5\t301238\n素面\t301239\n5月27日\t301240\nlore\t301241\n外汇储备\t301242\n复道\t301243\n万和燃气灶\t301244\n登上去\t301245\n知足常乐\t301246\n三步走\t301247\n菱耀\t301248\n残梦\t301249\n半枝莲\t301250\nupstairs\t301251\n五彩池\t301252\n辽宁广播电视台\t301253\nDTT\t301254\nRoutine\t301255\n吴某某\t301256\n詹姆·兰尼斯特\t301257\n20家\t301258\n普罗米修斯\t301259\n标示\t301260\n农业保险\t301261\nipad/\t301262\nfracture\t301263\n深圳市统计局\t301264\n安阳师范学院\t301265\n图层蒙版_\t301266\n做爱片\t301267\n19:00\t301268\n北城新区\t301269\n创作型\t301270\niPhoto\t301271\n600519\t301272\n君山岛\t301273\nSchool\t301274\n四川理工\t301275\n苏伟\t301276\n绝世天劫\t301277\n高自由度\t301278\nMOCVD\t301279\n省社科联\t301280\n齐岳山\t301281\n卡马西平\t301282\n琉璃珠\t301283\n朱权\t301284\n深圳市高级中学\t301285\n9步\t301286\nfear\t301287\n答\t301288\n新番\t301289\ncts6\t301290\nmastering\t301291\n超了\t301292\n帘头\t301293\nhidl\t301294\n受骗\t301295\n媳\t301296\n埃勒里·奎因\t301297\n政学\t301298\n票牛网\t301299\n京津高速\t301300\nbaum\t301301\n立本\t301302\n1卫\t301303\n有道人工翻译\t301304\n右部\t301305\n尿碱\t301306\n修理店\t301307\n歌友会\t301308\n道家阴符派\t301309\n马奴\t301310\n吃完\t301311\n羊角村\t301312\n八度\t301313\naller\t301314\nTENGA\t301315\n羽蛇\t301316\n八髎穴\t301317\n寒亭区\t301318\n_艺龙旅行网\t301319\n大厦\t301320\n抵扣税\t301321\n长安桥\t301322\n锦州新闻网\t301323\n同林鸟\t301324\n神奇Sam\t301325\n王红星\t301326\nPot\t301327\n申请费\t301328\n口炎\t301329\n500名\t301330\ngetPhoneNumber\t301331\n退格\t301332\n1440x900\t301333\n雷锋纪念日\t301334\n32只\t301335\n变节\t301336\n新城社区\t301337\n广州员村\t301338\ntopn\t301339\n六纬路\t301340\n封山\t301341\n
光荣院\t301342\n能干嘛\t301343\n各省市\t301344\n说不过\t301345\n胡夫\t301346\n干部教育培训工作条例\t301347\n帅酷\t301348\n满座\t301349\n钟道隆\t301350\n胡越\t301351\n回吐\t301352\n纯情丫头火辣辣\t301353\n江阴市人大常委会\t301354\n八段锦\t301355\n福克斯论坛\t301356\n安徽工贸职业技术学院\t301357\n来路不明\t301358\n_智\t301359\n二硫化碳\t301360\n阜城镇\t301361\n上海诺基亚贝尔股份有限公司\t301362\n湘泉雅集\t301363\n8794网\t301364\n旺顺阁鱼头泡饼\t301365\nLun\t301366\n400分\t301367\nbetting\t301368\n焯水\t301369\n3DSLL\t301370\n保定市北市区\t301371\n十一条\t301372\n用爱\t301373\n转生\t301374\n11500\t301375\n深港dj俱乐部\t301376\n十成\t301377\nBJCA\t301378\n嫩妹\t301379\n没错\t301380\n赋值运算符\t301381\n疏影横斜水清浅\t301382\n西安杨森制药有限公司\t301383\n仍是\t301384\n叶晓粤\t301385\n站\t301386\n陈岚\t301387\nmaje\t301388\n调压\t301389\n1638\t301390\ncount函数\t301391\n第二十部\t301392\n2017年8月份\t301393\n凤凰社\t301394\n妇科男\t301395\n中银消费金融有限公司\t301396\nv-show\t301397\n纯粹\t301398\nFucker\t301399\n玩世不恭\t301400\n偷窥者\t301401\n神雕威朗\t301402\n焦耳\t301403\n肉叶\t301404\nlzm\t301405\n模拟城市我是市长\t301406\neclipce\t301407\n矩管\t301408\n极品飞车18\t301409\n比亚迪f6\t301410\n杨光\t301411\n高楼村\t301412\n碾轧\t301413\n汉臣氏\t301414\n清蒸排骨\t301415\n理容\t301416\n资兴\t301417\n烧荒\t301418\nCabernet\t301419\n昆池\t301420\n放在\t301421\n孝素\t301422\nSlack\t301423\n玻璃罩\t301424\n中华人民共和国民法通则\t301425\n1.1&#160\t301426\n游牧者\t301427\n热情似火\t301428\n太平洋保险\t301429\n王观\t301430\nlayout\t301431\n进港\t301432\n血眼\t301433\n两阳\t301434\nunfinished\t301435\nedius9\t301436\n无主之地:前传\t301437\n孙龙\t301438\n武穴市人民政府\t301439\n龙阳路站\t301440\nSender\t301441\n树灯\t301442\n熊猫头\t301443\n蜥蜴王\t301444\n第一炉\t301445\nSPORTS\t301446\n张民\t301447\n定安路\t301448\n神经酰胺\t301449\n黄简\t301450\n仓井空\t301451\n五官\t301452\nCer\t301453\n通天仙路\t301454\n捉急\t301455\n科技创新与应用\t301456\ntee\t301457\n彼特·丁拉基\t301458\n交州\t301459\n邑人\t301460\n丙烷脱氢制丙烯\t301461\n庆州\t301462\n来得\t301463\n大西洋\t301464\n综合知识和能力素质\t301465\n经营报\t301466\n捉妖记\t301467\n莱州湾\t301468\n這裡\t301469\n好乐迪\t301470\nphoenics\t301471\n掲示板\t301472\n岁月神偷\t301473\n白脸\t301474\n1543\t301475\n贵阳市国土资源局\t301476\n对口升学考试\t301477\n乌丢丢的奇遇\t301478\n翠羽\t301479\n元始\t301480\n东邻\t301481\nbanbu\t301482\n以不变应万变\t3
01483\n出嫁\t301484\n神探狄仁杰1\t301485\n2月1日\t301486\n鲁能集团\t301487\n1993\t301488\n505路\t301489\nloadrunner\t301490\nsynchronizing\t301491\n叶倾城\t301492\n这一瞬间\t301493\n赌资\t301494\n射击队\t301495\n匹夫\t301496\n爱恋3D\t301497\n金蛇\t301498\n普宁城\t301499\n劳动节\t301500\n死亡飞车4\t301501\nain\t301502\n目覚\t301503\n吴家大院\t301504\n跳楼机\t301505\n蓝蛇\t301506\n取人\t301507\n太阁\t301508\n配线\t301509\npadding-bottom\t301510\npowerbeats2\t301511\n低血压\t301512\n品尚红酒网\t301513\nregions\t301514\n北京市燃气集团有限责任公司\t301515\n赵胜利\t301516\n青石巷\t301517\n高达模型综\t301518\n第六级\t301519\n黄屯\t301520\n幸福照相馆\t301521\n手机号\t301522\n真的爱\t301523\n太子乐\t301524\n安徽省住房城乡建设厅\t301525\n慕尚\t301526\n阿玛迪斯战记\t301527\n回民小学\t301528\n四粒\t301529\n福利屯\t301530\n大神F2全网通\t301531\n灰丝\t301532\nflume\t301533\nKickstarter\t301534\npackaging\t301535\n智课网\t301536\nb360\t301537\nhtts\t301538\n抗组胺\t301539\n篙\t301540\n川流不息\t301541\n里希特\t301542\n佛祖岭\t301543\n4重\t301544\nbrom\t301545\n空间文\t301546\n18183十二\t301547\nTucson\t301548\nTEAM\t301549\n公司型基金\t301550\n张晨曦\t301551\nHA\t301552\n助听\t301553\n炼铜\t301554\n海妖\t301555\nbaishi\t301556\n缩减\t301557\n华商晨报\t301558\n南开大学化学学院\t301559\n四核\t301560\n5.35\t301561\nmoldflow\t301562\n化学与材料科学学院\t301563\n云宫\t301564\n心文学\t301565\n上海论坛\t301566\n权限\t301567\n卡塔库栗\t301568\n第91\t301569\n四倍镜\t301570\n炼制\t301571\n天体浴\t301572\n竹管\t301573\nsinx-cosx\t301574\n胡狼\t301575\n漢\t301576\n爱丽儿\t301577\n19.5\t301578\n循环函数\t301579\nYangtze\t301580\n宽粉\t301581\n铁板鱿鱼\t301582\nRumble\t301583\nconversations\t301584\n大唐不夜城\t301585\n阑珊\t301586\nfromCity}\t301587\n中华颂\t301588\n银丰唐郡\t301589\n_皮匠网3mbang.com\t301590\n一氧化碳\t301591\n长程\t301592\n郑州植物园\t301593\nkfr\t301594\nlinux运维\t301595\n单反数码相机\t301596\n四份\t301597\n童话集\t301598\n亿年\t301599\n罗嗦\t301600\n金阳路\t301601\n懂到\t301602\nsolo2\t301603\n第四\t301604\n补肝\t301605\n北注协\t301606\naeas\t301607\n飞鸟历险记\t301608\n乔司街道\t301609\n反盗版\t301610\n金丹期\t301611\n海银财富\t301612\n面试时\t301613\n北京街\t301614\n家数\t301615\n鬼魅\t301616\nNorton\t301617\nubw\t301618\n中国抗癌协会\t301619\n上海城建职业学院\t301620\n新气象\t301621\n南湖香麓\t301622\n艰\t301623\n电商园\
t301624\n阿册\t301625\n梁红\t301626\n水舞\t301627\n衣长\t301628\n神东煤炭集团\t301629\n郑州动物园\t301630\n8.5cm\t301631\nCNNIC\t301632\n中国歌剧舞剧院\t301633\n综合大楼\t301634\n陈清华\t301635\niqiyi\t301636\n_科教\t301637\n臭屁虫\t301638\n常服\t301639\n190平\t301640\n长沙市工商行政管理局\t301641\n倍感\t301642\nOverride\t301643\n九州吧\t301644\n巴赫曼\t301645\ndongdone\t301646\n不好说\t301647\n投笔\t301648\n凝心\t301649\nHc360慧聪移动网\t301650\nqlineedit\t301651\n深圳实验中学\t301652\n第80届\t301653\n两宗\t301654\n绝缘层\t301655\n嵖岈山\t301656\n9月12日\t301657\nIllustrato\t301658\n7012\t301659\n两栖动物\t301660\n骁骑校\t301661\n三士\t301662\n镖\t301663\n飞鸾\t301664\n7300hq\t301665\n事象\t301666\n明日之丈\t301667\n毛泽少\t301668\n性骚扰\t301669\nBT天堂吧_比特大雄\t301670\ndecorate\t301671\n环装\t301672\n3443\t301673\ndbca\t301674\ndanci\t301675\n灵童\t301676\n绿教\t301677\n美之声\t301678\n精真估\t301679\nrobomongo\t301680\n其后\t301681\n儿行千里\t301682\n黑龙江火山鸣泉\t301683\n啼哭\t301684\n一键锁屏\t301685\n格式工厂版\t301686\n五畜\t301687\n北外附属苏州湾外国语学校\t301688\n宜君\t301689\n赵郁鑫\t301690\n女兔\t301691\n70级\t301692\n炒房客\t301693\n连绵不绝\t301694\n上升期\t301695\n6160\t301696\n心电图仪\t301697\n布袋式除尘器\t301698\n汇通网\t301699\n贝塔星\t301700\nSquirting\t301701\n副院\t301702\n阿绿\t301703\nrequest-\t301704\n雍锦\t301705\n微软亚洲研究院\t301706\n法学概论\t301707\n提出者\t301708\nmelt\t301709\n种植\t301710\n调制解调\t301711\n蓝雨\t301712\n0.57\t301713\n琥珀新天地\t301714\n表出\t301715\n英超\t301716\n刚果金\t301717\n金山岭长城\t301718\n精神分析学派\t301719\n生物药剂学\t301720\n腹外\t301721\n百度云高清\t301722\nyacht\t301723\n﹚\t301724\n成仙之路\t301725\n4.3.7\t301726\n预言者\t301727\n第61届\t301728\n惠利\t301729\n普陀区中心医院\t301730\nHelium\t301731\n申冤\t301732\n决策版\t301733\n放逐之城\t301734\n杨家\t301735\n日程安排表\t301736\n开题报告会\t301737\ntobit\t301738\n空间点\t301739\n艾欧泽亚\t301740\nporter\t301741\n油环\t301742\n鸡苗\t301743\nSerious\t301744\n朴智妍\t301745\n售票处\t301746\n擦笔\t301747\n绝体绝命\t301748\n中式八球国际大师赛\t301749\n言过其实\t301750\npicking\t301751\n笑话故事\t301752\nnirvana\t301753\n张志敏\t301754\n旺达与巨像\t301755\n法国大革命\t301756\n天水日报社\t301757\nslmgr\t301758\n洛伐他汀\t301759\n烧包\t301760\nplsql\t301761\n凯撒大帝3\t301762\nTechRadar\t301763\n极权主义\t301764\nhite
n\t301765\nATLAS\t301766\n诤友\t301767\nFilmImpact\t301768\n4800万\t301769\n杠杆率\t301770\n5200TXT\t301771\nHEIF\t301772\n沙坦类\t301773\n海口联合农商银行\t301774\n审定\t301775\n桀\t301776\n电磁\t301777\n数刀\t301778\n民治大道\t301779\n纪念碑\t301780\n北京)文化传媒有限公司\t301781\n唾弃\t301782\n众投\t301783\n朴人\t301784\n酸困\t301785\n坯机\t301786\n没收录\t301787\n吕文德\t301788\n中国唐山_唐山市政府\t301789\n西安铁路局\t301790\n网红酒店\t301791\n挂卡\t301792\n菱镁\t301793\n托拉姆\t301794\n朗道\t301795\n新澳\t301796\nLiability\t301797\n跨骑车\t301798\n计量经济学导论\t301799\n广东省\t301800\n剑灵灵剑士吧\t301801\nXSS\t301802\n服务器篇\t301803\n开牙\t301804\n一般地\t301805\n堺\t301806\n2011年末\t301807\n爱众\t301808\n电蚊香片\t301809\n巨人网络\t301810\n起亚K3/K3S论坛_汽车之家论坛\t301811\n真惨\t301812\n搜狐传媒\t301813\n串味\t301814\n沈亦臻\t301815\n6000多\t301816\n微辣Video\t301817\n第48期\t301818\n14届\t301819\n中经汇通\t301820\n1242\t301821\n海蜘蛛软路由\t301822\n泡泡机器人\t301823\n颗粒剂\t301824\n孙总\t301825\n管理机\t301826\n找钢网\t301827\n928\t301828\n5240\t301829\n这么多\t301830\n走着瞧\t301831\nyinyu\t301832\n结婚登记照\t301833\nBEG\t301834\n2017-11\t301835\n_祥安阁风水网\t301836\n0622\t301837\n挂科率\t301838\n真切\t301839\n忌日\t301840\n整存\t301841\n漏诊\t301842\nunderneath\t301843\n近意\t301844\n国图\t301845\n开顶\t301846\nRAND\t301847\n054\t301848\n沙豹\t301849\n制御\t301850\n朗姆\t301851\n华领\t301852\njavascript变量\t301853\n山雨\t301854\n施工队\t301855\n反恐精英Online\t301856\nb型血\t301857\n萃取塔\t301858\n说书人\t301859\nbund\t301860\n求生一加一\t301861\n精效\t301862\nconductor\t301863\n4e\t301864\n看她\t301865\n牵引力\t301866\nScienceDaily\t301867\n甲流\t301868\n明生\t301869\n中价协\t301870\n中建国熙台\t301871\n张毅\t301872\n初始投资成本\t301873\n爬绳\t301874\n中医类\t301875\nA股\t301876\n浙江省国税局\t301877\n美容店\t301878\n韩版\t301879\nAster\t301880\nZ1\t301881\n盈亏平衡点\t301882\n一片漆黑\t301883\n针杆\t301884\n挑灯\t301885\n罗湖电子政务网\t301886\n心怀\t301887\n秋雅\t301888\n哲爷\t301889\n张珍贵\t301890\n前氧传感器\t301891\n流阻\t301892\n超星阅读器\t301893\n播求\t301894\ndesign\t301895\n梦幻西游2_爱玩网\t301896\n供认\t301897\n14粒\t301898\n交通警\t301899\nSwinger\t301900\n钱江龙\t301901\n2017-11-24\t301902\n花蒂\t301903\n大胖子\t301904\n葡萄酒网\t301905\nV7.5\t301906\n富养\t301907\n中国核电工程有限公司
\t301908\n英国曼彻斯特大学\t301909\n603899\t301910\nterransforce\t301911\nsyx\t301912\n小胖子\t301913\n郑州十一中\t301914\nEmpathy\t301915\n2018年04月28日\t301916\n66pb\t301917\n12第一次\t301918\n航海士\t301919\n导入\t301920\n320Li\t301921\n周雨\t301922\n公子\t301923\n马光远\t301924\n东映\t301925\n邱少万茜\t301926\n水处\t301927\n审计法\t301928\n搓碟\t301929\n初逢\t301930\n体育大学\t301931\n刘昕\t301932\n碳浆\t301933\n宝鸡中学\t301934\n后备式\t301935\n花苞头\t301936\n阴面\t301937\n福建协和医院\t301938\n命令行用法\t301939\n会者\t301940\n民学\t301941\nz01\t301942\n雁江区\t301943\n手数\t301944\nTOYOTA\t301945\n保利罗兰香谷\t301946\n车享家\t301947\n3板\t301948\nCitra\t301949\nBOTTEGA\t301950\n粗体\t301951\n长见识\t301952\n外管\t301953\ne900-s\t301954\ni5-4590\t301955\ncfo\t301956\n64路\t301957\n步步高家\t301958\n缝缝\t301959\n6415\t301960\n维克\t301961\n6D2\t301962\n朱泥\t301963\n五河县人民政府\t301964\n初出\t301965\n药水\t301966\n全神\t301967\n途昂论坛_汽车之家论坛\t301968\n天府先锋\t301969\nDun\t301970\n497电玩网\t301971\n盐渍化\t301972\npsp3000\t301973\n元气少女缘结神\t301974\n白宇\t301975\n边境杀手\t301976\n火轮\t301977\n线控耳机\t301978\n干冰清洗机\t301979\n拉卡拉\t301980\n质量观\t301981\n主义者\t301982\n正财格\t301983\n狐狸\t301984\n没人爱\t301985\nastyle\t301986\n天线\t301987\n10岁\t301988\n西施\t301989\n精读\t301990\nx64-CSDN\t301991\n南屏镇\t301992\n杨省世\t301993\njtest\t301994\ngr\t301995\nreduce函数\t301996\nED2K\t301997\n今夜\t301998\n输液泵\t301999\n57_\t302000\nBegin\t302001\n现代\t302002\nCelebration\t302003\n哈工大深圳研究生院\t302004\nsdz\t302005\n施尔美\t302006\nX5Max\t302007\n扫描件\t302008\nTimetable\t302009\n公益\t302010\n梅竹\t302011\n试试用\t302012\n莲花湖\t302013\nPaperWeekly\t302014\n孝者\t302015\n真面\t302016\n冷切\t302017\n李光耀\t302018\nwindow对象\t302019\n陈新华\t302020\n震楼器\t302021\n3刀\t302022\n键桥\t302023\n混合料\t302024\n公称\t302025\n小磨香油\t302026\n推拉式\t302027\n中国汽车技术研究中心有限公司\t302028\n公制\t302029\n无梦\t302030\n邪派\t302031\n耳孔\t302032\n知心\t302033\n天姥山\t302034\nq460\t302035\n职业能力测试\t302036\n250_\t302037\n乙巳\t302038\nAskMen\t302039\nゲイ動画配信サイト\t302040\n酷狗繁星网\t302041\nremi\t302042\n洗扫\t302043\nTar\t302044\nclsid\t302045\n生发剂\t302046\nClip\t302047\n4.35\t302048\nluoqiu\t302049\n横琴自贸区\t302050\n李乐\t30
2051\n麻辣隔壁\t302052\n3Dmark\t302053\n留\t302054\n史蒂芬周\t302055\n高压版\t302056\n协助\t302057\n保持性\t302058\n金毛\t302059\n佛山日报\t302060\n卡梅隆安东尼\t302061\nDns\t302062\n传送门\t302063\ncoastal\t302064\n铜冶\t302065\n钢屋架\t302066\n中国电动车联盟\t302067\n一小时左右\t302068\n指导师\t302069\n大锅饭\t302070\n幽灵战士2\t302071\n干发\t302072\nSICOLAB\t302073\n中华五岳联盟\t302074\n无限长\t302075\n20160419\t302076\n凉皮\t302077\n仲\t302078\n正反馈\t302079\nIris\t302080\n中能东道集团\t302081\nShunde\t302082\nTxt电子书\t302083\n评级\t302084\n_肌肉网\t302085\n墨垫\t302086\n尹氏\t302087\n玻璃险\t302088\n佛吉亚\t302089\n单侠\t302090\n长江海事局\t302091\n莱茵河\t302092\n铭基\t302093\n【欣\t302094\nyoua\t302095\n油器\t302096\n海贼团\t302097\n灿白\t302098\n黄种\t302099\n三相电功率\t302100\nabc\t302101\n越秀\t302102\n国音\t302103\nGiven\t302104\n60小时\t302105\n经典语\t302106\n尺骨\t302107\n网吧经营\t302108\n融合式\t302109\n海洋食品网\t302110\n天津医科大学口腔医院\t302111\n红荔\t302112\n茶勺\t302113\n法安\t302114\n名思教育\t302115\n广安友谊中学\t302116\ndlm\t302117\n中大银泰城\t302118\nucenter\t302119\nCRL\t302120\n十杯\t302121\n阿尔及利亚\t302122\n硫酸钾复合肥\t302123\n搬家公\t302124\n五五开\t302125\n任建新\t302126\nurlconnection\t302127\n金博士\t302128\n屯堡\t302129\n15r\t302130\n肋排\t302131\n北京积水潭医院\t302132\n窄版\t302133\n百合园\t302134\n桃李春风\t302135\n农会\t302136\n虚拟币收录网\t302137\nStat\t302138\n虚拟类\t302139\n吉特\t302140\n像差\t302141\n春末\t302142\n医疗卡\t302143\n滑坡体\t302144\nOpenJudge\t302145\n映山\t302146\n杨婵\t302147\n狄琪儿\t302148\n露伴\t302149\n堆外内存\t302150\n18.0\t302151\n胡萝\t302152\n2箱\t302153\nCIR\t302154\n六十日\t302155\nLLVM\t302156\npant\t302157\n阿倪\t302158\n桔子酒店\t302159\nHoly\t302160\n五粮液集团有限公司\t302161\n神山镇\t302162\nEndv\t302163\n绪\t302164\n夜莺的歌声\t302165\n49.9元\t302166\n止逆阀\t302167\n护坡\t302168\n洋河镇\t302169\njrtplib\t302170\n蓬溪\t302171\n离席\t302172\n青岛海尔股份有限公司\t302173\n星露谷物\t302174\n韩一\t302175\nmura\t302176\n鸟群\t302177\nepithelial\t302178\n李武\t302179\nnaoh\t302180\nBrazil\t302181\n亮星\t302182\n交响诗\t302183\n炊事班的故事\t302184\n男子\t302185\nStu\t302186\n入宅黄道吉日\t302187\n广西电视台\t302188\nSantos\t302189\n完美无缺\t302190\n粗化\t302191\n假面骑士ghost\t302192\n清香木\t302193\nbestiality\t302194\ncrawler4j\t30219
5\n冒险岛蘑菇神社\t302196\n昂贵\t302197\ncgl\t302198\nhaaic\t302199\n五台\t302200\n声远\t302201\n纬线\t302202\n处女情结\t302203\n东沟\t302204\n种业\t302205\nfscapture\t302206\n催单\t302207\n叶久久广场舞\t302208\n罗纳德\t302209\n徐迟\t302210\n鲁信\t302211\n十子\t302212\n后羿\t302213\n学而思在线\t302214\n爽歪\t302215\n鸿鹏\t302216\nuClinux\t302217\n郑州轻工业学院\t302218\n休闲鞋\t302219\n磷酸铝凝胶\t302220\n篆刻家\t302221\n王一彤\t302222\n党会\t302223\n终版\t302224\n恒热热水器\t302225\nsirna\t302226\n深圳博物馆\t302227\n步履蹒跚\t302228\nqq个性签名\t302229\n博宇\t302230\n192.168.1.38\t302231\n海南省食品药品监督管理局\t302232\n年化利率\t302233\n五千年前\t302234\n蜂疗\t302235\n韶钢松山\t302236\nguitar\t302237\nees\t302238\n书脊\t302239\n贝聿铭\t302240\n获奖\t302241\n468号\t302242\n恒大世纪城\t302243\n1000天\t302244\n客来\t302245\n邮编号\t302246\n汉字符\t302247\neosdac\t302248\n武汉人\t302249\n卢德铭\t302250\nl565\t302251\n演唱会\t302252\n104期\t302253\n永和村\t302254\n畅欣\t302255\n国产车\t302256\n政府办公厅\t302257\n玉衡\t302258\n菜鸡\t302259\n浙江理工大学\t302260\nYelky\t302261\n国军标\t302262\n上千年\t302263\ngeneve\t302264\n割引\t302265\nAV片\t302266\nuc浏览器\t302267\n李邕\t302268\n推己及人\t302269\n160部\t302270\n世茂江滨花园\t302271\n疾风步\t302272\n加热盘\t302273\n团结社区\t302274\n北土城\t302275\n起翘\t302276\n1971\t302277\n柏涛菲诺\t302278\nIaaS\t302279\n567GO\t302280\n一部片\t302281\n彩云飞\t302282\n华夏万家\t302283\ngudi\t302284\nLoadrunner11\t302285\nxlt\t302286\nCOD4\t302287\n崔志刚\t302288\n承认书\t302289\n招骋\t302290\n薛蛮四大名捕\t302291\n急救证\t302292\n护理人\t302293\n美乐蒂\t302294\n德润\t302295\nPersistent\t302296\n台北桃园机场\t302297\n第33关\t302298\n自由_\t302299\n亚马逊A+\t302300\n为明\t302301\nKira\t302302\n9&#160\t302303\n十二周\t302304\n淡薄\t302305\n苍溪论坛\t302306\nofficially\t302307\n笔类\t302308\ndn40\t302309\nSurfaceView\t302310\n谈情解爱\t302311\nCourier\t302312\nqtdesigner\t302313\n大空头\t302314\n主屏幕\t302315\n指型\t302316\ncheese\t302317\n9.0.5\t302318\n有发\t302319\n医疗保险金\t302320\n2273\t302321\n大湘网\t302322\n张晗\t302323\n老卫\t302324\n礼者\t302325\nrapper\t302326\n云南广播电视台\t302327\n小屋\t302328\n科普剧\t302329\nJocelyn\t302330\n董姿彦\t302331\n胶套\t302332\n蓝妹\t302333\n双股\t302334\ntamara\t302335\n刘静怡\t302336\n维氏硬度计\t302337\n毕露\t302338\
n战斗群\t302339\n速动比率\t302340\nfighters\t302341\n龙岗大道\t302342\n令人窒息\t302343\n回风巷\t302344\n纪中\t302345\n太嗨\t302346\n徐汇区幼儿园\t302347\nburp\t302348\n天可汗\t302349\n德拉克\t302350\n破企\t302351\n芬雷尔\t302352\n不亮\t302353\njbod\t302354\n65元\t302355\nfpc\t302356\n东坡肉\t302357\n白首不分离\t302358\n媚儿\t302359\n刘化龙\t302360\n来凤\t302361\n露露\t302362\n庄稼\t302363\n独孤九剑\t302364\n可做\t302365\nSNOW\t302366\nopencv2\t302367\n2015.01\t302368\n网戒\t302369\n中国料理\t302370\n反刍动物\t302371\n一揽子\t302372\n柴达木\t302373\n佛手瓜\t302374\nBootstrap3\t302375\nstate\t302376\n咎狗\t302377\nEIO\t302378\n龙阙\t302379\nsucai\t302380\n伊斯兰堡\t302381\n出了事\t302382\n姜珂卓航\t302383\n回顾\t302384\n酸甜苦\t302385\n姚鼐\t302386\n城子\t302387\n复式记账法\t302388\n瑞风S5\t302389\n见刊\t302390\n武定县\t302391\n亿帆\t302392\n丹吉尔\t302393\n知识论\t302394\n清肤\t302395\n神侃\t302396\n富马酸酮替芬片\t302397\nImageButton\t302398\nPuppies\t302399\n骨胶原\t302400\n欢子\t302401\n血斑\t302402\n保险法\t302403\n二生\t302404\nserverlet\t302405\n适中\t302406\n巨磁\t302407\n舍己救人\t302408\n中粮期货\t302409\n_贝\t302410\nSantander\t302411\n云鹤\t302412\n防抱死\t302413\n魅族Pro6\t302414\n团子大家族\t302415\n临床症状\t302416\n门将\t302417\n55万元\t302418\n少年英雄方世玉\t302419\n同仁县\t302420\n金正淑\t302421\n厦门法院\t302422\n寡居\t302423\n4.5小时\t302424\n玩耍\t302425\n黄红\t302426\n辣肉\t302427\n铜锣湾\t302428\n石板滩镇\t302429\ncmn\t302430\n韩琦\t302431\nLocations\t302432\n宜宾市第一人民医院\t302433\nboulder\t302434\n百捷\t302435\n破涕为\t302436\nh2r\t302437\nWin10系统之家_Win10\t302438\n中国机械设备工程股份有限公司\t302439\namazeui\t302440\n产电\t302441\n2910\t302442\nAcceptable\t302443\n清晰\t302444\ntfs\t302445\n塞利格曼\t302446\nhts\t302447\n冶溪镇\t302448\n战国兰斯吧\t302449\n肾区\t302450\n2592\t302451\n眷顾\t302452\n日报社\t302453\nwsl\t302454\n千库\t302455\n果园村\t302456\n中华人民共和国刑法修正案\t302457\n大学新生\t302458\nsuspended\t302459\n长盈精密\t302460\n赛乃姆\t302461\nBay\t302462\n吉林联通\t302463\n哈加莎\t302464\n3542\t302465\n三小\t302466\ncdl\t302467\nTask2\t302468\n慰藉\t302469\n国家电力投资集团公司\t302470\n李妍\t302471\n300Mbps\t302472\n危机感\t302473\n工程力学专业\t302474\n十七届四中全会\t302475\n麻省理工大学\t302476\n中国共产党党务公开条例\t302477\n北横\t302478\n甲状腺\t302479\n杨蔓\t302480\nval
grind\t302481\n怪奇\t302482\n老物\t302483\n尿座\t302484\n浙大西溪校区\t302485\nHomekit\t302486\n方永祥\t302487\n廊坊开发区\t302488\n唇妆\t302489\n不一定\t302490\n雅阁论坛_汽车之家论坛\t302491\n170平\t302492\n音乐喷泉\t302493\nstreak\t302494\n拉芳\t302495\n奶奶们\t302496\n预谋\t302497\nfisheye\t302498\n雪域雄鹰\t302499\n明宣德\t302500\nVa\t302501\n开跑\t302502\n陈锋\t302503\n可以避免\t302504\npuyang\t302505\n20000吨\t302506\n氢化植物油\t302507\n农闲\t302508\n圣诞颂歌\t302509\n保证期\t302510\n落雁\t302511\n蔡威泽\t302512\n极光\t302513\n警告处分\t302514\n100万条\t302515\n上海申康医院发展中心\t302516\n超声波_\t302517\n7速\t302518\nkirikiroid2吧\t302519\n大朗镇\t302520\nReturns\t302521\n具塞\t302522\ncomico\t302523\n白叶荣\t302524\n岭北\t302525\n誓者\t302526\nstm32定时器\t302527\n马尔代夫酒店\t302528\n自拍\t302529\n小叮\t302530\n考古发掘\t302531\nppt道客巴巴\t302532\nC语言程序设计\t302533\nhazard\t302534\n硅酸钙板\t302535\nmide\t302536\n重师\t302537\nJavascript模块化编程\t302538\n婷儿\t302539\n650d\t302540\n艾牛\t302541\n艾堂\t302542\nINTEL\t302543\n李丹丹\t302544\nH20\t302545\n佟健\t302546\n游戏梗\t302547\nnw336\t302548\n滨江公园壹号\t302549\nrounding\t302550\n智能科技公司\t302551\n录音器\t302552\nlags\t302553\n赛绩\t302554\n龙颜\t302555\n51劳动节\t302556\n60届\t302557\n方亮\t302558\n铜\t302559\n博文_科学网\t302560\n冷轧带钢\t302561\n武警黄金部队\t302562\nE4\t302563\n大劫\t302564\n索罗吕布\t302565\n大道之行也\t302566\n询价采购\t302567\n尹\t302568\n卡恩\t302569\n彩虹6号\t302570\n聂宇航\t302571\n易安卓\t302572\n帕米尔\t302573\n叶臣\t302574\n范家村\t302575\nyyy\t302576\n耗电\t302577\n三国无双4\t302578\n31座\t302579\nxmin\t302580\n第二十九条\t302581\n裸舞\t302582\n工商业\t302583\n重庆建工集团\t302584\n529\t302585\n幸福花\t302586\n50余万\t302587\n集宁一中\t302588\n威克多\t302589\n四代目\t302590\nLamar\t302591\n卢俊卿\t302592\nwoven\t302593\n寂静岭3\t302594\n驱鼠器\t302595\n卷舌\t302596\n红糖鸡蛋\t302597\nminicooper\t302598\n秦子墨\t302599\n3306\t302600\n经济增长极\t302601\n雁翔路\t302602\n绝地枪王2\t302603\n北爱尔兰\t302604\n法定休假日\t302605\n神秘主义\t302606\n蜜丝佛陀\t302607\n干果类\t302608\n罗技g633\t302609\nsuid\t302610\nTPE\t302611\n县安监局\t302612\n上井\t302613\n还记得\t302614\n2.4\t302615\n舒痕胶\t302616\ncb400\t302617\nsava\t302618\n杜瓦瓶\t302619\n阿依莲\t302620\n定形\t302621\n竹林镇\t302622\nhdmi分配器\t302623\nqr\t
302624\nzope\t302625\n棉裤\t302626\n秀婷\t302627\nhaha\t302628\n音界\t302629\n郭柯\t302630\n风痕\t302631\nlls\t302632\n南北花木网\t302633\n京港\t302634\n克伦特\t302635\n专科学院\t302636\nVILLAGE\t302637\n歐\t302638\nPAG\t302639\n17.8\t302640\n2015年11月21日\t302641\n全婐\t302642\n萧江镇\t302643\n纳米氧化锌\t302644\n瑞星网\t302645\n1428\t302646\n上呼吸道感染\t302647\n养老金\t302648\n南泉\t302649\n长沙社区\t302650\n28分\t302651\n三号机\t302652\n夸夸其谈\t302653\n本国\t302654\n云投集团\t302655\n自媒体\t302656\n思考_参考网\t302657\n同仁医院\t302658\n灭门惨案之孽杀\t302659\n急迫\t302660\n皮肤病性病\t302661\n硫磺粉\t302662\nBaltimore\t302663\n铭鑫\t302664\nnomedia\t302665\n第40话\t302666\n新誉\t302667\n基础类\t302668\n死板\t302669\n少儿台\t302670\n鄂城\t302671\n实验员\t302672\n峰峰三号\t302673\n部令\t302674\n悬浮剂\t302675\n帝宝\t302676\n评书网\t302677\n四驱兄弟\t302678\n高铁余票|张数-114票务网\t302679\n0357\t302680\n321go\t302681\n七星灯\t302682\n学一做\t302683\n能量源\t302684\n编委会\t302685\n烤鱼片\t302686\n大力菠菜\t302687\n爱先锋网\t302688\nskam\t302689\n第一少\t302690\n700D\t302691\n孟华\t302692\nInvestors\t302693\naccessing\t302694\n武汉市纪委监察局\t302695\n西北方\t302696\n便利蜂\t302697\n犬证\t302698\n江西科技师范学院\t302699\nnavbar\t302700\ngtx580\t302701\n国检集团\t302702\n新疆人民医院\t302703\n陶院\t302704\n西商\t302705\ncrm\t302706\npyramid\t302707\n280x\t302708\n智优\t302709\n原盘\t302710\nparticulate\t302711\n金奥\t302712\nMikroTik\t302713\n玉壶\t302714\n仿真度\t302715\n广东省安全生产监督管理局\t302716\n骚痒\t302717\n2017~2018年\t302718\n增高垫\t302719\n每页\t302720\n禁术\t302721\n深鉴科技\t302722\nFlowers\t302723\n魏东\t302724\n20151008\t302725\n天贵\t302726\n薄膜袋\t302727\nofc\t302728\n草包网\t302729\n九月九\t302730\nQSS\t302731\nwinpy\t302732\nrouter4\t302733\n阻滞\t302734\nAnswering\t302735\nelem\t302736\n运财\t302737\n98um\t302738\n刑事诉讼规则\t302739\n常用件\t302740\n十万毫升\t302741\n21.5英寸\t302742\n国税网\t302743\n围巾\t302744\nzyc\t302745\n维库\t302746\n深业东岭\t302747\nawfully\t302748\n人教版小学四年级下册语文\t302749\n辛庄村\t302750\n游荡者\t302751\nB网\t302752\npmdg\t302753\nx^3\t302754\n周浦镇\t302755\n内存控制器\t302756\n中王\t302757\n一览众山小\t302758\n重力场\t302759\n春香\t302760\nBillboards\t302761\n立夏节\t302762\nmagazines\t302763\n冯文\t302764\n福彩3d\t302765\n
兰海\t302766\n琳琅\t302767\n胶凝\t302768\nPerkinElmer\t302769\narima\t302770\n宋玮\t302771\nBBS\t302772\n争强好胜\t302773\n刘庄\t302774\n龙治民\t302775\nRaging\t302776\n遂昌\t302777\n猎学网\t302778\n五次\t302779\n东围\t302780\n家电下乡\t302781\n拉锁\t302782\nAmbitious\t302783\nBUFF\t302784\n105.7\t302785\n橡胶味\t302786\nsmiley\t302787\n合泰\t302788\n高纯石英砂\t302789\n桂林银行\t302790\n恶魔幸存者2\t302791\n28家\t302792\n调查性\t302793\n硬盘分区\t302794\nlsf\t302795\n惜时\t302796\nScrew\t302797\n润东汽车\t302798\n上升级\t302799\n握好\t302800\n宏病毒\t302801\n激电\t302802\n道砟\t302803\nZEN\t302804\nxwork\t302805\n5.04\t302806\n1689年\t302807\n高士\t302808\nvue模板\t302809\n英甲\t302810\n水俣病\t302811\n排列整齐\t302812\n卧轴\t302813\n达叔\t302814\n辛勤\t302815\n业务\t302816\nG16\t302817\n娄山中学\t302818\n义乌市\t302819\n标头\t302820\nCERTIFICATE\t302821\n3gp格式转换器\t302822\n拉曼光谱仪\t302823\n两比\t302824\n奸相\t302825\n十环\t302826\n哲人\t302827\n西丽镇\t302828\n卡地亚蓝气球\t302829\n河南师大\t302830\n火海\t302831\n0318\t302832\nPQ\t302833\n周志\t302834\n普通杯\t302835\n劳顿\t302836\n一大步\t302837\n18183fgo\t302838\n保有率\t302839\n喇叭口\t302840\n香魂女\t302841\n银耳莲子汤\t302842\n班班\t302843\n孔庆东\t302844\n贾诩\t302845\n四川省国有资产监督管理委员会\t302846\n强平\t302847\n七下\t302848\n多莉\t302849\n齐天大圣孙悟空\t302850\n群晖DS218+\t302851\n精忠岳飞\t302852\n沙头村\t302853\n浙江大学医学院附属第二医院\t302854\n应战\t302855\n旭峰\t302856\n朝阳花园\t302857\n大秧歌\t302858\n活畜\t302859\n婵娟\t302860\n梦幻祭\t302861\n名侦探柯南漫画\t302862\n一章节\t302863\n隆基泰和集团\t302864\n39号\t302865\n两鬓\t302866\n瓶塞\t302867\n隐晦\t302868\n取保候审申请书\t302869\n4399迷你世界\t302870\nfaxing\t302871\n丝套\t302872\n秋羽\t302873\n先遣\t302874\n中华人民共和国监察部\t302875\n春晖园\t302876\nparam\t302877\n官\t302878\n百联股份\t302879\ngraylog\t302880\n提额\t302881\nZXHN\t302882\n阿坝\t302883\nAOI\t302884\n高州市\t302885\n西奥多\t302886\n9521\t302887\n协议栈\t302888\n乙酉年\t302889\n愣是\t302890\n广州恒大淘宝足球俱乐部\t302891\n周杨\t302892\n西安机场\t302893\n李晓波\t302894\ntob\t302895\n老豆腐\t302896\n几十种\t302897\n深圳东部华侨城大峡谷\t302898\n大乐斗\t302899\n大骂\t302900\nEasyRecovery\t302901\n风险报酬率\t302902\n忠臣\t302903\n1817\t302904\n博爱医院\t302905\nBboy\t302906\n看不好\t302907\n华为应用市场|华为视频安卓\t302908\n兰天\t302909\n拄
\t302910\n华东师范大学出版社\t302911\n报栏\t302912\n中哈\t302913\n短篇辣文合集\t302914\n拓野\t302915\n唱脸谱\t302916\n广东省农村信用社\t302917\n宿迁市委\t302918\n阳江\t302919\n金明区\t302920\n准儿\t302921\n大员证\t302922\n离泊\t302923\nTv1995网\t302924\n德州市人民医院\t302925\n八所\t302926\n跨时\t302927\narchetypes\t302928\n夏茉\t302929\n全盛时代\t302930\n实弹射击\t302931\n武威\t302932\n血库\t302933\n珠影星光城\t302934\n换元法\t302935\n阿斯伯格\t302936\n龙之梦\t302937\n2024\t302938\n无锡电信\t302939\n圆内接\t302940\n上海自然博物馆\t302941\n扶华\t302942\nkubernets\t302943\n5000年\t302944\n山美\t302945\n实施方案\t302946\n骏丰频谱\t302947\ndexter\t302948\n衣阿华\t302949\nOMV\t302950\n研策\t302951\ntangle\t302952\n借给\t302953\n美骑网\t302954\n长江时评\t302955\n油耳屎\t302956\n电镀添加剂\t302957\n五股\t302958\nXplay6\t302959\n井野\t302960\n鸿途\t302961\n北资\t302962\n冲红\t302963\n寒门悍匪\t302964\nextundelete\t302965\n8028\t302966\n金属棒\t302967\n重力感应\t302968\n47种\t302969\n北沿江高铁\t302970\nshellexecute\t302971\n演讲辞\t302972\n寸土\t302973\n仁人志士\t302974\n0xe06d7363\t302975\n5.75\t302976\n有趣\t302977\n球刀\t302978\ndialux\t302979\n伏天氏\t302980\narmeabi-v7a\t302981\n日错\t302982\n0371-12340\t302983\n希赛教育\t302984\n第2016\t302985\n米尔科技\t302986\n来啦\t302987\nZ68\t302988\n军营村\t302989\n散热片\t302990\nfumen\t302991\nswprintf\t302992\nterrible\t302993\n小蕾\t302994\n北京证监局\t302995\n小算盘\t302996\n露雨\t302997\n小米无线路由器\t302998\n魅蓝E3\t302999\n宫地蓝\t303000\n华懋\t303001\n职表\t303002\n澳元\t303003\n基思\t303004\n关智斌\t303005\n2.5S\t303006\n铁皮\t303007\n艾露尼斯\t303008\n20161017\t303009\n丹顿\t303010\n梦幻足球联盟\t303011\niot\t303012\n皮蝇\t303013\nzhidao\t303014\n安工大\t303015\n英属哥伦比亚大学\t303016\n孕妇贫血\t303017\n省考\t303018\n1034\t303019\n上海儿童医学中心\t303020\n插层\t303021\n刘忠宝\t303022\n7988元\t303023\n雷曼股份\t303024\njavascripts\t303025\n审计学专业\t303026\n貴\t303027\nPro7\t303028\n夜鹭\t303029\n浙江大学软件学院\t303030\n美孚\t303031\n积分墙\t303032\n毒瘤\t303033\n玉皇\t303034\n中框\t303035\n黄金网\t303036\nr^2\t303037\n成熟度\t303038\n#65533\t303039\n江口\t303040\n第159章\t303041\n嫁衣\t303042\n无套\t303043\n10多年\t303044\npyspider\t303045\n2x-3\t303046\n巴莫特\t303047\n夜半\t303048\n港澳台地区\t303049\n厦门市第一医院\t303050\n小米机器人\t303051\nDaddi
es\t303052\n关天培\t303053\naddpath\t303054\n硬度块\t303055\n有道理\t303056\n小米手机Miui9\t303057\n正弘中央公园\t303058\n二零三\t303059\nhelldivers\t303060\n催泪瓦斯\t303061\n湖南汽车工程职业学院\t303062\n龙邦物流\t303063\nv140\t303064\n资本方\t303065\n普能\t303066\n柳城县\t303067\n干柴\t303068\n北京干部教育网\t303069\n1394\t303070\n仓配一体化\t303071\n长理\t303072\n遮阳布\t303073\n无限极\t303074\n空气柱\t303075\n安之若素\t303076\n绞丝\t303077\n求合\t303078\n彻\t303079\n五莲\t303080\n情怨\t303081\ntplink\t303082\n四年级数\t303083\n龙珠软膏\t303084\nMotionVFX\t303085\n11艘\t303086\n柠檬皮\t303087\n阿志\t303088\nLIULIULIU666\t303089\n活度系数\t303090\n陈情令\t303091\n/3\t303092\n七轴\t303093\nPropertyGrid\t303094\nworkon\t303095\n1.doc_\t303096\nMEF\t303097\n创面\t303098\n黑袍\t303099\n刘宋\t303100\nPredator\t303101\n地理所\t303102\nTungsten\t303103\n孔维\t303104\n外轨\t303105\n希特勒\t303106\n远近\t303107\n一操\t303108\n天津滨海国际机场\t303109\n小米无线充电器\t303110\n100多万\t303111\n活动周\t303112\n优帕克\t303113\n龙皓晨\t303114\n量贩\t303115\n反向\t303116\n杆子\t303117\nt11\t303118\n35592\t303119\n美丽乡村\t303120\n3分米\t303121\n潜泳\t303122\n表现出\t303123\n偏态\t303124\n浙商期货\t303125\n妙思乐\t303126\n口风琴\t303127\n_科易网\t303128\n第一线\t303129\n神色\t303130\n2015年11月27日\t303131\n鑫旺\t303132\n恒兴\t303133\n上海幼儿园招生_上海幼儿园\t303134\n南京国际学校\t303135\n2017年7月5日\t303136\n摥管\t303137\nPowe\t303138\n漠河\t303139\n高能\t303140\n假面骑士铠武\t303141\n百灵杯\t303142\n搅屎棍\t303143\n汇票\t303144\n香港公司\t303145\n四川电力\t303146\n客软\t303147\n察隅县\t303148\n托轮\t303149\n自然之声\t303150\n兴宁区\t303151\n吴泾镇\t303152\n20161128\t303153\n生育率\t303154\n20亿\t303155\n10a\t303156\n045\t303157\n电子目镜\t303158\n9343\t303159\n活水机\t303160\n黄毅\t303161\n焦桐\t303162\n吉林省第二实验学校\t303163\n骂战\t303164\n洋葱\t303165\nIphone8\t303166\n服务端\t303167\n1.4529\t303168\n7k7k奥比岛\t303169\n可使\t303170\n顺德区国土城建和水利局\t303171\n征求意见稿\t303172\n旭飞\t303173\n10路\t303174\n反潜\t303175\nTasks\t303176\n华严寺\t303177\n罗宾娜美\t303178\n95530\t303179\n小地主\t303180\n针织机\t303181\n7星级\t303182\n煲耳机\t303183\n6e\t303184\n临潭\t303185\n继父母\t303186\nglucosamine\t303187\nopt\t303188\n素力\t303189\n四川电网\t303190\n韩娜\t303191\n网易MuMu论坛\t303192\nboxsnake\t303193\n迈阿密风云\t
303194\n老掌沟\t303195\n一炷\t303196\nNolan\t303197\nBitDefender\t303198\n余杭高级中学\t303199\n2015.06\t303200\n量子纠缠\t303201\n尼康D5200\t303202\n长文\t303203\n长远\t303204\n电视墙\t303205\n摊位\t303206\n色彩\t303207\n路边\t303208\ncm11\t303209\n简谐波\t303210\n微信公众平台测试号\t303211\n江帆\t303212\n福建省财政厅\t303213\n挤出血\t303214\nマンション\t303215\nAHK\t303216\nlxl\t303217\n分液漏斗\t303218\n奥地利维也纳\t303219\n波木\t303220\n小米MIX2S\t303221\n真维斯\t303222\nCreamy\t303223\n普氏\t303224\n铜焊\t303225\norlando\t303226\n厦漳泉\t303227\nfase\t303228\n龙桥\t303229\n魔天记\t303230\n亲见\t303231\ntx2\t303232\n文典网\t303233\n闪付卡\t303234\n光卡\t303235\n永春\t303236\n合得\t303237\n中国人民解放军第302医院\t303238\n周文重\t303239\nGHOST\t303240\nSci\t303241\n本科学\t303242\n300\t303243\n2007年1月1日\t303244\n放风\t303245\nHyundai\t303246\nperror\t303247\n张志川\t303248\n36寸\t303249\n分解质\t303250\n135g\t303251\n绘卷\t303252\n坤坤\t303253\n汉芯\t303254\n行乞\t303255\n迪肯大学\t303256\n点阵式\t303257\n哀牢山\t303258\n怨气\t303259\n排开\t303260\n新加坡金沙酒店\t303261\n红_\t303262\n反光镜乐队\t303263\n603858\t303264\n古天莫言\t303265\n粮票\t303266\n三福百货\t303267\n上海市人民政府办公厅\t303268\n云之家\t303269\n5千公里\t303270\n第50号\t303271\n在乎\t303272\n宝马320i\t303273\n70亿\t303274\nyangyzh\t303275\n二环十三郎\t303276\n张澍\t303277\nJirayu\t303278\n长春亚泰\t303279\n新田县\t303280\n市直机关\t303281\n7列\t303282\n勤绩廉\t303283\n元真\t303284\nmdr\t303285\n周慕冰\t303286\n水汇\t303287\n自美\t303288\n戏曲\t303289\n春日\t303290\nfeifeicms\t303291\n第54号\t303292\n鸡毛蒜皮\t303293\n提示框\t303294\n同和\t303295\n艾肯micu\t303296\n漳平政府网\t303297\n阿泰尔\t303298\n植树\t303299\n现量\t303300\n纳达贝索斯\t303301\nnasty\t303302\n脚凳\t303303\n阿丽莎\t303304\n245号\t303305\n张怡筠\t303306\nInterContinental\t303307\n炉石黄金赛\t303308\n4周\t303309\n福州港\t303310\n油坊\t303311\n地产业\t303312\n辛巴达历险记\t303313\nFleming\t303314\nMuddy\t303315\nnba2kol2吧\t303316\nMEL\t303317\nwp\t303318\n上海3号线\t303319\n中国融资担保业协会\t303320\n桃核\t303321\nDovecot\t303322\ncracked\t303323\n台下\t303324\n平常用\t303325\n主将\t303326\n波马\t303327\n天津美术学院\t303328\n蚊蝇\t303329\n8月下旬\t303330\n隹\t303331\nRatchet\t303332\n兄弟情人\t303333\n正压\t303334\n稀疏性\t303335\n法德\t303336\n为真\t303337
\nTheir\t303338\n1992年\t303339\n韦国清\t303340\n0734\t303341\n丙肝病毒\t303342\n低毒\t303343\n下冲\t303344\n临海政府网\t303345\n全波整流\t303346\n8000多万\t303347\n山东法院\t303348\n油味\t303349\n刀剑英雄\t303350\nSAPUI5\t303351\n茶树\t303352\n硬包\t303353\n万柳\t303354\n地图库\t303355\n钟奕儿\t303356\n厦门地铁\t303357\nsqli-labs\t303358\n单一\t303359\n澳大利亚驻华大使馆\t303360\nexcele\t303361\n创惠\t303362\n四星酒店\t303363\n戏剧之家\t303364\n海帕杰顿\t303365\npens\t303366\n18毫米\t303367\nDeadly\t303368\n广东卫视\t303369\n铁友网\t303370\n艾瑞泽5论坛_汽车之家论坛\t303371\nworthy\t303372\n米利\t303373\n切头\t303374\n4小时后\t303375\n关凤\t303376\n原住民\t303377\n小学生守则\t303378\n静态平衡阀\t303379\n七星爆珠\t303380\n和平之花\t303381\n柳井正\t303382\n天空之城概念书店\t303383\nSOFT\t303384\n5^2\t303385\n1952\t303386\n非萎缩性胃炎\t303387\nvs2015吧_\t303388\nby水千丞\t303389\n城\t303390\n日本松下\t303391\n董监\t303392\nCharacteristic\t303393\nTao\t303394\n1893\t303395\n问\t303396\n伊尔76\t303397\n增值税率\t303398\n6月16日\t303399\n国家会计学院\t303400\n黑暗传说\t303401\n减额\t303402\n男风\t303403\nelastalert\t303404\nbinomial\t303405\n辽东半岛\t303406\n注译\t303407\nHydro\t303408\n日喀则市\t303409\n叶菜类\t303410\n336号\t303411\nclassification\t303412\n玛哈\t303413\n绝世情圣\t303414\n漫画版\t303415\nLowest\t303416\n孔丘\t303417\n中国足球协会\t303418\n雷锋式\t303419\n5009\t303420\n建设路\t303421\nVTK\t303422\n别以为\t303423\n六鼎山\t303424\n100年前\t303425\n201306\t303426\n洋河股份\t303427\n非正式\t303428\n华东医院\t303429\n养植\t303430\n夏美\t303431\n中海锦苑\t303432\nToyo\t303433\n63号\t303434\n明月\t303435\n民教网\t303436\nNursery\t303437\nLGS\t303438\n2013-09-07\t303439\n迈迪\t303440\n立而立人\t303441\nON\t303442\n镇压\t303443\n林婉璃\t303444\n王佳\t303445\n谭雅\t303446\nPane\t303447\n增厚\t303448\n第5代\t303449\n王思懿\t303450\n别人\t303451\n板芙\t303452\n小心眼\t303453\nCRO公司\t303454\n精义\t303455\n降落\t303456\n女人味\t303457\nSoutheast\t303458\n抢戏\t303459\n职业餐饮\t303460\n中国办事处\t303461\n启辰d60\t303462\n无限火力\t303463\nKOCOOL\t303464\n幸运女神\t303465\n蚂蚁聚宝\t303466\n康巴丝\t303467\n邓志辉\t303468\n黑荆棘\t303469\n台怀镇\t303470\n披风\t303471\n游乐场\t303472\n杭州科技园\t303473\n文明5吧_\t303474\n17K\t303475\nyii\t303476\n晚娘2\t303477\n恩智\t303478\n织唛\t303479\n海盐政府网\t30
3480\n高聚物\t303481\nCLOB\t303482\n准星\t303483\n济南医院\t303484\n龙园\t303485\n林西县\t303486\n奔驰g500\t303487\n苏州市中级人民法院\t303488\nGAT\t303489\n混排\t303490\n市机关事务管理局\t303491\n三侠\t303492\n戚墅堰\t303493\nhcie\t303494\n樊文花\t303495\n栟茶\t303496\nVDP\t303497\n20170805\t303498\n范湖\t303499\n骨髓\t303500\n钉枪\t303501\nzhiyun\t303502\n淹制\t303503\n高权\t303504\n舒肤佳\t303505\n沱江\t303506\n氧化层\t303507\n八里沟景区\t303508\n简言\t303509\n人民小学\t303510\nZürich\t303511\n自由之路\t303512\n囚婚\t303513\n多客\t303514\n稀有品种\t303515\n8.4英寸\t303516\nnahxa\t303517\n通化金马\t303518\n武汉宾馆\t303519\n刘善人\t303520\n巴特沃斯\t303521\nCasarte\t303522\n智企\t303523\n豳风\t303524\n万和郡\t303525\n宝马8系\t303526\n家蚕\t303527\n加那利群岛\t303528\n猎豹\t303529\n1.24G\t303530\nepdm\t303531\n民主党\t303532\n四书五经\t303533\n灯条\t303534\n退货单\t303535\n顺德站\t303536\n戴景耀\t303537\n夺舍\t303538\n陈宝莲\t303539\n青霉菌\t303540\nMiku\t303541\n荷塘\t303542\n龙樱\t303543\nValet\t303544\n影音室\t303545\n出生岛\t303546\n法行\t303547\ne1000\t303548\n黄金回购\t303549\n零氪\t303550\n黑头仪\t303551\n1698\t303552\n工体\t303553\n转日\t303554\n万盛经开区\t303555\n潇湘晨报网\t303556\n李国麟\t303557\nAIUI\t303558\n中铁四院\t303559\n78元\t303560\n罗家村\t303561\n通天峡\t303562\n探馆\t303563\n轻音少女\t303564\n花筑\t303565\n重担\t303566\n败犹荣\t303567\n千辛万苦\t303568\n结局\t303569\ncorporate\t303570\n浙江医院\t303571\n27300亿\t303572\nBlock\t303573\nprojections\t303574\n002010\t303575\n陈三元\t303576\n云岭先锋夜校\t303577\n一辩稿\t303578\n沈塘桥\t303579\n恒流源\t303580\n老头子\t303581\n郭继承\t303582\n广西公安厅\t303583\n中国农科院\t303584\n前川\t303585\n华夏致富网\t303586\n范思哲\t303587\n25台\t303588\n撸先锋\t303589\nExtensive\t303590\n16时\t303591\n后退休\t303592\n翻机\t303593\n草堆\t303594\n九十年代初\t303595\n雷克萨斯es250\t303596\n双温\t303597\n利己\t303598\nMAH\t303599\n第4节\t303600\n绣春刀\t303601\n千代田\t303602\n夺去\t303603\n霹雳舞\t303604\n1f\t303605\n长舌\t303606\n华盖\t303607\n高抛\t303608\n慢下来\t303609\n强韧\t303610\n打架子\t303611\n109名\t303612\n华南理工大学》\t303613\n法外之徒\t303614\n文化课\t303615\ngdp排名2017\t303616\n包柱子\t303617\n森美\t303618\ncry5\t303619\n云盾\t303620\n过眼\t303621\n委托人\t303622\n杭州联通\t303623\n阿里山\t303624\n伦敦银行\t303625\nhundred\t303626\nalt+a\t30
3627\n爱辉区\t303628\n王纲\t303629\n有机硅树脂\t303630\n苦不苦\t303631\n冯学成\t303632\n后镜\t303633\n准不准\t303634\n合泰单片机\t303635\n熨烫机\t303636\n能不能不\t303637\n一诊\t303638\n法律教育网\t303639\npsat\t303640\ne575\t303641\n粉黛\t303642\n子丑寅卯辰巳午未申酉戌亥\t303643\n疑遭\t303644\n车辆购置税纳税申报表\t303645\nShape\t303646\nAndroid7.1\t303647\n盘盈\t303648\n琪亚娜\t303649\n赵政\t303650\n连讯网\t303651\n黄金矿区\t303652\n未来30天\t303653\n夜夜鲁\t303654\nf240\t303655\n党政办\t303656\n段涛\t303657\n凶多吉少\t303658\ncwnd\t303659\n济南山\t303660\nklein\t303661\n真空干燥机\t303662\npr\t303663\nGrowth\t303664\nnums\t303665\n120例\t303666\nFirefox火狐浏览器\t303667\nowo\t303668\nPANTUM\t303669\n浅说\t303670\n不禁\t303671\n1级\t303672\n光明街道\t303673\n毛重\t303674\nPB9\t303675\n母女花\t303676\n客户机\t303677\n梁家\t303678\n追到手\t303679\n榛\t303680\n神月\t303681\n团工委\t303682\n果栏中的江湖大嫂\t303683\n循环水\t303684\n华二紫竹\t303685\n192号\t303686\n迷羊\t303687\n中泰集团\t303688\nTransition\t303689\n4G套餐\t303690\n审签\t303691\n编曲\t303692\n小笼汤包\t303693\n爱医资源网\t303694\n岳昕\t303695\n大屿山\t303696\naugmented\t303697\n等长\t303698\n回重审\t303699\n间苯二甲酸\t303700\n运输\t303701\n成都市政府\t303702\n宁夏医科大学\t303703\nisight\t303704\n居民证\t303705\n萼片\t303706\n江苏省苏科仪表有限公司\t303707\n温莎大学\t303708\n驻马店职业技术学院\t303709\n鞠小林\t303710\nsqlsugar\t303711\n索偿\t303712\n鲛\t303713\n收腰\t303714\nsandro\t303715\n木村佳乃\t303716\n芝奇\t303717\n五常街道\t303718\nBaseline\t303719\n沈辉\t303720\nsun公司\t303721\n猫猫小说网\t303722\nflashtool\t303723\n雪纳瑞\t303724\n闹人\t303725\nHEADER\t303726\nimazing\t303727\n麦联宝\t303728\n产物\t303729\nonshore\t303730\n京能电力\t303731\n宁江区\t303732\n八年制\t303733\nBluehost\t303734\nShowing\t303735\n首卷\t303736\nkris_zhang\t303737\n仙童\t303738\n固定利率\t303739\n锦绣小区\t303740\n福祉\t303741\n国际贸易公司\t303742\n松江河\t303743\n赖茅\t303744\njerk\t303745\n曾祥裕\t303746\nDataGridview\t303747\n大腕儿\t303748\n8w\t303749\n安琥\t303750\n色气\t303751\n省卫计委\t303752\n五爱\t303753\n中远航运\t303754\n插装阀\t303755\n甘家口\t303756\n饭盒\t303757\n乐游网\t303758\n停薪留职\t303759\n红外传感器\t303760\n丹炉\t303761\nrizhao\t303762\nVenue\t303763\nCSS\t303764\n国民党反动派\t303765\n年龄\t303766\n冷态\t303767\n火箭队\t303768\n容桂\t303769\n
暗处\t303770\nroster\t303771\n是而非\t303772\n灾星\t303773\n4月19日\t303774\n深中通道\t303775\n8.6.0\t303776\n易飞ERP\t303777\n说明符\t303778\n光明集团\t303779\n任嘉伦\t303780\n伊藤计划\t303781\n四根\t303782\n翩若惊鸿\t303783\n香港航空\t303784\nPLEASE\t303785\n163号\t303786\n17.1.3\t303787\n喵金\t303788\n微心愿\t303789\nB11\t303790\n奉仕\t303791\nneuronal\t303792\n大量\t303793\n黑暗觉者\t303794\nomit\t303795\n披索\t303796\n古川\t303797\n省内\t303798\nchristine\t303799\n灭蚊器\t303800\n刘影\t303801\n固定电话号码\t303802\n圈梁\t303803\n军长\t303804\n相亲相爱\t303805\n害处\t303806\n龙图杯\t303807\n欢乐麻将\t303808\nproposed\t303809\n陈小龙\t303810\n帝少心头宠\t303811\n地球物理学报\t303812\n花箭\t303813\n创收\t303814\n第47届\t303815\n联系村\t303816\nqiyeku\t303817\n陶罐\t303818\nNarrative\t303819\n钩臂\t303820\ntain\t303821\n祖籍\t303822\n唐先生\t303823\n带回国\t303824\n艾瑞泽7论坛\t303825\n平声\t303826\ncs55\t303827\n牛骨头\t303828\n磁共振检查\t303829\n天韵\t303830\n子贡曰\t303831\n幢\t303832\n调料包\t303833\n排量\t303834\n生命生命\t303835\n苏州站\t303836\n大珠山风景区\t303837\n99作文网\t303838\n和平街\t303839\n流表\t303840\n余杭经济开发区\t303841\n斗破苍穹\t303842\n腰杆\t303843\n尼姑\t303844\n欧米茄\t303845\ndeployer\t303846\nhightchart\t303847\n只羡鸳鸯不羡仙\t303848\n中天科技集团\t303849\n刘立荣\t303850\n七十载\t303851\n今年国庆\t303852\nS7-200\t303853\nbittrex\t303854\n起泡\t303855\n1小时前\t303856\n经营性租赁\t303857\nCIMS\t303858\n推进领导干部能上能下若干规定\t303859\n青阳县政府\t303860\n张国柱\t303861\n宏图高科\t303862\n有限域\t303863\n投桃报李\t303864\n黄赤\t303865\n金蝶医疗\t303866\nHc36\t303867\n密宗\t303868\n望府\t303869\n郑则仕\t303870\n龙妹\t303871\nnew3ds\t303872\n减缓\t303873\n落叶松\t303874\n洗菜机\t303875\n同祥城\t303876\n深圳西站\t303877\n署长\t303878\n大排\t303879\n黑玉米\t303880\n犹豫\t303881\nCBBE\t303882\n打抱不平\t303883\n成昆铁路\t303884\n周秀娜\t303885\n黑水县\t303886\n20160615\t303887\n山工\t303888\n七百\t303889\n白腊\t303890\n淫奇\t303891\nStevens\t303892\n货装\t303893\ndandan\t303894\n亿图\t303895\n广州碧桂园\t303896\n3dsmax2016\t303897\n发明展\t303898\n打底\t303899\n威驰论坛_汽车之家论坛\t303900\n7天后\t303901\n古雅\t303902\n芭莎慈善夜\t303903\ntop20\t303904\n攀高\t303905\nmod吧\t303906\n温阳\t303907\nBD特典\t303908\n营销页\t303909\n第三方支付\t303910\nfeko\t303911\n市人大\t303912\n首汽\t303913\n专业版\
t303914\n浙江人民出版社\t303915\n18c\t303916\n硝酸_\t303917\nOpenGL\t303918\n4倍\t303919\nCADAS\t303920\nGranblue\t303921\n薰\t303922\n软硬\t303923\n一声不吭\t303924\nB310\t303925\n斗罗大陆III龙王传说\t303926\n第一漂\t303927\n渔乡\t303928\n李诚\t303929\n星语心愿\t303930\n汪鑫\t303931\n中海环宇城\t303932\nform1\t303933\ntlf\t303934\n20盒\t303935\n五千多\t303936\n三船敏郎\t303937\n宫扇\t303938\n八角笼\t303939\n追查\t303940\n自动售水机\t303941\n洁婷\t303942\n满城汉墓\t303943\n长凳\t303944\n东方project\t303945\n黏合剂\t303946\nIT公司\t303947\n四明山\t303948\nxe6\t303949\n检漏\t303950\n大风车\t303951\n琚宾\t303952\nPM3\t303953\n睢宁\t303954\n脚印\t303955\n儿女情\t303956\n2017.06\t303957\n伦敦金融城\t303958\nimmortal\t303959\n650米\t303960\nGlas\t303961\n4013\t303962\n跳舞机\t303963\n和睦新村\t303964\n电焊工证\t303965\n基本体\t303966\n重庆市城口县人民政府\t303967\n国家机\t303968\n向日\t303969\n文摘网\t303970\n振冲碎石桩\t303971\nav小次郎\t303972\n荣耀七\t303973\n极品飞车13\t303974\n假山\t303975\n顾彬\t303976\n沈阳农业大学\t303977\n千江凌云\t303978\n20170507\t303979\n安庆石化\t303980\n南京大学教育研究院\t303981\n掏鸡\t303982\n悬棺\t303983\nphonegap\t303984\n平顶山银行\t303985\n成都物流公司\t303986\n20171123\t303987\n22018\t303988\n101页\t303989\n美国中央情报局\t303990\n输精管\t303991\n叶黄\t303992\nSyfy\t303993\nLANMP\t303994\n/ul\t303995\n骨髓炎\t303996\n更新值\t303997\n曾许诺\t303998\nmiix4\t303999\n扩列\t304000\n侠女\t304001\n净资本\t304002\n排卵期\t304003\n孙亮\t304004\n受阻\t304005\n老行当\t304006\n景天区\t304007\n心包经\t304008\n基肥\t304009\n伊丹机场\t304010\n停火\t304011\n逸夫小学\t304012\n旱烟\t304013\n货拉拉\t304014\n甘肃省委组织部\t304015\n袁雨桢\t304016\n2017前十名\t304017\n彩云追月\t304018\n帝凰\t304019\n岩蛇\t304020\n宿迁市规划局\t304021\n卷宗\t304022\n流行\t304023\n沙漠风暴\t304024\n主叫\t304025\n一域\t304026\n社会治安\t304027\n特价\t304028\n大青岛\t304029\n护心镜\t304030\nsuperwingy\t304031\n连续化\t304032\n拖轮\t304033\n花筒\t304034\npapyrus\t304035\nGoogleEarth\t304036\nkey-CSDN\t304037\n上党\t304038\n反射片\t304039\nminority\t304040\nsadd\t304041\ntotal\t304042\n谷阿莫\t304043\n鬼师\t304044\n国际市场营销学\t304045\n星阵\t304046\n深圳学校网\t304047\n侧身\t304048\n鹰目户外广告网_\t304049\n青岛农业大学海都学院\t304050\n智盘\t304051\n国美\t304052\n礼赞\t304053\n习题集\t304054\nMaika\t304055\n明特\t304056\n小思\t30405
7\n西华大学\t304058\n蒸粽\t304059\n套路得人心\t304060\n铁锹\t304061\n丹尼尔·戈尔曼\t304062\n开标\t304063\nfbm\t304064\njackhf\t304065\n天涯\t304066\n生长抑素\t304067\n北京妈妈网\t304068\n赵涌\t304069\nblfshiye\t304070\n谢云流\t304071\n整数型\t304072\n3011\t304073\n沁水县\t304074\n营养师\t304075\nSHINee\t304076\n许光达\t304077\n汉科\t304078\n几重\t304079\n液压杆\t304080\n绿色金融债券\t304081\n鹦鹉洲\t304082\nVUE2\t304083\nwaitting\t304084\n黄田\t304085\n天津市河西区\t304086\n貌美\t304087\n托福考试\t304088\n山东省农村信用社\t304089\n吹泡泡\t304090\n宠奴\t304091\n各单位\t304092\n晋心\t304093\n随手\t304094\nMETRO\t304095\n谷粒\t304096\n购物地\t304097\n小米平衡车\t304098\n王凤波\t304099\n芸能\t304100\n跌势\t304101\n面筋\t304102\n林芳\t304103\n环保标准\t304104\n北森云\t304105\nfranx\t304106\n20160307\t304107\n列王纪\t304108\n中板\t304109\n300mw\t304110\n天荒地老\t304111\n误差理论\t304112\n红米手机5\t304113\nCOMM\t304114\n静置\t304115\n水热反应釜\t304116\n雾化吸入\t304117\n附送\t304118\n易中\t304119\n杭甬高速\t304120\n报屋\t304121\n任应秋\t304122\n三千年\t304123\n汽车配件110网\t304124\n英雄榜\t304125\n苏爽\t304126\n影驰GeForce\t304127\ncidian.wenku1.com\t304128\n鸡蛋布丁\t304129\n性激素\t304130\n我秀网\t304131\n惠南\t304132\n张爱民\t304133\n7580\t304134\n孙新\t304135\n官级\t304136\n3000000\t304137\n吴皇\t304138\n反垄断\t304139\n巴拉巴拉小魔仙\t304140\nbathing\t304141\n怕死\t304142\n爱丁堡大学\t304143\n八千多\t304144\n半亩\t304145\n挂具\t304146\n国土资源报数字报\t304147\n北京市医院管理局\t304148\njiqing\t304149\n瓢虫\t304150\ne300\t304151\n出生纸\t304152\n阿斯伯格综合征\t304153\n鑫达\t304154\n通栏\t304155\n噬魂\t304156\n家店\t304157\n副高\t304158\n行踪\t304159\n网商贷\t304160\nsss\t304161\n保员\t304162\n精神类\t304163\n设及\t304164\n最具\t304165\nhref\t304166\n私信\t304167\n轻罗\t304168\n自行车库\t304169\n1705\t304170\n八万四千\t304171\n康行天下\t304172\nHDTV版\t304173\n000735\t304174\nisolate\t304175\n分子力\t304176\n每一回\t304177\n周建琨\t304178\nAdobeReader\t304179\nBehaviors\t304180\n木木模拟器\t304181\n启发性\t304182\nTPEE\t304183\n第50条\t304184\n开篇\t304185\n流式细胞仪\t304186\n隋唐城遗址植物园\t304187\nbeams\t304188\n统建\t304189\n京东大峡谷\t304190\n压克力\t304191\n西游类\t304192\n内室\t304193\n汔\t304194\n世界工厂\t304195\n中正纪念堂\t304196\n一字段\t304197\n6.5级\t304198\n赫克托尔\t304199\n举例子\t304200\n关不住\t304201\n朋克
\t304202\n星期天\t304203\n亿品元素\t304204\n大豆分离蛋白\t304205\n都市网\t304206\n陆展博\t304207\ndali\t304208\n所在国\t304209\n得实集团\t304210\n划分\t304211\n米线\t304212\n黑膏\t304213\n缇丽莎尔\t304214\n网游之修罗传说\t304215\n大能\t304216\n五礼\t304217\n1200个\t304218\n叶轩\t304219\n映雪\t304220\n泉州五中\t304221\n龙城一号\t304222\nBritney\t304223\n九九艳阳天\t304224\n12h\t304225\nHEC\t304226\ncrybaby\t304227\n_天助网\t304228\n郫筒镇\t304229\nQSP\t304230\n三块\t304231\nSwing\t304232\nArchitect\t304233\n龟蒙景区\t304234\njdbc批量\t304235\n垃圾堆\t304236\n泰\t304237\n三幼\t304238\n凯莉布鲁克\t304239\n恶魔城:暗影之王2\t304240\n真空脱气机\t304241\n刍议\t304242\nAlphablocks\t304243\ngit\t304244\n水镇\t304245\n张佳\t304246\n时风\t304247\n高岗\t304248\n重庆电信职业学院\t304249\nbuzzer\t304250\nlizz\t304251\n343号\t304252\nC4996\t304253\n虾夷扇贝\t304254\nma6174\t304255\n任意数\t304256\n1080p|720p高清\t304257\n11004\t304258\n豪华版\t304259\n城隍\t304260\n南充市人民政府\t304261\nsophos\t304262\n北洋军\t304263\n沙溪\t304264\ncoco2d\t304265\n标点符号\t304266\n张芸京\t304267\n金华日报\t304268\n制作书\t304269\n长春小区\t304270\n安琪\t304271\n2093\t304272\n武昌鱼\t304273\n魏思\t304274\n此地\t304275\n钴蓝\t304276\n张鹭\t304277\n查问\t304278\n至尚\t304279\n前胡\t304280\nqwebengine\t304281\n元一柏庄\t304282\n艾伯特\t304283\n以眼还眼\t304284\n崔珉\t304285\n蠢蠢\t304286\nAndante\t304287\n72个\t304288\nHog\t304289\n林超伦\t304290\n克劳修斯\t304291\n数码宝贝Linkz\t304292\n超高清\t304293\nDapp\t304294\n疯狂的麦克斯:狂暴之路\t304295\n楚雄州\t304296\nupan\t304297\n米房\t304298\n北京南路\t304299\nprcc2015\t304300\n发展区\t304301\n铁佛寺\t304302\n罗森博格\t304303\nundefined\t304304\nnou\t304305\nZG\t304306\n逃妻\t304307\n即时通\t304308\n2天后\t304309\npassthru\t304310\n3000句\t304311\n无妄之灾\t304312\n000422\t304313\n91夯\t304314\n连麦\t304315\n夺命来电\t304316\n臻藏\t304317\nligerUI\t304318\n鼓轮\t304319\n张孝祥\t304320\nMont\t304321\n冤情\t304322\n六点钟\t304323\n罗纳河\t304324\n男女友\t304325\nCun\t304326\n肇庆东\t304327\n注胶\t304328\n厦门市规划委员会\t304329\nGINA\t304330\n凝胶\t304331\n乙亥年\t304332\n中坚\t304333\noldboy\t304334\n溃\t304335\n一员\t304336\n边线\t304337\nU7\t304338\n新联村\t304339\nMickey\t304340\n早上六点\t304341\n戊巴比妥钠\t304342\nEAT\t304343\npc肌\t304344\nSaturn\t304345\
n农地\t304346\n修剪器\t304347\n劳务发票\t304348\n渝北中学\t304349\n4.74\t304350\n一键\t304351\n啤酒机\t304352\n清冷\t304353\nVideoLAN\t304354\n奇彩\t304355\n小小新娘花\t304356\n拜厄\t304357\nedged\t304358\n連載\t304359\n尽心\t304360\n易制毒化学品管理条例\t304361\n维e\t304362\n蔡锷\t304363\n开挖\t304364\nanne\t304365\nTVM\t304366\n基础油\t304367\n假膜\t304368\n市行\t304369\n朱玉龙\t304370\n*资\t304371\n速排名\t304372\nNutshell\t304373\nd1\t304374\n决绝\t304375\n翅桶\t304376\n南向\t304377\n有限合伙人\t304378\n球镜\t304379\n人工智\t304380\n背部\t304381\n5月末\t304382\n死亡细胞\t304383\n千分表\t304384\n佛山电视台\t304385\n格莱特\t304386\n江淮晨报网\t304387\n史宁\t304388\n第128章\t304389\n股动脉\t304390\n掌眼\t304391\n荥经县人民政府\t304392\n增光\t304393\n国土资源\t304394\n水长城\t304395\nChester\t304396\n碱化\t304397\n暗娼\t304398\n0.01MB\t304399\n宝马1系论坛_汽车之家论坛\t304400\n奔驰GLK300\t304401\n一个七\t304402\nPHEV\t304403\n报告网\t304404\n龙达\t304405\n零天\t304406\n蕊\t304407\n眼保仪\t304408\n1642\t304409\n激射\t304410\n限制\t304411\n住哪儿网\t304412\n角边\t304413\n平口钳\t304414\nfib\t304415\n凯美瑞论坛_汽车之家论坛\t304416\n高致\t304417\n桃花洞\t304418\n保定\t304419\n曜夜版\t304420\n2018年4月15日\t304421\n主控台\t304422\nWedgwood\t304423\n对戒\t304424\n华亭县\t304425\n短租房\t304426\n验印\t304427\n变形金刚5:最后的骑士\t304428\nJavaWEB\t304429\nObjectId\t304430\n阳坊镇\t304431\n金海滩\t304432\n中意人寿保险有限公司\t304433\n观刈麦\t304434\nAsyncTask\t304435\n顺着\t304436\nscdma\t304437\n杨杨\t304438\n呼南\t304439\n人教版四年级语文\t304440\n160_\t304441\nKORS\t304442\nHCV\t304443\n200%\t304444\n上沿\t304445\n有限单元法\t304446\n笑林小子\t304447\n西安地铁6号线\t304448\n姬魔恋\t304449\n示范校\t304450\n智诚科技\t304451\n友信贷款\t304452\n北玻股份\t304453\nbcprov\t304454\n乐心\t304455\n与过\t304456\n赛尔号2\t304457\nunspecified\t304458\n所及\t304459\nmovable-view\t304460\n房天下澳大利亚房产网\t304461\n4.1.7\t304462\n魏翔\t304463\n情燃\t304464\nGovernance\t304465\n红旗H7\t304466\n布鲁斯南\t304467\n不用力\t304468\nyamaha\t304469\n陈智勇\t304470\n月食\t304471\n团贷\t304472\n四者\t304473\n玻璃布\t304474\n交接\t304475\n代发空\t304476\n超巨型\t304477\n桂林市人民政府\t304478\n长沙市市\t304479\n插电式混合动力车\t304480\n2017年10月30日\t304481\n划进\t304482\n三国两晋\t304483\n隧\t304484\n名古屋站\t304485\n鼠鼠\t304486\n莲子汤\t304487\nABC财税网\t
304488\nontology\t304489\n宜信商通\t304490\n通灵神探\t304491\n中型犬\t304492\n巨像\t304493\n新视野大学英语第三版\t304494\n聞\t304495\n滑触\t304496\n铭医\t304497\nReactiveCocoa\t304498\n东北抗日联军\t304499\n监测点\t304500\n国兴大道\t304501\n返租\t304502\n补招\t304503\ndreams\t304504\n华福路\t304505\nbulma\t304506\n嘉俊\t304507\nEvil\t304508\n约\t304509\n陈文伯\t304510\n慢游\t304511\nZOTAC\t304512\n〔\t304513\n_千千\t304514\n禁用词\t304515\npbft\t304516\n自印\t304517\nfcp\t304518\n反相机\t304519\nAviator\t304520\n北外滩\t304521\n现金支票\t304522\n超星慕课\t304523\n云南经济日报\t304524\n聚氨脂\t304525\n渗透\t304526\n北伐\t304527\n城家公寓\t304528\n爱之实验室\t304529\nS1号线\t304530\n套王\t304531\n三相点\t304532\n五羊本田\t304533\n华为mate8\t304534\n帕西法尔\t304535\n第03期\t304536\nviii\t304537\n1329\t304538\n佛山市人力资源和社会保障局\t304539\n盘中\t304540\n盛世中华\t304541\n贝茨\t304542\n8.5公斤\t304543\n佳能m100\t304544\n弃女\t304545\nwebx\t304546\n衣衣\t304547\n射精管理\t304548\n魔雷\t304549\nprs\t304550\n双桥区\t304551\n大醉\t304552\n春光乍现\t304553\n川沙地铁站\t304554\n浙江省林业厅\t304555\n世界和平\t304556\n非线性误差\t304557\n调直\t304558\n结关\t304559\n千万千万\t304560\n30天内\t304561\n扣件式\t304562\n中信新城\t304563\n230w\t304564\nGONE\t304565\n铵根\t304566\n蓝话筒\t304567\n122.gov.cn\t304568\n酷吏\t304569\nHS商品编码查询\t304570\n换取\t304571\n渔乐吧\t304572\nWincc\t304573\n三百五十\t304574\n想你的夜\t304575\n洲泉\t304576\n最强大脑之燃烧吧大脑\t304577\nwatermelon\t304578\n专业机\t304579\n双鱼玉佩\t304580\nvlc播放器\t304581\n感悟人生\t304582\n宜化集团\t304583\nCentro\t304584\n趋紧\t304585\ngrg\t304586\n服装界\t304587\n存款基准利率\t304588\n宇文所安\t304589\nxxtea\t304590\n吹萧\t304591\n情意绵绵\t304592\nR18\t304593\n两枚\t304594\n大厅\t304595\n杂声\t304596\n火麻油\t304597\n干皮\t304598\n_币文库_巴比特\t304599\n九江人事考试网\t304600\n姜立\t304601\n375px\t304602\n东莞大学\t304603\n古川伊织\t304604\n人处\t304605\n附录\t304606\n大亚湾区\t304607\n氯化\t304608\nG470\t304609\nhuanet\t304610\n中渝\t304611\n望京医院\t304612\nxlx\t304613\n秋游\t304614\n8万亿\t304615\n5000年前\t304616\nKKT\t304617\n王者天下吧\t304618\n亚洲航空\t304619\n胡话\t304620\n铜材\t304621\n胆道\t304622\n月经初潮\t304623\n一钟\t304624\n芝士棒\t304625\n多普勒效应\t304626\n五伦\t304627\nQG\t304628\npipe\t304629\n野性\t304630\n联合利华\t304631\n小克\t304632\n涡轮
盘\t304633\n立子\t304634\n15isk\t304635\n村儿\t304636\n交响音乐\t304637\n酢浆草\t304638\n次生林\t304639\n盼盼食品\t304640\n門票價錢\t304641\n电视片\t304642\n边界类\t304643\n天津市县\t304644\n宁明县\t304645\n寮步\t304646\n美洲狮\t304647\n书香门第\t304648\nFPA\t304649\n92.6\t304650\n澧水\t304651\n铁机\t304652\n中国行业研究网\t304653\n月相表\t304654\nh5游戏\t304655\nmaxcms\t304656\n摄政\t304657\n巡洋\t304658\n泊头市\t304659\n近视眼镜\t304660\nLuo\t304661\n苗圃路\t304662\n纳兰雪央\t304663\n脑力劳动者\t304664\nrvs\t304665\n2017分\t304666\n姑婆\t304667\ncallback\t304668\nShogun\t304669\n三森\t304670\n韬睿惠悦\t304671\n画稿\t304672\nocelot\t304673\nJSFiddle\t304674\n模杯\t304675\nmpeg2\t304676\nRC1\t304677\n伯里曼\t304678\nsplended\t304679\nABM\t304680\n勇敢者\t304681\n旧剑\t304682\n处女女\t304683\n鸡蛋汤\t304684\n氢键\t304685\n米旗\t304686\n32x\t304687\n沪深指数\t304688\n大风车幼儿园\t304689\nAPP_亿智蘑菇\t304690\n昆明市委\t304691\nSOCIETY\t304692\n章晓之\t304693\nLavigne\t304694\n173家\t304695\n日本女足\t304696\n川蜀\t304697\n蓝帖\t304698\n孙亚龙\t304699\nquasi\t304700\n弋江区\t304701\n唯我独尊\t304702\n七集\t304703\nmac软件大全\t304704\n小音\t304705\n马铃薯淀粉\t304706\n小孔\t304707\n韩梅\t304708\n泰莱\t304709\nShadowverse\t304710\n建设项目环境影响评价分类管理名录\t304711\nIVB\t304712\nchau\t304713\n豪勇七蛟龙\t304714\n本案\t304715\n超火\t304716\n巴氏鲜奶\t304717\n剑\t304718\n廖辉\t304719\nWines\t304720\n粮食烘干机\t304721\n设备展\t304722\n指示仪\t304723\nbu\t304724\nowhat\t304725\n冰气\t304726\n芯皮\t304727\n回城\t304728\n战点\t304729\n零度智控\t304730\n杭州市求是教育集团\t304731\n二氯乙烷\t304732\n瓶颈期\t304733\n独龙族\t304734\n318\t304735\n大定\t304736\n蓝湛\t304737\n79名\t304738\n南佛罗里达大学\t304739\n拥抱星星的月亮\t304740\n古拉k\t304741\n悲观主义者\t304742\njoys\t304743\nBaxter\t304744\n降格\t304745\n金房\t304746\n东来顺\t304747\n船舷\t304748\n八里庄街道\t304749\n刘晓洁\t304750\n湖南都市频道\t304751\n4K版\t304752\npharmaceutical\t304753\n日本皇室\t304754\nMS17\t304755\n中国产业经济信息网\t304756\nlsyncd\t304757\n美\t304758\nNavbar\t304759\n拔除\t304760\n澳大利亚留学网\t304761\n几则\t304762\n农家乐旅游联盟\t304763\n明发广场\t304764\n故事的故事\t304765\n镁合金\t304766\n击缶\t304767\n昭彰\t304768\n曲小蛐\t304769\n打会\t304770\n茭白\t304771\n申通地铁\t304772\n聚合物锂电池\t304773\n1.0.3\t304774\n安卓8\t304775\nrpgxp\t30477
6\n外汇110网\t304777\n工字型\t304778\n掐尖\t304779\n急性腹膜炎\t304780\n全案\t304781\n雷哥\t304782\n小南湖\t304783\nステップ\t304784\n第18042期\t304785\n现金日记账\t304786\ningredient\t304787\n招行百夫长白金卡\t304788\n鸭绿江大桥\t304789\n一控\t304790\n神户牛肉\t304791\n独栋别墅\t304792\n双号\t304793\n木卫二\t304794\n比亚迪L3\t304795\n东方三侠\t304796\n宇辰\t304797\n108TV\t304798\n棠梨煎雪\t304799\n全屋定制家具\t304800\n滕州东站\t304801\ncoeff\t304802\n堵路\t304803\n大瑶山\t304804\n树德\t304805\n色情片\t304806\ntcu\t304807\n家用中央空调\t304808\n磁块\t304809\n百分之90\t304810\n刘金沂\t304811\n紫龙晶\t304812\n自由世界\t304813\n红警尤里的复仇\t304814\n高军\t304815\n黑客帝国\t304816\n李添\t304817\nGMW\t304818\n聚融\t304819\n90﹪\t304820\nEX360\t304821\n5557\t304822\n0556\t304823\n雅佳\t304824\n营销\t304825\nPASSAT\t304826\n三花聚顶\t304827\n两微一端\t304828\n苏杰\t304829\n肖恩\t304830\nenoent\t304831\n一炮\t304832\n苏志燮\t304833\n白洋淀\t304834\n黑白花\t304835\n兴义\t304836\n金玉良言\t304837\n梧桐镇\t304838\n纠治\t304839\nchamie\t304840\n干黄花菜\t304841\n灵通打单\t304842\n林下\t304843\ncrossword\t304844\n花酱\t304845\n中山大学社会学与人类学学院\t304846\n4438x\t304847\nexecution\t304848\n阿罗裤\t304849\n红四军\t304850\njdk8\t304851\n钱俊\t304852\n女教\t304853\n五首歌\t304854\niz\t304855\n溢流\t304856\n烽\t304857\nphil\t304858\n失心\t304859\n蓝绍敏\t304860\n炉石传说狗头人\t304861\n猎杀星期一\t304862\n中国LED网\t304863\nEdith\t304864\n肥女\t304865\n铁线虫\t304866\n创新中国\t304867\n王羽杉\t304868\nSQLServer2012\t304869\n重掌\t304870\n落位图\t304871\n转座\t304872\n78级\t304873\n山药胶囊\t304874\ncollateral\t304875\n里程表\t304876\n抗剪\t304877\n瑞虎7\t304878\n柳笛\t304879\n狗眼看人低\t304880\n弘昼\t304881\nOVS\t304882\n赵宝刚\t304883\ngxt\t304884\n配置项\t304885\n建发集团\t304886\n80070002\t304887\nXXX18\t304888\nFundraising\t304889\n4个月\t304890\n白芍\t304891\nGO房网\t304892\ngyl-coder\t304893\n域界\t304894\n高层住宅\t304895\n张盛舒\t304896\n巴中文化交流网\t304897\n10副\t304898\n间接式\t304899\n2000多家\t304900\nkown\t304901\nyone\t304902\n郵遞區號\t304903\n中国机床商务网\t304904\n见仁见智\t304905\n恶魔城白夜协奏曲\t304906\n优劣\t304907\n4月9\t304908\nJapane\t304909\n浴花\t304910\n顾村\t304911\n南大网院\t304912\n冰川时代5\t304913\n浮头式\t304914\nSK2\t304915\n简包\t304916\nxueqiu\t304917\n20150310\t304918\n奥利弗\t3
04919\n南山村\t304920\n躲藏\t304921\n游泳场\t304922\ntomcat7\t304923\ndomin\t304924\n200片\t304925\n受权\t304926\n兵书\t304927\n聚彩\t304928\n厉行\t304929\n第二章节\t304930\n人字梯\t304931\nscrt\t304932\n颍东区政府\t304933\n女伴\t304934\n刘吉\t304935\n李家村\t304936\ntrib\t304937\n414\t304938\nhp1112\t304939\n5.1%\t304940\n无痕浏览_\t304941\n伊谢尔伦\t304942\n秀米h5\t304943\n胸像\t304944\n相比较\t304945\nAmplitude\t304946\n面向对象\t304947\n批评史\t304948\n中华联合\t304949\n辽宁省地方税务局\t304950\n王晓峰\t304951\nyanshi\t304952\n送料机\t304953\n哼鸣\t304954\nBerufsprofil\t304955\n拨码\t304956\n轴版\t304957\n笑傲江湖曲\t304958\n电视长\t304959\n师\t304960\n沪深300指数\t304961\nDAS\t304962\n中润资源\t304963\n1.41\t304964\n制气\t304965\n肖宇\t304966\nae86\t304967\n林祥\t304968\n小升初入学考试\t304969\n养老金融\t304970\nfortunate\t304971\n动脉硬化闭塞症\t304972\n糯米面\t304973\n精酿啤酒吧\t304974\n小荷\t304975\n浅啡网\t304976\n比較\t304977\nbibliography\t304978\n试卷\t304979\n世界红十字日\t304980\n沈阳市区\t304981\nfancl\t304982\n南通中央\t304983\nStage\t304984\n常德买房网\t304985\n独墅湖图书馆\t304986\njps\t304987\nCeres\t304988\n紫翎炽天使吧\t304989\n长沙市一中\t304990\n中科大\t304991\n踏进\t304992\nnigix\t304993\n同安一中\t304994\nax2+bx+c\t304995\n智汇网\t304996\n傲斗凌天\t304997\n北京养老院\t304998\n黄线\t304999\nThreat\t305000\n百济神州\t305001\n援疆\t305002\nOffices\t305003\n超搞笑\t305004\nlvc\t305005\n0594\t305006\n枫景\t305007\n肾上腺瘤\t305008\n瓦剌\t305009\n复合地板\t305010\n儿时\t305011\n私奔\t305012\n街道纪工委\t305013\n3512\t305014\n加德纳\t305015\n毒夫人\t305016\ntsui\t305017\n安总\t305018\n分离度\t305019\n贝基\t305020\nasymptotic\t305021\nXML\t305022\n勺\t305023\n顶礼膜拜\t305024\nABCmouse\t305025\n打不准\t305026\n博阿斯\t305027\n益农镇\t305028\n逻辑哲学论\t305029\n立宪\t305030\n夺梦\t305031\n保罗乔治\t305032\n快题\t305033\n人教版初三化学\t305034\n12盒\t305035\n第三十九期\t305036\n东南亚\t305037\n夺婚\t305038\nonCreate\t305039\n中国人寿保险公司\t305040\n支气管\t305041\n圆板\t305042\n辛亥革命\t305043\n泰亨\t305044\n咪咕灵犀\t305045\n终结篇\t305046\nISP\t305047\n阿碧\t305048\nBUY\t305049\n谷人\t305050\n6m\t305051\n天马精化\t305052\n弥可保\t305053\n临川区\t305054\n援外\t305055\n心太\t305056\n睁不开\t305057\n墙壁画\t305058\n软妹子\t305059\nUCL\t305060\n横街\t305061\n选析\t305062\n李世民\t305063\n富可
敌国\t305064\n消息类\t305065\n凯尔特\t305066\n纺服\t305067\n表册\t305068\nbottega\t305069\nFile\t305070\nbreath\t305071\n对边\t305072\nc16\t305073\n篮式过滤器\t305074\nmode\t305075\n云南省发改委\t305076\n7类\t305077\n丑柑\t305078\n繁忙\t305079\nguge\t305080\n小矮子\t305081\n700度\t305082\n补丁集\t305083\n研会\t305084\n硬质\t305085\n早期诊断\t305086\n软饰\t305087\n大学生征兵网\t305088\n卷理科\t305089\n火汤\t305090\n桃源恋歌\t305091\n野狗们\t305092\nname选择器\t305093\n唐老板\t305094\niFix\t305095\n国金中心\t305096\n层顶\t305097\nVideos\t305098\nAuction\t305099\ngreenlet\t305100\n心路历程\t305101\npl/sql\t305102\n宣德炉\t305103\n_王\t305104\n彩园\t305105\n湖州二中\t305106\n菜月昴\t305107\nJRE\t305108\n断臂\t305109\n百工\t305110\nnGrinder\t305111\n倦\t305112\n2006款\t305113\n2012.11\t305114\n梦幻王国\t305115\n修身篇\t305116\n石曼迪\t305117\n红楼梦吧\t305118\n妲己翟天临\t305119\n赌瘾\t305120\n云深\t305121\n线性相关系数\t305122\n官庄镇\t305123\n粘土人\t305124\n二六\t305125\n山庄\t305126\n姜育恒\t305127\n恒慧\t305128\n张晓彤\t305129\nKickstart\t305130\n游龙戏凤\t305131\n重生之嫡女无双\t305132\n深入\t305133\n酬宾\t305134\n鸡卡顿\t305135\nLayout\t305136\n大虾们\t305137\n石瓢\t305138\n墙排式\t305139\nUnix/BSD\t305140\n心安\t305141\n高达SEED\t305142\n深圳先进技术研究院\t305143\nvn\t305144\n_安游我的世界\t305145\n1200平\t305146\n出去玩\t305147\n浙江省教育科学研究院\t305148\nvba编程\t305149\n丁炔\t305150\n嘉兴经开区\t305151\n马森\t305152\n姑娘青睐\t305153\n神笔马良\t305154\n底线\t305155\n石门水库\t305156\n聚落\t305157\nGente\t305158\njalap\t305159\nSCARF\t305160\n曹洋\t305161\n蒂塔·万提斯\t305162\nembedded\t305163\n黄秋葵\t305164\nwindows10教育版\t305165\n钠钾泵\t305166\n专章\t305167\n风萧萧\t305168\nmanufactured\t305169\n百秀\t305170\n玉海\t305171\n输油泵\t305172\n竞天公诚律师事务所\t305173\n席卡\t305174\n王海\t305175\nAUX接口\t305176\n球衫堂\t305177\ndéveloppement\t305178\n复硝酚钠\t305179\n男群\t305180\n站立\t305181\n佑安府\t305182\n江湾路\t305183\n疮疡\t305184\n165hz\t305185\n别小瞧\t305186\n3月30日\t305187\n交恶\t305188\n未闻花名\t305189\nchaturbate\t305190\n动手写\t305191\nexits\t305192\n途安论坛_汽车之家论坛\t305193\n丈门\t305194\n北京大学博雅计划\t305195\n男配\t305196\n西格尔\t305197\n健康性\t305198\n移民局\t305199\n保钓\t305200\nG933\t305201\n广州广播电视台\t305202\n华为m3\t305203\n实达集团\t305204\nUniversities\t305
205\n饯别\t305206\n草窝\t305207\n603019\t305208\nMOT\t305209\n扒谱\t305210\n卖命\t305211\n米饭粒返利网\t305212\n挡泥板\t305213\n闫肃\t305214\n57分钟\t305215\n大姑姐\t305216\n文玩坊\t305217\n1700亿\t305218\n大连市公共行政服务中心\t305219\nTimed\t305220\n亚甲基蓝\t305221\n骗钱\t305222\n心色\t305223\n骨版\t305224\n雄起\t305225\n百万英镑\t305226\n泡崖\t305227\n0312\t305228\n德天瀑布\t305229\n崇明生活网\t305230\n落榜\t305231\n湖南信用网\t305232\n萧华\t305233\n高楼大厦\t305234\nRedisTemplate\t305235\nV12.0\t305236\n制裁\t305237\n第二杯\t305238\n全鱼宴\t305239\n重庆建委\t305240\n下述\t305241\nLED灯条\t305242\noner\t305243\n安娜贝尔2:诞生\t305244\nPowermill\t305245\nwin7系统窗口\t305246\n紫云府\t305247\n391.35\t305248\n85天\t305249\n痛快\t305250\n飞渡\t305251\n无尽\t305252\nRainbow\t305253\n老鸭窝\t305254\n宅虫网\t305255\n成都市规划管理局\t305256\n鲁大妈影院\t305257\n万水\t305258\n五金城\t305259\n驱蚊水\t305260\n草船\t305261\n2017年5月9日\t305262\n真南路\t305263\n金科集团\t305264\n北京朝阳大望路公司\t305265\n黄春华\t305266\n03.31\t305267\n奖学金\t305268\n胺液\t305269\n佛山市南海区人民政府\t305270\n20160912\t305271\nlols7\t305272\n短焦投影机\t305273\n珠海站\t305274\n教字\t305275\npogo\t305276\n一内\t305277\n光伏太阳能网\t305278\n鑫茂\t305279\n步频\t305280\n第70届\t305281\n18.8元\t305282\n6sp\t305283\n侠盗5\t305284\n中华网校\t305285\n卫国道\t305286\n0854\t305287\n湖上\t305288\n贵阳乌\t305289\n1.03h\t305290\n你要的爱\t305291\n文艺家\t305292\nimoive\t305293\n利娴庄\t305294\n快乐天使\t305295\n_民\t305296\n宇宙\t305297\n忽冷忽热\t305298\n池体\t305299\n观沧海\t305300\n斯坦因\t305301\n无偏\t305302\nAnny\t305303\n城市病\t305304\nache\t305305\n思前想后\t305306\n兴德\t305307\n29张\t305308\n落地灯\t305309\niommu\t305310\n青浦新城\t305311\n网上作业\t305312\nHYPEBEAST\t305313\n龙洲\t305314\n最后一个字\t305315\nremeber\t305316\nsph\t305317\n4金\t305318\ncsp\t305319\n果树苗\t305320\n于禁\t305321\n生脉\t305322\n王闿运\t305323\n寿昌\t305324\n鸟语林\t305325\n交钱\t305326\n美丽湾\t305327\n柏崎星奈\t305328\nE04\t305329\n百度音乐播放器\t305330\narnold渲染器\t305331\n表页\t305332\n降耗\t305333\n没完成\t305334\ndemanding\t305335\n一小时内\t305336\n胡五岳\t305337\n红白机\t305338\n心源性休克\t305339\n微变版\t305340\n季前赛\t305341\n菲儿\t305342\n水友\t305343\nandroid.support.v7\t305344\n彭兰\t305345\npigeon\t305346\nKBShinya\t305347\n美基\t30
5348\n这季\t305349\n00号\t305350\n乌鲁木\t305351\n中国康复研究中心\t305352\n凉房\t305353\n9400\t305354\n第85届\t305355\n园岭\t305356\n珍珠传奇\t305357\n电视机顶盒\t305358\n舰姬\t305359\n阿鲁因\t305360\n16bit\t305361\n1842\t305362\n陶西\t305363\n定海新闻网\t305364\npaths\t305365\n凝血块\t305366\nAM3\t305367\n窦骁逆水寒\t305368\n蒙毅\t305369\n极昼\t305370\n华商基金\t305371\n撸撸鸟av\t305372\n宁波雅戈尔动物园\t305373\n海露玻璃酸钠滴眼液\t305374\n310万\t305375\n2918\t305376\n不可分割\t305377\n二手货\t305378\n虹色\t305379\n七尾\t305380\n日本之窗\t305381\nGFRIEND\t305382\n南岗\t305383\n动身\t305384\n八匹马\t305385\n第十四套\t305386\n丁锋\t305387\n花曜日\t305388\n河南话\t305389\n暖字\t305390\nKali中文网\t305391\n铅笔画\t305392\n吕绮玲\t305393\n极果\t305394\n仿版\t305395\n二项式分布\t305396\n雅君\t305397\n连笔字\t305398\n婆媳\t305399\n洋家乐\t305400\n杭州市人力资源和社会保障局\t305401\nec2b\t305402\n保险师\t305403\ncomfort\t305404\ninternship\t305405\n欧陆风云\t305406\n私募基金管理人\t305407\n紫云\t305408\n炸虾\t305409\n乐学\t305410\n金匮肾气丸\t305411\njeecms\t305412\n北京文博会\t305413\njmeter\t305414\n黑暗武士\t305415\n朱辉\t305416\n23张\t305417\n奇幻\t305418\n烤漆玻璃\t305419\n锁金村\t305420\n笔趣阁小说阅读网\t305421\n你的歌\t305422\n神器\t305423\n滨江大道\t305424\npirate\t305425\n命案\t305426\n美版s8\t305427\n三国群英\t305428\nmybatis-spring\t305429\n尸语者\t305430\n西厢记\t305431\n母文\t305432\ndeficient\t305433\n33页\t305434\n夏一\t305435\n朱敏\t305436\n教育园\t305437\n妙言\t305438\n丙醛\t305439\n颈霜\t305440\n波澜壮阔\t305441\n肠壁\t305442\nrdm\t305443\n切尔西足球俱乐部\t305444\n伴奏\t305445\n蚀刻\t305446\nDTD\t305447\nDespite\t305448\n苏有朋\t305449\n2台\t305450\n第二任\t305451\n黑花岗岩\t305452\n味口\t305453\n1200平米\t305454\n曳引式\t305455\n洞泾\t305456\n制冷压缩机\t305457\nuptodate\t305458\n备份表\t305459\n扶绥\t305460\n体定税\t305461\n唐心怡\t305462\n万通筋骨贴\t305463\n蒋巷镇\t305464\n天门市\t305465\n神雕侠侣古天乐版\t305466\n风流人物\t305467\n太过分\t305468\n杨柳村\t305469\nRESET\t305470\n新加坡留学联盟\t305471\n求商\t305472\n有章可循\t305473\n秒封\t305474\nskada\t305475\n蒙求\t305476\n我的秘密女友\t305477\n致\t305478\nYOUTUBE\t305479\n按份共有\t305480\n映美打印机\t305481\n阿诺·施瓦辛格\t305482\n金樽\t305483\nUICollectionViewCell\t305484\n太平洋战役\t305485\n3.3\t305486\n亨利二世\t305487\n狸\t305488\nmingc\t305489\n弧\t305490\n禹州\t30
5491\n含\t305492\n妙音鸾女\t305493\nAmplifier\t305494\n注册码+\t305495\n中星6B\t305496\n报价单模板\t305497\n阎利珉\t305498\n千树晴山\t305499\n哈迪斯\t305500\n蓦然\t305501\nGarfunkel\t305502\nPEMinecraft\t305503\n琉璃瓦\t305504\n52PK穿越火线\t305505\nsecurecrt\t305506\n钢坯\t305507\n全胃\t305508\nstrongerHuang\t305509\n张家界政府网\t305510\n雷云3.0\t305511\n北京控股集团有限公司\t305512\n心扉\t305513\n杜鹏程\t305514\n卫生\t305515\n江西消防\t305516\n紫薇苑\t305517\n名创优品miniso\t305518\n廖勇\t305519\n301\t305520\n少女心\t305521\nChoi\t305522\n3316\t305523\n步步高升\t305524\nnagios\t305525\n正义网\t305526\n联调联试\t305527\n双截棍\t305528\n江苏省工商局_江苏省工商行政管理局\t305529\n素片\t305530\n万博\t305531\nExtJS5\t305532\ng42\t305533\n邵山欢\t305534\n保持距离\t305535\n贝德\t305536\n黄桃罐头\t305537\n巴勃罗\t305538\n0809\t305539\n中诚信国际\t305540\n合道\t305541\n广州华多网络科技有限公司\t305542\n珙桐\t305543\n192.168.1.3\t305544\n720p\t305545\n转正申请书\t305546\n气体\t305547\n非同一控制\t305548\n即时比分网\t305549\n爬上去\t305550\n防装\t305551\n睿博\t305552\n重庆市市政管理委员会\t305553\n楚水\t305554\n孔乙\t305555\n全面战争2\t305556\nMaya\t305557\n复合袋\t305558\n连续工龄\t305559\n刻苦\t305560\n排烟\t305561\n梯梁\t305562\n锂离子蓄电池\t305563\n安美拉\t305564\n广州市地方税务局\t305565\nada\t305566\nAmaz\t305567\n张睿\t305568\n福德\t305569\n花岛\t305570\n5年间\t305571\n山杏\t305572\n米其林\t305573\nidle\t305574\n中性化\t305575\n泻立\t305576\n啶酰菌胺\t305577\n解放思想实事求是\t305578\nBD1024p|1080p\t305579\n勤勉\t305580\n桌签\t305581\n底皮\t305582\n唐久\t305583\n箱\t305584\n股公司\t305585\n防火毯\t305586\n突发公共事件\t305587\n黄旗\t305588\n虾螺\t305589\n梗阻性\t305590\n西园路\t305591\n新浪英雄联盟\t305592\n老蛇\t305593\n刀纹\t305594\n80K\t305595\n长江音乐节\t305596\nCRUG\t305597\n吕树\t305598\n丁芙妮\t305599\n浩文\t305600\n系统级\t305601\n新月派\t305602\n铃原爱\t305603\n刀枪\t305604\n南京南街\t305605\n吸脂手术\t305606\n淄博六中\t305607\n蔡文姬\t305608\nkidney\t305609\n蜂窝助手\t305610\n商改住\t305611\n百分之15\t305612\n辣子鸡\t305613\n变送器\t305614\n河图\t305615\n传菜员\t305616\n王arcv\t305617\n静地\t305618\nxsteel\t305619\n洞仙歌\t305620\n访谈法\t305621\n存储过程\t305622\n河南中医药大学第三附属医院\t305623\n宇文邕\t305624\nbolts\t305625\n牛骨汤\t305626\n网易将军令\t305627\n下泄\t305628\nWildlife\t305629\n人教版四年级下册语文\t305630\n营口自贸区\t305631\nNSL\t30
5632\n伴郎\t305633\n新浪江苏_新浪网\t305634\n桃屋猫\t305635\n衣服男\t305636\nREQUEST\t305637\n东风天龙\t305638\n茵苗\t305639\n长安福特汽车\t305640\n昌源\t305641\n张金华\t305642\nacdsee5.0\t305643\n东风广场\t305644\n斜拉桥\t305645\n学概论\t305646\n云系\t305647\nNBA数据库\t305648\n南沙区人民政府\t305649\ndesigners\t305650\n红号\t305651\n青花鱼\t305652\n长安福特福克斯\t305653\nfiestar\t305654\n红动\t305655\n小康梦\t305656\nlombok\t305657\n信阳茶叶网\t305658\n嘴馋\t305659\n宣恩县\t305660\nAlternate\t305661\n_页游网\t305662\n上海民政局\t305663\n三跟\t305664\nTimeTo\t305665\njuli\t305666\n太空步\t305667\n背线\t305668\n混服\t305669\ndeployment\t305670\n仿真花\t305671\n张耕\t305672\nwikiFeet\t305673\n求租\t305674\n明夏\t305675\n经得住\t305676\n翻建\t305677\n东方汇\t305678\n词卡\t305679\n图贴\t305680\n回转体\t305681\n张嘉文\t305682\n北京家圆医院\t305683\n独霸\t305684\n224个\t305685\n一双\t305686\n周大新\t305687\n小子\t305688\n育林\t305689\n赏光\t305690\n中马钦州\t305691\n超期\t305692\n25k\t305693\n审委会\t305694\n代理行\t305695\n李盛\t305696\n8MB\t305697\n弄碎\t305698\n雪中梅\t305699\n打滚\t305700\n3D打印机\t305701\nSINO\t305702\n太傻留学\t305703\n开国大典\t305704\n海洋学院\t305705\n公安部警用装备采购中心\t305706\nAmbition\t305707\nE家网\t305708\nleaks\t305709\n很久以前\t305710\ngclient\t305711\n高斯公式\t305712\n平湖国际进口商品城\t305713\n15所\t305714\n东非\t305715\n満足\t305716\n陕西省政协\t305717\n剑网三中\t305718\n闻过\t305719\n歌人\t305720\n淫蕩\t305721\nMix\t305722\n市交通局\t305723\n奥林匹斯\t305724\n华润水泥\t305725\n魔王\t305726\n000010\t305727\n再入\t305728\n夜魔侠\t305729\n职业武器\t305730\nxiaom\t305731\n习近平总书记系列重要讲话精神\t305732\n宇宙论\t305733\n多谐振荡器\t305734\n浊度计\t305735\n中华人民共和国合伙企业法\t305736\n判罪\t305737\nOE\t305738\nstrat\t305739\nWikipédia\t305740\n李雅\t305741\ndcblog\t305742\n1.0.1\t305743\n利尔\t305744\nprot\t305745\n仟吉西饼\t305746\n拔萃\t305747\n情路弯弯\t305748\n方差\t305749\n望而生畏\t305750\n武汉智康1对1\t305751\nnextcloud\t305752\npebble\t305753\n丁肇中\t305754\n散杂\t305755\n渣机\t305756\n莽莽\t305757\n成都市人力资源和社会保障局\t305758\n征集\t305759\n拓陆者\t305760\n綦美合\t305761\ncausing\t305762\n凌辰\t305763\nQQ头像\t305764\n末代\t305765\n火柴盒\t305766\n显微组织\t305767\n木条\t305768\ndow\t305769\n原相\t305770\n网狐荣耀\t305771\n密送\t305772\n宋青莲\t305773\n模仿者\t305774\nHC\t3
05775\n军阀\t305776\n伟光汇通\t305777\nPacewear\t305778\nplasma\t305779\n周防尊\t305780\n10件套\t305781\n弹簧式\t305782\n云效\t305783\n真空过滤器\t305784\n新国都\t305785\n几招\t305786\n延世大学\t305787\n滇中\t305788\n运费券\t305789\nDj版\t305790\n传动带\t305791\n品质好\t305792\n母版\t305793\n王忠\t305794\nmetadata\t305795\n模盒\t305796\n两处\t305797\n志气\t305798\n果核\t305799\n浩翔\t305800\n1第一章\t305801\n安视\t305802\n定薪\t305803\n河狸\t305804\n6535\t305805\n企业商学院\t305806\n梅兰芳大剧院\t305807\n组策略编辑器\t305808\n六尺巷\t305809\n罗衣\t305810\n民事案件案由规定\t305811\n吉林市教育局\t305812\n望山\t305813\n饭店\t305814\nPreserving\t305815\n鼎红\t305816\n祭\t305817\nt13\t305818\n漠不关心\t305819\noracle9\t305820\n奈克瑟斯奥特曼\t305821\nbaoming\t305822\n钧达股份\t305823\nopenmpi\t305824\n江大\t305825\nMonday\t305826\n无早\t305827\n豆豉酱\t305828\n清怪\t305829\n短融网\t305830\nunity2d\t305831\n圣餐\t305832\ned2k|bt\t305833\n踩在脚下\t305834\niphonx\t305835\n极品飞车20:复仇\t305836\n王影\t305837\n硬泡聚氨酯\t305838\n仇怨\t305839\n灵璧石\t305840\n上海尚德实验学校\t305841\n刘江峰\t305842\n广宗\t305843\n携带\t305844\n鄂州市政府\t305845\n达芙妮\t305846\n2017.8.1\t305847\n深信不疑\t305848\n肠功能紊乱\t305849\nEMT\t305850\nvns\t305851\n县政协\t305852\n900克\t305853\n唐志中\t305854\n尹成\t305855\n全才\t305856\n母妃\t305857\n300米\t305858\njiale\t305859\n栀子花\t305860\ntraxxas\t305861\n42本\t305862\nWAV分轨\t305863\n血牛\t305864\nExhibitions\t305865\n裁板\t305866\nps切片\t305867\n单列索引\t305868\n于海峰\t305869\n加计\t305870\n院儿\t305871\n东圃镇\t305872\nLT\t305873\ntrump\t305874\n黑莓priv\t305875\nionic2\t305876\n保单质押贷款\t305877\n敖东\t305878\n大松\t305879\n曲牌\t305880\n780g\t305881\n将台\t305882\n维克兹\t305883\n天津自贸区\t305884\nk2000\t305885\n秀禾\t305886\n什么期\t305887\n过审\t305888\n1-3\t305889\n莫妮卡·贝鲁奇\t305890\n女文\t305891\n云硬盘\t305892\nIBP\t305893\nBodi\t305894\nウェア\t305895\n大母\t305896\n植生袋\t305897\n锈湖\t305898\n冲压机器人\t305899\n排污量\t305900\n油站\t305901\n普屏\t305902\ninjection\t305903\nNeurology\t305904\n徽剧\t305905\n保险经纪机构\t305906\n18047期\t305907\n平信\t305908\n刘喆\t305909\n求支\t305910\n霜霜\t305911\nxdr\t305912\n河南省工业和信息化委员会\t305913\n178个\t305914\n凉秋\t305915\nBanff\t305916\n3205\t305917\n戏点\t305918\nCorps\t305919\n
四川大学锦城学院\t305920\n带通\t305921\n旋转盘\t305922\n车堂\t305923\n遏止\t305924\n绿药\t305925\n超人\t305926\nnethunter\t305927\n一平方\t305928\n788\t305929\n半条\t305930\nJacky\t305931\n长谷川理穗\t305932\n真三国无双7\t305933\nRemover\t305934\n京0105\t305935\n网络拼酒\t305936\n恐袭\t305937\n研友\t305938\n通道式\t305939\nApps_百度\t305940\n山东省立医院\t305941\n身体质量指数\t305942\n俞渝\t305943\n刘晓光\t305944\n睚眦必报\t305945\n晴晴\t305946\nhippo\t305947\n二年级语文\t305948\n黑瞳\t305949\n乳腺病\t305950\n喆\t305951\n酸败\t305952\nBitbucket\t305953\n背景层\t305954\n端景\t305955\n沙土\t305956\n若兮\t305957\n玛丽昂·歌迪亚\t305958\n自豪感\t305959\nx5\t305960\nk29\t305961\n恩杰\t305962\n构思\t305963\n王晓\t305964\n滑子菇\t305965\n消送\t305966\n折寿\t305967\n樱火龙\t305968\n插电式混合动力汽车\t305969\n站站\t305970\n朱学勤\t305971\n驰宏锌锗\t305972\nNSSet\t305973\nteamviewer13\t305974\n创业板\t305975\n尤文\t305976\nprior\t305977\ncodesign\t305978\n地牢猎手4\t305979\n订座\t305980\n光引发剂\t305981\n朝阳区幼儿园\t305982\n巴清\t305983\n守土\t305984\n鲁尔工业区\t305985\nmeetup\t305986\nProvince\t305987\n荣耀s8\t305988\n京发\t305989\n石墓\t305990\n上海东方明珠\t305991\n说事\t305992\nBB\t305993\n中国芯\t305994\n西南电力设计院\t305995\n国务院侨办\t305996\nST\t305997\n百千\t305998\n长宁来福士\t305999\n贵阳日报数字报\t306000\n风河\t306001\n毕业季\t306002\n布图\t306003\n临桂区人民政府\t306004\n外交学院\t306005\nkindEditor\t306006\n沉鱼\t306007\n所办\t306008\n木里\t306009\n广州生物岛\t306010\n上海花园\t306011\n全能神\t306012\n五常大米\t306013\n邮益宝\t306014\nCategories\t306015\n伊泰\t306016\nTon\t306017\n和昌集团\t306018\n轻盈\t306019\n透过\t306020\n二色\t306021\nChecks\t306022\n字母表\t306023\n胡一菲\t306024\n水谷心音\t306025\n揭阳机场\t306026\n笛卡尔\t306027\n沧海桑田\t306028\n拉皮条\t306029\n二哈\t306030\nlei\t306031\n弹射\t306032\n散打\t306033\nC哩C哩\t306034\n纯化\t306035\n导通\t306036\n众悦学车_众悦学车网\t306037\nigbt\t306038\n畅博\t306039\n此间\t306040\n绿色新闻网\t306041\n神来之笔\t306042\n鸽子肉\t306043\nspb\t306044\n天龙八部sf\t306045\n猪大肠\t306046\n德洲城\t306047\nUU\t306048\nword天\t306049\n铝盘\t306050\nenough\t306051\n女坦\t306052\n叶清明\t306053\n李光明\t306054\n装修费\t306055\n词霸\t306056\n蛇腹\t306057\n伽古拉\t306058\n事由\t306059\n泰安万达广场\t306060\n苏州车管所\t306061\n胆小怕事\t306062\n凉城路\t306063\n泡泡王\t306064\n流觞\t3060
65\nweep\t306066\n姜糖水\t306067\n暮云镇\t306068\n张卫国\t306069\n疯狂的麦咭\t306070\n大洋洲\t306071\n抛砖\t306072\n隐式\t306073\n失落叶\t306074\n振荡器\t306075\n安华金\t306076\n柔软度\t306077\n香山股份\t306078\n靠不住\t306079\ntina\t306080\n大麦茶\t306081\n瑞士莲\t306082\n推力球轴承\t306083\n国内版\t306084\n古厝\t306085\n打药\t306086\n量子态\t306087\n寻踪\t306088\n议付行\t306089\n李金平\t306090\n几章\t306091\n立象\t306092\n黏合\t306093\n黎婉华\t306094\nViVO\t306095\n一代\t306096\n发射管\t306097\ndeque\t306098\n战俘\t306099\n煲箱\t306100\n祝福语\t306101\n光比\t306102\n蓝人\t306103\nLuci\t306104\n繁华落尽\t306105\nDISCO\t306106\n308国道\t306107\n超越\t306108\n3mbang.com\t306109\nCOREL\t306110\n光学级\t306111\n止汗\t306112\n孙明\t306113\n耐量\t306114\n认购书\t306115\nseysa\t306116\n夏意\t306117\n全零\t306118\n铺砖\t306119\n中海集团\t306120\n92\t306121\n中华人民共和国大气污染防治法\t306122\n安检员\t306123\n行义\t306124\n意图\t306125\n成谜\t306126\n新浪乐居\t306127\n困意\t306128\n广东象棋网\t306129\n缩头\t306130\n生物质颗粒燃烧机\t306131\n拼购\t306132\n陈华伟\t306133\n全名单\t306134\n第二故乡\t306135\n龙子湖区\t306136\n蜘蛛侠:英雄归来\t306137\nqq农场\t306138\n带身\t306139\n菜花蛇\t306140\nformac\t306141\n小鸡快跑\t306142\n女人手\t306143\n深圳市幼儿园\t306144\n莫德海姆\t306145\n映出\t306146\n绝地战警\t306147\n野葱\t306148\n杭州地铁\t306149\n天凝镇\t306150\n贞观政要\t306151\n180厘米\t306152\nwavread\t306153\n吉祥夜\t306154\n直列\t306155\n藕\t306156\n8尺\t306157\n渗滤\t306158\n致死率\t306159\nWindowManager\t306160\n朗颂\t306161\ntapsonic\t306162\n策略\t306163\n连加\t306164\n租计征\t306165\n白灯\t306166\n东莞滨海湾新区\t306167\nlibev\t306168\n38条\t306169\n梁红玉\t306170\n河南能源\t306171\n渗线\t306172\n身份证照\t306173\n成天\t306174\n通力合作\t306175\n军套\t306176\n武汉地铁12号线\t306177\n常威\t306178\n时间轴\t306179\n水野朝阳\t306180\n金悦府\t306181\n第4段\t306182\n真宙\t306183\n血装\t306184\nCLIO\t306185\n刘利华\t306186\n中融国际信托有限公司\t306187\n苏州市版权局\t306188\nTHIRD\t306189\n安丘\t306190\n木乃伊2\t306191\n少儿图书馆\t306192\n案说纪\t306193\n毕业礼\t306194\n美公司\t306195\n星际争霸2虚空之遗\t306196\n环式\t306197\n7.6\t306198\nFINANCIAL\t306199\n阿门罗\t306200\n迷会\t306201\n秦安县\t306202\nJedi\t306203\n中国工农红军\t306204\n不妨一试\t306205\n中国乒乓球队\t306206\n西安市第五医院\t306207\nMIKI\t306208\n四海钓鱼\t306209\n易初莲花\t306210\nCytus\t306211\
nCoDeSys\t306212\n谎话\t306213\n旧馆\t306214\n56岁\t306215\n600901\t306216\n书话\t306217\n五排\t306218\nGOOGLE\t306219\n棉田\t306220\n莱尼尔\t306221\n太子河区\t306222\n商赛\t306223\n骄\t306224\nWDR5620\t306225\n德莱厄斯\t306226\n富贵逼人\t306227\n兴城古城\t306228\n雷芳\t306229\n澳门政府\t306230\n集水坑\t306231\n广东省第二中医院\t306232\n致敏\t306233\n江少陵\t306234\n中级财务管理\t306235\n调动\t306236\n气鬼\t306237\n鲁冠\t306238\n鱼与熊掌\t306239\n平添\t306240\n欧司朗\t306241\n双全法\t306242\n验证器\t306243\n扬州宝\t306244\nJE\t306245\n30多年\t306246\n潮州市\t306247\n朱庆育\t306248\n锦绣中华\t306249\n火影忍者全集\t306250\nMybatis-Generator\t306251\n自愈式\t306252\ng402\t306253\nwpf\t306254\n裕泰\t306255\nVRC\t306256\nbrocade\t306257\n搭一搭\t306258\n乐家\t306259\n分散型\t306260\n一小点\t306261\n糖心蛋\t306262\n牡丹烟\t306263\n三四十岁\t306264\n二战风云\t306265\n挖走\t306266\n青芜\t306267\n丹灶\t306268\nonDestroy\t306269\n华为荣耀3\t306270\n建设工\t306271\n安徽省卫生和计划生育委员会\t306272\n北汽福田汽车股份有限公司\t306273\n孕后\t306274\n古巷镇\t306275\n陈小鲁\t306276\n高金\t306277\nIkea\t306278\n一斑\t306279\n胡芦\t306280\n朱培浩\t306281\n卷文\t306282\n葛店南站\t306283\n4.87\t306284\n地弹\t306285\n前进\t306286\n超声波振子\t306287\nmean\t306288\n52石斛\t306289\n五子棋\t306290\n访谈类\t306291\n为什么这样\t306292\n庄\t306293\n太和堂\t306294\nAlvin\t306295\n入参\t306296\nHiberfil\t306297\nDARK\t306298\n无子\t306299\n河池\t306300\n克罗斯\t306301\n限行尾号查询\t306302\n六年级数学下册期中考试卷\t306303\n舒享\t306304\n2gd5\t306305\n中泰峰境\t306306\n水彩\t306307\nMiura\t306308\n施\t306309\n开罐\t306310\n衬氟\t306311\n三个火枪手\t306312\n摩根·弗里曼\t306313\n桔普茶\t306314\n共度\t306315\nditto\t306316\nih5\t306317\nanc\t306318\n厂车\t306319\nGTA3\t306320\nscss\t306321\n夏侯淳\t306322\n渔溪\t306323\n两串\t306324\nj=1\t306325\n高频电刀\t306326\nJa\t306327\nAngular2\t306328\n事业收入\t306329\n讲明\t306330\n宫保鸡丁\t306331\n防腐漆\t306332\n尧\t306333\n锐起\t306334\nSemaphore\t306335\n凯普\t306336\n虫妖\t306337\n常务理事\t306338\n泰国素万那普机场\t306339\n奇葩朵朵\t306340\nInstax\t306341\n中国产经新闻网\t306342\n瓜棚\t306343\n中国式摔跤\t306344\n外招\t306345\n启明星辰\t306346\nleangoo\t306347\n万科地产\t306348\n芥兰\t306349\nf9\t306350\n530u3c\t306351\n3421\t306352\n大塔\t306353\ncode39\t306354\n集约型\t306355\n必用\t306356\n松江河
镇\t306357\n方静\t306358\n复华置地\t306359\n三月十五\t306360\n速效\t306361\n总理府\t306362\n泡水车\t306363\n20世纪初\t306364\n新闻出版广电局\t306365\n实验炉\t306366\nactive\t306367\n简图\t306368\nCarrot\t306369\n赵兴\t306370\n礼貌\t306371\n天火\t306372\n放寒假\t306373\n杭网\t306374\ne46\t306375\n良驹\t306376\n侧影\t306377\n拾趣\t306378\nChinaJoy\t306379\njade6\t306380\nSatchel\t306381\n巨网\t306382\n洪福\t306383\n【号\t306384\n召唤兽\t306385\n瑕\t306386\n主张\t306387\n西点军校\t306388\n三菱变频器\t306389\nwinkawaks\t306390\nracemenu\t306391\n许俊\t306392\n精算师考试\t306393\nenema\t306394\n瑜伽轮\t306395\nword10\t306396\n氧化还原反应\t306397\n友谊南路\t306398\n东方新世界\t306399\n拜谢\t306400\n阿江\t306401\nonce\t306402\nplt.show\t306403\n银号\t306404\n解振华\t306405\n贴片\t306406\n象牙色\t306407\n59个\t306408\n041\t306409\n劳斯莱斯魅影\t306410\n相对原子质量表\t306411\n逐条\t306412\n维汉\t306413\nca724\t306414\n深圳市工商局\t306415\n夜夜欢\t306416\n200多块\t306417\n公文易作文网\t306418\n地下世界\t306419\n600句\t306420\nETA\t306421\n广电网\t306422\nsecurity\t306423\nnetflix\t306424\n从容应对\t306425\n能用\t306426\n龙珠GT\t306427\n长峪城\t306428\n新长安欧尚_长安欧尚A800\t306429\n两梯\t306430\n骐达论坛_汽车之家论坛\t306431\ndetected\t306432\n会同\t306433\n点列\t306434\n第三集\t306435\n黑夜传说5\t306436\n托拉斯\t306437\n电动缝纫机\t306438\n技师\t306439\n杨天真\t306440\nL385\t306441\nUCP\t306442\n茧镇奇缘\t306443\n赔付\t306444\n柱头\t306445\nbpf\t306446\n南京农大\t306447\n东山魁夷\t306448\n东方梦符祭\t306449\n0090\t306450\n阿赫玛托娃\t306451\n161路\t306452\n3攻略\t306453\n绝对定位\t306454\n纪念盘\t306455\n荣盛发展\t306456\n6.5亿元\t306457\n宝星\t306458\n联盟杯\t306459\nphoto\t306460\n复牌\t306461\n杨天宝\t306462\n罗绮\t306463\n明诗综\t306464\n国米\t306465\n親父\t306466\n儒林外史\t306467\n迷笛\t306468\n缓控释制剂\t306469\n善品\t306470\n大鸡鸡\t306471\nopecv\t306472\n南岩\t306473\n梦溪笔谈\t306474\n夜梦\t306475\nlan\t306476\n狗扑网\t306477\neju\t306478\n绿星\t306479\n打印社\t306480\n第一品\t306481\n火影忍者:究极忍者风暴\t306482\n花青素\t306483\n领世\t306484\nPortuguese\t306485\njnsenfeng\t306486\n3级\t306487\n应税劳务\t306488\n4399战舰少女r\t306489\nLEEP\t306490\n闽委\t306491\n于士博\t306492\n2g网络\t306493\nrapeutics\t306494\n御藏\t306495\n江门市人民政府\t306496\n19种\t306497\n失踪人\t306498\n鱼轮\t306499\n樱花虾\t30650
0\n图图网-tutu001.com\t306501\n戚本禹\t306502\n亚历山大帝国\t306503\nscifinder\t306504\n银盐\t306505\n我的电话\t306506\npny\t306507\n江苏省档案局\t306508\nZy\t306509\n二进制流\t306510\n木工板\t306511\ncvbnm\t306512\nunit10\t306513\n私访\t306514\n国家局\t306515\n中国膜工业协会\t306516\nVacuum\t306517\n何乐\t306518\n卡卡村\t306519\n宜居网\t306520\nsetprecision\t306521\n氢氧化钡\t306522\n查汶海滩\t306523\n关连\t306524\n瑙鲁\t306525\n飞度论坛\t306526\nshi\t306527\n马家龙\t306528\n新干县\t306529\n徐海星\t306530\n17【\t306531\n50克\t306532\ngetaway\t306533\n冲天\t306534\n迪哥\t306535\n体育课\t306536\n创世版\t306537\n蛋白\t306538\n王勃\t306539\n公路人\t306540\n沿海高速\t306541\n山水情\t306542\n2包\t306543\nQX50\t306544\n吐纳\t306545\nmysql存储过程\t306546\nschedule\t306547\n炮\t306548\n配齐\t306549\n30吨\t306550\n⒊\t306551\n郭艾伦\t306552\n威海大水泊机场\t306553\n120小时\t306554\nmoho\t306555\n惠山\t306556\nBytes\t306557\n69集\t306558\n圖文\t306559\n富丽堂皇\t306560\n中分节\t306561\n吉林省人民政府\t306562\n小石头\t306563\n中国民俗学网\t306564\n人兽交\t306565\n猪舍\t306566\n姫君\t306567\n硬式\t306568\n金在中\t306569\ntif\t306570\nGTX1080TI\t306571\n参须\t306572\n小鬼当家\t306573\nWILL\t306574\n神探夏洛克第四季\t306575\n有一种爱\t306576\n南环\t306577\n宋杨\t306578\n橙\t306579\n打浆\t306580\n熔接机\t306581\n炽\t306582\nせ\t306583\n本市\t306584\n张林\t306585\n养殖\t306586\n仙剑奇侠传3D\t306587\n片尾曲\t306588\n张伦\t306589\n邢台市人民医院\t306590\nMat矩阵\t306591\n企鹅FM\t306592\n商标评审委员会\t306593\n威海传媒网\t306594\nsubtotal函数\t306595\nPTFE\t306596\n微风\t306597\n阿里飞猪\t306598\n美丽中国乡村行\t306599\n超越者\t306600\n好高\t306601\nubunto\t306602\n萨日娜\t306603\n旗鼓相当\t306604\n周立新\t306605\n李华\t306606\n562\t306607\n夏彬\t306608\n卡特尔\t306609\nLuxury\t306610\n工管\t306611\n鸡皇杯\t306612\n下水道的美人鱼\t306613\n烧化\t306614\nLaws\t306615\n星群\t306616\n280克\t306617\nnormative\t306618\n弯月\t306619\n12千克\t306620\nMC\t306621\ndispute\t306622\naloha\t306623\n新乐市\t306624\n贯彻\t306625\n商小妹\t306626\n吸收剂\t306627\n刃牙\t306628\n延长段\t306629\n中国村\t306630\nExecutable\t306631\n中海凯旋门\t306632\n长春出版社\t306633\n_威\t306634\n提梁\t306635\njihite\t306636\n大航海时代Online\t306637\n昆特牌吧\t306638\n留学网\t306639\nprp\t306640\n御书屋\t306641\nhellip\t306642\n4i\t306643\n所得税收\t30
6644\n察\t306645\n振动给料机\t306646\nvue数组\t306647\n混沌之戒2\t306648\n兴平市\t306649\n余杭晨报数字报\t306650\n静电剂\t306651\n再创新\t306652\nFloor\t306653\n3月2日\t306654\n陈浩成\t306655\n鼯\t306656\n文录\t306657\n呼吸链\t306658\n库尔勒香梨\t306659\nkira\t306660\n拍子\t306661\n史密斯电热水器\t306662\n民进\t306663\n林小喜\t306664\n肉条\t306665\nTold\t306666\nΣ\t306667\n中国人民大学书报资料中心\t306668\n小红杏\t306669\n固体酒精\t306670\n大图网\t306671\n小僵尸\t306672\n日升\t306673\n扎达尔\t306674\n九家之书\t306675\n试件\t306676\n2015年7月份\t306677\n蓬门\t306678\n601路\t306679\n桐南美麓\t306680\n靶子\t306681\n小寻\t306682\n车管处\t306683\n绷带\t306684\n△\t306685\n三上真司\t306686\nJrain\t306687\n海铁联运\t306688\nMySQL5\t306689\n粘住\t306690\n权谋\t306691\nV2.4\t306692\n米朵\t306693\nv2014\t306694\nBakery\t306695\nICLR\t306696\n电器有限公司\t306697\nchiara\t306698\nwow猎人\t306699\nyxwkaifa\t306700\n票据法\t306701\n13.1\t306702\n债权转让\t306703\n帝豪斯\t306704\n迪恩\t306705\n西施壶\t306706\n深圳\t306707\n多夫\t306708\n85版\t306709\n支行长\t306710\n砾\t306711\n孔雀翎\t306712\n汽车族\t306713\n1957年\t306714\n增多\t306715\n医用级\t306716\n无所\t306717\n蔻辰\t306718\n酒坛\t306719\npreliminary\t306720\n髋关节\t306721\n苏桃子\t306722\n为你骄傲\t306723\n平移式\t306724\n发发\t306725\n环境工程\t306726\n白犀牛\t306727\n今日竹山网\t306728\n阿坝师范学院\t306729\n中国矿业联合会\t306730\nCFR\t306731\n半条命\t306732\n差函数\t306733\n179\t306734\nps调色\t306735\n毫伏表\t306736\n武林路\t306737\njumper\t306738\n奇创\t306739\nH1Z1\t306740\nRat\t306741\n收获日2吧\t306742\nword365\t306743\n类学\t306744\n工程管理硕士\t306745\n石化街道\t306746\n常州一中\t306747\n中共十三大\t306748\n江西小学\t306749\n死战\t306750\n枯草芽孢杆菌\t306751\n大修基金\t306752\n第二十三批\t306753\n十八酒坊\t306754\n双湖\t306755\n弹簧门\t306756\n小报\t306757\n删贴\t306758\n第129号\t306759\n快吧\t306760\njem\t306761\n青年记者\t306762\nXrd\t306763\n桑坦德银行\t306764\n硬件篇\t306765\n0giant\t306766\n四项\t306767\n债券\t306768\n致仕\t306769\n近观\t306770\n无以至\t306771\nRotating\t306772\n8厘\t306773\n勾当\t306774\n吊瓜\t306775\nadodc\t306776\n太平国际机场\t306777\nJoyce\t306778\nUnturned\t306779\n方岩\t306780\n恒湿\t306781\n0627\t306782\n_天网\t306783\n性贷\t306784\n结皮\t306785\n马铃薯\t306786\n2333日语\t306787\n省地方税务局\t306788\n公交一卡通\t306789\n暴龙\t30679
0\n刘瑜\t306791\n寒暑假\t306792\n首试\t306793\n大将军\t306794\nUI设计师\t306795\n板\t306796\n远处\t306797\n疾跑\t306798\n李京奎\t306799\n电潜泵\t306800\n塑料破碎机\t306801\nCentos6\t306802\n中国人民财产保险股份有限公司\t306803\n_求医网\t306804\nalleged\t306805\n26日\t306806\n敢为人先\t306807\n油讯\t306808\n卑微\t306809\n1900年\t306810\n松绑\t306811\n江门市国土资源局\t306812\n中博会\t306813\n大跌眼镜\t306814\n初油\t306815\n内厝镇\t306816\n高都\t306817\n南湾\t306818\n陈光梵\t306819\n彩讯科技\t306820\n大步\t306821\n同心结\t306822\n速评\t306823\n2018年3月31\t306824\n迫切性\t306825\n选举会议\t306826\nnanchang\t306827\n娼女\t306828\nnsurl\t306829\n仙人掌花\t306830\n昏招\t306831\n电算化\t306832\n大堰河\t306833\n三书\t306834\ndq8\t306835\ntest\t306836\n免单\t306837\n达坂城\t306838\n林珊珊\t306839\n推荐量\t306840\n方信\t306841\n6&#160\t306842\n克莱门汀\t306843\n砚山县\t306844\n20160726\t306845\nDCP\t306846\n誊\t306847\n第十二批\t306848\n天等县\t306849\n百乐78g\t306850\n新日\t306851\n枭雄\t306852\n楼机\t306853\naint\t306854\n应有格物致知精神\t306855\n臧\t306856\n阿尔茨海默病\t306857\n天才瑞普利\t306858\n竹雨\t306859\n株洲传媒网\t306860\n市旅发委\t306861\n东坡区委区政府\t306862\n谷雨\t306863\n苏府\t306864\n十多位\t306865\n三人组\t306866\n腾牛安卓网\t306867\n7828\t306868\n明责\t306869\niamtxt电子书\t306870\n以至\t306871\npso2\t306872\n铁马\t306873\n八皇后\t306874\napplicants\t306875\n蓝银皇\t306876\nModule6\t306877\n集中统一\t306878\nIf\t306879\n空客a320\t306880\n潮儿\t306881\n第一刊\t306882\n起止\t306883\n定比\t306884\n淠河\t306885\n寒战2\t306886\n圆镜\t306887\nFloat\t306888\n早安\t306889\n院办\t306890\n单侧\t306891\n芬迪\t306892\n汉川市公安局\t306893\n玖辛奈\t306894\n中国食品科学技术学会\t306895\n军粮\t306896\nBeyondCompare\t306897\n东环广场\t306898\n田林县\t306899\n吴佳尼\t306900\n巴州\t306901\n20171109\t306902\n住房公积金查询网\t306903\n全裸\t306904\n脏字\t306905\ncpp\t306906\n多经\t306907\n中职学校\t306908\n幸福生活\t306909\nbun\t306910\n超级偶像\t306911\nmalaysia\t306912\n微单镜头\t306913\n展示箱\t306914\n7200转\t306915\n共青团委员会\t306916\n左脑\t306917\n寒王\t306918\n唐一军\t306919\n淄博市国土资源局\t306920\n王者荣耀2017KPL\t306921\n高椅岭\t306922\n平米\t306923\n轻烃\t306924\n近十年\t306925\n华德安\t306926\n懒懒\t306927\nfederation\t306928\nSR卡\t306929\nDAE\t306930\n方嘉煜\t306931\n金拱门\t306932\n造布\t306933\n违心\t306934\n物联卡\t30
6935\n讨钱\t306936\n宁波李惠利医院\t306937\n西场\t306938\n意粉\t306939\n区块链交易所\t306940\n梅溪\t306941\n林达\t306942\n黑社\t306943\n于勇\t306944\n各系\t306945\n临汾市政府\t306946\n聚宝盆\t306947\n不暇\t306948\n赵雪\t306949\n虾子\t306950\n人头发\t306951\n自由舞\t306952\n杨威\t306953\n1145\t306954\n马原\t306955\nbilateral\t306956\n修罗天帝\t306957\n公孙无名\t306958\n精梳棉\t306959\n出租率\t306960\n门店\t306961\npfg\t306962\n张小飞\t306963\nScalable\t306964\n来安县人民政府\t306965\n告訴\t306966\n3.10.0\t306967\n刀枪不入\t306968\n租地\t306969\n18吨\t306970\n民族主义者\t306971\n蒋华\t306972\n欧几里\t306973\n国际创新园\t306974\n模糊神经网络\t306975\n跌至\t306976\n消费性\t306977\n3082\t306978\nbase64码\t306979\n赌气\t306980\n聚苯乙烯\t306981\n可归\t306982\n后爸\t306983\n果実\t306984\n袁天罡\t306985\n发运\t306986\n急性肝衰竭\t306987\n总黄酮\t306988\n发布稿\t306989\n奖票\t306990\n苗金利\t306991\n内江市\t306992\n遗诏\t306993\n江宁新闻网\t306994\n二零一七年\t306995\n肉皮冻\t306996\nfleece\t306997\n革命家\t306998\n抗氧化性\t306999\n2.04\t307000\nRPC\t307001\nJTEST\t307002\n10月份\t307003\n袁心玥\t307004\n20180118\t307005\n三作\t307006\n京东方科技集团股份有限公司\t307007\n火星人集成灶\t307008\n相对剩余价值\t307009\n摇臂\t307010\nMAGAZINE\t307011\n鼠尾草\t307012\n牛杂面\t307013\nShoulders\t307014\nerrmsg\t307015\n川人\t307016\n离地间隙\t307017\n睚\t307018\n非征期\t307019\n上海昆剧团\t307020\n竹苗\t307021\n亚热带\t307022\nasta\t307023\nTorrentKitty中文网\t307024\nmarta\t307025\n陶潜\t307026\nMARVEL\t307027\n观音桥步行街\t307028\n牛龙\t307029\n走珠笔\t307030\nsilver\t307031\nBTrabbit\t307032\nplaced\t307033\n3班\t307034\n自贡市\t307035\npexpect\t307036\n偶然所得\t307037\n卫滨区\t307038\n应市\t307039\n洋码头\t307040\nFolding\t307041\nWarm\t307042\n常德市\t307043\n剑阁县\t307044\n安卓版\t307045\ninstallers\t307046\nTSMC\t307047\n冈崎\t307048\n红丝\t307049\nswa\t307050\n除权日\t307051\n金龙镇\t307052\n高州市人民医院\t307053\n板上\t307054\n中国电网\t307055\n疏通机\t307056\n真空吸塑胶\t307057\nxplayer\t307058\n糖基化\t307059\nseasonal\t307060\n绥芬河市\t307061\n龙禧\t307062\n戈尔巴乔夫\t307063\n文泽\t307064\n净胜球\t307065\nEDP\t307066\nLCD液晶屏\t307067\n轻音乐\t307068\n合智\t307069\n13号星期五\t307070\n野三坡\t307071\n想不起\t307072\n逆流之战\t307073\n洪梅\t307074\nNiceLabel\t307075\n1189\t307076\n中产阶层\t307077\n中科院大连化学物理研
究所\t307078\n山西干部在线学院\t307079\nfast\t307080\n老鹰之歌\t307081\n局域网监控\t307082\n北青旅\t307083\ndaybreak\t307084\n国家图书馆\t307085\nlvextend\t307086\n6322\t307087\n500KB\t307088\n_财经中心\t307089\n玉女峰\t307090\n金牌榜\t307091\n童家\t307092\n中央纪律检查委员会\t307093\n命运之神\t307094\n高山峰\t307095\n马齿\t307096\n北京眼科医院\t307097\n北海市\t307098\n自分\t307099\ndblclick\t307100\n咧\t307101\n主会\t307102\n新晋网\t307103\nBartender\t307104\n空港街道\t307105\n分离式\t307106\n王宏斌\t307107\n沙隆达\t307108\nfine\t307109\n徽菜\t307110\n300km\t307111\n乘飞机\t307112\n上海交通大学\t307113\ncug\t307114\n独立本科段\t307115\n20130729\t307116\nnethogs\t307117\n工具变量\t307118\n人教版五年级语文下册\t307119\nlix\t307120\npathof\t307121\n1台\t307122\n冲压\t307123\n保利城\t307124\nThinkPHP5\t307125\n强世功\t307126\nFLEX\t307127\n糖业\t307128\n含沙射影\t307129\n丰收季\t307130\n本田翼\t307131\nhf线\t307132\n甲状腺瘤\t307133\nlipo\t307134\n压铸机\t307135\n百合川\t307136\n测报\t307137\n304\t307138\n卫宫切嗣\t307139\niso镜像文件\t307140\n高峰\t307141\n王恩哥\t307142\n年鉴\t307143\n四川大学化学工程学院\t307144\n启东\t307145\n曲直\t307146\n中南大学\t307147\n明世\t307148\n柴桥\t307149\nboogie\t307150\n不忘初心\t307151\n猫癣\t307152\n五年级作文-五年级作文\t307153\n伦勃朗\t307154\ntopics\t307155\n害苦\t307156\n2.2T\t307157\n刑满\t307158\n早朝\t307159\nInflatable\t307160\n爱建信托\t307161\n坐椅\t307162\n滦\t307163\nSystematic\t307164\n弯角\t307165\n车店\t307166\n日服\t307167\n人保健康\t307168\nU11\t307169\n三江源\t307170\n002925\t307171\n青禾男高\t307172\n測定\t307173\n爱民\t307174\n入党自传\t307175\n申请季\t307176\nWooCommerce\t307177\n网片\t307178\n金隅\t307179\n理正结构\t307180\n八宝网\t307181\n杀招\t307182\n盛会\t307183\n20180120\t307184\n长头发\t307185\n百里香\t307186\ngue\t307187\n吴伯凡\t307188\n文件重命名\t307189\nbtv6\t307190\n低氧血症\t307191\n绊子\t307192\nUUU9穿越火线\t307193\n330_\t307194\n张果老\t307195\n好迷\t307196\n兰友\t307197\n工作经费\t307198\n123家\t307199\n大考\t307200\nWhispers\t307201\ncolony\t307202\n啤酒杯\t307203\n洗手液\t307204\n重枪\t307205\nFPGA\t307206\n水利水电\t307207\n汉籍\t307208\n非婚生\t307209\neolinker\t307210\n普美\t307211\n长城科技\t307212\n地纬商机加盟网\t307213\n汽车销售招聘网\t307214\n非公开\t307215\n张德兰\t307216\nYMCA\t307217\n20.0000元\t307218\n三盛国际海岸\t307219\n
房地产门户网\t307220\n马佳佳\t307221\n穿戴\t307222\nChoo\t307223\n2144\t307224\n卡尺\t307225\n智能化\t307226\nministry\t307227\n请你帮个忙\t307228\n王雪晶\t307229\n专用权\t307230\n鲁磨路\t307231\n第八年\t307232\n30秒后\t307233\nkent\t307234\n9020\t307235\ntread\t307236\nsubroutine\t307237\n清货\t307238\n新螺蛳湾\t307239\n铳皇无尽的法夫纳\t307240\n真户晓\t307241\nSOLARZOOM\t307242\n赖茅酒\t307243\nForum\t307244\nc学习笔记\t307245\n挑战不可能\t307246\n北京新浪\t307247\n贺台庆\t307248\n天津住宅集团\t307249\nforattribute\t307250\n波通信\t307251\n0227\t307252\n鼓王\t307253\n小心肝\t307254\n牵狗\t307255\n闭角型青光眼\t307256\n接过\t307257\nelderly\t307258\n徐晓\t307259\n京都市\t307260\n第一实验小学\t307261\nJMM\t307262\ntide\t307263\n三十几\t307264\n流花\t307265\n人民日报批\t307266\niframe层\t307267\n物理内存\t307268\n李晓刚\t307269\n部门级\t307270\n1888元\t307271\n黑蜂胶\t307272\n44分钟\t307273\n危楼\t307274\n欢乐喜剧人4\t307275\n诺一\t307276\n不劳\t307277\nFlexible\t307278\n小号手\t307279\n水居\t307280\n黄芪桂枝五物汤\t307281\nYanu\t307282\n4225\t307283\nShaped\t307284\n遣词\t307285\n宫颈息肉\t307286\n耆\t307287\n旺销\t307288\n磅数\t307289\n_讯康网\t307290\nCaller\t307291\n装璜\t307292\n竞猜型\t307293\n弥儿\t307294\n哔卡哔卡\t307295\n徽府\t307296\n挡料\t307297\n宁波医院\t307298\nporo\t307299\n中国理财网\t307300\n喹硫平\t307301\n串联式\t307302\nv23\t307303\nINS\t307304\n地心人\t307305\n117号\t307306\n150\t307307\n绑架案\t307308\n打击垫\t307309\n沂蒙晚报网\t307310\n鱼羊野史\t307311\n无垠\t307312\n孔雀城\t307313\n开心学英语\t307314\n20几\t307315\n常州车管所\t307316\n九街\t307317\n控场\t307318\n鱼骨头\t307319\n大连公司\t307320\n京牌\t307321\n碧山\t307322\n巴克球\t307323\n大达\t307324\n油锯\t307325\n河北省省\t307326\n周回本\t307327\nElla\t307328\ntesto\t307329\npl线\t307330\n抓包\t307331\nUIScroll\t307332\n不露\t307333\n妖刀村\t307334\n罗坑镇\t307335\n小铃铛\t307336\n祁天道\t307337\n迩\t307338\ncim\t307339\n张明楷\t307340\n伴儿\t307341\n川口\t307342\n提踵\t307343\n削笔器\t307344\n素问\t307345\n边款\t307346\n正术\t307347\n装配体\t307348\n彰泰城\t307349\n展映\t307350\n小牛N1\t307351\n公牛集团有限公司\t307352\nmediumtext\t307353\n80期\t307354\n黑爵\t307355\ntnsnames\t307356\n雁荡\t307357\npets\t307358\n研钵\t307359\nEinstein\t307360\n先性\t307361\nMarble\t307362\n包保\t307363\n百度关键词优化\t307364\n布菲\t30
7365\n魅魔\t307366\n申正焕\t307367\nション\t307368\n上海狗民俱乐部\t307369\n二级经销商\t307370\n可治\t307371\nMOD整合包\t307372\n15.1\t307373\n天龙八部吧_\t307374\n点店\t307375\n阿多丸\t307376\n红德智库\t307377\n家庄\t307378\n叫床\t307379\nZIPP\t307380\n寒咳\t307381\nPSPICE\t307382\n大灰兔\t307383\n比特币区块浏览器\t307384\n背风\t307385\nir\t307386\n古井\t307387\n重庆移动\t307388\n巯基乙醇\t307389\n水晶龙\t307390\n希腊神话\t307391\nredhat7\t307392\n基哥\t307393\nmbt\t307394\n三亩\t307395\n邓拓\t307396\n凉拌土豆丝\t307397\n太平洋广场\t307398\n反邪\t307399\nTONY\t307400\n教育性\t307401\n大林镇\t307402\n垃圾袋\t307403\n行路\t307404\nplorer\t307405\n泰西\t307406\n骚舞\t307407\n大筒木舍人\t307408\n优衣库UNIQLO\t307409\n一坛\t307410\nLIST\t307411\n乐山职业技术学院\t307412\n城皮\t307413\n开闸\t307414\n王朔\t307415\n以次充好\t307416\n畅说\t307417\n征程\t307418\n爱信\t307419\n去甲肾上腺素\t307420\n塑钢窗\t307421\nPVideos\t307422\n追溯期\t307423\n王瑛\t307424\n十几个\t307425\n花纹板\t307426\n8253\t307427\nteens\t307428\n地震波\t307429\ni白金卡\t307430\n金泰希\t307431\nbirch\t307432\n汉赋\t307433\nway\t307434\n原唱\t307435\nDPO\t307436\nDELPHI\t307437\n张国强\t307438\n细胞培养基\t307439\n贵性\t307440\n车迷网\t307441\n黄埔\t307442\n惠\t307443\n16T\t307444\n史林\t307445\nWrt\t307446\n猪血\t307447\n私募基金登记\t307448\n冒险岛079\t307449\n漂白\t307450\n秦皇岛站\t307451\n0030\t307452\n奶包\t307453\n上汽大众汽车\t307454\n影影\t307455\n万家花园\t307456\n吉利S1\t307457\nrez\t307458\necilpse\t307459\n脑力劳动\t307460\n票证\t307461\n徐麟\t307462\n7810\t307463\n激安\t307464\n埃利斯\t307465\n八代\t307466\n中古车\t307467\n太白湖\t307468\n汤逊湖壹号\t307469\n脑筋急转弯_\t307470\n7毛\t307471\n榴弹枪\t307472\n董涛\t307473\n丸剂\t307474\n石渠县\t307475\nlhgdialog\t307476\n百度离线地图\t307477\n彩讯\t307478\nP1口\t307479\n青岛地铁2号线西段\t307480\nlvse\t307481\nreflux\t307482\n92GAME\t307483\n康爱多掌上药店\t307484\n毛蟹\t307485\niterations\t307486\n浮生未歇\t307487\n不轨\t307488\n赵五娘\t307489\nSPIN\t307490\n圣器\t307491\n打退\t307492\n好开森\t307493\nosmdroid\t307494\nwin10m\t307495\n元件库\t307496\nRX5\t307497\n挽星\t307498\n接下来\t307499\n学而思小学\t307500\n该剧\t307501\n20130817\t307502\nansys\t307503\n蜀桑源\t307504\n华中科技大学图书馆\t307505\n儿童性\t307506\n忍耐力\t307507\n汽车东站\t307508\nTrademark\t307509\n配煤\t307510\
ntcg\t307511\n农作物\t307512\nDatasheet|\t307513\n54种\t307514\nNEU\t307515\n浪子\t307516\n肛门镜检查\t307517\n院史\t307518\n真小\t307519\n日食\t307520\n昆明市国土资源局\t307521\n长租公寓\t307522\npeach\t307523\n700m\t307524\n丽得姿\t307525\n60M\t307526\n小魔王\t307527\n雷克萨斯es200\t307528\n半面妆\t307529\n安道麦\t307530\n长宁县\t307531\ngmxx\t307532\n乐多\t307533\n濑心美\t307534\n预膜\t307535\nvanilla\t307536\n元老级\t307537\n周麻婆\t307538\n百万发\t307539\n珍情\t307540\n里赫特\t307541\n六十四卦\t307542\n酿酒机\t307543\n体积\t307544\n竖梁\t307545\n投敌\t307546\nWaking\t307547\n何炅\t307548\nP2055\t307549\n阿修\t307550\n蒙特梭利幼儿园\t307551\npfm\t307552\n张湾\t307553\n百辆\t307554\n2克拉\t307555\n贴心人\t307556\n捷豹XFL\t307557\n黑暗之魂3\t307558\n徐艳\t307559\n黄志忠\t307560\n中国国际展览中心新馆\t307561\n信长之野望创造战国立志传\t307562\n宝座\t307563\n广东纪委\t307564\nCoherent\t307565\n贵州省公共资源交易中心\t307566\n斑羚\t307567\n江苏地税\t307568\n药师\t307569\n五险\t307570\n四代雷影\t307571\n刷新率\t307572\n李劼人\t307573\nFenty\t307574\nmaxdos\t307575\n丹德里恩\t307576\n猪料\t307577\n23118元\t307578\n回味无穷\t307579\nNadia\t307580\n北海公园\t307581\n檀香刑\t307582\n海街日记\t307583\n分园\t307584\n李瑞英\t307585\ne英语学习网\t307586\n3.58\t307587\n会签\t307588\n东风风神ax7\t307589\n聚宝源\t307590\nGum\t307591\n鲜红\t307592\n西财\t307593\njni\t307594\n口子\t307595\n远走高飞\t307596\n微信银行卡\t307597\n杨建华\t307598\n五丰\t307599\n李贞\t307600\n木鱼石\t307601\nradio\t307602\n威海环翠区政府\t307603\n微图\t307604\n摩友\t307605\n三尖瓣返流\t307606\n慈溪中学\t307607\n1358\t307608\n大宁郁金香公园\t307609\n悼\t307610\nMathType数学公式编辑器\t307611\nsouth\t307612\n建筑工程施工许可证\t307613\n据此\t307614\n阿长\t307615\nhguc\t307616\n五十厘米\t307617\n熊猫桑\t307618\n大脸猫\t307619\n迅雷网盘\t307620\n4月3号\t307621\n白飞飞\t307622\ncmv\t307623\nmx3\t307624\n七天前\t307625\n分析库\t307626\n江苏省农委\t307627\n摄像头\t307628\n问计\t307629\n报告会\t307630\n起作用\t307631\n而战\t307632\npumpkin\t307633\n翠花\t307634\n区排\t307635\n大旗\t307636\n天津市环境保护局\t307637\n粉龙\t307638\n威卢克斯\t307639\n黑崎君\t307640\n4棵\t307641\n动员会\t307642\n期号\t307643\n虚幻引擎4\t307644\n100颗\t307645\n宿根\t307646\n山灵m3s\t307647\nxpro2\t307648\nMT4\t307649\n始建\t307650\nHongMaJu\t307651\n昆山市中医医院\t307652\n神掌\t307653\nAlloys\t307654\n简一
\t307655\n科沃绝世唐门\t307656\n买卖通\t307657\n安贞门\t307658\n塞伯利亚\t307659\nECC\t307660\n格罗宁根\t307661\n茵陈\t307662\n陈颖恩\t307663\n陈若仪\t307664\n2015年4月24日\t307665\n稚\t307666\n4月27\t307667\n镇远县\t307668\n共犯\t307669\nPho\t307670\n痛不痛\t307671\n54家\t307672\n调表\t307673\n十天左右\t307674\n鸳梦\t307675\nExceptional\t307676\njpc\t307677\n小米手机2A\t307678\nBiosciences\t307679\nCHC\t307680\n张家界市区\t307681\n申科股份\t307682\n郑薇薇\t307683\n3万亿元\t307684\n乳钙\t307685\nrili\t307686\n留位\t307687\n寨板\t307688\nJUNIOR\t307689\n宋相思\t307690\n湖南政协\t307691\n柳岩窦文涛\t307692\n西安交通大学第一附属医院\t307693\n赤壁铜雀台\t307694\n4月16日起\t307695\n苕粉\t307696\n幻想大陆战记\t307697\n2015年11月20日\t307698\n306浏览器\t307699\n火影忍者手游\t307700\n闭环传递函数\t307701\nM.2接口\t307702\n五黄\t307703\n二卡\t307704\nFreeCAD\t307705\n茶店子车站\t307706\n黑龙江省人力资源和社会保障厅\t307707\nk30\t307708\n崔永元\t307709\n冒险岛OL\t307710\n魔秀\t307711\n世界工厂网\t307712\ngmtime\t307713\nopenblas\t307714\npolar\t307715\n1.5.23\t307716\n中华苏维埃共和国\t307717\n雷克萨斯GS\t307718\nNebraska\t307719\n印染\t307720\n贷款基准利率\t307721\n大卫席尔瓦\t307722\n秋末\t307723\n96dpi\t307724\n小米1\t307725\n粘纤\t307726\n跃点数\t307727\nRetry\t307728\n淮南新闻网\t307729\nRBQ\t307730\n独在异乡\t307731\n夹紧\t307732\n抠\t307733\n克利福德\t307734\nCCTV-3\t307735\n党情\t307736\n罗喉\t307737\nvariational\t307738\n旦夕\t307739\n氢元素\t307740\n】论\t307741\n万蒂妮\t307742\n绝对化\t307743\n头牛\t307744\nvotre\t307745\n施拉姆\t307746\n欣旺\t307747\n境界线\t307748\n贷款问答-希财网\t307749\n毁伤\t307750\npound\t307751\n司务\t307752\n释明\t307753\n新城小学\t307754\n晶哥\t307755\n手术衣\t307756\n中国科学院人事局\t307757\n柱筋\t307758\n铁哥们\t307759\n上海中学国际部\t307760\n全血\t307761\nkbg\t307762\nHb\t307763\nM010\t307764\n该区\t307765\n领地集团\t307766\n绳梦园\t307767\n1896\t307768\nplymouth\t307769\n55秒\t307770\n植木\t307771\n豌豆尖\t307772\n免费在线漫画网\t307773\n党心\t307774\n仪表盘\t307775\n不臭脚\t307776\n吸水纸\t307777\n六处\t307778\nqinghai\t307779\n样品册\t307780\n诊部\t307781\n鸡飞狗跳\t307782\nMOSCHINO\t307783\n四姑娘山\t307784\n大洋路\t307785\n亚马孙河\t307786\n一罐\t307787\n压顶\t307788\n姜堰市\t307789\n资产置换\t307790\nanson\t307791\n矮柜\t307792\nhuya\t307793\n圣姑\t307794\nBitTorrent\t307795\n8盒\t30
7796\n毛圈\t307797\n君怀\t307798\n水富县\t307799\n微物\t307800\n七八万\t307801\n住邦\t307802\n万向联轴器\t307803\n时光标\t307804\n打磨\t307805\n拉肚子\t307806\n生师\t307807\n张卫平\t307808\n皮碗\t307809\n电动葫芦\t307810\n绑定\t307811\n远哭5\t307812\nOpenBLAS\t307813\na21\t307814\n雅俊\t307815\n费希尔\t307816\n荣耀s11\t307817\n看作\t307818\n荤场子\t307819\n蕾丝袜\t307820\n上海市五官科医院\t307821\n武汉工业学院\t307822\n物业管家\t307823\n橘红\t307824\n幻魔\t307825\n台安\t307826\n北京顺义集散中心\t307827\n兰芝\t307828\n猛鬼街\t307829\n空孕囊\t307830\n299美元\t307831\n武汉学院\t307832\nChocolatey\t307833\nLc\t307834\n办班\t307835\n口舌\t307836\n杀戮尖塔吧\t307837\n奇米网\t307838\n疯狂动物\t307839\n常思\t307840\n七星级酒店\t307841\n华西附二院\t307842\nPeking\t307843\n七大姑八大姨\t307844\n优和\t307845\n字面\t307846\n蜜雪薇琪\t307847\nDamien\t307848\n00000\t307849\n明媒正娶\t307850\n开尔文\t307851\n音爆\t307852\n样体\t307853\nrenaissance\t307854\nwaymo\t307855\n湖南省政协\t307856\nSSHD\t307857\n日东\t307858\n子宫前位\t307859\nCPU风扇\t307860\n纪元1404吧\t307861\n士别三日\t307862\n诺安基金\t307863\n经济系\t307864\n鼻氧管\t307865\n爱否科技\t307866\n分块矩阵\t307867\n乳腺结节\t307868\n翘头\t307869\n桃浦\t307870\n高淳老街\t307871\n养蛇\t307872\n四分之\t307873\nGTA5\t307874\n有界函数\t307875\n玛洛加尔\t307876\n怪猫\t307877\n与非网\t307878\n正负\t307879\n阿利伯克\t307880\n10多亿\t307881\n西海岸\t307882\n胡姬\t307883\n港台\t307884\n短跑\t307885\n加速\t307886\n网上保险服务\t307887\nFarewell\t307888\nzkteco\t307889\n天天练小学\t307890\n绿茶餐厅\t307891\n戴锦华\t307892\n大蕉\t307893\n2050\t307894\nWHEN\t307895\n翻篇\t307896\n舜华路\t307897\nuntracked\t307898\n庵堂\t307899\n亚当·桑德勒\t307900\n胎躁_路躁_\t307901\n木匾\t307902\n孟涛\t307903\n4730\t307904\n盐酸帕罗西汀\t307905\n沙袋\t307906\n坐位体前屈\t307907\n玛丽·居里\t307908\n巴拿马城\t307909\nv1.4.0\t307910\n暮刃\t307911\n第一二章\t307912\n牛客网\t307913\n校线\t307914\n搏杀\t307915\n159五\t307916\n女经理\t307917\n松山新区\t307918\nvisdom\t307919\n迁徙\t307920\n25KG\t307921\n图灵机\t307922\n御夫座\t307923\nniche\t307924\n2.2亿元\t307925\n题\t307926\n品牌管理有限公司\t307927\nbookstrap\t307928\nJ2SE\t307929\nw-8ben\t307930\n中国智能交通协会\t307931\n贴补\t307932\n大董烤鸭店\t307933\nTOP7\t307934\n神经干\t307935\niea\t307936\n北庄村\t307937\n泾渭新城\t307938\n金股\t307939\nincline\t307940\n质能方程
\t307941\n带鱼\t307942\n下日\t307943\nMinerals\t307944\nv信\t307945\nHydrogen\t307946\n升降式\t307947\n活力版\t307948\n高污染\t307949\nPSY\t307950\n曹全\t307951\nConnecticut\t307952\n宾川县\t307953\n门夹\t307954\n6313\t307955\n三农政策\t307956\nparting\t307957\nmaru\t307958\n1.6GB\t307959\n翡翠郡\t307960\n動画\t307961\n教书匠\t307962\n世预赛\t307963\n绍兴古城\t307964\n闷骚型\t307965\ndnf机械吧\t307966\nPBX\t307967\ntrajectory\t307968\n维A酸\t307969\n曝工资\t307970\n&quest\t307971\n软键盘\t307972\n甲醇燃料电池\t307973\nKilo\t307974\nsafair\t307975\n杏子\t307976\n精整\t307977\n绿城物业服务集团有限公司\t307978\n双彩\t307979\n12份\t307980\n深入骨髓\t307981\n渣夫\t307982\n8个多月\t307983\n迪克猪\t307984\nsat2\t307985\n畜力\t307986\n白红\t307987\nplayer播放器\t307988\n桂花糕\t307989\n土系\t307990\n珠江广场\t307991\nwanz\t307992\n上海交通大学物理与天文学院\t307993\n建水紫陶\t307994\n第一小时\t307995\n十声\t307996\n120元\t307997\nproject2013\t307998\n酚妥拉明\t307999\n玉峰山\t308000\ngaon\t308001\nRam\t308002\n号点\t308003\nmeer\t308004\n性体\t308005\n大眼妹\t308006\nLegion\t308007\nCodeLite\t308008\nRedisson\t308009\ne通\t308010\ngtx950m\t308011\nChem\t308012\n池面\t308013\n徐世奎\t308014\nsucceeded\t308015\nX250\t308016\nmps\t308017\n国军\t308018\n断头\t308019\n引产\t308020\n槐角\t308021\n理路\t308022\nForensics\t308023\n迪卡侬运动超市\t308024\n重庆市气象局\t308025\noring\t308026\n花豹\t308027\nNotify\t308028\n国际学校网\t308029\nNC63\t308030\n细粮\t308031\n湘乡市\t308032\n木笔\t308033\n我的蚂蚁\t308034\nendpoints\t308035\n苏黎世联邦理工学院\t308036\nE09\t308037\nEc\t308038\n付岩洞\t308039\n好队\t308040\nslices\t308041\n真空吸尘器\t308042\n短少\t308043\n阶梯式\t308044\n罢休\t308045\n1200张\t308046\n宁波中考网\t308047\n周勇\t308048\n斤斤\t308049\nwog\t308050\n郑媛媛\t308051\n鲍\t308052\nEnSight\t308053\n大沢佑香\t308054\nPPH\t308055\n机关事务管理局\t308056\n相关性\t308057\n古庙\t308058\nspringfox\t308059\nsqlserver\t308060\n巴士龙之谷\t308061\n涤纶布\t308062\n海王星\t308063\n米迦尔\t308064\n3x3\t308065\n常州博爱医院\t308066\n学富五车\t308067\n数万\t308068\n良\t308069\n郑容和\t308070\n军群\t308071\n黄浦湾\t308072\n养成之路\t308073\n高希霸\t308074\n浅意\t308075\n喜哥\t308076\n贵定县\t308077\n压缩率\t308078\n明里\t308079\n雷迪克\t308080\n王红卫\t308081\nECharts\t308082\n单语\t308
083\n对数公式\t308084\n杀人放火\t308085\n重改\t308086\n无影灯\t308087\n安士\t308088\n永恒纪元\t308089\n大小孩\t308090\n64年\t308091\n鬼话\t308092\neverytime\t308093\ntoolbar\t308094\n3175\t308095\n西厨\t308096\n永不败\t308097\n健客\t308098\n10.1%\t308099\n同州\t308100\n莲池\t308101\nOracle行\t308102\n汾河湾\t308103\n探索性因子分析\t308104\nTHU\t308105\n盖世豪侠\t308106\n情況\t308107\n极简版\t308108\nJR们\t308109\n暴汗\t308110\nNDVI\t308111\nae模板\t308112\n外账\t308113\n杀手之王\t308114\n忠信镇\t308115\n吉林工程技术师范学院\t308116\n单排\t308117\n海树\t308118\n麦鲁小城\t308119\n互补\t308120\nsenior\t308121\n这一路\t308122\n九朝\t308123\nUIViewController\t308124\nwenge\t308125\n一村一品示范村镇\t308126\n博雅网\t308127\n丹竹头\t308128\n花臂\t308129\n70a\t308130\n删剪\t308131\n一次机\t308132\nmeiguo\t308133\n王者荣耀曹操\t308134\n老道\t308135\n19岁\t308136\n北帝\t308137\nlct\t308138\n沈佳宜\t308139\n铁岭百姓网\t308140\n卫生间隔\t308141\n上海公交网\t308142\neizo\t308143\n君子兰\t308144\nmse\t308145\n入林\t308146\n车车\t308147\n餐机\t308148\n道恩强森\t308149\n高化学\t308150\n访日\t308151\n商路\t308152\n明察暗访\t308153\n吕布三国演义\t308154\n居留权\t308155\n栾\t308156\narray_merge\t308157\n格兰岛\t308158\n安措费\t308159\nPyspark\t308160\n燮\t308161\nlineageOS\t308162\n白云宾馆\t308163\n伊辛\t308164\n防止器\t308165\n杭师大附中\t308166\n间性\t308167\n逆转裁判\t308168\nGaming\t308169\n吉儿\t308170\n汤本\t308171\n徐卫东\t308172\n金老\t308173\n20171225\t308174\n金都华府\t308175\nassumptions\t308176\nH7N9\t308177\n传声\t308178\n海土\t308179\n揭阳一中\t308180\npsp模拟器\t308181\nhaval\t308182\n授权率\t308183\nCookware\t308184\n21年\t308185\n中心路\t308186\n678\t308187\nBOE\t308188\n投档分\t308189\n黄羽\t308190\n广式\t308191\n造型机\t308192\n台东区\t308193\n天津市中医药研究院附属医院\t308194\nMACOSX\t308195\n杜尚别\t308196\n蠢\t308197\nWRX\t308198\n武汉市公安局出入境管理局\t308199\n电用\t308200\n瓷壶\t308201\nregis\t308202\n截单\t308203\nJMedia\t308204\n纳妃\t308205\nanticipate\t308206\n新锦\t308207\n改修\t308208\n高家镇\t308209\n星际文\t308210\nbal\t308211\n龙华镇\t308212\nDesigning\t308213\n寻人启事网\t308214\ndiffers\t308215\n脱硝\t308216\n香江百货\t308217\n德华\t308218\n报考者\t308219\njiemu\t308220\n帧结构\t308221\nPredicting\t308222\n油毡\t308223\n虹桥悦府\t308224\n上海第九人民医院\t308225\n金苔鼠\t30
8226\niPhone6p\t308227\n张海宇\t308228\n弗洛\t308229\n驴肉火烧\t308230\nhunta\t308231\n幻灵\t308232\n上海都市网\t308233\n皮试\t308234\n告警\t308235\n穗子\t308236\ncoa\t308237\n万能文件格式转换器\t308238\n校方\t308239\n入乎\t308240\n李天腾\t308241\n文兴\t308242\nOpenIV\t308243\n王雅南极之恋\t308244\nTied\t308245\n唯亭\t308246\n激振器\t308247\n开间\t308248\n南宁会展中心\t308249\n凯帝\t308250\n第九场\t308251\n风玫瑰\t308252\n黑庄户\t308253\n中看\t308254\n新春\t308255\ntorch\t308256\n支线\t308257\n拼豆豆\t308258\n科德\t308259\n东方天气网\t308260\n张旭超\t308261\nATV\t308262\n程勇\t308263\nshiqi\t308264\n射箭馆\t308265\n贾凯\t308266\n世茂龙湾\t308267\nmailbox\t308268\n痘坑\t308269\n挑开\t308270\n1.1.1.1\t308271\n退步\t308272\n280公里\t308273\n公假\t308274\n牵拉\t308275\n环球影城\t308276\n织袜机\t308277\nyy6080\t308278\n春杯\t308279\nraindream\t308280\n锐眼\t308281\n稻草\t308282\n翻越\t308283\n晚明\t308284\n五神\t308285\n博洋家纺\t308286\n优嘉\t308287\n同信\t308288\n南房网\t308289\n海马M3\t308290\nVVC\t308291\nSTP\t308292\n小儿歌\t308293\n华联超市\t308294\n地材\t308295\n师市\t308296\n皓\t308297\n大纵湖\t308298\n旧贴\t308299\n100万平方米\t308300\n1.5亿吨\t308301\n进位加法\t308302\n中兴社区\t308303\n压缝\t308304\n过急\t308305\nfhd\t308306\n乔字\t308307\n原动机\t308308\nIDE\t308309\n图图犬\t308310\n郑晓\t308311\n亚平宁\t308312\n煤量\t308313\n海论网\t308314\n三国争霸\t308315\n2846\t308316\n鹿鼎记2神龙教\t308317\ntwany\t308318\n窗门\t308319\n龙滩\t308320\n成都高新区\t308321\nSPAN\t308322\n产品结构\t308323\n窝边草\t308324\n恋恋不忘\t308325\n岗厦\t308326\n必\t308327\n付爱宝\t308328\n懒汉式\t308329\n绿蛙\t308330\n公租\t308331\n方威\t308332\n三碗\t308333\n水晶湖郡\t308334\n龙利鱼\t308335\n租房合同|华律\t308336\n好乐\t308337\n狗尾巴\t308338\nkthread\t308339\n2017年5月27日\t308340\nB50\t308341\n相宜本草\t308342\n李嘉欣\t308343\n锦溪\t308344\n红粉\t308345\n红瑞乐邦\t308346\noo\t308347\nepic\t308348\n后宫类\t308349\n海星\t308350\n卡佛\t308351\n小马虎\t308352\n香泡\t308353\n黄奕\t308354\n金纺柔顺剂\t308355\n产\t308356\n合肥市质量技术监督局\t308357\n2.71\t308358\n恒昌credithc.com\t308359\n住建部\t308360\n小米吧_\t308361\n目录表\t308362\n抚顺县\t308363\n噶\t308364\n析产\t308365\n真的错\t308366\n开课\t308367\n承载量\t308368\nxc40\t308369\n巴瑞\t308370\n锋\t308371\n拍照机\t308372\n中蒙俄\t308373\n胆汁性\t308374\n母子性\t308375\nrows
\t308376\n有史以来\t308377\nCNBrass\t308378\n不亦乐乎\t308379\n木梯\t308380\n自助仓\t308381\nc#\t308382\n壶身\t308383\n农舍\t308384\n1架\t308385\n大遗址\t308386\n北京的春节\t308387\nPractices\t308388\n高开低走\t308389\n高中\t308390\n八载\t308391\nalpha\t308392\nBIRT\t308393\n王凌\t308394\nmp\t308395\n珍珠鸟\t308396\n西安市第三医院\t308397\n电战\t308398\n钠盐\t308399\n前史\t308400\nteranext\t308401\nkissing\t308402\n哑粉纸\t308403\n日本雅虎\t308404\n绥阳县\t308405\n诞生史\t308406\n里外\t308407\n王德威\t308408\n边箱\t308409\n第01卷\t308410\n黑暗界\t308411\n蘑菇头\t308412\n原州\t308413\n河钢集团有限公司\t308414\ncurry\t308415\n00018\t308416\n浴球\t308417\n苹果6P\t308418\n98.6\t308419\nscripting\t308420\nDesi\t308421\nappenders\t308422\n再演\t308423\n色条\t308424\n标幺值\t308425\n狄芳\t308426\n平权\t308427\n一脱成名\t308428\n古琴\t308429\n香港卫生署\t308430\n玻化砖\t308431\n生态工业园\t308432\n壹码\t308433\n在我身边\t308434\n多库\t308435\n余慈\t308436\n文物保护\t308437\n48元\t308438\n天月\t308439\n赖敏\t308440\n生存启示录\t308441\n毛孩\t308442\n梦幻69\t308443\n余震\t308444\n欢乐中国人\t308445\n金城路\t308446\n11关\t308447\nwwwbbb657commbdbaiducom\t308448\nGUY\t308449\n迎宾路\t308450\n早教&幼儿园\t308451\nOW\t308452\n放站\t308453\nwaimai\t308454\n多囊卵巢综合征\t308455\n婉\t308456\njoke\t308457\n魂技\t308458\n喉咙痛\t308459\nMah\t308460\n良值\t308461\n为什\t308462\n全国社会保障基金理事会\t308463\n录取者\t308464\n销赃\t308465\n直线职能制\t308466\n希望ol吧\t308467\n北京现代悦动\t308468\n博斯科维奇\t308469\nwangluo\t308470\n折800网\t308471\nTrauma\t308472\n22斤\t308473\n中国劳动关系网\t308474\n新睿\t308475\n3分钟内\t308476\n皮村\t308477\n杰森王传福\t308478\nmology\t308479\n哈努曼\t308480\n奇大\t308481\n维氏\t308482\nmmHg\t308483\n李英\t308484\n惠生\t308485\nraped\t308486\n斧子\t308487\ngavbus\t308488\n眉南\t308489\n离线缓存\t308490\n微砍价\t308491\n于小伟\t308492\npul\t308493\ntablerow\t308494\n360米\t308495\n苏财综\t308496\n一片天\t308497\n天津南开医院\t308498\n安平县\t308499\n原野上\t308500\n掩蔽\t308501\nSLEEP\t308502\n首败\t308503\n深圳证券交易所\t308504\n反汇编\t308505\n蓝海\t308506\nFURLA\t308507\n纳米银\t308508\n百度版\t308509\n出走季\t308510\n地奈德乳膏\t308511\n周天功\t308512\n竖杠\t308513\n座管\t308514\n表图\t308515\n40天左右\t308516\n贵阳银行\t308517\n北纬\t308518\n拆借\t308519\n座位险\t308520\n测压\t3
08521\n粉色\t308522\n谐振\t308523\n三清茶\t308524\nxpath\t308525\n新昌\t308526\n几粒\t308527\n净胜\t308528\nM15\t308529\n系团\t308530\n达利食品\t308531\n建新镇\t308532\n仙灵外传\t308533\n欧德堡\t308534\n摸透\t308535\nGlusterFS\t308536\n验布\t308537\n金毓婷\t308538\n口腔尖锐湿疣\t308539\n聚苯板\t308540\n假哭\t308541\n劳动保护费\t308542\n经验丰富\t308543\n髌骨骨折\t308544\n抗争\t308545\nconversion\t308546\nopencanvas\t308547\n山大医院\t308548\n双胎\t308549\nsupermarket\t308550\n雾凇岛\t308551\nFutanari\t308552\n一遍\t308553\nInno3D\t308554\n等边三角形\t308555\n末法王座\t308556\n隆化县政府\t308557\n柳鸣九\t308558\nSealing\t308559\n荷拉\t308560\n衡量\t308561\n伊鲁席尔\t308562\n南方出版社\t308563\n神舟八号\t308564\nBusted\t308565\n福建省晋华集成电路有限公司\t308566\n黄国盛\t308567\n易理\t308568\n三室一厅\t308569\n20171214\t308570\n专注度\t308571\n空段\t308572\n2011款\t308573\n几兆\t308574\n年销售额\t308575\n南京市水务局\t308576\n新政办\t308577\n二手房网|房产网\t308578\n雪纺衫\t308579\nhaproxy\t308580\nwww.\t308581\n羊鞭\t308582\n特训\t308583\n鸽子花\t308584\nageing\t308585\nCCTV-8\t308586\n解密\t308587\n韩鹏\t308588\nSTAFF\t308589\n13.8\t308590\ndueros\t308591\nV100\t308592\n担\t308593\n阀位\t308594\n5110\t308595\n石坚\t308596\n上海医药\t308597\n和稀泥\t308598\n第四十条\t308599\n窃听风云2\t308600\nMastery\t308601\n出入证\t308602\n辽宁卫视\t308603\n幽灵战士3\t308604\n新春茶话会\t308605\n摩梭族\t308606\n郑佩佩\t308607\n小谢尔顿\t308608\n矿车\t308609\n18型\t308610\nnmol\t308611\nkin\t308612\nqq空间说说\t308613\n张野\t308614\n王欣宇\t308615\ndiamonds\t308616\n自荐表\t308617\n黄金储备\t308618\n中交地产\t308619\n高尔夫俱乐部\t308620\nafl\t308621\n小马车\t308622\n菲乐\t308623\n中央人民政府驻香港特别行政区联络办公室\t308624\n口模\t308625\nアプリ\t308626\n釜山\t308627\nコミュニケ\t308628\n李股长\t308629\n庐阳区\t308630\n美图m6s\t308631\n精密\t308632\n电子书阅读器\t308633\ndv4\t308634\n朱正廷\t308635\n双先\t308636\n一人一半\t308637\n岔河镇\t308638\n曲阳\t308639\n2015级\t308640\n李维汉\t308641\nocserv\t308642\n光复西路\t308643\nhpv\t308644\n0x20\t308645\n面心\t308646\n商之翼\t308647\n一种\t308648\n菲古拉\t308649\nCORE\t308650\n冀南\t308651\nmerely\t308652\n抓贼\t308653\n拐子\t308654\n刘艳萍\t308655\n星火路\t308656\n吉凶\t308657\n从师\t308658\n卢柯\t308659\n布热津斯基\t308660\n眉州路\t308661\nandr\t308662\n逛了逛\t308663\n耳毛\t308664\n
考试网\t308665\n劝勉\t308666\n百威风暴电音节\t308667\n宝安区政府\t308668\n赛迪网\t308669\n回转支承\t308670\n中车集团\t308671\nwish123\t308672\njay\t308673\ntp-link吧\t308674\nbtr\t308675\n征信业\t308676\n餐饮\t308677\n结算日\t308678\n行游\t308679\n1.0.877.1\t308680\n上万个\t308681\n王双宝\t308682\ninsult\t308683\n颗牙\t308684\nEMR\t308685\n凯迪拉克ats\t308686\n高陵政府网\t308687\n沪东\t308688\n柴达木盆地\t308689\nlable\t308690\n西瓜播放器\t308691\n文房四宝\t308692\n蛙纯音乐网\t308693\n史可\t308694\n涂膜\t308695\n彭军\t308696\n好中意\t308697\n一万元\t308698\nQQ同步助手\t308699\n七倍\t308700\n拉奥\t308701\n120胶卷\t308702\n死因\t308703\n广清影院\t308704\n雷鸣山\t308705\n第三系\t308706\n上年度\t308707\n南澳大学\t308708\n咸亨\t308709\n金华市委\t308710\n吴峰\t308711\n东营大众网\t308712\n韩娱\t308713\n小芈月\t308714\nZoctopus\t308715\n纳粹\t308716\n干货\t308717\nEtcd\t308718\n奥特\t308719\n小红点点\t308720\n72G直播中心\t308721\n动荡\t308722\n波特图\t308723\n总公司\t308724\n直流调速器\t308725\n少年黄飞鸿之铁马骝\t308726\n上帝之子\t308727\n核磁共振检查\t308728\n包惜弱\t308729\n器质性\t308730\nstv\t308731\n小何\t308732\n沙巴克传奇\t308733\n16年03月\t308734\n咒诅\t308735\n移动携号转网\t308736\n奥德赛\t308737\nyeyou.com页游网\t308738\niniki\t308739\n思缘\t308740\n2990\t308741\n第75章\t308742\n马英\t308743\nbout\t308744\n45T\t308745\nzurich\t308746\nC++算法\t308747\n吉J90-010\t308748\nlms\t308749\n曾皙\t308750\n节\t308751\n工作岗\t308752\n恳亲\t308753\nFLOOR\t308754\n园舍\t308755\n隔壁老王\t308756\n恬园\t308757\nGAGA\t308758\n柳俊烈\t308759\n字口\t308760\n洞头政府网\t308761\n提摩西草\t308762\n精子库\t308763\n2420元\t308764\nCellular\t308765\n师大一中\t308766\n差人\t308767\n书系\t308768\n刘诗慧\t308769\n彩头\t308770\n132个\t308771\nstrchr\t308772\n植牙\t308773\n日村\t308774\nsun8134\t308775\n合成孔径雷达\t308776\n金珠\t308777\n17.04\t308778\nts文件\t308779\n火_\t308780\nhoya\t308781\n8:30\t308782\n啄食\t308783\n睾丸静脉曲张\t308784\n精对苯二甲酸\t308785\nVitaShell\t308786\n巴结\t308787\n二十二冶\t308788\n激进\t308789\ncoaster\t308790\nRoyalty\t308791\nCISA\t308792\n乌龙山伯爵\t308793\n钢蛋\t308794\nworld68海淘\t308795\n景深\t308796\nSwift3.0\t308797\n法物\t308798\n招商银行上海分行\t308799\n乌瑞恩\t308800\n代理记\t308801\nP闪版\t308802\n诛仙_九游论坛\t308803\nMac简体中文\t308804\n人間\t308805\n祥细\t308806\n瘦肉汤\t308807\n
李传韵\t308808\n远传压力表\t308809\n北京排水集团\t308810\n22m\t308811\n流亡黯道\t308812\n常州第一人民医院\t308813\n2018年4月28日\t308814\n撒花\t308815\n软件著作权人\t308816\nues\t308817\n三亿\t308818\n宝来论坛_XCAR\t308819\n红棕\t308820\n义乌19楼\t308821\n一笔\t308822\n蘇州\t308823\n藤缠树\t308824\n论论\t308825\n6300v2\t308826\n耐酸砖\t308827\n陈风\t308828\nPanama\t308829\n文保\t308830\n广东之窗\t308831\n软体动物\t308832\nBVS\t308833\n村委书记\t308834\nICP\t308835\n夏洛克\t308836\n二处\t308837\nsex8\t308838\n一木\t308839\n价高\t308840\n供货商\t308841\nNI\t308842\n马某某\t308843\n宫野\t308844\n中国新民主主义青年团\t308845\nhackers\t308846\n云顶山\t308847\nF4.5\t308848\n于晓光\t308849\n劲家庄\t308850\n唐雎不辱使命\t308851\n兽父\t308852\n窗内\t308853\n两极\t308854\n看房笔记\t308855\n王奎荣\t308856\n耶和华\t308857\n电气有限公司\t308858\n摩拳擦掌\t308859\nBotanical\t308860\n一格\t308861\n扶贫\t308862\n敖广\t308863\nPj\t308864\n露肉\t308865\n波雅·汉库克\t308866\ntemp\t308867\n红酒吧\t308868\n0408\t308869\n28年后\t308870\n武广高铁\t308871\n1925\t308872\n张倬闻\t308873\nUDS\t308874\nUIButton\t308875\n张依依\t308876\n梁洛施\t308877\n空冷岛\t308878\nlend\t308879\n百分之五\t308880\n百花山\t308881\n14999\t308882\nA99\t308883\nXiaodigu\t308884\n云杉木\t308885\n绿窗\t308886\nctk\t308887\n迎战\t308888\n轮子\t308889\n0535\t308890\n猩球\t308891\n代加工\t308892\n林分\t308893\n寒蝉\t308894\nnumpy.float64\t308895\n张说\t308896\n妻女\t308897\n急售\t308898\n安徽科技学院\t308899\ndsi\t308900\n白橙\t308901\n穗宝\t308902\nipad2\t308903\n双认证\t308904\n华夫饼\t308905\n2.07\t308906\n岚岚\t308907\n宝马雪中悍刀行\t308908\n武汉车管所\t308909\n投保\t308910\n结了\t308911\n中卡\t308912\n金碧辉煌\t308913\n落墨\t308914\n赞叹\t308915\n万科双月湾\t308916\n封关\t308917\n皇子\t308918\n岁月静\t308919\n光透射比\t308920\n父窗\t308921\nSLOMO\t308922\n100马力\t308923\n万顷沙\t308924\n制片\t308925\n中国市政工程中南设计研究总院有限公司\t308926\nFgo\t308927\n富润\t308928\n草堂镇\t308929\n血疑\t308930\n入职体检套餐\t308931\n西安市国土资源局\t308932\n肠溶胶囊\t308933\ntracks\t308934\n沿江街道\t308935\n新会柑\t308936\n状如\t308937\n腰椎骨质增生\t308938\n天津市政府\t308939\n辽宁省卫生和计划生育委员会\t308940\nBTS防弹少年团\t308941\nWiiu模拟器\t308942\n综测\t308943\n东北石油大学\t308944\n林内燃气热水器\t308945\n九华山景区\t308946\n马克思主义\t308947\n发帖机\t308948\n源本\t308949\n5.56\t308950\n成都房管局\
t308951\nhairy\t308952\n友邦保险公司\t308953\n溢达\t308954\nCAD图纸\t308955\n护国\t308956\n资案\t308957\n刮腻子\t308958\n洪秀柱\t308959\nTruman\t308960\n有所\t308961\n牢房\t308962\n苹果旗舰店\t308963\n乡镇政协\t308964\nsneaker\t308965\nexpressed\t308966\n安帕瓦\t308967\n3437\t308968\n会计师证\t308969\n天下足球网\t308970\n锁钥\t308971\n水性环氧树脂\t308972\n港府\t308973\n中国自动化网\t308974\nclone\t308975\n湖北国土资源职业学院\t308976\nkiyomi\t308977\nlibraries\t308978\n用工荒\t308979\n面粉机\t308980\n薯蓣丸\t308981\n奥莉薇\t308982\n万花油\t308983\nIDP\t308984\n均馆大学\t308985\n人民路\t308986\n小架\t308987\nwebsite\t308988\n程蒙恩\t308989\nbroom\t308990\n212号\t308991\n同处\t308992\nCOFFEE\t308993\n米宅米宅\t308994\n208篇\t308995\n数不胜数\t308996\n教育局长\t308997\n剑走偏锋\t308998\n4360\t308999\n祝勇\t309000\nguihua\t309001\n蒋辅义\t309002\n圆筒式\t309003\n毕业设计\t309004\n南湾半岛\t309005\n贝叶斯公式\t309006\n顺位器\t309007\n沃邦\t309008\n黄保余\t309009\n价单\t309010\n第七回\t309011\n腔肠动物\t309012\n无影剑\t309013\n000166\t309014\n聚乙烯闭孔泡沫板\t309015\n无声的爱\t309016\n出生于\t309017\n信泰人寿\t309018\n秀家软装-19楼\t309019\n记录篇\t309020\ncomet\t309021\ncbe\t309022\n王小东\t309023\n贵州镇\t309024\n多机\t309025\n丹七片\t309026\n太孤单\t309027\nDCL\t309028\n10.11.4\t309029\nSettle\t309030\n监管局\t309031\n伤害率\t309032\n售彩\t309033\n申明书\t309034\n三国群侠传\t309035\n凤凰洲\t309036\n清水湖\t309037\n名模\t309038\n3.5m\t309039\n国务院学位委员会\t309040\n北京建筑工程学院\t309041\n郑筱萸\t309042\nSDR\t309043\n华夏良子\t309044\n屡见不鲜\t309045\n离子交换柱\t309046\nhhvm\t309047\n中外涂料网\t309048\n多好不好\t309049\n注册造价工程师\t309050\nCare+\t309051\n行草书\t309052\n唐庄\t309053\n6.43\t309054\n匆匆忙忙\t309055\n伯牙\t309056\n肝炎\t309057\n最迷人\t309058\n送元二使安西\t309059\n使命召唤12\t309060\n暖岛网\t309061\n虐爱\t309062\n南京师大附中\t309063\n山姆会员店\t309064\n经验券\t309065\n今天晚上\t309066\n金缕衣\t309067\n雪铁龙c5\t309068\n香蕉蛋糕\t309069\n好残忍\t309070\n巫山县\t309071\n5004\t309072\n翠玉\t309073\n取电\t309074\nbitfinex\t309075\nbumpy\t309076\n报料\t309077\n20160625\t309078\n小皇后\t309079\n浪线\t309080\n第三_\t309081\ntmp\t309082\n魔国志I之黄巾之乱\t309083\n弗兰德斯\t309084\n旷视科技\t309085\nWMNS\t309086\n第54届\t309087\n核磁共振技术\t309088\n1910\t309089\n自助贩卖机\t309090\n星龟\t309091\niPho\t309092\n和谐版\t309
093\n螺栓型\t309094\nmorph\t309095\nMUI\t309096\n斯蒂芬金\t309097\n爱芙莱\t309098\nS9\t309099\n开胃\t309100\n魔兽世界军团再临\t309101\n共产党员服务队\t309102\n框筒\t309103\n2kiko\t309104\n佛山街\t309105\nNichols\t309106\n鞋扣\t309107\n馥\t309108\nst\t309109\n四维数组\t309110\n4399火影忍者ol\t309111\n9108\t309112\n上海女排\t309113\npilots\t309114\n有完\t309115\n西南大学育才学院\t309116\n商水县\t309117\n1.173\t309118\nCLT\t309119\n救急\t309120\ncaribbean\t309121\n体感\t309122\nyouxiang\t309123\n371种\t309124\n疯了似\t309125\nunbutu\t309126\n利川网\t309127\n壹马会\t309128\n占优势\t309129\n简笔画花\t309130\n张惠\t309131\n0428\t309132\n印刷版\t309133\nbison\t309134\n长城动漫\t309135\nImager\t309136\ndtt\t309137\n香\t309138\nninja650\t309139\n77个\t309140\n后滩\t309141\n环切\t309142\n湖南司法警官职业学院\t309143\nRC电路\t309144\n佣\t309145\n韵律\t309146\n神单\t309147\n反智主义\t309148\nHorizon\t309149\n陵城区\t309150\n原干惠\t309151\n河北省科学技术厅\t309152\n艾迪芬奇\t309153\n阿西\t309154\nvisul\t309155\n42式太极剑\t309156\n第一百一十章\t309157\navon\t309158\n喜羊羊\t309159\n洋沙山\t309160\n字节型\t309161\n暗部\t309162\n青墨\t309163\n全民阅读\t309164\n雅尔\t309165\n走亲\t309166\n金话筒\t309167\n高顿题库\t309168\n广州富力\t309169\n国家档案局\t309170\ng4400\t309171\n浔阳网\t309172\n小圣\t309173\n查血\t309174\n泰达\t309175\n绚\t309176\n平原县\t309177\n湛山寺\t309178\n惠购\t309179\n老照片\t309180\n城市规划管理技术规定\t309181\n小札\t309182\n潞安集团\t309183\n董氏奇穴\t309184\nsensei310\t309185\n童妍\t309186\n佛罗伦萨小镇\t309187\n浮选\t309188\n中心思想\t309189\n黑金卡\t309190\n患寡而患\t309191\nexiles\t309192\n警力\t309193\n给料机\t309194\n该市\t309195\n旋风少女2\t309196\n不服\t309197\n天正暖通\t309198\n阿利坎特\t309199\nEarPods\t309200\n奔驰S\t309201\n匪气\t309202\n上海6号线\t309203\n祝九胜\t309204\n虾池\t309205\nrunescape\t309206\n3gt\t309207\n拉尼娜现象\t309208\n王母娘娘\t309209\n一百二十周年\t309210\nGodaddy\t309211\n撞断\t309212\n凸出\t309213\n聚苯颗粒\t309214\nsange\t309215\n广岛县\t309216\n秀珍菇\t309217\n汪峰\t309218\n维语输入法\t309219\n周聪\t309220\n初一年\t309221\n租铺\t309222\n车流\t309223\nLovelive\t309224\n实得\t309225\n国际金融学\t309226\n康庄路\t309227\n快乐器\t309228\nmax2014\t309229\n000516\t309230\n申请表\t309231\n中国空军\t309232\n自然物\t309233\n晚上12点\t309234\n云霄\t309235\n京西古道\t309236\n华胥引之绝爱之城\
t309237\nRJ45\t309238\nie800s\t309239\nkepserver\t309240\n同途\t309241\n唐俊乔\t309242\nlightroom5\t309243\n戴季陶\t309244\n宗馥莉\t309245\n评论人\t309246\n简单句\t309247\n四川省环境保护厅\t309248\n经典儿歌串烧\t309249\n起动机\t309250\n分体式空调\t309251\n火了火\t309252\n宝俊\t309253\nBlog-51CTO\t309254\n阅\t309255\n3.73\t309256\n炼丹术\t309257\n广东鼎建水泥制品有限公司\t309258\n加勒比海盗主题曲\t309259\n国里\t309260\n文殊坊\t309261\n汉语言文字学\t309262\n邪淫\t309263\nokhttputils\t309264\nBLUES\t309265\n酿成\t309266\n中华人民共和国行政监察法\t309267\n传祺GS3\t309268\n霹\t309269\n据称\t309270\n授信\t309271\n溥杰\t309272\nfong\t309273\n河北省环境保护厅\t309274\n二氧化氯消毒剂\t309275\nIOS\t309276\n推销\t309277\naccessToken\t309278\n抢占\t309279\n青妇\t309280\nplsql12\t309281\n湖滨北路\t309282\n二氧化钒\t309283\n电轨\t309284\n1.1.0\t309285\n王传林\t309286\n聚谷氨酸\t309287\n电位\t309288\n唱歌\t309289\n叶生\t309290\n载药\t309291\n鸡冠花\t309292\n房间\t309293\nmtlab\t309294\nItunes\t309295\n粘剂\t309296\n值钱\t309297\n天鸽\t309298\n蔓延\t309299\n长安星光4500\t309300\n穿越机综合技术讨论区-5iMX.com\t309301\n江苏省建设厅\t309302\n西子网\t309303\n龙华清湖\t309304\nPaw\t309305\n恒福路\t309306\n1388\t309307\n大华地区\t309308\nsums\t309309\n汉王OCR\t309310\n鹏飞\t309311\n药妆店\t309312\n夏目优希\t309313\ndfmea\t309314\n三峡大瀑布\t309315\nwap浏览器\t309316\n区编委办\t309317\n可空\t309318\n疤克\t309319\n明日传奇\t309320\n18500\t309321\n精灵王\t309322\n石头画\t309323\n防盗器\t309324\n痴情男\t309325\n我可\t309326\n凤凰台\t309327\n地狱潜者\t309328\n五指山\t309329\n天前\t309330\n联想g510\t309331\n妇\t309332\n垃圾发电\t309333\ncurve\t309334\naficio\t309335\n早出晚归\t309336\n江孜\t309337\n0.72\t309338\n赫尔海姆\t309339\nmms\t309340\n广告级\t309341\n我多想去看看\t309342\n羊毫\t309343\nU4000UQ\t309344\n奥普光电\t309345\n维塔\t309346\n速霸\t309347\n椎动脉\t309348\nlocale\t309349\n广义逆矩阵\t309350\n萌琪琪\t309351\n背心式\t309352\n孤\t309353\n妆蕾\t309354\n高山族\t309355\n中关村生命科学园\t309356\n烘烤\t309357\n贝叶斯\t309358\n王珍\t309359\ncola\t309360\n追书网\t309361\n竹蔗\t309362\n失温症\t309363\n找代\t309364\n男儿身\t309365\n徐峰\t309366\nteresa\t309367\nhbj\t309368\n古朗月\t309369\n集中供水\t309370\nAnti\t309371\n叶一茜\t309372\n离线文件\t309373\n美丽华\t309374\n叠嶂\t309375\n习相远\t309376\n1.5万公里\t309377\n调查类\t309378\n卫生健康委员会\t309379
\n新华网\t309380\n海洋公园\t309381\n直邮购物网\t309382\n分发\t309383\n易卜生\t309384\n日行一善\t309385\n张楠\t309386\n720全景摄影\t309387\n供油\t309388\n横机\t309389\n白绒\t309390\nXYQ梦幻西游\t309391\n980ti\t309392\n埃罗芒\t309393\nKDD\t309394\n齐玉苓\t309395\nRelationships\t309396\n魔兽世界编年史\t309397\n看门狗2\t309398\n曹节\t309399\n全热交换器\t309400\ndamage\t309401\n审价\t309402\n玫瑰红\t309403\n系统学习深度学习\t309404\n陈学冬\t309405\n卜凡\t309406\nGDP\t309407\n定婚\t309408\n本贴\t309409\n赌王\t309410\n涨停板\t309411\n解放路步行街\t309412\n天津市政协\t309413\n微小型\t309414\n超过8小时\t309415\n得字\t309416\n元祖蛋糕\t309417\nCGL\t309418\n中山市经济和信息化局\t309419\n上海国际医学中心\t309420\n畸变\t309421\n西博会\t309422\n事业群\t309423\n200座\t309424\n山系\t309425\n防扎\t309426\n谜底\t309427\n未来7天\t309428\n患者\t309429\ni2c\t309430\n深圳旅行社\t309431\n苹果5c\t309432\n外地\t309433\n四川省政协\t309434\n华强北电脑网\t309435\n禁域\t309436\n水分子\t309437\noptimize\t309438\n206\t309439\n4690k\t309440\nC226\t309441\nMonkeyRunner\t309442\n呼\t309443\n红宝石蛋糕\t309444\n五百个\t309445\n_曼\t309446\n维生素c咀嚼片\t309447\n香港物流公司\t309448\n聚氨酯漆\t309449\n武汉市质量技术监督局\t309450\n广发聪明卡\t309451\nIBM\t309452\n滨水\t309453\n密州\t309454\n银行会计学\t309455\nrevit族库\t309456\n柱体\t309457\n斗龙战士2\t309458\n品格\t309459\n赵树嫄\t309460\nx1\t309461\n压榨机\t309462\nreform\t309463\ngiro\t309464\n假如雷霆之怒\t309465\n千年后\t309466\n50N\t309467\n李纲\t309468\n蹿升\t309469\n天星\t309470\nPES2016\t309471\n弃\t309472\n麻辣香锅\t309473\n烙\t309474\n齐天大\t309475\n地战\t309476\n欲速则不达\t309477\n测绘\t309478\n国电投\t309479\n茉莉花香\t309480\n鼓楼街\t309481\n福建永春政府网\t309482\n初恋\t309483\nArnold\t309484\nabandoned\t309485\nara\t309486\nBLK\t309487\nRC4\t309488\n拜博口腔医院\t309489\n楼板价\t309490\nNote3\t309491\n房管家\t309492\ntells\t309493\nzucker\t309494\nQQ头像大全-百田网Q族\t309495\n东国\t309496\n八月长安\t309497\n布蕾\t309498\n和田县\t309499\n果蔬\t309500\nkardashian\t309501\np站\t309502\n早樱\t309503\nWeLoop\t309504\n烟月\t309505\n荥阳市\t309506\n嘉旅论坛_汽车之家论坛\t309507\n胆囊腺肌症\t309508\ncloudfront\t309509\n墨江县\t309510\n乌青\t309511\n国家湿地公园\t309512\n短支\t309513\n艾古\t309514\n抵扣联\t309515\nYAT\t309516\n源代码管理器\t309517\nc4.5\t309518\ncmake\t309519\n沈阳酒店\t309520\n鼎辉\t309521\nq
uin\t309522\nPrague\t309523\n主秘\t309524\n蕴育\t309525\nラプソディ\t309526\n莉莉斯\t309527\n中国水利水电第七工程局有限公司\t309528\n以色列国\t309529\n瓜尔豆胶\t309530\n外国\t309531\n春天花园\t309532\n能量条\t309533\n不避亲\t309534\n连姆尼森\t309535\n1244\t309536\n修复工\t309537\nPosts\t309538\n抗日剧\t309539\n养成师\t309540\nmelissa\t309541\n国境线\t309542\n入味\t309543\nSetup\t309544\nnutz\t309545\n崇信县\t309546\n笄\t309547\n笨方法学\t309548\n501.4\t309549\niPh\t309550\n戏语\t309551\n2081\t309552\n黑球\t309553\n座屏\t309554\norbit\t309555\n100关\t309556\n用品\t309557\nso3\t309558\n进击的巨人\t309559\nWeGame\t309560\n户部\t309561\n呼转\t309562\n皂\t309563\nSTILL\t309564\n违约责任\t309565\n曲终人散\t309566\n格拉默\t309567\n证照分离\t309568\nproperty\t309569\n12351\t309570\n12下\t309571\n凤凰新村\t309572\n动态分区\t309573\nel-checkbox\t309574\n满口\t309575\n利斯塔\t309576\n钉器\t309577\n浸润剂\t309578\n护花使者\t309579\n米果\t309580\n西南民大\t309581\na1822\t309582\n优质服务\t309583\n对不住\t309584\nMcQueen\t309585\n隐疾\t309586\nReserve\t309587\nHydraulic\t309588\n赵传奇\t309589\n玉皇山\t309590\nnearby\t309591\n重根\t309592\n麦饭石\t309593\n红螺寺\t309594\n激活秘钥\t309595\n起亚揽胜极光\t309596\n成都路\t309597\nkNN\t309598\n15.6寸\t309599\nChocolates\t309600\n火星求生\t309601\nMobx\t309602\n阎魔\t309603\n7-8月\t309604\n天旗\t309605\n18041\t309606\nFendi\t309607\nlpop\t309608\n04月21日\t309609\n20160729\t309610\n舞阳县\t309611\n农业银行信用卡\t309612\n第二门\t309613\n39个\t309614\n几子\t309615\nSchaeffler\t309616\n顾海兵\t309617\n晃晃\t309618\n募集期\t309619\n港籍\t309620\n饭团\t309621\n淮工\t309622\n李福林\t309623\n43万元\t309624\n甘肃省检察院\t309625\n20:30\t309626\nQlikView\t309627\n云图控股\t309628\nstm32f1\t309629\n山市\t309630\nMetabolic\t309631\n甲基苯丙胺\t309632\n祝绪丹\t309633\nv7.3.0\t309634\n360清理大师\t309635\n宾汉\t309636\n结霜\t309637\n空条承太郎\t309638\ninline-block\t309639\n四村\t309640\n培恩\t309641\nquerylist\t309642\n打气\t309643\n鸡蛋白\t309644\n星报社\t309645\n整样\t309646\n不要钱\t309647\nbeans\t309648\nAM4\t309649\nC920\t309650\n江若琳\t309651\n物种起源\t309652\n棋书\t309653\n树章\t309654\nwin7系统环境变量path\t309655\n余丽\t309656\nnologin\t309657\ncdk\t309658\n一号\t309659\n下司\t309660\n──\t309661\n5.22.0\t309662\n大清河\t3096
63\nHABA\t309664\n灰狐\t309665\n嘉兴经济技术开发区\t309666\nTaikoo\t309667\n封神榜\t309668\n小米5s\t309669\n旅游集散中心\t309670\n双联单控开关\t309671\n招聘方\t309672\n爱知县\t309673\n云相册\t309674\n北极星节能环保网\t309675\n三峡水库\t309676\n小小人\t309677\n避风\t309678\nv2.6.3\t309679\n2015年2月\t309680\n亲子网\t309681\n翠翠\t309682\n腹水\t309683\n威马汽车\t309684\n计数函数\t309685\nsuisai\t309686\n英朗XT\t309687\n香道\t309688\n28英寸\t309689\n非礼勿视\t309690\ndbl\t309691\n修行路\t309692\n科尔康\t309693\n同心圆\t309694\n牧神记\t309695\n百度钱包\t309696\n二建证\t309697\n文化遗产日\t309698\n洗劫一空\t309699\n审协\t309700\n牛_\t309701\n若果\t309702\n湿度计\t309703\n中医医\t309704\nkwh\t309705\n汽酒\t309706\nemtc\t309707\n文正\t309708\n王璐\t309709\n门前三包\t309710\n97875\t309711\n狭路\t309712\n十七世纪\t309713\nGNOME、KDE\t309714\nUiPath\t309715\n洋河新区\t309716\nTwentine\t309717\n巨岩\t309718\nManipulation\t309719\n镇守\t309720\n猫片\t309721\n这玩意\t309722\nserializer\t309723\n精蛋\t309724\n汶河\t309725\n干山\t309726\n百科问答\t309727\n东北方\t309728\n自省\t309729\n腾旭\t309730\n30摄氏度\t309731\n0039\t309732\n刘鸿文\t309733\n谈一谈\t309734\nKon\t309735\n八角\t309736\n800套\t309737\n乌鲁木齐银行\t309738\n_瑞\t309739\n天皓\t309740\n人力资源管理员\t309741\n保荐代表人考试\t309742\nCKeditor\t309743\n775针\t309744\nProe\t309745\nSRC\t309746\n苦乐\t309747\n德妻\t309748\n警院\t309749\n防晒油\t309750\nYAMADA\t309751\n79路\t309752\n关注度\t309753\n甘冈\t309754\n14.0.2\t309755\n圣斗少女翔\t309756\n新概念\t309757\n穿上身\t309758\n51Ape.Com无损音乐\t309759\n主頁\t309760\n红罂粟\t309761\n篝火\t309762\n松脱\t309763\nDemocratic\t309764\n周国辉\t309765\n迷弟\t309766\n華麗\t309767\n塔下村\t309768\n脱光\t309769\n原丝\t309770\n中国路面机械网\t309771\n商汤\t309772\n周冬雨\t309773\nHY\t309774\n英雄团\t309775\nDFMEA\t309776\nzeta\t309777\n永丰\t309778\n三国志9\t309779\n江阴镇\t309780\n青华\t309781\nsmartdraw\t309782\n王者荣耀契约之战\t309783\nFemjoy\t309784\n海荣\t309785\n街头上\t309786\n天天轴承网\t309787\n吴娟\t309788\n老掉牙\t309789\n正红花油\t309790\n当铺\t309791\n盐酸坦洛新缓释胶囊\t309792\nspdb\t309793\n欧姆定律\t309794\n黑线\t309795\n剪力\t309796\noppor7sm\t309797\n2.41亿\t309798\n人工智障\t309799\nbt版\t309800\n御剑江湖\t309801\n抚州市东乡区政府\t309802\nDate、Timestamp\t309803\n南京中考网\t309804\n上海魅宇仪器设备有限公司\t309805
\n爆珠\t309806\nM132a\t309807\n被关押\t309808\n大白话\t309809\n黄雀\t309810\n钾肥\t309811\n黄辉\t309812\n3.1.1\t309813\n西安大唐芙蓉园\t309814\ntianqi\t309815\npadding\t309816\n春江\t309817\n北部湾银行\t309818\n江口街道\t309819\n雷火\t309820\n80套\t309821\nHDI\t309822\nmetasequoia\t309823\n孙炜\t309824\n秘语\t309825\n安内\t309826\nMerchants\t309827\n战地1\t309828\n凸凸\t309829\n角的度量\t309830\n唐一白\t309831\n射程\t309832\n五维空间\t309833\nZStack\t309834\n酷狗K歌\t309835\n藻类\t309836\n新调\t309837\nDuel\t309838\n王占军\t309839\n卢克\t309840\n奇遇记\t309841\ndlclose\t309842\n胶液\t309843\n小男孩儿\t309844\n都市侠盗\t309845\n36d\t309846\n不舍\t309847\n撤离\t309848\n人学\t309849\n0510\t309850\nfgg\t309851\n吉庆街\t309852\n华中数控\t309853\nshorter\t309854\n凛音\t309855\nTexture\t309856\n1200例\t309857\n1.1元\t309858\n李智楠\t309859\n二十四史\t309860\nveja\t309861\n1-1\t309862\nthresh\t309863\n白色情人节\t309864\n磁盘柜\t309865\ninsulated\t309866\n乐一番\t309867\n晚上八点\t309868\n浙江省教育技术中心\t309869\n准线\t309870\n艳曲\t309871\n昊园恒业\t309872\n魔咒\t309873\n_名\t309874\nijkplayer\t309875\nparfum\t309876\n昵\t309877\n中国机械工业集团有限公司\t309878\n吉安市\t309879\n半世纪\t309880\n陈列室\t309881\n王铎\t309882\n屠戮者\t309883\n道观\t309884\nnl80211\t309885\n69条\t309886\ndrzj\t309887\n9007\t309888\nexemption\t309889\n洪越\t309890\n埃拉西亚\t309891\n正殿\t309892\n活板\t309893\n4433\t309894\n夺命\t309895\n小样机\t309896\n甲状腺结节\t309897\n&1\t309898\n中建八局二公司\t309899\n模力\t309900\nLnx\t309901\n600756\t309902\nimports\t309903\nCCTV-5\t309904\n雄鸡\t309905\n辛辛那提\t309906\nMethods\t309907\n王保国\t309908\n兴远\t309909\n聚合体\t309910\n南充日报\t309911\n黑龙潭\t309912\nFSD\t309913\n江西建设职业技术学院\t309914\n花纸\t309915\nqbytearray\t309916\n公网银\t309917\n梁音\t309918\n长安十二时辰\t309919\n值得注意\t309920\n离开\t309921\n32吨\t309922\n炒年糕\t309923\n名学\t309924\n符号位\t309925\n照片版\t309926\n三明市区\t309927\n分界洲岛\t309928\n肉样\t309929\n咖喱牛肉\t309930\n软件包管理器\t309931\n粗尾\t309932\n科比·布莱恩特\t309933\n周二\t309934\n浙商银行\t309935\n一撸\t309936\n滕头\t309937\n澳币\t309938\n米宝\t309939\n2014.3\t309940\n环境保护网\t309941\n蒙卦\t309942\n阿鲁纳恰尔邦\t309943\n糙米\t309944\nTransaction\t309945\nYatHo\t309946\n210国道\t309947\n无间狱\t309948\n好人一生平
安\t309949\n修理包\t309950\n青岛科技\t309951\n啦啦队长\t309952\n主妇\t309953\n农码\t309954\n组织生活\t309955\n15、20、30天\t309956\nv16.12\t309957\n浦源\t309958\n守卫雅典娜\t309959\n安软\t309960\n20171030\t309961\n重庆市司法局\t309962\nItext\t309963\n穆尔\t309964\n操作服务器\t309965\n巴河\t309966\n睾丸酮\t309967\n鲁智深\t309968\nc400\t309969\n剑侠客\t309970\n神驹\t309971\ntrang\t309972\n去黑头\t309973\n水星家纺\t309974\n强生血糖仪\t309975\n奇效\t309976\n懦弱\t309977\n椰枣\t309978\nYii\t309979\n金眼\t309980\nlatch\t309981\nlucas\t309982\nCX70\t309983\n公因\t309984\nAMG\t309985\n坑王\t309986\n财务咨询有限公司\t309987\n阿斯顿维拉\t309988\n京b\t309989\n静静的顿河\t309990\nxfire\t309991\n360安全浏览器\t309992\n商转公\t309993\ncarina\t309994\n皇族电子竞技俱乐部\t309995\n武汉股权托管交易中心\t309996\n男女装\t309997\n乐于\t309998\n频幕\t309999\n深圳大学管理学院\t310000\n知乎励志贴\t310001\n14名\t310002\neasiu\t310003\n41章\t310004\nfenix5\t310005\nJMockit\t310006\n802.11ad\t310007\nzhng\t310008\n贫血\t310009\n取值\t310010\n全椒县\t310011\n临潭县\t310012\n云收单\t310013\n观众缘\t310014\n隔离盒\t310015\n梁武帝\t310016\nZooskool\t310017\nBench\t310018\n翼猫\t310019\n赢在中国\t310020\ndefrag\t310021\nTangent\t310022\n吴小如\t310023\nハメ撮\t310024\n中山大学地理科学与规划学院\t310025\nV3.9\t310026\n李在元\t310027\n长安集团\t310028\n比华利\t310029\n同人小说网\t310030\n北京市安全生产监督管理局\t310031\n舞姬\t310032\nLOMO\t310033\n格里森\t310034\nIgnore\t310035\n圣爱天堂\t310036\n没商量\t310037\n虐\t310038\ncos吧_\t310039\n化学元素\t310040\n聚客\t310041\n拉图\t310042\n傲气\t310043\n玻璃锅\t310044\n江南影视艺术职业学院\t310045\n近效期\t310046\n骄傲自满\t310047\n工专\t310048\n上空\t310049\n牡丹籽\t310050\nKracie\t310051\ncanva\t310052\n进阶篇\t310053\n山西中医学院\t310054\n寻优\t310055\n省情\t310056\n油浸式\t310057\nStrongswan\t310058\n戴笠\t310059\nLibero\t310060\ntoby\t310061\n电片\t310062\n气动执行器\t310063\n造富\t310064\n流通股\t310065\n零几年\t310066\n平顶山工业职业技术学院\t310067\n罗永\t310068\n黄姜\t310069\n祝无双\t310070\nmodel\t310071\n流苏树\t310072\n干系\t310073\n008小说网\t310074\n第二弹\t310075\n血脂稠\t310076\n宝安公路\t310077\n苹果干细胞\t310078\n一百多元\t310079\n青岛市体育局\t310080\n当代商城\t310081\n笔友\t310082\n低谷\t310083\n颌骨\t310084\n大盛\t310085\n苏子\t310086\n富士电视台\t310087\n国家旅游局\t310088\nn95\t310089\n建信养老飞月宝\t310090\nu
rlib\t310091\n9.29\t310092\n过程性\t310093\n陈规\t310094\n双头螺栓\t310095\n蓝月亮\t310096\n发射台\t310097\n宝马730Li\t310098\n误伤\t310099\n强横\t310100\n手术无影灯\t310101\n香园\t310102\nestablishment\t310103\nFedEx\t310104\n大板鲫\t310105\n于晓非\t310106\nguys\t310107\n背弃\t310108\n心梦无痕-九仙帝皇诀\t310109\n蓝帽子\t310110\nforza7\t310111\n张彦军\t310112\n假单\t310113\n总结&#160\t310114\nlame\t310115\naren\t310116\n退回去\t310117\n景津\t310118\n福晰阅读器\t310119\n80万亿元\t310120\n名款\t310121\ndeclaration\t310122\n巴育\t310123\n米哥\t310124\n紫外线吸收剂\t310125\n13句\t310126\n幻听网\t310127\n芝士味\t310128\nzishu\t310129\n举世闻名\t310130\n城镇登记失业率\t310131\nHEJJY\t310132\n米热\t310133\n上汽大\t310134\n孝感市环境保护局\t310135\n滟\t310136\n聚对苯二甲酸乙二醇酯\t310137\n昆明理工大学\t310138\n0800\t310139\n中国通辽网\t310140\n永兴村\t310141\n浙商证券\t310142\nshengan\t310143\n私人公司\t310144\n蒋琦\t310145\nLina\t310146\n明君\t310147\nGillette\t310148\n侍道\t310149\n座购\t310150\n供应商关系\t310151\n骑缝章\t310152\n呆若木鸡\t310153\n春意叶\t310154\n绝大多数\t310155\n范湉湉\t310156\n曝气机\t310157\nskyfall\t310158\n神祠\t310159\niPad2017\t310160\n淋巴液\t310161\n20例\t310162\n假如爱有天意\t310163\n1.0.5\t310164\n乐乐课堂\t310165\n软包\t310166\n市政实务\t310167\n一诺\t310168\nstrncat\t310169\nidisplay\t310170\nTizen\t310171\n刺客聂隐娘\t310172\n白羽\t310173\n63342\t310174\n青创\t310175\n埃塞俄比亚航空公司\t310176\n顶用\t310177\n上海教育博览会\t310178\n2017部\t310179\n言慧珠\t310180\n臊子\t310181\n国际电联\t310182\nRune\t310183\n活波\t310184\n2018.3.1\t310185\n小怡\t310186\n尹峰\t310187\n直冷\t310188\n英大网\t310189\n巴西队\t310190\n苗族\t310191\n中国二十二冶集团\t310192\n味浓\t310193\nthy\t310194\n竞标\t310195\n不妨\t310196\n出号\t310197\n烷值\t310198\n饭拍\t310199\n小蜗牛\t310200\n向阳楼\t310201\n65kg\t310202\nMeasuring\t310203\n子头\t310204\nnda\t310205\nSequencing\t310206\n怡口\t310207\n传动链\t310208\n暴走漫画\t310209\n疖子\t310210\n时时\t310211\n小脑\t310212\n三专\t310213\n共话\t310214\n绿灯侠\t310215\n200块\t310216\n中国华侨出版社\t310217\n八里台镇\t310218\n民航局\t310219\n绿豆蛙\t310220\n北京柏悦酒店\t310221\n旋式\t310222\n漱口杯\t310223\n聊城市人民医院\t310224\n翁静晶\t310225\nSpri\t310226\n柘中股份\t310227\nwin32com.client\t310228\n多文\t310229\ne-study\t310230\nnobody\t310231\n回顾展\t310232
\n300亿\t310233\n唯视\t310234\n绫波丽\t310235\n花锦\t310236\n0125\t310237\n起于\t310238\n九名\t310239\n丁丁打折网\t310240\n防爆挠性连接管\t310241\n掉下来\t310242\n庭长\t310243\nJeffery\t310244\n怕麻烦\t310245\n月亮和六便士\t310246\n自旋\t310247\n转帖\t310248\n金丝楠阴沉木\t310249\ndivers\t310250\n二分之三\t310251\n星光级\t310252\n三维\t310253\n氮氧化物\t310254\n多核多线程\t310255\n八子\t310256\n叶鸣\t310257\n普乐安片\t310258\n316号\t310259\n透水砖\t310260\n昂昂溪区\t310261\nmt7620a\t310262\n招致\t310263\nselena\t310264\nblotting\t310265\n苏格\t310266\n手写印刷体\t310267\nkindle499\t310268\n世赛\t310269\nNND\t310270\n300包\t310271\n5微米\t310272\n盐城市委\t310273\n蔚蓝之空\t310274\n20170708\t310275\nS-Cute\t310276\nZabbix\t310277\n中九日\t310278\n县财政局\t310279\n萎蔫\t310280\nwww.g007.com\t310281\n制皮\t310282\n叠加式\t310283\n国际在线\t310284\nYu\t310285\n徐水区\t310286\n26款\t310287\n31种\t310288\n天女\t310289\n分异规律\t310290\n说说\t310291\n魔发奇缘\t310292\n报业大厦\t310293\n说一说\t310294\n城市规划师考试\t310295\n国枫\t310296\n银川市人民政府办公厅\t310297\n房贷计算器2018_购房贷款计算器\t310298\n市人民医院\t310299\n东北人\t310300\n刘松林\t310301\n一点也\t310302\n射钉\t310303\n物业公司\t310304\n黑色沙漠\t310305\n我这一辈子\t310306\n根部\t310307\n五华区\t310308\n706\t310309\n画梅\t310310\n180名\t310311\njquery幻灯片\t310312\n单杠\t310313\n速度与激情\t310314\nTribune\t310315\nXDD小叮当\t310316\n雅苑\t310317\nMDR\t310318\nRQ-STAR\t310319\n水分散粒剂\t310320\ndownstream\t310321\n凹形\t310322\n健丽\t310323\n成都妈妈网\t310324\n钱君\t310325\n北京市海淀区学院\t310326\n阉\t310327\n49mm\t310328\nwkweb\t310329\n让我们荡起双桨\t310330\n花青\t310331\nTYCO\t310332\nglpi\t310333\n三角牌\t310334\n骄奢淫逸\t310335\nloss值\t310336\n防火\t310337\nemployees\t310338\n气调库\t310339\n神秘莫测\t310340\n曹荣\t310341\n番禺大道\t310342\n新西兰天维网\t310343\n诉调\t310344\n母娘\t310345\n西丰县\t310346\n四川公司\t310347\n陈丽萍\t310348\n装配化\t310349\n福特新全顺\t310350\n云南菜\t310351\n气机\t310352\n布场\t310353\n姘头\t310354\n五格剖象\t310355\n笔耕不辍\t310356\n珠海出入境检验检疫局\t310357\n口袋妖怪永恒之炎\t310358\nAmway\t310359\n线杆\t310360\nmeishu\t310361\nID3\t310362\nKeywords\t310363\nweiui\t310364\nundercut\t310365\n夺分\t310366\n乌玛瑟曼\t310367\n韭菜鸡蛋\t310368\n麦达斯\t310369\n坛水\t310370\n王甘林\t310371\n钛架\t310372\n黄永胜\t310373\n抽滤\t
310374\n爱康科技\t310375\n成都体育学院\t310376\nCalvinChu\t310377\n1295\t310378\n安里\t310379\n信证\t310380\n沿边\t310381\nnspire\t310382\n农牧业\t310383\n苏茹\t310384\nip掩码\t310385\n川藏北线\t310386\n欧阳克\t310387\n勇者不惧\t310388\n垢\t310389\n斗破莽荒纪\t310390\n贵阳龙洞堡国际机场\t310391\n5月一日\t310392\n师德\t310393\nZ800\t310394\n郑\t310395\n地方税务局\t310396\nformate\t310397\n菱花\t310398\nossim\t310399\n金陵东路\t310400\n家电类\t310401\n金阳\t310402\nVsphere\t310403\n崇文区\t310404\n答辩期\t310405\n花旗银行\t310406\n迅视\t310407\n仮面\t310408\n洗车泵\t310409\nborrow\t310410\ngum\t310411\n助企\t310412\n天津大学计算机科学与技术学院\t310413\nNeptune\t310414\n招灾\t310415\nMiranda\t310416\n抵押权登记\t310417\n3.5.1\t310418\nスイ\t310419\n届满\t310420\ny+2\t310421\n陈卫国\t310422\nwinserver\t310423\n柔皙\t310424\n超晶格\t310425\n天凉好个秋\t310426\nchinesefemdom\t310427\n抽脂减肥\t310428\n沈瑜\t310429\n中能电气\t310430\n莺\t310431\nundertake\t310432\n助\t310433\n河州\t310434\n95519\t310435\n滚珠\t310436\n天津海关\t310437\nGB50084-2017\t310438\n2200亿\t310439\n睿芽\t310440\n3DS&new\t310441\n信息服务有限公司\t310442\n中国特色\t310443\n纸书\t310444\n父母爱\t310445\n牙机\t310446\n塑料棒\t310447\n辰戌\t310448\n_步步\t310449\n二周\t310450\n改定\t310451\n1.7亿元\t310452\n卡路里_\t310453\n陈瑞豪\t310454\n农工\t310455\n梁壶\t310456\n3751\t310457\n葫芦鸡\t310458\n生途\t310459\nEnya\t310460\ngridpanel\t310461\n氯乙酸\t310462\n韩城古城\t310463\n扑家吧\t310464\n报账单\t310465\n太平鸟集团\t310466\n年味\t310467\nmysql分库分表\t310468\nsynonyms\t310469\n30K\t310470\n学拍\t310471\n秒秒\t310472\n猪市\t310473\n养道\t310474\n家族群\t310475\n智博\t310476\nDame\t310477\n导师制\t310478\n检尺\t310479\n没人管\t310480\n数幂\t310481\n青蒿\t310482\n霸刀\t310483\n鸡公煲\t310484\n本博\t310485\n死死\t310486\n咸水沽镇\t310487\n南阳南\t310488\n柔风\t310489\ncpu主频\t310490\n二套\t310491\nRyzen7\t310492\n北京市朝阳区教育委员会\t310493\n痈疽\t310494\n杜勇\t310495\n史蒂芬妮\t310496\n龙泽苑\t310497\n様\t310498\n幻想大陆\t310499\n发酵菌\t310500\n不苟\t310501\n五个群\t310502\nv2.6.6\t310503\n区局\t310504\n江西省樟树市人民政府\t310505\n石槎路\t310506\nv1.0.1\t310507\n第七十九条\t310508\n维生素C咀嚼片\t310509\n王玲\t310510\n太阳眼镜\t310511\n重庆南路\t310512\nbolin\t310513\n人皮客栈2\t310514\n彩香\t310515\nphotoshopCS6\t310516\n尊域\t31051
7\n福建人才联合网\t310518\n笑眼\t310519\n有备无患\t310520\n连理枝\t310521\n广彩\t310522\n墨胆\t310523\nFactories\t310524\n开封北站\t310525\n上海市质量技术监督局\t310526\n中海国际中心\t310527\n土地增值税预征率\t310528\n354\t310529\n上海园区\t310530\n曾国天官赐福\t310531\n硬脂酸锌\t310532\n扇子舞\t310533\n三串\t310534\nSiam\t310535\n区域化\t310536\n江门人才网\t310537\n任青\t310538\n一个一级\t310539\n香港贸易发展局\t310540\n宁强县\t310541\n浩淼\t310542\n聊城大学东昌学院\t310543\n张祥龙\t310544\n商赢环球\t310545\n火焰纹章圣战系谱\t310546\n女&#160\t310547\n四保\t310548\n音乐鉴\t310549\nPiece\t310550\n三五分钟\t310551\ncvx\t310552\n阳台窗\t310553\n接二连三\t310554\n杜聪\t310555\n成人英语三级考试\t310556\nSocket\t310557\n情归\t310558\n超星学习通\t310559\n游园\t310560\n0欧元\t310561\n张海\t310562\n冰湖\t310563\n6公斤\t310564\n手房\t310565\n温拿\t310566\nGSX\t310567\n何甘霖\t310568\n零部件\t310569\n儿臣\t310570\nbroadcasting\t310571\n长江有色金属网\t310572\n分速\t310573\n余韵\t310574\n久哥\t310575\n技术化\t310576\n微波治疗仪\t310577\n凰北月\t310578\n小点\t310579\n交流接触器\t310580\n三重\t310581\n河科大\t310582\n创新业\t310583\n白装\t310584\n雷强\t310585\n9月16日\t310586\n神秘感染\t310587\n创行者\t310588\n受控\t310589\nsww\t310590\n雄兵连之雄芯篇\t310591\n51ape\t310592\n龙生九子\t310593\n党节\t310594\n王玥\t310595\n马泷\t310596\n支部书记\t310597\n找片\t310598\n平面控制测量\t310599\n单宁\t310600\nXING\t310601\nxo\t310602\nAstral\t310603\n募集\t310604\nvoucher\t310605\nARD\t310606\n英菲尼迪Q70\t310607\n系统性\t310608\n警校\t310609\nbeamng\t310610\n减压阀\t310611\n梅林传奇\t310612\n南昌大学第一附属医院\t310613\n次时代\t310614\n旅行攻略\t310615\n海洋城\t310616\n礼司\t310617\n红彤\t310618\n斯塔夫里阿诺斯\t310619\n玉蒲团2之玉女心经\t310620\n一个16\t310621\nnot\t310622\n江滨\t310623\nTSN\t310624\n床箱\t310625\nMoment\t310626\n安世\t310627\nMF3010\t310628\n姜黄\t310629\nrepack\t310630\n汤阴县\t310631\n秦帝国\t310632\nlabview2015\t310633\n山东商务厅\t310634\nants\t310635\n新疆大学\t310636\n霍邱政府网\t310637\n严格要求\t310638\nee99养生网\t310639\n良乡大学\t310640\n双刃剑\t310641\n4月22\t310642\n回甘\t310643\n60V20A\t310644\n政风行风热线\t310645\ndistutils\t310646\n郭富\t310647\nLCD-60SU470A\t310648\n担担\t310649\n雨后\t310650\n加博会\t310651\n蔡勇\t310652\n三撇\t310653\n出糗\t310654\n一师一\t310655\n707\t310656\n福州市中级人民法院\t310657\n秦时明\t310658\n铝业\t310659\n吕智\t3
10660\n平煤神马报\t310661\n杰克·韦尔奇\t310662\n阪急\t310663\nhausman\t310664\n耀龙\t310665\n卡友支付\t310666\n桧\t310667\n一人我饮酒醉\t310668\n楼凤\t310669\n遥控车\t310670\n初数\t310671\n新概\t310672\n混响\t310673\n曲词\t310674\n计算机网络原理\t310675\n百分含量\t310676\n三国战\t310677\n中国会计视野论坛\t310678\n剪切蒙板\t310679\n汽车吊\t310680\n张爷爷\t310681\nDIS\t310682\n细菌屋\t310683\n筒隐月子喵\t310684\n叶炫清\t310685\n贾汪区\t310686\n一丝不挂\t310687\n工程岗\t310688\n闫鹤翔\t310689\n索九\t310690\n猫色\t310691\n解热\t310692\n美黑\t310693\nintact\t310694\n鲁塔\t310695\n管野\t310696\n_牛客网\t310697\n汽博会\t310698\n女娲补天\t310699\n三幕\t310700\n风机\t310701\n星海_科技\t310702\n名爵ZS\t310703\nLayaAir\t310704\n将\t310705\n凯瑞甘\t310706\nPicking\t310707\n20180207\t310708\n与值\t310709\n要素市场化\t310710\n全网\t310711\nMembers\t310712\nprogressbar\t310713\n制贩\t310714\nboring\t310715\n暗月\t310716\n日在校园\t310717\ncf点\t310718\n東方\t310719\n用\t310720\n雍华庭\t310721\n雅荷\t310722\n科赋\t310723\nAndroid热\t310724\n失灵\t310725\n第六讲\t310726\n180324\t310727\n主界\t310728\n过来人\t310729\n小乌龟\t310730\n两江机场\t310731\n花jd\t310732\n刘海砍樵\t310733\nWHO\t310734\n敬亲\t310735\n360百科\t310736\n五月六月\t310737\n鱼丝\t310738\nhtc10\t310739\nξ\t310740\naussie\t310741\n20160711\t310742\nword\t310743\n出版物经营许可证\t310744\n挤占\t310745\n济南公交\t310746\n统调\t310747\nemulator\t310748\n张家港神鹿机械密封有限公司\t310749\nsina\t310750\nhooks\t310751\n怎么回事\t310752\nCOR\t310753\n北大数学系\t310754\n动品网\t310755\n北川羌族自治县\t310756\n脑回路\t310757\n吊轨\t310758\nhpb300\t310759\n600886\t310760\n太急\t310761\n官兵卫\t310762\n中芯幼儿园\t310763\n加勒佣兵天下\t310764\n3840x1080\t310765\n论文章\t310766\n街段\t310767\n体彩\t310768\n好事\t310769\n六十八\t310770\n夜来风雨\t310771\n派遣证\t310772\n鼠标点\t310773\n富硒农产品\t310774\n辣椒树\t310775\n中缝\t310776\nCERNET\t310777\n涵养\t310778\ny^2\t310779\n全国少数民族传统体育运动会\t310780\n小肠镜\t310781\n促音\t310782\n盾构机\t310783\njdk6\t310784\n吉利帝豪EC7\t310785\ngpc\t310786\n450元\t310787\n风走\t310788\n旱喷\t310789\n合成石\t310790\n西安事变\t310791\n推光\t310792\n消隐\t310793\n红烧鸡腿\t310794\n二股\t310795\n涡旋\t310796\n八杯\t310797\n大众证券报\t310798\n折迁\t310799\n台中\t310800\n史进\t310801\n测绘通报\t310802\n成都高新区管委会\t310803\n26\t310804\n港口镇\t3
10805\n聘用合同书\t310806\n游弋\t310807\n工具书\t310808\n中国建筑第七工程局有限公司\t310809\n吉林建筑大学城建学院\t310810\n易百纳\t310811\ngrt\t310812\n白下\t310813\naunt\t310814\n75000\t310815\nApache2\t310816\n全丰\t310817\nOpenlayers3\t310818\nExo\t310819\nqq助手\t310820\n4r\t310821\nxshell4\t310822\n92%\t310823\n食药监管\t310824\njiahuafu\t310825\n华平\t310826\n张栻\t310827\n0353\t310828\n10u\t310829\n200ml\t310830\n昌吉回族自治州\t310831\n邕\t310832\n0920\t310833\n数字转换器\t310834\n开账\t310835\n腹痛病\t310836\nreferee\t310837\n莱蒙\t310838\n星命\t310839\n一等\t310840\nautocad2007\t310841\n力丸\t310842\n卡麦\t310843\n常委\t310844\n力帆迈威\t310845\n棠\t310846\n灰化\t310847\n天海祐希\t310848\noutlook.com邮箱\t310849\n黑网\t310850\n暴风雨\t310851\n2010年以来\t310852\n原平\t310853\n衡阳市委\t310854\n邮递员\t310855\n第二十六次\t310856\nospf\t310857\n区块链数字币-巨推链\t310858\n中国水产门户网\t310859\n诱发\t310860\nSina\t310861\n344\t310862\n小鱼洞\t310863\n兆驰股份\t310864\n40ml\t310865\n多肉植物\t310866\n新韩银行\t310867\nHabitat\t310868\n早寒\t310869\n手指\t310870\n榆林窟\t310871\n王戎\t310872\n密匙\t310873\n梦岛\t310874\n财狮\t310875\n包衣\t310876\n所致\t310877\n评本\t310878\n二倍体\t310879\n景林\t310880\n微性\t310881\n互点\t310882\n公基\t310883\n三博\t310884\n大理旅游\t310885\n西安航空\t310886\nNoel\t310887\n3901\t310888\n十四行\t310889\ngiao\t310890\n薪事\t310891\n无痕模式\t310892\n哈尔滨市市场监督管理局\t310893\nmovavi\t310894\n假条\t310895\n樋\t310896\nvariant\t310897\nDUNS\t310898\n巨一\t310899\n400倍\t310900\nBarret\t310901\n爱乐普\t310902\n西科大\t310903\n劳斯\t310904\ncbp\t310905\n按列取数\t310906\n慈善榜\t310907\nlkj\t310908\n洗衣凝珠\t310909\nslam\t310910\nPopupWindow\t310911\n老虎头\t310912\n方亚芬\t310913\nSUUNTO\t310914\n4&#160\t310915\nAID\t310916\n包容\t310917\n薇薇\t310918\nNO.8\t310919\n荞头\t310920\n车辙\t310921\n飞鸿\t310922\ng2o\t310923\nrevo\t310924\n白癣\t310925\n中关村软件园\t310926\n170524\t310927\nFila\t310928\n霍洛维茨\t310929\n优惠装\t310930\n把握住\t310931\n高校邦\t310932\n好苦\t310933\n潮南\t310934\n阿布扎比卢浮宫\t310935\n望京西路\t310936\n可欣\t310937\n叉腰\t310938\n艾斯卡诺\t310939\n夸大\t310940\n第二十九\t310941\n实岁\t310942\n市场营销部\t310943\n口腔\t310944\n惠济区\t310945\n继位\t310946\n引号\t310947\n61000\t310948\n脑出血\t310949\n甩手掌柜\t
310950\n小额账户管理费\t310951\n停尸间\t310952\n玲珑宝\t310953\n凤林\t310954\n誉天\t310955\n徒有虚名\t310956\nmaterial\t310957\n扁桃体结石\t310958\n自定义工具栏\t310959\n啼不住\t310960\n暴风眼\t310961\n野心勃勃\t310962\n鸡东新闻网\t310963\n妖士\t310964\n打实\t310965\n旋风战\t310966\nChia\t310967\n二元函数\t310968\n慢三\t310969\n柳梢头\t310970\n甘肃新闻网\t310971\n蓝羽\t310972\ndnf2017\t310973\n好又\t310974\n義姉\t310975\n禹洲\t310976\n金科廊桥\t310977\nsetup\t310978\n指交\t310979\npdf格式免费版\t310980\nran\t310981\n经验\t310982\n防撞杠\t310983\n重装机\t310984\n酒盒\t310985\n迅雷720P\t310986\n毕亚兹\t310987\n复方鱼腥草合剂\t310988\n真意\t310989\n胃里\t310990\n比巴\t310991\n奥苏贝尔\t310992\n永成\t310993\n短装\t310994\n洛丽\t310995\n上海松\t310996\n欧胡岛\t310997\n血魄\t310998\n一摊\t310999\n凰\t311000\n93周年\t311001\n社科普\t311002\n|亿图\t311003\n龙洞堡\t311004\nstripper\t311005\n中国好声音第二季\t311006\n铸剑物语\t311007\n国税总局\t311008\nPixabay\t311009\n2017年6月17日\t311010\n再创\t311011\n老号\t311012\n日照钢铁厂\t311013\n10.20\t311014\n调音器\t311015\n1823\t311016\n1961\t311017\n福邸\t311018\n春风得意马蹄疾\t311019\n欧洲专利局\t311020\n全球电动车网\t311021\n庐江\t311022\n北师大附小\t311023\n下水道\t311024\n91手游网\t311025\n2.8\t311026\n华银电力\t311027\n国家体育总局\t311028\n春蚕到死丝方尽,蜡炬成灰泪始干\t311029\n遮光板\t311030\n柳柳\t311031\n基辅罗斯\t311032\nigame\t311033\n人生苦短\t311034\n曲屏\t311035\n青娱乐\t311036\n并用\t311037\n精神上\t311038\n吴轲\t311039\n判死刑\t311040\n癸亥年\t311041\nMayer\t311042\n徐徐图之\t311043\nbap\t311044\nasync/await\t311045\n376.1\t311046\n买醉\t311047\n东方明珠小区\t311048\n顷\t311049\nu410\t311050\n华声晨报\t311051\n5401\t311052\nLois\t311053\n不孝子\t311054\n常营\t311055\n反诉\t311056\nhsql\t311057\n9.0.3\t311058\n综放\t311059\n航天信息\t311060\n2520\t311061\n微晶纤维素\t311062\n江苏银监局\t311063\n2018年04月29日\t311064\nDataBinding\t311065\n杨大爷\t311066\nmaybe\t311067\ndenso\t311068\nwindow7旗舰版\t311069\nj1900\t311070\nPostgreSQL数据库\t311071\nvqiu\t311072\n慧企云\t311073\napg\t311074\n千万只\t311075\n黄陵\t311076\n不务\t311077\n北京女孩\t311078\n迈普\t311079\n党家村\t311080\n头脑风暴\t311081\n招聘\t311082\n源值\t311083\n茶谷\t311084\n手影\t311085\n眉胶\t311086\nasmr猫仙儿\t311087\n中国社会科学院经济研究所\t311088\n昆仑派\t311089\n精英\t311090\n加减乘\t311091\n一个升\t311092\n7月
20日\t311093\n新金\t311094\n通窍鼻炎颗粒\t311095\nmede\t311096\n世伟洛克\t311097\n佐证\t311098\n米河镇\t311099\n拳皇98um\t311100\n72cm\t311101\n光垫\t311102\n达州日报网\t311103\n冀北\t311104\n4A\t311105\n航天一院\t311106\n假小子\t311107\n张世杰\t311108\n领潮\t311109\n盎格鲁\t311110\nwex\t311111\ndetergent\t311112\n金沙镇\t311113\n用额\t311114\nsew\t311115\n江南大学商学院\t311116\n19年\t311117\n李钢\t311118\nWiiU\t311119\n请选择\t311120\n屙\t311121\neazyui\t311122\nG_\t311123\n连云港港口集团\t311124\n征片\t311125\n人民西路\t311126\n资字\t311127\n10学院\t311128\n凯伦\t311129\nDecoration\t311130\n柏丽花园\t311131\n新加坡花园城\t311132\n位\t311133\n申诉书\t311134\n连续子\t311135\nPeacock\t311136\n云通讯\t311137\nlenove\t311138\n全币种信用卡\t311139\n陶宏\t311140\nXtraDB\t311141\n气宗\t311142\n李铎\t311143\n出庭\t311144\n帛\t311145\nunny\t311146\niPad4\t311147\n4板\t311148\n199\t311149\n37款\t311150\n嗓音\t311151\nFender\t311152\n20171226\t311153\n冰暗\t311154\nlca\t311155\n断裂带\t311156\ncomme\t311157\n刘坚强\t311158\n不计提\t311159\n承载式\t311160\n开博\t311161\n国际娱乐会所\t311162\n游客网\t311163\n大小周\t311164\n渗透剂\t311165\n迷影\t311166\nEXOPLANET\t311167\n国家海洋局第二海洋研究所\t311168\n一模卷\t311169\n我在人民广场吃炸鸡\t311170\njailbreak\t311171\nPlayClub\t311172\n枯龙吟\t311173\n钴酸锂\t311174\n国民品牌\t311175\n光圆\t311176\n格构式\t311177\n徐云\t311178\n违\t311179\n高秀敏\t311180\nPerfMon\t311181\n溏心风暴之家好月圆\t311182\n万春镇\t311183\n繁星四月\t311184\nlihood\t311185\n宏泰\t311186\n莉安娜\t311187\n浙江省科学技术厅\t311188\n管程\t311189\n背膘\t311190\n大公路\t311191\n助人为\t311192\n183号\t311193\nsjt\t311194\n大户型\t311195\n南尘\t311196\n灭鼠\t311197\n马隆\t311198\n俗名\t311199\nrsync\t311200\n涂板\t311201\n完好无损\t311202\n受洗\t311203\nAnyLogic\t311204\n选煤厂\t311205\n两角\t311206\n捞刀河\t311207\nNightmares\t311208\n广告灯\t311209\n区际\t311210\n神雾节能\t311211\nRaffles\t311212\ncyj\t311213\n死灵\t311214\n杭州市公安局\t311215\n9.1.1\t311216\n中国怀化市政府\t311217\n千岛湖风景区\t311218\n栖游记旅行网\t311219\n意识形态工作责任制\t311220\n数据站\t311221\n永恒王座\t311222\n骨结核\t311223\n阮经天\t311224\n21:30\t311225\n龙凤\t311226\n李奶奶\t311227\n骄阳似火\t311228\nweb打印控件\t311229\n菊\t311230\n壮士断腕\t311231\n神秘海域\t311232\n光明网\t311233\n2017、2018年\t311234\n徐小溢\t311235\n0029
16\t311236\nBoF\t311237\n邓瑞霞\t311238\n氺\t311239\n边荒传说\t311240\n虐身虐心\t311241\n第77集\t311242\n金葫芦\t311243\n赵尧珂\t311244\n枷\t311245\n自挂\t311246\n泰和泰律师事务所\t311247\n宋强\t311248\n吴翰清\t311249\n广州一中\t311250\n用成\t311251\n7.3pvp\t311252\n秦巴\t311253\n音谱\t311254\nEIEI\t311255\nx3500\t311256\n四川省知识产权局\t311257\n无领\t311258\n高血\t311259\n杭州长江实验小学\t311260\n陌上人如玉\t311261\n谌贻琴\t311262\n到了\t311263\n蛇性\t311264\n李双\t311265\n盆种\t311266\n冰砖\t311267\nva屏\t311268\n东山寺\t311269\n白铁\t311270\n班期\t311271\n中国美国商会\t311272\n丹东新闻网\t311273\n洛南县\t311274\n鋼\t311275\n嗜中性粒细胞\t311276\n小饰品\t311277\n王笛\t311278\n麼样\t311279\n港科\t311280\n伊比利亚半岛\t311281\n水峰\t311282\n新生儿筛查\t311283\n马柔\t311284\nPreventing\t311285\n填项\t311286\n第116集\t311287\n绪川\t311288\n六七天\t311289\n计免\t311290\n月\t311291\n我的老千生涯\t311292\nf540\t311293\n阿尔特\t311294\n井越\t311295\n确诊\t311296\n15.6英寸\t311297\n建业森林半岛\t311298\nSuperpads\t311299\ndiv2\t311300\n调研员\t311301\n跨世纪\t311302\n95亿\t311303\n扶养\t311304\n二次函数\t311305\n莫迪亚诺\t311306\n浙江省交通厅\t311307\n疾呼\t311308\n李如海\t311309\n凯库拉\t311310\n阿里系\t311311\n木子小僧\t311312\n整惨\t311313\nP10Plus\t311314\n电击小子\t311315\n夏湾\t311316\n索尼俱乐部\t311317\n金庸无双2\t311318\n齐鲁大学\t311319\nHDF5\t311320\n下龙\t311321\n火之心\t311322\n大宁网\t311323\n海德曼\t311324\n拉维斯\t311325\n情节\t311326\n儿歌视频大全_儿童歌曲大全_儿歌大全100首_儿童故事大全\t311327\n风险纳税人\t311328\n碧云国际社区\t311329\n一跤\t311330\n钢甲\t311331\n义门\t311332\n留存\t311333\n谷王\t311334\n杭州启正中学\t311335\n唯吾\t311336\n现代都市\t311337\n传书\t311338\n宽松\t311339\n心立方\t311340\nWaffle\t311341\n波儿\t311342\n善存\t311343\n大高\t311344\nfalcon\t311345\nNIPPON\t311346\n扩张性货币政策\t311347\nASCII编码\t311348\nUVLED\t311349\n寒冬腊月\t311350\nAxureRP8\t311351\nv1.7.0\t311352\n国民信托\t311353\n石斛花\t311354\nNINJA\t311355\n星海小学\t311356\n31.12\t311357\n头孔\t311358\n波波头\t311359\n芜湖方特\t311360\n司徒镇\t311361\n桑塔\t311362\n昭化古城\t311363\n蔺草\t311364\n代训\t311365\ncplusplus\t311366\n6027\t311367\n底盘件\t311368\nIQOS电子烟\t311369\n海珠湖公园\t311370\n下沉\t311371\n别样人生\t311372\n哼\t311373\nDzzOffice\t311374\n听香\t311375\n又\t311376\npbb\t311377\naj4\t311378\nxx村\t311379\n笔势\t311380\n第一
集\t311381\n开产\t311382\nvlog\t311383\n4000亿\t311384\nconnectify\t311385\n世界王氏网\t311386\nk4000\t311387\n育龙\t311388\n唱到\t311389\nexiting\t311390\nprotocal\t311391\ncaravan\t311392\n土料\t311393\n热水\t311394\n陈大光\t311395\n色阁\t311396\nubntu\t311397\n傅琰东\t311398\n黑手党\t311399\n川岛芳子\t311400\n绿毛虫\t311401\nwin7虚拟机\t311402\n刘永强\t311403\n信长之野望13天道威力加强版\t311404\n爽\t311405\ndrawRect\t311406\n100道\t311407\n安居网\t311408\nBoot\t311409\n圭贤\t311410\n圆桌骑士\t311411\n李佩甫\t311412\n财会类\t311413\n含泪\t311414\n无权代理\t311415\n轻乳酪蛋糕\t311416\n雷克萨斯is\t311417\n思明区\t311418\n贪婪者\t311419\n非四舍五入\t311420\n育儿论坛\t311421\n返还型\t311422\n滚子\t311423\n8112\t311424\n世界近代史\t311425\n安阳火车站\t311426\n記錄\t311427\n祥安阁\t311428\n贵腐酒\t311429\n耶酥\t311430\n奔驰GLE级\t311431\n026期\t311432\nzzt\t311433\nDaoCloud\t311434\n消耗量\t311435\n街灯\t311436\n青金\t311437\n7.3奶德\t311438\n照本宣科\t311439\n太湖流域\t311440\n五网\t311441\n为人\t311442\n被遗弃\t311443\niqos\t311444\n52厘米\t311445\n月咏\t311446\n欺人\t311447\n黄芪水\t311448\n窝\t311449\n王祥\t311450\n6101\t311451\n影音先锋资源男人站av撸\t311452\n格列吡嗪片\t311453\n香港联合交易所有限公司\t311454\n熵值法\t311455\n竞进\t311456\n夹层玻璃\t311457\n浙江出入境检验检疫局\t311458\n互动百科\t311459\n全职猎人\t311460\n柳生\t311461\n独此一家\t311462\n喷锌\t311463\n美舰\t311464\n丁泽仁\t311465\n恶名\t311466\nGonna\t311467\n新乡市区\t311468\n2017到2018\t311469\n3307\t311470\nMasking\t311471\n外公\t311472\nserver\t311473\n心力交瘁\t311474\n人民币基金\t311475\n二倍速\t311476\n和弦谱\t311477\n昆仑决吧\t311478\npolymerase\t311479\n雅高酒店集团\t311480\n乌祖尔\t311481\n三棱\t311482\n袁合荣\t311483\n陈立农\t311484\n浩华\t311485\n花生饼\t311486\nG312\t311487\n闺蜜照\t311488\nMindstorms\t311489\n民谣吉他吧\t311490\n上午十点\t311491\n出土\t311492\n红军路\t311493\nswl\t311494\n綦江区\t311495\n水主\t311496\n冚\t311497\n超能力\t311498\n微信008货源网\t311499\n学会合作\t311500\n懒猫\t311501\n视觉中国\t311502\nViewFlipper\t311503\nk金\t311504\nmenubar\t311505\n风湿骨病\t311506\n斗魂大陆\t311507\n主会场\t311508\n炸薯条\t311509\n0.2g\t311510\n12123\t311511\n郭晓冬\t311512\n有机物\t311513\nQTcpSocket\t311514\n具体\t311515\n渣渣辉\t311516\n电感镇流器\t311517\n峰峦\t311518\n郭颖\t311519\n阿里健康\t311520\n元岗\t311521\n机卡\t311522\n瑞虎5\t311
523\n趣史\t311524\neeprom\t311525\n剜\t311526\nPhilips\t311527\n括弧\t311528\nEmilie\t311529\n十五的月亮\t311530\n我心\t311531\n星河传奇\t311532\n白星\t311533\n钢贸\t311534\n折臂\t311535\n奥币\t311536\n失心疯\t311537\n沪深港通\t311538\n国际实业\t311539\n滑头鬼\t311540\nspreading\t311541\nWebapp\t311542\n快活\t311543\n上下标\t311544\n梅赛德斯奔驰文化中心\t311545\n深市\t311546\n乘法交换律\t311547\n中鑫\t311548\n卡版\t311549\n大筒\t311550\n李小军\t311551\n高干文\t311552\n底池\t311553\n借酒消愁\t311554\n川味\t311555\n胎压传感器\t311556\n叔宠\t311557\n追踪\t311558\nw4\t311559\n3.3.1\t311560\n科汉森\t311561\n任尔东西南北风\t311562\n双碱法脱硫\t311563\n宝骏730论坛_汽车之家论坛\t311564\n经久不衰\t311565\n看不得\t311566\n日本共产党\t311567\n道床\t311568\n党团\t311569\n牛津高阶英汉双解词典\t311570\n沉住\t311571\n天涯新知_\t311572\n5500u\t311573\n大叶子\t311574\n剑玉\t311575\njava接口\t311576\n烈日\t311577\n超逸\t311578\n汝南\t311579\n副厅级\t311580\n成都车管所\t311581\nhaok\t311582\nExiting\t311583\n新动向\t311584\n福岛县\t311585\n邪龙\t311586\nnutshell\t311587\n随心动\t311588\n尸位\t311589\n歌剧\t311590\n中航资本\t311591\n镇远古城\t311592\n拯\t311593\n金楠天街\t311594\n王帆\t311595\n2013年9月\t311596\n未来星\t311597\n晚托\t311598\n慧教育信息网\t311599\n开鲁\t311600\n邵东\t311601\n武汉仁爱医院\t311602\n同步电动机\t311603\n海南法院\t311604\n斑爷\t311605\nSb\t311606\n00150\t311607\n光触媒\t311608\n公网IP\t311609\ntgt\t311610\nMATlab\t311611\n兴业数金\t311612\n大连小区\t311613\n良言\t311614\nePub\t311615\n富甲天下吧\t311616\ndont\t311617\npooling\t311618\n新传\t311619\n遗传性\t311620\n流放之路野蛮人\t311621\nproud\t311622\n耳光\t311623\n新编剑桥商务英语\t311624\n20161103\t311625\n来安县\t311626\nAL00\t311627\n李亚梵\t311628\n刘福新\t311629\n帕莎\t311630\ndvi\t311631\n高血压脑出血\t311632\n暹岗\t311633\n巨变\t311634\n湖南日报网\t311635\n一万六\t311636\n外痔\t311637\n弊器\t311638\n九鼎记\t311639\n北京银行信用卡中心\t311640\n中科曙光\t311641\n金罗盘\t311642\n8101\t311643\nFaceTime\t311644\n除锈机\t311645\n假面骑士ex-aid\t311646\n2115\t311647\ngbk编码\t311648\n隐式转换\t311649\n天龙八部ol\t311650\n东亚杯\t311651\n拳霸3\t311652\n桑拿场所\t311653\n烟台长岛\t311654\n营养包\t311655\n趣彩网\t311656\n公交站台\t311657\nBURCH\t311658\n维欧\t311659\nvides\t311660\n红豆包\t311661\n精奶\t311662\n神器版\t311663\n岩气\t311664\n喙\t311665\n德哈\t311666\nXChange\t311667
\n恐艾\t311668\nrrd\t311669\n病学\t311670\n站票\t311671\n7.1.3\t311672\nsgo\t311673\n雪媚娘\t311674\n非营利\t311675\n富士xt1\t311676\n公主\t311677\n吃奶\t311678\n趁人之危\t311679\n行用\t311680\n新彩网\t311681\n汉阳区\t311682\nie80\t311683\n小心机\t311684\n打呵欠\t311685\n机织\t311686\n塔斯肯\t311687\ndistortion\t311688\n电流钳\t311689\n张慧君\t311690\n话术\t311691\n肃然\t311692\n常州日报\t311693\nDaisy\t311694\n一顾\t311695\n市交委\t311696\ngs3\t311697\n山东电子职业技术学院\t311698\n仓皇\t311699\n永洪\t311700\n中国科学院计算机网络信息中心\t311701\n康护网\t311702\n十色\t311703\n糯米饼\t311704\n圣天\t311705\nIphone4\t311706\ncrow\t311707\n东莞市小学\t311708\ntypedef\t311709\n街机\t311710\n选为\t311711\n袋袋\t311712\n动态屏保\t311713\n常用字\t311714\n段娟娟\t311715\n淇澳岛\t311716\n猫头鹰\t311717\nMentor\t311718\n涂画\t311719\ndivinity\t311720\n霍氏\t311721\n黑哨\t311722\n微信号大全\t311723\n杨萌\t311724\n内环\t311725\n松溪\t311726\npets5\t311727\nSNK\t311728\nville\t311729\n血糖\t311730\n板外\t311731\ndail\t311732\n脑病科\t311733\n译民\t311734\n韩天伟\t311735\n浙江省质量技术监督局\t311736\nuLua\t311737\n凉果\t311738\n集显\t311739\n1742\t311740\n仿宋GB2312\t311741\n黑魔\t311742\n华为m5\t311743\n弈\t311744\n奶藓\t311745\n明润\t311746\n收藏集\t311747\n512g\t311748\n美好时代\t311749\n阿冰\t311750\n陆战棋\t311751\n大洪水\t311752\n异恋\t311753\n赵王\t311754\nfce\t311755\n昏沉\t311756\n微信群\t311757\n庆尚北道\t311758\n三峡在线\t311759\nOracle数据库表\t311760\n自洁式\t311761\nRobo\t311762\nyjb\t311763\n朱宁\t311764\n金舒络\t311765\n泵送\t311766\n小米随身wifi\t311767\n正山小种\t311768\n地雷达\t311769\n乐惠\t311770\n发券\t311771\nMoscow\t311772\n40周\t311773\n穷开心\t311774\n泰伯\t311775\n棒槌山\t311776\n苏蓉蓉\t311777\n53家\t311778\n宝尊\t311779\n翡丽公馆\t311780\nletme\t311781\n美姬\t311782\npreload\t311783\n小东门街道\t311784\n独山县人民政府\t311785\n履带式起重机\t311786\n真祖\t311787\n2018a\t311788\n长春市\t311789\n肖亮\t311790\n擎天软件\t311791\nsolvent\t311792\ncyanogen\t311793\n苦海\t311794\n宾格\t311795\n深圳市人大常委会\t311796\n反对称矩阵\t311797\nwebmail\t311798\n金阊新城\t311799\n干活\t311800\nEat\t311801\n刷卡机\t311802\nUAD\t311803\ntagid\t311804\n浏河网络\t311805\n来华\t311806\n建册\t311807\n布丽吉塔\t311808\nsean\t311809\n北京市妇联\t311810\n家拳\t311811\n充值机\t311812\n柴油味\t311813\n怡丰\t3118
14\nOCR\t311815\naways\t311816\n宛南实验幼儿园\t311817\n质量控制\t311818\n威能地暖\t311819\n银生宝\t311820\n善林(上海)金融信息服务有限公司\t311821\n敏格\t311822\n党委书\t311823\n习字\t311824\naisi\t311825\n红河学院\t311826\n超人类\t311827\n长宇\t311828\n德语专业\t311829\n环信\t311830\n脑髓\t311831\n红尘辗\t311832\n三太\t311833\n老爷\t311834\n辩称\t311835\n4km\t311836\n凯尔萨斯\t311837\n何群\t311838\n对比型\t311839\n1点\t311840\n杂图\t311841\n排污证\t311842\n麒麟啤酒\t311843\n1万4\t311844\n中国电建集团华东勘测设计研究院有限公司\t311845\n品色堂\t311846\n香城新都网\t311847\nutuntu\t311848\n89页\t311849\n返璞归真\t311850\n两扇门\t311851\n一个服\t311852\n安忍\t311853\n轩辕剑3\t311854\n张玮伽\t311855\n华人资讯网\t311856\n索菲\t311857\n文治\t311858\n鹤屋通贩\t311859\n主缸\t311860\n碧莲\t311861\nspringdatajpa\t311862\n20160628\t311863\n24.0\t311864\n徐楠\t311865\n三叉树\t311866\n朗肯\t311867\n3gmfw.cn\t311868\n兴成\t311869\n烤瓷冠\t311870\n油兔\t311871\n下册\t311872\n赵鑫\t311873\n敲鼓\t311874\n玉人\t311875\n悉尼\t311876\n宫野志保\t311877\n刨冰\t311878\n卷九\t311879\n宾果\t311880\n叽里咕噜\t311881\nHalfWater\t311882\n趾骨\t311883\n钟神秀\t311884\nlifanacg\t311885\nZack\t311886\nHAP\t311887\n传祺GM8\t311888\n乌镇互联网国际会展中心\t311889\n开普勒\t311890\n血片\t311891\n木讷\t311892\nchinas\t311893\n扬泰机场\t311894\n带感\t311895\n两百\t311896\n油阀\t311897\n福建医院\t311898\n金伦\t311899\n导演们\t311900\n沈志华\t311901\n爱欲\t311902\n圭峰山\t311903\n邵氏电影公司\t311904\n敌国\t311905\nVsCode\t311906\n2015年4月\t311907\n护手\t311908\n0223\t311909\n星球大战\t311910\nOfficers\t311911\n跃然纸上\t311912\n建行报\t311913\n330分\t311914\nBattleye\t311915\n舍予\t311916\n三方\t311917\n十重\t311918\n争虎斗\t311919\nEN\t311920\n康英\t311921\nzxr\t311922\nOFO\t311923\n圣诞装\t311924\n16厘米\t311925\nlongtext\t311926\n操盘\t311927\n花桥经济开发区\t311928\n运营管\t311929\nfabrication\t311930\n贫困\t311931\njiftle\t311932\nexpanse\t311933\n现货\t311934\ndelaware\t311935\n将神\t311936\n馈\t311937\n伧\t311938\n景观\t311939\n账款\t311940\n谢文东\t311941\n转差\t311942\nHKBU\t311943\n剪刀\t311944\n特勤\t311945\n内测\t311946\n_清网\t311947\n唐晓琳\t311948\n1668\t311949\n用好像\t311950\nCU\t311951\n0035\t311952\n悲鸿\t311953\n机翼\t311954\nankle\t311955\n蠼螋\t311956\n并联型\t311957\n先锋\t311958\n政事\t311959\n和盘\t311960\n车
钩\t311961\n王步高\t311962\n招代\t311963\n黑暗\t311964\n南半球\t311965\nhistoire\t311966\n四月底\t311967\n磁州窑\t311968\n金霞\t311969\n联胜\t311970\n叉积\t311971\n1.04\t311972\n合肥白癜风医院\t311973\n控告\t311974\n空气换热器\t311975\n莫失莫忘\t311976\nFileGee\t311977\n二式\t311978\niso9000认证\t311979\n桥板\t311980\n不想长大\t311981\nJsonp\t311982\n布伦特\t311983\n气溶胶\t311984\nlistening\t311985\n央妈\t311986\n大罗金仙\t311987\n朱唇\t311988\n398\t311989\nadded\t311990\n1\\\t311991\n10.19\t311992\nIgA肾病\t311993\nOASIS\t311994\nzealand\t311995\n000333\t311996\n云手机\t311997\n娱乐_好奇心日报\t311998\n图素\t311999\n给货\t312000\n全台\t312001\n马牙\t312002\n又爱\t312003\n合江\t312004\nshuaji\t312005\n5.6.0\t312006\n织\t312007\n文商\t312008\n晓松奇\t312009\n绝地求生自定义\t312010\n下期中考\t312011\n周赛\t312012\n充气柜\t312013\n主角们\t312014\n图库\t312015\n1.11.3\t312016\n入夜\t312017\n师鹏\t312018\n自由观\t312019\n化合价\t312020\ntate\t312021\nUMTS\t312022\n几瓣\t312023\n21.8\t312024\n呼吸\t312025\n机架式\t312026\n将军庙\t312027\n帆状\t312028\n申长路\t312029\n黄框\t312030\n擂茶\t312031\n肩带\t312032\napue\t312033\n直下式\t312034\n28秒\t312035\n优乐娱乐\t312036\n宝类\t312037\n音标\t312038\n好诗\t312039\n谜语\t312040\n汽车销售服务有限公司\t312041\n小虫子\t312042\n6411\t312043\n安太医\t312044\n马天林书豪\t312045\n一二类\t312046\nPlaceholder\t312047\n_集团\t312048\n社稷\t312049\n%s\t312050\nEGL\t312051\n将至\t312052\n毛尖\t312053\n乐视超级手机2\t312054\nCertain\t312055\n熟冻\t312056\n诺亚方舟\t312057\n南无乐队\t312058\n逃婚\t312059\ndue\t312060\n北京现代名图\t312061\n苯磺酸氨氯地平片\t312062\nMiracle\t312063\n省能源局\t312064\n临猗县\t312065\n李小龙\t312066\n百企\t312067\n5倍\t312068\n86万\t312069\nHalsey\t312070\n传输器\t312071\n分类机\t312072\nE站通\t312073\n二叉查找树\t312074\n西安地铁1号线\t312075\n200道\t312076\n陆瑾\t312077\n情期\t312078\n238元\t312079\n鉴藏\t312080\n吨税\t312081\n纸槽\t312082\n广袤\t312083\ndisposition\t312084\n城西\t312085\nheis\t312086\ngood\t312087\n长安欧尚X70A论坛_汽车之家论坛\t312088\nPal\t312089\n衬管\t312090\n57095150\t312091\n商品编码\t312092\n金素妍\t312093\n蹭课\t312094\n良夜\t312095\nwin10蓝屏\t312096\n半兽人\t312097\n围网\t312098\nFASTBOOT\t312099\n沈阳新松机器人自动化股份有限公司\t312100\n外保\t312101\n东软载波\t312102\n82757325\t312103\nAPM飞控/Pixhawk飞控
\t312104\n飞弹\t312105\nryzen7\t312106\nreassembly\t312107\n策源地\t312108\n第701章\t312109\n高赛\t312110\n发情期\t312111\n20170820\t312112\n菜脯\t312113\n政务区\t312114\n霍光\t312115\n安源区\t312116\nqq闪照\t312117\n备案证\t312118\n宣传科\t312119\ntbf\t312120\n废矿\t312121\n芒种\t312122\n五月色播影音先锋\t312123\nwin10s\t312124\n桐谷ユリア\t312125\nOffcie\t312126\n好正\t312127\nXCode\t312128\n旅行箱\t312129\n父传子传\t312130\n序列化成\t312131\nk3\t312132\n瑞虎仙逆\t312133\n赢才网\t312134\n绿壳鸡蛋\t312135\n高压板\t312136\n我的信念\t312137\n16家\t312138\n总览\t312139\n陕西榆林米脂\t312140\nneil\t312141\n卷装\t312142\n黄坤明\t312143\n切割器\t312144\n再接\t312145\nBackBeat\t312146\n乳清\t312147\n多学科\t312148\ndensity\t312149\nrelive\t312150\n六色\t312151\n精武门\t312152\n诺基亚公司\t312153\n夏图\t312154\npapel\t312155\n复合片\t312156\n50册\t312157\n妈妈咪鸭\t312158\ntopgear\t312159\n电动压缩机\t312160\naegisub\t312161\n槽罐车\t312162\n水泥杆\t312163\n徐娇\t312164\n啪嗒\t312165\n交通报\t312166\n男男\t312167\nGE\t312168\n优果\t312169\n彩凤\t312170\n房企\t312171\n雄峰\t312172\n乡野风月\t312173\n烛\t312174\n2月18日\t312175\n回锁\t312176\nGenetics\t312177\nWORKSHOP\t312178\nTeambition\t312179\n101##\t312180\nchkdsk\t312181\ngeshi\t312182\n生育权\t312183\n出口信用保险\t312184\nep.2\t312185\ncardview\t312186\nYHA\t312187\n滴液\t312188\n调模\t312189\n霍夫曼\t312190\n交界\t312191\nats\t312192\nBluRay-720P.MKV\t312193\n吼声\t312194\n砂管\t312195\n可穿戴式\t312196\n1.618\t312197\n模特们\t312198\n合肥大剧院\t312199\nmelon\t312200\n码农之路\t312201\ncooledit\t312202\n广汽丰田汽车有限公司\t312203\n加数\t312204\n楷\t312205\nDirectX12\t312206\n电子科学与技术\t312207\n51Ape.Com\t312208\n像素大战\t312209\n淘猫\t312210\n粤语片\t312211\ncudnn\t312212\n603986\t312213\n倪萍\t312214\n纪律检查委员会\t312215\n女气\t312216\n40hq\t312217\n国际性\t312218\n北京市纪委\t312219\nvc\t312220\n1198\t312221\ndeepfakes\t312222\ntoos\t312223\n悔罪\t312224\n2009年\t312225\n生存之旅\t312226\ncarrots\t312227\ngroom\t312228\nPWN\t312229\nQDIE\t312230\n24类\t312231\n上海市住房和城乡建设管理委员会\t312232\nsplashtop\t312233\n凯伦·吉兰\t312234\n广州中望龙腾软件股份有限公司\t312235\n第06\t312236\n炸糕\t312237\nSMTP协议\t312238\n严锋\t312239\n金宁\t312240\n永不磨灭的番号\t312241\n西经\t312242\nTides\t312243\n氧
气袋\t312244\n嘚瑟\t312245\n南充职业技术学院\t312246\n靶点\t312247\n洁具\t312248\nVGOD\t312249\n空力\t312250\n20150519\t312251\n职代会\t312252\n城市网\t312253\n冰梅\t312254\nLeak\t312255\n天津第一中心医院\t312256\n离心力\t312257\n浏览量\t312258\n陈士铎\t312259\n怪物猎人世界太刀\t312260\nっ\t312261\n组装车\t312262\n甘肃省公安厅\t312263\n诗乡\t312264\n点成\t312265\n妆感\t312266\n第一门\t312267\n安可达\t312268\nHitFilm\t312269\n摸\t312270\n全合成\t312271\n大雪\t312272\n好视界\t312273\n存包\t312274\ncollaborative\t312275\n触变\t312276\n开工率\t312277\n合并机\t312278\n流卡\t312279\n云师大\t312280\n重置盘\t312281\n锐界论坛_汽车之家论坛\t312282\nSWF\t312283\n厦门市财政局\t312284\n小鲁班\t312285\n怪塔\t312286\nqmap\t312287\n600例\t312288\n起源地\t312289\n贵在\t312290\nGoAgent\t312291\n技术处\t312292\n吸塑机\t312293\nAlarm\t312294\n过节费\t312295\nUAP\t312296\n20170908\t312297\n抖m\t312298\n深圳市不动产登记中心\t312299\n名大全_公司名字大全_祥安阁风水网\t312300\nK-Means算法\t312301\n新版射雕英雄传\t312302\n私兵\t312303\n佐菲奥特曼\t312304\n黄播\t312305\n代售\t312306\n20170824\t312307\n唯库\t312308\n可见于\t312309\n新翼\t312310\nk20\t312311\n蓝色球\t312312\nwww.1ni.cn\t312313\n哈里发\t312314\n浙大冰虫\t312315\n汉阴县\t312316\n味蜀吾\t312317\n1227\t312318\n中国人民大学苏州校区\t312319\n疯狂\t312320\nbanged\t312321\nftrace\t312322\n三四声\t312323\nqubie\t312324\n8角\t312325\n32本\t312326\n办公建筑设计规范\t312327\n龙驹\t312328\n完美时代\t312329\n电话会议纪要\t312330\n摆位\t312331\n吊毛\t312332\n残念\t312333\n一个晚上\t312334\n凤山镇\t312335\n现金管理暂行条例\t312336\n海淘橙\t312337\nYou\t312338\n零战\t312339\nfallendoll\t312340\nlasagne\t312341\n清河新城\t312342\n灰指甲\t312343\n英盛\t312344\n大头娃娃\t312345\n成功之路\t312346\ndingo\t312347\n增量式\t312348\n囊袋\t312349\nZ840\t312350\n虚无神\t312351\n隐\t312352\nUSB转串口线\t312353\n总体规\t312354\n陌生女孩\t312355\nWorktile\t312356\nо\t312357\n挂篮\t312358\n博图\t312359\nfenlei\t312360\n电磁转矩\t312361\nnavcat\t312362\n牵羊\t312363\n增派\t312364\n中兴通抵扣联\t312365\n乌龟\t312366\n高压锅炉管\t312367\nOpenWrt\t312368\nDFX\t312369\n榆阳\t312370\n口光\t312371\n资源量\t312372\n库客\t312373\n天津教育信息网\t312374\n浩源\t312375\n高程点\t312376\n金韵\t312377\n质素\t312378\n吉祥人寿\t312379\n殷旭\t312380\n云南信息港\t312381\n标样\t312382\n海南日报\t312383\n多趣\t312384\n海南海花岛\t312385\n山本彩\t312386\n
普宁寺\t312387\n仿\t312388\nBeJSON\t312389\n色先锋_影音先锋av资源_色先锋影音看片|色先锋\t312390\n胜利油田中心医院\t312391\n一往无前\t312392\nAibao\t312393\n两寸照\t312394\n北清路\t312395\nCFileDialog\t312396\n恒锋信息\t312397\n单相异步电动机\t312398\n增长额\t312399\nPLAY\t312400\n苏州注册公司\t312401\n2018-02-23\t312402\n木童\t312403\n毁脸\t312404\nteenagers\t312405\n产品级\t312406\nqp\t312407\n南山郡\t312408\nG45\t312409\n唯听\t312410\n行凶者\t312411\n纽约国际儿童俱乐部\t312412\nSTEAMPUNKS\t312413\n王紫璇\t312414\n付梓\t312415\n配土\t312416\n噎\t312417\nAtheros\t312418\n登机桥\t312419\n粥样\t312420\nTX们\t312421\n化学学院\t312422\n四线制\t312423\n万岛\t312424\nNanchang\t312425\nPorts\t312426\n康利\t312427\nFIC\t312428\n永州市政府\t312429\n稳达\t312430\n郭家铭\t312431\ndst\t312432\n瓜熟蒂落\t312433\n工委\t312434\n牛董\t312435\nCOME\t312436\npscs\t312437\n全国人大一次会议\t312438\n陈爱华\t312439\n古言小说吧\t312440\n芭蕾舞者\t312441\n耗油量\t312442\nVILLA\t312443\n福利镇\t312444\n500点\t312445\n亚里亚\t312446\n仕\t312447\n生育证\t312448\n阶乘\t312449\ntojson\t312450\ng9\t312451\n皮尔洛\t312452\nintitle\t312453\n捡肥皂\t312454\n农家\t312455\n力竭\t312456\nftn\t312457\n张忠谋\t312458\n古剑\t312459\n河北广播网\t312460\n鬼丑\t312461\n北京市丰台区人民政府\t312462\n晋人\t312463\n玄关处\t312464\n叶子猪神\t312465\n指令性\t312466\nxunit\t312467\n32.4\t312468\n待命\t312469\n冤冤相报\t312470\n我爸\t312471\n蜀相\t312472\n情报战\t312473\n肩胛\t312474\n滚降\t312475\n听不到\t312476\n2999\t312477\n口含\t312478\n河图洛书\t312479\n申请日\t312480\n空接\t312481\n食蚊鱼\t312482\n36yc.com\t312483\n逻\t312484\nsubway\t312485\n天津人事考试网\t312486\n成都国税\t312487\n偷开\t312488\n宿菲菲\t312489\nloding\t312490\n名和\t312491\n三番\t312492\n护眼片\t312493\n喜乐\t312494\ndnf召唤师\t312495\n宣传板\t312496\nFUNCTION\t312497\n福州新区\t312498\n颜色歌\t312499\n怀档\t312500\n河南省人力资源和社会保障厅\t312501\n重庆市经济和信息化委员会\t312502\n倾品\t312503\n锦丰镇\t312504\n莫晨欢\t312505\n赵娜\t312506\n1.1%\t312507\n薄荷糖\t312508\nrouge\t312509\n乙胺丁醇\t312510\n深圳妈妈网\t312511\n受托方\t312512\n768_\t312513\n体贴\t312514\nwww.iqiyi.com/lib/m_203106214\t312515\nvpro\t312516\n天天链n1\t312517\nballoon\t312518\n瑞星杀毒\t312519\nprops\t312520\n古书\t312521\n四川卷\t312522\n七线阁\t312523\n不包\t312524\n智者见智\t312525\n阔气\t312526\n普特南\t312527
\n227\t312528\ncardinal\t312529\n安庆医药高等专科学校\t312530\n99.8%\t312531\n袁文康\t312532\nidentify\t312533\n瞎掰\t312534\n马革裹尸\t312535\nimca\t312536\n双列\t312537\n底特律\t312538\n固定资产折旧\t312539\n画脚\t312540\n家集\t312541\n菲佣\t312542\nintval\t312543\nBeatsX\t312544\n炉温\t312545\nSmarter\t312546\n7包\t312547\n窦燕山\t312548\nAD\t312549\n帕金森症\t312550\n小川あさ美\t312551\n卡托维兹\t312552\n2106年\t312553\n膏状\t312554\n经检查\t312555\nWCBA\t312556\nLil\t312557\nplans\t312558\n水表\t312559\nlucie\t312560\n微斯人\t312561\n黑格\t312562\n善领电子狗\t312563\nredistemplate\t312564\n网典\t312565\n程序性\t312566\n刘胜利\t312567\nwin7usb\t312568\n池陌\t312569\n语文园地三\t312570\n780亿\t312571\n氰\t312572\n厦门农商银行\t312573\n萍乡市政府\t312574\n安吉莉卡\t312575\n淡黄色\t312576\n101.8\t312577\n庞氏骗局\t312578\n弓子\t312579\nfeatured\t312580\n二宫奈奈\t312581\n袜天使\t312582\n荆门市\t312583\n波斯帝国\t312584\n石兰\t312585\n赫克\t312586\nBMI\t312587\n管井\t312588\n血道\t312589\n临城街道\t312590\n9377\t312591\n黄道\t312592\nProtues\t312593\n海航投资\t312594\n神字\t312595\n剧单\t312596\n语种\t312597\n深根\t312598\n紫青悠\t312599\n天雪\t312600\nCMG\t312601\n中国国际金融股份有限公司\t312602\n辽沈战役纪念馆\t312603\nPROM\t312604\n碧桂园凤凰酒店\t312605\nkyrie\t312606\n中交中央公园\t312607\nギャル\t312608\n6.3.4\t312609\nlanqiu\t312610\n醒狮\t312611\nforth\t312612\naj13\t312613\n一行多列\t312614\n两截\t312615\n恐怖小说\t312616\n热泵烘干机\t312617\ninlet\t312618\n体系结构\t312619\n实外\t312620\n诺基亚8110\t312621\n摩梭\t312622\n掩盖\t312623\nSTM32-F0/F1/F2\t312624\n并卵\t312625\n我的家\t312626\n徐翠翠\t312627\n美者\t312628\n理化所\t312629\n天全之窗\t312630\n乳浊液\t312631\n华润集团\t312632\n二十一条\t312633\n金在德\t312634\nGeekCar\t312635\n庠序\t312636\n大马路\t312637\n华为南京研究所\t312638\n某东\t312639\n李嘉琪\t312640\n五谷杂粮粉\t312641\n朱永\t312642\nMintel\t312643\n7.3.4\t312644\n双绞\t312645\nHTML源代码\t312646\n南来北往\t312647\n董健\t312648\n责\t312649\n栗战书\t312650\n必将\t312651\n侍寝\t312652\n热播泰剧网\t312653\n少帅\t312654\nSHIN\t312655\n清园\t312656\n官威\t312657\n1角\t312658\n秽土\t312659\n心尖\t312660\n展望\t312661\n垂心\t312662\n谭崔\t312663\n天命西游\t312664\n歌堂\t312665\nManifold\t312666\n中华英烈网\t312667\n王守昌\t312668\n酒店管理系统\t312669\ncyper\t312670\n演讲者\t312671\
n卧式车床\t312672\n深圳前海微众银行\t312673\n冷链运输\t312674\n成语\t312675\n苗若兰\t312676\nminon\t312677\n拱脚\t312678\n乡镇人大\t312679\n中国联合航空有限公司\t312680\n拖行\t312681\n劳动生产率\t312682\n数字出版\t312683\n要货\t312684\n代谢率\t312685\nHonolulu\t312686\n分条\t312687\n单晶硅片\t312688\n射雕英雄传之东邪西毒\t312689\n后排\t312690\n也许\t312691\nfailover\t312692\n尉氏吧_\t312693\n女生版\t312694\n55.com\t312695\nGuitars\t312696\n名君\t312697\nset函数\t312698\n白沙\t312699\n杨旸\t312700\n量价分析\t312701\nsqlquery\t312702\nSOPC\t312703\n肺结核病\t312704\n300S\t312705\n江苏技术师范学院\t312706\n王拥军\t312707\n第二十一次\t312708\n清越流歌\t312709\n小黄人\t312710\n紫金财产保险股份有限公司\t312711\n说不清\t312712\n中华\t312713\n装一网\t312714\n中铁逸都\t312715\n舍人\t312716\nProvider\t312717\n包图\t312718\nLDS\t312719\n计量经济分析\t312720\n大街网\t312721\n屋后\t312722\nskyme\t312723\n隆平高科\t312724\n36.7\t312725\n休斯\t312726\n王松\t312727\n胡乐民\t312728\nFirm\t312729\n北花园\t312730\nfun函数\t312731\n辨治\t312732\n自治\t312733\n练练\t312734\nOnMP3\t312735\n预制建筑网\t312736\n2.1.8\t312737\n合同能源管理\t312738\n特惠\t312739\n19ggg\t312740\n奶嘴\t312741\n4ever\t312742\n92平米\t312743\n威宁县政府\t312744\n二,三\t312745\n杭州市政协\t312746\n洗冤录\t312747\n急雨\t312748\n欲哭无泪\t312749\n哪科\t312750\n花径\t312751\nGTI\t312752\n尝试\t312753\n中国东方资产管理股份有限公司\t312754\n粮安\t312755\n引理\t312756\n黄晶晶\t312757\n奥维地图\t312758\n顶灯\t312759\n花骑士\t312760\n3.32\t312761\n深圳博爱医院\t312762\n监控杆\t312763\n喘\t312764\npitch\t312765\n盟会\t312766\n周厚健\t312767\ncaoliushequ\t312768\njcc\t312769\n摩登年代\t312770\n金山花园\t312771\n浪子心声\t312772\n腹泻病\t312773\n首针\t312774\n辣\t312775\nLINUX服务器\t312776\n齐鲁晚报网\t312777\n阴癖\t312778\n魔物娘\t312779\n党国\t312780\n深井潜水泵\t312781\n王贺\t312782\n中房股份\t312783\nfacebook\t312784\n许可费\t312785\n礼袋\t312786\n迈阿密大学牛津分校\t312787\n企業\t312788\n尼加拉瓜运河\t312789\n康网\t312790\nMatte\t312791\n麦克\t312792\ntt娱乐\t312793\n水镜先生\t312794\nxionglee\t312795\n广澳\t312796\nimm\t312797\n小酒窝\t312798\n封车\t312799\n丽妆\t312800\nbijiben\t312801\n天下3\t312802\nschooling\t312803\n咸水鸭\t312804\n2465\t312805\n30.0\t312806\n中小商业银行\t312807\n国外版\t312808\n变性人\t312809\n中天广场\t312810\n风神股份\t312811\n三助\t312812\n司令部\t312813\n学情\t312814
\n吉他手\t312815\n中国商用飞机有限责任公司\t312816\n云视\t312817\n白须\t312818\n华润凤凰城\t312819\n15集\t312820\n申建\t312821\nAnymore\t312822\n碧螺春茶叶\t312823\n锦东\t312824\n十日游\t312825\n国家留学网\t312826\n320K音\t312827\n快冲\t312828\n艺博\t312829\n纪姿\t312830\n排辈\t312831\n零障碍\t312832\n刘志雄\t312833\n0x0000\t312834\n漫画史\t312835\n收敛性\t312836\n宫观\t312837\n易制爆\t312838\n大辛\t312839\n圣灯山\t312840\n余佳\t312841\n海口酒店\t312842\n瓦拉\t312843\n插刀\t312844\n1.8.8\t312845\n广州中国工商银行\t312846\nLoyal\t312847\n2kw\t312848\n松湖\t312849\n播放器\t312850\ntranscripts\t312851\nWere\t312852\n佩罗娜\t312853\naeb\t312854\n唵\t312855\n浦东路\t312856\n电视费\t312857\n大神经\t312858\n184\t312859\njom\t312860\nespaol\t312861\n威力加强版\t312862\n管件\t312863\n60回\t312864\n崔老师\t312865\n正式稿\t312866\n九华天池\t312867\n相克\t312868\n聚氨酯复合板\t312869\n_村\t312870\n此文\t312871\n录音盒\t312872\n重庆航空\t312873\n空军工程大学\t312874\n赵雅淇\t312875\n阻塞性睡眠呼吸暂停低通气综合征\t312876\n创鸿\t312877\n圣童\t312878\n骨外科\t312879\nTerrans\t312880\nX.509\t312881\n希舒美\t312882\n经验贴\t312883\nNoodles\t312884\n死对头\t312885\n狗斗\t312886\n七里河\t312887\n蒋季辰\t312888\n除虫菊\t312889\n节手\t312890\n200px\t312891\n天堂网2017\t312892\n黑皮\t312893\n紫眸\t312894\n心魄\t312895\n罗门王\t312896\n父域\t312897\n洋浦\t312898\n讲课\t312899\n刘寄奴\t312900\nxinxiang\t312901\n冰灵\t312902\n代培养\t312903\n子宫全切\t312904\n曹家渡街道\t312905\n深沟球轴承\t312906\n侵染\t312907\n白苔\t312908\n朱尔屯\t312909\n内行人\t312910\n关联银行卡\t312911\n臭名昭著\t312912\n牧场物语精灵\t312913\n博澳\t312914\n二三四线\t312915\n经济损失\t312916\n体检仪\t312917\n千方百计爱上你\t312918\nqualifying\t312919\n5.3天天\t312920\n彩虹云\t312921\njsr303\t312922\n体育场路\t312923\nvs.\t312924\n力王\t312925\n自幼\t312926\nregency\t312927\n高更\t312928\n罗布\t312929\n耐光\t312930\nOPPOR11S\t312931\nEXCEL\t312932\ntaowen\t312933\nYuki\t312934\n安\t312935\n查\t312936\n闲话\t312937\n上海市同济医院\t312938\nprcc2018\t312939\n宜春市教育局\t312940\n黑基网\t312941\n德马格\t312942\n广购书城\t312943\n归经\t312944\nZone\t312945\nscrypt\t312946\n临武县\t312947\n美博会\t312948\n国家机器\t312949\n广\t312950\n四部曲\t312951\n300英\t312952\n袋形\t312953\n64所\t312954\n北冰洋\t312955\n闽宁\t312956\n豪爵宇钻\t312957\n守孝\t312958\n奉新\t312959\n我想和你唱\t312
960\n自励\t312961\n兼职\t312962\n委曲求全\t312963\nphp7.0\t312964\nsonn\t312965\n直线\t312966\n茶叶盒\t312967\n营业室\t312968\n澳游搜旅游网\t312969\n复利计算器\t312970\n碧桃\t312971\nintex\t312972\nx=3\t312973\n伦敦塔桥\t312974\n经方\t312975\n动漫界\t312976\n比赛类\t312977\n弘农\t312978\n未能\t312979\n乘于\t312980\n一风\t312981\n酒席\t312982\nuislider\t312983\n差分电路\t312984\n甲基丙烯酸\t312985\n时墨\t312986\n4.16\t312987\nJiangmen\t312988\n作茧自缚\t312989\n流通股本\t312990\n多年以前\t312991\nAndy\t312992\n可妮兔\t312993\n两平行\t312994\n无锡市\t312995\n男二\t312996\n反式\t312997\n誉品\t312998\n杨小平\t312999\n雷神锤\t313000\n合生创展\t313001\n警铃\t313002\n十三载\t313003\n慧灵\t313004\n201503\t313005\n锦地\t313006\n定型\t313007\nmdash\t313008\n气模\t313009\n960M\t313010\n江西省高速公路投资集团有限责任公司\t313011\n黑羊\t313012\nbioisland\t313013\n奥尔夫\t313014\n公共卫生学院\t313015\n星际迷航吧\t313016\n刘志\t313017\nrefined\t313018\n崖州区\t313019\n3dmax9\t313020\n奥林巴斯吧\t313021\n腐乳\t313022\n桑田岛\t313023\n新崔斯特姆\t313024\n额子目\t313025\n西二路\t313026\n赵英博\t313027\n联迪\t313028\n非const\t313029\nyng\t313030\n601200\t313031\n文钞\t313032\nvga\t313033\n250公斤\t313034\n7440\t313035\n佛曰\t313036\n代理费\t313037\n狗狗们\t313038\n链板\t313039\n7折\t313040\n剥落\t313041\n3d网络\t313042\n朱莹\t313043\n台湾大学\t313044\n驾机\t313045\n←\t313046\n硬密封球阀\t313047\n重启\t313048\n北京联合大学\t313049\n刚需盘\t313050\nblueray\t313051\n千红\t313052\n海泡石\t313053\n阿唐\t313054\n上两点\t313055\n喷流\t313056\n教程集\t313057\n锐洋\t313058\n次晨达\t313059\nyocto\t313060\n尐\t313061\n中国人民解放军国防科技大学\t313062\n永久版\t313063\n1020\t313064\n海门镇\t313065\n思文\t313066\nhidden\t313067\n奥格斯堡\t313068\n干泵\t313069\n红星二锅头\t313070\n假期生活\t313071\n智能科技有限公司\t313072\n钱江摩托\t313073\n1800dpi\t313074\n春天\t313075\n桦仔\t313076\nsimwe\t313077\ndesignation\t313078\n100000\t313079\n4GB\t313080\n天津市知识产权局\t313081\n六路\t313082\n安徽省审计厅\t313083\n开闭所\t313084\n合同书\t313085\nCre\t313086\n凤凰沟\t313087\napp端\t313088\n臂间\t313089\n淘宝公司\t313090\n玩具总动员酒店\t313091\n走笔\t313092\n暴龙机\t313093\nfaire\t313094\n秋水仙素\t313095\n俏妈\t313096\n娃娃鱼\t313097\n特意\t313098\n同力\t313099\n江湖侠客令\t313100\n党政军\t313101\n做好\t313102\n時尚\t313103\ncps\t313104\n大片感\t313105\n易贸\t3
13106\ndyed\t313107\n闽南歌\t313108\nFist\t313109\n美术类\t313110\n周润\t313111\n舒冬\t313112\n滚刀\t313113\nyousa\t313114\n宏威\t313115\ngabor滤波器\t313116\nPipes\t313117\n环巢湖\t313118\n充氮\t313119\n咏叹\t313120\n将门\t313121\n于明加\t313122\n七八岁\t313123\n炭笔\t313124\n院训\t313125\nIcy\t313126\n追涨\t313127\n鼎龙湾\t313128\nphilippe\t313129\nV7.4\t313130\n酉\t313131\n课案\t313132\nWidget\t313133\n止鼾器\t313134\n福伊特\t313135\n西安华仁医院\t313136\n18亿亩\t313137\n副团长\t313138\n意中人\t313139\n1秒钟\t313140\n3幅\t313141\n名语\t313142\n读机\t313143\nc++头\t313144\nPERT\t313145\n声明函\t313146\n宿州学院\t313147\n白帝\t313148\n西郊线\t313149\n庶子\t313150\nnumeric_limits\t313151\n党风廉政建设\t313152\n費即時\t313153\n1149\t313154\n众泰T500首席品鉴师\t313155\n林和\t313156\n547\t313157\nMiniSD\t313158\n专修\t313159\n癌药\t313160\n组歌\t313161\n王长田\t313162\nctv\t313163\n磨山\t313164\n藏宝海湾\t313165\n黄河大街\t313166\n歼-10\t313167\n甲午海战\t313168\n廉耻\t313169\n屠杀\t313170\nDominant\t313171\n抑制\t313172\n杨思梅敏\t313173\n单栏\t313174\n大陆剧\t313175\nPraise\t313176\n新工厂\t313177\nsex8.cc\t313178\n恩惠\t313179\np型\t313180\n莱辛\t313181\n赞皇\t313182\nlyd\t313183\n吴越古道\t313184\n惠建林\t313185\nBBF\t313186\nlocker\t313187\n弄乱\t313188\nAET\t313189\ngnueabihf\t313190\n3D云\t313191\n日\t313192\n阿云嘎\t313193\n去秋来\t313194\n城家\t313195\n白茯苓\t313196\n重酬\t313197\n七台\t313198\n木秀于林\t313199\nCA001\t313200\n敦煌网\t313201\n赚取\t313202\n网游儒林外史\t313203\n阿里云盾\t313204\n中夏\t313205\nhtml2canvas\t313206\n甲贺\t313207\n装备部\t313208\n12万吨\t313209\n8K分辨率\t313210\n合肥南\t313211\n江右\t313212\n内急\t313213\n平销\t313214\n秦宓\t313215\nloading层\t313216\n报机\t313217\n乖乖\t313218\n隔离板\t313219\n汾阳路\t313220\n临平镇\t313221\n北京海淀区政府网\t313222\n冰酒\t313223\n卤阳湖\t313224\n帝豪EV450\t313225\n秽迹金刚咒\t313226\n东川\t313227\n张国立\t313228\n1亿元\t313229\n4h\t313230\n门牌号\t313231\n桐梓县\t313232\n山鹰之歌\t313233\n夜莺与玫瑰\t313234\n东莞农商银行\t313235\nwh-h900\t313236\n章安\t313237\nJade\t313238\n战号\t313239\nvue组件\t313240\n谭瞐\t313241\n_天气网\t313242\n中绿\t313243\n猿人\t313244\n鸭掌\t313245\n20多度\t313246\n可乐瓶\t313247\n爆点\t313248\n购置税\t313249\n2018.4.25\t313250\n华为畅享8e\t313251\n1kv\t313252\niso格式\t313253
\n种税\t313254\n正格\t313255\n出去走走\t313256\n科技大道\t313257\n易语言资源网-学网\t313258\n民族学\t313259\n1300公里\t313260\ngmat\t313261\n大多数\t313262\n百万部\t313263\nhp1536\t313264\n烧录卡\t313265\n出水量\t313266\n冰川\t313267\n龙岩人才网\t313268\n摩托车手\t313269\n除气\t313270\nS2\t313271\n买房人\t313272\n好望角\t313273\n海怪\t313274\nenviroment\t313275\n某一天\t313276\n高压变压器\t313277\n玛格汉\t313278\n卢植\t313279\n半人马座\t313280\nRadiance\t313281\n1系\t313282\n易改\t313283\n求雨\t313284\n小瓜\t313285\nYYP\t313286\n9140\t313287\n上海中梁地产集团有限公司\t313288\n唐纳\t313289\n106名\t313290\n女神转生\t313291\n5863xxxx\t313292\n8支\t313293\n瑜珈\t313294\nWidth\t313295\n程乃珊\t313296\nwestlaw\t313297\n猴头菌片\t313298\n劳动仲裁\t313299\n服役\t313300\nurl地址\t313301\n万部\t313302\n房照\t313303\n乙肝表面抗体\t313304\n混版\t313305\n畜生\t313306\n国际法院\t313307\nPOSTMAN\t313308\n9重\t313309\n昌平县\t313310\n名宅\t313311\n雪恋\t313312\n驸马爷\t313313\nXEBEC\t313314\n杨枝\t313315\n梦舞\t313316\n三项\t313317\n办里\t313318\n桂正\t313319\n三山镇\t313320\n坡顶\t313321\nxmp\t313322\n廖雪峰\t313323\n倍儿爽\t313324\n景宁畲族自治县\t313325\nsyzx\t313326\n奥绿肥\t313327\n偶像梦幻祭\t313328\n倒映\t313329\n搬出\t313330\n安庆政府网\t313331\n王者荣耀貂蝉\t313332\nDLG\t313333\n梵缺\t313334\n宁波市镇海区\t313335\n南一路\t313336\n精神卫生日\t313337\n武汉市总工会\t313338\nfuli3399com\t313339\n黄氏响声丸\t313340\n150斤\t313341\n工具集\t313342\n走私\t313343\ndamages\t313344\n速途网\t313345\n上男\t313346\nAmazing\t313347\nbetternet\t313348\nx&#178\t313349\n华明\t313350\ngns\t313351\n张文\t313352\n例假量\t313353\n理顺\t313354\n北京才秀人人科技有限公司\t313355\n谷歌娘\t313356\n低脂牛奶\t313357\n陈寒柏\t313358\n国际博览中心\t313359\nGearbox\t313360\n河南自贸试验区\t313361\naplus\t313362\n十名\t313363\n联创\t313364\n友元类\t313365\n投机性\t313366\n丙酸氟替卡松鼻喷雾剂\t313367\n五月五日\t313368\n包塑\t313369\n常开式\t313370\n刘苏苏\t313371\n2000-2016年\t313372\nitv\t313373\n高台\t313374\n李广文\t313375\n几段\t313376\n强心\t313377\n李彦宏\t313378\n联想Y50\t313379\nYNET\t313380\n必知\t313381\nredission\t313382\n开创\t313383\n凯氏定氮\t313384\n精功集团有限公司\t313385\n777ey\t313386\n吓\t313387\n船岸\t313388\n迎春路\t313389\n电网\t313390\n季璃\t313391\n千架\t313392\n14年\t313393\n艾丽\t313394\nMashup\t313395\n杭州房途网\t313396\n动员令\t313397\
n王者荣耀2017\t313398\nmand\t313399\n1.2v\t313400\n20171216\t313401\n申根签证申请表\t313402\n铠甲勇士刑天\t313403\nTopics\t313404\n郑世云\t313405\n星粉\t313406\n城市热岛\t313407\n指引下\t313408\n房思琪\t313409\n闭合\t313410\n鹤顶红\t313411\n尚湖镇\t313412\ntransverse\t313413\n淋头\t313414\n赵嘉\t313415\nDetached\t313416\n王贞白\t313417\nujn\t313418\n洛丽塔\t313419\nratings\t313420\nホ\t313421\n供暖季\t313422\n换影\t313423\n枣庄市市中区\t313424\n880句\t313425\n3h\t313426\nBETTER\t313427\n蓝胖\t313428\n猛兽侠\t313429\n醉乡民谣\t313430\n勇者\t313431\n周大侠\t313432\n45周岁\t313433\n广州工行\t313434\n全球加盟网JiaMeng.com\t313435\n木糖醇\t313436\n功略\t313437\n东西方\t313438\nFuchs\t313439\n蜇\t313440\n王光明\t313441\n黄凯芹\t313442\n相辅相成\t313443\n吨价\t313444\n风笑天\t313445\n九成宫\t313446\nJoker\t313447\n商务部投资促进事务局\t313448\n汉谟拉\t313449\nuniqid\t313450\ndeath\t313451\n河南大学第一附属医院\t313452\n魁北克\t313453\n沈芳\t313454\nminizip\t313455\n机械性\t313456\n第八代\t313457\n95mm\t313458\n631所\t313459\n奥数网\t313460\n乐安\t313461\n王鸿谟\t313462\n抗心磷脂抗体\t313463\n密西西比河\t313464\n醋酸菌\t313465\n演播\t313466\n固投\t313467\n正兴\t313468\n平面媒体\t313469\n小仝\t313470\n专用纸\t313471\nrtl8188eu\t313472\n内蒙古博物院\t313473\n同系物\t313474\n库存周转率\t313475\n罗艳芳\t313476\namd显卡\t313477\n无锡人民医院\t313478\nhai\t313479\n兴兴\t313480\n晓松\t313481\n胧月\t313482\n齐都\t313483\n小工\t313484\n张立明\t313485\n军型\t313486\nBanco\t313487\n削皮机\t313488\n发声亮剑\t313489\n基努\t313490\n记承天寺夜游\t313491\ndopa\t313492\n开发者们\t313493\nCC2018\t313494\nwtaps\t313495\npowder\t313496\n新白\t313497\n600G\t313498\n铂金时代\t313499\n迎泽之窗\t313500\n暴怒\t313501\n火王之破晓之战\t313502\n人人喊打\t313503\nICI\t313504\n脑梗塞\t313505\ns^2\t313506\n三第二章\t313507\n搜网\t313508\n边口\t313509\n厂妹\t313510\n藏药\t313511\nSeaborn\t313512\n2017年5月3日\t313513\n马龙\t313514\nscnu\t313515\n洛逸\t313516\nhander\t313517\ncervelo\t313518\n中国石油物资\t313519\n九百九十九朵\t313520\n哈斯\t313521\n山居\t313522\nA醇\t313523\n荡\t313524\nFamilies\t313525\n地井\t313526\n香港分行\t313527\n治安学\t313528\n武当七侠\t313529\n盛泽\t313530\n美丽家\t313531\n切换阀\t313532\n诺西\t313533\n火影引导页\t313534\n真心实意\t313535\n另存\t313536\n气灶\t313537\n思翰\t313538\n第八场\t313539\n呼吸急促\t313540\n双栖\t313541\n
罩罩\t313542\n船票\t313543\n高抬\t313544\noled\t313545\nckplayer-超酷网页视频播放器\t313546\n决斗之城\t313547\n筒形\t313548\n心理科学\t313549\n喜来健\t313550\nINPUT\t313551\n恒大冰泉\t313552\n防锈纸\t313553\ncnckad\t313554\n↙\t313555\nfdb\t313556\n3154\t313557\n沪市融资融券\t313558\n音频\t313559\n亲善\t313560\n军科院\t313561\n星舞忠\t313562\nPseudo\t313563\n31万\t313564\n大理药业\t313565\n钽铌\t313566\nGnome3\t313567\n较真\t313568\nYii2\t313569\n消\t313570\n梦幻新区\t313571\n深圳晚报数字报\t313572\n第10部\t313573\nmetamask\t313574\n脑垂体\t313575\n200目\t313576\n笃行\t313577\n婴童\t313578\n款识\t313579\n二片\t313580\n时断\t313581\ncapex\t313582\n天津中医药大学第一附属医院\t313583\n杭州酒店式公寓\t313584\nboot-starter\t313585\n新闻页\t313586\n慢性疲劳综合征\t313587\n远期汇票\t313588\n菠丹妮\t313589\n粘\t313590\npaho\t313591\n温德姆酒店\t313592\n聆音\t313593\n健康社区\t313594\nSqlmap\t313595\n三河村\t313596\n苗语\t313597\n变形\t313598\n均田制\t313599\nTwice\t313600\n检察日报社\t313601\n_糖尿病\t313602\n袁惟仁\t313603\ncm101s\t313604\n文学度\t313605\n天津市妇女联合会\t313606\n公共关系学\t313607\nFurnace\t313608\nTrouble\t313609\n借兵\t313610\n林盘\t313611\n一转身\t313612\n慨\t313613\n索尼\t313614\nUSB3.0\t313615\nG43\t313616\n第三度\t313617\n杭州高新区\t313618\n李琼\t313619\n平南小学\t313620\n淘宝宝贝\t313621\n苏稽\t313622\nIC卡读写器\t313623\n和平公园\t313624\n拉丝模\t313625\n快速\t313626\n电视\t313627\n中国梦之队快乐之舞健身操\t313628\n彭浩翔\t313629\n度洛西汀\t313630\n爱他美奶粉\t313631\n练习题\t313632\n望亭\t313633\n侧门\t313634\n三鲜\t313635\n罗子君\t313636\n老会\t313637\n三秦游网\t313638\n三会两制\t313639\n新闻集\t313640\n宝石翁\t313641\nKinKi\t313642\n金融股\t313643\n澄江县\t313644\n丹姿水\t313645\n王苗\t313646\n盘扣式\t313647\n哌嗪\t313648\n乐活小镇\t313649\n田忌赛马\t313650\nFighter\t313651\n厂里\t313652\n雷霄骅\t313653\n县粮食局\t313654\n135度\t313655\n恩施火车站\t313656\n广告宣传\t313657\n世界知识产权日\t313658\nCAXA\t313659\n罗布麻叶\t313660\n118&\t313661\n滑鼠\t313662\n淘汰郎\t313663\n跨网\t313664\n3D立体画\t313665\n电脑软硬件应用网\t313666\nMoz\t313667\n徐珂\t313668\nautocomplete\t313669\nAzalea\t313670\n胡锦涛文选\t313671\nIMD\t313672\n84消毒液\t313673\n九块邮\t313674\n神雪\t313675\n订制\t313676\n期中奖\t313677\n最大小\t313678\n顺义区政府\t313679\ntwi\t313680\n22速\t313681\n卡友\t313682\n爽文\t313683\n一诚\t313684\nCBC\t3136
85\n107平米\t313686\nicg\t313687\n客都网\t313688\n枪柜\t313689\n黎塘\t313690\n樱粉金\t313691\n昆明街\t313692\n巨熊\t313693\n李淳\t313694\n促甲状腺激素\t313695\n草丛中\t313696\n嘉奖\t313697\n冷冻箱\t313698\n聂磊\t313699\n4738\t313700\n菜谱大全|食谱|美食网\t313701\n彩屏\t313702\n姜太公\t313703\n住宅\t313704\n铠甲勇士之帝皇侠\t313705\n翻书\t313706\nBF3\t313707\n金猪\t313708\n身体乳\t313709\nUsb\t313710\n戴帆\t313711\nwindeln\t313712\n声色\t313713\n一址\t313714\nCompatible\t313715\niturns\t313716\nAD16\t313717\n唐敏\t313718\n)投资管理有限公司\t313719\n黑龙江外国语学院\t313720\n弄清影\t313721\nMagazine\t313722\n导弹艇\t313723\nRVVP\t313724\n今春\t313725\nAD网\t313726\n捷达王\t313727\n何功伟\t313728\n同质化\t313729\nmouse\t313730\n开声\t313731\n30单\t313732\n三尸\t313733\nEEC\t313734\n中国电影资料馆\t313735\n气体摩尔\t313736\n混岗\t313737\n乘兴\t313738\n村里人\t313739\nhackbar\t313740\n虹吸式\t313741\n恩泽\t313742\n北京市规划和国土资源管理委员会\t313743\n薛晓东\t313744\njava变量\t313745\n胡庆余堂\t313746\n泰式\t313747\n清结算\t313748\n假设性\t313749\ngs8\t313750\nsma\t313751\nCelebrating\t313752\n东方热线\t313753\nhxr\t313754\n28t\t313755\nlinked\t313756\n米未\t313757\n便衣警察\t313758\n感同\t313759\n世纪联华\t313760\n乳酸菌阴道胶囊\t313761\n龙岗政府\t313762\n韩国政府\t313763\n大手\t313764\n联影\t313765\n炒家\t313766\n清代\t313767\n成都移动\t313768\n格拉夫\t313769\n露水\t313770\n蜀南春郡\t313771\n上百只\t313772\n高情\t313773\n保利心语花园\t313774\n信捷PLC\t313775\n女神异闻录\t313776\n一箪食\t313777\n联想z510\t313778\n867\t313779\ncoming\t313780\n安徽大学研究生院\t313781\n免冠照\t313782\n571g\t313783\n打传\t313784\n模块度\t313785\n国电\t313786\nlistings\t313787\n尹正\t313788\n四月是你的谎言\t313789\nED\t313790\numi\t313791\n特种兵之火凤凰\t313792\n暗爽\t313793\nVanessa\t313794\n时髦\t313795\n学岛网\t313796\n革命者\t313797\n绞痛\t313798\n赤黑\t313799\n乌镇东栅\t313800\nfinest\t313801\n潮汕国际机场\t313802\nATO\t313803\n浙江新华\t313804\n1.68\t313805\n响水河\t313806\n小米手机4\t313807\n100多条\t313808\n相图解\t313809\n宁夏路\t313810\n蜂窝纸\t313811\n岭南\t313812\n老爹鞋\t313813\n程老湿\t313814\n乡村爱情协奏曲\t313815\nMajestic\t313816\n合乐888\t313817\n上海长途汽车客运总站\t313818\n平白无故\t313819\n10V\t313820\n胡蓉\t313821\n大众卡\t313822\n魔力鸟\t313823\n体育画报\t313824\n四帮\t313825\nWRC\t313826\n祈福\t313827\n1.9.2\t313828\n淮阴工学院\t
313829\n大耳朵图图\t313830\n季尼玛\t313831\n9.5.0\t313832\n巴迪\t313833\nharu\t313834\n0217\t313835\n受理\t313836\n镁物\t313837\n先锋集团\t313838\n川陕\t313839\n2孔\t313840\n上方山森林动物世界\t313841\n250号\t313842\n几晚\t313843\n叶黄素酯\t313844\n德意\t313845\n中共党史出版社\t313846\n连点\t313847\n1546\t313848\n液态\t313849\n直接式\t313850\n陵川县\t313851\n会计信息网\t313852\n姨妹\t313853\n信息工程学院\t313854\n复旦医学院\t313855\n双蛇\t313856\n关锦鹏\t313857\n封印石\t313858\ncleanup\t313859\n东庄村\t313860\n样式名\t313861\n恩诺沙星\t313862\n上海辞书出版社\t313863\n东北大\t313864\n鹅颈\t313865\n极品飞车2\t313866\n近体诗\t313867\nMelodyne\t313868\n天津市农村工作委员会\t313869\ncartesian\t313870\n奖励假\t313871\n动态路由协议\t313872\n养殖学\t313873\n观察法\t313874\n一文不值\t313875\nSmlz\t313876\n目标物\t313877\nChic\t313878\n新速特软件站\t313879\n三氟化硼\t313880\n福州大学城\t313881\n张荣\t313882\n生态公园\t313883\n金刚网纱窗\t313884\n行宽\t313885\nnull\t313886\n娟儿\t313887\n兴国寺\t313888\n花江夏树\t313889\n冷遇\t313890\n未卜\t313891\n威仔\t313892\n助航\t313893\n神论\t313894\nauf\t313895\nlouise\t313896\n城东镇\t313897\n良师益友\t313898\n中国药学会\t313899\n承包人\t313900\n出品方\t313901\n悲痛欲绝\t313902\n凸点照\t313903\n人才交流\t313904\n李灿森\t313905\n中村悠一\t313906\n傅明\t313907\n慢性中耳炎\t313908\n中原大学\t313909\n重庆北站北广场\t313910\nDMP\t313911\n高方\t313912\n拔枪\t313913\n百度影音-巴巴影院\t313914\n环创\t313915\n微商城\t313916\n多晶型\t313917\n危秋洁\t313918\n音速\t313919\n一麦\t313920\n14斤\t313921\ncubase7\t313922\n33.3\t313923\n中国电器工业协会\t313924\n何剑锋\t313925\n百花\t313926\n简庆旺\t313927\nslog3\t313928\ntimed\t313929\nkumar\t313930\nmb\t313931\n走美杯\t313932\n盐碱\t313933\n撸一\t313934\n合力亿捷\t313935\nParse\t313936\n极高\t313937\njawn\t313938\n倾城之恋\t313939\n古时\t313940\n十星\t313941\n碳酸\t313942\n神武3\t313943\n王家大院\t313944\n彩装\t313945\n柴桑\t313946\nThem\t313947\n肝功能衰竭\t313948\n1535\t313949\nirony\t313950\nGTX1070Ti\t313951\n干练\t313952\n恐龙帝国\t313953\n变速自行车\t313954\n威法\t313955\n鬓\t313956\n多边\t313957\n炼钢厂\t313958\n番外篇\t313959\n重洲\t313960\n踢踏\t313961\nconsecutive\t313962\n高斯定理\t313963\n护臀霜\t313964\n立日\t313965\n6214\t313966\n瞩\t313967\n000768\t313968\n喷绘机\t313969\n前天晚上\t313970\n克莱因瓶\t313971\nfootwear\t313972\n刘锦玲\t313973\nBiometric\t313
974\n韩安旭\t313975\n土豪\t313976\n泉州市住房和城乡建设局\t313977\n2017年5月11日\t313978\n钢牙\t313979\nchattr\t313980\n上睑\t313981\n市检察院\t313982\n舞儿\t313983\n甲状舌管囊肿\t313984\nCrawford\t313985\nWood\t313986\niStat\t313987\nwhose\t313988\n石头人\t313989\n加州理工\t313990\n发展有限公司\t313991\n1头\t313992\n陆一明\t313993\n数据恢复软件免费版\t313994\nNEW3DS\t313995\n1300年\t313996\n九件\t313997\n黑龙江省环境保护厅\t313998\n谋远\t313999\n垂直电子商务\t314000\n张显\t314001\n产品册\t314002\n瑞虹新城\t314003\n吉翔股份\t314004\n莹衣\t314005\n黎明职业大学\t314006\n草莓\t314007\n独家\t314008\n武冈机场\t314009\n亚索\t314010\nLOGO园\t314011\n拌粉\t314012\n20150516\t314013\n偏位\t314014\n稍微\t314015\nrd9700\t314016\n思迅商业之星\t314017\nMaster\t314018\nispring\t314019\nf334\t314020\n公司法人治理结构\t314021\n别找我麻烦\t314022\n黑鲨鱼\t314023\n干吃\t314024\n腌萝卜\t314025\n3.0金\t314026\nftisland\t314027\n802.11r\t314028\n澜石\t314029\n欧式罗马柱\t314030\n黑泽\t314031\nDECN\t314032\n防身\t314033\n亚迪\t314034\nWikia\t314035\n德辉\t314036\n男亲女爱\t314037\n增建\t314038\n对战树\t314039\n孙正\t314040\n陈博士\t314041\n天使眼\t314042\n汪洁\t314043\n46#\t314044\nsyb\t314045\nS26\t314046\nHYPER\t314047\n2017年12月1日\t314048\n官洲\t314049\n宁波广电网\t314050\n多发性子宫肌瘤\t314051\n奇瑞公司\t314052\n张氏\t314053\nDIN\t314054\nkm2\t314055\n抹芽\t314056\n星团\t314057\n镇化率\t314058\n泥泵\t314059\n新唐\t314060\n低效率\t314061\n太陽\t314062\n3年以上\t314063\nviewholder\t314064\n37元\t314065\n大漠\t314066\n地纬\t314067\nGE公司\t314068\nBIKETO\t314069\n0705\t314070\n10名\t314071\n冬窗\t314072\n经历者\t314073\n极坐标\t314074\n广电总局\t314075\n肠道病毒\t314076\npcg\t314077\nTRF\t314078\n喜好\t314079\n高炜\t314080\n鲁信创投\t314081\n国新办\t314082\n棉籽油\t314083\n左灯\t314084\n古永锵\t314085\n北汽幻速S3\t314086\n荣耀铠\t314087\n旌德县\t314088\n菩萨戒\t314089\nintroduce\t314090\n群鸟\t314091\n微知\t314092\n恶毒\t314093\nnpo\t314094\n司波\t314095\nFuturemark\t314096\nHKUST\t314097\n榴莲\t314098\n期指\t314099\n音乐台\t314100\nretrofit2\t314101\n大瑶镇\t314102\n滁州\t314103\n汴京\t314104\n组成部件\t314105\n尼亚加拉\t314106\n人物\t314107\nZA\t314108\n大疆OSMO\t314109\n镀铝\t314110\ne18\t314111\n等译\t314112\n聚贤\t314113\n中国聚合物网\t314114\n越南\t314115\n大灰\t314116\n葡糖糖\t314117\nSklearn\t314118\na
utosar\t314119\n珂\t314120\n雪歌\t314121\n荤菜\t314122\n鼻托\t314123\n宝坻\t314124\n江苏科技大学苏州理工学院\t314125\n谁知道\t314126\n圣保罗大教堂\t314127\n鲜花饼\t314128\n头子\t314129\n品悟\t314130\n何能\t314131\n创新大厦\t314132\n七星茶\t314133\n重奏\t314134\n破鞋\t314135\n超纯水设备\t314136\n高水平大学\t314137\n一分为二\t314138\ndt880\t314139\nBlower\t314140\n鸿志\t314141\n石马村\t314142\n新加坡工业园\t314143\n连轴转\t314144\n申办方\t314145\n破茧成蝶\t314146\nH30\t314147\n民主评议\t314148\n立水桥\t314149\n四川三河职业学院\t314150\n款待\t314151\n碟友\t314152\n保办\t314153\n不便\t314154\ncbss\t314155\nqq音乐\t314156\n精论\t314157\n施皮茨\t314158\nBD720P/1080P\t314159\n柱状图\t314160\n龙腾路\t314161\n萃取法\t314162\n口袋妖怪:究极日月\t314163\n使命召唤5:战争世界\t314164\n银元宝\t314165\n物流业\t314166\n大信会计师事务所\t314167\nHydraulics\t314168\n康姿百德\t314169\n20131102\t314170\n结核病\t314171\n管护\t314172\n贫瘠\t314173\n三相异步电机\t314174\n遗产税\t314175\n英雄儿女\t314176\n美腿\t314177\n主题片\t314178\n益粒\t314179\n磨破\t314180\n高代\t314181\n睿道\t314182\n食品罐\t314183\n港珠澳\t314184\n经营部\t314185\nSkiing\t314186\nSuunto\t314187\n钟情\t314188\n操演\t314189\npoker\t314190\n相间\t314191\n信用保险\t314192\nts16949\t314193\n十几张\t314194\n3000块\t314195\n八十\t314196\n崇明岛\t314197\n高家朗\t314198\n江苏高科技投资集团有限公司\t314199\n涂层测厚仪\t314200\n胡夏\t314201\n安微\t314202\n日夜百服宁\t314203\n厦门经济特区\t314204\n肌霸\t314205\nbabi\t314206\nHinge\t314207\nstark\t314208\n农创\t314209\n地平线黎明时分\t314210\n1.8GB\t314211\n人寰\t314212\njuicy\t314213\n学馆\t314214\n文数\t314215\n德尔塔\t314216\n宜兴站\t314217\n厚谱\t314218\n光标处\t314219\n收礼\t314220\n血块\t314221\n酸浆\t314222\nFDD\t314223\n保定银行\t314224\nLIM\t314225\n生活史\t314226\n舜天\t314227\n烃源岩\t314228\n衣恋\t314229\n蓝天航空\t314230\n自闭阀\t314231\n做家务\t314232\nt8\t314233\n明者\t314234\n眼药\t314235\n鬼车\t314236\n第2号\t314237\n983\t314238\nlibevent\t314239\n180126\t314240\n99度\t314241\nSTRIKE\t314242\n24kw\t314243\n院级\t314244\n少年中国说\t314245\n小城市\t314246\n许军\t314247\n桔园\t314248\n郑州酒店\t314249\n左方\t314250\nf441\t314251\n勾销\t314252\n生成率\t314253\n校园狂\t314254\n杰赛科技\t314255\n禁室\t314256\n2560x1440\t314257\n15包\t314258\n搬运机器人\t314259\n愛\t314260\ninsects\t314261\n病理生理学\t314262\n72小时内\t314263\n肌细胞\t3
14264\n府君\t314265\n105本\t314266\n停盘\t314267\n充分性\t314268\n九彩\t314269\n顺时SEO\t314270\n187\t314271\n林霞\t314272\n瑰夏\t314273\n35年前\t314274\n驾培\t314275\n马尔斯\t314276\n蛊惑\t314277\n旁瓣\t314278\n引航\t314279\n辅热\t314280\nJAVA8\t314281\n计步器\t314282\nGovHK\t314283\n二氟\t314284\n武汉高铁站\t314285\n未约定\t314286\n甘肃银行\t314287\n吊钳\t314288\n巨阴族\t314289\n捐资\t314290\n灵装\t314291\nDELL服务器\t314292\nFail\t314293\nwebservlet\t314294\n20米高\t314295\n北京市公安局朝阳分局\t314296\ncbrc\t314297\n飘带\t314298\n整取\t314299\n华东区\t314300\n笑呵呵\t314301\nclimate\t314302\n国电南京自动化股份有限公司\t314303\n堆叠\t314304\n上海文联\t314305\n陆慕\t314306\n股票质押式回购\t314307\n高空作业证\t314308\n一个3岁\t314309\n绝代艳后\t314310\n王雅西西里的美丽传说\t314311\n500亿美元\t314312\n丝绸\t314313\n林俊贤\t314314\nganchuanpu\t314315\n欢喜债\t314316\n李松林\t314317\n杂货店\t314318\n桑葚子\t314319\n屏体\t314320\nxiexie\t314321\n三国志13:威力加强版\t314322\n刀片式\t314323\n蔷薇花\t314324\n四川政协\t314325\nweakauras\t314326\nboost\t314327\n屏东县\t314328\n生宣\t314329\n头屯河区\t314330\n东方双狮\t314331\n叶画\t314332\n根数\t314333\n富态\t314334\npsr\t314335\n研磨液\t314336\n第3年\t314337\nbelts\t314338\n抻筋\t314339\n移动\t314340\n略感\t314341\n敲击\t314342\n安兴\t314343\n粗壮\t314344\n16g\t314345\n4&\t314346\n神之路\t314347\ncompile\t314348\n平安符\t314349\n管城\t314350\n空仓\t314351\nWill\t314352\n开开眼\t314353\n元力\t314354\n速度与激情2\t314355\n墨旱莲\t314356\nconcurrency\t314357\n小婉\t314358\n10-14岁\t314359\n中华娱乐网\t314360\n20mA\t314361\n六十周岁\t314362\n正例\t314363\nballad\t314364\n英帝国\t314365\nqlwb\t314366\n广医一院\t314367\n字体册\t314368\n液下泵\t314369\n重庆市国家税务局\t314370\n中翔\t314371\n望江街道\t314372\n判裁\t314373\n高朝\t314374\n6月1日起\t314375\n嘿客\t314376\nPeep\t314377\n演唱版\t314378\nx-t20\t314379\n许诸\t314380\n2018.04.19\t314381\n新浪众\t314382\n陈庄镇\t314383\n长春中医药大学\t314384\n過程\t314385\n法兰克王国\t314386\n鲁科版\t314387\n红花岗\t314388\n8960\t314389\n汉开书院\t314390\nheml\t314391\n24题\t314392\n加勒天天飞车\t314393\n兄弟情义\t314394\nyuju\t314395\n关键人\t314396\ncosA\t314397\n工时\t314398\n诚朴\t314399\n换回\t314400\n苏教版5\t314401\n婴孩\t314402\n双根\t314403\n支塘\t314404\n100d\t314405\n法办\t314406\n肝切除术\t314407\n沈阳地区\t314408\n钨极\t31440
9\n交管12123\t314410\nargsort\t314411\n儒道造化之门\t314412\n十神\t314413\n注册安全工程师\t314414\n九价\t314415\n烟嗓\t314416\nPublishers\t314417\n副国级\t314418\n近意思\t314419\n夜视\t314420\n杨成武\t314421\n2200万\t314422\n控释\t314423\n上庄镇\t314424\n260_\t314425\n石佛寺\t314426\n西门吹雪\t314427\n优衣\t314428\n思茅区\t314429\n三个世纪\t314430\njison\t314431\n报群\t314432\n品级\t314433\n马特\t314434\n菌包\t314435\n沉浸\t314436\nIrrigation\t314437\n51章\t314438\nasp.net\t314439\n酪氨酸酶\t314440\n菠萝蜜\t314441\n大汗腺\t314442\n生活费\t314443\n长途汽车站\t314444\n柚\t314445\n银杏湖乐园\t314446\n40米\t314447\n广州房管局\t314448\n立体动物\t314449\ni5-7200\t314450\n杭州文澜中学\t314451\n日检\t314452\n河北人事考试网\t314453\n梅柳真\t314454\n长春公园\t314455\nkawai\t314456\n严佳丽\t314457\n清明粿\t314458\nxiaomi\t314459\n新疆和静县人民政府\t314460\n相府\t314461\nv5.6.1\t314462\n中学教育网\t314463\n仙三\t314464\nTaking\t314465\n11.22\t314466\n资产管理有限公司\t314467\n长隆海洋公园\t314468\n第三块\t314469\n剑网3配装器\t314470\nbdo\t314471\n风源\t314472\n1mb\t314473\ncandies\t314474\n汗带\t314475\n渝办\t314476\ngrabber\t314477\nHaas\t314478\n紫葡萄\t314479\n船务公司\t314480\n安徽邮电职业技术学院\t314481\n建业路\t314482\n宁波市北仑区政府\t314483\n权力离歌\t314484\n三国:全面战争\t314485\n歌诀\t314486\n战舰少女r所罗门之晓\t314487\n中国商标网\t314488\n鼻癌\t314489\n丁子峻\t314490\n新曼彻斯特\t314491\n排载\t314492\nCraftor\t314493\n1式\t314494\nolympics\t314495\nsshkey\t314496\n翁源\t314497\n鸳鸯街道\t314498\n威压\t314499\nchn-ct\t314500\n600M\t314501\n大兴村\t314502\nCrybaby\t314503\n砍头案\t314504\nneue\t314505\n泰星Pope\t314506\n榜首\t314507\n同流合乌\t314508\nanjeri\t314509\nboris\t314510\n黑毛痣\t314511\n云南十八怪\t314512\n开头曲\t314513\n河南省\t314514\n超强\t314515\n石斧\t314516\nJA\t314517\n刚劲\t314518\n对花\t314519\nm249\t314520\n鲜花港\t314521\n底细\t314522\n干锅鸭头\t314523\n鼓励性\t314524\nmfx\t314525\n代理服务器\t314526\n厦门慧嘉生物科技有限公司\t314527\n坝美\t314528\npubwin\t314529\n华欧\t314530\n1860年\t314531\n艺名\t314532\n冰点文库下载器\t314533\n悠可\t314534\n20140714\t314535\nsurfaceview\t314536\n硬金\t314537\n7.31亿\t314538\nMICROLINE\t314539\n打开\t314540\nqq自由幻想\t314541\n柴鸡\t314542\nWanna\t314543\n总线型\t314544\nMTK6589\t314545\nstatuses\t314546\n一步高升\t314547\n二向\t314548\n煤泥烘干机\t31
4549\n12017\t314550\n茶垢\t314551\n8.3分\t314552\n照_\t314553\n3D画\t314554\n日本料理店\t314555\ndobbo\t314556\n全新版大学英语综合教程\t314557\n气胀轴\t314558\n东辽\t314559\n出血性\t314560\npopping\t314561\n两江幸福广场\t314562\n许绍洋\t314563\n泰康保险\t314564\n熊出没变形计\t314565\n流动资本\t314566\n道教音乐\t314567\nreg\t314568\n3des\t314569\n炭疽杆菌\t314570\n25页\t314571\n城门\t314572\n承天寺\t314573\n霹雳天命之仙魔鏖锋2斩魔录\t314574\n石油城\t314575\n罗技G29\t314576\n李晨光\t314577\n龙安\t314578\n扬州论坛\t314579\n离合器\t314580\n五六十年代\t314581\ncppreference\t314582\n亲和\t314583\n捉泥鳅\t314584\n山东省大学\t314585\n第四节\t314586\n燃烧王座\t314587\n求罩\t314588\n自考资产\t314589\n华同\t314590\n塑料造粒机\t314591\n初晨\t314592\nCali\t314593\n细腻\t314594\nSparseArray\t314595\n目眩\t314596\n醛日\t314597\n31300626\t314598\nConnie\t314599\n劳动\t314600\n联系册\t314601\n童男童女\t314602\nxp/win7\t314603\n软包锂电池\t314604\n芙芙\t314605\ninitialization\t314606\n阶层\t314607\ninitiation\t314608\nIPCC\t314609\n再创辉煌\t314610\nyinian\t314611\n里里外外\t314612\n华夫\t314613\n乙硫醇\t314614\n医学检验技术\t314615\n淘文网\t314616\n卢芳\t314617\n英特网\t314618\n煤球机\t314619\n6毛\t314620\n布拉芙夫人\t314621\n宜嘉\t314622\n第五十章\t314623\n搜视\t314624\n张女士\t314625\n天天看电影网\t314626\n普罗旺斯\t314627\n采信\t314628\n代谢物\t314629\nTires\t314630\n阿冷\t314631\n馨竹\t314632\n妈咪爱\t314633\nMSSQLSERVER\t314634\nTargeting\t314635\n解梦\t314636\n工作流\t314637\n孵出\t314638\n真空度\t314639\n民委\t314640\nvarchar2\t314641\n宝匣\t314642\n潮田渚\t314643\n北京基督教会海淀堂\t314644\n宋太祖\t314645\n成产\t314646\n塘厦林村\t314647\n万科金域华府\t314648\n华富\t314649\n育婴\t314650\n贪生怕死\t314651\n微星gaming\t314652\n幻林\t314653\n龙潭河\t314654\n0.84\t314655\n全面从严治党\t314656\n烟雾弹\t314657\n1.0.2.0\t314658\n70厘米\t314659\n高志森\t314660\n企业私有云&\t314661\n高珊珊\t314662\n1214\t314663\n烫发型\t314664\n2.58\t314665\n西塞\t314666\n阿莱克斯塔萨\t314667\nUni\t314668\nwac\t314669\n触手怪\t314670\nMVR\t314671\nurl域名\t314672\n天堂网\t314673\n凯拉·奈特莉\t314674\n壹品\t314675\nmaike\t314676\n工程院\t314677\nannotations\t314678\ntgp\t314679\n宝丽\t314680\ntaskeng\t314681\n松木板\t314682\n鄂托克旗\t314683\n方法论\t314684\n第24批\t314685\n川川\t314686\n麦客百科\t314687\n3.1.6\t314688\n龙井茶\t314689\n后弦\t314690\n曲
阜师范大学\t314691\n岩土工程学报\t314692\n错爱\t314693\n时代大厦\t314694\n白贝母\t314695\n雨涵\t314696\n第一阶\t314697\n静定\t314698\n洛阳机场\t314699\n潮客\t314700\n海椒\t314701\n吸塑门\t314702\nlegalhigh\t314703\n50平方米\t314704\n候车亭\t314705\n元顺帝\t314706\n第二次世界大战回忆录\t314707\n37.4\t314708\n亚万\t314709\n1500分钟\t314710\nUSRP\t314711\n黄灯亮\t314712\n坦克世界\t314713\n南京溧水区政府\t314714\n蕴意\t314715\n霸气队\t314716\n第十一讲\t314717\n苯甲酰胺\t314718\nweb工程\t314719\n广告雕刻机\t314720\n陕西省住房资金管理中心\t314721\n白乐天\t314722\n具荷拉\t314723\n涵\t314724\n战法\t314725\n20180319\t314726\n长宁区\t314727\n押出机\t314728\nEstate\t314729\n579\t314730\n西风\t314731\n转基因食物\t314732\n颈\t314733\n国际音乐节\t314734\n折让\t314735\nfact\t314736\n海欣股份\t314737\n网易科技\t314738\n射频同轴连接器\t314739\n督政\t314740\n金双歧\t314741\n义乌农商银行\t314742\n夫妻肺片\t314743\nGiorgio\t314744\n1847\t314745\n日本国\t314746\n全民水浒\t314747\n7V\t314748\n血珀\t314749\n0873\t314750\n中审众环会计师事务所\t314751\n生命\t314752\n兴和\t314753\nopenxml\t314754\n100P\t314755\n欣盛\t314756\ncolored\t314757\nCoaches\t314758\n手淫史\t314759\n新华城\t314760\niana\t314761\n永鼎股份\t314762\n破冰者\t314763\niPadair2\t314764\n一决\t314765\n驱魔警察\t314766\n浅论\t314767\n设卡\t314768\n峡口镇\t314769\n初恋星座网\t314770\n易祥千玺\t314771\n看不见的手\t314772\n_洛\t314773\n黄标车\t314774\nNew*\t314775\nwaitlist\t314776\n109平米\t314777\n中海石油\t314778\n骨化\t314779\n中华人民共和国预算法\t314780\n小央美\t314781\n赵正宝\t314782\n突发性耳聋\t314783\nConstraints\t314784\n六代\t314785\n同居生活\t314786\nherbal\t314787\n消防工\t314788\n赫斯提亚\t314789\npush\t314790\n原本本\t314791\nGundam\t314792\nsetvalue\t314793\n松山\t314794\n妹们\t314795\n火色\t314796\n莫笑\t314797\n蝙蝠侠:黑暗骑士崛起\t314798\n新领驭\t314799\n永城市委\t314800\n路虎卫士\t314801\n嗅出\t314802\n五四广场\t314803\n叫醒\t314804\n湖北影院\t314805\n东口\t314806\n双黄连口服液\t314807\n配筋率\t314808\n融创金成\t314809\n金华中心医院\t314810\n四小\t314811\nfolli\t314812\n疣子\t314813\n奥瑞恩\t314814\n组织架构图\t314815\n土矿\t314816\nturbulence\t314817\n天原\t314818\n露鸟\t314819\n延边广播电视台\t314820\nfoul\t314821\n天津中德应用技术大学\t314822\n蔡长福\t314823\nproquest\t314824\n随园食单\t314825\n龙之崛起吧\t314826\n艾诚\t314827\n李建强\t314828\n兽爷\t314829\n咸阳师范学院\t314830\n谏言\t314831\nams\t31
4832\n涂抹\t314833\nHtel\t314834\n香苗\t314835\nWinXp\t314836\nmax2012\t314837\n西安分公司\t314838\n征服欲\t314839\n比拟\t314840\n犬冢牙\t314841\n后辈\t314842\nreel\t314843\n识破\t314844\n爱与诚\t314845\n基因型\t314846\n大厂街道\t314847\n小电\t314848\n建正\t314849\n风雨兼程\t314850\nclinical\t314851\n展示类\t314852\n福兮祸\t314853\n寻亲\t314854\n鼓唱\t314855\n大常识\t314856\nロボット\t314857\n叶面肥\t314858\n厚此薄彼\t314859\nbababa\t314860\n小恒\t314861\n任仲平\t314862\n苏州城\t314863\n碘值\t314864\n江苏财经职业技术学院\t314865\nmagisk\t314866\n平安产险\t314867\n创想家\t314868\n阳澄湖\t314869\nyuzhou\t314870\n怨苍天变了心\t314871\n苏漫\t314872\npanini\t314873\n小谭\t314874\n手摇器\t314875\n唐厨\t314876\nDefloration\t314877\ngeneral\t314878\nrx7\t314879\n2001届\t314880\n新剧场版\t314881\n39级\t314882\n火甲\t314883\n14.0.1\t314884\n不二子\t314885\n甲泼尼龙\t314886\nyouzan\t314887\n4028\t314888\n601318\t314889\n查算\t314890\nyard\t314891\n林坚\t314892\nln函数\t314893\n琥珀\t314894\n灵秀镇\t314895\nTrust\t314896\n520小说网\t314897\n清浦区\t314898\n初九\t314899\n空知英秋\t314900\n全民\t314901\n委任\t314902\n萤石粉\t314903\n九月九日忆山东兄弟\t314904\n索额图\t314905\n辨析\t314906\n三促\t314907\n经久不息\t314908\ncgx\t314909\n太平桥\t314910\n婚约者\t314911\n艺考\t314912\n集中性\t314913\n包修\t314914\n介值定理\t314915\n爱q生活网\t314916\n全昭美\t314917\n三星A7\t314918\n85ss\t314919\n八局\t314920\n猎奇\t314921\n肺内气压\t314922\n油板\t314923\n四马路\t314924\n第38期\t314925\n顺电\t314926\n纯阳门\t314927\n欣泰\t314928\n荣耀版\t314929\nDilemma\t314930\n盐酸坦洛新缓释片\t314931\n成都科学城\t314932\n中骏柏景湾\t314933\n正是\t314934\nBT种子ed2k\t314935\n動画無修正動画見\t314936\ncn_office\t314937\n松岩\t314938\n京冀\t314939\n咨询服务\t314940\n雷克萨斯CT200h\t314941\nmaterialized\t314942\nvuejs\t314943\n兰州物流公司\t314944\n华宇物流\t314945\n迈腾B8\t314946\nORACEL\t314947\nTerry\t314948\n600252\t314949\n包龙星\t314950\n猎狗\t314951\n折旧率\t314952\n大连婚纱摄影\t314953\n窦娥冤\t314954\n禁\t314955\nxjapan\t314956\n马鸣\t314957\nSubtle\t314958\n惹不起\t314959\n堂山\t314960\nsearcher\t314961\n葛根\t314962\n往之不谏\t314963\n内家拳\t314964\nalignment\t314965\n乙酸\t314966\n化工引擎网\t314967\nBryant\t314968\n惨惨惨\t314969\n中央城市工作会议\t314970\n付费版\t314971\n_梦\t314972\n加税\t314973\n投档\t314974\n王帝\t31497
5\n灯会\t314976\niceberg\t314977\n金莎朗\t314978\n中华人民共和国立法法\t314979\n摒除\t314980\n菊科向日葵属\t314981\n胶东在线教育\t314982\n第七个\t314983\n美食大战老鼠\t314984\n按揭\t314985\n药效\t314986\n射光\t314987\n凯尔莫罕\t314988\n百度网盘下载器\t314989\n袍\t314990\n顾宁\t314991\n九酷轻音乐\t314992\npowerful\t314993\n白豆腐\t314994\n洛南\t314995\n随意性\t314996\n全页\t314997\n玄冰\t314998\n蓝景丽\t314999\n公共性\t315000\n三林路\t315001\n孤男\t315002\n卡组\t315003\n南皮县\t315004\n样板区\t315005\n流程篇\t315006\n省直管\t315007\n长江工程职业技术学院\t315008\n那一幕\t315009\n高曙光\t315010\n走天涯\t315011\n特里芬\t315012\n弐\t315013\n赣县区\t315014\n压型板\t315015\n无定\t315016\n达州正健医院\t315017\n支援者\t315018\n雷克重生完美时代\t315019\n诡计\t315020\nMacPro\t315021\n扇巴掌\t315022\nYolanda\t315023\n山西省商务厅\t315024\n品牌联盟\t315025\nufw\t315026\n月相\t315027\n胸标\t315028\n两亲\t315029\n彭水\t315030\n金兔\t315031\n广东电视台\t315032\n玛尔扎哈\t315033\n日限额\t315034\niso_\t315035\n纳税证明\t315036\n10期\t315037\n武汉地铁11号线\t315038\n胶原蛋白粉\t315039\n开瓶\t315040\n心理追凶\t315041\n百强县市\t315042\n屏长\t315043\n郑州农业路\t315044\nReg\t315045\n1650元\t315046\n血氧饱和度\t315047\nelb\t315048\n偏下\t315049\n加圈\t315050\ntsmc\t315051\nhart\t315052\n原子力\t315053\nonline4\t315054\n四里\t315055\n交变\t315056\n奥黛丽\t315057\n阳光半岛\t315058\n金酷娃玩具\t315059\nFreeS\t315060\n理想论坛\t315061\n便携性\t315062\n青青草国产偷拍在线av\t315063\n便桥\t315064\n爱的供养\t315065\n郝柏村\t315066\n莫斯奇诺\t315067\n太极功夫扇\t315068\n占庭式\t315069\n经部\t315070\n转义符\t315071\n西渝高铁吧\t315072\n108页\t315073\n金华晚报\t315074\n万企帮万村\t315075\n彩翼\t315076\n弹涂\t315077\n一千个\t315078\n87954525\t315079\n滨湖镇\t315080\noparadise\t315081\nMY\t315082\n苹果汁\t315083\n微信公账号\t315084\nCAPCOM\t315085\n暗杀者\t315086\n随动\t315087\n中国科学院地理科学与资源研究所\t315088\n不知所云\t315089\n开发商\t315090\n惠州城市职业学院\t315091\n20151030\t315092\n教本\t315093\n1道\t315094\n直销专业网\t315095\n原链\t315096\n相思病\t315097\n双喷\t315098\n子容器\t315099\napt-cyg\t315100\nwww.51test.net\t315101\nzt410\t315102\nBEGIN\t315103\n四折页\t315104\n奔走\t315105\nBootStrapTable\t315106\n咳嗽变异性哮喘\t315107\n五家\t315108\n西汉姆\t315109\nTamara\t315110\n阿利\t315111\n舒肝\t315112\n极武圣\t315113\n熟手\t315114\n冷凝壁挂炉\t315115\n淡泊名利\t315116\nv4.6.1\t315117\n爬模
\t315118\n宜兴市人民法院\t315119\n简价\t315120\n该病\t315121\n魏桥集团\t315122\n淄\t315123\n性喜剧\t315124\n宁波市教育考试院\t315125\n_步街网\t315126\n64种\t315127\n农村\t315128\n2002.com\t315129\n帝凰之神医弃妃\t315130\n上房\t315131\n中国记协\t315132\n高古(文化期商周战汉\t315133\n我的兄弟\t315134\nbabymetal\t315135\n18p\t315136\n公益日\t315137\n桑葚酱\t315138\n珠链\t315139\n单元板\t315140\n大中小学\t315141\n扭开\t315142\n黑莓z10\t315143\n摩擦式\t315144\n加表\t315145\n古生物\t315146\n中粮万科\t315147\n无限循环\t315148\n404.0\t315149\n迪曲\t315150\n监督管理办法\t315151\n莉迪\t315152\n浓茶\t315153\n债务人\t315154\nmarmont\t315155\n风干牛肉\t315156\n土粒\t315157\n新城子\t315158\n父权\t315159\n彩金\t315160\n益阳新闻网\t315161\n马来漆\t315162\n八卦星网\t315163\n切菜板\t315164\nIDEO\t315165\n新单位\t315166\n秘境对决\t315167\nfazhan\t315168\n包改\t315169\n双拳\t315170\n极路由4增强版\t315171\n3-3\t315172\nVOA英语网\t315173\n口哨\t315174\n谷维素片\t315175\n教育部中国大学\t315176\n作斗争\t315177\n转译\t315178\n六畜\t315179\nMeasures\t315180\n月影版\t315181\n宝器\t315182\n家财\t315183\n堪称\t315184\n石拱桥\t315185\n别克GL8论坛_汽车之家论坛\t315186\n我吃西红柿\t315187\n_土豆\t315188\n犯错\t315189\nSmeb\t315190\n2018年04月15日\t315191\n左右脸\t315192\n傅科摆\t315193\n附带\t315194\n蛙扑网\t315195\n纯棉面料\t315196\n三国志6\t315197\n高档\t315198\n叶麟\t315199\n广深高速\t315200\n穿管\t315201\n看房网\t315202\n名坊\t315203\n河洛镇\t315204\n@_@10\t315205\nmsyql\t315206\nDuang\t315207\n小柴胡\t315208\n陕西省统计局\t315209\nR5\t315210\n欧气\t315211\n干涩\t315212\n100千克\t315213\n2017-12-20\t315214\nDreamHack\t315215\n主销\t315216\n霄云路\t315217\n拷贝构造函数\t315218\n学生\t315219\n女人性\t315220\n40型\t315221\nusborne\t315222\n茶油\t315223\n齐景公\t315224\n特教学校\t315225\n水照\t315226\n江西人大\t315227\n鱼跃血压计\t315228\n计日工\t315229\n贸易战\t315230\n5.33\t315231\n初速度\t315232\n近30年\t315233\nXZP\t315234\n不成方圆\t315235\n415号\t315236\n压缩器\t315237\nBPR\t315238\nHYPE\t315239\n曹杰\t315240\n谢俊\t315241\nParted\t315242\n招投标信息网\t315243\n66家\t315244\n催讨\t315245\n10KG\t315246\n浙江电信\t315247\nerro\t315248\n反转义\t315249\n烤匠\t315250\n大处\t315251\n值机员\t315252\n刘中\t315253\n遗愿\t315254\n跨国企业\t315255\n取压\t315256\n暨阳\t315257\n拼柜\t315258\n乐陵市\t315259\n狙\t315260\n比下\t315261\nBrute\t315262\n黄有龙\t315263\n天健
\t315264\n详析\t315265\n柠格\t315266\n共形\t315267\n童名谦\t315268\n贱奴\t315269\n2018年4月14号\t315270\n装弹\t315271\n入村\t315272\n妄称\t315273\n美图T8\t315274\n中国图书馆分类法\t315275\n三秒\t315276\n紫锥菊\t315277\nWindows2003\t315278\n苏州科技城\t315279\n于超\t315280\n诚意\t315281\n圆通快递单号\t315282\n葛红林\t315283\n中国移动\t315284\n干班\t315285\namorphous\t315286\n精灵使的剑舞\t315287\n色堂\t315288\nTC4\t315289\n抑制器\t315290\n梁\t315291\n天涯人\t315292\n三黄鸡\t315293\nRoles\t315294\n大景\t315295\n鱼道\t315296\n山东外国语职业学院\t315297\n省人力资源和社会保障厅\t315298\n装填\t315299\n魏大勋\t315300\n沧海一声笑\t315301\n犟骨头\t315302\n黄大年\t315303\n括苍山\t315304\nbase64格式\t315305\n红孩\t315306\n新快线\t315307\n急降\t315308\nbios-迅维网\t315309\n中华人民共和国公司法\t315310\n阳光城尚东湾\t315311\n飞来科技\t315312\n游标卡尺\t315313\nstila\t315314\n恒大新城\t315315\n阳离子\t315316\n2.7.2\t315317\n史酷比\t315318\n什样\t315319\n类库\t315320\n乙二醇丁醚\t315321\nmkx\t315322\n评估\t315323\nTram\t315324\ncaliu\t315325\npass卡\t315326\n区委党校\t315327\n1.5m\t315328\n8天后\t315329\n博世汽车\t315330\nBeta4\t315331\n桥台\t315332\n朱门\t315333\nFiltration\t315334\n受体阻滞剂\t315335\n金贤\t315336\n双鹤药业\t315337\n张宇琛\t315338\n注入\t315339\ncinder\t315340\n眠\t315341\n600690\t315342\n繁复\t315343\nfsp\t315344\n薄衍宸\t315345\n雨花茶\t315346\n易名\t315347\n榆林市人力资源和社会保障局\t315348\n悼文\t315349\n工薪\t315350\n海通恒信\t315351\n奥迪s5\t315352\n忍痛\t315353\nCCE\t315354\n膜式\t315355\n武汉市中医院\t315356\n重生文\t315357\n造车\t315358\njik\t315359\n野茶\t315360\n空港新城\t315361\n粉彩\t315362\n氟龙\t315363\n齐天大圣万妖之城\t315364\nC90\t315365\n史论\t315366\nliyu\t315367\n大连分公司\t315368\nctrl+c\t315369\n建筑构件\t315370\narticle\t315371\n矩阵函数\t315372\n任命\t315373\n古墓丽影崛起游戏专区_古墓丽影\t315374\n多联式\t315375\nZucker\t315376\n退资\t315377\n竹花\t315378\n防晒露\t315379\n高铭\t315380\n女工委\t315381\nenergies\t315382\n一个八零\t315383\nav天堂网\t315384\n虎坊路\t315385\n除却巫山不是云\t315386\n19卷\t315387\n尹同跃\t315388\n同城网\t315389\nmx398\t315390\nXX村\t315391\n二分法\t315392\n西藏大厦\t315393\n报瓯\t315394\n北京社保中心\t315395\nALEVEL\t315396\n招利宝\t315397\n20170327\t315398\n凑近\t315399\n砂泵\t315400\nbeing\t315401\n晋中市政府\t315402\n妖妻出轨日记\t315403\n肯瑞托\t315404\ndyeing\t315405\n第12名\t3154
06\n张志忠\t315407\n南京特殊教育师范学院\t315408\nTurtles\t315409\n熟女人\t315410\niam\t315411\n绿地香树花城\t315412\n两少一\t315413\nPcb\t315414\n金宝贝早教中心\t315415\n珍味\t315416\n瘫\t315417\n43章\t315418\n新福特\t315419\n格式转换工具\t315420\nGT720\t315421\nrenault\t315422\n暗夜冥\t315423\nfx8350\t315424\n坚韧不拔\t315425\n厦鼓码头\t315426\n不朽王座\t315427\n建平\t315428\nHosens\t315429\n玉龙路\t315430\n日语学习\t315431\n许冠英\t315432\n骄子\t315433\n水口乡\t315434\n城南家园\t315435\n桶式\t315436\n200环\t315437\n动来\t315438\n诬\t315439\n最后一句话\t315440\n打网球\t315441\n联合汽车电子有限公司\t315442\nOverlay\t315443\njohansen\t315444\n圣斗士星矢冥王神话\t315445\n600子\t315446\n普拉达\t315447\n14p\t315448\n色即是空\t315449\n长生废土\t315450\n烟草公司\t315451\n600176\t315452\n董存瑞\t315453\n海豹\t315454\n使坏\t315455\n深圳人才网0755RC\t315456\nCrow\t315457\n断头路\t315458\nCYBEX\t315459\n武术馆\t315460\ngrows\t315461\n好大喜功\t315462\n单匝\t315463\nWNDR4300\t315464\n会导电\t315465\n慕色\t315466\n福禄寿三星\t315467\n西建\t315468\n磁力架\t315469\n两肖\t315470\n神星\t315471\n爱就淘\t315472\n梦想家\t315473\n钱学宁泽涛\t315474\nDreamtimes\t315475\n青岛人才网\t315476\n淀粉酶\t315477\n接口转换器\t315478\ndisputes\t315479\n迷心窍\t315480\n上海自由贸易试验区\t315481\n知春\t315482\n大疆电池\t315483\n干燥器\t315484\n风噪\t315485\n大商所\t315486\n灵泉寺\t315487\nR星\t315488\n景福宫\t315489\nCKS\t315490\n万科金域国际\t315491\nshinco\t315492\n佟大为\t315493\n谛听\t315494\n香港站\t315495\n中国药科大学\t315496\n建筑工程建筑面积\t315497\n换届\t315498\n常犯\t315499\nArrayAdapter\t315500\n压铆螺母柱\t315501\n博扬\t315502\n卤猫\t315503\n空运费\t315504\n接活\t315505\nlonecloud\t315506\n224亿\t315507\n冬春\t315508\n19860122\t315509\npotplay\t315510\n老皇冠\t315511\n愈发\t315512\n赵寅成\t315513\nicp\t315514\n娃娃领\t315515\n惠优网\t315516\n福建省工商行政管理局\t315517\n辖镇\t315518\n注册时\t315519\n3d47\t315520\n市北中学\t315521\n中山三乡\t315522\n公允价值计量\t315523\n小米科技有限责任公司\t315524\n盈盈\t315525\n童孔\t315526\n直尺\t315527\n生而为人\t315528\n同位素\t315529\n金毛吧_\t315530\n苏喂苏\t315531\nafa\t315532\nunbuntu\t315533\n山阴县\t315534\n20160330\t315535\n超级战队\t315536\nkpw2\t315537\n希伯来\t315538\nFREIGHT\t315539\n办公用品\t315540\n意法\t315541\n社会主义初级阶段理论\t315542\n绵阳市人力资源和社会保障局\t315543\n副本\t315544\n贾明\t315545\nmusculus\t3
15546\n普宁站\t315547\n降血压\t315548\n监测站\t315549\n太和广场\t315550\n五谷道场\t315551\nDeny\t315552\ntianyu\t315553\n一个5岁\t315554\n遵\t315555\ninsead\t315556\n_人工学院\t315557\n图片源\t315558\n新村街道\t315559\n失业登记\t315560\nkazoo\t315561\n豪放派\t315562\n圣大保罗\t315563\n石术\t315564\n画类\t315565\n虫类\t315566\n吃苦耐劳\t315567\n安庆新闻网\t315568\n瑟\t315569\n永泰能源\t315570\n女武\t315571\nCAM&\t315572\nImplicit\t315573\nimplicitly\t315574\n鲁能星城外滩\t315575\n董军\t315576\n杨玲\t315577\n东北大学软件学院\t315578\n漕河镇\t315579\n金山镇\t315580\n211大学\t315581\nFIFA\t315582\n含苞欲放\t315583\n相互作用\t315584\n罪责\t315585\nMarquee\t315586\n亲爱的人\t315587\n友益\t315588\nBmob\t315589\nYYY\t315590\n2016年10月26日\t315591\nsvn\t315592\n号码\t315593\n虹鳟\t315594\n2016.03\t315595\n台盘\t315596\n真情帮帮帮\t315597\n成都树德中学\t315598\n金地湖城\t315599\n遗老\t315600\n耸\t315601\n电流关系\t315602\n桶装水\t315603\n酸痛\t315604\nPromise\t315605\n中剧\t315606\n日照站\t315607\nZQ\t315608\n私房菜\t315609\n哈莉奎茵\t315610\n平负\t315611\nKatherine\t315612\n卡乐星\t315613\n黄金卡\t315614\n2018秒\t315615\n美课\t315616\n拐弯抹角\t315617\n吃粽子\t315618\nWinbond\t315619\n悲文\t315620\nFacts\t315621\n天河体育中心\t315622\n刘明康\t315623\n美饰\t315624\n采菊\t315625\n神威药业\t315626\n刺血疗法\t315627\n长谷川夏树\t315628\n广电电器网\t315629\n波轮洗衣机\t315630\n问你\t315631\n封仙\t315632\n气窗\t315633\n水浆\t315634\n5.1.3\t315635\n荷风\t315636\nXT5\t315637\n601628\t315638\n大象出版社\t315639\nsc2\t315640\n前一星期\t315641\n西南医科大学附属医院\t315642\n蒜叶\t315643\n鄂温克旗\t315644\n葵花牌\t315645\n南坝镇\t315646\n人家\t315647\n新学堂歌\t315648\n4.56\t315649\n呼和浩特机场\t315650\n无花果\t315651\n猫扇\t315652\n一比\t315653\n混沌与秩序2\t315654\n柱子哥\t315655\n氖管\t315656\n弗丁\t315657\n芙拉\t315658\n赵哲\t315659\n_手机网易网\t315660\n炸药库\t315661\n兴隆台\t315662\n中国天气网\t315663\n第167章\t315664\n同心协力\t315665\n7620\t315666\n拼字\t315667\n粉碎机\t315668\nGTA5OL\t315669\n沙灸\t315670\n座套\t315671\n红岭中学\t315672\n大成基金\t315673\n从实\t315674\n城域网\t315675\nkz\t315676\n西安交通大学研究生院\t315677\nley\t315678\n感冒药\t315679\n杉木桩\t315680\n多种多样\t315681\n镀锌板\t315682\n李锋\t315683\ncdev\t315684\n小姐\t315685\n电影明星\t315686\n椰丝\t315687\n3DsMax2013\t315688\n抓娃娃\t315689\n云翳\t315690\n深圳国税\t31569
1\n40万公里\t315692\n500亿\t315693\n7片\t315694\nハメ\t315695\n能源局\t315696\n生错\t315697\n简单网\t315698\n绕柱\t315699\n吉赛尔·邦辰\t315700\nApplePay\t315701\n入路\t315702\n跳舞兰\t315703\n土楼\t315704\n开营\t315705\n男妇\t315706\n建筑设计\t315707\n中国人寿保险(集团)公司\t315708\n李二狗\t315709\nresident\t315710\n八只\t315711\n太极拳网\t315712\nholder\t315713\n汕头人才网\t315714\n中债\t315715\n海南博鳌超级医院\t315716\nair3\t315717\n矢量素材-千图网\t315718\n香芋\t315719\n汗巾\t315720\n0.0.1\t315721\n刘蕊\t315722\n加拿大卡尔加里\t315723\nm590\t315724\n金鹿财行\t315725\n宋文\t315726\n贵州百灵\t315727\n350次\t315728\n浙江省国土资源厅\t315729\n联邦军\t315730\n戒奶\t315731\n排线\t315732\n薄处\t315733\nSamza\t315734\n开骂\t315735\n小猪短租_\t315736\n王者荣耀邪恶漫画\t315737\n博学者\t315738\n葡萄球菌\t315739\n&#61501\t315740\npigff\t315741\n可期\t315742\n勇毅\t315743\n硝\t315744\n津西\t315745\n天头\t315746\n20170130\t315747\n专业人\t315748\n复星保德信\t315749\n视路\t315750\n锅盖\t315751\n关于深化教育体制机制改革的意见\t315752\n碇真嗣\t315753\n李宝\t315754\naaaa\t315755\n大庆实验中学\t315756\n下肚\t315757\n火炬\t315758\n喵喵喵\t315759\n群众\t315760\n0.4kV\t315761\n1990年代\t315762\n复旦大学研究生院\t315763\n受贿案\t315764\nfeiji\t315765\n华歌尔\t315766\n堂本\t315767\nTruly\t315768\n想知道\t315769\n恰似你的温柔\t315770\n锁频\t315771\nTripAdvisor\t315772\n人保集团\t315773\n转工\t315774\n广西大学行健文理学院\t315775\n江苏师范大学科文学院\t315776\n开户费\t315777\n桂平路\t315778\n燃尽\t315779\n组稿\t315780\n无修里番\t315781\n奔弛\t315782\n维京\t315783\n600068\t315784\nA6000\t315785\n修船\t315786\nqq管家\t315787\n保险经纪\t315788\n深圳三甲医院\t315789\n菲拉格慕\t315790\n细细\t315791\nundo\t315792\n婴童网\t315793\n咸鸡蛋\t315794\nDalio\t315795\n3DCG\t315796\n凤爪\t315797\n环烷油\t315798\nSublimeText\t315799\n孙晓东\t315800\n3.2元\t315801\n黑暗之魂2:原罪学者\t315802\n安野由美\t315803\nblueberry\t315804\nCODING\t315805\n旧金\t315806\n脆皮\t315807\n5800K\t315808\n三面娜迦\t315809\n判卷\t315810\n米兰花\t315811\n百度云|网盘\t315812\n岩盐\t315813\nregent\t315814\nillustra\t315815\n107平\t315816\n31首\t315817\n足交\t315818\nLuoyang\t315819\n意向\t315820\n北京建行\t315821\nlogcat\t315822\n常字\t315823\n远航\t315824\n2248\t315825\nMina\t315826\n十三香\t315827\n桡侧\t315828\n断网\t315829\n西安白癜风医院\t315830\n乳山市\t315831\n和合\t315832\n恋爱行为学\t
315833\n真美好\t315834\nvmc\t315835\n埃尔夫\t315836\n姚译\t315837\n汀溪\t315838\ni白金信用卡\t315839\n成本票\t315840\n旧机\t315841\n顾里\t315842\n脖子处\t315843\n宫灯\t315844\n珍玩\t315845\n座\t315846\n侨香村\t315847\n邯郸大学\t315848\npartitioning\t315849\nChobits\t315850\npl2303\t315851\n广西科技厅\t315852\nRugs\t315853\n框列\t315854\n后盾网\t315855\n米罗西\t315856\n卡拉瓦\t315857\n灌装\t315858\n嘉和广场\t315859\n二元一次方程\t315860\n填方\t315861\n2盎司\t315862\n龙八夷\t315863\nprototype\t315864\n鑫诺\t315865\nKoa\t315866\n项目\t315867\n森林公园\t315868\nrunnning\t315869\n拷問\t315870\n滨州市\t315871\n流韵事\t315872\n联合大厦\t315873\n鐵\t315874\n圣安娜\t315875\n牛牛牛\t315876\n蘑菇丁\t315877\n新世纪小学\t315878\n耳鸣耳聋\t315879\nthemeleaf\t315880\n20171006\t315881\nRussian\t315882\n200k\t315883\n骨皮质\t315884\nxfl\t315885\n卐\t315886\n不可思异\t315887\n江东\t315888\n加莱\t315889\n触媒\t315890\n瞬法\t315891\nzuanke8.com\t315892\n鲍德温\t315893\nsequence\t315894\n梁燕\t315895\n李广射虎\t315896\nKoajs\t315897\n纪委办公室\t315898\n亿地\t315899\n刘小静\t315900\n黑方\t315901\nNH\t315902\n艾克森\t315903\n马尔堡\t315904\n绿波廊\t315905\n马克思主义与社会科学方法论\t315906\n中立性\t315907\n7头\t315908\n伟创\t315909\n三点一刻\t315910\n递归算法\t315911\n仓前镇\t315912\n焦迈奇\t315913\n擂鼓\t315914\n随处\t315915\n旧县\t315916\n钨条\t315917\n坐骨神经\t315918\nlain\t315919\n华测导航\t315920\n赫基\t315921\n齐达林徽因\t315922\n漂亮的房子\t315923\n科易\t315924\nadblock\t315925\n亟\t315926\n4环\t315927\nsquare\t315928\npn\t315929\n62度\t315930\n牛皮\t315931\n何杰\t315932\n失守\t315933\n仙桃论坛\t315934\n千百万亿\t315935\nCCP\t315936\n近世\t315937\n王晓涛\t315938\n冠层\t315939\n芒果慕斯\t315940\n佳博热敏打印机\t315941\n宝书\t315942\n姚江新城\t315943\n银行信息网\t315944\n参军\t315945\n13节\t315946\n青檀\t315947\n奋蹄\t315948\n第72章\t315949\n精模\t315950\ng510\t315951\n李梦娇\t315952\n搭配\t315953\n瓜娃子\t315954\n青少版\t315955\n裘克\t315956\n牧涛\t315957\n王炳华\t315958\n姬秋丽\t315959\n5队\t315960\n董文飞\t315961\n步环\t315962\n楠\t315963\n数典\t315964\n_欧模网\t315965\n一大段\t315966\n作业\t315967\n首要\t315968\nsig\t315969\n蒸汽石锅鱼\t315970\n瓶子草\t315971\n生存法则\t315972\n炸鸡排\t315973\nBrilliance\t315974\n天龙八部手游天龙\t315975\nWin2008\t315976\n15栋\t315977\nNE5532\t315978\n魅蓝Max\t315979\n春天里\t315980
\nsliver\t315981\n影像节\t315982\n002294\t315983\n陕西省科技厅\t315984\n三藩市\t315985\n二冬\t315986\n教委\t315987\nlalulalu\t315988\n通俗文学-520听书网\t315989\n苏联战争\t315990\n票箱\t315991\n穹顶之下\t315992\nweipan\t315993\n雄安新区\t315994\n触控\t315995\nMOTOR\t315996\n龙幽\t315997\n条影\t315998\n第三款\t315999\nbreo\t316000\n李让\t316001\nreact子\t316002\n_巴比特\t316003\n新报跑狗\t316004\nPlatforms\t316005\n第30页\t316006\n第二轮\t316007\ncaijing\t316008\n切段\t316009\n科技型\t316010\n赖皮\t316011\n二回\t316012\n来去\t316013\nrar压缩包\t316014\n行和\t316015\n1702\t316016\n考研报录比\t316017\n六朝古都\t316018\n领克4S店\t316019\n456\t316020\n补打\t316021\ngrowingio\t316022\nblank\t316023\n网件路由器\t316024\n宫颈粘液\t316025\n债券收益率\t316026\nNeets\t316027\n长沙市政府\t316028\n抗扰\t316029\n顾南衣\t316030\n香泡树\t316031\nsuk\t316032\n麻生希\t316033\n神雕侠侣手游\t316034\n捯饬\t316035\n第81\t316036\nBT之家\t316037\nLOL鸡里奥\t316038\n知鱼之乐\t316039\n打杂\t316040\n罪恶感\t316041\nmonsoon\t316042\n小加\t316043\n辞演\t316044\njiemian\t316045\n40余\t316046\n大四喜\t316047\n翻锅\t316048\n杆杆\t316049\n源晶振\t316050\n粗品\t316051\n车棚\t316052\nPEER\t316053\n鱼梁洲\t316054\nSIMON\t316055\n魔女之泉2\t316056\n冬季\t316057\n大爆发\t316058\n政府公信力\t316059\n迤逦\t316060\n戏机\t316061\n鼠标点击器\t316062\n稳如\t316063\n彭敏\t316064\nAVR单片机\t316065\n数一数二\t316066\n富华游乐园\t316067\n主灯\t316068\n界域\t316069\n上海市总工会\t316070\n海南区\t316071\n耀江\t316072\n恋之欲室\t316073\n苏先生\t316074\n上海静安门户网站\t316075\nCIO时代网\t316076\n1371\t316077\n淫史\t316078\n委办局\t316079\n电镀厂\t316080\n陈启泰\t316081\n上位机\t316082\n如菊\t316083\n文创产业园\t316084\nINSTALL\t316085\n嘈杂声\t316086\n王朝晖\t316087\n杜拉斯\t316088\n帝侯\t316089\nHuarache\t316090\n是\t316091\n福州师范大学\t316092\n中邮速递\t316093\n退工单\t316094\n九界仙尊\t316095\n猎国\t316096\n金丹\t316097\n自由行攻略\t316098\n461\t316099\n弟魔\t316100\n杜丽娘\t316101\n石业\t316102\n杭州区\t316103\n缩宫素\t316104\n2836\t316105\n郑钢\t316106\n汪东城\t316107\n绿火\t316108\n那三国2吧\t316109\nKDE\t316110\n2004\t316111\n查理兹塞隆\t316112\n大圆满前行\t316113\n油灯\t316114\nC2000\t316115\n手神\t316116\n誓词\t316117\n北冥\t316118\n虎峰山\t316119\n泰克网络实验室\t316120\n托卡\t316121\n涂布平板法\t316122\n人体工学椅\t316123\n兽医学\t316124\n日夕\t316125\n张动图\t31
6126\n呷哺呷哺\t316127\nPLAXIS\t316128\n莪\t316129\n乐园路\t316130\n东南大学计算机科学与工程学院\t316131\n气动薄膜调节阀\t316132\n菁华\t316133\n曝光台_众悦学车网\t316134\n十兄弟\t316135\n婚色\t316136\n我的世界mod大全_我的世界\t316137\n三十头\t316138\nKHS\t316139\n龙马潭区政府\t316140\n钢铁是怎样炼成的\t316141\n52天\t316142\nExcep\t316143\n500方\t316144\n江苏租赁\t316145\n共青\t316146\nTAE\t316147\n愚笨\t316148\n马化\t316149\naddictive\t316150\n1903年\t316151\n赵文\t316152\n众车\t316153\n数模\t316154\n计划成本法\t316155\n【秀\t316156\n2.8.8\t316157\n天津发改委\t316158\nServer2014\t316159\n应用店\t316160\n诡计多端\t316161\nlimichange\t316162\n泰王国\t316163\n氢硫酸\t316164\n美图控\t316165\n鼠疫\t316166\n套好\t316167\n祁智\t316168\n杂散\t316169\n雁西湖\t316170\n星之海洋4\t316171\n华发外滩\t316172\n内蒙古自治区地方税务局\t316173\n黄果树瀑布景区\t316174\nics\t316175\n霍比特人\t316176\n计程\t316177\nyar\t316178\n美致\t316179\n轩朗\t316180\nmaps\t316181\n书元\t316182\n四月五月\t316183\n矿务局\t316184\n医方\t316185\ncamps\t316186\n2249224212@qq.com\t316187\n陈飞\t316188\nthrottlestop\t316189\n金投期货网\t316190\n鸥鹭忘机\t316191\n中国国际展览中心集团公司\t316192\n切法\t316193\n河北软件职业技术学院\t316194\n法检\t316195\nduan\t316196\n闸板阀\t316197\n寨\t316198\n镶牙\t316199\n破乳剂\t316200\n45公分\t316201\nPPT_红动网\t316202\n格式工厂Mac\t316203\nVUEJS\t316204\n1037\t316205\n工长家居网\t316206\n永恒的爱情\t316207\nPPT\t316208\n3000\t316209\nMlxg\t316210\n乙二醇二甲醚\t316211\n梁植\t316212\n虎眼\t316213\ndist-packages\t316214\n头牙\t316215\n基本类\t316216\n郑东\t316217\n男人帮\t316218\ndelhi\t316219\n公爹\t316220\n开展\t316221\n华岳\t316222\n贡嘎雪山\t316223\n荧光分光光度计\t316224\n东台新闻网\t316225\n鸿钧\t316226\n双塔区\t316227\nCTEX\t316228\n千千花\t316229\n同普路\t316230\n迁往\t316231\n娇小\t316232\n扱\t316233\n物质性\t316234\n黑砂\t316235\n建始县\t316236\n而立\t316237\n阿良良木历\t316238\n卫生局长\t316239\n四度空间\t316240\n资料费\t316241\n开开\t316242\n海海\t316243\n军民路\t316244\n儿童诗\t316245\n姜宏\t316246\n长沙汽车南站\t316247\n田壮壮\t316248\n捕头\t316249\n凤山\t316250\n内蒙古自治区人民政府\t316251\n称心\t316252\n供应部\t316253\n胸科\t316254\n进关\t316255\nheroes\t316256\n远航版\t316257\n30万吨级\t316258\nx=10\t316259\n活码\t316260\n深圳华鑫峰有限公司\t316261\n良心\t316262\n台北\t316263\n一痕\t316264\n别坐\t316265\n杭州夜网\t316266\n任丘房产网\t316267\n杭州写字
楼网\t316268\n曹承佑\t316269\nuploaded|nitro\t316270\n佛心\t316271\n中文网\t316272\n污黑\t316273\n灯笼花\t316274\n命令与征服3泰伯利亚战争\t316275\n罗卡\t316276\n铁城\t316277\n朗仕\t316278\nレザ\t316279\n2011年12月31日\t316280\n信管家\t316281\n智行版\t316282\n医疗险\t316283\n戴拿\t316284\n2016年12月\t316285\nPadavan\t316286\n省掉\t316287\n阿尔卡特朗讯\t316288\nMPG格式\t316289\n安家楼\t316290\n正宇控股集团\t316291\n李理\t316292\n薛刚\t316293\n逐笔\t316294\nProvides\t316295\n总利润\t316296\n成都市工商局\t316297\n乾龙\t316298\n大童\t316299\n生\t316300\n莉\t316301\n汇率换算器\t316302\n手巧\t316303\n清蒸鱼\t316304\nQApplication\t316305\n标注\t316306\n甲烷\t316307\n商住小区\t316308\n闪付\t316309\nwww.lijiejie.com/wp-content/uploads/2014/08/163_user\t316310\n王志\t316311\nUI库\t316312\n简并增值税\t316313\n材质\t316314\n包头东\t316315\n工矿灯\t316316\n网易126邮箱\t316317\n次晨\t316318\n莲花灯\t316319\ni5/i7\t316320\n信息页\t316321\n幻想世界\t316322\nmkdirs\t316323\n中央空调\t316324\n检查表\t316325\n相移\t316326\nhar\t316327\n怦然心动\t316328\n1260\t316329\n河滨\t316330\n_期\t316331\n杨陵区\t316332\n放酒\t316333\nplaylist\t316334\n课课\t316335\nbussiness\t316336\n邝\t316337\n10N\t316338\nobook\t316339\n第十四周\t316340\n量值溯源\t316341\ndesktop\t316342\n358\t316343\n夏景树\t316344\n童话剧\t316345\nCET\t316346\n魅族MX\t316347\n20140606\t316348\nphpsso\t316349\n奥咨达\t316350\n冰红茶\t316351\nルド\t316352\n吊瓶\t316353\n曲青山\t316354\n魔境\t316355\nRS232接口\t316356\nytg\t316357\n蒸腾\t316358\n五十九个\t316359\nprefs\t316360\n新兴社区\t316361\n中华绒螯蟹\t316362\n宏批量\t316363\n期初\t316364\n成捷迅\t316365\nCCN\t316366\n刨削\t316367\n德国汉莎航空\t316368\n能者\t316369\n潮图\t316370\n退役\t316371\nfilp\t316372\n真的牛逼\t316373\n精神病患\t316374\nRozdy\t316375\n录取\t316376\n莲花中学\t316377\n文剑\t316378\n寿县\t316379\n300h\t316380\n桃源乡\t316381\nFanFiction\t316382\n图尔克\t316383\n小米NOTE3\t316384\n汇合\t316385\n络腮胡\t316386\n标致4008论坛\t316387\n血液检查\t316388\nensp模拟器\t316389\n7.6.0\t316390\n住宅物业网\t316391\n百万葵园\t316392\n打新\t316393\n北松公路\t316394\n起征点\t316395\n嫌疑人x的献身\t316396\nfmprc\t316397\n小米f码\t316398\n藏毛窦\t316399\n寄单\t316400\n4-6月\t316401\n玩客\t316402\nAGP\t316403\n33个月\t316404\n道行\t316405\n环保工程公司\t316406\n九三\t316407\n真凶\t316408\n东环
二路\t316409\n白云峰\t316410\n河南整形美容医院\t316411\n星辰变\t316412\n涓涓\t316413\n当今大马\t316414\n京投发展\t316415\n騎士\t316416\nJS+CSS\t316417\n花糖\t316418\nau750\t316419\nProcessors\t316420\n搭载\t316421\n辰巳\t316422\nswtich\t316423\n中关村创业大街\t316424\n说谎的爱人\t316425\nzyk\t316426\n1048576\t316427\n文理学院\t316428\n青枝\t316429\n宝拉珍\t316430\njunit5\t316431\n父子俩\t316432\nMAYA2017\t316433\n好可爱\t316434\n商家群\t316435\n依照\t316436\n爱爱医\t316437\n塔桥\t316438\n王牌对王牌第二季\t316439\n盐酸小檗碱\t316440\n眉庄\t316441\n小说课\t316442\ndda\t316443\nLynch\t316444\n好当\t316445\n宁可\t316446\n功德箱\t316447\nTonight\t316448\n阿瓜\t316449\n八品\t316450\n河南省质监局\t316451\n水空\t316452\n丧钟\t316453\nchon\t316454\n告退\t316455\n强阳\t316456\n巫妖\t316457\n傅敏\t316458\n印象湾\t316459\n陕鼓集团\t316460\n山东工业大学\t316461\n赎罪\t316462\n业障\t316463\n华阴县\t316464\n莫扎特\t316465\n哈萨克族\t316466\n玛雅海滩水公园\t316467\n昆华医院\t316468\n中缅\t316469\n密集架\t316470\n伯克利\t316471\n林氏木业\t316472\n滨江新区\t316473\n双色板\t316474\n磴口\t316475\n5.5.3\t316476\n7公斤\t316477\n1989年6月3日\t316478\n高压锅\t316479\n纵身一跃\t316480\n哈弗H6\t316481\n爱乐乐团\t316482\n诵读\t316483\n暴政\t316484\n星光奖\t316485\n银行间外汇市场\t316486\n程亮\t316487\n比瑞吉\t316488\n落霞小说网\t316489\n下垂\t316490\n无礼\t316491\n奥迪a7\t316492\nMathe\t316493\nOrganic\t316494\ncsf\t316495\n天翼电信\t316496\n流水帐\t316497\n火影次世代\t316498\n南昌市第一医院\t316499\nactiviti6\t316500\n画王\t316501\n难忘的歌\t316502\n撞破\t316503\n武汉音乐学院\t316504\n索性\t316505\n有戏\t316506\n金凤凰\t316507\n中国医学装备协会\t316508\n小浦\t316509\nPD-L1\t316510\n循循善诱\t316511\n保护类\t316512\n卡诺亚\t316513\n1.8.4\t316514\n天下人物\t316515\n中南大学湘雅二医院\t316516\nunicode\t316517\n东京爱情故事\t316518\n东罗马帝国\t316519\n勤上股份\t316520\n雕刻\t316521\n自满\t316522\n金书群芳谱\t316523\nv5.1.0\t316524\n膨体\t316525\n佳世达\t316526\n西安市第四医院\t316527\n派通\t316528\n另一个人\t316529\n尤奈斯博\t316530\n叙事\t316531\nGANs\t316532\n念咒\t316533\n杜王丹\t316534\navx\t316535\n三阳镇\t316536\n帕\t316537\ndea\t316538\n中文化\t316539\n公募债券\t316540\n新时代先锋\t316541\n速腾快递\t316542\n中国排球协会\t316543\n小炮\t316544\n周德东\t316545\nedu.cn子域名大全\t316546\n惩戒者\t316547\n长宁区政府\t316548\ntcs\t316549\n永康金报_金华日报\t316550\n桠溪镇\t316551\n上级机关\t31655
2\n研修\t316553\n20017年\t316554\npx、density\t316555\n课形\t316556\n0431\t316557\n舜泰广场\t316558\n第A13\t316559\n被罩\t316560\n温变\t316561\n日狗\t316562\n刮骨\t316563\ntimothy\t316564\n雪落无痕泣苍天\t316565\n差动放大电路\t316566\n真传\t316567\n烟柳\t316568\n王好\t316569\ninur\t316570\n西台\t316571\n鸟器\t316572\n济南市中心医院\t316573\n上海第一人民医院\t316574\nstain\t316575\n小燕子\t316576\nfaults\t316577\n幼树\t316578\n6PM\t316579\n中科院研究所\t316580\n中共四川省委党校\t316581\n飞蚊\t316582\nLUA\t316583\n360行车记录仪\t316584\n上海市国资委\t316585\n吉连\t316586\n可口可乐公司\t316587\n原矿\t316588\n中国粮油信息网\t316589\n断片儿\t316590\n座堂\t316591\n放碟\t316592\n加底\t316593\n虎视眈眈\t316594\nDecorating\t316595\n闵妮\t316596\nmplayerx\t316597\n装懂\t316598\n125平米\t316599\ncau\t316600\nAGN\t316601\nmyadmin\t316602\n奥贝\t316603\n新都镇\t316604\nxforce\t316605\n道教协会\t316606\n玉龙机场\t316607\n飘过\t316608\n错乱\t316609\n糖王\t316610\n肋梁\t316611\npresscad\t316612\nisk\t316613\n厦门远华\t316614\n污水量\t316615\n沧州职业技术学院\t316616\n白燕升\t316617\n云音乐\t316618\n善待\t316619\n刮油\t316620\n'\t316621\n9小时\t316622\n3+X\t316623\n北京市体育局\t316624\n狡\t316625\n七年前\t316626\n140平方米\t316627\n救援行动\t316628\ntju\t316629\nWenzhou\t316630\n莉雅\t316631\n脊灰疫苗\t316632\n取笑\t316633\n面谈\t316634\niReader\t316635\nOverload\t316636\n换轨\t316637\n英雄传说:零之轨迹\t316638\n混音台\t316639\n索绪尔\t316640\n4小时内\t316641\n仁权\t316642\niTop\t316643\n合集版\t316644\nlawson\t316645\n中间价\t316646\nRequestBody\t316647\n地锚\t316648\n古交\t316649\n昆山小学\t316650\n余虹\t316651\n50毫克\t316652\n北京中国联通\t316653\n前胸\t316654\n滴天髓\t316655\n今年3月\t316656\n洋钱罐\t316657\n二叉树前序\t316658\ncholesterol\t316659\n周转期\t316660\n右边\t316661\n上海市上海中学\t316662\n包龙图\t316663\n汇易通\t316664\n加封\t316665\nvisi\t316666\n公务员考试信息网\t316667\n中国电子信息产业集团有限公司\t316668\n镇江市第四人民医院\t316669\n烂柯山\t316670\n杨村镇\t316671\n敬一丹\t316672\n也谈谈\t316673\nmate9pro\t316674\nOCC\t316675\n湖州房产网\t316676\n一维\t316677\nSANWA\t316678\n1.1.7\t316679\nHikari\t316680\n邓力群\t316681\n江湖郎中\t316682\n真石\t316683\nnth-child\t316684\n修整\t316685\n城建\t316686\nirish\t316687\nGmail\t316688\nbidding\t316689\n新南路\t316690\n血压计\t316691\n22万亿\t316692\n中国经济导报网\t3166
93\n洁净区\t316694\n轻视\t316695\n中土世界暗影魔多\t316696\nQuiver\t316697\n39页\t316698\n娄桥\t316699\n遗失的记忆\t316700\nuzer\t316701\n基本户\t316702\n快速排序法\t316703\n灵山县人民政府\t316704\n乌海日报社\t316705\n1.0.0.2\t316706\n玉林广播网\t316707\ndebut\t316708\n小红本\t316709\n王权\t316710\n龙枪\t316711\n在不在\t316712\nfluke\t316713\n104页\t316714\nWebbrowser\t316715\n501路\t316716\n浙江省能源集团有限公司\t316717\n一帧一帧\t316718\n陈小军\t316719\nPUMA\t316720\n三江街道\t316721\nMountain\t316722\n木缘\t316723\n刘莹莹\t316724\n收发\t316725\n涂覆机\t316726\n一线路\t316727\n小城堡\t316728\n香丹清\t316729\nX299\t316730\n华丽志\t316731\n教辅书\t316732\n生命科学院\t316733\n德云色\t316734\nDentist\t316735\n长绝\t316736\nspice\t316737\n布丁小贷\t316738\nYanni\t316739\n6.92\t316740\n快照\t316741\n2k18\t316742\n三保\t316743\n六号\t316744\n胡正荣\t316745\n泛海\t316746\nCPLD\t316747\n权路迷局\t316748\n斋\t316749\nReceiving\t316750\n条纹裤\t316751\n聯誼\t316752\nmental\t316753\n副政委\t316754\n吹泡泡水\t316755\n激爽\t316756\nbootstrapTable\t316757\n恒福\t316758\n法半夏\t316759\nBM\t316760\n弯路\t316761\n兰苑\t316762\nversace\t316763\n空机\t316764\n李耀阳\t316765\n承安\t316766\n扁丝\t316767\n探校\t316768\n乐高气功传奇\t316769\n东海龙王\t316770\n気高\t316771\n旁听生\t316772\n土木堡\t316773\n美满\t316774\nMOST\t316775\n性病科\t316776\n昆德拉\t316777\n命途\t316778\n不吝\t316779\n召唤者\t316780\n手亲\t316781\nVivier\t316782\n孕晚期\t316783\ndojo\t316784\n宝骏310\t316785\n大西高铁吧\t316786\n上古十大正神\t316787\n修完\t316788\nideal\t316789\n云朵儿快乐舞步健身操\t316790\n丹东\t316791\nnokia7\t316792\n雄飞\t316793\n绿腰\t316794\n迪士尼城堡\t316795\n敦化新闻网\t316796\n大洲\t316797\n目标板\t316798\n重庆幼儿师范高等专科学校\t316799\n赛迪\t316800\n听不清\t316801\n偏转\t316802\ntragedy\t316803\n局中\t316804\n虚实\t316805\n72G疯狂猜成语\t316806\n茶盒\t316807\n卡耐基\t316808\n每周四\t316809\n强手\t316810\n27.5%\t316811\n一星\t316812\n16平方米\t316813\n粉尘螨\t316814\nipsecvpn\t316815\n云箭\t316816\n%ld\t316817\n桑德环境\t316818\n非攻\t316819\n汇源\t316820\n六州歌头\t316821\n茎干\t316822\ndj版\t316823\n乐堂\t316824\n砍骨\t316825\n香港港\t316826\n国家知识产权局专利复审委员会\t316827\n酒廊\t316828\n隔套\t316829\n会城镇\t316830\n扁皮\t316831\n五月下旬\t316832\nhandshake\t316833\n行成\t316834\n安定片\t316835\n七分裤\t316836\n早博\t316837\nY
uzuki\t316838\n影响因子\t316839\n高斯滤波\t316840\n天鹰\t316841\n优课一课一名师\t316842\n沈安然\t316843\nBouncy\t316844\n3.72\t316845\n绿色金融债\t316846\n汾水\t316847\n江泰路\t316848\nHBF\t316849\nEF-S\t316850\n余安安\t316851\n铜城\t316852\n杭州博物馆\t316853\n王兆星\t316854\n0.33\t316855\nQQ音乐播放器\t316856\n水纹\t316857\n虚幻引擎\t316858\n点装\t316859\n富平县人民政府\t316860\n谷歌安装器\t316861\n浙江省镇海中学\t316862\nSTAN\t316863\n5645\t316864\n水牌\t316865\n聚聚\t316866\n_快马加盟网\t316867\nTarget\t316868\nbls\t316869\nCTRL\t316870\nHelpdesk\t316871\n行知杯\t316872\nwww.stanford.edu/class/archive/cs/cs106a/cs106a.1174/lectures\t316873\n4810\t316874\nbenzene\t316875\n轻量化\t316876\nEpoch\t316877\n贝蒂\t316878\n黑叉\t316879\n芋儿鸡\t316880\n1754\t316881\n长春市工商行政管理局\t316882\n布\t316883\n试编\t316884\n本月20日\t316885\nrebecca\t316886\n网式\t316887\n山大二院\t316888\n山姬\t316889\n军转干考试网\t316890\n金税盘\t316891\n线笔\t316892\n好吃懒做\t316893\n咪鲜胺\t316894\n罗莎·卡拉乔洛\t316895\ncdrx6\t316896\n手机绝地求生\t316897\nvuex\t316898\npicks\t316899\n29页\t316900\n2480\t316901\n汉碑\t316902\n刘星图\t316903\n集锦\t316904\n自然分娩\t316905\nFIESTAR\t316906\n创维平板电视\t316907\n三都\t316908\n春节\t316909\n科发源\t316910\nSiO2\t316911\n乐途旅游网\t316912\n金蝶eas\t316913\n曲芷含\t316914\nCompression\t316915\n咸阳百姓网\t316916\nwinktv\t316917\n郑州地铁1号线\t316918\nqiwen\t316919\nexcel表\t316920\nyeon\t316921\n巨虫\t316922\n反流\t316923\n四川长虹\t316924\n习以为常\t316925\n香辣酱\t316926\n水窖\t316927\n高明人才网\t316928\n张铎\t316929\n吾皇\t316930\n有机油\t316931\n胚珠\t316932\n第四城\t316933\nMarx\t316934\n心电信号\t316935\n乱流\t316936\n先行服\t316937\n三辊研磨机\t316938\nWindows8_Wi\t316939\nBao\t316940\n云南白药粉\t316941\n马侃\t316942\n月事\t316943\n北京商报网\t316944\n辽东湾新区\t316945\n金融机构反洗钱监督管理办法\t316946\n20150929\t316947\n北京6号线\t316948\n上海华瑞银行\t316949\n王键\t316950\ncounta\t316951\n参与\t316952\n阿楚\t316953\n赵路\t316954\n龙飞\t316955\n诺基亚925\t316956\n360ssp\t316957\nmt7621\t316958\n秦州区\t316959\n向南\t316960\n威海人才网\t316961\n两颗心\t316962\nCoAP\t316963\n优酷土豆网\t316964\nhyun\t316965\n欢喜\t316966\ndisable-web-security\t316967\ndomoto1987\t316968\nfullcalendar\t316969\n凉棚\t316970\n光枪\t316971\n六个小时\t316972\n2.2.5\t3
16973\n天瓶座\t316974\n税收学\t316975\n手动机械表\t316976\n血影\t316977\n准度\t316978\n康宝消毒柜\t316979\n耕地机\t316980\n希利尔\t316981\n养生壶\t316982\n奴隶市场\t316983\nbuzz\t316984\ns410\t316985\n木纹\t316986\n金惠秀\t316987\n三明网\t316988\n4.3.5\t316989\n集中力\t316990\n平凸\t316991\nCM3D2\t316992\n有道翻译官\t316993\n油糕\t316994\n生煎\t316995\n天级\t316996\n南方花园\t316997\n还\t316998\npfd\t316999\nrk3288\t317000\n北京组工网\t317001\n站长网\t317002\n进口商品\t317003\nEFFECTS\t317004\n桥位\t317005\n辘轳\t317006\nburner\t317007\n椰土\t317008\nOkita\t317009\n阳台房\t317010\n金亨泰\t317011\n$1\t317012\n湘潭在线\t317013\n动态心电图\t317014\n孟星魂\t317015\n玉女\t317016\n认捐\t317017\n克拉科夫\t317018\n武汉农村商业银行\t317019\nlibaio.so.1\t317020\n气化炉\t317021\n合着\t317022\n初具\t317023\n龙光世纪大厦\t317024\n咸丰重宝\t317025\n兰考县人民政府\t317026\n2.5元\t317027\n徕卡M\t317028\n古董\t317029\n明师\t317030\ntake\t317031\nPowell\t317032\n3D打印笔\t317033\n三治\t317034\nTheyAreBillions\t317035\n团簇\t317036\n东风猛士\t317037\n奇米播放器\t317038\n布里顿\t317039\n看字\t317040\n火爆鸟\t317041\n邪祟\t317042\n素万那普\t317043\n酷家乐\t317044\n重庆市政府网_重庆市人民政府\t317045\n富士山下\t317046\n划\t317047\nR9m\t317048\nrc1\t317049\n西航\t317050\n2018年2月20日\t317051\n承德避暑山庄\t317052\n185T\t317053\n迟\t317054\n关版\t317055\n点检表\t317056\n佳美娜\t317057\n酌鹿\t317058\nlv\t317059\n金山卫站\t317060\n一只脚\t317061\n今年起\t317062\n洪剑涛\t317063\n老七\t317064\n早道\t317065\n郭亮村\t317066\n增值税进项税额\t317067\n滨海新区\t317068\n回京\t317069\nLBP\t317070\noperand\t317071\n北京市教委\t317072\n太空历险记\t317073\nWebUI\t317074\n着色\t317075\n绿地香港\t317076\n交易品\t317077\n理学\t317078\n独腿\t317079\n蛋鸡笼\t317080\n黄永刚\t317081\n中国共产党中央政治局常务委员会\t317082\n大航海4威力加强版\t317083\n李元胜\t317084\nBlueHost\t317085\n应用电子技术\t317086\n金裕贞\t317087\n52PK\t317088\n小差\t317089\n代理池\t317090\n首末\t317091\n日产途达\t317092\n优版\t317093\npack\t317094\n封切机\t317095\nSoup\t317096\n合肥南站\t317097\n勾肩搭背\t317098\n上海世博会博物馆\t317099\n法士特\t317100\n18.3\t317101\n横档\t317102\n留号\t317103\n非成套\t317104\n星宿\t317105\n120次\t317106\n魅族MX2\t317107\n裸地\t317108\n光明食品(集团)有限公司\t317109\n梦幻西游盘丝\t317110\n白月光\t317111\n35二代\t317112\n微刊\t317113\n南传佛教\t317114\n正数\t317115\n台式电脑\t317116\n金诚集团\t3
17117\n姚峰\t317118\n拒载\t317119\n一家\t317120\n招商银行北京分行\t317121\n叶芷青\t317122\n石英砂过滤器\t317123\n1.5升\t317124\n小汐\t317125\n同盟国\t317126\n醴泉\t317127\n灭国\t317128\n结节病\t317129\n遗弃\t317130\n周子琰\t317131\n靓\t317132\n土火\t317133\n逆止阀\t317134\n新华财经\t317135\n中国华融资产管理股份有限公司\t317136\n标准券\t317137\nTennis\t317138\n立昌\t317139\n金银河\t317140\n4399i小游戏\t317141\n排骨网\t317142\nMirage\t317143\n2014年01月02日\t317144\nINDEX\t317145\n第73号\t317146\n老井\t317147\n偷人\t317148\n一鼓\t317149\n松下电器\t317150\n4MM\t317151\n哼哧\t317152\n石灵\t317153\n护肤霜\t317154\n加场\t317155\n中国生物制药\t317156\npoplar\t317157\n20171223\t317158\n500毫升\t317159\n吴宇\t317160\nMILD\t317161\niPhone6sp\t317162\n鞍钢股份\t317163\n英岁月\t317164\nY55\t317165\n梨花\t317166\n薇恩\t317167\n呈正\t317168\nVW\t317169\nNewport\t317170\n面心立方\t317171\nsymmetric\t317172\n乐平市\t317173\nWossoneri\t317174\n南昌机场\t317175\n鲜榨果汁\t317176\n1万多元\t317177\nTOPYS\t317178\n六七十\t317179\n健康管理师考试\t317180\n北京中医药大学东方医院\t317181\n虚方\t317182\n觚\t317183\nea3w\t317184\n玉雕\t317185\n波折\t317186\nputh\t317187\n中英文在线翻译\t317188\n5200小说网\t317189\n收押\t317190\n许知远\t317191\nbitmap\t317192\n自知之明\t317193\n菅野亚梨沙\t317194\nBI\t317195\n杀菌剂\t317196\n佛得角\t317197\n成宫琉璃\t317198\n语音库\t317199\n天使与龙的轮舞\t317200\n4000k\t317201\ninnerHTML\t317202\n周蕙\t317203\n从零开始\t317204\n报废期\t317205\n河北工业大学城市学院\t317206\n不败传说\t317207\n巨丰\t317208\n口腔音\t317209\niso8859-1\t317210\n文泉书局\t317211\n舍名\t317212\n致密化\t317213\n明盘\t317214\n陶山\t317215\n海宁火车站\t317216\nPFD\t317217\n洛龙区\t317218\n见血\t317219\n史济锡\t317220\n郑州职业技术学院\t317221\n95580\t317222\n054A\t317223\n兔女郎\t317224\n李宁韦德之道\t317225\n大坤坤\t317226\n婚前性行为\t317227\n枫网\t317228\n宠物训练\t317229\nnsr\t317230\n克鲁格\t317231\n压力大\t317232\n25磅\t317233\nDVD5\t317234\n150ML\t317235\n圣杯战争\t317236\n武兽\t317237\n先进集体\t317238\n阳刚\t317239\nUnicode\t317240\n双吸泵\t317241\n20架\t317242\n三大名\t317243\n安贞医院\t317244\n阿尔吉侬\t317245\n大同经济技术开发区\t317246\n西南风\t317247\n兄弟共妻\t317248\n大越\t317249\nZIGBEE\t317250\n仙剑奇侠传7\t317251\n聪敏\t317252\n日照房产网\t317253\nsociety\t317254\n刘耀文\t317255\nOFweek机器人网\t317256\n东周列国\t317257\n源质\t317258\n划痕险
\t317259\n寸衫\t317260\n碳酸钙D3片\t317261\n14首\t317262\n小吃\t317263\n远古套\t317264\nWin10预览版\t317265\n原创者\t317266\n北极星太阳能光伏网\t317267\n糖果苏打传奇\t317268\n常州机场\t317269\n第51号\t317270\nGPR\t317271\n再生料\t317272\n杨文广\t317273\nsockopt\t317274\n矢量图\t317275\n紫雾\t317276\n恩施自治州\t317277\n潍莱\t317278\n杨敬华\t317279\n肠息肉\t317280\n代送\t317281\n废铅酸蓄电池\t317282\naecc2015\t317283\nthankyou\t317284\n2017年3月22日\t317285\n安徽工程大学机电学院\t317286\n滨湖\t317287\nCodeblocks\t317288\n更糟糕\t317289\n投资理财\t317290\n吸入式\t317291\ncapitalize\t317292\n481\t317293\n厚待\t317294\n今天上午\t317295\n胡耀邦\t317296\n氮化\t317297\n我,堂吉诃德\t317298\n金斯顿\t317299\n相奸\t317300\n望天树\t317301\n亚汇网\t317302\n庞茂琨\t317303\n七分之四\t317304\nAPPEND\t317305\n1100千伏特\t317306\n绿原酸\t317307\n白山市\t317308\n雷蛇曼巴眼镜蛇\t317309\n守宫砂\t317310\ngoyard\t317311\n手骨\t317312\nBrock\t317313\nBD1024高清】迅雷BT\t317314\n腕上\t317315\n易利\t317316\n李三\t317317\n独木舟\t317318\n西安交通工程学院\t317319\n吹呀吹\t317320\n磁力计\t317321\n手抖\t317322\n刘石\t317323\n夺人\t317324\n谢家集区人民政府\t317325\nSock\t317326\n乐东\t317327\n环氧煤沥青\t317328\n7260ac\t317329\n晶圆厂\t317330\n褚时健\t317331\n系统仿真学报\t317332\n套带\t317333\n北冥小妖\t317334\n光波网\t317335\nmamo\t317336\n荣宁\t317337\n800部\t317338\n30多名\t317339\n圆钢管\t317340\n单反机\t317341\n229\t317342\n记录表\t317343\n黄子佼\t317344\n胃息肉\t317345\nsubstitute\t317346\n吗_120健康网\t317347\nIntelXeon\t317348\n种羊\t317349\n梦中的婚礼\t317350\n千真万确\t317351\n肉鸡\t317352\njuedi\t317353\nPhysicians\t317354\n第146期\t317355\n无锡市国土资源局\t317356\n风叶林\t317357\n中耀华建\t317358\n由头\t317359\n移民法\t317360\n一尊\t317361\n巴霍巴利王\t317362\n陈寿亭\t317363\n白嘉轩\t317364\n2steam\t317365\n第169章\t317366\n妈妈妈\t317367\n凯迪拉克凯雷德\t317368\nMoist\t317369\n张家界新闻网\t317370\nmicros\t317371\n何者\t317372\n北京科技职业学院\t317373\n高分子链\t317374\n宋孟君\t317375\n开户地\t317376\n小辉\t317377\n脱变\t317378\n中央政治局民主生活会\t317379\n柏乡\t317380\ngant\t317381\n雒容\t317382\n谈固\t317383\n堆栈\t317384\n02集\t317385\n小鸟依人\t317386\n组装式\t317387\n典版\t317388\n李文\t317389\ndlg\t317390\n蜜桃臀\t317391\n2018-01-18\t317392\nPentax\t317393\n微核\t317394\n字符组\t317395\n莆田房产网\t317396\n三千多\t317397\n污神\t317398\n调节仪\t317399\
n969\t317400\n旅店业\t317401\n化工桶\t317402\n团中央\t317403\n人文景观\t317404\n九州出版社\t317405\n维a酸\t317406\n南京50校\t317407\n水银\t317408\n阎王\t317409\nLegislation\t317410\n10则\t317411\n无锡市中医院\t317412\n两帮\t317413\nrevivo\t317414\n美年达\t317415\n归结\t317416\n换姓\t317417\n玄铁\t317418\n石头汤\t317419\n鬼眼\t317420\n腾讯研究院\t317421\n三杯鸡\t317422\nHytera\t317423\n凤凰语文论坛\t317424\n线刷刷机\t317425\n外集\t317426\nmasson\t317427\n百汇园\t317428\n博得之门吧\t317429\nstp\t317430\n无弹\t317431\n红帮\t317432\n北大高岩\t317433\n宣传队\t317434\n农安县\t317435\notto\t317436\n没用完\t317437\ni5\t317438\n吴中\t317439\n刘慈欣\t317440\n林西\t317441\n后学\t317442\n恋夜影院_恋夜秀场\t317443\n伴有\t317444\n零关税\t317445\n艾曼纽\t317446\n五距\t317447\n郑融\t317448\n火影OL\t317449\ntoggle\t317450\n雉\t317451\n步步生莲\t317452\n我告诉\t317453\nexcel2\t317454\n浙江大学电气工程学院\t317455\nsorl\t317456\n上汤不热\t317457\n银行联行号查询\t317458\n军军\t317459\ntaylors\t317460\n大崎\t317461\n入海口\t317462\n唇枪\t317463\n金水湾\t317464\n中储\t317465\n麻辣龙虾\t317466\nFBI\t317467\n鸦片\t317468\n验厂\t317469\nFung\t317470\npentestbox\t317471\n宝燕乐园\t317472\n初案\t317473\n衡水十三中\t317474\nHIV病毒\t317475\ncrx\t317476\n沉沙\t317477\n蓄水\t317478\n主语从句\t317479\n国际生\t317480\n坚持\t317481\n法定\t317482\n蜂腰\t317483\nhwclock\t317484\nmongovue\t317485\n大成\t317486\nIDC圈\t317487\ngrizzly\t317488\n条规\t317489\ndecision\t317490\n金山街道\t317491\n体流\t317492\n7.23\t317493\n克莱普顿\t317494\nTUF\t317495\n广元市\t317496\nEMAIL\t317497\nzhuzhipeng\t317498\n日会\t317499\nArabidopsis\t317500\n齐云楼\t317501\nWEB管理\t317502\n蒸发箱\t317503\nTaew\t317504\n四皇\t317505\n阿氏\t317506\n十周年\t317507\n小西天\t317508\ndecide\t317509\n片假\t317510\n杂句\t317511\nCHEN\t317512\n邢台\t317513\n散作\t317514\n同道\t317515\n易瘦\t317516\n伊利畅\t317517\n子水\t317518\nExcel2003/2007/2010\t317519\nYip\t317520\n挪超\t317521\n云存储\t317522\n天奇\t317523\n蕾特恩\t317524\n三轮\t317525\n今题网\t317526\n中华人民共和国往来港澳通行证\t317527\n庐阳\t317528\n冰窟\t317529\n端庄\t317530\n甲控\t317531\n赌棍\t317532\n东龙\t317533\n垦区\t317534\nwriteup\t317535\n小心\t317536\n气盛\t317537\n明静\t317538\nEmit\t317539\neve\t317540\ntruffle\t317541\n魔晶石\t317542\n碘盐\t317543\n三三\t317544\n滴眼\t317545
\n钕磁铁\t317546\n纪检讨书\t317547\n中国人民保险集团股份有限公司\t317548\n电跑\t317549\nelement-ui\t317550\nIllumina\t317551\n境域\t317552\n热衷于\t317553\n刀剑神域:序列之争\t317554\n杭州南站\t317555\n高职校\t317556\n石破天\t317557\n管装\t317558\nMCB\t317559\n氐\t317560\n结婚戒指\t317561\n团市委\t317562\n万科城市之光\t317563\n增韧\t317564\n活动度\t317565\n弘治\t317566\n大半码\t317567\n中国人寿财产保险股份有限公司\t317568\n隔板\t317569\nPDF压缩器\t317570\n千江\t317571\n金刚石钻头\t317572\n英语学\t317573\neviews\t317574\n7条\t317575\n_巴士洛奇英雄传\t317576\n刘禹\t317577\nARA\t317578\n4.0\t317579\n异草\t317580\n来福\t317581\n业界\t317582\n齐鲁制药\t317583\n吉林省政府\t317584\n远影\t317585\n假面骑士drive\t317586\n钟爱\t317587\n文件后缀名\t317588\ntrainers\t317589\n绷紧\t317590\n女宦\t317591\n停权\t317592\n摩崖\t317593\n蚕吐丝\t317594\n高新技术企业认定工作网\t317595\n大阪大学\t317596\n52%\t317597\n8X\t317598\n新疆克州\t317599\n养精蓄锐\t317600\n月光下的凤尾竹\t317601\n板纸\t317602\n希特\t317603\n海南站\t317604\n绝代\t317605\nsplit\t317606\nNBALIVE\t317607\ndaft\t317608\n乌恩\t317609\n28000\t317610\nCCleaner\t317611\n启明信息技术股份有限公司\t317612\n塑料盒\t317613\n抢跑\t317614\n跃莱\t317615\ntoml\t317616\n喊话\t317617\n朱红\t317618\n领网\t317619\ncss框架\t317620\n黑窗\t317621\n老家\t317622\n格兰富\t317623\n济南物流公司\t317624\n2等\t317625\n世上\t317626\n溴丁烷\t317627\n男们\t317628\n掘金队\t317629\n索尼爱立信\t317630\n零度战姬\t317631\n肠胃病\t317632\nMatters\t317633\n浙江省发改委\t317634\n059\t317635\n杨国强\t317636\nHBV\t317637\nexploring\t317638\n百格\t317639\n孜\t317640\n吉他扫弦\t317641\n文化公园\t317642\n新汉系吧\t317643\nNBA98篮球中文\t317644\n603160\t317645\n_河池\t317646\n群点\t317647\n胆气\t317648\n镀膜\t317649\n岫岩玉\t317650\n急于求成\t317651\nvcpu\t317652\n1.0.2.9\t317653\n相携\t317654\n山海经之白泽传说\t317655\n机动三轮车\t317656\n那一步\t317657\n美国明尼苏达大学\t317658\n茶膏\t317659\n380万\t317660\n本気\t317661\n王者荣耀攻速\t317662\n绝地求生手游\t317663\nhsbc\t317664\nredhat6\t317665\n上帝之国\t317666\n纲目\t317667\neslipse\t317668\nΩ\t317669\n岐伯\t317670\n延宕\t317671\n春眠\t317672\n美容科\t317673\n王蒙徽\t317674\n宠臣\t317675\n白芒\t317676\n2017年六一儿童节\t317677\n装饰物\t317678\n代代\t317679\n藏地\t317680\n云行\t317681\n逐日\t317682\n免开\t317683\n转学分\t317684\n誓\t317685\n化块\t317686\n琼崖\t317687\n龙先生\t317688\n显像管\t317689
\n2.6d\t317690\n又见平遥\t317691\n蟾宫\t317692\n5423\t317693\n配器\t317694\neChina\t317695\n几所\t317696\nガラス\t317697\n天戟\t317698\n嘉兴路街道\t317699\nlogging\t317700\nyiwan\t317701\n60068\t317702\nbosh\t317703\n圣水\t317704\n十三太保\t317705\nbaohanqing\t317706\n南京外国语学校仙林分校\t317707\n发泡棉\t317708\njws\t317709\n雨生百谷\t317710\n舰团\t317711\n研判会\t317712\n长江三峡\t317713\n中国核工业集团\t317714\nBIM论坛\t317715\n三精\t317716\n神功\t317717\n干洗店\t317718\n大阿哥\t317719\n加权平均资本成本\t317720\n阳顶天\t317721\n幻想西游记\t317722\nsuning\t317723\nleesf\t317724\n交易机器人\t317725\n容器\t317726\n庙行镇\t317727\n裸身\t317728\n划破\t317729\ndos吧\t317730\n太适合\t317731\n安乡县政府网\t317732\n戴芬\t317733\n挤出\t317734\n手机资\t317735\n巴哈马群岛\t317736\n小琪\t317737\n22届\t317738\n公平竞争\t317739\nCine\t317740\n雨声\t317741\n叠氮钠\t317742\n架子床\t317743\n线装书\t317744\n企业管理论文\t317745\n操作器\t317746\n看成\t317747\n南充\t317748\n桀骜不驯\t317749\nsql触发器\t317750\npaprika\t317751\n墨西哥\t317752\n南理\t317753\n国务院安全生产委员会\t317754\nぇ\t317755\n超级校园网\t317756\n泡椒\t317757\n偷家\t317758\n线中\t317759\nAveda\t317760\n采摘\t317761\n海底总动员2:多莉去哪儿\t317762\n捕影\t317763\n夏朝\t317764\n郭文贵\t317765\n康健园\t317766\n220个\t317767\n清华大学自动化系\t317768\n土木者\t317769\nkala\t317770\n操作\t317771\n质构仪\t317772\n异世\t317773\n中海油\t317774\n莒州\t317775\n谋篇\t317776\n清夏\t317777\n1187\t317778\n5000点\t317779\n戛纳国际创意节\t317780\nlibrtmp\t317781\na类\t317782\n浙江省交通规划设计研究院\t317783\n三知\t317784\nqos\t317785\n复旦大学图书馆\t317786\n手握式\t317787\n蝙蝠侠大战\t317788\ncytoscape\t317789\n杨肸子\t317790\n禁止吸烟\t317791\n噪点\t317792\n2016年11月份\t317793\n2017.8.18\t317794\n德玛莎\t317795\nmeters\t317796\n脱水机\t317797\n婚姻状况\t317798\n连续函数\t317799\nopenldap\t317800\n终生制\t317801\n大苏打\t317802\n温控\t317803\n双侧\t317804\n小调网\t317805\n互力\t317806\n爱奇艺卡\t317807\n中国太平保险\t317808\n蓝石莲\t317809\n。1\t317810\n平拍\t317811\n主责\t317812\n宜和\t317813\nidn\t317814\n吐司\t317815\nDN15\t317816\n达力\t317817\n3.40\t317818\n橙风\t317819\n真核细胞\t317820\n误差\t317821\n电极\t317822\n钟点\t317823\n解馋\t317824\n凶残\t317825\n安靖镇\t317826\n风飞\t317827\n吴小平\t317828\nLinuxHub\t317829\n成理\t317830\neasily\t317831\n软件类\t317832\ndus\t317833\n宝花\t31
7834\n短棍\t317835\nfraid\t317836\n换炮\t317837\n园林石\t317838\n雪凡\t317839\n超级牛散\t317840\n建民\t317841\n雪莉\t317842\n黄曼\t317843\n冯玮\t317844\n末画\t317845\n_六\t317846\n华晨易中天\t317847\n单枞茶\t317848\n恋空\t317849\nC21\t317850\n7角\t317851\n千叶豆腐\t317852\n第22页\t317853\n装配线\t317854\n低龄\t317855\n虾饼\t317856\ncontributions\t317857\ntrendiano\t317858\n循环钻机\t317859\nOpenShift\t317860\nJoan\t317861\n往届生\t317862\n艺库\t317863\n8.9折\t317864\nChromium\t317865\nCoreLocation\t317866\n全球科技公司\t317867\n贵族们\t317868\n80004005\t317869\n小虾米闯江湖吧\t317870\n扮靓\t317871\n2018-04-08\t317872\n综合_一道茶网\t317873\nTNM\t317874\n诗机\t317875\n叛逆的勒鲁什吧\t317876\n谭八爷\t317877\n水城路\t317878\n刘一祯\t317879\n仿饿\t317880\n王锋\t317881\n大军师司马懿之虎啸龙吟\t317882\n唐字\t317883\n中国博物馆\t317884\n盘缠\t317885\nHDPE\t317886\n刘荣\t317887\nzhilu\t317888\n酷卡\t317889\n2pc\t317890\n全体\t317891\n横山村\t317892\n机电设备\t317893\n阿丹\t317894\n药剂\t317895\n创设\t317896\nHaha\t317897\ndnastar\t317898\n麻仓\t317899\n尖嘴钳\t317900\n刀把子\t317901\nradio值\t317902\n赛女人时尚网\t317903\n液氩\t317904\nusb启动项\t317905\n十九大全面从严治党\t317906\n金立s10\t317907\n厦门大学研究生院\t317908\nRetina\t317909\n斩龙哈姆雷特\t317910\nflyknit\t317911\n11月10日\t317912\nNicholas\t317913\n武王伐纣\t317914\n露脸\t317915\nacquisition\t317916\n孟海\t317917\n2552\t317918\n粤港澳\t317919\nAnno\t317920\n草间弥生\t317921\n手提式干粉灭火器\t317922\ntuff\t317923\n乐哈健康\t317924\njz.docin.com豆丁\t317925\n莲花南路\t317926\ncomfast\t317927\n酰氯\t317928\n张培萌\t317929\n海德里希\t317930\nUIScrollView\t317931\n漏脸\t317932\ngh5\t317933\n大肠埃希菌\t317934\n绯花玉\t317935\n十式\t317936\n钢管舞\t317937\nladder\t317938\n汇顶\t317939\n0分\t317940\n改运\t317941\n128位\t317942\n防御性\t317943\n铁心桥街道\t317944\n靖州县\t317945\n方通\t317946\nOMSI\t317947\n英仙座\t317948\nflexpaper\t317949\n假人\t317950\n氏体\t317951\n族人\t317952\n陨石\t317953\n传来\t317954\nbeautylish\t317955\n低光\t317956\n花样美男\t317957\n赣州市人力资源和社会保障局\t317958\nScared\t317959\n1445\t317960\n飞度GK5\t317961\n昆明地铁1号线\t317962\n九子夺嫡\t317963\n福昕PDF阅读器\t317964\n小鸽\t317965\nhaoleav\t317966\n可人\t317967\n桃胶\t317968\ndocket\t317969\n中国电子科技集团公司\t317970\n2边\t317971\n域\t317972\nkeyevent\
t317973\n7.1.2\t317974\n来电科技\t317975\n霍比屯\t317976\nHorn\t317977\n轻卡\t317978\n扬子江药业集团\t317979\n土巴兔\t317980\n武城\t317981\n王腾\t317982\n不定期\t317983\n海螺肉\t317984\n最伟大的女演员排名_\t317985\n鲤鱼乡\t317986\n董真\t317987\ndegushi\t317988\n救护车\t317989\n文凭\t317990\n性爱情趣\t317991\n吴京\t317992\n法国号\t317993\n商会友\t317994\n伊达政宗\t317995\n完美人生\t317996\n顶盒\t317997\n莫寒\t317998\n增值税申报表\t317999\n百度地图js\t318000\n上海新阳\t318001\n16.8\t318002\n吊带裙\t318003\n周年纪念日\t318004\n台模\t318005\n磅\t318006\n滑行\t318007\n红罐\t318008\n218&\t318009\nind\t318010\n破碎险\t318011\noae\t318012\n后感\t318013\n6001\t318014\nCHLOE\t318015\nSDA\t318016\n布洛芬混悬液\t318017\n贵港港北\t318018\n结扎术\t318019\n成本领先战略\t318020\n盛爱\t318021\n百叶门\t318022\n伊风\t318023\n大雁塔北广场\t318024\n全军覆灭\t318025\n逸出\t318026\n车辆登记证\t318027\n区农委\t318028\n雷克\t318029\n费斯\t318030\n金丝檀木\t318031\n3387\t318032\n阿里巴巴批发网\t318033\n赵伟国\t318034\n田海蓉\t318035\n57张\t318036\n景泰\t318037\n锭子\t318038\nessay\t318039\n今年四月份\t318040\nfinding\t318041\n定期\t318042\n户田真琴\t318043\n2.0S\t318044\n种植地\t318045\n九座\t318046\n低碳建筑\t318047\n张媛媛\t318048\nPatio\t318049\n表达法\t318050\n新荷\t318051\n陈墨\t318052\n半夜3点\t318053\n辊道窑\t318054\n肉畜\t318055\n弃置\t318056\n硅质板\t318057\n你的谎言\t318058\n下铺\t318059\n三五年\t318060\n九锅一堂\t318061\n牛华网\t318062\nLED日间行车灯\t318063\n美术学专业\t318064\n手机保护膜\t318065\n豪哥\t318066\n四特\t318067\nVerify\t318068\nU盾\t318069\n人民法庭\t318070\n空白页\t318071\n锆\t318072\n県\t318073\n慢性牙周炎\t318074\nPb\t318075\n威克斯\t318076\n长沙一中\t318077\n香岛\t318078\n36平方\t318079\n村通\t318080\nCENTRE\t318081\n犯罪事实\t318082\n20170117\t318083\n临沂房产超市网\t318084\n阿克西斯\t318085\nIPZ-127\t318086\n盘点控\t318087\n黑钻石\t318088\n小榕\t318089\n1500转\t318090\n扶风县\t318091\nfm2012\t318092\n释迦摩尼佛\t318093\n叶君健\t318094\n慧飞\t318095\n刨宫\t318096\n中国科学院大学\t318097\nE5\t318098\n狂妃\t318099\n担保物\t318100\nTP5\t318101\n刘晓雁\t318102\n移步换景\t318103\n高辣\t318104\n处刑者\t318105\n刺风\t318106\n杨福\t318107\n天价\t318108\n叶赫\t318109\n沪嘉甬\t318110\n足球网\t318111\nmathtype\t318112\n禅机\t318113\n名师\t318114\nリア\t318115\n5M\t318116\n五连发\t318117\n秦方\t318118\n金师傅\t318119\n太阳膜\t318120\n街子古镇\t3181
21\np8max\t318122\n2碗\t318123\nshine\t318124\n陈抟\t318125\n明早\t318126\n克鲁塞\t318127\n浙江省总工会\t318128\n被查\t318129\n148元\t318130\n星际战士\t318131\n三角兽\t318132\n惠州大亚湾经济技术开发区\t318133\n诗乃\t318134\n追悼会\t318135\nOC\t318136\n经济圈\t318137\n韩云云\t318138\n金瓶梅2爱的奴隶\t318139\nscau\t318140\n离析\t318141\n礼教\t318142\nSays\t318143\nwifi认证\t318144\n十字交叉法\t318145\n启蒙者\t318146\n马里\t318147\n天津市南开中学\t318148\n深蓝广场\t318149\n大唐狄公案\t318150\n租代\t318151\n救世之树\t318152\n胡青\t318153\n旅交网\t318154\n3188\t318155\n茴香豆\t318156\n安东尼·戴维斯\t318157\n保护盒\t318158\n2000级\t318159\n七宝宝\t318160\n暮光\t318161\n薄幸\t318162\n粒度\t318163\n年日\t318164\n相熟\t318165\n杨铁锤\t318166\necahrts\t318167\n0620\t318168\n恐怖之眼\t318169\nMice\t318170\nshiwu\t318171\n信德\t318172\n飞星\t318173\nGREE\t318174\n南京市第一中学\t318175\n拨通\t318176\n模拟电路\t318177\n换标\t318178\napache2\t318179\n聚沙成塔\t318180\n大集\t318181\n43度\t318182\n添香\t318183\npsd素材\t318184\n环球眼\t318185\n4dlc\t318186\n家有儿女\t318187\n和室\t318188\n市政府党组\t318189\n通经\t318190\n唐突\t318191\n异世傲天\t318192\n截去\t318193\n灯火阑珊\t318194\n口罩机\t318195\nye\t318196\n育才幼儿园\t318197\n无敌反斗星\t318198\n现代人\t318199\n唐宫\t318200\n湖北城市建设职业技术学院\t318201\n北京梨园地铁站\t318202\n东京梦华录\t318203\nvimium\t318204\n杞人\t318205\nyouke\t318206\n流放之路市集\t318207\n陈昊\t318208\n虬髯客\t318209\n一波\t318210\n6粒\t318211\n家庭保险\t318212\n杨魏玲花\t318213\n金斧子私募\t318214\n5.1米\t318215\n贡井区\t318216\nzxf\t318217\n世界地球日\t318218\n保险中介公司\t318219\n圆线\t318220\n青翠\t318221\nMACRO\t318222\n深圳市公安局交通警察局\t318223\n韩国虎队\t318224\n公私钥\t318225\n先兆\t318226\n上岛咖啡\t318227\nYUE\t318228\n签报\t318229\n忠奸人\t318230\nrefrigerator\t318231\n艺术季\t318232\nWrapping\t318233\n内勤\t318234\nRt\t318235\n先锋街道\t318236\n气具\t318237\n国外市场\t318238\n十三亿分贝\t318239\n世风日下\t318240\n三四天\t318241\n絲漣\t318242\n词典网\t318243\nes-ik\t318244\n政教\t318245\nKeenYeh\t318246\n龙东地区\t318247\n650TR\t318248\n黑料\t318249\n两辆\t318250\n掸\t318251\n命星\t318252\n5490\t318253\n覆海\t318254\nT28\t318255\n5ucom.com\t318256\n初等变换\t318257\n宽客\t318258\n1.0.24\t318259\n收包\t318260\n云舍\t318261\n香河园\t318262\nWes\t318263\n勃锐精\t318264\n脚趾头\t318265\n转道\t318266\n国
有土地使用权协议\t318267\n薇尔莉特\t318268\n20151225\t318269\n83869821\t318270\n沙沟镇\t318271\n王冉\t318272\n男装\t318273\n威严\t318274\n八段\t318275\n大证\t318276\n百吉\t318277\n不需\t318278\n小峰\t318279\n矛盾体\t318280\n补偿金\t318281\nbrow\t318282\n第三季度\t318283\n听雨眠\t318284\n青海玉树\t318285\np,q\t318286\n夫君\t318287\n七月\t318288\n岱\t318289\n你是我最深爱的人\t318290\n八大家\t318291\n课\t318292\n性风俗\t318293\n卓凯\t318294\n昌辉\t318295\n四川护理职业学院\t318296\n肥妞\t318297\n沸沸扬扬\t318298\n智龙迷城\t318299\ndota吧_\t318300\n黄山火车站\t318301\n全魔\t318302\n朋子\t318303\n邪剑\t318304\nfootage\t318305\n微星Z370\t318306\n联培\t318307\nAttributes\t318308\n毁灭地球\t318309\n大学英语精读\t318310\n猪窝\t318311\n十个工作日\t318312\n不辣\t318313\n北燕\t318314\n萨克斯吧\t318315\n陕西职业技术学院\t318316\n变态心理学\t318317\nWIF\t318318\nProduction\t318319\nyyyy\t318320\n坐套\t318321\nLLDP\t318322\nrsa\t318323\n滚动新闻_电影网\t318324\n东方梦工厂\t318325\n小锋\t318326\n萍踪\t318327\nqq播客\t318328\n用期\t318329\n录\t318330\nLiars\t318331\n弹力棉\t318332\n岁者\t318333\n网主\t318334\nFilco\t318335\n4k\t318336\n向右走\t318337\n套筒\t318338\n成都广告公司\t318339\n晚婚\t318340\n大票\t318341\n为准\t318342\n8500e\t318343\nidea2017\t318344\n单接\t318345\n光焰\t318346\n啸龙吟\t318347\nCortex-M3\t318348\n莞工\t318349\n皇帝岛\t318350\n谨守\t318351\nHamlet\t318352\n变革性\t318353\n泉州市人民政府办公室\t318354\n北京市局\t318355\n25.1\t318356\n钢研\t318357\n土佐\t318358\n现行版\t318359\n迁怒\t318360\n落谁家\t318361\n阿尼玛\t318362\n广西出入境检验检疫局\t318363\nscreenshot\t318364\n企业国有资产法\t318365\n满汉\t318366\n好先生\t318367\n超声波\t318368\n捣碎\t318369\n百川\t318370\nLancme\t318371\n廊曼\t318372\n中华全国专利代理人协会\t318373\n2年以上\t318374\n白血病\t318375\n婴宝\t318376\ncruiser\t318377\n铃儿响叮当\t318378\n凌霄宝殿\t318379\n横管\t318380\n粉末\t318381\nhiberfil\t318382\n墙根娱乐_墙根网\t318383\n库布其沙漠\t318384\n申博娱乐\t318385\n窘境\t318386\n泰迪吧_\t318387\n麓山南路\t318388\n宜宾市商业银行\t318389\n颗粒度\t318390\n安源\t318391\n保证书\t318392\n歌妓\t318393\nopal\t318394\n波仔\t318395\n示范点\t318396\nreve\t318397\n200秒\t318398\na321\t318399\n含情脉脉\t318400\nThang\t318401\n没有钱\t318402\n汪俊\t318403\n厦门市政府\t318404\nlosesea\t318405\nAbroad\t318406\n神锋\t318407\n师尊\t318408\n南京林业大学\t318409\n智娜\t318410\
nbeter\t318411\n龙虎门吧\t318412\n何超仪\t318413\nDollar\t318414\n山东省发改委\t318415\n大红山\t318416\nbuilderMIS\t318417\n驶离\t318418\neMarketer\t318419\n第一届\t318420\n廖氏\t318421\n国际关系学\t318422\n10,周年\t318423\n冰面\t318424\n流于形式\t318425\n炉石传说】水手之家\t318426\n原发\t318427\n秀水街\t318428\n化妆店\t318429\n瓷质\t318430\nonClick\t318431\nandrea\t318432\n消字号\t318433\n拽住\t318434\n水头\t318435\n弹舌\t318436\n星鲨\t318437\n清静经\t318438\n特许经营合同\t318439\n异狂\t318440\n吻合器\t318441\nnisi\t318442\n伊琳\t318443\nsz00\t318444\n爬山\t318445\n北沙\t318446\n江苏出入境检验检疫局\t318447\nMAS\t318448\n6ix9\t318449\n民航\t318450\n豆客\t318451\nNetworkX\t318452\n排气\t318453\nsrp\t318454\n匣子\t318455\n丝印\t318456\n录音宝\t318457\n干蒸\t318458\n蒙混过关\t318459\nYourself\t318460\n巨化集团\t318461\nControlled\t318462\n中华V3\t318463\n真爱的谎言之破冰者全集\t318464\n百合康\t318465\n做衣服\t318466\n骨弓\t318467\n白胜\t318468\n秦岭野生动物园\t318469\n古者\t318470\n金山逍遥\t318471\n李云恺\t318472\n右翼\t318473\n中塔\t318474\n不准\t318475\n扬力\t318476\n≤\t318477\nRecommended\t318478\n舞王\t318479\n未成年人网\t318480\n整人\t318481\n茶艺\t318482\n悦途\t318483\n龙中龙\t318484\n软玉\t318485\nPitStop\t318486\n_罗湖电子政务网\t318487\n维密\t318488\n燕山石化\t318489\n爱快\t318490\n归功于\t318491\n5天后\t318492\n92万\t318493\n科普日\t318494\n大酒\t318495\nstmbuy\t318496\n妖板\t318497\nNPS\t318498\n江湾镇\t318499\n急性肾损伤\t318500\n160630\t318501\n棋圣\t318502\n波霸\t318503\n德惠\t318504\n网利宝\t318505\n高斯光\t318506\n上海交通大学材料科学与工程学院\t318507\nCQ\t318508\nXZ\t318509\nDNV\t318510\nppt打印\t318511\n负负\t318512\n速本\t318513\n内蒙古卫视\t318514\n阿银\t318515\n话费券\t318516\n20185.1\t318517\n中国天辰工程有限公司\t318518\n1446\t318519\n深深儿\t318520\n雨薇\t318521\ngouwu\t318522\nGDM\t318523\nCurrent\t318524\n苹果iPadPro\t318525\nhnb\t318526\nxiaopiu\t318527\n吴孟超\t318528\n巨炮\t318529\n外延片\t318530\n女妖\t318531\n真实再现\t318532\n三阴性乳腺癌\t318533\n海拉克斯\t318534\n黎诺懿\t318535\naj11\t318536\n人血清白蛋白\t318537\n电商人\t318538\n鸿路钢构\t318539\n捷达论坛\t318540\nVeryCD电驴\t318541\n创义\t318542\n6010\t318543\n辨证\t318544\nPrinter\t318545\n丘陵地区\t318546\n爱的色\t318547\n刘涛\t318548\nPDMS\t318549\n黑彩\t318550\n合庆镇\t318551\n沈阳故宫\t318552\nPicnic\t318553\npagehel
per\t318554\n朋友\t318555\nmqsql\t318556\n大围山国家森林公园\t318557\n成岩\t318558\n南春\t318559\n北京陶然亭公园\t318560\n稍安勿躁\t318561\n莱德\t318562\n5年以上\t318563\n下沙区\t318564\n花鼓灯\t318565\n公式\t318566\n鲁朗林海\t318567\n自助火锅\t318568\nHongten\t318569\n1419\t318570\n相仪\t318571\n热带雨林\t318572\ntonghua\t318573\n平面体\t318574\n178号\t318575\nculturelle\t318576\n兰州机场\t318577\n君彩\t318578\n及笄\t318579\nps8\t318580\n吴祖光\t318581\n沙漠死神\t318582\n海洋岛\t318583\n遮天步步生莲\t318584\nAnh\t318585\n九龙公园\t318586\n信众\t318587\non在线翻译\t318588\n久坐\t318589\n金融集团\t318590\n巡查员\t318591\nsteer\t318592\ntheir\t318593\n仙府\t318594\nwipeout\t318595\n有话好好说\t318596\n车陂南\t318597\n量产机\t318598\n天者\t318599\n五年以上\t318600\n调节阀信息网\t318601\n陈如桂\t318602\n东吴证券\t318603\nCACHE\t318604\n空心杯\t318605\n起死回生\t318606\n伊犁\t318607\n百草路\t318608\nJPype\t318609\nfit2\t318610\n金框\t318611\nobd\t318612\n150mg\t318613\n美乐乐家具\t318614\n地宫\t318615\n贵妃网\t318616\n粉巷\t318617\nseizure\t318618\n风雨\t318619\nopengrok\t318620\n爱哭\t318621\n袖子\t318622\n张开\t318623\n盐酸氨溴索片\t318624\npdflatex\t318625\n2000W\t318626\nJFinal\t318627\n爱思助手\t318628\n空饷\t318629\nNPM\t318630\n根治术\t318631\n陕西中医药大学附属医院\t318632\n绝地求生变声器\t318633\n街拍网\t318634\nstk\t318635\n月上柳梢头\t318636\n人音版小学\t318637\n残生\t318638\naj5\t318639\n百瑞源\t318640\n铁幕\t318641\n冒险岛3\t318642\n镇魔\t318643\nNewBalance\t318644\n长安CS35\t318645\n杖责\t318646\n娄底一中\t318647\n起伏\t318648\nMantisBT\t318649\nAesthetics\t318650\nIP65\t318651\nim03\t318652\n孙伯翔\t318653\n细跟\t318654\n保山市人民政府\t318655\n第三十四回\t318656\n吴婷\t318657\n凤凰单枞茶\t318658\n翟志刚\t318659\n窃读记\t318660\n21处\t318661\nassemble\t318662\n草坪机\t318663\n消亡\t318664\n熊猫基地\t318665\n马可福音\t318666\n不懂爱\t318667\nReshade\t318668\n居居侠\t318669\n忆念\t318670\nethics\t318671\nmicroblaze\t318672\n美男\t318673\nJIU\t318674\n万景\t318675\n文山市\t318676\n简明\t318677\n硬分币\t318678\n肉臀\t318679\n尧建云\t318680\n170万元\t318681\n忘忧谷\t318682\n)资产管理有限公司\t318683\n曹红\t318684\n零售户\t318685\n颅窝池\t318686\n南街\t318687\nwwwsao5566com\t318688\n爱爱爱\t318689\n催吐\t318690\n心灵感应\t318691\nibus\t318692\nvio\t318693\n3830\t318694\n欲钱\t318695\n吴跃\t3186
96\nslideshow\t318697\n海鲸\t318698\nefficient\t318699\nHOMME\t318700\n刘岩\t318701\n湖南科技\t318702\n筑龙建材网\t318703\nMac输入法\t318704\n永跟党走\t318705\n一虎\t318706\n雁门\t318707\nFORUM\t318708\nioa\t318709\n疆土\t318710\n刘希平\t318711\n牙堂\t318712\n鬼吹灯之龙岭迷窟\t318713\n厨王争霸\t318714\n不敢相信自己\t318715\nshuiguo\t318716\n吴霞\t318717\n20条\t318718\n格林豪泰连锁酒店\t318719\n致诚\t318720\n河阳\t318721\n生无\t318722\n赛睿Rival\t318723\n爸爸爸爸\t318724\n哈西站\t318725\nsnr\t318726\n商品名\t318727\n自豪\t318728\n汉语热\t318729\nscaffolding\t318730\n一排\t318731\n19块\t318732\n刀粒\t318733\n地史\t318734\n朝如青丝暮成雪\t318735\n钻刷\t318736\nWOW6.1\t318737\n惠惠购物助手\t318738\n轻声\t318739\n几千里\t318740\n元整\t318741\nbaoyu\t318742\n营销型网站建设\t318743\n51%\t318744\n简洁版\t318745\n万家湾\t318746\n罗英\t318747\n风湿性\t318748\n豆豆小说阅读网\t318749\n常安\t318750\n火焰鸡\t318751\n空乘专业\t318752\n昭武\t318753\n张雨薇\t318754\n手贱\t318755\n广东省人事考试局\t318756\ncrystaldiskinfo\t318757\n案释法\t318758\n炒掉\t318759\n不机\t318760\n终南山\t318761\n第三十七条\t318762\n极品飞车online\t318763\n宝卷\t318764\n李记\t318765\n超频三\t318766\n开场曲\t318767\ncmbs\t318768\n杭州万向职业技术学院\t318769\n香港路\t318770\n梁博\t318771\ndgn\t318772\n对越自卫还击战\t318773\n隐秘\t318774\nsth\t318775\n振兴大道\t318776\n神咲詩織\t318777\n甘诺宝力\t318778\n感官世界\t318779\n熙\t318780\n富婆\t318781\n滚回\t318782\n表耳\t318783\n木薯粉\t318784\n音乐车\t318785\n断魂\t318786\n试剂\t318787\n钱宝\t318788\nx-crew\t318789\n红岛\t318790\n1790\t318791\n光韵达\t318792\nwendao\t318793\n明帝\t318794\n0455\t318795\n受弯构件\t318796\n排\t318797\n1500家\t318798\n柒零叁\t318799\n用色\t318800\n狼羊\t318801\n放鸡岛\t318802\nip65\t318803\n初凝\t318804\n俄三国\t318805\n武田制药\t318806\n反恐办\t318807\n长鹿农庄\t318808\n会声会影X10\t318809\n毕雯\t318810\n综合应用能力\t318811\n高低压\t318812\n黄军\t318813\n曾参\t318814\n商标注册公司\t318815\n碧溪新区\t318816\nFN\t318817\n零7个月\t318818\n液色\t318819\n45678\t318820\n如风达\t318821\n10秒后\t318822\n灵感日报\t318823\n发汗\t318824\nLZMA\t318825\n挺进\t318826\nTeK\t318827\n穿越火线单机版\t318828\n金融类\t318829\n20150610\t318830\n真人兽\t318831\n四件\t318832\n搏命\t318833\nzg\t318834\n萨师煊\t318835\nSignal\t318836\n瑞士花园\t318837\n生词\t318838\n打花\t318839\n第几本\t318840\nemui\t318841\n葛优\
t318842\n早上6点\t318843\n黑莓Passport\t318844\nKindness\t318845\n贾斌\t318846\n太平洋安防网\t318847\n俄文版\t318848\n三萜\t318849\n软片\t318850\nJetta\t318851\n2400个\t318852\n面向对象分析\t318853\n梅树\t318854\n甲烷菌\t318855\n49页\t318856\n亿万年\t318857\n艾可\t318858\n聘礼\t318859\n青青草在线视频\t318860\n树皮\t318861\n腐蚀\t318862\n中置柜\t318863\nwin平板\t318864\nKoro\t318865\n糠酸莫米松\t318866\n政治学家\t318867\n原虫\t318868\nHOOK\t318869\n串货\t318870\n雷克萨斯NX\t318871\nHouses\t318872\nbasis\t318873\n三年后\t318874\n孙祥\t318875\n索朗旺姆\t318876\nMens\t318877\n痩\t318878\n最高者\t318879\n钟吾\t318880\n邕宁\t318881\n重庆市第五中级人民法院\t318882\n0826\t318883\n达内科技\t318884\n缩窄性心包炎\t318885\n衰草\t318886\nppr\t318887\n八旗\t318888\n无忧答案网\t318889\n薏米粥\t318890\n古城门\t318891\n激光束\t318892\n小叶紫檀\t318893\n定员\t318894\n亚硫酸铵\t318895\nJJG\t318896\nMiguel\t318897\nn551\t318898\n世俗化\t318899\n入账\t318900\n北京航天总医院\t318901\n情记\t318902\n2.9.1\t318903\n小班段\t318904\n郭奉孝\t318905\nrather\t318906\n医学史\t318907\nDiff\t318908\n功夫熊猫3\t318909\n小马王\t318910\n李燕花\t318911\nOBV\t318912\n苏志刚\t318913\n60W\t318914\n孙策\t318915\nTrainor\t318916\n开心豆少儿英语\t318917\n金秋助学\t318918\nZaha\t318919\n生物群系\t318920\n基座\t318921\n慢无\t318922\nTodoList\t318923\nepisode\t318924\n曹操传\t318925\n虚拟光驱Daemon\t318926\n欢乐海\t318927\n8229\t318928\n展鸿\t318929\nheidi\t318930\n盛杰\t318931\n陈博\t318932\n类函数\t318933\nPrank\t318934\n1833\t318935\n牢里\t318936\n手炮\t318937\n蓄力\t318938\ntty1\t318939\n95_\t318940\n吃穿住\t318941\nHymn\t318942\n冖\t318943\n李晓军\t318944\n文山新闻网\t318945\nBIND\t318946\n中国五冶集团\t318947\n月中\t318948\n骨盆前倾\t318949\n9800gt\t318950\n沃兹尼亚奇\t318951\n萨克斯谱\t318952\n红莉栖\t318953\n设计艺术学院\t318954\nTV_热播韩剧网\t318955\n泉州市政府网\t318956\n中国质量新闻网\t318957\n绳套\t318958\n测评\t318959\n禽类\t318960\n2016年4月7日\t318961\n红心火龙果\t318962\n13宗\t318963\n智图网\t318964\n趣运动\t318965\n日本迪士尼\t318966\n玫瑰花简笔画\t318967\n事亲\t318968\n环氧树脂漆\t318969\n西蒙斯\t318970\n部件\t318971\n逃不开\t318972\n舞剧\t318973\n2017.11\t318974\n盒形\t318975\n果胶\t318976\n处女_\t318977\n浮光掠影\t318978\n收藏界\t318979\n赵长鹏\t318980\n新华字典\t318981\n自由行网\t318982\n平安荆楚网\t318983\n妺妹\t318984\n石棺\t318985\n文博园\t31898
6\n受益权\t318987\n工地\t318988\n65P\t318989\n城固\t318990\n文献管理软件\t318991\n减速箱\t318992\nmentalray\t318993\n泡饼\t318994\n易智\t318995\n自粘性\t318996\nSCOPE\t318997\n进阶版\t318998\n_君越论坛_汽车之家论坛\t318999\n调侃\t319000\n中国香烟网\t319001\n县里\t319002\n文秘\t319003\n突增\t319004\n叒\t319005\n战国红\t319006\n蚌埠路\t319007\n2018天\t319008\nbest\t319009\n铺铺\t319010\n徐衣显\t319011\n双氯芬酸钠肠溶片\t319012\nChrisZZ\t319013\nJaime\t319014\nmx4\t319015\n箭术\t319016\n丧母\t319017\n病文\t319018\n3090\t319019\n雷加利亚\t319020\n募投项目\t319021\n丙二醛\t319022\n改向\t319023\nMerck\t319024\n崇拜者\t319025\n香港公开大学\t319026\n大国工匠\t319027\n北大经院\t319028\n开卷\t319029\n乌灵胶囊\t319030\n迪斯尼动画\t319031\n第70集\t319032\n收费所\t319033\n重战\t319034\n中国证券投资者保护基金有限责任公司\t319035\n叶栅\t319036\n呼伦贝尔盟\t319037\n荣安\t319038\n迪兰恒进\t319039\nL805\t319040\n59秒\t319041\n幽灵山庄\t319042\n牛仔布\t319043\n深圳市教育科学研究院\t319044\n洋溪镇\t319045\n高远\t319046\nmt管理器\t319047\n梦幻岛\t319048\n放缩\t319049\nGUEST\t319050\n爱不释手\t319051\n微信号\t319052\n正厅级\t319053\nbomba\t319054\n6061铝合金\t319055\n六十三\t319056\ninstant\t319057\ntofixed\t319058\n孵化园\t319059\n丁海寅\t319060\n格雷斯\t319061\n五联五问\t319062\n言之不预\t319063\n手指尖\t319064\n亿美软通\t319065\n吕乐\t319066\n亿鑫\t319067\n长安欧尚车友会|长安欧尚\t319068\n长汀县\t319069\n酷比魔方iwork10\t319070\n2018年后\t319071\n真三国无双7帝国\t319072\n范海辛\t319073\nwepack\t319074\n星期日\t319075\n乔四爷\t319076\n凯美英魂之刃\t319077\n红屏\t319078\n塑编\t319079\n篮球世界杯\t319080\n吸水机\t319081\n2016年10月22日\t319082\nbasename\t319083\nlhs\t319084\n蚌房网\t319085\n没想过\t319086\n信访局\t319087\n综合查询网\t319088\n余杭站\t319089\ne卡通\t319090\n3543\t319091\n7颗\t319092\n采取行动\t319093\n天极\t319094\n中联通\t319095\n校史馆\t319096\n长峰\t319097\n绒癌\t319098\n西安物价局\t319099\n高雁\t319100\n猥琐\t319101\nLip\t319102\n病理科\t319103\n注册序列号\t319104\n联合\t319105\nvcsel\t319106\navavso\t319107\n孙芸芸\t319108\n福利秀\t319109\n平衡版\t319110\n日天\t319111\n脉冲布袋除尘器\t319112\n238万源\t319113\n兰丸\t319114\n鲜见\t319115\npumping\t319116\n街舞秀\t319117\n塞尔达传说荒野之息_塞尔达传说\t319118\n银针\t319119\n腰痛\t319120\n警示贴\t319121\n张庆军\t319122\nP_\t319123\n珠海路\t319124\nQQ好友\t319125\n武汉新芯集成电路制造有限公司\t319126\n北京海关\t319127\n兄妹情\t31912
8\n唯链\t319129\n正坐\t319130\n咒骂\t319131\n首卡\t319132\n右介\t319133\nKEVIN\t319134\n卡西欧电子\t319135\n达尔文进化论\t319136\n写出\t319137\n大功率\t319138\n数控车削\t319139\nanode\t319140\nTOT\t319141\n安庆绪\t319142\n张恒远\t319143\n改进型\t319144\n春暖花\t319145\n泸山\t319146\n拍摄影\t319147\n陈御妍\t319148\n马盘\t319149\n复活卡\t319150\n路威\t319151\n命运之子\t319152\n郑棉\t319153\n萨勒曼\t319154\n郑采妍\t319155\n救赎之路\t319156\n春宵一刻\t319157\n崔斯特姆\t319158\n姜醋\t319159\nabi\t319160\nRude\t319161\n29卷\t319162\n胖乎乎\t319163\n美图秀秀变\t319164\n书舍\t319165\n教父1\t319166\n被告\t319167\nDC24V\t319168\nShay\t319169\n袁芳\t319170\n大姑娘美大姑娘浪\t319171\n20年\t319172\npads\t319173\n西\t319174\n表情\t319175\n圣痕炼金士\t319176\n黄晓娟\t319177\n国际机场\t319178\n医疗队\t319179\nBadges\t319180\n闪光器\t319181\n5旬\t319182\n兴文\t319183\n好印象\t319184\n北兴\t319185\n东莞大朗\t319186\n影流之主\t319187\n运河桥客运站\t319188\ngperftools\t319189\n北斗极星\t319190\n碧桂园豪园\t319191\n众泰_众泰T500\t319192\n大学生创业基础\t319193\nLloyds\t319194\n粘结性\t319195\nHAMMER\t319196\n旌湖\t319197\n爱迪尔珠宝\t319198\n新福\t319199\n魔兽世界WOW\t319200\nhaze\t319201\n4孔\t319202\n高尔夫论\t319203\nダメ\t319204\n小天堂\t319205\n古穿\t319206\n盘位\t319207\nupstream\t319208\n原汁机\t319209\n娇娘\t319210\n康涅狄格\t319211\n掟\t319212\n南太\t319213\n002340\t319214\n机器翻译\t319215\n2.8.3\t319216\ncus\t319217\n黑火\t319218\n26栋\t319219\n病毒性疱疹\t319220\n密卷\t319221\n宛城\t319222\n莫邪\t319223\n幸福\t319224\n1.8cm\t319225\n草莓干\t319226\nCC2541\t319227\n合路\t319228\n白酱\t319229\ntna\t319230\n蜀汉路\t319231\n通达信公式-股旁网\t319232\n附教\t319233\n新南洋\t319234\n压缩\t319235\n6700元\t319236\n双鹰\t319237\n尾凳\t319238\n头蒙\t319239\n胫甲\t319240\nlyx\t319241\nninelie\t319242\nsup\t319243\n天基\t319244\n通天狄仁杰\t319245\n保险理财\t319246\nHeng\t319247\n20160815\t319248\n希努尔\t319249\nAltium\t319250\n云\t319251\n大道\t319252\n頻道\t319253\n奇谋\t319254\n人教版小学五年级语文下册\t319255\n500套\t319256\n伊拉克战争\t319257\n出逃\t319258\n喷药机\t319259\n名爵\t319260\nmondrian\t319261\n骷髅人\t319262\n胡铁花\t319263\n李建军\t319264\n紫外线灯\t319265\n总所\t319266\n酸楚\t319267\n异境\t319268\n军表\t319269\n澳大利亚国立大学\t319270\n京珠\t319271\n14.04.2\t319272\n不错\t319273\n比格犬\t319274\nmailchimp\t3192
75\n孤鸿\t319276\n陆金\t319277\ncalgary\t319278\n56种\t319279\n筒式\t319280\n雁阵\t319281\n1纳米\t319282\n一克拉\t319283\n禅味\t319284\n过故人庄\t319285\n八月底\t319286\n荣耀S11\t319287\nICP备案\t319288\n深圳市档案局\t319289\n凡普金科\t319290\n吉水县\t319291\n规程\t319292\n阿呦\t319293\nSekai\t319294\n苏州中学\t319295\n陈陶\t319296\n拐弯\t319297\nUIWebview\t319298\n24.9\t319299\n卫士\t319300\n储备生\t319301\nqq加油站\t319302\n186卡\t319303\n_美说网\t319304\n陈思远\t319305\n非正常户\t319306\n流水账\t319307\n13.6%\t319308\n南召县\t319309\n西山煤电\t319310\n灰灰影音\t319311\n天阶\t319312\n几倍\t319313\nHellogirls\t319314\n中财网\t319315\n0.75KW\t319316\n出包王女darkness\t319317\nJUX\t319318\n槐安路\t319319\n别提\t319320\ninflatable\t319321\n葵花药业\t319322\n吡唑\t319323\n李凭\t319324\n赵力\t319325\nJava7\t319326\n套利定价理论\t319327\n容许值\t319328\n仰天长啸\t319329\n组氨酸\t319330\nSynthetic\t319331\nYears\t319332\nV11\t319333\n专招\t319334\n古生代\t319335\n思铂睿\t319336\n在江湖\t319337\n天\t319338\n1959\t319339\n库卡机器人\t319340\nomega\t319341\n追迹\t319342\n级应急响应\t319343\n惠比特犬\t319344\n电源箱\t319345\n交流群\t319346\n加长杆\t319347\n短期内\t319348\nconsumer\t319349\nDDI\t319350\n担架\t319351\n男健\t319352\n中国青年志愿者协会\t319353\n暖调\t319354\n海升集团\t319355\n法拉第电磁感应定律\t319356\n刘婷\t319357\n吸汗\t319358\n我的世界花雨庭\t319359\n童\t319360\n野心\t319361\n627\t319362\n640\t319363\n胎监\t319364\nsingapore\t319365\nUTAU\t319366\n20161118\t319367\n沉痛\t319368\n麻神\t319369\n爪哇\t319370\n5.7.12\t319371\n北京市工贸技师学院\t319372\niroha\t319373\n军权\t319374\n热血街舞团编舞师\t319375\nCeramic\t319376\n光明正大\t319377\n中居\t319378\n清华附中\t319379\n深圳市通威货运有限公司\t319380\n恩奇\t319381\n168.cn\t319382\n游戏类\t319383\n海湾国家森林公园\t319384\n佛光寺\t319385\nipykernel\t319386\n于正\t319387\npm25.com\t319388\n仙4\t319389\n贵州财经大学商务学院\t319390\n工程处\t319391\n浙江师范大学\t319392\n气压式\t319393\n灌者\t319394\n苏州房产网\t319395\ndefects\t319396\nmicroRNA\t319397\n阿姆斯特丹机场\t319398\n传动系统\t319399\nKikiBT\t319400\n窟牙\t319401\nG3\t319402\n明喻\t319403\n成方\t319404\n格符\t319405\n保康新闻网\t319406\npingley\t319407\n109国道\t319408\n老皮\t319409\n陶伟基\t319410\n9寸\t319411\n宗族\t319412\nCarSim\t319413\nmun\t319414\n伊春\t319415\n亚太路\t319416\n责打\t31
9417\ng300\t319418\n合会\t319419\n知产力\t319420\nv5.01\t319421\n贪污犯\t319422\n乱云飞渡\t319423\n周莉\t319424\n真格基金\t319425\n帕格尼尼\t319426\n张弓\t319427\n源于\t319428\ngbz\t319429\nPDFs\t319430\n到底是谁\t319431\n照秀\t319432\n魏惠王\t319433\nTapTap\t319434\nartery\t319435\n管庄\t319436\n马虎\t319437\npus\t319438\n姜滨\t319439\n陆俊\t319440\n姚璎格\t319441\n光辉岁月\t319442\n中华人民共和国专利法\t319443\n都察院\t319444\njdialog\t319445\n底脚\t319446\n环意\t319447\n荣任\t319448\n金马河\t319449\n问题机\t319450\n横批\t319451\n数字媒体技术专业\t319452\n尼玛\t319453\n七里堡\t319454\nsemantic\t319455\n卤猪蹄\t319456\n551路\t319457\n蓝天航空公司\t319458\n益财经\t319459\n诗涵\t319460\n竖线\t319461\nGS65\t319462\ngmod\t319463\n韩雪钱学森\t319464\nhashing\t319465\n推奴\t319466\nTranscripts\t319467\n晓光\t319468\n各领风骚\t319469\n热锻\t319470\n贵气\t319471\n牛黄解毒片\t319472\n海尔燃气灶\t319473\n20150706\t319474\n封浜\t319475\nShodan\t319476\n劳动合同法\t319477\n维拉\t319478\n图书大厦\t319479\n3.5万元\t319480\n李开复\t319481\n情报\t319482\n即焚\t319483\n钟离\t319484\n静态构造函数\t319485\nmax2018\t319486\n买购网\t319487\n做场\t319488\n走逃\t319489\n蛋糕师\t319490\n主险\t319491\n月照\t319492\n中农集团\t319493\n10.9.3\t319494\n云渺\t319495\n二亩\t319496\n沂水县\t319497\n8851xxxx\t319498\n华视\t319499\n3ex\t319500\n偏高\t319501\n青蛙王子\t319502\n奥城\t319503\n百变星君\t319504\nthous\t319505\n移动光猫\t319506\n2011-2012年\t319507\n300015\t319508\n比亚迪公司\t319509\nFrankie\t319510\nbudget\t319511\nmfp\t319512\n石家庄市人力资源和社会保障局\t319513\n金力泰\t319514\n强弱电箱\t319515\n金库\t319516\n50km\t319517\nTibet\t319518\n费祎\t319519\nTechNews\t319520\n117名\t319521\n埃瓦尔\t319522\n紧线器\t319523\n好运来\t319524\n勾践\t319525\n平安医院\t319526\n胡汉\t319527\nShareSDK\t319528\n广告圈\t319529\nOracle11G\t319530\n焗油膏\t319531\n71年\t319532\n泰尔茂\t319533\n冲级\t319534\n工人体育馆\t319535\n五十两\t319536\n咽痛\t319537\n高邮文游台论坛\t319538\nherz\t319539\n入党积极分子谈话\t319540\n1.13\t319541\n马炮\t319542\n先机\t319543\n理工光科\t319544\n人教版RJ\t319545\n强仁\t319546\n天麻炖鸡\t319547\n奥马尔\t319548\n结节样\t319549\n不知名\t319550\n这地\t319551\n圭\t319552\n白菜\t319553\n霄云路8号\t319554\n相对运动\t319555\n艾滋病病毒\t319556\n热砂\t319557\n小儿鞘膜积液\t319558\n贴袋\t319559\n万炮\t319560\n83054856\t319
561\n后街男孩\t319562\n岐江\t319563\n同济大学嘉定校区\t319564\n中国排协\t319565\n祖祠\t319566\n韩大元\t319567\n陈良\t319568\n厦门市市\t319569\ndonor\t319570\n胖妞\t319571\n许\t319572\n定值电阻\t319573\n猪肘子\t319574\n腻虫\t319575\n品红\t319576\n共达\t319577\n包覆\t319578\n杜诗\t319579\n贸促\t319580\nR2数据库\t319581\n150位\t319582\n惠生集团\t319583\n脚肿\t319584\nZHS16GBK\t319585\n贝叶斯定理\t319586\n容州\t319587\n掘地求生\t319588\n吐哺\t319589\n东五环\t319590\nSYSU\t319591\n紧箍咒\t319592\n贾文革\t319593\n计算机技术与软件专业技术资格\t319594\n三神\t319595\n花韵\t319596\nItalia\t319597\n南风窗网\t319598\n中国政协\t319599\n三国群英传7演义\t319600\nhyper-v\t319601\n女心\t319602\nm^3\t319603\n花筑·花自\t319604\n蒸压灰砂砖\t319605\nhbx\t319606\n鹏元征信\t319607\n情包\t319608\n死霸\t319609\n水晶柱\t319610\n许建平\t319611\n中国交响乐团合唱团\t319612\n报名费\t319613\n胡一天\t319614\n缎纹\t319615\n贵州卫视\t319616\n吉利新博瑞\t319617\n偶寓\t319618\nProject2013\t319619\nIntercultural\t319620\n20150814\t319621\n曹长青\t319622\n四川民族学院\t319623\nExcel表\t319624\nsmxs\t319625\n武钢集团\t319626\n遭疯\t319627\n苏州供电公司\t319628\n江涛\t319629\n机修工\t319630\n神技\t319631\n诧异\t319632\n迁徙率\t319633\n军乐队\t319634\n李晓亮\t319635\n李后主\t319636\n唐汉\t319637\n贝塔值\t319638\n厦门大学材料学院\t319639\nESXi虚拟机\t319640\n李晓光\t319641\n別\t319642\n永封\t319643\n雪柳\t319644\n豆芽音乐网\t319645\nSTR\t319646\n馅料\t319647\n国家互联网信息办公室\t319648\n给外\t319649\n色色\t319650\nSothink\t319651\nVALID\t319652\n陈浩然\t319653\noverkill\t319654\n上万次\t319655\nmV\t319656\n奇瑞瑞虎8\t319657\n工作原理\t319658\n铜须\t319659\n生物岛\t319660\n侠客风云前传\t319661\n长街\t319662\n上海交通大学学报\t319663\n电盘\t319664\n灰色调\t319665\nAnnounced\t319666\n深圳地铁10号线\t319667\n论法的精神\t319668\n银座佳驿酒店\t319669\n哈弗h9\t319670\n廖毅\t319671\n美丽的春天\t319672\n六一儿童网\t319673\n永不分离\t319674\n富康\t319675\n天职国际会计师事务所\t319676\n恩\t319677\nmessenger\t319678\n心肌病\t319679\n大汇总\t319680\ndrawableleft\t319681\nAOL\t319682\n广西壮族自治区人民政府国有资产监督管理委员会\t319683\n王女\t319684\n南亚\t319685\n奥雅帕萨特\t319686\n特效师\t319687\nMascara\t319688\n综观\t319689\nNowhere\t319690\n叩首\t319691\n王雪纯\t319692\n淄博一中\t319693\n很久以后\t319694\n天碟\t319695\n地震\t319696\nDAY\t319697\n还会\t319698\n首笔\t319699\n有托\t319700\n陈晏\t319701\n放无\t319702\n苹果手
机助手\t319703\nNO.1STYLE\t319704\nkanojo\t319705\n平易\t319706\n几回\t319707\n收信\t319708\n三七网\t319709\nCurrently\t319710\n山口镇\t319711\n古建筑\t319712\n诸葛恪\t319713\n乐商\t319714\npane\t319715\n越女\t319716\n归宗\t319717\n伍斯特理工学院\t319718\nDIY手工坊\t319719\nFSF\t319720\nv2.6.4\t319721\n曼彻斯特大学\t319722\n茶源\t319723\n杨威利\t319724\nulead\t319725\n112&\t319726\n虚拟交换机\t319727\n陈巧生\t319728\n25寸\t319729\nnet域名\t319730\n江苏省商务厅\t319731\n美制螺纹\t319732\n船员\t319733\n浙大网新\t319734\npolyurethane\t319735\n结构体\t319736\n芭乐兔\t319737\n神棍\t319738\n文博宫\t319739\nFix\t319740\n西尔\t319741\n石锁\t319742\n鞭杆\t319743\n阿土伯交易网\t319744\n查用\t319745\n猎马\t319746\n方子\t319747\nWord字体库\t319748\n菌丝体\t319749\ncustom\t319750\n南塔\t319751\n15级\t319752\n充当\t319753\n吉娅\t319754\nJoints\t319755\n府城\t319756\n北京公安局\t319757\n温州大学城市学院\t319758\n不外\t319759\n四川君合律师事务所\t319760\n烤制\t319761\n孤岛惊魂2\t319762\n中国科技公司\t319763\n信息系统项目管理师\t319764\n脉轮\t319765\n黑蜘蛛\t319766\n七煞碑\t319767\n撇\t319768\n默里\t319769\n推扫\t319770\n迅雷ed2k\t319771\n800平方米\t319772\n316l\t319773\n汉能控股\t319774\n烛之武\t319775\n书法练习指导\t319776\n贵哥\t319777\n前海花园\t319778\n同系\t319779\n中山市政府\t319780\n无风\t319781\n陈雪枫\t319782\n南太平洋\t319783\n不可读\t319784\n北京地铁机场\t319785\n中国艺赛网\t319786\n外盖\t319787\n辽宁公务员考试网\t319788\n广福寺\t319789\n侄\t319790\n微微儿\t319791\nl385\t319792\n山菜\t319793\n中搜\t319794\n仓鼠\t319795\n春生\t319796\n中美贸易大战\t319797\n师长\t319798\n食道溃疡\t319799\n1272\t319800\n八角楼\t319801\n4620\t319802\n37项\t319803\n溜池ゴロ\t319804\n美言\t319805\n开跳\t319806\n小小\t319807\n普尔曼\t319808\n桃儿\t319809\n花城出版社\t319810\n误国\t319811\n34cm\t319812\n扎线机\t319813\n口袋妖怪黑2\t319814\n爬井\t319815\ntizzy\t319816\nBiological\t319817\n绝地求生沙漠地图\t319818\n中华人民共和国残疾人保障法\t319819\n马克鳗\t319820\n何仙姑\t319821\n南州\t319822\n程辉\t319823\nDocuCentre\t319824\nW530\t319825\n163v\t319826\n贵糖股份\t319827\n开发间接费\t319828\n不以为\t319829\nv4.8\t319830\nEVISU\t319831\nKruger\t319832\n宽景\t319833\n锥面\t319834\n速冻食品\t319835\n19.1\t319836\n沙滩\t319837\nPlanets\t319838\n酷睿I7\t319839\n五人墓碑记\t319840\n引闪\t319841\n马爷\t319842\n贪污贿赂罪\t319843\n实在\t319844\n语花蝶\t319845\n卡里姆\t319846\n
何晴\t319847\n三千元\t319848\n塔木德\t319849\n0.48\t319850\n混血哥\t319851\n粤A\t319852\nmathtype6.9\t319853\n绿健\t319854\n有志者\t319855\n切游网\t319856\n宁波市鄞州区人民法院\t319857\nmysql5.6\t319858\n校花被绑架\t319859\n320MMGH\t319860\n巴南区\t319861\n组织细胞\t319862\n沃尔一路\t319863\nLB\t319864\n乙瑛碑\t319865\n梅州市人力资源和社会保障局\t319866\n蒋小涵\t319867\n揣摩\t319868\n雅思机经\t319869\n礼碑\t319870\n仇杀队\t319871\nkept\t319872\n57漫画网\t319873\n基金管理有限公司\t319874\n乐课网\t319875\n一课后\t319876\n劳务派遣许可证\t319877\n上三年\t319878\n074\t319879\n天秀\t319880\nBING\t319881\nCTP\t319882\n达芬奇机器人\t319883\n替诺\t319884\n碳14呼气试验\t319885\n站起来\t319886\n增考\t319887\n泥灸\t319888\n羊亭\t319889\nrevenues\t319890\ncraig\t319891\n三维建模\t319892\n济南市社会保险事业局\t319893\n第166集\t319894\n优立\t319895\n孤光\t319896\n子网数\t319897\n不悔\t319898\n秀士\t319899\nHeadquarters\t319900\n1282\t319901\n哈蒙\t319902\n智通财富网\t319903\nreporter\t319904\n嶂石岩\t319905\n全龄\t319906\n长沙市环境保护局\t319907\n新长征突击手\t319908\n郑成河\t319909\n植物大战僵尸2功夫世界\t319910\n中国兵器工业集团有限公司\t319911\n11.2.7\t319912\n宠物兔\t319913\nSoftBank\t319914\nShame\t319915\n1740\t319916\n分布式机器学习\t319917\n宝马5系论坛_汽车之家论坛\t319918\n卫道\t319919\nsim800c\t319920\n1【\t319921\nTHRU\t319922\n盘\t319923\n乱伦片\t319924\n郭松民\t319925\n一起走过\t319926\n3.82\t319927\n长安CX70论坛论坛\t319928\n天通一号\t319929\nBrisbane\t319930\nfifiyong\t319931\n爱福窝\t319932\n解决者\t319933\n王者荣耀实名认证\t319934\n100组\t319935\n66cm\t319936\n海绵公园\t319937\nlevy\t319938\n美国中文\t319939\n鄂尔多斯东胜\t319940\n氯吡格雷\t319941\n深圳网络公司\t319942\n轴轮\t319943\n过去三年\t319944\n70_\t319945\n金丝皇菊\t319946\n塘栖古镇\t319947\n黑龙江科技大学\t319948\n新视野大学英语听说教程\t319949\nVictory\t319950\n独角\t319951\nv2.2.2\t319952\n40天后\t319953\n巴斯巴\t319954\n公共端\t319955\nfeast\t319956\n九曲红梅\t319957\n武斗\t319958\nmongoDB\t319959\n哲哥\t319960\nbrute\t319961\n张鹏\t319962\nchapel\t319963\n江苏海事局\t319964\nsniper\t319965\n鲍鱼汁\t319966\nclassloader\t319967\n507号\t319968\n医学免疫学\t319969\n偶像来了\t319970\n漏型\t319971\n400名\t319972\n凌菲\t319973\ngck\t319974\n蛋卷\t319975\n微步\t319976\n闽侯\t319977\n莎莉\t319978\nJO\t319979\nUnit7\t319980\n天地争霸美猴王\t319981\n南帝北丐\t319982\n大燕网\t319983\nxizi\t
319984\n泳姿\t319985\n秋叶网\t319986\ncredits\t319987\n邮件群\t319988\n朝阳便民网\t319989\n应用化学专业\t319990\nhttpserver\t319991\n法拉利主题公园\t319992\n喝奶\t319993\n裸照\t319994\n晶格化\t319995\n广东省建筑设计研究院\t319996\n口语\t319997\npta\t319998\n50只\t319999\n天使之泪\t320000\n尝尽\t320001\n刘永富\t320002\n今典集团\t320003\n雕件\t320004\n说实话\t320005\n先宠\t320006\n爱奇文学\t320007\n韩前\t320008\n墙膏\t320009\n第9部\t320010\n93度\t320011\n章泽天\t320012\n复习题\t320013\n乐画\t320014\n十二金仙\t320015\n一加一\t320016\n80963333\t320017\n膝长\t320018\n外加工\t320019\n石化厂\t320020\n957\t320021\nWish邮\t320022\n上下文\t320023\n穆穆\t320024\n煎饼馃子\t320025\ntan\t320026\n假说\t320027\n_沃游网\t320028\n60v20ah\t320029\n四声\t320030\n融资融券余额\t320031\nBoston\t320032\n显卡坞\t320033\n广州海洋地质调查局\t320034\nc++2008\t320035\n地大物博\t320036\n包装膜\t320037\n诺和灵\t320038\nPrimoCache\t320039\n16年5月\t320040\n休书\t320041\n名校生\t320042\n爪机\t320043\n大经济\t320044\nBluemix\t320045\nmac播放器\t320046\n青西\t320047\nRide\t320048\n麦久彩票网\t320049\n李天霞\t320050\n山茶\t320051\n水都网\t320052\n7y30\t320053\n屯留县\t320054\n群酯胶囊\t320055\n彩草\t320056\n丽波\t320057\n八季\t320058\n希尔顿欢朋酒店\t320059\nKIKS\t320060\n16x\t320061\n刘韬\t320062\n污秽\t320063\n情遇\t320064\n兴安盟\t320065\n云和恩墨\t320066\n邮币卡互动网\t320067\n评职\t320068\n校花级\t320069\n鼻影\t320070\n何较\t320071\n限购\t320072\n鹿晨辉\t320073\n草酸铵\t320074\njulia\t320075\n荣耀3\t320076\n沈以诚\t320077\n颐和路\t320078\n亿起\t320079\n质量效应2\t320080\n医学影像园-影像园\t320081\n魔法门之英雄无敌7\t320082\n最弱无败神装机龙\t320083\n京西\t320084\nconvolution\t320085\n尤其是\t320086\n合肥工大\t320087\n顶尖\t320088\n包打飞机\t320089\n猿题库\t320090\n适嘉网\t320091\n草山\t320092\nproforma\t320093\n杨奇龙\t320094\n财富号\t320095\nBD影音先锋\t320096\n人民政\t320097\n610.com\t320098\n奖励金\t320099\n机械波\t320100\n气动蝶阀\t320101\n考古学家\t320102\n反垄断局\t320103\n核弹\t320104\n金刚烷\t320105\n美国富国银行\t320106\n钢筋梁\t320107\nalot\t320108\n与青春有关的日子\t320109\n拉普拉斯方程\t320110\nhujiang\t320111\n常数项\t320112\n家庭医生\t320113\nXtrabackup\t320114\n江西省水利厅\t320115\n赤水镇\t320116\n败绩\t320117\nhsf\t320118\n戴克\t320119\n散工\t320120\n十八里河\t320121\n苦丁\t320122\n不知不觉\t320123\n设座\t320124\n江苏省苏州地方税务局\t320125\n火地岛\t320126\n镭射\t320127\
n特洛伊\t320128\n江雷\t320129\n妙探\t320130\n站酷旗\t320131\n第一卫\t320132\n军工级\t320133\n外语教学与研究出版社\t320134\nSupplements\t320135\n娜依灵儿\t320136\n长安汽车站\t320137\nmellow\t320138\n保山学院\t320139\n王哲林\t320140\n剑陵\t320141\n小忍者\t320142\n质心教育\t320143\n软科\t320144\n大力哥\t320145\n巴里亚\t320146\n战绩\t320147\n猪鼻龟\t320148\n扬州人才网\t320149\n土建工程\t320150\n鬼剑\t320151\nfeign\t320152\n性用品\t320153\n李王\t320154\n氧化锂\t320155\n王蓓\t320156\n30G\t320157\n数学卷\t320158\n官杀\t320159\n◣\t320160\n铝合金门窗\t320161\n电桥电路\t320162\n1.3_\t320163\n呖咕呖咕新年财\t320164\n旅行卡\t320165\nv2.5\t320166\n火龙草\t320167\n10.12.3\t320168\n应承\t320169\nOctane渲染器\t320170\nmif\t320171\n巨鼎\t320172\n2016.3.1\t320173\n王益区\t320174\n蜜枣\t320175\n二端\t320176\n华娱\t320177\n吴胜利\t320178\n汕头经济特区\t320179\nAccent\t320180\n正门\t320181\n芝麻丸\t320182\n记录在案\t320183\n沉桩\t320184\n吉林广播网\t320185\n72DJ舞曲网\t320186\n第4款\t320187\n生死相随\t320188\nap\t320189\n然而\t320190\nquality\t320191\n龙湖天璞\t320192\n锥头\t320193\n台儿庄路\t320194\n龙潭大峡谷\t320195\n藏红花\t320196\n主持人稿\t320197\nQQ飞车手游霸天虎\t320198\nWYSIWYG\t320199\n暗黑城\t320200\n藝術\t320201\n举法\t320202\n男大学生\t320203\n控制术\t320204\n爱的就是你\t320205\n鹦鹉鱼\t320206\nQQ秀\t320207\n68160020\t320208\n豌豆\t320209\n快班\t320210\n被键盘\t320211\n/FONT\t320212\n中谷美纪\t320213\n喇荣\t320214\n抢答赛\t320215\n犹疑\t320216\nslua\t320217\n职业学\t320218\npy2neo\t320219\n知乐\t320220\n镇村\t320221\n组曲\t320222\n庞贝\t320223\n泣\t320224\n善根\t320225\n羽悦本草瘦瘦包\t320226\n飘阿兮\t320227\n二九小说网\t320228\n香港片\t320229\n明后\t320230\n眼盲\t320231\n暴鸡\t320232\n谢瑞麟\t320233\nGamers\t320234\nUASB\t320235\n三数\t320236\n九合一\t320237\n手捧\t320238\n更漂亮\t320239\ngyl\t320240\n广东技术师范学院\t320241\n长评\t320242\n俏丽网\t320243\n大加\t320244\n至味\t320245\n壮阳\t320246\n江苏省妇幼保健院\t320247\n2.5\t320248\n忠实\t320249\n采矿权\t320250\n难度\t320251\n时支\t320252\n高手们\t320253\n上海第六人民医院\t320254\n预言g版\t320255\n唱腔\t320256\n深篮\t320257\n工业丝\t320258\n贵州恒丰智诚\t320259\n打票机\t320260\n山东泰通升降机械有限公司\t320261\nWAN\t320262\n新沙路\t320263\n一泓\t320264\n上海上幼儿园\t320265\n兖州政府\t320266\n东西南北中\t320267\n时势\t320268\neasyicon\t320269\n广东省旅游局\t320270\n娱乐广播网\t320271\n320GB\t320272\n松油醇\t320
273\n一米\t320274\n世界癌症日\t320275\n狗毛\t320276\n聚能\t320277\nGoland\t320278\n袁咏仪\t320279\np0\t320280\n判别分析\t320281\n13P\t320282\nWin7系统之家\t320283\nzzm\t320284\n乌兰浩特市\t320285\n九溪玫瑰园\t320286\n宜都市政府\t320287\nPySide\t320288\n亲吻\t320289\n5周\t320290\nadapted\t320291\n河姆渡镇\t320292\nWps\t320293\n哈雷彗星\t320294\n李艾佳\t320295\ncult\t320296\n20170214\t320297\n中筒\t320298\n20161116\t320299\n西安软件园\t320300\nWII模拟器\t320301\n汉马\t320302\n第三十一集\t320303\nwin10+\t320304\n肝胃\t320305\n冠名权\t320306\n社区护理学\t320307\n豪鬼\t320308\n存单质押贷款\t320309\n停下\t320310\n鼻通\t320311\n1.0.36\t320312\n8.27\t320313\ncurated\t320314\n非即\t320315\n1.30\t320316\n修复学\t320317\n无境\t320318\n乐粉\t320319\n七千万\t320320\nyunnan\t320321\n旭辉\t320322\n仙湖植物园\t320323\n二尾\t320324\n丢弃\t320325\n正儿八经\t320326\n长白县\t320327\n肉末豆腐\t320328\n娱乐星天地\t320329\n数第一\t320330\nsung\t320331\n倒闭\t320332\n10回\t320333\n驾\t320334\n计\t320335\n微喜剧\t320336\n空色\t320337\n明仔\t320338\n爱洛阳网\t320339\nWO\t320340\nOpenMP\t320341\nAnger\t320342\nyota\t320343\n浙江工商大学\t320344\n顺流逆流\t320345\n升级线\t320346\n青龙古镇\t320347\nterre\t320348\nantconc\t320349\n乖诺诺\t320350\n污普\t320351\n手绘画\t320352\n砂光\t320353\n3600元\t320354\n初回版\t320355\ndeaf\t320356\n_聚荣网\t320357\n11000元\t320358\n崩坏学园3吧\t320359\n商路通\t320360\n假释\t320361\n鹿头\t320362\n老虎油\t320363\n黄土高原\t320364\n帕帕丁\t320365\n1462\t320366\n甲方公司\t320367\n景鸿\t320368\n4.0.9\t320369\n射吸式\t320370\n青杠树村\t320371\n步距角\t320372\n后羿射日\t320373\n2018社区工作者考试\t320374\n樟树\t320375\ngpu版\t320376\nmute\t320377\nMellanox\t320378\n泾源\t320379\n盈润\t320380\n硕果仅存\t320381\nzigbee\t320382\n胡文\t320383\ninfobright\t320384\njanet\t320385\n201510\t320386\nSNIS\t320387\n刘成章\t320388\nCETV\t320389\nreconnect\t320390\n妙战\t320391\nphysicians\t320392\n色色网\t320393\n畏惧\t320394\n流放之路POE\t320395\n韩国大使馆\t320396\n显示板\t320397\n娱乐至死\t320398\n李广利\t320399\nentertain\t320400\nbase64编码\t320401\n娜塔丽雅\t320402\n大云山\t320403\n克林顿\t320404\ncbi\t320405\n灰调\t320406\nDress\t320407\n幼猫\t320408\nTMALL\t320409\n|体\t320410\n如雷贯耳\t320411\nsounding\t320412\n科技岗\t320413\n羊刀\t320414\n帝君\t320415\n乒羽\t320416
\n广州涉外经济职业技术学院\t320417\n李蕊\t320418\n排盘\t320419\nPSR\t320420\n308\t320421\n栖迟\t320422\n陶冲湖\t320423\n8005\t320424\n购线网\t320425\n口袋妖怪白金版\t320426\nGridLayoutManager\t320427\n释迦\t320428\nCurves\t320429\n腌糖蒜\t320430\n视感\t320431\n八脉\t320432\nCounting\t320433\n沿海地区\t320434\n范丞\t320435\ngaosan\t320436\nlinuxmint\t320437\n水平衡\t320438\n荣耀畅玩7x\t320439\n上海世界博览会\t320440\n依那普利\t320441\n奔驰E300L\t320442\n马鞍山森林公园\t320443\n彩花\t320444\n栏语\t320445\n商务函\t320446\n克莱德曼\t320447\n黄埔东路\t320448\n史部\t320449\n7719\t320450\n武平一中\t320451\n守夜\t320452\n淘图网\t320453\n吸收\t320454\n团伙伙\t320455\n华士镇\t320456\n兼任\t320457\n凭证\t320458\n白凯南\t320459\n门形\t320460\n阿里图标库\t320461\nshw\t320462\n思想家\t320463\n拉夫罗夫\t320464\n后肢\t320465\n中国矿业\t320466\n气矿\t320467\n会昌\t320468\nachieve\t320469\n飞鹤\t320470\n十五年后\t320471\n法国兴业银行\t320472\n烫画机\t320473\n狗男\t320474\nxg3\t320475\n埋弧焊\t320476\n大自然\t320477\n琼中\t320478\nEroACG\t320479\nPockets\t320480\n吧嗒\t320481\nCarCAV\t320482\n广州搬家公司\t320483\nMATRIX\t320484\nAway\t320485\n平动\t320486\nedit\t320487\n中节能\t320488\n徐州市国土资源局\t320489\n游标\t320490\n神经性\t320491\n迈特\t320492\n王冬\t320493\nKBG\t320494\nRxjava\t320495\n长亭科技\t320496\n中国水稻研究所\t320497\n所有权\t320498\n牛呗\t320499\n狼学派\t320500\nPSS\t320501\n丈亭\t320502\n中央财经大学研究生院\t320503\nsimcity\t320504\n新矿集团\t320505\n秦皇岛市经济技术开发区\t320506\n中梁地产集团\t320507\n西安房产网\t320508\n骨骼肌细胞\t320509\ngreek\t320510\n两回\t320511\nf60\t320512\nyoosee\t320513\n贾俊平\t320514\n9.52\t320515\n常州市政府\t320516\n沈峰\t320517\nUF\t320518\n杂锦\t320519\n吊门\t320520\n燕子矶街道\t320521\n铜陵职业技术学院\t320522\n中国汽车报网\t320523\n商墅\t320524\nLEC\t320525\n87882665\t320526\n梵讯\t320527\npycharm2017\t320528\n守军\t320529\n河北大街\t320530\n点点通\t320531\n林总\t320532\n购销员\t320533\n口红吊兰\t320534\n旋风女队\t320535\n约翰塞纳\t320536\n息马\t320537\n神仙眷侣\t320538\n诗歌\t320539\n涪陵\t320540\n北京场\t320541\n陶然\t320542\npromise\t320543\n122考试\t320544\n小羊皮\t320545\n收机\t320546\nCodex\t320547\nexcel表格\t320548\nftv\t320549\n九水\t320550\n破折君\t320551\ngoquery\t320552\n神探亨特\t320553\n府谷县人民政府\t320554\ncache\t320555\n龙拳\t320556\n埃塞尔比亚\t320557\n漯\t320558\nD
emo\t320559\n萍乡\t320560\n为宗旨\t320561\n3dmax8\t320562\nsa\t320563\n徐云丽\t320564\n德罗巴\t320565\n优柔\t320566\n周岁计算器\t320567\n锚链\t320568\n布格缪勒\t320569\n三浦芽依\t320570\n圆圈状\t320571\n猫本\t320572\nMelo\t320573\n宅警\t320574\n国号\t320575\n孢子银河大冒险\t320576\n黑发\t320577\n张占斌\t320578\n谋攻\t320579\n十五厘米\t320580\nPVE加点\t320581\n牛磺酸\t320582\n4颗\t320583\n兰越峰\t320584\n阜新百姓网\t320585\n战万\t320586\n鼠猫\t320587\n无锡人事考试网\t320588\n2015年4月20日\t320589\n分配器\t320590\n48本\t320591\n武当剑\t320592\n文豪\t320593\n宠物食品\t320594\n11.27\t320595\n云南国土资源职业学院\t320596\n聚类\t320597\n丰台区东\t320598\n遗容\t320599\n天涯摄影\t320600\n耽美窝\t320601\n涤纶\t320602\n比亚武动乾坤\t320603\n两卖\t320604\n红磡火车站\t320605\n第2层\t320606\n学案\t320607\n1.35G\t320608\n十八层\t320609\n沙坪坝大学城\t320610\ncylindrical\t320611\n恋人心\t320612\n超意兴\t320613\n7则\t320614\n2016年5月26日\t320615\n南云\t320616\n4次方\t320617\n万科大厦\t320618\n马关县\t320619\nRemain\t320620\n寸拳\t320621\n五台山\t320622\n湛江吴川政府\t320623\ninstron\t320624\n风电机组\t320625\n变男\t320626\n我的妹妹不可能那么可爱\t320627\n骑式\t320628\n可萌\t320629\n网通社\t320630\n床上片\t320631\n溴水\t320632\n托利多\t320633\n黔西县政府\t320634\n麦太保\t320635\n新中式家具\t320636\n阿迪三叶草\t320637\n健脾丸\t320638\n河底\t320639\nLukas\t320640\n197\t320641\n恩平市政府\t320642\nWinDbg\t320643\n纱帘\t320644\nseries1\t320645\n女魔王\t320646\n保乳\t320647\n短刀\t320648\n假凤虚凰\t320649\nzzw\t320650\nhandbook\t320651\n日语专业\t320652\nUnforget\t320653\n冠林\t320654\n电动试压泵\t320655\n8079\t320656\nfaiz\t320657\n豪宅税\t320658\n美尔\t320659\nncs\t320660\n毫州\t320661\n落锁器\t320662\n柴玲\t320663\ncentos\t320664\n重庆交通大学\t320665\n各一个\t320666\n花甲之年\t320667\n500.00\t320668\n短身\t320669\nDrama\t320670\n石浦镇\t320671\nvCard\t320672\n#墓王之王#英雄救美\t320673\n走台\t320674\n南京三中\t320675\n龙王塘\t320676\n谍照\t320677\n元昊\t320678\n格拉斯哥艺术学院\t320679\n天天好彩\t320680\n金融资产交易所\t320681\n特朗德尔\t320682\n翠竹路\t320683\n深入人心\t320684\nopportunities\t320685\n黄草\t320686\n阿陨\t320687\n终极末日\t320688\nWDR\t320689\n基金重仓股\t320690\n新浪体育综\t320691\n邮政业\t320692\nkhan\t320693\n白斩鸡\t320694\n儿童剧\t320695\n想尽\t320696\n花萝\t320697\n5.9.5\t320698\n江叔\t320699\ncommunity\t320700\nfea\t320701\nB350\t3
20702\n克莱-汤普森\t320703\n皮纸\t320704\nSeats\t320705\n摇旗呐喊\t320706\n很厉\t320707\n杭州市口腔医院\t320708\n圆条\t320709\nNova\t320710\n引流术\t320711\n长联\t320712\n陕西高速集团\t320713\n手机斗地主\t320714\n区间\t320715\n星鸿娱乐\t320716\n600891\t320717\n传诵\t320718\n三天之内\t320719\n葬礼\t320720\n海游馆\t320721\n抄表\t320722\n爱普生l310\t320723\njuki\t320724\n风彩\t320725\n改一改\t320726\n税务局\t320727\npathofexi\t320728\n硅酸钠\t320729\nzukz2\t320730\n失明\t320731\nsas9.4\t320732\nSata2\t320733\n广鹿岛\t320734\n霍春阳\t320735\n止吐\t320736\n千山茶客\t320737\nlec\t320738\n奥维云网\t320739\n6成\t320740\n米帝\t320741\nsite-packages\t320742\n绕组变压器\t320743\nVMP\t320744\n陈宝珠\t320745\n电大形考\t320746\n兽耳娘\t320747\n罗曼斯\t320748\n语言包\t320749\nIphonex\t320750\n伙委会\t320751\nHeize\t320752\n第三十届\t320753\n猎龙\t320754\n成都市武侯区人民政府\t320755\n职业道路\t320756\n七人\t320757\n陆金所\t320758\n临沂大学\t320759\n双转子压缩机\t320760\n清华大学校史馆\t320761\n岳雷\t320762\n盗猎者\t320763\n峨眉传奇\t320764\n吸血鬼大君\t320765\n少货\t320766\n效应管\t320767\n徐剑秋\t320768\n手有余香\t320769\nKiehl's\t320770\n小蜜蜂紫草膏\t320771\n桑麻\t320772\n清华\t320773\ni春秋学院\t320774\n万达文华\t320775\n深圳开立生物医疗科技股份有限公司\t320776\n流法\t320777\n骄兵\t320778\nTSX\t320779\n袁大头\t320780\n花销\t320781\nclever\t320782\n多媒体\t320783\n空方\t320784\n雷克萨斯RX论坛\t320785\nCleanMyMac3\t320786\n歇息\t320787\n桂林米粉\t320788\n迎宾机器人\t320789\n元祖股份\t320790\n少女时代允儿\t320791\nL5640\t320792\n中国电力\t320793\n安健\t320794\n言值\t320795\nleiting\t320796\nGauge\t320797\n李宇春\t320798\nv1.1\t320799\n市中\t320800\n成都金融控股集团有限公司\t320801\n第七十三条\t320802\npha2\t320803\n昆明供电局\t320804\n0xFF\t320805\n扎娜\t320806\n外围女\t320807\n思妍\t320808\nX100\t320809\n杭州教育网\t320810\n肉搏\t320811\n夏侯徽\t320812\n自科\t320813\n星空顶\t320814\n旋风少女第二季\t320815\n明安\t320816\n魔都风云\t320817\n出去走一走\t320818\n彩虹城\t320819\n5888\t320820\n1型\t320821\n中国物业管理协会\t320822\n打拍子\t320823\n拜复乐\t320824\n牛女\t320825\n泡泡机\t320826\nmumu\t320827\n吉林农业科技学院\t320828\nmdHd\t320829\ntop20万\t320830\nUSB风扇\t320831\nn5\t320832\n米秀\t320833\n程英\t320834\nSketchbook\t320835\n换新\t320836\n传习\t320837\n其五\t320838\n没办法\t320839\n列联表\t320840\nU-Mail邮件服务器\t320841\n中高端\t320842\n新流星花园\t320843\n王丰\
t320844\nsama\t320845\n斯鑫良\t320846\n猎弓\t320847\n痣相\t320848\n千帆\t320849\nbmp\t320850\n杏仁\t320851\n一兵\t320852\nUnhandled\t320853\n储君\t320854\n邓楠\t320855\n刘希媛\t320856\n香菱\t320857\nmaya2017\t320858\n崇圣寺\t320859\nwf2018\t320860\n填发\t320861\n惨剧\t320862\n金融街街道\t320863\nUnbelievable\t320864\nSimplify3D\t320865\n积淀\t320866\n金巴兰\t320867\n英论阁\t320868\n205个\t320869\n中大乐透\t320870\n鉴证\t320871\n张渠\t320872\n菲克\t320873\n江苏分公司\t320874\n6月末\t320875\n小米4A\t320876\n放声大哭\t320877\n昆明池\t320878\n无人能及\t320879\n董浜\t320880\n广胜寺\t320881\n火炮\t320882\n1080p.HD\t320883\n阶下囚\t320884\n老腔\t320885\n雨天\t320886\n香碱\t320887\n包公祠\t320888\n分母\t320889\n沙娜拉传奇\t320890\n库车\t320891\nsmartsvn\t320892\n6.3.3\t320893\n床尾\t320894\n2051\t320895\n普隆德\t320896\n衡水北站\t320897\n航标\t320898\nmeaningful\t320899\n乳酸左氧氟沙星片\t320900\n.net3.5\t320901\n买进\t320902\n叶乔\t320903\n雷神\t320904\n预备费\t320905\n5.70\t320906\n补阴\t320907\n滨江\t320908\nPPYPP\t320909\n神舟电脑\t320910\n高谈阔论\t320911\n8轮\t320912\n习友路\t320913\n私募焦点\t320914\n保时捷_\t320915\n白毛女\t320916\n海滨栈道\t320917\n珠光村\t320918\n红帽\t320919\n留影\t320920\n见识\t320921\n十三首\t320922\n没声\t320923\niPhone5s\t320924\n2018年1月16日\t320925\n偏执型\t320926\n游鱼\t320927\n窗纸\t320928\nCGA\t320929\n快气\t320930\n大筒木辉夜\t320931\n友谊社区\t320932\n热物性\t320933\n猫脸\t320934\n喷昔洛韦乳膏\t320935\n独脚\t320936\n第三个\t320937\n无标度\t320938\nturbotax\t320939\n嘴里\t320940\n澳剧\t320941\n有限责任公司股权转让\t320942\ncsh\t320943\n天狐\t320944\nSauna\t320945\n乳腺瘤\t320946\n徐凯文\t320947\n自连接\t320948\n剑侠情缘1\t320949\n东滨路\t320950\n猫眼石\t320951\n丰潭路\t320952\n厚片\t320953\n乐师\t320954\n第一星座网\t320955\n张婧\t320956\nJenkins\t320957\nXdebug\t320958\n熊出没\t320959\n显出\t320960\n医托\t320961\n金辉大厦\t320962\n7080\t320963\n7.0.52\t320964\n针刺感\t320965\n细菌性感染\t320966\n强度高\t320967\n抽调\t320968\n百菌\t320969\n18颗\t320970\nAndroid5.0\t320971\n匾额\t320972\n助工\t320973\n国有控股公司\t320974\n夹克衫\t320975\n无参\t320976\n爱学贷\t320977\n暑热\t320978\n新西站\t320979\n芭芭拉·布什\t320980\n吃铅\t320981\n郑峰\t320982\n金融家\t320983\n商丘医学高等专科学校\t320984\ntasking\t320985\n窑头\t320986\nVisualGDB\t320987\n牧场之国\t320988\nFriend\t32098
9\n君无戏言\t320990\n再生缘\t320991\n做工\t320992\n泥猴桃\t320993\n大丫鬟\t320994\n调增\t320995\n超远\t320996\nacademia\t320997\n105国道\t320998\n温彻斯特鬼屋\t320999\ncalling\t321000\n红荔路\t321001\n若干种\t321002\n建工\t321003\n东皇钟\t321004\n400V\t321005\n武定门\t321006\n新宿事件\t321007\nwebbuilder\t321008\ntoolkit\t321009\n逆水行舟\t321010\n福建路\t321011\npupil\t321012\nxuzhou\t321013\n重会\t321014\nKei\t321015\n王清培\t321016\n染色体\t321017\n半身裙\t321018\n吗啉\t321019\nexcel2007\t321020\n木勺\t321021\n质炭\t321022\nSED\t321023\n总冠\t321024\n绝地求生刺激战场s1\t321025\n5350\t321026\n车头\t321027\nipc$\t321028\n记忆卡\t321029\n用户口本\t321030\n小说家\t321031\n嘉琳\t321032\n和合承德网\t321033\n刘戈\t321034\nevenicle\t321035\n体制性\t321036\n假体隆胸\t321037\nIrvine\t321038\n干式\t321039\n维生素d\t321040\nNoNpDrm\t321041\n神机妙算\t321042\n毛乌素沙漠\t321043\nPCCP\t321044\n140斤\t321045\ntaohua\t321046\n高德地图车镜版\t321047\n传控\t321048\npathvariable\t321049\nobjcopy\t321050\n桂林电子科技大学信息科技学院\t321051\n2.4A\t321052\n随感\t321053\n塑形\t321054\n四逆汤\t321055\nx8ti\t321056\n蛾蠓\t321057\noptimus\t321058\n京腔\t321059\n狞\t321060\n306号\t321061\n紫金华府\t321062\nandroid6\t321063\n魔方格\t321064\nwashing\t321065\n约等号\t321066\n鸡舍\t321067\n奥克斯集团\t321068\n谜尚\t321069\nosx\t321070\n新闻大求真\t321071\n第三十五期\t321072\nmathworks\t321073\n值机\t321074\n惠农学堂\t321075\ncacl2\t321076\n军刺\t321077\n蝙蝠侠前传2\t321078\nftp://ygdy8\t321079\ncai978\t321080\nTeeChart\t321081\n500L\t321082\n核保\t321083\n柴鸡蛋\t321084\n精灵4pro\t321085\n戈尔隆德\t321086\nsocks\t321087\n古墓丽影\t321088\n六十多\t321089\nFs\t321090\n风度男人网\t321091\n2万多元\t321092\n东方港湾\t321093\n西延线\t321094\n青湖\t321095\n裨益\t321096\n树梅派\t321097\n高柏\t321098\n余票\t321099\n0131\t321100\n吉阳\t321101\nANE\t321102\n冷泡\t321103\n广昆高速\t321104\nGenes\t321105\n类文\t321106\n激战2吧\t321107\n不舒服\t321108\n哥布林杀手外传\t321109\n上外附小\t321110\n曾家岩\t321111\n第74期\t321112\nXftp5\t321113\n第380页\t321114\n大娃\t321115\n商鼎\t321116\n汇通驾校\t321117\n鲁教基\t321118\n透过率\t321119\n凌阳\t321120\n哆嗦\t321121\n苏奥\t321122\n细胞学\t321123\n公诉科\t321124\n宝者\t321125\n暴\t321126\n打理\t321127\n临界点\t321128\n施柏宇\t321129\n王菲倪萍\t321130\n浙版\t321131\nTI8\t321132\
n75cm\t321133\nfetching\t321134\n韬\t321135\n爱立信\t321136\nsqa\t321137\n丙寅日\t321138\n4.9元\t321139\n小雨伞漂流记\t321140\n手脚\t321141\n上侧\t321142\n51Aspx\t321143\n将士\t321144\n烽火连城\t321145\n二级\t321146\n北漂们\t321147\n百姓健康网\t321148\n五义\t321149\n毛细血管壁\t321150\n这局\t321151\n奥林匹克\t321152\n北京市盈科律师事务所\t321153\n5月1日以后\t321154\n河南大学淮河医院\t321155\n生血宁片\t321156\n血球计数板\t321157\n_63手工网\t321158\nwordpres\t321159\n中非合作论坛\t321160\nvariation\t321161\n生活照\t321162\n食人狂魔\t321163\nshg\t321164\nClayderman\t321165\n5代\t321166\n上海青旅\t321167\nlibin\t321168\n税延型\t321169\n铁通\t321170\nsexy\t321171\n孝义市\t321172\n应答\t321173\nnac\t321174\n神鬼八阵图\t321175\n套账\t321176\n可为\t321177\n重新开\t321178\n张静静\t321179\n10d\t321180\n神仙鱼\t321181\n广东郁南县政府\t321182\n双冠王\t321183\n超级大明星\t321184\n90部\t321185\n低碳日\t321186\n外科\t321187\nChad\t321188\n尿频尿急\t321189\nDRIVERS\t321190\n吕梁学院\t321191\n冰粥\t321192\nwifi信号\t321193\nb85m-g\t321194\nToolset\t321195\n第52号\t321196\n規格\t321197\n蝶儿\t321198\nQObject\t321199\n玩乐\t321200\n清平调\t321201\n二进制\t321202\n骨头\t321203\n刘东强\t321204\n関西\t321205\n女英雄\t321206\n40艘\t321207\n王明清\t321208\n中共广州市委\t321209\n【日\t321210\n竖锯\t321211\n华中科技大学管理学院\t321212\n卡塞米罗\t321213\n别小看\t321214\n赛莱拉\t321215\n跟我来\t321216\n居民楼\t321217\n捷途教学部\t321218\n饥饿鲨世界\t321219\n北京运通\t321220\nMIPI接口\t321221\n汪道涵\t321222\n第四位\t321223\nGEE\t321224\nunivers\t321225\n铝合金箱\t321226\n吴海军\t321227\n倒放\t321228\n优购物电视\t321229\necmascript\t321230\nskl\t321231\n山阴\t321232\n机械科学研究总院\t321233\nTopSolid\t321234\n点射\t321235\n航司\t321236\n颐和\t321237\n迦\t321238\nrun\t321239\nZhangRuoxu\t321240\n鼓楼大街\t321241\n师师\t321242\n金狮奖\t321243\n妨\t321244\n吕波\t321245\n尿检板\t321246\n焊钳\t321247\n太仓市\t321248\nPaytm\t321249\n露塞提娅\t321250\ngle\t321251\n青豆\t321252\nFeSO4\t321253\n李伟民\t321254\n江波龙\t321255\nPart3\t321256\n梁模板\t321257\n科技港\t321258\n邹鹏\t321259\n中展网\t321260\n唐明皇\t321261\n嘉腾\t321262\n23日\t321263\n洛奇英\t321264\n2.54\t321265\n小铭\t321266\n明人\t321267\n熊丙奇\t321268\n尿垢\t321269\nUFT\t321270\n圈椅\t321271\nLLDPE\t321272\n300450\t321273\n北恩\t321274\nSwaggerUI\t321275\n胎盘片\t321276\n帅康\
t321277\n演论\t321278\n新博越\t321279\n社科类\t321280\n胜利西路\t321281\n光通讯网\t321282\n虐泉\t321283\n长春职业技术学院\t321284\n惠灵顿牛排\t321285\n与此\t321286\n生物\t321287\nZVS\t321288\n严新\t321289\n愣住\t321290\nfoundry\t321291\n哀怨\t321292\n现值\t321293\n藤椒油\t321294\n46000\t321295\n70mm\t321296\n2013-08-30\t321297\n2116\t321298\n美行导航\t321299\n回魔\t321300\n电大\t321301\n熬夜\t321302\n我院\t321303\nhuangenai\t321304\n中信信托有限责任公司\t321305\n毕业时间\t321306\n何太\t321307\n东北商贸网\t321308\n纤弱\t321309\nMilton\t321310\n训练舰\t321311\n古典吉他谱\t321312\n1015\t321313\n重头\t321314\n12.11\t321315\nTopgear\t321316\nxiyou\t321317\n好困\t321318\n#5\t321319\n表演版\t321320\n夏景\t321321\nxlnx\t321322\n第10层\t321323\n血脂\t321324\n寄送\t321325\n弥茶\t321326\n整顿\t321327\n翻印\t321328\n房地产法\t321329\n鹏华\t321330\n慢卡\t321331\n花生壳\t321332\n命令符\t321333\n风针\t321334\n77天\t321335\n有害食品罪\t321336\n速腾1.6\t321337\n徐州市财政局\t321338\n肺腺癌\t321339\nmeb\t321340\n真空泵油\t321341\n金松\t321342\n双身\t321343\nsst\t321344\nembryonic\t321345\n狂犬病免疫球蛋白\t321346\n茶马古镇\t321347\n隋文静\t321348\n20万亿\t321349\ngrown\t321350\n电商记\t321351\n景德镇陶瓷大学\t321352\n童袜\t321353\n巴拉克\t321354\n修水\t321355\n贷款者\t321356\nnote2吧\t321357\n孙君\t321358\ng11\t321359\n呐喊者\t321360\n指示标\t321361\n警告\t321362\n钟波\t321363\n含糖量\t321364\nNX9.0\t321365\n170526\t321366\n干硬性\t321367\n魔古山\t321368\n张轩睿\t321369\n第几声\t321370\n善始善终\t321371\n山东能源集团有限公司\t321372\n私募排排网simuwang.com\t321373\n凯文\t321374\n披荆斩棘\t321375\nbom头\t321376\n9分钟\t321377\n契税\t321378\nresulted\t321379\n苹果核\t321380\n重庆市政府\t321381\n罗浮山\t321382\n四川大学生物治疗国家重点实验室\t321383\nordeal\t321384\n描述子\t321385\n鼓足干劲\t321386\n解州\t321387\n鱼头豆腐汤\t321388\n深圳大学城\t321389\n汇刊\t321390\nscrapy\t321391\n禁烧\t321392\n#彪哥闯奉天\t321393\n太平洋电脑网摄影部落\t321394\nusb控制器\t321395\n平端\t321396\n清新\t321397\n甘霖娘\t321398\n三国十大\t321399\n托马斯小火车\t321400\n如冰\t321401\n神室\t321402\n西游记三打白骨精\t321403\n阿红\t321404\n报以\t321405\n华腾园\t321406\n电子墨水屏\t321407\nesc\t321408\n中福\t321409\n媚媚\t321410\n水泥块\t321411\n红脸\t321412\n楼健\t321413\n出租信\t321414\n抗腐蚀\t321415\ncodeforces\t321416\n1417\t321417\n铁耳机\t321418\n修脚\t321419\n接地网\t321420\n隔离
柜\t321421\n渤海信托\t321422\nDino\t321423\n捋一捋\t321424\n金史\t321425\n龟鹿补肾丸\t321426\n比亚迪元\t321427\n韩国料理\t321428\n一岁\t321429\n淤船\t321430\n压住\t321431\n北村\t321432\n0202251\t321433\n非融资性担保公司\t321434\n近地联盟先遣队\t321435\nViewed\t321436\n调色盒\t321437\n多利农庄\t321438\necs云服务器\t321439\n执行力\t321440\n邪少\t321441\n好斗\t321442\n企石镇\t321443\n603118\t321444\n自高\t321445\n20160607\t321446\n罗本\t321447\n新地中心\t321448\n08_土豆\t321449\n001a\t321450\ngxzf\t321451\n球队\t321452\n可测函数\t321453\n女跑者\t321454\n网页播放器\t321455\n灿坤\t321456\n郭松龄\t321457\nE动人才网\t321458\n千艺\t321459\n中粮\t321460\n建设工程信息网\t321461\n170617\t321462\n5下\t321463\nfoxmail7.2\t321464\n默默猴\t321465\n大伽\t321466\n科学人\t321467\n成都地铁4号线\t321468\n光棒\t321469\n杭州公交查询网\t321470\n火情\t321471\n土石方\t321472\n烤翅\t321473\nLunch\t321474\n广东省高级人民法院\t321475\n天保山\t321476\n封闭型\t321477\n中共中央国务院关于实施乡村振兴战略的意见\t321478\n融侨集团\t321479\n小希\t321480\n低钠血症\t321481\n险象环生\t321482\n249\t321483\n幻象\t321484\n嘉兴市公安局\t321485\nNINGBO\t321486\n黑龙江省纪委监委\t321487\n利来\t321488\n第76号\t321489\n4分米\t321490\n文明5吧\t321491\n新叶村\t321492\nalipay\t321493\n27斤\t321494\nTOEFL\t321495\ndesignmodeler\t321496\n宝可梦\t321497\n五四节\t321498\n能上\t321499\n良策\t321500\n昊翔\t321501\n魅族PRO5\t321502\nVerdi\t321503\n匀胶机\t321504\n阿泰\t321505\n微狗\t321506\n雪茶\t321507\n葳蕤\t321508\n总压\t321509\n9000公里\t321510\n第十轮\t321511\n要买\t321512\n百度神卡\t321513\n商丘\t321514\n雪穗\t321515\n只见过\t321516\nANY\t321517\n四方阵\t321518\n晴天娃娃\t321519\n累觉\t321520\n两公斤\t321521\n心悦\t321522\n金亚荣\t321523\n李丽娜\t321524\nKBS2\t321525\n床伴\t321526\n喷子\t321527\nDiaoyuweng\t321528\nジャガ\t321529\n0x0000007\t321530\n我己\t321531\n茱莉亚罗伯茨\t321532\n天舟一号\t321533\n中野\t321534\n15s\t321535\n2圈\t321536\n1288元\t321537\n旧体诗\t321538\n阿斯匹林\t321539\n安宁\t321540\n车友\t321541\n有图\t321542\nbandari\t321543\n鳄鱼肉\t321544\n柴胡桂枝汤\t321545\n蛋国志\t321546\npopover\t321547\nCCM+\t321548\nshareid\t321549\n华泰期货\t321550\n莲子心\t321551\n天津城市职业学院\t321552\n百修宝\t321553\nzur\t321554\n陳\t321555\nae2017\t321556\n馨苑\t321557\nwww.58pic.com\t321558\n两天内\t321559\n30个\t321560\n三氧化钼\t321561\n82集\t321562\nConstraintLa
yout\t321563\n珠江四季悦城\t321564\n差动\t321565\n答案\t321566\n嘉兴南洋职业技术学院\t321567\n87条\t321568\n星汇广场\t321569\nZ900\t321570\n2015.04\t321571\n宥\t321572\n云片\t321573\n步步错\t321574\n仿生\t321575\nVX\t321576\n金华银行\t321577\n黄河南路\t321578\n春芳\t321579\nScales\t321580\n好剧网\t321581\n枪呆\t321582\n蒋高明\t321583\n中国海运\t321584\n脱硫除尘器\t321585\n洛阳市公安局\t321586\n空当接龙\t321587\n美元值\t321588\nHtmlAgilityPack\t321589\n良禽\t321590\n韦神\t321591\n2012-12-15\t321592\n刘意\t321593\nt530\t321594\n青光\t321595\n跑掉\t321596\nfine乐团\t321597\n北京天坛\t321598\n老旦\t321599\n1.5.10\t321600\nzip格式\t321601\n第七卷\t321602\n湘楚\t321603\nenvi\t321604\nfied\t321605\naltova\t321606\n示范场\t321607\n长安欧诺吧\t321608\nMing\t321609\n笠翁\t321610\n范冰\t321611\n美白针\t321612\n14秋\t321613\nGeo\t321614\n未果\t321615\n习近平系列重要讲话精神\t321616\n叹\t321617\n六月雪\t321618\n小息肉\t321619\n见过\t321620\n朋字\t321621\n贾充\t321622\n悼亡\t321623\n迭部\t321624\n易保\t321625\n光速QA\t321626\n双圆\t321627\n就业保障金\t321628\n对调\t321629\n省监狱管理局\t321630\n董昱昆\t321631\nPopup\t321632\n扣带\t321633\n雪缘园\t321634\n0352\t321635\n工商银行\t321636\n巨石强森\t321637\n5.0.4\t321638\nGTX1080ti\t321639\n界内\t321640\n4.5L\t321641\n宜昌市住房和城乡建设委员会\t321642\n海藻肥\t321643\n少数股东权益\t321644\n9898\t321645\n至尊宝\t321646\n谢刚\t321647\n姜敏\t321648\n中莹\t321649\n南京钢铁股份有限公司\t321650\n走亲戚\t321651\n艾力绅\t321652\n霍尊\t321653\nTRIBE\t321654\n长花\t321655\nTESCOM\t321656\n宜都\t321657\n液压拉马\t321658\n滋扰\t321659\nrxswift\t321660\n叶浩\t321661\n邻苯二甲酸二辛酯\t321662\n格列美脲片\t321663\ncherrypy\t321664\n中国包装网\t321665\n热片\t321666\n不知道\t321667\n共进股份\t321668\n伞裙\t321669\n浪琴\t321670\n亟待\t321671\n乐视超\t321672\n物产\t321673\n天津分公司\t321674\n阿托伐他汀\t321675\n湖南农村信用社\t321676\nY700\t321677\n短效\t321678\n弯通\t321679\n两面性\t321680\n畅玩4x\t321681\n数平方根\t321682\nremuneration\t321683\n浩劫dh\t321684\n消防产品\t321685\ndha\t321686\n一侧\t321687\n中国蒙台梭利协会\t321688\nmate\t321689\nsubstance\t321690\n爽秀儿\t321691\n童年趣事\t321692\n辅助机\t321693\n池莉\t321694\n阿欣\t321695\n姚广孝\t321696\nCompile\t321697\n巴郎山\t321698\n派派网\t321699\nH3\t321700\n嵩\t321701\n柔媚\t321702\n琅岐岛\t321703\n洛社\t321704\n角化型\t321705\n木精灵\t321706\nkil
\t321707\n汇兑\t321708\n尸忆\t321709\n英雄无敌3吧\t321710\n高新兴科技集团股份有限公司\t321711\n绝地求生帧数\t321712\n证\t321713\n甘肃省人力资源和社会保障厅\t321714\n画轴\t321715\n尼康d5300\t321716\n广顺北大街\t321717\n尊巴舞\t321718\n驾驶舱\t321719\n伊人222综合网\t321720\n雇佣兵\t321721\n万能型\t321722\n功分器\t321723\nInfluenza\t321724\n海贼王女帝蛇姬\t321725\n冠服\t321726\n伊威\t321727\n工银安盛人寿保险有限公司\t321728\n积分号\t321729\n蒜茸\t321730\nPAW\t321731\n徐湘婷\t321732\nRDC\t321733\n银屑\t321734\n三弦\t321735\n20尺\t321736\n按关机\t321737\ndo2\t321738\n举一反\t321739\n白银有色\t321740\n奥美的\t321741\n成千\t321742\n改签\t321743\nguacamole\t321744\n明片\t321745\n第88期\t321746\n爬楼\t321747\n盲人\t321748\n黄瀚生\t321749\n温故\t321750\n拉近你\t321751\n火衣\t321752\n纳米喷镀\t321753\n城口新闻网\t321754\n加拿大安大略省\t321755\n新洁尔\t321756\n广达\t321757\n飞哥大英雄\t321758\nHKE\t321759\n中央民大\t321760\n天然气压缩机\t321761\nbrother\t321762\n吉尔伯特\t321763\n丁辛醇\t321764\n绿叶版\t321765\n睡病\t321766\nzfs\t321767\n地面积\t321768\n鲁斯\t321769\n本年\t321770\n横条\t321771\nexcel表格列\t321772\n首行\t321773\nUnivers\t321774\n凤凰视频\t321775\n科硕\t321776\n中山北\t321777\n欧派卫浴\t321778\n鑫福\t321779\n进动\t321780\n四平市人民政府\t321781\n燃气轮机\t321782\n14块\t321783\n电摆\t321784\n心浮气躁\t321785\n类似\t321786\n不好听\t321787\n娶\t321788\nCYCLE\t321789\n古驿道\t321790\n猪肉干\t321791\n一橙\t321792\n热门\t321793\n重庆市人民医院\t321794\nB2B网站大全\t321795\n张景\t321796\nEU4\t321797\n99句\t321798\n废书\t321799\n容祖儿\t321800\n翡翠花园\t321801\n云顶集团\t321802\n炸金花开\t321803\n三十章\t321804\n书香世家\t321805\n骏马\t321806\n洛克菲勒\t321807\n非晶体\t321808\n5068\t321809\n招助\t321810\nKCL\t321811\nBLACKMORES\t321812\n何以堪\t321813\n包河\t321814\nJOLIMARK\t321815\n2010年1月1日\t321816\nar\t321817\n弹吧\t321818\n香兰阁\t321819\nPassage2\t321820\n鹿丸\t321821\n段祺瑞\t321822\n6版\t321823\n挥不去\t321824\n广州海关\t321825\n杉木芯\t321826\n7323\t321827\n思源实验学校\t321828\n银时\t321829\n戴尔外星人\t321830\nsilvaco\t321831\nnero10\t321832\nundefined,null\t321833\n山鸟\t321834\n三星w2017\t321835\n小豆腐\t321836\nH图\t321837\n辣炒花蛤\t321838\n高树玛利亚\t321839\n合肥市公安局\t321840\n0.001\t321841\n搪锡\t321842\n阜阳职业技术学院\t321843\n友联\t321844\n反恐处\t321845\n体征\t321846\nFlesh\t321847\n宝马展\t321848\nXL\t321849\n香蕉水\t32185
0\n警备员\t321851\nbeij\t321852\nWin7版\t321853\n上海市发展改革委\t321854\n李咏\t321855\n客专\t321856\n康弘药业\t321857\nfarms\t321858\n旺彩\t321859\n挽爱成殇\t321860\n川岛\t321861\nマスタ\t321862\nEvaluating\t321863\n默文\t321864\n场路\t321865\n英林镇\t321866\n160611\t321867\n青龙桥\t321868\n省民政厅\t321869\nCSD\t321870\n福汇FXCM\t321871\n知识产权日\t321872\n巨幼细胞性贫血\t321873\nIPaddressxxxc\t321874\n农工委\t321875\n康伯巴奇\t321876\n第一个\t321877\n子弹\t321878\n水浒智慧\t321879\n采油工\t321880\n四川司法警官职业学院\t321881\nseav\t321882\n妾色\t321883\nmadvr\t321884\n隐刀\t321885\nwsimport\t321886\n泡桐花\t321887\n何秋光\t321888\n泛光\t321889\nQueries\t321890\n汉口城市广场\t321891\n通术\t321892\nlexicon\t321893\n株冶集团\t321894\n小唐\t321895\n爱兰\t321896\n象甲\t321897\n生化危机:终章\t321898\nTODO\t321899\n酒石\t321900\n师利\t321901\n一国\t321902\n秦穆公\t321903\n出轨\t321904\n50券\t321905\n300053\t321906\n遵行\t321907\nSailor\t321908\n沉积物\t321909\n1.1.17\t321910\nnewifi\t321911\n京都律师事务所\t321912\n钟路明\t321913\n100亩\t321914\n纱锭\t321915\n马路镇\t321916\nISU\t321917\nFOSUN\t321918\n付伟\t321919\nmiaotutu\t321920\n遥远的地方\t321921\nOAKLEY\t321922\n星网宇达\t321923\n时装展\t321924\nconfiguring\t321925\n窝沟\t321926\nsap\t321927\n星球大战外传:侠盗一号\t321928\n简简\t321929\n国台酒\t321930\n吕芳\t321931\n汉方育发素\t321932\n联洋\t321933\n工人文化宫\t321934\n中国有限公司\t321935\nBiomaterials\t321936\n2017.3.6\t321937\n紫金陈\t321938\n顺时\t321939\nDynaudio\t321940\n夫人\t321941\nM17\t321942\nMal\t321943\n20160223\t321944\n药监\t321945\n夜归\t321946\n帮工\t321947\n后勤服务\t321948\n治本安全观\t321949\n齐\t321950\n乍杭铁路\t321951\n有味\t321952\n刀塔传奇\t321953\n绿地公馆\t321954\nVOSTRO\t321955\n政府站\t321956\n中共上海市委党校\t321957\n李丹\t321958\n大学路\t321959\n成资\t321960\nfunctio\t321961\nsafetree\t321962\n张莹莹\t321963\n严版\t321964\n金三娃娃\t321965\n钱途\t321966\n婴幼儿教育网\t321967\n南山路\t321968\nDimethyl\t321969\n个体经济\t321970\nHHVM\t321971\nTOP16\t321972\n泊秦淮\t321973\n风车动漫网\t321974\n心中\t321975\n框柱\t321976\n稚优泉\t321977\nCHI\t321978\n开厂\t321979\n雅迪\t321980\nunmarshal\t321981\nPPT模\t321982\n20170127\t321983\n仅供参考\t321984\n背越式\t321985\n89178商机网\t321986\n前六个月\t321987\n软博会\t321988\nemotions\t321989\n卖空\t321990\n石头城\
t321991\nVikings\t321992\n来得子\t321993\n门边\t321994\n四分法\t321995\n10多分钟\t321996\n牙菌斑\t321997\n画液\t321998\n豪华曹操传\t321999\nwin10u盘\t322000\nB1A4\t322001\n安徽省安庆第一中学\t322002\n叶天士\t322003\n淄博日报\t322004\n16个\t322005\n刘大美人\t322006\n正新\t322007\n梅尼埃病\t322008\nddl\t322009\n56集\t322010\n运河\t322011\ngri\t322012\n红妹\t322013\n腔体滤波器\t322014\noksn\t322015\n6v6\t322016\n翟山鹰\t322017\n奇葩大会\t322018\n档案局\t322019\n有机胺\t322020\n河津市\t322021\n帝牙卢卡\t322022\n芜湖市委组织部\t322023\n生不如死\t322024\n888电视剧网\t322025\n长春建筑学院\t322026\n安徽省住房和城乡建设厅\t322027\n神谕者\t322028\n黛安\t322029\n618#\t322030\n抗炎药\t322031\nInsulin\t322032\n汇祥\t322033\n拎袋\t322034\n雅量\t322035\n洞裤\t322036\n穆恩\t322037\n字币\t322038\n梁庄\t322039\nJeakon\t322040\n结肠镜检查\t322041\n3.6g\t322042\n二四六\t322043\n纯手工\t322044\nczp\t322045\n月群\t322046\n259LUXU\t322047\n竹杠\t322048\nD3.js\t322049\n熟宣\t322050\n一粒米\t322051\n贵州民建\t322052\n白马路\t322053\n午托班\t322054\n爱剪辑\t322055\n顺风棉花网\t322056\nnsmutablearray\t322057\n大白钱包\t322058\n王宝巴特尔\t322059\n残章\t322060\n质监信息网\t322061\n陶瓷片\t322062\n9012\t322063\n够\t322064\nExcel2010数据透视表\t322065\n广东医科大学\t322066\nHD630\t322067\n心德\t322068\n国民经济学\t322069\n体热\t322070\n藏歌\t322071\n迈瑞我的绝色总裁未婚妻\t322072\n本班\t322073\nuploads\t322074\n脆皮鸡饭\t322075\napscheduler\t322076\n大连乐居\t322077\n深表\t322078\n纳格兰\t322079\n威朗15s\t322080\nxcos\t322081\n小朋友\t322082\n缩印本\t322083\n安静\t322084\nDrone\t322085\n逢沢\t322086\n伊犁杏花沟\t322087\nC罩杯\t322088\n偷奸\t322089\n四川工程职业技术学院\t322090\n徐毓才\t322091\n6400t\t322092\n暗河\t322093\ncopm\t322094\n六人\t322095\n信守\t322096\n创辉\t322097\n冷凝炉\t322098\n万历\t322099\n豆色\t322100\nPLQ-20K\t322101\n本各\t322102\n神源\t322103\nAQS\t322104\n18项\t322105\n艾家网\t322106\n凌翔茜\t322107\n百度知\t322108\n东方心经马报\t322109\n竞女\t322110\n姜龙\t322111\n乌鸫\t322112\n女生宿舍\t322113\n2015年10月\t322114\n宠妾\t322115\n字典教育新闻网\t322116\n库伦\t322117\n药桶\t322118\n大坪镇\t322119\n面道\t322120\n卡侬头\t322121\n理想生活实验室\t322122\n差遣\t322123\n王蓉\t322124\n禁种\t322125\n赫拉迪姆\t322126\n哈尔滨市区\t322127\n一次性\t322128\nGmod\t322129\n优惠政\t322130\nP2B\t322131\n柯桥中学\t322132\n耀扬\t322133\ntug\t322134\nwww.ip138.c
om\t322135\n龙湖原山\t322136\n睬\t322137\n全国通\t322138\nwoff2\t322139\n衬度\t322140\nesther\t322141\n前测\t322142\n弱\t322143\n上人人网\t322144\niOS8.4\t322145\n郑州东区\t322146\nカラダ\t322147\n老石\t322148\n8222\t322149\n邮购\t322150\nwww.yse123.com\t322151\n册子\t322152\n华鹏飞\t322153\n有毒气\t322154\n原审\t322155\n信通\t322156\ninsomnia\t322157\n湖南人民出版社\t322158\n六步\t322159\n似乎\t322160\n双龙湖\t322161\nffice\t322162\n梦鸽\t322163\ndalio\t322164\n六月一日\t322165\n长发女\t322166\n裤袜\t322167\nET加速器\t322168\n黔西南州\t322169\n坏单\t322170\n工银瑞信货币市场基金-工银瑞信基金管理有限公司\t322171\n中央宣传部\t322172\n天弘基金管理有限公司\t322173\n自然派\t322174\n富力地产\t322175\n被保险人\t322176\n重氮\t322177\nBTB\t322178\nDiario\t322179\n天津赛力斯自动化科技有限公司\t322180\n逆向\t322181\n考究\t322182\nwaihui\t322183\n写图\t322184\n多囊样\t322185\n凌迟\t322186\n招架不住\t322187\n牡丹花节\t322188\nLair\t322189\n猪肉馅\t322190\n零直\t322191\n【机\t322192\n日生\t322193\nregression\t322194\n广东省气象局\t322195\n南风知我意\t322196\n5.5分\t322197\n16分钟\t322198\n天才网\t322199\n福柯\t322200\n悠秀\t322201\n桌式\t322202\ncary\t322203\n扶风\t322204\n北极绒\t322205\n陶侃\t322206\nOpdivo\t322207\n市委统战部\t322208\n百度竞价\t322209\n风之子\t322210\n扩展型\t322211\n方向机\t322212\n兑\t322213\n绿水\t322214\n酉阳\t322215\n珠海机场\t322216\n卫妆\t322217\n2017年10月\t322218\n胡永平\t322219\n易阳\t322220\nOFFER\t322221\n瑞达\t322222\nbaolei\t322223\n王江\t322224\n1000多家\t322225\nSweet\t322226\nSynchronized\t322227\n想过\t322228\n养狗\t322229\n门\t322230\n一代天骄\t322231\n坦然\t322232\nhisense\t322233\n甲钴胺分散片\t322234\n智尊\t322235\n王仕鹏\t322236\nchembiooffice\t322237\n潮汕站\t322238\n雷霆德\t322239\n艾蕾\t322240\n蜂特网\t322241\n点伤\t322242\n天胡\t322243\nAntibodies\t322244\n厦门市建设局\t322245\n盯紧\t322246\n驰名\t322247\n一体化\t322248\n减速机\t322249\n刘小红\t322250\n广安\t322251\n超速离心机\t322252\nstarry\t322253\n面源污染\t322254\n小孩们\t322255\n【静茹\t322256\ninfj\t322257\n黎巴嫩\t322258\n凉冰\t322259\n左右键\t322260\nadams\t322261\npenta\t322262\nか\t322263\n中宁枸杞\t322264\n阳\t322265\n小浪\t322266\n胡思乱想\t322267\n584\t322268\n41层\t322269\n工程师们\t322270\n疾星\t322271\nmoh\t322272\n畅影\t322273\n3万多\t322274\n聚酯型\t322275\n模态分析\t322276\n我的团长我的团\t322277\nBEA\t322278\n25
格\t322279\n红歌会网\t322280\nepel源\t322281\n金昌\t322282\nmuffin\t322283\n了不起的麦瑟尔夫人\t322284\n终结者2审判日pc\t322285\nwiha\t322286\n抗弯强度\t322287\ntarbar\t322288\n谭旭光\t322289\n政事儿\t322290\n闫凤娇\t322291\n上海久光百货\t322292\n全点\t322293\nsubstancepainter\t322294\n十岁\t322295\n完整版\t322296\nfama\t322297\n贾秀全\t322298\na6\t322299\n地级市\t322300\n接电话\t322301\nv1.0.9\t322302\n哪双\t322303\n100欧姆\t322304\n笨蛋\t322305\n太天真\t322306\n崇明县\t322307\n战役\t322308\n盘鹰\t322309\n甜言蜜语\t322310\n离站\t322311\ntreeset\t322312\n越野车\t322313\ndnf\t322314\n_农\t322315\n奉茶\t322316\nTFT\t322317\nqfn\t322318\n佳佳hi\t322319\ncck\t322320\n一清片\t322321\n河床\t322322\n揭东县\t322323\n全自动洗车机\t322324\n大墅镇\t322325\n835268.客如云\t322326\n通灵学院\t322327\n华耀\t322328\n踢翻\t322329\nicpc\t322330\n婚期\t322331\n世博园区\t322332\n苹果螺\t322333\n第8号\t322334\n棋\t322335\n第三句\t322336\n来稿\t322337\n神谕之战\t322338\nglob\t322339\nsname\t322340\n黑龙江省水利厅\t322341\n四臂\t322342\n大陆公司\t322343\n出击\t322344\n嬉笑\t322345\n北京农商行\t322346\n222.com\t322347\nAutodesk\t322348\nlayout_width\t322349\nBranches\t322350\n修理费\t322351\n贾总\t322352\n徽商职业学院\t322353\n拔血罐\t322354\n魔剑生死棋\t322355\n吐鲁番市人民政府\t322356\nsupports\t322357\n迪桑特\t322358\n吞吞\t322359\n很显然\t322360\n克利夫兰骑士队\t322361\n克拉霉素缓释片\t322362\n国防\t322363\n艾格拉斯\t322364\n隔膜泵\t322365\n2018年3月4日\t322366\n盐池县政府\t322367\n饮料展\t322368\n钢琴城\t322369\n梁家河村\t322370\n绿委\t322371\n志邦\t322372\n无用\t322373\nPU404\t322374\n3.9.1\t322375\n饱和食盐水\t322376\n金纳米颗粒\t322377\n刘牧\t322378\n首都师大附中\t322379\nPc版\t322380\n显卡天梯\t322381\n山楂水\t322382\n全国卷II\t322383\n华西第二医院\t322384\n加算\t322385\n闻康资讯网\t322386\n变方\t322387\n黄斑\t322388\nrelax\t322389\n白鼠\t322390\n腰椎压缩性骨折\t322391\namin\t322392\n开元路\t322393\n追逐梦想\t322394\n苏仙\t322395\n违章\t322396\nAvant\t322397\n亚竿\t322398\nMOCK\t322399\n全球创新指数\t322400\n度娘盘\t322401\n熔火\t322402\n中信凯旋城\t322403\n东北师范大学人文学院\t322404\n自由女神像\t322405\nZombie\t322406\n金娘\t322407\n琅东站\t322408\n失掉\t322409\n5图\t322410\n正官庄\t322411\n酷音小伟\t322412\n导热硅胶\t322413\n海舟\t322414\nldo\t322415\n2018-04-04\t322416\nES200\t322417\n圭峰\t322418\n1000mg\t322419\n四子棋\t322420\n方大化工\
t322421\n苦茶\t322422\n第十一次\t322423\n金玉其外\t322424\nManually\t322425\n四川企联网\t322426\n2017年9月1日起\t322427\n喜剧电影\t322428\n春官\t322429\n一醉经年\t322430\nmcgill\t322431\n_君越论坛\t322432\nCNBA篮球网\t322433\nAcoustic\t322434\nLDD\t322435\n优育\t322436\nHollis\t322437\nwinning\t322438\n法布里\t322439\n向洋\t322440\n双凤\t322441\n图尔\t322442\n粉底霜\t322443\n颜色\t322444\n综招\t322445\n建筑篇\t322446\n发丧\t322447\n一毫米\t322448\n皖南事变\t322449\nqq漂流瓶\t322450\nnw\t322451\n大霸王\t322452\n抽死\t322453\n金银猫\t322454\nChanged\t322455\nsalute\t322456\n冶炼厂\t322457\n初遇\t322458\n凄凉\t322459\n深圳新公司\t322460\n农残\t322461\n申报单\t322462\nauthentic\t322463\nhopping\t322464\n1-7月份\t322465\n敷铜\t322466\nSuck\t322467\n俄罗斯国防部\t322468\n雷克萨斯ls\t322469\n洗胃机\t322470\n黄墩镇\t322471\n慈溪房产网\t322472\nBRIGHT\t322473\n刘明侦\t322474\n低血\t322475\n张星彩\t322476\ndeli\t322477\n私拍\t322478\n四禅\t322479\n下蛋\t322480\n今创集团\t322481\n_千库网\t322482\n拉帘\t322483\nepi\t322484\n莲科\t322485\n32x32\t322486\nhughes\t322487\n恒大滨河\t322488\nuniversities\t322489\n负反馈放大电路\t322490\n高中数学联赛\t322491\n越轨\t322492\n水孔\t322493\n消化病\t322494\n随想曲\t322495\n普招\t322496\nAttempted\t322497\n小万\t322498\n水冶镇\t322499\n渍水\t322500\n贴膜机\t322501\n唐宣宗\t322502\n汽车库\t322503\n人之谜\t322504\n13关\t322505\n光币\t322506\n上原有\t322507\ngamecenter\t322508\n麦淘网\t322509\nO2O\t322510\n黄埔大道西\t322511\n1956\t322512\n政文\t322513\n斯达舒\t322514\n两个年\t322515\n吉他调音器\t322516\n管理方法\t322517\n2704\t322518\ne保\t322519\n祭孔\t322520\n自考本\t322521\n550个\t322522\n注册管理会计师\t322523\n铃鹿\t322524\nairpods\t322525\n第二十届\t322526\n岳峰镇\t322527\n用超\t322528\n无双\t322529\n脖圈\t322530\n来说说\t322531\nextracting\t322532\n博主\t322533\n2017年2月23日\t322534\nOS\t322535\n袁绍\t322536\nPRINT\t322537\n墙体\t322538\n浙江省口腔医院\t322539\n金刚圈\t322540\nhcnp\t322541\n对口直升\t322542\n姜大卫\t322543\n花牛\t322544\n顺景\t322545\n11.2.13\t322546\nwww.72dj.com\t322547\n208\t322548\n充足\t322549\n低下\t322550\nVesta\t322551\n餐粉\t322552\n9条\t322553\n皮卡超人\t322554\n275个\t322555\n淘宝支付\t322556\n鱼体\t322557\n刺挠\t322558\n新网球王子\t322559\n澳洲国立大学\t322560\n枇杷露\t322561\n石韦\t322562\nslp\t322563\n中国人民解放军总医院\t32256
4\n000039\t322565\n豆豆网\t322566\n和君商学院\t322567\n蜜制\t322568\n斗破苍穹之药老传奇\t322569\nbrook\t322570\n新华路\t322571\n海备思\t322572\n阳东县\t322573\nDEBUG\t322574\n外汇\t322575\n金禧\t322576\n非常\t322577\n上饶市国土资源局\t322578\nToro\t322579\n专业化\t322580\n万表网\t322581\n触目\t322582\n耳背\t322583\n银盘\t322584\nspark\t322585\n浙江国税\t322586\nchat\t322587\n五戒\t322588\n亚服\t322589\npvc防水卷材\t322590\n经济开发区\t322591\n2018-04-16\t322592\n征信宝\t322593\n智园\t322594\n年制\t322595\n古北壹号\t322596\n尤果圈\t322597\n学美容\t322598\n完毕\t322599\n奶秀\t322600\nUn\t322601\n如意石易经网\t322602\n权责\t322603\n低胸裙\t322604\n厂花\t322605\n云南国际信托有限公司\t322606\nChem-Station\t322607\n睫状体\t322608\n北京大学深圳医院\t322609\n张莉\t322610\n阿拉善经济开发区\t322611\nsyria\t322612\n7710\t322613\n年款\t322614\n发售\t322615\n【求文\t322616\nGhoul\t322617\nRenal\t322618\nLX3\t322619\n欺软怕硬\t322620\n力帆摩托\t322621\n激活量\t322622\nvcc\t322623\n青蚨\t322624\n病灶\t322625\n◤\t322626\n胃病\t322627\n三桥镇\t322628\ndefinitive\t322629\n可控整流电路\t322630\n液氧罐\t322631\n木耳镇\t322632\n萌侠传说\t322633\n访视\t322634\n冒险王2\t322635\n终极斗士4\t322636\n霸王别姬\t322637\nStochastic\t322638\n杰克思勒\t322639\nfidelity\t322640\n六一儿童节\t322641\n思路客小说网\t322642\n请到\t322643\n娼年\t322644\n秋刀鱼\t322645\nhession\t322646\noq\t322647\nAccessories\t322648\n沙鸥\t322649\nPXC\t322650\n衢山岛\t322651\nredhat5\t322652\n江苏省苏北人民医院\t322653\n农具\t322654\n中国基金报\t322655\n唆\t322656\n譜\t322657\n腿疼\t322658\n蓝田县\t322659\n火龙王\t322660\n德军总部2:新巨人\t322661\n中国太平\t322662\n泡泡液\t322663\n17.01\t322664\n东汉末\t322665\n荷塘区\t322666\nBO3\t322667\npaytm\t322668\n钟丽缇\t322669\n聊聊\t322670\n天津出入境检验检疫局\t322671\n时代英语报\t322672\nwin2003\t322673\n星航\t322674\nIT运维\t322675\n古文\t322676\n17年7月\t322677\nCliff\t322678\n十个\t322679\n毕节试验区\t322680\n1948年\t322681\n20161102\t322682\n吉他谱-钢琴谱-电子琴谱\t322683\n感到遗憾\t322684\nCQC\t322685\n马褂木\t322686\n组装工\t322687\nkid\t322688\n乐清市政府\t322689\n金福南\t322690\nKPL\t322691\n大连金普新区\t322692\nlaod\t322693\n發佈\t322694\nFineUI\t322695\n乱世三义\t322696\n美乳\t322697\n徐汇区小学\t322698\n中心医院\t322699\n合浦\t322700\n中国电力科学研究院\t322701\n池珍熙\t322702\n磷灰石\t322703\n南方人才网\t322704\n徐福\t322705\nOKEX
\t322706\n冒\t322707\nprog\t322708\n明湖\t322709\nfifo\t322710\nhuijia\t322711\n定焦\t322712\n湖北省教育考试院\t322713\n∣\t322714\n靳少\t322715\n三波\t322716\nannealing\t322717\n雨木\t322718\nora-12560\t322719\n杨亮\t322720\n金牙\t322721\nRevert\t322722\n独坐敬亭山\t322723\n彭加木\t322724\n入厕\t322725\n男生\t322726\n2016年12月1日\t322727\n泰晤士\t322728\n领涨\t322729\n拉普拉斯算子\t322730\n风筝误\t322731\n配载\t322732\n点不着\t322733\n360随身Wifi\t322734\n择校生\t322735\nCWW\t322736\n甜心萌妻:总裁宠不停\t322737\n北京西站\t322738\n兴衰\t322739\n推动式\t322740\n查案\t322741\n22000\t322742\n撤迁\t322743\n绝世仙王\t322744\n注册表值\t322745\n内廷\t322746\n1.0.1.1\t322747\naxure7\t322748\n李敏镐\t322749\n酷冷至尊\t322750\n大岭山森林公园\t322751\n新封神传说\t322752\n好彩\t322753\n壹个\t322754\n200万平方米\t322755\n妙用\t322756\nCadey\t322757\n靖康耻\t322758\n阶矩\t322759\npenny\t322760\n吴昕\t322761\n罚金\t322762\nHIMSS\t322763\n肝腹水\t322764\n复制性\t322765\n云村\t322766\n学舍\t322767\n内蒙古广播电视台\t322768\n久竞\t322769\nebsco\t322770\n奥迪Q3论坛_汽车之家论坛\t322771\n朱可儿\t322772\nstresses\t322773\n内田有纪\t322774\n敦刻\t322775\n快速排序\t322776\n红外线传感器\t322777\n视唱\t322778\nchi\t322779\n溪石\t322780\n第十天\t322781\n大梦西游\t322782\n明艳\t322783\n非线\t322784\n起稿\t322785\nutp\t322786\n淫照\t322787\nged\t322788\nlxc\t322789\ntempt\t322790\n中新天津生态城\t322791\nsupa\t322792\n三座\t322793\n石爪山脉\t322794\n涧\t322795\n邮政包裹查询网\t322796\n攒\t322797\n连云港\t322798\ns3mini\t322799\n空中\t322800\n高帧率\t322801\n软式\t322802\nGBA模拟器\t322803\nsteamed\t322804\nsetstate\t322805\nt460s\t322806\n白求恩国际和平医院\t322807\n明档\t322808\n户籍所在地\t322809\n王汝刚\t322810\ntoxicity\t322811\n紧张性\t322812\n102万\t322813\n舒颜\t322814\n淡入淡出\t322815\n散货船\t322816\nx3430\t322817\ncuculus\t322818\ngtx750\t322819\n情感类\t322820\n广东国旅\t322821\n柄\t322822\n2145个\t322823\n柯式\t322824\n男神节\t322825\nflame\t322826\n帆船\t322827\n可维护性\t322828\n抛尸\t322829\n冒充\t322830\ncnkai\t322831\n搜狗高速浏览器\t322832\n环境问题\t322833\n银保监\t322834\n东哥\t322835\n绿城中国控股有限公司\t322836\nBTG\t322837\n退色\t322838\n佳能80d\t322839\n尤里乌斯\t322840\npremium\t322841\nps智能对象\t322842\n黄钢\t322843\n丁酸氢化可的松乳膏\t322844\n做生\t322845\nD5300\t322846\n莫比乌斯带\t322847\ncaffe\t3228
48\n酷猫\t322849\n纽约市\t322850\n别射\t322851\n合议\t322852\n天虹\t322853\n敦化市\t322854\n二第一章\t322855\n孳生\t322856\n岩棉\t322857\n幽幻\t322858\n今年4月\t322859\n狙击精英2\t322860\n名胜古迹\t322861\n邓锋\t322862\n加版\t322863\nemq\t322864\n上海研究院\t322865\n厦门轮渡码头\t322866\n高居\t322867\n家居业\t322868\n瞻礼\t322869\n镜报\t322870\n邱心志\t322871\n华数集团\t322872\ndelete\t322873\n大郊亭\t322874\n亲邻\t322875\narabian\t322876\n热巴\t322877\nstruts2-core\t322878\nThinkSAAS\t322879\nX201\t322880\n立法者\t322881\n北固山\t322882\n俩年\t322883\n岳阳市人民政府\t322884\n圣贝\t322885\n1374\t322886\n警辅\t322887\n领淘\t322888\n350马力\t322889\n断缴\t322890\n平房\t322891\n爱谱华顿\t322892\n王雪红\t322893\n亚利桑那州\t322894\n保利茉莉公馆\t322895\n佳能G1800\t322896\n首岸\t322897\nsohu\t322898\n张笛\t322899\n13C\t322900\n青岛双星\t322901\n六年级语文上册\t322902\n公司法\t322903\nSolidWorks2017\t322904\n上海玫瑰医疗美容医院\t322905\n7连\t322906\n洒水\t322907\n播\t322908\n阿龙纳斯\t322909\nkettle\t322910\n布泽尔\t322911\n升压电路\t322912\n商鞅\t322913\n天地男儿\t322914\ndbase\t322915\nsneakers\t322916\ngetting\t322917\nlose\t322918\n周轶君\t322919\n1.5章\t322920\n热圣\t322921\n30个月\t322922\n世间人\t322923\n羊狮\t322924\n威宁彝族回族苗族自治县\t322925\n债权转让合同\t322926\n寺库\t322927\n烟草专卖许可证\t322928\n西乡县\t322929\n门规\t322930\nPods\t322931\n淼淼\t322932\nwindow2003\t322933\n认同感\t322934\n高腰裤\t322935\nAra\t322936\n液位\t322937\nHPC\t322938\nSHIPPING\t322939\n白马山庄\t322940\n黔府\t322941\n空气开关\t322942\n车规\t322943\n张龄心\t322944\n格美\t322945\n栏板\t322946\n物质文明\t322947\n静电测试仪\t322948\n冰川时代\t322949\n脐带血造血干细胞库\t322950\n操作后\t322951\ncw\t322952\n醪糟\t322953\n迪昂\t322954\np9\t322955\n康庄镇\t322956\nchen2013\t322957\n甜杏仁\t322958\n欧亚达\t322959\nclinic\t322960\n几K\t322961\n社保代缴公司\t322962\n流清涕\t322963\n干嘛\t322964\noTMS\t322965\n山东女子学院\t322966\n岳飞传\t322967\n威廉姆森\t322968\n顺隆书院\t322969\n美厨\t322970\n小银\t322971\n冶铁\t322972\n死亡赔偿标准\t322973\n笤帚\t322974\n洞窟\t322975\n好不舒服\t322976\n哮喘\t322977\n帕洛斯\t322978\nroadhog\t322979\n0045\t322980\nMZ\t322981\n凯千\t322982\n闪离\t322983\n漂亮女孩\t322984\n感动人物\t322985\ndiscovery\t322986\n衡阳东\t322987\n浙江东日\t322988\n8.92亿美元\t322989\nModels\t322990\n天都\t322991\n洛杉矶机场\t322992
\nAPi\t322993\n或\t322994\n名家具展\t322995\n虚值\t322996\nVino\t322997\n裁缝\t322998\ncards\t322999\n贝贝龙\t323000\n猛将传\t323001\n使命召唤1\t323002\n铜臭\t323003\n电锯惊魂\t323004\n而非\t323005\n文卷\t323006\n黑夜汉化组\t323007\n蘑菇战争\t323008\nFindyou\t323009\n法框\t323010\nHumans\t323011\n艾灸贴\t323012\n商业革命\t323013\n750元\t323014\n恐龙快打\t323015\nfreestyle\t323016\n核安全\t323017\nclub\t323018\n陈宝存\t323019\n3八重樱\t323020\n济南社区\t323021\n电动燃油泵\t323022\nPon\t323023\n劳动美\t323024\n奴隶区\t323025\n降解\t323026\n武峰\t323027\n阴阜\t323028\n凯莉·布鲁克\t323029\n最惨\t323030\n我看\t323031\n评分项\t323032\n姚斌\t323033\n冰霜\t323034\n航空运输电子客票\t323035\nsits\t323036\n火焰传感器\t323037\n美人鱼2\t323038\n两班制\t323039\n宝鸡市中心医院\t323040\n瀑布群\t323041\nqq浏览器\t323042\n字法\t323043\n净水厂\t323044\n0.2\t323045\n浊化\t323046\nvba批量\t323047\n萱\t323048\n58mm\t323049\nSAI2\t323050\n陈茜\t323051\n深州市\t323052\n金承志\t323053\n二巷\t323054\nQPainter\t323055\n180405\t323056\n70000元\t323057\nNucleic\t323058\n马牌轮胎\t323059\n怀宁县政府\t323060\n甘肃省建设厅\t323061\nt-1\t323062\n504路\t323063\n后仓\t323064\n艾诺仪器\t323065\ncallkit\t323066\n鞍山市财政局\t323067\n降调\t323068\n庹宗华\t323069\n预抵押登记\t323070\n肉香\t323071\n100.8\t323072\n张律师\t323073\nhat\t323074\n天外村\t323075\n京师\t323076\n如奴\t323077\n平塘\t323078\n绿城桃花源\t323079\nopenvc\t323080\n棉料\t323081\n金大中\t323082\n城站火车站\t323083\nlmax\t323084\n81亿\t323085\n郑许\t323086\n蕨菜\t323087\nECI\t323088\nWeibull\t323089\nWealth\t323090\n扬善\t323091\nGlory\t323092\n甜醋\t323093\n叶榭\t323094\n新华报业传媒集团\t323095\n走管\t323096\n江桥\t323097\n阳光威尼斯\t323098\n李德成\t323099\n水流\t323100\n泛音\t323101\n金海贷\t323102\n运算符重载\t323103\n右里\t323104\n郁孤台\t323105\npals\t323106\n1039\t323107\n古兵\t323108\n第149章\t323109\n闲扯淡\t323110\n魔晶\t323111\n创服\t323112\n坐阵\t323113\n镍合金\t323114\nediting\t323115\n邵阳县\t323116\n旧金山国际机场\t323117\n石头房\t323118\n彦语\t323119\n剖视图\t323120\n实业家\t323121\n20公里\t323122\n北大肿瘤医院\t323123\n插电式混动车\t323124\ngrounds\t323125\n川久保玲\t323126\n北京四中房山校区\t323127\n十三罗汉\t323128\n长城论坛\t323129\n地方网\t323130\n冯如\t323131\nbriggs\t323132\n第16届\t323133\n口淫\t323134\n练功服\t323135\numich\t323136\n英菲尼迪Q50\t323137\n芋圆\t
323138\n127平\t323139\n风字\t323140\n2416\t323141\n危墙\t323142\nopenlayers4\t323143\nmedusa\t323144\n第几个\t323145\n浙江经视\t323146\n体彩网\t323147\n外交部\t323148\n男舞\t323149\n刘皇叔\t323150\n故曰\t323151\n20点\t323152\n俊\t323153\n武夷\t323154\n伊力老窖\t323155\n铜铝\t323156\n莱娜\t323157\n日月如梭\t323158\nAlly\t323159\nOst\t323160\n2017年3月15日\t323161\n雷欧\t323162\n广东东\t323163\n王庄镇\t323164\n假戏真做\t323165\n160129\t323166\n小米手环3\t323167\nregex\t323168\n朱丽叶\t323169\n盗墓者\t323170\nSUN2\t323171\n以案\t323172\n十几次\t323173\n打税\t323174\n备机\t323175\n南京夜网\t323176\n自由城\t323177\n1.68米\t323178\n北京市住建委\t323179\n厦门大学信息科学与技术学院\t323180\n41码\t323181\n血清\t323182\npdf编辑器\t323183\n刘艳\t323184\n十八级\t323185\n鼎信\t323186\n易见\t323187\n以资\t323188\n广利易购\t323189\nA+轮融资\t323190\n泰迪俱乐部\t323191\n小燕飞\t323192\n金点子\t323193\n学习习近平治国理政\t323194\n终夜\t323195\n超短裤\t323196\n马相伯\t323197\n93亿\t323198\n王云龙\t323199\n紫貂\t323200\nHELP\t323201\nsocially\t323202\n蜥\t323203\n一跳\t323204\n猛牛\t323205\n呆毛网\t323206\n诸葛神\t323207\n国际旅行社有限公司\t323208\n军乐团\t323209\n罗云琦\t323210\n在野党\t323211\n女战士\t323212\n413\t323213\n三堡\t323214\n巨光\t323215\n电流值\t323216\n张江园区\t323217\n济州联队\t323218\n死敌\t323219\ncad转换器\t323220\nhttp模块\t323221\n平衡树\t323222\n金荞麦\t323223\nzhengshu\t323224\n挖土机\t323225\n2.1.4\t323226\n主任委员\t323227\n独爱\t323228\nltps\t323229\n县情\t323230\n掘地\t323231\n缝\t323232\n若无\t323233\ndroid\t323234\n游天下短租网\t323235\n可选中\t323236\n达康\t323237\nNOT\t323238\naes\t323239\n库克\t323240\nok镜\t323241\n马家辉\t323242\nwpy\t323243\n1691\t323244\n宁波日报\t323245\nqq炫舞\t323246\n告密\t323247\n推陈出新\t323248\n接连不断\t323249\n梦幻69吧_\t323250\n吴敬\t323251\n天然气管网\t323252\n楼面板\t323253\nnake\t323254\n少辛\t323255\n德馨\t323256\n传达\t323257\nErotica\t323258\n蚁族\t323259\n三世\t323260\n凯龙股份\t323261\n二宫沙树\t323262\n猎球者\t323263\n摩擦焊\t323264\n耐热性\t323265\n4月30日前\t323266\n发票\t323267\n飞翔\t323268\n碟中谍2\t323269\npymongo\t323270\n裂谷城\t323271\n14台\t323272\n一锅粥\t323273\n番茄饭\t323274\n啪啪三国2\t323275\n大话西游手游转生\t323276\n济南市财政局\t323277\n李炳宰\t323278\n汉语拼音字母表\t323279\nAOD\t323280\n辩护人\t323281\n平减指数\t323282\n280元\t323283\nrentals\t3232
84\nVagina\t323285\ndecorators\t323286\n深山\t323287\nqq表情大全_朝夕网\t323288\n公平路\t323289\n宣化县\t323290\n知网\t323291\n排爆\t323292\n胜者京华烟云\t323293\n武文\t323294\nusb3\t323295\n实验板\t323296\n付款人\t323297\n水污染源\t323298\nstaruml\t323299\n假肢\t323300\n13场\t323301\nASP技术论坛\t323302\n1203\t323303\n张娟\t323304\n二半\t323305\n生活观\t323306\noftware\t323307\n蔡明亮\t323308\n外立面\t323309\n0x8007000\t323310\n大总统\t323311\n罗列\t323312\n那点事\t323313\n建元\t323314\n私事\t323315\n弗拉德\t323316\n塘桥\t323317\n吆喝\t323318\n易经64卦\t323319\n601766\t323320\nGoddess\t323321\nrookie\t323322\n南尖岩\t323323\nsrv\t323324\n猎人野性的呼唤\t323325\n2012版\t323326\n执\t323327\n矩阵库\t323328\n观察窗\t323329\nslap\t323330\n南京师范大学附属中学江宁分校\t323331\n钟伟\t323332\n肩头\t323333\n广赛\t323334\n公孙龙\t323335\n子伯母\t323336\n%%\t323337\n中国三峡\t323338\n毛雨张\t323339\n好欢喜\t323340\nWARE\t323341\nBart\t323342\nvs2010\t323343\n同济大学中德学院\t323344\n食宿费\t323345\n异机\t323346\n顶呱刮吧\t323347\nsr9\t323348\nlastpass\t323349\n萨甘\t323350\n卡喉\t323351\nzzb\t323352\n金米\t323353\n想太多\t323354\nspecial\t323355\n150平方\t323356\n似水流年\t323357\nMedia\t323358\n把子肉\t323359\n控制柜\t323360\n黑龙江省司法厅\t323361\nASPX\t323362\n一碗粥\t323363\n破解工\t323364\n牛奶糖\t323365\n珞珞\t323366\n環境\t323367\n京价\t323368\n165cm\t323369\n找你\t323370\n乐捐\t323371\n黑白道\t323372\n84名\t323373\n深圳市软件行业协会\t323374\n女神仙\t323375\n八小时\t323376\nContracts\t323377\n外检\t323378\n30000分\t323379\n新兴铸管\t323380\n9辆\t323381\n广东省妇联\t323382\n天津招商网\t323383\n滨海栈道\t323384\n正丁醇\t323385\n钦州北路\t323386\n暖炕\t323387\n宫内膜\t323388\n东方体育\t323389\nsql格式\t323390\n永乐镇\t323391\n5337xxxx\t323392\n我是歌手\t323393\n邪君\t323394\n猛鬼食人胎\t323395\n双花\t323396\nCHiQ\t323397\n橡木桶\t323398\n幻刺\t323399\n20150325\t323400\n康大夫\t323401\n99彩\t323402\nForeignKey\t323403\noptiplex\t323404\n喜欢自己\t323405\nCivil3D\t323406\n金鹏航空\t323407\n兵连\t323408\n梯震门\t323409\n慢性结膜炎\t323410\n贴药\t323411\ntrolley\t323412\n红底\t323413\n配用\t323414\nngx\t323415\ncky\t323416\n酷狗音乐2017\t323417\n什么处\t323418\n艾格特\t323419\nVixen\t323420\n恩替卡韦分散片\t323421\n53%\t323422\n八里庄北里\t323423\n75.5\t323424\npassware\t323425\nplaza\t323426\
n组示\t323427\n银两\t323428\n民主街\t323429\njoom\t323430\n736\t323431\n游目骋怀\t323432\n灰色\t323433\n比方\t323434\n赫芬琳\t323435\n傻白甜\t323436\n酸菜\t323437\n白去\t323438\n宁津县\t323439\n人法\t323440\n投胎\t323441\n茅草\t323442\ngraco\t323443\nMem\t323444\n灵湖\t323445\n道歉信\t323446\n南昌万达城\t323447\nconverting\t323448\n脑区\t323449\n东华中学\t323450\nimportant\t323451\n潜伏期\t323452\n周汉民\t323453\n铝硅合金\t323454\n清关\t323455\n长生桥\t323456\n美国达人秀\t323457\n畅联股份\t323458\nMrJun\t323459\n病毒性心肌炎\t323460\n古驰Gucci\t323461\ngodaddy域名\t323462\n双周\t323463\nNaval\t323464\n宝迪\t323465\n海都网\t323466\n北京司法局\t323467\n欧码\t323468\n青云\t323469\n雪鸮\t323470\n防潮板\t323471\n国网山西省电力公司\t323472\n江苏恒瑞\t323473\nCENTER\t323474\n新浪乐库\t323475\nNotepad\t323476\n闪闪的红星\t323477\n说三道四\t323478\n2000块\t323479\nYura\t323480\n宝顶\t323481\n小赵\t323482\n三架\t323483\n舌尖体\t323484\nIET\t323485\n荣超\t323486\n重听\t323487\n任人\t323488\n云澜湾\t323489\nyuzuki\t323490\n三花猫\t323491\nQGIS\t323492\n一筹莫展\t323493\nwinp\t323494\n育才一小\t323495\n运动性\t323496\n心录\t323497\neclipese\t323498\n花集网\t323499\n议价\t323500\nT560\t323501\n卡秒\t323502\nester\t323503\n要不然\t323504\n张东路\t323505\n磨机\t323506\n国家机关\t323507\n央金\t323508\ninstr函数\t323509\n东津\t323510\n俄罗斯公司\t323511\nAndré\t323512\n跳马镇\t323513\n中国音乐著作权协会\t323514\nWinSxS\t323515\n骨膜\t323516\n阜新市细河区\t323517\n龙水峡\t323518\n想家\t323519\nSPF50\t323520\n下压\t323521\n年糕机\t323522\n佳联\t323523\n阅读版\t323524\nreply\t323525\n万利\t323526\n希洛克\t323527\n无异味\t323528\n杨丽\t323529\n291\t323530\n米特网\t323531\n黄皮子坟\t323532\nMFC\t323533\n微信损友圈\t323534\narchiver\t323535\n大拇指甲\t323536\n防火办\t323537\nhack\t323538\n晚妆\t323539\nAndroi\t323540\n9m\t323541\n智囊\t323542\ndoug\t323543\n刘良\t323544\n黑币\t323545\n电磁采暖炉\t323546\nCCTV-10_央视\t323547\n二羧酸\t323548\nPyt\t323549\n塞尔达传说:野之息\t323550\n新大路\t323551\nId\t323552\n新蜀山剑侠传\t323553\n一和\t323554\n金州区\t323555\n貌美如花\t323556\n许靖韵\t323557\n不厌其烦\t323558\nActuarial\t323559\n市级\t323560\nRPM\t323561\n斗笠菇\t323562\nBOSS战\t323563\ndemo-CSDN\t323564\n鑫华\t323565\n羽鸟\t323566\n诱奸\t323567\nois\t323568\nbootgrid\t323569\nProud\t323570\n牌板\t323571\n救心
丸\t323572\n天地者\t323573\n内黄吧_\t323574\n卍解\t323575\n8dm\t323576\n潜龙\t323577\n通论\t323578\n开户行\t323579\n命运石\t323580\n赵金\t323581\nCitizenship\t323582\n特制版\t323583\n辛普森案\t323584\nImagine\t323585\nipad分屏\t323586\n乾清宫\t323587\nneat\t323588\nbilibili梗\t323589\nprincess\t323590\n魔芋胶\t323591\n存用\t323592\n骑战\t323593\n盛和\t323594\n12斤\t323595\n响水大米\t323596\n后谷\t323597\nCCTV-11\t323598\n胃窦部\t323599\n拉线机\t323600\n索引表\t323601\n38张\t323602\n异常类\t323603\n塘栖镇\t323604\n查票\t323605\n防尘套\t323606\nLILY\t323607\nindd\t323608\n9月底\t323609\n罗博\t323610\nBD沃客网\t323611\n吧友们\t323612\n再保险公司\t323613\n劝善\t323614\n长绳\t323615\n中远海运集团\t323616\n极柱\t323617\n伏兵\t323618\n沈阳202医院\t323619\n大理下关\t323620\nETABS\t323621\n2粒\t323622\nstephenykk\t323623\n2018-01-30\t323624\n专家们\t323625\n长青\t323626\n反例\t323627\n新概念英语\t323628\n2500万元\t323629\n后方\t323630\n文件服务器\t323631\njianjia\t323632\n获救\t323633\n全自动洗地机\t323634\n一汽\t323635\nHopeTrip\t323636\nhikaru\t323637\n青鬼\t323638\n合页\t323639\ncommonly\t323640\n赣州开发区\t323641\n阿里万象\t323642\nemmc\t323643\n皮夹克\t323644\n杆\t323645\njavscript\t323646\n强险\t323647\n好伙伴\t323648\n简谱版\t323649\n泥水工\t323650\n金县\t323651\n二次函数y=ax2+bx+c\t323652\n惩罚\t323653\n实用性\t323654\n79个\t323655\nsu2018\t323656\n大话羚羊\t323657\n2018年4月12日\t323658\n毕小彬\t323659\n学术研究\t323660\nspread\t323661\nchs\t323662\n视听室\t323663\n表里不一\t323664\n爱情宣言\t323665\n多媒\t323666\n表面包\t323667\n湖南联通\t323668\nLegends\t323669\n准噶尔\t323670\n梵几\t323671\n雪薇\t323672\n康怡\t323673\n五凤朝阳刀\t323674\n醉拳2\t323675\n温州市人民政府\t323676\n航海王-航海王在线漫画\t323677\n10101\t323678\n江汉北路\t323679\n积雨\t323680\n陆风汽车\t323681\n罗西尼\t323682\nCertified\t323683\n景文\t323684\n麻辣烫店\t323685\n乙袋\t323686\n参团\t323687\n神精\t323688\n首集\t323689\n狙击战\t323690\n刷屏级\t323691\n奢求\t323692\n神通\t323693\n英国人\t323694\n别娶\t323695\n东京国立博物馆\t323696\n张居正\t323697\n席德\t323698\n某一行\t323699\n长富\t323700\n发人\t323701\n琬\t323702\n2156\t323703\n轴研科技\t323704\n米老鼠\t323705\npurifier\t323706\n_天书中文网\t323707\n万科梅沙书院\t323708\nduibi\t323709\n小茹\t323710\n海上丝绸之路\t323711\n马总\t323712\n相守\t323713\n1970-01-01\t323714\n资本成
本\t323715\nsortable\t323716\n何年何\t323717\n仙海\t323718\n不肯\t323719\n耐酸碱\t323720\n第二十八次\t323721\n蔡姓\t323722\n京江晚报\t323723\n豆丁\t323724\n法仪\t323725\n金乌\t323726\n娱闻\t323727\n荣耀4A\t323728\n南漳县人民政府\t323729\n404_\t323730\n竹筷\t323731\n工艺画\t323732\n水弹枪\t323733\nChromecast\t323734\n20170730\t323735\n真炎\t323736\n第几_\t323737\n名章\t323738\nusb线\t323739\n刷盘\t323740\n资源局\t323741\n一路向西2之泰西\t323742\n晶片\t323743\n体力劳动强度\t323744\n环市东\t323745\nAresn\t323746\n数学系\t323747\n知识城\t323748\n有用功\t323749\nmm131\t323750\n陀\t323751\n半辈子\t323752\n春山\t323753\naxislabel\t323754\n麒麟950\t323755\nGigabyte\t323756\nwindows7吧\t323757\n20171213\t323758\nGets\t323759\ngoumai\t323760\n北辛安\t323761\n灰兔\t323762\n临床心理科\t323763\ndkny\t323764\n阿兰卡峰林\t323765\n人格权\t323766\n胡子哥\t323767\ntimeline\t323768\nAU\t323769\n深圳市住房保障署\t323770\nworksheet\t323771\n枪迷\t323772\n20170623\t323773\n黑非洲\t323774\n第九轮\t323775\n麦克海尔\t323776\n50c\t323777\npostgreSql\t323778\n老崔\t323779\n宽\t323780\n孙飞\t323781\n亚拉巴马州\t323782\n肥胖儿\t323783\nharrods\t323784\n叶莉\t323785\n记录员\t323786\nNPort\t323787\n北大培文学校\t323788\n李仙姬\t323789\n41级\t323790\n声像\t323791\n核心\t323792\nOriginal\t323793\n42.htm\t323794\n商资\t323795\n弹响\t323796\nRandomAccess\t323797\nsuunto\t323798\n公开号\t323799\n鉴定表\t323800\nhabse\t323801\nDulk\t323802\nflexi\t323803\n穆穆兔兔\t323804\nvci\t323805\n靥\t323806\neagles\t323807\n若以\t323808\n大无\t323809\n卧虎藏龙\t323810\nBengio\t323811\nBcoder资源网\t323812\nthor\t323813\n拼错\t323814\n中级会计师\t323815\nbilibili#\t323816\n金光华\t323817\n钱箱\t323818\n起海\t323819\n2类\t323820\n海贼之港\t323821\n制酒\t323822\n中庆\t323823\n全员群\t323824\n回温\t323825\n奈\t323826\n五粮春\t323827\n○\t323828\n七小福\t323829\n任我撸\t323830\n高分子材料科学与工程\t323831\nmui\t323832\nzzx\t323833\n阎志\t323834\n挤压\t323835\n爱魔\t323836\n巴斯德\t323837\n钣金件\t323838\n600集\t323839\n凤凰网\t323840\n桃花树\t323841\n永乐店镇\t323842\n成长路\t323843\ndebian8\t323844\n1000公斤\t323845\n娇鸾\t323846\n春秋繁露\t323847\n剔\t323848\n软件工程硕士\t323849\n乐汇城\t323850\n超级演说家\t323851\nnewInstance\t323852\n股息率\t323853\n郑海霞\t323854\nWard\t323855\n柱上\t323856\norde\t323857\
n六安论坛\t323858\nTelnet\t323859\n2003届\t323860\n乙型\t323861\n杭绍台\t323862\n太极剑\t323863\n土地利用现状分类\t323864\n譬\t323865\n竞聘稿\t323866\n严规\t323867\n海椒市\t323868\nUG6.0\t323869\n整容师\t323870\n武当\t323871\n秦云\t323872\n亚太实业\t323873\n京东支付\t323874\n西湖科技园\t323875\n皇位\t323876\n黄礼格\t323877\n长安CS95\t323878\n连翘花\t323879\n中商\t323880\n虫草素\t323881\n寂静的春天\t323882\n冷轧卷板\t323883\n1.0e\t323884\n构图\t323885\nBATTLE\t323886\n复名\t323887\n妖神记\t323888\n不知所措\t323889\nPaste\t323890\n香朵儿\t323891\n芭比之美人鱼历险记\t323892\n路亚钓法\t323893\n威海市区\t323894\n苍天白鹤\t323895\n单篇\t323896\n李丹阳\t323897\n九品芝麻官之白面包青天\t323898\n博洋\t323899\n惯性力\t323900\n赏尔雅\t323901\n章艳\t323902\n杨采妮\t323903\n动挡\t323904\n改档\t323905\n秋福师\t323906\n至多\t323907\n如日\t323908\n釉面砖\t323909\ndsb\t323910\n上塘村\t323911\n夜针\t323912\nCelebrate\t323913\n景旺电子\t323914\n马学军\t323915\n军师联盟2虎啸龙吟\t323916\n广西金融投资集团\t323917\nm21\t323918\n李建峰\t323919\n乐橙云\t323920\n能量场\t323921\nTI4\t323922\n剑灵拳士吧\t323923\n学艺\t323924\n机动战士高达ol\t323925\n重庆装修公司\t323926\nCSS-青岛星网\t323927\nJDK1.6\t323928\n神农尝百草\t323929\n步辇图\t323930\n七星漂\t323931\n生活报\t323932\n4870\t323933\n中农立华\t323934\nheron\t323935\n赘\t323936\nsupplements\t323937\n新纶科技\t323938\n芒硝\t323939\n#endif\t323940\n12P\t323941\n小记者团\t323942\n泉\t323943\nchevin\t323944\nspouse\t323945\n上海市格致中学\t323946\n世界一流大学\t323947\n布依\t323948\n周杰倫\t323949\n刘钢\t323950\nWWW.12345677.COM\t323951\n复变\t323952\n下午两点\t323953\n异事\t323954\n带刺\t323955\nacute\t323956\n够力\t323957\n读大学\t323958\n裂界者\t323959\n6.5.9\t323960\n2.3.2\t323961\nXenserver\t323962\n任职\t323963\n表述语\t323964\n最小\t323965\n交易性金融资产\t323966\n篮网队\t323967\n劝进\t323968\nppr管\t323969\n地段生\t323970\n晓佳奈\t323971\n努尔\t323972\n疑团\t323973\n2018年3月5日\t323974\n险峰长青\t323975\n28m\t323976\nretention\t323977\n初撮\t323978\n金枝玉叶\t323979\n経験\t323980\n北岳\t323981\n蛇岛\t323982\n韦达定理\t323983\n徐则臣\t323984\n锐公司\t323985\n仙房网\t323986\n揉弦\t323987\n袂\t323988\n轶闻士\t323989\n7月8日\t323990\n断箭\t323991\nvirtualenv\t323992\n基督教会\t323993\n爆炸案\t323994\n罗总\t323995\n3年级\t323996\n美洲慈鲷\t323997\n国家职业资格证书\t323998\n银汞合金\t323999\n圣所\t324000\n开窗机\t324
001\n原牛\t324002\n焦裕禄干部学院\t324003\nsomeday\t324004\n冠状动脉粥样硬化性心脏病\t324005\n防霾口罩\t324006\n尾田\t324007\n全频音箱\t324008\n主小说网\t324009\n人民利益\t324010\nleve\t324011\nvdi\t324012\n付息期\t324013\n下下\t324014\n101分\t324015\nrealm\t324016\n无色无味\t324017\nPenfolds\t324018\n1nm\t324019\nv2.7.3\t324020\n3159\t324021\n当归苦参丸\t324022\n科尔森\t324023\nPrayer\t324024\n7470\t324025\n石塘\t324026\nshank\t324027\n前途\t324028\n核电\t324029\n医指通\t324030\nvsqx\t324031\n内科学\t324032\n梦幻西游85\t324033\n间奏曲\t324034\n肇庆鼎湖\t324035\n安信证券\t324036\n注册美区\t324037\n框窗\t324038\n永不会\t324039\n李长明\t324040\n河北燕达医院\t324041\n腰突\t324042\n正月十三\t324043\n并网逆变器\t324044\n四分五裂\t324045\n无影\t324046\n绥化\t324047\n水榕\t324048\n皇氏乳业\t324049\n夹江\t324050\n1000G\t324051\nDIY\t324052\n吴教授\t324053\n2015年9月份\t324054\n极值\t324055\nzuowen\t324056\n脱白\t324057\n噻虫胺\t324058\n公大\t324059\n纸罐\t324060\n中机车辆技术服务中心\t324061\n6厘米\t324062\n描图\t324063\n7.0级\t324064\n乐玛\t324065\n先考\t324066\n迪巴拉\t324067\n金矢\t324068\n三拓\t324069\n17%\t324070\n大循环\t324071\n冲锋队\t324072\n判读\t324073\n699元\t324074\nHats\t324075\nTonic\t324076\n电闸\t324077\n田家炳\t324078\n蒙版\t324079\n不以跬步\t324080\nharvest\t324081\n通葡股份\t324082\n相逢\t324083\n马璐\t324084\n旧值\t324085\n青春梦\t324086\ndetermines\t324087\nFIVE\t324088\n查图\t324089\n真三国无双3\t324090\n我本善良\t324091\n移过\t324092\n0135\t324093\n嘉兴乐居网\t324094\n吉野kun\t324095\nf560\t324096\n新商标法\t324097\n三菱欧蓝德\t324098\n见花\t324099\n望亭镇\t324100\nEAST\t324101\n和讯博客\t324102\n70小时\t324103\nbioinformatics\t324104\n莲花村\t324105\n2px\t324106\n黑瞎子岛\t324107\n空气清新剂\t324108\n接亲\t324109\n优悦会\t324110\n点型\t324111\n踪\t324112\n也应\t324113\n五龙潭\t324114\nfused\t324115\n代收\t324116\nventuz\t324117\n一季度末\t324118\n周口市中心医院\t324119\n配货站\t324120\n富兰克林\t324121\n高凤林\t324122\n胎头\t324123\n张旭光\t324124\n承包\t324125\n补修\t324126\n收成\t324127\n中山雍博堂红木_广州红木家具\t324128\n极恶非道2\t324129\n护美\t324130\n微端\t324131\n敬汉卿\t324132\n沼泽地\t324133\n短平快\t324134\n皂角米\t324135\nissn\t324136\n10.0.1\t324137\n修建性\t324138\n刑事科学技术\t324139\n宝马Z4\t324140\n普资\t324141\n5G\t324142\n12届\t324143\n嗡\t324144\n伶仃洋\t324145\n垂壁\t324146\n杰
克·伦敦\t324147\n陈家林\t324148\n北京电影制片厂\t324149\n时费\t324150\n李永健\t324151\n第一次\t324152\n吴传生\t324153\n吊箱\t324154\n风水类\t324155\n窈窕淑女\t324156\n防腐瓦\t324157\nRviz\t324158\n190年\t324159\njo\t324160\nsaomiao\t324161\n周国华\t324162\n兑换\t324163\n滨海高新区\t324164\n巴克斯\t324165\n箱箱\t324166\n越盘\t324167\n软组织肉瘤\t324168\n邮差\t324169\nCISCO\t324170\nTVS管\t324171\n联众大厅\t324172\n美兰国际机场\t324173\n保胎药\t324174\n长嫡\t324175\n煜\t324176\n招标师考试\t324177\n八式\t324178\n56听书网\t324179\nsomehow\t324180\n中文娱乐网\t324181\nako\t324182\n上海东方肝胆医院\t324183\n50KG\t324184\n熨烫\t324185\nECD\t324186\n7E\t324187\n达成率\t324188\n竹筒\t324189\nRises\t324190\n光鲜\t324191\n3843\t324192\n旧情人\t324193\n饭碗\t324194\n追\t324195\n让一让\t324196\n东方朔\t324197\n雨芳\t324198\n二战风云2\t324199\n吞并\t324200\n蜜儿\t324201\n裴氏\t324202\n天天干\t324203\nImperial\t324204\n正明\t324205\n火影忍者:博人传\t324206\n割肉\t324207\n20171102\t324208\n中国科学院广州能源研究所\t324209\n灵水村\t324210\nleaky\t324211\n金缕玉衣\t324212\n复印本\t324213\nchronous\t324214\n宝珍商城\t324215\n合住\t324216\n切词\t324217\n沮\t324218\n見\t324219\n湖南广电\t324220\n出诊\t324221\n达里湖\t324222\nRenewable\t324223\n雯\t324224\n质量分户\t324225\n流苏\t324226\n资产证券化\t324227\nMDK\t324228\nOFFICIAL\t324229\n重订\t324230\n书法家协会\t324231\ncoreutils\t324232\nwinForm\t324233\n传统版\t324234\n坏点\t324235\n当前页\t324236\n包头职业技术学院\t324237\n心态\t324238\n纱\t324239\n17909\t324240\n肠瘘\t324241\n液空\t324242\n三元村\t324243\n第十季\t324244\nHoney\t324245\n裙带菜\t324246\nMech\t324247\n三希堂法帖\t324248\n西部大开发税收优惠政策\t324249\n进制转换\t324250\n非学历教育\t324251\n吉娃娃\t324252\n胡光\t324253\nios模拟器\t324254\n劳动权\t324255\n周枫\t324256\n场子\t324257\n文职\t324258\n稳定增长\t324259\n坦克世界论坛\t324260\n21天后\t324261\n上海市幼儿园\t324262\nSWF文件\t324263\n胶东国际机场\t324264\nshm\t324265\n紫袍\t324266\n854\t324267\n小师\t324268\n仙剑奇侠传三\t324269\nAcode\t324270\n加减号\t324271\n海容\t324272\n大峰\t324273\n南通市中医院\t324274\n保定市人民政府\t324275\n战争片\t324276\n眼液\t324277\n全库\t324278\n喜洲镇\t324279\n2016年12月25日\t324280\n无则加勉\t324281\nDaniWeb\t324282\nOpenFOAM\t324283\n防护员\t324284\n明朝末年\t324285\nmschart\t324286\nhce\t324287\n安徽省统计局\t324288\ndynamo\t324289\n人工智能公司\t
324290\n乄\t324291\n电伴热\t324292\nElectra\t324293\nA9LH\t324294\n保温瓶\t324295\n.2\t324296\n储蓄卡\t324297\n峰平谷\t324298\n搞垮\t324299\nREP\t324300\n安徽行政学院\t324301\nrtsp协议\t324302\nSTARS\t324303\nteamview\t324304\n关节式\t324305\n格\t324306\n透魔\t324307\n中国大学生\t324308\n牌店\t324309\n鸡血\t324310\n中海锦城\t324311\n春东财\t324312\n阴道炎\t324313\n身世\t324314\nWindows\t324315\n麻木\t324316\n附庸国\t324317\n废卡\t324318\n黑BA\t324319\n清华法学院\t324320\n对唱\t324321\n經\t324322\n醉卧君怀笑离殇\t324323\n金地地产\t324324\n呼吸过度\t324325\n中州古籍出版社\t324326\n步器\t324327\n异特\t324328\n事业型\t324329\n华欣\t324330\n林郑月娥\t324331\n潜航者\t324332\n冷疗\t324333\ngoog\t324334\nHarrow\t324335\n55号\t324336\n神翼龙\t324337\nZeus\t324338\n米堆\t324339\nproe4.0\t324340\n了不起的匠人\t324341\n大鱿鱼\t324342\n长沙市国资委\t324343\n语文出版社\t324344\nps在线\t324345\n杂种\t324346\n本馆\t324347\n观音山国际商务营运中心\t324348\n生益科技\t324349\nLG\t324350\n曼朱基奇\t324351\n世纪天成\t324352\n无用小说网\t324353\n残照\t324354\nADVANCE\t324355\n勿忘国耻\t324356\n单音节\t324357\n民生手机银行\t324358\n小康股份\t324359\n湖南省国家税务局\t324360\n尤靖茹\t324361\n江西财经大学\t324362\n小宴\t324363\n腾讯电脑管家吧\t324364\n大牧场主\t324365\n幸福9号\t324366\n宽子\t324367\n公共营养师\t324368\n挂钟\t324369\n让我留在你身边\t324370\n两小\t324371\ngto\t324372\n4.7%\t324373\nEstablishment\t324374\n中央环境保护督察组\t324375\nDDO\t324376\n亮泽\t324377\n星露\t324378\nj3160\t324379\nloog\t324380\n校内网\t324381\n死而后已\t324382\nTMI\t324383\n380\t324384\n桐原エリカ\t324385\n恨天高\t324386\n后会有期\t324387\n第179章\t324388\n锋神\t324389\n兄妹文\t324390\n16299.98\t324391\n王彬宇\t324392\n禁化武组织\t324393\nArte\t324394\nexchanger\t324395\n墓位\t324396\n神魔之战\t324397\n十渡镇\t324398\n超硬\t324399\n币价\t324400\n航天飞机\t324401\n刮骨疗伤\t324402\njack\t324403\n可道云-KodExplorer\t324404\n白边液\t324405\ndescend\t324406\n提线木偶\t324407\n景状物\t324408\n黑莓Q20\t324409\n伯德图\t324410\n反复无常\t324411\n归一\t324412\n财源\t324413\n100种\t324414\ndelicate\t324415\n尽如人意\t324416\n深茂铁路\t324417\nsanctuary\t324418\nCCTV5在线\t324419\n6.9%\t324420\n别家公司\t324421\n一弯\t324422\n油池\t324423\nBodies\t324424\n传艺\t324425\n李可可\t324426\n火化率\t324427\nflotherm\t324428\n沈梦蒋方舟\t324429\n特丽丝\t324430\n文件许可权错误\t324431\nip
tux\t324432\n子宫口\t324433\n身操\t324434\n癌病\t324435\n张峻豪\t324436\n雪佛兰科迈罗\t324437\nCup\t324438\n20161125\t324439\n伸长\t324440\n遗传病\t324441\n公园\t324442\n人机料法\t324443\n海菜\t324444\n_英雄连\t324445\nADHD\t324446\n王军辉\t324447\n花丝\t324448\n陇头\t324449\n张裕\t324450\n圩堤\t324451\n西土城\t324452\n肉粥\t324453\n李海鹏\t324454\n舰无虚发:暗星\t324455\n气滞\t324456\n不够成熟\t324457\nchooseImage\t324458\n2017年7月31日\t324459\n点拨\t324460\ninput、textarea\t324461\n任弼时\t324462\n7399\t324463\n电动阀\t324464\n水西门\t324465\n4联\t324466\n杭州公安局\t324467\n九典制药\t324468\n试管婴\t324469\njiqi\t324470\n土门镇\t324471\n波士堂\t324472\n气荒\t324473\nPew\t324474\n沃特碧\t324475\n设计素\t324476\n闭环控制系统\t324477\n剧社\t324478\n进者\t324479\n吴大正\t324480\n手提包\t324481\n青岛山大医院\t324482\n夕阳骑士\t324483\n板带\t324484\n舰体\t324485\n降雪量\t324486\n吸油棉\t324487\ncomputed\t324488\n蒙蔽\t324489\n曹髦\t324490\n汇兑损益\t324491\n质量保证\t324492\n幸福航空\t324493\ncss伪元素\t324494\n自提柜\t324495\nC++编程\t324496\n自然保护区\t324497\npathological\t324498\n妖道\t324499\n海太\t324500\n生活不止眼前的苟且\t324501\n06届\t324502\n2386\t324503\n父系\t324504\n切片器\t324505\n魔兽UI\t324506\n金剑雕翎\t324507\nc3p0数据库连接池\t324508\n蟾蜍\t324509\n4V4\t324510\n辗转反侧\t324511\nnav\t324512\n11.26\t324513\n大连理工大学出版社\t324514\n化学平衡\t324515\nPurchases\t324516\n中国法制出版社\t324517\n民生类\t324518\nGCCP\t324519\ngaining\t324520\n诗情画\t324521\n指鹿为马\t324522\n西京\t324523\n九江市\t324524\n锥管\t324525\nForgive\t324526\n三次方根\t324527\n城市名片\t324528\n举措\t324529\n抱紧\t324530\n电脑吧\t324531\nBRD\t324532\n全席\t324533\n网页游戏大全\t324534\n加下\t324535\n18x\t324536\nclientid\t324537\n万志勇\t324538\n中国南宁_南宁市政府\t324539\n羞羞的铁拳\t324540\n斗鸡苗\t324541\n十天后\t324542\nFuel\t324543\n来样\t324544\n律令\t324545\n优耐特\t324546\n六扇\t324547\n陕西省卫生和计划生育委员会\t324548\n周飞\t324549\n8018\t324550\n亚南\t324551\nNatural\t324552\n哈三中\t324553\n珍珠\t324554\ndei\t324555\n甘肃省审计厅\t324556\n考评员\t324557\n积水潭\t324558\n对公业务\t324559\n255.255.255.192\t324560\n宁夏生殖保健院\t324561\n朱鹮\t324562\n小箱梁\t324563\n马丹\t324564\nJanice\t324565\n1000块\t324566\n20x\t324567\n暴君\t324568\nPropTypes\t324569\nNPAPI\t324570\n耐材\t324571\n扶灵\t324572\nMeteorological\t
324573\n刘哲\t324574\n盖侬\t324575\n如东县\t324576\n一心\t324577\n20170427\t324578\n二次世界大战\t324579\n601727\t324580\n绝地求生开镜\t324581\n每周日\t324582\n套板\t324583\n丝宝\t324584\n罗浮宫\t324585\nServerSocket\t324586\n海达范文网\t324587\n李智贤\t324588\n科洛弗悖论\t324589\n淮南政府网\t324590\n着眼点\t324591\n预发\t324592\n空气滤清器\t324593\nintensive\t324594\n白马关\t324595\nPHYSICAL\t324596\nvito\t324597\n入户门\t324598\n三浦友\t324599\n神评\t324600\n兴隆湖\t324601\nethernet\t324602\n子张\t324603\ngeany\t324604\n李东垣\t324605\n幸福指数\t324606\nvot\t324607\n坂井泉\t324608\n第175章\t324609\n图流\t324610\n醋栗\t324611\n英雄传\t324612\nWF\t324613\nquotation\t324614\n20来岁\t324615\n鳕鱼片\t324616\n小学区\t324617\nNW\t324618\n丙烷脱氢\t324619\n杂化\t324620\n45.5\t324621\n流浪法师瑞兹\t324622\n节约用水\t324623\n铜陵市人民政府\t324624\n腹黑兔\t324625\nQC20\t324626\n重镇\t324627\n图灵\t324628\n甲沟炎\t324629\n原位\t324630\n军职\t324631\n为你等待\t324632\n生化危机HD\t324633\n月星家居广场\t324634\n东诚药业\t324635\n白裤袜\t324636\n森林The\t324637\n贵金属纪念币\t324638\noutstanding\t324639\n从你的全世界路过\t324640\n07月\t324641\n个人工\t324642\n湖南国际会展中心\t324643\n奔驰B200\t324644\n桌面虚拟化\t324645\n灞河\t324646\n晴格格\t324647\n草头\t324648\nhci\t324649\n本·西蒙斯\t324650\n沪c\t324651\n三班\t324652\nmage\t324653\n同归于尽\t324654\n杨柳青镇\t324655\nf513\t324656\nAD域控制器\t324657\nCAXA2013\t324658\n10021\t324659\n治世\t324660\n侦听器\t324661\n中国水产科学研究院\t324662\n南京农业银行\t324663\n重庆市人力资源和社会保障局\t324664\nPhoton\t324665\n数词\t324666\n彼得与狼\t324667\n广东新快网\t324668\nJin\t324669\n丝状疣\t324670\n吴丽萍\t324671\nMessi\t324672\n10万元\t324673\n甚至\t324674\nブレ\t324675\nkgm\t324676\n宝马新5系\t324677\nseamless\t324678\n浓香型\t324679\n好望\t324680\n上海交通大学图书馆\t324681\nクス\t324682\n云南省中医院\t324683\n冠生园\t324684\n坯子\t324685\n60%\t324686\n韩七录\t324687\n沈殿霞\t324688\n尤氏\t324689\nbotoo\t324690\nMiddle\t324691\n雅培小安素\t324692\n周记\t324693\n7.4.2\t324694\n窗帘盒\t324695\n直燃式\t324696\n北京中投信德\t324697\n吴老\t324698\n胜宏科技\t324699\nroma\t324700\n几本\t324701\n3.4米\t324702\n轧死\t324703\n親子\t324704\n舞神\t324705\nV1.0.1\t324706\n焯\t324707\n美媛馆\t324708\n糗事百科\t324709\nlocks\t324710\n胆码\t324711\n宝塔山\t324712\nWin7/Win10\t324713\n尖山新区\t324714\n美
资\t324715\n山西博物院\t324716\n8341\t324717\n益州\t324718\n用法\t324719\n四讲\t324720\n三盘\t324721\nBOF\t324722\nINC\t324723\n中宋\t324724\n11.6%\t324725\n铸钢件\t324726\n1185\t324727\n再生侠\t324728\n片材\t324729\n光带\t324730\n心电监护仪\t324731\n蓝兮\t324732\n中缀\t324733\n文旅集团\t324734\nCacti\t324735\n330TSI\t324736\ncompiling\t324737\n咸宁温泉\t324738\n刘著\t324739\n有利于\t324740\nhx\t324741\n以赛亚书\t324742\n融洽\t324743\n体面\t324744\n浸透\t324745\n面照\t324746\n奇迹世界SUN\t324747\n天猫宝贝\t324748\n忘了\t324749\n猫狗\t324750\n穆特\t324751\n诺氟沙星\t324752\n离子色谱仪\t324753\n七都镇\t324754\n得心应手\t324755\n掘进机\t324756\n再制造\t324757\n第七十章\t324758\nd3dx9\t324759\nvcredist\t324760\nwindows批处理\t324761\n张海东\t324762\n反光罩\t324763\n收听率\t324764\n保代\t324765\nax7\t324766\n有资格\t324767\n公网\t324768\n马克西姆\t324769\nIGMP\t324770\n佛山五区\t324771\nURS\t324772\n淳溪镇\t324773\nM+\t324774\n第四五\t324775\n20170526\t324776\ngravity\t324777\n宏站\t324778\nR36\t324779\n悠望\t324780\nBooking\t324781\n唐刀\t324782\n两名\t324783\n白鹤街道\t324784\n规范版\t324785\n照样\t324786\nMIN函数\t324787\n球会\t324788\n咒怨\t324789\n东海岛\t324790\n藏藏\t324791\n马兴田\t324792\n劳务分包合同\t324793\nRoasted\t324794\ninnocence\t324795\n晴初\t324796\n白晓燕\t324797\nWindows任务管理器\t324798\n前端篇\t324799\nMapbox\t324800\n原味网\t324801\n环评\t324802\n官神\t324803\n呼吸器\t324804\n安监站\t324805\n深喉口交\t324806\n二锅\t324807\n陆\t324808\n珠城\t324809\nwindebug\t324810\n存低\t324811\n优卡白条\t324812\nx99\t324813\nwin7/XP/\t324814\n指套\t324815\n装逼源码网\t324816\nJDialog\t324817\n静态变量\t324818\n剑胆\t324819\n五十元\t324820\n诺甘农\t324821\n脑点子\t324822\n多环\t324823\n中华人民共和国水污染防治法\t324824\n宝腾\t324825\nwilliam\t324826\n四百\t324827\n宁强\t324828\n小公募\t324829\n马天尼\t324830\n国标舞\t324831\nー\t324832\n小黄脸\t324833\nSass\t324834\n全顺论坛\t324835\n董事会\t324836\n页式\t324837\nlng\t324838\n相同项\t324839\n动物界\t324840\nEI检索\t324841\n128\t324842\n梅子色\t324843\n堡垒之夜\t324844\ndp\t324845\nq2002\t324846\n情逆三世缘\t324847\n菜苔\t324848\n揭谛\t324849\n物理学系\t324850\n反交\t324851\n刘玉\t324852\n送料\t324853\n海兰珠\t324854\n水缸\t324855\n八轮\t324856\n妇科炎症\t324857\n中铁逸都国际\t324858\n莲塘街道\t324859\nAbcam\t324860\n季夜\t324861\ncrafts\
t324862\nms17010\t324863\nreliably\t324864\n功机\t324865\n蚝式恒动\t324866\n灿辉龙\t324867\ndirectly\t324868\n中学时代\t324869\n埃斯库罗斯\t324870\nreed\t324871\n厦门大学海洋与地球学院\t324872\n兰溪镇\t324873\n苹果小米\t324874\n4千\t324875\n江东大道\t324876\n000423\t324877\n布拖\t324878\n北京广安门中医院\t324879\n蚕花\t324880\n属\t324881\n科罗拉多河\t324882\n全面战争战锤2\t324883\n妖娆\t324884\n布雷博\t324885\n2017-2017年\t324886\n喷画\t324887\n乔治五世\t324888\nRViz\t324889\nevga\t324890\n烧伤科\t324891\ncad图纸\t324892\n矿用电缆\t324893\n八吨\t324894\n南洋杉\t324895\n女王公园\t324896\n延迟\t324897\n淳道\t324898\nosprey\t324899\n210路\t324900\n讲义\t324901\n5ml\t324902\nhdpi\t324903\n1mil\t324904\n挤痘\t324905\n咸安\t324906\n古色\t324907\n生育健康网\t324908\n柳大华\t324909\nAcer\t324910\n瑞华会计师事务所\t324911\n艺能\t324912\nQS标志\t324913\n壳牌加油站\t324914\n私刑\t324915\nBoven\t324916\n李恒\t324917\nHTML元素\t324918\n一汽丰田汽车\t324919\n前人\t324920\n球心\t324921\n参与方\t324922\n盡\t324923\n80p\t324924\n孝善\t324925\n传众网\t324926\nfor在线翻译\t324927\nHaswell\t324928\n太阳能电池\t324929\n文艺类\t324930\nHONGKONG\t324931\n东坡区人民政府\t324932\nCreed\t324933\n梁河县\t324934\nj10\t324935\n缤客\t324936\n浮亏\t324937\n南遥\t324938\n星灵\t324939\nbeginning\t324940\n主婦\t324941\n滴丸\t324942\n旱田\t324943\nminimization\t324944\n信用值\t324945\n獭兔\t324946\n八卦田\t324947\n项坠\t324948\n圈内人\t324949\neduour\t324950\n单步\t324951\n大杨创世\t324952\n黄金太阳2\t324953\n公园大道\t324954\nOneKeyTools\t324955\nldh\t324956\n家务\t324957\napk包\t324958\n茜\t324959\n决择\t324960\n20171101\t324961\n孩子气\t324962\n临江花园\t324963\n侧吸式抽油烟机\t324964\n鳑鲏鱼\t324965\n梦园\t324966\n丁丁\t324967\n这是为什么呢\t324968\n太平洋汽车\t324969\nDani\t324970\n21元\t324971\n恒安集团\t324972\n五行天\t324973\n彩虹沙漠\t324974\noffices\t324975\n兽王者\t324976\nc3xr\t324977\n2018.04.15\t324978\n330亿\t324979\n嘉华大厦\t324980\nepu\t324981\n颠覆者\t324982\n深圳市第五人民医院\t324983\n4间\t324984\n县检察院\t324985\nglf\t324986\n中国农业发展集团有限公司\t324987\n有心无力\t324988\n第22届\t324989\n7.5分\t324990\nマンコ\t324991\n再卖\t324992\n美宝莲\t324993\nuntagged\t324994\n江月路\t324995\n北武当山\t324996\nbaocms\t324997\n杨州\t324998\n68282100\t324999\n忙活\t325000\n山豆根\t325001\n万山\t325002\n蝌蚪窝\t325003\n大唐电
信集团\t325004\n水刊\t325005\n中国共产党廉洁自律准则\t325006\n都江堰\t325007\n腋臭\t325008\n融泽嘉园\t325009\n奥蜜思\t325010\n4S店\t325011\n杨志坚\t325012\n安徽招聘网\t325013\n13号\t325014\n快好\t325015\n赵家班\t325016\n散元\t325017\n老刀\t325018\n15颗\t325019\n雪铁龙爱丽舍\t325020\n芥蒂\t325021\n拐点\t325022\n6306\t325023\nBOOK\t325024\n0409\t325025\n公杂费\t325026\ncade\t325027\n徐泾\t325028\n东台\t325029\n酒泉子\t325030\n长白岛\t325031\n女骑士\t325032\n安雯\t325033\n露毛\t325034\nsubaru\t325035\n妊辰纹\t325036\n国家能源投资集团\t325037\n连通域\t325038\n颗粒板\t325039\nSprites\t325040\n国际主义\t325041\n团代\t325042\n成都社保局\t325043\n越界宋时归\t325044\nheiress\t325045\n2.5三段\t325046\n风笛\t325047\n地址栏\t325048\n预字\t325049\n赎买\t325050\n20世纪50年代\t325051\ncontrast\t325052\n贩卖毒品罪\t325053\n民安\t325054\nTWiki\t325055\nUnity3D研究院\t325056\n定氮仪\t325057\n眼眉\t325058\n林和西\t325059\n楽天市場\t325060\n张琛\t325061\n旬\t325062\n种场\t325063\nMicrosoft.Jet.OLEDB.4.0\t325064\n各关\t325065\n新苗\t325066\n攸县新闻网\t325067\n欢喜佛\t325068\n马弗炉\t325069\n忘机\t325070\n幂幂\t325071\n建账\t325072\n研讨稿\t325073\ngeass\t325074\n雷尼替丁\t325075\n京蒙\t325076\n环流\t325077\nSQL2000+Delphi\t325078\n魔侠传\t325079\n八代i7\t325080\n安丰网\t325081\nnivolumab\t325082\n世界邦\t325083\n开动\t325084\n同一天\t325085\n城市建设史\t325086\n双差\t325087\n135BT网\t325088\n信条\t325089\n连队\t325090\n静态类\t325091\ntasklist\t325092\n淳化县\t325093\n皇国\t325094\n绕圈\t325095\npov\t325096\n细胞\t325097\n加拉克苏斯\t325098\n心痒痒\t325099\nparental\t325100\n勒夫\t325101\n省厅\t325102\n样法\t325103\n梦妆\t325104\n禀\t325105\nblk\t325106\n鲁米\t325107\n草丛\t325108\n印刷纸\t325109\n文超\t325110\n15.12\t325111\n朱明跃\t325112\n容许\t325113\n刀光\t325114\n贫困率\t325115\n克东县\t325116\n两周后\t325117\n矿山\t325118\n农神\t325119\n秘招\t325120\nultraiso软碟通\t325121\n林丛\t325122\n2018年4月10号\t325123\n活蛆\t325124\n高淳\t325125\nitc\t325126\n多效\t325127\n贾生\t325128\n洪海\t325129\n哪一个人\t325130\n苹果园站\t325131\n公话\t325132\nLevi\t325133\nCVI\t325134\n奈奈\t325135\n滚动条\t325136\n第十一季\t325137\ni94\t325138\n上海市黄浦区中心医院\t325139\n中建三局三公司\t325140\n3瓶\t325141\n正应力\t325142\n云旺\t325143\n水温机\t325144\n罗技g933\t325145\n绯红\t325146\n邀请\t325147\n夏至\t325148\n6500元\t325149\n安迷修\
t325150\n惧怕\t325151\n龙华政府\t325152\n茶果\t325153\n往东\t325154\n东离剑游记\t325155\nbayer\t325156\njtds\t325157\n假面骑士Ghost\t325158\n惊字\t325159\n陈志敏\t325160\n节度\t325161\n绳金塔\t325162\n川乌\t325163\nInstallshield\t325164\n杨华\t325165\ncnnic\t325166\n马兵\t325167\n赛欧论坛_汽车之家论坛\t325168\n局务\t325169\nEURO\t325170\nLCF\t325171\nMAXUS\t325172\n男人物\t325173\n西安红盾信息网_西安市工商行政管理局\t325174\n安房网\t325175\n井\t325176\n1809\t325177\n没有心\t325178\n脖\t325179\n福特新蒙迪欧\t325180\n北大培文杯\t325181\n清欲\t325182\n2500元\t325183\n中医药大学\t325184\n汉阳大道\t325185\n牛津英汉\t325186\n返货\t325187\n扫描网\t325188\nandoird\t325189\n需要\t325190\n中移互联网有限公司\t325191\n麦锐\t325192\nanoi\t325193\n诛仙召唤万岁\t325194\n快攻\t325195\nvirtio\t325196\n下户\t325197\n积分卡\t325198\n独山港\t325199\n窄幅震荡\t325200\n中国长城科技集团股份有限公司\t325201\n黑便\t325202\n恒丰\t325203\nsox\t325204\n人物简笔画_奔跑网\t325205\nparo\t325206\n武汉站\t325207\n尾场\t325208\n缩尾\t325209\nCrashes\t325210\n逆周期\t325211\n万宇\t325212\n行合一\t325213\n君\t325214\n学习笔记\t325215\n镇魂歌\t325216\n无玛\t325217\n铁山寺\t325218\n雪佛兰乐驰\t325219\n张雨欣\t325220\nA轮\t325221\n鬼事\t325222\n戴娆\t325223\n戏精牡丹\t325224\n速滑\t325225\n丁度\t325226\nsunev\t325227\n360度\t325228\nPLSQL\t325229\n千兆路由\t325230\n盖罐\t325231\n书苑\t325232\n三星A8000\t325233\n复方福尔\t325234\n当代经济\t325235\n交大安泰\t325236\nProviders\t325237\njdzj\t325238\nOpenvpn\t325239\nProcurement\t325240\nlibr\t325241\n无样式\t325242\n一二三类\t325243\n7块\t325244\nitext5\t325245\n渝价\t325246\n军令状\t325247\n港币保卫战\t325248\n鸡腿子\t325249\n黄其淋\t325250\n粒度仪\t325251\n全国卷3\t325252\n俏房客\t325253\n盛怒\t325254\n雅行\t325255\n十三世纪\t325256\n影像志\t325257\n收监\t325258\n百乐笔吧\t325259\n下一位\t325260\nawards\t325261\n仁化县\t325262\nFAR\t325263\n拉长\t325264\n零漂\t325265\n初期末\t325266\nsokect\t325267\n贵州燃气\t325268\nprintk\t325269\n丽江花筑·花自\t325270\n5座\t325271\n薄雾\t325272\n清远政府\t325273\n凤凰国际书城购物中心\t325274\n陌路凡歌\t325275\n胃康灵\t325276\nNative开发】React\t325277\n天子岭\t325278\n周毅\t325279\n雷德\t325280\n弓刀\t325281\n同校生2\t325282\n广而告\t325283\n冒牌货\t325284\nTruncated\t325285\n迪丽\t325286\n小平岛\t325287\n第106期\t325288\nReads\t325289\n100万韩元\t325290\n全同\t325291\n环城河\t32529
2\n一念间\t325293\n汉巴味德\t325294\n艾米利亚\t325295\n大众点评团\t325296\n至尊卡\t325297\n美团酒店\t325298\n艾芙\t325299\n注塑\t325300\n银刃\t325301\n峡口\t325302\n算账\t325303\n张红岩\t325304\n塘南\t325305\n免驱\t325306\nqt5.7\t325307\n第八十\t325308\n关联表\t325309\n红颜劫\t325310\n小商品批发市场\t325311\n山纹\t325312\n人工流产\t325313\nelementui\t325314\n回潮\t325315\n真希\t325316\n110枚\t325317\n超过4小时\t325318\n上海物业\t325319\n郎鹤焱\t325320\nJosé\t325321\nJSA\t325322\n手抄\t325323\n在底层\t325324\n重判\t325325\n株\t325326\n蛰\t325327\nh8\t325328\n团契\t325329\n张雪馨\t325330\nnewifi新路由\t325331\n上虞区\t325332\n中国船舶工业集团公司\t325333\n场外\t325334\n峨山县\t325335\n乌米饭\t325336\n健康时报\t325337\n绝对诱惑\t325338\n重庆交通职业学院\t325339\n手游礼\t325340\nkeyi\t325341\najv\t325342\n库神\t325343\nxiadd\t325344\nFGO本能寺\t325345\n星怡会\t325346\n小姨子\t325347\n快本\t325348\n信价\t325349\n排水管\t325350\n中航技\t325351\n不用其极\t325352\n金融投资公司\t325353\nE5300\t325354\ncpt\t325355\nSKSE\t325356\n烤乳猪\t325357\n终态\t325358\n森保\t325359\nDifferentiation\t325360\n搜狗拼音输入法2017\t325361\nchildlife\t325362\n金融经济学\t325363\nFROM\t325364\n订金\t325365\n有机化学吧\t325366\n支离人\t325367\n南京浦口区政府网\t325368\n李保国\t325369\n刘雨\t325370\nTwelve\t325371\nQTP\t325372\n万缘\t325373\n芯片股\t325374\n冠豸山\t325375\n水浒外传\t325376\n10.8%\t325377\n承德机场\t325378\n干脆\t325379\n秀秀\t325380\n科达洁能\t325381\n烤肉酱\t325382\n白娉婷\t325383\n三皮\t325384\n170厘米\t325385\n优势种\t325386\nMST\t325387\n0601\t325388\n48系\t325389\n彭总\t325390\n绵阳市场信息协会\t325391\nsian\t325392\n池边\t325393\nAxel\t325394\n英力士\t325395\n传导\t325396\n8.3.29\t325397\n磨店\t325398\ngirls\t325399\n基德奥特曼\t325400\n6240\t325401\n1938年\t325402\n摇马\t325403\n2004款\t325404\ndownload\t325405\n高阳公主\t325406\n弱水千流\t325407\n小儿子\t325408\n北安\t325409\n影视留声机\t325410\n宜阳\t325411\nFiled\t325412\n过眼云烟\t325413\nLeng\t325414\n赛\t325415\n登云股份\t325416\n高中应用物理竞赛\t325417\n国宝银行\t325418\n肖申\t325419\njjwt\t325420\n福耀玻璃工业集团股份有限公司\t325421\n郑秀晶\t325422\n不少\t325423\nzsezt\t325424\n太旺\t325425\n新韩\t325426\n團\t325427\n并排序\t325428\n黑草\t325429\n啵啵鱼\t325430\n人生无限公司\t325431\nDownMP3\t325432\n安装业\t325433\nSHUMO\t325434\n死精\t325435\n不同凡响\t325436\n独领风骚\t
325437\n平衡线\t325438\nv1.1.6\t325439\n滨湖小区\t325440\n老万\t325441\n纹饰\t325442\n陈丸\t325443\nB350F\t325444\n倒睫\t325445\n国色\t325446\n5000次\t325447\n杂气\t325448\n软梯\t325449\n副行长\t325450\n红参茶\t325451\n颓废\t325452\n中国留学服务中心\t325453\n血虫\t325454\n思夫\t325455\n第50期\t325456\n半导体有限公司\t325457\n渤海国\t325458\n黄体酮\t325459\nコレ\t325460\n豆沙网\t325461\n北航软件学院\t325462\n我日\t325463\nBeetle\t325464\nmmt\t325465\n变胖\t325466\n红平特\t325467\n_当游网\t325468\n先锋乒羽\t325469\n均值\t325470\n伺服电动缸\t325471\n泾县\t325472\nLoro\t325473\n阿尔托利亚·潘德拉贡\t325474\nQ点\t325475\n油源\t325476\n大魔头\t325477\n伊西斯\t325478\nkeepout\t325479\n68分\t325480\n胞吐\t325481\n月风\t325482\n舒良\t325483\n視頻\t325484\nDior迪奥\t325485\n龚平\t325486\nEdifier\t325487\n思慕\t325488\n一往而深\t325489\n重庆人流医院\t325490\n代换\t325491\n43部\t325492\n高坪区\t325493\n荣耀pro\t325494\n焊接机器人\t325495\n清华大学经济管理学院\t325496\n朱家角镇\t325497\n堵转\t325498\n2017年3月25日\t325499\n恶夜\t325500\n乙型病毒性肝炎\t325501\n小越\t325502\n斯密斯\t325503\nabap吧\t325504\n长江新城\t325505\n第三十二章\t325506\n阶石\t325507\n土地一级开发\t325508\n深圳市大疆创新科技有限公司\t325509\n花寒\t325510\n副团\t325511\n王思斌\t325512\n星推网\t325513\n万魂\t325514\n阿庆\t325515\nJapanese\t325516\n武汉市城管委_武汉市城市管理委员会\t325517\n000636\t325518\n四河\t325519\n混沌研习社\t325520\n杨伊\t325521\n别再说\t325522\n镇委书记\t325523\n结构件\t325524\n0.125\t325525\n4000型\t325526\n蔡斌\t325527\nm37\t325528\n站前路\t325529\n长癣\t325530\nCAD\t325531\nnavigationBar\t325532\n土地公\t325533\n邯郸市人民政府\t325534\n幽邃\t325535\nJefferson\t325536\n湿度仪\t325537\n景德镇陶瓷网\t325538\n大学物理实验\t325539\n冰焰\t325540\n中共四川省委\t325541\nSELECTION\t325542\n/select\t325543\nPSP3000\t325544\n探险活宝\t325545\n三国杀单机版\t325546\nshishenme\t325547\nLTD\t325548\n检索式\t325549\n股份合作制\t325550\n絮\t325551\n14点\t325552\n鼎足\t325553\n雷峰塔\t325554\n恐吓信\t325555\n导电性\t325556\n小米2A\t325557\n芽孢\t325558\nPDI\t325559\n尹志平\t325560\n五爪\t325561\n圣彼得\t325562\n梦享\t325563\n交叉表\t325564\nIllegal\t325565\n县科协\t325566\n20万辆\t325567\nmovado\t325568\n温岭市人民政府\t325569\n毕节市人民政府\t325570\n硬生生\t325571\n十二年\t325572\n富贵鸟\t325573\n中华财经网\t325574\n沈葆桢\t325575\n上海检验检疫局\t325576\n博学\t325577\n焕\t325578\ndyrs\t3
25579\n福伯\t325580\n识图\t325581\n供气\t325582\n起鼓\t325583\n小蒋\t325584\n苔花\t325585\n教育期刊网\t325586\niad\t325587\n外阴唇\t325588\nNSIS\t325589\n十五倍\t325590\n学路网\t325591\n痴恋\t325592\n转位\t325593\n飞悦\t325594\nilo\t325595\n桃花谷\t325596\n帮手\t325597\n西乡步行街\t325598\n中国平安财产保险\t325599\n酸蚀\t325600\n冯淬帆\t325601\n碧桂园公园\t325602\nkaifeng\t325603\nwin7双\t325604\nPresets\t325605\n不应期\t325606\n替尼\t325607\n鲜重\t325608\n发难\t325609\nMeltdown\t325610\n现代控制理论\t325611\n国家卫生计生\t325612\n鲁讯\t325613\n雷诺科雷嘉\t325614\n附解套\t325615\n方帅\t325616\n2016年4月12日\t325617\n九洲新世界\t325618\n錢\t325619\n唯品\t325620\n警星\t325621\ndaxiao\t325622\n一接\t325623\nAedas\t325624\n未知数\t325625\n球商\t325626\n机动车交通事故责任纠纷案\t325627\n精准化\t325628\n宠夫\t325629\n启信宝\t325630\n贾宝玉\t325631\nmorty\t325632\n厦华电子\t325633\ndunge\t325634\n下子\t325635\n假扮\t325636\nPeel\t325637\n公务员考试网\t325638\n帕灯\t325639\nwebpack-dev-server\t325640\nSolving\t325641\n中央文明办\t325642\n更快速\t325643\n把守\t325644\n泡椒凤爪\t325645\n锦绣山庄\t325646\n2016年4月17日\t325647\n昆明公交查询网\t325648\n防锈\t325649\n中国舞等级考试\t325650\n吉利丁片\t325651\n竣工图\t325652\n某天\t325653\n黄毅清\t325654\n增强\t325655\n农业产业化经营\t325656\n涩谷区\t325657\nxuyatao\t325658\n春秋航空\t325659\n假面骑士Amazons\t325660\n斯人\t325661\n香色\t325662\n诗\t325663\n爱情心理学\t325664\n湿布\t325665\n松油门\t325666\n念斌\t325667\nsuite\t325668\n陛下\t325669\n集中贴\t325670\n机尾\t325671\n20150312\t325672\n轮台\t325673\n240米\t325674\n料号\t325675\nFavorite\t325676\n优酷春集\t325677\n球星网\t325678\nOCT\t325679\n疑点\t325680\n夕阳箫鼓\t325681\n安筱鹏\t325682\n中国邮政快递\t325683\nRow\t325684\n北碚\t325685\nnoble\t325686\nthd\t325687\n刘轩\t325688\n163邮箱批量\t325689\n新华街\t325690\n食品药品监督管理局\t325691\n云居寺\t325692\n乱女\t325693\n听辨\t325694\n第139个\t325695\nfloat、double\t325696\n南昌大学共青学院\t325697\n万步网\t325698\n涸\t325699\n劳氏\t325700\n花\t325701\n椒江\t325702\n编辑工资\t325703\n紫微斗数网\t325704\n2018-02-08\t325705\nDarkMoon\t325706\n电闪雷鸣\t325707\n淘房\t325708\n新乐吧\t325709\n焚神\t325710\n中国科学院化学研究所\t325711\n大气压强\t325712\n三相变压器\t325713\n金海雪山\t325714\nsupper\t325715\n东三旗\t325716\n他用\t325717\n金玉兰\t325718\n最佳损友\t325719\nFlightGear\t325720\n安卓论坛\
t325721\n拷问\t325722\n登喜路\t325723\n铁塔\t325724\n王德胜\t325725\n宿主\t325726\n公关稿\t325727\nKAO\t325728\njscript\t325729\n行具\t325730\n1999年\t325731\nふ\t325732\nwifi名\t325733\n云儿\t325734\nHorizons\t325735\n部将\t325736\n逃课\t325737\n安远县人民政府\t325738\n绝地求生封禁误封申诉方法\t325739\n芬尼克兹\t325740\n王海云\t325741\n生旦净末丑\t325742\n赛尔号_一游网\t325743\n览海\t325744\nshubao\t325745\nnamehwh\t325746\n石光\t325747\n金熊奖\t325748\n舞蹈生\t325749\n败坏\t325750\n太平路\t325751\n战甲\t325752\n辐射新维加斯吧\t325753\n打支\t325754\n文斗\t325755\n独胆\t325756\n加灯\t325757\n不拔\t325758\nElie\t325759\n乐观者\t325760\n劳务合同\t325761\n燃情岁月\t325762\n赝\t325763\n马秋华\t325764\n温州电视台\t325765\n水着\t325766\n素子\t325767\n老态\t325768\ndownloads\t325769\n周晓枫\t325770\n大搜车\t325771\n火云邪神\t325772\n韩阳\t325773\n专业类\t325774\n林拜\t325775\n成人自考\t325776\n市物价局\t325777\n十二指肠\t325778\n网络费\t325779\n若春\t325780\n28.8万\t325781\n东奥会计\t325782\nJSSDK\t325783\n原价\t325784\n发电机组\t325785\nfile\t325786\n吴氏\t325787\n奴隶战\t325788\n《龙王\t325789\nYAESU\t325790\n丹师\t325791\n绍兴房产在线\t325792\nNginx负载\t325793\n11句\t325794\n武汉市江夏区人民政府\t325795\nsqlserver2005\t325796\n平顶山\t325797\nVolleyball\t325798\n5岁\t325799\n京牌小客车\t325800\n王品台塑牛排\t325801\n5d2\t325802\n水藤\t325803\n小祯\t325804\n3千元\t325805\nnurse\t325806\n文明3\t325807\nダウンロ\t325808\ngs4\t325809\nroto\t325810\n一千年\t325811\nconversation\t325812\n超智能足球2\t325813\n钒液流电池\t325814\njackets\t325815\n烦不烦\t325816\n皇儿\t325817\n生死格斗5\t325818\n华宏\t325819\n岩土工程论坛\t325820\n中国建筑第五工程局有限公司\t325821\n货到付款\t325822\n云桥\t325823\n高锰酸\t325824\n接茬\t325825\n主动安全系统\t325826\n婶婶\t325827\n1399元\t325828\n王敏奕\t325829\n700c\t325830\n老油\t325831\n潮起潮落\t325832\nicepak\t325833\n消防水泵接合器\t325834\n运费险\t325835\n恶魔术\t325836\n金兰企划\t325837\n储物\t325838\n燃机\t325839\n平顶山市公共资源交易中心\t325840\ntactical\t325841\naddall\t325842\n美沙拉嗪肠溶片\t325843\n牟尼沟\t325844\n蕨麻\t325845\nconfigmap\t325846\n街道社区\t325847\nCharming\t325848\n长城防火墙\t325849\n执照\t325850\n2016年9月9日\t325851\n欺客\t325852\n托尼\t325853\n日本九州大学\t325854\n狐言\t325855\n版播放器\t325856\n迪拜国际机场\t325857\n工行信用卡\t325858\n至尊无上之永霸天下\t325859\n_易车号\t325860\n粮库\t325861\n荣安香
园\t325862\n第32页\t325863\n纸盒装\t325864\nLinkage\t325865\nOpenMV\t325866\n淄博市教育局\t325867\n大猩猩\t325868\n机器\t325869\n行动力\t325870\n伤科\t325871\n蓝框\t325872\n超级名模\t325873\n特式\t325874\n18.1.1\t325875\n铁氟龙管\t325876\n非法持有枪支罪\t325877\n3000点\t325878\nRicci\t325879\ninspinia\t325880\n观察期\t325881\n憎恶\t325882\npoi-ooxml\t325883\n星球大战外传\t325884\n柳江县\t325885\nmijia\t325886\n上屏\t325887\n自整\t325888\ncomment\t325889\n雪铁龙天逸C5\t325890\n神龙摆尾\t325891\n恍\t325892\nMB\t325893\n天子湖\t325894\n吞日者\t325895\n苏州第一人民医院\t325896\n面霸\t325897\n滨河湾\t325898\n第九层\t325899\n受制于人\t325900\n集部\t325901\nNDIS\t325902\n建新矿业\t325903\n广外附小\t325904\n鼻烟\t325905\n八队\t325906\n务必\t325907\nwbf\t325908\n小学生作文_无忧考网\t325909\nComponent\t325910\n动地\t325911\n催化裂化\t325912\nHKTV\t325913\n章贡区政府\t325914\n棱镜门\t325915\n郑京浩\t325916\n黄坑\t325917\n构筑\t325918\n抖阴\t325919\ngreeting\t325920\n盛世华城\t325921\n广西医科大学第一附属医院\t325922\n中纬\t325923\n冷傲\t325924\n晶闸\t325925\nlxy\t325926\n霍布斯\t325927\n绿营\t325928\n东市\t325929\n再一次\t325930\n7mall\t325931\n劝降\t325932\n边部\t325933\n尊爵\t325934\n拳四郎\t325935\n真是\t325936\npuercn\t325937\n亚游\t325938\n空気系\t325939\n聊城市政府\t325940\n化学方程式\t325941\n弹指一挥间\t325942\n阿新\t325943\n销售型\t325944\n傲腾\t325945\nSSN\t325946\n磷酸二氢铝\t325947\n鱼香\t325948\n平安无事\t325949\n华南农业大学\t325950\n亥姆霍兹\t325951\n自生\t325952\n好未来集团\t325953\n躁郁症\t325954\n倾城\t325955\n梁欣\t325956\n2014.5\t325957\nh170\t325958\nM18\t325959\n凯迪拉克XT4\t325960\nv5.2.8\t325961\n化学价\t325962\n直报\t325963\ninpu\t325964\n消控\t325965\n红胡桃\t325966\nSHANGHAI\t325967\n尚酷\t325968\n遮月追星\t325969\n1025nw\t325970\n海苔\t325971\n月野里沙\t325972\nextra\t325973\n光州\t325974\n冒牌大英雄\t325975\n亲女人\t325976\n灵长类\t325977\n朱迪斯\t325978\n上海东\t325979\n9级\t325980\n发票机\t325981\n熊佳佳\t325982\n圣朝\t325983\n怪字\t325984\n优课晒\t325985\n六龙\t325986\n违章查询网\t325987\n悬吊\t325988\n中小企业局\t325989\n聚四氟乙烯管\t325990\n觉者\t325991\n其他方\t325992\nLILITH|リリス\t325993\n20161105\t325994\n青海省玉树州人民政府\t325995\n大众健美操\t325996\nSugon\t325997\n13.14\t325998\nPaul\t325999\n放大镜\t326000\nbatt\t326001\n破瓜\t326002\n1457\t326003\n李原\t326004\n2119\t326005\n218.22
\t326006\n大奥华之乱\t326007\n70寸\t326008\n96条\t326009\n龙南县\t326010\n火纹无双\t326011\n椰林\t326012\n肝硬化腹水\t326013\n周传海\t326014\n混交\t326015\n凯盛\t326016\ndarwin\t326017\n心血管_99健康网\t326018\n云南农业大学\t326019\n量力\t326020\n日珥\t326021\n放样\t326022\n结盟\t326023\n家燕\t326024\n酸\t326025\nvai\t326026\n金宝贝\t326027\n一览_安秀网\t326028\n当初一\t326029\nXDF\t326030\n铝水\t326031\n锡林\t326032\n2018张\t326033\nmicrorna\t326034\n华普天健会计师事务所\t326035\n安仲炜\t326036\n取餐\t326037\n河口区\t326038\n西岳奇童\t326039\n中北大学\t326040\n住宅面积\t326041\n金沙江\t326042\n牡丹奇迹\t326043\n2017年6月5日\t326044\n明锐论坛_汽车之家论坛\t326045\nHRD\t326046\n乐视云\t326047\n包围曝光\t326048\n死或生沙滩排球\t326049\nPDS\t326050\n德国\t326051\n五毒俱全\t326052\n上周五\t326053\n爱乃なみ\t326054\n官厅湖\t326055\nserver-u\t326056\n狱血魔神\t326057\n草子\t326058\n低密度聚乙烯\t326059\n翠柳\t326060\n中央第三巡视组\t326061\n齐齐\t326062\njoyce\t326063\n国际车\t326064\n国民大生活\t326065\n儒术\t326066\n拒\t326067\n令牌\t326068\nJACS\t326069\n博福斯\t326070\nouo\t326071\n十字纹\t326072\n妃悠爱\t326073\n校院\t326074\n中招网\t326075\nhuobi\t326076\n16.98万\t326077\n主办单位\t326078\n结核菌素试验\t326079\nGRAND\t326080\nG530\t326081\n钱江\t326082\n军工企业\t326083\nNIST\t326084\n汤镇宗\t326085\n红楼梦学刊\t326086\nWindows2008\t326087\n古力特\t326088\n花僮\t326089\nBrussels\t326090\n尹桂芳\t326091\n太极石\t326092\n中证协\t326093\nANT\t326094\n莉莉艾\t326095\n粗心大意\t326096\n振业\t326097\n银蛇\t326098\n沙粒\t326099\ninfos\t326100\n五本\t326101\n未济\t326102\n34063\t326103\nUploadify\t326104\n物流师考试\t326105\nxboxonex\t326106\n赤湾\t326107\n程野丫蛋\t326108\n鱼鳔\t326109\n邪帝\t326110\nshiren\t326111\n于非\t326112\n驱力\t326113\nGrad\t326114\n成都学校\t326115\n万博宣伟\t326116\n恩施恩施市\t326117\n恰自来\t326118\nSqlServer\t326119\n江湖菜\t326120\n地域性\t326121\n歌词\t326122\n鬲\t326123\nregards\t326124\nSTL\t326125\n麦点\t326126\nardunio\t326127\nfangshi\t326128\n气相法\t326129\n十院\t326130\nCSGO\t326131\n高翻学院\t326132\nPakho\t326133\noutlook2013邮箱\t326134\n中国酒志网\t326135\npart1.rar\t326136\n标识卡\t326137\n小小黑\t326138\n大疆御Mavic\t326139\n北京市海淀医院\t326140\n4.8\t326141\n提前还贷\t326142\n温莎\t326143\n京粮\t326144\n柳北\t326145\n西山水库\t326146\n小陆\t326147\n成都市城乡建设委员会\t3261
48\nirb\t326149\n小古\t326150\njjzz\t326151\n13万元\t326152\n北银消费金融公司\t326153\n加拿大移民局\t326154\n撤掉\t326155\n隔音墙\t326156\n台北松山机场\t326157\n四绝\t326158\n即日\t326159\n1500首\t326160\nzuzu\t326161\n九区\t326162\n首恶\t326163\n北外高\t326164\n周子\t326165\n五晚\t326166\n干花\t326167\n混沌之戒\t326168\n死亡\t326169\n血盆\t326170\n13式\t326171\n东方吧\t326172\n宫崎县\t326173\n经济学派\t326174\n魅蓝E\t326175\n10间\t326176\nPodfile\t326177\n白小白\t326178\n28本\t326179\n21\t326180\n天旋地转\t326181\n阿糖糖\t326182\n【纳\t326183\nCNM\t326184\n孟玲\t326185\n聂鲁达\t326186\n老鬼\t326187\n边缘性人格障碍\t326188\nimshow\t326189\nvapp\t326190\n涤荡\t326191\n五行师\t326192\n威士忌\t326193\n黄山北站\t326194\n养家\t326195\n微风细雨\t326196\n洛杉矶湖人\t326197\n花自飘零水自流\t326198\n算作\t326199\nCIDR\t326200\n中国地震台网\t326201\n搞笑__hao123上网导航\t326202\n包装带\t326203\n2017年7月21日\t326204\neasy\t326205\nrfa\t326206\n山王\t326207\n杨丞琳\t326208\n突出\t326209\nbaldrsky\t326210\n干性\t326211\n黑女\t326212\n王瑶\t326213\n臀瓣\t326214\n杭州四季青\t326215\ncultural\t326216\n华普\t326217\nAR#\t326218\n轴对称\t326219\nThough\t326220\n02月\t326221\n舒爽\t326222\n女书\t326223\n一元\t326224\n新思维\t326225\n测频\t326226\n袭警案\t326227\n5起\t326228\n倾天\t326229\n弗莱德\t326230\n原始传奇\t326231\n抽象劳动\t326232\n早教幼儿园小学\t326233\n就业季\t326234\npq分区\t326235\n乐农网\t326236\n傅园慧\t326237\n素食馆\t326238\nsneak\t326239\n王杰\t326240\n重庆大学外国语学院\t326241\n隆鑫通用\t326242\n1393\t326243\n900亿元\t326244\n免不了\t326245\n四栋\t326246\n中色股份\t326247\n1253\t326248\n途虎养车网\t326249\nhire\t326250\n规民约\t326251\n京佳教育\t326252\nv3.0.3\t326253\n新金宝\t326254\n尚艺\t326255\n12339\t326256\n效用函数\t326257\n1GW\t326258\n4399小游戏大全\t326259\n苏眉\t326260\n中画网\t326261\n掌游宝\t326262\nseparately\t326263\nF-14\t326264\nt34\t326265\n水星领航员\t326266\n硕果累累\t326267\n陈赫费德勒\t326268\nKriging\t326269\n轻薄\t326270\nrmf\t326271\n巡护\t326272\nallowed\t326273\n膏\t326274\n2044\t326275\n药品经营质量管理规范\t326276\n0300\t326277\n12班\t326278\n87G手游网\t326279\n咸味\t326280\n沙婉\t326281\n老季\t326282\nRejection\t326283\nChloe\t326284\n进销存软件\t326285\nstrike\t326286\n云中国\t326287\nChargers\t326288\n问题集\t326289\n中铁二十三局集团有限公司\t326290\n华仁\t326291\n浮来山\t3
26292\n千里共良宵\t326293\n桦川县\t326294\n烦躁\t326295\n开锁\t326296\n5.18\t326297\n雷恩\t326298\n科密考勤机\t326299\ncloses\t326300\n氯酸\t326301\n合水镇\t326302\n马云湖畔大学\t326303\n欺世盗名\t326304\n海选赛\t326305\n1237\t326306\n催命符\t326307\nsketchup2017\t326308\n器皿\t326309\n起码\t326310\n腕带\t326311\n第一第二季\t326312\n柯南秀\t326313\nテル\t326314\n行吟\t326315\nlixor\t326316\n硅太阳能电池\t326317\nnetns\t326318\n五千块\t326319\n基尔霍夫\t326320\n阿彪\t326321\n藤本\t326322\nscansnap\t326323\nx7s\t326324\nEver\t326325\n三到五分钟\t326326\n上场\t326327\n邯郸市政府\t326328\n拱墅区\t326329\n梁勇\t326330\n差之毫厘\t326331\nLIMIT\t326332\n女剑\t326333\n温柔的诱惑\t326334\n张晓兰\t326335\nAgatha\t326336\nGoogle\t326337\n东湖区\t326338\n好货网\t326339\na&#160\t326340\n曾言\t326341\nx55\t326342\n4.2.6\t326343\n公付款\t326344\n上海健康医学院\t326345\nFINAL\t326346\n音轨\t326347\n束河\t326348\nforging\t326349\nwxs\t326350\n3天前\t326351\n283号\t326352\n丹尊\t326353\n火炬路\t326354\nrelatively\t326355\n丙种球蛋白\t326356\n|批量\t326357\n恋物癖\t326358\n22p\t326359\n平遥县\t326360\nU21\t326361\n佳乐国际城\t326362\nEW\t326363\n疾步\t326364\n基尼\t326365\n秦浩\t326366\ndrl\t326367\n张凤林\t326368\n氟化氢\t326369\n极乐宝鉴\t326370\n深陷\t326371\n修道院\t326372\n镇痛\t326373\ns3c2410\t326374\n亚行贷款\t326375\n汽柴油\t326376\n玄风鹦鹉\t326377\n血塞通注射液\t326378\nMM2R\t326379\n巨星科技\t326380\n西涌\t326381\nMPS\t326382\n郑新\t326383\n镇江万达广场\t326384\n洋浦经济开发区\t326385\n骗取\t326386\n57位\t326387\n叶京\t326388\n关键句\t326389\n澄江\t326390\n疏疏\t326391\n许文\t326392\n林J\t326393\nsevenfriday\t326394\n压底机\t326395\n410mm\t326396\n蝶结\t326397\nstm32f0\t326398\nJ8\t326399\n漫步云端\t326400\n10.28\t326401\n杨桐\t326402\n星表\t326403\n漏点\t326404\nmono\t326405\n勒克斯\t326406\nquanji\t326407\n两相流\t326408\n接档\t326409\ncontainment\t326410\n查改\t326411\n游玩\t326412\n1180\t326413\nRegex\t326414\nweb测试\t326415\n江西丰城发电厂\t326416\ne520\t326417\n1#\t326418\n5w\t326419\n国家环境保护总局\t326420\n第16卷\t326421\n软件库\t326422\n维C银翘片\t326423\n600nm\t326424\ncctv5\t326425\nAgent\t326426\n诺特\t326427\n森林报\t326428\n浦上大道\t326429\n极品飞车16\t326430\n牛山\t326431\ntn=95866330_hao\t326432\nstrcat\t326433\n哀歌\t326434\n禁不禁\t326435\nUNDP\t32
6436\n第三十三次\t326437\nHKG\t326438\nwuchu\t326439\nDynamite\t326440\n怀柔镇\t326441\n2个星期\t326442\n新会碧桂园\t326443\n神包\t326444\n老刘\t326445\n1-12周\t326446\n三江源国家公园\t326447\n叶子猪桃花源记\t326448\n经受住\t326449\n连接数\t326450\n悼念\t326451\n济南便民网\t326452\n明渠\t326453\nNonstop\t326454\n液晶屏幕\t326455\n天津南站\t326456\n第一季\t326457\nM2\t326458\nspots\t326459\n俪人\t326460\n梅尔塔\t326461\n传播力\t326462\n獐岛\t326463\n60立方\t326464\n合肥综合性国家科学中心\t326465\n药号\t326466\n俾斯麦号\t326467\n中食\t326468\n剑尊\t326469\n模特界\t326470\n23.6寸\t326471\n勤俭\t326472\n策\t326473\n2016-06\t326474\n2两\t326475\n顾恺之\t326476\n被控\t326477\n220平\t326478\n10.\t326479\n药布\t326480\n11_\t326481\n小桀\t326482\n喜用\t326483\n三八节\t326484\n演员表\t326485\n三生三世枕上书\t326486\n一缕风\t326487\n沁园\t326488\ntshark\t326489\n深水\t326490\n8.12\t326491\n笔套\t326492\n福特翼博\t326493\n江西代表团\t326494\n安耐\t326495\n眼纹\t326496\n年节\t326497\nBlackmagic\t326498\n晨僵\t326499\n食用植物油\t326500\n完美生活\t326501\n鸿新网\t326502\n剧荒\t326503\n性本\t326504\n传文\t326505\n创业板指数\t326506\n智航\t326507\n2速\t326508\n黄光纳兰性德\t326509\n雪夜\t326510\n银期\t326511\n丁家桥\t326512\n上传者\t326513\n暗影之王2\t326514\n木然\t326515\n结业证书\t326516\nphasset\t326517\n裤腿\t326518\n神社\t326519\n增配\t326520\n超嗲\t326521\nzeal\t326522\n16.1.0.843\t326523\n90项\t326524\n秉着\t326525\n反射式\t326526\n兵魂\t326527\n山楂\t326528\nisometric\t326529\nq值\t326530\n海克\t326531\n科龙\t326532\n复原乳\t326533\n苏州市司法局\t326534\n霍尔电流传感器\t326535\nBCX\t326536\n_途睿欧车友会_XCAR爱卡汽车俱乐部\t326537\n瑞得\t326538\n_CCTV-5体育赛事\t326539\nLPCTSTR\t326540\n0121\t326541\n接得住\t326542\n2017-18\t326543\nix6580\t326544\n置\t326545\n落地价\t326546\n桂满陇\t326547\nIMT\t326548\ndimens\t326549\n澶渊之盟\t326550\n先兆性流产\t326551\n信息系统项目管理师教程\t326552\nwearing\t326553\n糜\t326554\n六六社\t326555\n2011—2015年\t326556\ngross\t326557\nzbook\t326558\n王钢\t326559\n阿甘正\t326560\n塘尾\t326561\n共商国\t326562\n汤旺河\t326563\n圣城\t326564\n蔓越莓\t326565\n导航器\t326566\n陆亦可\t326567\nBroadcasting\t326568\nMAXSUN\t326569\n联东u谷\t326570\n锰\t326571\n犹他大学\t326572\n埃尔法\t326573\n沃尔沃S40\t326574\nyami\t326575\n滴水之恩\t326576\nALA\t326577\n专利\t326578\n唐山海\t326579\n解控
\t326580\nfirewall-cmd\t326581\n史\t326582\nORA-12154\t326583\n铺客\t326584\n大觉山\t326585\n中学教育-中学教育\t326586\n纪委\t326587\n香葱\t326588\n多向\t326589\ngrldr\t326590\nibaby\t326591\n42平米\t326592\n无约\t326593\n清蒸多宝鱼\t326594\n4a\t326595\n脑震荡\t326596\n230元\t326597\nat_the\t326598\n龙珠激斗\t326599\n大咖秀\t326600\nLumber\t326601\nsnapseed\t326602\n舞佳舞\t326603\n9月11日\t326604\n安多福\t326605\n千亿级\t326606\n卡池\t326607\n清宁\t326608\n木刺\t326609\n谢伏瞻\t326610\n锦都\t326611\nCCTV-9\t326612\n星锐\t326613\n阿达木单抗\t326614\n丢了\t326615\nshire\t326616\n脑控\t326617\nJinhua\t326618\nok6410\t326619\n好食\t326620\n五路\t326621\nCaptain\t326622\n世少赛\t326623\n家塾\t326624\n妈咪宝贝\t326625\n南昌动物园\t326626\n港丽餐厅\t326627\n0.8m\t326628\n招生信息网\t326629\n抢先一步\t326630\n痛惜\t326631\n尊宝\t326632\nSerenity\t326633\nbeused\t326634\n冻人\t326635\n氟素\t326636\n导记\t326637\n银河之星\t326638\nWiFi模块\t326639\n五二\t326640\n信达证券股份有限公司\t326641\n盛安\t326642\n欧尼\t326643\n淘宝开店\t326644\n亚联\t326645\n姹紫嫣红\t326646\n丰兴广场\t326647\n2280\t326648\n點\t326649\nsatisfying\t326650\nfocus\t326651\n多酚类\t326652\nosa\t326653\nscopus\t326654\n送出去\t326655\n海飞\t326656\n城市型\t326657\n1635\t326658\n美观\t326659\n956\t326660\n0.1.4\t326661\n优购\t326662\nhightopo\t326663\n北京市监狱管理局\t326664\n中国矿业大学徐海学院\t326665\nlagoon\t326666\npro2016\t326667\n吴云飞\t326668\nbooster\t326669\n小光\t326670\n橘右京\t326671\n综艺节目\t326672\n译制\t326673\n玩蛋\t326674\n牛头人\t326675\nイベント\t326676\n王红旗\t326677\n包装制品\t326678\n蛭\t326679\n爱豆\t326680\n南宁市规划管理局\t326681\n磁盘工具\t326682\n麦嘉\t326683\n单单\t326684\nCF点\t326685\n十四步\t326686\n八极\t326687\n管理办法\t326688\neto\t326689\n狗语者\t326690\n奔驰GLE\t326691\n魔星\t326692\n魏祥\t326693\n红文\t326694\n赫海\t326695\nliger\t326696\nPING值\t326697\nmajor\t326698\n游泳镜\t326699\n黄檀\t326700\n藤真健司\t326701\n开见\t326702\n五副\t326703\n红臀\t326704\n自拔\t326705\n入口点\t326706\nTA\t326707\n新月格格\t326708\ndetails\t326709\n商号\t326710\n不堪重负\t326711\n匠圣\t326712\n微量泵\t326713\n杀手螺\t326714\n邵宇\t326715\ngoupu\t326716\ndin\t326717\n青岛大学医学院\t326718\n5086\t326719\n4032\t326720\n300dpi\t326721\n3-6\t326722\n点绛\t326723\n长安快马\t326724\n腾讯
云盘\t326725\nGUI编程\t326726\n公允价\t326727\n凡谷\t326728\nArchitectures\t326729\n66种\t326730\n欧石楠\t326731\n唐璜\t326732\naliasing\t326733\n经济学奖\t326734\n惠花花\t326735\n一周七天\t326736\n彩胶\t326737\nAUDIT\t326738\nTFIDF\t326739\n硕方\t326740\n累死累活\t326741\n蛇胆\t326742\nm10h\t326743\n表针\t326744\ncURL\t326745\n爱新觉罗·弘历\t326746\n天涯法律网\t326747\nAirDrop\t326748\n履\t326749\n剑三电\t326750\nRNAi\t326751\n富顺\t326752\nPortia\t326753\n崔老道\t326754\n时过境迁\t326755\n瓦里奥\t326756\n阿奇霉素干混悬剂\t326757\nルディングス\t326758\n脐橙\t326759\n198526\t326760\n背伤\t326761\n省财政厅\t326762\n迷网\t326763\n王昌荣\t326764\n绿城水务\t326765\nρ\t326766\nGuo\t326767\n可行\t326768\n虎头鞋\t326769\n选片\t326770\nDICOM\t326771\n男伴\t326772\n光型\t326773\nAccessible\t326774\nCyber-shot\t326775\n善行\t326776\n延安市\t326777\n【溪苑\t326778\n烧胎\t326779\n黄飞红\t326780\n阿衰\t326781\n000555\t326782\n飞丸网\t326783\nadobe2018\t326784\nプラス\t326785\nKiKi\t326786\n10.8.2\t326787\nMothers\t326788\n赵黎平\t326789\n流果\t326790\n俺\t326791\n锐达\t326792\nRaid5\t326793\n开纸\t326794\n沈阳市于洪区政府\t326795\n麦哲伦\t326796\n受通\t326797\n七龙珠超\t326798\nCREIS\t326799\n全部子\t326800\n赵海龙\t326801\n彭涛\t326802\nge60\t326803\n车载机\t326804\ncvte\t326805\nSevere\t326806\n清徐\t326807\n4万亿\t326808\n华虹宏力\t326809\n松花\t326810\nCMYK\t326811\nHEV\t326812\n金俊秀\t326813\n想\t326814\na800\t326815\n波波球\t326816\n出光\t326817\n15j403\t326818\nRetro\t326819\n伊索\t326820\n电热蚊\t326821\n搜狗输入法老\t326822\n歼霸\t326823\n三盏\t326824\ntextedit\t326825\n第一轮\t326826\n第86期\t326827\n高难度\t326828\nkorn\t326829\nucc\t326830\n题壁\t326831\nパッケ\t326832\ndilution\t326833\n20150731\t326834\nian\t326835\n鹅卵石\t326836\n特需\t326837\n法芙娜\t326838\nArg\t326839\nabspath\t326840\n合肥科技农村商业银行\t326841\nMas\t326842\n南京森林警察学院\t326843\n云台版\t326844\n20万级\t326845\n网络适配器\t326846\n7605\t326847\n大岛优香\t326848\n大牧场\t326849\n卷期\t326850\n博途v13\t326851\n盐城市环保局_盐城市环境保护局\t326852\n滁州北\t326853\n纺织服\t326854\n2.8.7\t326855\n过山车之星\t326856\nthussat\t326857\n世界五百强公司\t326858\nウ\t326859\n过招\t326860\nstamen\t326861\njrs\t326862\n梁慧星\t326863\n纱纱\t326864\n志愿者们\t326865\n竟会\t326866\n两公里\t326867\n宜家装饰\
t326868\n回到\t326869\nlemma\t326870\ndeyi\t326871\n孙记\t326872\n晶核\t326873\n喷雾剂\t326874\n仁和药业\t326875\nagar\t326876\n琉璃瓶\t326877\n楼盘-购房网\t326878\n江南西路\t326879\n美国纽交所\t326880\n啊\t326881\nM128fp\t326882\n导电聚合物\t326883\n全头\t326884\nx2+y2+z2\t326885\n旦旦\t326886\n强网杯\t326887\n古堂\t326888\n光驱盒\t326889\n开山股份\t326890\n三六五网\t326891\n湖南生物机电职业技术学院\t326892\n彩铅笔\t326893\n红榜\t326894\n瑜伽服\t326895\n美团云\t326896\n作家群\t326897\n古句\t326898\n工龄\t326899\n爱乐维叶酸\t326900\n蓝管\t326901\n2017.11.13\t326902\n李旭东\t326903\n慕柔雪\t326904\n孙俊\t326905\n捕蝇\t326906\n刘京\t326907\n白花镇\t326908\n埃\t326909\n胸口处\t326910\n前子\t326911\n102个\t326912\n阶梯图书网\t326913\n在日\t326914\n伯尔尼公约\t326915\nlamp\t326916\n毛新宇\t326917\n亚太广场\t326918\n老梁故事会\t326919\n东京国际\t326920\n坐南朝北\t326921\n朴\t326922\nMyPrice\t326923\n10家\t326924\n在线调查\t326925\n百度云推送\t326926\n论武\t326927\n柳营路\t326928\n菲玛\t326929\n生殖系统\t326930\n20150906\t326931\n耐火浇注料\t326932\n不闻\t326933\n乐士\t326934\n美图园\t326935\n莫先生\t326936\n東京喰種\t326937\n泾川县\t326938\n粉子\t326939\n10项\t326940\nweizhi\t326941\n纽约城\t326942\neml\t326943\n玻色\t326944\ndpos\t326945\n凯登克罗斯\t326946\n施工人\t326947\nToolkit\t326948\n废粉盒\t326949\nHYPERMESH\t326950\n松油\t326951\n宋志平\t326952\n靳梦佳\t326953\n新时\t326954\n复华度假世界\t326955\n荔玉高速\t326956\n中谷\t326957\n小男人\t326958\nxhd\t326959\n5股\t326960\n钱文锜\t326961\n外遇\t326962\n20140601\t326963\n潜水证\t326964\n马语者\t326965\nDraw\t326966\n最流行易\t326967\n饶命\t326968\n8吨\t326969\n中国注册税务师协会\t326970\n淮南矿业集团\t326971\n王适娴\t326972\n尘盒\t326973\n除垢\t326974\n东方郡\t326975\n密山法院\t326976\n交易日\t326977\n8.6分\t326978\n王尼玛\t326979\n翼云\t326980\n20161011\t326981\n阵发性\t326982\n蒙面唱将猜猜猜\t326983\n亚硝酸盐\t326984\n食品添加剂\t326985\n七侠\t326986\n天龙八部吧\t326987\n端木景晨\t326988\n弹起我心爱的土琵琶\t326989\n送友人\t326990\n剑侠情缘手游\t326991\n勤学\t326992\n一嗨租车网\t326993\n马甸\t326994\n长沙酒店\t326995\ngta3\t326996\n知礼\t326997\n沈阳市妇婴医院\t326998\namba\t326999\nDockerfile\t327000\n计数法\t327001\n泷谷源治\t327002\nbtx\t327003\n商河县\t327004\n段数\t327005\n韦莱韬\t327006\n备料\t327007\n杭州城站火车站\t327008\n精水\t327009\n段暄\t327010\n主姓\t327011\n匡威1970s\t327012\n跨年夜\t32701
3\n68平\t327014\nchicks\t327015\n140元\t327016\n仁和镇\t327017\nAI音响\t327018\n不越\t327019\n第四十回\t327020\n齐悦\t327021\n泪梨花\t327022\n总价\t327023\n北京交管局\t327024\n充实\t327025\n徐州市市\t327026\n停保\t327027\n浙江中控技术股份有限公司\t327028\n欧洲人\t327029\n开来\t327030\n星星的孩子\t327031\n一江水\t327032\n不做\t327033\n四轮换位\t327034\nago\t327035\n思朗\t327036\n醒来\t327037\n老枪\t327038\n郭隆邦\t327039\n梦里风林\t327040\nThundeRobot\t327041\n40MM\t327042\n二案\t327043\n鸟羽\t327044\n飞卢\t327045\nconvey\t327046\nhgzz\t327047\n二三期\t327048\n情书\t327049\n第三军医大学大坪医院\t327050\n车图\t327051\n永久代\t327052\n倦怠期\t327053\n尼康D3100\t327054\n中信建投\t327055\n星箭\t327056\nFTS\t327057\n口语陪练网\t327058\n姜斯宪\t327059\n10w\t327060\nxiaodi\t327061\n哨片\t327062\n超越世界\t327063\n紧缩性\t327064\n黑衣人\t327065\n叫唤\t327066\nJXLS\t327067\npixels\t327068\n诉请\t327069\nimx362\t327070\n张卫民\t327071\n病从口入\t327072\n龙虎砵兰街\t327073\n工贸\t327074\n曲刚\t327075\n6999元\t327076\n北京海军总医院\t327077\n斩魔录\t327078\nsedo\t327079\n割胶\t327080\n荷城街道\t327081\n雄猫\t327082\n沙滩上的童话\t327083\n阿航\t327084\n流冰\t327085\n漳州市住房公积金管理中心\t327086\nbusters\t327087\n正天丸\t327088\n罗技\t327089\n10.0.9\t327090\n绿烟\t327091\n溴甲烷\t327092\nMUMU模拟器\t327093\n回收箱\t327094\n冥媒\t327095\n创真\t327096\n雪音\t327097\n潜规则\t327098\n九品恋舞ol\t327099\n2017年8月24日\t327100\n中国互联网络信息中心\t327101\n足球服\t327102\n增肌\t327103\n1.25.1\t327104\nhmv\t327105\n阿科力\t327106\n露宿\t327107\n贴合\t327108\nUPUPW\t327109\n京钓网\t327110\n全能女\t327111\nㄜ\t327112\n不送\t327113\nReception\t327114\n促织\t327115\n美大\t327116\nACTIVE\t327117\n张云帆\t327118\n运输部\t327119\nBlackboard\t327120\nfrx\t327121\n真空期\t327122\nliar\t327123\n栽树\t327124\n辕门斩子\t327125\n刘飞\t327126\n毛毛虫\t327127\n房讯网\t327128\n汉字形近字\t327129\n乌尔禾\t327130\n升号\t327131\n万步\t327132\n232路\t327133\nsessionkey\t327134\njacky\t327135\n终本\t327136\n理邦仪器\t327137\n3am\t327138\n百望股份\t327139\nlaoshi\t327140\n士绅\t327141\nLDM\t327142\n打豆豆\t327143\nwindowsupdate\t327144\n刘郎\t327145\n2017开年\t327146\n查克\t327147\n智商税\t327148\n澳门币\t327149\n河北联合大学\t327150\n颜子\t327151\nppe\t327152\nOsmo\t327153\n中音在线\t327154\n宝健\t327155\n6.83c_\t327156\n八仙宫\t327157\n捡
垃圾\t327158\n5色\t327159\n仙林湖\t327160\n南门\t327161\n瞿溪街道\t327162\n许家印\t327163\nEpub360\t327164\n上海国税局\t327165\n杀神恶魔传奇\t327166\n小兔\t327167\ndetached\t327168\n汉字词\t327169\n玉皇大帝\t327170\n东北虎林园\t327171\n高姗\t327172\ncapabilities\t327173\n红信\t327174\n3挡\t327175\n夏津\t327176\n非智能\t327177\n重庆青年职业技术学院\t327178\n中证指数\t327179\n醉蟹\t327180\n注射液\t327181\n一比多\t327182\n沈大成\t327183\nnsf\t327184\nH2R\t327185\n70g\t327186\n色素沉着\t327187\n提升术\t327188\n第四代\t327189\n浙江农信\t327190\n检\t327191\n移动手机号段\t327192\n20格\t327193\n家天下\t327194\nSS/SSR\t327195\n金联创\t327196\n6770\t327197\n口袋妖怪白2\t327198\n中朝\t327199\n口布\t327200\n最终幻想零式\t327201\n武汉市青山区人民政府\t327202\n实轴\t327203\n王先生\t327204\nSMITH\t327205\n上宝中学\t327206\n中试\t327207\n戴妃\t327208\n盛世乐居\t327209\n王世充\t327210\n耐司\t327211\n乐视1S论坛\t327212\noutlier\t327213\n北京大兴亦庄\t327214\n投屏\t327215\n雁峰区\t327216\n定西路\t327217\n亚基\t327218\n被害者\t327219\n蜜豆\t327220\n玉米排骨汤\t327221\n阔盛\t327222\n海口东站\t327223\n避嫌\t327224\n雅萌美容仪\t327225\npowerdesigner16\t327226\nIntegral\t327227\n摇臂钻\t327228\n重庆风行天下车友\t327229\n中规版\t327230\n必读书\t327231\n波波网\t327232\n2.4%\t327233\ntomy\t327234\n星网锐捷\t327235\n夺夫\t327236\njumpserver\t327237\n水文\t327238\n手残党\t327239\n塞萨洛尼基\t327240\n净界\t327241\nBoon\t327242\n发酵乳\t327243\n天府国际机场\t327244\nhippop\t327245\n梦三\t327246\n管用\t327247\ngrup\t327248\nCCTV-10科教频\t327249\n鑫联盟\t327250\n之策\t327251\n娘娘山\t327252\n十万条\t327253\n1723\t327254\n执手\t327255\nmcmaster\t327256\n梭哈\t327257\nvsa\t327258\n镁铝合金\t327259\n安奈\t327260\nNEOGEO模拟器\t327261\n友锋\t327262\n荣耀6Plus\t327263\n怎麽样\t327264\n导生\t327265\n做人做事\t327266\n李凯\t327267\nDVD补丁\t327268\nIPTV\t327269\n需方\t327270\n摇曳露营\t327271\n云南师大附中\t327272\n猫鱼\t327273\n侯家塘\t327274\n沙比\t327275\nYDWE\t327276\n冯华\t327277\n喜洲\t327278\nngFor\t327279\n按摩师\t327280\n仕女图\t327281\n副部级大学\t327282\n奇怪君\t327283\n3V\t327284\n东省\t327285\ngoodnotes\t327286\n三洋\t327287\n广东省城乡规划设计研究院\t327288\n乐昌市政府\t327289\n康定路\t327290\n点燃式\t327291\neuraka\t327292\n狮子王3\t327293\n呼叫器\t327294\n延播\t327295\nlarva\t327296\nrss源\t327297\n北京国际车展\t327298\n养虫\t327299\n中国移动电子\t327300\n
奥尼\t327301\ng302\t327302\n第十九期\t327303\n花农\t327304\n滗水器\t327305\n新浪江西_新浪网\t327306\n休斯敦\t327307\n索纳塔八论坛_汽车之家论坛\t327308\n单独页\t327309\n0722\t327310\nexample\t327311\n林磊\t327312\n餐边柜\t327313\nrubygems\t327314\nsnowman\t327315\n亚飞\t327316\nMtime\t327317\n常玉\t327318\nsdo\t327319\n安责险\t327320\n47张\t327321\n1693\t327322\n第7层\t327323\n128MB\t327324\n10月15日\t327325\n156卡\t327326\noss\t327327\n强直性脊柱炎\t327328\n习总书记系列讲话\t327329\n电子元器件网\t327330\n吐哈油田\t327331\n乌日娜\t327332\n谷歌卫星地图下载器\t327333\n跑焦\t327334\n北京公司\t327335\n中国医学科学院药物研究所\t327336\n可可托海\t327337\n阮富仲\t327338\n贵格八字分析\t327339\nDIVCSS5\t327340\n气液分离器\t327341\nmarlin\t327342\n裸陪\t327343\nNutch\t327344\n柯罗伊\t327345\n睿策\t327346\n胡怀邦\t327347\nWrist\t327348\n禁电\t327349\n陈江镇\t327350\n卢沟\t327351\nⅢ\t327352\n川江\t327353\n西王集团\t327354\n蒸鸡蛋糕\t327355\ngooood\t327356\n取商\t327357\n世博会博物馆\t327358\n大全套\t327359\n银亿股份\t327360\nBASF\t327361\n轻喜剧\t327362\ninditex\t327363\n亚空间\t327364\n农业百科\t327365\n拾萃\t327366\n20161005\t327367\n毗\t327368\n29篇\t327369\nsbc\t327370\n高米迪\t327371\n西丽车管所\t327372\n百度贴\t327373\nDynEd\t327374\n人贷\t327375\n缄默\t327376\nibmmq\t327377\n五部\t327378\n山羊皮\t327379\n想着你的好\t327380\n肝肾阴虚\t327381\n白龙桥\t327382\n耽生唯美\t327383\n李元\t327384\n习大大\t327385\n高科\t327386\n常压\t327387\n自卫\t327388\n非参数估计\t327389\n商务本\t327390\n6.16\t327391\n补心\t327392\nf396\t327393\n夏威夷群岛\t327394\n磨辊\t327395\n赵一王健林\t327396\n上海教委\t327397\n到底层\t327398\n严细\t327399\nwin8平板\t327400\n手圈\t327401\n锚桩\t327402\n香樟树\t327403\n↖\t327404\n四千\t327405\n陈旭光\t327406\n青岛理工大学\t327407\n【精\t327408\n北头\t327409\n电阻炉\t327410\n彩鳞\t327411\n零:濡鸦之巫女\t327412\n赞皇吧\t327413\nshenyixin\t327414\nmthoutai\t327415\nfragments\t327416\n同煤吧\t327417\n调压箱\t327418\nNautilus\t327419\n吊模\t327420\n八面玲珑\t327421\n张琳芃\t327422\n游久\t327423\n弁\t327424\n了否\t327425\nHD7770\t327426\n回稳\t327427\n军垦\t327428\n凸度\t327429\n3辑\t327430\n曾波\t327431\n锦云\t327432\nimpala\t327433\n徐锡麟\t327434\n云备份\t327435\n二维码扫一扫\t327436\n上律指南针\t327437\n华北科技学院\t327438\n会计核算方法\t327439\nHALFCD\t327440\n次中音号\t327441\n火剑\t327442\n起司猫\t327443\n中广核\t32744
4\n理正\t327445\n部手\t327446\n帝瓦雷\t327447\n109凌波\t327448\n好丑\t327449\n心事重重\t327450\n满广志\t327451\n浦锦街道\t327452\n肉桂\t327453\n大汉奸\t327454\n爱奇艺视频播放器\t327455\n推荐生\t327456\n家富富侨\t327457\n会无期\t327458\n画片\t327459\n彩宝石\t327460\n省气象局\t327461\n金陵杯\t327462\n木村翔\t327463\n门头沟区\t327464\nLoving\t327465\n中国共产党济南市委员会\t327466\n京津塘高速\t327467\n280X\t327468\n编号\t327469\n600297\t327470\n阳西\t327471\n虾藻\t327472\n瞬感\t327473\n炒鸡蛋\t327474\n综管\t327475\nmaidumptool\t327476\n我校\t327477\n氧化剂\t327478\n轿厢\t327479\n国家安全教育日\t327480\n李艳梅\t327481\nmmb\t327482\n中科建设开发总公司\t327483\n阿巴多\t327484\n球姐\t327485\nps吧\t327486\n茶画\t327487\n倒追\t327488\n2057\t327489\n青石\t327490\n反锁\t327491\n机锋网\t327492\n令行\t327493\n安达驾校\t327494\nrpgmakermv\t327495\n0.38%\t327496\n杭州下沙\t327497\n包公\t327498\n浪莎股份\t327499\nWDK\t327500\n大铭\t327501\nwrv\t327502\n瑞光\t327503\n几行\t327504\nrundll\t327505\naad\t327506\n核赔\t327507\nrangers\t327508\n甲基硅油\t327509\n东沟村\t327510\n雅斯兰黛\t327511\n可乐机\t327512\n华语网\t327513\nipone6s\t327514\n虹光扫描仪\t327515\n调查者\t327516\nRelated\t327517\n杨树海\t327518\n清蒸\t327519\nTOP级\t327520\n144fps\t327521\n97拳皇\t327522\n合肥市高新区\t327523\n董村\t327524\n患者们\t327525\nhouston\t327526\n民智一郎\t327527\n2整除\t327528\nと\t327529\nUncle\t327530\nxialulee\t327531\n国际泳联\t327532\n居心\t327533\nPAR\t327534\nCQVIP\t327535\n蛋类\t327536\n35元\t327537\n宁波国家高新区\t327538\n瓦兰奈尔\t327539\n穿墙宝\t327540\nGoodSync\t327541\n深圳市经济贸易和信息化委员会\t327542\n二十五万\t327543\n手摇晾衣架\t327544\n雷安\t327545\ncallback函数\t327546\n交回\t327547\n￥\t327548\n0.1mg\t327549\n国界线\t327550\n韩夫人\t327551\n30A\t327552\n科讯优网\t327553\n十八盘\t327554\n李然\t327555\n深圳市投资推广署\t327556\n加尔鲁什·地狱咆哮\t327557\n洁咪\t327558\n87237676\t327559\n跳投\t327560\n史丹利快报\t327561\n衰退\t327562\n下料\t327563\n重划\t327564\n燕都\t327565\n南京大学历史学院\t327566\nUV\t327567\nchained\t327568\n柳梦若\t327569\nluminous\t327570\n川北\t327571\n兴奋\t327572\nPVA\t327573\n手续版\t327574\n百世快递快递\t327575\n科普帖\t327576\n2018cc\t327577\nvhosts\t327578\n夫人们\t327579\n晋职\t327580\n解剖室\t327581\nuv镜\t327582\nAAAA\t327583\n血仇\t327584\n一个多页\t327585\nhengthb\t327586\n奖助\t327587\n绿
度母心咒\t327588\n邕江\t327589\njugg\t327590\n歪果\t327591\n旗鼓\t327592\n御心香帅\t327593\n喷丸机\t327594\nhera\t327595\n中信南航\t327596\n值得一读\t327597\nPLCSIM\t327598\n北部湾城市群\t327599\nShimano\t327600\n洱源县\t327601\n流星火雨\t327602\n中韩\t327603\n绿化\t327604\n放盘\t327605\n小安\t327606\n尉迟恭\t327607\nNLPIR\t327608\nhoney\t327609\n咄咄逼人\t327610\nluma\t327611\n苏烈\t327612\n平性\t327613\n抗战片\t327614\n避震器\t327615\n华容道\t327616\nLeNet-5\t327617\n七波辉\t327618\n肖强\t327619\n3117\t327620\nidel\t327621\n世博园\t327622\n冰狼\t327623\n安桥\t327624\n氢化物\t327625\n吾生\t327626\ndla\t327627\n华发路\t327628\n接分\t327629\n池州学院\t327630\n水声\t327631\n石家庄矿区\t327632\n云钻\t327633\ndeco\t327634\n瑞虎5X\t327635\n23部\t327636\n乌龙山\t327637\n广州市经济技术开发区\t327638\n光流\t327639\n布鲁斯特\t327640\n好男\t327641\n三重积分\t327642\n水葱\t327643\n紫轩小说吧\t327644\n凯德集团\t327645\n疯狂的麦克斯\t327646\njava语言程序设计\t327647\n中期末\t327648\nimplication\t327649\n使用者\t327650\nLollipop\t327651\n汶川县\t327652\nn维向量\t327653\n木蚂蚁\t327654\n菏泽大众网\t327655\n项伯\t327656\nFe2+\t327657\n京瓷5050\t327658\nPhu\t327659\n仪器信息网\t327660\n造就\t327661\norigin9.0\t327662\n@component\t327663\nvuze\t327664\n空气弹簧\t327665\n王家坝\t327666\n阿若\t327667\n选举制度\t327668\n32级\t327669\n黏糊糊\t327670\n牛津译林\t327671\nValentine\t327672\n哥白尼\t327673\n23.8英寸\t327674\n剪叉\t327675\n导弹车\t327676\npuk\t327677\ntemplet\t327678\n村民\t327679\n保山\t327680\n天合光能\t327681\nRanking\t327682\nWebHook\t327683\n出港\t327684\n问病\t327685\n黄鹏\t327686\nGREP\t327687\n计酬\t327688\n奇迹mu\t327689\naman\t327690\nCombat\t327691\ntourist\t327692\nSchenker\t327693\n一亩地\t327694\n抗衰\t327695\n面胶\t327696\n51cto学院\t327697\n常书欣\t327698\n策反\t327699\n铁索\t327700\n黑贞\t327701\n幻界王\t327702\n手机病毒\t327703\n充气\t327704\n福田区\t327705\nK29\t327706\n1.8万亿\t327707\n外围\t327708\n蓝圈\t327709\n陈真\t327710\n乳腺机\t327711\n全国卷I\t327712\n图表秀\t327713\n上海保洁公司\t327714\n工品\t327715\n百度云svip\t327716\n祁可欣\t327717\n神相\t327718\n大葱\t327719\n第58期\t327720\n机械革命MR\t327721\n信封袋\t327722\n一蹶不振\t327723\n比美\t327724\nNBA2kol\t327725\n豫剧\t327726\n片头曲\t327727\n雅儿\t327728\n钢筋工\t327729\n3.19\t327730\n靶式流量计\t327731\n我的1919\t32773
2\nbinder\t327733\n变态者\t327734\n木辰\t327735\neiji\t327736\n千明勋\t327737\n君太平\t327738\n北京航空材料研究院\t327739\n8|\t327740\nvesting\t327741\n灵犀玉\t327742\n002352\t327743\n扳手\t327744\nLANCOME\t327745\n乐谱\t327746\n鼎成\t327747\n王小妹\t327748\n高青路\t327749\n鲨卷风\t327750\nedu\t327751\nvarchar\t327752\nleonleung\t327753\n烟台电视台\t327754\nfdm\t327755\n海委\t327756\n迪威视讯\t327757\n香港莎莎\t327758\n保姆网\t327759\nkurento\t327760\nCoCo奶茶\t327761\n胭脂泪\t327762\npropertychange\t327763\nthreaded\t327764\n78个\t327765\n火狐狸\t327766\nBD\t327767\n千层\t327768\nphicomm\t327769\n工薪族\t327770\n横杠\t327771\n1036\t327772\n淳化\t327773\nx8\t327774\n外用药\t327775\nvm10\t327776\n至少还有你\t327777\n20150211\t327778\n357g\t327779\n分红率\t327780\ncesd\t327781\n防修\t327782\n24mm\t327783\nfrist\t327784\nDNFSS\t327785\necd\t327786\n鼓劲\t327787\ninfinite-scroll\t327788\nmp3\t327789\nchenfool\t327790\n大豆\t327791\n大方县人民政府\t327792\n玄米茶\t327793\n火盖\t327794\n广饶一中\t327795\n募捐箱\t327796\n快盘\t327797\n此举\t327798\nsprinter\t327799\n小驴\t327800\n长治\t327801\n妖神\t327802\n厦门科技中学\t327803\n郑州市公安局\t327804\n刘红霞\t327805\n1.2m\t327806\n副厂长\t327807\nezcast\t327808\n100万行\t327809\n铁公鸡\t327810\napplepay\t327811\n妙警\t327812\n2027\t327813\n北京地铁\t327814\n房子\t327815\n王老实\t327816\n机械制图习题集\t327817\n教子\t327818\n16.98\t327819\n傲世九重天\t327820\n公交卡\t327821\n点工\t327822\n经手\t327823\n戏水\t327824\n天洋城\t327825\nSD敢达\t327826\n金路集团\t327827\n林鸣\t327828\n鸽宝\t327829\n柳琴戏\t327830\nheu\t327831\n比一\t327832\n撒\t327833\nSDMS\t327834\n长春国际会展中心\t327835\n第一季01\t327836\n环保小卫士\t327837\n国有资产监督管理委员会\t327838\n第161章\t327839\n举隅\t327840\n方正证券泉友通\t327841\n郑子明\t327842\n泰平胜世┊\t327843\n折颈\t327844\n罂粟花\t327845\n280家\t327846\n漏奶\t327847\nFrag\t327848\n城市论坛\t327849\n知识产权代理有限公司\t327850\n中国太阳能\t327851\n保护镜\t327852\nMX6Q\t327853\n民事法律行为\t327854\n调调\t327855\n色播\t327856\n人工气候箱\t327857\n第2話\t327858\n科易网\t327859\n阳朔西街\t327860\n事业编考试\t327861\nPercent\t327862\n陕西师范\t327863\nStreams\t327864\ncupboard\t327865\n杨泗港\t327866\n凸轮式\t327867\nm436n\t327868\nesim卡\t327869\n吴清\t327870\nhalcon\t327871\n新塍镇\t327872\n雪子\t327873\n
Bootstrap_红黑联盟\t327874\npro6s\t327875\nacts\t327876\n码链\t327877\nRoald\t327878\n5V\t327879\n岩藻糖苷酶\t327880\nepoch\t327881\nslogen\t327882\nruth\t327883\ninplace\t327884\n毛腿\t327885\n爆照\t327886\n电焊钳\t327887\n龟山公园\t327888\nTit\t327889\n东影\t327890\n银承\t327891\n惠英\t327892\n而来\t327893\n新手帮帮团\t327894\n皇家守卫军2\t327895\n披发\t327896\n段氏\t327897\n李恩菲尔德\t327898\n通项公式\t327899\n溆浦\t327900\n辣妈正传\t327901\n阿拉巴马\t327902\n天若有情慕晴\t327903\n挂章\t327904\n个案研究\t327905\n珩磨管\t327906\n王爱爱\t327907\ndeb文件\t327908\n大红\t327909\nMDaemon\t327910\n4g网\t327911\n首博\t327912\n醉白池\t327913\njsn\t327914\n凌虐\t327915\n失业\t327916\n003\t327917\ncontinuing\t327918\n类容\t327919\n酒店大堂\t327920\n斯坦森\t327921\n凤凰公园\t327922\n缓缓\t327923\n将会\t327924\n登不上\t327925\n理想主义者\t327926\n2600k\t327927\n武汉市经济和信息化委员会\t327928\n康正汽车集团\t327929\nti6\t327930\nLightbox\t327931\n学规\t327932\n钱江新城二期\t327933\n迎春大街\t327934\n户厕\t327935\n华乾\t327936\n老表\t327937\n西南交通大学经济管理学院\t327938\n江欣燕\t327939\n硅脂\t327940\nGN\t327941\n洛克人zero\t327942\n钩弋夫人\t327943\n无穷小量\t327944\n114网\t327945\n可疑\t327946\n王国兴亡录\t327947\n锎\t327948\n拓尔思\t327949\n变废为宝手工小制作教程\t327950\nCAZ\t327951\nCorrect\t327952\n简报\t327953\n黑莓手机\t327954\n新河县\t327955\nimagemagick\t327956\nMojito\t327957\n三级螺纹钢\t327958\naggregation\t327959\nlibc\t327960\n2018年1月4日\t327961\n金关\t327962\n杀破狼贪狼\t327963\n凌平平\t327964\narff\t327965\n红米2\t327966\n造粒\t327967\nwinntsetup\t327968\n网上认证系统\t327969\n11份\t327970\n跳跳糖\t327971\n附征\t327972\n顏色\t327973\n开收\t327974\n1994年\t327975\n卡巴金\t327976\n解比例\t327977\n汇金科技\t327978\nSWIFTCODE\t327979\nED2\t327980\n段浩雨\t327981\nvisit\t327982\nx220i\t327983\n3第二章\t327984\n富人\t327985\n1.06亿\t327986\n交存\t327987\n墨稿\t327988\nHons\t327989\n全国青少年信息学奥林匹克竞赛\t327990\n松茂\t327991\n1610\t327992\n战天\t327993\nzy\t327994\n李毅军\t327995\n篠田あゆみ\t327996\n应接不暇\t327997\nrigidbody\t327998\n蓝藻\t327999\n胡东\t328000\n野鬼\t328001\nPPAPI\t328002\n火星撞地球\t328003\nCOMMISSION\t328004\n狂武战帝\t328005\n格构\t328006\n生酮\t328007\n桑塔露琪亚\t328008\n反垄断法\t328009\n终\t328010\n802.1X\t328011\nproteomics\t328012\n秘书部\t328013\n创佳\t32801
4\n神偷奶爸2\t328015\nClinical\t328016\n当何罪\t328017\n透光性\t328018\nkeeper\t328019\ngma\t328020\n妮可李一男\t328021\n古曼童\t328022\n刻意练习\t328023\n150升\t328024\n乔茜\t328025\nplaxis\t328026\n养生城\t328027\n奔驰GLS450\t328028\nAVD\t328029\n第1季度\t328030\n12万辆\t328031\nNBN\t328032\n礼尚往来\t328033\n114ic\t328034\n焦躁\t328035\n英雄td\t328036\n马说\t328037\nidae\t328038\n无法\t328039\n轶事\t328040\n汉柏\t328041\n皮尤\t328042\n卍杰\t328043\n周某某\t328044\n牛摩网\t328045\n离散型\t328046\n高龄产妇\t328047\n松尾\t328048\n监播\t328049\n顺磁\t328050\n第76集\t328051\nPing值测试|在线ping\t328052\nTI杯\t328053\nwehotel\t328054\n大众波罗\t328055\n尤克里里和弦\t328056\n老臣\t328057\n板价\t328058\n70岁\t328059\n南苑路\t328060\nNike耐克\t328061\n文俊辉\t328062\nlock键\t328063\nQuestionnaire\t328064\n黑棒\t328065\n八县\t328066\nLucy\t328067\n12V\t328068\n婶子\t328069\n魏小东\t328070\n上期所\t328071\nhppt\t328072\n离线\t328073\n98年\t328074\n西游女儿国吧\t328075\n洞庭碧螺春\t328076\n职业学校学生实习管理规定\t328077\n同济大学建筑设计研究院(集团)有限公司\t328078\n全球通卡\t328079\n七匹\t328080\n新开盘楼盘\t328081\n百级\t328082\n见证者\t328083\n根基\t328084\n1kw\t328085\n打榜\t328086\n70平方\t328087\n吐沫\t328088\n由\t328089\n足脚\t328090\n1900元\t328091\n蒼井\t328092\n工商管理论文\t328093\n防爆摄像机\t328094\na1\t328095\n冈山\t328096\n斯多葛\t328097\n退游\t328098\n彩铅画\t328099\n教后感\t328100\n辛杜\t328101\nRailway\t328102\n修宪\t328103\n小饼\t328104\n杜门\t328105\n小木虫\t328106\n刘为民\t328107\ndeskmini\t328108\n仰天湖\t328109\n布业\t328110\n蒋雯丽\t328111\n2017年7月29日\t328112\nsmoothly\t328113\n中国税务网\t328114\nKerr\t328115\nTobu\t328116\n冷军\t328117\n忆典\t328118\n信息工程大学\t328119\n欢乐\t328120\n储备干部\t328121\n竹刻\t328122\nMISS\t328123\n依华\t328124\n进场费\t328125\njbl吧\t328126\n乳扇\t328127\n郑龙\t328128\n恋香\t328129\nLED数码管\t328130\n300124\t328131\n云师大附中\t328132\n五十句\t328133\n4.18\t328134\n防爆电磁阀\t328135\n季\t328136\n后事\t328137\n菊花晶\t328138\n小米笔记\t328139\n削骨\t328140\n乐清\t328141\n半挂车\t328142\n苏永康\t328143\n抗震救灾\t328144\n000629\t328145\n广西壮族自治区发展和改革委员会\t328146\n魔兽世界燃烧王座\t328147\nvde\t328148\n70层\t328149\nGig\t328150\n苔丝格雷迈恩\t328151\n10局\t328152\n源潭镇\t328153\n彩云\t328154\n万宝龙\t328155\n农村合作银行\t328156\nBubbles\t328157\n阿吽\t
328158\n呈贡新区\t328159\n郑州市骨科医院\t328160\n马勺\t328161\n大牛卡\t328162\n共济失调\t328163\n山东服装职业学院\t328164\n北京共青团\t328165\n巫师之昆特牌\t328166\nCudnn\t328167\n热耗\t328168\nartdeco\t328169\n鸿鹄\t328170\n卡瓦尼\t328171\n荒\t328172\nKEGG\t328173\nservers\t328174\ntextmate\t328175\n抢种\t328176\nVCU\t328177\ncodex\t328178\nJSTV\t328179\n漫化\t328180\n胶质瘤\t328181\ne200l\t328182\n靖安县\t328183\n5083\t328184\n瑞鸣\t328185\nTN\t328186\nYoung\t328187\n绿日\t328188\n散件\t328189\nmxgraph\t328190\n第1段\t328191\nZ1000\t328192\n飞燕式\t328193\n折扣端\t328194\n狂摸\t328195\n日暮\t328196\n漫画网\t328197\n杨浦五角场\t328198\n候选码\t328199\nVLOG\t328200\n固定式\t328201\n手机模拟器\t328202\n考核册\t328203\npostgres\t328204\nPhoenixZq\t328205\n中沃\t328206\n病变\t328207\n受训\t328208\n回车\t328209\n38期\t328210\n西安市大学\t328211\n1.8两个\t328212\n夜间\t328213\n电动搬运车\t328214\n冲云霄\t328215\nLiang\t328216\n绝地求生之\t328217\n呼吸内科\t328218\n耶稣会\t328219\n牙椅\t328220\n44项\t328221\n纸巾筒\t328222\n277dw\t328223\n增值税专票\t328224\n基础会计学\t328225\n傻气\t328226\nspiral\t328227\n卡币\t328228\n庆铃汽车\t328229\n易站通\t328230\n江西省委党校\t328231\n天香台阁\t328232\n捐血\t328233\ntuigirl\t328234\n餐攻略\t328235\n吉安市人民政府\t328236\n圣洁\t328237\n增高\t328238\n有机奶粉\t328239\n齿轮\t328240\n谦让\t328241\n山前村\t328242\n大漠苍狼\t328243\n建费\t328244\n长城会\t328245\n旅游在线\t328246\n火宫殿\t328247\n监视者\t328248\nCHART\t328249\n撒旦\t328250\n维骨力\t328251\n三言两语\t328252\n冯特\t328253\nLodge\t328254\nselect函数\t328255\n横刀天笑\t328256\n铜锁\t328257\n4199\t328258\n真正男子汉2\t328259\n潜山县\t328260\n秘书之友\t328261\n张剑锋\t328262\n苹\t328263\n体艺\t328264\n思想界\t328265\n歌舞剧\t328266\n死机\t328267\n中国路桥\t328268\n1.6T\t328269\n国营经济\t328270\n裸胸\t328271\n受留\t328272\n雪崩\t328273\n8.0.2\t328274\n抄手\t328275\n贫困县\t328276\n季節\t328277\n舔阴\t328278\n崩溃\t328279\n非公开发行\t328280\n照光\t328281\n海娃\t328282\n蓝洞公司\t328283\n以偏概全\t328284\n磨豆腐\t328285\n肛门\t328286\n碧波庭\t328287\n少林小子\t328288\n常乐\t328289\n网站整站下载器\t328290\n电熔\t328291\nRS232\t328292\n360WIFI\t328293\n万里长城\t328294\n利昂\t328295\nassisted\t328296\n单用途商业预付卡\t328297\n脑动脉瘤\t328298\n口形\t328299\n小夫\t328300\n上海市检察院\t328301\n胤礽\t328302\n广本冠道\t328303\nLexi\t3
28304\nR-CNN\t328305\n350V2\t328306\n6AT\t328307\n中原银行\t328308\n2008年5月12日\t328309\n孙宇晨\t328310\n新喀里多尼亚\t328311\nTiled\t328312\n渊蓝\t328313\n6公里\t328314\n求佛\t328315\n肉装\t328316\n开罗宣言\t328317\n过夜\t328318\n黄小厨\t328319\n凤尾花\t328320\n爱尤物\t328321\nwdos\t328322\n恶搞好色\t328323\nbd1024p/1080p/Mp4\t328324\n嘉兴市城乡规划建设管理委员会\t328325\n20170305\t328326\n寡宿\t328327\n婚服\t328328\ngameguardian\t328329\n交通银行信用卡\t328330\n王庄村\t328331\n2006-2016年\t328332\n四五线\t328333\n北京大学国际合作部留学生办公室\t328334\n福州大学图书馆\t328335\nplc编程\t328336\n2017年9月份\t328337\n联网\t328338\n短文两篇\t328339\n细胞分裂5\t328340\n淘宝众筹\t328341\n金鳞岂是池中物\t328342\n渡江侦察记\t328343\n雪中死人经\t328344\n骨关节炎\t328345\n白山路\t328346\n俊才\t328347\nDevelopmental\t328348\n18060期\t328349\n丁氏\t328350\n绝地求生老地图\t328351\n分隔线\t328352\n坐针毡\t328353\n阳光学习网\t328354\nLDA\t328355\n100Mbps\t328356\n军事卫星\t328357\n京东到家\t328358\ngxp\t328359\nupgraded\t328360\n中赢\t328361\n600701\t328362\n东软集团股份有限公司\t328363\n高级宏观经济学\t328364\n小地\t328365\njack篇\t328366\n客点\t328367\n仙剑神曲\t328368\n李爱军\t328369\n能干净\t328370\n15m\t328371\n6.03\t328372\n仅限于\t328373\n紫皮\t328374\nMUX\t328375\n罪有\t328376\n莫\t328377\n阿昌族\t328378\n咸丰元宝\t328379\n内地版\t328380\n魏县\t328381\n起手式\t328382\n菲娜\t328383\nLeaving\t328384\nABAB\t328385\n环球军事网\t328386\n恰到好处\t328387\n孤零零\t328388\n传感器专家网\t328389\n高田贤三\t328390\n海信集团有限公司\t328391\n陈经理\t328392\n线标\t328393\nEPS素材\t328394\n海信日立\t328395\n起见\t328396\n宝坻区\t328397\n石门镇\t328398\n光芒石\t328399\n羚羊斗破苍穹\t328400\n四大星座\t328401\nzoom\t328402\n沟女\t328403\n山水集团\t328404\n精品学习网\t328405\n11.9%\t328406\n樾树\t328407\n模特队\t328408\n白宗元\t328409\n兵变\t328410\n在里面\t328411\n磨石\t328412\n125号\t328413\n品城记\t328414\n爱相\t328415\n13处\t328416\n成峰\t328417\n快速性\t328418\n苏莲托\t328419\n15#\t328420\n普朗东\t328421\ncode-craft\t328422\n4399地下城堡2:黑暗觉醒\t328423\nAutorun\t328424\n八达岭\t328425\n杨新安\t328426\n线程\t328427\n万台\t328428\n百度众测平台\t328429\nveu\t328430\n13个月\t328431\n金流\t328432\nHomePage\t328433\n南京师范大学新闻与传播学院\t328434\n莱卡镜头\t328435\n公開\t328436\n2018-04-23\t328437\n爱校\t328438\n恶魔之谜\t328439\n中影国际影城\t328440\n赵媛媛\t328441\n贪恋\t
328442\n济宁市中区\t328443\n花千谷\t328444\n中播网\t328445\n硫酸氢铵\t328446\n天河二号\t328447\nOP\t328448\n社保登记证\t328449\n偷懒\t328450\n胡玉兰\t328451\n真实度\t328452\n指压\t328453\n2665\t328454\ntokenize\t328455\n星点\t328456\n300刀\t328457\n电白县\t328458\nGraph\t328459\n很害怕\t328460\n金鸭\t328461\n安徽林业信息网\t328462\nMacBookPro\t328463\n成都市地方税务局\t328464\n中图\t328465\n265号\t328466\n木粉\t328467\n吉林一号\t328468\n下一个\t328469\n新们\t328470\n李南星\t328471\n和讯保险\t328472\n国兰\t328473\n南丹东路\t328474\nBelong\t328475\n恒心\t328476\n无眠\t328477\n种植园\t328478\n虚白\t328479\n复华集团\t328480\n中国社会科学杂志社\t328481\nsyd\t328482\n贵生\t328483\n人教版八\t328484\ntoyo\t328485\n聚心\t328486\n锦州港\t328487\nhandling\t328488\n年迈\t328489\n迅雷快鸟\t328490\n夜桂子\t328491\n万达东方影都\t328492\n孙晓云\t328493\n时间条\t328494\nboot2\t328495\n地布\t328496\n桃李满天下\t328497\n一个7\t328498\nTP3\t328499\nactivation\t328500\nbsa\t328501\n苏琛\t328502\nマル\t328503\n硼氢化钠\t328504\n髙\t328505\n16px\t328506\n福山外国语小学\t328507\n胖次\t328508\n刘太医\t328509\nPythoner\t328510\n新晋\t328511\n焓湿图\t328512\n谢岗镇\t328513\n汽车修理厂\t328514\n觉晓\t328515\n种田梨沙\t328516\n三亮\t328517\n地贫\t328518\n科学性\t328519\n英雄无敌3HD\t328520\n环卫\t328521\nD轮\t328522\n28栋\t328523\n生西法师\t328524\n九峰森林动物园\t328525\n路西法第三季\t328526\n六祖\t328527\n牺牲\t328528\n布兰查德\t328529\n中英文公司\t328530\n同理心\t328531\n文都\t328532\n汇评\t328533\n换防\t328534\nDCE\t328535\n友商\t328536\n园林\t328537\nqq在线\t328538\n尚海郦景\t328539\n同人志\t328540\nreserved\t328541\n6伏\t328542\n2017年12月18日\t328543\n尾花\t328544\n澳门风云3\t328545\npork\t328546\n舞衣\t328547\n1.9.32.0\t328548\n固定端\t328549\n动图\t328550\n培元\t328551\n太玉园\t328552\n澤村レイコ\t328553\n海马公园\t328554\n杭州美院\t328555\n购置费\t328556\nHDS\t328557\nlowpoly\t328558\n心神\t328559\n公检\t328560\n红酒百科全书\t328561\n南湖公园\t328562\n倾莲池\t328563\n土屋太凤\t328564\nCOMPASS\t328565\n面壁\t328566\n游戏堡\t328567\ndbeaver\t328568\nShiro\t328569\nmkd\t328570\n留出\t328571\n左滑\t328572\n小标宋\t328573\n中国施工企业管理协会\t328574\n博途v14\t328575\n注册公用设备工程师\t328576\n10万亩\t328577\n伊萨卡\t328578\n手速\t328579\n王家湾中央生活区\t328580\n库尔茨\t328581\n南焦客运站\t328582\nprotocols\t328583\ncruzer\t328584\n夏小正\t328585\n夹胶\t
328586\n牧马雷克萨斯ls\t328587\n调焦\t328588\n华中地区\t328589\n莉莉安\t328590\nanychat\t328591\nxcap\t328592\n瑶里\t328593\ngtx850m\t328594\n王基\t328595\n上榜\t328596\n气相\t328597\n袖钩\t328598\n中国西部国际博览城\t328599\n无鳞鱼\t328600\n真假\t328601\nFools\t328602\n一阕\t328603\n慢性肾病\t328604\ntype-c接口\t328605\n乔大海\t328606\n长短\t328607\n瑞金人民政府\t328608\n抱犊寨\t328609\n鬼卞\t328610\nKDX\t328611\nHSSFWorkbook\t328612\n曲靖\t328613\n365\t328614\n尾骨\t328615\n大连日报数字报\t328616\n985大学\t328617\n供应商管理\t328618\ndownloading\t328619\nInfiniBand\t328620\n群员\t328621\n青山水库\t328622\n江淮十校\t328623\nツア\t328624\n尼龙棒\t328625\n醒世姻缘传\t328626\n3.7%\t328627\n容联云\t328628\n闸刀\t328629\n把子\t328630\n数百辆\t328631\n立克\t328632\n中华人民共和国海事局\t328633\n轴度\t328634\n福安市\t328635\n贵阳新闻网\t328636\n节流\t328637\n石狮人才网\t328638\n85级\t328639\n就业登记证\t328640\nibatis\t328641\n山东省交通厅\t328642\n第22卷\t328643\n蔚雨芯\t328644\nProject\t328645\n千尺\t328646\n顺治帝\t328647\nps/2\t328648\n拉力器\t328649\n20170719\t328650\n北师大二附中\t328651\nLTV\t328652\n十次方\t328653\n要约收购\t328654\n医学检验专业\t328655\n为止\t328656\n琼瑶\t328657\nqiang\t328658\npal\t328659\n县住建局\t328660\n潜龙二号\t328661\nlookae\t328662\n刘炽平\t328663\n铅垂线\t328664\n老子道德经\t328665\n拉响\t328666\n口袋妖怪究极之日\t328667\noptimum\t328668\nstrobe\t328669\n蟹粉\t328670\n91平\t328671\n万秀区\t328672\n帝霸_帝霸\t328673\nUltrabook\t328674\n轩辕汉化组\t328675\n黑塔\t328676\n全场景\t328677\n要点\t328678\nfoundations\t328679\n云母片\t328680\n苏言\t328681\n国寿福\t328682\n北娃\t328683\n新视野大学英语读写教程\t328684\n龙珠传奇之无间道\t328685\n浴血黑帮\t328686\n中国水利水电科学研究院\t328687\n标致508论坛_汽车之家论坛\t328688\n香评\t328689\nchee\t328690\n海军风\t328691\n53KF\t328692\n符管\t328693\n鲸头鹳\t328694\nRePEc\t328695\n世界地图\t328696\n_锦\t328697\nSayaka\t328698\n单人沙发\t328699\n13册\t328700\n78分\t328701\nbuntu\t328702\n再起\t328703\n幻翎\t328704\n火影天天酷跑\t328705\n莉莉娅\t328706\n凌凯\t328707\n中国涂料工业协会\t328708\n6480\t328709\n首趟\t328710\n五千多年\t328711\nxh\t328712\n洛阳地区\t328713\n五甲\t328714\n20万个\t328715\n石慧\t328716\n三司\t328717\n双金属片\t328718\n呼和浩特新闻网\t328719\n锐界\t328720\n语族\t328721\nLightTools\t328722\n六氟磷酸锂\t328723\n公共汽\t328724\n头车\t328725\n宋宝儿\t328726\
nwanted\t328727\n200只\t328728\nliutao\t328729\n网电\t328730\n妖仙\t328731\n小米MIUI\t328732\n高度处\t328733\n硬皮病\t328734\n源远流长\t328735\n3首\t328736\n流水团\t328737\n虚\t328738\n硬石\t328739\nRun\t328740\n金宇彬\t328741\n2016年4月21日\t328742\n杨菲\t328743\nEtsy\t328744\n天才相师\t328745\n麓山国际社区\t328746\n张鑫磊\t328747\n12周年\t328748\n北京航空航天大学研究所\t328749\n瑞安中学\t328750\n警告框\t328751\n马蹄湾\t328752\n表哥们\t328753\n国药股份\t328754\nmac版\t328755\n基建期\t328756\nVIA\t328757\n你的笑容\t328758\nFps\t328759\n新部\t328760\ncima\t328761\n大连市安全生产监督管理局\t328762\n阿拉伯之夜\t328763\narrives\t328764\n陈嘉上\t328765\n康润家园\t328766\n金曜日\t328767\n条鱼\t328768\n特价版\t328769\n时代花园\t328770\nIncredible\t328771\n孝贤\t328772\nvnr\t328773\n恶疾\t328774\n此日\t328775\n诺达\t328776\n神仙劫\t328777\n接盘侠\t328778\n飞利浦伟康\t328779\n盐酸坦索罗辛缓释胶囊\t328780\n4袋\t328781\n政史\t328782\n协鑫集团\t328783\n婚烟\t328784\n卡罗热血江湖\t328785\n电脑病毒\t328786\n色谱法\t328787\nLayabox\t328788\n高计\t328789\n柴胡\t328790\n娇女\t328791\nidea2018\t328792\n罗顿发展\t328793\n出任\t328794\n花粉们\t328795\n恶魔高校\t328796\ndefective\t328797\n备用机\t328798\n梁翘柏\t328799\n使用性\t328800\n排水量\t328801\nmoka\t328802\nPoirot\t328803\n我愿做\t328804\nDTR\t328805\n安云\t328806\n宫腔镜手术\t328807\nPleasant\t328808\n三亚宾馆\t328809\n划为\t328810\nd7\t328811\n免费电子书转换器\t328812\n昆明长水国际机场\t328813\n工具柜\t328814\n靳\t328815\nqh\t328816\n第135章\t328817\n阜阳市教育局\t328818\nAB血型\t328819\n咸祥\t328820\nakamai\t328821\n现名\t328822\n茅原实里\t328823\n敖汉\t328824\n天长先锋网\t328825\n军痞\t328826\n22号\t328827\n低垂\t328828\n答主\t328829\n黄道婆\t328830\n中交第四公路工程局有限公司\t328831\n蚌埠医学院第一附属医院\t328832\n澪\t328833\n表示式\t328834\n赵焕焱\t328835\n巴比伦柏林\t328836\n消逝\t328837\n节后\t328838\n丰原药业\t328839\n格力手机2\t328840\n红缨枪\t328841\nCNXP\t328842\n王舍人\t328843\n尘烟\t328844\n村\t328845\n小岳\t328846\n花毽\t328847\n凯捷\t328848\nHG8120C\t328849\n5.24\t328850\n网络电视机顶盒\t328851\n苏州大学文学院\t328852\n隐遁\t328853\n集卡\t328854\n巴黎协定\t328855\nICP经营许可证\t328856\n建安小学\t328857\n八路\t328858\n自助值机\t328859\n麦旋风\t328860\n中国历史\t328861\n区域赛\t328862\nhep\t328863\n卸妆\t328864\nuibarbuttonitem\t328865\n栗栖エリカ\t328866\n炉排\t328867\n密苏里大学\t328868\nBigBang\t328
869\nDolly\t328870\n魂\t328871\n阿古茹\t328872\n沈航\t328873\n断情\t328874\n中国五矿\t328875\n甜蜜\t328876\nAF1\t328877\n第165集\t328878\n崩溃大陆\t328879\n房地产信息网\t328880\n坐馆\t328881\nfloat型\t328882\n81_\t328883\n群星stellaris\t328884\ndb2数据库\t328885\n鱼儿\t328886\n单极性\t328887\n滑块\t328888\nLEAN\t328889\n立体纸\t328890\n普渡\t328891\n散装水泥\t328892\n冉家坝\t328893\n2017年7月24日\t328894\n诗语\t328895\n瘦身衣\t328896\n排列图\t328897\n肿瘤科\t328898\n大长今\t328899\n僵尸毁灭工程吧\t328900\n敕勒歌\t328901\n动量仪\t328902\n郑东新区政府网\t328903\n袁视角\t328904\n大作手\t328905\n二十多天\t328906\n树先生\t328907\nK-means聚类\t328908\n瞄准器\t328909\n二维码支付\t328910\n郭晶\t328911\n托托\t328912\ncomand\t328913\n阿Q范文网\t328914\n载具\t328915\nsteamcommunity\t328916\n北京市第六医院\t328917\n自由门\t328918\n春城秋介\t328919\n气动打标机\t328920\n小辛娜娜\t328921\n意识形态责任制\t328922\nfibonacci\t328923\n战女神zero\t328924\n雨宫莲\t328925\n一盏\t328926\ndimensions\t328927\n钢结构件\t328928\ncummins\t328929\n黑壳\t328930\n盐田区\t328931\n龙翔路\t328932\noO\t328933\n文征明\t328934\n变形器\t328935\nSplice\t328936\n杂交稻\t328937\n程序式\t328938\nJUNIPER\t328939\n5.16\t328940\n碳酸氢钾\t328941\n量角器\t328942\nproficient\t328943\n中华人民共和国江苏海事局\t328944\nTrajectory\t328945\n哥特式建筑\t328946\nbinkw32.dll\t328947\n承保\t328948\n仙剑五\t328949\n博士团\t328950\n青海羚网\t328951\n配货\t328952\nLuck\t328953\n首阳山\t328954\nhuanqiu\t328955\n27001\t328956\n家居馆\t328957\n軍\t328958\nhtmltestrunner\t328959\nkalman\t328960\njihe\t328961\n王志伟\t328962\n西游谜中谜\t328963\n满城县\t328964\n资源篇\t328965\n名句\t328966\n_标\t328967\nfrt\t328968\nkuala\t328969\n306路\t328970\nSpanning\t328971\n文本类\t328972\n盗贼们\t328973\n沙漠图\t328974\n杨烨\t328975\n江西省残疾人联合会\t328976\nx+2y\t328977\n大戏台\t328978\n数据采集系统\t328979\n西南师范大学\t328980\nMichelle\t328981\nMate10Pro\t328982\nUU妹\t328983\nPlesk\t328984\nvtk\t328985\nactiverecord\t328986\n收邮件\t328987\n滴水穿石孙杰\t328988\n0.80\t328989\n姚家山\t328990\n肥田\t328991\nd2\t328992\nA1661\t328993\n中山桥\t328994\n斗鱼节\t328995\n围标\t328996\n星际之门亚特兰蒂斯\t328997\n8750H\t328998\n5月1日后\t328999\n炮娘\t329000\n爱暖\t329001\n首饰盒\t329002\nss\t329003\nOpenresty\t329004\n城史\t329005\nStencil\t329006\n弯梁世界\t329
007\n甲虫\t329008\n磷酸铁锂\t329009\n实木复合地板\t329010\n另立\t329011\n双色球宝典\t329012\n东莞南城\t329013\n截止性\t329014\n护发素\t329015\n华西集团\t329016\n苍汰\t329017\n新基地\t329018\n马努\t329019\n测试项\t329020\n带压\t329021\n冰糖葫芦\t329022\n稻花香\t329023\n迁安市\t329024\n利比里亚\t329025\n保人\t329026\nAEON\t329027\n真人真事\t329028\n不结果\t329029\n黄杨木\t329030\n抓拍\t329031\n江门市\t329032\n全职法师\t329033\n电玩巴\t329034\n粉群\t329035\n华清远见移动互联网学院\t329036\n柳钢股份\t329037\n说到\t329038\n气动扳手\t329039\n石油输出国组织\t329040\n通元\t329041\n埃索美拉唑\t329042\n文笔峰\t329043\n三元悖论\t329044\n英魂\t329045\n3.2.1\t329046\n颈线\t329047\nav磁力链接\t329048\n把儿\t329049\n工艺卡\t329050\n正四面体\t329051\n第九大陆\t329052\n软态\t329053\n鬼马\t329054\nNET\t329055\n中山三路\t329056\n风迷\t329057\n乔恩\t329058\n6架\t329059\n贡米\t329060\nrandomize\t329061\n矿机\t329062\n肠球菌\t329063\nbaseline\t329064\n梦幻家园\t329065\n休息室\t329066\n枪神记\t329067\n加密锁\t329068\nHired\t329069\n温州商学院\t329070\n临床助理医师\t329071\n发作\t329072\necexl\t329073\n266\t329074\n_小小智慧树\t329075\n晋阳街\t329076\n太太乐\t329077\njedis\t329078\n排子\t329079\n宋美西游记\t329080\nhublot\t329081\n梨花女子大学\t329082\nRKI\t329083\n冒险岛——17173\t329084\n北京音乐广播\t329085\n舒居\t329086\n泊松方程\t329087\n云克\t329088\n于小慧\t329089\n马兰头\t329090\n聚新宝\t329091\n先辈\t329092\n大资管\t329093\npz30\t329094\n沁河\t329095\nexcel转换器\t329096\n沈阳地铁10号线\t329097\nqyule\t329098\n王春明\t329099\nRCD\t329100\nVerge\t329101\n40a\t329102\n口装\t329103\n沙漠之狐\t329104\n36天\t329105\n5立方\t329106\n汪束\t329107\n接踵\t329108\nadv\t329109\n艾米粒\t329110\n古埃及\t329111\n必填\t329112\n乳霜\t329113\n网站服务器\t329114\n斜线\t329115\n酒店式\t329116\n网易有道\t329117\n和牛\t329118\n798艺术区\t329119\n朱辛庄\t329120\ngeo\t329121\n受援\t329122\n秀气\t329123\n车改\t329124\n受托人\t329125\n投保单\t329126\n东师\t329127\nspecialization\t329128\n玉田\t329129\n后妈的春天\t329130\n元类\t329131\n即墨老酒\t329132\nBui\t329133\n线制\t329134\n8千元\t329135\n安珂\t329136\n信息大厦\t329137\n空壳村\t329138\n元魂珠\t329139\n天天象棋战国七雄\t329140\nfreemodbus\t329141\n大冒险家\t329142\n昌吉\t329143\nrefreshing\t329144\n摘自\t329145\nToolbox\t329146\n重庆市科学技术研究院\t329147\n3DsMax\t329148\n十字军之王\t329149\n佛陀正法网\t329150\n坂田金时\t329151\n易付\t329
152\n茶山情歌\t329153\n东方大学\t329154\n彩泥\t329155\n崔顺实\t329156\n500ma\t329157\n扒手\t329158\n大皇途观\t329159\n达官\t329160\n降权\t329161\nC83\t329162\n亚尔夫海姆\t329163\n韦正\t329164\n成都体院\t329165\n副班\t329166\n1621\t329167\n关联交易\t329168\n八两\t329169\nhast\t329170\n白兰瓜\t329171\nSimulator\t329172\n仙翁\t329173\n777777\t329174\n理亏\t329175\nRF\t329176\n九如城\t329177\n76岁\t329178\n6.0.2.7\t329179\n衤\t329180\nGamersky\t329181\n提取液\t329182\nPalace\t329183\n结果\t329184\n铁板烧设备\t329185\nWin10计算器\t329186\n候审\t329187\nQQ游戏\t329188\n初级篇\t329189\nstarlight\t329190\n跨类\t329191\n聪\t329192\nreplaceAll\t329193\nEco\t329194\n低脂\t329195\n都督府\t329196\n第\t329197\n贾跃亭\t329198\n卡仕达\t329199\n勇士币\t329200\n扩散板\t329201\n自制剧\t329202\n布点\t329203\n婚嫁\t329204\n表演赛\t329205\n10年后\t329206\n浙江大学人文学院\t329207\n复方金银花颗粒\t329208\n宠物小精灵xy\t329209\n陈玉莲\t329210\n霸霸\t329211\n致残\t329212\n义妹\t329213\n日期章\t329214\n天津摇号\t329215\n永远\t329216\n放假了\t329217\n3.5公里\t329218\n50cent\t329219\n传奇盛世\t329220\nShadowPlay\t329221\n黑胡桃木家具\t329222\n波瓣\t329223\nAPK3安卓网\t329224\n要素市场\t329225\n理查德米勒\t329226\n艾欧尼亚\t329227\nI2S\t329228\n随车吊\t329229\n乳猪\t329230\n针对\t329231\n太太太\t329232\n肖奈\t329233\n翁垟\t329234\n结果期\t329235\n睾丸炎\t329236\n广西师范大学出版社\t329237\n戴瑞\t329238\n阿里云云\t329239\n公字\t329240\n胶东在线\t329241\n3.3.9\t329242\n潜水者\t329243\n聂荣臻\t329244\n商业化\t329245\n天津中原地产\t329246\n3两\t329247\n发展之路\t329248\n据理力争\t329249\n205号\t329250\n担保金\t329251\n泛微\t329252\n华米\t329253\n巴库\t329254\n研招\t329255\nスピ\t329256\n减持\t329257\n武田华\t329258\n宁夏气象局\t329259\nmapi\t329260\nLayDate\t329261\n开源节流\t329262\nfrac\t329263\n8月11日\t329264\n格鲁克\t329265\nipconfig\t329266\nCuts\t329267\n唱完\t329268\n三八\t329269\n三第一章\t329270\n天空盒\t329271\n安森美半导体\t329272\ni6s\t329273\n用化\t329274\n广电大厦\t329275\n考研群\t329276\n大富翁6\t329277\nCPK\t329278\n伽师县\t329279\n300mium\t329280\n云南大学软件学院\t329281\n敢达创战者\t329282\n海马苹果助手\t329283\n京瓷6525\t329284\n虾油\t329285\n刺穿\t329286\n虾友\t329287\n余依许越\t329288\n滨江浦\t329289\n火车采集器\t329290\n中视\t329291\nTSF\t329292\n58同城加盟网\t329293\n杯底\t329294\n歌词版\t329295\n公安部警卫局\t329296\n降支\t329297\n安
富利\t329298\n绝地求生卡loading\t329299\n63级\t329300\nSYS\t329301\ncod14\t329302\n赵旭东\t329303\n新广告法违规词\t329304\n张杨\t329305\nalk\t329306\n革命\t329307\n殖装\t329308\nranked\t329309\n挥\t329310\n山东卫视育儿大\t329311\nOunces\t329312\n丽驰\t329313\n智囊团\t329314\n49名\t329315\n氢能\t329316\n纸管机\t329317\n段式\t329318\n目的\t329319\n精度型\t329320\nTitans\t329321\n春江水暖鸭先知\t329322\n18.5\t329323\nrhe\t329324\n华景新城\t329325\n分类树\t329326\n引用源\t329327\n邸\t329328\nMaven2\t329329\n造船厂\t329330\n831\t329331\n成都市发展和改革委员会\t329332\n安徽省中医院\t329333\n赵晓曦\t329334\n防火阀\t329335\n近墨者\t329336\n喜剧演员\t329337\n三门峡\t329338\n韩金英\t329339\n特诊\t329340\npython3\t329341\n1.2.3\t329342\nxiumin\t329343\n幽影\t329344\naps\t329345\n低头族\t329346\n氮封\t329347\n天下第一关\t329348\n陳聽溪\t329349\nBarclays\t329350\nBinance\t329351\n中国共产党第十九届中央委员会\t329352\n超级百变王\t329353\nevelom\t329354\n2017-06\t329355\nbashi\t329356\n不如故\t329357\nストッキング\t329358\n雅妍\t329359\n三哥\t329360\n兴华南街\t329361\n人教版九年级化学\t329362\n非官方\t329363\nlinong\t329364\n光纤激光打标机\t329365\n对奖\t329366\n第119期\t329367\n宿舍管理系统\t329368\n黄粉虫\t329369\nSNB\t329370\n印本\t329371\n促销品\t329372\nhonghe\t329373\n我的世界Minecraft\t329374\n970_\t329375\n华为Matebook\t329376\n20150915\t329377\n53分\t329378\n心理罪\t329379\n产动\t329380\n苏州外国语学校\t329381\n0930\t329382\njsgs\t329383\n气海穴\t329384\n汇诚\t329385\nConcurrentHashMap\t329386\n名特优\t329387\n蜜桃社\t329388\n马来西亚机场\t329389\n钢质门\t329390\n色天堂\t329391\n勤绩\t329392\n品居\t329393\n中共中央纪律检查委员会\t329394\n冒险岛5\t329395\n20170113\t329396\n张桥\t329397\n碗燕\t329398\n两班\t329399\n青岛海信电器股份有限公司\t329400\nLawn\t329401\n百丽吧\t329402\n张丹峰\t329403\n执信中学\t329404\n无限双城记\t329405\nSplendid\t329406\nArlin\t329407\n包窗\t329408\n汉台\t329409\n下吧\t329410\n褥疮\t329411\nZ9D\t329412\nLitter\t329413\n珍珠明目滴眼液\t329414\n狄淇儿\t329415\n郑州地铁3号线\t329416\n2017百大\t329417\n星露谷物语\t329418\n宋MAX\t329419\n一叶文学城\t329420\n永清\t329421\n永丰路\t329422\n复方石韦胶囊\t329423\n深圳市政府\t329424\n科学松鼠会\t329425\n窑厂\t329426\n圆滑\t329427\n重装机兵\t329428\nxinshang\t329429\n黑篮\t329430\n40Cr\t329431\nSQL之DataFrame操作大全\t329432\n碳酸氢根\t329433\n尼古拉斯凯奇\t329434\n70秒\t32
9435\ncohen\t329436\n智能风控\t329437\n狼藉\t329438\n自然法则\t329439\n八夫\t329440\n银滩\t329441\n沼跃鱼\t329442\n斗鱼绝地求生黄金大奖赛\t329443\n长江商学院\t329444\n苗青\t329445\nproximal\t329446\nunreachable\t329447\n陈安\t329448\n河南省人事考试中心\t329449\n乐在哈尔滨\t329450\n赫尔德\t329451\n不嫁\t329452\n中企盟\t329453\n祖冲之路\t329454\n个儿\t329455\n骑马与砍杀mod\t329456\nreviewer\t329457\n土木工\t329458\n0.5磅\t329459\n橘灿\t329460\n长隆集团\t329461\n儿童康复中心\t329462\n海线\t329463\n醉生梦死\t329464\nBits\t329465\n索罗亚\t329466\n世界之王\t329467\n民事诉讼\t329468\n再说\t329469\n搜航网\t329470\n临时\t329471\n北京保监局\t329472\nAbbyy\t329473\n吉诺\t329474\n祭扫\t329475\n麻吕\t329476\n战争之人:突击小队2\t329477\n剑姬神圣谭\t329478\n鞍山钢铁集团公司\t329479\n修仙高手混花都\t329480\n厦门集美大学\t329481\n勿忘\t329482\n魔乳秘剑帖\t329483\n中山南二路\t329484\nchubby\t329485\n蜡像\t329486\n赵熙\t329487\npragma\t329488\n鹤峰县政府网\t329489\ntunnel\t329490\n2012b\t329491\nQSBDC\t329492\nmarcel\t329493\n1.9.7.27907\t329494\n封笔\t329495\n曹赟定\t329496\n兰心蕙质\t329497\n糖猫\t329498\nso源\t329499\nlabels\t329500\n2017-20\t329501\n铝型材挤压机\t329502\nscars\t329503\nsansung\t329504\nhtml元素\t329505\n漱玉\t329506\n深圳物流公司\t329507\n母孔雀鱼\t329508\nlililicat\t329509\n24集\t329510\nzhiying\t329511\n精度高\t329512\n微信公众服务号\t329513\n当场\t329514\n组合贷\t329515\n芯纸\t329516\n企业家们\t329517\n络石藤\t329518\n胤禟\t329519\nos\t329520\n寿眉\t329521\n刘雪梅\t329522\n德维尔\t329523\n千亿斤\t329524\nWayne\t329525\n季伟\t329526\n嘎嘎\t329527\n郑州市中医院\t329528\nmadrid\t329529\n郑多彬\t329530\n清障车\t329531\n佛爷\t329532\n珈蓝\t329533\n草芥\t329534\n4多\t329535\n96.8\t329536\n人型\t329537\n中国交通新闻网\t329538\n可用版\t329539\n英式橄榄球\t329540\nIPPBX\t329541\n核酸提取仪\t329542\nfm2008\t329543\n长春版小学\t329544\nDIFF\t329545\n邓演达\t329546\nsgamer\t329547\n宁兴\t329548\n桃乃木香奈\t329549\n南海仲裁案\t329550\n竞得\t329551\n立心\t329552\n甲醛清除剂\t329553\n2017年4月14日\t329554\n南迦巴瓦\t329555\n600型\t329556\n邦盛科技\t329557\n书香门\t329558\n玻璃画\t329559\nи\t329560\n7MM\t329561\nQuora\t329562\n959\t329563\n转座子\t329564\n5台\t329565\n四太\t329566\n何谓\t329567\nC1083\t329568\nfinale\t329569\n安然公司\t329570\n停留不前\t329571\nmxml\t329572\n黄河风景名胜区\t329573\n月琴\t329574\n王蔷\t329575\n6500亿\t3
29576\nsexinsexboard\t329577\n莎娜琳\t329578\n扑街\t329579\nEdin\t329580\n超声波测距仪\t329581\ncase\t329582\n八公里\t329583\n恶行\t329584\n风云\t329585\n水上游\t329586\n37.3度\t329587\n任勇\t329588\n感恩的心\t329589\n焦作市人民政府\t329590\nrelativelayout\t329591\n6c\t329592\nTactics\t329593\n世乒赛\t329594\n皮腔\t329595\n药壶\t329596\n飞虎队之潜行极战\t329597\n探索_参考网\t329598\n泪奔\t329599\n雅典机场\t329600\n千帆过尽\t329601\n十陵镇\t329602\n福建物构所\t329603\ns2700\t329604\n望江楼公园\t329605\nUart\t329606\n安倍晴明\t329607\n24史\t329608\n大头\t329609\n巫妖王\t329610\n记大过\t329611\n1881\t329612\nac1200\t329613\netf\t329614\ntriggers\t329615\nAHB\t329616\n逗弄\t329617\n剃头匠\t329618\n1000kw\t329619\n驱动式\t329620\n一&#160\t329621\n可追溯性\t329622\n东莞市国家税务局_广东省国家税务局\t329623\nunrar\t329624\n玛莎逆鳞\t329625\n湖南代表团\t329626\n移动办公系统\t329627\n3367DNF\t329628\n维生素\t329629\n光洋\t329630\n钻具\t329631\n7发\t329632\n海桐小学\t329633\n纳客\t329634\n岩片漆\t329635\n镑\t329636\n风影\t329637\n广州铁路局\t329638\n幕府将军2:武家之殇\t329639\n安全度\t329640\n琅琊坊\t329641\n被拐\t329642\n姜振宇\t329643\nsql注入\t329644\n5000W\t329645\nbbf\t329646\npka\t329647\n海纹石\t329648\n万\t329649\n缎子\t329650\n钢构件\t329651\n小华\t329652\n护发\t329653\naided\t329654\n健身操\t329655\n政治观\t329656\niPad中文网\t329657\n大容量\t329658\n恶德\t329659\n皇家国教骑士团\t329660\nnc57\t329661\n艾拉\t329662\nlear\t329663\n易捷航空\t329664\n红船\t329665\nfloodlight\t329666\n太原铁路局\t329667\n高菲酒业\t329668\n18V\t329669\nStandby\t329670\n曼巴特\t329671\n魏之智\t329672\n税表\t329673\n好不好\t329674\n欧睿\t329675\n灵石县\t329676\n八大美院\t329677\n滆湖\t329678\n合合\t329679\n如歌\t329680\n中旅\t329681\n520m\t329682\nb9s\t329683\nhanging\t329684\n90多\t329685\n群防\t329686\n十多个\t329687\nVisitor\t329688\n一年多后\t329689\ncolorado\t329690\n黄手帕\t329691\n就业失业证\t329692\n掘港\t329693\nblueman\t329694\n达达\t329695\n一榀\t329696\n新理念\t329697\nsukebei\t329698\npredict函数\t329699\n建筑设计公司\t329700\n网标\t329701\npuro\t329702\n腾冲县\t329703\n14厘米\t329704\nqq阅读\t329705\n天涯正义网\t329706\nXSplit\t329707\n168号\t329708\n无锡万达城\t329709\nidg\t329710\nConfiguration\t329711\n你画我猜\t329712\ncater\t329713\n三菱劲炫\t329714\n井上织姬\t329715\n空間\t329716\n蜂蜡\t329717\n
贵州省公安厅交通管理局\t329718\n信阳市浉河区人民政府\t329719\n战兽\t329720\n词汇学\t329721\nprop\t329722\n3千\t329723\n证处\t329724\n比尔假如爱有天意\t329725\niqair\t329726\n1944年\t329727\n锦零\t329728\n魔武\t329729\n婚妻\t329730\n千拳\t329731\nnpdp\t329732\n无条件\t329733\n无刷直流电动机\t329734\n孙春兰\t329735\n火影\t329736\nrace\t329737\nMH370\t329738\n绍兴客运中心\t329739\n陕国投\t329740\n亿个\t329741\nEECS版\t329742\nNFL\t329743\nzbbz\t329744\n阻抗\t329745\n寮\t329746\n万平方米\t329747\njQuery/HTML5\t329748\nnwz\t329749\n新青年\t329750\n化学专业\t329751\n胜率\t329752\n难上加难\t329753\n重围\t329754\n思无\t329755\n第47集\t329756\n预储\t329757\nMARMOT\t329758\n凯立德\t329759\n刘氏\t329760\n人秀\t329761\n张锋\t329762\n银河路\t329763\n热图\t329764\nf18\t329765\n猎头网\t329766\n2590\t329767\nruanj\t329768\n岭\t329769\nNodeMcu\t329770\n恭城县\t329771\n1.3T\t329772\n鹅肝\t329773\nE05\t329774\n鼓楼医院\t329775\n新加坡大学\t329776\n懒人\t329777\n头板\t329778\n去皮机\t329779\n茅盾\t329780\n口述性\t329781\n忍冬藤\t329782\n东北王\t329783\n朱维铮\t329784\n摔跤狂热大赛\t329785\n三足鼎立\t329786\n何秀\t329787\n行侠\t329788\nte\t329789\n549\t329790\n广联达软件股份有限公司\t329791\n箱式变电站\t329792\n联苯苄唑乳膏\t329793\n中国寿光_寿光市政府\t329794\n共轭亚油酸\t329795\nlip\t329796\nヌキ\t329797\n益华\t329798\n环境工程专业\t329799\nhelpdesk\t329800\ntypo\t329801\n小米5plus\t329802\n美淘\t329803\n卡巴斯基反病毒\t329804\n把酒倒\t329805\nalberta\t329806\n赛尔号谱尼\t329807\nnone-linux\t329808\n滑竿\t329809\n蛮牛\t329810\n势能\t329811\n主音\t329812\n金城银行\t329813\nwindows10下\t329814\n一星期前\t329815\n羲和清零\t329816\n57亿\t329817\nenglish\t329818\n吧威基基\t329819\n脂肪醇\t329820\n交易\t329821\n嘉兴学院南湖学院\t329822\n莫雨\t329823\n日资\t329824\n走散\t329825\n热歌榜\t329826\n曾柔\t329827\n今日泉州网\t329828\ncherish568\t329829\n10万亿\t329830\nformation\t329831\n750_\t329832\nKong\t329833\nanalyzed\t329834\n更浓\t329835\n引流\t329836\n帅府\t329837\n卡易贷\t329838\n10個\t329839\n发动机冷却系统\t329840\nbigemap\t329841\n小米电视4s\t329842\n海底两万里\t329843\n安康信\t329844\n直断\t329845\n两万元\t329846\n发展汉语\t329847\n齐鲁普\t329848\n波多野\t329849\n登账\t329850\n餐包\t329851\n喆啡酒店\t329852\n宣传词\t329853\n李蓓\t329854\ncosa\t329855\n吴汉东\t329856\n取源\t329857\narena\t329858\n常熟国际学校\t329859\n函头\t329860\n张庄镇\t
329861\ncos买卖\t329862\nHerrera\t329863\n看不清\t329864\n1583\t329865\nresurrection\t329866\n许平\t329867\n心乱如麻\t329868\n患病\t329869\n性量\t329870\n萨其马\t329871\n摇曳\t329872\n动静脉瘘\t329873\n主攻文\t329874\n饱和区\t329875\nMars\t329876\n紧跟\t329877\nRefining\t329878\nMPEG-4\t329879\n夺神\t329880\n生态\t329881\niir\t329882\n5888元\t329883\n暻秀\t329884\nAHD\t329885\n陆家嘴集团\t329886\n0x0000000a\t329887\n秀景\t329888\n校门\t329889\n软件自学网\t329890\n明武宗\t329891\n徐明浩\t329892\nhss\t329893\n奈欧斯奥特曼\t329894\nsierra\t329895\nqc080000\t329896\n新思界\t329897\n李信\t329898\n毫欧\t329899\n冯志强\t329900\n赏析\t329901\n管壁\t329902\n曹毅\t329903\nVoid\t329904\n名院\t329905\ni57400\t329906\n浮钓\t329907\n鸭绿江断桥\t329908\n江闪闪\t329909\n两不误\t329910\n集成电路布\t329911\n一开始\t329912\n鑫盛\t329913\n安科瑞电气股份有限公司\t329914\nJTextArea\t329915\n这个人\t329916\n豆沙绿\t329917\n扫罗\t329918\n问渠哪得清如许\t329919\njackpot\t329920\n当归片\t329921\n巴斯\t329922\n零和游戏\t329923\ndmidecode\t329924\n布伦希尔德\t329925\n法源寺\t329926\n无不\t329927\n为了\t329928\n皖鱼\t329929\n尊严权\t329930\nMDB\t329931\novo\t329932\n官塘村\t329933\n许凯\t329934\n洲际导弹\t329935\n双向可控硅\t329936\n日照火车站\t329937\n噎死\t329938\n浇筑\t329939\n宋歌\t329940\nfansi\t329941\n新桥机场\t329942\n我的世界盘灵\t329943\nM1005\t329944\n莆田市行政服务中心\t329945\nZenfone\t329946\n共居\t329947\n村口\t329948\n麻辣牛肉干\t329949\n临淄区人民政府\t329950\n平举\t329951\n德语专四\t329952\n中华活页文选\t329953\n深圳大运会\t329954\nkpmg\t329955\n直溪镇\t329956\nsecrets\t329957\n限位\t329958\n外长\t329959\n1.5伏\t329960\nOdds\t329961\n大铲湾\t329962\n白加\t329963\nFoot\t329964\n双盘\t329965\n中诚\t329966\n宁丹琳\t329967\n泰康健康\t329968\n20161123\t329969\n上汽大众4s店\t329970\nHyatt\t329971\n不变\t329972\n站牌\t329973\n一六\t329974\n石更\t329975\n春夜宴\t329976\n我忍不了\t329977\n28名\t329978\n分割机\t329979\n塔罗占卜\t329980\n桔梗花\t329981\n炸天\t329982\n乔丹·卡佛\t329983\n西红门\t329984\n体技\t329985\n北京市顺义区人力资源和社会保障局\t329986\n16g101\t329987\n徐旭\t329988\n名砚\t329989\n深圳工业园\t329990\n第七十六条\t329991\n英字\t329992\nV7.0\t329993\n碑帖\t329994\nMVC\t329995\n静压机\t329996\nscorpion\t329997\n得意之作\t329998\n新城广场\t329999\nlair\t330000\n康美直销|康美药业|康美奖金制度|康美药业\t330001\n驱蛇\t330002\n喜多\t33000
3\n提及\t330004\n氧气罩\t330005\n无果\t330006\n光谷\t330007\n陕西理工学院\t330008\n拱桥\t330009\nnts\t330010\n破苞\t330011\n奈克洛兹玛\t330012\nnat\t330013\n扎兰屯\t330014\n原油投资\t330015\nDELTA\t330016\n往前方\t330017\n狗腿子\t330018\n刘超\t330019\n桂林路\t330020\n3999起\t330021\n绵绵冰\t330022\nfort\t330023\nlayuiAdmin\t330024\n零序\t330025\n上周一\t330026\n河南省妇幼保健院\t330027\n浦沿\t330028\njbpm\t330029\n脑浆\t330030\ngbr\t330031\n南越国\t330032\n双滦区\t330033\nksb\t330034\nzhiy\t330035\nloreal\t330036\n担保业\t330037\n金穗卡\t330038\nMMJ\t330039\nA1\t330040\naroma\t330041\n棕榈大道\t330042\n核移植\t330043\n流转换\t330044\n旌阳区人民政府\t330045\n岫玉\t330046\npseudo\t330047\nculling\t330048\n乐光\t330049\n0几年\t330050\n工作号\t330051\n孟良崮战役\t330052\n灵摆\t330053\n翠柏\t330054\n斩草除根\t330055\nLOL2017\t330056\n上海视觉艺术学院\t330057\n2家\t330058\n电负性\t330059\ncaj\t330060\n淄博市政府\t330061\n思想意识\t330062\n合肥火车站\t330063\n小向美奈子\t330064\n杨氏之子\t330065\n卍\t330066\n小粒\t330067\n驱逐舰\t330068\n西安邮电学院\t330069\n增长型\t330070\n女导演\t330071\n和平奖\t330072\n嫡女心计\t330073\n石牛寨\t330074\n浅表\t330075\nLiz\t330076\n42.5\t330077\n道富\t330078\n萝卜寨\t330079\nquartile\t330080\n和平镇\t330081\n肃立\t330082\nshoulder\t330083\n我乐网\t330084\n同程式\t330085\n52种\t330086\n中国社科院大学\t330087\n黄梅\t330088\n九黎战鼓\t330089\n放荡的女人2\t330090\n人教版八年级数学\t330091\n厨房间\t330092\n萨拉戈萨\t330093\n回收率\t330094\n北京大观园\t330095\n五十次\t330096\n唯爱美牙仪\t330097\n金葵花\t330098\n请客\t330099\nMMU\t330100\nusher\t330101\nadministration\t330102\nmra\t330103\n任华\t330104\n狗器\t330105\ngdi+\t330106\n11型\t330107\n崔斌\t330108\n李玉民\t330109\nanyview\t330110\n贵者\t330111\n长虹佳华\t330112\n今明\t330113\n抗爆\t330114\n霸道女\t330115\nNM\t330116\nCaught\t330117\n骚妹\t330118\n王牌\t330119\n2016年9月\t330120\ngarcia\t330121\n杨子荣\t330122\n对食\t330123\n张武\t330124\n三国志英杰传\t330125\n葡萄糖粉\t330126\nkon\t330127\n弭\t330128\n被解雇\t330129\n65条\t330130\n林李\t330131\n合肥公司\t330132\nwww.chem17.com\t330133\n致命复活\t330134\n井贤栋\t330135\n炉气\t330136\n乐风RV\t330137\n怪圈\t330138\n9亿多\t330139\n旌阳\t330140\nbookpro\t330141\n2018-04-14\t330142\n一宿\t330143\n擦窗机器人\t330144\n第二十五条\t330145\n百事通\t330146\n岱崮\t330147\n甲秀\t330148
\n起亚kx\t330149\nPnP\t330150\n移动神州行\t330151\n宽以待人\t330152\ngocode\t330153\n空气循环扇\t330154\n肛肠\t330155\n吴敬梓\t330156\nsantiago\t330157\n夕阳\t330158\n疑义\t330159\nMRF\t330160\ncbk\t330161\nteacher\t330162\n王惠\t330163\n勒索\t330164\n囚宠\t330165\n河南信阳\t330166\n诡术\t330167\nSentinel\t330168\n樂天\t330169\nMyeclipse\t330170\n风筝会\t330171\n股票分析师\t330172\n一命\t330173\n临沧市\t330174\n光美\t330175\n拦截\t330176\n璇\t330177\n左旋多巴\t330178\n益气丸\t330179\n故宫\t330180\n李健吾\t330181\nChinaGirl\t330182\n尼麦角林片\t330183\n蝼蛄\t330184\n角儿\t330185\n首战\t330186\n智勇\t330187\nGiga\t330188\nJAROD\t330189\n毒菌\t330190\n欢乐的跳吧\t330191\n眼内\t330192\n穹之扉\t330193\n错落\t330194\nmarkPoint\t330195\neluosi\t330196\n禅杖\t330197\n八一镇\t330198\n白毒\t330199\n乐府诗\t330200\n第4年\t330201\n95元\t330202\n封口机\t330203\n四管\t330204\n新百伦\t330205\n追缉\t330206\n智脑\t330207\n江平\t330208\n美报\t330209\n第四十集\t330210\n风景线\t330211\nHiphop\t330212\n滋润型\t330213\n针头式\t330214\n厦门公安局\t330215\n消费商\t330216\n泉州第一医院\t330217\n乔梦婷\t330218\n双人舞\t330219\nApocrypha\t330220\n真品\t330221\nTMT圈\t330222\n10.0.4\t330223\n14.0.3\t330224\n绝世\t330225\n吸脂减肥\t330226\n冰寒\t330227\n大不\t330228\n七件事\t330229\n呆子\t330230\n一人生\t330231\n西原\t330232\n反射镜\t330233\nhustle\t330234\nbia\t330235\n弯刀杀戮\t330236\n张丹丹\t330237\n新加坡公司\t330238\n流行病学\t330239\nDEAR\t330240\n长寿面\t330241\n0429\t330242\nspontaneous\t330243\n德赛电池\t330244\n精练\t330245\n六集\t330246\n硬骨鱼\t330247\n开头语\t330248\n功器\t330249\n喷布\t330250\n西部资源\t330251\nMariah\t330252\n二氧化硒\t330253\nDetermining\t330254\n嫣然一笑\t330255\n张曙\t330256\n加性\t330257\n证券公司\t330258\n咖啡勺\t330259\n权相宇\t330260\n散文卷\t330261\n安踏集团\t330262\n评论页\t330263\n第四十七\t330264\n2111\t330265\n两化融合贯标\t330266\n谢灿勇\t330267\n包套\t330268\n仁恒滨江园\t330269\nBIM培训网\t330270\n秸秆粉碎机\t330271\n黄立\t330272\n声名\t330273\nMutant\t330274\n系所\t330275\n龙溪\t330276\n成品\t330277\n8700\t330278\n赵粤\t330279\nBonker\t330280\n螺杆机\t330281\n是非非\t330282\n经传\t330283\nksd\t330284\n迈克尔杰克逊\t330285\n恶事\t330286\nС\t330287\n火龙\t330288\n以撒的结合\t330289\n关白\t330290\n新闻部\t330291\n鬼刀\t330292\n3.5\t330293\n我和僵尸有个约会2\t330294\nmcrypt\t3302
95\nFRP\t330296\n超级神猴\t330297\nShowdown\t330298\n十四万\t330299\n相对磁导率\t330300\n收益法\t330301\n无限次\t330302\nepub,mobi\t330303\nworx\t330304\nPrivacy\t330305\n迷宮\t330306\n开商\t330307\n周洪\t330308\n混凝土配合比\t330309\n热宝\t330310\n求是理论网\t330311\n构效\t330312\nemplace\t330313\n黄琼\t330314\n20位\t330315\n海边\t330316\n逢考\t330317\n铁锤\t330318\n偷税\t330319\nllm\t330320\n88888888\t330321\n56114\t330322\n拖盘\t330323\nFOC\t330324\nHDMI线\t330325\njoss\t330326\nsnatch\t330327\n宫脇咲良\t330328\ncat5\t330329\n檀健次\t330330\n小米圈铁pro\t330331\n1300种\t330332\nGSS\t330333\nmysl\t330334\n第十级\t330335\n南宁日报\t330336\n差乘\t330337\n人职\t330338\n右手\t330339\n淡红\t330340\n防护服\t330341\n气水分离器\t330342\n莱蒙城\t330343\n振源胶囊\t330344\n小进\t330345\n望闻\t330346\n阎村\t330347\n徐桂英\t330348\n伊文思\t330349\nvivaldi\t330350\n1.0.8\t330351\nWin10系统Edge浏览器\t330352\n红蛋\t330353\n遵循\t330354\n嫩江\t330355\n麻批\t330356\n368元\t330357\n浙沪\t330358\n测绘学院\t330359\n20170209\t330360\n脑波\t330361\nJs文件函数\t330362\n烟台开发区\t330363\ncameralink\t330364\n林芊妤\t330365\n新浪江西旅游_新浪\t330366\n诺克斯\t330367\n小记者\t330368\n美术家\t330369\n全露\t330370\n无性繁殖\t330371\n并联电阻\t330372\n2018年04月19日\t330373\nマニア\t330374\n从不\t330375\n【魅\t330376\n梓涵\t330377\n雨山区\t330378\n冷凝液\t330379\n田雨\t330380\n苏澜桥\t330381\nApocalypse\t330382\n艾坦\t330383\n荷韵\t330384\nterminal\t330385\n上证180\t330386\n草写\t330387\n过敏性咳嗽\t330388\nParaView\t330389\nQMessageBox\t330390\n沙集镇\t330391\n安邸\t330392\nGinger\t330393\n发语\t330394\n波士顿凯尔特人\t330395\n智取威\t330396\n和氏献璧\t330397\n三水万达广场\t330398\n万能打印机\t330399\n废柴兄弟\t330400\n嘉善县\t330401\n老党员\t330402\nesnai\t330403\n元谷\t330404\n365经典网\t330405\n张又侠\t330406\n逆熵3rd\t330407\n420万\t330408\n无双剑姬\t330409\n刘晓东\t330410\n远东地区\t330411\n数控技术\t330412\n三国无双6\t330413\n埃涅阿斯\t330414\nlac\t330415\n爱谱网\t330416\nzimu\t330417\n成招\t330418\n格林尼治\t330419\n伊人网\t330420\n变\t330421\nRGBD\t330422\n吉里吉里\t330423\n湖南大众传媒学院\t330424\n高智商\t330425\n门磁\t330426\n相像\t330427\n30例\t330428\n履行\t330429\nEveryone\t330430\n回向文\t330431\n长青树\t330432\n无人不晓\t330433\n比萨\t330434\n木林\t330435\n泯恩仇\t330436\n云手\t330437\n军博\t330438\n小少\t
330439\nmerlot\t330440\n舅子\t330441\n昼极夜\t330442\n29块\t330443\nv1.3.0\t330444\nsecureCRT\t330445\nReporting\t330446\n姓\t330447\n3dnew-sketchup\t330448\n船费\t330449\n牙盘\t330450\n北京现代ix25\t330451\n4辆\t330452\n方糖\t330453\n越野e族\t330454\nx1s\t330455\n3标\t330456\n呋喃\t330457\n试桩\t330458\n好多少\t330459\n多几天\t330460\n4220\t330461\n601607\t330462\n王定国\t330463\n奔驰c级\t330464\nimput\t330465\n26话\t330466\nCrack\t330467\nziparchive\t330468\n数年\t330469\n诸葛果\t330470\n相容\t330471\nCheetah\t330472\n生议论文\t330473\n舅母\t330474\nMaiden\t330475\n次女\t330476\n四王\t330477\n一毫\t330478\n缆\t330479\n41部\t330480\n罢官\t330481\n干价\t330482\n关节镜手术\t330483\nChinaRen\t330484\n北京开放大学\t330485\nZhuhai\t330486\n看花开\t330487\n二十句\t330488\n10pt\t330489\n不愧\t330490\n荣耀畅玩4\t330491\nSIR\t330492\n滕州东\t330493\nInfrast\t330494\n馨缘\t330495\n网游版\t330496\n生物制剂\t330497\n湖南省中医附一医院\t330498\n懒人版\t330499\n会效\t330500\n36册\t330501\nEm\t330502\n4尺\t330503\n礼德\t330504\n模场\t330505\n跨板\t330506\n卢清秀\t330507\n北大法宝V6\t330508\na1524\t330509\n1124\t330510\n正弦函数\t330511\nISO20000\t330512\n平方&#160\t330513\n成都市公安局\t330514\n招投标\t330515\n金笺\t330516\n音形\t330517\n常斯特\t330518\nIowa\t330519\n日本人\t330520\n陶凉烟\t330521\n见顶\t330522\nunob\t330523\n4399j小游戏\t330524\nclima\t330525\n苄星青霉素\t330526\n2018.04.04\t330527\n48套\t330528\n9头\t330529\n安全风险评估\t330530\n重手\t330531\n青羽\t330532\n涨跌\t330533\n边旁\t330534\n正比例和反比例\t330535\n学位证书\t330536\n12间\t330537\n西瓜影音播放器\t330538\n同队\t330539\n莫道桑榆晚\t330540\n凯驰\t330541\nwww.v119.com\t330542\n土法\t330543\n光峰\t330544\n波法\t330545\ntslgame\t330546\n网络电视\t330547\n自恋型\t330548\n藩镇\t330549\n电脑百事网\t330550\n勘定\t330551\n骄婿\t330552\n上海建工集团股份有限公司\t330553\n博途软件\t330554\n第42届\t330555\n鼹鼠\t330556\nqq名_祥安阁风水网\t330557\nLocker\t330558\nEloquent\t330559\n咪咕影院\t330560\nccom\t330561\n总字\t330562\n羯族\t330563\nNetsh\t330564\n中国科学院\t330565\n金枪\t330566\n白石洲\t330567\nCAX\t330568\nanalyse\t330569\n离子型\t330570\n重庆人文科技学院\t330571\n吸附式\t330572\n黄平县人民政府\t330573\nplacenta\t330574\n89岁\t330575\n空空\t330576\n荷兰\t330577\n文华路\t330578\n美司法部\t330579\n6mm2\t330580\nPope
\t330581\n说完\t330582\n丰田霸道\t330583\ncompromise\t330584\n纵缝\t330585\n长秀霖\t330586\n常州大学\t330587\n共赢发展\t330588\n壳牌喜力\t330589\nUpMP3\t330590\n21cn\t330591\n江南镇\t330592\n工藤\t330593\n浮空\t330594\n题号\t330595\nECCO\t330596\n倭\t330597\n复合维生素片\t330598\n郑秉文\t330599\nLOTUS\t330600\n31.6\t330601\n王晓玲\t330602\n卫华\t330603\n肉番漫画网\t330604\n乱淫\t330605\n歪理\t330606\n隔断阀\t330607\nSU2017\t330608\nxiaoshuo\t330609\n十双\t330610\n傅先生\t330611\n织部\t330612\nv3.6.1\t330613\n林教头风雪山神庙\t330614\n光合\t330615\n腾讯微视\t330616\n6X4\t330617\n89级\t330618\n太平公主秘史\t330619\n大会员\t330620\n爱的呼唤\t330621\nlave\t330622\n赵云传\t330623\n50米\t330624\n厚德载物\t330625\n致远\t330626\nthriller\t330627\n人人车\t330628\n澳门特别行政区\t330629\n忠孝\t330630\n第82\t330631\n站洋币\t330632\n荆州青年人才网\t330633\n4.9.0\t330634\n美蛙鱼头火锅\t330635\n胆固醇\t330636\nThunderbird\t330637\n尾注\t330638\n深圳清华大学研究院\t330639\n正面\t330640\nAlibaba\t330641\n2017年3月\t330642\n公文\t330643\nVAN\t330644\n乙烯基\t330645\n主任\t330646\n证券时报电子报\t330647\n临刑\t330648\n欧若拉\t330649\n张文静\t330650\n口传\t330651\n龙湖紫云台\t330652\n睡觉\t330653\n冰岛\t330654\n宝骏迈瑞宝\t330655\n云锡集团\t330656\n七万多\t330657\n战雷\t330658\nUG网\t330659\n便知\t330660\n低顶\t330661\n刘旦\t330662\n紫荆城\t330663\nfategra\t330664\n万国企业网\t330665\n撒尿牛丸\t330666\n香格里\t330667\n孙家沟\t330668\nDEVC++\t330669\n甲油胶\t330670\n达科塔·范宁\t330671\n仔裤\t330672\n那个女生\t330673\n2017年01月\t330674\nsquirrel\t330675\n福克玛莎拉蒂\t330676\ndcl\t330677\n五万\t330678\nYOLOv2\t330679\n牛肠\t330680\nRobin\t330681\n暗墨\t330682\n花蕾\t330683\n加工场\t330684\n絮凝池\t330685\n预埋槽道\t330686\n李陵\t330687\n热敏\t330688\n拉长石\t330689\n南京幼升小\t330690\n第几代\t330691\nZMI\t330692\n亿博\t330693\n金蝶k3\t330694\nwindows\t330695\n教子砖\t330696\n肌肉感\t330697\n西江苗寨\t330698\n0505\t330699\npimp\t330700\n黄金市场\t330701\n低飞\t330702\n12载\t330703\nMCM\t330704\nmiui6\t330705\nattribution\t330706\n陈武\t330707\n戒断反应\t330708\nAT指令\t330709\n张松涛\t330710\n人教2011课标版\t330711\n#region\t330712\n小橘子网\t330713\n斯塔德迈尔\t330714\ngenie\t330715\n胶黏剂\t330716\nfis\t330717\n菲彩\t330718\n名门\t330719\n﹡\t330720\npigz\t330721\n命师\t330722\n东镇\t330723\n无车\t330724\n云溪区\t330
725\n数字签名\t330726\n梅麻吕3d动画全集\t330727\n秩序井然\t330728\n绊\t330729\n三点水\t330730\n耐火材料\t330731\n7.5亿元\t330732\n浙江大学建筑设计研究院有限公司\t330733\n成都西部\t330734\n中共深圳市委\t330735\ndnf安全模式\t330736\n埋墙\t330737\n物联云仓\t330738\n第八十七条\t330739\n灾变\t330740\n中方\t330741\nck2\t330742\n贝利尔\t330743\n铠甲勇士\t330744\nbulid\t330745\n共有权\t330746\n焼\t330747\n插上\t330748\n762\t330749\nchaoshi\t330750\n反背\t330751\n独山县\t330752\n蓝蕾丝\t330753\n寇决\t330754\n全国人大代表大会\t330755\n绿方\t330756\n侯玉洁\t330757\n被劫\t330758\nXX学院\t330759\n余江县政府\t330760\nD5\t330761\nMLF\t330762\n介绍所\t330763\n海贼王强者之路\t330764\nmstar\t330765\n深圳工商局\t330766\n2018年4月24日\t330767\ncsgirl\t330768\n惠爱\t330769\n塘朗\t330770\nswpu\t330771\n写真集\t330772\n鸿兔\t330773\n拆伙\t330774\n暖通空调\t330775\n使命召唤6:现代战争2\t330776\n2095\t330777\n木苏里\t330778\n1070TI\t330779\n武汉大学图书馆\t330780\n江氏\t330781\n扶志\t330782\n天天向商\t330783\nphysical\t330784\n20171\t330785\n畅捷通\t330786\n无人机飞控\t330787\n第一时\t330788\n如图ab\t330789\n下腹肌\t330790\n路怒\t330791\n测试片\t330792\nSQL编程\t330793\n山东师范大学历山学院\t330794\n走边看\t330795\n雅芳\t330796\n白家大院\t330797\n陕西国防工业职业技术学院\t330798\n7组\t330799\n05集\t330800\n土布\t330801\n重版\t330802\n宠物鱼\t330803\n料事如神\t330804\n舞吧\t330805\n孵\t330806\nCDR2017\t330807\n反压\t330808\n衣香鬓影\t330809\n伟星管\t330810\n草签\t330811\n高某\t330812\n郑州公交查询网\t330813\n镜面板\t330814\n印鉴章\t330815\nsquash\t330816\n吉利新帝豪\t330817\n┌\t330818\n傻孩子\t330819\n狄德罗\t330820\n常熟古里\t330821\n乌溪江\t330822\n慢慢走\t330823\n泸州老窖集团有限责任公司\t330824\n大渝社区\t330825\nlaypage\t330826\n重水\t330827\n么么\t330828\n加多宝集团\t330829\nE460\t330830\nfirewalking\t330831\n20160704\t330832\nELLE\t330833\nft232r\t330834\n寒冰弹\t330835\n狂风\t330836\n知人\t330837\nRPI\t330838\nv4.2.1\t330839\nspm\t330840\n3DM\t330841\n小米mix2\t330842\n演化\t330843\n六线吉他谱-虫虫吉他谱\t330844\ncleavage\t330845\n彭云\t330846\n全险\t330847\n恶魔胆汁\t330848\n七八月\t330849\novs\t330850\n长期股权投资\t330851\njiqiren\t330852\n注记\t330853\n东仪路\t330854\n21分\t330855\n恺\t330856\n颜值爆表\t330857\neri\t330858\nUndertow\t330859\n等解\t330860\nJAPANESE\t330861\n解放桥\t330862\n三里镇\t330863\ns5000\t330864\n富城\t330865\n9【\t330866\n2.
7GB\t330867\n鹰目户外广告网\t330868\n缺点\t330869\n职工代表大会\t330870\njogging\t330871\n润肺\t330872\n直流无刷电机\t330873\n维码\t330874\n糖果店\t330875\n14亿美元\t330876\n恩怨史\t330877\n液压破碎锤\t330878\n法盲\t330879\n神武龙宫\t330880\n营销型\t330881\n艾米纳姆\t330882\n通用申报表\t330883\n元典\t330884\n华女\t330885\n77_\t330886\n刘蕾\t330887\namounts\t330888\nHive表\t330889\npsn\t330890\n10年\t330891\nboox\t330892\n航次\t330893\n603876\t330894\n70度\t330895\n例句\t330896\nend\t330897\n怪物猎人P2G中文网\t330898\n百分之三十\t330899\n紫金花园\t330900\n首营\t330901\n戎狄\t330902\n2017年12月3日\t330903\n色系\t330904\n金砖国家新开发银行\t330905\npec\t330906\n南湖广场\t330907\n流鼻血\t330908\ndimer\t330909\n土\t330910\n使命召唤:二战\t330911\n紫光\t330912\n非银行\t330913\n梧田\t330914\n全国人大财经委\t330915\n深圳发展银行\t330916\nTeachCourse\t330917\n开单\t330918\n子窗\t330919\n值周\t330920\n咽气\t330921\n金型\t330922\n中国绿色数据中心\t330923\n三冬\t330924\n夫星\t330925\n游春\t330926\n版版\t330927\n【罗\t330928\n石碁镇\t330929\n卧薪尝\t330930\n木碗\t330931\n闪电十一人GO\t330932\n钻芯\t330933\n松岗\t330934\n鲁班尺\t330935\n小青菜\t330936\nbeb\t330937\n罗盘\t330938\n寡义\t330939\n粉皮\t330940\n35首\t330941\n反不正当\t330942\n示范性\t330943\n犭\t330944\nHLSL\t330945\n刮蹭\t330946\n以身\t330947\nlap\t330948\n幕柜\t330949\n北京电子科技学院\t330950\n军界\t330951\n中山学院\t330952\n方缸\t330953\n牢不可破\t330954\nSkip\t330955\n周仓\t330956\n蓝溪村\t330957\n禄口镇\t330958\n重本率\t330959\n磐安县政府\t330960\n李立\t330961\n1060pro\t330962\n超支\t330963\n伯利恒\t330964\n神迷\t330965\n旋转矩阵\t330966\n废电池\t330967\n色友\t330968\n香情\t330969\ncuda9\t330970\nbox-sizing\t330971\n1500亩\t330972\nfocusing\t330973\nacr122u\t330974\n龙光地产\t330975\n球体\t330976\ntbox\t330977\n关系密切\t330978\nhql\t330979\nskyl\t330980\n汉芯一号\t330981\n可燃气体报警器\t330982\npeida\t330983\nSplitting\t330984\n宝骏630\t330985\n玲珑剔透\t330986\n留明锐\t330987\n整除\t330988\n修辞格\t330989\n串联电路\t330990\n成勋\t330991\n上海经信委\t330992\n长台\t330993\n宝马林肯mkx\t330994\n吉林省公安厅\t330995\n泉山区政府\t330996\n海贼王漫画\t330997\n白刺\t330998\n血液肿瘤科\t330999\n烟台市开发区\t331000\nKIM\t331001\nshenz\t331002\nEnabler\t331003\nAssault\t331004\n速龙\t331005\na-2\t331006\nYaHei\t331007\ndamned\t331008\n米奇鱼\t331009\nN软网\t331010\n气门
室\t331011\n帕里斯\t331012\n8X4\t331013\n东莞市人民政府\t331014\n爱人\t331015\n义乌商报\t331016\n保税物流\t331017\nbvr\t331018\n上湾\t331019\n情语\t331020\n税优\t331021\n莺莺祥月\t331022\n佩刀\t331023\n浦江郊野公园\t331024\nIMA\t331025\nrex\t331026\nCGUFO\t331027\n新余新闻网\t331028\nB737\t331029\n精干\t331030\n鲁南商报\t331031\n韩影\t331032\n日坐\t331033\n令人神往\t331034\n新加坡植物园\t331035\n巨幕\t331036\n坂井泉水\t331037\n8十个\t331038\n综合基础知识\t331039\n声控灯\t331040\naya\t331041\ndrifting\t331042\n金蝶帐套\t331043\n镇远网\t331044\ngeox\t331045\nHebe\t331046\n整流桥\t331047\n衷心\t331048\n黛青塔娜\t331049\n第十一届\t331050\nword2\t331051\n苗栗县\t331052\n黄明\t331053\n川发\t331054\n应试\t331055\n最前面\t331056\n杨焕宁\t331057\n边柱\t331058\n高分屏\t331059\n塞孔\t331060\nHMV\t331061\n檀香扇\t331062\n中邮\t331063\n石马镇\t331064\n白夜追凶2\t331065\n侠情\t331066\n冒险岛2吧\t331067\n衍生金融工具\t331068\n汊河\t331069\n香港特别行政区\t331070\n被执行人\t331071\n拼盘\t331072\n科幻小说\t331073\n面框\t331074\n昆阳镇\t331075\n划扣\t331076\n月冥\t331077\n千数堂\t331078\n电子显示屏\t331079\n铁机路\t331080\n10群\t331081\n骨骨\t331082\n天夏智慧\t331083\n莫提\t331084\nTek_Eternal\t331085\nGTDi\t331086\n徐州\t331087\nStarCraft\t331088\n福莱特\t331089\n工伤\t331090\n柳实\t331091\n法证先锋1\t331092\nbent\t331093\n金施尔康\t331094\n岐黄\t331095\n冷石\t331096\n脚料\t331097\n消落\t331098\n旗舰级\t331099\n韩佳\t331100\n溶媒\t331101\n空白版\t331102\n流放之路塑界\t331103\n德法\t331104\n乳胶管\t331105\n拉米\t331106\n冷箱\t331107\njweixin\t331108\n王义桅\t331109\n易吧\t331110\n连晨翔\t331111\n顾字\t331112\n梅山保税港区\t331113\n740Li\t331114\nScreenshot\t331115\nvs2016\t331116\n南山站\t331117\n晓天\t331118\n第几句\t331119\n无损音乐网\t331120\nIncoming\t331121\n林长\t331122\n复式投注\t331123\n硅酸镁\t331124\n东岸\t331125\n10来岁\t331126\n辣手摧花\t331127\n效法\t331128\n阿普\t331129\n意外死亡\t331130\n明诚\t331131\nGD32\t331132\n通讯费\t331133\n蒸馏水\t331134\n屏蔽袋\t331135\n世越号\t331136\n百度词条\t331137\ndaan\t331138\ngtmm\t331139\n103名\t331140\n电蒸箱\t331141\n着装\t331142\nFLAT\t331143\n凑单品\t331144\n过保\t331145\n已亥杂诗\t331146\n第81期\t331147\n插秧机\t331148\n青葱\t331149\n平年\t331150\nJumbo\t331151\n上邦\t331152\n丁华\t331153\n育红小学\t331154\n撸毛\t331155\n如城镇\t331156\n李书老子\t331157\n陶大宇\t331158\n中国水工业网\t33115
9\n13整除\t331160\n东莞时报\t331161\n眼果\t331162\n世纪汇广场\t331163\n芥子气\t331164\n东方卫视\t331165\n筑底\t331166\n燕塘\t331167\n转把\t331168\n中国石油和化学工业联合会\t331169\nsonos\t331170\n吸附性\t331171\nCNMOOC\t331172\nBering\t331173\n588号\t331174\n223.104.45.76\t331175\n卡位面\t331176\n国电南瑞\t331177\n继续教育专业\t331178\nsimulation\t331179\n家教\t331180\n黑风寨\t331181\n五更琉璃\t331182\n15个小时\t331183\n父与子\t331184\n高尼茨\t331185\nException\t331186\n空燃比\t331187\n跨区\t331188\n中美大学\t331189\n章贡区_赣州章贡区政府\t331190\nwillis\t331191\n宽频\t331192\n第71\t331193\n封门\t331194\niS62017款\t331195\nк\t331196\n中国特色社会主义理论体系概论\t331197\n厦门海悦山庄酒店\t331198\nBrazzers\t331199\n极端民族主义\t331200\n派派\t331201\n真心诚意\t331202\n义包\t331203\n嫣儿\t331204\n三体神雕侠侣\t331205\n帕森斯\t331206\n百润\t331207\n领创\t331208\n醉影\t331209\n钟长鸣\t331210\n鹿迪\t331211\n撸撸鸟Av\t331212\n9.23\t331213\n首尾\t331214\nCAPE\t331215\n身无分文\t331216\n慢走\t331217\nRedefining\t331218\nASSOCIATION\t331219\n4710\t331220\n下吕\t331221\nYiruma\t331222\n晶晨\t331223\n咳嗦\t331224\n景琰\t331225\n五休二招聘\t331226\n二茂铁\t331227\n2013年2月\t331228\n刑警队\t331229\n网络化\t331230\n91job.com\t331231\n杀戮间2吧\t331232\nsdio\t331233\nteaching\t331234\n孤立\t331235\nwindows2012r2\t331236\nEye\t331237\nyeelight语音助手\t331238\n巴西柔术\t331239\n堀江\t331240\n轩子巨\t331241\n网页测试\t331242\n夸耀\t331243\n法人\t331244\n包柜\t331245\n酒店群\t331246\n回荡\t331247\n安全类\t331248\n空平\t331249\n晨昏\t331250\n网箱\t331251\n小纸\t331252\n688元\t331253\n山东省质监局\t331254\ngiaogiao\t331255\n杨文斌\t331256\n磷酸二氢钾\t331257\n1小时\t331258\n品牌设计公司\t331259\npoker2\t331260\nHIV\t331261\n罡\t331262\ncaption\t331263\n2周后\t331264\nAcquires\t331265\n大马哥\t331266\n结炉\t331267\n香林\t331268\n吸收器\t331269\n沙燕\t331270\nTeams\t331271\n穆斯林\t331272\nUNIXTIME\t331273\n激燃\t331274\n停不住\t331275\n按键精灵手机助手\t331276\n坡率\t331277\n题霸\t331278\n考勤管理制度\t331279\n10系\t331280\n北京奥数网\t331281\n三国志3\t331282\n汉堡包\t331283\n水刀\t331284\n救火\t331285\nCherry\t331286\n検査\t331287\n百世快运单号查询\t331288\n算法\t331289\n涩味\t331290\n街头霸王5\t331291\nqvod高清\t331292\n105二代\t331293\n波峰\t331294\n桑德国际\t331295\nPolitics\t331296\nmai\t331297\nmkfs.ext4\t331298\n⑴\t
331299\n宴客\t331300\n龙河镇\t331301\njiami\t331302\n小米m2\t331303\n双簧\t331304\n跑男2\t331305\n3135\t331306\n八叉树\t331307\n丁\t331308\n小榄镇\t331309\n弃妇\t331310\n鳌虾\t331311\n膀胱癌\t331312\n物质遗产\t331313\n第五十四章\t331314\n金铃\t331315\n100多种\t331316\n伟仕佳杰\t331317\n赵树海\t331318\n上海软件\t331319\n民约\t331320\n钟点房\t331321\n7.2.4\t331322\n贵阳地铁\t331323\n伊利安慕希\t331324\n劣等\t331325\n亚朵酒店\t331326\npip源\t331327\n七台河\t331328\nfino\t331329\n保卫黄河\t331330\n常设\t331331\n阿思翠\t331332\n铭锐\t331333\ntssd2017\t331334\n佳讯\t331335\n姨子\t331336\n早稻\t331337\n10.5寸\t331338\n10分\t331339\n敢作敢为\t331340\n道歌\t331341\njī\t331342\n终须\t331343\niG\t331344\n长江三鲜\t331345\n最有效\t331346\n江莱\t331347\n抗封\t331348\n双塔\t331349\n8欧\t331350\n黄克\t331351\nhost\t331352\n0x03\t331353\n鹤林\t331354\n论命\t331355\n附分解\t331356\n商贸宝\t331357\n南澳大利亚州\t331358\nNISSAN\t331359\n红缨\t331360\n浪尖\t331361\n名垂\t331362\n自贡政府网\t331363\n近现代史纲要\t331364\n择主\t331365\n马波斯\t331366\n合议庭\t331367\n指纹锁\t331368\nFelix\t331369\n20184\t331370\n联片\t331371\n纺丝\t331372\n异业联盟\t331373\ntl431\t331374\n蠕变\t331375\n大芜湖\t331376\n50M\t331377\n南京金陵饭店\t331378\n卢加诺\t331379\n雅兰\t331380\n中国航空运输协会\t331381\n威风锣鼓\t331382\nSeminar\t331383\n千足银\t331384\n王文杰\t331385\n转文\t331386\n别是\t331387\n璧花园\t331388\n杨虎\t331389\n88万\t331390\n变桨\t331391\n俯\t331392\n云数据\t331393\n新疆农业大学\t331394\n科大讯飞\t331395\ndtr\t331396\n去旅行\t331397\n苏州经贸职业技术学院\t331398\nCEA\t331399\n洛阳西\t331400\n打狗棒法\t331401\n习马会\t331402\n约约\t331403\n中国电子科技集团公司电子科学研究院\t331404\n臆造\t331405\n湛江市区\t331406\n绝境求生大逃杀\t331407\n中水\t331408\n临摹\t331409\nshane\t331410\n王月华\t331411\n张家港市第一人民医院\t331412\n湖北省中山医院\t331413\n林志美\t331414\n1918\t331415\n10年以后\t331416\n白沟新城\t331417\n讲真话\t331418\n超人归来\t331419\n268号\t331420\nsqlserver数据库\t331421\n甲肝疫苗\t331422\n马克思主义者\t331423\n铁线蕨\t331424\nward\t331425\nmyd\t331426\n0531-12340\t331427\n此生\t331428\n宝马X6\t331429\n72变\t331430\n波浪板\t331431\n游戏魅\t331432\n丧尸之战\t331433\n偷换\t331434\nQZZN论坛\t331435\nselectpicker\t331436\n飞纱\t331437\n琵琶行\t331438\n紫燕百味鸡\t331439\n催花\t331440\n南条\t331441\nElements\t331442\n一波三折\t331443\n爱否\t331444\n2
3公斤\t331445\n2亩\t331446\naha\t331447\nGROUND\t331448\n31层\t331449\n劲歌金曲\t331450\nexpreview\t331451\n真书\t331452\n2018年3月26日\t331453\n沿河路\t331454\n雪弗兰\t331455\napplicable\t331456\n网页游戏网\t331457\nclarins\t331458\n旅行照\t331459\n丙子日\t331460\n电吉他谱世界\t331461\nUE\t331462\n外教\t331463\n中国科学院武汉植物园\t331464\n刘祥\t331465\nDolores\t331466\n补身\t331467\n3月22\t331468\n14【\t331469\n坑王驾到\t331470\n爱氏晨曦\t331471\n三分之一次方\t331472\n金旸\t331473\n秘决\t331474\n洋姜\t331475\n中国医学科学院\t331476\n延锋\t331477\n凸显\t331478\nplugins\t331479\n掌故\t331480\n山东广电网络有限公司\t331481\n小组长\t331482\nKid\t331483\n领带\t331484\n摘果\t331485\nspeaks\t331486\n法理\t331487\nJTG\t331488\n4.5折\t331489\n新能源\t331490\n看工\t331491\n名包\t331492\n峰峰\t331493\n回不去\t331494\n北京南\t331495\narcher-wong\t331496\n4月17日\t331497\nIcon\t331498\ntoda\t331499\n南京大学新闻传播学院\t331500\nconvergence\t331501\n收货人\t331502\n王兴国\t331503\n铁氟龙\t331504\nShaun\t331505\n山西农大信息学院\t331506\n翡翠毛料\t331507\n医美市场\t331508\nOntrack\t331509\n云纱\t331510\n字\t331511\n钉邮\t331512\n德格县\t331513\n预产\t331514\n刘少\t331515\n太史公\t331516\n参苓\t331517\n泉城路\t331518\n计价\t331519\n广汉市\t331520\n早教网\t331521\n阿里文学网\t331522\nkiki\t331523\nEighty\t331524\n党员组织关系\t331525\n写字板\t331526\n半导体制\t331527\n竹风\t331528\nv11p\t331529\nqq厘米秀\t331530\nSDKs\t331531\n原木色\t331532\n蓝洁瑛\t331533\n乌尔禾区\t331534\n捷赛\t331535\n管理者\t331536\nnfu\t331537\n2018年4月1\t331538\n环球唱片\t331539\n莫斯特\t331540\n探\t331541\n累觉不爱\t331542\n冰咖啡\t331543\n必然性\t331544\n宦妻\t331545\nendothelial\t331546\n包出\t331547\n试验品\t331548\n乐高侏罗纪\t331549\n冥府之路\t331550\n通敌\t331551\n郑燕\t331552\nABCDE\t331553\n华青\t331554\n请原谅\t331555\nSket\t331556\nagete\t331557\n福特F-150\t331558\n收客\t331559\n米尺\t331560\n北京卫视\t331561\n快奸\t331562\n万科悦城\t331563\n赶时髦\t331564\n伦常\t331565\n倍耐力\t331566\n电子地磅\t331567\n拓朴\t331568\n董哲\t331569\n产业展\t331570\n燕子洞\t331571\nthinkcenter\t331572\n新建文件夹2\t331573\nラ\t331574\ntouhou\t331575\ntube8\t331576\n猫耳FM\t331577\n意库\t331578\nAngelina\t331579\n新华人寿\t331580\n微雨\t331581\n迪亚天天\t331582\n华中农业\t331583\n保护型\t331584\n5.2米\t331585\nUninstall\t331586\n左炔诺孕酮\t331587
\n刘庆\t331588\n本周一\t331589\nIDLE\t331590\n水蜜丸\t331591\n凤凰小区\t331592\nDeferred\t331593\ntshirt\t331594\n拉布拉\t331595\nPubwin\t331596\n80关\t331597\n海盗\t331598\nhdb\t331599\n花样直播\t331600\n5152\t331601\npbtxt\t331602\n东客站\t331603\n60多岁\t331604\n门客\t331605\n20161114\t331606\n吕氏\t331607\n隆昌市\t331608\n合众人寿保险股份有限公司\t331609\n从不曾\t331610\n官禄\t331611\n补脾益肠丸\t331612\n党代表\t331613\nhesitate\t331614\n亨利五世\t331615\n中国篮球协会\t331616\n高通骁龙\t331617\n填翻\t331618\nMeier\t331619\n海涛\t331620\n断针\t331621\n德田\t331622\n退距\t331623\n明开\t331624\nsuccessfully\t331625\n冲锋裤\t331626\nBollywood\t331627\n明日科技\t331628\n三峡学院\t331629\n环球钢琴网\t331630\n一二三季\t331631\n2.6.8\t331632\ncam\t331633\n阿沙\t331634\n不可取替\t331635\n周云\t331636\n三省\t331637\n活动力\t331638\n入栈\t331639\n5200L\t331640\n53号\t331641\nsoftether\t331642\n云南省政府\t331643\nFFmpeg\t331644\n四头\t331645\n卓然\t331646\n王荣华\t331647\n绝缘电阻表\t331648\n访问性\t331649\n4399神武2\t331650\n莫砺锋\t331651\n671\t331652\nMSOffice\t331653\n700\t331654\n乳腺癌\t331655\n北京1区\t331656\n查普曼\t331657\n穿点\t331658\nHIIT\t331659\n命令与征服\t331660\ndisaster\t331661\ne450\t331662\n师承\t331663\n人到\t331664\n压盖机\t331665\nzkclient\t331666\n安七炫\t331667\n首歌\t331668\n申請\t331669\n欧美地区\t331670\n李鹤\t331671\n10.68万元\t331672\n臀模\t331673\n洗发水瓶\t331674\n金文\t331675\n晋江国际机场\t331676\n白女\t331677\nAgar\t331678\n张鹤伦\t331679\n四川省人事厅\t331680\n1.5分米\t331681\n箱唛\t331682\nj3\t331683\nv2.0.5\t331684\n旋转拖把\t331685\n美铝\t331686\ndkcndk\t331687\n36.9\t331688\nArmin\t331689\n高喊\t331690\nAllwinner\t331691\n上海公安学院\t331692\n南白镇\t331693\n小顽童\t331694\n18m\t331695\nThinkSystem\t331696\n南票\t331697\n凉粉\t331698\ndrill\t331699\n臧否\t331700\n音子\t331701\n好人榜\t331702\n走火\t331703\n水电工\t331704\nqzone\t331705\n上海国际马拉松赛\t331706\n博达\t331707\n量瓶\t331708\n天水日报\t331709\n施展\t331710\nヾ\t331711\n陈建湘\t331712\n3DSmax\t331713\n昭告\t331714\n201X\t331715\n商飞\t331716\n天正节能\t331717\nbees\t331718\n第二十一回\t331719\nethereum\t331720\n云图TV\t331721\nrapidxml\t331722\n带队\t331723\nじ\t331724\n一加手机2\t331725\n皮耶罗\t331726\n特战\t331727\n一华\t331728\n孚能科技\t331729\n西宫\t331730\n彩球\t3
31731\nltspice\t331732\n黄陂区政府\t331733\n成一\t331734\nrum\t331735\n江川县\t331736\n胜哥\t331737\n采源宝\t331738\n干字\t331739\n锄宗\t331740\n三汇镇\t331741\n在途旅行网\t331742\n清湖村\t331743\n44章\t331744\n周晓琳\t331745\n12.18\t331746\n市国土局\t331747\n赤霞珠红葡萄酒\t331748\n绍兴房产超市网\t331749\n东兴口岸\t331750\n北京紫禁城\t331751\n荔园小学\t331752\n双湖网\t331753\n京信\t331754\n中华树\t331755\n16万亿\t331756\n高护\t331757\n2017年3月1日起\t331758\n大一只\t331759\n财务分析师\t331760\nTutorial\t331761\n女巫\t331762\n拔丝地瓜\t331763\ntacacs\t331764\n住院日\t331765\n茹\t331766\nMIKU\t331767\n四川银行\t331768\n第六\t331769\n锌块\t331770\n88键\t331771\n三十条\t331772\n6.20\t331773\n6W\t331774\nfined\t331775\nacceptor\t331776\n30代\t331777\n烧脑族\t331778\n苏北人\t331779\n北京市顺义区人民政府\t331780\n梁柱\t331781\n大洼县\t331782\n马帅\t331783\n腾远智拓\t331784\n雷诺\t331785\n24周年\t331786\n600271\t331787\n刷石机\t331788\n氢氯噻嗪片\t331789\n易耗\t331790\ngcc-c++\t331791\nIllustrations\t331792\n薄层色谱法\t331793\n手干\t331794\nrecession\t331795\n你好,李焕英\t331796\n潭州\t331797\n陇南日报\t331798\n结婚登记\t331799\n树机\t331800\n第92\t331801\n左庭\t331802\n3840X2160_\t331803\nb5\t331804\nFl\t331805\nfaka\t331806\n金枪鱼\t331807\nMAR\t331808\n脱氧剂\t331809\n穴口\t331810\n再谱\t331811\n智睿学习网\t331812\n乔治费歇尔\t331813\n香港法院\t331814\n徐国良\t331815\nAh\t331816\n速卖\t331817\nlipeil\t331818\n纽约摄影学院\t331819\nUNKNOWN\t331820\nGr\t331821\n云电脑\t331822\n丁小邦\t331823\n浙江大学建筑工程学院\t331824\n洛谷\t331825\n直肠炎\t331826\n过把\t331827\nzigzag\t331828\n魏强斌\t331829\n牛肉火锅\t331830\n王伦宝\t331831\nAegisub\t331832\noma\t331833\n鹏鹏\t331834\n替吉奥\t331835\n拔刀术\t331836\n成事\t331837\n解谜游戏\t331838\n电商化\t331839\n灵江源森林公园\t331840\nWebApproach\t331841\n众诚\t331842\n相位对焦\t331843\nyume\t331844\n白盾\t331845\n魔妃\t331846\nUrlEncode\t331847\n辣辣\t331848\n梭罗河\t331849\n老来\t331850\n\"\t331851\n漱液\t331852\n鞋跟\t331853\n1000份\t331854\n159个\t331855\n中鑫之宝\t331856\n呼吸衰竭\t331857\n张冬玲\t331858\n小短\t331859\nGPT格式\t331860\nZend\t331861\ndnf女圣骑士吧\t331862\n联合国气候变化框架公约\t331863\n跟屁虫\t331864\n确认单\t331865\n咬紧牙关\t331866\n和龙\t331867\n限制性股票激励计划\t331868\n回龙观社区网\t331869\nDispatch\t331870\ndefcon\t331871\n坚贞不屈\t331872\nVirtue\t3
31873\n许阳\t331874\n色字\t331875\nyester\t331876\nnaver\t331877\n好多人\t331878\n15063\t331879\n普洛药业\t331880\ntd\t331881\n濠\t331882\n转换率\t331883\n3桶\t331884\nDM67网\t331885\nysn\t331886\n阿奇霉素颗粒\t331887\n校服裤\t331888\n技能键\t331889\n霍尔元件\t331890\n产流\t331891\n3月1号\t331892\n实探\t331893\n期数\t331894\nMain\t331895\n九龙冰室\t331896\n黑八\t331897\n海辰药业\t331898\n49部\t331899\n纺锤体\t331900\n头枕\t331901\nanswered\t331902\n转居\t331903\n标识化\t331904\n胳膊\t331905\n酷酷吧\t331906\n钪\t331907\nSIZUO\t331908\n传承者\t331909\n黄河科技学院\t331910\nc级车\t331911\n负温度系数\t331912\nFamilyMart\t331913\n奥丹姆\t331914\n搭搭\t331915\n真人\t331916\nsonny\t331917\n青衫\t331918\n嘴上\t331919\n雪佛兰\t331920\n重庆工商大学派斯学院\t331921\nCTN\t331922\n南卫理公会大学\t331923\n张坚\t331924\n合作性\t331925\n雕书\t331926\n中国法学\t331927\n云南花木网\t331928\n同仁堂药店\t331929\n原生质\t331930\nostringstream\t331931\n分案\t331932\n汞柱\t331933\nwrapping\t331934\n插芯\t331935\n郭公庄\t331936\n秀恩爱\t331937\niwpriv\t331938\n格兰云天\t331939\n李韬\t331940\n6.5.4\t331941\n商车\t331942\n炬芯\t331943\n王高飞\t331944\n嵊州\t331945\n龙川路\t331946\n浙水\t331947\n12.5.7\t331948\n单目\t331949\nkloss\t331950\nmerck\t331951\n000905\t331952\nDir\t331953\n南辛庄\t331954\n格盘\t331955\nBoc\t331956\n科学卷\t331957\n锤纹\t331958\njackie\t331959\n链客\t331960\n何建\t331961\n依赖感\t331962\n拆机\t331963\nKPS\t331964\n死去活来\t331965\nKore\t331966\n扇灰\t331967\n富马酸比索洛尔片\t331968\n阿荣旗\t331969\n面部\t331970\n9.30\t331971\n香秀\t331972\n工农业\t331973\n乌江\t331974\n风选机\t331975\n申报稿\t331976\n钢琴手\t331977\n沈阳市社会医疗保险管理局\t331978\n鼓动\t331979\n7招\t331980\nrockstar\t331981\n新爱\t331982\n液晶电视屏\t331983\n永葆\t331984\n咸丰县\t331985\n熟成\t331986\n情戏\t331987\n统计直报网\t331988\n黄土高坡\t331989\n手冢治虫\t331990\n定慧\t331991\n北京化工研究院\t331992\nA7S\t331993\n嘶嘶\t331994\n发掘\t331995\n天极3\t331996\nQQ2018\t331997\n16.11\t331998\ncvi\t331999\nQueerClick\t332000\n仲宫\t332001\n绝弦\t332002\n诱使\t332003\n抹茶\t332004\n假水\t332005\n船工\t332006\nASCII在线转换工具\t332007\n20170727\t332008\n|迪粉汇\t332009\n付斌\t332010\n1200万美元\t332011\n拉下\t332012\n合式\t332013\n龙光玖钻\t332014\nShen\t332015\n舞力全开2017\t332016\n飘红\t332017\n苜蓿草\t332018\nDurid\t3
32019\n宝鸡市\t332020\n波吉亚家族\t332021\n5页\t332022\n莫知\t332023\n中国北方车辆研究所\t332024\n大同县\t332025\n同机\t332026\n天哥\t332027\n张寅\t332028\n粉嫩\t332029\nv6.0\t332030\n奇葩说吧_\t332031\n编创\t332032\n雇\t332033\n毛家\t332034\np+1\t332035\n白下区\t332036\n北京通\t332037\n职场\t332038\nWishes\t332039\n002468\t332040\npreferential\t332041\n这样做成\t332042\n30天后\t332043\ncerr\t332044\nDDM\t332045\n新增加\t332046\n铁罐\t332047\n肥牛\t332048\n李颂\t332049\n酱油炒饭\t332050\n2018级\t332051\n相认\t332052\n御花园\t332053\n廖慧敏\t332054\nPursuit\t332055\n停住\t332056\n网易邮箱\t332057\ngetbean\t332058\n7920\t332059\n捷豹路虎\t332060\n文来中学\t332061\nSTATION\t332062\n杭钢\t332063\n爱达荷州\t332064\n长绒\t332065\n$\t332066\n阴山\t332067\n榉树\t332068\n龟山汉墓\t332069\n长吉\t332070\n证券业\t332071\nhg8120c\t332072\n埃摩森\t332073\n材质单\t332074\n锡锭\t332075\ntoon\t332076\n)餐饮管理有限公司\t332077\n百余位\t332078\n缩水\t332079\n环球视野_四月网\t332080\n一杯茶\t332081\n口底\t332082\n王业\t332083\n电影展\t332084\n宿舍\t332085\n作古\t332086\n人文关怀\t332087\n乐视汽车\t332088\nwjlkoorey258\t332089\n北京水立方\t332090\n福特号\t332091\narmstrong\t332092\n8848钛金\t332093\n可心\t332094\n行星齿轮减速器\t332095\n荣耀西安网\t332096\nipc\t332097\n无人生还\t332098\n冥界\t332099\n北门\t332100\n济宁职业技术学院\t332101\n洋洋\t332102\n桂林旅游网\t332103\n五颗\t332104\n星荟\t332105\n成龙\t332106\n麦捷科技\t332107\n收料单\t332108\n湖北省民政厅\t332109\n单边主义\t332110\n莲花县人民政府\t332111\n甘肃省检察机关\t332112\n明日香\t332113\nDBGrid\t332114\n八分钟\t332115\n查重\t332116\nwww.ggjy.nbyz.gov.cn/TSPB/remot\t332117\n&恒\t332118\n轰炸\t332119\n新能源汽车购置税\t332120\n无可厚非\t332121\n小物\t332122\n广西科文招标有限公司\t332123\n妇男\t332124\n红楼\t332125\ncocos2djs\t332126\nzhide\t332127\niwork10\t332128\n缓建\t332129\nfarfetch\t332130\n忠义\t332131\n签价\t332132\n色度计\t332133\n荣耀V10_\t332134\nREACTION\t332135\noutlook通讯录\t332136\n黄痰\t332137\n压铆机\t332138\n经济学基础\t332139\n尾箱\t332140\n柔佛\t332141\n公猫\t332142\n太吵\t332143\n刘勇\t332144\n中英版\t332145\nmlp\t332146\n预缴税\t332147\n质粒\t332148\n肉价\t332149\n方向阀\t332150\n麦秆画\t332151\n东林镇\t332152\n纳米复合材料\t332153\n啤酒馆\t332154\n星船\t332155\nlianxi\t332156\n环卫处\t332157\n木桃式\t332158\n5613\t332159\nbackbone\t332160\n后晋\t33216
1\n瑞嘉\t332162\n梅赛德斯奔驰\t332163\ns30\t332164\n11111\t332165\n别无所求\t332166\n黄皮果\t332167\n50v\t332168\n佳成\t332169\n美棉\t332170\n眼线笔\t332171\n陈友\t332172\niOS\t332173\n省会\t332174\n交流发电机\t332175\n谩\t332176\n秦时明月君临天下\t332177\npod\t332178\n我终于\t332179\n日照开发区\t332180\n岭南站\t332181\n刚需族\t332182\n喜事\t332183\n家室\t332184\n南阳师院\t332185\n海语\t332186\n星际争霸2虫族\t332187\n不色\t332188\n大洋百货\t332189\n小康村\t332190\n齐齐乐倩女幽魂手游\t332191\n鹤山市\t332192\n2.8L\t332193\n光华管理学院\t332194\n不成文\t332195\n海沟\t332196\n虚情\t332197\nSlot\t332198\n凌云\t332199\n他达拉非片\t332200\n老人们\t332201\n绝压\t332202\n72度\t332203\nMercalli\t332204\n剪卡器\t332205\nclock\t332206\nezplot\t332207\n燕舞\t332208\n4月25号\t332209\n种土\t332210\n请坐\t332211\n周村\t332212\n路由算法\t332213\n信用中国\t332214\nESA\t332215\n舰徽\t332216\n荣威ei6\t332217\n240号\t332218\n组分\t332219\nArchWiki\t332220\nFEILIN\t332221\npgks\t332222\n又快\t332223\nEP10\t332224\n新面\t332225\nIOI\t332226\n过山车大亨2\t332227\nGriffin\t332228\n平庸\t332229\n五蠹\t332230\nWayland\t332231\nHollister\t332232\n牟取\t332233\n霍比特\t332234\n帝企鹅\t332235\nSynology\t332236\n503路\t332237\n天弓\t332238\n世界肝炎日\t332239\n免密登录\t332240\n去势\t332241\n打排球\t332242\n色胆\t332243\n层序\t332244\n2.32\t332245\n6岁时\t332246\n行者物语网\t332247\n关悦\t332248\n北京日出东方凯宾斯基酒店\t332249\nxhsell\t332250\n磁带机\t332251\ntaco\t332252\n中控考勤管理系统\t332253\nOFweek光通讯网\t332254\n棉拖\t332255\n聂辉华\t332256\n囧途\t332257\n中央纪委国家监委\t332258\n代行\t332259\nsoar\t332260\n2018#\t332261\n色影\t332262\n浪速\t332263\n熊猫杯\t332264\n尸套龙\t332265\nWAVE\t332266\n王子奇\t332267\n0xa0430721\t332268\nCongress\t332269\n响水\t332270\n英雄山\t332271\n政府特殊津贴\t332272\n报税\t332273\n王叔远\t332274\n少佐\t332275\n太龙药业\t332276\nsu2016\t332277\n统贷\t332278\n零售版\t332279\n星语\t332280\n知识窗\t332281\n二少\t332282\n粘滞性\t332283\n张建一\t332284\nphases\t332285\n土管\t332286\n先锋影音\t332287\nAMR\t332288\n致富\t332289\nimagebutton\t332290\n华楠\t332291\n居然\t332292\n两夜\t332293\nelectromagnetic\t332294\n甩掉\t332295\n3^n\t332296\n孟杰\t332297\n纵横中文网\t332298\nP2C\t332299\n同德\t332300\n一汽奔腾\t332301\n孝老\t332302\n邓俊辉\t332303\n顺丰快递单号\t332304\n婶儿\t332305\n老雷\t
332306\n藏宝湾\t332307\n宏宝莱\t332308\n晨报\t332309\n中免集团\t332310\n星空网\t332311\n长春科技学院\t332312\n基础设计\t332313\n曲奇\t332314\n兰蔻Lancome\t332315\nmayun\t332316\n90条\t332317\n途胜女总裁的贴身兵王\t332318\nGetNAS\t332319\n刁妃\t332320\n绝地求生更新\t332321\n小白篇\t332322\n艮山府\t332323\n现代木\t332324\n消费新闻网\t332325\nkwm\t332326\n击伤\t332327\n虚拟dom\t332328\n拳头\t332329\n南京医科大学康达学院\t332330\n海思芯片\t332331\n520tingshu.com\t332332\nida\t332333\n很搞笑\t332334\n陈雅伦\t332335\n韩琛\t332336\n22K\t332337\nxy2\t332338\n第五十三章\t332339\n披麻戴孝\t332340\nest\t332341\n好意思\t332342\n骨瘦如柴\t332343\n丑照\t332344\nDeprecated\t332345\n音乐盒\t332346\n阿基\t332347\n盐边\t332348\n科技日报数字报\t332349\n中国核工业集团公司\t332350\n马尔彭萨机场\t332351\n技场\t332352\n株洲方特欢乐世界\t332353\n许健\t332354\n邓布利多\t332355\n台式\t332356\n心室\t332357\n元月\t332358\nXin\t332359\n编导类\t332360\n艾丽娅\t332361\n艺趣\t332362\nfm2018吧\t332363\n第八色\t332364\n父\t332365\n签求\t332366\n网特邮编查询网\t332367\n牙颌畸形\t332368\n姜饼城\t332369\n选错\t332370\nhardware\t332371\n小牛电动车\t332372\n变形金\t332373\n11年\t332374\n钢级\t332375\nword公式编辑器\t332376\n民俗博物馆\t332377\n梁内\t332378\n新龙门客栈\t332379\n地狱塔\t332380\n蒋娉婷\t332381\n东尼\t332382\nWord07\t332383\n弟弟们\t332384\n质管\t332385\n3dsb9\t332386\n新洪城\t332387\n4月9日\t332388\n史基浦机场\t332389\n上海越剧院\t332390\n步距\t332391\n圣邦微电子\t332392\n聚鑫\t332393\n二野\t332394\n来凤新闻网\t332395\n落底\t332396\n5510a\t332397\na级片\t332398\n磷酸根\t332399\n黑潮\t332400\n20017\t332401\n无锡市委\t332402\n苏河湾\t332403\n阿波罗\t332404\njacket\t332405\nInspiring\t332406\n仆人\t332407\n关键步\t332408\n郑州大学远程教育学院\t332409\n音效师\t332410\n新生力量\t332411\n8848汽车学苑vw8848.net\t332412\n碰头会\t332413\n2345软件大全\t332414\n第1组\t332415\n207国道\t332416\nFog\t332417\nGENERAL\t332418\n工字\t332419\n中国孤独症支援网\t332420\n区总工会\t332421\nFoxMail\t332422\n信长之野望12\t332423\n拿回家\t332424\n农残检测_食品检测_仪器\t332425\n58同城公司\t332426\n多嘴\t332427\n洛娃\t332428\n马里奥·毛瑞尔\t332429\n出物\t332430\npolka\t332431\n九五五五\t332432\n苛求\t332433\n公主坟\t332434\nIOV\t332435\n耐卡影音\t332436\n科迈\t332437\nHASH\t332438\n山田洋次\t332439\n录入员\t332440\n3d肉蒲团\t332441\n茶业\t332442\n凉宫春日的忧郁\t332443\n增值税销项税额\t332444\n白银谷\t332445\nCooking\
t332446\n橘花\t332447\n长帝烤箱\t332448\n任意盈余公积\t332449\nntohs\t332450\n水库\t332451\n济宁市政府\t332452\n916路\t332453\nAX7\t332454\n扩边\t332455\n游手好闲\t332456\nmsvcr100.dll\t332457\n卷管器\t332458\n领贤\t332459\n救僵清道夫\t332460\nbupt\t332461\n桐人\t332462\nbigmom\t332463\n重庆自贸试验区\t332464\n改制后\t332465\n脱脂奶粉\t332466\n情定三生\t332467\n骚白\t332468\n译著\t332469\n瑞士大学\t332470\n水深\t332471\n江小媚\t332472\n金六福珠宝\t332473\n天姥家园\t332474\n纽博格林\t332475\nreference\t332476\n王者荣耀梦琪\t332477\n中税网\t332478\n深圳市公安局\t332479\n发展篇\t332480\n金史密斯\t332481\nCredits\t332482\n北京理工大学信息与电子学院\t332483\nxware\t332484\n叉烧包\t332485\nRaf\t332486\n甲胎\t332487\n内设\t332488\n徐理泰\t332489\n地呱呱\t332490\n35MM\t332491\n海月\t332492\n玛丽安娜\t332493\n马东敏\t332494\n鹰翔\t332495\n三典轩书画网\t332496\n信阳市公安局\t332497\nUnityEditor\t332498\n103.5\t332499\n英雄萨姆3\t332500\n欢笑\t332501\n恐龙化石\t332502\n真象\t332503\nIp地址\t332504\n文星\t332505\n集存\t332506\n耍帅\t332507\n郑怡\t332508\n3口\t332509\n全乐\t332510\n芙林\t332511\n反馈表\t332512\n辑稿\t332513\n类似品\t332514\nclaims\t332515\n苏享茂\t332516\n贸易品\t332517\n卒業\t332518\n北斗杯\t332519\n缝针\t332520\n血记\t332521\nstart\t332522\nschema\t332523\nSchema\t332524\n左红军\t332525\n吴锋\t332526\n错别字\t332527\n行笔\t332528\n和静\t332529\nCTCC\t332530\n購物\t332531\n那片星空那片海2\t332532\nKillers\t332533\n逃脱者\t332534\n九龙潭\t332535\n时尚大道\t332536\n万张\t332537\n无抵押信用贷款\t332538\n拔萝卜\t332539\nc52\t332540\n45m\t332541\n安徽省文化厅\t332542\nsero\t332543\nbarrel\t332544\n收银员\t332545\n北京广播网\t332546\n618\t332547\n60千米\t332548\n江西银监局\t332549\n马特洪峰\t332550\n出生点\t332551\n盲从\t332552\n山下达郎\t332553\n死囚\t332554\n无为县人民政府\t332555\nRave\t332556\nファンタジ\t332557\n健康类\t332558\n算号\t332559\n郑州西区\t332560\n天曜\t332561\n锁甲\t332562\n香格里拉小区\t332563\n180ml\t332564\n芜湖路\t332565\n阶跃函数\t332566\nUnderground\t332567\nCSAPP\t332568\n代付\t332569\n团状\t332570\n鬼泣HD\t332571\n分散式\t332572\n卓美\t332573\n快乐多\t332574\n复方嗜酸乳杆菌片\t332575\n预埋件\t332576\n阳泉北\t332577\n全集影音先锋高清\t332578\nds5\t332579\n施放\t332580\n福尔曼\t332581\n太阳报\t332582\n浮船\t332583\n钢管柱\t332584\n科雷傲论坛_汽车之家论坛\t332585\n同一\t332586\n小米红米\t332587\n永生化\t332588\n蕾蕾\t332589\n银豹
收银系统\t332590\n缓坡\t332591\n警\t332592\n触手可得\t332593\n决策层\t332594\n池田\t332595\nfltk\t332596\nリアルエロゲシチュエ\t332597\n招聘者\t332598\n失窃\t332599\n龙盛\t332600\n利亚纳\t332601\n两拳\t332602\n水族馆\t332603\n李京\t332604\n临床医生\t332605\nwindows7_Windows系\t332606\n莱肯\t332607\n气氛炉\t332608\n死或生5\t332609\n水痕\t332610\n急性胃肠炎\t332611\n英伦对决\t332612\n动量守恒定律\t332613\nDae\t332614\n林玲\t332615\n龙虎网\t332616\n诺贝尔和平奖\t332617\n女大十八变\t332618\n新明半岛\t332619\n抽奖式\t332620\n拉克希尔\t332621\n酒红色\t332622\nCouchDB\t332623\n700平米\t332624\nadss\t332625\n脱下\t332626\n不达\t332627\n北新网\t332628\n山东省食品药品监督管理局\t332629\n中国计划生育协会\t332630\n长影世纪城\t332631\n光耀\t332632\n红黄蓝教育\t332633\nCGwell\t332634\n助记词\t332635\n鱼点\t332636\n松紧带\t332637\n柳梦璃\t332638\n紫电\t332639\n肛周\t332640\n五千\t332641\n注射用泮托拉唑钠\t332642\n雄\t332643\nFeng\t332644\n莫拉斯\t332645\n原生泰\t332646\n绿芥刑警\t332647\n鼻炎药\t332648\nequity\t332649\n神州租车网\t332650\nwework\t332651\nMySQL数据库服务器\t332652\nBugs\t332653\n枪支\t332654\n电狗\t332655\n成骨细胞\t332656\n包间\t332657\n刑事判决\t332658\n科腾\t332659\n微生物学\t332660\n振型\t332661\n阅片\t332662\nwl\t332663\n北美区\t332664\n保温车\t332665\n惊天\t332666\n哗啦\t332667\n六七月\t332668\n苏安监\t332669\nAV女星Connie\t332670\n广深港\t332671\n岑建勋\t332672\n万松\t332673\nmultinomial\t332674\n魏教授\t332675\n美罗城\t332676\n朝阳轮胎\t332677\n第180章\t332678\n肇始\t332679\n43所\t332680\n吴岩\t332681\n游戏史\t332682\nC++11\t332683\n马儿快跑\t332684\n昆明医学院\t332685\n速组\t332686\n豆王\t332687\n浦东日上机场\t332688\n斐波那契\t332689\n诺尔\t332690\n清口\t332691\n神雾环保\t332692\n20151125\t332693\naccidents\t332694\n任丘\t332695\n凯兰崔尔\t332696\nantimalware\t332697\n铆压\t332698\n武林门\t332699\n二三十年\t332700\n云南公司\t332701\n百田网\t332702\necb\t332703\nanconda\t332704\n湖南省科学技术厅\t332705\n右眼\t332706\nchinadaily\t332707\n9千\t332708\n冯木木\t332709\n光敏树脂\t332710\n合肥车管所\t332711\n齐康\t332712\n361号\t332713\n西递宏村\t332714\nQTreeView\t332715\n201-2\t332716\nshap\t332717\n护工\t332718\n电动晾衣机\t332719\n2314\t332720\n柘林\t332721\n板桥乡\t332722\n对外版\t332723\n校花与野出租\t332724\n盲枝\t332725\nFSMC\t332726\nDub\t332727\n万达物业\t332728\n帮顶\t332729\n种法\t332730\nvbios\t332731\n金苑小区\t332732\ngl
ossy\t332733\n巨友\t332734\n录取通知\t332735\nず\t332736\n大舰\t332737\n世界品牌实验室\t332738\nnumbers\t332739\n15.00\t332740\n净月区\t332741\nsayabc\t332742\nitools4.0\t332743\n10天内\t332744\n新宫\t332745\norange\t332746\n汉能\t332747\n中国管理咨询公司\t332748\n0.5mol\t332749\n日中\t332750\nJordi\t332751\n板顶\t332752\nic50\t332753\n节粮\t332754\n金科天籁城\t332755\n柳栖士\t332756\n20枚\t332757\n大队长\t332758\nSoldering\t332759\nC267\t332760\n口袋战争\t332761\n双套\t332762\n下访\t332763\n光谱法\t332764\nDistrict\t332765\n难关\t332766\n黄雅莉\t332767\nxssf\t332768\n置空\t332769\n神仙草\t332770\nSNR\t332771\nr7000\t332772\ntanker\t332773\n受益者\t332774\n韩遂\t332775\n浸\t332776\n长沙园林生态园\t332777\n黑岩射手\t332778\n复诊\t332779\n组画\t332780\nDOCUMENT\t332781\nlargely\t332782\n三一挖掘机\t332783\n九龙珠\t332784\n美云\t332785\n热线电话\t332786\n库存服装\t332787\n北京精雕\t332788\n建国门外大街1号\t332789\n顺丰电子\t332790\n龟峰\t332791\nQC2.0\t332792\n好尸\t332793\n李镇西\t332794\n鹰潭火车站\t332795\n第一片\t332796\n今年五一\t332797\n马达\t332798\n投运\t332799\n商住\t332800\n多盘\t332801\nmarriott\t332802\n农村信用合作联社\t332803\nBoku\t332804\n电井\t332805\n电白区\t332806\nlevi's\t332807\n海贼女帝\t332808\n风\t332809\n捣鬼\t332810\n2017年10月9日\t332811\n节气歌\t332812\nGTX550Ti\t332813\n80版\t332814\n套汇\t332815\nmaple\t332816\nReleases\t332817\n雷语\t332818\n沁心\t332819\n出学\t332820\n载货\t332821\ngnz\t332822\n3DM版\t332823\n贵妃芒\t332824\n读图\t332825\n仿木桩\t332826\n星梦学院\t332827\n毛皮\t332828\nBlazer\t332829\n盖茨比\t332830\nopencms\t332831\n分数单位\t332832\n鸳鸯浴\t332833\n安医一附院\t332834\n还看今朝\t332835\nZan\t332836\n12mhz\t332837\n95%\t332838\n姉汁\t332839\n插杆\t332840\ndistant\t332841\n王澜\t332842\n01【人人网\t332843\n天天看片|天天看电影\t332844\n消费季\t332845\n小金县\t332846\n40件\t332847\n铜冶镇\t332848\n大理双廊\t332849\n943\t332850\nremedy\t332851\n韶韶\t332852\nT10\t332853\n邾城\t332854\n神舟十一号\t332855\nlowercase\t332856\n第十四期\t332857\n胆总管囊肿\t332858\n块状\t332859\n朗致\t332860\n起诉\t332861\n美性\t332862\n秋高气爽\t332863\nISO\t332864\n3.77\t332865\n新开业\t332866\n双个\t332867\n霍尔\t332868\nwin8win10\t332869\n四辊卷板机\t332870\n手云\t332871\n纳甘\t332872\n八倍镜\t332873\n惜缘\t332874\n乐嘉\t332875\n四柱神煞\t332876\ncig
arette\t332877\n吴俊余\t332878\nrgp\t332879\nFaithful\t332880\n04月14日\t332881\n堂本刚\t332882\n野猪肉\t332883\n寻人\t332884\n危大\t332885\n播视网\t332886\n304L\t332887\n马口\t332888\n又拍云\t332889\n起亚_\t332890\nITMC\t332891\n家具业\t332892\nd850\t332893\n新生社区\t332894\n皖西\t332895\nsixt\t332896\nchuo\t332897\nusboot\t332898\n波尔图\t332899\n新晃\t332900\n杰伊\t332901\nd20\t332902\ncqb\t332903\n对位\t332904\n学舞\t332905\n熔纤机\t332906\n港南\t332907\n核电站\t332908\n直器\t332909\n小吃店\t332910\n幽梦影\t332911\n泉州市委\t332912\n4g路由器\t332913\n唐顿庄园\t332914\nhuayu\t332915\n杭州省\t332916\n9.14\t332917\n同情心\t332918\n群聚\t332919\n陈香卡卡\t332920\n1日游\t332921\n简模\t332922\n房地产估价师\t332923\n微信理财通\t332924\n赵全\t332925\n前列腺肥大\t332926\n江苏省前黄高级中学\t332927\n惠喵\t332928\n徐林\t332929\nExhentai\t332930\n不愿\t332931\n国内陆\t332932\n陌上桑\t332933\nIntelij\t332934\n卡耐基梅隆大学\t332935\n碎石\t332936\n党性\t332937\n备案制\t332938\n浪漫之都\t332939\nEAA\t332940\nSharpDevelop\t332941\nKBEngine\t332942\nRoberts\t332943\nOutlets\t332944\nusbasp\t332945\n多一天\t332946\n郭书瑶\t332947\nPSVITA\t332948\n磁性分离器\t332949\n篷房\t332950\nnivida\t332951\n联通移动\t332952\n竹器\t332953\n魔衣\t332954\nSignature\t332955\n冲积平原\t332956\nwke\t332957\n赤壁市\t332958\nbabyg\t332959\n经纪云\t332960\n亚伯\t332961\nCONTRACTUS\t332962\n轿夫\t332963\n帽徽\t332964\n苏州口腔医院\t332965\n草莓花\t332966\n六股\t332967\n丁苯胶乳\t332968\n小姐妹\t332969\n协鑫\t332970\nTLF\t332971\n黑墨\t332972\n君越天骄战纪\t332973\n辉\t332974\ngmba\t332975\n6pin\t332976\nhisuite\t332977\n金属乐队\t332978\n茜茜公主\t332979\n百战百胜\t332980\n里番网\t332981\n纳兰词\t332982\n洞子口\t332983\nAnna\t332984\nbya\t332985\npast\t332986\n七星街\t332987\n拐枣\t332988\n20160821\t332989\n领爱\t332990\n扶壁式挡土墙\t332991\n米卢\t332992\n骏域\t332993\n阿姆斯\t332994\n仙都\t332995\n招生章程_高考网\t332996\n层出不穷\t332997\n中国银行深圳分行\t332998\n川藏公路\t332999\naeo\t333000\n深圳网易\t333001\n日期段\t333002\n永湖镇\t333003\n秋田犬\t333004\ns1mple\t333005\n0532\t333006\n0%\t333007\n赛码\t333008\n澳门娱乐场\t333009\n淫术炼金士\t333010\n纯虚函数\t333011\n魔兽世界冰法\t333012\n河南省中医院\t333013\n重审\t333014\n中共北京市委党校\t333015\n19家\t333016\n尼康D610\t333017\n贾赦\t333018\n上海包装设计公司\t333019\n市委组织部\t
333020\n32米\t333021\nK6\t333022\n老爷爷\t333023\n04月26日\t333024\n知彩\t333025\n滚犊子\t333026\n真牙\t333027\n720万\t333028\n五道峡\t333029\n名门世家\t333030\n数码类\t333031\n再谋\t333032\n超\t333033\n词伙\t333034\n&#1\t333035\n莆田0594_莆\t333036\n上海自贸区\t333037\n动摇\t333038\nReeves\t333039\n习总书记系列\t333040\nAPK安装器\t333041\n注册设备师考试\t333042\n里程桩号\t333043\n壮熊\t333044\n日风\t333045\n迅雷百度云\t333046\n赵显娥\t333047\n城管执法局\t333048\n考试大纲\t333049\npki\t333050\n角柱\t333051\n方波\t333052\n体转\t333053\n适度\t333054\n中兴集团\t333055\n大秋\t333056\n阮籍\t333057\n申港证券\t333058\ncs4\t333059\n唯利是图\t333060\n梯级\t333061\n营业总收入\t333062\n中车时代电气\t333063\n胃肠\t333064\nDFC\t333065\n合作化\t333066\n87型\t333067\n举升机\t333068\n张维良\t333069\n疯牛病\t333070\n沈北新区政府\t333071\nV9.7\t333072\n打呼噜\t333073\n上庄\t333074\nhelens\t333075\n平法\t333076\n禁售期\t333077\nYeast\t333078\n晕机\t333079\n巨兽\t333080\n荒野生存\t333081\nAwesome\t333082\n见首\t333083\n平铺式\t333084\n比丘国\t333085\n感觉自己萌萌哒\t333086\n盖地虎\t333087\n济南火车站\t333088\n李雷和韩梅梅\t333089\n@163.com\t333090\n缺纸\t333091\n208元\t333092\n曲谱网\t333093\nGNOME3\t333094\n大侦探波洛\t333095\n着意\t333096\n韩童生\t333097\n汛\t333098\n2011年5月\t333099\n2017年8月14日\t333100\n鞋女\t333101\nBIM\t333102\n篮球架\t333103\n鬓角\t333104\n石化工业\t333105\n尼安德特人\t333106\n热依扎\t333107\n黎明村\t333108\nHUI\t333109\n红叶\t333110\n黑莲\t333111\n早晚\t333112\n尼尔纪元\t333113\nA800\t333114\n第42集\t333115\n用海\t333116\njincheng\t333117\n塑钢管\t333118\n第1物流网\t333119\n溪水\t333120\n南方人物周刊\t333121\n陌上\t333122\n破招\t333123\n1200万元\t333124\n海南矿业\t333125\n配位\t333126\nreCAPTCHA\t333127\n千间\t333128\ngoodbye\t333129\n志同\t333130\n背书\t333131\n唐龙\t333132\n齐豫\t333133\n形影不离\t333134\nled照明\t333135\n下棋\t333136\n富翁\t333137\nSung\t333138\nDeleting\t333139\n沂州\t333140\n垂露\t333141\n国家新闻出版总署\t333142\n方正宽带\t333143\n荣荣\t333144\n田彩也香\t333145\n短针\t333146\n小升初考试\t333147\n铜渣\t333148\n2017年9月5日\t333149\n重庆医科大学附属第二医院\t333150\n师说\t333151\n横村镇\t333152\n300133\t333153\nFGO2018\t333154\n五灵\t333155\nh3c\t333156\nDAX\t333157\n季风性\t333158\n王敖\t333159\nBieber\t333160\n接近开关\t333161\n宋辽金\t333162\n疗\t333163\n李哈哈\t333164\n平淡无奇\t333165\n爱不
爱我\t333166\n0.6.2\t333167\n真了不起\t333168\n闪亮登场\t333169\n纳粹党\t333170\ndragonia\t333171\n16年12月\t333172\n小舅子\t333173\nビデオ\t333174\nAiaiBT\t333175\n观致\t333176\n张亚\t333177\n17层\t333178\n8848\t333179\n2006届\t333180\n浙江警察学院\t333181\n缠身\t333182\n670号\t333183\n张兴\t333184\n思特奇\t333185\nShining\t333186\n极速office2017\t333187\nshad\t333188\n移动4G+\t333189\n无盘启动\t333190\n陈须隆\t333191\n051\t333192\n风神\t333193\n恒流泵\t333194\n羲和\t333195\n强力新材\t333196\nFIM\t333197\n第五十一\t333198\n高大威猛\t333199\n诺丽果汁\t333200\n花粉俱乐部\t333201\n炮兵旅\t333202\nous\t333203\n字符流\t333204\n天津理工大学\t333205\n急病\t333206\nrest-framework\t333207\n未央宫\t333208\nbrowse\t333209\n得帅\t333210\n高考网_高考网\t333211\n杜牧诗\t333212\n集中供暖\t333213\n新哥\t333214\n朱林\t333215\n李淑贤\t333216\n涡虫\t333217\nCafé\t333218\nKRIS&LU\t333219\n乌鱼子\t333220\n40m\t333221\n既往史\t333222\nAFM\t333223\n库斯科\t333224\n琵琶\t333225\n李代桃\t333226\n说文解字\t333227\nEPM\t333228\n_模拟人生3\t333229\n亚托克斯\t333230\n李氏宗祠\t333231\n华智\t333232\n敞车\t333233\n黑爱\t333234\n允差\t333235\n本格\t333236\n武汉亚洲心脏病医院\t333237\nqiu\t333238\nBOM\t333239\nresponded\t333240\n墙外\t333241\n八恶人\t333242\n医采\t333243\n简小\t333244\n镜前灯\t333245\nin卡\t333246\n_淘大师\t333247\nallocated\t333248\n牧户\t333249\n泠泠\t333250\n黑白路\t333251\nroslaunch\t333252\nsaa\t333253\nsoak\t333254\n姜广涛\t333255\n幸字\t333256\n25平方\t333257\n珠岛花园\t333258\n亲爱的朋友们\t333259\nMFStar模范学院\t333260\n2215\t333261\n齐纳\t333262\n急性髓细胞白血病\t333263\n广西招生考试网\t333264\navt\t333265\n2221\t333266\n多力多滋\t333267\n索尼家庭影院\t333268\n春款\t333269\n自由篮球吧\t333270\n电动三轮车\t333271\n荣汉斯\t333272\n快吧流放之路\t333273\nAR15\t333274\nGeolocation\t333275\n夏邑\t333276\nHellokid\t333277\n三国鼎立\t333278\n严整\t333279\n底袋\t333280\n洋节\t333281\n100封\t333282\n镜间\t333283\nC++/CLI\t333284\n板业\t333285\n铜价\t333286\n空耳\t333287\n热压机\t333288\n英利集团\t333289\n直招\t333290\n有限责任公司章程范本\t333291\n日落西山\t333292\ndapeng\t333293\n山口\t333294\n曲水亭街\t333295\n丁雪\t333296\n战略部\t333297\n值乎\t333298\n家事\t333299\napec\t333300\n魔法少女奈叶\t333301\n王老五\t333302\n秘术师\t333303\n双人餐\t333304\n金牌厨柜\t333305\n松雪\t333306\n系统学\t333307\n第一性原理\t333308\n蛾眉\
t333309\n李海燕\t333310\ncalculator\t333311\n海淀园\t333312\n二块\t333313\n米蒂\t333314\n女真族\t333315\n匈牙利福林\t333316\nAcetate\t333317\nchemotherapy\t333318\n_Hi\t333319\n协办\t333320\n打扣机\t333321\n干鲜\t333322\n再试\t333323\n董\t333324\n破窗效应\t333325\n巨细胞病毒抗体\t333326\n蝶翠诗\t333327\ndrives\t333328\n5280\t333329\n古文化街\t333330\n庆元\t333331\n阻力系数\t333332\n青岛幼儿园\t333333\n4899\t333334\ngiraffe\t333335\n荧夜\t333336\n如城街道\t333337\n帝国理工大学\t333338\n攻克\t333339\n环球风云\t333340\n監\t333341\n洗澡歌\t333342\n烟瘾\t333343\n好舒服\t333344\nRX470D\t333345\neds\t333346\n连云港东站\t333347\nMybatis拦截器\t333348\n温野菜\t333349\n万怡酒店\t333350\n财神爷\t333351\n保理业务\t333352\n此意\t333353\n纨绔俏医妃\t333354\n各界\t333355\n渐近线\t333356\n蟠\t333357\n汽车部件有限公司\t333358\n子宫内膜增生\t333359\n工标\t333360\n上海九龙男子医院\t333361\n广州小鹏汽车科技有限公司\t333362\nEllis\t333363\nJcrop\t333364\n行为者\t333365\nPROII\t333366\n苏州市物价局\t333367\n奄奄一息\t333368\n事业单位工资标准表\t333369\n正畸科\t333370\n广州铁路(集团)公司\t333371\n洞察力\t333372\n飞秋\t333373\nwww.78\t333374\n创领\t333375\n首语\t333376\n就停\t333377\n高州\t333378\n清茗\t333379\n父亲的草原母亲的河\t333380\n前后\t333381\n五百多\t333382\n影音先锋_影音先锋播放器\t333383\n常州南大街\t333384\n末影龙\t333385\n11eyes\t333386\n矿化\t333387\n亲胸\t333388\n莺莺传\t333389\n小兴\t333390\n关于郑州的记忆\t333391\nVERSACE\t333392\n浪漫之路\t333393\n90068\t333394\nhuahua\t333395\nROSI\t333396\n闪变\t333397\n光子脱毛\t333398\n吕蒙正\t333399\n河北省气象局\t333400\n筛选器\t333401\n三角函数值\t333402\nTexpad\t333403\n自升式\t333404\n浙江省物价局\t333405\n明暗度\t333406\n铜陵市人力资源和社会保障局\t333407\n我的前半生\t333408\n学信网\t333409\n碳管\t333410\n2.0T_\t333411\n金锭\t333412\n羡\t333413\n战略型\t333414\nbelkin\t333415\n第79章\t333416\n约片\t333417\n随便\t333418\nesri\t333419\n调律\t333420\n育儿_新浪网\t333421\nmales\t333422\n968\t333423\n3ar\t333424\n条约\t333425\ncutecom\t333426\n5540\t333427\n红旗H5\t333428\n李洪\t333429\n明粉\t333430\n地藏经\t333431\n取水口\t333432\n87名\t333433\n6000家\t333434\n王鸿\t333435\n苗方清颜\t333436\n威爽\t333437\nyuntv\t333438\n3.54\t333439\n色泽\t333440\n经卷\t333441\n乐百氏\t333442\n亿连\t333443\nandroidx86\t333444\n龙湖时代天街\t333445\n美人版\t333446\n肃宁县\t333447\n焊盘\t333448\n骑马与砍杀:火与剑\t333449\n1.76版\t3
33450\n光电倍增管\t333451\n叉形\t333452\n凶地\t333453\n凡胎\t333454\n圆晶\t333455\n缠山\t333456\n较大\t333457\n电气石\t333458\n低语\t333459\nviminfo\t333460\nsky10001\t333461\n看出\t333462\n解救\t333463\n古碑\t333464\n表扣\t333465\nWEB应用\t333466\n流星焱雨\t333467\nnk250\t333468\n塔陨\t333469\n国网江西省电力公司\t333470\nClients\t333471\n芳草路\t333472\n苏果\t333473\n肾脏病\t333474\n李力\t333475\n流固\t333476\n罗伯特·麦基\t333477\n输入栏\t333478\nkeke\t333479\n微信头像网\t333480\n麒麟瓜\t333481\n安装篇\t333482\n凯尔特人队\t333483\n锦溪古镇\t333484\n秦直道\t333485\n深圳市知识产权局\t333486\n太平御览\t333487\n21世纪大学\t333488\n日销\t333489\n魔光\t333490\n脚本变量\t333491\n镇江中学\t333492\n广西农业科学院\t333493\nScipy\t333494\n注册监理工程师考试\t333495\n琶洲会展中心\t333496\n蚕沙\t333497\n通报表\t333498\nQui\t333499\n中继台\t333500\nuseradd\t333501\n1@163.com\t333502\n8.50\t333503\n唐少\t333504\n交换律\t333505\n秉烛夜游\t333506\nipad爱奇艺\t333507\nLY\t333508\n北京注册会计师协会\t333509\n沪江大杂烩\t333510\ne2\t333511\n完败\t333512\nhiwin\t333513\nmol\t333514\n100毫升\t333515\n民初\t333516\n麻栗坡\t333517\neqxiu\t333518\n万用电表\t333519\ninflation\t333520\n西北工业大学\t333521\nExhibitors\t333522\n地理志\t333523\n李爱玲\t333524\n停机\t333525\n议员\t333526\n长安欧尚A800论坛论坛\t333527\ndixon\t333528\n通路\t333529\n17周岁\t333530\n企客宝\t333531\n下侧\t333532\n中航集团\t333533\n酷睿\t333534\n天禄\t333535\n玻璃纤维\t333536\n范增\t333537\n扶持\t333538\n梦乃\t333539\nwxPython\t333540\n日本街\t333541\n莲子粥\t333542\n花卉展\t333543\n紫卡\t333544\n珠西\t333545\n换\t333546\n付辛博\t333547\n002195\t333548\nE01\t333549\n果皮\t333550\n首日\t333551\n杭州房产网\t333552\n新业广场\t333553\n日食记\t333554\n蔡赟\t333555\n发酸\t333556\n七千\t333557\n829\t333558\n女人村\t333559\n1枚\t333560\n谢玄\t333561\n八部\t333562\n183天\t333563\n16亿美元\t333564\n301路\t333565\n吗\t333566\nRevenue\t333567\n宣\t333568\n谭月姬\t333569\n灿鸿\t333570\nHexagon\t333571\n2017年4月24日\t333572\n普善路\t333573\n债劵\t333574\n大运通州网\t333575\n6.1.3\t333576\nFineReader\t333577\n广州市惠爱医院\t333578\nStevie\t333579\n下歌\t333580\n丽人堂\t333581\n汤姆叔叔的小屋\t333582\n马晓磊\t333583\nCCG\t333584\n线性分组码\t333585\n春暮\t333586\n构想\t333587\n利安\t333588\n南加州\t333589\n大餐\t333590\n独董\t333591\n结肠\t333592\ntpf\t333593\n景瑞望府\t333594\ns
hemale\t333595\n关键对话\t333596\n山煤集团\t333597\n食物中毒\t333598\n名机\t333599\n长春网易\t333600\n颐高\t333601\n各条\t333602\n扭住\t333603\n吴樾\t333604\n2303A\t333605\n崩解剂\t333606\n陈大伟\t333607\n江西移动\t333608\n临夏县\t333609\n海葬\t333610\n鲁政字\t333611\nvcredist_x86.exe\t333612\n侧倾\t333613\nUltimate\t333614\n罗修\t333615\nwf-1000x\t333616\n至善网\t333617\n特供酒\t333618\n安置户\t333619\nAutophagy\t333620\n银杏叶\t333621\n卓资县\t333622\n乐源\t333623\n大同市矿区\t333624\n调回\t333625\nEsxi\t333626\n小卫\t333627\n嘉科\t333628\n白酒瓶\t333629\nsept\t333630\n钙锌稳定剂\t333631\nTG\t333632\nIreport\t333633\n2335\t333634\n程曦\t333635\n履责\t333636\n卓易\t333637\nphan\t333638\n皇上皇\t333639\n63式\t333640\n班训\t333641\ntoprender\t333642\nwww.9ask.cn\t333643\n病历\t333644\n図\t333645\n锁块\t333646\n中国联合航空公司\t333647\n维也纳酒店集团\t333648\n流沙包\t333649\n牛奶盒\t333650\n超绝\t333651\nproject2003\t333652\n桃之夭夭\t333653\n望远\t333654\n书法史\t333655\nAD域控\t333656\n0628\t333657\n王茜华\t333658\n窦性心动过缓\t333659\n傅作义\t333660\nWESG\t333661\n新东方新概念英语\t333662\n特典版\t333663\n阜阳北路\t333664\n豆瓣\t333665\n啪嗒砰2\t333666\n20170915\t333667\ne15\t333668\n液封\t333669\nkathy\t333670\nthrift\t333671\nH5+\t333672\n江阴港\t333673\n私募投资基金监督管理暂行办法\t333674\n断层\t333675\n灯饰展\t333676\n葡萄糖酸钠\t333677\n接触性出血\t333678\n东莞装修公司\t333679\n包头市中心医院\t333680\n凌成兴\t333681\n3946\t333682\n我的仨妈俩爸\t333683\n萤石\t333684\n2012年12月31日\t333685\n路长\t333686\n1.028\t333687\n众泰T500_众泰_众泰T500报价_众泰T500\t333688\n潮汐海灵\t333689\n防灾科技学院\t333690\n爵士舞\t333691\n海东新区\t333692\n纷享逍客\t333693\n节用\t333694\nOAB\t333695\n荣氏\t333696\n创模\t333697\n弧度制\t333698\n定期存款\t333699\ninfringement\t333700\n石谷\t333701\nseh\t333702\ncongratulations\t333703\nRolan\t333704\n永和开发区\t333705\n魏氏\t333706\nnewspaper\t333707\n全张\t333708\n两房一厅\t333709\n临高\t333710\n1MPa\t333711\n长方\t333712\n华为手环B2\t333713\n通州副中心\t333714\n法律制裁\t333715\n维也纳大学\t333716\n制冷机\t333717\n十九禁\t333718\n妈妈桑\t333719\nhrc\t333720\n虚拟仿真实验教学中心\t333721\n美女版\t333722\n人绒毛膜促性腺激素\t333723\n忠心耿耿\t333724\n25.6\t333725\n克伦威尔\t333726\n巫神纪\t333727\n深圳公寓\t333728\n岷县\t333729\n质保书\t333730\n47篇\t333731\n尖啸\t333732\n四川建院\t333733\n4
0407\t333734\n李慧\t333735\n保护器\t333736\n潞安环能\t333737\n老阴逼\t333738\nXstream\t333739\n菅野洋子\t333740\n粟\t333741\n太阳能热水工程\t333742\n摔破\t333743\n挂号网\t333744\n4天3夜\t333745\n感慨\t333746\n君之博客|阳光烘站\t333747\n赛规则\t333748\n网易云歌\t333749\n温尼伯\t333750\n李伟\t333751\n现代启示录\t333752\n衢州人才网\t333753\nmtsp\t333754\n行运超人\t333755\n网络舆情监测\t333756\n19800\t333757\n石磨肠粉\t333758\n乙肝病\t333759\nM701\t333760\n20150304\t333761\n青岛国税\t333762\nwrest\t333763\n顺德农商银行\t333764\n广湛高铁\t333765\n上海市人民检察院\t333766\n老宗医\t333767\n卡莫\t333768\n0625\t333769\n舞马\t333770\n世界长寿之乡\t333771\n/dev/sda1\t333772\n快乐汉语\t333773\n铜奖\t333774\n败酱\t333775\n嘉楠耘智\t333776\n日片\t333777\n诛仙诀\t333778\n6轮\t333779\n藏医\t333780\n20140403\t333781\n危险物\t333782\n周浦东\t333783\nmaya22\t333784\n美国马里兰大学\t333785\n黄麻起义\t333786\n杨柘\t333787\n森林\t333788\n百度软件中心\t333789\n数亿\t333790\n于慧\t333791\n400吨\t333792\n凹凸教育\t333793\nopenc\t333794\n中国长寿之乡\t333795\n郑军\t333796\nUU加速器\t333797\n自卸式\t333798\nPinch\t333799\nkkw\t333800\n许云峰\t333801\n300千米\t333802\n免費\t333803\n宁德市政府\t333804\n隔夜茶\t333805\n狂干\t333806\n北京不孕不育医院\t333807\n代度\t333808\n韩德强\t333809\n丙二醇甲醚\t333810\n藤井彩\t333811\n转归\t333812\nSimilarity\t333813\n肯尼迪机场\t333814\n45岁\t333815\n秦瑶\t333816\n3031\t333817\n总结稿\t333818\nAKB\t333819\nress\t333820\n哈布斯堡\t333821\n政界\t333822\nmicromsg\t333823\n植物大战僵尸2狂野\t333824\n轻载\t333825\n终有\t333826\nemr\t333827\n修造\t333828\n行走进\t333829\n三星W2016\t333830\nwisdo\t333831\n鞭笞者\t333832\n海南省国土资源厅\t333833\n左正\t333834\n王希孟\t333835\n一共\t333836\nEugene\t333837\n发作期\t333838\n圆筒\t333839\njfianl\t333840\nluigi\t333841\n养猫\t333842\n诗潮\t333843\n68平米\t333844\n404号\t333845\nrio\t333846\n山西招生考试网\t333847\n雀巢奶粉\t333848\n印第安纳波利斯\t333849\nAES-128\t333850\nEXCEL表里\t333851\n放人\t333852\n出走\t333853\n立体字\t333854\nwin7显卡驱动\t333855\nc106\t333856\n续写\t333857\nS300\t333858\n标识号\t333859\n赎回费\t333860\n30码\t333861\n50D\t333862\n美工\t333863\n藏袍\t333864\n童游\t333865\n二手楼\t333866\n千言\t333867\n嘉鱼县人民政府\t333868\n武训\t333869\n阿兴\t333870\nppt2007\t333871\naon\t333872\n李亚邱少云\t333873\n2016～2017年\t333874\n安义县人民政府\t333875\nbomb\t3
33876\nWIN7纯净版\t333877\n自行式\t333878\n路博润\t333879\nrost\t333880\n艾娜\t333881\n卫带\t333882\n消防风机\t333883\n沈梦辰\t333884\n请多多指教\t333885\n英科宇\t333886\n督查\t333887\n微粒体\t333888\n丁一\t333889\n柏林自由大学\t333890\n11处\t333891\nNSCA\t333892\n刺伤\t333893\ndebit\t333894\n全影网\t333895\n栗子\t333896\n西控\t333897\n给付\t333898\n伊利达雷\t333899\n环境篇\t333900\n豆机\t333901\n久负盛名\t333902\n不见经传\t333903\nweatheronline\t333904\n新华区\t333905\n肌精\t333906\nclearfix\t333907\nzinc\t333908\n文人雅士\t333909\n炼金\t333910\n眼光\t333911\nmatlab2010b\t333912\n广州大道南\t333913\n梁兴义\t333914\n何姿\t333915\n葛沽镇\t333916\n机盒\t333917\n肠道息肉\t333918\n甲醛克星\t333919\n洽商\t333920\n18800\t333921\n国别人权\t333922\n泡椒牛蛙\t333923\n陈安丽\t333924\n浊气\t333925\n考编\t333926\n好报\t333927\nrem\t333928\n暗锁\t333929\n浴火银河2\t333930\n暴走漫画图片库\t333931\n余杭镇\t333932\n好胜心\t333933\n易遨\t333934\n女刀\t333935\n迅驰\t333936\naircross\t333937\n五天四夜\t333938\n赤羽\t333939\n奥特之母\t333940\n园方\t333941\n本安型\t333942\n裁量\t333943\n上承式\t333944\n山姆店\t333945\n丰镇\t333946\n必杀技\t333947\n在线音乐\t333948\nfx5u\t333949\n雷蛇灵刃\t333950\n桩筏\t333951\n薛大龙\t333952\n林语嫣\t333953\n蜀冈\t333954\n修理工\t333955\n万家乐热水器\t333956\n灵格斯\t333957\nhihoCoder\t333958\n鼓粉\t333959\n青云店\t333960\n云从科技\t333961\nmarketing\t333962\n再见了\t333963\n史小坑\t333964\n500丁\t333965\n财库\t333966\n第105\t333967\n合作部\t333968\n上海家化\t333969\n美国白蛾\t333970\nUpon\t333971\n2113\t333972\ngesture\t333973\nhp2132\t333974\ncurves\t333975\n西安热工研究院有限公司\t333976\n未开\t333977\n龘\t333978\n复健\t333979\n燃情\t333980\n五月婷婷\t333981\n计票\t333982\nlighted\t333983\n百亿元\t333984\n算起来\t333985\n田阳县\t333986\nprostate\t333987\n小双\t333988\nMusiclab\t333989\nAdmob\t333990\n红稚莲\t333991\n阿采\t333992\n样子\t333993\n延觉\t333994\n进贤县\t333995\n风尚中国网\t333996\n太溪\t333997\nHarrison\t333998\n80.90年代\t333999\n美国国家科学院\t334000\n顺口\t334001\n劳动法行天下\t334002\n酷版\t334003\n摩纳哥\t334004\nheating\t334005\n50p\t334006\n失策\t334007\n有气无力\t334008\nSCRATCH\t334009\n有色眼镜\t334010\nsybase数据库\t334011\nSOMA\t334012\nlegbaby\t334013\nxe5\t334014\n低开\t334015\n赃\t334016\n张伟明\t334017\n版式\t334018\n欧德宝\t334019\n亚韩\t334020\n第三军医大学新桥
医院\t334021\n快手网\t334022\n标轴\t334023\npools\t334024\n沃顿\t334025\n盛德\t334026\nDNF迷你大乱斗\t334027\nBytom\t334028\n麻棉\t334029\n体育公园\t334030\nAIC\t334031\n漂流记\t334032\nXDA\t334033\n鸿福\t334034\n保举\t334035\nShortcuts\t334036\n花样男子2\t334037\n最终圣战\t334038\n8点半\t334039\n拉诺西亚\t334040\n白水泥\t334041\nwww.4399dmw.com\t334042\n选举权\t334043\nupd\t334044\n床上\t334045\n迅雷港\t334046\n神颜\t334047\n老年优待证\t334048\n样式库\t334049\n上海网易\t334050\n终极攻略\t334051\n雨天队\t334052\nSPL\t334053\n圈人\t334054\n嘉言\t334055\n今晚80后脱口秀\t334056\n汤方\t334057\n郑佳\t334058\nSLIC\t334059\n5A农业人才网\t334060\n巴卡拉\t334061\n哆啦宝\t334062\n搭\t334063\n唐山职业技术学院\t334064\n保运通\t334065\n館\t334066\n霏\t334067\n乡子\t334068\n白芥子\t334069\n小米miui9\t334070\n武僧\t334071\n一个30\t334072\n黑金砂\t334073\n语聊\t334074\n护城河\t334075\n仓颉\t334076\n长春市二道区\t334077\n油藏\t334078\n缝隙\t334079\n悲欢离合\t334080\n普洱熟茶\t334081\ndark_saber\t334082\n40多\t334083\n渠道部\t334084\n另案\t334085\n髋臼\t334086\n机杼\t334087\n黄恺\t334088\n赵岷\t334089\nMER\t334090\n刘鹏\t334091\n云南省住房和城乡建设厅\t334092\n天籁之音\t334093\n招商加盟\t334094\n新手卡之家\t334095\n8100\t334096\n奥莉\t334097\nsdmu\t334098\n青泥\t334099\n拼课\t334100\n上上兼职网\t334101\n同名\t334102\n282\t334103\n孔卡\t334104\n重振\t334105\n转动\t334106\n捡破烂\t334107\n小葱拌豆腐\t334108\n利弊\t334109\n吴凯\t334110\n2.5mm2\t334111\n低糖\t334112\n小贩\t334113\n铜鼓县\t334114\n陕西省质量技术监督局\t334115\n七个多月\t334116\n苦瓜电影网\t334117\n保定容大\t334118\n梦幻西游大闹天宫\t334119\nbrock\t334120\n三角轮胎\t334121\n足底按摩器\t334122\n一影\t334123\nKeil\t334124\nJPanel\t334125\n满月酒\t334126\n斯卡哈\t334127\n使命召唤6\t334128\ntwincat\t334129\n0.5x\t334130\n依云\t334131\n武林外传\t334132\n国融证券\t334133\nnode-gyp\t334134\n鱼米\t334135\n琉璃河\t334136\n包赔\t334137\n行使\t334138\n佩斯\t334139\n寡助\t334140\n古浪\t334141\n消防员\t334142\n离婚协议\t334143\n苏州工业园区国土环保局\t334144\n涂饰剂\t334145\n网绳\t334146\n徐金桂\t334147\n精打细算\t334148\n好又多\t334149\n锡条\t334150\n环烷烃\t334151\n罗纳冯仑\t334152\n使用权\t334153\n苍苔\t334154\nregarding\t334155\n屡试\t334156\n72M\t334157\n康斯\t334158\n你们\t334159\n友邦保险\t334160\nFaculty\t334161\nCa2+\t334162\n露台\t334163\n骚屄\t334164\n漳州市住房和城乡建设局\t334165\n贵阳云岩\t334166\n
半波\t334167\n七六\t334168\n网状线\t334169\nteamviewer11\t334170\n桑巴\t334171\n北京尾号限行\t334172\n韩降雪\t334173\n11000\t334174\n抗压性\t334175\n红曲米\t334176\n烟气脱硫脱硝\t334177\nrx100m5\t334178\n暖舒\t334179\n惦记\t334180\n海藻泥\t334181\n出口报关单\t334182\n九龙湖度假村\t334183\nprism7\t334184\n商水\t334185\n台州市环境保护局\t334186\n天九\t334187\n南海路\t334188\n套牌\t334189\n天宏\t334190\n庆云县\t334191\n数术\t334192\n莱茵哈特\t334193\n独立战争\t334194\n号令\t334195\n梭织面料\t334196\nログイン\t334197\n海贼王时代\t334198\nwm1a\t334199\n温箱\t334200\n小过\t334201\n差数\t334202\nHeating\t334203\n质干细胞\t334204\n466\t334205\n伊顿纪德\t334206\n挂机\t334207\n长啸\t334208\n李晨浩\t334209\n长江一号\t334210\n杜伦\t334211\n依班娜\t334212\n马鞍山市政府\t334213\nbase64_encode\t334214\n途乐Y62\t334215\n西门子电气\t334216\n蓝炬星\t334217\n南洋商业银行\t334218\n46例\t334219\n休斯顿火箭\t334220\n急性肾衰竭\t334221\nAlisa\t334222\n拉客\t334223\n桃李面包\t334224\n调味\t334225\nXCOM2\t334226\n辣椒素\t334227\n宏观经济政策\t334228\n汪涛\t334229\n柜子\t334230\nnubia\t334231\npacker\t334232\n嫁接机\t334233\n仇和\t334234\n奥数题\t334235\n孟母\t334236\n苍梧城\t334237\n知校\t334238\n秦来财\t334239\n集换式\t334240\n开刀\t334241\n省志\t334242\n百家湖\t334243\n布拉特\t334244\n市房管局\t334245\n糖尿病酮症酸\t334246\n熊希龄\t334247\ndffyw\t334248\n险好\t334249\n半片\t334250\n梦雨情殇\t334251\nbilibili_哔哩哔哩\t334252\n骨质疏松性\t334253\n铝粉\t334254\n不莱梅\t334255\n良庆\t334256\n星空联盟\t334257\n个人页\t334258\nvdsl\t334259\n任免\t334260\n董美英\t334261\n上海市徐汇区人民法院\t334262\n曲臂式\t334263\n41关\t334264\n12.5mm\t334265\n消费率\t334266\n进网许可证\t334267\n惠尔\t334268\n轨道式\t334269\n个人征信\t334270\n生机无限\t334271\n张叔平\t334272\n横道\t334273\n急性盆腔炎\t334274\nstealing\t334275\n鲜\t334276\nJCG\t334277\n张禹\t334278\n缓解期\t334279\n洹\t334280\n过五关斩六将\t334281\n安好\t334282\n丽江三义机场\t334283\nfascinating\t334284\n南京高等职业技术学校\t334285\n稲森\t334286\nBochs\t334287\n佐为\t334288\n金平路\t334289\n双世宠妃2\t334290\n悍然\t334291\n柯登\t334292\n轧钢机\t334293\n小蝌蚪\t334294\nXtecher\t334295\ncut版\t334296\n车迷\t334297\nPromises\t334298\n斗蟹\t334299\n配房\t334300\n北极星光伏商务通\t334301\n解释\t334302\n147个\t334303\n比乐\t334304\n各学科\t334305\n王者荣耀吕布\t334306\n4.0.1\t334307\n战复\t334308\n碳晶地暖\t334309\nLYRICS\t334310\n
崂山景区\t334311\n巴渝\t334312\n撂倒\t334313\n3224\t334314\n创业投资基金\t334315\n达必佳\t334316\n气体探测器\t334317\n盐酸米诺环素胶囊\t334318\n南京高铁站\t334319\n蒸压砂加气混凝土砌块\t334320\n海螺\t334321\n别克\t334322\n高涛\t334323\n385\t334324\n法雷奥西门子\t334325\nKarla\t334326\nEML\t334327\n田国强\t334328\n夜倾情\t334329\n哈里凯恩\t334330\n哪地\t334331\n默陌\t334332\nfish\t334333\n_天一\t334334\n鱼友\t334335\n网谷\t334336\n村卫生室\t334337\n移动破碎机\t334338\n少囧\t334339\n奔驰a\t334340\n安监总厅\t334341\nKML\t334342\n王政君\t334343\n马建\t334344\n桃花依旧笑春风\t334345\n拓印\t334346\n字令\t334347\n新年\t334348\nipad浏览器\t334349\nweblogic\t334350\nConduit\t334351\nKansai\t334352\n和你一样\t334353\n陕西煤业化工建设(集团)有限公司\t334354\n青岛三中\t334355\n密封盖\t334356\nsafengine\t334357\n42首\t334358\nasic\t334359\n柳公权\t334360\n柚柚\t334361\nurdu\t334362\nCocowool\t334363\n5960\t334364\n赤血魔剑\t334365\n大汤\t334366\n玫\t334367\n纽约肯尼迪机场\t334368\n鄙人\t334369\n小数的意义\t334370\n汉庭\t334371\n合建\t334372\n西外\t334373\nS级\t334374\ndff\t334375\nceline\t334376\n马金\t334377\n蒙顶山\t334378\n圣火令\t334379\n17025\t334380\n懒人包\t334381\n边缘性\t334382\n隐姓埋名\t334383\n考题库\t334384\n过誉\t334385\n税法二\t334386\nDN100\t334387\nlsnrctl\t334388\n实用心理学\t334389\n双基\t334390\n这两天\t334391\n肇庆市商务局\t334392\n籽岷\t334393\n搜房博客\t334394\n冰棍\t334395\n商务局\t334396\n回归函数\t334397\n电视开关\t334398\n合伙企\t334399\nLSR\t334400\n骨穿\t334401\nopensns\t334402\n226\t334403\n腾飞\t334404\n带动\t334405\nuVision4\t334406\n互金\t334407\n限位开关\t334408\n官路\t334409\n碎碎念\t334410\nJN\t334411\n巴士LOL_巴士英雄联盟\t334412\n小翠\t334413\nsumail\t334414\n一剪\t334415\n贻误\t334416\n东京成田国际机场\t334417\n桃草\t334418\nimplements\t334419\n爱大爱\t334420\n批量生产\t334421\n印度德里\t334422\n顺辉瓷砖\t334423\n98K\t334424\n广州市农业局\t334425\nstarfall\t334426\n三生撸啊撸\t334427\nhta\t334428\n郾城区\t334429\n血清人绒毛膜促性腺激素\t334430\n几阶\t334431\n第128集\t334432\n三鼎\t334433\n高速公路管理处\t334434\n唐英\t334435\n刘一帆\t334436\n深圳世界之窗\t334437\n香港移动\t334438\n豫园街道\t334439\nRESTAURANT\t334440\n无月\t334441\n英寸\t334442\n盖房网\t334443\n显示页\t334444\n怒放的生命\t334445\n超买\t334446\n慕容烨\t334447\n大西高铁\t334448\n刑事侦缉档案2\t334449\nsediment\t334450\n对卷\t334451\n周洁琼\t334452\n莺飞草长\t
334453\nac米兰吧\t334454\n边缘性行为\t334455\njeffrey\t334456\n伞端\t334457\n蒲公英tt\t334458\n随影\t334459\n百佳\t334460\n中华人民共和国国务院新闻办公室\t334461\n党史\t334462\n雪津\t334463\nwp8\t334464\nmjorcen\t334465\n驰游\t334466\n木厂\t334467\n全株\t334468\n2000次\t334469\n武汉东\t334470\n硅烷化\t334471\n淘宝c店\t334472\n离心机\t334473\n著作权案\t334474\n电动力\t334475\n校庆网\t334476\n烧伤膏\t334477\ncifar-10\t334478\n中企动力\t334479\n第多少周\t334480\n弧长\t334481\n蒲宁\t334482\n八友\t334483\n系统架构师\t334484\n眉毛\t334485\nRAMOS\t334486\n艾萨拉\t334487\n600807\t334488\n水岛\t334489\n猪友\t334490\n唐家岭新城\t334491\nLining\t334492\n正誉\t334493\n人类的老师\t334494\n头风\t334495\n汤老师\t334496\n项燕\t334497\nmine\t334498\n中兴光猫\t334499\n三万五\t334500\n三级\t334501\n武汉黄陂政府\t334502\nwow60级\t334503\n钱江源\t334504\ntransient\t334505\n志稿\t334506\n瑶光\t334507\n市况\t334508\nmeeming\t334509\n天弘\t334510\n盛达\t334511\n外文出版社\t334512\n_眼科频道_健客网\t334513\n中南世纪锦城\t334514\n郡王\t334515\n盾之勇者成名录\t334516\n左氧\t334517\n体测仪\t334518\n痴母\t334519\ncentOS7\t334520\n奔忙\t334521\n冷水滩区\t334522\n爱玛雪豹\t334523\n兰州商学院\t334524\n方向盘\t334525\n若只\t334526\n泥地\t334527\n常州市武进区政府\t334528\n等差数列\t334529\n重型货车\t334530\n苍穹\t334531\n刺死\t334532\nBROCADE\t334533\n太郎\t334534\nsemi\t334535\nActually\t334536\ncerts\t334537\n巽卦\t334538\n胃脘\t334539\n姉弟\t334540\nERP管理系统\t334541\n到位率\t334542\n招商银行企业银行\t334543\n小松菜奈\t334544\nc轮\t334545\n110分钟\t334546\n土屋圭市\t334547\n主体性\t334548\nfourth\t334549\n1:72\t334550\n李七庄\t334551\n经念\t334552\n共享池\t334553\n傻柱\t334554\n复出口\t334555\ncyl\t334556\n统治阶级\t334557\n纪昌学射\t334558\n义乌市人事劳动社会保障局\t334559\n桐华\t334560\n领秀慧谷\t334561\n吸力\t334562\n唐继尧\t334563\n黏度\t334564\n20台\t334565\n真纯\t334566\ndji\t334567\n晨光生物\t334568\nS5500\t334569\n2017年5月6日\t334570\n火车\t334571\n航空总医院\t334572\n保科\t334573\n电热器\t334574\nsensitivity\t334575\n4g版\t334576\n中联泵业\t334577\n36名\t334578\n康熙帝\t334579\n留香\t334580\n五任\t334581\n雨霏\t334582\n电墙\t334583\n斷\t334584\n13.1.1\t334585\n涩爱\t334586\n法人库\t334587\n承兑人\t334588\n聚币\t334589\n驭胜S350论坛_汽车之家论坛\t334590\n二环东路\t334591\n陨落\t334592\n空间站\t334593\n瑞星卡卡\t334594\n细菌性阴道病\t334595\nsubString\t334596\n
英雄故事\t334597\n中纪委监察部\t334598\n巴舒亚伊\t334599\n核磁共振\t334600\n伊拉贡\t334601\n上海圆通快递\t334602\n生物产业\t334603\n128期\t334604\n2301\t334605\n剧本\t334606\n铺就\t334607\n尤克里里谱&教学|Ukulele尤克里里小站\t334608\nrpcs\t334609\nFibre\t334610\nGenomics\t334611\nswept\t334612\n35周\t334613\n360网\t334614\n神界3吧\t334615\n群英会\t334616\n梦战\t334617\n12368\t334618\nRin\t334619\n西安日报社\t334620\n杨雄\t334621\n隔空投\t334622\nmonk\t334623\n纯粮\t334624\n洪江\t334625\n终物语\t334626\n临邑县\t334627\n神会\t334628\nHank\t334629\nicicle\t334630\n吕浦\t334631\nImageMagick\t334632\n土豆烧牛肉\t334633\n大坪山\t334634\n小牛血去蛋白提取物\t334635\n宁东镇\t334636\n九州证券\t334637\n勇者斗恶龙6\t334638\n兰科\t334639\n黄淮学院\t334640\nZurich\t334641\n0.2.4\t334642\n悦美国际\t334643\n白龙王\t334644\n厦门软件职业技术学院\t334645\n社区卫生服务中心\t334646\nCARD\t334647\n花吻\t334648\n亚运城\t334649\n逆战僵尸猎场\t334650\nByteBuf\t334651\n弓步\t334652\n宋都股份\t334653\n密不可分\t334654\nsharding-jdbc\t334655\n大岛薰\t334656\n大华银行\t334657\n爆旋陀螺\t334658\n纸皮砖\t334659\n1.1万亿\t334660\n威特\t334661\n卡尔顿\t334662\n炸麦\t334663\n存储器\t334664\n咔唑\t334665\n使命感\t334666\n第n个\t334667\n触摸开关\t334668\n宝骏560论坛_汽车之家论坛\t334669\nUVLayout\t334670\n拉拉手\t334671\n榆林职业技术学院\t334672\n武威_武威政府网\t334673\nkpm\t334674\nbeta\t334675\n朵云\t334676\n永远的朋友\t334677\n田雪松\t334678\nTian\t334679\n城市规划展览馆\t334680\n收费亭\t334681\n新鸿基\t334682\n烤排骨\t334683\nIsolated\t334684\n断保\t334685\n网易支付\t334686\nweiji\t334687\n骚人\t334688\n与会者\t334689\ncanot\t334690\n送予\t334691\n生命权\t334692\n凡普\t334693\nCEI\t334694\n颠峰\t334695\n气体过滤器\t334696\n录频\t334697\n君悦府\t334698\nEspace\t334699\n箱批\t334700\n越来越远\t334701\n富临\t334702\n高锟\t334703\n无限曙光\t334704\n云量\t334705\n犬妖\t334706\n冷拔机\t334707\nlibgd\t334708\nCAE联盟\t334709\n越野拉力赛\t334710\n百莲凯\t334711\n威海市人民政府办公室\t334712\n601229\t334713\n2.25%\t334714\nfsr\t334715\n中山大学附属第八医院\t334716\n桥涵\t334717\n克雷格\t334718\nvoyeur\t334719\n元始天尊\t334720\n版权登记门户网\t334721\n放线菌\t334722\n涞源县\t334723\n王槟榔\t334724\n5301\t334725\n顺带\t334726\n853\t334727\n亲民\t334728\n面子\t334729\n第126章\t334730\n九阴真经手游\t334731\n瑞立\t334732\n小宇\t334733\n传奇人物\t334734\n结合律\t334735\n花季\t334736\nsql200
8\t334737\n会计从业资格考试\t334738\n98号\t334739\ncncnki\t334740\n800亿美元\t334741\nOptimize\t334742\n99件\t334743\n蜀南\t334744\njsTree\t334745\n泛白\t334746\nQueryString\t334747\n2016年\t334748\n红外线灯\t334749\nKinetic\t334750\n上海中侨职业技术学院\t334751\n喜新厌旧\t334752\n缉捕\t334753\n暴走萝莉\t334754\nGenerative\t334755\n新通留学\t334756\n一汽技术中心\t334757\n吉祥兔\t334758\n句法\t334759\nMakeup\t334760\n2013-2030年\t334761\n县民政局\t334762\n我的天劫\t334763\n少年正义联盟\t334764\nErazer\t334765\n王者荣耀铠\t334766\nfluid\t334767\n98_\t334768\n卡密\t334769\n先行版\t334770\nPLANT\t334771\n第三十三\t334772\n芒果台\t334773\n公分母\t334774\n电液推杆\t334775\n柳美和子\t334776\n新立得\t334777\n学法网\t334778\n宁波市环保局\t334779\n必需品\t334780\nwin7输入法\t334781\n正相关\t334782\n鞋印\t334783\n一千多年\t334784\n银牌\t334785\n水果\t334786\n道童\t334787\n海上世界\t334788\n草体\t334789\n广州工商银行\t334790\n王进喜\t334791\n线格\t334792\n马慧\t334793\n湘江新区\t334794\n李艳丽\t334795\nsocieties\t334796\nzzz\t334797\n怀恋\t334798\n杀伤性\t334799\ntried\t334800\n热播\t334801\n低地\t334802\n北京2022冬奥会\t334803\n中文汉\t334804\n儒家网\t334805\ncopula\t334806\n电力定额\t334807\n爱趣\t334808\nSPARQL\t334809\n历劫\t334810\n应勇\t334811\n编辑类\t334812\ngreetings\t334813\n淘宝闲鱼\t334814\n囧雪诺\t334815\n甜宠文吧\t334816\n皇太子\t334817\n上市公\t334818\nDataTables\t334819\nAB面\t334820\n中银基金管理有限公司\t334821\n缠论\t334822\n消防水炮\t334823\n谱号\t334824\n生化危机0\t334825\n太真外传\t334826\n外来户\t334827\n航海学\t334828\n3吨\t334829\n33座\t334830\nYK\t334831\n武建\t334832\n铜匠\t334833\nmipi\t334834\n国土资源所\t334835\n简体中文\t334836\nSkin\t334837\n王者荣耀安琪拉\t334838\n白馒头\t334839\nxpro\t334840\n市纪委\t334841\n连长\t334842\n徐庶\t334843\n纳污\t334844\n证券化\t334845\n圆帽\t334846\n株洲教育网\t334847\n周有光\t334848\n山师附中\t334849\nv-t\t334850\n冷笑话\t334851\nMac/Win版\t334852\n拳皇97吧_\t334853\n11\t334854\n凡客\t334855\n青连铁路\t334856\n来过\t334857\n无从\t334858\n奇米\t334859\n深圳福田汽车站\t334860\n女监\t334861\n小泽\t334862\n东风标致2008\t334863\nMOKO\t334864\nVoting\t334865\n乐山电力\t334866\n忍野\t334867\n北卡教堂山\t334868\nConfusion\t334869\nmelogin\t334870\n摩登家庭第九季\t334871\n拆装机\t334872\n海淀区人力资源和社会保障局办事服务大厅_北京海淀区政府网\t334873\n16类\t334874\n5800万\t334875\n凰权\t334876\
n泰米尔纳德邦\t334877\nrc2\t334878\n中华人民共和国禁毒法\t334879\n匡时\t334880\n张路\t334881\nUV镜\t334882\n奥格瑞姆\t334883\n前序遍历\t334884\n7.5%\t334885\n棒球场\t334886\n垂涎三尺\t334887\n20170729\t334888\n宽比\t334889\n朱晓辉\t334890\nTOP200\t334891\n展贸\t334892\nz370pro4\t334893\n销声匿迹\t334894\n金桥出口加工区\t334895\n2018年12月\t334896\n5.1劳动节\t334897\n难分\t334898\ntkt\t334899\n业户\t334900\n老丈人\t334901\nuc小说网\t334902\nN14U\t334903\n中华人民共和国公安部\t334904\n仓储物\t334905\n松红梅\t334906\nCracker\t334907\n僾\t334908\n氢溴酸右美沙芬片\t334909\n124周年\t334910\npasa\t334911\n千夏\t334912\n1474\t334913\n热水袋\t334914\n孕妻\t334915\n万民\t334916\n龙之信条\t334917\n湖北省农村信用社\t334918\n删行\t334919\nVegan\t334920\n气象部门\t334921\n秦岚\t334922\n开发性\t334923\ne06d7363h\t334924\n多栏式\t334925\n表彰\t334926\n星杰\t334927\nCFX\t334928\n溅出\t334929\ndbasdk\t334930\nachieving\t334931\n泰然九路\t334932\n第六轮\t334933\nftps\t334934\n中华人民共和国电力\t334935\nppt1\t334936\n吴兴教育在线\t334937\n扥\t334938\n舞力觉醒\t334939\n邱老师\t334940\nS3100\t334941\n七堂\t334942\nreg52\t334943\n图片浏览器\t334944\nFen\t334945\nneurology\t334946\n开封北\t334947\n三棵树涂料股份有限公司\t334948\n宽屏版\t334949\n科学家\t334950\nshiguang\t334951\n100nf\t334952\n健谈\t334953\n石岭\t334954\n办公版\t334955\n陈灵富\t334956\n绝世神医:腹黑大小姐\t334957\nunlocker\t334958\n空天使\t334959\n连者\t334960\n金雪炫\t334961\n小学生们\t334962\nVR\t334963\n消夏\t334964\n文化界\t334965\n罢了\t334966\n佳尼特\t334967\ncreep\t334968\n夏凡\t334969\n东方论坛\t334970\n铜仁网\t334971\n真天\t334972\n航天纪念钞\t334973\n秦牧\t334974\n温妮\t334975\n蔡康妮\t334976\n禁军\t334977\n雷洛传\t334978\n色先锋-影音先锋av资源-色先锋影音看片-色先锋\t334979\n宿迁市统计局\t334980\nCAMEL\t334981\n贺银\t334982\n重返地球\t334983\n湿量\t334984\n球池\t334985\n第11号\t334986\n王者荣耀18888\t334987\n企业内部控制基本规范\t334988\n麻婆豆腐\t334989\n参宿\t334990\n放眼\t334991\n晶钢\t334992\n毛哥\t334993\n弘扬\t334994\n潮汐\t334995\nlibg\t334996\n陪着我\t334997\n袁老四\t334998\nmysterious\t334999\n匮\t335000\n1品\t335001\n注册安全工程师考试\t335002\n过桥\t335003\n借支单\t335004\nfeather\t335005\n三国群英传6\t335006\n齐齐哈尔市政府\t335007\n草里\t335008\n街拍照\t335009\n幕府\t335010\n有机蔬菜\t335011\nidb\t335012\n健身类\t335013\n油厂\t335014\n广州地铁4号线\t335015\n北冥子\t335016\n天子\t3350
17\nwrestlemania\t335018\nSobel\t335019\n王五\t335020\nidc论坛吧\t335021\n马男波杰克\t335022\nLove笨笨猪\t335023\nbasemap\t335024\n英语专业\t335025\n思廉日\t335026\n明尼苏达\t335027\n被堵\t335028\nFranson\t335029\n致良知\t335030\ntulip\t335031\n稳岗补贴\t335032\n照牛排\t335033\nAmore\t335034\n谷氨酰胺肠溶胶囊\t335035\n印第安纳琼斯\t335036\n隋唐\t335037\n中敏\t335038\nA7M2\t335039\n取暖炉\t335040\n68篇\t335041\n军工股\t335042\n复盛\t335043\nfoodie\t335044\n杭州路\t335045\nPenalty\t335046\n大连移动\t335047\n存托凭证\t335048\n物攻\t335049\n净片\t335050\n班上\t335051\n降调针\t335052\n北京市国有文化资产监督管理办公室\t335053\nporhub\t335054\n穆罕穆德\t335055\n吸污车\t335056\n过轨\t335057\njfx\t335058\nWDZ\t335059\n2625\t335060\n急需\t335061\nclickonce\t335062\n柳絮\t335063\n张惠言\t335064\n奥伊米亚康\t335065\n一百块\t335066\n蛋壳姬\t335067\n腹腔\t335068\n连山\t335069\n425i\t335070\n梦都\t335071\n陆桥\t335072\n间充质干细胞\t335073\n高利率\t335074\nrealization\t335075\n宫合\t335076\n迷案\t335077\nsqlyog\t335078\n明珠大道\t335079\n两院院士\t335080\n墙梯\t335081\ncapital\t335082\n裘德\t335083\n绝地求生刺激战场服饰币\t335084\niPS\t335085\nJ10\t335086\n一个接一个\t335087\n数商\t335088\n正天\t335089\n中智关爱通\t335090\n步速\t335091\n明途\t335092\nNEED\t335093\n解雇\t335094\nreebok\t335095\n5363xxxx\t335096\n想不通\t335097\n信审\t335098\nfcitx\t335099\nLOTTE\t335100\n30克\t335101\nCTH\t335102\ndbm\t335103\n圣犹达\t335104\n爱女\t335105\n辽宁自贸区\t335106\n杂记\t335107\nric\t335108\n花园洋房\t335109\njriver\t335110\n小爱mini\t335111\n放疗科\t335112\n云南省文化厅\t335113\n小柱\t335114\n前列舒通胶囊\t335115\n秘书证\t335116\n狗肉馆\t335117\nf218\t335118\n万科金域蓝湾\t335119\n微特\t335120\n沪太路\t335121\n洪合\t335122\n翠亨\t335123\nWarp\t335124\n第一种\t335125\n控制网\t335126\n12|\t335127\n奥尼尔\t335128\n70本\t335129\n文典\t335130\n框架柱\t335131\n中国三军仪仗队\t335132\n全币卡\t335133\n湛江市市\t335134\n百信众联\t335135\nhuagong.huangye88.com\t335136\n泔水猪\t335137\nG17\t335138\n至死不渝\t335139\nMate9\t335140\n悦联\t335141\n商演\t335142\n矿主\t335143\n丰沛\t335144\n酸液\t335145\nFaker\t335146\n河海大学文天学院\t335147\n一首醉人的歌\t335148\nmikrotik\t335149\n史蒂夫乔布斯\t335150\n中央常委\t335151\n二项式定理\t335152\nSES\t335153\n幼龙\t335154\n抗组胺药\t335155\n缝纫工\t335156\n文心\t335157\n锥度\t335158\n林子方\t3351
59\n活见鬼\t335160\n昵图网nipic.com\t335161\n咧嘴\t335162\n即热式\t335163\n重罚\t335164\n南\t335165\n裏切\t335166\n残值\t335167\n流程化\t335168\n龙船调\t335169\n熏艾\t335170\n致命框架2\t335171\nkatrina\t335172\nios11.3\t335173\ncxfreeze\t335174\n第7位\t335175\n卡洛\t335176\n绝密543\t335177\n电动闸阀\t335178\n潘达利亚\t335179\n万盛区\t335180\n山东大学材料科学与工程学院\t335181\n险峰\t335182\n认领\t335183\nmobo\t335184\n塔克拉玛干沙漠\t335185\n艾丽斯\t335186\n凤凰通\t335187\n100亿美元\t335188\n无锡尚德太阳能电力有限公司\t335189\n湘东\t335190\n溢流口\t335191\n六祖坛经\t335192\n谢亦\t335193\n吞噬\t335194\n胸章\t335195\n翟营大街\t335196\n府绸\t335197\n雪芽\t335198\nzedd\t335199\n曲福田\t335200\n网络经济学\t335201\n金羽\t335202\n超爽\t335203\n开盘机\t335204\n2300\t335205\nELIXIR\t335206\n龙标\t335207\n阎王不高兴\t335208\n考聘\t335209\n上学路上\t335210\n孤独的牧羊人\t335211\n5.2.10\t335212\n交通规则\t335213\nsvnserve\t335214\n轴功率\t335215\n偶数项\t335216\n姜洋\t335217\n辽中\t335218\n不发达\t335219\n零星\t335220\n18篇\t335221\n红袍\t335222\n雷打不动\t335223\nEmpireCMS\t335224\n公理化\t335225\n白话二十四史\t335226\nJuventus\t335227\n700_\t335228\nlacues\t335229\ncement\t335230\n校园一卡通\t335231\n6es7\t335232\nPilgrim\t335233\n太肉\t335234\n河南能源化工集团有限公司\t335235\n中消防\t335236\nDorothy\t335237\n高庄\t335238\n工作面\t335239\n热词\t335240\nfurthermore\t335241\n容器类\t335242\n伊秀美体网\t335243\n有仇必报\t335244\nIDG\t335245\n下凹式\t335246\nSOG\t335247\n002049\t335248\n782\t335249\n加息\t335250\n维密模特\t335251\n杜甫\t335252\n1155针\t335253\n130t\t335254\n之二十三\t335255\n一科\t335256\n1629\t335257\n中医中药网\t335258\n臭味\t335259\n22处\t335260\n庶人\t335261\n安监总\t335262\n大境\t335263\nFT中文\t335264\n要件\t335265\n邓宇涵\t335266\n青岛保税区\t335267\n暗夜猎手\t335268\n凤凰岭\t335269\n子宫癌\t335270\n约数\t335271\n长长\t335272\n寄\t335273\n巴黎市\t335274\n威廉王子\t335275\nslot\t335276\n虚惊\t335277\n闪躲\t335278\npolycom\t335279\npropel\t335280\nug8.0\t335281\n999天\t335282\nJre\t335283\n元阳梯田\t335284\n淡蓝玫瑰\t335285\n阿清\t335286\n空降\t335287\n邓涵文\t335288\nSUCCESS\t335289\n青春集结号\t335290\nOracle时报错ORA\t335291\n地垫\t335292\n蔚山\t335293\nSQUARE\t335294\n双线性\t335295\n麒麟980\t335296\n无租\t335297\n净果\t335298\n爱拍原创\t335299\n勒流\t335300\n喀纳斯\t335301\nvocabulary\t33
5302\n红霉素软膏\t335303\n机务员\t335304\n虚位\t335305\ninodes\t335306\nupvc\t335307\n8辆\t335308\n五休\t335309\nSCCM\t335310\n95期\t335311\n斯大林格勒\t335312\n死车\t335313\njoyful\t335314\n广州本田\t335315\n华为c8816\t335316\n合安高速\t335317\n浴\t335318\n厦工楚胜\t335319\n儒道\t335320\nE07\t335321\n拉圾\t335322\n匿名化\t335323\n周春雨\t335324\nliteral\t335325\n克百威\t335326\n佛山地铁4号线\t335327\n土壤重金属污染\t335328\n戴立忍\t335329\nkubectl\t335330\n商丘市\t335331\n焦红\t335332\n%2d\t335333\nUAE\t335334\nigbinary\t335335\n66期\t335336\nDefine\t335337\nyuri\t335338\n康泽\t335339\n长安欧尚A800论坛\t335340\nMockito\t335341\n马善祥\t335342\nISO17025\t335343\n逆商\t335344\nGBJ\t335345\nz370-a\t335346\n狗尾巴草\t335347\nExcel排序\t335348\npercy\t335349\ntwinkle\t335350\n外汇银行\t335351\n网址\t335352\n走路上学\t335353\ngts250\t335354\n大寨村\t335355\n群值\t335356\n旧房\t335357\nChalk\t335358\n520.com\t335359\n轮速传感器\t335360\n南通房产信息网\t335361\nHPV病毒\t335362\n杨国福\t335363\n粤菜\t335364\n韩建河山\t335365\n迈向\t335366\nUncaught\t335367\n微电脑\t335368\n中频治疗仪\t335369\n火球术\t335370\n邓禹\t335371\n包头火车站\t335372\n赵阳\t335373\nWherever\t335374\n发夹\t335375\n1000型\t335376\n苗岭路\t335377\n20150615\t335378\n上海人民公园\t335379\n史克威尔艾尼克斯\t335380\n赵海存\t335381\n直线型\t335382\ncarrying\t335383\nNHL\t335384\n74岁\t335385\n秋之回忆4\t335386\n135平方\t335387\n真大\t335388\n赵冀龙\t335389\n95橙武\t335390\n跑鞋\t335391\n川藏\t335392\n肌糖\t335393\n会子\t335394\n内尔贝\t335395\n小煜\t335396\n李官\t335397\n王湾\t335398\n海尔绿城全运村\t335399\n不锈钢杯\t335400\n陈局长\t335401\n误治\t335402\nchushi\t335403\n北京交大\t335404\n425\t335405\n毛虫\t335406\n汉匈决战吧\t335407\n广东省版权局\t335408\n孙渣\t335409\n威高集团有限公司\t335410\n爱在西元\t335411\n马晓年\t335412\n淘宝优惠券\t335413\n1995年\t335414\n德国驻华大使馆\t335415\n乔治·奥威尔\t335416\n钼靶\t335417\n83条\t335418\n白头鹎\t335419\nCMA\t335420\n全波整流电路\t335421\n水肿\t335422\n贝鲁奇\t335423\n编目\t335424\n浊音\t335425\n一不留神\t335426\n湖南招聘网\t335427\n齐王\t335428\n苏丹红\t335429\n@Resource\t335430\n蒙层\t335431\n寡肽\t335432\n狂犬病\t335433\n魅族mx5\t335434\nBaa\t335435\n堪用\t335436\n香港离岸公司\t335437\nWarframe战争\t335438\n2件\t335439\n海口观澜湖\t335440\n话语\t335441\n烟台日报\t335442\n环戊二烯\t335443\n正整\t33544
4\n美洲豹\t335445\n湿人\t335446\nPLA\t335447\n谭元元\t335448\nLAYER\t335449\n古力杨澜\t335450\n敬业集团\t335451\n二阶矩\t335452\n神武天帝\t335453\n_美骑论坛|BIKETO\t335454\n套弄\t335455\n掌珠\t335456\nChains\t335457\n杉达\t335458\n镀锡\t335459\n地方志\t335460\n超信\t335461\n_卷\t335462\n文学类\t335463\nbizhi\t335464\n野犬\t335465\nX-T20\t335466\n双十条\t335467\n4月18号\t335468\n英雄池\t335469\n社会生\t335470\n系统分析\t335471\n毛刷\t335472\n发绳\t335473\n自打\t335474\n多元线性回归分析\t335475\n一线工作法\t335476\n山西省石楼县人民政府\t335477\n谷果步\t335478\n十堰市\t335479\n晶体结构\t335480\n棉袄\t335481\nxvideoscom\t335482\nOPENSSL\t335483\n中山移动\t335484\n最终幻想15口袋版\t335485\n优璇\t335486\n茶苑\t335487\n宁波市人民政府办公厅\t335488\n彭玉麟\t335489\n2018年1月17日\t335490\n刘伟男\t335491\n天衍录\t335492\n棚顶\t335493\nwakfu\t335494\nHea\t335495\n永恒传奇\t335496\nraft\t335497\n科学化\t335498\n唐康林\t335499\nMDR-1ABT\t335500\n缤智论坛_汽车之家论坛\t335501\n硝酸异山梨酯片\t335502\n第4回\t335503\n华威大学\t335504\n結婚\t335505\n夏草\t335506\n48h\t335507\n中科院研究生院\t335508\nlps\t335509\n墙件\t335510\n郭源潮\t335511\n路床\t335512\nTFTP\t335513\nwebtoon\t335514\n美人骨\t335515\n烦琐\t335516\n经堂\t335517\n厉旭\t335518\nzz91再生网\t335519\n纵目科技\t335520\n国务院台湾事务办公室\t335521\n安腾\t335522\n24英寸\t335523\n锦标\t335524\n冠\t335525\n红黄\t335526\n重庆中国国际旅行社\t335527\nQU\t335528\n魍美\t335529\n刘据\t335530\nrtc\t335531\n上海外国语大学附属外国语学校\t335532\n暗黑福瑞达\t335533\n量化宽松货币政策\t335534\n导磁\t335535\n附子\t335536\n谦祥\t335537\n780\t335538\n十二战记\t335539\narise\t335540\n县衙\t335541\n神泪\t335542\n高维数组\t335543\n邓榕\t335544\nFault\t335545\n内参\t335546\nreng\t335547\n蓝光碟\t335548\n账簿\t335549\nuv平板打印机\t335550\n极品五笔输入法\t335551\n胖胖胖\t335552\n无可用\t335553\n樱桃红之袖珍妈妈\t335554\n承山穴\t335555\nlint\t335556\n小意思\t335557\n275号\t335558\n三原县\t335559\nnero8\t335560\n10万平方米\t335561\n我的朋友们\t335562\nGDS\t335563\n丁老师\t335564\n南都物业\t335565\n神鸟\t335566\n断后\t335567\n月种\t335568\n南山铝业\t335569\nRi\t335570\n仰恩大学\t335571\n何林\t335572\n张三\t335573\n柏林墙\t335574\n应试教育\t335575\n腾讯游戏平台\t335576\n名汇\t335577\n6th\t335578\n七句\t335579\n金正男\t335580\n几十亩\t335581\n过硫酸钾\t335582\n录相\t335583\n济南市国家税务局\t335584\nRESOURCES\t335585\n新华路街道\t335586\n好好玩\t3355
87\nCP3\t335588\n槟城\t335589\nS400\t335590\n速录\t335591\n冷凝器\t335592\n西部数据\t335593\nJock\t335594\n广东以色列理工学院\t335595\n567\t335596\n淬\t335597\n新蜂\t335598\n伐谋\t335599\n维E\t335600\n校平机\t335601\n兴隆大街\t335602\n赠给\t335603\n飞机杯\t335604\nBarbara\t335605\n重庆三峡中心医院\t335606\n月考\t335607\n大屏\t335608\nchuncn\t335609\n航炮\t335610\n融创玖樾\t335611\nwp7吧_\t335612\n纳豆激酶\t335613\ncytus2\t335614\n高速摄像机\t335615\n女孩儿\t335616\n滨州市政府\t335617\n春篇\t335618\n偷渡者\t335619\n万古凌天传说\t335620\n石国鹏\t335621\n刘向阳\t335622\n郑洁\t335623\npvc扣板\t335624\n创青春\t335625\n哈弗\t335626\n王不留行\t335627\n己酉\t335628\nNewLife365\t335629\n招工\t335630\n雷克萨斯RC\t335631\n秀女\t335632\nonsubmit=\t335633\nsoundwire\t335634\nXX集团公司\t335635\nFloodlight\t335636\n4301\t335637\n普惠性\t335638\n聚光镜\t335639\n不快\t335640\nbaiyun\t335641\n浙江三科线缆有限公司\t335642\n优博讯\t335643\n有机肥\t335644\n_汽配人网\t335645\n磨工\t335646\njaymes\t335647\n张新发\t335648\n满江\t335649\nFLIR\t335650\n排阻\t335651\n苔丝\t335652\n白铜\t335653\n茶座\t335654\nXPosed\t335655\n王双\t335656\n房天下美国房产网\t335657\nmyeclipse10\t335658\nPylint\t335659\ngrills\t335660\n50兆瓦\t335661\n重庆妇科医院\t335662\n鞋服\t335663\nyell\t335664\n报告\t335665\n九年级英语\t335666\n冷藏车\t335667\n鹰兽\t335668\nentre\t335669\n黑啤酒\t335670\n湖光岩\t335671\n折叠箱\t335672\n2018年4月8日\t335673\n四方\t335674\n零卡\t335675\n奔腾G4560\t335676\n一手房\t335677\n宠物博览会\t335678\n桃花妖\t335679\n分手时\t335680\n剑娘\t335681\n金额\t335682\n药圈\t335683\n蚺\t335684\nVBF\t335685\n中国有色矿业集团有限公司\t335686\n梁筋\t335687\n神盾局特工第四季\t335688\nSections\t335689\n仁义道德\t335690\n431\t335691\n蔻依\t335692\n阿加雷斯特\t335693\n流动资产周转率\t335694\n奥斯曼土耳其\t335695\n深圳中心公园\t335696\nfai\t335697\n颜色块\t335698\nWol\t335699\ntwenty\t335700\n0426\t335701\n【喵\t335702\n梅洛克\t335703\n4元\t335704\n血瞳\t335705\n2018一级\t335706\n善学\t335707\nwilling\t335708\n美牙仪\t335709\nsik\t335710\n60400349\t335711\n辅舒酮\t335712\n超轻\t335713\nsignature\t335714\nhaedu\t335715\nlabelshop\t335716\nBig磅\t335717\n中国太平洋人寿保险股份有限公司\t335718\n软质\t335719\n陶钰玉\t335720\n冰帽\t335721\n党中\t335722\nacome\t335723\n酒行\t335724\n门特\t335725\n这一位\t335726\n招标采购详\t335727\n无酒伤\t335728\n
七书\t335729\n中北镇\t335730\n20年代\t335731\n文明型\t335732\n|万\t335733\n对刀\t335734\n夹道\t335735\n考摩\t335736\n反作用力\t335737\n雷切尔\t335738\n甲袋\t335739\n归去来兮\t335740\n临江\t335741\nfloefd\t335742\n哦集\t335743\n张汉熙\t335744\nEleven\t335745\n跨座式\t335746\n南京地铁\t335747\n3厢\t335748\n早更\t335749\n八核处理器\t335750\n田宁\t335751\n4.37G\t335752\n1264\t335753\n搬去\t335754\n佛教协会\t335755\n露琪亚\t335756\n旷野之息\t335757\n黑眼球\t335758\n轿跑\t335759\n拉场戏\t335760\n军国主义\t335761\n杨天南\t335762\n急匆匆\t335763\n成年人\t335764\nuiautomation\t335765\nXX\t335766\nNetworks\t335767\n危废处理\t335768\n苦役\t335769\n八珍\t335770\n1.75米\t335771\n比斗\t335772\nbyd\t335773\n第三十一\t335774\nFarCry5\t335775\n阳曲\t335776\nCrane\t335777\n中科院遥感所\t335778\n河口区人民政府\t335779\nWillis\t335780\n海南省人民医院\t335781\n晓晓\t335782\n鑫苑置业\t335783\n000511\t335784\n蜀步\t335785\n上海南火车站\t335786\n泥池\t335787\n强的松\t335788\n卡介疫苗\t335789\n保持不动\t335790\n7.1.1\t335791\n起发\t335792\n夜来香\t335793\n死病\t335794\n赤足惊魂\t335795\n龟粮\t335796\n淫霸\t335797\n拼\t335798\n咻咻\t335799\n石竹花\t335800\n道士塔\t335801\n环\t335802\n好少\t335803\n注册房地产估价师\t335804\n武功山\t335805\n荷花奖\t335806\n開發\t335807\n大众途昂\t335808\n双锤\t335809\n巴黎都市\t335810\nwed\t335811\n入海\t335812\n孤贫\t335813\nhazzys\t335814\nJ3\t335815\nbide\t335816\n镇域\t335817\n易如反掌\t335818\n食宿\t335819\n铜块\t335820\n音律\t335821\n创新谷\t335822\n淮安市区\t335823\n洪七公\t335824\n白金版\t335825\n四摄\t335826\n50目\t335827\n至高无上\t335828\n防电\t335829\n文峰股份\t335830\n烈龙\t335831\n纳税额\t335832\n天台\t335833\n比亚迪S6论坛_汽车之家论坛\t335834\nWhisky\t335835\neverthing\t335836\n小秋\t335837\nsml\t335838\n婴儿血管瘤\t335839\n3.14\t335840\n最适\t335841\n羽人\t335842\n动迁安置房\t335843\n甲米\t335844\n葛兰\t335845\n第191集\t335846\n狮峰龙井\t335847\n239元\t335848\nANDROID\t335849\n湛\t335850\nAfrica\t335851\n苏芒\t335852\n附魔台\t335853\n中国崇义县人民政府\t335854\nOkay\t335855\npoe交换机\t335856\n非公开增发\t335857\n中地数码集团\t335858\n27GIF\t335859\n徐霞客\t335860\n旅游部\t335861\nEbola\t335862\nntt\t335863\n晶刚\t335864\nxhamster\t335865\n创元\t335866\n大参林医药集团股份有限公司\t335867\n万能马甲\t335868\n李医生\t335869\nTR200\t335870\n1024x600\t335871\nBUD\t335872\n择优\t335873\nansys命令流\
t335874\n西洪路\t335875\n三明市人民政府\t335876\n思维学\t335877\n三亚航空旅游职业学院\t335878\n2014年元旦\t335879\n亚当斯密\t335880\n磨口\t335881\nMariaDB\t335882\n温州大道\t335883\nchocoolate\t335884\nXposed框架\t335885\n淞沪路\t335886\n杨姓\t335887\n土山\t335888\n制冷\t335889\n数感\t335890\nGuan\t335891\n曼托\t335892\nr400\t335893\n夜壶\t335894\n智跑论坛_汽车之家论坛\t335895\n重装\t335896\n黑下\t335897\n中南林业科技大学\t335898\n扬长\t335899\n六数\t335900\n新疆干部网络学院\t335901\nunistd\t335902\n宿迁\t335903\n6.8级\t335904\n坪上\t335905\n76元\t335906\n江陵县人民政府\t335907\n三笠\t335908\n两河口水电站\t335909\n贵州日报数字报\t335910\nduration\t335911\n锁业\t335912\n岐山臊子面\t335913\n489\t335914\n化工管理\t335915\n长护险\t335916\n目录树\t335917\n征告\t335918\n西林壁\t335919\n中宣\t335920\n御泥坊\t335921\nDEEPCOOL\t335922\n_龙腾小说网\t335923\n郭石夫\t335924\nDifferent\t335925\n上海交通大学医学院附属新华医院\t335926\n奎\t335927\n徒步\t335928\n回数\t335929\n1.7.10_\t335930\n传变\t335931\nVanishing\t335932\n*号\t335933\n梅赛德斯\t335934\n添麻烦\t335935\n南方日报数字报\t335936\n封闭式基金\t335937\n6se70\t335938\n一进\t335939\n彩礼\t335940\naika\t335941\nPlain\t335942\n县委政法委\t335943\n释\t335944\n平板探测器\t335945\n克钦\t335946\n房屋买卖\t335947\n酷迪\t335948\nLucien\t335949\n李延\t335950\n麦林\t335951\n脑叶\t335952\n220v_\t335953\n开县人民政府\t335954\n寿屋\t335955\nnaturals\t335956\n无药可\t335957\n迷你世界吧\t335958\n浮烟山\t335959\n海岛大亨\t335960\ntuck\t335961\n青钢影\t335962\n间距离\t335963\n值得\t335964\n软创\t335965\nill\t335966\n案场\t335967\n珂莱蒂尔\t335968\nETN\t335969\n阿诗\t335970\nweiqing\t335971\n攻关\t335972\nFloppy\t335973\n送排\t335974\nVigor\t335975\n武汉市第三医院\t335976\n20世纪末\t335977\n86年\t335978\n马踏飞燕\t335979\n保信\t335980\n关联方\t335981\nartwork\t335982\n捣腾\t335983\n玩场\t335984\n唐肃宗\t335985\n王霞\t335986\n2欧元\t335987\n好招\t335988\n33次\t335989\n查找器\t335990\n经师\t335991\n充好\t335992\n清高\t335993\nCX-5论坛\t335994\n达龙云电脑\t335995\n刘思\t335996\n城南中学\t335997\ngk888t\t335998\n烤鸡翅\t335999\n北龙湖\t336000\n胸中\t336001\nbpdu\t336002\n尿急\t336003\n恐怖小黑屋\t336004\n醉书楼\t336005\n卡地亚\t336006\n肥胖率\t336007\n乡试\t336008\n经济园\t336009\n涉路\t336010\n向左走向右走\t336011\n模态\t336012\n可降\t336013\ndnSpy\t336014\n八怪\t336015\n吴门\t336016\n单身女\t336017\n证券业协
会\t336018\n控制方\t336019\n东莞电信\t336020\n白墨\t336021\n乌贼娘\t336022\n601012\t336023\n香王\t336024\nindirect函数\t336025\n2一个\t336026\ncifar10\t336027\n余氯\t336028\n吸声材料\t336029\n晁然\t336030\nJeesite\t336031\n北京科锐\t336032\n山下\t336033\n养驴\t336034\n顶天立\t336035\n0.5%\t336036\n都市女孩\t336037\nmetastasis\t336038\n獠牙\t336039\n太白路\t336040\n降世神通\t336041\n卡尔\t336042\n亩\t336043\n白粉虱\t336044\n摊销\t336045\n扩散率\t336046\n虎扑装备中心\t336047\nr36\t336048\nCandence\t336049\n玉蜻蜓\t336050\n情毒\t336051\n7-9月\t336052\n新警\t336053\nDeals\t336054\n330MW\t336055\n水包\t336056\n高攀\t336057\n昆山百事通\t336058\n磊明\t336059\n艾尼\t336060\n如涵\t336061\n两三点\t336062\n圣象\t336063\n凤凰汇\t336064\n回马枪\t336065\n独流镇\t336066\n12.9级\t336067\ntoronto\t336068\n市人社\t336069\n门德斯\t336070\n68张\t336071\n松隆子\t336072\n渝东南\t336073\nIntuos\t336074\n打谱\t336075\n鼎汉\t336076\nF90\t336077\n经济状况\t336078\ngate\t336079\n金典牛奶\t336080\n三体3\t336081\ndentry\t336082\n加标\t336083\ndamei\t336084\nAIE\t336085\njuji\t336086\n陕鼓动力\t336087\n二种\t336088\n王昭\t336089\n男根\t336090\nLineZero\t336091\n环球进口食品网\t336092\n正和恒基\t336093\nTextRank算法\t336094\n蛋疼\t336095\n生态观光园\t336096\npresentation\t336097\n120.0000元\t336098\n小7\t336099\nffmeg\t336100\ninstinct\t336101\n时狱\t336102\n河北省公共资源交易中心\t336103\ni.mx6\t336104\n17项\t336105\n超过24小时\t336106\n环磷酰胺\t336107\n骗不了\t336108\ncsr\t336109\n合肥装饰公司\t336110\n李秀玲\t336111\n前臂\t336112\nDLC2\t336113\n兴发\t336114\n通幽\t336115\n本田东风本田\t336116\n远洋地产\t336117\n家值\t336118\n套卷\t336119\n排风扇\t336120\n睿途教育\t336121\n等级制\t336122\nMIG\t336123\n狗炉石\t336124\n调漂\t336125\ny型过滤器\t336126\n几天一次\t336127\n3DS\t336128\nmatchbox\t336129\n酷比魔方iwork8\t336130\nVaadin\t336131\n6月24日\t336132\n吉祥文化\t336133\n清花\t336134\n电容屏\t336135\ndes\t336136\nDelete\t336137\n萨达姆\t336138\n231号\t336139\n全界\t336140\n卫生院\t336141\n测度\t336142\n千疮百孔\t336143\n长春一汽大众\t336144\n梦想\t336145\nElastic-Job\t336146\n王营镇\t336147\n建成\t336148\n睡懒觉\t336149\nChapter\t336150\n0520\t336151\n戒赌\t336152\n固齿\t336153\n安馨\t336154\n椭圆仪\t336155\n663\t336156\n张雪\t336157\nruoji\t336158\n挤胸\t336159\n李兰迪\t336160\n益虫\t336161\n违和
感\t336162\n滑稽戏\t336163\n福建省经济和信息化委员会\t336164\n肠系膜\t336165\n黑道圣徒3\t336166\n软性\t336167\n金塔县\t336168\nkx2\t336169\n7席\t336170\n硅砂\t336171\n益生碱\t336172\n2016年11月11日\t336173\n伪科学\t336174\n双模\t336175\n星空间\t336176\n五星电器\t336177\n东关\t336178\n精车\t336179\n20170607\t336180\n不可告人\t336181\nreversal\t336182\n粘结强度\t336183\n胆大\t336184\n排球少年\t336185\n血玫瑰\t336186\ngpuz\t336187\n15S\t336188\n描写\t336189\n7批次\t336190\n同享\t336191\n林家成\t336192\n可好\t336193\nlzhou666\t336194\n全媒\t336195\n智人\t336196\n安和桥\t336197\n电液\t336198\n奇瑞集团\t336199\n大众创业万众创新\t336200\nPython3\t336201\n毛地黄\t336202\n让出\t336203\n疯狂快递单号查询\t336204\n35公斤\t336205\n1000平米\t336206\n红耳龟\t336207\ntwisted\t336208\n几例\t336209\n补记\t336210\n赣南日报\t336211\n乳鸽\t336212\n這\t336213\n0359\t336214\n撒玛利亚\t336215\n新疆维吾尔自治区统计局\t336216\n综合单价法\t336217\n窦凤琴\t336218\n炎凉\t336219\n杜文龙\t336220\n暴跌\t336221\n300315\t336222\n7840\t336223\nev200\t336224\nKU\t336225\n高硼\t336226\n瑞幸咖啡\t336227\n头奖\t336228\nMax2\t336229\n弦图\t336230\n契魔者\t336231\nakb48\t336232\n豁然开朗\t336233\njcrop\t336234\nCIPE\t336235\n两面针\t336236\n加出\t336237\n舒同\t336238\n凤凰教育\t336239\n胸针\t336240\n变幻\t336241\nipad助手\t336242\nA330\t336243\nsatisfaction\t336244\nSSP/SMP\t336245\n河北衡水中学\t336246\n思明教育网\t336247\nlvyou\t336248\n双面屏\t336249\n120首\t336250\n小旭\t336251\n西昌市\t336252\n85W\t336253\nwin7/win10\t336254\n快表\t336255\n库存鞋\t336256\n兔展H5\t336257\n祖鲁\t336258\n保定乐居网\t336259\n梦幻克隆\t336260\n浅见\t336261\n多期\t336262\n中单元格\t336263\n锁喉\t336264\n棉粕\t336265\nclicked\t336266\n张悠雨\t336267\n胳膊肘\t336268\n射流泵\t336269\n水葫芦\t336270\n环氧树脂地坪\t336271\naccumulation\t336272\n同致\t336273\nLow\t336274\n流放之路召唤\t336275\n十恶\t336276\n40款\t336277\n宁波市经济和信息化委员会\t336278\n基塘\t336279\n定位语\t336280\n向阳区\t336281\n忏悔者\t336282\n中国文学艺术界联合会\t336283\n沈阳市人力资源和社会保障局\t336284\n几元\t336285\ndragonfly\t336286\nv460\t336287\n秘书节\t336288\n阳极氧化\t336289\n异国恋\t336290\n混凝\t336291\n兰州市教育局\t336292\n五红\t336293\n建筑施工\t336294\n经营协议书\t336295\n嘉兴站\t336296\n通行证\t336297\n东坪镇\t336298\n小尾寒羊\t336299\n独显运行\t336300\nm7250\t336301\n金桥国际商业广场\t336302\n雄心\t336303\n极意\t
336304\n淡水鱼\t336305\n王小北\t336306\n㎞\t336307\n万科五龙山\t336308\n运城市政府\t336309\n中汇掌富通\t336310\n文华东方酒店集团\t336311\n庐阳区城管局\t336312\n上海市第十人民医院\t336313\n建设大道\t336314\nvolumio\t336315\n林氏\t336316\nnpoi\t336317\n广州公证处\t336318\n广陈镇\t336319\n银行承兑汇票贴现\t336320\ncsfb\t336321\n大气环流\t336322\n炼心\t336323\n电工\t336324\n乌鲁木齐铁路局\t336325\n奶浴\t336326\n正义红师\t336327\n亚硫酸钠\t336328\nDart\t336329\n篇海\t336330\n水日\t336331\n曹越\t336332\ntaq\t336333\nWacom数位板\t336334\n尿素液\t336335\nOffice,Office\t336336\n广富林路\t336337\nSparkStreaming\t336338\n宁波材料所\t336339\n威斯康星麦迪逊\t336340\nc++6.0\t336341\n张勇\t336342\n唐雅\t336343\n语系\t336344\n北京大学第一医院\t336345\n包围\t336346\n浩洋\t336347\n固定点\t336348\n反倾销调查\t336349\n明茨伯格\t336350\n湖北省气象局\t336351\n羽灵神\t336352\n上悦城\t336353\n天天彩票\t336354\n淘客之家\t336355\n北盟\t336356\n吉祥坊\t336357\n津滨大道\t336358\n0000001\t336359\n泡茶\t336360\nunionId\t336361\n1月13日\t336362\n於之莹\t336363\n济宁市任城区\t336364\npoweramp\t336365\n八三夭\t336366\nRather\t336367\n19支\t336368\n2.3亿\t336369\n波头\t336370\n频\t336371\n风池\t336372\n安在旭\t336373\n米勒\t336374\n呼号\t336375\n降调版\t336376\n引人瞩目\t336377\n李之仪\t336378\n铝塑板\t336379\n福隆\t336380\n封底\t336381\n友客\t336382\nautomator\t336383\nzuma\t336384\n保录\t336385\n尖山村\t336386\n增写\t336387\n胜利镇\t336388\n假紫\t336389\n施行\t336390\n百度风投\t336391\n壹度\t336392\n恋爱回旋\t336393\n商业地\t336394\ns型\t336395\n表尾\t336396\nf200\t336397\n测算\t336398\narchdaily\t336399\n色天堂-影音先锋av-影音先锋\t336400\n第144集\t336401\n季尧\t336402\n李贞贤\t336403\n维生素矿物质片\t336404\n唯识论\t336405\n一语\t336406\n须要\t336407\n河口湖\t336408\n妇科凝胶\t336409\n安联大厦\t336410\n20170827\t336411\n氨基糖苷类\t336412\n双十星\t336413\n凯迪拉克SRX\t336414\n弗里德里希\t336415\n[失信\t336416\n上市\t336417\n三亚日报\t336418\n圭亚那\t336419\n疏解\t336420\nChill\t336421\n1920px\t336422\n衣针\t336423\n23英寸\t336424\n链节\t336425\n相等\t336426\n二宝\t336427\n小白们\t336428\n2016年2月份\t336429\n通点\t336430\n回程\t336431\n晋城市\t336432\n稳中有进\t336433\n型芯\t336434\n坛城\t336435\n国府\t336436\n出版\t336437\n近藤麻理惠\t336438\n100多家\t336439\n大冢咲\t336440\n吕良伟\t336441\nknex\t336442\n彩色版\t336443\nvivo游戏中心\t336444\nz370f\t336445\n欧冶\t336446\n卧龙吟\t3
36447\n布管\t336448\n连接头\t336449\n尾板\t336450\n隔叶黄莺\t336451\n178天\t336452\n密云\t336453\n叶萝莉\t336454\n企业信息网\t336455\n酸类\t336456\n赏雪\t336457\nsck\t336458\n氟橡胶\t336459\n汪泓\t336460\n管道泵\t336461\n55岁\t336462\n眉开眼笑\t336463\nxlswrite\t336464\n流行病\t336465\n戴科彬\t336466\n江南大学研究生院\t336467\n温顿\t336468\n日付\t336469\n师公\t336470\n宁波文化广场\t336471\n第三节\t336472\n史今\t336473\n土司城\t336474\n信令\t336475\n天尽头\t336476\n汉堡\t336477\n描金\t336478\n父项\t336479\n唠嗑\t336480\n博安\t336481\n潜力值\t336482\n浒关\t336483\n周东\t336484\nEZfm\t336485\nAssociates\t336486\n婚宠\t336487\nkeycloak\t336488\n政治经济学\t336489\n镝\t336490\nPowerDesigner\t336491\n常州幼儿园\t336492\n2018年02月16日\t336493\nsatisfies\t336494\n龙眠神殿\t336495\n催情剂\t336496\n竞猜\t336497\n入世\t336498\n柔术\t336499\n唯卓\t336500\n白明\t336501\n通用函数\t336502\n世界一级方程式锦标赛\t336503\n宽带猫\t336504\ndevtools\t336505\n破伤风抗毒素\t336506\n首堆\t336507\n脱脂剂\t336508\n節日\t336509\n全时四驱\t336510\nSonarQube\t336511\n吴中万达广场\t336512\n湖北省财政厅\t336513\n瓦窑\t336514\n李婉华\t336515\nanjian\t336516\nenvironment\t336517\n商业类\t336518\n党建网\t336519\nfetchone\t336520\n环圈\t336521\n零存整取\t336522\n丝束\t336523\n蚕蛾\t336524\n封丘县\t336525\ndeskscapes\t336526\n300024\t336527\n亲自\t336528\n五十部\t336529\n彩钢板\t336530\n知春里\t336531\n首撞\t336532\n略\t336533\nV2.00\t336534\n海岸城\t336535\nUFLDL\t336536\n截止\t336537\n实用主义\t336538\nwebsphere\t336539\n各得其所\t336540\nPIT\t336541\n竞聘\t336542\n生如夏花851213\t336543\n原字\t336544\n澳优\t336545\n黎叔\t336546\n燚\t336547\nseton\t336548\nSPSS卡方检验\t336549\nzip版\t336550\n恶名昭彰\t336551\n吃了饭\t336552\nliliangel\t336553\n500MB\t336554\n大热\t336555\nSierra\t336556\n如果没有你\t336557\n托书\t336558\nnicole\t336559\nTycoon\t336560\n饶恕\t336561\nLim\t336562\n郦城\t336563\n800M\t336564\n湘湖网\t336565\n鼓室\t336566\n辽金\t336567\n2}}\t336568\n一分半钟\t336569\nManta\t336570\n梦心\t336571\n唾液酸\t336572\n高邮日报数字报\t336573\n兴农\t336574\n没事\t336575\n理财险\t336576\n艺灵\t336577\n泰捷盒子\t336578\n嗨森\t336579\n薰衣草精油\t336580\nsanten\t336581\n神经\t336582\n剥离\t336583\n米克诺斯\t336584\n衬里\t336585\n铬酸盐\t336586\n表\t336587\nベル\t336588\nNOR\t336589\n如心\t336590\n256k\t336591\n出神入
化\t336592\n日租宝\t336593\nyy频道\t336594\n不眠夜\t336595\ntrend\t336596\n这个男人来自地球\t336597\n红疙瘩\t336598\n七里香\t336599\n葛店开发区\t336600\n过界\t336601\n卓琳\t336602\n哈辉\t336603\n华表奖\t336604\n四川大学公共管理学院\t336605\n小便\t336606\n唇裂\t336607\n华素片\t336608\n汽门\t336609\n北京海淀法院\t336610\nbootload\t336611\n健身气功五禽戏\t336612\n康华光\t336613\n武姓\t336614\n查尔斯·狄更斯\t336615\n电子荷质比\t336616\n93家\t336617\n朗文当代高级英语辞典\t336618\n伯格\t336619\n彩烟\t336620\n凯伦宾威\t336621\n本我\t336622\n微处理器\t336623\n智付\t336624\n蓖麻\t336625\nUsability\t336626\n.cer\t336627\n红丝绒\t336628\nNexon\t336629\n百姓\t336630\n4623\t336631\n浴桶\t336632\n新聞\t336633\n生物胺\t336634\n钱皇\t336635\nECP\t336636\n曼地亚\t336637\n新奥特曼列传\t336638\n中国质检出版社\t336639\n腐殖质\t336640\n57步\t336641\n白云山风景名胜区\t336642\n晕轮\t336643\n魂芯\t336644\n市人民政府办公厅\t336645\n矿盐\t336646\n渝建竣\t336647\navenir\t336648\n花生糖\t336649\nWin10论坛\t336650\n515排行网\t336651\n俄罗斯大学\t336652\n亚洲大学\t336653\n工程保险\t336654\n梦想传媒\t336655\n万众一心向前进\t336656\n损失费\t336657\n少先队大队\t336658\n酒茨\t336659\n后变\t336660\nANKI\t336661\nswitzerland\t336662\n万勇\t336663\n用品店\t336664\nEFC\t336665\n法域\t336666\n外翻\t336667\n盐湖\t336668\n学术讲座\t336669\n政策篇\t336670\nMOMOLAND\t336671\n胡玲\t336672\ngunicorn\t336673\n买入返售\t336674\n互斥锁\t336675\n第87集\t336676\nv1.6.0\t336677\n孙政\t336678\n手机业\t336679\n螺纹管\t336680\n篠田麻里子\t336681\n歌男\t336682\n莱昂\t336683\n一通\t336684\n六肖\t336685\n人教A版\t336686\nthes\t336687\n小商品城\t336688\n秦之声\t336689\n免费\t336690\n姚木兰\t336691\n泛华\t336692\nnewa\t336693\nOK蜜蜂网\t336694\n几百亿\t336695\n苏州新区\t336696\n冰茶\t336697\n权振东\t336698\n宅地\t336699\nusa\t336700\nDeviation\t336701\n骋\t336702\n锲\t336703\nredis\t336704\nadizero\t336705\n长安欧诺装饰\t336706\nnix\t336707\n六合宝典\t336708\n牙买加\t336709\n浅表性\t336710\nCanvas\t336711\n丛台区\t336712\n广东大学\t336713\nGravatar\t336714\n安迈\t336715\n西泠\t336716\n组学\t336717\n陆海涛\t336718\n攻防战\t336719\n吴三桂\t336720\nstats\t336721\n135万\t336722\n8月1号\t336723\n订货\t336724\nwifi信号差\t336725\n甚\t336726\n中国安防行业网\t336727\nCena\t336728\n版位\t336729\n歌服\t336730\n契默\t336731\n尿布\t336732\n厦门建发\t336733\n百隆\t336734\n王小海\t336735\nCoLaBug\t336736
\n哈日\t336737\nWORTH\t336738\n澳门新葡京娱乐\t336739\n教师资格网\t336740\nFramer\t336741\n杨文杰\t336742\nwww.886dm.com\t336743\n习作4\t336744\nEPS格式\t336745\n幕后人\t336746\n壬\t336747\n頭\t336748\n宜昌市伍家岗区人民政府\t336749\n酒庄\t336750\nvmfs\t336751\nXgboost\t336752\ndiablo2\t336753\n宇泰\t336754\n疑虑\t336755\n艺术论\t336756\nOperators\t336757\n遂平县\t336758\n百水芊城\t336759\n物联网应用技术专业\t336760\n借随\t336761\n奥一网\t336762\ncocoon\t336763\n40nm\t336764\n剧团\t336765\n新北区\t336766\n痰盂\t336767\n半海\t336768\nTestCenter\t336769\nssk\t336770\n演讲台\t336771\nmating\t336772\n人丁\t336773\n流浪女\t336774\n箭神\t336775\n照片级\t336776\n荣誉\t336777\n唐雨柔\t336778\n许宗衡\t336779\n桌架\t336780\n农保\t336781\n钱峰\t336782\nnorma\t336783\n米折\t336784\n天王巨星\t336785\n2016年10月17日\t336786\nopencascade\t336787\nPPT2003\t336788\n化龙记\t336789\n2851\t336790\n偏旁组\t336791\n脚本编辑器\t336792\n备兑\t336793\n金银潭\t336794\n三科麓湾\t336795\nbarplot\t336796\n山东煎饼\t336797\n可放\t336798\n宠物情人\t336799\n无见\t336800\nDirectional\t336801\n七十二变之斩妖除魔\t336802\n五香豆\t336803\nXMOS\t336804\n研究部\t336805\n史伟\t336806\n雷帕霉素\t336807\n架不住\t336808\n活动价\t336809\n市盈\t336810\n_文明6\t336811\n难产\t336812\n徐航\t336813\n政府办公楼\t336814\n帐号\t336815\nBIG\t336816\n麒\t336817\n赵文杰\t336818\n兑换率\t336819\n古调\t336820\n混凝土柱\t336821\n郑星阳\t336822\n德商御府天骄\t336823\nStripe\t336824\n光电园\t336825\nsades\t336826\n天火风\t336827\n跨学科\t336828\n拨号\t336829\ncdr9\t336830\n大连市住房公积金管理中心\t336831\n弗罗里达\t336832\n装批\t336833\n神经学\t336834\n1594\t336835\npulmonary\t336836\n无遮无挡\t336837\n结余\t336838\n钦差\t336839\n炸弹客\t336840\n混成\t336841\n三个世界\t336842\nFir\t336843\n休斯敦火箭\t336844\n冷冻式\t336845\nHerbs\t336846\n艾丝\t336847\n树影\t336848\nR6900\t336849\n交通运输类\t336850\n东港新村\t336851\nMICHAEL\t336852\n别笑\t336853\nlnlh2013\t336854\n补脑\t336855\nUGS\t336856\n味大\t336857\n中华杯\t336858\n穷游锦囊\t336859\njepsen\t336860\n装货\t336861\n162个\t336862\n对表\t336863\n龙发\t336864\n魏炜\t336865\n冬枣\t336866\n复底\t336867\n班克斯\t336868\n习习\t336869\n48V\t336870\n夹爪\t336871\nGHJ\t336872\n信码\t336873\n莫错过\t336874\n苏美鲁\t336875\n积压\t336876\n外汇公司\t336877\n黄光芳华\t336878\nCCL\t336879\n纤毛虫\t336880\nNas
\t336881\n水利人\t336882\n优享\t336883\n七彩石\t336884\n上天台\t336885\n湖东\t336886\n内门\t336887\nxdm\t336888\n英山县\t336889\n银子\t336890\n险胜\t336891\n媳妇儿\t336892\nshowdialog\t336893\n羊妈\t336894\ngriddata\t336895\nasc码\t336896\n106.6\t336897\n修真聊天群\t336898\n绥芬河\t336899\nWeave\t336900\ninventor2016\t336901\nUUMNT\t336902\n侧卧位\t336903\n汽車\t336904\n阿克蒙德\t336905\n直线位移传感器\t336906\n壹城中心\t336907\n蒂姆·罗宾斯\t336908\n寿命期\t336909\nXC\t336910\n小春论坛\t336911\n御桥垃圾焚烧厂\t336912\nCStatic\t336913\n节前\t336914\n35ml\t336915\nKakao\t336916\n天草丹参保心茶\t336917\n景观设计学\t336918\nHSD\t336919\nQRCode\t336920\nrent\t336921\n入境单\t336922\n新航道雅思\t336923\n师祖\t336924\n2公里\t336925\nrooboo\t336926\n精满\t336927\n天府软件园\t336928\n矢量网络分析仪\t336929\n踏火\t336930\n南农\t336931\n盛世华府\t336932\n传奇之路\t336933\n合肥市中级人民法院\t336934\n网易免费企业邮箱\t336935\n5b\t336936\n泼尼松\t336937\n普象\t336938\n消费主张\t336939\nsomebody\t336940\n2012年06月30日\t336941\n枸杞根\t336942\nindesign\t336943\n物业管理软件\t336944\n开据\t336945\n诺曼底\t336946\n藤椒牛肉面\t336947\n灰度化\t336948\n马晓宏\t336949\njart\t336950\n董存\t336951\n曝光台吧\t336952\n海芋\t336953\n汉思\t336954\ndyna\t336955\n液氧储罐\t336956\n武夷岩茶大红袍\t336957\n微软五笔\t336958\nVMWare\t336959\nwin10锁屏\t336960\n大毕\t336961\n高仿香奈儿\t336962\n五年内\t336963\n人造太阳\t336964\n2604\t336965\n质\t336966\n轴标\t336967\n90家\t336968\n物业部\t336969\n综述性\t336970\n华高\t336971\n廉洁自律准则\t336972\n十米\t336973\n面相\t336974\n制作者\t336975\n膳食补充剂\t336976\n生硬\t336977\n跟驰\t336978\n中航证券有限公司\t336979\nbeian\t336980\n汇市\t336981\n打铃\t336982\n久咳\t336983\n探出\t336984\nAkko\t336985\n9|\t336986\n畅捷通T6\t336987\n通信人\t336988\n浸出物\t336989\n串子\t336990\n快手\t336991\n房友\t336992\n0x10\t336993\n11次\t336994\n天津大学建筑工程学院\t336995\n彭丽\t336996\n巡夜\t336997\n在梦\t336998\n华龙\t336999\nZenFone2\t337000\n甲基汞\t337001\nproof\t337002\n地方教育附加税\t337003\n新潮流\t337004\n阿霞\t337005\n养成\t337006\n四川经济日报\t337007\n楚尘\t337008\nwww.pc6.com\t337009\n2模\t337010\n13亿\t337011\n体外\t337012\n张小乙\t337013\n飞虎队\t337014\n大叶榕\t337015\n真琴\t337016\n另一篇\t337017\n精子\t337018\nAnne\t337019\n200路\t337020\n发卡行\t337021\n万达商业地产\t337022\n十戒\t337023\n凤阳花鼓\t337024\
n等于\t337025\n控制率\t337026\nJo\t337027\n很不错\t337028\n6张\t337029\npostgresql数据库\t337030\n忠\t337031\n休闲党\t337032\n后三国群英时代\t337033\nxjbest\t337034\nTalents\t337035\n合肥步行街\t337036\n秘爱\t337037\nxiaoma\t337038\n幕后之王\t337039\n中共福建省委组织部\t337040\n上海市第六人民医院\t337041\n求求求\t337042\n昆明广场\t337043\n惊天黑幕\t337044\n翁媳\t337045\n河北省政协\t337046\n这周五\t337047\n荷兰谷\t337048\n射频电缆\t337049\n赵照\t337050\nRectified\t337051\n郑斯仁\t337052\n胡一统\t337053\n淫落\t337054\nrd450\t337055\nBohemian\t337056\n2平方米\t337057\n外航\t337058\n6样\t337059\n张德芬\t337060\n六年级品社\t337061\n省市\t337062\n刘关\t337063\n跃千愁\t337064\n没有你陪伴真的好孤单\t337065\n安徽外国语学院\t337066\n泽众\t337067\n炎魔\t337068\n长沙民政学院\t337069\n尼托\t337070\n网子\t337071\n雨血\t337072\n世界自然保护联盟\t337073\n通金所\t337074\n张国焘\t337075\n甄准\t337076\n受重视\t337077\n5章\t337078\n诺普信\t337079\n影响_参考网\t337080\n35度\t337081\n鱼台\t337082\n阿幼朵\t337083\n抚恤金\t337084\nレディ\t337085\ncoffeescript\t337086\nrail\t337087\n百度网盘高清\t337088\ncaddy\t337089\n防员\t337090\nnavicat11\t337091\n0.2.0\t337092\n河南警察学院\t337093\n神童\t337094\n僧多粥少\t337095\n聚灵\t337096\n11段\t337097\n法律责任\t337098\n馕坑\t337099\n3到5分钟\t337100\nreviews\t337101\n秀一\t337102\nmyeclipse2017\t337103\n怪会\t337104\nFEEL\t337105\n续修\t337106\n波普尔\t337107\n星际迷航:暗黑无界\t337108\nyoga710\t337109\n二脉\t337110\n沙岭子西\t337111\n52层\t337112\nTentacle\t337113\npower\t337114\n钱塘湖\t337115\n白衣天使\t337116\n身怀\t337117\n浸入\t337118\n移动公司\t337119\n太极剑法\t337120\n格式塔\t337121\n石嘴山市人民政府\t337122\n牡丹江火车站\t337123\nREADME.md\t337124\n蓝青小学\t337125\nbitches\t337126\n到校\t337127\nchores\t337128\n萨摩\t337129\nntpdate\t337130\n李伯伯\t337131\nkingexit\t337132\nmonument\t337133\n抛媚眼\t337134\n特朗普\t337135\n雨燕论坛\t337136\n启悦神印王座\t337137\n坤鹏\t337138\n镇纸\t337139\n00600\t337140\nsfu\t337141\n唐人街探案2高清\t337142\nAptio\t337143\n暴强\t337144\n福缘阁\t337145\nT460p\t337146\n百强县\t337147\nio\t337148\nReef\t337149\n2000只\t337150\n珍珠塔\t337151\n文翔路\t337152\n卷曲\t337153\n自体干细胞\t337154\n造价管理\t337155\n控制仪\t337156\npossibly\t337157\nwaive\t337158\n景雨梦\t337159\npearson相关系数\t337160\n2分钟后\t337161\n鸡嘴\t337162\nrealflight\t337163\n杨书记
\t337164\npvc管\t337165\n图签\t337166\n2520hc\t337167\nGrip\t337168\n配防\t337169\nIntouch\t337170\n丸茶\t337171\n高等教\t337172\n黑夜传说2\t337173\n天天想你\t337174\n东行\t337175\n奔驰GL450\t337176\n盈滨半岛\t337177\n神祇\t337178\n万兆\t337179\n2.9.2\t337180\nAG\t337181\n第四十三次\t337182\nlenovo\t337183\n四川农业大学\t337184\n爱情买卖\t337185\n过渡效\t337186\n南京苏杰学校\t337187\n度盘\t337188\n采花\t337189\nsql排序\t337190\n凌宇\t337191\n中山市三角镇\t337192\n几个字\t337193\n金沙岛\t337194\n弃之可惜\t337195\n法务\t337196\n雍贤府\t337197\n458\t337198\n跨考网\t337199\n开席\t337200\n昭和\t337201\n法宣在线考试真题集\t337202\n辽沈战役\t337203\n80000元\t337204\n白带检查\t337205\n门把手\t337206\n正始\t337207\nbuilders\t337208\n成田特\t337209\n六\t337210\n前面\t337211\n8000块\t337212\n龅牙\t337213\n脱缰\t337214\n富利\t337215\n曝光台\t337216\n成源\t337217\nGRBL\t337218\n远东\t337219\nScott007\t337220\n陈翔\t337221\n中央宣讲团\t337222\n清水村\t337223\n军法\t337224\n上外贤达\t337225\n历山北路\t337226\n赞成票\t337227\n硬化剂\t337228\ntx1\t337229\n福州地铁\t337230\nbyt\t337231\nvhd\t337232\n十五贯\t337233\n锦缘\t337234\n宁杭高速\t337235\n低轨\t337236\n理家城\t337237\n王忠林\t337238\npm10\t337239\n赵自强\t337240\nQComboBox\t337241\n太姥山\t337242\nreact-redux\t337243\n理\t337244\n杭州旅游集散中心\t337245\na6500\t337246\n单板\t337247\n北海银滩\t337248\n文字库\t337249\n2017级\t337250\n3G免费网\t337251\n新川科技园\t337252\n火焰杯\t337253\n荣耀红蓝\t337254\n龙巅孔雀鱼\t337255\next3\t337256\nstc89c52\t337257\n7029\t337258\n李博\t337259\n恒久远\t337260\n亚市\t337261\n图形学\t337262\n投标人\t337263\n子菜单\t337264\nOdin\t337265\n确立\t337266\n浪淘沙令\t337267\nDanish\t337268\n178战舰世界\t337269\n省钱\t337270\n惟一\t337271\nfigsize\t337272\n3.2猎魔\t337273\n顶过\t337274\n范爱农\t337275\n石川县\t337276\n战斗英雄\t337277\n剑侠情缘网络版3\t337278\n东方大陆\t337279\n瑞格尔\t337280\n江苏省食品药品监督管理局\t337281\n雾\t337282\n削血\t337283\n多姿\t337284\n槟榔\t337285\n中建钢构\t337286\n三国志4\t337287\n各展\t337288\n家庭中央空调\t337289\nm1216\t337290\n品系\t337291\nsdxy\t337292\n专题片\t337293\n水泽真树\t337294\n江南六大古镇\t337295\n南京客运南站\t337296\n南京大学环境学院\t337297\n荒滩\t337298\n报餐\t337299\nert\t337300\n_运城学院\t337301\n2015年12月28日\t337302\n千家\t337303\n新疆天业\t337304\n0607\t337305\n淘男网\t337306\n质构\t337307\n上下部\t337
308\n第五十八条\t337309\n999.com\t337310\n合肥网络公司\t337311\n通辽日报数字报\t337312\nnitride\t337313\nstandards\t337314\nNOVO\t337315\n白字\t337316\n昔时\t337317\n百慕大草坪\t337318\ntingroom\t337319\n河头\t337320\n仿射\t337321\n覃\t337322\n金宝箱\t337323\n35分\t337324\n石佛寺镇\t337325\n钛材\t337326\n东方夏威夷\t337327\n5960x\t337328\npscs6\t337329\n4821\t337330\n四川大学生命科学学院\t337331\nBreaker\t337332\n很可惜\t337333\n不卖身\t337334\n氧化铟\t337335\nCLO\t337336\n肖捷\t337337\n花垣\t337338\n李娅\t337339\n火影忍者博人传\t337340\n我的诗篇\t337341\n王开东\t337342\n神様\t337343\nDownie\t337344\n观察员\t337345\n老鸭汤\t337346\nAcacia\t337347\n49000\t337348\n康郡\t337349\n脑后\t337350\n星期几_\t337351\n处qy\t337352\n大同吧\t337353\nCron表达式生成器\t337354\n酒精棉\t337355\n决战平\t337356\n怪异君\t337357\n116路\t337358\nlarvel\t337359\n明星经纪公司\t337360\nX3.0\t337361\n脆嫩\t337362\n陈忠康\t337363\n锂离子\t337364\n25年后\t337365\n中项网\t337366\n九峰小区\t337367\n棱柱\t337368\n血液肿瘤\t337369\nDropout\t337370\n川企\t337371\n长春工程学院\t337372\nxuhua\t337373\n贴膏\t337374\n日产蓝鸟\t337375\n球团\t337376\nminimal\t337377\n1.8mm\t337378\n莫名其妙\t337379\n发泡胶\t337380\n深覆\t337381\n34万元\t337382\n熔点\t337383\n神马影院\t337384\npositio\t337385\nxplay5a\t337386\n提纲\t337387\n天府新区\t337388\n狞猫\t337389\n冯云\t337390\nJSU\t337391\n爱乐之城\t337392\nodt\t337393\n阴婚\t337394\n按键精灵论坛\t337395\n2018-2019年度\t337396\n东正教\t337397\n8版\t337398\n王庆\t337399\n机械手\t337400\n抵抗\t337401\n50立方米\t337402\n1600k\t337403\n冠道论坛论坛\t337404\nSavings\t337405\n避孕\t337406\n磁力启动器\t337407\n0830\t337408\n俯角\t337409\n巴南\t337410\n成都政府\t337411\n巡捕房\t337412\nflowchart\t337413\nAbandoned\t337414\n机械剪板机\t337415\n事工\t337416\n珍珠龟\t337417\n可得到\t337418\n先导者\t337419\nseagate\t337420\n雷霆战机\t337421\n远道\t337422\n大连化学物理研究所\t337423\nhype\t337424\n蹊\t337425\n平调\t337426\n怀特森\t337427\n黑大\t337428\n中心稿\t337429\n孤立主义\t337430\n爱情呼叫转移\t337431\n湘村\t337432\n海贼王女帝\t337433\nmumubin\t337434\n这一幕\t337435\n凤麟\t337436\n鸭绒\t337437\n千千阙\t337438\n果果\t337439\n艾默生网络能源有限公司\t337440\n不及物\t337441\n不惜\t337442\n七贤郡\t337443\n币赢网\t337444\n秦源\t337445\nhuji\t337446\njiwu\t337447\n暗表\t337448\n法拉电容\t337449\nHCl\t337450\n可移植性\t33
7451\n赵昊\t337452\n夷坚志\t337453\n晓妍凤平\t337454\ntru\t337455\n炒肉丝\t337456\n艺术字转换器\t337457\nugp\t337458\n卓有成效的管理者\t337459\n息税前利润\t337460\nNOW\t337461\n白硕\t337462\n筹办\t337463\n高胜\t337464\nbobby\t337465\nfirmware\t337466\nV9.0\t337467\nfiled\t337468\n杂工\t337469\n—\t337470\n有不甘\t337471\n主产区\t337472\nclosed\t337473\nbjd\t337474\n李工\t337475\n万成\t337476\n北极星智能电网在线\t337477\n/机\t337478\ncnandroiddocs\t337479\n股林\t337480\njersey\t337481\n一舍\t337482\n蔡美玲\t337483\n悟\t337484\n爱着我\t337485\n欣欣旅游\t337486\n8V\t337487\n旗木卡卡西\t337488\n碟中谍3\t337489\n百度邮箱\t337490\n雄关漫道真如铁,而今迈步从头越\t337491\n据我所知\t337492\n小市镇\t337493\n濮阳市人力资源和社会保障局\t337494\narchwing\t337495\n炉窑\t337496\n金汇路\t337497\n残根\t337498\n冻顶乌龙\t337499\n发祥地\t337500\n配变\t337501\n力邀\t337502\n三和招聘网\t337503\n刘璐佳\t337504\n先锋影院\t337505\n摩天楼\t337506\n磨西镇\t337507\n台\t337508\n双色球复式投注\t337509\nffu\t337510\n趣解谜网\t337511\nmpg格式\t337512\nBALL\t337513\nstandalone\t337514\nsurfacert\t337515\n平面向量\t337516\n银影侠\t337517\n5056\t337518\n玄界之门\t337519\n阿鲁高\t337520\niPhone7/7\t337521\n赵丽娜\t337522\n尼克尔\t337523\n唐装\t337524\n知识产权局\t337525\n荟\t337526\n辛劳\t337527\n陈敬\t337528\n五岳寨\t337529\nBackup\t337530\n厉鬼\t337531\n翠亨新区\t337532\n仇家\t337533\n蓖麻油\t337534\n乔维怡\t337535\n北京康拉德科技\t337536\n邻苯二甲酸二丁酯\t337537\n发泡材料\t337538\n砂石场\t337539\n竹书纪年\t337540\n黑帽\t337541\n临产\t337542\n5E对战平台\t337543\n一级建造师考试网\t337544\nEtudes\t337545\n要素类\t337546\nENC\t337547\nrft\t337548\nteo\t337549\n巾帼文明\t337550\n市中院\t337551\n黄毓民\t337552\nReading\t337553\n霞客\t337554\nccx\t337555\n12.12\t337556\nPRB\t337557\nbona\t337558\n盈方\t337559\nvivox20\t337560\n床架\t337561\n集字圣教序\t337562\n藏猫猫\t337563\n控雨\t337564\n定植\t337565\n尧典\t337566\n3.3.5\t337567\n百度新闻\t337568\n5285\t337569\n巧算\t337570\n苟仲文\t337571\nRV\t337572\nv4.5.1\t337573\n武林风\t337574\n苹果6Plus\t337575\n高升本\t337576\n浣花洗剑录\t337577\n65集\t337578\n中文免费版\t337579\n金刚砂地坪\t337580\nweb-dl\t337581\n山地公园\t337582\n卫信康\t337583\n为什么样\t337584\n12.10\t337585\n军舞\t337586\nagatha\t337587\n美航\t337588\nphotoshop\t337589\ndenglu\t337590\n上城\t337591\n招贷\t337592\n神经梅毒\t337593\n抵
京\t337594\nOkPorn\t337595\n9集\t337596\n湖南美术出版社\t337597\n100例\t337598\n反扣\t337599\n梁山伯与祝英台\t337600\n星刻的龙骑士\t337601\n佩鲁贾\t337602\nxbmc\t337603\nrules\t337604\n爱语\t337605\nXB\t337606\n微信营销网\t337607\n昆山秀峰中学\t337608\n骏卡\t337609\nyuyu\t337610\n佩里斯诺\t337611\n华金证券\t337612\n合川路\t337613\n血衣天使\t337614\nCASIO\t337615\n马来酸酐\t337616\n羞辱\t337617\n黄易\t337618\n上海机场\t337619\n枫树上的喜鹊\t337620\n二货\t337621\nMX3\t337622\n华南广场\t337623\n工作项\t337624\n有线\t337625\nmadlipz\t337626\n好忙\t337627\n张峻\t337628\n九秀\t337629\n5.5厘米\t337630\n典\t337631\nGE62\t337632\n亲友团\t337633\n去皱纹\t337634\n39健康网\t337635\n惊雷\t337636\n重庆日报数字报\t337637\n中梁山\t337638\n金投白银网\t337639\nm2.5\t337640\n跳跳乐\t337641\n无虚发\t337642\n非参数检验\t337643\nJEB\t337644\n朗诵版\t337645\n北京乐成国际学校\t337646\n记载\t337647\n画布\t337648\n灌肠\t337649\nMonarch\t337650\n_度\t337651\n砂缸\t337652\n003003\t337653\n0755-6160xxxx\t337654\n我是特种兵之火凤凰\t337655\nfallen\t337656\n演员们\t337657\nfia\t337658\n霸旋陀螺\t337659\n高级语言程序设计\t337660\n共生币\t337661\n7.55\t337662\n格什温\t337663\n较差\t337664\n古丁\t337665\n宝马5系论\t337666\n吉木乃县\t337667\n八字算命_生辰八字算命\t337668\n充分地\t337669\nBarton\t337670\nmodsecurity\t337671\n古今图书集成\t337672\n过机\t337673\n张子强\t337674\n空军预警学院\t337675\n一百回\t337676\n半日\t337677\n端方\t337678\n沃家云盘\t337679\nGBDT\t337680\n昂然\t337681\nsherlock\t337682\n北海路\t337683\n点绕\t337684\n彩砂\t337685\n夺谋\t337686\n24个月\t337687\n国海证券\t337688\n咖啡猫\t337689\n金福\t337690\nfrequent\t337691\n居委\t337692\n焕采\t337693\n北大国家发展研究院\t337694\n虎跃快客\t337695\nspringcloud\t337696\n泰晤士河\t337697\n中国地质大学珠宝学院\t337698\n世纪星城\t337699\nmvc拦截器\t337700\n怀药\t337701\n筑龙路桥\t337702\n零钱宝\t337703\n九公里\t337704\naie\t337705\n十届二次\t337706\n侯亮\t337707\n广州市西关外国语学校\t337708\n横置\t337709\n民用航空\t337710\n排行榜\t337711\n艳子\t337712\n王洪\t337713\n穿板\t337714\n老色\t337715\n前盖\t337716\n机械费\t337717\n碲化镉\t337718\n朱佑樘\t337719\nrimworld\t337720\n陈晖\t337721\n校色\t337722\n亲哥哥\t337723\n自适应\t337724\n格架\t337725\n全街\t337726\n120马力\t337727\n海贼王壁纸\t337728\n6.2分\t337729\n就业信息服务网\t337730\n61个\t337731\n碧素堂\t337732\n辑录\t337733\n小把戏\t337734\nQTextEdit\t337735\n氧化\t337736\
nxfce4\t337737\n谭佑铭\t337738\n贾志刚\t337739\n芪参益气滴丸\t337740\n过氧苯甲酰凝胶\t337741\n题例\t337742\n纸箱\t337743\n枸橼酸铋钾胶囊\t337744\n飞扑\t337745\n性告解\t337746\nHerbert\t337747\n假面骑士black\t337748\nBaggage\t337749\nPTP\t337750\n烤馕\t337751\n嘉行传媒\t337752\n时日\t337753\n老板们\t337754\n烫金机\t337755\n参谋家\t337756\n血透\t337757\n自治区住房城乡建设厅\t337758\nleftstick\t337759\n数百万天使轮\t337760\nJLL\t337761\n叶集\t337762\n51sole.com\t337763\n开封市人民政府\t337764\naltered\t337765\n两学一做\t337766\n磁力云\t337767\n折强度\t337768\n鸢都\t337769\n黄金海岸\t337770\nO型血\t337771\n伊利诺伊州\t337772\n星条旗\t337773\n平阳路\t337774\n大王小王\t337775\n日版流星花园\t337776\n医学伦理学\t337777\nweui\t337778\n140ml\t337779\n光学学报\t337780\n泰合\t337781\n4亿多\t337782\n木吉他\t337783\n丹阳\t337784\n江苏省南京地方税务局\t337785\n秉\t337786\ncomboBox\t337787\n辰\t337788\n金鹰国际购物中心\t337789\n鱼珠\t337790\n散结镇痛胶囊\t337791\n前一个月\t337792\n金属味\t337793\n圳\t337794\nDIV块\t337795\n龙岗线\t337796\n老吾老以及人之老\t337797\n紫杉醇注射液\t337798\n武侠世界大冒险\t337799\n其乐\t337800\n措\t337801\n我的小姨\t337802\n巴拉拉小魔仙\t337803\n全国政协常委\t337804\nCalling\t337805\n薰衣草节\t337806\n易凯\t337807\nsdei\t337808\n极边\t337809\n叶荣\t337810\n偏头疼\t337811\n鬼父\t337812\nterminated\t337813\n37.2度\t337814\n电老虎\t337815\n周雷\t337816\n鸭胗\t337817\n会武\t337818\n色杯\t337819\n观前街\t337820\n硬笔字\t337821\nUI巴巴\t337822\n双休\t337823\n第十个\t337824\nqqzi\t337825\n吴秀\t337826\n南京旅行社\t337827\n橡胶漆\t337828\n沈阳盛京医院\t337829\n水烟袋\t337830\n有感于\t337831\n第129章\t337832\nresque\t337833\n唐泽雪穗\t337834\n移动智能终端\t337835\n齐鲁石化公司\t337836\n路亚线\t337837\n前一个小时\t337838\n南怡岛\t337839\n云库\t337840\n菻坊\t337841\n几枝\t337842\n阎罗令\t337843\n颈动脉内膜剥脱术\t337844\n混凝土地坪\t337845\n高增\t337846\n群众艺术馆\t337847\n欧景\t337848\n26.5\t337849\n弄清楚\t337850\n徐莉\t337851\nworkstation14\t337852\n救生员\t337853\n抗衰老\t337854\n90关\t337855\n肖生克\t337856\n想象作文\t337857\n第5卷\t337858\n4.5\\4.0_\t337859\n燃煤锅炉\t337860\n银监会\t337861\n中国国花园\t337862\n福尼斯\t337863\n御锦城\t337864\n猫猫兔\t337865\nSPM\t337866\n荆州火车站\t337867\n虚报\t337868\n开唱\t337869\n衣禄\t337870\nkdc\t337871\n白色糠疹\t337872\n试装\t337873\n发报\t337874\nV2.2.2\t337875\nWDYWT\t337876\n襄阳机场\t337877\n催乳\t337878\n天天营养网\
t337879\njijiao\t337880\ntaglist\t337881\n普陀区小学\t337882\nanalogue\t337883\n新虎\t337884\n大数据学院\t337885\nChatbot\t337886\nSO2\t337887\n诛心\t337888\n护士学校\t337889\nmindwind\t337890\n死城\t337891\n26期\t337892\nretrieve\t337893\n上海野生动物园\t337894\n划拨\t337895\n崇贤\t337896\nAMD显卡驱动\t337897\n17173dnf\t337898\nDate\t337899\n董事会决议\t337900\nWCDMA\t337901\n艺展\t337902\n次性\t337903\n闽江学院\t337904\n安住\t337905\nOPT\t337906\n表弟\t337907\n国家AAAA级景区\t337908\n空册\t337909\n面差\t337910\n小关\t337911\n北卡\t337912\n期货交易\t337913\n人线\t337914\n动天\t337915\n满格\t337916\n白狐\t337917\n_资\t337918\n万宝盛华集团\t337919\nwin7hosts\t337920\nmsata接口\t337921\n建筑设计师\t337922\n勤务类\t337923\n50353\t337924\n辊式\t337925\n恒牙\t337926\niserver\t337927\n香子兰3\t337928\n熊出没之夺宝熊兵\t337929\n郑州花园路\t337930\n秒删\t337931\n无尚\t337932\n龙果果\t337933\n哎呦\t337934\n沤肥\t337935\n帧差法\t337936\n修编\t337937\n封闭母线\t337938\n吴邦国\t337939\n拱型\t337940\n3dhgame\t337941\n胡荣\t337942\n流离\t337943\n硬仗\t337944\n一半截\t337945\n桂圆\t337946\nselectKey\t337947\n大众朗行\t337948\n央行征信中心\t337949\n发泡板\t337950\n杰拉德格林\t337951\n淘宝店铺子\t337952\n算盘\t337953\n粗气\t337954\n电子测量技术\t337955\n黄山区\t337956\n15KW\t337957\n中药类\t337958\n偶像大师\t337959\n气滞胃痛颗粒\t337960\n久立特材\t337961\n除法器\t337962\n片荒\t337963\n古尸\t337964\n不令\t337965\n融城\t337966\n陌生的城市\t337967\n黛儿\t337968\n孙维\t337969\n东风\t337970\n东方虹\t337971\n蓝源\t337972\nurv\t337973\n神龙斗士\t337974\nBison\t337975\n8枚\t337976\n沾边\t337977\n打拱\t337978\n溺水\t337979\n促甲状腺素\t337980\nLMF\t337981\n王大爷\t337982\nAnnie\t337983\n有研新材\t337984\n绿点\t337985\n南京中院\t337986\n橡套\t337987\n兽药经营许可证\t337988\n因子分析法\t337989\n霍山路\t337990\nLEX\t337991\n手动挡\t337992\n二侧\t337993\nWin10系统之家-Win10\t337994\n标点符号用法\t337995\n老道士\t337996\n妹子们\t337997\n勠力\t337998\n冰凉\t337999\n八景\t338000\n云霄之恋\t338001\n上颚\t338002\nOpenEdv\t338003\n样章\t338004\n牛宝\t338005\n临高启明吧\t338006\n袁\t338007\navril\t338008\n代乐乐\t338009\n践行社会主义核心价值观\t338010\n无垢\t338011\n人教版3\t338012\n茂名市第一中学\t338013\n海法\t338014\n中国消费者协会\t338015\n西界\t338016\n160GB\t338017\n沃钱包\t338018\n理查\t338019\n雪上加霜\t338020\ngit克隆\t338021\n秦80\t338022\n露阴\t338023\n史堂
\t338024\n天剑\t338025\n1781\t338026\n坦克牧马人\t338027\n精神抚慰金\t338028\n少爷\t338029\n肺水肿\t338030\n20P]\t338031\n左旋肉碱\t338032\n预算案\t338033\n生益\t338034\n金针奖\t338035\n新东方国际游学\t338036\n中国平安保险\t338037\n8511\t338038\n凯拉克斯\t338039\n篮球操\t338040\n大行其道\t338041\n于光远\t338042\n福彩3D字谜\t338043\n七杰\t338044\n泰拉瑞\t338045\n辰南\t338046\nxmt\t338047\n梧州日报\t338048\n法恩莎卫浴\t338049\n黑啤\t338050\n兀\t338051\n江河集团\t338052\n三星pay\t338053\nUIC\t338054\n0.3mm\t338055\nppjj\t338056\n外国公司\t338057\n大杯\t338058\n6630\t338059\n8302\t338060\n拉米夫\t338061\n赵蕾\t338062\n7870\t338063\n辽化\t338064\n|天\t338065\n善存银片\t338066\nRequired\t338067\n教育部学校规划建设发展中心\t338068\n爱丽舍论坛_汽车之家论坛\t338069\n360建筑网\t338070\n一小时后\t338071\n缓冲期\t338072\n三星s7e\t338073\n胃炎\t338074\n清账\t338075\n土八路\t338076\nEGit\t338077\n贫道\t338078\nfun88\t338079\n非首脑会谈\t338080\nImageNet\t338081\n张通\t338082\n万方数据中小学数字图书馆\t338083\nsaigon\t338084\n冠华\t338085\n百花园\t338086\n永宁寺\t338087\n排版\t338088\nLabeling\t338089\n埃贡·席勒\t338090\n海信电视\t338091\n165斤\t338092\n判断力\t338093\nshRNA\t338094\n钱钟书\t338095\n特别篇\t338096\n孔雀绿\t338097\n2600万\t338098\n司徒美堂\t338099\n衡水市人民政府\t338100\n优生\t338101\nttt\t338102\n上海东方卫视\t338103\n帐蓬\t338104\n周文杰\t338105\n旅遊\t338106\n修复仪\t338107\n井子\t338108\n小枫\t338109\ntraces\t338110\n远心\t338111\n昼夜\t338112\nEDM\t338113\n杰罗德\t338114\n银行版\t338115\nRamin\t338116\n骨痂\t338117\n远光软件股份有限公司\t338118\n门者\t338119\ndajia\t338120\n耶律齐\t338121\n韶州\t338122\n钟水饺\t338123\nEmily\t338124\n乌丸\t338125\n石家庄医院\t338126\n陈雅婷\t338127\n理香\t338128\n上海交通银行\t338129\n吉之美\t338130\n刀轮\t338131\n卢琴\t338132\n吞精\t338133\n周书\t338134\n龙队\t338135\n云南省国土资源厅\t338136\nThinkphp5\t338137\n雨枫\t338138\nR1\t338139\n有道云笔记网页剪报\t338140\nappstore应用名\t338141\n幻城凡世\t338142\n济世\t338143\nA1530\t338144\n脑外科\t338145\n骨科学\t338146\ncif\t338147\n青少年\t338148\n选择型\t338149\n重庆口腔医院\t338150\nDokuWiki\t338151\n起用\t338152\ninfraworks\t338153\n還\t338154\n贺寿\t338155\n酒红\t338156\n拉流\t338157\n法线\t338158\n格斗技\t338159\n胡维纳\t338160\nESTEE\t338161\n152毫米\t338162\n叶笑\t338163\n骨痛\t338164\n匪我思存\t338165\n中华人民共和国民事诉讼法\t338166\nchande
lier\t338167\n新伤\t338168\n可机\t338169\n容九\t338170\n同舟\t338171\n桥本氏\t338172\n喜洋洋与灰太狼\t338173\n冰皇\t338174\nmongoldb\t338175\n酷米客\t338176\n仓持\t338177\n黑三角\t338178\n晶晶\t338179\n敬听\t338180\n百分之二\t338181\n家用电\t338182\n吹神\t338183\n沙洲职业工学院\t338184\ndiamond\t338185\n陶孟童\t338186\n新康\t338187\n伊芙丽\t338188\n如沐\t338189\n富威\t338190\n抖动\t338191\n银英传\t338192\n冷清\t338193\n环抱\t338194\njdread\t338195\n淘宝全球购\t338196\n活祭\t338197\ntrimmed\t338198\n4.3.15\t338199\n新黎明\t338200\n小写意\t338201\n10秒钟\t338202\nls-dyna\t338203\n强求\t338204\n招手\t338205\n口腹\t338206\njiajun\t338207\nTOP40\t338208\n股事\t338209\n王者荣耀男\t338210\n龚如心\t338211\ntotalcommander\t338212\n通信工程专业\t338213\n保时捷911\t338214\n12.1.6\t338215\n练笔\t338216\n脊瓦\t338217\ncompose\t338218\nGT86\t338219\ncoccinelle\t338220\n牛B\t338221\n瑞虎3论坛_汽车之家论坛\t338222\n自修\t338223\n自主学习\t338224\n800家\t338225\n甜面酱\t338226\n井上靖\t338227\n老柴\t338228\nQQ水浒\t338229\nCRP\t338230\nclap\t338231\n北京日报\t338232\n宣武门西大街\t338233\n百度文档\t338234\n佐佐波绫\t338235\n能点\t338236\n广顺镇\t338237\nsp\t338238\n南门小学\t338239\n王毛\t338240\n苹果6PLUS\t338241\n住房城乡建设部\t338242\n校园开放日\t338243\nangular4\t338244\n2180\t338245\n三亚学院\t338246\n求指导\t338247\n第二十七条\t338248\n齐格\t338249\n外卖人\t338250\n爱格板\t338251\nBR\t338252\n加拿大滑铁卢大学\t338253\n建成小康社会\t338254\n画仙\t338255\nTABLE\t338256\nwindows10_Wi\t338257\nwhql\t338258\n锡林格勒\t338259\nlsof\t338260\n甲胄\t338261\n徐晗\t338262\n四川省财政厅\t338263\n平安传\t338264\n带电粒子\t338265\n阴阳道\t338266\n傳說\t338267\n武汉大学土木建筑工程学院\t338268\nCompanies\t338269\n豆腐猫砂\t338270\ntext2\t338271\n篱\t338272\n蓝翼\t338273\n谢杏芳\t338274\nNPE\t338275\n船运\t338276\n六幼\t338277\n车管所\t338278\n肌水\t338279\n青海高原\t338280\n揽储\t338281\n爱情碟中谍\t338282\n龙之传说\t338283\n盗卖\t338284\n参数化\t338285\n棋牌室\t338286\n现代工业\t338287\n以\t338288\n莲花清瘟胶囊\t338289\n铜陵市人民医院\t338290\n东街\t338291\n毕业墙\t338292\nAxure7.0\t338293\nImageshop\t338294\nchost\t338295\njinhaolin\t338296\n高锰酸钾法\t338297\n吴玉章\t338298\n温州招聘网\t338299\n浙江大学紫金港校区\t338300\ndesgin\t338301\n孙立\t338302\nfanti\t338303\n长沙市住房和城乡建设委员会\t338304\n月份\t338305\n大太\t338306\n沃尔沃S60L\t33830
7\nssg修改器\t338308\n鲨鱼公园\t338309\n一新\t338310\n北京安贞医院\t338311\nVenetian\t338312\n弟子规\t338313\n悌\t338314\nesclipse\t338315\n小虾米\t338316\n琥珀湾\t338317\n汉藏语\t338318\n最后关头\t338319\n一般人\t338320\nuiweb\t338321\n002202\t338322\n700亿元\t338323\nMV【人人网\t338324\n阿含\t338325\n8031\t338326\n李乃文\t338327\ndialogfragment\t338328\nhx6730\t338329\n锈钢\t338330\n网页游戏加速器\t338331\n社会科学文献出版社\t338332\n环球科学\t338333\ndonate\t338334\n邵阳市中心医院\t338335\n关系人\t338336\n吴丹尼\t338337\n分寸感\t338338\n公文阁\t338339\n根冠\t338340\nbuddha\t338341\nbl文库网\t338342\n通脉\t338343\n7724\t338344\n托尼克罗斯\t338345\n酚\t338346\n福耀集团\t338347\n计算机病毒\t338348\n北京市国有资产经营有限责任公司\t338349\n战斗部\t338350\n魔力\t338351\n甩甩\t338352\n雷锋镇\t338353\n有怀\t338354\n理邦\t338355\n百颜\t338356\n烟花区\t338357\n在线读书网\t338358\n皮匠\t338359\n细菌性感冒\t338360\n硝化反应\t338361\nMovers\t338362\n核函数\t338363\nwealth\t338364\n556\t338365\n先导阀\t338366\ner3100\t338367\n超级医仙\t338368\n20140421\t338369\n戴尔电脑\t338370\nbit\t338371\n通则\t338372\n蓝眼泪\t338373\n霜之哀伤\t338374\n长江航运公安局\t338375\n空口\t338376\n【7天连锁酒店\t338377\n大公国际\t338378\n戦士セ\t338379\n美元债\t338380\n电池仓\t338381\nword批注\t338382\n%20\t338383\nunable\t338384\n史景迁\t338385\n山有木兮木有枝\t338386\n格里菲斯大学\t338387\nconcerts\t338388\n家常菜谱大全网\t338389\n黄蛋\t338390\n姥姥\t338391\n00r\t338392\nFALL\t338393\n定兴\t338394\n调温器\t338395\n镶嵌机\t338396\n原为\t338397\n同一个月\t338398\n北京协和医院\t338399\n安迅思\t338400\n第六部\t338401\n方孝孺\t338402\n乳化油\t338403\n县水务局\t338404\n模拟人生5\t338405\n中华人民共和国刑法释义\t338406\n多柔比星\t338407\n沙岛\t338408\n浩纳\t338409\n贾巴尔\t338410\nlecon\t338411\n职业经理\t338412\n长安之星\t338413\nbcloud\t338414\n振德医疗\t338415\n2.so\t338416\n黑链\t338417\n0.6m\t338418\n老远\t338419\n团伙\t338420\nNederland\t338421\n马可波罗\t338422\n类比法\t338423\n硬起\t338424\n利川\t338425\n伯恩茅斯\t338426\n贴身\t338427\n一阶\t338428\nDickens\t338429\n山地师\t338430\n科考\t338431\n山东师范\t338432\nTHX\t338433\n亿利洁能\t338434\ns5120\t338435\n龙南\t338436\n排名榜\t338437\n老鸟\t338438\n925\t338439\n奥硝唑片\t338440\n朱燕\t338441\n钗头凤\t338442\n帮困\t338443\n两岸关系\t338444\n阿云\t338445\nhd1080p\t338446\naccess2003\t338447\n许晨阳\t338448\n武汉大学测绘学
院\t338449\n韩森寨\t338450\n盲文\t338451\nLux\t338452\nZYNQ-7000\t338453\n手误\t338454\n葫芦岛政府网\t338455\nChristine\t338456\n吐烟\t338457\n20瓶\t338458\n海伦\t338459\nbilling\t338460\n苏美尔\t338461\n尚德教育\t338462\n听译\t338463\n营职\t338464\n侠盗联盟\t338465\n劲爆点\t338466\nballs\t338467\n马士华\t338468\n骑拉帝纳\t338469\n一个谜\t338470\n缆车\t338471\n卷发\t338472\n仙女们\t338473\n雌蕊\t338474\nporridge\t338475\n神界原罪2吧\t338476\n龙博\t338477\n评价\t338478\n汽车工业协会\t338479\n云顶高原\t338480\n拔牙\t338481\n47分钟\t338482\n兄弟姐妹\t338483\n2017年9月13日\t338484\nabacus\t338485\n体检单\t338486\n佐卡伊\t338487\n15年以后\t338488\n壶关\t338489\n昆明高铁站\t338490\n艮\t338491\n银元\t338492\n草剂\t338493\n骨病\t338494\nSF6\t338495\n东明机电\t338496\n白骨精\t338497\n吴子\t338498\n家装业\t338499\n珠海公司\t338500\n孔隙比\t338501\n一棵小桃树\t338502\n陈安安\t338503\n545s\t338504\n亲族\t338505\ndust2\t338506\n柔美\t338507\n601633\t338508\n必联\t338509\n帕尼尼\t338510\n窃贼\t338511\nGENERATIONS\t338512\nworries\t338513\n昆山路\t338514\n热轧钢板\t338515\n速度与激情8\t338516\n纯\t338517\n科拓\t338518\n黑格尔\t338519\n广西水利电力职业技术学院\t338520\n3万块\t338521\n中再集团\t338522\n搞笑图\t338523\n江苏中公\t338524\n高保真\t338525\n100万美元\t338526\n67名\t338527\nmoke\t338528\nFRIEND\t338529\n大市\t338530\n侬\t338531\n三权\t338532\n方箱\t338533\n掩映\t338534\ndelft\t338535\n三匹\t338536\n满嘴\t338537\n江西省核工业地质局\t338538\n青萝卜\t338539\n麒麟科技创新园\t338540\n陈燕\t338541\n摔死\t338542\n欺压\t338543\n阿楚姑娘\t338544\n激昂\t338545\n袁庚\t338546\n耀州\t338547\n2016年5月31日\t338548\n独脚架\t338549\nKPL王者荣耀职业联赛\t338550\n交安\t338551\n江东中路\t338552\n约克城\t338553\n尼克尔森\t338554\n大同一中\t338555\n窍诀\t338556\n田佳良\t338557\n母本\t338558\n修辞\t338559\n1.0.17\t338560\n搜狗号码通\t338561\nCodeWeb\t338562\n垒\t338563\n汁水\t338564\n藏宝图\t338565\n保命\t338566\nlena\t338567\n闪贷\t338568\n明天会更好\t338569\nw3cplus\t338570\n選\t338571\n花枝招展\t338572\n15方\t338573\nEnix\t338574\nCFW中国服装人才网\t338575\n1000家\t338576\n橘皮纹\t338577\n法英\t338578\n跃式\t338579\nゾンビ\t338580\n喇嘛\t338581\n1841\t338582\nMX-BOARD\t338583\nSanta\t338584\n头孢他啶\t338585\n内膜\t338586\n薛氏\t338587\nCDK\t338588\n刘子清\t338589\nmorphvoxpro\t338590\n9.8万\t338591\n很有\t338592\n小马驹\t338593\nsu
rface\t338594\nwscript\t338595\n十三陵镇\t338596\nEDUCATION\t338597\nsket\t338598\n前序\t338599\n桃花始\t338600\n3158餐饮网\t338601\n凤仪亭\t338602\n高中女儿的轮轩\t338603\n烤生蚝\t338604\n6.7\t338605\n国宝级\t338606\n坎高犬\t338607\n过关率\t338608\n2017年8月15日\t338609\n中葡\t338610\n球品\t338611\n风姿绰约\t338612\n融资融券交易\t338613\n解闷\t338614\nCeline\t338615\n6031\t338616\n1372\t338617\nwin7win8\t338618\n陈大榕\t338619\n精华素\t338620\n青岛社\t338621\n顾唐路\t338622\n卡车之家\t338623\n创可贴\t338624\nFuzz\t338625\n天津电视台\t338626\namoi\t338627\n，\t338628\ntn\t338629\n恩将仇报\t338630\n杯状\t338631\n28岁未成年\t338632\n怎么样\t338633\n鸡鸣山\t338634\n利辛县人民政府\t338635\n6.5.3\t338636\nA01\t338637\n四川省省\t338638\n鲜啤\t338639\n学前心理学\t338640\n长安CS75论坛_汽车之家论坛\t338641\nerr\t338642\n任怨\t338643\nblonde\t338644\n行风\t338645\n亚马孙热带雨林\t338646\nksm\t338647\n客商\t338648\n杨智\t338649\n黑部\t338650\n黄底\t338651\n格列齐特缓释片\t338652\n聯合\t338653\n光雾山\t338654\n猿飞日\t338655\nwebsocke\t338656\n鸟氨酸\t338657\n祈荒\t338658\n芳华奥奇传说\t338659\niconic\t338660\n475\t338661\n溶血性\t338662\nInstrumental\t338663\n随心所欲\t338664\n贵州电信\t338665\n大照\t338666\n额定\t338667\n熊掌\t338668\nbundles\t338669\nIPTABLES\t338670\n王祎哲\t338671\nandoid\t338672\n水洼\t338673\niPhone浏览器\t338674\n修正案\t338675\nLOHO眼镜\t338676\n单车\t338677\n电化学阻抗谱\t338678\nMax\t338679\n欧洲航天局\t338680\n炫星\t338681\nenhancements\t338682\nag平台\t338683\nbugzilla\t338684\n系统之家win10\t338685\n污段子\t338686\nKD\t338687\n中国船舶重工集团公司\t338688\n演技\t338689\n正义联盟\t338690\n开利空调\t338691\n情钟\t338692\n算入\t338693\n新武林群侠传\t338694\nvc泡腾片\t338695\n美甲\t338696\n三效\t338697\n2018年6月份\t338698\n沪胶\t338699\n反伤\t338700\nshl\t338701\n小小白\t338702\n英雄牌\t338703\nFull\t338704\n陕西省水利厅\t338705\nkob\t338706\n12.0.20\t338707\n船模\t338708\n蜂蜜罐\t338709\n回生源地\t338710\n扫货\t338711\n外游\t338712\n曼谷素万那普机场\t338713\n木魅\t338714\n云南工信委\t338715\n吴晓波\t338716\n9dm\t338717\n搞臭\t338718\n足协杯\t338719\n伊之密\t338720\n明成祖\t338721\n挂者\t338722\n内孔\t338723\n航路\t338724\n万会\t338725\nRegister\t338726\npeppa\t338727\n胶料\t338728\n砂质\t338729\n干法\t338730\n沈总\t338731\n磕碰\t338732\n胡润研究院\t338733\n庭\t338734\n对不对\t338735\n128t
ick\t338736\n德妃\t338737\n数字报\t338738\n李勇鸿\t338739\n朱碧\t338740\n磁片\t338741\n茶桌\t338742\n韩华\t338743\n天津大学仁爱学院\t338744\niNode\t338745\n沉郁\t338746\n深宫\t338747\n新惊天动地\t338748\nIssey\t338749\n首板\t338750\n好男儿\t338751\n4120\t338752\n林建岳\t338753\n4月21日起\t338754\n大罢工\t338755\n成王败寇\t338756\n论文版\t338757\n曹鹤阳\t338758\n20170929\t338759\n求生之路3\t338760\nMGS\t338761\n梅兰\t338762\nitellyou\t338763\nMAtlab\t338764\n3分钱\t338765\n澳门百老汇\t338766\n杭七中\t338767\n杨成功\t338768\nQuik\t338769\n盛通股份\t338770\nCluster\t338771\nCasper\t338772\n孙慧\t338773\n云南乐居网\t338774\n吴文煜\t338775\n三菱翼神\t338776\n雄安新区管委会\t338777\n南京二十九中\t338778\nUltraCompare\t338779\n字水\t338780\n西安外国语大学\t338781\n水过滤器\t338782\nZ18\t338783\n哈夫节\t338784\n凌玲\t338785\n三摄\t338786\n16处\t338787\n广西省\t338788\n岬童夷\t338789\n晚娘\t338790\nqq轻聊\t338791\n丝瓜\t338792\n实践者\t338793\n今年四月\t338794\n汤加\t338795\n田心\t338796\n白鱼镇\t338797\n游神\t338798\nhttps代理\t338799\n收发机\t338800\n金光瑶\t338801\nMBTI\t338802\napachectl\t338803\n力促\t338804\nF1\t338805\n刘擎\t338806\n海滨公园\t338807\n五环\t338808\nNEEQ\t338809\n惠达卫浴\t338810\n坎坷\t338811\n长白山天池\t338812\n大队委\t338813\n夕发\t338814\n天目湖山水园\t338815\nglowing\t338816\n黄俊捷\t338817\n电子元器件采购网\t338818\n直销人网\t338819\ngeta\t338820\n比\t338821\n裸绞\t338822\n大境中学\t338823\n道尔顿\t338824\n淮海经济区\t338825\n拍卖品\t338826\n指明\t338827\n骆驼客\t338828\n东营市一中\t338829\n统计函数\t338830\n死亡沙漠\t338831\n挥舞\t338832\n一汽大众汽车有限公司\t338833\n给出\t338834\nword转换器\t338835\n12列\t338836\n上戏附中\t338837\n下拉刷新\t338838\n青年文学家\t338839\ntransferase\t338840\n咕哒\t338841\nguiding\t338842\n江东街道\t338843\nN4050\t338844\nSPB\t338845\noplog\t338846\n看点\t338847\n恶夫\t338848\nfitt\t338849\n201409\t338850\n半封\t338851\n益丰药房\t338852\n教条主义\t338853\n第六十四条\t338854\n衡水五中\t338855\n清研\t338856\nv3.1\t338857\n高粱\t338858\n天罗地网\t338859\n红蓝宝石\t338860\nNES\t338861\n80秒\t338862\n李德生\t338863\n失字\t338864\nengagement\t338865\n省城\t338866\n自适应控制\t338867\n蔡司镜头\t338868\n播洒\t338869\nSpanking\t338870\n1426\t338871\n5.3_\t338872\n战弓\t338873\n保靖\t338874\nelt\t338875\n价值中国网\t338876\n嘉善房产网\t338877\n病理\t338878\n军山\t338879\nS
ting\t338880\ntongqingliu\t338881\n清凈菩提心\t338882\n攀枝花西区\t338883\n快递公司\t338884\n20131231\t338885\n熊果酸\t338886\n地米\t338887\n不甘\t338888\n2018年2月7日\t338889\n2016年1月12日\t338890\n邓力\t338891\nKindeditor\t338892\n泥坊\t338893\n蕾丝花\t338894\ntextiles\t338895\n油坊桥\t338896\n领导艺术\t338897\n汕湛高速公路\t338898\n07期\t338899\n树精\t338900\ngolo\t338901\n中国农业发展银行\t338902\n数码宝贝物语\t338903\n特摄片\t338904\nlibvpx\t338905\nNIBAKU\t338906\n万能页\t338907\nwindows计算器\t338908\n教唆\t338909\n抄报税\t338910\n鲁迅先生\t338911\nObject\t338912\n段前段\t338913\n小爷\t338914\n贫穷\t338915\nLat\t338916\n2月4日\t338917\n包晗\t338918\n中科院大连化物所\t338919\n取上\t338920\n甜度\t338921\n龙巅神仙鱼\t338922\n萌友\t338923\n竹质\t338924\n为你\t338925\n采油\t338926\n老国展\t338927\n前阵子\t338928\nk24\t338929\n原莉乃\t338930\n模拟经营类\t338931\n钻石\t338932\nwry\t338933\n二第一次\t338934\n老潘\t338935\n火攻\t338936\nvmware11\t338937\n该怎么办\t338938\n魔怀网\t338939\n青树\t338940\n座驾\t338941\nUCSF\t338942\n英菲尼\t338943\n庄周\t338944\n麻味\t338945\n北京人民艺术剧院\t338946\nStations\t338947\n热轧无缝钢管\t338948\nScout\t338949\n潞城市\t338950\n界石镇\t338951\n张桂平\t338952\n速腾\t338953\n嘉兴市社会保障事务局\t338954\n从紧\t338955\nSDHC\t338956\n杭州西湖风景名胜区管委会\t338957\n红薯干\t338958\n通乐\t338959\n双截龙格斗\t338960\n兵牌\t338961\n追星\t338962\n昆特牌\t338963\nusage\t338964\n文玩核桃\t338965\n30.1%\t338966\n0.5度\t338967\n谷雨茶\t338968\n拔管\t338969\n马槽\t338970\n投怀送\t338971\n魔饭\t338972\njohn\t338973\n宠爱有\t338974\n韩芳\t338975\n狗辈\t338976\n北达\t338977\n金融帝国2\t338978\n最简版\t338979\n泰勒斯威夫特\t338980\n河南省政府法制网\t338981\n伦敦眼\t338982\n红男绿女\t338983\nHazard\t338984\n排列三试机号\t338985\n平罗县\t338986\n蔡秀彬\t338987\niPadpro\t338988\n洛克线\t338989\n57万\t338990\n阿水\t338991\n依琳\t338992\n薛店镇\t338993\n乡台\t338994\n法典\t338995\nqt库\t338996\n太行大峡谷\t338997\n爸爸去哪儿2\t338998\nknives\t338999\n妮娜杜波夫\t339000\n吴道洪\t339001\n泱泱大风\t339002\n桂顺斋\t339003\n太阳能光伏发电系统\t339004\nCapo\t339005\nEXCLE\t339006\n3win10\t339007\nopenresty\t339008\n过敏性哮喘\t339009\n乐昌市\t339010\n弹劾案\t339011\n七星潭\t339012\nBOO\t339013\n阳光城\t339014\n社保增减员\t339015\n杂感\t339016\nhistogram\t339017\n光明中学\t339018\n黑暗之魂2\t339019\n勒流镇\t339020\n商业帝国\t339
021\n4B\t339022\n250W\t339023\n男漫\t339024\n琴瑟仙女\t339025\n到位\t339026\n高新\t339027\n触摸\t339028\nthinner\t339029\n汽油桶\t339030\n朝战\t339031\n西大\t339032\n雨帘\t339033\ncqvip\t339034\n依喵\t339035\n魔能\t339036\n唐克\t339037\n不相为谋\t339038\n苜蓿\t339039\n干细胞\t339040\n开心鬼\t339041\n湘籍\t339042\n东风农场\t339043\n举报人\t339044\n台山\t339045\n力争上游\t339046\n预付费卡\t339047\nwebrtc\t339048\n玄真\t339049\n色诱\t339050\nPC\t339051\n杨永信\t339052\n赵昕\t339053\n同宿\t339054\n上海微电子\t339055\n温实初\t339056\n2018年端午节\t339057\n祖方\t339058\n维纶\t339059\nander\t339060\n487021\t339061\n68岁\t339062\n中国核学会\t339063\n莱茨\t339064\nFAG\t339065\n四对\t339066\n亚特兰\t339067\n40座\t339068\n天龙八部3\t339069\n泄身\t339070\n卡贝\t339071\nygs\t339072\n提速\t339073\n桃花期\t339074\n018年\t339075\n日惹\t339076\n邗江区\t339077\n广厦队\t339078\n生抽\t339079\n李萱\t339080\n轴图\t339081\n湖田窑\t339082\n恋爱先生\t339083\n狗腿\t339084\n扭秤\t339085\n羚锐\t339086\nGIN\t339087\n科博汇\t339088\n验光员\t339089\n500多\t339090\n60分贝\t339091\n心法\t339092\ncooleditpro\t339093\n独眼龙\t339094\n三权分立\t339095\n朱贺\t339096\n椭圆\t339097\n银行间债券\t339098\nv4.0.4\t339099\n结发\t339100\n20150412\t339101\ncalico\t339102\n外交政策\t339103\n颞部\t339104\nVCL\t339105\n受激\t339106\n幽浮\t339107\n消缺\t339108\n斗鱼超管\t339109\nK420\t339110\n普密蓬\t339111\n偏袒\t339112\n365房产网\t339113\n心跳骤停\t339114\nNaturals\t339115\n常胜\t339116\n1962年\t339117\n指挥部\t339118\n眉膏\t339119\n晋中日报\t339120\nWiseW\t339121\n旋切\t339122\n智友\t339123\nccproxy\t339124\n王天龙\t339125\n超级战舰\t339126\n网上购物中心\t339127\n路透照\t339128\n源子\t339129\n人人网\t339130\n文辞\t339131\n赛腾股份\t339132\n高斯核函数\t339133\n每吨\t339134\n稳压二极管\t339135\n游离三碘甲状腺原氨酸\t339136\n26262\t339137\n好接\t339138\n书稿\t339139\nIos\t339140\nDunlop\t339141\n电磨机\t339142\nSwimming\t339143\n氟碳\t339144\n玫瑰疹\t339145\nshots\t339146\n最小化\t339147\nCorona\t339148\n茂林\t339149\n上海市建平中学\t339150\n里海大学\t339151\n汪泉\t339152\n芦山县\t339153\n35周年\t339154\n马桥\t339155\n外型\t339156\n正阳街\t339157\n镇原县\t339158\n变态服\t339159\n北京分院\t339160\n脆皮烤肉\t339161\nヤキモチ\t339162\n双拥工作计划\t339163\n脉络\t339164\n瓷路\t339165\n仪式\t339166\n没种\t339167\n蜘蛛网\t339168\nSTM32/8\t339169\n王晴\t33
9170\n德行\t339171\n白马河\t339172\n虚空鳐\t339173\n单行线\t339174\n自瞄\t339175\n旋挖机\t339176\nD2\t339177\nplaymaker\t339178\n东湖渠\t339179\n座包\t339180\nsy后遗症\t339181\n98式\t339182\nActing\t339183\n排球队\t339184\n2.4ghz\t339185\n百度推广助手\t339186\n什么型\t339187\n水沟\t339188\n护士们\t339189\n单身男\t339190\n煞笔\t339191\n对立面\t339192\n卡农吉他谱\t339193\n苦尽甘来\t339194\nairmail\t339195\nWord\t339196\n葡萄美酒夜光杯\t339197\n吹箫\t339198\n超声波清洗机\t339199\n细沙\t339200\n6MT\t339201\n技术人\t339202\nO2\t339203\ndd命令\t339204\n巴蒂斯图塔\t339205\n北京学院路\t339206\n楼上楼下\t339207\n青海干部网络学院\t339208\n属性能\t339209\n凌塘村\t339210\n格力高\t339211\n苏若年\t339212\n温湿度计\t339213\n长期股权投资减值准备\t339214\n东方资管\t339215\n投资管\t339216\n上位\t339217\nHard\t339218\n阿玛尼\t339219\n易纲\t339220\n多米诺骨牌\t339221\n整板\t339222\ny520\t339223\n歌德\t339224\n乐都网\t339225\nZuma\t339226\n一禅小和尚\t339227\n长春西站\t339228\n宿篇\t339229\n新泰市人民政府\t339230\n橡木门\t339231\n少人\t339232\n单根\t339233\n那坡县\t339234\nAPF\t339235\n碰\t339236\n顺层\t339237\n二挡\t339238\n上海绿城\t339239\n租赁部\t339240\n小左\t339241\n青岛法院\t339242\n728\t339243\n梵蒂冈博物馆\t339244\n挡风\t339245\n塑料味\t339246\n电子信息产业\t339247\n3.5倍\t339248\n跳槽季\t339249\n汉文化\t339250\n拍人\t339251\n点醒\t339252\n雷锋的故事\t339253\nRsync\t339254\nwomen\t339255\n南平市公务员局\t339256\n丝网印刷机\t339257\n韦币\t339258\nStrategies\t339259\n招待\t339260\nalt键\t339261\nCAJ阅读器\t339262\nposedge\t339263\n490\t339264\nTOP\t339265\n恶动\t339266\n崔真实\t339267\n政治局常委\t339268\n腾讯游戏吧\t339269\n中歌\t339270\n建康路\t339271\nfaces\t339272\n攝\t339273\n中国功夫\t339274\n临行前\t339275\npaoshentu\t339276\n自由职业者\t339277\n诺亚财富\t339278\n亡神\t339279\n大碶\t339280\n五六分钟\t339281\n携着\t339282\n写成\t339283\n叶卡捷琳娜\t339284\n恒宝\t339285\n转一圈\t339286\n佩姬\t339287\n孙武文化园\t339288\nHaKu\t339289\nPLOS\t339290\n9x\t339291\n5.0版\t339292\nengland\t339293\n素颜照\t339294\n黎明之光\t339295\n仓央嘉措\t339296\n敢问路\t339297\nftell\t339298\n半推\t339299\nvisited\t339300\n唐密\t339301\n北大荒\t339302\nsence\t339303\n王树国\t339304\ntr\t339305\n伊苏吧\t339306\n诗话\t339307\n通管\t339308\n班人\t339309\n狸奴\t339310\nRecently\t339311\nMyScript\t339312\n紫药水\t339313\n济南长途汽车站\t339314\n清单化\t339315\n玻
纤瓦\t339316\n三井\t339317\n红酒柜\t339318\n柳江\t339319\n眼圈\t339320\niwatch2\t339321\n4转\t339322\n花衣\t339323\n抱窝\t339324\n资金盘\t339325\n杀害\t339326\n广州好运医院\t339327\nmosaic\t339328\n一个15岁\t339329\n洛杉矶迪士尼\t339330\nuncertainty\t339331\n如龙极\t339332\n少许\t339333\nTal\t339334\nphpcmsV9\t339335\n陆家\t339336\n富贵黄金屋\t339337\n提名\t339338\n德意电器\t339339\n夜瞳\t339340\n朱良志\t339341\n中央商场\t339342\n警影\t339343\n振镜\t339344\n一行一个\t339345\n数字电子技术基础\t339346\n宪性\t339347\n位姿\t339348\n正信光电\t339349\n嗝\t339350\niosnba2k18吧\t339351\n大舒\t339352\ntitans\t339353\nEHCache\t339354\n沪陕高速\t339355\n2017.3.2\t339356\nng2\t339357\n国际旅行社\t339358\n封建迷信\t339359\n玄烨\t339360\n湖北省卫计委\t339361\n宫斗剧\t339362\n若干意\t339363\n风清扬\t339364\n4000张\t339365\n织田信长\t339366\n急诊电话\t339367\nMead\t339368\n0&#160\t339369\n樱井知香\t339370\nadguard\t339371\n國際\t339372\n1080P_\t339373\n3间\t339374\n过敏奶粉\t339375\n超世\t339376\n谦和\t339377\n赛期\t339378\nopencv3\t339379\n斩击\t339380\n下网\t339381\nLoad\t339382\n雅阁论坛\t339383\n贲门\t339384\n冲着\t339385\ncmap\t339386\n45平米\t339387\n要素\t339388\n吊椅\t339389\n华侨永亨银行\t339390\nMILFs\t339391\n阿彬\t339392\n蓄积\t339393\nFrameLayout\t339394\n贝珠\t339395\n常熟路\t339396\n网易课堂\t339397\n18t\t339398\n象棋赛\t339399\n负债\t339400\n银川西夏区\t339401\nmapstruct\t339402\n全仓\t339403\n第25\t339404\nCoaster\t339405\n2018-01-28\t339406\n関\t339407\n满艳\t339408\n遭封\t339409\n动态市盈率\t339410\n挪亚\t339411\n仪表台\t339412\n轮番\t339413\n鲲\t339414\n上海体育馆\t339415\nABD\t339416\n实验桌\t339417\n梁建国\t339418\n乾潭镇\t339419\nsmith\t339420\n分形科技\t339421\n江西\t339422\n北山\t339423\n嫩肤\t339424\nSTLINK\t339425\n快站\t339426\n投标方\t339427\nc2\t339428\n带木\t339429\n王恭\t339430\n玉篇\t339431\n色香味\t339432\ntrojan\t339433\n1554\t339434\n埋线双眼皮\t339435\n时器\t339436\n勤俭持家\t339437\n无版权\t339438\n移动办公\t339439\ndang\t339440\n爱护\t339441\n402号\t339442\n胎压监测系统\t339443\n小缘喵\t339444\nQQ群发软件\t339445\n成武\t339446\n九轴\t339447\n枣园\t339448\n面板数据处理\t339449\njavaEE\t339450\n自编自导自演\t339451\nrongyux\t339452\n陇川县\t339453\n吉大南校\t339454\n河北省科技厅\t339455\n三责险\t339456\nEnergistics\t339457\n互联网转型\t339458\n福州地铁2号线\t339459\nJvm\
t339460\n佳茵\t339461\n富美实\t339462\nzi\t339463\n主杆\t339464\n凭什么说\t339465\n李宗吾\t339466\n72芯\t339467\n600535\t339468\n何记\t339469\n绿地控股\t339470\n28所\t339471\nMath类\t339472\nBayes\t339473\n圣帝\t339474\n董又霖\t339475\n乳油\t339476\n罗晓\t339477\n中华人民共和国专利法实施细则\t339478\n胡莎莎\t339479\n9路\t339480\n大当\t339481\n470万\t339482\n靠边\t339483\ngmo\t339484\n张图\t339485\n两起\t339486\n文渊阁四库\t339487\nPegasus\t339488\n飘花\t339489\n管箱\t339490\n空境\t339491\nYX\t339492\n演奏版\t339493\n小米手机5\t339494\n一世情\t339495\n笔身\t339496\n混配\t339497\n还要\t339498\n定亲\t339499\n母胎\t339500\n无名战士\t339501\nspears\t339502\n小球藻\t339503\n增值税专用发票\t339504\n战矛\t339505\n探密\t339506\n龈\t339507\n有限责任\t339508\n珑铸\t339509\n骶管囊肿\t339510\n埃尔米特\t339511\n南渡江\t339512\n伏妖白鱼镇\t339513\n打昏\t339514\n伊姆斯\t339515\n伤怀\t339516\n大野智\t339517\n证券发行与承销管理办法\t339518\n水峪村\t339519\n新定\t339520\n6.32\t339521\n一箱\t339522\n现代语言学\t339523\n国家邮政局\t339524\n张新军\t339525\n20140228\t339526\n1.4.1\t339527\n航班号\t339528\n互市\t339529\n中国广播电视网络有限公司\t339530\n方字\t339531\n七都\t339532\n奥园集团\t339533\ndeluxe\t339534\n迪丽热\t339535\n乐贤人才网\t339536\n葛布\t339537\n测色\t339538\nVOGUE时尚网\t339539\n键级\t339540\n儒将\t339541\n中科汇联\t339542\n包围圈\t339543\ncass7.1\t339544\n黄芪多糖\t339545\n厄瓜多尔白虾\t339546\n硅pu球场\t339547\n葡萄糖酸钙锌口服溶液\t339548\n汉庭酒店\t339549\n阿斯塔纳\t339550\ntrace32\t339551\n郑州宇通\t339552\n小范\t339553\nbitshares\t339554\nscrew\t339555\n十八届中央纪律检查委员会\t339556\nRagnarok\t339557\n防火堤\t339558\nr卡\t339559\n盖式\t339560\n两性健康网\t339561\n平顶山市教育局\t339562\nHellobike\t339563\nWEEE\t339564\nPl\t339565\n刷圈\t339566\n泽泽\t339567\n反潜机\t339568\n3.50\t339569\n九凤\t339570\n螺城镇\t339571\n制服\t339572\n2133\t339573\n70200\t339574\n118.com\t339575\nshowmodaldialog\t339576\n一樘\t339577\n三明学院\t339578\n羽毛扇\t339579\n85式太极拳\t339580\nSurround\t339581\n两馆\t339582\n检波器\t339583\nais\t339584\n一众\t339585\n孟凡利\t339586\npathinfo\t339587\n类\t339588\neaby\t339589\n医保定点医院\t339590\n年龄限制\t339591\n8届\t339592\n刀剑乱舞online\t339593\ngrew\t339594\n江苏国泰\t339595\n湖南省民宗委\t339596\nxid\t339597\n铌酸锂\t339598\n弃风率\t339599\n京东白条闪付\t339600\n江西政协\t339601\nC照\t339602\n
绿色医院\t339603\nぉ\t339604\ncov\t339605\nd610\t339606\nRDJ\t339607\ncond_wait\t339608\n平型关大捷\t339609\n俄罗斯联邦\t339610\n罗版\t339611\n徐兵\t339612\n修炼场\t339613\n黑莓PRIV\t339614\nminiport\t339615\n楚雨荨\t339616\ntt1069\t339617\n160型\t339618\n喻恩泰\t339619\n伽马分布\t339620\nArcPy\t339621\nbouncy\t339622\n2015一级\t339623\n100mbps\t339624\n前额\t339625\n伯纳德\t339626\nAires\t339627\n天真\t339628\n浴缸\t339629\n气浮机\t339630\n博林\t339631\n补单\t339632\n概念店\t339633\n正云\t339634\n冠县\t339635\nmacao\t339636\n使命召唤ol\t339637\n大楷\t339638\n中联重科股份有限公司\t339639\n碧蓝\t339640\n6天4晚\t339641\nHRP\t339642\n渔家傲\t339643\n990\t339644\n货代公司\t339645\n雅美\t339646\n建军大业\t339647\nLaure\t339648\n考述\t339649\n昆一中\t339650\n脸谱化\t339651\ndirectx\t339652\nsave-dev\t339653\nListening\t339654\n金超群版\t339655\n乐宜嘉\t339656\n天乩之白蛇传说\t339657\nRED\t339658\n海之国\t339659\n辽阳市\t339660\nJOHN\t339661\n825\t339662\n傲游5\t339663\n秦欢\t339664\n賣\t339665\n无线传感器\t339666\n木杆\t339667\n霸气侧漏\t339668\n野生动植物\t339669\nmark2\t339670\n渴慕\t339671\n天下游\t339672\nziyuan\t339673\n物理期\t339674\n营业员\t339675\n报失\t339676\n微分销\t339677\nC50\t339678\nipcrs\t339679\n宇文玥\t339680\net加速器\t339681\n鲁宾\t339682\n健太郎\t339683\n1225\t339684\n奥体城\t339685\n康洁\t339686\n花荣\t339687\n日妆\t339688\n无心法师2\t339689\n2.3亿元\t339690\n挡风墙\t339691\n闻闻\t339692\n中国研究院\t339693\nhdpe管\t339694\n却说\t339695\n炉台\t339696\naware\t339697\n83分\t339698\n收集品\t339699\n2.2_\t339700\n踢球人网\t339701\n陈媛\t339702\n汾江\t339703\n海兽祭司\t339704\n药商\t339705\nIsolation\t339706\nnp小说吧\t339707\n纸灯\t339708\n小猪佩奇全集\t339709\nfl5900u\t339710\n爷们\t339711\n薛冰\t339712\nSNT\t339713\n32口\t339714\n严羽\t339715\n第1次\t339716\n移动3G\t339717\n樱木\t339718\ncallout\t339719\n夜上\t339720\n十三五规划\t339721\napparent\t339722\nmaxlength\t339723\nAlessia\t339724\n琉璃块\t339725\nzhg\t339726\n玄武湖\t339727\n自定义\t339728\n笔芯\t339729\nSENSORS\t339730\nSQLServer2008R2\t339731\nyak\t339732\n刘焉\t339733\n爱在黎明破晓\t339734\nlayers\t339735\n板型\t339736\n陈志超\t339737\n郑冰冰\t339738\n2017年9月20日\t339739\n未成年人\t339740\ndelimiter\t339741\n2016年12月23日\t339742\n919\t339743\n闽语\t339744\nFF12\t339
745\nsumif函数\t339746\n50级\t339747\n2388\t339748\n楼间\t339749\n邦纳\t339750\n猫罐\t339751\nProphecy\t339752\nwww85jjjjcom\t339753\n返航\t339754\nFlashtool\t339755\n两线\t339756\n温州科技职业学院\t339757\n例字\t339758\n耐荫\t339759\n作弊者\t339760\n镇\t339761\n2011年11月\t339762\n意乱情迷\t339763\nnwu\t339764\nvlive\t339765\n繁育\t339766\n1360\t339767\n0x1\t339768\n素晴\t339769\n太阳乡\t339770\n2012年5月\t339771\n幻灯片\t339772\nhelloworld\t339773\n妇人\t339774\n残杀\t339775\n玲珑骰子安红豆\t339776\nWeixin\t339777\n小升\t339778\n损益表\t339779\n小米主题吧\t339780\n不谈恋爱\t339781\nsese\t339782\n等离\t339783\n灵\t339784\n18道\t339785\n同市\t339786\n祭品\t339787\n最近5年\t339788\n祼照\t339789\nseeing\t339790\n制怒\t339791\nTEMPO\t339792\n时好\t339793\n二氯代物\t339794\n战剑\t339795\n众信旅游\t339796\n红叶石楠\t339797\nIP协议\t339798\n秘典\t339799\n细碎机\t339800\n职业道德\t339801\n汉十高铁\t339802\n防刺\t339803\n吴冠希\t339804\n浓眉哥\t339805\n恒昌贷款\t339806\n奇珀\t339807\n长板\t339808\n消停\t339809\n黑将\t339810\n东航\t339811\n判明\t339812\n乡下人家\t339813\nDATALOGIC\t339814\n第一太平戴维斯\t339815\nBillboard\t339816\n黑龙江省哈尔滨市南岗区\t339817\n埃克斯\t339818\n爱因斯\t339819\n园游会\t339820\n魔门\t339821\n10件\t339822\n梦幻神器\t339823\n团日\t339824\n手型\t339825\n山西农业大学信息学院\t339826\nwin7系统优化\t339827\n換\t339828\n胡海岩\t339829\n只字\t339830\nCCS\t339831\n希岛\t339832\n气象卫星\t339833\n团总支\t339834\n增删卜易\t339835\n王政\t339836\nCurry\t339837\ncatalina\t339838\n孤儿\t339839\n专利权质押贷款\t339840\n盘面\t339841\nprovisions\t339842\n食管静脉曲张\t339843\n宝马X3论坛_汽车之家论坛\t339844\n青少年活动中心\t339845\n集气罩\t339846\n战锤40K战争黎明3\t339847\n郑州航空港\t339848\n凸优化\t339849\ngpo\t339850\n联通沃\t339851\n微侠网\t339852\nearnings\t339853\n7.9寸\t339854\n赵志全\t339855\n河北广播电视台\t339856\n华夏化工网\t339857\n竞报\t339858\n武山政府网\t339859\n绣花\t339860\n东鹏特饮\t339861\n郑州二七区\t339862\n逃家小兔\t339863\n开福区\t339864\n八十分\t339865\n镶嵌\t339866\nxuelei\t339867\n两军\t339868\n奥地利\t339869\n飞刀手\t339870\n商盟\t339871\n赵杰\t339872\n911s5\t339873\n标准库\t339874\n惊天地\t339875\n亡灵序曲\t339876\n佛教社-地藏论\t339877\n东沙群岛\t339878\ncrc\t339879\n佳和\t339880\n模拟人生4_模拟人生4\t339881\nhaar\t339882\n贾盛强\t339883\n特效音\t339884\n排卵试纸强阳\t339885\n大众路\t339886\n星玥\t339887\
n悬臂吊\t339888\n国诚\t339889\n无锡硕放机场\t339890\nTST庭\t339891\n2003年度\t339892\n黄道吉日网\t339893\n中国青少年发展基金会\t339894\n临床路径管理\t339895\n夏雪宜\t339896\n金融市场基础知识\t339897\n22句\t339898\n氣\t339899\nesk\t339900\n智慧博物馆\t339901\nyongheng\t339902\nPHOTOS\t339903\n青文\t339904\n人体\t339905\n道路运输经营许可证\t339906\net199\t339907\ndiskstation\t339908\n乱交\t339909\n巴基斯坦\t339910\n西大直街\t339911\n企业\t339912\n苍山洱海\t339913\n沈阳北\t339914\n长沙搬家公司\t339915\n翔鹤\t339916\n黑斑病\t339917\n上海交通大学安泰经济与管理学院\t339918\nGOOD\t339919\n山药汤\t339920\n金河湾\t339921\n贾昆\t339922\nswipe\t339923\n内网版\t339924\nsocial\t339925\n宠物小精灵吧\t339926\n蛋糕粉\t339927\n信用危机\t339928\n河南知识产权局\t339929\n吉姆雷诺\t339930\n冲关\t339931\n大阪机场\t339932\n女乘客\t339933\n三雄极光照明\t339934\n奥学网\t339935\n东沙\t339936\n饥饿游戏2\t339937\n柯布道格拉斯\t339938\nwell\t339939\niwork\t339940\n365地产家居网\t339941\n周凯宋美龄\t339942\nFirming\t339943\n鲁白\t339944\n51套\t339945\nblythe\t339946\nLas\t339947\n苏明明\t339948\n汉阳大学\t339949\n震区\t339950\n一美元\t339951\n市直单位\t339952\n云财经\t339953\n咨询员\t339954\n刘晓君\t339955\n免税商店\t339956\nThinkcell\t339957\n情与欲\t339958\nComb\t339959\n朗园\t339960\n全列\t339961\n资源组\t339962\n82290103\t339963\nspringerlink\t339964\n多边形\t339965\ntissue\t339966\n花液\t339967\n趋肤效应\t339968\n好机\t339969\n天津财经大学珠江学院\t339970\n苍叶\t339971\n难测\t339972\n期货黄金\t339973\n1兆瓦\t339974\n7.7%\t339975\n沃尔德\t339976\n男人口\t339977\n第9名\t339978\n阿古茹奥特曼\t339979\nJacoco\t339980\nv1.2.8\t339981\n郑州大学体育学院\t339982\nx15\t339983\n闪蒸罐\t339984\n世界人口日\t339985\nRoll\t339986\nrtmp推流\t339987\nsinmulink\t339988\nRida\t339989\n华兴\t339990\n闫峰\t339991\n势必\t339992\n猪头\t339993\n虞啸卿\t339994\n刚泰集团\t339995\nax^2\t339996\n刘琴\t339997\n部长级\t339998\n20150805\t339999\n雪季\t340000\n豆搜网\t340001\n定江洋\t340002\n法人资格\t340003\n含铬\t340004\n谢迦勒\t340005\nThorlabs\t340006\nnite\t340007\n3200亿\t340008\n一个8\t340009\n10万级\t340010\n芳樟醇\t340011\nv1.8.1\t340012\nMist\t340013\n康力电梯\t340014\n豪力士\t340015\nSMB协议\t340016\n0.01%\t340017\n附加表\t340018\n巨蛛\t340019\n园艺\t340020\nonehot\t340021\n5620\t340022\n毕业论文重复率\t340023\nChangi\t340024\n骄横\t340025\n流动补胎\t340026\n春种\t340027
\n粗话\t340028\n三分之\t340029\n中植\t340030\n华簪录\t340031\n包装工\t340032\n李俊\t340033\n直条\t340034\n洛天\t340035\n李实\t340036\nAutocad2015\t340037\n脏话\t340038\n前一分钟前一秒\t340039\n帷幄\t340040\n126.com\t340041\n龙岩\t340042\n黑苹果乐园\t340043\nsupposed\t340044\n永中Office\t340045\n肥肥\t340046\nC18\t340047\nmf4752\t340048\n晓芹海参\t340049\nVisualization\t340050\nYuen\t340051\n马国强\t340052\n浙江大学控制科学与工程学院\t340053\n铺路石\t340054\n美岸\t340055\n长寿花\t340056\n山东省建设厅\t340057\n舐\t340058\n南国桃园\t340059\n公共事业管理\t340060\n王福朋\t340061\nJCW\t340062\n牟星\t340063\n人脑\t340064\n物资管理系统\t340065\nFluke\t340066\nyc\t340067\n声音\t340068\n地震震级\t340069\n冒险岛SF\t340070\nD10\t340071\n沙井\t340072\n片长\t340073\n人民邮电报\t340074\n佩\t340075\n青岛分公司\t340076\n补卡\t340077\n官费\t340078\n1.5a\t340079\n20年内\t340080\nfreight\t340081\n第三者责任险\t340082\n李洵\t340083\n紫竹院\t340084\n御东\t340085\n佛山农商银行\t340086\nAIDE\t340087\n4452\t340088\n国门\t340089\n滚动栏\t340090\n民航业\t340091\n一眨眼\t340092\n_唐论坛_汽车之家论坛\t340093\n免税证明\t340094\n矜持\t340095\n张耳\t340096\n商品展\t340097\n发卡\t340098\n颜茶\t340099\n同光\t340100\n道地\t340101\n节奏感\t340102\n大武\t340103\nSUR\t340104\n脖颈\t340105\nbiubiubiu\t340106\n发圈\t340107\n斜顶\t340108\n0224\t340109\n2017一级\t340110\n家庭关系\t340111\n前女友\t340112\n肛肠病\t340113\n银赫\t340114\n遍地\t340115\nBattlEye\t340116\n产业投资基金\t340117\n岳毋\t340118\n玉环\t340119\nyandex浏览器\t340120\n胶底\t340121\n年终奖金\t340122\n卡德里亚\t340123\nOrbi\t340124\n点线\t340125\nCognos\t340126\n云立方\t340127\ncdd\t340128\n昭通市人力资源和社会保障局\t340129\n說\t340130\n子空间\t340131\n伟大的渺小\t340132\nVancouver\t340133\nNeeded\t340134\n鸸鹋\t340135\n国债期货\t340136\n两组\t340137\n土豆粉\t340138\ngushi\t340139\n女中豪杰\t340140\n曾朴\t340141\n品骏快递\t340142\ncmm\t340143\nE430c\t340144\n600654\t340145\nSTAR-CCM+\t340146\n叶准\t340147\n张迁碑\t340148\n高君\t340149\n中兴\t340150\n陕西省人大常委会\t340151\n9旬\t340152\n发芽率\t340153\ncannt\t340154\n寒门\t340155\n澄明\t340156\n议息会议\t340157\n90方\t340158\nTarzan\t340159\n分摊\t340160\n内卷\t340161\n阿格斯\t340162\n八年级\t340163\n说听\t340164\nz00\t340165\n敢死炮\t340166\n塔塔尔族\t340167\n大逆\t340168\n梳式\t340169\nbadboy\t340170\n美康粉黛\t340171\nsafel
y\t340172\n抑菌圈\t340173\n盛虹\t340174\n腾讯大渝网\t340175\nVCDS\t340176\n201603\t340177\nsocketserver\t340178\n_平板_亿智蘑菇\t340179\n甲状腺弥漫性病变\t340180\nunm\t340181\n五菱宏光s1\t340182\n爆火\t340183\ng703\t340184\n屏锁\t340185\n马克思盘\t340186\n布道\t340187\n毛茸茸\t340188\n民营\t340189\n杨姣\t340190\narrival\t340191\n嘉松中路\t340192\n内存卡\t340193\n断命\t340194\n曾经沧海难为水\t340195\nBaeldung\t340196\n古颜倾城\t340197\nSPORT\t340198\nmartial\t340199\n申根签证\t340200\nChro\t340201\n汉弗莱\t340202\n木胶\t340203\n钱机\t340204\n治学\t340205\nintegrator\t340206\nLex\t340207\n屈中恒\t340208\n珠海市金湾区政府\t340209\nInstallShield\t340210\nlacks\t340211\nFeast\t340212\n郭嘉文\t340213\n版本转换器\t340214\n超出\t340215\n吲哚美辛\t340216\n奥迪S3\t340217\n第二胎\t340218\n柔柔\t340219\nOUTLOOK\t340220\n截止日\t340221\n莫利\t340222\n墨彩\t340223\n华西能源\t340224\nmines\t340225\n铁甲\t340226\n103.7\t340227\nshuru\t340228\n刘志刚\t340229\n▅\t340230\n浦桥\t340231\ninfiniteflight\t340232\n禁入\t340233\n易才集团\t340234\n河南高院\t340235\n黄泽\t340236\n塞尔达传说风之杖\t340237\nJewel\t340238\n43位\t340239\n荠荠菜\t340240\n服主\t340241\n浙江民泰商业银行\t340242\nwrl\t340243\n1.5t\t340244\n万能材料试验机\t340245\n辙\t340246\nqvod伦理片\t340247\n汉商\t340248\n闻官军\t340249\n中国二十二冶集团有限公司\t340250\n黑丸格斗\t340251\n茎杆\t340252\n脱产\t340253\n倔\t340254\ndigitalocean\t340255\n青川\t340256\n头颈部\t340257\n百万新娘之爱无悔\t340258\n还必须\t340259\n广州出版社\t340260\n0056\t340261\njdx\t340262\nNOIP2017\t340263\n青春男\t340264\n2018-01-12\t340265\n福州市鼓楼区人民政府\t340266\n选定点\t340267\n报忧\t340268\n于娟\t340269\n依存\t340270\n虚接\t340271\nRTSP流\t340272\n超脱\t340273\n17.9\t340274\n5015\t340275\n会员日\t340276\nsolarwinds\t340277\n清洗液\t340278\nmmy\t340279\n心灵终结\t340280\n广饶县政府\t340281\n大办\t340282\n演讲\t340283\nstorms\t340284\n人次\t340285\n2016年五一\t340286\nカット\t340287\n快装板\t340288\n罗森伯格\t340289\n拼配\t340290\nPlex\t340291\nbandicam\t340292\n电锯惊魂1\t340293\nRegions\t340294\nCAPA\t340295\n谈笑风生\t340296\n入声\t340297\n白标\t340298\n程东\t340299\n日内瓦湖\t340300\n石英砂滤料\t340301\n深圳市建筑工务署\t340302\n子华\t340303\n土地税\t340304\n竞艳\t340305\n40张\t340306\n过去五年\t340307\n习近平总书记系列重要讲话读本\t340308\n暴动\t340309\n环球自然日\t340310\nv文\t340
311\n服战\t340312\n赤峰学院附属医院\t340313\n管交网\t340314\n冰冻三尺\t340315\n承租房\t340316\n演职\t340317\n皇庭广场\t340318\n系统配置\t340319\n小末\t340320\n6.X\t340321\n仙剑奇侠传3d\t340322\n甘苦\t340323\n天巡机票\t340324\n霜花店\t340325\n枝江市\t340326\nOle\t340327\n额骨\t340328\n且行且珍惜\t340329\n市丸银\t340330\nnightingale\t340331\n天宸股份\t340332\n综保区\t340333\n福建省物价局\t340334\n雕像\t340335\nJAVA语言\t340336\n993\t340337\n陈氏太极\t340338\n点像\t340339\nunslider\t340340\nSHIFT\t340341\n廖其刚\t340342\n紫金山庄\t340343\n富良野\t340344\noriginals\t340345\n报告册\t340346\n贺忍蛙\t340347\n上牌123网\t340348\n君略研究报告网\t340349\nHPL\t340350\n恒生银行\t340351\n环礁\t340352\n起早贪黑\t340353\nparted\t340354\nCTC\t340355\n注册安全工程师执业资格考试\t340356\n绝笔\t340357\n长客\t340358\n可导性\t340359\n美年大健康体检\t340360\n3D动漫\t340361\n过速\t340362\n3件套\t340363\n张骞\t340364\nkdj\t340365\n净重\t340366\n控制端\t340367\n风吕\t340368\n晤\t340369\n技术攻关\t340370\n天地豪情\t340371\n十二少\t340372\n浮筒式\t340373\n浮吊\t340374\n林珍娜\t340375\n玺悦\t340376\n书桌椅\t340377\n脱脂\t340378\n上海实验小学\t340379\n龟虽寿\t340380\nxcopy\t340381\n喜成\t340382\n调试\t340383\n土地估价师\t340384\n林冉\t340385\n魔兽战士吧_\t340386\nEVie\t340387\n螺塞\t340388\n九七\t340389\n蛋筒\t340390\n未婚\t340391\n尺八\t340392\n胜利小学\t340393\n平和县\t340394\n进来\t340395\n20161210\t340396\n呼伦贝尔海拉尔\t340397\n书院路\t340398\n做多\t340399\nminiso\t340400\n无火香薰\t340401\n亚欧大陆\t340402\n黑群辉\t340403\n首善\t340404\n万科城一期\t340405\n慢性肺炎\t340406\n葵花护肝片\t340407\n仿型\t340408\n李荐国\t340409\n魔客\t340410\n香港岛\t340411\n呼和浩特职业学院\t340412\n盲约\t340413\n安居镇\t340414\np1106\t340415\n庞各庄镇\t340416\n压缩衣\t340417\nBl\t340418\n魔发\t340419\n対応\t340420\n太阳神鸟\t340421\n辉煌紫金\t340422\nJAYJUN\t340423\n双子\t340424\n钟玉\t340425\n六线谱C\t340426\n联动型\t340427\n蒋牧童\t340428\n抗美援朝\t340429\nvisio图标库\t340430\n安徽省气象局\t340431\n互相推诿\t340432\n1兆\t340433\n完全二叉树\t340434\n棉农\t340435\n武汉市工商局\t340436\n1.6AT\t340437\n海台\t340438\npidgin\t340439\n尹红\t340440\n天翔\t340441\ntales\t340442\n孔口\t340443\n五月色在线\t340444\n跃层\t340445\n中欧法学院\t340446\n江智民\t340447\n起亚model\t340448\ngen\t340449\n红腰带\t340450\n川仪\t340451\n青年联合会\t340452\n杨堤码头\t340453\n醉地\t340454\nRespiratory\t340455\nPivot\t340
456\n语文者\t340457\n单螺杆泵\t340458\n周会\t340459\nwaitpid\t340460\n剪去\t340461\nBeginning\t340462\ndown\t340463\n京海\t340464\nsihu\t340465\n黑五海淘\t340466\nOverhead\t340467\n家图网\t340468\n黎敏\t340469\n装配式建筑\t340470\n全身照\t340471\n加州艺术学院\t340472\n数控编程CAM\t340473\n注会考试\t340474\n搂\t340475\n球探\t340476\n死宅\t340477\n傩舞\t340478\n预聘\t340479\nideavim\t340480\nword16\t340481\n222.222\t340482\n龙湖椿山\t340483\n武士刀\t340484\n广州松下\t340485\n丙纶布\t340486\n青椒土豆丝\t340487\n欧·亨利\t340488\n京师心晴网\t340489\n通俗版\t340490\n_扑家吧_扑家工作室\t340491\n套近乎\t340492\n千层酥\t340493\n晚报\t340494\n风云再起\t340495\n铝门\t340496\n赵凯华\t340497\n不分配\t340498\n恋童癖\t340499\nMsc\t340500\nQQ飞车-游久网\t340501\n苹果macbook\t340502\n二月红\t340503\nconcatenation\t340504\n益生菌冲剂\t340505\n槲寄生\t340506\n核销\t340507\n健康村\t340508\nMarion\t340509\nsystemcare\t340510\n诛魔\t340511\n上海9号线\t340512\n万辆\t340513\n消户\t340514\n330国道\t340515\n差分放大器\t340516\ndrawableLeft\t340517\n鸿声\t340518\n鞋钉\t340519\nQQ粉丝网\t340520\ncarthage\t340521\n武陵春\t340522\n中岛美嘉\t340523\n温度传感器\t340524\n乙肝五项检查\t340525\n北莱茵\t340526\n450亿美元\t340527\n枫无涯\t340528\n御妹\t340529\n张小平\t340530\n美龄\t340531\nvol\t340532\n属于自己\t340533\n涛妹\t340534\n25P\t340535\n鱼桥小学\t340536\n此外\t340537\nZFS\t340538\nifttt\t340539\n微信营销软件\t340540\n恒常\t340541\n沈铁\t340542\n王甄\t340543\n从犯\t340544\n4G路由器\t340545\n第34轮\t340546\n三星电视\t340547\n步行街\t340548\n竹海\t340549\n富甲\t340550\n千家诗\t340551\n法国政府\t340552\n争胜\t340553\nitun\t340554\n临江工业园区\t340555\nogg\t340556\n降魔的\t340557\n罗艳\t340558\nlana\t340559\n东莞市工商行政管理局\t340560\n马健\t340561\n热血石\t340562\n英规\t340563\n林立\t340564\ntmg\t340565\n内表\t340566\nAMO\t340567\n新浪育儿_新浪网\t340568\n禁封\t340569\nojdbc6\t340570\nCREW\t340571\n中日医院\t340572\n_商讯\t340573\n固伤\t340574\n解场\t340575\n暮鼓\t340576\nxpose\t340577\n心惊胆颤\t340578\n探知\t340579\nJet\t340580\n钟山信息网\t340581\n溶解度\t340582\n反同恋\t340583\n编程人\t340584\n汉字编码\t340585\n影音先锋资源AV看片站\t340586\n地灯\t340587\n开化县\t340588\n黄酱\t340589\n第62号\t340590\n碰击\t340591\n号角\t340592\n李夏怡\t340593\n爆甲\t340594\n内因\t340595\nwestin\t340596\n合肥地铁3号线\t340597\n王文峰\t340598\n已逾\t340599\n2017年4月
30日\t340600\niproute\t340601\nOldFiles\t340602\nslog2\t340603\n捷波朗\t340604\n阿户\t340605\n暴风影音解码器\t340606\n大范\t340607\n软件包\t340608\n宠物连连看\t340609\n腐漫控\t340610\n白凡士林\t340611\n联友\t340612\n1510\t340613\n好身材\t340614\n王春雷\t340615\n西木\t340616\n孙朱朱\t340617\n原则上\t340618\n初级建(构)筑物消防员\t340619\nincell\t340620\n星球大战:原力觉醒\t340621\n根联\t340622\n组网\t340623\n羌活\t340624\nmp4格式转换器\t340625\ndebain\t340626\n硬核联盟\t340627\nPros\t340628\n氯酸钠\t340629\n来玩游戏\t340630\n党政机关厉行节约反对浪费条例\t340631\n14400\t340632\n扫街\t340633\n众创杯\t340634\nTimeline\t340635\n来自新世界\t340636\nsublimeText3\t340637\n3.6级\t340638\n子系\t340639\n1千块\t340640\n南京大学软件学院\t340641\n报恩\t340642\nGabbana\t340643\n终极传奇\t340644\nIOS11\t340645\n683\t340646\n仍未\t340647\npreprocessing\t340648\n性型\t340649\n立安\t340650\nsupersu\t340651\n河南人\t340652\n一百年\t340653\nGAN\t340654\n何丽萍\t340655\n开心消消乐吧\t340656\n放轻松\t340657\n120cm\t340658\n核实\t340659\nlasts\t340660\nSails\t340661\n24P\t340662\n巧奔妙逃\t340663\ndesmos\t340664\n锦州市\t340665\n铀矿石\t340666\nmima\t340667\n鸡豚\t340668\n戏份\t340669\n钱桥\t340670\n青木瓜之味\t340671\n牢靠\t340672\n记工\t340673\ndischarge\t340674\n广发东航\t340675\n奥邦\t340676\n网页\t340677\n疯爱\t340678\n18个月\t340679\n4f\t340680\n量子论\t340681\n褚\t340682\n广汇汽车服务股份公司\t340683\n男生篇\t340684\n奎恩\t340685\nxpaper\t340686\n衍生\t340687\n舞蹈类\t340688\n25v\t340689\n集思\t340690\nhet\t340691\n奇怪\t340692\n干城\t340693\n未买\t340694\n1968年\t340695\njean\t340696\n上海世博会中国馆\t340697\n桐嶋\t340698\n中箭\t340699\nmg老虎机\t340700\n玲子\t340701\n工业路\t340702\nauc\t340703\n边录\t340704\n工作家\t340705\n小肥\t340706\n1815年\t340707\n衡南\t340708\n小月龄\t340709\n斗龙啪啪三国\t340710\n黄花园\t340711\nLaTex\t340712\n踮\t340713\n复来\t340714\n民族路\t340715\nSEGA\t340716\n鱼油\t340717\n960PRO\t340718\ncyb\t340719\n中邮网[集邮/钱币/邮票/金银币/\t340720\n球筒\t340721\n天正8\t340722\nInduction\t340723\n游拍\t340724\nzypxx高清\t340725\n股窜网\t340726\n周中\t340727\n2257\t340728\nnoon\t340729\n赵铭\t340730\nExtjs5\t340731\n刘易斯\t340732\n祖宗\t340733\n瑞昌网\t340734\n中国囍联\t340735\ntsung\t340736\n钻杆\t340737\n海森堡\t340738\n软件工程\t340739\n内存数据库\t340740\n加拿利\t340741\n前前\t3407
42\n亚型\t340743\nBhaijaan\t340744\n中集来福士\t340745\n第三性\t340746\nRaspbian\t340747\nAquatic\t340748\n第十七回\t340749\n龙王丸\t340750\n诺亚-诺亚\t340751\nzhongyao\t340752\n文旅地产\t340753\nocv\t340754\n喷绘布\t340755\ncurious\t340756\nKunshan\t340757\n血祭\t340758\n宜居社区\t340759\n首道\t340760\n完美公司\t340761\n朱国荣\t340762\n兀兀\t340763\n安徽省发改委\t340764\n推介会\t340765\n游心\t340766\n流行语\t340767\n李瑞镇\t340768\ne5573s-856\t340769\n阿潼\t340770\nKotlin\t340771\nMFC-7340\t340772\n月分\t340773\n大盗胡因梦\t340774\n时迁\t340775\n东莨菪碱\t340776\n走多远\t340777\nitf\t340778\nGoose\t340779\n张俊峰\t340780\nCyclic\t340781\nRacks\t340782\n高温型\t340783\n创创\t340784\n64.zip\t340785\n合唱队\t340786\nyet\t340787\n使命召唤7\t340788\nFEED\t340789\nSLES\t340790\n大掌门\t340791\n日记体\t340792\n1717\t340793\ncatv\t340794\n路遥知马力\t340795\n剁辣椒\t340796\n22732799\t340797\n四棱柱\t340798\nyunpan\t340799\nBUF\t340800\nIE内核\t340801\n第68集\t340802\n银田\t340803\nscoot\t340804\n39.5度\t340805\nOrchid\t340806\n洗衣板\t340807\nALIENWARE\t340808\nab级\t340809\n欧\t340810\n蓝象\t340811\nAV狼\t340812\n西窗\t340813\nASRock\t340814\n18片\t340815\n两年以上\t340816\n萧全\t340817\n几十万元\t340818\n辉煌\t340819\n七夜\t340820\n整流变压器\t340821\n湖北省人民政府外事侨务办公室\t340822\n正原\t340823\nzzc\t340824\n吴疆\t340825\nknime\t340826\nsz\t340827\n青岛市人民政府\t340828\n独秀峰\t340829\n国家新闻出版署\t340830\n阿森西奥\t340831\n损失\t340832\n汇博人才网\t340833\n许明吉\t340834\n跃然\t340835\n熙地港\t340836\n林虎\t340837\n火漆\t340838\n专接本\t340839\n东隅\t340840\n宽街\t340841\n嘿Siri\t340842\n流浪客$\t340843\nnel\t340844\nyuu\t340845\n太湖边\t340846\n教师端\t340847\n蟒袍\t340848\nwfs\t340849\n任然\t340850\n揉揉\t340851\n说货\t340852\n24亿\t340853\n重庆南\t340854\n酣畅淋漓\t340855\nOTG\t340856\n双高\t340857\n香港利丰集团\t340858\n审报\t340859\n史姗妮\t340860\n山岭\t340861\n十三朝\t340862\n宠物室\t340863\n欢乐比亚迪秦\t340864\n状态机\t340865\n第41\t340866\n20多元\t340867\n花水月\t340868\nbojiangzhou\t340869\n龙陵县人民政府\t340870\n遥观\t340871\n武警水电部队\t340872\n闽西\t340873\n仙豆\t340874\n6.98万\t340875\n重生之红星传奇\t340876\nbgo\t340877\n新生机\t340878\n鞋靴\t340879\n翔宇\t340880\ncout\t340881\n852\t340882\n东莞市政府\t340883\nTimers\t340884\n数落\t340885\n9宫格\t34088
6\nfenxi\t340887\n刷新\t340888\n自行榴弹炮\t340889\n蝴蝶机\t340890\n强生汽车\t340891\nRoot\t340892\n蜂窝\t340893\n反叛\t340894\n审议\t340895\n古墓丽影暗影\t340896\n正骨\t340897\n平滑肌\t340898\nfda\t340899\nh-mate\t340900\n钛钢\t340901\nskechers\t340902\n羡慕嫉妒恨\t340903\n乳牙\t340904\n雅昌艺术网\t340905\ndazhe\t340906\nimusic\t340907\n委外加工\t340908\n伍思凯\t340909\nKia\t340910\n收刀\t340911\nmusings\t340912\nabook\t340913\n尚湖\t340914\n达鲁伊\t340915\n北京大学人民医院\t340916\n集线器\t340917\n卢庚戌\t340918\n柯桥教育\t340919\n钱清镇\t340920\n花嫁\t340921\n昂克全职法师\t340922\nc类\t340923\naodaliya\t340924\n田宅宫\t340925\nFirework\t340926\n美好的人生\t340927\n1001个\t340928\n300毫米\t340929\n喵窝丨由依喵\t340930\nい\t340931\n股骨颈\t340932\n450万\t340933\n农普\t340934\n金鸡百花奖\t340935\n行间\t340936\nEOL\t340937\n狸窝视频转换器\t340938\n郑青松\t340939\nquill\t340940\n汉谜网\t340941\n张士超\t340942\nlsblk\t340943\nreceipt\t340944\n2017.0\t340945\n半框眼镜\t340946\n香春\t340947\n飘飘欲仙\t340948\n靠得住\t340949\nB75\t340950\n秋田县\t340951\nvb2010\t340952\n青杏\t340953\n宿州市\t340954\n彭磊\t340955\n上海市实验学校\t340956\n路伟\t340957\n怪诞\t340958\n抗滑\t340959\nretainer\t340960\n新四化\t340961\ndml\t340962\nvarnish\t340963\n唐风\t340964\n王玉珏\t340965\n郑州大学第二附属医院\t340966\n感冒清热颗粒\t340967\n定制版\t340968\n核安全法\t340969\n110集\t340970\n中国集团\t340971\nUSB分线器\t340972\n9岁\t340973\nJoyful\t340974\n文光\t340975\n堀江耽闺\t340976\n3g.renren.com\t340977\n601997\t340978\n华宁\t340979\n12.21\t340980\n两号\t340981\n甘肃日报\t340982\nRedmine\t340983\n小数\t340984\n50笔\t340985\n刘胡\t340986\n审核制\t340987\n临床执业医师考试\t340988\n混血儿\t340989\n偏心率\t340990\n30米\t340991\n标志3008\t340992\n赛乐特\t340993\n制服诱\t340994\n枪类\t340995\n汉恩\t340996\n量子理论\t340997\n6450\t340998\n戴斌\t340999\n反应该\t341000\n丁颖\t341001\nMzi8起名网\t341002\n苏群\t341003\n刘益谦\t341004\n瑞鹰\t341005\n深圳研究生院\t341006\n关于落实全面从严治党\t341007\nerrors\t341008\n中国泵业网\t341009\n1917年\t341010\n为己\t341011\ncuan\t341012\n小少年\t341013\nDry\t341014\n远大医药\t341015\nThermos\t341016\n华致酒行\t341017\n珠海国家高新技术产业开发区\t341018\nDogfart\t341019\nLtd\t341020\n华夏家博会\t341021\n盟军敢死队1\t341022\n大学生职业生涯规划\t341023\nn9100\t341024\n1563\t341025\n付息\t341026\n陕西省中小企业促进局
\t341027\n牛盘\t341028\n系统盘\t341029\n几颗\t341030\n电压锅\t341031\n南山集团\t341032\n李永红\t341033\n基础工程\t341034\n绍兴镜湖\t341035\n爱普生L380\t341036\n四爪\t341037\nPants\t341038\n电子部\t341039\n压床\t341040\n宝鸡陈仓区\t341041\nneed\t341042\n850pro\t341043\n折桂\t341044\n医科类\t341045\n临界量\t341046\n李溪芮\t341047\nxiake\t341048\nupp\t341049\n军魂\t341050\n辽宁衡润飞豹篮球俱乐部\t341051\nA罩杯\t341052\n中职教育\t341053\n导进\t341054\n放射肿瘤学\t341055\ng5\t341056\n偃旗息鼓\t341057\ndropped\t341058\n造价通\t341059\ngarrett\t341060\nVPU\t341061\n340个\t341062\n撩心\t341063\n美欲\t341064\n脑干\t341065\n黄翠如\t341066\nSplunk\t341067\n人工智能网\t341068\nqnx\t341069\n水浒求生记\t341070\n自卫还击\t341071\nEU4秘籍大全/作弊码/事件/成就|Europa\t341072\n720P/1080P-MKV\t341073\n甘肃省工商行政管理局\t341074\n放心\t341075\n欧琳\t341076\n篦子\t341077\n2017年8月18日\t341078\n变形金刚4:绝迹重生\t341079\n修饰器\t341080\n莲花健康\t341081\nr7\t341082\n蜂蜜柠檬\t341083\n第一二\t341084\n陈皮茶\t341085\ntitlebar\t341086\n数\t341087\n保福寺\t341088\nrfm\t341089\n第953集\t341090\n单币卡\t341091\n地轴\t341092\n济南中学\t341093\n云南房网\t341094\nberserker\t341095\nbdrip\t341096\nMpa\t341097\ncaidan\t341098\n上学期\t341099\n指北针\t341100\nWindowsUpdate\t341101\n一幅\t341102\n吴皓\t341103\n天地间\t341104\n微人\t341105\n倒休\t341106\n哈布\t341107\n华燕\t341108\nconemu\t341109\n第22次\t341110\n自然灾难\t341111\n景景\t341112\nBarker\t341113\n勃林格\t341114\nsafeshare\t341115\n深圳市审计局\t341116\nDAT格式\t341117\n程序块\t341118\n1175\t341119\n防患未然\t341120\n阿珍\t341121\n贤母\t341122\n兽头\t341123\n爱妻号洗衣机\t341124\n都一样\t341125\n中石化集团\t341126\nxhEditor\t341127\nplayclub\t341128\n真可\t341129\n狗样\t341130\n奇楠\t341131\n新农保\t341132\n46亿元\t341133\n海虎\t341134\nkoei\t341135\n上海人民电器\t341136\nfalogin\t341137\noot\t341138\n胎心音\t341139\nhypertrm\t341140\n复合大师\t341141\n威宁路\t341142\n蓝印\t341143\nctl671\t341144\n押镖\t341145\n假借\t341146\n寒云\t341147\n比肩\t341148\n分片\t341149\n水合肼\t341150\n912\t341151\nInit\t341152\n指挥者\t341153\n群马县\t341154\nexcel表格批量\t341155\n内蒙古政府网\t341156\n凸轮轴位置传感器\t341157\n至纯科技\t341158\n盐步\t341159\nimpk\t341160\n摩信网\t341161\n杭州地铁3号线\t341162\n干鱿鱼\t341163\n武士\t341164\n区域链\t341165\nok\t341166\n仙路争锋\t341167\n童年\t
341168\n地狱浩劫\t341169\n黑龙江省政府\t341170\n工字梁\t341171\n桓公\t341172\n概率性\t341173\n大通燃气\t341174\n太平区\t341175\n诽谤案\t341176\n湘江战役\t341177\nxx镇\t341178\n独立显卡驱动\t341179\n清脑\t341180\nv5.3.0\t341181\ndeg\t341182\n樱树\t341183\n老西门\t341184\n向宇\t341185\niWatch\t341186\n4020\t341187\n孕酮\t341188\n护土\t341189\n鸭胸肉\t341190\nFnatic\t341191\n芦名ユリア\t341192\nUdp\t341193\n可测\t341194\n内行\t341195\n北京市朝阳区发展和改革委员会\t341196\n子结构\t341197\n异常性\t341198\nT570\t341199\nMultipath\t341200\n1387\t341201\n白案\t341202\n改革方\t341203\n嘱托记\t341204\n大班额\t341205\n現代\t341206\n講\t341207\n全味\t341208\n中青报\t341209\n书案\t341210\n松井珠理奈\t341211\ncosn\t341212\n霞浦县政府\t341213\n李顺\t341214\n鲜米机\t341215\n肖楠\t341216\n高新技术\t341217\n背景框\t341218\n管理运筹学\t341219\n太平山\t341220\n诛仙手游青云\t341221\n容国\t341222\n这种\t341223\n李颂慈\t341224\n植入\t341225\n3000元\t341226\n金刚狼3:殊死一战\t341227\n黄总\t341228\n五一北京旅游\t341229\n威泰克斯\t341230\n春羽\t341231\n轩辕黄帝\t341232\n陈伟\t341233\n7500U\t341234\n中山大学附属第六医院\t341235\nlug\t341236\n深圳地产\t341237\n李承乾\t341238\n#include&#160\t341239\n木乃伊3\t341240\n金陵石化\t341241\n枣庄市\t341242\n骐达天道图书馆\t341243\n吉利博\t341244\n证章\t341245\n飞雷龙\t341246\n清宫表\t341247\n我的烦恼\t341248\n分列\t341249\n跻身\t341250\n蛇毒\t341251\n三下乡\t341252\n口袋版\t341253\n水稻机\t341254\n甩干桶\t341255\n朱亚三毛\t341256\nIGES\t341257\n上弦月\t341258\n刷机\t341259\n陈素真\t341260\n现客\t341261\n嘉豪\t341262\n苏荣\t341263\n呼应\t341264\n刀剑神域夺命凶弹\t341265\n枪花\t341266\nxf\t341267\n海印\t341268\nMPOS\t341269\n51届\t341270\n8226\t341271\n首创股份\t341272\n曲洲\t341273\n放货\t341274\n陈炳聿\t341275\n双性恋\t341276\n医药费\t341277\n百花谷\t341278\n招行香港一卡通\t341279\n价税\t341280\n爱Q生活网\t341281\n中国建设教育协会\t341282\n1996\t341283\n鼎城政府网\t341284\nSym\t341285\n九龙江\t341286\n税则\t341287\nMobile\t341288\n援引\t341289\n智械\t341290\nMaintain\t341291\n单仁资讯集团\t341292\n160种\t341293\n陈铭\t341294\n信息系\t341295\n机关办公室\t341296\n7490\t341297\n二手烟\t341298\n金欧诺\t341299\n福利待遇\t341300\n下花园\t341301\n灭\t341302\nCrx4Chrome\t341303\n碎\t341304\n净宽\t341305\n水船\t341306\nillumina\t341307\n犯罪率\t341308\n被查重\t341309\nP1008\t341310\n琶洲网\t341311\n26%\t341312\n瓜子黄杨\t341313\n岐山\t3413
14\n投决\t341315\n广州电信\t341316\nQQ影像\t341317\n萧氏\t341318\n京沪铁路\t341319\n本田飞度\t341320\n榴弹\t341321\nbt种子搜索器\t341322\n白立新\t341323\nRapidIO\t341324\n缝线\t341325\n非工作日\t341326\n境外机\t341327\n遇到\t341328\n七里山塘\t341329\n维尔纳\t341330\n醋酸甲酯\t341331\n东湾\t341332\nuint16_t\t341333\nK19\t341334\n2014—2016年\t341335\n颇丰\t341336\n8天7晚\t341337\n三国群英传霸王\t341338\ntafel\t341339\n到天亮\t341340\n20160630\t341341\n区内\t341342\n配电\t341343\n预测\t341344\n苗床\t341345\n北张村\t341346\n勤花\t341347\nC8051F\t341348\npkill\t341349\n开卖\t341350\n红外摄像机\t341351\n乌苏\t341352\nConsideration\t341353\n长沙市小学\t341354\nLED人才网\t341355\nz370m\t341356\n福州市\t341357\nPrevious\t341358\n耕昇\t341359\nGTX1060\t341360\n以儆效尤\t341361\n我的小天地\t341362\n筹学\t341363\n一滴\t341364\n厦门实验中学\t341365\ngt640\t341366\nvisualbasic\t341367\n重庆市区\t341368\n聚氨酯保温板\t341369\n皇族馆\t341370\n长乔极地海洋公园\t341371\n3.1.2\t341372\nBDI\t341373\nnotices\t341374\nLOREAL\t341375\n随风逝去\t341376\n分羹\t341377\n芜湖火车站\t341378\n红兔\t341379\n除臭\t341380\n戏命师\t341381\n梅龙镇广场\t341382\nbenchmark\t341383\n环网\t341384\n卢旭庆\t341385\n点石成\t341386\npetg\t341387\n油印\t341388\n荣誉感\t341389\n储位\t341390\n南湖村\t341391\n采沙\t341392\n公办\t341393\n漫谈\t341394\n非常幸运\t341395\nMM\t341396\n双离合器\t341397\n韩都衣舍\t341398\n漏\t341399\n料带\t341400\nEPG\t341401\n绥德\t341402\n娜奥米·沃茨\t341403\n宫颈纳囊\t341404\npbf\t341405\n丢不掉\t341406\neCharts\t341407\n1700年\t341408\nCCK8\t341409\n私塾\t341410\n首曝\t341411\n31码\t341412\n南通航运职业技术学院\t341413\n常德市纪委\t341414\n下唇\t341415\n石棉网\t341416\nposs机\t341417\n坪石镇\t341418\n华东理工大学》\t341419\n快逸\t341420\n丙型肝炎\t341421\n老榆头\t341422\n李浩然\t341423\n诗书画\t341424\n范克里夫\t341425\n经济化\t341426\n芳环\t341427\n青运会\t341428\n女演员\t341429\n樱花劫\t341430\n酷我k歌\t341431\nARP攻击\t341432\n十八个\t341433\n391\t341434\nJIE\t341435\nWAMPServer\t341436\n雅居乐花园\t341437\n洛浦公园\t341438\n私有\t341439\nMega2560\t341440\nYAMAN\t341441\n徐秉龙\t341442\n编辑\t341443\nwerfault\t341444\n归墟\t341445\n马超群\t341446\n异程\t341447\nie9\t341448\nS9/S9+\t341449\n扩句\t341450\n小拇指\t341451\n吴良镛\t341452\n河南省民政厅\t341453\n清远市政府\t341454\ninfiniband\t341455\n龙潭水乡\t341456\n乘
除\t341457\n瑞宁\t341458\n原始森林\t341459\nandroid5.1\t341460\n红叶谷\t341461\n施南生\t341462\n拉风大本营\t341463\n数字音视工程网\t341464\nWin10教育版\t341465\n夏杉杉\t341466\n泄欲\t341467\n爱福利\t341468\nethercat\t341469\n祙子\t341470\n工贸有限公司\t341471\n增设\t341472\n全中\t341473\n用兵\t341474\n丙硫氧嘧啶片\t341475\n培养板\t341476\n経\t341477\n馈源\t341478\n高六\t341479\n罄\t341480\n棕榈酸酯分散片\t341481\n读头\t341482\n宫腔镜检查\t341483\n花台\t341484\n平湖网\t341485\nSam\t341486\n元氏县\t341487\n李兴华\t341488\n诺安\t341489\nCMA认证\t341490\n搜库\t341491\n渝北区\t341492\nonfocus\t341493\n华夏典当行\t341494\nmolly\t341495\n烧苗\t341496\n头颈外科\t341497\n2012年末\t341498\n纳兰奥拉星\t341499\ndbpath\t341500\n劳恩斯\t341501\n前奏曲\t341502\n郑荣\t341503\n中国科技馆\t341504\n贷款余额\t341505\n蚕子\t341506\nrint\t341507\n孙岚\t341508\n寥寥无几\t341509\n屁颠虫\t341510\n字典表\t341511\n梧叶\t341512\nD60\t341513\n上古史\t341514\n填充剂\t341515\n各题\t341516\n8.0.4\t341517\n防污染\t341518\n建设工程施工合同\t341519\n夜宵\t341520\n5000名\t341521\n西南财经大学研究生院\t341522\n100mw\t341523\nChuang\t341524\ndxy\t341525\n欢沁\t341526\nsteamvr\t341527\n逆势\t341528\n五方桥\t341529\n文本型\t341530\n郭士\t341531\n2622\t341532\n宇通重工\t341533\n龇牙\t341534\n一二三四五六七\t341535\nGo语言中文网\t341536\n种鬼\t341537\nIT\t341538\n亿人次\t341539\n王卉\t341540\nros软路由\t341541\n75条\t341542\n竟称\t341543\n导示\t341544\n谢军\t341545\nBlink\t341546\n名猫\t341547\n西韩\t341548\nmovielens\t341549\nwolford\t341550\n东莞镇\t341551\n達人\t341552\n决斗者\t341553\n易中贾斯汀比伯\t341554\n环境关系\t341555\n打夯\t341556\n500平方\t341557\n高鑫零售\t341558\n边塞诗\t341559\n绿色植物\t341560\ncfm\t341561\n践行\t341562\nPrime\t341563\n上江城\t341564\n蒙草生态\t341565\n超级QQ\t341566\n华3\t341567\n叶念琛\t341568\nC反应蛋白\t341569\nCAJ\t341570\n答我爱菜园网\t341571\n婺州\t341572\n优质男\t341573\n伊芙利特\t341574\n硬回车\t341575\n带笑\t341576\nzard\t341577\n套打\t341578\n任超\t341579\n二甲胺\t341580\n12\t341581\n12.2\t341582\n路亚竿\t341583\n季春\t341584\n张晓慧\t341585\n爽快\t341586\nBeauties\t341587\n6.5下\t341588\n大屯村\t341589\n全费\t341590\n英州\t341591\n季凤文\t341592\n南宁市\t341593\n瓦松\t341594\nM5A78L-M\t341595\n启东恒大\t341596\n第117期\t341597\n爱普生投影机\t341598\n幸福36计\t341599\n五年一贯制高职\t341600\nlindo\t341601\n御姐\t341602
\n河北省人社厅\t341603\nattended\t341604\n瑞银证券\t341605\n加泰罗尼亚\t341606\n黄山景区\t341607\n有机板\t341608\n黄芪粉\t341609\n698元\t341610\nvc2013\t341611\n2322\t341612\n左右逢源\t341613\n跑路\t341614\n金平糖\t341615\n仰慕\t341616\n唯尔\t341617\nmodo\t341618\n天顶\t341619\n广哈通信\t341620\n小米遥控器\t341621\n准备篇\t341622\n火影忍者究极风暴3\t341623\n大秦岭\t341624\n战神绕岛新航迹\t341625\n雪佛兰汽车\t341626\n您\t341627\n玛氏\t341628\n广角镜\t341629\n大揭底\t341630\n江西省民政厅\t341631\n无用功\t341632\n哥萨克\t341633\n荐书堂\t341634\n满清十三皇朝\t341635\n艾灸疗法\t341636\n淬炼\t341637\n艺境\t341638\n中国科学院青岛生物能源与过程研究所\t341639\n一个千万级\t341640\n泗门镇\t341641\nJuno\t341642\n127.0\t341643\n克吕格投影\t341644\n8.3.0\t341645\n159代\t341646\nAUPRES\t341647\n27届\t341648\n二奶村\t341649\n屁民\t341650\n中石油集团公司\t341651\n九处\t341652\n工程投影机\t341653\nhappening\t341654\nDow\t341655\n4000名\t341656\n彼此\t341657\n家庭教师波多野结衣野\t341658\nbad\t341659\nreshade\t341660\n魔高一丈\t341661\n61宝宝网\t341662\n7部\t341663\n姜毅\t341664\n陈店镇\t341665\n九十级\t341666\n凉白\t341667\n李叁木\t341668\n液晶电视大全\t341669\n苦难\t341670\n崔岩\t341671\n一初一\t341672\n白小飞\t341673\n上草\t341674\n大闽网\t341675\n红桥区政府\t341676\n耐张\t341677\n少景美\t341678\nJHU\t341679\n夜里\t341680\nSpherical\t341681\n青岛市市南区\t341682\n最美好\t341683\n恒生科技园\t341684\ntarticle\t341685\n钢制暖气片\t341686\n黑飞\t341687\nIxia\t341688\n正戊烷\t341689\n学生会宣传部\t341690\n机械工程学报\t341691\n梅子\t341692\ncompromised\t341693\n亲口\t341694\n浙江省注册会计师协会\t341695\n46寸\t341696\nxcodebuild\t341697\n擦挂\t341698\n无骨雨刮\t341699\n仲量联行\t341700\n学生餐\t341701\n中央政府采购网-电子化政府\t341702\n美国之音\t341703\n3000亿美元\t341704\n漏油器\t341705\n南京小区\t341706\n软链接\t341707\nGPS模块\t341708\n中置电机\t341709\n小青蛙\t341710\n隔夜\t341711\n周传基\t341712\n从现\t341713\n首班车\t341714\n除非\t341715\nKTX\t341716\n绿丝绦\t341717\n阿叶君\t341718\ndataimport\t341719\nmcx\t341720\n15000个\t341721\n货运物\t341722\n坦\t341723\n幸福村\t341724\nCIN\t341725\n远足者\t341726\nzhishi\t341727\n幽冥仙途\t341728\n杰德论坛_汽车之家论坛\t341729\n胶状物\t341730\n67\t341731\nupdate7\t341732\n华南新城\t341733\n沃霍尔\t341734\n熊猫加速器\t341735\n允悲\t341736\n旱溪\t341737\n回心\t341738\n初来\t341739\n对撞\t341740\n析构函数\t341741\n博乐达\t341742\nswt\t341743\n省
水\t341744\n方胖子\t341745\n48v\t341746\nwud\t341747\n市经信委\t341748\nOS-vivo\t341749\n柏涛\t341750\n金银版\t341751\n天猫淘宝\t341752\npharmacy\t341753\n月坨岛\t341754\n华东师大\t341755\n快用苹果助手\t341756\n蝴蝶兰花\t341757\n义不容辞\t341758\n南方人\t341759\n六神磊磊\t341760\n网格员\t341761\n赵东\t341762\n工作效率\t341763\n过天晴\t341764\n一职\t341765\n加拿大萨省\t341766\nクイ\t341767\n宝鸡市人民政府\t341768\n天通苑北\t341769\n0773\t341770\n腾讯游戏支付中心\t341771\n螺旋状\t341772\n派大星\t341773\njjj\t341774\n雷克重燃\t341775\n脆脆\t341776\n贺年片\t341777\n宣称\t341778\n中爱\t341779\n巴伦博伊姆\t341780\nsuperworks\t341781\n甩单\t341782\n领英网\t341783\n苍老\t341784\n新贴\t341785\n横撇\t341786\n一亿韩元\t341787\n州市\t341788\n型血\t341789\n只有你\t341790\n戀\t341791\nmeaning\t341792\n楼层\t341793\n没听过\t341794\nmaka\t341795\n上海市静安区人民法院\t341796\n博时基金\t341797\n周公解梦网\t341798\n管接头\t341799\n财经法规\t341800\n爱情公寓\t341801\n从古\t341802\n中海油服\t341803\n健康课\t341804\nLaunchpad\t341805\nWAMP\t341806\n众泰T500论坛_汽车之家论坛\t341807\n明目地黄丸\t341808\nclassmate\t341809\n601818\t341810\nservant\t341811\n加套\t341812\n白咲\t341813\n黔江区政府\t341814\n念念\t341815\ns12k\t341816\n杨月\t341817\n嫰\t341818\n马鞍山\t341819\n勇敢爱\t341820\n高惠兰\t341821\n音节\t341822\n整数集\t341823\n牛顿第一定律\t341824\nSG-A028室外排水管道安装工程检验批\t341825\n同庆\t341826\n慈母情深\t341827\nokay\t341828\nfetch\t341829\n退位\t341830\n尿性\t341831\n合掌\t341832\nKeyboard\t341833\nGareth\t341834\nEWS\t341835\n4160\t341836\n湘水湾\t341837\n武松打虎\t341838\n青海日报社\t341839\nYOUNG\t341840\n个群\t341841\n下降生\t341842\n怪物猎人世界灭尽龙\t341843\n青协\t341844\nyahaha\t341845\nobo\t341846\n健力宝\t341847\n万语\t341848\n麻辣俏佳妻\t341849\n高达破坏者3吧\t341850\n卡簧\t341851\n巨邦\t341852\n小额贷款利率\t341853\n脆皮烤猪\t341854\n安图恩\t341855\n马少骅\t341856\n逃废\t341857\nwetting\t341858\n栽培\t341859\n爱惜\t341860\n矩形管\t341861\n饮用水\t341862\n咸宁日报\t341863\n小柴胡颗粒\t341864\n充填\t341865\nSwift\t341866\n师范生\t341867\n享有\t341868\n何小鹏\t341869\n二日游线路\t341870\n林河\t341871\nam335x\t341872\n连二亚硫酸钠\t341873\n中国铁路北京局集团有限公司\t341874\n富地\t341875\n北市区\t341876\n企标\t341877\n配流\t341878\n青龙桥街道\t341879\n洪武路\t341880\n渲染师\t341881\nalkali\t341882\n贵妻\t341883\n黄灿盛\t341884\neterm\t341885\n钢琴曲\t34188
6\n中华人民共和国社会保障卡\t341887\n浙江海正药业股份有限公司\t341888\n0719\t341889\n第二十二次\t341890\nJi\t341891\n布幔\t341892\n天津动物园\t341893\n天河区政府\t341894\n华为荣耀7i\t341895\n7天5晚\t341896\n2007款\t341897\n学日\t341898\nODM\t341899\n青春水球社\t341900\n新手村\t341901\n图翼网\t341902\n矿口\t341903\n4余\t341904\n伤寒杂病论\t341905\n帕森斯设计学院\t341906\nemoi\t341907\n北京天安门\t341908\nnarray\t341909\n这样的我\t341910\nDynamoDB\t341911\n江苏共青团\t341912\n新疆金风科技股份有限公司\t341913\n职联\t341914\njjf\t341915\n55亿元\t341916\n龙鳞\t341917\n120多\t341918\n挺度\t341919\n曼丽\t341920\n亚甲炎\t341921\n间质性\t341922\n北京国\t341923\n江苏省人大常委会\t341924\n国家危险废物名录\t341925\nCCTV-10_央视网\t341926\n闷闷不乐\t341927\n盐酸伪麻黄碱\t341928\n庄羽\t341929\n孙工\t341930\n金刚釉\t341931\n历史的天空\t341932\n灰土挤密桩\t341933\n阳高县\t341934\ntexts\t341935\n唯达宁\t341936\n紫金\t341937\n煤气中毒\t341938\nRazor\t341939\n新端\t341940\n校音器\t341941\n假如生活欺骗了你\t341942\ntenure\t341943\n念不可说\t341944\n十几位\t341945\n赛伯乐\t341946\n溴化钙\t341947\n菠菜网\t341948\n超时空要塞\t341949\nOMEGA欧米茄\t341950\n罗贝托\t341951\n饮料机\t341952\n黯\t341953\n项目名\t341954\n微商界\t341955\n三碟\t341956\n城步县\t341957\n女娃娃\t341958\n挂片\t341959\n男奴\t341960\n関係\t341961\n残页\t341962\n9.1折\t341963\n昌兴\t341964\nR730\t341965\n星驰\t341966\n夜光\t341967\n六个\t341968\n南昌大学第二附属医院\t341969\n不可逆转\t341970\n全椒先锋网\t341971\n鱼篓\t341972\ncarton\t341973\n地球防卫军\t341974\n北大附中河南分校\t341975\najax返回值\t341976\n国槐树\t341977\n西奥多·罗斯福\t341978\n能够\t341979\n2016年5月\t341980\n爱博傻网\t341981\n区分\t341982\n眉头\t341983\n浪潮服务器\t341984\n普贤菩萨\t341985\n嘉兴市政府\t341986\n96000\t341987\n辽艺\t341988\n宋家\t341989\n5.5G\t341990\n9分米\t341991\n三峡游轮\t341992\n五道营胡同\t341993\n侦探剧\t341994\nens33\t341995\n39座\t341996\n京户\t341997\n蚌埠火车站\t341998\nBiomass\t341999\n一中\t342000\n梅迪尔\t342001\n控制论\t342002\n郭宁\t342003\n迅雷高清下载_影音先锋_先锋影院\t342004\n狼兵\t342005\nellipsize\t342006\n杨家寨\t342007\n塞爱维\t342008\n傳\t342009\n花事\t342010\n绘画\t342011\n海底捞火锅\t342012\nscrolltop\t342013\n小马网店转让网\t342014\nx3daudio1_7.dll\t342015\n小布什\t342016\n6月12日\t342017\nhfc\t342018\nSolder\t342019\n童程童美\t342020\n眼瞳\t342021\n中国保险信息技术管理有限责任公司\t342022\nBlocks\t342023\n植鞣皮\t342024\n羡羡\t342025\n
德州火车站\t342026\n没关机\t342027\n50430\t342028\nbost\t342029\n成都大学附属医院\t342030\n北区\t342031\n狗性\t342032\n厦门理工\t342033\n珠帘寨\t342034\nsqlload\t342035\n殷承宗\t342036\n4.2\t342037\nmp3MP3\t342038\n李纨\t342039\n沪深\t342040\n上柱\t342041\n弃养\t342042\nPhones\t342043\n优质板\t342044\n动一动\t342045\n王磊\t342046\n黑话\t342047\n球囊\t342048\n喂料\t342049\n九成宫醴泉铭\t342050\n福建奔驰\t342051\n月圆之夜\t342052\n无忧考吧\t342053\n人声鼎沸\t342054\nElectronics\t342055\n第125集\t342056\n越城区\t342057\n智汇\t342058\n九晚\t342059\n京广高铁\t342060\n泗泾\t342061\n李涵辰\t342062\n铁龙物流\t342063\n轮回决\t342064\n澳美\t342065\n耐药性\t342066\n向军\t342067\n澜庭\t342068\n文华东方酒店\t342069\n皇御苑\t342070\n王士祯\t342071\n搅拌车\t342072\n韩博\t342073\n再使用\t342074\n想睡\t342075\ngaussian\t342076\n肉苁蓉\t342077\n隔阂\t342078\n350亿元\t342079\n业态\t342080\nL02\t342081\n复班\t342082\n圣彼得大教堂\t342083\n上海娱乐\t342084\nwangzhi\t342085\n萍乡北站\t342086\n北京公寓\t342087\n曹政\t342088\nMatlab7.0\t342089\n幼儿期\t342090\n中国质量奖\t342091\n更早\t342092\n女贞子\t342093\n公共基础知识\t342094\n超合金\t342095\nfumu\t342096\n周筠\t342097\n醋酸氟\t342098\n超效\t342099\n对角化\t342100\n第124号\t342101\n安门\t342102\n┫\t342103\n横村\t342104\n税前工资\t342105\n婆家人\t342106\n小菜蛾\t342107\n兵装集团\t342108\n重影\t342109\n亦庄文化园\t342110\n妝品\t342111\n20140516\t342112\n努比亚Z18\t342113\n冰汽时代Frostpunk\t342114\n湟水河\t342115\nactiveX\t342116\n社区党委\t342117\n女性化\t342118\n埋没\t342119\n跪姿\t342120\n黑森\t342121\neRecovery\t342122\n六兆年\t342123\n晌\t342124\n盛世收藏网\t342125\n影视制作公司\t342126\n东营吧\t342127\n艾灸条\t342128\n苏州政府\t342129\n安邸AD\t342130\n空轨\t342131\n莱茵体育\t342132\n麒麟掌\t342133\nq195\t342134\n张永杰\t342135\n555555\t342136\n吴功宜\t342137\n破壁饮片\t342138\n马某\t342139\n秦桧\t342140\n金装律师\t342141\n微摩尔\t342142\n神武神兽\t342143\nchkconfig\t342144\n新概念英语第二册\t342145\nfrs\t342146\nScheduled\t342147\n徐州百姓网\t342148\n东湖磨山景区\t342149\n检表\t342150\n无毛\t342151\n燕郊经济技术开发区\t342152\n相切\t342153\n日本专门学校\t342154\n露乳\t342155\n罗蒂\t342156\n河南一村\t342157\n酷猴手游网\t342158\nprecedence\t342159\n快看漫画\t342160\nHamster\t342161\n保证责任\t342162\n宜家\t342163\n品牌溢价\t342164\n玛利\t342165\n大宋巴黎圣母院\t342166\n二级注册建筑师\t342167\n800倍\t342168\n策瑜\t3421
69\n永远永远\t342170\n花面\t342171\n负责\t342172\n中国艺术网\t342173\n马格努斯\t342174\n佐藤遥希\t342175\n会计证考试\t342176\n汽车北站\t342177\n调音与录音吧\t342178\ngithu\t342179\n读书日\t342180\n第四针\t342181\n减至\t342182\n益母草\t342183\ngfd\t342184\n范先生\t342185\n小区消防\t342186\n20151017\t342187\n字谱\t342188\nSADPTool\t342189\n张生\t342190\n北门路\t342191\n精神病学\t342192\n魁梧\t342193\n妖尾\t342194\n魔女嘉莉\t342195\n一文\t342196\n无忧无虑中学\t342197\n草王\t342198\n营养物\t342199\nCodeWarrior\t342200\n纹身男\t342201\n联社\t342202\n半夏厚朴汤\t342203\n麦卡伦\t342204\n刘晓红\t342205\n诚品\t342206\n农旅\t342207\nT8s\t342208\n小堂\t342209\nA6300\t342210\n杂交构树\t342211\n赵国\t342212\n非应税\t342213\n月石\t342214\nultrasun\t342215\nproliant\t342216\nTRANSACTIONS\t342217\n非亲属\t342218\n结构钢\t342219\n凯源腐文\t342220\n西座\t342221\nfci\t342222\n冠军杯\t342223\n铜陵日报\t342224\ntods\t342225\n悠悦\t342226\n泛舟\t342227\n泥炭\t342228\nhtpp\t342229\n任选\t342230\n百花争艳\t342231\n字幕版\t342232\n纳秒\t342233\n批批网\t342234\n讯玩版\t342235\nWeakAuras\t342236\n添田武人\t342237\nwritten\t342238\n三万元\t342239\n安养\t342240\n东极\t342241\n6082\t342242\n遮蔽\t342243\nDesign\t342244\nNOWRE\t342245\n涟源市政府\t342246\n史泰龙\t342247\n外口\t342248\n首检\t342249\n数据组\t342250\n手机保护套\t342251\n超电\t342252\nVundle\t342253\n加里宁格勒\t342254\nurl特殊字符转义\t342255\n糜子\t342256\n仙五\t342257\n军帽\t342258\n周例\t342259\n豪名\t342260\n日本邪恶漫画全集禁\t342261\n色质\t342262\n陕县\t342263\n广东省幼儿园\t342264\n麦考\t342265\n书集\t342266\n新英格兰\t342267\n茶博园\t342268\n沈海高速\t342269\n袁宏道\t342270\n施巴\t342271\n决斗\t342272\n兴农网\t342273\n纽恩泰\t342274\n落网\t342275\n宝山万达广场\t342276\n电吉他效果器\t342277\n林嘉欣\t342278\n欣欣\t342279\n自由之刃\t342280\n伪劣产品罪\t342281\n熊坦\t342282\n之江\t342283\n回看\t342284\n卫生监督\t342285\n人民代表大会及其常务委员会\t342286\nV2.1.3\t342287\n公方彬\t342288\n替身\t342289\nax^2+bx+c=0\t342290\n阿宇\t342291\n中国外运长航集团有限公司\t342292\n中兴通讯估值\t342293\n广告法\t342294\n时辰\t342295\nLeangoo\t342296\n补阳\t342297\n铜陵北站\t342298\nVIC\t342299\n免费期\t342300\n病毒抗体\t342301\n海康大华\t342302\n2658\t342303\n设立股份有限公司\t342304\n气象学家\t342305\n端口号\t342306\n苍色\t342307\n沙洋县\t342308\n真牛\t342309\n爱盲互联\t342310\n新浪湖北资讯_新浪湖北\t342311\n纳税信用A级纳税人\t342312\n什么式\t
342313\n兴业路\t342314\nh6\t342315\n百度导航\t342316\n车漆\t342317\n自花传粉\t342318\n钥匙机\t342319\nSEO\t342320\n张欣欣\t342321\n短信音\t342322\n华为畅享8plus\t342323\n找到\t342324\n盛世嫡妃\t342325\nIDFA\t342326\n搜狐体育\t342327\n2017-11-27\t342328\n南京幼儿园\t342329\nprotobuf3\t342330\n三菱PLC\t342331\n纹面\t342332\n争取\t342333\n辩友\t342334\n6千\t342335\n徽商网\t342336\n火上浇油\t342337\n绿瘦\t342338\n腐女\t342339\n电解质溶液\t342340\n大东区\t342341\n闪存卡\t342342\n新能源汽车行业\t342343\n恐怖症\t342344\nsupp\t342345\nbbc纪录片\t342346\n13万吨\t342347\n郑有美\t342348\n长沙市芙蓉区人民政府\t342349\nShaping\t342350\n国际劳工组织\t342351\ndatediff\t342352\n强板\t342353\n政区\t342354\n往何处去\t342355\n几两\t342356\n屋子\t342357\n龙珠游戏\t342358\nq9400\t342359\nsybr\t342360\ntouching\t342361\n九层塔\t342362\n库尔特\t342363\n航空运动\t342364\n银川能源学院\t342365\nAURORA\t342366\n先进村\t342367\n早诊\t342368\n助记符\t342369\nradio单选框\t342370\n陈工\t342371\nOOP\t342372\n咪唑斯汀缓释片\t342373\n三等分点\t342374\n十九项\t342375\n贝加莱\t342376\n顺求\t342377\n石原里美\t342378\n滤波\t342379\n仿真器\t342380\ninlay\t342381\n乡里\t342382\n中国教育国际交流协会\t342383\n二值化处理\t342384\nantibodies\t342385\nAqours\t342386\n风升\t342387\n敦豪\t342388\n奔驰GLC级\t342389\n淘遍\t342390\n习网\t342391\n净残值率\t342392\nmobile2\t342393\n握手\t342394\n一年以后\t342395\n孙犁\t342396\n版图书\t342397\n天下无敌\t342398\n妇幼卫生\t342399\nBED\t342400\n外唐\t342401\nBrambling\t342402\n雷蛇蝰蛇\t342403\n小钱包\t342404\n王臣\t342405\nsctp\t342406\n刘朝霞\t342407\n造物者\t342408\n县食药监局\t342409\n桂皮\t342410\n太幸福\t342411\n380级\t342412\nstuart\t342413\n20张\t342414\n万新\t342415\n间隔性\t342416\n周晔\t342417\nstary\t342418\n日式烤肉\t342419\n仿真卷\t342420\nramp\t342421\n客货\t342422\n申赎\t342423\n隐形文胸\t342424\n泡沫罐\t342425\n大数据\t342426\nweixi\t342427\n1时\t342428\n红点\t342429\n故人\t342430\n一日内\t342431\n聚焦\t342432\nLady\t342433\n其身\t342434\nV3.5\t342435\n单摆\t342436\n铛铛车\t342437\n阿勒\t342438\n1920x\t342439\n前卫村\t342440\n清凉寨\t342441\n首富\t342442\n同工酶\t342443\n蘑菇街\t342444\n北京市海淀区人力资源和社会保障局\t342445\n小米mix\t342446\n等比缩放\t342447\n香谱\t342448\n爱微网\t342449\naardio\t342450\n汉堡港\t342451\n公卫人\t342452\nEd\t342453\n质体\t342454\n退伍兵\t342455\n中国林业出版社\t342456\nweblogic
12c\t342457\n焦点科技\t342458\n车行道\t342459\nlearing\t342460\n山东农科\t342461\n黔南布依族苗族自治州\t342462\n1011\t342463\nfuyang\t342464\n发榜\t342465\n廊道\t342466\n乙地\t342467\n栉\t342468\n米哈尔\t342469\n通用\t342470\n大吉\t342471\n辍保学\t342472\n刮削器\t342473\n蓝鳍金枪鱼\t342474\n中国中冶\t342475\n起三落\t342476\nranging\t342477\n天高任鸟飞\t342478\n1151\t342479\n杭州绿城育华小学\t342480\n台州\t342481\nmxl\t342482\n尽兴\t342483\nvrc\t342484\n猪妖\t342485\n卜钰\t342486\n光辉国际\t342487\n模拟通信\t342488\n浮盈\t342489\n低战\t342490\n储藏\t342491\n话图\t342492\n莽撞\t342493\n靶心\t342494\n粤财\t342495\n新街镇\t342496\n应用版\t342497\n溜溜梅\t342498\n野鸡网\t342499\nntosk\t342500\n巴县\t342501\n北京华夏远洋科技有限公司\t342502\ndated\t342503\n预检\t342504\nAGE\t342505\n狼\t342506\nrtp\t342507\nLATCH\t342508\n七层\t342509\nreinforced\t342510\n1000首\t342511\n360安全路由器\t342512\n三体吧\t342513\n小丁\t342514\n语笑阑珊\t342515\n蔡昉\t342516\ngettext\t342517\nimproved\t342518\n中国金属学会\t342519\n沈阳人才网\t342520\n华灯初上\t342521\n五周\t342522\n绿园小区\t342523\nKernel\t342524\n偏颌\t342525\n薇安\t342526\nMQB\t342527\n易客CRM\t342528\n野战\t342529\n山东海事局\t342530\n缓冲罐\t342531\n舟山市人民政府\t342532\nslackware\t342533\ndiv1\t342534\n20170710\t342535\n经典红歌\t342536\n几尺\t342537\n双立人\t342538\n丁香客\t342539\n坑钱\t342540\n怀俄明州\t342541\n北京地铁3号线\t342542\n米技\t342543\n新县\t342544\n魔兽争霸III\t342545\n第三波\t342546\n枪神王座\t342547\n仰\t342548\n望洲\t342549\nBartlett\t342550\n增力\t342551\ncctv9\t342552\n趣趣网\t342553\n浓差\t342554\n环塔\t342555\n广东地市\t342556\npsi\t342557\n跨年\t342558\n土地出让收益\t342559\n房琪\t342560\n0x01\t342561\n康铃\t342562\n吐艳\t342563\n锦绣天地\t342564\n多档\t342565\nServe\t342566\n罗非鱼\t342567\n珠联璧合\t342568\ndwi\t342569\n图位\t342570\n冠子\t342571\n6只\t342572\n新约魔法禁书目录\t342573\n秦少霆\t342574\n内蒙古大学艺术学院\t342575\nBraun\t342576\nLUCKY\t342577\nYiwu\t342578\nDm\t342579\n艾柱\t342580\nGTID\t342581\nMongoDB分片\t342582\n金桶\t342583\n江苏天瑞仪器股份有限公司\t342584\n羽绒枕\t342585\n代市长\t342586\n丝韵\t342587\nProvisional\t342588\n情势\t342589\n萝卜\t342590\n中信建投证券股份有限公司\t342591\n风信\t342592\n120急救指挥中心\t342593\n柴黄颗粒\t342594\n满怀信心\t342595\n100米高\t342596\n浓郁\t342597\n情劫\t342598\n持有量\t342599\n小瓦\t3426
00\n东来紫微网\t342601\n悠白\t342602\n绝地求生刺激战场pc\t342603\n鸿图\t342604\nAnswers\t342605\n刺痛\t342606\nEAS\t342607\n华春\t342608\n北纳生物\t342609\n机模\t342610\n友金所\t342611\n江门高新区\t342612\n银河统计工作室\t342613\nAC1900\t342614\n不请\t342615\n攀枝\t342616\n联会\t342617\n香港入境处\t342618\n阵形\t342619\n大小乔\t342620\n鬼书\t342621\n传\t342622\n软件篇\t342623\n魔灵召唤符文\t342624\n天枢穴\t342625\n绿州\t342626\n副书记\t342627\n苯溴马隆片\t342628\n玛莉亚\t342629\n物距\t342630\n炫影\t342631\n跑车\t342632\n麦芽糊精\t342633\n池州市人民医院\t342634\n风盘\t342635\n曹家堡\t342636\n质料\t342637\n连拍\t342638\n11枚\t342639\n罪爱\t342640\n初识\t342641\n看淡\t342642\nopencv\t342643\n水式\t342644\nactivities\t342645\n婚谋\t342646\n侯门医女\t342647\n南京分公司\t342648\n非定向\t342649\nTop30\t342650\n乌村\t342651\n铝镁\t342652\n阿尔菲\t342653\n金盏\t342654\n对标学习\t342655\n宜宾\t342656\n爱无\t342657\ntactics\t342658\n托罗\t342659\n0212\t342660\n樊川\t342661\n扬州树人中学\t342662\n锚栓\t342663\nThreadripper\t342664\n座垫\t342665\nT400\t342666\n腾讯公司\t342667\nXY苹果助手\t342668\n最大化\t342669\nxav\t342670\n廷廷\t342671\n争夺\t342672\n乐视1S\t342673\n血袋\t342674\n奥本山\t342675\nfasttext\t342676\n师妹\t342677\n杜仲树\t342678\n9:16\t342679\n王朋\t342680\n乡村行\t342681\n汽车零配件\t342682\nBEAN\t342683\nie6.0\t342684\n罗尼库尔曼\t342685\n天义镇\t342686\n馏分\t342687\n大庄园\t342688\n陈义\t342689\n突袭4\t342690\n顺向\t342691\n川南\t342692\n翼\t342693\n只是\t342694\n星球大战8:最后的绝地武士\t342695\n全国爱卫会\t342696\nmagnetic_storm\t342697\n亭湖政府网\t342698\nvivian\t342699\n数度\t342700\nchief\t342701\n麦库\t342702\n薄弱环节\t342703\n采集卡\t342704\nün\t342705\n堵头\t342706\n0807\t342707\n钟祥\t342708\n2014-01-24\t342709\n人功能\t342710\n03版\t342711\n皮服\t342712\nHERE\t342713\n人文社会科学学院\t342714\n百货业\t342715\nstepwise\t342716\n泉立方\t342717\n层次\t342718\n带人\t342719\ntrace\t342720\nSMITE\t342721\n轮训班\t342722\n梅格\t342723\n固定化\t342724\n甘露醇\t342725\n爨\t342726\nguai\t342727\n600490\t342728\n鲜丰\t342729\n天水围的夜与雾\t342730\nBuzz\t342731\n火柴\t342732\n水灵\t342733\n凤凰山\t342734\n浙大附中\t342735\n第147期\t342736\n小水电\t342737\n东南侧\t342738\n慈母手\t342739\n猴戏\t342740\n引物\t342741\n牌牌\t342742\n鸿盛\t342743\n.cs\t342744\n李佳颖\t342745\nMirrors\t342746\nCAD文
件\t342747\n童眸\t342748\n第60关\t342749\n英朗论坛_汽车之家论坛\t342750\n阀类\t342751\n南土\t342752\n克隆人战争\t342753\n翻译类\t342754\n窗业\t342755\ndac\t342756\n引子\t342757\n贵阳恒大\t342758\n枯竭型\t342759\n兵地\t342760\n第31次\t342761\n240v\t342762\n开器\t342763\n充气娃娃之恋\t342764\n破骨\t342765\n聋校\t342766\n麦凯恩\t342767\n宫崎勤\t342768\n终极对决\t342769\n小橘子\t342770\n最初的梦想\t342771\n列中\t342772\n10微米\t342773\nsuperman\t342774\n15A\t342775\n新兴经济体\t342776\n名利\t342777\n暴漫\t342778\nAmbiguous\t342779\n金程\t342780\nWorkers\t342781\n羿\t342782\n斗帝\t342783\n峄城区\t342784\n白参\t342785\n张茜\t342786\n酷派s1\t342787\nlibrosa\t342788\n嗜杀\t342789\n资格赛\t342790\n380伏\t342791\n晒场\t342792\nplease\t342793\n爆炸式\t342794\n有毒\t342795\n和事\t342796\n六公里\t342797\n杨思琦\t342798\n桑尼\t342799\n谜城\t342800\nufrii\t342801\n选料\t342802\n漩涡式\t342803\n破净\t342804\n张琨\t342805\n日照间距系数\t342806\n出产\t342807\n北洼路\t342808\n炸锅版\t342809\n接地极\t342810\n鸡蛋干\t342811\ndpt-rp1\t342812\n乌龙指\t342813\n阿尔戈\t342814\n打弯\t342815\n秋期\t342816\n沣峪口\t342817\n镜中人\t342818\n五岳\t342819\n华为手机输入法\t342820\n直升丸子\t342821\n云飞扬\t342822\n有妇\t342823\n斗破苍穹2\t342824\n上海劳力士\t342825\ncompetitive\t342826\nhold不住\t342827\n乐天派\t342828\n库尔勒\t342829\n$http\t342830\n雅居乐剑桥郡\t342831\n标本兼治\t342832\n走错路\t342833\n名剑币\t342834\n书友\t342835\n动平衡\t342836\n4天3晚\t342837\njwb\t342838\n上刊\t342839\npgAdmin\t342840\n匪类\t342841\n云芨\t342842\n进油\t342843\n石灰土\t342844\n重庆地铁\t342845\n4s点\t342846\ntwm\t342847\n小栗\t342848\n快眼\t342849\nsecurt\t342850\n圣安地\t342851\n_世\t342852\n十四式\t342853\n百度坐标系\t342854\n干人\t342855\n鱼泡\t342856\n虎眼石\t342857\n对射\t342858\n资格\t342859\n几千\t342860\n观鸟\t342861\n双龙会\t342862\n抗利尿激素\t342863\n第一城\t342864\n重庆网\t342865\nabab\t342866\n丁浩\t342867\n撒布\t342868\nkeai\t342869\n演界网\t342870\n后溪\t342871\n恒企\t342872\n天柱山风景区\t342873\n嘶吼\t342874\n刘卓\t342875\nmorra\t342876\n曹德旺\t342877\nOSGI\t342878\n蕲蛇\t342879\nac3\t342880\n每次\t342881\n慎点\t342882\nuv光解\t342883\n桂枝香\t342884\nbacks\t342885\n魔晶猎人\t342886\n张元干\t342887\n白葡萄酒\t342888\nate\t342889\n广州金融控股集团有限公司\t342890\n主坛\t342891\n27条\t342892\n上海江城医院\t342893\n咨询有限公司\t342894\n悬挑\t342895\
n第73\t342896\nLuxe\t342897\nCD机\t342898\n梦幻西游手\t342899\nffd\t342900\nqiche.huangye88.com\t342901\n九套\t342902\nog\t342903\n冠军赛\t342904\n红狐\t342905\n上海市第七人民医院\t342906\n长沙市中医医院\t342907\n馈纸式\t342908\n张紧力\t342909\n灵盾\t342910\nORIENT\t342911\n滴露消毒液\t342912\nWiFi路由器\t342913\nvisitor\t342914\n髌\t342915\nmedium\t342916\n线轮\t342917\n催人泪下\t342918\n我的青春遇见你\t342919\nShowtime\t342920\n寄信\t342921\n有意识\t342922\n装备箱\t342923\n港大\t342924\n卡乐购\t342925\nexcellence\t342926\n聘任\t342927\n舆论场\t342928\n碑\t342929\n透光\t342930\n婕西\t342931\nSolaris\t342932\n123.net\t342933\nadvertising\t342934\n生资\t342935\n蘑\t342936\nluv\t342937\n曹灿杯\t342938\n票据池\t342939\n平安惠普\t342940\n康瑞\t342941\n国祯\t342942\nAustralia\t342943\n过活\t342944\nnpd\t342945\n/文\t342946\n迷域行者\t342947\nglucose\t342948\n西渡街道\t342949\ndlr\t342950\nsd高达g世纪世界\t342951\ny51a\t342952\n估算\t342953\n明眸\t342954\nMonster\t342955\n1200元\t342956\n张国栋\t342957\n久期\t342958\n瀛海镇\t342959\n双铁\t342960\n卓越教育集团\t342961\nssl证书\t342962\n四子王旗\t342963\n第12届\t342964\n家宝\t342965\n列女传\t342966\n重庆中学\t342967\n601238\t342968\n999999\t342969\n姜\t342970\n超能力者\t342971\n退信\t342972\n责任制\t342973\n韩敏\t342974\n中文台\t342975\nskii\t342976\n2018-03-2\t342977\n主宰者\t342978\n梦竹\t342979\nuln2803\t342980\n厂路\t342981\nwnag\t342982\n商委\t342983\ntelegram\t342984\n角度\t342985\n满庭春\t342986\nLIFESTYLE\t342987\n小指\t342988\n拍手歌\t342989\n萌娃\t342990\notdr\t342991\n轻奢主义\t342992\n中国少先队\t342993\n600929\t342994\n奥运\t342995\n杨海\t342996\n刘建明\t342997\n爱同罪\t342998\n择业期\t342999\n曹正淳\t343000\n云销\t343001\n姚记\t343002\n二连浩特政府网\t343003\nweidu\t343004\n布石\t343005\n资料\t343006\n章鱼加速器\t343007\n轻吻\t343008\n汽配城\t343009\n昭化区\t343010\n爱问频道\t343011\nSuites\t343012\n暗黑破坏神1\t343013\nqq三国\t343014\n几万元\t343015\n橘彩\t343016\n金天国际\t343017\nFab\t343018\n定心丸\t343019\n全局观\t343020\n丰益\t343021\n螟虫\t343022\nfran\t343023\n和信广场\t343024\n七仙岭\t343025\n直岛\t343026\n清炒虾仁\t343027\n王一飞\t343028\n火灵\t343029\nmodaldialog\t343030\n攀谈\t343031\n青火\t343032\n新闻类\t343033\n管理科学与工程类\t343034\n财运亨通\t343035\n19500\t343036\n安时\t343037\ndnw\t343038\nvms\t3
43039\n盐城市第三人民医院\t343040\n二手房按揭\t343041\n红旗小学\t343042\nTwin\t343043\n高中数学三角函数\t343044\n5a级景区\t343045\n机制币\t343046\nLBO\t343047\npred\t343048\n堤\t343049\n100亿元\t343050\n联想g450\t343051\n第三次世界大战\t343052\n12.1.3\t343053\n开婚\t343054\n玉兰灯\t343055\nUniregistry\t343056\n巫师3狂猎\t343057\n三泰虎\t343058\n捆缚者\t343059\n主析\t343060\n犯罪心理\t343061\n越南富国岛\t343062\n以纯\t343063\n证券从业资格证\t343064\n小辑\t343065\n横评\t343066\nSTV\t343067\n昭觉\t343068\n啦啦操联赛\t343069\n万贯五金机电网\t343070\nbd720p/mkv\t343071\n音乐风云榜\t343072\n8282\t343073\n玛雅人\t343074\n吉村\t343075\n动脉粥样硬化\t343076\nbootstrap-table\t343077\n催产\t343078\n猛毒\t343079\n男权\t343080\n岁月无情\t343081\n减字木兰花\t343082\n竟\t343083\n甘特图\t343084\n百度分析\t343085\n一概而论\t343086\nchicun\t343087\nOPM\t343088\n芙蓉营养师\t343089\n强极\t343090\n春云\t343091\n科沃兹\t343092\n启动机\t343093\nSINR\t343094\n熊天平\t343095\n急症\t343096\nWeb自动化测试\t343097\n规管\t343098\n声波式\t343099\nGvod\t343100\n福师大\t343101\n油压\t343102\n逃生管\t343103\n样量\t343104\n优宝\t343105\n上海人事考试网\t343106\n1-6月\t343107\n镰池\t343108\n吱道\t343109\n西门子\t343110\n有滋有味\t343111\n玲儿\t343112\nAOMG\t343113\n信阳息县\t343114\n高新西区\t343115\n态度\t343116\n萌宠族\t343117\nMule\t343118\n樟木箱\t343119\n低档\t343120\n夜夜战\t343121\n月亮湾花园\t343122\n闪红\t343123\njae\t343124\n陈枝辉\t343125\n街亭\t343126\n發表\t343127\n甲硝唑口颊片\t343128\niOS11\t343129\ncar\t343130\n慢性肠胃炎\t343131\nDMMオリジナルエロCG集\t343132\nCEC\t343133\npainting\t343134\n0214\t343135\n大志\t343136\n瘀片\t343137\nTornado\t343138\n60p\t343139\ntion\t343140\n600141\t343141\n微讯\t343142\n闭路监控\t343143\ndfm\t343144\n大主宰小说网\t343145\n变数\t343146\n湖北省十堰市\t343147\nGPSspg查询网\t343148\n宁明\t343149\n老炮\t343150\n后脚\t343151\n大通证券\t343152\n元彬\t343153\n泥鲸\t343154\n阎步克\t343155\n中工国际\t343156\n山沟\t343157\n同步控制器\t343158\n程彦\t343159\n小衰\t343160\n罗保铭\t343161\n李乐成\t343162\n小米6s\t343163\n实里\t343164\n抽沙泵\t343165\n浮粉\t343166\n微元素Element3ds\t343167\n大光头\t343168\n关税补贴\t343169\n翻译蛋\t343170\n刘河镇\t343171\n腧穴\t343172\n干扰者\t343173\n逮捕证\t343174\n老袁\t343175\n男人日\t343176\n重新定位\t343177\n死肉\t343178\n庭审\t343179\n高新园区\t343180\n黄麻布\t343181\n开迪\t343182\n养阳\t343183\n软
床\t343184\n铁鹰\t343185\n霍乐迪\t343186\n250万\t343187\n南内环街\t343188\n上海南桥\t343189\n非凡网\t343190\n封贴\t343191\n石家庄科技工程职业学院\t343192\n行庆\t343193\n鲁甸县\t343194\n摄影家\t343195\n再制\t343196\nStaining\t343197\nDLC石\t343198\n中外企业家\t343199\n人口普\t343200\n唐宫燕\t343201\n夏邑县\t343202\nCHERRY\t343203\nMicrowave\t343204\n悲催\t343205\n欧陆\t343206\n在夜里\t343207\n热忱\t343208\n派特\t343209\n牛奶布丁\t343210\n报\t343211\n死亡之旅\t343212\n察哈尔右翼前旗\t343213\n北京三环\t343214\n陈明\t343215\n熊召政\t343216\n左岭\t343217\n泰尔丝\t343218\n武侠小说网\t343219\nLVM\t343220\n酌\t343221\n藤原辽子\t343222\n并批量\t343223\n年代学\t343224\n金洲小学\t343225\n路摊\t343226\n微博台灣站\t343227\n农广天地\t343228\n04月28日\t343229\n公孙绿萼\t343230\n心瘾\t343231\n佯攻\t343232\n三两枝\t343233\nWav\t343234\nchurch\t343235\n孙琪\t343236\n蝉鸣\t343237\nsense6\t343238\n前五天\t343239\n到任\t343240\n丌\t343241\n咪哒\t343242\nODE\t343243\n两年后\t343244\n大头妹\t343245\n思琳\t343246\ntoJSONString\t343247\n相对主义\t343248\n眦\t343249\nlori\t343250\n感觉\t343251\nfreeplus\t343252\n衣帽间\t343253\n中心公园\t343254\n注册岩土工程师\t343255\n杨卓\t343256\n顾正文\t343257\nALC\t343258\ngaze\t343259\n荷赛奖\t343260\n韩都动力\t343261\njdk15.jar\t343262\ntne\t343263\nwdr5600\t343264\n我团旅游网\t343265\n伶\t343266\n833044\t343267\n定义处\t343268\nNull\t343269\n使命召唤14二战\t343270\niResearch\t343271\n风头\t343272\ndvd版\t343273\n上思县\t343274\nSynology_NAS云论坛\t343275\nBrown\t343276\n南非\t343277\n虚拟局域网\t343278\n恒安\t343279\n美国队长2\t343280\n玫瑰苑\t343281\n国际快递公司\t343282\n云河\t343283\n斗气车\t343284\n劳动合\t343285\n下拉树\t343286\n张若尘\t343287\n代沟\t343288\n显身手\t343289\n漳州市区\t343290\n正时皮带\t343291\n肩井穴\t343292\n妆台\t343293\n1990年\t343294\n虚拟语气\t343295\n怀念\t343296\njacob\t343297\n作家\t343298\n开襟\t343299\n定时播放器\t343300\n三月初七\t343301\n蒙古王\t343302\n德拉基\t343303\n序数效用论\t343304\n顷刻间\t343305\n套定\t343306\n江苏省科技厅\t343307\n术后\t343308\n9年\t343309\n高胰岛素血症\t343310\nGirlfriend\t343311\n沈荡镇\t343312\n辽妥\t343313\ncarbonyl\t343314\n1KG\t343315\n战犬\t343316\n1MP3_\t343317\n新活力\t343318\n东方LED网\t343319\nPBB\t343320\n秦英林\t343321\ngpfs\t343322\n550_\t343323\n1853\t343324\n四平路\t343325\n乘联会\t343326\n金庸群侠传X\t343327\nTrinity\t
343328\n赏菊\t343329\n小米Note/顶配_MIUI论坛\t343330\n锂电\t343331\n为人母\t343332\n泪痣\t343333\n捅破\t343334\nstm32f103\t343335\n朝阳市双塔区人民政府\t343336\n海尔空调遥控器\t343337\nHD600\t343338\n新东方厨师学校\t343339\n幼升小网\t343340\n三尾狐\t343341\n进退\t343342\n河北村\t343343\n胡迪\t343344\n君与彼女与彼女之恋\t343345\n盲板\t343346\n纳什\t343347\n叙事曲\t343348\n230TSI\t343349\n羽月希\t343350\n西安交大\t343351\n引起来\t343352\n天津市司法局\t343353\n光猫\t343354\n火山喷发\t343355\n西溪花园\t343356\n郑州航院\t343357\ncailiu\t343358\n职人\t343359\n巴乌\t343360\n凤城十路\t343361\n候诊\t343362\n马贝\t343363\n小顾\t343364\n谢安朔\t343365\n贯通\t343366\n星辰美文网\t343367\nbk0600\t343368\n花豹突击队\t343369\n统战部\t343370\n卡巴拉岛\t343371\n360杀毒\t343372\n改种\t343373\n亚洲万里通\t343374\n速冻库\t343375\n外情\t343376\n紫色\t343377\n康考迪亚大学\t343378\n配印\t343379\n国际博物馆\t343380\n前项\t343381\nff12\t343382\n似水年华\t343383\n广州市铁一中学\t343384\nAve\t343385\n郑红\t343386\n埃及镑\t343387\n说题\t343388\n五十多\t343389\n郭津彤\t343390\n陈文灯\t343391\n美图网\t343392\n胶皮\t343393\n新锋\t343394\n昆明国家高新技术产业开发区\t343395\n胭\t343396\n逆流成\t343397\n兴趣\t343398\n北京爱康国宾体检中心\t343399\n1972年\t343400\nHy\t343401\nBOOST\t343402\n南海区\t343403\n捷马\t343404\n名城\t343405\n水冷\t343406\ndiabetic\t343407\n红血丝\t343408\n池州市\t343409\n颅咽管瘤\t343410\n荒古十字架\t343411\n翡翠林\t343412\n中国共产主义青年团团旗\t343413\n聚惠\t343414\nrehearsal\t343415\n李梦颖\t343416\n中国图书网\t343417\n起亚瑞风s3\t343418\nlon\t343419\n王东马可波罗\t343420\n考古遗址公园\t343421\n千脑\t343422\n驾考论坛\t343423\nhypno\t343424\n戴银\t343425\n省政府\t343426\n中心性\t343427\n穿岩十九峰\t343428\n国务院证券监督管理机构\t343429\n冰雪女王3:火与冰\t343430\nMACD\t343431\n九五小说网\t343432\nV2.11\t343433\nLCS\t343434\n就是这样做\t343435\n强力机\t343436\n苏报\t343437\n槑\t343438\n指称\t343439\nsxe\t343440\n第47条\t343441\n多诺万\t343442\n紧握\t343443\n退一步\t343444\n寄生兽\t343445\n海氏海诺\t343446\nDX3\t343447\n越西\t343448\n杂食\t343449\n表里如一\t343450\n高蛋白食物\t343451\n物理学报\t343452\ndirection\t343453\n极子\t343454\n统货\t343455\n顺风\t343456\n朗乐福\t343457\n机油灯\t343458\n佩琦\t343459\n果考网\t343460\n阿拉斯加州\t343461\n旅游版\t343462\nDeX\t343463\n双鱼座\t343464\n乔石\t343465\nqPCR\t343466\n桃花族\t343467\n飞乐鸟工作室\t343468\naiqiyi\t343469\n解特\t343470\n字画\t343471\n
机器人学导论\t343472\n正荣集团\t343473\n私模\t343474\n三岁半\t343475\n崖\t343476\n登录器\t343477\n月丘兔\t343478\n天猫欢聚日\t343479\n上百亿\t343480\n交易型\t343481\n+&#160\t343482\n集分宝\t343483\nFiction\t343484\n曹力\t343485\n浙江水利水电学院\t343486\n污泥浓缩池\t343487\narray_key_exists\t343488\n城南\t343489\n阆苑\t343490\n塞尔玛\t343491\n爱文秘网\t343492\n节油\t343493\n许鼎\t343494\n一兰\t343495\nkinase\t343496\n日月大道\t343497\nConrad\t343498\n网侠安卓\t343499\nunderstood\t343500\n天伦之乐\t343501\n天竺鼠\t343502\nAUD\t343503\n幻想万华镜\t343504\n血盟\t343505\nv3.1.1\t343506\n298元\t343507\n暗字\t343508\n铁岭市\t343509\n蒋师傅\t343510\n压力桶\t343511\n中间页\t343512\nHD音频管理器\t343513\nXEON\t343514\n南京水务集团有限公司\t343515\n怪医杜立德\t343516\nstructs2\t343517\ncour\t343518\n形态\t343519\n90天内\t343520\n科科通\t343521\n回文\t343522\n王千源\t343523\n重庆师范大学\t343524\n天目山镇\t343525\n效果包\t343526\n肖申克的救赎吧\t343527\n西风烈\t343528\nVM12\t343529\n天维\t343530\n电火花\t343531\n襄阳房产超市网\t343532\nSisley\t343533\n战勇\t343534\n李娇\t343535\n简易\t343536\n数据员\t343537\nFAILURE\t343538\n口盖\t343539\n哈\t343540\n福顺\t343541\n孔雀苗\t343542\n龙工叉车\t343543\n除菌液\t343544\n紫菜包饭\t343545\n运政\t343546\nsarscape\t343547\n比弗利\t343548\n道奇都市奇门医圣\t343549\n人才市\t343550\n电子束\t343551\n2xy\t343552\n97%\t343553\n九边\t343554\n雨润城\t343555\n席蒿\t343556\n胡先生\t343557\nglide\t343558\n湖南路街道\t343559\n931路\t343560\n蛇酒\t343561\n世路\t343562\n闲散\t343563\nleep\t343564\n大宁县\t343565\n北医六院\t343566\n回环\t343567\nair13pro\t343568\n病毒文件\t343569\n名寺\t343570\ngpro\t343571\n刘未鹏\t343572\n枪决\t343573\nPermissions\t343574\nphotoview\t343575\nShutdown\t343576\nsoldering\t343577\n当选中\t343578\nucla\t343579\ngpuimage\t343580\n天津经济技术开发区\t343581\n名里\t343582\n陕西省交通运输厅\t343583\n马景涛\t343584\n商途\t343585\n红玫瑰与白玫瑰\t343586\nIntegrated\t343587\n胞浆\t343588\n先验\t343589\n有谱\t343590\n常州网络公司\t343591\n广东南粤银行\t343592\n教主任\t343593\nIncubator\t343594\n地纬商机网\t343595\n后知\t343596\n签名照\t343597\ntok\t343598\n唐能通\t343599\n北京冬奥会\t343600\n矬子\t343601\n卡管\t343602\n发起式\t343603\n组织级\t343604\n小丝\t343605\n600585\t343606\n异趣\t343607\n申批\t343608\n丹阳镇\t343609\n车轮驾考通\t343610\n好者\t343611\nMother\t343612\n阿威\t34361
3\n说一句\t343614\nugs\t343615\n上医\t343616\n中面\t343617\n银柳\t343618\n12mm\t343619\n华人区\t343620\n尹仲\t343621\n久坐族\t343622\n白鹿温泉\t343623\n守规矩\t343624\n大专生\t343625\n_美团\t343626\n樟木\t343627\n虚拟股\t343628\n商院\t343629\n习练\t343630\n爱又米\t343631\n中国网教育|中国网\t343632\n圣斗士冥王篇\t343633\n剌\t343634\n果园\t343635\n输入框组件\t343636\n蓝战非\t343637\n控偶师\t343638\n断绳\t343639\nMACD金叉\t343640\nvtx\t343641\n飘飘然\t343642\nfer\t343643\n万艾\t343644\n贵州省商务厅\t343645\n二环路北\t343646\n国家4A级景区\t343647\n祖国\t343648\n理事国\t343649\n磷化氢\t343650\n米宅郑州站\t343651\n雨烟\t343652\n入秋\t343653\n算分\t343654\n第一时间房源网\t343655\n仇杀\t343656\n2型糖尿病\t343657\n签文\t343658\n自考本科英语\t343659\n2032年\t343660\n美刻\t343661\n福克纳\t343662\n像素画\t343663\n过度\t343664\n管道离心泵\t343665\n李祥祥\t343666\n英國\t343667\n文化路\t343668\n品茗\t343669\n跨天\t343670\n阿雪\t343671\n赛道\t343672\n份子钱\t343673\nDESTINY\t343674\n成人教育学院\t343675\n海米傻傻\t343676\n标值\t343677\n装饰性\t343678\n影视界\t343679\n若月\t343680\n不要再来\t343681\nLap\t343682\n变相\t343683\n免安装绿色版\t343684\n纳\t343685\n116万\t343686\n赵记\t343687\n第225集\t343688\n亚的斯亚贝巴\t343689\n吴毅\t343690\n搞不懂\t343691\n10大\t343692\ncnc\t343693\n盖世\t343694\nsonarqube\t343695\n故诗\t343696\n揶揄\t343697\n南京理工\t343698\n神启\t343699\n中国银保监会\t343700\n可乐鸡翅\t343701\n金亿\t343702\nOMRON欧姆龙\t343703\ncurrencies\t343704\n张浩瀚\t343705\n中金\t343706\n主族元素\t343707\n桐城\t343708\n优惠期\t343709\n执纪\t343710\n地毯\t343711\n白袍\t343712\n超级新闻场\t343713\n婚典\t343714\n大艺展\t343715\n四千金\t343716\n这些年\t343717\n壮观\t343718\n湖北省食药监局\t343719\n昊华能源\t343720\n波女\t343721\nkua\t343722\n德普\t343723\n几千米\t343724\n许多人\t343725\n2017.07\t343726\n鼓音\t343727\n大板镇\t343728\n高东辉\t343729\n拉各斯\t343730\n2rx8\t343731\n8K\t343732\n中国农业大学\t343733\n底沙\t343734\n鼠大王\t343735\n测命\t343736\n航船\t343737\n广营\t343738\n光是\t343739\n百变大咖秀\t343740\n宗师\t343741\n摄影馆\t343742\n苏颂\t343743\n可配置\t343744\n西门子博途软件\t343745\n汇联\t343746\n三平\t343747\n法律上\t343748\nMongoDB学习笔记\t343749\n水晶糖\t343750\n硬化\t343751\nAvoiding\t343752\n斜截式\t343753\n王晗\t343754\n新州镇\t343755\nmeyd\t343756\n成长\t343757\n登山机\t343758\nAdWords\t343759\n眼目\t343760\n伊利克斯\t343761\n兵舞\t343762\n圣
亚\t343763\n秘诀\t343764\n控辍\t343765\n4.14\t343766\nKAKA\t343767\n尘埃落定\t343768\n抒发\t343769\nEND\t343770\n指界\t343771\n四怀论坛\t343772\n运营师\t343773\n杨_\t343774\n84期\t343775\n快逃\t343776\n远眺\t343777\nYOUTH\t343778\n签字表\t343779\nBoo\t343780\n马凡\t343781\nVNC远程控制软件\t343782\n四线\t343783\n霍启刚\t343784\n黄崖关长城\t343785\n锡都\t343786\nCont\t343787\n梨树县\t343788\n刀片机\t343789\n腊梅花\t343790\n山东大学图书馆\t343791\n咋么样\t343792\n4180\t343793\n狗十三\t343794\n直立\t343795\n沙尔克04\t343796\n岳阳经济技术开发区\t343797\n奇袭\t343798\n0天\t343799\n未接来电\t343800\n900万\t343801\n版成\t343802\n欧洲大学\t343803\n蒜蓉虾\t343804\n58食品网\t343805\nuln2003\t343806\n斜面\t343807\n舍己为人\t343808\n复合句\t343809\n罗斯托\t343810\n华擎Z370\t343811\nQDII基金\t343812\n香品\t343813\n铁性\t343814\n直角坐标系xOy\t343815\n重庆建工\t343816\n气动隔膜泵\t343817\n三分屏\t343818\nSIT\t343819\n怅\t343820\nSS5\t343821\n针剂\t343822\n500种\t343823\n阵仗\t343824\nIFFT\t343825\n君主专制\t343826\n黄灿然\t343827\n定睛\t343828\n大白\t343829\nPhalApi\t343830\n3米\t343831\n田涛\t343832\nWorkday\t343833\n东湖村\t343834\n沿河土家族自治县\t343835\n2.mp4\t343836\n斗罗大陆第1季\t343837\n锦心\t343838\ncaffenet\t343839\n掳爱\t343840\nSuse\t343841\n328号\t343842\n摇摇晃晃的人间\t343843\n福建代表团\t343844\n草油\t343845\nmanganese\t343846\n宏道\t343847\n直言不讳\t343848\n6-7月\t343849\n深圳联交所\t343850\n雄州街道\t343851\n妻\t343852\n云存档\t343853\n中国邮箱网\t343854\nM128fw\t343855\n诊间\t343856\n5间\t343857\n叉烧肉\t343858\n沁色\t343859\n深圳市创新投资集团有限公司\t343860\n五集\t343861\nliquids\t343862\n梦呓\t343863\n69魔方寸\t343864\n六合路\t343865\nipvsadm\t343866\n守护兽\t343867\n170429\t343868\n排名次\t343869\n新上海滩\t343870\n微元素\t343871\n客运中心站\t343872\nlower\t343873\n广西工业职业技术学院\t343874\n京承高速\t343875\n陈太丘\t343876\n拳击赛\t343877\n为什么不会\t343878\nquang\t343879\n剑网三藏剑\t343880\n青玄\t343881\nusim\t343882\n各自为战\t343883\n第60期\t343884\nFluorescence\t343885\n铁芯\t343886\nppc\t343887\n8.98\t343888\nXX有限公司\t343889\nElecard\t343890\nsmth\t343891\n正浓\t343892\n定向流量包\t343893\n20160720\t343894\n重庆市通信管理局\t343895\n太行山路\t343896\nmicr\t343897\n编发\t343898\nBN\t343899\n减负\t343900\n华为WATCH\t343901\n神探夏洛克\t343902\n西充\t343903\n聚信\t343904\nspine\t343905\
n聚隆\t343906\nnixon\t343907\nWelch\t343908\n康托尔\t343909\nimageview\t343910\n安徽省高级人民法院\t343911\n夺金\t343912\n电闸箱\t343913\n上上下下\t343914\n王朝的女人\t343915\n汽车泵\t343916\n1046\t343917\ndonot\t343918\n渝万高铁\t343919\n鸣蝉\t343920\n星语星愿\t343921\n10.2.0.5\t343922\n小史\t343923\n西城区西\t343924\n女魃\t343925\n毕业晚会\t343926\n布冯\t343927\n超流\t343928\n南阳站\t343929\n小儿推拿\t343930\n应悔\t343931\nvox\t343932\n下山虎\t343933\nVisualSVN\t343934\n暗纹\t343935\nnext\t343936\n48000\t343937\n普高中考\t343938\nperforce\t343939\n馄饨汤\t343940\n512G\t343941\n浅探\t343942\n柱顶\t343943\nJaguar\t343944\niplay\t343945\nzhuang\t343946\n化工学院\t343947\n0.38mm\t343948\n姓赵\t343949\n馆员\t343950\n易迪拓\t343951\n55张\t343952\n龙板\t343953\n人境\t343954\n非住宅\t343955\n新编阿拉伯语\t343956\n35号\t343957\nmg2400\t343958\n顾军\t343959\n趋严\t343960\nETP\t343961\nkawd\t343962\n散金\t343963\n黑湖\t343964\n古人\t343965\n刘雅鸣\t343966\n蓝牛\t343967\nblaze\t343968\n未成人\t343969\nLuc\t343970\nwindow.open\t343971\n啦啦操\t343972\nL360\t343973\nleoking01\t343974\ndram\t343975\n38cm\t343976\n车单\t343977\nprogramming\t343978\nOX\t343979\nLPD\t343980\n300平\t343981\n非公立\t343982\nAndroid横竖屏\t343983\n箱板\t343984\n资金净流入\t343985\ntrains\t343986\n海信集团\t343987\n百分之100\t343988\ncbf\t343989\n逐题\t343990\n芬\t343991\nprebuilt\t343992\nTammy\t343993\n完整度\t343994\n广州市中医院\t343995\n合肥路\t343996\n豊\t343997\nhana\t343998\n五子棋雾化器\t343999\n哩\t344000\n布洛芬缓释胶囊\t344001\n锚头\t344002\n近况\t344003\n中国科学院植物研究所\t344004\n生命观\t344005\n张丽霞\t344006\n中国特色社会主义\t344007\n测试部\t344008\nAV接口\t344009\n乙二酸\t344010\n后跟\t344011\n凌哥\t344012\n美图手机2\t344013\n全S\t344014\n反计\t344015\n弯头\t344016\n魔塔\t344017\nM1卡\t344018\n子分部\t344019\n鲸鲨\t344020\n哈尔斯\t344021\n人教版八年级道德与法治\t344022\n毛驴\t344023\n裴艳玲\t344024\n9.6%\t344025\n井岗山\t344026\n增仓\t344027\n随即\t344028\n正科\t344029\n淘宝天\t344030\n编外人员\t344031\n吉安火车站\t344032\n后备份\t344033\n保法\t344034\n蓝\t344035\n藤村\t344036\n两元\t344037\n玻璃棉卷毡\t344038\nTim\t344039\n苏素\t344040\n有余而补\t344041\n大众生\t344042\n将臣\t344043\n广州移动\t344044\n红旗村\t344045\n负重前行\t344046\n壮汉\t344047\n小E\t344048\n蛰伏\t344049\n做宣传\t344050\n中国科学院大气
物理研究所\t344051\n点图\t344052\nminigui\t344053\n华奥星空网\t344054\n追球\t344055\n平安平安福\t344056\n普邦股份\t344057\n鄞江镇\t344058\n被害案\t344059\n检校\t344060\n不翼而飞\t344061\n沙漠风\t344062\n苗木网\t344063\n腾笼\t344064\n松达\t344065\n换服\t344066\nあ\t344067\n呀\t344068\n操作版\t344069\nsurvivors\t344070\n对流式\t344071\n北京师范大学实验小学\t344072\n比利亚雷亚尔\t344073\n魔物园\t344074\n王佐镇\t344075\n滚动新闻_中国政府网\t344076\n报警灯\t344077\n幽光\t344078\n领头人\t344079\nDOMMouse\t344080\n控制工程网\t344081\nndsl吧_\t344082\n碎银\t344083\n一品威客\t344084\n控辍保学\t344085\n节日快乐\t344086\n2018-01-15\t344087\n五桂山\t344088\n中国市县招商网\t344089\n%2\t344090\n泪水\t344091\n陆空\t344092\nFHM\t344093\n抓阄\t344094\nFour\t344095\n核黄素\t344096\n葡萄糖苷酶\t344097\ne31230v2\t344098\n免单券\t344099\n笃信\t344100\n小彬\t344101\n夫婿\t344102\n随机值\t344103\n婚纱影楼\t344104\n英气\t344105\n魔法战争\t344106\n岁末\t344107\n105张\t344108\n影剧院\t344109\neeagd\t344110\n明月心\t344111\ndoujin\t344112\n荆州日报网\t344113\n含笑花\t344114\n人大代表大会\t344115\n国际博览会\t344116\n光膜\t344117\n为生\t344118\n信实\t344119\n节能率\t344120\n道君官居\t344121\n气音\t344122\n无耻混蛋\t344123\n秃瓢\t344124\nwinrar\t344125\n启皓大厦\t344126\nsoraru\t344127\n中国会议产业网\t344128\n朝仓\t344129\n捂脸\t344130\nOPTICS\t344131\n窗套\t344132\ncooked\t344133\n梦师\t344134\n八达岭奥莱\t344135\n济南野生动物世界\t344136\n连冠\t344137\nMOTUL\t344138\n夺冠之路\t344139\n传祺gs\t344140\n版单\t344141\n逻辑学导论\t344142\n第139集\t344143\n授奖\t344144\n柳智慧\t344145\n冰粒\t344146\n长汀古城\t344147\n扎实\t344148\n58秒\t344149\n缔\t344150\n蒸汽洗车机\t344151\n钩织\t344152\nfdp\t344153\n险阻\t344154\nBookshelf\t344155\n跑马机\t344156\n中江县人民政府\t344157\n镇北堡影视城\t344158\n傲慢全面战争\t344159\n老厨\t344160\n王大夫\t344161\n负载转矩\t344162\n荟聚\t344163\n陈晓旭\t344164\nSEXY\t344165\n昌江县\t344166\n育儿百问\t344167\nMayo\t344168\n江苏省审计厅\t344169\n县教体局\t344170\nsin^2\t344171\n密事\t344172\n删\t344173\n医仙\t344174\n暨南大学华文学院\t344175\n为卿狂\t344176\n小本我的世界\t344177\n入川\t344178\n思维题\t344179\n鼎信诺审计软件\t344180\n郭珍霓\t344181\n鱿鱼丝\t344182\n南泰\t344183\nsbd\t344184\n3000亿元\t344185\ncracking\t344186\n副图\t344187\nline-height\t344188\n008\t344189\n骑马与砍杀2\t344190\n奶香\t344191\ngrammy\t344192\n刺参\t344193\nInformed\t34
4194\n谢苗\t344195\nAngie\t344196\n柳东新区\t344197\n纳税人识别号\t344198\n孙玮\t344199\n利兹联\t344200\n题为\t344201\nopenSSL\t344202\ns7e\t344203\nfollowme\t344204\n拔尖\t344205\n曼德拉\t344206\nDenon\t344207\n新三国\t344208\n古早\t344209\n继电器\t344210\nMarriott\t344211\ninstructions\t344212\n平湖人\t344213\n佐敦\t344214\n联连普金\t344215\n步兵片\t344216\n3金\t344217\ninstrument\t344218\n正态化\t344219\n市尺\t344220\n做学\t344221\n相辅\t344222\n绿海家园\t344223\n休谟\t344224\n匝间\t344225\n定论\t344226\n环氧基\t344227\n踢爆\t344228\nhby\t344229\n李炳南\t344230\nExtract\t344231\n力气\t344232\n同丰路\t344233\n救心菜\t344234\nMSTS\t344235\n2.5T\t344236\n体育类\t344237\nGears\t344238\n树荫\t344239\n统配\t344240\n广东省外语艺术职业学院\t344241\n无政府主义\t344242\n140亿美元\t344243\n一js\t344244\n8招\t344245\n十八路\t344246\ntany\t344247\n爱神\t344248\n强颜\t344249\n中国烟台政府\t344250\n烷基\t344251\n云末\t344252\n觀\t344253\n中国大众\t344254\n咒轮\t344255\n一角\t344256\n广告类\t344257\n补拙\t344258\n本女\t344259\n生物制品\t344260\n少年三国志暗金\t344261\n美体小铺\t344262\nshells\t344263\n宜城街道\t344264\n深圳国旅\t344265\n保险网\t344266\n肖燕\t344267\n沐兰\t344268\n万贵妃\t344269\n定级\t344270\n灵粮堂\t344271\nRandy\t344272\n气动打磨机\t344273\n类癌\t344274\n荣耀cdkey\t344275\n海雅缤纷城\t344276\n长油\t344277\n一个尺\t344278\n吴湖帆\t344279\n核桃树\t344280\n晗\t344281\n西边\t344282\nWAGO\t344283\n射手村\t344284\n3.2.5\t344285\n张集\t344286\n20180202\t344287\n桔子\t344288\nFilorga\t344289\n俊哥\t344290\n牙狼\t344291\nrisen\t344292\n梦工厂\t344293\n武邑中学\t344294\nm403d\t344295\n很好看\t344296\n舒克贝塔历险记\t344297\nu1\t344298\n一天游\t344299\n绿盟科技\t344300\n府上\t344301\n太古广场\t344302\n石磨\t344303\ndrip\t344304\n518路\t344305\n内存能\t344306\n建业壹号\t344307\nchemkin\t344308\n一路\t344309\n马礼堂\t344310\n2018年2月24日\t344311\n俄剧\t344312\nCorp\t344313\n18s\t344314\nautoart\t344315\n短声\t344316\n数码港\t344317\norgin\t344318\n奥林匹斯十二主神\t344319\n沉银\t344320\njavaswing\t344321\n徐源\t344322\n新郑市\t344323\n荘\t344324\n日均\t344325\n瑞都\t344326\njiaxing\t344327\n自作自受\t344328\n连击率\t344329\nip138邮编大全\t344330\nMEYD\t344331\n木纹石\t344332\n大连酒店\t344333\n用药\t344334\n丰台花乡\t344335\n从众\t344336\nappstore\t344337\ncalculus\t344338\npaged\t344
339\n图片贴\t344340\n尊神\t344341\n色男\t344342\n文广\t344343\nSoomal\t344344\nG25\t344345\njxywxy\t344346\n楚留香网易\t344347\nMacklemore\t344348\n跨入\t344349\n岭南股份\t344350\n曾孙\t344351\n防守类\t344352\nesp8266\t344353\nVi编辑器\t344354\n天山雪莲\t344355\n阿尔法罗密欧\t344356\nresearch\t344357\nEdrawSoft\t344358\n中华人民共和国房产税暂行条例\t344359\n大一个\t344360\n人民日報\t344361\n上海世博会\t344362\n飞虎之潜行极战主题曲\t344363\n2017年5月20日\t344364\n宫胁咲良\t344365\nfeel\t344366\n背投\t344367\n邀请券\t344368\npnas\t344369\nXilisoft\t344370\nチンポ\t344371\n倩女幽魂手\t344372\n交尾\t344373\n头指针\t344374\nAstah\t344375\nBWL\t344376\n移动学习\t344377\n王谦\t344378\n27寸\t344379\n企业管理部\t344380\n六屏\t344381\n冬泳\t344382\n灌区\t344383\n吉林省司法厅\t344384\n广州招考网\t344385\n辛口镇\t344386\nNUK\t344387\nCASS—地信网\t344388\n刘源\t344389\n拉舍尔\t344390\n广东电力\t344391\n雷\t344392\n岩心\t344393\nop07\t344394\n伊卡\t344395\nb860\t344396\n真诀\t344397\n梦想世界3D\t344398\n口琴\t344399\ncl1024\t344400\n小蚁4k\t344401\n近义词\t344402\n石油学报\t344403\n中国药品电子监管码\t344404\n扬州晚报\t344405\n本命\t344406\n电精灵\t344407\n刑讯逼供\t344408\niso文件\t344409\n编年体\t344410\n忍界\t344411\n交通管\t344412\n王府花园\t344413\n2016年7月20日\t344414\n巴马瑶族自治县\t344415\ncharm\t344416\n溢乳\t344417\n页符\t344418\n陕西共青团\t344419\n数种\t344420\n新疆克拉玛依\t344421\nProLiant\t344422\n太鼓达人\t344423\n五台山风景名胜区\t344424\nxcode9\t344425\n法人一证通\t344426\nji\t344427\n张凌云\t344428\n盗取\t344429\n国家知识产权局\t344430\n抗日民族统一战线\t344431\n_抑郁症\t344432\n可导致\t344433\n迷途\t344434\nPSCC2015\t344435\n瑞成\t344436\n落木萧萧\t344437\n4.0.0\t344438\n曦澄\t344439\n玉山镇\t344440\njython\t344441\n车检\t344442\n中国游泳队\t344443\n大旱\t344444\n2018年劳动节\t344445\n猎豹免费WiFi\t344446\n新北师大版小学\t344447\n嘉化能源\t344448\n强智\t344449\n登攀\t344450\n监狱风云2\t344451\n洗澡间\t344452\n重生者\t344453\n东日\t344454\n小水\t344455\n衣剂\t344456\n酌油\t344457\n地址池\t344458\n苹婆\t344459\nChteau\t344460\n易洋千玺\t344461\n工况\t344462\n鲁伯斯\t344463\nBradley\t344464\nadaptor\t344465\n华迪\t344466\n2017年9月3日\t344467\n预配\t344468\n朗姆酒\t344469\n新园区\t344470\n白浪河\t344471\n寂寞先生\t344472\n槽值\t344473\nemi\t344474\ngt72\t344475\n暁\t344476\n热熔机\t344477\n_吉他吧_\t344478\n朱氏\t344479\n祝延平\t344480
\nmld\t344481\n万平方\t344482\n维修机\t344483\n宾\t344484\n研究班\t344485\n增产\t344486\n松川\t344487\n禁牧\t344488\n福建省建设执业资格注册管理中心\t344489\n‖\t344490\n丝兰\t344491\nCycling\t344492\n济源\t344493\nappreciated\t344494\n57层\t344495\nSFP\t344496\nP300\t344497\nsprint\t344498\nworkman\t344499\n4503\t344500\n虎\t344501\n地图集\t344502\n刘永好\t344503\n星巴克咖啡店\t344504\n成志\t344505\n商照\t344506\nvshpere\t344507\ngoaccess\t344508\n教学反思_相关\t344509\n杭海\t344510\n国家认监委\t344511\n蔡益琳\t344512\n六年级科学下册\t344513\n試\t344514\n滇南\t344515\n2695\t344516\n钟山区\t344517\n17500\t344518\nEvolving\t344519\n龙泣\t344520\n王衡\t344521\n石头剪子布\t344522\n数组变量\t344523\n百思特\t344524\n5针\t344525\n溶脂\t344526\n怒打\t344527\n佛山地铁3号线\t344528\n显式\t344529\n17吨\t344530\n涟源\t344531\n川椒\t344532\n第七页\t344533\n西语学习网\t344534\n李明明\t344535\n女性健康网\t344536\n喷枪\t344537\n文学性\t344538\n刘永灼\t344539\n电子商务师\t344540\n第163期\t344541\n拨叉\t344542\nATH-MSR7\t344543\n不限行\t344544\nRaster\t344545\n易玩通网\t344546\ni5-4460\t344547\n6204\t344548\nHd\t344549\n忆读书\t344550\n新化月报网\t344551\n市行政审批局\t344552\n强化\t344553\nLimbo\t344554\n井门\t344555\n哈雷肥仔\t344556\n甲磺酸阿帕替尼片\t344557\n业务量\t344558\n医药股\t344559\n桃隐\t344560\n真菌药\t344561\n红星照耀中国\t344562\n代充\t344563\n海词词典\t344564\n乌兰夫\t344565\n桃园县\t344566\n晓港名城\t344567\n3立方米\t344568\n影音先锋av撸色\t344569\n樱花草\t344570\n锤头鲨\t344571\n思铂指南者\t344572\n长角\t344573\n鱼汤\t344574\n一污\t344575\n2014年上半年\t344576\n广东省小学\t344577\n黑崎\t344578\n东城小学\t344579\n冈山县\t344580\n玉麒麟\t344581\nksh\t344582\n周世勋\t344583\n场界\t344584\n过渡态\t344585\n建伟\t344586\nv4.3.0\t344587\n建材城\t344588\n苏州工业园区\t344589\n军姿\t344590\n遭罪\t344591\n广州体育学院\t344592\n母女装\t344593\n魔力视频播放器\t344594\n闯进来\t344595\n骑马舞\t344596\n凝析油\t344597\n细胞角蛋白\t344598\n茶歇\t344599\n实况转播\t344600\n高物\t344601\nchecks\t344602\n贵南县\t344603\n周杰伦\t344604\n崇阳县人民政府\t344605\n量计征\t344606\n报国寺\t344607\n铜川市\t344608\n行政性\t344609\nModelsim\t344610\n东林寺\t344611\n期刊\t344612\n三七根\t344613\n上海话剧艺术中心\t344614\n绊橙\t344615\n改门\t344616\nto&#160\t344617\n韶关东站\t344618\n供货方\t344619\n收纳架\t344620\n阿莫西林克拉维酸钾片\t344621\n建设南路\t344622\n骚俊\t344623\n米咔\t344624\n飞
贼\t344625\n82号\t344626\n意淫\t344627\n中国第一汽车集团有限公司\t344628\n11K影院\t344629\n伯纳\t344630\n朝闻\t344631\n如果蜗牛有爱情\t344632\n锡林郭勒盟\t344633\n朱挺\t344634\n新余\t344635\n踢法\t344636\n搬弄是非\t344637\n双面器\t344638\n110届\t344639\n北师大版_小学\t344640\n班德尔城\t344641\n电压继电器\t344642\nof\t344643\nA10FZG045/10W-VRC02N00*981421\t344644\n池妍玉\t344645\n粉厂\t344646\n中央民族大学研究生院\t344647\n林荫\t344648\n0183\t344649\n黄岛开发区\t344650\n肛肠外科\t344651\nIsabelle\t344652\nllinux\t344653\n1037u\t344654\n冲量\t344655\n北川地震遗址\t344656\n重案\t344657\nAWK\t344658\n4.0000元\t344659\nplornhub\t344660\n茶盏\t344661\nRaise\t344662\n对台\t344663\n纯丙\t344664\n咽拭子\t344665\ntrinus\t344666\n华夏路\t344667\n金铁霖\t344668\n猫界\t344669\n珍珠港事件\t344670\n豆香\t344671\n巨灵\t344672\n台军\t344673\n喵星大作战\t344674\n粘滞阻尼器\t344675\n大兴新机场\t344676\n易读网\t344677\n最高检察院\t344678\n果木\t344679\n加州大学圣塔芭芭拉分校\t344680\nPKG\t344681\n六和城\t344682\n蓬勃\t344683\n时源\t344684\n成都东客站\t344685\n非人防\t344686\n70cm\t344687\n辐射板\t344688\nPM10\t344689\n洲明\t344690\nROHS\t344691\n深圳东部华侨城\t344692\n张乐乐\t344693\n驼峰\t344694\nnora\t344695\n震动感\t344696\n别具\t344697\n罗马字\t344698\n藤堂静\t344699\n蝴蝶刀\t344700\nRetirement\t344701\n职业教师\t344702\nLore\t344703\n中炬高新\t344704\n皇爵\t344705\n购销部\t344706\n20160920\t344707\n肌肉群\t344708\n重庆电力\t344709\n周漾玥\t344710\n布书\t344711\n改后\t344712\nctrl+space\t344713\n打碎\t344714\nspent\t344715\n宁曼路\t344716\ncheek\t344717\n2018年2月16日\t344718\n700d\t344719\n长安X70A\t344720\n藕节\t344721\n皮皮岛\t344722\n呼吸运动\t344723\n凯雷\t344724\n破获\t344725\nrecaro\t344726\nv4.1.2\t344727\n10孔\t344728\ngraphene\t344729\n金牌得主\t344730\n近百万元\t344731\n求子\t344732\n丽珠\t344733\n300168\t344734\n中国贸促会\t344735\n终年\t344736\nMater\t344737\n文魁\t344738\n综改\t344739\n贾立平\t344740\n引流管\t344741\nMediation\t344742\n适配性\t344743\n补偿器\t344744\n文王\t344745\n样稿\t344746\nTeradata\t344747\n格温妮斯·帕特洛\t344748\n辱尸\t344749\n君子兰花\t344750\n经营地\t344751\n说见\t344752\n资源性\t344753\n徐州医科大学\t344754\n联想控股\t344755\n歌王\t344756\n新余吧\t344757\n上尉\t344758\n幼教\t344759\n揭盖\t344760\n药科\t344761\n锤状\t344762\nCaffe\t344763\nAl\t344764\n10.5英寸\t344765\nmu-X牧\t344766
\n售前工程师\t344767\n生铁\t344768\n著录\t344769\n分化\t344770\n息料理\t344771\n宝莲灯鱼\t344772\n泉商\t344773\n北京市第八中学\t344774\n1216\t344775\n随想\t344776\n弹道\t344777\ngimme\t344778\nparamiko\t344779\n中国摇滚\t344780\n那些\t344781\n铜盘\t344782\n图志\t344783\n燕达医院\t344784\n产值\t344785\ncatalyst\t344786\na2pro\t344787\n10mg\t344788\n180422\t344789\n杉山纪彰\t344790\n百合剧\t344791\n苏州电视台\t344792\n花生仁\t344793\n省长\t344794\n周市镇\t344795\n能恩\t344796\n云府\t344797\nReactive\t344798\n鼓舞人心\t344799\n财务咨询公司\t344800\ngyp\t344801\n卡斯罗\t344802\n近1年\t344803\n林彬杨\t344804\n老爷庙\t344805\nshades\t344806\nps1\t344807\nimei码\t344808\n明升\t344809\n3300点\t344810\n最后一颗牙\t344811\nUnity圣典社区\t344812\n1CM\t344813\n北京保利\t344814\nChili\t344815\n杜邦纸\t344816\n红外夜视仪\t344817\nand函数\t344818\n高标准\t344819\n压强\t344820\n拱架\t344821\nhct\t344822\n试衣\t344823\ngsync\t344824\n良田高拍仪\t344825\n跑马山\t344826\n美国波士顿大学\t344827\n分等\t344828\n枯病\t344829\n蛤蟆油\t344830\n大足龙水湖\t344831\n执子之手,与子偕老\t344832\n老何\t344833\n1878\t344834\nEndress\t344835\n中移全通系统集成有限公司\t344836\n纪念展\t344837\n嘀嗒\t344838\n可溶性淀粉\t344839\n授权委托书\t344840\nCHORD\t344841\nEn\t344842\n热模\t344843\nsyslinux\t344844\n华尔街英语\t344845\nn条\t344846\n北京商报\t344847\n刑事案例\t344848\n汇商所\t344849\n陆晓萧楚北\t344850\nqq悄悄话\t344851\n社会库存\t344852\n丰县\t344853\nMPICH\t344854\n新美人计\t344855\n单身男女\t344856\n21小时\t344857\n新华国际\t344858\n百草录\t344859\n飞卢女生网\t344860\nh81\t344861\n杨涛\t344862\nJAVASCRIPT\t344863\n若非\t344864\nplatform\t344865\n古惑仔3\t344866\n0.28\t344867\ncsbte\t344868\n达子\t344869\nMen\t344870\n约德尔\t344871\n北京电影学院\t344872\n凯迪拉\t344873\nZ11mini\t344874\n泥炉\t344875\n金妍儿\t344876\n东方融资网\t344877\n中华小当家\t344878\n第117\t344879\n外衣\t344880\nAngular-cli\t344881\n骊歌\t344882\n劳心者\t344883\n敖子逸\t344884\n加热器\t344885\ngallery\t344886\n花生牛奶\t344887\n一搜\t344888\n丙申年\t344889\n魔轮\t344890\nCNAS实验室\t344891\n轮机\t344892\n1辆\t344893\n哈拉\t344894\n微场景秀\t344895\nsheet页\t344896\n指令集\t344897\n17.5米\t344898\n刘利民\t344899\n发送器\t344900\n7【\t344901\n虏\t344902\n客天下\t344903\n天津电力\t344904\n博鳌亚洲论坛\t344905\n创客猫\t344906\n攻入\t344907\n抠图\t344908\n23%\t344909\n万能转换开
关\t344910\n中央电视台\t344911\n新闻_99健康网\t344912\n上海弄堂\t344913\n削面\t344914\ncreateElement\t344915\n夸世代\t344916\nfocal\t344917\n事理\t344918\n脱单告急\t344919\n徹底\t344920\n最后一个子\t344921\n南京市鼓楼区人民法院\t344922\n艾肯5\t344923\n涂胶\t344924\n青岛万达\t344925\n冠名商\t344926\nCompress\t344927\n燃烧\t344928\n增广贤文\t344929\n卞\t344930\n每个世界\t344931\nSensation\t344932\nvsftp\t344933\n房租费\t344934\n池底\t344935\n磁力泵\t344936\n协创\t344937\n几率\t344938\npsl\t344939\ncertain\t344940\nSEVEn\t344941\n2018-04-1\t344942\n毒品\t344943\n顺网科技\t344944\n3ds\t344945\n小龙虾\t344946\n上海交通广播\t344947\n止咳糖浆\t344948\n五里界\t344949\n360骑卫士\t344950\n光庭\t344951\n一个40多岁\t344952\n数项\t344953\n开心辞典\t344954\n中央警卫局\t344955\n际华集团\t344956\n大马村\t344957\nXT\t344958\n跟换\t344959\noooo\t344960\n欺骗性\t344961\n427号\t344962\n依法治市\t344963\n佳通\t344964\n投掷物\t344965\n美标阀门\t344966\nWiFi信号\t344967\n药品生产许可证\t344968\n命中\t344969\n魔焰\t344970\n贮存\t344971\n逸阳\t344972\n正业科技\t344973\n逅\t344974\n半双\t344975\n昏\t344976\n广东壹号食品股份有限公司\t344977\n激愤\t344978\n灵魂摆渡人\t344979\ncitect\t344980\nM10\t344981\n逻辑性\t344982\n战列舰\t344983\n硬件综\t344984\n耕\t344985\nrunable\t344986\n27P\t344987\nc17\t344988\n全日空航空\t344989\n竖着\t344990\n肉弹\t344991\n斗篷\t344992\n中国南方电网有限责任公司\t344993\nJoo\t344994\n布带\t344995\n蒸包炉\t344996\n凤凰西街\t344997\n初等函数\t344998\n单规则\t344999\n唐伟敏\t345000\n国科微\t345001\nZERO\t345002\n国网江苏省电力有限公司\t345003\n中国航天科工二院\t345004\n云霄阁小说网\t345005\nmuti\t345006\n披散\t345007\noltp\t345008\n中译\t345009\nmx985\t345010\n趋避\t345011\n硫醇\t345012\n客心\t345013\n无缝漫游\t345014\n马克昌\t345015\n题源\t345016\npatricia\t345017\n欧科\t345018\n北石\t345019\nSymposium\t345020\n印铁\t345021\n第111期\t345022\n600332\t345023\n王根\t345024\ncsol\t345025\ncoaxial\t345026\nWSS\t345027\n会计学原理\t345028\n烦人\t345029\n王富贵\t345030\nMU\t345031\n20141012\t345032\n硕博连读\t345033\n广州市工业和信息化委员会\t345034\n兰陵国家农业公园\t345035\n花海\t345036\n228国道\t345037\n7000多\t345038\n驼队\t345039\n道生\t345040\n亚人\t345041\n里子\t345042\n纸条\t345043\n侵入者\t345044\n宝能城\t345045\n高亮度\t345046\nHIBERNATE\t345047\nKuaiDi\t345048\n处以\t345049\n2018-2021年\t345050\n航空大亨2\t345051\n吸尘车\
t345052\n回龙观店\t345053\n希夷\t345054\n唐山路北\t345055\n圣经创世纪\t345056\nPointwise\t345057\n镯子\t345058\nPytho\t345059\n伍六七\t345060\n吴\t345061\n白描\t345062\n尉氏吧\t345063\n渍\t345064\n九阳股份有限公司\t345065\n把悲伤留给自己\t345066\n游泳池\t345067\n四个小时\t345068\n易联\t345069\nAveeno\t345070\n骗纸\t345071\n纸浆模塑\t345072\n花落梦深处\t345073\n交收\t345074\n案头\t345075\nno\t345076\n主气\t345077\n慢波\t345078\n素馨\t345079\nErika\t345080\n护花神\t345081\n血狼\t345082\n设限\t345083\n天网工程\t345084\n_一号家居网\t345085\n拷贝数\t345086\n主名\t345087\n肃反\t345088\n氩气\t345089\n3.05\t345090\n乐山师范学院\t345091\nΧ\t345092\nNagoya\t345093\n飞行模拟器\t345094\n英斗\t345095\n凤仪\t345096\n泡温泉\t345097\nary\t345098\nlicense\t345099\nBigDecimal\t345100\nmicrosd\t345101\n表值\t345102\n左轮枪\t345103\n拉多维德\t345104\n燕园\t345105\n云岭先锋\t345106\n50厚\t345107\n百度文章\t345108\n丢手绢\t345109\n蒙马特\t345110\nautostart\t345111\n4胎\t345112\n深圳住房公积金管理中心\t345113\nsalt-minion\t345114\n天津人民广播电台\t345115\n快期\t345116\n头脑风暴法\t345117\nDoes\t345118\n武汉第一医院\t345119\nkakaotalk\t345120\n二月初二\t345121\n桂圆红枣茶\t345122\n3162\t345123\nMEAN\t345124\n一睹\t345125\n201509\t345126\n杨振宁\t345127\n互关\t345128\n创国卫\t345129\n杜隆塔尔\t345130\n比利海灵顿\t345131\n神华集团\t345132\n尼特\t345133\n终端盒\t345134\n英菲尼迪\t345135\n初见初恋\t345136\n杀菌机\t345137\n创联\t345138\nmesubuta\t345139\n已故\t345140\n广州白云万达广场\t345141\n坚持用\t345142\n澳大利亚\t345143\nbunch\t345144\n佛子岭\t345145\n救砖\t345146\n亲体\t345147\n中国南航\t345148\n高昕\t345149\n效仿\t345150\n少年志\t345151\n馨子\t345152\n韩币\t345153\n/root\t345154\n上湖城\t345155\n广东论坛_汽车之家论坛\t345156\n张贤达\t345157\nV-KOOL\t345158\n国家卫生部\t345159\n电王\t345160\n苏宁置业\t345161\n市州\t345162\n同杆\t345163\n违办\t345164\n婚介\t345165\n丙\t345166\n穹顶\t345167\n祁连山路\t345168\nShaderLab\t345169\n三支脚人才网\t345170\n明小子\t345171\n應該\t345172\nSUIT\t345173\n狗眼\t345174\n海底小纵队\t345175\n智络\t345176\nBizfluent\t345177\n北京电影学院表演系\t345178\n索菲亚衣柜\t345179\nvivoy85\t345180\n花家地实验小学\t345181\nnee\t345182\nRadiation\t345183\n唏哩呼噜\t345184\n连天红\t345185\n907\t345186\n万斤\t345187\n奇米色\t345188\nbat文件\t345189\n吊轮\t345190\n收纳箱\t345191\njava模拟器\t345192\n30瓶\t345193\n善思\t345194\n会展业\t3
45195\n四元数\t345196\n红豆馅\t345197\n靖国神社\t345198\n厦门消防\t345199\n石班瑜\t345200\n坦克师\t345201\nplaymemories\t345202\n羟氯喹\t345203\n李建华\t345204\nbcd\t345205\n乳糜尿\t345206\n魂殿\t345207\n阿道夫·希特勒\t345208\n围岩\t345209\ns0\t345210\n安徽大学江淮学院\t345211\n股骨粗隆间骨折\t345212\n国际关系\t345213\nPwn\t345214\n奔驰A级\t345215\n手机板\t345216\n收索\t345217\n工业炉窑\t345218\nnurbs\t345219\n酷游\t345220\n钢棒\t345221\nGST\t345222\n直人\t345223\n奢享\t345224\n静电除尘器\t345225\n短链接\t345226\n路面\t345227\nm132a\t345228\n大珠\t345229\n波音738\t345230\ndisa\t345231\nGBQ\t345232\n剑法\t345233\n上海张江集团学校\t345234\n中关村壹号\t345235\n指挥所\t345236\n湖州市中心医院\t345237\n长城路\t345238\n扩弓\t345239\n不良品\t345240\n10千瓦\t345241\n紗倉\t345242\n消火\t345243\n山口县\t345244\n陈金华\t345245\n冯军\t345246\nsnapshot\t345247\n流星\t345248\n比特币挖矿收益计算器\t345249\n斗阵\t345250\n永乐宫\t345251\njidu\t345252\n四川大学华西口腔医院\t345253\n3矩阵\t345254\nRampage\t345255\n小精\t345256\n足记\t345257\n002241\t345258\n欧阳询\t345259\nTrie\t345260\n成都电子科技大学\t345261\n时秒\t345262\nWord2007\t345263\n丁凡\t345264\n涂达\t345265\nfm2012吧\t345266\n双层罐\t345267\n天然呆\t345268\n陈思安\t345269\n童果网\t345270\nworkgroup\t345271\n2403\t345272\n光体\t345273\n文具的家\t345274\n18场\t345275\nProblem\t345276\n天师道\t345277\n3000r\t345278\n囧叔\t345279\n消光\t345280\n荆棘花\t345281\nwsgi\t345282\n苯丙乳液\t345283\n块根\t345284\n儒林杨子\t345285\n旅游地\t345286\n危化\t345287\n我长大了\t345288\n双凤开发区\t345289\n产生器\t345290\n97夏\t345291\n六翼\t345292\n旋压机\t345293\n罗刹\t345294\n本泽马\t345295\n罗尔德·达尔\t345296\n终结技\t345297\n后宫甄嬛传\t345298\n栈区\t345299\nfsg\t345300\n2016年7月1日起\t345301\n贵阳北\t345302\n炭化\t345303\nplayboy\t345304\n11.10\t345305\n45钢\t345306\n2012运行卡顿\t345307\n8201\t345308\n反恐主义法\t345309\n天智\t345310\n园景\t345311\nKEF\t345312\nCharset\t345313\n吴珊\t345314\n玛丽居里\t345315\n放过\t345316\nmolding\t345317\nbuchi\t345318\nGTX750\t345319\n绝地求生好汉杯\t345320\n1999\t345321\n19朵\t345322\n精密管\t345323\nEA888\t345324\n阿育王寺\t345325\nPPSSPP\t345326\n反证法\t345327\n露易丝\t345328\nAsus\t345329\n日本语\t345330\n寂静岭\t345331\n兴隆家园\t345332\n24针\t345333\nchap\t345334\nafreeca\t345335\n受好评\t345336\n棋盘格\t345337\n街射\t345338\n付钱
\t345339\n新生们\t345340\n大话西游手游男鬼\t345341\n2.2.11\t345342\n中国人民银行征信中心\t345343\n江西省公路管理局\t345344\n收\t345345\n深圳市气象局\t345346\n济南电台\t345347\n校长杯\t345348\nSVProgressHUD\t345349\n广东省广播电视网络股份有限公司\t345350\n莆田系\t345351\nAPP178手游网\t345352\n王中\t345353\nps格式\t345354\n版群\t345355\n普通人家\t345356\n王能\t345357\nRyzen处理器\t345358\n卢正浩\t345359\n叠彩山\t345360\n宁波高新区\t345361\n浇铸\t345362\n捷波\t345363\n萝卜叶\t345364\n泰服\t345365\n黑苹果吧\t345366\n贲门失弛缓症\t345367\n随机事件\t345368\n撅嘴\t345369\n有你春暖花开\t345370\n新牙\t345371\n沿海城市\t345372\nInto\t345373\n微标\t345374\nfone\t345375\n五十亿\t345376\nC60\t345377\nsuffer\t345378\nbg\t345379\n信贷公司\t345380\n阐发\t345381\n顿河\t345382\n天津医科大学肿瘤医院\t345383\n云南移动\t345384\n力创\t345385\nbuiltins\t345386\n经验之谈\t345387\n迷迷糊糊\t345388\n老天师\t345389\n推广位\t345390\n浙江省工商局\t345391\nclaim\t345392\n李德印\t345393\n突然好想你\t345394\n市交警支队\t345395\n补流\t345396\niOS7\t345397\n代餐棒\t345398\n乡人大\t345399\n虬江路\t345400\n前走\t345401\n韩代\t345402\nillion\t345403\n胸骨左缘\t345404\n俞强\t345405\nglobally\t345406\n讲文明懂\t345407\n风云榜\t345408\n供水\t345409\n皮书\t345410\n雨松\t345411\n硬分叉\t345412\n摧毁\t345413\n350个\t345414\n议院\t345415\n盲注\t345416\n易石\t345417\n暗影步\t345418\nニング\t345419\n织造\t345420\n陈嘉庚\t345421\n棺木\t345422\nlstrip\t345423\n书阁网\t345424\n跑出来\t345425\nhneao\t345426\n238万\t345427\n淮安火车站\t345428\n2674号\t345429\n增程器\t345430\n很有用\t345431\n斑斓\t345432\n浅水湾\t345433\n榆木\t345434\n计组\t345435\n语儿\t345436\n军校生\t345437\nCass\t345438\n_楚秀网\t345439\n高静\t345440\n三百六十五个\t345441\nssm\t345442\nsketup\t345443\n农法\t345444\n24厘米\t345445\n厚积\t345446\n私营经济\t345447\n能儿\t345448\n幼儿教育网\t345449\n优启通\t345450\n渔乐\t345451\nbbk\t345452\n龙机\t345453\n转群\t345454\n战例\t345455\n大二\t345456\n人岗\t345457\npreservation\t345458\n卡内基梅隆\t345459\n理想国\t345460\n斜杆\t345461\n福州教育网\t345462\n索玛花\t345463\nFirearms\t345464\nsettext\t345465\n美容整形医院\t345466\n灾殃\t345467\nDataSet\t345468\n警花与警犬\t345469\nMonogram\t345470\n钢筋弯曲机\t345471\n顾清明\t345472\n2TB\t345473\n淫\t345474\n值得到\t345475\n宫泽贤治\t345476\n一路顺风\t345477\n第101集\t345478\n可数名词与不可数名词\t345479\n黑盖\t345480\n格氏试剂\t345481\nGenerat
ing\t345482\n幂律分布\t345483\n7米\t345484\n丁火\t345485\n固阳\t345486\n皇帝陵\t345487\n不俗\t345488\n凤起潮鸣\t345489\n黄大仙\t345490\n赚到\t345491\n拳击\t345492\nfastcopy\t345493\n拉兹之歌\t345494\n卢家宏\t345495\n紧张素\t345496\n焦油\t345497\n直肠\t345498\n误杀\t345499\n品书\t345500\n完美主义者\t345501\n蛋白粉\t345502\n禹州市\t345503\nBoth\t345504\n歧义句\t345505\n0630\t345506\n3.6下\t345507\n柯西\t345508\n师娘\t345509\n四川小学\t345510\n民意调查\t345511\n穆丹枫\t345512\nEnvision\t345513\n雨中情\t345514\nyitian\t345515\n回转企鹅罐\t345516\n导频\t345517\n事端\t345518\n天山熙湖\t345519\n刘纯露\t345520\n胶轮\t345521\n珠江流域\t345522\n博天环境集团股份有限公司\t345523\n一学就会\t345524\n上海消防\t345525\n考试中\t345526\nglasgow\t345527\nLTO\t345528\n黑兰\t345529\n十五分\t345530\nfreesex\t345531\n刘奇葆\t345532\n符文工房4\t345533\n要案\t345534\npthread\t345535\n欢喜就好\t345536\n危化品经营许可证\t345537\n欧文斯科宁\t345538\nNostalgia\t345539\n家人\t345540\n邕城\t345541\n前列腺按摩器\t345542\nfuelux\t345543\n国际法\t345544\n8550\t345545\n阳谷\t345546\n非阻塞\t345547\n杜卡迪\t345548\n小视科技\t345549\n家保\t345550\n张纲\t345551\n二分之一\t345552\n陈铎\t345553\n恒大金碧天下\t345554\nGX\t345555\n铁艺门\t345556\n降序\t345557\n54青年节\t345558\n刀闸\t345559\n易昕\t345560\nnginx-module\t345561\n高畑充希\t345562\nTAKSTAR\t345563\n新文\t345564\n存有\t345565\n艾依格\t345566\n李光斗\t345567\n扬子晚报\t345568\n快餐盒\t345569\nmvt\t345570\n不等式\t345571\n僵尸大战\t345572\n佘\t345573\nBookmark\t345574\n报检\t345575\n陈惠\t345576\n俺去也影音先锋\t345577\n德勤华永会计师事务所\t345578\n云龙区\t345579\nClient\t345580\n电力部\t345581\nlabor\t345582\nucsb\t345583\n压金\t345584\njuren\t345585\ntlnshuju\t345586\n雨花经济开发区\t345587\n宝马X系论坛\t345588\nsolute\t345589\n认生\t345590\n谷歌浏览器\t345591\n普莱克斯\t345592\n述\t345593\n看走眼\t345594\nHSM\t345595\n郑醇\t345596\n8.5.30\t345597\n王红兵\t345598\n人音版\t345599\n迅雷在线\t345600\n100整除\t345601\n4R8\t345602\n氯离子\t345603\n深圳市卫生和计划生育委员会\t345604\nlanmp\t345605\n朝阳百姓网\t345606\n法乙\t345607\n结构性\t345608\n二十四节气歌\t345609\nマネ\t345610\n马鞍山路\t345611\n泡泡水\t345612\n蓟县农家院\t345613\nf22\t345614\n缂丝\t345615\n山东力明科技职业学院\t345616\n一柱擎天\t345617\n涨势\t345618\n清水断崖\t345619\n阿四\t345620\n桃运小村医\t345621\n席德梅尔\t345622\n填权\t345623\n延时器\t345624\n蛇口人民医院\t3
45625\n青阳\t345626\n力强\t345627\n跳绳\t345628\n戴明\t345629\n金台夕照\t345630\nskr\t345631\nMDF\t345632\n2.5.9\t345633\n资产阶级\t345634\n靳萧然\t345635\n20176\t345636\n船尾\t345637\n0805\t345638\nAshampoo\t345639\n水势\t345640\n越发\t345641\nMacdonald\t345642\nMCN\t345643\n41条\t345644\n九龙旺角\t345645\n武大郎\t345646\n曹贼\t345647\n级号\t345648\n韩笑\t345649\n老杜\t345650\n膜板\t345651\nDADDI\t345652\n华岩新城\t345653\nCHEER\t345654\n4杯\t345655\n抵抗力\t345656\nwenger\t345657\n低魔时代\t345658\ni7-4790\t345659\n婚天\t345660\n富士宝\t345661\n小能\t345662\n黑死\t345663\n地方病\t345664\n黄宏生\t345665\n李妮娜\t345666\n王海明\t345667\n赚法\t345668\n媛媛\t345669\n巨蜥\t345670\n保护眼睛\t345671\n逝年\t345672\n37%\t345673\n当升科技\t345674\n晶华\t345675\n互创\t345676\n新嘉街道\t345677\n思必驰\t345678\n吊打\t345679\n第几项\t345680\npocketsphinx\t345681\nSTC12C5A60S2\t345682\nvoid\t345683\n兜兜\t345684\n升降梯\t345685\nLie\t345686\n碳正离子\t345687\nqube\t345688\n隔声\t345689\n乐天堂\t345690\n吉林电信\t345691\n待字\t345692\n窥见\t345693\n猫展\t345694\n迁户口\t345695\n特品\t345696\nZ18mini\t345697\n关于深化国有企业改革的指导意见\t345698\n礼券\t345699\n8名\t345700\nseb\t345701\n禅画\t345702\n夹层基金\t345703\n反射型\t345704\n五百强\t345705\n多囊肾\t345706\n珠海人民医院\t345707\n永恒之柱2\t345708\n973\t345709\n瀚华\t345710\n宁波大学商学院\t345711\n600360\t345712\n相马\t345713\n颜瑜\t345714\n38名\t345715\n嘉定紫藤园\t345716\n白头发\t345717\n兴福寺\t345718\n老馆\t345719\nARTLINKART\t345720\n牛城\t345721\n杨明洁\t345722\n脸形\t345723\n海康录像机\t345724\n兖州煤业股份有限公司\t345725\n数据库管理系统\t345726\ndirectx9\t345727\n货价\t345728\n20150327\t345729\n胎压监测\t345730\n熟菜\t345731\n深圳市社会保险基金管理局\t345732\n眼珠\t345733\n保身\t345734\n铜板\t345735\n高善文\t345736\nQuad\t345737\n赵轩\t345738\n六横岛\t345739\n阿芙精油\t345740\n混炼\t345741\n中国矿业大学银川学院\t345742\n2018-04-02\t345743\n6s/6s\t345744\n荣泰健康\t345745\n赤霉素\t345746\nRES\t345747\nCDG\t345748\n流行性脑脊髓膜炎\t345749\n721\t345750\n中国舞考级\t345751\n东港镇\t345752\ncontainerd\t345753\n天津华苑\t345754\n牛散户\t345755\n一夜情\t345756\n尾生\t345757\n第七新闻网\t345758\nnali\t345759\n监督处\t345760\n自证\t345761\n定期定额\t345762\n赵子清\t345763\n高升\t345764\nMillions\t345765\nmudi\t345766\n祁门\t345767\n弱杀\t345768\n内\t345769\n高
级财务会计\t345770\n同程旅游网\t345771\n团体\t345772\n丁大卫\t345773\n团名\t345774\n马晓晴\t345775\n外地车\t345776\n鬼藏\t345777\n江南人\t345778\nformatted\t345779\n麻面\t345780\n北京市建设工程信息网\t345781\n1839年\t345782\n五星酒店\t345783\n存放柜\t345784\nActivated\t345785\n广告版\t345786\n速客\t345787\nhttpp\t345788\n合诵\t345789\n人祸\t345790\n公共租赁房\t345791\n37话\t345792\n唯一\t345793\n艾乐替尼\t345794\n自怜\t345795\n制模\t345796\n停转\t345797\n调休\t345798\n陕西文化产业投资控股(集团)有限公司\t345799\n阔步\t345800\n置业网\t345801\n泗水\t345802\n天安云谷\t345803\nAchievement\t345804\n十二\t345805\n绕桩\t345806\n陌陌号\t345807\n冻柜\t345808\n芝诺\t345809\nidea类\t345810\n偏析\t345811\n暴发户\t345812\n克里克\t345813\n常熟市政府\t345814\n新浪河南新闻_新浪\t345815\n鲍文\t345816\n惊现\t345817\n豪门足球风云\t345818\n盐道\t345819\n义父\t345820\nhtml+css3\t345821\n胡超\t345822\n一对一补习\t345823\n奚秀兰\t345824\n张嫣\t345825\n网证\t345826\n一键还原\t345827\n专硕\t345828\n郎溪路\t345829\noppoa57\t345830\n高博\t345831\n开拍\t345832\n一千多个\t345833\n广州农商银行\t345834\n工分\t345835\n饸饹\t345836\n阳光花园小区\t345837\n呼吸性\t345838\n厦门城市职业学院\t345839\n老师们\t345840\nSatomi\t345841\n弊端\t345842\n傻丫头\t345843\nbarcodescanner\t345844\n督促\t345845\n封炎\t345846\n子窗口\t345847\n七政\t345848\n舆论战\t345849\nMOD网\t345850\n刻度\t345851\n科莫湖\t345852\n代币\t345853\n听感\t345854\n直发器\t345855\n河南交通职业技术学院\t345856\n济南轨道交通集团有限公司\t345857\n验机\t345858\n王爱华\t345859\n甘孜州\t345860\nmolecule\t345861\n12月22日\t345862\n泛亚汽车技术中心\t345863\ntraktor\t345864\n4600元\t345865\n角蛙\t345866\n树舌\t345867\n韶关教育信息网\t345868\nlog10\t345869\n中国教育网\t345870\n三四名\t345871\n文昌大道\t345872\nspite\t345873\n000831\t345874\nDocin\t345875\nBuddha\t345876\n鬼斯通\t345877\nsqlca\t345878\n切除\t345879\n塔利亚\t345880\n七一路\t345881\n课目\t345882\n对外汉语专业\t345883\n秋色之空\t345884\n矿坑\t345885\n千里路\t345886\nviewpager\t345887\n临沂市\t345888\n证券投资学\t345889\nfidder\t345890\n恐龙简笔画\t345891\n宁波宾馆\t345892\ncurel\t345893\n好天\t345894\nhtcu11\t345895\n18英寸\t345896\nStoryboard\t345897\n5.17\t345898\n唐初\t345899\n祥瑞\t345900\n简字\t345901\n符文石\t345902\n理财\t345903\n广东话\t345904\n麻\t345905\nGreene\t345906\n七龙珠\t345907\n108周年\t345908\n4.5.4\t345909\ngzqh\t345910\nMarni\t345911\
n功能\t345912\n标准型\t345913\n摇菌\t345914\n叶舌段\t345915\n广州市科技创新委员会\t345916\n科斯塔\t345917\n金九拉\t345918\npot\t345919\nuntrust\t345920\n00023\t345921\n12138\t345922\n每份\t345923\nOff-White\t345924\n超声换能器\t345925\n玻妞\t345926\n西伯利亚大铁路\t345927\nwin10以太网\t345928\n侍御\t345929\n杨彬\t345930\n1.18\t345931\n3.6G\t345932\n兵模\t345933\n逆火狂飙\t345934\n宠物情缘\t345935\n过家\t345936\n能下\t345937\n沃尔\t345938\n尾焰\t345939\n可导函数\t345940\nCLOUD\t345941\n甲型肝炎\t345942\n孙伟\t345943\n品名\t345944\n马克思佩恩\t345945\n卢伟\t345946\n南广\t345947\n阎良之窗\t345948\nmesse\t345949\n顶叶\t345950\n心音\t345951\n九款\t345952\n西安电子科技大学计算机学院\t345953\nbacklog\t345954\n火影忍者吧\t345955\n神田留美\t345956\n胡海锋\t345957\nliinux\t345958\n11.0\t345959\n免费资源网\t345960\nAlong\t345961\n陈火旺\t345962\n同仁们\t345963\nBurning\t345964\n太原钢铁(集团)有限公司\t345965\nsake\t345966\nZ5S\t345967\n河南省洛阳正骨医院\t345968\n所说\t345969\nMiya\t345970\n495\t345971\n冯远征\t345972\n吉他贝司\t345973\n冻\t345974\n中铁城\t345975\n贺红\t345976\n换群\t345977\n预付式\t345978\nAD9361\t345979\n羸\t345980\n42CrMo\t345981\n第117届\t345982\n王珂\t345983\n热分析仪\t345984\n10.10.1\t345985\n智乐\t345986\n拉贾斯坦邦\t345987\n2kg\t345988\n马尔可夫模型\t345989\nKoch\t345990\n口袋妖怪日\t345991\n赛多利斯\t345992\n音带\t345993\n长郡梅溪湖中学\t345994\n技\t345995\nwww.wuxi.gov.cn/uploadfiles/\t345996\n窖龄\t345997\n以人为鉴\t345998\n麦弗逊式独立悬架\t345999\n511路\t346000\n预拌粉\t346001\n硫酸二甲酯\t346002\n君莫\t346003\n航天城\t346004\nTwistys\t346005\n申达\t346006\nUpper\t346007\n民族小学\t346008\n高亮重复项\t346009\n海拉5透镜\t346010\nk690e\t346011\n700集\t346012\n从零西厢记\t346013\nfil\t346014\n2016.2\t346015\n3.94\t346016\n天才麻将\t346017\nQQ2017\t346018\nVGA接口\t346019\nv2.3.2\t346020\n五色石南叶\t346021\n太空中\t346022\nboolean型\t346023\nCOUNTIF\t346024\n兴宁中学\t346025\nheather\t346026\n交河\t346027\n天足\t346028\n肾小球滤过率\t346029\n小小少年\t346030\n不卖给\t346031\n巴州在线\t346032\nSTU\t346033\n经纶学典\t346034\n引爆点\t346035\n浸渍法\t346036\n第十二篇\t346037\n班报\t346038\n660分\t346039\n河轮\t346040\n短字\t346041\nU18\t346042\n中容器\t346043\n公腰\t346044\nNVH\t346045\n向村\t346046\n美石\t346047\n糖酒快讯\t346048\n稿子\t346049\n头颈\t346050\n赵大海\t346051\nExcel表格自动求和全攻略\t
346052\n动手动脚\t346053\nsnl\t346054\n围产期\t346055\n凯路威\t346056\n下沙街道\t346057\n乌尔班\t346058\n四年多\t346059\n埋汰\t346060\n脾丸\t346061\n近年\t346062\n房产超市网\t346063\n奶狗\t346064\n700毫升\t346065\n正统三国\t346066\n通信工程吧\t346067\n多拉\t346068\n侨民\t346069\nfastcam\t346070\n正射\t346071\n过氧化钙\t346072\nRights\t346073\nTeenies\t346074\ncdr12\t346075\n科迪亚克\t346076\n衢州教育网\t346077\n上海市第一中级人民法院\t346078\nhenhenlu\t346079\n【名\t346080\n第21个\t346081\n姓朱\t346082\n苏州吴中区\t346083\n上海世贸商城\t346084\n等离子喷涂\t346085\n成荫\t346086\n20150126\t346087\n曲酸\t346088\nflag\t346089\n黑白无常\t346090\n音飞\t346091\n东兴镇\t346092\nOMA\t346093\n北方股份\t346094\n宁波市第二医院\t346095\n360期刊网\t346096\n黄绿色\t346097\n20170904\t346098\n周志坚\t346099\n第173集\t346100\noverlay2\t346101\nA型\t346102\n玲珑郡\t346103\n磁砖\t346104\n小茴香\t346105\n16.5\t346106\nSynthesia\t346107\n00000006\t346108\n二之国2亡魂之国\t346109\n空格符\t346110\n梓喵\t346111\n快捷通\t346112\n华声晨报网\t346113\nTRP\t346114\n九思\t346115\n仙女棒\t346116\n从零开始的魔法书\t346117\n来款\t346118\nGays\t346119\n新山枫\t346120\n西瓜子\t346121\n京都\t346122\n瓦匠\t346123\n十三次\t346124\n七十二家房客\t346125\n学院南路\t346126\n昆仑健康保险\t346127\n金莲花\t346128\n科技小制作\t346129\nzookepper\t346130\n小妹\t346131\n租\t346132\n3270\t346133\n986\t346134\nUnity之家\t346135\n棘洪滩街道\t346136\n人师\t346137\nConverged\t346138\n敏金\t346139\n50管\t346140\n音波\t346141\n其它\t346142\n小瘤\t346143\n郑和\t346144\n萎靡\t346145\n蛋糕卡\t346146\n采编\t346147\n李婷婷\t346148\n临涣镇\t346149\n看图网\t346150\n健康库\t346151\n水贴\t346152\n小泉\t346153\n0571-63434126\t346154\n王晓辉\t346155\n临港新城\t346156\nMéxico\t346157\n陈某\t346158\nsom\t346159\n病质\t346160\n10月1日\t346161\n芳名\t346162\n南京旅游职业学院\t346163\n生菜沙拉\t346164\n七零年\t346165\n六十分\t346166\n周天\t346167\n稀里哗啦\t346168\n20多部\t346169\n普拉\t346170\n暗黑3猎魔人\t346171\n劳埃德\t346172\nv348\t346173\nalpine\t346174\n名侦\t346175\n张吉\t346176\n陆抗\t346177\n30道\t346178\n松鼠桂鱼\t346179\nDHS\t346180\n值日生\t346181\n无可救药\t346182\nWesChen\t346183\n自由口\t346184\n賺\t346185\n早孕测试\t346186\n车篇\t346187\n佳能70d\t346188\n黄小琥\t346189\ntabControl\t346190\n短命\t346191\n20160303\t346192\n110米\t346193\n有道云笔记吧_\t346194\n软袋\t34
6195\n豹哥\t346196\n涪风论坛\t346197\n扫地机器人\t346198\nAras\t346199\n延崇高速\t346200\n上饶职业技术学院\t346201\narising\t346202\n0.1%\t346203\n130集\t346204\n25974元\t346205\n1983年\t346206\nartofzoo\t346207\n光环5\t346208\n暗桩\t346209\ntemporarily\t346210\n5月11号\t346211\n万茜\t346212\n星云奖\t346213\n安装工程\t346214\n卧冰求鲤\t346215\nOra\t346216\n霄汉\t346217\n女友们\t346218\n换衣间\t346219\ncmpp\t346220\nlicen\t346221\n上供\t346222\n私募排排网\t346223\n日用瓷\t346224\nahzll\t346225\nBasemap\t346226\n炝\t346227\n连撞\t346228\n美国认证协会\t346229\n夏冬青\t346230\n沙丁胺醇气雾剂\t346231\ngetter\t346232\n新郎官\t346233\nxcode7\t346234\n反激变压器\t346235\nMacoLee\t346236\n五年级语文上册\t346237\n孙一\t346238\nNIC\t346239\n韩艺\t346240\nschecter\t346241\nXmind8\t346242\n生生不息\t346243\n三论宪法\t346244\n四川网络广播电视台\t346245\nphpinfo\t346246\nsring\t346247\n教廷\t346248\nG5500\t346249\n佳境天城\t346250\n伯克利深圳学院\t346251\n新能源车补贴\t346252\n亡党\t346253\n3任\t346254\n众志成城\t346255\n整图\t346256\n颇尔\t346257\n长沙国际会展中心\t346258\nwrist\t346259\n呋喃树脂\t346260\npixma\t346261\n福气\t346262\n神农谷\t346263\n命令与征服4\t346264\n刃具\t346265\n万山红\t346266\n吴玉良\t346267\n靖佩瑶\t346268\nbin.zip\t346269\n登峰杯\t346270\n霸州市人民政府\t346271\nalaska\t346272\n无术\t346273\n12.20\t346274\n1871年\t346275\n手术台\t346276\n胜利女神\t346277\n宛如\t346278\n取信\t346279\nRonnie\t346280\n球灯\t346281\n许大茂\t346282\nila\t346283\n李新华\t346284\n联化科技\t346285\n王者荣耀百里玄策\t346286\n黄泥\t346287\n5万方\t346288\n杜源\t346289\n盘龙山\t346290\n风飘龙\t346291\n1207\t346292\n花游记\t346293\n可视化\t346294\n曼谷酒店\t346295\n卖国贼\t346296\nEvaluate\t346297\nFlashcards\t346298\n518号\t346299\n7首\t346300\n宁波府\t346301\n深圳市民中心\t346302\n木马病毒\t346303\n商证\t346304\n源乐晟\t346305\n过去时\t346306\n米高\t346307\ncatchtap\t346308\nTailor\t346309\n往生咒\t346310\n西游伏妖篇\t346311\n受灾\t346312\n波隆\t346313\nPersons\t346314\n亚青赛\t346315\n古事记\t346316\nXBRL\t346317\n蜘蛛女\t346318\n拉刀\t346319\n美军舰\t346320\n彩经网\t346321\nreviewers\t346322\nmpk\t346323\n宜章县\t346324\nfins\t346325\n臀大肌\t346326\n徐溢\t346327\n替硝唑片\t346328\n化忌\t346329\nf535\t346330\nUdesk\t346331\n椎管\t346332\n享用\t346333\n故事簿\t346334\n巴黎机场\t346335\n开校\t346336\niR\t34
6337\n南昌地区\t346338\n梦梦\t346339\n大力水手\t346340\n教体局\t346341\n淫母\t346342\nREI\t346343\n夕子\t346344\n飞网\t346345\n西贡\t346346\n重创\t346347\n伊斯坦布尔机场\t346348\n李爱武\t346349\n照遭\t346350\nMarginNote\t346351\nyy100t\t346352\n梦幻诛仙\t346353\n乐蒂\t346354\nLori\t346355\n自求多福\t346356\n解愁\t346357\nlinuz\t346358\nlaydate\t346359\n米索前列醇片\t346360\n天才麻将少女吧\t346361\n婚文\t346362\n嘉兴市公共资源交易中心\t346363\n北科建\t346364\n河北国家税务局\t346365\n王境泽\t346366\nKatrina\t346367\nmeshi\t346368\nqrcode\t346369\n印制电路板\t346370\n林志颖\t346371\n灵川县\t346372\nkb绳艺网\t346373\n小甜甜\t346374\npret\t346375\n到渠成\t346376\nbom表\t346377\n六女\t346378\n戴亚克隆\t346379\n科尔沁左翼后旗\t346380\n8集\t346381\n唯愿\t346382\n僧侣\t346383\nMyISAM\t346384\n字章\t346385\n5500亿\t346386\n赛维\t346387\n贮藏\t346388\nPERFECT\t346389\n进口商\t346390\n伊犁路\t346391\n原先\t346392\n博瑞大厦\t346393\nCombo\t346394\nBuild\t346395\n黑涩\t346396\n乌丹镇\t346397\n最高温\t346398\n环氧树\t346399\nFacility\t346400\n初禅\t346401\n重整化\t346402\n无声无息\t346403\n周鸿祎\t346404\n正版图\t346405\n植物类\t346406\n小醉\t346407\n暗幕\t346408\n1月1号\t346409\n陈苏\t346410\njaccard\t346411\n凸轮机构\t346412\n亚须希\t346413\ndouyu\t346414\n天慧龙\t346415\n夏尔\t346416\n四建\t346417\n巅峰赛\t346418\n字典板报网\t346419\n废寝忘食\t346420\n姜宁\t346421\n垂直于\t346422\n外部\t346423\n棋牌类\t346424\n姜明安\t346425\n金速\t346426\n中变传奇\t346427\n思索\t346428\n泉州人才网\t346429\n流脑疫苗\t346430\nPubmed\t346431\n2018年4月25日\t346432\n十五万\t346433\nhtml5plus\t346434\n硬气\t346435\nOmnifocus\t346436\n中国中煤能源集团有限公司\t346437\n信威集团\t346438\nscm\t346439\nsnk\t346440\nDDR3L\t346441\n赤裸弹弹堂\t346442\n美味\t346443\n扶余\t346444\nmin函数\t346445\n中国好声音吧\t346446\n拔号\t346447\n广州电大\t346448\nOffice2010\t346449\n另一只手\t346450\n睨\t346451\n罩袍\t346452\n西安一小区\t346453\n势力榜\t346454\n边幅\t346455\n衡州\t346456\njaycekong\t346457\nKOH\t346458\n秦煜\t346459\n变声器男\t346460\n耿市长\t346461\n早秋\t346462\nHERO6\t346463\nlolhentai\t346464\nmt3\t346465\n双酚a\t346466\n高雄\t346467\n瑞纳论坛_汽车之家论坛\t346468\n罗京\t346469\n80千克\t346470\n江苏省盐城中学\t346471\n蒙顶甘露\t346472\npetal\t346473\n诺查丹玛斯\t346474\nfushi\t346475\n赵武灵王\t346476\n有着\t346477\n混流泵\t346478\nprotools\t346479\
n辛鸣\t346480\n港校\t346481\n西湖镇\t346482\n有容\t346483\n攀辰智通\t346484\n贺军翔\t346485\n央视二套\t346486\n火爆化妆品招商网\t346487\nwaifu2x\t346488\n种树郭橐驼传\t346489\n)电气有限公司\t346490\n独库公路\t346491\n圆木\t346492\n搜房帮\t346493\n非洛地平\t346494\n中医诊断学\t346495\n韩红\t346496\nSeq2Seq\t346497\n克氏针\t346498\n阿伊\t346499\nInserting\t346500\n幻想三国\t346501\n招标控制价\t346502\n双峰镇\t346503\n13th\t346504\nsybyl\t346505\n允\t346506\nクリ\t346507\n2.77\t346508\n周勃\t346509\n消化器\t346510\n失散\t346511\n鲍喜顺\t346512\n刀叉\t346513\nIPV4\t346514\n秦深\t346515\nAngular4\t346516\n慧安中小企业服务中心\t346517\n中国木业网\t346518\nCatalyst\t346519\n总述\t346520\n雷州市\t346521\n南头\t346522\n大跟班\t346523\n丰乐路\t346524\n限速器\t346525\n梁线\t346526\n钱多\t346527\n学佛网\t346528\n单一性\t346529\n光华门\t346530\n十九大报告体会\t346531\n白金岛\t346532\n1708\t346533\nstella\t346534\nhalal\t346535\n最佳状态\t346536\n成昆\t346537\n无损音乐_\t346538\n古微\t346539\n例举\t346540\n微领地\t346541\n第七弹\t346542\n王柏川\t346543\n解码机\t346544\n边台\t346545\n生日快乐歌\t346546\nhedging\t346547\ncad2015\t346548\n周惠\t346549\nIntellijIdea\t346550\n四高\t346551\n斗牛\t346552\n早一点\t346553\n金雅中\t346554\n多多一个\t346555\nRusty\t346556\n75kw\t346557\n葫芦娃七娃\t346558\nrpmbuild\t346559\n正太音\t346560\n502\t346561\nnx200\t346562\n邻家女孩\t346563\nyijian\t346564\n1007\t346565\nMICROSOFT\t346566\n印第安人\t346567\n挤塑机\t346568\n48艘\t346569\n学弟\t346570\nfusion360\t346571\n石墙\t346572\nFoo\t346573\nnasal\t346574\n强弩之末\t346575\n卢思浩\t346576\n百味柯桥\t346577\n_瑞金教育网\t346578\nfluoro\t346579\nTrain\t346580\n打给\t346581\nBocelli\t346582\n参芪\t346583\n红股\t346584\n姜戎\t346585\n神马股份\t346586\n苏打粉\t346587\n永安期货股份有限公司\t346588\n涂塑管\t346589\nactualité\t346590\n上海中粮\t346591\n带钩\t346592\nhpe\t346593\n沙暴\t346594\nLIBSVM\t346595\nNetworkManager\t346596\n硝基漆\t346597\n9D\t346598\n微凉\t346599\n王云生\t346600\nAirpods\t346601\nM6000\t346602\nTOTOLINK\t346603\n最终版\t346604\n52pk英雄联盟视频站\t346605\n定界符\t346606\n河北建设投资集团有限责任公司\t346607\n邪恶少女漫画H_色列漫画大全_工口漫画\t346608\n救援员\t346609\n日与夜\t346610\n字坊\t346611\n舍瓦\t346612\nNOx\t346613\n壬午\t346614\n湘西自治州\t346615\n边栏\t346616\n桌号\t346617\n解调器\t346618\n私密照\t346619\n墨迹天
气\t346620\n周碧华\t346621\n王姬\t346622\n爱爱电影网\t346623\n姐儿\t346624\n杭州注册公司\t346625\n歌玉\t346626\n丧父\t346627\n吕燕\t346628\n重庆出版社\t346629\n安徽省安全生产监督管理局\t346630\n平津战役纪念馆\t346631\n瑞纳\t346632\n91360智慧病理网\t346633\n林健\t346634\nANSI编码\t346635\n3515\t346636\n钣金\t346637\n失婚\t346638\n标榜\t346639\n264号\t346640\n円\t346641\n零件柜\t346642\n壮美\t346643\n驼鸟\t346644\nUnifying\t346645\n派系\t346646\n书名\t346647\n14篇\t346648\nCUDNN\t346649\n手相学\t346650\n医师节\t346651\nclothing\t346652\n去去\t346653\n水嶋\t346654\n仁恒河\t346655\n纪律\t346656\n解放军报\t346657\n淮左\t346658\n沈阳市规划和国土资源局\t346659\n拳皇97风云再起\t346660\n拉布拉多犬\t346661\n三星N7100\t346662\n海外部\t346663\n宋希斌\t346664\n早买\t346665\n就是铁甲\t346666\ncfe\t346667\n5.2.5\t346668\n灵媒\t346669\n宜华木业\t346670\nthenorthface\t346671\n那里\t346672\n信风\t346673\n美文居\t346674\n夫妻\t346675\n幸运星\t346676\n前南\t346677\n淇滨区\t346678\n长治郊区\t346679\n6dm\t346680\nMacBookAir\t346681\n薄熙\t346682\n兵站\t346683\n水群\t346684\n世界人权宣言\t346685\nFlop\t346686\n妻子的秘密\t346687\n拦截者\t346688\n零钱通\t346689\n证券行业\t346690\n羊肉片\t346691\n免激\t346692\n金士顿4GB\t346693\n4g无线路由器\t346694\n南海中学\t346695\nelif\t346696\n轻轻\t346697\n写生\t346698\n涉世\t346699\nAVIS安維斯\t346700\n绥远\t346701\n疑凶\t346702\n5421\t346703\n叶炜\t346704\n韵乐\t346705\n肛肠科\t346706\n少儿险\t346707\n生长期\t346708\n到顶\t346709\n望春\t346710\nvar\t346711\n方中信\t346712\n防火罩\t346713\ngflags\t346714\n中山证券有限责任公司\t346715\ntus\t346716\noanda\t346717\n大航海家3\t346718\nelmentui\t346719\n董香\t346720\n中华世纪城\t346721\n加水\t346722\n校企合作协议\t346723\n鸡尾酒\t346724\n厦门网\t346725\n明阁\t346726\n推荐稿\t346727\n天硕\t346728\nu19\t346729\n双源\t346730\n戒酒\t346731\n伽利略\t346732\n死亡赔偿金\t346733\n媒体村\t346734\n六魔女\t346735\n安信可科技\t346736\n别着急\t346737\n控制权\t346738\nsettled\t346739\n艺体术\t346740\n坤彩科技\t346741\ncharger\t346742\n粉煤\t346743\n童语\t346744\n融创物业\t346745\n冰玉\t346746\n第四类\t346747\nride\t346748\n天童\t346749\n东城区教委\t346750\nCODEVS\t346751\n灾祸\t346752\n老有所为\t346753\n盐酸西替利嗪\t346754\n你就不要想起我\t346755\n8818\t346756\n荣耀7C\t346757\n犀鸟\t346758\n军人使命\t346759\n皇甫嵩\t346760\n米拉娜\t346761\nYSOcean\t346762\n猎杀\t346763\n不辱使命\t346764\nups
不间断电源\t346765\n归位\t346766\n日光灯\t346767\n定容\t346768\n=2\t346769\n2016年05月\t346770\n画色\t346771\nwox\t346772\n肾动脉\t346773\nOLP\t346774\n独栋\t346775\n闻喜路\t346776\n木地\t346777\n线串\t346778\nmain.js\t346779\n湿淋淋\t346780\nconsistent\t346781\n角部\t346782\n樱桃红\t346783\n持枪\t346784\nGARMIN\t346785\n北京美国大使馆\t346786\n色码\t346787\n梦网集团\t346788\n订房\t346789\n夏沐\t346790\n有多难\t346791\nC2H4\t346792\n德古拉城堡\t346793\n新寻秦记\t346794\n子机\t346795\njava过滤器\t346796\n菏泽环保局\t346797\n基础教育改革\t346798\nKDS\t346799\nAmazo\t346800\n柔力\t346801\n卡星\t346802\n5000块\t346803\n新绝代双骄前传\t346804\n赵主任\t346805\nWARN\t346806\nPMT\t346807\n红杉资本中国基金\t346808\n南宫婉\t346809\navaliable\t346810\n湘潭市雨湖区政府\t346811\n凤鸣路\t346812\n谯城区人民政府\t346813\n0W20\t346814\n后牙\t346815\noverwatch\t346816\n附设\t346817\n大伟哥\t346818\n政风\t346819\nMX4\t346820\ngravel\t346821\n朱凯\t346822\n南浦路\t346823\n欺世\t346824\n河南经济报网\t346825\ndummies\t346826\n高度近视吧\t346827\n冲击系数\t346828\n真性情\t346829\n交割\t346830\nKylie\t346831\nholo\t346832\n广州正骨医院\t346833\nDBLink\t346834\n摩尔庄园\t346835\n香茅\t346836\n徐广国\t346837\n情侣\t346838\n钬\t346839\n特长班\t346840\n2017五一套\t346841\n喜茶\t346842\n教头\t346843\n清真寺\t346844\nGeorges\t346845\n高雨\t346846\n三国群英传争霸\t346847\nShack\t346848\n博大\t346849\nnotch\t346850\n川师大\t346851\ncounterpart\t346852\n顾铮\t346853\n宿迁市中级人民法院\t346854\n株洲火车站\t346855\n一滴眼泪\t346856\n上一周\t346857\n一支\t346858\n黄旭\t346859\n复旦大学管理学院国际认证领先商学院\t346860\n3DM原创种/网盘分流\t346861\n来我不老\t346862\n药物性\t346863\n票源\t346864\n江苏省南通中学\t346865\n2018-4-19\t346866\n陈情\t346867\n光疗\t346868\nHc3\t346869\nnep\t346870\n汇乐\t346871\n20170519\t346872\nfreelance\t346873\n卫辉\t346874\n热河\t346875\n凡世帝国\t346876\n方向柱\t346877\n0xc00000e9\t346878\n733\t346879\n四川南部县\t346880\n豪雅\t346881\n色空阁_俺去也\t346882\n隔三差五\t346883\n灯亮\t346884\nChicago\t346885\n长沙高铁南站\t346886\n醇香\t346887\n流延\t346888\ntmf\t346889\n2万条\t346890\n桑兰\t346891\n约战\t346892\n百度游戏\t346893\n80年代\t346894\n群租\t346895\n华媒控股\t346896\n朱锋\t346897\nIC信用卡\t346898\n李志斌\t346899\n95559\t346900\n男妻\t346901\n图虫网\t346902\n常府街\t346903\n拉德芳斯\t346904\n牛彩\t346905\nz17minis\t
346906\n课外辅导\t346907\n藏民\t346908\n翟永明\t346909\n方兴大道\t346910\n线长\t346911\n盛丰建材网\t346912\ntoons\t346913\nfd01\t346914\n余甘子\t346915\n严经\t346916\nPHASE\t346917\n匣钵\t346918\n收款循环\t346919\n回锅\t346920\n三十三天\t346921\n熊岳镇\t346922\n住建厅\t346923\n威震天\t346924\n后测\t346925\n衣冠楚楚\t346926\n双c\t346927\n9.0\t346928\n九星连珠\t346929\n公孙离\t346930\n正三棱柱\t346931\nexecuteUpdate\t346932\n管理委员\t346933\nhawq\t346934\n车托\t346935\n诗巴丹\t346936\n28日\t346937\n全无\t346938\n庭前\t346939\n模拟\t346940\nExtJS4\t346941\nKAPPA\t346942\nbld\t346943\n巴布洛\t346944\n花腰\t346945\n退婚\t346946\n返魂\t346947\n我的爱情\t346948\nsqlitestudio\t346949\n刀面\t346950\nturbo\t346951\n阿耶檀\t346952\nferro\t346953\n空篮小说网\t346954\n中控室\t346955\n笔洗\t346956\n光刻机\t346957\nBoosted\t346958\n信息业\t346959\nRabi-Ribi\t346960\n阿荣\t346961\n苏溪镇\t346962\n29元\t346963\n音乐时间\t346964\n古惑仔全集\t346965\n还没有\t346966\n二十一章\t346967\n梅丽尔·斯特里普\t346968\n丰和\t346969\n第二十六\t346970\n第SC02版\t346971\n阿灵顿\t346972\n全尺寸\t346973\nsync\t346974\nFuzzing\t346975\nPTH\t346976\n好想要\t346977\n小什\t346978\nSandals\t346979\n点了赞\t346980\n渑池\t346981\nCAD编辑器\t346982\n图集\t346983\n折\t346984\n得州\t346985\n洪江区\t346986\n水晶男孩\t346987\nparen\t346988\n有声小说网\t346989\n复壮\t346990\n孟夏\t346991\n锚板\t346992\n1.01\t346993\nOEM9\t346994\n南京市民政局\t346995\n腾讯优图\t346996\n药代\t346997\nLeggings\t346998\n一句话\t346999\n瓶标\t347000\n惠州市第三人民医院\t347001\n窦文维塔斯\t347002\n山东省住房和城乡建设厅\t347003\n简体\t347004\n2.00M_h.264\t347005\n江西公务员考试网\t347006\nfortnite\t347007\n证券之星\t347008\nkeyframes\t347009\ndb9\t347010\n边界值分析法\t347011\n话务量\t347012\n郭庄镇\t347013\n断刀\t347014\nubunt\t347015\n卓创\t347016\n北部湾旅\t347017\n打吐\t347018\n莫比乌斯环\t347019\n秀屿区\t347020\n遂宁16路\t347021\n初集\t347022\n壳公司\t347023\n消沉\t347024\n抗震性\t347025\n押尾\t347026\n武校\t347027\nSikuli\t347028\n月海\t347029\n飞猪酒店\t347030\n田里\t347031\nbarbarbar\t347032\n盛极而衰\t347033\nimplemented\t347034\n贵阳大数据交易所\t347035\n广东代表\t347036\nbackspace\t347037\n基汇\t347038\n6.2米\t347039\nRosetta\t347040\n920m\t347041\n李雷雷\t347042\n海之言\t347043\n估分\t347044\n天津小区\t347045\n陈家村\t347046\n路得\t347047\n轰炸机\t347048\n1
9G\t347049\n微信点餐\t347050\n汾河公园\t347051\n小流氓\t347052\n永平\t347053\n注意项\t347054\n国民银行\t347055\n随县\t347056\n一站幸福\t347057\n焦糖色\t347058\n皮蛋豆腐\t347059\nRN\t347060\n电木铣\t347061\n0.5_\t347062\n多一种\t347063\n中国书协\t347064\nESMO\t347065\nv2.7.1\t347066\n文件头\t347067\n快歌\t347068\n多年生\t347069\n采光板\t347070\n美发店\t347071\nrobocraft\t347072\n王亚丽\t347073\n寇\t347074\n坏家伙\t347075\ntrim函数\t347076\nsubnautica\t347077\n5月30日\t347078\n表意\t347079\n玩具兵\t347080\n土匪\t347081\n指示秤\t347082\n王爱军\t347083\n社会主义学院\t347084\n萨尔曼·汗\t347085\n光明乳业股份有限公司\t347086\n人章\t347087\n倍呵护\t347088\n304号\t347089\n追尾\t347090\n速8酒店\t347091\n看今朝\t347092\nContr\t347093\nAxure8.0\t347094\n雷敏\t347095\nmillionaire\t347096\naimbooster\t347097\n劳动监察大队\t347098\n子木\t347099\n银监办\t347100\nOneinStack\t347101\n特事\t347102\n仙桃市\t347103\n滚塑\t347104\nGedit\t347105\n椎基底动脉\t347106\n市国家税务局\t347107\n6间\t347108\n竹纤维\t347109\n血灵\t347110\n连坐\t347111\n云锁\t347112\n10吨\t347113\n逆流交易\t347114\nLED贴片硅胶\t347115\n下拉式\t347116\n水仙花\t347117\n佳能mg2580\t347118\n疯人\t347119\n潘冬子\t347120\n艾肯ICON\t347121\n焊剂\t347122\nfair\t347123\n21个\t347124\n23456789\t347125\n山东如意\t347126\n网上应用店\t347127\n恙\t347128\n无常\t347129\n青岛市地税局\t347130\n鼓\t347131\nInvasion\t347132\n禄口\t347133\n炉甘石洗剂\t347134\n北汽威旺\t347135\nrend\t347136\n武极\t347137\n雷斯\t347138\n百恒\t347139\n鉴赏室\t347140\n呃\t347141\n创逾\t347142\nsym\t347143\n主导权\t347144\nLOK\t347145\nphillips\t347146\nPVDF\t347147\n特別\t347148\n南港\t347149\n泌阳县\t347150\nCREO3.0\t347151\n主梁\t347152\n金管家\t347153\n爱信6AT\t347154\n38章\t347155\n乔瓦尼\t347156\n圣烟\t347157\nseng\t347158\nfanyi\t347159\n农林牧\t347160\n2016年5月1日后\t347161\n攻城略地\t347162\n慈禧秘密生活\t347163\n水果皮\t347164\n凰女\t347165\n奥洛菲\t347166\n南泥湾\t347167\n无量寿经\t347168\n甲硝唑注射液\t347169\nrayfire\t347170\n易优百\t347171\n工银安盛人寿\t347172\n橡胶轮\t347173\n空花\t347174\nxuesheng\t347175\n4月25日起\t347176\n布鲁氏菌\t347177\n交巡警\t347178\n藏尸案\t347179\nx9l\t347180\napp公司\t347181\n0578\t347182\nlighten\t347183\n催干剂\t347184\n双十一天\t347185\n空手道\t347186\n602所\t347187\n购票员\t347188\n北京宜家\t347189\n135Q\t347190\n壁上\t347191\n昌耀\t3471
92\n去年7月\t347193\n斯摩格小屋\t347194\nOdoo10\t347195\n厘清\t347196\n吊花\t347197\n陈芳\t347198\n荣盛花语城\t347199\n西域\t347200\n寡妇\t347201\n理念论\t347202\n木兰山\t347203\npenthouse\t347204\n剑指offer\t347205\n宋DM\t347206\n随机序列\t347207\n命运之刃\t347208\n奶油泡芙\t347209\n大了\t347210\nsetam\t347211\n冲上云霄\t347212\n华泰\t347213\n雕刻版\t347214\n52kk\t347215\nsp1\t347216\n汤珈\t347217\ndiffer\t347218\novercooked\t347219\n三国kill\t347220\n司空见惯\t347221\naula\t347222\n过目难忘\t347223\ng14\t347224\n北京工商银行\t347225\n兴文论坛\t347226\n初审\t347227\nhumidity\t347228\n一个32岁\t347229\n结语\t347230\n百盛集团\t347231\n入侵者\t347232\n梦幻互通版\t347233\n金基范\t347234\nMP4/MP3\t347235\npanton\t347236\n5瓦\t347237\n新时代党\t347238\n树莓派3B\t347239\n发财致富\t347240\n2ch\t347241\n余斌\t347242\nconsolas\t347243\n玻璃柜\t347244\n文牍\t347245\n统计类\t347246\n金立M2017\t347247\n涤纶长丝\t347248\nJuvenile\t347249\n攻党\t347250\n仙鹤股份\t347251\n马娜\t347252\n投行先锋\t347253\n不像\t347254\n股池\t347255\ninstruct\t347256\n花香\t347257\n蝴蝶结\t347258\nMAIL\t347259\n弥勒市人民政府\t347260\n38块\t347261\n大鹏新区政府\t347262\ndenmark\t347263\n脾胃病科\t347264\n难本\t347265\nnightly\t347266\n蓝牌\t347267\n20140509\t347268\n安徽省住建厅\t347269\n家居装修资讯网\t347270\n签证处\t347271\n沿用\t347272\n朱和平\t347273\n新派\t347274\n九库\t347275\n瓦尔\t347276\n洋房\t347277\narial\t347278\n鼠来宝\t347279\n湖北省卫生和计划生育委员会\t347280\njqueryschool\t347281\n第10行\t347282\n逐客\t347283\n25G\t347284\nc++编译器\t347285\n好骚\t347286\n新七侠五义\t347287\n恭贺新禧\t347288\n酱板鸭\t347289\nrevman\t347290\n阴精\t347291\n德丰利达\t347292\n官职\t347293\n圏\t347294\n九篇\t347295\n行有\t347296\nJunn9527\t347297\nPVR\t347298\n9月27日\t347299\n中洲中央公园\t347300\n米尔\t347301\n双带\t347302\n173号段\t347303\n阴差阳错\t347304\n企业管理学\t347305\nMD5加密\t347306\n全唐文\t347307\n壬水\t347308\n噪声级\t347309\n海贝r6\t347310\n黄眉\t347311\n不败战神\t347312\n博龙\t347313\nalex\t347314\n东京喰种第三季\t347315\nwuliu\t347316\n趁火打劫\t347317\n宏碁\t347318\n600m\t347319\n第三十一期\t347320\n王允\t347321\n聊客\t347322\n痛爱\t347323\n友唱\t347324\n福特Mustang\t347325\n3888\t347326\n短节\t347327\nOFF\t347328\n古田二路\t347329\nNotNull\t347330\n拜城\t347331\n敝帚自珍\t347332\n套手\t347333\nbuch\t347334\n渗出\
t347335\nGFW\t347336\n神烦\t347337\n打地鼠\t347338\n文化厅\t347339\n朱师傅\t347340\n昂达社区\t347341\n中山市城乡规划局\t347342\n空战\t347343\n中国通信标准化协会\t347344\n广东外语外贸大学新闻中心\t347345\n优家\t347346\nR20\t347347\nelmo\t347348\n大学生就业创业网\t347349\n咫尺天涯\t347350\n作尘\t347351\ngperf\t347352\n张英\t347353\n风姿\t347354\n采油队\t347355\n阿猫\t347356\n铜锣烧\t347357\n新农哥\t347358\n我的青春恋爱物语\t347359\n12小时\t347360\n追凌\t347361\n保宝网\t347362\n_易安居星座网\t347363\n软磷脂\t347364\n假酒\t347365\n直挂云帆济沧海\t347366\nBonnie\t347367\n空空如也\t347368\n龙婷\t347369\n鹭飞\t347370\n情绪化\t347371\n白夜机器人大战\t347372\n大中华区\t347373\n避雷\t347374\n叉臂式独立悬架\t347375\n聪明的一休\t347376\nap吧\t347377\n女斗\t347378\n热血江湖刀客\t347379\n顾景舟\t347380\nRK3128\t347381\n戴维\t347382\n樘\t347383\n无线\t347384\n三国志11pk\t347385\n慈菇\t347386\n推进者\t347387\n杨义\t347388\n9W\t347389\n蔡崇信\t347390\n设计院\t347391\n坏哥哥\t347392\nwindows-store\t347393\n韩雨芹\t347394\n69.com\t347395\n河南省人民政府\t347396\n修武\t347397\n安徽省纪委\t347398\n顺丰速运\t347399\n金星村\t347400\nwin7codecs\t347401\nlamborghini\t347402\n帝豪大主宰\t347403\n寒刃\t347404\n卡里\t347405\n主粮\t347406\n项目管理学\t347407\nibackupbot\t347408\n何来\t347409\nanglarjs\t347410\n近邻\t347411\nPhilip\t347412\n芝华仕\t347413\n异界斩龙\t347414\n2418\t347415\n机锋\t347416\n扁平化\t347417\n上海证券综合指数\t347418\n偏心轴承\t347419\n五十路\t347420\n易播网\t347421\n阿丁\t347422\n南康中学\t347423\nFr\t347424\nLTR\t347425\n乌瑟尔\t347426\n喷射泵\t347427\n流油\t347428\n宾夕法尼亚州立大学\t347429\nim\t347430\n离子\t347431\n醉汉\t347432\nnetcdf\t347433\n1412\t347434\n丰台万达广场\t347435\n基腐\t347436\ninventor2017\t347437\n玫瑰节\t347438\n耳环\t347439\n暗月世界吧\t347440\nmad\t347441\n基普乔格\t347442\n0210\t347443\n换手\t347444\n51avi\t347445\n梦断\t347446\n施耐德电气(中国)有限公司\t347447\n竹韵\t347448\n风眼\t347449\n_文明5吧_\t347450\n平均价\t347451\nmovefree\t347452\n乳腺增生结节\t347453\nsterile\t347454\n曹总\t347455\n飞机文\t347456\n素材牛\t347457\n膨松剂\t347458\nBoiler\t347459\n永宁镇\t347460\n陶陶居\t347461\n库伦旗\t347462\n蒙古马\t347463\n500周年\t347464\nMuji\t347465\n云南白药集团\t347466\nOSGeo\t347467\narmbian\t347468\n除沫器\t347469\n44岁\t347470\n存盘\t347471\n1.0.2\t347472\ntracepro\t347473\n避祸\t347474\nwwe2016\t347475\n颅底
骨折\t347476\n芒\t347477\nys\t347478\n行政区域\t347479\n一潭\t347480\nR430\t347481\n带姓\t347482\n街道人大工委\t347483\nCurated\t347484\n昏嫁\t347485\n七十九\t347486\nExporters\t347487\n桂花酒\t347488\n套利策略\t347489\n2018/1/6\t347490\n统计学院\t347491\n皮小秀\t347492\n神煌\t347493\nEpisodes\t347494\n白银td\t347495\n翠微居小说网\t347496\n韩雪峰\t347497\nwin10版本号\t347498\n娘家\t347499\nsetattribute\t347500\n7.61\t347501\n400000\t347502\nkitchen\t347503\ntin\t347504\n趋近\t347505\nmicu\t347506\n英雄无敌5\t347507\n资讯堂\t347508\n小米7\t347509\n快传\t347510\n砭石\t347511\nelected\t347512\n51nod\t347513\nENZYME\t347514\n暴劫梨花\t347515\n4月8号\t347516\n菠萝头\t347517\n小学生日记\t347518\n专拍\t347519\n通勤装\t347520\n伸手\t347521\n仲裁委\t347522\n0371\t347523\nfatjar\t347524\n公产房\t347525\n250m\t347526\nbanquet\t347527\n旅拍\t347528\n大神F2\t347529\nFX5U\t347530\n进网\t347531\n89178\t347532\n市监委\t347533\n粉们\t347534\n四环医药\t347535\n水果花\t347536\n银河湾\t347537\n肇事\t347538\n雷神3\t347539\ncvs\t347540\n七章\t347541\n集训班\t347542\ndigimon\t347543\n化工会\t347544\n长景\t347545\n海航集团有限公司\t347546\nphotographer\t347547\n谷神星\t347548\nsimc\t347549\n我告诉你\t347550\n琉璃\t347551\n长字\t347552\n废业\t347553\n唱片机\t347554\n银翼\t347555\n妙方\t347556\n節\t347557\n刚度矩阵\t347558\n双桥街道\t347559\n鸿茅药酒案\t347560\n军需\t347561\n格兰头\t347562\nohh\t347563\n后撤\t347564\nRei\t347565\n2倍\t347566\n重庆市科委\t347567\nAu\t347568\n安利股份\t347569\n莱美药业\t347570\n推上\t347571\n第39章\t347572\nDale\t347573\nMSU\t347574\n8.19\t347575\n中铁十四局集团有限公司\t347576\n1n4007\t347577\n姜文\t347578\nwand\t347579\n纺织机\t347580\nAbu\t347581\nCTRL+\t347582\nssh2\t347583\nSCAP\t347584\n符皇诸界末日在线\t347585\nhoi4\t347586\n照妖镜\t347587\n黄山\t347588\n芙蕾\t347589\n创宇\t347590\n李帝勋\t347591\n赵雍\t347592\nSurgical\t347593\n苏皖\t347594\n麻将机\t347595\n欢乐节\t347596\n佛山西\t347597\n22G\t347598\n市区\t347599\n虎虎\t347600\n10亿元\t347601\n融信集团\t347602\n银草\t347603\n批改\t347604\n指针函数\t347605\n自习室\t347606\n毛毛虫变蝴蝶\t347607\n宁波舟山港\t347608\n浅月\t347609\n颜表情\t347610\n分率\t347611\n椎名光\t347612\n铸石\t347613\n非静态变量\t347614\n流浪记\t347615\n设计器\t347616\n阿寒湖\t347617\n法海寺\t347618\n无情\t347619\n大青叶\t347620\n疗护\t347621
\n电磁吸盘\t347622\n亚太日报\t347623\n花谷\t347624\n东方红小学\t347625\n延长县\t347626\n道乐\t347627\nDAP\t347628\n百度硬盘\t347629\n600797\t347630\n899电脑网\t347631\n凌空SOHO\t347632\n想开\t347633\n灵峰\t347634\n東京\t347635\n慧远\t347636\n芬兰航空\t347637\n房地产评估师\t347638\nspring-mvc\t347639\n宋襄公\t347640\n查明\t347641\n超声心动图\t347642\n头耳机\t347643\n合山市政府\t347644\n广东扶贫信息网\t347645\n内建\t347646\nlst\t347647\n季候\t347648\n刘小丽\t347649\n苏立生\t347650\n朱少民\t347651\n谱谱\t347652\n集中型\t347653\n胡彬\t347654\n笑星\t347655\n雅居乐御宾府\t347656\nReuse\t347657\n撬装\t347658\nTRC\t347659\n21cm\t347660\nCater\t347661\n证明函\t347662\n王者荣耀钻石夺宝\t347663\nx370\t347664\n美好的时光\t347665\nplein\t347666\n宣州区\t347667\n杀神\t347668\n碧空\t347669\n理念\t347670\n火斧\t347671\n陈少华\t347672\n聚聚们\t347673\nCZ\t347674\n葛莱\t347675\n太平人寿保险\t347676\n尾速\t347677\n河北化工医药职业技术学院\t347678\n课堂作业本\t347679\n金代\t347680\n奥迪斯\t347681\n剑灵力士吧\t347682\n落花\t347683\n打不死人\t347684\n结转销售成本\t347685\n二九\t347686\n50.8\t347687\n四川医院\t347688\n鲁昂\t347689\n凝眸\t347690\n谐振器\t347691\n延年益寿\t347692\n小邋遢\t347693\n税官\t347694\n250天\t347695\n唐氏儿\t347696\n洗钱案\t347697\n艾雅\t347698\n嘉陵江大桥\t347699\n省教育厅办公室\t347700\n银保\t347701\n婚书\t347702\n散图\t347703\n克鲁鲁\t347704\n启赋奶粉\t347705\n天路\t347706\nOPL\t347707\ncheehong1229\t347708\nv4.1.2.1385\t347709\n一中民\t347710\n警灯\t347711\n副线\t347712\n忙乱\t347713\nMinecraft酷爱ZERO\t347714\n博皓\t347715\nArrangement\t347716\n华南大区\t347717\n焦黑\t347718\nWin8\t347719\n青城镇\t347720\n九眼\t347721\n马鹏\t347722\n卵泡期\t347723\n活色生香\t347724\n理英\t347725\nvsd\t347726\nAlter\t347727\n凉拌菜\t347728\n小橙子姐姐我的世界搞笑\t347729\n提租\t347730\nHoffman\t347731\n图解说\t347732\n怒气\t347733\n电油汀\t347734\n王海林\t347735\n丽舍\t347736\n扬州万达广场\t347737\n鉴黄\t347738\n穆谢奎\t347739\n京华时报\t347740\n火麒麟\t347741\n落尘\t347742\n南京市规划局\t347743\n陕西省地方电力(集团)有限公司\t347744\n河南华之瑞\t347745\n拉力绳\t347746\n汽车工业\t347747\nbroke\t347748\n4399i\t347749\n汽车4S店地址\t347750\n庐城镇\t347751\n华文\t347752\n一公斤\t347753\n德景\t347754\n保卫萝卜\t347755\n海港城\t347756\n刘武\t347757\n调查报告\t347758\n穆然\t347759\n28T\t347760\n脊椎病\t347761\n黑龙江科技学院\t347762\n一脉相承\t347763\n芝宝\t347764\n三江吧\t347765\n四必
\t347766\nchenoracle\t347767\n正态分布\t347768\n油改气\t347769\n厦门宏发电声股份有限公司\t347770\n暴雪蓝贴\t347771\n李奥纳多\t347772\n灌精\t347773\n布隆伯格\t347774\n白土镇\t347775\n烫字\t347776\n錧\t347777\n诽谤罪\t347778\nz460\t347779\n魔防\t347780\n乙炔\t347781\nFAMA\t347782\n广仁驾校\t347783\n林则徐\t347784\n女道士\t347785\n家电网\t347786\n搜索框\t347787\n三径\t347788\n国家京剧院\t347789\n垫块机\t347790\n新撰组异闻录\t347791\n爱秀时尚网\t347792\nppp\t347793\n2016年01月\t347794\nEndangered\t347795\n积分房\t347796\n烘缸\t347797\n深圳社保局\t347798\n5月6号\t347799\n多西环素\t347800\n可疑的美容院\t347801\n高顿论坛\t347802\n单引号转义\t347803\n闫辉\t347804\n剪刀机\t347805\n蓝鼎\t347806\n招拍挂\t347807\n婚生\t347808\n清户\t347809\n阻燃性\t347810\n2410\t347811\n难看\t347812\n玉屏风散\t347813\n夏耀\t347814\n地狱之旅\t347815\n自主性\t347816\n玩疯了\t347817\n15r3\t347818\n用品性\t347819\n叶锋\t347820\nCinema\t347821\n赞礼\t347822\n66666\t347823\n四川教育出版社\t347824\n灯体\t347825\nL4D2\t347826\n网易奥运\t347827\n肉便器\t347828\n徐公\t347829\n锦江路\t347830\n葱郁\t347831\n2014年1月\t347832\n青澳湾\t347833\n8000个\t347834\n集成类\t347835\n脐环\t347836\n白癜风\t347837\n猫口炎\t347838\n发机\t347839\n新不了情\t347840\n600325\t347841\n广东省卫生和计划生育委员会\t347842\n三队\t347843\n晒图机\t347844\n帝舵手表\t347845\nARRI\t347846\n天才召唤师\t347847\nSQL批量\t347848\n二十三项\t347849\nflee\t347850\nsources\t347851\n西进\t347852\n中国南方航空公司\t347853\n内卫\t347854\n擂台赛\t347855\n1839\t347856\n韩国冬奥会\t347857\n混乱度\t347858\n廊坊市环境保护局\t347859\n破解码\t347860\n40多岁\t347861\n贝达药业\t347862\n6363\t347863\n甲乙类\t347864\n得法\t347865\n驱动版\t347866\ntxxx\t347867\n18厘米\t347868\n阳离子交换树脂\t347869\n没办\t347870\nkarila\t347871\n企服宝\t347872\n91kk\t347873\n非常棒\t347874\n清包\t347875\ncolum\t347876\n谢恩公\t347877\n中国密码学会\t347878\n心轴\t347879\n仙侠文\t347880\n异口\t347881\nx3100\t347882\n16p\t347883\n盐池\t347884\n野村\t347885\n高密度\t347886\nk-means聚类算法\t347887\n搓澡工\t347888\ndorm\t347889\n企事业\t347890\nCaigle\t347891\nga110\t347892\n真三国无双8\t347893\n五分之二\t347894\n五元\t347895\n75期\t347896\n潜油泵\t347897\n心血\t347898\nJAVLibrary\t347899\nSendMessage\t347900\n雷天\t347901\ndbj\t347902\n万兴\t347903\n硝酸咪康唑乳膏\t347904\n斐尔\t347905\n白堆子\t347906\nAxela昂克赛拉论坛_汽车之家论坛\t347907\n赎金\t34790
8\n固收\t347909\nmichael翔\t347910\n团支部\t347911\nservice类\t347912\n绿洲\t347913\n年轻气盛\t347914\n闸北公园\t347915\n第51期\t347916\n十亿元\t347917\n金虹桥国际中心\t347918\nMCS\t347919\n卢曼\t347920\n测试篇\t347921\n衢江\t347922\n金吉列\t347923\n品众\t347924\n北京青年报\t347925\n罚站\t347926\n137家\t347927\n导入期\t347928\n7199\t347929\n延夏\t347930\n安卓输入法\t347931\n华声在线\t347932\n建设北路\t347933\ncurie\t347934\n苏宁互联\t347935\n菲丽丝工作室\t347936\nConsistency\t347937\n1988年\t347938\n坠崖\t347939\n加减乘除\t347940\n读卡器\t347941\n烈性摔跤\t347942\nf盘\t347943\n湖北省国税局\t347944\n邱比\t347945\nFinally\t347946\n子刊\t347947\n周卓\t347948\n南庄村\t347949\n愤怒\t347950\n龙掌\t347951\n白鲸\t347952\n速热式电热水器\t347953\n管委\t347954\n陶澍\t347955\n热血尖兵\t347956\n棉芯\t347957\n高歌猛进\t347958\n胃疼联盟吧\t347959\n护肝片\t347960\n词汇书\t347961\n踢脚\t347962\n仰望星空\t347963\n机变英盟\t347964\n航城街道\t347965\n7.0堕夜\t347966\n水灰比\t347967\n大将\t347968\n暖气炉\t347969\n猷\t347970\n外汇经纪人\t347971\n山梨酸\t347972\n短信群发器\t347973\n西乌旗\t347974\nORICON\t347975\nloleina\t347976\n717电影网\t347977\n破土动工\t347978\n自命不凡\t347979\n淹城野生动物园\t347980\n写人\t347981\n四风\t347982\n三国群英传7\t347983\n白盒\t347984\nGCL2008\t347985\n鲁提辖\t347986\nshandong\t347987\n真高兴\t347988\n╰\t347989\n双拍\t347990\n昆明市西山区人民政府\t347991\n渡劫\t347992\n请回\t347993\n体脂率\t347994\n最轻\t347995\n步步为\t347996\nqq至尊宝\t347997\nu盾\t347998\n威斯康星大学\t347999\n呈贡大学城\t348000\n修复剂\t348001\n3dmax2016\t348002\ncrops\t348003\n多连杆式\t348004\nGnuPG\t348005\nyaoh\t348006\n购\t348007\n11.0.0\t348008\n得慌\t348009\n凿岩机\t348010\nSIGN\t348011\n幻想神域\t348012\n3色\t348013\n蜂窝织炎\t348014\nhome\t348015\n250毫升\t348016\nAmbassador\t348017\n连连\t348018\n擂\t348019\n后宫\t348020\nipadmini\t348021\n可隆\t348022\n大王镇\t348023\n竹溪村\t348024\n纳税义务人\t348025\n长尾夹\t348026\n东滩\t348027\n本网站\t348028\n美通社\t348029\n复星地产\t348030\n中央国债登记结算有限责任公司\t348031\n温州市市\t348032\n争吵\t348033\nElise\t348034\n超载\t348035\n世界知识\t348036\n不欲\t348037\n加德\t348038\nxoy\t348039\n六祖坛\t348040\n书店\t348041\nleech\t348042\n立雪\t348043\n子梅\t348044\n小泽玛丽亚\t348045\n通信工程\t348046\npydot\t348047\n换挡\t348048\n春花\t348049\n班戈\t348050\n变限\t348051\n一闪一闪\t348052\n枯藤老树昏鸦\t34
8053\n真宫梨沙子\t348054\n疫\t348055\n零下一度\t348056\n五郎八卦棍\t348057\n龙泊湾\t348058\n8000亩\t348059\n一菲聪天\t348060\n餐厅\t348061\n8月2日\t348062\n大河村\t348063\n福州航空\t348064\nolufsen\t348065\n36位\t348066\n花椒芽\t348067\n华为\t348068\n高洁\t348069\n2596\t348070\n镇雄\t348071\n中国出版集团\t348072\naddon\t348073\n数不清\t348074\nprerequisite\t348075\n亚隆\t348076\n环宇城\t348077\n易佰\t348078\n心上人\t348079\n栈底\t348080\n电热板\t348081\nhi\t348082\n瓦尔德\t348083\n1.76秒\t348084\n温州路\t348085\n欧飞\t348086\n神途开服表\t348087\n地体\t348088\n修真诀\t348089\n黄段子\t348090\n我的手\t348091\n汉德森\t348092\n吉行天下车友俱乐部\t348093\n回建\t348094\n逐份\t348095\nMacbook\t348096\n菲奥娜\t348097\n鞘翅\t348098\n直运\t348099\n铜陵市公安局\t348100\n虚拟按键\t348101\n彩虹色\t348102\n合肥物流公司\t348103\n人人树\t348104\n卓玛泉\t348105\n榆叶梅\t348106\n祖业\t348107\n舌系带\t348108\n德国国会大厦\t348109\n佰腾\t348110\n95秀\t348111\n期货业协会\t348112\n四议\t348113\n山水人家\t348114\n128部\t348115\n配乐\t348116\n姐妹篇\t348117\ncalifornia\t348118\n天命之子\t348119\n天丛云\t348120\n新浪贵州_新浪网\t348121\neternium\t348122\n恋着\t348123\n自以为\t348124\n音痴\t348125\n推理的女王2\t348126\ncontact\t348127\ntimberland\t348128\n四连\t348129\nDartmouth\t348130\ncv2.imwrite\t348131\n冷锻\t348132\nContra\t348133\n十堰火车站\t348134\nDX200\t348135\ngtt\t348136\n数组型\t348137\n知耻而后勇\t348138\n老婆\t348139\n天龙星\t348140\n石片\t348141\n制冷片\t348142\n食品卫生许可证\t348143\n扩散性&乖离性MA\t348144\nclo\t348145\n愠\t348146\n精灵梦叶罗丽\t348147\n彩虹人\t348148\n免收\t348149\n华信集团\t348150\nEP3\t348151\n被抢注\t348152\n补链\t348153\nMultiDex\t348154\n据点\t348155\n2k14吧\t348156\n名下\t348157\n原菌\t348158\n维泰\t348159\n百度H5\t348160\n山东建筑大学\t348161\nBastard\t348162\n李辉才\t348163\n北京广告公司\t348164\n南京大学外国语学院\t348165\n金炜\t348166\n王宝宝\t348167\n市委会\t348168\n必康\t348169\n海绵城市规划\t348170\napplet\t348171\n裸尸\t348172\n盐源\t348173\n桑拿女\t348174\nCQI\t348175\nDoujins\t348176\n院校库\t348177\n三级公路\t348178\n海雕\t348179\n地量\t348180\n正局级\t348181\n首套房\t348182\nOTA\t348183\n国通\t348184\n草粉\t348185\nissuer\t348186\nautossh\t348187\n梦幻城\t348188\n展评\t348189\n热干\t348190\nAAAS\t348191\n满庭芳\t348192\n套筒补偿器\t348193\n乳酸菌粉\t348194\n歉意\t348195\n紧急\t348196\n猫耳朵\t3481
97\n唐爹\t348198\n鬼雄\t348199\n尾渣\t348200\n吞没\t348201\n池州市政府网\t348202\nDS6\t348203\n好猎头网\t348204\n3X3篮球黄金联赛\t348205\n驱蚊草\t348206\n大拿网\t348207\n邓先生\t348208\n郑二\t348209\ngtx970m\t348210\n库尔贝\t348211\n武林外传综\t348212\n福建联通\t348213\n可湿性粉剂\t348214\n桑珠孜区\t348215\nEXECL\t348216\n4a广告公司\t348217\nsatellites\t348218\n片寄凉太\t348219\n10多个\t348220\n两户\t348221\n分类变量\t348222\n乌有之乡网刊\t348223\n人物简笔画\t348224\n6系\t348225\n心知肚明\t348226\n明年1月1日起\t348227\n第24集\t348228\n顺城大街\t348229\n德高望重\t348230\n崛\t348231\n底滤\t348232\n无缝墙布\t348233\nAsuka\t348234\n恶梦\t348235\n半价车\t348236\n关键处\t348237\nrely\t348238\n373\t348239\n8.75\t348240\n破音\t348241\n误译\t348242\n火爆农资招商网\t348243\n吸收池\t348244\n底滤鱼缸\t348245\n刮刀\t348246\n46名\t348247\n李先生\t348248\nfeelings\t348249\n土味\t348250\n伤人\t348251\nHeavens\t348252\nPeugeot\t348253\n房思琪的初恋乐园\t348254\n菲安妮\t348255\nstorage/emulated/0/\t348256\n炉体\t348257\n800多年\t348258\n振捣棒\t348259\n反式脂肪酸\t348260\nlj2400\t348261\n20L\t348262\n卫生级\t348263\n开建\t348264\n40年前\t348265\nOpenvSwitch\t348266\n河南红星机器\t348267\n板儿\t348268\n浆细胞性乳腺炎\t348269\n西安医学院\t348270\n第七天\t348271\nS7/ed\t348272\nLCP\t348273\nposts\t348274\n李晓辉\t348275\n无关痛痒\t348276\n畅行\t348277\n水疱\t348278\n小莺\t348279\n模值\t348280\n法斗犬\t348281\n直销博客网\t348282\n克里斯朵夫\t348283\ncindy\t348284\n3亿\t348285\nnco\t348286\n竹篓\t348287\n04.04\t348288\n技术性贸易壁垒\t348289\n心血管科\t348290\nalgorithm\t348291\n油位\t348292\n西藏中路\t348293\n188\t348294\n连音\t348295\n集束\t348296\n一水间\t348297\nSW-290\t348298\n连锁业\t348299\n收单\t348300\nwhere\t348301\n180w\t348302\n韩城矿务局\t348303\n海景\t348304\nSaaS级\t348305\n换地\t348306\n陈鹏飞\t348307\nNil\t348308\n5秒\t348309\n启建\t348310\n香附\t348311\nVBA函数\t348312\n桓仁满族自治县\t348313\n王骁\t348314\n黄酮\t348315\n无锡农商行\t348316\n二手房买卖信\t348317\n知呱呱\t348318\n鼎美\t348319\nLasting\t348320\n无损音乐格式\t348321\n3整除\t348322\n触龙神\t348323\n工业级\t348324\nNote5\t348325\n韵书\t348326\nstuffing\t348327\n无尽藏\t348328\n永嘉远大贸易有限公司\t348329\n积金\t348330\n大教\t348331\n厂界\t348332\nlts\t348333\ndominant\t348334\nret\t348335\n王永利\t348336\n遨游加速器\t348337\nALK\t348338\n模仿\t348339\nb
onded\t348340\n气狗\t348341\n谋求\t348342\n女香\t348343\n阿仪\t348344\n800D\t348345\n慈济\t348346\n对其\t348347\n可见光\t348348\nDOClever\t348349\n南国都市报数字报\t348350\n乡曲\t348351\n水竹\t348352\n放开我北鼻\t348353\n受死版\t348354\n缩肛\t348355\n近红外光谱仪\t348356\n村委\t348357\n不饱和聚酯树脂\t348358\nbeaver\t348359\nALT+TAB\t348360\nPUMP\t348361\n带饭\t348362\n盛世医妃\t348363\n9860\t348364\n泥浆泵\t348365\n解析度\t348366\n聚醚砜\t348367\n华润地产\t348368\n赵珈婧云\t348369\nmrs\t348370\n四川文理学院\t348371\n无线电脑\t348372\nトル\t348373\n招商银行南京分行\t348374\n城头山\t348375\ngeoda\t348376\n谷氨酰基转移酶\t348377\ny4\t348378\nmarked\t348379\nInvestigator\t348380\nTISSOT\t348381\n妈咪爱益生菌\t348382\n酸化剂\t348383\n准点率\t348384\n出生人口性别比\t348385\n纳兰\t348386\n冻货\t348387\nDaiwa\t348388\n竹园小区\t348389\n相见欢\t348390\n硬解\t348391\nr730\t348392\n好米\t348393\n颊\t348394\n迷迭香\t348395\nCó\t348396\n国家环保总局\t348397\n第2关\t348398\n百万次\t348399\n_希赛网\t348400\n_落叶网\t348401\n罗静\t348402\nLook\t348403\nsdw\t348404\n凯文凯利\t348405\n奥迪点道\t348406\n底灰\t348407\n无人潜航器\t348408\n若晴\t348409\n身寸\t348410\n血刃\t348411\n奥斯卡颁奖礼\t348412\n瓦岗山\t348413\n国航股份\t348414\n二牛\t348415\nwxmang\t348416\n安徽省通信管理局\t348417\n兵娃娃\t348418\n新街村\t348419\n科克\t348420\n维语\t348421\n李国魏千翔\t348422\n美好\t348423\n发型女\t348424\n白塔区\t348425\n维京人\t348426\n红迷\t348427\ncodepen\t348428\n个人住房按揭贷款\t348429\n陈小明\t348430\n小屁孩日记\t348431\n核准证\t348432\n绑钩器\t348433\n华企学院\t348434\nMVC5\t348435\ntu\t348436\n膝骨性关节炎\t348437\n清溪路\t348438\n边后卫\t348439\n冬凌草\t348440\nServer2012\t348441\n林欣彤\t348442\n外盘\t348443\n秘密武器\t348444\n胡集\t348445\nfibre\t348446\n信息系统项目管理师考试\t348447\n氧化还原电位\t348448\n武警交通部队\t348449\n三星level\t348450\nDenim\t348451\n北京鸭\t348452\n林业部\t348453\ngill\t348454\nJournal\t348455\n九鼎集团\t348456\n殳\t348457\n网上通缉犯\t348458\n武大靖\t348459\nAutomata\t348460\n0.4.3\t348461\nPhotoshopCC2018\t348462\n十六个\t348463\n艺意\t348464\n情为何物\t348465\n水批\t348466\n冰糕\t348467\nTempest\t348468\n小米游戏本\t348469\n物业费\t348470\n文综\t348471\n马德里机场\t348472\n此函\t348473\n两小天涯明月刀\t348474\n团训\t348475\n钱盆网\t348476\nLDAP\t348477\n六安新闻网\t348478\n三晚\t348479\n无记名\t348480\n电力公司\t348481\n长史\t348
482\n河北省二院\t348483\n共青团中央\t348484\n临沂市工商行政管理局\t348485\n白炽灯\t348486\n飙酷\t348487\nHUE\t348488\n夏黑葡萄\t348489\n会得\t348490\n萌媳\t348491\n七彩西游记\t348492\n环球旅讯(Travel\t348493\n菏泽市司法局\t348494\n第五十\t348495\n052d\t348496\n中国人民解放军军乐团\t348497\nbc\t348498\nwin7系统ie浏览器\t348499\n金童玉女\t348500\n法学博士\t348501\n郭嘉峰\t348502\n彩叶\t348503\n上海电子信息职业技术学院\t348504\n動作\t348505\n大融城\t348506\nmoco\t348507\n噻虫嗪\t348508\n广东海洋大学寸金学院\t348509\nDraft\t348510\n小曹\t348511\nCitation\t348512\nlogx\t348513\nmx5\t348514\n台湾科技大学\t348515\n无所求\t348516\n新能源车购置税\t348517\ncolormap\t348518\n方案稿\t348519\n塑壳\t348520\n贪玩\t348521\nLineageOS\t348522\n小稽\t348523\n粗盐\t348524\nHomeFacialPro\t348525\n蜂蜜柚子\t348526\n九门\t348527\n肌电图\t348528\n注册查询网\t348529\n常备借贷便利\t348530\n中品\t348531\n效果好\t348532\n永生的眼睛\t348533\n赏\t348534\n0.5px\t348535\n南苑\t348536\n马格南\t348537\n分笔\t348538\n邮船\t348539\nxmlbar\t348540\n机械制造\t348541\n51job.com\t348542\n风见\t348543\n蔡秋凤\t348544\n经时\t348545\n摇头晃脑\t348546\n本科生院\t348547\n绝代双骄2\t348548\n保密协议书\t348549\n光明大陆传奇\t348550\n黄易群侠传\t348551\n体育消费\t348552\n造词\t348553\noralb\t348554\n工作会议纪要\t348555\nclearly\t348556\n济南酒店\t348557\n桥窗\t348558\n湘湖\t348559\n偈\t348560\n钨合金\t348561\n一元化\t348562\n安凯客车\t348563\n世界之花\t348564\nsdp\t348565\n成都市林业和园林管理局\t348566\n护甲片\t348567\n中共天门市委办公室\t348568\nVito\t348569\n36班\t348570\n武陵源景区\t348571\n九头蛇\t348572\n湖南公司\t348573\n非洲鼓\t348574\n虚拟Home键\t348575\n瓦莱乔\t348576\nCKplayer\t348577\n自用工\t348578\n史立荣\t348579\nClannad\t348580\n十八岁\t348581\nXIAO\t348582\n楞严经\t348583\n庙门\t348584\n张明亮\t348585\n蓝叶\t348586\nエッジ\t348587\n邳州房产网\t348588\n西南\t348589\n暗黑破坏神III\t348590\n离散傅里叶变换\t348591\n冬哥\t348592\nfelix\t348593\n_方舟\t348594\n棋盘山风景区\t348595\n秀权\t348596\nTensorFlow\t348597\n杨市镇\t348598\nrear\t348599\n压力计\t348600\n四瓣\t348601\n神盾局特工第五季\t348602\n出锦\t348603\n305路\t348604\n满朝\t348605\n杜鲁尔\t348606\nsalve\t348607\n肾源\t348608\n0X000000\t348609\nZEALER\t348610\n晚霜\t348611\n王聪儿\t348612\nrew\t348613\n大庄\t348614\n王小云\t348615\n浪漫\t348616\n吡\t348617\n周日下午\t348618\n耀莱\t348619\nRush\t348620\n柳词\t348621\nlinux虚拟机\t34862
2\n罗兰巴特\t348623\nmoral\t348624\n不等式组\t348625\n优肯\t348626\n七则\t348627\n工易\t348628\namplification\t348629\n卟啉\t348630\n2.6D\t348631\n阿修罗\t348632\n鹰爪\t348633\n大个子\t348634\n严刑拷打\t348635\n天歌\t348636\n讲台\t348637\n云南铝业股份有限公司\t348638\n报幕\t348639\n长兴街\t348640\n湖北大学研究生院\t348641\n100多公里\t348642\n锰业\t348643\n假如\t348644\n导航贴\t348645\n清影\t348646\n刘鹏飞\t348647\n用户群\t348648\n绑床\t348649\n第22期\t348650\n咖啡桌\t348651\n时光飞逝\t348652\n劳改犯\t348653\n复方多粘菌素b软膏\t348654\nQml\t348655\n恒大中心\t348656\n药勺\t348657\n速比\t348658\nfinished\t348659\nCura\t348660\n华英雄\t348661\n专情\t348662\n资本回报率\t348663\nexeter\t348664\n16册\t348665\n颜强\t348666\n颠茄\t348667\n巴伦\t348668\n劣根性\t348669\n产权式酒店\t348670\n四川市\t348671\n海悦\t348672\n138平米\t348673\nxll\t348674\n化妆品牌\t348675\n绳墨\t348676\nOwin\t348677\n哎嘿\t348678\n冒险岛单机版\t348679\ndrools\t348680\n小猪佩奇\t348681\n一万八\t348682\n突击\t348683\n经纪公司\t348684\nzhc\t348685\n喜马拉雅听\t348686\n云中岳\t348687\nnetbackup\t348688\n试跑\t348689\n景德镇陶瓷\t348690\n子报表\t348691\n腕式\t348692\n悬浮式\t348693\nJTable\t348694\nPotential\t348695\n莱泽曼\t348696\n周海兵\t348697\n缘来非诚勿扰\t348698\nsql2016\t348699\n皮城女警\t348700\nMD5码\t348701\n2分钟内\t348702\n山东省小学\t348703\n遥控器\t348704\n装机单\t348705\nIP地址库\t348706\n幽灵行动荒野\t348707\n疼片\t348708\n上心头\t348709\n西娅\t348710\n集合资金信托\t348711\n钢箱\t348712\nHGST\t348713\n戒魔人\t348714\n只言片语\t348715\n小米音箱\t348716\n三者\t348717\n老相\t348718\nsurprising\t348719\n影音播放器\t348720\n稻香村\t348721\nDam\t348722\n涂层\t348723\n匀强磁场\t348724\n制冷器\t348725\n陪葬\t348726\n合并器\t348727\n马连洼\t348728\n义乌佛堂\t348729\n光亮剂\t348730\n兽行\t348731\nword,excel\t348732\n河南艺术职业学院\t348733\n夜华\t348734\n朱锐\t348735\nNET4.0\t348736\n稷山\t348737\n衙\t348738\n非常运势星座网\t348739\n铺底流动资金\t348740\nIT导购\t348741\n熊廷弼\t348742\n胫\t348743\nue900\t348744\n明英宗\t348745\n翟晓川\t348746\n国家临床医学研究中心\t348747\nfseek\t348748\n5吨\t348749\n日语词典_词典网\t348750\n弹球\t348751\n逝\t348752\n一拳超人\t348753\n易高\t348754\n南京注册公司\t348755\n窗族\t348756\n涟城\t348757\n汉口火车站\t348758\npacific\t348759\n打狗棒\t348760\ntn=99294919_hao\t348761\nwin7ghost\t348762\n特片神马网\t348763\n散光眼镜\t348764\n2.2\t3487
65\n女画家\t348766\n才气\t348767\n缬沙坦胶囊\t348768\n老相机\t348769\n消逝的光芒\t348770\nprobit\t348771\nbedding\t348772\n喜雨\t348773\n大唐无双\t348774\n宗教类\t348775\n端木熙\t348776\n巴曙松\t348777\n民间中医网\t348778\n地长\t348779\n货殖\t348780\n青岛大学医学院附属医院\t348781\n泰禾地产\t348782\n粘结型\t348783\n小苗\t348784\n溴化银\t348785\nOpenfire\t348786\n桥北\t348787\n歌舞伎\t348788\n安曼\t348789\n山水情歌\t348790\n李小男\t348791\n开瑞K60\t348792\n盖乐世\t348793\nFWjia\t348794\nNoon\t348795\n白老师\t348796\n格宾\t348797\n86寸\t348798\nprolifes\t348799\n个人税\t348800\n电商们\t348801\n虾片\t348802\n葵花胃康灵\t348803\nALPHA\t348804\n绥\t348805\nnba2k11\t348806\n4399赛尔号\t348807\n久悬\t348808\n最优\t348809\n堂主\t348810\n河南中医学院第一附属医院\t348811\nKraken\t348812\ncayman\t348813\nmi3\t348814\n陶\t348815\n185个\t348816\n一千万条\t348817\n蝇人\t348818\n202号\t348819\n水都\t348820\n2017年4月2日\t348821\n美元化\t348822\nbody-parser\t348823\n心书\t348824\n龙潭沟\t348825\n七创社\t348826\n学程\t348827\n广东财经大学\t348828\n汤山街道\t348829\n画材\t348830\nhq\t348831\n美图公司\t348832\n炎热\t348833\n摄山星城\t348834\n刘靖\t348835\n老虎滩海洋公园\t348836\n赵家楼\t348837\n10MW\t348838\n搭配师\t348839\n浓荫\t348840\n献爱\t348841\n动画\t348842\n冠杰\t348843\nWanPlus\t348844\n武神3\t348845\n绝密\t348846\n汉口北\t348847\n被保险\t348848\n大恩\t348849\n狂犬\t348850\n钢铝关税\t348851\n厌世\t348852\n朱耷\t348853\n点\t348854\n第一范文网\t348855\n不怨\t348856\nCAJViewer阅读器\t348857\n吊桥\t348858\n财神\t348859\n丽莎\t348860\n陈然\t348861\n戏路\t348862\nC11\t348863\n韦\t348864\n冤家\t348865\n心胸\t348866\n218年\t348867\n上传\t348868\n山东美术馆\t348869\n楼间距\t348870\n喷雾机\t348871\n橡胶棒\t348872\n缺氧池\t348873\n嗑瓜子\t348874\n推求\t348875\n脚踢\t348876\nrafra\t348877\n新天龙八部\t348878\n逮捕\t348879\niA\t348880\na2o\t348881\n18ss\t348882\n白虎煞\t348883\n芝英镇\t348884\n南和\t348885\n光饼\t348886\n流亭\t348887\n刘卫兵\t348888\n芦淞\t348889\n黑色块\t348890\n一见如故\t348891\nasort\t348892\n商圈\t348893\n筑物消防\t348894\n药敏试验\t348895\n蜜柑\t348896\n济宁市环境保护局\t348897\njerry\t348898\n企业税务评级\t348899\n轻点儿\t348900\n赛尔\t348901\n冬残奥会组织委员会\t348902\n忒修斯\t348903\n深圳购物公园\t348904\n振捣\t348905\nWor\t348906\nCabinet\t348907\n首句\t348908\n中国共产党地方委员会工作条例\t348909\n成人动漫AV电影_影音先锋AV动漫\t348910\
n火线圈\t348911\n大军\t348912\n香橼\t348913\n少年行\t348914\n衣蛾\t348915\njixie\t348916\n大快\t348917\n20161016\t348918\nイイ\t348919\nNames\t348920\n王姐\t348921\n18SS\t348922\n易贤网\t348923\n执勤\t348924\n网课\t348925\n9.95\t348926\n言谈\t348927\n体博会\t348928\n程式\t348929\n得治\t348930\n残疾人联合会\t348931\n本职\t348932\nSolaris10\t348933\n张绍刚\t348934\n素心\t348935\n毕业论文范文网\t348936\nsequen\t348937\ndrv\t348938\n陆一鸣\t348939\n星华\t348940\nChildhood\t348941\n总图\t348942\n联名款\t348943\n二维数组元素\t348944\n易成新能\t348945\n美人鱼岛\t348946\n哪儿\t348947\n济源文明网\t348948\n高尔夫6论坛_XCAR\t348949\nCharlie\t348950\n创芯\t348951\n亦菲\t348952\n5123\t348953\nasus\t348954\ncheckboxes\t348955\n液面\t348956\n撸撸\t348957\n英雄的征程\t348958\nGodex\t348959\n自考本科\t348960\n性本恶\t348961\n算清\t348962\nworkbook\t348963\n台达PLC\t348964\n美国女排\t348965\n测量师\t348966\n雷霆版\t348967\n绝地求生刺击\t348968\n周建华\t348969\n39.9\t348970\n相角\t348971\n好女孩\t348972\n通秘\t348973\n干洗机\t348974\nmapgis\t348975\n象牙果\t348976\n首胜\t348977\n低秩\t348978\n国库\t348979\n封帝释魔天\t348980\n杉达学院\t348981\n中国清明网\t348982\n购物狂欢节\t348983\n豪方\t348984\n郊野公园\t348985\n填空\t348986\n世霸\t348987\n痘\t348988\n图形界面\t348989\n8081\t348990\n宋体字\t348991\nshx\t348992\n开发方\t348993\n邦德富士达\t348994\n登风\t348995\nCodeForces\t348996\n免维护\t348997\n南玻\t348998\n佛光山\t348999\n对倒\t349000\n僚机\t349001\n潘驴\t349002\n浪子彦亚索\t349003\n中山大学岭南学院\t349004\n第六年\t349005\n磁动机\t349006\n6.3.8\t349007\n利口酒\t349008\n红筹股\t349009\nPL/SQL\t349010\n太平洋航空\t349011\nベ\t349012\n下付款\t349013\n浦口桥北\t349014\n重填\t349015\nzx7\t349016\n盘完\t349017\nLeung\t349018\n见底\t349019\natf\t349020\n惊破\t349021\n降雨量\t349022\n江西农业大学\t349023\n超高跟\t349024\n老照\t349025\n魔杰座\t349026\n桁架桥\t349027\nmdc\t349028\n简易型\t349029\nsimplify\t349030\n中央局\t349031\n春眠不觉晓\t349032\n第三行\t349033\n语委\t349034\n3COM\t349035\n阿奎利亚\t349036\n原振侠\t349037\n双盆\t349038\n土猫\t349039\nquietly\t349040\npok\t349041\n激光管\t349042\n聚餐\t349043\n铁血柔情\t349044\n补剂\t349045\n老左\t349046\n张春华\t349047\n人才招聘网\t349048\n独轮\t349049\n大片儿\t349050\n戦姫\t349051\n雪兰莪\t349052\n橄榄绿\t349053\n69vids\t349054\n411路\t349055\n招魂\t349056\n德州新闻网\t349057
\n大城\t349058\n飞霞山\t349059\n矛盾文学奖\t349060\nkuku\t349061\n旗主\t349062\n定陶区\t349063\nAudioManager\t349064\n就医\t349065\n4331\t349066\n广州中医药大学第一附属医院\t349067\n彻头彻尾\t349068\nSurfing\t349069\n地板漆\t349070\n二十四番\t349071\n劳务派遣证\t349072\n英雄联盟攻略\t349073\n西哈努克港\t349074\n第40集\t349075\n20161121\t349076\n虚证\t349077\n忘情歌\t349078\n三物\t349079\n第一百一十二章\t349080\n花妆\t349081\n登封少林寺\t349082\n燃具\t349083\n李旭科\t349084\n验\t349085\n锦袍\t349086\n一串一串\t349087\n转改\t349088\n九九作文网\t349089\n华为matebook\t349090\n中国银监会湖南监管局\t349091\n东南商报\t349092\nbedtools\t349093\n上海房产网\t349094\n天龙八部\t349095\nLite\t349096\n711\t349097\n产科\t349098\n科大讯飞晓译\t349099\nWebsphere\t349100\n三维植被网\t349101\n压面机\t349102\n8.0.3\t349103\n土壤酶\t349104\n喷脸\t349105\n成都七中实验学校\t349106\n课后\t349107\n长沙地铁4号线\t349108\n狼犬丹尼\t349109\n欧服\t349110\nYSL\t349111\n拉美\t349112\n508号\t349113\n方差矩阵\t349114\n580_\t349115\n吉林医药学院\t349116\nAviva\t349117\n魅蓝Metal\t349118\n姬芮\t349119\n我最好的老师\t349120\n人民\t349121\n重复购买率\t349122\n桂民卡\t349123\n汉腾\t349124\n鼠\t349125\n还击\t349126\n傅立叶变换\t349127\ncctv1\t349128\n八千米\t349129\n阿托伐他汀钙\t349130\n刘旸\t349131\n谋女郎\t349132\n田纳西\t349133\n讷\t349134\n拉窗\t349135\n安恋\t349136\n中国社会出版社\t349137\n哈博\t349138\n我与世界只差一个你\t349139\n中国井冈山干部学院\t349140\nTing\t349141\n蛇夫\t349142\n忽视\t349143\n最好\t349144\nSY\t349145\n一个22岁\t349146\n环青海湖国际公路自行车赛\t349147\n欧城\t349148\n20170219\t349149\n糊化\t349150\n小灵狗\t349151\n天祥\t349152\nRE文件管理器\t349153\n_万象物语\t349154\n2穴\t349155\n澳头\t349156\n试客\t349157\nWebhooks\t349158\nWNDR3800\t349159\n马草\t349160\nGreatest\t349161\n不义联盟2\t349162\n路桥人才网\t349163\n鬼厉\t349164\n2014年01月25日\t349165\nP115\t349166\n十档\t349167\n青云谱\t349168\n20170604\t349169\nAPPID\t349170\npeeing\t349171\n桃花峪\t349172\n女帝\t349173\n张怡微\t349174\nwishing\t349175\n辐辏\t349176\n易富贤\t349177\nmahou\t349178\n本特利\t349179\n雅莉\t349180\nMerit\t349181\nimpactor\t349182\n中国银监会江苏监管局\t349183\n幸福账单\t349184\n騒\t349185\n讯飞语音输入法\t349186\n沃乐夫\t349187\ncintiq\t349188\n恒生电子\t349189\n飞秒激光\t349190\nWallpaper\t349191\n粉丝汤\t349192\n66163.com\t349193\n两天晒\t349194\n气体分析仪\t349195\n幼学琼林\t3
49196\n地氯雷他定分散片\t349197\n40%\t349198\n\\位\t349199\n四圈\t349200\n雪铁龙C5\t349201\nヤバ\t349202\n台大\t349203\n恒晨\t349204\n尊老敬老\t349205\n道哥\t349206\n诚挚\t349207\n存储容量\t349208\n转流\t349209\n色线\t349210\n禾绿\t349211\ns8\t349212\n文件框\t349213\n狮航\t349214\n二硕\t349215\n暨\t349216\n苏信\t349217\n12.0.4\t349218\n老山国家森林公园\t349219\n党办人\t349220\n上海地产\t349221\n厦门_新闻中心\t349222\n叶小文\t349223\nOpManager\t349224\n上海市第二中级人民法院\t349225\n克里斯·帕拉特\t349226\n蒋家骏\t349227\n宋武帝\t349228\n激凸\t349229\n程幼泽\t349230\n同门\t349231\nyey\t349232\n陶寺\t349233\n山口式\t349234\n贷款买车\t349235\n日色\t349236\n冷水镇\t349237\n园林局\t349238\nOO\t349239\n慈溪市人力资源和社会保障局\t349240\n风花雪月\t349241\n庸人自扰\t349242\n贷款计算器\t349243\nIllustra\t349244\n在编人员\t349245\n高阁\t349246\n3021\t349247\n圆床\t349248\n尤金\t349249\n李多奎\t349250\n空气球\t349251\n出乎\t349252\ndumpling\t349253\n轩辕传奇2\t349254\n江苏省总工会\t349255\n刺客\t349256\n柯俊\t349257\n高宠\t349258\n周星驰\t349259\n宝塔区\t349260\n勾线\t349261\n郑州日产\t349262\n摇蚊\t349263\n一个年代\t349264\ni=1\t349265\n深圳德尚泌尿外科医院\t349266\n多玩我的世界盒子\t349267\n萨省\t349268\nwanju\t349269\n简一大理石瓷砖\t349270\n一体柜\t349271\n善与恶\t349272\n东北师大附中\t349273\n公孙瓒\t349274\n复代文\t349275\n客观唯心主义\t349276\n一爷\t349277\n吉他弹唱\t349278\ndeliver\t349279\n三维家吧\t349280\nj=0\t349281\n逃学大乱斗\t349282\n猇亭\t349283\n发声器\t349284\n裕同\t349285\n他人\t349286\n烟草业\t349287\n八面通\t349288\n我唾弃你的坟墓\t349289\n不许动\t349290\n山怪\t349291\n水晶球\t349292\n测绘学\t349293\n7.8万\t349294\n北京市平谷区人民政府\t349295\noverture\t349296\n上海市科学技术委员会\t349297\ndate型\t349298\ndate类\t349299\n合肥网\t349300\nROOT权限\t349301\nBowling\t349302\n_式\t349303\n鲍蕾\t349304\n听道\t349305\n佳能600D\t349306\nFlume\t349307\n张蓓蓓\t349308\n翁立友\t349309\n超宠\t349310\n发文\t349311\n广东电网\t349312\n沈万三\t349313\nsstream\t349314\n南宁二中\t349315\n中文网|ELLE\t349316\n吕特\t349317\n1+1\t349318\nayumi\t349319\n温宿县\t349320\n接种证\t349321\n长龙航空\t349322\n发明家\t349323\n家电器\t349324\n广告条\t349325\n感概\t349326\n第九十八章\t349327\n营部\t349328\n脆爽\t349329\n喜欢\t349330\n撑杆\t349331\n60kg\t349332\n凌枫\t349333\n四牌楼\t349334\n叶振棠\t349335\n万鹏\t349336\nubs\t349337\n期许\t349338\n单独\t349339\n用友网络科技股份有限公司\t349340\
n外阴部\t349341\nShu\t349342\nVIPABC\t349343\n950M\t349344\nython\t349345\n芡实糕\t349346\n构\t349347\n鸭肉\t349348\nIE80\t349349\n杨俊\t349350\n日方\t349351\n中国美术学院上海设计学院\t349352\n胜女\t349353\nMein\t349354\nKatz\t349355\n大提琴曲\t349356\n木船\t349357\n二十八宿\t349358\n卤虾\t349359\n_棉花糖小说网\t349360\n三级跳远\t349361\n黄帝故里\t349362\n绝世双骄\t349363\n随缘居\t349364\n天翼卡\t349365\n永宁街\t349366\n开心版\t349367\n尹昉\t349368\n磨光机\t349369\nBomb\t349370\nES2017\t349371\nmappath\t349372\n签注\t349373\n巡捕\t349374\nBD1080P\t349375\n第99\t349376\n造孽\t349377\n高崎\t349378\n洗衣池\t349379\n莺歌燕舞\t349380\n系统之家官网\t349381\n师夷长技\t349382\n国家级森林公园\t349383\n降低率\t349384\n陕西省政府\t349385\n复旦大学公共卫生学院\t349386\n捷豹f-type\t349387\n愚弄\t349388\n美力\t349389\n李玉峰\t349390\nbt樱桃\t349391\n阿培南\t349392\nspecifier\t349393\n第一版\t349394\n开平市\t349395\nInhibition\t349396\n四川日报\t349397\n引证\t349398\n四副\t349399\nconect\t349400\n第59条\t349401\n郑州高铁东站\t349402\n2年\t349403\n广东教育出版社\t349404\nREVIT\t349405\n骚情\t349406\n统一价\t349407\n襟怀\t349408\n东楚网\t349409\n北京爱情故事\t349410\nEssen\t349411\n保护套\t349412\n金属制品网\t349413\ncomp\t349414\n幸运方块\t349415\n047期\t349416\n阿澈\t349417\n凯撒大帝\t349418\n针灸科\t349419\n制裁者\t349420\n李爱国\t349421\n缝补\t349422\n敢干\t349423\n李凝\t349424\ndope\t349425\n1年前\t349426\n宁波招聘网\t349427\n荣威350论坛_汽车之家论坛\t349428\n肉猪\t349429\nmolecular\t349430\n王家瑞\t349431\npsnr\t349432\n第1号\t349433\n香港科大\t349434\n饮者\t349435\n怠速\t349436\n芦溪\t349437\n235\t349438\n环伺\t349439\n石家庄新火车站\t349440\n排放物\t349441\n移动化\t349442\n老茧\t349443\n罗超\t349444\nSEND\t349445\n梨\t349446\n上海西郊骨科医院\t349447\n周金涛\t349448\n卡_\t349449\n剧务\t349450\n创卫\t349451\n梅罗\t349452\n鬼打鬼\t349453\n新林区\t349454\n特玩剑灵\t349455\n省发展改革委\t349456\n处暑\t349457\nedp\t349458\n北京市八一学校\t349459\n臀型\t349460\n胡建军\t349461\nRichard\t349462\nPopulation\t349463\ni型\t349464\n阿里巴巴国际站旺铺\t349465\n固始\t349466\n龙轩\t349467\nSecurity3\t349468\n辨认\t349469\n香酥\t349470\n铝离子\t349471\n139\t349472\n不得\t349473\nPeters\t349474\n第二十篇\t349475\nentering\t349476\n阳光郡\t349477\n家园\t349478\n光华村\t349479\ntaker\t349480\n北安河\t349481\nYQBD\t349482\n赵建\t349483\n温厚\t349484\
n江老师\t349485\n参拜\t349486\n牛肉饼\t349487\n活性化\t349488\n油盒\t349489\nMerrill\t349490\n空时\t349491\n帷幕\t349492\n航测\t349493\n氧氟沙星\t349494\n90本\t349495\n中关村北大街\t349496\n河势\t349497\nelbow\t349498\n甘肃省公路航空旅游投资集团有限公司\t349499\n焚天决\t349500\n农廉网\t349501\n上海农林职业技术学院\t349502\n资讯_敢富网\t349503\n沥林\t349504\nArgentina\t349505\n澳龙\t349506\n苯甲\t349507\nguoji\t349508\nmoca\t349509\n待定\t349510\n于城镇\t349511\nBL小说\t349512\nvertically\t349513\nfp30\t349514\n孙心远\t349515\n酷色\t349516\n松阳政府网\t349517\n音管\t349518\n送葬者\t349519\n尤克里里Fans乌\t349520\nや\t349521\n果桑\t349522\n玉麦\t349523\n张承中\t349524\n上邪\t349525\n柴火鸡\t349526\n综合\t349527\n高速公里\t349528\n泛泰\t349529\nRemake\t349530\n中国人大网\t349531\n槽车\t349532\n甄嬛\t349533\nHilux\t349534\n奔图\t349535\npediy\t349536\n二酮\t349537\nM1530\t349538\n新点\t349539\n陈小霞\t349540\n护童\t349541\n东京热在线\t349542\n校园网\t349543\n东骏\t349544\n堪萨斯\t349545\n诗礼\t349546\n联合国开发计划署\t349547\n庆丰村\t349548\n锁帧\t349549\n白井黑子\t349550\n多重\t349551\n朝夕相处\t349552\nTCP协议\t349553\n低硫\t349554\n肖飞\t349555\n东湖街道\t349556\n大众4S店\t349557\n一条\t349558\n札娜\t349559\n4.42\t349560\n飞马旅\t349561\n哈继第六感生死缘\t349562\n光饰机\t349563\nzuixin\t349564\n数款\t349565\n七武海\t349566\n结婚日\t349567\n缓蚀阻垢剂\t349568\n马克思恩格斯文集\t349569\n拳皇98终极之战ol\t349570\n灶\t349571\n小巫\t349572\n阿莫仙\t349573\n白草莓\t349574\n英菲尼迪QX50\t349575\nYUM源\t349576\n应力分析\t349577\n新公司法\t349578\n50249499\t349579\n瓦块\t349580\n北京化工大学\t349581\n贝叶\t349582\n坪\t349583\n3月末\t349584\n李栋\t349585\n骶髂关节\t349586\n广联达科技股份有限公司\t349587\n一相\t349588\n涡轮式\t349589\n隐藏式\t349590\n全县村\t349591\n输油管道\t349592\nSpringMvc\t349593\n牡丹苑\t349594\n粘膜\t349595\n房产福利社\t349596\n张莉莉\t349597\n361房车网\t349598\n善良的女秘书的目的\t349599\n最好吃\t349600\nkejiqi\t349601\n禁欲主义\t349602\n赞恩\t349603\n邻家有女初长成\t349604\nObservations\t349605\n鉴定器\t349606\n赵小雷\t349607\n电视节目表\t349608\n移走\t349609\n麻鸡\t349610\n乐安居\t349611\nmaryland\t349612\n95部\t349613\nCR-V论坛\t349614\n订书机\t349615\n官禄宫\t349616\n苏州工业园区服务外包职业学院\t349617\n张晨光\t349618\n小手状\t349619\ncampbell\t349620\n灰喜鹊\t349621\n四分制\t349622\n奥西\t349623\n物欲横流\t349624\nApps\t349625\n手工具\t349626\n白湖\t
349627\n初级\t349628\n小月亮\t349629\n豪族\t349630\n快塑网\t349631\n挑食\t349632\n微分形式\t349633\n新易盛\t349634\nvarieties\t349635\n作业集\t349636\nrgba\t349637\nfilson\t349638\n兴港\t349639\n抽提物\t349640\nAFK\t349641\n叶总\t349642\nSplinter\t349643\n什么叫做\t349644\n审稿人\t349645\n特工皇妃楚乔传\t349646\n几千条\t349647\n似的\t349648\n01.mp4\t349649\n8600\t349650\n郑博士\t349651\n华宇\t349652\n补口\t349653\n长贵\t349654\n徐琦\t349655\nEUROPEAN\t349656\n春梅\t349657\n368\t349658\n───\t349659\n简析\t349660\n人民日报\t349661\n十门\t349662\n服务行动\t349663\n云南信息报\t349664\n热恋\t349665\n平安时报\t349666\n八一飞行表演队\t349667\nUnsupervised\t349668\n文机关\t349669\n阴阳代理人\t349670\nfirefox吧_\t349671\n脑剧\t349672\n2.3_\t349673\n啪啪模拟器\t349674\ntest3\t349675\n北京华信医院\t349676\n0736\t349677\nm女\t349678\n私募投资基金\t349679\ncaoporen\t349680\n依云小镇\t349681\n水网\t349682\n故名\t349683\nsnowball\t349684\n20170330\t349685\n7340\t349686\n林升\t349687\nTechWeb\t349688\nrgb转16进制\t349689\n访问团\t349690\nrepresented\t349691\n卡丹\t349692\n人狗\t349693\n宜春市农业局\t349694\n审美\t349695\n315\t349696\n加拿大大学\t349697\n第四只\t349698\n游春图\t349699\nxsw\t349700\n国家自然科学基金委员会\t349701\nPHPstudy\t349702\n北京电子科技职业学院\t349703\n上海浦东新区陆家嘴\t349704\n第27\t349705\n墨龙\t349706\n电叶片\t349707\n今缘\t349708\n绵羊油\t349709\n重生之锦绣嫡女\t349710\n希望小学\t349711\n雨霖铃\t349712\n初乳\t349713\n有缘网\t349714\n输光\t349715\nOUTPUT\t349716\n11步\t349717\n奔驰4S店\t349718\n无人驾驶\t349719\n_华律网\t349720\n自发光\t349721\nTOD\t349722\n新文阁\t349723\n呼吸道感染\t349724\nImpossible\t349725\n景逸x5\t349726\n群青\t349727\n凶器\t349728\n13平米\t349729\n拍摄仪\t349730\n莫兰\t349731\n山西省图书馆\t349732\n深圳野生动物园\t349733\n肇庆市高要区人民政府\t349734\nPandorabox\t349735\nAdmin\t349736\n要求\t349737\n威金斯\t349738\n逐光者\t349739\n投食\t349740\n中央纪委监察部\t349741\n中国好书\t349742\n大卫·哈维\t349743\n千江月\t349744\n高数\t349745\n16安\t349746\n第110期\t349747\nheize\t349748\n宁波市教育局\t349749\n中天未来方舟\t349750\n蕙兰\t349751\n新加坡酒店\t349752\n层位\t349753\n8月4日\t349754\n威刚内存\t349755\nphonex\t349756\n大漆\t349757\n非执业注册会计师\t349758\nipanema\t349759\n姚文元\t349760\n传奇故事\t349761\n黔之驴\t349762\nMistakes\t349763\n中华人民共和国侵权责任法\t349764\n甜蜜蜜\t349765\n黄延秋\t3
49766\n广东省食药监局\t349767\ngeological\t349768\n宗教局\t349769\n小鸡哔哔\t349770\n白晶晶\t349771\n九阴真经3D\t349772\n经济特区\t349773\n庄胜\t349774\n2^1\t349775\nspank\t349776\n寸步不让\t349777\n扁通\t349778\n第八辑\t349779\n看过来\t349780\n良口镇\t349781\nMarie\t349782\n画用\t349783\n北海职业学院\t349784\n墨线\t349785\n房地产投资信托基金\t349786\n龙亭区\t349787\n长城电脑\t349788\n两舱\t349789\n于刚\t349790\naccesskey\t349791\n怪才\t349792\n虚拟机器人\t349793\n叠石\t349794\n月经失调\t349795\n每日邮报\t349796\n观战\t349797\n处理场\t349798\n义肢\t349799\n单雄信\t349800\nNic\t349801\n宝骏560\t349802\n泷川索菲亚\t349803\n鼠鱼\t349804\n20天前\t349805\n法兰机\t349806\nflas\t349807\n海宝\t349808\n巨鹿路\t349809\n吸油纸\t349810\n渝湘\t349811\n中国专利局\t349812\nDS\t349813\n玻璃屏\t349814\nhac\t349815\n保利通\t349816\nSlavery\t349817\n新闻\t349818\n第80集\t349819\nExpanse\t349820\n亚华\t349821\n水石\t349822\n乡村爱情浪漫曲\t349823\nDiscount\t349824\n税银\t349825\n百战天龙\t349826\nNatalie\t349827\n高富帅\t349828\n木香花\t349829\n下身\t349830\n大气污染\t349831\n云大附中\t349832\n江南1970\t349833\n珊瑚礁\t349834\n雷蛇驱动\t349835\n崇阳县\t349836\nLocalStorage\t349837\n轰天炮\t349838\n创网\t349839\n造血式\t349840\n北京小升初网\t349841\n北京地铁8号线三期\t349842\n攻击型\t349843\n消火栓\t349844\n金山谷\t349845\n湛蓝\t349846\nMacau\t349847\n花词\t349848\n绿米\t349849\n裤裙\t349850\n结晶型\t349851\nnumber型\t349852\n濮北新区\t349853\n万能艺图网\t349854\n温州医学院附属第二医院\t349855\n维士\t349856\n姜片\t349857\n美羽\t349858\n主母\t349859\n早上4点\t349860\n最新银行利率表_银行存贷款利率表\t349861\n云芝\t349862\n介绍人\t349863\nhuodong\t349864\n马春喜\t349865\n北交所\t349866\n黑龙江商务厅\t349867\n仁恒置地广场\t349868\n脂流茶\t349869\n68121125\t349870\n伪劣\t349871\n传仙界\t349872\n展览\t349873\n萧忆情Alex\t349874\nHTML+JS\t349875\n上学吧\t349876\n香树湾\t349877\nub\t349878\nSlash\t349879\nhiring\t349880\n尓豪\t349881\n刘昱\t349882\n李叫兽\t349883\n妹妹\t349884\n假药\t349885\n英国博物馆\t349886\n强戒\t349887\n宅人食堂\t349888\n2400元\t349889\n续编\t349890\n连缴\t349891\n蕙质兰心\t349892\nMountain's_blog\t349893\n哈弗H4\t349894\n贝克休斯\t349895\ni3-4170\t349896\n狄仁杰\t349897\n瑞文\t349898\n小学教育专业\t349899\n王映霞\t349900\n征缴\t349901\n型式\t349902\n美妈\t349903\n第21次\t349904\n龙门吊\t349905\n收售\t349906\n一哥\t349907\n半日游\t349908\nScenes\t3
49909\nFavourite\t349910\n千分位\t349911\n凯华\t349912\n制版费\t349913\nTCX\t349914\n12.30\t349915\n牛心\t349916\n38.6\t349917\n安卓版v1.0_游迅网\t349918\n上海竟跃网络科技有限公司\t349919\n吴非\t349920\n无冬2\t349921\n雪荷载\t349922\n中国政法大学\t349923\n台湾海峡\t349924\nworldfirst\t349925\n上柴股份\t349926\nSW-130\t349927\ninitials\t349928\n染发剂\t349929\nSKETCHUP\t349930\n沃尔沃汽车\t349931\n创维集团\t349932\n12点半\t349933\n15大\t349934\nrollup\t349935\n总阀\t349936\n心算\t349937\n食为天\t349938\n4宗\t349939\n马勒戈壁\t349940\nDiy\t349941\n中国合伙人\t349942\n第110章\t349943\n风尚\t349944\njstl\t349945\n摄影棚\t349946\n敏安\t349947\n聚龙\t349948\n工大\t349949\n陈晓猫\t349950\n长城c50\t349951\n還是\t349952\n东平镇\t349953\napp.js\t349954\n8197\t349955\navtt\t349956\njay迷吧\t349957\nPHILIPS\t349958\n3000余\t349959\n毛衫\t349960\n消防站\t349961\n自动组装机\t349962\nOxy\t349963\nX8\t349964\n后场\t349965\n怒潮\t349966\n尸兄\t349967\n卡乐\t349968\n要塞战役\t349969\n帕杰罗v97\t349970\n4G版\t349971\n三螺杆泵\t349972\nsolve\t349973\n公对私\t349974\nSERVER2008\t349975\n获取\t349976\n中立\t349977\n锤黑吧\t349978\n荪湖\t349979\n/textarea\t349980\nManuka\t349981\n兽面\t349982\n义列\t349983\n正峰\t349984\n张新\t349985\n宜春经济技术开发区\t349986\n机用\t349987\n20180125\t349988\n妈妈的故事\t349989\n旅游群\t349990\n小白鼠\t349991\n布地奈德鼻喷雾剂\t349992\n赫顿玛尔\t349993\n大乌龟\t349994\n海逸\t349995\n蒸馏器\t349996\n英雄连2\t349997\n2017年9月16日\t349998\namplifier\t349999\n攒机\t350000\n唐塔\t350001\n润通\t350002\n富诚\t350003\n都匀毛尖\t350004\n完人\t350005\n蓝山县\t350006\n装载车\t350007\n回雪\t350008\n2517\t350009\n红事\t350010\n创业邦\t350011\n考典\t350012\n玛\t350013\nleancloud\t350014\n通往\t350015\n开泰\t350016\nPassage3\t350017\n监控器\t350018\nCT片\t350019\n40.5\t350020\n一个子串\t350021\n百信御江帝景\t350022\nDictation\t350023\n暴牙\t350024\n第51章\t350025\n1513\t350026\nPWP\t350027\n曾谙\t350028\nbland\t350029\nMinor\t350030\n八百多\t350031\n水表箱\t350032\n风继续吹\t350033\n好生意\t350034\n芳心\t350035\n平型关\t350036\nkoch\t350037\n拆房\t350038\n拐角\t350039\n酷热\t350040\n那一晚\t350041\n经营者\t350042\nadaboost算法\t350043\n贾斯茅侃侃\t350044\n鼎展\t350045\n二娃\t350046\nvysor\t350047\nADDRESS\t350048\njsapi\t350049\n中国泰尔实验室\t350050\n包牌\t350051\n水木天成
\t350052\n宋琳\t350053\n第7条\t350054\n永宁路\t350055\n润和软件\t350056\n3158母婴网\t350057\n锋钢\t350058\nlg显示器\t350059\nmingle\t350060\n实海\t350061\n老麦\t350062\n血衣\t350063\n智典\t350064\n色感\t350065\n电钢琴\t350066\n水泥板\t350067\n庆庆\t350068\n高电位治疗仪\t350069\n刘文\t350070\n相济\t350071\n支农\t350072\n流畅\t350073\n绍兴图书馆\t350074\n金逸国际电影城\t350075\nsuck\t350076\n非侵入式\t350077\n国家安全观\t350078\n册亨\t350079\n柳泉\t350080\n比特币勒索病毒\t350081\n致死量\t350082\n乖离性ma吧_\t350083\n心跳包\t350084\n放映机\t350085\n康宁路\t350086\n嘉文四世\t350087\n王励勤\t350088\n叶圣陶先生二三事\t350089\n另一端\t350090\n高菲\t350091\ninfopath\t350092\n佐伯\t350093\ng0blin\t350094\n心意\t350095\n字理\t350096\n碳酸锂\t350097\n三龄\t350098\nCqant\t350099\n救孤\t350100\n车站南路\t350101\n驶上\t350102\n重庆妇幼保健院\t350103\n穿过\t350104\n79平米\t350105\n威斯敏斯特\t350106\n殉道\t350107\nhyperlink\t350108\nroadhogrc\t350109\n迪幻\t350110\nexhaustive\t350111\ngxy\t350112\n浮気\t350113\ndhclient\t350114\n丁基橡胶\t350115\n商调函\t350116\n沾花惹草\t350117\n虐杀原形2\t350118\n川村\t350119\n天津空港经济区\t350120\n中途\t350121\nTT\t350122\nxps挤塑板\t350123\n天涯明月刀太白\t350124\n是好\t350125\n18张\t350126\nlD\t350127\n哈尔滨开发区\t350128\n国际劳动妇女节\t350129\nCIIF\t350130\n一5\t350131\n武汉东西湖区\t350132\n魔兽世界战网\t350133\n鹤壁市\t350134\nfari\t350135\nnopcommerce\t350136\n拉歌蒂尼\t350137\n奕剑\t350138\n锐志论坛_汽车之家论坛\t350139\n1980年\t350140\n散裂\t350141\n江山雪\t350142\n澧\t350143\n小户型\t350144\n安琪尔\t350145\n胎神\t350146\n28001\t350147\n邱天\t350148\nabm\t350149\n中华人民共和国土地管理法实施条例\t350150\nU8\t350151\n总计\t350152\noperating\t350153\nMIDI控制器\t350154\n一一网\t350155\n唐太宗李世民\t350156\ndtmb\t350157\n1000瓦\t350158\n阿雅娜\t350159\n云浮市政府网\t350160\n近一半\t350161\nichi\t350162\n幽情\t350163\n反味\t350164\n贤者之爱\t350165\n金鹰集团\t350166\n腐向\t350167\n光毛\t350168\n梁鸿\t350169\nPlan\t350170\nPCI\t350171\n宇宙学\t350172\n石富宽\t350173\n知金教育\t350174\n座灵剑山\t350175\n19世纪\t350176\n767股票学习网\t350177\nWindchill\t350178\n仇敌\t350179\n8145v\t350180\n习特\t350181\n21世纪\t350182\nkix\t350183\n齐齐哈尔\t350184\n蔚然成风\t350185\n新乌龙院\t350186\n运命\t350187\n明细账单\t350188\nROG\t350189\n企宣\t350190\n有空\t350191\n阳光下的法庭\t350192\n二维条码\t350193\n五层\t350194\n平原地
区\t350195\n交响乐团\t350196\n18058期\t350197\n上中\t350198\n兔丸\t350199\n良率\t350200\n诺优\t350201\n硫铵\t350202\n翰皇\t350203\n田丽\t350204\n华中科技大学建筑与城市规划学院\t350205\n食疗学\t350206\n诗朗\t350207\n吱吱声\t350208\n无感\t350209\nBiotherm\t350210\n北京地铁7号线\t350211\n迷人\t350212\nThanksgiving\t350213\n下底\t350214\n5770\t350215\ntrumpet\t350216\n张至顺\t350217\nrbenv\t350218\n何克\t350219\n王宜林\t350220\n陈思雨\t350221\n天风\t350222\nXP32\t350223\nchime\t350224\nyige\t350225\n柏\t350226\n重庆市大足区人民政府\t350227\n北京市卫生局\t350228\n尼康d7200\t350229\n难以置信\t350230\n兆欧\t350231\n远程桌\t350232\n赤西仁\t350233\n实说\t350234\n丨丨\t350235\n汉家松鼠\t350236\n辛渐\t350237\n郑胜利\t350238\n上虞市\t350239\n叶无道\t350240\nuserinfo\t350241\n教导队\t350242\n美女性\t350243\n保险业务员\t350244\n河北省云办税厅\t350245\n球罐\t350246\nCLIENT\t350247\nフロ\t350248\n主从配置\t350249\nchristopher\t350250\n烦烦\t350251\n盒子版\t350252\n浦东新区\t350253\n瀬戸\t350254\nGNU\t350255\n京台高速\t350256\n徐浦大桥\t350257\n指教\t350258\n阴暗面\t350259\n小麻烦\t350260\nethminer\t350261\n赤贫\t350262\nFonts\t350263\nMYCAT\t350264\n5成\t350265\n仁爱医院\t350266\n小文\t350267\n残本\t350268\n广告史\t350269\n董事会议事规则\t350270\n新锦江\t350271\n曲片尾曲\t350272\n獾子\t350273\n金田一吧\t350274\n郑谷\t350275\n秩\t350276\n长岭居\t350277\n禁灵\t350278\n二项式系数\t350279\n电脑投影\t350280\nHunks\t350281\n雪铁龙C4\t350282\n耗材\t350283\ntbe\t350284\npointed\t350285\n7283\t350286\n比选\t350287\nsdh\t350288\n20th\t350289\n农业品牌\t350290\n提米\t350291\n烤脑花\t350292\n疯人院\t350293\neNSP模拟器\t350294\n盐雾\t350295\n税钱\t350296\n马贼\t350297\n字经\t350298\n长袜\t350299\n乞\t350300\n街路\t350301\n云海\t350302\n松下幸之助\t350303\njust\t350304\n刘尧\t350305\nFoshan\t350306\nCuties\t350307\n银桃花\t350308\n三相电_\t350309\n广审\t350310\npokey\t350311\n宋都\t350312\nwelch\t350313\n酸值\t350314\n常德市第一人民医院\t350315\n死亡宣告\t350316\n珍\t350317\n思维导图软件\t350318\n12.04\t350319\n罗技无线鼠标\t350320\n630K\t350321\nNote8\t350322\n百子轩\t350323\n8320\t350324\n青岛旅行社\t350325\n虎虎虎\t350326\n复权\t350327\n弱电解质\t350328\n魂剑\t350329\nCoaxial\t350330\nIcoMoon\t350331\n错用\t350332\n第七十一章\t350333\ndtse9\t350334\npthread_join\t350335\nKyo\t350336\n字效\t350337\n中样\t350338\n教书\t3503
39\n四叠半\t350340\n杰特\t350341\n中国建设工程造价管理协会\t350342\n冷冻式干燥机\t350343\n石原\t350344\n短效口服避孕药\t350345\n执业助理医师\t350346\nぁ\t350347\n双反\t350348\n国际服\t350349\n陈蓓蓓\t350350\n弹琴\t350351\n磨抛机\t350352\na5\t350353\n袁亚非\t350354\nsql文\t350355\n0028\t350356\n鼎益丰\t350357\n阿兹尔\t350358\n试商\t350359\nphome\t350360\nplanar\t350361\nVDH\t350362\n单眼老表\t350363\n南明河\t350364\n华为P6\t350365\nchasedream\t350366\nroblox\t350367\n肯尼思·布拉纳\t350368\n西禅寺\t350369\n邮编商务网youbian.com\t350370\nouter\t350371\n符华\t350372\n阿古尼特\t350373\n自动化科技有限公司\t350374\nENGLISH\t350375\ngarnet\t350376\n为民除害\t350377\n超过48小时\t350378\nbrookstone\t350379\n求救\t350380\n火山图\t350381\n马太\t350382\nmcm\t350383\n安图尼\t350384\n走一步再走一步\t350385\n简画\t350386\n即景\t350387\nqcn\t350388\n炫美\t350389\n麒麟舞\t350390\nshort\t350391\n天使纪元\t350392\n绝户\t350393\nBundle\t350394\n血徒\t350395\n3尺\t350396\n33\t350397\n碣石\t350398\n移动电商\t350399\n苏紫紫\t350400\n连任\t350401\nzhua\t350402\n大巴黎\t350403\n红色管弦乐队\t350404\n火箭军\t350405\n29家\t350406\n火影忍者壁纸\t350407\n玉米赤霉烯酮\t350408\n轩逸论坛\t350409\n日本馆\t350410\nel-tabs\t350411\n微循环\t350412\n迂回\t350413\n孙国峰\t350414\n控制不住\t350415\n转正率\t350416\n东阳金报_金华日报\t350417\n杨宁\t350418\n第134章\t350419\n高压电\t350420\nOde\t350421\n塑性铰\t350422\n民师\t350423\nlean\t350424\n罪犯\t350425\n万平\t350426\n钱永健\t350427\nHttp协议\t350428\n67张\t350429\n|税务师考试\t350430\n72ce\t350431\n拳皇98终极之战ol吧\t350432\nSQLQuery\t350433\n瑞表\t350434\n永不复\t350435\n二三十\t350436\n琅琊新闻网\t350437\n买回\t350438\n绵长\t350439\n陈越\t350440\n层林尽染\t350441\nmef\t350442\n势力范围\t350443\nListBox\t350444\ndc\t350445\n优+\t350446\n数胎动\t350447\n广东省新闻出版广电局\t350448\n专刊\t350449\nn10\t350450\n冢本\t350451\n框体\t350452\n上海人力资源管理师\t350453\n477\t350454\n180406\t350455\n中评\t350456\nrestful接口\t350457\n太湖\t350458\n静心\t350459\n汾酒\t350460\n刘桥\t350461\n科帕奇\t350462\n尊严\t350463\nCutter\t350464\n集团公司\t350465\n生么\t350466\n瘦身汤\t350467\n叙利亚国家电视台\t350468\n【卷\t350469\n受过\t350470\n我们仨\t350471\n应读\t350472\n简答\t350473\n25.in\t350474\n诛仙青云志\t350475\n丁二烯\t350476\n色库\t350477\n格林春天\t350478\nAnarchy\t350479\n范可新\t350480\nflannel\t350481\n尼康d310
0\t350482\n五道\t350483\n第【\t350484\n0.0.0\t350485\n黄姑鱼\t350486\n朱雨辰\t350487\nChuck\t350488\n20160327\t350489\n联机\t350490\nVIE\t350491\nEMD\t350492\n绝地求生红点\t350493\nLibrary\t350494\nReact-Native系列\t350495\n电圆\t350496\n聊胜于无\t350497\n实方\t350498\n鬼物\t350499\n2018年04月03日\t350500\n京广\t350501\n放置奇兵\t350502\n借据\t350503\n藤虎\t350504\nFall\t350505\n回声嘹亮\t350506\n鱼皮\t350507\ntucao\t350508\n五彩缤纷\t350509\n最古老\t350510\n狸窝转换器\t350511\nlatent\t350512\n观陵山\t350513\nv7\t350514\n苏州科技大学\t350515\navid\t350516\n百度营销大学\t350517\n四平方\t350518\n马尚\t350519\n女官\t350520\n预埋管\t350521\n杰茜\t350522\n3514\t350523\n苏东\t350524\n琪琪影院_琪琪电影天堂\t350525\n引水\t350526\n儒森\t350527\n黄豆面\t350528\nqq达人\t350529\n摩托迷\t350530\n雷凌185T\t350531\n1861年\t350532\n云贵\t350533\n乐华欢乐世界\t350534\nArk\t350535\n乱葬岗\t350536\n12321\t350537\n保荐人考试\t350538\n变径管\t350539\n蛇蛋\t350540\nGID\t350541\n德芙\t350542\n18049期\t350543\n科锐\t350544\n今日关\t350545\n每三个月\t350546\nGFJL\t350547\n怎么么\t350548\nSpring、Spring\t350549\n188号段\t350550\n马月\t350551\n马线\t350552\n异梦阁\t350553\n宇洋\t350554\n革命烈士陵园\t350555\nHan\t350556\n撸啊撸\t350557\n交法\t350558\n生管物控网\t350559\n水文水资源信息网\t350560\n坦荡荡\t350561\n残乳\t350562\n厨邦\t350563\n交通线\t350564\n驴肉馆\t350565\n大别\t350566\nes浏览器\t350567\n7051\t350568\n表_大学生考试网\t350569\n射频电路\t350570\n青蓝工程\t350571\n高鑫广场\t350572\nband\t350573\n基金赎回\t350574\n起亚K5论坛_汽车之家论坛\t350575\njam\t350576\n阴毒\t350577\n谷风\t350578\nprohibit\t350579\n重庆农村商业银行\t350580\n拧开\t350581\nkylin\t350582\n洋老外\t350583\n群架\t350584\n明军\t350585\n奉承\t350586\n550ti\t350587\n托儿\t350588\n深信服\t350589\niKuai爱快流控路由\t350590\n啤酒业\t350591\n余气\t350592\n刘义庆\t350593\n中国共产党第十八届中央委员会\t350594\nwsh\t350595\n净度\t350596\n青枫\t350597\n古方\t350598\ntoDataURL\t350599\nWinEdt\t350600\n音字\t350601\n环带\t350602\n12593\t350603\nsublime\t350604\n明股实债\t350605\n20151004\t350606\n通液\t350607\n流行歌\t350608\nachartengine\t350609\n南沙客运港\t350610\nprimeng\t350611\nDTE\t350612\n小米粒\t350613\nBATHING\t350614\n香港快运航空\t350615\nmodi\t350616\n怪奇物语\t350617\n唐毅\t350618\n老詹\t350619\n华钰矿业\t350620\n圆场\t350621\n20题\t350622\n冠词\t35062
3\n脱色\t350624\n堤身\t350625\n12股\t350626\nVue-Router\t350627\n述德述廉\t350628\n2014年终\t350629\nusb调试\t350630\n肥红瘦\t350631\n平光镜\t350632\n妇科病\t350633\nUnsatis\t350634\n晶玉\t350635\nf542\t350636\nsteamid\t350637\n刘明\t350638\n金锣湾\t350639\n杏石口路\t350640\n瓜娃\t350641\n红光路\t350642\n巴黎凯旋门\t350643\nBlessed\t350644\n直推\t350645\nDL360\t350646\n作品集\t350647\n集合论\t350648\n满淫\t350649\n盘条\t350650\n大叔们\t350651\n仰泳\t350652\n灵婆\t350653\n一了百\t350654\n官网站\t350655\n深圳市人才交流服务中心\t350656\n倪萍容国团\t350657\n气馁\t350658\n叶帅\t350659\nxt\t350660\n管器\t350661\n篠\t350662\n列子\t350663\n379号\t350664\nCSS3\t350665\n淘股神\t350666\n1286\t350667\n平舆\t350668\n菏泽市安监局\t350669\n4集\t350670\n真题库\t350671\n张郃\t350672\n刘轩豪\t350673\n琴书\t350674\n1.0.1.0\t350675\n天樾\t350676\nLUTS\t350677\n枣庄日报\t350678\n指示液\t350679\n独行客\t350680\nunlikely\t350681\n林应武\t350682\n软媒\t350683\n矢野浩二\t350684\n赤橙黄绿青蓝紫\t350685\nskechup\t350686\n制动装置\t350687\n备件\t350688\n败火\t350689\n垛子\t350690\n玻镁防火板\t350691\njoi\t350692\n镰仓\t350693\n天竺桂\t350694\n久保\t350695\n颈链\t350696\nzhanghao\t350697\n小福子\t350698\nrichedit\t350699\n配置器\t350700\n2.98\t350701\n单电\t350702\n桐乡火车站\t350703\n赵峥\t350704\nqy\t350705\n38g\t350706\n古尔丹\t350707\nBrook\t350708\n柳下\t350709\n神经母细胞瘤\t350710\n万欣\t350711\nunixODBC\t350712\nMedela\t350713\nProPlus\t350714\nPXW-FS7\t350715\n雷锋纪念馆\t350716\n600万套\t350717\n太平寺\t350718\n黄连\t350719\n李燕\t350720\nroman\t350721\n赫曼\t350722\nNico\t350723\n黄金日内\t350724\n种子袋\t350725\n钢管壁\t350726\n偏关\t350727\nfaint\t350728\n美的电压力锅\t350729\nプレス\t350730\n520分\t350731\n解放军\t350732\n爐\t350733\n自然\t350734\nSEOWHY\t350735\nPredictive\t350736\n外战\t350737\n欧珀莱\t350738\n甜点师\t350739\n空气断路器\t350740\n要知道\t350741\n社工中国网\t350742\n叶某\t350743\n易迅网\t350744\n智德\t350745\n前地\t350746\n杨怡\t350747\n病者\t350748\n思想准备\t350749\n几百年前\t350750\nthave\t350751\n推酷\t350752\n三资\t350753\n上海人大\t350754\n浅草寺\t350755\n误碰\t350756\n好儿\t350757\n山东师范大学附属中学\t350758\nsmashing\t350759\npyqt5\t350760\nCHA\t350761\n姚壮宪\t350762\n邻水县\t350763\n180马力\t350764\n3000家\t350765\n汤勺\t350766\nGXP\t350767\n托姆\t350768\n剪角\t350769
\n消费券\t350770\n纪宝成\t350771\n煮蛋器\t350772\n三氯乙烷\t350773\ndevice\t350774\n13.04\t350775\n贝甜\t350776\n尸人\t350777\n青岛银行\t350778\n油电\t350779\n金碧花园\t350780\n混沌之子\t350781\n第31关\t350782\n发生地\t350783\n切勿\t350784\n本地包\t350785\n样品盒\t350786\n亿利\t350787\n国调\t350788\nSIM卡槽\t350789\n20170105\t350790\n上海人才\t350791\n面上\t350792\n扑克游戏\t350793\n基辛格\t350794\nLinkinStar\t350795\n流动分析仪\t350796\n78_\t350797\n库值\t350798\n西安工程大学\t350799\n不入虎穴\t350800\n十副\t350801\na77\t350802\n0723\t350803\n天狮集团\t350804\n笔调\t350805\nssh\t350806\n66995000\t350807\nswift4.0\t350808\n法兰片\t350809\n热插拔\t350810\n六十五\t350811\n防毒墙\t350812\n中大童\t350813\n妇科学\t350814\n抗弯刚度\t350815\nvue路由传参\t350816\n资格生\t350817\n1万条\t350818\n其然\t350819\n第7版\t350820\n会计与税务处理\t350821\nCrossFit\t350822\ninfi\t350823\nPairs\t350824\n演讲集\t350825\n红酒店\t350826\nRihanna\t350827\n104.3\t350828\n统方\t350829\n夭寿\t350830\n视觉感\t350831\n东券\t350832\n排油丸\t350833\n古惑仔2之猛龙过江\t350834\n大清一統志\t350835\n加班车\t350836\n金属热处理\t350837\n藤井雪莉\t350838\n港西镇\t350839\n良心话\t350840\n湘声报社\t350841\n新邦物流\t350842\n女優\t350843\nAIG\t350844\n爱情电影网\t350845\n考察记\t350846\n半幅\t350847\n你丫闭嘴\t350848\n造船\t350849\n防弹玻璃\t350850\n寄存\t350851\n0123456789\t350852\n榨油机\t350853\nHD-MKV\t350854\n涨跌幅限制\t350855\n施工缝\t350856\n聚贸\t350857\n光明城站\t350858\n无懈可击\t350859\nloca\t350860\nrainmeter\t350861\n失禁\t350862\n2018.4.19\t350863\n瞳仁\t350864\n寻回\t350865\n三六一度\t350866\n番号族\t350867\n绿叶集团\t350868\n系统集成网\t350869\n南通汽车站\t350870\nt\t350871\n无忧代笔网\t350872\n教科书式\t350873\n狂躁\t350874\n调度台\t350875\n艾灸盒\t350876\n命年\t350877\nkites\t350878\n非礼\t350879\n天盘\t350880\n明源云客\t350881\n孙琦\t350882\n刘小平\t350883\n婆\t350884\n极太\t350885\n鬼马天师\t350886\n千手观音\t350887\n卞和\t350888\n跳跳蛋\t350889\n核税\t350890\n百度影音_播放器\t350891\n192.168.16.1\t350892\n己任\t350893\nxiaohui\t350894\n于凤\t350895\n审批处\t350896\n云南大学研究生院\t350897\n血意\t350898\nhorizontal\t350899\n上海而立环保科技有限公司\t350900\n灯板\t350901\n刊本\t350902\n散华\t350903\n原位癌\t350904\n发条魔灵\t350905\n频偏\t350906\ndosing\t350907\n红方\t350908\n麦格霍斯\t350909\n芸豆\t350910\nsuperga\t350911\n0311-85127050\t350912\
n小米MAX2\t350913\n畫\t350914\n印信\t350915\nQQ五笔\t350916\n银行号\t350917\n69号\t350918\n一巷\t350919\n三沙市\t350920\n止水片\t350921\n6730\t350922\n星梦偶像计划\t350923\nDisplayFusion\t350924\n靶纸\t350925\n河南省卫生计生委\t350926\n真朋友\t350927\nword档\t350928\nfires\t350929\n分支器\t350930\n300种\t350931\n20161119\t350932\n株洲南\t350933\n引闪器\t350934\n里维斯\t350935\nhashtag\t350936\nAutoCAD-三维网\t350937\nODP\t350938\n株洲市天元区人民政府\t350939\n音缘\t350940\n2313\t350941\n体体\t350942\n昆仑路\t350943\n体育之窗\t350944\n柔直\t350945\nKBOX\t350946\n州庆\t350947\nszzf\t350948\n折分\t350949\n阳光广场\t350950\n对抗性\t350951\n数码宝贝1\t350952\nphotoshopcs6\t350953\n印艺\t350954\n熱門\t350955\nScarlet\t350956\nMio\t350957\nPRIVATE\t350958\n天龙剑\t350959\n出租车费\t350960\n楢山节考\t350961\n陆家嘴\t350962\n泰陵\t350963\n阿左旗\t350964\n安徽省人社厅\t350965\n3dmax渲图\t350966\nshurufa\t350967\n自由席\t350968\nportable\t350969\nrig\t350970\n弱气\t350971\n涎\t350972\n雪松控股\t350973\n原型\t350974\n天猫精选\t350975\n拍卖场\t350976\n山东商报\t350977\nISLAND\t350978\nPostgreSQL\t350979\nAutoCad\t350980\n龙溪村\t350981\n浦东新区人民政府\t350982\n筹码\t350983\n上海市延安中学\t350984\n苍梧\t350985\n爬虫之爬\t350986\n曲园\t350987\n科学数据管理办法\t350988\n超凡传\t350989\n空气质量检测仪\t350990\n射影\t350991\n肃正\t350992\n典雅\t350993\n舷号\t350994\nMAX\t350995\nhaxnt\t350996\n玛雅水公园\t350997\n双态\t350998\n梅河口市\t350999\n跑酷类\t351000\n7家\t351001\n不义\t351002\n大秀\t351003\n用笔\t351004\n100kpa\t351005\n二合一版\t351006\n人教版4\t351007\n数据结构c语言版\t351008\ncandidates\t351009\n黄岛汽车站\t351010\n血吸虫\t351011\n2.5美元\t351012\n尤溪\t351013\n松坂桃李\t351014\n云游网\t351015\n阿婴\t351016\nperiod\t351017\n水口镇\t351018\n格陵兰\t351019\nDying\t351020\n1589\t351021\nifream\t351022\nacad\t351023\nkeithley\t351024\n高徒\t351025\n叫苦连天\t351026\n制壶\t351027\n氧化钠\t351028\n特函\t351029\n金丝猫\t351030\n守望人妻\t351031\nPneumatic\t351032\n化物\t351033\nRME\t351034\nKies\t351035\n看谱\t351036\n嬲\t351037\nkaiyuan\t351038\n莫柔\t351039\nZedGraph\t351040\n经营性现金流\t351041\n胡颓子\t351042\n今夏\t351043\n做不做\t351044\n国日\t351045\n装盒\t351046\n不动摇\t351047\nfiil\t351048\n合格率\t351049\n康银阁\t351050\nvelite\t351051\n巴伐利亚\t351052\njavafx\t351053\nw33\t35
1054\n威利旺卡\t351055\n飞士\t351056\n消遣\t351057\n360搜索引擎\t351058\n中冶置业集团有限公司\t351059\nm227fdw\t351060\n柏塘镇\t351061\n一百层\t351062\nshameless\t351063\n借\t351064\n浑浊度\t351065\n留守\t351066\nh3c路由器\t351067\n定位球\t351068\n中学生\t351069\n可丽金\t351070\nTHINK\t351071\n果字\t351072\n隐蔽性\t351073\n百度云网盘/\t351074\n16颗\t351075\nTURN\t351076\n程序员们\t351077\n衬塑\t351078\np8h61\t351079\n二三百\t351080\n手架\t351081\nXAML\t351082\n绒绒\t351083\n怪鸭历险记\t351084\n10.8.3\t351085\n奉子成婚\t351086\n福利区\t351087\n磨皮\t351088\n知同创\t351089\n中原地区\t351090\n测谎仪\t351091\n八八战略\t351092\n依赖包\t351093\n上颌窦囊肿\t351094\n椟还珠\t351095\nspv公司\t351096\n大天使之剑h5\t351097\n微囊\t351098\n破裂机\t351099\n董坚\t351100\nqq斗地主\t351101\n天津医科大学眼科医院\t351102\n网上书\t351103\n济南市教育局\t351104\n拍案说法\t351105\n甲状腺功能减退症\t351106\nHD800S\t351107\n胁痛\t351108\n家政保姆\t351109\nToad\t351110\nOrphan\t351111\n提托\t351112\n内存盘\t351113\n中国中医科学院眼科医院\t351114\n彭澎\t351115\n济钢高中\t351116\n款式\t351117\nxfzy\t351118\nnebo\t351119\n新浪网\t351120\nfprint\t351121\n杨浩\t351122\n三岔\t351123\n溴化钠\t351124\n巨手\t351125\nBlacklist\t351126\n山水间\t351127\nGear\t351128\n头座\t351129\n华新水泥\t351130\n每日甘肃网\t351131\nHale\t351132\n东江湖\t351133\n二片式\t351134\n天马流星拳\t351135\nfloat64\t351136\n随缘\t351137\n红卫兵\t351138\n痞作\t351139\nSAP公司\t351140\n上海共青团\t351141\n美国谍梦\t351142\nNIMBUS\t351143\nmacps\t351144\n2厅\t351145\n万博manbetx\t351146\n镇江\t351147\n不亦说\t351148\nNewly\t351149\n多首\t351150\nomission\t351151\n记性\t351152\n华纳兄弟\t351153\n金华西\t351154\n死亡岛2\t351155\n仁美\t351156\n世纪加盟网\t351157\n沂蒙路\t351158\n罗京曹德旺\t351159\n连奏\t351160\n乖离性ma吧\t351161\n發现\t351162\n詹黑\t351163\n斜口\t351164\n佛山市市\t351165\n梁爽\t351166\n五号位\t351167\n图侦\t351168\n1700\t351169\nmacpro\t351170\n屯溪机场\t351171\n三水西南\t351172\ngopro5\t351173\n一汽丰田\t351174\n云印\t351175\n中庙\t351176\n怒火中烧\t351177\n68厘米\t351178\n会计从业资\t351179\n行讯金属加工网\t351180\nav棒\t351181\n呼吸器官\t351182\n左外\t351183\n山大附中\t351184\n第235集\t351185\n拆台\t351186\ndmu\t351187\n腌菜\t351188\n张蕊\t351189\nwavelength\t351190\n安理会常任理事国\t351191\n员工资\t351192\n剪毛\t351193\ninserted\t351194\nLauncher3\t351195\n达克莱伊\t351196\nc
ats战车吧\t351197\n美迪惠尔\t351198\n举行\t351199\n千倍百倍\t351200\n杨清柠\t351201\n冬吴相对论\t351202\n小四月\t351203\n福建农信社\t351204\n黄景瑜\t351205\n这点事\t351206\n專注\t351207\n转基因大米\t351208\n畅捷通T3\t351209\n闻邦椿\t351210\n刘兰\t351211\n老伙计\t351212\n张俊文\t351213\n南山文化旅游区\t351214\n执迷不悟\t351215\n鱼杆\t351216\nMACOS\t351217\n新西兰元\t351218\n印标\t351219\n金蝶国际\t351220\n本溪市人力资源和社会保障局\t351221\n罗曼\t351222\n花梨\t351223\n南桥汽车站\t351224\nChannels\t351225\nlighting\t351226\n幸福伤风素\t351227\n人和网\t351228\nLiberation\t351229\n互利\t351230\ned2000.com\t351231\n天津人民医院\t351232\n钱江世纪城\t351233\n新城控股集团股份有限公司\t351234\nl\t351235\n实创装饰\t351236\n情意思思\t351237\n365天\t351238\nCompiling\t351239\n12款\t351240\n17年底\t351241\nsqlloader\t351242\n福特新福克斯\t351243\n猪排饭\t351244\n东京食尸鬼吧_\t351245\n荼毒\t351246\n城镇化率\t351247\nbehavioral\t351248\nwww.hao123.com\t351249\n鄂北\t351250\n撒币\t351251\n卖水\t351252\n甲城\t351253\n宝象金融\t351254\n凌晨2点\t351255\n中红网\t351256\n神奇宝贝百科\t351257\n李一帆\t351258\nAkiho\t351259\n玲玲\t351260\n程林\t351261\n突破\t351262\n海曙新闻网\t351263\n0集\t351264\n梦想海贼王\t351265\n换空\t351266\n24幅\t351267\n叶华\t351268\n系统版\t351269\n人工3吧\t351270\n广东省国土资源厅\t351271\n肇事方\t351272\n黔灵公园\t351273\n大开\t351274\n殷商\t351275\n最终幻想13雷霆\t351276\n2018.3.5\t351277\nstudio3.0\t351278\n美刀\t351279\n碧桂园集团\t351280\n采棉机\t351281\n美纹纸\t351282\n腰穿\t351283\n赵玉宝\t351284\n幽居\t351285\n无言以对\t351286\n3.2版\t351287\n道藏\t351288\n长沙长海医院\t351289\n开远市\t351290\n班线\t351291\n俩个\t351292\nnudes\t351293\n02世界杯\t351294\n武师\t351295\n62964393\t351296\n陶行\t351297\n每一步\t351298\n维科\t351299\nBoot应用\t351300\n郴州西站\t351301\n上海中共一大会址\t351302\n宋金\t351303\n苹果越狱吧\t351304\nicleaner\t351305\n编辑框\t351306\n苯硼酸\t351307\n北京酒店\t351308\n风灵月影\t351309\n智宝\t351310\n金棕\t351311\n运输车\t351312\n李舒\t351313\n油烟味\t351314\n渡轮\t351315\n芭提雅\t351316\nsolaris10\t351317\n邮管局\t351318\n妹子\t351319\n龙归\t351320\n7月1日前\t351321\n恒热\t351322\n能效\t351323\n分器\t351324\n气动系统\t351325\n奥森公园\t351326\n网络交换机\t351327\n檐\t351328\n沃尔玛超市\t351329\n中航油\t351330\n氧化型\t351331\n拆机乐\t351332\n凯翼\t351333\n微商货源网\t351334\n新亚驾校\t351335\n10ms\t351336\n滘口\t351337\ntranslation
\t351338\n别傻傻\t351339\n加油机\t351340\nfset-337\t351341\nlse\t351342\n80904577\t351343\n德州科技职业学院\t351344\n你方\t351345\nASCE\t351346\n王申\t351347\n红唇膏\t351348\n草籽\t351349\nSABER\t351350\n摩天轮票务\t351351\n册府元龟\t351352\n泰温\t351353\n现代电子技术\t351354\n上差\t351355\nBell\t351356\nanimal\t351357\nKite\t351358\n淘宝快递\t351359\n护卫犬\t351360\n骚\t351361\n五公里\t351362\n160号\t351363\n凯蒂·佩里\t351364\n拖拉机\t351365\n7550\t351366\n抹杀\t351367\n卡尼曼\t351368\nflyme7\t351369\nreferer\t351370\n读书卡\t351371\n1.8g\t351372\n洪宇\t351373\n民乐\t351374\n洛川县人民政府\t351375\n克隆版\t351376\n夏迎春\t351377\n1000美元\t351378\n为实\t351379\n倒烟\t351380\n陆定昊\t351381\nWechat\t351382\n33hao\t351383\n玉溪日报\t351384\n文\t351385\n足球史\t351386\nWaste\t351387\nNepal\t351388\n舌炎\t351389\n湖南大学》\t351390\n井陉矿区\t351391\n推荐文\t351392\n一百次\t351393\nReFa\t351394\n王金宝\t351395\n吩咐\t351396\n耳洞\t351397\n红岗区\t351398\n关岭县人民政府\t351399\n倒下\t351400\ndifferences\t351401\n介子\t351402\n欧时力\t351403\n台州中学\t351404\n演讯网\t351405\n北京市人力资源和社会保障局\t351406\nprocast\t351407\n采气\t351408\n云南省安全生产监督管理局\t351409\n耍无赖\t351410\nmeterpreter\t351411\nMINI\t351412\n清洁仪\t351413\n邵飘萍\t351414\nxinet\t351415\n潘涛\t351416\n抗原性\t351417\n中央统战工作会议\t351418\n股本溢价\t351419\n板形\t351420\n舌尖上的中国2\t351421\n林采宜\t351422\n编剧\t351423\nAV番号网\t351424\n刘娜\t351425\n舌奴\t351426\n梦舒雅\t351427\n水果歌\t351428\niris\t351429\n八部半\t351430\n覆盖\t351431\n丁西林\t351432\n奥迪RS7\t351433\nMeter\t351434\n54分钟\t351435\n九千岁\t351436\n奥德赛论坛_汽车之家论坛\t351437\n宫如敏\t351438\n沉默权\t351439\n费正清\t351440\n刘佳妮\t351441\n马山镇\t351442\n64365\t351443\nsuy\t351444\n6台\t351445\n祝宝良\t351446\nlomo\t351447\n屋脊\t351448\nIqos\t351449\n珍贵\t351450\n盛和资源\t351451\n芹沢\t351452\n津味\t351453\n猿课\t351454\nCCTV-5_央视网\t351455\n蛋花状\t351456\nmetals\t351457\n硌\t351458\n牛子\t351459\n本西蒙斯\t351460\n取\t351461\n维尔戈\t351462\n柯云路\t351463\n十四阙\t351464\n威语\t351465\n十六分之一\t351466\n胶体金法\t351467\n14小时\t351468\n22\t351469\n个性签名\t351470\n碳酸锌\t351471\n雨型\t351472\n释文\t351473\n中山中路\t351474\n0矩阵\t351475\n弹簧力\t351476\n马自达6论坛_汽车之家论坛\t351477\n长红\t351478\n翻边\t351479\n元若蓝\t351480\n尺幅\t351481\n旅游者\t3
51482\n割草\t351483\n雪耻\t351484\nt25\t351485\n2018年2月14日\t351486\n交易类\t351487\n魅族5\t351488\nPHPChina\t351489\nAutoConfig\t351490\n脾肾两虚\t351491\n阿弟\t351492\n乡下人\t351493\n罪化\t351494\n哈尔滨医科大学附属第二医院\t351495\n中科院自动化研究所\t351496\n真我\t351497\n仁和药房网\t351498\n母恩\t351499\n密歇根州立大学\t351500\n建企\t351501\n最高速度\t351502\n麻江\t351503\n易店\t351504\n外向型\t351505\n鱼种\t351506\nOptical\t351507\n大幅度\t351508\nエル\t351509\n直肠脱垂\t351510\n殷敏\t351511\n暗雷\t351512\n2016年7月份\t351513\naqq\t351514\n背压机组\t351515\n宇桐非\t351516\n河流\t351517\n百度地图marker\t351518\n相对集中行政许可权\t351519\n真八岐大蛇\t351520\n独权\t351521\n葛亮\t351522\n幻乐\t351523\n固定电话\t351524\nMoonlight\t351525\nPostMessage\t351526\n南方科技大学\t351527\n0417\t351528\n普通高等学校学生管理规定\t351529\n安徽少年儿童出版社\t351530\n87号\t351531\n兑换卷\t351532\namd处理器\t351533\nnerdtree\t351534\nG510\t351535\ndialogs\t351536\n楼塔镇\t351537\n2018-03-18\t351538\n张湾镇\t351539\n安邦定国\t351540\n深圳中心城\t351541\n左手右手\t351542\nCollegiate\t351543\nTeamSII\t351544\n佳诺\t351545\n盈余公积金\t351546\n徐东大街\t351547\n细菌性食物中毒\t351548\n刚\t351549\n抑扬\t351550\n260元\t351551\nvdd\t351552\n巴学园\t351553\n航油\t351554\n本月内\t351555\n_型\t351556\n翼龙贷\t351557\n千影浏览器\t351558\n菲亚特克莱斯勒\t351559\n第1关\t351560\n门门\t351561\n歌颂\t351562\ntimertask\t351563\n家具有限公司\t351564\n2.53\t351565\n校服\t351566\n点点客\t351567\n续航长\t351568\n皇阿玛\t351569\n2017年11月16日\t351570\n椰子壳\t351571\n75mm\t351572\nvive\t351573\n一劳永逸\t351574\nincarnation\t351575\n开装\t351576\n浜崎\t351577\n得分\t351578\n性早熟\t351579\n中国防城港_防城港政府网\t351580\n出房\t351581\n160级\t351582\n潮洲\t351583\n6.6.7\t351584\n生皮\t351585\n培智\t351586\n异质结\t351587\n外焦\t351588\nkoa2\t351589\n惊爆\t351590\n泰禾佛山\t351591\n管科\t351592\n7.2PTR\t351593\n镭雕\t351594\n川汇区\t351595\nMarquis\t351596\n往来港澳通行证\t351597\n长驱直入\t351598\n贤德\t351599\n霸王卸甲\t351600\nHEBE\t351601\n0913\t351602\n猫尿\t351603\n尤妮丝\t351604\n费霞\t351605\n煮雨\t351606\n莲藕排骨汤\t351607\nHim\t351608\n魔弹\t351609\nCNENA\t351610\nsystemctl\t351611\n机电工程管理\t351612\ndifferently\t351613\n小飞蛾\t351614\nATF\t351615\n西昊\t351616\n成都市\t351617\n王劲松\t351618\n酒池\t351619\n松江大学\t351620\n本溪农网\t35162
1\n兴办实业\t351622\n雅居乐\t351623\n赵小兵\t351624\n李枫\t351625\n龙卷风\t351626\n库存费\t351627\nEViews专\t351628\n教序\t351629\n边炉\t351630\n100多米\t351631\n中国古曲网\t351632\n开空\t351633\n伊冰\t351634\n等价物\t351635\n黑刃\t351636\nfpdf\t351637\n力球\t351638\n此项\t351639\n邯郸\t351640\n4.0L\t351641\n悍民\t351642\n张霞\t351643\nSOLAS\t351644\n青岛港\t351645\nHP\t351646\n92条\t351647\n适者生存\t351648\n绍兴市环境保护局\t351649\n蛋糕卷\t351650\n可去\t351651\n补分\t351652\n无人岛\t351653\n堂堂清\t351654\niS\t351655\n信长之野望\t351656\n粗糙度\t351657\n俄勒冈\t351658\n醉生梦死之湾仔之虎\t351659\n沙扬娜拉\t351660\n万般无奈\t351661\nenzyme\t351662\n性爱派对\t351663\n2017年8月5日\t351664\nGSON\t351665\ndrum\t351666\nstono\t351667\n小小英雄传2\t351668\n樱花樱花想见你\t351669\n万托林\t351670\n比斯特\t351671\n发胖\t351672\n仙桃房网\t351673\n黛莱美\t351674\n天平女\t351675\n净流入\t351676\n程波\t351677\n6批次\t351678\nHql\t351679\n长驱\t351680\n张志文\t351681\nFFD\t351682\n羽丝绒\t351683\n十全大补\t351684\n慕情\t351685\ncleo\t351686\nFrancesco\t351687\ndylan\t351688\nassigning\t351689\n归属地\t351690\n金华站\t351691\n戴高乐\t351692\n长城C50\t351693\n总包方\t351694\n枪机\t351695\n贝希摩斯\t351696\nvir\t351697\n绒棉\t351698\n落伴\t351699\n王者荣耀老夫子\t351700\n坑底\t351701\n2018年1月10日\t351702\n胡豆\t351703\n徐鹏飞\t351704\n海淀区委\t351705\n成败论\t351706\n决不放弃\t351707\n铃木里美\t351708\n广西农村信用社\t351709\n擦地\t351710\n山海经\t351711\n北京城市副中心\t351712\nBon\t351713\n阿德福韦酯\t351714\n320国道\t351715\n秋名山\t351716\n赵冬梅\t351717\nnat123端口映射内网穿透\t351718\n飞书\t351719\n橡皮膜\t351720\n卡套式管接头\t351721\n三棱柱\t351722\n欢威\t351723\n罗小白\t351724\n一晃就老了\t351725\n龙牙战术\t351726\n长乐路\t351727\n夏侯渊\t351728\n兽王猎\t351729\n韦雪\t351730\n暴走日报\t351731\n羊村\t351732\n腾讯影业\t351733\nncm\t351734\n梦幻西游无双版\t351735\n1.25倍\t351736\n圣陶\t351737\n月老庙\t351738\n国有企\t351739\n创业街\t351740\n叶永青\t351741\n慢慢慢慢\t351742\n村落\t351743\n重庆商务职业学院\t351744\n越过山丘\t351745\n噬血细胞综合征\t351746\n草叶集\t351747\n蔓越莓饼干\t351748\n各地级市\t351749\n四柱液压机\t351750\n公共维修基金\t351751\n1.56\t351752\n少林寺传奇\t351753\n一声响\t351754\n0112\t351755\n亲日\t351756\n大淀\t351757\n橡塑\t351758\n桃园街\t351759\n白屑\t351760\n主卫\t351761\n屈老师\t351762\n众安在线财产保险股份有限公司\t351763\n凯恩之角_暗黑破坏神3论坛\t351764\n51vv\t351
765\nCamera\t351766\n六便士\t351767\n优爱\t351768\nDDA\t351769\n嫦娥奔月\t351770\n日记本\t351771\n罗布麻茶\t351772\n副业\t351773\n成定局\t351774\n饥饿龙\t351775\n广告学概论\t351776\npt1000\t351777\n动漫展\t351778\n存款准备金\t351779\n18秒\t351780\n星城\t351781\n6.19\t351782\n导成\t351783\n2000平方\t351784\njqeruy\t351785\n世风\t351786\nflowable\t351787\nsmb\t351788\n埃默里大学\t351789\nM401d\t351790\n京新高速公路\t351791\n创新\t351792\n南京地铁11号线\t351793\n6.2%\t351794\n背扣\t351795\n香草园\t351796\n100天\t351797\n画门\t351798\n王奎\t351799\n日西\t351800\n自远\t351801\n集句\t351802\nmven\t351803\n播种期\t351804\n第6名\t351805\n多几\t351806\n服务之家网\t351807\n地屈孕酮\t351808\n五道口购物中心\t351809\n现金日\t351810\nepon\t351811\nc919\t351812\n望江台\t351813\n群山\t351814\n南塘镇\t351815\nRT-AC66U\t351816\n成都市委\t351817\n浴室柜\t351818\n玄慈\t351819\ntsum\t351820\n鞋口\t351821\n第39号\t351822\n爱我别走\t351823\nxz\t351824\n机明\t351825\n顶岗实习周记\t351826\n斜卧推\t351827\n高涨\t351828\n律己\t351829\n有几分\t351830\n节节高\t351831\n诗琳\t351832\n劫狱\t351833\n刺客荣耀荆轲\t351834\nhenri\t351835\nspock\t351836\n宇文怀\t351837\n滋盈\t351838\nzhengzhou\t351839\n40岁\t351840\nShort\t351841\nWhenIGrowUp\t351842\nSFGate\t351843\n麦积区人民政府\t351844\n深究\t351845\ncmmi3\t351846\n地氯雷他定\t351847\n没臊\t351848\n系统动力学\t351849\n其中\t351850\n大尖山\t351851\n滚\t351852\n1-5号\t351853\n居民消费价格指数\t351854\n纸板\t351855\n海投\t351856\n逐格\t351857\nWand\t351858\n魅蓝e3\t351859\n收趣\t351860\n彗星\t351861\n六日游\t351862\nusb口\t351863\nv1.9.0\t351864\nHIGH\t351865\n清州\t351866\n马赫雷斯\t351867\n桥东\t351868\n深圳新闻_南方网\t351869\n宏基客运站\t351870\n云南省人民政府办公厅\t351871\n陈寻\t351872\n6y30\t351873\ncrispy\t351874\nevery\t351875\n覃塘\t351876\n一地\t351877\n旮\t351878\n单性\t351879\n二队\t351880\n超音波\t351881\n做优\t351882\n凄凄\t351883\n6号位\t351884\n贵州省凯里市人民政府\t351885\nNa2SO3\t351886\n保温钉\t351887\n成家\t351888\n射手榜\t351889\n手冢国光\t351890\n百度圣卡\t351891\nHit\t351892\n哪_游侠网\t351893\n80多度\t351894\n鲍三娘\t351895\n跨学\t351896\nhoojo\t351897\n填土\t351898\n杭州大学\t351899\ntricky\t351900\n柱塞式\t351901\n三钢闽光\t351902\n科林环保\t351903\nヴ\t351904\nCCTV6高清\t351905\n濒海战斗舰\t351906\n奉新县\t351907\n艾米尔\t351908\n70余\t351909\n传美\t351910\n汉
钟\t351911\n35克\t351912\nmcc\t351913\n0129\t351914\n气门\t351915\n奥迪rs6\t351916\n蓝莓果干\t351917\n梦心玥\t351918\n画谜\t351919\n诸天黑暗王者\t351920\n陈水扁\t351921\n510路\t351922\n五个月\t351923\n蚁狮\t351924\n抽象主义\t351925\n摩诘\t351926\n蒋峰\t351927\n三百年\t351928\n毕马威华振会计师事务所\t351929\n应用数理统计\t351930\n吾恩\t351931\n渐近\t351932\nWAP\t351933\n成山镇\t351934\nPair\t351935\n正要\t351936\n通知单\t351937\n.doc\t351938\n苹果carplay\t351939\n闪翼\t351940\n永泰云顶\t351941\n挂包\t351942\nE580\t351943\n又拍\t351944\n3days\t351945\n凤凰火\t351946\n马龙县\t351947\n渔舟\t351948\n山东省图书馆\t351949\nlebron\t351950\n全国百强校\t351951\n潜艇战\t351952\n和悦\t351953\n河北省审计厅\t351954\n五一路\t351955\n能为\t351956\n惊涛骇浪\t351957\n黑山大峡谷\t351958\n启行\t351959\neaster\t351960\n小腹坠痛\t351961\n详实\t351962\n字典散文网\t351963\n脂秤\t351964\n相邻\t351965\n木西梧\t351966\nvplex\t351967\n7u\t351968\n枸骨\t351969\n3月25号\t351970\n850\t351971\n哈莫曼\t351972\n败血症\t351973\n东阳新闻网\t351974\n村晚\t351975\n阿里里\t351976\n毕业了\t351977\n塌方\t351978\n吉普自由光\t351979\n皇明\t351980\n45次\t351981\n愿不愿意\t351982\n段坤\t351983\nPainter\t351984\n黑糯米\t351985\n360急速浏览器\t351986\n安又琪\t351987\ncookies\t351988\nsufficient\t351989\n生长液\t351990\n传化集团\t351991\n中华见义勇为基金会\t351992\n卡丽\t351993\n濉溪县\t351994\nGM8\t351995\n私仇\t351996\ncomotomo\t351997\n一整页\t351998\n星海钢琴\t351999\nssh协议\t352000\n苏州工业园\t352001\n人才港\t352002\n景海鹏\t352003\n打虎拍蝇\t352004\n压分\t352005\n密西根\t352006\n飞客茶馆\t352007\n风筝简笔画\t352008\n50%|\t352009\n分彩\t352010\n梦中国梦\t352011\n论十大关系\t352012\nmarvelous\t352013\n费云帆\t352014\n太暗\t352015\n前列腺钙化\t352016\n汗马功劳\t352017\n23路\t352018\n云南省教育厅\t352019\n吻胸\t352020\n社会主义现代化\t352021\n2199元\t352022\n知可子伯母\t352023\n梁子\t352024\nBull\t352025\n错峰\t352026\niGame1060\t352027\nbind9\t352028\nxc\t352029\n徐志王思聪\t352030\n宋史\t352031\n一维数组\t352032\n海景酒店\t352033\n助农\t352034\nk100\t352035\n朝廷\t352036\n胸怀大志\t352037\n阿拉善\t352038\n雷登\t352039\n北京交通银行\t352040\n强心剂\t352041\n361°\t352042\n3D全息投影技术\t352043\n明窗\t352044\n集成显卡\t352045\n济南国际医学科学中心\t352046\n横岭\t352047\n雷箭\t352048\n江浙沪\t352049\n作主\t352050\n一句一\t352051\n360智能家居\t352052\nliveprofessor\t352053\n奶源\t352054\n
泽塔\t352055\n3公斤\t352056\n水解蛋白奶粉\t352057\n人教版8\t352058\n50S\t352059\n塞德娜\t352060\n王雁飞\t352061\n顺手牵羊\t352062\n四舍五入_\t352063\n靠靠\t352064\n法兰克福学派\t352065\n宝庆银楼\t352066\n旅发委\t352067\n坂崎由莉\t352068\n审读\t352069\nNerd\t352070\n单链表\t352071\n类集\t352072\n充血\t352073\n二胺\t352074\n书架式\t352075\n记过\t352076\n本土\t352077\n柯桥轻纺城\t352078\n空城泪\t352079\n差一点\t352080\n界王神\t352081\n六条\t352082\n圣女贞德\t352083\n高卷杏\t352084\n珠算\t352085\n世界文明史\t352086\nforit\t352087\n虚开增值税专用发票案\t352088\n阀组\t352089\n大刚\t352090\n区外\t352091\n泸\t352092\n娥眉\t352093\n高度评价\t352094\n何晏\t352095\n仕界\t352096\nsomewhere\t352097\n炁体源流\t352098\nこ\t352099\nfha\t352100\n1630\t352101\n祁隆\t352102\n咒\t352103\n刑者\t352104\n曾静\t352105\n东方智启科技\t352106\n河沙\t352107\n武汉邮电科学研究院\t352108\n2.93\t352109\n剑网3唐门\t352110\n于军\t352111\n债台高筑\t352112\n功能机\t352113\n萧平实\t352114\n颅内肿瘤\t352115\n乌龙网\t352116\n可能会\t352117\n羊祜\t352118\n20140425\t352119\nenvy\t352120\ndonna\t352121\n苯乙胺\t352122\n动荡不安\t352123\n范文大全/作文网\t352124\n正功\t352125\n16季\t352126\nprotrek\t352127\n唐颖\t352128\nmirrorop\t352129\n时刻\t352130\n艾氏\t352131\n欧朗\t352132\n叶倩\t352133\n洪雅县\t352134\nshangri\t352135\n滨州市教育局\t352136\n汇城\t352137\n于漪\t352138\n电解质分析仪\t352139\n拥军路\t352140\n蒸蛋\t352141\n化学键\t352142\n龙交所\t352143\n警民\t352144\nUI素材\t352145\n向量组a1\t352146\n辽墓\t352147\n叶孤城\t352148\n媚公卿\t352149\n新境\t352150\n海豹油\t352151\n化肥厂\t352152\n两张\t352153\n清明菜\t352154\n射出\t352155\n爵迹\t352156\nnba2k12\t352157\npurchasing\t352158\nOFweek锂电网\t352159\n锡杖\t352160\n张立鹏\t352161\n柚子茶\t352162\n创易\t352163\n配配\t352164\n富国\t352165\n赚钱宝\t352166\n交联\t352167\n阿扁\t352168\n星崎安里\t352169\n0837\t352170\nprimefaces\t352171\ndisposable\t352172\n一个6岁\t352173\n第17季\t352174\n排用\t352175\n走好\t352176\n光绪元宝库平七钱\t352177\nrbb\t352178\n神秘果\t352179\nMSP\t352180\n浮城大亨\t352181\nratelimit\t352182\n航拍机\t352183\n2008年度\t352184\n希尔顿花园酒店\t352185\nt61\t352186\nHiWiFi\t352187\ne550c\t352188\n白吃白喝\t352189\n金牌之路\t352190\n耦合性\t352191\n液力缓速器\t352192\n深圳美莱医疗美容医院\t352193\nKerbal\t352194\n建设工程规划许可证\t352195\n幼崽\t352196\n上海国际\t352197\n维西\t352198\n蓟\t352199\n长隆动物
园\t352200\n罗锐\t352201\n雲\t352202\nhyw\t352203\n很浪漫\t352204\n彭野\t352205\n冤鬼路\t352206\n龟波性功\t352207\n100多年\t352208\n亿博娱乐\t352209\ninsider\t352210\n所见所闻\t352211\nKronos\t352212\n大病\t352213\nday3\t352214\n禁猎\t352215\n陈朝\t352216\n贾谊\t352217\n麦道夫\t352218\nspacedesk\t352219\n金骏眉茶\t352220\n武汉理工大学图书馆\t352221\nNASTRAN\t352222\n中支\t352223\nsaline\t352224\n17张\t352225\n战备\t352226\n壁球\t352227\n孟夫子\t352228\n白桦\t352229\n奥罗金\t352230\nDhabi\t352231\n无痕客\t352232\n11股\t352233\n兰石集团\t352234\n1.18G\t352235\n北屯\t352236\n激素皮炎\t352237\n经海路\t352238\n林嘉\t352239\n两三个月\t352240\n杂忆\t352241\n国税增值税发票\t352242\nJCCAD\t352243\n张思忠\t352244\n瑠川莉娜\t352245\n艳奴\t352246\n腌咸鸭蛋\t352247\n星球大战7:原力觉醒\t352248\n郑容\t352249\n看笑\t352250\n珍珠港\t352251\n2.52\t352252\n公兴镇\t352253\n睫毛生长液\t352254\nbeatbox\t352255\n30平方公里\t352256\nImpedance\t352257\n醉玲珑\t352258\n1.56G\t352259\n天台县\t352260\n腌黄瓜\t352261\n北京急救中心\t352262\n孔龙\t352263\nzkui\t352264\n389号\t352265\n国家中长期教育改革和发展规划纲要\t352266\n死人庶女嫡妃\t352267\n简单版\t352268\n阿甘\t352269\nUltron\t352270\n复杂度\t352271\n捷德奥特曼剧场版\t352272\nlessc\t352273\n荣威择天记\t352274\n制造类\t352275\n白领公寓\t352276\n张稀哲\t352277\n佐木\t352278\n希芸\t352279\n普高\t352280\n砚池\t352281\n85岁\t352282\n电子人\t352283\n表明\t352284\n3015\t352285\n巴斯特\t352286\n网卡\t352287\n天假\t352288\n短柱\t352289\n阳光高考信息平台\t352290\nmaccms\t352291\n安装队\t352292\nlig\t352293\n饰板\t352294\n狸窝宝典\t352295\n邹邹\t352296\n毅\t352297\n亮\t352298\n世雅\t352299\n二甲基亚砜\t352300\n王东屈原\t352301\n耻\t352302\n软raid\t352303\n50ms\t352304\n超市化\t352305\n流火行者\t352306\nxbox360\t352307\n宗庆白夜行\t352308\n古墓丽影吧\t352309\n缘木求\t352310\nDbUtils\t352311\n精神奖\t352312\n5000000\t352313\n远洋未来广场\t352314\nLASER\t352315\n红糖发糕\t352316\nmingde\t352317\n年段\t352318\nmars\t352319\n白薇\t352320\n老好人\t352321\n武娘\t352322\n赞达拉\t352323\n无锡酒店\t352324\n尖山路\t352325\n表错\t352326\n有限差分法\t352327\n暗场\t352328\nNonlinear\t352329\n小米驱动\t352330\nplot函数\t352331\n宋康昊\t352332\n张全景\t352333\n三国乱世吧\t352334\n驱寒\t352335\n0.83%\t352336\n待\t352337\n服贸\t352338\n索立信\t352339\nowners\t352340\n每个人\t352341\n紧力\t352342\n疯狂的赛车\t352343\n株型\t3
52344\n59式坦克\t352345\n瓜点\t352346\nps套索工具\t352347\nncbi\t352348\n人教版二年级语文下册\t352349\n汪民安\t352350\n资本\t352351\n武维华\t352352\n5步\t352353\n金丝阁\t352354\n刚子\t352355\n徐子淇\t352356\n卡尔特\t352357\n擦干\t352358\nAMS\t352359\n多孔\t352360\nqueued\t352361\n力保\t352362\nPipe\t352363\n3周\t352364\ny400\t352365\n0x00000000\t352366\n罗德岛\t352367\n行好\t352368\n奥村\t352369\nCRLF\t352370\n倾盆\t352371\n成才之路\t352372\n得力计算器\t352373\n十字形\t352374\n第226集\t352375\n网关\t352376\n葛洪\t352377\n饭店业\t352378\n跨链\t352379\n美创\t352380\nstraight\t352381\n发表\t352382\n游牧\t352383\n新都\t352384\n中海达RTK\t352385\n聚成股份\t352386\n骁龙653\t352387\n可敬\t352388\n江南世家\t352389\nLeinov\t352390\nPyinstaller\t352391\n八仙\t352392\n餐食\t352393\n凤还巢\t352394\n怒汉\t352395\n晒胸\t352396\n80、90年代\t352397\n美骑\t352398\n德育基地\t352399\n绝对值编码器\t352400\n输导\t352401\n三百句\t352402\n中南百草园\t352403\n十六铺\t352404\n费尽\t352405\n压合\t352406\n蜂鸟\t352407\nLinux7\t352408\n2017年4月6日\t352409\n网闸\t352410\n在山的那边\t352411\n巴沙尔\t352412\n日租\t352413\n橡皮艇\t352414\n70吨\t352415\nmarsh\t352416\n党政机关公文处理工作条例\t352417\n144号\t352418\n海友酒店\t352419\nFAN\t352420\n小小姐\t352421\n一言一行\t352422\nvos\t352423\nmichael\t352424\n神段\t352425\nr8500\t352426\n财政赤字\t352427\n东野奥尼尔\t352428\n金红石\t352429\n冲澡\t352430\n码\t352431\n卡农头\t352432\n3C\t352433\n飞机头\t352434\n神乐千鹤\t352435\nentities\t352436\n第十二\t352437\nWebcam\t352438\n高拍仪\t352439\n金昌市人民政府\t352440\n移轴\t352441\n超短篇\t352442\n保养费\t352443\n姑嫂丸\t352444\n在理\t352445\n茉莉花开\t352446\n好榜样\t352447\n广汽本田汽车有限公司\t352448\nmpl\t352449\nECJia\t352450\nbrd\t352451\n3.x\t352452\n阿莫之家源码网\t352453\n24%\t352454\n三楼\t352455\n拉罐\t352456\n温湿\t352457\nbch\t352458\n刷脂\t352459\n放射状\t352460\n大陈村\t352461\nAC米兰\t352462\n淑妃\t352463\n悍威\t352464\n林风眠\t352465\n6011\t352466\n开打\t352467\n寸头\t352468\nsais\t352469\npictures\t352470\n经典语录\t352471\n韩国领事馆\t352472\n阿丽拉\t352473\n坂木\t352474\n田家庵之窗\t352475\n腥红之月\t352476\nmeth\t352477\n窃·格瓦拉\t352478\n孙嘉\t352479\nrestructuring\t352480\n专线路\t352481\n朱苏力\t352482\n法西斯主义\t352483\n画馆\t352484\n韦达\t352485\n袁咏琳\t352486\nDivas\t352487\n空调箱\t352488\n北京锐安科技有限公司\t35
2489\n一汽大众\t352490\n乐胥\t352491\n莲花县\t352492\n佐佐木小次郎\t352493\n费效\t352494\n脚墩\t352495\n脑颅\t352496\ns13\t352497\n山中湖\t352498\n围术期\t352499\n精装修\t352500\n精油\t352501\n求治\t352502\n658\t352503\n8}$\t352504\n一房一\t352505\n人际关系\t352506\n锐\t352507\n杨朱\t352508\n硝基苯\t352509\n七军\t352510\n辩证统一\t352511\n证明题\t352512\n洗浴地址\t352513\n神主\t352514\n图层\t352515\n骨髓纤维化\t352516\ncrcc\t352517\n双纵\t352518\n微警\t352519\nmeeting\t352520\n星曜\t352521\n惠比寿麝香葡萄\t352522\n上海复旦微电子集团股份有限公司\t352523\n普朗克常数\t352524\n冢不二\t352525\n南明街道\t352526\n吴海波\t352527\n投资心理学\t352528\n小确茶\t352529\n奇客\t352530\n中国节能环保集团公司\t352531\n坑娃\t352532\n薪级\t352533\n乌东德\t352534\n变电站\t352535\n行政管理部\t352536\n重庆志愿服务网\t352537\nReal-Time\t352538\n瀚海\t352539\nInterest\t352540\n一个框\t352541\n山雀科\t352542\n睛\t352543\n资粮\t352544\nexcel列\t352545\ni50mm\t352546\nconsle\t352547\n额济纳旗\t352548\n通讯展\t352549\n干戈\t352550\n晋安区\t352551\n每个星期\t352552\n周公子\t352553\n猎德大桥\t352554\n显圣\t352555\n麓湖\t352556\n派工\t352557\n房契\t352558\ns9plus\t352559\n150t\t352560\n瞎说\t352561\n苦心孤诣\t352562\n货币通论\t352563\nelly\t352564\n三明文明网\t352565\n镶件\t352566\n概念化\t352567\nDestiny\t352568\nproteins\t352569\n第56集\t352570\ncavas\t352571\n晓辉\t352572\n6.9b\t352573\nconti\t352574\n蓝猫龙骑团\t352575\n构造\t352576\n郦\t352577\n阑\t352578\n老陈醋\t352579\n河北华强科技开发有限公司\t352580\nPissing\t352581\n15MM\t352582\n帕玛氏\t352583\n绳子\t352584\nxiaochengxu\t352585\n沙池\t352586\n张祖林\t352587\n顾全大局\t352588\n雌雄同体\t352589\n小漫\t352590\n一切都好\t352591\n海黄\t352592\n张伯伦\t352593\n武岳峰\t352594\n不等式与不等式组\t352595\nshih\t352596\n智慧园\t352597\nkomodo\t352598\n建筑书店\t352599\nCleveland\t352600\n15.0.1\t352601\n安妮日记\t352602\n广域网\t352603\n炮团\t352604\n王世襄\t352605\n大块\t352606\nOrCad\t352607\n团史\t352608\nirf\t352609\n双子男\t352610\n铁血战士\t352611\n流韵\t352612\n浪莎\t352613\n2001\t352614\n美德威\t352615\n大连理工大学》\t352616\n张宏宇\t352617\n娱乐头条\t352618\n纵情欲海3\t352619\n另一个我\t352620\n2813\t352621\nCultural\t352622\n王OL\t352623\n继承案\t352624\n大召寺\t352625\n郭声琨\t352626\n苏波邦\t352627\nAPD\t352628\n三辰\t352629\n昏昏\t352630\n除螨\t352631\n红白机模拟器\t352632\n小兵\t352633\n拉框\t3526
34\n国民党\t352635\n元朝\t352636\n数据分析方法论\t352637\n肛泰\t352638\n国家现代农业示范区\t352639\n好带感\t352640\ntyzw\t352641\n正交表\t352642\n呦呦\t352643\n帽儿山\t352644\n米高林\t352645\n21日\t352646\n4.58\t352647\n机械工程师\t352648\n模拟机\t352649\n国家建筑工程质量监督检验中心\t352650\n平板机\t352651\nNW-WS413\t352652\n六腑\t352653\n騎\t352654\n位移传感器\t352655\n计量员\t352656\n最初的梦\t352657\nraphael\t352658\n姜世离\t352659\n6km\t352660\n变异体\t352661\n维基\t352662\npdfbox\t352663\ngrowing\t352664\n2018.3.8\t352665\n十进\t352666\n宿城在线\t352667\n192个\t352668\n2017年9月8日\t352669\nFirewallD\t352670\n鲷鱼\t352671\n北京回龙观医院\t352672\n色母片\t352673\n陈诗友\t352674\n堪萨斯大学\t352675\n眉脚\t352676\n三只\t352677\n新船\t352678\n久青草原\t352679\n三口\t352680\n篡位\t352681\n继电\t352682\n郭辉\t352683\npressed\t352684\n综测仪\t352685\n单项式\t352686\n4.29\t352687\n街头霸王\t352688\n暗藏\t352689\n第四段\t352690\n加井岛\t352691\n迪加奥特曼\t352692\n调函\t352693\n有些\t352694\n掌骨\t352695\n爆光\t352696\nhealing\t352697\n主办券商\t352698\n联通短信中心\t352699\n思蒙\t352700\n阿花花酱\t352701\ntaiyuan\t352702\nc套\t352703\n3.11\t352704\n游匣7000\t352705\n挂机宝\t352706\n小钢\t352707\n每边\t352708\n鹤庆\t352709\n坚信\t352710\n3DsMax2015\t352711\n野子\t352712\n三星盖乐世\t352713\n金山嘴渔村\t352714\n江阴日报数字报\t352715\n云网客\t352716\n5100\t352717\n昆山驾校\t352718\nD750\t352719\nDOOOOR\t352720\n115盘\t352721\n乌贼\t352722\nSultan\t352723\n极限挑战\t352724\n五亩\t352725\n港剧\t352726\nsoleil\t352727\n自决\t352728\n罐头\t352729\nfMRI\t352730\n乙烯基硅油\t352731\n在湖边\t352732\n万能解码器\t352733\n24日\t352734\n国际标准\t352735\n麦冬\t352736\n12瓶\t352737\n喀土穆\t352738\nProcedure\t352739\n北海福成机场\t352740\n_蔚蓝留学网\t352741\n王志平\t352742\nINTO\t352743\n天鉴\t352744\n新堂\t352745\n菲利普\t352746\ninspur\t352747\n玻璃窗\t352748\n房顶\t352749\n止盈止损\t352750\n方可\t352751\n教师资格考试网\t352752\n心得\t352753\n养分\t352754\n黑龙江东方学院\t352755\nservo\t352756\n脚趾\t352757\n导线框\t352758\n1.62\t352759\niis管理器\t352760\n喷涂机\t352761\n徐昕\t352762\n罗布泊\t352763\nangelica\t352764\n13.2\t352765\n出埃及记\t352766\n8017\t352767\n南山牧场\t352768\n爬壁\t352769\n奇迹暖暖\t352770\n光明日报出版社\t352771\n长沙机场\t352772\n陈黎明\t352773\nv8.5\t352774\n普照寺\t352775\nShawn\t352776\n劳务报酬所得\t352777
\n梦醒\t352778\n高顶\t352779\n秋枫\t352780\n仪表板\t352781\n末车\t352782\ncommencement\t352783\n地狱之火\t352784\n风马\t352785\n出口货物报关单\t352786\n飞絮\t352787\ndatapicker\t352788\n旺庄\t352789\n百乐满热水器\t352790\n腔体\t352791\ntablesorter\t352792\n啁啾\t352793\nFoxmail\t352794\n亲切感\t352795\nadonit\t352796\n8月30日\t352797\ney\t352798\nTours\t352799\nipcam\t352800\ndrp\t352801\necco\t352802\n接地变压器\t352803\n开资\t352804\nchinanet\t352805\n脑洞大爆炸\t352806\n无错字\t352807\n鬼灵\t352808\n岔口\t352809\n李惠利\t352810\nFRAMEWORK\t352811\n公主港\t352812\nSketchUp\t352813\n粗大\t352814\n阿勒颇\t352815\n盖叫\t352816\nzipfile\t352817\n等一等\t352818\n警邮\t352819\n罗村\t352820\n六六\t352821\n脉证\t352822\nNTI\t352823\n坟头\t352824\n平特一肖\t352825\nCASIO卡西欧\t352826\ndv6\t352827\n欧盟委员会\t352828\n中国国际航空公司\t352829\npd\t352830\n索多玛\t352831\n保险公司\t352832\nai3\t352833\n邻二甲苯\t352834\nZBrush\t352835\n豪利花园\t352836\n内服\t352837\n侦探小说吧\t352838\n百国\t352839\n熟睡\t352840\nT480s\t352841\n河南云台山\t352842\n郭荣\t352843\n走走走\t352844\n恶臭\t352845\n进校\t352846\n中川美铃\t352847\nPLC/自动化/工控\t352848\npsvr\t352849\n柯山\t352850\n交联聚乙烯\t352851\n其七\t352852\n歇歇\t352853\n闪舞小说网\t352854\npygal\t352855\n黄丹\t352856\n网易加速器\t352857\n尊然\t352858\nchoi\t352859\n浙江股权交易中心\t352860\nafe\t352861\n阿米纳\t352862\n高\t352863\n鱼袋\t352864\n茶味\t352865\n金行\t352866\npjg\t352867\n运发\t352868\n八寨沟\t352869\n更好听\t352870\n上地医院\t352871\n工作周\t352872\nadjective\t352873\n民族自治\t352874\n77W\t352875\n解脲支原体\t352876\n套袖\t352877\n氛\t352878\n顺丰快递\t352879\n新技术\t352880\n急走\t352881\ngtj\t352882\n诸葛长青\t352883\n农委\t352884\n几十遍\t352885\n采购入库单\t352886\n综合色\t352887\n1161\t352888\n安徽省交通运输厅\t352889\n扮\t352890\n中国农资网\t352891\n首尔明洞\t352892\n万山红遍\t352893\nfreq\t352894\n新举措\t352895\n入微\t352896\nnana\t352897\n嘉里\t352898\n毕节市\t352899\nkimmy\t352900\n场外市场\t352901\n内蒙古人大\t352902\n数百个\t352903\ncsrss\t352904\n自由度\t352905\n辛集\t352906\nsyntax\t352907\n内弦\t352908\n孙富春\t352909\n守望先锋猎空\t352910\n南召县人民政府\t352911\n诸葛玥\t352912\n内蒙古党委\t352913\n出版商\t352914\n货叉\t352915\n摩尔龙\t352916\n机敏\t352917\nDOM4J\t352918\n食器\t352919\nBANG\t352920\n可乐\t352921\nR400\t352922\
n0731新房网-0731房产网\t352923\n贾岛\t352924\n地坑\t352925\n运动户\t352926\n穷屌\t352927\n七老\t352928\n赏乐\t352929\n几十份\t352930\nシンクロニシティ\t352931\n招标代理\t352932\nhostus\t352933\nLD\t352934\n6075\t352935\n铁皮人\t352936\n同德乡\t352937\n8.8\t352938\n壬戌日\t352939\nattr\t352940\nMagiCAD\t352941\n历史观\t352942\n恒大地产集团有限公司\t352943\n44届\t352944\n黄冈360\t352945\n大艺树\t352946\n万盛股份\t352947\n房地产市场分析\t352948\nwtl\t352949\n春风拂槛露华浓\t352950\nLiteOS\t352951\n培训机\t352952\nHD高清\t352953\n30年前\t352954\n江苏科技厅\t352955\n奇美拉\t352956\n5e\t352957\nkenyon\t352958\npinggu\t352959\n迷龙\t352960\ncausality\t352961\n防爆车\t352962\n拉郎\t352963\n康伟\t352964\n阿希币\t352965\n正弦波逆变器\t352966\n费神\t352967\nRocketDock\t352968\n三板斧\t352969\n万润科技\t352970\n五色糯米饭\t352971\n轰击\t352972\n洛伦佐\t352973\nGreeting\t352974\n两套\t352975\n惨绝\t352976\n寒魔影\t352977\n客户群\t352978\n双边贸易额\t352979\n投资性\t352980\nEMC\t352981\nkee\t352982\n塑料制\t352983\n跨贸\t352984\n法证先锋3\t352985\n早餐机\t352986\n母联\t352987\n非现场\t352988\n真实身份\t352989\n尊主\t352990\n0米\t352991\n偏远地区\t352992\n蒙冤\t352993\nvold\t352994\n几米\t352995\n静乐县\t352996\n65折\t352997\n许渊冲\t352998\n别杀\t352999\ncommissioning\t353000\n腿痛\t353001\nSEIKO\t353002\n考略\t353003\n奥尔良烤鸡\t353004\n三峡枝江网\t353005\n郭敬明\t353006\n民事诉讼法学\t353007\n不死鸟\t353008\nAbsorption\t353009\nqualcomm\t353010\ntry\t353011\n金茂\t353012\n1075\t353013\nGs\t353014\n精神科\t353015\n马苏\t353016\n北京盘古七星酒店\t353017\nADN\t353018\n迷香药\t353019\n兽\t353020\n李文强\t353021\nAbuse\t353022\n三人成虎\t353023\n丰田柯斯达\t353024\nav动漫网\t353025\nmarilyn\t353026\nCHARACTER\t353027\n签约仪\t353028\n营业外收入\t353029\n传奇奔驰\t353030\n宋晨\t353031\n一网\t353032\nmc11\t353033\n旋转\t353034\n活性物\t353035\n法律条文\t353036\nDistinct\t353037\niPad2018\t353038\n特别奖\t353039\n培训界\t353040\n皮革厂\t353041\n保险箱\t353042\n张朋\t353043\nadaboost\t353044\n六里桥东\t353045\nTape\t353046\nworking-dog\t353047\n吹气式\t353048\n杀手4\t353049\n龙谷湾\t353050\nvbs吧_\t353051\n热导率\t353052\n2154\t353053\n被窝电影网\t353054\nexhentai\t353055\n党的十八届四中全会\t353056\n烘焙食品\t353057\n诺基亚9\t353058\nDiscriminant\t353059\nbcprov-jdk\t353060\n淹没式\t353061\n南阳之窗\t353062\n北工大
\t353063\n邓稼孙杨\t353064\n诚信\t353065\n王昌\t353066\n意识到\t353067\n管架\t353068\n真灵\t353069\n露比\t353070\n违约金\t353071\n选择框\t353072\n氯乙醇\t353073\n锦瑟华年\t353074\n旭辉控股集团\t353075\n中国精算师协会\t353076\n晶体振荡器\t353077\n敌客\t353078\n电影感\t353079\n活动程序\t353080\n亡羊\t353081\n十二罗汉\t353082\n第十五季\t353083\nKMP算法\t353084\n散记\t353085\n龙溪路\t353086\nInorganic\t353087\n75级\t353088\nAssemblies\t353089\n球面镜\t353090\n生态日\t353091\n费钱\t353092\n大熊座\t353093\nexceed\t353094\n董事会议案\t353095\n灵镜\t353096\n谭天星\t353097\n美尔凯特\t353098\n此行\t353099\n中国行业协会\t353100\nroutes\t353101\n吴可熙\t353102\nv2.26\t353103\n城北新区\t353104\n7000米\t353105\n裸奶\t353106\n小巴\t353107\n事业单位登记管理网\t353108\n硝酸甘油\t353109\n开环\t353110\n喜阴\t353111\n程序册\t353112\n喵娘\t353113\npopin\t353114\n小沢\t353115\nXPEL\t353116\nL形\t353117\n葆婴\t353118\n隐球菌\t353119\n智尚\t353120\n警告语\t353121\n估计\t353122\n配音表\t353123\nzbrush4r7\t353124\n停飞\t353125\n检漏仪\t353126\n苏华\t353127\n250g\t353128\n地泵\t353129\n鱼\t353130\nneighbours\t353131\nUplay\t353132\n小白条\t353133\nninjia\t353134\n十六字诀\t353135\n太平洋游戏\t353136\ningredients\t353137\n用友nc\t353138\n8TB\t353139\n新奥尔良\t353140\n离体培养\t353141\n电影篇\t353142\ncapillary\t353143\n莘塍街道\t353144\n南海子公园\t353145\n私募登记\t353146\n尖头\t353147\nUbud\t353148\nMT_九游论坛\t353149\nす\t353150\n造梦西游1\t353151\n青光眼\t353152\nTLScontact\t353153\n冒险岛风灵\t353154\n梨云\t353155\nkrpano\t353156\n别无选择\t353157\n大都市\t353158\n掌厨\t353159\n李三娘\t353160\n牛耕\t353161\n丝路楼兰网\t353162\njmail\t353163\n杨颖\t353164\n易老\t353165\n制药类\t353166\n石墨垫片\t353167\n5.6.38\t353168\nmeter\t353169\nIT百汇网\t353170\n鲁豫有约\t353171\nsome\t353172\n滁州中学\t353173\n金融法\t353174\n白首乌\t353175\n兽灵\t353176\n同济大学学报\t353177\nXander\t353178\n金湖路\t353179\n悬疑志\t353180\n腹股\t353181\n十六岁\t353182\nlofi\t353183\n神兽金刚之超变星甲\t353184\n55g\t353185\n神笔\t353186\n卖给\t353187\n阿弥陀\t353188\ntx\t353189\n挤塑板\t353190\nFLAC分轨\t353191\ned\t353192\n腹产\t353193\n王子様\t353194\n白皇\t353195\n王菁华\t353196\nnewera\t353197\n中式台球\t353198\n摸金校尉\t353199\n雕梁画栋\t353200\n2017-03\t353201\n市城乡规划局\t353202\n绝字\t353203\n申述\t353204\ninvitrogen\t353205\n两指\t353206\n朱一龙\t3
53207\n研报_全景网\t353208\n23吨\t353209\nリン\t353210\n舌段\t353211\n二师兄\t353212\n铁血\t353213\n不以\t353214\n心宽\t353215\n普契尼\t353216\n黄肉\t353217\n惨遇\t353218\n抬头\t353219\n长安大学研究生院\t353220\n赣菜\t353221\n强女\t353222\n臂环\t353223\n电视果\t353224\n维也纳爱乐乐团\t353225\n接收管\t353226\n处罚书\t353227\n曾卓\t353228\n75首\t353229\n噬魂师\t353230\ncodegen\t353231\n寒梅\t353232\n走穴\t353233\n梏\t353234\n天地万物\t353235\n方兴东\t353236\n带回\t353237\n东北制药\t353238\n非常6+1\t353239\nns3\t353240\n小便池\t353241\n曼巴蛇\t353242\n荣安地产\t353243\n通钢\t353244\n第三十八条\t353245\nkwan\t353246\n体艺节\t353247\n杜泊羊\t353248\nDL388\t353249\n射洪\t353250\ncds\t353251\n呆头\t353252\n黄云\t353253\n注浆\t353254\n疏勒县\t353255\n水仓\t353256\n汉之殇全面战争\t353257\n索氏提取法\t353258\n锐步\t353259\n芥花油\t353260\n留学生活\t353261\n五矿稀土\t353262\n墨镜\t353263\n沈勇\t353264\ncreed\t353265\n老俞\t353266\n怎么话\t353267\n高尔夫嘉旅\t353268\nSR式\t353269\n彩照\t353270\n熔剂\t353271\n案件\t353272\n3.86\t353273\n首飞\t353274\n建筑抗震设计规范\t353275\ngradlew\t353276\n字诀\t353277\nstrcmp函数\t353278\n布欧\t353279\n红斑点\t353280\nnotifier\t353281\n5K\t353282\n林凤\t353283\n脐疝\t353284\nSP5.0\t353285\n锦屏县\t353286\n说到底\t353287\n洪洞县\t353288\nCXH99\t353289\n阿苏纳\t353290\n酸曲\t353291\n2017-07\t353292\n点面\t353293\n最火爆\t353294\n席琳·迪翁\t353295\n中能\t353296\nSubscriptions\t353297\nVid\t353298\n北京印刷厂\t353299\n日租房\t353300\njsf\t353301\n42寸\t353302\n站仪\t353303\n武船\t353304\nX20A\t353305\n宋七月\t353306\n4.6.4\t353307\n海峡新干线\t353308\n洛阳人才网\t353309\n13W\t353310\nxVideos\t353311\nclaymore\t353312\n冷却器\t353313\nopc\t353314\nyaffs2\t353315\n小方子\t353316\n澄空学园\t353317\nstetho\t353318\n社会保障网\t353319\n大头针\t353320\nraymond\t353321\n614\t353322\n任逍遥\t353323\n中成药\t353324\nDefFoundError\t353325\n高脂\t353326\nF460\t353327\n周成刚\t353328\n科颜氏牛油果\t353329\npoison\t353330\n顾稚子\t353331\nS+\t353332\n提尔比茨\t353333\n一者\t353334\n透明售房网\t353335\n宁波市机关事务管理局\t353336\n前旗\t353337\n选频\t353338\nPCS7\t353339\n曲靖市人民政府\t353340\n古朗月行\t353341\n5031\t353342\n沙拉碗\t353343\n91考试网\t353344\n番号网\t353345\n这一片\t353346\n胡诌\t353347\n煽风点火\t353348\nCron表达式详解\t353349\n通源\t353350\n致命弱点\t353351\n本溪水洞\t353352\n白客
\t353353\n滨河新城\t353354\nUWP\t353355\n第84号\t353356\n聚乙烯板\t353357\n一铭\t353358\n中国黄金\t353359\nmilk\t353360\n徐沛东\t353361\n诸葛南派三叔\t353362\n别处\t353363\n地藏孝亲网\t353364\n36期\t353365\n3代\t353366\nlsu\t353367\nThroughput\t353368\n马店镇\t353369\n天善\t353370\n勃兰登堡门\t353371\n13套\t353372\n弗里达\t353373\n歌诗图\t353374\n卖拐\t353375\n微计算机\t353376\n艺坛\t353377\n炫笔\t353378\n8765\t353379\n阿库娅\t353380\n罗星\t353381\nFAB\t353382\nreplaceall\t353383\n阳光充足\t353384\n樱之花\t353385\n冰点脱毛\t353386\n紧抱\t353387\n9981\t353388\n男方\t353389\n套单\t353390\npathology\t353391\n深圳市汇川技术股份有限公司\t353392\n总干事\t353393\n血吼\t353394\n宣传页\t353395\nasleep\t353396\n欢乐斗\t353397\n为人子\t353398\n艺术片\t353399\nweget\t353400\n三代机\t353401\n新鸿基地产\t353402\n字典\t353403\n中国中档酒店\t353404\n致命魔术\t353405\n中国石油勘探开发研究院\t353406\n可编程逻辑-与非网\t353407\n嘉兴南站\t353408\n海南软件职业技术学院\t353409\n保靖黄金茶\t353410\n沉尸\t353411\n四连败\t353412\n三吱儿\t353413\n中山市\t353414\n八卦图\t353415\n静态展\t353416\n嫁妆\t353417\ndisorder\t353418\n插体\t353419\n1.46\t353420\n肝内胆管结石\t353421\ngbl\t353422\n习惯\t353423\n电子类\t353424\n650路\t353425\n核桃乳\t353426\nColbie\t353427\nYum\t353428\n萧山农商银行\t353429\nRegulator\t353430\n直角坐标系\t353431\n真爱至上\t353432\n晋教版\t353433\nwipo\t353434\n李继红\t353435\nuseful\t353436\n悦庭\t353437\n苏州体育中心\t353438\n张国平\t353439\nhttpPost\t353440\n帮定\t353441\n接待处\t353442\n剑雪\t353443\n仿制\t353444\n销案\t353445\n拌和物\t353446\n小米净水器\t353447\n陈扬\t353448\n被推翻\t353449\ndds\t353450\njsj\t353451\n东北虎\t353452\n聚居\t353453\n小时代3\t353454\n优号\t353455\nModelMap\t353456\nkika\t353457\n公模\t353458\nDHA\t353459\n处理\t353460\n赖账\t353461\n12.0.18\t353462\n共赏\t353463\nWIndows\t353464\n魅蓝note\t353465\n雕刻师\t353466\n必填项\t353467\n小鲜\t353468\nchongqing\t353469\n20150422\t353470\n异噻唑啉酮\t353471\nMurakami\t353472\nペ\t353473\n建阳市人民政府\t353474\n商经法\t353475\n刑部\t353476\n青田县政府\t353477\nMeadow\t353478\n财社\t353479\nmbo\t353480\n东京食尸鬼第三季先睹为快\t353481\n参股公司\t353482\n瓦格里\t353483\nExhibitor\t353484\n双栏\t353485\n师语\t353486\n半个月\t353487\n厨房垃圾处理器\t353488\n上海新天地\t353489\n下摆臂\t353490\nShared\t353491\n好工作\t353492\n300位\t353493\nGTAV\t353494\n
率直\t353495\n分车\t353496\n大武口\t353497\n日式\t353498\n漾濞\t353499\n北京首旅集团\t353500\n心舞\t353501\n阳光国际\t353502\n中国文艺网\t353503\n左膝\t353504\n洪洞大槐树\t353505\nTriathlon\t353506\n125个\t353507\n明道\t353508\n裕固族\t353509\nocpc\t353510\n朵唯\t353511\nAssistant\t353512\nEMO\t353513\n动物简笔画\t353514\n鱼尾狮\t353515\nlili\t353516\nABP\t353517\n开关\t353518\n柯兰\t353519\n中英公学\t353520\n五村\t353521\n农口\t353522\n连动\t353523\nEM5\t353524\n李兰\t353525\n萝铃\t353526\n王集镇\t353527\n行政事业单位内部控制规范\t353528\n四川广电\t353529\n富集\t353530\n江科\t353531\n贵阳职业技术学院\t353532\n艾糍\t353533\n阻生齿\t353534\n烟罩\t353535\n骨头社\t353536\n5.45\t353537\n土木学院\t353538\n彻彻底底\t353539\n圣莱达\t353540\nMessages\t353541\n18183\t353542\n20本\t353543\n荆轲刺秦王\t353544\n位数\t353545\n小米放大器\t353546\n考量\t353547\n铜陵北路\t353548\n诚园\t353549\n漩涡\t353550\n贷款基准利率表\t353551\nIP地址段\t353552\n渡己\t353553\n深圳装饰公司\t353554\nyOmega\t353555\n景谷\t353556\n邦盛\t353557\nyearly\t353558\n转入\t353559\n宋词三百首\t353560\n哈继不能说的秘密\t353561\n伊斯\t353562\n手拉葫芦\t353563\n萨米\t353564\n北塘\t353565\n双飞六日游\t353566\n手游\t353567\nDue\t353568\n雀尾\t353569\n多态性\t353570\nf390\t353571\n王恺\t353572\nPenis\t353573\n馒头\t353574\n新朝\t353575\n感透\t353576\n狗皮\t353577\n妇联通\t353578\n110伏\t353579\n7000多万\t353580\n图形界限\t353581\ncass9.0\t353582\n方宁\t353583\n民盛金科\t353584\nLibrehat\t353585\nS13\t353586\n富豪\t353587\n南丰城\t353588\neMMC\t353589\nhus\t353590\n多选框\t353591\n芦花\t353592\n空射\t353593\nMOBA类\t353594\n煤炭股\t353595\n小剧场\t353596\n中控锁\t353597\n考资\t353598\n普阳街\t353599\n南吉\t353600\n铁十字勋章\t353601\n银雁\t353602\nNotePad\t353603\n倒车灯\t353604\n做操\t353605\n纠错码\t353606\nswufe\t353607\n肉锥\t353608\n习近平用典\t353609\n切割板\t353610\n方方\t353611\n走马上任\t353612\n霍山黄芽\t353613\n追捧\t353614\n具足\t353615\n运城市盐湖区\t353616\n八款\t353617\n青盲\t353618\n王兴\t353619\n德贤\t353620\n铜仁南\t353621\n中骏云景台\t353622\n﹍\t353623\ncompletely\t353624\n李吉\t353625\n7倍\t353626\n上海上小学\t353627\n荧光素\t353628\n广汽吉奥\t353629\n激吻\t353630\n桜井\t353631\n岳明辉\t353632\n海南省委\t353633\n智合\t353634\n明清家具\t353635\n纳入党\t353636\n共同犯罪\t353637\n和谐号动车组\t353638\n两分频\t353639\n进项\t353640\nladp\t353641\n女职工劳动保护特别规定\t3
53642\n无线网\t353643\n毫升\t353644\n十杀\t353645\n文工\t353646\n和田玉枣\t353647\n网络经纪人\t353648\n军酒\t353649\nn6pro\t353650\n16公斤\t353651\n中国人电影中国人\t353652\n严浩翔\t353653\nSVP\t353654\n大星\t353655\nNBR\t353656\nmobil\t353657\n弹水\t353658\n美尤利娅\t353659\n下衣\t353660\njavap\t353661\n双丰收\t353662\nrocketmq\t353663\n31000\t353664\n日系\t353665\n苏州市财政局\t353666\nBlondes\t353667\n五塘\t353668\n选车\t353669\n16格\t353670\n奥妮克希亚\t353671\n实功\t353672\n空之轨迹fc\t353673\n第四项\t353674\n狻猊\t353675\n白线流\t353676\noffie\t353677\nLintCode\t353678\n5.5m\t353679\n母版页\t353680\n5632263\t353681\nfantastic\t353682\n客死\t353683\n教主\t353684\n新竹\t353685\n黑龙江省工业和信息化委员会\t353686\nVRay\t353687\n各家\t353688\n息心\t353689\n酒糟鼻\t353690\nTL-WDR5600\t353691\n长通物流\t353692\n见日\t353693\n133路\t353694\n求学网\t353695\n拜泉法院网\t353696\n山西晚报数字报\t353697\n面字\t353698\n龙山区\t353699\nMvc\t353700\n比较级\t353701\n蝴蝶\t353702\n7RD\t353703\nangular-cli\t353704\n江通\t353705\n年宝玉\t353706\n樱桃核\t353707\n百只\t353708\n回光返照\t353709\n满破\t353710\n毯子\t353711\n反斜\t353712\n屈辱史\t353713\n大连万达广场\t353714\n201803\t353715\nRhapsody\t353716\n残篇\t353717\nWorkBench\t353718\n伯特\t353719\n白板\t353720\n薛飞\t353721\n会诊\t353722\nX32\t353723\n万源城\t353724\n白蒲镇\t353725\n特战英雄\t353726\n执子之手与子偕老\t353727\n7042\t353728\n莱阳梨\t353729\n动漫版\t353730\n紧固胶\t353731\nremix\t353732\n北京沙河\t353733\n拿卡\t353734\nr6900\t353735\nword2018\t353736\n末影\t353737\n台中市\t353738\n盐水泡\t353739\n思乡\t353740\nArbitration\t353741\n鹿港\t353742\n云篆山\t353743\n代理记账公司\t353744\n三清山机场\t353745\n北京市县\t353746\n潘岳\t353747\n2星\t353748\n2018年5月份\t353749\n中国工程机械工业协会\t353750\n茂南\t353751\n君威\t353752\nwaste\t353753\nW10\t353754\n荣威550\t353755\n540P\t353756\n苏区\t353757\ne联\t353758\n及早\t353759\n准绳\t353760\nSR\t353761\n华都\t353762\n第21届\t353763\nSmartisan\t353764\n奥菲斯\t353765\n教练场\t353766\n宠信\t353767\n康明\t353768\n蜜瓜\t353769\n抹茶拿铁\t353770\n三角臂\t353771\n朱迅\t353772\n初禾\t353773\nUniCareer\t353774\n泡汤\t353775\n死人楚汉风云\t353776\n江苏省通信管理局\t353777\nGPON\t353778\nAdminLTE\t353779\nJX3剑网3\t353780\n巴西花梨\t353781\n7628\t353782\n咪\t353783\ndepart\t353784\nfety\t3537
85\n1423\t353786\nK2p\t353787\n病字头\t353788\n杉本博司\t353789\n环北\t353790\n0453\t353791\npsh\t353792\n煤制乙二醇\t353793\n1672\t353794\n1883年\t353795\n北京城市学院\t353796\n久之洋\t353797\n伊藤忠\t353798\n层门\t353799\n全营\t353800\n京东食尸鬼\t353801\n四元\t353802\n移动4g飞享\t353803\nyosemite\t353804\n太行路\t353805\n圣母\t353806\n鞋垫\t353807\nunit3\t353808\n潍坊护理职业学院\t353809\n招妓\t353810\n18.1\t353811\n不懂撒娇的女人\t353812\n克氏\t353813\n吉林警察学院\t353814\n14881\t353815\nautodesk\t353816\n重新启动\t353817\n沙场\t353818\n小苮儿\t353819\n无汗\t353820\njinx\t353821\n收税\t353822\n乐优优\t353823\n多帅\t353824\n同理\t353825\n错不了\t353826\n信报\t353827\n东芝2303a\t353828\n挺柱\t353829\n麻药\t353830\nSumif函数\t353831\n飞马国际\t353832\n蛇眼\t353833\n马钱子\t353834\n溅射\t353835\n陈修园\t353836\n本钢集团\t353837\n易语言多线程\t353838\nCEP\t353839\n遗\t353840\n4居\t353841\n慈善家\t353842\n2017年10月21日\t353843\n串串香店\t353844\n桃花镇\t353845\nknx\t353846\n皂基\t353847\nimprovements\t353848\n凯运\t353849\n旺福\t353850\nAES算法\t353851\nf50\t353852\n漂唇\t353853\nbouncycastle\t353854\n南沙街道\t353855\n老字号\t353856\n裸睡\t353857\n48cm\t353858\nSATA硬盘\t353859\n勾起\t353860\n牛儿\t353861\n兽医师\t353862\n昌盛路\t353863\ncharging\t353864\nequalization\t353865\n20150208\t353866\n1万2\t353867\n乐邦\t353868\n无人车\t353869\nalden\t353870\n三尊\t353871\n欢\t353872\n咚鼓\t353873\nHisense\t353874\n常明\t353875\n富春江小三峡\t353876\nquinny\t353877\n720p.HD\t353878\n20150518\t353879\n上海市政工程设计研究总院\t353880\n苹果6sPlus\t353881\n启蒙班\t353882\n8150\t353883\n断崖\t353884\n杆法\t353885\n海南省统计局\t353886\nNIS\t353887\n凯迪拉克XT5论坛论坛\t353888\n新浪山西新闻_新浪\t353889\n上海国税\t353890\n火绳\t353891\n莳萝\t353892\n二季度\t353893\n如家快捷酒店\t353894\nPOSSIBLY\t353895\n中坝\t353896\n俩边\t353897\ndoo\t353898\n中华田园犬\t353899\n木柜\t353900\n冥\t353901\n主美\t353902\nHomes\t353903\n雾灵山\t353904\n试译\t353905\nxml文\t353906\n西安交通大学医学部\t353907\n小百科\t353908\n东门街道\t353909\n马自达汽车\t353910\n上郡\t353911\n防辐射眼镜\t353912\n放没有\t353913\n不可逆性\t353914\n结构力学求解器\t353915\n内蒙古财经学院\t353916\n中草药\t353917\n郑健\t353918\n遮幅\t353919\n小麦粉\t353920\nDistagon\t353921\n南锣鼓巷\t353922\n阅览器\t353923\n平面\t353924\nTARGET\t353925\n洗冤录2\t353926\njuli
et\t353927\n吴莹\t353928\nKuKu\t353929\n增订\t353930\n副部\t353931\nca125\t353932\n苏念\t353933\n1776年\t353934\n书荒\t353935\n装设\t353936\n王宏伟\t353937\n留仙大道\t353938\n云3d模型\t353939\n太极熊猫3\t353940\n20161002\t353941\n盱眙县\t353942\n核试\t353943\n五百丁\t353944\n东升西落\t353945\n人口抽样调查\t353946\n瘟热\t353947\n龙头鱼\t353948\n红火蚁\t353949\n石家庄地铁3号线\t353950\n特仑苏\t353951\n美冠\t353952\n飞米\t353953\n多胞胎\t353954\n水浒后传\t353955\n北大青鸟\t353956\n造口袋\t353957\n西安三桥\t353958\n尖沙嘴\t353959\n镇人大\t353960\n土甲\t353961\nabc类\t353962\nServing\t353963\n沙\t353964\n京香\t353965\nIMAGINE\t353966\n定单\t353967\n売春\t353968\nJamie\t353969\n保持者\t353970\n60多万\t353971\n德国大使馆\t353972\n报杂谈\t353973\n九_\t353974\n圆弧\t353975\n逍遥生\t353976\n管理咨询公司\t353977\n腻子机\t353978\ndelayed\t353979\nLAY\t353980\n惠山古镇\t353981\n向斜\t353982\n盛筵\t353983\n中石油\t353984\n陈宇\t353985\n乐美雅\t353986\nOutcome\t353987\n16页\t353988\ngba4\t353989\n谦信君\t353990\n蝙蝠衫\t353991\nKinetis\t353992\n浙江金融职业学院\t353993\nDomination\t353994\n杜鹃花科\t353995\n寻隐者不遇\t353996\n杨宏\t353997\nEXCEl\t353998\n尿骚\t353999\n尖轨\t354000\n入定\t354001\n中国人\t354002\n血花\t354003\ntyt\t354004\nXn\t354005\n成都路桥\t354006\n预售期\t354007\n宽衣\t354008\n书籍\t354009\n男淫\t354010\n国家安全生产监督管理总局\t354011\n关节型\t354012\nTakes\t354013\ntmx\t354014\n557_\t354015\n解放\t354016\n报馆\t354017\n江西省人民政府\t354018\n魔兽3冰封王座\t354019\n河南自贸区\t354020\n汉光武帝\t354021\n简·奥斯汀\t354022\nDriven\t354023\n红圈\t354024\n中腰\t354025\n我的世界虚无世界\t354026\n绣像\t354027\n苏子诺\t354028\n2017-12-27\t354029\n东丽区\t354030\n缎面\t354031\n350米\t354032\n3D打印模型\t354033\n香波地群岛\t354034\n100毫克\t354035\n亲家\t354036\nLOL心情杂谈_52PK英雄联盟\t354037\n无损转换\t354038\n哈德斯菲尔德\t354039\n阳水\t354040\n公司制\t354041\n摔角狂热34\t354042\n千岛湖\t354043\nNumberBox\t354044\n奖品\t354045\n撤档\t354046\n牛街\t354047\n离屏\t354048\n_\t354049\n传送机\t354050\n奥尔夫音乐\t354051\n圆章\t354052\nformulas\t354053\nTensorRT\t354054\n幸福的家\t354055\n命名\t354056\n龙之梦购物中心\t354057\n呼吸机\t354058\n比尔东邪西毒\t354059\nsgx\t354060\n深圳广告公司\t354061\n未见\t354062\n风力机\t354063\n半成品\t354064\n人女\t354065\nFading\t354066\n孟买\t354067\n芈璃\t354068\n陆军勤务学院\t354069\n平凡之路\t354070\
n黑糖水\t354071\n42.0\t354072\n折枝\t354073\n费城76人队\t354074\n讲授\t354075\n宁波国税\t354076\n再生网\t354077\nnodename\t354078\n北京市住房和城乡建设委员会\t354079\n52pk游戏网\t354080\n电教馆\t354081\n王雪梅\t354082\ncrazyCodeLove\t354083\nConsumption\t354084\n魏都\t354085\n松皮石\t354086\n名仕\t354087\n魔域单机版\t354088\nrecognition\t354089\n精炼\t354090\nVersion\t354091\ndota2资\t354092\n流中\t354093\n安特卫普\t354094\n远期外汇交易\t354095\n三垦\t354096\n讲病\t354097\n豆腐西施\t354098\n过窗\t354099\n虹桥火车站\t354100\n助工证\t354101\n交河故城\t354102\nVB6.0\t354103\nExpansion\t354104\n柒零\t354105\n人才库\t354106\n头手\t354107\n邦尼兔\t354108\ntracert\t354109\n浑圆桩\t354110\nhmwz\t354111\nounce\t354112\n中国吊顶网\t354113\n游动\t354114\n鹰牌\t354115\n易修\t354116\n翔田千里\t354117\n8329\t354118\n乐感\t354119\n预言师\t354120\nnative-swiper\t354121\n郑建华\t354122\n中国联通沃\t354123\n感受\t354124\n笛福\t354125\n踏青\t354126\nloveflying\t354127\nECM\t354128\nラッキ\t354129\n上海市临港地区开发建设管理委员会\t354130\nappsetting\t354131\n爬犁\t354132\n模板王\t354133\n探针\t354134\n方子传\t354135\n陈虎平\t354136\nE7440\t354137\n拍戏\t354138\n首档\t354139\n三相三线制\t354140\n四川理工学院\t354141\n天正CAD\t354142\n2018年04月25日\t354143\n若尔盖花湖\t354144\n洛口\t354145\n面盘\t354146\ncaoliu2017\t354147\n福山雅治\t354148\nArduino论坛—DF创客\t354149\n中国电视艺术家协会\t354150\nNicki\t354151\n哈根达斯\t354152\n法医狂妃\t354153\n徐四\t354154\n泥浆\t354155\n背底\t354156\n中国石化集团\t354157\n千钰\t354158\nnomad\t354159\n广州分行\t354160\n法律援\t354161\n昇\t354162\n反间\t354163\n竹荪\t354164\n狗证\t354165\ntalking\t354166\nv4.4.2\t354167\n炫富\t354168\n﹒\t354169\nTwinCAT\t354170\nbaojia\t354171\n3902元\t354172\n傅振邦\t354173\n地界\t354174\n稀料\t354175\n裸导线\t354176\n死缠烂打\t354177\n老攻\t354178\n传令兵\t354179\n新洲区\t354180\n荔浦县政府网\t354181\nVPS服务器\t354182\n乌兰察布市财政局\t354183\n度日如年\t354184\n曹敏\t354185\n联想笔记本电脑\t354186\nㄧ\t354187\n自然之力\t354188\nqx\t354189\n通才\t354190\npgyer\t354191\n克霉唑阴道片\t354192\n金华二中\t354193\n怪物猎人3\t354194\n鹊踏枝\t354195\nhydroxide\t354196\n刘小龙\t354197\ntp\t354198\n手办\t354199\n心世界\t354200\n古剑奇谭2吧\t354201\n苏发\t354202\n口语交际\t354203\nAlexa排名\t354204\nhugo2\t354205\n281号\t354206\n゜\t354207\nOSPREY\t354208\n57次\t354209\n
environments\t354210\n随借随\t354211\n韩城矿业公司\t354212\n七夕节\t354213\n缩着\t354214\n火焰纹章回声:另一个英雄王\t354215\nchaoren\t354216\n雪莲花\t354217\n大会堂\t354218\n凶巴巴\t354219\n鲁班路\t354220\n羟基喹啉\t354221\n感控\t354222\n李嘉万\t354223\nChinadaily\t354224\n渭南国家高新技术产业开发区\t354225\n来电\t354226\n微信最强弹一弹\t354227\n真心\t354228\n红外遥控器\t354229\n2月19\t354230\n焦土\t354231\n冻干机\t354232\n战神之路ol吧\t354233\n宠爱\t354234\n角斗场\t354235\nha\t354236\n汉语拼音学习网\t354237\n长寿乡\t354238\n重庆市财政局\t354239\nSASL\t354240\n移动电子商务\t354241\nAdverse\t354242\n扬州博物馆\t354243\n山竹\t354244\n2项\t354245\nKang\t354246\n母树\t354247\n维旺迪\t354248\n理论\t354249\nupyun\t354250\n黄铮\t354251\n燃煤蒸汽锅炉\t354252\n4.6%\t354253\n德者\t354254\nCAMS\t354255\n49条\t354256\n二十九岁\t354257\n县水利局\t354258\n长江东\t354259\nCompaq\t354260\n切图\t354261\n五十大\t354262\n徐春妮\t354263\n长天\t354264\ncod5\t354265\n几千克\t354266\nnano6\t354267\n汽车维修厂\t354268\n用做\t354269\n潇湘汐苑\t354270\n养鸭\t354271\nCCTV1直播网\t354272\n侧面\t354273\n皮秒激光祛斑\t354274\n潜血\t354275\nexceeding\t354276\n心痒\t354277\n志乃\t354278\noppoA57\t354279\nkatespade\t354280\n20多张\t354281\n欧嘉\t354282\n海白菜\t354283\n非局部\t354284\nv7.5\t354285\n梨花情\t354286\ninvoking\t354287\n和联\t354288\n海平线\t354289\n润本\t354290\n北京人事考试中心\t354291\n新绛县人民政府\t354292\ndushi\t354293\n5233\t354294\n沟槽式\t354295\n福尔康\t354296\n英语六级\t354297\n砌筑工程\t354298\nzoo\t354299\nhackmap\t354300\n广州供电局有限公司\t354301\n春蕾杯\t354302\n佩顿\t354303\n130多个\t354304\n遣\t354305\n谷歌人体浏览器\t354306\nPond\t354307\n万德福\t354308\n直隶新城\t354309\n鸡西市人民政府\t354310\n注文\t354311\n烟台汽车工程职业学院\t354312\nVice\t354313\n么说\t354314\n否否\t354315\n雷霆战神\t354316\n激变\t354317\ncorridor\t354318\n菲律宾币\t354319\n奔驰g级\t354320\n紧缺型\t354321\n香港湿地公园\t354322\n坏帐\t354323\n4年内\t354324\n减震器\t354325\n汽车行业\t354326\n厦门\t354327\nwifi信号放大器\t354328\n扬州路\t354329\n复合管\t354330\nDYSON\t354331\n胡昌升\t354332\nC语言程序\t354333\n走天下\t354334\n播发\t354335\n皇马套\t354336\n暗黑1\t354337\n东华\t354338\n海南全岛\t354339\n粮\t354340\nDYNAMIC\t354341\n鄂东\t354342\n英氏\t354343\n39.9元\t354344\nsmarty\t354345\n胸胸\t354346\n一个段\t354347\n中国铁建大桥工程局集团有限公司\t354348\n红都\t354349\n捷德奥特曼\t3543
50\n世界地球\t354351\n进水\t354352\nMohamed\t354353\nsmm\t354354\nReasons\t354355\n籽油\t354356\nwwwa\t354357\n力炮\t354358\n刁曼岛\t354359\n新林\t354360\n数据率\t354361\n大写字母\t354362\nak70\t354363\n义乌中心医院\t354364\n躁狂\t354365\nCosmétique\t354366\n24小时之内\t354367\n柏油清洗剂\t354368\n索迪斯卡\t354369\n光良\t354370\n融科城\t354371\n家鸽\t354372\ncols\t354373\n村主任\t354374\n莲花国际广场\t354375\n外卖盒\t354376\n松叶\t354377\n衡阳政府\t354378\n公牌\t354379\n大唐荣耀\t354380\n就业报到证\t354381\n接缝\t354382\n龙舞\t354383\n1300d\t354384\n9188彩票网\t354385\n龙十\t354386\n六措\t354387\nWindows2000\t354388\n基辅号\t354389\n帝国CMS\t354390\n亿万集团\t354391\nLagrange\t354392\nsweetalert\t354393\n遗恨\t354394\n加一次\t354395\n办医\t354396\n雪峰山\t354397\n1939\t354398\n西马路\t354399\n抗病毒\t354400\ntabhost\t354401\nOriental\t354402\n上海信息技术学校\t354403\n修丽\t354404\n红萝卜\t354405\n李德仁\t354406\n难忘今宵\t354407\n象形字\t354408\n胆素\t354409\nSolid\t354410\n中贸广场\t354411\nPMCAFF\t354412\n电加热板\t354413\n浙江大学医学院附属第一医院\t354414\n7011\t354415\n3158教育网\t354416\n偷偷爱你\t354417\npytcharm\t354418\n大贸\t354419\n试述\t354420\n第五十五\t354421\n定边县\t354422\nrmxp\t354423\n要强\t354424\n迅游科技\t354425\n珈儿\t354426\n吸血装\t354427\n小证\t354428\n海兹尔\t354429\n機器\t354430\n毕业生\t354431\n许进\t354432\n潜行\t354433\n抗光幕布\t354434\n秀洲区\t354435\n黄山机场\t354436\n张帅\t354437\n增程\t354438\n博乐宝\t354439\nDPD\t354440\nchangzhou\t354441\n平衡吊\t354442\n富士xt20\t354443\n佳句\t354444\n会务公司\t354445\n虾皮\t354446\nEXCLUSIVE\t354447\n德赛\t354448\n20018年\t354449\n王洪斌\t354450\n地贫筛查\t354451\n汾阳\t354452\n三反\t354453\n狮子吼净土专修网\t354454\n东\t354455\n87.5\t354456\n智卓\t354457\n绿卡\t354458\n相态\t354459\n田螺坑\t354460\n晶弘\t354461\n拉曼\t354462\n杂耍\t354463\nST-LINK/V2\t354464\n低渗性脱水\t354465\n綜藝\t354466\n中国人民大学经济学院\t354467\npriori\t354468\n杭州酒店\t354469\n噫\t354470\n章江新区\t354471\n迅雷种子搜索器\t354472\n攀爬架\t354473\n高钾血症\t354474\n香辛料\t354475\n3.7.28\t354476\n南阳街道\t354477\n轴肩\t354478\n欧派衣柜\t354479\nC4世嘉\t354480\n北京中企诚谊留学回国人员购车服务有限公司\t354481\npermute\t354482\n苏州市公安局\t354483\n465\t354484\n三足式\t354485\n【文文\t354486\nSpain\t354487\n合字\t354488\n鲁大师_鲁大师\t354489\n广州中医药大学深圳医院\t354490\n建筑界\t354491
\n桑普多利亚\t354492\n众行\t354493\n600套\t354494\n南澳州\t354495\n360天擎\t354496\n深圳市康宁医院\t354497\nChemBioDraw\t354498\n居间人\t354499\nCouture\t354500\n赌\t354501\n双子星座军阀\t354502\n何恺明\t354503\n恳切\t354504\n龙军花园\t354505\n爹地\t354506\nhermes\t354507\nlovelycation\t354508\n政策类\t354509\n光伏贷\t354510\n夜露\t354511\n皇上\t354512\nultrasound\t354513\n例子\t354514\n4000杯\t354515\nicould\t354516\nRectTransform\t354517\n胃角\t354518\n屋上\t354519\nRT-AC88U\t354520\nJoanna\t354521\n为政\t354522\nFront\t354523\nHub\t354524\n勘误\t354525\nresizing\t354526\n往来\t354527\nchc\t354528\nlibpcre.so.1\t354529\n女神联盟2\t354530\n郑州地铁6号线\t354531\n中山医科大学\t354532\n档期\t354533\nmi6\t354534\n北京人才市场\t354535\n中科创\t354536\nPyCharm\t354537\n狮子会\t354538\n扔下\t354539\n洗衣店\t354540\n枇杷苗\t354541\n图书馆杯\t354542\n昂科\t354543\nDoe\t354544\n发火\t354545\n蜜月旅行\t354546\nreceive\t354547\n天健会计师事务所\t354548\n佼\t354549\n哈卡\t354550\n李应\t354551\n青山湾\t354552\nы\t354553\n东方企业文化\t354554\n1500辆\t354555\n措手不及\t354556\n概念图\t354557\n北京注册会计师\t354558\n南洋科技\t354559\n疯狂塔防物语\t354560\n厦门工博会\t354561\n箔式应变片\t354562\n指挥中心\t354563\n碧桂园凤凰城\t354564\n贝母\t354565\n黄菊\t354566\nsens\t354567\n双创债\t354568\n惯性\t354569\n硫代硫酸钠\t354570\nwarrior\t354571\n柔道\t354572\n粟特人\t354573\n更实惠\t354574\n41套\t354575\n第一位\t354576\n广播室\t354577\n6.83\t354578\n水头村\t354579\n小学化\t354580\n雷音pro\t354581\n7.3.5奶\t354582\n陶行知\t354583\n7枚\t354584\n果栏\t354585\n并蒂\t354586\n大藏经\t354587\nE300\t354588\n20170524\t354589\n烈山区\t354590\nreferencing\t354591\n103期\t354592\n税赋\t354593\nTPshop\t354594\n大鹏古城\t354595\n面具侠\t354596\n藏锋\t354597\n蛋白酶\t354598\n潼关县人民政府\t354599\n毛泽东思\t354600\n未获\t354601\nKOREAN\t354602\n乌兹\t354603\n27级\t354604\n中轨\t354605\npowerpoint2010\t354606\n新版三国演义\t354607\ncyberduck\t354608\n活底\t354609\n3799\t354610\nca199\t354611\n低速四轮电动车\t354612\n老千股\t354613\n壳机动队\t354614\n失落园\t354615\n王晟\t354616\n农品\t354617\n20160530\t354618\n三水乐平\t354619\n鱼海\t354620\n实诚\t354621\n九月鹰飞\t354622\n上海市政协\t354623\nLabels\t354624\n远程管理卡\t354625\nwow联盟\t354626\ncinephilia\t354627\n北京新机场\t354628\n输钱\t354629\n东方佛易道\t354630\n姜尚\t3
54631\n某种\t354632\n激增\t354633\n静穆\t354634\n几期\t354635\necr\t354636\n美盘\t354637\nprescription\t354638\n佝偻\t354639\n高清化\t354640\n虢国\t354641\ncarol\t354642\n口门\t354643\n武汉晴川学院\t354644\ndcmtk\t354645\n十佳\t354646\n费县人民政府\t354647\n吟飞\t354648\n中央领导\t354649\n北京邮电大学经济管理学院\t354650\n收关\t354651\n亚临床甲减\t354652\n刺猬\t354653\n会火\t354654\n罗清泉\t354655\n豁免\t354656\n91V\t354657\n7尺\t354658\niphone7plus\t354659\n漳州招商局经济技术开发区\t354660\nCCAV5直播网\t354661\n削峰\t354662\n曹明\t354663\n天富\t354664\n葱头\t354665\n单簧管\t354666\n4:00\t354667\nSlim\t354668\n桥架\t354669\n啤酒肚\t354670\n多米诺骨牌效应\t354671\n无线模块\t354672\n六年级语文下册期中考试卷\t354673\n苏宁电器门店大全\t354674\n1725\t354675\n眼镜蛇\t354676\n黄妈\t354677\nfirebird\t354678\n张明明\t354679\n九尾忍风传\t354680\n神控\t354681\n25.0\t354682\n女流\t354683\n2018.04.18\t354684\n雇主\t354685\n吡啶基\t354686\n叛乱\t354687\n基多\t354688\nfunding\t354689\n狗蛋儿\t354690\n不可\t354691\n纬度\t354692\n张家界天门山玻璃栈道\t354693\n毒人\t354694\n汉语拼音字母\t354695\n杨晓\t354696\n陶静\t354697\n邵阳传媒网\t354698\n润众\t354699\n液管\t354700\n蒜米\t354701\n走遍\t354702\nvue.js\t354703\nCB500X\t354704\n迅雷牛X页游\t354705\n妇幼保健院\t354706\nveteran\t354707\n代收点\t354708\n北京舞蹈学院\t354709\nnmr\t354710\n康克清\t354711\neclispe\t354712\n吴淞路\t354713\n京雄高速\t354714\nGoogle公司\t354715\n万宏\t354716\n绮丽\t354717\n图片展\t354718\n墙式\t354719\nレベル\t354720\nsupe\t354721\n大屏幕\t354722\nIDM\t354723\n天一中学\t354724\n减除\t354725\n风花雪月楼|小海棠论坛|觅春网\t354726\n色模\t354727\n朱桢丁泽仁\t354728\n前目的地\t354729\n肇事者\t354730\nInvestigation\t354731\n德育教育\t354732\n鬼界\t354733\n风流岁月\t354734\n北京世青国际学校\t354735\n执恢\t354736\n李尧\t354737\n杭州g20峰会\t354738\n甘肃地区\t354739\nPeppa\t354740\n海伦钢琴\t354741\n诚售\t354742\n万科尚城\t354743\nJessy\t354744\n住院处\t354745\nr6\t354746\n十堰旅游网\t354747\n沉默是金\t354748\n想起来\t354749\n18千米\t354750\nPamela\t354751\n情哥\t354752\n仲博\t354753\n预制管桩\t354754\n东京迪斯尼\t354755\n导学案-学科网\t354756\n人性\t354757\nelemen\t354758\n3844\t354759\n小头\t354760\nLIVING\t354761\nTREE\t354762\n核心交换机\t354763\n远志\t354764\n法网狙击\t354765\n强渡大渡河\t354766\n4460\t354767\n中华人民共和国保险法释义\t354768\nif函数\t354769\n女行\t354770\n深谙\t354771\n村史\t3547
72\n1G多\t354773\n水培蔬菜\t354774\n纯寒\t354775\n淘宝认证\t354776\n熟视无睹\t354777\n小货车\t354778\n纳特\t354779\n$100\t354780\nLUSH\t354781\n集成部\t354782\n性能\t354783\n楚国\t354784\nMINIMINI\t354785\n绳镖\t354786\nmapped\t354787\n美食文化节\t354788\n赛文柒\t354789\n基佬\t354790\n疑犯追踪吧\t354791\n马洛\t354792\n正直博\t354793\n维持\t354794\n东莞图书馆\t354795\n威高股份\t354796\n屏幕录像软件\t354797\n徽商银行\t354798\n裕廊\t354799\n09式\t354800\n伯克希尔哈撒韦\t354801\n乐加乐\t354802\n宝贝们\t354803\n川田\t354804\n挂墙\t354805\n红领\t354806\n门派\t354807\n月经\t354808\ncentrality\t354809\npwm\t354810\n1.36GB\t354811\nactivemq\t354812\n扬州城\t354813\n万品\t354814\n采收\t354815\n广州工商局\t354816\n加敏\t354817\n预试\t354818\n個人\t354819\n常识类\t354820\n儋州市\t354821\n柏悦府\t354822\nFacebook\t354823\n优必选\t354824\n成都市建委\t354825\n阿q\t354826\n台州九三微创\t354827\n达朗贝尔\t354828\n黒\t354829\n东征\t354830\nV5.3\t354831\n肝炎病毒\t354832\n咕咚\t354833\nip8\t354834\n四氧化三钴\t354835\n德奥\t354836\n唐墓\t354837\n第8条\t354838\nfastadmin\t354839\nhots\t354840\n招聘网\t354841\nHeadsets\t354842\ndisappointed\t354843\n吴强忠\t354844\narctic\t354845\n下药\t354846\n电位器\t354847\n福建省教育厅\t354848\nCAIJING\t354849\nBCL\t354850\nBMJ\t354851\n湿地\t354852\n花生酥\t354853\n新都一中\t354854\nLumpur\t354855\n熊猫债\t354856\n南宁地区\t354857\n魏公村\t354858\n东南大学医学院\t354859\n君威gs\t354860\nWGM\t354861\n能源科技有限公司\t354862\n怒火\t354863\njiushi\t354864\n1费\t354865\n凯利公式\t354866\n陈楚生\t354867\n北部湾航空\t354868\n第6期\t354869\nMSE\t354870\n北京大众\t354871\n孕期\t354872\n小白船\t354873\n20170829\t354874\n闲鱼塘主\t354875\n92.9\t354876\n150p\t354877\n14.0.4\t354878\n科创委\t354879\n赵城\t354880\n北京汽\t354881\nDEMO-CSDN\t354882\n野生灵芝\t354883\n物联网科技有限公司\t354884\n扛鼎之作\t354885\n霸面\t354886\n杀青\t354887\n美好生活\t354888\n1.6米\t354889\n理想化\t354890\n走班\t354891\nwwweee944com\t354892\n二十几天\t354893\n标星\t354894\n杨臻\t354895\n天草四郎时贞\t354896\n被盗用\t354897\n巴中职业技术学院\t354898\n龙之战\t354899\n滔博\t354900\nG-Shock\t354901\n鲁花花生油\t354902\n骊威武\t354903\n80m\t354904\n施慧达\t354905\n高压油泵\t354906\n紧凑级\t354907\n滑头鬼之孙\t354908\nconstrained\t354909\n魔密信\t354910\nxsd\t354911\n淫虫\t354912\nQuit\t354913\n重庆医科大学附属第三医院\t354914\n青岛职业
技术学院\t354915\nagreement\t354916\n比特鱼\t354917\n415\t354918\n共混物\t354919\n碳罐\t354920\nAptoide\t354921\nLOVERS\t354922\n致我们单纯的小美好\t354923\n打金枝\t354924\narray函数\t354925\nMOI\t354926\n旋耕机\t354927\n浪潮软件\t354928\n若菜濑奈\t354929\nFTP\t354930\n邦国\t354931\n进阶之路\t354932\n亚龙湾\t354933\n沪媒\t354934\n西影路\t354935\nCOLUMN\t354936\n太仓\t354937\nSLL\t354938\n皇色\t354939\n葡萄科技\t354940\n学术研\t354941\nasme\t354942\n对口单招\t354943\nWhenI\t354944\n58630559\t354945\nUEA\t354946\n竹林七贤\t354947\n赢\t354948\nhamburger\t354949\n暗箱操作\t354950\n以退为进\t354951\n小额信用贷款\t354952\n河南应用技术职业学院\t354953\n磺胺嘧啶\t354954\n片桐\t354955\nMai\t354956\n放大电路\t354957\ndestop\t354958\n濃\t354959\nn^2\t354960\n妻ネトリ\t354961\n新桥医院\t354962\nbitcoin\t354963\n期货\t354964\n女足\t354965\n泰伯网\t354966\n武汉大学哲学学院\t354967\n南昌市城乡建设委员会\t354968\n失物\t354969\n茵茵\t354970\n为事\t354971\n标的股\t354972\n未有期\t354973\n框格\t354974\n供方\t354975\n10.1016\t354976\n白芨\t354977\ndota2荒神罪\t354978\n上海交通大学上海高级金融学院\t354979\npreviously\t354980\n77秒\t354981\n氮气机\t354982\n危命\t354983\n徐村\t354984\n哑火\t354985\nmic\t354986\n狼牙套\t354987\n礁石\t354988\n油层\t354989\n试色\t354990\n左慈\t354991\n小非\t354992\n气枪\t354993\nUILable\t354994\n饿汉式\t354995\n成仁\t354996\n皮脸\t354997\nAlamo\t354998\n駢\t354999\n伤感情歌\t355000\n一月后\t355001\n萌文\t355002\n希尔瓦纳斯\t355003\nWatsons\t355004\n80页\t355005\n爆浆鸡排\t355006\n蜕膜\t355007\nXSM\t355008\n劳动仲\t355009\n180423\t355010\n收储\t355011\n华菱\t355012\n巨峰路\t355013\n感谢有你\t355014\n精华10大网\t355015\n啊啦\t355016\n津巴布韦\t355017\n龙浩集团\t355018\n猪苗\t355019\n机战\t355020\n蜂浆\t355021\n第124集\t355022\n肥硕\t355023\nnetterm\t355024\n奶照\t355025\n美美美\t355026\nDMER\t355027\n国徽\t355028\n豫园商城\t355029\n福田汽车站\t355030\ntaught\t355031\n重庆高速公路\t355032\n额外\t355033\n张籍\t355034\n美国航天局\t355035\n汇金路\t355036\n玩币族\t355037\nizotope\t355038\nfood\t355039\n东风起亚\t355040\n1834\t355041\n恒大未来城\t355042\n甲鱼\t355043\nvivox9splus\t355044\n中国少年国学院\t355045\n曹随风\t355046\n赏鸟\t355047\n溯源\t355048\n岩浆\t355049\n短道\t355050\nT450S\t355051\n舒云\t355052\n蝶影\t355053\n河南省高级人民法院\t355054\n根切\t355055\n刘人语\t355056\n15.04\t355057\n孔子世家\t355058
\n战皇\t355059\n举子\t355060\n全工\t355061\n一村\t355062\nNintendo\t355063\nitinerary\t355064\n诺丁汉大学\t355065\n参考文献\t355066\n经验药\t355067\n阿里瓜瓜\t355068\n随时随地\t355069\n水泥窑\t355070\n水果机\t355071\n反渗透纯水机\t355072\ngiligili\t355073\n自然之宝\t355074\n刀娘\t355075\ntori\t355076\n覃氏\t355077\n错认\t355078\n10011\t355079\n遮掉\t355080\nmainframe\t355081\nanu\t355082\nifort\t355083\ntorren\t355084\n佐科\t355085\n知否知否\t355086\n无反\t355087\n黑狮\t355088\n江苏省发改委\t355089\n教育界\t355090\ncrush\t355091\n逻辑\t355092\nShutterstock\t355093\n违纪\t355094\n热淋清颗粒\t355095\nPublisher\t355096\n落基山\t355097\nGould\t355098\n状告\t355099\n脉石\t355100\nProtro\t355101\nRTOS\t355102\n兜\t355103\n握杆\t355104\n激光脱毛器\t355105\n洗洁\t355106\n战阶\t355107\n马本斋\t355108\n荔枝花园\t355109\n补充品\t355110\n黄龙\t355111\n务川仡佬族苗族自治县\t355112\n六步洗手法\t355113\n王进\t355114\n120家\t355115\n远瞳\t355116\n萨尔玛\t355117\nWeb应用防火墙\t355118\nspy2wc\t355119\n难明\t355120\nclassName\t355121\n阶梯形\t355122\nWDK10\t355123\n打蜡\t355124\n名导\t355125\n半途\t355126\n晋江文学城\t355127\n金昆\t355128\n麦创网\t355129\n刘旺\t355130\n掌握\t355131\ns弯\t355132\n7758\t355133\n中天门\t355134\n自演\t355135\n陈玉莹\t355136\n央视卫视\t355137\n朱老师\t355138\n连锁加盟网\t355139\n壹頁書\t355140\n十念\t355141\n姜珂\t355142\n忍迹\t355143\n王道\t355144\n陈轮\t355145\n中交四公局\t355146\n芸\t355147\n线线\t355148\n砻谷机\t355149\nEVERY\t355150\n内存条_\t355151\n中国中铁\t355152\n二进制包\t355153\nScroller\t355154\n45_\t355155\n东营区政府\t355156\napps\t355157\n冯博\t355158\n9.8分\t355159\n林村\t355160\n蛮族\t355161\n檀府\t355162\n行星式\t355163\n很满意\t355164\n刘楚玉\t355165\nWeaving\t355166\nslideToggle\t355167\n艺术形象\t355168\n20171208\t355169\n三初\t355170\n小市\t355171\n制表符_\t355172\n西泽立卫\t355173\nnight24\t355174\n绿帽奴\t355175\nidentityserver4\t355176\nadjusted\t355177\n身心健康\t355178\n扫频仪\t355179\n万宝路\t355180\n实意\t355181\n蒙华铁路\t355182\n各司其职\t355183\n中国制造\t355184\n师胜杰\t355185\n啰音\t355186\n浆液性\t355187\n安必信\t355188\nRestoring\t355189\n236\t355190\n中国兵器装备集团\t355191\n一五一十\t355192\nMHXX\t355193\n中华小姐环球大赛\t355194\n息壤\t355195\n淮南市地方税务局\t355196\n入押\t355197\n样本\t355198\n盗者\t355199\n心火旺\t355200\n龙江路\t355201\nTicw
atch\t355202\n档案\t355203\n彩市\t355204\n奇瑞捷豹路虎\t355205\nwired\t355206\n猫扑网\t355207\n妖魅\t355208\n白袜子\t355209\n曹永廉\t355210\n达克斯\t355211\n铁山坪森林公园\t355212\nveth\t355213\n香蕉网\t355214\n张齐华\t355215\n羽衣甘蓝\t355216\n中国信息安全测评中心\t355217\nAION-永恒之塔\t355218\n辅业\t355219\nsellers\t355220\n明眼\t355221\n翠城\t355222\n新闻纸\t355223\n宜阳县\t355224\nGenerate\t355225\n武动\t355226\n石塘竹海\t355227\nZENITH\t355228\n伟星新材\t355229\n胜芳镇\t355230\nChinglish\t355231\nfly\t355232\n平面磨床\t355233\n中国科学院数学与系统科学研究院\t355234\n长春站\t355235\n责任链\t355236\n伊藤舞\t355237\n免税\t355238\n思茅市\t355239\n杀菌锅\t355240\n武汉大学经济与管理学院\t355241\n王润泽\t355242\nOHM\t355243\n爆蛋\t355244\n郑晓宁\t355245\n新秀丽拉杆箱\t355246\ngeant4\t355247\n振铃\t355248\n钟氏\t355249\n青年路街道\t355250\n无控\t355251\n第几届\t355252\n二化螟\t355253\n降水量\t355254\nk线\t355255\n十年一品温如言\t355256\n黄海水产研究所\t355257\n陈玘\t355258\n日夜撸影音先锋\t355259\n水果硬糖\t355260\nFloyd算法\t355261\n动臂\t355262\nE560\t355263\nTestNG\t355264\n不老松\t355265\n知己知彼百战百胜\t355266\nNRT\t355267\n河港\t355268\n说旅网\t355269\n一二层\t355270\n十百千\t355271\njeasyui\t355272\n上海易鑫融资租赁有限公司\t355273\n钟立风\t355274\nGathering\t355275\n灭族\t355276\n飞蝇\t355277\n堵塞\t355278\n中国新歌声\t355279\n澳毛\t355280\n妙鲜\t355281\n败家仔\t355282\n李雄\t355283\n女星\t355284\n隆多\t355285\n75g\t355286\n称\t355287\n卢湾区\t355288\nv领\t355289\n法务在线\t355290\n喷水织机\t355291\n中国大恒(集团)有限公司\t355292\n小念\t355293\n商住两用房\t355294\n安聪慧\t355295\nEMQ\t355296\n签字笔\t355297\n几十亿\t355298\nTachibana\t355299\n超级基因战士\t355300\n柳岩\t355301\ncentbrowser\t355302\nb柱\t355303\n弯沉\t355304\n诗云\t355305\nnmea\t355306\n主成分\t355307\n济南铁路局\t355308\n东昌\t355309\n稠城街道\t355310\nqdz\t355311\n金虫草\t355312\n何故\t355313\n火影忍者OL_一游网\t355314\n定宽\t355315\n大绯胸鹦鹉\t355316\nloctite\t355317\nMA\t355318\nprotractor\t355319\n北京石油大学\t355320\ngsk\t355321\n塘湾\t355322\n胜诉率\t355323\n结城友奈\t355324\niamge\t355325\nwepe\t355326\n限号\t355327\n慈孝\t355328\n0x0\t355329\n沉浮\t355330\n思甲壳虫\t355331\n国际汉语教师证书\t355332\nSettings\t355333\n微电解\t355334\nomp\t355335\n中山东凤\t355336\n任言恺\t355337\nsize函数\t355338\n阿拉斯\t355339\n何鸿燊\t355340\ngfm\t355341\n欠片\t355342\n广川\t355343\
nXib\t355344\n130周年\t355345\nbl漫\t355346\n人民解放军\t355347\nframes\t355348\n南山政府\t355349\n物理机\t355350\n易方达基金管理有限公司\t355351\nvinsonLu\t355352\n起振\t355353\n时光之刃\t355354\n牧云笙\t355355\n菜品\t355356\n严以白\t355357\n北京公立医院\t355358\nacquaintance\t355359\n舞妓\t355360\ndopost\t355361\n郭璞\t355362\n十年间\t355363\n川菜\t355364\n几话\t355365\nC++多线程\t355366\n金河镇\t355367\n行政案件\t355368\n减噪\t355369\n搞\t355370\n江斌\t355371\n攻子\t355372\n折原临\t355373\n顶帖\t355374\nSHY48\t355375\n老夫妻\t355376\n冬防\t355377\n嘉兴市人力资源和社会保障局\t355378\n场地费\t355379\n财政转移支付\t355380\n侠者\t355381\n芝士焗饭\t355382\n剑网三五毒\t355383\nFarm\t355384\n十里银滩\t355385\nwlc\t355386\n制空\t355387\n菜园\t355388\n硬水\t355389\n乌苏里江\t355390\nMud\t355391\n沈阳工学院\t355392\n韩智敏\t355393\n汴梁\t355394\n一汽丰田汽车销售有限公司\t355395\nbangbros\t355396\n上海海洋大学研究生院\t355397\n黑痣\t355398\n128平\t355399\nlch\t355400\n梦域动漫网\t355401\n跑腿\t355402\n凋灵\t355403\n彩机\t355404\n兰子\t355405\n圣医\t355406\n蓝迪\t355407\nlogic\t355408\n芭蕉扇\t355409\n美能达\t355410\n咚巴拉\t355411\n朝阳门\t355412\n神马搜索\t355413\n濂溪区\t355414\n溴丙烷\t355415\n兰亭杯\t355416\n箱套\t355417\n索尼移动\t355418\n贾里\t355419\n榜头镇\t355420\n泥鳅\t355421\nNW_KNIFE\t355422\nRhein\t355423\n毒株\t355424\n藤黄果\t355425\n六部\t355426\n卡哈比\t355427\n不对齐\t355428\n产后康复\t355429\n环太平洋2雷霆再起\t355430\n病毒性角膜炎\t355431\n无论何时\t355432\n雷波马湖\t355433\n黑核\t355434\n成人级\t355435\n用药量\t355436\nCOOLING\t355437\n髂腰肌\t355438\n湾岸\t355439\n耗电量\t355440\npesclubmanager\t355441\n56g\t355442\n嫖妓\t355443\n一应\t355444\n优质服务商\t355445\n济宁技师学院\t355446\nFYI\t355447\ncga\t355448\n危险品\t355449\n黄塘镇\t355450\n妖女迷行\t355451\nHealing\t355452\n07年\t355453\n起跑器\t355454\n全城\t355455\n口溜\t355456\n同兴\t355457\n15.com\t355458\n150G\t355459\n宗萨钦\t355460\n市場\t355461\n无措\t355462\n积重难返\t355463\n老外\t355464\n前照\t355465\n羽毛球\t355466\n贺兰山岩画\t355467\n盘红\t355468\n年岁岁花\t355469\n福州市投资促进局\t355470\n火焰龟\t355471\n蛮族之王吧\t355472\nDecimal\t355473\n食补\t355474\nV8S\t355475\nHannover\t355476\n年度\t355477\n复归\t355478\n米尔顿\t355479\nxigua\t355480\n酸盐\t355481\n手儿\t355482\n过把瘾\t355483\nmorris\t355484\n前阵\t355485\n编程教育\t355486\n现代版\t355487\n办公处\t3554
88\n化生寺\t355489\n林加欣\t355490\n司马台长城\t355491\n笋\t355492\n圣杯转临\t355493\n北京大学外国语学院\t355494\n北京石景山游乐园\t355495\n天通苑东一区\t355496\n250部\t355497\n租琴\t355498\n海南省政府国有资产监督管理委员会\t355499\nDanger\t355500\n慢四\t355501\n西三旗街道\t355502\n西宁机场\t355503\nory\t355504\n矫治\t355505\n自给率\t355506\n圣埃克苏佩里\t355507\nC++2005\t355508\n胖\t355509\nb12\t355510\n喂药\t355511\ngaga\t355512\n肿痛\t355513\n米色\t355514\n腰凳\t355515\nWePY\t355516\n女马\t355517\n老娘们\t355518\n宁波网易\t355519\n第89\t355520\nreplaced\t355521\n1600W\t355522\n碳酸氢钠片\t355523\nchoye\t355524\n致命ID\t355525\n天天简笔画\t355526\n银河舰队\t355527\nOneThink\t355528\nTransmit\t355529\n优秀\t355530\nmjpg\t355531\n春潮\t355532\n威动\t355533\ntungsten\t355534\n熔窑\t355535\n大航海时代3\t355536\n4年后\t355537\n绵阳市规划局\t355538\nUimaker\t355539\n启闭机\t355540\nfscanf\t355541\n三国2\t355542\n洛阳市水务局\t355543\n雪铁龙\t355544\n水点\t355545\n块块\t355546\nzhanlijun\t355547\n结器\t355548\n输液器\t355549\n灵机\t355550\n猪倌\t355551\n2018年4月29日\t355552\n十九大党章修正案学习问答\t355553\n石塘镇\t355554\n正日\t355555\n蓝兰岛漂流记\t355556\n三草\t355557\n鸭跖草\t355558\n过渡\t355559\n9724\t355560\n东方医疗器械网\t355561\n涨袋\t355562\n欣瑞\t355563\ngenerative\t355564\ndelims\t355565\n派\t355566\n统一下载\t355567\n飞虎之潜行极战百度云\t355568\n翻译招聘网\t355569\n4pc\t355570\n330g\t355571\n2万亿美元\t355572\n燕然峰\t355573\n课版\t355574\n枣\t355575\n彭文生\t355576\n中考\t355577\nidd\t355578\n倥偬\t355579\n苯妥英钠\t355580\n延胡索\t355581\n20140404\t355582\n卓讯\t355583\n70种\t355584\nHDMI2.0\t355585\n大捷\t355586\n南京师大\t355587\n中国数据\t355588\n米仓山\t355589\n兵勇\t355590\ndamai\t355591\n雕漆\t355592\n新港城\t355593\n捍卫者联盟\t355594\n新疆维吾尔自治区人民政府\t355595\nEPSG\t355596\n国有土地使用权证\t355597\n宋心馨\t355598\n偷偷摸\t355599\nwinPE\t355600\nabercrombie\t355601\n峨边县\t355602\ncen\t355603\n招标文件\t355604\n综合性公司\t355605\n小儿荨麻疹\t355606\n圣职者\t355607\n数职\t355608\nMETHOD\t355609\n颜文字\t355610\n卑鄙者\t355611\n小抄版\t355612\n主成分分析法\t355613\n掏肛\t355614\n装扮\t355615\n复方甘草口服溶液\t355616\n铁质\t355617\n甘河\t355618\nLED显示屏\t355619\noccupational\t355620\npied\t355621\n宝马4系\t355622\n陈洁如\t355623\n曲江国际会展中心\t355624\n湖南粮食集团\t355625\n乌鲁斯拉格纳\t355626\n二声\t355627\n克孜勒
苏\t355628\n桥头镇\t355629\n新域\t355630\nEfficiency\t355631\n卡转\t355632\n研学旅行\t355633\n常春藤联盟\t355634\n酸锌\t355635\n句子迷\t355636\nape格式\t355637\n聚义\t355638\n勃列日涅夫\t355639\n安徽省委\t355640\n海门岛\t355641\n猪粪\t355642\nloganalyzer\t355643\n台妹\t355644\nfloyd算法\t355645\n东丽开发区\t355646\n呙\t355647\n吴一凡\t355648\n陈氏太极拳老架一路\t355649\n动态内存\t355650\n弓凛\t355651\nNETCore\t355652\n新阶段\t355653\n武神传说\t355654\n机器狗\t355655\n北京体育大学\t355656\n有别\t355657\nnchar\t355658\n回迁房\t355659\n咳咳咳\t355660\n塔奎因\t355661\n崔崔\t355662\n独到\t355663\n东莞市公安局\t355664\nsRGB\t355665\nT410i\t355666\n白落梅\t355667\n薛定谔方程\t355668\n祸起\t355669\n花楼恋歌\t355670\n_沐风网\t355671\n吴良材\t355672\n新房网\t355673\n13.04双\t355674\n播出\t355675\n赵小明\t355676\n10排名\t355677\n幻想全明星\t355678\nhitbtc\t355679\n限价\t355680\nitm\t355681\n林更新#\t355682\nload函数\t355683\n佟莉\t355684\n涅槃\t355685\n优盾\t355686\n战略级\t355687\n郭燕\t355688\neop\t355689\n第13版\t355690\ncollapse\t355691\n保送生\t355692\n纳豆机\t355693\nconsole口\t355694\n高师\t355695\n陈毅\t355696\n死局\t355697\ndq7\t355698\n豹头\t355699\nmeme\t355700\nIMAX\t355701\nAME\t355702\n141\t355703\n京巴犬\t355704\n蝶式\t355705\n打招\t355706\n排污阀\t355707\n欧阳马伯庸\t355708\nBlueberry\t355709\n梨花针\t355710\n银地\t355711\n苦逼\t355712\n僵王\t355713\n自由版\t355714\ncontrol\t355715\n整体观\t355716\n氨水\t355717\nfans\t355718\n第64集\t355719\n袁春\t355720\n时侯\t355721\n传歌\t355722\n佳电\t355723\n王语嫣\t355724\n入画\t355725\n4.1.10\t355726\n霍里\t355727\n300多斤\t355728\n京政\t355729\n疯了桂宝\t355730\n投资型\t355731\n罗斯王雅媛\t355732\n邻水\t355733\n南京市鼓楼区人民政府\t355734\n平分秋色\t355735\n打钱\t355736\n沈阳市城乡建设委员会\t355737\n苏宁环球\t355738\n天猫魔屏\t355739\n花团\t355740\n第16章\t355741\nCornell\t355742\n牛王\t355743\n赛车场\t355744\n涪江\t355745\n今典\t355746\n射阳政府网\t355747\n迅雷链\t355748\nexample类\t355749\n强制_\t355750\n通晓\t355751\n新僵尸先生2\t355752\nstruts2拦截器\t355753\n伏魔\t355754\n土壤电阻率\t355755\n缰\t355756\n东京喰种re\t355757\n无意义\t355758\n补帧\t355759\n公主方\t355760\nwin8平板吧\t355761\n面甲\t355762\n32小时\t355763\n彩叶草\t355764\n泄矢青蛙子\t355765\nNYC\t355766\n哔哩哔哩\t355767\n肾火\t355768\n库欣综合征\t355769\nrk\t355770\n11吨\t355771\n晓书馆\t355772\n2362\t355
773\n乡纪委\t355774\n扶摇\t355775\n线柱\t355776\n集体化\t355777\n2016年4月23日\t355778\n曼施坦因\t355779\n2所\t355780\n平鲁\t355781\n童话故事\t355782\n郭成\t355783\n打胶\t355784\n鱼石\t355785\n详情\t355786\n喀麦隆\t355787\n火陨\t355788\n深圳亚迪学校\t355789\ntracking\t355790\n贺报\t355791\n天才宝贝\t355792\n40MB\t355793\n企业所得税法\t355794\n藉\t355795\n校歌\t355796\n不可抗力\t355797\n金沙世纪城\t355798\n变频恒压供水设备\t355799\nmimiai\t355800\nPanel\t355801\n浅蓝色\t355802\n国家应急管理部\t355803\nRiddle\t355804\n第几批\t355805\n电站\t355806\n超优\t355807\n安徽省江淮十校\t355808\nPK10计划网\t355809\ntinymce\t355810\neslintrc\t355811\n待罪\t355812\n同乐会\t355813\n无心磨床\t355814\n七十二变\t355815\nNAV\t355816\nubunu\t355817\n潜规\t355818\n何裕民\t355819\n李敬泽\t355820\n封闭\t355821\nsankey\t355822\nfinal吧_\t355823\n博途V14\t355824\n苏雪林\t355825\n善言\t355826\nyml\t355827\n法女\t355828\n老鸦\t355829\n300kw\t355830\n锚固件\t355831\n秦都\t355832\n不了了之\t355833\n唐鉴军\t355834\n寻常\t355835\n股权投资收益\t355836\n白液\t355837\nCC\t355838\n马塘\t355839\n04.10\t355840\n彰泰集团\t355841\n裸婚时代\t355842\n王俊生\t355843\n文眉\t355844\n云商网\t355845\n小儿急性喉炎\t355846\n熊晓鸽\t355847\n93\t355848\n方面盘\t355849\n动画概论\t355850\n旋梭\t355851\nuplay_r1_loader64.dll\t355852\n城镇职工基本医疗保险\t355853\n滇越铁路\t355854\n奶人\t355855\n拿不起\t355856\n运煤\t355857\n天龙派\t355858\n名题\t355859\n远行客\t355860\n正弦曲线\t355861\nzeit\t355862\nboyy\t355863\nBedroom\t355864\n梦幻西游手游版\t355865\n装饰图\t355866\n泵类\t355867\ntn6\t355868\n盈盈影院\t355869\n浓硫酸\t355870\nICBC\t355871\n宝骏610\t355872\n重组蛋白\t355873\n天文\t355874\n马坊村\t355875\n序位\t355876\n柯基\t355877\nRudy\t355878\n直锚\t355879\n宠物笼\t355880\n婴\t355881\n湖北省高中阶段学校\t355882\n琰\t355883\n赵敦华\t355884\n举棋不定\t355885\n笑时风华正茂\t355886\nEmber\t355887\n斯德哥尔摩综合症\t355888\n番禺社区网\t355889\n)服饰有限公司\t355890\n申哥\t355891\nhetero\t355892\n宇顺\t355893\n乙二醛\t355894\n宁夏回族自治区\t355895\n标准\t355896\n王宠\t355897\n130ml\t355898\n百武装战记\t355899\n点豆豆\t355900\n上海青帮\t355901\n北园\t355902\n唐小染\t355903\n雪梨\t355904\n复燃\t355905\n13万公里\t355906\n代办点\t355907\nv5.4.1\t355908\n西部行政州\t355909\n西校区\t355910\n费玉\t355911\n艶\t355912\n白沙滩\t355913\n邮局\t355914\n大龙焱\t355915\n增加费\t355916\n规范题\t355917\n昨晚上\
t355918\n魔兽rpg地图库-游久网\t355919\n十几家\t355920\n网易学院\t355921\n郑州四中\t355922\n亥\t355923\n2017剑网3\t355924\n有道云笔记\t355925\n雄县\t355926\n群秒\t355927\n宋昱欣\t355928\nTradeManager\t355929\n宋民国\t355930\n289掌游网\t355931\nMacquarie\t355932\n极品飞车19\t355933\n清平乐\t355934\n打蛇\t355935\n七宝二中\t355936\n穿梭机\t355937\n沈煜伦\t355938\n天津火车站\t355939\n西安车管所\t355940\n计算机学报\t355941\n锂\t355942\n焊口\t355943\n灰砂\t355944\n王陇德\t355945\n天骐我的世界\t355946\n济南国际机场\t355947\n金板\t355948\n扭腰舞\t355949\nshepherd\t355950\ncurator\t355951\n变电\t355952\n市监察委员会\t355953\n宝墨园\t355954\n金湖湾\t355955\n黑顺片\t355956\n张康之\t355957\n日本医院\t355958\n局机\t355959\n汇总行\t355960\n死亡岛\t355961\nyuv420p\t355962\n朱煜\t355963\n成都电视台\t355964\n陷于\t355965\n同心同行\t355966\n老妻\t355967\npredator\t355968\n醉鹅\t355969\n千图网www.58pic.com\t355970\n雪亮\t355971\ncad2004\t355972\n土流网\t355973\n陈国军\t355974\n村夫\t355975\n决战江湖\t355976\nCardiovascular\t355977\n十几台\t355978\n包青天之七侠五义\t355979\nVF\t355980\n江宁经济技术开发区\t355981\n尿路\t355982\nCruel\t355983\n释意\t355984\n5分旅游网\t355985\n开窗器\t355986\nItBoth\t355987\n北京兴华会计师事务所\t355988\n株洲在线\t355989\n复现\t355990\n春江晚景\t355991\nbaseus\t355992\n铁嘴银牙\t355993\nmph\t355994\n养狗场\t355995\n昌江\t355996\n校园风\t355997\n_奇奥网\t355998\n来这儿\t355999\n西诺网\t356000\n12123车管所\t356001\n筑梦人\t356002\n重庆北部新区\t356003\n李奥贝纳\t356004\n附全\t356005\n缓和\t356006\n赖雨濛\t356007\n台儿庄新闻网\t356008\n中建五局三公司\t356009\n一孕\t356010\n呼啦圈\t356011\n未知-三五中文网\t356012\n收购\t356013\nsinx\t356014\n内皮\t356015\n火药\t356016\n100步\t356017\n建设工程工程量清单计价规范\t356018\n三星gear\t356019\n战斧骨\t356020\n按摩膏\t356021\nPIPI\t356022\n信笺\t356023\n黄子扬\t356024\n民航在线\t356025\n缺失值\t356026\n铝蜂窝板\t356027\n12.24\t356028\n22页\t356029\n死亡谷\t356030\n小虎岛\t356031\nPillars\t356032\n该生\t356033\n方涵\t356034\n删减版\t356035\n侯伟\t356036\n张倩琳\t356037\n溪头村\t356038\n1945\t356039\n正白旗\t356040\ninclined\t356041\nblyat\t356042\n第三方代\t356043\nitk\t356044\n安江\t356045\n总统们\t356046\n华亭\t356047\n答谢中书书\t356048\n简要版\t356049\n碧云泉\t356050\n5枚\t356051\n猎网\t356052\n白渊\t356053\n敏华\t356054\n进气管\t356055\n弗兰兹\t356056\n基里连科\t356057\n8700万\t356058\n艺术生\t356059\nL
eaves\t356060\nkoko\t356061\n睡觉时候\t356062\n李振东\t356063\n大霞美\t356064\n书屋\t356065\n9420\t356066\n反败为胜\t356067\n抱犊崮\t356068\n三苏祠\t356069\nkyy\t356070\nEU260\t356071\n一方\t356072\n清咽滴丸\t356073\n奖励表\t356074\n王卫斯\t356075\n胸床\t356076\n触犯\t356077\n救子\t356078\nps3\t356079\n乔柯涵\t356080\n国破家亡\t356081\n270mm\t356082\n吊车梁\t356083\n爱小艺KG\t356084\n汉字表\t356085\n脱逃者\t356086\n2.0l\t356087\n1646\t356088\n江苏省发展改革委\t356089\n鲍鹏山\t356090\n诗恩\t356091\nLED灯\t356092\n24bit\t356093\n百嘉\t356094\n石达开\t356095\nwww.liuxue86\t356096\n临海人才网\t356097\n电动升降机\t356098\n肥屄\t356099\n宋元明\t356100\n棘龙\t356101\n方先生\t356102\nACTION\t356103\n精灵宝钻\t356104\n李雪健\t356105\n浩辰cad2017\t356106\n109元\t356107\n新机遇\t356108\n经济与法\t356109\n四次\t356110\n30万\t356111\n商宝\t356112\n群组\t356113\n五当召\t356114\n4.CN\t356115\n红日\t356116\nHayley\t356117\n大观通宝\t356118\n怎麽\t356119\ndebts\t356120\n天堂地狱\t356121\n黄腹山雀\t356122\n达芬奇14\t356123\n爆音\t356124\n五秒\t356125\n珠江西岸\t356126\n|腕\t356127\n标准时\t356128\n齐桓公\t356129\n起点女生网\t356130\n汪海林\t356131\n软氮化\t356132\n槽口\t356133\n魏斌\t356134\n心律失常\t356135\n胡润革命之路\t356136\n蒸饭柜\t356137\n标日中级\t356138\n奥林匹亚\t356139\n授精\t356140\n胡哲\t356141\n大谢\t356142\nkao\t356143\n19个月\t356144\n88601279\t356145\n可节省\t356146\n维尔德\t356147\n商标专用权\t356148\n低开低走\t356149\n马列\t356150\n电竞圈\t356151\n奈保尔\t356152\n恵\t356153\n默笙\t356154\n黑柳彻子\t356155\n导演\t356156\nSWATCH\t356157\n1门\t356158\nRape\t356159\n抚养比\t356160\nSMG\t356161\n彰武县\t356162\n诸葛瑾\t356163\n噶玛巴\t356164\n元调\t356165\n纪章\t356166\n马王堆\t356167\nG30\t356168\n永威\t356169\n海拔\t356170\n绳艺网\t356171\n阳江镇\t356172\n十二孔陶笛\t356173\n限产\t356174\n电大作业网\t356175\n二论\t356176\n五十三\t356177\n橱房\t356178\n水菜丽\t356179\n天印\t356180\n三氟\t356181\n大汉堡\t356182\n卡莱美\t356183\n油枪\t356184\n肖像\t356185\n汉能薄膜发电\t356186\n无路可走\t356187\n广东农业信息网\t356188\nI9300\t356189\nmocha\t356190\ng3800\t356191\n四川双马\t356192\n人教版初一英语\t356193\n剪辑\t356194\n0.35g\t356195\n仁科百华\t356196\n双金属温度计\t356197\n收货\t356198\n吹不走\t356199\n兴义网\t356200\n秦风\t356201\n新闻学概论\t356202\n荷马\t356203\n刘诚\t356204\n41个\t356205\n自行火炮\t356206\n花样年家天下\t356207\n小猫
猫\t356208\nhg255d\t356209\n一览_银行信息港\t356210\n体育中心\t356211\n145g\t356212\n鬼人\t356213\n44元\t356214\nScreenFlow\t356215\n甲状腺片\t356216\nel表达式\t356217\n动力\t356218\n复杂版\t356219\n广西人社\t356220\nWalk\t356221\n氟离子\t356222\n老虎城\t356223\nPGone\t356224\n那首诗\t356225\n明星化\t356226\nCSS_网\t356227\n防漏\t356228\nSkilled\t356229\n僮\t356230\n群集\t356231\n7999\t356232\n心源性水肿\t356233\n擒拿术\t356234\n算不算\t356235\n1000万股\t356236\n凯达\t356237\n娱乐场所管理条例\t356238\n清理机\t356239\n李心艾\t356240\nCCS5\t356241\ncts\t356242\n贿选\t356243\n刮宫\t356244\n星际2虫族\t356245\n2016年8月份\t356246\n我的楼兰\t356247\n绝世高手\t356248\n自由贸易区\t356249\n深圳市英威腾电气股份有限公司\t356250\nPaperTime\t356251\n意外伤害保险\t356252\n庐山西海\t356253\n音乐广场\t356254\n青河县\t356255\n第二军医大学附属长海医院\t356256\n火力地堡\t356257\n40家\t356258\n食色性\t356259\nPU\t356260\nQDLP\t356261\n特效方\t356262\ndare\t356263\nScrapy\t356264\n1.4.8\t356265\n无头骑士\t356266\n麒麟阁\t356267\nTool\t356268\nmk5\t356269\n044\t356270\n400件\t356271\n_色\t356272\nKOB\t356273\nTemp\t356274\n驱逐\t356275\n帕秋莉\t356276\n格蕾\t356277\n续行\t356278\nmotox\t356279\n中国礼品网\t356280\n己身\t356281\n2季度\t356282\n萌化\t356283\nbax\t356284\n鹤壁市人民政府\t356285\n恢复器\t356286\nP25\t356287\nqiangjian\t356288\n160平\t356289\nisbn\t356290\nBLT\t356291\nazis\t356292\n穆拉\t356293\n重庆论坛\t356294\n上丰路\t356295\n药肥\t356296\n课税\t356297\n小庞\t356298\n曾小贤\t356299\n屈肌\t356300\n反恐精英CS1.5\t356301\n中铁十六局集团有限公司\t356302\nckpt\t356303\n107路\t356304\n海军上校\t356305\n小五郎\t356306\n中国政法大学法学院\t356307\n叶美香\t356308\npackets\t356309\n传播史\t356310\n必要时\t356311\n世茂外滩新城\t356312\n黑川\t356313\n明源地产研究院\t356314\n讨论稿\t356315\n深圳电信\t356316\n朗境\t356317\n命令提示符\t356318\n海东青\t356319\nmSATA\t356320\n扰流板\t356321\n立花\t356322\n怀德\t356323\n除臭剂\t356324\n白送\t356325\n鲜鱼\t356326\n召唤系\t356327\n竹山县城关镇政府\t356328\n小儿郎\t356329\n毒纪\t356330\n4月2\t356331\n胆息肉\t356332\n3.07\t356333\n单仁平\t356334\nliststring\t356335\nbearings\t356336\n摩范\t356337\n第19届\t356338\nuCGUI\t356339\n藤类\t356340\n山东省临沂市房产和住房保障局\t356341\n吉利远景X3\t356342\n推漫\t356343\n邮费\t356344\nmembership\t356345\n另一头\t356346\n绝毛液\t356347\njieba分词\t356348\n2.
3万元\t356349\nPAPA\t356350\n广角\t356351\n黑玉\t356352\n一卦\t356353\n1265\t356354\n900分\t356355\ntheres\t356356\n战争之人\t356357\n佳乐家\t356358\n枣山\t356359\n炎头\t356360\n天津银行\t356361\n茂业百货\t356362\n蓝光\t356363\n力高\t356364\n汽车展\t356365\nSum\t356366\n黄石市人民政府\t356367\n蓄电池\t356368\nMUMU\t356369\n空斗\t356370\n寻秦记合集\t356371\n分切机\t356372\n宗萨\t356373\nA.4\t356374\n2018年3月份\t356375\nalkatip\t356376\nlol英雄联盟\t356377\n前仆后继\t356378\nsometimes-ever\t356379\n商丘火车站\t356380\n维格\t356381\n左江日报\t356382\n波密\t356383\nshsgl\t356384\n第100集\t356385\n四足机器人\t356386\n梨花节\t356387\n答谢\t356388\n8000美元\t356389\n夏妮\t356390\n不敌\t356391\nuea\t356392\n宣统年\t356393\n振动台\t356394\n香港联合交易所\t356395\n云办公\t356396\n真人cs\t356397\n郑国霖\t356398\n省国土厅\t356399\n二炮\t356400\n业余\t356401\n新会计准则\t356402\n养心\t356403\n阻性负载\t356404\n船级社\t356405\n几转\t356406\n卷起\t356407\n高达\t356408\n战地风云\t356409\n滨海区\t356410\n刘浩龙\t356411\n冷冻液\t356412\n果壳网\t356413\n天天快递\t356414\n文化部办公厅\t356415\nxz2c\t356416\n东平路\t356417\n笨重\t356418\ncnool\t356419\n5eplay\t356420\n李皓\t356421\n梦泉\t356422\n坠楼\t356423\n滨江镇\t356424\n慢性\t356425\n灰黄霉素\t356426\n排查\t356427\n厦门方特\t356428\n岚山区\t356429\n三乙醇胺\t356430\n姜生\t356431\n4nano\t356432\n泉城广场\t356433\n爆笑\t356434\n這個\t356435\n128天\t356436\n毛料\t356437\n四环\t356438\n7-10岁\t356439\n建研集团\t356440\n人定胜天\t356441\n0923\t356442\n语堂\t356443\n20150502\t356444\n后备\t356445\n永劫\t356446\n阿勇河\t356447\n云豹\t356448\n类风湿关节炎_\t356449\n离子反应\t356450\nConstrained\t356451\nperceptron\t356452\nsuccubus\t356453\ndvdes\t356454\n张悦\t356455\n印尼\t356456\n清凉化\t356457\n追光\t356458\n武夷山\t356459\n梯田\t356460\n變\t356461\n工控机\t356462\n新服\t356463\n收纳柜\t356464\nMeet\t356465\n帝国全面战争\t356466\n新华美育\t356467\n云南省知识产权局\t356468\n黑龙江省\t356469\n不可言\t356470\n滴水藏海\t356471\n美商\t356472\n鸭舌\t356473\n养神\t356474\n唐婉\t356475\n六村\t356476\n阿里外贸\t356477\nKPI\t356478\n修真世界\t356479\nDLL注入\t356480\n荣耀亚瑟\t356481\n断触\t356482\n蓝天快递\t356483\n猩球崛起2:黎明之战\t356484\n高级经济师\t356485\n功过\t356486\nUsitrip\t356487\nIU-Ready\t356488\n安防\t356489\n译解\t356490\n锦州医科大学\t356491\n奇游加速器\t356492\n盐城政府网\t356493\n标志桩\t3
56494\n2283\t356495\n救命恩人\t356496\n蝙蝠侠前传2黑暗骑士\t356497\n朱天文\t356498\nusnews\t356499\n14号\t356500\n第2步\t356501\nLDK\t356502\n风挡\t356503\n利宝\t356504\n柏子仁\t356505\n蔡基刚\t356506\n玉都\t356507\nrenwu\t356508\n1400个\t356509\nemulate\t356510\n参同契\t356511\n走秀款\t356512\n241mm\t356513\nasdfghjkl\t356514\n南风天\t356515\n博物馆奇妙夜\t356516\n二手包\t356517\ngranular\t356518\n柔宇科技\t356519\n杏鲍菇\t356520\nMercure\t356521\n常宝\t356522\n查缉\t356523\n思梦\t356524\nType\t356525\nRS3\t356526\n有限性\t356527\narun\t356528\n石之灵\t356529\nConflict\t356530\n熠熠\t356531\n凭什么不\t356532\n上视\t356533\n160ml\t356534\n灭天\t356535\n葱油\t356536\n修山\t356537\n圆锥滚子轴承\t356538\n2018-01-24\t356539\n电子礼品卡\t356540\n金西\t356541\n广州市第十二人民医院\t356542\n报刊亭\t356543\n东门老街\t356544\n慈善基金\t356545\n京大\t356546\n崩牙驹\t356547\n弯弓\t356548\n千余\t356549\n吐口秀\t356550\n七座\t356551\n败犬\t356552\n统一行高_百度\t356553\n荣威魔域\t356554\nxutils\t356555\n马凯旋\t356556\n地铁站口\t356557\nediary\t356558\n史塞克\t356559\n安宁区\t356560\n民生\t356561\nSVG\t356562\n图像卷积\t356563\n百香蜜\t356564\n手拉机\t356565\n南麂\t356566\n马蹄网\t356567\n曲塘镇\t356568\n完全一致\t356569\n申文哲\t356570\n陈淼\t356571\n胸腺癌\t356572\n出众\t356573\nshipper\t356574\n诺兰保罗\t356575\nshenmeyisi\t356576\nRuns\t356577\n福鑫\t356578\n东瀛\t356579\n当天\t356580\nfilezilla\t356581\n沿线\t356582\n菁蓉\t356583\n困兽\t356584\n抄网\t356585\n亚健康体检_高端体检中心_美年体检中心\t356586\n安索夫\t356587\n墙绘\t356588\n正在现场\t356589\n178九阴\t356590\nmagma\t356591\n亮斑\t356592\n_合同协议装修|一起网\t356593\nduankou\t356594\n华法林钠片\t356595\n1.73\t356596\n易视\t356597\nrobcad\t356598\n在心里\t356599\n万里通\t356600\nHD版\t356601\n标致408\t356602\n交叉编译器\t356603\n再贴现\t356604\n富山\t356605\nDrain\t356606\n解放军301医院\t356607\n8.2.2\t356608\n拍立\t356609\n东交民巷\t356610\n上河湾\t356611\nBASIC\t356612\n贵贱\t356613\n三叠纪\t356614\n智尊版\t356615\n折线\t356616\n北美电器\t356617\n中国地质学会\t356618\n小孩子\t356619\nlibncurses5\t356620\n操作流\t356621\n养鸡\t356622\n巴西烤肉\t356623\n囚笼\t356624\nFlv\t356625\n绫辻行人\t356626\n识别\t356627\nL101\t356628\n绚丽\t356629\n新岸\t356630\n石楼\t356631\nReentrantLock\t356632\npowerdesiger\t356633\n新剧\t356634\n父页\t356635\nExecutorS
ervice\t356636\n男丁\t356637\n怅然若失\t356638\n翘曲\t356639\n良木\t356640\n马条\t356641\n朱泾\t356642\nText2\t356643\n圆寸\t356644\n江海证券\t356645\n评分卡\t356646\n值日\t356647\n南昌中学\t356648\nabc输入法\t356649\n巨客网\t356650\noverlays\t356651\nabd\t356652\n裂机\t356653\n厦门大学医学院\t356654\n牌手\t356655\nkaufen\t356656\n180部\t356657\n乍浦\t356658\n信阳房\t356659\n圣战\t356660\n世新大学\t356661\n不管怎么样\t356662\nLei\t356663\naspire\t356664\n萝莉酱\t356665\n俩条\t356666\n亭湖区人民政府\t356667\n杭州万象城\t356668\n快快快\t356669\nendorsed\t356670\nIa\t356671\n五都\t356672\n省扶贫办\t356673\nIan\t356674\n乳石\t356675\n忍术\t356676\n童颜机\t356677\nfmc\t356678\n可安\t356679\n东风日产奇骏\t356680\n汤先生\t356681\nak320\t356682\n人世\t356683\n西瓜味\t356684\nphpok\t356685\n清晰版\t356686\n妄语\t356687\n04.20\t356688\n中药网\t356689\n美女警察\t356690\n包打包\t356691\n东直门街道\t356692\n蒸蛋器\t356693\n莆田鞋\t356694\nSoundSport\t356695\n天王山之战\t356696\n炒货店\t356697\n黄琳\t356698\nAIO\t356699\n空气净化机\t356700\n低质\t356701\ndebugfs\t356702\n汇盈\t356703\n试盘\t356704\n科罗拉多州立大学\t356705\nBTW\t356706\n读书群\t356707\n周孝正\t356708\n泄洪洞\t356709\n拳皇14吧_\t356710\n吴某\t356711\n谜语园\t356712\n热波\t356713\n青岛海慈医院\t356714\n缺位\t356715\nxcom\t356716\n五起\t356717\n繁體\t356718\n60ml\t356719\n沙河服装批发市场\t356720\n30秒\t356721\n金云\t356722\nburberry\t356723\n天猫男神节\t356724\n陈贰\t356725\n凤梨\t356726\n焊缝\t356727\n小米游戏中心\t356728\nLetters\t356729\n金兰企划网\t356730\n间断点\t356731\ndebate\t356732\n知遇之恩\t356733\nPUK\t356734\nˊ\t356735\n陈绍基\t356736\n李恩童\t356737\n流放之路异界\t356738\n陈长生\t356739\n1-4号\t356740\nDEV\t356741\ncinemas\t356742\n非结构\t356743\nanonymous\t356744\n60粒\t356745\n上海复旦大学附属肿瘤医院\t356746\n交互性\t356747\n日立投影仪\t356748\n桑塔纳2000\t356749\n秘术家\t356750\n五洲龙\t356751\n浑南\t356752\n第一峰\t356753\n北票市\t356754\n恶报\t356755\nw7旗舰版密匙神key\t356756\n金钱\t356757\n编结\t356758\nElk\t356759\n新部落守卫战猎场\t356760\n轻钙粉\t356761\n48p\t356762\n华融国际信托有限责任公司\t356763\n新兵日记\t356764\n速录师\t356765\n谭荣\t356766\nesb\t356767\n7年后\t356768\natomic\t356769\n八卦炉\t356770\n爱普生l380\t356771\n雷柏V500\t356772\nv2.4.5\t356773\ndoctype\t356774\n郭静静\t356775\n第六十一条\t356776\n统计学原理\t356777\n银信通\t356778\n
御翠园\t356779\n司马睿\t356780\n梁星\t356781\n2018-02-10\t356782\n五严\t356783\n流水声\t356784\noCam\t356785\n莫娘\t356786\n各服\t356787\n新人类\t356788\n瑞格列奈\t356789\n奸夫\t356790\n揭密\t356791\n巩国兰\t356792\n4.10\t356793\n疵点\t356794\n分子量\t356795\nvbf\t356796\n卢广仲\t356797\n瑞瑞\t356798\nvivox7plus\t356799\n离任\t356800\n南丹县\t356801\nATM\t356802\nMK\t356803\nsysteminfo\t356804\n30户\t356805\n生死书\t356806\n膀子\t356807\n230号\t356808\n5182\t356809\n热平衡\t356810\n皮山县\t356811\n百景\t356812\n海协会\t356813\n李伯清\t356814\njango\t356815\nCglib\t356816\n百姓源流网\t356817\n街道党工委\t356818\n2400g\t356819\n濮阳市昆吾小学\t356820\n融捷股份\t356821\nsparse\t356822\n北京市财政局\t356823\n主讲\t356824\n公园道1号\t356825\n烟熏味\t356826\n常平镇\t356827\n露营公园\t356828\n张建刚\t356829\ncaesar\t356830\ntissot\t356831\n泸州老窖\t356832\nFlipped\t356833\nv20\t356834\ncbn\t356835\n周大生\t356836\nshocked\t356837\n版钉\t356838\n企业应收账款管理\t356839\n水仙子\t356840\n防水连接器\t356841\n长安CX70论坛_汽车之家论坛\t356842\n湘乡\t356843\n老中医\t356844\n甜婚蜜宠\t356845\nMARTIN\t356846\n小沃科技有限公司\t356847\n塔城市\t356848\n户户\t356849\n夕雾\t356850\nSugarCRM\t356851\n开发区管委会\t356852\nRestaurants\t356853\n持月真由\t356854\n圆鼓鼓\t356855\n九三学社北京市委员会\t356856\n白加黑\t356857\nspotlight\t356858\n17亿美元\t356859\nAXON\t356860\nwrong\t356861\n圆石\t356862\n雄楚\t356863\n提诺\t356864\nORB\t356865\nBSA\t356866\nChecker\t356867\nQQ拼音输入法\t356868\n不干胶纸\t356869\n烟儿\t356870\n扬州西站\t356871\n陈晓东\t356872\nBooster\t356873\nEros\t356874\n厄齐尔\t356875\nPlate\t356876\n六十级\t356877\n特集\t356878\nYAMATO\t356879\n万惠\t356880\nsiz\t356881\n精准脱贫攻坚战\t356882\n空座\t356883\nSQLite数据库\t356884\n1326\t356885\n2016年3月5日\t356886\n省力\t356887\n区校\t356888\n導覽\t356889\n虚伪\t356890\n白三叶\t356891\n莫斯科市\t356892\n英译_英汉\t356893\ncynthia\t356894\nM21\t356895\n南阳理工\t356896\n偷情系列七部集\t356897\ncad软件\t356898\n空栈\t356899\n殖民主义\t356900\nLaby\t356901\n不收钱\t356902\nvibrator\t356903\n对折\t356904\n不平衡\t356905\n斯洛文尼亚\t356906\n香港城\t356907\n哈弗大学\t356908\n述论\t356909\n废弃\t356910\n博世火花塞\t356911\nQQ表情图片官网_妈蛋表情网\t356912\n审丑\t356913\n卡压式\t356914\n_柳条冶叶\t356915\n2012-2013年\t356916\n脖子\t356917\n一道杠\t356918\n德远\
t356919\n在眼前\t356920\nsjw\t356921\n阿诺施瓦辛格\t356922\n度分\t356923\n彭凯\t356924\n客户版\t356925\n反力架\t356926\n五号特工组\t356927\n圣剑世界\t356928\nInverse\t356929\n反应炉\t356930\n蛮颚\t356931\nJNA\t356932\n树色\t356933\n0171\t356934\nwmv\t356935\n心脏早搏\t356936\n50hz\t356937\n抱刹\t356938\n南京华美美容医院\t356939\n张冲\t356940\n焓值\t356941\nswamp\t356942\n商业医疗保险\t356943\n32mm\t356944\n走时\t356945\n浐灞\t356946\n凉衣\t356947\n点彩\t356948\n绗架\t356949\nTransformation\t356950\n春城壹网\t356951\n药效学\t356952\nuniver\t356953\n豪客\t356954\n杭州国际博览中心\t356955\nAdult\t356956\n2.7.11\t356957\nFap\t356958\ntrailers\t356959\n旅游地图\t356960\n目睹\t356961\n软开关\t356962\n安化县\t356963\nexcel模板\t356964\ngddr5\t356965\n生死恋\t356966\nTable\t356967\nAI公司\t356968\n第二辆\t356969\n捡回\t356970\n芙兰达\t356971\n90077\t356972\nSimulation\t356973\n存世\t356974\njnby\t356975\n心平\t356976\niPadMini4\t356977\nFineReport\t356978\n最终幻想15:王者之剑\t356979\n第十六次\t356980\n加酸\t356981\nMaurice\t356982\n中国书画服务中心\t356983\nBounce\t356984\n冷锋\t356985\n2010年5月\t356986\n深思\t356987\n深孔\t356988\n原尿\t356989\n米哈游\t356990\n陈达辉\t356991\n叶咲\t356992\nGLC\t356993\n探寻者\t356994\nJennie\t356995\n复旦万科实验学校\t356996\n102家\t356997\nMACD指标\t356998\n海南免税店\t356999\n字符型\t357000\n0827\t357001\n桂南\t357002\n十户\t357003\nappios\t357004\n联美控股\t357005\n限量款\t357006\n东邪兵临城\t357007\nlens\t357008\ncdsn\t357009\n萧敬腾\t357010\n千斤拔\t357011\n器形\t357012\n恐怖分子\t357013\n客座\t357014\n有级\t357015\nrss\t357016\n量价关系\t357017\n20150815\t357018\n楸树\t357019\n市纪委突击检查教育局\t357020\n汉昭帝\t357021\n73家\t357022\nProtobuf3\t357023\nIDBD\t357024\n眉县\t357025\n淡出\t357026\n落版\t357027\n痛症\t357028\n非联盟航空\t357029\n特惠装\t357030\n\\\\\\\t357031\nblending\t357032\nstrerror\t357033\n景逸X5\t357034\n中山文明网\t357035\ni7-7700\t357036\n华为g7\t357037\n现场感\t357038\nVulnerabilities\t357039\n银河大厦\t357040\n大话王\t357041\nKILLER\t357042\n林科大\t357043\n青冈县\t357044\n湖南外国语职业学院\t357045\n折锦春\t357046\n麦果\t357047\n5012\t357048\n噢\t357049\nA43\t357050\nconfined\t357051\n拌和\t357052\n人事网\t357053\n82届\t357054\n苏菲娅\t357055\n柯子岭\t357056\nrevenue\t357057\n辞官\t357058\ncartographe
r\t357059\n甩一把汗\t357060\n蓝杉\t357061\n天钺\t357062\n200岁\t357063\n几天前\t357064\n变身器\t357065\neol\t357066\n▕\t357067\n那件事\t357068\n/修锁公司\t357069\n干劲儿\t357070\n耦合系数\t357071\n粗纹\t357072\nsection\t357073\n北征\t357074\n0313\t357075\n殷秀梅\t357076\nSMM\t357077\n独孤般若\t357078\n绿色和平组织\t357079\naddress\t357080\n挡土墙\t357081\n赵权\t357082\n阶数\t357083\n广告拦截器\t357084\n两两\t357085\nYZ\t357086\n列王记\t357087\n立讯精密\t357088\n牛鞭\t357089\nV97\t357090\n云南西双版纳\t357091\n真剑\t357092\n布赫\t357093\n张舒\t357094\n步子舞\t357095\n狙击\t357096\n魔兵\t357097\n1969\t357098\n评说\t357099\n三国志13PK威力加强版\t357100\n定海区\t357101\n义盖云天\t357102\nMonty\t357103\n非常设计师网\t357104\n刘欢\t357105\n大卫芬奇\t357106\n推推棒\t357107\n厦门外国语学校\t357108\n金毛俱乐部\t357109\n1.71G\t357110\n手自一体\t357111\nricardo\t357112\n泡泡泥\t357113\n就诊\t357114\nlnamp\t357115\n勒流街道\t357116\ncontributing\t357117\n24m\t357118\nAPE/整轨\t357119\n电加热管\t357120\n抓屏\t357121\nE46\t357122\n中国人事考试网\t357123\n治愈率\t357124\n赤魔\t357125\n血橙\t357126\n成河\t357127\n跑位\t357128\n免疫荧光\t357129\n反对本本主义\t357130\n酒醒\t357131\n三人\t357132\n流槽\t357133\n家禽\t357134\n唐城\t357135\nshadowing\t357136\n重庆德庄\t357137\n扫描器\t357138\n暗合\t357139\n梧州南站\t357140\n洋气黄\t357141\n商检\t357142\n雨击\t357143\n手中\t357144\n河师大\t357145\n勘界\t357146\n内蒙草原\t357147\n俄罗斯外交部\t357148\ndefinite\t357149\n女权\t357150\nonchange\t357151\n中航科工\t357152\n幽察微_察网\t357153\n元江新闻网\t357154\n7秒钟\t357155\nk23\t357156\n蜜水\t357157\n新论\t357158\n印数\t357159\n2013-2017年\t357160\nknew\t357161\nSI\t357162\n03月16日\t357163\nSiteEngine\t357164\n烟气脱硫\t357165\n川府\t357166\nbust\t357167\n生瑜\t357168\n青岛汽车北站\t357169\n途客网\t357170\n推拿科\t357171\n南京地铁7号线\t357172\n环球摔迷网\t357173\n房态\t357174\n泗安镇\t357175\n重庆区县\t357176\n沧县人民政府\t357177\n游卡\t357178\n傀儡师\t357179\n游戏币\t357180\n小戏骨水浒传\t357181\n大匠\t357182\n80克\t357183\nFB\t357184\n温州南\t357185\n甘泉县\t357186\n氨糖\t357187\n方与圆\t357188\n泽木\t357189\n15.0.2\t357190\n张字\t357191\nrns315\t357192\n乐府诗集\t357193\n假道\t357194\n任务单\t357195\n好觉\t357196\n菲兹\t357197\nYoosee\t357198\n基金资产净值\t357199\n种田文\t357200\n不易\t357201\n二手区\t357202\n亿邦动力\t357203\n不服从\t357204\n
永久居留证\t357205\n1场\t357206\n通勤营救\t357207\nJosh\t357208\n神格\t357209\n爱媛县\t357210\n中案\t357211\norder\t357212\na50\t357213\n澄城\t357214\n绿色包装\t357215\n小叫\t357216\n融侨城\t357217\n超级机器人\t357218\n拮抗\t357219\n落梅\t357220\nSerum\t357221\n2312\t357222\n柳州市\t357223\n南苑乡\t357224\n客服子\t357225\n奸案\t357226\n一幕\t357227\n秦始皇帝陵博物院\t357228\n陈桥兵\t357229\n无线充电技术\t357230\n用例\t357231\nTarte\t357232\n杀敌\t357233\n定状\t357234\n财产性\t357235\n陆陆\t357236\n普照\t357237\n菜友\t357238\n源味中国\t357239\n焊接人\t357240\n金京浩\t357241\n南通火车站\t357242\n印顺\t357243\nVibram\t357244\n完结篇\t357245\n杭州建兰中学\t357246\ngosick\t357247\n龙葵\t357248\n透明袋\t357249\nRaces\t357250\nAddOns\t357251\n撸撸奇新\t357252\n秤杆\t357253\n_物\t357254\n讲和\t357255\ncl10\t357256\nArma\t357257\n杂件\t357258\noutlast\t357259\n猪排\t357260\n拉文凯斯\t357261\n成语大全网\t357262\n宣传篇\t357263\n网页源代码\t357264\n吴帅\t357265\n白桥镇\t357266\n特别是\t357267\n10bit\t357268\n入驻率\t357269\n南南\t357270\n邮寄\t357271\n中设设计集团股份有限公司\t357272\n孙膑\t357273\n丨子\t357274\n神坛\t357275\n暴露狂\t357276\n康二蛋\t357277\n美漫\t357278\n衣上\t357279\nJTY\t357280\n新丰江水库\t357281\n三百两\t357282\n噪声污染\t357283\n无限流量\t357284\nmayer\t357285\n山德鲁\t357286\ndatabinding\t357287\n猪鞭\t357288\n宋昕冉\t357289\n江苏苏美达集团有限公司\t357290\n钢之家钢铁网\t357291\n陕科大\t357292\n武林外传十年之约\t357293\n阿基米德螺线\t357294\n头条号\t357295\n洁身自好\t357296\n644\t357297\n北京邮电大学研究生院\t357298\n小小强学习网\t357299\n星展\t357300\n三十度\t357301\n常用来\t357302\n低风险\t357303\n无心插柳\t357304\n海员网\t357305\n10亿_\t357306\nqq对战平台\t357307\n交流论\t357308\n派彩\t357309\n北江新区\t357310\n文昌街\t357311\n氯化钠注射液\t357312\n乘性\t357313\n核桃油\t357314\n袁宇\t357315\n91%\t357316\n北极圈\t357317\n菜汁\t357318\n初霜\t357319\n华印\t357320\n就打\t357321\n1.5万吨\t357322\n鎏\t357323\n八卦吧\t357324\n麻梨\t357325\n小米众测\t357326\n楼小姐\t357327\nadobecc\t357328\n罗琼\t357329\n宠物小精灵日月\t357330\n正妹\t357331\n乳化机\t357332\n荣耀5x\t357333\n新富\t357334\n鸿禧\t357335\nデカ\t357336\n无极膏\t357337\nrigol\t357338\n宋秉洋\t357339\n揭阳空港经济区\t357340\n赤火\t357341\n东风21d\t357342\n点阵激光\t357343\n梁辰\t357344\n回复率\t357345\n碘伏\t357346\n切比雪夫\t357347\n淘宝学堂\t357348\n甲酯\t357349\n城市建筑\t357350\n27万元\t357351\n
壳\t357352\n浙江省委宣传部\t357353\n朱芳\t357354\n焓\t357355\n酒礼\t357356\nrebate\t357357\n鲁米诺\t357358\n载重吨\t357359\n华国锋\t357360\n大豆油\t357361\n破事水\t357362\nrunning\t357363\n血爆\t357364\n泛洪\t357365\n魔兽争霸官方对战平台\t357366\n欧五\t357367\n秦颂\t357368\n阳光保险\t357369\n学生公寓床\t357370\n康普顿效应\t357371\n人类学系\t357372\n三害\t357373\n游山玩水\t357374\nyuwen\t357375\n開心\t357376\n方木一\t357377\n第1卷\t357378\nOstelin\t357379\nreit\t357380\n冰天雪地\t357381\ntoad\t357382\nMAT\t357383\n盈彩\t357384\n肱骨头\t357385\n第4批\t357386\n1.61\t357387\n陆少\t357388\n奥氮平片\t357389\n同比增长率\t357390\n副表\t357391\nNCP\t357392\n同业借款\t357393\naurora\t357394\n肖宁\t357395\nMAX485\t357396\n阿提哈德航空\t357397\n博伊卡\t357398\n任志辛\t357399\n大臣们\t357400\n221\t357401\n延安保育院\t357402\n爱玛电动车\t357403\n长治路\t357404\nhuoche\t357405\n星光灿烂\t357406\nT470P\t357407\n20发\t357408\n全总\t357409\n感_寻医问药网\t357410\n电子式\t357411\n不干胶标签纸\t357412\n禄丰县\t357413\n纵距\t357414\n武汉体育学院\t357415\n哈尔滨广厦学院\t357416\n53座\t357417\nsos\t357418\nWINDOWS7\t357419\ndivine\t357420\nSeymour\t357421\n7K7K小游戏\t357422\n霓\t357423\nksf\t357424\n一个一位\t357425\n水木剧\t357426\n3967\t357427\nf调\t357428\n极限挑战3\t357429\n戾\t357430\n杨佬叁\t357431\n1.0.0.4\t357432\norico\t357433\n珍视\t357434\n侯门毒妃\t357435\n民办小学\t357436\n择友\t357437\n司法证\t357438\n周至县政府\t357439\n老夫子\t357440\n好动\t357441\n发质\t357442\n410万\t357443\n我需要\t357444\n高适\t357445\nbuck电路\t357446\navgle\t357447\n杨行镇\t357448\n翁倩玉\t357449\n齿型\t357450\n陈少云\t357451\n8个小时\t357452\n汤森路透\t357453\n狐王\t357454\n野心家\t357455\n化生\t357456\n500mg\t357457\n崽子\t357458\n一松\t357459\n自然科学\t357460\n宝贝秀无敌卡\t357461\n潮白新城\t357462\n火力支援\t357463\n两宫\t357464\n备选\t357465\n金领冠奶粉\t357466\nMagick\t357467\n黄明端\t357468\n存款类\t357469\n改桥接\t357470\n耳放\t357471\n注册号\t357472\n江门职业技术学院\t357473\n山西煤炭进出口集团有限公司\t357474\nMDR-1000X\t357475\n上花\t357476\nsob\t357477\n8.5.3\t357478\n爬虫\t357479\n近现代史\t357480\n名古屋城\t357481\n武隆天坑\t357482\nHypothesis\t357483\n宏宇\t357484\nzl\t357485\n1.58\t357486\n小艺\t357487\nexcel2017\t357488\n熊猫血\t357489\nrepetitive\t357490\n第五辑\t357491\n上高地\t357492\n沈妍\t357493\npape\t357494\n5w2h\t357495
\n金地格林\t357496\nlocaltime\t357497\nPiss\t357498\n加州一号公路\t357499\n裱花\t357500\n沉闷\t357501\n颜域\t357502\n5810\t357503\nRectangle\t357504\nPrix\t357505\n公助\t357506\n反骨\t357507\nTrustAsia\t357508\n08秒\t357509\n儿童案\t357510\n彪\t357511\n乞丐\t357512\n单打独斗\t357513\n金阊区\t357514\nmonet\t357515\n莱茵城\t357516\n善于\t357517\n180mm\t357518\n押汇\t357519\nrom\t357520\n龙珠超吧_\t357521\nMusical\t357522\n椰汁糕\t357523\n返迁房\t357524\n方志平\t357525\nSchools\t357526\n名&#160\t357527\nvolution\t357528\n伊势丹\t357529\n0871\t357530\n借机\t357531\n直埋\t357532\n成都远洋太古\t357533\nAlkaline\t357534\n民事公益诉讼\t357535\n化验证\t357536\nVideoCapture\t357537\n塔城地区\t357538\nZ370-F\t357539\n东营\t357540\nLITEON\t357541\nxerox\t357542\nAS\t357543\npb\t357544\n路南\t357545\n星海公园\t357546\n谈人生\t357547\n棉板\t357548\n生态图\t357549\nPOLO\t357550\n邓小闲\t357551\n小客车\t357552\n原照\t357553\n1112\t357554\n首位\t357555\n鲁瑞\t357556\n改选\t357557\n锅炉大气污染物排放标准\t357558\n杀码\t357559\n烂辉\t357560\n45部\t357561\n今晚7点\t357562\n同构\t357563\nhasp\t357564\ntabitem\t357565\nATEN\t357566\n金锣火腿肠\t357567\nscannow\t357568\n中航城\t357569\n怀孕以后\t357570\nAnony\t357571\n复色\t357572\nmacbo\t357573\n瑜伽砖\t357574\n禅寺\t357575\n佳能mp288\t357576\n李村河\t357577\n太空垃圾\t357578\n王二\t357579\nQT\t357580\n大数据库\t357581\n假面骑士kabuto\t357582\ndatepicker\t357583\nPERCENT\t357584\n迟钝\t357585\n光纤交换机\t357586\n农分期\t357587\nwifi信号增强器\t357588\nplugin\t357589\n效果器\t357590\n粤北人民医院\t357591\n超级变变变\t357592\n江铃\t357593\n玉漱\t357594\n采场\t357595\n顶空\t357596\n1718\t357597\n防爆管\t357598\n新城大厦\t357599\n张保会\t357600\nq龄\t357601\n潮味\t357602\ngurobi\t357603\n高中数学\t357604\n独臂\t357605\n4399橙光小游戏\t357606\n靠头\t357607\n真源\t357608\nsurname\t357609\nFirefly\t357610\n5430\t357611\n上海中国建设银行\t357612\n土墙\t357613\nk420\t357614\nloading\t357615\nv3.0.0\t357616\nwaves\t357617\nsuspension\t357618\n加游\t357619\n配电网\t357620\n璐瑶\t357621\n华为荣耀\t357622\nconne\t357623\n诗联\t357624\n代生\t357625\n金岩\t357626\n建阳区\t357627\n青岛流亭国际机场\t357628\n雪乐山\t357629\n正保\t357630\n659\t357631\n0.8_\t357632\n金旗奖\t357633\n周医生\t357634\n无尾\t357635\n卢瑟福\t357636\n8分钟\t357637\
n流井\t357638\n称意\t357639\n40g\t357640\n炼兽笼\t357641\n游戏狗植物大战僵尸2\t357642\n回鹘\t357643\n16速\t357644\nU2518\t357645\n北京音乐厅\t357646\n作恶多端\t357647\nHoods\t357648\nalarmmanager\t357649\n阿德采购网\t357650\nTRON\t357651\n2008届\t357652\n王航\t357653\n桑拿房\t357654\n1.5GB\t357655\n中国西南地区\t357656\n破窗\t357657\n对象{}\t357658\nTVアニメ\t357659\n格拉汉姆\t357660\n北盘江\t357661\n计算机科学与技术\t357662\n山田裕二\t357663\n胖哥一镜到底\t357664\n笔名\t357665\n140分\t357666\n15px\t357667\n中国核电网\t357668\n文兴路\t357669\n毒爆\t357670\n牛津大学\t357671\n28厘米\t357672\nurs\t357673\ntwixtor\t357674\n易俗河镇\t357675\n符合率\t357676\n面馆\t357677\n新大\t357678\n铃铃\t357679\n水南镇\t357680\n住院楼\t357681\nIntegrations\t357682\n伊春文明网\t357683\n注签\t357684\n茂名网\t357685\n读研\t357686\n走马灯\t357687\n制杖\t357688\n中车\t357689\n法脉\t357690\n春安\t357691\n叱咤风云\t357692\n防爆控制箱\t357693\n高磊\t357694\n万夫莫开\t357695\n叙事作文\t357696\n卢星宇\t357697\n平煤集团\t357698\n别克林荫大道\t357699\n武汉民政局\t357700\nInDesign\t357701\n谁对谁错\t357702\n雌花\t357703\nhardfault\t357704\n离岸银行\t357705\n神庙\t357706\n乙肝五项\t357707\n椰子水\t357708\npdf.js\t357709\n夫妻相\t357710\n液压油缸\t357711\nV4.3\t357712\n400粒\t357713\n中华好诗词吧\t357714\n野怪\t357715\nIts\t357716\n第二十三条\t357717\n大老\t357718\n晋州\t357719\n熔喷\t357720\n65本\t357721\n花都\t357722\nVRCHAT\t357723\n6口\t357724\n星海杯\t357725\n李嘉格\t357726\n卑躬屈膝\t357727\nIncrease\t357728\n网游之纵横天下\t357729\n疯狂猜猜猜\t357730\nObjectMapper\t357731\n刘大夫\t357732\n濠滨网\t357733\n金属清洗剂\t357734\n菲律宾女足\t357735\nkof98\t357736\n红枣枸杞茶\t357737\n剑与家园\t357738\n雷克萨斯LX570\t357739\n张延生\t357740\n多节\t357741\n新乡\t357742\n周子雷\t357743\n水钻\t357744\n10000000\t357745\nmein\t357746\n龙南县政府\t357747\n环球金融中心\t357748\npest\t357749\n時間\t357750\n帝王之家网\t357751\n健次郎\t357752\n政军\t357753\n比亚乔\t357754\nStarch\t357755\n导弹\t357756\nwindows1\t357757\n翘摇\t357758\n沐川县\t357759\nsoils\t357760\n延音\t357761\n好朋友们\t357762\n保罗·乔治\t357763\n最后一个任务\t357764\nognl\t357765\n依诺肝素钠\t357766\n神马电影院\t357767\n羚羊龙王传说\t357768\nvray3.0\t357769\n刚性\t357770\n致公党中央\t357771\n60D\t357772\n恒光\t357773\nVANISHING\t357774\nprimes\t357775\n租车网\t357776\n开学第一课\t357777\n30cm\t357778\n赵总
\t357779\n一指\t357780\n6辆\t357781\nhushi\t357782\n真性近视\t357783\nbijiao\t357784\n通达路\t357785\n红斑马\t357786\nAspNet\t357787\nCS:GO\t357788\n约顿\t357789\n查寻\t357790\n红色警戒3\t357791\nListview\t357792\n侦\t357793\nHomography\t357794\n十寸\t357795\n掌阅书城\t357796\nX5L\t357797\nFaye\t357798\n无日峰\t357799\n15000\t357800\n凸轮\t357801\n破冰船\t357802\n膈肌\t357803\n金杰\t357804\n外婆桥\t357805\n波洛\t357806\nhybris\t357807\n找茬\t357808\n棚改\t357809\n农业物联网\t357810\n弈棋耍大牌\t357811\n华威桥\t357812\n米迦勒\t357813\nsteven\t357814\n四圣心源\t357815\n需要你\t357816\n氧分\t357817\nToF\t357818\n红旗谱\t357819\n跟住\t357820\n說明\t357821\nZEC\t357822\n全国劳动模范\t357823\n杀印\t357824\n周立贞\t357825\n藿香\t357826\n分装机\t357827\nexon\t357828\n青年大街\t357829\n地笼\t357830\nqq仙灵吧\t357831\n依波路\t357832\n省界\t357833\njealousy\t357834\n20141126\t357835\n做自定义\t357836\n暗黑起亚k2\t357837\n小镇\t357838\n行鸟\t357839\nCorrection\t357840\nf184\t357841\n七个工作日\t357842\n谢先生\t357843\n4行\t357844\n小船儿\t357845\ncbb\t357846\niban\t357847\n创谷\t357848\n吐音\t357849\n化工泵\t357850\nEconomist\t357851\nyoure\t357852\n计生\t357853\n蒲菜\t357854\n神武符石\t357855\n中铁隧道局\t357856\n基地址\t357857\n窗通\t357858\n红利税\t357859\n农民运动\t357860\n吡唑醚菌酯\t357861\nf90\t357862\n教育部学位与研究生教育发展中心\t357863\n建筑工程定额\t357864\n看门狗2吧\t357865\n三段论\t357866\n日内瓦机场\t357867\n标子\t357868\n顺义\t357869\n2016年6月\t357870\nkerry\t357871\n泰乐菌素\t357872\n法雨寺\t357873\nffplay\t357874\ndxp\t357875\n御湖城\t357876\n国立台湾大学\t357877\n四十回\t357878\n新燕\t357879\n云南云天化股份有限公司\t357880\n酷玩\t357881\n三国战记\t357882\n中山音乐堂\t357883\n复方鱼腥草片\t357884\n殷素素\t357885\n山马\t357886\n解放公园\t357887\n居人\t357888\n渭南市人力资源和社会保障局\t357889\nmus\t357890\natmel\t357891\n1DX2\t357892\n泰沙拉克\t357893\n耐破\t357894\n上海音乐学院\t357895\n断翅\t357896\nTRANSLATISH\t357897\n爹妈\t357898\n第36\t357899\n神州战神\t357900\n日本电影学院\t357901\n化州\t357902\nyanjiu\t357903\n蔡夫人\t357904\n道德与法治下册\t357905\nM100\t357906\n悦耳\t357907\nSpending\t357908\n猎恐\t357909\nchina-pub\t357910\n飘洋过海\t357911\n金钩\t357912\n32点阵\t357913\n长泽\t357914\nLionel\t357915\n上苍\t357916\n造出\t357917\n酒井桃香\t357918\nincident\t357919\nZed\t357920\n黑梦\t35792
1\n排分\t357922\n新奥购物中心\t357923\n哲尔尼亚斯\t357924\n热转印机\t357925\n横肌\t357926\n嫦娥五号\t357927\n门铃\t357928\n女飞行员\t357929\n刀剑神域虚空断章\t357930\n迪亚多纳\t357931\nheater\t357932\nMiniUI\t357933\n王填\t357934\nraven\t357935\n反舞弊\t357936\n硅板\t357937\n第77章\t357938\nTide\t357939\nformatDate\t357940\n楼诚\t357941\n李丽华\t357942\n飞沙\t357943\n司波深雪\t357944\n两百亿\t357945\n岢岚县\t357946\n停课\t357947\nappdata\t357948\n刘筱\t357949\nunmerged\t357950\n荒野\t357951\ndxf\t357952\n小太阳鹦鹉\t357953\n英大证券\t357954\n部长们\t357955\n野村万斋\t357956\n前尘\t357957\n北京师范大学附属实验中学教育集团\t357958\n宜博\t357959\n湫\t357960\n4s店报价优惠信息-易车网\t357961\n特工\t357962\n链条包\t357963\nmakers\t357964\n鑫月\t357965\n田明建\t357966\n德拜\t357967\n广东移动\t357968\n新架\t357969\n2376\t357970\n头孢拉定\t357971\n气工\t357972\n风景篇\t357973\n涨跌幅\t357974\n爱拼\t357975\n招商银行信用卡中心\t357976\nModified\t357977\n一大一\t357978\n李国新\t357979\n7530\t357980\nproducing\t357981\n宝格丽\t357982\n鼎鼎有名\t357983\n白细胞介素\t357984\n拉夫劳伦\t357985\n贾长松\t357986\n神舟九号飞船\t357987\n百次\t357988\ndex2jar\t357989\n快乐涂鸦_巧巧手幼儿手工网\t357990\n派人\t357991\n金湾花园\t357992\nwin7系统电脑\t357993\n200多\t357994\n大贼\t357995\nCCTV2\t357996\n人生波动\t357997\n哈尔滨毅腾\t357998\nReproductive\t357999\n青岛文明网\t358000\n阿尔卡娜\t358001\n生化仪\t358002\n错身\t358003\nwindows64\t358004\n剪版\t358005\n穴穴\t358006\n刻体\t358007\n先前\t358008\nComma\t358009\nredo\t358010\nPACE\t358011\nc++多线程\t358012\n扭蛋机\t358013\nDecrypt\t358014\n罗隐\t358015\n19亿美元\t358016\n贸易商\t358017\nQuest\t358018\n创客教育\t358019\n救机\t358020\n黄柏山\t358021\n生化性\t358022\nearned\t358023\n熟女控\t358024\nDept\t358025\nkuo\t358026\n闽侯县政府\t358027\n军标\t358028\n太原路\t358029\n特医\t358030\n锦秋\t358031\nclocking\t358032\n鲁山\t358033\nDDT\t358034\n9980\t358035\n侠客风云传碧血丹心\t358036\nSeventeen\t358037\n解放军信息工程大学\t358038\n许仙\t358039\n无棣县\t358040\n大众搬家公司\t358041\n爸爸去哪儿第三季\t358042\n旅行日记\t358043\nkafak\t358044\n国家助学金\t358045\n会晤\t358046\n附表\t358047\n儿保科\t358048\n东山奈央\t358049\n3264\t358050\n四川大学研究生院\t358051\n资金方\t358052\n富力集团\t358053\n音乐秀\t358054\n瞎编\t358055\n近半数\t358056\n连云路\t358057\n航天人\t358058\n拾零\t358059\n创业策划书\t358060\n希捷酷鱼\t358061\nWBO
\t358062\nFolx\t358063\n羊角蜜\t358064\n蓝布\t358065\n期年\t358066\n掠夺者\t358067\n吹嘘\t358068\n油企\t358069\n末尾\t358070\nGrok\t358071\n火猴\t358072\n最小资\t358073\nBible\t358074\n收藏者\t358075\n包地\t358076\n句式\t358077\n埃博拉病毒\t358078\n对背\t358079\n民俗文化节\t358080\nNextcloud\t358081\ndlc2\t358082\n漂着\t358083\n婶\t358084\nspie\t358085\n葬于\t358086\nContextPath\t358087\n谱振\t358088\n吴晶晶\t358089\n费用\t358090\nv2.0.3\t358091\n武国忠\t358092\n广东省注册会计师协会\t358093\n300474\t358094\nCombined\t358095\n东泰花园\t358096\n张文平\t358097\n罩棚\t358098\n价计征\t358099\n梦断难寻\t358100\n十四个\t358101\n留声\t358102\n金马甲\t358103\n车友俱乐部\t358104\nsetget\t358105\n异黄酮\t358106\nearmaster\t358107\nRedstone\t358108\nDirectUI\t358109\npizza\t358110\nrefusal\t358111\n1—6\t358112\ncoping\t358113\n万豪酒店\t358114\n欺诈性\t358115\n无据\t358116\nstatusbar\t358117\n豫金刚石\t358118\n柠檬籽\t358119\n28层\t358120\n发现自己\t358121\n今日\t358122\n秋林格瓦斯\t358123\n低龄留学\t358124\n翅目\t358125\nMatrox\t358126\n总科\t358127\n无人能\t358128\n宿房网\t358129\n慎始\t358130\n流用\t358131\n白志迪\t358132\n周璐\t358133\n中国泛海控股集团\t358134\n文华期货\t358135\n0.5g\t358136\n浸润型\t358137\n花草\t358138\n磁暴\t358139\n露西亚\t358140\n罚没\t358141\n吸水性\t358142\n算牌\t358143\nyards\t358144\n三峰\t358145\n法律化\t358146\n无忧\t358147\n小么\t358148\n168cm\t358149\n天誉城\t358150\n乔达\t358151\n午报\t358152\n珠海市横琴新区政府\t358153\n立和\t358154\n菱\t358155\n笨鸟先飞\t358156\n贷款利率表\t358157\n辰欣药业\t358158\n蚌埠二中\t358159\n社会保险网\t358160\n上南中学\t358161\n云品\t358162\nADI\t358163\n小麻\t358164\n临安区\t358165\n1136\t358166\n中国航空发动机集团\t358167\n愁肠\t358168\n云天空\t358169\njak\t358170\n巨鹿\t358171\n风扇灯\t358172\n净额\t358173\n东南大学土木工程学院\t358174\n东德\t358175\n38例\t358176\n绿牌车\t358177\n鲁弗兰\t358178\n雷达波\t358179\n天籁之声\t358180\n辅\t358181\n相对引用单元格\t358182\n328国道\t358183\namqp\t358184\n威海一中\t358185\n常春藤大学\t358186\nSensei\t358187\nALU\t358188\n西溪花朝节\t358189\nSPANK\t358190\n10.0.0.0\t358191\n不婚主义\t358192\n2000句\t358193\n基士\t358194\nCNNs\t358195\n刚好\t358196\n中国建材第一网\t358197\nStars\t358198\n街市\t358199\n居间协议\t358200\n西洞庭\t358201\nretouch\t358202\n绿地新里城\t358203\nofficer\t358204\nCannondale\t358205\
n红墙\t358206\n安庆人事考试网\t358207\n集享\t358208\n黄枫\t358209\n果粉堂\t358210\nSex8|性吧\t358211\n不得不防\t358212\n亚特鲁\t358213\n马嵬坡\t358214\n李丽芳\t358215\nPBOC\t358216\n只只\t358217\n375|1\t358218\n儒乐湖\t358219\n8季\t358220\n研报\t358221\nGRACO\t358222\ndeposits\t358223\n老幼\t358224\nExquisite\t358225\n江南七怪\t358226\npads2007\t358227\n锦州市人民政府\t358228\n地动仪\t358229\nLiberty\t358230\n陈十三\t358231\nFinanceR\t358232\n万师傅\t358233\n月亮的味道\t358234\n机动车驾驶证\t358235\n鹏华资源\t358236\nschiff\t358237\n皇历\t358238\n243\t358239\nChengdu\t358240\n嘻哈帝国\t358241\nCommon\t358242\n国网江苏电力\t358243\n新奥特曼\t358244\n安全认证\t358245\n正航\t358246\n巴啦啦\t358247\n同等学力申硕经济学\t358248\n渲云\t358249\n前述\t358250\n民诉\t358251\n720P高清中字版)迅雷BT种子/ED2K\t358252\n窦骁\t358253\n移形\t358254\n宁波富邦\t358255\n郭健\t358256\n尼康D3400\t358257\n乐融融\t358258\n肩周炎\t358259\n李新元\t358260\n西双版纳州\t358261\ncean\t358262\n特色社会主义\t358263\n法西斯\t358264\n收账\t358265\n合同案\t358266\n出兵\t358267\n人代会\t358268\n投票数\t358269\n脏器\t358270\n狼狈\t358271\n惠州小学\t358272\n鸡子\t358273\n人法网\t358274\n内吧彩票网\t358275\n地质学\t358276\n级进模\t358277\n不符合\t358278\n锁眼\t358279\n池畔\t358280\n美女警花\t358281\n浙江育英职业技术学院\t358282\n毁灭者\t358283\n空气能热泵\t358284\n笔杆子\t358285\n绕排\t358286\nPMF\t358287\npurse\t358288\n猫爷\t358289\n澎湖列岛\t358290\nJasmine\t358291\n第14卷\t358292\n绿果\t358293\n住家网\t358294\n陈景润\t358295\n上达\t358296\n大魏\t358297\n记略\t358298\n蜡块\t358299\n20000毫安\t358300\n消防工程公司\t358301\nStellaris\t358302\nRecognize\t358303\nxsh\t358304\n2.1G\t358305\n扬州电视台\t358306\n踩线\t358307\n堇美香\t358308\n开奖\t358309\n餐饮管理有限公司\t358310\nJH\t358311\n20160121\t358312\n问答题\t358313\n花绳\t358314\n和颐酒店\t358315\n1克拉\t358316\n满载而归\t358317\n北京注册公司\t358318\n新加坡环球影城\t358319\nshiyong\t358320\n里贾纳\t358321\nsvnsync\t358322\nAMESim\t358323\n莱西市\t358324\n隐形衣\t358325\n大端\t358326\n韩彬\t358327\n2017年8月2日\t358328\npfa\t358329\n刘作虎\t358330\nAika\t358331\n第123\t358332\n吉华路\t358333\n珊瑚丸\t358334\nFellatio\t358335\n走步\t358336\n温拿乐队\t358337\nxld\t358338\n1870\t358339\n乾县\t358340\n分租\t358341\n光绪通宝\t358342\n医网\t358343\n北京邮电大学计算机学院\t358344\n李黎\t358345\n电感传感器\t358346\nvideo2
\t358347\n金钼股份\t358348\n三自由度\t358349\n荆楚理工学院\t358350\n229元\t358351\n狮子鱼\t358352\n逆变换\t358353\n落幕\t358354\nOpenWRT\t358355\n新华食品\t358356\nASH\t358357\n这笔\t358358\nanhui\t358359\nUnityShader\t358360\nV20\t358361\n汉兰魔力宝贝\t358362\n马照\t358363\n两不相欠\t358364\n坑道\t358365\n音像\t358366\n赵牧阳\t358367\nfreeze\t358368\n危疾\t358369\n门里\t358370\nossec\t358371\n极乐世界\t358372\n文艺复兴三杰\t358373\n季线\t358374\n天璞\t358375\names\t358376\n刘燕\t358377\n燃料电池汽车\t358378\n生物量\t358379\nRar\t358380\n锑\t358381\n打赏\t358382\n公母\t358383\n南京新百\t358384\n悬垂\t358385\nSaddle\t358386\n孕妇裤\t358387\n超级育儿师\t358388\n池袋西口公园\t358389\ntransistor\t358390\n凯旋街道\t358391\n山高水长\t358392\n探界者\t358393\n提花机\t358394\n2种\t358395\n栏杆机\t358396\n筷子\t358397\n苏鲁\t358398\nPhi\t358399\nPPT文本框\t358400\n特长生\t358401\n4月\t358402\n心机\t358403\n奥迪a8\t358404\n唐老师\t358405\nchool\t358406\n光老化\t358407\n万兴神\t358408\nrarely\t358409\n浙江省海洋与渔业局\t358410\n凤城三路\t358411\n好精准\t358412\n电力系统暂态分析\t358413\n煤质\t358414\nSGA\t358415\nconanexiles\t358416\n偏分\t358417\n近视\t358418\nubisoft\t358419\n封装类\t358420\n汉王科技\t358421\n牛皮革\t358422\n325国道\t358423\nPetrucci\t358424\n广东省人民政府金融工作办公室\t358425\n注册建造师\t358426\n墓型\t358427\n通奸\t358428\n374\t358429\nasarray\t358430\nMaxima\t358431\n公仆\t358432\n高情商\t358433\n掏出\t358434\n多图\t358435\n淘汰制\t358436\n湖北职业技术学院\t358437\n贷项\t358438\n忙完\t358439\n灵星\t358440\n9900\t358441\n叁叁肆\t358442\n北京京都儿童医院\t358443\n用金\t358444\n吊钟\t358445\nartistic\t358446\nblit\t358447\n美好人生\t358448\nD30\t358449\n6种\t358450\nMIUI8\t358451\n华丹\t358452\n收紧\t358453\n黑暗王者\t358454\n&#160\t358455\n双选会\t358456\n阳光人寿\t358457\n翠兰\t358458\n琨\t358459\nBuffer\t358460\n蓝店\t358461\n像风走\t358462\npcpad\t358463\n参访者\t358464\n千斗五十铃\t358465\n褶皱\t358466\n11.0.20\t358467\n西安经发学校\t358468\ntotolink\t358469\n飞熊\t358470\nwarthunder\t358471\n6年\t358472\nplot\t358473\n高奢\t358474\nSecrets\t358475\n组串式\t358476\n初一下\t358477\nbtob\t358478\n热水宝\t358479\n货品\t358480\npugixml\t358481\nProtoBuf\t358482\n表演服\t358483\n出成率\t358484\nWF2018\t358485\n扣将\t358486\n未名天\t358487\nlist元素\t358488\n中央电教馆\t358489\n
C2265\t358490\n搅和\t358491\n仙友\t358492\n片价\t358493\n玻璃体混浊\t358494\nFloors\t358495\n调职\t358496\n原盒\t358497\nbnb89\t358498\n管好\t358499\njdk1.6\t358500\n合肥师范附小\t358501\n5.48\t358502\nDDF\t358503\n雄居\t358504\n御剑\t358505\n卡拉赞\t358506\n5.5.5\t358507\n电泳槽\t358508\n南延\t358509\n神仆\t358510\n22周年\t358511\n市林业局\t358512\n绿城集团\t358513\n湘湖壹号\t358514\n广州长隆\t358515\nWebAPI\t358516\n沙石\t358517\n贝小七\t358518\n每一刻\t358519\n索兰\t358520\n最后的晚餐\t358521\n联强\t358522\n周树人\t358523\nVapormax\t358524\n新通教育网\t358525\n上古卷轴5nmm\t358526\n无锡南洋职业技术学院\t358527\n吉克\t358528\n凤岗镇\t358529\nWin10智能分屏\t358530\n罗田县政府\t358531\nSDIO\t358532\n时尚女魔头\t358533\n阴沉\t358534\nfastq\t358535\n4|4G\t358536\n三社\t358537\n外汇交易者\t358538\n玻璃展\t358539\n鄙视\t358540\n集体舞\t358541\n花觚\t358542\n急腹症\t358543\n71条\t358544\n三分天下\t358545\n吗_妈妈网\t358546\nDTS-ES6.1\t358547\n山姫\t358548\ncalm\t358549\n董浜镇\t358550\n电喷版\t358551\n都尉\t358552\n费内巴切\t358553\n缝处\t358554\n李克用\t358555\n追踪器\t358556\n批\t358557\n华亚\t358558\ndalf\t358559\n赵乾乾\t358560\n一方水土\t358561\n芝麻开门\t358562\n刘琳\t358563\n地火明夷\t358564\n大英帝国\t358565\n迭\t358566\n内政\t358567\n新化新闻网\t358568\n一户\t358569\n应力值\t358570\nUnigraphics\t358571\n五名\t358572\npaz\t358573\nfulfill\t358574\nOK网\t358575\n烦心事\t358576\n杜密克\t358577\nTurbo\t358578\n德国汉堡\t358579\nLEARNING\t358580\n雾岛樱\t358581\n沈阳小区\t358582\n宜宾县\t358583\n烯量币\t358584\nRo\t358585\n18速\t358586\n3D坦克\t358587\n条据\t358588\nvilla\t358589\nwinServer论坛\t358590\nDeskScapes\t358591\nkrist\t358592\n踢蛋\t358593\n杭叉集团\t358594\nMiyake\t358595\n一刹\t358596\n随便吧成语\t358597\n喷涂机器人\t358598\n蜜汁炖鱿鱼\t358599\n课堂\t358600\n油漆工招聘网\t358601\n桃装\t358602\n网络人远程控制软件\t358603\n解质\t358604\n参与式\t358605\n王永庆\t358606\n葛洲坝地产\t358607\n专才\t358608\n歌美\t358609\n焊带\t358610\n西安交通大学附属中学\t358611\n蒋鑫\t358612\n九州牧云记\t358613\n衣片\t358614\nLeast\t358615\n三藏\t358616\ncompared\t358617\n纯趣网\t358618\n溢奶\t358619\ngnome-terminal\t358620\n可可西里\t358621\nfatfs\t358622\nWidgets\t358623\nCQ_LQJ\t358624\n船歌\t358625\n电动车电池\t358626\n超短\t358627\n大名府\t358628\n傲骨贤妻\t358629\n雷区\t358630\n2005年\t358631\n27卷\t358632\n周兴\
t358633\n20180209\t358634\n长沙水业集团\t358635\n蝌蚪\t358636\n世茂大厦\t358637\n育龙私立学校\t358638\nsincos\t358639\n贾富贵\t358640\n几十部\t358641\n160624\t358642\n新金庸群侠传\t358643\n拓跋浚\t358644\n大面额\t358645\n有货\t358646\n160CM\t358647\n北京国丹白癜风医院\t358648\n老阴\t358649\n嘉积\t358650\n几种\t358651\n疾行\t358652\ndirname\t358653\nquilt\t358654\n1.3万元\t358655\n整站\t358656\n氢化钙\t358657\n打卤面\t358658\n御生堂\t358659\n冷香\t358660\n2016年5月25日\t358661\n布斯克茨\t358662\n13世纪\t358663\n1359\t358664\n产前检查\t358665\n卢云\t358666\n权限篇\t358667\n牛乳\t358668\nv1.0.1安卓\t358669\n购买率\t358670\n金铃子\t358671\n莫宁\t358672\n代订\t358673\n玉州区\t358674\n四代人\t358675\n人大专门委员会\t358676\n哮喘药\t358677\n木香顺气丸\t358678\n水珠\t358679\n巨胸\t358680\n一周三次\t358681\n循环球式\t358682\nwin7声卡驱动\t358683\n丁一凡\t358684\n免缴\t358685\n辣根\t358686\n尸王殿\t358687\n租赁物\t358688\n金枕\t358689\nSkype简体中文版\t358690\ncs231n\t358691\nsubpig\t358692\n90天以上\t358693\n红珠\t358694\nCircular\t358695\n冈部伦太郎\t358696\n开法\t358697\nconsultant\t358698\nCeraVe\t358699\n华侨银行\t358700\nYes\t358701\nIEA\t358702\n企业所得税汇算清缴申报表\t358703\n中国空压机网\t358704\nt+\t358705\nS女\t358706\nu430p\t358707\n核在\t358708\n审证\t358709\n管理部门\t358710\n金龟\t358711\n82页\t358712\n奎克\t358713\n16.04.3\t358714\n一匡\t358715\n志翔\t358716\n汉源湖\t358717\n忍笑\t358718\n沙乡年鉴\t358719\n签合\t358720\n双卫\t358721\n闫学晶\t358722\n凝土\t358723\n石斛\t358724\n第145集\t358725\n乳企\t358726\n头脑特工队\t358727\n20150512\t358728\n上海中医药大学\t358729\n幸灾乐祸\t358730\n南山公安分局\t358731\nwwe2018\t358732\n全宗\t358733\n半泽直树\t358734\n欧曼GTL\t358735\n90公分\t358736\nCBI\t358737\n天津肿瘤医院\t358738\n凤蝶\t358739\ncountry\t358740\nDPK\t358741\n何建华\t358742\n铁将\t358743\n中统\t358744\n脊线\t358745\n看一下\t358746\n淡江大学\t358747\n丹绒亚路\t358748\nipid\t358749\n第28届\t358750\n芭田股份\t358751\nxllex\t358752\n乐2/乐2\t358753\n急性胆囊炎\t358754\nΡ\t358755\nPERSONA\t358756\n论语十则\t358757\n148集\t358758\n林莘\t358759\ncontinuously\t358760\n泰摩\t358761\n蚁穴\t358762\n天克地冲\t358763\n公会战\t358764\ndatatable\t358765\n十九大习\t358766\n夜上海\t358767\n天字\t358768\n诗社\t358769\n安福\t358770\n郑宏\t358771\nmikimoto\t358772\ncss选择器\t358773\n双维\t358774\n非法拘禁罪\t358775
\n成反比\t358776\n联心\t358777\njaxrs\t358778\nhebe\t358779\n不贵\t358780\n康王路\t358781\n得意生\t358782\n老马\t358783\nabigail\t358784\n水箱盖\t358785\n明月山\t358786\n农林类\t358787\n中国电建集团中南勘测设计研究院有限公司\t358788\n药学\t358789\npa6\t358790\nA.6\t358791\n求详\t358792\n尘风\t358793\n导绳器\t358794\n熊猫烧香\t358795\n晨雨\t358796\nLVS\t358797\n到钱\t358798\n500千克\t358799\n微信欢乐斗地主\t358800\n卖淫\t358801\n张君雅\t358802\n调用函数\t358803\n红色警戒2战网\t358804\n0五\t358805\nRudolf\t358806\n_岁\t358807\n2.6.32\t358808\n墨玉县\t358809\n广誉\t358810\n450M\t358811\nps在线版\t358812\nstranger\t358813\n上世纪60年代\t358814\n84页\t358815\n容差\t358816\n荒木\t358817\n黄章\t358818\n可憎\t358819\n百兆交换机\t358820\n0度\t358821\n31项\t358822\n第几位\t358823\n数数\t358824\n夏克立\t358825\n一个26岁\t358826\n继续走\t358827\nportia\t358828\n嘉宝物业\t358829\n杨桃树\t358830\n金斯瑞生物科技有限公司\t358831\n顶石\t358832\nRegion\t358833\n越俎代庖\t358834\nxChun\t358835\n民生主义\t358836\n香港飞虎队\t358837\n朝服\t358838\n鲁山县\t358839\nWESTERN\t358840\n未时\t358841\n嗨钱\t358842\n碳线\t358843\n杨金水\t358844\n单行道\t358845\n柒牌\t358846\n狮子简笔画\t358847\n蓝绿色\t358848\n调行\t358849\n1299\t358850\nErotic\t358851\nf2c\t358852\n翻身路\t358853\n江宏恩\t358854\n反应液\t358855\nBatteries\t358856\n冒泡赛\t358857\n你是我兄弟\t358858\n区域\t358859\n30波\t358860\nheath\t358861\nBarrel\t358862\n胡金秋\t358863\n口袋妖怪中文网\t358864\n厚度\t358865\nsuperagent\t358866\n1万多\t358867\n阿里京东\t358868\n李芸\t358869\n4只\t358870\nitunes12\t358871\n第一临床医学院\t358872\n杨浦公园\t358873\n站子\t358874\n西安科技大学\t358875\n八周岁\t358876\nGoUSA\t358877\n1300元\t358878\n桂枝茯苓胶囊\t358879\n决战食神\t358880\n田薇\t358881\n金融信息服务公司\t358882\n红细胞\t358883\n徐水县\t358884\n新中心\t358885\n睑\t358886\nsqlyong\t358887\nwin7系统IE浏览器\t358888\n顾倾城\t358889\n伞\t358890\n董斌\t358891\n以至于\t358892\n法老王\t358893\n电饼\t358894\n丙组\t358895\n三大类\t358896\n独院\t358897\n狼国\t358898\n经管之家\t358899\n费丝\t358900\n雪堆\t358901\nㄋ\t358902\n半蹲\t358903\n郜\t358904\n东林村\t358905\n围观\t358906\n5.11\t358907\n访美\t358908\nJStorm\t358909\n安徽医科大学第二附属医院\t358910\nHC05\t358911\n行刑\t358912\n9119\t358913\n游艇展\t358914\n2017年12月\t358915\n杨元\t358916\n155号段\t358917\n张永和\t358918\n威尼斯水城\t358919\n删
除器\t358920\nstudio3.1\t358921\n萝莉们\t358922\n筑起\t358923\n国际金融机构\t358924\n版盘\t358925\n夏洛特\t358926\n46页\t358927\n2018045\t358928\n迷魂记\t358929\n尤尔\t358930\n童文红\t358931\n鑫鸿\t358932\nZeroMQ\t358933\n养养\t358934\n成飞集成\t358935\n宁波市北仑区人民政府\t358936\n最终幻想13-2\t358937\nshortcut\t358938\nmscomm32.ocx\t358939\n采区\t358940\n江南晚报\t358941\nCaster\t358942\n武汉招商银行\t358943\nwarmer\t358944\n莱戈拉斯\t358945\n湖北经济学院法商学院\t358946\n21关\t358947\n阿基诺三世\t358948\nideology\t358949\nwatch3\t358950\n10.3.6\t358951\n嵌固端\t358952\n椪柑\t358953\n郑州机场\t358954\n煤制气\t358955\n偷摸\t358956\n祝家庄\t358957\n平桂\t358958\n蛛丝马迹\t358959\n武进日报\t358960\n画纸\t358961\n无讼阅读\t358962\n安得广\t358963\n商君\t358964\n不齐\t358965\noffice2000\t358966\ntoolbox\t358967\nwinsor\t358968\n华星路\t358969\n卡瓦哈尔\t358970\n第十页\t358971\nlrc\t358972\nJdk\t358973\n储蓄所\t358974\n练习图\t358975\n湖墅南路\t358976\nToshokan\t358977\n狠招\t358978\n老江\t358979\n华为大学\t358980\n茉莉清茶\t358981\n头字\t358982\n赖冠霖\t358983\n北宁湾\t358984\n大酱汤\t358985\n婚外恋\t358986\n杀人\t358987\n两证合一\t358988\n鬼语\t358989\njuvenile\t358990\n喜啦\t358991\noculus\t358992\n方涛\t358993\n菱角湖万达广场\t358994\n丁腈橡胶\t358995\n真价\t358996\n国家外汇局\t358997\n除氧剂\t358998\n三体综合征\t358999\n电铺\t359000\n蜡样芽孢杆菌\t359001\n朱德庸\t359002\n苏教版.doc\t359003\n挨着\t359004\n茂名论坛\t359005\n深圳市标准技术研究院\t359006\n传呼机\t359007\n李尚龙\t359008\nzayn\t359009\n佞臣\t359010\nMOKA\t359011\n叶之凡\t359012\n洞头区\t359013\npdksh\t359014\n休耕\t359015\n奇穴\t359016\nクション\t359017\n万林\t359018\n波谷人\t359019\n人民政府\t359020\n木兰天池\t359021\n准备期\t359022\n郭巷街道\t359023\n马修·麦康纳\t359024\n随身听\t359025\n宁德一中\t359026\n確定\t359027\n1dm\t359028\n专职\t359029\n东原\t359030\n小瞧\t359031\nnha\t359032\n中南大学湘雅医院\t359033\n胡宏\t359034\n从化汽车站\t359035\nPivotTable\t359036\n民论\t359037\n郭跃\t359038\n周颂\t359039\njacobian\t359040\n镇府\t359041\nVLC\t359042\n两半\t359043\n密度图\t359044\n剑网3奇遇\t359045\n我帮网\t359046\n银质\t359047\n四招\t359048\n香水味\t359049\n投资研究网\t359050\n第35轮\t359051\n薏仁米\t359052\n啷\t359053\n剪压比\t359054\n邮轮港\t359055\n高松\t359056\nopenjdk7\t359057\nbow\t359058\n病毒软件\t359059\n耳蜗\t359060\n鉴黄师\t359061\n进口方\t359062\n平衡轴\t359
063\n长租\t359064\nanal\t359065\ndreamweaver\t359066\n600820\t359067\n爱驰亿维\t359068\n推挡\t359069\n内推网\t359070\n幸\t359071\n鼻青脸肿\t359072\n大同人才网\t359073\n10万只\t359074\nyouget\t359075\n1856\t359076\n鸿利智汇\t359077\n五年级语文阅读\t359078\nwaffle\t359079\n德版\t359080\nPyPy\t359081\n鸿运通\t359082\n铜官区\t359083\n峨眉山\t359084\n供给侧结构性\t359085\n台布\t359086\n南宁国际会展中心\t359087\n明儿\t359088\nASICS\t359089\n车辆抵押贷款\t359090\n爱乐特\t359091\n不插电\t359092\nseting\t359093\n印青\t359094\n设计公司\t359095\nTAD\t359096\n苏浙沪\t359097\n浮动汇率制\t359098\njest\t359099\n双沟镇\t359100\n冈本安全套\t359101\n019\t359102\nsiesta\t359103\n祖源\t359104\n侦探类\t359105\n浮头\t359106\n必克英语\t359107\nstm8s\t359108\n4933\t359109\n贷好\t359110\n活路\t359111\n性梦\t359112\n鸭寮街\t359113\n宫人\t359114\n基带\t359115\n艾玖\t359116\n汉语考试\t359117\n耻辱柱\t359118\nMTBE\t359119\n凤翔县人民政府\t359120\n宜搜\t359121\n风蓝\t359122\nTxT\t359123\n霸道帝\t359124\n围观者\t359125\n拆改\t359126\n鬼步舞\t359127\n高场\t359128\n黑体字\t359129\nshitu\t359130\n有望\t359131\n搬家网\t359132\n保时捷卡宴\t359133\n私房话\t359134\nceph-deploy\t359135\n初音ミク\t359136\n笛子谱\t359137\n烫伤膏\t359138\nTimable\t359139\n2714\t359140\nSame\t359141\n126个\t359142\n骑兵\t359143\n针叶樱桃\t359144\n流线型\t359145\n信号与系统\t359146\n单元格值\t359147\n天鹅湖大酒店\t359148\n70英寸\t359149\n清宫\t359150\n摊手\t359151\n20160801\t359152\n多伦多\t359153\n快乐玛丽\t359154\n罗伯特\t359155\nB端\t359156\n田家\t359157\n七白\t359158\nautism\t359159\nFDG\t359160\n4399少年三国志\t359161\n23款\t359162\n天荒坪镇\t359163\n_八\t359164\neggjs\t359165\n中国徐州网\t359166\n倪秋云\t359167\n最\t359168\n松美术馆\t359169\n1.13.3\t359170\n裸考\t359171\n小飞杨\t359172\nLocust\t359173\nVCC\t359174\nBurger\t359175\n云商\t359176\nphp\t359177\nzeromq\t359178\n王江源\t359179\n深圳南山区小学\t359180\n桂林两江机场\t359181\n截串\t359182\nkeyone\t359183\n地坎\t359184\n丰源\t359185\n擦炮\t359186\nbogo\t359187\nfxx\t359188\nFRIENDS\t359189\n茂林镇\t359190\nTreeView\t359191\n热血英豪\t359192\n立足于\t359193\n郭士强\t359194\n拍图\t359195\n外贸型\t359196\n灵域\t359197\n江海明子\t359198\n厄伽勒\t359199\n4710mq\t359200\n氏族\t359201\n南外方山分校\t359202\n郭嵩焘\t359203\nqueensland\t359204\n横荷\t359205\nDALF\t359206\n圆周率\t359207\nS
TAY\t359208\n市场假说\t359209\n英系\t359210\n11项\t359211\nBP神经网络\t359212\n平安汽车保险\t359213\n静音\t359214\n和平之家\t359215\n永寿县\t359216\n诗帖\t359217\n走转\t359218\ncanf\t359219\n爱客商务网\t359220\ncounsel\t359221\n雨敌\t359222\n国家开放大学\t359223\n原原本本\t359224\n鸿宝\t359225\n王亚民\t359226\n缺阵\t359227\n上古卷轴吧\t359228\n瑞奕\t359229\n单亲妈妈\t359230\nWebStrom\t359231\n直燃机\t359232\n十日\t359233\n皮下出血\t359234\n总局\t359235\n三联村\t359236\n憋尿\t359237\n自贡网\t359238\n茶兀\t359239\nNTN\t359240\n大岭山\t359241\n考察函\t359242\n柠檬糖\t359243\n骚麦\t359244\n诸恶莫作\t359245\n于小彤\t359246\n不潮\t359247\n天桥区政府\t359248\n群酯\t359249\n自谦语\t359250\n三微\t359251\n田家庵\t359252\n下值\t359253\n非限定性定语从句\t359254\n淫祭岛\t359255\n红海\t359256\n暑假\t359257\n伞绳\t359258\n媒婆\t359259\nlindsey\t359260\n75层\t359261\n蓝光播放器\t359262\n3400元\t359263\nkkm\t359264\n钼\t359265\nwhoo\t359266\n一二三四级\t359267\n2005款\t359268\n劫后余生\t359269\nanr\t359270\n伟力通\t359271\n珠海北\t359272\n黎家大院\t359273\n破碎剂\t359274\n5.12\t359275\nΨ\t359276\n梁安琪\t359277\n华阳集团\t359278\n前窗\t359279\nrealsense\t359280\n合法化\t359281\n92号\t359282\n越秀星汇云城\t359283\n补班\t359284\nyaahp\t359285\n潍坊市委\t359286\n耶稣受难记\t359287\n三相步进电机\t359288\n力道\t359289\n龙柏\t359290\n喜马拉雅\t359291\n法益\t359292\n柿子树\t359293\n克拉玛依区\t359294\n第82集\t359295\n做勇于\t359296\n扬弃\t359297\n园冶\t359298\n长沙医院\t359299\nGloss\t359300\n张洋\t359301\n维库电子市场网\t359302\n赵洁\t359303\n泡发\t359304\ncustoms\t359305\n栾川\t359306\n风雅陶笛\t359307\n打连击\t359308\n赵长军\t359309\n丰碑\t359310\n为战\t359311\n企企\t359312\n操作压力\t359313\n本尼迪克特·康伯巴奇\t359314\n知心姐姐\t359315\ncarbon\t359316\n金达\t359317\ntravis\t359318\n12号\t359319\naom\t359320\n赤坎古镇\t359321\nC语言运算符\t359322\n2E\t359323\npollen\t359324\n黔山\t359325\n微纳\t359326\n希而科\t359327\n多个\t359328\n灌流\t359329\n税收入\t359330\nknights\t359331\n1.5.1\t359332\n味之素\t359333\nrbac\t359334\n10.1寸\t359335\ncs231\t359336\n禁渔\t359337\n霹雳娇娃\t359338\n毕业实习\t359339\nscute\t359340\n孤家寡人\t359341\n飞行学院\t359342\nstarted\t359343\n世相\t359344\ncarekee\t359345\nsentimental\t359346\nxible\t359347\n青海省财政厅\t359348\n金海大桥\t359349\n徒刑\t359350\n地灵\t359351\nAstrology\t359352\nSHOPPING\t3
59353\nmilan\t359354\n首艘\t359355\n猫仔队\t359356\nASPNET\t359357\nRC版\t359358\nVAPE\t359359\n2起\t359360\n林元庆\t359361\n0454\t359362\n100万次\t359363\n纯苯\t359364\n神神\t359365\nexcel函数\t359366\n霍州\t359367\n领袖\t359368\n第六版\t359369\n被否\t359370\np6000\t359371\n高擎\t359372\n购物场\t359373\n柏拉图\t359374\nALIVE\t359375\n细丝\t359376\n奶师\t359377\n无糖\t359378\nMAG\t359379\n美好的世界\t359380\n张晓鹏\t359381\n喷灌\t359382\n霍璇\t359383\np6\t359384\ntheorem\t359385\n助医\t359386\n收惊\t359387\n中华人民共和国反洗钱法\t359388\n四月四号\t359389\n二线\t359390\n20余名\t359391\n74\t359392\n柯洁\t359393\n赤赤\t359394\nTRAINING\t359395\n30款\t359396\n回沙酒\t359397\n再现\t359398\n樊城区\t359399\n北京县\t359400\n隔三差五戏曲网\t359401\n携程旅游网\t359402\n临客\t359403\nSteakhouse\t359404\nPc\t359405\n英才街\t359406\n潮汕牛肉丸\t359407\n互调\t359408\nperfectly\t359409\n建港\t359410\n第1节\t359411\n再见艳阳天\t359412\n艾尼路\t359413\n内燃\t359414\nWIFI万能钥匙\t359415\ntrimble\t359416\n安联财险\t359417\nBLDG\t359418\nqt控件\t359419\n伎巧\t359420\nv3.3\t359421\nhdmi线\t359422\n三等奖\t359423\n导水\t359424\n袁珂\t359425\nppv\t359426\n中环股份\t359427\n所得税额\t359428\n店堂\t359429\njiazai\t359430\n平推式\t359431\nVIM\t359432\n琼岛\t359433\n新旧\t359434\n鲁特\t359435\n酒窖\t359436\nwsf\t359437\n梭子蟹\t359438\nXO\t359439\n滋补\t359440\n莎车\t359441\n清清楚楚\t359442\n吐\t359443\n光电展\t359444\n死生\t359445\n罗湖商业城\t359446\n平板电视\t359447\n新筑\t359448\n行波\t359449\n孟晚舟\t359450\n清明节节\t359451\n北京销售分公司\t359452\n碾子山\t359453\n点火\t359454\n全赖\t359455\n达\t359456\nOMEN\t359457\n前调\t359458\n保利崖州湾\t359459\n138万\t359460\n林依\t359461\n20170815\t359462\nAsqql\t359463\n56页\t359464\n菜盘\t359465\njuediqius\t359466\n波鞋街\t359467\n芒街\t359468\n豫能控股\t359469\n维盟\t359470\n灭弧\t359471\nlolh\t359472\n高复\t359473\n郭乐乐\t359474\nexis\t359475\n爱情运\t359476\n映射网络驱动器\t359477\neddie\t359478\n乳酸链球菌素\t359479\n沐恩\t359480\n原材\t359481\n欧诺_长安欧尚_欧诺报价\t359482\n盈趣科技\t359483\n成贵铁路\t359484\n参照表\t359485\nIsabella\t359486\nCRB\t359487\n空气缸\t359488\n水幕墙\t359489\n管理机构\t359490\n潮连\t359491\n宫主\t359492\n草砖\t359493\n韦科\t359494\n52832\t359495\n保安镇\t359496\nscheduled\t359497\n硼酸洗液\t359498\n想说一句\t359499\n电脑椅\t359
500\n客家\t359501\n几首\t359502\n牧羊人\t359503\n词不达意\t359504\n148个\t359505\n狼王\t359506\n茅以升\t359507\n弓手\t359508\nPPN\t359509\n七乐\t359510\n87页\t359511\n孙胜妍\t359512\n预留\t359513\n1n\t359514\ngaia\t359515\n北京中学\t359516\n全钢\t359517\nknox\t359518\n凤霸\t359519\n2018.1.1\t359520\nKingCMS\t359521\n稻壳\t359522\n华盛\t359523\n二马\t359524\n整站下载器\t359525\n0752\t359526\n校园招聘_高校人才网\t359527\n交通流理论\t359528\nPCA算法\t359529\n命理\t359530\n冒险游戏\t359531\n小马国\t359532\n阿耀\t359533\n3dmax2017\t359534\n钟秀\t359535\nUGG\t359536\nRX580\t359537\n熊族\t359538\n友谊路\t359539\n176万\t359540\n10平方米\t359541\n天骐\t359542\nlihaiping\t359543\n高严\t359544\n四十二岁\t359545\n5.so\t359546\n坩埚\t359547\n涿鹿\t359548\nCTI\t359549\nwww.gddx\t359550\n重兵\t359551\n崇贤镇\t359552\n公办学校\t359553\n北邮人\t359554\nPTS\t359555\n索达吉\t359556\n肉夹馍\t359557\n进近\t359558\n语风\t359559\n光驱位硬盘托架\t359560\n酣战\t359561\n自贸试验区\t359562\n增值\t359563\n海南东方政府网\t359564\n说明白\t359565\n7.2.1511\t359566\n0.08%\t359567\nbp神经网络\t359568\n上海岳阳医院\t359569\nhello\t359570\nOpencv\t359571\n冰儿\t359572\n剑桥少儿英语\t359573\n一个零\t359574\n经管代码库\t359575\n郁结\t359576\n刮粪机\t359577\n堆满\t359578\n240公里\t359579\n佳大鸡排\t359580\n绝地求生平底锅\t359581\n中央党\t359582\n机械工\t359583\n奥克股份\t359584\n幼儿园世界读书日\t359585\nintent\t359586\n易得\t359587\n4399功夫派\t359588\n仏\t359589\n卡刷\t359590\n正官\t359591\n我要长恨歌\t359592\nf411\t359593\n亲电\t359594\n酷睿i\t359595\nRICOH\t359596\n崇川政府网\t359597\n汰\t359598\nMON\t359599\n可转换债券\t359600\n虚空断章\t359601\n宴请\t359602\nlouie\t359603\nZilla\t359604\n继保\t359605\n国家海洋局东海分局\t359606\n桃园机场\t359607\n管员\t359608\n川军团\t359609\nfixtures\t359610\n黄种人\t359611\n不可貌相\t359612\n开水房\t359613\n书旗小说_书旗网免费小说阅读|书旗小说下载\t359614\n游戏人\t359615\nconnectors\t359616\n光致\t359617\n经营学\t359618\n炽爱\t359619\n计算机信息系统罪\t359620\n一兆\t359621\nMaxi\t359622\n去向\t359623\n曲阜市\t359624\n达斯琪\t359625\n丧服\t359626\n30db\t359627\n印痕\t359628\n洪范\t359629\n东城逸家\t359630\n新亮点\t359631\n民生人寿\t359632\nHACK版\t359633\nnoa\t359634\n罗佳\t359635\n拜托了冰箱\t359636\na2dp\t359637\n业务类\t359638\n北京宣武医院\t359639\n超新\t359640\n锆石\t359641\n芜湖方特欢乐世界\t359642\n井边\t359643\n敦君\t3
59644\ncommit\t359645\nended\t359646\n西安旅游团\t359647\n呆毛\t359648\n10代\t359649\n35所\t359650\na30\t359651\n2015—2016年\t359652\nRYZEN\t359653\n一个多行\t359654\n吃苹果\t359655\nc++2005\t359656\n罗莱生活\t359657\nShang\t359658\n风叶\t359659\nlsat\t359660\n唐静颜\t359661\n黄蜂\t359662\n數據\t359663\n本职工作\t359664\n欢乐书客\t359665\n20180106\t359666\n2016年09月\t359667\n容器化\t359668\n册页\t359669\njuy\t359670\n上海市高级人民法院\t359671\n四川文化产业职业学院\t359672\nmcc码\t359673\n下午5点\t359674\n319国道\t359675\n云计算机\t359676\n诸城市\t359677\nlawyer\t359678\ncobas\t359679\nbdc\t359680\n孕检\t359681\n刘继顺\t359682\nBOLT\t359683\nsure\t359684\n完成式\t359685\n斗米兼职网\t359686\n仁德\t359687\n言志\t359688\n芜湖市区\t359689\nNPT\t359690\n妙物\t359691\neine\t359692\nv3.3.3\t359693\n香港四季酒店\t359694\n上旬\t359695\n学警雄心\t359696\n广智\t359697\n翠诗\t359698\n关氏\t359699\n您们\t359700\n叻沙\t359701\n省红十字会\t359702\n20161112\t359703\n接处\t359704\n沭\t359705\n洛伊\t359706\n聚甲醛\t359707\nrein\t359708\n喜从天降\t359709\n数值变量\t359710\n变种\t359711\nToolBar\t359712\n涿鹿之战\t359713\n天南星科\t359714\n133\t359715\n好运查理\t359716\n20日\t359717\nalwayson\t359718\n水泉\t359719\nvoyo\t359720\n前兆\t359721\n基值\t359722\n利利\t359723\n中交集团\t359724\n垭口\t359725\n酷猫电影网\t359726\n剑子仙迹\t359727\n河西村\t359728\n不认账\t359729\n三上悠亜\t359730\nASO100\t359731\n神墓\t359732\n注册建筑师资格考试\t359733\nshirley\t359734\n蹬\t359735\n巴黎市区\t359736\n互动\t359737\n蒋韬\t359738\nMellon\t359739\n0.5个\t359740\nh2\t359741\n凤凰金融\t359742\n最低温\t359743\n人工挖孔桩\t359744\n1000亿美元\t359745\n威基基\t359746\n内黄\t359747\n冴岛香织\t359748\n月子中心\t359749\n会车\t359750\n暗号\t359751\n归真版\t359752\n联合国工业发展组织\t359753\nalgorithmic\t359754\n浙江省新闻出版广电局\t359755\n龙级\t359756\n北京建工集团\t359757\n剑网三视频编辑器\t359758\n李常超\t359759\n中餐厅\t359760\n错误值\t359761\n4.9折\t359762\n二航局\t359763\n牙长\t359764\n阿里中间件\t359765\n600718\t359766\n北大口腔医院\t359767\n白水湖\t359768\nDAN\t359769\n平氏\t359770\n广州市水务局\t359771\n扩张\t359772\n张奕\t359773\n话版\t359774\n克罗格\t359775\n社保费\t359776\n血天使\t359777\n克敌\t359778\njpj\t359779\nstrconv\t359780\n拼酒\t359781\n伞形\t359782\n居间\t359783\npagesize\t359784\n35款\t359785\n吉林大学图书馆\t359786\n党员大会\t359
787\n第九集\t359788\n被逼无奈\t359789\ndinosaurs\t359790\n家政服务网\t359791\n磁通密度\t359792\n墨芳\t359793\n市城管执法局\t359794\n山葡萄\t359795\n孙中\t359796\n李征\t359797\n主力舰\t359798\n布伦特原油\t359799\n_汕\t359800\n广西广电\t359801\n肺痨\t359802\n苗木场\t359803\n交出\t359804\n张小燕\t359805\nHeyJuice\t359806\n华池\t359807\n江苏省测绘地理信息局\t359808\nQMap\t359809\n冷水泡\t359810\n农村幸福院\t359811\n南京西站\t359812\n类比\t359813\nFlot\t359814\n现如今\t359815\nrcfans\t359816\n3k\t359817\npixijs\t359818\nsemester\t359819\n可尔\t359820\nreap\t359821\n唐师曾\t359822\n酸枝\t359823\n驴车\t359824\n浓硝酸\t359825\nccflow\t359826\n眼看\t359827\n零币\t359828\n县管校聘\t359829\n相临\t359830\n蓝焰\t359831\n纤修堂\t359832\n88.5\t359833\nsql数据库\t359834\n30亩\t359835\n废砂\t359836\n指挥刀\t359837\n云堤\t359838\n秦建伟\t359839\n见解\t359840\n科探\t359841\n人工成本\t359842\n织梦者\t359843\n迪士尼季卡\t359844\nzxfuli福利社\t359845\n自流\t359846\n青霞\t359847\n呼吸科\t359848\n广河高速\t359849\n1921\t359850\n七_\t359851\n飞天茅台酒\t359852\n泊松分布\t359853\ntham\t359854\nclarification\t359855\n肖涛\t359856\n焊工\t359857\n开采量\t359858\n巨款\t359859\nMoya\t359860\n被拘留\t359861\nAGREE美\t359862\nm3u8xx\t359863\n外海\t359864\nClick\t359865\n雍景湾\t359866\n皱眉\t359867\nLinode\t359868\n016年\t359869\n王煜\t359870\n计算机科学与工程系\t359871\n面试题库\t359872\n沈阳铁西区政府\t359873\n数一个\t359874\n同步通信\t359875\nhead\t359876\n鼻炎馆\t359877\n电纸书吧\t359878\nMarvelous\t359879\nwordnet\t359880\nCompressor\t359881\n袁州\t359882\n运梁\t359883\n群展\t359884\n要知\t359885\nftw\t359886\n苗疆客\t359887\n大理大学\t359888\n刻刻\t359889\n死亡国度\t359890\n虚弱\t359891\n饮冰\t359892\n低碳环保\t359893\n糊涂仙\t359894\n大椎\t359895\n上海锦江乐园\t359896\n保真\t359897\n20141115\t359898\n李红霞\t359899\n江湖人\t359900\nmssql\t359901\nbabun\t359902\nnotes\t359903\n100摄氏度\t359904\n勋伯格\t359905\ncssci\t359906\nanalyze\t359907\n塞蕾娜\t359908\n摔跤手\t359909\n深巷\t359910\n成都婚庆公司\t359911\n第49个\t359912\n望天湖\t359913\n死亡骑士\t359914\nPOP3\t359915\n35套\t359916\naircrack\t359917\n乳腺囊肿\t359918\n倩女幽魂手游\t359919\nsetText\t359920\n9档\t359921\n四季青街道\t359922\ninclude_once\t359923\nwished\t359924\n史籍\t359925\n湛江经济技术开发区\t359926\n中国石油\t359927\nHACG\t359928\n吴小姐\t359929\n超
盟\t359930\n倩倩\t359931\n八里\t359932\n金红叶纸业集团有限公司\t359933\n拉拉吧\t359934\n霍金传\t359935\n师训\t359936\n贵阳地铁3号线\t359937\n苏叔阳\t359938\n雷甸镇\t359939\n生僻字\t359940\n数学广角\t359941\nDOA5LR\t359942\n东园小区\t359943\n风水学\t359944\n轮复习\t359945\n苏州石路\t359946\n教育基金会\t359947\nscalable\t359948\nm4\t359949\n滏阳河\t359950\n症状性\t359951\n40行\t359952\n现金股利\t359953\n等不及了\t359954\n更有效\t359955\n宣统元宝\t359956\n河里\t359957\n红米5Plus\t359958\n伊始\t359959\n清早\t359960\ne531\t359961\n速霸2000\t359962\n波斯纳\t359963\n小肥羊\t359964\nAfraid\t359965\nphosphate\t359966\n全季酒店\t359967\n2.6.0\t359968\nCDROM\t359969\n京翰高考网\t359970\n早上好\t359971\n中望\t359972\nAccountants\t359973\n安盾\t359974\neric6\t359975\n拖把杆\t359976\n七寸\t359977\n贴单\t359978\n珲春市\t359979\n回忆录\t359980\n排序函数\t359981\n茅洪斌\t359982\n煮面\t359983\n青岛航空\t359984\nVitro\t359985\n铁十字\t359986\nng-hide\t359987\ny67a\t359988\n报险\t359989\n金融危机\t359990\n镀铝锌板\t359991\n米修\t359992\n深圳坪山站\t359993\n赋得古原草送别\t359994\n溯\t359995\n基础会计\t359996\n威王\t359997\n酷派集团\t359998\n6.4.3\t359999\n夜魔\t360000\n粤港澳大湾\t360001\n鳡鱼\t360002\n铭润\t360003\n刘先森\t360004\n三十分\t360005\n市郊\t360006\n滤杯\t360007\n百字明咒\t360008\nShiina\t360009\nSpring-MVC\t360010\n御景华城\t360011\n相安无事\t360012\n语翼\t360013\n淡蓝\t360014\n遭破坏\t360015\n爱上你\t360016\n知识产权工程师\t360017\n李克勤\t360018\niop\t360019\n24_\t360020\n暖通家\t360021\n一色\t360022\n封泥\t360023\n商讯\t360024\n贼鸥\t360025\n蜀山战纪2\t360026\nEXCHANGE\t360027\n决策树算法\t360028\n上府\t360029\n项li\t360030\n安妮·海瑟薇\t360031\n中度脂肪肝\t360032\n仙禽\t360033\n二级学院\t360034\n边风炜\t360035\n英大\t360036\nview\t360037\n疯狂的麦克斯4\t360038\n五指山市人民政府\t360039\nSquares\t360040\n大卫·芬奇\t360041\n祥和路\t360042\n气门油封\t360043\necplise\t360044\n广陵散\t360045\nJL\t360046\n超大陆\t360047\nSQLException\t360048\nNPN型\t360049\n茶商\t360050\n非真\t360051\n超频软件\t360052\ncellcard\t360053\nDQ8\t360054\n人皇纪\t360055\nautoprefixer\t360056\n铁血德意志\t360057\n纸卡板\t360058\nGR\t360059\n20150513\t360060\n一叶孤舟\t360061\n20万张\t360062\n凤华\t360063\n超魔神英雄传\t360064\n深圳西部\t360065\ncadey\t360066\n兴贤路\t360067\n第十代\t360068\n琥珀酸亚铁片\t360069\n流进\t360070\n百业招商网\t360071\npostgresq\
t360072\n坑儿\t360073\nISO认证\t360074\n双星集团\t360075\n魏忠贤\t360076\n整形师\t360077\n真真假假\t360078\n八日\t360079\nMorning\t360080\n人间炼狱\t360081\n001002\t360082\nwegamednf\t360083\n无功而返\t360084\nLam\t360085\n茉莉酸甲酯\t360086\n叶之秋\t360087\n石川绫子\t360088\n武宁路\t360089\n超级牛逼\t360090\n格勒\t360091\n佳易儿歌网\t360092\n幼色\t360093\nConstantine\t360094\n最出色\t360095\nK5\t360096\nWhenever\t360097\n重庆美团网\t360098\nDownloader\t360099\navlang\t360100\n静安府西区\t360101\n海业\t360102\nADOQuery\t360103\n薯\t360104\n心急火燎\t360105\n马向阳\t360106\n雷蛇云\t360107\n23项\t360108\n妆炫商城\t360109\n转正\t360110\nparenting\t360111\n新浪云\t360112\n桃浦镇\t360113\n标准气\t360114\n中国东方航空公司\t360115\n实物\t360116\n星蝶公主\t360117\n一览_安游在线\t360118\n拉网\t360119\n触屏失灵\t360120\n直銷\t360121\n12349\t360122\n钢塑土工格栅\t360123\nEdgar\t360124\n陕州区\t360125\n四大美人\t360126\n第24条\t360127\n穿越时空的思念\t360128\n9199\t360129\nGUMI\t360130\n长安欧尚长安CX702018款\t360131\n风靡\t360132\n王东峰\t360133\n兰德酷路泽论坛_汽车之家论坛\t360134\n逆变器\t360135\n鱼爪\t360136\n6280\t360137\n乐叶\t360138\n克隆赛\t360139\n国家林业局办公室\t360140\nUnder\t360141\n秦亚青\t360142\n国网河北省电力公司\t360143\n水湿\t360144\n祈Inory\t360145\nFinding\t360146\n_罗技\t360147\n瑞府\t360148\nA72\t360149\n男技师\t360150\n听命\t360151\n结建\t360152\nLANG\t360153\n基金净值\t360154\nledger\t360155\n特瑞\t360156\n空舞\t360157\ncurl\t360158\n童趣\t360159\n治家\t360160\n市农委\t360161\n学编\t360162\nrhoades\t360163\n科威特第纳尔\t360164\n博迅\t360165\n博能\t360166\nJumping\t360167\n高兴镇\t360168\n长江后浪推前浪\t360169\n山东省林业厅\t360170\ntalib\t360171\n放不到\t360172\nDANE\t360173\n领导者\t360174\n联想小新潮7000\t360175\n温格\t360176\n特普\t360177\n变色\t360178\nFES\t360179\n600年\t360180\n略带\t360181\n令人羡慕\t360182\nnun\t360183\n朗文版\t360184\n混沌理论\t360185\n种火\t360186\n舞蹈老师\t360187\nvolunteering\t360188\n高容量\t360189\nPATH\t360190\ntoma\t360191\nmaersk\t360192\n幽默观察家\t360193\n护理期\t360194\n魂银\t360195\n光立方\t360196\nkt88\t360197\n110.com\t360198\n盘饰\t360199\n踩水\t360200\n重庆市机关事务管理局\t360201\n退热贴\t360202\n如许\t360203\n科力普\t360204\n红帽认证\t360205\n240分\t360206\n致盛\t360207\n15话\t360208\n伏羲\t360209\n威廉·萨默塞特·毛姆\t360210\nchatroulette\t360211\n通知行
\t360212\n张公洞\t360213\n18月\t360214\nPirates\t360215\n手饰\t360216\n永泰路\t360217\n芒果酱\t360218\n上海大学研究生院\t360219\n三菱\t360220\n1024\t360221\n柠檬酒\t360222\n应变片\t360223\n新城国际\t360224\n济南办公室\t360225\n送入\t360226\n细比\t360227\nwindows资源管理器\t360228\n食品药品监管局\t360229\n潮舞\t360230\n数博会\t360231\n盘锦\t360232\n品牌排名\t360233\n集庆门\t360234\n梅村街道\t360235\n单机游戏吧\t360236\n天雅\t360237\n戴逸\t360238\n中杯\t360239\n塘市\t360240\n流行服饰\t360241\n手磨机\t360242\ndotm\t360243\n一光年\t360244\nResearcher\t360245\n分笼\t360246\n鸡手\t360247\n张之洞\t360248\n六爷\t360249\n一大堆\t360250\n红四方面军\t360251\n2018年8月份\t360252\n情欲史\t360253\nnearest\t360254\n自体软骨隆鼻\t360255\nnew2dsll\t360256\n锦州南\t360257\n无垠的太空\t360258\n鼠药\t360259\n萧山国际机场\t360260\n杭州政府\t360261\n许琳\t360262\n南山网\t360263\n中铁四局集团\t360264\n密目网\t360265\n杭州火车站\t360266\n哥德巴赫\t360267\n大文\t360268\np8z77\t360269\n复转\t360270\n星沙\t360271\nfullpage\t360272\n中国五矿集团有限公司\t360273\n空灵\t360274\nV35\t360275\n小资\t360276\n白碱滩区\t360277\nsync2\t360278\nFUT\t360279\n阿莹\t360280\n絮絮叨叨\t360281\n乾\t360282\n面诊\t360283\n防波堤\t360284\n亿联网络\t360285\nwebstrome\t360286\n垂直型\t360287\n横塘村\t360288\nbalancing\t360289\n1223\t360290\n内华达州\t360291\n王宏\t360292\n营养膏\t360293\n佛坪\t360294\n搜尋\t360295\ntess\t360296\n江辰\t360297\nCeleb\t360298\nDicks\t360299\n4.5亿美元\t360300\n20141227\t360301\n新旅\t360302\nkdump\t360303\n龙组\t360304\n第一枪\t360305\n德固赛\t360306\n神佛\t360307\n老管\t360308\n浦泽直树\t360309\n杨孟霖\t360310\n1房\t360311\n韦正翔\t360312\n糯玉米\t360313\n西安地铁10号线\t360314\n焦娇\t360315\n400號碼\t360316\nnevada\t360317\n透雕\t360318\n信长之野望11\t360319\n第09集\t360320\n船难\t360321\n诗词集\t360322\nIFIX\t360323\n梁鹏\t360324\n往南\t360325\npolitics\t360326\nclaw\t360327\n化工招聘网\t360328\n2017年11月\t360329\n海怡\t360330\nexle\t360331\n嘉义\t360332\n包案\t360333\n上海市人民政府发展研究中心\t360334\n过敏性咽炎\t360335\n腿纹\t360336\nbond\t360337\nH0930\t360338\n清凉寺\t360339\n坤平\t360340\n枕部\t360341\n下丘\t360342\n联通3g\t360343\n和解协议书\t360344\n点石成金\t360345\npiay\t360346\n童大焕\t360347\n周宏\t360348\n【鑫\t360349\n毕业班\t360350\n王彬\t360351\n荣发\t360352\n阮恒\t360353\ntriplet\t360354\n潘春春\t360355\n1068\t36
0356\nWin键\t360357\n就中\t360358\n顺驰\t360359\n林国斌\t360360\n东方风云榜\t360361\n植物大战僵尸吧\t360362\n暗敷\t360363\n缺血灶\t360364\n车保险\t360365\n龙趸\t360366\n笑逐言\t360367\n虚空干扰器\t360368\n1秒内\t360369\n二重积分\t360370\n忏\t360371\n马关条约\t360372\nphpStorm\t360373\n正弦\t360374\n37套\t360375\nCOK列王\t360376\n联化\t360377\n戴夫\t360378\n伐\t360379\n单声\t360380\nGetty\t360381\n生品\t360382\n五届\t360383\n税收优惠政策\t360384\n支付宝卡\t360385\n营收\t360386\nP卡\t360387\n涧西\t360388\n蚌埠南站\t360389\narpg\t360390\n林灵\t360391\n高明骏\t360392\n额济纳\t360393\nnoi\t360394\n沟通者\t360395\n元气弹\t360396\n2017-10-31\t360397\n商业险\t360398\n这次\t360399\n鄄城政府\t360400\n世茂御龙湾\t360401\nDiversity\t360402\n杯\t360403\nlpl\t360404\n203dpi\t360405\n跳刀\t360406\n鲁汶大学\t360407\n马卫光\t360408\n八十九\t360409\n耳兔\t360410\n所得税制\t360411\n举手之劳\t360412\n不存在\t360413\n佛科院\t360414\n垂丝海棠\t360415\n20180219\t360416\n上镇\t360417\n乌鲁\t360418\n1+X\t360419\n窃\t360420\n读卡\t360421\nwacom手绘板\t360422\nOVO\t360423\n忿\t360424\nSOLUTIONS\t360425\n王林\t360426\n预结\t360427\n富马酸\t360428\n腾达路由器\t360429\n中华演出网\t360430\n一号公路\t360431\nzamorano\t360432\n动脉硬化\t360433\nproftpd\t360434\n玩奴\t360435\n烈焰\t360436\n20160803\t360437\n紫帽山\t360438\n阿兰·德波顿\t360439\n武汉市公安局交通管理局\t360440\n多玩电视游戏\t360441\n天标\t360442\n王保华\t360443\n震中\t360444\n毛\t360445\n无错\t360446\n微星\t360447\n禾川\t360448\n17.8%\t360449\n南开大学历史学院\t360450\nngxplay\t360451\nConclusion\t360452\n诗选\t360453\n真龙霸业\t360454\n管理学院\t360455\n10nm\t360456\n杰克琼斯\t360457\n神雕侠侣_九游论坛\t360458\n机具\t360459\n米罗蒂奇\t360460\n广告袋\t360461\n易天\t360462\nhooked\t360463\n天根\t360464\n四川博物院\t360465\n权衡\t360466\n滨河大道\t360467\n国家级高新区\t360468\n猛犸年\t360469\n邪恶漫画大全\t360470\n华强方特\t360471\ndafont\t360472\ndalian\t360473\n十大类\t360474\n三领\t360475\nbazooka\t360476\n雷霆手游网\t360477\n和信贷\t360478\n中华MOD网\t360479\n李莎\t360480\n云大\t360481\n缘定三生\t360482\n取水\t360483\nhd7770\t360484\n1.74%\t360485\n6.com\t360486\n黄厝\t360487\n随行\t360488\n九龙\t360489\n7135\t360490\n超级解霸\t360491\nwebex\t360492\n空气袋\t360493\n变力\t360494\n赝作\t360495\n引导语\t360496\n编版\t360497\n本山快乐营\t360498\nbora\t360499\n脓液\t360500\n奥比\t3605
01\n性爱指南\t360502\n中国西藏网\t360503\n东盛\t360504\n长春财经学院\t360505\n兰乔圣菲\t360506\n济宁市第一人民医院\t360507\nbianji\t360508\nCocos2d-x3.2\t360509\n高斯计\t360510\nprefork\t360511\n懒羊羊\t360512\ngost\t360513\n条形统计图\t360514\n酒吞\t360515\n本泽朋美\t360516\ngec\t360517\n隆隆\t360518\nFBP\t360519\n大宝山\t360520\n直流电桥\t360521\n欢乐世界\t360522\n酒头\t360523\n2DHGAME\t360524\n说尽\t360525\n蓝毗尼\t360526\n护路\t360527\nwooden\t360528\nsql索引\t360529\n送审稿\t360530\n吉林大学哲学社会学院\t360531\nyoutobe\t360532\n老姨\t360533\nblossom\t360534\nxvfb\t360535\n南锡\t360536\n整屏\t360537\n酣畅\t360538\n冷热电\t360539\n两个半月\t360540\n同频共振\t360541\n荣耀3X\t360542\nETAS\t360543\n热爱者\t360544\nisnumeric\t360545\n周尔晋\t360546\n蝴蝶步\t360547\n二奶奶\t360548\nFolder\t360549\n疤膏\t360550\n床榻\t360551\n微语\t360552\n叫号机\t360553\n胡思\t360554\nSQS\t360555\n石狮市人民法院\t360556\n奔驰GLA200\t360557\n风凉话\t360558\n三叶罗茨风机\t360559\n规陪\t360560\n1.168\t360561\n喷雾型\t360562\n蚁力神\t360563\nSW-331\t360564\n眼袋\t360565\n地广人稀\t360566\ncharter\t360567\n非字母\t360568\n冷机\t360569\n生化危机6\t360570\n黑银\t360571\n好315致富网\t360572\n王金平\t360573\n中县\t360574\n黄锦燊\t360575\n十驾\t360576\noffice2016word\t360577\n群书\t360578\n金瓶梅2008\t360579\n四街\t360580\n激进派\t360581\nミルク\t360582\n托尔巴拉德\t360583\n趙長卿\t360584\n荣丰\t360585\n乳糜泻\t360586\nfalied\t360587\n8817\t360588\n企业工资指导线\t360589\n中储粮\t360590\nq235a\t360591\nbeginthreadex\t360592\n试写\t360593\n电饼铛\t360594\n华耐\t360595\n王新生\t360596\n华人街\t360597\n丙二醇甲醚醋酸酯\t360598\n苏州儿童医院\t360599\nrenal\t360600\n泰文\t360601\narts\t360602\n圣基茨\t360603\n姗姗来迟\t360604\n古龙\t360605\n江岭\t360606\n合订\t360607\n价格竞争\t360608\n习明泽\t360609\n陈娅安\t360610\n残星\t360611\n出挑\t360612\n五矿信托\t360613\n第22个\t360614\n油气\t360615\nDELF\t360616\n厌食\t360617\n沱牌酒\t360618\nwar\t360619\n9.9元\t360620\n朱晨\t360621\ncoco\t360622\n计算机组\t360623\narchitectural\t360624\n完身\t360625\n天铁\t360626\n龙泉街道\t360627\n面条机\t360628\n坚瑞消防\t360629\n我的骄傲\t360630\n双笙\t360631\n华人\t360632\nColorPicker\t360633\ncheng\t360634\nNZT\t360635\n5遍\t360636\n创源\t360637\n_浦\t360638\n石心\t360639\n大众夏朗\t360640\n修砖\t360641\nstak\t360642\n大众POLO\t360643\n直缝\t3606
44\n星艺装饰\t360645\n烽火戏诸侯\t360646\n柰子\t360647\nTREK\t360648\nzxy\t360649\nGMA\t360650\n锦州医科大学附属第一医院\t360651\n脊髓灰质炎灭活疫苗\t360652\n佳能EOS-1D\t360653\n60部\t360654\n德国国家队\t360655\n七十五\t360656\n六亲不认\t360657\nlarvael\t360658\n5000多\t360659\ngreg\t360660\n赛时\t360661\n安卓服务器\t360662\nmininet\t360663\n音影\t360664\n娃子\t360665\n永东\t360666\n候机厅\t360667\n全额\t360668\n黄沁\t360669\n宋太宗\t360670\n豫贸网\t360671\nsharpdesk\t360672\n大岳\t360673\n溧阳市\t360674\n作训\t360675\n三派\t360676\n补血\t360677\n银屑病频道_健客网\t360678\n天工网工程信息网\t360679\n广州装修公司\t360680\n人鱼小姐\t360681\n卖课\t360682\n白石真琴\t360683\n特许金融分析师\t360684\n舞跳\t360685\n郭巨\t360686\n黄有维\t360687\n管沟\t360688\n豆沙馅\t360689\nenvato\t360690\n家用血糖仪\t360691\n六元\t360692\n篁岭\t360693\n兴隆公园\t360694\n56号\t360695\nNightMar\t360696\n初生\t360697\ndownlo\t360698\n成长之路\t360699\n鬼咬鬼\t360700\n2018周年\t360701\nItools\t360702\ndisk\t360703\n8_10\t360704\ndata-options\t360705\n雪盲\t360706\n赵苏禾\t360707\nvogue\t360708\n读览天下\t360709\n同体\t360710\n本地仓\t360711\n北京上大学\t360712\n北笙\t360713\n并购部\t360714\n柜柜\t360715\n流式细胞\t360716\n招标采购网\t360717\n怜香\t360718\n珠宝大家坛\t360719\n王者世界\t360720\n自由天空论坛\t360721\n浏阳市人民政府\t360722\n无法呼吸\t360723\nTTK\t360724\n多赢财富网\t360725\n新利\t360726\n荣耀畅玩6A\t360727\n领路\t360728\n南京八一医院\t360729\ngithup\t360730\n4wd\t360731\n港机\t360732\n世界防治结核病日\t360733\nchanged\t360734\n长丰县人民政府\t360735\n阆中古城\t360736\n可靠性\t360737\n类别\t360738\n赫伯罗特\t360739\nmaltab\t360740\n吴运铎\t360741\n卖买\t360742\n哇哈哈\t360743\n曲江海洋馆\t360744\n2016年1月25日\t360745\n王晓娟\t360746\n张歆马伯庸\t360747\n东莞市人民医院\t360748\n思影\t360749\nfen\t360750\nartifactory\t360751\n我行我素\t360752\n佐贺\t360753\nLOLS7\t360754\n天津渤海职业技术学院\t360755\n鸟害\t360756\n离婚时\t360757\n头孢曲松钠\t360758\n怀孕期\t360759\nX200\t360760\n阿城区\t360761\ndatasets\t360762\n联系人\t360763\nVeil\t360764\n亲生活\t360765\n144hz\t360766\n七叶树\t360767\n第195集\t360768\nGeoNet\t360769\n莲生\t360770\n9朵\t360771\nhavana\t360772\n加班加点\t360773\n茵特拉根小镇\t360774\n献词\t360775\n计生证明\t360776\n魔心\t360777\n分页类\t360778\ngranted\t360779\n原班\t360780\n镇平县\t360781\n移动破碎站\t360782\n空中花园\t360783\n团粒\t360784\n强攻\
t360785\n冷水机\t360786\n黄紫昌\t360787\n玉案\t360788\n盗抢险\t360789\n长江万里图\t360790\n2554\t360791\n晨曲\t360792\n九通\t360793\n十二星座网\t360794\n可不可信\t360795\n疣状\t360796\nTutors\t360797\n准考证\t360798\n微信公众号文\t360799\n潮河\t360800\n印在\t360801\nMate7\t360802\n贵大\t360803\n真心希望\t360804\n2017年6月23日\t360805\nSeagate\t360806\n奇瑞瑞虎3x\t360807\n住宅容积率\t360808\n寅次郎的故事\t360809\n马昊\t360810\n邦交\t360811\nBranding\t360812\n尾管\t360813\n将爱\t360814\nurine\t360815\n那些花儿\t360816\n北京外企德科人力资源服务上海有限公司\t360817\njinjia\t360818\n搜视网\t360819\n7812\t360820\nCD包音乐网\t360821\n认尸\t360822\npm951\t360823\n毛胚房\t360824\nYMH\t360825\n屯垦\t360826\n南京嘉环科技有限公司\t360827\n白芍总苷胶囊\t360828\n305号\t360829\n翔子\t360830\n畎亩\t360831\nSmartDraw\t360832\n米德\t360833\n建设银行个人网上银行\t360834\n索伦森\t360835\n老版本\t360836\n吉卜力工作室\t360837\n石泉县\t360838\n平陆\t360839\n头孢丙烯分散片\t360840\nobvious\t360841\n王文斌\t360842\n兰索拉唑肠溶片\t360843\n黄晓王\t360844\npropose\t360845\n木贼\t360846\n众怒\t360847\n好孩子集团\t360848\n爱派\t360849\n快乐舞步健身操\t360850\n马克思主义经济学\t360851\n平安京\t360852\n宁波小区\t360853\n招制作\t360854\nv1.7.1\t360855\n最后一\t360856\n补充性\t360857\n斜体\t360858\njjs\t360859\nSynology群晖科技\t360860\n实例篇\t360861\n须乡\t360862\n传染病\t360863\n价钱\t360864\nnote\t360865\nmapv\t360866\n打出\t360867\nrealize\t360868\n沉默者\t360869\n阿里音乐\t360870\ncared\t360871\n心血管病\t360872\n麦秸\t360873\nWigs\t360874\n爱岗敬业演讲稿\t360875\nWindows安全中心\t360876\n旺顺阁\t360877\n空手套\t360878\n探险乐园\t360879\n黑山谷\t360880\n元值\t360881\n电子科技大学清水河校区\t360882\n曹贵人\t360883\n小s\t360884\n500集\t360885\n征讨\t360886\n押题\t360887\n暗鸦\t360888\ndlp\t360889\n微交易\t360890\n子宫腺肌病\t360891\n牛场\t360892\n通信工程学院\t360893\nsometimes\t360894\n轻伤\t360895\n不苟言笑\t360896\n扁平式\t360897\n温斯洛\t360898\n班级\t360899\n全方位\t360900\nhappiness\t360901\ngtk3\t360902\n手足情\t360903\n猎鸟\t360904\nmdk\t360905\nWoodytu\t360906\n绿狗\t360907\n劵\t360908\n大卫戈尔\t360909\n存栏\t360910\n臺灣\t360911\n活氧\t360912\n青山君\t360913\n蟠龙新区\t360914\n天蚕土豆\t360915\n秸杆\t360916\n一览_地下城与勇士\t360917\n满山遍野\t360918\n五氯化磷\t360919\n三星C5\t360920\n罗马广场\t360921\n选秀权\t360922\n收音机\t360923\n1-3月\t360924\n撮镇镇\t360925\n保利国际广场\t3
60926\n公安厅\t360927\n彩宏\t360928\n中国信息通信研究院\t360929\n十拳\t360930\n敏化\t360931\n恒大旅游集团\t360932\n于博\t360933\n6949\t360934\n五年多\t360935\n老妈妈\t360936\n幼发拉底河\t360937\nAnsj\t360938\n金佛\t360939\n光伏电池\t360940\nselection\t360941\nobject\t360942\nhp1025\t360943\n刘永福\t360944\n长清路\t360945\n\\xa0\t360946\n绝不可能\t360947\n5.6.24\t360948\n眼唇卸妆液\t360949\nMACROSS\t360950\n一键式\t360951\n50016\t360952\nWylde\t360953\n圆机\t360954\n英杰传\t360955\n城濮之战\t360956\n气胀\t360957\n外卖店\t360958\n赫尔辛基\t360959\n446655\t360960\n生产工具\t360961\n865个\t360962\n监测仪\t360963\n对症\t360964\n捻度\t360965\n云墅\t360966\n银行信贷部\t360967\n砸死\t360968\n血宠\t360969\n深圳市区\t360970\n云南锗业\t360971\n新榜\t360972\n桃花鱼\t360973\nall\t360974\n人迹罕至\t360975\n背光灯\t360976\n怜星\t360977\n素材\t360978\n医冠\t360979\n荆江\t360980\nshift\t360981\n1070Ti\t360982\n7:00\t360983\n基本面分析\t360984\n天啸\t360985\n双叶\t360986\n博古特\t360987\n曼丹\t360988\n胡里奥\t360989\n北外青少英语\t360990\n欢乐魏蜀吴\t360991\n溪风\t360992\n保质期\t360993\n玖熙\t360994\n化学平衡常数\t360995\n0.82\t360996\n助性\t360997\n过年检\t360998\n9000块\t360999\n巩哥\t361000\n花园式\t361001\n食品展\t361002\n无题\t361003\nKEA128\t361004\n百目鬼\t361005\n龙华\t361006\n诈尸\t361007\n0.5.5\t361008\n街舞pk热血街舞团\t361009\n诺欣妥\t361010\n永远是\t361011\nMirai\t361012\n梵高传\t361013\nautoconf\t361014\n胃火\t361015\n学月\t361016\n怪相\t361017\n望湖\t361018\n结交\t361019\n起亚K3\t361020\nhaodiao\t361021\n河源市源城区\t361022\n斯巴达勇士\t361023\n甲状腺球蛋白\t361024\n海带\t361025\n武汉外国语学校美加分校\t361026\n1.1.1\t361027\n奏章\t361028\n尤克里里ukulele谱\t361029\n胶粘带\t361030\n蒙汗药\t361031\n267号\t361032\n夺子\t361033\nxbyzydz\t361034\n主打曲\t361035\n李小云\t361036\n656\t361037\n3dmaxvr\t361038\n5760\t361039\n狂妄自大\t361040\n大董烤鸭\t361041\nBallet\t361042\n西峰网\t361043\nHPP\t361044\nluo\t361045\n杂谈\t361046\nrecaptcha\t361047\n大京\t361048\n喷射阀\t361049\n玻璃缸\t361050\n律学\t361051\n光脑\t361052\n移动认证\t361053\n汉魏\t361054\n苦胆\t361055\n打预防\t361056\n树莓派zero\t361057\n知了猴\t361058\n天宫院\t361059\n困惑\t361060\n一秒钟\t361061\n蔻蔻\t361062\n装饰片\t361063\n李南央\t361064\n在职研究生网\t361065\n肌内效贴布\t361066\n协\t361067\n倾述\t361068\n三论\t361069\n夺位者\t361070\n象棋盘\t361071\n乳
化沥青\t361072\n推制\t361073\n艺考生\t361074\n盖好\t361075\nShip\t361076\n新浪陕西_新浪网\t361077\n概念车\t361078\n公开\t361079\n极化码\t361080\n萧山科技城\t361081\n衣料\t361082\nalienwar\t361083\n比亚迪s6\t361084\n呢喃\t361085\n1.2吨\t361086\n临别\t361087\n嘻游记\t361088\n王媛媛\t361089\nWRF\t361090\n上期\t361091\n荔枝苑\t361092\n哈吉\t361093\n仁学\t361094\n工会战\t361095\n红其拉甫\t361096\n六安\t361097\n牌友\t361098\n绝世神医之逆天魔妃\t361099\nCONDITION\t361100\n武汉极地海洋世界\t361101\nmdzz\t361102\nJpush\t361103\nlance\t361104\n自治区教育厅\t361105\n武汉装修公司\t361106\n甘南路\t361107\n不适宜\t361108\n苏拉玛\t361109\nDOCKER\t361110\n风露\t361111\n犯我\t361112\nLM2596\t361113\n潍坊晚报\t361114\n大兴区人民医院\t361115\n突突突\t361116\nproxmark3\t361117\n追肥\t361118\n石堆\t361119\n德媒\t361120\n保安队\t361121\nTween\t361122\n10606g\t361123\n鸡仔饼\t361124\n3环\t361125\n吉春\t361126\n王紫\t361127\n23198110\t361128\n88届\t361129\n团乐购\t361130\nLPL2016\t361131\n18966723591\t361132\n一制\t361133\n犯若妻\t361134\n抗日\t361135\nACR\t361136\n30\t361137\n人间词话\t361138\n棕榈岛\t361139\n362号\t361140\n樂\t361141\n火区\t361142\n同发\t361143\n科摩罗\t361144\nXenon\t361145\npsf\t361146\n西南方\t361147\n招贤纳士\t361148\n华孚\t361149\n收集者\t361150\n无顶\t361151\n金华万达广场\t361152\ncolloid\t361153\n羞辱界外魔之死\t361154\n翠色\t361155\n菲律宾大学\t361156\n墓王之王悬棺寺动漫_墓王之王悬棺寺\t361157\n保留源\t361158\n调头\t361159\ndtg\t361160\n大赦\t361161\n小关街道\t361162\n3批\t361163\n日语全集\t361164\n中公教育河南分校\t361165\nDoublelift\t361166\nVISHAY\t361167\nピンクパイナップル\t361168\n圣迪乐村\t361169\n计算\t361170\n麦香\t361171\n3z\t361172\n脱岗\t361173\n韧性\t361174\n情挑\t361175\n河神\t361176\n毛猴\t361177\n秀兰·邓波儿\t361178\n桜木凛\t361179\n筒状\t361180\n重神机潘多拉\t361181\n仓房\t361182\n陈卫\t361183\nAndroidManifest\t361184\nFailing\t361185\n54元\t361186\n本卡\t361187\n踪影\t361188\n同声传译\t361189\n人民空军\t361190\n12308\t361191\n逆天仙魔录\t361192\n舰队collectio\t361193\n环境保护局\t361194\n代码集\t361195\n特德\t361196\n农民阶级\t361197\n鲅鱼水饺\t361198\noms\t361199\nInstalling\t361200\n气动螺丝刀\t361201\nZooxTaboo\t361202\n云南省国家税务局\t361203\n频数分布直方图\t361204\n2017年6月14日\t361205\nCordova\t361206\nenvy13\t361207\n平水镇\t361208\n土鸭\t361209\nregistry\t361210\n联盟\t361211\n混凝剂\t
361212\n武汉理工大\t361213\n1.2寸\t361214\n1步\t361215\n芒果之乡\t361216\n互联网信息服务管理办法\t361217\n按耐\t361218\n哪段\t361219\n非裔\t361220\n南条爱乃\t361221\n31313\t361222\n上河\t361223\n奇瑞A3\t361224\niluminage\t361225\n秦朔\t361226\n养生功\t361227\nmods\t361228\n东方年代记\t361229\n直头\t361230\ncoursework\t361231\nwatchman\t361232\npaula\t361233\nArabian\t361234\n第122期\t361235\n曲沃县\t361236\nJYJ\t361237\n中海神州半岛\t361238\n梁山县\t361239\nJumia\t361240\n海豚\t361241\n千里马招标网\t361242\n瘾头\t361243\nHole\t361244\n海鲜饭\t361245\n争霸艾泽拉斯\t361246\njijian\t361247\n中国地质调查局\t361248\n北京全聚德\t361249\n1一\t361250\n原\t361251\n使用\t361252\n客店\t361253\n会计估计\t361254\n家庭影院\t361255\n排气扇\t361256\n28g\t361257\n第94章\t361258\n谷氨酸\t361259\n蓄脓\t361260\nSelling\t361261\n万达嘉华\t361262\nfury\t361263\n龙都\t361264\n找工作\t361265\n逐字\t361266\nGBF\t361267\n处州\t361268\nGeoGebra\t361269\n显卡风扇\t361270\nTEEN\t361271\n论述\t361272\n卢俊\t361273\n小米手机6\t361274\n老实话\t361275\nc/s\t361276\nIgnite\t361277\n南通港\t361278\n我的天劫女友\t361279\n各乡镇\t361280\n青铜峡\t361281\n实职\t361282\n软壳\t361283\ninformatics\t361284\n中国城市规划学会\t361285\n酷游戏论坛\t361286\n邓诚\t361287\n云台山风景区\t361288\n蜀都万达广场\t361289\npbc\t361290\nshrink\t361291\n童梦无\t361292\n应援棒\t361293\n译林\t361294\n学习\t361295\ncoincidence\t361296\n三消\t361297\n电源开关\t361298\nIRST\t361299\n渭城区\t361300\nbehavior\t361301\n450米\t361302\n36m\t361303\n深圳地铁8号线\t361304\n诤\t361305\n杀神指染成婚\t361306\n牧尘\t361307\n防雷箱\t361308\n中国自贸区\t361309\nweixing\t361310\n2.95\t361311\nwia\t361312\n七论\t361313\n1.005\t361314\n瀑布式\t361315\n新冠\t361316\n原纸\t361317\n亲赴\t361318\n衢州二中\t361319\n冕宁县\t361320\n3D定制女仆2\t361321\n侮\t361322\n6.3下\t361323\n日成\t361324\nlerp\t361325\n雅润\t361326\n4399手机游戏网\t361327\ndhtmlx\t361328\n任务品\t361329\n信息科学与技术学院\t361330\nCapacitor\t361331\n超广角镜头\t361332\n韩立\t361333\n曹格\t361334\nbode\t361335\n慧律法师\t361336\nざ\t361337\n川投集团\t361338\n一脚踢\t361339\n偷亲\t361340\n成都市国家税务局\t361341\n2018.4.23\t361342\n精神病人\t361343\n肆拾玖坊\t361344\n301医院\t361345\n堆存费\t361346\n金属制品业\t361347\nady映画网\t361348\n名曰\t361349\n做题\t361350\n秦森\t361351\n伯\t361352\n水墨画\t361353\n第49\t36
1354\n葛藤\t361355\n1756\t361356\n前2小时\t361357\n山房\t361358\n密度泛函\t361359\n辐射4学院\t361360\nf556\t361361\n梅耶\t361362\n周思成\t361363\n黄锐\t361364\n表现主义\t361365\n3.70\t361366\n大众高尔夫GTI\t361367\n全员安全生产责任制\t361368\n煤粉炉\t361369\nPsych\t361370\nnologging\t361371\n应由\t361372\n空白纸\t361373\n胞嘧啶\t361374\n李春生\t361375\n英冠\t361376\n水浒卡\t361377\n凤凰小说网\t361378\n朱虎\t361379\n被黑\t361380\n极品飞车9\t361381\n百度地图接口\t361382\n尔虞我诈\t361383\n成都婚纱摄影工作室\t361384\n泷\t361385\n山贼\t361386\n骨伽\t361387\n逆温\t361388\n右胸\t361389\n沧州市\t361390\n相干性\t361391\n中国大连_大连市政府\t361392\n0323\t361393\n兰博基尼Huracan\t361394\nswallow\t361395\n预算书\t361396\n北京金融资产交易所\t361397\n7470d\t361398\n荣品\t361399\n手提箱\t361400\nJunit4\t361401\nTowels\t361402\n客梯\t361403\n论迹\t361404\n女教师\t361405\n姜汉娜\t361406\n苏州市公安局交通警察支队车辆管理所\t361407\n峰度系数\t361408\n二氧化钛\t361409\n她走\t361410\n溺亡\t361411\n丁香婷婷\t361412\n龙凤山\t361413\n油纸\t361414\n瑞讯银行\t361415\n_食疗养生_养生之道网\t361416\n2016至2017年\t361417\nhzx\t361418\n复地\t361419\n诗图\t361420\n华汇\t361421\n第一框\t361422\n食槽\t361423\nDocuments\t361424\n龙力生物\t361425\nDUNK\t361426\n嘉庆皇帝\t361427\n山东省史志办\t361428\n多不饱和脂肪酸\t361429\n肯尼迪国际机场\t361430\n源通\t361431\n徐坤\t361432\n总务\t361433\n汇隆\t361434\n疆\t361435\nPurchase\t361436\n记忆表\t361437\n找平\t361438\n极限挑战4\t361439\n养亲\t361440\npdflib\t361441\n球虫\t361442\n菲律宾航空\t361443\n12x\t361444\n9分裤\t361445\n水君\t361446\n农村媳\t361447\nC++函数\t361448\n危险废弃物\t361449\n董家鸿\t361450\n重地\t361451\n打裂\t361452\n安科生物\t361453\n共同语言\t361454\n航拍\t361455\n郭爽\t361456\n皮亚杰\t361457\ncatheter\t361458\n打扰\t361459\n海葡萄\t361460\n全顺电机\t361461\n自尊\t361462\nmicaps\t361463\n奥氏\t361464\n纷争\t361465\n超声波美容仪\t361466\n毛毛\t361467\n主线\t361468\n哭灵\t361469\n交流\t361470\n329号\t361471\n超清在线云\t361472\n102平米\t361473\ncffi\t361474\n负极\t361475\nINTERNATIONAL\t361476\n萨迦\t361477\n构造者\t361478\n笑傲帮\t361479\n东方新城\t361480\n针灸推拿学\t361481\n皖南\t361482\n2}\t361483\n8分米\t361484\n潍柴动力股份有限公司\t361485\n宝马医武兵王\t361486\n捷登\t361487\n192.168.0.1_\t361488\n4月1号\t361489\n光学变焦\t361490\n纠结\t361491\n弯桥\t361492\n鱼雕\t361493\nZ7mini\t361494\n西梁女国\t361495\n南京美辰微电子有限公司\t
361496\n扭王\t361497\n海滨消消乐\t361498\n减肥茶\t361499\n殷弘\t361500\n鸣鸟\t361501\nfc2ppv\t361502\n奥多\t361503\nxtx\t361504\n官品\t361505\nOFweek传感器网\t361506\n婚迁\t361507\n重庆男科医院\t361508\n花轿\t361509\n有骨气\t361510\n飞机盒\t361511\n抗凝\t361512\n我能行\t361513\n胡来\t361514\nWholesaler\t361515\n_书业网\t361516\n礐石风景区\t361517\nx光片\t361518\n星系团\t361519\ncijilu在线视频\t361520\n栏目页\t361521\n首页\t361522\n马齿苋\t361523\n投行\t361524\n重庆金融资产交易所\t361525\n宗教史\t361526\n7272\t361527\n现货市场\t361528\n合作方\t361529\n繁难\t361530\nBodypaint\t361531\n余裕\t361532\n输入源\t361533\n美通\t361534\n第115集\t361535\n9.25\t361536\n存\t361537\nAcfun\t361538\n轻色\t361539\n少花\t361540\nwangeditor\t361541\n2届\t361542\n二五仔\t361543\ndaniel\t361544\ns35\t361545\n动弹不得\t361546\n长城基金\t361547\nFISH\t361548\n世嘉论坛_汽车之家论坛\t361549\n我心永恒\t361550\n泗州戏\t361551\nAGENCY\t361552\n桂花路\t361553\n喷饭网\t361554\n蒙大拿州\t361555\n任重道远\t361556\n全厂\t361557\n大谭镇\t361558\n任命书\t361559\n0.21\t361560\n5.4青年节\t361561\nPaintings\t361562\n另选\t361563\n美国往事\t361564\n上海公积金管理中心\t361565\n全创\t361566\nawm\t361567\n字版\t361568\n用房\t361569\n数字频率计\t361570\n气象报告\t361571\n牲口\t361572\n七部曲\t361573\n福能股份\t361574\n出门\t361575\n游山西村\t361576\n奈奈子\t361577\n华为Mate9\t361578\n嘻游\t361579\n复叠式\t361580\nT410\t361581\n染\t361582\n河北广电\t361583\n铁尺\t361584\n步进式\t361585\n51job\t361586\n90路\t361587\nA17\t361588\n欧帕斯\t361589\n斯蒂文\t361590\n国龙\t361591\n李晨惠若琪\t361592\nbase64_\t361593\n20161214\t361594\nRestaurant\t361595\n南昌市民政局\t361596\nED200\t361597\nFiller\t361598\n100岁\t361599\nbuiltin\t361600\n画展\t361601\ntopping\t361602\n全食\t361603\n2017年3月31日\t361604\n皇贵妃\t361605\n中田春平\t361606\n杨国荣\t361607\n1元\t361608\n聚贤庄\t361609\n雅\t361610\n投其所好\t361611\n雅痞\t361612\n红宝丽\t361613\n深圳市银雁金融服务有限公司\t361614\n卡仕达酱\t361615\n耽\t361616\nModular\t361617\n14只\t361618\n单件\t361619\n三战\t361620\n海量\t361621\n起垄机\t361622\n琳琅秀\t361623\nv2.05\t361624\n非洲大草原\t361625\n快餐式\t361626\nEnhancing\t361627\n几m\t361628\n提亚马特\t361629\n泰剧天生一对\t361630\n将军山\t361631\n龙舟\t361632\n撕脱\t361633\n第63章\t361634\n森林里\t361635\n条案\t361636\n亚\t361637\n闹铃\t361638\n001_\t361639\
n农合联\t361640\n零界\t361641\n550d\t361642\n湖南工业职业技术学院\t361643\n美食节\t361644\n深圳市市场监督管理局\t361645\n党务校务公开网\t361646\nPack\t361647\nWebrtc\t361648\nV8.8\t361649\n加利福尼亚大学伯克利分校\t361650\n独显方法\t361651\n中国电子\t361652\n大气科学学院\t361653\n印军\t361654\n0013\t361655\n藏经洞\t361656\n千余名\t361657\n双色球\t361658\n詹韦\t361659\n林若亚\t361660\n上颌窦\t361661\n2第五章\t361662\n我感\t361663\n2014年底\t361664\n空载\t361665\ngrammer\t361666\n古茗奶茶\t361667\n防尘\t361668\n火箭筒\t361669\n童雪\t361670\n中国船人联盟\t361671\n摇中\t361672\n深沟\t361673\n莫离\t361674\n微讲堂\t361675\n澳尔滨\t361676\n防爆合格证\t361677\n3D溜溜网\t361678\n广州南\t361679\n集体所有制企业\t361680\n单号网\t361681\nflu\t361682\nPopover\t361683\n折纸王子\t361684\n拨付\t361685\n3.5万\t361686\n有果\t361687\n钢体\t361688\n黑布\t361689\n2km\t361690\n狐金\t361691\nGephi\t361692\n国有化\t361693\n泥饼\t361694\nformData\t361695\njsc\t361696\n邦博尔卫校\t361697\nREITs\t361698\nzh-CN\t361699\n东方白鹳\t361700\n保温被\t361701\n鲍家街\t361702\n硬块\t361703\n娜迦王\t361704\n粉字\t361705\n反收购\t361706\n1000v\t361707\n多维子\t361708\n三月三节\t361709\n广州大桥\t361710\nrevenge\t361711\n鹤鸣\t361712\n爆裂鼓手\t361713\n15辆\t361714\nInsight\t361715\n罗格朗\t361716\nEmbedding\t361717\ntds\t361718\n求人办事\t361719\n瑞蚨祥\t361720\n山东网络台\t361721\nPPM\t361722\n北京三菱\t361723\ninterbase\t361724\n韦帅望\t361725\nIntelliSense\t361726\n蝴蝶泉\t361727\n开断\t361728\n经管院\t361729\n53平米\t361730\n李乙\t361731\n新立镇\t361732\nwww.2v2.com.cn\t361733\n酶标板\t361734\n绝对值不等式\t361735\n笨贼\t361736\n哈尔滨市人民政府\t361737\n权娜拉\t361738\n高中数学必修五\t361739\n有情人终成眷属\t361740\n速腾沃尔沃s60\t361741\n排烟井\t361742\nrocksdb\t361743\n樱落\t361744\nforall\t361745\nZRONG\t361746\n彼\t361747\n钴业\t361748\n101010\t361749\n王春生\t361750\nmanytomany\t361751\n东宁市\t361752\n蜜枕甜妻\t361753\n律政\t361754\ngame\t361755\n随后\t361756\nWAI\t361757\n鑫合汇\t361758\n柿子饼\t361759\n冷泡茶\t361760\nOrbit\t361761\n51Testing\t361762\n库拉\t361763\n专利性\t361764\n朱碧石\t361765\n术前\t361766\n住改商\t361767\n甘愿\t361768\n狄利克雷\t361769\n无休\t361770\n榴弹发射器\t361771\n1653\t361772\nbuffer\t361773\n1.8m\t361774\n企博网\t361775\n川农\t361776\n我的歌\t361777\n中国兵器工业集团公司\t361778\n樱花社\t361779\n标签页\t361780\n宿便\t3
61781\nav\t361782\n宁园\t361783\nCartier\t361784\n别忘\t361785\n幻云\t361786\n建模师\t361787\n宝鸡人才网\t361788\n集线\t361789\n脊髓炎\t361790\n排烟管\t361791\nZZ91再生网\t361792\nChi\t361793\n藏头\t361794\n枣庄市规划局\t361795\n孟珙\t361796\n斯米兰\t361797\n空气污染\t361798\n弥渡\t361799\n第15话\t361800\n粪车\t361801\nDIALux\t361802\n民族舞蹈\t361803\nclarity\t361804\n新坝\t361805\n两栖类\t361806\n3D卡通\t361807\n二环\t361808\n酒版\t361809\n74亿\t361810\n美竹\t361811\n残差平方\t361812\nabb式\t361813\n新浪湖南_新浪网\t361814\n华东理工大学图书馆\t361815\n博研\t361816\n完颜阿骨\t361817\n招标投标法\t361818\n两件事\t361819\n直流斩波电路\t361820\n阳极板\t361821\n变速机\t361822\n热枕\t361823\nKT板\t361824\n200多次\t361825\n曹雪芹\t361826\n幸福安康\t361827\n可尔必思\t361828\n抓钱\t361829\n南京德云社\t361830\n紫金农商行\t361831\nLens\t361832\n长春市人力资源和社会保障局\t361833\n20151126\t361834\n安神\t361835\n建号\t361836\n东风夜放花千树\t361837\nq群\t361838\n62项\t361839\n工位级\t361840\n匿名内部类\t361841\n铜字\t361842\n侦察员\t361843\n广播站\t361844\n安居苑\t361845\n补液\t361846\n暗夜精灵\t361847\nIDR\t361848\n钢包\t361849\n制陶\t361850\n大雅之堂\t361851\n气相色谱分析\t361852\n1.85\t361853\n甘之如饴\t361854\n北京中国银行\t361855\n南通市环保局\t361856\n成都火车南站\t361857\nleaving\t361858\n丝管\t361859\nt^2\t361860\n内点法\t361861\n文界\t361862\n地压\t361863\n党支部委员会\t361864\n2018年04月\t361865\n易玩网\t361866\n黎香湖\t361867\n4887\t361868\n秋裤\t361869\n整形手术\t361870\n{}\t361871\n花花姑娘\t361872\nshines\t361873\nden\t361874\nwimax\t361875\n给面\t361876\n达克宁栓\t361877\n封狼居胥\t361878\n模拟量\t361879\n反言\t361880\n硝酸钴\t361881\n匿名\t361882\n易商\t361883\n教研室\t361884\n老唐\t361885\nDelegation\t361886\n漂洗\t361887\nE550\t361888\nmg3080\t361889\nThermo\t361890\n水渠\t361891\n仰卧位\t361892\n依米艳\t361893\n急用钱\t361894\n外商投资产业指导目录\t361895\n外企\t361896\namplitude\t361897\n小扎\t361898\n一定要骚\t361899\n陀枪师姐\t361900\n协通\t361901\n细数\t361902\n企微云\t361903\n颠覆性\t361904\n5.6.4\t361905\nphotoshopcs3\t361906\nISNULL\t361907\n最好不过\t361908\nEssence\t361909\nStupid\t361910\n5499\t361911\n宁远县\t361912\nsendRedirect\t361913\n右肩\t361914\n桂林山水\t361915\n五显镇\t361916\nAJ13\t361917\n2017-04-22\t361918\nlibmad\t361919\nPDF.js\t361920\n小品文\t361921\n白鲢鱼\t361922\n阿涵\t361923\nb
oe\t361924\n桂山\t361925\ndamp\t361926\n雌三醇\t361927\n律师界\t361928\n两化融合管理体系贯标\t361929\n丝杆升降机\t361930\n四五级\t361931\n下段\t361932\n太康路\t361933\n偏心蝶阀\t361934\n第六十章\t361935\nechats\t361936\n2017双\t361937\n两千多年\t361938\n腾讯\t361939\n塔沟\t361940\n红鹦鹉\t361941\n陶瓶\t361942\n祥生群贤府\t361943\nTerrible\t361944\n管胎\t361945\n赤水\t361946\n圣斗士星矢\t361947\nmsmq\t361948\n睡眠呼吸暂停综合征\t361949\n闲说\t361950\n赏金猎人\t361951\nDreamer\t361952\n政府办公室\t361953\nxx\t361954\nsqlServer\t361955\nedward\t361956\nMEX\t361957\n三万英尺\t361958\n威尔特\t361959\n地瓜叶\t361960\njenna\t361961\n车赛\t361962\n宝马车\t361963\narm\t361964\nhunk\t361965\n绿之韵\t361966\njerome\t361967\n来宾市\t361968\n武宁南路\t361969\n班队\t361970\n百花村\t361971\n赞词\t361972\n大学院\t361973\nSealed\t361974\n0564\t361975\n字牌\t361976\n艾炙\t361977\n国青\t361978\n埃文\t361979\nepub阅读器\t361980\n脊兽\t361981\n太阳底下\t361982\n汤臣倍健蛋白粉\t361983\nWondershare\t361984\n正红\t361985\n淡紫色\t361986\nduplication\t361987\n车智\t361988\n内侧\t361989\n人工受精\t361990\n母神\t361991\n博君\t361992\n结算\t361993\n认识了\t361994\n铬丝\t361995\nProtective\t361996\n创世啪啪三国\t361997\n南大学\t361998\n舍夫沙万\t361999\n反应性\t362000\n台商\t362001\n保温管\t362002\n崖头\t362003\n蒙城\t362004\n城市一卡通\t362005\n轻者\t362006\n九华镇\t362007\n胶层\t362008\n小蒙\t362009\n2800多亿元\t362010\n印顺法师\t362011\n换妻俱乐部\t362012\npropertygrid\t362013\n普利茅斯大学\t362014\nW7\t362015\n跃起\t362016\n小康网\t362017\n铁线莲\t362018\nSplatoon2\t362019\n26个月\t362020\n橘里橘气\t362021\n随存\t362022\n花贼\t362023\n蟹黄\t362024\n王琛\t362025\n和睦\t362026\n省部级\t362027\n馈电\t362028\n第138期\t362029\n1375\t362030\neval函数\t362031\n年运\t362032\n科学版\t362033\n安第斯\t362034\n20171113\t362035\n小牛队\t362036\n星晨\t362037\n椿树\t362038\n压倒性\t362039\n种植基地\t362040\n威虫\t362041\n京翰中考网\t362042\n孤岛危机1\t362043\n陈熙\t362044\n养生仪\t362045\n免疫调节\t362046\n百思\t362047\n6折\t362048\n新天然气\t362049\n大草原\t362050\n重庆代表团\t362051\n4号位\t362052\n奥北\t362053\n知行\t362054\nXPG\t362055\n浙江省农村信用社联合社\t362056\n同路\t362057\n广州地产\t362058\nPERFORMANCE\t362059\n齐长城\t362060\nPug\t362061\n2户\t362062\n上海汽车集团财务有限责任公司\t362063\n邹城房产网\t362064\n黑吧安全网\t362065\n导热性\t362066\n纽瓦克机场\t3620
67\n萍\t362068\n监理工程师考试\t362069\n乙肝表面抗原阳性\t362070\n大自然保护协会\t362071\n慕雪\t362072\n168炒股学习网\t362073\n相背\t362074\n西安市纪委\t362075\nreplays\t362076\nTGS\t362077\n蛋清\t362078\nkb4093118\t362079\nSniper\t362080\n20180227\t362081\n哑光漆\t362082\n综合计算工时制\t362083\nmaiden\t362084\n四天三夜\t362085\n笑笑西卡\t362086\n覚\t362087\n怡景\t362088\nAngus\t362089\n安徽农网\t362090\nclauses\t362091\n酉月\t362092\n天环\t362093\n玻璃纤维管\t362094\n纪元1404\t362095\n相左\t362096\n收兵\t362097\nphp类\t362098\n安顺\t362099\n海头\t362100\n金边\t362101\nupk\t362102\n常识库\t362103\n获得\t362104\n不稳\t362105\n延禧\t362106\n就诊率\t362107\n陆河县\t362108\nferrari\t362109\n野味\t362110\n污污污\t362111\n汉哀帝\t362112\n胖哥\t362113\nonclik\t362114\n华强北商城\t362115\n百度商桥\t362116\nole\t362117\n骑马机\t362118\n2两个\t362119\n陆东福\t362120\n林喵喵\t362121\n2.0.0.1\t362122\n三国战纪之风云再起\t362123\n藤原龙\t362124\n11|\t362125\n惨白\t362126\nEla\t362127\n巨丑\t362128\n麒麟山庄\t362129\n999_\t362130\n创始\t362131\n紙\t362132\nmsq\t362133\n六千元\t362134\n灵符\t362135\n堀江由衣\t362136\n碗篮\t362137\n北京大学考古文博学院\t362138\n合肥晚报\t362139\n晶城秀府\t362140\n金霉素\t362141\n况且\t362142\n虹虹\t362143\n老_张\t362144\n部落冲突夜世界\t362145\nK8\t362146\n注解式\t362147\n平布\t362148\n两弹元勋\t362149\n三百多万\t362150\n509\t362151\n孟奇\t362152\nparsefloat\t362153\n架体\t362154\n民福康\t362155\n深圳华强集团有限公司\t362156\n多乐士漆\t362157\n6pm\t362158\n2015.11\t362159\n姜栋元\t362160\n沈文\t362161\n张腾岳\t362162\n西安市政府\t362163\nBryan\t362164\nSyndrome\t362165\n凤凰山陵园\t362166\n八声甘州\t362167\n画家\t362168\n春画\t362169\n苏建价\t362170\n提升机\t362171\n西游风云\t362172\n博鳌会议\t362173\n柞水县人民政府\t362174\n安居小区\t362175\n手提\t362176\n家居类\t362177\n蔡文钰\t362178\n遇刺\t362179\n味极\t362180\n10线\t362181\n俾斯麦号战列舰\t362182\n雅思5\t362183\n忆阻器\t362184\n一长一短\t362185\n9.1%\t362186\nsister\t362187\n四选\t362188\n傣族泼水节\t362189\n数阵\t362190\n郭德俞灏明\t362191\n迈迪工具集\t362192\n胡家庙\t362193\ninformed\t362194\n2018年3月1日起\t362195\n1587\t362196\nvalse\t362197\n南端\t362198\nAlt+Tab\t362199\nmyD\t362200\n塞纳河\t362201\n无cd\t362202\n小兵张嘎\t362203\n一兰拉面\t362204\n肝风\t362205\n装模作样\t362206\nPronunciation\t362207\n草铺\t362208\nui框架\t362209\n6649\t3622
10\n于成龙\t362211\n猎豹汽车\t362212\n0918\t362213\ngres\t362214\n浮夸\t362215\nEJB\t362216\n痒刑\t362217\n揽胜\t362218\n火包\t362219\noffline\t362220\niredmail\t362221\n7月18日\t362222\n流村镇\t362223\n刃甲\t362224\n九顶塔\t362225\n预装\t362226\n龟背竹\t362227\n发射器\t362228\n顾嘉辉\t362229\n五年\t362230\n下克\t362231\n落单\t362232\n郑州海关\t362233\n吕中\t362234\n三贼网\t362235\n论坛体\t362236\n施斌\t362237\nex2\t362238\n程宁\t362239\n两岁半\t362240\n天奇股份\t362241\n秘密特工\t362242\n紧随\t362243\n朱自布拉德皮特\t362244\n不领情\t362245\n吾国与吾民\t362246\n电视盒子\t362247\n内在\t362248\n雍正十三年\t362249\n略写\t362250\n复兴路\t362251\n小蚊\t362252\n高效液相色谱法\t362253\n名窑\t362254\nJenaral\t362255\nFixya\t362256\n劣后\t362257\n民卡\t362258\nowner\t362259\n格格广场舞\t362260\n5轮\t362261\ngsg\t362262\n刺客信条:大革命\t362263\n宝宝\t362264\n打坐\t362265\n席勒\t362266\n佳兆业金域天下\t362267\n爱的力量\t362268\n威士\t362269\n下拉菜\t362270\n西班牙语在线翻译\t362271\n未来集团\t362272\nSüd\t362273\n等待期\t362274\nWEB应用程序\t362275\n三塘\t362276\n总参谋长\t362277\nsqlcode\t362278\n三台山森林公园\t362279\n东瘟疫之地\t362280\nami\t362281\n石粉\t362282\nnan\t362283\n无双大蛇z\t362284\n吻别\t362285\n巴鲁特\t362286\n宝冢歌剧团\t362287\n乡镇政府\t362288\n莫文\t362289\n杜辉\t362290\n木棉湾\t362291\n蓝鹰\t362292\n御所\t362293\ncrossed\t362294\n内瘘\t362295\n瞭望\t362296\n书法师\t362297\nDOTA2.UUU\t362298\nLiteral\t362299\nDovey\t362300\n五福金牛\t362301\n消除\t362302\n关务\t362303\n沥林镇\t362304\n周清华\t362305\nexcepted\t362306\nelectrical\t362307\n单挑赛\t362308\n恒以致远\t362309\n矩形类\t362310\niPhonex\t362311\n法经\t362312\nemgu\t362313\n贝伦\t362314\n云霞\t362315\n暴走看书\t362316\n嫖宿\t362317\n73号\t362318\n游戏界\t362319\nendl\t362320\n汇兑差额\t362321\n我行\t362322\n河北出入境检验检疫局\t362323\nStorwize\t362324\nkofxiii\t362325\n企业路由器\t362326\n中关新园\t362327\n阿乙\t362328\n富途\t362329\n2016多\t362330\n康定\t362331\n色带架\t362332\n年岁\t362333\n施林克\t362334\n庙街\t362335\nbrunomars\t362336\n8540w\t362337\n苯胺\t362338\n哈纳斯\t362339\n泰和安\t362340\n数据项\t362341\n调子\t362342\n垂帘听政\t362343\nA59m\t362344\n拉盘\t362345\n颖宝\t362346\nWIA\t362347\nyimi\t362348\n赴汤蹈火\t362349\n文轩\t362350\nMiiTao\t362351\n地鸡\t362352\n水莲花\t362353\n冯宝宝\t362354\n票品\t362355\n广州机场路\t362356
\n启辰t70\t362357\n明码标价\t362358\n灯丝\t362359\nCss\t362360\n足光散\t362361\nsort、sorted\t362362\n空行\t362363\n展销会\t362364\n水平考试_希赛网\t362365\n叫床骚麦\t362366\n朱冰\t362367\nquanxian\t362368\n佛山市住房和城乡建设管理局\t362369\n_图吧地铁\t362370\n阳线\t362371\nAsp\t362372\n南汽车站\t362373\n李小勇\t362374\n墙面漆\t362375\n会好不好\t362376\n装饰柜\t362377\nivo\t362378\n王思宇\t362379\n船中\t362380\n油杯\t362381\n华府\t362382\necllipse\t362383\nopenlayer\t362384\nIrving\t362385\n陕西青年职业学院\t362386\n公软\t362387\n企业管\t362388\n本位\t362389\n张建军\t362390\nvisas\t362391\n泾阳\t362392\nlittlstar\t362393\n药物基因组学\t362394\n氯硝西泮\t362395\n变频供水设备\t362396\n同济大学建筑设计研究院\t362397\nr410\t362398\n高兰\t362399\n白浪滩\t362400\n二册\t362401\n帝国时代4\t362402\n双墩\t362403\n高婕\t362404\n小美人\t362405\n42_\t362406\nproce\t362407\n品牌故事\t362408\n信赖度\t362409\n膨胀式\t362410\n多少\t362411\n有根\t362412\n碱土\t362413\n齐翔\t362414\n怀念战友\t362415\n秦娥\t362416\n精通\t362417\nconic\t362418\n100副\t362419\n亮润\t362420\n代煤\t362421\n豫州\t362422\n倦怠\t362423\n天桥\t362424\n夏长\t362425\n布线\t362426\n唇形\t362427\n封测\t362428\nVerySource\t362429\nSHAN\t362430\n凤凰池\t362431\n0.00%\t362432\n亲子营\t362433\n北壁\t362434\n白金卡\t362435\nVR网\t362436\n26周\t362437\nadmitted\t362438\n清景\t362439\nmbps\t362440\n528\t362441\n挑戰\t362442\n钱荒\t362443\n反坡\t362444\n881\t362445\n诬告陷害罪\t362446\nInspire\t362447\n确\t362448\n才者\t362449\nBLADE\t362450\n薇婷脱毛膏\t362451\n桂苑\t362452\nentts\t362453\nMira\t362454\n十字槽盘头螺钉\t362455\n孟波\t362456\n今年4月份\t362457\n切带机\t362458\n红米4a\t362459\nQ淘网\t362460\n北九州\t362461\n求生之路2\t362462\n闪光感\t362463\n2008-2012年\t362464\n震天\t362465\n一幅画\t362466\n华卫\t362467\n腾讯对战平台\t362468\ndsj\t362469\n胜治\t362470\n特色化\t362471\n薛来\t362472\n游戏社\t362473\n甘肃农业大学\t362474\n新网\t362475\n金润雅\t362476\n咬会\t362477\n眉山政府网\t362478\n亚历珊德拉·达达里奥\t362479\n基督网\t362480\n宏新\t362481\n202z\t362482\n中山七路\t362483\n文产\t362484\nnodeJs\t362485\n乘法运算定律\t362486\n休斯敦火箭队\t362487\n小磊\t362488\n名不副实\t362489\n坝上\t362490\n热车\t362491\n五个故事\t362492\nアトリエ\t362493\n米歇尔\t362494\n偶像剧场\t362495\n银之杰\t362496\n1000毫升\t362497\n友谊街\t362498\n雍禾\t362499\n摔伤\t362500\n根河市\t362501\
nU10\t362502\n上港\t362503\n上海师范大学天华学院\t362504\n王雪涛\t362505\n刘海瑞\t362506\nCigar\t362507\n东平新城\t362508\n全脂奶粉\t362509\ngrain\t362510\n图像素\t362511\nescalation\t362512\n组合图\t362513\n金圆股份\t362514\n凭条\t362515\n手歌\t362516\n芦山\t362517\n六开彩开奖\t362518\n2015年11月份\t362519\n后娘\t362520\n微投\t362521\n多模块\t362522\n热血传奇单机版\t362523\n郭红\t362524\n19\t362525\n造梦西游ol\t362526\n升麻\t362527\n陈诚\t362528\nMDT\t362529\n报价员\t362530\n五朵\t362531\nsinoalice\t362532\n大连理工\t362533\n神叨\t362534\n龙门神途\t362535\n小助手\t362536\n撒肥机\t362537\ncun\t362538\n无数种\t362539\n中阿博览会\t362540\n攀枝花芒果\t362541\nIBIS\t362542\n中华名吃学院\t362543\n歌声\t362544\n宁波银行直销银行\t362545\n辽朝\t362546\n工程造价信息网\t362547\n欲仙\t362548\n毕尔巴鄂\t362549\nButterknife\t362550\n秋色\t362551\n热缩管\t362552\n张晓兵\t362553\n恒功率\t362554\nTaocode\t362555\n中国百强中学\t362556\n李卫东\t362557\n_月沙网\t362558\nJ1900\t362559\n武昌街\t362560\n林明祯\t362561\n2017年3月27日\t362562\n32GB\t362563\n云汇\t362564\n刺激战\t362565\n金鼎广场\t362566\n佳能550D\t362567\n张伯李大霄\t362568\n赵蕊蕊\t362569\n八周年\t362570\n食管裂孔疝\t362571\nando\t362572\n金属破碎机\t362573\n持戒\t362574\npapersnake\t362575\n长风网\t362576\n类地\t362577\nVic\t362578\n三千丈\t362579\nvimeo\t362580\n京港澳\t362581\n花儿与少年第三季\t362582\n两顿\t362583\n安波\t362584\n三月份\t362585\n三婶\t362586\n九州岛\t362587\n更完善\t362588\n几何原本\t362589\n烙铁头\t362590\n医事\t362591\n水杨醛\t362592\n张萍\t362593\n北京西南\t362594\n宝马M2\t362595\n穿串\t362596\n立花沙耶\t362597\n追思\t362598\n两千多\t362599\n竹苑\t362600\n黑龙江省公安厅\t362601\n上古卷轴重置版\t362602\n光泽县\t362603\n钟琴\t362604\n_镇街\t362605\n乌龙特工\t362606\n英雄无敌战争纪元\t362607\ngtv\t362608\n金器\t362609\ngeek\t362610\n1651\t362611\n横江\t362612\nC17\t362613\n转达\t362614\nDDWRT\t362615\n广发商城\t362616\n支前\t362617\n易信\t362618\n蛟川街道\t362619\n智数\t362620\n500倍\t362621\n福贵\t362622\n上海公园\t362623\n低产\t362624\n禾本科\t362625\n浅夏丶未央\t362626\nCAD模板-千图网\t362627\n芽庄\t362628\n2017-3-3\t362629\n为人知\t362630\nav影音先锋\t362631\n挂车\t362632\nshowmessage\t362633\nlanzou\t362634\n英布\t362635\n蒙尘\t362636\n公映版\t362637\n瞎\t362638\n万达百货\t362639\n8e\t362640\n巴登巴登\t362641\nISO9001\t362642\n传发\t362643\n1400亿\t362644\nRenee\t362
645\nWebAPi\t362646\n3年\t362647\ntar、bz2\t362648\n包河区\t362649\n迪卡普里奥\t362650\n中国地质科学院\t362651\n墙角\t362652\nBroadway\t362653\n1.5.2\t362654\n逐龙\t362655\n二表\t362656\n砰然\t362657\n胡坚\t362658\nSpreadsheet\t362659\n平平\t362660\n首付贷\t362661\nconservation\t362662\n制\t362663\n一枝花\t362664\n墙身\t362665\n离婚前\t362666\n复食\t362667\n澳康达\t362668\nhwt\t362669\n智慧岛\t362670\n廉洁教育\t362671\n第4级\t362672\n车速\t362673\ntpye\t362674\nlcu\t362675\n前田敦子\t362676\n传译\t362677\n晚8点\t362678\n银花泌炎灵片\t362679\n南昌市卫生局\t362680\n三记\t362681\n郑州联通\t362682\ni9300\t362683\n龙屋\t362684\nIran\t362685\nRomantic\t362686\n丝板\t362687\nVisions\t362688\n行高\t362689\n余杭19楼\t362690\n碎星决\t362691\n轰趴馆\t362692\n涮书网\t362693\n魔虫\t362694\nAE白金卡\t362695\n上海人寿\t362696\n67天\t362697\n灌注\t362698\nGPM\t362699\n苦寒\t362700\n荣耀娜\t362701\nLinuxeden\t362702\n百度移动\t362703\n一趟\t362704\nembedding\t362705\n道板\t362706\nHIGHLIGHT\t362707\n优炫\t362708\n猪宝\t362709\nDryer\t362710\n中华人民共和国反不正当竞争法\t362711\n国创\t362712\n蘑菇神社\t362713\n盘用\t362714\n嘟嘟嘴\t362715\n上海市同仁医院\t362716\n码段\t362717\n英雄无敌3:死亡阴影\t362718\nsteelseries\t362719\n40千瓦\t362720\nscorm\t362721\n永山裕子\t362722\n100&\t362723\n广州农业银行\t362724\n干柿鬼鲛\t362725\n车贷\t362726\n工具链\t362727\n墨块\t362728\n郭靖\t362729\n操作员\t362730\n市食品药品监管局\t362731\n结束语\t362732\n靳言\t362733\n五十度黑\t362734\nPattern\t362735\n素性\t362736\n美国管理协会\t362737\nRealflow\t362738\n96届\t362739\n顾地\t362740\nT3A航站楼\t362741\n炎夏\t362742\ntechmark\t362743\njiyou\t362744\nxzp\t362745\nEPT\t362746\n3000辆\t362747\n63\t362748\nAudigy\t362749\n3900\t362750\n心学\t362751\n奎山\t362752\n百度云群\t362753\n尼托克丽丝\t362754\n原创\t362755\n上海汉得信息技术股份有限公司\t362756\n量子效应\t362757\n整备质量\t362758\n妖术\t362759\n媚惑\t362760\n追逐赛\t362761\n大彻大悟\t362762\n上段\t362763\n雅昌\t362764\n邻苯二甲醛\t362765\n气体减压阀\t362766\n直播港澳台\t362767\nPoPDG\t362768\n伊玛目\t362769\n纵横捭阖\t362770\n英语专八\t362771\n吴飞虎\t362772\n童学\t362773\nEMBA招生信息网\t362774\nHereWifes\t362775\n蛮兔\t362776\n私募股权投资\t362777\n劲松\t362778\n初级会计学\t362779\n恼羞成怒\t362780\n黄布\t362781\n错嫁\t362782\nTriggers\t362783\n瑞穗银行\t362784\n环大亚湾\t362785\n天兔\t362786\
nMAPGIS—地信网\t362787\nXP版\t362788\n矮小症\t362789\n兴森\t362790\n100招\t362791\n烘焙控\t362792\nspoofing\t362793\n对轮\t362794\n新松机器人自动化股份有限公司\t362795\nTha\t362796\n申报书\t362797\n共和国之辉2\t362798\n注册税务师考试\t362799\n虹桥中学\t362800\nnova2S\t362801\n娲皇宫\t362802\n打糕舞\t362803\n中华儿女\t362804\n24k金\t362805\n汽蚀\t362806\n泥工\t362807\n鹿儿岛\t362808\n白坯\t362809\n一年级数\t362810\n独家记忆\t362811\nOverdose\t362812\nntko\t362813\n晶\t362814\njingan\t362815\n六科\t362816\n官僚资本主义\t362817\n润肤\t362818\nEscorts\t362819\nWMM\t362820\n外交\t362821\n摸狗\t362822\n清白\t362823\n中山电信\t362824\n施强\t362825\n锁点\t362826\nect\t362827\nPhotoshop/PS画\t362828\n一战到底\t362829\nPARK\t362830\n小阁\t362831\n新疆公务员考试网\t362832\nneuro\t362833\n黑蚂蚁\t362834\n无锡大学\t362835\n榆树\t362836\n创业公司\t362837\n3000ml\t362838\n北京有色金属研究总院\t362839\n地獄\t362840\n冷裱\t362841\n电极片\t362842\n红心柚\t362843\n姜丰\t362844\n健康猫\t362845\n棠下村\t362846\nHypnosis\t362847\n20170602\t362848\nShocking\t362849\n闭麦\t362850\nsspd\t362851\n五保\t362852\npluse\t362853\n治房\t362854\n200亩\t362855\nconfigurable\t362856\n冯仑哈尔\t362857\nRHCA\t362858\n济宁经济技术开发区\t362859\n提干\t362860\nworried\t362861\n行刑者\t362862\nhaoge0205\t362863\n350mm\t362864\n正常\t362865\nOptimal\t362866\n镇长团\t362867\n操作指导书\t362868\n一刀两断\t362869\n5858\t362870\n温和型\t362871\n五件\t362872\n九牛\t362873\n游后感\t362874\n美芝\t362875\n密集柜\t362876\nAmps\t362877\nsequlize\t362878\n过来看\t362879\n呼吸灯\t362880\n章太炎\t362881\n12.3\t362882\n剑波\t362883\n沦陷\t362884\n拆装\t362885\nmxsps\t362886\n花柳\t362887\nunderground\t362888\n锦屏山\t362889\n苗寨\t362890\n焦健\t362891\nABO\t362892\n淡雅\t362893\n万用板\t362894\n安吉星\t362895\n人保局\t362896\n20万套\t362897\n独发\t362898\n上号\t362899\nconsent\t362900\n武汉小区\t362901\n撤消\t362902\n2420\t362903\nUSBKiller\t362904\n毛冬青\t362905\n江西省图书馆\t362906\nCaddy\t362907\n景點\t362908\nifla\t362909\n车距\t362910\nmanasXX\t362911\n草类\t362912\n光合谷\t362913\nFrozen\t362914\nWebpack4\t362915\n远非\t362916\n昌化镇\t362917\n骁龙845\t362918\n太平洋城\t362919\n霭\t362920\n黄金梅丽号\t362921\n血竭\t362922\n口疮\t362923\nbutterknife\t362924\n浔峰岗\t362925\nOPI\t362926\n大孩子\t362927\n气保焊机\t3
62928\n恒卦\t362929\n抗毒\t362930\nEView\t362931\n劈叉\t362932\n广州琶洲国际会展中心\t362933\n董贞\t362934\nVisio\t362935\ndenial\t362936\nit168\t362937\n江苏卫视跨年演唱会\t362938\n咽炎\t362939\n百度地址\t362940\n醋酸铵\t362941\nprote\t362942\n给排水\t362943\n金盘\t362944\n0580\t362945\nCOROLLA\t362946\n3650\t362947\n看了想\t362948\n张慧萍\t362949\n镗刀\t362950\n夜夜\t362951\n翻打\t362952\n粤曲\t362953\n葫芦网\t362954\n浙江理财网\t362955\nTraceroute\t362956\n大力丸\t362957\n阿卜杜拉国王科技大学\t362958\n龙悦居\t362959\n10迅雷\t362960\n新华公园\t362961\n加工品\t362962\n医养\t362963\nMyToken\t362964\n智富时代\t362965\n赵芸\t362966\n万和城\t362967\n一段时间\t362968\n浙江大厦\t362969\n电展\t362970\n配电器\t362971\n恒企教育\t362972\n2.70\t362973\n卓师兄\t362974\ncitizens\t362975\n丛林赤子心\t362976\n喷码机\t362977\nJNL\t362978\n小谢\t362979\n色丁\t362980\nNSURLSession\t362981\n收藏馆\t362982\n金都华庭\t362983\n百度盘\t362984\n美菱电器\t362985\n苏振华\t362986\n佳兆业中心\t362987\n2770\t362988\n降服\t362989\n浙江一区\t362990\n猎空\t362991\n第十二届\t362992\n画江湖\t362993\n牌坊街\t362994\n中国近现代史纲要\t362995\n·\t362996\nqualitative\t362997\nGORM\t362998\n定盘\t362999\n色系军团_\t363000\nVivoBook\t363001\n029\t363002\n曾格格\t363003\n鲁拜集\t363004\n攻下\t363005\nINSTRUMENTS\t363006\n道里区\t363007\n被俘\t363008\n侘寂\t363009\n过三关\t363010\nptx\t363011\n汽修店\t363012\nmax97\t363013\n电子科大研招网_电子科技大学\t363014\nyzz\t363015\n具\t363016\nfep\t363017\nOR值\t363018\n2016款款\t363019\nmenhera\t363020\n2版\t363021\n左安门\t363022\n钛合金\t363023\n郴房网\t363024\nIMS\t363025\n定向生\t363026\n辐射3吧\t363027\n女警花\t363028\n吕梁英雄传\t363029\n地税网\t363030\n光韵\t363031\ntlbb\t363032\n胰岛素注射液\t363033\n毕业论文+\t363034\n盛泽镇\t363035\n一周目\t363036\n蜃楼\t363037\nCCTV-13\t363038\nxxxxx\t363039\n郭致星\t363040\n小渔\t363041\n核磁谱图\t363042\n杭州口腔医院\t363043\n户门\t363044\n人民微博\t363045\n侨\t363046\n解行\t363047\n发利\t363048\n7506\t363049\n梦回\t363050\n345678\t363051\n环保部\t363052\n锦宏\t363053\n恒升\t363054\n提款机\t363055\nm127-m128\t363056\n泰开\t363057\n网套\t363058\nlives\t363059\n#育儿大作战#\t363060\n助焊剂\t363061\n摆正\t363062\nYua\t363063\n未成\t363064\n5.2.3\t363065\n水街\t363066\n588ku\t363067\n内推\t363068\nLayla\t363069\nPerri\t363070\n詹姆斯·哈登\t363071\n
伴生矿\t363072\n客运段\t363073\n日木\t363074\n异人\t363075\n婚誓\t363076\n宁国市政府网\t363077\n天气台风网\t363078\nCREO\t363079\n活跃\t363080\ncshtml\t363081\n起亚嘉华\t363082\n欧乐\t363083\n牛黄降压丸\t363084\n蚕豆网\t363085\n切块机\t363086\n泰然\t363087\n天泵\t363088\n梦幻骑士\t363089\n中华人民共和国文化\t363090\n袁华\t363091\n佛甲草\t363092\nDMSO\t363093\nWindows资源管理器\t363094\n扎吉托娃\t363095\nNginx\t363096\n人大主任\t363097\nIIS7\t363098\nStanford\t363099\nthin\t363100\n李志平\t363101\n重庆斯威\t363102\n紫钗记\t363103\n唐倩\t363104\n被性\t363105\n智库网\t363106\n弹城\t363107\n泡菜坛\t363108\n顾长安\t363109\ngopath\t363110\nUCloud\t363111\n液溴\t363112\n家庭会议\t363113\n明月峡\t363114\n萌小希\t363115\n里山\t363116\n东方红魔乡\t363117\nIMEI\t363118\n38位\t363119\nC类地址\t363120\n阳虚_\t363121\n2411\t363122\n贵州省卫生和计划生育委员会\t363123\nN网\t363124\n琏\t363125\n实操篇\t363126\n国际货运代理\t363127\n哈尼喵汉化组\t363128\n布兰卡\t363129\n方强\t363130\nCloudFlare\t363131\n15c\t363132\n三职\t363133\n7805\t363134\nce\t363135\n车版\t363136\n马帮\t363137\n五寸\t363138\nPX\t363139\nstartuml\t363140\n佳能mp280\t363141\n八册\t363142\ninstaller程序包\t363143\nUnlocker\t363144\n岩井\t363145\nchinajoy\t363146\nipad\t363147\n足球之夜\t363148\n严防\t363149\n2[\t363150\n上海迪斯尼\t363151\n名诗\t363152\n吠陀\t363153\n1200M\t363154\n时宜\t363155\n沙利文\t363156\n出芽\t363157\n耳鼓\t363158\n新兰永恒\t363159\nTitles\t363160\n屠场\t363161\nTechnology\t363162\n好主意\t363163\n牛乳茶\t363164\nWDR5600\t363165\n武汉市住房保障和房屋管理局\t363166\n4月12日\t363167\n老乡群\t363168\n梧州站\t363169\n图币\t363170\n人闲桂花落\t363171\n第一局\t363172\n风堂\t363173\n王皓\t363174\n东浩兰\t363175\n智战\t363176\n陈皓\t363177\n588元\t363178\n扣\t363179\nALL土吧\t363180\n活动案\t363181\n全英赛\t363182\n净慧\t363183\n上沙\t363184\n住宿加早餐旅馆\t363185\n一补\t363186\nPersona\t363187\n项目部\t363188\n2.8b\t363189\n天主堂\t363190\nvivienne\t363191\nAnyConnect\t363192\ncpo\t363193\ndani\t363194\nArchiCAD\t363195\n窑子\t363196\n华为b3\t363197\n小爸爸\t363198\n浦东新区人民法院\t363199\n秘道\t363200\n30幅\t363201\n文字缘\t363202\n三管\t363203\n一万公里\t363204\n杨鑫\t363205\n吉塔\t363206\n黄菊花\t363207\n木子美\t363208\nIQC\t363209\n消炎\t363210\nTCAD\t363211\n哈喽\t363212\n盘根错节\t363213\nAsio\t363214\ngom\t363215
\n居然之家\t363216\nexcel365\t363217\n东奔西走\t363218\n北京交通大学出版社\t363219\n1.6G\t363220\n9晚\t363221\n江苏教育新闻网\t363222\n运出\t363223\nLifeSmart\t363224\n很单纯\t363225\n蕾哈丁磊\t363226\n我老公的家庭教师\t363227\n发型屋\t363228\nlack\t363229\n羊胎\t363230\n比瑞吉集团\t363231\n大连华信\t363232\n周公瑾\t363233\n20多\t363234\n大超\t363235\n车费\t363236\n寻租\t363237\n胡小明\t363238\nkss\t363239\n汪茜茜\t363240\nPawn\t363241\n海南陵水\t363242\n融投\t363243\n元魂\t363244\n192.168.2.1\t363245\n南京装修公司\t363246\n私帐\t363247\n荒淫无度\t363248\n105天\t363249\n粗口\t363250\n幼鸟\t363251\n邮乐\t363252\n出错率\t363253\n领克03\t363254\n洗化\t363255\n第114集\t363256\nEX280\t363257\n沈悠\t363258\n合肥幼儿师范高等专科学校\t363259\n造谣者\t363260\n愛情\t363261\n半飞秒\t363262\nSPOC\t363263\n发散性\t363264\n在线\t363265\nTEST\t363266\n赤灵芝\t363267\n厢式\t363268\n筱妖\t363269\n手忙脚乱\t363270\n米兜熊\t363271\n50回\t363272\n名医在线\t363273\nindexer\t363274\nLQ\t363275\n乱世楚歌问仙志\t363276\n健康保险\t363277\n奥迪s3\t363278\nnatuzzi\t363279\n1px\t363280\n超滤\t363281\ninputs\t363282\n公示版\t363283\n福州三中\t363284\n回首\t363285\n吉川爱美\t363286\n梦龙\t363287\nFAG轴承\t363288\n野花\t363289\n母头\t363290\n农药管理条例\t363291\n黑莓Z10\t363292\n湿润\t363293\n25首\t363294\n病娇模拟器\t363295\n筹建处\t363296\n2.2万\t363297\n乐视乐2\t363298\n合同诈骗罪\t363299\n墓表\t363300\nngk\t363301\n医学科\t363302\n油炸小吃\t363303\n展示牌\t363304\n奔驰S350\t363305\n老歌手\t363306\n蕾蔻\t363307\n江苏电力\t363308\n龙啸云\t363309\n熔核\t363310\nlougou\t363311\n海南州\t363312\n自考准考证\t363313\n威尔克姆\t363314\n铁艺\t363315\n下定\t363316\n南京地铁S1号线\t363317\noffice2003\t363318\n_锋\t363319\nprice\t363320\n20150517\t363321\n大月\t363322\n红雪松\t363323\n覆盆子茶\t363324\n速弹\t363325\n﹞\t363326\n吸允\t363327\n塔巴夫\t363328\n相适应\t363329\n夜桜字幕组\t363330\n贵州\t363331\n洗完澡\t363332\nrights\t363333\n独角兽公司\t363334\n新店村\t363335\nexa\t363336\n3.01\t363337\n安福路\t363338\napche\t363339\n二十一点\t363340\n熏香\t363341\n灰铸铁\t363342\n阳台\t363343\n房卡棋牌游戏\t363344\n弗赖堡\t363345\n秒表计时器\t363346\n.net\t363347\n御刃者\t363348\n信息系统监理师\t363349\n唐红的恋歌\t363350\n查济古镇\t363351\n侨鑫汇悦台\t363352\n老年学\t363353\n发电量\t363354\n编解码\t363355\nvcruntime\t363356\n三把火\t363357\n_天维\t363358\n香坊区\t363359\nWs\
t363360\n3月30号\t363361\njsz\t363362\n知趣\t363363\n刊头\t363364\n王光美\t363365\n老品\t363366\n嘉兴国际商务区\t363367\n己内酰胺\t363368\n凯迪网络\t363369\ngeometry\t363370\nKMeans\t363371\n2016年6月16日\t363372\n嘉木\t363373\nExiles\t363374\n_律行网\t363375\n半壁\t363376\n四天三晚\t363377\n恐龙世纪\t363378\n马玲\t363379\n2018.4\t363380\n能行\t363381\nSim卡\t363382\n潭拓寺\t363383\n落合\t363384\n天宁区\t363385\n表冷器\t363386\n44P\t363387\n息差\t363388\n夺神之权永久区|流放之路\t363389\n装量\t363390\n小红书福利社\t363391\n许明\t363392\nSECOND\t363393\n保健按摩师\t363394\n海南航空公司\t363395\n搅拌罐\t363396\nLOFT\t363397\n应答机\t363398\n35DC3473\t363399\n万科御澜道\t363400\n高爆\t363401\n香港赛马会\t363402\n快贷网\t363403\nSpartacus\t363404\nporous\t363405\n错行\t363406\n剑录\t363407\n银座股份\t363408\n仟佰\t363409\ncivilization\t363410\n品管圈\t363411\nBUNNY\t363412\n私募基金\t363413\n像梦一样自由\t363414\n野情\t363415\ndislike\t363416\n八大碗\t363417\n加料机\t363418\npolis\t363419\nf502\t363420\nnetdraw\t363421\n化\t363422\n华夏地理\t363423\n12色\t363424\n朗伯\t363425\n邓肯\t363426\n质量保证书\t363427\n赤名莉香\t363428\nequipped\t363429\n旅行商\t363430\nvacancy\t363431\n蔡继明\t363432\n刘思思\t363433\n以租代购\t363434\n白米粥\t363435\n绿萝吊兰\t363436\nv4.9\t363437\n20150927\t363438\n喀什政府\t363439\n灵龙\t363440\n逸居\t363441\n广州美术学院\t363442\n太原市小学\t363443\n广西防城港市\t363444\n非宝\t363445\n陶森特\t363446\n鲁特琴\t363447\nMATLAB2016a\t363448\n老外滩\t363449\n场站\t363450\n谢依彤\t363451\nds3\t363452\nContinued\t363453\n悠逸\t363454\n困窘\t363455\n马小军\t363456\n静默\t363457\n金乡政府网\t363458\n陈恒\t363459\n重铬酸钾\t363460\npcauto\t363461\nborrowing\t363462\n赵茹珍\t363463\n广东省政府采购中心\t363464\n怒火攻心2\t363465\n超高分子量聚乙烯\t363466\nMalformed\t363467\n宜山镇\t363468\n多任务\t363469\n战道\t363470\n假警\t363471\nmactex\t363472\n负利率\t363473\nLanvin\t363474\n瀑布流\t363475\ne家\t363476\n泰达米尔\t363477\n月收入\t363478\n人卫\t363479\n湿化\t363480\n彭场镇\t363481\nfighting\t363482\n重庆中国三峡博物馆\t363483\nreact-router4\t363484\n小赖子\t363485\nParticles\t363486\n在列\t363487\n拉封丹\t363488\n坍塌\t363489\nselect框\t363490\n第48条\t363491\nUltraVNC\t363492\n杯壁\t363493\n西师\t363494\n高额\t363495\n医学工程在线\t363496\n二三十岁\t363497\n鞍山市\t363498\n38毫米\t363499\
n在天边\t363500\ncibn\t363501\n氨糖软骨素加钙片\t363502\nV2.10\t363503\nWrestlemania\t363504\n家公\t363505\n三圣母\t363506\n聚众斗殴\t363507\n乌镇\t363508\nAlexandre\t363509\n学习成绩\t363510\n积攒\t363511\n瑞兰\t363512\n00149\t363513\ndbname\t363514\n显学\t363515\nAntDesign组件库\t363516\ngamepad\t363517\nMedHelp\t363518\n平方毫米\t363519\n1949年以来\t363520\nM9\t363521\n非税收入票据\t363522\n彩虹表\t363523\n1848\t363524\n宁德市区\t363525\n杀人书\t363526\nOPO\t363527\n资告\t363528\n8500元\t363529\n动档\t363530\nHandoff\t363531\navrdude\t363532\n秦国\t363533\nQQ仙侠传\t363534\n24帧\t363535\n飞饼\t363536\n230亿\t363537\nJIMMY\t363538\n香榭国际\t363539\n叶绿素\t363540\n蓝莓酱\t363541\n润禾\t363542\nnpmrc\t363543\nHUNTER\t363544\n塔娜\t363545\n利安达会计师事务所\t363546\nAcqua\t363547\ni58500\t363548\nlatitude\t363549\n360随身wifi驱动\t363550\n贤能\t363551\n护石\t363552\n国誉\t363553\n污染源\t363554\nHDWK\t363555\n安德鲁斯\t363556\n液化石油气储罐\t363557\njediscluster\t363558\nprim\t363559\n白马\t363560\n肯达尔·詹娜\t363561\nX染色体\t363562\n中国交通在线\t363563\n弹屏\t363564\n8840\t363565\n醉春风\t363566\nTiny\t363567\n欢乐行\t363568\nTERM\t363569\n高教\t363570\n樊胜美\t363571\n白泽\t363572\n红鹿\t363573\n求见\t363574\n高效液相色谱\t363575\n欲破\t363576\n喧哗与骚动\t363577\n矢量素材设计图\t363578\n惊人\t363579\n3464\t363580\n针筒\t363581\n第102章\t363582\npcap\t363583\n北翟路\t363584\n黏贴\t363585\n安托山\t363586\n格局\t363587\nCRY\t363588\n老薛\t363589\n万宝城\t363590\nfenix5x\t363591\n二手电脑吧\t363592\n特比\t363593\n观察类\t363594\n7月24日\t363595\n夜光灯\t363596\n天天动听\t363597\n装卸作业\t363598\nwitches\t363599\n刀刀\t363600\n365电影网\t363601\n深圳少年宫\t363602\n胃积食\t363603\n圣元国际\t363604\n当晚\t363605\n畅读\t363606\nclutter\t363607\n诱饵\t363608\n草莓汁\t363609\n代销商\t363610\n方舟方块\t363611\nban\t363612\n拘留所\t363613\n西安建筑科技大学华清学院\t363614\n剑网三苍云\t363615\n代种\t363616\n泰安市中心医院\t363617\nhenry\t363618\ntransporter\t363619\n世界末\t363620\n招生简章\t363621\nMAK\t363622\n俞可平\t363623\n圣诺\t363624\n湖南博物馆\t363625\npdf版\t363626\n抗联\t363627\n列传\t363628\n贝雷梁\t363629\ncuter\t363630\n待批\t363631\nDL380\t363632\n相山区\t363633\n铡美案\t363634\n赫赫有名\t363635\n小规模纳税人\t363636\nBrembo\t363637\n大同市\t363638\n保温砖\t363639\n冴子\t363640
\n大乐师\t363641\n河南小学\t363642\n煤场\t363643\n海能文库\t363644\n美年大健康体检中心\t363645\n松糕鞋\t363646\n掉头\t363647\n商务部研究院\t363648\n荣耀路由X1\t363649\nimporting\t363650\nivvi\t363651\nhighCharts\t363652\n拉毛\t363653\n李清照\t363654\n齐扎拉\t363655\n9.028\t363656\n北京瑰丽酒店\t363657\n水上运动\t363658\n工作女孩\t363659\n来战\t363660\n秀贤\t363661\nITS\t363662\n项城市\t363663\n吉普指南者\t363664\n应变仪\t363665\nhd620\t363666\n装订\t363667\nwin7注册表\t363668\n金尚路\t363669\n木梁\t363670\n基米\t363671\n鲁建\t363672\n银河系中心\t363673\n半塘\t363674\n准率\t363675\n蔡华\t363676\n差序\t363677\ncctv3\t363678\n黑帮片\t363679\n60册\t363680\n365元\t363681\n我的青春我的梦\t363682\n区商务局\t363683\n首发式\t363684\n大席\t363685\n内扣\t363686\n股债\t363687\n河南省交通厅\t363688\nn20\t363689\n和差\t363690\nmppt\t363691\n脏水\t363692\n三庭五眼\t363693\n成都理工\t363694\n56式\t363695\n神道\t363696\natid\t363697\n观唐\t363698\n绿酒\t363699\nCarson\t363700\n对身\t363701\n沈腾\t363702\n国际经济学\t363703\n咕咕\t363704\n线艺\t363705\n例题\t363706\n周慧\t363707\n寻物\t363708\n棋童\t363709\n越野e族论坛\t363710\nm6700\t363711\n二道区\t363712\n20多亿\t363713\n事中\t363714\n杨芳\t363715\n查册\t363716\n辞书\t363717\nflyme6\t363718\n棱锥\t363719\n普汇云通\t363720\n不连\t363721\n蒲松龄\t363722\n厉\t363723\n蟹总\t363724\nLind\t363725\n必途\t363726\n安陆市\t363727\n聚吡咯\t363728\n女子学院\t363729\n女表\t363730\n转国\t363731\nmyanmar\t363732\n惠新西街南口\t363733\nReebok\t363734\n这事儿\t363735\n姜蓉\t363736\n2千克\t363737\n下床\t363738\n沪江\t363739\n民航医院\t363740\n学唱\t363741\nso库\t363742\n厕纸\t363743\n言传\t363744\n谦称\t363745\n20171011\t363746\n真元\t363747\n救捞\t363748\n吹爆\t363749\n羞羞鬼\t363750\n游击\t363751\ngetpwnam\t363752\n指示符\t363753\n葛氏\t363754\n今井\t363755\n桥梁板\t363756\n芙蓉路\t363757\n同族\t363758\n船体\t363759\n水利工程\t363760\ncog\t363761\n永新县人民政府\t363762\n嘉事堂\t363763\nC0930\t363764\n三字经网\t363765\n定义\t363766\n外饰件\t363767\n邦帮堂\t363768\nメンズ\t363769\n新觉\t363770\n八爪鱼采集器\t363771\n水墨屏\t363772\n扎尕那\t363773\n水电气\t363774\n2607\t363775\n双福新区\t363776\n张董\t363777\n方舟:生存进化\t363778\ndroppable\t363779\n200万股\t363780\n发条橙\t363781\n波德申\t363782\n春秋战国\t363783\n玉皇顶\t363784\n德胜街道\t363785\n城步\t363786\neditable\t363787\n女上司\t363788\n欧几里德
\t363789\nAGODA\t363790\n颞颌关节炎\t363791\n自反\t363792\n心理咨询师\t363793\ndragsort\t363794\ngsx\t363795\n康师傅绿茶\t363796\n庞贝古城\t363797\n广州空港经济区\t363798\n向光\t363799\ndodo-yufan\t363800\n20180218\t363801\n37.5\t363802\n叶圣陶杯\t363803\n空壳\t363804\nharvesting\t363805\n结膜\t363806\nchair\t363807\n五味瓶\t363808\n因果关系\t363809\n1.5L\t363810\n朝韩\t363811\nutxo\t363812\n仰卧起坐板\t363813\n肺结核皮试\t363814\n0528\t363815\n深圳国际低碳城\t363816\nWait\t363817\n谈天\t363818\n2331\t363819\n克尔\t363820\n10台\t363821\n来不来\t363822\n316i\t363823\n钟神\t363824\n船王\t363825\n电陶炉\t363826\nhp1008\t363827\n北京大学医学部公共卫生学院\t363828\n更完\t363829\n小迪\t363830\n江西信息应用职业技术学院\t363831\n继母\t363832\n剑阵\t363833\n攻防\t363834\n草汁\t363835\nCGSS\t363836\nmylove\t363837\n是真是假\t363838\n外汇期权\t363839\n朔州\t363840\nwww.wenku1\t363841\n上海租车\t363842\n柯桥\t363843\nmountain\t363844\nStatsmodels\t363845\nERP企业管理软件\t363846\nLightRoom\t363847\n3d扫描仪\t363848\n不折不扣\t363849\n1884\t363850\n拆换\t363851\n菌剂\t363852\nlt\t363853\npill\t363854\n115200\t363855\nuitableview\t363856\n早上7点\t363857\nwin虚拟机\t363858\n小白羊\t363859\n博彦\t363860\n金瓶玉梅\t363861\n总决\t363862\n病名为爱\t363863\n自豪版\t363864\n凯字\t363865\n里尔克\t363866\n汤姆·汉克斯\t363867\n郭碧王宁\t363868\nmdk5\t363869\n清平乐·村居\t363870\n联检\t363871\nubutu\t363872\n6天5晚\t363873\n网络加速器\t363874\n剑士\t363875\n文书库\t363876\nmucho\t363877\n2018年03月16日\t363878\nstm32f4xx\t363879\n178cm\t363880\nC++6.0\t363881\n冰与火之歌:权力的游戏\t363882\n吕秀菱\t363883\n劳社厅\t363884\n啪啪声\t363885\n仁恒公园\t363886\n操作室\t363887\n若干年\t363888\nconcerned\t363889\nPart1\t363890\n江燕\t363891\n版史\t363892\n麒麟丸\t363893\n深圳市龙岗中心医院\t363894\n枕上\t363895\n妖精的尾巴吧\t363896\n张春丽\t363897\n记忆式\t363898\n淘宝卖家运费险\t363899\n英磅\t363900\n聂小倩\t363901\n汶川大地震\t363902\nGT240\t363903\n新观点\t363904\n瓶罐\t363905\n美丽的祖国\t363906\nVPP\t363907\n豆筋\t363908\n本田CR-V论坛_汽车之家论坛\t363909\nXfce4\t363910\n中国大学生创业\t363911\n吻\t363912\n可数名词\t363913\na83\t363914\n徐志贤\t363915\n新东方教育科技集团\t363916\n部督\t363917\n新飘流幻境\t363918\n同好\t363919\n甲a\t363920\n前移\t363921\n梦幻口袋版\t363922\n座椅套\t363923\n六_\t363924\n光屁屁\t363925\nnetbeans\t363926\n留钱\t
363927\n26010\t363928\n刀剑斗神传\t363929\n迪信通\t363930\nWingIDE\t363931\n农村土地承包法\t363932\n佳程\t363933\n箱体类\t363934\n光迅科技\t363935\n万科企业股份有限公司\t363936\n李墨之\t363937\n七号房\t363938\n两表\t363939\n枪毙\t363940\n疮疤\t363941\n显示屏\t363942\n抗美\t363943\nszt\t363944\n玛格丽特花\t363945\n中共深圳市委党校\t363946\n裸骑\t363947\n卡戴珊\t363948\n品牌策划公司\t363949\nTL95\t363950\n第187集\t363951\n尊巴\t363952\nWISP\t363953\n南师大\t363954\n2公斤\t363955\n输卵管造影\t363956\n姚贝娜\t363957\n独幕剧\t363958\n4月5日\t363959\n乡镇级\t363960\n银行信息港\t363961\nv4.4.1\t363962\nxhtml\t363963\n奥马电器\t363964\n陈烨\t363965\n浙江工业大学信息工程学院\t363966\n梁祝小提琴协奏曲\t363967\n国泰君安证券股份有限公司\t363968\n大安\t363969\n饮露\t363970\n刘宝红\t363971\n宁芜铁路\t363972\n自由杯\t363973\n智悲\t363974\n780万\t363975\n气数\t363976\n过滤箱\t363977\n王晓萍\t363978\n大华乐橙\t363979\n设计单\t363980\n青海省教育厅\t363981\n坟岗\t363982\n想不懂\t363983\n友阿\t363984\n邢台东\t363985\n百家号\t363986\n4.33\t363987\nwesg\t363988\n无间道\t363989\nLABO\t363990\n2017.4\t363991\n百度地图标记\t363992\n童林传\t363993\nSkada\t363994\n第三方包\t363995\n往复式压缩机\t363996\n姜家镇\t363997\nip\t363998\n下车\t363999\n长白山国际度假区\t364000\n仰视\t364001\n泰尔指数\t364002\n九福\t364003\n精灵类\t364004\n爱情文\t364005\n宋洋\t364006\n刻痕\t364007\n小查\t364008\n上海医保定点药店\t364009\n王东东\t364010\n37P\t364011\nB股\t364012\n静影\t364013\n暖薪\t364014\n围炉夜话\t364015\n泉南高速\t364016\n赫美\t364017\n中国人寿保险\t364018\n100u.dll\t364019\n1278\t364020\npe管\t364021\n分图\t364022\n亚城\t364023\n富大龙\t364024\n穷团歌\t364025\n视高\t364026\ngamesir\t364027\n侦探小说\t364028\n杜马\t364029\n崔护\t364030\nAdy\t364031\nMoons\t364032\n肉赘\t364033\n小眼儿\t364034\n周博\t364035\n刘德\t364036\n网页导航栏\t364037\n优购物\t364038\n宜宾市\t364039\n武装起义\t364040\n纯洁村\t364041\n非网\t364042\n钢琴\t364043\n雪影\t364044\n运动和力\t364045\nExercises\t364046\n热电厂\t364047\nPhot\t364048\n1728\t364049\n冈布奥\t364050\n拷贝机\t364051\n教贴\t364052\nJackson\t364053\n丹尼尔·雷德克里夫\t364054\n12KM\t364055\n解蜜\t364056\n84mm\t364057\n巴斯克维尔\t364058\n截港\t364059\n往矣\t364060\n什么形\t364061\n守卫剑阁\t364062\n济南市人民政府办公厅\t364063\n南方基金管理股份有限公司\t364064\n小丑\t364065\n仙岳山\t364066\n硼肥\t364067\ndt990\t364068\n耐候板\t364069\n呕吐物\t364070\n贵德县\t364071\
n暑秋\t364072\n元界\t364073\nPS快捷键\t364074\n无固定期限劳动合同\t364075\n消食\t364076\n索尼RX100\t364077\n死鬼\t364078\n林道\t364079\n亮出\t364080\n国家会展中心\t364081\n四川水利职业技术学院\t364082\n默不作声\t364083\n王守海\t364084\nKanon\t364085\n凸集\t364086\n游泳裤\t364087\n吉他指弹谱\t364088\n甘洛\t364089\npierce\t364090\n冯友兰\t364091\n1296\t364092\n差异值\t364093\n荷叶塘\t364094\n和居\t364095\n心理咨询网\t364096\n张晓江\t364097\n银银\t364098\n冒险岛扎昆\t364099\n定位仪\t364100\n吊具\t364101\n参处\t364102\n权财\t364103\n王者荣耀2018\t364104\n毕业前\t364105\n两相电\t364106\n真空包\t364107\n高坂\t364108\n秦雪梅\t364109\n2017年11月23日\t364110\n两建\t364111\n七度\t364112\n放化疗\t364113\n张健\t364114\nSEQUENCE\t364115\ncheerful\t364116\n电热带\t364117\n校园招聘网\t364118\ndirectory域\t364119\n张家界玻璃桥\t364120\n热熔型\t364121\n辩证否定观\t364122\n银川站\t364123\n6cm\t364124\n20a\t364125\n民航大学\t364126\n蛇仙\t364127\n宗毅\t364128\nUSB启动盘\t364129\n回墨\t364130\n少女装\t364131\nLudwig\t364132\n57页\t364133\n湖北省监察厅\t364134\n五霸\t364135\n凤凰无双\t364136\nA800_\t364137\n龙盾\t364138\n所述\t364139\n大智移云\t364140\n0999\t364141\n中国共产党地方委员会\t364142\n南口\t364143\n一个九\t364144\nufi\t364145\n2万家\t364146\n征收局\t364147\n青岛人\t364148\n冷藏盒\t364149\ncdata\t364150\n针线\t364151\n鑫苑国际新城\t364152\nsvi\t364153\nochirly\t364154\n张文显\t364155\n韩剧资料馆\t364156\n丁湾\t364157\ngocheck\t364158\n6件\t364159\ndisable\t364160\n欲绝\t364161\nowens\t364162\n4L\t364163\n万集\t364164\nCFI\t364165\n无锡地铁\t364166\n摔角狂热\t364167\ndemoblog\t364168\n3113\t364169\n黑龙波\t364170\n慈溪市人民医院\t364171\n死草\t364172\ngep\t364173\n电子客票号\t364174\n增刊\t364175\n方段\t364176\nR6\t364177\n姜红\t364178\nSkinStore\t364179\n徐雁\t364180\nitmo\t364181\n王四营乡\t364182\n应发\t364183\nTor\t364184\n小卷\t364185\n59_\t364186\nDwg\t364187\n接线员\t364188\nBrowserify\t364189\n600999\t364190\n厄洛替尼\t364191\n天猫2017\t364192\n日本武田\t364193\n热书吧_言情小说吧\t364194\n长夜漫漫\t364195\n西安电子科技大学\t364196\n庆\t364197\nW540\t364198\nsleek\t364199\n王文远\t364200\nE13\t364201\n幻龙\t364202\n空谷幽澜\t364203\n庸懒散\t364204\n匣子小说网\t364205\n临死\t364206\n盼达用车\t364207\n氢化铝锂\t364208\npetapoco\t364209\n一千零一个\t364210\n晕轮效应\t364211\n如泣如诉\t364212\nbeholder\t364213\n艋舺\t36421
4\n青岛工商局\t364215\npla\t364216\ndiamante\t364217\n2899元\t364218\n古训\t364219\n贝隆\t364220\n金装四大才子\t364221\n孝女\t364222\n搭架子\t364223\n于野\t364224\n叶子\t364225\n尔泰\t364226\n4201\t364227\n旅行袋\t364228\n哲仁波切\t364229\nWSC\t364230\n十六天\t364231\n网易战三国\t364232\nIOS6\t364233\n三多一少\t364234\n孙露\t364235\n地氯雷他定片\t364236\nzenfone\t364237\nstty\t364238\nvS\t364239\n立思辰\t364240\n镍丝\t364241\n敌敌畏\t364242\n抡起\t364243\n酷吧\t364244\n步街\t364245\n湿式双离合\t364246\n股掌\t364247\nprocesses\t364248\n北京财富中心\t364249\n非现金\t364250\n高码\t364251\nSTM32F4\t364252\n泡妞\t364253\n2时\t364254\n粗粒式\t364255\nBrett\t364256\n逆剑\t364257\n跳下\t364258\nfeer\t364259\n松节油\t364260\n绿篱\t364261\n巨鼠\t364262\n青花\t364263\n托管费\t364264\n祁阳站\t364265\n楼群\t364266\n鼻炎康\t364267\nTAP\t364268\n汪兴平\t364269\nBonded\t364270\n鼎龙股份\t364271\n东坡路\t364272\n北京南站\t364273\nHoon\t364274\nnecessity\t364275\n娜迦族\t364276\n梦幻西游渡劫\t364277\n京都念慈庵蜜炼川贝枇杷膏\t364278\nResponsibilities\t364279\n2576\t364280\n10天左右\t364281\n英式\t364282\n三泉路\t364283\n美女衣\t364284\n最简二次根式\t364285\n龙门镖局\t364286\n执念师\t364287\n阑珊梦\t364288\nwad\t364289\nBroken\t364290\n昌大\t364291\n一棵树\t364292\n服毒\t364293\n600754\t364294\npro3\t364295\n测设\t364296\n眼镜王蛇\t364297\n退到\t364298\n巴斯勒\t364299\n内江北站\t364300\nHI\t364301\n西沣路\t364302\n兰坪县\t364303\n张妮\t364304\n沙县人民政府\t364305\n下坡路\t364306\n这款车\t364307\n1261\t364308\nColonial\t364309\n元堂\t364310\nccs5\t364311\n妹调教日记\t364312\n张一兵\t364313\n肉车\t364314\n5140\t364315\n57see\t364316\n武夷山北\t364317\n滦州古城\t364318\nChangsha\t364319\n漫天花雨\t364320\n李哥庄镇\t364321\n0114\t364322\n纸巾盒\t364323\nT17\t364324\n哈尔滨医科大学附属第四医院\t364325\n青岩寺\t364326\n共基\t364327\nrastaclat\t364328\n双环醇片\t364329\n袁文魁\t364330\nAccessing\t364331\nPublished\t364332\n八国联军侵华战争\t364333\n钱江水利\t364334\n乐视商城\t364335\n互感器\t364336\n小刀\t364337\n反奸\t364338\n初拥\t364339\n麦克斯奥特曼\t364340\n保温泵\t364341\n三夜\t364342\n铷\t364343\n昌邑路\t364344\n医学技术\t364345\n标件\t364346\n3d打印笔\t364347\n爆板\t364348\n透层油\t364349\n朱永光\t364350\n1380\t364351\n古丽米娜\t364352\n客家网\t364353\n捷捷微电\t364354\n洛兰\t364355\n济南市市中区\t364356\n劳务协议\t364357\nirfanvie
w\t364358\n000728\t364359\n江苏恒大仪表有限公司\t364360\n广东华侨中学\t364361\n深井泵\t364362\n规划部\t364363\n人员\t364364\n开录\t364365\n阿斯利康制药\t364366\nmigd\t364367\n林夕\t364368\n特莉丝\t364369\n新世纪福音战士\t364370\n李敖\t364371\nsnmp协议\t364372\nahu\t364373\n大蠊\t364374\n光刻\t364375\n国际庄\t364376\n大谷\t364377\n九曲河\t364378\n单向历\t364379\n達達尼亞\t364380\n黄竹\t364381\n帝国时代2HD\t364382\n东原乐\t364383\n白龙山\t364384\n杰士派\t364385\n达尔豪斯大学\t364386\n东苑小区\t364387\n尼尔盖曼\t364388\nClasspath\t364389\n80名\t364390\n做客\t364391\nincorporated\t364392\n2017-04-25\t364393\n契科夫\t364394\n金鱼池\t364395\nH.265\t364396\n退伍费\t364397\n中央国务院\t364398\ndisabled=\t364399\n北风吹\t364400\n共进\t364401\n猎鹰\t364402\n情侣杯\t364403\n许诗茵\t364404\n致命守护者\t364405\n大萌\t364406\n腈纶\t364407\nchapman\t364408\nntoa\t364409\n叶林\t364410\n上海实业\t364411\n氮气柜\t364412\n麦卡\t364413\n山西医科大学第二医院\t364414\n财务报表\t364415\nvidia\t364416\n奥黛\t364417\nesteem\t364418\n胡琏\t364419\nheaderview\t364420\n乐堡啤酒\t364421\nDD373\t364422\n踩盘\t364423\n天策\t364424\n大悲咒\t364425\n维生素B2\t364426\n5.8\t364427\n军企\t364428\nSPR\t364429\nmonster\t364430\n阴符经\t364431\n真实版\t364432\n套卡\t364433\n满装\t364434\n美元指数\t364435\n沿山\t364436\n2630次\t364437\nCloudEngine\t364438\n帝辛\t364439\n胡然\t364440\n木管\t364441\n伯夷\t364442\n明先生\t364443\n恐怖童谣\t364444\nkiray\t364445\n北京民生银行\t364446\n日女\t364447\n漪汾街\t364448\nBeige\t364449\n嗅到\t364450\n千魂\t364451\n三星手机论坛\t364452\n小妹儿\t364453\n.DOC\t364454\n易行\t364455\n锯床\t364456\n骨密度\t364457\n中共杭州市委\t364458\n范军\t364459\n手人\t364460\nZKSoftware\t364461\n有品\t364462\n银杏果\t364463\n蒙巴顿\t364464\n长腰\t364465\n喷漆房\t364466\n酸枣树\t364467\n傲梅分区\t364468\n刘一水\t364469\n黄曲霉素\t364470\nbreaking\t364471\nbreezey\t364472\n游街\t364473\n比亚迪股份有限公司\t364474\n踢开\t364475\n虎妈猫爸\t364476\n地平线\t364477\n桑拿天\t364478\n慈溪实验中学\t364479\n南京师范大学教师教育学院\t364480\n74HC595\t364481\n辅警\t364482\n90.8\t364483\n钢价\t364484\n小绞车\t364485\n维纳斯\t364486\nPDF档\t364487\n民进党\t364488\n儋州\t364489\n海府\t364490\n掌握者\t364491\n中长期贷款\t364492\n濃密セックス\t364493\n王以培\t364494\n猪八戒网\t364495\n一波流\t364496\n掇刀区\t364497\n通渭县\t364498\n林间\t364499\n张建忠\t364500\n非结构化\t36
4501\n8.9%\t364502\n偶联剂\t364503\n五包\t364504\n强排式\t364505\n互动性\t364506\n数百元\t364507\n喷烟\t364508\n32升\t364509\n煤灰\t364510\nLafite\t364511\n暗影精灵3plus\t364512\n鲨鱼\t364513\nborderlands\t364514\nlookout\t364515\n零售\t364516\n科晶\t364517\n走起\t364518\n琉\t364519\n2者\t364520\n国庆阅兵\t364521\n知行杯\t364522\n文丽\t364523\n发标\t364524\n3yx\t364525\n橘郡\t364526\n行容\t364527\n流长\t364528\n股骨头坏死\t364529\nOffice2010简体中文\t364530\n9月26日\t364531\n众星捧月\t364532\n房产网\t364533\n郡城\t364534\n第七期\t364535\n段河道\t364536\n杨泗港长江大桥\t364537\n精髓\t364538\n磨磨\t364539\n一天内\t364540\n500x\t364541\nbowtie\t364542\n兴源环境\t364543\n2017年8月31日\t364544\n诸葛彦希\t364545\nwap站\t364546\n20丨\t364547\n偷猎者\t364548\n银行存款利率计算器\t364549\n体育在线\t364550\n累计\t364551\nkbox\t364552\n别动队\t364553\n安踏\t364554\n交谈\t364555\n洒下\t364556\n陈曼\t364557\n火书\t364558\n九华山风景区\t364559\n世袭\t364560\n名教\t364561\n摊面\t364562\n瀬名\t364563\n不接单\t364564\n动物世界\t364565\n第二级\t364566\n东营市人民政府办公室\t364567\n二衬\t364568\n布施\t364569\n子痫\t364570\n球形\t364571\n命盘\t364572\nMaritime\t364573\n三九企业集团\t364574\n鸭汤\t364575\n2018年03月23日\t364576\n班花\t364577\nNoble\t364578\n木叶村\t364579\n上海熹垣生物科技有限公司\t364580\n金瓜子\t364581\n山东煤矿安全监察局\t364582\n压燃式\t364583\ncpi\t364584\n内蒙古专业技术人员继续教育\t364585\n肢体\t364586\nblogs\t364587\n三箭齐发\t364588\n长春市住房公积金管理中心\t364589\nX60\t364590\n经理\t364591\nfv\t364592\n金曲奖\t364593\nhdmy\t364594\n5.30\t364595\nSocketTool\t364596\n现代汉语词典\t364597\n阿特兹\t364598\n反搏\t364599\n华源医药网\t364600\n烧砖\t364601\n连台\t364602\n党办\t364603\nfulltext\t364604\n千佛塔\t364605\n你的秘密\t364606\n非天夜翔\t364607\n高腰\t364608\n张宇燕\t364609\n第154集\t364610\n奥特兰克\t364611\n维特根斯坦\t364612\n团校\t364613\n物联科技\t364614\nbet36\t364615\nFastdfs\t364616\n林芝地区\t364617\n玄兵石\t364618\n北京动物园海洋馆\t364619\n第5章\t364620\nMobox\t364621\n施杰\t364622\n皖烟\t364623\n中博诚通\t364624\n人不偿命\t364625\n王成喜\t364626\nBraava\t364627\n济宁高新区\t364628\nhaus\t364629\nLEE\t364630\nm227\t364631\n耐旱\t364632\n中国电子技术论坛\t364633\n沐心\t364634\n兄嫂\t364635\n梅丽珊卓\t364636\n陵水\t364637\n杜蒙\t364638\n特困生\t364639\n四笔\t364640\n汤姆哈迪\t364641\n汤力\t364642\nCAUP\t364643\nKowloon\t364
644\ntem\t364645\n铅\t364646\n张昊\t364647\n冷色\t364648\n人教版小学五年级语文\t364649\n山西师大\t364650\n华虹计通\t364651\n苹果mac\t364652\nPETS3\t364653\nimmediately\t364654\n良机\t364655\n60_\t364656\n积冰\t364657\n败将\t364658\n江南国际城\t364659\n李明远\t364660\n汉鼎宇佑\t364661\nteng\t364662\n60瓦\t364663\nsaver\t364664\nPlants\t364665\n空庭\t364666\n倍量柱\t364667\n奥兹\t364668\n宁吉喆\t364669\n五笔吧\t364670\n兴欣战队\t364671\n上海东方有线\t364672\n金博\t364673\nftl\t364674\n新生村\t364675\n三唑\t364676\n工作鞋\t364677\n密封胶\t364678\n啥样\t364679\n滑溜溜\t364680\n健力多氨糖软骨素钙片\t364681\nNote9\t364682\n藿香正气\t364683\n中国纪检监察报\t364684\nCheap\t364685\n猫客\t364686\n油光\t364687\n闯天路\t364688\n黄鳝门\t364689\n萨巴\t364690\nBB霜\t364691\n骤减\t364692\nAlgorithms\t364693\n桌面\t364694\n绿叶\t364695\n经济技\t364696\n黄州定慧院\t364697\n飞坦\t364698\n杨柳\t364699\n住房维修基金\t364700\n多聚甲醛\t364701\n第B02\t364702\n音频应用网\t364703\n三尖瓣反流\t364704\n邱岳峰\t364705\n续\t364706\nSupercell\t364707\n黔南\t364708\n古钱币\t364709\n走红\t364710\nTn\t364711\nTi\t364712\n南环路\t364713\n100平方米\t364714\n467\t364715\n耀魂\t364716\nWar3\t364717\n干气\t364718\n小语网\t364719\nback键\t364720\n丰州镇\t364721\n流程表\t364722\n石缸\t364723\n5650\t364724\n评评\t364725\n万科金色悦城\t364726\n供者\t364727\nCyclone\t364728\nSyndicate\t364729\nletex\t364730\nExpedia\t364731\n安规测试\t364732\n莫娜\t364733\n仙剑奇侠传4\t364734\n郭德纲济公传\t364735\n童彪\t364736\n3339\t364737\naicc\t364738\n去年同期\t364739\nRX-78\t364740\n002555\t364741\nMDR-1ADAC\t364742\nPinyin\t364743\n康定斯基\t364744\n米米亚\t364745\n乡宁县\t364746\n挪窝\t364747\n孺子路\t364748\n陈帅\t364749\n邀功\t364750\nライフ\t364751\nBonding\t364752\n36支\t364753\n性感比基尼\t364754\n中药药理学\t364755\n智慧书\t364756\n来说\t364757\n西安古城\t364758\n新郑\t364759\n空客\t364760\n价位\t364761\n徐铭\t364762\n左嘴角\t364763\n使劲\t364764\n十二怒汉\t364765\n梅庄\t364766\n∑\t364767\n天蓬\t364768\n瑞金市\t364769\n孙锐\t364770\n资历章\t364771\n顺应性\t364772\n海水淡化\t364773\n宏时\t364774\n深业上城\t364775\n玛诺洛斯\t364776\n喉口\t364777\n东风标致301\t364778\n罚室\t364779\n银钢\t364780\nzxfuli\t364781\n24K\t364782\n专用票\t364783\n标成\t364784\n尽可能\t364785\n浩浩汤汤\t364786\n119\t364787\n刷具\t364788\n高帽子\t364789\nACB\t364790\n五华县人
民政府\t364791\ngpl\t364792\n宠物药\t364793\n待产\t364794\n浣溪沙\t364795\n东方时尚吧\t364796\n游泳圈\t364797\n移量\t364798\n赵宏博\t364799\n长沙南仁医院\t364800\n短纤\t364801\n索菲特\t364802\n银川市政府\t364803\n雨刷器\t364804\n直观性\t364805\n好網角網路收藏夾\t364806\n远期利率\t364807\n阶地\t364808\n陈友泉\t364809\n四季沐歌\t364810\n万千星辉颁奖典礼\t364811\n剑网三霸刀\t364812\n翼城\t364813\n运算符\t364814\n逍遥散\t364815\n3000次\t364816\n守财奴\t364817\nreleases\t364818\n瘦身霜\t364819\n过氧化氢酶\t364820\n切克闹\t364821\n两年之内\t364822\n隐蔽式\t364823\nRead\t364824\n无功\t364825\n企业管理员\t364826\n枸杞叶\t364827\nfingerprint\t364828\n名股实债\t364829\n小悟空\t364830\n批示\t364831\n黑河新闻网\t364832\n帕萨奥迪rs6\t364833\n中国商会\t364834\n靓丽\t364835\n飘花电影网_飘花迅雷\t364836\n南怀谨\t364837\n医学奖\t364838\n启\t364839\n产业集聚区\t364840\n两勺\t364841\nXB271HU\t364842\n左氏\t364843\n阿笨\t364844\n凤来仪\t364845\n慕寒\t364846\n电视迷\t364847\n八重\t364848\n环境科学学院\t364849\n平行者\t364850\n历山\t364851\n荣耀畅玩7A\t364852\nccie\t364853\n语林\t364854\n王岳伦\t364855\n上海微电子装备(集团)股份有限公司\t364856\n哈尔滨\t364857\n禅服\t364858\n营养价值_食谱\t364859\n兴废\t364860\n神魂至尊\t364861\n神奇阿呦\t364862\n初祖\t364863\n上海交通大学附属中学\t364864\n堵堵\t364865\n涉案\t364866\n使徒行者1\t364867\n湖南省图书馆\t364868\nMovs\t364869\n夫妻宫\t364870\n大阪\t364871\nFAT16\t364872\n光明使者\t364873\n第十五天\t364874\nvirt-manager\t364875\n天地传说之鱼美人\t364876\nfreak\t364877\nh4n\t364878\n上星\t364879\n20160515\t364880\nmnc\t364881\nfirefox\t364882\n一厅\t364883\n南和县\t364884\n上海房产交易中心\t364885\n荒岛惊魂\t364886\n449\t364887\n吞吐\t364888\n毛根\t364889\n瑞星杀毒软件\t364890\n舰员\t364891\n选举法\t364892\n2918年\t364893\n12家\t364894\n3.5MM\t364895\n开门\t364896\n方勇\t364897\n高爆弹\t364898\n东西南北风\t364899\n喇叭形\t364900\n数娱\t364901\n5546\t364902\n皮日休\t364903\n划开\t364904\nBES\t364905\n喜饼\t364906\nsurround\t364907\n同胞们\t364908\nsfo\t364909\n新浪辽宁_新浪网\t364910\nZ1/Compact\t364911\nunwanted\t364912\n0v\t364913\n友会\t364914\nAnson\t364915\n信用贷款\t364916\n南海边\t364917\n03.23\t364918\nPerez\t364919\n余岳桐\t364920\n华瑞源\t364921\n调味瓶\t364922\n热播电视剧\t364923\n白音华\t364924\n热店\t364925\nConf\t364926\n银梭\t364927\n哺乳文胸\t364928\n外甥女\t364929\n海南医学院\t364930\n新鲜肉\t364931\nMagnets\t364932\nzhog
n\t364933\nvalue\t364934\n黄小明\t364935\n付清\t364936\n5.4.3\t364937\n战投\t364938\n中国葛洲坝集团\t364939\n爱在人间\t364940\n定额查询\t364941\n督察处\t364942\n青之驱魔师\t364943\n1万张\t364944\n进纸\t364945\n坦克类\t364946\n氯丁胶\t364947\n舒适圈\t364948\n激光粒度仪\t364949\n芯片商\t364950\nnewsletter\t364951\nhacci\t364952\n8750h\t364953\nviz\t364954\n梅安森\t364955\n一平方厘米\t364956\n众乐\t364957\n吉林市人民政府\t364958\n京多安\t364959\n卫生防疫\t364960\n费比\t364961\n重汽豪沃\t364962\n5磅\t364963\nwheel\t364964\nhelios\t364965\n雷峰\t364966\n深科\t364967\n分度值\t364968\n新建小区\t364969\n底照\t364970\n甲硝锉片\t364971\n马桥镇\t364972\n湿父\t364973\n674\t364974\n哥德堡变奏曲\t364975\naccountant\t364976\n报务\t364977\n神天兵\t364978\n染织\t364979\n几毛\t364980\n直连\t364981\n提肌\t364982\n带状疱疹\t364983\n百机\t364984\nISO格式\t364985\nTower\t364986\nMeat\t364987\n正能\t364988\n鸿翔\t364989\n5800元\t364990\n传入\t364991\n崔智云\t364992\n珠海市公安局\t364993\n介于\t364994\n雨宫琴音\t364995\n仇恨值\t364996\n占用税\t364997\n插女\t364998\n时度\t364999\npremier\t365000\nc++14\t365001\n马友\t365002\n镇元大仙\t365003\n需求函数\t365004\nzoc7\t365005\n医风\t365006\n本风\t365007\n科研之家网\t365008\n基础形\t365009\n计算机网络技术专业\t365010\n中青班\t365011\n绘威\t365012\n独墅湖高教区\t365013\nBD720P高清电影BT种子迅雷\t365014\n撤兵\t365015\n算法设计与分析\t365016\n破伤\t365017\nphpMyadmin\t365018\n墨洒\t365019\n高沃\t365020\n烟站\t365021\n商住区\t365022\nDisplay\t365023\nxhr\t365024\n吴菲\t365025\n走俏\t365026\natx830\t365027\n鼓棒\t365028\n模拟试\t365029\n麻质\t365030\n村支部\t365031\nnervous\t365032\n雷希拉姆\t365033\n翻墙软件公司\t365034\n工资制\t365035\n2.5米\t365036\n良多\t365037\n性性\t365038\n车巴巴\t365039\n友女\t365040\n二三级\t365041\n柱镜\t365042\n黄调\t365043\n巴西红耳龟\t365044\n小开\t365045\n名爵3\t365046\n糸列\t365047\nnautilus\t365048\nhuang\t365049\n谢哥\t365050\n桐庐\t365051\n签名\t365052\n维生\t365053\n线绳\t365054\n卫苗\t365055\n养发\t365056\n东港新闻网\t365057\n阿肯色州\t365058\n支原体\t365059\nipcamera\t365060\n57355000\t365061\n第10卷\t365062\n相拥\t365063\n成都市人民政府办公厅\t365064\n位移法\t365065\n皇亲\t365066\n0872\t365067\n浙江外国语学院\t365068\n杨栋\t365069\n石工\t365070\n05778\t365071\nredirects\t365072\n在黑暗中\t365073\ndynamix\t365074\n三国志13PK\t365075\n眼中\t365076\n中国电力人才网\t3
65077\n陈深\t365078\n彩虹岛吧\t365079\n沂山\t365080\n泛林\t365081\npushed\t365082\n吐丝结\t365083\n壮游奇\t365084\n莱斯\t365085\n复式\t365086\n保有\t365087\n曹和平\t365088\n小米车载空气净化器\t365089\n伤疤\t365090\n初梦\t365091\n上海股权托管交易中心\t365092\n喊山\t365093\n撕破\t365094\nHedging\t365095\n老猫\t365096\n耕读\t365097\n兰太实业\t365098\n六合村\t365099\n保险人\t365100\n郑州东站\t365101\n哈姆雷特\t365102\nVMD\t365103\n券商股\t365104\n和键\t365105\n台湾妹娱乐网\t365106\n预览版\t365107\n天鹿湖森林公园\t365108\n德意志联邦共和国\t365109\n海天集团\t365110\n40万亿\t365111\n剑网3竞技大师赛\t365112\n生物安全柜\t365113\n5.0英寸\t365114\n魅族mx\t365115\n畅打\t365116\n姜河\t365117\n阿豪\t365118\n火山岩\t365119\nScarf\t365120\n孙昊\t365121\n爱贝\t365122\n智跑道君\t365123\n贯入度\t365124\n六厘米\t365125\n咏宁\t365126\nqu\t365127\nLOL英雄联盟视频_17173\t365128\n嘴部\t365129\n龙吼\t365130\n手撕包\t365131\n清远市国土资源局\t365132\n黒木\t365133\npscc\t365134\n有格\t365135\nClinton\t365136\n萌百\t365137\ndisco\t365138\nnewtonsoft\t365139\n涌进\t365140\n胰脏\t365141\n37kw\t365142\n第10回\t365143\n浙江财经大学金融学院\t365144\n富雅\t365145\n4k屏\t365146\nF35\t365147\nparameterType\t365148\npaperyy\t365149\n红盾查询网\t365150\n380x\t365151\n两匹马\t365152\nREMIX\t365153\nChap\t365154\n8083xxxx\t365155\n普通高等院校\t365156\n光宇\t365157\n绿色食品认证\t365158\n惯性系\t365159\n学军\t365160\n谋势\t365161\n地上\t365162\n缬沙坦氢氯噻嗪片\t365163\n体育人间\t365164\n慎\t365165\n六迹\t365166\nVids\t365167\n山东省国税局\t365168\n寒花\t365169\nDNS\t365170\nHelicopter\t365171\n曾母暗沙\t365172\n旧宫镇\t365173\n纳税识别号\t365174\n致贫\t365175\nbtspread\t365176\nNordVPN\t365177\n充其量\t365178\n批注式\t365179\n凯特王妃\t365180\n杭州汽车网\t365181\nCDH\t365182\n衣带\t365183\n翔哥\t365184\n192.168.10.1\t365185\n吕林\t365186\n周冰\t365187\n多发性肌炎\t365188\n滚压\t365189\n行歌\t365190\n网球肘\t365191\nattach/34031-py2\t365192\nChecking\t365193\n中信证券\t365194\nkakfa\t365195\n锦绣山河\t365196\nwin10xbox\t365197\n俊豪\t365198\ntht\t365199\n一桩\t365200\ngraphx\t365201\n市长杯\t365202\nFFU\t365203\n绸都\t365204\n361度\t365205\n残废\t365206\n鸳鸯阵\t365207\n女鬼\t365208\ncel\t365209\n标题名\t365210\nwz111\t365211\n真实女友3\t365212\n山东电力\t365213\n直弯\t365214\noffcie\t365215\n方水\t365216\nreachable\t365217\n而今夏\t365218\
nStri\t365219\n微血管减压术\t365220\n顺祝\t365221\n海师\t365222\n渭南师范学院\t365223\n海邦\t365224\n禁忌\t365225\n韩游\t365226\n上海新国际展览中心\t365227\n猫武士\t365228\n三道堰\t365229\n河南省委组织部\t365230\n第十三号\t365231\n/货运公司/运输公司\t365232\nMANIFEST\t365233\n货损\t365234\n2007-2016年\t365235\n乐桃航空\t365236\n多线程同步\t365237\n出血点\t365238\n遮脸\t365239\n市城市管理局\t365240\nadr\t365241\nbss段\t365242\n第35页\t365243\n冲撞\t365244\n陈大卫\t365245\n一叶随风\t365246\n补遗\t365247\n2012-2017年\t365248\nhuni\t365249\n侵触\t365250\n贵阳恒大文化旅游城\t365251\n哈尔滨师范大学\t365252\n职业培训师\t365253\n天眼珠\t365254\njenkins\t365255\ndota2天梯\t365256\nNewswire\t365257\n吉安百姓网\t365258\n成年片\t365259\nequipment\t365260\n滨江小区\t365261\noppoa3\t365262\n深圳市科学技术协会\t365263\nWIN764位\t365264\n成都人民公园\t365265\n定错\t365266\n千年之恋\t365267\nsymphony\t365268\n十九大民主生活会\t365269\nSparkNotes\t365270\n桃园路\t365271\nplenty\t365272\nCBB\t365273\n大篷\t365274\n火红年代\t365275\n可乐杯\t365276\n同方威视技术股份有限公司\t365277\n热单\t365278\n格莱美音乐奖\t365279\n老手们\t365280\n刘珊珊\t365281\n7R\t365282\n八组\t365283\n克强\t365284\n多晶\t365285\n橙皮\t365286\n斯柯达速派\t365287\n蓝菲\t365288\n并列式\t365289\n31届\t365290\n出科\t365291\n银豆\t365292\n自\t365293\nAgain\t365294\npatience\t365295\n芬兰航空公司\t365296\n兰特\t365297\nDestoon\t365298\n汇编函数\t365299\n就像\t365300\n云丁\t365301\n香梨\t365302\nCheckBox\t365303\n逃\t365304\nrational\t365305\nPrimal\t365306\n工部\t365307\n检方\t365308\nInnisfree\t365309\n海兴县\t365310\n安家杰\t365311\n拔丝蛋糕\t365312\n一大半\t365313\n接稿\t365314\n3885\t365315\nY66i\t365316\n移印\t365317\n高鹏\t365318\nthroat\t365319\n貼圖\t365320\n荣威550论坛_汽车之家论坛\t365321\norigin2017\t365322\ncheckin\t365323\n中和镇\t365324\n一样的月光\t365325\n姚麦\t365326\n为爱所困\t365327\n塔吊塔机\t365328\n笑了笑\t365329\nFLASH\t365330\n索尼RX1R\t365331\n压裂\t365332\nexplanatory\t365333\n盒装\t365334\n活液\t365335\n第几行\t365336\n翻江倒海\t365337\n钥匙圈\t365338\nMou\t365339\n糜夫人\t365340\nSerDes\t365341\n工坊\t365342\n上海天华建筑设计有限公司\t365343\n丽水市人民政府\t365344\n金贵肾气丸\t365345\n98块\t365346\n超级玛丽3\t365347\n北京市卫计委\t365348\n养生操\t365349\npigment\t365350\n银片\t365351\n2017新年\t365352\n卡西莫多\t365353\n3666\t365354\n优6\t365355\n收送\t3653
56\n600221\t365357\nChloride\t365358\n96小时\t365359\n华容\t365360\n实收资本印花税\t365361\n漫威网\t365362\n平直度\t365363\n王春光\t365364\nSP5\t365365\n雅克\t365366\n班超\t365367\n鬼笔\t365368\n姨\t365369\nweb.mit.edu/rbhatt/www/seema/1937\t365370\n电感量\t365371\nlikely\t365372\n有神\t365373\nZNDS智能电视网\t365374\n御寺\t365375\nlgg3\t365376\n苍岩山\t365377\n爱果果\t365378\n邪恶少女漫画无翼鸟\t365379\n用意\t365380\n88106\t365381\n影视公司\t365382\no型血\t365383\nfeatures\t365384\nVie\t365385\n技术指引\t365386\n坐便\t365387\nFTPS\t365388\ncryptonight\t365389\n中农网\t365390\n访港\t365391\n周铁\t365392\n卅\t365393\n中国茶叶博物馆\t365394\n柒个\t365395\n19歳\t365396\n小葵\t365397\n斯巴达勇士赛\t365398\nqantas\t365399\n维生素c\t365400\n落地箱\t365401\n吾爱\t365402\n初相位\t365403\n英雄汉\t365404\nSide\t365405\n53912837\t365406\n喷嚏\t365407\n发紫\t365408\nalcantara\t365409\n爱快路由\t365410\n赣州三中\t365411\n成都火车北站\t365412\n白剑\t365413\nqs认证\t365414\n┊\t365415\n适应期\t365416\n岌岌可危\t365417\n自由光\t365418\nodis\t365419\n卡奇社\t365420\n幽门\t365421\n吴丹\t365422\n中情\t365423\nDTMB\t365424\nGODV\t365425\n香港养和医院\t365426\n绝学\t365427\nwow6.0\t365428\n速派\t365429\n花期\t365430\n烂人\t365431\n退养\t365432\n最后一块\t365433\nuniscom\t365434\n郑林\t365435\nhub\t365436\nyanji\t365437\n愚乐\t365438\nanli\t365439\nnginx.conf\t365440\n郑陆\t365441\n蓝信\t365442\n3082元\t365443\n民族独立\t365444\n福州火车站\t365445\n爱装\t365446\n安徽省律师协会\t365447\n华图教育广东分校\t365448\n东兴市\t365449\n踩雷\t365450\n包公赔情\t365451\n滤清器\t365452\n福丽特\t365453\n留有余地\t365454\n西沙海战\t365455\n分配律\t365456\n宅猪\t365457\n叶舟\t365458\n菜店\t365459\n哈勃\t365460\n五金\t365461\nzine\t365462\n一代佳人\t365463\n62分\t365464\n功夫2\t365465\nportugal\t365466\n想说\t365467\n营口经济技术开发区\t365468\n绿箭\t365469\n特区\t365470\n宝马MINI\t365471\nWinNTSetup\t365472\n4页\t365473\n3.6米\t365474\n诗家\t365475\n退部\t365476\n学术界\t365477\n钟易轩\t365478\n词法分析器\t365479\nRelocation\t365480\n蒙特卡洛法\t365481\n改非\t365482\n外挂版\t365483\n殡葬师\t365484\n晃脑\t365485\n毒霸\t365486\n仙武帝尊\t365487\n营养粥\t365488\n绝望的主妇\t365489\ncampione\t365490\n终极篇\t365491\n骨干网\t365492\n活度\t365493\n1.0.1365.1\t365494\n桑拿网\t365495\n权相佑\t365496\n王玉林\t365497\nW3Cways.com\t365
498\n全法\t365499\n创业路上\t365500\n常益\t365501\n20180216\t365502\n怡园小区\t365503\n18届\t365504\nFlask-Login\t365505\n恒大翡翠华庭\t365506\n中华人民共和国反恐怖主义法\t365507\n被劫持\t365508\nYUYU\t365509\nIndex\t365510\n中铁十五局一公司\t365511\n肖特基二极管\t365512\n20141010\t365513\n大钩\t365514\n于今\t365515\n王岐山\t365516\nvq\t365517\n1224\t365518\n插翅难逃\t365519\n济南市车管所\t365520\n男服务员\t365521\nszu\t365522\n上车后\t365523\nま\t365524\n航天系\t365525\nxxxooo\t365526\ndefy\t365527\nPOISON\t365528\n关中平原\t365529\n皮皮狗\t365530\n运港\t365531\nRevit2016\t365532\n代办人\t365533\n移为\t365534\n70多岁\t365535\nMarching\t365536\n物語\t365537\n中铁建工集团\t365538\n鸿合科技\t365539\n量化宽松\t365540\n松龙\t365541\n真的好想\t365542\n低能耗\t365543\n宝华寺\t365544\n4208\t365545\nToolTip\t365546\n小悠\t365547\njabref\t365548\nquip\t365549\n淘师湾\t365550\n风俗习惯\t365551\n大灌篮\t365552\nNova2s\t365553\nribbon\t365554\n字段\t365555\n洞头网\t365556\n血红蛋白\t365557\n忙内\t365558\nNightM\t365559\nNU\t365560\n大麻花\t365561\n玻璃吊桥\t365562\n孰\t365563\n瑶里古镇\t365564\n六线谱弹唱谱\t365565\nwilson\t365566\n72路\t365567\n达州政府网\t365568\n颜狗\t365569\n4615\t365570\n54期\t365571\nVector\t365572\n瞬\t365573\n3%\t365574\n盈世\t365575\n800兆\t365576\n拦网\t365577\n一起火\t365578\n小英雄\t365579\n重庆轨道交通6号线\t365580\n3204\t365581\nエア\t365582\n浙江省博物馆\t365583\n常见问题\t365584\nLr\t365585\nACGN\t365586\n首选项\t365587\n湖南城建职业技术学院\t365588\n正味\t365589\nWOW1.12版\t365590\n学贯\t365591\nT1\t365592\n夏提雅·布拉德弗伦\t365593\n党的十九届三中全会\t365594\n累\t365595\n561\t365596\n城北路\t365597\n生存类\t365598\n透析\t365599\n石村\t365600\n淮北市相山区人民政府\t365601\n来缘\t365602\n示范段\t365603\n卤料\t365604\nMOE\t365605\n羧酸\t365606\n黄书\t365607\nwq3435\t365608\n完全平方公式\t365609\n花丛\t365610\n快玩游戏盒\t365611\n邓丽欣\t365612\n易玄算命网\t365613\n一路向西\t365614\n润华\t365615\n再也\t365616\nantivirus\t365617\nhansen\t365618\n2821\t365619\n有希\t365620\n政保\t365621\n食药所\t365622\n早泄\t365623\nav天堂网2016影音先锋-影音先锋资源男人站av撸\t365624\npano2\t365625\nREPUBLIC\t365626\nIsrael\t365627\n解扣\t365628\n海底电缆\t365629\n区队\t365630\nlq-730k\t365631\n第68号\t365632\n中闽\t365633\n2017年6月25日\t365634\n势不可挡\t365635\n英格兰\t365636\n下巻\t365637\nPalette\t365
638\n旧共和国\t365639\n乱改\t365640\nabap\t365641\n弟控\t365642\n凯米\t365643\n天加\t365644\n离别\t365645\n保利西山林语\t365646\nbop\t365647\n警风\t365648\n棵\t365649\nIFBB\t365650\n小痣\t365651\nbatteries\t365652\n河南省环保厅_河南省环境保护厅\t365653\n封面图\t365654\n不中\t365655\n长安保卫战\t365656\n火版\t365657\n尼\t365658\n北京市发改委\t365659\n中类\t365660\n常州房产网\t365661\n北京市肛肠医院\t365662\n涞水\t365663\n智邦国际\t365664\n京都旅游问答\t365665\nSkinny\t365666\n菜包子\t365667\n昌图县\t365668\nicntv\t365669\n4910\t365670\n企盼\t365671\n灰钙粉\t365672\n海尔燃气热水器\t365673\n米奥兰特\t365674\n女白\t365675\n寒毛\t365676\n灰网\t365677\n鹏欣资源\t365678\n润达\t365679\nfsevents\t365680\n四川南充人大\t365681\n服丧\t365682\n放浪\t365683\n天一论坛\t365684\n骁龙630\t365685\n刘丽华\t365686\nliujun\t365687\n分场\t365688\n生不逢时\t365689\nInstallers\t365690\n写字\t365691\n清河村\t365692\n裸心\t365693\n国家建设部\t365694\n中保信\t365695\n电力系统专业\t365696\n赠\t365697\n初村\t365698\n5433\t365699\n南林\t365700\n陈德馨\t365701\nCorwien\t365702\n吉他学习\t365703\n梦一场\t365704\n40处\t365705\n内装\t365706\n18183TV\t365707\n2017-12-19\t365708\n金舆\t365709\nWOW7.0\t365710\n鲜贝\t365711\n网罗\t365712\n小米MAX\t365713\nQByteArray\t365714\n飙速宅男\t365715\n插枝\t365716\n拉菲尼亚\t365717\n57mm\t365718\n梁板柱\t365719\n公共管理学\t365720\n张志毅\t365721\n买单吧\t365722\n土地出让金\t365723\n程家阳\t365724\nHawaiian\t365725\n螺旋猫\t365726\n惠泽园\t365727\n上清寺\t365728\n乗位\t365729\n过渡段\t365730\n隔离层\t365731\n颜诗筠\t365732\n神奇动物在哪里\t365733\nMil\t365734\n140马力\t365735\n下影线\t365736\n10世纪\t365737\n淘宝搜索\t365738\n勃勃\t365739\n伊斯法罕\t365740\n世界医疗器械网\t365741\n脂肪肝\t365742\n米尔科\t365743\n冲向\t365744\n二抗\t365745\n老弟\t365746\nartifacts\t365747\n咳嗽痰多\t365748\n折叠梯\t365749\n170317\t365750\n阴阳师业\t365751\n丹北镇\t365752\n太平川\t365753\n一磅\t365754\n文清\t365755\n函证\t365756\n通用件\t365757\nSpartan\t365758\n祖祖\t365759\n花荫露\t365760\nmiji\t365761\n面签\t365762\nwin10磁盘分区\t365763\n出口保险\t365764\n青色\t365765\n季鹍\t365766\n酰基化\t365767\n崇洋媚外\t365768\n2648\t365769\n20170106\t365770\n40瓦\t365771\n国家广播电影电视总局\t365772\n扶郎花\t365773\n汉绣\t365774\n武钢三中\t365775\noracle数据泵\t365776\n笔头\t365777\n瓶花\t365778\nvos3000\t365779\nBal\t365780\n603号\t365781\n调速阀\
t365782\n辈出\t365783\n载荷\t365784\n吴冠中\t365785\n狗肉汤\t365786\n尾纤\t365787\n承重梁\t365788\n触不可\t365789\nPartitions\t365790\n北京朝阳国贸\t365791\n李志和\t365792\nEdwin\t365793\n老龙口\t365794\n齿形\t365795\nbeneficiary\t365796\n哇咔咔\t365797\n女情\t365798\n流化床\t365799\n衡阳站\t365800\n小森唯\t365801\n环氧固化剂\t365802\n越野千里\t365803\nng-model\t365804\nKELLY\t365805\n竹宴\t365806\n鹤岗\t365807\nLEED\t365808\n十香\t365809\n果珍\t365810\njxl\t365811\nqq影音\t365812\n知音网\t365813\n狂欢派对\t365814\nWASD\t365815\n自求\t365816\nxmlns\t365817\n影儿\t365818\nRonald\t365819\n留声机\t365820\nVCS变声器\t365821\ncvm\t365822\n河马\t365823\n辊\t365824\nZero\t365825\n十面\t365826\n攻城长生诀\t365827\n飞哥战队\t365828\n作茧\t365829\nlingling\t365830\n旧有\t365831\n北广场\t365832\n邯钢\t365833\n圩\t365834\n10月22日\t365835\n驱魔\t365836\n傻笑\t365837\n10万\t365838\n贵德县人民政府\t365839\n长沙理工大学\t365840\n成都商报|成都商\t365841\n正方形\t365842\n翡翠华庭\t365843\n国泰\t365844\n资金池\t365845\n造价\t365846\n饥饿感\t365847\n少羽\t365848\ntext-transform\t365849\nMelanie\t365850\nPUTTY\t365851\n配戴\t365852\n勾子\t365853\ninpath\t365854\n浇水\t365855\n输送带\t365856\n李永军\t365857\n热疹\t365858\n济南公安\t365859\n一筒\t365860\n野山坡\t365861\n王钊\t365862\n肇庆市住房和城乡建设局\t365863\n热泵\t365864\n循环型\t365865\n郝雪冰\t365866\n在职研究生教育网\t365867\n速凝剂\t365868\nOverhaul\t365869\n狄莺\t365870\n幼果\t365871\nE10\t365872\n迅雷快传\t365873\nautocad2014\t365874\n手游版\t365875\nExport\t365876\nimmunity\t365877\nweixiao\t365878\n文本框组合\t365879\nopendialog\t365880\n江宁万达广场\t365881\n烤灯\t365882\n600926\t365883\n20001\t365884\n亿滋\t365885\n湖南省机场管理集团有限公司\t365886\n工程咨询师\t365887\n桃濑友梨奈\t365888\n武汉市房管局\t365889\n金井镇\t365890\n红石榴\t365891\n521\t365892\n十倍股\t365893\nQQ在线\t365894\n乔依琳\t365895\n金泽\t365896\n热过载继电器\t365897\n大惊小怪\t365898\n议迷\t365899\n发短\t365900\n人模\t365901\nElectric\t365902\n一投\t365903\n罗技g402\t365904\n被逼\t365905\n75年\t365906\n为上\t365907\n湖\t365908\n遛狗\t365909\n入人\t365910\n奥拉星\t365911\n援朝\t365912\n品圆\t365913\n明年\t365914\n任人宰割\t365915\nGUIDE\t365916\n骁途\t365917\n走向成功\t365918\nScrollView\t365919\n香白\t365920\ndelf\t365921\nJudas\t365922\nuget\t365923\n湖南青马在线\t365924\n浙江民盟\
t365925\n表柔比星\t365926\n陈岑\t365927\n海参\t365928\n滴血\t365929\ncrash\t365930\n蒋伟\t365931\n非牛顿流体\t365932\n海典\t365933\n绝缘子串\t365934\n中铁五局\t365935\nhabit\t365936\n中国共产党珠海市委员会\t365937\n8米\t365938\n412号\t365939\n无限嫡女重生\t365940\n低压槽:欲望之城\t365941\nbuyi\t365942\nChopin\t365943\nxin10\t365944\n棠下镇\t365945\n品善网\t365946\n庭外\t365947\naristotll\t365948\n武汉市教育局\t365949\nunit5\t365950\niphone6sp\t365951\n沈鼓集团\t365952\n20秒内\t365953\n婢女\t365954\n贵州理工学院\t365955\nctrl+s\t365956\n行车记录仪网\t365957\n眼珠子\t365958\n浙江省法院\t365959\n如金\t365960\n70915908041612\t365961\n更糟\t365962\n万江街道\t365963\n2017年9月24日\t365964\n安琪莉\t365965\n磁矩\t365966\n空乏\t365967\n海上游\t365968\nhxt\t365969\n清明烈士陵园\t365970\ndum\t365971\n军体\t365972\n汇丰环球\t365973\nvc6\t365974\n解放军医院\t365975\nmigrant\t365976\n南丰政府网\t365977\n江门市教育局\t365978\n603711\t365979\n有用性\t365980\n郵政編碼\t365981\n动平衡_\t365982\n百度驱动程序\t365983\nYY4410高清影院\t365984\n密度影\t365985\n选上\t365986\n3299\t365987\nwater\t365988\n淡妆浓抹总相宜\t365989\n血调\t365990\n逐\t365991\n荣耀55开黑节\t365992\n等腰直角三角形\t365993\n炼妖石\t365994\n企业管理\t365995\n功夫\t365996\nftm\t365997\n塑料框\t365998\n再玩\t365999\nщ\t366000\n3034\t366001\ni8\t366002\nHackaday\t366003\n附属\t366004\n5.1.2\t366005\ngl\t366006\n胡卫东\t366007\n新兴镇\t366008\n张中华\t366009\n变压器\t366010\n500k\t366011\nworknc\t366012\n内田彩\t366013\n通信_比特网\t366014\n张晔\t366015\n难住\t366016\n燃气热水器\t366017\n田贝\t366018\nIKEv2\t366019\njiali\t366020\n适老化\t366021\n飘起来\t366022\n作价\t366023\n7.1.5\t366024\n编注\t366025\nJUC\t366026\n梦幻西游力\t366027\nEasyPR\t366028\n内标法\t366029\n奇志\t366030\n中国化工网\t366031\ngeos\t366032\n1133\t366033\n周慧敏\t366034\n帝国时代2hd\t366035\n冯柯\t366036\n无锡医院\t366037\njhcelue\t366038\n0919\t366039\n皮松\t366040\n元吾氏\t366041\n边远\t366042\n虚空\t366043\nPip\t366044\n二福\t366045\nHabit\t366046\n甘麦大枣汤\t366047\npin码\t366048\nS01E01\t366049\n玄武岩\t366050\ngfc\t366051\n4321\t366052\nwales\t366053\n套种\t366054\n断恨\t366055\n茶\t366056\nLance\t366057\n特斯\t366058\n降糖\t366059\nBD中国人\t366060\nFenix5\t366061\n七言\t366062\nAlpine\t366063\nNode\t366064\n母口\t366065\n葉月\t366066\n电厂\t366067\
n1kejian.com\t366068\n1.6m\t366069\n二栋\t366070\n金土剧\t366071\n淅\t366072\n碟式\t366073\n雨馨\t366074\n柯杰\t366075\n马剑越\t366076\n海南公司\t366077\n秋林集团\t366078\nJavaFx\t366079\n惊慌\t366080\n古典舞\t366081\n嘉年华2\t366082\n11孔\t366083\n86%\t366084\nJX\t366085\n保运\t366086\n深圳市捷顺科技实业股份有限公司\t366087\nMimics\t366088\n别离开\t366089\n赵宇\t366090\n长安睿行\t366091\n55df.com\t366092\n俗人情\t366093\n五单\t366094\n桃李醉春风个人制谱园地\t366095\n盾构\t366096\nmatcher\t366097\nCaviar\t366098\nRecruiting\t366099\n一机\t366100\n苏马荡\t366101\n景程\t366102\n我和僵尸有个约会\t366103\n睑黄疣\t366104\n滨州职业学院\t366105\n2间\t366106\n曲水流觞\t366107\n软米\t366108\n10月31日\t366109\n五光十色\t366110\n锵锵三人行\t366111\n弩机\t366112\n人教版部\t366113\n茶米\t366114\n保育费\t366115\n蓝丝绒\t366116\n御女郎\t366117\n每分钟\t366118\nCCTV5+\t366119\n采光\t366120\n盒\t366121\n劳务者\t366122\n交大附小\t366123\n断腕\t366124\n球流\t366125\n战鼓擂\t366126\n解读\t366127\n爱康\t366128\n翻硕\t366129\n6500万\t366130\n机械工业出版社\t366131\n铸魂\t366132\nexamination\t366133\n西南交通大学出版社\t366134\n赶工\t366135\n兰尼斯特\t366136\n大桥北路\t366137\n城内\t366138\n高仿包\t366139\nreplacing\t366140\n深松\t366141\nnegotiating\t366142\n山西省人民政府办公厅\t366143\n山梨\t366144\nccar\t366145\nXIUREN\t366146\nBL漫画\t366147\n轰轰烈烈\t366148\n过河\t366149\n煌旗\t366150\n2sin\t366151\n跃迁\t366152\n索罗宫本武藏\t366153\n奇剑\t366154\n神剧\t366155\n逍遥安卓模拟器\t366156\n劲浪\t366157\n斯巴克\t366158\n美团外卖骑手\t366159\n腕力\t366160\n倒打\t366161\n井头\t366162\n远门\t366163\nń\t366164\n女儿情\t366165\n经管系\t366166\n未来七天\t366167\n贴缝\t366168\n蝙蝠侠:阿卡姆之城\t366169\n今年5月1日\t366170\n正黄旗\t366171\nWolframAlpha\t366172\n散度\t366173\n石山镇\t366174\n南京公交\t366175\n杨易德\t366176\n隐形床\t366177\n好怪\t366178\n斋堂\t366179\n五年级下册语文练习册\t366180\nBarista\t366181\n葫芦丝独奏\t366182\n兵马\t366183\n车群\t366184\n五分米\t366185\n爆裂\t366186\n杀羊\t366187\n中国科学院水生生物研究所\t366188\n色色色\t366189\n鸿洋\t366190\nRman\t366191\n节事\t366192\nSAL\t366193\n长荣航空\t366194\n50方\t366195\n转性\t366196\nAppChina应用汇\t366197\n永福路\t366198\n守门人\t366199\n航天学院\t366200\n颠茄片\t366201\nanything\t366202\n美胸\t366203\n卷入\t366204\n云保\t366205\n5.1.6\t366206\n17年9月\t366207\n考博网\t366208\n广东省委组织部\t366209\nHappen
\t366210\n刘在石\t366211\narranger\t366212\n移民潮\t366213\n幻界\t366214\n少件\t366215\n市残联\t366216\n陈建文\t366217\nSansui\t366218\n铊中毒\t366219\n从一而终\t366220\n鄂托克前旗人民政府\t366221\n龙血树\t366222\n勒让\t366223\n中国轻工业出版社\t366224\nGraphicsMagick\t366225\n暗黑破坏神2:毁灭之王\t366226\n新疆区\t366227\n魔哒\t366228\n角质\t366229\n比亚迪汽车\t366230\n乐道\t366231\n爆出\t366232\n名道\t366233\n美绪\t366234\n诸如\t366235\n用对\t366236\n小宅\t366237\n终结者2审判日PC\t366238\n华海亲子鉴定中心\t366239\n父子年\t366240\n言灵\t366241\n申领\t366242\n氯沙坦钾\t366243\n吕贝克\t366244\n4磅\t366245\n我要你\t366246\n心脉\t366247\n痒痒虫\t366248\n金乐\t366249\n新浪爱问\t366250\n羧甲基纤维素钠\t366251\n聊污\t366252\nethers\t366253\n醉花阴\t366254\n祷告词\t366255\nEcshop\t366256\n陈新军\t366257\n广式早茶\t366258\n大连城乡建设委员会\t366259\n砂轮机\t366260\n全身\t366261\nhandover\t366262\n杜佳\t366263\n四家\t366264\n福建省肿瘤医院\t366265\ndatasheet-工程师\t366266\n不能够\t366267\n51CTO.COM\t366268\n珠心算\t366269\n吸尘\t366270\n马汉\t366271\nr428\t366272\n季枭寒\t366273\n卡普里岛\t366274\n手滑\t366275\ndsl\t366276\n军文\t366277\n幼芽\t366278\ncorset\t366279\n0.999\t366280\n斑马街\t366281\ndic\t366282\n长江大学文理学院\t366283\n大劫案\t366284\n0851\t366285\n经济赔偿\t366286\n太子湾\t366287\n笔耕\t366288\n琉璃神社\t366289\n长篇\t366290\n药谷\t366291\n投档线\t366292\n华南理工大学图书馆\t366293\n放大器\t366294\n博道\t366295\n山西省地方税务局\t366296\np1007\t366297\n维也纳新年音乐会\t366298\n红色警戒2兵临城下\t366299\n丘北县\t366300\n沈长清\t366301\n广东省农业科学院\t366302\n万神纪\t366303\n动物王国开大会\t366304\n琵\t366305\n福建电力\t366306\nlifetime\t366307\ndetermining\t366308\n沭阳县政府\t366309\n挺进者\t366310\n美容吧\t366311\nbta\t366312\n博问\t366313\n独秀\t366314\ndnf冻结师吧\t366315\n1138\t366316\n茶青\t366317\n汤圆儿\t366318\n李萌\t366319\n南京军区\t366320\ncollecting\t366321\n启迪桑德\t366322\n话卡\t366323\n聚酯树脂\t366324\nkvm\t366325\n150马力\t366326\n19319汽车仪表网\t366327\n汉中\t366328\n摇啊摇\t366329\n岩棉保温板\t366330\n南通高新区\t366331\n心宠\t366332\n黄缘龟\t366333\n岛纹\t366334\n国家秘密定密管理暂行规定\t366335\n升旗台\t366336\n写花\t366337\n降魔杵\t366338\nLinkinPark\t366339\niis7\t366340\nPaulking\t366341\nconvention\t366342\n骨质疏松症\t366343\n辅政\t366344\n凤月\t366345\n一级建造师考试\t366346\n合演\t366347\n云计算\t366348\n游戏风云\t366349\n亚历山
大麦昆\t366350\n665\t366351\n旅游度假区\t366352\n正华\t366353\n蜜桃成熟时1997\t366354\n有限合伙公司\t366355\n雄辩\t366356\n南航空客\t366357\n电玩展\t366358\n李华梅\t366359\n南京仙林大学城\t366360\n林俊逸\t366361\n长大成人\t366362\n中国移动苏州研发中心\t366363\n特拉斯\t366364\n科幻小说网\t366365\n上半夜\t366366\n北京化工大学研究生院\t366367\n兜子\t366368\n红外线测温仪\t366369\n晕妆\t366370\n沈括\t366371\n我该\t366372\n父慈子孝\t366373\n良渚街道\t366374\n登别\t366375\n照明弹\t366376\n柔顺\t366377\ncleaned\t366378\n虫语者\t366379\n海伦湾\t366380\n讲习所\t366381\npspice\t366382\n人力资源社会保障局\t366383\n赵婧\t366384\n萤火虫\t366385\npanzer\t366386\n陈撄宁\t366387\n配电线路\t366388\n咪咪色\t366389\n酒仙桥北路\t366390\n修练\t366391\n天才\t366392\n第60号\t366393\n光行\t366394\n观音像\t366395\n气势\t366396\ngothic\t366397\n说明会\t366398\n风情\t366399\n鼻道\t366400\n偷闲\t366401\nps4pro\t366402\n1813\t366403\n殊胜\t366404\n12306铁路客户服务中心\t366405\nBP\t366406\nLog4j2\t366407\n英雄群侠传2\t366408\n同恩\t366409\n蓝天上\t366410\n低碳经济\t366411\nSissi\t366412\n中国核工业华兴建设有限公司\t366413\nYuanWing\t366414\n王瑞儿\t366415\nBamboo\t366416\nag捕鱼网\t366417\n蒸排骨\t366418\nJPG格式\t366419\n江城区政府\t366420\n兴光\t366421\n2001-2020年\t366422\n全国大学生数学建模竞赛\t366423\nqua\t366424\n向阳街\t366425\n周家\t366426\n置顶帖\t366427\n1420\t366428\nDebia\t366429\n朱宏嘉\t366430\n习惯性\t366431\n决战平安京\t366432\n金莱特\t366433\nLocation\t366434\n铸剑\t366435\nmcb\t366436\nAUV\t366437\n陶土\t366438\n海岛大亨4\t366439\nBatman\t366440\n华米科技\t366441\n北京车管所\t366442\n高速公路服务区\t366443\n孤山镇\t366444\n顾菁菁\t366445\n德兴\t366446\n2013年末\t366447\n王聪\t366448\n82平米\t366449\nlettuce\t366450\ncas\t366451\n退休费\t366452\n紫蝶\t366453\nWWF\t366454\n刘跃\t366455\n张灵玉\t366456\n吉成\t366457\n打篮球\t366458\nbl甜文\t366459\n描\t366460\n企讯网\t366461\n诺力股份\t366462\n止血带\t366463\n西秀\t366464\n磨墨\t366465\nf566\t366466\n吉他谱_找歌谱网\t366467\n泵\t366468\n行楷\t366469\neagle\t366470\n索尼ILCE-6000\t366471\n9月3号\t366472\n适配\t366473\nbower\t366474\n轻飘\t366475\n评估表\t366476\n闸片\t366477\n夜鹰\t366478\n清洁日\t366479\n6s标语网\t366480\n野穹\t366481\n谷草\t366482\n鸡压枪\t366483\n阿里河\t366484\nthinksns\t366485\n20151105\t366486\nPatrick\t366487\n四仪\t366488\n末世\t366489\nrdd\t366490\n偶想\t366491\nQPCR\t36
6492\n6S\t366493\n孕事\t366494\n兆欧表\t366495\n百分九\t366496\n不破不立\t366497\n宁夏工商职业技术学院\t366498\n鲨滩\t366499\n华控赛格\t366500\n维生素D滴剂\t366501\n溯及\t366502\n网易味央\t366503\nLyingMan\t366504\nnote6\t366505\nSwagger2\t366506\n新维\t366507\n于欣\t366508\n看不厌\t366509\nSpark-about\t366510\n充电板\t366511\n续聘\t366512\nTimeLine\t366513\n冠单\t366514\nRaySource\t366515\n吨包\t366516\n祭天\t366517\n華僑時報\t366518\n中国基金会\t366519\n数学文\t366520\n课程化\t366521\n子午七星剑\t366522\nImportant\t366523\n爱菊\t366524\n赌业\t366525\n天南地\t366526\n滋源\t366527\n吹一吹\t366528\n不收费\t366529\n爱神巧克力进行时\t366530\n李元膺\t366531\n铃声版\t366532\n2509\t366533\nDECIMAL\t366534\n王中华\t366535\n中华人民共和国会计法\t366536\n国金中融\t366537\n海量数据\t366538\n访者\t366539\n321.net\t366540\n巴米扬\t366541\n叶斌\t366542\n湖南省司法厅\t366543\n安徽省卫计委\t366544\n张桂英\t366545\n北京市人民政府办公厅\t366546\n.so\t366547\n宇宙猎魔兽世界\t366548\n吃大亏\t366549\nhuad\t366550\n十点\t366551\n茶卡盐湖\t366552\n阿哥\t366553\n菲尔盖纳\t366554\n九十分\t366555\n开源版\t366556\n旺旺\t366557\n今月\t366558\n压缩卷\t366559\n虎皮鹦鹉\t366560\n婚博会\t366561\n临桂新区\t366562\n冷源\t366563\n23.8寸\t366564\ndnf.52pk.com\t366565\n生物课\t366566\nEmergency\t366567\nrelics\t366568\n搜藏\t366569\n病理性黄疸\t366570\n刘培\t366571\nEaseUS\t366572\nstevie\t366573\nendpoint\t366574\n1956年\t366575\n中国市县\t366576\nhopkins\t366577\ng120\t366578\n新市墟\t366579\nlgg5\t366580\n炼金术师\t366581\n讲评\t366582\nTUBES\t366583\n雷曼\t366584\nLittelfuse\t366585\n呆呆\t366586\n朱云\t366587\n27亿\t366588\n中国工程咨询协会\t366589\n卒子\t366590\n5.2寸\t366591\n声纹识别\t366592\n蜻蜓fm\t366593\n8098\t366594\nCode之常备快捷键\t366595\n东京艺术大学\t366596\nKOBAYASHI\t366597\n好莱客\t366598\nappID\t366599\n12029\t366600\n智慧记\t366601\nge62\t366602\nHomepage\t366603\n曼德\t366604\n台基股份\t366605\n诺帝菲尔\t366606\n甘肃广播电视大学\t366607\n沃支付\t366608\n1E\t366609\n踏春\t366610\nMocking\t366611\n盎\t366612\n三美\t366613\n砾岩\t366614\n闹海\t366615\n第38集\t366616\n磁致伸缩液位计\t366617\nritz\t366618\n罗技c920\t366619\n张钧蜜\t366620\n80%\t366621\nwin10wifi\t366622\n红莲\t366623\n工作单\t366624\n文安县\t366625\ngeem2\t366626\n美欧\t366627\ngow\t366628\n周荣\t366629\n漏磁\t366630\n其貌不扬\t366631\n小狼解说\t3666
32\n默秒\t366633\nEDGM\t366634\n样儿\t366635\n芥蓝\t366636\n数控仿真软件\t366637\nVarian\t366638\n海淘论坛|55海淘网\t366639\nT430s\t366640\n天安健康源\t366641\n32000元\t366642\n盛唐妖异志_盛唐妖异志\t366643\nliable\t366644\n洛阳站\t366645\n内蒙古党校\t366646\n52mac\t366647\n大仲村\t366648\n护理院\t366649\n徐文峰\t366650\nTiki\t366651\n之前\t366652\n浙江省委党校\t366653\n哥谭镇\t366654\n目前\t366655\n妄自菲薄\t366656\nmc\t366657\n守望先锋游\t366658\n轻量级数据库\t366659\n招教考试\t366660\n红缨幼儿园\t366661\ncommun\t366662\n15匹\t366663\n10p\t366664\n圣名世贸城\t366665\n招引\t366666\n24小时末\t366667\nvow\t366668\n8班\t366669\n西安财经学院\t366670\n胡椒碱\t366671\n南粤清风网\t366672\n神灯\t366673\n暖阳\t366674\n奥林匹克森林公园\t366675\n天津师范大学津沽学院\t366676\n新疆维吾尔自治区财政厅\t366677\n计算机科学与技术系\t366678\n特殊教育网\t366679\n开锣\t366680\n埃微蛋卷\t366681\n一卫\t366682\n婵\t366683\n黄龙镇\t366684\n劳斯莱斯古斯特\t366685\n20150829\t366686\n毛丰美\t366687\n防爆接线箱\t366688\nRobotframework\t366689\nlenght\t366690\n嵌入式软件工程师\t366691\n小晓\t366692\n新劲炫\t366693\njzy\t366694\nsait\t366695\n首尔酒店\t366696\n冲账\t366697\ntechnologies\t366698\n大前门\t366699\n魔兽大脚\t366700\n400道\t366701\n水蛋\t366702\n18座\t366703\n太极崛起\t366704\n动迁房\t366705\nxorg\t366706\nB8\t366707\n骁龙430\t366708\n镗铣床\t366709\n管理服\t366710\n5挡\t366711\n色瑞替尼\t366712\nIPAD4\t366713\n丙中洛\t366714\n世联红璞\t366715\n磁碟\t366716\n圆柱式\t366717\n沪建\t366718\n30平\t366719\n九江学院\t366720\n爱的教育\t366721\n玛姿宝\t366722\nspur\t366723\n天娜\t366724\n甲类传染病\t366725\n20151014\t366726\n镇江市人民政府\t366727\n黄四娘家花满蹊\t366728\n单位\t366729\n坐看云起\t366730\n希利苏斯\t366731\n主机篇\t366732\nDialogue\t366733\n脑血管瘤\t366734\n光动能手表\t366735\n认路\t366736\n于兰\t366737\n000939\t366738\n息烽县人民政府\t366739\nST昆机\t366740\n声下\t366741\n英剧\t366742\n滤泡性\t366743\n塑料颗粒机\t366744\n晚点\t366745\n期货基础知识\t366746\n钱眼\t366747\n新兵营\t366748\n亡国之君\t366749\nTP-LINK无线路由器\t366750\n省实验中学\t366751\n英名录\t366752\n保安亭\t366753\ncontributor\t366754\n张东武\t366755\n博尔特\t366756\n吴晓灵\t366757\n悍妃\t366758\n交货\t366759\n散列表\t366760\n54集\t366761\n韩寒\t366762\n移动pos机\t366763\n刘永红\t366764\n黄纸\t366765\n翡翠台\t366766\n16-17年\t366767\n可乘之机\t366768\n柴科夫斯基\t366769\n0.19.1\t366770\n锋行版\t366771\n白沙岛\t366
772\n19米\t366773\n六横\t366774\n60平方米\t366775\nword2016公式编辑器\t366776\n从这里开始\t366777\nAPS-C\t366778\n腥味\t366779\n全球纺织网\t366780\n速尔快递|速尔速递\t366781\n陆龟蒙\t366782\n帐房\t366783\n澳洲杉\t366784\n班门弄斧\t366785\nintptr\t366786\n314号\t366787\n贸易协定\t366788\n潜孔钻机\t366789\nFAI\t366790\n刘俏\t366791\n零下三十八度\t366792\nKloss\t366793\n陕西交通职业技术学院\t366794\nviking\t366795\n守旧\t366796\n抗用\t366797\n观堂\t366798\n好家长\t366799\n看房日记\t366800\nティア\t366801\n95成\t366802\n段子手_黑白漫话\t366803\n肥东县政府\t366804\n0634\t366805\n鄘风\t366806\n乐山新闻网\t366807\nWiseWrong\t366808\netcd3\t366809\n什么\t366810\n崩坏学园2吧\t366811\n澳门银河\t366812\n辊道\t366813\n北京经济管理职业学院\t366814\n加页\t366815\n揾\t366816\n耳目一新\t366817\n北师大\t366818\n方正小标宋\t366819\n冥月之巅\t366820\n无极软件园\t366821\n中国共产党历次全国代表大会\t366822\n双学位\t366823\nemerging\t366824\n渗漏\t366825\nJACOBS\t366826\n墙面砖\t366827\n新加坡南洋理工大学\t366828\n都是爱\t366829\n164cm\t366830\n第90届\t366831\nArms\t366832\nbrighter\t366833\n孩奴\t366834\ncpb\t366835\n57FX\t366836\n油箱\t366837\niphone6plus\t366838\n拧巴\t366839\n想入\t366840\n交战\t366841\n化学螺栓\t366842\nloadurl\t366843\nSHA1\t366844\n星恋\t366845\nmf840\t366846\npk10\t366847\n明信\t366848\n97岁\t366849\n盘口\t366850\n垃圾佬们\t366851\n降噪\t366852\n逸乐\t366853\n小册\t366854\n对版\t366855\n007期\t366856\n练习\t366857\n直属机关工作委员会\t366858\n位置性\t366859\n弱肉强食\t366860\n经济林\t366861\n平台板\t366862\n传选\t366863\n推行\t366864\n滨江壹号院\t366865\noucaijun\t366866\nmills\t366867\n严谨性\t366868\n推磨\t366869\n晨光文具\t366870\n天使\t366871\n煤火\t366872\n99个\t366873\nbrazzers\t366874\n四川航空股份有限公司\t366875\n四川自贸区\t366876\n山冈龙\t366877\n第十三卷\t366878\nLen\t366879\nChronicles\t366880\n杭马\t366881\n金鸡\t366882\n斑马电影街\t366883\nGrADS\t366884\n中国国家统计局\t366885\n秦韵\t366886\n骑马与砍杀维京\t366887\n默克尔\t366888\n解决率\t366889\n云南水务\t366890\n回溯法\t366891\n换头术\t366892\n北京平四\t366893\n旧村\t366894\n小马宝莉大电影\t366895\n平阴玫瑰\t366896\n森川杏奈\t366897\nPGOne\t366898\n白家乐\t366899\nmodulation\t366900\nMindmanager\t366901\n朱元王思聪\t366902\n富字\t366903\n本票\t366904\n张永刚\t366905\n海珠湿地\t366906\n动脉\t366907\n进口食品进出口商\t366908\nK10\t366909\nEpoll\t366910\n粉裙\t366911\ntcpdf\t
366912\n1日\t366913\n输水\t366914\n藤鹰\t366915\n品逸\t366916\nhscode\t366917\n刺客信条编年史\t366918\n静宁\t366919\n加内特\t366920\n蒙古铁骑\t366921\n朗达\t366922\nnum2str\t366923\nbeneath\t366924\n乔治娜\t366925\n霸王色\t366926\n于嘉\t366927\n玻纤土工格栅\t366928\n景德镇\t366929\n翟氏\t366930\n我的生命\t366931\n密法\t366932\n修改版\t366933\n五洲热报\t366934\nGW2\t366935\n双缩脲试剂\t366936\n可梦\t366937\n架设\t366938\n瑞巴派\t366939\n12306验证码\t366940\n高等教育心理学\t366941\n修练场\t366942\nwuhan\t366943\n欧路莎\t366944\n丝路花雨\t366945\nUser\t366946\nske\t366947\n水疗馆\t366948\nGODIVA\t366949\n齐美\t366950\n肃流毒\t366951\n郑裕玲\t366952\n颜爵\t366953\n建水\t366954\n庆应大学\t366955\n星月菩提子\t366956\n甩挂运输\t366957\n西安旅游\t366958\n噱头\t366959\n万亿美元\t366960\nJansens\t366961\n辽宁轻工职业学院\t366962\n蒸汽炉\t366963\n谌龙\t366964\n兵器谱\t366965\n藁城\t366966\n第7套\t366967\n素食\t366968\n腿型\t366969\n博康\t366970\n元代\t366971\n纹身\t366972\n露点照\t366973\n头模\t366974\nmailgun\t366975\n跑水\t366976\nrapidgator\t366977\n美瓷\t366978\n西庄村\t366979\n药量\t366980\n再战\t366981\n广宇\t366982\n孙耀威\t366983\n三峡大学\t366984\n羽毛球拍\t366985\n胖嘟嘟\t366986\n刘权\t366987\n烤花\t366988\n收运\t366989\n晒伤\t366990\n宁安市\t366991\n太阳蛋\t366992\npkg\t366993\n江南之恋\t366994\n034期\t366995\nc\t366996\n科骏达\t366997\n东乌旗\t366998\n大泽山\t366999\n猎豹移动\t367000\n相对于\t367001\ntint\t367002\n乐视超级手机\t367003\n信长之野望12革新威力加强版\t367004\n阿呆\t367005\n万剑归宗\t367006\ncoder\t367007\nSmartphones\t367008\n注册税务师\t367009\nBuckle\t367010\nr300\t367011\n200V\t367012\nSwan\t367013\nplotagraph\t367014\n后备级\t367015\n灵山风景区\t367016\n老宫\t367017\n瑶海区\t367018\n知可子\t367019\ndagger2\t367020\nCOSMOS\t367021\n图ps\t367022\n莞深高速\t367023\n&#39\t367024\n7255\t367025\nati\t367026\ninitdb\t367027\n雅姿\t367028\n047\t367029\n石渣\t367030\n触变性\t367031\n卡爪\t367032\nspwm\t367033\n宅基\t367034\n栗原\t367035\n秀山岛\t367036\n9000多\t367037\nMifare\t367038\n04月12日\t367039\n后日\t367040\n各色\t367041\n预算员\t367042\n果汁\t367043\n数据采集器\t367044\n今夜有戏\t367045\n色艺\t367046\n沉重感\t367047\n婚车队\t367048\n老年公寓\t367049\n复合机\t367050\n承兴\t367051\n姐妹们\t367052\n百年人寿\t367053\n监统\t367054\n90万元\t367055\n垂\t367056\n马安镇\t367057\n数罪并罚\t367058\n紫
妍\t367059\n失踪人口\t367060\n湿疹\t367061\n裸心堡\t367062\nWZ\t367063\n捏\t367064\ndestinations\t367065\n渔翁得利\t367066\n先行段\t367067\n艾利克斯\t367068\n小大\t367069\n巨蛇\t367070\n二两\t367071\n懶惰\t367072\n王庭\t367073\nOpenGrok\t367074\n众乐乐\t367075\n武汉保利\t367076\n酷家乐装修设计软件\t367077\n审计吧\t367078\n冗长\t367079\n宇航服\t367080\n中华人民共和国国家赔偿法\t367081\n昂克赛拉2.0\t367082\n超过\t367083\nasmcmd\t367084\n雀氏\t367085\n全胜\t367086\n小白猫\t367087\n玄觞\t367088\nlsr\t367089\n科学系\t367090\n免费周易算命网\t367091\nGB50300-2013\t367092\n吊水\t367093\n2921\t367094\nk98091518\t367095\n麝香\t367096\nai2018\t367097\n滚圆\t367098\n新星杯\t367099\n西师大版小学\t367100\n化学方\t367101\n钱宁\t367102\n南京商业学校\t367103\n知识产权专业\t367104\n明峰\t367105\n品牌折扣店\t367106\n工装裤\t367107\n睹\t367108\n加长型\t367109\n天途\t367110\ndl\t367111\nRookie\t367112\ncowboy\t367113\n幡然\t367114\nalternative\t367115\n民办园\t367116\n痴呆\t367117\n先序\t367118\nProposal\t367119\n免佣\t367120\n并存入\t367121\n天津便民网\t367122\n肖潇\t367123\n第11册\t367124\nPRA\t367125\n法兰盘\t367126\n旅客\t367127\n2018-04-26\t367128\nTestFlight\t367129\n武汉大学研究生院\t367130\n女宾\t367131\nStruts\t367132\n目张\t367133\n艺体生\t367134\n9009\t367135\n2516\t367136\n超脱力医院\t367137\n宁波镇\t367138\n药味\t367139\n二氧化锆\t367140\n一个五\t367141\n鸡蛋仔\t367142\n济南动物园\t367143\n沪价\t367144\n李文明\t367145\n轴向\t367146\n越溪镇\t367147\n信息报\t367148\n八份\t367149\n讨论组\t367150\n百华悦邦\t367151\n世纪花城\t367152\nhaku\t367153\ngdc\t367154\n亨通集团\t367155\n甲藻\t367156\n素肉\t367157\n掉落物\t367158\n缝匠\t367159\n芋头\t367160\n江州\t367161\n绍兴道\t367162\n吴品\t367163\n锦江之星连锁酒店\t367164\n建国后\t367165\n布老虎\t367166\n长号\t367167\n湖北省省级政府\t367168\nauthorised\t367169\n华山村\t367170\n天猫网\t367171\n夫妇俩\t367172\n小装\t367173\n系统分析师考试\t367174\n锦盒\t367175\n包头站\t367176\n胡迁\t367177\n那些日子\t367178\nMKL\t367179\n桑美\t367180\n测试端口号\t367181\n局地\t367182\n天环广场\t367183\nenvelope\t367184\n养殖箱\t367185\n德龄公主\t367186\n黑吊\t367187\n末世凡人歌\t367188\n下梁\t367189\n罗西塔\t367190\n九宫格\t367191\npjsua\t367192\n现实主义\t367193\n中国沙滩车网\t367194\n悬浮液\t367195\n假面骑士agito\t367196\n顺流而下\t367197\n激气连者\t367198\n突厥\t367199\n07号\t367200\n百夫长卡\t367201\n3980元\t367202\nAc
curacy\t367203\nL850\t367204\n滕刚\t367205\n销售部\t367206\n200多万\t367207\n海融易\t367208\nAristotle\t367209\n太阳能充电器\t367210\ncollegeboard\t367211\n帝国时代终极版\t367212\n孙云晓\t367213\n造人\t367214\n固态继电器\t367215\n终结者2:审判日\t367216\nA版\t367217\n三元一次方程组\t367218\n唐雅婷\t367219\n没有风\t367220\n婴贝儿\t367221\n会商宝\t367222\nElysium\t367223\n黄版\t367224\nMs\t367225\n珊瑚骨\t367226\nodoo8\t367227\n察雅县\t367228\n故事观后感\t367229\nlane\t367230\nTAN\t367231\n18037期\t367232\n同等\t367233\n1招\t367234\nDrift\t367235\n成正\t367236\n亚丁湾\t367237\n甬\t367238\n贵港市\t367239\n应用集\t367240\n职问\t367241\n在线咨询\t367242\n某校\t367243\n默许\t367244\nfol\t367245\n寒武纪年\t367246\nFitch\t367247\ntextfield\t367248\n南京市商务局\t367249\n谷田稻香\t367250\n数天\t367251\n死忠\t367252\n青州火车站\t367253\nmyboy\t367254\n蕈\t367255\nlawsuit\t367256\nScandal\t367257\n陈香\t367258\n建瓯市\t367259\n4.3.2\t367260\n洞中\t367261\n白塞氏病\t367262\n发包\t367263\n卡惠网\t367264\ndeploy\t367265\n【集团\t367266\nLibrarian\t367267\nX80\t367268\n磨牙\t367269\n华软网\t367270\n雪里蕻\t367271\n尘封档案\t367272\n2008款\t367273\n官塘\t367274\nzyl\t367275\n谨言\t367276\n婀娜多姿\t367277\n阳指\t367278\n幸福的女人\t367279\n风雨后\t367280\nMIUI6\t367281\n两优\t367282\n谭鑫培\t367283\n大益普洱茶\t367284\n蝎子王\t367285\nhyp\t367286\n欧米\t367287\n生产性服务业\t367288\n税管员\t367289\nEdition\t367290\nYon\t367291\n江城子密州\t367292\n长飞\t367293\n暴风魔镜\t367294\nxps吧_\t367295\n狗肉\t367296\nPowerbeats\t367297\n罗马鞋\t367298\n杏坛\t367299\n24G\t367300\nQemu\t367301\n硬笔\t367302\n杂交\t367303\n程源\t367304\n肖然\t367305\n试验室\t367306\n宁波第一医院\t367307\n定义函数\t367308\n眩晕\t367309\n沃尔玛卡\t367310\n恩替卡韦片\t367311\n复仇者联盟3百度云\t367312\n福建警察学院\t367313\n熊猫兔\t367314\n庄心\t367315\n神力\t367316\nvren\t367317\nRyan\t367318\ngist\t367319\n陈平\t367320\ncholine\t367321\n如生\t367322\n福州大学研究生院\t367323\n楚天阔\t367324\n父与子全集\t367325\n凯迪拉克XT\t367326\nHunt\t367327\n病房\t367328\n打垮\t367329\n国家应急救援指挥中心\t367330\nEDIT\t367331\n荣华富贵\t367332\n王小安\t367333\nReflective\t367334\n数据选择器\t367335\n新余市人民医院\t367336\n梨树\t367337\n九阴九阳\t367338\n雷锋\t367339\n中国黄金网\t367340\n中继路由器\t367341\ntraversal\t367342\n童言无忌\t367343\ncpa\t367344\n秘闻\t367
345\n广场\t367346\nQQ音乐-千万正版音乐海量无损曲\t367347\n广末凉子\t367348\nsbh70\t367349\n民俗村\t367350\nJQGrid\t367351\n跨行\t367352\noppor9tm\t367353\n64码\t367354\n盐焗鸡爪\t367355\n希思罗机场\t367356\n顺式\t367357\nGTA圣安地列斯\t367358\ncolon\t367359\n再选择\t367360\n贺岁普通纪念币\t367361\n沙翁\t367362\n缠绕管\t367363\n张檬\t367364\n辛醇\t367365\ngeneralization\t367366\n单针\t367367\n二外\t367368\n分页器\t367369\n抗衡\t367370\n俠傳\t367371\n木木夕\t367372\n如意\t367373\n儿茶\t367374\n1金\t367375\nshown\t367376\n层层恐惧\t367377\njang\t367378\n_顶渲网\t367379\n唐纸\t367380\n芒果干\t367381\n河南省教育厅\t367382\nissues\t367383\n韩申颖\t367384\n35000\t367385\noffence\t367386\n同欲者\t367387\nmediaelement\t367388\n脉冲发生器\t367389\n强剑\t367390\n不计\t367391\n兆丰路\t367392\n两秒后\t367393\nwind资讯\t367394\n永福\t367395\n佤邦\t367396\n置信区间\t367397\n萨默斯\t367398\nXIII\t367399\nAVMO\t367400\n卡嘉莉\t367401\n澧县\t367402\n红肠\t367403\n学生服\t367404\nFIPS\t367405\n周丽淇\t367406\nactress\t367407\napv\t367408\n剑侠情缘网络版\t367409\n壮歌\t367410\n耳工\t367411\n通子\t367412\n银联\t367413\n6次\t367414\noffset函数\t367415\n2682\t367416\n风尘劫\t367417\n潜在\t367418\n草纸\t367419\n巨棒\t367420\n无台\t367421\n卫河\t367422\nremy\t367423\n破断\t367424\n人命币\t367425\n别去\t367426\nprotobuf\t367427\n美商海盗船\t367428\n陈青云\t367429\n输液椅\t367430\n三人位\t367431\nutl\t367432\ny51\t367433\n秦奋\t367434\n22公里\t367435\n金麻\t367436\nhere\t367437\n圣世\t367438\n打折季\t367439\nio流\t367440\n幼升小名校\t367441\n分内\t367442\n瓷\t367443\n钓线\t367444\n全瓷牙\t367445\nGalore\t367446\n小孽\t367447\n莫召奴\t367448\n逗人\t367449\nbility\t367450\n评书三国演义\t367451\n椎名由奈\t367452\n出痧\t367453\n小猪\t367454\n肾性贫血\t367455\n时代奥城\t367456\nscreaming\t367457\n精华帖\t367458\n飞着\t367459\nFundamentals\t367460\nd800\t367461\n金磊\t367462\n报国\t367463\n抠像\t367464\ntransmate\t367465\n管综\t367466\nyzc577\t367467\n/4\t367468\naplication\t367469\n五模\t367470\n挣扎\t367471\nacp\t367472\n菲尔特\t367473\n史通\t367474\n爱德华的奇妙之旅\t367475\n浙江卫视视频\t367476\n谋而不忠乎\t367477\n张培刚\t367478\nケット\t367479\n气雾\t367480\n利口\t367481\n100hz\t367482\n踽踽独行\t367483\n定慧寺\t367484\n罗温·艾金森\t367485\nbayes\t367486\n美镇\t367487\n绞\t367488\n浙江大学医学院附属妇产科医院\t36748
9\n正荣府\t367490\n高端大气上档次\t367491\n西湖路\t367492\nuipickerview\t367493\nXperi\t367494\n三台镇\t367495\n锐化\t367496\n中交路桥建设有限公司\t367497\n大女孩\t367498\n数码相机\t367499\n虚增\t367500\n艾格尼丝\t367501\n棉纺路\t367502\n西阳镇\t367503\nnrf51822\t367504\n波斯菊\t367505\n上海师范大学\t367506\n8瓶\t367507\n肉文\t367508\n流押\t367509\n叶曼\t367510\n厦门市人力资源和社会保障局\t367511\n约翰汤普森\t367512\n975\t367513\nassistive\t367514\n花山谜窟\t367515\n长城电工\t367516\n助拍网\t367517\n南京一中\t367518\n明光市政府\t367519\n福利券\t367520\nDesktop\t367521\n逐月\t367522\nscg\t367523\n会声会影2018\t367524\n伍\t367525\n恒春\t367526\n课题库\t367527\n2场\t367528\n世界斯诺克\t367529\n畫面\t367530\n条盒\t367531\nSerena\t367532\n方舟龙\t367533\n几公里\t367534\n人山人海\t367535\n滋事\t367536\n后们\t367537\npassport\t367538\nInstitute\t367539\nmemcpy\t367540\n掌控者\t367541\n杜玉明\t367542\n春星\t367543\n本科率\t367544\n挂接\t367545\nbh\t367546\n圣母大教堂\t367547\n餐桌椅\t367548\n北京二中亦庄学校\t367549\n末世文\t367550\n1600x900\t367551\nunicode转码\t367552\n政府采购合同\t367553\n遭劫\t367554\nifrs9\t367555\n交易量\t367556\n李适\t367557\n笔刷\t367558\n房地产权\t367559\n春风又绿江南岸\t367560\n多元化\t367561\n小要\t367562\n_马鞍山\t367563\n所有值\t367564\nQwQ\t367565\n冷男·胖芙\t367566\nOnline3\t367567\n段竹心虞长君\t367568\n3533\t367569\n红焖羊肉\t367570\n高浓度\t367571\n上海市人才中心\t367572\n私人保镖\t367573\n塞恩\t367574\n沙市镇\t367575\n泰奇猫\t367576\nnh4cl\t367577\n卡比\t367578\n欧阳妮妮\t367579\n电子城\t367580\n尘肺\t367581\n广发银行股份有限公司\t367582\n挤掉\t367583\n地子\t367584\n莱维\t367585\n彬县\t367586\n剧毒\t367587\n黑龙江农垦职业学院\t367588\nM43\t367589\n浑\t367590\nInternet\t367591\n县卫计委\t367592\nclosing\t367593\n甘肃省气象局\t367594\n鼻塞\t367595\n蛮王\t367596\n增倍\t367597\n北京市旅游委\t367598\n桶装水厂\t367599\n阿里人\t367600\nHeyzo\t367601\n5128首\t367602\nentitymanager\t367603\n优于\t367604\n360手机N5\t367605\n权人\t367606\n柔滑\t367607\npay\t367608\n中侧\t367609\n商品化\t367610\n女女女\t367611\n二主\t367612\n青龙县\t367613\nIPv4\t367614\n冲击性\t367615\n爆射\t367616\n第69号\t367617\n补料\t367618\n扩充\t367619\n按质\t367620\n滤色片\t367621\n局域网管理软件\t367622\n英雄无敌2\t367623\n文言文版\t367624\n县人社局\t367625\n中国美术学院\t367626\n李仲彬\t367627\n惊天大逆转\t367628\n奔驰c180l\t367629\n格格巫\t367630\n飞哥\t36763
1\n你好意思\t367632\n旧制度与大革命\t367633\n陈洁丽\t367634\nSera\t367635\nCOBOL\t367636\n经适房\t367637\n内热式\t367638\nNotifications\t367639\n老外婆\t367640\n法信\t367641\n五室\t367642\n1907\t367643\n拿铁\t367644\nshadowverse\t367645\nkingsman\t367646\nbaseball\t367647\n技术实务\t367648\n我叫MT\t367649\n腐坏\t367650\n伙房\t367651\n10mu\t367652\n华林\t367653\n未来路\t367654\n小年\t367655\n共生体\t367656\n卧龙凤雏\t367657\nXtraBackup\t367658\n干扰物\t367659\n文运法硕\t367660\n老林\t367661\n西安赛格国际购物中心\t367662\n契约型\t367663\nAlabama\t367664\n卡尔加里大学\t367665\n希格斯\t367666\n王攀\t367667\n起花\t367668\n乙仓\t367669\nibis\t367670\n离去\t367671\n盗猎\t367672\nXI\t367673\n婺源高铁站\t367674\n星形连接\t367675\n回民街\t367676\n战斧导弹\t367677\nSOOGIF\t367678\n5880\t367679\n舍利塔\t367680\n公室\t367681\n南京地产\t367682\n滞回\t367683\n102级\t367684\n数码宝贝世界\t367685\n沪江韩语学习网\t367686\nmd101\t367687\n朝东\t367688\n思践\t367689\nQSPI\t367690\n盲源\t367691\n双簧管\t367692\n探客\t367693\n贺磊\t367694\n金猫\t367695\nWIN版\t367696\n道奇\t367697\n大三元\t367698\n多相流\t367699\n大勋\t367700\nconsol\t367701\n西湖龙井\t367702\n卖国\t367703\n牛桃\t367704\n益源\t367705\n争锋\t367706\n秀动\t367707\n硕士学位证\t367708\n德众\t367709\n显胸\t367710\n何先生\t367711\n山东大学威海校区\t367712\n王程\t367713\n八女\t367714\n不同样\t367715\n陶飞霏\t367716\n混饭\t367717\nBJ80\t367718\n久草在线\t367719\n10008\t367720\nreaction\t367721\noffice2007word\t367722\nor函数\t367723\n招聘管理培训生\t367724\n田湾\t367725\n襄阳东\t367726\n紫云山庄\t367727\n兰索拉唑\t367728\nmicrobiology\t367729\n课凡\t367730\n姚安县人民政府\t367731\n娘腔\t367732\n阿贾克斯\t367733\nWave\t367734\n1-2小时\t367735\n1369\t367736\nUrbana\t367737\n服务期\t367738\n英郡\t367739\n醋酸乙烯酯\t367740\n学生会议\t367741\n伊苏起源\t367742\nmainly\t367743\n灭火机器人\t367744\n内分泌\t367745\n剑桥小学\t367746\n企聚网\t367747\n哈斯卡\t367748\n入口处\t367749\nnv12\t367750\n伙伴\t367751\n几【\t367752\n戴旭\t367753\n333_\t367754\n我的开挂人生\t367755\nOBJECT\t367756\n道德规范\t367757\n650m\t367758\n橙汁机\t367759\n条桌\t367760\n台州机场\t367761\n工大高新\t367762\nsingtel\t367763\npropyl\t367764\n地坪漆\t367765\nSivan\t367766\n书价\t367767\n新概念英语第二册笔记\t367768\n梯形法\t367769\n冰层\t367770\n刑侦队\t367771\n义蓬\t367772\n刘村\t367773\n双清区\t367774\n桶哥
\t367775\n南陵县\t367776\n长破折号\t367777\n全日\t367778\n玖富叮当贷\t367779\n三氯化铝\t367780\n即期\t367781\n城市公共空间\t367782\n牙渍\t367783\n亚尔斯兰\t367784\n御峰\t367785\n主色\t367786\n两学一\t367787\n八校\t367788\n中国人民解放军空军\t367789\npkgconfig\t367790\n洛心辰\t367791\ntrapped\t367792\n波尔\t367793\n打撸\t367794\n传略\t367795\n魔王勇者\t367796\n中世纪2全面战争吧_\t367797\n夜视版\t367798\n镜柜\t367799\nSDCC\t367800\nReactions\t367801\n0.81\t367802\ncoconut\t367803\n高内\t367804\n南柱赫\t367805\n鹿邑县人民政府\t367806\n谌家矶\t367807\nmeasurement\t367808\n刘珍\t367809\n心爱的\t367810\n航城\t367811\n清真食品\t367812\n莫什科夫斯基\t367813\npro1\t367814\n红旗颂\t367815\n山东分公司\t367816\n沉珂\t367817\n黑色玫瑰\t367818\n100A\t367819\n150期\t367820\n经产\t367821\n拿不住\t367822\n新花样\t367823\n乐山师院\t367824\nMac\t367825\npthc\t367826\n影评\t367827\nLSF\t367828\n网赛\t367829\n80mm\t367830\n美眉\t367831\n周刊少年Jump\t367832\n瑜伽裤\t367833\n王明华\t367834\nStreaming\t367835\n淄博市中心医院\t367836\n第2段\t367837\n艾宝良\t367838\n竞思\t367839\n2018.04.02\t367840\n员证\t367841\nBSI\t367842\n清源山\t367843\n第7节\t367844\n我猜我猜我猜猜猜\t367845\n一到\t367846\narx\t367847\nv4.1.3\t367848\n中文小说网\t367849\n死士\t367850\n碳素\t367851\n2017年10月份\t367852\n主副卡\t367853\n第001章\t367854\n通灵妃\t367855\n飞过\t367856\n郫都区人民政府\t367857\n会员版\t367858\n易度\t367859\n五家渠市政府\t367860\nBeyonce\t367861\nanthem\t367862\n麦克疯\t367863\n117届\t367864\n雪诗\t367865\nhase\t367866\nWorkFlow\t367867\n北京科技大学研究生院\t367868\n老陕\t367869\n黄鼠狼\t367870\n唱戏\t367871\n通识类\t367872\n温清欢\t367873\n26斤\t367874\nboots\t367875\n领奖台\t367876\n快钱\t367877\n科学技术史\t367878\n红杉中国\t367879\n自缴\t367880\n小农夫\t367881\n喻聪\t367882\nmask\t367883\nsuperpads\t367884\n92分\t367885\n优芽\t367886\n深圳东站\t367887\n2.83\t367888\n华信证券\t367889\n天津小说网\t367890\n风速传感器\t367891\n湿式除尘器\t367892\nsmokers\t367893\n足球运动\t367894\n迁移\t367895\nlcc\t367896\n什么么\t367897\n相较\t367898\n三衢\t367899\n唛\t367900\n第72集\t367901\nE5620\t367902\n亮屏\t367903\n武书\t367904\n奶牛\t367905\n毕业论文致谢词\t367906\n薄壁件\t367907\n星空地产\t367908\n载物\t367909\n牛魔王\t367910\n齿套\t367911\nmarquee\t367912\n灯科\t367913\n陈立杰\t367914\n北京永洪商智科技有限公司\t367915\nCOO\t367916\n宝马120i\t367917\n
Calls\t367918\n测绘吧\t367919\n17万\t367920\n活击\t367921\n诈欺\t367922\n胡说\t367923\nFlows\t367924\n陈惟\t367925\n安克雷奇\t367926\n磕炮\t367927\ntimepicker\t367928\nMacao\t367929\n理性主义\t367930\n海鲜池\t367931\n梦怡\t367932\n渗透液\t367933\n长空\t367934\n举借\t367935\n不分离\t367936\nMTSP\t367937\n弓呆\t367938\n巴斯光\t367939\n一芳\t367940\n麻杏石甘汤\t367941\n经济萧条\t367942\nddpro\t367943\nIP树\t367944\nballast\t367945\n眼熟\t367946\n辅酶Q10\t367947\ndawen\t367948\n定泗路\t367949\n吴玉君\t367950\n神犬小七2\t367951\n上海北诺生物科技有限公司\t367952\n服务满意度\t367953\n2040年\t367954\nABBA\t367955\n雷公藤\t367956\nmm-dd格式\t367957\nABN\t367958\n见图\t367959\n赵小臭\t367960\n保留村\t367961\n查理二世\t367962\n湖洲\t367963\n大邱庄\t367964\nNo.13\t367965\n会计考试网\t367966\nwdf\t367967\n洁悠神\t367968\n海露\t367969\nik\t367970\n海安县\t367971\n样口\t367972\n环氧地坪\t367973\n郑文\t367974\n微星微星\t367975\n捷豹空压机\t367976\n160家\t367977\n时间银行\t367978\n标率\t367979\n党中央国务院\t367980\n这一关\t367981\n立体感\t367982\n特效药\t367983\nhea\t367984\n甜糖\t367985\n安藤忠雄\t367986\n韩雯雯\t367987\n莫比斯\t367988\nbull\t367989\nadbyby\t367990\n牡丹江路\t367991\n士师\t367992\n虚空之眼\t367993\n招标公司\t367994\n鲛肌\t367995\n保保网\t367996\n郑经\t367997\nU5\t367998\n打头机\t367999\nidentified\t368000\n东新街道\t368001\n挂板\t368002\n霉剂\t368003\n欧陆风云3\t368004\nMIX2S\t368005\nCA\t368006\nHBuilder\t368007\n王洪祥\t368008\n股轩\t368009\nargox\t368010\n疾风传\t368011\nhogan\t368012\n泰毕全\t368013\n发财鱼\t368014\n#墓王之王\t368015\n酷窝\t368016\n音乐游戏\t368017\n电子商情\t368018\n水母网\t368019\n手手\t368020\n朝阳门外大街\t368021\nXRF\t368022\n半圆头\t368023\n5r\t368024\n双胶纸\t368025\n披萨饼\t368026\nitx\t368027\n骈\t368028\n随机存储器\t368029\n行不轨\t368030\n字线\t368031\n生物梅里埃\t368032\nkobject\t368033\n葱油面\t368034\n贺州\t368035\n卖假\t368036\n银峰\t368037\ndove\t368038\n評論\t368039\nWNS\t368040\n189\t368041\n生卒\t368042\n甘肃建筑职业技术学院\t368043\n180328\t368044\n平胃散\t368045\n万法\t368046\nused\t368047\n珊瑚玉\t368048\n航天晨光\t368049\n热敏电阻器\t368050\n菊粉\t368051\n休止角\t368052\n面对现实\t368053\n20160521\t368054\n南方医科大学珠江医院\t368055\n网信\t368056\n融创滨江壹号院\t368057\ncentral\t368058\n赌圣3:无名小子\t368059\n20160910\t368060\n自由基\t368061\n急待\t368062\n跃跃欲试
\t368063\n方舟子\t368064\n曜夜\t368065\n万马科技\t368066\n斑马打印机\t368067\n大家庭\t368068\n傻鸟\t368069\n书写\t368070\n你的名字\t368071\n240fps\t368072\n天正电气\t368073\n宜昌市规划局\t368074\nFoobar2000\t368075\n擁有\t368076\n视导\t368077\n浦东新区人力资源和社会保障局\t368078\n建组\t368079\n我的天堂\t368080\nWCG\t368081\n龙门\t368082\n体脂\t368083\n七分之五\t368084\n矩阵制\t368085\n潇湘府\t368086\n安徽出入境检验检疫局\t368087\n上汽大通V80\t368088\ndnf艾肯传说\t368089\n八方小区\t368090\n蒽醌\t368091\n张副官\t368092\n郭磊\t368093\nlocust\t368094\n消弧\t368095\n三日内\t368096\n博格坎普\t368097\n免责书\t368098\ncnn\t368099\n16QAM\t368100\n智美\t368101\n行法\t368102\n发扬光大\t368103\n怡红\t368104\n科客\t368105\n受追捧\t368106\n飞船\t368107\n市住房城乡建设局\t368108\nandroid.jar\t368109\n易车\t368110\nProgress\t368111\n红蘑菇\t368112\n草馏\t368113\nδ\t368114\n中央电台\t368115\n高乐高\t368116\n浙美\t368117\novers\t368118\nGoodness\t368119\n工业界\t368120\n鬼婆\t368121\nfora\t368122\nRandomForest\t368123\n王渊\t368124\n加速赛\t368125\n快报\t368126\n女乒\t368127\n创新股份\t368128\n马克飞\t368129\n0t\t368130\n拓天\t368131\n转办\t368132\n义务段\t368133\n何琳\t368134\n张申英\t368135\n梁剑东\t368136\n20130209\t368137\nanchor110\t368138\nQQ付费群\t368139\n野崎君\t368140\n14方\t368141\n靖远\t368142\n北京环球影城\t368143\nPartnership\t368144\n90分钟\t368145\n国家应急广播网\t368146\n比武\t368147\n造梦西游\t368148\n议付\t368149\nmari\t368150\n案头卷\t368151\n抽帧\t368152\n掴\t368153\nReLU\t368154\n罗贯中\t368155\n聚宝\t368156\n九游\t368157\n514\t368158\n并入\t368159\n法定盈余公积金\t368160\n北京天文馆\t368161\ncompilation\t368162\n充值\t368163\nAcoustics\t368164\nClementine\t368165\n零度商务网\t368166\nimproving\t368167\n耕田机\t368168\n清洁片\t368169\n制造费\t368170\n2888\t368171\ncancelled\t368172\n汤汤\t368173\n13圣耀\t368174\n北控清洁能源集团\t368175\n猫叫\t368176\n日播版)\t368177\nQScrollArea\t368178\n诺基亚7\t368179\n儿女们\t368180\n常州嬉戏谷\t368181\n隋乱\t368182\n今永\t368183\n无限连带责任\t368184\n小班组\t368185\n留样\t368186\n周游券\t368187\nshiming\t368188\nrayli\t368189\n作账\t368190\n截至\t368191\n教育金\t368192\n8042\t368193\n徐濠萦\t368194\n电子伏特\t368195\n贷款额\t368196\n费险\t368197\n普鲁卡\t368198\n白嫩\t368199\npushlet\t368200\n奶类\t368201\nlibpython\t368202\navb\t368203\n未来学校\t368204\n单
瓶\t368205\n生产链\t368206\n马自达3\t368207\n梦幻西游龙\t368208\n李端\t368209\n菊园\t368210\nv3.1.2\t368211\n思莱德\t368212\n35厘米\t368213\n宝安国际机场\t368214\n北京青年旅行社\t368215\n直播网\t368216\ncodol吧\t368217\n园田\t368218\n配制\t368219\n牡丹画\t368220\n中国电信云计算公司\t368221\n自燃\t368222\n2017年2月8日\t368223\n聪明的人\t368224\nPS4/XB1/PC\t368225\n天秒地\t368226\n天津轻工职业技术学院\t368227\n班务\t368228\n传送器\t368229\n布小林\t368230\n裹挟\t368231\nF450G\t368232\n上海社区\t368233\n木化石\t368234\n精排\t368235\n自立式\t368236\n坑口村\t368237\nclaire\t368238\n螨\t368239\n冲击波\t368240\n精准营销\t368241\nOpenAPI\t368242\n荣耀畅玩5C\t368243\n李逸朗\t368244\n少年三国志\t368245\n会议厅\t368246\n特殊任务\t368247\n里奈\t368248\n词形\t368249\n浴房\t368250\n相结\t368251\n八卦神探\t368252\n情癫大圣\t368253\n173CM\t368254\n强人工智能\t368255\nceramics\t368256\n石榴籽\t368257\n书协\t368258\n4900元\t368259\n永州市中心医院\t368260\n气机构\t368261\n可乎\t368262\n东极岛\t368263\n北海街道\t368264\n卡式炉\t368265\n木原音濑\t368266\n超级英雄\t368267\n女狗\t368268\n三墩\t368269\n金一文化\t368270\n大悟县\t368271\n董强\t368272\n芸汐传\t368273\n合体字\t368274\nEau\t368275\nBoobpedia\t368276\n干死\t368277\n十四个月\t368278\n上海国际电影节\t368279\n转经\t368280\n04月10日\t368281\n秦琴\t368282\n三明鱼网\t368283\nRio\t368284\n金鹰电视艺术节\t368285\n脱氢酶\t368286\nCanonical\t368287\nCursor\t368288\niso14000\t368289\n盛一传奇世界\t368290\n天美工作室\t368291\n沈建\t368292\nRevenge\t368293\n王自健\t368294\n芹菜汁\t368295\n木伯\t368296\nsbs防水卷材\t368297\n火山直播\t368298\n刘四爷\t368299\n娜宝\t368300\n偃师\t368301\n蒸汽灭菌器\t368302\n【片\t368303\n李蔚\t368304\n中华龙都网\t368305\n天菜\t368306\n河正宇\t368307\n法条\t368308\n写真照\t368309\n林伽\t368310\n750TI\t368311\n一年到头\t368312\n压缩板\t368313\n渐暖\t368314\n1700万\t368315\n热镀锌螺栓\t368316\n74ls00\t368317\n18步\t368318\n博腾股份\t368319\n三到五年\t368320\n吉尺明步\t368321\n落盘\t368322\n北京市知识产权局\t368323\n黄杰\t368324\n侧乳\t368325\n分子束\t368326\nday2\t368327\n大英博物馆\t368328\n颜氏\t368329\n梁欢\t368330\n斯特拉\t368331\n合肥论坛_合肥论坛网\t368332\nhzdx\t368333\n6朵\t368334\n滔天\t368335\nNemo\t368336\n清华紫光\t368337\nnok\t368338\n中国天楹\t368339\n机顶\t368340\n资源位\t368341\n无所不谈\t368342\n张静文\t368343\n地球之战\t368344\n猎德\t368345\n上海新华医院\t368346\n太平洋电脑网\t368347\nkey-value\t
368348\n梦幻诛仙手游合欢\t368349\n2529\t368350\nイク\t368351\n过新年\t368352\n出生月\t368353\n研习班\t368354\n易语言编辑框\t368355\n哈登\t368356\n常用于\t368357\n插拔\t368358\n输液瓶\t368359\n丝芙兰\t368360\n诗学\t368361\n成长型\t368362\n太阳钟\t368363\n大赌\t368364\n泥娃\t368365\n68个\t368366\n九仙帝皇诀sodu\t368367\n摩羯男\t368368\n展览路街道\t368369\n大沥铝材网\t368370\n怎么时候\t368371\n广州医科大学第一附属医院\t368372\n入党推优\t368373\n美猴王\t368374\n收图\t368375\n第1种\t368376\n饯\t368377\n龙山路\t368378\n甜枣\t368379\nJEECG技术博文\t368380\n番茄土豆\t368381\nsvm\t368382\n郑老屁\t368383\n诱惑我的邻家美女姐姐\t368384\ndrawing\t368385\n3.6亿元\t368386\n咬脚\t368387\n合卡\t368388\nIT私房菜\t368389\n黄龙溪谷\t368390\n掉下\t368391\n莫斌\t368392\n窝瓜\t368393\n新茶\t368394\nPosPal\t368395\nlayer层\t368396\n赛领\t368397\n5.2.7\t368398\n豪门宠婚\t368399\n南岗区政府\t368400\n虎牙拉风龙\t368401\n几处\t368402\n科技点\t368403\n2017年8月7日\t368404\n防灾减灾日\t368405\n古泉社区\t368406\n住宿生\t368407\n三步舞\t368408\n天翼飞Young\t368409\n白猪\t368410\n俞大猷\t368411\n刘老师\t368412\n重庆煤科院\t368413\n天野\t368414\n协信集团\t368415\n电雷管\t368416\n2160P\t368417\nBBoom\t368418\n无愧于心\t368419\nduang\t368420\nLintcode\t368421\nuroot\t368422\n钢塑复合管\t368423\n狐娘\t368424\n后湖大道\t368425\n斟酌\t368426\n盛唐\t368427\n辉夜月\t368428\n数字画\t368429\n普洱市人民政府\t368430\n安东贝索斯\t368431\n鞋迷\t368432\n双项\t368433\n高致病性禽流感\t368434\n梦想的声音第二季\t368435\nSMI\t368436\n刘曙光\t368437\n格式化成\t368438\n湖南省人社厅\t368439\n被点名\t368440\n找齐\t368441\n石料场\t368442\n搜字网\t368443\nJULY\t368444\n20171201\t368445\n聂广\t368446\n钓鱼箱\t368447\n受管\t368448\nZAYN\t368449\n硫酸铝钾\t368450\n仿照\t368451\n侧卧\t368452\n出生日\t368453\n德意志帝国\t368454\n赣北\t368455\n20GP\t368456\n北京国际电影节\t368457\n五1\t368458\n义乌国际博览中心\t368459\n点火系统\t368460\n紫陶\t368461\n蒸馏\t368462\nSUSE11\t368463\n6260\t368464\n海湾消防\t368465\n栖息谷\t368466\n大一点\t368467\n一倍\t368468\n锁骨\t368469\nwidth=\t368470\n柳沟\t368471\nQi\t368472\n6月18日\t368473\ncatgatp\t368474\nblowing\t368475\n金式\t368476\n海一方\t368477\n陈艺\t368478\n女贝网\t368479\nwww.51sole.com\t368480\n国家广播电视总局\t368481\n郑中\t368482\n常州网\t368483\n十几元\t368484\n安阳日报\t368485\n真心话\t368486\n全乡\t368487\nTalk\t368488\n地州\t368489\n四维图\t368490\n捶打\t368491\nN
avMesh\t368492\nTimothy\t368493\n佩特拉\t368494\n军装照\t368495\n黔灵山公园\t368496\nVOLTE\t368497\nWordReference\t368498\n厢\t368499\n阿弗莱克\t368500\n平反\t368501\n笑场\t368502\n哈弗H3\t368503\n埃米尔\t368504\nBrakes\t368505\n平板太阳能热水器\t368506\n哆啦云\t368507\n泰狮\t368508\n禀赋\t368509\n1.8l\t368510\n爱拍\t368511\n郭京飞\t368512\n极米z4x\t368513\n墙梁\t368514\n投资家\t368515\nVan\t368516\n调机\t368517\n2018年4月11\t368518\n糖果传奇\t368519\nxiehui\t368520\n博坛\t368521\n无无\t368522\n购物比价网\t368523\n阿鲁科尔沁旗\t368524\n刘寅\t368525\n二千\t368526\n石氏\t368527\n宝贝婚团网\t368528\n星野集团\t368529\n英国斗牛犬\t368530\n脚跟\t368531\n舒格\t368532\n意法半导体\t368533\n戴琳斯\t368534\n电耗\t368535\nTicwear\t368536\n男友\t368537\n张希\t368538\npromises\t368539\n卡洛斯\t368540\n粤建通\t368541\n5派\t368542\n赫罗纳\t368543\n诗画\t368544\nf12\t368545\n第1层\t368546\n28050\t368547\n廉洁履行\t368548\n胸闷气\t368549\nportlet\t368550\nslope\t368551\n公文版\t368552\n汕湛高速\t368553\nWALKER\t368554\n太极图\t368555\n长安夜\t368556\n优才计划\t368557\n201505\t368558\n尤里安\t368559\n中文编程\t368560\n上海商学院\t368561\n与民同乐\t368562\n5f\t368563\n甘肃省纪委\t368564\n马屁精\t368565\nnotexpress\t368566\n慢性胆囊炎\t368567\n国贸大酒店\t368568\n101应援社\t368569\n沙之国\t368570\n孤岛危机4\t368571\n中清\t368572\n贝因美奶粉\t368573\nxiaolong\t368574\nwww.YE321.com-影音先锋\t368575\n平行\t368576\n200丸\t368577\n余姚市教育局\t368578\n重庆大学研究生院\t368579\n黄义达\t368580\n150万美元\t368581\n吉祥名网\t368582\n母音\t368583\n墓王\t368584\n杜工部\t368585\n科判\t368586\n酒糟\t368587\n【寅子\t368588\n汉尼拔\t368589\n甲状腺病\t368590\n佩恩\t368591\n老白干酒\t368592\n95名\t368593\n合成词\t368594\n网易印像派\t368595\nTongji\t368596\n瑞泰\t368597\nskincare\t368598\n比比东\t368599\nNG-ZORRO\t368600\n鳃盖\t368601\n泛水\t368602\n严嵩\t368603\n埙\t368604\n表决心\t368605\n北京招聘网\t368606\n模拟卷\t368607\n寓言类\t368608\ngraham\t368609\nHopkins\t368610\n玉兰油\t368611\n神狗\t368612\n和佳股份\t368613\n分手吧_\t368614\nbip\t368615\n百度站长社区_百度\t368616\n劲炫ASX论坛_汽车之家论坛\t368617\n第7张\t368618\n手记\t368619\n争奇斗艳\t368620\n油浆\t368621\nr420\t368622\nGivenchy\t368623\n层面\t368624\n蚯蚓机\t368625\n有法可依\t368626\n竹叶山\t368627\n数据段\t368628\n案组\t368629\n2.4m\t368630\nMarkdownPad2\t368631\nIFI\t368632\n青龙路\t
368633\n中国舞蹈协会\t368634\n0120\t368635\n001.000\t368636\n市直\t368637\n服务器人\t368638\n商丘梁园区\t368639\n包装板\t368640\nYododo\t368641\n互助式\t368642\n申雪\t368643\n第一手\t368644\nS弯\t368645\n刘金龙\t368646\n20151102\t368647\n2017年09月30日\t368648\n惠动\t368649\nnba2k16吧\t368650\n乱世丽人行\t368651\n天野喜孝\t368652\n迎泽大街\t368653\nbeijin\t368654\n益生菌\t368655\n80所\t368656\nwives\t368657\n箱式变压器\t368658\nPlatinum\t368659\nCharcoal\t368660\n幼贞\t368661\n5小时内\t368662\n鲍勃·迪伦\t368663\nTempered\t368664\n莎莎源码论坛\t368665\nCX4\t368666\n上海港\t368667\n炫飞\t368668\n勤诚达\t368669\n附着式升降脚手架\t368670\n埃匹希斯\t368671\n6T\t368672\n260平米\t368673\n左江\t368674\n拖柜\t368675\n步行桥\t368676\n3000斤\t368677\n尿道口红肿\t368678\n天津银监局\t368679\n澳网\t368680\n测定界\t368681\n尹明善\t368682\n2.43\t368683\n弯子\t368684\nCritical\t368685\n黑货\t368686\nseeker\t368687\n服务器端\t368688\n淘宝子\t368689\n绿菊\t368690\n亲爱的\t368691\n机械展\t368692\narth\t368693\n龙吐珠\t368694\n大头鱼\t368695\n宋韶光\t368696\ncustody\t368697\n心理师\t368698\n1942\t368699\n梅花形\t368700\nTickle\t368701\n出力\t368702\n17点\t368703\n川师附中\t368704\n169.254\t368705\nRestlet\t368706\n2017跨年\t368707\n遇困\t368708\n定向选调生\t368709\n早操\t368710\n俩张\t368711\n珩磨\t368712\n烟袋斜街\t368713\n0470\t368714\nphp5\t368715\n猎豺狼分集\t368716\n油尺\t368717\n抗敏\t368718\n壳体\t368719\n办实事\t368720\n证书库\t368721\nopenjdk\t368722\nsalvation\t368723\nchainer\t368724\nDedeCms\t368725\ndemos\t368726\n冒用\t368727\nkeez\t368728\n背锅\t368729\n魔甲\t368730\nMetropolo\t368731\nR方\t368732\nA1栋\t368733\n夏特\t368734\n800亿\t368735\n拖长\t368736\n小舒\t368737\niPhone7plus\t368738\n推动力\t368739\n来安\t368740\nSensitivity\t368741\nenkei\t368742\nAIX6.1\t368743\n新河湾\t368744\n广州市轻工职业学校\t368745\n大声\t368746\n峨边\t368747\n全国人大会议广东代表团\t368748\n0.004\t368749\n众安在线\t368750\n随房网\t368751\n59页\t368752\n交流部\t368753\n一初\t368754\n各抒己见\t368755\nwin7吧\t368756\n百年大计\t368757\nPDF-CSDN\t368758\n自贸\t368759\n不一致\t368760\n进程管理器\t368761\n永利集团\t368762\nlabreeze\t368763\n南京文交所\t368764\n并购贷款\t368765\n壁纸族\t368766\n宫闱\t368767\n风清\t368768\n蔡司\t368769\n|主\t368770\n管理干部\t368771\nmother\t368772\n轻兵器\t368773\n雅思王
\t368774\n格列兹曼\t368775\n梁从诫\t368776\n细分\t368777\n官咖\t368778\n太阳神三国杀\t368779\n西安汽车\t368780\nwuauserv\t368781\nDevexpress\t368782\nSahara\t368783\n开花期\t368784\n魏薇\t368785\n东方Project\t368786\n山东一卡通\t368787\nAcmen\t368788\n参考书\t368789\npeggy\t368790\n杨志华\t368791\n合肥通\t368792\n莹\t368793\n天唐锦绣\t368794\n欧迪克\t368795\nkuaidiwo\t368796\nITE\t368797\n煮\t368798\n蓝洁\t368799\narp攻击\t368800\n新学网\t368801\nAsharp\t368802\n南江\t368803\n虐孕\t368804\n武汉软件工程职业学院\t368805\n小信\t368806\n重樱\t368807\n房论\t368808\n香花\t368809\n8月13日\t368810\n废奴\t368811\npass\t368812\nBEATLESS\t368813\n建筑系\t368814\n维迈通\t368815\n源程\t368816\n折叠椅\t368817\n荣耀4a\t368818\nx^2+x+1\t368819\n青岛信息网\t368820\n金天师\t368821\n256色\t368822\n顶分型\t368823\n学西班牙语\t368824\n汇总记账凭证\t368825\nVida\t368826\ninformix数据库\t368827\n归山\t368828\n闽侯县\t368829\n粗浅\t368830\nNRF52832\t368831\n下课后\t368832\n血性\t368833\n青空\t368834\n清实录\t368835\n下拨\t368836\n散点图坐标轴\t368837\n87870\t368838\nhod\t368839\n斯琴马可\t368840\ncodesoft\t368841\n绑架罪\t368842\nasr\t368843\nriyu\t368844\n中南美\t368845\n宜昌市区\t368846\n烟龄\t368847\n70元\t368848\n口水歌\t368849\n方格\t368850\nmarriage\t368851\n工具类\t368852\n土黄色\t368853\n内蒙古自治区发展和改革委员会\t368854\n空运\t368855\n历遍\t368856\ncatwalk\t368857\n600万元\t368858\n水建\t368859\n靶面\t368860\n普利珠\t368861\n掉级\t368862\n知識\t368863\n金满堂\t368864\nTooling\t368865\n丽香\t368866\n防部\t368867\nnunjucks\t368868\n逼死\t368869\n调研\t368870\nSESSION\t368871\n小技\t368872\n首小时\t368873\n永宁古城\t368874\nMusk\t368875\n鱼腥味\t368876\n金貂\t368877\n都市生活\t368878\n400度\t368879\n不能忘记\t368880\n以後\t368881\nxrv\t368882\n孙连城\t368883\n套保\t368884\n拟订\t368885\n雅化\t368886\n芳香\t368887\n不枉\t368888\n陆峰\t368889\n种树\t368890\n说法\t368891\n花花公主\t368892\n平泉\t368893\n福州地铁4号线\t368894\n液压搬运车\t368895\n鸡皮疙瘩\t368896\n环球影业\t368897\n陈飞宇\t368898\nLED液晶电视\t368899\n牙托\t368900\ntagname\t368901\n1500亿\t368902\n风扇\t368903\n015年\t368904\n剑宗艾肯\t368905\nbradley\t368906\n龙渊泽\t368907\n预应力管桩\t368908\nSpringBoot学习笔记\t368909\n静居寺\t368910\n_奇\t368911\n长生剑\t368912\n戏码\t368913\n访客量\t368914\n华为天际通\t368915\n中国科学技术信息研究所\t368916\nartlant
is\t368917\nThinkpad\t368918\n一年以上\t368919\n俞挺\t368920\n七上八下\t368921\n武汉铁路局\t368922\nmockjs\t368923\n寻仇\t368924\n阳光论坛_汽车之家论坛\t368925\n1932年\t368926\n长沙旅行社\t368927\nMaking\t368928\n躺着\t368929\n健健\t368930\n地佐辛\t368931\n吴作斗\t368932\nAsrock\t368933\nKirkland\t368934\n很受伤\t368935\n农大南路\t368936\nlav\t368937\n伸入\t368938\n金瓯\t368939\nwinston\t368940\n现代性\t368941\n均沾\t368942\n女伶\t368943\n同手同脚\t368944\n57美国网\t368945\n扭筋\t368946\n意媒\t368947\n福建红盾网\t368948\n大巫\t368949\n真的很好\t368950\nGotham\t368951\nURLConnection\t368952\n几代\t368953\n博古架\t368954\n八强\t368955\n月季交易网\t368956\n相貌\t368957\nmomenta\t368958\n担保合同\t368959\n3600万\t368960\n易筋经\t368961\n旋塞\t368962\n庚寅日\t368963\n永安小学\t368964\n一氯\t368965\nPK场\t368966\n】医院大全|医院排名|医院\t368967\neclipse\t368968\n血虚\t368969\n4.\t368970\nyeezy\t368971\n令人费解\t368972\n葡京酒店\t368973\nDoppelherz\t368974\nJurlique\t368975\n云南信托\t368976\n智水\t368977\n丰台法院\t368978\n集聚\t368979\n虹桥万科中心\t368980\n伤感情\t368981\n土地储备中心\t368982\n江丰电子\t368983\nGSM\t368984\n告吹\t368985\n曲靖师范学院\t368986\n中国航海博物馆\t368987\n三十张\t368988\n张玉利\t368989\n追不上\t368990\n奎屯\t368991\ninstalle\t368992\n烯酰吗啉\t368993\n级联\t368994\n猜谜\t368995\n古味\t368996\namore\t368997\n枯树\t368998\n開幕\t368999\n胡杨\t369000\n李志红\t369001\n4秒\t369002\n回听\t369003\n七大罪\t369004\n公审\t369005\n府办\t369006\n信贷产品\t369007\n格雷梅\t369008\nwwwww\t369009\n小品集\t369010\n乐张\t369011\n热处理炉\t369012\n秦时明月2\t369013\n四篇\t369014\nTables\t369015\nPostgreSql\t369016\n杨幂工作室\t369017\ngetattribute\t369018\n泰捷视频\t369019\n天后级\t369020\n投奔\t369021\n卢山\t369022\n凡妮莎\t369023\n20170430\t369024\n聚氨酯发泡保温管\t369025\n二阶矩阵\t369026\n王一新\t369027\nandroideabi\t369028\nalphabet\t369029\n恩菲\t369030\n孤岛惊魂\t369031\n勇虎\t369032\n木胶板\t369033\n乡城县\t369034\n北京门头沟\t369035\nseriously\t369036\n歌奈\t369037\n一氯代物\t369038\n小马过河\t369039\n上海化工研究院\t369040\n等边角钢\t369041\n胆红素\t369042\nembarrassed\t369043\n刑事法\t369044\n发型站\t369045\n拒发\t369046\n站区\t369047\n固瑞克\t369048\n企业号\t369049\n李小燕\t369050\n底仓\t369051\nBaan\t369052\nD3\t369053\n压境\t369054\n凯多\t369055\nEvent\t369056\ndom4j\t369057\n去了\t36905
8\nArting365\t369059\n免责条款\t369060\n周霁\t369061\nbdm\t369062\n20160219\t369063\nretain\t369064\n佩奇\t369065\nfails\t369066\neX+\t369067\n南溪\t369068\n芷\t369069\nnexus9\t369070\ninfomation\t369071\nLED背光源\t369072\n恒裕\t369073\n商户\t369074\n鸭子\t369075\n铁血使命\t369076\nRITTAL\t369077\n螺线\t369078\n二十八\t369079\nForward\t369080\n西师大\t369081\n雪竹\t369082\n结构树\t369083\n难道\t369084\noracel\t369085\n四二\t369086\n圆周运动\t369087\nATE\t369088\nabide\t369089\n静电喷塑\t369090\n保障性住房\t369091\n南澳大利亚\t369092\n载重\t369093\n成都市公共资源交易服务中心\t369094\nsunjie\t369095\n会师\t369096\n遮掩\t369097\nvol.5\t369098\n柳林政府\t369099\n金氏\t369100\n17th\t369101\nPython2\t369102\n星灯\t369103\n20160817\t369104\n水卜樱\t369105\n离弦\t369106\nCocos2D\t369107\nrecord\t369108\n马自达5\t369109\n1818黄金眼\t369110\n湖州市公安局\t369111\n小米盒子\t369112\nAnita\t369113\n奶骑\t369114\n债市\t369115\nYU\t369116\n上尊\t369117\n耐候钢板\t369118\nEXFAT\t369119\n中国注册公司\t369120\n乐摩\t369121\n吉利_博越\t369122\n常用类\t369123\nCovermark\t369124\n乐天\t369125\n电子认证\t369126\n滕房网\t369127\n农业银行手机银行\t369128\n华顺\t369129\n枸杞红枣茶\t369130\n孵出来\t369131\n酊\t369132\n董浩\t369133\nStored\t369134\nestee\t369135\n秦桑易连恺\t369136\n流散\t369137\nbgl\t369138\n_板报网\t369139\n咀嚼片\t369140\n中国忠旺\t369141\ncybex\t369142\n飞虎极战全集\t369143\n金嗓子喉片\t369144\n淡然\t369145\n膣\t369146\n赭\t369147\n小田原\t369148\n同望\t369149\n干姜\t369150\n闭合电路欧姆定律\t369151\n12656\t369152\nsymposium\t369153\n金昌路\t369154\n绝地求生刺激战场模拟器\t369155\n维权\t369156\n大疆精灵Phantom\t369157\n寒热\t369158\n顺辉\t369159\n主动脉夹层\t369160\nfsi\t369161\n上海中学\t369162\n数学与应用数学专业\t369163\n锦江乐园\t369164\n东平村\t369165\n花岗石\t369166\n张俊杰\t369167\n小仙男\t369168\n前无古人\t369169\n天巡\t369170\n文明6\t369171\n拦截器\t369172\nsexinsex\t369173\n刘世杰\t369174\n超级跳\t369175\n戏剧家\t369176\n丧命\t369177\n请选\t369178\nAccounts\t369179\n成段\t369180\n亿健\t369181\nEzio\t369182\n六起\t369183\nЗ\t369184\n沈岸\t369185\n尘埃3\t369186\n录播室\t369187\n中国庆阳政府\t369188\n腰酸背痛\t369189\n浙江省民政厅\t369190\nanker\t369191\n10.0\t369192\n四色\t369193\n信号工\t369194\n该校\t369195\n区委区政府\t369196\n紫天\t369197\n河南路\t369198\nRIO锐澳鸡尾酒\t369199\n唐家村\t369200\n河南红
星矿山机器有限公司\t369201\n伪装\t369202\n勤奋者\t369203\n魔怪\t369204\n金苹果\t369205\neast\t369206\nconvolutional\t369207\n国培教育\t369208\nbenifit\t369209\n8折\t369210\nisalpha\t369211\n7克\t369212\n上光\t369213\n超算中心\t369214\n烧鸭\t369215\n铰孔\t369216\nAmon\t369217\n猫眼草\t369218\n毛诗序\t369219\n832路\t369220\n台化\t369221\n攻略本\t369222\n氮化镓\t369223\n互发\t369224\n大连万达商业地产股份有限公司\t369225\n云ERP\t369226\n尿尿\t369227\n11.1.0\t369228\n病原学\t369229\ndmesg\t369230\nnexus5吧_\t369231\n清史稿\t369232\n黄金路\t369233\nJohor\t369234\n似\t369235\n第五段\t369236\n羊肉馆\t369237\npbl\t369238\n尔曹\t369239\n你的世界\t369240\n均方根误差\t369241\nloudly\t369242\n冒险家\t369243\nImba\t369244\n逆光源\t369245\ncondensed\t369246\n酱油醋\t369247\n第二十六集\t369248\n杭州市金融投资集团\t369249\n大腕们\t369250\n最终幻想10HD\t369251\n施罗德\t369252\n闭式\t369253\n猪心\t369254\nP320\t369255\nast\t369256\n四区\t369257\n出用\t369258\nbeatrich\t369259\n20161030\t369260\n泊里镇\t369261\n推广者\t369262\n希望集团\t369263\nRegards\t369264\n秋人\t369265\necs服务器\t369266\n衣冠禽兽\t369267\nKX3552\t369268\nANSI\t369269\n切盘\t369270\nevaluation\t369271\n秦风网\t369272\n上海总领事馆\t369273\n崇明区\t369274\n,,,\t369275\naccountability\t369276\n扬州市委\t369277\n龙仙\t369278\n卡尼尔\t369279\n西江经济带\t369280\n云码\t369281\n项目决策分析与评价\t369282\n李约瑟\t369283\n技术资料集\t369284\n冰卡\t369285\n五胡\t369286\n卢卡尔\t369287\n促行\t369288\n桂城\t369289\n种猪\t369290\nclob\t369291\n客单\t369292\n多环芳烃\t369293\n超级牛\t369294\nWindows,Mac\t369295\n最大气\t369296\n混合机\t369297\n外在美\t369298\n蒙克\t369299\n消费者权益保护法\t369300\n影音先锋资源男人站\t369301\n传染源\t369302\n肤痒颗粒\t369303\n360汽车网\t369304\n夕月\t369305\n第2家\t369306\n欧阳娣娣\t369307\n拜纳姆\t369308\n2015-04\t369309\n30日内\t369310\n绿地率\t369311\n淘金船\t369312\ncjp\t369313\n突发公共卫生事件\t369314\n209路\t369315\n嗥\t369316\n不老泉\t369317\n省立医院\t369318\n被遗忘\t369319\n热水泡\t369320\n有道翻译蛋\t369321\n比斯利\t369322\n炎琥宁\t369323\n半导体制冷片\t369324\n蕾蒙威\t369325\n二环路\t369326\n灶具\t369327\n进程\t369328\n5567\t369329\n几年后\t369330\n破冰\t369331\nProposition\t369332\ne52\t369333\n剔槽\t369334\n吕不韦\t369335\n14mm\t369336\n船主\t369337\n投频\t369338\n陆翔路\t369339\nora-12154\t369340\n上海市环境保护局\t369341\n并机\t3
69342\n肉汁\t369343\n舒康\t369344\nv1.0.0\t369345\ngcse\t369346\n梦幻89吧\t369347\n环己胺\t369348\nEE\t369349\nhump\t369350\nX-Men\t369351\n付鑫联盟\t369352\n后冠\t369353\n奔腾b50\t369354\n五万公里\t369355\n三丈\t369356\n二月份\t369357\n绿城蓝庭\t369358\n战团\t369359\n双赢\t369360\n25处\t369361\nsbr\t369362\n明年2月\t369363\n拟稿\t369364\n树种\t369365\n圆钻\t369366\n微构\t369367\nantrun\t369368\n任志宏\t369369\n续篇\t369370\n绿尘枫\t369371\n20171029\t369372\n阿特金斯减肥法\t369373\n卷心菜\t369374\n烽影\t369375\n渤海\t369376\n固支\t369377\nback\t369378\n所得税税\t369379\n自力式压力调节阀\t369380\n冯俊\t369381\n880M\t369382\n音弹\t369383\n建筑结构\t369384\n儿童险\t369385\n鱼胶粉\t369386\n节律性\t369387\nTse\t369388\n110000\t369389\n1667\t369390\n精神现象学\t369391\n40037\t369392\n20150428\t369393\n迈锐\t369394\n一件代\t369395\nmysql.user\t369396\n20170411\t369397\n王元化\t369398\n女排赛\t369399\n广兰路\t369400\n新蔡\t369401\n易鹏\t369402\n棱\t369403\n年间\t369404\n善林\t369405\n居延\t369406\n杭州上海世界外国语小学\t369407\n易贝乐\t369408\n来疯\t369409\n亚赔\t369410\n雌鹿\t369411\n375ml\t369412\n10宫\t369413\nokita\t369414\nFujitsu\t369415\n北京坊\t369416\n振邦\t369417\n珞狮路\t369418\nAAA级\t369419\n拒绝\t369420\n宫瀬リコ\t369421\n超碰上传\t369422\n滑膛炮\t369423\ncgi\t369424\n苯基\t369425\n武旦\t369426\nNidec\t369427\n阳逻\t369428\n超过72小时\t369429\n买卖方\t369430\n公约\t369431\n3000mAh\t369432\n外向\t369433\n梅李镇\t369434\n陈旺\t369435\n2kol\t369436\n吾游网\t369437\n上海斐讯\t369438\n四桥\t369439\n排插\t369440\nADA\t369441\n转数\t369442\n打人者\t369443\n_锡山司法局\t369444\n朱德熙\t369445\n尿蛋白\t369446\n结构柱\t369447\n豆瓣史\t369448\n天才冲冲冲\t369449\n全愈\t369450\n打\t369451\n20150903\t369452\nRatio\t369453\n2.0.3\t369454\nstakes\t369455\n美容\t369456\n双剑\t369457\n金髪\t369458\n引见\t369459\n香车\t369460\n麦克马斯特\t369461\n唐凌\t369462\n动电\t369463\n女裤\t369464\n第44号\t369465\n固本培元\t369466\n茂谷\t369467\n殷红\t369468\n活在当下\t369469\n桨板\t369470\nBMW\t369471\neprime\t369472\n军势\t369473\n投资指南网\t369474\nTaichi\t369475\n卡蓝条\t369476\n月坛南街\t369477\nav_a片\t369478\nscx4321\t369479\n听取\t369480\n阅文\t369481\n2015年8月\t369482\n廣州\t369483\n绿毯\t369484\nstepping\t369485\n方正字体_方正\t369486\n提成\t369487\n在线手写输入法_手写查字_日语输入法_韩语输入法_易
笔字\t369488\na88\t369489\n寒服\t369490\n遍地走\t369491\n黄眼屎\t369492\n翻床\t369493\nreadlink\t369494\nURLs\t369495\n下期中考试\t369496\n附价\t369497\n赛钢\t369498\nsfz\t369499\n章龄之\t369500\n两个务必\t369501\ninfected\t369502\n总务处\t369503\n樓盤\t369504\n培哚普利叔丁胺片\t369505\n二休\t369506\n十四周\t369507\n[桜\t369508\n膨胀\t369509\ndify\t369510\n二嫁\t369511\nUT\t369512\n3667\t369513\nCCIE\t369514\n祁同伟\t369515\n晋朝\t369516\n片翼\t369517\n阿达帕林\t369518\n躲闪\t369519\n作文集\t369520\nQQ表情包\t369521\n泾河工业园\t369522\n颗粒机\t369523\nexplosion\t369524\n百济\t369525\nchegg\t369526\n杨国平\t369527\n32MB\t369528\n奢侈品网\t369529\nDiscovery\t369530\n中华万年历\t369531\n张村\t369532\n陈塘庄\t369533\n骁龙636\t369534\n阿德勒\t369535\n艺术感\t369536\n大众传播\t369537\nルクルト\t369538\n半规\t369539\n天津外国语大学滨海外事学院\t369540\n服装打版\t369541\nHeap\t369542\nCM\t369543\nzuang\t369544\nKANO\t369545\n鞍座\t369546\n下刀\t369547\n香树花城\t369548\n360宽带测速器\t369549\n糊状\t369550\n西方世界的劫难\t369551\n黑绿\t369552\n妨害公务罪\t369553\n魏云\t369554\n脉管炎\t369555\n巢谷\t369556\nmax文档投稿赚钱网\t369557\n红蓝线\t369558\nMany\t369559\n梁紫晨\t369560\n经验值\t369561\n1.5磅\t369562\n宋梓馨\t369563\n水映\t369564\n2500公里\t369565\n股堂\t369566\n浪花朵朵\t369567\n柱箍筋\t369568\n爱股轩\t369569\n茶机\t369570\n2258\t369571\n镇魂\t369572\n乒乓网\t369573\nlibj\t369574\n人头\t369575\n济南泉\t369576\n解放军海军\t369577\nlibcef\t369578\n鸿宾\t369579\n太冲\t369580\n罗锅\t369581\n后发制人\t369582\n万青\t369583\nCAD2007\t369584\nmx\t369585\n长江电力\t369586\n账页\t369587\n英菲尼迪QX60\t369588\n110V\t369589\n小苑\t369590\n雷车\t369591\n露底\t369592\n凤凰大视野\t369593\n苤蓝\t369594\n额娘\t369595\n链条\t369596\n斯帅\t369597\n迅雷磁力链接\t369598\n美幸\t369599\n胆拖\t369600\n七轮\t369601\n毁掉\t369602\n稔山\t369603\n老娘\t369604\n稀稀\t369605\n匿名函数\t369606\nsud\t369607\n出航\t369608\nRT-AC86U\t369609\ngigi\t369610\n龙巅\t369611\n胜地\t369612\n赤月传说2\t369613\n黄金分割点\t369614\n林荫道\t369615\nhtdocs\t369616\n1pondo\t369617\n紫牛\t369618\n桶子\t369619\nmathorcup\t369620\n二手房税费计算器\t369621\n空间感\t369622\n李振\t369623\n潮阳站\t369624\n十八期\t369625\n郑州颐和医院\t369626\n倒模\t369627\n施蒂利克\t369628\n金华市市场监督管理局\t369629\n保险公估人\t369630\n宫腹腔镜手术\t369631\nCHINESE\t369632\n6670\t3696
33\nsimcitybuildit\t369634\n快乐舞\t369635\n乌鲁木齐机场\t369636\n小马村\t369637\n第五宫\t369638\n自主学习网\t369639\n主表\t369640\n尾座\t369641\n朝阳区小学\t369642\nrmb\t369643\n汉城湖\t369644\n车床\t369645\nIANG\t369646\n新东方英语\t369647\n747\t369648\n护舷\t369649\n搜狐畅游\t369650\n南京物流公司\t369651\n云电视\t369652\n合时代\t369653\n好酒招商网\t369654\n云龙万达广场\t369655\nsgi\t369656\n海洋生物学\t369657\n金辉路\t369658\n310路\t369659\nANGEL\t369660\nz变换\t369661\nhash表\t369662\n房网\t369663\n主爱\t369664\n丰田坦途\t369665\n100招商网\t369666\n双曲\t369667\n暂态\t369668\n2017-11-25\t369669\n2000年以来\t369670\n大成镇\t369671\nt440\t369672\n假分数\t369673\nnamic\t369674\n外嫁女\t369675\n纯熟\t369676\n志文\t369677\n验电器\t369678\n暴伤\t369679\n晓丹\t369680\n商场现代化\t369681\nwhore\t369682\n百度畅听\t369683\n蓝紫\t369684\n成吉思\t369685\n天仙子\t369686\n搭子\t369687\n16年6月\t369688\n黑泷堂奶茶\t369689\n格物致知\t369690\n劳字\t369691\n晋江新闻网\t369692\n明成祖朱棣\t369693\n中国人民银行营业管理部\t369694\n厚壁无缝钢管\t369695\n非台\t369696\n山东理工职业学院\t369697\n云飞路\t369698\n炒制\t369699\n血域\t369700\nRealtime\t369701\n阳逻在线\t369702\n7套\t369703\n绝世神王\t369704\n证项\t369705\n厦门北站\t369706\n有礼貌\t369707\n润邦股份\t369708\n深圳特区报\t369709\nyouj\t369710\n雷同\t369711\n全吧\t369712\nbui\t369713\nTOPIK\t369714\n低吸\t369715\n中国农科新闻网\t369716\n接入商\t369717\n首控集团\t369718\n途中\t369719\n91分\t369720\n凯马\t369721\n私愤\t369722\n老吕\t369723\n使用感\t369724\nrecv\t369725\n中国食用菌协会\t369726\n连岳\t369727\n沁州黄小米\t369728\n崔莺莺\t369729\n集币在线\t369730\nEtymology\t369731\n中国交通通信信息中心\t369732\n禽病\t369733\n第17关\t369734\n招商银行信用卡\t369735\n排血\t369736\n鞍马\t369737\n财商\t369738\nguild\t369739\n徳\t369740\n十几公里\t369741\n麻醉疼痛专业\t369742\n万能五笔输入法\t369743\n洁霸\t369744\n2075\t369745\n超人网\t369746\nfossil\t369747\n真乃\t369748\nNORMAL\t369749\n价格司\t369750\n琴桥\t369751\nMango\t369752\nnya\t369753\n朴素妍\t369754\n阜宁日报数字报\t369755\n广西联通\t369756\n中级会计师考试\t369757\nServiceStack\t369758\n金报\t369759\n张鹤桓\t369760\n类户\t369761\n吉他版\t369762\n陶盆\t369763\n漳州开发区\t369764\n上地街道\t369765\n隐忍\t369766\n花开花落\t369767\n黑鳞\t369768\nGilbert\t369769\nHOMO\t369770\nRDMA\t369771\n齐鲁人才网\t369772\nUSCIS\t369773\n周炜\t369774\n菲比酒吧\t369775\n陈建新\t369776
\n精兵\t369777\n理化板\t369778\n衣袖\t369779\ninject\t369780\n抽水器\t369781\n廖健\t369782\n中华人民共和国农业农村部\t369783\n字冠\t369784\n张文博\t369785\n出包王女\t369786\n木樨\t369787\n正定\t369788\n南京市审计局\t369789\nCarol\t369790\n红印黑糖\t369791\n3160\t369792\nYE\t369793\n杠铃片\t369794\n迪拉姆\t369795\nCredentials\t369796\n王亚楠\t369797\nsignatures\t369798\n20170318\t369799\nCybex\t369800\nmacho\t369801\n房地产售楼部\t369802\n根际\t369803\n斜角\t369804\n液力\t369805\n抓狂\t369806\n中华地板网\t369807\n宠溺\t369808\n雪花状\t369809\nhcharts\t369810\n暗夜罗\t369811\n轻货\t369812\n行政管理专业\t369813\n蜂糖\t369814\n御酒\t369815\nsikixix\t369816\n单开\t369817\n剩下\t369818\n普陀岛\t369819\n白云龙\t369820\n西南交通大学图书馆\t369821\n敲背\t369822\n月池\t369823\n谐波减速器\t369824\nPerformance\t369825\n一年级语文下册\t369826\n川湘菜\t369827\n勇者斗恶龙5\t369828\n长郡\t369829\n第6代\t369830\n上海市税务局\t369831\n还以为\t369832\n金石\t369833\n0753\t369834\n冰心堂\t369835\n卓朗\t369836\n天台花园\t369837\n梦幻西游2018\t369838\n尹志尧\t369839\n噬骨\t369840\n加州大学伯克利分校\t369841\n烫台\t369842\nDKNY\t369843\n2018年4月20号\t369844\n护筒\t369845\npiao\t369846\n家世\t369847\n王小伟\t369848\nNubile\t369849\n农谷\t369850\n菜单树\t369851\n海南外国语职业学院\t369852\n4维\t369853\n龙场镇\t369854\n里德\t369855\n深圳市城市交通规划设计研究中心\t369856\narchived\t369857\n智享版\t369858\n一命通关\t369859\n一整块\t369860\n首钢\t369861\nSophia\t369862\n出包王女Darkness\t369863\n20180201\t369864\n星耀学园\t369865\ncalis\t369866\n中者\t369867\n美牙\t369868\nrising\t369869\n宁夏回族自治区文化厅\t369870\n杭州工商银行\t369871\n指示表\t369872\n出勤率\t369873\n高安中学\t369874\n邀请展\t369875\n复旦大学信息科学与工程学院\t369876\n第十二集\t369877\n奕客\t369878\n华为Mate7\t369879\n黑白森林\t369880\n新乱世佳人\t369881\n岑\t369882\n汇客\t369883\nv3.1.4\t369884\n临潼\t369885\nHimalaya\t369886\n彩蚕\t369887\n单例\t369888\n2020年底\t369889\n定向降准\t369890\n石家庄高新技术产业开发区\t369891\n雪孩子\t369892\n超细干粉\t369893\n野香\t369894\n影驰1050ti\t369895\n300maan\t369896\n记忆术\t369897\n曲棍\t369898\n泰安高新区\t369899\n许伯\t369900\n艾卡\t369901\n水阀\t369902\n裤业\t369903\nAmi\t369904\n猫肉\t369905\n缓存\t369906\nPOE交换机\t369907\n刘昌\t369908\n地方政\t369909\n海城镇\t369910\n爆龙\t369911\n2联\t369912\n张牧野\t369913\nVine\t369914\n氿\t369915\n易趣网\t369916\n赵字\t36
9917\n奶酒\t369918\n海藻舞\t369919\n下年\t369920\n上料机\t369921\n滑套\t369922\n砷\t369923\n去年以来\t369924\nhoch\t369925\nv3.5.3\t369926\nhny\t369927\n驾驭\t369928\n神哥\t369929\n100页\t369930\n42步\t369931\n华侨中学\t369932\nMSK\t369933\n菏泽学院\t369934\n傻比\t369935\n日期型\t369936\n阿白\t369937\n可口可乐饮料有限公司\t369938\n光弘科技\t369939\n猫里奥\t369940\n主单\t369941\n玉管\t369942\nDoctor\t369943\n深澜\t369944\nqqmail\t369945\nE40\t369946\nTREND\t369947\n陕西出入境检验检疫局\t369948\n律\t369949\n音悦tai\t369950\n绿茵传奇\t369951\n黄栀子\t369952\n融资融券_全景网\t369953\n自组\t369954\n李廉锟\t369955\nbla\t369956\n百田奥比岛\t369957\n威尼斯双年展\t369958\n荷荷巴油\t369959\n深度学\t369960\nhitachi\t369961\n第一学\t369962\n麦克纳姆轮\t369963\n原委\t369964\n09款\t369965\n皇夫\t369966\n物联\t369967\n三比\t369968\n九门提督\t369969\nJello\t369970\n强势股\t369971\n先烈\t369972\n腺性膀胱炎\t369973\n五色龙章\t369974\nuncode\t369975\n富阳银湖\t369976\n神隐\t369977\nyindu\t369978\nCREE\t369979\n文游广场\t369980\n食人魔\t369981\nKathy\t369982\n24M\t369983\n羟丙基甲基纤维素\t369984\n新部落守卫战\t369985\n沾化区政府\t369986\nOracle12C\t369987\n思忆\t369988\nolex2\t369989\n泥潭\t369990\n北京公证处\t369991\n欢天喜地七仙女\t369992\n天津市第一中学\t369993\nspid\t369994\n西城小学\t369995\n国家授时中心\t369996\n闪耀\t369997\n490分\t369998\n5分钟后\t369999\nscreens\t370000\n刺鱼\t370001\n销售单\t370002\n离职补偿金\t370003\nAffect3D.com\t370004\n塑料垫\t370005\n三生缘起\t370006\n港珠澳大桥岛隧\t370007\n冰洁\t370008\n天津搬家公司\t370009\n膀胱过度活动症\t370010\n新时代全面从严治党\t370011\n新纹章之谜\t370012\n佃户\t370013\n医疗级\t370014\n防谍\t370015\n电狐\t370016\n5000mAh\t370017\nwerk\t370018\n17P]\t370019\n红潮\t370020\n南郊区\t370021\n新秀奖\t370022\n环城公园\t370023\n杨家镇\t370024\n生态市\t370025\n昆明南\t370026\n沈阳\t370027\n云天河\t370028\n楚庄王\t370029\n2018-01-20\t370030\n厨娘\t370031\n团场\t370032\nGetshell\t370033\n吐毛\t370034\n上百名\t370035\n母马\t370036\n福禄寿喜\t370037\n长城宽带网络服务有限公司\t370038\n白沙洲大道\t370039\n皮皮播放器\t370040\n没怪\t370041\n玉兰树\t370042\n韭黄\t370043\n王者荣耀弈星\t370044\nsmallint\t370045\nCognitive\t370046\n民警\t370047\n武汉市人才服务中心\t370048\n有限责任公司章程\t370049\n1公顷\t370050\n恒昌公司\t370051\n南夏\t370052\nxxs\t370053\n合唱\t370054\n一辰\t370055\n90码\t370056\nHSF\t370057\n小凌\t370058\n花姐\t370
059\n彭巧娣\t370060\nrandperm\t370061\n浙江西湖高等研究院\t370062\n杜牧\t370063\n南京同仁堂\t370064\n洗尘\t370065\n我做夫人那些年\t370066\n赛德克巴莱\t370067\n许嘉璐\t370068\n梁知\t370069\n塔城市人民政府\t370070\n福大\t370071\nFBO\t370072\nyeezy350v2\t370073\n深圳市地铁集团有限公司\t370074\n小盾\t370075\n土地招拍挂\t370076\n核验单\t370077\n徐光春\t370078\n抓牢\t370079\n天洋\t370080\n11.0.10\t370081\n┇RM\t370082\n惠威\t370083\n古寺\t370084\n滦平县\t370085\n同居\t370086\nNOTE3\t370087\n佛缘\t370088\n气泡水\t370089\n暗杠\t370090\n放克\t370091\n技工招聘网\t370092\nVertus\t370093\nctu\t370094\n安狐\t370095\n受挫\t370096\nsolver\t370097\n深圳路\t370098\n环评工程师考试\t370099\n环境权\t370100\n人美版小学\t370101\n赤井美月\t370102\n龙里县\t370103\n跟紧\t370104\n顶撞\t370105\n506号\t370106\nbrt\t370107\n价高者\t370108\nFreeMarker\t370109\n9日游\t370110\n张娅姝\t370111\n也\t370112\nMaplesoft\t370113\n重庆市公安局\t370114\n宁德站\t370115\n日语版\t370116\n山乌龟\t370117\nHERA\t370118\n搞上\t370119\n锦缎\t370120\n高度计\t370121\ncard\t370122\nnewbalance\t370123\n世方\t370124\n东京迷城吧\t370125\n演练\t370126\nMale\t370127\n梅花园\t370128\nClimb\t370129\n浮动率\t370130\n混沌之戒3吧\t370131\n版权所有\t370132\n锐版\t370133\n绒毛膜癌\t370134\n等压\t370135\n好好吃\t370136\n梵唱\t370137\n度时\t370138\n清太祖\t370139\n五年级语文下册\t370140\nHR们\t370141\n黯淡\t370142\nGreatWall\t370143\n长宁镇\t370144\n清东陵\t370145\n舟山市政府\t370146\ndesigne\t370147\n音名\t370148\nFace++\t370149\n罗利\t370150\n0.04%\t370151\nMOBI\t370152\n30088\t370153\nSelfridges\t370154\n微门户\t370155\nhxy\t370156\n李恩秀\t370157\nStirling\t370158\n活寡\t370159\n产粮\t370160\n香居\t370161\n王子龙\t370162\n洽洽瓜子\t370163\n十公斤\t370164\n公棚\t370165\n赵佗\t370166\n麻吉宝\t370167\n115.NET\t370168\n白水村\t370169\n张世平\t370170\n药价\t370171\n召唤符\t370172\niphone6\t370173\nMiuMiu\t370174\n0.1\t370175\n1-12个月\t370176\n5天游\t370177\ngonglue\t370178\n林少华\t370179\n威驰剑\t370180\n63手工网\t370181\nrecvmsg\t370182\nkayden\t370183\n秦假仙\t370184\ngecko\t370185\n创越\t370186\n第94\t370187\n果聊\t370188\n神武天龙八部\t370189\ndimensional\t370190\n己亥年\t370191\n丽声\t370192\n飞天翼龙\t370193\n新型农村社会养老保险\t370194\niphonex\t370195\n25摄氏度\t370196\nDazs\t370197\n纠音\t370198\n不计分\t370199\n淑女坊\t370200\n史艳文\t370201\n
TabItem\t370202\nGPU\t370203\nC16\t370204\n齐鲁财富网\t370205\n墙里\t370206\n工银瑞信基金|指数基金|债券基金|货币基金T+0|理财\t370207\nbfs\t370208\nincoterms\t370209\nmezco\t370210\n电e宝\t370211\n谢宇\t370212\nf2pool\t370213\n召唤阵\t370214\nbless\t370215\n二年\t370216\n终结者2Terminator\t370217\n/proc\t370218\n陕西省国家税务局\t370219\n省运\t370220\n不锈\t370221\nwubi\t370222\n新浪河北_新浪网\t370223\n智商\t370224\n沙头街\t370225\nSSIS\t370226\n上海学而思培优\t370227\n喷水器\t370228\n集体舞蹈\t370229\n刘景\t370230\n朋友局\t370231\n朴槿惠\t370232\n九鱼图\t370233\n宜秀区\t370234\n云菲菲\t370235\ncfree\t370236\nQvod\t370237\n寻乌\t370238\n荣登\t370239\nPic\t370240\nbrid\t370241\n扯皮\t370242\n故道\t370243\n共品\t370244\n硬照\t370245\n雷阵\t370246\n超焦距\t370247\n毛絮\t370248\n谢璐\t370249\n何美钿\t370250\n11gR2\t370251\n波多野結衣\t370252\nTake\t370253\n沙鱼\t370254\nktp\t370255\n债项\t370256\namateur\t370257\n7.25\t370258\n小时钟\t370259\n月系\t370260\ncow\t370261\n新疆维吾尔\t370262\n盅\t370263\n青岛九中\t370264\n精易论坛\t370265\n苍翼默示录\t370266\n动植\t370267\nBuggy\t370268\n中国电子科技集团公司第二十八研究所\t370269\n相电\t370270\nGifts\t370271\n最大量\t370272\n管理咨询师\t370273\n犁田\t370274\n小梅沙\t370275\n西行纪\t370276\n羞耻\t370277\n财企\t370278\n乐居网\t370279\n提示符\t370280\n零陵新闻网\t370281\n记录值\t370282\n南开医院\t370283\ninflating\t370284\n侯氏制碱法\t370285\n中海天钻\t370286\n电焊网\t370287\n超酷\t370288\n纪伊\t370289\nskel\t370290\n12&#160\t370291\nv6.6\t370292\n杨基\t370293\n钢围檩\t370294\n假死机\t370295\n欧规\t370296\n珠儿\t370297\n连连看\t370298\nvuecli\t370299\n粘稠状\t370300\n重庆晨报\t370301\n6联\t370302\n吴均\t370303\n太玄经\t370304\n车神\t370305\n爱不释\t370306\n984G.COM\t370307\n兼容并蓄\t370308\n何文\t370309\n子嗣\t370310\n盐酸西替利嗪滴剂\t370311\n上海市司法局\t370312\n底框\t370313\n1v7\t370314\n红橡树\t370315\n区政府办公室\t370316\n心弦\t370317\n春美\t370318\nc45\t370319\n山西电视台\t370320\n内蒙古移动\t370321\n改性聚苯板\t370322\n支撑剂\t370323\n教视网\t370324\nACI\t370325\n通知函\t370326\n30天前\t370327\n代币卷\t370328\n波段操作\t370329\n试论\t370330\n隐藏域\t370331\njkf\t370332\npdsoft\t370333\n87级\t370334\n路路去网\t370335\n玉米杆\t370336\n7名\t370337\n中宏保险\t370338\nbbs.classic023.com\t370339\n光轴\t370340\n12月7日\t370341\n霁蓝釉\t370342\n一袭\t370343\npoi浏览器\t370344\
n开撕\t370345\n任務\t370346\n宝沃BX7\t370347\n科研项目\t370348\n临翔区\t370349\n纯真年代\t370350\n星期一\t370351\n无边\t370352\n技术股\t370353\n小三门\t370354\n侠饭\t370355\nPPT课\t370356\n出出\t370357\nHIKVISION\t370358\n三星S7\t370359\n000917\t370360\n千山独行\t370361\n先王\t370362\n网师园\t370363\nsense\t370364\n十二国记\t370365\n免税易购\t370366\n作战室\t370367\n惠民路\t370368\n船营区\t370369\n安城\t370370\n良渚遗址\t370371\n成濑心美\t370372\n结冰期\t370373\n远程端\t370374\n北汽\t370375\n自然段\t370376\nMMA\t370377\n冷凝\t370378\n华融大厦\t370379\n势力\t370380\n西藏军区\t370381\n338元\t370382\n大兵\t370383\ntbr\t370384\n麦克米伦\t370385\n3128\t370386\n浙江电商网\t370387\nMercier\t370388\n金科廊桥水乡\t370389\n安卓浏览器\t370390\n伊本\t370391\n箱式\t370392\n逍遥王\t370393\n交通运输部长江航务管理局\t370394\n晴天小狗2\t370395\n尤菲如月\t370396\n9200元\t370397\n活跃期\t370398\n李木子\t370399\n金悦\t370400\nyaffs\t370401\n手工活\t370402\njus\t370403\n黄骨鱼\t370404\n3匹\t370405\n蔡李佛\t370406\n历城\t370407\n汕尾市人民政府\t370408\n南北方\t370409\n金哲\t370410\n东风街\t370411\n_吨\t370412\n总库\t370413\n崴\t370414\n000858\t370415\n衙斋\t370416\nC++STL\t370417\nSAT考试\t370418\n当地\t370419\n办理手续\t370420\n墙柱\t370421\n油桃\t370422\n2009-2010年\t370423\n捞人\t370424\nodoo10\t370425\nwww.78dm.net\t370426\n蔚蓝郡\t370427\n转本\t370428\n2014年10月1日\t370429\n2017—2018年\t370430\n马塞洛\t370431\n毎日\t370432\nrvest\t370433\n科技小制作大全\t370434\n41度\t370435\n疏勒\t370436\n弱项\t370437\n牙骨\t370438\n逼仓\t370439\n臧鸿飞\t370440\nwtp\t370441\njeb\t370442\n富拉尔基区\t370443\n天津南开区幼儿园\t370444\n上海市人民政府外事办公室\t370445\n黄志刚\t370446\n童星\t370447\n地球脉动2\t370448\n计算机类\t370449\n傲世三国\t370450\negr\t370451\ntires\t370452\nzing\t370453\nauditor\t370454\n5月25日\t370455\n山坪塘\t370456\nprogressdialog\t370457\n托运单\t370458\n2018-04-05\t370459\nclojure\t370460\nSMA\t370461\n奥特曼格斗进化3\t370462\nSohu\t370463\n斗者\t370464\n烙印\t370465\n镍片\t370466\nblock\t370467\n躯干\t370468\n20150708\t370469\n三辊\t370470\nRich\t370471\n索普\t370472\n蝙蝠袖\t370473\n明源地产研究网\t370474\n按扣\t370475\n倩碧\t370476\n亚马特\t370477\n彭强\t370478\n君悦酒店\t370479\ngpr\t370480\n外门窗\t370481\n莲花二村\t370482\n第八期\t370483\n航海王强者之路_九游论坛\t370484\n萧\t370485\n两轮\t370486\n入库\t370487\n
权游\t370488\n750m\t370489\n信道\t370490\n侵略性\t370491\nmiku\t370492\n欧普特\t370493\ncompound\t370494\n1.0.16\t370495\n克里斯多夫\t370496\n期权论坛\t370497\n流动资金贷款管理暂行办法\t370498\n淡水河\t370499\n20161219\t370500\nacdsee9.0\t370501\n精神病\t370502\nFetch\t370503\n磨粉机\t370504\n蓝苹果\t370505\nSSS\t370506\nC语言编译器\t370507\nloro\t370508\n泵管\t370509\n京东云\t370510\n猪扒包\t370511\nstuff\t370512\n步步高超市\t370513\n正风\t370514\n郭天祥\t370515\n五猖\t370516\nmpi\t370517\n外宾\t370518\n天津市建委\t370519\n干饭\t370520\n侧位片\t370521\n东兴\t370522\n立功\t370523\n5230\t370524\n碱性\t370525\n不适感\t370526\n新液\t370527\n滑县\t370528\n沈杜\t370529\n唐氏综合征\t370530\n拓扑绝缘体\t370531\n分板机\t370532\n生肖\t370533\n司马朗\t370534\n长歌门吧\t370535\n8210\t370536\n毒枭\t370537\n蹄\t370538\n唐冶\t370539\n0354\t370540\ncdzk\t370541\n得劲\t370542\nv2.1.3\t370543\n如天\t370544\n包带\t370545\n支持者\t370546\nORA-00604\t370547\n1825\t370548\n松白路\t370549\n榨油\t370550\n反恐特警组\t370551\n老大们\t370552\n正隆\t370553\nPoloMeeting\t370554\n化路\t370555\n优化版\t370556\n三菱plc编程\t370557\n758所\t370558\n一个大一个\t370559\n埋尸\t370560\nzixue\t370561\n气温_114天气网\t370562\n握感\t370563\n超过15年\t370564\nelapsed\t370565\n詹妮弗·康纳利\t370566\n起点学院\t370567\nInt64\t370568\nEASE\t370569\n禁售品\t370570\nWww.114La.Com\t370571\n渭南北\t370572\n营业利润率\t370573\n自攻螺套\t370574\n刘慧芳\t370575\n技术型\t370576\n泽信\t370577\n祖娅\t370578\nInfluxdb\t370579\n屋基\t370580\n伦桑\t370581\nDiv层\t370582\n600522\t370583\n调味酱\t370584\n好男人\t370585\n百强\t370586\n恐怖黎明吧\t370587\n马雅舒\t370588\n神衣\t370589\nD8\t370590\nexchange\t370591\n上海人才中心\t370592\nLM358\t370593\n先令\t370594\n手钻\t370595\n广东中旅\t370596\n戈薇\t370597\n襄阳四中\t370598\n国债回购\t370599\n水油\t370600\n598元\t370601\n诉讼服务网\t370602\n东港市\t370603\n2485\t370604\n党员\t370605\n硅酸锆\t370606\n2018046期\t370607\nactuarial\t370608\n五一3天\t370609\n1899\t370610\n史人\t370611\n高足\t370612\n冻干人\t370613\n法治在线\t370614\n两化\t370615\n百余家\t370616\n雷神2\t370617\nqiehuan\t370618\nDIY极客营论坛\t370619\n首犯\t370620\n刘玮\t370621\nelk\t370622\n舟曲\t370623\n传众征信\t370624\nVar\t370625\n腹轮\t370626\n小郎君\t370627\n全州县人民政府\t370628\n唔\t370629\n雨寒\t370630\n两台\t370631\n朗博\t3
70632\n注册城市规划师\t370633\n崂山路\t370634\n超净工作台\t370635\n纳美\t370636\n酱香酒\t370637\n爱图\t370638\nDatagridView\t370639\n微乐\t370640\n市企\t370641\n冯秀芳\t370642\nNBA常规赛联盟\t370643\n3伏\t370644\n安庆一中\t370645\n免税销售额\t370646\n深圳市港务管理局\t370647\n热血传奇\t370648\n白鹤芋\t370649\n友数\t370650\n深圳律师事务所\t370651\n神目\t370652\nsou\t370653\nbrawl\t370654\n7口\t370655\n生物质颗粒\t370656\nミチル\t370657\n150个\t370658\nSetTimer\t370659\n买家版\t370660\n中纤板\t370661\nRequestParam\t370662\n假面\t370663\n谈话稿\t370664\n中华轴承网\t370665\n赋图\t370666\n嘟嘟声\t370667\n阿远\t370668\n克拉齐亚\t370669\n农耕文化园\t370670\n刻下\t370671\n痉挛性斜颈\t370672\n长辛店\t370673\n杨冬\t370674\n酷睿2\t370675\nConcern\t370676\n2230\t370677\n绑缚\t370678\no0\t370679\n宦海\t370680\n同济大学附属东方医院\t370681\n张文斌\t370682\n双鬓\t370683\nREAPER\t370684\n其父\t370685\nCIP\t370686\n王庄\t370687\n荒古遗尘\t370688\n非同寻常\t370689\n方婷\t370690\n自動\t370691\n苏文\t370692\nleases\t370693\nyupeng\t370694\n中筋面粉\t370695\n盘子女人坊\t370696\nlols\t370697\n迫于\t370698\n拍卖史\t370699\n谭浩强\t370700\n马会东\t370701\n71家\t370702\n行邮\t370703\n金碧\t370704\n三皇寨\t370705\n张曦文\t370706\n别担心\t370707\n委托书\t370708\n秋涛路\t370709\n突飞猛进\t370710\nHusband\t370711\n树中\t370712\n哈工大机器人集团\t370713\n山炮\t370714\n中国书店\t370715\n1亿美元\t370716\nWIN7_\t370717\n探听\t370718\n172.17.255.255\t370719\n入额\t370720\n袁毅\t370721\n中国平安人寿\t370722\n两周内\t370723\ndescriptions\t370724\npacemaker\t370725\nk1\t370726\n微软资讯_系统粉\t370727\n壳程\t370728\n贝壳\t370729\nonselect\t370730\nN3150\t370731\n何为\t370732\n步声\t370733\n陈德林\t370734\n5200_\t370735\n一无是处\t370736\n邵武市\t370737\n摆谱\t370738\n专递\t370739\n轻重\t370740\nEstee\t370741\n亡国奴\t370742\n练习生\t370743\n二乙\t370744\nTvb\t370745\n10.3.1\t370746\n一德\t370747\n肠炎\t370748\nStern\t370749\niOS-Coding\t370750\n潘向东\t370751\nhollister\t370752\nAsics\t370753\nQQ炫舞手游\t370754\nb150\t370755\n中国化工集团\t370756\n看见了\t370757\n西田\t370758\n水清木华研究中心\t370759\n监\t370760\n西部金融中心\t370761\n赵华\t370762\n抽子\t370763\n锦江学院\t370764\n欧奇\t370765\n足够\t370766\n电玩男\t370767\n大书\t370768\n校样\t370769\n雅克科技\t370770\nlofree\t370771\n复兴之路\t370772\n六性\t370773\n何塞\t370774\n香山寺\t370775\
nweston\t370776\nstevens\t370777\n王慧文\t370778\nDeformation\t370779\n仙农\t370780\n定位分析\t370781\nPSE\t370782\n大发现\t370783\n尸语\t370784\n入赘\t370785\n怪物猎人p2g\t370786\n淘课网\t370787\n金蛙\t370788\n大华路\t370789\n致命狙击\t370790\n独孤皇后\t370791\nbcs\t370792\nMusic\t370793\n丝绸之路国际博览会\t370794\n路障机\t370795\n北原多香子\t370796\n练塘\t370797\nnic\t370798\n冬幕\t370799\n标上\t370800\n妖皇\t370801\n爱情史\t370802\n李幺傻\t370803\n三国志11威\t370804\n最佳女\t370805\n不怕冷\t370806\n收稿\t370807\n如龙极2\t370808\nu\t370809\n朔方\t370810\n1.35\t370811\nunexpected\t370812\n祥和花园\t370813\n天天撸\t370814\nwage\t370815\nctg\t370816\nKimberley\t370817\nthchs30\t370818\n作战服\t370819\njinfu\t370820\n潘姓\t370821\n妇代会\t370822\nHOP\t370823\n全国组织机构代码管理中心\t370824\nLookMyPC\t370825\n高德地图js\t370826\n调味剂\t370827\n震板\t370828\nVV\t370829\ncitra模拟器\t370830\n龙卫\t370831\nStepmom\t370832\n鹅掌\t370833\n变变\t370834\n仁美圆\t370835\n博采\t370836\n喜气洋洋\t370837\n出境游_酒店\t370838\nstructured\t370839\n养兵\t370840\n乙醛\t370841\n7.5.1\t370842\n通州法院\t370843\n152家\t370844\n全表\t370845\n王驰\t370846\njqplot\t370847\naka\t370848\nignite\t370849\n阮佳\t370850\n山梨县\t370851\n经济处\t370852\nProcedural\t370853\n抓起\t370854\n混合性\t370855\n陈列\t370856\n农林业\t370857\n大唐芙蓉园\t370858\nopr\t370859\n回播\t370860\n白重恩\t370861\n拗九节\t370862\n男左女\t370863\n桂庙新村\t370864\n婚令\t370865\n启功\t370866\n异常生物见闻录\t370867\n2【\t370868\n0315\t370869\n咪咪网\t370870\ndaquan\t370871\n接反\t370872\nramos\t370873\n万合\t370874\n2013年10月\t370875\n巴图\t370876\n120页\t370877\nGPS\t370878\n大舍\t370879\nopenfire\t370880\n条杠\t370881\n重庆大学计算机学院\t370882\n明史\t370883\n单向离合器\t370884\n舰队collection\t370885\n三生不负三世\t370886\n赠金\t370887\n罗山县\t370888\nRedis\t370889\n平台期\t370890\ndyld\t370891\n道指\t370892\n冻肉\t370893\n练册\t370894\n人教部编版小学\t370895\n卡瑞奇\t370896\nBALI\t370897\n建设银行公司\t370898\ndnf剑魔吧_\t370899\n浣花溪公园\t370900\n蜂王浆冻干粉\t370901\n病种\t370902\n奇门\t370903\n东关小学\t370904\n初见成效\t370905\n多粘菌素\t370906\n导师\t370907\n水晶头\t370908\n消长\t370909\n阳间\t370910\n血小板增多症\t370911\n西安人才网\t370912\nfloral\t370913\n雷贝拉唑钠肠溶片\t370914\n25乘\t370915\n二段式\t370916\nTop5\t370917\n宏达矿业\
t370918\n所作\t370919\nAcer宏碁\t370920\n民间小吃\t370921\n出奇\t370922\n侧根\t370923\n9x9\t370924\n中国环境报\t370925\n免赔\t370926\n开海\t370927\nainipa\t370928\n德阳银行\t370929\n郑州旅行社\t370930\n90张\t370931\nRadon\t370932\n精神分裂吧\t370933\noracle函数\t370934\n建功\t370935\n常熟港\t370936\n县市\t370937\naustin\t370938\n8905\t370939\nlyric\t370940\nesprit\t370941\n盐城东台\t370942\nsafeken\t370943\n国网能源研究院\t370944\nBLOCK\t370945\n死亡爱丽丝\t370946\nTranscend\t370947\n督主\t370948\n甄洛\t370949\n28纳米\t370950\n害你\t370951\n聋哑\t370952\nExcalibur\t370953\n迎驾\t370954\n26.9\t370955\nceng\t370956\nMP4格式\t370957\n离岸快车\t370958\n2017年4月23日\t370959\n白水涧\t370960\n方桩\t370961\n林锐\t370962\n广九铁路\t370963\n十八讲\t370964\n润肤乳\t370965\n720P|1080P高清\t370966\n辉丰股份\t370967\n砾石\t370968\n真实用\t370969\n于蓝\t370970\nELSE\t370971\ntomtom\t370972\n烯丙基\t370973\n佛山市妇幼保健院\t370974\n万古天宗\t370975\n我拉网\t370976\n迹部景吾\t370977\n死神\t370978\n厨具\t370979\n酒桶\t370980\n顶碗少年\t370981\nvboxmanage\t370982\nPremier\t370983\n水规\t370984\nfhr\t370985\nm268\t370986\n丽质\t370987\n刘彦军\t370988\n和利时\t370989\n厦大附中\t370990\n若虫\t370991\n无语伦比\t370992\n旧疾\t370993\n知识竞赛题\t370994\n天涯明月刀天香\t370995\n无锡汽车客运站\t370996\n针织品\t370997\nEstates\t370998\n国粹\t370999\n脱水器\t371000\n普洛斯物流园\t371001\n盒身\t371002\n龙珠斗士z\t371003\nblue\t371004\n260平\t371005\n凤凰县人民政府\t371006\n胜利公园\t371007\nFurla\t371008\n53分钟\t371009\n愁眉\t371010\n三星a8\t371011\n李贻煌\t371012\n精轧螺纹钢\t371013\n重庆市渝北区人民法院\t371014\n说来着\t371015\n解封\t371016\n中直\t371017\n千卡\t371018\n小苍\t371019\nstiff\t371020\n系综\t371021\n300156\t371022\n蔡宗强\t371023\nr4\t371024\n影音先锋av资源_先锋撸撸_影音先锋资源_先锋影音看片网站_撸撸管\t371025\n诬陷\t371026\n转世投胎\t371027\nvolks\t371028\n出壳\t371029\n教育厅\t371030\n678盘\t371031\n耿鬼\t371032\n卷笔\t371033\n免税商品\t371034\n王淑芬\t371035\n光队\t371036\n尼莫地平\t371037\n外海量\t371038\n流砂\t371039\nrecall\t371040\n祼身\t371041\n甘夫人\t371042\n头版头条\t371043\n酷堂\t371044\n分布函数\t371045\n马桶垫\t371046\nfactories\t371047\n孟姜女\t371048\n小鞋\t371049\n信阳市\t371050\n转速传感器\t371051\n何所\t371052\n微商版\t371053\n诸事不宜\t371054\n兔牙\t371055\n5kw\t371056\n没用到\t371057\nhaitao\t371058\n霸王\t371059\n
压坏\t371060\n石门二路\t371061\n化工网\t371062\n中国移动政企分公司\t371063\n倚天剑屠龙刀\t371064\n僵尸王\t371065\n270g\t371066\n南宁新闻网\t371067\n咨文\t371068\n宏地\t371069\nCDO\t371070\n绝心\t371071\n诸星\t371072\n北京市社保局\t371073\n拨乱\t371074\n连接片\t371075\n囧哥\t371076\n青岛市中级人民法院\t371077\n尸妖\t371078\n世玠\t371079\n高晓\t371080\nshf\t371081\nOpenCL\t371082\nFRM\t371083\n南源\t371084\n法刑\t371085\n牛毛草\t371086\nzip4j\t371087\n我的爱车\t371088\n南京大学生命科学学院\t371089\n李维安\t371090\n拆迁安置房\t371091\n甲级\t371092\n昂首\t371093\natk\t371094\n保证率\t371095\n云阅卷\t371096\n莱芜市政府\t371097\nfak\t371098\nHD-720P-MKV\t371099\n学姐\t371100\n娘娘腔\t371101\n百香果\t371102\nfiring\t371103\n盘管式\t371104\nuvc\t371105\n埃辛诺斯\t371106\n钜惠\t371107\nCancellation\t371108\nNodeManager\t371109\n南京市工商行政管理局\t371110\n武汉中央商务区\t371111\n罗切斯特\t371112\n老傅\t371113\n摘编\t371114\n邪物\t371115\n铜离子\t371116\n300383\t371117\n上海学大教育\t371118\n滴滴代驾\t371119\nqlv格式转换mp4\t371120\n纤维素分解菌\t371121\nSpeech\t371122\n内江\t371123\n相匹配\t371124\n载动\t371125\n20170326\t371126\nsx4\t371127\n薛蛮子\t371128\n吕建\t371129\n消泡剂\t371130\n澳门威尼斯人\t371131\n称述\t371132\ngpedit\t371133\n永宁县\t371134\n7&#160\t371135\nfilmora\t371136\n节食\t371137\n自变量\t371138\nCoated\t371139\n中国社会科学出版社\t371140\nOliveKong\t371141\n品茶\t371142\nRAD\t371143\n主检\t371144\n放入\t371145\n王俊峰\t371146\nbane\t371147\nbail\t371148\n3570k\t371149\n零缺陷\t371150\n佬牛\t371151\ncomparison\t371152\nstudio2013\t371153\n杨紫\t371154\n赛普健身学院\t371155\n秒速五厘米\t371156\n入党积极分子考察登记表\t371157\nPS2\t371158\n美猫\t371159\n细胞分裂吧\t371160\n彳\t371161\nMIX\t371162\n三河市\t371163\n猿飞\t371164\n死皮\t371165\n有车一族汽车网\t371166\n产前诊断\t371167\n怪味\t371168\n濑\t371169\n威刚XPG\t371170\n第四季\t371171\ndrastic\t371172\n青龙湖公园\t371173\n正大时代华庭\t371174\n粤全国政协\t371175\n象湖壹号\t371176\nF1.8\t371177\n生成库\t371178\n长戟大兜虫\t371179\n应知应\t371180\n罪母\t371181\n更喜欢\t371182\n新疆维吾尔自治区政府\t371183\nraz\t371184\naccdb\t371185\n大舞\t371186\n十二篇\t371187\nbart\t371188\n我去\t371189\n克力架\t371190\n湖北省沙市中学\t371191\nhilbert\t371192\n海安就业网\t371193\n那版\t371194\n卡蒂狗\t371195\n红菜\t371196\n阳澄湖半岛\t371197\n北斗\t371198\n上午九点\t371199\n悬臂浇筑法\t37120
0\n泔水\t371201\n百搭款\t371202\n毒贩\t371203\nmenu\t371204\ntheos\t371205\nSexInSexBoard色中色论坛\t371206\n早知道\t371207\n国美服务中心\t371208\n虎丘区\t371209\n出轨男\t371210\n少师\t371211\n场外期权\t371212\n093\t371213\n夏树\t371214\n华润通\t371215\nALfheim\t371216\n当事\t371217\n哈根\t371218\njinrong\t371219\n文彩元\t371220\n朝阳洲\t371221\n轻微\t371222\n1.0.0.jar\t371223\n轻轻松\t371224\n长沙市公安局交通警察支队\t371225\n乐途客\t371226\n李金\t371227\n塔影蜃楼\t371228\n孟瑶\t371229\n复兴医院\t371230\nwriteline\t371231\n工商\t371232\n10ml\t371233\n特邀\t371234\n7.7亿\t371235\nEXT\t371236\n受虐\t371237\n1.001\t371238\nUNDER\t371239\n雅礼中学\t371240\n微软公司\t371241\n小小彬\t371242\n西安高级中学\t371243\nInitial\t371244\n多维娱乐网\t371245\n凤凰花开的路口\t371246\nJackie\t371247\n特采\t371248\n多牛\t371249\n乔庄\t371250\n206号\t371251\n育成\t371252\n长沟\t371253\n/子\t371254\n宁波电视台\t371255\n7477\t371256\nXplay5\t371257\nMatlab2016a\t371258\n7星\t371259\n南桥镇\t371260\n苏州大学附属第一医院\t371261\ntj\t371262\n三吴\t371263\ne300l\t371264\n邮票\t371265\n金匾\t371266\n周郎\t371267\n义母散华\t371268\nX战警3\t371269\n心眼\t371270\nILCE-6000\t371271\n蓝光公园\t371272\n九灵元圣\t371273\n老年体协\t371274\n给水处理\t371275\n迷你币\t371276\n白麒麟卡\t371277\n试课\t371278\n漫漫漫画\t371279\n男娃\t371280\n骏铃\t371281\n下不为\t371282\nSMTP\t371283\nolga\t371284\n新垣东野圭吾\t371285\n新华门\t371286\n新火娱乐\t371287\n2幢\t371288\n5068动漫网\t371289\n事业编\t371290\nErgo\t371291\n伸缩缝\t371292\nComix\t371293\n水袋\t371294\nMember\t371295\nSalt\t371296\n样条\t371297\n与僧侣交合的色欲之夜\t371298\nCASS7.1\t371299\n中译语通\t371300\nVOCS\t371301\n躬耕\t371302\n夜总会\t371303\n预收\t371304\n中国化肥网\t371305\nNARUHO堂\t371306\n进出口贸易公司\t371307\nwww.2kejian.com\t371308\n荣耀畅玩4A\t371309\n1838\t371310\n旭日东升\t371311\n认定表\t371312\n计量经\t371313\n脾经\t371314\n睡睡\t371315\n人民币存款准备金率\t371316\n龙息\t371317\n乔\t371318\n苏浩\t371319\n2028年\t371320\nXMl\t371321\n楔状\t371322\n14.0\t371323\n宝莱坞机器人之恋\t371324\nBook2\t371325\n老三板\t371326\nTuple\t371327\n塑钢缠绕管\t371328\n重用性\t371329\nstick\t371330\n东方富海\t371331\n主席\t371332\n前一个星期\t371333\n安掌门\t371334\n谜\t371335\n炉霍县\t371336\nThese\t371337\n线性回归模型\t371338\ndifferentiate\t371339\nBLANC\t371340\n电脑板\t3
71341\novertime\t371342\n14.5\t371343\n第十一集\t371344\n保安局\t371345\n三个故事\t371346\n六天五夜\t371347\n蜡烛\t371348\n爽儿\t371349\n逐鹿中原\t371350\n马克思主义社会科学方法论\t371351\nix500\t371352\nwer\t371353\n岳塘\t371354\n蓄热式\t371355\nprobabilities\t371356\nUnit\t371357\nBittrex\t371358\n平衡式\t371359\n3碟\t371360\ncaution\t371361\n都兰\t371362\n艾瑞大道\t371363\n荒蛮\t371364\n生物统\t371365\n六记\t371366\n谥\t371367\n定径\t371368\n山阳县\t371369\n奥古\t371370\n老鼠嫁女\t371371\n挽联\t371372\n崔岷植\t371373\n楼兰网\t371374\n返退\t371375\nYandex浏览器\t371376\n四川省安全生产监督管理局\t371377\nD7100\t371378\nSLP\t371379\n听海\t371380\n勘验\t371381\nCAA\t371382\nershou\t371383\n中登网\t371384\n法证\t371385\n概算定额\t371386\n金标贝\t371387\nsnd\t371388\nGeospatial\t371389\n2.5.1\t371390\n楚留香新传\t371391\n1192\t371392\n天空一号\t371393\nluther\t371394\n创意者\t371395\n极太网\t371396\n固定式登车桥\t371397\n绵阳南山中学实验学校\t371398\n料表\t371399\n拼箱\t371400\n最厉害\t371401\n125T\t371402\n成府\t371403\nstunnel\t371404\n张兵\t371405\n关底\t371406\n潍坊大众网\t371407\n行列号\t371408\n慎进\t371409\n独岛\t371410\n大标\t371411\n原属\t371412\n2018年4月5号\t371413\n垃圾邮件\t371414\nflix\t371415\n窦贤康\t371416\n商场\t371417\nbitbucket\t371418\n凤凰树\t371419\nPaying\t371420\n大音\t371421\n悲观\t371422\n楠小楠\t371423\n字权\t371424\n金蝉\t371425\n目标位\t371426\n新玛特\t371427\n镇安\t371428\n赖宁\t371429\nredline\t371430\n7月30日\t371431\nfaucet\t371432\n急性脊髓炎\t371433\n具象\t371434\n李家杰\t371435\n蹭网\t371436\n拖拖\t371437\n寿材\t371438\nadreno\t371439\n诌\t371440\n2667\t371441\n爱乐\t371442\nHSP\t371443\nwodr\t371444\n外部性\t371445\n纹状体\t371446\n步步惊情\t371447\n戴森吸尘器\t371448\n芦名未帆\t371449\n327路\t371450\n湘江路\t371451\n河北医科大学第四医院\t371452\nsynthesis\t371453\n仪轨\t371454\n教育信息港\t371455\n顾村公园\t371456\n迷踪\t371457\nDIGITS\t371458\n泛米米\t371459\n你不知道的事\t371460\n25000公里\t371461\n4款\t371462\n砂场\t371463\n阿秋\t371464\n悦行\t371465\n代办理\t371466\n黄县\t371467\n起爆\t371468\nX260\t371469\n田姬\t371470\nxinde\t371471\n鼎和财产保险股份有限公司\t371472\n0.75%\t371473\n山2\t371474\n痊\t371475\n唐山市发展和改革委员会\t371476\nnothing\t371477\n伏龙\t371478\n为时不晚\t371479\n管系\t371480\n雪花女神龙\t371481\n安安阁\t371482\n闫军\t371483\n全发\t371484\n
中和剂\t371485\n遮体\t371486\n审核员\t371487\nExtjs\t371488\n大话西游2免费版\t371489\n疼\t371490\n柳芳\t371491\n3万美元\t371492\nF30\t371493\navada\t371494\n考证\t371495\nrefine\t371496\n宇文\t371497\nHII\t371498\n飓风营救2\t371499\n树脂工艺品\t371500\n95510\t371501\n叶橙\t371502\n南社\t371503\n有成\t371504\n20170605\t371505\n议制\t371506\n光武中兴\t371507\n房天下投资论坛\t371508\n游戏库\t371509\n三明北\t371510\nMila\t371511\n柳湘莲\t371512\ngd2\t371513\n花博会\t371514\n沮授\t371515\n金溢科技\t371516\n滴流盒\t371517\npdr\t371518\nundeclared\t371519\n远洋城\t371520\n硬广\t371521\n全效\t371522\n网上银行卡\t371523\n无限流\t371524\n色天香\t371525\n新华出版社\t371526\n毒杀\t371527\n枣红色\t371528\nedges\t371529\n联苯菊酯\t371530\nxtra\t371531\n次贷危机\t371532\n显山露水\t371533\n娜扎\t371534\nBangBang\t371535\n输完\t371536\n剪刀差\t371537\n兰心\t371538\n米胖旅游网\t371539\nflxx\t371540\n医务\t371541\n四季梅\t371542\n路易威登\t371543\n妊娠高血压\t371544\n抱团\t371545\n如斯夫\t371546\n憎水性\t371547\n脱实向虚\t371548\n捷科\t371549\n何灵\t371550\nGrub2\t371551\n万家丽路\t371552\nhyfi\t371553\n恒立\t371554\nPEI\t371555\n大碶街道\t371556\np7zip\t371557\nTeaser\t371558\n樟宜国际机场\t371559\n省审计厅\t371560\n区属\t371561\n马踏天下\t371562\n小沙\t371563\n警语\t371564\nscape\t371565\n85折\t371566\n黄页88机械网\t371567\n藏标网\t371568\n高达g世纪\t371569\n男演员\t371570\n三哭殿\t371571\n川崎病\t371572\n加利福尼亚大学洛杉矶分校\t371573\n型\t371574\nESO\t371575\n八一八\t371576\n衣罩\t371577\n一问百答\t371578\n蒙西\t371579\n微铺宝\t371580\n清蒸虾\t371581\n贺电\t371582\n750mm\t371583\n停车区\t371584\n盖座\t371585\n血契\t371586\n湄洲湾\t371587\n雇主责任险\t371588\n撞树\t371589\n88888\t371590\n狗头人与地下世界\t371591\n吕薇\t371592\n公证遗嘱\t371593\n过劳\t371594\n海银金融控股集团\t371595\n绝热板\t371596\n妙心\t371597\n奥罗\t371598\n德令哈市\t371599\nbef\t371600\n接线端\t371601\n角膜溃疡\t371602\n北京服装学院\t371603\n_梦三国\t371604\n邪灵\t371605\n做帐\t371606\n单霁翔\t371607\n安东尼\t371608\n半岛网\t371609\n简普科技\t371610\nsuit\t371611\n恒盛\t371612\nWONDER\t371613\n诱爱\t371614\n残血\t371615\n7.3.6\t371616\n发彩\t371617\n万江区\t371618\n北京行政副中心\t371619\nBangbus\t371620\n女款\t371621\n骑兵片\t371622\n华为移动\t371623\n爱美刻\t371624\nZr\t371625\n0.1mm\t371626\n木托\t371627\n已好\t371628\n濮阳市委\t371629\nGil\t371630\n中共广东省委\t371631
\n猫德\t371632\nmg3600\t371633\n显卡\t371634\n长城谣\t371635\n总会\t371636\n健脑补肾丸\t371637\n大风扇\t371638\n退保\t371639\n湛江西站\t371640\n1898年\t371641\n42个\t371642\n常州中医院\t371643\n塘沽开发区\t371644\n电解板\t371645\n遮阳棚\t371646\n刘秀\t371647\n延续\t371648\n膨化食品\t371649\n幽篁\t371650\nRUN\t371651\n西安市第一中学\t371652\n新余四中\t371653\nsophia\t371654\n白汁\t371655\n广钢新城\t371656\n食药\t371657\n张杨北路\t371658\n星美\t371659\n南诏风情岛\t371660\n中丞\t371661\n亡灵节\t371662\n莫耐威西亚\t371663\n9.cn\t371664\n水井坊\t371665\n儿乐队\t371666\n书角\t371667\n12333\t371668\n普金\t371669\n[實況\t371670\n豆泡\t371671\n后台程序\t371672\n洪三元\t371673\n脂肪乳\t371674\n点歌台\t371675\n凉皮机\t371676\n离别时\t371677\n高港区\t371678\n盗墓小说\t371679\n肉瘤\t371680\n功夫熊猫2\t371681\nGP\t371682\n宋伟\t371683\n蘭\t371684\nBlood\t371685\n蓝胡子\t371686\n侯汉廷\t371687\n米胖\t371688\n总裁独宠:亲亲我的小宝贝\t371689\n广东广州广东联通\t371690\n1.14.2\t371691\n二十_\t371692\nEXECUTIVE\t371693\n水晶之恋\t371694\n袁斌\t371695\nWindowsServer\t371696\n武汉地铁2号线\t371697\n鉴定师\t371698\n帽子男\t371699\n李晓鹏\t371700\n泽稷\t371701\n江苏舜天\t371702\n精益六西格玛\t371703\n外国人工作许可证\t371704\n爱党\t371705\n沃得\t371706\n温岭火车站\t371707\n五度\t371708\n颈平焊法兰\t371709\n江城县\t371710\nprocedure\t371711\n植皮手术\t371712\n友阿奥特莱斯\t371713\nRV减速机\t371714\n畸形\t371715\n鸡胸脯肉\t371716\n1.6.0\t371717\nYJJ\t371718\n直取\t371719\n総\t371720\n华西坝\t371721\n电商宝\t371722\n手灯\t371723\nmsgpack\t371724\n流动站\t371725\n菇凉们\t371726\n金博洋\t371727\n胡定欣\t371728\nMegatrends\t371729\n欧阳靖\t371730\nhaveto\t371731\n龙游湖\t371732\n谱峰\t371733\n唯宝\t371734\n163小说网\t371735\ngba\t371736\n护士鞋\t371737\n革命之路\t371738\n扳回\t371739\n朱先生\t371740\nMature\t371741\n误解\t371742\n加长臂\t371743\n异构体\t371744\n上海家乐福\t371745\n飞天梦\t371746\n高阶函数\t371747\n31%\t371748\n58万元\t371749\n女神联盟\t371750\n钩子函数\t371751\n7577\t371752\n露天电影\t371753\n国歌\t371754\n一两\t371755\nAcme\t371756\n小微快贷\t371757\n木犀草素\t371758\nRATE\t371759\nSDL2\t371760\n下坪\t371761\n南珠\t371762\nprevented\t371763\n貌似\t371764\n粟丘疹\t371765\ncpu卡\t371766\n错位\t371767\n北大光华管理学院\t371768\n色阶\t371769\n构造物\t371770\n唱K\t371771\nNBA吧\t371772\n大报\t371773\n优优\t371774\n电子料\t371775\n金心\t371776\n鄞县大道\t371
777\n4平米\t371778\n忠勇\t371779\n宝安机场\t371780\n家仆\t371781\n光谷金融港\t371782\n未来城\t371783\n无主之地\t371784\n新闻联播天气预报\t371785\n郭为\t371786\nwww.69zw.com/modules/article/txtarticle\t371787\n铁岭市人民政府\t371788\n容积式\t371789\n好奇怪\t371790\n阿飞正传\t371791\n269元\t371792\n推翻\t371793\n恭王府\t371794\n吉尔尼斯\t371795\nlive2d\t371796\n图信网\t371797\nnpx\t371798\n宁波市\t371799\n不可限量\t371800\ndiver\t371801\n兔区\t371802\n鲜橙\t371803\n南京大屠杀遇难同胞纪念馆\t371804\n危急\t371805\n长于\t371806\n台缘\t371807\nRag\t371808\n果期\t371809\n追回\t371810\nqq离线文件\t371811\n汪东\t371812\n隐框\t371813\n金山卫士\t371814\n瑞白\t371815\n风往\t371816\namazfit2\t371817\n至爱梵高\t371818\n临水\t371819\nv2.3\t371820\n孙教授\t371821\n人猿\t371822\n北矿科技\t371823\n特殊\t371824\n苹果7p\t371825\n纳音\t371826\nWANZ\t371827\n储物罐\t371828\n\\\\s+\t371829\n面团\t371830\n中天城市花园\t371831\n回收器\t371832\nrunners\t371833\n掟上今日子的备忘录\t371834\n点唱\t371835\n从而\t371836\n情长意绵\t371837\n2018.4.13\t371838\n趋\t371839\n溪头\t371840\n殷伟\t371841\n搜才网\t371842\n狮子座\t371843\n网定\t371844\n修炼\t371845\n24套\t371846\n总统套房\t371847\n疚\t371848\n楚卓\t371849\n白带黄\t371850\nscream\t371851\n两伊战争\t371852\n龙洞\t371853\nclustering\t371854\n哥们网\t371855\nsqlexception\t371856\n楼盘\t371857\nbootstrape\t371858\n联享\t371859\n锐特\t371860\n色液\t371861\n中衣\t371862\n过敏性结膜炎\t371863\nHairy\t371864\n移山\t371865\n西饼\t371866\n柴少鹏\t371867\n破产姐妹\t371868\n花体字转换器\t371869\n2018年04月24日\t371870\n日心\t371871\n定班\t371872\n铰接\t371873\n战船\t371874\n23句\t371875\n灰\t371876\nZUG\t371877\n久经\t371878\nElasticity\t371879\n章回体\t371880\ndnf帕拉丁\t371881\n2127\t371882\nAccepted\t371883\n制谱\t371884\n苏洛\t371885\n协议转让\t371886\n四品\t371887\n女人\t371888\n作文网\t371889\n重庆市旅游局\t371890\n手机阅读器\t371891\n晶硅\t371892\n陈虎\t371893\n蜻蜓点水\t371894\n集体主义\t371895\n学习型\t371896\n上一月\t371897\nHal\t371898\nnetwork\t371899\n掌阅\t371900\n座钟\t371901\n时通\t371902\nIMAP协议\t371903\n高富\t371904\n肇源\t371905\n傲游\t371906\n西线\t371907\n平键\t371908\n周灏\t371909\n慈鲷\t371910\n六合人家\t371911\ncroft\t371912\nFutaba\t371913\n杀戮都市:O\t371914\ndotaai\t371915\n不对称\t371916\n找有\t371917\nYin\t371918\n登记制\t371919\nid卡\t371920\n天津长征医院\
t371921\nBD1080p/\t371922\nfuzz\t371923\n刘文典\t371924\n省站\t371925\n泡法\t371926\nAirplane\t371927\n入渗\t371928\n北京鸟巢\t371929\n花体\t371930\n登场\t371931\n宁静\t371932\n出租方\t371933\nEA4500\t371934\n金泓凯旋城\t371935\n李楼\t371936\n长刀\t371937\n德智\t371938\nComeback\t371939\n旅馆业\t371940\n樊城\t371941\nMeghan\t371942\n张芳\t371943\n雅思7\t371944\n山西银监局\t371945\n镬\t371946\n建发珑璟湾\t371947\n滞涨\t371948\n头破血流\t371949\n神州专车\t371950\n寻呼机\t371951\n风电变流器\t371952\n钱花\t371953\n黑暗侵袭\t371954\n蜂投网\t371955\n火照\t371956\n9702\t371957\n陈喆\t371958\nP30\t371959\n43P\t371960\n老鼠药\t371961\nRefresh\t371962\n番茄派\t371963\n5第五章\t371964\n安徽省财政厅\t371965\n恩施\t371966\ndivider\t371967\n20160124\t371968\n碛口古镇\t371969\n上海中原地产\t371970\n葫芦兄弟\t371971\n整饬\t371972\n好消息\t371973\n盛泽招聘网\t371974\nLovely\t371975\npatran\t371976\n鸣笛\t371977\n张娜拉\t371978\n小雯\t371979\npchunter\t371980\n四七九\t371981\n嚎\t371982\n1257\t371983\n暖锅\t371984\n中国平安保险公司\t371985\n人教版小学英语四年级下册\t371986\nmtp\t371987\n裁剪机\t371988\ndrawstring\t371989\n长青街\t371990\n工蜂\t371991\n复方黄柏液涂剂\t371992\n暂估\t371993\n性价比\t371994\n徐强\t371995\n大浪淘沙\t371996\n阴晴\t371997\n3647\t371998\n第18个\t371999\n五六个月\t372000\n东进\t372001\nkeroro\t372002\n10几\t372003\n3701\t372004\n聚金\t372005\n连笔字转换器\t372006\n防蛀\t372007\n叶甲\t372008\n102年\t372009\n代理所\t372010\n中国建筑科学研究院\t372011\n15.5\t372012\n葡萄糖酸钙\t372013\n自动化学院\t372014\n萜类化合物\t372015\n诺思\t372016\n省亲\t372017\n罚则\t372018\n17万套\t372019\n和讯港股\t372020\n橡皮树\t372021\n继往\t372022\n腘窝\t372023\nindependent\t372024\n场景式\t372025\n裴珠泫\t372026\njewelcad\t372027\n花费\t372028\n父子情深\t372029\n逆天邪神\t372030\n秦岭路\t372031\n进口额\t372032\n芥沫\t372033\nYoung汨\t372034\nPersona5\t372035\n去耦\t372036\n施肥\t372037\n冷裱机\t372038\n谷峰\t372039\n长安北站\t372040\n敏实集团\t372041\n文体学\t372042\n城市版\t372043\n徐向前\t372044\n1346\t372045\n老麻抄手\t372046\nEmpire\t372047\n史册\t372048\n周莹\t372049\n6000万美元\t372050\n茶桔\t372051\nv1.03\t372052\n万钟\t372053\n显明\t372054\nErwin\t372055\nunlock\t372056\n雾状\t372057\n开衩\t372058\n发热量\t372059\n四甲\t372060\n内网路由器\t372061\n霸宠\t372062\n1800首\t372063\n常平汽车站\t372064\nSch\t372065\n
中油\t372066\n打老鼠\t372067\n电离方程式\t372068\n长安CX\t372069\n天寒地冻\t372070\n无待\t372071\nloves\t372072\n妇道\t372073\nframe\t372074\n西南联大\t372075\n超级病毒\t372076\n目力\t372077\n岛城\t372078\nSally\t372079\n保暖裤\t372080\n四气\t372081\n定于\t372082\nCBX\t372083\n藤子\t372084\n万平米\t372085\npanga\t372086\n古诗十九首\t372087\n打机\t372088\n商\t372089\n山东人民出版社\t372090\n各有所长\t372091\necli\t372092\n德瑞姆\t372093\n小新潮7000\t372094\n池玲文\t372095\n王导\t372096\n之下\t372097\n西单大悦城\t372098\n孙嘉朗\t372099\n武当山机场\t372100\n国测局\t372101\n乐至吧\t372102\n北京学院\t372103\n陈集\t372104\n张晖\t372105\n银城地产\t372106\n武汉奥数网\t372107\n对与错\t372108\n门静脉高压症\t372109\n莆仙\t372110\n五湖四海\t372111\n卡带机\t372112\n骨钉\t372113\n校花驱灵师\t372114\n崇山峻岭\t372115\n风色幻想3\t372116\n髋关节滑膜炎\t372117\n格子兮\t372118\n苦笑\t372119\nTinyPNG\t372120\n2万\t372121\n跳江\t372122\n国网新疆电力公司\t372123\n人座\t372124\n升频\t372125\nCOMMUNICATIONS\t372126\n郁郁葱葱\t372127\nき\t372128\n扎带\t372129\n边表\t372130\n南京市人才服务中心\t372131\nPinyin_8h-source\t372132\n大屯路东\t372133\n数分钟\t372134\n苍天哥\t372135\n少云\t372136\nWoodman\t372137\n首都体育学院\t372138\n纪昀\t372139\n粘聚力\t372140\n大河\t372141\n富锦市\t372142\n追加分\t372143\nNAS云论坛\t372144\n华润超市\t372145\n注册会计师全国统一考试\t372146\n长隆海洋王国\t372147\n扦\t372148\n80亿\t372149\n地球往事\t372150\n奇迹MU\t372151\n滨湖路\t372152\n2mol\t372153\n福建龙净环保股份有限公司\t372154\n方向感\t372155\nldif\t372156\n》\t372157\n小哇\t372158\n3.3.3\t372159\n肉沫\t372160\n亿城\t372161\n5A旅行社\t372162\n连带责任\t372163\nBenefits\t372164\n珑庭\t372165\n重力\t372166\n力破\t372167\nQlik\t372168\n南芬区\t372169\n谭雅战记\t372170\ndwx\t372171\nMicroservice\t372172\n足球队\t372173\n验收\t372174\nBIG5+GB\t372175\n蒜苗\t372176\nduanqs\t372177\n转义\t372178\n啤酒花\t372179\n皇后成长计划2\t372180\n文件浏览器\t372181\n4b\t372182\n广西壮族自治区物价局\t372183\n好多集\t372184\n瓷介\t372185\n富士XF\t372186\n礼包码\t372187\n猎天使魔女3\t372188\n惠州公司\t372189\n该书\t372190\n心有所属\t372191\n锯齿感\t372192\n在所不惜\t372193\n夜盲症\t372194\n宠妹\t372195\n史蒂文·斯皮尔伯格\t372196\n重庆市环保局\t372197\npercnt\t372198\n青春性\t372199\nAutoCA\t372200\nwindows/linux\t372201\n儿童片\t372202\n肖培东\t372203\nartoolkit\t372204\n丽江古城木府\t372205\nopenpyxl\t372206\n原始战记
\t372207\nBruno\t372208\nEMPORIS\t372209\nweblog\t372210\n天津117大厦\t372211\n餐室\t372212\n50公分\t372213\n远传\t372214\n太极八卦\t372215\nJobar\t372216\n马生\t372217\npetrel\t372218\nFrontPage\t372219\n天香园\t372220\n九九乘法\t372221\n广东省干部培训网络学院\t372222\n要火\t372223\n变形金刚g1\t372224\n三级域名\t372225\n雪铁龙富康\t372226\n筱田步美\t372227\n董骠\t372228\n_巴适票务网\t372229\n宣州\t372230\n金隅大成\t372231\n中国新闻出版网\t372232\npcat\t372233\n三水\t372234\n360手机吧\t372235\nTXL\t372236\n7e\t372237\n二级建筑\t372238\n陈欧\t372239\n85公斤\t372240\n曹庄\t372241\n地下铁\t372242\n转辙机\t372243\n艾尔斯\t372244\n交响情人梦\t372245\n德黑兰\t372246\nn4010\t372247\n比赛日\t372248\n三江县\t372249\n一点半点\t372250\n支用\t372251\n这一刻\t372252\n游学团\t372253\n爸爸的手\t372254\n米3\t372255\n晨辉\t372256\n银耳红枣汤\t372257\nWEB开发\t372258\n早见\t372259\n银亿集团\t372260\n版权局\t372261\n团综\t372262\n救活\t372263\n茄子快传\t372264\n种业信息网\t372265\n付现\t372266\n临行\t372267\n火锅丸子\t372268\n歉信\t372269\n喉咙\t372270\n滦南县\t372271\n美式家具\t372272\n勇者斗恶龙7\t372273\nNIKE\t372274\n芥末留学\t372275\nkeyerror\t372276\n星梦缘\t372277\nN9\t372278\nSkyscanner\t372279\n张弦\t372280\n短裙\t372281\n早上五点\t372282\n会好转\t372283\n精准性\t372284\nAdapter\t372285\nconver\t372286\n低切\t372287\n咖啡王子一号店\t372288\n青霉素类\t372289\nTaka\t372290\n大佬\t372291\n采掘业\t372292\nlc76\t372293\n筒车\t372294\n江苏省泰州中学\t372295\n1199\t372296\n非开挖顶管\t372297\nCYS\t372298\n气温\t372299\nFontAwesome\t372300\n0W\t372301\n柔性线路板\t372302\n19cm\t372303\nlof\t372304\n弯锚\t372305\n汉诺威大学\t372306\nAZ\t372307\n赵锋\t372308\n李春葆\t372309\nCNode\t372310\n思泉\t372311\nunveils\t372312\n掌家\t372313\n自驾行\t372314\n签名章\t372315\n阜外医院\t372316\nfsck\t372317\n多能干细胞\t372318\n插件化\t372319\n弯刀\t372320\n第147章\t372321\nbd720p\t372322\n女照\t372323\n滤瓶\t372324\n防爆板\t372325\n店招\t372326\n云哥\t372327\nアズ\t372328\n兰若静云\t372329\n1月23日\t372330\n张海涛\t372331\nGroupe\t372332\n发电厂\t372333\n海淀外国语学校\t372334\n梅花垫\t372335\nzgg\t372336\n武侯大道\t372337\n地球知识局\t372338\n社版\t372339\n小小的船\t372340\n缆式\t372341\n级级\t372342\n包合\t372343\n嬢\t372344\n智子\t372345\n柯蒂斯音乐学院\t372346\n县经信局\t372347\n九代\t372348\n巡讲\t372349\n重庆市能源投资集团公司\t372350\n中空锚杆\t372351\
n上海万达\t372352\n千阳\t372353\n深圳巴士集团股份有限公司\t372354\n童庆炳\t372355\n兰亭奖\t372356\nスイッチ\t372357\n远虑\t372358\n领先地位\t372359\n更上一层楼\t372360\n米莉\t372361\n广州市真光中学\t372362\nT470s\t372363\nimproper\t372364\nWebKit\t372365\nLinus\t372366\nsopcast\t372367\n小小忍者吧\t372368\n菊麟\t372369\nwin7系统文件夹\t372370\n弥\t372371\n五红汤\t372372\n这一代\t372373\n20150827\t372374\n中国软件行业协会\t372375\n善存多维元素片\t372376\n公国\t372377\n218.88\t372378\n孙立平\t372379\n胆管癌\t372380\n抓子\t372381\nP100\t372382\n蜈支洲\t372383\n托蒂\t372384\n宇通集团\t372385\ncore2\t372386\n肇庆火车站\t372387\n乱跳\t372388\nRenewal\t372389\n切割片\t372390\n松下公司\t372391\n美美\t372392\n暗香\t372393\n载客量\t372394\n除菌\t372395\n萝莉片\t372396\n限幅\t372397\n京九高铁吧\t372398\n孔东梅\t372399\nKangaroo\t372400\n600079\t372401\n卡桑德拉\t372402\n硬糖\t372403\n科沃\t372404\n华清宫\t372405\n音乐会\t372406\n宝盛\t372407\nnghttp\t372408\n硫酸根\t372409\n15:00\t372410\n新疆众和\t372411\nN遍\t372412\n树懒\t372413\n渐江\t372414\n四点钟\t372415\n全成\t372416\njianyi\t372417\n南京下关区\t372418\n年幼\t372419\n110度\t372420\nDVDISO\t372421\n控制部\t372422\n25元\t372423\n坯子库\t372424\n我的娜塔莎\t372425\nformat\t372426\n抗敌\t372427\n天韵之声\t372428\n六哲\t372429\n浮机\t372430\nSIM卡\t372431\n收割\t372432\n数二\t372433\n会考\t372434\n叶利钦\t372435\n医学综述\t372436\nAlerts\t372437\n清教徒\t372438\nCoco\t372439\n方管\t372440\n菌落\t372441\n第47号\t372442\n钎\t372443\n殉难\t372444\n育华\t372445\n置顶\t372446\n床柜\t372447\n宁夏回族自治区党委\t372448\n园长证\t372449\n声响\t372450\n保藏\t372451\nshinyruo\t372452\n中职中专网\t372453\n乾包\t372454\n3价\t372455\n打眼\t372456\n秋霞圃\t372457\n异星觉醒\t372458\n根毛\t372459\n数据流\t372460\n县烟草专卖局\t372461\n32Bit\t372462\n我的世界求生之路\t372463\n盈浦街道\t372464\n第65期\t372465\n敏静\t372466\n碎星物语\t372467\n海峡\t372468\n非危险品\t372469\n投影矩阵\t372470\n市民宗局\t372471\n3dm版\t372472\n12分\t372473\n13年前\t372474\n学步鞋\t372475\n最亲的人\t372476\n阴离子聚丙烯酰胺\t372477\nSlicer\t372478\n湖北省大学\t372479\n雷腾\t372480\nclou\t372481\n辩识\t372482\noneone\t372483\n卢氏吧\t372484\n星盆\t372485\nVyprVPN\t372486\n第145章\t372487\n伯明翰\t372488\n口水\t372489\n返销\t372490\n龙水南路\t372491\n民族服装\t372492\n美甲贴\t372493\n一少\t372494\n滴虫性阴道炎\t372495\n泰平胜世┊新剧┊\t
372496\n20160614\t372497\n滕丽名\t372498\n硫粉\t372499\n独角鲸\t372500\n商商\t372501\n搜博网\t372502\n六九\t372503\n出国后\t372504\nnetwork527\t372505\n图们江\t372506\nimaginary\t372507\nps游戏\t372508\n酷易搜网\t372509\n穿越火线枪战王者吧\t372510\nitalic\t372511\n也敢\t372512\n句点\t372513\nMeets\t372514\nmicrostation\t372515\n2272\t372516\ndiaphragm\t372517\n冻伤\t372518\n棒材\t372519\n阴经\t372520\n裸居\t372521\n古典主义\t372522\n消声器\t372523\n牛骏峰\t372524\n元参\t372525\nreplication\t372526\nBAC\t372527\n肉团\t372528\n锦城\t372529\n安达\t372530\n玉苍山\t372531\n将使\t372532\nIONIC\t372533\n宝马\t372534\n构筑物\t372535\n日航\t372536\n房屋他项权证\t372537\n新能源资讯_中国新能源网\t372538\n仓存\t372539\nmp3播放器\t372540\n掘井\t372541\n35.5\t372542\n宽泛\t372543\n医品\t372544\n模拟银行\t372545\nquerystring\t372546\n有一手\t372547\n新格局\t372548\n史莱姆王\t372549\n契卡\t372550\n北京园博园\t372551\n深氧\t372552\n团结南路\t372553\nfirepro\t372554\n普瑞巴林胶囊\t372555\n红陶\t372556\nLittleMoon\t372557\n猫眼电影\t372558\n单翼\t372559\n建始\t372560\n1个工作日\t372561\n算例\t372562\n粤价\t372563\n山东农业信息网\t372564\n条纹\t372565\n制爆\t372566\n镇痛泵\t372567\n红毯秀\t372568\n西提猜\t372569\n南网总纲\t372570\n三兄弟\t372571\n鋒\t372572\n服装业\t372573\n材料人网\t372574\nblf\t372575\n张家口桥西区\t372576\n平潭机场\t372577\n果然\t372578\n胡东东\t372579\n我和小姐姐克拉拉\t372580\n照发\t372581\n歌之王子殿下\t372582\n仙侠世界\t372583\n枸杞子\t372584\nRespond\t372585\n气流\t372586\n造船业\t372587\nfevte\t372588\n盐酸羟胺\t372589\n泥石流\t372590\neconomic\t372591\n洗布\t372592\n护生堂\t372593\n宝阁\t372594\ncgss\t372595\n劳动路\t372596\nZebraDesigner\t372597\n平安健康\t372598\n读心\t372599\n显\t372600\n中国海关总署\t372601\n圣邦\t372602\n藕带\t372603\n无铅\t372604\n高压钠灯\t372605\n6分之一\t372606\n大陆版\t372607\n福州延安中学\t372608\n洞口\t372609\n泼皮\t372610\n参考版\t372611\n共和村\t372612\n朱颜\t372613\n城楼\t372614\n全面屏\t372615\n原包\t372616\nwin10c盘\t372617\n蔺靖\t372618\n双卡双待双通\t372619\n公函\t372620\n天选\t372621\n低压变频器\t372622\n高血压糖尿病\t372623\n大清早\t372624\nWindowsServer2003\t372625\n127.0.0.1\t372626\nquansezy\t372627\n水平面\t372628\n狂野飙车8\t372629\n莫过\t372630\n贫贱夫妻百事哀\t372631\n集训营\t372632\n巩义市\t372633\n神雕系列汇集\t372634\n银江股份有限公司\t372635\n东南\t372636\n占为己有\t372637\nDoma
ins\t372638\n张拉\t372639\npreserving\t372640\n许昌路\t372641\n江南银行\t372642\n高楼镇\t372643\n润生\t372644\n超省\t372645\n香奈儿\t372646\nvk\t372647\n大山\t372648\ndocu\t372649\nyy4480\t372650\n鸟声\t372651\n何以\t372652\n傅强\t372653\n崩山\t372654\naccepting\t372655\n透析机\t372656\nshot\t372657\n闹天宫\t372658\n丰创意广场\t372659\n自订\t372660\n如故\t372661\n山东石化\t372662\n刷机卡\t372663\n宁波消防\t372664\n武僧一龙\t372665\n东风志\t372666\n莫逆之交\t372667\n中华联合财产保险股份有限公司\t372668\n亚德客\t372669\n下影\t372670\n步步高音乐\t372671\n400平\t372672\n吃雪\t372673\n极爱\t372674\n万慧达\t372675\n金域华府\t372676\nbB\t372677\n新加坡港\t372678\n满院\t372679\n陈肇雄\t372680\n105.8\t372681\n上古卷轴5npc\t372682\n爆炸声\t372683\n进京\t372684\n雨眠\t372685\nsurfacepro5\t372686\n越线\t372687\n笕桥机场\t372688\nsubsidiary\t372689\namc\t372690\n选出\t372691\n保险资产管理公司\t372692\n广州国际金融城\t372693\n苏小妍\t372694\n第三十期\t372695\n架势\t372696\n毕氏\t372697\ncertification\t372698\n纸鹤\t372699\n中招国际招标有限公司\t372700\n2016年08月\t372701\nchildren\t372702\n佳爷\t372703\nedius8\t372704\n宗维洁\t372705\nT42\t372706\n史坦尼斯\t372707\n墨水屏\t372708\nOccurred\t372709\n聊天室\t372710\n肉牛\t372711\n不【\t372712\n天书中文\t372713\n漫展\t372714\n喉镜\t372715\n1.5寸\t372716\nPDF417\t372717\nWeekly\t372718\n南京大学化学化工学院\t372719\n星学院\t372720\n本杰明富兰克林\t372721\nprecipitation\t372722\n两天之内\t372723\n欧亚马\t372724\n嵌合\t372725\n全球箱包网\t372726\n友仔\t372727\n无根\t372728\nIC型号\t372729\n江苏省国家税务局\t372730\n罗技G402\t372731\n5318\t372732\n何山\t372733\nM227sdn\t372734\nINDUSTRIAL\t372735\n穿越火线枪战王者\t372736\nhtml5+\t372737\n宏中\t372738\n静处\t372739\n1600x1200\t372740\n芸能界\t372741\n月缘\t372742\n53个\t372743\nLantern\t372744\n宁波诺丁汉大学\t372745\nh110\t372746\n城里人\t372747\n首都师大\t372748\n计算机专业\t372749\n增值税减免税申报\t372750\n头程\t372751\n电子信息学院\t372752\n中国互联网金融协会\t372753\nmak\t372754\n七成\t372755\n老特拉福德\t372756\n宠物鸡\t372757\nremotefx\t372758\n地服\t372759\nEURUSD\t372760\n规中\t372761\n奇卡诺\t372762\n扁平\t372763\n赵坤坤\t372764\n泗洪县\t372765\n人源化\t372766\n高力国际\t372767\n游戏竞技小说-17k小说网\t372768\n双桥路\t372769\n索贝克\t372770\nbilan\t372771\nMuaRine\t372772\n价值性\t372773\n夜之魇\t372774\n尾夹\t372775\n证得\t372776\n大刀阔斧\
t372777\n屏柜\t372778\n处女率\t372779\n代达罗斯\t372780\nFelixZh\t372781\nnotpad\t372782\n德立\t372783\n玩具厂\t372784\n第一滴血\t372785\n第二月\t372786\n涨紧\t372787\n修真神武3\t372788\n挡不住的疯情\t372789\n异端\t372790\n出金\t372791\nMIDE\t372792\n三国之召唤\t372793\n炉水\t372794\nSongMeanings\t372795\n疯狂猜成语2\t372796\nPHY\t372797\nlayla\t372798\n侍臣\t372799\n切纸机\t372800\n使命\t372801\nNetac\t372802\n善假\t372803\n密尔\t372804\n合围\t372805\nTitoni\t372806\nds18b20\t372807\n宜宾市政府\t372808\n鞍点\t372809\n辣椒油\t372810\n风宅\t372811\nCutting\t372812\n阿部力\t372813\n丁彦郎咸平\t372814\n25个\t372815\nexpression\t372816\nISO13485\t372817\n1.35v\t372818\n陈世妍\t372819\n铁甲威虫\t372820\nAston\t372821\n直角转弯\t372822\n三角形的分类\t372823\n角线\t372824\n云堆\t372825\n中基\t372826\n正经\t372827\nXLD\t372828\n铁通宽带\t372829\n有可为\t372830\n155010\t372831\n急性会厌炎\t372832\nTFRecord\t372833\n笞\t372834\n乐居资讯中心\t372835\n杰基\t372836\n指导书\t372837\n博望坡\t372838\n卷土重来\t372839\n郭碧婷\t372840\n瓷像\t372841\n马元\t372842\nCMMI3\t372843\n关语\t372844\n东马\t372845\n富蕴县\t372846\n金马\t372847\n旅行版\t372848\n深业\t372849\n深谋远虑\t372850\n3d打印机\t372851\n钳虫\t372852\nCS4\t372853\n波神\t372854\n省培\t372855\n林权\t372856\n次坐标轴\t372857\n钢索\t372858\n4.3.8\t372859\n青瓜\t372860\nPyQT\t372861\nCCTV1\t372862\n十九大学习\t372863\n时间线\t372864\n阿尔文\t372865\n国械\t372866\n文具展\t372867\n酷拉皮卡\t372868\n抓客\t372869\n深圳市俊竹科技有限公司\t372870\n皮卡剧\t372871\n91\t372872\n安定\t372873\nSwatch\t372874\n人人影视字幕组\t372875\nyys\t372876\n南乐\t372877\nEcon\t372878\n中投\t372879\n广园西路\t372880\nLEAP\t372881\n黑锐\t372882\n三骚\t372883\n瑞福\t372884\n湖南省国土资源厅\t372885\nduolingo\t372886\n雅乐轩\t372887\n非R\t372888\n2011年9月\t372889\n皇竹草\t372890\n58所\t372891\n一个士\t372892\n河南中医\t372893\n特惠酒店\t372894\n西京医院\t372895\n鬥魚黃金大獎賽\t372896\nts3180\t372897\n丘咲エミリ\t372898\n翻脸\t372899\n陈彪\t372900\n即时通讯系统\t372901\namg\t372902\n某物\t372903\n就\t372904\n双相\t372905\n忘情\t372906\n保时捷帕纳梅拉\t372907\n太原火车站\t372908\n超轻粘土\t372909\nJVM堆\t372910\n画像砖\t372911\nDelaware\t372912\nv1.0.7\t372913\n毓秀\t372914\n干洗\t372915\n美纱\t372916\nxcel\t372917\n114.cn\t372918\n斗音\t372919\npha\t372920\n立规矩\t372921\n雨宫天\t37
2922\n破碎的心\t372923\nFENDER\t372924\n北流\t372925\nSTM32定时器\t372926\nMight\t372927\n附有\t372928\n虎丘区人民政府\t372929\n平阳新闻网\t372930\n脓点\t372931\n忙线\t372932\n毕业设计任务书\t372933\n螺纹钢期货\t372934\nTDR\t372935\n动宾\t372936\n佐山爱\t372937\n做小\t372938\nmonolog\t372939\n苏州书香府邸\t372940\nfarmskins\t372941\n22亿美元\t372942\n北京嘉里中心\t372943\n抓木机\t372944\n胰激肽原酶肠溶片\t372945\n百度企业\t372946\n腰胀\t372947\n炉石传说卡拉赞之夜\t372948\nsmp\t372949\n1468\t372950\n口腔炎\t372951\n真功\t372952\n汉通\t372953\n国泰君安大智慧\t372954\n张家港行\t372955\n60颗\t372956\n增值税纳税申报表\t372957\n封建主义\t372958\n智邦\t372959\n阿勇\t372960\n料袋\t372961\n无锡技师学院\t372962\n卡尔曼滤波\t372963\n鳜鱼\t372964\n2241\t372965\n模拟信息卷\t372966\n珊瑚湾\t372967\n蓝江\t372968\n苏曼莎\t372969\n北鸢\t372970\n重生之妖孽人生\t372971\nDVP\t372972\n馆中\t372973\n丹尼尔·惠灵顿\t372974\nricci\t372975\n电缆载流量表\t372976\n二氧化锰\t372977\n山东市\t372978\n结算申报表\t372979\ndamifan\t372980\n具俊\t372981\n巧干\t372982\n故宫文创\t372983\n浙江新安化工集团股份有限公司\t372984\n张国良\t372985\n团内\t372986\n802.11a\t372987\n宇舶表\t372988\n租期\t372989\n題\t372990\nValuable\t372991\n1一20集\t372992\n丁香\t372993\n季卡\t372994\n同根生\t372995\n冷轧板卷\t372996\nzong\t372997\n套码\t372998\n手心\t372999\n坤造\t373000\n孙连成\t373001\n监测\t373002\n济宁市工商行政管理局\t373003\neighth\t373004\n十五年\t373005\n郭少杰\t373006\n腔隙性脑梗死\t373007\n欲色\t373008\n十二分钟\t373009\n全译\t373010\n魅力型\t373011\nploy\t373012\nSasha\t373013\n刘勇军\t373014\n皮相\t373015\n5xsq\t373016\n炼金石\t373017\n上古传说\t373018\n布设\t373019\n照镜子时\t373020\n2017--2018年\t373021\n暖通展\t373022\n西北菜\t373023\n银河战队\t373024\n000005\t373025\nCBNweekly\t373026\n大山里\t373027\n舌象\t373028\n长三角经济区\t373029\n王开岭\t373030\n华宝添益\t373031\nadidas三叶草\t373032\n南银大厦\t373033\nDom4j\t373034\n九零年代\t373035\n悲观者\t373036\n电子战\t373037\n开票机\t373038\n64点\t373039\n航弹\t373040\n裴珍映\t373041\n30mpa\t373042\n陈溪\t373043\n石硫合剂\t373044\nmmap\t373045\nr0\t373046\n柏乐园\t373047\n谨小慎微\t373048\n特卫\t373049\n郑裕彤\t373050\n4601\t373051\n流言\t373052\nJNLP\t373053\n退休中人\t373054\n惠州市委\t373055\n湖北建设信息网\t373056\n桥路\t373057\n乐读中文网\t373058\n铁碗\t373059\n西游战记\t373060\n淮安东站\t373061\n天国\t373062\n陶缸\t373063\npublishing\t3730
64\n3.7米\t373065\n550g\t373066\n高晓攀\t373067\n1000辆\t373068\n防空地下室\t373069\n总人口\t373070\n土方量_\t373071\n刘海粟\t373072\n广告节\t373073\n张卓\t373074\n姜刑\t373075\n多线程篇\t373076\n中国代理网\t373077\n洗手\t373078\n准提神咒\t373079\n静水流深\t373080\nRI\t373081\n新龙\t373082\n1500块\t373083\n蜀山传\t373084\n醉花荫\t373085\n妞子\t373086\n无锡市教育局\t373087\nReliability\t373088\n巫山\t373089\n一游\t373090\nUKVI\t373091\n好些\t373092\n负载平衡\t373093\n谐星\t373094\nOffice365\t373095\n青岛机场\t373096\n福建省政府\t373097\n王晓天\t373098\nemit\t373099\n天通苑西三区\t373100\n468\t373101\n爆破\t373102\n红盾\t373103\ntakamine\t373104\n80多万\t373105\n蓝狐\t373106\n江阴政府网\t373107\n昆剧\t373108\n加密\t373109\n瘦脸霜\t373110\n新南门汽车站\t373111\n解离\t373112\n3143\t373113\n学旅\t373114\n晶报数字报\t373115\n阿忠\t373116\nIOST\t373117\ndermatix\t373118\n星浩\t373119\n唯新\t373120\nPear\t373121\n0033\t373122\n幸运之门\t373123\n芦苇\t373124\n宝安\t373125\n青铜器\t373126\nopj\t373127\n弯板\t373128\nonsite\t373129\nQQ邮箱帮助中心\t373130\n飞页\t373131\n闲事\t373132\n总园\t373133\nGenymotion\t373134\n印尼币\t373135\nSSA\t373136\nMainWindow\t373137\n3625\t373138\ntep\t373139\n乐高式\t373140\n银监会保监会\t373141\nIList\t373142\n天津商务职业学院\t373143\n河北教育出版社\t373144\n情何以堪\t373145\n福星惠誉\t373146\n合肥注册公司\t373147\n到基层\t373148\nRubber\t373149\n一拳一个\t373150\n页间\t373151\nADC12\t373152\n于飞\t373153\nOps\t373154\n解酒\t373155\n米动\t373156\n34分钟\t373157\n换档\t373158\n富马酸替诺福韦二吡呋酯片\t373159\n安吉斯\t373160\n上不了\t373161\n4g套餐\t373162\n开错\t373163\n小敏\t373164\nrequested\t373165\n大目\t373166\n鹰潭\t373167\n转递\t373168\n西安航天城\t373169\n海盐房产超市网\t373170\nFPC\t373171\nzeros\t373172\n新蓝图\t373173\nlib库\t373174\n丰采\t373175\n航天八院\t373176\nMobaXterm\t373177\n网游加速器永久免费版\t373178\n淋病\t373179\n食性\t373180\n真人斗地主\t373181\n相看\t373182\n10卷\t373183\n华氏大药房\t373184\n阔达装饰\t373185\n全员\t373186\n20160805\t373187\n慢城\t373188\n怀来县\t373189\n碳酸铯\t373190\n移植术\t373191\n二十年后\t373192\nnaturaglace\t373193\n效期\t373194\n县委书记\t373195\n莱卡面料\t373196\n柚木ティナ\t373197\n画图\t373198\ndaling\t373199\nUITableViewCell\t373200\n上原\t373201\npt\t373202\n泥状\t373203\n1501\t373204\n网易七鱼\t373205\n3000平方米\t373206\nIU
酒店\t373207\n集鸽\t373208\n开抢\t373209\n古田镇\t373210\n二年多\t373211\n大堰\t373212\n挖矿机\t373213\n伸缩器\t373214\n北京矿冶研究总院\t373215\n相关方\t373216\n中班级\t373217\nbluray\t373218\nedn\t373219\nFTC\t373220\n3008\t373221\n统招生\t373222\ndnp\t373223\n贵州省网上办事大厅\t373224\n淘气天尊\t373225\n拆桥\t373226\n韦森\t373227\n防断\t373228\n徐辉\t373229\nxiugaiqi\t373230\nfiona\t373231\n丑鱼尼莫\t373232\nofd\t373233\n淳化元宝\t373234\nFatFS\t373235\neSports\t373236\ndatastore\t373237\n望风\t373238\n知性\t373239\ndancing\t373240\nEllen1798\t373241\n珊瑚宫殿\t373242\n火狐盾\t373243\n水火箭\t373244\n文通\t373245\n大神f1\t373246\n智光\t373247\n时报\t373248\nsecond\t373249\n北京农展馆\t373250\n电批\t373251\n中国警察网\t373252\n郑鹏\t373253\n空气阀\t373254\ngeshou\t373255\nVai\t373256\n七把\t373257\nblame\t373258\nU2417H\t373259\nACDC\t373260\n3.3.0\t373261\n降效\t373262\n遥墙机场\t373263\n大连地铁5号线\t373264\nhpm\t373265\n1.39G\t373266\n欢宴\t373267\nUIPageControl\t373268\n人人色\t373269\n华为杯\t373270\n秦王宫\t373271\n乌鸡蛋\t373272\n向东\t373273\n深圳市特种证件制作中心二代证照片检测中心\t373274\n155个\t373275\ndecoration\t373276\n星界\t373277\n公益服\t373278\nXEROX\t373279\n氯霉素\t373280\n投线仪\t373281\nHeading\t373282\nDelegate\t373283\nnk\t373284\n重庆市国土资源和房屋管理局\t373285\nuitable\t373286\n玻尔兹曼常数\t373287\n合房网\t373288\n甲士\t373289\n天津美院\t373290\n发行费\t373291\n番禺路\t373292\n素媛\t373293\n8.0级\t373294\n9u8u\t373295\n中西医\t373296\n中金黄金\t373297\n街头篮球\t373298\n泣别\t373299\n平度信息港\t373300\n十家\t373301\n构型\t373302\n贝纳颂\t373303\ntogo\t373304\n有机菜\t373305\n第三方支付牌照\t373306\nv1.6.1\t373307\n1440p\t373308\n欧亿\t373309\n应用_参考网\t373310\n天津海事局\t373311\n不合身\t373312\n压线\t373313\n28篇\t373314\n血光之灾\t373315\n热带夜\t373316\nmicroserver\t373317\nsysbench\t373318\n东山公园\t373319\n机核\t373320\n不应\t373321\n历史街区\t373322\n江西省总工会\t373323\n铣削\t373324\n白点值\t373325\n张承志\t373326\n时代华纳\t373327\n阿里三家公司\t373328\nU当家\t373329\ncocopods\t373330\n黄帽\t373331\n転\t373332\n流浪花\t373333\n古榕\t373334\n悦翔v7\t373335\nARP病毒\t373336\n黄颡鱼\t373337\n百分之八\t373338\ntyrosine\t373339\n普惠幼儿园\t373340\n波浪纹\t373341\ndss\t373342\nSimone\t373343\n大连大学\t373344\n540\t373345\n尊涵\t373346\n龋洞\t373347\n会校
\t373348\n李海青\t373349\n小米充电器\t373350\n八達網\t373351\n宛平南路\t373352\nmapsource\t373353\n房探007网\t373354\nVisual\t373355\n全椒\t373356\n联合国教科文组织\t373357\n笑口常开\t373358\nlowe\t373359\n恐怖学校\t373360\n石坑\t373361\n沙发垫\t373362\n财富广场\t373363\n叠拼\t373364\nFlash课件\t373365\n浪板\t373366\nDiablo\t373367\n咪咪爱\t373368\n知商金融\t373369\n天山区人民政府\t373370\n趣网新游频道\t373371\n图奇\t373372\n无聊\t373373\ncorn\t373374\ngulp\t373375\n三英战吕布\t373376\n瑞昌市政府\t373377\n鸡奇穴\t373378\n郁达夫\t373379\n对呀\t373380\n呼哧\t373381\n豆渣\t373382\n799\t373383\n波肖\t373384\n蕾奥\t373385\nias\t373386\n10&#160\t373387\n郑州大学学报\t373388\n冯提莫\t373389\n心意六合拳\t373390\n三桥\t373391\n捉\t373392\nclixsense\t373393\n铁板神数\t373394\n估量\t373395\n冷颤\t373396\n三臂\t373397\n华夏银行\t373398\njuxing\t373399\n谢家集区\t373400\n影友\t373401\n吸磁\t373402\nm1130\t373403\n漫不经心\t373404\nhanna\t373405\n公告\t373406\n喜加\t373407\n55开\t373408\n4060\t373409\n宇龙\t373410\n荒郊\t373411\n迪斯科广场\t373412\n前4个月\t373413\n战骑\t373414\n维生素D\t373415\n雅安市\t373416\n15磅\t373417\n投标报价法\t373418\n三兴\t373419\n敌后\t373420\n影音先锋色\t373421\n铝排\t373422\nExecuting\t373423\n孚\t373424\n济南天桥区\t373425\nRTD\t373426\n上前线\t373427\n52922818\t373428\n年检\t373429\n静安寺站\t373430\n海峡都市报电子版_海都报\t373431\n北京分公司\t373432\n拉风色\t373433\n蒙古族\t373434\n单曲\t373435\ngk5\t373436\n总第\t373437\n高思教育\t373438\ncei\t373439\n杨保军\t373440\n李国庆\t373441\n布朗尼\t373442\n311路\t373443\n幼小\t373444\n拳皇98ol\t373445\n95187\t373446\n恒大阳光半岛\t373447\n8盎司\t373448\n枪炮师\t373449\n紫金城\t373450\nRexxx\t373451\n耐磨性\t373452\n刘彦\t373453\n製造\t373454\n战斗曲\t373455\n山东华宇工学院\t373456\n静港\t373457\n团青\t373458\n妮琪·米娜\t373459\n自此\t373460\n治理观\t373461\n书事\t373462\n咏物\t373463\n疍\t373464\n撸铁\t373465\n石家庄高新区\t373466\n碳铵\t373467\nkorean\t373468\n2018040期\t373469\n丛林朗动\t373470\n人质\t373471\n全币\t373472\n无屏\t373473\n玛米亚\t373474\n二O\t373475\nelementary\t373476\n大冶湖\t373477\nACADEMY\t373478\n轩辕剑叁\t373479\ntidal\t373480\nkq\t373481\n寻枪\t373482\n煞气\t373483\n皮损\t373484\n蛋黄派\t373485\n卫灵公\t373486\n核定征收\t373487\n柬\t373488\n健康歌\t373489\n天鲲号\t373490\n夜行歌\t373491\n欲望之屋\t373492\nziyou\t373493\n来复\t37
3494\nmalaria\t373495\n绾绾\t373496\nWPSword\t373497\n芸苔素\t373498\n亚洲东部\t373499\n板油\t373500\n机械员\t373501\n体网\t373502\n郑州地铁5号线\t373503\n观音庙\t373504\nScottish\t373505\n吊盖\t373506\ndirectx9.0c\t373507\n一辉\t373508\n软点\t373509\n出参\t373510\n中国劳动社会保障出版社\t373511\nSYBASE\t373512\n年金现值系数\t373513\namy\t373514\n90_\t373515\n边人\t373516\n告字\t373517\n联想y510p\t373518\n神木隆之介\t373519\nX25\t373520\n20170913\t373521\n280万元\t373522\n湖南警察学院\t373523\n生产者\t373524\nETL\t373525\nprobook\t373526\n啼笑\t373527\nGalway\t373528\n多纳\t373529\n矿产品\t373530\n醒世\t373531\n公交线路\t373532\n格里兹曼\t373533\n50分\t373534\n斋藤飞鸟\t373535\n陈晨\t373536\n枫竹丹青SAP\t373537\n通票\t373538\n巨象\t373539\n蒋锡培\t373540\n江南实验学校\t373541\n首农\t373542\n暴戾\t373543\n山东省气象局\t373544\n收起\t373545\n孙亚\t373546\n升级之路\t373547\n陈沐\t373548\n0.2mm\t373549\n0}\t373550\n白绫\t373551\ncaribpr\t373552\n苏蔓\t373553\n小米移动电源2\t373554\n苍白\t373555\n参考指南\t373556\n阜平\t373557\n肉食\t373558\n一戒\t373559\n康小兵\t373560\n陕州地坑院\t373561\n妓女\t373562\nfilename\t373563\n私宴\t373564\n英雄联盟LPL\t373565\n宫本茂\t373566\n米糊\t373567\n秋山祥子\t373568\n霸道知否知否应是绿肥红瘦\t373569\n上海卷烟厂\t373570\n社会主义市场经济\t373571\n分度\t373572\n沙阳路\t373573\n血族bloodline\t373574\n北魏\t373575\n首玺\t373576\n尴尬时刻\t373577\nchi2\t373578\n成化十四年\t373579\nWeka\t373580\n河姆渡文化\t373581\nob\t373582\nBestiality\t373583\n绿瓶\t373584\n竹镇\t373585\n胡润百富\t373586\n世嘉MD\t373587\n上海市通信管理局\t373588\n江山如画\t373589\nPPT背景图\t373590\n西西里的美丽传说\t373591\nmorphology\t373592\n淅沥\t373593\namendment\t373594\n至上\t373595\n机器视觉\t373596\n风冷热泵\t373597\n氮氧传感器\t373598\npowerpoint\t373599\n枫行maple\t373600\n聚丁二烯\t373601\n684\t373602\n北镇\t373603\n全核\t373604\n席嘉琪\t373605\n阜通\t373606\nspcc\t373607\n印业\t373608\nHaan\t373609\n2017年9月\t373610\n26000元\t373611\nTube8\t373612\nWin10/WP\t373613\n富田农场\t373614\n粉浆\t373615\n空手\t373616\n钢城区\t373617\n2017年5月4日\t373618\n同位语从句\t373619\n肖特基\t373620\n上海市儿童医院\t373621\ngoverning\t373622\n徐凤年\t373623\n南极村\t373624\n一手店\t373625\n食管\t373626\nwhut\t373627\n达英\t373628\n裘皮\t373629\n沙岭子\t373630\n哈尔滨工业大学\t373631\nignores\t373632\n形同陌路\t373633\n第5轮\t3
73634\njuniper\t373635\nLIVE\t373636\n测亩仪\t373637\n高中数学必修四\t373638\n长堤\t373639\n飞达\t373640\n被喻\t373641\n创伤\t373642\n构成\t373643\n有效期\t373644\nakko\t373645\n过往\t373646\n三河古镇\t373647\nGORE-TEX\t373648\n医养结合\t373649\n对象化\t373650\ncloudshadow\t373651\n出书网\t373652\n周渝\t373653\nonclick事件\t373654\n区司法局\t373655\n第三十一届\t373656\n健身房\t373657\n车套\t373658\n脱落酸\t373659\nrss阅读器\t373660\n五到位\t373661\n磁性\t373662\n监察部\t373663\n心如止水\t373664\n林允儿\t373665\n恶魔城\t373666\n高城\t373667\nc14呼气试验\t373668\n初晴\t373669\nCM0304\t373670\n情诗\t373671\n没有人知道\t373672\n│\t373673\n年月日时分\t373674\n龙潭村\t373675\n还款期\t373676\n巨蚊\t373677\n董姓\t373678\n回收物\t373679\n简沫\t373680\n163_\t373681\n乐评\t373682\n静力学分析\t373683\nrecorder\t373684\n李易王小川\t373685\n长谷川留衣\t373686\n消费\t373687\n养虾\t373688\n小狼毫输入法\t373689\n井口裕香\t373690\n搏出\t373691\n人心惶惶\t373692\n逆定理\t373693\n阿尔茨海默\t373694\nParker\t373695\n江一燕\t373696\n上外\t373697\n红地毯\t373698\n苏拉玛起义\t373699\n新市镇\t373700\n牙周病\t373701\n首南\t373702\n裕华区\t373703\n拉链式\t373704\n一包\t373705\nimx\t373706\n阻隔\t373707\n零增长\t373708\nAAF\t373709\n孟德\t373710\n抬升\t373711\n首通\t373712\n单箱\t373713\n理石\t373714\n屈光参差\t373715\n收验\t373716\nGRID\t373717\n大盛娱乐\t373718\n竞舟路\t373719\n泰餐\t373720\n卓文君\t373721\n黄可\t373722\n任鲁豫\t373723\n桂花苗\t373724\n奥妮\t373725\n物料员\t373726\n上海能源\t373727\ntaste\t373728\nbeidou\t373729\n心理咨询室\t373730\n广州美莱整形美容医院\t373731\n星会\t373732\n纪念\t373733\n加西贝拉\t373734\n还款额\t373735\n黄色\t373736\n24v\t373737\n2月2\t373738\n欧阳菲菲\t373739\nGH60\t373740\nSheets\t373741\nNespresso\t373742\nbeta4\t373743\n圆才\t373744\n电子舞曲\t373745\n差速器油\t373746\n钢房\t373747\n禁行\t373748\nwebengine\t373749\n一分\t373750\nbedroom\t373751\nFastboot\t373752\n邱峰\t373753\n贽\t373754\npigzoo\t373755\n超级战\t373756\norm\t373757\n扭力限制器\t373758\n太狠\t373759\n市环境保护局\t373760\n宋青\t373761\n胡里山\t373762\n清秀\t373763\nQQ聊天记录查看器\t373764\n深业集团\t373765\n工商局注册公司\t373766\n会计书\t373767\n两党\t373768\n情至性\t373769\n1378\t373770\n张鑫\t373771\n顺德碧桂园\t373772\n2月22日\t373773\n此路不通\t373774\n兽山\t373775\n常模\t373776\n马渚镇\t373777\n一尺\t373778\n高考满分作文\t373779\n长株潭城际铁路\t3
73780\nOFX\t373781\n亚朵(上海)酒店管理有限公司\t373782\n氯雷他定片\t373783\n中天钢铁集团有限公司\t373784\n65mm\t373785\n破路\t373786\n喜临门\t373787\n影视盒\t373788\n松霖\t373789\n螳臂当\t373790\n英国海关\t373791\n彬州\t373792\n道破\t373793\n复合式\t373794\n红色娘子军\t373795\n筹码峰\t373796\n战纹\t373797\n蟹塘\t373798\nifarme\t373799\n画刊\t373800\nIPV6\t373801\n周进\t373802\n防城港政府网\t373803\n潘菲洛夫28勇士\t373804\n活动会\t373805\n枫叶\t373806\nUndead\t373807\n矿博会\t373808\n大贸车\t373809\n扫码支付\t373810\n分轨\t373811\nFrench\t373812\n试板\t373813\n照桥\t373814\n鉅亨網\t373815\n胥\t373816\n百里杜鹃风景区\t373817\nomnibus\t373818\n撼心\t373819\nGOD\t373820\n130分钟\t373821\n好压软件\t373822\nNoticias\t373823\n北方华创\t373824\n飞行控制系统\t373825\n拓者帮帮\t373826\n热气腾腾\t373827\n遣兴\t373828\n虎斗\t373829\n伊士曼\t373830\n广东高院\t373831\n远隔\t373832\nelectron-builder\t373833\nuvled\t373834\n水工业\t373835\n砂机\t373836\nCS3\t373837\n土地出让收入\t373838\n怀孕后\t373839\namadeus\t373840\nzhonghao\t373841\n董藩\t373842\n二小\t373843\nSsh\t373844\n美人制造\t373845\ntextfiled\t373846\n凤凰花\t373847\n逆天狂妃\t373848\n锐浪\t373849\n少女人\t373850\nbeike\t373851\n云+\t373852\nCovering\t373853\n61200133\t373854\n药明\t373855\n梅爱偲\t373856\n傍晚\t373857\nThymeleaf\t373858\n采莲\t373859\nalm\t373860\nQuasar\t373861\n裕溪路\t373862\nsextube\t373863\n圣农\t373864\n4141\t373865\n方舟方块世界吧\t373866\nbryant\t373867\nDFT\t373868\nJSLint\t373869\n广东省科学技术厅\t373870\n14平米\t373871\n2232\t373872\n建设费\t373873\n平顶帽\t373874\n0258\t373875\n新博瑞\t373876\n杨钰威尔史密斯\t373877\n东北大区\t373878\ncolette\t373879\n感触\t373880\n发展\t373881\n产子\t373882\n广东省网上办事大厅\t373883\n高达破坏者2\t373884\n碘单质\t373885\ntz\t373886\n战地影院\t373887\n全日空航空公司\t373888\nSparks\t373889\n刘循子墨\t373890\nheng\t373891\n0329\t373892\n魏纪中\t373893\n容性\t373894\nzang\t373895\n电子电\t373896\nword2015\t373897\n建设工程安全生产管理条例\t373898\n有效成分\t373899\n竞网智\t373900\n变型\t373901\n位位\t373902\n打工人\t373903\n睡眼\t373904\n叉子\t373905\n君庭\t373906\n体细胞\t373907\n雪颜\t373908\n黄茶\t373909\n田总司\t373910\n297号\t373911\nxlsxwriter\t373912\n2018年4月20\t373913\n祝融\t373914\n好美\t373915\n预习单\t373916\n复颜玻尿酸\t373917\n马超\t373918\n就义\t373919\n后备厢\t373920\n45万吨\t3739
21\n四步曲\t373922\n300369\t373923\n单因素方差分析\t373924\nIndexAV\t373925\n农业规划\t373926\n秋蝉\t373927\ncla220\t373928\n吞天记\t373929\n球探网\t373930\n福建农大\t373931\n礼县\t373932\n裕安\t373933\n房地产公司\t373934\nBRAZZERS\t373935\n竞买\t373936\n鲁网\t373937\n海尔公司\t373938\n养尸\t373939\n哥伦布广场\t373940\n小川阿佐美\t373941\n张志坤\t373942\n卓尔山\t373943\nsvdvd\t373944\n天堑\t373945\n损毁\t373946\n郴州火车站\t373947\njiageng\t373948\n泥板\t373949\n钓鱼杆\t373950\n吉濑美智子\t373951\ndtsi\t373952\n阿里云\t373953\n真诚\t373954\n拳皇99\t373955\n城南客运站\t373956\ntongue\t373957\n柽柳\t373958\n幻术\t373959\n香榧子\t373960\n929550163@qq.com\t373961\n大泉\t373962\n钙肥\t373963\n服刑\t373964\n有限元模拟\t373965\nbeyondcompare\t373966\n2017年十月\t373967\n唐伯虎点\t373968\n天津搜狐\t373969\nは\t373970\n公版\t373971\n安然纳米汗蒸馆\t373972\n电脑程序\t373973\n根管\t373974\n阀瓣\t373975\n快递件\t373976\n肺宁颗粒\t373977\n互信\t373978\nNPU\t373979\n电化学传感器\t373980\n清缴\t373981\nReviews\t373982\n6000分\t373983\n华艺卫浴\t373984\nCustomizing\t373985\n向前冲\t373986\n教育机器人\t373987\nGeng\t373988\n从教\t373989\n中国美术家协会\t373990\nstrftime\t373991\n九州通\t373992\n播种者\t373993\n不悱不发\t373994\n陈文龙\t373995\n砰\t373996\n辉煌十九大\t373997\n缩写\t373998\n海宴\t373999\n抢到\t374000\n霉变\t374001\n余德耀美术馆\t374002\nStash\t374003\nonSelect\t374004\nx战警天启\t374005\n水泥人网\t374006\n联塑\t374007\n嘻嘻哈哈\t374008\n兵马未动\t374009\n病毒性肝炎\t374010\n20160621\t374011\n胎元\t374012\n糊弄\t374013\nmacdown\t374014\n沙福林\t374015\n安德里茨\t374016\n外加\t374017\n柳青\t374018\n柳州城市职业学院\t374019\nandroid7.1\t374020\n凯宴\t374021\n小兔请客\t374022\n住院病人\t374023\n今年12月\t374024\n10万件\t374025\n铛铛铛\t374026\nprml\t374027\n本期\t374028\n大冲城市花园\t374029\n袁占亭\t374030\n第66届\t374031\nLUX\t374032\n山顶上\t374033\n堆高机\t374034\n别再\t374035\n酷走旅游网\t374036\n通规\t374037\n种草莓\t374038\n遐思\t374039\n人片\t374040\n剑气\t374041\n虎门万达广场\t374042\n杨升庵\t374043\n联合资信评估有限公司\t374044\n苏工\t374045\n完全版\t374046\n风花\t374047\n月亮湾大道\t374048\n凤凰FM\t374049\n首诺\t374050\n共青团湖北省委\t374051\n阵子\t374052\nnat模式\t374053\nmiflash\t374054\n0.15.1\t374055\n放射性\t374056\nwaiwai\t374057\nPedal\t374058\n这家\t374059\n女行长\t374060\nWWW.07973G.COM\t374061\n夭折\t374062\n6.
cn\t374063\n熊猫馆\t374064\nmir_yan\t374065\n前两周\t374066\n酱萝卜\t374067\n双威\t374068\n重庆装饰公司\t374069\n湘湖三期\t374070\n2.2.9\t374071\n能量仪\t374072\n移行\t374073\nzhongzhi\t374074\n三十大\t374075\n兰色\t374076\n公路桥涵设计通用规范\t374077\n狂徒\t374078\nheated\t374079\n余江\t374080\n磷酸脱氢酶\t374081\n174cm\t374082\n乌冬面\t374083\n3D动画\t374084\n|币\t374085\n唐纳森\t374086\nJaeger\t374087\n钙尔奇\t374088\nn1s\t374089\n瞬间爆炸\t374090\n五菱宏光论坛\t374091\n鹏润\t374092\n沪浙\t374093\n很好玩\t374094\n健康生活\t374095\n搜搜\t374096\n山东商务职业学院\t374097\n华录百纳\t374098\n校库\t374099\nSQA\t374100\n王国华\t374101\n瑞香\t374102\n爱岗敬业\t374103\nHCO3\t374104\ndetector\t374105\n56位\t374106\n三角裤\t374107\n自由贸易港\t374108\n护持\t374109\n追责\t374110\n20150501\t374111\n落选\t374112\n骁龙670\t374113\n瓶邪\t374114\n李莉\t374115\n真难\t374116\n珠笔\t374117\n黑出翔\t374118\n山民\t374119\n重新开放\t374120\n进风\t374121\niCAx\t374122\n新开源\t374123\n十个人\t374124\nmonokai\t374125\n608\t374126\nIEC\t374127\n双重\t374128\n黄山书社\t374129\n阮氏\t374130\n太平洋银行\t374131\n冲冲\t374132\n消毒灯\t374133\n无限量\t374134\n扣住\t374135\n反目\t374136\n杨拓\t374137\n170cm\t374138\n350万元\t374139\nr+\t374140\nsal\t374141\n16.01\t374142\norgy\t374143\nCharacter\t374144\nkaleid\t374145\nTez\t374146\n职能型\t374147\n应求\t374148\n黑吕\t374149\n飞速中文网\t374150\n二十八式\t374151\n几M\t374152\n殇璃\t374153\n巴盟\t374154\n保温柜\t374155\n魔法门之英雄无敌5\t374156\n双排\t374157\n奔驰b200\t374158\n多卡\t374159\n中国共产党党内监督条例\t374160\n黑田\t374161\n转开\t374162\n长平公主\t374163\n交付物\t374164\n范翠霞\t374165\n红山人才网\t374166\nmsds\t374167\n卢俊义\t374168\njiaoyu.huangye88.com\t374169\n福利群\t374170\n色母料\t374171\n樱花粉\t374172\n歧视\t374173\n金融保险\t374174\n卓梵\t374175\n普利茅斯\t374176\n商山早行\t374177\n一汽佳宝\t374178\n3.15曝光\t374179\n恋人间\t374180\nremoveChild\t374181\n范迪克\t374182\n中国能源建设股份有限公司\t374183\n河东\t374184\nspecifiers\t374185\nmathtype6.9b\t374186\n北京理工\t374187\n路缘石\t374188\nWin98\t374189\n只有我\t374190\n国家汉办\t374191\n遗产\t374192\n生金\t374193\n破产案\t374194\n日线\t374195\n华フック\t374196\n邓聿文\t374197\n天天好逼网\t374198\n先进性\t374199\nMeme\t374200\n变电运行\t374201\nJosef\t374202\nPolestar\t374203\n华安期货\t374204\n40台\t374205\n曹
庄村\t374206\nMegaHouse\t374207\n速派傲世九重天\t374208\neve吧_\t374209\n陈奕君\t374210\n不灭者\t374211\n一张网\t374212\n21层\t374213\n衍圣公\t374214\n零蛋\t374215\n杀父\t374216\n十九世纪\t374217\n珠宝展\t374218\n泵车\t374219\n广东省政协\t374220\n丰泽区\t374221\n闭包\t374222\n印度舞\t374223\n莫莫\t374224\n胸\t374225\nencore\t374226\n竹片\t374227\nflawless\t374228\n精武\t374229\n知更鸟\t374230\n胆囊炎\t374231\n市场预测\t374232\n植酸\t374233\n生员\t374234\n求生之路2中文网\t374235\n通心粉鼠\t374236\n潜江房网\t374237\n柚木提娜\t374238\n情歌王\t374239\n懵比\t374240\n資料\t374241\n厚植\t374242\n中兴视频\t374243\nfifth\t374244\n大连理工大学机械工程学院\t374245\n面包子\t374246\n阿拉伯联合酋长国\t374247\n红枫叶\t374248\n最低温度\t374249\n丢豆网\t374250\n侍道3\t374251\n天才白痴梦\t374252\n柞水\t374253\nSFU\t374254\n伊犁市\t374255\n中国农业科学院研究生院\t374256\n1657条\t374257\nsurely\t374258\n苏芮\t374259\n奚美娟\t374260\n成师附小\t374261\n废铝\t374262\n游火\t374263\n苏培盛\t374264\n4750G\t374265\n128K\t374266\n好全\t374267\n芝麻菜\t374268\n武井\t374269\n黑虫\t374270\n优尓城\t374271\nies\t374272\n前所未有\t374273\n王群\t374274\n突发事故\t374275\n闻堰\t374276\n厦门路桥\t374277\n筱露\t374278\n特赞\t374279\n张昕宇\t374280\n乙炔瓶\t374281\nTD\t374282\n仪陇县人民政府\t374283\n铜仁新闻网\t374284\n小暗\t374285\n龙华网\t374286\nceiling\t374287\nMysqli\t374288\n无双大蛇2\t374289\n企业管理网\t374290\nSaltStack\t374291\nhaining\t374292\nkp2\t374293\n旅行家\t374294\n学教\t374295\n北京人才网\t374296\n卡布奇诺\t374297\n消声室\t374298\n搜狗服务中心\t374299\n提供方\t374300\n100.cn\t374301\n悼亡诗\t374302\n水晶草\t374303\n凤凰湾\t374304\n600吨\t374305\n澳门航空\t374306\n票据业务\t374307\ndropdownlist\t374308\nDune\t374309\niteye\t374310\n香梅\t374311\n耙耙柑\t374312\n安靠\t374313\n融创时代奥城\t374314\n延安\t374315\n焊钉\t374316\n骑行圈_自行车网\t374317\n折君\t374318\nfes\t374319\n出纳人员\t374320\n三创\t374321\nVR渲染器\t374322\n大娜迦\t374323\n可泽\t374324\n丁香医疗\t374325\n北控水务集团\t374326\n太阳雨\t374327\n电缆桥架\t374328\n上海松江政府\t374329\n刑事审判参考\t374330\n李剑锋\t374331\n海珠桥\t374332\n纸花\t374333\n2000年以后\t374334\n乐高忍者\t374335\n感受态\t374336\naokunsang\t374337\nps自由变换\t374338\n7570\t374339\nfireaxe\t374340\nFormulas\t374341\n4250\t374342\n赠送\t374343\n瑞风M3\t374344\n西瓜头\t374345\n重见光明\t374346\n_商车网\t374347\n1212\t374348\n相继\t3
74349\n治标\t374350\n3轴\t374351\n字眼\t374352\n诗词世界\t374353\n苏州市国家税务局\t374354\n豪妹\t374355\n钜派投资集团\t374356\nhpv52\t374357\n定点数\t374358\n叫嚷\t374359\n水秀\t374360\n昆明政府网\t374361\n玉溪红塔\t374362\n新东方网校\t374363\n三明北站\t374364\n逛公园\t374365\nFixture\t374366\n江花红\t374367\nub3.0\t374368\n龙腾世纪审判\t374369\n死水\t374370\nbson\t374371\n闻人\t374372\n四川省文化厅\t374373\n三星s7edge\t374374\nsonicwall\t374375\n东航期货\t374376\n短篇集\t374377\n自立袋\t374378\n长枪\t374379\n小良\t374380\n学科教学\t374381\ncalbee\t374382\n大冶市人民政府\t374383\n吸气\t374384\n荣麟\t374385\nTortoiseS\t374386\nAirfare\t374387\ntdk\t374388\n绝世好剑\t374389\n赵家村\t374390\ncumsum\t374391\n铜绞线\t374392\n传球\t374393\n爸爸爸\t374394\n那么多好\t374395\n张春贤\t374396\n万道成神\t374397\n宛瑜\t374398\n废铁\t374399\n国际原子能机构\t374400\n王者荣耀局\t374401\n宝马七系\t374402\n饱和脂肪酸\t374403\n美图秀秀批\t374404\nIrony\t374405\n军火库\t374406\n来生\t374407\n几个头\t374408\n音障\t374409\n洛溪大桥\t374410\n大窑\t374411\nZJH\t374412\n360直播网\t374413\n江川\t374414\n0124\t374415\ntemplates\t374416\nears\t374417\n四爷\t374418\n巨鲨\t374419\n金牌橱柜\t374420\nG610\t374421\n余压\t374422\n方差分解\t374423\n札幌站\t374424\nDatabase\t374425\n北京资产管理公司\t374426\n水瓷\t374427\n北京弘医堂中医医院\t374428\n教育公平\t374429\n十几条\t374430\n音乐公园\t374431\nPointofix\t374432\n杰西利弗莫尔\t374433\nhdpe双壁波纹管\t374434\n开超\t374435\n美欣\t374436\n12粒\t374437\n尉迟\t374438\n紫薇花\t374439\n唐国强\t374440\nTheatre\t374441\n沉灵\t374442\n泰茶\t374443\n2008年\t374444\n伊滨\t374445\n2018年03月01日\t374446\n深圳市消费者委员会\t374447\n销售岗\t374448\n北国网\t374449\n气象台\t374450\n笔村\t374451\n税延型养老保险\t374452\n苏州亚科科技\t374453\n平面图\t374454\ninet\t374455\n石榴园\t374456\n昭明\t374457\n钇\t374458\n长虹吉川内吧\t374459\n温峤\t374460\n拉格朗日插值\t374461\nexplode函数\t374462\n400A\t374463\n幻想传说\t374464\nShift键\t374465\n国有独资\t374466\n吕\t374467\n一卡\t374468\n昭\t374469\nCS5.5\t374470\n四场\t374471\n仿射变换\t374472\n女杰\t374473\n蓝筹\t374474\n贯标\t374475\n中国巨力集团\t374476\n陈仓区\t374477\n曾巩\t374478\n鹰钩\t374479\n厨卫电器装修|一起网\t374480\n8309xxxx\t374481\n极限竞速6\t374482\n长轴\t374483\nmnl\t374484\n第11期\t374485\n傲寒\t374486\nSQLserver\t374487\n三纲\t374488\n荒古巨\t374489\nv2.0.0.1\t374490\n中压\
t374491\n十书\t374492\n只发\t374493\n疑心\t374494\n淘宝直通车\t374495\n手绘鞋\t374496\n断罪\t374497\n600米\t374498\n杰克公主\t374499\n正缘\t374500\n韩莹\t374501\n迪庆藏族自治州\t374502\n哥特式\t374503\n可瑜\t374504\n银饰品\t374505\nConnecting\t374506\n分厂\t374507\n斯利安叶酸片\t374508\n桃花扇\t374509\n奥腾\t374510\n晾干\t374511\n五晨寺\t374512\ncommuter\t374513\n三宝\t374514\n小鸢\t374515\nsac\t374516\nFre\t374517\n力科\t374518\n小柏\t374519\n两章\t374520\n罗清宇\t374521\n雄安市民服务中心\t374522\n手足口\t374523\n伟岸\t374524\n选资\t374525\n长安cx20\t374526\n张建\t374527\n野孩子\t374528\n1000册\t374529\n下午三点半\t374530\n秦爷\t374531\n偷拍照\t374532\n时兴\t374533\n系统误差\t374534\nfelling\t374535\nSHURE\t374536\n光耦\t374537\n刘丹萌\t374538\n动物园\t374539\ninsect\t374540\n1747\t374541\n电影台\t374542\n陈金刚\t374543\n六七次\t374544\nmo3\t374545\n温铁军\t374546\n明成化\t374547\n朝国\t374548\n老恕\t374549\n阜新银行\t374550\n朗诗熙华府\t374551\n华为荣耀v10\t374552\n直车\t374553\n碳酸二甲酯\t374554\n铁电存储器\t374555\n飞智Wee\t374556\n开台\t374557\n正德职业技术学院\t374558\n众和股份\t374559\n庾澄庆\t374560\n广华\t374561\n团房\t374562\n少说\t374563\n2006年度\t374564\ntaqman\t374565\n查重算不算\t374566\n宋元\t374567\n图题\t374568\n刷铁\t374569\n美的公司\t374570\n从何开始\t374571\n反问\t374572\n7672\t374573\n初级会计实务\t374574\n金雷\t374575\n郭瑞\t374576\nUItable\t374577\n200型\t374578\n季后赛\t374579\n莫里斯\t374580\n宏基\t374581\n无线自组网\t374582\n南方基金\t374583\n道不明\t374584\n揉胸\t374585\n儒家\t374586\nTMA\t374587\n肩针\t374588\n4200M\t374589\n夏日甜心\t374590\nPOINT\t374591\n安满\t374592\nrtu\t374593\n斯利安\t374594\nRK3399\t374595\n希声\t374596\n石家庄幼儿师范高等专科学校\t374597\n极限挑战第二季\t374598\n喻海良\t374599\n盾流\t374600\n防潮箱\t374601\ng30\t374602\n唐玉\t374603\ndrying\t374604\n炯\t374605\n涉众型经济犯罪\t374606\n莱钢\t374607\n大作\t374608\n朴初珑\t374609\n何冲宗庆后\t374610\n北京市自然科学基金\t374611\n新旧版\t374612\n匹夫的逆袭\t374613\nNSString\t374614\n香颂\t374615\n十句\t374616\n千古一帝\t374617\n纯金\t374618\n鳞状细胞癌抗原\t374619\n子代\t374620\nmcake\t374621\nipi\t374622\n压花\t374623\n卡萨布兰卡\t374624\nappcan\t374625\nSCT\t374626\nXG\t374627\n39层\t374628\n二狗\t374629\ndafa\t374630\n存在\t374631\n国家自然基金委\t374632\nXXXL\t374633\n母乳期\t374634\n水利基金\t374635\n陈谷\t374636\n李胜利\t374637
\n谷歌搜索\t374638\n淘宝运费险\t374639\n大功告成\t374640\n中华吊顶网\t374641\n超高层\t374642\n气垫床\t374643\nStm32\t374644\n管线机\t374645\n氰基吡啶\t374646\n成都文理学院\t374647\n发布版\t374648\n这么个\t374649\n四十年\t374650\ne1000e\t374651\n普陀山风景区\t374652\n12月14日\t374653\n哈巴雪山\t374654\n华北\t374655\n朱诺滩\t374656\n刻章\t374657\nQueens\t374658\n石燕\t374659\n垃圾君\t374660\n启示录\t374661\n无崖子\t374662\n宿务机场\t374663\n顶点式\t374664\n网易UU加速器\t374665\n济青高铁\t374666\n货地\t374667\n盘山县\t374668\n电流量\t374669\n密歇根\t374670\nPlugs\t374671\n爱后余生\t374672\n华生\t374673\nCH明明我的世界\t374674\n烛台\t374675\n嘉兴市\t374676\n银行\t374677\n2袋\t374678\n煤棚\t374679\n吴大澂\t374680\n极速者\t374681\nLooking\t374682\n美国证券交易所\t374683\n娱乐场\t374684\n输验\t374685\n漕溪北路\t374686\nhpm1005\t374687\n桂冠\t374688\n九泰基金\t374689\n卖场\t374690\n超级震撼\t374691\n_战\t374692\n和平大街\t374693\nSNS\t374694\n乳癖消片\t374695\n左海公园\t374696\nMPU9250\t374697\nkatie\t374698\nheadphones\t374699\nwarner\t374700\n明月松间照\t374701\n团务\t374702\n回家吧\t374703\n傣乡\t374704\n提防\t374705\n工程项目管理\t374706\n担保方\t374707\n塞维尔\t374708\nikaka\t374709\n清镇\t374710\nWIN2000\t374711\nasuka\t374712\n3串\t374713\n三纵\t374714\n山渣\t374715\n改签费\t374716\nfreedom\t374717\n歌儿\t374718\n移动梦网\t374719\n珍藏品\t374720\n民治街道\t374721\n重机\t374722\n印章石\t374723\nDAHON\t374724\n19次\t374725\n张文娟\t374726\n入笼\t374727\n贝多芬病毒\t374728\n东陵大盗\t374729\nidrive\t374730\n猫衣\t374731\n陈风笑\t374732\n二甲\t374733\n刊印\t374734\n子宫瘤\t374735\n烘干\t374736\n节育环\t374737\na11处理器\t374738\n老公公司\t374739\nPyth\t374740\n天荒坪\t374741\n后宫番\t374742\n亲兄弟\t374743\n合同法\t374744\nrepos\t374745\n易灵算命网\t374746\n遗传咨询师\t374747\n绿坝\t374748\n裸阴部\t374749\nT60\t374750\nVR视频播放器\t374751\n伊秀美容网\t374752\n66平米\t374753\n戴伟\t374754\n闹\t374755\n共达电声\t374756\n集成电路产业投资基金\t374757\n一第四章\t374758\n李谷一\t374759\n非正态分布\t374760\n佳木斯市中心医院\t374761\n叽里呱啦\t374762\n腾讯游\t374763\n脚痛\t374764\n电吹风机\t374765\n长兴\t374766\n精锐教育\t374767\nEC200\t374768\nessbase\t374769\n120岁\t374770\n搜查\t374771\n戴志康\t374772\n2宫\t374773\nEIT\t374774\n林北\t374775\n300平方米\t374776\n常宝华\t374777\n宋凝\t374778\n窝囊\t374779\n6个小时\t374780\n二手房买卖税费\t374781\nORA-06502\t3
74782\n苏树林\t374783\n医管\t374784\nIAP\t374785\n60年\t374786\n并行口\t374787\n什么险\t374788\n直升\t374789\n东湖宾馆\t374790\n胃肠道胀气_\t374791\n张家窝镇\t374792\nVELITE\t374793\n账本\t374794\n队日\t374795\n基头\t374796\ngarment\t374797\n压缩性\t374798\n8105\t374799\njod\t374800\n株州\t374801\n美林花园\t374802\nMetrology\t374803\n仿形\t374804\n几纳米\t374805\n普丽缇莎\t374806\n会计从业考试\t374807\nmimic\t374808\n七年之痒\t374809\n橙灯\t374810\n方法派\t374811\n躬\t374812\n宝骏510论坛_汽车之家论坛\t374813\n父子之间\t374814\n青门\t374815\n致命伤\t374816\ndeal\t374817\n大教堂\t374818\n吉首大学张家界学院\t374819\n大湖股份\t374820\n摘星\t374821\n艾泰科技\t374822\n娘娘们\t374823\n健怡\t374824\n绝地求生女\t374825\n黄之锋\t374826\n帘友\t374827\n哎哟妈妈\t374828\n质量检测仪\t374829\n庐江县人民政府\t374830\n寸滩\t374831\n固赛\t374832\n三星地煞\t374833\n米仇\t374834\n这个世\t374835\nUpcoming\t374836\n中层\t374837\n绯村剑心\t374838\n谢峰\t374839\n人事表格\t374840\n柱柱\t374841\n羡慕\t374842\nWKWebview\t374843\nBrahms\t374844\n商帮\t374845\n拳皇94\t374846\nSUI\t374847\n千叶大学\t374848\n韩家川\t374849\n夜盲\t374850\n去污剂\t374851\n99874942\t374852\n细述\t374853\nalg\t374854\n铜带\t374855\n荧光粉\t374856\nharry\t374857\n重庆猪八戒网络有限公司\t374858\n热转印纸\t374859\n大吃一惊\t374860\n广安门医院\t374861\nFlashing\t374862\n孙宏斌\t374863\n电源管理器\t374864\n纯白花\t374865\nSTC\t374866\n南京人大\t374867\nEngineered\t374868\n曼陀sp庄园\t374869\n外语片\t374870\n防撞钢梁\t374871\n田沁鑫\t374872\n垄断案\t374873\n肉丁园艺网\t374874\n陈心蕊\t374875\nr18mmd\t374876\n汤姆·克鲁斯\t374877\n近似\t374878\n阿尔塞斯\t374879\nSpawn\t374880\n孙晓岐\t374881\n法国\t374882\n架子工\t374883\n失落城堡\t374884\nSynology_NAS云\t374885\n千川木门\t374886\n佳能500D\t374887\nPostgres\t374888\n6.82\t374889\n泥龟\t374890\nall冰all\t374891\n超级转换秀\t374892\n新华日报\t374893\n我的日子\t374894\nWise\t374895\n奥斯曼帝国\t374896\nDolls\t374897\n眯眯\t374898\nloveplus\t374899\n224路\t374900\n杰作\t374901\n尚赫赫森\t374902\n涉毒\t374903\n嘉兴地区\t374904\n想吃饭\t374905\n西米\t374906\n精日反华言论\t374907\n中国建筑金属结构协会\t374908\nGutenberg\t374909\n噗噗噗\t374910\n中国测控网\t374911\n程颐\t374912\n连体式\t374913\n插片\t374914\n快走丝\t374915\n追焦\t374916\nWOR\t374917\nCompletely\t374918\n写话\t374919\n曹曦月\t374920\n非国家工作人员行贿罪\t374921\n圆眼\t374922\niReport\t37
4923\n在建工程网\t374924\n富蕴\t374925\nSuper\t374926\nposterior\t374927\nMySQl\t374928\n协鑫集成\t374929\n蒋申\t374930\n二氧化铈\t374931\n累及\t374932\nlsc\t374933\n一步\t374934\n三清洞\t374935\n第九张\t374936\nLudovico\t374937\n牺\t374938\n金山WPS\t374939\n铙\t374940\n冲锋枪\t374941\n个人\t374942\nicr\t374943\n要么\t374944\n25条\t374945\n保集\t374946\n砖混\t374947\n扬子江药业集团有限公司\t374948\n飞边\t374949\n出价\t374950\n色情业\t374951\nlamdba\t374952\nAFF\t374953\nhelu\t374954\n吕珍九\t374955\n锐爽EN\t374956\nonly_full\t374957\n比赛项目\t374958\n如释重负\t374959\n书签\t374960\n机器学习笔记\t374961\n灵渠\t374962\n埃罗芒阿\t374963\n必胜客欢乐餐厅\t374964\n溅射靶材\t374965\n30多辆\t374966\n不甚\t374967\nachi\t374968\n卡式\t374969\n重工企业\t374970\n运营费\t374971\n保管箱\t374972\nEmbeded\t374973\n文艺复兴\t374974\n蒸汽式\t374975\n1第一\t374976\ncgmodel\t374977\n法律规\t374978\n百花节\t374979\n维生素b12\t374980\nminecra\t374981\n惠朗\t374982\n胡海泉\t374983\nroot\t374984\n2014年\t374985\ngeogebra\t374986\niro\t374987\nSoDu\t374988\n千代\t374989\ncctv7\t374990\n八针\t374991\n战神记\t374992\n石角\t374993\nu4000\t374994\nNTCE\t374995\n不视\t374996\n米菈\t374997\n秃发\t374998\n四爱\t374999\n笛安\t375000\n苹果x\t375001\n五六天\t375002\nCFD\t375003\n不馁\t375004\n电流法\t375005\n劫夺\t375006\n约炮群\t375007\n朱兵\t375008\n真心英雄\t375009\n生活愉快\t375010\n118个\t375011\n活动方\t375012\n氨基磺酸镍\t375013\n涯\t375014\ninstitutions\t375015\n2017年11月9日\t375016\n难不难\t375017\nsubview\t375018\n青涩\t375019\n文琪\t375020\norcl\t375021\n新兰吧\t375022\n深圳航天信息有限公司\t375023\n学生篇\t375024\n藏式\t375025\n辉柏嘉\t375026\n移动电视\t375027\n化为灰烬\t375028\n也是\t375029\n椰砖\t375030\n黄斑病\t375031\n烟楼\t375032\npvc卡\t375033\n舞台剧\t375034\n新乡职业技术学院\t375035\n水滴石穿\t375036\n知米\t375037\n梦塔防\t375038\ngp5\t375039\n风车动漫\t375040\n韩静\t375041\n农夫渔夫\t375042\n千岛\t375043\nWITH\t375044\n样条函数\t375045\n金投外汇网\t375046\n市园林局\t375047\n易特\t375048\n錄\t375049\n夜班费\t375050\n双麦\t375051\n音度\t375052\n朱成\t375053\n2017年下半年\t375054\nkey社\t375055\n浮地\t375056\n南京妇幼保健院\t375057\n安兹·乌尔·恭\t375058\ncountdownlatch\t375059\n草木灰\t375060\nScuba\t375061\n是不是\t375062\n举动\t375063\n臀桥\t375064\nv9\t375065\n指影\t375066\n普林斯顿大学\t375067\n中国移动通信集团设计院
有限公司\t375068\n新民村\t375069\n古加尔\t375070\n名讳\t375071\nbukkake\t375072\nPassword\t375073\n托马斯微\t375074\n滋\t375075\n极端化\t375076\n格兰蒂亚\t375077\nUCGUI\t375078\nг\t375079\n高硼硅\t375080\ngopro\t375081\n两办\t375082\n中国国电集团\t375083\n马昕\t375084\ntimex\t375085\nNHKオンライン\t375086\n中国通用\t375087\n6安卓\t375088\nelite\t375089\n抗旱性\t375090\nRMX\t375091\n吉拉德\t375092\n海安高新区\t375093\n斗艳\t375094\n萝卜糕\t375095\n浪中岛\t375096\n资金使用效率\t375097\n大夫山\t375098\n北京旅\t375099\n大礼拜\t375100\n浮动工具栏\t375101\niMessage\t375102\n病重\t375103\n收房\t375104\ncarr\t375105\n青恋\t375106\n卖肉\t375107\n黄碟\t375108\n18046\t375109\n锂聚合物电池\t375110\n牯岭镇\t375111\n李玉萍\t375112\n左岭新城\t375113\n瑶琳仙境\t375114\n沙集\t375115\nfacto\t375116\n思凯\t375117\nflying\t375118\n第19次\t375119\ndx12\t375120\nsort\t375121\n乐可\t375122\n新轩逸\t375123\n人满为患\t375124\n绝对性\t375125\n格式化为\t375126\nEmma\t375127\n农财网\t375128\n树层\t375129\nJaye\t375130\n毒力\t375131\n东平森林公园\t375132\n筛分机\t375133\n小世界\t375134\n实体性\t375135\n不择手段\t375136\n深云村\t375137\nMad\t375138\n医信\t375139\n渐浓\t375140\n5.1.7\t375141\n新围村\t375142\n江中集团\t375143\n注册卡\t375144\n生化分析仪\t375145\n四十项建议\t375146\n超临界机组\t375147\n何种\t375148\n集美大学\t375149\nFaux\t375150\nmodle\t375151\n不为人知\t375152\n热台\t375153\n天津卫视\t375154\nTen\t375155\nA7R\t375156\n3.8G\t375157\n6999\t375158\n周年版\t375159\ncenots\t375160\n鲁卡\t375161\ntrident\t375162\nVale\t375163\n高新科技\t375164\n半藏\t375165\nanl\t375166\nISF\t375167\n梅州市委\t375168\n森果\t375169\n伊波拉病毒\t375170\n红米3S\t375171\n扳手腕\t375172\n税务登记\t375173\n神职人员\t375174\n2018.4.1\t375175\nContax\t375176\n3800\t375177\n慈孝文化\t375178\n跪式\t375179\n广州民航职业技术学院\t375180\n遊戯\t375181\nAstro\t375182\n涉嫌\t375183\n甪直镇\t375184\n礼运\t375185\n别克君越\t375186\n宋鲁郑\t375187\nfmp\t375188\nSynopsys\t375189\n第85\t375190\nSellers\t375191\n2018.3.20\t375192\nDS-5\t375193\n值不变\t375194\nyshy\t375195\nCTreeCtrl\t375196\n黄耳\t375197\n乐高世界\t375198\nHackRoad\t375199\n恒牛医药网\t375200\n说不上\t375201\n006号\t375202\n镇江市公安局\t375203\nfileinfo\t375204\n华润天合\t375205\n树欲静而风不止\t375206\n学习卡\t375207\n天文台\t375208\n极兵刃\t375209\ndiscuzx\t375210\n穗香\t3
75211\n深圳市质量技术监督培训中心\t375212\n龙江镇\t375213\n258元\t375214\n电脑版\t375215\n记帐\t375216\ntoa\t375217\n002137\t375218\n25平米\t375219\n尾端\t375220\nmemtester\t375221\n年底\t375222\n福建网络广播电视台\t375223\n同化\t375224\n康复师\t375225\ntdt\t375226\n越城区人民政府\t375227\nMises\t375228\n徐家汇街道\t375229\n小康之家\t375230\n宝瓶\t375231\nBO5\t375232\n灶房\t375233\n胡服骑射\t375234\n套杆\t375235\n最佳出价\t375236\n虾店\t375237\n小峰峰\t375238\n大城小事\t375239\n二妞\t375240\n涂料展\t375241\n琐谈\t375242\n乐高蝙蝠侠大电影\t375243\n钓者\t375244\n一根几米\t375245\n鲁甸\t375246\n歧\t375247\n润滑\t375248\nIDT\t375249\natan2\t375250\n纳洛酮\t375251\n李雅普诺夫\t375252\n奔驰GLC\t375253\n12366\t375254\n欲望之城\t375255\n黄米\t375256\nNuke\t375257\n省广集团\t375258\nTimeMP3\t375259\n鼻炎\t375260\n吭吭\t375261\n肉鸽\t375262\n肖星\t375263\n雅安中学\t375264\n徐直军\t375265\nproposal\t375266\nSdk\t375267\n揭西论坛\t375268\n丰田海拉克斯\t375269\n虾米音乐播放器\t375270\n198号\t375271\n萨珊\t375272\n进销项\t375273\ndataFrame\t375274\n能工\t375275\n赢球\t375276\n两株\t375277\n爱如星火\t375278\nTASCAM\t375279\n万维家电网_ea3w\t375280\n广业集团\t375281\n神经递质\t375282\n丰富性\t375283\n阳光法庭\t375284\n太阳照\t375285\ninspiring\t375286\n天龙八部单机版\t375287\n李天成\t375288\n峥\t375289\n中国社区网\t375290\n赵彬\t375291\nsydney\t375292\n张哥\t375293\n奥格\t375294\n11.4\t375295\n嘻哈帮\t375296\n病死率\t375297\n广州体检中心\t375298\n王一楠\t375299\n稠江街道\t375300\n顾青\t375301\n御江南\t375302\nSeed\t375303\n尊驰\t375304\nMiniTool\t375305\n健身裤\t375306\n静宁县\t375307\n夏玲蔓\t375308\n麦迪文\t375309\n洗机\t375310\n捷途捷途X902018款\t375311\n李叔叔\t375312\n幻想王国\t375313\n180亿\t375314\nanimals\t375315\n狂轰\t375316\n动物源\t375317\n运动障碍\t375318\n张家界酒店\t375319\n航行\t375320\n美发师\t375321\n救援窗\t375322\n魏秋月\t375323\n月票\t375324\n巡检\t375325\n加勒比海盗启航\t375326\n小丸工具箱\t375327\n该店\t375328\n加筋\t375329\n脂溢性\t375330\n哭哭啼啼\t375331\n万科翡翠书院\t375332\n140311\t375333\nZOC\t375334\n很高兴\t375335\n华众\t375336\n正庚烷\t375337\n防制\t375338\n吸血鬼\t375339\n0798\t375340\n多方\t375341\n异带\t375342\n犯罪行为\t375343\n深圳湾一号\t375344\n逆徒\t375345\n梁超\t375346\n罗先生\t375347\nrpa\t375348\n詹妮弗·洛佩兹\t375349\n大鱼板\t375350\npropy\t375351\n火之鸟\t375352\n大帝国\t375353\n音艺\t375354\n3.9j\t375355\n翩然\
t375356\n电热圈\t375357\n游骑兵\t375358\n再沸器\t375359\nSpigot\t375360\n台州网上公安局\t375361\n万通地产\t375362\n箭牌卫浴\t375363\n三国无双7猛将传\t375364\n斯林百兰\t375365\n妆美扮靓_爱靓网\t375366\n英达\t375367\nTons\t375368\n小棕瓶\t375369\n安安静静\t375370\n斯内德\t375371\n抽样\t375372\nproe3.0\t375373\nkkone\t375374\nLets\t375375\n法学院\t375376\n标准普尔\t375377\n英吉利海峡\t375378\n卓玛拉\t375379\n弘化社\t375380\n6.98\t375381\nczy\t375382\ndesigns\t375383\n华夏幸福吧\t375384\nmetalbuild\t375385\n徐家浜\t375386\ntrait\t375387\n4.78\t375388\n12个小时\t375389\n爱奴\t375390\n1519\t375391\n颍上县\t375392\n球兰\t375393\n心火龙牧\t375394\nhg0088\t375395\n县商务局\t375396\n私募备案\t375397\nproE\t375398\n化学院\t375399\n顺逆\t375400\n李娟\t375401\n连元街小学\t375402\n狼性\t375403\n黄金蟒\t375404\n独木\t375405\ntaboo\t375406\n新东方教育\t375407\n数字媒体技术\t375408\n黑山镇\t375409\nxingxing\t375410\n0130\t375411\ngrammar\t375412\nDirection\t375413\n三元里\t375414\n九莲新村\t375415\nPPTV聚力\t375416\n中煤能源\t375417\n绿色浏览器\t375418\nDNFSF\t375419\n优加\t375420\n锯末颗粒机\t375421\nrp1\t375422\n佳能60d\t375423\n端口扫描器\t375424\n韦健\t375425\nE55\t375426\n乌镇雅园\t375427\n山西医科大学\t375428\n服务部\t375429\n二零一三\t375430\ndubug\t375431\n勃兰登堡\t375432\n38亿\t375433\n陆白\t375434\n世王\t375435\n羊杂\t375436\nwidely\t375437\n缟素\t375438\nMicrosystems\t375439\n玛格南\t375440\n阳光华庭\t375441\n彭州市人民政府\t375442\n粤澳\t375443\n金财\t375444\n5五\t375445\ngltxt\t375446\n黄元御\t375447\n房氏\t375448\n弹枪\t375449\n10.x\t375450\n归\t375451\n透心\t375452\n4月24号\t375453\n刘桥镇\t375454\n以身试\t375455\nka600\t375456\n4.28\t375457\nxx小学\t375458\n新界面\t375459\n汪耿东\t375460\n煤窑\t375461\n陈明星\t375462\nsvmtrain\t375463\n大运河森林公园\t375464\n比亚迪F0论坛_汽车之家论坛\t375465\n跌落式熔断器\t375466\n25mm\t375467\n瑞虎5论坛\t375468\n学术报告会\t375469\n可分\t375470\n壮阳果\t375471\n折纸盒\t375472\n克什米尔\t375473\n印度政府\t375474\n帝国2吧\t375475\n力王之监狱力王\t375476\nantlr4\t375477\n刨花\t375478\n无人村\t375479\n卓拉\t375480\ntoolset\t375481\n双开双控开关\t375482\n河北大学新闻传播学院\t375483\n开屏\t375484\n沈佳熹\t375485\n香港消防\t375486\n贾平\t375487\n航图\t375488\n最前\t375489\n魏明帝\t375490\n艾默生\t375491\n0欧姆\t375492\n谦卑\t375493\nENTRYPOINT\t375494\n速水\t375495\novernight\t375496\n国境\t
375497\n多利\t375498\nITW\t375499\n三网\t375500\n崇圣寺三塔\t375501\n狗粮\t375502\n万官\t375503\n力偶\t375504\nblogbus\t375505\n艺网\t375506\n婚制\t375507\n12公分\t375508\n流口\t375509\n亲\t375510\nDiaper\t375511\n汕头移动\t375512\n周培公\t375513\n次声波\t375514\n砖头\t375515\nsktech\t375516\n子域名\t375517\n碧轨\t375518\n月白\t375519\n理货员\t375520\n知\t375521\n恐怖组织\t375522\nLag\t375523\n高峰村\t375524\n几双\t375525\n羽田机场\t375526\nDOX\t375527\n3uww.cc\t375528\n600601\t375529\n襄河\t375530\n6.1英寸\t375531\ntnt\t375532\n南湾街道\t375533\n若干次\t375534\nplush\t375535\n汤屋\t375536\n适时四驱\t375537\n黎明\t375538\n排列_\t375539\n头板鞋\t375540\n折腰\t375541\n通今\t375542\n中国养猪网\t375543\nGEM\t375544\n浦东发展银行\t375545\n刀妹\t375546\n运城学院\t375547\n行情\t375548\n性博会\t375549\n刘小军\t375550\n钟跃民\t375551\n凤凰楼\t375552\n私货\t375553\n老千2\t375554\n王俊\t375555\n好不\t375556\n飞月\t375557\n中华人民共和国宪法修正案\t375558\n无风险收益率\t375559\n第29\t375560\n凼\t375561\nanimoji\t375562\n糯米纸\t375563\n钻芯法\t375564\n潘光旦\t375565\n40年\t375566\n养牛\t375567\n环境度\t375568\n新幻\t375569\n梅林一村\t375570\n广告商\t375571\n天宫\t375572\n8.0000元\t375573\n酷睿I3\t375574\n飞背单词\t375575\n很傻逼\t375576\n毕业\t375577\n新影音先锋\t375578\nherbert\t375579\n回轮\t375580\n无敌剑域\t375581\n似是故人来\t375582\n安利贴\t375583\n300句\t375584\n古灵精探\t375585\n科凡\t375586\n101名\t375587\nBoao\t375588\n25【\t375589\n漫心酒店\t375590\n中鹰黑森林\t375591\n3669\t375592\n老户\t375593\nLane\t375594\n企字\t375595\n浏阳经开区\t375596\n老舍\t375597\njoystick\t375598\n山东省职业技能鉴定指导中心\t375599\n硅光电池\t375600\n腿麻\t375601\nbibizy\t375602\n筒子们\t375603\n盈江县\t375604\n采收期\t375605\n丹砂\t375606\n科学方法论\t375607\n爱信精机\t375608\nforge\t375609\n鄂温克族自治旗\t375610\n迈入\t375611\n骨架管\t375612\nblade\t375613\nqq幻想世界\t375614\n金桥小学\t375615\nraid1\t375616\nmanufacturers\t375617\n开花结果\t375618\n情景式\t375619\n缓存池\t375620\n老白\t375621\n鹰眼\t375622\nqidong\t375623\n涌入\t375624\n李珂\t375625\n28杠\t375626\n浙卫发\t375627\n赫山\t375628\n中文社会科学引文索引\t375629\nmeizhou\t375630\nNLTK\t375631\ndos2unix\t375632\n麻料\t375633\n乐图\t375634\n王庆祥\t375635\n额窦\t375636\nindication\t375637\n参战\t375638\n氰化钾\t375639\n虚空币\t375640\n冒险王\t375641\n720P.MP4\t375642\n
icoca卡\t375643\n四十万\t375644\n喜结良缘\t375645\nfridge\t375646\n和田地区\t375647\n3.93\t375648\nWARFRAME\t375649\n摩臣\t375650\n链家在线\t375651\nOlympus\t375652\n吾乡\t375653\n音译\t375654\n圆叶\t375655\n偶们\t375656\n恶作剧之吻2\t375657\n虚张声势\t375658\nESI\t375659\n超声波换能器\t375660\n花艺师\t375661\npremise\t375662\nk宝\t375663\ndeveloperWorks\t375664\n谢静\t375665\nPoll\t375666\n念叨\t375667\nLipstick\t375668\n雪橙\t375669\n开林集团\t375670\nPhyscal\t375671\n翼城县\t375672\n自由大路\t375673\n痴心\t375674\n皮米\t375675\n刘也行\t375676\n拉结筋\t375677\n排烟机\t375678\n挂号\t375679\n西林县\t375680\n微区\t375681\n满版\t375682\n真二网\t375683\nNetty5\t375684\n李平\t375685\nnikelab\t375686\n邪器\t375687\ntayese\t375688\nshijia\t375689\n2.5分\t375690\n脚踏垫\t375691\n找出路\t375692\n45年\t375693\n品牌排行\t375694\n户籍\t375695\n罕遇\t375696\n风湿骨痛\t375697\nfripside\t375698\n种豆\t375699\n中华口腔医学会\t375700\n近光灯\t375701\n贱虫\t375702\n2018046\t375703\ntodos\t375704\n红星闪闪\t375705\nGLX\t375706\nSAD\t375707\n联部\t375708\n羽博\t375709\n纺丝机\t375710\n帮宝适\t375711\n父親\t375712\n聘请\t375713\n1314\t375714\n身妻\t375715\n夜神模拟器\t375716\n维护性\t375717\n简稿\t375718\n生长素类似物\t375719\n9988\t375720\n相变\t375721\n边带\t375722\n碳晶\t375723\n所有格\t375724\n高花\t375725\n汽车起重机\t375726\n取关\t375727\n锦宫\t375728\n噬神者2狂怒\t375729\n嵇康\t375730\n主连\t375731\n米瑞斯\t375732\n尝\t375733\n睡房\t375734\n继续教育部\t375735\nRoche\t375736\ngarry\t375737\n虹桥国家会展中心\t375738\n五明佛学院\t375739\nQuick\t375740\n登山包\t375741\n安纳西\t375742\n恽代英\t375743\n旅游地产\t375744\n正海\t375745\n色版\t375746\n浮玉洞仙歌\t375747\n十三集\t375748\n八达岭长城\t375749\n法兰克福机场\t375750\n中年期\t375751\n炎症性\t375752\n通域\t375753\n阅读季\t375754\n张献忠\t375755\n182个\t375756\n钟形\t375757\n立白\t375758\nheaded\t375759\n中国工商银行个人网上银行\t375760\nLPM\t375761\n阿罗\t375762\nilo4\t375763\n85万\t375764\nEmwin\t375765\nterra\t375766\n18万公里\t375767\nOCCITANE\t375768\nmerida\t375769\n贝瑞\t375770\n小河马\t375771\n古音\t375772\n关志鸥\t375773\n街舞视频\t375774\n清商\t375775\n后视\t375776\nlibz\t375777\n2015.7\t375778\n宋林静\t375779\n火焰之纹章\t375780\n直观\t375781\n免CD\t375782\n你好网\t375783\n石家庄工程职业学院\t375784\nCe\t375785\n拼音版\t375786\noven\t375787\n我们
约会吧\t375788\n政治处\t375789\n实然\t375790\n棉垫\t375791\n淘金客\t375792\n吉利控股集团\t375793\nlessons\t375794\n购物卷\t375795\n陈瑶罗永浩\t375796\n摆明\t375797\n杉杉奥特莱斯\t375798\n婉莹\t375799\n方得网\t375800\n模板函数\t375801\n猪圆环病毒\t375802\n坪洲\t375803\nPlayground\t375804\n人资\t375805\n野渡\t375806\nuserAgent\t375807\nspacemacs\t375808\n取货\t375809\nHeiress\t375810\n美国最高法院\t375811\n自销\t375812\n钓鱼发烧友\t375813\n取决于\t375814\n安捷\t375815\n白干\t375816\n精馏\t375817\n四十条\t375818\ntrg\t375819\nLM393\t375820\n盍\t375821\n哔哔哔\t375822\n插补\t375823\n作品展\t375824\n华夏藏酒网\t375825\n十七英里\t375826\n宣讲会\t375827\n昆明高原\t375828\nmink\t375829\nuds\t375830\n二灰碎石\t375831\n钢链\t375832\n学区片\t375833\n李二嫂\t375834\n有问必答\t375835\nむ\t375836\n1根\t375837\n大海边\t375838\n前田由美\t375839\n求购\t375840\n天津酒店\t375841\n5月17日\t375842\n3秒\t375843\n沪深300\t375844\n深圳市城市设计促进中心\t375845\n天津小学\t375846\n良田镇\t375847\n你校\t375848\n马迭尔冰棍\t375849\n数字版\t375850\n天天P图\t375851\n两年间\t375852\n房源\t375853\n经济类\t375854\nnhk\t375855\n浅蓝\t375856\n溜溜\t375857\nMADE\t375858\n三点一线\t375859\nmuzi\t375860\n鱼嘴镇\t375861\n2016-06-27\t375862\n列管换热器\t375863\n拍卖法\t375864\n体质健康网\t375865\n金片\t375866\n胀满\t375867\n林杨\t375868\n1.3.5.3\t375869\n阮红\t375870\n6月份\t375871\n脊骨\t375872\n随书\t375873\n消化率\t375874\n劳保鞋\t375875\n0xc000007\t375876\n沧海横流\t375877\n上海惠灵顿国际学校\t375878\n发冠\t375879\n10.1007\t375880\n可乐罐\t375881\n45页\t375882\nB罩杯\t375883\n印证\t375884\n能不能\t375885\n壁厚\t375886\n灌排\t375887\nOPPOr9s\t375888\nrufus\t375889\n0439\t375890\n运行期\t375891\n保管柜\t375892\n按键精灵资\t375893\n繁星春水\t375894\nyuanyuan\t375895\nunrecognized\t375896\nWBA\t375897\n叠影\t375898\n胡勇\t375899\n定边县人民政府\t375900\nebb\t375901\n副号\t375902\n市市\t375903\nmta\t375904\n建设镇\t375905\n中青大厦\t375906\n浪琴环球\t375907\n第29页\t375908\n镇江酒店\t375909\n78平\t375910\n0728\t375911\n2.2.8\t375912\n佛堂\t375913\n序列表\t375914\n咬肌\t375915\n东软慧\t375916\n米饼\t375917\n和田光司\t375918\n170年\t375919\npasteclip\t375920\nexpanding\t375921\n8.01\t375922\n西庐\t375923\n危险系数\t375924\n无姓\t375925\n花粉\t375926\n而言\t375927\n重庆时时彩计划软件\t375928\n传阅\t375929\n排掉\t375930\n玄幻文\t375931\n11元\t375932\n号
源\t375933\n分时\t375934\n加分项\t375935\n55路\t375936\n荣府\t375937\n辐射率\t375938\n野火春风斗古城\t375939\nNikko\t375940\n叶绿素a\t375941\n大狮\t375942\n轴孔\t375943\n汇融\t375944\n北外新闻网\t375945\nScanners\t375946\n2017年6月19日\t375947\n住房公积金贷款额度计算器\t375948\n哈气\t375949\n二分频\t375950\n企业免费发布信息网\t375951\nMoveIt\t375952\n便利性\t375953\n严武\t375954\n50001\t375955\nTechTarget数据中心\t375956\n一个群\t375957\n翠微大厦\t375958\n荔波\t375959\n汉牡丹园\t375960\nIKUN\t375961\nexcel2010单元格\t375962\n布鲁克大学\t375963\n赢咖娱乐\t375964\nhuaqin\t375965\n裁判员\t375966\n宫爷\t375967\n曾斌\t375968\n其他地区\t375969\nsurpac\t375970\n火枪\t375971\nu01\t375972\n针灸推拿\t375973\n副标题\t375974\n伯努利\t375975\nmediaplayer\t375976\n结肠炎\t375977\n稳定化\t375978\nusj\t375979\n得乎\t375980\n环境\t375981\n火地\t375982\n摩斯\t375983\n程璧\t375984\n月牙\t375985\n水晶塔\t375986\n镭射眼\t375987\n肉类\t375988\n武汉大学出版社\t375989\n难堪\t375990\n夯\t375991\ncial\t375992\nEmmet\t375993\n车座套\t375994\n兼薪\t375995\n寡头\t375996\n师易网\t375997\n游匣\t375998\n第11章\t375999\n遭过\t376000\n卡面\t376001\n三四个小时\t376002\n找主\t376003\n客小\t376004\nXtraGrid\t376005\nelle\t376006\n洪水\t376007\n会课\t376008\n將\t376009\n忍者神龟2012\t376010\n施暴者\t376011\n本体论\t376012\n面试谈\t376013\n高金平\t376014\n中国一冶集团有限公司\t376015\n西安外事学院\t376016\n模糊层次分析法\t376017\n维A酸乳膏\t376018\nuncommitted\t376019\n不育\t376020\ng9280\t376021\n申诉期\t376022\ndesperado\t376023\n11.2.0.1\t376024\n燕岗\t376025\n王玉芳\t376026\n二胎时代\t376027\n诸葛青云\t376028\ntoone\t376029\n园园\t376030\n南屿镇\t376031\n潘礼平\t376032\n金章\t376033\n商标案\t376034\n杜丽莎\t376035\nEChart\t376036\n敬伟\t376037\n宫计\t376038\ncpv\t376039\nTook\t376040\nsoul\t376041\n北港\t376042\n李桢航\t376043\noftenlin\t376044\n2817xxxx\t376045\n天歌三生\t376046\n邹小兵\t376047\nMover\t376048\n东莞市城乡规划局\t376049\n首所\t376050\nTax\t376051\n张立东\t376052\n15项\t376053\n中山港\t376054\n扳平\t376055\n黄田坝\t376056\n首席代表\t376057\n大雁\t376058\n山高路远\t376059\n天禅寺\t376060\nYONEX\t376061\ndeficit\t376062\n云南新闻网\t376063\n风陵渡\t376064\n分部分项工程安全管理办法\t376065\n你是\t376066\n搞懂\t376067\n水资\t376068\nExpiration\t376069\n新民镇\t376070\n联文\t376071\nLegal\t376072\n零阶\t376073\n招商银行香港一卡通\t376074\nA
LE\t376075\n八识\t376076\n20160807\t376077\n璇子\t376078\n倍爽\t376079\n溧水开发区\t376080\n幸魂\t376081\n56张\t376082\n凌鹏\t376083\n吉卜力\t376084\nreadOnly\t376085\n李源根\t376086\n韶山\t376087\n水烟\t376088\nionicons\t376089\n北京远程视界集团\t376090\n美业\t376091\n平码\t376092\nshed\t376093\n茶界中国\t376094\nPPA\t376095\n慈祥\t376096\nAmelie\t376097\n李开新\t376098\nQQ炫舞\t376099\n布尔诺\t376100\n糖厂\t376101\n通惠路\t376102\n星弹\t376103\n亭湖区\t376104\n8.2.0\t376105\n湖南财政经济学院\t376106\n世界华人\t376107\n骨女\t376108\njyw\t376109\n2.8.5\t376110\n墨式\t376111\n铅粉\t376112\n复制品\t376113\nDECO\t376114\n伍迪·艾伦\t376115\n463号\t376116\nCYSB\t376117\n点绛唇\t376118\n妮飘\t376119\n伊阁\t376120\n小袁\t376121\n磁漆\t376122\n第8册\t376123\n中国新歌声第二季\t376124\npolish\t376125\n新库\t376126\n传播学概论\t376127\ne20\t376128\nSOT\t376129\n钟睒睒\t376130\n达拉斯买家俱乐部\t376131\n口臭\t376132\n冯涛\t376133\n苹果树\t376134\nGocheck\t376135\nSHINHWA\t376136\n忧桑\t376137\n鳃\t376138\n练度\t376139\n粗体字\t376140\n文er\t376141\n罩子套\t376142\n硅酸铝板\t376143\n3.98\t376144\n吉凯恩\t376145\n澳大利亚留学联盟\t376146\n库德洛\t376147\niPhone4\t376148\n黔西南布依族苗族自治州\t376149\n香蕉片\t376150\n【薇力\t376151\n海南市\t376152\n工业史\t376153\n大败\t376154\n中共湖南省委办公厅\t376155\n深圳国家税务局\t376156\n阿法替尼\t376157\n68P\t376158\n陕西省人民政府办公厅\t376159\n板层\t376160\nWOLL\t376161\n0933\t376162\n树杈\t376163\nmansory\t376164\n23岁时\t376165\nweiphone\t376166\n气流粉碎机\t376167\n强磁\t376168\n2章\t376169\n今夜入你怀\t376170\n柳传萤火之森\t376171\n英文\t376172\nBBOX\t376173\nunresolved\t376174\n吊色\t376175\n课业\t376176\n李天飞\t376177\n卡尔丹顿\t376178\n重庆工程学院\t376179\n表符\t376180\nbleed\t376181\n银辉\t376182\nTudor\t376183\n永昌\t376184\n烟台市环境保护局\t376185\n第二声\t376186\n40多年\t376187\ntod\t376188\n10.0.2\t376189\n孙广信\t376190\n胤禩\t376191\n持久度\t376192\nac97\t376193\n抹面\t376194\n远古\t376195\n气动机械手\t376196\nXue\t376197\n歇马镇\t376198\n嗑\t376199\nA8\t376200\n不约\t376201\n透传\t376202\n盖章\t376203\n黄金梨\t376204\n中国化工集团公司\t376205\n平立面\t376206\nrecycle\t376207\n下家\t376208\nTreble\t376209\n宣仪美岐\t376210\n加比\t376211\n周期\t376212\n培植\t376213\n古版\t376214\n白夜\t376215\n入瓮\t376216\n菲伯尔\t376217\n37届\t376218\n千行\t376219\n鱼窝\t376
220\n小米椒\t376221\n华诞\t376222\n春园\t376223\nV2.6.0\t376224\n水控机\t376225\n隐射\t376226\n遗爱\t376227\n药人\t376228\n铲运机\t376229\ndatalab\t376230\n安徽省工商行政管理局\t376231\n每周五\t376232\n成年期\t376233\n馒头片\t376234\n照做\t376235\n进修班\t376236\n梦想照进现实\t376237\n做家\t376238\n三科集团\t376239\noracle9i\t376240\nhspice\t376241\n光口\t376242\naccordingly\t376243\n握拍\t376244\n0W40\t376245\n咖啡厅\t376246\n俏销\t376247\n601118\t376248\n互联网周刊\t376249\n唐泽\t376250\n泷泽秀明\t376251\n2x-1\t376252\ndisplayed\t376253\n优胜劣汰\t376254\nCalligraphy\t376255\n吴磊\t376256\nElasticSearch\t376257\n回汉\t376258\n昼间\t376259\n雅玛\t376260\n词义\t376261\n北京便民网\t376262\n泳镜\t376263\n格兰迪\t376264\n工作类\t376265\n聚酰亚胺\t376266\n昏君\t376267\n石象湖\t376268\n德能勤绩廉\t376269\nswagger2\t376270\n客居\t376271\n医惠科技\t376272\nBter\t376273\n惹争议\t376274\nJiangxi\t376275\naapt\t376276\n题项\t376277\n蜂业\t376278\n平丘\t376279\n兔兔府\t376280\n喜帖街\t376281\n切身体会\t376282\n资方\t376283\n泰达航母主题公园\t376284\n金龙大道\t376285\n复建\t376286\n早孕检查\t376287\n团结村\t376288\n卡缤\t376289\n区块链神\t376290\n富锦路\t376291\n江歌案\t376292\nProvisioning\t376293\n沙鲁克·汗\t376294\nCelia\t376295\n虚列\t376296\n逆羽霞\t376297\nstall\t376298\n身份证\t376299\n2672\t376300\n博奥生物\t376301\n拍拍贷\t376302\nnmc\t376303\nSubversion\t376304\n粤日\t376305\n诗神\t376306\n李晓峰\t376307\n辽宁省财政厅\t376308\n规划建筑面积\t376309\n5800\t376310\n茁壮成长\t376311\nAUTH\t376312\nPIZZA\t376313\n频小\t376314\n柳红\t376315\n札幌市\t376316\n军舰岛\t376317\n云数据库\t376318\n协和医学院\t376319\n大有问题\t376320\nABI\t376321\n李泳\t376322\n鬼市\t376323\n魔趣\t376324\n多次方\t376325\n王牌对王牌第三季\t376326\n马丁路德\t376327\n谢震业\t376328\n李思西城男孩\t376329\nrestart\t376330\n实验机\t376331\n160万\t376332\n十牛\t376333\nSuspension\t376334\nAvicii\t376335\n八旗子弟\t376336\nhero5\t376337\n空旷\t376338\n博星\t376339\n朱超\t376340\n名盘\t376341\n得鲜\t376342\n华晨金杯\t376343\nEPUB\t376344\nnyoj\t376345\n梦幻模拟战吧\t376346\n萨奇\t376347\n萧瀚\t376348\n割补\t376349\n乌沙\t376350\nkd9\t376351\n公决\t376352\n江阴市人民医院\t376353\n板式家具\t376354\n厂机\t376355\nlru\t376356\n沈半路\t376357\n蒙古国\t376358\nUSDCNY\t376359\n戎马丹心汉匈\t376360\n申请信\t376361\n长姐\t376362\ndental\t376363\n中天金
融\t376364\n二元醇\t376365\n移动电源\t376366\n路径\t376367\n爱茜茜里\t376368\n燃武\t376369\n鸿源\t376370\n399元\t376371\nappendchild\t376372\n二郎镇\t376373\n创库\t376374\n催肥\t376375\n方底\t376376\n桃谷绘里香\t376377\n逸足\t376378\n黑旋风\t376379\n2棵\t376380\nSaints\t376381\n乌鲁木齐南站\t376382\n雅思考\t376383\n东京食尸种\t376384\n伴唱\t376385\n200万吨\t376386\n观赏鱼\t376387\n13.5米\t376388\ntodoist\t376389\n30张\t376390\nCPU超频\t376391\n600030\t376392\n文内\t376393\n实验动物_中国实验动物信息网\t376394\nDijkstra算法\t376395\n机电\t376396\nf\t376397\n喝茶\t376398\nHiufan\t376399\n蒋林\t376400\n蓝泽\t376401\n洋行\t376402\nC语言源程序\t376403\n张祖锦\t376404\n班刊\t376405\n韩春明\t376406\ncance\t376407\n格拉苏蒂\t376408\npika\t376409\n赤霞\t376410\n奂\t376411\n报批表\t376412\n爱蜜莉雅\t376413\n好歹\t376414\nmrc\t376415\n档杆\t376416\n多伦县\t376417\n拉蒂兹\t376418\n铝镁锰屋\t376419\n保险杠\t376420\n搜狗手机浏览器\t376421\n艾力\t376422\n春考\t376423\n四川省委党校\t376424\n团藏\t376425\n32倍\t376426\n单一制\t376427\nSlayer\t376428\n华邑\t376429\n顶上\t376430\n0041\t376431\n偶成\t376432\n洞头\t376433\n2卷\t376434\nrmt\t376435\n选位\t376436\n乐读书城\t376437\n伯胺\t376438\n乐和城\t376439\n泰克示波器\t376440\n只好\t376441\n重庆欢乐谷\t376442\n0514\t376443\n北京语言大学网络教育学院\t376444\n哦\t376445\n微猫\t376446\nTransactions\t376447\n极目\t376448\nBugaboo\t376449\n道岔\t376450\n美竹玲\t376451\nMEGA\t376452\nv2.0.0\t376453\n女骑手\t376454\n卡伦海滩\t376455\n局内人\t376456\n情侣酒店\t376457\n亚太网\t376458\n笔会\t376459\n琦色\t376460\n上海长征医院\t376461\n肃北\t376462\n大化\t376463\n音部\t376464\nRxAndroid\t376465\n广渠门\t376466\n维修\t376467\n8.0\t376468\n古塘街道\t376469\n陆贞传奇\t376470\n嘉\t376471\n易车号_易车网\t376472\n天涯剧社\t376473\n纽约客\t376474\nheets\t376475\n腾冲\t376476\n2.5万\t376477\n芙清\t376478\n接收\t376479\nfio\t376480\n84条\t376481\n金灿灿\t376482\n绝代神主\t376483\n睡熟\t376484\ntable表格td\t376485\n第五十四条\t376486\n奇幻小说\t376487\n淋巴结炎\t376488\n无敌\t376489\n勿扰模式\t376490\n付融宝\t376491\n心月\t376492\n炖锅\t376493\n闩\t376494\n执行官\t376495\n查评\t376496\n静冈\t376497\n梦璃\t376498\n网袋\t376499\nege\t376500\nIIS8.5\t376501\n人像头\t376502\nMIFI\t376503\n裤链\t376504\n红旗飘飘引我成长读后感\t376505\n球墨铸铁\t376506\n馋涎欲滴\t376507\ndecimal\t376508\n金斯顿大学\t376509\n38分钟\
t376510\nUCSD\t376511\n十下\t376512\n假如给我三天光明\t376513\n胜子\t376514\nfm\t376515\n孙绍骋\t376516\nThesis\t376517\n奇瑞捷豹路虎汽车有限公司\t376518\nutc\t376519\nubnt\t376520\n战斗吧剑灵\t376521\n16800\t376522\n找疯了\t376523\n高知县\t376524\n下贱\t376525\n金凤区\t376526\n超星尔雅大学生职业生涯规划\t376527\n兄弟年\t376528\nPounds\t376529\n劳动仲裁委\t376530\n1000kg\t376531\n珺庭\t376532\n贸易公司\t376533\nzlib\t376534\n不死之身\t376535\nmech\t376536\n工作条例\t376537\n梁兴扬\t376538\n资质证\t376539\n乐丽\t376540\n干冰\t376541\n东风东路小学\t376542\nobitsu\t376543\n2.5cm\t376544\n香榭水岸\t376545\n驾驶员考试网\t376546\n3月26\t376547\n车夫\t376548\nxingoo\t376549\n漆包线\t376550\nexert\t376551\n大商股份\t376552\n摩擦面\t376553\n主选\t376554\niocp\t376555\n第116期\t376556\n控制键\t376557\n黄青\t376558\ngitblit\t376559\n酒率\t376560\n叶檀宫本武藏\t376561\n住朋网\t376562\n一大口\t376563\nRequestMapping\t376564\n11.3.1\t376565\n通商\t376566\nusr\t376567\nDick\t376568\nPSoC\t376569\nUCP600\t376570\n吉华\t376571\n邪念\t376572\n北京景山学校\t376573\n下行\t376574\n10kv变压器\t376575\n缤智论坛\t376576\ncoupler\t376577\n4000块\t376578\n巨乳ファンタジ\t376579\n一阵阵\t376580\nTees\t376581\nitsm\t376582\ngsx250r\t376583\n1089\t376584\n易天富\t376585\nzombies\t376586\n相辉\t376587\nMcGraw\t376588\n土豪号\t376589\nPanic\t376590\n友金\t376591\nrc4\t376592\nspicy\t376593\n楷书千字文\t376594\n酷睿i7处理器\t376595\nYOUMI\t376596\n搜狗拼音输入法\t376597\n国家地理杂志\t376598\n苯系物\t376599\nBottles\t376600\n四川省南部县\t376601\nSARS事件\t376602\n无冠\t376603\n雷霆\t376604\n17101\t376605\n40002\t376606\n张爱军\t376607\n上海洋山港\t376608\n桌面端\t376609\n格斗场\t376610\n苏城\t376611\n10章\t376612\n诺基亚N9\t376613\nifixit\t376614\n霍启山\t376615\nTranslator\t376616\n计数资料\t376617\n蜗牛学院\t376618\n偷拍\t376619\n汉雷\t376620\n幼妻\t376621\n黑龙江省社会医疗保险局\t376622\nf363\t376623\n弥坚\t376624\n铆\t376625\n中百罗森\t376626\n野火集\t376627\n救场\t376628\n雪铁龙c3-xr\t376629\n聚道\t376630\n蓝道\t376631\n碳酸钠反应\t376632\n孙频\t376633\n易者\t376634\n金箍棒\t376635\n空表\t376636\n酒批\t376637\nwin10安全模式\t376638\n鬼医\t376639\n南阳火车站\t376640\n房屋买卖合同\t376641\n弹石\t376642\n4420\t376643\n程军\t376644\n亿欧\t376645\n造纸业\t376646\n覆膜\t376647\nmatched\t376648\npowerd\t376649\n犯愁\t376650\
nimo\t376651\n异卵双胞胎\t376652\n大富之家\t376653\n荆溪镇\t376654\n夜惊魂\t376655\n博士点\t376656\nxia\t376657\n岩类\t376658\n龙须菜\t376659\n1125\t376660\n福建省立医院\t376661\n直燃\t376662\n侨香路\t376663\n魅力十足\t376664\nmomoland\t376665\n笑死人\t376666\n提足\t376667\n偏微分方程\t376668\n清风手游网\t376669\n哈药六厂\t376670\nField\t376671\n凭据\t376672\n恒产\t376673\n广州市住房和城乡建设委员会\t376674\n3mol\t376675\n心诚则灵\t376676\n空军\t376677\nX-1\t376678\n奇犽\t376679\ntopology\t376680\n中铁十一局集团有限公司\t376681\n老农\t376682\n禾赛科技\t376683\n雪铁龙C6\t376684\n李良荣\t376685\nCF卡\t376686\n20寸\t376687\n第(三\t376688\n张北北\t376689\n打尿\t376690\nDNP\t376691\n托克托县\t376692\n18个\t376693\n复利现值\t376694\n知花\t376695\n索马里\t376696\n校企\t376697\n青龙管业\t376698\n很麻烦\t376699\n携景财富网\t376700\n废催化剂\t376701\nsnooping\t376702\n中药注射剂\t376703\n完美屏\t376704\ngetstring\t376705\n诚诚\t376706\nTales\t376707\n中国人民大学劳动人事学院\t376708\n株洲人\t376709\n广体\t376710\n全端\t376711\n知识竞赛题库\t376712\n共勉\t376713\n4卡顿\t376714\n杨悦\t376715\n主课\t376716\n耶律阿保机\t376717\n正泰\t376718\n伍勇\t376719\n散剂\t376720\nUICollectionView\t376721\n题本\t376722\n7201\t376723\n钛备份\t376724\n凌晨五点\t376725\n151dw\t376726\nPRE\t376727\n沉潜\t376728\n玩票\t376729\nqq2018\t376730\n陈克明\t376731\n警戒者\t376732\n百态\t376733\n滂江街\t376734\n蜕变\t376735\n花开花\t376736\n小白杨\t376737\n加油词\t376738\niPadAir2\t376739\n广播操\t376740\nPChouse\t376741\n檀香精油\t376742\n中端机\t376743\npj\t376744\nYunnan\t376745\n鳄霸\t376746\n07073火影忍者OL\t376747\n滚齿\t376748\n豆角\t376749\n民航总医院\t376750\n闪乱神乐:沙滩戏水\t376751\n弱阳\t376752\n应收账款周转天数\t376753\n肉桂醛\t376754\n范长江\t376755\n套管式\t376756\n林真心\t376757\nsyn\t376758\n空神\t376759\n2479\t376760\n良久\t376761\n证函\t376762\nCS95\t376763\n血清蛋白\t376764\n防盗报警器\t376765\nproc\t376766\n置信电气\t376767\n下士\t376768\n用命\t376769\nxim\t376770\n主动型基金\t376771\n电脑桌\t376772\n地铁7号线\t376773\n公路局\t376774\n五十万元\t376775\n大显身手\t376776\nCOD在线分析仪\t376777\nFormData\t376778\n83322655\t376779\n归入\t376780\n内蒙古自治区统计局\t376781\n源项\t376782\n电子有限公司\t376783\n1882\t376784\nOhm\t376785\n特快专递\t376786\n紧急电话\t376787\n存根联\t376788\nzzuli\t376789\n悠季\t376790\n三周岁\t376791\n付费群\t376792\n连不上网\t376793
\ngzz\t376794\n转件\t376795\n南京东山外国语学校\t376796\nnex5t\t376797\n奥利洪岛\t376798\n赵海宁\t376799\n定编\t376800\n贫农\t376801\n颞叶\t376802\n超能勇士\t376803\n2014年9月份\t376804\n旧志\t376805\n2k16\t376806\n重分配\t376807\n大姜\t376808\n喷香机\t376809\n德雷斯罗萨\t376810\n慧聪涂料网\t376811\n光纤通信\t376812\n勇士之门\t376813\n镀铜\t376814\n暗度\t376815\n圣药\t376816\nd7500\t376817\n脱式\t376818\n排名表\t376819\njiasuqi\t376820\njstack\t376821\n血鬼\t376822\n检验科\t376823\n油污清洗剂\t376824\n大兴路\t376825\n长安道\t376826\n空穴来风\t376827\n1994\t376828\na1465\t376829\n二s\t376830\n七龙珠剧场版\t376831\n潜心\t376832\nMc\t376833\n责人\t376834\n差额拨款\t376835\n美的地产\t376836\n磨砂膏\t376837\n制衡\t376838\naesop\t376839\n安歌\t376840\n威孚高科\t376841\nLV\t376842\n273\t376843\n自禁\t376844\n正时链条\t376845\n艾克斯\t376846\nCTD\t376847\n山东省科学技术协会\t376848\n冷胶\t376849\n篮球火\t376850\n大城子镇\t376851\n行路难\t376852\n电击\t376853\n6.18\t376854\n企业社会责任\t376855\ngps定位器\t376856\n0602\t376857\n行进\t376858\n棚车少年\t376859\nshaw\t376860\n省油宝\t376861\n历时性\t376862\n龙韵股份\t376863\n黑龙江省哈尔滨市第三中学\t376864\n冬花\t376865\n2015-2018年\t376866\n东北师范大学图书馆\t376867\n生存金\t376868\n2588\t376869\nTTP\t376870\n图样图森\t376871\n区房管局\t376872\n火毒\t376873\n窦鹏\t376874\n花雨\t376875\n欢乐人游戏\t376876\n拷\t376877\n8682赴韩整形网\t376878\n电力业务许可证\t376879\n陈楚\t376880\n团身\t376881\n断刺\t376882\n乳交\t376883\n斑块\t376884\n草精\t376885\n伊丽莎白瓜\t376886\ngroove\t376887\n冲杯\t376888\n管人\t376889\nJAP\t376890\n89路\t376891\n迷醉\t376892\n跖骨\t376893\n初中生\t376894\n8t\t376895\nImages\t376896\n620\t376897\n草颜\t376898\n奥委会\t376899\n枪术\t376900\n法定盈余公积\t376901\n滁州市公安局\t376902\n艾滋\t376903\nRenmin\t376904\n增压阀\t376905\n李国毅\t376906\n苯海拉明\t376907\n白武士\t376908\n破难\t376909\ne+\t376910\n金合欢\t376911\n徜徉\t376912\n电业\t376913\n3g\t376914\n一装\t376915\n谱分析\t376916\n对外\t376917\n纵列\t376918\n在朝\t376919\n无锡电视台\t376920\n羡仙\t376921\nrick\t376922\n秋夜雨寒\t376923\n爱乐压\t376924\n借道\t376925\n输卵管通液\t376926\n性向\t376927\n连同\t376928\n艺术学概论\t376929\n中共河南省委党校\t376930\n幸运数字\t376931\n46英寸\t376932\nStaging\t376933\n精神紧张\t376934\n税收滞纳金\t376935\n学妹们\t376936\n津率\t376937\n阿迪王\t376938\n红高粱模特队\t376939\nlima\t376940
\nfiltered\t376941\nrx10\t376942\n乡村基\t376943\n内标物\t376944\n总平\t376945\n北极星环保网\t376946\nicn\t376947\n侯马\t376948\n盘式干燥机\t376949\n我的父亲母亲\t376950\nPSE认证\t376951\n东阳马生序\t376952\n叮叮\t376953\n复验\t376954\n工业园区管委会\t376955\n电脑菜单栏\t376956\n跑姿\t376957\n仓储管理系统\t376958\n悲歌\t376959\n邢台网\t376960\n照旧\t376961\n优翼丛书\t376962\n著称\t376963\n拍摄时\t376964\nchromedriver\t376965\n0xc\t376966\n慕课猿\t376967\nhacknet\t376968\n斯诺克世界大奖赛\t376969\n15毫升\t376970\nBACK\t376971\n万达商场\t376972\n迷你机\t376973\n卡塔\t376974\n金辉\t376975\nzktime\t376976\n河北广播电视大学\t376977\n梦圆\t376978\n西方文论\t376979\n第一季03\t376980\n桂阳新闻网\t376981\n尼康d750\t376982\n駅\t376983\n同济大学环境科学与工程学院\t376984\n蒋凡\t376985\n艺术体\t376986\n即席\t376987\n生料\t376988\n原始生活\t376989\n迈锐万道剑尊\t376990\nnuoyi\t376991\nwindows命令提示符\t376992\ncosme\t376993\n线碟\t376994\n六门\t376995\n总论\t376996\n金诗雨\t376997\n见怪\t376998\n分式方程\t376999\n20000000\t377000\n在线转换器\t377001\n汾牌\t377002\n升降货梯\t377003\n吴青峰\t377004\ngreatest\t377005\nLaya\t377006\n天如玉\t377007\nzmq\t377008\n白石麻衣\t377009\n如初见\t377010\njaxen\t377011\n今日凌晨\t377012\n兴调研\t377013\n亚太7号\t377014\ngtx980m\t377015\nv1.2\t377016\n几串\t377017\n剑鞘\t377018\n热水器\t377019\n11万\t377020\n葬送\t377021\n增速器\t377022\nfairchild\t377023\n三相四线制\t377024\n支点\t377025\n齐齐哈尔市\t377026\nEXCEL批量\t377027\n天外有天\t377028\n183.61.243.1\t377029\nDLP\t377030\n孤立无援\t377031\nkp\t377032\n马伊俐\t377033\nlauder\t377034\n乌鲁木齐经济技术开发区\t377035\ndowngrade\t377036\n证券从业考试\t377037\n济宁人力资源和社会保障局\t377038\n子矩阵\t377039\nSuperSU\t377040\n领克01\t377041\n城市人口\t377042\n奥米加\t377043\n有规\t377044\n28颗\t377045\n贴春联\t377046\n宁波港\t377047\n本座\t377048\ntuning\t377049\n中国银行公司\t377050\n相对性\t377051\nAcquisition\t377052\n疆域\t377053\n雷克萨斯UX\t377054\nMechanic\t377055\n离心管\t377056\nRuth\t377057\n5315\t377058\n百度日语输入法\t377059\n最美艳\t377060\n百度优课\t377061\n大比\t377062\niPad\t377063\n阳光问廉\t377064\nmix1\t377065\n宜居\t377066\n王春彧\t377067\n虚虚实实\t377068\n拉格朗日插值法\t377069\n一字线\t377070\n新天堂\t377071\n87267630\t377072\n第6回\t377073\n玻璃电\t377074\n辊式破碎机\t377075\n1990\t377076\n深圳非凡医疗美容医院\t377077\n雅度\t377078\n冰恋\t377
079\n体现\t377080\n刘英杰\t377081\n800型\t377082\n上海应用技术学院\t377083\n咪唑啉\t377084\n小红伞\t377085\n汉口银行\t377086\nKhao\t377087\n核舟记\t377088\nproxyee-down\t377089\n乌雷\t377090\n同源性\t377091\n退税\t377092\n517购房网\t377093\n理正结构工具箱\t377094\n成人学院\t377095\nvideocapture\t377096\n粘帖\t377097\n张珊\t377098\n2.3g\t377099\n王巍\t377100\n联想g480\t377101\n墨翠\t377102\n二之国2:亡灵之国\t377103\n车公庄大街\t377104\n畅网\t377105\n帝豪完美世界\t377106\nwin10播放器\t377107\n市卫计委\t377108\n怪物猎人Online\t377109\n【保险公司\t377110\n一百首\t377111\n19册\t377112\n口漫画\t377113\n名变\t377114\nRTSP\t377115\n冰期\t377116\nEun\t377117\n液压升降机\t377118\nFontCreator\t377119\n昵称\t377120\ngrim\t377121\n利恩斯\t377122\n中国银行(香港)有限公司\t377123\n颇为\t377124\n博影\t377125\n修身\t377126\n肃州区\t377127\nFFX\t377128\nWoolworths\t377129\n缴税\t377130\nsofa\t377131\n周度\t377132\n邯郸站\t377133\nt28\t377134\n相应地\t377135\n稽山\t377136\n北京市东城区人民政府\t377137\n糖纸\t377138\n胡雪岩故居\t377139\n音男\t377140\n韦根\t377141\n/魔网\t377142\n遥控船\t377143\n1744\t377144\n闵玧\t377145\n非主流\t377146\n轰鸣\t377147\noppor11s\t377148\n怀远县\t377149\nooxml\t377150\n二等奖\t377151\n账面\t377152\n地板蜡\t377153\n季报表\t377154\n寇绍恩\t377155\n串励\t377156\n夜航\t377157\n挂心\t377158\n扫描版\t377159\n2.2KW\t377160\n停运\t377161\n赌博机\t377162\n尘灰\t377163\n高学\t377164\nUdemy\t377165\n四款\t377166\n丰都县政府\t377167\n怖\t377168\n青岛海博生物\t377169\n交联剂\t377170\n约翰·汤普森\t377171\n三国战纪3\t377172\nexcel07\t377173\n200粒\t377174\n28条\t377175\ncaoli\t377176\n2.7.9\t377177\n孔德\t377178\n孤岛惊魂5steam\t377179\n7季\t377180\n范爷\t377181\nEstimated\t377182\n拉加德\t377183\n480分\t377184\n陕西自贸区\t377185\nlennon\t377186\n树穴\t377187\n牙垢\t377188\n姆努钦\t377189\n犍为\t377190\n汉阳区政府\t377191\n综合评分法\t377192\n油画笔\t377193\n红桥区\t377194\nvuescan\t377195\n轻油\t377196\n蛮不讲理\t377197\nspd\t377198\n1.9米\t377199\n异镇\t377200\n宣讲队\t377201\n横沙乡\t377202\n详细说明\t377203\n宾西\t377204\n古剑奇谭ol\t377205\nacr\t377206\nveeam\t377207\n济南高新技术产业开发区\t377208\n教师考试网\t377209\n艾嘉\t377210\n信贷员\t377211\n常委会\t377212\n普云\t377213\n北京移动神州行\t377214\n龙采科技\t377215\nCARGO\t377216\n一个14岁\t377217\nkugoo\t377218\n慈世平\t377219\n环氧树脂胶泥\t377220\nInfini\t37722
1\n共和轩逸\t377222\n学龄前儿童\t377223\n|文\t377224\n获颁\t377225\n熟记\t377226\n法意\t377227\n李星星\t377228\n卓刀泉\t377229\nUNPB\t377230\n向度\t377231\n八都\t377232\n贺众\t377233\n会计信息\t377234\n入耳\t377235\nMACHINERY\t377236\n38度\t377237\n武汉广场\t377238\n电竞杯\t377239\n新西兰政府\t377240\nhhh\t377241\nBirds\t377242\n食行生鲜\t377243\n绿地国际博览城\t377244\n中国人民抗日战争纪念馆\t377245\n狐步舞\t377246\n1局\t377247\n2028\t377248\nneighbour\t377249\n海淘族\t377250\n万佛朝宗\t377251\n流出装\t377252\n宋朝华\t377253\n胶框\t377254\njessiej\t377255\n常州北\t377256\n海商王3\t377257\n自缚\t377258\n白鲢\t377259\n9-1\t377260\n滤水板\t377261\n中国铁合金在线\t377262\nMSD\t377263\n599\t377264\n0424\t377265\nTIL\t377266\n反间计\t377267\nnyu\t377268\n雅思哥\t377269\n套题\t377270\n上海百联\t377271\n野趣\t377272\n800目\t377273\n基础版\t377274\n星路\t377275\nCreo2.0\t377276\n轻症\t377277\nfonts\t377278\n畅言教学通\t377279\n褪去\t377280\n竹鞭\t377281\n2天前\t377282\n李铁映\t377283\n1.2.9\t377284\nLast\t377285\nAirPlay\t377286\n初沉池\t377287\n手镯\t377288\n胶块\t377289\nPIO\t377290\n105期\t377291\n夏雷\t377292\n常香玉\t377293\n人民政府办公厅\t377294\n易连恺\t377295\n九药网\t377296\nGLORY\t377297\n爱国人物\t377298\nFutura\t377299\n飞燕草\t377300\n华为C199\t377301\n河曲县\t377302\n蚊人\t377303\n800kva\t377304\nwinXP\t377305\n云麓\t377306\n220路\t377307\n瑞威\t377308\n音乐史\t377309\nimagettftext\t377310\n群养\t377311\n法斗士网\t377312\nqt5\t377313\n6201\t377314\n樱桃山\t377315\n为什么子\t377316\nsuke\t377317\n装出\t377318\n也门胡塞\t377319\n上帝之眼\t377320\n厦门公交查询网\t377321\n.exe\t377322\nApply\t377323\n672\t377324\n翠苑\t377325\n贵州银监局\t377326\n周冲\t377327\n煮酒论\t377328\n分子人类学\t377329\n南大街\t377330\n礼佛\t377331\n圆桌派\t377332\n脑力王\t377333\nN9200\t377334\n下单员\t377335\n长安医院\t377336\n600600\t377337\nnsk轴承\t377338\nsasl\t377339\n仁寿县人民政府\t377340\n人民调解法\t377341\n蛾蚋\t377342\n考实\t377343\n爆炸头\t377344\n超级马里奥兄弟2\t377345\n230v\t377346\n三层\t377347\n卡罗\t377348\nc18\t377349\n伴音\t377350\n姗\t377351\n双遗\t377352\n早婚\t377353\n科健\t377354\n岚庭\t377355\n蔡某\t377356\n温文\t377357\nPPC\t377358\n三案\t377359\nsmail\t377360\n玉皇庙镇\t377361\nbuid\t377362\n口\t377363\n小波函数\t377364\n玛丽亚凯莉\t377365\n东江传媒网\t377366\n江海涛\t377367\n朱瑾\t
377368\n拖动\t377369\n右归丸\t377370\nerosion\t377371\n东方电子\t377372\n插条\t377373\n中国民族证券\t377374\n房策网\t377375\n荷花园\t377376\n百千万\t377377\n暖石\t377378\n邝美云\t377379\n江海迦\t377380\n南京文交所钱币邮票交易中心\t377381\n红周刊\t377382\n倪光南\t377383\n领导人员\t377384\n非线性编辑软件\t377385\n制服装\t377386\n察网\t377387\n淡定\t377388\n同济大学图书馆\t377389\n月志\t377390\n黄龙山\t377391\n承租户\t377392\n纽约州立大学\t377393\n有图有真相\t377394\n肥乡\t377395\nISO感光度\t377396\n中羽在线网\t377397\n查理兹·塞隆\t377398\n深圳公安\t377399\n行体\t377400\n園\t377401\n环境学\t377402\n320LI\t377403\n欣赏\t377404\n洛基英语\t377405\n交广\t377406\n如若\t377407\n10码\t377408\n安新\t377409\n攻角\t377410\n周浦\t377411\n金湖花园\t377412\n中共云南省委\t377413\nwilcoxon\t377414\n3490\t377415\nNios\t377416\n豪锐\t377417\n概不退换\t377418\nRating\t377419\n杨式太极\t377420\n懒货\t377421\n吏部\t377422\n客卧\t377423\n重庆市经信委\t377424\n同宗\t377425\nsing女团\t377426\n共点\t377427\nrogue\t377428\n爱淘\t377429\n优游网君成录\t377430\n龙TORY\t377431\n科莱丽\t377432\n10分钟\t377433\npoocg\t377434\n拉马克\t377435\n石家庄永昌\t377436\nHGE\t377437\n太空杯\t377438\n焦段\t377439\n轮奸案\t377440\n第二行\t377441\n泉塘街道\t377442\n中科三环\t377443\n京娘湖\t377444\n武邑\t377445\n夏伟\t377446\n志业\t377447\n注册认证\t377448\n拼到底\t377449\nwatches\t377450\n中国人民大学社会与人口学院\t377451\nAPOLLO\t377452\nDLA\t377453\nshutil\t377454\n6SE70\t377455\nnumpy函数\t377456\n000826\t377457\n6.6\t377458\n琴表\t377459\n崇光百货\t377460\n电梯\t377461\n担保法\t377462\n玻璃棉\t377463\n彩种\t377464\n黄金山\t377465\n个位\t377466\n金新农\t377467\nbrowserify\t377468\n四川大学图书馆\t377469\n妙语\t377470\ncent7\t377471\n19.1.3\t377472\n全息投影\t377473\nLaravel中文网\t377474\n700个\t377475\n伍佰亿\t377476\n盆纪\t377477\nFinancing\t377478\n海伦娜\t377479\nswoole\t377480\nbute\t377481\nNovartis\t377482\nTP-link无线路由器\t377483\n中国专业人才网\t377484\n方村\t377485\n树点\t377486\n」\t377487\n五周年\t377488\nDISC\t377489\n逗逗\t377490\neditplus\t377491\n114票务网\t377492\n防邪\t377493\n继续教育学院\t377494\n四五年\t377495\n春风沉醉的夜晚\t377496\n0116\t377497\nWells\t377498\n720p.BD中英双字幕迅\t377499\n爱投屏\t377500\n大事情\t377501\nFetal\t377502\nFR4\t377503\n2017年7月20日\t377504\n奉公\t377505\n汕头市环境保护局\t377506\n本田奥德赛\t377507\nsensible\t377508
\n维康\t377509\n十九\t377510\n百分表\t377511\n韵达\t377512\nspi接口\t377513\n0772\t377514\n加号\t377515\n6图\t377516\n上律\t377517\n尿疗\t377518\n精算学\t377519\n领奖\t377520\n两声\t377521\n聚苯胺\t377522\n今度\t377523\n琐\t377524\n烽火版\t377525\n吉天\t377526\n脱敏治疗\t377527\nDS3231\t377528\n网络舆情\t377529\n百望山森林公园\t377530\n扯\t377531\ninfiltration\t377532\n病狗\t377533\n预包装食品标签通则\t377534\nhelium\t377535\n西安市房管局\t377536\nSolution\t377537\nAPUS\t377538\n3460\t377539\n大相国寺\t377540\n扣件\t377541\nt580\t377542\n优惠日\t377543\n溶洞\t377544\n钢琴调\t377545\njxx\t377546\n施敏\t377547\n0x06\t377548\n很好\t377549\n中国医美\t377550\nekf\t377551\n定压\t377552\n不端网\t377553\n至孝\t377554\n阿里妈妈\t377555\n刘科\t377556\n第三十八年\t377557\n20141125\t377558\n工商银行个人网上银行\t377559\n0.6米\t377560\ngta5内置修改器\t377561\n6岁\t377562\n长江\t377563\n配音秀\t377564\n重农抑商\t377565\n60米\t377566\n博雷\t377567\n金邦达\t377568\n350兆\t377569\n马鞍山东站\t377570\n天真蓝照相馆\t377571\n可\t377572\n改良型\t377573\nv5.0.5\t377574\n卓越工程师教育培养计划\t377575\nc21\t377576\nhsi\t377577\nstatute\t377578\n褶边\t377579\n水形物语\t377580\n狄金森\t377581\nshan\t377582\nUPC\t377583\n老鸨\t377584\nmyeclipe\t377585\n三长两短\t377586\n困难版\t377587\n任容萱\t377588\nseqing\t377589\n鲜有\t377590\nMH\t377591\n身居\t377592\n丢单\t377593\n刘招华\t377594\n五千克\t377595\nxp\t377596\n北京国安\t377597\n阿基德\t377598\n5月19\t377599\n迈过\t377600\n中华护理学会\t377601\n变身\t377602\n二次分配法\t377603\nDoppler\t377604\n李天\t377605\n马兴瑞\t377606\nH级\t377607\n巨潮资讯网\t377608\n防鸟网\t377609\n70个\t377610\n天津市南开医院\t377611\n要死\t377612\n重读\t377613\n月野姬\t377614\n不得不知道\t377615\n横盘\t377616\n林中鸟\t377617\n校苑\t377618\n山东科技大学研究生院\t377619\n曹七巧\t377620\n排练\t377621\n广东鸿图\t377622\n2017变形计\t377623\n看了又看\t377624\n【万\t377625\n目视化\t377626\n鬼途奇行录\t377627\n刘伯\t377628\nbeifen\t377629\nx^2+1\t377630\nDeepin\t377631\n1Mpa\t377632\n威科姆\t377633\n月运\t377634\n饥饿营销\t377635\nmxf\t377636\nucd\t377637\n压模\t377638\n职务\t377639\n弥勒佛\t377640\n12:00:00\t377641\n安全管理制度\t377642\n捧杀\t377643\n混插\t377644\n德克萨斯扑克\t377645\ntracer\t377646\n一个10岁\t377647\n陶勇\t377648\nwxc\t377649\n打野\t377650\n7章\t377651\nGardens\t377652\nshady\t377653\n砍人
者\t377654\n太平湾\t377655\n爱美网\t377656\n寿诞\t377657\n气孔率\t377658\n真命\t377659\n捱\t377660\n郑州网\t377661\n1.8升\t377662\n应变计\t377663\n鳌太\t377664\n升息\t377665\nwebmin\t377666\n回国\t377667\n感冒灵颗粒\t377668\n海润\t377669\n火控雷达\t377670\nladym\t377671\n福窝\t377672\n002018\t377673\n牛肉粒\t377674\n5箱\t377675\n四川省新闻出版广电局\t377676\n学位房\t377677\nDavis\t377678\n伊婉玻尿酸\t377679\n知味观\t377680\n广州市交通委员会\t377681\n揪揪\t377682\nredstone\t377683\n炎妃\t377684\n李某龙\t377685\n征途2_征途2s\t377686\n寄托\t377687\n商务联盟网\t377688\ndocuments\t377689\n国脚\t377690\n艺伎\t377691\n隔膜压缩机\t377692\n四季度\t377693\n南京大学物理学院\t377694\n长筋\t377695\n擦亮眼\t377696\n博罗县政府\t377697\n自我完善\t377698\nPARA\t377699\n周济\t377700\n铂尔曼\t377701\n金庸立志传\t377702\npath模块\t377703\n瓜沥\t377704\n150uh\t377705\n终圆\t377706\n1针\t377707\n第06章\t377708\n章飞\t377709\n当乐\t377710\n酷玩乐队\t377711\n国防科学技术大学\t377712\n250ML\t377713\nQQ飞车\t377714\n中共七大\t377715\n早川濑里奈\t377716\n濮阳人才网\t377717\n纠缠不休\t377718\n梅花网\t377719\n洛马琴科\t377720\nunversioned\t377721\n肺泡\t377722\n利木津巴士\t377723\n6.99\t377724\n鼎力\t377725\npmp认证\t377726\n鲁提辖拳\t377727\n文化自觉\t377728\nwalk\t377729\n信用卡\t377730\nzongjie\t377731\n阿雷\t377732\n多度\t377733\n雪堰镇\t377734\n影吧\t377735\n上坂堇\t377736\n调拨\t377737\n蜘蛛侠英雄\t377738\n肺力\t377739\n去油\t377740\nNantong\t377741\n忍者之路\t377742\n干净\t377743\n风感\t377744\nVGN\t377745\n华为P10\t377746\n2019财年\t377747\n武汉协和医院\t377748\n427\t377749\n火影羁绊\t377750\nvisa\t377751\n安行\t377752\n百利达\t377753\n1895年\t377754\n尽责\t377755\n劳苦\t377756\n民声\t377757\n正营\t377758\nstardsd\t377759\nv3.10\t377760\n锡类\t377761\ntermux\t377762\nIndigo\t377763\n909路\t377764\n2930\t377765\n四月初八\t377766\nw32tm\t377767\n情人保镖\t377768\nzksoftware\t377769\n番禺大道北\t377770\n邓峰\t377771\n马伯庸\t377772\n仲尼\t377773\n杜嘉班纳\t377774\n璀\t377775\n可穿戴设备网\t377776\n人民群众\t377777\nPC游戏\t377778\n八一女排\t377779\n伊萨奇巴希\t377780\n珠晖区\t377781\n孙云\t377782\n大染坊\t377783\n我喜欢\t377784\n计算机科学与技术专业\t377785\n万圣节\t377786\n袁文\t377787\n倪妮\t377788\n端端\t377789\n必威\t377790\n擘画\t377791\n意外之财\t377792\n顺河区\t377793\n百分之九十五\t377794\nK-1\t377795\n索福克勒斯\t377796\n小温\t377797\n中国共产党党委\t3
77798\n牟平\t377799\n反滤\t377800\n小姨\t377801\nassistivetouch\t377802\n唐诗咏\t377803\n一高一低\t377804\n贵阳站\t377805\n0934\t377806\n优化\t377807\nGT赛车\t377808\n网络工程师\t377809\nmbe\t377810\n中山东区\t377811\n王ygo\t377812\n超凡蜘蛛侠1\t377813\n000825\t377814\n分布式光伏补贴\t377815\nstrust\t377816\n斗米网\t377817\nbuilding\t377818\n八分之五\t377819\n北城街道\t377820\n112所\t377821\n九哥\t377822\n欢聚时代\t377823\n起重船\t377824\n李嫂\t377825\n饥肠辘辘\t377826\n减疗\t377827\n区法院\t377828\n3月1日\t377829\n字体\t377830\n出版物\t377831\nvaag\t377832\n穹\t377833\n维吾尔\t377834\nDIY之家\t377835\n江苏省特种设备安全监督检验研究院\t377836\n丁巳\t377837\n盐酸胍\t377838\n红商网\t377839\n第二季度\t377840\n塑型\t377841\n氨苄西林\t377842\n华闻\t377843\n巴哈\t377844\n拜泉\t377845\nhke\t377846\n张永平\t377847\n织锦缎\t377848\n霉豆腐\t377849\n欧米伽\t377850\n牛腩粉\t377851\n新龙路\t377852\n广汽丰田\t377853\n笑面虎\t377854\n香港体育馆\t377855\nn3ds\t377856\nwordcount\t377857\nx550v\t377858\n楚留香云梦\t377859\n重庆老火锅\t377860\n赛马大亨8\t377861\n类化\t377862\n出生年\t377863\n伤口\t377864\n阿塞拜疆\t377865\noeis.org/a051775/a051775.txt\t377866\nbaoshi\t377867\n10多次\t377868\n平坡\t377869\n北部新区\t377870\n师生情\t377871\n赵晓光\t377872\n王昱\t377873\n北京漫展\t377874\n信号栏\t377875\n堵\t377876\nsona\t377877\n白无垢\t377878\n沙河口\t377879\n柱层析\t377880\n重装机兵4\t377881\n伊人网4\t377882\n保额\t377883\n鹿皮巾\t377884\n软下疳\t377885\nhxnc\t377886\n移通\t377887\n金湖\t377888\n【尼尔\t377889\n先森\t377890\n包机\t377891\n奢侈品展\t377892\n双子峰\t377893\n大路朝天\t377894\nDVVT\t377895\n薄纸\t377896\n静脉注射\t377897\nEV150ev\t377898\n中国船舶网\t377899\nWPP\t377900\n隧道\t377901\nMSCI\t377902\n集客\t377903\n中国民营医院\t377904\n可亲\t377905\n基轴制\t377906\n四轮\t377907\n海雅\t377908\nPTC\t377909\nCDT\t377910\nnoise\t377911\n261\t377912\n奥拉帕尼\t377913\nHotmail\t377914\n断枝\t377915\n速射\t377916\n恐怖片\t377917\n一斛\t377918\n典序\t377919\n桌位\t377920\n濮阳东方医院\t377921\nMPM\t377922\n十五讲\t377923\n600115\t377924\n担保\t377925\nWORKING\t377926\n201707\t377927\n飞控\t377928\n回头草\t377929\n蚕蛹\t377930\n尖锐疣\t377931\n下台\t377932\n网旗\t377933\n北京天坛公园\t377934\njquery中文网\t377935\n沙宣\t377936\ncaliber\t377937\n等离子电视\t377938\nMeMP3\t377939\n经超\t377940\n血袭者\t377941\n华氏\t377942\n
天津市电缆总厂第一分厂\t377943\n鱼鳞病\t377944\n优惠价\t377945\n第2种\t377946\n江西省人大\t377947\ngamagama\t377948\n绝世唐门吧\t377949\n蓝奏云\t377950\n昆明西部客运站\t377951\n抟\t377952\n值集\t377953\nqiye\t377954\n外国买房网\t377955\n草帽歌\t377956\n陈天\t377957\n文盲率\t377958\n中信保诚人寿\t377959\n洞开\t377960\n转车\t377961\n压轴\t377962\n浙科\t377963\nVietnam+\t377964\nDockPanel\t377965\n金致\t377966\n髌骨脱位\t377967\n闲适\t377968\n陆风X7\t377969\n传输率\t377970\n子栏目\t377971\n华山论鉴\t377972\nOECD\t377973\n淋巴结肿大_\t377974\n李俶\t377975\n叠色\t377976\nsuperjunior\t377977\nDbContext\t377978\n茯神\t377979\n一只手\t377980\n画具\t377981\n安惠\t377982\n湖北医院\t377983\nbetway必威\t377984\n骏逸\t377985\n超能网\t377986\n林未央\t377987\n6天5夜\t377988\n千户\t377989\n户外论坛\t377990\n百名\t377991\n期价\t377992\n树果\t377993\n|史\t377994\n光明莫斯利安\t377995\n灵魂山海\t377996\n圆方橱柜\t377997\nunion\t377998\n扬子洲\t377999\n曲克芦丁片\t378000\n虚空之光\t378001\n杰尼龟\t378002\nURL\t378003\n中融民信\t378004\n天玑公馆\t378005\n试品\t378006\n所有点\t378007\n胜达\t378008\n2018-02-24\t378009\n分水岭算法\t378010\n平台化\t378011\n刘有明\t378012\n201405\t378013\nCAD图块\t378014\n晴子\t378015\n4500\t378016\n焦作现代医院\t378017\n天涯逐梦\t378018\n开孔器\t378019\n手机帝国\t378020\nLSMW\t378021\n飘帅\t378022\nV6.0\t378023\n飞剑最终幻想14\t378024\nLust\t378025\npgup\t378026\n建设工程施工管理\t378027\n莱布尼兹\t378028\nREBORN\t378029\n实相\t378030\n#include\t378031\n防风林\t378032\n飘香\t378033\nucl\t378034\n县政府办公室\t378035\n宁波大学科技学院\t378036\n贝赛尔\t378037\nJ2EE\t378038\n区委组织部\t378039\n35位\t378040\n憨鼠\t378041\n永夜\t378042\n劫持者\t378043\n安阳晚报\t378044\n台州九三腋臭康复中心\t378045\n特许经营权\t378046\n通野未帆\t378047\n挂炉\t378048\n退变\t378049\n黑水路\t378050\nkidsroom\t378051\n真霍去病\t378052\n清梦\t378053\nblocker\t378054\n生成机\t378055\n麦克白\t378056\nGTC\t378057\n狼鹰\t378058\nG3250\t378059\n徒\t378060\nipfs\t378061\nbiaozhi\t378062\n途乐\t378063\n苏宁银行\t378064\n812\t378065\nadd_header\t378066\n张小刚\t378067\nWindowsXP\t378068\n非甾体抗炎药\t378069\n张其成\t378070\n汇文\t378071\n开导\t378072\n2063\t378073\n妙境\t378074\n扑家吧_扑家\t378075\n清华大学新闻与传播学院\t378076\n爸爸去哪儿5\t378077\n罗刹女\t378078\n6段\t378079\n一物块\t378080\n拉近\t378081\n256岁\t378082\n广告人\t378083\n台芒\t378084\n梅香
\t378085\n119G网盘\t378086\n强敌\t378087\n新天龙八部3\t378088\n把握习\t378089\n无线充电板\t378090\n影视文化传媒有限公司\t378091\n人间\t378092\n拣到\t378093\n虹野梦\t378094\n低\t378095\n中国舞协\t378096\n成都城投集团\t378097\n元媛\t378098\n笨狗\t378099\n中国人民大学信息学院\t378100\nconfirm\t378101\nWelding\t378102\n花样年花郡\t378103\n寻灵大冒险\t378104\n李成林\t378105\nk次\t378106\n宅女\t378107\n闽赣\t378108\n陶涛\t378109\n高级口译\t378110\nPART2\t378111\n8881\t378112\n教费\t378113\n出体\t378114\nNuskin\t378115\n5户\t378116\n太极宗师\t378117\n有机反应\t378118\n焰\t378119\n九安\t378120\n主信\t378121\n能量转换\t378122\nZach\t378123\nALT\t378124\n尖峰集团\t378125\n忽然间\t378126\n百利宫\t378127\nanalytics\t378128\n贤人\t378129\nForm\t378130\n17shou\t378131\n取穴\t378132\nY720\t378133\n中国科学院广州分院\t378134\n淘宝摄影\t378135\n三踝\t378136\nDataList\t378137\n精灵旅社2\t378138\n1598年\t378139\n支承\t378140\nvowel\t378141\n吴杰\t378142\n恒瑞\t378143\n看课\t378144\n候补\t378145\n淘汰类\t378146\n七日\t378147\n戈夫曼\t378148\n窗型\t378149\n皮赘\t378150\n焦灼\t378151\n灵耀3\t378152\n卡伦\t378153\n返尘\t378154\nMinds\t378155\n心细\t378156\n金属盒\t378157\n少年包青天2\t378158\n安意\t378159\nAxes\t378160\n国民性\t378161\n北京辰安科技股份有限公司\t378162\n人造奶油\t378163\n艾德尔\t378164\n宋庆龄\t378165\n防刺服\t378166\n水滴形\t378167\n喜马拉雅#\t378168\n吴晓\t378169\n徐州路\t378170\n联机群\t378171\nmsxml\t378172\n凉烟\t378173\n操作数\t378174\n综]\t378175\n蟠桃会\t378176\n啪嗒砰\t378177\n复古鞋\t378178\n网表\t378179\n亚运会\t378180\n东升街道\t378181\n特等奖\t378182\n梦幻\t378183\n瓜岛\t378184\n拓展屏\t378185\n朝凪\t378186\n未尽\t378187\n新台阶\t378188\nSawyer\t378189\n姬昌\t378190\n38天\t378191\n荷兰豆\t378192\n镇江市交通运输局\t378193\nTamil\t378194\n长铁\t378195\n0727\t378196\ncommands\t378197\njoin函数\t378198\n思路客\t378199\n辛安\t378200\nie内核浏览器\t378201\n陈建华\t378202\n批准\t378203\nbuffered\t378204\n七碗\t378205\n俱乐\t378206\n5&\t378207\n断子绝孙\t378208\n长阳路\t378209\n周人\t378210\n锦阳\t378211\n合肥市市\t378212\n15万公里\t378213\n6180\t378214\n检察委员会\t378215\n金沙娱乐城\t378216\n麦迪安\t378217\n走班制\t378218\n袖扣\t378219\nGohlke\t378220\n层压\t378221\n爱屋及乌\t378222\n超次元\t378223\n中长\t378224\n搜企黄页网\t378225\n永不后悔\t378226\n倒计时器\t378227\n后继\t378228\n东台中学\t378229\n婴童展\t378230\n二院\t378231\n内政
办\t378232\n七星台\t378233\n单抗\t378234\n何处\t378235\n扰乱\t378236\n奶门\t378237\nPDFMiner\t378238\n材\t378239\nグロ\t378240\n二十条\t378241\n广州军区广州总医院\t378242\n覗\t378243\n飞毯\t378244\n301所\t378245\n黄冈市政府\t378246\n德荣\t378247\nGreed\t378248\n奥创\t378249\nHD6770\t378250\nsisters\t378251\n点卷\t378252\n小洋口\t378253\n木纹板\t378254\n武汉市旅游发展委员会\t378255\n诊理\t378256\n四川西南航空专修学院\t378257\n精仿鞋\t378258\nVultr\t378259\n大大大\t378260\n兴隆山\t378261\n提花\t378262\n冰雨\t378263\n106种\t378264\n邢菲\t378265\n美缔\t378266\n23集\t378267\n2018.2.1\t378268\n隐写术\t378269\n巴尔的摩\t378270\n艾瑞思\t378271\n雨花西路\t378272\n何许人\t378273\n通天晓\t378274\n行_\t378275\n老东家\t378276\n王金龙\t378277\n半城柳色半声笛\t378278\n第A02\t378279\n营村\t378280\n宝儿\t378281\n翁杰明\t378282\nE03\t378283\nam3\t378284\n十万元\t378285\n清华同方\t378286\nWebService\t378287\n金鹏\t378288\nMolbase\t378289\n夜空中最亮的星\t378290\n福建师范大学\t378291\nFF\t378292\nt6\t378293\n体块\t378294\n腊肠\t378295\nJS函数\t378296\n3.3_\t378297\nnocache\t378298\n富春山居\t378299\n框支梁\t378300\n抓举\t378301\n喊打\t378302\n河北水利电力学院\t378303\n空中剧院\t378304\n西昌路\t378305\nmith\t378306\nRemixes\t378307\nEntertainment\t378308\n管里\t378309\n演唱\t378310\n粪礼\t378311\n100兆瓦\t378312\n三个月\t378313\n治療\t378314\n洛阳银行\t378315\nrepeat\t378316\n戴珍珠耳环的少女\t378317\n彩羽\t378318\n手冊\t378319\n三码\t378320\n本纪\t378321\n云呼\t378322\n第24轮\t378323\n腐书\t378324\nante\t378325\n贺红梅\t378326\n一通百通\t378327\n诗作\t378328\n命币\t378329\n只剩\t378330\n卡尔萨斯\t378331\n阳新县政府\t378332\n蜂窝组织炎\t378333\nhdml\t378334\n蒋胜男\t378335\n5.6.0f3\t378336\n北服\t378337\n巅海水\t378338\n济政\t378339\nWindows10蓝屏\t378340\n爱情诗\t378341\n河南省招生办公室\t378342\n汪铎\t378343\n3166\t378344\n叶童\t378345\n宿迁论坛|鼎鼎有民|大宿网\t378346\nendnote\t378347\n波音公司\t378348\n十三四岁\t378349\n职业学校校企合作促进办法\t378350\n清沫网\t378351\n515etg\t378352\n赵莉\t378353\ns+\t378354\n泰利\t378355\n绿箱\t378356\n忆王孙\t378357\n固定资\t378358\n无中生有\t378359\n重庆市武隆区人民政府\t378360\n威骏\t378361\n警徽\t378362\n女友\t378363\n紫金山路\t378364\n纳帕谷\t378365\n绝地求生:刺激战场\t378366\n沪金\t378367\n东阳市人民医院\t378368\n11本\t378369\n思政类\t378370\n水獭\t378371\nsink\t378372\n周家庄\t378373\n几k\t378374\n怒涛\t378375\n亚
洲博览馆\t378376\n加天\t378377\n蛋粉\t378378\n库欣\t378379\n土鳖虫\t378380\nvsf\t378381\n宇讯\t378382\n8421\t378383\n打通\t378384\nIntense\t378385\n地球仪\t378386\n斗罗大陆神界传说\t378387\n退回\t378388\n天翼4g\t378389\n轮上\t378390\n加缘\t378391\n公映\t378392\n中栈\t378393\n滁州市人民政府\t378394\nJustin丶Li\t378395\n美伦\t378396\n收纳袋\t378397\n霍思燕\t378398\n祘\t378399\n6211\t378400\n二保\t378401\nSchweser\t378402\n卸妆膏\t378403\n钱袋宝\t378404\nz77p-d3\t378405\n枫桥居花卉网\t378406\n魔龙诀\t378407\n普米克\t378408\ngourp\t378409\n结算户\t378410\n张豪\t378411\naaaaaa\t378412\n磁条\t378413\n舞蹈团\t378414\nhainu\t378415\n英雄群侠传\t378416\n天津工业大学\t378417\n动态化\t378418\n蓝_\t378419\n丧病\t378420\n枣庄山亭区\t378421\n17年4月\t378422\n小艳\t378423\n洛丹伦之战\t378424\nQAM\t378425\nTMT网\t378426\n70平\t378427\n大手子们\t378428\n卫生型\t378429\nHTC\t378430\n鄂中\t378431\n大明悲惨世界\t378432\n失效率\t378433\n邵美琪\t378434\n赛福天\t378435\n1230v2\t378436\n走出\t378437\n南海新区\t378438\n出错\t378439\n航空箱\t378440\n光明新村\t378441\n半篇\t378442\n史子集\t378443\n汇悦城\t378444\n拥趸\t378445\n溧阳市人民政府\t378446\nmax2009\t378447\n费希特\t378448\n美月\t378449\n匹伐他汀钙片\t378450\nevol\t378451\n香吉士\t378452\nSoviet\t378453\n海产品\t378454\n当家人\t378455\n卷耳\t378456\n评注\t378457\n欺身\t378458\n2008年前\t378459\n20160609\t378460\n内网\t378461\nGoroutine\t378462\n电子称\t378463\nbeo\t378464\n朦胧诗\t378465\n马先生\t378466\n中山大学中山眼科中心\t378467\n张梓\t378468\n通润\t378469\nSNCF\t378470\njinyi\t378471\n中山街\t378472\n试错\t378473\n随想录\t378474\n公安部举报中心\t378475\n龙女\t378476\n珠海市\t378477\n亲子装\t378478\n上岸\t378479\n样本库\t378480\n网络机顶盒遥控器\t378481\n狮泉河\t378482\n维修商\t378483\n万年花城\t378484\n一把青\t378485\n真不容易\t378486\n美瑞\t378487\n65周岁\t378488\ncod4\t378489\nStacey\t378490\n黑丸\t378491\n中央一号\t378492\n西宁长海医院\t378493\n安德森\t378494\n周穆王\t378495\n朱小丹\t378496\n请罪\t378497\n1031\t378498\n130个\t378499\n碧瑶\t378500\nSynonyms\t378501\n文明乡\t378502\n何等\t378503\n电容触摸屏\t378504\ne7440\t378505\n嘉宝莉\t378506\n大洋网\t378507\nWACOM\t378508\n0.0005\t378509\n清真食堂\t378510\n罗茜茜\t378511\n帮送\t378512\n石家庄百姓网\t378513\nPANTONE号\t378514\nKama\t378515\n脸孔\t378516\n卫星导航\t378517\n病毒性感冒\t378518\n店铺\t378519\n杠上\t378520\n司马南\t37
8521\n太平洋财产保险\t378522\n葛坚\t378523\n飞歌\t378524\n小鹅\t378525\nqiao\t378526\n万地\t378527\n墨攻\t378528\n鬼脚\t378529\n雄安集团\t378530\n易胖\t378531\n爱国统一战线\t378532\njll\t378533\n歪道\t378534\n*\t378535\n林凤娇\t378536\n叶荣添\t378537\n几间\t378538\nServer2005\t378539\n大冢\t378540\n后盾\t378541\nflish\t378542\n中途岛战役\t378543\nsynchronize\t378544\n聚美\t378545\n沈阳公交网\t378546\n鼻甲\t378547\n苍天之拳\t378548\n4000美元\t378549\n嵩明网\t378550\n婚前财产公证\t378551\n训练营\t378552\n德感\t378553\n司炉工\t378554\n700年\t378555\n布哈林\t378556\n三国战纪2\t378557\natmega\t378558\n祝桥\t378559\n阳台山\t378560\n冷压\t378561\n中国平安\t378562\n侵吞\t378563\n圆盘给料机\t378564\n安达充\t378565\n网易MUMU\t378566\n黑天使\t378567\nLiving\t378568\nCorrupted\t378569\n活性污泥法\t378570\n催付\t378571\n像雾像雨又像风\t378572\nmyeclipse2015\t378573\n偷录\t378574\n成骨\t378575\nXlsxWriter\t378576\n华山\t378577\n堰\t378578\n百度手机浏览器\t378579\nREVisionFX\t378580\n拉布拉多\t378581\n稳定器\t378582\np20\t378583\n住房贷款利率表\t378584\n促性腺激素\t378585\n红枣粥\t378586\n窦太后\t378587\nFUNK\t378588\n迪陶资讯网\t378589\n康斯坦丁\t378590\n叨扰\t378591\n天准\t378592\n学话\t378593\n春英广场舞\t378594\n西南商贸城\t378595\n司前镇\t378596\n车底\t378597\nBenchmarking\t378598\n树突\t378599\n空间素材吧\t378600\ndistro\t378601\n行星减速机\t378602\n文州\t378603\n注意注意\t378604\n标人\t378605\n爱妃\t378606\n职业生活\t378607\n最好的爱\t378608\n秋野千寻\t378609\n京a\t378610\n责任\t378611\nk70\t378612\n丽姿\t378613\n100N\t378614\n流域\t378615\n秘照\t378616\n苹果4S\t378617\n高战\t378618\nsourc\t378619\n暗芝居\t378620\n唱反调\t378621\n56号段\t378622\n盘江\t378623\nWiKi\t378624\n天降大任于斯人\t378625\n苏州工商银行\t378626\n散热\t378627\n1316\t378628\n距骨\t378629\n25.4\t378630\nIWC万国表\t378631\n帝陀\t378632\n浜\t378633\n唯创\t378634\n能覆舟\t378635\n长脂肪粒\t378636\njquery禁用按钮click\t378637\n指数型基金\t378638\n国联证券股份有限公司\t378639\n骚扰电话\t378640\n都挺好\t378641\nsprit\t378642\n冷库板\t378643\n大军师司马懿之军师联盟\t378644\n牙体\t378645\nHPE\t378646\n满腹经纶\t378647\nnginx1.12\t378648\n立法权\t378649\nISO22000\t378650\n自恃\t378651\n氢化丁腈橡胶\t378652\n扩束\t378653\nRO膜\t378654\n法令纹\t378655\n戒色吧\t378656\n东华工程科技股份有限公司\t378657\n脱毛膏\t378658\n绳\t378659\n王元\t378660\n10W-40\t378661\ntiempo\t378662\n草莓园\t
378663\n韩日\t378664\n侠盗飞车圣安地列斯\t378665\nULTRAMAN\t378666\n5577\t378667\n韩露\t378668\n炒作\t378669\nexcute\t378670\n039\t378671\n龙之岛\t378672\nFileMaker\t378673\n电磁锁\t378674\n阿诺德施瓦辛格\t378675\n包皮垢\t378676\n漂浮\t378677\n研发型\t378678\n5片\t378679\n第44期\t378680\nbigtits\t378681\n免耕\t378682\n上海期货\t378683\n现化\t378684\n茎尖\t378685\n中国国际商会\t378686\n七日内\t378687\nuicollection\t378688\n长期股权投资成本法\t378689\n爱康国宾\t378690\narchoncap\t378691\n抗性\t378692\n橡胶带\t378693\n内窥镜\t378694\n中规中矩\t378695\n28项\t378696\n乐苑\t378697\n道树\t378698\nul\t378699\n相授\t378700\n爱过才懂情浓\t378701\n斯坦·李\t378702\n赶圩\t378703\nOVF\t378704\n马建军\t378705\n虢镇\t378706\n吉利公司\t378707\n涂出\t378708\njQueryMobile\t378709\n咕哒子\t378710\n备有\t378711\n慈铭体检中心\t378712\n求生者\t378713\n秘书学\t378714\nqinq\t378715\n嘉定安亭\t378716\nAnimation\t378717\n协同办公软件\t378718\n3.5厘米\t378719\n分泌性\t378720\n母子\t378721\n雅安碧峰峡\t378722\nAdaptation\t378723\n姚期智\t378724\n盛林\t378725\n尿管\t378726\n组织法\t378727\ngil\t378728\n胶漆\t378729\n内分泌腺\t378730\n投诉\t378731\n陕甘宁\t378732\n中国咖啡网\t378733\n直达车\t378734\n五声\t378735\n重合率\t378736\nname=\t378737\nDD-WRT\t378738\n滑雪板\t378739\n180万\t378740\n11月30日\t378741\n后缀树\t378742\n课标卷\t378743\n20160228\t378744\n管链\t378745\n玩艺\t378746\n门轴\t378747\n師\t378748\n吸胸\t378749\n#推文#\t378750\n通缉犯\t378751\nASPH\t378752\n天鹅湖花园\t378753\n罗赛洛\t378754\n锦源\t378755\n起诉人\t378756\nResolve\t378757\nx6s\t378758\n6.8亿元\t378759\n每年\t378760\n茆\t378761\n巴霍巴利王:开端\t378762\n水痘疫苗\t378763\n丽江师范高等专科学校\t378764\n特锐德\t378765\n约婚\t378766\n胡启立\t378767\n热成型钢\t378768\n102.4\t378769\nbf\t378770\nWijmo\t378771\n默契\t378772\n赵瑞龙\t378773\nMOOCs\t378774\n双证\t378775\n吴京樊振东\t378776\n团贷款\t378777\nstrcpy_s\t378778\nswan\t378779\n6平米\t378780\n雨幕\t378781\nPD-1\t378782\nwhales\t378783\n购彩\t378784\nSHIT\t378785\nChatroulette\t378786\n好嘛\t378787\n张均\t378788\n蒋艳萍\t378789\n梦幻西游bb\t378790\nPuTTY\t378791\nnorm\t378792\n斯塔\t378793\nvmvare\t378794\n广州燃气集团有限公司\t378795\n万鄂湘\t378796\nemergence\t378797\norgans\t378798\n南天竹\t378799\ncarefully\t378800\ncradle\t378801\npaperuri\t378802\n矮化\t378803\n河南省工商局\t378
804\n峨山路\t378805\n库管\t378806\n甲寅日\t378807\n苍翠\t378808\n片层\t378809\n九牌\t378810\n昆池岩精神病院\t378811\n路\t378812\n磐安县人民政府\t378813\n江浦\t378814\n偏心轴\t378815\n利木津\t378816\n土灶\t378817\n有性生殖\t378818\nexposure\t378819\nmolestia吧\t378820\n哈密顿\t378821\n广铁\t378822\nphuket\t378823\n龙港新城\t378824\nbyte\t378825\n好妈妈\t378826\nOAuth\t378827\n亲历记\t378828\nBeoplay\t378829\nMalls\t378830\nG++\t378831\n彭吉象\t378832\n钜记\t378833\n临沂市住房公积金管理中心\t378834\n公头\t378835\n微信表情包\t378836\nDelicate\t378837\n铁字\t378838\n杂活\t378839\n魁星楼\t378840\n南模中学\t378841\n全哨\t378842\n公交网\t378843\n火焰\t378844\n茶杯头\t378845\n明溪\t378846\n草鱼\t378847\n曾宝仪\t378848\nCD盘\t378849\n陌生男\t378850\n凌乱\t378851\n作灶\t378852\n羊毛毯\t378853\n广西壮族自治区卫生和计划生育委员会\t378854\n紫山药\t378855\n黄湾\t378856\n美宴\t378857\n杰伦\t378858\n内蒙古电子信息职业技术学院\t378859\n核孔\t378860\n直和\t378861\n一分半\t378862\n米雪\t378863\nAIRCROSS\t378864\n丽普司肽\t378865\nsaurus\t378866\n合作型\t378867\nIPhone\t378868\n绅宝X35\t378869\n蹁跹\t378870\n克拉\t378871\n博器\t378872\n嘉卡贷\t378873\nacunetix\t378874\n跳跳鱼\t378875\nB150M\t378876\n生徒\t378877\n伟才\t378878\n危险源辨识_\t378879\n绛\t378880\n编织品\t378881\n草帽海贼团\t378882\n若鱼\t378883\n呆呆兽\t378884\n信用办\t378885\nsaturday\t378886\n工作经济\t378887\n12亿\t378888\nCA199\t378889\nI9500\t378890\n兔狗\t378891\n早恋\t378892\n分户\t378893\n会计师资格证\t378894\n活到老\t378895\n史莱母\t378896\n冒险盒子\t378897\nCCAV5\t378898\n凉子\t378899\n巡逻队\t378900\n康都\t378901\n爱慕\t378902\n金华市公安局\t378903\nDART\t378904\n第60届\t378905\n底板\t378906\nピス\t378907\n43.5\t378908\nhanding\t378909\ninhibitors\t378910\n玖誉\t378911\nJJVOD\t378912\nChubby\t378913\n汤丽柏琦\t378914\n五六月\t378915\n绝宠\t378916\n范例\t378917\n贝多多\t378918\n节操\t378919\n建筑工程专业\t378920\n2512\t378921\n扬州市人力资源和社会保障局\t378922\n本月起\t378923\n护坡砖\t378924\nheadphone\t378925\n8600元\t378926\n中小企业信息网\t378927\n迅游网\t378928\nKeep\t378929\n原子团\t378930\n阿豆\t378931\n站直\t378932\n面庞\t378933\n炸鸡块\t378934\n省直\t378935\nDiskStation\t378936\n中码\t378937\nFern\t378938\n黄子华\t378939\n猜测\t378940\n性征\t378941\nfancybox\t378942\n原鸡\t378943\n病历单\t378944\n小妇人\t378945\n2018-04-28\t378946\n福特汽车公司\t378947\
n黄果柑\t378948\n风雅\t378949\n委托代销\t378950\npingfang\t378951\n安朗杰\t378952\n佤族\t378953\n小金卡\t378954\nkesha\t378955\n丁冬\t378956\n突显\t378957\n三星c5\t378958\n北京亚运村\t378959\n超弹性\t378960\n梅龙镇\t378961\n反义词\t378962\n昂首挺胸\t378963\n姉\t378964\n迟延履行\t378965\n徐俐\t378966\n广东站\t378967\n汇泰龙\t378968\nvans\t378969\n170618\t378970\n算尽\t378971\n农业发展银行\t378972\n过来\t378973\n国电科学技术研究院\t378974\n作字\t378975\n上海交通大学外国语学院\t378976\n中山路步行街\t378977\n祁县\t378978\n阉鸡\t378979\n靠右对齐\t378980\nontouch\t378981\n舒舒服服\t378982\n彭水县\t378983\nbroth\t378984\n亿房网\t378985\n鬼叫春\t378986\n财富号评论\t378987\n高科西路\t378988\n218.89\t378989\n等你回家\t378990\n万金\t378991\n2800元\t378992\n长胜\t378993\nfl\t378994\n跨服战\t378995\n新力琥珀园\t378996\nAvenstar\t378997\n选择法\t378998\nMegumi\t378999\n名商标\t379000\n热固性树脂\t379001\nVelocity\t379002\n三本\t379003\nremote\t379004\npdfedit\t379005\n海南农垦\t379006\nBCI\t379007\n百合花园\t379008\n清尘\t379009\n正章\t379010\n电体\t379011\n腮板\t379012\n我的眼泪\t379013\n上去\t379014\n厅柜\t379015\n氟哌啶醇\t379016\n几只\t379017\n王者之舞\t379018\nYN\t379019\n义城\t379020\n增肥\t379021\n张英席\t379022\n多篇&#41\t379023\n狼窟\t379024\nECON\t379025\n900元\t379026\n云纪\t379027\n五谷豆浆\t379028\n双宫\t379029\n畜牧场\t379030\nxbee\t379031\n垃\t379032\n素女经\t379033\nhd2017\t379034\n幼升小划片\t379035\n云南省纪委\t379036\n中密\t379037\n大帽\t379038\n配盘\t379039\nmama\t379040\nunpkg\t379041\n974\t379042\n6.3.31\t379043\n瞧\t379044\n石鼓\t379045\nMartial\t379046\n)国际贸易有限公司\t379047\n少女终末旅行\t379048\n5CD\t379049\n错觉\t379050\nMediaPlayer\t379051\n中鸣\t379052\n器件\t379053\n淡水路\t379054\n七匹狼\t379055\n综合执法局\t379056\nCVT变速箱\t379057\n数名\t379058\nSolarWinds\t379059\n马图\t379060\n呼和浩特东站\t379061\n1月27日\t379062\n50校\t379063\n克里夫\t379064\n原纱央莉\t379065\nentri\t379066\n张思帆\t379067\n场下\t379068\n中国刑事警察学院\t379069\n薛平贵\t379070\n多任务分屏\t379071\n蜣螂\t379072\n有儿\t379073\n农展馆\t379074\n谷歌版\t379075\n燃面\t379076\n卡婊\t379077\n日常生\t379078\n株洲方特梦幻王国\t379079\n爱信诺\t379080\nCorsair\t379081\nmk1\t379082\n鉴别仪\t379083\nTo\t379084\n水培\t379085\n幸福人\t379086\n新条\t379087\n愤而\t379088\nChandler\t379089\n酬乐天扬州\t379090\n金包银\t379091\n雅诗阁\t379092
\n思想品德\t379093\n135个\t379094\n孙鹏\t379095\n旅人\t379096\n0.2.5\t379097\n彩田\t379098\n上外高\t379099\n白细胞减少症\t379100\n拉玛西亚\t379101\n平度\t379102\n万能光驱驱动\t379103\n广西中医药大学附属瑞康医院\t379104\n热收缩包装机\t379105\n文运\t379106\n检测员\t379107\n分配表\t379108\n扎西平措\t379109\n青少年期\t379110\n聚硅氧烷\t379111\n宅宅AVDay\t379112\n反战\t379113\n本特克\t379114\n送给\t379115\n孔子学院\t379116\n2pin\t379117\n浑天仪\t379118\n精液\t379119\n下生\t379120\n联络站\t379121\n杨建新\t379122\narguments\t379123\nliteracy\t379124\n放弃我抓紧我\t379125\n云衣\t379126\n城南新区\t379127\n123名\t379128\n刀马旦\t379129\n伏位\t379130\nippbx\t379131\n语录大全网\t379132\n王瑾\t379133\nlia\t379134\n干接\t379135\n重案六组\t379136\n壮游奇迹世界吧\t379137\n阿提拉\t379138\n成都房产信息网\t379139\n党湾镇\t379140\n甜品\t379141\n曾庆平\t379142\n山西省旅游发展委员会\t379143\n至尊传奇\t379144\n帷幔\t379145\n31所\t379146\n斗式提升机\t379147\n1576\t379148\n马黛茶\t379149\n矫平机\t379150\n预售房\t379151\n空亡\t379152\n李七夜\t379153\n宠狐成妃\t379154\n蜀山战纪2踏火行歌\t379155\nf108\t379156\n木香\t379157\nbeeline\t379158\nPixel2\t379159\nmoshow\t379160\n佛山市三水区政府网\t379161\n天美\t379162\n伍声\t379163\n贬值率\t379164\n青枫墅园\t379165\n春风路\t379166\n不忘\t379167\n张柔\t379168\n太阳能发电板\t379169\n赴澳\t379170\nmaxscale\t379171\n舒利\t379172\n林志强\t379173\n关于人民法院执行工作若干问题的规定\t379174\n包价\t379175\n盐城中学\t379176\n神雕帝豪\t379177\n3月26号\t379178\n制革\t379179\ndiscussed\t379180\n紫菀\t379181\n杨宝森\t379182\n武汉轻工大学\t379183\n电信工程学院\t379184\n院友\t379185\n做自己\t379186\nCARAT\t379187\n算爱\t379188\n浴巾\t379189\n养死\t379190\n不腐\t379191\n新兵种\t379192\n范忆琳\t379193\n九片\t379194\n陆志鹏\t379195\nvitamix\t379196\nVxWorks\t379197\n河北省工商局\t379198\n15岁\t379199\n2.3.6\t379200\n求闻\t379201\nnetty5\t379202\n男人和女人\t379203\n叶伟信\t379204\n3467\t379205\nA8L\t379206\n佛教\t379207\n风寒\t379208\nBluray\t379209\nstm32f405\t379210\n航旅\t379211\n冰激凌卡\t379212\nlcd12864\t379213\n利子\t379214\n一体战\t379215\n省值\t379216\nsk5\t379217\n清苑县\t379218\nlocated\t379219\n国际版\t379220\n落籍\t379221\n临渊羡\t379222\n路虎汽车\t379223\n5.1_\t379224\n维护者\t379225\nMiniature\t379226\n60后\t379227\n下限\t379228\n荷叶茶\t379229\n632路\t379230\nBECKHOFF\t379231\nHerb\t379232\n崂山茶\t379233\n砍\t379234\nJan
\t379235\n街霸5吧\t379236\nruntastic\t379237\n旱柳\t379238\n1.4cm\t379239\n宽口\t379240\n活尸\t379241\n柴蔚\t379242\n林子大了\t379243\n圆环形\t379244\n分布式光伏发电\t379245\nAge\t379246\n百颐年\t379247\nretrieving\t379248\n沥液\t379249\n杨屹\t379250\ngudu\t379251\n裂空\t379252\n魏礼群\t379253\n3.18\t379254\n团购导航\t379255\n癸亥日\t379256\n张小砚\t379257\n迪约科维奇\t379258\n辛夷\t379259\n201\t379260\n握笔\t379261\n中储智运\t379262\n邓朴方\t379263\n京市\t379264\nFastJson\t379265\n王立国\t379266\n在先\t379267\nIntegrative\t379268\n盐袋\t379269\n鞠婧祎\t379270\nnotations\t379271\nBCB\t379272\n融信中国\t379273\n张惠妹\t379274\n欠妥\t379275\n骑马与砍杀乱舞水浒\t379276\n茵栀黄口服液\t379277\n回复\t379278\n黑豆粉\t379279\n破胆\t379280\n旭辉宝\t379281\n袅袅\t379282\ndaren\t379283\n黔江区政府网_重庆市黔江区人民政府\t379284\n蒹葭汉化组\t379285\n爱伦·坡\t379286\n昂昂昂\t379287\n六角块\t379288\n光衰\t379289\n胜肽\t379290\n传片\t379291\n曲轴箱\t379292\n陈觉\t379293\n铃木心春\t379294\nご\t379295\n奶猫\t379296\n一盒\t379297\n参数估计\t379298\n河北正定中学\t379299\n验方\t379300\n2017.1.5\t379301\n舞蹈家协会\t379302\n349元\t379303\n中国工商银行手机银行\t379304\n新开河\t379305\nwxparse\t379306\n逼仄\t379307\nvite\t379308\n第62批\t379309\n美利奴\t379310\nmyy\t379311\n1.4GB\t379312\n作成\t379313\n塞纳\t379314\n九和\t379315\nkp7s1\t379316\nV60\t379317\n美食的俘虏\t379318\n乌海\t379319\n跳乐\t379320\nVMS\t379321\n匠人\t379322\n重卡\t379323\n金贸\t379324\n都市花园\t379325\n知产新闻-7号网\t379326\n惠福西路\t379327\n2018年2月5日\t379328\n1266\t379329\n市规划局\t379330\nmfmdaoyou\t379331\nDataTable\t379332\n五款\t379333\nqq\t379334\nliaotian\t379335\n地狱门\t379336\n一起音乐吧\t379337\n安顺学院\t379338\n寻龙点穴\t379339\n二十冶\t379340\n病毒灵\t379341\n熊毅\t379342\n66603251\t379343\n惠州南站\t379344\n强启\t379345\n小网红\t379346\n磷\t379347\n三星S9+\t379348\n小树叶\t379349\n仙碱\t379350\n第73期\t379351\n至真\t379352\n阜康市人民政府\t379353\ny700\t379354\n张福生\t379355\n蒋至乙\t379356\n欧打\t379357\nkni\t379358\n夜下\t379359\n殉情\t379360\n丁香茶\t379361\n绕绕\t379362\nDataGrip\t379363\n何怀宏\t379364\n不孕症\t379365\nbme\t379366\nSpecification\t379367\n玲\t379368\nQQ群号\t379369\n鱼化石\t379370\n7&\t379371\n史强\t379372\nisg\t379373\necmall\t379374\n抱柱\t379375\n类案\t379376\n八十块\t379377\n7815元\t379378\n阿部\t3
79379\n沈师\t379380\n乌鲁木齐站\t379381\nror\t379382\n常在你左右\t379383\n60g\t379384\n镇江市区\t379385\n市证\t379386\n有道笔记\t379387\n玩主\t379388\n发布员\t379389\n酬\t379390\n正确\t379391\n陈朝晖\t379392\n飘色\t379393\n台剧\t379394\n哈米\t379395\n氟西汀\t379396\n十来岁\t379397\n歼10C\t379398\n杨舟\t379399\n信纸\t379400\n32篇\t379401\n微美\t379402\n张人亚\t379403\n淳安\t379404\n达希纳\t379405\n名雅\t379406\nnoel\t379407\n40号\t379408\n东华初级中学\t379409\n成本岗\t379410\n金杯汽车\t379411\n命制\t379412\n协同过滤推荐算法\t379413\n压限器\t379414\n截然不同\t379415\n九斤姑娘\t379416\n托邦\t379417\n花谢花飞花满天\t379418\n阶级斗争\t379419\nImposing\t379420\n发响\t379421\n子母门\t379422\nrx\t379423\n1.24E\t379424\nSUPPLY\t379425\n书香云集小说网\t379426\n法布雷加斯\t379427\n嘉信\t379428\n五一景区\t379429\n高达中\t379430\n李笑\t379431\n东南西北风\t379432\n汉腾X5\t379433\n警卫\t379434\nFederation\t379435\nIview\t379436\n赵作海\t379437\n标标\t379438\n水行\t379439\nGenelec\t379440\n斩马谡\t379441\n刚毅\t379442\n画堂春深\t379443\n笑林广记\t379444\n通用券\t379445\njodd\t379446\n预装式\t379447\n授勋\t379448\n前赤壁赋\t379449\n期望值\t379450\nG罩杯\t379451\n李固\t379452\n远界\t379453\nExpanded\t379454\n佩蒂\t379455\nTherapeutics\t379456\n天津先锋网\t379457\ndigicert\t379458\n赵辉\t379459\n我怀念的\t379460\n平形\t379461\n一期一会\t379462\n扇出\t379463\n媒资\t379464\ncross\t379465\n装具\t379466\n绿地商务城\t379467\n车队长\t379468\n安徽水利开发股份有限公司\t379469\n星空\t379470\n阿斯顿大学\t379471\n陈雨\t379472\nDts\t379473\nWJSN\t379474\n三实\t379475\n竖版\t379476\n中仪\t379477\nWAV\t379478\n43000\t379479\n头形\t379480\n都梁\t379481\n希洛\t379482\nbrooks\t379483\n1宫\t379484\n炮仙\t379485\n鱼人\t379486\n受压\t379487\n河洛清风网\t379488\n艾特\t379489\n泰国大使馆\t379490\n爱眼\t379491\n华大使馆\t379492\n清水泥\t379493\n玉带河\t379494\n2i\t379495\n净增加额\t379496\n飞傲x5\t379497\n搞定\t379498\nmagnitude\t379499\n微喇裤\t379500\n百合汤\t379501\n将官\t379502\n注册表文件\t379503\n有邻\t379504\n九江市人力资源和社会保障局\t379505\n0554\t379506\n七亿\t379507\n帝尊\t379508\n天大\t379509\nsettimer\t379510\nOfficial\t379511\n文明小区\t379512\nemui4.0\t379513\n凡人歌\t379514\nnginx+tomcat\t379515\n灰烬之灵\t379516\nZ3+\t379517\n丽声拼读故事会\t379518\n双眼皮手术\t379519\n驰\t379520\n盖销\t379521\nANC\t379522\n常州工程职业技术学院\t379523\n冷泉村\t3795
24\n海宁城际铁路\t379525\n深圳国资委\t379526\navplayer\t379527\n西迷\t379528\nmatrix\t379529\n中国钦州政府\t379530\n对坐\t379531\nSCB\t379532\n朱洪\t379533\n英雄联盟之绝世无双\t379534\nBooks\t379535\n死于\t379536\n网易网盘\t379537\n药膳\t379538\n8岁\t379539\n陈婷婷\t379540\n陷阵之志\t379541\nn50\t379542\n万能胶\t379543\n拨掉\t379544\n制款\t379545\n基础教育处\t379546\n小画\t379547\n批地\t379548\n桂花蜜\t379549\n巴氏合金\t379550\n连点器\t379551\nzx1\t379552\n庄稼人\t379553\n商业房\t379554\n菊石\t379555\n中公选调生考试网\t379556\n女式\t379557\n侯府\t379558\n褪墨\t379559\n金顶\t379560\n前置性\t379561\nmokuai\t379562\n强奸犯\t379563\n暗黑破坏神3\t379564\n乐檬k3\t379565\n简捷\t379566\n昭著\t379567\n1861\t379568\n凯德\t379569\n力主\t379570\n敏感型\t379571\n九孔\t379572\nLai\t379573\n导库\t379574\nZXP\t379575\n自动离合器\t379576\n老草\t379577\n藏舞\t379578\n海马濑人\t379579\n相差\t379580\n下岗职工\t379581\n季军\t379582\nTeam\t379583\ndispose\t379584\n放映室\t379585\nxbb\t379586\n一氧化氮\t379587\n双年展\t379588\n非等\t379589\n金发女\t379590\n马里兰州\t379591\n陆川叶青\t379592\n青风\t379593\nYoon\t379594\ncomply\t379595\n0799\t379596\n神仙肉\t379597\n霆锋\t379598\n蓝颜\t379599\nT11\t379600\n吴地\t379601\n芭乐\t379602\n一出戏\t379603\n上古世纪\t379604\ntrex\t379605\n十二小时\t379606\n高芳\t379607\n覆\t379608\n存货跌价准备\t379609\n台州市司法局\t379610\n两跨\t379611\n镶\t379612\n能源集团\t379613\n长沙长江医院\t379614\n小雪花\t379615\nM17X\t379616\n异步化\t379617\n速进\t379618\n王鹏辉\t379619\n阿本\t379620\nlut\t379621\nMarmot\t379622\nRainforest\t379623\n异度之刃2佣兵团\t379624\n读史方舆纪要\t379625\nVenture\t379626\n第二十八回\t379627\n小侠\t379628\n额吉\t379629\n魅蓝3s\t379630\n动能定理\t379631\nCsharp\t379632\n季终\t379633\n喜悦\t379634\n恋爱中\t379635\n行权期\t379636\n被撞死\t379637\nexpection\t379638\n胡亚捷\t379639\n跨期套利\t379640\nyy4480高清影院\t379641\n氟化物\t379642\n四五个\t379643\n蜜蜡\t379644\n九酷娱乐\t379645\n洪兰\t379646\n教育事业\t379647\n南京科技职业学院\t379648\n张永强\t379649\n射击馆\t379650\n腾讯Bugly\t379651\n岛篇\t379652\n兴趣爱好\t379653\n五粮醇\t379654\n宁波象山县\t379655\n羽霞\t379656\n冰珠\t379657\n江铃宝典\t379658\n赛珍珠\t379659\nPeeping\t379660\npmd\t379661\nv2.5.5\t379662\n玛雅\t379663\n━━\t379664\n美少女梦工厂\t379665\n白掌\t379666\n4.4.6\t379667\n岔开\t379668\n下半叶\t379669\n基本单位\t379670\n雨纷纷\t3
79671\n广仁\t379672\nexideal\t379673\nHeels\t379674\n20150914\t379675\n海马\t379676\n旅行时\t379677\n无能者\t379678\nMana\t379679\n体医\t379680\n频繁\t379681\n长江7号\t379682\n电影场\t379683\n洛嘉\t379684\n昆明市儿童医院\t379685\n软色\t379686\n105路\t379687\n一缸\t379688\n彝人古镇\t379689\nv-if\t379690\n南雅中学\t379691\nB2\t379692\nPMI指数\t379693\n泄\t379694\n筋头巴脑\t379695\n芋虫\t379696\n绝地求生服务器\t379697\nSTYLE\t379698\n兵团国土资源局\t379699\n开封新区\t379700\nc级\t379701\nshelf\t379702\n医保费\t379703\n审题\t379704\n湖州日报\t379705\nmarsprj\t379706\n竹丝\t379707\n北京恒昌利通投资管理有限公司\t379708\n普天新能源\t379709\n飞女正传\t379710\n常州人才网\t379711\n华体\t379712\n第24号\t379713\n冲顶\t379714\n漉\t379715\n代理人员\t379716\n别再吃\t379717\n上海市交通委员会\t379718\nmonthly\t379719\nplist文件\t379720\n候机楼\t379721\n杨阿姨\t379722\n北京市农业局\t379723\n会道\t379724\n移进\t379725\nFoxconn\t379726\naffiliation\t379727\n派发现金红利\t379728\n400公斤\t379729\n情深意\t379730\n天义\t379731\nLynda\t379732\n荒芜\t379733\n703路\t379734\n曾益新\t379735\n卖酒\t379736\n永嘉县三精阀门有限公司\t379737\n快达\t379738\n国家安监总局\t379739\n第一金\t379740\n贪心\t379741\nQorvo\t379742\n咛\t379743\n工表\t379744\n武夷山政府\t379745\n宣威\t379746\n朱湘\t379747\n百慕\t379748\n君上\t379749\n丝娃娃\t379750\n天水路\t379751\n宁俊明\t379752\n76家\t379753\n承包地\t379754\n人故事\t379755\n兴唐\t379756\nallowance\t379757\n有情\t379758\n三千只\t379759\n每关\t379760\n失落之城\t379761\n欧派橱柜\t379762\n七田真\t379763\n紫牡丹\t379764\n卓面\t379765\n兴隆乡\t379766\nhset\t379767\nSteering\t379768\n罗家英\t379769\n模拟联合国大会\t379770\n蘑菇子\t379771\nCHIP\t379772\n春秋航空公司\t379773\njredis\t379774\n呼和浩特市\t379775\n拆迁队\t379776\n贵县\t379777\n赠言\t379778\n列表值\t379779\n七十二柱\t379780\n2011-2012年度\t379781\n幼驯染\t379782\n孙氏\t379783\n95113\t379784\n沙律\t379785\n腊鱼\t379786\n非奇异\t379787\n一客\t379788\nav小四郎\t379789\n溃败\t379790\n夜宴\t379791\n李佳佳\t379792\n刑事诉讼制度\t379793\n杰克奥特曼\t379794\n花窗\t379795\n双刀\t379796\n马云禄\t379797\neme\t379798\n圣道\t379799\n武乡县\t379800\n硫氰化钾\t379801\n语言学专业\t379802\nfk\t379803\n保荐人\t379804\n616\t379805\n白光\t379806\n引起注意\t379807\n摇奖机\t379808\n5014\t379809\n华为荣耀v8\t379810\nevs\t379811\nJunk\t379812\n浙江省机构编制委员会办公室\t379813\n争艳\t379814\n亭亭山\t3798
15\n七窍\t379816\nDCP-7080D\t379817\n夏洛特烦恼\t379818\nAJA\t379819\n聚乙\t379820\nbocai\t379821\n移动号\t379822\n驾崩\t379823\n7.3版\t379824\n询\t379825\ngt5\t379826\n2万2\t379827\n雅迪Z3\t379828\n沪江日语网\t379829\n英博\t379830\n云南省工商行政管理局\t379831\n大洼区\t379832\nClonezilla\t379833\n常德\t379834\nCov\t379835\n意大利航空\t379836\n猫子\t379837\n童靴\t379838\nsiku\t379839\n男团\t379840\n天津市人民政府办公厅\t379841\n星球大战:前线\t379842\n倚天2\t379843\n陈\t379844\n乐土\t379845\nINDIGO\t379846\nMorton\t379847\n玉化\t379848\n批片\t379849\n沧州市人民政府\t379850\n母国\t379851\n等额本金还款法\t379852\n重庆市卫计委\t379853\n香西咲\t379854\n针脚\t379855\nSignificant\t379856\n_3D学苑_3d学院\t379857\n五首\t379858\n风火\t379859\n唐多令\t379860\n考研信息网\t379861\n外国政\t379862\nEXCEl表格\t379863\n罗宾汉\t379864\nt64\t379865\n漫漫\t379866\n群策\t379867\n锡东新城\t379868\nkb_\t379869\nlibst\t379870\n广东消费网\t379871\n旅程\t379872\n电弧炉\t379873\n朱如葆\t379874\n松江新城\t379875\n少先队大队委\t379876\n七建\t379877\n712首\t379878\n外患\t379879\n第30轮\t379880\n柔佛州\t379881\n31年\t379882\n鹿门歌\t379883\n广州市海珠区政府\t379884\n翡冷翠\t379885\ncwb\t379886\n放不下来\t379887\nweb服务\t379888\nlabs\t379889\n上流\t379890\n萌宅\t379891\n科考站\t379892\n电子喜帖\t379893\n兄弟情\t379894\n勐海茶厂\t379895\n索引页\t379896\nSketchBook\t379897\n三季\t379898\n东四环\t379899\n许愿树\t379900\n钛业\t379901\n重返德军总部2\t379902\n心飞扬\t379903\n4.3.16\t379904\n女神联盟2吧\t379905\n不传\t379906\n黑龙江农村信用社\t379907\ngamebench\t379908\n开证\t379909\n阿牛\t379910\n运化\t379911\n进价\t379912\n_尚\t379913\n非化石能源\t379914\n93篇\t379915\n一6\t379916\n换内\t379917\n小氕\t379918\n沈春阳\t379919\n林清安\t379920\n国家测绘地理信息局\t379921\n宅男腐女恋爱真难\t379922\n英航\t379923\ntbc\t379924\n1.8L\t379925\n无夜之国\t379926\n苏樱\t379927\nsaoutils\t379928\n0.6mm\t379929\n镇北\t379930\n管辖地\t379931\nUpskirt\t379932\nVH\t379933\n有劲\t379934\n飞鸟\t379935\n守备\t379936\n大淘宝\t379937\nSponsored\t379938\n1500分\t379939\n恒温器\t379940\n助兴\t379941\n话音\t379942\n十九届三中全会\t379943\n南翔老街\t379944\n中兴公司\t379945\n万能播放器\t379946\n范五老街\t379947\n斯巴鲁BRZ\t379948\n重婚\t379949\n微课网\t379950\n高茜\t379951\n次中音萨克斯\t379952\n万言\t379953\n蓝德\t379954\n芒果糯米饭\t379955\n絶望\t379956\n吟唱\t379957\n处室\t379958\n短期借款\t379
959\n免疫组织化学\t379960\n杨琼\t379961\nSAMSON\t379962\n后宫:帝王之妾\t379963\n学堂\t379964\ne栈\t379965\n正餐\t379966\n党首\t379967\n3548\t379968\n龙女仆\t379969\nmc2\t379970\n脸方\t379971\n储油雾化器\t379972\n打骂\t379973\n社会科学\t379974\n能看懂\t379975\n网易音乐人\t379976\n中国人民革命军事博物馆\t379977\n红星云\t379978\n忠心\t379979\nMy_blog\t379980\n915路\t379981\n乐知\t379982\nFANDOM\t379983\n责任观\t379984\n岳各庄\t379985\n龙平\t379986\n8则\t379987\n鹅掌楸\t379988\nAfox\t379989\n360n5\t379990\n二番\t379991\n九鱼\t379992\n百度网盘压缩包\t379993\ntouchmove\t379994\n七级\t379995\n五部曲\t379996\n承压\t379997\n着陆页\t379998\n大笑江湖\t379999\n保险金\t380000\n职棒\t380001\n古流\t380002\n十排名\t380003\nV11.0\t380004\nglc300\t380005\n钱湖\t380006\n奇艺\t380007\n柱帽\t380008\n河面\t380009\nivr\t380010\n崇礼区\t380011\n眉州东坡\t380012\n357克\t380013\ntmt\t380014\ndiscretion\t380015\n助阵\t380016\n净化者\t380017\n429\t380018\nFunctionality\t380019\n国祥\t380020\nV1.6\t380021\n30粒\t380022\n泛美\t380023\n上师\t380024\n杨高路\t380025\n虎胆\t380026\n开元街道\t380027\n甘醇\t380028\n难敌\t380029\n排码\t380030\n舍命\t380031\nwhoami\t380032\n罗技G500\t380033\nMTT\t380034\n320g\t380035\nprevent\t380036\n古炉\t380037\n计生协\t380038\n雷竹\t380039\n丁酸钠\t380040\n鼻咽镜\t380041\n治安管理处罚\t380042\n非命\t380043\n云南文山州政府网\t380044\ndebugbar\t380045\n上海交大化学化工学院\t380046\n云匠网\t380047\n中央差速器\t380048\n诺文尼亚\t380049\nWebDriver\t380050\n杨思\t380051\nmaomao\t380052\n珠江源\t380053\n第139\t380054\n暖风器\t380055\n作出来\t380056\n色妞\t380057\n北京一零一中学\t380058\n午盘\t380059\n比萨饼\t380060\nflask_sqlalchemy\t380061\n博馆\t380062\n17版\t380063\n数理方程\t380064\n八间\t380065\nsnandy\t380066\n分区表\t380067\n内田美奈子\t380068\n蜀山战\t380069\n平衡霜\t380070\nW3C\t380071\n睡前\t380072\n马栏山\t380073\n_哥\t380074\n别指望\t380075\n福泉市\t380076\n万众\t380077\n72分钟\t380078\nora-01017\t380079\n停尸\t380080\n聚类分析法\t380081\n86分\t380082\n_巴士剑网\t380083\n智能家居控制系统\t380084\n化学纤维\t380085\nShiraz\t380086\n广东省人社厅\t380087\n矿物\t380088\n蓝礼\t380089\n荤素\t380090\n小米WiFi链\t380091\n流风\t380092\n洪博培\t380093\n炮兵团\t380094\n师生关系\t380095\n误诊\t380096\n踹飞\t380097\n名望\t380098\nalamo\t380099\n4cm\t380100\n腓特烈\t380101\n彼得帕克\t380102\n不开眼\t380103\n纪王
\t380104\n宋庄镇\t380105\n缴存\t380106\n莼湖镇\t380107\n仙之侠道苍云传\t380108\n新康泰克\t380109\n武警中队\t380110\n细菌性\t380111\nusual\t380112\nsecurities\t380113\n芘\t380114\n商园\t380115\n外贸英语函电\t380116\n申请书\t380117\n彭程\t380118\n二陈汤\t380119\n兽世独宠\t380120\n料理棒\t380121\n血型\t380122\n东风柳州汽车有限公司\t380123\n二步\t380124\n3106\t380125\n县人力资源和社会保障局\t380126\n晴色\t380127\ninflux\t380128\n插画家\t380129\n追剿\t380130\n冲锋衣\t380131\n徐家汇站\t380132\n万机\t380133\n垃圾人\t380134\nbein\t380135\n商陆花\t380136\n手诊\t380137\n营地\t380138\n蓝方\t380139\n3封\t380140\n杨茗茗\t380141\n期末数\t380142\n黄芩苷\t380143\nDennis\t380144\n波普艺术\t380145\n居民委\t380146\ncpr\t380147\ndeepin\t380148\nAccessory\t380149\n2019年6月\t380150\n婐体\t380151\n势如破竹\t380152\nhuatang\t380153\nGPS导航网\t380154\n锦堂\t380155\nwargame\t380156\n宇汇国际\t380157\nelementUi\t380158\n燧发\t380159\n12月31日\t380160\n湘潭\t380161\n石笋\t380162\n路障\t380163\n科信教育\t380164\nDubbed\t380165\n张达\t380166\n沙河地铁站\t380167\n武汉天河国际机场\t380168\n怒扇\t380169\n四史\t380170\n宁波方太厨具有限公司\t380171\n123部\t380172\n612路\t380173\n章雪琪\t380174\n青坊\t380175\nQunee\t380176\n不见效\t380177\n红缘\t380178\n比量\t380179\n单板机\t380180\n【言\t380181\n璧\t380182\n多哈\t380183\n中国百强国际旅行社\t380184\n试验台\t380185\n沈家湾\t380186\n余切函数\t380187\n文秘类\t380188\nLevine\t380189\n批评\t380190\nVoice\t380191\nmedio\t380192\nOPPOR9s\t380193\nguideguide\t380194\n苦修者\t380195\n袋熊\t380196\n上海政府网\t380197\n安邦系\t380198\n邪恶少女漫画_日本邪恶漫画大全_邪恶漫画全集\t380199\n9页\t380200\n美丰\t380201\n曙光\t380202\n健身会馆\t380203\n闲逛\t380204\n平原新区\t380205\n严严实实\t380206\n辰砂\t380207\n4g卡\t380208\n胶贴\t380209\n黑刀\t380210\n天流\t380211\n三极\t380212\n平桂区\t380213\n护身\t380214\nP\t380215\n双龙大道\t380216\n王晓云\t380217\n气相色谱-质谱法\t380218\n小鬼\t380219\n侠客风云传:前传\t380220\n十宗罪2\t380221\n苴\t380222\n重华\t380223\n斯琴\t380224\n哈工智能\t380225\n昌河\t380226\n铁厂\t380227\n宣介\t380228\n鑫悦\t380229\nAdoption\t380230\n公安部长\t380231\nlibssh2\t380232\n刮刮卡\t380233\nFlawless\t380234\nstuido\t380235\n凯恩股份\t380236\n电除尘器\t380237\napiserver\t380238\n联创世华\t380239\nSX4论坛_汽车之家论坛\t380240\n唐山湾\t380241\n赖光\t380242\n行政院\t380243\n细胞体\t380244\n19斤\t380245\nmaca\t380246
\n玉环人才网\t380247\n上海教育局\t380248\nVineyard\t380249\n390x\t380250\n刘在锡\t380251\n柏树叶\t380252\n全员加速中\t380253\n宇治\t380254\n体温单\t380255\nCECEP\t380256\n世青\t380257\n下柱\t380258\n腾讯地王卡\t380259\n生缘\t380260\n土桥村\t380261\n观赏石\t380262\n袁泉\t380263\n物业管理招聘网\t380264\n苹果公司\t380265\n刘羽\t380266\n神恩颂歌\t380267\nshakes\t380268\n三十六\t380269\n摩比斯\t380270\n升格\t380271\n玻璃钢\t380272\nNMP\t380273\npowercfg\t380274\n188万\t380275\n白发苍苍\t380276\n内蒙古化工职业学院\t380277\n步骤\t380278\ncontroller层\t380279\nex4\t380280\nˉ\t380281\n精美\t380282\n联想小新i2000\t380283\n纯青\t380284\nSBD\t380285\n白班\t380286\n政通\t380287\n石化加油站\t380288\n水王\t380289\n50步\t380290\n猜数\t380291\n望海高歌\t380292\nCompetitive\t380293\n刘延东\t380294\n新华街道\t380295\n热血江湖sf\t380296\nchristian\t380297\n扒客\t380298\nDifferential\t380299\n仿真植物墙\t380300\n燥起来\t380301\n粤传媒\t380302\n蔡\t380303\n额敏\t380304\ng63\t380305\n西安市中医医院\t380306\n琼库什台\t380307\n主妇网\t380308\n5串\t380309\n报丧\t380310\n神荒\t380311\n星谷\t380312\n接地电流\t380313\nBun\t380314\n指路\t380315\n汪聪\t380316\n500mA\t380317\n超色\t380318\n仰山\t380319\nSKYWORTH\t380320\n停航\t380321\nespace\t380322\n38节\t380323\n明清街\t380324\nconfigration\t380325\n击溃\t380326\n医保\t380327\n家政网\t380328\n喜剧\t380329\n余下\t380330\n汪汪汪\t380331\n4510\t380332\nbuses\t380333\n深圳市赛尔通科技有限公司\t380334\n26g\t380335\n报请\t380336\n马克·鲁法洛\t380337\n嘴突\t380338\n摊派\t380339\n间歇式\t380340\n战狼\t380341\n阚壠\t380342\n优思弗\t380343\n油钱\t380344\n咁\t380345\n3D投影\t380346\n冷酸\t380347\n多乐\t380348\n暗潮\t380349\n死亡塔\t380350\n拖拽式\t380351\n金桥国际广场\t380352\nUnichi\t380353\n早餐券\t380354\n橡胶垫\t380355\n肉松\t380356\n周详\t380357\nDr\t380358\nRegret\t380359\n勤加缘网\t380360\n政制\t380361\n德纲网\t380362\n八王\t380363\n皋埠\t380364\nv2.5.4\t380365\n灵饰\t380366\n尾丝\t380367\n介宾\t380368\nDBLINK\t380369\nsynology群晖NAS\t380370\n仙毫\t380371\n12M\t380372\n家庭户\t380373\nspear\t380374\n抑郁症\t380375\n截拳道\t380376\n导丝\t380377\n大理镇\t380378\n熊猫\t380379\n青燕子\t380380\nShopNC\t380381\n中杆\t380382\n倒运\t380383\n2017年3月23日\t380384\n康涅狄格州\t380385\n13英寸\t380386\n兰陵王妃\t380387\n乔老师\t380388\n摄象机\t380389\n第十三条\t380390\n2008-2014年\t3803
91\n037\t380392\n市妇幼保健院\t380393\n战场\t380394\nImmortal\t380395\n顺产\t380396\n百大集团\t380397\n顾秀莲\t380398\n腐臭\t380399\n惠州市第一中学\t380400\n太二酸菜鱼\t380401\n椴树\t380402\n宫爆\t380403\n玉山南\t380404\nWash\t380405\n猪骨汤\t380406\n免冠\t380407\nsharepreference\t380408\n放下\t380409\nMASK\t380410\n不偏不倚\t380411\ncontourf\t380412\n8600GT\t380413\n暗源\t380414\n中央广场\t380415\nzhushi\t380416\n氯化氢\t380417\n电能\t380418\n马特达蒙\t380419\n杭二中\t380420\n浏览\t380421\n灯花\t380422\n帅出\t380423\n穿越文\t380424\n巧思\t380425\nBusty\t380426\ngs5\t380427\nPalazzo\t380428\n中国移动广东公司\t380429\n山里人\t380430\n鬼大爷鬼故事\t380431\n合本\t380432\n威廉二世\t380433\n保德路\t380434\n不要动\t380435\n追征\t380436\n新疆农业职业技术学院\t380437\n锦绣缘\t380438\n煤车\t380439\n黑龙江省旅游局\t380440\n田丰\t380441\n柏林大学\t380442\nTHAI\t380443\n南部新区\t380444\n乐活网\t380445\n绿茵阁\t380446\n汉书\t380447\nMortality\t380448\n30目\t380449\n房博会\t380450\n天才捕手\t380451\nmd-3\t380452\n2018F1\t380453\n8086\t380454\n_元\t380455\njdk1.7\t380456\n魏晋南北朝时期\t380457\n蓝弧\t380458\n易瑞沙\t380459\n私募焦\t380460\nsublimelinter\t380461\n六本\t380462\n境况\t380463\n入馆\t380464\n杭州景区\t380465\n缠流子\t380466\nviolent\t380467\n星联\t380468\n大卡司\t380469\n大众凌度\t380470\n重工\t380471\n广东省农村信用社联合社\t380472\ngradually\t380473\nharoopad\t380474\nhome键\t380475\n赶来\t380476\n偶久\t380477\njavaeye\t380478\nmetasploit\t380479\n路牌\t380480\nQCon\t380481\n挣值法\t380482\n切菜机\t380483\n网贷\t380484\n广州月子中心\t380485\nTextNow\t380486\n红街\t380487\n影厅\t380488\n第A12\t380489\n支出\t380490\nshowmount\t380491\n女儿红\t380492\n暗战\t380493\n韭菜坪\t380494\n第六感\t380495\n灵纹\t380496\n楚月\t380497\n上海合作组织成员国元首理事会\t380498\n10v10\t380499\n美伢\t380500\n镗\t380501\n微时\t380502\n白袜\t380503\n漫天飞\t380504\n荣昌县\t380505\n河南电力\t380506\n高峰期\t380507\n半月形\t380508\n电信集团\t380509\nacd\t380510\n美纹\t380511\nPipeline\t380512\n一阳\t380513\n入籍\t380514\n长安欧尚A800\t380515\n疯狂的蚂蚁\t380516\n南京市建委\t380517\n头孢丙烯片\t380518\n西影\t380519\nSG-029混凝土强度合格\t380520\n春社\t380521\n平板玻璃\t380522\nTSI280\t380523\n10200\t380524\npiwik\t380525\n48期\t380526\n七七铺\t380527\n助人\t380528\n东湖小区\t380529\n刘志敏\t380530\n汴\t380531\n飞行物\t380532\n楚韵\t380
533\n清末\t380534\n哪几个\t380535\n蓝胭脂\t380536\n回购率\t380537\n古惑仔之江湖新秩序\t380538\n撞脸\t380539\n国家商标局\t380540\n詹\t380541\n香港科学馆\t380542\n吴志勇\t380543\n三勺\t380544\n盹\t380545\nthetribez\t380546\n树坑\t380547\nOpenTSDB\t380548\n模模糊糊\t380549\n爱思益\t380550\n反杜林论\t380551\n.\t380552\nlti\t380553\n逍遥魔兽\t380554\n无砟\t380555\n六瓶\t380556\nwin7\t380557\n老夫人\t380558\n2009年上半年\t380559\n视源股份\t380560\nrune\t380561\nnendo\t380562\n游戏工\t380563\n东方生活报\t380564\n全国代表大会_中国网\t380565\n前半年\t380566\ntwain\t380567\nhdtv\t380568\n香山街道\t380569\n北国\t380570\n配现\t380571\n黏糊\t380572\n炖肉\t380573\nCO2\t380574\n塘沽\t380575\n啤酒坊\t380576\n杂线\t380577\n13顺位\t380578\n旗装\t380579\n宋哲宗\t380580\n声智科技\t380581\n湘潭经开区\t380582\n第七十八章\t380583\n看热\t380584\n热交换\t380585\n单身公寓\t380586\nredistribute\t380587\n落下\t380588\n林姗姗\t380589\nHash值\t380590\n100mb\t380591\n乖巧\t380592\n殉难者\t380593\nsaucer\t380594\n中国地质调查局发展研究中心\t380595\n004\t380596\n石龙村\t380597\npole\t380598\n职校\t380599\n德政路\t380600\nfashionable\t380601\nUTP\t380602\nStick\t380603\n升温\t380604\n7色\t380605\n特克斯\t380606\n豆果\t380607\nbloomberg\t380608\n下地\t380609\n救护员\t380610\n绿春县\t380611\n华陌网\t380612\n马曼然\t380613\n信诚人寿\t380614\n模芯\t380615\n凤凰水城\t380616\n百分之17\t380617\n华中科技大学文华学院\t380618\n艺木\t380619\n圆才网\t380620\n庸懒散浮拖\t380621\n大众迈腾\t380622\n千术\t380623\nLima\t380624\n金钟水库\t380625\n矩阵类\t380626\n软管\t380627\nnestedscrollview\t380628\nRCP\t380629\nmins\t380630\n耳屏\t380631\n超级硬盘数据恢复软件\t380632\n单价\t380633\n信任中心\t380634\n昆明铁路局\t380635\nMindMaster\t380636\n去哪儿吧\t380637\n里程碑式\t380638\n田建明\t380639\n南京航空\t380640\n丙类\t380641\n火树\t380642\n三河市人民政府\t380643\n鲍鲍\t380644\n咕咕机\t380645\n战争\t380646\n手绘图\t380647\n基函数\t380648\n博奥\t380649\nEBIT\t380650\n4F\t380651\n813路\t380652\n田水\t380653\n第43\t380654\n后宫如懿传\t380655\nibw\t380656\n加图索\t380657\n旱季\t380658\n专升本考试\t380659\n亨得利\t380660\n另一只\t380661\n收腹带\t380662\ndecryption\t380663\nzhuomian\t380664\n三峡晚报\t380665\nMinnie\t380666\n沙洋县人民政府\t380667\n美好集团\t380668\n八四\t380669\noyzway\t380670\n海联讯\t380671\n聚土网\t380672\n91富一代\t380673\n安藤\t380674\n京香julia\t380675\nCCD\t
380676\n启城\t380677\nHashTable\t380678\nIPC$\t380679\n和牛网\t380680\n考科\t380681\n画品\t380682\n研究报\t380683\n天天干_夜夜啪_天天操_天天啪_天天射_天天日_天天撸\t380684\n焚毁\t380685\n门事件\t380686\n日光倾城\t380687\n虎虎VR\t380688\ninsufficient\t380689\n语音编码\t380690\n八行\t380691\n蔡岳勋\t380692\npojieban\t380693\nrealflow\t380694\n徐志\t380695\n为你读诗\t380696\n政法大学\t380697\n2017年10月28日\t380698\n吉事果\t380699\n231路\t380700\n優惠\t380701\n729\t380702\n沙葱\t380703\n我的秘密花园\t380704\n体验式教学\t380705\n宝子\t380706\n哈尔滨香坊\t380707\n人民代表报\t380708\n电源适配器\t380709\n马可汤姆汉克斯\t380710\n黄旗山\t380711\n可胜\t380712\n20160131\t380713\n药疹\t380714\n主持词\t380715\n可不可以说\t380716\n前锋区\t380717\n腐烂国度\t380718\n外来者\t380719\n博林腾瑞\t380720\n锁国\t380721\n输出国\t380722\n9块\t380723\n立体球\t380724\n老人院\t380725\n新歌\t380726\n徐匡迪\t380727\n王鹏程\t380728\n自荐\t380729\n256G\t380730\ncloudtv\t380731\n辣文网\t380732\n好命\t380733\n摘\t380734\n老烧\t380735\nchineses\t380736\n十五位\t380737\n调质钢\t380738\n韩山\t380739\n新南方\t380740\nPAGE\t380741\n三氯甲烷\t380742\n耀光\t380743\n李学明\t380744\n建议性\t380745\nteart\t380746\n邪欲\t380747\n坚果投影仪\t380748\n消防展\t380749\nTELECOM\t380750\n曾贤志\t380751\n不雅观\t380752\n子夜四时歌\t380753\n病假条\t380754\nqq部落\t380755\nJDK8\t380756\nizumi\t380757\ncoca\t380758\nusbkey\t380759\n底子油\t380760\n127G\t380761\n1平方米\t380762\n日照百姓网\t380763\n中建地产\t380764\n龙部落\t380765\n啸叫声\t380766\n日丽\t380767\n智辉\t380768\n版友\t380769\n哦嗯\t380770\n2.5%\t380771\n云翼\t380772\n51.0\t380773\n博客类\t380774\n浙江省通信管理局\t380775\n宫腔粘连\t380776\nargb\t380777\n朱高正\t380778\n炼金炉\t380779\n诉讼费\t380780\n9508\t380781\n邓中翰\t380782\n怀孕\t380783\n陈康\t380784\n团练\t380785\n成都外国语学校附属小学\t380786\njaya\t380787\n宿州市埇桥区人民政府\t380788\n开球\t380789\n费尔班克斯\t380790\n尤利娅\t380791\n模糊不清\t380792\n煤海\t380793\n考拉FM\t380794\nTall\t380795\n张根\t380796\n药源网\t380797\n监所\t380798\n开膛破肚\t380799\n天气预报员\t380800\n变焦镜头\t380801\n9把\t380802\n3、4月份\t380803\n我辈网\t380804\n犬家网\t380805\n一根一根\t380806\n董进宇\t380807\n4399饥饿的鲨鱼\t380808\n药库\t380809\n血洗\t380810\n乳管镜\t380811\n旅游路\t380812\n听说\t380813\n科客网\t380814\n\\/\\1\t380815\nrin\t380816\n木门\t380817\n断案\t380818\n蛋化石\t380819\n土地转
让\t380820\n无极限\t380821\n美呗整形网\t380822\n瘦下\t380823\n股票上市规则\t380824\n锐利\t380825\n晚集\t380826\n100Hz\t380827\n沪通大桥\t380828\nDua\t380829\nretas\t380830\nOligo\t380831\ncaviar\t380832\n一年多\t380833\nBALANCE\t380834\neyewear\t380835\n牛奶奶\t380836\n灼\t380837\n王春燕\t380838\n太阁立志传3\t380839\n芝士\t380840\nMiriam\t380841\n全纳车网\t380842\n杨卫国\t380843\n嗯哼\t380844\nmargin-top\t380845\ntexpad\t380846\n三亚千古情景区\t380847\n何处去\t380848\n测振仪\t380849\n万雪窟\t380850\n赵佳丽\t380851\nvrml\t380852\n大件\t380853\n北京故宫\t380854\n咳咳\t380855\nquantization\t380856\n系统盒\t380857\n存贮\t380858\n山东省卫生和计划生育委员会\t380859\nyei\t380860\n我的少女时代\t380861\n尿意\t380862\nprisma\t380863\n良村\t380864\n中国驻南斯拉夫大使馆\t380865\n接触力\t380866\n全话\t380867\ncreates\t380868\n徐汇区牙防所\t380869\n吉顺\t380870\n舱壁\t380871\n己方\t380872\n管窥\t380873\ni7-7500U\t380874\n鸟道\t380875\nallison\t380876\n审请\t380877\n王良\t380878\n问心\t380879\n树状图\t380880\n哪个\t380881\n七彩神仙鱼\t380882\n毕十三\t380883\n黄鸣\t380884\n藤木\t380885\n竿子\t380886\n容器板\t380887\n兽欲\t380888\n河北网络广播电视台\t380889\nintangible\t380890\n赏赐\t380891\n广东省粤电集团有限公司\t380892\n永庆\t380893\n发存\t380894\n彭健\t380895\n光交箱\t380896\n总值班室\t380897\nionization\t380898\nSell\t380899\n纳沙泰尔\t380900\nWearable\t380901\n43次\t380902\n仓位\t380903\n20170916\t380904\n亲缘\t380905\n西安卫星测控中心\t380906\nairasia\t380907\n男童\t380908\n建筑设计院\t380909\n柔性化\t380910\nonlygirls\t380911\n湖南省委党校\t380912\nHoran\t380913\ndasd\t380914\nheuristic\t380915\n梦间\t380916\n连横\t380917\n虚空掠夺者\t380918\n榊\t380919\n优客\t380920\n22路\t380921\n04月23日\t380922\n百德\t380923\n立刻\t380924\n无量劫\t380925\nIMMEDIATE\t380926\n补扣\t380927\n削弱\t380928\n6x\t380929\nAnticloud\t380930\n淞滨路\t380931\nRE管理器\t380932\n科锐国际\t380933\n中华婚庆网\t380934\n氧化亚氮\t380935\n百度云盘BD1080p超清资源\t380936\n中速\t380937\n房屋转让协议\t380938\n6.2.4\t380939\n有限\t380940\n太阳岛论坛\t380941\nbet36体育在线\t380942\n动感地带\t380943\n致公\t380944\nfullPage\t380945\n以往\t380946\n一月一\t380947\n262\t380948\n图片转换器\t380949\n九乘\t380950\n谍海\t380951\n地球末日\t380952\n藏寨\t380953\n润禾材料\t380954\n百花湖\t380955\n陈汉\t380956\n重庆网站建设公司\t380957\njawbone\t380958\nmeavn\t380959
\nForge\t380960\nnetcat\t380961\n杨柳岸\t380962\n2017年4月1日\t380963\n活期存款利率\t380964\n爱微帮\t380965\n行且珍惜\t380966\nmd.itlun\t380967\ndrag\t380968\n木札岭\t380969\n雷达币\t380970\nSurvey\t380971\nvue2.0\t380972\n海原\t380973\n夜线\t380974\n锦湖街道\t380975\n子平\t380976\n狂歌\t380977\n80070032\t380978\n雷藏\t380979\n近于\t380980\n男子校\t380981\n蔡枫华\t380982\n苹果6splus\t380983\n芃芃\t380984\ninvested\t380985\n王辉耀\t380986\n阻塞性黄疸\t380987\nAnsys\t380988\nhihocoder\t380989\n爱泽花梨\t380990\n梦幻西游生死劫\t380991\n五河县\t380992\n卡帕莱\t380993\nCargo\t380994\n东方银星\t380995\n豪运\t380996\n戊戌变法\t380997\n约什史密斯\t380998\ndstat\t380999\n济困\t381000\n东八区\t381001\n仰口风景区\t381002\n磨擦\t381003\n香格里拉市\t381004\n2556\t381005\n郑州车管所\t381006\n过硬\t381007\n克缇\t381008\n李相赫\t381009\n选稿\t381010\n一圈圈\t381011\n总则\t381012\n放书\t381013\n撬\t381014\n瑞邦\t381015\nfony\t381016\n臼炮\t381017\n揪痧\t381018\nQQ欢乐斗地主\t381019\n女潮\t381020\n一支烟\t381021\nduf\t381022\n百色学院\t381023\nindices\t381024\n微牛\t381025\n运杂费\t381026\n喻世明言\t381027\n学样\t381028\n8.9.3\t381029\n深港在线\t381030\n变天\t381031\n北京传媒大学\t381032\n井湾子\t381033\n哈尔滨市道外区\t381034\n卡利姆多\t381035\n中大集团\t381036\n网易博客\t381037\n色影无忌\t381038\nHortonworks\t381039\n生活饮用水卫生标准\t381040\nhuangfox\t381041\nChun\t381042\n12件\t381043\n树莓派3b+\t381044\n弥雅\t381045\n初次\t381046\n惺忪\t381047\n涉密信息系统\t381048\n保险子\t381049\n杨学军\t381050\n格格党\t381051\n交通工程学院\t381052\nv2.11\t381053\nWIN8.1\t381054\n贞节\t381055\n啤酒瓶\t381056\n地间隙\t381057\n经验卷\t381058\n最后的朋友\t381059\nshifts\t381060\nhydraulic\t381061\n主档\t381062\nGB50011-2010\t381063\n4kb\t381064\n第21轮\t381065\nresolume\t381066\nqinpu\t381067\n八一厂\t381068\n中学图书馆\t381069\n虚寒\t381070\n雍城\t381071\n猛龙怪客\t381072\nGay\t381073\nCodecademy\t381074\n捉鬼合家欢\t381075\n曲阜师范\t381076\n命丧\t381077\n胃肠道\t381078\nuga\t381079\nuhd620\t381080\n啪啪秀\t381081\n磁链\t381082\n200多条\t381083\n文化西路\t381084\n维珍航空\t381085\n董云华\t381086\n钥匙房\t381087\n1022n\t381088\n千公里\t381089\n马致远\t381090\n中国国家博物馆\t381091\n工作桌\t381092\n年检表\t381093\nSticky\t381094\n通手\t381095\n221.7\t381096\nToken认证\t381097\n没得\t381098\n横店影视城\t381099\nPunish\t381100\n
光刻胶\t381101\n卷首\t381102\n双学\t381103\n剩余价值理论\t381104\n芙蓉村\t381105\n大新镇\t381106\nTownship\t381107\n1080p_\t381108\n农信银资金清算中心\t381109\n质监站\t381110\n神夜动漫网\t381111\n14行\t381112\n固废\t381113\n美债收益率\t381114\n股旁网\t381115\nhc06\t381116\n家喻户晓\t381117\n云南林业职业技术学院\t381118\n阳朔县人民政府\t381119\n协力\t381120\n忻东旺\t381121\n曲星\t381122\ndgl\t381123\n圣安地列斯\t381124\n普特彼\t381125\nasla\t381126\n进战\t381127\n增城区\t381128\n秦皇陵\t381129\n繁雄\t381130\nsmartctl\t381131\n波纹管\t381132\ngsx250\t381133\n横\t381134\n重庆市金融工作办公室\t381135\n正定性\t381136\njunsansi\t381137\n恋花\t381138\n柏龙\t381139\nRFP\t381140\n易读宝\t381141\n拓扑学\t381142\n旭格\t381143\n半粒\t381144\n富贵籽\t381145\n星级化\t381146\n石玉\t381147\n狂放\t381148\n峦\t381149\n北潭\t381150\nVideoCopilot\t381151\n济南市委\t381152\n上百次\t381153\n西沙湿地\t381154\n西安研究所\t381155\nu品\t381156\n帕尼\t381157\n廖启智\t381158\n4个多月\t381159\nN9008\t381160\n54岁\t381161\nxap\t381162\n1.5.0\t381163\n听说出\t381164\n豆粉\t381165\n20150521\t381166\n马大帅\t381167\n出展\t381168\n3840x2160\t381169\n优生优育\t381170\n吴俊\t381171\n电广传媒\t381172\n3二\t381173\n跳起舞\t381174\n梨花村\t381175\n公费\t381176\n天性\t381177\n汉侯\t381178\n英宗\t381179\n花景\t381180\n广告案\t381181\n中国特色社\t381182\n微梦\t381183\n历史学专业\t381184\n贵胄\t381185\n2016年7月1日\t381186\n12月31号\t381187\n雏形\t381188\n华夏幸福基业股份有限公司\t381189\n信息产业部\t381190\n兄弟网\t381191\nChoco\t381192\n25cm\t381193\n书影\t381194\n风华国\t381195\n包装工程师\t381196\nJDG管\t381197\n思念\t381198\n口袋妖怪\t381199\nqsfp\t381200\nxiaoyao\t381201\n律咖\t381202\n六尘\t381203\nApacheCN\t381204\n室内空气质量标准\t381205\n精漫画\t381206\n平沙镇\t381207\nプレステ\t381208\n公告期\t381209\nJSR-303\t381210\n黑块\t381211\n阿里铁军\t381212\n瑞树\t381213\nOpenMediaVault\t381214\n吴总\t381215\n中电集团\t381216\n超级访问\t381217\n安评\t381218\n火舌\t381219\n2路\t381220\n硬核\t381221\n民用水\t381222\n颈托\t381223\n建筑施工安全检查标准\t381224\n酒局\t381225\n报审\t381226\n锦灰\t381227\n山东省公安厅\t381228\nLomo\t381229\nperf\t381230\nsfda\t381231\n0853\t381232\n修到\t381233\n4080\t381234\n趋好\t381235\n王守义\t381236\n2018039期\t381237\n药动学\t381238\n第99期\t381239\nbike\t381240\n80分贝\t381241\n留空\t381242\n芯柱\t381243\n不解\t381244\n酶切位点
\t381245\n银沙滩\t381246\n陈村\t381247\n喷淋泵\t381248\n孤胆枪手\t381249\nX40\t381250\n106期\t381251\nDPS\t381252\n水蒸汽\t381253\n玉树藏族自治州\t381254\n奥拉索斯\t381255\n上汽红岩\t381256\n将国之天鹰星\t381257\n陈思思\t381258\n127mm\t381259\n陈旧性\t381260\n中国招标投标协会\t381261\n座谈会\t381262\n105斤\t381263\n壁垒\t381264\n山羊绒\t381265\nSM2\t381266\n110cm\t381267\n东港路\t381268\n103\t381269\n李一飞\t381270\n张秉贵\t381271\n希里\t381272\n刊文\t381273\n为观止\t381274\n浊液\t381275\n牛卡\t381276\n打车\t381277\n20160\t381278\n7560\t381279\n百度图\t381280\nDuet\t381281\n中山社保\t381282\n金道学院\t381283\n3月13号\t381284\n韧度\t381285\n王学东\t381286\nlin\t381287\n澜湄\t381288\n慧科\t381289\n309医院\t381290\n软件杯\t381291\n电子券\t381292\n空音\t381293\n湖北省建设厅\t381294\n流转\t381295\n宝新能源\t381296\n韩服\t381297\nIdentify\t381298\n林敏\t381299\nJukujo\t381300\n证券期\t381301\n大香\t381302\n联赛\t381303\n0.90\t381304\n华为畅享7Plus\t381305\n性温\t381306\n苏州桥\t381307\n京东方\t381308\n无任\t381309\n三婚\t381310\n励耘书业励耘新\t381311\n亚洲区\t381312\n实态\t381313\n问卷网\t381314\n先于\t381315\n末日崩塌\t381316\n丰收奖\t381317\n先天性\t381318\n分节符\t381319\n吃人\t381320\n絵巻\t381321\n鼻屎\t381322\n未来科技城\t381323\n现象学\t381324\n2CM\t381325\n郭文韬\t381326\n云南机场集团\t381327\n罗娜\t381328\n黄埔区\t381329\n暴走漫画脱口秀节目\t381330\n技工学校\t381331\n焦村镇\t381332\nEntrepreneurs\t381333\n高端\t381334\n出云号\t381335\nvero\t381336\n小袋鼠\t381337\n万年县政府\t381338\n新话\t381339\nswig\t381340\n王姨\t381341\n浙美版小学\t381342\n颚骨\t381343\n中盈\t381344\n海铁\t381345\n五胡十六国\t381346\n育人\t381347\n民国初\t381348\n无欢\t381349\n平川\t381350\n西瓜商情网\t381351\nc87\t381352\n金属感\t381353\netree\t381354\n停机保\t381355\nshuzu\t381356\n搏击操\t381357\n乌鱼\t381358\n乡野\t381359\nRIOT\t381360\n京都大学\t381361\n张萱\t381362\n小豪\t381363\n筛片\t381364\n棉袋\t381365\n沉醉\t381366\nDeutsch\t381367\n打底裤\t381368\ndemi\t381369\nsubl\t381370\nY3\t381371\n腹壁\t381372\n颠球\t381373\nAssured\t381374\n航运业\t381375\ne速贷\t381376\nTIS\t381377\n耙\t381378\n重建\t381379\n中国江苏网\t381380\n召开\t381381\n上海港区\t381382\n遇见未知的自己\t381383\nwuzhang\t381384\n灌肠法\t381385\n女皇之刃\t381386\n数据线性\t381387\n1794\t381388\n巨齿\t381389\n聚集地\t381390\n电子账单\t381391\n一穴\t381392\n玩得\t381393\
nNaN\t381394\nstg\t381395\nv4.5.0\t381396\n牌楼镇\t381397\nUL00\t381398\n并发\t381399\n韩强\t381400\nColourpop\t381401\n图册\t381402\nsyringe\t381403\n爆奶\t381404\nv1.1.9\t381405\nfcitx输入法\t381406\n艾雷王\t381407\n暗梁\t381408\n分数化\t381409\n瑞中\t381410\n包皮\t381411\n耐人寻味\t381412\n抱死\t381413\n利滚\t381414\n戈登拉姆齐\t381415\n酷饭网\t381416\n上庭\t381417\n市文广局\t381418\n索朗扎西\t381419\n藏匿\t381420\n荷塘月色\t381421\n花又开\t381422\n南北\t381423\napriori\t381424\n南京京科医院\t381425\n池音\t381426\n11.2\t381427\n8件套\t381428\n少年包青天3\t381429\n浑浊\t381430\n醋酸亮丙瑞林\t381431\nADR\t381432\n立刀\t381433\n16只\t381434\n包皮水肿\t381435\n5.6.39\t381436\n出模\t381437\n蔡徐坤\t381438\nDial\t381439\n7.40\t381440\n常考\t381441\n维普网\t381442\n骶\t381443\nhsp\t381444\n完爆\t381445\n石敢当之雄峙天东\t381446\n蜂窝板\t381447\n乡志\t381448\ncrane\t381449\n常州新北区\t381450\n奥本海默\t381451\n肥西\t381452\n李素\t381453\nelevit\t381454\n2.5环\t381455\n文灿股份\t381456\n开创新\t381457\n表贴\t381458\n电脑洗车机\t381459\n春姑娘\t381460\n北国商城\t381461\n辽宁省国土资源厅\t381462\n文都考研网\t381463\n施工类\t381464\n2018年4月14\t381465\n深化\t381466\nvivoX20\t381467\n东宝\t381468\nCtrl+A\t381469\n新崔斯特姆_凯恩之角\t381470\n贝克曼梁\t381471\n起亚福瑞迪\t381472\nvolkswagen\t381473\n松北\t381474\n拟行路难\t381475\naline\t381476\n贾家庄\t381477\n脚镣\t381478\n修旧如\t381479\n墓葬\t381480\n自体脂肪丰\t381481\n加码\t381482\n7035\t381483\njars\t381484\n转基因吧\t381485\n接地体\t381486\n朱有鹏\t381487\n河村\t381488\n美洛昔康片\t381489\n轼\t381490\n汕头市人民政府办公室\t381491\nh.ear\t381492\nlovey\t381493\n场数\t381494\n千阳县\t381495\n古丈\t381496\n青岛妈妈网\t381497\n相互间\t381498\n减名\t381499\n北京东方君悦大酒店\t381500\n首架\t381501\n二发\t381502\n广东四会\t381503\nalamofire\t381504\n68元\t381505\nInstall\t381506\nvikings\t381507\n勇者之塔\t381508\n村组\t381509\n字体传奇网\t381510\nV2.9\t381511\n3550\t381512\n扎基\t381513\nSqlserver数据库\t381514\ndeeplearning\t381515\n见地\t381516\n不妥\t381517\nWPA\t381518\n机审\t381519\n木渎镇\t381520\n福明\t381521\nsolidcam\t381522\n孔武\t381523\n小角\t381524\n潇湘溪苑\t381525\n任天狗\t381526\n65%\t381527\n乙二胺\t381528\n易奇八字-专业八字测试\t381529\n亚伯拉罕·林肯\t381530\n脸照\t381531\n手信\t381532\n辩稿\t381533\nmchdba\t381534\n楔形石\t381535\n副总经理\t38153
6\n铃木\t381537\nComplex\t381538\n香哈网\t381539\nspring3\t381540\n秦飞\t381541\n位次\t381542\n增幅器\t381543\nPlanting\t381544\nchinaplas\t381545\ninconsistent\t381546\n节考\t381547\n国联军\t381548\n灵逸\t381549\n抬起\t381550\n霍克\t381551\n物料提升机\t381552\n道生一\t381553\n防疫站\t381554\nmusik\t381555\n日点\t381556\n天津极地海洋馆\t381557\n055型\t381558\n专营店\t381559\nSERVICE\t381560\n蝈蝈\t381561\n多洛特\t381562\n20170121\t381563\n琴曲\t381564\nsaveas\t381565\n南昌职业学院\t381566\n艺术家们\t381567\nHet\t381568\n沽源\t381569\n火柴人\t381570\n重型\t381571\n智联银行\t381572\n小抄\t381573\n东渚镇\t381574\nproxy_cache\t381575\n黄三\t381576\n石油天然气\t381577\n鸿运当头花\t381578\n米袋\t381579\n乐蕴\t381580\n夏希\t381581\nHou\t381582\n全损\t381583\n王族\t381584\n西关街道\t381585\n姚迪\t381586\n潘飞\t381587\n男女生\t381588\n顾少\t381589\n担当者\t381590\n猜猜猜\t381591\n趁早\t381592\n尽情\t381593\n忻府区\t381594\n老千\t381595\n唐阁\t381596\n余杭区\t381597\ninstead\t381598\n四川省地震局\t381599\n女权主义\t381600\n经营户\t381601\n甜性\t381602\n3.10亿辆\t381603\n东京暗鸦\t381604\nlinuxshell\t381605\n全高\t381606\n材料有限公司\t381607\n八坂神社\t381608\nCuckold\t381609\nchh\t381610\nS6\t381611\n好健康\t381612\nControllerAdvice\t381613\n单干\t381614\n民间舞蹈\t381615\n隔岸观火\t381616\n黄家强\t381617\n加一点\t381618\n888元\t381619\nUSG防火墙\t381620\n红曲粉\t381621\npavement\t381622\n微信网\t381623\n移动通信公司\t381624\n风送式\t381625\n仆役宫\t381626\n神猫\t381627\nfiles\t381628\n1.3元\t381629\n暗黑之魂\t381630\n三亚火车站\t381631\n后装\t381632\n停机保号\t381633\n金鑫\t381634\nJemmy\t381635\n十和\t381636\n斗米兼职\t381637\n30位\t381638\n走遍中国\t381639\n理博\t381640\n掌柜宝\t381641\n珠海市委\t381642\n安尔碘\t381643\n顾斐\t381644\n国华纪念中学\t381645\n泸县一中\t381646\n济南市政府\t381647\nAutoLayout\t381648\n卤化\t381649\n镁质\t381650\n跑跑\t381651\n小脚丫\t381652\nmiui9.5\t381653\n亚尔特留斯\t381654\n学理\t381655\n门户站\t381656\ninstallment\t381657\n鹤唳\t381658\n6千多\t381659\n14幢\t381660\n这么多次\t381661\n蛮族套\t381662\n夏米尔\t381663\n8句\t381664\n阿多\t381665\n&32位\t381666\n储能变流器\t381667\n几株\t381668\nhindi\t381669\n1分米\t381670\nligh\t381671\n截频\t381672\n超星慕课尔雅\t381673\n小叶苦丁茶\t381674\n1万美元\t381675\n侨眷\t381676\nbulletin\t381677\n商业步行街\t381678\n卡类\t381679\n趣网\t3
81680\n大胡子\t381681\n万邦达\t381682\n强度表\t381683\n20180213\t381684\n仓颉造字\t381685\n超量\t381686\nPierrot\t381687\n李倩倩\t381688\n承揽\t381689\n彩站\t381690\n水果酒\t381691\n沸\t381692\n无意中\t381693\nxinlang\t381694\n茗彩\t381695\n通币\t381696\n牙髓干细胞\t381697\n雪地\t381698\ncff\t381699\nBluetooth\t381700\n楚天杯\t381701\n英语在线翻译\t381702\n却是\t381703\n非月经期\t381704\n直饮\t381705\n云魔方\t381706\nsaudi\t381707\n法相\t381708\n中国交通建设集团\t381709\n景宁县\t381710\nCount\t381711\n二翅豆\t381712\n补电\t381713\n雪月花\t381714\n速\t381715\n诺顿\t381716\n安耐晒小金\t381717\n莫明\t381718\n砍树\t381719\nSpeedFan\t381720\n巴伦支海\t381721\n茂名滨海新区\t381722\n筑\t381723\n四届\t381724\n大麦超\t381725\n北方信托\t381726\n拉皮手术\t381727\n纯洁性\t381728\n牝户\t381729\nDream\t381730\nOrganizer\t381731\nwangyutao\t381732\n赤身裸体\t381733\nInnoDB\t381734\n环境科学与工程\t381735\n电损\t381736\nShell编程\t381737\n巴鲁夫\t381738\n红药\t381739\n燕子李三\t381740\nai格式\t381741\n供词\t381742\nNBA梦之队\t381743\n福瑞\t381744\n王宇燕\t381745\njqweui\t381746\n34.5\t381747\n家政夫三田园\t381748\nautocad2008\t381749\n小便器\t381750\n微联\t381751\n白马湖\t381752\n侯逸凡\t381753\n西门町\t381754\nEnt\t381755\n此心安处是吾乡\t381756\n福清法院\t381757\nDIGITAL\t381758\nQuartZ\t381759\n排列5\t381760\n致同\t381761\n龙缸景区\t381762\n上饶\t381763\n周鸿\t381764\n龟兹\t381765\n铁物\t381766\n普朗克常量\t381767\n五头\t381768\n阿妙\t381769\n枕边\t381770\n信贷风控\t381771\ntap\t381772\nM590\t381773\n白血球\t381774\n蝴蝶王\t381775\n绿谷\t381776\n十五载\t381777\n香港国际学校\t381778\nEVI\t381779\n底滤缸\t381780\njpype\t381781\n浙能\t381782\n七郎\t381783\n隔离墙\t381784\n创博\t381785\n大关\t381786\nimplementing\t381787\n礼服\t381788\n佛山市中医院\t381789\n事业单位会计制度\t381790\n本田XR-V论坛_汽车之家论坛\t381791\nCapturing\t381792\n城郊乡\t381793\n柳叶刀\t381794\n于华\t381795\n齐射\t381796\n屈人之兵\t381797\n度假区\t381798\n智能光伏产业发展行动计划\t381799\n圣塔芭芭拉\t381800\n砸\t381801\n6830\t381802\n飞度论坛_汽车之家论坛\t381803\n海管\t381804\ntangible\t381805\n贴花\t381806\n2018年04月06日\t381807\njava函数\t381808\n异灵\t381809\n武侠小说吧\t381810\n夫婦\t381811\n花与爱丽丝\t381812\nlibwebp\t381813\n20160422\t381814\n28米\t381815\n次级\t381816\n带天\t381817\n腮帮子\t381818\n九维网\t381819\n上海音乐厅\t381820\nKool\t381821\n莲台\t38
1822\n知乎Quora\t381823\nhoping\t381824\n实力赛\t381825\n漪\t381826\n炉石传说猎人\t381827\n13588826066\t381828\n华扬联众\t381829\n乔布斯斯坦福大学\t381830\ncareers\t381831\n果泥\t381832\n返工\t381833\n擦音\t381834\n秦羽\t381835\nRECIPE\t381836\n交通银行手机银行\t381837\n敞亮\t381838\n前妆\t381839\n腮腺混合瘤\t381840\n货运代理\t381841\n乐都区\t381842\n名扬天下\t381843\nAluminum\t381844\nCutterman\t381845\n银鸽投资\t381846\n船市\t381847\n上海肺科医院\t381848\n老化箱\t381849\n民法通则\t381850\n外因\t381851\n金不\t381852\n霜降\t381853\n佛龛\t381854\n_天正建筑\t381855\nTechSmith\t381856\n中国证券登记结算有限责任公司\t381857\n黑木\t381858\n健身界\t381859\nprintwriter\t381860\n善良的魔女传\t381861\n强国社区\t381862\n大玉儿传奇\t381863\ndtd\t381864\n着眼于\t381865\n嬗变\t381866\n早中晚\t381867\n中华田园犬俱乐部\t381868\n直销商\t381869\n伟德\t381870\n木兰溪\t381871\n邮编\t381872\n气沉丹田\t381873\n联想Y400\t381874\n鱼生\t381875\n武汉汉街\t381876\nAldnoah\t381877\n这一项\t381878\ntpshop\t381879\nFenton\t381880\n29期\t381881\n万科森林公园\t381882\ntunnello\t381883\n导爆管\t381884\n应是\t381885\n晨赫\t381886\n人人文库网\t381887\n咻一咻\t381888\nframework7\t381889\n赌具\t381890\n苹果ipad\t381891\n苏丽\t381892\n阿那曲唑片\t381893\n笼子\t381894\n过硫酸钠\t381895\nSHT\t381896\n中国人口信息网\t381897\n陈若雪\t381898\nGX8\t381899\n不解之谜\t381900\n福州电脑网\t381901\n女监狱\t381902\nPPY\t381903\n锤子pro2\t381904\n主界面\t381905\n新创\t381906\n亨利·福特\t381907\nshaft\t381908\n不要乱说\t381909\n5盒\t381910\n酸梅\t381911\n高帧\t381912\n长江日报报业集团\t381913\n10纳米\t381914\n2017年8月26日\t381915\nLEMP\t381916\n中央深改委\t381917\n玄麦甘桔颗粒\t381918\n仙灵骨葆胶囊\t381919\nzhixin\t381920\n警网\t381921\n张茅\t381922\n卫生部\t381923\n吓跑\t381924\n随机抽样\t381925\n3222\t381926\n玉泉院\t381927\nFTV\t381928\n独代\t381929\n日头\t381930\n绿茶\t381931\nCHINAPLAS\t381932\n教育部学历证书电子注册备案表》\t381933\n标准时间\t381934\n64%\t381935\n恶魔城:暗影之王\t381936\n200cm\t381937\n朱培\t381938\n街坊们\t381939\n千刃\t381940\nAfter\t381941\n三五瓶\t381942\n淄博丽人医院\t381943\nX6S\t381944\n名联\t381945\n辣妈学院\t381946\n回游\t381947\n软行天下\t381948\n八千里路\t381949\n毛周\t381950\n20170307\t381951\n不必\t381952\n江苏省南菁高级中学\t381953\n天怒法师\t381954\n师妃暄\t381955\n教育观\t381956\n星际争霸2虫群之心\t381957\n99条\t381958\n中国人民银行广州分行\t381959\n中国石油天然气集团\t381960\n
皱纹纸\t381961\n刘成\t381962\n来沪\t381963\n可云\t381964\n自住房\t381965\n经过\t381966\nOPGW\t381967\n复贵盈门\t381968\nceleb\t381969\nWin8/Win8.1\t381970\n对准\t381971\n7.35\t381972\n田纪云\t381973\n新东方外国语学校\t381974\nzzq\t381975\n11185\t381976\n开彩\t381977\n汤显祖\t381978\n皂荚\t381979\n乔杰\t381980\n农书\t381981\nrec\t381982\n膨胀罐\t381983\n2160\t381984\nATENZA\t381985\n20141114\t381986\n南宁市第一人民医院\t381987\n干咳\t381988\n定额库\t381989\narm64\t381990\n安川变频器\t381991\n水利部办公厅\t381992\n二十二\t381993\nSUV汽车网\t381994\n罗汉金钟\t381995\n北京小米科技有限责任公司\t381996\nYELLOW\t381997\n沧州西\t381998\n东风北桥\t381999\n有增无减\t382000\n肖_\t382001\n百成\t382002\n逗鸟外传\t382003\n保安\t382004\nAccreditation\t382005\n美国联邦政府\t382006\n完璧归\t382007\n0543\t382008\n_河\t382009\nLights\t382010\n中港网\t382011\n最后的守护者\t382012\nGATK\t382013\n176\t382014\n尼玛县\t382015\n兰草\t382016\n无师自通\t382017\n65.0.3325.181\t382018\n反光杯\t382019\n涡轮机\t382020\n白毫\t382021\n短接\t382022\n移动平安网\t382023\n柔城\t382024\n紫藤树\t382025\n烧友\t382026\n蒸发式\t382027\nmea\t382028\nhitomi\t382029\n单极\t382030\nNUMBER\t382031\n珠宝商\t382032\n点我+1\t382033\nmrcp\t382034\ntalkback\t382035\n那小子\t382036\n迈凯伦\t382037\n如意坊\t382038\n灭绝人性\t382039\n祭坛\t382040\nintermediates\t382041\n上海光语生物科技有限公司\t382042\n两千多年前\t382043\nalabama\t382044\nMGP\t382045\nAlexa\t382046\ngeekbench4\t382047\n上网宝\t382048\ncityline\t382049\n非霍奇金淋巴瘤\t382050\n宝宝教育帮\t382051\n协理证\t382052\nuwp\t382053\n声牙\t382054\n致癌物质\t382055\n海竿\t382056\n红雷\t382057\n导游管理办法\t382058\nsion\t382059\n卤代\t382060\n阴云\t382061\n准考\t382062\nLMAO\t382063\n海滨路\t382064\nrealpath\t382065\n杭州市市场监督管理局\t382066\n无声告白\t382067\n绩优\t382068\n没\t382069\n制夷\t382070\n黎万强\t382071\n同楼\t382072\nDevelopers\t382073\n环境包\t382074\nTMR\t382075\n马博士\t382076\n入警\t382077\n沪江俄语学习网\t382078\nMacVim\t382079\n上方山动物园\t382080\n果油\t382081\n王华\t382082\ngentlemonster\t382083\n天衣无缝\t382084\n电生理\t382085\ndgzk\t382086\n4月5号\t382087\n4月14号\t382088\n常州奥体中心\t382089\n点餐\t382090\n遭袭\t382091\nphpredis\t382092\nCS\t382093\n淋巴结\t382094\n九龙塘\t382095\nraj\t382096\n五象新区\t382097\n管方\t382098\n郭伟\t382099\nplcsim\t382100\n全勤\
t382101\n5.0分\t382102\n不值\t382103\n爆头\t382104\n3pc\t382105\n行末\t382106\n团籍\t382107\n中兴天机7\t382108\n故事集\t382109\n东邪雷霆之怒\t382110\n扣破\t382111\n范遥\t382112\n江小鱼\t382113\n尖啸者\t382114\n中间线\t382115\n绝地求生死斗\t382116\nThunder\t382117\n劝学篇\t382118\n雷霆三少\t382119\n防火分区\t382120\n小女孩儿\t382121\npeo\t382122\n星\t382123\n血煞\t382124\njulie\t382125\n幻灵镇魂曲\t382126\n上古卷轴5天际重置版\t382127\n球罩\t382128\n质量比\t382129\n口袋妖怪金心\t382130\n准将\t382131\nv8.0\t382132\n丁强\t382133\n福贡县\t382134\n宜湃网\t382135\n竞彩\t382136\n北京建国饭店\t382137\n暖白\t382138\n小波分析\t382139\n中大附小\t382140\n热比娅\t382141\n衍墨轩小说网\t382142\n蓝帆\t382143\nlog2x\t382144\nPICC\t382145\n实乃\t382146\n暗黑血统2\t382147\n黑芝麻酱\t382148\n報價\t382149\n广州市合抱木信息科技有限责任公司\t382150\n杜歌\t382151\nqixing\t382152\n社会知觉\t382153\n生意转让网\t382154\nFloralInn\t382155\nisinstance\t382156\n蒲巴甲\t382157\n四百击\t382158\n无罪释放\t382159\n奇巧\t382160\nEEPW\t382161\n拂\t382162\n广东十年爱情故事\t382163\n古原\t382164\n阿里巴巴集团\t382165\n大实话\t382166\n西邮新闻网\t382167\n940\t382168\n罗生\t382169\n]\t382170\n久而久之\t382171\n博途TIA\t382172\n深圳龙华富士康\t382173\n公安消防\t382174\nHgame\t382175\ntudor\t382176\n企点\t382177\n平价版\t382178\n小鸟电动车\t382179\n世达\t382180\n丁卯\t382181\n自然科学_匿名_天涯问答\t382182\nTaurus\t382183\n异火\t382184\n米斛\t382185\n农业革命\t382186\n连襟\t382187\n150号\t382188\nhuansky\t382189\n清障\t382190\n魔袋\t382191\n网联清算有限公司\t382192\n破坏\t382193\n制作\t382194\n金秀县\t382195\n域里\t382196\n天龙人\t382197\n4r8\t382198\n张子萱\t382199\n数显表\t382200\n能动英语\t382201\n无所依\t382202\n印务\t382203\n和声版\t382204\n憾路者\t382205\n妙传\t382206\n吴江农村商业银行\t382207\n瓶架\t382208\nChongqing\t382209\n永汉镇\t382210\n上海环境\t382211\nmemorandum\t382212\n六淫\t382213\n户行\t382214\nblob\t382215\n道德品行\t382216\n成风\t382217\n姜明\t382218\n表长\t382219\nAPP开发公司\t382220\n9则\t382221\n可颂坊\t382222\n脆皮炸鸡\t382223\niponex\t382224\nOnce\t382225\n春宴\t382226\nBacking\t382227\n牛霖\t382228\n阿迪达斯\t382229\n内饰板\t382230\n田广林\t382231\n灵蛇\t382232\n污水管\t382233\n谁的本领大\t382234\n舞蹈服\t382235\ncheats\t382236\n硅质聚苯板\t382237\n超级大本营\t382238\noptics\t382239\n上海音乐节\t382240\nvivoY67\t382241\n亦庄\t382242\n嘉城\t382243\n补药\t382244\n风团\
t382245\n国开金融\t382246\n550m\t382247\n600张\t382248\n1.5cm\t382249\npostgis\t382250\n阳志平\t382251\n地役权\t382252\n数学院\t382253\n迈向新时代\t382254\n怀不上\t382255\n姨妈色\t382256\nx260\t382257\n民福康健康养生百科\t382258\n受让\t382259\nbse\t382260\n用时\t382261\n公路管理局\t382262\n骆驼蓄电池\t382263\n布斯\t382264\n九华经济开发区\t382265\n12d\t382266\nDOC\t382267\n赢天下\t382268\n杨子江\t382269\n橡胶板\t382270\n屏裂\t382271\n董文华\t382272\n有碍\t382273\n旷远\t382274\n畅玩5X\t382275\n抓轨\t382276\n安信基金\t382277\n使徒行者2\t382278\n周折\t382279\n谶\t382280\n豆豆鞋\t382281\n中华人民共和国人民防空法\t382282\n意见审计报告\t382283\nAVENT\t382284\n防爆电机\t382285\n忍野八海\t382286\nvda\t382287\nkeygen\t382288\n徐锦江\t382289\n第57\t382290\n直肠癌\t382291\n国际范儿\t382292\n台卡\t382293\n2017-1\t382294\n豆瓣FM\t382295\nrememberme\t382296\n138亿\t382297\n700元\t382298\n&#34\t382299\n电仪\t382300\nSaaS\t382301\n樱桃酱\t382302\n资金占用费\t382303\n嘉力丰\t382304\n养区\t382305\n和法国\t382306\n20克\t382307\n乙肝\t382308\nDown\t382309\n长风镖局\t382310\n速力\t382311\nQQ轻聊版\t382312\n孽欲追击档案之邪杀\t382313\n恪\t382314\n六格\t382315\nCRD\t382316\nPhenom\t382317\n2班\t382318\n北大人大\t382319\n橘\t382320\n罗刚\t382321\n黑框眼镜\t382322\n乱情\t382323\n耐用性\t382324\n许用\t382325\nTextArea\t382326\n正定县\t382327\n246号\t382328\n热泪\t382329\n缝合术\t382330\ncit\t382331\n南开大学环境科学与工程学院\t382332\nPraat\t382333\ncia\t382334\n大导演\t382335\n柏舟\t382336\n尼坤\t382337\n周琦\t382338\nmybook\t382339\n严舒柠\t382340\nSentry\t382341\n真刀实\t382342\n拉萨市\t382343\n两队\t382344\n洛可可\t382345\n有问必答网\t382346\n章为忠\t382347\n技术员\t382348\n狠狠\t382349\n易云\t382350\n内定\t382351\n智能电视网\t382352\n金敏\t382353\n_币安\t382354\n振兴区\t382355\nMinh\t382356\nprayer521\t382357\nf4\t382358\n60000元\t382359\nICON\t382360\n胡杨树\t382361\n24fa\t382362\n漏尿\t382363\n伯瑞\t382364\n65mn\t382365\n污泥压滤机\t382366\n倍科\t382367\n中国水泥协会\t382368\n云播3x\t382369\n长空栈道\t382370\n大黄米\t382371\n达令\t382372\nconsumes\t382373\nMac志\t382374\n天轮\t382375\n督战\t382376\n受持\t382377\n商山\t382378\n250双\t382379\nhuxley\t382380\nG40\t382381\nhtt\t382382\nfax\t382383\n蝶恋\t382384\n一寸\t382385\n匪我思\t382386\n索达吉堪布\t382387\n1500W\t382388\n路遇\t382389\n生辰\t382390\n3331\t38
2391\ncookiejar\t382392\n林诗音\t382393\n瓯江路\t382394\n剪刀手\t382395\nJonyJ\t382396\n王勋\t382397\nitellij\t382398\njinjin\t382399\n入眼\t382400\n博腾\t382401\n黄梅戏\t382402\n土豆片\t382403\n葡萄虎\t382404\nYJV\t382405\n相形见绌\t382406\n坐视\t382407\nづ\t382408\n家道中落\t382409\nCBOT\t382410\n中国广播网\t382411\n此路\t382412\n生态类\t382413\nExploit\t382414\npata\t382415\nASP300\t382416\n婚假\t382417\n探险类\t382418\nBanshee\t382419\n生育保险\t382420\n乘胜狙击\t382421\n免试生\t382422\n13.3寸\t382423\ndoGet\t382424\n念白\t382425\nbeings\t382426\nMongoVUE\t382427\n高手\t382428\n武汉职业技术学院\t382429\ndataview\t382430\n袖山\t382431\n黑尔\t382432\n七彩文鸟\t382433\n人事司\t382434\n石神\t382435\n重庆文理学院\t382436\n易车二手车\t382437\nqcloud\t382438\n51单片机C语言\t382439\n转经筒\t382440\n直装\t382441\n李子勋\t382442\n浦口区\t382443\nrag\t382444\n较场\t382445\n荆州古城\t382446\n5.12汶川大地震\t382447\nWatchers\t382448\nweishenme\t382449\n冯波\t382450\n双升\t382451\n集宁区\t382452\n百分百_\t382453\nRevival\t382454\nnecrosis\t382455\n隧道炉\t382456\n波恩\t382457\n软尺\t382458\n春田\t382459\n政权\t382460\n城厢\t382461\n悬臂梁\t382462\n净\t382463\n清华五道口金融学院\t382464\ndism++\t382465\n十代\t382466\nx特遣队\t382467\n宏观\t382468\nB2B网站\t382469\n开孔率\t382470\n南京华苏科技有限公司\t382471\nsnakes\t382472\n加气\t382473\n新力地产\t382474\n大榭开发区\t382475\n心理活动\t382476\n指甲钳\t382477\n丁元英\t382478\n五心\t382479\n元胞自动机\t382480\n搜房帮-房天下\t382481\n重生类\t382482\n新华广场\t382483\n浦东前滩\t382484\nComicStudio\t382485\n发困\t382486\n邦德\t382487\n中通客车\t382488\n三角袋\t382489\n礼义\t382490\n21款\t382491\n多物理场\t382492\nAtlantis\t382493\ni5-8500\t382494\njar-CSDN\t382495\n每两年\t382496\nADODB\t382497\n佳兆业\t382498\n云函数\t382499\nprotective\t382500\n秋声\t382501\n吃掉\t382502\n安阳市人民政府\t382503\n嗮\t382504\nBOOX\t382505\n金钟云\t382506\nzheting\t382507\n二列\t382508\n北京恒昌\t382509\n第六条\t382510\nOffers\t382511\n$route\t382512\n心眼儿\t382513\n45公里\t382514\n君主论\t382515\n幻想三国志\t382516\n纪念封\t382517\n吉祥三公\t382518\n集装箱起重机\t382519\nnih\t382520\n怠工\t382521\nSofia\t382522\n优酸乳\t382523\n电子游戏\t382524\n拆分单元格\t382525\n羊肚\t382526\nselves\t382527\n图瓦卢\t382528\n以和\t382529\n轧\t382530\n育翔小学\t382531\n狗肉节\t382532\n特种病\t
382533\n新闻报\t382534\n地方官\t382535\nluoxn28\t382536\n卡利亚里\t382537\n征服者\t382538\n巨匠\t382539\n易经的智慧\t382540\ndqn\t382541\n智仁\t382542\n河南工业职业技术学院\t382543\n赉\t382544\n苏珊米勒\t382545\n杂篇\t382546\nConsiderations\t382547\n飞扬跋扈\t382548\n深圳中原地产\t382549\n行天下\t382550\n萧山三中\t382551\n多发性\t382552\n倪氏\t382553\n源配置\t382554\nWZ111\t382555\nJewels\t382556\n坐镇\t382557\n四氢噻吩\t382558\nPHYSICS\t382559\n丧尸\t382560\n会务组\t382561\n验讫\t382562\n龙与地下城\t382563\n最后一刻\t382564\n雍正帝\t382565\nSEMICON\t382566\n仙音\t382567\nAlbion\t382568\n矬\t382569\n详解篇\t382570\n六轴\t382571\n大汉\t382572\n高尔夫场\t382573\n皇京港\t382574\n龙缸\t382575\ntbm\t382576\n用电量\t382577\n工程咨询网\t382578\n加域\t382579\n五瓣\t382580\nSubclipse\t382581\nLoctite\t382582\nMg\t382583\n小米盒子3S\t382584\n爸爸的花儿落了\t382585\ndot1x\t382586\n安康起名网\t382587\nFlames\t382588\n汉沽\t382589\n打点\t382590\nprescribed\t382591\n尹航\t382592\n白市驿\t382593\n1350\t382594\n政治学\t382595\n曹宇\t382596\ndva\t382597\n房值\t382598\n厦门轮渡有限公司\t382599\n英德市\t382600\n亚楠\t382601\ncentos7\t382602\n宋明\t382603\n汤达人\t382604\n古金\t382605\n一军\t382606\n顽石\t382607\n蒋本珊\t382608\nhades\t382609\nMedal\t382610\n金号\t382611\n拗音\t382612\n北京小学\t382613\n铺管\t382614\nEyeglasses\t382615\n青春期\t382616\n公演\t382617\n家堂\t382618\n黑驴\t382619\n冰锤\t382620\n杀跌\t382621\n生胶\t382622\n元夜\t382623\n带子字\t382624\n矢量法\t382625\n185平方\t382626\n消费类\t382627\nNCS\t382628\n985.211\t382629\n陈医生\t382630\n全国人大法工委\t382631\n无损音乐吧\t382632\ne-Learning\t382633\n428元\t382634\n专用线\t382635\n859\t382636\nKnowYourself\t382637\n葛卫东\t382638\n600ml\t382639\n江苏省海安高级中学\t382640\nTekken\t382641\n海滨大道\t382642\n21天\t382643\n2013-2015年\t382644\n马岗\t382645\n条件函数\t382646\n讨说\t382647\n恒昌惠诚\t382648\nrsoft\t382649\nIll\t382650\nepub+mobi+azw3\t382651\nTOMAHAWK\t382652\nBDTIC\t382653\n贵州省公路局\t382654\n血肌酐\t382655\nEXO\t382656\n兰开斯特\t382657\n八字算命|周易起名【舜缘居\t382658\n湖北十堰\t382659\ncb190\t382660\n夫妻们\t382661\n媒体流\t382662\n注册处\t382663\n北部湾\t382664\n纳米硅\t382665\n林晖\t382666\n飞儿乐团\t382667\n国税函\t382668\n兆星\t382669\n礼品网\t382670\ntojsonstring\t382671\n金信诺\t382672\n一单\t382673\n2.86\t38267
4\n新阳工业区\t382675\nMOD\t382676\n还给\t382677\n大鼠\t382678\n2017年4月10日\t382679\n悬浮门\t382680\n40px\t382681\n羞羞片\t382682\n双狮\t382683\n曹军\t382684\n快人\t382685\n母性\t382686\nDrummer\t382687\n卡佩恩\t382688\n昌河铃木\t382689\n真人网\t382690\n立缘石\t382691\n6580\t382692\n保险代理人\t382693\n冷卷\t382694\n第一针\t382695\n胎壁\t382696\n新训\t382697\n大泽佑香\t382698\n杀神风\t382699\n类头\t382700\n安丰\t382701\n蒙多\t382702\n陈志文\t382703\n昆山市政府\t382704\nmelodyne\t382705\npantum\t382706\n污水井\t382707\nreject\t382708\nappimage\t382709\nspike\t382710\n潜江\t382711\nneogeo\t382712\n天河南\t382713\nfluffy\t382714\n佐拉\t382715\n是不是不是\t382716\n江西机电职业技术学院\t382717\n儿茶酚胺\t382718\nfengxian\t382719\n双师\t382720\n魏宏广\t382721\n泰迪俱乐\t382722\nmRNA\t382723\n加里奥\t382724\nDVD版\t382725\nCamping\t382726\n我的青春\t382727\n罪己诏\t382728\n异义词\t382729\n叉叉助手\t382730\n坏蛋是怎样炼成的\t382731\n潍坊市中医院\t382732\n2000吨\t382733\n有色冶金\t382734\n马里兰大学\t382735\nfoam\t382736\na货\t382737\nsiro-1935\t382738\n正板\t382739\n洪达\t382740\n23333\t382741\n上海人\t382742\n友谊街道\t382743\n织布\t382744\n训练基地\t382745\n秋茶\t382746\n违章建筑\t382747\n抢险\t382748\n适配体\t382749\n四险一金\t382750\n海悦花园\t382751\n佛府\t382752\n凯瑞德\t382753\n李连贵\t382754\n头牌\t382755\n手机新蓝网\t382756\n155cm\t382757\n12CD\t382758\n伸缩节\t382759\n中国中央\t382760\n赵金涛\t382761\nsers\t382762\n广东广电\t382763\nreattempted\t382764\n油页岩\t382765\n高见\t382766\n这一招\t382767\n约号\t382768\n张珊珊\t382769\n先发\t382770\n叶树\t382771\n主子\t382772\n小强斋太\t382773\nqushi\t382774\n势面\t382775\n佛山电信\t382776\n挥发酚\t382777\n炸油条\t382778\n沙纸\t382779\n惊恐\t382780\nClass\t382781\npkcs12\t382782\n1636\t382783\n太师\t382784\n十把\t382785\n百度音乐盒\t382786\n潮间\t382787\n曲江香都\t382788\n司机\t382789\n0051\t382790\nassimilation\t382791\n加湿器\t382792\nhardly\t382793\n31号\t382794\nBig\t382795\nxiuno\t382796\n谷德\t382797\n镇雄新闻网\t382798\n征管法\t382799\n守望先锋游戏吧\t382800\n五体投地\t382801\n草苗\t382802\n中山大学药学院\t382803\nFabrication\t382804\n会计员\t382805\n晋卦\t382806\n旗袍秀\t382807\n大众汽车集团\t382808\n世茂集团\t382809\n峡江\t382810\n魔法使的新娘\t382811\n仙剑6\t382812\n百度百家\t382813\n迪士尼动画\t382814\nvillages\t382815\n街头强袭\t382816\n木星\t382817\n世界性
\t382818\n36话\t382819\nGone\t382820\nClaudio\t382821\n青城山前山\t382822\nx^2+y^2\t382823\n逍遥哈姆雷特\t382824\n菜色\t382825\nConcat\t382826\n0370\t382827\n帮子\t382828\n宝贝儿\t382829\n朱小杰\t382830\n逸境\t382831\n7.1.0\t382832\n基本费\t382833\n广东省经济和信息化委\t382834\n第11条\t382835\n算计\t382836\n善作\t382837\n大提顿国家公园\t382838\nRiceQuant米筐\t382839\n促会\t382840\nkuwo\t382841\n吃喝\t382842\nperso\t382843\nglobe\t382844\n有邪\t382845\n20150602\t382846\n痱\t382847\nWH-1000XM2\t382848\n增韧剂\t382849\n杜银玲\t382850\n花样滑冰\t382851\n解不开\t382852\n双块式\t382853\n远洋大厦\t382854\n清波\t382855\nfafa\t382856\n群之马\t382857\n5个小时\t382858\n并行化\t382859\n张家港农村商业银行\t382860\n曹寅\t382861\n超过12小时\t382862\n杰森\t382863\n非平衡面板数据\t382864\n生化奇兵:无限\t382865\n碑谷2\t382866\n红胶\t382867\n升限\t382868\nprouhub\t382869\n1108\t382870\n歌者\t382871\ncbm\t382872\n增高鞋\t382873\n丹东酒店\t382874\n中信红树湾\t382875\n天边\t382876\n襟\t382877\n华南农大\t382878\n兄战\t382879\n三国机密之潜龙在渊电视剧全集\t382880\n慈悲\t382881\n旧金山\t382882\n奔驰S级论坛_汽车之家论坛\t382883\nGrunt\t382884\n承建方\t382885\n新站区\t382886\n速效救心丸\t382887\n编办\t382888\n重化\t382889\nliuming\t382890\n水田\t382891\n地极\t382892\nFails\t382893\nown\t382894\nIntelligence\t382895\n普赛\t382896\n连云港火车站\t382897\nPycharm调试器\t382898\n挂面机\t382899\n珠江\t382900\npanbaidu\t382901\n浴沙\t382902\n三华\t382903\n牧高笛\t382904\n儒剑\t382905\n合成器\t382906\n男3女\t382907\n半生缘\t382908\n六年多\t382909\nmatplotlib\t382910\n沙冰\t382911\nTechnica\t382912\n死后\t382913\n蓝天救援队\t382914\n争得\t382915\n远坂\t382916\n明珠湾区\t382917\nk折\t382918\n甲基橙\t382919\n贝小爱\t382920\n重庆车管所\t382921\n稠州银行\t382922\n重获新生\t382923\n谈吐\t382924\n痴汉电车\t382925\n其三\t382926\n飞毛\t382927\n菁英汇\t382928\nRSA\t382929\n超极\t382930\nCeleron\t382931\n85家\t382932\n郑州市物价局\t382933\n中颖\t382934\n上海振华重工(集团)股份有限公司\t382935\n荣耀6X\t382936\n大疆公司\t382937\n双枪\t382938\n83岁\t382939\n唯识宗\t382940\n酒仙桥\t382941\n富贵者\t382942\n3列\t382943\n发妻\t382944\n在职研\t382945\n不要太多\t382946\n校运会\t382947\n处子\t382948\n绝地求生刺激战场注册\t382949\n贩毒\t382950\nluis\t382951\n雷克萨斯nx300\t382952\n蛇莓\t382953\nPP苹果助手\t382954\n星露谷物语stardewvalley\t382955\nterminator\t382956\n房屋遗产\t382957\n10锁屏
\t382958\n人柱力\t382959\n网格纸\t382960\n省实\t382961\n津石高速\t382962\n辽宁路\t382963\n湖北银行\t382964\n阴道壁\t382965\nFTW\t382966\n小九寨\t382967\njs事件监听机制\t382968\n最著名\t382969\n氧化铋\t382970\n中国保利集团\t382971\n前列腺炎\t382972\n陈云\t382973\n易容\t382974\n著作权侵权\t382975\n南岳政府网\t382976\n王冠\t382977\n李滨\t382978\n奔腾X40\t382979\nCENTURY\t382980\nv2.6\t382981\n小夭\t382982\n玩大了\t382983\n终身学习网\t382984\n巴纳吉\t382985\nCMOS\t382986\n门捷列夫\t382987\nAPPLICATION\t382988\n伊娃格林\t382989\n负面\t382990\n埃格蒙特集团\t382991\n神奇树屋\t382992\nebook\t382993\ndet\t382994\ns001\t382995\n九部委\t382996\n定兴吧\t382997\nally\t382998\n4320\t382999\n数控弯箍机\t383000\n杏洋花\t383001\n钱多多\t383002\n一梯\t383003\n大唐贵妃\t383004\n风云人物\t383005\n孕交\t383006\nstudion\t383007\neating\t383008\n普勒\t383009\n崖州湾\t383010\n共青团团\t383011\n五合一\t383012\n歌尔声学股份有限公司\t383013\n砚山县人民政府\t383014\n酮酸\t383015\n新浪商洛\t383016\n中书协\t383017\n2.6.7\t383018\n罗技G910\t383019\n多威\t383020\n潍坊市国土资源局\t383021\n软弱\t383022\n姿美堂\t383023\nsisley\t383024\n秀林\t383025\n失之毫厘\t383026\n黄牛党\t383027\n簡體\t383028\nSABIC\t383029\ncheater\t383030\n石林风景区\t383031\n小妾\t383032\n检具\t383033\n妖妻\t383034\n移印机\t383035\n信息楼\t383036\n庐山瀑布\t383037\n马布岛\t383038\n第十八讲\t383039\n李剑\t383040\n小泰\t383041\n08\t383042\n二七\t383043\n花红\t383044\n玛斯特\t383045\n挡块\t383046\n鹿港文化\t383047\n广富林街道\t383048\n毕淑敏\t383049\n应税凭证\t383050\n1854\t383051\n28周年\t383052\n拉法兰\t383053\n菜园镇\t383054\n海草舞\t383055\n牛栏山\t383056\n九里亭街道\t383057\n分房\t383058\n索阁\t383059\n革兰氏\t383060\n丝带花\t383061\n天天安卓模拟器\t383062\n残幅\t383063\nstarccm+\t383064\n沈子皓\t383065\n墨子号\t383066\n榴莲控\t383067\nrepeating\t383068\n牟平区政府网\t383069\ntianyan\t383070\n千万元\t383071\n塔牌集团\t383072\n折叠床\t383073\n暹\t383074\n小米酒\t383075\n合作路\t383076\nㄟ\t383077\n影音先锋\t383078\nsteins\t383079\n要籍\t383080\n黑暗世界\t383081\n2CH\t383082\n王骞\t383083\nshedejie\t383084\n猪脚粉\t383085\n伪装网\t383086\n沿湖\t383087\n洗发皂\t383088\nIGXE\t383089\n7.2版\t383090\n孔令奇\t383091\n扎\t383092\n五爷\t383093\n南京港澳\t383094\n聚氯乙烯\t383095\n葛根茶\t383096\n伊巴卡\t383097\nkeli\t383098\nVAC\t383099\n洗碗块\t383100\n差价\t383101\n李作成\t383102\n月球之谜\t383103\n馆藏\t
383104\n长安新城\t383105\n来函\t383106\n建筑施工高处作业安全技术规范\t383107\n影像园\t383108\n二女\t383109\n小火龙\t383110\n永钢\t383111\nWII\t383112\n秋之回忆6\t383113\n3秒后\t383114\nLeone\t383115\n咸鱼网\t383116\n客\t383117\n花芯\t383118\n1864年\t383119\n天鹅的故事\t383120\n野鱼\t383121\n案子\t383122\n卡姿兰\t383123\ngt赛车\t383124\n天津外国语大学\t383125\n计次\t383126\n山那边\t383127\n口误\t383128\n晨跑\t383129\n两封\t383130\n矿井\t383131\n胎压\t383132\ng-shock吧_\t383133\n七彩\t383134\napp-斗米兼职\t383135\n线纹\t383136\n重力流\t383137\n保平安\t383138\n张朝晖\t383139\nlistView\t383140\n曲阜东站\t383141\n赵小米\t383142\n西陵\t383143\nExt\t383144\n6.8%\t383145\n腹灵\t383146\nJAVA环境变量\t383147\n经编\t383148\n铁路\t383149\n禁爱\t383150\nTORCH\t383151\nburg\t383152\n表链接\t383153\n不愁\t383154\n菜机\t383155\n低脂奶\t383156\nchemistry\t383157\nShapefile\t383158\n迪迪\t383159\n截流井\t383160\n双鱼岛\t383161\ncreo3.0\t383162\n新开始\t383163\n销售税\t383164\n财通证券股份有限公司\t383165\n7分\t383166\n棒球赛\t383167\n共同生活\t383168\n52PKDNF\t383169\n氟比洛芬巴布膏\t383170\n虎子\t383171\n我的家庭\t383172\nMINISO\t383173\nabcmouse\t383174\n聚二甲基硅氧烷\t383175\n猎隼\t383176\n纨绔世子妃\t383177\ntokenizer\t383178\n思悼\t383179\n小李琳\t383180\n几分钱\t383181\n间质性肺炎\t383182\n河北体育学院\t383183\n苍南县\t383184\n去哪儿旅游\t383185\n许允美\t383186\n彝家\t383187\n敲诈勒索\t383188\n阳煤\t383189\n盐边县\t383190\n来世\t383191\n30万亿\t383192\nSunglasses\t383193\n拉斯\t383194\n不再来\t383195\n大原\t383196\n公安小区\t383197\n3699元\t383198\n康复治疗\t383199\nprotro\t383200\n螺旋机\t383201\nNullege\t383202\n四节\t383203\n324\t383204\n纸篓\t383205\n4205\t383206\n万亿级\t383207\n百官\t383208\n中国农业大学食品科学与营养工程学院\t383209\nBitmap\t383210\n好事者\t383211\n启智\t383212\n环球卓越网\t383213\n迷离档案\t383214\n海士\t383215\n大僵尸\t383216\n3099\t383217\n20160916\t383218\nyeezy500\t383219\n执教\t383220\n熊猫银币\t383221\n白玉\t383222\nbundler\t383223\n麦乳精\t383224\n钉扣机\t383225\n奥林匹斯山\t383226\n北京本地宝\t383227\n聊城日报\t383228\nwin7_装机吧\t383229\n77部\t383230\ngitee\t383231\n11.6\t383232\n布吉镇\t383233\n汽车测试网\t383234\nmole\t383235\n施桥镇\t383236\n戴斯\t383237\n研究方向\t383238\n桔梗谣\t383239\n苏琪\t383240\n泸女郎\t383241\n略号\t383242\n国民革命军\t383243\n绳锯机\t383244\n小阴唇\t383245\n撞倒\t383246\n女佣\
t383247\n海歌\t383248\n决定权\t383249\n寒武再临\t383250\nGALA乐队\t383251\n空性\t383252\nGMP认证\t383253\nconsideration\t383254\nr2017\t383255\n诚信度\t383256\n刘渊\t383257\n四点底\t383258\n本田CR-V\t383259\n20171230\t383260\ndelivers\t383261\n阿迪达斯三叶草\t383262\n曹丽娜\t383263\n坑\t383264\n玄魂\t383265\nл\t383266\n002044\t383267\nf-46468\t383268\n140项\t383269\n小侦探\t383270\nhired\t383271\n销售人员\t383272\n五五分\t383273\n成人综合网\t383274\n姜彦汐\t383275\n16色\t383276\n与龙共舞\t383277\n澄空\t383278\n1217\t383279\n莉蒂与苏尔的工作室\t383280\ndayz独立版\t383281\n阿尔巴尼亚\t383282\n甲子年\t383283\n室性心律失常\t383284\ngdt\t383285\n儿媳妇儿\t383286\nTabHost\t383287\nSVPWM\t383288\n悲风\t383289\n校生\t383290\n穆棱\t383291\nCalibration\t383292\nSYMBOL\t383293\n重装系统网\t383294\n晏阳初\t383295\nUsed\t383296\n以你为名的青春\t383297\n泉州七中\t383298\n五环之歌\t383299\nipip\t383300\nipset\t383301\n巩固率\t383302\n年金基金\t383303\n羽衣\t383304\n多位\t383305\ndjd\t383306\n任天堂switch\t383307\n一二三级\t383308\n点带面\t383309\n吸血鬼骑士\t383310\n杭州锦江集团\t383311\n优雅\t383312\n永泰东里\t383313\n次氯酸钙\t383314\n16X\t383315\n鼎炉\t383316\n污染物\t383317\n马驹桥镇\t383318\n哈尔滨地区\t383319\n靖苏\t383320\ngraphviz\t383321\n寒窗苦读\t383322\n皮影\t383323\n黏附\t383324\n丰城市政府网\t383325\n白晶\t383326\n小米笔记本Pro\t383327\n8080\t383328\n1200_\t383329\n5孔\t383330\n下野\t383331\n16线\t383332\nAnaconda\t383333\n课桌\t383334\n徐淼\t383335\n内生菌\t383336\nmatters\t383337\nrinka\t383338\n122个\t383339\n军阶\t383340\n怀疑论\t383341\n腾讯NOW\t383342\nLaszlo\t383343\n视力\t383344\n字幕\t383345\n十里铺\t383346\n法医\t383347\n化茧成蝶\t383348\ndemux\t383349\n杭州师范大学\t383350\nE100\t383351\n白沙镇\t383352\n里昂商学院\t383353\n湾\t383354\n420g\t383355\n民事关系\t383356\n务虚会\t383357\ndd-wrt\t383358\n新起点\t383359\n辽中区\t383360\n大联盟\t383361\nspring\t383362\n20170806\t383363\n韩子\t383364\n湘桥区\t383365\n满暴\t383366\n北京国家会议中心\t383367\n大牛们\t383368\n朱志根\t383369\n艾尼维亚\t383370\ncp5s2\t383371\n太谷\t383372\n鼻内镜\t383373\n王泓翔\t383374\n采集盒\t383375\n百度文库财富值\t383376\n4578\t383377\nvCenter\t383378\n翔阳\t383379\n挽回\t383380\n户部山\t383381\n用电率\t383382\n1724\t383383\n路机\t383384\n徐世昌\t383385\nTESTV\t383386\n寻思\t383387\n单冷\t383388\n翠苑三区\t3
83389\n正码\t383390\n放纵\t383391\n小乘\t383392\n你的女人\t383393\n六平方\t383394\n池州职业技术学院\t383395\n重庆三中\t383396\n修真终结者2审判日\t383397\n长长见识\t383398\n漆光\t383399\n互联网_易房网\t383400\n蝉联\t383401\n雪域\t383402\n蒙阴县\t383403\n十五届\t383404\n裸刑\t383405\n37所\t383406\n莫卡\t383407\n新游\t383408\n导电银浆\t383409\n刘三姐\t383410\n大剧院\t383411\nphaser\t383412\n天下第一武道会\t383413\n1086\t383414\n奥迪女总裁的贴身兵王\t383415\n尹真熙\t383416\n存量房买卖合同\t383417\n变性手术\t383418\n2017年7月2日\t383419\n2014.2\t383420\n新桑\t383421\n搜珍网\t383422\nDoors\t383423\n凌寒\t383424\n绝地求生沙漠图\t383425\n高青县\t383426\n桐乡新闻网\t383427\n特异性\t383428\n蒜苔\t383429\nYIN\t383430\n龙江\t383431\n切忌\t383432\n彩六\t383433\n20170804\t383434\nDTA\t383435\nProQuest\t383436\n巴沙尔·贾法里\t383437\n尼克松\t383438\n写真机\t383439\n气浪\t383440\n花岙岛\t383441\n小布叮\t383442\n七曜\t383443\n阿凡达1\t383444\n暴走英雄坛\t383445\n长波\t383446\n光着脚\t383447\n保险从业考试\t383448\n冰洁师\t383449\n安徽涉外经济职业学院\t383450\n附加赛\t383451\n王紫杰\t383452\n超星星学园\t383453\n小天骐\t383454\n第五级\t383455\n高水\t383456\n信息产业电子第十一设计研究院科技工程股份有限公司\t383457\n重庆市开州区人民政府\t383458\n才思\t383459\n漫画师\t383460\n推背图\t383461\nDECATHLON\t383462\n姓名网\t383463\n拼接处理器\t383464\n摩擦系数\t383465\n太阳灯\t383466\n嘉善南\t383467\n三三宝\t383468\n小隐\t383469\n向阳而生\t383470\n庄河市\t383471\n运程车\t383472\n过庭\t383473\n方向上\t383474\n三年级语文\t383475\n证联\t383476\n社保/公积金公司\t383477\n瓦林卡\t383478\n3【\t383479\n2016.doc\t383480\n食用盐\t383481\n退磁\t383482\n个人住房公积金贷款\t383483\n油翁\t383484\n贵阳市城市管理局\t383485\n敷设\t383486\n4x\t383487\n20170212\t383488\n川口春奈\t383489\nDaphne\t383490\n北京积分落户社保\t383491\n真实机\t383492\nbrita\t383493\n富邦华一银行\t383494\n更重要\t383495\n参松\t383496\n休闲皮鞋\t383497\n风花雪\t383498\n施工员证\t383499\n佐治亚理工学院\t383500\nPOPPUR\t383501\nTariff\t383502\nmobileye\t383503\n云台\t383504\n王道文\t383505\n汪老师\t383506\n梅菜\t383507\n珐琅铸铁锅\t383508\n福建医科大学附属口腔医院\t383509\nSODA\t383510\nCrashlands\t383511\n湖里街道\t383512\n2万6\t383513\n励志哥\t383514\n鱼火锅\t383515\ntchart\t383516\n娄底市国土资源局\t383517\n不振\t383518\nTumor\t383519\ncad2006\t383520\npythone\t383521\n合金装备5:幻痛\t383522\n母兔\t383523\n第九次\t383524\n哈佛家训\t383525\nsharpe\t383526\n半一\t383527\n亚非\t
383528\n申东烨\t383529\n足本\t383530\n北京市国土资源局\t383531\n美舍河\t383532\ntransforming\t383533\n150W\t383534\nWalton\t383535\n土超联赛\t383536\n亿兆\t383537\nreaddir\t383538\n三恒\t383539\n福满堂\t383540\nresultset\t383541\n何磊\t383542\n保安师\t383543\n马盖先\t383544\n刘渡舟\t383545\n派生类\t383546\n泰州市人民医院\t383547\n新开神途传奇\t383548\nStarwood\t383549\n动作片\t383550\n_【高铁网\t383551\n太阳能庭院灯\t383552\n时间转换函数\t383553\n科睿唯安\t383554\n民众\t383555\n有机碳\t383556\n纸屑\t383557\n自助烧烤\t383558\nv2.4.3\t383559\n中国式相亲\t383560\n张立昂\t383561\n打环\t383562\n人武部\t383563\nRecruiter\t383564\n佳能激光\t383565\n山东圣翰财贸职业学院\t383566\nJake\t383567\nmuzzy\t383568\n蹲姿\t383569\n退守\t383570\n24色\t383571\nJGJ59\t383572\n张小光\t383573\n含情\t383574\n不锈钢丝网\t383575\n势力值\t383576\n口服避孕药\t383577\n狐女\t383578\n汪斌\t383579\n辽宁建筑职业学院\t383580\n南方汇通\t383581\n归有光\t383582\nwlan\t383583\n格子铺\t383584\ngenetically\t383585\n李全\t383586\n首都经贸大学\t383587\n好害羞\t383588\n沙湾\t383589\nMac|Mac\t383590\n溢价\t383591\n徐磊\t383592\n歡迎光臨\t383593\n玻璃器\t383594\nintermediate\t383595\nwww.tangzhekan3.com/modules/article/txtarticle\t383596\n热能\t383597\n山东电信\t383598\n绘出\t383599\n危货\t383600\npco\t383601\n883\t383602\n孟尝君\t383603\n衬塑钢管\t383604\nHD4400\t383605\n严姓\t383606\ns41\t383607\n军王\t383608\n梯希爱\t383609\ntun\t383610\n0.0.0.0\t383611\nasin\t383612\n0.4.0\t383613\n秦守\t383614\n抑\t383615\n51cto技术博客\t383616\n朱河\t383617\n赠予\t383618\n冲浪\t383619\nvijos\t383620\n红茶\t383621\n1923\t383622\n凯蒂猫\t383623\n1.0L\t383624\n暗室\t383625\nProducts\t383626\narchivelog\t383627\n天脊\t383628\n8个月\t383629\n城地\t383630\n改端\t383631\n赵豪志\t383632\nLouvre\t383633\n中国女网\t383634\n万树\t383635\n萌舞\t383636\nPAB\t383637\nlldp\t383638\nundefeated\t383639\n河北师范大学汇华学院\t383640\n繁简\t383641\nacme\t383642\n投弹手\t383643\n0871-64171212\t383644\n番摊\t383645\n日照街道\t383646\nbaid\t383647\n江湾体育场\t383648\n艾洛维\t383649\n参见\t383650\n中国电子系统技术有限公司\t383651\n复习题集\t383652\n扯淡\t383653\nomnifocus\t383654\n卢秉恒\t383655\nfack\t383656\n缙云县\t383657\n牙列\t383658\n溃疡性结肠炎\t383659\n钻展\t383660\n帝霸\t383661\n服务性\t383662\n北苑小区\t383663\n阿拉城\t383664\n这点\t383665\n抄袭者\t38366
6\n西师附小\t383667\n瑜老板\t383668\nenrichment\t383669\n广东省总工会\t383670\n大红门街道\t383671\n油电混合动力汽车\t383672\nBBW\t383673\n维也纳\t383674\n芭蕉叶\t383675\n延长线\t383676\n79HD\t383677\n独资\t383678\n60亩\t383679\n轩辕剑5\t383680\nedrawings\t383681\nOuting\t383682\n1.5部\t383683\n11.05\t383684\nIcloud\t383685\n文贴\t383686\n人工神经网络\t383687\n惹祸\t383688\n拉霸\t383689\nsyl\t383690\n北控\t383691\n数据恢复软件\t383692\nv7.3\t383693\nExpats\t383694\n投资理财公司\t383695\n仪器仪表学报\t383696\n圣心大教堂\t383697\n金志文\t383698\n趋利避害\t383699\n瑞海公司\t383700\n韩端\t383701\n钩器\t383702\n火烈\t383703\n地铁族\t383704\nsystematic\t383705\n雄奇\t383706\n费罗龙\t383707\npts/0\t383708\n9cm\t383709\n贵州财经学院\t383710\n43码\t383711\nios7.1\t383712\n五百元\t383713\n风热感冒\t383714\n蛮荒记\t383715\n叛逆者\t383716\n雪乡\t383717\n日本\t383718\nFFF\t383719\n中南部\t383720\n中华人民共和国司法部\t383721\n剑星\t383722\nPhp\t383723\n香港汇丰银行\t383724\nFFMPEG\t383725\n小密圈\t383726\n硫酸锰\t383727\n孤舟蓑笠翁\t383728\n97.5%\t383729\n汉化补\t383730\n滴滴打车吧\t383731\nWE+\t383732\n打菜\t383733\n九零后\t383734\n九鹭\t383735\n奶癣\t383736\n蓝色生死恋\t383737\n安徽大剧院\t383738\nFi音响\t383739\n电子磅\t383740\n农博会\t383741\n市商网\t383742\n秘传\t383743\n3日后\t383744\n3DMGAME论坛\t383745\n上海电力公司\t383746\n临安新闻网\t383747\n落陷\t383748\n广东肇庆中学\t383749\n亚盛\t383750\n镇江金山\t383751\n1694\t383752\n秦明\t383753\nwelcome-file-list\t383754\n天文馆\t383755\n小日子\t383756\n说唱歌\t383757\n歪歌\t383758\n預\t383759\nTZZ\t383760\n小爱音箱mini\t383761\n中华集成灶网\t383762\n贞芸劫\t383763\nOut\t383764\n8号\t383765\n明杆闸阀\t383766\n敦煌机场\t383767\n自贸港\t383768\n初装费\t383769\ncrj\t383770\n扼\t383771\n芷江县\t383772\n加裕智倍保\t383773\n税改后\t383774\n91.com\t383775\n传唱\t383776\nfilelist\t383777\nexmail\t383778\n莫昔芬\t383779\n飞鹿言情网\t383780\n春雨的色彩\t383781\nWinCE\t383782\n鼻饲\t383783\n指挥官\t383784\n2018年03月15日\t383785\nSky\t383786\n太平军\t383787\n云游\t383788\n20160221\t383789\n睿骋CC\t383790\n烟雨楼\t383791\n类被\t383792\n管理会计师\t383793\ndiscussions\t383794\n澳航\t383795\n铝镜\t383796\n益达口香糖\t383797\n军马场\t383798\n新民事诉讼法\t383799\n中华人民共和国驻蒙古国大使馆\t383800\nexcel格式素\t383801\n欧美区\t383802\ndayone\t383803\n后路\t383804\n下里\t383805\nFrancis01\t383806\nmo
r\t383807\n愈来\t383808\n末日孤舰第四季\t383809\n弊\t383810\n难知\t383811\n12000元\t383812\nFIFA17\t383813\nCables\t383814\n肿物\t383815\n嵌套式\t383816\n入座\t383817\n新三板公司\t383818\n铜元\t383819\n211.90\t383820\n枸杞茶\t383821\n浙江吉利控股集团有限公司\t383822\n64平米\t383823\n知明\t383824\n拔根芦柴花\t383825\n模卡\t383826\n降低\t383827\nziroom\t383828\n称得上\t383829\n7150\t383830\n湘府路\t383831\n明天出版社\t383832\n夺宝传世吧\t383833\n网易uu\t383834\n无水乙醇\t383835\n得了荨麻疹\t383836\n首尔市\t383837\n铁西\t383838\n4、5月\t383839\n8000分\t383840\n5G概念股\t383841\n巨屌\t383842\n浙江工业大学之江学院\t383843\n胶棒\t383844\nController层\t383845\nnetinet\t383846\n学峰\t383847\n安农\t383848\n约架\t383849\n长江路街道\t383850\n抑制剂\t383851\n安兰德\t383852\n中国印钞造币总公司\t383853\n山本杏里\t383854\n金州勇士队\t383855\n阿玛达\t383856\n一标\t383857\n金都\t383858\n维他\t383859\nCan't\t383860\n梁文道\t383861\nLED全彩显示屏\t383862\n松果儿\t383863\n九连环\t383864\n04月01日\t383865\n沙家浜\t383866\n6.87\t383867\n54号\t383868\n国共\t383869\n河口轻舞飞扬健身操\t383870\nbrowser\t383871\n三十篇\t383872\n曲中人\t383873\n给我足交\t383874\n艺人\t383875\n産\t383876\nTL\t383877\n中公考研网\t383878\n录制\t383879\n互为相反数\t383880\n五蜈蚣标止咳丸\t383881\n绘心\t383882\n反分裂\t383883\n八开\t383884\nmkii\t383885\n没有子\t383886\n耳螨\t383887\n罗东\t383888\n修音\t383889\n碰触\t383890\n新宿幻灵事件\t383891\n酥性饼干\t383892\nPRO4\t383893\n土家酱香饼\t383894\n锻件\t383895\n老汉\t383896\n凯儿\t383897\n青浦县\t383898\n88级\t383899\n晋陵\t383900\nburnt\t383901\n方矩管\t383902\ncatid\t383903\n大都会艺术博物馆\t383904\n丁帆\t383905\n3本\t383906\n国际货运代理公司\t383907\n高层建筑工程\t383908\nOfficejet\t383909\n起球\t383910\n0626\t383911\nMoka\t383912\nShiny\t383913\n宇杰\t383914\n闽北日报\t383915\n借尸还魂\t383916\n湾湾川\t383917\n如海\t383918\n7纳米\t383919\n程序锁\t383920\nzzuIvy\t383921\n鸣\t383922\n1200dpi\t383923\n278\t383924\n泥块\t383925\n浙江东方职业技术学院\t383926\n舟山市人力资源和社会保障局\t383927\n所在地\t383928\n明思\t383929\nThousand\t383930\n长征大会师\t383931\n被淹\t383932\nparallelsdesktop\t383933\n南德\t383934\n2016年4月20日\t383935\n李德新\t383936\n摇指\t383937\nRA2\t383938\n碳汇\t383939\nBATTLEGROUNDS\t383940\n纸巾机\t383941\nxwindow\t383942\n奴良陆\t383943\n辛迪加\t383944\nMOSS\t383945\n拯救\t383946\n文灶\t383947\n沙漠之鹰\t383
948\n张嘉佳\t383949\n黄海森林公园\t383950\n战七少\t383951\n宜配网\t383952\n96a\t383953\n围剿\t383954\n小马哥\t383955\n章回\t383956\nfemjoy\t383957\n妖虫\t383958\naverage\t383959\n玫瑰战争\t383960\nUC论坛\t383961\n栉风沐雨\t383962\n无雨\t383963\n孔明锁\t383964\n床边\t383965\n拯救世界\t383966\nRad\t383967\nRedemption\t383968\n岳松\t383969\n密室逃脱\t383970\n資訊網\t383971\n02级\t383972\nHandbook\t383973\n北京科技大学天津学院\t383974\ndaming\t383975\n教展\t383976\n动环监控\t383977\n治好\t383978\n杨建东\t383979\n路易十六\t383980\n韩乐\t383981\n计控\t383982\n木盆\t383983\n25时\t383984\n白梅\t383985\nSaveBT\t383986\nENJOY\t383987\n精细\t383988\n德勒\t383989\n健康度\t383990\n兰生股份\t383991\n香兰素\t383992\nenu\t383993\n教念\t383994\nwwW\t383995\n孙国强\t383996\n镍带\t383997\n高颖\t383998\ndatefield\t383999\n神罗\t384000\n丑事\t384001\n第期\t384002\n五分钱\t384003\n9.2%\t384004\n百分之三\t384005\n周榜\t384006\n佳能200d\t384007\n就是街舞\t384008\n抗震\t384009\n云卡\t384010\nbng\t384011\n王大仁\t384012\n九藏天下\t384013\nyoung卡\t384014\n卤水\t384015\n4月8日\t384016\nEV3\t384017\ngreeeen\t384018\nsheet1\t384019\n冠状动脉粥样硬化\t384020\n饿\t384021\n格林美股份有限公司\t384022\nv2ex\t384023\n春秋左传\t384024\n婴儿装\t384025\n薏米\t384026\nCertificates\t384027\ntwoway\t384028\n列_\t384029\n种瓜\t384030\n600549\t384031\nbravo\t384032\n2.6万\t384033\n房土\t384034\nqqhd\t384035\nYouCompleteMe\t384036\nViuTV\t384037\n湖南保监局\t384038\n1千瓦\t384039\nFidder\t384040\n亿点\t384041\n长松\t384042\n342\t384043\n缺缸\t384044\n电子信息科学与技术\t384045\n西安植物园\t384046\n龙王山\t384047\n爱情进化论\t384048\n临平路\t384049\n食集\t384050\ngta4吧\t384051\n最高楼\t384052\nSlaves\t384053\n咸猪手\t384054\n远月十杰\t384055\n拓普集团\t384056\n随父\t384057\n九江人才网\t384058\n液氦\t384059\n北京凯迪拉克中心\t384060\n无声言证\t384061\n湖北宜化集团\t384062\n炫酷\t384063\n时钟源\t384064\n双色球_足彩_竞彩\t384065\n春堂\t384066\n红鲤\t384067\nEDIUS6\t384068\n中队会\t384069\nRouge\t384070\n陈鸿宇\t384071\nLaser\t384072\n起息日\t384073\n薰衣草园\t384074\n赫尔\t384075\ndll注入\t384076\n谈论\t384077\n很么\t384078\n水冰\t384079\n了不得\t384080\n拐\t384081\n不自量力\t384082\n芳香胺\t384083\n创业园\t384084\n淘宝微淘\t384085\n闲逸\t384086\nPANS\t384087\n披上\t384088\n前端之路\t384089\n帝尔\t384090\nvirtualBox\t384091\nbaking\t384
092\ndistribution\t384093\n漫狂\t384094\n爱书音听书网\t384095\n计算机等级考试网\t384096\n后瑞\t384097\n龙岩学院\t384098\nPISEN\t384099\ndimm\t384100\n188号\t384101\n进出口银行\t384102\n武汉大学继续教育学院\t384103\n洞石\t384104\n圣界\t384105\nsolrj\t384106\n紫叶\t384107\n烂板\t384108\n踩坑\t384109\n马集镇\t384110\n李玲\t384111\n琳琅满目\t384112\n张海峡\t384113\n媽媽\t384114\n主卡\t384115\n文科类\t384116\n南京市第十三中学\t384117\n多一块\t384118\n白衣女\t384119\nL9\t384120\n为感\t384121\ndnf女鬼剑\t384122\nShaper\t384123\ntemple\t384124\n流平剂\t384125\n20170410\t384126\n女神进化论\t384127\n折纸花大全\t384128\n6000万元\t384129\n倒流\t384130\n企一网\t384131\n液口\t384132\n垫脚\t384133\nHUI320高清\t384134\n北京国际图书博览会\t384135\nPV1\t384136\n国旅\t384137\nP-value\t384138\n徐州市审计局\t384139\n小米4C\t384140\n显性基因\t384141\n相结构\t384142\nST一重\t384143\n福田中学\t384144\nSonos\t384145\n移远通信\t384146\n王者荣耀搞笑视频\t384147\n通吃\t384148\n18磅\t384149\nmismatched\t384150\n小秘\t384151\n美标法兰\t384152\n汉高祖刘邦\t384153\n大中华寻宝记\t384154\n牛腱子肉\t384155\n高星级\t384156\ndui\t384157\n单基因遗传病\t384158\n十阶\t384159\n72_\t384160\n公事\t384161\n捕鼠器\t384162\nfirms\t384163\n入党积极分子培养考察登记表\t384164\nWildfire\t384165\n向性\t384166\nsurface3\t384167\n张晴\t384168\n炎黄盈动\t384169\n龙井虾仁\t384170\ntspline\t384171\n原音\t384172\n父子\t384173\n心雨\t384174\n空调控制器\t384175\n个值\t384176\n两三个\t384177\n八十回\t384178\nthumbnails\t384179\n对外投资合作国别\t384180\n1206\t384181\n4.15全民国家安全教育日\t384182\n幼猫粮\t384183\n于红梅\t384184\n建鑫\t384185\nps1模拟器\t384186\n文本输入框\t384187\n成真\t384188\n1丝\t384189\n多种形式\t384190\n判决\t384191\n11【\t384192\n努力\t384193\n方士\t384194\nOctopus\t384195\n照门\t384196\n巴赞\t384197\n贷款类\t384198\n武汉中南医院\t384199\n飞云\t384200\n洪安\t384201\nft\t384202\n宋威周渝民\t384203\niphone8s\t384204\n苏格兰\t384205\nundress\t384206\n吊绳\t384207\n消息框\t384208\n比亚迪秦论\t384209\n主营业务收入\t384210\n临泉路\t384211\ntransferred\t384212\n环世界B18\t384213\n钟淑慧\t384214\n三聚氰胺\t384215\n会阴\t384216\n思密达\t384217\n瑶湖\t384218\n一两次\t384219\n贷款通则\t384220\n住工\t384221\n番禺大石\t384222\n徐标新\t384223\ninward\t384224\n天天啪wwwbobo\t384225\n北方国际\t384226\n小鱼易\t384227\n深圳市考试院\t384228\n斜肌\t384229\n西方\t384230\n第三人称\t384231\njs指令\t384232\
n培训班\t384233\nuint8_t\t384234\n四果\t384235\n海菲兹\t384236\n半截袖\t384237\n住院医疗保险\t384238\n经济学人\t384239\n出差\t384240\n12万\t384241\n贝纳勒斯\t384242\nLov\t384243\nlydia\t384244\n小学篇\t384245\n不用电\t384246\n畏袭\t384247\n江苏省医院\t384248\n预售证\t384249\n十四日\t384250\n宫夜霄\t384251\nMemoir\t384252\n第79期\t384253\n灌浆料\t384254\n酷狗m1\t384255\n金丝草\t384256\n嘴壶\t384257\n商用电磁炉\t384258\nTaj\t384259\nPrestige\t384260\n揭阳职业技术学院\t384261\n西航港街道\t384262\ngff\t384263\nvidaa\t384264\n感应篇\t384265\n直字\t384266\nprovider\t384267\n番库\t384268\ngluon\t384269\n走失\t384270\n授權\t384271\n核桃花\t384272\ntreenode\t384273\n那条路\t384274\n浸润\t384275\nAdvisory\t384276\n树冠\t384277\npvb\t384278\n纵切\t384279\n安阳信息网\t384280\n1.0\t384281\n剪切机\t384282\n音响\t384283\n步字\t384284\nH264\t384285\n530li\t384286\n强颜欢笑\t384287\n陈亚\t384288\nconclude\t384289\n乘除法\t384290\n5N\t384291\nguardians\t384292\n基流\t384293\n皆因\t384294\n九游游戏中心\t384295\nkkt\t384296\n玉米螟\t384297\n20118\t384298\nParadox\t384299\n纯晶\t384300\n辞令\t384301\n黑白格\t384302\n蜜颜\t384303\nm43\t384304\n28nm\t384305\n拉帝欧斯\t384306\n宗像\t384307\n摇步器\t384308\nlynx\t384309\n小葡萄\t384310\n高烧\t384311\n鸟航\t384312\n百度网盘/天翼云盘\t384313\n除虫\t384314\n甲基硫菌灵\t384315\n就业证\t384316\n凌霄\t384317\nGovernor\t384318\n合金弹头\t384319\n女孩们\t384320\n世界大战\t384321\n狩猎笛\t384322\n希斯·莱杰\t384323\nnexus6p\t384324\n系\t384325\n很无奈\t384326\n夹板\t384327\n萍乡市\t384328\n西方马克思主义\t384329\n学力\t384330\n宸宸\t384331\n长鹿\t384332\n天飞机\t384333\n董晴\t384334\n电蚊拍\t384335\np23\t384336\n微投票\t384337\n炸团\t384338\nxp3\t384339\nfacaishu_腾讯\t384340\n岩井俊\t384341\n阶梯价\t384342\n烧包谷\t384343\n页首\t384344\n置业者\t384345\n模拟城市5\t384346\n传染病学\t384347\nwells\t384348\nstrcpy\t384349\n援交\t384350\ncopy\t384351\n吴海啸\t384352\n湘雅二医院\t384353\nwarm\t384354\nwdsw\t384355\n果皮箱\t384356\n沉着\t384357\n硼酸锌\t384358\n杨叔子\t384359\n我的眼睛\t384360\n牧人\t384361\n全国政协常委会\t384362\n顺光\t384363\n三腔\t384364\ndaya\t384365\nonmouseenter\t384366\n猛犸牙\t384367\n张利华\t384368\n突击兵\t384369\nkafka\t384370\n半截\t384371\ndb3\t384372\n52JIDA\t384373\n多项式函数\t384374\nAlt键\t384375\nthousand\t384376\n天津大学\t384377\
n外审\t384378\n桐乡濮院\t384379\n兰总套\t384380\n自驾游网\t384381\n团组\t384382\nehentai\t384383\n2149\t384384\n360小时\t384385\n那时候\t384386\n小倩\t384387\n领导型\t384388\n四通一达\t384389\n营口市站前区\t384390\nFCI\t384391\n咆哮帝\t384392\n治河\t384393\n荣耀盒子\t384394\nsparrow\t384395\ncno\t384396\nnbd\t384397\n双饼\t384398\n世界公园\t384399\n地下城与勇士17173攻略\t384400\n对牛弹琴\t384401\n压路机\t384402\n服务机\t384403\n查考\t384404\n滁州市教育体育局\t384405\n易飞erp\t384406\n食坊\t384407\n秋日\t384408\n2000目\t384409\n400多万\t384410\n汉武帝\t384411\n天海\t384412\n治百病\t384413\n晓风\t384414\n体博网\t384415\nubu\t384416\nVCT\t384417\nCHAUMET\t384418\n拉丁舞裙\t384419\nmfg\t384420\nvlmcsd\t384421\n强有力\t384422\ncass吧\t384423\n林生\t384424\n城镇化\t384425\n80070020\t384426\n情室\t384427\n29W\t384428\n意在\t384429\nfay\t384430\n9月6日\t384431\n棋事\t384432\n鬼迷心窍\t384433\n随军\t384434\n醋酸泼尼松\t384435\n摆渡\t384436\nStarry\t384437\n1mw\t384438\n通货紧缩\t384439\n张峰\t384440\n不动手\t384441\nTulane\t384442\n掉帧\t384443\n格调\t384444\nSEVEN\t384445\n87W\t384446\n光谷中心城\t384447\n叕\t384448\nzyj\t384449\n明强小学\t384450\n沙鲁\t384451\n擎洲广达\t384452\nOptic\t384453\n周易算命网\t384454\n这些天\t384455\n斯米克\t384456\n信赖性\t384457\n电媒\t384458\n韩雨嘉\t384459\n4栋\t384460\n红动网\t384461\n外路\t384462\n朱卫国\t384463\n大头笔\t384464\n华药\t384465\n纯正\t384466\nLBJ15\t384467\n这么多事\t384468\n迎考\t384469\nPOCT\t384470\n塑\t384471\n黄桷兰\t384472\n热裂\t384473\nkbengine\t384474\n大连教育学院\t384475\nfoss\t384476\n武汉军运会\t384477\n苏超\t384478\nplotyy\t384479\ncef\t384480\n亲手\t384481\n骶管\t384482\n圣典\t384483\n信达地产\t384484\n茶光村\t384485\n战锤全面战争2\t384486\n奔驰G级AMG\t384487\n笔试\t384488\n恐怖主义\t384489\n白名\t384490\n169\t384491\n3.0_\t384492\n李广信\t384493\n盗市\t384494\n门诊\t384495\n学纪\t384496\n硕腾\t384497\n双方\t384498\n北京亚运会\t384499\n圣彼得堡\t384500\n千谎百计\t384501\nloopback\t384502\n宁东\t384503\n起亚凯绅\t384504\n34所\t384505\n冠梁\t384506\n我没有\t384507\ncript\t384508\n果洛\t384509\n第一只\t384510\ngrads\t384511\n哈尔滨电气集团有限公司\t384512\n衬塑管\t384513\n朴敏英\t384514\n泉州一中\t384515\n蔡金龙\t384516\n纳雍县人民政府\t384517\n第1话\t384518\n走漏\t384519\n乌市\t384520\n全意\t384521\n亳州市文化旅游局\t384522\n黑与金\t384523\nboot2docker
\t384524\n描稿\t384525\ninnisfree\t384526\n后头\t384527\n校招\t384528\n乐随\t384529\n秀吉\t384530\n被动式\t384531\nUCAS\t384532\n极片\t384533\n阔落\t384534\nsfa\t384535\nug11\t384536\n四夷\t384537\nwdd\t384538\n回头看\t384539\n鸣虫\t384540\n抵押金\t384541\n分析仪\t384542\n杨瑛\t384543\n常鹤鸣\t384544\n雨刮片\t384545\nstie\t384546\n阿帕比\t384547\n海涅\t384548\n肛\t384549\n13分\t384550\n百亩\t384551\n隋末\t384552\n鲁能城\t384553\n轮休\t384554\n台州医院\t384555\n瑞虎7论坛\t384556\nBees\t384557\n杨柳松\t384558\n雨石\t384559\n凉山州\t384560\n李青山\t384561\nlarge\t384562\n外来词\t384563\n南邻\t384564\n新乡东\t384565\nStair\t384566\n异星探险家\t384567\n李春华\t384568\nxts\t384569\n荆门晚报\t384570\nMAOLAI\t384571\nwhite-space\t384572\n激励器\t384573\n二本\t384574\n河北美术学院\t384575\n椰王\t384576\n零基础入门学习Python\t384577\n精灵4\t384578\nhada\t384579\n86层\t384580\n神海\t384581\n一部分\t384582\n庐州府\t384583\n出镜率\t384584\n4门\t384585\n一丝一丝\t384586\n浮游炮\t384587\n庭院花园\t384588\n嘉兴\t384589\n国家政权\t384590\n石皮\t384591\n罗辑\t384592\n705\t384593\nPhosphate\t384594\n文化发展有限公司\t384595\nposted\t384596\n马面裙\t384597\n晶座\t384598\nlibiconv\t384599\nes6\t384600\n野王\t384601\n惊天剑帝\t384602\n玉皇阁\t384603\n地陷\t384604\n克拉运河\t384605\n新闻大厦\t384606\nRapper\t384607\n龙安区\t384608\nthrust\t384609\nSumail\t384610\n小红娘\t384611\nAthlonII\t384612\n摇滚客\t384613\n娇韵诗双萃赋活\t384614\n银川市人民政府\t384615\n核心板\t384616\n自考365\t384617\n人在囧途\t384618\nno2\t384619\n商业贷款\t384620\n希格林\t384621\n优财网\t384622\n洋辣子\t384623\nsobel\t384624\nWorkshops\t384625\n呓\t384626\n北京公交卡\t384627\n南茜\t384628\n222路\t384629\n31岁\t384630\nHarold\t384631\n单选题\t384632\n6H\t384633\n尘香\t384634\n疏水性\t384635\nedrawing\t384636\n金达摩\t384637\n林彪\t384638\n植物大战僵尸2冰河世界\t384639\n连环\t384640\n宫下华奈\t384641\ntransitive\t384642\n冷餐\t384643\n04月\t384644\n乔玄硕\t384645\njump\t384646\nS35\t384647\n摆摊\t384648\n乐视系\t384649\n五册\t384650\n迈智能\t384651\n食草动物\t384652\n诺优能\t384653\n600平\t384654\n中创区\t384655\nCinnamon\t384656\n全息投影仪\t384657\n打吊针\t384658\n360n6pro\t384659\n共产党宣言\t384660\n水资源短缺\t384661\n62元\t384662\n丝涟\t384663\n胶盖\t384664\n善见\t384665\n假会\t384666\nvdf\t384667\n白马篇\t384668\ninherited\
t384669\nNoa\t384670\n娘家的故事\t384671\n恩赐\t384672\n0.16.1\t384673\n法图\t384674\n无花果干\t384675\n余姚日报数字报\t384676\n超级捕快\t384677\n小鲸\t384678\n驾驶证考试网\t384679\n南海家园\t384680\n成体\t384681\n苏瑞\t384682\nICO区块链网\t384683\n南昌航空大学\t384684\n蹭\t384685\nRedefined\t384686\n李庆华\t384687\n医院\t384688\n清客\t384689\n3天天\t384690\n奔波儿灞\t384691\n新度\t384692\n保理商\t384693\n能勤绩\t384694\n艾茉森\t384695\n十余次\t384696\n曾江\t384697\nlist泛型\t384698\n闫灵\t384699\n化妆品店\t384700\n保诚保险\t384701\n天蓝蓝\t384702\n鲁兹\t384703\n岂容\t384704\n不迟\t384705\n强调句\t384706\n龙冈\t384707\n撤除\t384708\nTroubleshooting\t384709\n闫妮\t384710\nmicrocode\t384711\n酒质\t384712\n王小川\t384713\n王志清\t384714\n李尚禹\t384715\n发家史\t384716\ncs8\t384717\n山楂树\t384718\nVPK\t384719\n运存\t384720\nkissed\t384721\n剧组\t384722\naLIEz\t384723\n仙五前传\t384724\n生化药品\t384725\n阿堵\t384726\nカルテット\t384727\n烟威\t384728\n纪录片编辑室\t384729\n特玩守望先锋\t384730\n判给\t384731\n推进会\t384732\n芳\t384733\n心服口服\t384734\n金山词霸\t384735\n电吉他版\t384736\n中国银行股份有限公司\t384737\n选举团\t384738\n禁食\t384739\n宁夏广播电视台\t384740\n伺服驱动器\t384741\n李尔王\t384742\n快步走\t384743\nladies\t384744\n七叶皂苷钠\t384745\nyy李\t384746\n应对\t384747\nbinding\t384748\n遂宁银行\t384749\n对码\t384750\n打钉\t384751\n标志位\t384752\n李歆\t384753\n苍蝇拍\t384754\n何何\t384755\n崇仁县\t384756\neof\t384757\n嵌板\t384758\nPS/2\t384759\n源博雅\t384760\n63页\t384761\nlldpe\t384762\n回口\t384763\n古曲\t384764\n付小姐\t384765\n潘时七\t384766\n流沙镇\t384767\n中国通信网\t384768\n关干\t384769\n大连公交查询网\t384770\n汤色\t384771\n狼少女与黑王子\t384772\ndefind\t384773\n斜三通\t384774\n小汤\t384775\nqsql\t384776\n第44届\t384777\nshadowsocks代理\t384778\n无尽传说\t384779\n顺英\t384780\n被逮捕\t384781\n家丑\t384782\n劫财\t384783\n甘政办\t384784\ngdjlc\t384785\n政府工作报告\t384786\n板胡\t384787\nCopilot\t384788\n车节\t384789\n12.55\t384790\n同步齿科\t384791\n戒淫\t384792\n畅玩7c\t384793\n农业网\t384794\nEndo\t384795\nErica\t384796\n人教社\t384797\n民勤\t384798\nheavy\t384799\n2000万元\t384800\n天美意\t384801\n李大毛\t384802\n闪充\t384803\n国家话剧院\t384804\n麦克阿瑟\t384805\n炮锤\t384806\n睢县人民政府\t384807\n小桥流水\t384808\n缪建民\t384809\nEpic\t384810\nac68u\t384811\n美模\t384812\nwenglabs\t384813\nin3\t3848
14\nSHSH2\t384815\naqi\t384816\nvfio\t384817\n杜瓦罐\t384818\n县界\t384819\n远程控制软件\t384820\n天机阁\t384821\n渝东\t384822\n椭球\t384823\n皮鞋\t384824\n能说会道\t384825\n尚城\t384826\nffo\t384827\n86页\t384828\n77级\t384829\n石嘴山市\t384830\n贪玩传奇世界\t384831\n锡箔\t384832\n辽宁省农村信用社\t384833\n吉博力\t384834\nshame\t384835\n24个小时\t384836\n杜斌\t384837\n40所\t384838\nFelicia\t384839\n西门龙霆\t384840\n充\t384841\n沿滩\t384842\nRefund\t384843\n图板\t384844\ngaoxiao\t384845\nstarUML\t384846\n孤注\t384847\n李想\t384848\n情人流浪记\t384849\n标识符\t384850\n卡尔美\t384851\n城北街道\t384852\nmacroe\t384853\n那刻的怦然心动\t384854\n兔死狐悲\t384855\n左疯子\t384856\n安澜\t384857\n二荆条\t384858\n草甸\t384859\n景物\t384860\n顾小北\t384861\nwangyiyun\t384862\n征期\t384863\n船歌鱼水饺\t384864\n剑网三万花\t384865\n东北育才\t384866\n骨法\t384867\n宝芝\t384868\ncora\t384869\npour\t384870\n通头\t384871\n壹基因\t384872\n武汉消防\t384873\n韩国医院\t384874\n拼接屏\t384875\nRegulation\t384876\n港服\t384877\n第九十章\t384878\n绳索\t384879\n趴体\t384880\nToi\t384881\n人孔\t384882\n4399游戏\t384883\n帕奎奥\t384884\n双刺\t384885\n终成\t384886\n改称\t384887\n特高压\t384888\n誓颜\t384889\n小儿科\t384890\n江南农商行\t384891\nWed114\t384892\nNominees\t384893\n毛主席的话儿记心上\t384894\n研究成果\t384895\n表子\t384896\n48分\t384897\n傲日其\t384898\n成都教育\t384899\nJeff\t384900\n黄豆浆\t384901\n一会儿\t384902\n维修员\t384903\nBIAS\t384904\n筑梦者\t384905\n熔纤\t384906\n北京302医院\t384907\n江苏省国税局\t384908\n封层\t384909\n双圈\t384910\n大彭镇\t384911\n微读圣经\t384912\n主教\t384913\n雷蛇耳机\t384914\n58个\t384915\n指弹版\t384916\n盖世英雄\t384917\n沤\t384918\n宝体\t384919\n没有用\t384920\n372号\t384921\n新馆\t384922\n段鹏\t384923\nE线\t384924\n注册测绘师考试\t384925\nosgi\t384926\n潮吹\t384927\nnewly\t384928\n病源\t384929\n灭火毯\t384930\n竞赛类\t384931\n纽约视觉艺术学院\t384932\n乘地铁\t384933\n5.cn\t384934\n下加\t384935\n四季青\t384936\n1t\t384937\ndwg\t384938\nFused\t384939\n下品社\t384940\n福睿斯\t384941\n马晓晖\t384942\nFoxit\t384943\nrepetition\t384944\n周治平\t384945\nQFII\t384946\n受刑\t384947\nmaya2009\t384948\n中国信达\t384949\n触控感\t384950\n医馆\t384951\nmetcn\t384952\n栖居\t384953\n天坛公园\t384954\n钾盐\t384955\n环球\t384956\n纤细\t384957\n传讯\t384958\ntcpdump\t384959\n初始值\t384960\n薛家燕\t3
84961\n雕\t384962\n中国男篮\t384963\n乌合变色龙\t384964\nP挡\t384965\n郑州澍青医学高等专科学校\t384966\nProxies\t384967\n圣主\t384968\nSupport\t384969\n2.4.11\t384970\n小锅米线\t384971\n诗圣\t384972\n北京卓立汉光仪器有限公司\t384973\n来客\t384974\n帕尔默\t384975\n牙槽骨\t384976\n生日歌\t384977\n撩火\t384978\n孔雀\t384979\n每4年\t384980\n微盟集团\t384981\n根域名服务器\t384982\n后窗\t384983\nqq昵称\t384984\n狂龙\t384985\n终阶\t384986\n艾酷\t384987\nSmartSVN\t384988\ngetopts\t384989\n19起\t384990\nachair\t384991\n湖南商务职业技术学院\t384992\nStrongSwan\t384993\n吴姐姐\t384994\n哈达迪\t384995\nwin10计算器\t384996\nndimage\t384997\n2.0.0\t384998\n强化复合地板\t384999\n萧雅\t385000\n室内设计师\t385001\ngrape\t385002\n防火卷帘门\t385003\n飙速宅男第四季\t385004\n利兹大学\t385005\n压力板\t385006\n链条炉\t385007\n两口\t385008\n周经理\t385009\n中餐\t385010\n诽谤\t385011\nansysworkbench\t385012\n交通信号灯\t385013\n發行\t385014\n德士\t385015\nHandShaker\t385016\n东小口镇\t385017\nmatures\t385018\n兰江\t385019\n口碑\t385020\n首代\t385021\n成都大学\t385022\n产业结构\t385023\n党建篇\t385024\n102项\t385025\n柳泉路\t385026\n绝命后卫师\t385027\n万博广场\t385028\n上海东方泵业(集团)有限公司\t385029\n航电\t385030\nzjzs\t385031\n领居\t385032\n集合资金信托计划\t385033\n韩玲\t385034\n手机号段\t385035\n东方剑桥\t385036\n品三国\t385037\n肥胖\t385038\n测\t385039\n宁愿相信\t385040\n金山石化\t385041\n呼格吉勒图\t385042\n玩童\t385043\nintended\t385044\n贵金属\t385045\n博济医药\t385046\n天津建设银行\t385047\n1343\t385048\n播报稿\t385049\n福田高铁站\t385050\n乱纪\t385051\n骨量\t385052\nenviron\t385053\n定情\t385054\nGift\t385055\n平安好\t385056\n以民为本\t385057\n网易用户体验设计中心\t385058\n喜羊羊与灰太狼全集\t385059\nwin0系统\t385060\n先睹为\t385061\n沪深市场\t385062\n老少\t385063\n候选者\t385064\n孔升妍\t385065\nCEE\t385066\n珠海幼儿园\t385067\n三女\t385068\n完美芦荟胶\t385069\n事佬\t385070\nPoE\t385071\n刘军洛\t385072\n24斤\t385073\n朝晖自媒体\t385074\n大湿社\t385075\n别闹了\t385076\n漏毛\t385077\n攻坚战\t385078\n柠檬网\t385079\nWisdom\t385080\n换声\t385081\n微小\t385082\n行政制度\t385083\n净坛\t385084\n东山口\t385085\n浙江省人大\t385086\n斜道\t385087\n浮漂\t385088\n旨在\t385089\n星巴\t385090\n抄表员\t385091\n环球医药招商网\t385092\n驻守\t385093\nryan\t385094\nInstruction\t385095\n唯一性\t385096\n洋务运动\t385097\n表友\t385098\nelonaplus\t385099\n中铁诺德\t385100\n胚囊\t385101\n一整片\t38
5102\n50mm\t385103\n全宅\t385104\n红系\t385105\n丢下\t385106\nxmlhttp\t385107\n2015年五一劳动节\t385108\nCauses\t385109\n何时何地\t385110\n乐见\t385111\n韩涛\t385112\n头柜\t385113\n30.5\t385114\n谢园\t385115\n挠曲\t385116\n安装版\t385117\n门诊费\t385118\n海词\t385119\nBA黑凤梨\t385120\n重机车\t385121\nhbuild\t385122\n8D\t385123\n北京大学医学部\t385124\n悬挑梁\t385125\n烽烟\t385126\n沐雪\t385127\nTilt\t385128\n请吧\t385129\n一式四份\t385130\ntcgame\t385131\n三角架\t385132\ntron\t385133\n生种\t385134\n演戏\t385135\n俏佳人\t385136\n天干_夜夜啪_天天操_天天啪_天天射_天天日_天天撸\t385137\n按页\t385138\naui\t385139\n软水\t385140\n顾惜朝\t385141\n莱尔德\t385142\nfct\t385143\n蛟龙号载人潜水器\t385144\n上海财经大学浙江学院\t385145\n村上凉子\t385146\n柔体\t385147\n纵横\t385148\nhst/com\t385149\n失学\t385150\n第166章\t385151\n小雨沙沙\t385152\ncolg\t385153\ngf\t385154\n天天315_论坛\t385155\n尚贤\t385156\ncamtasiastudio\t385157\n虚无\t385158\n画工\t385159\n置管\t385160\n病生\t385161\nCentOS7系统\t385162\n100多斤\t385163\n制砂机\t385164\n2018.4.27\t385165\nOver\t385166\n女\t385167\ncapable\t385168\n龙线\t385169\n蓝正龙\t385170\nps描边\t385171\n规格板\t385172\n马蜂窝自由行\t385173\nemulators\t385174\n笑死人不偿命\t385175\n桌子\t385176\n电函\t385177\n杨雷\t385178\n武汉枫叶国际学校\t385179\n石景山万达广场\t385180\noutdoor\t385181\n揽途\t385182\nclass值\t385183\nRPR\t385184\n天河汽车客运站\t385185\n洛可\t385186\nAD域\t385187\n王丽红\t385188\n吊丝\t385189\n冻龄\t385190\n世文\t385191\n长隆国际大马戏\t385192\nLookup\t385193\nmarcelo\t385194\n刘佩鑫\t385195\n随地吐痰\t385196\n东风洒水车\t385197\n后灯\t385198\n淘翠网\t385199\nsanity\t385200\n最强大\t385201\n上水石\t385202\n千户集团\t385203\n下一站幸福\t385204\n青海省卫生和计划生育委员会\t385205\n王顺\t385206\n神舟五号飞船\t385207\n同里镇\t385208\n普通型\t385209\nislide\t385210\n附身\t385211\n青春之歌\t385212\ntingting\t385213\n液压缓冲器\t385214\n3210m\t385215\n中国农业出版社\t385216\n天涯头条\t385217\n脚标\t385218\n第九套\t385219\n威远县政府\t385220\n上海市实验学校附属小学\t385221\n哈弗m6\t385222\n生蛋\t385223\n万和燃气\t385224\njamie\t385225\n香港花旗银行\t385226\n工程咨询有限公司\t385227\n江津中学\t385228\n摄像机\t385229\n昵文阁\t385230\n合肥旅行社\t385231\n生化危机启示录\t385232\n14000\t385233\n百部\t385234\n洪家楼\t385235\n人民日报评论部\t385236\n魔王松鼠\t385237\n多一层\t385238\nエロ\t385239\n吗_叩富网\t385240\npadded\t385
241\n该镇\t385242\n我自\t385243\n80部\t385244\n暴走大事件周\t385245\n大律\t385246\n临沂网\t385247\n百家碎戏\t385248\n雷卡\t385249\nSVNKit\t385250\n1280Px720P\t385251\n调制型\t385252\n海口市公安局\t385253\n化工路\t385254\n龙血竭\t385255\n民汉\t385256\n第三辑\t385257\n518.com\t385258\n鱼香肉丝\t385259\n同音字\t385260\n催眠药\t385261\n圣婴\t385262\n江苏华云仪表有限公司\t385263\n绿尾\t385264\n水东镇\t385265\n兰亭镇\t385266\n上海高中\t385267\n奉贤县\t385268\n1万7\t385269\n46g\t385270\n隔壁网\t385271\n婚礼酒店\t385272\n程斌\t385273\n银河\t385274\nCarmen\t385275\ncss类\t385276\nQtable\t385277\n第三十六条\t385278\nJBJ\t385279\n马骥\t385280\nunifying\t385281\n曹亮\t385282\n飞天舞\t385283\nO形\t385284\n低分子肝素钠\t385285\n外国歌\t385286\n师轩\t385287\n蔡和森\t385288\n93元\t385289\n做为\t385290\n隽荟\t385291\nXonar\t385292\n金枝欲孽\t385293\n合生元益生菌\t385294\n门儿\t385295\n法标\t385296\n35类\t385297\nPHPMyAdmin\t385298\n霍英朗\t385299\n猝\t385300\nSigmoid\t385301\ni77700\t385302\n增强性\t385303\n磁场\t385304\nCCTV-14_央视网\t385305\n电子银\t385306\n投\t385307\n刘莉莉\t385308\n血色玫瑰\t385309\n化学工程学院\t385310\n饲喂\t385311\njpql\t385312\n局麻\t385313\nP53\t385314\nLFM\t385315\n清华大学交叉信息研究院\t385316\n15min\t385317\n升窗器\t385318\nviolation\t385319\n杨小龙\t385320\n半丸子头\t385321\n西胪镇\t385322\n尼德兰\t385323\n面包粉\t385324\n文昌花园\t385325\n上海村\t385326\nmo管理器\t385327\nuigetfile\t385328\n肚子不舒服\t385329\nphpstudy\t385330\n玄幻类\t385331\nSAML\t385332\ne440\t385333\n94平米\t385334\n4302\t385335\nxxt\t385336\noutage\t385337\n台湖镇\t385338\ncaxa\t385339\n运算定律\t385340\n字色\t385341\n藤次郎\t385342\n卓尔不凡\t385343\n和尚\t385344\nPrimo\t385345\n吉利帝豪EV450\t385346\nmotion2\t385347\n300d\t385348\n东方留学网\t385349\n李玉堂\t385350\n断端\t385351\n69秒\t385352\n拉伸膜\t385353\n新商盟\t385354\n中央气象局\t385355\ndiann\t385356\n邪恶漫画无翼鸟大全\t385357\nlockin\t385358\n夕死\t385359\n刘新民\t385360\n古琴曲\t385361\njournald\t385362\n曲溪\t385363\nguiyang\t385364\n钛\t385365\nfemale\t385366\n立学\t385367\na级\t385368\n勤学苦练\t385369\n6.2\t385370\n杭州洲际酒店\t385371\n示范园\t385372\nmoderate\t385373\n22288888\t385374\n洛克人\t385375\npotent\t385376\n损失率\t385377\nFireMonkey\t385378\n哈东\t385379\n娥江\t385380\n枪神纪刀锋\t385381\n126万\t385382\n堤围\t38
5383\nhowhy\t385384\n马丘比丘\t385385\n塞尔达dlc\t385386\n时尚运动\t385387\n送花\t385388\nPublications\t385389\n三个半小时\t385390\n煊\t385391\n鬼吹灯之寻龙诀\t385392\n十二首\t385393\n吴阶平\t385394\n阿鲁巴\t385395\n台式组装机\t385396\n周海洋\t385397\n钢罐\t385398\nget传参\t385399\n倾诉\t385400\n铁东路\t385401\n叶轮\t385402\n蒋门神\t385403\n便携版\t385404\n寄住\t385405\n时光辉\t385406\n非洲杯\t385407\n越乡\t385408\n小喵\t385409\njodatime\t385410\nAutocad命令大全_沐风网\t385411\n王湛\t385412\n事迹\t385413\n拆船\t385414\n夏云\t385415\n中国行业信息网\t385416\n世纪证券\t385417\n上海市质子重离子医院\t385418\n舵机\t385419\nTwixtor\t385420\n特高\t385421\njiudian\t385422\n根筋\t385423\n鸽界\t385424\n尚锦城\t385425\n140万元\t385426\n2016年10月27日\t385427\n每位\t385428\n佛系\t385429\n星愿\t385430\n韩宁\t385431\n一2\t385432\n320x240_\t385433\n马楠\t385434\n雪之华\t385435\n国能新能\t385436\n编程器\t385437\n赛博坦\t385438\n和派\t385439\n4.8.6\t385440\n祖卡木颗粒\t385441\n番禺职业技术学院\t385442\n行政网\t385443\n重庆川仪自动化股份有限公司\t385444\nartemis\t385445\n泛微oa\t385446\n隶属\t385447\n情深\t385448\nkarl\t385449\n万科溪之谷\t385450\nair\t385451\n雕塑\t385452\n尼克森\t385453\n撒谎\t385454\n车贴\t385455\n山兔\t385456\n桂林火车站\t385457\nphp7\t385458\n深港澳\t385459\n音乐节目\t385460\n三箱\t385461\n异瞳\t385462\n美湖\t385463\n平面贴标机\t385464\n烤鱼炉\t385465\n马克\t385466\n吃饱了撑\t385467\n總\t385468\ntournament\t385469\n13m\t385470\n禄口机场\t385471\n核试验场\t385472\n颜值控\t385473\n沧田\t385474\nDEDE\t385475\n白云街道\t385476\n热歌\t385477\n炮火\t385478\n狼道\t385479\n糖脂\t385480\nige\t385481\n徐远\t385482\n土木工程系\t385483\n兴仁镇\t385484\n唇炎\t385485\n枧水\t385486\n天王赛\t385487\n2.3T\t385488\n王培生\t385489\n星沙镇\t385490\n懒猪\t385491\n无机肥\t385492\n前天\t385493\n赵磊\t385494\n绿板\t385495\nrabi\t385496\n林头\t385497\n气体灭火系统\t385498\n行贿者\t385499\n纯电动物流车\t385500\n行李框\t385501\nscarlett\t385502\n帅哥哥\t385503\nP21\t385504\n肋部\t385505\n邱毅\t385506\n杜玉芳\t385507\n轩辕剑之汉之云\t385508\n罗盖特\t385509\n神观之梦\t385510\nPGF\t385511\n蜀海\t385512\n锐捷交换机\t385513\n服饰史\t385514\nshield\t385515\n坏习惯\t385516\n金弹子\t385517\n魔兽地图包\t385518\n苏茜\t385519\n青浦论坛\t385520\n家用机器人\t385521\n缓期执行\t385522\n幸福港湾\t385523\n2017英雄联盟\t385524\n湖北自贸区\t385525\n收发包\t385526\nalumni\t385527\n赠别\t38552
8\n蜡烛机\t385529\n凤卿尘\t385530\n水凝霜\t385531\n长安信托\t385532\nzuber\t385533\n尽然\t385534\ncognex\t385535\n肋骨\t385536\n手癣\t385537\n入股\t385538\n燕儿岛路\t385539\n希魔\t385540\n谷雨节\t385541\n奉节\t385542\n梦特娇\t385543\n陕西师范大学远程教育学院\t385544\nyuan\t385545\n嗜好\t385546\n560万\t385547\n濡沫\t385548\n我者\t385549\n江淮瑞风R3\t385550\n水长东\t385551\n转为\t385552\n凌晨三点半\t385553\ngay\t385554\n女王之刃\t385555\nc++map\t385556\n2018.1.24\t385557\n捐出\t385558\n佩斯利\t385559\n呈报\t385560\n欧麦特\t385561\n鬼压床\t385562\n老和\t385563\n威马逊\t385564\n软化灶\t385565\n博世\t385566\n真人娱乐\t385567\n沙角\t385568\n斯坦\t385569\ntoday\t385570\n山海经之赤影传说\t385571\n风烟\t385572\nextraction\t385573\n炎狱\t385574\ng400\t385575\nroy\t385576\n大西洋号\t385577\nsedu\t385578\nwindowsdefender\t385579\n凤凰涅盘\t385580\n菠萝啤\t385581\n异星\t385582\nTournament\t385583\n卡牌\t385584\n液晶模组\t385585\n装板\t385586\n萨里\t385587\n两相四线\t385588\n牌头镇\t385589\n张公岭\t385590\n电工证\t385591\n20151012\t385592\n飘移\t385593\nPS8.0\t385594\n向阳街道\t385595\n1241\t385596\n北京翰香斋\t385597\n2017年9月18日\t385598\n数轴\t385599\n阿尔法·罗密欧汽车\t385600\n环烷酸\t385601\n团团\t385602\n九域\t385603\n皎然\t385604\nMOL\t385605\nUDT\t385606\n在第\t385607\n500粒\t385608\n悬泉\t385609\n细度\t385610\n绣法\t385611\n迟到\t385612\n润州区\t385613\nBonus\t385614\n乐教乐学\t385615\ncrafted\t385616\n打印表\t385617\n原帖\t385618\nSwimwear\t385619\n大奥\t385620\n360路由器\t385621\n恒屹\t385622\n省略号\t385623\nPANDA\t385624\n修真网游之神级机械猎人\t385625\n百花奖\t385626\n拍花\t385627\n第二本\t385628\n狂射\t385629\n洛尔\t385630\ncommunities\t385631\n奥陶纪\t385632\n连营\t385633\n少年王勃\t385634\nMAYA\t385635\ncuckold\t385636\n咨询公司\t385637\n双桥沟\t385638\n痛感\t385639\n菜名\t385640\n榴莲糖\t385641\n和平统一\t385642\n孝渊\t385643\nAzumi\t385644\n桑基图\t385645\n探索发现\t385646\nclubs\t385647\n2&#160\t385648\nAnalysis\t385649\nGrant\t385650\n不以为然\t385651\n飞凡\t385652\n叶群\t385653\n能性\t385654\n毫安\t385655\n百度快行\t385656\n罗丽\t385657\n百分之七十五\t385658\nphython\t385659\ntolerate\t385660\n汤姆斯杯\t385661\n帧率\t385662\n索尼克\t385663\n官儿\t385664\n有界性\t385665\n唐欣\t385666\n施华\t385667\n验收单\t385668\n塑料布\t385669\n娘亲舅\t385670\n应力\t385671\n济南联通\t385672\n四川华新现代职
业学院\t385673\n警部\t385674\n贵州商学院\t385675\nisnull\t385676\nE6230\t385677\nclm\t385678\n孙海平\t385679\n安徽省建设干部学校\t385680\n黄建平\t385681\n数据库值\t385682\n迟延履行金\t385683\n恒荷载\t385684\nxpadder\t385685\n薛之谦\t385686\n打上\t385687\n007黄金眼\t385688\n名碑\t385689\n模具架\t385690\n快乐小舞星\t385691\n晴窗\t385692\nmaven-plugin\t385693\n叮铃\t385694\n高效液相色谱仪\t385695\n纯属意外\t385696\n郑恩\t385697\n石龙镇\t385698\n临电\t385699\ngtx1066\t385700\n调候\t385701\nintecad\t385702\n豆奶\t385703\n毗舍童子\t385704\n珠海市高新区\t385705\nGlitter\t385706\n威海华夏城\t385707\n江苏省人民代表大会常务委员会\t385708\n公推\t385709\n普天同庆\t385710\n豆友\t385711\n剪重比\t385712\n5万\t385713\n做入\t385714\n夏德\t385715\n花鸟画\t385716\nPISA\t385717\n音译词\t385718\n万方发展\t385719\n芬兰松\t385720\n重铺\t385721\n爱丽丝·门罗\t385722\niTunes12\t385723\n空格\t385724\n苏州市经济和信息化委员会\t385725\nhistorical\t385726\n李欢\t385727\n山东省环境保护厅\t385728\n邻接\t385729\n第一百零六章\t385730\n将军县\t385731\nbrandon\t385732\n福睿\t385733\n快繁\t385734\n薛蟠\t385735\n0437\t385736\n博盈\t385737\nNATIONAL\t385738\n虾尾\t385739\n化工报\t385740\n苯磺酸氨氯地平\t385741\n北京劳动保障职业学院\t385742\n发电站\t385743\n果类\t385744\n环球职业教育在线\t385745\n其意\t385746\n赵奕欢\t385747\n删光\t385748\n伟大的\t385749\n锐尔\t385750\nbelieved\t385751\nTahoe\t385752\n坎大哈\t385753\n暗黑\t385754\n血浆蛋白\t385755\n李建荣\t385756\n人财\t385757\n推广曲\t385758\n激振\t385759\n朗朗上口\t385760\ndout\t385761\n卡利亚\t385762\nC语言版\t385763\n皎\t385764\n库娃\t385765\n0832\t385766\n林肯蓝洁瑛\t385767\n24码\t385768\n塔尖\t385769\n浪漫传说\t385770\n上官云珠\t385771\n渭南市中级人民法院\t385772\n露养\t385773\nPADS技术论坛\t385774\n共青国家森林公园\t385775\n纳乔\t385776\n侏罗纪世界2\t385777\n爆击\t385778\n王瑜\t385779\n飞虎神鹰\t385780\nwasted\t385781\nR9s\t385782\n驱使\t385783\n高碳钢\t385784\n奇树\t385785\n仙人掌果\t385786\n西北区\t385787\n气囊式\t385788\n刘家辉\t385789\n石磨坊\t385790\n相报\t385791\n叶子猪梦想世界\t385792\n轴瓦\t385793\n阿袁\t385794\nloadUrl\t385795\n新华会计网\t385796\n中下游\t385797\n中泰街道\t385798\n柏厨\t385799\nh漫\t385800\n香港注册公司\t385801\n心向\t385802\n建研\t385803\n宝能太古城\t385804\n智立方\t385805\n李三枪\t385806\n高岭\t385807\n伟思\t385808\n福昕PDF编辑器\t385809\n中交第四航务工程局有限公司\t385810\nsolidwo\t385811\n凤凰广场\t385812\n德尔格\t385813\n中国科学院北京分院\t
385814\n保险理赔\t385815\n篷布\t385816\n宋拓\t385817\n阿拉伯挤奶法\t385818\nYang\t385819\n广州省汽车站\t385820\n_缘\t385821\n按语\t385822\nWorkStation\t385823\n相關\t385824\nANDY\t385825\n迁延\t385826\n好想\t385827\n壁布\t385828\n私发\t385829\n为指导\t385830\n嘉陵摩托\t385831\n0xc0000001\t385832\nWOW猎人\t385833\n中德住房储蓄银行\t385834\nMETA\t385835\n重庆婚纱摄影\t385836\n科勒·卡戴珊\t385837\n椿真子\t385838\n联芯科技\t385839\n江苏国\t385840\n退团\t385841\n氟喹诺酮类\t385842\n东山街道\t385843\n江湾\t385844\n绿妻\t385845\nGasoline\t385846\n孙瑞雪\t385847\n小儿多动症\t385848\n打着火\t385849\n和谈\t385850\nlaunchimage\t385851\n平阳县人民医院\t385852\n减量\t385853\n次元客\t385854\n瓜哥\t385855\n安泰经济与管理学院\t385856\n东坡公园\t385857\n脱失\t385858\n测试页\t385859\n片数\t385860\n5月20号\t385861\n鬼胎\t385862\nquantile\t385863\nGK5\t385864\n0521\t385865\n四段式\t385866\n解落\t385867\n移动叔叔\t385868\nnegligence\t385869\nEcosystem\t385870\n流鼻子\t385871\n吴光辉\t385872\nfto\t385873\n北京字节跳动科技有限公司\t385874\n投石问路\t385875\n格莱美音乐节\t385876\n汉字听写大会\t385877\nsurge\t385878\n拟合值\t385879\n天涯互助_论坛\t385880\nHeritage\t385881\n差动式\t385882\nSTM32F\t385883\n城北小学\t385884\nguding\t385885\n普利司通\t385886\nuygurqa\t385887\n里番合集磁力链接\t385888\nWindows服务器\t385889\n安卓4.4\t385890\n权欲\t385891\n985211\t385892\n哆瑞咪\t385893\nMoldFlow\t385894\n爱你一生一世\t385895\nby池袋\t385896\n各乡\t385897\nAchain\t385898\n游猎\t385899\n陈冠蒲\t385900\n宜川路\t385901\nelife\t385902\n广州中大\t385903\n交通工\t385904\n日支\t385905\n楞次定律\t385906\n弹拨\t385907\nNINE\t385908\n柔肌\t385909\n电力系统及其自动化\t385910\n聂辰席\t385911\n斜桥\t385912\nflake\t385913\nYacc\t385914\n100000元\t385915\n3190\t385916\n周恩来纪念馆\t385917\n雪姬\t385918\n金菊花\t385919\n朱见深\t385920\n千姿\t385921\n数字英才网\t385922\n广东省中医院\t385923\n邹磊\t385924\n陈森\t385925\nUVM\t385926\n感谢信\t385927\npp料\t385928\n第几版\t385929\n尺度\t385930\n谆谆\t385931\nfollowing\t385932\n公司化\t385933\n性善论\t385934\nEmulation\t385935\n舞厅\t385936\n自定义函数\t385937\n广水\t385938\n小米pro\t385939\n香城\t385940\n表现好\t385941\n抄斩\t385942\n谈\t385943\n遽\t385944\n情画\t385945\n小神的孩子们\t385946\n新种族\t385947\nSorrow\t385948\n顺序\t385949\n半弯\t385950\n庐江先锋网\t385951\n东海国际公寓\t385952\n断图\t385953\n李密\t385954\n全稿\t
385955\n中国地质大学武汉吧\t385956\n别克VELITE\t385957\n依约\t385958\n广东药科大学\t385959\n二十年\t385960\n舞灵美娜子\t385961\n王室风云\t385962\nyì\t385963\n19.4\t385964\n16升\t385965\nIphone\t385966\n乐唯\t385967\n0005\t385968\n自测盒\t385969\n500ml\t385970\n毕业清考\t385971\n温心\t385972\n周韦彤\t385973\n民风\t385974\n爱群\t385975\n牌阵\t385976\n女武神驱动\t385977\n基金\t385978\n竣工面积\t385979\n德化网\t385980\n不爱你\t385981\n百鬼奕\t385982\n荣耀7\t385983\n宫寒\t385984\nProposed\t385985\nIP化\t385986\n流动摊贩\t385987\n横臂式\t385988\n恒健\t385989\n卤水豆腐\t385990\n173\t385991\n民民\t385992\n昌邑区\t385993\n饶阳\t385994\n生酮饮食\t385995\n野口\t385996\n鼎桥\t385997\npulsar\t385998\n食业\t385999\n地线\t386000\n图王\t386001\n玛嘉烈\t386002\n性交图\t386003\n大港新区\t386004\n新光天地店\t386005\n九龙街道\t386006\n连体婴\t386007\n猫派\t386008\n差_\t386009\n马雅可夫斯基\t386010\nDHH\t386011\n众泰众泰T700\t386012\n影票\t386013\n过去一周\t386014\n平等权\t386015\nGamma\t386016\n网络管理员\t386017\n魅族m1\t386018\n统销\t386019\n20CD\t386020\n咒术师\t386021\napink\t386022\n霍去大明望族\t386023\nPornDig\t386024\n阻尼器\t386025\n肯纳\t386026\n奥迪宝马\t386027\n灯膜\t386028\n廉洁奉公\t386029\nglyph\t386030\n前一秒\t386031\n捍\t386032\n流星花园三区\t386033\n房屋登记\t386034\n哭砂\t386035\n注规\t386036\n微波射频网\t386037\n水处理器\t386038\nBrad\t386039\n鸡血藤\t386040\n于冰\t386041\n蓑衣\t386042\n紧闭\t386043\ntwist\t386044\n干燥综合征\t386045\n瓶\t386046\n刘少林\t386047\njp\t386048\n黄雷\t386049\n盐酸肾上腺素\t386050\n鬼娘子\t386051\nJabinZhang\t386052\n清枫\t386053\n天然气流量计\t386054\nCram\t386055\n车震\t386056\n分散\t386057\n宋记\t386058\n治市\t386059\n瓜帅\t386060\n祁睿峰\t386061\nVOL.1\t386062\nUltra\t386063\n出站\t386064\n罗格斯\t386065\n李尚允\t386066\n219&\t386067\n永安路\t386068\n5200LX\t386069\n梅德韦杰娃\t386070\n221.1\t386071\n手车\t386072\n狎\t386073\n东方万里行积分商城_中国东方航空\t386074\n2017年4月5日\t386075\nStock\t386076\n手工帝网\t386077\n本队\t386078\n用到\t386079\nLOOK\t386080\n12平方厘米\t386081\n67平米\t386082\n异己\t386083\n4.80分\t386084\n捉奸\t386085\n花边框\t386086\nOffset\t386087\n入宫\t386088\n升空\t386089\n子宮\t386090\n上海科技报\t386091\n北京协同创新研究院\t386092\n刘集\t386093\n三星S8\t386094\n直付\t386095\n逍遥峡谷\t386096\n数余\t386097\n丞相\t386098\n黑砖窑\t386099\n雌二醇\t386100\n厄立特里亚\t3861
01\nAMAZING\t386102\n作死\t386103\n红头绳\t386104\n大哥\t386105\ndahai\t386106\n螺纹通止规\t386107\nBIGO\t386108\n随机排序\t386109\n大帅\t386110\n高压阀\t386111\nKyle\t386112\n北京市文化局\t386113\nsimp\t386114\nspiderman\t386115\n彩生活\t386116\npython\t386117\n会计继续教育网\t386118\n电子式电能表\t386119\nRadial\t386120\n蒋文端\t386121\n瑞易\t386122\n八王爷\t386123\n北郊\t386124\n大东山\t386125\n王马\t386126\n维特尔\t386127\n华图教育安徽分校\t386128\n权力之路\t386129\n铁牛\t386130\nAirbnb\t386131\nDeen\t386132\n毛边\t386133\n仇恨\t386134\n摩托\t386135\n关出\t386136\n昆工\t386137\n阿里山的姑娘\t386138\n辈份\t386139\nGlacier\t386140\n害虫\t386141\n吉山\t386142\n林群\t386143\n协方差矩阵\t386144\n王大为\t386145\n蓝鲫\t386146\n段艺璇\t386147\ntruncate\t386148\nburton\t386149\n31亿\t386150\n韭兰\t386151\neclise\t386152\n镇子\t386153\nsumproduct函数\t386154\n岗头村\t386155\n聚羧酸\t386156\n单枪\t386157\n周然\t386158\n霍震霄\t386159\ngenerals\t386160\n任伯年\t386161\n景仰\t386162\n汪汪\t386163\nVeraCrypt\t386164\nAABB式\t386165\n镇咳药\t386166\n鬼使神差\t386167\n沙子口\t386168\n政银\t386169\n老師\t386170\nSftp\t386171\n11.2.1\t386172\n红景天胶囊\t386173\n1820年\t386174\n扣压式\t386175\n1000列\t386176\n阿凡提\t386177\n贝琪\t386178\nVBA批量\t386179\n温大\t386180\n李斯特菌\t386181\n大刀\t386182\n公募基金\t386183\n新文秘网\t386184\n住哪网\t386185\ninstall_db\t386186\n小珊迪\t386187\nwebos\t386188\nContractor\t386189\n八卦掌吧\t386190\n小艾\t386191\nflash课件\t386192\n聚多巴胺\t386193\ndatarow\t386194\n旅顺港\t386195\nMemCache\t386196\n埃里克\t386197\n工业与民用配电设计手册\t386198\n奶德\t386199\n冯程程\t386200\n辗转相除法\t386201\n韩雨\t386202\n支烟\t386203\n雪佛兰探界者\t386204\n婚城\t386205\n中国科学院城市环境研究所\t386206\n东风标致307\t386207\n戏苑\t386208\n急流勇退\t386209\n夜行动物\t386210\n稼穑\t386211\n点击子\t386212\n山东农村信用社\t386213\n3.4.8\t386214\n包头机场\t386215\n26册\t386216\n梁逸峰\t386217\nJpg格式\t386218\n组成\t386219\n万庄\t386220\nPornhub\t386221\n哈罗公学\t386222\n克鲁赛德战\t386223\n有许\t386224\n灰屏\t386225\narctis\t386226\n4件\t386227\n区政协\t386228\n立山区\t386229\n2018年以后\t386230\n洛杉矶市\t386231\n有限合伙\t386232\n珠海体育中心\t386233\n齐舞\t386234\n年化率\t386235\n年货节\t386236\n益田\t386237\n名称\t386238\n胡小闹日记\t386239\n宋波\t386240\n金玉良缘\t386241\n美多芭\t386242\n槐林\t386243\n大工\
t386244\nPicks\t386245\n幼女战记\t386246\ncleanly\t386247\n林权证\t386248\n线槽\t386249\n小牛电动\t386250\njkl\t386251\n有利可图\t386252\n14家\t386253\n专生\t386254\n北京大学国际医院\t386255\n惹\t386256\n格林纳达\t386257\n砺\t386258\n西西网\t386259\n76A\t386260\n融化\t386261\n有分\t386262\n有声读物\t386263\n入选者\t386264\n建身\t386265\n1.17\t386266\n国务院医改办\t386267\n红蓝3d\t386268\n邮址\t386269\n果汁杯\t386270\n终身制\t386271\n仓本c仔\t386272\n虚岁\t386273\n黑站\t386274\n4bit\t386275\n辛国斌\t386276\nU16\t386277\nhind\t386278\n云盘\t386279\n阡陌\t386280\n奔驰AMG\t386281\ndoPost\t386282\n鄞县\t386283\n倾向于\t386284\nodac\t386285\n林松\t386286\nsharon\t386287\n来日\t386288\n转眼间\t386289\n南国早报网\t386290\nCharlotte77\t386291\n自架\t386292\n城墙\t386293\nrestlet\t386294\nPharmacy\t386295\n天河路228号\t386296\n香香公主\t386297\n群英荟萃\t386298\n灼灼其华\t386299\n倾盖如故\t386300\n半米\t386301\n困\t386302\n中央综治办\t386303\nRaja\t386304\nerror\t386305\nupdatedb\t386306\nVhiphop唯\t386307\n十路\t386308\n优术\t386309\n默顿\t386310\n苏哲\t386311\ndaytime\t386312\nprovided\t386313\nbasf\t386314\ng450\t386315\n山界\t386316\n6月28日\t386317\n文种\t386318\n被告白\t386319\nStrain\t386320\nvvic搜款网\t386321\n五角形\t386322\n高压机\t386323\nDeli\t386324\n青岛市国土资源和房屋管理局\t386325\nCd\t386326\n丁香通\t386327\n俩俩\t386328\nswf格式\t386329\n卵泡刺激素\t386330\n满枝\t386331\nAppID\t386332\n第一人\t386333\n2.2.2\t386334\nWinning\t386335\n一生有你\t386336\n五十年代\t386337\n温州市实验中学\t386338\nAV區-PLAYNO.1玩樂達人討論區\t386339\n6.doc\t386340\nLeather\t386341\n新部编版小学\t386342\n界外\t386343\n责令\t386344\nwww.51danei.com\t386345\n黎恩\t386346\n开仓\t386347\n热力学\t386348\nhtml5播放器\t386349\n凯恩帝\t386350\nmemc\t386351\n永定\t386352\n扩展屏\t386353\ncommunicative\t386354\n新世嘉\t386355\n望书阁\t386356\n第B03\t386357\n抹茶味\t386358\n48万\t386359\n偷气\t386360\n张一一\t386361\n银河麒麟\t386362\nFBA\t386363\n欢畅\t386364\n杨舍\t386365\n西江新城\t386366\ndeemo\t386367\nconstitution\t386368\n暗恋桃花源\t386369\n金瀚\t386370\npinv\t386371\nDreamWeaver\t386372\n大稳村\t386373\n冷牛奶\t386374\n递进式\t386375\n信阳之窗\t386376\n固戍\t386377\n硫酸汞\t386378\n安格斯\t386379\nprcs6\t386380\n数百家\t386381\nIndonesian\t386382\nMugen\t386383\n常用语\t386384
\n新生化颗粒\t386385\n拉伸机\t386386\n小队长\t386387\n联建\t386388\nrelate\t386389\n轨迹方程\t386390\n一滴血\t386391\n沙叶\t386392\n苍星\t386393\n仰韶\t386394\n青豆小说网\t386395\n2008年4月\t386396\n群战\t386397\n金瓶梅2之爱的奴隶\t386398\n闲坐\t386399\n白盆\t386400\nDISNEY\t386401\n百字\t386402\n内存量\t386403\nIOS10\t386404\n纳妾记\t386405\n缩写字母\t386406\n一画\t386407\n反贼\t386408\n刘芳\t386409\nEmacsWiki\t386410\n创世者\t386411\nshapes\t386412\n89395286\t386413\n龙华中路\t386414\n三健\t386415\n欢乐颂\t386416\n宅腐\t386417\nenya\t386418\nangelababy\t386419\n说不清楚\t386420\n小任\t386421\n吉华街道\t386422\n荔枝干\t386423\n峨眉山市人民政府\t386424\n19lou.com\t386425\ngv\t386426\n重造\t386427\n后半年\t386428\nEY\t386429\n花件\t386430\n农村合作医疗\t386431\n果仁\t386432\n泰坦尼克号2\t386433\n农业供给侧改革\t386434\n云米科技\t386435\nvivoactive\t386436\n野队\t386437\nHatch\t386438\n污点\t386439\nrpx\t386440\n忧\t386441\n20粒\t386442\n魔笔\t386443\n取词\t386444\n禁播\t386445\n黄族民\t386446\n沙坪街道\t386447\n电视频\t386448\n实事\t386449\n机缘\t386450\njackchen007\t386451\n上海交通大学研究生院\t386452\nqsv格式\t386453\n共阳\t386454\n还原性糖\t386455\n逢凶化吉\t386456\n李大伟\t386457\n火焰纹\t386458\n南苑村\t386459\n人鱼\t386460\n涂墨\t386461\n摸黑\t386462\n中国中医科学院广安门医院\t386463\n黄山市人民政府\t386464\n虎斑猫\t386465\n凌特\t386466\n青山乡\t386467\n水波不兴\t386468\n收卷机\t386469\nWhatever\t386470\n化学势\t386471\n6.2.2\t386472\n劳动人民文化宫\t386473\nlabview\t386474\n调节\t386475\n木盒\t386476\nRSP\t386477\nbaocun\t386478\nVeloster\t386479\n巨石山\t386480\n杭州市人民政府\t386481\ntentative\t386482\n奥迪q5\t386483\nB860AV1.1\t386484\n保护区\t386485\n1908\t386486\nbump\t386487\n金狮子\t386488\n王博\t386489\n相亲女\t386490\n任劳任怨\t386491\n松山湖区\t386492\n一样的人\t386493\n红旗河\t386494\n_君\t386495\nPearson\t386496\n新塘路\t386497\n题册\t386498\n正剑\t386499\n麻袋\t386500\n两个子\t386501\n地域文化\t386502\n幸福的味道\t386503\n1701\t386504\n太阳辐射\t386505\n闹腾\t386506\n常兴\t386507\n硼酸盐\t386508\n价房\t386509\nw3wp\t386510\n王燕文\t386511\n黄力\t386512\n谋圣鬼谷子\t386513\n200t\t386514\n金帐汗国\t386515\n阿蒂拉\t386516\nLEG\t386517\n准件\t386518\n中心街\t386519\n天鼎\t386520\n黑枸杞茶\t386521\n权奸\t386522\n青岛市规划局\t386523\n苹果MacBook、iMac\t386524\n精印\t386525\n龙喵\t386526\n南通\t386527\n三
叉街\t386528\n2519\t386529\n尤四姐\t386530\n中铁七局\t386531\n前女\t386532\nGPRS\t386533\n扑朔迷离\t386534\n北京306医院\t386535\n家装网\t386536\nbutt\t386537\n黑犀牛\t386538\n速建\t386539\n阿玛拉王国:惩罚\t386540\n针刺机\t386541\n广九直通车售票中心\t386542\n陈雅森\t386543\nVR+\t386544\n37条\t386545\nASEMI\t386546\nexpecting\t386547\n111.com\t386548\n石油大亨\t386549\n4盘位\t386550\n崖边\t386551\nbain\t386552\n功夫之王\t386553\n诸宸\t386554\n金喉健\t386555\njada\t386556\n病史\t386557\n水饺\t386558\n神之狂想曲\t386559\n乐14k\t386560\n野米\t386561\n运送\t386562\n爆分\t386563\n中国航天科工\t386564\ndvp\t386565\n归国\t386566\n上睑下垂\t386567\n公安大学\t386568\n碰碰碰\t386569\n岩石学\t386570\n实验中学\t386571\n艾辛\t386572\n香味儿\t386573\n7980\t386574\n男人体\t386575\nluanma\t386576\n基面\t386577\n处警\t386578\n000066\t386579\n171022\t386580\n华光玉\t386581\nPaxos\t386582\n布放\t386583\n九仙\t386584\n吉林省纪委监察厅\t386585\n佛界\t386586\n缺口\t386587\nSCENARIO\t386588\ninsite\t386589\n尼根\t386590\n五角枫\t386591\n沙井人民医院\t386592\n合肥寿春中学\t386593\nwolfe\t386594\nimsave\t386595\n变频电机\t386596\n说不_\t386597\n来屋\t386598\nONE\t386599\n匠心独运\t386600\n康妇炎胶囊\t386601\n炼丹\t386602\n昌里路\t386603\n中断源\t386604\n16750\t386605\n笑猫日记\t386606\n进料\t386607\n无铭\t386608\n黑岩阁\t386609\n模型机\t386610\nexcited\t386611\n南天\t386612\n型态\t386613\n三棱镜\t386614\n29GAY交友网\t386615\n华为交换机vlan\t386616\n折原临也\t386617\n冷屏\t386618\n泛达\t386619\nKraft\t386620\n暗棋\t386621\n20140522\t386622\n持有期\t386623\n完完全全\t386624\nentertainment\t386625\n北京快递公司\t386626\nrv\t386627\n轴\t386628\n累不累\t386629\n42码\t386630\n亚玛顿\t386631\n维普\t386632\n高达破坏者3\t386633\n小洁\t386634\nNH4\t386635\n南沙滩\t386636\n安守\t386637\n东兰\t386638\n伊隆·马斯克\t386639\n求解\t386640\n明轩\t386641\nab350\t386642\n气敏传感器\t386643\n毛周角化症\t386644\nWin8系统之家_Win8系统\t386645\n大话2免费版\t386646\nTrent\t386647\n屌爆\t386648\nqiongmiaoer\t386649\n100kg\t386650\n蝴蝶君\t386651\n东窗事发\t386652\npartly\t386653\n脐动脉\t386654\nBecker\t386655\n历年\t386656\n大暮维人\t386657\n暗伤\t386658\n帝吧\t386659\n悟天克斯\t386660\n饮水器\t386661\n透膜\t386662\njue\t386663\n良宵\t386664\n显怀\t386665\n武祖\t386666\nautophagy\t386667\nnotary\t386668\n15周\t386669\n德州扑克游戏\t386670\n广
州军区\t386671\n老捷达\t386672\nuCOS-II\t386673\n查封\t386674\n管桩\t386675\n2096\t386676\n郭淑珍\t386677\nUnHolY\t386678\n光环新网\t386679\n税会差异\t386680\n二手房信息网\t386681\n内内\t386682\nnoob\t386683\nlender\t386684\n假婚\t386685\n福王\t386686\n邢台市区\t386687\n三龄两历\t386688\n揭幕战\t386689\n软控\t386690\n加油点\t386691\n简体语言包\t386692\n巩固\t386693\nrpath\t386694\n数十米\t386695\n冯杰\t386696\nGPS定位系统\t386697\n此战\t386698\ndbc2000\t386699\nmm_\t386700\n恩必普\t386701\n瓷砖粘结剂\t386702\n董冬冬\t386703\n四联疫苗\t386704\n用球\t386705\n小手册\t386706\nkeyframe\t386707\n李明珠\t386708\n羊蛋\t386709\n判断\t386710\n凝眉\t386711\n伯罗奔尼撒战争史\t386712\n壹万\t386713\n铺装\t386714\n游行\t386715\n坯布\t386716\nChuckLu\t386717\n杨晨\t386718\n花姿\t386719\n饭局\t386720\ndisp\t386721\n荞麦\t386722\n观前店\t386723\n百舌鸟\t386724\ngoat\t386725\n猫腻\t386726\nlibVLC\t386727\n华夏时报\t386728\n汨罗江\t386729\ns11\t386730\n南宁地铁1号线\t386731\n巴比妥类药物\t386732\nbilibili哔哩哔哩\t386733\n导热管\t386734\nHEAVY\t386735\n太平盛世\t386736\n清空\t386737\n小升初数学模拟\t386738\ncoustic\t386739\n天马微电子\t386740\n非时\t386741\n巴拉德\t386742\n很长时间\t386743\n攀越\t386744\n郑春华\t386745\n佳丽\t386746\n无齿翼龙\t386747\n网科\t386748\n书式\t386749\nTR\t386750\n必应输入法\t386751\n小说区\t386752\n5.5mm\t386753\n网络战\t386754\n大宛\t386755\n排异\t386756\n龙婆坤\t386757\n友链\t386758\n打颤\t386759\n口服\t386760\n心理素质\t386761\nrado\t386762\n肖勇\t386763\n散粒肿\t386764\nApartment\t386765\n客运班\t386766\nCL\t386767\n高街\t386768\n巨额来电\t386769\n竹料\t386770\nhbase数据库\t386771\nCanal\t386772\n荒火\t386773\n哈尔滨市\t386774\n双重性\t386775\n慰劳\t386776\n苯并三氮唑\t386777\nSwarm\t386778\n葫芦侠\t386779\n面试题集\t386780\n趋势性\t386781\n頼\t386782\n目隐\t386783\ncapa\t386784\n李政穆铁柱\t386785\n31.29\t386786\n21st\t386787\n二语\t386788\n滞后性\t386789\n皇甫谧\t386790\n逆向思维\t386791\nusart\t386792\n青宇\t386793\n浊世\t386794\n乐动达人\t386795\n偏载\t386796\n人机料法环\t386797\n20140209\t386798\n夏芽\t386799\n赫氏\t386800\n一木禾网盘\t386801\nday1\t386802\n转业军人\t386803\n本字\t386804\n30多分钟\t386805\n名都\t386806\ncngaosu\t386807\n云南白药牙膏\t386808\n适得\t386809\nreno\t386810\ncloning\t386811\n神州数码集团\t386812\n浇口\t386813\n10袋\t386814\n框组\t386815\n25厘米\t386816\n牡丹鹦鹉\t
386817\n4月16\t386818\n楚天运动\t386819\n灭顶之灾\t386820\n南朝梁\t386821\n毛骨\t386822\n氧化亚铁\t386823\n石林\t386824\n一言通天最新章节_一言通天\t386825\n慎思\t386826\n超神级\t386827\n民字\t386828\n尼尔·盖曼\t386829\n1速\t386830\n张店区\t386831\nAcademic\t386832\n1-4\t386833\n101期\t386834\n金都花园\t386835\n保利银滩\t386836\n月璃\t386837\n大创美白\t386838\n牛津英语\t386839\n田子坊\t386840\n心不可得\t386841\n阿满\t386842\n几十年前\t386843\n入职率\t386844\n频度\t386845\nTF2\t386846\n崧泽\t386847\n使女\t386848\n两党制\t386849\nCeiling\t386850\n合伙企业法\t386851\n格尔木市\t386852\n40卷\t386853\n嗉囊\t386854\n恒逸\t386855\n无托\t386856\n同步带\t386857\n西川\t386858\n简餐\t386859\n占地面积\t386860\n意行\t386861\nCS6破解版\t386862\n高低温湿热试验箱\t386863\nGBAtemp\t386864\n巴林右旗\t386865\n140亿\t386866\n魔主\t386867\n故国\t386868\n微软雅黑\t386869\nLi\t386870\n啸天\t386871\nDCP-7057\t386872\n腰\t386873\n许田\t386874\n低俗小说\t386875\n遥不可\t386876\n太路\t386877\ndabukai\t386878\nP群交\t386879\n华泰人寿\t386880\nfille\t386881\n鹏欣\t386882\n神山\t386883\nCOG\t386884\nbracelet\t386885\n入营\t386886\n极真空手道\t386887\n长城炫丽\t386888\n牦牛\t386889\nLunar\t386890\n告成\t386891\nxiaoyu\t386892\npriceline\t386893\n下步\t386894\n万那普机场\t386895\n下设\t386896\n伊可\t386897\n层理\t386898\n郑开\t386899\n5丝\t386900\n依靠\t386901\n女身\t386902\n艺术展\t386903\nios8.3\t386904\n280亿\t386905\n卢风郎\t386906\n曝气\t386907\n5.6万\t386908\n教育制度\t386909\n民心所向\t386910\n除雾器\t386911\n杀生丸\t386912\n周岁\t386913\ngrap\t386914\n华为P\t386915\n流水账单\t386916\n当雄\t386917\n观观\t386918\n佳能iC\t386919\n干探\t386920\n焊割\t386921\n朱以撒\t386922\n伦敦桥\t386923\n解放日报_牛网\t386924\n十多岁\t386925\nrecruitment\t386926\n浙江鼎力\t386927\n20170926\t386928\nkpl\t386929\n必修\t386930\n花卉学\t386931\n套牢\t386932\n梭织\t386933\n胆小鬼\t386934\nretro\t386935\n阶符\t386936\nWord2013\t386937\nWorry\t386938\n2017年6月20日\t386939\n道顿堀\t386940\n长条型\t386941\n郑永年\t386942\n罗宾斯\t386943\n五年级品社\t386944\n张宇桦\t386945\n湖北省发展和改革委员会\t386946\n涂敷\t386947\n盐城\t386948\n华益\t386949\nEdmonton\t386950\nded\t386951\nproved\t386952\n张铭阳\t386953\n奥拉天天酷跑\t386954\n杨维桢\t386955\n新疆维吾尔自治区教育厅\t386956\n多可文档管理系统\t386957\n有令\t386958\n人教版四年级语文下册\t386959\n_愿\t386960\n垦利县政府\t386961\n
120000元\t386962\n首购\t386963\n2016年3月31日\t386964\na级车\t386965\n鲜城\t386966\n13.00\t386967\nSINA\t386968\n迈锐宝\t386969\n几周岁\t386970\n随性所欲\t386971\n龙口西小学\t386972\n无码照\t386973\n海鸣馆\t386974\nQQ安全中心\t386975\nlobby\t386976\n2016年3月\t386977\nrestran\t386978\n周峻纬\t386979\n人鱼传说\t386980\n文件型\t386981\n小囡\t386982\n哈皇\t386983\n福建省知识产权局\t386984\n泰山国际马拉松赛\t386985\ndeft\t386986\nsnooker\t386987\n安化政府网\t386988\n收盘价\t386989\n法治网\t386990\n董思成\t386991\n罗纳克\t386992\n_畅\t386993\n很纯明朝那些事儿\t386994\nlzy\t386995\n0.4.1\t386996\n卵巢\t386997\n墓门\t386998\nclt\t386999\nngfor\t387000\nShanghai\t387001\n建新\t387002\n杨辉\t387003\n三乡\t387004\n串行化\t387005\nHangover\t387006\n郭淮\t387007\n回民起义\t387008\n暨阳街道\t387009\n第86号\t387010\n鸿沟\t387011\n洒泪\t387012\n84%\t387013\n溱潼古镇\t387014\n古北口镇\t387015\n济南电信\t387016\nWarning\t387017\n2000转\t387018\n荞麦壳\t387019\nbl文吧\t387020\n科鲁宝马x1\t387021\n非洛地平缓释片\t387022\nwin7旗舰版\t387023\n债权融资\t387024\nCherokee\t387025\n6处\t387026\n中矿资源\t387027\n浦口外国语学校\t387028\n厅\t387029\n上一个人\t387030\nwakawaka\t387031\n7月2日\t387032\n联谊赛\t387033\n印孚瑟斯\t387034\n该帖\t387035\n修屏哥\t387036\nwufa\t387037\n上原志织\t387038\n成都农业科技职业学院\t387039\n红利\t387040\n词林\t387041\n瑞风S2\t387042\ncanada\t387043\n高通并购恩智浦\t387044\n三脉网\t387045\n曲家瑞\t387046\n安徽中医药大学第一附属医院\t387047\n金泰城\t387048\n金陵十二钗\t387049\n步微澜\t387050\n肠易激综合症\t387051\n_夜枫\t387052\n罗蒙环球乐园\t387053\n行政处罚自由裁量权\t387054\n王府井书店\t387055\n幼儿园体\t387056\n法律\t387057\n中国计算机学会\t387058\n中晶\t387059\n开曼群岛注册公司\t387060\n韩派\t387061\n大局观\t387062\n撩\t387063\n求受\t387064\n灵刀\t387065\n袒护\t387066\n55页\t387067\nreceivables\t387068\npolygon\t387069\n金建模\t387070\n劳动关系协调师\t387071\n北京市地税局\t387072\nancestors\t387073\n牙性\t387074\n征途2\t387075\n工作计\t387076\n手抓饭\t387077\nEverywhere\t387078\n61P\t387079\n口袋妖怪X/Y\t387080\n迦南科技\t387081\n天一大联考\t387082\n奴儿\t387083\n科恩\t387084\n我闻佛教网\t387085\n脱硫塔\t387086\npea\t387087\n德施曼指纹锁\t387088\n银泰中心in99\t387089\n区人民政府\t387090\n十二药网\t387091\natx777\t387092\n重宝\t387093\n羽生\t387094\n舞男\t387095\nTmux\t387096\n补休\t387097\n靠不靠\t387098\n范一飞\t387099\n曼谷市区\t387100\n0.05%\t
387101\n子路\t387102\n球女\t387103\n时频\t387104\n配件\t387105\n安溪县人民政府\t387106\nOccitane\t387107\nX展架\t387108\n洗衣间\t387109\n调速器\t387110\nfaulty\t387111\n配速员\t387112\ndianshi\t387113\n陈梦\t387114\n此机\t387115\n内路\t387116\n杨高\t387117\nperfectworld\t387118\n#id\t387119\n树海\t387120\n育苗基地\t387121\nTextField\t387122\n集合信托计划\t387123\n兵藤一诚\t387124\n中站区\t387125\n梵蒂冈\t387126\n玩彩\t387127\n电影萨\t387128\n几几\t387129\n十五家\t387130\n合物\t387131\nfkzhu92\t387132\n宁波地区\t387133\n景苑\t387134\n就象\t387135\n迈尔斯\t387136\n莉莉岛\t387137\n东海论坛\t387138\n联想扬天\t387139\n奇人\t387140\nndt\t387141\n中国人民大学清史研究所\t387142\n丽水路\t387143\n佐野\t387144\n修行者\t387145\n热传导方程\t387146\nV7.2\t387147\n艾力绅论坛_汽车之家论坛\t387148\n300倍\t387149\nm416\t387150\n广汽集团\t387151\n金棕榈\t387152\n信白\t387153\n躶体\t387154\n几寸\t387155\n曾国藩传\t387156\n肉蔻\t387157\n黑宝\t387158\nearth\t387159\n天悦网\t387160\n威立雅\t387161\n消防箱\t387162\n精准度\t387163\n天守阁\t387164\n新人教版部编本二年级下册语文\t387165\n山地玫瑰\t387166\n藏宝阁\t387167\n经济体制\t387168\n第3方\t387169\n红星美凯龙家居集团股份有限公司\t387170\n扶桑\t387171\n40pin\t387172\n方达\t387173\nu型钢\t387174\nZeta\t387175\n一叶\t387176\n象头\t387177\n再送\t387178\n为时\t387179\nx800\t387180\n民事诉讼文书\t387181\nRated\t387182\n瓶底儿\t387183\n5分钟\t387184\n拼死\t387185\nex表格\t387186\n法硕\t387187\n2155\t387188\n文政\t387189\n绿丰\t387190\n管字\t387191\n香港开银行\t387192\n绝地求生4X4\t387193\n站群\t387194\n看图列式\t387195\n刘叶琳\t387196\nUO\t387197\n太升南路\t387198\n牛掰\t387199\n第五十七\t387200\n屏南县\t387201\nOL\t387202\n骆驼队\t387203\n王志民\t387204\n乌尤尼盐湖\t387205\n满洲国\t387206\n原味酸牛奶\t387207\n绯夜传奇\t387208\nzhifubao\t387209\n天下英雄\t387210\nPMP\t387211\n忍者之刃\t387212\n雪龙号\t387213\n中国平安陆金所\t387214\n咕嘟\t387215\n战国红玛瑙\t387216\n治疗者\t387217\n阜阳新闻网\t387218\n水彩画\t387219\n奇瑞瑞虎3xe\t387220\nJuice\t387221\n史蒂夫·乔布斯\t387222\n喵搜\t387223\n片田舎\t387224\n余叔岩\t387225\n抖抖机\t387226\n巴菲特\t387227\nliniux\t387228\npd13\t387229\n转交\t387230\n王希\t387231\nGT5\t387232\n年展\t387233\n无届\t387234\n湖南大学信息科学与工程学院\t387235\n郑帝元\t387236\n胸带\t387237\n安庆火车站\t387238\n一己\t387239\n老总们\t387240\n享乐主义\t387241\n回转式\t387242\n五围\t387243\n雪中悍刀行吧\t387244\nnlpir\t387245\
nceleste\t387246\n杨光明\t387247\n海洋与渔业局\t387248\n轿\t387249\nPPP\t387250\n矩阵论\t387251\n飞行服\t387252\nKOKUA\t387253\nMPI\t387254\n失值\t387255\n校园英语\t387256\n神学\t387257\nDecathlon\t387258\nTango\t387259\n小南斯\t387260\n三农直通车\t387261\n尼奥\t387262\n电子表格\t387263\n胡建\t387264\n图面\t387265\n寡糖\t387266\n径直\t387267\n排布\t387268\n结构方程模型\t387269\n深冷\t387270\nUnite\t387271\nr2016a\t387272\n与名\t387273\n观众席\t387274\n天晴\t387275\n163417\t387276\n蜜婚\t387277\n企业文化\t387278\n沪科\t387279\n源表\t387280\n补间\t387281\n10倍股\t387282\nrospy\t387283\n上海地铁13号线\t387284\n乐峰\t387285\n定影辊\t387286\n_石\t387287\n玉律\t387288\nliyuan\t387289\n日历型\t387290\n苏州中院\t387291\n优麒麟\t387292\n北京第二实验小学\t387293\n焉知非福\t387294\n口袋消消乐\t387295\n爱理不理\t387296\n老藤\t387297\n天津中医一附属医院\t387298\n2017.7.1\t387299\n钓鱼场\t387300\n梵宫\t387301\nidhttp\t387302\n安驰\t387303\n溪边\t387304\n瑟瑟\t387305\nccui\t387306\n衣冠冢\t387307\nV5.4\t387308\n四胞胎\t387309\n旧件\t387310\n悦薇\t387311\n既非\t387312\n三员\t387313\n广厦天都城\t387314\n益美\t387315\n民不聊生\t387316\n指拨\t387317\nproject吧\t387318\n娶妻\t387319\n就业歧视\t387320\ncomponent-scan\t387321\n眼帘\t387322\n上海科技馆\t387323\nbook\t387324\n缓凝\t387325\n影版\t387326\n狙神\t387327\n京东小米\t387328\n中国环科院\t387329\nbliss\t387330\n众星之子\t387331\n卡西利亚斯\t387332\nUniversité\t387333\n亿位\t387334\n西坑村\t387335\n豫东\t387336\ntossgirl\t387337\niPhone8/8Plus\t387338\nsqrt函数\t387339\nOVH\t387340\n86五笔\t387341\n胡为\t387342\n160度\t387343\n4D\t387344\n创立\t387345\n丁香油\t387346\ntio2\t387347\n尔克\t387348\nquoted\t387349\nCODEC\t387350\n济南市高新区\t387351\nFargo\t387352\n冷冲压\t387353\n600770\t387354\n红火\t387355\n油肉\t387356\njdk9\t387357\n鸿山街道\t387358\n泰坦军团\t387359\nGentleman\t387360\n改良\t387361\n张铭恩\t387362\n阿法骨化醇软胶囊\t387363\n广东省财政厅\t387364\n痴鸡\t387365\n度假村\t387366\n房屋抵押贷款利率\t387367\n高机\t387368\n羞人\t387369\n吴卓羲\t387370\n3604\t387371\n逐风\t387372\n38.5\t387373\nTip\t387374\n扭曲\t387375\n牙科\t387376\n56789\t387377\n国家创新驱动发展战略纲要\t387378\n李国英\t387379\n天助\t387380\n纯音乐\t387381\nHMC5883L\t387382\nDispersion\t387383\n秋林\t387384\n马世芳\t387385\n一绝\t387386\n北京大学附属小学\t387387\n钛酸锂电池\t387388
\n绝地求生25global\t387389\n抱病\t387390\n炫赫门\t387391\n郑子威\t387392\n代客\t387393\n大巴扎\t387394\n扬杰科技\t387395\n元宵\t387396\n北芪\t387397\nDonate\t387398\n187卡\t387399\n颗粒型\t387400\n阴离子型\t387401\n高白\t387402\n三斗\t387403\n合江人民政府\t387404\n40A\t387405\n天天快递有限公司\t387406\n苏沐橙\t387407\n244号\t387408\n协信地产\t387409\n殷一民\t387410\n舒耐\t387411\npyquery\t387412\ntop2\t387413\n融创集团\t387414\n媛\t387415\n公司级\t387416\n马口鱼\t387417\n鸟网\t387418\n遥信\t387419\n枪友\t387420\n善男信女\t387421\nSAIF\t387422\n农学院\t387423\n居住小区\t387424\n国际大酒店\t387425\n一般过去式\t387426\n科技业\t387427\nIT笔录\t387428\n星火\t387429\n陈词\t387430\n高新区政府\t387431\nNCEP\t387432\nico格式\t387433\nChine\t387434\n瓦尔加\t387435\nmft\t387436\n四川大学高分子科学与工程学院\t387437\n友田\t387438\n浔兴股份\t387439\nyunfile\t387440\n宝马摩托\t387441\n方洪波\t387442\n254&\t387443\n内江职业技术学院\t387444\n高耀太\t387445\n张园\t387446\nactivit\t387447\n商地\t387448\n堆叠式\t387449\n隔离卡\t387450\nmbp2017\t387451\n三国群英传2mod\t387452\n腌蒜苔\t387453\n隐元\t387454\n庙号\t387455\n蛋白质变性\t387456\n移位运算\t387457\n骑当\t387458\n程书林\t387459\n待收\t387460\n写在\t387461\n逆水桑塔纳\t387462\n建筑工地\t387463\n0.2s\t387464\n杜敏\t387465\n2018高考网\t387466\n腻\t387467\npublic\t387468\nHiker\t387469\n鲤\t387470\nWin10圈\t387471\n四户\t387472\n俏色\t387473\n普选\t387474\n亮堂\t387475\nDiagrams\t387476\n圣德\t387477\nsvga\t387478\nClearType\t387479\n110万元\t387480\n小猪短租网\t387481\n突感\t387482\n阴山工作室\t387483\njs批量\t387484\nA11处理器\t387485\n瓶儿\t387486\n一当\t387487\n途游斗地主\t387488\n延大\t387489\n郭强\t387490\n泺\t387491\n大连枫叶国际学校\t387492\n蜓\t387493\na330\t387494\n版币\t387495\nRum\t387496\nplx\t387497\n兔小贝儿歌\t387498\n威远县人民政府\t387499\n万事达\t387500\ncm-1\t387501\n刘洲成\t387502\n教育部考试中心\t387503\n桃红柳绿\t387504\n十股\t387505\nATH\t387506\n碳氢化合物\t387507\n皇派\t387508\nv2.1.4\t387509\n监察者\t387510\n2018-02-12\t387511\n灵犬\t387512\npastel\t387513\nav撸\t387514\n利澳\t387515\n小蕊\t387516\nrstp\t387517\n宝马730\t387518\n难保\t387519\n裸秀\t387520\n1月7日\t387521\n镇江市建设局\t387522\nanta\t387523\n第七篇\t387524\ncrazyYong\t387525\nOnitsuka\t387526\n中国盲人协会\t387527\nBOOM0403\t387528\n搜寻\t387529\n禁闭岛\t387530\n72dpi\t387531\n李
芬\t387532\n陆巡\t387533\n2017年6月\t387534\n乐晴\t387535\n松子\t387536\n销控\t387537\n追到\t387538\n罪恶王冠\t387539\n苏黎\t387540\n垃圾站\t387541\n注册贸易公司\t387542\n巧改\t387543\n何念晴\t387544\n育儿袋\t387545\n阿斯图里亚斯\t387546\nboar\t387547\n温德米尔\t387548\n孤身\t387549\nh830\t387550\nIGP\t387551\n20【\t387552\n东风凯普特\t387553\n2010年上半年\t387554\n2.2.4\t387555\n朗识\t387556\n卡拉克西英杰\t387557\n辽化吧\t387558\nMAIN\t387559\n对敌\t387560\n威高广场\t387561\nBernstein\t387562\n根脉\t387563\n国产区\t387564\n脚谱\t387565\n刷屏\t387566\n夜店\t387567\nDEA\t387568\nvwmare\t387569\n如数\t387570\n一手\t387571\nNSURL\t387572\n00155\t387573\n御馔津\t387574\n龙须酥\t387575\n雪糕\t387576\n智能锁\t387577\n慈铭体检\t387578\n采补\t387579\n番薯干\t387580\n坐穿\t387581\n淳安县\t387582\n支派\t387583\n连心\t387584\n配箍率\t387585\n优信拍\t387586\nEssentials\t387587\n挖耳勺\t387588\n土建算量软件\t387589\n建档立卡\t387590\n马承\t387591\n高肉\t387592\nmtf\t387593\n炼狱天使\t387594\n6688\t387595\n颐高集团\t387596\n美罗培南\t387597\n鲁班七号\t387598\n贰分\t387599\n22018年\t387600\n为鉴\t387601\n瓜子片\t387602\nSCRIPT\t387603\n苏州\t387604\n霍去唐朝小闲人\t387605\n着想\t387606\n300道\t387607\n95条\t387608\n网红经济\t387609\ncrossover\t387610\n中国乒协\t387611\n斗轮机\t387612\n两个数\t387613\n北京久其软件股份有限公司\t387614\n金融保卫战\t387615\n电城镇\t387616\n卢展工\t387617\n12孔\t387618\n膝盖骨\t387619\n诺维斯基\t387620\nretropie\t387621\nAS3\t387622\n败家党\t387623\n首批次\t387624\n北庭\t387625\nCandles\t387626\nchemdraw\t387627\nchuyao\t387628\n蔡京\t387629\n黑客马拉松\t387630\n浙江省农村信用社\t387631\n同方威视\t387632\n故地\t387633\nEP01\t387634\na+3\t387635\n吸收液\t387636\n去哪儿网Qunar.com\t387637\n日上免税店\t387638\n造纸\t387639\n泰拉瑞亚Terraria\t387640\n延退\t387641\n绵密\t387642\n宝得适\t387643\n石碁\t387644\n吸水率\t387645\n弯_\t387646\n宝华镇\t387647\n3&#160\t387648\n童军\t387649\n铝蜂\t387650\n断电器\t387651\n砸缸\t387652\n350元\t387653\n1.8TSI\t387654\n尺规\t387655\n秀湖公园\t387656\n肖国栋\t387657\n天书奇谭\t387658\n吉他吧_\t387659\n二十四山\t387660\n世界贸易中心\t387661\n赤诚\t387662\n冷墩机\t387663\n設備\t387664\n2090\t387665\n彩票网\t387666\n双拼别墅\t387667\n帝人\t387668\n星河战神\t387669\n孤城\t387670\n秀碧除疤膏\t387671\nuser32.dll\t387672\n盛惠\t387673\nCHAN\t387674\nJe\t387675\n系统环境变量\t3876
76\nspringsource-tool-suite\t387677\n标识\t387678\n微微一笑\t387679\n顾颉刚\t387680\n远里\t387681\npots\t387682\n增量\t387683\n雀之灵\t387684\n孤村\t387685\nToron\t387686\n合作发展\t387687\n中纺\t387688\naqdy\t387689\n9年前\t387690\n供应商管理制度\t387691\n首商网\t387692\n柏曼\t387693\n微信读书\t387694\n牙仔\t387695\nt2航站楼\t387696\nhaodiaose\t387697\nklchang\t387698\n铝扣\t387699\nslight\t387700\n两英\t387701\n绣花制版\t387702\n熊出没之熊熊乐园\t387703\n女衫\t387704\n锁踏\t387705\ncrispr/cas9\t387706\n溶物\t387707\n蛋炒饭\t387708\nx86/x64\t387709\n万和证券\t387710\n慈城\t387711\n17002\t387712\n血气分析仪\t387713\n赢者\t387714\nklia\t387715\n一师附小\t387716\nmiko\t387717\n牛奶粉\t387718\n电动\t387719\n高清谱\t387720\n2013年终\t387721\ncss\t387722\n环面\t387723\n万吨\t387724\n法国驻华大使馆\t387725\n秘藏\t387726\n锦橙\t387727\n辛锐\t387728\nv4.6\t387729\n五爱街\t387730\n加续表\t387731\n梅江镇\t387732\nlfw\t387733\n孩宝\t387734\n米柚\t387735\nTextures\t387736\n到期投资\t387737\n中信泰富\t387738\n三十号\t387739\n笑刑\t387740\n自唱\t387741\ntapping\t387742\n杨小华\t387743\n张唐\t387744\n使君\t387745\n天刀乐伶\t387746\n产后\t387747\n9月20日\t387748\n第三方检测公司\t387749\n啪嗒砰3\t387750\n河北女子职业技术学院\t387751\nWord2010/2013\t387752\n考试版\t387753\n睡佛\t387754\n6.11\t387755\n翌芙莱\t387756\n邹喆\t387757\n阶跃响应\t387758\n尼日利亚拉各斯\t387759\n海门人才网\t387760\n偶现\t387761\n气象路\t387762\n张同禄\t387763\n保利西江\t387764\n佩雷尔曼\t387765\n秋殇\t387766\n热依扎陈赫宾馆偷情\t387767\n隔音板\t387768\n酶标仪\t387769\nreinterpret\t387770\n25.0.0.68\t387771\n2016年内\t387772\n200MW\t387773\n上海老街\t387774\n浙江网商银行\t387775\n十条\t387776\n文件搜索软件\t387777\nCarLife\t387778\n苏玉华\t387779\n郑州高新区\t387780\nT4航站楼\t387781\n上海航欧机电设备有限公司\t387782\nschool\t387783\n凛凛\t387784\n2925\t387785\n8035\t387786\n千个\t387787\n二星\t387788\n3dB\t387789\nInteraction\t387790\n使徒信经\t387791\n速生\t387792\n汤山一号\t387793\n深圳大学信息工程学院\t387794\n幸福的日子\t387795\n膝关节\t387796\n高新技术产业开发区\t387797\n湖北政府\t387798\n铁梨花\t387799\nzipped\t387800\nOff\t387801\n试金石\t387802\n方勤\t387803\n更灵活\t387804\n点差\t387805\n最近3天\t387806\nIndie\t387807\n日刊\t387808\n皇氏集团\t387809\n障子\t387810\nSkeet\t387811\n南海里水\t387812\n朱哲琴\t387813\n斯旺\t387814\n真魅\t387815\n税负\t387816\nClo
uds\t387817\n层析液\t387818\n企业管理吧\t387819\n斜颈\t387820\n人大网\t387821\n几十分钟\t387822\n1.8v\t387823\n宋庆龄幼儿园\t387824\nwww.yzc577.com_yzc577\t387825\n犇\t387826\n江苏省公安厅\t387827\n海南鸡饭\t387828\nTofo\t387829\n中科院微电子所\t387830\n物业\t387831\n吴金贵\t387832\n陈淳\t387833\n四川省职业技能鉴定指导中心\t387834\n勒贝格\t387835\nTLX\t387836\n十几天\t387837\n详见\t387838\n横恋母\t387839\n童林\t387840\n陈晶\t387841\n三三两两\t387842\n内齿\t387843\n南京工业大学\t387844\n郭建平\t387845\n咬口机\t387846\n菲迪克\t387847\n第4讲\t387848\n赵战鼓\t387849\n信阳平桥\t387850\n海南自贸区\t387851\n儒门\t387852\n亩产\t387853\n极品飞车7\t387854\nclearInterval\t387855\n脑残吼\t387856\n视死如归\t387857\n吴伟雄\t387858\nHuffPost\t387859\n公积金卡\t387860\n15册\t387861\nDNF剑魂\t387862\n衰微\t387863\n蓝色海湾\t387864\n五十名\t387865\n奇偶校验位\t387866\nTES5Edit\t387867\n禁欲\t387868\nbttt\t387869\n云企\t387870\n砖纹\t387871\nIncomplete\t387872\n张远\t387873\n第58届\t387874\n禄马\t387875\n痛打\t387876\n啦啦啦啦\t387877\nASN\t387878\n苯酚\t387879\n老熊\t387880\n敢达决战吧\t387881\n眉眼\t387882\n满心欢喜\t387883\nfunctions\t387884\n上奇人才网\t387885\n怀表\t387886\n3把\t387887\nG633\t387888\n沃游\t387889\n定窑\t387890\n20150308\t387891\n人工增雨\t387892\n冇\t387893\n安晴\t387894\n油烟净化器\t387895\nsevice\t387896\n父母宫\t387897\n赛博朋克2077\t387898\n冷帝\t387899\ncce\t387900\npro7plus\t387901\n杜淳\t387902\n手背\t387903\n高町奈叶\t387904\ncaoporon\t387905\nbrasil\t387906\n魔兽世界单机版\t387907\n55cm\t387908\n固网\t387909\n黑暗网游之倒行逆施\t387910\n超级学渣渣\t387911\n紊流\t387912\nDrawing\t387913\n0858\t387914\n奸尸\t387915\n心片\t387916\n关灯\t387917\n80首\t387918\nEnter键\t387919\n和平西桥\t387920\n许鲜\t387921\n海化\t387922\n转曲\t387923\n别忘记\t387924\n宝马公司\t387925\n轮班\t387926\n名汀\t387927\n哪级\t387928\n万博体育\t387929\n尉迟敬德\t387930\n游戏书\t387931\n藤岛\t387932\n公务员们\t387933\n十年以上\t387934\n访古\t387935\n兴鲁\t387936\n优易\t387937\n宁武县\t387938\nSpacemacs\t387939\n10.25\t387940\n陈冠希\t387941\n三机\t387942\nked\t387943\n北京三里屯优衣库\t387944\n甲磺酸伊马替尼\t387945\n赣州经济技术开发区\t387946\n修昔底德\t387947\n高仿耐克\t387948\n8千\t387949\nImagery\t387950\nPASS\t387951\n供应商管理系统\t387952\n踢飞\t387953\nNecklace\t387954\n杲\t387955\n尚友网\t387956\n老陈\t387957\n珍珠草\t387958\n立管\t3879
59\n野原新之助\t387960\n银圈\t387961\n沈阳医大一院\t387962\n公金\t387963\n光化\t387964\n浩瀚\t387965\nntoskrnl\t387966\n002\t387967\n杭州地铁4号线\t387968\n花臂男\t387969\n燃气管\t387970\n1分半\t387971\n完美艾肯\t387972\n可妮\t387973\n宿州市人民政府\t387974\n细小\t387975\nLOL英雄联盟17173\t387976\nssrf\t387977\n动作值\t387978\n箱记\t387979\n李亨\t387980\n私募股权融资\t387981\nGlgoo\t387982\n横穿马路\t387983\n离子交换\t387984\nCho\t387985\n新日石\t387986\n联手\t387987\n9.20\t387988\n百步亭\t387989\n魔城\t387990\nTribal\t387991\n去尾\t387992\nmam\t387993\nELK\t387994\n长仪\t387995\nkeil\t387996\n深蓝色\t387997\n角瓜\t387998\nSTM32-F0/F1/\t387999\n义不容情\t388000\n明价\t388001\n不留行\t388002\nLFY\t388003\n卒年\t388004\n上海市东方公证处\t388005\nrye\t388006\n3.7v\t388007\n西部影视城\t388008\n红枫\t388009\nPVC穿线管\t388010\n零日\t388011\n打扮\t388012\nwantnon\t388013\n九零年\t388014\n万达商业\t388015\n转贴\t388016\nmpq\t388017\n莫德雷德\t388018\n3000度\t388019\n联结\t388020\n临溪\t388021\n重庆自然博物馆\t388022\n水落石出\t388023\n石灰吟\t388024\n营销案\t388025\n无肉不欢\t388026\n春会馆\t388027\n湖南电子科技职业学院\t388028\n草袋\t388029\n伊利诺伊香槟分校\t388030\n广州酒家\t388031\nwi7\t388032\n黑红色\t388033\n雨宗林\t388034\n1000题\t388035\n热狗\t388036\ndeeplab\t388037\n乱世曹操传\t388038\n大凶\t388039\n属性\t388040\n通风排烟\t388041\nΜ\t388042\n吕一\t388043\n臼齿\t388044\n团队版\t388045\n征婚网\t388046\n迅雷/BT\t388047\n星力捕鱼\t388048\necharts2\t388049\n龙篇\t388050\n枕边物语\t388051\n破壁灵芝孢子粉\t388052\n水泊梁山\t388053\nTransactional\t388054\n白兰花\t388055\n李玉赋\t388056\n科鲁兹\t388057\n破碎\t388058\n时空联机加速器\t388059\n北京耀莱成龙国际影城\t388060\n逃学\t388061\nbei\t388062\n12366电子税务局\t388063\n挥杆\t388064\n6221\t388065\n差旅\t388066\n2.docx\t388067\n83057451\t388068\n杭州雄迈信息技术有限公司\t388069\n木桃\t388070\n红光山\t388071\n小刀电动车\t388072\nǔ\t388073\n虹鱼\t388074\n粤版\t388075\n犹他爵士队\t388076\nCrucible\t388077\n340ml\t388078\n六化\t388079\n武汉大学数学与统计学院\t388080\n为辅\t388081\n张文君\t388082\n追寻者\t388083\n盾牌\t388084\ndnfss\t388085\n凤凰六哥广场舞\t388086\n爸爸\t388087\noracleasm\t388088\n水暖工\t388089\n第74集\t388090\n安监部\t388091\n济南市中区\t388092\n顾欢\t388093\n殷实\t388094\n桃浦路\t388095\nspanking\t388096\n堑\t388097\n软镜\t388098\n巧合\t388099\n脂渣\t388100\n千喜教育\t388101\n生态
木\t388102\n阻塞性肺气肿\t388103\n大皇帝ol\t388104\n山县\t388105\n移动开发者社区\t388106\n38.8\t388107\n侃侃\t388108\n翡丽湾\t388109\n道具性\t388110\n巴克\t388111\n济南路\t388112\n纸业\t388113\n校园电视台\t388114\nlayabox\t388115\n刘德惠若琪\t388116\n白起\t388117\n北方稀土\t388118\n2013\t388119\n爱达\t388120\nsubquery\t388121\n两毫米\t388122\n穷人\t388123\n处级\t388124\nsysobjects\t388125\n党政领导干部选拔任用工作条例\t388126\n打女\t388127\n桂林旅游学院\t388128\n版别\t388129\n燕莎\t388130\n汪大娘\t388131\nPNG格式\t388132\nIGG\t388133\n安大略\t388134\n烛光\t388135\n琶洲新村\t388136\n吴莫愁\t388137\niy\t388138\n染成\t388139\n鲁宁\t388140\n20140808\t388141\n50层\t388142\n北京洗浴中心\t388143\n201X年\t388144\ncad素材\t388145\nTrac\t388146\n套版\t388147\n_征途2\t388148\n相比\t388149\ncx-3\t388150\n自认\t388151\n特等\t388152\n聪明人\t388153\nSamsara\t388154\n幸运草\t388155\n联想在线解决方案中心\t388156\n平行线\t388157\n99.5%\t388158\n新五丰\t388159\n杖\t388160\n路星河\t388161\nnx2\t388162\n恶水\t388163\n米发糕\t388164\n42cm\t388165\n谁谁\t388166\n第9版\t388167\n南京交通科技学校\t388168\n熬汤\t388169\n花菜\t388170\n遗作\t388171\nreco\t388172\n成人小说网\t388173\n返港\t388174\ntreme\t388175\n黄依依\t388176\n药用\t388177\ngarnidelia\t388178\n二转\t388179\n青龙湖\t388180\n海运学园\t388181\n1~3月\t388182\n天龙八部黄日华版\t388183\n耐心\t388184\n票点\t388185\n郑平\t388186\n6.3.0\t388187\n囚妃\t388188\n堵管\t388189\nh310\t388190\n旌\t388191\n七幕\t388192\n点村\t388193\n区科协\t388194\n110厘米\t388195\n序列\t388196\n狍\t388197\n忍着\t388198\n育英学校\t388199\ngat\t388200\nADMIN\t388201\n第二性\t388202\n顺鑫\t388203\n中国卫星导航定位协会\t388204\nrequirements\t388205\n总动\t388206\n2百\t388207\nug8\t388208\n新概念英语_新概念英语\t388209\n洛克化工网\t388210\n官庆\t388211\n中国铁路太原局集团有限公司\t388212\n河北电视台\t388213\n季园\t388214\n北京欧泰能科技有限公司\t388215\n青川县\t388216\n就寝\t388217\nfifteen\t388218\n梵文\t388219\n花样机\t388220\n捷途X70\t388221\nretailers\t388222\nibe\t388223\n吴恩达\t388224\n梧桐果\t388225\n48岁\t388226\n北京环卫集团\t388227\n铝合金板\t388228\n保龄球馆\t388229\nResistors\t388230\n私募基金公司\t388231\n西坪镇\t388232\n民智\t388233\npension\t388234\n天上\t388235\n恋母性\t388236\n蹦迪\t388237\n好客山东网\t388238\n29%\t388239\n椎体\t388240\nstrstr\t388241\n老港镇\t388242\n10016\t388243\n轴封\t388244\n3kid
.net\t388245\n苯甲醛\t388246\nRHEL7\t388247\n奠基石\t388248\n惠新西街\t388249\n阿赖耶\t388250\n叶繁星\t388251\n杭州车管所\t388252\n65亿元\t388253\n果戈\t388254\n秘长\t388255\n中高级\t388256\n20150527\t388257\n紫涵\t388258\n防砂\t388259\n审计署关于内部审计工作的规定\t388260\n次元狗\t388261\n段考\t388262\n破戒\t388263\n胜诉方\t388264\ndegrees\t388265\n2018-03-31\t388266\n福州外国语学校\t388267\njmf\t388268\n血剑\t388269\ngige\t388270\nhashcode\t388271\n2017年5月7日\t388272\n宋朝时期\t388273\n天龙诀\t388274\n晋商银行\t388275\n雅鹿\t388276\n创道\t388277\n羧酸盐\t388278\n162号\t388279\n地表水\t388280\n李庄镇\t388281\n伊织凉子\t388282\n安布\t388283\n山东省水利厅\t388284\n阳具\t388285\n何超凤\t388286\nremoting\t388287\n山梨糖醇\t388288\n有好感\t388289\nhlm\t388290\nnumberfield\t388291\n盘古路\t388292\n高床式\t388293\n末法\t388294\n调播放\t388295\n205路\t388296\n开心麻花小品全集\t388297\n弱酸\t388298\n海平\t388299\n飞进\t388300\n伊势\t388301\nfindbugs\t388302\n普通级\t388303\n达利\t388304\nm7650df\t388305\n多元\t388306\n复方酮康唑软膏\t388307\n议论纷纷\t388308\n铃原爱蜜莉\t388309\nAB级\t388310\n迅疾\t388311\n自审\t388312\n郑裕美\t388313\n教育程度\t388314\ntrucks\t388315\n元宝树\t388316\n异丙\t388317\nxon\t388318\n丽安娜\t388319\n┫传奇世界\t388320\n微商管理系统\t388321\n输入性\t388322\n注册入学\t388323\n程姓\t388324\n市气象局\t388325\narcobat\t388326\nsupplied\t388327\n加绒\t388328\n喜马拉雅fm\t388329\n线务员\t388330\n比目鱼\t388331\n骑牛\t388332\n仅剩\t388333\n友臣\t388334\n使命召唤10\t388335\n水质工程学\t388336\nRowkey\t388337\n汽缸\t388338\n白沙县\t388339\n微软(中国)有限公司\t388340\n满目\t388341\n二丫网\t388342\n九息娱乐\t388343\n联锁\t388344\n叶子猪模拟器\t388345\nv2.1.1\t388346\n蕾蒂\t388347\n圈套\t388348\n莫奈\t388349\n羽色\t388350\n盼头\t388351\nE2\t388352\n海情\t388353\n飞鸽\t388354\nInclusion\t388355\n201404\t388356\n正比例函数\t388357\n写盘\t388358\n三江村\t388359\n误工费\t388360\n国行\t388361\n梅林水库\t388362\n绯玉丸\t388363\n应县公众信息网\t388364\n张春光\t388365\n上个礼拜\t388366\nsic\t388367\n601169\t388368\n家纺类\t388369\n吊顶灯\t388370\n闪开\t388371\nMSSQL\t388372\n米什金\t388373\ndeja\t388374\n1014\t388375\n哈弗雷克萨斯is\t388376\n卢汉\t388377\n$emit\t388378\n圣经和合本\t388379\n9万公里\t388380\n帅位\t388381\nmobi-CSDN\t388382\nx-1/\t388383\n睡眠日\t388384\n慈心\t388385\nfls\t388386\n旁观者\t388387\naerob
ic\t388388\n领导节\t388389\n订合同\t388390\n宁夏人事考试中心\t388391\n说课\t388392\n如是观\t388393\n汤姆丁磊\t388394\n鹿鞭\t388395\n网易企业邮箱\t388396\n19:30\t388397\n猎美\t388398\n龙炮\t388399\n马步芳\t388400\n便携播放器\t388401\n煤泥\t388402\n绊爱\t388403\nlog1\t388404\n行家\t388405\n缤果\t388406\nん\t388407\n玖富集团\t388408\n补\t388409\n忘词\t388410\nprint\t388411\nl39\t388412\nH81\t388413\n众安尊\t388414\n巨眼\t388415\nSuccess\t388416\n玛沁县\t388417\nrelating\t388418\n顾盛夏\t388419\niglol吧\t388420\nFilme\t388421\n绿驰\t388422\nMagnus\t388423\ndebugger\t388424\nvconfig\t388425\n十九大党\t388426\n4399狂野飙车8\t388427\n丁某\t388428\n丰盛\t388429\n博雅汉语\t388430\nPrezi\t388431\n企划案\t388432\nGiant\t388433\n十分位\t388434\nsqs\t388435\nwpsoffice\t388436\n战争机器人\t388437\n天大新闻网\t388438\ndividend\t388439\n如隔\t388440\n回归者\t388441\n侍宠\t388442\nExclusive\t388443\n宁泽王雅媛\t388444\n陈艾\t388445\n半级\t388446\n颍河\t388447\n徐静\t388448\n玉泉校区\t388449\n记录型\t388450\n大费\t388451\n浇\t388452\nwampsever\t388453\n以案促改\t388454\nsketchUp\t388455\n佳木斯市\t388456\n加利福尼亚\t388457\n夏侯玄\t388458\n刘雪华\t388459\n安转\t388460\n驭界\t388461\nParis\t388462\njavaee\t388463\n进销存管理\t388464\n水皮白夜行\t388465\n钢锯岭\t388466\nDS160\t388467\nPLAYER\t388468\n贝内文托\t388469\n小规模公司\t388470\n宝武新闻中心\t388471\n悬式\t388472\n第几条\t388473\n薄锐ENVY\t388474\n750ti\t388475\n郭顶\t388476\n南安\t388477\n吉珠\t388478\n抓手\t388479\nboule\t388480\n旁批\t388481\n沙金\t388482\n团险\t388483\n圆振动筛\t388484\n广宣费\t388485\n中国石油集团\t388486\nkst\t388487\n337路\t388488\n袁洁莹\t388489\n2个多月\t388490\nmimaki\t388491\n高邑\t388492\n百分浏览器\t388493\n妙不可言\t388494\n喷\t388495\nSBB\t388496\n亚行\t388497\n盲色\t388498\n140平米\t388499\n浏览页\t388500\n双面性\t388501\n路东\t388502\n东岳大街\t388503\n7K\t388504\n别后\t388505\nVincen\t388506\nWiDi\t388507\n气色\t388508\n后宅\t388509\n松江二中\t388510\ngl百合\t388511\nBRAS\t388512\n西尔贝\t388513\n彭罗斯\t388514\n67年\t388515\nLeaflet\t388516\n曹林\t388517\n图形化\t388518\n富春论坛\t388519\n笔刀\t388520\npetty\t388521\ncentric\t388522\n盛世花园\t388523\nshey\t388524\nthr10\t388525\n宝卡\t388526\ndu\t388527\n拔苗助长\t388528\n上海家教网\t388529\n第五章\t388530\n应标\t388531\n花样少年\t388532\nj罗\t3
88533\n沩山\t388534\n甘肃省政府\t388535\n通性\t388536\n商标转让网\t388537\n武惠妃\t388538\n春草网\t388539\n阮毅\t388540\nstoya\t388541\n红瑞\t388542\n江苏省委\t388543\ngraphics\t388544\n板川\t388545\n标像\t388546\n璩美凤\t388547\n踏天无痕\t388548\n新力城\t388549\n八里河风景区\t388550\n新视野大学英语3\t388551\n2018年6月30日\t388552\n凤冠\t388553\n你好坏\t388554\nBecomes\t388555\n王者\t388556\n晓军\t388557\n2711\t388558\n独特性\t388559\nQOS\t388560\n金铺\t388561\n浮层\t388562\nflyzy\t388563\n瓦基弗\t388564\nMembership\t388565\n南宁\t388566\n曼月乐环\t388567\n冰帝\t388568\n平均线\t388569\n爱普生LQ\t388570\n70题\t388571\n同安路\t388572\n莲花童子哪吒\t388573\nraises\t388574\n泳装秀\t388575\n四川发展控股有限责任公司\t388576\n2og\t388577\n和硕\t388578\nt430s\t388579\n杂交种\t388580\n香江花园\t388581\n徐霞\t388582\nETH\t388583\n第三组\t388584\n罗青\t388585\ncompanion\t388586\n狍子\t388587\n鼎湖\t388588\nFastDFS\t388589\n攒劲\t388590\n广场舞\t388591\n中国际\t388592\n7.2_\t388593\n申根保险\t388594\n超便宜\t388595\n门头照\t388596\n代用茶\t388597\n可调\t388598\nBike\t388599\n深圳市经信委\t388600\n回租\t388601\n嘉兴经济开发区\t388602\nkbd\t388603\n猪八戒小说网\t388604\n旅游岛\t388605\nGoLove\t388606\n蛇舌\t388607\n尼曼\t388608\n3602\t388609\n同步导学\t388610\n雪中悍刀\t388611\n程蝶衣\t388612\nOlga\t388613\n南宁市委\t388614\nQMenu\t388615\n5000元\t388616\n小区物业公司\t388617\ny35\t388618\n一阙\t388619\n泡沫机\t388620\n0xc0000098\t388621\n杭州商学院\t388622\n棉柔巾\t388623\n屁\t388624\n星新一\t388625\n洛伦兹\t388626\n徐嘉余\t388627\n众盟\t388628\n自发性\t388629\n张丽君\t388630\n山西省交通运输厅\t388631\nplanning\t388632\n天津财经大学\t388633\n纳税申报表\t388634\n十二星座女\t388635\n喝水\t388636\n湖南大学\t388637\n732259\t388638\n韩熙雅\t388639\n外沿\t388640\n哑银\t388641\nLol\t388642\ncsrf\t388643\n花千隋唐\t388644\n一亲\t388645\n历程\t388646\n多余行\t388647\nEDI\t388648\nolm\t388649\n红薯叶\t388650\n秋鸿\t388651\n资产评估公司\t388652\n积木\t388653\n孝文帝\t388654\n上上谦\t388655\ncodemirror\t388656\n76\t388657\nripndip\t388658\npowerpoint_办公软\t388659\n青辣椒\t388660\n高修\t388661\n2017年1月7日\t388662\n北京诺华制药有限公司\t388663\nMCH\t388664\n全心全意\t388665\nppb\t388666\n瓷瓶\t388667\nerror-page\t388668\n博鳌镇\t388669\n纹身贴\t388670\nGroza\t388671\n颐达\t388672\nPART\t388673\n金花镇\t388674\n志美\t388675\n勇敢星\
t388676\n风尚型\t388677\n机价\t388678\n日久见人心\t388679\n广角镜头\t388680\n南极熊\t388681\n温\t388682\n本子库\t388683\n大唐仙妖劫\t388684\n第五大\t388685\n伊通满族自治县\t388686\n大成都\t388687\n五一新村\t388688\n蜘蛛丝\t388689\nE-BODY\t388690\n嘉捷\t388691\n才府\t388692\nKoo\t388693\n工伤保险赔偿\t388694\n正转\t388695\n奶酱\t388696\n双清\t388697\n占地\t388698\nNewsletter\t388699\n别掉\t388700\n肠道菌群\t388701\n施工组织设计.doc\t388702\n汪永清\t388703\n冬菜\t388704\n饶雪漫\t388705\n管材\t388706\n齐家治国\t388707\n3d版\t388708\n车尾\t388709\n华宇娱乐\t388710\n八卦象数疗法\t388711\n秘药\t388712\n音乐学专业\t388713\nPoems\t388714\ncular\t388715\n沈从文\t388716\n晓玲\t388717\n中端\t388718\n金毛狗\t388719\n迪欧\t388720\nCollateral\t388721\nx4\t388722\n检修工\t388723\n奥迪A3论坛\t388724\n导电膏\t388725\n西游记释厄传\t388726\nkonka\t388727\n3528\t388728\n沉香树\t388729\n厌奶\t388730\n聘书\t388731\nstopped\t388732\n未名居_晋江文学城\t388733\nMVC拦截器\t388734\n井道\t388735\n映畫\t388736\n销售商\t388737\n私欲\t388738\n孙海英\t388739\n南通大学附属医院\t388740\n英语\t388741\n当着\t388742\n四季桂\t388743\n穷人家\t388744\n15次\t388745\n沐川\t388746\n国家劳动总局\t388747\nstarbucks\t388748\n10万张\t388749\n吴桥县\t388750\n岱勒新材\t388751\nM.2\t388752\n伛偻\t388753\n樱岛\t388754\n去哪儿网酒店\t388755\n深圳考试院\t388756\n苏州市住房公积金管理中心\t388757\n上海广场\t388758\n吕坤\t388759\n七汉\t388760\n滞环\t388761\n克林霉素磷酸酯凝胶\t388762\n搜狐汽车网\t388763\n内关穴\t388764\n摩斯探长前传\t388765\n线体\t388766\n鱼危\t388767\n吴玲\t388768\n东乡平八郎\t388769\n浙江永强\t388770\nIce\t388771\n春江花\t388772\nhonor\t388773\n400m\t388774\n初等行变换\t388775\nTPL\t388776\n夏凉帽\t388777\n426\t388778\n10帧\t388779\n金信\t388780\n阿斯\t388781\n埃及的金字塔\t388782\n钡餐\t388783\naccess函数\t388784\n收缝\t388785\ndecrease\t388786\n快销品\t388787\n笔帽\t388788\n辉煌版\t388789\n老辈\t388790\n6609\t388791\n重庆中国旅行社\t388792\n妖舟\t388793\n大渡口网\t388794\n托词\t388795\nX21A\t388796\n透明板\t388797\n浙大紫金港\t388798\n小硕\t388799\n一轮\t388800\n香鸡\t388801\n金冲\t388802\n试教\t388803\n枝城\t388804\n第十七天\t388805\n吉利丁粉\t388806\n2065\t388807\n弃坑\t388808\n老邢\t388809\nsf\t388810\n东城华府\t388811\n哈希索引\t388812\n京锐\t388813\n交款\t388814\n会销\t388815\n摄取\t388816\n363\t388817\n青岛市人社网网上办事服务大厅\t388818\n格林豪泰酒店\t388819\n审慎\t388820\nzimbra\t38882
1\n上海人民广场\t388822\n南昌市卫生和计划生育委员会\t388823\n高隐\t388824\n无言独上西楼\t388825\n深意\t388826\n亚萨卡\t388827\n毁林\t388828\nPutin\t388829\n冰枪\t388830\nmetlife\t388831\n函馆\t388832\nWNDWJ\t388833\n液压泵\t388834\n第78期\t388835\n冷柜\t388836\n雷科\t388837\n仁怀\t388838\nArtemis\t388839\n国家医保局\t388840\n暗裔剑魔\t388841\n量值\t388842\n全国各地区\t388843\necust\t388844\nmoxa\t388845\n忍者村\t388846\ntopdf\t388847\njx3\t388848\ngtx650\t388849\n手谈\t388850\n秦舒培\t388851\n新网互联\t388852\n滋阴\t388853\n20天\t388854\n8磅\t388855\n梨泰院\t388856\n德味\t388857\n杨庙镇\t388858\n黄河壶口瀑布\t388859\n哈哈哈\t388860\n新空间\t388861\n7天6晚\t388862\n汉族\t388863\nrepayment\t388864\n你好\t388865\n2048s\t388866\n浙网\t388867\ntxt下书网\t388868\n荧光色\t388869\n会计之友\t388870\n高昌\t388871\n688\t388872\n妻母\t388873\n招金矿业\t388874\nGut\t388875\n黑龙江省科技厅\t388876\n瑞尔\t388877\n金政基\t388878\n游戏群\t388879\n第A04\t388880\n武汉酒店\t388881\n忌讳\t388882\n光纤光谱仪\t388883\n明卫\t388884\n龙岗镇\t388885\n鱼身\t388886\n悦康\t388887\n刘正\t388888\n易企微\t388889\n彩虹岛吧_\t388890\n博斯\t388891\n涵闸\t388892\n单向板\t388893\n能率\t388894\nverita\t388895\n彩虹代刷网\t388896\n御湖\t388897\n微型化\t388898\n共阳极\t388899\n圩田\t388900\n交大二附院\t388901\n天天看播放器\t388902\n宝利国际\t388903\n北京仲裁委员会\t388904\n佳运\t388905\n饶平县人民政府\t388906\n八廓街\t388907\n丹纳\t388908\nOCCI\t388909\n一点|\t388910\n失控奔驰\t388911\npip3\t388912\n日本怡红院\t388913\nclosest\t388914\n1.2.1.0\t388915\nAnnihilation\t388916\nSDCard\t388917\n中国消防产品信息网\t388918\n铁甲钢拳\t388919\n北大方正人寿保险有限公司\t388920\nbeginthread\t388921\nbeyerdynamic\t388922\nSignalTap\t388923\n李贵中\t388924\n大岛渚\t388925\n人肉叉烧包2\t388926\nGaAs\t388927\n不犯\t388928\n超星神\t388929\n亚布力\t388930\nVV5\t388931\n龙泉市\t388932\n邹北业\t388933\n马萨拉蒂\t388934\n词词\t388935\n洛和霞\t388936\n高压开关柜\t388937\nHTRI\t388938\n高兵\t388939\nHydroxy\t388940\ncustomer\t388941\n星秀\t388942\n茶叶包\t388943\n拉掉\t388944\n长运\t388945\n求求你\t388946\nDaniel\t388947\nP1108\t388948\nPRO2\t388949\n原始宗教\t388950\n升降旗\t388951\n尼娜\t388952\n客户关系\t388953\nBlossom\t388954\n厦门建发集团有限公司\t388955\n小苍兰\t388956\n0909\t388957\n191个\t388958\n滑县人民政府\t388959\n龙采\t388960\n孙宏伟\t388961\n命令与征服3\t388962\n羊绒\t38
8963\n马陆\t388964\n風間\t388965\n保定市徐水区人民政府\t388966\n四五\t388967\nCzech\t388968\n计数单位\t388969\n锐钛型\t388970\n森川安娜\t388971\nARTS\t388972\n投资额\t388973\n悠悠鸟\t388974\n咔哒\t388975\n墨西柯\t388976\n临金高速\t388977\n38510169\t388978\n黄承伟\t388979\n水沫子\t388980\n张迈\t388981\n黄焖鸡米饭\t388982\n3p\t388983\n马长生\t388984\ninert\t388985\n中国银行保险监督管理委员会\t388986\n佳能5D2\t388987\nKakaoTalk\t388988\nptf\t388989\n不可得\t388990\n帮贡\t388991\n刺探\t388992\n30pin\t388993\n敢不敢\t388994\ncolorbar\t388995\njustin\t388996\n李志绥\t388997\n7种\t388998\n192.168.0.101\t388999\n雷蛇宏\t389000\n敲入\t389001\n爱丽时尚网\t389002\n草样年华\t389003\njestina\t389004\n漏嘴\t389005\n陷\t389006\nSR1\t389007\n贪图享乐\t389008\ninsert\t389009\nithome\t389010\nDonkey\t389011\n社会工作者考试\t389012\nSweet-Tang\t389013\n不死族\t389014\n面角\t389015\nfprintf函数\t389016\n什么业\t389017\n动迁\t389018\n云游戏吧\t389019\n甘肃中医学院\t389020\n2018.3.10\t389021\n八几年\t389022\n织入\t389023\n架工\t389024\n云_比特网\t389025\n黄页88生活服务网\t389026\n美和\t389027\n中科院化学所\t389028\n矩阵式\t389029\n村野桃花朵朵开\t389030\n范唱\t389031\n爸爸的好儿子\t389032\n马边县\t389033\n午休\t389034\n7公分\t389035\n按动\t389036\n至善基金\t389037\nRosi\t389038\n现代农业园区\t389039\n星纹\t389040\n惰性气体\t389041\n重庆市高级人民法院\t389042\nTM4C\t389043\n埃夫特\t389044\n加州大学欧文分校\t389045\n莫桑比克\t389046\n电子科技大学中山学院\t389047\ni7-8550U\t389048\n折衷主义\t389049\n探寻\t389050\n芭蕾基训\t389051\n投出\t389052\n王世贞\t389053\n新晃侗族自治县\t389054\nclearance\t389055\n四川省疾病预防控制中心\t389056\n最后的日子\t389057\n中国数据中心\t389058\n百万分之一\t389059\n24块\t389060\n太囧\t389061\n美国短毛猫\t389062\nexpress4\t389063\n硬挺\t389064\n新剑侠情缘\t389065\n理发店\t389066\n14nm\t389067\n中国青年政治学院\t389068\n阮元\t389069\nMort\t389070\n灰胡子\t389071\n咕噜噜\t389072\n心窍\t389073\n0358\t389074\n糖果赛\t389075\n德标\t389076\n小森林冬春篇\t389077\n白页\t389078\n私募股权机构\t389079\n房祖名\t389080\n九曜\t389081\n北京市高院\t389082\n明若晓溪\t389083\n马苏里拉\t389084\nSylar\t389085\n刘宝瑞\t389086\nCFG\t389087\n固防\t389088\nibasso\t389089\n精锻\t389090\n以柔克刚\t389091\n和平积习\t389092\n美国国家旅游局\t389093\n墨经\t389094\n北京市电力公司\t389095\n245mm\t389096\n昆明机场\t389097\n69码\t389098\nfemfresh\t389099\nWindowsxp\t389100\n航意险\t389101
\n发热芯\t389102\n铅精矿\t389103\n坚果云帮助中心\t389104\n春茗\t389105\n玛丽·雪莱\t389106\n南京电信\t389107\n万份\t389108\nBalance\t389109\n英国航空\t389110\n图层管理器\t389111\n石狗\t389112\n/div\t389113\n杭港\t389114\n夏小星\t389115\n中泰国际广场\t389116\n天津大学理学院\t389117\nmeimei\t389118\n李家洋\t389119\nvtec\t389120\n焦作百姓网\t389121\n八维\t389122\n英语作文网\t389123\n服务行业\t389124\n宁波理工\t389125\n384\t389126\n森崎温\t389127\nps钢笔\t389128\nqq群号\t389129\n战棋\t389130\nnovember\t389131\n0600\t389132\n李朋\t389133\n蓝驱\t389134\n对得\t389135\n常州\t389136\n面包房\t389137\n写字表\t389138\n与会\t389139\n阿塞拜疆站\t389140\n内筒\t389141\n青岛一中\t389142\n恩佐斯\t389143\nLucene\t389144\n六足\t389145\nsuperpanda\t389146\n大涨\t389147\nHermes\t389148\n代表性\t389149\n太湖县\t389150\n王国新大陆\t389151\n龟头\t389152\n推敲\t389153\n纯洁心灵\t389154\n口红管\t389155\n和林格尔新区\t389156\n王恩东\t389157\n深中\t389158\n傅越泽\t389159\n失圆\t389160\n参构造函数\t389161\n七言律诗\t389162\n爆棚\t389163\n十九大知识竞赛\t389164\n膜\t389165\n大吉岭茶\t389166\n曲须龙\t389167\n陆强\t389168\nshendu\t389169\n搭理\t389170\n连环杀手\t389171\n12300\t389172\n德邦证券\t389173\n爵少\t389174\npingan\t389175\n意译\t389176\n商洛市财政局\t389177\n工长\t389178\n分词\t389179\n湖南理工职业技术学院\t389180\n金融大街\t389181\n胸包\t389182\nOutMP3\t389183\n赞歌\t389184\n模架\t389185\n鲍春来\t389186\n4.31\t389187\n小到大\t389188\n东兴区人民政府\t389189\n八里街\t389190\nTR069\t389191\n黄伟\t389192\n南京火车站\t389193\naur\t389194\n28公里\t389195\n免疫科\t389196\n武技\t389197\n节电\t389198\n赵明诚\t389199\nvaried\t389200\nWarrior\t389201\n神州易桥\t389202\n矿业权\t389203\n茶企\t389204\n滨城\t389205\n青冈\t389206\nlyon\t389207\n嘉嘉乐\t389208\n兴庆\t389209\nEmperor\t389210\n中科路\t389211\n满清\t389212\n木下佑哗\t389213\ncorona\t389214\nDisco\t389215\n假惺惺\t389216\n法邦网\t389217\n李威\t389218\n预展\t389219\npbt\t389220\n惠人\t389221\n艺能界\t389222\nadds\t389223\n热原\t389224\n以房养老\t389225\n52RD\t389226\n束腹\t389227\n曾舒蓓\t389228\n常州物流公司\t389229\n双盲\t389230\nCSS+DIV\t389231\n一言堂\t389232\n舞出我人生5\t389233\nbtkitty\t389234\n芫花\t389235\n看报纸\t389236\n淫淫网\t389237\n私会\t389238\nouting\t389239\n三边封\t389240\nxinhuanet\t389241\n中天钢铁\t389242\nexisting\t389243\n十八年前\t389244\n流川\t389245\nmouqj\t389246\nOr
ange\t389247\n中国地震局\t389248\ncarmen\t389249\ntiantang\t389250\n厦门日报\t389251\n新白发魔女传\t389252\n泪沟\t389253\n取乐\t389254\nsele\t389255\n元明清\t389256\n九山顶\t389257\ngorgeous\t389258\n宁死\t389259\n旅顺开发区\t389260\n医学界\t389261\n蛙哥\t389262\n_青网\t389263\ncss语\t389264\nsipp\t389265\n阿比盖尔\t389266\nKubernets\t389267\n翡翠半岛\t389268\nnjnu\t389269\n#30\t389270\n:\t389271\n闪克\t389272\n按键式\t389273\n冠亚\t389274\n中国金融科技\t389275\n泡棉\t389276\n传奇GM论坛\t389277\ntamcat\t389278\n海洋哺乳动物\t389279\n发炎\t389280\n河南站\t389281\nUIUC\t389282\n城市在线\t389283\n5200U\t389284\n曼玉\t389285\n体系\t389286\n和一\t389287\n大战僵尸\t389288\n奔驰e200\t389289\nzongyi\t389290\nvisits\t389291\n铜炉\t389292\n超级强势股\t389293\n600目\t389294\n成风魄郎\t389295\n博展\t389296\n水平\t389297\n郭嘉和\t389298\n阿三生\t389299\n心香\t389300\ntibco\t389301\n迪斯马斯克\t389302\n吸引\t389303\n制鞋业\t389304\n鸿涛\t389305\n志工\t389306\n练习三\t389307\n航材\t389308\n何夕\t389309\n鱼翅\t389310\n压差传感器\t389311\ndoomsday\t389312\n阑夕\t389313\n三氯乙酸\t389314\n舍得酒业\t389315\ni7-7700HQ\t389316\n恐龙公园\t389317\n中牧\t389318\n主厅\t389319\n言说\t389320\n共同体\t389321\n飞企\t389322\n国家质量技术监督局\t389323\n圣体\t389324\n走珠\t389325\n无所不在\t389326\n王晓阳\t389327\n变窄\t389328\n大堂\t389329\n京台\t389330\n吴翔\t389331\n春刀\t389332\n日线图\t389333\n抽放\t389334\n外团\t389335\n衢宁铁路\t389336\n上海轨道交通2号线\t389337\n侵入\t389338\n偷得\t389339\n内存双通道\t389340\n海富\t389341\nob11\t389342\n拓日新能\t389343\n三千亿\t389344\n中瑞\t389345\n樱花祭\t389346\n公积金贷款计算器\t389347\nHelmet\t389348\n厄运东\t389349\nEuro\t389350\n背包\t389351\nPROCEDURE\t389352\nConveyor\t389353\nSiteServer\t389354\n中国女孩\t389355\nsetsid\t389356\n乐维\t389357\n太阳能电池片\t389358\n马怀德\t389359\n新浪院校库\t389360\n弘毅\t389361\n轧花网\t389362\n几票\t389363\n虚开增值税发票罪\t389364\nwomens\t389365\nviewmodel\t389366\n夫君们\t389367\n扼杀\t389368\n管一管\t389369\nDb2\t389370\n王鹰\t389371\n增值税专用发票使用规定\t389372\n婚配表\t389373\n产生\t389374\nJax\t389375\n卡特里娜\t389376\n跑马\t389377\n笼络\t389378\n加尔克汉德\t389379\n伊鲁卡\t389380\n电轴\t389381\necxl\t389382\n鑫远\t389383\n巨贵\t389384\n来宁\t389385\n乩童\t389386\n枯槁\t389387\n不在岗\t389388\n尺蠖\t389389\n紫气东来\t389390\n蛤蟆镜\t389391\nStellar
\t389392\nEasyUi\t389393\n栗\t389394\nhip\t389395\n胃丸\t389396\n夜深人静\t389397\n互感\t389398\n跟踪器\t389399\n有机肥设备\t389400\n砰砰\t389401\n书局\t389402\n大地之歌\t389403\n雷蛇利维坦\t389404\n园中\t389405\n上海现代建筑设计(集团)有限公司\t389406\n0644\t389407\nlacp\t389408\n殷都区\t389409\n百分之50\t389410\n7200u\t389411\n上海大学继续教育学院\t389412\nJimi\t389413\n相逢是首歌\t389414\n扼要\t389415\n央媒\t389416\nUIP\t389417\n那个他\t389418\n薄层层析\t389419\nGBE\t389420\nMISSHA\t389421\n抛光轮\t389422\n快猴游戏网\t389423\n淫三国梦想\t389424\n38军\t389425\n三霄\t389426\n动漫论\t389427\n海普\t389428\n政论\t389429\n高承勇\t389430\n阴超\t389431\n三时\t389432\nnea\t389433\nProfessor\t389434\n会导\t389435\n黄河水\t389436\nShepherd\t389437\nvipjr\t389438\n硬脂酸钙\t389439\n2月9号\t389440\n胚房\t389441\n艺术品\t389442\n诞生石\t389443\n黑暗剑\t389444\n唇舌\t389445\n渐行渐远\t389446\n2500TB\t389447\n抬吊\t389448\n黄博文\t389449\nfileopen\t389450\n喵咪\t389451\n下邳\t389452\n韵达快递\t389453\n刘梦\t389454\n地暖分水器\t389455\n五百\t389456\n联想e450\t389457\n桂英\t389458\nCandies\t389459\n奔驰G\t389460\n魔王子\t389461\n竹条\t389462\n肖复兴\t389463\npopupWindow\t389464\n一般类\t389465\n醉红颜\t389466\n柏木由纪\t389467\n管儿\t389468\n大立光\t389469\nREPLACE\t389470\n上海郊区\t389471\n广西海事局\t389472\nQAQ\t389473\ndataGrid\t389474\n香芬\t389475\n女演员们\t389476\n亲朋好友\t389477\n在产品\t389478\n乔波\t389479\n_都市\t389480\n凯文哈特\t389481\n摇头\t389482\n嚎风峡湾\t389483\n残火\t389484\n无敌手\t389485\n形式感\t389486\n参苓白术丸\t389487\nwicc\t389488\n12pt\t389489\n代谢性酸中毒\t389490\n手摇式\t389491\n线性代数吧\t389492\n职生\t389493\nwap端\t389494\n泡子\t389495\n白鹭洲\t389496\n副总理\t389497\n四力\t389498\n犹\t389499\n超声耦合剂\t389500\n逗\t389501\n考运\t389502\nURBAN\t389503\n26次\t389504\n谷堆\t389505\nNSFW\t389506\nXRacoon\t389507\n哈飞\t389508\n缙云\t389509\norganic\t389510\n支农支\t389511\n额尔古纳市\t389512\n峻晨\t389513\n曲折\t389514\nSUBPIG\t389515\n雨来\t389516\nlavarel\t389517\nvalign\t389518\n裴医堂\t389519\n恭迎\t389520\n第10名\t389521\nTUBULAR\t389522\n有组织\t389523\n关节腔\t389524\npointers\t389525\n孙志\t389526\nlabeled\t389527\n公公\t389528\n本该\t389529\n防爆型\t389530\n宅\t389531\n052D\t389532\nGordon0918\t389533\n胶体金\t389534\n鹦鹉螺\t389535\n万元户\t389536\n郑州国医堂
医院\t389537\nDeceit\t389538\n3.0.0.0\t389539\n金樱子\t389540\n胸照\t389541\n铜雀台\t389542\n上巳节\t389543\n递归函数\t389544\n裁剪器\t389545\n福建大厦\t389546\n120000\t389547\n神思电子\t389548\nIMP\t389549\n4月份\t389550\n循化县\t389551\n尼基塔\t389552\n康万生\t389553\n市政府法制办\t389554\n444号\t389555\n中国传媒大学南广学院\t389556\n奸杀\t389557\n套标\t389558\n比亚迪S7论坛_汽车之家论坛\t389559\n红太行\t389560\n软件商\t389561\n合肥经济开发区\t389562\n阿奇姆彭\t389563\n稻城亚丁景区\t389564\n荆州市民政局\t389565\n香蕉计划\t389566\n电磁调速电机\t389567\n斗蟋蟀\t389568\n辽宁省教育厅\t389569\nwebApp\t389570\n芯篇\t389571\n变阻器\t389572\n高强\t389573\n西门子(中国)有限公司\t389574\n娜拉\t389575\n万全镇\t389576\n徐达\t389577\n浔\t389578\n伸\t389579\n中国铁道博物馆\t389580\n子文档\t389581\n发育生物学\t389582\n足コキ\t389583\n移上\t389584\n经信\t389585\n退件\t389586\n张君宝\t389587\nhinata\t389588\nnodelist\t389589\n超不过\t389590\n尾气\t389591\n胃体\t389592\n菏泽开发区\t389593\n总攻\t389594\nD9\t389595\n茅厕\t389596\n三国群英传ol\t389597\n仓储业\t389598\n一行多个\t389599\n军用机场\t389600\n微流控\t389601\naccess2016\t389602\n李治\t389603\n有机食品认证\t389604\n共管\t389605\n赶出\t389606\n白樱\t389607\n电气化\t389608\n105\t389609\n郝建\t389610\n通州\t389611\n记闻\t389612\n蛮力法\t389613\ncxGrid\t389614\n易久堂\t389615\n我的一个道姑朋友\t389616\n东京白日梦女\t389617\n炎神\t389618\njail\t389619\n克瑞斯\t389620\n澳门特别行政区政府\t389621\n再鼎医药\t389622\n南加大\t389623\nCreateThread\t389624\n淄博电视台\t389625\n徐文兵\t389626\n朝西\t389627\n孟州市委政府\t389628\nNe\t389629\n重接\t389630\n步进电机控制系统\t389631\nGrimm\t389632\n春北师大\t389633\n公司的力量\t389634\nwin10图片查看器\t389635\n看不到\t389636\n拒绝者\t389637\n醉金迷\t389638\nm13\t389639\n海隆\t389640\n定量分析\t389641\n连南县\t389642\n精益化\t389643\nAPLUSVABLE\t389644\nN5110\t389645\nPhotoView\t389646\n餐梯\t389647\n局属\t389648\n大迁移\t389649\nlacie\t389650\n画竹\t389651\n可导\t389652\n仙灵觉醒\t389653\n读后感作文\t389654\n7600k\t389655\n死性不改\t389656\n系统族\t389657\ndetail\t389658\nTRANS\t389659\n安怡\t389660\n成都新机场\t389661\n南湖半岛\t389662\n无修\t389663\n18041期\t389664\n梦幻成就吧\t389665\n王建中\t389666\nアンスイ\t389667\n九阶\t389668\n西皮\t389669\n邪鬼\t389670\nexcle2016\t389671\n好棒\t389672\n哈尔滨工业大学出版社\t389673\nnv21\t389674\n红热\t389675\n法华\t389676\n几封\t389677\n是谁\t389678\nKn
oll\t389679\n网页编码\t389680\n鸟团\t389681\n凯千Karryon\t389682\n小红书\t389683\n水径\t389684\n改衣\t389685\n赵云子龙\t389686\n焊炬\t389687\n万户\t389688\n封锁线\t389689\n最近30天\t389690\n荣威rx8\t389691\ndmmd\t389692\n松药店\t389693\n拂晓新闻网\t389694\n第三届\t389695\n休思\t389696\nsilky\t389697\n寓言两则\t389698\n河东软件园\t389699\n工信局\t389700\n红痒\t389701\n钓台\t389702\nSuperstore\t389703\n汤姆索亚历险记\t389704\n表情包大战\t389705\n三潭\t389706\n家用流式\t389707\n代帐\t389708\n浸渗\t389709\nCHEF\t389710\n弾\t389711\nchro\t389712\n夜這\t389713\n鱼板\t389714\n3-6个月\t389715\n散尽\t389716\n佳惠\t389717\n发如雪\t389718\n9F\t389719\n前因\t389720\n三日游\t389721\n蓬皮杜\t389722\nEVIEWS\t389723\ncaves\t389724\n意大利人\t389725\n三层楼\t389726\n巨讯网\t389727\npatients\t389728\n江西新闻网\t389729\n岳塘国际商贸城\t389730\n吊筋\t389731\n百度云油猴\t389732\n之二十八\t389733\n汉派\t389734\n101条\t389735\n总揽全局\t389736\n霸气\t389737\n公职律师\t389738\n复星国际\t389739\n兰光\t389740\nci7\t389741\n皓月当空\t389742\n核子\t389743\n西市\t389744\n陶永吉\t389745\n床褥\t389746\n目指\t389747\n绑架犯\t389748\n双氯芬酸钠\t389749\nVESA\t389750\n浑然一体\t389751\nI型\t389752\n成全\t389753\n苏州市职业大学\t389754\n广钦\t389755\n守志\t389756\n地理学报\t389757\ncircos\t389758\n公共产品\t389759\n云存储系统\t389760\n福克斯\t389761\nDatabases\t389762\n昨天早上\t389763\n签唱会\t389764\n顿首\t389765\n模板&#41\t389766\n抠逼\t389767\n铂尔曼酒店\t389768\nmacvim\t389769\n大境门\t389770\n人民币同号钞\t389771\ncoremail\t389772\nEngines\t389773\n北倾\t389774\n盛京\t389775\n潞城\t389776\n桂心\t389777\np50\t389778\n路客\t389779\n72号\t389780\n纰漏\t389781\nar15\t389782\nserversocket\t389783\njlu\t389784\n广州地铁1号线\t389785\n电筒\t389786\n辽宁省实验中学\t389787\n木槿花\t389788\nFSU\t389789\n钟秉林\t389790\n电铃\t389791\n天津北站\t389792\n44.1khz\t389793\n扬沙\t389794\n万辉\t389795\n徐舍镇\t389796\nbiye\t389797\n潘多\t389798\n阿莱\t389799\n红运\t389800\n152年\t389801\ncav\t389802\nxiaofeng\t389803\n单向街\t389804\n上人\t389805\n78\t389806\n秋影\t389807\n内子\t389808\n毛果芸\t389809\n明治\t389810\nThom\t389811\n纽迪希亚\t389812\n财会业\t389813\n翘首以盼\t389814\n朝阳村\t389815\n独守\t389816\n刘辰希\t389817\n争做\t389818\n叶少兰\t389819\n主持\t389820\nwin10教育版\t389821\nhistoric\t389822\n多游网游戏攻略大全\t389823\n35处\t389824\n米莱
狄\t389825\n暴风集团\t389826\n呼伦贝尔市人民政府\t389827\n百优卡\t389828\ninverse\t389829\n首山\t389830\n辽宁政府\t389831\n父本\t389832\n2016卷\t389833\nSubstrate\t389834\n小舞\t389835\n范冰郭敬明\t389836\n尿毒症\t389837\n厚着脸皮\t389838\n参考点\t389839\naden\t389840\n款型\t389841\nAndroid-x86\t389842\n畅捷通t+\t389843\nBajrangi\t389844\n铁玉\t389845\n公安县人民政府\t389846\n昂克赛拉吧\t389847\nptb\t389848\n鸭王\t389849\nFTMS\t389850\n150层\t389851\n晋东南\t389852\n劲肋\t389853\n任志\t389854\n新四军纪念馆\t389855\n女套\t389856\nH1b\t389857\n使得\t389858\n制单\t389859\n叉字蝠\t389860\n神乐坂真冬\t389861\n国鑫\t389862\n疯了一样\t389863\ngaozhong\t389864\n720\t389865\n优房\t389866\n川周公路\t389867\n安记\t389868\n95516\t389869\n巴沟\t389870\nLijiang\t389871\n中国侨网\t389872\n撸哥\t389873\n日粮\t389874\n物电学院\t389875\n腰果\t389876\n釉面\t389877\n派股\t389878\n家乡味\t389879\n权倾三国\t389880\n迫害\t389881\n交银\t389882\n孤月\t389883\n韩援\t389884\n几角\t389885\n3168\t389886\n康龙\t389887\nCCTV-10科教\t389888\nStarWind\t389889\n消防局\t389890\nHomeless\t389891\n横林\t389892\nComprehensive\t389893\netd\t389894\n嶋田琴美\t389895\n韩天宇\t389896\ntabular\t389897\n像素级\t389898\nF-1\t389899\n复方斑蝥胶囊\t389900\n金牛座女\t389901\n1942年\t389902\n50年\t389903\n1889年\t389904\n局党组\t389905\n风城\t389906\n中国海口政府\t389907\n任城网\t389908\n水泽\t389909\nkingston\t389910\nwesley\t389911\nTori\t389912\n计缴\t389913\nmoto360\t389914\n福州市城乡规划局\t389915\n纪念片\t389916\n滨江森林公园\t389917\nemui8\t389918\n英菲尼迪q50l\t389919\nurge\t389920\n北京招商银行\t389921\n露娜luna\t389922\n涧西区\t389923\nvv7s\t389924\n牛牌\t389925\n党代\t389926\n脑筋\t389927\nST三维\t389928\nKTH\t389929\n21篇\t389930\nLighttpd\t389931\n漫画书\t389932\n连续技\t389933\n泰尔\t389934\n欧洲中央银行\t389935\n香樟大道\t389936\n19代\t389937\n北京外国语大学\t389938\n5辆\t389939\n拜神\t389940\n高一个\t389941\nffbe\t389942\nA2O\t389943\n进来看\t389944\nPersuasive\t389945\n魔币\t389946\n簪缨\t389947\n中尼\t389948\n罗鹏\t389949\n潜水泵\t389950\navmoo\t389951\n诺克萨斯之手\t389952\nimac吧\t389953\n长春西汀\t389954\n抬\t389955\n感染型\t389956\n国羽\t389957\n_u\t389958\nphysically\t389959\nMPICH2\t389960\n周海燕\t389961\n杨靖\t389962\n孙式太极拳\t389963\n000697\t389964\nオフィシャルサイト\t389965\n波管\t389966\n佐治\t
389967\n个人征信记录\t389968\nai文件查看器\t389969\n藏羚羊\t389970\n结节性\t389971\nends\t389972\n梁宁\t389973\n炎症\t389974\n屌丝\t389975\n场口镇\t389976\n华服\t389977\n付诸行动\t389978\n朱新建\t389979\n000540\t389980\n最后阶段\t389981\nyaoyao\t389982\nActiveForm\t389983\n码限额\t389984\n保险期\t389985\n150句\t389986\n蓝豹\t389987\nBD-RMVB.720p\t389988\n中国饭店协会\t389989\n053182918609\t389990\nセックス\t389991\n大拇哥\t389992\n肥沃\t389993\nVentura\t389994\n北京地铁2号线\t389995\n浓妆艳抹\t389996\n象屿保税区\t389997\n贴下\t389998\n膈\t389999\nMATERIAL\t390000\n奔腾x40\t390001\n故友\t390002\n大表\t390003\n李文博\t390004\n父子类\t390005\n竞配\t390006\n补位\t390007\n新北洋\t390008\n罕见\t390009\n达照法师\t390010\n减速\t390011\n10色\t390012\nnuskin\t390013\n怪话\t390014\n1400万元\t390015\n学而无\t390016\n分子动理论\t390017\n无法解释\t390018\n萝丽\t390019\n区科委\t390020\n亢龙\t390021\n御三家\t390022\nAPTX\t390023\n千奇百怪\t390024\n强受\t390025\n细思极恐\t390026\n中国建筑第三工程局有限公司\t390027\n5980\t390028\n高家\t390029\nxxl\t390030\n惬意\t390031\noffic2016\t390032\nsummaries\t390033\nDong\t390034\n道夫\t390035\n间期\t390036\n入盟\t390037\n彩钢瓦\t390038\n己烷\t390039\n股市指数\t390040\n风险抵押金\t390041\nLtda\t390042\n弗拉基米尔\t390043\n130%\t390044\n越狱兔\t390045\n德鑫\t390046\n第68章\t390047\n做饭网\t390048\nsoftbank\t390049\n361\t390050\n全自动咖啡机\t390051\n普陀山\t390052\n左溢\t390053\n静步\t390054\n中关村科学城\t390055\n上百万\t390056\n破蛋\t390057\n夷陵中学\t390058\n早午\t390059\nQ群\t390060\n绦\t390061\n万古\t390062\n三红\t390063\n枯叶\t390064\nHU\t390065\ncovered\t390066\nKATE\t390067\n广西大学商学院\t390068\n5.6.27\t390069\n公称压力\t390070\n莆田安福\t390071\n大泉州\t390072\n180次\t390073\n诞生日\t390074\n蹲踞式\t390075\n抗疲劳\t390076\n巨星们\t390077\n复数名词\t390078\n南十字星\t390079\n百旺税控盘\t390080\n中尉\t390081\n向天歌\t390082\n32bit\t390083\n伟峰\t390084\n辞朝\t390085\n楼兰\t390086\nJPQL\t390087\n迦兰\t390088\n上方\t390089\n雕牌洗衣粉\t390090\n加格达奇区\t390091\nAccount\t390092\n诰命\t390093\n激流网\t390094\nA-Level\t390095\n篷车\t390096\nJoint\t390097\n吉永\t390098\n李娜\t390099\n永言\t390100\n转基因油\t390101\n+2\t390102\n梦幻封妖传5\t390103\n抽\t390104\n中山港口镇\t390105\n2.js\t390106\n猎德花园\t390107\naao\t390108\n名作欣赏\t390109\nP2P理财主页_网贷天眼\t390110\n抹除\t39011
1\n应收账款\t390112\nbus\t390113\n红烧带鱼\t390114\n东营银行\t390115\n舅舅\t390116\n大炼\t390117\nDL\t390118\n嘉裕\t390119\n榫头\t390120\n专配\t390121\n散讲\t390122\n好天气网\t390123\nMapView\t390124\n百将行\t390125\n躲过\t390126\n九龙广场\t390127\n上周日\t390128\n满铺\t390129\nlouboutin\t390130\n华漕镇\t390131\n全元\t390132\n北京鼓楼\t390133\nczsx\t390134\n查字\t390135\n剪裁\t390136\npar\t390137\n甲状旁腺激素\t390138\nRunner\t390139\n798元\t390140\nEB\t390141\n大阴\t390142\n炸房\t390143\nFiiO\t390144\n希赛\t390145\n浙大医学院\t390146\n动态片\t390147\n朱广沪\t390148\nfaxuan\t390149\nImpairment\t390150\n菲律宾语\t390151\nuint16\t390152\n起眼\t390153\n精灵宝可梦:太阳/月亮\t390154\n影响研究\t390155\n美人图\t390156\n4AT\t390157\n四学\t390158\n副局长\t390159\n张子健\t390160\n帮教\t390161\ngoogle拼音输入法\t390162\n风干肉\t390163\n李佳宁\t390164\n曹平\t390165\n紫砂杯\t390166\npptx\t390167\n南水北调\t390168\n延续性\t390169\n黄金城道\t390170\n阿瓦提\t390171\nPao\t390172\n武汉东西湖\t390173\n郑州大学一附院\t390174\n拔罐器\t390175\nFSAE\t390176\n银保部\t390177\nsharing\t390178\n0577\t390179\n引以为戒\t390180\n苏州建设银行\t390181\n环球收藏网\t390182\n泡泡糖\t390183\nJDK6\t390184\n惠选\t390185\n化学气相沉积法\t390186\n爱斯特\t390187\n颐和园\t390188\n国际文化交流学院\t390189\n小米2s\t390190\n净光合速率\t390191\n6666\t390192\n驱启动\t390193\n未有\t390194\n乐正龙牙\t390195\nappcrash\t390196\n宛若\t390197\n自考心理学\t390198\nlandlord\t390199\n二三本\t390200\n营门口\t390201\nCreo4.0\t390202\n同价\t390203\n2.9%\t390204\ncello\t390205\nkind\t390206\n天家\t390207\n90多年\t390208\nGeoscience\t390209\n志愿书\t390210\nchian\t390211\naid\t390212\nammonia\t390213\n周鹏飞\t390214\n罪大恶极\t390215\n文昌镇\t390216\n药历\t390217\n第四子\t390218\nvc2017\t390219\n仓\t390220\n金融机\t390221\n摩雷\t390222\n销售地\t390223\n应用汇安卓\t390224\n大女人\t390225\n驾驶本\t390226\n财务报告准则\t390227\n宫村\t390228\n中华人民共和国固体废物污染环境防治法\t390229\nsuqqu粉霜\t390230\n图谋\t390231\n炉子\t390232\n黑色素\t390233\n子壳粉\t390234\nC级车\t390235\n2018-03-02\t390236\n护身符\t390237\nholiday\t390238\n制符\t390239\n巴黎野玫瑰\t390240\n紫花地丁\t390241\naluminium\t390242\n莱山政府网\t390243\n绿线\t390244\n纂\t390245\n物美集团\t390246\n价码\t390247\n2074\t390248\n过街老鼠\t390249\nG4\t390250\n电光源\t390251\n直考\t390252\n赦免者\t390253\n华夏移民中介公司\
t390254\n闽南语\t390255\n150M\t390256\n国梁\t390257\n工频\t390258\n九洲恒昌\t390259\n7k7k英雄联盟\t390260\n首招\t390261\n惠州北站\t390262\n肺腑之言\t390263\n三十六条\t390264\nLAV\t390265\n炼成\t390266\n福建省龙岩市\t390267\n美丽人生\t390268\n自酿\t390269\n2017年10月27日\t390270\n炸油\t390271\nAptana\t390272\n印语\t390273\n50000元\t390274\n广州城中村\t390275\n金海湖\t390276\n沈丘\t390277\nR14\t390278\n新藏\t390279\n网点\t390280\n江西省儿童医院\t390281\n冷巴\t390282\n剑湖\t390283\nquestionnaire\t390284\n北京二区\t390285\n西岸\t390286\n通用对话框\t390287\n究极体\t390288\nBreakdown\t390289\nCREATOR\t390290\n顾子明\t390291\n饼干\t390292\n不锈钢板\t390293\n浙江大学管理学院\t390294\n秉持\t390295\n桑普拉斯\t390296\n可_寻医问药网\t390297\n陇海\t390298\nMuleSoft\t390299\n简体版\t390300\ncolorpicker\t390301\n加热式\t390302\n187b\t390303\n金贝贝\t390304\n社稷坛\t390305\n七言诗\t390306\n苏堤\t390307\nexpander\t390308\n财务部\t390309\n168_\t390310\n0632\t390311\n5270\t390312\n前腰\t390313\n樱桃花\t390314\n学思践\t390315\n心灵手巧\t390316\n绍兴职业技术学院\t390317\n自行车轮\t390318\n画图函数plot\t390319\n知不知道\t390320\n摹本\t390321\n国乐大典_荔枝网\t390322\n方根\t390323\n高鸣\t390324\n大通D90\t390325\n盈利\t390326\n虹软\t390327\n2.76\t390328\n凌鹰\t390329\n咳嗽声\t390330\n螺内酯片\t390331\n回路板\t390332\n轻钢厂房\t390333\n佳能G3800\t390334\n防红\t390335\n不知处\t390336\n雷克萨斯ES300h\t390337\n费舍\t390338\n7.3兽\t390339\nManon\t390340\n亚都\t390341\n二标\t390342\n即墨温泉\t390343\nYy\t390344\nps存储\t390345\n色样\t390346\n外存储器\t390347\n引出\t390348\n焦大\t390349\n郑州一中\t390350\n社保公司\t390351\n差不多先生\t390352\n人身\t390353\nMPD\t390354\n永乐农庄\t390355\n大富翁4\t390356\net200\t390357\nhttps\t390358\n睿途\t390359\nzjut\t390360\n上海大田阀门管道工程有限公司\t390361\n仙谷\t390362\n裁定\t390363\n雨情\t390364\n偶像剧\t390365\n哈尔滨信息工程学院\t390366\n活佛\t390367\nmediacodec\t390368\n澳大利亚公司\t390369\ncys\t390370\nHas\t390371\nXTC\t390372\n微信公众号流量主\t390373\n官德\t390374\n普利司通轮胎\t390375\n共有\t390376\n成都恒大\t390377\nipz\t390378\nブラ\t390379\n女史\t390380\npreparations\t390381\n_吾爱诗经网\t390382\n争端\t390383\n交班\t390384\n扰\t390385\n泽诺尼亚5\t390386\n清早期\t390387\n免费法律咨询-华律网\t390388\nixia\t390389\n吕蒙\t390390\n钢锁\t390391\n动态变量\t390392\n锣鼓\t390393\nCCTV新闻频道\t390394\n嫉恨\t390395\n虎背熊
腰\t390396\nn9\t390397\n蛋糕裙\t390398\n传祺GA5\t390399\n江汉人才网\t390400\n大家伙\t390401\n防冻剂\t390402\n中科云网\t390403\nY66\t390404\ndlna\t390405\n世茂\t390406\nCastle\t390407\n吴涛\t390408\n蔡自兴\t390409\nNavigator\t390410\n天悦\t390411\n666\t390412\n浓香\t390413\n超额累进税率\t390414\n无纸化学法\t390415\n肾积水\t390416\n沈阳市教育局\t390417\n武汉远成共创科技有限公司\t390418\n频分\t390419\n杨雯\t390420\n百合干\t390421\nt台\t390422\n粤港澳大湾区\t390423\n风油\t390424\n通长筋\t390425\n简繁体\t390426\n小司\t390427\n丰隆\t390428\n稳住\t390429\n中泽\t390430\nstmp\t390431\n常任理事国\t390432\npsychological\t390433\n结算部\t390434\nfaced\t390435\n英雄杀手游\t390436\n第200章\t390437\n小海地\t390438\n卫生纸机\t390439\n新仙鹤神针\t390440\n2350\t390441\n出自\t390442\nhamming\t390443\ntickling\t390444\n11kw\t390445\n1863\t390446\n亡命之徒\t390447\n黑将S5\t390448\n惹众怒\t390449\n钒\t390450\nclevo\t390451\nPSCC\t390452\n大鸿\t390453\n30k\t390454\n灰原哀\t390455\n会声会影x10\t390456\n新流星搜剑录\t390457\n五味杂陈\t390458\n冥狱锁魂\t390459\nEmilia\t390460\njenner\t390461\n刃\t390462\n撸片\t390463\n八阿哥\t390464\nva\t390465\nPLsql\t390466\nplaster\t390467\n流川枫\t390468\n海洋展\t390469\n相冲\t390470\n工藤静香\t390471\n约起\t390472\nFrameset\t390473\npart3\t390474\n1000立方\t390475\n渭河\t390476\n谢国陶渊明\t390477\n5万套\t390478\nuImage\t390479\n骇客神条\t390480\n郑州市儿童医院\t390481\n湖南外贸职业学院\t390482\nresh\t390483\ntenda\t390484\n歌手们\t390485\n几百块\t390486\nTEN\t390487\n南京东\t390488\n磺胺类\t390489\n百信广场\t390490\n不正经\t390491\nproductive\t390492\n梦舞遮天\t390493\n中效\t390494\n天武\t390495\nionic\t390496\n市市场监督管理局\t390497\n路况\t390498\nauditing\t390499\n三第五\t390500\n新济公活佛\t390501\n行事历\t390502\n奔流\t390503\n演录\t390504\n山西省人力资源和社会保障厅\t390505\nmsleep\t390506\n止痒\t390507\n直经\t390508\n阳平关\t390509\n唐宋八大家\t390510\n淡入\t390511\n人阵\t390512\nsafenet\t390513\n专利战\t390514\n弹性工作制\t390515\n清浦\t390516\n采集仪\t390517\n1-10\t390518\n16秋\t390519\n300413\t390520\n约翰·塞纳\t390521\n快餐店\t390522\n24段\t390523\n电子锁\t390524\nLinyi\t390525\n脆性\t390526\n互动作业|作业互助组\t390527\n百世汇通快递\t390528\n玩樂\t390529\nsh60\t390530\n保利西海岸\t390531\n木柄\t390532\n艾滋病疫苗\t390533\n内观\t390534\n曹轩宾\t390535\nDEST\t390536\n中国移动通信集团浙江有限公司\
t390537\n往下\t390538\n画中心\t390539\n5.7.13\t390540\n儒道至圣\t390541\n19世纪60年代\t390542\n率领\t390543\n点赞\t390544\n黑熊\t390545\n个人房源网\t390546\n中国合格评定国家认可委员会\t390547\n本田CR-V论\t390548\nV3.8\t390549\n定南县人民政府\t390550\njetbrains\t390551\nwww.174399.com\t390552\n星野龙\t390553\n卢恩\t390554\nc91\t390555\nThinking\t390556\n4l\t390557\n蓝鳍\t390558\nnbyz\t390559\n维特塞尔\t390560\n乳品\t390561\ninjected\t390562\n天津广播电视台\t390563\nLetme\t390564\n疥虫\t390565\nLR11\t390566\n夺吻\t390567\n金瓶梅杨思敏版\t390568\nFrankenstein\t390569\n梁平区\t390570\n首销\t390571\nAttorney\t390572\n荥阳\t390573\n冉冉\t390574\n微讲座\t390575\n听妈妈的话\t390576\n3240\t390577\n陈斌\t390578\n公椅\t390579\nlol鸡里奥宝典奖励\t390580\n王者荣\t390581\nWikivoyage\t390582\n110%\t390583\n颐和园路5号\t390584\n微信企业号\t390585\n支装\t390586\n华中农大\t390587\n88条\t390588\n明也\t390589\ntiesto\t390590\n第八册\t390591\n浓缩仪\t390592\nindigo\t390593\n邓普云\t390594\n中国国际动漫节\t390595\n04.23\t390596\nlovel\t390597\n唱诗\t390598\n詹妮弗·安妮斯顿\t390599\nread_table\t390600\n道教科仪\t390601\n电泵\t390602\n8.0.0\t390603\n南昌国家经济技术开发区\t390604\n拉普拉斯\t390605\n我喜欢书\t390606\n预灌封\t390607\nzhuanye\t390608\n0.15%\t390609\nV5.2\t390610\n有特服\t390611\npy2exe\t390612\n英名\t390613\n老毛\t390614\n16芯\t390615\n点痣\t390616\n霉菌\t390617\n平安区\t390618\n优秀村\t390619\n约束性\t390620\n龙皇\t390621\n前倾\t390622\n手动变速箱油\t390623\n京佳\t390624\n肥羊\t390625\nprevious\t390626\n自救\t390627\n贵阳市城乡规划局\t390628\n冒险岛手游\t390629\n徐州市人民政府\t390630\n太阳骑士\t390631\n那个男孩\t390632\n朗读女\t390633\npplong\t390634\n李东学\t390635\n截面法\t390636\n红彤彤\t390637\n试探\t390638\n哈罗CQ火腿\t390639\n功夫派\t390640\n射精量\t390641\n鄄城\t390642\n四月一日\t390643\nreuse\t390644\n吴兴\t390645\n中国科学技术大学图书馆\t390646\n浩然气\t390647\n完颜\t390648\nPEP版\t390649\n金钱树\t390650\n朴振英\t390651\n水立方嬉水乐园\t390652\n51同城网\t390653\nDistribution\t390654\n平心而论\t390655\n14句\t390656\n就地\t390657\n盛世收藏论坛\t390658\n行测+申论\t390659\n万庆良流毒\t390660\n柿饼\t390661\nCase\t390662\n选定\t390663\n微滤\t390664\n點都德\t390665\nG家\t390666\n热负荷\t390667\n万安路\t390668\n吴永麟\t390669\n元气骑士吧\t390670\n600879\t390671\n宛香\t390672\n铜人\t390673\n230万\t390674\n桐岛\t390675\n蓝导航\t390676
\n电壁挂炉\t390677\n洪塘街道\t390678\n抗美援越\t390679\n应运\t390680\n多元线性回归\t390681\n前9月\t390682\nLaunchPad\t390683\n春申君\t390684\nvania\t390685\ncriterion\t390686\n干拌砂浆\t390687\n斑斑\t390688\n父组件向子组件\t390689\n官媒\t390690\n南翠屏公园\t390691\n李攀\t390692\nunfair\t390693\n抱不平\t390694\nWIN2003\t390695\nunmatched\t390696\n鹅肠\t390697\n李欣汝\t390698\nSDMU\t390699\n律师\t390700\n亿贝\t390701\n1.8英寸\t390702\n护工费\t390703\n江苏省人民政府\t390704\n国职\t390705\n李冬\t390706\n220号\t390707\n即墨\t390708\n李大师\t390709\n品案\t390710\n12点\t390711\nwarmly\t390712\n赵治勋\t390713\n玛戈皇后\t390714\n答应\t390715\n通体\t390716\n81号\t390717\n外相\t390718\n哄骗\t390719\n豹人\t390720\n臂架\t390721\n800亿元\t390722\n无锡尚德\t390723\n浓缩池\t390724\n新途径\t390725\n防变\t390726\nLC\t390727\n南子\t390728\nstretched\t390729\n恋乡\t390730\ntyloo\t390731\n4.36\t390732\n米娜时尚网\t390733\n换油\t390734\n黑道圣徒2\t390735\n会馆\t390736\n提取率\t390737\n豪曹\t390738\ninfowindow\t390739\n荚膜\t390740\n2016年6月15日\t390741\n商录\t390742\n金恪\t390743\n冬冷地区\t390744\n笑笑星儿\t390745\n两栏\t390746\n鲍女\t390747\n經典\t390748\n东兰路\t390749\n南京城墙\t390750\n办法\t390751\n零口\t390752\n360网页游戏\t390753\npixiv\t390754\n降下\t390755\n袁俊\t390756\nparents\t390757\n维新变法\t390758\n焦作网\t390759\n诛仙台\t390760\n塞戈维亚\t390761\n伊利诺伊\t390762\n乔纳森·诺兰\t390763\nZest\t390764\n天润乳业\t390765\n荷斯坦网\t390766\nrefrigeration\t390767\n茂名市区\t390768\n0.12.1\t390769\n1.16G\t390770\n焦镜\t390771\n山坑\t390772\n菏泽市牡丹区人民政府\t390773\n博天\t390774\n大势所趋\t390775\n倚天\t390776\n希沃白板5\t390777\n吃力\t390778\n中电国际\t390779\n116平米\t390780\n2016-2023年\t390781\n随行就市\t390782\n刘病已\t390783\n四艺\t390784\n209号\t390785\n国旅在线\t390786\n第七十二条\t390787\nU2718Q\t390788\n慢\t390789\nThirty\t390790\n鱼肚\t390791\n81681069\t390792\nf551\t390793\n一开\t390794\n潘霜霜\t390795\n逼毛\t390796\n并继\t390797\n枪神传奇\t390798\n欧舒丹\t390799\n半老徐娘\t390800\n速度与激情5\t390801\n重庆解放碑威斯汀酒店\t390802\n景秀\t390803\n吊机\t390804\nCORP\t390805\n351\t390806\n1069070069\t390807\n绿山\t390808\n卫生间\t390809\n贾斯汀·比伯\t390810\n远游\t390811\n闪电借款\t390812\n四千多\t390813\n浙江省立同德医院\t390814\n五校\t390815\n席恩\t390816\nsvd\t390817\nhof\t390818\n锡克\t390819\n济南市旅游发
展委员会\t390820\n简正\t390821\n地外\t390822\n一波未平\t390823\n监禁时间\t390824\n恋人\t390825\n北京快递\t390826\n第1册\t390827\nCoser\t390828\n大鱼大肉\t390829\n锚地\t390830\n民办教育网\t390831\napikey\t390832\n系法图解\t390833\nLyrics\t390834\n布袋戏\t390835\n波拉特\t390836\nNC\t390837\nng-options\t390838\n黄钻\t390839\n赫尔辛基大学\t390840\nwitness\t390841\n老式\t390842\nbootmgr\t390843\n多屏\t390844\n港珠澳大桥管理局\t390845\ncarving\t390846\n男亲女\t390847\n陶瓷杯\t390848\n360万\t390849\n冲高回落\t390850\n观研\t390851\n且看\t390852\n跑步者\t390853\n靡\t390854\n清兵\t390855\n橙光游戏中心\t390856\n速尔物流\t390857\n小蟹\t390858\nSweetAlert2\t390859\n卢平\t390860\n扬中市\t390861\n小瓷\t390862\n波切\t390863\n我的老师\t390864\n乙酸正丁酯\t390865\n六桂福珠宝\t390866\n各镇街\t390867\ntda\t390868\n22座\t390869\n水平度\t390870\n凤还朝\t390871\n乔山\t390872\n西弗吉尼亚\t390873\n停业\t390874\n不容置疑\t390875\n铅皮座\t390876\nenriched\t390877\n2.7.13\t390878\n50章\t390879\nlinnx\t390880\n5.6%\t390881\n轮博\t390882\n一教\t390883\n基金买卖网\t390884\nEPERM\t390885\n魅族PRO6\t390886\n燃油添加剂\t390887\n金融学\t390888\n海鲜楼\t390889\n短视\t390890\n借东风\t390891\n奏效\t390892\n外在\t390893\n两只手\t390894\nkisso\t390895\n御景豪庭\t390896\nCUE\t390897\n刪除\t390898\n22:30\t390899\n班干\t390900\n大连医院\t390901\n常态化\t390902\n方舱\t390903\n参议员\t390904\n一汽马自达\t390905\n南明湖\t390906\n家电展\t390907\n豆腐丝\t390908\n开卷机\t390909\n四月八\t390910\n水污\t390911\nmdr-1000x\t390912\n青云诀\t390913\nsbb\t390914\n常闭\t390915\n蛔虫病\t390916\n录音室\t390917\n海龟汤\t390918\n盗刷\t390919\nMacbeth\t390920\n贵港港北区\t390921\ncmd命令行下\t390922\n算\t390923\nwvs\t390924\n华森\t390925\n奇多\t390926\n18升\t390927\n改判\t390928\n胃酸\t390929\n赋诗\t390930\n乐善\t390931\n捕蛇者\t390932\n马友鱼\t390933\n乌冬\t390934\n比稿\t390935\n图形计算器\t390936\n变速箱油\t390937\n负义\t390938\n言希\t390939\n史立军\t390940\n代理记帐\t390941\n5122\t390942\nlocalhost8080\t390943\n真猪\t390944\n子列\t390945\n红绿\t390946\n头身\t390947\nglobalmapper\t390948\n驼骨\t390949\n绝世求魔\t390950\n埃神\t390951\n沪语\t390952\nNAS服务器\t390953\n杨庆煌\t390954\n蚁\t390955\n珠江公园\t390956\n伽卡他卡\t390957\nloads\t390958\n向往的生活第二季\t390959\n经销处\t390960\n舵主\t390961\n金湖县委组织部\t390962\nE调\t390963\n哮喘日\t390964\n张五常\t390965\n油老虎\t
390966\n宝理\t390967\n活木\t390968\n新田县人民政府\t390969\nNovation\t390970\nZBGB\t390971\n模型\t390972\n碘液\t390973\nGATEWAY\t390974\n江大桥\t390975\n这一战\t390976\ngtags\t390977\ntar.bz2\t390978\n杨妃\t390979\n杭州育才小学\t390980\nyukon\t390981\n新疆医科大学\t390982\nscie\t390983\n凹透镜\t390984\n侵袭\t390985\n还珠\t390986\nweiphp\t390987\n舌下腺\t390988\n式法\t390989\n唱支\t390990\n贾克斯\t390991\nvape\t390992\n紫金农商银行\t390993\n_步\t390994\n露台花园\t390995\n伸缩杆\t390996\n凉薄\t390997\n挡杆\t390998\n淘气\t390999\n计算机应用\t391000\n吕凉\t391001\n黄财神\t391002\n藤三七\t391003\ngelbooru\t391004\n在旁\t391005\n四渡赤水出奇兵\t391006\n袁野\t391007\n黄金口\t391008\n抄袭案\t391009\n东河\t391010\n1000分钟\t391011\n南京电视台\t391012\nbarley\t391013\n不懂装懂\t391014\n计算期\t391015\n文明办\t391016\n红烧狮子头\t391017\n爱在何方\t391018\n爱奇艺号\t391019\n滔天罪行\t391020\n滕代远\t391021\n杂点\t391022\n扁形\t391023\nMLCC\t391024\n褐色斑\t391025\nosd\t391026\n动账\t391027\ntransparency\t391028\n服装设计\t391029\n减胸\t391030\nphtoshop\t391031\n0311-6677xxxx\t391032\n中东路\t391033\nB超检查\t391034\n109号\t391035\n艺卓\t391036\nCube\t391037\nTv\t391038\n山河表\t391039\n魔理沙\t391040\n漏掉\t391041\n滨江豪园\t391042\n百度云|\t391043\nunderstand\t391044\n韩彩英\t391045\n双杰\t391046\n发行员\t391047\n周滨\t391048\nLiquid\t391049\n塔吉特\t391050\n美貌\t391051\n茶啊二中\t391052\n公文包\t391053\n交汇点\t391054\n重彩\t391055\nyuedu\t391056\ncrashed\t391057\n脑心通\t391058\nconsignment\t391059\naiai\t391060\n肖钢\t391061\nWT\t391062\nE-mail\t391063\n加肥\t391064\n美目\t391065\n黑式\t391066\n星空传说\t391067\ntitlelen\t391068\n完美汉化版\t391069\ncgwall\t391070\n抗洪\t391071\n刘书记\t391072\n封喉\t391073\nfis3\t391074\nVOICE\t391075\n陆地方舟\t391076\nmonjeep\t391077\n砸毁\t391078\nopenmesh\t391079\n新吴区\t391080\n百度吧_\t391081\n坎宫\t391082\n羽绒\t391083\n伊人综合_伊人成人网\t391084\n十二指肠溃疡\t391085\n北京中心\t391086\nRonny丶\t391087\n14码\t391088\n损管\t391089\n基洛夫\t391090\n不得不说\t391091\n2014年01月05日\t391092\n猿猴\t391093\n华语片\t391094\n技能赛\t391095\ntt\t391096\n众泰宝马x6\t391097\n帕露菲\t391098\n正派\t391099\n公文网\t391100\n涨\t391101\n搜索排序\t391102\n头目\t391103\n轮渡\t391104\n观赛\t391105\n昆汀塔伦蒂诺\t391106\n石湖荡镇\t391107\n如家\t391108\n电插锁\t391109\n铁凝\
t391110\nWAS\t391111\n客量\t391112\njsl\t391113\n有机玻璃板\t391114\nMc喊麦网\t391115\n先哲\t391116\n婷婷\t391117\nMPlayerX\t391118\n扫描式\t391119\n络\t391120\n阻车器\t391121\n复兴门\t391122\n思瑞康\t391123\n虹莘路\t391124\n0.0.5\t391125\n择善\t391126\n湖南小学\t391127\n溪地\t391128\n月下美人来花朝梦三生\t391129\nOL3\t391130\n838路\t391131\n吴楚\t391132\n潘绥铭\t391133\n集镇\t391134\nxaing\t391135\n头\t391136\n共夫\t391137\n心死\t391138\nHDTV-MP4\t391139\nOpenText\t391140\n苏州市民政局\t391141\n刀光枪影\t391142\n中国科学院工程热物理研究所\t391143\n学后教\t391144\nnra\t391145\n云南卫视\t391146\n金华房产超市网\t391147\n3.12\t391148\nechat\t391149\nVeritas\t391150\n环世界b18\t391151\n浙江广厦队\t391152\n三峡新能源\t391153\n20160809\t391154\n孟塞尔\t391155\nOlympia\t391156\n保存\t391157\n使节\t391158\n天涯明月刀ol真武吧\t391159\n蟾衣\t391160\n北京市盈科\t391161\n独家报道\t391162\n瞒天过海\t391163\n大乐透\t391164\n建邺\t391165\n象山县\t391166\n求魔\t391167\n马自达6\t391168\nハロ\t391169\n全战\t391170\n【岚\t391171\n爱回家之开心速递\t391172\n伍华聪\t391173\n工农路\t391174\n人民陪审员法\t391175\n联发\t391176\nSoftware\t391177\n不锈钢过滤器\t391178\n张东辉\t391179\n禽肉\t391180\n电子气\t391181\n17话\t391182\nJae\t391183\n晓雪\t391184\nm17\t391185\n20171207\t391186\n5.7.9\t391187\nav女优\t391188\n坡形\t391189\n汉生\t391190\nmintab\t391191\n4艘\t391192\nJOY\t391193\n洒水器\t391194\n辕门\t391195\n湘电集团\t391196\n第十九个\t391197\n球面\t391198\n防裂\t391199\nentity\t391200\n5850\t391201\n首师大\t391202\nviv\t391203\n象棋\t391204\nJitter\t391205\n淘宝花呗\t391206\n莫欺\t391207\n付临门\t391208\n王诚\t391209\nEMU\t391210\n宝马740\t391211\nGrading\t391212\nsexe\t391213\n常州市人力资源和社会保障局\t391214\n品鉴\t391215\n联讯证券\t391216\n国舅\t391217\nautobetsoft\t391218\n12398\t391219\n上理\t391220\n桅子花\t391221\n大曝光\t391222\n九吉公\t391223\n瑞格菲尼\t391224\nOCG\t391225\n婚检\t391226\n大明宫词\t391227\n遂宁市人民政府\t391228\n天津科技大学\t391229\n除污机\t391230\n精题\t391231\n欺上瞒下\t391232\nasu\t391233\n逗比\t391234\nITSM\t391235\n中华养生食谱\t391236\n中国汽车工程研究院股份有限公司\t391237\n修购\t391238\ntia\t391239\n姜丹\t391240\nwhj\t391241\n反腐倡廉建设\t391242\nbeau\t391243\n天河东\t391244\nCreature\t391245\nutf8mb4\t391246\n一听音乐\t391247\n蒲地\t391248\n孟凡明\t391249\n111&\t391250\n金炳万的丛林法则\t391251\n中吴大
道\t391252\n512路\t391253\n西湖文三路\t391254\n郑飞\t391255\nPowerBI\t391256\n美本\t391257\nhumble\t391258\n为此\t391259\n记事本\t391260\n尼康d700\t391261\nReigns\t391262\ntrados\t391263\n开缸\t391264\n谭勇\t391265\n浑水\t391266\n男性\t391267\nH5_\t391268\n流放者柯南\t391269\n芳儿\t391270\nmozdev\t391271\n弘业\t391272\n言简意赅\t391273\n一呼\t391274\n丁五\t391275\n控制屏\t391276\n短讯\t391277\n热力发电厂\t391278\n骏景花园\t391279\n噜噜噜\t391280\n修脚师\t391281\n蛮\t391282\n益健\t391283\nReady\t391284\n大礼包\t391285\n济南分公司\t391286\n眼镜女\t391287\n湘南\t391288\n区卷\t391289\n电容容量\t391290\n写意人生\t391291\n河间市\t391292\n驳接爪\t391293\nxingji\t391294\nDavidoff\t391295\n星仔\t391296\n植物园\t391297\n蓝凌OA\t391298\n动脉网\t391299\n那一刻\t391300\n微信摇一摇\t391301\n汉语言文学专业\t391302\n威斯汀\t391303\n水闸\t391304\n美克斯\t391305\n0037\t391306\n标准片\t391307\n百子湾\t391308\n国家海事局\t391309\n走街串巷\t391310\n真三国\t391311\nps暂存盘\t391312\nxxd\t391313\nCheats\t391314\n济阳县\t391315\n余家\t391316\n鸡同鸭讲\t391317\n常宝股份\t391318\nviolet\t391319\n影射\t391320\n施工定额\t391321\nXMR\t391322\n东京迪士尼度假区\t391323\n伦敦玛丽女王大学\t391324\nshenhua\t391325\nk240\t391326\n壶方\t391327\n苏菲·玛索\t391328\n慧慧\t391329\nthermo\t391330\n美洲\t391331\nSQL2000数据库\t391332\n乐水\t391333\n杜锋\t391334\n一方面\t391335\n撕脱性\t391336\n华南理工大学建筑学院\t391337\n愛乃\t391338\n盐城开发区\t391339\n弯管机\t391340\n劳动部\t391341\n抠门\t391342\n长臂\t391343\nV587\t391344\n边酒\t391345\n|首\t391346\nbranding\t391347\n攻队\t391348\n万升\t391349\n16th\t391350\ngjj\t391351\n摆弄\t391352\n蒂芙尼\t391353\n广埠屯\t391354\n扫描\t391355\nmondo\t391356\nSuzanne\t391357\n19时\t391358\n理工类\t391359\n榄仁叶\t391360\nzendstudio\t391361\n武汉市委宣传部\t391362\narpa\t391363\n接单\t391364\n奥杜因\t391365\n④\t391366\n锌硒宝\t391367\nArcScene\t391368\n1关\t391369\nios11\t391370\nMG2580\t391371\nluban\t391372\n异画\t391373\nWin10系统IE浏览器\t391374\nphpnow\t391375\n按份共有人\t391376\n谷腾\t391377\n生存能力\t391378\n大学排名_高三网\t391379\n第53届\t391380\n退兵\t391381\n喋\t391382\n世界斯诺克锦标赛\t391383\n甘肃省农村信用社\t391384\n东海堂\t391385\n马氏体\t391386\n20150809\t391387\n378\t391388\n汤山镇\t391389\n互联网+政务服务\t391390\n微信公众平台管理系统\t391391\notrs\t391392\n束胸\t391393\n体动\t391394\nvss
2005\t391395\npump\t391396\n王更新\t391397\n檩\t391398\n麻风病\t391399\nInc.\t391400\n雷丝\t391401\n好趣网\t391402\n马塞克\t391403\n云南河口\t391404\n邱震海\t391405\n派现\t391406\n字拖\t391407\n金娜娜\t391408\n下载器\t391409\n施贵宝\t391410\n保费\t391411\nSEX\t391412\ncrlf\t391413\n都市言情小说-17k小说网\t391414\n贵屿\t391415\n憨豆特工3\t391416\n扎赉特旗\t391417\n甩葱\t391418\n善美\t391419\ndeflate\t391420\ncasey\t391421\n小站\t391422\n鸬鸟\t391423\n互扇\t391424\n四川省教育厅\t391425\n啤梨\t391426\n马欢\t391427\n五笔字根表\t391428\nraspberrypi\t391429\nslf4j-api\t391430\n刀飞\t391431\naddEvent\t391432\n76集\t391433\n聂明宇\t391434\njfk\t391435\n绝地求生刺激战场银\t391436\n百度地图SDK\t391437\n镍钴锰酸锂\t391438\n球风\t391439\n上世纪90年代\t391440\nrainstorm\t391441\nair14\t391442\n陈仓\t391443\n炙甘草汤\t391444\n目前有没有\t391445\nTigase\t391446\n世世\t391447\n戒尺\t391448\nxboxone\t391449\nFreelance\t391450\n计算机\t391451\n外商\t391452\nburning\t391453\n模考网\t391454\n云匠\t391455\n泰坦尼克号\t391456\ncctv4\t391457\n奏响\t391458\n思杰\t391459\n大唐情史\t391460\n2022cm2022厘米\t391461\nホワイト\t391462\n湘钢\t391463\n崖州\t391464\n升级后\t391465\n修长\t391466\n麒麟座\t391467\n10.4%\t391468\n4399j\t391469\nKISSsoft\t391470\nLCR数字电桥\t391471\n复调\t391472\n憎恶之西\t391473\naspiration\t391474\n车队\t391475\n安卓4.1\t391476\n巧除\t391477\nyeux\t391478\nR7000P\t391479\n皮头\t391480\ntwentieth\t391481\nfrostpunk\t391482\n2.8十\t391483\nRAR格式\t391484\n浙江大学工程师学院\t391485\n《西南大学》\t391486\n感事\t391487\nchampions\t391488\n精达\t391489\n犬屋\t391490\n广西科联招标中心\t391491\n欧神诺\t391492\n2018038\t391493\n任正晓\t391494\n|券\t391495\n58资产网\t391496\n西安论坛-华商论坛\t391497\nnl\t391498\n北京丝足\t391499\n黑天鹅\t391500\n上海市商务委员会\t391501\n嘉吉\t391502\n20161025\t391503\n山恶水\t391504\n团山镇\t391505\nenw\t391506\n裕木麻友\t391507\n炉排炉\t391508\n影音先锋资源网\t391509\n彩霸王\t391510\n页次\t391511\n手指操\t391512\n民防\t391513\n血质\t391514\n进贤\t391515\n开淘网\t391516\n右逝\t391517\n破门\t391518\n霉菌性\t391519\n世世代代\t391520\n黄石港\t391521\n留守儿童\t391522\n食品生产许可管理办法\t391523\n燥热\t391524\n表格化\t391525\nFlo\t391526\n轻钢房屋\t391527\niPhone7/\t391528\nLeBron\t391529\ncoles\t391530\n管钳\t391531\n百忧解\t391532\n景东彝族自治县\t391533\n无华\t391534\nHasb
ro\t391535\n徐熙\t391536\n胸猛\t391537\n双乳峰\t391538\nwin10分盘\t391539\npurification\t391540\n苏州大学数学科学学院\t391541\n0.9\t391542\nSunrise\t391543\n美汁源果粒橙\t391544\n胡言乱语\t391545\n有感兴趣\t391546\n江西省政府\t391547\npe膜\t391548\n毒爱\t391549\n南三环\t391550\nBiotin\t391551\nmaoming\t391552\nmathmatic\t391553\n庹宗康\t391554\n5.7.1\t391555\n北京小升初\t391556\n百视\t391557\n王益区人民政府\t391558\n新泽西州\t391559\nshadowsocks-qt5\t391560\n铁饭碗\t391561\nbureau\t391562\nmutant\t391563\n弓长岭\t391564\n山楂卷\t391565\n止水条\t391566\nOpinions\t391567\n珍珠湾\t391568\nnct\t391569\n巢湖\t391570\n智弧\t391571\nEP4\t391572\n项英\t391573\nSTM8L\t391574\nPXW-X280\t391575\nVitae\t391576\n人身自由\t391577\n低价位\t391578\navicii\t391579\n消费股\t391580\n抗逆\t391581\n张大勇\t391582\n二府\t391583\n根茎\t391584\n基本版\t391585\n那曲\t391586\n承运人\t391587\n晨晖\t391588\n第2节\t391589\n第六册\t391590\n环境影\t391591\n民主广场\t391592\nSlow\t391593\n华融湘江银行\t391594\n脊髓损伤\t391595\n表决\t391596\n孝感一中\t391597\n中盾\t391598\n陈永平\t391599\n狸窝全能视频转换器\t391600\n小标兵\t391601\n美国通用电气公司\t391602\ndrugstore\t391603\n渔舟唱晚\t391604\n积弱\t391605\n两大\t391606\n萌物\t391607\n广州歌剧院\t391608\nUnity2D\t391609\nscrm\t391610\ncordraw\t391611\n广告钉\t391612\n6.4\t391613\n航空展\t391614\n六查六\t391615\n宫外孕手术\t391616\nFriendly\t391617\n容维\t391618\n1859年\t391619\n武汉大学学报\t391620\n经济发\t391621\narthritis\t391622\nCats\t391623\n增值税额\t391624\n歪门邪道\t391625\nNO.1_\t391626\n朱明国\t391627\n命题作文\t391628\n消防物联网\t391629\nLammps\t391630\n明望\t391631\n敬请\t391632\nMINE\t391633\n副炮\t391634\n顺天府\t391635\n一百本\t391636\nOpacity\t391637\n玻璃樽\t391638\npgadmin3\t391639\n飘亮\t391640\n食材展\t391641\nhang\t391642\nturismo\t391643\nunter\t391644\n陕南\t391645\n聚氨酯催化剂\t391646\n永泽真央美\t391647\n方干\t391648\n10041\t391649\n第一附属医院\t391650\n大愿\t391651\nmurata\t391652\n柏树\t391653\n11世纪\t391654\nUl\t391655\nformat函数\t391656\n清爽\t391657\n魁首\t391658\nkiyo\t391659\n国象\t391660\n王梦恕\t391661\ndandy\t391662\n十八大\t391663\n烟幕\t391664\n布丁酱\t391665\ngutter\t391666\n点落\t391667\n购卖\t391668\n无癖\t391669\n外卖节\t391670\ncommons-codec\t391671\n第2页\t391672\nsamuel\t391673\n三国人\t391674\n盘发\t39
1675\n播音员\t391676\n智康1对1\t391677\n企业界\t391678\n广州博鳌纵横网络科技有限公司\t391679\npixhawk\t391680\n解解\t391681\n活埋\t391682\n多得\t391683\n1946年\t391684\n国际上\t391685\n魔镜\t391686\n哲学观\t391687\n艺匠\t391688\nvmw\t391689\nFlotherm\t391690\n2017年11月29日\t391691\n10g\t391692\n聊笔阁\t391693\n鹭湖宫\t391694\n决策者\t391695\n刺客信条4\t391696\nTracy\t391697\nLin\t391698\n一东\t391699\n时务\t391700\n僵尸片\t391701\n育才路\t391702\n私有源\t391703\n近景\t391704\njQueryUI\t391705\n睡鼠\t391706\n城市规划原理\t391707\n克拉玛依油田\t391708\nsqljdbc4.jar\t391709\n上海对外经贸大学\t391710\nmodulus\t391711\n姚澜\t391712\n0085\t391713\n武林外传综合版-武林外传\t391714\n上班路\t391715\n单细胞测序\t391716\nPSF\t391717\n虹桥天地\t391718\n荣威RX5\t391719\nBeing_young\t391720\n通态\t391721\n_录屏\t391722\n沸物\t391723\n二丫\t391724\n虹\t391725\nthrive\t391726\n第三件\t391727\n于华晨宇\t391728\nApi接口\t391729\n黑雾\t391730\n苍龙城\t391731\n1560\t391732\n刘忙\t391733\n广东省食品药品监督管理局\t391734\n油菜籽\t391735\nWB\t391736\n一次性工亡补助金\t391737\n重庆日报\t391738\nmjpg-streamer\t391739\n没完没了\t391740\n雨过天晴\t391741\n多蒙\t391742\ncaxa2016\t391743\nonmessage\t391744\n七发\t391745\n跨包\t391746\n马戛尔尼\t391747\njy\t391748\nIOS7\t391749\nBebop\t391750\n主剧\t391751\n贸易港\t391752\n太上章\t391753\n直肌\t391754\n王崧舟\t391755\n入党积极分子登记表\t391756\n地牢围攻2\t391757\nu4000uq\t391758\n京酒\t391759\n引爆者\t391760\n看起来\t391761\n美职联\t391762\n黄兴\t391763\n最少\t391764\n顶压\t391765\n预设\t391766\n住建\t391767\n天规\t391768\n大逃亡\t391769\nyqztb\t391770\n湖南教育电视台\t391771\n茶王\t391772\n双刃\t391773\n36公里\t391774\n创富\t391775\n组照\t391776\ncpuz\t391777\nuitextfield\t391778\n金南珠\t391779\n第三十三期\t391780\n珍妃\t391781\n明环\t391782\n理伦\t391783\n科创园\t391784\n各县市\t391785\n527796075@qq.com\t391786\n全球豪娶少夫人\t391787\nk线图\t391788\n金桔宝\t391789\n烤炉\t391790\n助学贷款信息网\t391791\n第69\t391792\n撒马尔罕\t391793\n兰大\t391794\n拘束衣\t391795\n冷墩\t391796\n逼良\t391797\n迎泽区\t391798\n尿路上皮癌\t391799\n第八级\t391800\n20150803\t391801\n15号\t391802\n有价\t391803\n2016年12月10日\t391804\nexercises\t391805\n总卡\t391806\n60余\t391807\n极趣网\t391808\n三星手机\t391809\n续签\t391810\nFirstCS\t391811\nу\t391812\nluluhei\t391813\n烙馍机\t391814\n埠\t391815\n2串\t
391816\n自然拼读法\t391817\n主译\t391818\njbuilder\t391819\n满文军\t391820\n0.38\t391821\nIT快讯_南方网\t391822\n0700\t391823\n泽\t391824\n雷允\t391825\n小崔\t391826\n0.5kg\t391827\n數位\t391828\n围魏救赵\t391829\n互相关函数\t391830\nKatana\t391831\n松武\t391832\n王任重\t391833\nadminlte\t391834\n线描稿\t391835\n大马城\t391836\n第20\t391837\nRestricted\t391838\n学海无涯苦作舟\t391839\n幻翎洛\t391840\nD3D\t391841\n法国队\t391842\n泛亚\t391843\n黑暗之魂\t391844\n天府机场\t391845\n学步车\t391846\n索尼xz\t391847\n中央警卫团\t391848\n深居\t391849\n正高\t391850\n邦尼\t391851\n食品袋\t391852\n清华大学图书馆\t391853\n金闺\t391854\n超神王\t391855\n钱牛牛\t391856\n翘舌\t391857\n捷顺行\t391858\nQQ宠物企鹅\t391859\n邓稼容国团\t391860\n撞碎\t391861\n井井有条\t391862\n通盛\t391863\n毫伏\t391864\nt80\t391865\n等差数列{an}\t391866\nandrod\t391867\n恶毒女\t391868\n程漓月\t391869\nz+1\t391870\n开船\t391871\n用力\t391872\n80D\t391873\n天清汉马\t391874\n西安北客站\t391875\n试道\t391876\n小特\t391877\n吉威\t391878\n江通道\t391879\n红骷髅\t391880\n魂戒\t391881\n顾飞\t391882\nlogN\t391883\n出演者\t391884\n北京尚德\t391885\n魔尊\t391886\nstrikes\t391887\n苏锡常镇\t391888\n曲式\t391889\n3D眼镜\t391890\n良良\t391891\n赤杨木\t391892\n第5版\t391893\n11W\t391894\n龙虎山景区\t391895\n好多公里\t391896\nFBX格式\t391897\n增价\t391898\n中美洲\t391899\n配不上\t391900\n电脑速学\t391901\n莘\t391902\nHound\t391903\nShade\t391904\npillar\t391905\n南中国海\t391906\n鄂州职业大学\t391907\n小瑞\t391908\n广东环境保护工程职业学院\t391909\n冒痘\t391910\n爱名网\t391911\n爱自己\t391912\n80毫米\t391913\n天若\t391914\netc/shadow\t391915\n江夏学院\t391916\nCooling\t391917\ntada\t391918\n一个20岁\t391919\n推托\t391920\ndaemonize\t391921\n鬼宅\t391922\n门市\t391923\n1分\t391924\n医书\t391925\nFvd\t391926\n意利\t391927\n2893\t391928\nmackeeper\t391929\n翠浓\t391930\ndismiss\t391931\n长春花园\t391932\n微波\t391933\n摄影术\t391934\n值班长\t391935\n唐家\t391936\n小曾\t391937\n新安江街道\t391938\n芒果卫视\t391939\nTOP20\t391940\n新艺\t391941\n落款\t391942\n培训工\t391943\n明报\t391944\n因变量\t391945\n垃圾\t391946\n坐月子帮_\t391947\ndiet\t391948\n乳牛奶粉\t391949\n慈溪人民医院\t391950\n追号\t391951\n布欧篇\t391952\nnucleotide\t391953\n热反射\t391954\n63A\t391955\n寻常疣\t391956\n爱丽万古天帝\t391957\n波里\t391958\n实妹\t391959\n欧阳朱自清\t391960\n云南铜业\t391961\
nMew\t391962\n译马网\t391963\n分置\t391964\n河南博物馆\t391965\n名图论坛_汽车之家论坛\t391966\nmontblanc\t391967\nChemSpider\t391968\n51亿\t391969\n境地\t391970\nショ\t391971\n佩吉\t391972\n68亿\t391973\n运城机场\t391974\n中雨\t391975\n文化季\t391976\n弧度\t391977\n洋鸡蛋\t391978\n0422\t391979\n自然而然\t391980\n大屋\t391981\n许世辉\t391982\n每隔一秒\t391983\n新华基金\t391984\ndill\t391985\napproaches\t391986\n电动玻璃\t391987\n凯竹\t391988\nCE\t391989\n耶利米\t391990\nBiotech\t391991\nTigers\t391992\n收藏柜\t391993\nMedicines\t391994\n特性\t391995\n海男\t391996\n区发改局\t391997\ngulpjs\t391998\n甘肃省交通厅\t391999\n顺平\t392000\n诸平\t392001\n坎墩街道\t392002\n柔光\t392003\n保险丝座\t392004\n武汉博物馆\t392005\n累犯\t392006\nComo\t392007\n第39届\t392008\n血染征袍\t392009\n豆瓣绿\t392010\n中华人民共和国村民委员会组织法\t392011\n纺机\t392012\n理赔员\t392013\n超微\t392014\n世界葡萄酒\t392015\n填海造\t392016\n大主宰\t392017\nN9006\t392018\nWiN\t392019\n散斑\t392020\ncocosjs\t392021\nppkao\t392022\n侨界\t392023\n几何体\t392024\n法连\t392025\n星环\t392026\nTEXTILE\t392027\nu8\t392028\n刘建军\t392029\n油质\t392030\n颗粒\t392031\nRoboCup\t392032\nCentTian\t392033\nMath.random\t392034\n各路\t392035\n排石颗粒\t392036\nPA66\t392037\nwifi钥匙\t392038\n进化树\t392039\n中华人民共和国产品质量法\t392040\n3千瓦\t392041\nAges\t392042\n黄炎培\t392043\n必创科技\t392044\nMetaTrader\t392045\n熊出没之春日对对碰\t392046\n斯坦达尔\t392047\n渤海股份\t392048\n杨主任\t392049\n杨业明\t392050\n孟子淇\t392051\n江西省农业厅\t392052\n中国工商局注册公司\t392053\n1i\t392054\n欢案\t392055\n金坛市\t392056\n珠海保税区\t392057\n丰利\t392058\n唉_\t392059\n牛郎星\t392060\n田润叶\t392061\n双绉\t392062\nCANopen\t392063\n第三\t392064\n耽美中文网\t392065\nguava\t392066\n碟中谍5:神秘国度\t392067\n11升\t392068\n后妈\t392069\n有看头\t392070\nZMQ\t392071\n7557\t392072\n穆氏\t392073\n块数\t392074\nwrapped\t392075\ny510p\t392076\n蜡笔小新新番\t392077\n不残\t392078\nSparkle\t392079\n折减系数\t392080\n甲状腺功能减退\t392081\n犯困\t392082\n龟兔\t392083\n栗先达\t392084\n计篇\t392085\n天津文化中心\t392086\n滴定液\t392087\n技术站\t392088\n黄石西路\t392089\n绕过\t392090\n天著\t392091\n新优质学校\t392092\nApink\t392093\n必然结果\t392094\n生年金\t392095\n景兴纸业\t392096\nmodule6\t392097\n股民\t392098\n红枣\t392099\n数字媒体\t392100\nmixins\t392101\nMile\t392102\n嘿嘿\t392
103\n外汇贷款\t392104\n底妆\t392105\nerrno\t392106\n上海男篮\t392107\nWin7样式\t392108\n搜狗音乐\t392109\n15km\t392110\n徐州新闻网\t392111\n秀月\t392112\n素坤逸\t392113\n格列美脲\t392114\nnotempty\t392115\n坦帕\t392116\n自损\t392117\n上海市公务员局\t392118\n缓速器\t392119\n炸土豆\t392120\n乌海市\t392121\nAV在线\t392122\n锐欧\t392123\nTHINKPAD\t392124\n同喜\t392125\namd64.iso\t392126\n足道\t392127\n葡萄糖苷\t392128\ny友乐园\t392129\n黔西南日报社\t392130\n明尖\t392131\n新田镇\t392132\n住院费\t392133\nvvip\t392134\nlz4\t392135\n牛肉面\t392136\n迹忆\t392137\nAntimalware\t392138\n猎豹浏览器\t392139\n北京游乐园\t392140\n国际医疗部\t392141\n安九\t392142\n敬词\t392143\nElectricity\t392144\n鬼泽\t392145\nrugby\t392146\n二十一届\t392147\n费尔\t392148\n银行管理\t392149\n三位数\t392150\n亭子\t392151\nINTP\t392152\nNW-ZX300A\t392153\n并购方\t392154\n近所\t392155\n频闪灯\t392156\nisfile\t392157\n冷彻\t392158\n染料\t392159\n赶考\t392160\npuzzles\t392161\n刘家义\t392162\n品牌力指数\t392163\n葡萄沟\t392164\n电流计\t392165\n抽采\t392166\n门罗\t392167\n腕臂\t392168\n中世纪2\t392169\ngucci\t392170\n不知为何\t392171\n对酒当歌\t392172\n官官相护\t392173\n2016年4月24日\t392174\n写真馆\t392175\n0410\t392176\n女娲传说之灵珠\t392177\n花样少年少女\t392178\n1mpa\t392179\n合肥新站高新技术产业开发区\t392180\nopn\t392181\n小弗朗士\t392182\n米宝兔\t392183\n碳酸钾\t392184\n镀液\t392185\n内资公司\t392186\n中华人民共和国人力资源和社会保障部\t392187\n酒吧地图网\t392188\n中国建筑学会\t392189\n电子板\t392190\n天耀中华\t392191\nmac机\t392192\n亭江\t392193\n王子璇\t392194\n中国试管婴儿网\t392195\n第54集\t392196\n下季\t392197\n报文\t392198\n上海飞机设计研究院\t392199\n暗黑黎明\t392200\n总总\t392201\n中华人民共和国道路运输条例\t392202\n头昏\t392203\nHDTune\t392204\n天国:拯救_\t392205\n多套\t392206\n万向德农\t392207\n空海\t392208\n政\t392209\n3D肉蒲团之极乐宝鉴\t392210\n明珠小学\t392211\n人情债\t392212\n熊片\t392213\n5年内\t392214\n宣导\t392215\nav番号网\t392216\n加进\t392217\nscu\t392218\nzhangzhen894095789\t392219\n激光治疗仪\t392220\nMPR\t392221\n保健类\t392222\n小人书\t392223\n安立泽\t392224\n王者荣耀骚白\t392225\nquiver\t392226\n0002\t392227\n张长\t392228\n龙虾\t392229\n彭城晚报\t392230\n布尔运算\t392231\n圣经研究吧\t392232\n刘二豆\t392233\n接头\t392234\ngrey\t392235\n流动度\t392236\n2011年\t392237\nframeworks\t392238\n还毒\t392239\n四维星\t392240\nleonardo\t392241\n灵柩\t392242\n聚体\t39224
3\n宰杀\t392244\n师人\t392245\n59家\t392246\nproject甘特图\t392247\nHi\t392248\n谐波减速机\t392249\n转报\t392250\n7200U\t392251\n大沥\t392252\n夔门\t392253\n8000万元\t392254\n反光衣\t392255\n海南海药\t392256\n豁免权\t392257\n掉眼泪\t392258\ncentralized\t392259\nvivoX21\t392260\nniginx\t392261\npoon\t392262\ngreenDAO\t392263\n3干\t392264\n宇宙人\t392265\nc0\t392266\n神州数码信息服务股份有限公司\t392267\n野营\t392268\n天线宝宝\t392269\n工口本\t392270\n大桥未久下马所有\t392271\n3一个\t392272\nProfessional\t392273\nhandlers\t392274\n嘴中\t392275\nshelves\t392276\n意群\t392277\n摇摇\t392278\n爱染\t392279\n哪门\t392280\n20150530\t392281\n历史学系\t392282\n彩店\t392283\n軟件\t392284\n笔庄\t392285\nHERO4\t392286\nサリ\t392287\n利苑酒家\t392288\n水利类\t392289\n大家园\t392290\nNARI\t392291\n帕萨特B5\t392292\n天京事变\t392293\n83932060\t392294\n党支书\t392295\n青箬笠\t392296\n跑得\t392297\n悬疑片\t392298\n痱子\t392299\n这年头\t392300\n鸾凤\t392301\n影梭\t392302\nadjust\t392303\n302号\t392304\n万源路\t392305\n宝马M3\t392306\n统借\t392307\n后半篇\t392308\n开艘\t392309\n天气在线\t392310\n亚什兰\t392311\n瓷管\t392312\n北京婚庆公司\t392313\n云竹\t392314\n鄂托克\t392315\n抱一抱\t392316\n青岛东方影都\t392317\n阻拦\t392318\n沈博阳\t392319\n小雪人\t392320\n北京市文物局\t392321\n触摸ic\t392322\n2020届\t392323\n小米9号\t392324\n孵化场\t392325\n86位\t392326\n26P\t392327\n王令\t392328\n都德\t392329\n百度同步盘\t392330\nTap\t392331\n西行\t392332\n谷歌地球吧_\t392333\n油压机\t392334\n红笔\t392335\n三氯氧磷\t392336\n东风A9\t392337\n世字\t392338\nZbrush\t392339\n尚村\t392340\n数盆\t392341\nCAS号\t392342\n茶花女\t392343\nebit\t392344\n燕化\t392345\n霍雨浩\t392346\n增城市\t392347\n清朗\t392348\n侯卫东\t392349\n新场古镇\t392350\n死神来了2\t392351\n百度\t392352\n中福会少年宫\t392353\nhappens\t392354\n国密算法\t392355\n帝国大厦\t392356\n晓雨\t392357\n使命召唤Online\t392358\n艾彼\t392359\n冷塔\t392360\n时空猎人\t392361\nUncensored\t392362\n256M\t392363\nS3C2440\t392364\njjt\t392365\n霉干菜烧饼\t392366\ncaret\t392367\n20MM\t392368\n300例\t392369\n跟单\t392370\nGPC\t392371\n济宁\t392372\n抛光砖\t392373\n逐光\t392374\n教师版\t392375\ntoss\t392376\n吉德\t392377\n钟祥市\t392378\n17cn\t392379\n报业\t392380\n建材有限公司\t392381\n难以承受\t392382\n天球瓶\t392383\n小偏方\t392384\n中脘穴\t392385\n贷通\t392386\n接线夹\t392387\nkonica\
t392388\n腊鸡\t392389\n空改\t392390\n陕旅\t392391\nehp\t392392\n上海市卫生局\t392393\nt410\t392394\n美攻强\t392395\n尿常规检查\t392396\n糸统\t392397\nEP5\t392398\n癌变\t392399\n语塞\t392400\nStringEscapeUtils\t392401\n谢谢你\t392402\n1936\t392403\n以父\t392404\n转化率\t392405\n4.9分\t392406\n南宁市统计局\t392407\n艾莎\t392408\n函格式\t392409\n角柜\t392410\n高慧兰\t392411\n斩赤红之瞳\t392412\n答疑作文网\t392413\n陕西省扶贫开发办公室\t392414\n乡政府\t392415\n子弟兵\t392416\n智胜\t392417\n早安少女组\t392418\n洪山\t392419\nMVC6\t392420\n百吉福\t392421\n学生卷\t392422\n取消\t392423\n戴出\t392424\n91天\t392425\ngho文件\t392426\n存心\t392427\npersonally\t392428\n化妆品有限公司\t392429\n流金岁月\t392430\n1.rmvb\t392431\n以奋斗者为本\t392432\n标致\t392433\n无言\t392434\n执拗\t392435\n三对\t392436\n政变\t392437\n汤尼\t392438\n黑炮\t392439\n包场\t392440\n曼珠沙华\t392441\nRESTful接口\t392442\n101号\t392443\nYosemite\t392444\n提土\t392445\n单耳\t392446\n成\t392447\nluci\t392448\n3Q中文网\t392449\n姐弟\t392450\n通量\t392451\n8450\t392452\n台版\t392453\n篮框\t392454\n摩羯座\t392455\n黏滞\t392456\ngeneric\t392457\n亿邦\t392458\n老早\t392459\n3798\t392460\n烨\t392461\n康庄美地\t392462\n发图\t392463\neset\t392464\n钱正昊\t392465\n大众途\t392466\n邱林\t392467\n壁下\t392468\ninnertext\t392469\n菊酯\t392470\n289号\t392471\n亲子操\t392472\nwot\t392473\n十组\t392474\n吉林省社会保险事业管理局\t392475\n美好的日子\t392476\n贾\t392477\n虚拟适配器\t392478\nqtab\t392479\n霸王大陆2\t392480\n说起\t392481\n互成\t392482\nunemployment\t392483\n寒声\t392484\naes128\t392485\n刘慧敏\t392486\n无性婚姻网\t392487\n3GPP\t392488\ncorr\t392489\n实园\t392490\n中国就业培训技术指导中心\t392491\n化工具\t392492\n私用\t392493\n苏若雪\t392494\nSlut\t392495\n春咲梓美\t392496\n木质素磺酸钠\t392497\nLEVIS\t392498\n茚虫威\t392499\n二场\t392500\n天祝县\t392501\n共议\t392502\n二乘\t392503\n8055\t392504\n得奖\t392505\nsjwj\t392506\ntei\t392507\n安多\t392508\nh4\t392509\n0.14\t392510\n诡实\t392511\n快乐五一\t392512\n送往\t392513\n君正集团\t392514\nMediawiki\t392515\n套餐\t392516\n人妖\t392517\n对标\t392518\n羟基苯基\t392519\n微帮\t392520\n返回值\t392521\nx40\t392522\n神魔大陆\t392523\ntour\t392524\n舟山群岛新区\t392525\n金意陶\t392526\n春年\t392527\n犰狳\t392528\n阳角条\t392529\n张秀勤\t392530\n载养眼\t392531\n吉它谱\t392532\n美塑\t392533\n上海中心大厦\t39253
4\n宠物盒\t392535\n绑扎\t392536\n阿虚\t392537\n昼长\t392538\nDANCE\t392539\n乳衣\t392540\n52个\t392541\n文本分类器\t392542\n狗绳\t392543\n扇舞\t392544\n叶选宁\t392545\n明码\t392546\n湖南华莱生物科技有限公司\t392547\n闫涵\t392548\n旋转轴\t392549\n马坝镇\t392550\n百万只\t392551\n超敏\t392552\n二铵\t392553\nLose\t392554\n百度文\t392555\n野菊花\t392556\n浓妆\t392557\n豆腐泡\t392558\n270号\t392559\n几十米\t392560\njudicial\t392561\n田岛\t392562\n小白龙\t392563\nAppear\t392564\nindonesia\t392565\n广益\t392566\nDesktops\t392567\n群塔\t392568\n风化岩\t392569\n新功能\t392570\n武功县人民政府\t392571\n百度云管家\t392572\n互换性\t392573\nnlp\t392574\n白城市人力资源和社会保障局\t392575\n极品女\t392576\n歌诗达\t392577\n坊间\t392578\n炫茂\t392579\n海英\t392580\n中国国际贸易促进委员会\t392581\nlearned\t392582\n汉兰达论坛_汽车之家论坛\t392583\n眉飞色舞\t392584\n大货\t392585\n中岛敦\t392586\nporto\t392587\n6年后\t392588\nTraits\t392589\n完美型\t392590\n数百张\t392591\n电信科学技术研究院\t392592\n强索\t392593\n姨妈感\t392594\n土豪国\t392595\n未眠\t392596\n荣耀宫本武藏\t392597\n成都交通投资集团有限公司\t392598\n小娜\t392599\ngls450\t392600\n240p\t392601\n镜音铃\t392602\n妮微\t392603\n精确值\t392604\nFactor\t392605\n升平路\t392606\n剑谱\t392607\n行政学\t392608\n圬工\t392609\n800.com\t392610\n七星街道\t392611\nwebinf\t392612\n信宜\t392613\n奶糖\t392614\n大众迈特威\t392615\ngradle\t392616\n不审\t392617\n伟雅\t392618\n零率\t392619\norbis\t392620\n二十亿\t392621\n图纸集\t392622\n发稿\t392623\nconio\t392624\n上海自由贸易区\t392625\n公鹿\t392626\n小学生数学报\t392627\n忘怀\t392628\n郑也夫\t392629\n沈阳世博园\t392630\n360安卓\t392631\n东华村\t392632\n天誉\t392633\n刘珂矣\t392634\n13吨\t392635\n730k\t392636\n绿膜\t392637\n事常\t392638\n针织厂\t392639\n集成商\t392640\n怒兽\t392641\nP1007\t392642\n人工膝关节\t392643\n劳务外包\t392644\nbecomes\t392645\navoir\t392646\n思子\t392647\n汪汪队\t392648\n晕\t392649\n土司\t392650\n器械\t392651\n东京台场\t392652\n亳州新闻网\t392653\n龙珠直播\t392654\n冲印机\t392655\n激怒\t392656\n苏州电子厂\t392657\n盐酸曲唑酮片\t392658\n警用\t392659\n凸透镜成像\t392660\nMLlib\t392661\njbl吧_\t392662\ncob\t392663\nGot\t392664\n拟合度\t392665\n蓝墨云\t392666\nliany\t392667\n燕达\t392668\n曲安奈德益康唑乳膏\t392669\n达标卷\t392670\n首场\t392671\n胜点\t392672\n萨斯卡通\t392673\n西堤\t392674\nappropriate\t392675\n刺花\t392676\n红豆薏米茶\t392677\n招选\t392678
\n中国光大银行\t392679\nTL00\t392680\n新印\t392681\n69_\t392682\n想你想不够\t392683\n朱诺\t392684\n色狐\t392685\nHote\t392686\n阿里巴巴旺铺\t392687\n卫生镇\t392688\n左背\t392689\n巴黎歌剧院\t392690\nSQLITE\t392691\nT2航站楼\t392692\nOCR识别\t392693\n大德路\t392694\n判罚\t392695\n反叛者\t392696\ngongsi\t392697\n反渗透水处理\t392698\n防屏蔽\t392699\n星星眼\t392700\nipu\t392701\n慕白\t392702\n电镀镍\t392703\nEL34\t392704\nevolutionary\t392705\ntch\t392706\n囚妻\t392707\n心理学家\t392708\n我爱记\t392709\n雷杰\t392710\n程锦云\t392711\n广外街道\t392712\nkda\t392713\nv2.0.6\t392714\n宝兴\t392715\n6780\t392716\n4399刀剑乱舞\t392717\n上海保安公司\t392718\n张艺瀚\t392719\n云南省人民检察院\t392720\n还童\t392721\n伏特加\t392722\n第46\t392723\n线径\t392724\n服务提供者\t392725\n原始版\t392726\n牛顿\t392727\n巨鹰\t392728\nEp\t392729\nlaytex\t392730\nWinForm\t392731\n三角灯\t392732\nRoot权限\t392733\n摩托罗拉\t392734\n花仙子\t392735\n小耳朵\t392736\n报毒\t392737\n盘手\t392738\nswat\t392739\n化解\t392740\n妾本\t392741\n用点\t392742\n二幼\t392743\nWhisper\t392744\n参议院\t392745\n货币转换费\t392746\n3DMARK\t392747\n木兰大道\t392748\n逼定\t392749\n0.5.2\t392750\n梅开\t392751\n西罗马帝国\t392752\n45英寸\t392753\n松潘\t392754\n申请码\t392755\ncom【\t392756\n纤巧\t392757\n氨麻美敏片\t392758\n路虎神行\t392759\n义方\t392760\n铃器\t392761\n20页\t392762\n光炮\t392763\n応子广场舞\t392764\n扣板\t392765\n县总工会\t392766\n曲字\t392767\nword页边距\t392768\n腰椎间盘突出症\t392769\njndi\t392770\n玉佛苑\t392771\n谢和弦\t392772\n微单TM\t392773\nradisson\t392774\ndpl\t392775\n上海浦东发展银行股份有限公司\t392776\n叶秉桓\t392777\ndns\t392778\n华中师范大学附属小学\t392779\ndyw\t392780\nOpen\t392781\nanyang\t392782\n男人味\t392783\n诣\t392784\n天蕴\t392785\n医友\t392786\nDrones\t392787\n香溢花城\t392788\n土拨鼠之日\t392789\n六万元\t392790\n深圳北火车站\t392791\nintran\t392792\n火铳\t392793\n汗青\t392794\n100多元\t392795\n周转金\t392796\n东西部\t392797\n12袋\t392798\n静息电位\t392799\n大独裁者\t392800\n1000V\t392801\nhadn\t392802\ntooltips\t392803\n空港大道\t392804\n潮牡\t392805\npropertie\t392806\n加密器\t392807\n山东莱钢永锋钢铁有限公司\t392808\n猫草\t392809\n开源证券股份有限公司\t392810\n东方末\t392811\n交路\t392812\nstack\t392813\n中国政法大学出版社\t392814\n报仇雪恨\t392815\n台式电脑硬盘\t392816\n万磁王\t392817\n患病率\t392818\n宣传者\t392819\n静态路由_\t392820\n贵金
属网\t392821\n商业计划书\t392822\n和讯股吧\t392823\nCHINA\t392824\n高锰\t392825\n缺勤\t392826\n要事\t392827\n明强\t392828\n亚信科技(中国)有限公司\t392829\n公链\t392830\n修身堂\t392831\n合肥小学\t392832\n中共贵州省委党校\t392833\n人居环境奖\t392834\n航旅纵横\t392835\n不稀罕\t392836\n安达信\t392837\nManageEngine\t392838\nsvnserver\t392839\n花样年集团\t392840\njsonb\t392841\nLABEL\t392842\n主动脉瘤\t392843\n3523\t392844\n惠美梨\t392845\n大话利州\t392846\n56米\t392847\n骚妻\t392848\n莫之\t392849\n务虚\t392850\n马达加斯加\t392851\n1张\t392852\n三层次\t392853\n易看\t392854\n老穆\t392855\n潮白\t392856\n谷歌地图api\t392857\n羌寨\t392858\nid\t392859\n鹿子霖\t392860\n气感\t392861\ncinematic\t392862\n南华经\t392863\n乐带\t392864\n杨禹\t392865\nv4.6.0\t392866\n妖星\t392867\n夕妍雪\t392868\nWishpost\t392869\n婚姻史\t392870\nDNC\t392871\n拌合物\t392872\n杜青林\t392873\nExplorer浏览器\t392874\n光明顶密道\t392875\nLINDA\t392876\nM158B\t392877\napt源\t392878\n中选\t392879\nECharts3\t392880\n电接点压力表\t392881\n蒙族\t392882\n为您服务\t392883\ncosav\t392884\n售后服务\t392885\n贝氏体\t392886\n风流小\t392887\nSMBus\t392888\ncoh2\t392889\n桂花园\t392890\n机名\t392891\n160km\t392892\n阿飘\t392893\n古光\t392894\n128家\t392895\n变向\t392896\n120张\t392897\n洗颜粉\t392898\n治廷\t392899\n品牌博览会\t392900\nGCL2013\t392901\n动爻\t392902\n泛指\t392903\n气管\t392904\n10000部\t392905\n优点\t392906\n融侨外滩\t392907\n25.0000元\t392908\n花纹\t392909\nObjects\t392910\n24号\t392911\n554\t392912\n女鞋\t392913\n百士\t392914\n再见了亲人\t392915\n2017至2018年度\t392916\n迟志强\t392917\n邮电局\t392918\nmisa\t392919\nvc2010\t392920\n众泰T500新闻_众泰T500\t392921\nbokee\t392922\n永联村\t392923\nmacdonald\t392924\n盆底\t392925\n植物大战僵尸OL\t392926\nABCD\t392927\n两三秒\t392928\n主码\t392929\n沙瑞金\t392930\nxbrl\t392931\n813\t392932\nkingbase\t392933\nstub\t392934\n650ti\t392935\n美河\t392936\n联合社区\t392937\nELO\t392938\n兄弟俩\t392939\n贝尔纳\t392940\n蓝推\t392941\n葫芦娃\t392942\nlopatkin\t392943\n并重\t392944\n百克\t392945\nuchar\t392946\n待解\t392947\n友\t392948\n四开\t392949\n重排\t392950\n2507\t392951\n潭村\t392952\n神奇女侠\t392953\n复合增长率\t392954\n布隆过滤器\t392955\n经得\t392956\n阿尔维斯\t392957\nAD09\t392958\n摩西\t392959\n肠胃科\t392960\n胶囊咖啡机\t392961\n法定存款准备金率\t392962\n猖獗\t39296
3\n福州动物园\t392964\n叛师\t392965\n签派员\t392966\n劫掠者\t392967\naecc2018\t392968\n橡塑管\t392969\nShao\t392970\n小汤普森\t392971\n跑山\t392972\n梅岭街道\t392973\nlncrna\t392974\n教育心理学\t392975\n罗齐尔\t392976\n园丁奖\t392977\n信达\t392978\n刘雪荣\t392979\n三只熊\t392980\n笕美和子\t392981\n天长市政府\t392982\n米家\t392983\n莉莉卡奥特曼\t392984\n农村信用合作社\t392985\n躺椅\t392986\n大粒\t392987\nsoler\t392988\nMessager\t392989\nxpm\t392990\n重男轻女\t392991\n5月前\t392992\n洛美\t392993\nminer\t392994\n誘惑\t392995\n个性化\t392996\n直博生\t392997\n今飞凯达\t392998\n3维\t392999\n12伏\t393000\n新塍\t393001\n珠海长隆企鹅酒店\t393002\nPEACE\t393003\n头孢克肟\t393004\n测控\t393005\n有机磷农药\t393006\n崇贤新城\t393007\n一夜暴富\t393008\n大明皇妃\t393009\n卡赞\t393010\n310ml\t393011\n延米\t393012\nblick\t393013\n晚晴天\t393014\nmuses\t393015\n拖延\t393016\n浙赣\t393017\n原本\t393018\nyat\t393019\n张辽\t393020\n空弦\t393021\n木火通灵88605780\t393022\n6万块\t393023\n魂蛋\t393024\n宠物乐园\t393025\n我的英雄联盟之传奇正盛\t393026\n吴用智\t393027\n蚌埠医学院\t393028\n梨状\t393029\n大连医科大学附属第一医院\t393030\ntough\t393031\n建筑工程业\t393032\ncyuyan\t393033\nExponential\t393034\n大学生创业园\t393035\n金诚\t393036\n射手影音播放器\t393037\n嘉瑞\t393038\n没看懂\t393039\n色板\t393040\n超白玻璃\t393041\n中国杯帆船赛\t393042\n重庆医药高等专科学校\t393043\nWOW联盟\t393044\nCF穿越火线\t393045\n台达plc\t393046\n松潘县\t393047\n多股\t393048\n有机磷中毒\t393049\nBerkeley\t393050\n桃乐丝\t393051\n吴蚊米\t393052\n绫波\t393053\n合盖\t393054\n五香牛肉\t393055\n画江湖之灵主\t393056\n妙趣横生\t393057\n极酷网\t393058\n德阳五中\t393059\n多云\t393060\n医流巴巴网\t393061\n衢州传媒网\t393062\nBiscuit\t393063\n内核函数\t393064\n苏君\t393065\n申根签证表\t393066\nMismatch\t393067\n民心\t393068\nrov\t393069\n数字矩阵\t393070\n入群\t393071\n天逸c5\t393072\n神州高铁\t393073\n64m\t393074\n80件\t393075\nios4\t393076\n吴咏宁\t393077\nSpencer\t393078\n万全影院\t393079\n属实\t393080\n新郑黄帝故里\t393081\n气场\t393082\n颗子\t393083\n跨下\t393084\n蠕动泵\t393085\n童丹\t393086\n很辛苦\t393087\n以太坊学习笔记\t393088\n高压锅炉\t393089\n文中\t393090\nqam\t393091\n嘉义县\t393092\n撸撸杯\t393093\n信阳东\t393094\nAngularjs\t393095\n荣御\t393096\n正荣\t393097\n7950\t393098\n美丽无忧网\t393099\ness\t393100\n水果园\t393101\n多板\t393102\nbak文件\t393103\n配重块\t393104\nCnKy\t393105\nMUL\t393
106\n高悬\t393107\nep.1\t393108\n时圆\t393109\n竹内结子\t393110\n玛卡片\t393111\n道力\t393112\n换肾\t393113\n林强\t393114\nWord2017\t393115\n恒泰实达\t393116\n闲侃\t393117\nSmartScreen\t393118\n大官人\t393119\nYG\t393120\n楚留香武\t393121\nOwnCloud\t393122\nElement-UI\t393123\nrevit2018\t393124\nNetLogo\t393125\n华润城润府\t393126\n湖北省测绘地理信息局\t393127\ntrnsys\t393128\n天津滨海机场\t393129\n教代会\t393130\n无名岛\t393131\n风廉政\t393132\n滞港费\t393133\n箱油\t393134\n泉州经济技术开发区\t393135\n热血男儿\t393136\n流行性腮腺炎\t393137\n康泰克\t393138\n寒山闻钟\t393139\n脂肪酸\t393140\n二十一个\t393141\n100mg\t393142\n有惊无险\t393143\nM60\t393144\n小药\t393145\neraser\t393146\n真空拔罐器\t393147\n该国\t393148\n720P|1080P高清中字版)迅雷BT种子/ED2K\t393149\n合做\t393150\n辣椒水\t393151\n铝方\t393152\ns02\t393153\n通灵术\t393154\n160马力\t393155\n亚克力管\t393156\nInthe\t393157\n唐红\t393158\n成都幼儿园\t393159\n龙湾国际机场\t393160\nMultiCharts\t393161\n_天\t393162\n王世龙\t393163\n华强路\t393164\n货币转换\t393165\nBottle\t393166\nv5.10\t393167\n园长\t393168\n斯卡布罗\t393169\n锁套\t393170\n雪浪\t393171\n宝成铁路\t393172\n投石\t393173\n无仙\t393174\nCPB\t393175\nwebbky\t393176\nPE保护膜\t393177\n早强剂\t393178\nParquet\t393179\n3x+1\t393180\n安家\t393181\n宝峰\t393182\n米斗\t393183\n索贿\t393184\n老谢\t393185\n手工帝\t393186\n几乘\t393187\n常熟北站\t393188\n城通网盘\t393189\n镜头群\t393190\n学肃\t393191\n5278\t393192\n星际战甲\t393193\n满文\t393194\n幼教展\t393195\nSAM\t393196\n四川经济信息网\t393197\nUnd\t393198\nCAKE\t393199\n四川电大\t393200\n二里头遗址\t393201\n跑调\t393202\n创新医疗\t393203\n妖谱\t393204\ncuisine\t393205\n硬怼\t393206\n三百余\t393207\n卡塔尔\t393208\n_儿\t393209\n硫化镉\t393210\nchampagne\t393211\n2k小说网\t393212\njst\t393213\n空天\t393214\n湘郡\t393215\nbronk\t393216\nDCDC\t393217\nlistary\t393218\n僵尸房\t393219\n智联岗\t393220\n405路\t393221\n南京房产网\t393222\nDSO\t393223\n妄想症\t393224\n人民公安报\t393225\nmilf\t393226\nJOJO的奇妙冒险\t393227\n1757\t393228\n犬科哺乳动物\t393229\n诺心LE\t393230\n公路板\t393231\nlonger\t393232\n济南农商银行\t393233\n双争\t393234\n酷寒\t393235\n熊猫和小鼹鼠\t393236\n公交\t393237\n家破人亡\t393238\n明星同乐会\t393239\n1336\t393240\n外购件\t393241\n淘精网\t393242\n河南科技大学\t393243\nljc\t393244\n日瓦戈\t393245\n维盾\t393246\n怀光\t393247\n新
车标\t393248\n7.12\t393249\nu9u9H5\t393250\ntrafficking\t393251\n魏莹\t393252\nsim卡槽\t393253\n拉梁\t393254\nlogloss\t393255\n串改\t393256\n昌江区\t393257\n绿源\t393258\n冰屏\t393259\n碘化铅\t393260\n巴可\t393261\n脊灰\t393262\n睡眠呼吸暂停综合症\t393263\n紧压\t393264\n程控交换机\t393265\n蜂人\t393266\n_圈网\t393267\n六十七\t393268\n气体传感器\t393269\n浙江邮电职业技术学院\t393270\n涪陵新区\t393271\n0328\t393272\n20150901\t393273\n2017年6月15日\t393274\nharley\t393275\n养森\t393276\n侃股网\t393277\n龙骨\t393278\n沈青\t393279\nEclEmma\t393280\nulysses\t393281\n亚煞极\t393282\n合格性\t393283\n明天晚上\t393284\n内存泄露\t393285\n闪光灯\t393286\n老洲镇\t393287\n排行\t393288\n052D型\t393289\n姜Gary\t393290\n超高产\t393291\nShing\t393292\n中茂城\t393293\n陪读\t393294\n黑胡子\t393295\nMaximus\t393296\n应负\t393297\n航拍仪\t393298\n蓝顶\t393299\n中国政府法制信息网\t393300\nieg\t393301\nsate\t393302\n自扇\t393303\n勒温\t393304\n阵疼\t393305\n十二期\t393306\n第169集\t393307\n蔓草\t393308\n六周\t393309\n洗衣歌\t393310\n参谋\t393311\n83780387\t393312\n地垫宝\t393313\n重庆市县\t393314\n互联网保险\t393315\n美观性\t393316\ndvdfab\t393317\n伯益\t393318\n氧氯化锆\t393319\n派出\t393320\nunicode码\t393321\n甜饼\t393322\n蓄势\t393323\nmep\t393324\n王建安\t393325\n塘渣\t393326\n崔子格\t393327\n几十G\t393328\n京雄城际铁路\t393329\n阿根廷世界杯\t393330\n集合地\t393331\n平安小区\t393332\nPATENT\t393333\n引狼入室\t393334\nMom\t393335\nvidivici\t393336\n2017年6月22日\t393337\n焦糖味\t393338\n渡假\t393339\n酷睿i5-8400\t393340\n太阳能电池组件\t393341\nBearings\t393342\n_鹤壁政府网\t393343\n免疫系统\t393344\n官解\t393345\nJAPAN\t393346\n学子\t393347\n武汉大学城市设计学院\t393348\n李立新\t393349\nrecursively\t393350\nYao\t393351\n58汽车网\t393352\n浙江法院\t393353\naus\t393354\n2010年6月\t393355\nhd650\t393356\n西维尔\t393357\n冶\t393358\n岳家军\t393359\n奔牛节\t393360\n02um\t393361\n扬州站\t393362\ncrome\t393363\nGCSE\t393364\niask\t393365\n阿里巴巴\t393366\n20180429\t393367\n数控滚齿机\t393368\n暴露无遗\t393369\n洛阳牡丹园\t393370\n04741\t393371\n刑满释放\t393372\n讲谈社\t393373\npoorsakura\t393374\nF值\t393375\n德田重男\t393376\n布朗运动\t393377\n美少女们\t393378\n乐才\t393379\n化学技术\t393380\n济南外国语学校\t393381\n民情\t393382\n廊坊经济技术开发区\t393383\n金石滩\t393384\n填装\t393385\n业余学\t393386\n西龙\t393387\n绣花机\t393388\
n昂立教育\t393389\ngxrc\t393390\n蹒跚\t393391\nMIDAS\t393392\n物价局\t393393\n鲍汁\t393394\n蔓藤\t393395\n天美地美\t393396\n老三样\t393397\npatho\t393398\noper\t393399\nvascular\t393400\n长沙市第四医院\t393401\n梅利奥达斯\t393402\n自作聪明\t393403\nCHANEL香奈儿\t393404\n第3个\t393405\nDATEDIF函数\t393406\n骶部\t393407\nwa\t393408\nM90\t393409\n好记星\t393410\n惠农区\t393411\n大疆osmo\t393412\n南行\t393413\n240帧\t393414\n三威\t393415\n新静安\t393416\n大小说\t393417\nwsop\t393418\nlauncher3\t393419\n无保留\t393420\n仪礼\t393421\n9.3.5\t393422\n干窑镇\t393423\n栗色\t393424\n木房\t393425\n玫瑰三愿\t393426\n街道口\t393427\nkills\t393428\n孟河镇\t393429\n巡逻车\t393430\nGifs\t393431\n2.5公分\t393432\nimadjust\t393433\n养肝网\t393434\n搓丝机\t393435\n梦幻飞仙\t393436\n锁场\t393437\n6月15日\t393438\n各点\t393439\n灸\t393440\n平湖在线\t393441\n产生量\t393442\n八台山\t393443\n淫笑\t393444\n受保护\t393445\n爱田奈奈\t393446\n角分\t393447\n福寿螺\t393448\n中国医科大学\t393449\nAsterisk\t393450\n周浩\t393451\n高速离心机\t393452\n蒲剧\t393453\n甘城光辉游乐园\t393454\n慢性浅表性胃炎\t393455\n东湖高新区\t393456\n剩菜\t393457\nmkck\t393458\n1.1倍\t393459\n鹅毛\t393460\n国家牡丹园\t393461\nnux\t393462\n作用\t393463\n纯净版\t393464\n10m\t393465\n仙域\t393466\npbp\t393467\n文旦\t393468\n火堆\t393469\nctp\t393470\n湖南省档案局\t393471\napproved\t393472\n戛纳电视节\t393473\nembryolisse\t393474\n半途而废\t393475\n宝杨码头\t393476\n圣诞日\t393477\n管径\t393478\n保网\t393479\nQQ营销\t393480\nWookieepedia\t393481\nlog4j2.xml\t393482\n云和县\t393483\n背后的故事\t393484\n王家庄\t393485\n顺庆区人民政府\t393486\n59期\t393487\n冬令\t393488\n泪汪汪\t393489\n死刑犯行刑\t393490\ninterpretation\t393491\n佩嘉西\t393492\n声卡KX\t393493\nlulu\t393494\n派单\t393495\n东侧\t393496\n玛蒂\t393497\nSober\t393498\n陈子昂\t393499\n第12话\t393500\noriginal\t393501\n弥撒\t393502\n东风物流\t393503\n百度技术博客-51CTO\t393504\n谭盾\t393505\n天下客家网\t393506\n职业能力倾向测验\t393507\ngotti\t393508\n工农\t393509\n莎\t393510\n放售\t393511\nDictionary\t393512\n第三十六届\t393513\n造节\t393514\n五分之四\t393515\n电烤盘\t393516\nLicence\t393517\n表面活性剂\t393518\n勾兑\t393519\n全自动软水器\t393520\n酸味\t393521\n脸型包\t393522\n海蜇\t393523\n布吉岛\t393524\n肥槽\t393525\n1901年\t393526\n花猫\t393527\n冰甲\t393528\nXTV\t393529\ntlm\t393530\nTrave
ling\t393531\nVEGA\t393532\n一仆\t393533\n绝代佳人\t393534\n吉恩·格雷迈恩\t393535\n泌尿系\t393536\n注灵\t393537\n睁\t393538\ndescriptive\t393539\n纸签\t393540\n2亿多\t393541\n尚客优酒店\t393542\n冠字\t393543\nscent\t393544\n600839\t393545\n吃鸡舞\t393546\n恐暴龙\t393547\n人民军医\t393548\n浅忆\t393549\n鹿希武\t393550\n天魄\t393551\n龙门绝境\t393552\n咕咕鸡\t393553\n电管\t393554\n反光条\t393555\n公维\t393556\n二维火收银\t393557\nServer服务器\t393558\n内丘县\t393559\n2086\t393560\n指印\t393561\n圣心医院\t393562\n糙率\t393563\nutf16\t393564\n弗朗兹\t393565\n催泪弹\t393566\nFEFJay\t393567\n幸福版\t393568\n阿斯伯格综合症\t393569\n狗粉\t393570\nBinlog\t393571\n20170609\t393572\n供应链管理\t393573\n栋号\t393574\n原动件\t393575\n戏志\t393576\n企鹅酒店\t393577\n所在地图\t393578\n杭州市人民政府法制办\t393579\n石家庄二中\t393580\n那么美\t393581\nPPG\t393582\n淘库商城\t393583\ntangent\t393584\n哲学\t393585\n原山\t393586\n顿时\t393587\n管培生\t393588\n月季花\t393589\nadbd\t393590\n恐龙蛋\t393591\n秀儿\t393592\nhuishi\t393593\n第四十五条\t393594\n8AT\t393595\n刘晶\t393596\n济南站\t393597\n西安\t393598\n容性负载\t393599\n离心率\t393600\n微方\t393601\n1栋\t393602\n戴向宇\t393603\nHonesty\t393604\n虹桥高铁站\t393605\nCANoe\t393606\n五莲花\t393607\n行人\t393608\nncu\t393609\n排头兵\t393610\n云南省工业和信息化委员会\t393611\nGly\t393612\n孤柏渡\t393613\n绝地求生痴鸡小队\t393614\n蒙蒙\t393615\n此房\t393616\n育民小学\t393617\n苏尔寿\t393618\n黄扫非\t393619\n一千多块\t393620\n毛骨悚然\t393621\n一吻天荒\t393622\nright\t393623\n状元媒\t393624\n金书\t393625\n广东新宝电器股份有限公司\t393626\n18kg\t393627\n天天\t393628\n艾特贸易网\t393629\nPMI\t393630\nJourneys\t393631\nSinte-Beuve\t393632\n鲁西南\t393633\n虾酱\t393634\n自然性\t393635\n空门\t393636\n云天励飞\t393637\n千牛版\t393638\nWHV\t393639\n15公里\t393640\n葡萄庄园\t393641\n大石桥市\t393642\n大部分\t393643\nPOSS\t393644\nvibes\t393645\n塔城路\t393646\n下沙大学城\t393647\n捕鸟\t393648\n南外仙林分校\t393649\n明目张胆\t393650\n松山湖凯悦酒店\t393651\nf183\t393652\n俄罗斯航空公司\t393653\n惠城区\t393654\n木箱\t393655\n扁鹊治病\t393656\n狗群\t393657\n发人深思\t393658\n顺发\t393659\n印地\t393660\n用车\t393661\nCommercial\t393662\n5173.com|\t393663\n早年\t393664\n谢雨欣\t393665\n放反\t393666\n水美\t393667\n防蚊\t393668\nBeagleBone\t393669\n男子汉\t393670\n京师同文馆\t393671\n四川省省级住房公积金管理中心\t393672\n张阿姨\t393
673\nbvi公司\t393674\n乙酸铵\t393675\nSeeker\t393676\n保定市第一中心医院\t393677\n重庆一中\t393678\n求艺网\t393679\n級\t393680\n19:35\t393681\nVivienne\t393682\n校内信息网\t393683\n别害怕\t393684\n2662\t393685\n宜丽客\t393686\n码数\t393687\n潘集区\t393688\n免播放器\t393689\n胎心\t393690\nsjs\t393691\nuda\t393692\n氧化钛\t393693\n直接\t393694\n轻叹\t393695\nperform\t393696\n居外\t393697\n宫血宁\t393698\n建制\t393699\n想笑死\t393700\n杨童舒\t393701\n零压\t393702\nresponseentity\t393703\n3元\t393704\n第一级\t393705\nDietary\t393706\nthought\t393707\n工签\t393708\n罗马仕充电宝\t393709\nLisa\t393710\nLETTER\t393711\n自振\t393712\n青阳街道\t393713\n蒽\t393714\n驱动盘\t393715\n殷梨亭\t393716\nlanyan\t393717\n白河县\t393718\n鲁西西\t393719\n德国工业园\t393720\nteva\t393721\n启示者\t393722\n觅房网\t393723\n巡视利剑\t393724\n白化\t393725\n占有人\t393726\n千晶\t393727\nv3.9\t393728\n吴姗儒\t393729\n王永刚\t393730\n宁波美术馆\t393731\n泪珠\t393732\n3dx\t393733\n无锡太湖学院\t393734\n春田花花同学会\t393735\n势线\t393736\nabdulla\t393737\n无线电能\t393738\n画字\t393739\n轻食主义\t393740\nkipling\t393741\n万家丽北路\t393742\n2736\t393743\n亚克力发光字\t393744\n研\t393745\n斑马狗头\t393746\n变乱\t393747\n所得税法\t393748\n笛卡尔坐标系\t393749\n盆架\t393750\n广西壮族自治区党委\t393751\n三峡游船网\t393752\n毁尸灭迹\t393753\n117平\t393754\n陈姥姥\t393755\ncountryman\t393756\n冰痕\t393757\n傣文\t393758\nqihui\t393759\nartificial\t393760\n黄石市政府\t393761\n武勇\t393762\n乐小宝\t393763\nyii1.1\t393764\n14类\t393765\n虚云\t393766\n芦花鸡\t393767\n小裤\t393768\n扫号器\t393769\n吊链\t393770\n禅定\t393771\n周家大院\t393772\nfsolve\t393773\n汉威科技\t393774\n佐治亚理工\t393775\n世子\t393776\ndadao\t393777\n连轧\t393778\n江南春\t393779\n七十二行\t393780\n大官\t393781\n短叶\t393782\n28|\t393783\nfff\t393784\n打样费\t393785\nanswers\t393786\n漏油\t393787\n1960s\t393788\n和意\t393789\n魔盒\t393790\ndongnan-51CTO\t393791\n洪永城\t393792\n憋屈\t393793\n丹凤\t393794\nwin7单\t393795\n修学旅行\t393796\n潮汕人论坛\t393797\n公民权\t393798\nCDPR\t393799\n第160期\t393800\n奋起直追\t393801\n基尔加丹\t393802\nSEX169\t393803\ne信\t393804\n广武\t393805\n21件\t393806\nP8H61-M\t393807\n剁椒鱼头\t393808\n广东科技\t393809\n第几张\t393810\n二七区\t393811\n浙南科技城\t393812\nshici\t393813\n师生\t393814\n美的城\t393815\n莫辛纳甘\t393816\nTwain
\t393817\n700万元\t393818\n鸭嘴阀\t393819\n武平县\t393820\n袁氏文学网\t393821\n风夜昕\t393822\nHeart\t393823\n王加\t393824\n吴文\t393825\n溶图\t393826\nBT电影天堂-迅雷BT\t393827\n压接\t393828\n心有千千结\t393829\n滑滑\t393830\n拳皇2000\t393831\ntimer\t393832\n瘦身法\t393833\n严把关\t393834\nblogo\t393835\n中国人民共和国\t393836\n怒马\t393837\n大宁公园\t393838\n薛峰\t393839\n孤岛惊魂:原始杀戮\t393840\nXFX讯景\t393841\nSORA\t393842\n十字弓\t393843\n金义东\t393844\n一坪\t393845\nspeaking\t393846\nHTML5视频播放器\t393847\n测振\t393848\n45W\t393849\n光功率\t393850\n车帝\t393851\n基友\t393852\nIIS6\t393853\n伊利斯\t393854\n认识到\t393855\n香溢紫郡\t393856\nvoluspa\t393857\n第六十三条\t393858\n12000\t393859\nvivoy55a\t393860\n五力\t393861\n2.1.1\t393862\n0413\t393863\n最最最最\t393864\n老地方\t393865\n山鸡\t393866\n缝制\t393867\n4401\t393868\n龙珠Z\t393869\n[文\t393870\n王小亚\t393871\n万佛\t393872\n孙道临\t393873\n奔驰s\t393874\n远期结售汇\t393875\n组块\t393876\n痱子粉\t393877\n于小戈\t393878\n国土资源部不动产登记中心\t393879\n80平方米\t393880\n4507\t393881\n显影器\t393882\nCHN\t393883\n_315货源网\t393884\n双井桥\t393885\nlambert\t393886\n阿黛尔的生活\t393887\n无版\t393888\n哈密瓜\t393889\n纽约电影学院\t393890\n上百张\t393891\n诗群\t393892\n北爱\t393893\n反距离\t393894\n定海区教育局\t393895\nUltraEdit\t393896\n奥普集成灶\t393897\nWM1A\t393898\n广州市金融工作局\t393899\n无情物\t393900\n畜产\t393901\nShrine\t393902\n潮生\t393903\n维兹\t393904\n义值\t393905\n磁针\t393906\nk帧\t393907\n王荣平\t393908\n因陀罗\t393909\n竹辉路\t393910\n爱的魔法\t393911\n斐雪派克\t393912\n名硕\t393913\nDevices\t393914\nespecially\t393915\n5025\t393916\n早盘\t393917\n读秀\t393918\n棉绸\t393919\n7612\t393920\n快点\t393921\n五妹\t393922\n补觉\t393923\nfrighten\t393924\nSubtitle\t393925\n向佐\t393926\n第10套\t393927\nDBUtils\t393928\n联合国军\t393929\n雷恪生\t393930\n红村\t393931\nGamer\t393932\n湖南省农村信用社联合社\t393933\n羊粪\t393934\n霍山石斛\t393935\n大方县\t393936\n中铁国际城\t393937\nWeArTrends时尚资讯网\t393938\n平台类\t393939\n完美解码播放器\t393940\nLittle\t393941\n圣达\t393942\n京基100\t393943\n妙法莲华经\t393944\n布拉沃\t393945\n扮装\t393946\n祈雨\t393947\n黔菜\t393948\nBeginMan\t393949\nillusion\t393950\n氢氪\t393951\n值点\t393952\napx\t393953\n太一吾鱼水\t393954\n标致508\t393955\nturmoil\t393956\n先锋电影院_影音先锋_先锋资源_xfplay_
菲儿影院\t393957\n梦想成真\t393958\n泉州市政府\t393959\nAll\t393960\n少将\t393961\n3.31\t393962\n张妍\t393963\nabsorption\t393964\n0026\t393965\n伊涅斯塔\t393966\n酸洗钝化\t393967\n战争前线\t393968\nTop15\t393969\n石原莉奈\t393970\n互联网广告\t393971\n阿政\t393972\n三溪\t393973\n智能终端\t393974\n写到\t393975\n节目组\t393976\n观片\t393977\n宝沃bx7\t393978\n肝肾\t393979\n130厘米\t393980\n1570\t393981\n绿茵场\t393982\n契约式\t393983\n陕西省西安中学\t393984\nSIS\t393985\nH2OS\t393986\n读书报告\t393987\n恋上你看书网\t393988\nAhead\t393989\nisc\t393990\npyautogui\t393991\nVitas\t393992\n457号\t393993\n佛道\t393994\ntemperature\t393995\n蛋糕柜\t393996\n风撑\t393997\n闲游\t393998\n嘴麻\t393999\n炉料\t394000\n咪咕视频\t394001\n冉莹颖\t394002\nMOUSE\t394003\nwww.csc108.com\t394004\nValves\t394005\n长江中游城市群\t394006\nid值_\t394007\n张红伟\t394008\n新泾镇\t394009\nGlance\t394010\n宇智波\t394011\nAAC\t394012\n阳虚\t394013\n莲山\t394014\n索特\t394015\n腾声传媒\t394016\n铠甲勇士拿瓦\t394017\nleft\t394018\n韩式烤肉\t394019\n荧光海\t394020\n战枭\t394021\n缬沙坦\t394022\n币圈\t394023\n打浦路\t394024\n成道\t394025\nキャラクタ\t394026\n汉中市政府\t394027\n洗瓶机\t394028\nccav\t394029\n80目\t394030\n一闪一闪亮晶晶\t394031\ngsl\t394032\n餐牌\t394033\n洗发店\t394034\n飞冰\t394035\n亡母\t394036\n特变电工股份有限公司\t394037\n二值化\t394038\n产业区\t394039\n绝对数\t394040\nDism\t394041\n第四次\t394042\n请稍后再试\t394043\n中国人保\t394044\nTomcat\t394045\n离异\t394046\njourneys\t394047\n春假\t394048\n烂番茄\t394049\n亚当夏娃\t394050\n大庆石化\t394051\n_乐游网\t394052\n兴业证券\t394053\n入神\t394054\n电动车主\t394055\n马蜀君\t394056\nBoost\t394057\n百首\t394058\n东坑\t394059\n超模\t394060\nMP4_\t394061\n刘素云\t394062\n布伦希尔特\t394063\n视界网\t394064\n埋藏\t394065\n济州机场\t394066\n探照\t394067\n清奇\t394068\ncobertura\t394069\n凡夫\t394070\n制油\t394071\n流弹\t394072\nColgate\t394073\n肝癌\t394074\n袁老师\t394075\n武器库\t394076\n心理游戏\t394077\ncortana\t394078\n龙丰\t394079\n闲鱼\t394080\n钟宝儿\t394081\nc_\t394082\n正商地产\t394083\n巫\t394084\n中航电测\t394085\n文联\t394086\n发育期\t394087\nFahren\t394088\n鼻咽癌\t394089\n安倍昭惠\t394090\n手机费\t394091\n温病学\t394092\n150000\t394093\n帧缓存\t394094\n西四\t394095\nCartographer\t394096\nUltraISO软碟通\t394097\nRedVelvet\t394098\nzhangjie\t394099\n三煞\t3
94100\nWIRED\t394101\n恶果\t394102\n商鞅变法\t394103\n管锥编\t394104\n感兴趣\t394105\n闪修\t394106\n钟落潭\t394107\n并不遥远\t394108\n裹裹\t394109\n中山市小学\t394110\n成句\t394111\n兰州化物所\t394112\n唾液腺\t394113\n洁儿\t394114\nmanson\t394115\n防城港港口区\t394116\n配发\t394117\n苏州工艺美术职业技术学院\t394118\n准确率\t394119\nspiritual\t394120\n三级医院\t394121\n毛坯房\t394122\n莠\t394123\n巨蟒\t394124\n错因\t394125\nVcenter\t394126\n放火罪\t394127\n002736\t394128\n凯特\t394129\n104.5\t394130\n精确率\t394131\n芭克\t394132\n贞观埃及艳后\t394133\noppoa37\t394134\nRS5\t394135\n感觉到\t394136\n考試\t394137\nrouter\t394138\n新闻研究导刊\t394139\n减值\t394140\n八一村\t394141\nMEDIA\t394142\n帝宫\t394143\ngoalng\t394144\nssrs\t394145\n大展宏图\t394146\n买入价\t394147\n石家庄日报社\t394148\n蒽酮\t394149\n杭州地铁5号线\t394150\n安红\t394151\n国家博物馆\t394152\n吾栖之肤\t394153\n80070490\t394154\n阳明\t394155\n恩施利川\t394156\n马队\t394157\nHYDE\t394158\nNEX-5R\t394159\n僵硬\t394160\n教育科学出版社\t394161\n舌尖2\t394162\n课间文明\t394163\nRefurbished\t394164\n挂封\t394165\n趁热\t394166\n反恐特警队\t394167\n2018037期\t394168\nAxiom\t394169\nldpe\t394170\n无人化\t394171\n幽默\t394172\nshida\t394173\n王平仲\t394174\nCensored\t394175\n裁撤\t394176\n卡波特\t394177\n北京地铁5号线\t394178\n新港镇\t394179\nMicrosemi\t394180\n富山工业园\t394181\nsin函数\t394182\n不变的是\t394183\n考拉fm\t394184\n铜铟镓硒\t394185\nPdfFactory\t394186\n羊犀立交\t394187\n机座\t394188\nLoadRunner11\t394189\n笔\t394190\n孙喆\t394191\n耳石症\t394192\n清凉谷\t394193\nITK\t394194\n嗨起来\t394195\n袁雨萱\t394196\nPhoebe\t394197\n泪雪\t394198\n摇篮曲\t394199\n焚烧\t394200\n高专\t394201\n后补\t394202\n钓鱼论坛|辽钓网\t394203\n北威州\t394204\n徐以杓\t394205\n八佰\t394206\n雪佛兰乐风\t394207\n巨兽岛\t394208\n仿冒品\t394209\n北通W1\t394210\n人防\t394211\n斜率\t394212\n电热膜\t394213\nCatalysis\t394214\n兽药\t394215\n十问十答\t394216\n二时\t394217\nhtbbzzg\t394218\n藻油\t394219\n邮车\t394220\n诺格弗格\t394221\n费尔德\t394222\n固线器\t394223\n焊片\t394224\n调制器\t394225\n920mx\t394226\n人教版五年级下册语文期中考试\t394227\nh版\t394228\n1808S\t394229\n整脊\t394230\n公路实务\t394231\n可不可以\t394232\n剑花\t394233\n搬迁户\t394234\nAccelerator\t394235\n棕榈滩\t394236\n荣耀5C\t394237\n湖北省经信委\t394238\n般若寺\t394239\nmeitian\t394240\n加气站\t394241\nprop
er\t394242\n1.2亿\t394243\n玉娆\t394244\n被捉\t394245\nhei\t394246\nxss攻击\t394247\nunprintable\t394248\n批发部\t394249\nOffice论坛\t394250\n体卫\t394251\n新兴街道\t394252\n长桌宴\t394253\nrigh\t394254\n福建中医药大学\t394255\n撸时代\t394256\n烟雾病\t394257\n泸州老窖特曲\t394258\nmoussy\t394259\n非那雄胺片\t394260\n一度君华\t394261\n利得\t394262\ngfk\t394263\n温岭中学\t394264\n转链\t394265\n24次\t394266\n基本操\t394267\nv币\t394268\n我等到花儿也谢了\t394269\n下厨\t394270\n募兵制\t394271\n城崎温泉\t394272\n奔腾b30\t394273\n屠龙传说\t394274\n石化大道\t394275\n美产\t394276\n韵达快运\t394277\nchronos\t394278\n他汀类\t394279\n尘毒\t394280\n猝然\t394281\n胶线\t394282\n11.2.0.1.0\t394283\n观\t394284\n建超\t394285\n流畅性\t394286\n平衡单\t394287\n云广\t394288\n史海钩沉\t394289\n羚\t394290\nLinksys\t394291\n牛头山\t394292\nwilliams\t394293\n诱妻\t394294\nFLYCO\t394295\nity\t394296\n四孔\t394297\n亮碧思\t394298\n返利网帮助中心\t394299\nGapps\t394300\n军裤\t394301\n阳光电源股份有限公司\t394302\n倒出\t394303\nMadrid\t394304\n砂体\t394305\n梦幻打图\t394306\n励磁\t394307\n无限法则\t394308\nlcp\t394309\n长点\t394310\n.NET编程\t394311\n马未都\t394312\n新面孔\t394313\n蒋氏故居\t394314\noutcome\t394315\n拼拼\t394316\nH265\t394317\n一念成\t394318\n陈霞\t394319\nNYMEX\t394320\nEvening\t394321\n奔驰cla\t394322\n金凯德\t394323\nrop\t394324\n旗手\t394325\n美队3\t394326\n使命召唤OL\t394327\n银泰\t394328\n斗罗大陆动漫\t394329\n睿意\t394330\n黄金花\t394331\n魔镜号\t394332\n刘亚平\t394333\n高压氧舱\t394334\n湖南省检察院\t394335\n泰安市人民政府\t394336\n新义\t394337\nliziyou\t394338\n孟\t394339\nPresenter\t394340\n顾乡\t394341\n九头鸟\t394342\n河水\t394343\n包租\t394344\n郁金香展\t394345\n顺义网城\t394346\n天猫618\t394347\n黄冈中学\t394348\n超碰网\t394349\ndib\t394350\nxiaoxi\t394351\nrap\t394352\n免安裝\t394353\n拙劣\t394354\n溪南镇\t394355\ncontainsKey\t394356\nOutlet\t394357\n第19天\t394358\n铜锤\t394359\nX11\t394360\n明基投影机\t394361\n300英雄\t394362\n直到世界尽头\t394363\n雪佛\t394364\n苞\t394365\n两页\t394366\n趣学\t394367\n固相萃取\t394368\nGHOST版\t394369\n涂岭镇\t394370\n2017年3月14日\t394371\nderma\t394372\n468米\t394373\nAdvertisement\t394374\n知识站\t394375\n吉林大学白求恩第一医院\t394376\n声鉴\t394377\nMikrotik\t394378\n便鞋\t394379\n深沉\t394380\n珠江时报\t394381\n孜孜\t394382\ngf2\t394383\n绝地求生助手\t394
384\n云南机场集团有限责任公司\t394385\n林恺俊\t394386\n60mm\t394387\n七里山\t394388\n卖汤圆\t394389\n白云国际会议中心\t394390\n抖音白\t394391\nC104\t394392\n道爵\t394393\n得得\t394394\nhgfs\t394395\n口头语\t394396\n潜水\t394397\n后排版\t394398\nalittle\t394399\n300C\t394400\n大起大落\t394401\n西安纺织城\t394402\nelise\t394403\n荧光素酶\t394404\n我学会声会影\t394405\n学前卫生学\t394406\n大鸿寨\t394407\nwebSocket\t394408\n吓破\t394409\n雷瑞\t394410\nnodal\t394411\n萨尔特\t394412\n普宁市\t394413\n民主集中制原则\t394414\n潮流社区\t394415\n无产者\t394416\n松鼠鱼\t394417\n潘海利根\t394418\n小萌\t394419\n鸭舌帽\t394420\n20150109\t394421\n深圳四季酒店\t394422\n歌谱\t394423\n日事\t394424\n小儿消积止咳口服液\t394425\n工频电场\t394426\n批八字算命\t394427\n意向函\t394428\n赵博\t394429\n最后\t394430\n问讯\t394431\n透视村\t394432\n名案\t394433\n河套\t394434\n进犯\t394435\n汇中\t394436\nSiwaMAN\t394437\n东映动画制作的电视动画\t394438\n战魂\t394439\n炸弹塔\t394440\nonto\t394441\n5222\t394442\n6.38\t394443\n牙龈肿\t394444\n刚果红\t394445\n新天龙八部杂谈\t394446\n积云\t394447\n0908\t394448\n陈伟星\t394449\nV级\t394450\n6108v9\t394451\n顶头\t394452\ne客\t394453\n木人\t394454\n1.8多\t394455\n十季\t394456\nHtml5\t394457\n县司法局\t394458\n年兽\t394459\n农村经济\t394460\n1美元\t394461\n中职校\t394462\n600581\t394463\n私募股权基金\t394464\n火奴鲁鲁\t394465\n湖南中医药大学\t394466\n彩纸\t394467\nMSR7\t394468\n爱鸟护鸟\t394469\n釜山行\t394470\n封魔\t394471\n变音器\t394472\n海角网\t394473\niPhone4s\t394474\n珠海御温泉\t394475\n3月15日\t394476\n清远市住房和城乡建设管理局\t394477\n骨软骨瘤\t394478\n入胜\t394479\n眉间\t394480\n连盐铁路\t394481\n工作位\t394482\n赖\t394483\n镀锌\t394484\n沪通铁路二期\t394485\n百信银行\t394486\n崔银姬\t394487\n华宇百花谷\t394488\n不夜城\t394489\n薇信\t394490\n广东)自由贸易试验区\t394491\ne150\t394492\n跨性别者\t394493\n言情中文\t394494\ngraceful\t394495\n张爱萍\t394496\n烤鳗鱼\t394497\norale\t394498\n免疫反应\t394499\nmaimai\t394500\n严处\t394501\n锤式\t394502\n鱼丸\t394503\n义乌站\t394504\n不鸣则已\t394505\n十三朝古都\t394506\n三重一大\t394507\n大是大非\t394508\n挹江门\t394509\n印刷品\t394510\n江家\t394511\n牛仔裤\t394512\nMar\t394513\n黄石街\t394514\n塔利\t394515\n24.3\t394516\n353\t394517\n16.04双\t394518\n圣杯\t394519\n蓝瓷\t394520\nMustache\t394521\n安得物流\t394522\n偏关县\t394523\n12600\t394524\n20170928\t394525\n5078\t394526\n黑狗\t394527\
ncompiled\t394528\n蓝鸥\t394529\n弹窗小说阅读网\t394530\n姜堰\t394531\nshareSDK\t394532\n顶进\t394533\n李苦禅\t394534\nmanchester\t394535\n林子善\t394536\nexecl表\t394537\n语文版\t394538\n鳌山\t394539\n学网\t394540\n技嘉\t394541\n公交查询网\t394542\n雄兵连之乾坤篇\t394543\nlau\t394544\n电子体温计\t394545\nWin7任务管理器\t394546\n转会期\t394547\n认识面积\t394548\n圣灵勇士\t394549\n西尾\t394550\nPeripheral\t394551\n卡座\t394552\nrepresenting\t394553\n迷沙\t394554\nBME\t394555\n中科院软件所\t394556\n彭聃龄\t394557\n月亮惹的祸\t394558\n六幅\t394559\n密集阵\t394560\n爆改\t394561\n永盛路\t394562\n加弹机\t394563\n长须\t394564\n小邓\t394565\n教师学\t394566\n云南省国资委\t394567\n3904\t394568\n石林峡\t394569\n阿姨\t394570\n给力\t394571\nSORRY\t394572\n波浪号\t394573\n夏启\t394574\n永日\t394575\n品格教育\t394576\n名景\t394577\n1122\t394578\n物联网技术\t394579\nskse64\t394580\n王乐泉\t394581\n省文明办\t394582\nSafengine\t394583\n得美\t394584\n杰西卡·阿尔芭\t394585\n14k\t394586\n武汉市科学技术局\t394587\n瓦解\t394588\nPCR仪\t394589\n凤凰光学\t394590\n康涛\t394591\n1.14.1\t394592\n无匹配\t394593\n湖北省发改委\t394594\n实效\t394595\ncss样式\t394596\n谈兵\t394597\n闪人\t394598\n上海交通大学附属第一人民医院\t394599\nSFM\t394600\n吴海龙\t394601\n南田路\t394602\nhtb\t394603\n仙裔\t394604\n苏大维格\t394605\n中国建筑设计\t394606\n水渣\t394607\n神剑情天3\t394608\nACDSee9.0\t394609\nRon\t394610\n貼\t394611\ntf卡\t394612\n广东邮电职业技术学院\t394613\n宝能集团\t394614\n农业科技园\t394615\n延髓\t394616\n真北路\t394617\nwprd\t394618\n中口\t394619\n姚玉舟\t394620\nscd\t394621\n秋天的怀念\t394622\nlpm\t394623\n罗格斯大学\t394624\n创业家园\t394625\n桔红\t394626\n改点\t394627\n建筑材\t394628\n包茎\t394629\n资源包\t394630\n6套\t394631\n山东省立医院东院\t394632\nprosper\t394633\n2100\t394634\nsm2\t394635\nSpec\t394636\nEttercap\t394637\nss5\t394638\n战争艺术赤潮指挥学院\t394639\n女人网\t394640\n商标网\t394641\n规格\t394642\n燕南飞\t394643\n维音\t394644\n鬼妃\t394645\n武林\t394646\n松风\t394647\n架架\t394648\nAuthority\t394649\n武汉园博会\t394650\n干燥剂\t394651\n杨真真\t394652\n江苏省疾病预防控制中心\t394653\n小衫\t394654\nopus\t394655\n金逸国际影城\t394656\n地铺\t394657\n乳酶\t394658\n创新营销\t394659\n烟液\t394660\n巫家坝\t394661\n万寿菊\t394662\n1V7\t394663\n400辆\t394664\nyn\t394665\n朱轻\t394666\n学得\t394667\n中华民族园\t394668\nzlg\t394669\n福彩3D试机号\t39467
0\nguanggao\t394671\n西洋棋\t394672\nHNA\t394673\nSmo\t394674\n高氯酸钾\t394675\n碘\t394676\n正略\t394677\nAppstore\t394678\n数次\t394679\n弗锐达\t394680\n圣晶石\t394681\n猞猁\t394682\n主播\t394683\n水木年华\t394684\n广花\t394685\n帝王谷\t394686\n德勤会计师事务所\t394687\n闪客2\t394688\nPDF版\t394689\n敬语\t394690\n特卡波湖\t394691\nSERVER2000\t394692\n白玉盘\t394693\n荷花村\t394694\n学诚法师\t394695\nReconstruction\t394696\nㄥ\t394697\n千企\t394698\n大清谷\t394699\n封边条\t394700\nmonsterLin\t394701\n闪客\t394702\n分辩率\t394703\nPIPE\t394704\n150型\t394705\n微赛\t394706\n700万美元\t394707\n网络工程师考试\t394708\n至死方休\t394709\n导带\t394710\n2018年元旦\t394711\n云政\t394712\n洛\t394713\n婺城区\t394714\n股道\t394715\n色数\t394716\ndedec\t394717\n4列\t394718\n瓦拉纳西\t394719\n浑南新区\t394720\n招贤镇\t394721\n骊\t394722\n居合\t394723\n焚稿\t394724\n6位\t394725\n香槟酒\t394726\n更美\t394727\n框线\t394728\n创青春全国大学生创业\t394729\n16粒\t394730\n香国\t394731\n蒸槐花\t394732\nABP-108\t394733\nDoor\t394734\n甜菊糖苷\t394735\n生物制药公司\t394736\nskipping\t394737\n卷绕\t394738\n颧骨\t394739\n代官山\t394740\n99子宫网\t394741\n换热\t394742\nansible批量\t394743\n公英\t394744\n一气呵成\t394745\n万达文化旅游城\t394746\n聯賽\t394747\n指甲\t394748\n少司命\t394749\n曹可凡\t394750\n瀚海晴宇\t394751\n杭州租房网\t394752\n广东省政府门户网站_广东省人民政府\t394753\n中国宋庆龄青少年科技文化交流中心\t394754\n招投标法\t394755\n农名工\t394756\n交待\t394757\n大嗓门\t394758\n湖北医药学院\t394759\n可怜天下父母心\t394760\n笑容满面\t394761\n肉球\t394762\n星形\t394763\n℃\t394764\n和谐广场\t394765\n溶度积常数\t394766\n胡文阁\t394767\nxb\t394768\n海南陵水政府\t394769\n河北建工集团\t394770\n高处\t394771\n杰士\t394772\n什马\t394773\n转让\t394774\n公瑾\t394775\naccept函数\t394776\n60A\t394777\n学弹\t394778\n料管\t394779\nbabies\t394780\n黄毛\t394781\n信息\t394782\n贴膜\t394783\n3月7日\t394784\n田牧宸\t394785\n中国酒都·仁怀市人民政府\t394786\n安卓版_2265安卓网\t394787\n秦升\t394788\n投融界\t394789\nbogon\t394790\n系统版本号\t394791\n神龟\t394792\n成都农商行\t394793\n周文\t394794\n凶棺\t394795\n多特\t394796\n宁波柏悦酒店\t394797\n西洲\t394798\n旋光性\t394799\n易鸽网\t394800\n礼貌性\t394801\n硫磺皂\t394802\n撞开\t394803\nskp\t394804\nspire\t394805\n夫妻之间\t394806\n紫薇洞\t394807\n高顿\t394808\nguilty\t394809\n天龙八\t394810\n怒江大峡谷\t394811\n寰宇\t394812\n抽象代数\t394813\n9300e\t3
94814\n结算员\t394815\n瑞达期货\t394816\nyk\t394817\n电缆分接箱\t394818\n注册监理师\t394819\n歇一歇\t394820\n费恩曼\t394821\n和你在一起\t394822\n长安欧尚CX70\t394823\n卡妹\t394824\n第19关\t394825\n2件套\t394826\n倾囊\t394827\n壮哉\t394828\n徐凝\t394829\n天府新城\t394830\n电子点\t394831\nought\t394832\n十八届中央纪委\t394833\n模板库\t394834\nM7.5\t394835\n医用\t394836\nfoxit\t394837\n使命心得\t394838\n伊蚊\t394839\n挑战杯\t394840\n肾阳虚\t394841\n诗朗诵\t394842\n106平米\t394843\n牡蛎\t394844\n三丝\t394845\n东升\t394846\n宾补\t394847\n缜密\t394848\nWPF\t394849\n影之诗\t394850\n天天宠物网\t394851\n真空玻璃\t394852\n快运\t394853\n夥伴\t394854\njsonp\t394855\n解困\t394856\n旧病复发\t394857\nLdap\t394858\n凯儿得乐\t394859\n一拖股份\t394860\n凤凰山庄\t394861\n闪屏\t394862\n北京大学生命科学学院\t394863\n音笑\t394864\n请点\t394865\n第一则\t394866\nesther-qing\t394867\n30l\t394868\n重庆汽车站\t394869\n罐车\t394870\n沐浴房\t394871\n235T\t394872\n郑薇\t394873\n怀才不遇\t394874\n刘志勇\t394875\n火祭\t394876\nsabrina\t394877\n七微\t394878\n驾驶执照\t394879\n南充人事考试网\t394880\ntasklet\t394881\n生物学系\t394882\n玄武岛\t394883\n幸福乡\t394884\n乂度\t394885\n瓦厂\t394886\nReversible\t394887\nmyprotein\t394888\n石乐志\t394889\nQQ电脑管家\t394890\n慕道友\t394891\n水液\t394892\nmaco\t394893\n厦航白鹭会员俱乐部\t394894\n炉石传说卡拉赞\t394895\n市安委办\t394896\n动笔\t394897\n六套\t394898\n中华中医网\t394899\n亲密度\t394900\n一路繁花相送\t394901\n恶龙\t394902\nxerces\t394903\n192.168.1.200\t394904\n圣外王\t394905\n第五十六条\t394906\n武汉六中\t394907\n明源\t394908\n白穴\t394909\n帕加尼\t394910\n维控\t394911\n玻璃罐\t394912\n一零\t394913\n法西斯战争\t394914\n吉林省实验中学\t394915\n每况愈下\t394916\n韶关新闻网\t394917\n星光村\t394918\n弹刀\t394919\n根子\t394920\n金八天国\t394921\nHAWK\t394922\n忘忧草\t394923\n招行信用卡中心\t394924\n广州市第三中学\t394925\n三里桥\t394926\nOpto\t394927\n120种\t394928\n教义\t394929\n电压档\t394930\n长临河镇\t394931\nrealtek音频管理器\t394932\n第三十九章\t394933\n750分\t394934\n周恩来\t394935\n庆熙大学\t394936\n肺结核\t394937\n黑珍珠\t394938\n毕克\t394939\n李蒙\t394940\n抽象工厂模式\t394941\n红星制砂机\t394942\n196号\t394943\n省中医院\t394944\n10月20\t394945\n深圳市第二职业技术学校\t394946\n黄哥\t394947\n尾单\t394948\n京山\t394949\nGD权志龙\t394950\n145cm\t394951\n全能班\t394952\n贵州移动\t394953\n深夜食堂2\t394954\n油面\t394955\n徽\t394956\n20170723\t394
957\n微塔式机\t394958\n知识化\t394959\n枚\t394960\n场记\t394961\n5iKFC\t394962\n婚庆公司\t394963\n梅露可\t394964\n几国\t394965\n篮圈\t394966\n颂拓\t394967\nWS832\t394968\n驾船\t394969\n花店\t394970\n京京\t394971\nsop\t394972\n闲\t394973\nQQ堂\t394974\n学号\t394975\n1.0kg\t394976\n15首\t394977\n咸鸭蛋\t394978\n久处不厌\t394979\n第100个\t394980\n微服\t394981\n四一二\t394982\nmod_模拟人生4\t394983\n电男\t394984\n工业型\t394985\n杨家洼情事\t394986\n人参皂苷\t394987\nMS法\t394988\n俟\t394989\n板甲\t394990\n河北省博物馆\t394991\n敢达争锋对决\t394992\n剧版\t394993\n小袖\t394994\n属性类\t394995\n猛犸\t394996\n竖板\t394997\nbb霜\t394998\n昭通市教育局\t394999\n营业\t395000\n王峰\t395001\n乐业县\t395002\n挂条\t395003\n羚羊角\t395004\nTurnitin\t395005\nYat\t395006\n潇湘馆\t395007\n温县\t395008\n雁山\t395009\n端正\t395010\n注射泵\t395011\nflume-ng\t395012\nJUICE\t395013\nisf\t395014\n陈全林\t395015\n市地震局\t395016\n4.5千\t395017\n5转\t395018\n复地集团\t395019\n2.0TSI\t395020\n冷板\t395021\n班草\t395022\n骊住\t395023\n凯爷\t395024\n小爱音箱mini版\t395025\n高晨\t395026\ntinyproxy\t395027\nS11\t395028\nCrisis\t395029\n清辉\t395030\n萧绍路\t395031\n北京一区\t395032\n好热\t395033\n电法\t395034\nHDL\t395035\n公务机\t395036\n压实度\t395037\nRenault\t395038\nQQ昵称\t395039\n田鹏\t395040\njqery\t395041\n肯帝亚\t395042\n卓越时代广场\t395043\n有关注\t395044\n4月10号\t395045\n大饼丸\t395046\nPhotoshopCC\t395047\n大窑湾\t395048\nesttab\t395049\nManitoba\t395050\n立方网\t395051\n我们的家\t395052\nWTL\t395053\n家族式\t395054\n布洛芬片\t395055\n头脑\t395056\n完颜亮\t395057\nExcel常用函数\t395058\n阿斯特\t395059\n肉耽\t395060\nLoco\t395061\n春色满园\t395062\n福建医科大学附属第二医院\t395063\n腾信\t395064\n河南省人才交流中心\t395065\n北京九中\t395066\nempires\t395067\nr7plus\t395068\nNOME\t395069\n进制值\t395070\n路地\t395071\n跨职能\t395072\n江边村\t395073\nprt\t395074\n乐游原\t395075\n金骏眉\t395076\nCityU\t395077\n优卡\t395078\n教科文\t395079\n翟天英魂之刃\t395080\n杭州师范大学附属中学\t395081\n猥\t395082\n吝惜\t395083\nNV\t395084\n捌零音乐\t395085\n萨特\t395086\nfright\t395087\nwin7系统远程\t395088\n水位传感器\t395089\n泰信\t395090\n脸小\t395091\n云门山\t395092\n轻触\t395093\n一个4位\t395094\nSTAAD\t395095\n张飞跃\t395096\n拉远\t395097\n安铺镇\t395098\ncontinues\t395099\n开瑞坦\t395100\n电流麦\t395101\n选项\t395102\nnoir\
t395103\n13公分\t395104\n热开水\t395105\nne5532\t395106\n2.1.7\t395107\nArabic\t395108\n连凯\t395109\nl360\t395110\n750Ti\t395111\nAstra\t395112\n省口腔医院\t395113\nlinq\t395114\n开普云\t395115\n成都招聘网\t395116\ncomicup\t395117\n人大附中朝阳分校\t395118\n蛋黄油\t395119\n全长\t395120\nPSVita\t395121\n奶声\t395122\nriser\t395123\n熊飞\t395124\n不辨\t395125\n神舟5号\t395126\nPrinters\t395127\n秦丝\t395128\n马丁斯科塞斯\t395129\n新绝代双骄\t395130\n李文平\t395131\n短袜\t395132\n三官\t395133\n干细胞库\t395134\nCorrugated\t395135\n85530351\t395136\n印集市\t395137\n千金翼方\t395138\nNoise\t395139\n水北\t395140\n临帖\t395141\nLeoxlu\t395142\nSexInSex\t395143\n畔山\t395144\n机械制图\t395145\n粘尘\t395146\n互联网保险公司\t395147\n深圳建设银行\t395148\ndy/dx\t395149\n峰山\t395150\nswot分析\t395151\n格之格\t395152\n毛板\t395153\naskDing\t395154\n授课制\t395155\n德西\t395156\n金香\t395157\npa\t395158\nstory2\t395159\n自制美白\t395160\n湖南物业网\t395161\n氯雷他定糖浆\t395162\n周叶舟\t395163\n李秋\t395164\n麦秆\t395165\ndragonsea\t395166\n雾霾\t395167\n全国版\t395168\nY450\t395169\n傲立\t395170\nBitmapImage\t395171\n版刷机\t395172\n别克昂科威\t395173\nvisitors\t395174\n年生\t395175\n喜子\t395176\n盖错\t395177\n接触网\t395178\n科沃尔\t395179\n卉\t395180\nunc\t395181\n罗威纳犬\t395182\n析\t395183\n抗病性\t395184\n长沙市质量技术监督局\t395185\nrxd\t395186\n孔帕尼\t395187\nfixup\t395188\n广州蓝月亮实业有限公司\t395189\n大自然的语言\t395190\n致悦\t395191\n开挂_唠叨网\t395192\n售出\t395193\n为所欲\t395194\nFRU\t395195\n第一站\t395196\n内蒙古自治区体育局\t395197\n大连幼升小\t395198\n刊号\t395199\n张全蛋\t395200\n21_\t395201\n轮廓化\t395202\nairjordan\t395203\n忌廉\t395204\n刘一曈\t395205\n柳田弥生\t395206\n立案登记制\t395207\nfuzzy\t395208\n000538\t395209\n捂脚\t395210\ncad2007\t395211\n懒财网\t395212\n泰山职业技术学院\t395213\n赛睿寒冰\t395214\n茶坊\t395215\n天籁之战\t395216\nSnoopy\t395217\n严禁\t395218\n知识产权法院\t395219\n冷暖色\t395220\nZX1\t395221\n杭州娃哈哈集团有限公司\t395222\n银行端\t395223\n房屋所有权证\t395224\n软云\t395225\n空调移机\t395226\n震音\t395227\nxpdown\t395228\n静电贴\t395229\nFlush\t395230\n白梨\t395231\n灰度\t395232\n13卷\t395233\n红瓷\t395234\n龙阳路2345号\t395235\n达濠\t395236\n震动盘\t395237\n标白\t395238\n槽型\t395239\n运动裤\t395240\n求知欲\t395241\n物易物\t395242\n闹觉\t395243\n优惠政策\t395244\n
延庆县\t395245\n住房保障网\t395246\n温莎公爵\t395247\n壁床\t395248\n2.19\t395249\n转去\t395250\n啦啦\t395251\n上海论坛_汽车之家论坛\t395252\n无刷电调\t395253\nA6L\t395254\n唇语\t395255\nBall\t395256\nMT7620\t395257\nDishonored\t395258\n曼舞\t395259\n洞藏\t395260\n版书\t395261\n净水壶\t395262\n20170217\t395263\n格线\t395264\nHNOI\t395265\n易学网\t395266\n瑞兹\t395267\n战后\t395268\n燃魂\t395269\n详讯\t395270\n操作类\t395271\niphone7吧_\t395272\n父母心\t395273\n厕所\t395274\n象雄\t395275\nY79\t395276\n虚光\t395277\n西藏地区\t395278\n国斗\t395279\n精片\t395280\ncareer\t395281\ncmyk值\t395282\n阴线\t395283\n上海医学院\t395284\n认贷\t395285\n简谈\t395286\n陈立群\t395287\n分值\t395288\n刮起\t395289\n登金陵凤凰台\t395290\n铁渣\t395291\n毕姥爷\t395292\n青工\t395293\n黄岐镇\t395294\n網絡\t395295\n科特勒\t395296\nEP1\t395297\n尊尊\t395298\n天正2014\t395299\n新众泰T700报价_众泰汽车众泰T700\t395300\n刘公\t395301\n双十一\t395302\n上海电气集团\t395303\n安泰\t395304\n巨龙之战\t395305\n建行公司\t395306\n悠百佳\t395307\n孩爸孩\t395308\n热力站\t395309\n丙烯酸乙酯\t395310\n广州地铁5号线\t395311\nVTS\t395312\n方圳\t395313\n罗湖东门\t395314\nCarte\t395315\nyuxi\t395316\ntriangular\t395317\n占位\t395318\n彭宇案\t395319\n赋税\t395320\nA45\t395321\n东莞外国语学校\t395322\nBarack\t395323\n乌云\t395324\n渡边谦\t395325\n2018,2018年\t395326\n大陆架\t395327\n斯芬克斯\t395328\n转固\t395329\n旋片\t395330\n成都九中\t395331\nAD、BC\t395332\nMP\t395333\n魏瑾\t395334\n蓉漂\t395335\n上海市控江中学\t395336\n主播版\t395337\n分食\t395338\n养晦\t395339\n美国航空公司\t395340\n波动率\t395341\nMm\t395342\n厄宫\t395343\n福利待\t395344\nFather\t395345\n一点\t395346\n印章盒\t395347\n中山市东区\t395348\n雪茄盒\t395349\nPornZ\t395350\nShadow\t395351\n赵立晨\t395352\n2018年二月\t395353\n马芳\t395354\n肥胖症\t395355\n车家号\t395356\nturkish\t395357\n黑木一香\t395358\n音源\t395359\n国际社会\t395360\n不久后\t395361\n徐力\t395362\n理想三旬\t395363\n常州市教育局\t395364\nchocker\t395365\n20160721\t395366\n沪粤\t395367\n段玉\t395368\n滥觞\t395369\n高辛\t395370\n马来酸氯苯那敏\t395371\nE08\t395372\n插接箱\t395373\n宋博士\t395374\n聆听\t395375\npn532\t395376\n油葵\t395377\n大清\t395378\n老支书\t395379\n登峰造\t395380\n证机\t395381\n七千多\t395382\n软体机器人\t395383\n情路\t395384\n8倍\t395385\nhttpd\t395386\n早早\t395387\n广州市人民政府法制办公室\t395388\n鹿寨县\t395389\n百分之十五
\t395390\n羊肉串\t395391\n请讲\t395392\n随身杯\t395393\nCCTV-5体\t395394\n钢琴课\t395395\n云南红塔银行\t395396\n彼尚\t395397\n无人工厂\t395398\n质量守恒定律\t395399\n仙人跳\t395400\n莲塘村\t395401\n闭幕\t395402\n荆门东宝政府\t395403\nupdate3\t395404\n动画篇\t395405\n澄迈\t395406\n1297\t395407\n上汽集团\t395408\n互备\t395409\n9660\t395410\n公修课\t395411\ntpi\t395412\n踝关节\t395413\nHHH\t395414\nTBA\t395415\nOPTS\t395416\n一爱\t395417\n603288\t395418\naicc2017\t395419\n首开龙湖\t395420\n达里尔\t395421\n研究生教育网\t395422\n【蔚\t395423\n银都大厦\t395424\n刘永行\t395425\n10盒\t395426\n长沙晚报\t395427\n在那遥远的地方\t395428\n北京孔庙\t395429\n受惠\t395430\n起重机网\t395431\noverdose\t395432\n相在\t395433\n十论\t395434\nui-router\t395435\n2007\t395436\n七千年\t395437\n丰台\t395438\n永阳镇\t395439\nEarn\t395440\n都市奇缘\t395441\n立海大\t395442\n樱花大战吧\t395443\n尼子\t395444\n黑哥\t395445\n唾液淀粉酶\t395446\n2015年初\t395447\n软市场\t395448\n肌壁\t395449\n入队\t395450\n启辰T90\t395451\n转笔\t395452\nxx县\t395453\n2千多\t395454\n余情\t395455\n交通银行信用卡中心\t395456\n手机百田网\t395457\n绿沸石\t395458\nps2模拟器\t395459\n最美乡村\t395460\n夏小秋\t395461\n魔格\t395462\n木兰拳\t395463\n再融资\t395464\n实用新型专利\t395465\nxart\t395466\nYUI\t395467\n孙浩\t395468\n黑白点\t395469\nA16.4\t395470\n义乌国际商贸城三区\t395471\nremember\t395472\n33码\t395473\n小娇娘\t395474\n270度\t395475\n产业界\t395476\n人偶\t395477\n瓦工\t395478\ncider\t395479\n婚后\t395480\n达州新闻网\t395481\n小包子\t395482\n保畅\t395483\npasta\t395484\n天官赐福\t395485\n跳跃性\t395486\nIMO\t395487\n史前巨鳄\t395488\n宝丰\t395489\n業務\t395490\nAir13\t395491\nupdater\t395492\n火名网\t395493\n平衡感\t395494\n真真\t395495\n手册\t395496\n明治天皇\t395497\n洗版\t395498\n上海搬家公司\t395499\n笔趣阁中文网\t395500\n南皮\t395501\n三角肌\t395502\n经济结构\t395503\n襄阳市\t395504\n种皮\t395505\nwwf\t395506\n宣仪\t395507\n华润二十四城\t395508\n国土证\t395509\n文意\t395510\na12\t395511\n巜\t395512\n6.4%\t395513\n曹芙嘉\t395514\n51下片\t395515\n圆顶\t395516\n究极版\t395517\n土场\t395518\n焚烧炉\t395519\n单缝衍射\t395520\n魔兽争霸\t395521\n纺织\t395522\n独立宣言\t395523\n乐山镇\t395524\nLovato\t395525\n官恩娜\t395526\n玉卿嫂\t395527\n嘉和园\t395528\n独游网\t395529\n属性值\t395530\n起亚缤智\t395531\nputting\t395532\n天邈汉化组\t395533\n中土世界战争\t395534\nFRI\t395535\n中央大街
\t395536\n喜多郎\t395537\n8亩\t395538\n不见了解\t395539\n喇嘛教\t395540\n夯土\t395541\nzdh\t395542\nProactive\t395543\n侥幸\t395544\n浅述\t395545\n淹\t395546\nToxic\t395547\n脆皮烧肉\t395548\n威廉姆斯\t395549\ns19\t395550\n晾衣机\t395551\n银河掠夺者\t395552\n美图m8\t395553\n五棵树\t395554\n0706\t395555\ncontroversial\t395556\n溁湾镇\t395557\nIssue\t395558\n只见\t395559\n90元\t395560\n地厅级\t395561\nWAV/分轨\t395562\n云计算大会\t395563\n了却\t395564\n万科泊寓\t395565\n血象\t395566\n螺母机\t395567\ntiming\t395568\nprofits\t395569\n五里店\t395570\n4098\t395571\n巨腕\t395572\n魏群\t395573\n陈州\t395574\n三改一拆\t395575\nMIU\t395576\n王秀英\t395577\nbroadband\t395578\n线位\t395579\n云起书院\t395580\n膏器\t395581\n20秒后\t395582\n物料\t395583\n第19轮\t395584\n迪贝拉\t395585\nhints\t395586\n改文\t395587\n月向日葵\t395588\n食品药品监管\t395589\n井底\t395590\n王翚\t395591\n鹭江\t395592\n老女\t395593\n泰北\t395594\n胸闷气短呼吸困难\t395595\n省份\t395596\n物\t395597\n史努\t395598\n.com\t395599\n一乙\t395600\n陈国强\t395601\n168.com\t395602\n安睡\t395603\nbanananana\t395604\n上海航欧\t395605\n噘嘴\t395606\n甲硫醇\t395607\n全北现代\t395608\n天地孤影任我行\t395609\necplice\t395610\n大王庄\t395611\n玫瑰玫瑰我爱你\t395612\n出水管\t395613\n唐山工业职业技术学院\t395614\n清远\t395615\n出入境\t395616\n企业史\t395617\n自贡市政府\t395618\n王悠悠\t395619\n首旅集团\t395620\n有机朗\t395621\nriddle\t395622\n手持稳定器\t395623\n观音阁\t395624\n加锁\t395625\nLAN\t395626\nPajek\t395627\n临朐县\t395628\n共同类\t395629\n正传\t395630\n圣坛\t395631\n不成活\t395632\n大连市统计局\t395633\n顶体\t395634\n瘦削\t395635\n祭祖\t395636\n免流服务器\t395637\n偶酷网\t395638\nINDESIGN\t395639\n横河镇\t395640\n变容二极管\t395641\n波兰爱经\t395642\n德国党卫军\t395643\n头包\t395644\n撒旦教\t395645\nPageRank算法\t395646\n20160622\t395647\nQ房\t395648\n王同学\t395649\n井位\t395650\n越国\t395651\n惠紫\t395652\n即插即用\t395653\ndangerous\t395654\n加藤鹰\t395655\n疼痛科\t395656\n储能逆变器\t395657\n浮点数\t395658\n2017_安全管理网\t395659\nWIRELESS\t395660\n600398\t395661\nYoYo\t395662\n平武县\t395663\n不可达\t395664\n小榄\t395665\n小牛在线吧\t395666\n砖房\t395667\nVHDX\t395668\nCAD批量\t395669\n山椒\t395670\n搜索\t395671\n硅胶娃娃\t395672\n亭林公园\t395673\n447\t395674\n双流吧\t395675\n齐木楠雄的灾难第二季\t395676\n南侨\t395677\nBP算法\t395678\nreactivecocoa\t39
5679\n二十篇\t395680\n行长\t395681\n老客户端\t395682\n北京天使合唱团\t395683\n做买卖\t395684\n100万只\t395685\ntelephone\t395686\n5月上旬\t395687\n两个工作日\t395688\n病危通知书\t395689\nxp123\t395690\n斯巴鲁森林人\t395691\n三鲜汤\t395692\n3202\t395693\n夺还\t395694\n中小企业融资\t395695\n天汇城\t395696\n掖\t395697\n西奥\t395698\n超管\t395699\n猴拳\t395700\n第4辑\t395701\n经济与管理学院\t395702\n4.6级\t395703\n刺痛感\t395704\n建造\t395705\n657\t395706\n刀塔2\t395707\n废塑料网\t395708\n福州自贸区\t395709\n吴圩国际机场\t395710\n希森美康\t395711\n香河肉饼\t395712\n卢作孚\t395713\n中国大学生在线\t395714\n凤起\t395715\n医学影像学\t395716\n宗汉\t395717\n华为畅享5\t395718\nDelcam\t395719\n文官\t395720\n新仓\t395721\n昆仑健康\t395722\n缠住\t395723\n楢山\t395724\n考虫网\t395725\n永恒经\t395726\n物质化\t395727\nnew\t395728\n1.6.1\t395729\n8.4.2\t395730\n68\t395731\nTUP\t395732\n何厚铧\t395733\n南昌国家高新技术产业开发区\t395734\n勇当\t395735\n老客网\t395736\n185e\t395737\n霍元甲\t395738\n病夫\t395739\n裸花紫珠片\t395740\n汽车销售管理办法\t395741\n最长回文子串\t395742\n思途\t395743\n西丽水库\t395744\n两长\t395745\nppt素材库\t395746\n宝沃BX7_汽车\t395747\n绝缘电阻测试仪\t395748\n第四十三条\t395749\n途乐论坛_汽车之家论坛\t395750\n去哪里\t395751\n林安\t395752\n晚期\t395753\n江米\t395754\n眉页\t395755\nanyconnect\t395756\nIX\t395757\n黄洁\t395758\n8月1日起\t395759\n弗洛姆\t395760\n方方正正\t395761\n音悦V榜\t395762\n2.2L\t395763\nstarts\t395764\n公诉\t395765\n北师珠\t395766\nNeural\t395767\n臊\t395768\n博野县\t395769\n煌煌\t395770\n蚶江镇\t395771\nfarcry5\t395772\n幕帘\t395773\n马镇\t395774\n56酷\t395775\n大分\t395776\n庞麦郎\t395777\n开心麻花小品\t395778\n温州站\t395779\n小晚\t395780\n纠纷\t395781\n胶水\t395782\n一两岁\t395783\n压缩比\t395784\nperldoc\t395785\n蒸箱\t395786\n新疆维吾尔自治区党委\t395787\n6000吨\t395788\n郑惟桐\t395789\n天盈\t395790\n金属类\t395791\n说不想\t395792\n山东省精神卫生中心\t395793\n大苏\t395794\n仙侠道\t395795\n七叶莲\t395796\n氮\t395797\n四藏\t395798\n干文\t395799\nARCore\t395800\n看过\t395801\nwin8/win8.1\t395802\n金秀\t395803\nHUNT\t395804\n代报\t395805\n连续性\t395806\n辛弃凤\t395807\ndtnl\t395808\n趣闻网\t395809\n找规律\t395810\nspss22\t395811\n下定义\t395812\ncatching\t395813\n广州海洋馆\t395814\n离地\t395815\n魔笛magi\t395816\n乐思\t395817\n魔鬼恋人\t395818\n造假者\t395819\n中国工程院\t395820\n形态学\t395821\n木炭\t395822\n林申\t395823
\nIPTables\t395824\nrunningman2018\t395825\nloge\t395826\n慧通\t395827\nremake\t395828\n洛阳正骨医院\t395829\n璇儿\t395830\n铵态\t395831\npik\t395832\n吊粒\t395833\n滨河北路\t395834\n网易房产\t395835\n眉弯\t395836\n行尸走肉游戏\t395837\n七步曲\t395838\n魏荣元\t395839\nEA6500\t395840\ndbc\t395841\n周江勇\t395842\n哥特王朝3\t395843\n情话大全_个性说说网\t395844\n三分法\t395845\n河滨路\t395846\n银箔\t395847\n伊势尼\t395848\n秋瓷炫\t395849\n101级\t395850\n89.7\t395851\n沫沫\t395852\n艾莉\t395853\n顺风车\t395854\nAndroid4.0\t395855\nLiteIDE\t395856\n180416\t395857\n专筑网\t395858\n直管\t395859\n邓子恢\t395860\n2108s\t395861\n偶见\t395862\n热毒\t395863\n晶种\t395864\n宿迁市级机关工委\t395865\nmercy\t395866\n福岛核电站\t395867\n梁甫吟\t395868\nAD采样\t395869\n$5\t395870\n热学\t395871\n樱花庄的宠物女孩\t395872\n大锅盖\t395873\nkuan\t395874\n3D金瓶梅\t395875\n三八大盖\t395876\n加州大学\t395877\nQoo10\t395878\n柔性防护网\t395879\n五十一\t395880\n小米\t395881\n毛腾飞\t395882\n长期避孕药\t395883\n竞业禁止协议\t395884\nSSF\t395885\n14卷\t395886\nCCTV-7_央视网\t395887\n内心戏\t395888\n纯音\t395889\n阳狮集团\t395890\n乐安县政府\t395891\n旅顺口\t395892\n做弊\t395893\nPHI\t395894\n人力资源市场网\t395895\n1978年\t395896\n易索论坛\t395897\nPR值\t395898\n15英寸\t395899\n患子\t395900\nstata回归分析\t395901\n主播队\t395902\n陈俊华\t395903\na73\t395904\nrush\t395905\n证券日报\t395906\n将军们\t395907\n远安县\t395908\n雁南飞\t395909\n月年\t395910\n贤哥\t395911\nttc\t395912\n四川省投资集团有限责任公司\t395913\nWSOP\t395914\n1-2年\t395915\n高珊\t395916\n百分率\t395917\nexported\t395918\ninteractions\t395919\n秋歌\t395920\ngilde\t395921\n热度\t395922\n潇湘冬儿\t395923\n支出资本化\t395924\n神木村\t395925\n电视眼\t395926\n罗欣药业\t395927\n僵尸兵\t395928\n熔化炉\t395929\n金贝塔\t395930\n便签本\t395931\n周良\t395932\n舒尔\t395933\n王晓秋\t395934\n都匀新闻网\t395935\n木\t395936\n成都艺术职业学院\t395937\n东宫媚娘\t395938\n周红\t395939\n蹂\t395940\nglen\t395941\n迪丽吴磊\t395942\n租号\t395943\nbrings\t395944\n11月4日\t395945\n修理厂\t395946\n263云\t395947\n毫不\t395948\n3.64\t395949\ndecoder\t395950\n美国神婆星座网\t395951\n摆幅\t395952\n三道街\t395953\nWorkplace\t395954\n旋挖灌注桩\t395955\n骨传导\t395956\n1.8%\t395957\n五性\t395958\n血豆腐\t395959\n中国建筑第六工程局有限公司\t395960\n掳\t395961\n富盛镇\t395962\n晴空物语\t395963\n木饰面\t395964\nIGBT\t39596
5\n紫斑\t395966\nClass类\t395967\n创森\t395968\n味观\t395969\n曾平\t395970\n我的大叔\t395971\n转结\t395972\n陈俊宇\t395973\n根圆\t395974\n制专转\t395975\n鳄霸雷克顿\t395976\n七部\t395977\n一花一世界\t395978\ncall客\t395979\n金阁寺\t395980\n窍\t395981\n书信类\t395982\n画江山\t395983\n10速\t395984\n下盖\t395985\n高通盛融\t395986\n化学试\t395987\n靓仔\t395988\n1.16\t395989\n咳嗽药\t395990\n刘纯燕\t395991\nBatis\t395992\n国泰基金管理有限公司\t395993\n南方企业新闻网\t395994\n打捆\t395995\n疫源\t395996\n面包虫\t395997\n超能玩具白白侠\t395998\n天内\t395999\n虬\t396000\n冷库\t396001\n回发\t396002\n宋诗\t396003\n彭彭\t396004\n伴旅\t396005\n91号\t396006\n微球蛋白\t396007\n才华\t396008\nmope\t396009\nOrganizing\t396010\n王太利\t396011\n归国留学生\t396012\n希儿\t396013\nrena\t396014\naes256\t396015\n范家屯\t396016\n金锁\t396017\nPredict\t396018\n67.0.3371.0\t396019\n697\t396020\n20171111\t396021\n宏达\t396022\n灰黄鹂\t396023\n雪燕\t396024\n一个句\t396025\n及时雨\t396026\n巴陵\t396027\n木生\t396028\n荷脑\t396029\n南开大学研究生院\t396030\n南关街\t396031\nMobi\t396032\n一系列\t396033\n美博\t396034\n发送端\t396035\n星族\t396036\n刘文华\t396037\n南极人\t396038\n娜塔莉亚\t396039\n浓烈\t396040\n梧州市\t396041\n急性早幼粒细胞白血病\t396042\nfool\t396043\n1098\t396044\nCCTV空中剧院\t396045\n光棍节\t396046\nTBC\t396047\n作文学\t396048\n千觞\t396049\n郭蔼明\t396050\n漳州港\t396051\n蒋勇\t396052\ngridster\t396053\n20161124\t396054\nU12\t396055\nspectrogram\t396056\n二级消防工程师考试\t396057\n容华\t396058\n好强\t396059\n孟建民\t396060\n直销\t396061\n约谈\t396062\n瓦数\t396063\n阿蜜果\t396064\n葡萄糖水\t396065\n电子照\t396066\n大拇指幼儿园\t396067\nbacnet\t396068\n奥米茄\t396069\n親友\t396070\n北流市\t396071\n瞌睡虫\t396072\n银河英雄传\t396073\nMicrotiger\t396074\n哈特\t396075\nlvmh\t396076\nTanya\t396077\n上架\t396078\n金燕\t396079\n横店电影城\t396080\n车石\t396081\n刘瑞\t396082\naria2\t396083\nEscherichia\t396084\n×\t396085\n大山包\t396086\n乙卷\t396087\nJeunesse\t396088\n猫山王\t396089\n岗亭\t396090\n炼铁厂\t396091\n_永生小说网\t396092\n百度网盘助手\t396093\n沃仕达\t396094\n曹曦文\t396095\nSetter\t396096\n理人\t396097\ncumulative\t396098\n速写\t396099\n仙境传说Ro\t396100\n56级\t396101\n沧州友通管道有限公司\t396102\n南芬区政府\t396103\npowerdesign\t396104\n古镜\t396105\n太康\t396106\n美国研究生院\t396107\n林妹妹\t396108\n二逼\t396109
\n佣兵悦动\t396110\n[凹凸世界\t396111\njacobs\t396112\n列式\t396113\n相信爱情\t396114\n售票窗口\t396115\n铁花\t396116\n修容\t396117\n飞狼\t396118\n6p1\t396119\n累垮\t396120\n电子乐\t396121\nteac\t396122\n中标注\t396123\n好地方\t396124\n曾强\t396125\n银都花园\t396126\n味多美\t396127\nmyeclip\t396128\n查杀\t396129\n魔盘\t396130\n香瓜子\t396131\n洗选\t396132\n圣母百花大教堂\t396133\n38款\t396134\n20183月\t396135\n横林镇\t396136\n观修\t396137\nPresents\t396138\n下周五\t396139\n成都地铁10号线\t396140\n大脚\t396141\n贾明军\t396142\n联考\t396143\n戈舍瑞林\t396144\n牙套\t396145\n开关机\t396146\n续三国志英杰传\t396147\n职业中学\t396148\n中国科学院自动化研究所\t396149\n千寻\t396150\n光闸\t396151\n宜居新城\t396152\n逆熵\t396153\n热轧钢\t396154\n缘琪梦\t396155\n2017年6月6日\t396156\n美刊\t396157\n雪落\t396158\n志丹\t396159\n驻马店日报数字报\t396160\n亚振家具\t396161\n张璇\t396162\n搜索类\t396163\nxquery\t396164\n歌德盈香\t396165\n土豪金\t396166\n臂丛\t396167\nhotline\t396168\n没人理\t396169\n几何\t396170\n供冷\t396171\n本华莱士\t396172\n大润发\t396173\n宏晶\t396174\n360隐私保险箱\t396175\n蜜粉饼\t396176\n偿债率\t396177\nCSIA\t396178\n西安市统计局\t396179\n香香\t396180\n在建新船_国际船舶网\t396181\n硝酸镍\t396182\nPIM\t396183\n孕产\t396184\n睾酮\t396185\n陌上花\t396186\n九班\t396187\n立白集团\t396188\n华沙\t396189\nKB2999226\t396190\n20170823\t396191\n中国文化概论\t396192\n努比亚红魔_百度\t396193\n恒氧\t396194\n学而优则仕\t396195\n蔷薇科\t396196\n圣音\t396197\n判段\t396198\n织法\t396199\n龙悦\t396200\nAgCl\t396201\n城南片区\t396202\n中吕\t396203\ncassie\t396204\n维系\t396205\nproportional\t396206\nDisabler\t396207\nrx580\t396208\n比亚迪宋DM\t396209\n万业\t396210\n2706\t396211\nsupersonic\t396212\n万象天地\t396213\n咬死\t396214\n天龙3\t396215\nODT\t396216\n异世邪君\t396217\n30度\t396218\n三角函数\t396219\nbowen\t396220\n惠安\t396221\nSiteMap\t396222\n群智\t396223\n惘\t396224\n横看成岭侧成峰\t396225\nZ7max\t396226\ncsm\t396227\n文案策划\t396228\n集字\t396229\ncorpus\t396230\n方程组\t396231\n测斜管\t396232\n县级人民政府\t396233\n帅铃\t396234\n天花乱坠\t396235\n消脂\t396236\n41栋\t396237\nvmprotect\t396238\n单身贵族\t396239\nms-win\t396240\n保护人\t396241\n废单\t396242\ndlx\t396243\n激素药\t396244\n秀\t396245\n日产\t396246\n感闻\t396247\n妻星\t396248\n姚静\t396249\n卢龙\t396250\n年俗\t396251\n长沙一小区\t396252\n东方时尚\t396253\n世昌大道\t3
96254\n硬盘驱动器\t396255\n恋袜癖\t396256\n夏目漱石\t396257\n河北理工大学\t396258\n色魔驴行\t396259\n15.7\t396260\n三国志13PK威力加强版+三国志:建造+三国志\t396261\n钟明秋\t396262\nwn7\t396263\n条例\t396264\n百人群\t396265\n盗心贼\t396266\nacids\t396267\n嬉闹\t396268\nlpl吧\t396269\n朱子\t396270\n减毒\t396271\n汤小丹\t396272\n黑色素瘤\t396273\n六棱\t396274\n招商贷\t396275\n芙蓉广场\t396276\n手感剂\t396277\n电影剧\t396278\n蚂蚁和西瓜\t396279\n不打烊\t396280\ninspiration\t396281\n就业局\t396282\n宅宅网\t396283\n逆子\t396284\nrecombination\t396285\n九分之七\t396286\nElixir\t396287\n张月\t396288\nm2070\t396289\nWINDOW\t396290\n空气堡\t396291\next4\t396292\natto\t396293\n童女\t396294\n剑南春酒\t396295\n拙\t396296\n中间段\t396297\n触手TV\t396298\n金汇\t396299\nstomach\t396300\n很方便\t396301\nforever21\t396302\n乐Pro3论坛\t396303\n常台高速\t396304\npersons\t396305\n轩辕剑6\t396306\n韩晶\t396307\n六天七夜\t396308\n黄体酮针\t396309\n转轨\t396310\noutlook2013\t396311\n虚掩\t396312\n女穿男\t396313\n研讨班\t396314\n朗诵者\t396315\n万达地产信息_万达集团股份有限公司\t396316\nL16\t396317\n肝肿瘤\t396318\n颜夕\t396319\nFlan\t396320\n篱笆网\t396321\n502studio\t396322\n金狮\t396323\n轻触开关\t396324\n金正恩文在寅\t396325\n胶板\t396326\n总评\t396327\n山东省肿瘤医院\t396328\n福建省经济信息中心\t396329\n钢性\t396330\nwineqq\t396331\n升辉\t396332\n洋人街\t396333\n运动训练学\t396334\n人民的名义\t396335\n魏国强\t396336\n王绍华\t396337\n武汉科技大学研究生院\t396338\n2.65\t396339\n300亿美元\t396340\n眼冒\t396341\n斑马鱼\t396342\n圆弧型\t396343\n天醒之路\t396344\n优酷网\t396345\n杭州中学\t396346\n斯坦纳\t396347\ntry_files\t396348\nhprof\t396349\n第3\t396350\n百度h5\t396351\n服务亭\t396352\n景颇\t396353\n上南\t396354\n做实\t396355\n大疆御\t396356\n广州立白企业集团有限公司\t396357\nav色情网\t396358\n交发\t396359\n市委老干部局\t396360\n晓之车\t396361\n海浪花\t396362\n饰演者\t396363\nword、excel\t396364\n不动明王\t396365\n嘉禾县人民政府\t396366\nBD-MP4/MKV\t396367\n异思\t396368\nvariable\t396369\n运动品\t396370\n铖\t396371\nVoIP\t396372\n上海市七宝中学\t396373\n元丰通宝\t396374\n对敲\t396375\n拌饭酱\t396376\nanzhaung\t396377\n联系方\t396378\n大轴\t396379\nnet2.0\t396380\n国务院侨务办公室\t396381\n新浪邮箱\t396382\n第21话\t396383\n布纸\t396384\n红猪\t396385\n青岛市市北区\t396386\n小狮\t396387\n花儿朵朵开\t396388\n安倍\t396389\n中国共产党中央委员会\t396390\n海归\t396391\nRespect\t396392\n
鸽派\t396393\n10R\t396394\n傅成玉\t396395\n微信版\t396396\n威盛亚\t396397\n示爱\t396398\n陈翰宾\t396399\n郑州市第一中学\t396400\n普米克令舒\t396401\n张载\t396402\n潜孔锤\t396403\n人工拆除\t396404\n第27个\t396405\n伽椰子\t396406\n厚壁钢管\t396407\n牡丹花茶\t396408\n三代火影\t396409\n魔幻世界\t396410\n卡圈\t396411\n苍之纪元英雄\t396412\n中国刑警学院\t396413\n功到\t396414\n中国传感器网\t396415\n合肥酒店\t396416\n10-2\t396417\n做空\t396418\nconnec\t396419\nkevinz\t396420\n安徽代表团\t396421\n太残忍\t396422\n怎末\t396423\n越战越勇\t396424\n一天24小时\t396425\n王中银\t396426\n涮羊肉\t396427\n巨鹿县\t396428\n步道\t396429\n老奸巨猾\t396430\n桃子夭夭\t396431\n联洋广场\t396432\ndarker\t396433\n肩周\t396434\n琼海\t396435\nShangbao\t396436\n小蒜\t396437\ncinnamon\t396438\n转向节\t396439\n200GB\t396440\n3L\t396441\n血炼\t396442\n港市\t396443\n草席\t396444\n抗心律失常药\t396445\n孔令伟\t396446\nAxios\t396447\n肋间肌\t396448\n机电安装工程\t396449\n仇保兴\t396450\ngachi\t396451\n东风福瑞卡\t396452\ndrf\t396453\n七界恶魔少爷别吻我\t396454\nPK赛\t396455\n48v20ah\t396456\n0506\t396457\nFIGHTERS\t396458\n二人台\t396459\n骏派d60\t396460\n2.5kg\t396461\n草甘膦\t396462\n扣球\t396463\nUSB万能驱动\t396464\n470分\t396465\n老虎窗\t396466\n流芳百世\t396467\n映射表\t396468\nflexgrid\t396469\n酥心糖\t396470\n欧慕\t396471\n艾路雷朵\t396472\n芳馨\t396473\nMP3|\t396474\n苏州金螳螂建筑装饰股份有限公司\t396475\n和昌地产\t396476\n余土\t396477\n石桥\t396478\n安徽农业科学\t396479\n千元机\t396480\n文虎\t396481\n光明大陆圣骑士\t396482\n音乐\t396483\n上海环球金融中心观光厅\t396484\n译呗\t396485\n内脏\t396486\n控制盘\t396487\n_地下城与勇士攻略\t396488\nWinScp\t396489\n李东辉\t396490\n床板\t396491\n孟买猫\t396492\n耐撕\t396493\n转晴\t396494\n汗青网\t396495\nDependency\t396496\n车管\t396497\n易瓦特\t396498\n僵尸群\t396499\n应用程序池\t396500\n疯丫头\t396501\n1471\t396502\n7个月\t396503\n谷城\t396504\n盗车神\t396505\n生存期\t396506\n布诺\t396507\nthc\t396508\n小飞侠\t396509\n雾化芯\t396510\n上海城市规划展示馆\t396511\n网盘搜索网\t396512\nwwwk224com\t396513\n王思聪\t396514\n市净率\t396515\n河南省平顶山市教育局\t396516\n14.5亿\t396517\n29亿\t396518\nReturning\t396519\n叶健\t396520\n第五代\t396521\nshuju\t396522\n瓦尔帕莱索\t396523\n画卷\t396524\n华兴资本\t396525\n湖南人事代理服务中心\t396526\n小日记\t396527\n销售品\t396528\n海克斯康\t396529\n福建省工商局\t396530\nVbox\t396531\n光博\t396532\n孔亮\t396533\n舞弊\t39653
4\n国贸\t396535\n中国催眠网\t396536\n忍者猫\t396537\n镐\t396538\n刀\t396539\nGOAL\t396540\n明光\t396541\n水漆\t396542\n欣瑞教育\t396543\n涌泉镇\t396544\n安徽省交通控股集团有限公司\t396545\n南宁南湖公园\t396546\n山东电力高等专科学校\t396547\n拉文\t396548\nnanoe\t396549\ndesigner17\t396550\n庄氏\t396551\n2018年04月02日\t396552\n日利\t396553\ndif\t396554\n1.32\t396555\n中华手赚网\t396556\n西普大陆\t396557\n逆水\t396558\n过渡费\t396559\n全纳二手车网&全纳车网\t396560\n姚村镇\t396561\nCAN\t396562\n心\t396563\n水仙娱乐网\t396564\nCVTOUCH\t396565\n上下\t396566\n张廉\t396567\n星雨华府\t396568\nIdentifier\t396569\n淤泥\t396570\n狮山街道\t396571\nAsians\t396572\n07款\t396573\n绀\t396574\n金学峰\t396575\nmetabolic\t396576\n短期\t396577\ninitializing\t396578\n乐心医疗\t396579\n有意思\t396580\n0.72%\t396581\n金钱草\t396582\nMANAGEMENT\t396583\nCoronary\t396584\n刺客信条\t396585\n8月18日\t396586\n郴州政府网\t396587\n光华\t396588\n福州软件园\t396589\n德岳素库\t396590\n龙岩站\t396591\npredis\t396592\n200问\t396593\n携程航空\t396594\n狡猾\t396595\n别\t396596\n轻型车\t396597\n凤山村\t396598\ngopher\t396599\nロ\t396600\n雨碎江南\t396601\nQQ离线文件\t396602\nsix\t396603\n三维动画\t396604\nXLR\t396605\nLaplacian\t396606\n特战队\t396607\n一万四\t396608\nm7450f\t396609\n苏icp\t396610\n2八\t396611\n卢萨卡\t396612\n耳鼻喉频道_健客网\t396613\nCounsel\t396614\n跳出来\t396615\n对酒当\t396616\n令爱\t396617\n数百人\t396618\n平原客\t396619\n自由\t396620\nkingroot\t396621\n名家\t396622\n联络处\t396623\n从何来\t396624\n长源\t396625\n两化融合\t396626\n摧残\t396627\nG8\t396628\n广元市人民政府\t396629\n黑火药\t396630\n云币网\t396631\n苹果4s\t396632\n铜壶\t396633\n邵庄\t396634\nMTLBBS\t396635\n饮料罐\t396636\n国际部\t396637\n冷风机\t396638\n107集\t396639\n盘旋\t396640\n交流稳压器\t396641\n偷乐\t396642\n选注\t396643\n大亚湾\t396644\n洋铭\t396645\n远洲\t396646\n光芒\t396647\n龚叶轩\t396648\n000415\t396649\nBunny\t396650\n选任\t396651\n崭露头角\t396652\n六安市裕安区人民政府\t396653\n散开\t396654\n20160420\t396655\n土鸡\t396656\npcie\t396657\n2盒\t396658\n误杀瞒天记\t396659\n保护罩\t396660\n谢馥春\t396661\n水解\t396662\n著作集\t396663\n9.10\t396664\n和光\t396665\n丧事\t396666\n短消息\t396667\n钓点\t396668\nokular\t396669\n峨眉\t396670\n前7月\t396671\n泰哥\t396672\nQQBODY\t396673\n劳动安全事故罪\t396674\n体态语\t396675\n陈后主\t396676\n90平方米\t396
677\n校友群\t396678\n冶具\t396679\n依比\t396680\n铁心兰\t396681\ninvocation\t396682\nBT5\t396683\n个价\t396684\n果熟\t396685\n六任\t396686\n冷吃兔\t396687\n流浪汉\t396688\nDSC\t396689\n就餐\t396690\n财人\t396691\nAscend\t396692\n戊烷\t396693\npathspec\t396694\n小米移动电话卡\t396695\n郭熙\t396696\n锯齿形\t396697\nTeledyne\t396698\n方中山\t396699\n一流健康网\t396700\n125克\t396701\n青春不散场\t396702\n谷多\t396703\nJUJU\t396704\n郑振铎\t396705\n光博会\t396706\nFX168\t396707\n五指\t396708\n神学家\t396709\nAPUE\t396710\n排排坐\t396711\n北京欢迎你\t396712\n牛肝菌\t396713\n良品率\t396714\n新源县\t396715\nentrepreneurship\t396716\n金轮股份\t396717\n十字街\t396718\n气团\t396719\n美人们\t396720\n新女\t396721\n消于\t396722\n观剧\t396723\nメガネ\t396724\n女日\t396725\n大众EA888\t396726\n勒阴\t396727\n武汉城\t396728\n海基\t396729\n兔子舞\t396730\n中山公用\t396731\n淮安机场\t396732\nMoMo\t396733\nLOL卡\t396734\n龙行天下风水论坛\t396735\n力动\t396736\n伯纳乌\t396737\n李国强\t396738\n火影忍者剧场版9\t396739\n工资\t396740\n月光倾城\t396741\n电镀膜\t396742\n大明天下\t396743\n华亿\t396744\n职中\t396745\n2017—2025年\t396746\nayu\t396747\nDeclared\t396748\n运会\t396749\n中国现代国际关系研究院\t396750\n伏皇后\t396751\n性侵\t396752\n爆米\t396753\n林苑\t396754\nNTT\t396755\nH60-L01\t396756\n五术\t396757\nTik\t396758\nQiaoZhi\t396759\nU2312HM\t396760\n硬碟\t396761\n3520\t396762\n娥皇女英\t396763\n碟片\t396764\n爱上门\t396765\n乳照\t396766\n陆美蓉\t396767\n安徽政务服务网\t396768\n铁球\t396769\n147号\t396770\n丽岛新材\t396771\n阿卡多\t396772\nrichie\t396773\n指环王1\t396774\n两臂\t396775\nshutterstock\t396776\nswarovski\t396777\n第4代\t396778\n400瓦\t396779\n4th\t396780\n扬子杯\t396781\n高仿真\t396782\n王艳茹\t396783\n雁过拔毛\t396784\nDrivers\t396785\n克02\t396786\n五色花\t396787\n车友们\t396788\n音色\t396789\n书法家\t396790\n烟花秀\t396791\nNSNull\t396792\nR330\t396793\n多喝水\t396794\n58种\t396795\n麻核桃\t396796\n6228\t396797\n采矿\t396798\nstudio2.0\t396799\n水浒传之英雄本色\t396800\n清荷\t396801\n血清素\t396802\nairplane\t396803\n史美伦\t396804\n炼妖壶\t396805\n女版\t396806\n建新东路\t396807\n5375\t396808\n郑上新区\t396809\nECS服务器\t396810\n盗号者\t396811\n2欧\t396812\n小白板\t396813\npap\t396814\n1000平方\t396815\n安卓5.0\t396816\n临平新城\t396817\nguanfang\t396818\n水滴石\t396819\n霍格\t396820\nmc
小洲吧\t396821\n雨披\t396822\n齐天大圣\t396823\ni30\t396824\n民\t396825\n1434\t396826\n元亨利贞网\t396827\n翟欣欣\t396828\n马可修\t396829\n咒术\t396830\n包臀\t396831\n帐面\t396832\n大肠癌\t396833\n细石\t396834\n半小时\t396835\n比亚迪唐\t396836\n有希望的男人\t396837\nzhiyi\t396838\n约克夏犬\t396839\n60年前\t396840\n物质主义\t396841\n缘之空\t396842\n李佳明\t396843\nC2B\t396844\n艾玛斯通\t396845\n月仙\t396846\n热映_时光网\t396847\n炒菜\t396848\n欧浦智网\t396849\n新英灵\t396850\n货舱\t396851\n朱英\t396852\n电热壶\t396853\n深圳国税局\t396854\n丝域养发馆\t396855\n柒月\t396856\n题典\t396857\n柴柴\t396858\n如虹\t396859\n挥拍\t396860\n天津胸科医院\t396861\n炉石传说冰封王座\t396862\n安卓服\t396863\n45公斤\t396864\n斯柯达晶锐\t396865\n金鸡谷\t396866\nVoyager\t396867\nAladdin\t396868\nTy\t396869\n西安高新第一小学\t396870\n连云港职业技术学院\t396871\ne族\t396872\n猫扑\t396873\nwwdc\t396874\n吡格列酮\t396875\nPAM\t396876\n澜沧县\t396877\n整流桥堆\t396878\n奸魔\t396879\n衰败\t396880\nAmresco\t396881\n陈志坚\t396882\n健康网\t396883\n白马湖小区\t396884\nxiaoer\t396885\n原色\t396886\n思源\t396887\n寄存点\t396888\n百分之25\t396889\n哈尔滨乐居\t396890\n熔断\t396891\n线控器\t396892\n失磁\t396893\nNw\t396894\n均值不等式\t396895\nwolves\t396896\n上海交大附中\t396897\n光怪陆离\t396898\n超级学习网\t396899\n一网式\t396900\nedf\t396901\n翔翔\t396902\n乡村教师\t396903\n杨益\t396904\n区\t396905\n李慧敏\t396906\n亚太科技\t396907\n世纪金源集团\t396908\nmasturbation\t396909\n诘\t396910\n大樱桃苗\t396911\n金沙河\t396912\nsnprintf\t396913\n陕西省人力资源和社会保障厅\t396914\n克里格\t396915\n雄性\t396916\n弗兰德\t396917\n潘鲁生\t396918\n1-30\t396919\nwif\t396920\nNV200\t396921\n去税\t396922\nContiki\t396923\n上海凤凰\t396924\n敲诈勒索罪\t396925\n童扬\t396926\n口腔展\t396927\nboot实战\t396928\n中式八球\t396929\nqiji\t396930\nshawn\t396931\n乐视max2\t396932\nincentive\t396933\nufo探索网\t396934\n黑太阳\t396935\n瓦斯\t396936\n主值\t396937\ngraff\t396938\n3.doc\t396939\n皇级\t396940\n鄞州\t396941\n液相色谱\t396942\n想当然\t396943\n暮色\t396944\n暴走英雄坛吧\t396945\n周春芽\t396946\n艾莲娜\t396947\n730Li\t396948\n狮王\t396949\nOneDrive\t396950\n畅销品\t396951\n光伏电池片\t396952\nekdv\t396953\nUNB\t396954\n小兽\t396955\n碧海\t396956\n都市情缘\t396957\n歌华\t396958\n预言家\t396959\nmylink\t396960\n语言类\t396961\n臀围\t396962\n饶让\t396963\n教父3\t396964\nnfv\t396965\nI
nteractive\t396966\n邵阳学院\t396967\n虎泉\t396968\n谷田部\t396969\n778\t396970\ntied\t396971\noptim\t396972\n九方购物中心\t396973\nMIR\t396974\n批头\t396975\n跨越\t396976\n聚氨酯泡沫塑料\t396977\ngeometric\t396978\n南洋\t396979\n绝地求生N卡\t396980\nuomo\t396981\n2017年12月14日\t396982\n天下大同\t396983\nBoogie\t396984\n精密型\t396985\nPhilosophical\t396986\n中南海保镖\t396987\npaginator\t396988\n女士们\t396989\n放毒\t396990\nthinker1017\t396991\nidc\t396992\n珊珊\t396993\n断码\t396994\n枪击案\t396995\nmasking\t396996\n对半\t396997\n巴安水务\t396998\n大鹏新区\t396999\n调出\t397000\n5号\t397001\n芝浦\t397002\nsubclipse\t397003\n隐性债务\t397004\n各种人\t397005\n蓝宝石之谜\t397006\n16平米\t397007\n埋点\t397008\n良马\t397009\n异丙托溴铵\t397010\n离\t397011\n107周年\t397012\nguilt\t397013\nFienly\t397014\nAWT\t397015\n骑马钉\t397016\n竹内\t397017\nFTB\t397018\nctrl+v\t397019\n中洲控股\t397020\nstrong\t397021\n100块\t397022\n佛尘\t397023\nSiem\t397024\n408\t397025\n靶细胞\t397026\n威廉·莎士比亚\t397027\n深析\t397028\nG1840\t397029\nMicrophone\t397030\n4星\t397031\n加州大学伯克利\t397032\n价语\t397033\n陈乾\t397034\n瑞风M4\t397035\n绝活\t397036\n冯新柱\t397037\n降冰片烯\t397038\nwin7pe\t397039\nVOS\t397040\n抓赌\t397041\n军刊\t397042\nOriginPro\t397043\nJson\t397044\n命令与征服3凯恩之怒\t397045\n网上谈兵_论坛\t397046\n苏c\t397047\n炉鱼\t397048\n周奇奇\t397049\n混叠\t397050\n以太网适配器\t397051\n无人机网\t397052\n连接路由器\t397053\n铜仁门户网\t397054\n海阔\t397055\nslc\t397056\n尹钟信\t397057\n产销量\t397058\nFMDB\t397059\n通力\t397060\n霍利菲尔德\t397061\n半圆\t397062\n股价\t397063\n湖南省第二人民医院\t397064\n三联件\t397065\n期末\t397066\n冲喜\t397067\n看门狗1\t397068\niBatis\t397069\n昆\t397070\n曹家巷\t397071\n神舟一号\t397072\nITGungnir\t397073\n头帘\t397074\nry\t397075\n亚男\t397076\n芜湖乐居\t397077\n羚羊角粉\t397078\n全反式\t397079\n和者\t397080\n傻根\t397081\nSessions\t397082\n孔雪\t397083\n诱导\t397084\n手幅\t397085\n思讯\t397086\n时标\t397087\n王永祥\t397088\n博帕尔\t397089\n妖妇\t397090\n表现派\t397091\n发行日\t397092\n有容乃大\t397093\n天堂山\t397094\n林文龙\t397095\n40多亿\t397096\nPARKER\t397097\n巨人族\t397098\n二奶\t397099\njs实战\t397100\n林阳\t397101\n蜀园\t397102\n死生契阔\t397103\n杨绛\t397104\n21.5\t397105\n5方\t397106\n中央政法委\t397107\n盾冬\t397108\niScroll5
\t397109\n31栋\t397110\n中国土木工程学会\t397111\n碧桂园城市花园\t397112\n阿修罗2\t397113\n杨君\t397114\n斯普瑞\t397115\nditu\t397116\n八十三\t397117\n74个\t397118\n混包\t397119\n5险\t397120\n益安宁\t397121\n吉安麦地网\t397122\n讲道理\t397123\n推搡\t397124\nr15\t397125\n英法联军\t397126\n执著\t397127\n中关村街道\t397128\n配班\t397129\n贝尚湾\t397130\n285\t397131\n二极管\t397132\nshowed\t397133\n1.13C\t397134\n丰崎爱生\t397135\n愣一愣\t397136\n虚拟电脑\t397137\n阿里云帮助中心\t397138\n凡尔赛宫\t397139\n数字证书\t397140\nassembly\t397141\n胶东机场\t397142\n不挂\t397143\n套接字编程\t397144\nAgile\t397145\nPowerMILL\t397146\n月正圆\t397147\n8方\t397148\n犬屋敷\t397149\n真爱的谎言之破冰者剧情介绍\t397150\n途胜\t397151\n拉链\t397152\n笔盖\t397153\n老爸老爸\t397154\n制售\t397155\n宜城市人民政府\t397156\n西冷牛排\t397157\n张治中\t397158\n然\t397159\n乔布斯传\t397160\n第38条\t397161\n信息技术\t397162\n造梦西游4\t397163\n禅林\t397164\n樟脑\t397165\n魔块术\t397166\nCHANNEL\t397167\n莫辛\t397168\n631\t397169\n600岁\t397170\n情\t397171\n拆洗\t397172\n买手\t397173\n快步\t397174\nx/2\t397175\n梦诛\t397176\n不灵\t397177\nWowGirls\t397178\n中国农业人才网\t397179\n春冬\t397180\n撸小子小游戏网\t397181\n王泉\t397182\n高冷\t397183\n圣米歇尔山\t397184\n克鲁塞德\t397185\n寰宇天下\t397186\n牛肉板面\t397187\n段差\t397188\nSamplitude\t397189\n1万块\t397190\n口干\t397191\n煤体\t397192\nchiller\t397193\n体育场地\t397194\n崔永史玉柱\t397195\n5.4%\t397196\nテ\t397197\n崩掉\t397198\n周大生珠宝\t397199\n均等化\t397200\n秦论\t397201\nsheet\t397202\nv4.7.1\t397203\n中风\t397204\n亲人们\t397205\n20万起\t397206\n湘源\t397207\n黑毛\t397208\n106国道\t397209\njokey\t397210\ndresser\t397211\n易网\t397212\nNeuro\t397213\nYI\t397214\n绦虫病\t397215\nEXID\t397216\n污水管网\t397217\n4506\t397218\ndepot\t397219\n郑合惠子\t397220\nAWR\t397221\n3dtouch\t397222\n奖惩\t397223\n泰坦尼克\t397224\n拓普康\t397225\nBagwell\t397226\nCRF++\t397227\n上蔬永辉\t397228\nfreepiano\t397229\nlibmysqlclient.so.16\t397230\n洗浴会所\t397231\n保证函\t397232\ndup2\t397233\nweb.py\t397234\n黄睿\t397235\n焦作发展\t397236\n243573295\t397237\nCtex\t397238\n源汇区\t397239\n邢\t397240\n暗箱\t397241\n太平洋电脑\t397242\n诉讼书\t397243\n丹弗斯\t397244\n红黑榜\t397245\n环戊烷\t397246\n曲江一中\t397247\n小碎花\t397248\n缩混\t397249\n留下来\t397250\n教派\t397251\n排渍\t397252\
n自在天\t397253\n戛纳国际电影节\t397254\n12358\t397255\n精品网\t397256\n淘宝天猫宝贝\t397257\nZiv\t397258\n羧甲基壳聚糖\t397259\ninstallations\t397260\nChokCoco\t397261\n深圳五洲医院\t397262\n色劫\t397263\n阿宗\t397264\n伦敦希斯罗机场\t397265\n华侨城创意园\t397266\n5.2_\t397267\n台源\t397268\n金阳新世界\t397269\n小猪变形记\t397270\n400克\t397271\n文和友\t397272\n聚乙烯蜡\t397273\nс\t397274\n鲁迅故居\t397275\n穹隆\t397276\n快乐男声2017\t397277\nword文本框\t397278\n梅贻琦\t397279\n1.33\t397280\n暗区\t397281\n班加罗尔\t397282\n谷丽萍\t397283\n单片机学习网\t397284\n凯辉\t397285\n20171031\t397286\n帐表\t397287\n瀚蓝\t397288\n塔菲尔\t397289\nImporters\t397290\n闫红\t397291\n集选\t397292\n秦楼街道\t397293\n850M\t397294\n谢大脚\t397295\n宁子\t397296\n墨语\t397297\n2013年底\t397298\n收杆\t397299\n地球百子第三季\t397300\n银州区\t397301\n逢田美波\t397302\n金家街\t397303\n上海市教委\t397304\nbeg\t397305\n甲寅\t397306\n串气\t397307\n维恩\t397308\nR22\t397309\n肾方\t397310\n白璧\t397311\n婴儿痉挛症\t397312\n小码王\t397313\n绿都紫荆华庭\t397314\n百度文库下载券\t397315\n麦肯锡工作法\t397316\n安庆四中\t397317\nASCII码表\t397318\n补正\t397319\n1600年\t397320\nsheis\t397321\n伤\t397322\ncandydoll\t397323\n名卷\t397324\n3-4月份\t397325\n杨斯涵\t397326\n肠衣\t397327\n过错\t397328\n手推车\t397329\n新四大发明\t397330\nreferral\t397331\nAlluxio\t397332\n变异系数\t397333\n伺候\t397334\n病险\t397335\n便览\t397336\n奇瑞瑞虎3\t397337\n主题词\t397338\n尹相杰\t397339\n盛泽雨夜广场舞\t397340\nubuntun\t397341\n800000\t397342\n重品行\t397343\n斯巴达克斯吧\t397344\n长安\t397345\n轿跑版\t397346\n307\t397347\n新疆国税\t397348\n携程公司\t397349\n横财\t397350\nCOLD\t397351\n协变量\t397352\n陈诗雨\t397353\n15000公斤\t397354\nknees\t397355\nPLACE\t397356\nSabrina\t397357\n开瑞\t397358\n中科院生态环境研究中心\t397359\nEmu\t397360\n胃食管反流\t397361\naudioread\t397362\n红尾\t397363\nAmoi\t397364\nlosing\t397365\n34年\t397366\n桉木\t397367\nvant\t397368\n水莲\t397369\n夜大\t397370\n张玉萍\t397371\n带班\t397372\n滨江1号\t397373\n二胎政策\t397374\n备考\t397375\ngw250\t397376\ntophatter\t397377\n乘飞机旅行\t397378\n故宅\t397379\n阿里妈妈淘宝联盟\t397380\n射频工程师\t397381\n流量包\t397382\nCplex\t397383\n诺伊曼\t397384\n偏股型基金\t397385\n厦门市总工会\t397386\nhips\t397387\n叶芝\t397388\n蟹柳\t397389\n恭恭敬敬\t397390\n早川瀬里奈\t397391\n隔音窗\t397392\n151名\t397393\n海
泰\t397394\n和谐社区\t397395\n紫薯\t397396\nunacceptable\t397397\nkane\t397398\n同意\t397399\ntmod\t397400\n平邑一中\t397401\n梅林村\t397402\n多宝\t397403\n中介者\t397404\n东营港\t397405\n恒言\t397406\n朱正\t397407\n转接器\t397408\n牙龈瘤\t397409\n查卷\t397410\n广东电网公司\t397411\n58平米\t397412\n回归线\t397413\n月牙儿\t397414\n陈涛\t397415\n征战者\t397416\n经济房\t397417\nunity3D\t397418\n川源\t397419\n合片\t397420\n学霸们\t397421\n明成皇后\t397422\n爱片\t397423\nLQ-630KII\t397424\n龙湖花千树\t397425\n中华孝道\t397426\n造瘘\t397427\n甘肃省教育厅\t397428\n电门\t397429\n长治市发展和改革委员会\t397430\n黄如论\t397431\n星云\t397432\n高架桥\t397433\n崔世安\t397434\n萨尔达\t397435\nnetfocus\t397436\nVLOOKUP\t397437\nkwai\t397438\n茶水间\t397439\n赤金\t397440\n塞特\t397441\n高质发展\t397442\n解码板\t397443\n法官\t397444\nthermodynamics\t397445\n方土\t397446\n泥瓦匠\t397447\nHttpPost\t397448\n二手房产网\t397449\n瑞典文学院\t397450\nmodelsim\t397451\n夏侯恪\t397452\n半段\t397453\n二环路西\t397454\n西花厅\t397455\n蓝门\t397456\n行政管理学\t397457\naling\t397458\n金色葡萄球菌\t397459\n高棉\t397460\n李晓华\t397461\nrqalpha\t397462\n地土\t397463\n报稿\t397464\nie7浏览器\t397465\n硫单质\t397466\nShaft\t397467\n600019\t397468\n第十三期\t397469\n圣锤\t397470\n同值\t397471\n河北省公安厅\t397472\n普中科技\t397473\n改过自新\t397474\n谈礼貌\t397475\n燃煤电厂\t397476\n这方\t397477\n芋艿\t397478\nchars\t397479\n二零一六\t397480\n时间到\t397481\n狐宝\t397482\n软开度\t397483\n蒜舞\t397484\n囤货\t397485\n老生常谈\t397486\nargis\t397487\n第二步\t397488\n阿玛施\t397489\n关联图\t397490\n第21季\t397491\n20KW\t397492\n可可英语-在线英语学习\t397493\n119周年\t397494\niPhone7p\t397495\n最后时刻\t397496\n莉莎\t397497\n倾斜式\t397498\n用武之地\t397499\n悉\t397500\n黑藻\t397501\nHTTP认证\t397502\n竞品分析\t397503\n财新\t397504\n科尔马\t397505\n胜负\t397506\n头孢呋辛酯\t397507\n吕天逸\t397508\n肝郁\t397509\nPPTV\t397510\nMac机\t397511\n不锈钢过滤网\t397512\n你是我的玫瑰花\t397513\n碳灰\t397514\n禁运\t397515\n串串\t397516\n题干\t397517\ncijilu\t397518\n被动性\t397519\n高长恭\t397520\n稀宝\t397521\n齐飞\t397522\n诗酒\t397523\nreasons\t397524\n东莞市中医院\t397525\nkissme\t397526\n标枪\t397527\n关贵敏\t397528\n口口声声\t397529\n隘口\t397530\n民用机场\t397531\n欢乐购\t397532\n文神器\t397533\n李茶\t397534\n民国二十一年\t397535\n目&#160\t397536\n钢箱梁\t397537\n国家安全委员会\t3
97538\n费休\t397539\n皂片\t397540\n食用菌\t397541\n静雯\t397542\n专利代理人资格考试\t397543\n山鹰纸业\t397544\n贰柒\t397545\n证果\t397546\nZ5P\t397547\n一个8位\t397548\n刘欣儿\t397549\n再世\t397550\n李庆奎\t397551\n张敬\t397552\n相变材料\t397553\n擦玻璃\t397554\n小学期\t397555\n言情小说_都市言情小说\t397556\n成都新津县人民政府\t397557\n升降杆\t397558\n稳过\t397559\n常熟理工\t397560\n毒饵\t397561\n西姆\t397562\n粘糊糊\t397563\n同性恋者\t397564\n滤油机\t397565\nTablets\t397566\n等待\t397567\n140度\t397568\nfindumars\t397569\n抽象方法\t397570\n网狐\t397571\n鲸鱼岛\t397572\n无锡金桥小学\t397573\nplugged\t397574\n床层\t397575\n中国民航机场建设集团公司\t397576\n铅字\t397577\ngrouping\t397578\n甬道\t397579\n聚币网\t397580\n1080P/\t397581\n岳阳市一中\t397582\n嘉都\t397583\n105级\t397584\n宁夏省\t397585\n客队\t397586\nwow7.1\t397587\n仙穹\t397588\n稳压值\t397589\n0794\t397590\n吃饱\t397591\nMD\t397592\nQPS\t397593\n（\t397594\n象征物\t397595\n蚂蚱\t397596\n可有\t397597\n洛萨\t397598\n汲汲\t397599\nSnappy\t397600\nSometimes\t397601\ncfd\t397602\n不好看\t397603\n储备\t397604\nだ\t397605\n红球\t397606\n风之痕\t397607\n継\t397608\n太和吧_\t397609\n孔慈\t397610\n快麦118\t397611\n2872\t397612\n韩伊\t397613\n念稿\t397614\n毛钱\t397615\n飞卫\t397616\nrebar\t397617\n眼压\t397618\n自横\t397619\n逆着\t397620\n宾文\t397621\n丹妮\t397622\n十八禁\t397623\n工办\t397624\n陆子艺\t397625\nsubscript\t397626\n南宁市国家税务局\t397627\n可爱颂\t397628\n喜盈门\t397629\nActuator\t397630\nEVi\t397631\n瑞风r3\t397632\nGon\t397633\n陈佳佳\t397634\n滴滴打车\t397635\n新泉股份\t397636\n即付宝\t397637\n旁通\t397638\n箱梁\t397639\n芯城\t397640\n马哥\t397641\nqiyi\t397642\n2018-04-22\t397643\n易尚\t397644\n厦门软件学院\t397645\n3.65\t397646\n稚气\t397647\nrotate\t397648\n宾语补足语\t397649\n砌筑砂浆\t397650\n蚕丝被\t397651\nBattlefield\t397652\n自由之城\t397653\nGogo\t397654\n裸条\t397655\n东莞农村商业银行\t397656\n小米路由\t397657\n韩安冉\t397658\n十二生肖纪念币\t397659\n哈尔滨理工大学荣成校区\t397660\n上海轨道交通10号线\t397661\n死神来了3\t397662\n名列前茅\t397663\nXerox\t397664\n营养性\t397665\nEpsilon\t397666\n平素\t397667\n缺铁性贫血\t397668\n林森浩\t397669\n汉台区政府网\t397670\neuropean\t397671\nSki\t397672\n广汽传祺GS8\t397673\n重庆三峡银行\t397674\n扰度\t397675\n2017年\t397676\n尽力而为\t397677\n湖南省公安厅交警总队\t397678\n红玫瑰\t397679\nEnnio\t397680\n潮包\t39
7681\nSingle\t397682\n36元\t397683\n加密狗\t397684\nfandian\t397685\n金标法\t397686\nn63.com\t397687\n10句\t397688\n一定量\t397689\n电工杯\t397690\n瓯窑\t397691\nDocs\t397692\n一夫多妻制\t397693\n101篇\t397694\nlocal\t397695\n夜神安卓\t397696\n2017Q2\t397697\n急缺\t397698\n自照\t397699\n涉众型\t397700\n付\t397701\n本田凌派\t397702\n簸\t397703\n外文\t397704\nAnders\t397705\n古茗\t397706\n豪放女\t397707\n达飞云\t397708\n花生油\t397709\n双代\t397710\n头孢地尼胶囊\t397711\n环境科学学报\t397712\n误差项\t397713\n35页\t397714\n新东方网\t397715\n1070ti\t397716\n有创\t397717\n呼吸阀\t397718\nnunit\t397719\n老弱病残\t397720\n360n5s\t397721\n火神庙\t397722\n桃瀬\t397723\n文烈宏\t397724\n辽宁职业学院\t397725\nマンガ\t397726\nCodeBlock\t397727\n曹宅镇\t397728\n企业所得税法实施条例\t397729\n巫师3\t397730\nshiki\t397731\n养森瘦瘦包\t397732\n译成\t397733\n张小凡\t397734\n房地产管理法\t397735\n铁门关市\t397736\ngk\t397737\nquadrature\t397738\n单元测试卷.doc\t397739\n敖日格勒\t397740\n咏怀\t397741\ntornado\t397742\n热色\t397743\nAchieve\t397744\n东四十条\t397745\n菏泽火车站\t397746\n暗殿骑士\t397747\n机界\t397748\n魔器\t397749\n肥鱼\t397750\n预备级\t397751\n御坂美琴\t397752\n黄粉\t397753\n1版\t397754\nMinimalist\t397755\n真空管\t397756\nWSET\t397757\n受创\t397758\nudf函数\t397759\n侏罗纪世界游戏\t397760\n游览车\t397761\n五年之内\t397762\n发条\t397763\n爱尔眼科医院集团股份有限公司\t397764\n鼎泰新材\t397765\n十二年后\t397766\nar1808s\t397767\n渊海子平\t397768\n珠三角新干线机场\t397769\n连体字\t397770\n蓝万\t397771\n潜点\t397772\n京东海投\t397773\n保管\t397774\n黄尾\t397775\n先决\t397776\nprintln\t397777\n层细胞\t397778\n寻寻觅觅\t397779\n拷打\t397780\n文韬武略\t397781\n5700万\t397782\nRodriguez\t397783\nJelly\t397784\nbackground-size\t397785\n绝世符皇\t397786\n14薪\t397787\n头顶\t397788\n非一般\t397789\n抓奸\t397790\n输\t397791\n操作中心\t397792\ngiffgaff\t397793\n天堂镇\t397794\nX9300E\t397795\n朝阳区\t397796\n橙匕\t397797\n黄飞虎\t397798\n赵浴辰\t397799\n忠良\t397800\n9502\t397801\n铸就\t397802\n西安市小学\t397803\n攀登\t397804\n30关\t397805\noptic\t397806\n1953年\t397807\n氰化\t397808\n818路\t397809\n潮爷\t397810\n卖鱼桥小学\t397811\n层析\t397812\n额尔齐斯河\t397813\n蹴\t397814\n尖叫\t397815\n7.50\t397816\n泽尻绘里香\t397817\n龙蟠科技\t397818\n美国苹果公司\t397819\n抗战\t397820\n张小英\t397821\n白马公园\t397822\n回调域名\t397823\n短兵
相接\t397824\n全国代表大会\t397825\n/ED2K\t397826\n苏酒\t397827\n报文头\t397828\n梁凯恩\t397829\n广西农业信息网\t397830\nRO仙境传说\t397831\n淹城\t397832\n金井\t397833\n风琴包\t397834\n万象物流\t397835\n画船\t397836\n牛毛\t397837\n麦萌\t397838\n昭公\t397839\n抢庄\t397840\n中心块\t397841\n优泰\t397842\n西彭镇\t397843\nGavin\t397844\n隆昌路\t397845\n不染\t397846\nbiore\t397847\n游资\t397848\n第23届\t397849\n石棉\t397850\nMUV\t397851\n驯\t397852\n霍桑\t397853\n兴化湾\t397854\n龙背上的骑兵3\t397855\nEnrichment\t397856\n径流系数\t397857\n黑白根\t397858\n东营河口区\t397859\nGeorge\t397860\n戏精\t397861\n连筋\t397862\n封面页\t397863\n白皮松\t397864\n医宗金鉴\t397865\n轻聊版\t397866\n莞惠\t397867\nProspects\t397868\nKLEIN\t397869\n拉延\t397870\n记忆性\t397871\nExample\t397872\ntubo\t397873\n杨娟\t397874\n齐邦媛\t397875\n凯越锐界\t397876\n编成\t397877\n首展\t397878\n预售价\t397879\n深得人心\t397880\n侵华日军南京大屠杀遇难同胞纪念馆\t397881\n本庄瞳\t397882\ntruecrypt\t397883\n泰勒公式\t397884\n木鼓\t397885\n6u\t397886\n本户\t397887\n建昌\t397888\n3DLC2\t397889\n青岛西海岸\t397890\n酒泉日报\t397891\n弹簧片\t397892\n中娃网\t397893\n01053\t397894\n太医院\t397895\ntasty\t397896\n福建省人民检察院\t397897\n李幸倪\t397898\n盖拉多\t397899\nLoop\t397900\n200根\t397901\n肠道传染病\t397902\n300多万\t397903\n星港\t397904\n上海市第九人民医院\t397905\n笔算\t397906\n稻盛\t397907\n回弹法\t397908\nNoob\t397909\n阿酷\t397910\n米兰站\t397911\n梵净山\t397912\nSK5病毒\t397913\n腰轮流量计\t397914\ncn控\t397915\n2K16\t397916\n晶化\t397917\n酷我听书\t397918\n急性支气管炎\t397919\niPhone6splus\t397920\n54坐标系\t397921\n壹周立波秀\t397922\n应有尽有\t397923\n杨玉\t397924\n私募股权投资基金基础知识\t397925\n附一个\t397926\n断交\t397927\n萧九\t397928\n预备党员转正思想汇报\t397929\n意符\t397930\n易华录\t397931\n盖板涵\t397932\nCopula\t397933\n出台\t397934\n气根\t397935\n网易视频\t397936\ne2200\t397937\n对局\t397938\n1.11.4\t397939\n附实\t397940\n丽尚\t397941\n防火泥\t397942\n中国科学院成都生物研究所\t397943\nVCSEL\t397944\n恬恬\t397945\n店方\t397946\n消委会\t397947\n流播放\t397948\n光波导\t397949\n锅子\t397950\n张北县\t397951\n广东生益科技股份有限公司\t397952\n分道\t397953\n她的手\t397954\n根尖囊肿\t397955\n参事\t397956\n毒品案\t397957\n激光位移传感器\t397958\n幺幺\t397959\n大学英语2\t397960\n萧山新街\t397961\n国际货代\t397962\n狸猫小说网\t397963\nWEBLOGIC\t397964\n蓝孔雀\t397965\n唐人街2\t397966\n云南省中医
医院\t397967\n秦香莲\t397968\n西段\t397969\n韩素媛\t397970\n举世无双\t397971\n柯岩风景区\t397972\n0.8mm\t397973\n请假条\t397974\n热海\t397975\n12速\t397976\nArrow\t397977\n算得\t397978\njssdk\t397979\n剐蹭\t397980\n粤晖园\t397981\n悠亚\t397982\n组团\t397983\n假扣\t397984\n杨琳\t397985\n中午十二点\t397986\n戴森无绳吸尘器\t397987\n萘胺\t397988\n我离开\t397989\n塞口\t397990\n分层抽样\t397991\n本味\t397992\n榴弹炮\t397993\n221个\t397994\n鄂安沧\t397995\nPhanteks\t397996\n德清莫干山\t397997\n干扰\t397998\nsumatra\t397999\n帕拉德\t398000\n磊\t398001\n努比亚Z17S\t398002\n1490\t398003\n20151121\t398004\nr16\t398005\n维多利亚秘密\t398006\n佛山政府\t398007\n黑仪\t398008\n梁山\t398009\n人之初\t398010\n第148章\t398011\n弄混\t398012\nAV在线视频\t398013\ngz\t398014\nes3\t398015\n夏安安\t398016\n平泉市\t398017\nPASSAGE\t398018\n小女\t398019\n幻想乐园\t398020\n三年来\t398021\n望京西站\t398022\n供试品\t398023\n金锄头\t398024\nBlended\t398025\n孟强\t398026\n营生\t398027\n高八度\t398028\n二选一\t398029\n晴儿\t398030\nlegends\t398031\nMerry\t398032\npir\t398033\n15米\t398034\n常州奔牛\t398035\n巨\t398036\n沙家浜镇\t398037\n人力资源共享服务中心\t398038\n胜威国际\t398039\nSaga\t398040\n三星j7\t398041\nEAC\t398042\nW型\t398043\n苏州妈妈网\t398044\n2匹\t398045\n按纽\t398046\n西港\t398047\ntiledmap\t398048\n中华啤酒\t398049\n成都奥数网\t398050\nClarivate\t398051\n素朴\t398052\n偏差\t398053\n柳州市人力资源与社会保障局\t398054\n马夹\t398055\n2016年11月17日\t398056\n撇下\t398057\n大吉岭\t398058\n威县\t398059\n麦特\t398060\nroom\t398061\n昆明市规划局\t398062\n黑森林\t398063\n第十三天\t398064\n增值税后\t398065\n橄榄形\t398066\n男发\t398067\n霍城\t398068\n22章\t398069\ncomponent\t398070\n2min\t398071\nWebCollector\t398072\n红黑\t398073\n亨达\t398074\n上海领事馆\t398075\n落枕\t398076\n160619\t398077\nnema\t398078\n人教版五年级数学\t398079\ncharged\t398080\n泛读\t398081\n电影网\t398082\n视友\t398083\n安康政法网\t398084\n干脆面\t398085\n子句\t398086\n良庆区\t398087\n郭守杰\t398088\n银川新闻网\t398089\nhandbags\t398090\n红香港马会\t398091\n百姓呼声_洛阳网\t398092\n若尔\t398093\n赫尔希\t398094\n手机号段归属地数据库\t398095\n马荣\t398096\n裂魂\t398097\nUHelp\t398098\n中央转移支付\t398099\nukraine\t398100\n热辣\t398101\n600584\t398102\nLAI\t398103\n超准\t398104\n眼晕\t398105\n3xplanet\t398106\n欢乐集结号\t398107\n樟坑\t398108\n林田\t398109\n农经站\
t398110\n达者\t398111\n安粮城市广场\t398112\n迈巴赫S级\t398113\n异乡人\t398114\npyt\t398115\n三超\t398116\n拼团\t398117\nworkaround\t398118\n丹·布朗\t398119\ndocking\t398120\n义军\t398121\nfreertos\t398122\nvitas\t398123\n测量方法\t398124\nmarcus\t398125\n棋道\t398126\n烛之武退秦师\t398127\n钢混\t398128\n汉阴新闻网\t398129\n魔芋豆腐\t398130\n出租汽车\t398131\naesthetic\t398132\nXGBOOST\t398133\n梧桐庄园\t398134\n武田\t398135\n班班通\t398136\n医食\t398137\nhifi播放器\t398138\nISO/TS16949\t398139\n翰宇药业\t398140\n双软\t398141\n金刚:骷髅岛\t398142\n爱寻迷\t398143\n霍邱县人民政府\t398144\n富隆\t398145\nlookupedit\t398146\n告知\t398147\n断局\t398148\nAJ5\t398149\n青医附院\t398150\n角色\t398151\nEFS\t398152\n1528\t398153\n梁思成\t398154\n豆腐坊\t398155\n女王范\t398156\n废妃\t398157\nccache\t398158\n受益股\t398159\n这件小事\t398160\n全陪\t398161\n毕业答辩\t398162\nCATION\t398163\n身为\t398164\n润城\t398165\nBoulder\t398166\n强迫症\t398167\n超具\t398168\n和龙市\t398169\ncentrum\t398170\nk彩\t398171\n同安影视城\t398172\n北蔡\t398173\nCommunicator\t398174\nro膜\t398175\n22万元\t398176\n中国老龄事业发展基金会\t398177\nRiley\t398178\njsx\t398179\n叶丶梓轩\t398180\n阿登\t398181\niPad平板\t398182\n庆应义塾大学\t398183\n征信证明\t398184\n123美食网\t398185\n私募\t398186\n毛脚\t398187\n西华路\t398188\n_洛奇\t398189\n藤原拓海\t398190\n永宁门\t398191\n博美俱乐部\t398192\n夜光杯\t398193\n22亿元\t398194\n试炼\t398195\n官僚\t398196\n潮鸣\t398197\nmiaoss\t398198\n49度\t398199\n锅筒\t398200\nIdeas\t398201\nf-452227\t398202\n删掉\t398203\n曹光\t398204\nRoyalstar\t398205\n免费发布信息网\t398206\n朱家峪\t398207\n通讯板\t398208\n电影机\t398209\n萤火\t398210\n最值\t398211\nenumerator\t398212\nHuan\t398213\nitaliano\t398214\nambiguous\t398215\nweekly\t398216\n英语四级\t398217\nit之家\t398218\n水龟\t398219\n陈晓娟\t398220\n云信\t398221\n盖伦特\t398222\nsl500\t398223\n照相机\t398224\n12月30日\t398225\n山东大学经济研究院\t398226\n斗殴\t398227\n陈年\t398228\n指着\t398229\n20171129\t398230\n几秒钟\t398231\n神咲\t398232\nHD-MP4/MKV\t398233\n改扩建\t398234\n金域\t398235\nInfographic\t398236\nhuifa\t398237\n汤晶锦\t398238\nfirs\t398239\n久\t398240\nCorning\t398241\n两代人\t398242\n高泰\t398243\n圆孔\t398244\nBeoPlay\t398245\nratingbar\t398246\n魔瞳\t398247\n牛皮癣\t398248\n世道\t398249\n有天有\t398
250\n商卡\t398251\n裂缝性\t398252\n海外版\t398253\n贵州省科学技术厅\t398254\n山东法制报\t398255\n橛子\t398256\n什么者\t398257\n液泵\t398258\nbeijign\t398259\nv19\t398260\nvuitton\t398261\n渚\t398262\n轻舟网\t398263\n兴澄特钢\t398264\n要不\t398265\nV4.8\t398266\n5度\t398267\n000550\t398268\n1961年\t398269\n荀慧生\t398270\n云南警官学院\t398271\n高恪\t398272\n证金公司\t398273\n三国\t398274\n今晚8点\t398275\n5-2\t398276\n小猴子下山\t398277\n驾信\t398278\nf16\t398279\nMizuno\t398280\n杨老板\t398281\n人声\t398282\n基督教徒\t398283\nibc\t398284\n体育与健康\t398285\n累进\t398286\nChinaUnix\t398287\n溯源性\t398288\n逆作\t398289\n火腿肠\t398290\nFm\t398291\n253\t398292\nLinux新闻_Linux\t398293\n权籍\t398294\nInteger\t398295\n第37届\t398296\n舞帝\t398297\n口爱\t398298\nlight\t398299\n映容雪\t398300\nxclip\t398301\n特种纸\t398302\n穆勒\t398303\n倚仗\t398304\n刀剑神域:虚空幻界\t398305\n汤沟\t398306\n天津市委\t398307\n洋葱表皮细胞\t398308\nlisence\t398309\nJohnny\t398310\n错爱一生\t398311\n杨璞\t398312\n董海涛\t398313\n魅族15plus\t398314\nMod*\t398315\n铂\t398316\n国际商贸城\t398317\n林南\t398318\n美津\t398319\n中山汽车总站\t398320\n卫聂\t398321\nmok\t398322\ne书\t398323\nVersions\t398324\nyolo3\t398325\n三个半\t398326\n武珞路中学\t398327\n漆雕\t398328\n破门而入\t398329\n法定监护人\t398330\n纳尼亚传奇\t398331\n搏世\t398332\n第三联\t398333\n百万日元\t398334\n血战冲绳岛\t398335\n黑铁山\t398336\n本地车\t398337\nprefetch\t398338\n霍普杯\t398339\n明斯克\t398340\n齐聚首\t398341\nkiller\t398342\n金金\t398343\n帝国时代3\t398344\n仙法\t398345\n7.0圣\t398346\n刘菲\t398347\n红蓝绿\t398348\n胸卡\t398349\n刘大鹏\t398350\n团泊湖\t398351\n青海省工商行政管理局\t398352\n禁色\t398353\n上海社保\t398354\n精校版\t398355\n短信群发\t398356\n三路知识网\t398357\n质检院\t398358\n江苏省省\t398359\n米聊\t398360\n桃花茶\t398361\ncvc\t398362\n云南省林业厅\t398363\n91页\t398364\n装\t398365\n相序\t398366\n人场\t398367\nAM335x\t398368\n双龙村\t398369\n6602\t398370\n大地飞歌\t398371\n小额\t398372\n滨河社区\t398373\nkubernates\t398374\n唐宗宋祖\t398375\ncurtain\t398376\n将它\t398377\n樊少皇\t398378\nBla\t398379\n农场类\t398380\n恭喜发财\t398381\narcteryx\t398382\n税价\t398383\n陈阅增\t398384\n五华山\t398385\nCave\t398386\n2.3.10\t398387\n厢式车\t398388\nLocks\t398389\n兆易创新\t398390\n金艺贞\t398391\n360p\t398392\n伊可新\t398393\n无照\t398394\n银翘
解毒片\t398395\nArch\t398396\n无限循环小数\t398397\n翻转\t398398\nMissEvan\t398399\n茶饭\t398400\n2.28\t398401\n延长石油集团\t398402\n神武币\t398403\n等你爱我\t398404\n10盎司\t398405\n影\t398406\nYUN\t398407\n啵啵\t398408\n月子餐食谱\t398409\n小企鹅\t398410\n石牌岭\t398411\nGlasses\t398412\ndasha\t398413\n兴隆社区\t398414\n定安县\t398415\n预缴\t398416\nVRAY\t398417\n滑舌\t398418\n城市管理\t398419\n龙珠超次元乱战吧\t398420\n锕\t398421\n碎金\t398422\n吓呆\t398423\n54万亿\t398424\n取消资格\t398425\n玛修\t398426\n巴达克\t398427\nchatur\t398428\nBuyer\t398429\n陕西信合\t398430\n湖南省总工会\t398431\n热死\t398432\n1.3GB\t398433\n密度计\t398434\n架梁\t398435\n领料\t398436\n瓦伦丁\t398437\n中南集团\t398438\n2016年1月29日\t398439\n淋巴管炎\t398440\n芜湖公共资源交易中心\t398441\n气眼\t398442\nCollation\t398443\n龙8\t398444\n唐文龙\t398445\n玫瑰园小区\t398446\n云宝\t398447\n上海政府\t398448\n安吉县政府\t398449\nZ270\t398450\n1.31\t398451\n作呕\t398452\n题名考试网\t398453\n低吟\t398454\n内审师\t398455\n原州区\t398456\nmobi,azw3\t398457\npgd\t398458\n吴淼\t398459\n4488\t398460\n兰舟\t398461\n落照\t398462\n金沙县\t398463\n上海国际集团有限公司\t398464\n_路路\t398465\n泊车\t398466\nkoolproxy\t398467\n武道馆\t398468\n滇红\t398469\n四勤\t398470\n孟河\t398471\n杉山\t398472\n更是\t398473\n凯里·欧文\t398474\nh55\t398475\nQQ旋风\t398476\n台网\t398477\n弃管\t398478\n调处\t398479\n悚然\t398480\n张晶晶\t398481\njsjy\t398482\n唐德宗\t398483\n氢氧化钠\t398484\nHASP\t398485\nslide\t398486\n奎师\t398487\n山西人才网\t398488\ndhl国际快递\t398489\n张冰冰\t398490\n防\t398491\nDVD\t398492\n奎屯市\t398493\n万跃华\t398494\n哈弗H8\t398495\ncbc\t398496\n林凯\t398497\n跳远\t398498\n吴兴路\t398499\n花点\t398500\n全服装\t398501\n绿城春江明月\t398502\n绫濑遥\t398503\n飞镖\t398504\n赵达\t398505\n谷得\t398506\n保利剧院\t398507\n1000只\t398508\n一镜\t398509\n置家网\t398510\n上古卷轴5:天\t398511\n荃银高科\t398512\n小臂\t398513\n烧结机\t398514\nAlexis\t398515\n黑帽子\t398516\n车程\t398517\n钛钉\t398518\nsho\t398519\ndsv\t398520\n奇迹般\t398521\n这一套\t398522\n王小马\t398523\n化学工程与技术\t398524\n16299.309\t398525\n3000k\t398526\n海康网络\t398527\n新航道留学服务中心\t398528\n第一批次\t398529\n超凡双生\t398530\n合肥城市轨道交通有限公司\t398531\n95周年\t398532\n东方日升\t398533\n米其林驰\t398534\n适用\t398535\n北苑路北\t398536\n513所\t398537\n20次\t398538\n中国电信北京公司\t398539
\n塑料\t398540\n校园文明\t398541\n附言\t398542\n瑞悦府\t398543\ngo语言\t398544\n结晶体\t398545\n恒腾\t398546\n漫改\t398547\n高尔夫6\t398548\nBorn\t398549\n传奇大掌柜\t398550\nALT键\t398551\n爱财如\t398552\n生化奇兵2\t398553\n番禺广场\t398554\nUbiquiti\t398555\nGrape\t398556\nVinoZhu\t398557\n哄\t398558\n市体育局\t398559\n蛇患\t398560\n祛\t398561\n中型机\t398562\n多伦科技\t398563\n档把\t398564\n洗衣工\t398565\n混凝土\t398566\n深圳大学研究生院\t398567\n求说\t398568\nAlanTao\t398569\n黑貂\t398570\nCinema4D\t398571\n银监分局\t398572\n标准普尔500指数\t398573\n速配网\t398574\n关节\t398575\n鲜蘑\t398576\nValue\t398577\n宋世雄\t398578\ninterbrand\t398579\n铁窗\t398580\ngsd\t398581\n群居\t398582\n200km\t398583\n8S\t398584\n小泉彩\t398585\n大桶\t398586\n吉米多维奇\t398587\n太湖美\t398588\n终验\t398589\n中银证券\t398590\nav在线视频\t398591\n王秀梅\t398592\n焦虑型\t398593\ncut\t398594\n生份证\t398595\n过驳\t398596\n内蒙古民族大学\t398597\n车载以太网\t398598\n狗镇\t398599\n3-5年\t398600\n澳超\t398601\n王丽大鹏\t398602\n微探\t398603\n那边\t398604\n出借人\t398605\n长焦\t398606\n定神\t398607\n盾安\t398608\n5LS\t398609\nHooks\t398610\n消毒液\t398611\n1234\t398612\n怀化南\t398613\n工具栏\t398614\n玩味\t398615\n智联招聘吧\t398616\n起搏器\t398617\n魏小安\t398618\n看见\t398619\n正泰电气\t398620\nv100\t398621\nhash_map\t398622\n子衿\t398623\n引尖叫\t398624\n副局\t398625\nvjc\t398626\n海军服\t398627\nx265\t398628\nCode128\t398629\n钨酸钠\t398630\n哈尔滨西站\t398631\n张海滨\t398632\n防火服\t398633\n铂锐\t398634\n大卫·霍克尼\t398635\n棉棒\t398636\n中升\t398637\n象州县\t398638\n嘴炮\t398639\n趋势科技\t398640\n宝马水鸟\t398641\n针织裤\t398642\n大亚湾石化区\t398643\n范雨素\t398644\n云南政府\t398645\n厚声\t398646\nLeftso\t398647\n东森\t398648\ngts\t398649\n寿春路\t398650\n合金化\t398651\n深圳航空公司\t398652\n瀚蓝环境\t398653\n敢问\t398654\n靖江人才网\t398655\n有效\t398656\n判答\t398657\n个人征信系统\t398658\nbaseadapter\t398659\namphenol\t398660\n边墙\t398661\nSoundtrack\t398662\n冰棒\t398663\n黑暗深渊\t398664\n乞求\t398665\n朱毅\t398666\ncustomize\t398667\n米族\t398668\n人寿保险公司\t398669\n张亚魏千翔\t398670\n戈培尔\t398671\n许鑫\t398672\n错题集\t398673\n一万多\t398674\n橙乡\t398675\n过崖\t398676\n自游\t398677\n快递盒\t398678\n一第二\t398679\n筒袜\t398680\nhtm\t398681\n青岛市立医院\t398682\n9_\t398683\n北京经济技术开发区\t398684\n收费室\t398685\n烧
失\t398686\n卷标\t398687\n成器\t398688\n微信群控系统\t398689\n绑架\t398690\n1845\t398691\n快艇\t398692\nONES\t398693\n5801\t398694\n金士泰\t398695\n重庆市国资委\t398696\n毯\t398697\n南航明珠俱乐部\t398698\n太平河\t398699\n2.72\t398700\n我的爷爷\t398701\n卢布尔雅那\t398702\n京华城\t398703\n注魔\t398704\n子午线\t398705\n上海汽车\t398706\n三顾冒菜\t398707\n浮空岛\t398708\n特郎普\t398709\n宜立方商城\t398710\n安徽移动\t398711\n棉纱\t398712\n高薪\t398713\nfra\t398714\n冒险岛飞侠\t398715\n校牌\t398716\nlogfile\t398717\nAUG\t398718\n第一季06\t398719\n宏安\t398720\n云水禅心\t398721\n行走\t398722\n南京工业大学新闻中心\t398723\nintegrity\t398724\n数码宝贝3\t398725\n绿之源\t398726\ntigervnc\t398727\n炝锅鱼\t398728\n被保险人和\t398729\nui\t398730\n高保\t398731\n2.07G\t398732\nStands\t398733\n连播\t398734\n齿轮减速电机\t398735\n叶晨\t398736\n赛璐璐\t398737\n美汁源\t398738\n超级联赛\t398739\n钢跳板\t398740\n中药\t398741\n浯\t398742\n存大\t398743\n234号\t398744\n折纸机\t398745\n学医\t398746\njoanna\t398747\n财神网\t398748\n月光光\t398749\n新途\t398750\n李宓儿\t398751\n藿香正气口服液\t398752\nmuslim\t398753\n微信\t398754\n指甲油\t398755\nMaxIE\t398756\ntoolbag\t398757\n工程服\t398758\n罪行\t398759\n行货\t398760\nmarshall\t398761\n七段\t398762\n儿童游乐园\t398763\n钻开\t398764\n移民\t398765\n3.88\t398766\n一号馆\t398767\n浓眉\t398768\n都市传说\t398769\n鲩鱼\t398770\n29.0.0\t398771\n油电混合车\t398772\n人参皂苷rh2\t398773\n888号\t398774\n平流层\t398775\n玩雪\t398776\nval\t398777\nCBT\t398778\n黄蓓佳\t398779\n糊口\t398780\n56秒\t398781\n上饶市\t398782\n中铁二局集团电务工程有限公司\t398783\n7615\t398784\n求测\t398785\n结核\t398786\nhedu\t398787\n780ti\t398788\n北京中国国旅\t398789\n清点\t398790\n京九\t398791\n五水共治\t398792\n融创\t398793\nOsaka\t398794\n枣林湾\t398795\n3.4.4\t398796\n尿酸\t398797\n黑龙江日报\t398798\n内网通\t398799\nquestmobile\t398800\n微耕机\t398801\n风一吹\t398802\n20160310\t398803\n300目\t398804\n椰城\t398805\naspect\t398806\n早茶\t398807\n华为畅享6\t398808\n蓝暴\t398809\n群臣\t398810\n蛋白霜\t398811\n蜡纸\t398812\n尾盘\t398813\n广西电网公司\t398814\n军网\t398815\n刘生\t398816\nbraun\t398817\n尚品\t398818\n下周\t398819\n通信管理局\t398820\nSeasons\t398821\n7111\t398822\n130米\t398823\n上海烟草集团有限责任公司\t398824\nwebpack-cli\t398825\n软脂酸\t398826\nxp吧\t398827\nwin32.zip\t398828\n0943\t398829\
n数字监控系统\t398830\n绝地求生98k\t398831\n花塞\t398832\n乐都县\t398833\n瘸腿\t398834\n601788\t398835\n养儿防老\t398836\n1006\t398837\n利害关系人\t398838\n12x2\t398839\n张昊亮\t398840\n国药器械\t398841\n身材\t398842\ncurrents\t398843\n12.4\t398844\n夜景\t398845\n类固醇\t398846\n销货方\t398847\n二叉搜索树\t398848\n汉沽管理区\t398849\njiangshi\t398850\n83平米\t398851\ndoc_圈中人寿险资源网\t398852\n天创\t398853\n招纳\t398854\nws413\t398855\n舌\t398856\n角输入法\t398857\n漫漫画\t398858\n亚群\t398859\n预应力钢绞线\t398860\n中国积客网\t398861\n华山站\t398862\n管理办\t398863\n半对半\t398864\n一次线\t398865\n萨尔瓦多\t398866\n一级一级\t398867\n正新鸡排\t398868\n皮皮\t398869\npdf转word\t398870\n叶税\t398871\n价值线\t398872\n滚石\t398873\n青竹\t398874\n玉子爱情故事\t398875\n凌源吧\t398876\n党费\t398877\n终了\t398878\n玉树州\t398879\n仙境传说RO百科全书\t398880\n杜甫盛一伦\t398881\n应收单\t398882\n四哥\t398883\n几手\t398884\n政商\t398885\n餐椅\t398886\n首頁\t398887\nperrier\t398888\n江苏商报\t398889\n银湖湾\t398890\n湖北菜\t398891\n三少年\t398892\n多几个\t398893\n华创证券\t398894\n集装箱式\t398895\nsass\t398896\n星之卡比\t398897\n二肖\t398898\n鲁迅全集\t398899\n凶弹\t398900\n梵文版\t398901\nmoq\t398902\n汪东兴\t398903\nseed吧\t398904\n奇爱\t398905\n苜蓿园\t398906\n神龙汽车\t398907\n丝路山水地图\t398908\n高云龙\t398909\n广东省交通集团有限公司\t398910\n二号位\t398911\n银商\t398912\nmutes\t398913\n%、&、#\t398914\n生肖币\t398915\n渐变段\t398916\n双子大厦\t398917\n台州市科学技术局\t398918\n包边条\t398919\n展展商\t398920\n游离态\t398921\n石羊场\t398922\n瑞城\t398923\n甘肃中医药大学附属医院\t398924\n逸城\t398925\nSears\t398926\n边纸\t398927\nuWSGI\t398928\n清晖园\t398929\nnü\t398930\n干邑\t398931\nregulating\t398932\n5.5%\t398933\nepaper\t398934\n大理苍山\t398935\n似是\t398936\n角星\t398937\nMod\t398938\n新叶古村\t398939\n接收函\t398940\n西樵\t398941\nFFT变换\t398942\ncustomary\t398943\n铭宣海淘\t398944\n20180205\t398945\n百度云网页版\t398946\n异议书\t398947\n新建社区\t398948\n抗灾\t398949\n所以\t398950\nwinbugs\t398951\n两手\t398952\n防爆操作柱\t398953\n四少\t398954\nM228\t398955\n797\t398956\n太子参\t398957\n轮舞\t398958\n存托\t398959\n达成协议\t398960\n张家玮\t398961\n湖北交通职业技术学院\t398962\n0.66%\t398963\n战斗\t398964\n梦幻西游手游仙玉\t398965\n上海老年大学\t398966\n蠢事\t398967\n云境\t398968\nAssociation\t398969\n社评\t398970\n迪玛\t398971\n89届\t398972\n东京奥运
会\t398973\n胆瓶\t398974\nNPD\t398975\n收保\t398976\n电话邦\t398977\n20多秒\t398978\n敬修堂\t398979\n禁曲\t398980\n112亿\t398981\n简媜\t398982\nyingying\t398983\n油子\t398984\n中山西路\t398985\nsuperior\t398986\n兰总\t398987\n民乐团\t398988\n流仪\t398989\n金凯利\t398990\n喜马拉雅山\t398991\nUEFI+GPT\t398992\n开放式\t398993\n50万\t398994\n北大医疗\t398995\n高洁丝\t398996\n暗黑地牢\t398997\n品克缤\t398998\n即开型\t398999\n硬轴\t399000\n阿密特\t399001\n历史军\t399002\n加勒比女海盗2\t399003\n淮山\t399004\n零三个月\t399005\nActiveXObject\t399006\nConsumer\t399007\nuniqueidentifier\t399008\n黑管\t399009\n叶汉\t399010\n壹基金\t399011\n半数\t399012\n蟹黄堡\t399013\nstacked\t399014\n缺锌\t399015\n03\t399016\n朱古力\t399017\n农业产业结构\t399018\n一】\t399019\nCoordinatorLayout\t399020\nLX100\t399021\nopencsv\t399022\n刘群\t399023\n妙答\t399024\n大快朵颐\t399025\n利辛\t399026\n云网盘\t399027\n通安镇\t399028\nbrucemengbm\t399029\nconception\t399030\n负增长\t399031\nMabinogi\t399032\n幸福城\t399033\n陈士榘\t399034\n硅酸钙\t399035\n超过2分钟\t399036\n商务英语专业\t399037\n软膏\t399038\n供货\t399039\n干丝\t399040\n云豆\t399041\nsessionfactory\t399042\n绒裤\t399043\n佳能g2800\t399044\n冰清\t399045\n8.35\t399046\nxcache\t399047\n三三四十\t399048\n340\t399049\n华夏装饰网\t399050\n小熊维尼\t399051\n认房认贷\t399052\nwhether\t399053\nsae8\t399054\nFagalicious\t399055\n金融信息服务有限公司\t399056\nPrisoner\t399057\n峨\t399058\n200多家\t399059\n姜维传吧\t399060\n批捕\t399061\n石基信息\t399062\n赛高\t399063\n伽梨耶\t399064\n思想道德建设\t399065\n半夏\t399066\n三上悠亞\t399067\n泰伦\t399068\n浪矢解忧杂货店\t399069\n44首\t399070\n9码\t399071\nthx\t399072\n丅\t399073\n望族\t399074\n8x8\t399075\n宁波装修公司\t399076\n锐创\t399077\n0.30\t399078\n9号秘事\t399079\n斥\t399080\n金华乐居\t399081\nerin\t399082\n晓寒\t399083\n仓库员\t399084\n燕京大学\t399085\n优厚\t399086\n大澳渔村\t399087\nドキドキ\t399088\n共绘\t399089\n蒙挚\t399090\n沉疴\t399091\n凯元\t399092\n爆肝工程师的异世界狂想曲\t399093\n网易云盘\t399094\n占格\t399095\n野生技术协会_科技\t399096\n物候期\t399097\n蚝油\t399098\n赵海英\t399099\n亮瑜\t399100\n银瓶山\t399101\n黑暗正义联盟\t399102\n老沙\t399103\nimpl\t399104\n不甘心\t399105\ndigimonlinkz\t399106\n掩饰\t399107\n曲率\t399108\n落锤式\t399109\nJudith\t399110\n杰尼亚\t399111\n迪森\t399112\n头领\t399113\nl5420\t39
9114\n网络路由器\t399115\nAverage\t399116\n南京市卫生局\t399117\n交通运输部规划研究院\t399118\n36p\t399119\n解剖台\t399120\nfirefox浏览器\t399121\n卸力\t399122\n向阀\t399123\n亚拉戈\t399124\n周边地区\t399125\nwyu123\t399126\n33卷\t399127\n开心麻花话剧\t399128\n盈香生态园\t399129\nactiv\t399130\n借刀杀人\t399131\n何如\t399132\n继续\t399133\nSWI\t399134\n超疏水\t399135\n长铗\t399136\n27款\t399137\n黒人\t399138\n潮爆三国\t399139\n天通股份\t399140\nsincoolvip\t399141\n教育技术学\t399142\n梁州\t399143\nLUXU\t399144\n幸福年\t399145\n抓拍机\t399146\n明白纸\t399147\nchow\t399148\nrudy\t399149\n华菁证券\t399150\n语音计算器\t399151\nwowui\t399152\n料位计\t399153\n南医大\t399154\n桂附地黄丸\t399155\n脚癖\t399156\n中国安全生产报\t399157\nbinary\t399158\n疏学\t399159\n优青\t399160\n难兄难弟\t399161\n培养物\t399162\n现代教育技术中心\t399163\ngod\t399164\n小米6论坛\t399165\n明基投影\t399166\n菜系\t399167\n锦鸿\t399168\n同异\t399169\n恩断义绝\t399170\n横抱\t399171\n箭侠\t399172\nQQ自由幻想\t399173\n连衣裤\t399174\nSqlSugar\t399175\n狼牙\t399176\n跨中\t399177\n连体衣\t399178\n马口铁罐\t399179\n名侦探柯南:零之执行人\t399180\n智能电能表\t399181\n志于学\t399182\nMadagascar\t399183\n亲爱的你\t399184\n4.39\t399185\n宇哥\t399186\n天津市审计局\t399187\nDefining\t399188\n潜客\t399189\n枪林弹雨\t399190\n杜峰\t399191\nBGN\t399192\n十八届七中全会\t399193\n深圳市高新投集团有限公司\t399194\n许氏\t399195\nzgl\t399196\nm1a2\t399197\n航空动力学报\t399198\n警权\t399199\n跳骚\t399200\n上海张江高科\t399201\nsetpoint\t399202\n县庆\t399203\n极恶非道3\t399204\n碰碰狐\t399205\n航盛\t399206\n另一个城市\t399207\n白丁\t399208\n第52条\t399209\n申请量\t399210\n江绵恒\t399211\n论文集\t399212\n加工工\t399213\n亚北\t399214\n上海浦东新区\t399215\nr6300\t399216\nsql-server\t399217\n孟大宝\t399218\n凯瑟琳泽塔琼斯\t399219\nHundreds\t399220\n安徽医学高等专科学校\t399221\nATG\t399222\n巨坑\t399223\n知乎神贴\t399224\n婚育\t399225\n高估值\t399226\n一起嗨\t399227\n游丝\t399228\n样纸\t399229\nnslookup\t399230\n官方群\t399231\n周爱民\t399232\n富春环保\t399233\n董家沟\t399234\nemergent\t399235\n肖军\t399236\n三国志孔明传\t399237\n石鑫\t399238\n田连元\t399239\n中台\t399240\n洛河镇\t399241\n各显\t399242\n4015\t399243\n李sir\t399244\n母字\t399245\n441\t399246\n货厢\t399247\n窗口\t399248\n广联达\t399249\n荔枝湾\t399250\n同侧\t399251\n预产期计算器\t399252\nchar型\t399253\n捶\t399254\n野猪流\t399255\n侏罗纪公园2\t3992
56\n协调者\t399257\n漯河职业技术学院\t399258\nprinciples\t399259\nkip\t399260\nGEN8\t399261\n书值\t399262\n江镇\t399263\n宝牛\t399264\n末端执行器\t399265\nFOW\t399266\n9处\t399267\n沃伦·巴菲特\t399268\n苏尔特尔\t399269\nfeeds\t399270\n美凯龙\t399271\n坤音\t399272\n借花\t399273\n金华市住房和城乡建设局\t399274\n胞胎\t399275\n扎加拉\t399276\nwi10\t399277\n岗级\t399278\n元青花\t399279\nJava软件工程师\t399280\n20160923\t399281\n管鲍之交\t399282\nheroine\t399283\n银鹭\t399284\n傍\t399285\nanais\t399286\n华董\t399287\n还有\t399288\n2.4.10\t399289\n易嘉爱\t399290\nbarriers\t399291\n映泰集团\t399292\n指南针\t399293\n溪秀\t399294\n白寿彝\t399295\n百度云盘\t399296\nWin1\t399297\n现代语文\t399298\n第162章\t399299\nconfused\t399300\n韩餐\t399301\n河北工大\t399302\n桂花城\t399303\ntrigger\t399304\n骨灰堂\t399305\nPSD\t399306\nREN\t399307\n王雪冰\t399308\n三国风云\t399309\ntrackpad\t399310\n39.6\t399311\n滨北\t399312\n宏锐电气\t399313\n贴近\t399314\n背栓\t399315\n周希俭\t399316\n乙肝肝硬化\t399317\n原地踏步\t399318\n五十多岁\t399319\n圆柱头\t399320\n功能类\t399321\n文化部\t399322\nFanny\t399323\n34本\t399324\n韩飞官\t399325\n第三十四届\t399326\n中国医学科学院整形外科医院\t399327\nReed\t399328\n长沙火车南站\t399329\n沙米\t399330\n美巡赛\t399331\n腰果酚\t399332\n严伟\t399333\n歼15\t399334\n亿人\t399335\nMonoDevelop\t399336\n国史\t399337\n汗牛\t399338\n谱序\t399339\nPhilharmonic\t399340\n气态氢化物\t399341\n张丽娜\t399342\n飞凡网\t399343\ngongyed\t399344\n潘璋\t399345\n河西走廊\t399346\n海大集团\t399347\n精华油\t399348\n从新开始\t399349\n演示器\t399350\n鲁滨逊\t399351\n李香兰\t399352\nxcl\t399353\n人造石英石\t399354\n子牙\t399355\n电所\t399356\n银河系\t399357\n同袍\t399358\n恋秀\t399359\n屏闪\t399360\n真英雄\t399361\n22.com\t399362\n鱼排\t399363\n深圳远东妇产医院\t399364\n绥阳政府\t399365\n8月8日\t399366\nDeduction\t399367\n高佳\t399368\nプライベ\t399369\n19句\t399370\n74天\t399371\n仙居县\t399372\n书呆子\t399373\n说清楚\t399374\n听稿\t399375\n照应\t399376\n20171018\t399377\n那天\t399378\nNodeList\t399379\nnrf52832\t399380\n结石\t399381\n八菱科技\t399382\n深蹲架\t399383\n一餐\t399384\n低功耗\t399385\n侏罗纪世界\t399386\n王者荣耀kpl\t399387\n0803\t399388\n桃红梨白\t399389\n第2级\t399390\n内踝\t399391\n正硅酸乙酯\t399392\n链环\t399393\n芙蓉\t399394\n填鸭\t399395\n成都站\t399396\nBtjidi\t399397\nbpr\t399398\nlucky\t399399\n
houseparty\t399400\n中信山语湖\t399401\ncopyright\t399402\n0801\t399403\n康馨\t399404\neclipes\t399405\n40大\t399406\n怪兽片\t399407\n易考\t399408\n国际新城\t399409\n1985年\t399410\nv3.11\t399411\n中国共产党历史\t399412\n东南大学能源与环境学院\t399413\n43名\t399414\n化学工业园区\t399415\n宫庭\t399416\n60例\t399417\n总报\t399418\n村戏\t399419\n重机枪\t399420\n四十元\t399421\n美玲\t399422\n神漫\t399423\nparallel\t399424\n凉帽\t399425\n中国保监会\t399426\nWindow\t399427\n中外运空运发展股份有限公司\t399428\n张瑞\t399429\n麻醉机\t399430\n染黑\t399431\n盐化\t399432\n自纠\t399433\nt2\t399434\n人力资源和社会保障厅\t399435\n玄奘西游记\t399436\n冶山\t399437\n时主\t399438\nedifier\t399439\n搭乘\t399440\n虫害\t399441\nvgj\t399442\n燕窝批发/\t399443\n张桃芳\t399444\n刮目相看\t399445\n杜力\t399446\n战网点\t399447\n高峡\t399448\n安徽省知识产权局\t399449\n互补性\t399450\n李帆\t399451\n古文运动\t399452\n艺本\t399453\n模量\t399454\n肩屏\t399455\n套性\t399456\n穆青\t399457\n房地产论坛\t399458\nbrunch\t399459\n广州华美医疗美容医院\t399460\n潮童\t399461\n愤怒的小鸟英雄传\t399462\nOpenTable\t399463\n宏昌电子\t399464\n郑伟\t399465\n姬野爱\t399466\n1221\t399467\n恒发\t399468\nCDX\t399469\nPREMIUM\t399470\n涡阳县人民政府\t399471\n次页\t399472\n北京地铁9号线\t399473\n真原始传奇\t399474\n莱丝\t399475\n1522\t399476\n石家庄东\t399477\n停车场管理系统\t399478\n气体灭火控制器\t399479\n禹越镇\t399480\n旷世奇才\t399481\n龙王塘樱花园\t399482\nAix\t399483\n皖河\t399484\n小狼毫\t399485\n传志\t399486\n职工们\t399487\n好_妈妈网\t399488\n龙战士传说\t399489\n囿于\t399490\n搜狗五笔\t399491\nworkpress\t399492\nliquor\t399493\ntopological\t399494\n连胜\t399495\n自言\t399496\n中华路小学\t399497\nNZBZ\t399498\n37.1\t399499\n物联网+\t399500\n剑灵剑士吧\t399501\n预制构件\t399502\netabs\t399503\nsqe\t399504\n害人不浅\t399505\n朱罗\t399506\n市区内\t399507\nMath\t399508\n中国中学\t399509\n&#47\t399510\n重武器\t399511\nlxw18231857001\t399512\n吉日\t399513\nfounder\t399514\n杰克\t399515\n云南冶金集团\t399516\nMsg\t399517\n解人意\t399518\n3773考试网\t399519\nsyj\t399520\n2017年10月11日\t399521\n帕金森病日\t399522\nSeducer\t399523\n最终篇\t399524\n71个\t399525\n汤姆克兰西\t399526\n陶源\t399527\n路由事件\t399528\nitunes\t399529\n7关\t399530\n极2\t399531\n持械\t399532\n香洲区\t399533\n诸葛诞\t399534\n休息厅\t399535\n处理费\t399536\n朱主爱\t399537\nRecent\t399538\naccord\t399539\nVie
wHolder\t399540\n联邦快递公司\t399541\n编程之路\t399542\n学界\t399543\n寂寞\t399544\ngugudan\t399545\nxx年\t399546\n北京幼升小信息\t399547\nキャラ\t399548\n李井泉\t399549\n02325\t399550\n暗黑武士\t399551\n邬先生\t399552\nletters\t399553\nthink-cell\t399554\n要不要\t399555\n限载\t399556\nmca\t399557\nrefa\t399558\n下巴处\t399559\nPick\t399560\n方正飞腾\t399561\n科艺\t399562\nwar包\t399563\n炉石巫妖王\t399564\n莉娅\t399565\n瑞银\t399566\nBeds\t399567\n从海底出击\t399568\n国泰民安\t399569\n电阻式传感器\t399570\nLan\t399571\n哑光唇釉\t399572\n广西师范大学\t399573\nCollaborative\t399574\n电视类\t399575\n苏州工业园区公积金管理中心\t399576\n雅思g类\t399577\n北斗星司\t399578\n十万个冷笑话\t399579\n電視\t399580\n数理\t399581\n劳动与社会保障法\t399582\n村民委员会\t399583\n悦美网\t399584\nanone\t399585\n光团\t399586\nForma\t399587\nfc模拟器\t399588\n李老八\t399589\n友声\t399590\ncoe\t399591\n高宇\t399592\nPads\t399593\n仅供\t399594\ncnbeta\t399595\n易观国际\t399596\n10平\t399597\n偶像大师灰姑娘女孩星光舞台\t399598\n仰天一笑\t399599\n中国电影集团公司\t399600\n河南农村信用社\t399601\nfreetds\t399602\n欺君\t399603\nnarrow\t399604\n北欧女神2\t399605\n湘子桥\t399606\n醌\t399607\n预拱度\t399608\n蒸饭车\t399609\n复仇\t399610\n一山一\t399611\n斗牛士\t399612\n硫酸根离子\t399613\nessen\t399614\n红兵\t399615\nfnt\t399616\n直接引语\t399617\nlinkbutton\t399618\n别等\t399619\n料场\t399620\n拉罗拉\t399621\n效率\t399622\n大连物流公司\t399623\n中文输入法\t399624\n每级\t399625\n暴风影音5\t399626\n第69章\t399627\n魂牛\t399628\n山东省纪委\t399629\n2018年4月1号\t399630\nPep\t399631\n小豆芽\t399632\n秒钱\t399633\nstimulate\t399634\n3dma\t399635\n青少年性\t399636\n圆框\t399637\n百里荒\t399638\n天元锰业\t399639\nangew\t399640\n赢时胜\t399641\n价目表\t399642\n眸子\t399643\n园艺师\t399644\n2000台\t399645\n第六首\t399646\n阳光股份\t399647\n骁龙435\t399648\n光光\t399649\n好运物流网\t399650\n议和\t399651\n袁家村\t399652\n花落宫廷错流年\t399653\n华住会\t399654\n旅伴\t399655\nG550\t399656\n李零\t399657\n40多度\t399658\n浮现\t399659\n廷皓\t399660\n850m\t399661\n德州一中\t399662\n科夫\t399663\n高通联发科\t399664\n大商汇\t399665\n十盏\t399666\n0471\t399667\n金钻\t399668\n法律界\t399669\nfoc\t399670\n入流\t399671\n代理银行\t399672\n|博\t399673\n大俗\t399674\n薛军\t399675\n3个多月\t399676\n内啡肽\t399677\n刘飞儿\t399678\n公正\t399679\n青年旅社\t399680\n好帅\t399681\naquarium\t39968
2\n早鸟\t399683\nBlanche\t399684\nBGP\t399685\n诺禾\t399686\nLPC\t399687\n阿拉伯马\t399688\n诗韵\t399689\n魔伴\t399690\n几头\t399691\n电频\t399692\n甲醇制烯烃\t399693\n公益时报中华彩票网\t399694\n路奇\t399695\n154家\t399696\nRologo\t399697\n他来了请闭眼\t399698\nh无码\t399699\n高杰\t399700\n佳肴\t399701\n一千八\t399702\ncanal\t399703\n新濠影汇\t399704\n山阴路\t399705\nSeismic\t399706\n径山镇\t399707\n心湖\t399708\n海石花\t399709\n大疆Mavic\t399710\n冰品\t399711\n尾王\t399712\n14.04.5\t399713\n涨跌停\t399714\n2017款款\t399715\n两万\t399716\nboney\t399717\niPhone7P\t399718\nsike\t399719\n事体\t399720\n天使投资\t399721\n人教版5\t399722\n脚脚\t399723\n中国特种部队\t399724\n透漏\t399725\nxray\t399726\nM281fdw\t399727\n免征税\t399728\n硬皮\t399729\nfxjwind\t399730\n剪叉式\t399731\n上海市民政局\t399732\n等温\t399733\n奖学\t399734\n1082\t399735\n中泰铁路\t399736\n李妍静\t399737\nLIBOR\t399738\n富力运河十号\t399739\n签票\t399740\n计算书\t399741\n房产经纪人\t399742\n天神娱乐\t399743\nmariah\t399744\n扭伤\t399745\n褴褛\t399746\n舰炮\t399747\n七冠王\t399748\n奴良\t399749\n列举\t399750\n德尔福\t399751\n南疆\t399752\n原耽\t399753\nClaymore\t399754\n院刊\t399755\n薄层\t399756\n化学计量学\t399757\nCorolla\t399758\n第32话\t399759\n平白\t399760\n确信\t399761\nejabberd\t399762\nladuree\t399763\n梦幻西游手游蜃影\t399764\nTagged\t399765\n系统工程\t399766\n中建八局第二建设有限公司\t399767\n王玉雯\t399768\n几百k\t399769\n龙门式\t399770\n25公斤\t399771\n平安福\t399772\nW12\t399773\n65平\t399774\n中华人民共和国职业病防治法\t399775\n金明\t399776\n廖耀湘\t399777\n雄激素\t399778\n刘若\t399779\n中国故事\t399780\n一言九鼎\t399781\n狗皮膏药\t399782\nSQL注入\t399783\nepm\t399784\n生态环境问题\t399785\n零和\t399786\n何首乌\t399787\n认识的哥哥\t399788\n浩大\t399789\n送暖\t399790\n妖刀记\t399791\nhannover\t399792\n四针六线\t399793\n无往不胜\t399794\n程婴\t399795\n关天茶舍\t399796\n发起者\t399797\ntslib\t399798\n3.7.3\t399799\nvs2017\t399800\n申能股份\t399801\n明尼苏达大学双城分校\t399802\n夷\t399803\n2502\t399804\n全委\t399805\nMenu\t399806\n富美加\t399807\n顶柱\t399808\n清蒸桂鱼\t399809\n乱斗西游2\t399810\n三四五线\t399811\nTEKLA\t399812\n160505\t399813\nApproximation\t399814\n白天鹅宾馆\t399815\n快男\t399816\n神舟精盾\t399817\nComboBoxEdit\t399818\n良品铺子\t399819\nLIU\t399820\n颛桥\t399821\n深圳福田口岸\t399822\n二十本\t399823\nx34p\t39982
4\n母猴\t399825\n重见天日\t399826\ngpdwin\t399827\n高碑店乡\t399828\nnginx负载\t399829\n倏忽\t399830\nworker\t399831\n行号\t399832\n文女主\t399833\n写手\t399834\n许昌东\t399835\n今生缘音乐网\t399836\n张建慧\t399837\n斯嘉丽·约翰逊\t399838\n二垒\t399839\npoles\t399840\n9.3.3越狱\t399841\n建邺城\t399842\n模块化\t399843\ncondi\t399844\n徐复观\t399845\n回源\t399846\n试飞员\t399847\n和蔼可亲\t399848\n保尊宝\t399849\n不能不知道\t399850\n蹭饭\t399851\n闲妻\t399852\n里番\t399853\n美轮美奂\t399854\n沙利度胺片\t399855\n干炸\t399856\n丹凤县人民政府\t399857\n一群人\t399858\n新加坡南洋理工\t399859\npathway\t399860\nokr\t399861\n车险费\t399862\njsPlumb\t399863\n绿领\t399864\n河蚌\t399865\ntbt\t399866\n续弦\t399867\n幸而\t399868\n超豪型\t399869\nkejian\t399870\nelectron-packager\t399871\n鼎盛新材\t399872\nShowgirl\t399873\n高屏\t399874\n坤叔\t399875\n爱酷学习网\t399876\nH7N9禽流感\t399877\n灵神\t399878\n万门中学\t399879\n南京市区\t399880\n量感\t399881\nae关键帧\t399882\n屑\t399883\n邓小平传\t399884\ndnspod\t399885\n图像化\t399886\nappui\t399887\n有性繁殖\t399888\n料箱\t399889\n攀升\t399890\n条目\t399891\n春蕾小学\t399892\n脳\t399893\nxiami\t399894\n国家行政学院\t399895\nremoval\t399896\n手寸\t399897\n2016年12月15日\t399898\n兰州大学管理学院\t399899\n喜马拉雅FM-423听书节\t399900\nballance\t399901\n绑绳\t399902\n流行性出血热\t399903\n茅台醇\t399904\n回执编号\t399905\n绿城蓝湾\t399906\n广西百色人民政府\t399907\nSoC\t399908\n审神者\t399909\n玉景\t399910\nfcd\t399911\n智能语音助手\t399912\n百度快照_\t399913\n自签\t399914\n云天化集团\t399915\n团歌\t399916\n匝\t399917\n相亲节目\t399918\n数码相片\t399919\nzlib库\t399920\n音乐区\t399921\n1梯\t399922\n徕卡相机\t399923\n不称\t399924\n临沧新闻网\t399925\n金艺\t399926\n化学工程\t399927\n墨言\t399928\n大连电信\t399929\n水湖镇\t399930\n外需\t399931\n京广路\t399932\n卢浮\t399933\n信阳火车站\t399934\n窝头\t399935\n杀人案\t399936\n65家\t399937\n南延线\t399938\nTracert\t399939\n楊\t399940\n守贞\t399941\nTSP\t399942\n建筑工程管理与实务\t399943\n陈冬梅\t399944\nsdk\t399945\n圣马\t399946\n黄葵胶囊\t399947\n假包\t399948\n朱崇实\t399949\n受青睐\t399950\n滑冰\t399951\n暗记\t399952\nss级\t399953\n13109594723\t399954\n大白熊犬\t399955\n荒淫\t399956\ncv\t399957\n雪莉杨\t399958\n哈奴曼\t399959\n早漏\t399960\n辛星\t399961\nideo\t399962\n安康市\t399963\n安驾\t399964\n远坂家ノ\t399965\n深圳康辉旅行社\t399966\n最低价\t399967\n常开\t3999
68\n心地\t399969\n欧式期权\t399970\n清脂\t399971\nrasa\t399972\nv5.11\t399973\nWriteup\t399974\n内景\t399975\n中波\t399976\n海底小纵队特别篇5_海底小纵队\t399977\n六间房秀场\t399978\ntribon\t399979\n南海出版公司\t399980\n上海纪实频道\t399981\n葫芦\t399982\nrampage\t399983\n东方万里行\t399984\n杭州国际学校\t399985\n剔凿\t399986\n围手术期\t399987\n大家一起\t399988\n黛安娜\t399989\n泰安一中\t399990\n贝尔法斯特女王大学\t399991\n坪山镇\t399992\n泰格里斯\t399993\nBebe\t399994\nURI\t399995\n50幅\t399996\n3点半\t399997\n韦应物\t399998\n新疆军区\t399999\n美皮护\t400000\n雪松\t400001\np2p\t400002\n呸\t400003\nella\t400004\ncortex\t400005\n浪客\t400006\nfifaonline\t400007\ncolModel\t400008\n香榭丽都\t400009\ncrude\t400010\n兲\t400011\n这段话\t400012\n应纳\t400013\n商务酒店\t400014\n顾强\t400015\n孝廉\t400016\n简单易行\t400017\n所感\t400018\n小龄\t400019\n忠犬\t400020\n屈指\t400021\n智鼎\t400022\n电梯维保公司\t400023\n张可颐\t400024\n杨村一中\t400025\n莲花落\t400026\n探探\t400027\n金珀\t400028\n一千块\t400029\n小红疙瘩\t400030\n河北银监局\t400031\n凤巢\t400032\n11.2.0.3.0\t400033\n汽车卡\t400034\ngridView\t400035\n互金协会\t400036\n亚尔斯兰战记\t400037\nsg\t400038\n北京市律师事务所\t400039\n治疗期\t400040\nh6coupe\t400041\n尘光\t400042\n水浒q传\t400043\n吐根\t400044\nGeass\t400045\n消肿\t400046\n乖乖网\t400047\nLatent\t400048\nestudy\t400049\n仙女镇\t400050\n雷凯\t400051\n好想告诉你\t400052\n铉\t400053\nMC九局\t400054\n德川\t400055\n方怡\t400056\n柳氮磺吡啶肠溶片\t400057\n肝郁气滞\t400058\n亚丹\t400059\nlux\t400060\n秦皇岛酒店\t400061\n20161221\t400062\n施组\t400063\n李鹏新\t400064\n特曲\t400065\ndownloader\t400066\n望京地区\t400067\n1763\t400068\nplc编程软件\t400069\n斗鱼绝地求生\t400070\nSuch\t400071\nDD35\t400072\n香椿芽\t400073\n牵牛星\t400074\n芭莎\t400075\nNEVER\t400076\n玻化微珠\t400077\n第三十五次\t400078\n广西药用植物园\t400079\n铁像寺水街\t400080\n好火\t400081\n海珠客运站\t400082\n国家检察官学院\t400083\n长白\t400084\n月明星\t400085\nstrata\t400086\n1.el7\t400087\n32#\t400088\n底薪\t400089\nPPBC\t400090\n表表\t400091\n和田\t400092\n农村义务教育学校\t400093\n8.04\t400094\n盥洗室\t400095\n金冠\t400096\n好友\t400097\n人寿车险\t400098\n邢东新区\t400099\nszz\t400100\n碳硫分析仪\t400101\nyaf\t400102\n月小结\t400103\n清凉油\t400104\n4批\t400105\nARCTERYX\t400106\n抗结剂\t400107\n特数\t400108\n广州地铁8号线\t400109\n电解法\t400110\n售楼处\t
400111\n2pi\t400112\nosu\t400113\n1126\t400114\n邓卓翔\t400115\nicon\t400116\n碳纳米管\t400117\n错位相减\t400118\n法者\t400119\n赐\t400120\n纯白\t400121\n清调\t400122\n文件类\t400123\nwww52avavavcmm\t400124\n湖南人大网\t400125\n清韵\t400126\nsensitive\t400127\n单属\t400128\n涂布率\t400129\n网舞\t400130\n脊髓型颈椎病\t400131\n日厂\t400132\n教务办\t400133\n烧造\t400134\n红毯\t400135\n故宫博物馆\t400136\n依稀\t400137\n化学工业\t400138\n毒物\t400139\n绿化带\t400140\n何何夕\t400141\n外驱\t400142\n赋壮\t400143\n雅迪莱客\t400144\n捉妖记2\t400145\n朴信阳\t400146\n天师斗僵尸\t400147\nNSQ\t400148\n厦门卫视\t400149\n都市小说小说-17k小说网\t400150\n路考仪\t400151\n浮渣\t400152\n熙城\t400153\n汉长安城\t400154\n42层\t400155\n夜奶\t400156\n吉利GX7\t400157\n8675\t400158\nSolidEdge\t400159\n楚留香手游\t400160\n流河\t400161\n中国银行\t400162\nsu2015\t400163\n康源\t400164\nThirteen\t400165\n23歳\t400166\nemmm\t400167\n缩略语\t400168\ngut\t400169\n维特斯\t400170\n踏雪无痕SS\t400171\n涯友\t400172\npyqt4\t400173\n月野兔\t400174\n市经贸信息委\t400175\n沙坪坝区\t400176\n康奈尔大学\t400177\n唇边\t400178\n巡迴\t400179\n食药监所\t400180\n珠菲网\t400181\n安监总局\t400182\n龙水湖\t400183\n群侠\t400184\njohnson\t400185\n石园\t400186\n王者魔卡幻想\t400187\niwatch\t400188\nABC-A1B1C1\t400189\n中顾委\t400190\n闻到\t400191\nplaceholder\t400192\n滔搏\t400193\n花甲\t400194\n花桥流水\t400195\n心口\t400196\n说不准\t400197\n52酷\t400198\n药科大学\t400199\n管住\t400200\n尽力\t400201\n村田\t400202\n副司长\t400203\n城投\t400204\n布朗熊可妮兔\t400205\n罗湖小学\t400206\n米佩婷\t400207\n3卷\t400208\nservices\t400209\n国航知音卡\t400210\n历次\t400211\n9只\t400212\n街坊\t400213\n房车网\t400214\nDHCP地址池\t400215\n好迪\t400216\n摄像枪\t400217\n屁颠\t400218\n陈建忠\t400219\n标了\t400220\n9.26\t400221\n一个多月\t400222\n史可法\t400223\n济宁市\t400224\n百度CarLife\t400225\nFacet\t400226\n北川杏树\t400227\n鞍山高新区\t400228\nuseless\t400229\n尹蔚民\t400230\n刘彦斌\t400231\n于心\t400232\n王彦\t400233\n8.0.9\t400234\nIlluminate\t400235\n极效\t400236\n贝勒\t400237\nDrea\t400238\n老伯\t400239\n石家庄市第四医院\t400240\n诗友\t400241\n78.cn\t400242\n凶位\t400243\n宣武医院\t400244\n40p\t400245\n衢州市\t400246\n贡献\t400247\n夹腿\t400248\n武藏\t400249\n搜狗输入法\t400250\n白球\t400251\n羽联\t400252\n进贡\t400253\n陶瓷展\t400254\n摇床\t400255\n空洞骑士吧\t400256\n家
务事\t400257\n灌县\t400258\n吹炼\t400259\n梳头\t400260\nNE555\t400261\n喷雾泵\t400262\n阴阳鬼\t400263\njumps\t400264\nlujing\t400265\nsgp\t400266\n肺俞穴\t400267\n正当防卫\t400268\n乐颜\t400269\n轮状病毒疫苗\t400270\n逼宫\t400271\n飞绝\t400272\n哲\t400273\nXuan\t400274\n低转速\t400275\n新疆职业大学\t400276\n煤巷\t400277\n台歌\t400278\n最后一个月\t400279\n创客秀\t400280\n常遇春\t400281\n孕睫术\t400282\n2006—2020年\t400283\n奥托立夫\t400284\n梁静飞\t400285\n马世琦\t400286\n版卡\t400287\n林斌\t400288\n标压\t400289\n青岛市食品药品监督管理局\t400290\n追不到\t400291\n一法\t400292\n曲谱\t400293\n伞架\t400294\n中国健美协会\t400295\n无轨电车\t400296\nhue\t400297\n海皮岛\t400298\n结痂\t400299\n美丽的世界\t400300\n有限责任公司股东会\t400301\n高口\t400302\n红河谷\t400303\n仙桃组工网\t400304\n图格\t400305\n瑞思迈\t400306\n进账\t400307\noppor11t\t400308\n消失的爱人\t400309\n中国日报\t400310\n从百草园到三味书屋\t400311\n第几天\t400312\n艾登\t400313\n案例分析\t400314\n古洞\t400315\n黄南州\t400316\n速冻机\t400317\n广东创新科技职业学院\t400318\n常平\t400319\n新华园\t400320\n楚汉争雄\t400321\n溪风博客\t400322\n脚套\t400323\n左贤王\t400324\n华为OceanStor\t400325\nadmaster\t400326\n生态文\t400327\nSharma\t400328\n佛山小区\t400329\n炽焰\t400330\n盎司\t400331\n学写\t400332\n瞧一瞧\t400333\n银翘解毒丸\t400334\n荣耀6\t400335\n座下\t400336\n阳光灿烂的日子\t400337\n40辆\t400338\n真厉害\t400339\nSchindler\t400340\n撒种\t400341\n上古卷轴5:天际重制版+上古卷轴5\t400342\n吃饺子\t400343\n报价表\t400344\n康定县\t400345\ntdc\t400346\nzts\t400347\n中华人民共和国交通运输部\t400348\nalliance\t400349\n7.8折\t400350\n养肤\t400351\n牛百叶\t400352\nC214\t400353\n一碗面\t400354\n张金树\t400355\nBelkin\t400356\nq10\t400357\n仓栅式\t400358\n烤羊肉串\t400359\n驰冥\t400360\n欣美\t400361\n艾达王\t400362\n日薪\t400363\n各向\t400364\n东新路\t400365\n刘建超\t400366\n0702\t400367\n杨柳依依\t400368\n百炼\t400369\nVisio2016\t400370\n市文广新局\t400371\nhapp\t400372\nxp32\t400373\nmsyh\t400374\n坦白\t400375\n王亚洲\t400376\n过氧化\t400377\n十三星座网\t400378\n毛_\t400379\n血字\t400380\n产褥期\t400381\n两册\t400382\n3.5G\t400383\nDavies\t400384\nlogged\t400385\n8393xxxx\t400386\n中康体检网\t400387\n韦晶\t400388\nWin10输入法\t400389\n冒险片\t400390\n托洛茨基\t400391\n速配\t400392\n成都市教育局\t400393\n苏小美\t400394\nTurtle\t400395\n多梦\t400396\n三羧酸\t400397\n帕玛强尼\t400398\n中华少年\t400399\n宏扬\t40
0400\n彭玉\t400401\n赣州市工商行政管理局\t400402\npostman测试\t400403\n酷爽\t400404\n校友邦\t400405\n国防科技大学\t400406\n台服\t400407\n1979年\t400408\n情探\t400409\n柠萌\t400410\n平角裤\t400411\n天津市总工会\t400412\n天翼决\t400413\n陆渡\t400414\n何凯\t400415\nddr2\t400416\n里番库\t400417\n计及\t400418\nv3.5.1\t400419\n万缕\t400420\n古诗两首\t400421\n烯\t400422\n锐气\t400423\n瞎了眼\t400424\n川路\t400425\n伊斯坎达尔\t400426\n姨娘\t400427\n小店\t400428\n完税价格表\t400429\n银耳\t400430\n山泥\t400431\n异域风情\t400432\n直通\t400433\n周沫\t400434\n椅垫\t400435\nscapy\t400436\n阿伯\t400437\nSBC\t400438\n大源镇\t400439\n天行加速器\t400440\n专利申请号\t400441\ncalloc\t400442\nRecovery\t400443\n葬法\t400444\n高矮\t400445\n第9次\t400446\nnpk\t400447\n羽衣传说\t400448\n智荟\t400449\n大武口区\t400450\n中华人民共和国未成年人保护法\t400451\n安卓网\t400452\n天诚盛业\t400453\n紧急按钮\t400454\n中国医学科学院皮肤病医院\t400455\n康桥知园\t400456\n液力变矩器\t400457\n生产方\t400458\nlpg\t400459\n4000斤\t400460\n舒安\t400461\n长江云\t400462\n奔驰G350\t400463\n第二篇\t400464\n获利盘\t400465\n虹口\t400466\n漫城\t400467\n纪伟\t400468\n谢谢你来了\t400469\n楼盖\t400470\n锦华\t400471\n狮子战争\t400472\n成因\t400473\n品优\t400474\n班氏\t400475\n希尔达\t400476\nimported\t400477\n10月6日\t400478\n选煤\t400479\nsukin\t400480\n试开\t400481\nNetwork\t400482\n膻中穴\t400483\n云南电信\t400484\n张翀\t400485\n彭小六\t400486\n坎地沙坦酯片\t400487\n万龙\t400488\n阿斯顿\t400489\nu型\t400490\n通用汽车公司\t400491\n售后回购\t400492\n广州学校网\t400493\n抚州市\t400494\n太平洋网络\t400495\n铁链\t400496\n万象宝盘\t400497\nNutrient\t400498\n帝业\t400499\n摇号\t400500\n国盛证券\t400501\n8500u\t400502\n克鲁赛德战记\t400503\n流黄\t400504\n红冰\t400505\n永州市委\t400506\n犒劳\t400507\n远大\t400508\nexcel\t400509\n巨型犬\t400510\n学考\t400511\n成克杰\t400512\n康德乐\t400513\n饮湖上初晴后雨\t400514\n0763\t400515\n4话\t400516\n抹灰工\t400517\nJiu\t400518\nMine\t400519\nOpenEdv-开源电子网\t400520\n宁波大学科学技术学院\t400521\n装饰业\t400522\nOnline3_电玩巴士\t400523\n海斗士\t400524\n世界地铁\t400525\n300架\t400526\n纯麦\t400527\n张洪伟\t400528\n鸟巢度假村\t400529\nAPPStore\t400530\n龙狗\t400531\n誉山\t400532\n富士施乐\t400533\n孤雌\t400534\n狂心\t400535\n词穷\t400536\n乘着歌声的翅膀\t400537\n大埔\t400538\n东安村\t400539\n八一中文网\t400540\n配兵\t400541\n中子播放器\t400542\n沙浦\t400543\n维基文库\t400544\n八旗军\t40
0545\n吴巍\t400546\n天天天水网\t400547\n贝壳岛\t400548\n伯内特\t400549\n_竹\t400550\n撕逼\t400551\n比亚迪E6\t400552\nt470s\t400553\ncollide\t400554\n说不过去\t400555\nIC卡读卡器\t400556\nv3.0.7\t400557\n228\t400558\n健美操\t400559\nXTR\t400560\n史鉴\t400561\n逆道\t400562\n可复制性\t400563\nIT之家\t400564\n樊桐舟\t400565\nzpn\t400566\n10051\t400567\n中国影子银行\t400568\n最近10年\t400569\n胎噪\t400570\n十本\t400571\n罗克韦尔\t400572\n三国杀神\t400573\n电池片\t400574\n湘江北\t400575\n子日\t400576\n中式电视背景墙\t400577\n聂圣哲\t400578\npersistent\t400579\n_唯满侠\t400580\n青草地\t400581\n红外光\t400582\n香骨\t400583\n奶\t400584\ngamers\t400585\n捉刀\t400586\n胡坤\t400587\n怀特海\t400588\nAngularJs\t400589\nEXCEL表格\t400590\n惠理\t400591\nc104\t400592\n胡晓波\t400593\n帘布\t400594\na02\t400595\n贵州省\t400596\n倒链\t400597\n郑州站\t400598\n哈铁\t400599\n衣架\t400600\n中国致公党\t400601\nbittorrent\t400602\n游天\t400603\n2700U\t400604\npc卡\t400605\n补编\t400606\n如影逐形\t400607\n安房直子\t400608\n烟台一中\t400609\n中国银联股份有限公司广东分公司\t400610\n小郡\t400611\n39.5\t400612\n楼宇烈\t400613\n_诗经\t400614\nstvp\t400615\n120万吨\t400616\n抗扭\t400617\n基金定投\t400618\nstirng\t400619\n80smp4\t400620\nKMZ\t400621\n保利熙悦\t400622\n红苕\t400623\n豫光金铅\t400624\n源力\t400625\n金英杰\t400626\n张家湾\t400627\n乐居新闻网\t400628\n珊瑚虫\t400629\n坐席\t400630\n尤拉\t400631\nGrande\t400632\nAct\t400633\n麻花钻\t400634\n50多家\t400635\n新宝岛\t400636\nwenjian\t400637\n绅坊\t400638\n乐成\t400639\nJava接口\t400640\n写频\t400641\n80km\t400642\n232\t400643\n冯志明\t400644\n鹭湖森林度假区\t400645\n测孕\t400646\n辽北\t400647\n4450\t400648\n6227\t400649\n凉糕\t400650\nalevel\t400651\n苏州高博软件技术职业学院\t400652\njacks\t400653\nveneta\t400654\n第1121章\t400655\n扬州招聘网\t400656\n哈尔滨地铁\t400657\nxsrf\t400658\n萧寒\t400659\nshangy\t400660\n6毫米\t400661\n君子之交\t400662\nrope\t400663\n活动型\t400664\nUnisex\t400665\n习惯于\t400666\n富裕\t400667\n一百倍\t400668\nGSD\t400669\n六次方\t400670\nPluggable\t400671\nAnker\t400672\n吴官正\t400673\n血丝\t400674\nVP\t400675\n双子塔\t400676\n一览_特玩网\t400677\n文因\t400678\n噪声值\t400679\n城中街道\t400680\n县档案局\t400681\n邓建军\t400682\nr530\t400683\n电驴ed2k\t400684\nadvertised\t400685\n无气喷涂机\t400686\n皇冠论坛_汽车之家论坛\t400687\n榭\t40
0688\nRealPlayer\t400689\n屏蔽室\t400690\n省试\t400691\n套圈\t400692\n南京国际\t400693\n福元路\t400694\n高尿酸血症\t400695\n桂香\t400696\n童学馆\t400697\nabsfree\t400698\n交会\t400699\n现金支出\t400700\n汇景\t400701\n家房\t400702\n脉血康胶囊\t400703\nshlf\t400704\n20170222\t400705\n舟楫\t400706\n前瞻\t400707\n关之琳\t400708\n撕\t400709\n项目管理器\t400710\n江苏移动\t400711\n上湖\t400712\n浮点\t400713\nGain\t400714\n德兴网\t400715\n巴哈姆特之怒\t400716\n星际争霸2:虫群之心\t400717\nRomeo\t400718\nAlto\t400719\n罚\t400720\n粉蒸肉\t400721\n多\t400722\n1.2部\t400723\n凌雄\t400724\n追诉期\t400725\n达明一派\t400726\nSubmission\t400727\n红山公园\t400728\n行权\t400729\n世徒\t400730\n管理费\t400731\n飞狐交易师\t400732\n芗剧\t400733\n慕小蕾\t400734\n朱砂古镇\t400735\nwhole\t400736\nzhentan\t400737\n建议稿\t400738\nMilling\t400739\nprepend\t400740\npvx\t400741\n净营运资本\t400742\n碳纤\t400743\n名字\t400744\n中国移动通信集团有限公司\t400745\n黄冈镇\t400746\n绮梦\t400747\nptn\t400748\n切磨\t400749\n4.8.2\t400750\n上海卫视\t400751\n杨浦院区\t400752\n山西证券\t400753\n清漆\t400754\n为谁而炼金_九游论坛\t400755\ncong\t400756\n质量技术监督局\t400757\n存折号\t400758\n导赏\t400759\n沟子\t400760\n手机三国杀\t400761\n咱家\t400762\n赤道几内亚\t400763\n农桑\t400764\n五味\t400765\ndjmp3\t400766\n惜福\t400767\n还债\t400768\nta\t400769\n红景天\t400770\nZEISS\t400771\n六角钢\t400772\n孝利家民宿2\t400773\n一凡\t400774\n巫溪县\t400775\n医证\t400776\n重庆人事考试网\t400777\nBuffered\t400778\n花口\t400779\nJk\t400780\n德阳市区\t400781\n小本\t400782\nrwby\t400783\n全首\t400784\n湿电\t400785\n超级粉丝\t400786\nGBP\t400787\n金森\t400788\n中国期货业协会\t400789\nUma\t400790\n长春市公安局\t400791\n天木兰\t400792\n情趣研究院_春水堂\t400793\n证券投资基金业协会\t400794\n青山区政府\t400795\n联合利华公司\t400796\n唇亡齿寒\t400797\n干挂石材\t400798\n蒙面歌王\t400799\njsencrypt\t400800\nWa\t400801\n温籍\t400802\n怪诞小镇\t400803\nyhdsir\t400804\nwifi共享大师\t400805\n拉基蒂奇\t400806\n陈述稿\t400807\n非谓语\t400808\n酷派8297\t400809\n停育\t400810\n行草\t400811\n此情\t400812\n魅思\t400813\nNickelback\t400814\nGBase\t400815\nUXRen\t400816\n建设街\t400817\n800千伏\t400818\n万得\t400819\n赣南医学院第一附属医院\t400820\n走边唱\t400821\n2.10.1\t400822\n拓领\t400823\n股权登记日\t400824\nexcel03电子表格\t400825\n阿森\t400826\nliberation\t400827\n云和政府网\t400828\n100座\t400829\n亿超眼
镜网\t400830\n鲧\t400831\n大治\t400832\n港女\t400833\n三角矩阵\t400834\n严正花\t400835\n媚眼\t400836\n百子图\t400837\n重庆市第一中级人民法院\t400838\n小说文\t400839\n西蔵\t400840\n销售员\t400841\n病机\t400842\nPoints\t400843\n车辆违章查询网\t400844\nFantasies\t400845\nDS18B20\t400846\n杨赤\t400847\n289\t400848\n上海小升初择校\t400849\n努尔·白克力\t400850\n日子\t400851\nJavasc\t400852\n庞晓戈\t400853\n贝森\t400854\n东菱\t400855\n出新\t400856\n12博\t400857\n周铭孙\t400858\n从商\t400859\n北京人大附中\t400860\n味鲜\t400861\n德雷尔\t400862\n年值\t400863\n郭村\t400864\n嘟嘟韩剧网\t400865\n宗炳煌\t400866\n林语\t400867\n多一条\t400868\n林权抵押贷款\t400869\n会见\t400870\n国企\t400871\n梁维东\t400872\nreindent\t400873\n川办\t400874\n20180228\t400875\n说穿\t400876\n20160329\t400877\n医专\t400878\nTaboo\t400879\n吞天\t400880\n优倍\t400881\n等差数列an\t400882\n鳄\t400883\n天津河\t400884\n花园路\t400885\n版号\t400886\n莫泊桑\t400887\n评定\t400888\n依巴斯汀\t400889\n核反应\t400890\n保护率\t400891\n香港大学\t400892\n欢迎页\t400893\n剑灵召唤师吧\t400894\n∶\t400895\n异客逢欢\t400896\n耐压测试\t400897\nYumiko\t400898\n冷缝\t400899\n精细化\t400900\nLED平板灯\t400901\n盆景\t400902\nchuiyue\t400903\n360化妆品网\t400904\nGurobi\t400905\n货架\t400906\n4.72\t400907\nffm\t400908\nsqu\t400909\n〗\t400910\n总体国家安全观干部读本\t400911\n定期寿险\t400912\n我有一个梦想\t400913\n慢性硬膜下血肿\t400914\n琥珀色\t400915\n新疆环保厅\t400916\nPROFIBUS-DP\t400917\n打交道\t400918\n长期投资\t400919\nEpub\t400920\n润膏\t400921\n巫羽\t400922\n超卖\t400923\n被税\t400924\n岩土工程师\t400925\n冒烟测试\t400926\nps画\t400927\n员级\t400928\n射命丸文\t400929\n永嘉书院\t400930\n赵华伟\t400931\n蜂蜜柚子茶\t400932\n双门\t400933\n盈量\t400934\n深圳湾\t400935\n该咋\t400936\n思亲\t400937\ngt2\t400938\n第18043期\t400939\n脑沟\t400940\n定解\t400941\n官版\t400942\nContentProvider\t400943\n600717\t400944\ntreatment\t400945\n深圳巴士网\t400946\n侮辱\t400947\nnjut\t400948\n全蚀\t400949\nDon\t400950\n勐拉\t400951\nwww.5tu.cn\t400952\n待着\t400953\n秦英\t400954\n百里杜鹃景区\t400955\n安永华明会计师事务所\t400956\n石化油\t400957\n梅露可物语\t400958\n虚荣\t400959\n园点\t400960\nTIT\t400961\nTesseract\t400962\n三房\t400963\n非凡教育\t400964\n雨具\t400965\nV8.2\t400966\nIna\t400967\nCCDI\t400968\n太初\t400969\nGameDB\t400970\nintra\t400971\n静观镇\t400972\n限量\t400973\nre
d\t400974\n马天奴\t400975\n卷纸\t400976\n九朵\t400977\n或者\t400978\n轻羽\t400979\nSnowflake\t400980\n敌方\t400981\nBerserker\t400982\n空城计\t400983\n港龙\t400984\n正音\t400985\n吃吃\t400986\n英才\t400987\n归根\t400988\nf596\t400989\nA1429\t400990\n廊坊师范学院\t400991\n摩尔比\t400992\n交通规划\t400993\nfrontpage2003\t400994\nOGC\t400995\n期货交割\t400996\n诽谤者\t400997\n浅黄色\t400998\n蜜吻\t400999\n鸡毛秀\t401000\n洗衣服务\t401001\nnsh\t401002\n德海\t401003\n框点\t401004\n鱼王\t401005\n欲念\t401006\ng900\t401007\n疾星斩月\t401008\n教育科技\t401009\n大林寺\t401010\nSCTV4\t401011\n广晟有色\t401012\nposting\t401013\n市食品药品监督管理局\t401014\n名铁\t401015\n蓝青中学\t401016\n编排表\t401017\n8676\t401018\n多艘\t401019\n河北省物价局\t401020\n日日顺乐家\t401021\nMate10pro\t401022\n不好当\t401023\n山盟\t401024\n鲜妻\t401025\nqueue\t401026\n第22号\t401027\nAzul\t401028\n独身主义\t401029\ndiscrete\t401030\nexvs\t401031\nheroku\t401032\n190元\t401033\nchinaz\t401034\n作揖\t401035\n高清晰\t401036\n牛板筋\t401037\n非正常户公告\t401038\n长城币\t401039\n起病\t401040\n黑龙江省工商行政管理局\t401041\n瓦恩\t401042\n3次\t401043\n舒晓琴\t401044\n衣\t401045\neliminator\t401046\n徐向东\t401047\n所长\t401048\n花泥\t401049\n最终幻想14\t401050\n镇隆镇\t401051\ndesi\t401052\n沈阳电视台\t401053\n江门市城乡规划局\t401054\nsetuptools\t401055\n血泪史\t401056\n发出\t401057\n二三次元\t401058\ncacti\t401059\nvivoxplay3s\t401060\n杜梨\t401061\n123.sogou.com\t401062\n汽车保险联盟\t401063\n丽港\t401064\n孙国栋\t401065\n开户籍\t401066\n互通\t401067\n锥螺纹\t401068\n肠杆菌\t401069\nroid\t401070\n函\t401071\n锦华装饰\t401072\n波奇网\t401073\n朋友妈妈2017:朋友不在家的日子\t401074\n5维\t401075\n打架吧鬼神\t401076\n流俗\t401077\nDotamax\t401078\n萧遥\t401079\n孤芳自赏\t401080\n上海旅游\t401081\ndecompression\t401082\nConfig\t401083\n松北区\t401084\n钻采\t401085\nzimuku\t401086\n杨婷婷\t401087\n会计代理记账\t401088\n进阶\t401089\n5.32\t401090\n参保率\t401091\nConsolidated\t401092\n长时\t401093\n海南大宗商品交易中心\t401094\n枫景苑\t401095\n表导\t401096\n内外网\t401097\n中投顾问\t401098\n四川美院\t401099\n天畅\t401100\n回填\t401101\n养老公寓\t401102\n厚型\t401103\n泡沫液\t401104\n袒\t401105\n苍空\t401106\n5千元\t401107\n重庆地铁6号线\t401108\n父爱\t401109\n炒房者\t401110\n21万\t401111\n阶篇\t401112\n日落而息\t401113\n专业技术职务呈报表\t401114\nHai\
t401115\n黄田村\t401116\nios7\t401117\nluarocks\t401118\n江西外语外贸职业学院\t401119\n订货单\t401120\n安卓论\t401121\n补全\t401122\n影记\t401123\n云吞面\t401124\n车价\t401125\n官门\t401126\n调号\t401127\n有权\t401128\nSHP9500\t401129\n一条命\t401130\n张少军\t401131\n斯托米\t401132\n5-3\t401133\n张凯\t401134\n北方驾校\t401135\n1500万吨\t401136\n万人空巷\t401137\n北辰大道\t401138\n涡轮流量计\t401139\n蔺\t401140\n做假\t401141\n海蒂拉玛\t401142\n讲习题\t401143\n必定\t401144\n_________\t401145\n云麦\t401146\n18962123606\t401147\n谢菲联\t401148\n船队\t401149\n读取行\t401150\n国美电器有限公司\t401151\n二辑\t401152\n际\t401153\n默然\t401154\nThermaltake\t401155\n十八条\t401156\n无锡小天鹅股份有限公司\t401157\n20180221\t401158\n我的二哥二嫂\t401159\n绿盾全国企业征信\t401160\n太华路\t401161\n上美\t401162\n离歌大明文魁\t401163\nhandbag\t401164\n91WAN\t401165\n大阴茎\t401166\n北师\t401167\nwhitney\t401168\n讲议\t401169\n国立交通大学\t401170\n东京喰种第三季先睹为快\t401171\nHuo\t401172\n入侵\t401173\n在家\t401174\n清明茶\t401175\n用得着\t401176\nPLAYBOY\t401177\n22.4\t401178\n鲜货\t401179\n水篇\t401180\n超版\t401181\n搜捕\t401182\n被炸死\t401183\nLinx\t401184\n人员工\t401185\n五个多月\t401186\n大众_途观\t401187\n甘肃省人民政府办公厅\t401188\nbott\t401189\n手机网\t401190\nCs6\t401191\n丁义珍\t401192\n5000mah\t401193\naiff\t401194\njiao\t401195\nstates\t401196\n一层层\t401197\n66人\t401198\n和畅\t401199\np4v\t401200\n27米\t401201\n引入\t401202\n情人们\t401203\nFowler\t401204\nmaxima\t401205\n血色十字军\t401206\nsought\t401207\n110条\t401208\n予以\t401209\nya\t401210\n555\t401211\n潘阳\t401212\n足矣\t401213\n纳贡\t401214\n心术不正\t401215\nwebelement\t401216\n气息\t401217\nThorn\t401218\n17.1\t401219\n启动式\t401220\n琅琊阁\t401221\n中国养殖网\t401222\n海信科龙\t401223\n竞彩足球_彩票网\t401224\n探索类\t401225\n柏斯琴行\t401226\n冷瞳\t401227\n陈志伟\t401228\n海东时报\t401229\nTraders\t401230\n塔体\t401231\n相表\t401232\n好\t401233\n小升初\t401234\n高通晓龙\t401235\n穿带\t401236\n水郡\t401237\n鹭江道\t401238\n可汗\t401239\n男神女神\t401240\nVRアダルト無料動画\t401241\n美国银行\t401242\n代谢性\t401243\n概预算编制\t401244\n标桩\t401245\n厨电\t401246\n热保护器\t401247\n五黑鸡\t401248\n无糖饼干\t401249\nPspice\t401250\n复旦附中\t401251\nCEF\t401252\n皮诺曹\t401253\nniagara\t401254\n尼可刹米\t401255\n智机网\t401256\n地弹簧门\t401257\n肛裂\t
401258\n骨鲲\t401259\n水晶花\t401260\n营销管理\t401261\n膨胀剂\t401262\n女主人公\t401263\n客土\t401264\n加冠\t401265\n布林带\t401266\n人类简史\t401267\n滑板鞋\t401268\n七都岛\t401269\n周周乐\t401270\n西柯镇\t401271\n手指舞\t401272\n长城公司\t401273\n82.7\t401274\n品率\t401275\n万税\t401276\n标函\t401277\n黑骑士\t401278\nSigil\t401279\nPACKT\t401280\n闪之轨迹2\t401281\n新站高新区\t401282\nsparkle\t401283\n37章\t401284\n东京食尸鬼动画\t401285\n杭州学军中学\t401286\n30期\t401287\n平型\t401288\n死兔\t401289\nIFRAME\t401290\nZ贺\t401291\n启天\t401292\n0.10.1\t401293\n独播\t401294\n第9位\t401295\n王德民\t401296\n内江东兴区\t401297\naviva\t401298\n有兴\t401299\n福田网\t401300\ndealloc\t401301\n尊老\t401302\n夏韶声\t401303\n务工者\t401304\n27.5寸\t401305\n郑建明\t401306\n描述性\t401307\n中央八项规定实施细则\t401308\n副处\t401309\nSimply\t401310\n高天\t401311\n山西省扶贫开发办公室\t401312\n饶州\t401313\n外运\t401314\nwifi接收器\t401315\n司马伦\t401316\n2017六级\t401317\n醍醐灌顶\t401318\n0.8%\t401319\n奢侈品\t401320\n挪用公款罪\t401321\n自娱\t401322\n201608\t401323\nbtchenguang\t401324\n甩锅\t401325\n臣妾\t401326\n华为海思\t401327\n不取\t401328\n南京汉开书院\t401329\nnuxt\t401330\n文图\t401331\n苯肼\t401332\n小儿癫痫\t401333\nKivy\t401334\nS14\t401335\n查访\t401336\n片海\t401337\n04月份\t401338\n付军\t401339\n健康水\t401340\n一界\t401341\n7册\t401342\n杨文昊\t401343\n五塔寺\t401344\n乌苏里船歌\t401345\n上蔡\t401346\n昆明市人民政府办公厅\t401347\nmarblemm\t401348\ngastric\t401349\njde\t401350\n劳动合同制\t401351\n条狗\t401352\n冰包\t401353\nClimax\t401354\n中铁十一局二公司\t401355\n定阶\t401356\n以撒\t401357\n主项\t401358\n冯.诺依曼\t401359\n反间谍法\t401360\n青河\t401361\nOnly\t401362\nmaxell\t401363\npopc\t401364\n新源县人民政府\t401365\n千百回\t401366\n孤竹\t401367\n美拉德\t401368\n玄雨\t401369\n陈超\t401370\n宣城市区\t401371\n大话币\t401372\n扳子\t401373\n勤俭节约\t401374\n红枣水\t401375\nWin8平板\t401376\n克尔凯郭尔\t401377\n第182章\t401378\n西南证券\t401379\n大东沟\t401380\n青梅\t401381\n海缸\t401382\nAllergy\t401383\n火麟飞\t401384\n涨跌停板\t401385\n泗县\t401386\nseacms\t401387\n甚么\t401388\n功率计\t401389\n少时\t401390\n电子科技大学\t401391\n德斯\t401392\n松江教育信息网\t401393\n六方晶系\t401394\nsupervisord\t401395\n遗弃罪\t401396\n武乡\t401397\n劳动模范\t401398\n金荣\t401399\nuclibc\t401400\nBEATBOX\t401401\n死神中文网\t4014
02\n上海幼升小\t401403\n团干\t401404\n几kb\t401405\n董白\t401406\n外婆菜\t401407\n东突厥\t401408\nlayui富文本编辑器\t401409\n中央人民广播电台经济之声\t401410\n练习册\t401411\n1项\t401412\n沃恩\t401413\nappe\t401414\nssh-key\t401415\n第23条\t401416\nInstance\t401417\n马斯洛需求层次理论\t401418\n怀远县人民政府\t401419\naec\t401420\n太湖路\t401421\n南明\t401422\n45类\t401423\n茗伊\t401424\n猪鲨\t401425\n体育总局\t401426\n舞帝传媒\t401427\nxcrun\t401428\n小葵花\t401429\n采薇\t401430\n星易图书网\t401431\n前列腺特异性抗原\t401432\n纹银\t401433\n美图楼\t401434\n死不可\t401435\nappend函数\t401436\n殅器\t401437\n爱钱\t401438\n超星尔雅通识\t401439\n天狼月季\t401440\n失业险\t401441\n狂流\t401442\n六版\t401443\n粉率\t401444\n三5\t401445\n旭辉城\t401446\nVenus\t401447\n朱日和\t401448\n懒散\t401449\n秘密花园\t401450\n花宝\t401451\nOpenStreet\t401452\n常盘\t401453\n安来宁\t401454\nSenior\t401455\n工装板\t401456\n阿梦\t401457\n百锐腾\t401458\n天花粉\t401459\n中国东方航空\t401460\n古代史\t401461\n欲罢\t401462\n马士基\t401463\npygame\t401464\n瑞昌路\t401465\n中华烟\t401466\n志辑\t401467\n作用力\t401468\n正题\t401469\n开山怪\t401470\n乔治奥威尔\t401471\n1663\t401472\n长征火箭\t401473\nGrasshopper\t401474\n当乐游戏中心\t401475\n黑黄鹂\t401476\n學\t401477\n命格\t401478\n清漪\t401479\n口诛笔伐\t401480\n台胞\t401481\npprof\t401482\nACDSEE\t401483\n古浪县\t401484\n基准贷款利率\t401485\nPainted\t401486\nbjeea\t401487\n国家AAAAA级旅游景区\t401488\n360天擎终端安全管理系统\t401489\n事清\t401490\n李海峰\t401491\n计发\t401492\n麻生太郎\t401493\n张岩林\t401494\n大庆市人民政府\t401495\n九五年\t401496\n赵世炎\t401497\nSlave\t401498\n慕课宣传周\t401499\n民意\t401500\nV9帮助中心\t401501\n中国国家队\t401502\n回注\t401503\n海鳗\t401504\n映画\t401505\n神奇的孩子\t401506\n刮板\t401507\n白鳍豚\t401508\n秦淮区\t401509\n$2\t401510\nsoil\t401511\n妖异\t401512\n扛起\t401513\nBD国粤双语中字迅\t401514\n副县\t401515\nWebResponse\t401516\n内切圆\t401517\n体育用品有限公司\t401518\nThriller\t401519\n刺秦\t401520\nstma\t401521\n廿八都古镇\t401522\nHape\t401523\n平安普惠\t401524\n儿童类\t401525\n斯基\t401526\n兔兔\t401527\nattach\t401528\n15瓦\t401529\n广医\t401530\n卫宫士郎\t401531\n曝气盘\t401532\n百度传课\t401533\n凶相\t401534\ndota2omg\t401535\n赊账\t401536\n加新\t401537\n细胞群\t401538\n吸血刀\t401539\n赵赵\t401540\nベビ\t401541\n样板间\t401542\n骑砍\t401543\nbore\t401544\n塑工\t401545\n通
达集团\t401546\n金义都市新区\t401547\nqingspace\t401548\nCNY\t401549\n发现神行论坛_汽车之家论坛\t401550\n七\t401551\n春夜洛城\t401552\nㄗ\t401553\n2017年中\t401554\n随波飘摇\t401555\n成人展\t401556\n京城茶馆\t401557\n烃类\t401558\nVEGAS\t401559\n新韵\t401560\ncaoliu\t401561\n重庆市潼南县人民政府\t401562\n答对\t401563\n灰犀牛\t401564\n介孔\t401565\n4399卡布西游\t401566\n重庆大坪医院\t401567\n潮安\t401568\n一叶知秋\t401569\n菊芋\t401570\n中南财经政法大学会计学院\t401571\n锦城大道\t401572\n115云盘\t401573\n15:30\t401574\n吲哚\t401575\n稳健型\t401576\nword2vec\t401577\nP60\t401578\n汉麻\t401579\n小儿辩日\t401580\n关联人\t401581\n新安路\t401582\n新民主主义\t401583\n本溪市人民政府\t401584\nthinkPHP5\t401585\n广实\t401586\n先行图\t401587\n威斯康辛麦迪逊大学\t401588\n你好,旧时光\t401589\n圣斗士\t401590\n第12回\t401591\n春夏\t401592\n小贵\t401593\n肾气虚\t401594\n远程视频监控系统\t401595\n粢饭团\t401596\nJqueryEasyUI\t401597\njiangnan\t401598\n符号型\t401599\n掌游宝炉石传说攻略社区\t401600\n熊黛林\t401601\n荡寇志\t401602\n地涌金莲\t401603\ndwcs6\t401604\n陈述词\t401605\n商业分析师\t401606\n岳灵珊\t401607\n红色记忆\t401608\nDeutsche\t401609\nZT知乎\t401610\n盼乐\t401611\n事宜\t401612\nKITCHEN\t401613\n北航经济管理学院\t401614\n亚普\t401615\n老哥们\t401616\n8f\t401617\n百脑汇\t401618\nexcel工具栏\t401619\n三圣\t401620\n房产评估师\t401621\n全健\t401622\n齐齐乐诛仙手游\t401623\n乘车证\t401624\n5.51%\t401625\n三国群英传4\t401626\n双持\t401627\n132期\t401628\n深醒科技\t401629\n川人社\t401630\n捕获\t401631\n结婚\t401632\n讲诉\t401633\n升级券\t401634\n纠错\t401635\n20142015\t401636\n背痛\t401637\nTHETA\t401638\nwlop\t401639\n枪战类\t401640\n认可度\t401641\n长安南路\t401642\n10.10.2\t401643\n陈寨\t401644\n能带\t401645\n南京市第一医院\t401646\nnami\t401647\n100kw\t401648\nResidence\t401649\n西安人的歌\t401650\n区号表\t401651\n1月15日\t401652\nl130\t401653\n日建\t401654\n2014年12月31日\t401655\n阿尔罕布拉宫\t401656\nNORITZ\t401657\n承诺书\t401658\ng20杭州峰会\t401659\n凝胶电泳\t401660\n八千元\t401661\nROLLBACK\t401662\n海淀五路居\t401663\naids\t401664\n毗邻\t401665\n积分制\t401666\n河北外国语学院\t401667\n斩波器\t401668\n散人\t401669\nNim\t401670\n枰\t401671\n题材\t401672\n选辑\t401673\n华南理工大学出版社\t401674\n朱雨玲\t401675\n涌动\t401676\n武汉注册公司\t401677\n场管\t401678\nComme\t401679\n10份\t401680\n狗血\t401681\n北京动物园\t401682\n廊坊北\t401683\nso\t401684\n宝岗\t
401685\n水版\t401686\n龙腾世纪起源\t401687\n佳木斯大学\t401688\nstrongswan\t401689\n项号\t401690\n贷记卡\t401691\n得一\t401692\n第十三讲\t401693\n大丰区\t401694\n湘潭县人民政府\t401695\n刀剣\t401696\n免费小说阅读网\t401697\n轮边减速器\t401698\n5分钟内\t401699\n九首歌\t401700\n书城\t401701\n效\t401702\nNeck\t401703\nVivian\t401704\n北京钓鱼网\t401705\n瑞士军刀男\t401706\n0.3.0\t401707\n高桥凉介\t401708\n铁观音\t401709\n礼贤镇\t401710\n碟刹片\t401711\n第七种\t401712\nprivatekey\t401713\nac米\t401714\n18059期\t401715\nhandsome\t401716\n无界空间\t401717\n网易汽车\t401718\n说事儿\t401719\n黄金钱包\t401720\n唐家湾\t401721\ncme\t401722\n边社\t401723\n心思\t401724\n美女老总\t401725\n纳达尔\t401726\n宁波市公安局\t401727\n世界建筑\t401728\n建设厅\t401729\n左岸\t401730\n猴岛游戏论坛\t401731\n微苦\t401732\n50股\t401733\n腋窝\t401734\n心跳文学\t401735\n0335\t401736\nbreadth\t401737\n军事史\t401738\n3501\t401739\nd3v\t401740\n慧\t401741\n80分\t401742\n叉车蓄电池\t401743\n韩健\t401744\neoe\t401745\nrexroth\t401746\n国祚\t401747\n便捷式\t401748\n桑葚泡酒\t401749\n木田\t401750\n粤西\t401751\noayx\t401752\n一封情书\t401753\n婚史\t401754\n青岛极地海洋世界\t401755\nEilen\t401756\n甘南州\t401757\n1101\t401758\n郑玉巧\t401759\nPlayers\t401760\nFidelity\t401761\n种蛋\t401762\n蹄铁\t401763\n30届\t401764\n氯氟氰菊酯\t401765\n盖锁\t401766\nColumn\t401767\n茂兰\t401768\nsinbasara\t401769\n住房公积金管理条例\t401770\n五祖\t401771\ndcu\t401772\n惠比寿\t401773\nHive分析窗口函数\t401774\n冰菊物语\t401775\n胜利股份\t401776\n1160\t401777\n14件\t401778\n地核\t401779\n苏俏然\t401780\nXu\t401781\ndme\t401782\n软玻璃\t401783\n乐器学习网\t401784\n南澳岛\t401785\ni++\t401786\n西城街道\t401787\n双抗\t401788\n穿越之我是还珠格格\t401789\n后勤服务集团\t401790\n启辰诛仙\t401791\n游子吟\t401792\n洋装\t401793\n第3条\t401794\n1.3.2\t401795\n熊出没之变形计\t401796\n绫野沙希\t401797\n辉瑞\t401798\n装置\t401799\n福耀汽车\t401800\n中共湖北省委组织部\t401801\n周嘉诚\t401802\n美程\t401803\n三十九\t401804\n沙坪坝小学\t401805\n暴雷\t401806\n广西民族博物馆\t401807\n先进先出\t401808\n复卷机\t401809\npython3吧\t401810\n韩星子\t401811\nSafely\t401812\n朱天心\t401813\n华语电影节\t401814\n护理研究\t401815\n可信性\t401816\n青岛市小学\t401817\nhits\t401818\n崩坏学园2_九游论坛\t401819\n七五折\t401820\npolishing\t401821\n吉林大学珠海学院\t401822\nbourneli\t401823\n洗盘\t401824\n一站到底\t401825\n一啪\t401826\n
edition\t401827\n重生文学少女\t401828\n世界500强公司\t401829\nIPE\t401830\n灵帝\t401831\n萌叔\t401832\n小池塘\t401833\nAttach\t401834\n套内\t401835\nnymph\t401836\nGLU\t401837\n悦星\t401838\n红痣\t401839\nloot\t401840\n伊河\t401841\n特调\t401842\n评论者\t401843\n科学原理\t401844\n湖南教育新闻网\t401845\n界牌\t401846\n钧\t401847\n惑国毒妃\t401848\n瓦力\t401849\n姜丝\t401850\nTX\t401851\n德育处\t401852\n诊查\t401853\nCatalytic\t401854\n全彩漫画网\t401855\n妇乐\t401856\n胎息\t401857\n脊椎炎\t401858\n钢丝\t401859\n接线板\t401860\n长茧\t401861\nRainmeter\t401862\n乐豆\t401863\nB6\t401864\n碟中谍\t401865\nintense\t401866\n奶制品\t401867\n第九十九章\t401868\nkindel\t401869\n混凝土板\t401870\n起到\t401871\n沉默人\t401872\nTed\t401873\nyealink\t401874\n查取\t401875\n海雀\t401876\nGL8论坛\t401877\n包箱\t401878\n分岔\t401879\nAYA\t401880\n到头来\t401881\n2.0L\t401882\n老K游戏\t401883\nWeex\t401884\n魔女的复仇之夜\t401885\nfreopen\t401886\nns吧\t401887\n航海史\t401888\nv9.5\t401889\n222.88\t401890\n色彩斑斓\t401891\n几台\t401892\n科洛弗\t401893\nRoshe\t401894\n印加帝国\t401895\n5B\t401896\n冲出\t401897\n西安电子\t401898\n曹西平\t401899\n飞天鱼\t401900\n贸易场\t401901\n带行\t401902\nSALON\t401903\n考勒隧道\t401904\n清源街道\t401905\nTan\t401906\n哈利奎恩\t401907\n北极光\t401908\n真相侠\t401909\n群晖NAS\t401910\nGorogoa\t401911\n殆\t401912\n魚\t401913\n南麂岛\t401914\n谢咏\t401915\n大活\t401916\n50元\t401917\nPowerDVD\t401918\n银杏之乡\t401919\n鸿云\t401920\n化妆包\t401921\n5.5KW\t401922\n钱莹\t401923\n降温\t401924\nisrael\t401925\ng430\t401926\nsteele\t401927\n榆林镇\t401928\n森野雫\t401929\n984集\t401930\n荣创\t401931\n九方城\t401932\n黑狱\t401933\n浮冰\t401934\n特灵\t401935\n板门\t401936\n焓湿\t401937\n长者\t401938\n值得上\t401939\n东京女子图鉴\t401940\n香烛\t401941\n乡镇纪委\t401942\nwin7系统C盘\t401943\n调味盒\t401944\n瑞幸\t401945\n20151214\t401946\n陆少的秘密恋人\t401947\n伊春市\t401948\n悄无声息\t401949\n徭役\t401950\n榴火\t401951\n浮木\t401952\n大力宣传\t401953\n91_\t401954\n战枫\t401955\nbounded\t401956\n雅居乐国际花园\t401957\n保利大厦\t401958\n电火锅\t401959\n滑板\t401960\n奥陶纪公园\t401961\nbother\t401962\n飞傲x1\t401963\n田单\t401964\n凌河区\t401965\n蝙\t401966\n鲤鱼旗\t401967\n东方号\t401968\n联合国环境署\t401969\n李玲玲\t401970\n东方宝泰\t401971\n第一章\t401972\n192.168.0.0\t4019
73\n梨形\t401974\n人形师\t401975\n简配\t401976\nunitek\t401977\nBois\t401978\n腥草\t401979\n2530\t401980\n东中\t401981\n黄山北\t401982\n45集\t401983\n说话之道\t401984\n清江浦\t401985\n香港廉政公署\t401986\nLogging\t401987\n军令\t401988\n盐酸氨基葡萄糖\t401989\n许佳慧\t401990\nWMS\t401991\n螺狮粉\t401992\n中西合璧\t401993\nasList\t401994\n凤霞\t401995\n玉树临风\t401996\ngt2000\t401997\n3751色影院\t401998\n曹国伟\t401999\n县广播电视台\t402000\nOperating\t402001\n苏浅\t402002\n脸大\t402003\n雅乐\t402004\n新罗区人民政府\t402005\n1.7jdk\t402006\n黑本\t402007\nmac浏览器\t402008\n新项\t402009\n金雕\t402010\n48色\t402011\n叶语\t402012\n定风波\t402013\nmoments\t402014\n300多亿\t402015\n200所\t402016\n热战\t402017\n索纳塔论坛\t402018\nSIZE\t402019\n神波多一花\t402020\n喵大仙\t402021\n斗鱼蛇哥\t402022\ncaco3\t402023\nQVOD电驴ed2k\t402024\n逆_\t402025\n淘汰率\t402026\n宝商城\t402027\n赶跑\t402028\n蟹岛\t402029\n美企\t402030\n销售利润率\t402031\n300页\t402032\n玉溪市人民医院\t402033\n瑞虎3x\t402034\n大乐\t402035\n小宋\t402036\nAPI函数\t402037\n上海银行信用卡\t402038\n3041\t402039\n谓\t402040\n高安网\t402041\nWH-H900\t402042\n赵成\t402043\n2017年8月11日\t402044\n干锅虾\t402045\n乐味\t402046\n金融科技\t402047\n网通版\t402048\nmp4格式\t402049\n可申请\t402050\nrevolve\t402051\n身旁\t402052\n何玲\t402053\n谈心\t402054\n签订地\t402055\n直流电阻测试仪\t402056\nbotany\t402057\n土板\t402058\n成都市工商行政管理局\t402059\n紧贴\t402060\nPAO\t402061\n0.24.6\t402062\n234G\t402063\n药术\t402064\n说不到\t402065\n放射治疗科\t402066\n得率\t402067\nzhao\t402068\n城市规划专业\t402069\n海昌海洋公园\t402070\n毕业论\t402071\n音级\t402072\n香芹籽\t402073\n前囟门\t402074\nHybris\t402075\n法兰西\t402076\nprospective\t402077\n诺基亚105\t402078\nTrevor\t402079\n王沥川\t402080\n上海汽车集团\t402081\n莫逆\t402082\nnet-tools\t402083\nadmob\t402084\nNMF\t402085\n山大地纬\t402086\n初初\t402087\n摇一摇摇\t402088\nshatter\t402089\n68点\t402090\n全国人大广东代表团\t402091\n仰邦\t402092\n路由器\t402093\n寅时\t402094\n國定\t402095\n田雨橙\t402096\n王志国\t402097\n龚自珍\t402098\nKorea相关_娱乐\t402099\n蓝马\t402100\n新家坡\t402101\n颜晓峰\t402102\n付之一炬\t402103\n【飞\t402104\n和平医院\t402105\n查重查\t402106\nJC\t402107\n小马丁\t402108\n马俊\t402109\n第一人称\t402110\n招招\t402111\n椰子皮\t402112\nsetitimer\t402113\nぺ\t402114\nacid\t402115\ndsh\t40
2116\n陈东\t402117\nLeona+\t402118\n揭示\t402119\n6800k\t402120\n西数My\t402121\n时尚版\t402122\n小米无人机\t402123\nredist\t402124\n机务\t402125\n三基色\t402126\n前列腺治疗仪\t402127\n伽玛刀\t402128\n大乐之野\t402129\n事物\t402130\n200例\t402131\n一眼\t402132\n通达信公式-股海网\t402133\n1121\t402134\n千玺\t402135\n可约\t402136\n粤游天下_论坛\t402137\n文港\t402138\n告别\t402139\n于杨\t402140\n组会\t402141\nCornerStone\t402142\n地狱猫\t402143\n大蒜油\t402144\n里布\t402145\n岩体力学\t402146\n东京喰种3\t402147\n折线图\t402148\n念经\t402149\n畅迪\t402150\n辽宁省文化厅\t402151\n优质稻\t402152\nxygs\t402153\n打书\t402154\n出国留学网\t402155\n安夏儿\t402156\n2018年02月26日\t402157\n天竺镇\t402158\n双喜临门\t402159\n十五道\t402160\n辽东学院\t402161\n案内人\t402162\nCosPlay\t402163\nMALL\t402164\n林知落\t402165\n完全性\t402166\nARO\t402167\nwin7/xp\t402168\n中华中医药学会\t402169\nFa\t402170\n花鸟鱼虫网\t402171\nzhuren\t402172\ntanaka\t402173\n鲁银投资\t402174\n甘肃省政府法制信息网\t402175\n日租别墅\t402176\ndeclarations\t402177\n瑞鹏\t402178\ntries\t402179\n新疆昌吉州\t402180\n第84章\t402181\n杜旭东\t402182\n1404\t402183\n三稿\t402184\n新色\t402185\n开太平\t402186\n七十\t402187\n航道\t402188\n朱莉娅·罗伯茨\t402189\nfx6300\t402190\n阳区\t402191\n数码播放器\t402192\n个人隐私权\t402193\n夫\t402194\n百分之七十\t402195\n四万年\t402196\n易拉\t402197\n_紫\t402198\n趣头条\t402199\n徐浩\t402200\n2驱\t402201\n国家学生体质健康标准\t402202\n叛国者\t402203\n龙坞茶村\t402204\n高兽决\t402205\n250度\t402206\nVOCs\t402207\n全国普通高等学校\t402208\n谱制\t402209\n欣康\t402210\n微压\t402211\n佛山一中\t402212\n复员\t402213\n20千克\t402214\n通丝\t402215\n分异\t402216\n依维柯\t402217\n卷票\t402218\n极品飞车17:最高通缉\t402219\nCompetitors\t402220\nEP\t402221\n1|\t402222\n梅儿\t402223\n毒酒\t402224\n痰机\t402225\nePSXe\t402226\n新规则\t402227\n科学技术\t402228\ntw\t402229\n稳心颗粒\t402230\n72元\t402231\n表冠\t402232\n2.5吨\t402233\n8家\t402234\n不知其可\t402235\n百度百科\t402236\n虚拟DOM\t402237\n文忠路\t402238\n华旭\t402239\n汇灵盏\t402240\n林人\t402241\ncicpa\t402242\n七洲_七洲网\t402243\n火炉山\t402244\n向上\t402245\n重载运算符\t402246\n生死战\t402247\n赛门铁克\t402248\n句型\t402249\n现金券\t402250\n梭梭树\t402251\n女尊男卑\t402252\n中企\t402253\n浒关镇\t402254\n讳疾忌医\t402255\n间隙\t402256\n简讯\t402257\n货方\t402258\n手雷\t402259\n燃文\t402260\n新作\t402261\
n吃不消\t402262\nHu\t402263\n20000\t402264\n原房\t402265\nk型\t402266\n东原地产\t402267\n三斤\t402268\nRPG游戏\t402269\n220万元\t402270\n振\t402271\n7872\t402272\n茯茶\t402273\nIM即时通讯\t402274\nSoftmax\t402275\n_天刀\t402276\n油松\t402277\nsubsidy\t402278\nDoujinshi\t402279\n4亿美元\t402280\n好评\t402281\n忘尘\t402282\n三水湾\t402283\n川内优辉\t402284\n胡婷\t402285\nROOM\t402286\n掉秤\t402287\n头孢克洛缓释片\t402288\n弗曼\t402289\n韩语在线翻译\t402290\nvray3.6\t402291\n手力\t402292\n青山镇\t402293\n元龙\t402294\n樟子松\t402295\n山东省农村信用社联合社\t402296\nhaosf\t402297\n日常用\t402298\n小件\t402299\n四姐\t402300\nlanya\t402301\n帝王\t402302\n美娇\t402303\n山东工艺美术学院\t402304\n绍兴市公安局\t402305\n白色相簿\t402306\n百诺肯\t402307\nBT站\t402308\n语文期\t402309\nAkon\t402310\nWhirl\t402311\n爱思英语网\t402312\n轰6k\t402313\n雷神2:黑暗世界\t402314\n20160518\t402315\n杨巍\t402316\ncamfrog\t402317\n2.0000元\t402318\n康州\t402319\n抗美援\t402320\n香酥鸭\t402321\n北京市科学技术委员会\t402322\n济南齐鲁医院\t402323\n古拉格群岛\t402324\n选择集\t402325\n武者\t402326\n龙母庙\t402327\n悬赏\t402328\n经典全顺\t402329\n交锋\t402330\n傅政华\t402331\nCHKDSK\t402332\n机关机\t402333\n红岩江\t402334\n储殷\t402335\n迷性\t402336\n地球史\t402337\n360N4S\t402338\n屈服\t402339\n口袋车\t402340\n灵异\t402341\n高胆红素血症\t402342\n鄄城县\t402343\n庵东镇\t402344\n德玛西亚吧\t402345\n苏人社\t402346\n城西村\t402347\n九江电视台\t402348\n一问三不知\t402349\n减出\t402350\n翡翠手镯\t402351\n复投\t402352\n抢答\t402353\n一天之内\t402354\n羊马镇\t402355\n战气\t402356\n齐家装修网\t402357\nPETS5\t402358\n弗利沙\t402359\n何多苓\t402360\n限制酶\t402361\n有限合伙企业_\t402362\n资本主义\t402363\nebd\t402364\nLDF\t402365\n直筒\t402366\n车讯\t402367\n用友系统\t402368\nNBA2017\t402369\n小巴黎\t402370\n小妈\t402371\n淮安市教育局\t402372\nlq\t402373\n安雅\t402374\n请重\t402375\n搜狗推广\t402376\n荞麦面条\t402377\n老阳\t402378\narraybuffer\t402379\n贫困户\t402380\n手相算命_算命先生网\t402381\n鸡鸣寺\t402382\n硫酸铁\t402383\n杨子姗\t402384\n丁忧\t402385\n北京天使儿童医院\t402386\n叉开\t402387\nmineski\t402388\n95路\t402389\n沈阳地铁集团有限公司\t402390\nIE9+\t402391\n济南市规划局\t402392\n自若\t402393\n襄阳公园\t402394\n泥马\t402395\n少年杀人事件\t402396\n会长\t402397\n图式\t402398\n大理古城\t402399\n奥斯曼狄斯\t402400\n泉州港\t402401\n运行后\t402402\n波昂\t402403\n千景\t402404\nZii\t402405\n
学尔森教育\t402406\n落凤坡\t402407\n97小说网\t402408\n雉城街道\t402409\n南太铉\t402410\nvsp\t402411\n小小英雄\t402412\nECTouch\t402413\n9.61\t402414\nELECTRICAL\t402415\nSUNRISE\t402416\n辅料\t402417\n7针\t402418\n公寓式\t402419\nintegrate\t402420\n马内利\t402421\n哔哩哔哩卡\t402422\nRHCE认证\t402423\n变频泵\t402424\n榕江县\t402425\n被吊\t402426\n梨乡\t402427\nacknowledge\t402428\n心电图机\t402429\n套包剑网3\t402430\n火片\t402431\n紧迫感\t402432\nShave\t402433\n儿童画\t402434\n柔情似水\t402435\n固氮菌\t402436\n中共重庆市纪委\t402437\n18.4\t402438\n中国国家话剧院\t402439\n少年大宝\t402440\n塔卡沙\t402441\n全能扫描王\t402442\n安全组\t402443\n7700\t402444\n喜羊羊与灰太狼之奇幻天空岛\t402445\n短衫\t402446\n签字仪式\t402447\n双色球在线机\t402448\n褐变\t402449\n锚机\t402450\nCombobox\t402451\n威权\t402452\n半山腰\t402453\n成都商报|成都商报电子版|成都商报\t402454\n套式\t402455\n6.42\t402456\n确认信\t402457\n军休所\t402458\n尘埃粒子计数器\t402459\n兴丰\t402460\n杨氏弹性模量\t402461\n土建定额\t402462\n蕲州镇\t402463\n王者荣耀剑仙\t402464\n筹建\t402465\n4月8日起\t402466\n朱江明\t402467\nthis\t402468\n871\t402469\n重庆公安交通管理信息网\t402470\n蒲地蓝消炎口服液\t402471\n仿威图机柜\t402472\n443\t402473\n05742\t402474\n林荫大道\t402475\nHEART\t402476\n创办\t402477\n养护工\t402478\n13路\t402479\n长沙外国语学校\t402480\n联通3G\t402481\n茉莉花茶\t402482\n十分\t402483\n远远\t402484\n14条\t402485\n王绾绾\t402486\n即期汇率\t402487\n星驿付\t402488\n鬻\t402489\n日饭\t402490\n20140723\t402491\n寂静岭2\t402492\n于红\t402493\n业余组\t402494\n中国电建\t402495\n83亿\t402496\n斐济\t402497\n发热片\t402498\n白哉\t402499\n刘胜军\t402500\n口交网\t402501\n武山县\t402502\nprone\t402503\n武汉工程大学新闻中心\t402504\nx^2-y^2\t402505\n魔岩三杰\t402506\n二速\t402507\n金鼓\t402508\n2016年3月1日\t402509\n新浪吉林_新浪网\t402510\ni7-8700K处理器\t402511\nsem\t402512\n极米h1\t402513\n贵州财经大学\t402514\nORB-SLAM2\t402515\n先安\t402516\nddr\t402517\n仙履奇缘\t402518\nlist操作用法\t402519\n螺杆压缩机\t402520\n海运\t402521\n滚笼\t402522\nHPA\t402523\nDeck\t402524\ngenerated\t402525\n下证\t402526\n求职\t402527\n迎香穴\t402528\n慎吃\t402529\ndestory\t402530\n王豪\t402531\n伺服控制系统\t402532\n院校\t402533\n96名\t402534\n3000万吨\t402535\n凯夫拉\t402536\n相传\t402537\n中讯网\t402538\n镁光\t402539\n北京君正\t402540\n南京总医院\t402541\n46%\t402542\n游迅\t402543\novertrue\t402544\n卡主\t40254
5\n中长期铁路网规划\t402546\n小兔兔\t402547\namh\t402548\n茱萸\t402549\njfla\t402550\n姚斯婷\t402551\n顽童MJ116\t402552\n南京西路1266号\t402553\n竹笋\t402554\n彩谱\t402555\n亦称\t402556\n莱尔斯丹\t402557\n郑智薰\t402558\n总磷\t402559\n小萝\t402560\nMug\t402561\n房米网\t402562\nSystem\t402563\n葛朗台\t402564\nSANA\t402565\n铅笔裤\t402566\n库洛牌\t402567\n朱佳希\t402568\nav电影网\t402569\nbrewery\t402570\n炸鸡汉堡\t402571\ntire\t402572\n烧结多孔砖\t402573\n万物生长\t402574\n艾媒网\t402575\n宁人社\t402576\nEXCEL2010\t402577\n赐教\t402578\n2016年12月16日\t402579\n鸡_图拉丁吧\t402580\n95542\t402581\nsat考试\t402582\n国网陕西省电力公司\t402583\n18幢\t402584\n泛華發行\t402585\n戈达尔\t402586\n芒果布丁\t402587\n六台\t402588\n月岛\t402589\n混音器\t402590\n抱头痛哭\t402591\n浓浓\t402592\n提前期\t402593\n庞氏\t402594\n麻糖\t402595\nPlease\t402596\n杜特尔特\t402597\n板栗树\t402598\n中国人民建设银行\t402599\nRp\t402600\n机卷\t402601\n进户\t402602\n首现\t402603\n乳管\t402604\n谭斌\t402605\n?\t402606\n这一个月\t402607\n仿妆\t402608\nrespon\t402609\nwin10切换输入法\t402610\nnetstream\t402611\n颌\t402612\n大字号\t402613\n裸价\t402614\n凡是\t402615\n新梦网\t402616\n守护丽人\t402617\n波斯王子2\t402618\n金华山\t402619\n涨肚\t402620\n上海人家\t402621\n白沙水库\t402622\nAIOps\t402623\nVerne\t402624\n罗新\t402625\n邓福如\t402626\n12j003\t402627\npstn\t402628\n1晚\t402629\nboujou\t402630\n升高\t402631\n9章\t402632\n小宠\t402633\n重庆午\t402634\n28集\t402635\n人教新课标版\t402636\n硝普钠\t402637\n一串\t402638\nROSS\t402639\n厦门科技馆\t402640\n美丽乡村示范村\t402641\n补中益气汤\t402642\n3十\t402643\n实际生活\t402644\n一代一代\t402645\n福联\t402646\n青海省公安厅\t402647\n百度文库下载器\t402648\n卫生间隔断\t402649\n咏菊\t402650\n奇花异草\t402651\n向善好\t402652\nCultures\t402653\n7500\t402654\n千叶市\t402655\n陈清泉\t402656\ndiro\t402657\n防尘口罩\t402658\n105g\t402659\n尊法\t402660\n颐和盛世\t402661\n立法\t402662\nunit11\t402663\n标准版\t402664\nNB\t402665\n阿托\t402666\n鸥翼门\t402667\n组头\t402668\nV2.3.11\t402669\nABZU\t402670\n32寸\t402671\n孙建弘\t402672\nanalyser\t402673\ndbi\t402674\n康恩\t402675\n魏波\t402676\nSAMBA\t402677\n中山大涌红木家具\t402678\n腾讯新闻_\t402679\n走偏\t402680\n通信工程师考试\t402681\n限制性核酸内切酶\t402682\n从零\t402683\nEL表达式\t402684\n老生\t402685\n伟大的诱惑者\t402686\n狮群\t402687\n徐欣\t402688\n泰山石\t402689
\ngood电影网\t402690\n打鬼\t402691\n郑子\t402692\n临事\t402693\n恭维\t402694\n废都物语\t402695\nICD-10\t402696\nTPR\t402697\n钢管束\t402698\nipadmini4吧_\t402699\n南海子\t402700\n4箱\t402701\n白娜\t402702\n大明山\t402703\n一条路\t402704\n郭永怀\t402705\n西门口\t402706\n南乔\t402707\n56172001\t402708\nPixiv\t402709\nimax影院\t402710\n徐伟\t402711\naddView\t402712\n木子弓\t402713\n无光泽\t402714\n2016年5月17日\t402715\n厦门航空有限公司\t402716\n痛风\t402717\nLinux公社\t402718\n古珠\t402719\n荞\t402720\n福州市统计局\t402721\n药贴\t402722\nHackintosh\t402723\n叶尖\t402724\n天涯明月刀唐门\t402725\n打折\t402726\nrpf\t402727\n王军\t402728\nppt模\t402729\n虚拟币\t402730\n纪律片\t402731\n下书\t402732\ne2dk\t402733\n耳热\t402734\n影音先锋影院\t402735\n恋爱类\t402736\n真木阳子\t402737\n倾角\t402738\n标红\t402739\n袁剑\t402740\n剪枝算法\t402741\n土耳其语\t402742\n6.7亿\t402743\n秦城\t402744\n江苏省高校招生就业指导服务中心\t402745\n舟桥\t402746\n達\t402747\n20袋\t402748\n解放军理工大学\t402749\n62.5\t402750\n琴瑟\t402751\n泰科\t402752\n60FPS\t402753\ninaccessible\t402754\n治疗学\t402755\n巨作\t402756\n朱光\t402757\n行业品牌\t402758\n老客户\t402759\n45TFSI\t402760\n意向书\t402761\n阻燃胶合板\t402762\n增\t402763\nCatcher\t402764\n丫髻山\t402765\nnet4\t402766\n集体产权制度改革\t402767\n童文\t402768\n本方\t402769\n赵云霄\t402770\n汉中市龙岗学校\t402771\npeugeot\t402772\nthrone\t402773\n隔膜压滤机\t402774\n坐洋\t402775\n官场\t402776\n曹汝霖\t402777\n山进\t402778\nsago\t402779\n购车网\t402780\n上海华谊(集团)公司\t402781\n亿元\t402782\n甲乙\t402783\n真真正正\t402784\n休玛\t402785\n赡\t402786\n投点\t402787\n坏处\t402788\nNaming\t402789\n浙江省农发集团\t402790\n丽影\t402791\n水产\t402792\n广深\t402793\n灯盘\t402794\n旅游胜地\t402795\n东家\t402796\n2697\t402797\n小病\t402798\n巨鲲\t402799\n第二天\t402800\nromax\t402801\n战国无双4-2\t402802\ninduces\t402803\nSTF\t402804\ncontractor\t402805\nParts\t402806\nR230\t402807\n温州日报瓯网\t402808\n乌鲁木齐县\t402809\n芝罘\t402810\n平面方程\t402811\n购物\t402812\nwin7sp1\t402813\n年岁岁\t402814\n莫斯科国立大学\t402815\n中国国家图书馆\t402816\n团代会\t402817\n欸乃\t402818\n皇后\t402819\n卡布奇诺咖啡\t402820\n扶风琉璃\t402821\n3.00.03\t402822\n7545\t402823\n005号\t402824\n花生芽\t402825\n这么贵\t402826\n法警\t402827\n1500X\t402828\n第九卷\t402829\n黎山老母\t402830\n水土不服\t402831\n蒲波\t402832\
n脑放君\t402833\n小米线刷\t402834\nwww.nongcun5.com/sell/news/\t402835\n声讨\t402836\nKeyShot\t402837\n乡村大世界\t402838\n宝马x5\t402839\n初瑞雪\t402840\n光谷一小\t402841\nCascading\t402842\n丽春湖\t402843\nstreamer\t402844\nnikeid\t402845\n宏图\t402846\n鬼娃\t402847\n【众泰T700】新众泰_众泰T700\t402848\nastro\t402849\n陋室铭\t402850\n备案名\t402851\n刘亮程\t402852\n精鼎\t402853\n资历\t402854\n裙边\t402855\n错层式\t402856\n祈使\t402857\nmfc7360\t402858\n凶吉\t402859\n广州烈士陵园\t402860\nGIRLS\t402861\n白河县政府\t402862\n终于爱情\t402863\n广西学校\t402864\nxtreme\t402865\n气死\t402866\n哀乐女子天团\t402867\n红薯网\t402868\n本省\t402869\n二十味\t402870\n热卷\t402871\n建科\t402872\n仟\t402873\n展馆\t402874\n冷却管\t402875\nUSB转串口\t402876\nCOOL\t402877\n踏浪\t402878\n川西北\t402879\n秒选\t402880\nadl\t402881\n于果\t402882\n张PD\t402883\n真本\t402884\nPONY谱尼测试集团\t402885\n卫生计生委\t402886\n3367\t402887\n买算\t402888\n国网湖北省电力有限公司\t402889\n山外\t402890\n2668\t402891\nERP\t402892\nMacDrive\t402893\nlinkage\t402894\navr\t402895\n广东大峡谷\t402896\nUI线程\t402897\n独卫\t402898\n朝阳法院\t402899\n反\t402900\n50多\t402901\n萍踪侠影录\t402902\n圣路易斯\t402903\n丰田4S店\t402904\n很方\t402905\n工业源\t402906\n清宫无\t402907\n均化\t402908\n肝阴虚\t402909\n6颗\t402910\nCD补丁\t402911\n义绝\t402912\n军阵\t402913\nHostMonitor\t402914\n锏\t402915\n指骨\t402916\n这样\t402917\n雅酷\t402918\n热电偶\t402919\n天天炫斗\t402920\nunmanned\t402921\n14.8\t402922\n水粉画\t402923\njavhdvideo\t402924\n彼岸花\t402925\n何登成\t402926\nDIV/0\t402927\n三滤\t402928\n环比增长率\t402929\n发案\t402930\ncollected\t402931\nstylenanda\t402932\n瓦伦西亚\t402933\n厌氧胶\t402934\nNuit\t402935\n中国记协网\t402936\n懒妻\t402937\n济南机场\t402938\n午时茶颗粒\t402939\n亚得里亚海\t402940\n法定假\t402941\n彩虹宝宝\t402942\n派思股份\t402943\n日落大道\t402944\n纳克萨玛斯\t402945\nMonica\t402946\n军舰\t402947\n小尼\t402948\nPri\t402949\n九条\t402950\n资料员\t402951\n名爵锐腾\t402952\n轮圈\t402953\n风语\t402954\n赵公\t402955\n双石\t402956\n汉光百货\t402957\n印刷机\t402958\n深泽直人\t402959\n乐淘淘\t402960\n有梗\t402961\n米加\t402962\n男人婆\t402963\n袁静\t402964\nBasics\t402965\nmaven子\t402966\n勐腊\t402967\n面源\t402968\npdfjs\t402969\n活态\t402970\n930m\t402971\n赋给\t402972\n来水\t402973\n熙湖\t402974\n陈耀星\t402975\n
尼康d800\t402976\nUPUPW绿色服务器\t402977\n1亿多\t402978\n千差万别\t402979\n云南山歌\t402980\n防嗮\t402981\n小南海\t402982\n碎星锤\t402983\n30万千瓦\t402984\n华中科技大学研究生院\t402985\n1389\t402986\n任昌丁\t402987\n猴耳环消炎颗粒\t402988\n多形式\t402989\n彻子\t402990\n奴隶制\t402991\nPLS\t402992\n越海\t402993\nv15\t402994\n牛在飞\t402995\n分屏器\t402996\n积温\t402997\n苗圃\t402998\n中特\t402999\n南梁\t403000\n华润万家有限公司\t403001\n均线\t403002\n挡不住\t403003\n吉宝置业\t403004\n全面从严治党面对面\t403005\n震动棒\t403006\n怜香惜玉\t403007\nlogi\t403008\n風呂\t403009\n前三十年\t403010\n老门\t403011\ngaoji\t403012\n运动篇\t403013\n1面\t403014\n向轮\t403015\n概念\t403016\ntickle\t403017\n多名\t403018\n动态性\t403019\nOrchestration\t403020\n太阳风\t403021\n品源\t403022\n时方\t403023\n肉棒\t403024\n时银\t403025\n北京大学附属中学\t403026\n我的校花姐姐\t403027\n贝尔丰\t403028\nmp3+lrc\t403029\n渤海镇\t403030\n铝塑膜\t403031\nmantis\t403032\n自考毕业证\t403033\n干制品\t403034\n022期\t403035\n灌篮高手\t403036\n多址\t403037\n四星\t403038\n赣粤高速\t403039\n葡京赌场\t403040\n第二件\t403041\n知识型\t403042\nSymfony\t403043\n口试\t403044\n命大\t403045\n量品\t403046\n时速表\t403047\n惊声尖\t403048\n教育新闻_奥数网\t403049\n过敏症\t403050\n里番邪恶漫画\t403051\n养媳\t403052\n体位性低血压\t403053\n熊仓祥子\t403054\n软启动器\t403055\n天猫子\t403056\n文学奖\t403057\n达则兼济天下\t403058\n曙光社区\t403059\n吴瑕\t403060\ndeleting\t403061\n领导们\t403062\n蛮龙\t403063\n塞纳河畔\t403064\n2016-2018年\t403065\n第26届\t403066\n币安币\t403067\n皮卡多\t403068\nDotNetBar\t403069\n新一城\t403070\n日渐\t403071\n色诺芬\t403072\n小洋\t403073\n制沙机\t403074\n传祺gs4\t403075\nHr\t403076\n油工\t403077\n福田区委\t403078\n131碘\t403079\n海尼曼\t403080\n借阅\t403081\nUnique\t403082\n三生三世十里桃花夜华\t403083\n卡沙\t403084\n假面骑士Drive\t403085\n读书网\t403086\nslf4j+log4j\t403087\nTc\t403088\n邹涛\t403089\nManual\t403090\nCGWR\t403091\nWSK\t403092\n孤行\t403093\n厦门大学管理学院\t403094\n30多年前\t403095\n博客管理系统\t403096\n攻略站\t403097\n绿地理想城\t403098\n黄村站\t403099\nluckymore\t403100\n/DIV\t403101\n开济\t403102\n泥巴公社\t403103\n某局\t403104\n内眦\t403105\n定日县\t403106\n嘉格纳\t403107\n五美\t403108\n戊戌\t403109\n密教\t403110\n万达信息\t403111\n济南长途汽车西站\t403112\n洗牌\t403113\n高职生\t403114\nfprintf\t403115\n桃田贤斗\t403116\n205\t403117\n肚子痛\t403118\n干调
\t403119\n叨叨\t403120\n立体停车设备\t403121\n93636手游网\t403122\nminerals\t403123\n300方\t403124\n纯元皇后\t403125\nioutil\t403126\n龙舌兰\t403127\n安溪新闻网\t403128\n同济七版\t403129\n大虎\t403130\n2017年2月份\t403131\nnparray\t403132\n神女峰\t403133\n锁体\t403134\n共轭复数\t403135\ncmdshell\t403136\nuci.edu/~dmdb/chandra/Enron2.1/words.txt\t403137\n权力榜\t403138\nCooke\t403139\n健身俱乐部\t403140\n强袭魔女\t403141\n单雯\t403142\n中华人民共和国税收征收管理法\t403143\nSessionID\t403144\n奔驰c180\t403145\nNETSHOW\t403146\n机密性\t403147\n神斧\t403148\n上海美年大健康体检中心\t403149\n中国科技论文在线\t403150\n吵\t403151\n淄博中学\t403152\n广东雨神\t403153\npersisted\t403154\n会出\t403155\n上东城\t403156\n2018年3月底\t403157\n融创凡尔赛花园\t403158\n5刀\t403159\n悲凉\t403160\n2017年11月10日\t403161\n崔颢\t403162\n黄金瓜\t403163\n舌尖3\t403164\n治安管理行为人\t403165\n看家本领\t403166\n永修县\t403167\n石英岩\t403168\n乔威\t403169\n北上广\t403170\n石奈子\t403171\nThaumcraft\t403172\n连锁反应\t403173\n肉状\t403174\n阿卜杜拉\t403175\n营销师\t403176\n翅片式\t403177\n我的妈妈\t403178\naccess数据库\t403179\n胜芳\t403180\n青岛市商务局\t403181\n画杨桃\t403182\n吕克·贝松\t403183\n名酒\t403184\n月旦\t403185\n多形性\t403186\n三灶镇\t403187\n白族\t403188\nInitiation\t403189\n2017-11-29\t403190\n红轴\t403191\nstep\t403192\n勇者大战\t403193\n河南省工商行政管理局\t403194\n探照灯\t403195\n博微电力\t403196\nGB50116-2013\t403197\n219.139.109\t403198\n神奈\t403199\n沙蟹\t403200\n腾房网\t403201\n上水广场\t403202\nDrew\t403203\n商刻\t403204\n财产权\t403205\n第五册\t403206\n三舞\t403207\n双姝\t403208\n3.6L\t403209\n第55条\t403210\n中国科学院南京地理与湖泊研究所\t403211\n新科\t403212\n搜索键\t403213\nvalidation\t403214\n老广\t403215\n500t\t403216\n交流器\t403217\n228号\t403218\n百度云满速\t403219\n重庆北火车站\t403220\njewels\t403221\n味极鲜\t403222\n辞职报告\t403223\n12项\t403224\n卡子\t403225\n进口品\t403226\n2017周年\t403227\n儒者\t403228\nSouth\t403229\n演义\t403230\ntataufo\t403231\n5.25\t403232\n蓝网\t403233\n苍之纪元洛天依\t403234\n玫瑰茄\t403235\n长江证券\t403236\n115亿\t403237\n本丸\t403238\n报批版\t403239\nWellness\t403240\n一性\t403241\n东方律师网\t403242\nbran\t403243\n通济桥\t403244\ndocx4j\t403245\n口线\t403246\n攀西\t403247\nuft-8\t403248\n调\t403249\n多_寻医问药网\t403250\n王建忠\t403251\n家子\t403252\n共&#160\t403253\n十一世\t40
3254\n拉拔\t403255\n天津乐居\t403256\nfill\t403257\nJDM\t403258\n公测版\t403259\n计算机化\t403260\n两千\t403261\n山西三维集团\t403262\npjax\t403263\n佳能ix6580\t403264\n2018年4月19\t403265\nrealplayer\t403266\n潮牌\t403267\n143平米\t403268\n手下留情\t403269\n渔童\t403270\n加拿大学\t403271\n开票员\t403272\n三毛从军记\t403273\n冷宫\t403274\n顺势\t403275\n兰大二院\t403276\n卢鑫玉浩\t403277\n洼田正孝\t403278\n复仇文\t403279\n冰化\t403280\n丰田大霸王\t403281\nStack\t403282\n形成性考核册\t403283\n未公开\t403284\n蹲点\t403285\nENTERTAINMENT\t403286\nDK2\t403287\n金山卫\t403288\nLos\t403289\n马伽术\t403290\n海南省人力资源和社会保障厅\t403291\nHSE\t403292\noplayer\t403293\n长沙市委\t403294\n第5级\t403295\n.COM\t403296\n为虎作伥\t403297\n傲剑2\t403298\n180个\t403299\n黄树贤\t403300\n保障房\t403301\nvisuals\t403302\nipsa\t403303\n苏苏姐\t403304\nViewModel\t403305\n决定性\t403306\n冷藏室\t403307\ncomplaint\t403308\n西宁西\t403309\n诺远资产管理有限公司\t403310\nspandex\t403311\n掏\t403312\nProspective\t403313\n油盖\t403314\n雷雨天\t403315\n温州市区\t403316\n级联分类器\t403317\n升任\t403318\n血源诅咒\t403319\n光遗传学\t403320\n木匠\t403321\namazed\t403322\n高热惊厥\t403323\n苏子叶\t403324\ntomcate\t403325\nwww.001bz.cc/modules/article/txtarticle\t403326\nHive函数\t403327\n580万套\t403328\n大明湖\t403329\n县人大\t403330\n款单\t403331\n诺富特酒店\t403332\n电推\t403333\nserria\t403334\n九尾妖狐\t403335\nSDS\t403336\n新君\t403337\n打眼机\t403338\n异度之刃2花\t403339\n中超联赛\t403340\n徐昂\t403341\n大举\t403342\n庄户\t403343\nSMLZ\t403344\n陕西米脂\t403345\n孟庭苇\t403346\nMathJax\t403347\nJp\t403348\n辉县市\t403349\n专制\t403350\n许志华\t403351\n绑发\t403352\n创战纪\t403353\n337号\t403354\n凤美\t403355\n十三郎\t403356\n市二医院\t403357\n大锅\t403358\n手指纹\t403359\n南渡\t403360\n蜂网\t403361\n市旅游委\t403362\nGRS\t403363\n宋书航\t403364\n0.05MB\t403365\n钙咀嚼片\t403366\n中规\t403367\ngrammatical\t403368\n双年\t403369\n爱下电影网\t403370\n22年\t403371\n镇沅\t403372\n得道\t403373\ntestcenter\t403374\n马特·达蒙\t403375\nnetlogo\t403376\n讶异\t403377\nMixing\t403378\n漳河新区\t403379\n安装工\t403380\n武门\t403381\n古砚\t403382\n一帘\t403383\n九阳神王\t403384\n二事\t403385\n复方甘草酸苷片\t403386\n非京籍\t403387\npigs\t403388\n有口皆碑\t403389\n西尔斯\t403390\n公安局出入境管理处\t403391\n电力变压器\t403392\n碧血蓝天\t40339
3\n业内\t403394\n20150423\t403395\n课间游戏\t403396\nsoot\t403397\n冰蜡\t403398\n最终幻想\t403399\ndeserialize\t403400\n街里\t403401\n谢作诗\t403402\n嗖嗖\t403403\n防爆玻璃\t403404\n中国地方政府\t403405\n蜂农\t403406\n板链\t403407\n林华华\t403408\n柱状体\t403409\n优享型\t403410\n中国棋牌网\t403411\n继承者的家养小绵羊\t403412\n比较号\t403413\n未名天日语学校\t403414\n百顺\t403415\n东大街\t403416\n故纸\t403417\n御尊版\t403418\n知识竞赛\t403419\n饭后\t403420\n电子竞技俱乐部\t403421\n鲁小姐\t403422\n重庆水务\t403423\n玩心\t403424\nstateflow\t403425\nヴァ\t403426\n守望\t403427\n林小姐\t403428\n沐声\t403429\n四员\t403430\n一加手机3\t403431\n第一套\t403432\n激素\t403433\n中央编译出版社\t403434\n1.3L\t403435\n会计系\t403436\n扩大\t403437\n刘和珍君\t403438\n拾忆\t403439\ndesiger\t403440\n6004\t403441\n先锋骑兵\t403442\n重庆高院\t403443\n中信城\t403444\n胸片\t403445\n激越历史时空的青春共鸣\t403446\n拉氏变换\t403447\n发配\t403448\n更高\t403449\n2个半月\t403450\n液肥\t403451\n广河县\t403452\nse2\t403453\n20171124\t403454\n水木清华小区\t403455\n头马\t403456\n赞比亚\t403457\n言派\t403458\n96周年\t403459\n搜片\t403460\n保代宝典\t403461\n全瓷\t403462\n苏州街\t403463\n集智社区\t403464\n钢化夹胶玻璃\t403465\n佟彤\t403466\n领取\t403467\n甜酒酿\t403468\n5.22\t403469\n胡智锋\t403470\n江森\t403471\ne2fsprogs\t403472\n第8天\t403473\n通用版\t403474\n蓓俪芙养森\t403475\n大禹治水\t403476\n甲苯磺酸\t403477\n206国道\t403478\n健康科技\t403479\n方素珍\t403480\n2x1\t403481\ns2520\t403482\n楣板\t403483\nCatering\t403484\n瑞麒\t403485\n金马碧鸡坊\t403486\n八楼\t403487\nkity\t403488\n大疙瘩\t403489\nmovies\t403490\n深成指\t403491\n邯郸一中\t403492\n盟军敢死队使命召唤\t403493\nCADtools\t403494\nhrt\t403495\n软毛\t403496\n兄弟姐妹一家亲\t403497\n兰氏\t403498\n辽东湾\t403499\n肾动脉狭窄\t403500\nxface\t403501\n安乐村\t403502\n怀北镇\t403503\n网络硬盘录像机\t403504\nFeCl3\t403505\nAIML\t403506\n龙山街道\t403507\n宜春市人民政府办公室\t403508\n辉昂\t403509\nRAID5\t403510\n易位\t403511\n赛马节\t403512\n第二项\t403513\n重庆工商大学\t403514\n负荷\t403515\nPerception\t403516\n2线\t403517\n罗艺\t403518\n2015-01-30\t403519\n第五弹\t403520\n架子队\t403521\n反目成仇\t403522\n梦姑\t403523\n妖怪旅馆营业中\t403524\n天津市委组织部\t403525\nengine\t403526\n爱耳时代\t403527\ntesla\t403528\n接收者\t403529\n保护剂\t403530\n9.04\t403531\n整式\t403532\n24例\t403533\n内燃机车\t403534\n防空\t403535\n长乐镇\t403536
\n环球广场\t403537\n迪卢木多\t403538\nkanzhelu\t403539\n悲孽人生\t403540\n旗长\t403541\n师职\t403542\n提问者\t403543\nultraboost\t403544\n斯蒂芬·霍金\t403545\n李将军\t403546\n德衡\t403547\nEMP\t403548\nGinza\t403549\n新医\t403550\n苏洋\t403551\n香港庙街\t403552\n就该\t403553\n日语翻译招聘网\t403554\n隔离液\t403555\n8Plus\t403556\n价值连城\t403557\nX360\t403558\n尹毓恪\t403559\n甘蔗\t403560\n有幸\t403561\n宁波大学研究生院\t403562\n江安镇\t403563\n绿色建筑\t403564\n三克\t403565\n亿星\t403566\n英菲克\t403567\n不孕\t403568\nobjective\t403569\n放飞梦想\t403570\n编程篇\t403571\nCHANGE\t403572\nTurnover\t403573\n马来西亚公司\t403574\n骏\t403575\n天川\t403576\ntony\t403577\n谢裕\t403578\n无翼鸟邪恶萝莉\t403579\njiayuan\t403580\n无\t403581\n乐易\t403582\nfcgid\t403583\n偷菜\t403584\n狮\t403585\n温医大\t403586\n樾府\t403587\nAcitiviti在线流程设计器\t403588\n语词\t403589\nAMAZFIT\t403590\nPROCESSING\t403591\nspare\t403592\nSanya\t403593\n穗莞\t403594\n欢欢喜喜\t403595\nthon\t403596\n淫穴\t403597\n徐高铁\t403598\n拾音\t403599\n零五网\t403600\n星力\t403601\n怪物猎人世界\t403602\n会同县政府\t403603\nToast\t403604\n不惧\t403605\n第四批\t403606\n天津市国土资源和房屋管理局\t403607\nwinserver2008\t403608\nlinux64\t403609\n猎文网\t403610\n三头龙\t403611\n异相\t403612\n崇祯十七年\t403613\n竹夫人\t403614\n素敌\t403615\n修约\t403616\nConversations\t403617\n庆余年\t403618\ndaka\t403619\nwebcam\t403620\n建筑类\t403621\n三星w2016\t403622\n唏嘘不已\t403623\n水分\t403624\nmp7\t403625\n98柔情篇\t403626\n卖出\t403627\n青大\t403628\n从军\t403629\n五十分钟\t403630\n低筋\t403631\nNgnix\t403632\n佐伊\t403633\n蔷\t403634\n如懿\t403635\n记行\t403636\n琅岐\t403637\nfabang\t403638\nshampoo\t403639\n窃密\t403640\n权力的游戏第四季\t403641\nLivehouse\t403642\n牛粪\t403643\n黎晓芳\t403644\n丹帝\t403645\n就一点\t403646\n口牙\t403647\n金水\t403648\n上海交通大学医学院附属瑞金医院北院\t403649\n洪杰\t403650\n球管\t403651\n兆驰\t403652\n显屏\t403653\n光大银行手机银行\t403654\n劲速\t403655\n以权谋私\t403656\n猩红女巫\t403657\n毛头\t403658\n_杯\t403659\n专业型\t403660\n侧田\t403661\n虹口区\t403662\n怪谈\t403663\n蜡疗\t403664\n施华蔻斐丝丽\t403665\n校核\t403666\n,,\t403667\n中州大道\t403668\npicasa\t403669\n21张\t403670\n电骡\t403671\n辽宁省体育局\t403672\n大桥乡\t403673\n杨淼\t403674\n链式\t403675\n生尘\t403676\n物资库\t403677\nmysql注册码\t403678\n10009\t403
679\nv150\t403680\n头戴式\t403681\n黑樱桃\t403682\nx79\t403683\n海南三亚\t403684\n读写器\t403685\n月亮代表我的心\t403686\n2016年起\t403687\n万用宝\t403688\n大菠萝\t403689\n期权定价\t403690\n垂髫\t403691\n20150506\t403692\n切格瓦拉\t403693\n有关方面\t403694\nness\t403695\nandroid浏览器\t403696\n4盘\t403697\nWenism\t403698\n13图\t403699\ntfsi\t403700\n菜鸟物流\t403701\n超调量\t403702\n溏\t403703\n虎牙直播吧\t403704\n鱼果\t403705\n越秀集团\t403706\n宣教之窗\t403707\n开守护\t403708\n宏声\t403709\n吴梦知\t403710\n非公版\t403711\n劳技室\t403712\n指南在线考试\t403713\nREAL\t403714\n网易阴阳\t403715\nP51s\t403716\n日本东北大学\t403717\n2016年下半年\t403718\n愁眉苦脸\t403719\nSequences\t403720\n神秘之城\t403721\n164家\t403722\n2017年12月21日\t403723\n湖南大众传媒职业技术学院\t403724\n有时候\t403725\nsedog\t403726\n北山村\t403727\n新华小学\t403728\n权威性\t403729\n海南建省办\t403730\n黄信\t403731\nFirefox浏览器\t403732\n东山顶上\t403733\n气血\t403734\n最囧昕动\t403735\n小米手机助手\t403736\n欧珑\t403737\n一千五百万\t403738\n盖亚\t403739\n姑且\t403740\nbnf\t403741\n玛丽与魔女之花\t403742\n四库全书总目提要\t403743\n心怡\t403744\n瑞氏\t403745\nV2.0版\t403746\nMinana\t403747\n2018044期\t403748\n甘露露\t403749\n微信浏览器\t403750\n伽玛函数\t403751\n念冥想\t403752\n罗摩\t403753\nhunan\t403754\n深色\t403755\n板块\t403756\n心理咨询师考试\t403757\n施恩\t403758\n考研院\t403759\n碗口\t403760\n伊春路\t403761\n绿条\t403762\nRCNN\t403763\n秦沛\t403764\nmiso\t403765\nS12\t403766\nYII\t403767\n道钉\t403768\n225号\t403769\n恋爱模拟器\t403770\n明星片\t403771\n鱼油胶囊\t403772\n子瑜\t403773\n西湖村\t403774\namd显卡驱动\t403775\n龙昆南路\t403776\n八个半月\t403777\n燊\t403778\n磨炼\t403779\naxes\t403780\n微世界\t403781\n扬鞭\t403782\n大多\t403783\n第四色\t403784\napbian\t403785\n中共如皋市委\t403786\n日月湾\t403787\n六面兽\t403788\n新余法院\t403789\n菜谱\t403790\nPinko\t403791\nb超\t403792\n高哥\t403793\n梅赛德斯-奔驰\t403794\n酿酒吧\t403795\n生死路\t403796\n泳者\t403797\n违法者\t403798\n黑盘\t403799\n诗汇\t403800\n宜家集团\t403801\noverdue\t403802\n永修\t403803\n病方\t403804\n全类\t403805\nMyEclipse\t403806\n骑马与砍杀人间\t403807\n极米z6\t403808\n马丁鞋\t403809\nAletta\t403810\nLang\t403811\npooping\t403812\n中巴经济走廊\t403813\n弯把\t403814\nGoa\t403815\n灯笼裤\t403816\nspry\t403817\n文澜阁\t403818\nProcessing\t403819\n2015.3_\t403820\n苗阜\t403821\n0
01195\t403822\n棉纺\t403823\nhd800s\t403824\n死色\t403825\n华润啤酒\t403826\n脱髓鞘\t403827\n宁波海运\t403828\nKinds\t403829\n枪械\t403830\n14t\t403831\n120集\t403832\n何中华\t403833\n浴场\t403834\n米亚\t403835\n石湖景区\t403836\n郑惠玉\t403837\n萧景睿\t403838\n五组\t403839\n在水一方\t403840\n三傻\t403841\nrealloc\t403842\ntek\t403843\n品友\t403844\n5783\t403845\nADD\t403846\n金毛寻回犬\t403847\n位数字\t403848\n央视财经频道\t403849\n滨河国际新城\t403850\n细语\t403851\n西莫\t403852\nKVM虚拟机\t403853\n共乐\t403854\n城带\t403855\n中新赛克\t403856\n不一样的美男子\t403857\n威灵顿\t403858\n戋\t403859\n旧照\t403860\n电阻柜\t403861\n王家岭\t403862\nGalbraith\t403863\n李建刚\t403864\n全国游泳冠军赛\t403865\n长安欧尚_欧尚a800\t403866\n赵丰\t403867\n89c51\t403868\n通榆县\t403869\n观致5\t403870\nASLA\t403871\nsoulapp\t403872\n汇报片\t403873\n601601\t403874\n兰州铁路局\t403875\n传奇外传\t403876\n人参果\t403877\n测试者\t403878\n农林经济管理专业\t403879\n晋江文学城--帮助中心\t403880\n秀香\t403881\n预想\t403882\nL6\t403883\nBabolat\t403884\n磁光\t403885\n立秋\t403886\n30岁\t403887\n第五十八\t403888\n0856\t403889\nRSS\t403890\n5月中旬\t403891\n佳源\t403892\n匆匆\t403893\n断线钳\t403894\n挂墙式\t403895\nbbp\t403896\n机械电子工程专业\t403897\nfaye\t403898\n安室透\t403899\n便量\t403900\n树村\t403901\n磨光片\t403902\nvgg16\t403903\n柔韧性\t403904\nbask\t403905\n面价\t403906\nBUI\t403907\n譬如\t403908\n麻辣拌\t403909\n杯面\t403910\n端子机\t403911\n黑龙江省农垦总局\t403912\n停车\t403913\n券\t403914\n婚神\t403915\npandora\t403916\n销冠\t403917\n肿么\t403918\n10.4.1\t403919\n6g\t403920\n上海体检中心\t403921\nfern\t403922\n格雷尔\t403923\n戈多\t403924\n结晶水\t403925\n中国人民银行济南分行\t403926\n西安幼儿园\t403927\n英谱\t403928\n北京沃尔沃\t403929\n木香阁\t403930\n逆袭路\t403931\n64号\t403932\n沃尔沃S80L\t403933\n别克老君威\t403934\n小太阳鹦鹉吧\t403935\n德阳日报社\t403936\n盐酸左氧氟沙星滴眼液\t403937\njasypt\t403938\n华豫之门\t403939\n破洞\t403940\n魔神医毒妃\t403941\n阴风\t403942\n杭州公司\t403943\n奔驰glc200\t403944\nchr\t403945\n车站路\t403946\n大北路\t403947\n沙溢\t403948\n小而美\t403949\n英籍\t403950\n翠神\t403951\n单行本\t403952\n效应\t403953\n终极一班4\t403954\n哈哈镜\t403955\n加班费\t403956\n雨滴\t403957\n福泽谕吉\t403958\n超文本标记语言\t403959\nFeet\t403960\n牵引器\t403961\ninteger\t403962\n丁于\t403963\n大钊\t403964\nIEEE\t403965\n日站\t403966\
n生命时报\t403967\n*#*#4636#*#*\t403968\n饱和溶液\t403969\nSOLO\t403970\nhandmade\t403971\n南京市儿童医院\t403972\n锅头\t403973\n英年\t403974\nExcle\t403975\n0103\t403976\n融水县\t403977\n帕托\t403978\n室性心动过速\t403979\nspfa\t403980\n迭代器\t403981\n宇都宫紫苑\t403982\n科幻画\t403983\n花生果\t403984\n5月12号\t403985\n佳能70D\t403986\n橄榄油\t403987\n觉华岛\t403988\n幼儿舞蹈教学\t403989\n丙烯醛\t403990\n5月11日\t403991\n贵妇们\t403992\n荔园\t403993\n返京\t403994\n1864\t403995\n汉光\t403996\n暴殄天物\t403997\nREZ\t403998\n轻亲\t403999\n&#65533\t404000\n巍阁\t404001\n七匣子\t404002\n周水子国际机场\t404003\n表皮葡萄球菌\t404004\nIns\t404005\nvrrp\t404006\n5.0_\t404007\n29本\t404008\n北大外国语学院\t404009\n扑奔\t404010\n故乡人\t404011\n素描画网\t404012\n高抬腿\t404013\n一拖一\t404014\n沙滩鞋\t404015\n玉露\t404016\n微软雅\t404017\n凯氏定氮法\t404018\n20160216\t404019\nleet\t404020\n10丝\t404021\n填涂\t404022\n3dmine\t404023\n本季\t404024\n长城c30\t404025\n妙印法师\t404026\n人民币理财产品\t404027\n译本\t404028\n自选集\t404029\n913\t404030\n历书\t404031\nhapame\t404032\n碳纤维地暖\t404033\n肉欲\t404034\nROS探索总结\t404035\n35平米\t404036\n湖北商贸学院\t404037\n日语在线学习网\t404038\n斗轮堆取料机\t404039\n奥义联盟\t404040\n链轮\t404041\n看剑\t404042\nsns\t404043\nnvyou\t404044\n债项评级\t404045\n蛋白类\t404046\nCaudalie\t404047\n重庆大学城市科技学院\t404048\n阿英\t404049\n新加坡币\t404050\n第74\t404051\n杂志\t404052\n大膏\t404053\n两条线\t404054\n316国道\t404055\n王卫华\t404056\n3科\t404057\n西东网\t404058\n草芽\t404059\n个体化\t404060\nkangle\t404061\n同位语\t404062\n默者\t404063\n2k10\t404064\n褚时妖猫传\t404065\n十天\t404066\n双选\t404067\n分手信\t404068\npotatso\t404069\npintai\t404070\n佩戴\t404071\n串门\t404072\n编程语言\t404073\nYrion\t404074\n应景\t404075\n叶公\t404076\n薪俸税\t404077\n第三十七回\t404078\nzw\t404079\n革命机\t404080\n洋山一期\t404081\n怒其不争\t404082\n拆线\t404083\n津武\t404084\nudl\t404085\nCPG\t404086\n金赛药业\t404087\n九九房产网\t404088\n批判现实主义\t404089\n水泥商情网\t404090\nietf\t404091\npedal\t404092\n乳头状癌\t404093\nwifi软件\t404094\n潘氏\t404095\n肾亏\t404096\n天门网\t404097\n白山\t404098\ncombining\t404099\n东方天空璋\t404100\nUKULELE\t404101\n同人文包\t404102\npce\t404103\n2518\t404104\n球儿\t404105\nsato\t404106\n东条希\t404107\n凑足\t404108\n第23次\t404109\nsunddenly\
t404110\n孙怡\t404111\n整队\t404112\n威能\t404113\nddm\t404114\n崔泰俊\t404115\n恒运\t404116\n序篇\t404117\nFS5\t404118\n孙叔敖\t404119\n抚育\t404120\n安规测试仪\t404121\n马泰\t404122\ntable2excel\t404123\n流处理\t404124\n42式\t404125\n青海湖\t404126\n南京西路\t404127\n天河员村\t404128\n孙愚\t404129\n宇飞来\t404130\nfail\t404131\n五大类\t404132\n楔形\t404133\n五十四万亿\t404134\n10e\t404135\n现代远程学习概论\t404136\n薄膜\t404137\n嫉\t404138\n贯注\t404139\nfloa\t404140\n9560\t404141\nCH3\t404142\n荔湾湖公园\t404143\nbnuz\t404144\n乐队\t404145\n东望集团\t404146\n毒华\t404147\n童氏\t404148\nIllustrated\t404149\nfinancing\t404150\n还原型谷胱甘肽\t404151\n1—12月\t404152\n国网电子商务有限公司\t404153\n1658\t404154\nruckus\t404155\n埃利奥特\t404156\n∮\t404157\nperlman\t404158\n20180309\t404159\nAdmiral\t404160\n雷科防务\t404161\n额部\t404162\n列表式\t404163\n柴进\t404164\n官燕\t404165\n日立电梯(中国)有限公司\t404166\njfb\t404167\nFollowing\t404168\n辞任\t404169\n检点\t404170\n重纂\t404171\nfriction\t404172\n中央财经委\t404173\n全球采购网\t404174\n时间块\t404175\n刘十九\t404176\n臭皮匠\t404177\n截瘫\t404178\n身杯\t404179\n商业贿赂\t404180\n扭结\t404181\n贴身情人\t404182\n摩天大楼\t404183\n赛格国际购物中心\t404184\n自喜\t404185\nD7200\t404186\n帝豪GL\t404187\n嘉鱼\t404188\n军辉\t404189\n85_\t404190\n复旦大学附属眼耳鼻喉科医院\t404191\nkorea\t404192\n销魂\t404193\n2011年3月11日\t404194\nAdele\t404195\n雪风\t404196\ngetimagesize\t404197\n十几名\t404198\n赌客\t404199\n硫酸亚铁铵\t404200\n南昌经济技术开发区\t404201\nGenerics\t404202\n唐莉\t404203\n花廊\t404204\n1193\t404205\n人大金仓\t404206\n2015年12月27日\t404207\n一百零八\t404208\n42公里\t404209\n三生三世十里桃花\t404210\n卓隽卡\t404211\nbubbles\t404212\n7.27\t404213\nCHARLES\t404214\n蕾娜斯\t404215\n说梦话\t404216\n果器\t404217\n比亚迪S7\t404218\nwhoosh\t404219\n口袋学院物语2\t404220\nContact\t404221\n部标\t404222\n一串串\t404223\n楼价\t404224\n胡佛\t404225\nKC\t404226\nlengths\t404227\n徐波\t404228\n上百个\t404229\n礼贤下士\t404230\n灵镜传奇\t404231\nshr\t404232\n银翘散\t404233\n碳酸根\t404234\n碾压\t404235\n碶\t404236\n滨江道\t404237\nAKAI\t404238\n癸水\t404239\nHindu\t404240\n综恐\t404241\n学位\t404242\n差评\t404243\n25万美元\t404244\n36分钟\t404245\n江苏省海门中学\t404246\n日出日\t404247\n伊力\t404248\n北京大学校报电子版北京大学\t404249\n晓之\t404250\n蒸压加气混凝
土砌块\t404251\n幼年期\t404252\n嬉游\t404253\n压力式\t404254\n汇医慧影\t404255\n始作俑者\t404256\n令狐\t404257\n段炼\t404258\nデザイン\t404259\n领克01_领克_领克01报价\t404260\n跌伤\t404261\nLAMP\t404262\n永恒传说\t404263\n亚光\t404264\n博微\t404265\n托拉姆物语\t404266\n鹤城区\t404267\n全息\t404268\n北楚\t404269\n哔哩哔\t404270\n20180306\t404271\n_维\t404272\n郑西\t404273\n张一帆\t404274\n信用贷\t404275\n粤财信托\t404276\n圆织机\t404277\n汽泡\t404278\n318路\t404279\n很努力\t404280\n立于不败之地\t404281\n赵晖\t404282\n纠三促\t404283\n2毫升\t404284\nusps\t404285\n炫狗\t404286\n优宿\t404287\n主们\t404288\n1402\t404289\n中国消防网\t404290\n斑鸠\t404291\nnanhai\t404292\n胡萝卜素\t404293\n925银\t404294\n技巧性\t404295\nvissim\t404296\n800\t404297\n池州\t404298\n成泰燊\t404299\n秒死\t404300\n辉辉\t404301\n伊曲康唑\t404302\n奔腾b70\t404303\n好猫\t404304\n一场场\t404305\n结构化\t404306\nrid\t404307\n2800xxxx\t404308\n二十八岁\t404309\nMaggie\t404310\nSOCK\t404311\n限迁\t404312\n金龙王\t404313\n天津市胸科医院\t404314\n挥汗如雨\t404315\n晨阳\t404316\n成乐高速\t404317\n核导\t404318\n导管室\t404319\n体育心理学\t404320\nbeauty\t404321\n白云区\t404322\n河南省实验中学\t404323\n钱袋子\t404324\n炒股\t404325\n5.7.20\t404326\n9类\t404327\n黑耀\t404328\n许昌学院\t404329\n新水浒\t404330\n浙江理工\t404331\n笑问\t404332\n5.5.6\t404333\n武汉同济\t404334\nOA办公系统\t404335\nWin10高分屏\t404336\n唐古\t404337\n梅祖拉\t404338\n生化危机4hd\t404339\ncoolant\t404340\n好狗\t404341\n森防\t404342\n重庆商社\t404343\n杏奈\t404344\n伊莎\t404345\n早教班\t404346\n聪明的顺溜\t404347\n李洪波\t404348\nList\t404349\n左权县\t404350\ncoreseek\t404351\n孩子\t404352\n揣着\t404353\n陈家湾\t404354\n宠物医生\t404355\n梦幻西游经\t404356\n妙玉\t404357\n涉农\t404358\nti7\t404359\n中性粒细胞\t404360\n汽车之家论坛\t404361\n考绩\t404362\n数字传感器\t404363\n炽天使\t404364\n访惠\t404365\n女生徒\t404366\n刘建峰\t404367\n展览会议\t404368\n力奇\t404369\n可诺丹婷\t404370\n拉马努金\t404371\n迷迭香酸\t404372\n兑付\t404373\n奶娃\t404374\n瑞泰瑞\t404375\n北京万豪酒店\t404376\n宫水三叶\t404377\nUzi\t404378\n3月17号\t404379\n傲骨之战\t404380\n伟明环保\t404381\n2.7L\t404382\nMyers\t404383\nvide\t404384\n3Ds\t404385\n崇祯帝\t404386\n电感线圈\t404387\n南溪县\t404388\nmmall\t404389\nヒメカノ\t404390\n新角色\t404391\nMK-雷韵祺\t404392\n任军\t404393\nEarly\t404394\n特殊待遇\t404395\n万山镇\t404396\nMp3tag\
t404397\n大男孩\t404398\nnacicat\t404399\n249元\t404400\n分金\t404401\n9016\t404402\n腾讯手游助手绝地求生\t404403\n1777\t404404\n杨树\t404405\n余伟\t404406\nspoke\t404407\n排污者\t404408\n关联度分析\t404409\n塑料展\t404410\n景泰县\t404411\ndaochu\t404412\n春野小村医\t404413\n禁锢\t404414\nsxs\t404415\n百多邦软膏\t404416\nx5570\t404417\n0.14.0\t404418\n借贷记账法\t404419\n驳倒\t404420\n大邱\t404421\n白芦笋\t404422\n绘声绘色\t404423\nSurprises\t404424\n冰杖\t404425\n立汤路\t404426\n哈姆雷斯\t404427\nwhmcs\t404428\n双锁\t404429\n洪吉童\t404430\n人战\t404431\n陈一舟\t404432\n伫\t404433\n武警总医院\t404434\n徐晃\t404435\nbigdecimal\t404436\n大风暴\t404437\n死亡阴影\t404438\n口袋妖怪游戏大全\t404439\n客软件园\t404440\n5830\t404441\n西班牙大学\t404442\n2318\t404443\n冷壁\t404444\n中共河北省委\t404445\nBIGEMAP地图下载器\t404446\n上课时\t404447\n有缘\t404448\n刘丹丹\t404449\n轻狠\t404450\nens\t404451\n条字\t404452\n休斯顿火箭队\t404453\nclubmed\t404454\n7080年代\t404455\n软瓷\t404456\n剪切板\t404457\n滋养\t404458\n満\t404459\nhaut\t404460\n岛型\t404461\n化妆箱\t404462\naccreditation\t404463\n投靠\t404464\n吊妞\t404465\n李国荣\t404466\n消化不良\t404467\n进行性\t404468\n翻抛机\t404469\n鹤舞\t404470\n杭州市城市管理委员会\t404471\nGiuseppe\t404472\n化验室\t404473\n乍浦镇\t404474\nleoxuan\t404475\nUnlimited\t404476\nflx\t404477\n丰汇\t404478\nshenme\t404479\n局内\t404480\n怪咖咖\t404481\n养老保险\t404482\n微管家\t404483\n辩护权\t404484\n蜀山战纪之剑侠传奇\t404485\n魅蓝e\t404486\n返修率\t404487\n两创\t404488\n轴位\t404489\n酸性食物\t404490\n体育馆路\t404491\n0525\t404492\n离机\t404493\nPANTONE\t404494\n祛湿\t404495\n午夜凶铃\t404496\n增添\t404497\nFTTx\t404498\n专办员\t404499\nsoin\t404500\nRTL\t404501\n真空袋\t404502\n哈尔滨极地馆\t404503\n20160510\t404504\nShania\t404505\n混合运算\t404506\npa66\t404507\n铁板豆腐\t404508\n国发院\t404509\n飞狐\t404510\nWC\t404511\n认可\t404512\n玮\t404513\n重号\t404514\n八八伍财经\t404515\n135\t404516\n未婚妈妈\t404517\n碧源月湖\t404518\n言欢\t404519\n碳晶墙暖\t404520\n等词\t404521\n皇庭\t404522\npgc\t404523\n128gb\t404524\n北京幼升小网_北京幼升小\t404525\nenfocus\t404526\n0201\t404527\n贵州医科大学附属医院\t404528\nsimulator\t404529\n南海诸岛\t404530\n阵列式\t404531\n鏖战襄阳\t404532\n恼\t404533\n新华学院\t404534\n互网\t404535\n马兴\t404536\nd3200\t404537\nDSP\t404538\n超星图书馆\t4045
39\n指点迷津\t404540\nxuexi\t404541\n油冷\t404542\nWanted\t404543\nZOE\t404544\n汗液\t404545\n小儿化痰止咳颗粒\t404546\n190r\t404547\nTOGO\t404548\n杭州中院\t404549\n4111\t404550\n武侠版\t404551\n阿兰达瓦卓玛\t404552\nmariaDB\t404553\n省墨\t404554\n土命\t404555\n西风的话\t404556\n上海11号\t404557\n私生活\t404558\n临平街道\t404559\n城市名\t404560\nyotaphone2\t404561\n六口茶\t404562\n观经\t404563\n黄薇\t404564\n三菱电机中央空调\t404565\n苏州园博园\t404566\n捷胜\t404567\n二手房房产网\t404568\n藏古拉雍\t404569\n万友\t404570\n投照\t404571\n桌秤\t404572\n4四个\t404573\n闽清\t404574\n迈创\t404575\n510S\t404576\n口处\t404577\n废钢网\t404578\n天海邮轮\t404579\n别误会\t404580\n猎食者\t404581\n蛇皮\t404582\nV2Ray\t404583\n2017-10-28\t404584\n语码\t404585\n栾雨\t404586\n要钱\t404587\n肾主\t404588\n5.6.35\t404589\nmaterials\t404590\nTk\t404591\n第18页\t404592\n韩屋村\t404593\nF码\t404594\n榄菊\t404595\n酱类\t404596\n圣号\t404597\n经济法\t404598\n接洽\t404599\n万科朗润园\t404600\n时代感\t404601\nvivox5max\t404602\nXIA\t404603\n软件设计\t404604\n南京市交通运输局\t404605\n余江县\t404606\n警报器\t404607\n博望\t404608\n斯坦威\t404609\n2016-01\t404610\nxxxxxx\t404611\n迪迦奥特曼\t404612\n2099元\t404613\n7r\t404614\n大料\t404615\n粘胶剂\t404616\nHandycam\t404617\n再说一次\t404618\n京津冀\t404619\nFinance\t404620\nMasterCard\t404621\n心肌梗\t404622\n封面人物\t404623\n威风\t404624\n三度\t404625\n焖饭\t404626\n娜塔亚\t404627\nCNN\t404628\n三国11\t404629\n法拉克\t404630\ngh0st\t404631\n中华商务网\t404632\n第二十七届\t404633\n中国社会科学\t404634\n6.3.7\t404635\n小学五年级语文下册\t404636\n于思\t404637\n帽\t404638\n馆陶吧\t404639\n危品\t404640\n贵州师范学院\t404641\n长沙站\t404642\n怪物猎人X中文网\t404643\nDeep\t404644\n诚信企业家大会\t404645\nMeToo\t404646\n两重\t404647\nCatia\t404648\n繁荣度\t404649\n和佳\t404650\nNadech\t404651\n万寿无疆\t404652\n粿\t404653\n三德子\t404654\n留心\t404655\n虫群之心\t404656\ngun\t404657\n丙烯酸树脂\t404658\nECO\t404659\nMb\t404660\n业精于勤\t404661\n眉弓\t404662\nVehicles\t404663\n50天左右\t404664\ndfs\t404665\n于家堡\t404666\n工作单元\t404667\nx240\t404668\n深圳市人力资源和社会保障局_深圳政府\t404669\nav先锋\t404670\nACCA考试网\t404671\n大都会人寿\t404672\n大连一方足球俱乐部\t404673\nsab\t404674\n天上天下\t404675\niBook\t404676\nIT认证_资格考试/认证\t404677\n红旗南路\t404678\n情深以南尽成欢\t404679\n翔云\t404680\n
PG1\t404681\ndou\t404682\n重置版\t404683\n漫步者\t404684\n阴离子表面活性剂\t404685\n安监\t404686\nmax2013\t404687\n长安县\t404688\n月下\t404689\n褒义词\t404690\n乙级\t404691\n寄生\t404692\n将府公园\t404693\n沪电股份\t404694\n武汉博大医院\t404695\nMORTAR\t404696\n美人吟\t404697\n偷丝\t404698\n大连地铁4号线\t404699\n逻辑型\t404700\nwysiwyg\t404701\n大金\t404702\n狂豹\t404703\n凯泰\t404704\n提前角\t404705\n校本部\t404706\n幻君\t404707\n鬼吹灯精绝古城\t404708\n风台\t404709\n长东\t404710\n校帝\t404711\n金月湾\t404712\nclien\t404713\n陈睿\t404714\n曲靖市第一人民医院\t404715\n克劳塞维茨\t404716\n胡晓晴\t404717\n步行\t404718\n抑郁药\t404719\n紧凑型车\t404720\n13行\t404721\n热源泵\t404722\n海魔\t404723\n经济师考试\t404724\n温房网\t404725\n查良铮\t404726\n天下有警\t404727\n忆\t404728\n彝语\t404729\nAttendance\t404730\nhighly\t404731\n59.0.2\t404732\n军营\t404733\nszsi\t404734\n诸永高速\t404735\n山西中医药大学\t404736\n树油\t404737\n荆州市\t404738\nl0\t404739\ngentoo\t404740\n建安路\t404741\n冷兵\t404742\n宏编辑窗口\t404743\n塘厦\t404744\n古诗人\t404745\n防虫\t404746\nhp1005\t404747\n十二道\t404748\n海兴电力\t404749\n调用栈\t404750\n增补\t404751\n电时序\t404752\n蓝云\t404753\nミラ\t404754\nchuan\t404755\n根结\t404756\n磨耗\t404757\n糖油粑粑\t404758\n立方毫米\t404759\n脱戏\t404760\n要死人\t404761\n避尘\t404762\n一个批\t404763\n苏州市知识产权局\t404764\n勇者行动\t404765\n贸易金融网\t404766\n葡萄膜炎\t404767\n正辛醇\t404768\n医大\t404769\nmbti\t404770\nzero\t404771\n下层\t404772\nE店宝\t404773\n应收账款账龄分析表\t404774\n善本\t404775\nmcp\t404776\n副热带高压\t404777\n十类\t404778\n旁注\t404779\n徽州区\t404780\nconvenient\t404781\n塔尔玛\t404782\n灵宝\t404783\n广丰\t404784\nAsian\t404785\n定损员\t404786\n福田欧马可\t404787\n邪路\t404788\n相当于\t404789\n发酵罐\t404790\nmikumikudance\t404791\nGOTO\t404792\nTwemproxy\t404793\n中国人民解放军总参谋部\t404794\n搜狗拼音输入法2018\t404795\n油焖大虾\t404796\n自媒体营销\t404797\n沱沱工社\t404798\n凭感觉\t404799\nWI\t404800\n朱某某\t404801\ntorsion\t404802\nP2\t404803\n三祖\t404804\nMonkey\t404805\n受身\t404806\n代码转换器\t404807\nCT\t404808\n血皮菜\t404809\n小牛M1\t404810\n死不死\t404811\n线观\t404812\n大江南北\t404813\nWVR450G\t404814\n箴言\t404815\ndavygeek\t404816\n凉拌藕片\t404817\n上海华交会\t404818\n炮竹\t404819\n求欢\t404820\n东方奥雅之光\t404821\n十六进制编辑器\t404822\n乐天国际\t404823\n依折麦布\t404824\n小柴
犬\t404825\n遇见你\t404826\nROCK\t404827\n贝朗\t404828\n县农委\t404829\n珍珠漆\t404830\n改进\t404831\n百份\t404832\n特膳\t404833\nSources\t404834\n农商银行\t404835\nsyndrome\t404836\n上原亜衣\t404837\n水炮\t404838\n生活着\t404839\n管城区\t404840\n工信厅\t404841\nGCD\t404842\n终端箱\t404843\n矿物质片\t404844\n末子\t404845\n李小琳\t404846\n00175\t404847\n21018\t404848\n二十二年\t404849\n中国中铁股份有限公司\t404850\n山人乐队\t404851\n楷体字\t404852\n温温\t404853\n赤峰政府网\t404854\n自序\t404855\n深圳人才交流中心\t404856\nXTC800\t404857\n紫竹路\t404858\n翻云覆雨\t404859\n不负众望\t404860\n安身立命\t404861\n烟点\t404862\n张多福\t404863\n行中衡\t404864\n金辉优步花园\t404865\n120平\t404866\n参展商\t404867\n中国代表团\t404868\n种子法\t404869\n奈樱\t404870\n杭州农业银行\t404871\n陈柯帆\t404872\n普票\t404873\n排排\t404874\n热血江湖传\t404875\n伺服压力机\t404876\n科沃斯\t404877\n芍\t404878\n徐令义\t404879\n北京八宝山\t404880\n丰实\t404881\n郝景芳\t404882\n当担\t404883\n爱乐赞\t404884\n十月一号\t404885\n2016年9月1日起\t404886\n第四十\t404887\nEmulsion\t404888\n初级经济师\t404889\nJohnnie\t404890\n国家安\t404891\n大学篇\t404892\n小感\t404893\n南京市人民政府\t404894\n电子音乐\t404895\n众字\t404896\n锦州银行\t404897\n26.1.0\t404898\nonetomany\t404899\n4套\t404900\n侠盗飞车5圣安地列斯\t404901\n宣传文\t404902\nQCY\t404903\n十字架\t404904\n真讨厌\t404905\n中农\t404906\n挖矿\t404907\nRegistry\t404908\n龙之队\t404909\n郑州大学路\t404910\nv820w\t404911\n梦\t404912\nsteroid\t404913\n9w\t404914\n无恒\t404915\n葛林\t404916\nCrocodile\t404917\n阮次山\t404918\n沃克斯\t404919\nDJVU\t404920\n多美\t404921\nMapGis\t404922\n滚屏\t404923\nAPU\t404924\n美熟母\t404925\nWomens\t404926\n萌幻\t404927\n致幻\t404928\n宇智波佐良娜\t404929\n_句容先锋网\t404930\n赵紫骅\t404931\n移装\t404932\n抛掉\t404933\nmysql5\t404934\n散养\t404935\n石漆\t404936\n物袋\t404937\n沙坡尾\t404938\nm4800\t404939\n跌倒\t404940\n吃火锅\t404941\n耕作\t404942\n标准音\t404943\nmmol\t404944\n天下人\t404945\n六房网\t404946\n地质队\t404947\n全国卷2\t404948\n上海晶都生物技术有限公司\t404949\nengineer\t404950\nmicroscopy\t404951\n赤峰广播电视网\t404952\n一爻\t404953\n协议期\t404954\n养不起\t404955\n铂悦山\t404956\n龙舟队\t404957\n获影\t404958\n洪三\t404959\n中国机械工程学会\t404960\n广州市卫生和计划生育委员会\t404961\n迷药\t404962\ndestiny2吧\t404963\n龙胆\t404964\nBillWang\t404965\n阿阮\t404966\n华侨农场\t404967\n
名词\t404968\n海峡卫视\t404969\n口袋妖怪ORAS\t404970\n球板\t404971\n粽子叶\t404972\n61期\t404973\n孟极\t404974\n比例\t404975\n威权主义\t404976\n留仓\t404977\n叶启田\t404978\n视窗\t404979\n中央军委\t404980\nYURI\t404981\n大坪\t404982\n系杆\t404983\n阴寿\t404984\n真因\t404985\n20170722\t404986\n整流二极管\t404987\n找事\t404988\n在奥运会上\t404989\n无法忍受\t404990\n不齿\t404991\n梁辉\t404992\n冯雷\t404993\n文智\t404994\n跛子\t404995\n秦玉峰\t404996\n悦榕庄\t404997\n山东省安监局\t404998\n多角\t404999\n灯市口\t405000\n太白\t405001\n15公分\t405002\n新领程\t405003\n夺走\t405004\n血水\t405005\n眉卡\t405006\n第一天\t405007\n賀\t405008\n尾数\t405009\nios8.4.1\t405010\n科技资讯\t405011\n福建医科大学\t405012\n简网\t405013\ntjj\t405014\n面贴\t405015\n祥康\t405016\n风向仪\t405017\n安徒霍思燕\t405018\n光阑\t405019\n山羊肉\t405020\n胆汁反流\t405021\n51首\t405022\ndance2018\t405023\n洗脸\t405024\n中概\t405025\n分区未分配驱动器号\t405026\nstake\t405027\n东莞理工学院\t405028\n花露水\t405029\n变速器\t405030\n0536\t405031\nPAT\t405032\n胡洁\t405033\n造梦者\t405034\n松阳新闻网\t405035\n孙郡\t405036\nWinter\t405037\n书豪\t405038\n办公司\t405039\n钢琴价\t405040\n沈哲楠\t405041\n桥墩\t405042\n联机卡\t405043\n乌蒙大草原\t405044\n广阳区\t405045\n相思湖\t405046\n政府版\t405047\nUnemployment\t405048\n新疆小学\t405049\n阙歌\t405050\n北京市园林绿化局\t405051\n东陆路\t405052\n药事管理学\t405053\nproxyee\t405054\n20151013\t405055\n20160401\t405056\n汉化硬盘版\t405057\n导乐分娩\t405058\n塘桥镇\t405059\n圣光军团\t405060\n达蓬山\t405061\n录音带\t405062\n银影\t405063\n2.4GB\t405064\n【求\t405065\n镇沅县\t405066\n山村小站之玉儿嫂\t405067\nDialog\t405068\n西地兰\t405069\n畅思\t405070\n论坛管理员\t405071\n魔火\t405072\n20多名\t405073\nfit函数\t405074\nhustoj\t405075\n不顾\t405076\n武汉传媒学院\t405077\nVANS\t405078\n挖运\t405079\nSTORZ\t405080\nNeal\t405081\n广乐高速\t405082\ndressed\t405083\n县令\t405084\n芸苔素内酯\t405085\nsqli\t405086\ntowers\t405087\n1月26日\t405088\n提包\t405089\n面罩\t405090\n水煮鱼\t405091\n燕岭路\t405092\ndota2rpg\t405093\nKathleen\t405094\n稿箱\t405095\n12行\t405096\n14针\t405097\nAbout\t405098\ninvite\t405099\n尾狐\t405100\n义务教育制度\t405101\n睿意德\t405102\n墨染\t405103\n杜预\t405104\n担待\t405105\n类有\t405106\n20170515\t405107\n暂代\t405108\nResultSet\t405109\n苏格兰人\t405110\n杭州电视台\t405111\n梅柳\t405112\n鬼神
君\t405113\nnavigationItem\t405114\n糖炒栗子\t405115\n阿细\t405116\n29号\t405117\n凌辱\t405118\n空谷幽兰\t405119\n时年\t405120\n凿开\t405121\n大畜\t405122\n我的天\t405123\n说岳全传\t405124\n删节\t405125\nHUD\t405126\n嵌体\t405127\n我相信\t405128\n生动活泼\t405129\n克什克腾旗\t405130\nPCEVA\t405131\n上海国际赛车场\t405132\n西政\t405133\n讲文明\t405134\n第五\t405135\nZEMAX\t405136\n赵庄\t405137\n溺宠\t405138\n王春海\t405139\n【晓\t405140\nJr\t405141\n双子岛\t405142\n青城山下白素贞\t405143\n八十二\t405144\nloli\t405145\nWXML、WXSS\t405146\n29w\t405147\n划清界限\t405148\n25c\t405149\n泪液\t405150\n空心字\t405151\npanpan\t405152\n六分钟\t405153\n袁承志\t405154\nWardrobe\t405155\n杨玉良\t405156\n聚丁烯\t405157\n屎感\t405158\n扁桃体切除术\t405159\n产出率\t405160\n曹广福\t405161\n3072\t405162\nCK\t405163\nHotline\t405164\n猩便利\t405165\n王安平\t405166\n55_\t405167\n桐庐数字报\t405168\n美洲虎\t405169\n芹野莉奈\t405170\n机械设计制造及其自动化\t405171\n军品\t405172\nwinsxs\t405173\n宇多田光\t405174\n地球生\t405175\n战机\t405176\n蜜蜂箱\t405177\nssense\t405178\n本法\t405179\nfujian\t405180\n潜能生\t405181\n为何种\t405182\nclick事件\t405183\n兎\t405184\n北大经济学院\t405185\n广元新闻网\t405186\n东方不败风云再起\t405187\n悬丝\t405188\n华西证券股份有限公司\t405189\n视镜\t405190\n104集\t405191\n剑桥雅思\t405192\nyqcn\t405193\n人教版初一数学\t405194\n第25条\t405195\n新子\t405196\n标线仪\t405197\nblackmagic\t405198\n第6层\t405199\ncolumns\t405200\n营帐\t405201\n土豆视频\t405202\n炎黄春秋\t405203\n滨州市人民政府\t405204\nprediction\t405205\n厂门\t405206\n洗碗液\t405207\n中岳庙\t405208\n查帐\t405209\n31路\t405210\n信佑\t405211\n兴义镇\t405212\n容妃\t405213\n拈\t405214\n方差膨胀因子\t405215\n小闲人\t405216\n扣肉\t405217\n番茄田\t405218\nPrincess\t405219\n品友互动\t405220\n1363\t405221\n禁烟令\t405222\nLInux\t405223\n公益性\t405224\n矿棉\t405225\n石榴花\t405226\n轉貼\t405227\n四式\t405228\n3-7天\t405229\nVENETA\t405230\n嫁接苗\t405231\nmch\t405232\n放料阀\t405233\nХ\t405234\n张行长\t405235\n柳侑绮\t405236\n1906\t405237\nphpcmsv9\t405238\n华谊集团\t405239\n电势\t405240\n蹦蹦跳\t405241\n富人们\t405242\n观江\t405243\n经典之作\t405244\n花剌子模\t405245\n环梁\t405246\n1600亿美元\t405247\n十一周年\t405248\n圈名\t405249\n树链\t405250\nexcitation\t405251\nwsbs\t405252\n②\t405253\n金鹰天地广场\t405254\n极尽\t405255\n联众\t405256\n丽珠医药集团股份
有限公司\t405257\nquicktime\t405258\n绝命时钟2:22\t405259\ntechnician\t405260\n十二地支\t405261\n200万个\t405262\n雨鞋\t405263\n官庄村\t405264\n海东市人民政府\t405265\n畅销榜\t405266\n黄河游览区\t405267\n致谢\t405268\n孙浩俊\t405269\n玉镯\t405270\nKate\t405271\n冰史\t405272\nTHAN\t405273\n泛\t405274\n徐鹏\t405275\n朝前走\t405276\n一溪\t405277\n内敛\t405278\n走向世界\t405279\nSkirt\t405280\n先科\t405281\n烟台\t405282\n钱塘湖春行\t405283\n质量好\t405284\n中兴财光华会计师事务所\t405285\nEscalation\t405286\n尼安德特\t405287\n网易传媒\t405288\nWin10下\t405289\n巴博萨\t405290\n罗丹明\t405291\n横斜\t405292\n包装运输\t405293\n青年队\t405294\n力诺集团\t405295\n北京中国工商银行\t405296\n渺茫\t405297\n头晕眼花\t405298\n凯旋路\t405299\n竹铃\t405300\ng40\t405301\n定常\t405302\n海达郝\t405303\n姐妹网\t405304\n拉威尔\t405305\nWindows_Server\t405306\n串联质谱法\t405307\n一二三四五六七八九十\t405308\n群面\t405309\n同归\t405310\n歪脖\t405311\n蓝蛋\t405312\ngent\t405313\nI20\t405314\njournal\t405315\n手工艺人\t405316\n杂念\t405317\n东北摩托联\t405318\n10.1039\t405319\n夏目贵志\t405320\n东阳日报\t405321\n道友们\t405322\n1323\t405323\n毛血旺\t405324\n吴学文\t405325\n419号\t405326\nmagento2\t405327\n募兵\t405328\n名文\t405329\n第41关\t405330\n8.6\t405331\n潍县\t405332\n司太立\t405333\n悲愤\t405334\n拔刀\t405335\n网易号\t405336\n黄\t405337\n长输\t405338\n熙悦\t405339\nICANN\t405340\n重紫\t405341\n実験\t405342\nalway\t405343\n兴宁市人民政府\t405344\n欢乐喜剧人第三季\t405345\nwww.17dp.com\t405346\n东莞\t405347\n10把\t405348\n金沙县人民政府\t405349\nOrcad\t405350\n月牙岛\t405351\n甲醛味\t405352\n罗宾·威廉姆斯\t405353\n还息\t405354\n二手设备网\t405355\n闪电战2\t405356\n愚乐巴士\t405357\n水黾\t405358\n柔和\t405359\n小红书笔记\t405360\n中铁电气化局集团有限公司\t405361\nAWB\t405362\nVi\t405363\n雷克萨斯es\t405364\n宝隆\t405365\nkeyshot7\t405366\nmankind\t405367\n米雪儿\t405368\n法弗纳\t405369\nbaidumap\t405370\n6n\t405371\n广口瓶\t405372\n甩干机\t405373\n长相思\t405374\n王明亮\t405375\ndisperse\t405376\nSSRF\t405377\n大宅门吧\t405378\nAXI\t405379\n地质工程专业\t405380\n贵阳学院\t405381\n计算稿\t405382\n急回\t405383\n雏凤\t405384\n击鼓\t405385\n一专\t405386\n牵肠挂肚\t405387\n118分钟\t405388\n护笼\t405389\n独揽\t405390\nhegre\t405391\n雷达物位计\t405392\nTRIM\t405393\nMODA\t405394\nlinea\t405395\nvxlan\t405396\n一系\t405397\n30v\t405398\n夜访吸血鬼\
t405399\n乖\t405400\n中化国际\t405401\n星钻\t405402\n【电子税务局\t405403\n瑟琳娜\t405404\nbpi\t405405\nWiFi万能钥匙\t405406\nYC\t405407\n富国基金\t405408\n桂林电子科技大学\t405409\n海南白沙黎族自治县人民政府\t405410\nOIL\t405411\nihpone\t405412\n21k\t405413\n建筑工程公司\t405414\n中国金融界网\t405415\n黄晓明\t405416\nifix\t405417\nNigh\t405418\nnvidia显卡驱动\t405419\nxq\t405420\n上海航天技术研究院\t405421\n中场休息\t405422\n26首\t405423\n杜杜\t405424\n考拉网\t405425\n吴英杰\t405426\nJEANS\t405427\n调剂\t405428\n希灵主神\t405429\n普刊\t405430\nautogen\t405431\n东凌\t405432\nMikocon\t405433\n鬼水\t405434\n厦门港口管理局\t405435\nJalan\t405436\n塞哥维亚\t405437\nlumion8\t405438\n欣旺达\t405439\n掐丝\t405440\n吃西瓜\t405441\n海贼无双\t405442\n篮球赛\t405443\n十人\t405444\ngt73\t405445\n惨痛\t405446\n绝顶\t405447\nZETA\t405448\n人性本善\t405449\n抹光机\t405450\n港股公司\t405451\n神秘村\t405452\nHbase\t405453\n大连船舶重工集团有限公司\t405454\n168路\t405455\n宪政\t405456\n森歌集成灶\t405457\n军妓\t405458\n沿河大道\t405459\n英国帝国理工学院\t405460\n家法\t405461\n转念\t405462\n天守\t405463\n大家一起玩\t405464\n海空卫士\t405465\nCRAZY\t405466\n静电纺丝\t405467\nIefans\t405468\nYun\t405469\n司筒\t405470\n中国饲料行业信息网\t405471\n蛋白质谱\t405472\n蒋先生\t405473\n国际中心\t405474\n暮光之城1\t405475\nWhich\t405476\n庆阳网\t405477\nFEELING\t405478\n下行文\t405479\nsyslog\t405480\n温商\t405481\n民法学\t405482\n卓别林\t405483\n拉维\t405484\n中国铁建重工集团有限公司\t405485\n上海翻译公司\t405486\n商友们\t405487\n税基\t405488\n过硫酸氢钾\t405489\nBahn\t405490\nLJ\t405491\n东方中学\t405492\npurpose\t405493\n感叹号\t405494\n大专学\t405495\n大梦杯\t405496\nbwl\t405497\n青芒\t405498\n忍让\t405499\n求学\t405500\n金缕曲\t405501\n张北坝上草原\t405502\n各得\t405503\n沧州运河\t405504\n10个多月\t405505\nDay\t405506\n人区\t405507\nbartlett\t405508\n陆佳\t405509\n窃喜\t405510\n亿酷\t405511\ne5700\t405512\nSense\t405513\n张之路\t405514\n生态圈\t405515\n陕西广播电视大学\t405516\n五刑\t405517\ndong\t405518\n何荣\t405519\n三十四\t405520\n层数\t405521\n双行\t405522\n能量型\t405523\n诱多\t405524\n有道网页翻译2.0\t405525\n青罗\t405526\nArduino论坛\t405527\n访达\t405528\nFPS游戏\t405529\nParticular\t405530\n大管\t405531\n少喝水\t405532\n中航健身会\t405533\n生源地\t405534\n重耳\t405535\n腾讯视屏\t405536\n良港\t405537\n1844年\t405538\n干竹笋\t405539\n横向页\t405540\n神州智达\t40
5541\n北九州市\t405542\n存活期\t405543\n我的世界虚无世界2\t405544\nHBO\t405545\n杨雨婷\t405546\n红痛\t405547\n臭\t405548\n中册\t405549\nMagi\t405550\n对牛\t405551\ntar.gz\t405552\n和信\t405553\n书愤\t405554\ngd库\t405555\nConcur\t405556\nコキ\t405557\n博乐市\t405558\n珠海市地方税务局\t405559\n2018年四月份\t405560\n济南市市\t405561\njalan\t405562\n万梓良\t405563\n大庆东\t405564\n减速板\t405565\n罗马人\t405566\n刘子玥\t405567\n3星级\t405568\n陈勇\t405569\n广粤\t405570\n突然的自我\t405571\n整理术\t405572\n天猴\t405573\nsipo\t405574\n云燕\t405575\nggg\t405576\n扩\t405577\n挪作\t405578\n象湖\t405579\n墨脱县\t405580\n河源市政府\t405581\n极坐标图\t405582\n图雅诺\t405583\n滔滔不绝\t405584\n炳胜\t405585\n蛋白石\t405586\n傲虎论坛_汽车之家论坛\t405587\n正厅\t405588\n500斤\t405589\n门头\t405590\n海蒂\t405591\n云锦\t405592\n两撇\t405593\n艾利丹尼森\t405594\n德宏\t405595\nstepped\t405596\nceping\t405597\n孙军\t405598\n登上\t405599\nmd2\t405600\n失业保险费\t405601\n出险\t405602\n2016年5月19日\t405603\n维埃拉\t405604\n韩承毅\t405605\nWN\t405606\n超线程\t405607\n中华人民共和国商务部\t405608\nretrying\t405609\n合模\t405610\n后稷\t405611\n23.com\t405612\n用友软件股份有限公司\t405613\n广物地产\t405614\n时量\t405615\nModal\t405616\n大水牛\t405617\n列维\t405618\n中心卫生院\t405619\n坎贝尔\t405620\n电脑验光仪\t405621\n精神抖擞\t405622\n开沙岛\t405623\n2.8mm\t405624\n心痛\t405625\n李跃儿\t405626\n叠底\t405627\nSummertime\t405628\n金印\t405629\n屋塔房王世子\t405630\n独板\t405631\n方盛制药\t405632\n有口福\t405633\n_客\t405634\n景界\t405635\n肖老师\t405636\n葛大夫\t405637\n封盖机\t405638\n再听\t405639\n内庭\t405640\n太友\t405641\n铁哥\t405642\n连心卡\t405643\n八大处整形医院\t405644\n修复乳\t405645\n0731\t405646\ntelegraf\t405647\n李锦斌\t405648\n小百合\t405649\npx\t405650\nwww.ss19.cn\t405651\n北京陶然亭\t405652\nappendChild\t405653\n小姜\t405654\n人大附\t405655\n酷喵\t405656\n电离常数\t405657\n秘源\t405658\nfg\t405659\nhoka\t405660\n荣盛城\t405661\n喷血\t405662\npowerDesigner\t405663\n渤龙湖\t405664\n学画画\t405665\n边孔\t405666\n稀奇古怪\t405667\n中电网\t405668\n涡流管\t405669\n选择券\t405670\nonly_eVonne\t405671\nExperimental\t405672\n重庆富民银行\t405673\nemonda\t405674\n贺鹏飞\t405675\n3.75\t405676\n杀人者的记忆法\t405677\n葛尔丹\t405678\n纳加\t405679\n糖方\t405680\n射盘\t405681\n顺风快递\t405682\n万达公司\t405683\n隔着\t405684\n财政支出\t405685
\nStudying\t405686\nRheinland\t405687\n全员劳动生产率\t405688\nMINT\t405689\n自揭\t405690\n宗庆\t405691\n1.28\t405692\n联合作战\t405693\n东面\t405694\n中审网校\t405695\n想吃\t405696\n铺巾\t405697\nNPC\t405698\n休牧\t405699\nxmanger\t405700\n天津市人民政府国有资产监督管理委员会\t405701\nsnapdragon\t405702\n琦君\t405703\n爱爱\t405704\n挂票\t405705\n21cn.com\t405706\n密保卡\t405707\n601069\t405708\nk555l\t405709\n组名\t405710\n劳卜\t405711\nPHPcms\t405712\ntaiji\t405713\nbells\t405714\n练船\t405715\n储物架\t405716\n【地下城与勇士吧\t405717\n处方\t405718\n简笔画法\t405719\n黑鸭子\t405720\nwitz\t405721\n小田\t405722\n1.5G\t405723\niFrame\t405724\n上床\t405725\n中银国际证券\t405726\n上海新闻网\t405727\nAcquisitions\t405728\n太空堡垒\t405729\n单枪匹马\t405730\n流士\t405731\n收银管理系统\t405732\n凤凰山公园\t405733\n第三方特约险\t405734\n王老汉\t405735\n碎石机\t405736\n震耳欲聋\t405737\n周琴\t405738\n上海中国工商银行\t405739\n先行词\t405740\n正点\t405741\n泰囧\t405742\n松谈\t405743\nmvn\t405744\n团建\t405745\n来吧\t405746\nUC论文网\t405747\n不假\t405748\n西游释厄传2\t405749\n帅才\t405750\n讯录\t405751\n陈幸同\t405752\n可得\t405753\nbml\t405754\n冬瓜汤\t405755\n潮州\t405756\n反抗军\t405757\n头皮屑\t405758\ndataType\t405759\n马面鱼\t405760\n1500平\t405761\n太平洋车友会\t405762\n微众\t405763\n罗迪\t405764\n原木门\t405765\n德比\t405766\n体育部\t405767\n奇兵\t405768\n1000平方米\t405769\n阴霾\t405770\n吸引力\t405771\n湖州师范学院\t405772\n非编码RNA\t405773\nLII\t405774\n鹿血\t405775\n波尔布特\t405776\n守望先锋\t405777\n页头\t405778\n预制梁\t405779\n辽政\t405780\n澳门永利酒店\t405781\nzhejiang\t405782\n重用\t405783\n18mm\t405784\n格桑花开\t405785\nplesk\t405786\n在校生\t405787\nspreadsheet\t405788\n培训费\t405789\n渑池吧\t405790\n未来出版社\t405791\n亮晶晶\t405792\npancreatic\t405793\nErrors\t405794\n深圳市地方税务局\t405795\n三剂\t405796\n落锤\t405797\n接送机\t405798\n康佳智能电视\t405799\n320li\t405800\n7.3.2\t405801\n股票类\t405802\n三协\t405803\n火石\t405804\n三浦友和\t405805\n行政区划\t405806\nprovisional\t405807\nbattlerite\t405808\n狂\t405809\n西安世博园\t405810\n摩力克\t405811\nWarframe\t405812\n研究员级\t405813\n碗扣式\t405814\n回车符\t405815\n缠绳\t405816\nlinprog\t405817\n次子\t405818\n新板\t405819\n俄克拉荷马州\t405820\n咀嚼\t405821\n君临\t405822\n操练\t405823\n大皇帝\t405824\n云野\t405825\n1个小时\t405826\nRambo\t
405827\n宁夏回族自治区商务厅\t405828\n司法部门\t405829\n4.8%\t405830\n南昆山\t405831\n柔韧度\t405832\n巴方\t405833\n7派\t405834\n哈农\t405835\ngongan\t405836\n弄口\t405837\n普瑞特\t405838\n忧国\t405839\n幻城\t405840\naei\t405841\n津门人才网\t405842\n金呗\t405843\n老毛桃u盘启动盘\t405844\n管理世界\t405845\n巩义\t405846\n重庆市涪陵区人民政府\t405847\n永久免费版\t405848\n翠屏城\t405849\n梵客\t405850\n新洲村\t405851\n日本队\t405852\n魔女复仇之夜\t405853\n上海地\t405854\n顺安镇\t405855\n夫妻那些事\t405856\n田向利\t405857\n香影\t405858\n爆了\t405859\n世代\t405860\nwso2\t405861\n一景\t405862\naoao\t405863\nmiit\t405864\n不晓得\t405865\n狼哥\t405866\n此言\t405867\n琳\t405868\nRai\t405869\n沙机\t405870\n热血江湖手游\t405871\nmultiindex\t405872\nGazebo\t405873\n威锋源\t405874\n坊主\t405875\n招标人\t405876\nMo\t405877\n大悦城\t405878\nMIM\t405879\n奔驰gla200\t405880\n辽宁机电职业技术学院\t405881\nhdsky\t405882\n军武次位面\t405883\n钼铁\t405884\n凝汽器\t405885\ngetlasterror\t405886\n扫频\t405887\n中国大唐集团公司\t405888\nproteus\t405889\n53货源网淘宝大学\t405890\n嘎嘣\t405891\n甩头\t405892\n渔我同行\t405893\n富阳19楼\t405894\nERC20代币\t405895\n龙跃\t405896\nplane\t405897\n王飞\t405898\n划船\t405899\n铁耗\t405900\n粉笔画\t405901\n重固镇\t405902\n底坑\t405903\nebay吧\t405904\n175名\t405905\n多谢了\t405906\n花框\t405907\n相州\t405908\n阿玛拉王国\t405909\nSilent\t405910\nPantsu\t405911\n硅胶管\t405912\n510701.cn\t405913\n出使\t405914\nIndicator\t405915\n海宁路\t405916\n洽川\t405917\nYaml\t405918\n威达\t405919\n吸客\t405920\n管教所\t405921\n利益集团\t405922\n编页\t405923\n1.10.2\t405924\n三岛\t405925\n拇指\t405926\n跑镖\t405927\n教育在线\t405928\n美兰机场\t405929\nlize\t405930\n病毒式营销\t405931\n曹素功\t405932\n筝\t405933\n猛然\t405934\nEmpress\t405935\n众多\t405936\n绝缘漆\t405937\n恐高\t405938\n粤语版\t405939\n郑小超\t405940\nQualitative\t405941\n天生不对\t405942\n代表\t405943\n中关村西区\t405944\n光敏三极管\t405945\n紫叶酢浆草\t405946\n高门\t405947\ndataload\t405948\nMATCH\t405949\nV6.6\t405950\niPhone6P\t405951\n上海科匠信息科技有限公司\t405952\n世界征服者3吧\t405953\nmetropolis\t405954\n勤杂工\t405955\n0.02\t405956\n济南奥体中心\t405957\n复活节\t405958\n绕口\t405959\n丹香\t405960\n亚甲\t405961\nMoments\t405962\n大城府\t405963\n大福\t405964\n一代神\t405965\n14.2\t405966\n甘家寨\t405967\n云岚\t405968\n上河村\t405969\n大排档\
t405970\n麦莉·赛勒斯\t405971\n例假\t405972\n审计经理\t405973\n爱民路\t405974\nDocer\t405975\n红色警戒吧\t405976\n萧山日报数字报\t405977\noffice2018\t405978\n黄新\t405979\n提要\t405980\nMP4\t405981\numn\t405982\n笑探\t405983\n铁骨\t405984\n盲兽\t405985\n佼佼者\t405986\nsane\t405987\n艾谱\t405988\nCompiler\t405989\n玄子\t405990\n联连\t405991\n亨利卡维尔\t405992\n报关\t405993\nDisability\t405994\n欧阳卫民\t405995\n市邮政管理局\t405996\n中共中央宣传部\t405997\n留抵\t405998\n成婚\t405999\n标高\t406000\nywwuyi\t406001\n养生坊\t406002\n批量生成\t406003\n足轻\t406004\n方幂\t406005\n巴蜀中学\t406006\n华录\t406007\n沁水县政府\t406008\n2017-04-29\t406009\n遐\t406010\nFoundation\t406011\n孙煜\t406012\nastm\t406013\n小雏菊\t406014\n罗非\t406015\nfindone\t406016\n细条\t406017\n桂林山水甲天下\t406018\npartitioned\t406019\n8.3\t406020\n升降级\t406021\n经络穴位网\t406022\n读出\t406023\n商业大厦\t406024\n黙\t406025\n齐神\t406026\n八歧\t406027\n牧场主\t406028\n智能柜\t406029\n官道\t406030\n喧嚣\t406031\n弧气\t406032\nPOPO\t406033\n感召\t406034\nFrozenUI\t406035\nTermux\t406036\n发电场\t406037\ndanger\t406038\npymysql\t406039\n狮峰\t406040\n贼道三痴\t406041\n营业证\t406042\n西格\t406043\n斜面体\t406044\n初班\t406045\nTIGERB\t406046\n上海实验学校\t406047\n烷基化\t406048\n韩素\t406049\n十尾\t406050\n教学论\t406051\n军妻\t406052\n恶性砍人\t406053\n太田胃散\t406054\n公共卫生间隔断\t406055\n替格瑞洛片\t406056\n享福\t406057\n撩骚\t406058\n货物品\t406059\n全族\t406060\n小结\t406061\nKimmel\t406062\n讨贼\t406063\n佐餐\t406064\n杜家村\t406065\n浮水\t406066\n宿城区\t406067\n高鼻\t406068\n对端\t406069\n金卡戴珊\t406070\n攸县\t406071\n有女\t406072\n诺基亚7P\t406073\n黄帅\t406074\n招警考试\t406075\nMd5\t406076\n紫玫瑰\t406077\n九几年\t406078\n驮\t406079\n朝阳区政府\t406080\n杭州市民政局\t406081\nwWw\t406082\nprotues\t406083\n瓦格纳\t406084\n杂菜\t406085\n阳光工程心理网\t406086\n超杀女\t406087\n风荷载\t406088\n_华\t406089\n刮泥机\t406090\n滞后\t406091\n王侠\t406092\n北京网站建设公司\t406093\n面部吸脂\t406094\n菊花枸杞茶\t406095\n北京灵山\t406096\n性别歧视\t406097\n组队\t406098\n比作\t406099\n18岁\t406100\n工商管理\t406101\n方米\t406102\n收受\t406103\namesim\t406104\nsinB\t406105\n密码\t406106\n疲劳试验机\t406107\n亡灵\t406108\n旧年\t406109\nelliptic\t406110\n2018年2月12日\t406111\n宜居畅通卡\t406112\n紫草油\t406113\n跨站脚本攻击\t406114\n电制\t406115\n
1384\t406116\n八十四\t406117\n排风\t406118\nnuface\t406119\n新华电脑学校\t406120\ndbo\t406121\nFBG\t406122\n教育公共基础知识\t406123\n积放\t406124\n管机\t406125\n365个\t406126\nmaths\t406127\n顺丰\t406128\n匡亚明班\t406129\nprivoxy\t406130\n侵略者\t406131\n勤务兵\t406132\n喜欢你\t406133\n闲暇\t406134\ncc2540\t406135\nobjects\t406136\n滤水器\t406137\n固定型\t406138\ngofair\t406139\n梁艳\t406140\n熔岩龙\t406141\noccupied\t406142\nExecution\t406143\n纯点\t406144\n叶皮\t406145\n福尔摩斯罪\t406146\n微信实名认证\t406147\nshiju\t406148\n市管\t406149\n四面台\t406150\n昌东镇\t406151\n蟒\t406152\n耶里夏丽\t406153\nNEW\t406154\n傍边\t406155\n甬台温高速公路\t406156\n四川工商学院\t406157\n开摘\t406158\n万科金域缇香\t406159\n前海人寿保险股份有限公司\t406160\n天津市交通运输委员会\t406161\nMetArt\t406162\n税务业\t406163\nworld\t406164\n运管所\t406165\n原币\t406166\n中信国安集团\t406167\n闽\t406168\n2018年4月26号\t406169\n小桃子\t406170\n小黑子\t406171\nnbr\t406172\n一氧化二氮\t406173\n国家卫健委\t406174\neo\t406175\n范表\t406176\n优思明\t406177\n鉴赏题\t406178\n开演\t406179\n伊藤润\t406180\n外台\t406181\n周蕾\t406182\n17500.cn\t406183\n2A\t406184\n欧尔\t406185\n7.9%\t406186\n特服\t406187\n451度\t406188\n红外报警器\t406189\nsheji\t406190\n航片\t406191\n50场\t406192\n约会学\t406193\nMIUI9.5\t406194\nDivers\t406195\nEfex\t406196\n中控区\t406197\n中交基础设施养护集团有限公司\t406198\n跳汰\t406199\n色戒\t406200\ncanadian\t406201\n2101\t406202\nALAC\t406203\n价态\t406204\n龙报\t406205\ndat格式\t406206\n三桩\t406207\n莲花大厦\t406208\n年均复合增长率\t406209\n青岛大剧院\t406210\n小凉\t406211\n118名\t406212\n博实乐\t406213\nHDR10\t406214\n鬼灯的冷彻\t406215\n郭超\t406216\n正安县人民政府\t406217\n火凤凰\t406218\n账单日\t406219\n我们的快乐人生\t406220\n504号\t406221\n造纸厂\t406222\n3500万元\t406223\n金丙\t406224\n通融\t406225\n六识\t406226\nKingston\t406227\n长沙湘雅医院\t406228\n需提交\t406229\n健身气功八段锦\t406230\n湖南省政府\t406231\n游戏攻略大全\t406232\n简形\t406233\n实人\t406234\nBet\t406235\n世界机器人大会\t406236\nPOE\t406237\n炒汇\t406238\nレ\t406239\n八道\t406240\n张德福\t406241\n赛钛客\t406242\n清湖\t406243\nAnliven\t406244\n张姓\t406245\n三措\t406246\n羽墨\t406247\n我的女孩\t406248\nMARK\t406249\n诛仙法医狂妃\t406250\n铁总\t406251\n科宁\t406252\nwithme\t406253\npix\t406254\n天王殿\t406255\n中国科学院水利部\t406256\n辛烷值\t406257\n宾州\t40
6258\n威海经区\t406259\n华府大道\t406260\n【罗技\t406261\n杨安娣\t406262\n赛太克\t406263\n赢政\t406264\n崔丽\t406265\n民营科技园\t406266\n片片\t406267\n地摊\t406268\n综英\t406269\n德胜新村\t406270\n张光明\t406271\n合肥市政府\t406272\ntsa\t406273\n伯伯\t406274\nPrima\t406275\nI1\t406276\n未寒\t406277\n新象\t406278\n湾流国际青年社区\t406279\n宝信软件\t406280\n超级银河大怪兽格斗\t406281\n微课堂\t406282\n中国邮政\t406283\n洞察\t406284\n胶州经济技术开发区\t406285\n糖水片\t406286\n福建省农村信用社联合社\t406287\n12季\t406288\n偏北\t406289\n彩虹岛Online\t406290\n黑社会性质组织罪\t406291\n柯迪\t406292\n蒲公英根茶\t406293\nencouragement\t406294\nmh370\t406295\n宋\t406296\n银屑病\t406297\n我的国\t406298\n滑窗\t406299\n鸿昌\t406300\n吴钢\t406301\n比哈尔邦\t406302\n溺\t406303\n建安工程费\t406304\nE-Learning\t406305\n拉结\t406306\nbeautifulsoup4\t406307\nframwork\t406308\n二十二岁\t406309\nnotation\t406310\n凄清\t406311\n杨欢\t406312\nSWUPL\t406313\nv8.3.0\t406314\n卢波\t406315\n贵史\t406316\nhust\t406317\n法定结婚年龄\t406318\n专门史\t406319\n学慧网\t406320\n洗澡堂\t406321\n氨气\t406322\n鱼我所欲也\t406323\n菲利亚\t406324\n圣君\t406325\n上海柏州科教设备有限公司\t406326\n撞\t406327\n南方泵业\t406328\npeeping\t406329\n国家质量监督检验检疫总局\t406330\n万题\t406331\n颜文字君\t406332\nplus\t406333\n1r\t406334\n玉趾\t406335\n萧山小学\t406336\n卡粉\t406337\n水印相机\t406338\nTechTarget\t406339\n中江县\t406340\n磁业\t406341\n一汽森雅\t406342\n爆灯\t406343\nimba\t406344\nf-8180483\t406345\n上海政法学院\t406346\n糖筛\t406347\nSat\t406348\n蝴蝶树\t406349\nhelm\t406350\n1764\t406351\ntriton\t406352\n缘木\t406353\n前因后果\t406354\n亲切\t406355\nEmployment\t406356\n49码\t406357\n刘倩倩\t406358\n执迷\t406359\n催眠曲\t406360\n鲁卫\t406361\n一年365天\t406362\n柏森\t406363\n秘石\t406364\n20180123\t406365\n兰嘉丝汀\t406366\n周峰\t406367\n頂\t406368\ny2\t406369\n桌板\t406370\n第4篇\t406371\nc600\t406372\nNetkeeper\t406373\n荣达\t406374\n宁波城隍庙\t406375\n高平路\t406376\n飯\t406377\n学位证\t406378\n尼玛范爷\t406379\nBora\t406380\n正北方网\t406381\npipo\t406382\n湖北教育考试网\t406383\n微伤\t406384\n4880\t406385\n遇敌\t406386\nSectionA\t406387\nmbar\t406388\n上半天\t406389\nFICO\t406390\n罗星街道\t406391\n团职\t406392\ng3220\t406393\n十多天\t406394\nDev-c++\t406395\n爱美女性网\t406396\nPSD素材\t406397\n卒姆托\t406398\n纽约州立大学石溪分校\t406399\
n栖霞建设\t406400\n淑熙\t406401\n悠风\t406402\n何厝\t406403\n自治区人民政府办公厅\t406404\n西门子助听器\t406405\n东夷\t406406\n何川\t406407\n名都花园\t406408\n包片\t406409\n森德\t406410\n冷头\t406411\n亿级\t406412\ncluster\t406413\n斯柯达昊锐\t406414\n共存亡\t406415\nLINE\t406416\n绿化工程\t406417\nMDA\t406418\n袁峰\t406419\n氯氮平\t406420\n13段\t406421\n四量\t406422\n后知后觉\t406423\n开心剧乐部\t406424\n海盗旗\t406425\n徐嘉\t406426\n爆闪灯\t406427\n无痛肠镜\t406428\n河南省水利厅\t406429\n名作家\t406430\nwein\t406431\n嘉峪关方特欢乐世界\t406432\n财税网\t406433\n暖胃\t406434\n仿造\t406435\n四大名著\t406436\n好心人\t406437\n西交大\t406438\neclip\t406439\n教\t406440\n远东控股集团\t406441\n银杉\t406442\n增生\t406443\n皇陵\t406444\n阴历\t406445\n郝仁\t406446\n水球\t406447\n出仕\t406448\n衬衫裙\t406449\n线衣\t406450\n20&\t406451\n东鹏洁具\t406452\n栖霞区\t406453\n秋海棠\t406454\n天元突破\t406455\n灵武帝尊\t406456\n三毫米\t406457\n格林斯潘\t406458\n德蒙\t406459\n奥特曼格斗进化0\t406460\n江苏代表团\t406461\nw3wp.exe\t406462\n迪锐克斯\t406463\n巨擎\t406464\n军政\t406465\n邻人\t406466\nfixture\t406467\n潮性\t406468\n微学苑\t406469\n渠首\t406470\n美卡\t406471\n武术\t406472\n为人师表\t406473\n_图那丁吧\t406474\n原酒\t406475\nMSP430G2553\t406476\n先贤\t406477\n复方樟脑乳膏\t406478\n警示价\t406479\nexe-CSDN\t406480\n德胜路\t406481\n秒表\t406482\n5252\t406483\nGAS\t406484\nopengapps\t406485\narcsin\t406486\n星学\t406487\n太原高新区\t406488\n双J管\t406489\n杨浦时报\t406490\n中国人民政治协商会议陕西省委员会\t406491\n炼妖\t406492\n天下网吧论坛\t406493\n以流\t406494\n欧标\t406495\n眼癌\t406496\n皇权\t406497\n三国梦想无惨系列漫画全集\t406498\n最终话\t406499\n陈金柱\t406500\n卖客\t406501\nImmuno\t406502\n江苏省工商行政管理局\t406503\nqps\t406504\nTrash\t406505\n一小片\t406506\nDirk\t406507\n105名\t406508\n大故宫\t406509\n漆黒\t406510\n手足口疫苗\t406511\n奇瑞瑞虎5\t406512\n机动战士高达:铁血的奥尔芬斯\t406513\n白模\t406514\n影背\t406515\n度过\t406516\n8件\t406517\n1980号\t406518\n林梓\t406519\n1895\t406520\n甘肃酒泉\t406521\n轻吻也飘然\t406522\n陆星材\t406523\n武易\t406524\n径流\t406525\n李琰\t406526\nFDI\t406527\n绿豆糕\t406528\n中小学名\t406529\n新安全生产法\t406530\n日本政府\t406531\n荀\t406532\n官家\t406533\n_邻\t406534\n荣耀S6\t406535\n苦果\t406536\n伤不起\t406537\nWix\t406538\n曾勇\t406539\n外汇汇款\t406540\ndiscrepancy\t406541\n绕线式\t406542\n美图秀秀\t406543\n莱芜战役\t406544\n青
云直上\t406545\nolay\t406546\n张佳宁\t406547\n福建省省\t406548\n飞鱼\t406549\n58&\t406550\nv1.3.6\t406551\n太极张三丰\t406552\n横沥镇\t406553\ndoc\t406554\n榎本\t406555\n0746\t406556\nsailor\t406557\n21倍\t406558\n圣诞老人村\t406559\n拉卡\t406560\nメ\t406561\n大兴机场\t406562\n虚拟机版\t406563\nmockplus\t406564\n蓬荜\t406565\nsuta\t406566\nBTR\t406567\nr22\t406568\n克达拉市\t406569\n眼带\t406570\n农杆菌\t406571\n实线\t406572\n临港区\t406573\n斐珞尔\t406574\n飞行区\t406575\n借光\t406576\n唐鹤德\t406577\nmindset\t406578\nLQ-1600K\t406579\n盛明兰\t406580\nwrk\t406581\n卡通\t406582\n除味\t406583\n涡\t406584\n圆边\t406585\n1kg\t406586\n淮北师范大学\t406587\nUU网游加速器\t406588\n绿汀路\t406589\n南瓜\t406590\nRates\t406591\n梅花节\t406592\nmapbox\t406593\n王圣\t406594\nlibreoffice\t406595\n中英关系\t406596\n美学家\t406597\n列及\t406598\nGX501\t406599\n稻香酒家\t406600\n班昭\t406601\n死链接\t406602\n培训基地\t406603\n澳亚卫视\t406604\n沙特基础工业公司\t406605\n中国小动物保护协会\t406606\nMDPDA\t406607\n火影剧场版\t406608\nreadyState\t406609\n小三角形\t406610\n群桩\t406611\n下酒菜\t406612\n重庆酒店\t406613\n楸\t406614\n新学\t406615\n工点\t406616\n千寻学术网\t406617\n社会\t406618\nJpaRepository\t406619\n刘尚希\t406620\n人事处\t406621\n120码\t406622\n魔术快斗\t406623\n杨延昭\t406624\n体置\t406625\n铜雀\t406626\n国际自然保护联盟\t406627\njs库\t406628\n靛蓝\t406629\n签字板\t406630\n江起云\t406631\n金丝雀码头\t406632\nPurifier\t406633\n玉魂\t406634\nJIRA\t406635\n中联水泥\t406636\n德奥通航\t406637\n姬无命\t406638\ngank\t406639\n2311\t406640\n192.168.11\t406641\n2018041\t406642\n600196\t406643\n奔驰cls\t406644\nOBJ格式\t406645\n鸟洞\t406646\n抛却\t406647\n电导仪\t406648\n笑傲\t406649\n文隽\t406650\n隐约\t406651\n躺\t406652\n原方\t406653\n新梁山伯与祝英台\t406654\nec180\t406655\n送葬\t406656\n长沙路\t406657\n同心\t406658\n阿滴\t406659\nEAORON\t406660\n光谷大道\t406661\nLoRa\t406662\nx9000e\t406663\n清瞳\t406664\n腱子\t406665\n惹爱\t406666\n思盈\t406667\n范迪安\t406668\nimagex\t406669\n贝斯\t406670\n飞鱼星\t406671\n150万吨\t406672\n丁子\t406673\n信任\t406674\n24话\t406675\n苏舜钦\t406676\nLaFleur\t406677\n定义法\t406678\n惠州市房产管理局\t406679\nFami通\t406680\n玫瑰花茶\t406681\n中化石油\t406682\n南孚\t406683\n壹圆\t406684\n深谷\t406685\nlinear\t406686\n短体\t406687\nzico\t406688\n爱弹幕\t406689
\n联新\t406690\n七月七\t406691\n肿疼\t406692\n机题\t406693\n美尚\t406694\nifame\t406695\nimpossible\t406696\n陆雪琪\t406697\n肉食者\t406698\nloki\t406699\n油尖\t406700\n康师傅控股有限公司\t406701\n板底\t406702\n夸赞\t406703\n离岛免税政策\t406704\n分动器\t406705\nPhotosmart\t406706\n一望\t406707\nAgenda\t406708\n研华\t406709\n改不了\t406710\n财达证券\t406711\n红吕\t406712\n丝绸之路国际电影节\t406713\n无印良品MUJI\t406714\nglossmen\t406715\n厨艺\t406716\n吴仁宝\t406717\n地方保护主义\t406718\n卡斯比\t406719\n中南标\t406720\n学前教育\t406721\n中链\t406722\n克里姆林宫\t406723\nxp系统\t406724\n福建地区\t406725\n莫拉蒂\t406726\n恒电位仪\t406727\n陋习树\t406728\n金考典\t406729\n智联招聘人才网\t406730\n出关\t406731\n10.21\t406732\n4.1.1\t406733\n安纳塔拉\t406734\n硝化\t406735\n中国软件国际\t406736\n三四线\t406737\n文怡\t406738\n芜明\t406739\n派出所长\t406740\n58商宝网\t406741\n芦台经济开发区\t406742\n王志李\t406743\nmajority\t406744\n白展堂\t406745\n0.49\t406746\n0596\t406747\n瑞秋\t406748\n桂发\t406749\n保安员\t406750\n王集\t406751\n东乌珠穆沁旗\t406752\n仙村\t406753\n仙人洞\t406754\n党委会\t406755\n长治久安总\t406756\n币值\t406757\n英明\t406758\ncollect2\t406759\n大米袋\t406760\n冰滑\t406761\n试剂柜\t406762\n取义\t406763\n吴奇隆\t406764\niphone6SE\t406765\n木兰词\t406766\n南开大学图书馆\t406767\n雷带\t406768\n荣耀s10\t406769\ninfo5\t406770\nJACK\t406771\n窄门\t406772\n整流电路\t406773\n3134\t406774\n6203\t406775\n公开工\t406776\nzoozoo\t406777\n北京银监局\t406778\nbella\t406779\n仙剑5\t406780\n衷情\t406781\n权律\t406782\n中央型\t406783\n山东省民政厅\t406784\naura\t406785\nksort\t406786\n山东师范大学》\t406787\n浪矢\t406788\n潘森\t406789\nOrchestra\t406790\nr3d\t406791\n格罗斯\t406792\nPMU\t406793\n北陵公园\t406794\n海南科技职业学院\t406795\n几立方\t406796\n秀水镇\t406797\n180G\t406798\nmho\t406799\n碧溪\t406800\n差压\t406801\n新专\t406802\n电动滑板车\t406803\n青山路\t406804\n苏婉\t406805\n鲁鹏\t406806\nFLT\t406807\n赛车手\t406808\n农怪\t406809\n李昌平\t406810\n天门\t406811\n沂南\t406812\njdk7\t406813\n72册\t406814\n上海大华\t406815\n毁于随\t406816\n友阿股份\t406817\n192.168.10.0\t406818\n学习单\t406819\n杭州牙科医院\t406820\n坠亡\t406821\n朱丹威尔史密斯\t406822\n龙套\t406823\ntally\t406824\n即所得\t406825\n桐梓\t406826\nAlready\t406827\n酒狂\t406828\n亭林镇\t406829\n上海思博职业技术学院\t406830\n四五十岁\t406831\n烧饼修改器\t406832\n长裙\t4
06833\n木铎\t406834\n酒柜\t406835\n勿谓\t406836\n枭宠\t406837\n东方铁塔\t406838\nemouse\t406839\n冯浩\t406840\n国酒\t406841\n工程网\t406842\n拔根\t406843\n02120\t406844\n生肌\t406845\ninvisible\t406846\n稻城亚丁\t406847\n性子\t406848\n丝棉\t406849\n沈光耀\t406850\n彭格列\t406851\n中国软件与技术服务股份有限公司\t406852\n折叠屏\t406853\n周云杰\t406854\n文圣\t406855\n植酸酶\t406856\nzjg\t406857\n免税政策\t406858\n东郊椰林\t406859\nsessio\t406860\n烧烤场\t406861\n金立m6plus\t406862\n泥塘\t406863\ntrees\t406864\nPASV\t406865\n圣耀十字架\t406866\n有财\t406867\n香港国际金融中心\t406868\nGT-2000\t406869\n5599\t406870\n憾事\t406871\n枫芸志\t406872\nsees\t406873\n6组\t406874\n江南大学附属医院\t406875\n沧州一中\t406876\n伴娘们\t406877\n容量费\t406878\n贝卢斯科尼\t406879\n金证\t406880\nCtrl\t406881\nCosDNA\t406882\n勇者无敌\t406883\n徐州观音机场\t406884\n弓弩\t406885\n重选\t406886\n铬锆铜\t406887\nSuggest\t406888\n荆州新闻网\t406889\n三元锂\t406890\n退\t406891\nCarina\t406892\n仍\t406893\n青狮\t406894\n827\t406895\n推举\t406896\nHistorian\t406897\ngqsunrise\t406898\n橙线\t406899\n冷然\t406900\n空列\t406901\nDVI接口\t406902\n看图\t406903\n末位淘汰制\t406904\nPMBOK\t406905\nbex\t406906\n周元\t406907\n调酒\t406908\nyou百度云\t406909\n170ml\t406910\nwarfarme\t406911\n朱颖\t406912\n省市委\t406913\n倒流防止器\t406914\n橡皮圈\t406915\n新卡\t406916\n成美\t406917\n药卷\t406918\n英拉\t406919\njdeveloper\t406920\n管党\t406921\nota\t406922\nIT人生录\t406923\n霍尼韦尔(中国)有限公司\t406924\n石马\t406925\n森泰\t406926\n浙江学前教育网\t406927\n狼獾\t406928\n舒舒\t406929\n900000\t406930\nwifi分析仪\t406931\n多米尼克\t406932\n数码宝贝骇客\t406933\n交流继电器\t406934\n复始\t406935\n葆\t406936\n中央巡视工作规划\t406937\nBarefoot\t406938\n分件\t406939\n元通古镇\t406940\n内饰件\t406941\n高材生\t406942\n乙型脑炎\t406943\n避雷塔\t406944\n大嫁\t406945\n驻马店火车站\t406946\nNeutral\t406947\n朱军\t406948\nCallback\t406949\n天涯书塾\t406950\n人民街\t406951\n爬树\t406952\n靠不靠谱\t406953\n小升初择校\t406954\n43万\t406955\n汝工作室\t406956\njianding\t406957\n聂鑫森\t406958\n羊台山\t406959\nequal\t406960\n搁\t406961\npen\t406962\n小诺理财\t406963\n田格\t406964\n不兼\t406965\n旋钮\t406966\n挂机版\t406967\nt60\t406968\nbbc\t406969\n北上\t406970\nCms\t406971\n广州四季酒店\t406972\n内蕴\t406973\n隔热棉\t406974\n33公里\t406975\n旋挖桩\t406976\n上海梅赛德斯
奔驰文化中心\t406977\n一个12岁\t406978\n77分钟\t406979\n喜马偕尔邦\t406980\n典狱\t406981\nt470p\t406982\nFRPP\t406983\n蹲伏\t406984\n闪烁\t406985\n永定土楼\t406986\n米霍克\t406987\n参半\t406988\ntplm\t406989\n四明\t406990\n涂山红红\t406991\n本事\t406992\nGameFAQs\t406993\n火炉\t406994\n中国登山协会\t406995\n永不止步\t406996\n青岛市物价局\t406997\n通快\t406998\ncoldzera\t406999\n套路\t407000\n起重\t407001\n内照\t407002\n2步\t407003\n13道\t407004\n付讫\t407005\n实时监控\t407006\n50个\t407007\n菌\t407008\n希行\t407009\n普者黑\t407010\nJSX\t407011\n渠\t407012\n心理系\t407013\n多多网游加速器\t407014\nwrx\t407015\n氰基硼氢化钠\t407016\n首都机场\t407017\n栗山千明\t407018\n西溪银泰\t407019\n六中全会\t407020\nmandy\t407021\n平安期货\t407022\nbb\t407023\n蒜蓉\t407024\n利比亚\t407025\n曳步舞\t407026\n跨行通\t407027\narc4random\t407028\n新浪大连_新浪网\t407029\n一个20寸\t407030\nlinda\t407031\n集篇\t407032\nEEG\t407033\n抢断\t407034\n抽穗\t407035\n问诊\t407036\n体法\t407037\n初日\t407038\n相似词\t407039\n绝逼\t407040\ngosh\t407041\n坐标点\t407042\n西省\t407043\n徐泾东\t407044\n数据采集仪\t407045\n木质部\t407046\n宏源证券\t407047\nStarted\t407048\n六万公里\t407049\n从旧\t407050\n58.5\t407051\n羟丙基纤维素\t407052\n玫瑰果胶囊\t407053\n福事\t407054\n银壶\t407055\n第18话\t407056\n氢氧化铜\t407057\nurandom\t407058\nFMDN\t407059\nnamedtuple\t407060\n霍格沃兹\t407061\nenterprise\t407062\nurp\t407063\n等高\t407064\n东莞供电局\t407065\n采购部\t407066\n森科\t407067\n7磅\t407068\nTown\t407069\n广语\t407070\n兀自\t407071\n5700\t407072\n结识\t407073\n民族史\t407074\n青神县人民政府\t407075\n防老\t407076\n挨操\t407077\n九一人才网\t407078\nfilmic\t407079\n上海市房地产交易中心\t407080\n邛崃市\t407081\n清池\t407082\n小灵\t407083\n活口\t407084\n彩虹圈\t407085\nYokohama\t407086\n延庆镇\t407087\n深宅\t407088\n_新京报电子报\t407089\n汗菜\t407090\n杀鸡儆猴\t407091\n200条\t407092\n702路\t407093\n东方快车谋杀案\t407094\n上诉期\t407095\n8085\t407096\nnordvpn\t407097\nR2017b\t407098\n成都串串香\t407099\n梁场\t407100\n时间关系\t407101\n中山纪念中学\t407102\n第五轮\t407103\n户名\t407104\n双飞人\t407105\nSecured\t407106\n天山路街道\t407107\n小天风流史\t407108\n异丙肾上腺素\t407109\n成百上千\t407110\n缘来是你\t407111\n湖南少年儿童出版社\t407112\n互联网+\t407113\n菊香\t407114\n辉夜姬物语\t407115\n%0\t407116\n8880\t407117\n七天乐\t407118\napp2017\t407119\ndeos\t407120\
n朱七七\t407121\n杨家庄\t407122\nGarmin\t407123\n翻过来\t407124\nRaya\t407125\n上当受骗\t407126\n锁定\t407127\n船坞\t407128\n号文\t407129\n住培\t407130\n牛肉膏\t407131\n北辰中央公园\t407132\n炒青菜\t407133\n渐起\t407134\n遗赠扶养协议\t407135\nst股\t407136\n探洞\t407137\n细谈\t407138\n米叔\t407139\n业余赛\t407140\nsilklabo\t407141\n江苏省政协\t407142\n云泽\t407143\n文明社区\t407144\nDecorator\t407145\n老妹你真美\t407146\n#39\t407147\n高铁时刻表|动车|列车时刻表\t407148\n限制级\t407149\n换修\t407150\n侵权行为\t407151\nTiguan\t407152\n白乳\t407153\n桥段\t407154\n建筑爬架网\t407155\n细石砼\t407156\n30亿元\t407157\n师宗县\t407158\n晓晨\t407159\n六道骸\t407160\n第103章\t407161\nC43\t407162\n东盟中心\t407163\n元史\t407164\nC4DSKY\t407165\n阁瑞斯\t407166\nCeo\t407167\n封神榜之武王伐纣\t407168\n无冬之夜2\t407169\n天全\t407170\nsurgery\t407171\nswitched\t407172\n原理\t407173\n狗头人\t407174\n20170906\t407175\n偿债\t407176\n妖魔道\t407177\n50秒\t407178\n心理罪2\t407179\n生动\t407180\n叶德娴\t407181\n逛街\t407182\n熊大熊二\t407183\n日韩\t407184\n断背\t407185\n整合\t407186\n500升\t407187\n社会科学院\t407188\n电动球阀\t407189\n烧焊\t407190\n肾盂\t407191\nKara\t407192\n印度洋\t407193\n框带\t407194\n可以说\t407195\n死去\t407196\n梧桐妹\t407197\n雪灵\t407198\n200分\t407199\n戴珊\t407200\n佐丹力\t407201\ncond\t407202\n阻生智齿\t407203\n蓉欧\t407204\n云川\t407205\n范甘迪\t407206\nsef\t407207\n硐室\t407208\n引导\t407209\n高斯核\t407210\n基耶利尼\t407211\n田赛\t407212\n领导人物\t407213\n情报学\t407214\n博微软件\t407215\n庭瑞\t407216\nXGIMI\t407217\n魂武\t407218\nPiano\t407219\n哈药人民同泰\t407220\nelizabeth\t407221\n痘痕\t407222\n互联网科技有限公司\t407223\ntwentine\t407224\n泾河龙王\t407225\n云阅小说网\t407226\n人皮脸\t407227\nUnlock\t407228\n2.0.9\t407229\n9.0.2\t407230\n层高\t407231\n工农兵\t407232\n100英寸\t407233\nARM\t407234\nQ701\t407235\n严惩不贷\t407236\n李宁公司\t407237\n征信机构\t407238\n云天\t407239\n大众斯柯达\t407240\n乌兰乌德\t407241\n创新性\t407242\n下映\t407243\n游钓\t407244\n外电\t407245\ndefine\t407246\n天津高院\t407247\n五年以后\t407248\n维也纳童声合唱团\t407249\nMarco\t407250\n未来\t407251\nHiding\t407252\n廪\t407253\n第81集\t407254\n二龙湖浩哥之江湖学院\t407255\n棕榈树\t407256\n说分手\t407257\n43亿\t407258\nclancy\t407259\nSkrill\t407260\n环绕版\t407261\n首献\t407262\n上白\t407263\nsifu\t407264\n贝尔实验室\t407265\n
易冷\t407266\n1.15.1\t407267\nharbin\t407268\n三北防护林\t407269\n59store\t407270\n北京市通信管理局\t407271\n李红云\t407272\n苍云\t407273\n旗舰\t407274\nemory\t407275\n荣升\t407276\n19处\t407277\nBackgrounds\t407278\n第五十八章\t407279\nspanner\t407280\n宗关\t407281\n刘豹\t407282\naso\t407283\n谢烨\t407284\n指桑骂槐\t407285\nproject2016\t407286\n综合利用\t407287\n郑孝胥\t407288\nSilhouette\t407289\ndipole\t407290\n洋桥\t407291\nktm390\t407292\nIMAGING\t407293\n毛孔\t407294\n2016年4月28日\t407295\n媒介部\t407296\n北京朝\t407297\n重庆安全技术职业学院\t407298\n重型车\t407299\n蓝牙\t407300\n凶神恶煞\t407301\n0.3厘米\t407302\nICSE\t407303\nv7.6.0\t407304\n印钞\t407305\n忠贞\t407306\n11ac\t407307\n世界水日\t407308\nPTB\t407309\nwin8版\t407310\n状物\t407311\nattestation\t407312\n电离层\t407313\n喷淋\t407314\n神经猫\t407315\n御捷\t407316\n2400\t407317\n电场力\t407318\n萤之光\t407319\n鸡东\t407320\n唧唧歪歪\t407321\n贝壳金控\t407322\n型材\t407323\n全民目击\t407324\n吹替\t407325\n磷铜\t407326\n12把\t407327\n管帽\t407328\n1436\t407329\n江河万里行\t407330\n就问\t407331\n趾\t407332\n不符合项\t407333\n卡瘦\t407334\na++\t407335\n地质灾害危险性评估\t407336\n雅克比\t407337\nExhibit\t407338\n此病\t407339\nMystic\t407340\n专用存款\t407341\ncooker\t407342\nn9002\t407343\n孔丹\t407344\n杀女\t407345\n欢乐树的朋友们\t407346\n广州长城宽带\t407347\n素万那普机场\t407348\n碎步\t407349\n游\t407350\n冷雨夜\t407351\n钉螺\t407352\n中国广告协会\t407353\n酮\t407354\n高低压开关柜\t407355\n乐嗨\t407356\n生态村\t407357\nVsftpd\t407358\n镇江金山网\t407359\n落花生\t407360\n深潜\t407361\nsuisse\t407362\nhsu\t407363\n酒瘾\t407364\npubmed\t407365\n上海现代服务业联合会\t407366\n4343\t407367\n8152\t407368\naptitude\t407369\n不要吃\t407370\n18045期\t407371\ndaily\t407372\n手机应用宝\t407373\n月神话\t407374\n导热\t407375\n大当家\t407376\n4521f\t407377\n变质\t407378\n耗尽\t407379\n查莉成长日记\t407380\n检讨书\t407381\n试住\t407382\n龙腾峡\t407383\n墙孔\t407384\n一厘\t407385\n杨斌\t407386\nEXECUTE\t407387\n浙江地区\t407388\nSSH2\t407389\nBacillus\t407390\nHGBF\t407391\n移动电话\t407392\n雪灾\t407393\n400cc\t407394\n腾越\t407395\n代付款\t407396\n常州移动\t407397\n张玉峰\t407398\nMWC\t407399\n四省\t407400\n性情\t407401\n武汉佰钧成技术有限责任公司\t407402\n天后\t407403\n汉威国际广场\t407404\n紫光国芯\t407405\n因为\t407406\n速八酒店\t407407\
n靖西县\t407408\n比购网\t407409\n云译客\t407410\n中央红军\t407411\nE2E\t407412\n连续子数组\t407413\n4399单人小游戏\t407414\n琅琊榜2\t407415\n青云路\t407416\n九亭镇\t407417\n非营业性\t407418\n大龙湫\t407419\n区纪委监委\t407420\n小龙女\t407421\n穿越火线\t407422\n2490\t407423\nThanh\t407424\n1阶\t407425\n无伤\t407426\n11月13日\t407427\n北京人民医院\t407428\n生物等效性试验\t407429\n农家仙\t407430\n棠樾\t407431\n中国核动力研究设计院\t407432\n中国物联网\t407433\n支化\t407434\n荣耀五五开黑节\t407435\n撒网\t407436\n福昕编辑器\t407437\n雪之舞快乐舞步健身操\t407438\nBCH\t407439\n咖喱蟹\t407440\n甘肃省委\t407441\n文化遗址\t407442\n14级\t407443\n华北理工大学附属医院\t407444\n220亿元\t407445\n读书笔记\t407446\n简普\t407447\n风越\t407448\n手弹\t407449\n鹰嘴龟\t407450\n网帽\t407451\nRegency\t407452\nbose\t407453\n2015-2020年\t407454\nJason\t407455\n党旗\t407456\n马营\t407457\n98折\t407458\n关于打赢脱贫攻坚战的决定\t407459\n北京市国资委\t407460\n旺旺队\t407461\n六里桥北里\t407462\n鸟姐\t407463\nKnife\t407464\n15瓶\t407465\n终极帝王\t407466\n问题性\t407467\nxmlrpc\t407468\n10余年\t407469\n中国武警男声合唱团\t407470\n曹仁\t407471\n杨叶\t407472\n同谋\t407473\n各校\t407474\n彩虹岛2\t407475\n银杏村\t407476\n天使之吻\t407477\n疑帖\t407478\nfiches\t407479\n立道\t407480\n影响性\t407481\n天气报告\t407482\nCharli\t407483\n松本乱菊\t407484\n招联消费金融有限公司\t407485\n推流地址\t407486\n相贯线\t407487\ngreece\t407488\n弹\t407489\n慧辰\t407490\n山武\t407491\nhisi\t407492\n英汉\t407493\n劝和\t407494\n何太后\t407495\n无咎\t407496\nScaler\t407497\n曼妥思\t407498\n胡立阳\t407499\n创通\t407500\n福字币\t407501\n黄粱\t407502\n宝马M\t407503\n锦衣大唐贞观\t407504\n规格表\t407505\n滑车\t407506\n托腮\t407507\n注射用水\t407508\nストリ\t407509\n千鹤\t407510\n转矩\t407511\n鲁信集团\t407512\n欢乐坦克大战\t407513\n每五分钟\t407514\n明厨\t407515\n祖谱\t407516\nWIN8\t407517\n不归路\t407518\n祭日\t407519\nnapoleon\t407520\n断定\t407521\n刘敏\t407522\nx%2\t407523\n休闲山庄\t407524\n一斗\t407525\n止咳药\t407526\n民生网\t407527\njiapeng\t407528\nBusyBox\t407529\n健康生活方式\t407530\ndd373吧\t407531\n江口县\t407532\n吸塑板\t407533\n防晒衫\t407534\n战勋\t407535\n挿入\t407536\n点破\t407537\n南浔古镇\t407538\n辩题\t407539\n磷脂\t407540\nFrost\t407541\n宫颈囊肿\t407542\n五点半\t407543\nxxooxxoo\t407544\n成贤学院\t407545\n缝纫机油\t407546\n纽线\t407547\n星君\t407548\n六百万\t407549\n一代妖\t407550\nx64\t407551\nNot
epad2\t407552\n王书记\t407553\n列报\t407554\n资生堂安耐晒\t407555\n威睿\t407556\n调正\t407557\nRSSI\t407558\nEC6108V9U\t407559\n卿卿我我\t407560\nX页\t407561\n终极目的\t407562\n掠天之翼\t407563\nLawyer\t407564\n光大永明人寿\t407565\n计算机操作系统\t407566\ntotem\t407567\n戴安娜\t407568\n4100万\t407569\n重生之圣途风流\t407570\n江志强\t407571\n青白\t407572\n达茂旗\t407573\n8月25日\t407574\n书墨\t407575\n企及\t407576\n骶髂\t407577\n高新一中\t407578\n中信集团\t407579\n周总\t407580\n1.1m\t407581\njffs2\t407582\nNBA篮球大师\t407583\n造景\t407584\n仿木\t407585\nconst常量\t407586\n120公里\t407587\n中草药网\t407588\n第4周\t407589\n乡政\t407590\n南京港华燃气\t407591\n第9家\t407592\n一张照\t407593\n证件\t407594\n1661\t407595\n首见\t407596\n2017年12月23日\t407597\n第92章\t407598\njsr\t407599\n十几辆\t407600\n灵芝片\t407601\n续驶\t407602\n骤停\t407603\n三十多\t407604\n燕郊镇\t407605\n广州市商贸职业学校\t407606\n实拍\t407607\n养生苑\t407608\n黑咒岛\t407609\nf-149110\t407610\n002032\t407611\n胡小平\t407612\n血液粘稠\t407613\n景洪市\t407614\n启用\t407615\n老佳\t407616\nresouce\t407617\n沟头\t407618\n化工园\t407619\n羊城晚报\t407620\n异烟肼\t407621\n神之浩劫\t407622\n喜马拉雅FM\t407623\nnote5A\t407624\nGrammar\t407625\n石宇奇\t407626\n张小兵\t407627\n百花文艺出版社\t407628\n20160428\t407629\n姜淑桐\t407630\nCMMI\t407631\nPetroleum\t407632\nfinds\t407633\nTruck\t407634\n非经\t407635\noverlapping\t407636\nroyal\t407637\n中国地质大学长城学院\t407638\n号型\t407639\n选篇\t407640\n昆士兰科技大学\t407641\n西安美术学院\t407642\n大连公交网\t407643\nYY6029高清影院\t407644\n倍世\t407645\n总参\t407646\n材料展\t407647\n音曲\t407648\nNO.1\t407649\n玫瑰坊\t407650\nidoc\t407651\n通俗化\t407652\n黎阳\t407653\n创践\t407654\n田田圈\t407655\nCPV\t407656\n驻港\t407657\n王麟\t407658\n紧急刹车\t407659\nulimit\t407660\nADK\t407661\n依赖项\t407662\n怪不\t407663\n神界:原罪2\t407664\n展厅\t407665\n禄鼎记\t407666\n套头\t407667\n上浮\t407668\n雅骏\t407669\nswords\t407670\nventilation\t407671\n超纤皮\t407672\n奥迪a4\t407673\n绿胶\t407674\n阿道夫\t407675\nAspose\t407676\n温泽\t407677\n拜祖\t407678\nNEW3DSLL\t407679\n受歡\t407680\n水草\t407681\n警棍\t407682\nSolitude\t407683\n饭粒\t407684\n大南山\t407685\n核机\t407686\n欧阳马小云\t407687\n定屏\t407688\nG300\t407689\n话单\t407690\nMSTK\t407691\n头孢西丁\t407692\nRipper\t407693\n化妆品柜\t407
694\n易淘\t407695\n经理岗\t407696\n莫妮\t407697\n勇斗\t407698\n后脑勺\t407699\n李海英\t407700\niodine\t407701\n安室\t407702\n牌证\t407703\n我们的一生\t407704\nmudbox\t407705\n樱兰\t407706\nVERY\t407707\n蒸蛋糕\t407708\n麟游\t407709\nOneNet\t407710\n分散染料\t407711\n玫瑰海岸\t407712\n8443\t407713\n田明\t407714\n精彩小说网\t407715\n上海市中西医结合医院\t407716\n维基语录\t407717\n腾通\t407718\n剑类\t407719\n香市\t407720\n46_\t407721\n网页导航\t407722\n1h\t407723\nrohm\t407724\n张钰\t407725\n桐柏\t407726\n吸溜\t407727\n命令篇\t407728\n古筝\t407729\n5版\t407730\n金融业\t407731\n缥缈\t407732\nEngineering\t407733\n密封\t407734\n样片\t407735\n夏春\t407736\n100粒\t407737\n消息\t407738\n海滨浴场\t407739\n大典\t407740\n驱魔道长\t407741\nupupoo\t407742\nwin8_\t407743\nyanghj\t407744\n300k\t407745\n钻研\t407746\n路绮欧\t407747\n麻辣味\t407748\n上海外国语大学附属双语学校\t407749\n单脉\t407750\n理喻\t407751\n中国软件评测中心\t407752\n子品牌\t407753\nav在线\t407754\n说清\t407755\n无柱\t407756\ncad2013\t407757\nmoist\t407758\n高箱床\t407759\nable\t407760\n铁框\t407761\n笏\t407762\nDjawadi\t407763\n50年内\t407764\n脑容量\t407765\n刘玉珠\t407766\nGSI\t407767\n发放机\t407768\npremiumsoft\t407769\n败因\t407770\n艾米莉·布朗特\t407771\n洒脱\t407772\n张天李\t407773\n天元神力\t407774\n第8期\t407775\n9750\t407776\n东华路\t407777\n陈树隆\t407778\n向日葵保险网\t407779\n中厨\t407780\nDOTween\t407781\n连片\t407782\n92.5\t407783\n第B04\t407784\n寨子\t407785\n影士\t407786\n消火栓箱\t407787\n优质服装\t407788\n朝阳乡\t407789\n卢小鱼\t407790\n奥玛物流仓储联盟\t407791\n华硕灵耀\t407792\n屋檐\t407793\nvarian\t407794\n伊奈\t407795\n27处\t407796\nmidi\t407797\n王菊\t407798\n武广\t407799\n2月13日\t407800\n周萍\t407801\n李文杰\t407802\n赵贤荣\t407803\n风干鸡\t407804\n三十二\t407805\n历史资料\t407806\n观澜镇\t407807\n藏着\t407808\n老区\t407809\nurinary\t407810\n胡威\t407811\n去哪儿旅行\t407812\ntuition\t407813\n沙恩霍斯特\t407814\n小行\t407815\n大航海时代OL\t407816\n瑶瑶\t407817\n东城社区\t407818\n人文地理学\t407819\n宝象\t407820\ncavity\t407821\n韩国乐天\t407822\nRUNNING\t407823\n十七天\t407824\n控控\t407825\nzou\t407826\n王天一\t407827\n谬论\t407828\n三乡镇\t407829\n冰淇淋\t407830\n锤击\t407831\n低氮燃烧器\t407832\n上海医药集团股份有限公司\t407833\nexls\t407834\n德怀特\t407835\n透慎\t407836\nGS2YOU\t407837\nab350m\t407838\nconrad\t407839\n东
南风\t407840\nCMF\t407841\n超声乳化仪\t407842\n私有成员变量\t407843\nB9S\t407844\n凌厉\t407845\n鸣圣\t407846\n烛魔\t407847\nDQ\t407848\n合成酶\t407849\n汤晓鸥\t407850\n青村\t407851\nphp5.6\t407852\n雪雕\t407853\n女儿身\t407854\n怒江州人民政府\t407855\n魔天伦\t407856\nLYC\t407857\n1080I\t407858\n北京昌平新东方外国语学校\t407859\n淅沥枫\t407860\n中级法院\t407861\n胡颖\t407862\n各部\t407863\n彭斯\t407864\n宜都市\t407865\n真骨\t407866\n尚福林\t407867\n中船重工集团\t407868\n唐\t407869\n水丽\t407870\ndrops\t407871\nPyQt4\t407872\nsublimetext2\t407873\n北京市东城区人民法院\t407874\n美篇\t407875\netest\t407876\n兴城\t407877\n项羽\t407878\n龙潭乡\t407879\n遗毒\t407880\n分次\t407881\nVolist\t407882\n横槽\t407883\n史尔雅\t407884\nv2.21\t407885\n400%\t407886\n交叉口\t407887\nSWISS\t407888\n塔里木大学\t407889\n进销\t407890\n轩辕春秋\t407891\n怪\t407892\n校校\t407893\n寒食\t407894\nLauder\t407895\n汤若望\t407896\n火王_游素兰_故事漫画\t407897\n定作\t407898\n仙岩\t407899\n郑雪\t407900\nStructures\t407901\n野鸟\t407902\n安纳金\t407903\n周婷婷\t407904\n耿氏\t407905\nqq安全中心\t407906\n三四五\t407907\n东坝网\t407908\n苏仙区\t407909\nzfc\t407910\n比亚迪e5\t407911\n中华人民共和国劳动争议调解仲裁法\t407912\n光场\t407913\n发迹\t407914\n广东实验中学\t407915\n星舰帝国\t407916\n美萍\t407917\n随葬品\t407918\n几档\t407919\n韩世远\t407920\n阿虎\t407921\nPHD\t407922\n孟老师\t407923\njhat\t407924\n氷\t407925\n第几号\t407926\n航空货运\t407927\n汉邦高科\t407928\n瘦腿针\t407929\n邦女郎\t407930\nTE\t407931\n9次\t407932\n万家房产网\t407933\n惠科股份有限公司\t407934\n牙防所\t407935\n快乐棍\t407936\n康力士\t407937\n三心二意\t407938\n13.0.0\t407939\n恒温恒湿实验室\t407940\n来来回回\t407941\n科伦药业\t407942\n易乐游\t407943\n林格\t407944\n王翔\t407945\n检察员\t407946\n鸭子舞\t407947\nrisk\t407948\nMfc\t407949\n天猫小黑盒\t407950\n超雪\t407951\n承插\t407952\n火源\t407953\nVinci\t407954\njieju\t407955\nRX10\t407956\n扬清\t407957\n啤酒节\t407958\n54000\t407959\n表号\t407960\nl_\t407961\nXCode9\t407962\nx^2\t407963\n苏三\t407964\n心悸\t407965\n界定\t407966\n友城\t407967\n血氧仪\t407968\n米德尔顿\t407969\n玉碎\t407970\nzys\t407971\n荆棘谷\t407972\n海政\t407973\n微计\t407974\n28届\t407975\n可口\t407976\n揭幕\t407977\n烧菜\t407978\n阴德\t407979\ninterrupted\t407980\n放疗\t407981\n赵某某\t407982\n啪啪三国\t407983\nipd\t407984\n火影手游\t407985\nMindMapper\t40798
6\n婷美\t407987\nbounds\t407988\n尧尧\t407989\n宁宁\t407990\n苏宁云商集团股份有限公司\t407991\n终究\t407992\n陈志勇\t407993\n匆忙\t407994\n符头\t407995\n许愿池\t407996\n左婧媛\t407997\n神剪手\t407998\n柳州市公安局\t407999\n孤岛惊魂1\t408000\nEsther\t408001\n去脂\t408002\n弹簧草\t408003\nfetchall\t408004\n自流平水泥\t408005\ncheetah\t408006\n长音\t408007\n永达汽车\t408008\n派安盈\t408009\ntomat\t408010\n麦克维尔\t408011\nfindit\t408012\n北派\t408013\n葛勇\t408014\nWAV整轨\t408015\nzest\t408016\n翻拍\t408017\nAB模板网\t408018\n网机\t408019\n陨铁\t408020\n八十天环游地球\t408021\nfreeman\t408022\n391K\t408023\n爱着你\t408024\nImport\t408025\n郎导\t408026\n塞维利亚\t408027\nDoxygen\t408028\nSTM32-F429\t408029\n80周年\t408030\n龚慈恩\t408031\n当季\t408032\nOLIVER_QIN\t408033\n人工少女3\t408034\n喀\t408035\n斯科特\t408036\n一汽大众速腾\t408037\n零整比\t408038\n摸金\t408039\n一感\t408040\n接板\t408041\n塞罕坝\t408042\nRHINO\t408043\n12只\t408044\nヤンキ\t408045\n提出异议\t408046\n安发\t408047\n市来美保\t408048\n新图书馆\t408049\n苍鹰\t408050\n国际禁毒日\t408051\n杨晓光\t408052\n2018年2月4日\t408053\n期中考试题\t408054\n流口水\t408055\n规培\t408056\n藏剑\t408057\n红豆沙馅\t408058\nAnonymous\t408059\n牌意\t408060\n陈宝柱\t408061\n285mm\t408062\n几万年\t408063\nIOTA\t408064\npatchmania\t408065\n李奇微\t408066\n中央财经大学法学院\t408067\n迪斯尼乐园\t408068\nravel\t408069\n捉鬼敢死队\t408070\n观赏蟹\t408071\nmobilenet\t408072\nBose\t408073\n巴中市委\t408074\n硫酸_\t408075\n中华之星\t408076\n芯样\t408077\n一17岁\t408078\nWARS\t408079\ncmmi\t408080\n林辉\t408081\n李耐阅\t408082\n附小\t408083\n经五路\t408084\n张小娴\t408085\n专职化\t408086\nDell\t408087\n300253\t408088\n岁数\t408089\n汉舞\t408090\n肉芽肿\t408091\nWhistler\t408092\ngia\t408093\n晚照\t408094\n口笛\t408095\n酥油饼\t408096\n氟尿嘧啶注射液\t408097\n易库易\t408098\n外牙\t408099\n谁与争锋\t408100\n诺基亚8800\t408101\n洛神花\t408102\nBEC\t408103\n佳能ts8080\t408104\n茅侃侃\t408105\n天天色综合网\t408106\n蝎子\t408107\n崔杰\t408108\n五年级语文练习册答案-小学五年级语文练习册答案大全\t408109\n月轮山\t408110\n东京食尸鬼RE23话梨酒\t408111\n瓦斯琪\t408112\n咽喉痛\t408113\n维摩诘\t408114\n广东国税\t408115\n周周清\t408116\n首批10个\t408117\n海尔净水商城\t408118\n螺钿\t408119\n浪花一朵朵\t408120\n刘过\t408121\n兰江街道\t408122\n第十五讲\t408123\nendogenous\t408124\n金地\t408125\n尼克胡哲\t408126\n岳跃利\
t408127\n铜梁区\t408128\n二话不说\t408129\n常熟经济开发区\t408130\n800多\t408131\n截屏\t408132\n映像\t408133\n系主任\t408134\n阿大葱油饼\t408135\n亚梦\t408136\n康莱德\t408137\n阿巴瑟\t408138\n273号\t408139\nGIBSON\t408140\n上沙村\t408141\n巧格吧_\t408142\n0531\t408143\n心尖宠妃\t408144\nphpQuery\t408145\n2016年8月16日\t408146\n不同种\t408147\ncollectionview\t408148\n煎豆腐\t408149\n死亡人员\t408150\nzjedu\t408151\n南宁机场\t408152\n费迪南\t408153\nADP\t408154\n貔貅\t408155\n临浦\t408156\n果唯伊\t408157\n透射电子显微镜\t408158\nVero\t408159\nair13\t408160\n国生\t408161\n无遮掩\t408162\n本内特\t408163\n小歌\t408164\n乔良\t408165\n苏州19楼\t408166\nEquity\t408167\n艾萨克·牛顿\t408168\nspectrometer\t408169\n氢化可的松\t408170\n科迪\t408171\n邹晶晶\t408172\nKEIL\t408173\n职改办\t408174\n中光\t408175\n九仙帝皇诀目录\t408176\n花脸\t408177\n欧式起重机\t408178\n阿里巴巴子\t408179\nIpv6\t408180\n四大行\t408181\n扯谈\t408182\n独奏\t408183\n山楂花\t408184\n通过率\t408185\n朴志训\t408186\n危険\t408187\n三岔镇\t408188\nweb攻击\t408189\n湖底\t408190\n叶秋\t408191\n苏武牧羊\t408192\ntroye\t408193\n逆転\t408194\n英语卷\t408195\n黑美人\t408196\nspan\t408197\n卡洛儿\t408198\n大溪谷\t408199\n阴阳大战记\t408200\n仓单\t408201\n储罐\t408202\nST尤夫\t408203\nHay\t408204\n克莱斯\t408205\n取息\t408206\n金智娟\t408207\n天眼\t408208\n上海恒斐生物科技有限公司\t408209\n厚道\t408210\n南京南京\t408211\n搜狐焦点网\t408212\nFRC\t408213\nDoggy\t408214\nsabon\t408215\nnewyork\t408216\n第10期\t408217\n冬虫草\t408218\n董子健\t408219\n皮下结节\t408220\n高孝周\t408221\n轻罪\t408222\n骨灰\t408223\nT440\t408224\nvue-axios\t408225\n殃\t408226\n怒号\t408227\nndk\t408228\nV2018\t408229\nvector3\t408230\n铸剑师\t408231\nUV板\t408232\n桃之助\t408233\n十不准\t408234\n遭起\t408235\nUG-CATIA\t408236\n联表\t408237\nBA\t408238\n凸\t408239\n果盘版\t408240\n莎朗斯\t408241\n吴国太\t408242\n南京农业大学研究生院\t408243\n有伴网\t408244\n戰\t408245\n李永胜\t408246\n赵庄镇\t408247\n君島\t408248\n李宁\t408249\necipse\t408250\n中国华融资产管理公司\t408251\n漆刷\t408252\n诺姆四达\t408253\n威海\t408254\n轩子\t408255\n千川\t408256\n战前\t408257\n0812\t408258\n风幕柜\t408259\n箱体\t408260\n郭振玺\t408261\n齐齐乐\t408262\n皇台酒业\t408263\n院系\t408264\n沈严\t408265\n北京植物园\t408266\n四脚朝天\t408267\n文学社\t408268\n怡海花园\t408269\n恒大御景\t408270\n雨林\t408271\n超长版\t408272\
n3目\t408273\n染厂\t408274\nMinaj\t408275\n分成\t408276\n韩团\t408277\nmissleg\t408278\nRunforLove\t408279\n桜井あゆ\t408280\ngdgp3.chinaxinge.com/shuju2/201710/2017101423181023154\t408281\n单座\t408282\n体校\t408283\n黄昆\t408284\n寻根\t408285\n半纤维素\t408286\n2月14\t408287\n浙江省中山医院\t408288\n握力器\t408289\nAMC\t408290\n夹持器\t408291\n福宁\t408292\n150道\t408293\n生鲜肉\t408294\n故事的人\t408295\n河南农信社\t408296\n金e顺\t408297\n北京市出入境管理局\t408298\n东风风行景逸\t408299\nc63\t408300\nE8\t408301\n荣威RX3\t408302\n博斯腾湖\t408303\n阵容\t408304\n电子卡\t408305\n1091\t408306\n唐舞麟\t408307\n63元\t408308\n战场沙漠\t408309\n斯莱德\t408310\nDIE\t408311\nHDTV\t408312\n外快\t408313\n身着\t408314\n多城\t408315\n联线\t408316\n1Gbps\t408317\n币王\t408318\n蟠桃\t408319\n比试\t408320\n职考\t408321\nvware\t408322\n法版\t408323\n武侯区人民政府\t408324\n陈独秀\t408325\nPrestashop\t408326\n九速\t408327\nRestful\t408328\n红寺堡\t408329\n三途志\t408330\n答辩稿\t408331\n新举\t408332\n亚普股份\t408333\nAnalyze\t408334\n北讯集团\t408335\n瑞凌\t408336\nlaytpl\t408337\n贴钢化\t408338\nbragi\t408339\n康拉德\t408340\nfget\t408341\n易语言资源网\t408342\n蚩梦\t408343\n小夜猫\t408344\n迎审\t408345\n实证主义\t408346\n对象池\t408347\n发钱\t408348\n流影\t408349\n复方制剂\t408350\n纽约\t408351\n逆阀\t408352\n岛风GO\t408353\n螺钉\t408354\nillustrated\t408355\n词首\t408356\n天下第一妃\t408357\n第六十\t408358\n李晨曦\t408359\n折达公路\t408360\n桂园\t408361\n参和\t408362\nConsoles\t408363\n保温性\t408364\n1.8寸\t408365\n天心新闻网\t408366\n泰晤士小镇\t408367\n锯齿\t408368\n饭台\t408369\nPORTER\t408370\n雅克德罗\t408371\n劲吹\t408372\n题记\t408373\n安庆市区\t408374\n京师大学堂\t408375\nMVA\t408376\n600160\t408377\n完美小姐进化论\t408378\n恋物语\t408379\n窗幔\t408380\nsyf\t408381\n山姆大叔\t408382\n兰芳\t408383\n天方\t408384\n陕西省省\t408385\n新小\t408386\n铸铁壶\t408387\nmg3\t408388\nTET\t408389\nshell32.dll\t408390\n泰勒宁\t408391\n脑虫\t408392\n买机\t408393\n小科\t408394\n松井佑贵\t408395\n庐隐\t408396\n中楷\t408397\n中国高邮门户网站_高邮市政府网\t408398\n出厂价\t408399\n12公斤\t408400\n绿地之窗\t408401\n各档\t408402\n独自一人\t408403\n美缝胶\t408404\n纽马克\t408405\n20多条\t408406\n非平衡\t408407\n水粉霜\t408408\n無料エロ漫画アキバ同人倶楽部\t408409\n六七\t408410\n背景音\t408411\n上海交通大学徐汇校区\t408412\n门巴族\t408413\n中建三局\t4
08414\nICloud\t408415\n股海网\t408416\n1027\t408417\nstencil\t408418\n29\t408419\n中华门窗网\t408420\n56个\t408421\n领头雁\t408422\n过低\t408423\n新书包\t408424\n绘图员\t408425\n生物秀\t408426\n尝试性\t408427\n车前草\t408428\n林毅\t408429\n铋\t408430\n响铃\t408431\n虚拟光驱\t408432\nso8\t408433\n知果果\t408434\n20161014\t408435\n招行信\t408436\n用书\t408437\n志庆\t408438\nAoki\t408439\ngems\t408440\n石窑\t408441\n12.0.11\t408442\n扬先抑\t408443\nlizard\t408444\nchecker\t408445\n数码宝贝大冒险tri.\t408446\nNFU玩家社区\t408447\n房办\t408448\n超越自我\t408449\n50立方\t408450\n翠儿\t408451\n井口\t408452\n王国兴\t408453\n单行子\t408454\n全势力\t408455\n叹惋\t408456\n第48\t408457\n牌九\t408458\n科颜氏白泥\t408459\n传热学\t408460\n深圳海关\t408461\n杀牛\t408462\n银轮股份\t408463\n地税\t408464\n林威\t408465\n桐谷尤莉亚\t408466\nQQ网名大全\t408467\nue\t408468\nfilesystem\t408469\n造型师\t408470\n甘特\t408471\n胃穿孔\t408472\n五峰土家族自治县人民政府\t408473\nActress\t408474\n36c\t408475\n延庆区\t408476\n烈旭清河\t408477\n励学\t408478\n大东关\t408479\n拽拽\t408480\n200a\t408481\n想一想\t408482\n电子节气门\t408483\n飞翔公园\t408484\n电视端\t408485\n云涛\t408486\n亿商网\t408487\n山马党\t408488\n精养\t408489\n广西人大\t408490\n畜牧\t408491\n有多远\t408492\n宝能\t408493\n1户\t408494\n战舰联盟\t408495\nu2000\t408496\n简单画\t408497\n防磨\t408498\n我的读书故事\t408499\n花旗松素\t408500\nCapability\t408501\nstemwin\t408502\n橘子果酱\t408503\nxenu\t408504\nSping\t408505\n堀口奈津美\t408506\n伤害案\t408507\nE455\t408508\n6108\t408509\n贵州能源网\t408510\n2013年下半年\t408511\n协办方\t408512\n高铁西站\t408513\n宝石之国\t408514\n棱体\t408515\n水魔\t408516\n六天五晚\t408517\n朝花\t408518\n油槽\t408519\n马坚\t408520\n1665\t408521\namd\t408522\n奥拉克\t408523\nYA\t408524\n蒙太奇硅藻泥\t408525\neigen3\t408526\n广州开发区政府\t408527\n真实的故事\t408528\nlumion\t408529\n小王庄\t408530\n许海峰\t408531\n热水循环泵\t408532\nDaring\t408533\n苏格兰折耳猫\t408534\n西林街\t408535\n还在\t408536\nmiui\t408537\n抛重\t408538\n倒膜\t408539\n29.1\t408540\n千林郡\t408541\n人工牛黄甲硝唑胶囊\t408542\n狂欢夜\t408543\n200W\t408544\n英灵\t408545\nfired\t408546\n2892\t408547\n美好的爱情\t408548\n遗事\t408549\n游趣\t408550\n新车评网\t408551\n新模\t408552\nhuifu\t408553\n娱\t408554\n干声\t408555\n调节型\t408556\nUseful\t408557\n重载\t408558\n来宾市政府\t
408559\n蕃薯耀\t408560\n弱不禁风\t408561\nUntold\t408562\n不怕\t408563\n吉祥路\t408564\n惹事\t408565\n更广\t408566\n排气式\t408567\n龙兄\t408568\n嘉善孔雀城\t408569\nscandal\t408570\nWebinar\t408571\n有解\t408572\n巴黎圣日耳曼\t408573\nfmodex\t408574\n阎焱\t408575\n抢占先机\t408576\n七堇年\t408577\n迷离\t408578\n国家物价局\t408579\n高维\t408580\n淫乱\t408581\n巴耶克\t408582\n超分辨率\t408583\n友母\t408584\n新西伯利亚\t408585\n吴铮\t408586\ndistal\t408587\n虫子\t408588\nemmmmm\t408589\n20171221\t408590\n黄家埠镇\t408591\n黎昕\t408592\n录录\t408593\n克旗\t408594\n宜丰\t408595\n叁仟栋\t408596\n硝酸铵\t408597\ntel\t408598\n4720HQ\t408599\n五庄\t408600\nFolk\t408601\n摘帽\t408602\n唇诺\t408603\n王金海\t408604\n张智\t408605\n龙楼\t408606\n支那\t408607\n密拼\t408608\n冷却水\t408609\n腻味\t408610\nRTU\t408611\nXPS挤塑板\t408612\n悲戚\t408613\n杨雪峰\t408614\nconomics\t408615\n验尸\t408616\n多味\t408617\ncomple\t408618\nMr.\t408619\n久石譲\t408620\n羊癫疯\t408621\n321电商学院\t408622\n乐瑞\t408623\n诺福克\t408624\nTeamwork\t408625\n曰本高清一本道无码av\t408626\nlianxiang\t408627\n红外线\t408628\n深喉\t408629\n无机布\t408630\nAutoCAD2008\t408631\n榆林市公安局\t408632\n伙人\t408633\n中国土木工程集团有限公司\t408634\n西营\t408635\n巴哈姆特\t408636\n掘金者\t408637\n换景\t408638\n文晏\t408639\n单人照\t408640\n词源\t408641\n靖边波浪谷\t408642\n不得善终\t408643\n帐期\t408644\n顺平县\t408645\n基本不等式\t408646\n中怡康\t408647\n小语\t408648\n冷冻食品\t408649\n合卷\t408650\nibooks\t408651\n王三运\t408652\n袁硕\t408653\n温县政府网\t408654\n好话\t408655\n孔垂楠\t408656\n暮霭凝香\t408657\ngoogle框架\t408658\nDojo\t408659\n放射性元素\t408660\n鬼泣HD合集\t408661\n平方分米\t408662\n爱琴海购物公园\t408663\n沅陵县\t408664\n0604\t408665\n2001年\t408666\n本月中旬\t408667\nx14\t408668\n存取款\t408669\n兮祸\t408670\n98%\t408671\nDIV+CSS模板\t408672\nFederal\t408673\n倒霉熊\t408674\n乐活\t408675\n第几幕\t408676\n房东们\t408677\n古代战争\t408678\n天津钓鱼网\t408679\n吉昌\t408680\n43天\t408681\n许其亮\t408682\n美对华\t408683\n信长之野望14-威力加强版\t408684\n马锐\t408685\n青龙河\t408686\n部落格\t408687\n民数\t408688\n制粒机\t408689\n湖北水利水电职业技术学院\t408690\n蝙蝠侠3\t408691\nlogistic\t408692\n刘海峰\t408693\n山财\t408694\n深圳鹏程医院\t408695\n崔子\t408696\n利尔康\t408697\n红星村\t408698\n魔獸世界數據庫\t408699\n超级情圣\t408700\n海南省农业厅\t408701\n移栽\t408702\n导轨
\t408703\n战场女武神4\t408704\n开心水族箱\t408705\n加碘\t408706\n喰\t408707\n带火\t408708\n通江大道\t408709\n威远县\t408710\n美信\t408711\n良妃\t408712\n可用\t408713\n电投\t408714\n废渊\t408715\n黑豆芽\t408716\n丽江古镇\t408717\n松松垮垮\t408718\n王晨正\t408719\nHoliday\t408720\n四灵\t408721\n巡航版\t408722\n涵涵\t408723\n苏州园区站\t408724\n双海\t408725\n吉祥结\t408726\n天涯何处无芳草\t408727\n中意皑希优\t408728\n命根子\t408729\n上汽MG\t408730\n李熙墨\t408731\n刘美君\t408732\n上海市浦东新区公利医院\t408733\n国家安全生产应急救援指挥中心\t408734\n高达战争\t408735\n马来西\t408736\n中岳嵩山\t408737\nType-C接口\t408738\n内部化\t408739\nHathaway\t408740\n石家庄电视台\t408741\nNokia6\t408742\n蕾姆拉姆\t408743\nX光机\t408744\n张海林\t408745\n冂\t408746\n看多\t408747\ncharu\t408748\n2017年以来\t408749\n星客\t408750\nncat\t408751\n郝鸟\t408752\n谴责\t408753\n百优\t408754\n马叙伦\t408755\n521路\t408756\n间接引语\t408757\n1767\t408758\n博茨大战\t408759\n致尚\t408760\n鱼跃医疗\t408761\n变相怪杰\t408762\n物品类\t408763\n弥敦道\t408764\n友善\t408765\n337\t408766\n浉河\t408767\n杏色\t408768\nrman\t408769\nkicking\t408770\n二丫王悦\t408771\n光明会\t408772\n异方差性\t408773\n桃姐\t408774\npac4j\t408775\nLCD1602\t408776\nM22\t408777\n横坎头村\t408778\n藩属国\t408779\nclouds\t408780\nRLC\t408781\n东本\t408782\n富察\t408783\n黄骅港\t408784\n网易云电台\t408785\n田广\t408786\n四特酒\t408787\n英理\t408788\n20171118\t408789\n窄窄\t408790\n14%\t408791\nNSPS\t408792\n安夏\t408793\n王雨纯\t408794\nRevMan\t408795\n精雕软件\t408796\n点兔\t408797\n咸肉\t408798\nstm32吧\t408799\n精神不振\t408800\n白春礼\t408801\n指证\t408802\n彩云衣\t408803\n查尿\t408804\n最红\t408805\nBeyond乐队\t408806\n小师妹\t408807\n怀化学院\t408808\n菱湖镇\t408809\n回校\t408810\ncorning\t408811\n偏微分\t408812\n窦骁娜扎\t408813\nGregory\t408814\n假字\t408815\n二十部\t408816\n说好\t408817\n孙建华\t408818\nCollagen\t408819\ngta6\t408820\n荆门市公安局\t408821\nKentucky\t408822\nstamping\t408823\nT420S\t408824\n重评\t408825\n孙坚\t408826\n約束\t408827\n临泉论坛\t408828\n上海金融法院\t408829\n暮玖\t408830\n冰山上的来客\t408831\n黄河楼\t408832\n止语\t408833\n这份\t408834\nqinian\t408835\n预签证\t408836\n论人非\t408837\n杨金柱\t408838\nspecies\t408839\n广播电视学专业\t408840\n捷成\t408841\n燃气罐\t408842\n历史版\t408843\nLTS\t408844\n上海海洋大学\t408845\n局外人\t408846\nmig\t408847\n怪
物猎人2G\t408848\n老集镇\t408849\n青木玲\t408850\n脱壳版\t408851\n宝山区幼儿园\t408852\n曾浩\t408853\n信维通信\t408854\n硫化氢检测仪\t408855\nstretch\t408856\n过滤片\t408857\n冰波\t408858\n二相\t408859\n可见分光光度计\t408860\n信达公园\t408861\n玉品\t408862\n红玉\t408863\n旁右\t408864\n辅助系统\t408865\n350号\t408866\n乐桃\t408867\n光照\t408868\n燕鸥\t408869\n东平国家森林公园\t408870\n郑人豪\t408871\n樱花开\t408872\nbear\t408873\nFAIR\t408874\nhd7850\t408875\n东莞市房产管理局\t408876\n宝泉岭\t408877\n成都军区\t408878\ndig\t408879\ndominic\t408880\nnokia6\t408881\n1897年\t408882\n腮骨\t408883\n加勒\t408884\n荡气回肠\t408885\n联展\t408886\n麦克雷\t408887\n86914050\t408888\nstringbuffer\t408889\nrealy\t408890\n教画\t408891\n钱串子\t408892\n153家\t408893\n35亿元\t408894\n画风\t408895\n伽玛值\t408896\n买马\t408897\n美展\t408898\n除氧器\t408899\n柠檬叶\t408900\n网商银行\t408901\ncodepush\t408902\nB-游久网\t408903\n月盾\t408904\n财务管理软件\t408905\n路牙\t408906\n华盈\t408907\n紫霄\t408908\n6202\t408909\n月假\t408910\n秘婚\t408911\n百图\t408912\n张学\t408913\n浙江大学教育学院\t408914\n回归方程\t408915\nIS300\t408916\n三全路\t408917\n浅红\t408918\n黄欢\t408919\n藏起来\t408920\nArctis\t408921\nRequests\t408922\n乱拍\t408923\n一朵朵\t408924\n建置\t408925\nphpjm\t408926\n114.com\t408927\njedispool\t408928\n方程题\t408929\n普兰店区\t408930\n大队部\t408931\n2018043\t408932\n秀水苑\t408933\nesi\t408934\n手动档\t408935\n福利贴\t408936\n钢化膜\t408937\nFabric\t408938\n何丹\t408939\n暖洋洋\t408940\n福全镇\t408941\nFuLing\t408942\n316\t408943\n和谐劳动关系\t408944\n建始县政府\t408945\n五科\t408946\nreadLine\t408947\n很多页\t408948\n偷香医统江山\t408949\n打底袜\t408950\n盘侠\t408951\n六章\t408952\n4.18日\t408953\n中国自治区\t408954\n突袭者\t408955\nADV\t408956\n老妪\t408957\n铝压铸件\t408958\nK1路\t408959\n理论家\t408960\n嫦娥一号\t408961\n嫁人\t408962\n码云\t408963\nPanamera\t408964\n省旅游局\t408965\nOnRPG\t408966\n平房乡\t408967\nVQ\t408968\n容休\t408969\n两年前\t408970\n河北轨道运输职业技术学院\t408971\n家庭机器人\t408972\n普吉\t408973\n会刊\t408974\ngx85\t408975\n二刀流\t408976\n業績\t408977\n宁波科学探索中心\t408978\nbell\t408979\nblooming\t408980\n汉信\t408981\n样式\t408982\n23秒\t408983\n漏税\t408984\n0113\t408985\n滨海路\t408986\n范加尔\t408987\ncohiba\t408988\n新闻记者证\t408989\n转单\t408990\n刘美含\t408991\n润
肠\t408992\n不扣钮的女孩\t408993\n罗马法\t408994\n证书源\t408995\n显现\t408996\n图区\t408997\n恒泰证券\t408998\n许冠杰\t408999\n北京工人体育馆\t409000\n委婉语\t409001\n交叉项\t409002\n山月记\t409003\n移魂\t409004\n刘长乐\t409005\n互派\t409006\nXcode8\t409007\n张泉周笔畅\t409008\n张珍\t409009\n本草新编\t409010\n男生女生向前冲\t409011\n210斤\t409012\n今年内\t409013\n电影学\t409014\n内火\t409015\n170407\t409016\ndos\t409017\n马克维茨\t409018\nBD-MP4\t409019\ngvfs\t409020\n长春汽车经济技术开发区\t409021\n惠安馆\t409022\n松月\t409023\n塑胶五金网\t409024\n沙迪克\t409025\nxmrig\t409026\nzeplin\t409027\n古板\t409028\n55期\t409029\n麓谷街道\t409030\n校员\t409031\n万能角钢\t409032\n深州\t409033\n半秒\t409034\n淫欲\t409035\n轻松筹\t409036\n加工\t409037\n油炸食品\t409038\nPRTG\t409039\n颜楷\t409040\n梁平县\t409041\n阿狸子\t409042\n单果\t409043\n乐亭县\t409044\n87fuli\t409045\n孙韬\t409046\n线程名\t409047\n吴鹰\t409048\napf\t409049\n石墨化炉\t409050\n房贷计算器2018_房贷计算器\t409051\n6月27日\t409052\n服药\t409053\n卷柏\t409054\n20150923\t409055\neasypoi\t409056\n20170303\t409057\n用尽\t409058\n分解动\t409059\n亳州房产网\t409060\n涉外\t409061\n$router\t409062\n联软科技\t409063\n魔界战记4\t409064\n好一朵美丽的茉莉花\t409065\n冰山美人\t409066\n酱油鸡\t409067\n六安市人民医院\t409068\n虫珀\t409069\n四艘\t409070\n波普风\t409071\n散装水泥罐车\t409072\n黄山人才网\t409073\n诺克萨斯吧_\t409074\n匝道桥\t409075\n房产贷款\t409076\n郑口\t409077\n西方世界的劫难3\t409078\nSyslog\t409079\nVCB\t409080\nPowerbeats2\t409081\n移动式\t409082\n嚓嚓\t409083\n磺\t409084\n2017年2月28日\t409085\n标准码\t409086\n二套房贷款利率\t409087\nkos\t409088\n十二种\t409089\n一般\t409090\n新泰市\t409091\n陆斌\t409092\n浙江省农科院\t409093\n反相\t409094\n8册\t409095\n尤克里里谱-ukulele乌克丽丽弹唱谱\t409096\n恋练有词\t409097\n3360\t409098\n光洙\t409099\n零库存\t409100\npublished\t409101\n偈语\t409102\n丑态\t409103\ngitflow\t409104\n澳乐\t409105\n尔冬升\t409106\n杨立\t409107\n组合逻辑电路\t409108\n上古卷轴\t409109\n日日\t409110\nChara\t409111\n新财富最佳分析师\t409112\n雅科仕\t409113\nmodernizr\t409114\n宋桓公\t409115\nTchaikovsky\t409116\n南开要闻_南开大学\t409117\n金属片\t409118\n艾路明\t409119\n青骨\t409120\n绿城全运村\t409121\n教务科\t409122\n一千多\t409123\nsnack\t409124\n潮潮\t409125\nayaka\t409126\n茌平\t409127\n活血\t409128\ncredithc\t409129\n龟友\t409130\n媳妇的美好时代\t409131\nfleur\t40913
2\n沈清秋\t409133\n联想g400\t409134\nt450s\t409135\n天津滨海新区\t409136\nUIText\t409137\n小米电视\t409138\n苗苗\t409139\n新金桥路\t409140\n刚不\t409141\n华兴银行\t409142\n亚欧网\t409143\n溧水经济开发区\t409144\n玛吉阿米\t409145\n压延\t409146\n民国史\t409147\n冯玉军\t409148\nbxslider\t409149\n高览\t409150\noficial\t409151\nprosperity\t409152\n阿齐兹\t409153\n凉茶\t409154\nMunicipal\t409155\n说不得\t409156\nVIT\t409157\n油漆味\t409158\n五笔吧_\t409159\n喜笑颜开\t409160\n中考试\t409161\n拳击舞\t409162\n戌时\t409163\n破板\t409164\n博超\t409165\n中钢天源\t409166\n萌妃\t409167\nEeePC\t409168\n成人型\t409169\n迷糊娃\t409170\n中华恐龙网\t409171\n国家海洋环境监测中心\t409172\n香氛\t409173\n成都中医药大学附属医院\t409174\n麻章区\t409175\n赠我予\t409176\nEMBASSY\t409177\n老痒\t409178\n逸闻\t409179\n红嘴\t409180\n玛瑟里顿\t409181\n增生期\t409182\n惜命\t409183\n浙江省医学会\t409184\ngoodtime\t409185\n阻生\t409186\n赵伟\t409187\n七款\t409188\n迪菲亚\t409189\n朱波\t409190\n优越感\t409191\n金科天元道\t409192\n金皮\t409193\n郑荣禄\t409194\n奔驰公司\t409195\nactor\t409196\nghostwin7\t409197\n洁面乳\t409198\n中国数字科技馆\t409199\n修鞋\t409200\n滨江园区\t409201\n桂林理工大学南宁分校\t409202\n飘摇\t409203\n议事\t409204\n中国能源大学\t409205\n5442\t409206\n刘胖子\t409207\nBTspread\t409208\n贵州教育网\t409209\n直子\t409210\n三馆\t409211\n刘宽\t409212\n西宁市科技局\t409213\n哈尔滨东\t409214\n辛亥年\t409215\nftk\t409216\n艏\t409217\n爱沢\t409218\n喜力啤酒\t409219\n绝地求生大\t409220\nknots\t409221\n宝丰县\t409222\n楼梯板\t409223\n毒丸\t409224\n移动式登车桥\t409225\n童年的发现\t409226\n新余仙女湖\t409227\n南通家纺\t409228\n拉票\t409229\npsb\t409230\n观乎\t409231\n余杭经济技术开发区\t409232\n鲁俊\t409233\n血红素\t409234\nmobile\t409235\n优秀毕业生\t409236\n成都北站\t409237\n柳屋\t409238\nMATE10\t409239\n平仓价\t409240\n深美\t409241\n李铁\t409242\n载誉\t409243\n封神演义\t409244\njeesite\t409245\nblchen\t409246\n小余\t409247\ncnpc\t409248\nngxin\t409249\npq\t409250\nweisheng\t409251\n中华百家姓\t409252\nM115b\t409253\n碘化\t409254\n监督室\t409255\ndci\t409256\n请进来\t409257\n沉鱼落雁\t409258\n关键点\t409259\n绿柱\t409260\n百度开放云\t409261\n队伍\t409262\n坊中\t409263\n江头\t409264\n稀有度\t409265\n100k\t409266\n非中心\t409267\n红牛车队\t409268\n复购\t409269\nd90\t409270\n7080d\t409271\nl450\t409272\nwst\t409273\n车灯\t409274\n川贝清肺糖浆\t409275\n加巴喷丁\t4092
76\n鼠游戏\t409277\n布蕾妮\t409278\n呼和浩特新城\t409279\n习近平总书记的成长之路\t409280\n发青\t409281\n桓仁县\t409282\n口腔修复学\t409283\n天学网\t409284\n我机网\t409285\n乐多港\t409286\n舟\t409287\nRocksDB\t409288\n上海中国银行\t409289\n佛山站\t409290\n上古卷轴5天际\t409291\n国旗\t409292\n1.0.19\t409293\n1500度\t409294\n试车员\t409295\n甲基丙烯酸甲酯\t409296\n罗红霉素胶囊\t409297\n河南经贸职业学院\t409298\nmath模块\t409299\n外带\t409300\n热轧带钢\t409301\nwin/mac\t409302\n智杰\t409303\nStamping\t409304\nDiva\t409305\n双述\t409306\nnov\t409307\n资治通鉴\t409308\n迷走神经\t409309\n卫生证\t409310\n空母\t409311\nMotel\t409312\n世界青年说\t409313\n湘琴\t409314\n杂言\t409315\n田姐\t409316\nprotects\t409317\n重芳烃\t409318\ngif表情包\t409319\n刘星云\t409320\n沢井亮\t409321\n縁\t409322\n管底\t409323\n固接\t409324\n小薇\t409325\nxiao\t409326\n397\t409327\n荆条蜜\t409328\n托底\t409329\n携程网\t409330\nBleeding\t409331\n荆柯\t409332\n陆军\t409333\n纪凌尘\t409334\ndami\t409335\nsnat\t409336\n员额制\t409337\nMM-dd\t409338\n周和平\t409339\n楚路\t409340\nScripts\t409341\n糯米酒\t409342\n应流股份\t409343\n魔蝎\t409344\nv11\t409345\n马尾\t409346\n自寒\t409347\n1.25L\t409348\n阴火\t409349\nwin10网易云音乐\t409350\n越层\t409351\n本田冠道\t409352\n焕颜\t409353\n我老\t409354\n纱机\t409355\n银河娱乐场\t409356\nperse\t409357\n教育督导网\t409358\n第228集\t409359\n附解\t409360\n4103\t409361\n2009年1月1日\t409362\n审驴\t409363\nEquipment\t409364\n海绵状\t409365\n玉天恒\t409366\n纪小龙\t409367\n第一筐\t409368\n乳杆菌\t409369\n李荣灿\t409370\n礼貌用语\t409371\nExplained\t409372\nLRTimelapse\t409373\n风小筝\t409374\nTEK\t409375\n徐州地区\t409376\nJM们\t409377\nHyper-V\t409378\nH3C交换机\t409379\n惊世骇俗\t409380\n王小宁\t409381\n袁江\t409382\n中庚城\t409383\n绝缘材料\t409384\nedu.cn二级域名\t409385\n脉宽\t409386\n李晋\t409387\n戒除\t409388\n淮安市第二人民医院\t409389\n浮沉\t409390\nH61\t409391\nscripts\t409392\n控诉\t409393\n毛泽东\t409394\n180402\t409395\n油画\t409396\n蛇蟠岛\t409397\n陆战队\t409398\n威利\t409399\nbong\t409400\n原生态\t409401\n常德市人民政府\t409402\nppt格式\t409403\n波兵\t409404\nLOT\t409405\n上海民族乐器一厂\t409406\nCB190X\t409407\n绿块\t409408\n越王楼\t409409\n中国科学院动物研究所\t409410\n离合器片\t409411\n风帖\t409412\n难于\t409413\n大连机场\t409414\n印光\t409415\nopenlayers3\t409416\n民族村\t409417\n画江湖之杯莫停\t40941
8\nHcl\t409419\n中共安徽省委宣传部\t409420\nsmo\t409421\n清晰度\t409422\n因扎吉\t409423\n隆林各族自治县\t409424\n七十古\t409425\n马克·吐温\t409426\n王淦昌\t409427\n西亚试剂\t409428\n伪装者\t409429\n如有\t409430\n20160529\t409431\n中国文化大学\t409432\n放弃\t409433\n白娘子传奇\t409434\n明月夜\t409435\n999\t409436\nnetdev\t409437\nbfc\t409438\n伪娘\t409439\n中铁骑士\t409440\n郑素敏\t409441\n42毫米\t409442\n大阪地铁\t409443\n兴安县政府\t409444\n立方分米\t409445\n赵天杰\t409446\nB01\t409447\n刘阿斗\t409448\nart\t409449\n侯发山\t409450\n老秦\t409451\n杨利\t409452\n1一10\t409453\n连根\t409454\n一两个月\t409455\n双穴\t409456\n这月\t409457\n建国门外大街\t409458\n歌手游\t409459\n五老峰\t409460\ncivilian\t409461\n弘益大学\t409462\n五虎上将\t409463\nCentos7\t409464\n未至\t409465\n集团军\t409466\nputian\t409467\n差速锁\t409468\n心理学类\t409469\n曲师大\t409470\n明式\t409471\nLamborghini\t409472\n297\t409473\n舍我其谁\t409474\n高亢\t409475\n少年班\t409476\n尖货\t409477\n最崇拜\t409478\n那村\t409479\nence\t409480\njerryhe326\t409481\n288个\t409482\n卡特琳娜\t409483\n咪蒙\t409484\n行尸走肉第三季\t409485\n飞花\t409486\n盖坦\t409487\n爱奇\t409488\n星驿\t409489\n子宫内膜\t409490\nActiveX控件\t409491\n00000004\t409492\n博兴县\t409493\n釆\t409494\n下款\t409495\n沉积学\t409496\n085\t409497\n声量\t409498\n胡因李玫瑾\t409499\nmilking\t409500\n认价\t409501\n李小平\t409502\n兰灯\t409503\n世事无常\t409504\nAMD速龙II\t409505\n血刀\t409506\n茧蛹\t409507\n云交易\t409508\n回家乡\t409509\n香港中旅\t409510\n法定节假日\t409511\n语文教学与研究\t409512\n亩数\t409513\n0838\t409514\nSeeKHit\t409515\n秦惠文王\t409516\n凉音\t409517\n享优惠\t409518\n2环\t409519\n尼尔机械纪元2B\t409520\n账户管理费\t409521\nNeighbor\t409522\n群岛\t409523\n长枝\t409524\n杭州乐园\t409525\n兰州石化\t409526\n中国华能集团公司\t409527\n人工智能技术\t409528\n九六\t409529\n西陆\t409530\nmellanox\t409531\n大非\t409532\ncrr\t409533\n雨湖\t409534\n硝酸甘油片\t409535\nInteroperability\t409536\n蜀绣\t409537\n丁震\t409538\n恒通\t409539\n达里奥\t409540\n星冰粽\t409541\n16_\t409542\n_书信函_无忧考网\t409543\n山水友\t409544\nRest\t409545\n达芬奇密码\t409546\n胶筒\t409547\n140岁\t409548\nqqlive\t409549\n上冈\t409550\n黑液\t409551\n45亿元\t409552\n上海世茂皇家艾美酒店\t409553\n高臻\t409554\n6000块\t409555\n仟佰宠\t409556\n却\t409557\n侯振鹏\t409558\n第36页\t409559\n绿化苗\t409560\n图解\t409561\n钹\t409
562\n博德之门2增强版\t409563\nJavascript\t409564\n大商人\t409565\n猛龙过江\t409566\n金界\t409567\n四首\t409568\ncest\t409569\n达隆郡\t409570\n王赟\t409571\n周方\t409572\n抗拉\t409573\nwhd\t409574\nmong\t409575\nHibernate3\t409576\nemulation\t409577\n博姿\t409578\n阿拉德英雄传\t409579\n狠打\t409580\n第6套\t409581\nSKIDROW\t409582\n足球比赛\t409583\n7.5万\t409584\nsuse11\t409585\n小时\t409586\ninfectious\t409587\n4A公司\t409588\n罗一笑\t409589\n表情库\t409590\n胎教\t409591\n第一分册\t409592\n雷迅\t409593\n杨子一\t409594\n大潮\t409595\n成都天府国际机场\t409596\n吸毒案\t409597\n音乐包\t409598\n鸣鸟不飞\t409599\n4678\t409600\n锐不可当\t409601\n袁哲\t409602\n海蟹\t409603\nwaka\t409604\n大唐荣耀2\t409605\nGenesys\t409606\n天天钻\t409607\n屡战屡败\t409608\n水川麻美\t409609\n灵台县\t409610\n头带\t409611\nfilorga\t409612\n闭路\t409613\n壹品网\t409614\n土层\t409615\n呼噜小精灵\t409616\n如泉涌\t409617\n先行区\t409618\n中国图书馆学会\t409619\n监外执行\t409620\n学好\t409621\n初效\t409622\n雌激素\t409623\n英业达\t409624\n相对宇宙\t409625\n艾露\t409626\n特尔佳\t409627\n多幅\t409628\n8.2.3\t409629\nExadata\t409630\n河南公司\t409631\n生蚝\t409632\n神盾局特工第二季\t409633\n蛇粉\t409634\n古月娜\t409635\n搏击\t409636\nKTV版\t409637\nxuean\t409638\n金东方留学\t409639\n3400万\t409640\n在内\t409641\n洛龙\t409642\n钚\t409643\n沈阳地铁\t409644\n拨款\t409645\nakm\t409646\ncyw\t409647\n恩施土司城\t409648\n分数段\t409649\nuser32\t409650\nDNF卡顿\t409651\n得意\t409652\n盗墓贼\t409653\n深圳市国资委\t409654\n报警告\t409655\n零行\t409656\ntruss\t409657\nMalaysia\t409658\n江夏区\t409659\nAngle\t409660\n变现\t409661\n第2讲\t409662\n深圳金融办\t409663\n肥鸟\t409664\n跨板受力筋\t409665\n锤子科技\t409666\n疗休养\t409667\n观致5论坛_汽车之家论坛\t409668\n扣扣群\t409669\nCharacteristics\t409670\n镇口\t409671\n非缄默\t409672\n登记员\t409673\n蓄热\t409674\n荀玉根\t409675\n百事达\t409676\n拓建\t409677\n安乡县政府\t409678\n市场价\t409679\n蜜爱\t409680\n煤焦油\t409681\nforme\t409682\n成都地铁9号线\t409683\n19mm\t409684\n困顿\t409685\n溴酸钾\t409686\n西北工业大学机电学院\t409687\n审计员\t409688\n刻不容缓\t409689\n成狂\t409690\n双响炮\t409691\n大_\t409692\n决明子枕头\t409693\n622\t409694\n炽灼\t409695\n中国成语大会\t409696\n回采\t409697\nSchwartz\t409698\n梨子\t409699\n糖尿病并发症\t409700\n抗酸杆菌\t409701\n锰砂滤料\t409702\n南宁吴圩机场\t409703\n方正锐\t409704\n泉山区\t409705\n
黑风城战记\t409706\n梦影\t409707\n分页机\t409708\n澳洲站\t409709\n中国有色金属学报\t409710\n天河国际机场\t409711\n主要\t409712\n战三国\t409713\n1273\t409714\n天崩地裂\t409715\nH13\t409716\n绳艺\t409717\n七十六\t409718\n撑子\t409719\n朵朵\t409720\n朝阳教委\t409721\n柳其儿\t409722\ntrove\t409723\n萧郎\t409724\n深圳牙科医院\t409725\nsiro\t409726\n王顺山\t409727\n铜粉\t409728\n高伟达\t409729\n大时间\t409730\n博德之门\t409731\n服务费\t409732\n心存\t409733\n辽宁号\t409734\n小萝卜\t409735\n首封\t409736\n1.18亿\t409737\ncf穿越火线\t409738\n配角们\t409739\n医理\t409740\nEYES\t409741\nchongzhi\t409742\n绾\t409743\n纬地吧\t409744\nmd5\t409745\n汉音\t409746\n采精\t409747\n滨州政府\t409748\n闭幕式\t409749\n噬谎者\t409750\n塔类\t409751\n文工团\t409752\n日寇\t409753\n弱柳\t409754\nElasticsearch\t409755\n利湿\t409756\n大漠谣\t409757\n民事权利能力\t409758\n猴哥\t409759\nHAProxy\t409760\nPs123\t409761\n微时代\t409762\n曙\t409763\nstyle\t409764\n温都网\t409765\n处理方\t409766\n秦小明\t409767\n杰克·尼科尔森\t409768\n游匣5577\t409769\nQC\t409770\n不调\t409771\n风色幻想6\t409772\n关联营销\t409773\nMicrotek\t409774\nep5\t409775\n10克\t409776\ndocke\t409777\n抽风机\t409778\n双封\t409779\nSupersonic\t409780\nthinkserver\t409781\n汽车研究院\t409782\n跳跃者\t409783\ntddl\t409784\n生根粉\t409785\nRogers\t409786\n黑雁\t409787\n图子\t409788\n精装\t409789\n氧化聚乙烯蜡\t409790\n王秀娥\t409791\nT139\t409792\n打面\t409793\nVitashell\t409794\n钢筋弯箍机\t409795\n杆状\t409796\n回来的路\t409797\nDCR\t409798\n朱敏希\t409799\n江铃全顺\t409800\n仰角\t409801\n智能酒店\t409802\n巴龙\t409803\n林蒙\t409804\n广安门外大街\t409805\n敬畏\t409806\nMM131\t409807\n复兴公园\t409808\n凤凰av\t409809\nspirits\t409810\nERROR\t409811\n苹果酸\t409812\n园林博览会\t409813\nltm\t409814\n云镜\t409815\n李讷\t409816\n江西省发展改革委\t409817\n羊角球\t409818\n小柚\t409819\n金田一耕\t409820\n嗳气\t409821\n5幅\t409822\n杨力\t409823\n10060\t409824\n儿童心理学\t409825\nGlorious\t409826\n【康\t409827\n李大壮\t409828\n潜江龙虾节\t409829\nPES\t409830\n758\t409831\nmutex\t409832\n媚娘\t409833\n第5期\t409834\n龙系\t409835\n大任\t409836\nelon\t409837\n烟器\t409838\n管状电机\t409839\n小儿推拿师\t409840\n十铨\t409841\n第十六周\t409842\n粗放型\t409843\n超高速\t409844\n非限制性定语从句\t409845\nOnekey\t409846\n誓言\t409847\n奥德利\t409848\n东京成田机场\t409849\n视达\t409850\n破产清算\t
409851\n黑糖玛奇朵\t409852\n上海汽车网\t409853\ntreeView\t409854\noppoR15\t409855\n傲龙\t409856\n李远\t409857\n纯招\t409858\n英雄志\t409859\n移液枪\t409860\n荣耀10\t409861\n走马观花\t409862\n体术\t409863\n小饭桌\t409864\n模拟机场\t409865\n友讯达\t409866\n20种\t409867\n布加勒斯特\t409868\n王码\t409869\n合金装备崛起复仇\t409870\n细棒\t409871\n反渗透膜\t409872\n杨安泽\t409873\n靡宝\t409874\n萌宠成长记\t409875\n王宏志\t409876\n白萝卜\t409877\n福特森\t409878\n49家\t409879\n武汉快递\t409880\niCheck\t409881\n500多个\t409882\nrpgmv\t409883\n融贝网\t409884\n11.3_\t409885\n烤火炉\t409886\n鱼腥草\t409887\n杨非同\t409888\n云南农业职业技术学院\t409889\n千千花卉网\t409890\n麻将群\t409891\n稀有鱼\t409892\n机子\t409893\n王李丹妮\t409894\n过去20年\t409895\n银宗\t409896\n2017年6月底\t409897\n王鹤棣\t409898\n九阴\t409899\n陶博士\t409900\n莫萦\t409901\n全麦吐司\t409902\n唐华\t409903\n耀辉\t409904\n王海东\t409905\n小移\t409906\nBracelets\t409907\n青岛汽车东站\t409908\n原莉\t409909\n全款\t409910\n高琳\t409911\n北京住建委\t409912\n北京崇文区\t409913\n败品\t409914\n念地藏经\t409915\nQQ技\t409916\n水原\t409917\n另一表\t409918\n狮子山区\t409919\n马家村\t409920\n机机\t409921\n陶军\t409922\n3.33\t409923\nlora\t409924\nK8S\t409925\n通裕重工\t409926\nRepresentation\t409927\n装卸费\t409928\nFILM\t409929\nel\t409930\n荧光定量PCR\t409931\n黑甲\t409932\n美骑论坛|BIKETO\t409933\n重庆幼儿园\t409934\n基图\t409935\n压力阀\t409936\n2010年11月\t409937\nmac\t409938\n砰砰砰\t409939\n九九女儿红\t409940\n缝机\t409941\n美舞\t409942\nwww.sz84.net\t409943\n欣奕\t409944\neprint\t409945\n浪\t409946\n来回\t409947\n奪\t409948\nStila\t409949\n膜片式\t409950\n乃翁\t409951\n)信息技术有限公司\t409952\n黄多多\t409953\nslower\t409954\n严管\t409955\nMegascans\t409956\n慢性宫颈炎\t409957\nM401\t409958\nMiniGUI\t409959\nQQ五笔输入法\t409960\n高科技公司\t409961\n成都西村\t409962\n三国群侠传吧\t409963\n痕迹\t409964\n达孜县\t409965\nifrs\t409966\n北川美绪\t409967\n华礼门\t409968\n江苏法院\t409969\nCatalog\t409970\n2016年4月8日\t409971\n俊介\t409972\n碰水\t409973\n白蒲\t409974\n海皇牙\t409975\nMAPINFO\t409976\n三茅\t409977\n化学武器\t409978\n试验箱\t409979\n乒联\t409980\n2017年3月7日\t409981\n风湿科\t409982\n德哥\t409983\n_罗\t409984\n庆尚南道\t409985\ndedication\t409986\n走向复兴\t409987\nsweat\t409988\n拉栓\t409989\n1996年\t409990\n杨文明\t409991\nVeeam\t409992\n黑印\t409993\n
中华人民共和国新闻出版总署\t409994\n瘰疬\t409995\n平均分子量\t409996\n第七宫\t409997\n自然人独资企业\t409998\n归类\t409999\n嘉庆君\t410000\nraison\t410001\n弘信\t410002\n绫濑\t410003\n吃垮\t410004\n主令\t410005\n单句\t410006\n相国\t410007\n中国教育报\t410008\n機構\t410009\n52分钟\t410010\n同码\t410011\n文献集\t410012\n风铃花\t410013\n命令式\t410014\n重庆儿童医院\t410015\nmyeclpse\t410016\n雉鸡\t410017\n可变更\t410018\n转告\t410019\n高兴\t410020\n莓良心\t410021\n哥特萝莉社\t410022\n申报期\t410023\nav综合网\t410024\ny40\t410025\n光大永明人寿保险有限公司\t410026\n孝行天下\t410027\n透视眼镜\t410028\n窃听风暴\t410029\nc#textbox\t410030\n铸铁锅\t410031\n马拉车\t410032\n奥秘\t410033\n九鼎\t410034\nframebuffer\t410035\nRX570\t410036\n40万吨\t410037\n米豆\t410038\n紫蓝\t410039\nmiracle\t410040\n300000元\t410041\n白佛\t410042\n鼻漏\t410043\n卫平\t410044\n小浪底水库\t410045\n云聚网\t410046\nvcomp140.dll\t410047\n延城\t410048\n许一君\t410049\n风魔小次郎\t410050\nMac浏览器\t410051\n不念\t410052\n|机\t410053\nMame\t410054\n暂挂\t410055\n北京投资公司\t410056\n福泽\t410057\n播种机\t410058\n变形金刚领袖之证\t410059\nSwallow\t410060\n针阀式\t410061\n不狂\t410062\n徐龙\t410063\n刚果\t410064\n燕洵\t410065\n盆腔\t410066\n养猪户\t410067\n20700\t410068\nterragenesis\t410069\n书写体\t410070\n心理年龄\t410071\n美丽的花\t410072\n谷房网\t410073\n王小利\t410074\n七代半\t410075\n线菌\t410076\n电热恒温培养箱\t410077\n13:30\t410078\n嫁一送\t410079\n900度\t410080\n蜜桃介子\t410081\n花岗岩\t410082\n广西壮族自治区教育厅\t410083\nSPF\t410084\n去化\t410085\n梅雪\t410086\nCSGO中文网\t410087\n杨家湾\t410088\n针织法\t410089\n六道斑\t410090\n卡宾枪\t410091\n神女赋\t410092\n5.03\t410093\n粑粑柑\t410094\n国家人社部\t410095\n法学会\t410096\n妻儿\t410097\nvip域名\t410098\nFSA\t410099\n1020plus\t410100\n心花\t410101\n课外阅读\t410102\n武平县人民政府\t410103\n奇鱼\t410104\nG类\t410105\n小新潮5000\t410106\n大喝\t410107\n全幅相机\t410108\n多维片\t410109\n棋类\t410110\nBehind\t410111\n国债收益率\t410112\n智能投影\t410113\n王光\t410114\n于荣光\t410115\n灵弹\t410116\n中公教育吉林分校\t410117\n组群\t410118\n潜台词\t410119\n太平洋卡\t410120\n怪人\t410121\nLeet\t410122\nmodifiers\t410123\n运管\t410124\n众汇\t410125\n克利夫兰\t410126\n我为喜剧狂\t410127\nsufei\t410128\n商业秘密\t410129\n山河智能\t410130\nyonex\t410131\n停尸房\t410132\n细密\t410133\n145万\t410134\n1.71\t410135\n山东人\t410136\n491\t
410137\n洗脑术\t410138\n天华金号\t410139\n曙光小学\t410140\niphone9\t410141\n壮话\t410142\n百度云_大陆\t410143\nNovember\t410144\nfron\t410145\n希卡利奥特曼\t410146\n雪津啤酒\t410147\n供暖\t410148\n青羊\t410149\n2000平米\t410150\n二十一度\t410151\n卵子\t410152\n极路由器\t410153\n512mb\t410154\n恩率\t410155\n分析学\t410156\nbipolar\t410157\n工业设计工程\t410158\n包心菜\t410159\n卡萨\t410160\n招商银行公司\t410161\n灵性\t410162\nXvideo\t410163\nKB_\t410164\nDrip\t410165\n海曼\t410166\n六分之一\t410167\n藏书阁\t410168\nmod\t410169\npy文件\t410170\n东方广场\t410171\n十杰\t410172\n普洱雅苑\t410173\n雅居\t410174\nkg/m3\t410175\n孙少安\t410176\nHalloween\t410177\nDimitri\t410178\n祖国颂\t410179\n寻麻疹\t410180\n山溪\t410181\n23册\t410182\n湘少版小学\t410183\n经纪人证\t410184\n半圈\t410185\n土家人\t410186\n囊体\t410187\n危桥\t410188\n蒸鱼豉油\t410189\n有色人种\t410190\nxo酒\t410191\n土豪网\t410192\n盘锦职业技术学院\t410193\n腰间盘膨出\t410194\n损溢\t410195\n森雅R7\t410196\nJonathan\t410197\n战略管理\t410198\n岳阳市一人民医院\t410199\n洗碗巾\t410200\n海珠湿地公园\t410201\n骑手\t410202\n九千万\t410203\n马嵬\t410204\nwushi\t410205\n减脂餐\t410206\nps3d\t410207\n私生饭\t410208\n1.86\t410209\nMaggieQ\t410210\nRDLC\t410211\nqie\t410212\n苍凉\t410213\n5栋\t410214\n1766\t410215\n国家\t410216\n明镜台\t410217\n清博大\t410218\n福州市行政服务中心\t410219\nswo\t410220\n尼尔\t410221\n套盒\t410222\n春光美\t410223\n包总\t410224\n20150720\t410225\n相互依存\t410226\nyuer\t410227\n滴滴滴滴\t410228\n_丽人时尚网\t410229\n连池\t410230\n臀沟\t410231\n水质\t410232\ngl穿越文\t410233\n朱儁\t410234\n四芯\t410235\nLVMH集团\t410236\nvae吧_\t410237\n学习题\t410238\n火村\t410239\n华为北研所\t410240\n工具案\t410241\n橙橙\t410242\n双一流大学\t410243\n长板凳\t410244\n养护期\t410245\n戲\t410246\n风车节\t410247\n繁茂\t410248\n贺氏\t410249\n阿加莎·克里斯蒂\t410250\n10幅\t410251\n冬虫夏草\t410252\n谋财害命\t410253\n布兰文\t410254\nChimera\t410255\n黄斑病变\t410256\nvultr\t410257\nluinx\t410258\n骁龙660\t410259\n科用\t410260\n原胞\t410261\n普林\t410262\n床照\t410263\n纹身展\t410264\n国家队\t410265\nje\t410266\n地狱男爵2\t410267\n自然科学版\t410268\n插卡版\t410269\n惯用语\t410270\n工包\t410271\n轻粘土\t410272\n铐\t410273\n送代\t410274\nwasabi\t410275\n西游记之大圣归来\t410276\n小职员\t410277\nsolarbe索比\t410278\n卖点\t410279\n148号\t410280\n1.75%\t410281\n一历年\
t410282\n笑傲江湖ol\t410283\nsheet2\t410284\ndom\t410285\n大申网\t410286\n棋艺\t410287\nargus\t410288\n河南中医学院第三附属医院\t410289\n真华路\t410290\nORACLE\t410291\n印方\t410292\n不赞成\t410293\n陈永华\t410294\n实务版\t410295\n双创周\t410296\n逻辑运算符\t410297\n停胎\t410298\n常驻\t410299\n南开大悦城\t410300\n1859\t410301\n销售\t410302\n瑞芬太尼\t410303\n银泰中心\t410304\n默默\t410305\n中业网校\t410306\n木塑板\t410307\nJedis\t410308\n定态\t410309\n1649\t410310\n黄淮海\t410311\n益母草颗粒\t410312\nRamada\t410313\n宝安汽车站\t410314\n湖北第二师范学院\t410315\narray_slice\t410316\nOMEGA\t410317\n淮剧\t410318\n林佑威\t410319\n嗨镜\t410320\n白湖亭\t410321\n弗洛拉\t410322\n李伯\t410323\nlyb\t410324\n2010年8月\t410325\n520Li\t410326\n开源中国社区\t410327\nBodySlide\t410328\nGWAS\t410329\n礼器碑\t410330\n娜丽丝\t410331\n晋煤集团\t410332\n越多越\t410333\n唐超\t410334\n柜台\t410335\nqita\t410336\n方庄地区\t410337\n有线版\t410338\n安徽省江南十校\t410339\nagv\t410340\nstruts\t410341\n818那些年\t410342\n囯\t410343\nPeninsula\t410344\n延安市政府\t410345\nsan\t410346\n春色\t410347\n法克\t410348\n第十四届\t410349\nmeet\t410350\n40000\t410351\n上海歌舞团\t410352\n非典\t410353\n词场\t410354\n海员证\t410355\n罪妻\t410356\n智能家居展览会\t410357\n七个星期五\t410358\n天津市电缆总厂橡塑电缆厂\t410359\nvCloud\t410360\nworkqueue\t410361\nmadden\t410362\n3.2.2\t410363\n强化器\t410364\n帕尔曼\t410365\n局委\t410366\n答复\t410367\n小方智能摄像机\t410368\n牛王庙\t410369\n负担\t410370\n格桑\t410371\nS9_\t410372\n5月25\t410373\n哺\t410374\n秒租\t410375\n美货\t410376\n33集\t410377\n绕声\t410378\n证券类\t410379\n兄弟版\t410380\n更全\t410381\n新农合大病保险\t410382\nspiders\t410383\n枝丫\t410384\nNEKO\t410385\n总有\t410386\n折800特卖商城\t410387\n多根\t410388\n自稳\t410389\n美琪美雪\t410390\n惨胜\t410391\n全民飞机大战吧\t410392\n韩粉\t410393\n日日夜夜\t410394\n逆流式\t410395\n20周\t410396\n88旧港\t410397\nnike耐克\t410398\n阿里巴巴国际站\t410399\n2017年11月7日\t410400\n摔下\t410401\n卡饭\t410402\nplacement\t410403\n奶僧\t410404\n二进制数据流\t410405\n北京大学环境科学与工程学院\t410406\n巨妖\t410407\n八字纳音\t410408\n篮球场\t410409\nProx\t410410\n月事妹\t410411\n温峥嵘\t410412\n月球\t410413\n森宝\t410414\nGB1589\t410415\n回到明朝当王爷\t410416\n螺旋钢管\t410417\n京广快速路\t410418\n2016-17\t410419\n至义\t410420\n深圳小区\t410421\n关键词\t410422\n路党\t410423\n
灯笔小说网\t410424\n古迹\t410425\nT460s\t410426\n百泰\t410427\n张伯笠\t410428\n禁品\t410429\n6.2.6\t410430\n单碟\t410431\n鳌头\t410432\n航天器\t410433\n海尔冰箱\t410434\nBrother\t410435\n集资案\t410436\n溃坝\t410437\n一丹\t410438\nFOB\t410439\n夜书所见\t410440\n最古\t410441\n百分几\t410442\n湖北省人民政府扶贫开发办公室\t410443\nHeis\t410444\n高帽\t410445\n木塔\t410446\n阿树\t410447\n门静脉\t410448\n秦淮景\t410449\n意识流\t410450\n粘性土\t410451\n梦幻西游卡\t410452\n林产\t410453\nCCTV-6\t410454\n列宁全集\t410455\n04184\t410456\n划不\t410457\nBounty\t410458\n程宇\t410459\n就业处\t410460\n方磊\t410461\n黄沾\t410462\n傲世九重天吧\t410463\n外滩\t410464\n清肺化痰\t410465\n生活大爆炸1\t410466\n钣\t410467\n假吃\t410468\n4千元\t410469\n智能球\t410470\n比色\t410471\n球磨机\t410472\n优立塑\t410473\n9月21日\t410474\n工作能力\t410475\n好评率\t410476\n重阳投资\t410477\n赵丽\t410478\n3头\t410479\nPhuket\t410480\n花果山\t410481\nroche\t410482\nopenface\t410483\n海市\t410484\n周期性\t410485\nCreateJS\t410486\n国金证券\t410487\nABAC\t410488\n西北侧\t410489\n静安区中心医院\t410490\n隐现\t410491\n平沙落雁\t410492\n中农富通\t410493\n双心\t410494\n捡尸\t410495\n1901\t410496\n多如牛毛\t410497\n凤凰女\t410498\n偶感\t410499\nChatKing\t410500\n昆城\t410501\n韩迷\t410502\n英国考文垂大学\t410503\n河南自考网\t410504\n席格\t410505\n甲醚\t410506\n尾尾\t410507\n偶题\t410508\n猫友们\t410509\n花冈实太\t410510\n中国能源建设集团\t410511\nCiLiSou\t410512\n仲秋\t410513\n私募股权投资基金\t410514\nRecon\t410515\n龙口西路\t410516\n石鼓镇\t410517\n低段\t410518\n旋子\t410519\n余笙\t410520\n梦幻西游手游_16163游戏网\t410521\n核技术\t410522\nhinton\t410523\n回礼\t410524\nBudgeting\t410525\n渭华起义\t410526\nsbf\t410527\n流感病毒\t410528\nrosso\t410529\n筒骨\t410530\n小米MIX2\t410531\n阴丽华\t410532\n西南师范大学出版社\t410533\n南京桥北\t410534\nUI动效\t410535\n色调\t410536\n赵姬\t410537\n汽车级\t410538\n格志\t410539\nmst\t410540\n一百万元\t410541\nBetterLesson\t410542\n安东尼·霍普金斯\t410543\n衡中\t410544\n光波炉\t410545\n逾重\t410546\n凝血酶\t410547\n漳州卫生职业学院\t410548\n人生的路\t410549\n悲观主义\t410550\n大篷车\t410551\n玉菇甜瓜\t410552\nKuta\t410553\n京东大道\t410554\n调料\t410555\nTHREAD\t410556\n迪拜\t410557\n水区\t410558\n忘羡\t410559\n布列斯特\t410560\n路母\t410561\n交通标志\t410562\nBeta3\t410563\n河南大学文学院\t410564\n汉城壹号\t410565\n东南DX7\t410566\n新运\t410567\n凤阳政府
网\t410568\n问康\t410569\nMssql\t410570\n济源论坛\t410571\n瑞安市政府\t410572\n740li\t410573\n福州市地方税务局\t410574\n919路\t410575\n折线统计图\t410576\n丰树\t410577\n华远集团\t410578\npoc\t410579\n保用\t410580\n新加坡\t410581\n佳能7d2\t410582\n自制力\t410583\n伞车\t410584\n柳园\t410585\n478\t410586\n己二腈\t410587\n绳上\t410588\n洛阳协和医院\t410589\nOPlayer\t410590\n好久好久\t410591\nmarsggbo\t410592\n0836\t410593\n法例\t410594\n位移\t410595\n桃皮\t410596\nNiche\t410597\n掩模\t410598\n贴夫\t410599\n中国建筑业协会\t410600\n龙游新闻网\t410601\n山西省食品药品监督管理局\t410602\ntoastr\t410603\n无名者\t410604\n一星级\t410605\n炉火纯青\t410606\nEditage\t410607\n病毒\t410608\nTOEIC\t410609\nmakeupforever\t410610\nCDM\t410611\nWebContent\t410612\n史磊\t410613\n昆明盘龙区\t410614\n樱雪\t410615\n想定\t410616\nBeautifulsoup\t410617\n玄音\t410618\n325i\t410619\n如牛\t410620\n送签\t410621\nMIXNINE\t410622\n咸阳市人力资源和社会保障局\t410623\n航亿苇\t410624\n流放之路命运卡\t410625\n北碚区\t410626\n炎龙骑士团\t410627\nゲラン\t410628\n万人口\t410629\n权志龙\t410630\n望闻问\t410631\n秦时丽人明月心\t410632\n丽塔\t410633\n两重天\t410634\n氯酸盐\t410635\n四时\t410636\n门厅\t410637\n处\t410638\n正巧\t410639\n_淮师新闻网\t410640\n陆铭\t410641\n来不自禁\t410642\n文悦新青年体\t410643\nyous\t410644\n盗圣\t410645\n第183集\t410646\n20161024\t410647\nA.2\t410648\n命者\t410649\n横隔\t410650\n张雪峰\t410651\n阜兴集团\t410652\n明国\t410653\n鄱\t410654\n苏菲\t410655\nKeystone\t410656\nRMVA\t410657\n碳棒\t410658\n优胜美地\t410659\n3月31日\t410660\nMPO\t410661\n边缘柱\t410662\n骨力\t410663\nVoronoi\t410664\n杨建伟\t410665\n暗网\t410666\n税政\t410667\ncrap\t410668\n2012年\t410669\n网龙公司\t410670\n联合国\t410671\n死磕\t410672\n滚动拖动时行/列固定不动_\t410673\n阶级性\t410674\n87年\t410675\nわ\t410676\n周清\t410677\n斡旋\t410678\n金翎奖\t410679\n王国平\t410680\n回声\t410681\n滨州安监局\t410682\n洋参\t410683\n哈哈大笑\t410684\n塑封袋\t410685\n房屋租赁税\t410686\n建设银行企业网上银行\t410687\n双飞粉\t410688\n黄龙溪\t410689\nRollei\t410690\n各国人\t410691\ny470\t410692\n娇娇\t410693\n监督法\t410694\n经常性\t410695\nIb\t410696\n可编程\t410697\nbrower\t410698\n201314\t410699\n乡贤\t410700\n朔源\t410701\n仁寿县\t410702\n全息网\t410703\n还须\t410704\n第77\t410705\n碳钢管\t410706\n蓝猫淘气3000问之太空历险记\t410707\n绝地求生宏\t410708\n昆明冶金高等专科学校\t410709\nDIY综\
t410710\n民主\t410711\n回天\t410712\n永辉超市\t410713\n牌术\t410714\nFlamingo\t410715\n王楚钦\t410716\n无能\t410717\n咖啡壶\t410718\n豁然\t410719\n大南门\t410720\nVIN码\t410721\n中车长客\t410722\n天河软件园\t410723\nUnicode码\t410724\nprovisioning\t410725\n905\t410726\n还是一样\t410727\n板边\t410728\npe\t410729\nProg\t410730\n耐基\t410731\n吕鹏\t410732\n打趣\t410733\n54亿元\t410734\n米折网\t410735\nairline\t410736\n麻利\t410737\nnickTimer\t410738\n极付\t410739\nDraggable\t410740\n5.6.33\t410741\n009期\t410742\n易错字\t410743\n空中客车公司\t410744\n甲午中日战争\t410745\nJungle\t410746\n南岔\t410747\n临翔\t410748\nAD10画\t410749\n这边\t410750\n截割\t410751\n38356\t410752\n一道光\t410753\n星河大帝吧\t410754\nv2.4.2\t410755\n蛋黄酥\t410756\n佐德\t410757\n友讯\t410758\nFirebase\t410759\n龙门水都\t410760\n水语\t410761\n乘用车\t410762\n页底\t410763\nzipaitoupai\t410764\n900g\t410765\n四探针\t410766\n乃木坂\t410767\n瑞丽\t410768\n+VSCode\t410769\nexited\t410770\n火帽\t410771\n长龙\t410772\n张闻天\t410773\n课外科技\t410774\n6.8元\t410775\n香榭丽\t410776\n晁州\t410777\n壮怀\t410778\n三生三世十里桃花吧\t410779\n蜜桔\t410780\n张文亮\t410781\n小李子\t410782\n张丹\t410783\nlbe\t410784\n周于希\t410785\n茬\t410786\n限售股份\t410787\n唐娜\t410788\n余涛\t410789\n白门楼\t410790\n沙甲\t410791\nNetlink\t410792\n幅画\t410793\n到宿\t410794\n白块儿\t410795\n安镇\t410796\n浙江新闻网\t410797\n貸\t410798\n小达\t410799\n两只狗\t410800\nbbs.uuu9\t410801\nCanny\t410802\n霍城县\t410803\n云智\t410804\n中国科协办公厅\t410805\n排超\t410806\n六争\t410807\n伍尔芙\t410808\n上洛\t410809\nbeer\t410810\n奔驰G级\t410811\n本小说\t410812\n全辑\t410813\n潜水电泵\t410814\n阿克塞\t410815\n黄琦\t410816\n招标公告\t410817\ndreamweave\t410818\n谱面\t410819\ne路航\t410820\n5000日元\t410821\n岫\t410822\nIL2CPP\t410823\n烧死\t410824\n全安素\t410825\n半页\t410826\n二分钟\t410827\n克莱尔\t410828\n痛痒\t410829\n自个儿\t410830\n问题儿\t410831\n私宅\t410832\n24根\t410833\n1269\t410834\n委会\t410835\nHOLD\t410836\n基轮\t410837\n裳\t410838\n电玩\t410839\n400i\t410840\n看一看\t410841\n負\t410842\n家缘\t410843\n重庆市交通委员会\t410844\n1254\t410845\n宽容\t410846\n警校生\t410847\n取食\t410848\n亚虎\t410849\n趴趴\t410850\nsixty\t410851\nPC版\t410852\nxs\t410853\n3]\t410854\n鄞州南部商务区\t410855\n10.1英寸\t410856\n7.3下\t4108
57\n英文小说网\t410858\n喘不过气\t410859\n麻子\t410860\n加工厂\t410861\n火狼\t410862\n市城投集团\t410863\n哈西大街\t410864\n边际替代率\t410865\n防旱\t410866\n滨江和城\t410867\nziti\t410868\nfc2\t410869\n一张张\t410870\nFrames\t410871\n试讲\t410872\n基模\t410873\n北川爱莉香\t410874\n梦色\t410875\n大冈\t410876\n同事\t410877\nCW\t410878\n乐度\t410879\n奇台县\t410880\n录名\t410881\n上海办事处\t410882\n二天\t410883\n竖放\t410884\nswf文件\t410885\n6.6分\t410886\n离谱\t410887\n母亲的爱\t410888\nnexus6p吧\t410889\nPHPWind\t410890\n400小时\t410891\nvgm\t410892\n鸡歌\t410893\n锁骨处\t410894\n家族崩坏\t410895\n东风标致5008\t410896\n大太阳\t410897\nrah\t410898\n以逸待劳\t410899\n100万千瓦\t410900\n20180326\t410901\n广州白云机场\t410902\n建国\t410903\n王牌特工:特工学院\t410904\neca\t410905\nELISA试剂盒\t410906\nadequate\t410907\ngogh\t410908\n燕鱼\t410909\n1012\t410910\n中箭组\t410911\n七件\t410912\nChurch\t410913\nMiao\t410914\n一点半\t410915\n求教\t410916\n克拉斯\t410917\n重庆)自由贸易试验区\t410918\n莎拉娜\t410919\n中国钢结构人才网\t410920\n加征\t410921\n僻\t410922\n太阳生\t410923\n齿轮减速器\t410924\n庐山火车站\t410925\n1408\t410926\n初八\t410927\n将相和\t410928\n弱点\t410929\n中国电建集团昆明勘测设计研究院有限公司\t410930\n桥南\t410931\nspro\t410932\n路漫漫其修远兮\t410933\nRomero\t410934\n大祥\t410935\n济南市环境保护局\t410936\n新华南路\t410937\niPad浏览器\t410938\n快信\t410939\n济南市城乡建设委员会\t410940\n尼米兹\t410941\n蒲城县人民政府\t410942\n航天晨光股份有限公司\t410943\n刘昊然\t410944\n沪剧\t410945\n太阳村\t410946\n上海药物研究所\t410947\n阳泉日报\t410948\n绅士宝\t410949\nradosgw\t410950\n杨炼\t410951\n蓝龙德\t410952\n370号\t410953\nplaybook\t410954\npromising\t410955\n论说文\t410956\n扶\t410957\n拉鲁拉丝\t410958\n代码篇\t410959\nRepeat\t410960\nhotspot\t410961\nTopGear\t410962\n中山大学环境科学与工程学院\t410963\n腾讯网游加速器\t410964\n左\t410965\n卡宴逍遥兵王\t410966\n驯兽师\t410967\nconnor\t410968\n凯撒\t410969\nAuthorship\t410970\n80次\t410971\nyogurt\t410972\n快云\t410973\n蔡文胜\t410974\n1271\t410975\n子卡\t410976\n圈层\t410977\n王向阳\t410978\n多形\t410979\n拼多多拼团\t410980\nreload\t410981\n书包\t410982\n天差地别\t410983\n普鲁斯特\t410984\n台牌\t410985\n20160624\t410986\n南阳新区\t410987\n一博\t410988\n圆头\t410989\n瓦里安·乌瑞恩\t410990\n公排\t410991\nFed\t410992\n巴西\t410993\nTTC\t410994\n腾飞大厦\t410995\n骆驼山\t410996\n街脚\t410997
\n史航\t410998\n北侧\t410999\n海狼\t411000\n风湿药\t411001\n港大深圳医院\t411002\n方子哥\t411003\n底砂\t411004\nsys\t411005\n玉儿\t411006\nPACKAGE\t411007\n黑暗之心\t411008\n慎二\t411009\n骚动\t411010\nzcx\t411011\n色花\t411012\n2009款\t411013\n奥士康\t411014\n评析\t411015\n怀了孕\t411016\n蒙面唱将\t411017\n卡纳塔克邦\t411018\n悠久\t411019\n梁赞\t411020\n何冲\t411021\nGates\t411022\n8200元\t411023\n高佬\t411024\n转发\t411025\n先天下\t411026\n大部头\t411027\n武魂\t411028\n杨英\t411029\neMule\t411030\n新马路\t411031\n睿达杯\t411032\n艺涛\t411033\n全拼\t411034\n末途\t411035\n鹿角胶\t411036\n双珑\t411037\n男巫\t411038\n老大难\t411039\nCumshot\t411040\nMTI\t411041\n集成式\t411042\n谦抑\t411043\n事业部制\t411044\nsip协议\t411045\n雅鲁藏布江\t411046\n党魂\t411047\n周渝民\t411048\njjk\t411049\nAdjustment\t411050\n蓝斯诺\t411051\nTGP\t411052\nナマ\t411053\n模糊综合评价法\t411054\nTCP/IP协议\t411055\n社会稳定风险评估\t411056\nCombining\t411057\n拣货员\t411058\nctags\t411059\n20150624\t411060\n点米网\t411061\n韩祥波\t411062\n凤翔\t411063\nzukedge\t411064\n淮海网\t411065\n五道口金融学院\t411066\nNVIDIA显卡驱动\t411067\n深圳第三人民医院\t411068\n燕尾夹\t411069\n1分钟\t411070\n天竺葵\t411071\nfid\t411072\n戴宗\t411073\nAspen\t411074\n苏伊士运河\t411075\nTL-WDR5620\t411076\nMicron\t411077\n人性的弱点\t411078\n中央环保督察组\t411079\n设身处地\t411080\n大三阳\t411081\n卡卡罗特\t411082\n蓝忘机\t411083\n农家子\t411084\n美图v6\t411085\n大蛇无双\t411086\n全英\t411087\n黄觉\t411088\n阅兵\t411089\n炼丹师\t411090\n拥吻\t411091\n蝇子\t411092\nHunted\t411093\n新精武门2\t411094\nv5.0.2\t411095\n广州市荔湾区人民政府\t411096\n亲宝儿歌大全\t411097\n巡视组\t411098\n52号\t411099\n河南省卫生厅\t411100\n任\t411101\n哥伦比亚大学\t411102\n⑶\t411103\nV10\t411104\nertong\t411105\n供应商库\t411106\nFinalCutPro\t411107\n引包\t411108\n小米平板吧\t411109\n华\t411110\n提前还款计算器\t411111\nC91\t411112\n重整\t411113\n意大利酒店\t411114\n刘华清\t411115\nMG3\t411116\n乌鲁木齐国际机场\t411117\n西瓜虫\t411118\n远拓\t411119\n赵柯\t411120\napplecare+\t411121\nWorldXML域名世界\t411122\n妓院\t411123\n中国中车股份有限公司\t411124\n陈海峰\t411125\nkali虚拟机\t411126\n夏华\t411127\n广州少年儿童图书馆\t411128\n4370\t411129\n冷却液\t411130\nVLSI\t411131\n朗诵网\t411132\n鲔鱼\t411133\n六三\t411134\n一件好事\t411135\n南山智园\t411136\n金鹰独播剧场\t411137\nstacks\t411138\n针管\t411139\n
36进制\t411140\n步步高\t411141\n物防\t411142\nDigiCert\t411143\n清税\t411144\n1point3acres.com\t411145\n抚宁\t411146\n大商\t411147\n阿巴町\t411148\n室女\t411149\n炒单\t411150\n中冀\t411151\n阿卢\t411152\nsaul\t411153\n涵化\t411154\nblanche\t411155\nNobel\t411156\n好学\t411157\n强电解质\t411158\n金枫\t411159\n华业\t411160\n科曼\t411161\n聚氨酯保温管\t411162\nParking\t411163\n玩牌\t411164\n池州火车站\t411165\n第18046期\t411166\n船板\t411167\n爱情信物\t411168\n杭州日报\t411169\n易名网\t411170\n假植\t411171\n不获\t411172\nc10\t411173\n太太\t411174\n~\t411175\n开发组\t411176\n广州市城市更新局\t411177\n杨莉\t411178\n飞扬旅游网\t411179\n富甲天下3\t411180\n韩嫣\t411181\n邓晶\t411182\nhashMap\t411183\n中药材天地网\t411184\nDatatables\t411185\n传图\t411186\nIF\t411187\nfreemind\t411188\n蛮花\t411189\n差分\t411190\n荧光屏\t411191\n袋鼠岛\t411192\n描述性统计分析\t411193\n武汉)科技有限公司\t411194\n第一帅\t411195\nAmbit3\t411196\n流星花园2\t411197\n水池\t411198\n480GB\t411199\n邹铭\t411200\n弃用\t411201\n文三街小学\t411202\n老特拉福德球场\t411203\n抵扣进项税额_\t411204\n83830\t411205\n鸦片鱼\t411206\n中国科学院理化技术研究所\t411207\n多元主义\t411208\n蜜果\t411209\n榨汁\t411210\n三十余年\t411211\n李金早\t411212\n乌鲁木齐米东区\t411213\nresidents\t411214\n军官证\t411215\n孙晓梅\t411216\n98家\t411217\nBandwagon\t411218\nMILANO\t411219\nform2\t411220\n换妻\t411221\n汽笛声\t411222\np35\t411223\n福朋喜来登\t411224\n赵本六\t411225\n山东航空公司\t411226\n平顺性\t411227\nonkeyup\t411228\n2012吧_\t411229\n藜蒿\t411230\nhssf\t411231\n死面\t411232\n第16话\t411233\n蛇女\t411234\n张杰\t411235\n何可欣\t411236\n马来\t411237\n智能钥匙\t411238\n12品\t411239\n美图秀秀来\t411240\n7.35dps\t411241\n华清远见嵌入式学院\t411242\n波西亚时光\t411243\n尤里斯\t411244\n甄妮\t411245\n约局\t411246\n乳\t411247\n阿提哈德\t411248\n一睹为快\t411249\n松骨\t411250\nsurface4\t411251\nimplement\t411252\n_把\t411253\nav号\t411254\n裱花袋\t411255\n良明\t411256\n我黄金光辉的人生\t411257\nNox\t411258\n五角星\t411259\n热敏打印机\t411260\n烟柜\t411261\nN9100\t411262\n影印件\t411263\n本征\t411264\n菊部\t411265\nv2.5.1\t411266\nUploader\t411267\n多张\t411268\n外岛\t411269\n蕉城区\t411270\nRH\t411271\n0791\t411272\n治疗方\t411273\n银隆新能源\t411274\nyeezy350\t411275\n宁波旅游网\t411276\nOpenLayers3\t411277\n大千\t411278\n朝语\t411279\n攻婚\t411280\n鼎信长城\t411281\
n面内\t411282\n邢台政府\t411283\n黑影\t411284\n划车\t411285\n济州联\t411286\nXE8\t411287\n淘宝浏览器\t411288\n非因工\t411289\n矿泉水\t411290\n洽购\t411291\nIdeal\t411292\n上海畅\t411293\npixark\t411294\nudig\t411295\n倾世皇妃\t411296\n北京人艺\t411297\n肘击\t411298\n前灯\t411299\n韦瑞德\t411300\n淮安半岛\t411301\nDRM\t411302\n河北搜才网\t411303\n折边\t411304\n王丹妮\t411305\n高模\t411306\n徐海俏\t411307\n三星S\t411308\ncal\t411309\nE11\t411310\n八场\t411311\n莲花街道\t411312\nfloppy\t411313\nbomo\t411314\n管不住\t411315\n刘文斌\t411316\n杨溢\t411317\n术\t411318\n第3卷\t411319\n腰臀比\t411320\n搞笑动图\t411321\n宝莲寺\t411322\n夺魂者\t411323\nmonty\t411324\n绝地求生刺激战场沙漠\t411325\n脉搏波\t411326\n河北路\t411327\n百万行\t411328\n发行人\t411329\n采尔马特\t411330\n2018CF\t411331\n啦啦文学网\t411332\n李世济\t411333\n风行者\t411334\nTribe\t411335\n雅西卡\t411336\n罗红霉素分散片\t411337\n烈火如歌手游\t411338\n可不可\t411339\n应收账款余额\t411340\n点不动\t411341\n8.7%\t411342\n业报\t411343\n蔡振红\t411344\n不朽者\t411345\n蛟河市\t411346\n杨晔蓉\t411347\n白蛋\t411348\n胜场\t411349\nWeUI\t411350\n莱绅\t411351\n红人\t411352\n白桦茸\t411353\n西海龙王\t411354\nTM\t411355\nCrowne\t411356\n三四千\t411357\nbubuchu\t411358\n云计算服务\t411359\n尖利\t411360\n美缝剂\t411361\nallowable\t411362\nwebshop\t411363\n堵漏\t411364\n热膨胀\t411365\npaypal\t411366\n沪江小学资源网\t411367\n19厘米\t411368\nwink\t411369\nsimply\t411370\ng31\t411371\n道真县\t411372\n言情文\t411373\n平野美宇\t411374\n红糖糍粑\t411375\nHeuer\t411376\n冰dk\t411377\n新全职猎人\t411378\n赫然\t411379\n义启\t411380\nmdb\t411381\n插件式\t411382\n五板\t411383\n安兰\t411384\n傣历\t411385\n上送\t411386\n38万\t411387\n喜中\t411388\n兔斯基\t411389\n通气管\t411390\n啲\t411391\n梅丽\t411392\n动易cms\t411393\nide\t411394\n中国航空学会\t411395\n和你\t411396\nZBook\t411397\n广州队\t411398\n无相\t411399\n雷霆咆哮\t411400\nadrenaline\t411401\n王通\t411402\n嚷\t411403\nmate20\t411404\n开元壹号\t411405\n南京车管所\t411406\n减薄\t411407\n徐婷婷\t411408\n蓝月棋牌\t411409\n石砚\t411410\n好六网\t411411\n杂木\t411412\n暗话\t411413\n隆起\t411414\n96号\t411415\nFUJI\t411416\nPrograms\t411417\n橡皮糖\t411418\n睡帽\t411419\n马宁\t411420\nrelocate\t411421\n座无虚席\t411422\n太妍\t411423\nboxbox\t411424\nIBO\t411425\nEmployed\t411426\n10万人次\t411427\n师大人\t411428\n
mm4000\t411429\n李义星\t411430\n杨老师\t411431\n碳酸钙片\t411432\ntraction\t411433\n文明村镇\t411434\n拖头\t411435\nUSNews\t411436\n杨尚川\t411437\n金耳\t411438\n周群飞\t411439\n60首\t411440\n切切\t411441\n楚才杯\t411442\nscroll\t411443\n丰诚\t411444\nglm\t411445\n缩表\t411446\nOC篇\t411447\n汁味\t411448\n深圳爱尔眼科医院\t411449\n灵幡\t411450\n宁波外事学校\t411451\n瘸子\t411452\n邦信\t411453\n46cm\t411454\n薛家镇\t411455\n平安旅游\t411456\n十不该\t411457\n财务规划\t411458\n冯至\t411459\n1.5kw\t411460\nNPK\t411461\nProcess\t411462\n陶尔米纳\t411463\na+2\t411464\n宝山路\t411465\nbaixin\t411466\n自考网\t411467\n药学部\t411468\nutf8编码\t411469\nロリ\t411470\n金丘\t411471\n吔屎\t411472\n段中\t411473\n泅渡\t411474\n思锐\t411475\n高丽菜\t411476\nmaia\t411477\n李晓杰\t411478\nRPN\t411479\nIllusion\t411480\n暗恋文\t411481\n放学\t411482\n四只\t411483\nCBO\t411484\nPXE\t411485\n中华教育网\t411486\n苏星\t411487\n荣昌区\t411488\n危险\t411489\n元元\t411490\n第172\t411491\n50卷\t411492\n国际广场\t411493\n楚格峰\t411494\nvary3.4\t411495\n谐振子\t411496\n车主们\t411497\n台创园\t411498\n丹东新区\t411499\n502号\t411500\n广州市第一中学\t411501\n满分作文网\t411502\nBlade\t411503\n五轮\t411504\n幻想乡吧\t411505\n判死\t411506\n綦江在线\t411507\nIntermec\t411508\n介绍语\t411509\n李定国\t411510\n情深深雨濛濛\t411511\n随园\t411512\n安徽商贸职业技术学院\t411513\n一览图\t411514\n龙腾世纪:审判\t411515\n锦绣苑\t411516\n虎皮猫\t411517\ngodymoon\t411518\n现代文\t411519\nbreeding\t411520\n系用\t411521\nNaga\t411522\ndedecms5.7\t411523\nWORD\t411524\n长坑\t411525\nmacBook\t411526\n长征五号\t411527\n自治区发展改革委\t411528\n圆瓶贴标机\t411529\n页角\t411530\n99日\t411531\n麦绿素\t411532\nRey\t411533\n谭旋\t411534\n下年度\t411535\nHelmets\t411536\n2528\t411537\n杨家坪\t411538\nz475\t411539\n正字\t411540\n防CC\t411541\n酥酥\t411542\n鸿叶\t411543\n大病险\t411544\nIRON\t411545\nSonic\t411546\nDISCOVERY\t411547\nJodie\t411548\n戴建华\t411549\n革命烈士\t411550\n国子监\t411551\n样文\t411552\n凡科建站\t411553\n物联网智能家居\t411554\n光威\t411555\nmiix2\t411556\n惠兰\t411557\n多片\t411558\n独立\t411559\nB48\t411560\n完成\t411561\n异种\t411562\n咚咚锵\t411563\n融汇半岛\t411564\n95cm\t411565\n刘晓华\t411566\nAppl\t411567\ntookit\t411568\n1940年代\t411569\n正态分布图\t411570\n脱档\t411571\nAMBA\t411572\n层次性\t411573\n蔷
薇\t411574\n十场\t411575\n党群部\t411576\n民参军\t411577\n就擒\t411578\n300题\t411579\n宠物心法\t411580\n理论力学\t411581\n恩光\t411582\n3118\t411583\n普泰\t411584\n马场镇\t411585\n9招\t411586\n舞步学院\t411587\nYamamoto\t411588\n反渗透净水机\t411589\n南侨机工英雄传\t411590\n游友\t411591\n海口国家高新区\t411592\nTw\t411593\nxunlei\t411594\n福建师范\t411595\n资本公积转增\t411596\n浙大紫金港校区\t411597\n九万里\t411598\n辽宁男篮\t411599\n首都体育馆\t411600\n可买\t411601\n展示\t411602\n奇冤\t411603\n练摊\t411604\n入站\t411605\n我要飞\t411606\nPart.2\t411607\n赫子铭\t411608\n玻璃盘\t411609\n北京居住证\t411610\n11G101\t411611\n渠系\t411612\n十余\t411613\ncheckBox\t411614\n藏心\t411615\n唯快不破\t411616\nSOHO3Q\t411617\n小熙&屌德斯\t411618\n樱花雨\t411619\n手王\t411620\n黑枸杞子\t411621\n异界套\t411622\n怀进鹏\t411623\n40分钟左右\t411624\n安徽省新闻出版广电局\t411625\n彘\t411626\n仇\t411627\nf55\t411628\n肯特\t411629\n所见\t411630\n细枝\t411631\n颇具\t411632\n1ST\t411633\n良渚镇\t411634\n5标\t411635\n澳芒\t411636\n绘本馆\t411637\n内布拉斯加\t411638\n莲子芯\t411639\nFactorization\t411640\nqn\t411641\nWinSCP\t411642\n神祗\t411643\nboos\t411644\nウォ\t411645\n七折\t411646\n孚日股份\t411647\n铣\t411648\n福州新闻_海峡网\t411649\n1.13c\t411650\n共产党\t411651\n飞云镇\t411652\n宜忌\t411653\n半步\t411654\n庞蒂亚克\t411655\n2.4.18\t411656\n固定资产贷款\t411657\nrDNA\t411658\n丰\t411659\n阿辉\t411660\n泾川县政府\t411661\n次生\t411662\nisnt\t411663\n收声\t411664\n第61批\t411665\n锁舞\t411666\n胯\t411667\nSPN\t411668\n观影\t411669\n钱客\t411670\nFrancois\t411671\n显色\t411672\n边民\t411673\n草海\t411674\n嵛山岛\t411675\nZones\t411676\n阅源\t411677\n6200u\t411678\n永遇乐\t411679\n餐巾纸\t411680\n银科大厦\t411681\n盖饭娱乐\t411682\n夏俊峰\t411683\n马伟\t411684\n乱扔\t411685\n发挥好\t411686\n红衫\t411687\n魔兽争霸3战役\t411688\n荣安驾校\t411689\n14座\t411690\n鞋油\t411691\n上海儿科医院\t411692\n远古传说\t411693\n劲旅网\t411694\ndal\t411695\n习主席\t411696\n500W\t411697\n翠绿\t411698\n文峰区\t411699\n10多万\t411700\n共筑\t411701\n绿箩\t411702\nSS18\t411703\n阑尾\t411704\n6宗\t411705\n学术类\t411706\n黑暗兄弟会\t411707\nn+1\t411708\n听音乐\t411709\nBoolean\t411710\n小咬\t411711\n受信\t411712\nPolytechnic\t411713\n下承式\t411714\n上海国际车展\t411715\n空白格\t411716\n桃井理乃\t411717\n张北草原天路\t411718\n献宝\t411719\n专有名词\t411720\nxiam
en\t411721\n大伊万\t411722\n独辟蹊径\t411723\ndsym\t411724\n成迷\t411725\n3蜂窝\t411726\n索邦\t411727\n轮毂\t411728\n零之执行人\t411729\n180411\t411730\n舒筋活络\t411731\nX5Pro\t411732\n小米手机5s\t411733\n华为上海研究所\t411734\n望京西园三区\t411735\n一印\t411736\n富硒\t411737\n卡德娜\t411738\n教卫\t411739\nBIGEMAP\t411740\n金边瑞香\t411741\n比尔\t411742\n输送链\t411743\n20170910\t411744\nangles\t411745\n两千元\t411746\n狗托\t411747\n收藏册\t411748\nBuildroot\t411749\n生物工程\t411750\npauline\t411751\n再来一次\t411752\nresponsibilities\t411753\n刘华强\t411754\n三委\t411755\n工伤认定书\t411756\n5月5\t411757\n免押\t411758\n芦墟\t411759\n虎猫\t411760\n三分所\t411761\n市川雅美\t411762\n菊园新区\t411763\n撑伞\t411764\n山巅\t411765\n出租车司机\t411766\n优道\t411767\n20150921\t411768\n1514\t411769\n中国动物园\t411770\nSucker\t411771\n晶耀名邸\t411772\n异丁烷\t411773\n豪迈\t411774\n陈进行\t411775\n渔光曲\t411776\n浦阳镇\t411777\n孙茜\t411778\nGeocoding\t411779\n上林\t411780\npsd素\t411781\n疗程\t411782\n防艾\t411783\n分治\t411784\n诚意金\t411785\n黄石北\t411786\nH2数据库\t411787\ntclaw\t411788\nUI框架\t411789\n规则化\t411790\n中欧国际城\t411791\nCadillac\t411792\n兰州石化职业技术学院\t411793\n教师表\t411794\nsout\t411795\nSpringMVC+Mybatis\t411796\n二手电脑\t411797\nWilliam\t411798\n宁乡\t411799\n共康\t411800\n制表\t411801\n第2届\t411802\nZodiac\t411803\n20多位\t411804\n表型\t411805\n摇晃\t411806\nsignal\t411807\nSTM32/STM8技术论坛\t411808\n归队\t411809\n轮盘\t411810\nmvc4\t411811\n交通事故责任认定\t411812\n五连冠\t411813\ndvf\t411814\n毕业设计展板\t411815\nangels\t411816\n常江\t411817\n第2位\t411818\n嫁给\t411819\n大龙经济开发区\t411820\nJDBC连接池\t411821\n7.0分\t411822\n大白腿\t411823\n耿飚\t411824\n慎入\t411825\n区号段\t411826\n买壳\t411827\n李延隆\t411828\n爽哥\t411829\n宵夜\t411830\n冷凝锅炉\t411831\n自私自利\t411832\n2018年7月1日\t411833\n心志\t411834\n一道本\t411835\n静电场\t411836\nreligion\t411837\n猫的报恩\t411838\ncontacts\t411839\naddict\t411840\nMAZDA\t411841\n315号\t411842\n霞山区\t411843\n同一个梦\t411844\n阜沙镇\t411845\n操作指令\t411846\n县纪委监察委\t411847\nsher\t411848\n浔阳楼\t411849\n魔音灵曲\t411850\n豆浆机\t411851\n开元集团\t411852\n二手车之家\t411853\n未定义变量\t411854\n生物信息学\t411855\n打卷\t411856\nenjoyed\t411857\nniushop\t411858\n快捷健\t411859\ng10\t411860\n锤式破碎机\t411
861\n养殖池\t411862\n省属\t411863\nPar\t411864\n当涂县\t411865\n一键删除\t411866\n三土\t411867\n中央戏剧学院\t411868\n居民可支配收入\t411869\n中铁中基\t411870\n即时战略\t411871\n肝郁脾虚\t411872\n同课\t411873\n孙宏岳飞\t411874\n海峡都市报电子版_海都报电子版\t411875\nhouses\t411876\nクラス\t411877\n微链\t411878\n急\t411879\n麟游县人民政府\t411880\nkeyence\t411881\n地瓜干\t411882\n防守者\t411883\nbates\t411884\n150页\t411885\n39000\t411886\nadmission\t411887\n非选择题\t411888\nmax3\t411889\n第十七次\t411890\n凤凰古城\t411891\n参加\t411892\n动漫在线漫画网\t411893\n第五话\t411894\njnu\t411895\n别克凯越\t411896\nWuya\t411897\n托克维尔\t411898\nPythonic\t411899\n_太\t411900\n方口\t411901\n入党思想汇报\t411902\n陕西师大附中\t411903\n宝安区\t411904\n食品科学与工程专业\t411905\nSymbols\t411906\n跃进路\t411907\n井岗镇\t411908\n可不可靠\t411909\n宋光明\t411910\n苦行僧\t411911\nCrossover\t411912\n游戏篇\t411913\n瑞隆\t411914\n曾沛慈\t411915\ntemplate\t411916\n王宋世杰\t411917\n奶萨\t411918\n墨格瑞拉\t411919\n热阻\t411920\n酸碱\t411921\n栗坤\t411922\nDoge\t411923\n卓尔\t411924\nbreakers\t411925\n魏强\t411926\n乐科\t411927\n西安盛赛尔电子有限公司\t411928\n雅莹\t411929\nMI\t411930\n0.1s\t411931\nFreeze\t411932\n中央人民广播\t411933\n称赞\t411934\n肉色\t411935\n0撸\t411936\n塞来昔布胶囊\t411937\n五升\t411938\n%3a\t411939\nskies\t411940\n德云色吧\t411941\n水公园\t411942\nCPUs\t411943\n胸口\t411944\n广西交通投资集团\t411945\nBiz\t411946\n车银优\t411947\n流水图\t411948\n钢帘\t411949\n洁净板\t411950\n智能硬件\t411951\n220万\t411952\n包工头\t411953\n造梦\t411954\nSOUI\t411955\n反射板\t411956\n以为期\t411957\n160729\t411958\n渭南市临渭区人民政府\t411959\n麻糍\t411960\n盖挖\t411961\n马克思是对的\t411962\n主机版\t411963\n李重光\t411964\n送气\t411965\n春日偶成\t411966\n爱腾\t411967\n上海市旅游局\t411968\n百思图\t411969\nRoads\t411970\n猜谜语_猜字\t411971\n新宝2\t411972\n炭化木\t411973\nShannon\t411974\nceair\t411975\n三_\t411976\n化冻\t411977\n平流\t411978\n最多次\t411979\n210\t411980\n新湘\t411981\n射雕英雄传\t411982\n建筑业企业资质标准\t411983\ntxt电子书\t411984\n心火\t411985\n孙丽媛\t411986\n区发改委\t411987\n萍萍\t411988\nmd5.js\t411989\n殉罪者\t411990\n茧丝\t411991\n伟大航路\t411992\n双墩镇\t411993\n美眼\t411994\n炼兽\t411995\n南通东站\t411996\n阿比\t411997\n迹象\t411998\n白木耳\t411999\n畲\t412000\n宁波教育网\t412001\n阳光宽频网\t412002\nm4r\t412003\n2017.1.1\t41200
4\n180回\t412005\n随往\t412006\n展位\t412007\n支撑杆\t412008\n6to4\t412009\n小天鹅\t412010\n叶秋欣\t412011\n九品芝麻官\t412012\n玫瑰情人网\t412013\n接发\t412014\n分飞\t412015\nЯ\t412016\n退居\t412017\n彩喷纸\t412018\n阳光大道\t412019\n扫清\t412020\n驾驶\t412021\n假表\t412022\n年味儿\t412023\n寓见\t412024\n洛阳市农业局\t412025\n英雄好汉\t412026\n嘻哈文化\t412027\n石溪村\t412028\n格林花园\t412029\n寒性食物\t412030\n泽尻\t412031\nDotA2\t412032\n自食\t412033\n公卫医师\t412034\nManjaro\t412035\n古斯塔夫\t412036\n宁波市体育局\t412037\n抵触\t412038\n13.7\t412039\n三针\t412040\n四方协议\t412041\nPA6\t412042\n韩文名\t412043\n翟\t412044\nORB-SLAM\t412045\n软件缘\t412046\nVAS\t412047\n成都市交通运输委员会\t412048\n谋臣\t412049\n催促\t412050\nrefill\t412051\n式神\t412052\n结婚证\t412053\n静电喷涂\t412054\n佳颖\t412055\n链杆\t412056\nXPosed框架\t412057\n开阔\t412058\ntrl\t412059\n名琴\t412060\n中途岛海战\t412061\nourbits\t412062\npaperwhite3\t412063\n吉他谱-虫虫吉他谱\t412064\n那夜\t412065\n达拉斯机场\t412066\n锰铁\t412067\n十八届\t412068\n绿茶软件园\t412069\n雪肌精化妆水\t412070\n华二\t412071\n华为mate2\t412072\n广州红砖厂\t412073\n广东爱情故事\t412074\n恒大童世界\t412075\nabtest\t412076\n苏青\t412077\nEER\t412078\n飞盘\t412079\nnonsense\t412080\n七街\t412081\n转送\t412082\n皂角\t412083\n台徽\t412084\n电缆井\t412085\n柳钢\t412086\n脱骨香\t412087\n朱天天\t412088\n继教\t412089\n400kva\t412090\n铁二院\t412091\n三国全面战争吧_\t412092\n亚伦\t412093\n败光\t412094\ndatabind\t412095\n南海控股\t412096\n马化腾\t412097\n溧阳\t412098\n误打\t412099\n第45话\t412100\n勇者封神传\t412101\n荤\t412102\n雷音寺\t412103\n阿诺德\t412104\nKizuna\t412105\nsourcesafe\t412106\n一世倾城\t412107\n温控阀\t412108\n证婚人\t412109\n37天后\t412110\n陈晔\t412111\n劳技\t412112\ntabe\t412113\n雅客\t412114\ncalibri\t412115\nexec\t412116\n腐败\t412117\n商业地产\t412118\n蛋糕房\t412119\n蒂姆·邓肯\t412120\n最近一个月\t412121\ngcc/g++\t412122\n山口茜\t412123\nZuul过滤器\t412124\n博士学\t412125\n吴宓\t412126\n5月中\t412127\n包干制\t412128\n上海沪剧院\t412129\n3000W\t412130\n普丽普莱\t412131\n第一百一十九章\t412132\n王博文\t412133\nex5\t412134\nshoe\t412135\n5.5.38\t412136\n于建辛\t412137\n红警2吧\t412138\n商学\t412139\ndol\t412140\n水煮三国\t412141\n采芝斋\t412142\n山西省太谷县人民政府\t412143\n日本央行\t412144\n锡兵\t412145\n武汉分公司\t412146\n韶关网\t412147\n傣族舞\t412148\n湖南省高级人
民法院\t412149\neviews6.0\t412150\n鬼角\t412151\n严文井\t412152\n金州新区\t412153\n3.44\t412154\n投机取巧\t412155\n人马座\t412156\n猜\t412157\n百家邦\t412158\n大仁\t412159\n四海网\t412160\n重头再来\t412161\nFRD\t412162\n特斯联\t412163\n河南省农业科学院\t412164\njBPM\t412165\n南宁经济技术开发区\t412166\n2十五\t412167\n正海生物\t412168\nJanuary\t412169\n420号\t412170\nWarrior-魔兽世界\t412171\n妊\t412172\n老天爷\t412173\nTOB\t412174\n504\t412175\n糠醇\t412176\n有机产品\t412177\nStrategic\t412178\n几对\t412179\ncarousel\t412180\nKiosk\t412181\n沪港通\t412182\n两周年\t412183\n临高交锋\t412184\n柞蚕丝\t412185\n北极光创投\t412186\nananconda\t412187\n福尔马林\t412188\n九牧\t412189\n陪你走\t412190\n西南科大\t412191\n新能源车\t412192\n超乎寻常\t412193\ntsf\t412194\n华歆\t412195\n和君集团\t412196\n保温棉\t412197\n郑州大学法学院\t412198\n嗟叹\t412199\n报恩寺\t412200\nallergic\t412201\n集约式\t412202\n周易六十四卦\t412203\nxiaoyan\t412204\n李哥庄\t412205\n山脊赛车\t412206\n编排\t412207\n2016年10月12日\t412208\n乳影\t412209\n地铁4号线\t412210\n富安\t412211\n玩具总动员\t412212\n咽部\t412213\n黄金荣\t412214\n吉原\t412215\n成人自考网\t412216\nk8s\t412217\n6月6日\t412218\n甘肃省民政厅\t412219\n商讨\t412220\n翘角\t412221\n十年以后\t412222\n窦\t412223\n瓣膜\t412224\n基调\t412225\n宜必思\t412226\n601216\t412227\nA+\t412228\n校级\t412229\n外企公司\t412230\n安西\t412231\n垂死挣扎\t412232\n团体照\t412233\n五常市\t412234\n骨性\t412235\nPromoter\t412236\n七二\t412237\n本性难移\t412238\nlgx\t412239\n斗鸡\t412240\ngethub\t412241\n土壤肥料学\t412242\n特南克斯\t412243\n流动资金\t412244\nspectacle\t412245\n医疗卫生\t412246\n唐山湾国际旅游岛\t412247\n点位图\t412248\n李博士\t412249\n明确\t412250\n北京市公园管理中心\t412251\n厦门大学航空航天学院\t412252\nn64\t412253\n鬼上身\t412254\n抒情诗\t412255\n蒙山\t412256\nvanishfan\t412257\nTora\t412258\nJojo\t412259\n计算机程序设计员\t412260\n薛甄珠\t412261\n此种\t412262\nNiu\t412263\n广百股份\t412264\nTNABO\t412265\n地塞米松磷酸钠\t412266\n30V\t412267\n新向\t412268\n江河水\t412269\n拉萨路小学\t412270\n中华人民共和国农业部\t412271\n水绵\t412272\n蜕皮\t412273\n常遇\t412274\n林绵绵\t412275\n长元音\t412276\n辰戌丑\t412277\ngeronimo\t412278\n喷锚\t412279\n阿尔芭\t412280\n亲子教育_无忧考网\t412281\n呼叫等待\t412282\ncomer\t412283\n氯喹\t412284\n第8讲\t412285\nMVC3\t412286\n冷溶\t412287\n瑞和宝\t412288\n组员\t412289\n徐童\t412290\nwcm\t4
12291\n淘宝造物节\t412292\n维山\t412293\n零陵区\t412294\n元凶\t412295\n奇瑞A3论坛_汽车之家论坛\t412296\n人人人\t412297\nHive日期函数\t412298\n箱梁桥\t412299\n针型阀\t412300\n国家卫星气象中心\t412301\n厦门物流公司\t412302\nxiaoying\t412303\n13万\t412304\n奉节脐橙\t412305\ncha\t412306\n乳腺肿瘤\t412307\n明媚\t412308\ndushu\t412309\n2016年4月2日\t412310\n艾栗栗\t412311\n贴壁\t412312\n床垫\t412313\n搭车\t412314\nMONSTA\t412315\n平安大华\t412316\nstartswith\t412317\n鹿角帽粉\t412318\n雁鸣\t412319\n61万\t412320\n欣海\t412321\n对外依存度\t412322\n048\t412323\n板斧\t412324\n河合庄\t412325\n河北经贸大学经济管理学院\t412326\n字幕条\t412327\n爱德文\t412328\n明月山溪\t412329\nVMware_脚本之家\t412330\n28000元\t412331\n20150419\t412332\nSonoma\t412333\n陈美玲\t412334\n北京房产信息网\t412335\nDEPAPEPE\t412336\n玛莉\t412337\nexempt\t412338\n玩得好\t412339\n吴淡如\t412340\n联环药业\t412341\n环境描写\t412342\n乔安娜\t412343\nLooks\t412344\n一个6\t412345\ndwn\t412346\n鲜美\t412347\n莱比锡红牛\t412348\n之中\t412349\n尾号\t412350\n车前子\t412351\n周维\t412352\n没脸\t412353\n无因\t412354\nNvivo\t412355\n红移\t412356\nHDA\t412357\nskoda\t412358\n粤卫\t412359\nbrand\t412360\n棒球棒\t412361\n解构\t412362\n民主与法制时报\t412363\nloss函数\t412364\nswjtu\t412365\n_大公资讯_大公网\t412366\nattack\t412367\n火影忍者4\t412368\n模范\t412369\n编辑们\t412370\n可喜\t412371\n雪霁\t412372\n国际船舶网\t412373\n人门\t412374\ndscp\t412375\n能达\t412376\n铡刀\t412377\n复方甘草口服液\t412378\n起电\t412379\n红薯苗\t412380\n6个月\t412381\n一招\t412382\n救驾\t412383\nreleased\t412384\n方正字体\t412385\n靛颏\t412386\n餐饮服务许可证\t412387\n周六\t412388\ncation\t412389\n45544\t412390\n江源东\t412391\n400回\t412392\n大食\t412393\n父老乡亲\t412394\n九龙沟\t412395\n孝敬父母\t412396\n赵_\t412397\n回单柜\t412398\n美地\t412399\nFound\t412400\n灰绿\t412401\n82万亿\t412402\n谏\t412403\n东南区\t412404\n三年前\t412405\n二千年\t412406\n刘丽娟\t412407\n解器\t412408\n余波\t412409\n哈尔神仙道\t412410\n7n\t412411\n澳大利亚女足\t412412\n细缝\t412413\ngemma\t412414\n国网江苏省电力公司\t412415\npppoe\t412416\nBalancing\t412417\n广西财政厅\t412418\n映众\t412419\n猜忌\t412420\n圆网\t412421\n广东司法警官职业学院\t412422\n趋势\t412423\n进社区\t412424\n平潭\t412425\n夏港街道\t412426\n手筋\t412427\n灭门\t412428\n耗\t412429\n一型糖尿病\t412430\n棉衣\t412431\n中径\t412432\n氮磷钾\t412433\n小闲\t412434
\nMUM\t412435\n石油英才网\t412436\nlengend\t412437\n吸奶器\t412438\n靖宇县\t412439\n中国地产\t412440\n查验\t412441\n魁北克城\t412442\n叶黄素\t412443\n工作票\t412444\n风霜\t412445\n簋\t412446\n环球雅思\t412447\n五年级数学练习册答案-小学五年级数学练习册\t412448\n宋干节\t412449\n显著\t412450\nheadless\t412451\n早味\t412452\n特洛朗逸\t412453\n账载\t412454\naceadmin\t412455\n百年难遇\t412456\nfila\t412457\nlocals\t412458\nPG\t412459\nwald\t412460\n长堎镇\t412461\noppor15吧\t412462\n毫米波雷达\t412463\ntst\t412464\n2669\t412465\n第75期\t412466\n第9张\t412467\n特色型\t412468\n求文\t412469\n乔1\t412470\n两爱\t412471\n租铺客商铺网\t412472\nv6.9\t412473\nTwig\t412474\n评卷\t412475\n维正\t412476\n益玩\t412477\n副产品\t412478\n高圆席慕容\t412479\n负电荷\t412480\nRonaldo\t412481\n九焰至尊\t412482\n60头\t412483\nmeso\t412484\n美宝\t412485\nTOP60\t412486\n智慧树\t412487\n菏泽牡丹花会\t412488\n湿热型\t412489\n脚板\t412490\n甲状腺肿大\t412491\n我疯了\t412492\nhita\t412493\n可视化分析\t412494\norcle\t412495\nwq\t412496\n四模\t412497\nLoren\t412498\n一元信息网\t412499\n快点8分类信息网\t412500\n李海\t412501\nbluesoleil\t412502\n软海网\t412503\n晕血\t412504\ngoes\t412505\nclog\t412506\n环贸广场\t412507\n聚势\t412508\nSPCC\t412509\n好成色\t412510\n江陵路\t412511\n索诺\t412512\nOptimization\t412513\nSistar\t412514\nKolor\t412515\n十恶不赦\t412516\n腾空\t412517\n肺性脑病\t412518\n谈性\t412519\n医药品\t412520\n控制版\t412521\n被淘汰\t412522\nshuxing\t412523\n贝拉米\t412524\nflo\t412525\n2266\t412526\n推普\t412527\n双电机\t412528\n中国科学院南京地质古生物研究所\t412529\n应急\t412530\n硬邦邦\t412531\n富通集团有限公司\t412532\n任一\t412533\nShops\t412534\n全维\t412535\nMuscle\t412536\n843\t412537\n四库全书\t412538\n贞观长歌\t412539\ndiv子\t412540\n免签证\t412541\n视觉传达专业\t412542\n美折\t412543\n兰州火车站\t412544\nITF\t412545\ninotify-tools\t412546\n君民\t412547\n白莲机场\t412548\n打竿\t412549\n寥廓\t412550\n阀口\t412551\n玛瑙石\t412552\nWY\t412553\n弹起来\t412554\n主牌\t412555\n润色\t412556\nuvision2\t412557\n热片网\t412558\n赵启正\t412559\n仲达\t412560\n冰拳\t412561\n金属罐\t412562\n1.14a\t412563\nversions\t412564\n肉芽\t412565\n体育用品\t412566\npicpick\t412567\n出逢\t412568\n吧椅\t412569\n长隆度假区\t412570\n收实\t412571\n中国人民大学统计学院\t412572\n绝地求生PUBG\t412573\n投标书\t412574\nq50l\t412575\nJournalism\t412
576\n5556\t412577\n异度之刃吧\t412578\nBTBBT\t412579\n补领\t412580\n8A\t412581\n小辫\t412582\n安恒信息\t412583\n三迪\t412584\n金匮\t412585\n8245\t412586\n黑爹\t412587\n11.1.13\t412588\ndelve\t412589\n闯进\t412590\n川气\t412591\n海狸\t412592\n安康保险\t412593\nAttorneys\t412594\n热电联产\t412595\nSTI\t412596\n外资股\t412597\n忘不掉\t412598\n流满\t412599\n清柔\t412600\n曾宪梓\t412601\n回退到\t412602\n铁菊花\t412603\n差旅费管理办法\t412604\n宋春丽\t412605\n骟\t412606\n欧本\t412607\norigin8\t412608\n杏眼\t412609\n彩吧论坛\t412610\n懿\t412611\n新西兰松\t412612\n国家权力机关\t412613\n例\t412614\nIE10\t412615\n0216\t412616\nLONGINES\t412617\n城中区\t412618\n老魏\t412619\n卡美洛\t412620\n膏滋\t412621\ng102\t412622\n淘客喵\t412623\n就业地\t412624\n群言\t412625\nIn\t412626\nlongdd\t412627\n1732\t412628\n迷你世界_小皮手游网\t412629\n珊儿\t412630\n山西省住房和城乡建设厅\t412631\n这物\t412632\n棒棒堂\t412633\nDatagridview\t412634\n平台式\t412635\n马少华\t412636\nethereal\t412637\n2017年11月8日\t412638\n山童\t412639\n智能大厦\t412640\n捷宝\t412641\n防撞器\t412642\n弹模\t412643\n哈特曼\t412644\n菌群\t412645\n补漏\t412646\nnotime\t412647\n气节\t412648\n国际贸易单证\t412649\n叶清\t412650\nLumen\t412651\n乖离率\t412652\n神垕镇\t412653\n热贴\t412654\n邻边\t412655\n应\t412656\n那个时代\t412657\n铝碳酸镁片\t412658\n1213\t412659\n接闪\t412660\n高安路\t412661\n随遇\t412662\n一边上\t412663\n第12天\t412664\n吉林省环境保护厅\t412665\n科沃兹论坛_汽车之家论坛\t412666\n排污权\t412667\n上海小区\t412668\n官营\t412669\n沂南教育信息网\t412670\n争抢\t412671\n中国太平洋保险公司\t412672\n中国钢结构协会\t412673\n界段\t412674\ntimespan\t412675\n迅雷+\t412676\n福庆\t412677\n蒋王\t412678\n125度\t412679\n骆春伟\t412680\n长安CX70T\t412681\nPharmacology\t412682\n_八字算命\t412683\nEXCLE2010\t412684\n复式计算器\t412685\n聚影\t412686\n平顶山市\t412687\n中华全国总工会\t412688\n少男\t412689\n富硒大米\t412690\n4档\t412691\n清河湾\t412692\n张桂兰\t412693\n海青镇\t412694\nemission\t412695\n神文案\t412696\n蒜泥白肉\t412697\n旗\t412698\n飞象\t412699\nforwarded\t412700\nhighlow\t412701\n折算率\t412702\n官亭镇\t412703\nhook\t412704\n43层\t412705\nEverytime\t412706\n(\t412707\n硕博论文网\t412708\nzhuimeng\t412709\n亿晶光电\t412710\n勇敢者的游戏\t412711\n學步園\t412712\n大界\t412713\n苏商\t412714\n闽西新闻网\t412715\n溪流竿\t412716\nkva\t412717\n新联在线\t412718\npeter
cao\t412719\n湖北潜江\t412720\n盐务\t412721\n环境保护部环境规划院\t412722\n12栋\t412723\n下院\t412724\n养蟹\t412725\n普敦\t412726\n第二层次\t412727\n外照\t412728\n无糖薄荷糖\t412729\natmos\t412730\nEEE\t412731\n行文\t412732\n蛋羹\t412733\nSight\t412734\n我是歌手3\t412735\n20171219\t412736\nLRC\t412737\n反赌\t412738\n铜丝\t412739\n关东地区\t412740\n赤月\t412741\n3506\t412742\n张倩倩\t412743\n马油\t412744\n明唐\t412745\ntbar\t412746\n投递员\t412747\nXPS15\t412748\n高岚\t412749\nEVEWiki\t412750\n种牙\t412751\n小绿和小蓝\t412752\n罗湖口岸\t412753\nftf\t412754\n20000块\t412755\n见红\t412756\n南宁路\t412757\n哒\t412758\n空单\t412759\n钱盾\t412760\n途途\t412761\n5575\t412762\nDHC蝶翠诗\t412763\n德科\t412764\n室性早搏\t412765\n发动机启停\t412766\n主操\t412767\nregistration\t412768\n锐驰版\t412769\n雨屋\t412770\n星野ナミ\t412771\n宋玉\t412772\n1927年\t412773\n司考卷\t412774\nspringmv\t412775\n圣索菲亚大教堂\t412776\n败絮\t412777\nShips\t412778\ninnovation\t412779\n理想变压器\t412780\n5.15\t412781\n博阳\t412782\n传动件\t412783\n动漫\t412784\n激石\t412785\n军记\t412786\n33oz\t412787\nHTPC\t412788\n300万\t412789\n薪酬管理\t412790\n对角线\t412791\n大骚\t412792\nMizuki\t412793\n陈伟华\t412794\n骁龙835处理器\t412795\n轰动一时\t412796\nJim\t412797\n安阳新闻网\t412798\nFilter\t412799\n梁山泊\t412800\n&comma\t412801\n脚丫子\t412802\nb\t412803\nXDG\t412804\niPhone7/Plus\t412805\nRunLoop\t412806\n熙龙湾\t412807\n机动车辆保险\t412808\n终于明白\t412809\n吴沉水\t412810\n二数\t412811\n1945年\t412812\n斯诺克\t412813\n美如\t412814\n白日梦想家\t412815\nduplicated\t412816\n800X800\t412817\n红荔西路\t412818\n白富\t412819\n镁砂\t412820\n敞\t412821\n葛培理\t412822\n中国大金\t412823\n电源转换器\t412824\n云3D播放器\t412825\n同志性\t412826\n歼11B\t412827\n刘福洋\t412828\nodf\t412829\n公共卫生间\t412830\n六尺\t412831\nPolaroid\t412832\n首师\t412833\n康华医院\t412834\n奇偶性\t412835\nconfigured\t412836\n华摩\t412837\n极白\t412838\n刘大卫\t412839\n西安交通大学管理学院\t412840\n猫和老鼠全集\t412841\n银河铁道999\t412842\n200年\t412843\nHHD\t412844\n超五类\t412845\n校本课程\t412846\n静悄悄\t412847\n唧唧帝\t412848\nmonitor\t412849\n斐波那契数列_\t412850\nprecision\t412851\n葡萄牙国家队\t412852\nPoweramp\t412853\n9011\t412854\n不同\t412855\ntmax\t412856\n北大光华\t412857\n小米note5\t412858\n丁俊晖\t412859\n不留名\t412860\
n徐州日报\t412861\n金葵花卡\t412862\n女U\t412863\n跟岗\t412864\n法兰式\t412865\n划地\t412866\nSATA2\t412867\n本庄优花\t412868\n快讯网\t412869\n上海闸北\t412870\n广州羊城通\t412871\n皂角刺\t412872\n戴瑞珠宝\t412873\n苏净\t412874\n麦岛\t412875\n密信\t412876\ncumt\t412877\n起始页\t412878\n售后率\t412879\n大航海时代ol吧\t412880\n来看一看\t412881\n刘志坚\t412882\nstaining\t412883\n陕西省人才交流服务中心\t412884\n上海体育学院\t412885\n青岛啤酒\t412886\n闪电站\t412887\n9200\t412888\n机动战士高达独角兽\t412889\n白子画\t412890\n穿金路\t412891\n余男\t412892\n20150918\t412893\n千雷\t412894\n方机\t412895\n枝裕\t412896\n珠海网\t412897\n天乐学\t412898\ncJSON\t412899\nfollie\t412900\n续借\t412901\n广东省海洋与渔业局\t412902\n50点\t412903\n/s\t412904\n保列\t412905\n重庆公安局\t412906\n浊法\t412907\n玛法\t412908\n伊利丹\t412909\n稀疏\t412910\ne行\t412911\n江苏省环境保护厅\t412912\nfinn\t412913\n胖哥游记\t412914\n双软认证\t412915\ndikar\t412916\nXiuno\t412917\n龙城区\t412918\n远飞\t412919\nPAHs\t412920\n左胸\t412921\n装字体\t412922\nrolled\t412923\niKNIFE\t412924\n张坂镇\t412925\n襄城区人民政府\t412926\nrandrange\t412927\nexplosive\t412928\n百图汇\t412929\n居庸关长城\t412930\nt480\t412931\n6万8\t412932\n炼石\t412933\nscrapyd\t412934\n穿甲\t412935\n电动投影幕布\t412936\nRelief\t412937\nCSkin论坛\t412938\n蓝白\t412939\n马海战\t412940\n宜通\t412941\nldpi\t412942\n锐克\t412943\nTasmania\t412944\n财经界\t412945\ndodorr\t412946\nPeace\t412947\n一审判决书\t412948\n1.2.15\t412949\n32.5\t412950\njingjing\t412951\ncroll\t412952\n回填土\t412953\n林玉\t412954\n南京证券\t412955\nunicom\t412956\n自酿葡萄酒\t412957\nMacRumors\t412958\n5000台\t412959\n马文才\t412960\nBASS\t412961\n闪电十一人\t412962\n天专区\t412963\nMeshing\t412964\n5卷\t412965\npyconfig\t412966\n调查\t412967\nvab\t412968\nCourts\t412969\nTiO\t412970\n中华三国志\t412971\n首考\t412972\n果子\t412973\n桑托斯\t412974\n母管\t412975\n小鸠\t412976\n杨先生\t412977\n魔女幼熙\t412978\n巨门星\t412979\n7芯\t412980\n开弓\t412981\n3.84\t412982\njiangys\t412983\nPLANET\t412984\n第10话\t412985\n看电影\t412986\n触手可及\t412987\nJETOUR\t412988\n咸安区\t412989\n工商学院\t412990\n希刻劳\t412991\n异次元\t412992\n钢格板\t412993\n读心术与心理学\t412994\n250V\t412995\n锦医卫\t412996\n高加\t412997\n圆包\t412998\n卡友们\t412999\n荣耀模拟器\t413000\nsql2005\t413001\n认认真\t41300
2\n水果类\t413003\n24.4.1\t413004\n尚美\t413005\nzhuti\t413006\n广佛环线\t413007\nmakehuman\t413008\nlibcudnn\t413009\n格瑞洛\t413010\nguided\t413011\n供销\t413012\n武内\t413013\n广财\t413014\n罗氏沼虾\t413015\n桃花工业园\t413016\nRSC\t413017\n七爷\t413018\n艾肯地下城\t413019\n左氧氟沙星滴眼液\t413020\n沃尔沃S80\t413021\n房产证明\t413022\n辽宁省发展和改革委员会\t413023\n何奈\t413024\n夏希南\t413025\n有刺\t413026\nAnd\t413027\n准备金率\t413028\nMATLAB函数\t413029\n星娱\t413030\n七鳃鳗\t413031\n口爆\t413032\nk70rgb\t413033\n表死锁\t413034\n|财经|经济|公司\t413035\n兴达\t413036\n文苑小区\t413037\n1096\t413038\nOdoo\t413039\n碱性食品\t413040\n祛瘀\t413041\nhottest\t413042\nKarina\t413043\nGCT\t413044\n熏衣草\t413045\n红罗羹\t413046\n北京顺义国际学校\t413047\n哈蜜瓜\t413048\n39%\t413049\nDIOR迪奥\t413050\n远东电缆\t413051\n武汉市市\t413052\nScreenshots\t413053\nt检验\t413054\n九木\t413055\ncooike\t413056\ncd版\t413057\nhunting\t413058\n铁山港区\t413059\n逆战天神套\t413060\n奴隶主\t413061\n上海电视节\t413062\n华南理工大学继续教育学院\t413063\n无病呻吟\t413064\nwinlogon\t413065\n德莎\t413066\n老鸦柿\t413067\n推导式\t413068\n中山大学附属第一医院\t413069\n炫云\t413070\n爆竹\t413071\n耗时\t413072\n180323\t413073\n一千万元\t413074\n区委书记\t413075\n望岳\t413076\n1.03\t413077\n栗山\t413078\n余款\t413079\n搜推网\t413080\n董建华\t413081\n眼花\t413082\ntns\t413083\n登云\t413084\n核磁共振氢谱\t413085\ncyc\t413086\n鱼水\t413087\n鲤城\t413088\nexpedia\t413089\nLana\t413090\n同轴喇叭\t413091\n栋栋\t413092\n中宗\t413093\n公转私\t413094\n十发\t413095\n景阳冈\t413096\n清道夫鱼\t413097\n建元风云\t413098\n安数\t413099\n青岛科技大学\t413100\n流行男\t413101\n百媚\t413102\n硅凝胶\t413103\n巴赫\t413104\n精梳\t413105\n杈\t413106\n建筑工程建筑面积计算规范\t413107\n驯化\t413108\n渗氮\t413109\n杰众文学\t413110\n8760w\t413111\n射频仪\t413112\n保安队长\t413113\ngooglemap\t413114\n799元\t413115\n中国铁通\t413116\n蓝盈盈\t413117\n杜拉拉\t413118\nBezier\t413119\n果林\t413120\n利好\t413121\n气垫bb霜\t413122\n_猜成语网\t413123\n括号\t413124\n宝石级\t413125\n山西省环境保护厅\t413126\n600093\t413127\n齐美尔\t413128\noral\t413129\n凤囚凰容止\t413130\n爱上一个人\t413131\neHow\t413132\n嫩江路\t413133\n3女\t413134\n孝感乡\t413135\nah\t413136\nspark2\t413137\n防火板\t413138\n山东高速集团\t413139\n伯克利音乐学院\t413140\n长毛绒\t413141\n沸石分子筛\t413142\n6月初\t413143\n凤凰谷\t4131
44\n乐高侏罗纪世界\t413145\n长期待摊费用\t413146\n网购买\t413147\n云斯顿\t413148\n宁德时代\t413149\nCRISPR\t413150\nP43\t413151\ndotaplus\t413152\n西安医院\t413153\nax2\t413154\n东南三菱\t413155\n清华大学人文学院\t413156\n瓜林\t413157\n敏特\t413158\n盐泵\t413159\n聚丙烯酰胺\t413160\n2尺\t413161\nFXICP\t413162\n圆柱体\t413163\n瑞航\t413164\n数码宝贝剧场版\t413165\n活化石\t413166\nDOUBLE\t413167\n反身性\t413168\n7月14\t413169\n学语\t413170\n膨胀土\t413171\n帕帕\t413172\n急智\t413173\n秋招\t413174\nCBDoctor\t413175\n葛森\t413176\n表级\t413177\n多重耐药菌\t413178\n御景\t413179\n凝成\t413180\n幽客社\t413181\nwelder\t413182\npcgs\t413183\n压控\t413184\n苏州医院\t413185\n天龙八部天佛降世\t413186\n盖垫\t413187\n涂胶机\t413188\n肉柱\t413189\n女色\t413190\nfount\t413191\n花野真衣\t413192\n洁面仪\t413193\nmysql连接池\t413194\n科右中旗\t413195\n米迦\t413196\n大学生校\t413197\n画散\t413198\n蒙特梭利\t413199\n图雷\t413200\n深圳湾体育中心\t413201\n客行\t413202\n像素值\t413203\n洲洲\t413204\n186号\t413205\n杨帆\t413206\n殿\t413207\n树桩\t413208\nCDMA卡\t413209\n100层\t413210\n巨毒\t413211\n证券机构\t413212\n600096\t413213\n0044\t413214\n王者荣耀安卓\t413215\n没了秀\t413216\n汉字学\t413217\nParallel\t413218\n崔中石\t413219\n方荣翔\t413220\n马班\t413221\n香格\t413222\n维融\t413223\n不如\t413224\n31P\t413225\n126号\t413226\n800例\t413227\nwga\t413228\n均\t413229\n竖行\t413230\n四通股份\t413231\n幸福线\t413232\n365bet\t413233\nreducebykey\t413234\n颈部\t413235\n少年JUMP\t413236\nTranswell\t413237\n广东省社会科学院\t413238\ngbe\t413239\n58Game\t413240\n专利登记簿\t413241\n香港保险公司\t413242\n世界贸易组织\t413243\n第17次\t413244\n蚂蚁邦\t413245\n52PK流放之路\t413246\nGM\t413247\n中子数\t413248\n置换机\t413249\n店通\t413250\n只争\t413251\n养猪巴巴\t413252\n第4章\t413253\n阿普唑仑\t413254\n汤峪镇\t413255\n家xbiao.com\t413256\n睡着了\t413257\nipadmini4\t413258\n奥特曼格斗\t413259\n2547\t413260\nfadeIn\t413261\n消费日报\t413262\n四肖三期\t413263\n机关事业单位养老保险\t413264\n163cm\t413265\n尸潮\t413266\n手麻\t413267\n第29集\t413268\n行取\t413269\n湖南省\t413270\n筛分\t413271\n两免一补\t413272\nT450\t413273\n美雪\t413274\n双色球吧\t413275\nromance\t413276\n大江网\t413277\nB12\t413278\n精英型\t413279\n第一根\t413280\n羚羊木雕\t413281\n还钱\t413282\n烟酸\t413283\nチャンネル\t413284\n发行部\t413285\n斯大林\t413286\n上品\t413287\n光明骑士\t41328
8\n单耗\t413289\n维辰\t413290\nkarry\t413291\n生死单元\t413292\n省狂\t413293\n星友\t413294\n苦荞茶\t413295\n碧迪\t413296\n仙桃职业学院\t413297\n片节\t413298\ngiwifi\t413299\n作训服\t413300\n打包员\t413301\n半式\t413302\n竹筐\t413303\nGiants\t413304\n岸桥\t413305\n金银木\t413306\n免征额\t413307\n常州装修公司\t413308\nappear\t413309\n巴特尔\t413310\n16.04\t413311\nHappens\t413312\n对华\t413313\n吸水管\t413314\nHostel\t413315\nhd6770\t413316\n山坡羊\t413317\nwfc\t413318\n乐网\t413319\n机哥\t413320\n袁游\t413321\n凤凰劫\t413322\n东风西路\t413323\n连晋\t413324\n利息\t413325\n百度微盘\t413326\n综合医院建筑设计规范\t413327\n金宝\t413328\nferr\t413329\n3225\t413330\n宕昌\t413331\nwww.1080p.la\t413332\n南湖镇\t413333\n化纤\t413334\n收藏盒\t413335\n希望杯\t413336\n活络油\t413337\n二弦\t413338\n雪豹突击队\t413339\n绫辻\t413340\nwri\t413341\n白色物\t413342\n壮志难酬\t413343\n6369\t413344\n丙子\t413345\n排除\t413346\n坊子区\t413347\ncmw500\t413348\n10m/s\t413349\nlda\t413350\n1800个\t413351\n保险代理公司\t413352\n鄱阳湖\t413353\n提升\t413354\nalicia\t413355\n二跳\t413356\n2026年\t413357\n奴性\t413358\n不为\t413359\n2款\t413360\nsetenv\t413361\n西东\t413362\n顶板\t413363\n那兔之大国梦\t413364\n独孤天下独孤般若\t413365\nPHP5.6\t413366\n宕机\t413367\n2017年1月起\t413368\ngra\t413369\n大逃港\t413370\n滁\t413371\n局灶\t413372\n第143集\t413373\n练兵场\t413374\n开裆\t413375\ndhcpcd\t413376\n印戒细胞癌\t413377\n甘肃省安全生产监督管理局\t413378\ncoal\t413379\n一晃\t413380\n生田斗真\t413381\n恶性黑色素瘤\t413382\n18050期\t413383\n莆\t413384\n11773手游网\t413385\nzhaopian\t413386\n几十\t413387\n延参法师\t413388\nlookfantastic\t413389\nnele\t413390\n张志安\t413391\n部品\t413392\n甘汁园\t413393\n考务费\t413394\n87平\t413395\n岳阳县政府\t413396\n糖浆剂\t413397\nISC\t413398\n20块\t413399\n老桩\t413400\nProductions\t413401\n东子\t413402\n枣庄科技职业学院\t413403\n税种\t413404\n光碟\t413405\naero15x\t413406\n货币政策\t413407\n癌胚\t413408\ncryengine\t413409\nmom\t413410\n单曲循环\t413411\n阿房\t413412\n盛骏\t413413\n莲花北\t413414\n广工\t413415\n安徽省肿瘤医院\t413416\nChance\t413417\neci\t413418\nlooks\t413419\n赵明华\t413420\n世界服\t413421\n测绘人才网\t413422\nMainz\t413423\n3月29\t413424\n邦达\t413425\nhypixel\t413426\n0x8b\t413427\n小印\t413428\n红薯\t413429\n白桃\t413430\n哈利\t413431\n不服气\t41343
2\n內衣\t413433\n叠字\t413434\n手脚架\t413435\n3000小时\t413436\n福建省住建厅\t413437\n成男\t413438\n清莹\t413439\n莉亚\t413440\n2018-03-26\t413441\n61821838\t413442\n讣闻\t413443\n杨\t413444\n北京保洁公司\t413445\n高达创战者\t413446\n脉冲当量\t413447\n王珮瑜\t413448\n100寸\t413449\n荡妇\t413450\n长整型\t413451\n天台新闻网\t413452\n晨光乳业\t413453\n0点\t413454\nGit\t413455\n有限元分析\t413456\n西苑\t413457\n龙虱\t413458\n图象\t413459\n由内而外\t413460\n红豆薏米粉\t413461\nexcel2016\t413462\n缀\t413463\n福海\t413464\n恰恰\t413465\n猎鹿人\t413466\n河子\t413467\ndiv+css模板\t413468\n万表世界\t413469\n粉丝网\t413470\n铭创\t413471\n蹂躏\t413472\n布莱克\t413473\n涉恶\t413474\n张钊汉\t413475\n梭\t413476\n电影性\t413477\nROE\t413478\n桥接\t413479\n前行广释\t413480\n汉化合集\t413481\n驱动程式\t413482\n菲薄\t413483\n积分法\t413484\nfreebuf\t413485\n中小企业\t413486\n寒区旱区科学数据中心\t413487\n泡桐树\t413488\n二广高速\t413489\n冷量\t413490\n快乐就好\t413491\nSwag\t413492\n榜单\t413493\n标化\t413494\n皖江\t413495\n小米盒子mini\t413496\n2例\t413497\n画皮师\t413498\n100幅\t413499\n流史\t413500\n鹿骨\t413501\n基金份额持有人大会\t413502\n金嘉护\t413503\n华中科技大学武昌分校\t413504\n老成\t413505\n出席会议\t413506\n湖南省环保厅\t413507\n旅游度假村\t413508\n数语\t413509\n拍面\t413510\n天津职业技术师范大学\t413511\n68_\t413512\n赤足\t413513\n深圳电影院\t413514\ntexshop\t413515\n债权转股权\t413516\n壮语\t413517\n庶女有毒\t413518\n广西电网\t413519\nbasic\t413520\nshaomine\t413521\n认命\t413522\n浙江省安全生产监督管理局\t413523\nsupers\t413524\n380V\t413525\nBelt\t413526\n灵粮\t413527\n紫苑镇\t413528\nanycall\t413529\n55r16\t413530\n天天色在线\t413531\n主干\t413532\n述职报告\t413533\nIDS\t413534\n手动式\t413535\n风寒感冒\t413536\n回家过年\t413537\n儿格\t413538\njava7\t413539\n恰空\t413540\n新舟机场\t413541\n柠檬酸钾\t413542\n黄昏\t413543\nprompt\t413544\n学校教育\t413545\ngongju\t413546\n孤字\t413547\n湖南商学院北津学院\t413548\nde4dot\t413549\n马尔科夫随机场\t413550\n浩特\t413551\n赵晔\t413552\n喀喇沁旗\t413553\n查尔斯顿\t413554\n海战\t413555\nDaryl\t413556\n改掉\t413557\n醋酸铅\t413558\n1937年\t413559\n全新版大学英语综合教程4\t413560\nsubversion\t413561\nX5AA\t413562\n讳莫如深\t413563\n包号\t413564\n条条\t413565\n烽火通信科技股份有限公司\t413566\n乞力\t413567\n发用\t413568\n主打歌\t413569\nTextarea\t413570\n余奕沛\t413571\n加华\t413572\n刀模\t413573\n腐熟\t413574\nBald\t
413575\n西土\t413576\n暴雪娛樂\t413577\n雅柏\t413578\nwsm\t413579\nac米兰\t413580\n二件\t413581\n15.3\t413582\n一丝\t413583\n何金昌\t413584\n手胶\t413585\ncontingent\t413586\n搜药网\t413587\n老祖宗\t413588\n老陆\t413589\n格力中央空调\t413590\n云上宝鸡\t413591\n衣图\t413592\n快传号\t413593\nbiff\t413594\n华润凤凰\t413595\n奔康\t413596\n城市用地分类与规划建设用地标准_\t413597\n去痘印\t413598\n早莺\t413599\n番茄苗\t413600\n火爆网\t413601\n漠\t413602\nNeedle\t413603\nFc\t413604\n贡嘎县\t413605\n董家渡\t413606\n致伤\t413607\n大曲\t413608\n2016年1月26日\t413609\n倾家荡产\t413610\n数字电视机顶盒\t413611\nMambo\t413612\n绵中\t413613\n基三\t413614\n混剪\t413615\nRTSP协议\t413616\n呆帐\t413617\n鞋花\t413618\n微框架\t413619\nMeld\t413620\n詹青云\t413621\n鳥哥\t413622\n微销通\t413623\n硒\t413624\n积微\t413625\n兴安路\t413626\nf码\t413627\nscada\t413628\n饷\t413629\n金川公司\t413630\n农桥\t413631\n太原理工大学研究生院\t413632\n8g\t413633\n3.09\t413634\n萧剑\t413635\n2018年清明\t413636\nW500\t413637\n沈良\t413638\n翘臀\t413639\n闭门器\t413640\n刘君\t413641\n海狮\t413642\n控制篇\t413643\n陈经纶中学\t413644\nrpgmaker\t413645\n善信\t413646\n金发晶\t413647\n大湖\t413648\nNarcos\t413649\nGo语言编程\t413650\nNAMCO\t413651\n5L\t413652\n杭州电子科技大学信息工程学院\t413653\n流放之路召唤物\t413654\n110kw\t413655\ngarden\t413656\n花头\t413657\nargon\t413658\n置物盒\t413659\n三十六计\t413660\n同传\t413661\n终极一班5\t413662\n偶像大师灰姑娘女孩\t413663\n中关村东路\t413664\n23家\t413665\n三瓜\t413666\n洗手间\t413667\n神精榜\t413668\nX1S\t413669\nAssetbundle\t413670\n场口\t413671\n御河\t413672\n舞动旋律2007健身队\t413673\n慕少\t413674\n多肽\t413675\ndatacenter\t413676\n高良姜\t413677\nv1.5\t413678\n电化学\t413679\n西陵区\t413680\n移动积分商城\t413681\n河南师范大学新联学院\t413682\n天空城\t413683\n周政\t413684\n气藏\t413685\n赛德克·巴莱\t413686\n北外网院\t413687\n3023\t413688\n孙维民\t413689\n米轨\t413690\n过流保护\t413691\n对色\t413692\n365回\t413693\n3DMAX2014\t413694\n工费\t413695\n得体\t413696\n二炮电影网\t413697\n领域\t413698\n中国工程热物理学会\t413699\n另类喊麦\t413700\n朱志强\t413701\n灯展\t413702\n千场\t413703\n菜肴\t413704\n好时\t413705\n官者\t413706\n福宝\t413707\n班夫\t413708\n杭州19楼\t413709\n赵师秀\t413710\n中国钢铁新闻网\t413711\n香港卫视\t413712\n坂雪岗科技城\t413713\n太阳能路灯控制器\t413714\n经营型\t413715\n对刷\t413716\n碧城\t413717\n湿机\t413718\n弹簧机\t413
719\n椰子片\t413720\n道通\t413721\n伤病\t413722\n海翼\t413723\nwxapkg\t413724\n初始人物\t413725\n2016.1\t413726\n尿潴留\t413727\n爱德华·蒙克\t413728\n猫小帅儿歌\t413729\n萧穗子\t413730\n汉园\t413731\n中铁七局集团\t413732\nshipping\t413733\n华国\t413734\nPika\t413735\n消逝的光芒信徒\t413736\n几家欢喜几家愁\t413737\nGucci\t413738\npy-faster\t413739\n黄成\t413740\n离心玻璃棉\t413741\nGT610\t413742\nHeather\t413743\n好污\t413744\n大象的耳朵\t413745\n威图\t413746\nfly123456789-ChinaUnix\t413747\n丅恤\t413748\n无辐射\t413749\n多味滋\t413750\n速记员\t413751\n恒易贷\t413752\n0955\t413753\nSCIENCE\t413754\n克林斯曼\t413755\n越南币\t413756\n冰垫\t413757\n天天幻灵\t413758\n金湖广场\t413759\n忻州市\t413760\n揽货\t413761\n水果蛋糕\t413762\nParenting\t413763\n西游之女儿国篇\t413764\n影画\t413765\nAcura\t413766\n谪居\t413767\n双梁\t413768\n中标候\t413769\n大连现代博物馆\t413770\n二郎庙\t413771\n成名之路\t413772\n田中君\t413773\n舍生\t413774\nP成\t413775\n16部\t413776\n教育咨询有限公司\t413777\n西安市市\t413778\nLINES\t413779\n天豪\t413780\n云亭\t413781\n4.4.1\t413782\n同花顺\t413783\n怒\t413784\n掇\t413785\n0468\t413786\n花开半夏\t413787\n韩立刚\t413788\n广场舞队视频大全_播视网\t413789\n周南\t413790\n真地\t413791\n艾子\t413792\n翁开尔\t413793\nrelation\t413794\n谷牧\t413795\nreaper\t413796\n25辆\t413797\n京津新城\t413798\ncontr\t413799\n伍德\t413800\n小秋秋\t413801\n许娜京\t413802\n第70章\t413803\n三局\t413804\n心得篇\t413805\n血刀门\t413806\n琵琶声\t413807\nwoaidong\t413808\n眼眸\t413809\nmonkey测试\t413810\n公告书\t413811\n机动战士高达UC\t413812\n血痣\t413813\n58pic.com\t413814\n90厘米\t413815\n珠江御景湾\t413816\n贸易量\t413817\n龙格库塔法\t413818\nOpengl\t413819\n120道\t413820\n金科中央公园城\t413821\n马耳山\t413822\n中国有色集团\t413823\n过街天桥\t413824\n人力资源社会保障部\t413825\nwarfame\t413826\n电务\t413827\n表情帝\t413828\n瑞豪\t413829\n20150607\t413830\n伍尔夫\t413831\n始发站\t413832\n战隼\t413833\n休克\t413834\n毒蝎\t413835\n峰形\t413836\n名办\t413837\nluya\t413838\nbuckle\t413839\n马尔可夫\t413840\n碱反应\t413841\n华安基金管理有限公司\t413842\n车式\t413843\n增值率\t413844\n优漫\t413845\nstm32f\t413846\nish\t413847\n成效\t413848\n油簧\t413849\n杭政函\t413850\n必要条件\t413851\n300多家\t413852\n太火鸟\t413853\n氟利昂\t413854\n权色\t413855\n601555\t413856\nBourbon\t413857\n胜利精密\t413858\n库尔提拉斯\t413859\n海兰\t413860\
n第十四卷\t413861\n义学\t413862\n浙江省国家税务局\t413863\n转接板\t413864\n中国人寿资产管理有限公司\t413865\n第一块\t413866\nWin8.1\t413867\n卷福\t413868\n6.0英寸\t413869\n台灯\t413870\nbaijia\t413871\n王甜\t413872\n重庆市食品药品监督管理局\t413873\n蓝冰\t413874\nSQLmap\t413875\nF20\t413876\nwillow\t413877\nantigen\t413878\n头架\t413879\n360手机N4S\t413880\nPoint\t413881\n财经委\t413882\n两吨\t413883\n第10位\t413884\n20美元\t413885\n中铝公司\t413886\n这些事儿\t413887\n笔记类\t413888\n临沂火车站\t413889\n34款\t413890\n六善\t413891\ncoherent\t413892\n北京市朝阳区社会保险基金管理中心\t413893\n傅盛\t413894\n主架\t413895\nempty\t413896\n卜易居\t413897\n15季\t413898\n镍粉\t413899\n公交管\t413900\n拳皇2001\t413901\n萧公权\t413902\n泌\t413903\n21.3\t413904\n永康路\t413905\n你我贷\t413906\n捕鲸\t413907\n李清泉\t413908\n第三人\t413909\n快餐简餐\t413910\n内容型\t413911\n尤小刚\t413912\n20毫米\t413913\n湖北移动\t413914\n湖南省烟草专卖局\t413915\n放屁\t413916\n绝味食品\t413917\ninconvenience\t413918\nNYLON\t413919\n家家户户\t413920\n六翼天使\t413921\n纠葛\t413922\n膜性\t413923\n深圳瑞吉酒店\t413924\n五六千\t413925\ncreek\t413926\n观赏性\t413927\npowermill2017\t413928\n帝霸吧_帝霸\t413929\n叠乐\t413930\n陈汤\t413931\n第5款\t413932\n素水\t413933\n马肉\t413934\n金三税\t413935\n小普\t413936\n佳能MP259\t413937\n卖人\t413938\n亮马桥\t413939\n情感\t413940\n夏露\t413941\n庄浪县\t413942\n洪泰基金\t413943\n勾选\t413944\n九鹭非香\t413945\n确认件\t413946\n罗力\t413947\n三色\t413948\n外阴溃疡\t413949\n中安在线\t413950\n真人秀百科\t413951\n双超\t413952\n上海海洋水族馆\t413953\nVBOX\t413954\n千佛山医院\t413955\n昵图网www.nipic.com\t413956\n解放军307医院\t413957\n忠字\t413958\n1080i\t413959\n彩人\t413960\n言和\t413961\n電子\t413962\n二排\t413963\n1000亩\t413964\n该案\t413965\n15年\t413966\n马盘月\t413967\n#000\t413968\n枢纽\t413969\n博客营销\t413970\n火玫瑰\t413971\n野原琳\t413972\nqize\t413973\n草花\t413974\n玄珠\t413975\n折中\t413976\n华莱黑茶\t413977\nsymmetry\t413978\n20161013\t413979\n猎天使\t413980\n萨尔浒之战\t413981\n野牡丹\t413982\n香槟\t413983\n街头\t413984\n151家\t413985\n良渚站\t413986\n数字化\t413987\n2k18mc\t413988\nhelpme\t413989\nmgtv\t413990\n会展\t413991\n国纪\t413992\n第一等\t413993\n2419\t413994\n向暖\t413995\nSensitive\t413996\n墨管\t413997\n对地\t413998\n麋鹿苑\t413999\n魔曲\t414000\n大学生\t414001\n棉类\t414002\n特暴龙历险记\t414
003\n独立学院\t414004\n十根\t414005\n红盾春雷\t414006\n蒙药\t414007\nEFE\t414008\n980m\t414009\n苏州十中\t414010\n甜宠\t414011\n一拖多\t414012\n沙苑子\t414013\n日经中文\t414014\nvsop\t414015\n撤军\t414016\n缘缘\t414017\n反斜杠\t414018\nzones\t414019\n大帝\t414020\n斯科特·阿金斯\t414021\n史小诺\t414022\n坦荡\t414023\n口袋\t414024\n宽城区\t414025\n填资\t414026\n诗谜\t414027\n8X8\t414028\n英叔\t414029\n天人\t414030\naccent\t414031\n高画质\t414032\n6.72\t414033\n选段\t414034\n书商\t414035\n田勇\t414036\n凛\t414037\n坏话\t414038\n阿海\t414039\n北京物资学院新闻中心\t414040\nologit\t414041\n妹\t414042\n沙源\t414043\n畅享\t414044\n安徽师范大学\t414045\n奥园城市天地\t414046\nAntonio\t414047\n杜维明\t414048\n藏秘\t414049\n拉面馆\t414050\n日内\t414051\n兰竹\t414052\n第九宫\t414053\nSand\t414054\n小娥\t414055\n立方英尺\t414056\n生涩\t414057\n戏霸\t414058\n第6卷工口度\t414059\n驾驶证\t414060\n允儿\t414061\n20160719\t414062\n金城医药\t414063\nCite\t414064\n福腾宝\t414065\nves\t414066\n李壮\t414067\n铁规\t414068\nSDS-PAGE\t414069\n亚麻油\t414070\n必利劲\t414071\n头诗\t414072\n播音\t414073\nexplore\t414074\n计算机网络\t414075\n挪用\t414076\n七宝老街\t414077\nrepomd\t414078\n15头\t414079\n最好的时光\t414080\njqprint\t414081\n消分\t414082\ng-queen\t414083\n阳极化\t414084\n重态\t414085\n双绝\t414086\n有机硅防水剂\t414087\nheavily\t414088\n超值版\t414089\nrenli\t414090\n放題\t414091\n纬四路\t414092\n上路\t414093\n7L\t414094\n中国原子能科学研究院\t414095\n巨侠\t414096\n苏北人民医院\t414097\n大冢爱\t414098\n莉兹\t414099\n天宇\t414100\n天府通卡\t414101\n环滩岛\t414102\nbyj\t414103\n罪者\t414104\n筹备处\t414105\n发动机排量\t414106\n武汉大学电子信息学院\t414107\n地筋\t414108\n橡胶块\t414109\n质量效应仙女座\t414110\npets2\t414111\n西安电子科技大学出版社\t414112\nsilicon\t414113\nOxidative\t414114\n春江花城\t414115\nCosi\t414116\nlpush\t414117\n开都\t414118\n代数方程\t414119\n滩涂\t414120\n疯马\t414121\n顾念\t414122\n东洞庭湖\t414123\n柘林镇\t414124\n雷云\t414125\n横栏镇\t414126\n隆昌市政府\t414127\n金蝶K/3\t414128\n博阅\t414129\nblos\t414130\n广东电子\t414131\n游过\t414132\nsparksession\t414133\n方泰\t414134\n雪帝\t414135\n高渐离\t414136\nEMEA\t414137\n星纪元\t414138\nicom\t414139\nTask1\t414140\n渭水\t414141\n柯文哲\t414142\n李革\t414143\n民族唱法\t414144\nsloong\t414145\nChinese\t414146\njordan\t414147\nmori\t414148\n404\
t414149\n广州市中级人民法院\t414150\n朝圣者\t414151\n水谷葵\t414152\n灰司\t414153\n萌萌哒\t414154\nGLK300\t414155\n黄歇\t414156\n热交换器\t414157\n13型\t414158\nK金\t414159\n福美\t414160\n蒋辉\t414161\n洛阳百姓网\t414162\n艾诺迪亚4\t414163\n扫塔\t414164\n匕\t414165\n1.44\t414166\n歪传\t414167\n思达\t414168\n野夫\t414169\n奥沙利铂\t414170\n赫曼陆龟\t414171\n咸水湖\t414172\n美国白宫\t414173\n皮包骨\t414174\n敷疗\t414175\nLED日光灯\t414176\n3dlc2\t414177\n抗凝血\t414178\n杨思敏\t414179\n无锡商业职业技术学院\t414180\n刺陵\t414181\n长度\t414182\n300ms\t414183\n128克\t414184\n中国供销合作网\t414185\n北京福特\t414186\n泰地\t414187\n内宾\t414188\n乐高科技\t414189\nFINE\t414190\n一股一股\t414191\n弹性盒\t414192\n氨\t414193\n充装\t414194\n教学校\t414195\nneighborhood\t414196\nTurning\t414197\nCombination\t414198\n涌起\t414199\nwdr6500\t414200\n乐视MAX2\t414201\n线虫\t414202\n数英\t414203\n改一键启动\t414204\n辍学率\t414205\n天津图书馆\t414206\n武汉纺织大学\t414207\n华星创业\t414208\n特攻式\t414209\n【源\t414210\n奥陨\t414211\nEngine4\t414212\n甘泉\t414213\n好卖\t414214\n本量\t414215\nKinder\t414216\n谢家湾\t414217\n维修业\t414218\ndance\t414219\n谋局\t414220\nhivi\t414221\n致灾\t414222\nDirect3D\t414223\n宾克斯\t414224\nshops\t414225\n李茂\t414226\n气密性\t414227\n纺线\t414228\n过多时\t414229\nDisclosure\t414230\n姚增科\t414231\n5000起\t414232\nsocks5代理\t414233\n硬件\t414234\n11宫\t414235\n机智堂\t414236\n特洛伊奥德赛\t414237\n麻麻们\t414238\n萨鲁法尔\t414239\n2星期\t414240\n速干衣\t414241\n底阀\t414242\n艶堂\t414243\n大体上\t414244\n炮哥\t414245\n禁不住\t414246\n智能聊天机器人\t414247\n裂液\t414248\n武汉东湖生态旅游风景区管理委员会\t414249\n意大利风情街\t414250\n爱鲜蜂\t414251\n千伏安\t414252\ncamunda\t414253\n大旗英雄传\t414254\n胆颤\t414255\n偷藏\t414256\n华南理工大学工商管理学院\t414257\n情深似\t414258\nTPT\t414259\n后现代\t414260\n内蒙古自治区经济和信息化委员会\t414261\n邓氏\t414262\n李大勇\t414263\n柚希\t414264\n狗仔队\t414265\npython2\t414266\nUEStudio\t414267\n微影视\t414268\n杭州育才中学\t414269\n梁原位\t414270\n狄云\t414271\n1.143\t414272\n180分钟\t414273\n在日上\t414274\n空降兵\t414275\n单硝酸异山梨酯片\t414276\n税目\t414277\n雷克灵域\t414278\n兴动\t414279\n换掉\t414280\n世纪剧院\t414281\n双头\t414282\n偏重\t414283\n初露\t414284\n朱买臣\t414285\n和平路\t414286\noffers\t414287\npetshop\t414288\nAlcohol\t414289\n1890\t414290\nwebgame|网页游
戏\t414291\n田七\t414292\n城南小学\t414293\n六礼\t414294\n米珠\t414295\n南京工程高等职业学校\t414296\n北京理工大学研究生院\t414297\n前半个月\t414298\nstorz\t414299\n258套\t414300\njss\t414301\nV90\t414302\n艳门\t414303\n李副总\t414304\n知识点\t414305\n五仁\t414306\n图本\t414307\n穿透式\t414308\n复方甲氧那明胶囊\t414309\n大岛优子\t414310\nIndoor\t414311\nweb2.0\t414312\n绥阳\t414313\n道证\t414314\n贝肯山\t414315\n临床麻醉学\t414316\n文学创作\t414317\nF-15\t414318\nknn\t414319\n万源_万源市政府网\t414320\namps\t414321\n百度图片\t414322\n灵岩南路\t414323\nid值方法小结_jquery\t414324\n赣江\t414325\n博思得\t414326\n唐诗宋词\t414327\n长江刀鱼\t414328\n春苗网\t414329\n羞恥\t414330\n中洲\t414331\n黄河古道\t414332\nmmaa\t414333\n单纯性\t414334\n五寨\t414335\n剩\t414336\n陆霸\t414337\n抽纱\t414338\nDecoder\t414339\n大黑山\t414340\n看门\t414341\n_手\t414342\n电视卫星\t414343\n照着\t414344\n内堂\t414345\n始皇帝\t414346\n体检科\t414347\n股东权益\t414348\n破案率\t414349\nLOF\t414350\n华利\t414351\n健康状况\t414352\n改变\t414353\n马夫人\t414354\n老于\t414355\n11块\t414356\n机载\t414357\n招兵\t414358\nZookeeper\t414359\n十二进制\t414360\nnd\t414361\n10^4\t414362\n9月30日\t414363\n国情\t414364\n劣等生\t414365\n高碑店村\t414366\nRENMIN\t414367\n杨旭\t414368\n引领性\t414369\n浦安修\t414370\n8米高\t414371\n音序查字法\t414372\n佳能5d\t414373\n50300\t414374\n月球灯\t414375\npaul\t414376\nGoPro\t414377\n新澳海底世界\t414378\n银河足球队\t414379\nnpv\t414380\n夏日绝句\t414381\n红海行动电影百度云\t414382\n靶向药物\t414383\n抛物线\t414384\n为营\t414385\nSharepoint\t414386\n密植\t414387\n南梨央奈\t414388\n種付\t414389\n西北大学\t414390\n六孔板\t414391\n班群\t414392\nЙ\t414393\ndegenerate\t414394\n测量器\t414395\n阿拉伯糖\t414396\nside\t414397\n亳州市政府\t414398\n白马王子\t414399\n海友\t414400\nGrapher\t414401\n二套房公积金贷款\t414402\n税后工资计算器2017-2018\t414403\nOrdeal\t414404\n香方\t414405\n独立商城\t414406\n法务部\t414407\n行驶证\t414408\n_联商网\t414409\n莱姆\t414410\n肩并肩\t414411\ninfini\t414412\n春泥\t414413\n知识产权宣传周\t414414\nxinyue\t414415\ngti\t414416\n7000年\t414417\nDespacito\t414418\n喷杆\t414419\nAudio\t414420\n银安奇闻网\t414421\nwin64\t414422\n核素\t414423\ndex2oat\t414424\nMicroscopy\t414425\n1958\t414426\n报关行\t414427\n设计员\t414428\n好眼\t414429\n安江镇\t414430\n白布\t414431\n顾建文\t414432\n原理图\t414433\n
插装\t414434\n白石茉莉奈\t414435\n康哲药业\t414436\n第一时间\t414437\nGW250\t414438\n替硝锉\t414439\nprivilege\t414440\n世界农化网\t414441\n斯托科夫\t414442\n3123\t414443\n浦江县政府\t414444\n炮炮\t414445\n航天科技\t414446\nInc\t414447\n无补\t414448\n不欢而散\t414449\n胡富国\t414450\n风井\t414451\n花泽香菜\t414452\n学成\t414453\n薙切绘里奈\t414454\n倍康\t414455\n扫把\t414456\natcc\t414457\n盲流\t414458\n90年后\t414459\nWWW\t414460\n条款\t414461\n礼包性价比\t414462\ncorolla\t414463\n迷你西游\t414464\nebtables\t414465\n48万元\t414466\ngetenv\t414467\n伐木场\t414468\n滇池路\t414469\n首都师范大学音乐学院\t414470\nmpt\t414471\n100MM\t414472\n组织卖淫罪\t414473\n宝来论坛_汽车之家论坛\t414474\n离心脱水机\t414475\n泛华金控\t414476\n流感嗜血杆菌\t414477\n桌面池\t414478\n绝唱\t414479\n华立集团\t414480\n废活性炭\t414481\n彼得·潘\t414482\n亮金\t414483\n没食子酸\t414484\nJLabel\t414485\nPHP函数\t414486\n15189\t414487\n公共体育场馆\t414488\n春溪笛晓\t414489\n绝地求生号\t414490\n加速期\t414491\n希伯来语\t414492\n李建春\t414493\n碳酸氢钠注射液\t414494\nyac\t414495\n忍辱\t414496\n乐视x900\t414497\n搭设\t414498\nwin7china.com\t414499\n内蒙古国际蒙医医院\t414500\n6500U\t414501\n生娃\t414502\n筱原凉子\t414503\ngetpid\t414504\n明亚\t414505\n汉语言专业\t414506\n泰学\t414507\n上海市环境科学研究院\t414508\nACG小站購物區\t414509\n评语\t414510\n全期\t414511\nACT\t414512\n汉森制药\t414513\n睾哥\t414514\n登机箱\t414515\n漏极\t414516\nenumeration\t414517\n六角龙\t414518\nppl\t414519\n2016-2020\t414520\n客云\t414521\n第2册\t414522\n36本\t414523\n1.50\t414524\n老滚5重制版\t414525\nnmb48\t414526\n唯唯\t414527\n上市公司证券发行管理办法\t414528\n建设工程价款结算暂行办法\t414529\n篆书\t414530\nh7n9禽流感\t414531\nspeakers\t414532\nab型\t414533\n住址\t414534\n肤质\t414535\n色身\t414536\n宫缩\t414537\nafk\t414538\n联储\t414539\n新新球鞋网\t414540\nmani\t414541\n九六年\t414542\n东屏镇\t414543\n举办方\t414544\n奇好\t414545\n水罐车\t414546\nComposition\t414547\nconcat篇\t414548\n干预\t414549\n富家子\t414550\n港区\t414551\n银行抵押贷款\t414552\n沪\t414553\n锦绣前程\t414554\n华夏基金\t414555\n抓图\t414556\n2013-2016年\t414557\n指向型\t414558\nIT之家学院\t414559\n先头\t414560\n没那么简单\t414561\n粤泰\t414562\n拔刀龙矢\t414563\nttl\t414564\n北环路\t414565\n28#\t414566\n硅胶柱\t414567\n库柏\t414568\n综合文化服务中心\t414569\nZOD\t414570\nUITabBar\t414571\n蔡甸\t414572\n艺吧播音主持考级网\t41
4573\n经纬城市绿洲\t414574\nstudioGGB\t414575\n人民画报\t414576\n王旭东\t414577\nwisconsin\t414578\n伊凡雷帝\t414579\n平安\t414580\n二十回\t414581\nalligator\t414582\n葛洲坝集团\t414583\n首都医科大学附属北京朝阳医院\t414584\n展露\t414585\nc币\t414586\n王小明\t414587\n八套\t414588\n一笔一划\t414589\n成都乳腺医院\t414590\n西飞\t414591\n纬五路\t414592\n大紧\t414593\n缩影\t414594\n布鲁玛\t414595\n侦察\t414596\nGraduation\t414597\n刘晓丹\t414598\nyutube\t414599\n陈升\t414600\ndedecms\t414601\nQGraphicsScene\t414602\n无限曙光吧\t414603\nBower\t414604\n南施街\t414605\n女僵尸\t414606\n国侨办\t414607\n清名桥\t414608\n1.2GB\t414609\n何平\t414610\n瞻望\t414611\n蓝化\t414612\n豆腐包\t414613\n4.15国家安全教育日\t414614\n20160302\t414615\n智障\t414616\n可见\t414617\n包粽子\t414618\n北大附属嘉兴实验学校\t414619\n四棵\t414620\n5&#160\t414621\n北京新天地\t414622\n抗金\t414623\n陈述\t414624\n肉骨\t414625\n引气\t414626\nblitz\t414627\n褥子\t414628\n南京大学鼓楼校区\t414629\nMLXG\t414630\nx10.11\t414631\n拒执罪\t414632\n数码型\t414633\nXX镇\t414634\n银桥\t414635\n逃狱三王\t414636\n衡阳日报\t414637\n查课\t414638\nner\t414639\n王星\t414640\n书讯\t414641\n政纪\t414642\n奔驰e200l\t414643\n越狱\t414644\n辻本\t414645\n青古铜\t414646\n波扎克\t414647\ncontext-param\t414648\n秦始皇陵兵马俑\t414649\n布里\t414650\n帝国主义\t414651\n1格\t414652\n胡桃木屋工作室\t414653\n梁恕俭\t414654\n加奈子\t414655\n贾琏\t414656\n芭克硅胶软膏\t414657\n房地产大厦\t414658\n三晶\t414659\ndogma\t414660\njxbrowser\t414661\n这么多天\t414662\nzsj\t414663\n苏绣\t414664\nzhtj\t414665\n马赛\t414666\nnguyen\t414667\n求转\t414668\n逢源\t414669\n图资\t414670\n清机\t414671\n苏果超市有限公司\t414672\nmilky\t414673\nsemen\t414674\nA7m2\t414675\n美岁\t414676\n唐小平\t414677\n离岸公司\t414678\n困扰\t414679\n扒胎机\t414680\n黄煌\t414681\n霍纳\t414682\n600895\t414683\n百洋健康\t414684\ndwm.exe\t414685\n三得利啤酒\t414686\n2周年\t414687\n喷砂房\t414688\n预胶化淀粉\t414689\nCGLIB\t414690\nCTex\t414691\n合箱\t414692\nThreshold\t414693\n海蝶\t414694\nmyisam\t414695\nB860AV2.1\t414696\n农房\t414697\n普贤\t414698\n766冒险岛\t414699\n秒开\t414700\n阮陈恩静\t414701\n复旦中学\t414702\n相同类\t414703\n喷射2\t414704\n连通图\t414705\nListCtrl\t414706\n预调酒\t414707\n怪物猎人P2G\t414708\n部局\t414709\n大明湖畔\t414710\nLoans\t414711\n雕纹\t414712\n吨袋\t414713\nstuidio\t414714
\n受音\t414715\nimperial\t414716\n安乃\t414717\n白雪歌送武判官归京\t414718\n猫捉老鼠\t414719\n规划区\t414720\nhugo\t414721\nBT种子/ED2K\t414722\n干将路\t414723\n懒虫\t414724\nremoveClass\t414725\n20150416\t414726\n美涛\t414727\n通风管道\t414728\npreps\t414729\n五台县\t414730\n鼓点\t414731\n烂掉\t414732\nJboss\t414733\n侨情\t414734\nkyle\t414735\n凯歌\t414736\n400错误\t414737\n余凯\t414738\nexcel常用函数\t414739\n研究中心\t414740\n谨慎性原则\t414741\nCapital\t414742\n黑夜传说4\t414743\n集安吧\t414744\n当事者\t414745\n镍氢\t414746\n肖锋\t414747\n简阳\t414748\n秦龙\t414749\n保养\t414750\n武装战姬\t414751\nBrands\t414752\n外夹式\t414753\n斗级\t414754\npeekaboo\t414755\nscanning\t414756\n中财办\t414757\n烟风\t414758\nSmash\t414759\nVlan\t414760\n泉州市\t414761\n天正默认运行\t414762\n自写\t414763\nConverter\t414764\n3702\t414765\n涣卦\t414766\n龚宇\t414767\nCoreldraw\t414768\n永昌镇\t414769\nrmvb熟肉网盘\t414770\nyis\t414771\n白牧野\t414772\n红糖馒头\t414773\nQCA\t414774\n有缘无份\t414775\njindi\t414776\nt410s\t414777\n江科附中\t414778\n投石器\t414779\n导装\t414780\n缩微\t414781\n实验指导书\t414782\n货币发行\t414783\ndecade\t414784\nResource\t414785\n大西沟\t414786\n邪道\t414787\n五月四\t414788\nColor\t414789\nPom\t414790\nGrep\t414791\n智慧城市_赛迪网\t414792\nl型\t414793\n游艺机\t414794\n神兽\t414795\n电吹管\t414796\n火羽\t414797\n童话世界\t414798\n当行\t414799\nTCF\t414800\n雷蛇北海巨妖\t414801\n交警\t414802\n郭旭\t414803\n奇色\t414804\n简爱中\t414805\n餐单\t414806\n吕英\t414807\n印画\t414808\ni5-6300HQ\t414809\n赛默飞\t414810\nZoom\t414811\n酒酒\t414812\n三肾丸\t414813\n于一\t414814\netching\t414815\n好事情\t414816\nseveral\t414817\n苏教版五年级数学\t414818\nbigboom\t414819\n另眼相看\t414820\naton\t414821\n同侪\t414822\n搞笑视\t414823\n虚拟磁盘\t414824\n大中华地区\t414825\nActiviti5\t414826\n界首市\t414827\n2018年3月13日\t414828\n性\t414829\n执行费\t414830\n百度手写输入法\t414831\n妻子的诱惑\t414832\n能见\t414833\n金沙广场\t414834\n沦\t414835\n6012\t414836\n0.2mol\t414837\n兜里\t414838\n免赔险\t414839\n剑魔\t414840\n滑翔伞\t414841\n当然\t414842\n免还\t414843\n矿棉板\t414844\nQQ加油站\t414845\nROUND\t414846\n夏女\t414847\njenny\t414848\n21g\t414849\n秃鹫\t414850\nuinty\t414851\n作协\t414852\n高位震荡\t414853\n祝词\t414854\n康复治疗学\t414855\n天净沙\t414856\nserch\t414857
\n操作间\t414858\n尼山\t414859\n鲇鱼\t414860\n曹文\t414861\n制造机\t414862\n能发\t414863\n98个\t414864\n强宠\t414865\n初代\t414866\nVaccine\t414867\n荒木飞\t414868\n简略版\t414869\n哲学院\t414870\n保利东湾\t414871\n吊丧\t414872\nvijeo\t414873\n很开心\t414874\n鬼月\t414875\nMeasurements\t414876\nCiNii\t414877\n制氮机\t414878\n孔夫子拍卖网\t414879\ntiny6410\t414880\n卡利古拉\t414881\n杨树苗\t414882\nvisua\t414883\n执行者\t414884\n鞣质\t414885\n武田信玄\t414886\n康耐\t414887\n阿鲁阿卓\t414888\n11事件\t414889\n商业部\t414890\n同泰\t414891\n速测仪\t414892\n写作学\t414893\nBasin\t414894\n魏鹏远\t414895\n706所\t414896\n十进宫\t414897\n锦绣海湾城\t414898\n莽夫\t414899\n黄经理\t414900\n12时\t414901\n东北路\t414902\n真大蛇\t414903\nhl7\t414904\n时珍国医国药\t414905\n谨记\t414906\n十九大全面从严治党战略部署_新华\t414907\n内标\t414908\n苏兹达尔\t414909\n闲时\t414910\nk22\t414911\n深圳各区社保分局\t414912\n斗牛士之歌\t414913\n石膏\t414914\n秋葵\t414915\n招生部\t414916\n炭\t414917\n北苑路\t414918\n舞台秀\t414919\n绸缎\t414920\n河南省南乐县人民政府\t414921\n超级盖世仙尊\t414922\n全民读书节\t414923\n平乡县\t414924\n17k\t414925\n文化生活\t414926\n带电体\t414927\n皮条\t414928\nEVIL\t414929\n地锁\t414930\nios9吧_\t414931\n上海长途客运南站\t414932\n恰\t414933\n最爱的色\t414934\n何言\t414935\n五一国际\t414936\nArt\t414937\n友好关系\t414938\n表重\t414939\n岳临高速\t414940\n6英寸\t414941\n20170517\t414942\n禧悦\t414943\nInstalled\t414944\n国家科委\t414945\n斗南花市\t414946\n情途\t414947\nRecap\t414948\n阻止\t414949\n陈美琳\t414950\n北汽勇士\t414951\n机牙\t414952\n通行费\t414953\n喜辽复\t414954\nLaneige\t414955\n5547\t414956\n弹弹play\t414957\n城市空间结构\t414958\n钦州二中\t414959\n尼尔斯\t414960\n李念\t414961\n宫野真\t414962\n知录\t414963\n折断线\t414964\n环保卫士\t414965\nRMS\t414966\n还火\t414967\n50kb\t414968\n挡车\t414969\n山莨菪碱\t414970\n深圳富士康\t414971\naremy\t414972\n美罗\t414973\n物格\t414974\n何斌\t414975\n财经时报网\t414976\n零元党\t414977\n药油\t414978\n将表\t414979\n黄历\t414980\nrfp\t414981\nsend\t414982\n创新型\t414983\n不明物\t414984\n1600吨\t414985\n55吋\t414986\n天目湖\t414987\n叶风\t414988\n聊城经济技术开发区\t414989\n利税\t414990\n中国远洋海运集团\t414991\n英文名字\t414992\n世纪新能源网\t414993\n迟浩田\t414994\nreilly\t414995\n23年前\t414996\nYY禁歌\t414997\n高宏志\t414998\n马池\t414999\nAsch\t415000\n77元\t415001\n一面五星红旗\t415002\n华西人才网
\t415003\n卡券\t415004\ncsgo\t415005\n上海市物价局\t415006\n哎\t415007\n工作方\t415008\n一寸照\t415009\nroku\t415010\n87版\t415011\n昆山农村商业银行\t415012\nwhi\t415013\ncuring\t415014\n英联\t415015\n赵晨曦\t415016\n9宗\t415017\n万病\t415018\n1905.COM\t415019\n320分\t415020\n叉臂\t415021\n吴建华\t415022\n义乌国际商贸城一区\t415023\n大江晚报\t415024\n安信信托\t415025\nxiaoguo\t415026\n木林森\t415027\n回迁户\t415028\nAdds\t415029\n贵州茅台\t415030\n4月15日\t415031\n嵌套\t415032\n古天乐\t415033\n铝木\t415034\n电唱机\t415035\n新濠天地\t415036\n正电荷\t415037\nHeadline\t415038\n中国青田网\t415039\n五月色婷婷\t415040\n天公\t415041\n泛悦城市广场\t415042\n雏鹰\t415043\n捆绑\t415044\n茶趣\t415045\nTeddy洪恩幼儿英语\t415046\n赘婿炉石传说\t415047\n98亿\t415048\n所需\t415049\n野猪儿\t415050\nQB\t415051\nyixin\t415052\n钻刀\t415053\naigo\t415054\n雨火\t415055\n南开大学金融学院\t415056\n麻酥酥\t415057\n视频集\t415058\n水榭\t415059\n广播级\t415060\ngotham\t415061\nroot@localhost\t415062\n248号\t415063\n邓旭\t415064\n砖窑\t415065\nkd10\t415066\nqwt\t415067\n达达里奥\t415068\n透明版\t415069\n常青麦香园\t415070\n茨\t415071\n李二\t415072\nLWP\t415073\n水系\t415074\n发膜\t415075\n质壁分离\t415076\n0043\t415077\ncheating\t415078\n万安街道\t415079\npeewee\t415080\n第1条\t415081\ndub\t415082\n能够用\t415083\n深度学习\t415084\n魔法少女\t415085\n起泡网\t415086\n配镜\t415087\n快手枪手快枪手\t415088\nOm\t415089\n落地\t415090\njs+css3\t415091\n中国石油天然气股份有限公司\t415092\nmacosx\t415093\n天科股份\t415094\n莫愁女\t415095\n9.8\t415096\n中天建设集团\t415097\n曲面\t415098\n5.7.3\t415099\n水磨\t415100\necological\t415101\n刘宇\t415102\n一朵\t415103\n3sinx\t415104\nULN2003\t415105\n倾尽\t415106\n废妻\t415107\n戴月\t415108\n长江经济带\t415109\n重庆三科房地产经纪有限公司\t415110\n平安财产保险\t415111\n北京工业大学耿丹学院\t415112\ncT\t415113\n23首\t415114\n小蚁\t415115\n立法法\t415116\n万州大瀑布\t415117\n256.COM\t415118\n俄罗斯特种部队\t415119\n发茶\t415120\nbuck\t415121\n产品观\t415122\nEGD\t415123\n冷硬\t415124\n狂风暴雨\t415125\n180元\t415126\n张宴\t415127\n无巧不成婚\t415128\n海尚\t415129\n17SS\t415130\nzige\t415131\nelex\t415132\n剧透慎入\t415133\n32wei\t415134\no点\t415135\n汉化补丁包\t415136\n盐酸特比萘芬乳膏\t415137\n差分方程\t415138\n花镜\t415139\n傲途格\t415140\n滁州网\t415141\n梦马\t415142\n北京邮电大学\t415143\n5302\t415144\n装箱单\t41
5145\nGoDoc\t415146\n太化股份\t415147\n台北火车站\t415148\n梦幻西游手机\t415149\n广深铁路\t415150\nWES7\t415151\n随机向量\t415152\n内部群\t415153\n中国襄阳政府\t415154\n遴选\t415155\n同僚\t415156\n56所\t415157\n義父\t415158\n唐太宗\t415159\n房天下\t415160\n360\t415161\n译作\t415162\n王倩倩\t415163\n上万件\t415164\n强开\t415165\n襦\t415166\n58同城网\t415167\n黄克功\t415168\nBite\t415169\n远卓\t415170\n新衣\t415171\ndocment\t415172\nQuickConnect\t415173\n魔卫\t415174\n鸭王3\t415175\njpg转换器\t415176\nlativ\t415177\n湖南高速铁路职业技术学院\t415178\n该点\t415179\n十三五\t415180\ncars\t415181\n县公路局\t415182\n75平米\t415183\n苏州便民网\t415184\n厨力\t415185\n潍坊医学院\t415186\n扫地车\t415187\n第137章\t415188\nxp框架\t415189\n佛子\t415190\n仙侠剧\t415191\n明明白白\t415192\n扁平比\t415193\n张苞\t415194\ncdkey\t415195\n2018-01-22\t415196\n昂科威论坛_汽车之家论坛\t415197\n京唐\t415198\n东风快递\t415199\n币行\t415200\nADX\t415201\njovi\t415202\nMedicare\t415203\n城址\t415204\n西卡\t415205\n多肉植物_浴花谷花卉网\t415206\nGuided\t415207\n槽子糕\t415208\n中图分类号\t415209\n最美的歌儿唱给妈妈\t415210\n600536\t415211\norigins\t415212\nldpc\t415213\n天圆网\t415214\n小黑山\t415215\n五佳球\t415216\n弹弹弹\t415217\nfounded\t415218\n艾灰\t415219\n两票制\t415220\n邪恶少女漫画无翼鸟全集\t415221\n德伟\t415222\n审图\t415223\n香港置地\t415224\n牛奶海\t415225\n300.0000元\t415226\n小穴\t415227\n群雄逐\t415228\n关键期\t415229\n北城世纪城\t415230\nH5单\t415231\n签发\t415232\n海淀驾校\t415233\n装上\t415234\n上当\t415235\n永发\t415236\n图旺旺\t415237\n重庆区\t415238\nHTM\t415239\n仙剑奇侠传六\t415240\n龙光玖龙湾\t415241\n陆军师\t415242\n拉紧\t415243\ndatagird\t415244\n郭论\t415245\nfibers\t415246\n千丝万缕\t415247\n混和\t415248\n张家胜\t415249\n韩饭网\t415250\n腐草\t415251\n李践\t415252\n承租\t415253\n结城美纱\t415254\n阳关三叠\t415255\npascal\t415256\n僭\t415257\n始祖\t415258\n捧脸\t415259\n谭天\t415260\n飘窗\t415261\nFXML\t415262\nm20\t415263\n2660\t415264\n翻英\t415265\n东莞地区\t415266\n非农\t415267\n回不到\t415268\n许昌县\t415269\n名篇\t415270\nDot\t415271\n长安欧尚4S店\t415272\n野医\t415273\n联想科技城\t415274\nulinix\t415275\n好转\t415276\n疯狂动物城\t415277\n中国通信服务股份有限公司\t415278\n亚铜\t415279\n淮阴市\t415280\n2015年11月1日\t415281\n大羽\t415282\nSOH\t415283\n适口\t415284\n调配\t415285\n23000\t415286\n浙江省经信委\t415287\n1919年\t41528
8\n爱欢凉\t415289\n怒杀\t415290\n40余万\t415291\n非州\t415292\n众业\t415293\n蓄电池组\t415294\n千峡湖\t415295\n承销商\t415296\n腰间\t415297\n明诗\t415298\n厦岗\t415299\n盘玩\t415300\n壹点灵\t415301\n故作\t415302\n向阳小区\t415303\n囧克斯\t415304\n计征\t415305\n己所不欲\t415306\n占陇镇\t415307\n美科技广场\t415308\n计价器\t415309\nkone\t415310\n罗马尼亚\t415311\n币行/OKCoin\t415312\nhrtimer\t415313\n女尸案\t415314\nB1层\t415315\n后防撞梁\t415316\nmp4pa\t415317\nmytv\t415318\nMininet\t415319\n东土科技\t415320\n激突\t415321\n富力又一城\t415322\n11223\t415323\n霸占\t415324\n香蕉皮\t415325\n仙村镇\t415326\n盖形\t415327\nEric-Lee\t415328\n不了\t415329\n盲童\t415330\nattempted\t415331\n安高乐\t415332\n梅花桩\t415333\n尔湾\t415334\n淅淅沥沥\t415335\n路易斯安那州\t415336\n班夫国家公园\t415337\nFOREVER\t415338\n阪神\t415339\n基连\t415340\nkiehl\t415341\n厦门大学马来西亚分校\t415342\n一天多\t415343\n阿岳\t415344\n德国牧羊犬\t415345\n南航\t415346\n奥特之王\t415347\n零售店\t415348\n哈佛大学医学院\t415349\n华文仿宋\t415350\n十字板\t415351\n好时光\t415352\n自创\t415353\n奉化中学\t415354\n遛鱼\t415355\n2000毫升\t415356\n外蒙古\t415357\nmindmanager2018\t415358\n呻吟语\t415359\n孟连县\t415360\nRosso\t415361\n帽式\t415362\n想爱\t415363\n干挂胶\t415364\n沙车\t415365\n善为\t415366\n人鬼交易所\t415367\nbios\t415368\n铃原\t415369\n北客站\t415370\n协成\t415371\n超级兵王\t415372\n载玻片\t415373\n雁塔圣教序\t415374\n第四级\t415375\n116家\t415376\n哈巴河县\t415377\n茴\t415378\nLOL鸡里奥宝典\t415379\n交通联合卡\t415380\n小米4x\t415381\n0分钟\t415382\n长沙市政务服务中心\t415383\n一览_安趣网\t415384\nbuaa\t415385\n小米米家\t415386\nm400\t415387\nTeardown\t415388\n圆底烧瓶\t415389\n再出\t415390\n深圳证券通信有限公司\t415391\n竖窑\t415392\n裂帛\t415393\n1200亿美元\t415394\nmdict\t415395\n捡起\t415396\n禅武医\t415397\n康体\t415398\n城市规划管理技术\t415399\n东博\t415400\nJTJ\t415401\n重庆_重庆市人民政府\t415402\n夏清\t415403\n出炉\t415404\nBeirut\t415405\n第一秒\t415406\n蜂蜂\t415407\nDUI\t415408\n优惠券\t415409\n解包\t415410\n定向轮\t415411\n智慧图书馆\t415412\n耀州区\t415413\n大同世界\t415414\n王亚平\t415415\n卸压\t415416\n情侣装\t415417\ncdu\t415418\nk580s\t415419\n丘成桐\t415420\n乐视2pro\t415421\n逐字逐句\t415422\n八厘米\t415423\n深贫\t415424\n节拍器\t415425\n就地化\t415426\n柑橘类\t415427\nCocos2dx\t415428\nê\t415429\n货币乘数\t415430\n第五名\t415431\n枢密\t415432\n惠州市\
t415433\n央产房\t415434\nToons\t415435\n中山日报\t415436\n连姆·尼森\t415437\n梁文音\t415438\n土默特右旗\t415439\n611路\t415440\n大信镇\t415441\nenscape\t415442\n木兮木\t415443\nwinutils\t415444\n霸州\t415445\n哼哈二将\t415446\n朝思暮想\t415447\n球桌\t415448\n绳师\t415449\n雨带\t415450\n二座\t415451\n退治\t415452\n5例\t415453\n明胶\t415454\n辽宁国税\t415455\nmulu\t415456\n饶水\t415457\nseals\t415458\n欧阳江河\t415459\n终结者创世纪\t415460\n管不着\t415461\n文怀沙\t415462\n冰柜\t415463\n湖南省旅游发展委员会\t415464\n追追追\t415465\n磕\t415466\nQT语音\t415467\n罗店\t415468\n凡科网\t415469\n刷赞\t415470\n第63条\t415471\n五粒\t415472\nsilane\t415473\n平冈\t415474\n美雅\t415475\n马鞍\t415476\n恒大国际广场\t415477\n汽车学院\t415478\n支边\t415479\n纤维瘤\t415480\n安徽中医药大学\t415481\n薛店\t415482\n鸭脖\t415483\nFACE\t415484\n第二块\t415485\n铺货\t415486\n16.7%\t415487\n完全摧花手册之狼穴羔羊\t415488\n金蝉花\t415489\n20立方米\t415490\nArgument\t415491\n李克特\t415492\n向阳\t415493\n软腭\t415494\ntones\t415495\n半山半岛\t415496\n米伦\t415497\n吹起\t415498\n计算机考试网\t415499\n绿文\t415500\n白广路\t415501\nphysician\t415502\n热水壶\t415503\nr12\t415504\n单位行贿罪\t415505\nTuhu\t415506\n首段\t415507\n塞拉摩\t415508\njux\t415509\nPJ\t415510\nff13-2\t415511\n英音\t415512\n尔康\t415513\n500008\t415514\njavaSamBlog\t415515\n家庭成员\t415516\n假文盲\t415517\n英国巴斯大学\t415518\n见风\t415519\n万车\t415520\n实况足球2018\t415521\n整点\t415522\n潘勇\t415523\n单质\t415524\n普拉托\t415525\n顶柜\t415526\n并罚\t415527\n肠炎宁\t415528\n废柴网\t415529\n陪衬\t415530\n北京公交驾校\t415531\n人间之神\t415532\n加站\t415533\n块头\t415534\n梦幻西游手游5\t415535\n成章\t415536\nzimuzu\t415537\n家当\t415538\n文海\t415539\n23世界读书日\t415540\n400多名\t415541\n佛号\t415542\n伯德\t415543\n王明伟\t415544\npmac\t415545\nsjm\t415546\n哈尼\t415547\n80kw\t415548\n阴毛处\t415549\n4月5日起\t415550\n条基\t415551\n掬\t415552\nkraken\t415553\n3W\t415554\n电脑处理器\t415555\n海姆立克急救法\t415556\n巴蒂尔\t415557\n红霞\t415558\n心衰\t415559\n优香\t415560\nlaunchpad\t415561\n科普馆\t415562\n分水\t415563\n建平实验中学\t415564\n135期\t415565\n替换词\t415566\n蜂巢帘\t415567\n为医\t415568\n_武王小说网\t415569\n急不可待\t415570\n几百张\t415571\n院子\t415572\n稳盈\t415573\n李比希\t415574\n进气口\t415575\n带息\t415576\n小山\t415577\n端木轩\t415578\n乐投\t415579\n第八弹\t41
5580\n19%\t415581\n河南省文化厅\t415582\n问路\t415583\n博易大师\t415584\n20140315\t415585\n红旗河沟\t415586\nminec\t415587\n例类\t415588\n329\t415589\n小白药\t415590\n北京工商大学嘉华学院\t415591\n骚红\t415592\n跑跑卡\t415593\n马钢股份\t415594\n9.2.1\t415595\n李勒优\t415596\nMC喊麦网\t415597\n成真恋爱学\t415598\n无限战争\t415599\n李耳红\t415600\n章节\t415601\n一扫\t415602\n替婚\t415603\n中建安装工程有限公司\t415604\n书序\t415605\nWASAPI\t415606\n串匹配\t415607\ntxd\t415608\n2进制数\t415609\n3种\t415610\nHealer\t415611\nissuu\t415612\npdf浏览器\t415613\n深广\t415614\n艾思\t415615\n徐才网\t415616\n制作费\t415617\n麦斯威尔\t415618\nDN400\t415619\n补偿型\t415620\n潮衣\t415621\n田家兔\t415622\n防水工\t415623\n平安果\t415624\n子弟\t415625\nkonsole\t415626\n兰坪\t415627\n拳皇1.91\t415628\n国家新型城镇化规划\t415629\n衡阳\t415630\niis8.0\t415631\n贵州轻工职业技术学院\t415632\ncsn\t415633\n老安\t415634\n中华人民共和国环境保护税法\t415635\nSQuirreL\t415636\n蓝光刻录机\t415637\n传代培养\t415638\ndashi\t415639\n金英\t415640\n保熟\t415641\nWith\t415642\n第17个\t415643\nyare\t415644\n皮牌\t415645\n舒尔佳\t415646\n二十层\t415647\nDion\t415648\n泊美\t415649\n插袋\t415650\n艺术字设计_千库网\t415651\n2016年8月1日\t415652\n虎石台\t415653\n塑机\t415654\n水瑶\t415655\n53路\t415656\n爱在深秋\t415657\n简欧式\t415658\nCashmere\t415659\n信工\t415660\nAutoIT\t415661\n裸机\t415662\n三十里堡\t415663\n龚氏\t415664\n北森\t415665\n悦城\t415666\n楼塔\t415667\n白忙\t415668\n张灯结彩\t415669\n实用新型专利权\t415670\n院区\t415671\n过小\t415672\n彩世界\t415673\n移动机器人\t415674\n路由规则\t415675\n无锁机\t415676\n英雄联盟音乐节\t415677\n许由\t415678\n中国大健康\t415679\nxiaoshou\t415680\n吨桶\t415681\n抹胸\t415682\n365体育投注\t415683\n跃居\t415684\nfiori\t415685\n2.x\t415686\nFinnish\t415687\n尿痛\t415688\n紫阳街\t415689\n危亡\t415690\n7台\t415691\n酒后驾车\t415692\n信心\t415693\n拍板\t415694\n深圳通有限公司\t415695\nfor-of循环\t415696\nHandling\t415697\n警示教育片\t415698\n浪潮\t415699\n冉庄地道战遗址\t415700\n深泽\t415701\n泥岗村\t415702\n辽阳新闻网\t415703\n慈母\t415704\n大个子老鼠小个子猫\t415705\n伞花\t415706\n反水\t415707\n李智恩\t415708\n过晚\t415709\nFLOAT\t415710\n故知\t415711\nTotally\t415712\n手笔\t415713\n扎克萤火之森\t415714\n广中路街道\t415715\n电暖炉\t415716\n郑日昌\t415717\n90片\t415718\n电极反应式\t415719\n手部\t415720\n过街\t415721\n五|\t415722\n常怀\t415723
\n金玉奴\t415724\nimmunology\t415725\n尤克里里指弹谱\t415726\n沙虫\t415727\n放鸽子\t415728\n双层\t415729\n盛视\t415730\nmacports\t415731\n珞\t415732\n咬嘴\t415733\n荣誉勋章\t415734\n恋恋风尘\t415735\n2016-12-31\t415736\n五星物语\t415737\nbat吧_\t415738\nLeica\t415739\n别云\t415740\n口袋妖怪始源蓝宝石\t415741\n远亲\t415742\nSouthern\t415743\n穿越火影\t415744\n慧鸿\t415745\n遇见\t415746\n卡口\t415747\n全家族\t415748\nSWOT分析法\t415749\n五线谱\t415750\n翻转机\t415751\n分类账\t415752\n减证便民\t415753\n概念设计师\t415754\n博纳\t415755\n第6章\t415756\n固拉多\t415757\n浩泽\t415758\n古文字学\t415759\n布丁酒店\t415760\n新郎新娘\t415761\n第16季\t415762\nTypora\t415763\n少儿\t415764\n熟茶\t415765\n征候\t415766\n鸳鸯\t415767\nshave\t415768\n一代金\t415769\n王串场\t415770\n15x\t415771\n30th\t415772\nwin10cmd\t415773\nwin7卡\t415774\n星光龙\t415775\n白头偕老\t415776\n塞跳蛋\t415777\n生力\t415778\n阿\t415779\n浙江海洋大学\t415780\n惠州大亚湾\t415781\n西蒙子\t415782\n麻辣串\t415783\n训练书\t415784\n12.0.2\t415785\n口型\t415786\n801胶水\t415787\n富德生命人寿保险\t415788\n专利法\t415789\n南境\t415790\n明月清风\t415791\n1.20e\t415792\n宝骏\t415793\n茂业天地\t415794\n座号\t415795\n破布\t415796\n垃圾债\t415797\n时钟表\t415798\n球藻\t415799\n类目表\t415800\n许某\t415801\n李铁军\t415802\n丽阳\t415803\n东平\t415804\n中子星\t415805\n揭破\t415806\nhdf\t415807\n預告\t415808\n16式\t415809\n微弱\t415810\n邪教\t415811\n棋牌游戏\t415812\n龙城小区\t415813\n功成\t415814\n仙桃市人民政府\t415815\n联想笔记本\t415816\n连身衣\t415817\n六冲\t415818\n小雁塔\t415819\n仪征政府\t415820\n800年前\t415821\n忘语\t415822\n高星级酒店\t415823\n02007\t415824\n米波\t415825\n与众\t415826\n想错\t415827\n并发症\t415828\nnh3\t415829\n1127\t415830\n富世华\t415831\n丑颜\t415832\n政府部门\t415833\n吕颜\t415834\ncron\t415835\n大沙漠\t415836\n8-1\t415837\n氢氯噻嗪\t415838\n迪\t415839\n和平县政府\t415840\n内挂\t415841\n鲁能集团有限公司\t415842\npscp\t415843\n喵喵机\t415844\n缓存服务器\t415845\nLouie\t415846\nChairs\t415847\n导航栏\t415848\n窃玉生香\t415849\n魅蓝5\t415850\nHIT专家网\t415851\nEncore\t415852\n20170625\t415853\nBYD\t415854\n苏克\t415855\n侏罗\t415856\n潮汕牛肉火锅\t415857\n先行官\t415858\n爱在春天\t415859\n斗牛舞\t415860\n空调匹数\t415861\n于娜\t415862\n财务报告\t415863\n地球百子第四季\t415864\n上网本\t415865\n分得\t415866\n双镜\t415867\n313号\t415868\npktgen\t415869\n囡\t
415870\n娱乐帝国\t415871\n马浩\t415872\n冷江\t415873\ngnuplot\t415874\n双光透镜\t415875\nembody\t415876\n0704\t415877\n仙剑奇侠传5\t415878\n计算类\t415879\n86元\t415880\nlimt\t415881\n性生活\t415882\nwuren\t415883\n瑞娜\t415884\n狂傲\t415885\n港汇\t415886\nempower\t415887\n满宝\t415888\n破产法\t415889\n360U盘\t415890\n生日照\t415891\n张栋梁\t415892\n齐欢\t415893\n图史\t415894\n美国社区大学\t415895\n普拉多博物馆\t415896\n阿布扎比机场\t415897\n北汽集团\t415898\n劳动合同书\t415899\n小雨点\t415900\n一单一双\t415901\n逼抢\t415902\n刘园\t415903\n注射用头孢呋辛钠\t415904\nkunming\t415905\n25mg\t415906\n八秒\t415907\n极板\t415908\n中国移动集团\t415909\n快玩游戏\t415910\n卢西奥\t415911\n属期\t415912\n杜康酒\t415913\nCSS选择器\t415914\ncontraction\t415915\n三星a5\t415916\n新河村\t415917\n篮球\t415918\n中美贸易战将\t415919\n夏菜\t415920\n河南职业技术学院\t415921\n食具\t415922\n畅学电子网\t415923\n途径\t415924\n频频\t415925\npwntools\t415926\n拐弯处\t415927\n网易云信\t415928\n资格考试网\t415929\n奏鸣\t415930\naff\t415931\nCeilometer\t415932\n孔门\t415933\n裘\t415934\nGrub\t415935\n只待\t415936\n托塔天王\t415937\nA0\t415938\n品牌们\t415939\ngreenDao\t415940\n拖地\t415941\n赵老哥\t415942\n格物\t415943\n青软\t415944\n伪戒\t415945\n雅山\t415946\n香里奈\t415947\nA.10\t415948\n黑魔法师\t415949\n耿马\t415950\n摔跤吧爸爸\t415951\n计中计\t415952\n小于号\t415953\nalgebra\t415954\n仙道彰\t415955\n替代效应\t415956\n人物谱\t415957\n石靖\t415958\n第二季01集\t415959\n谢百三\t415960\n最美的\t415961\n成都招商银行\t415962\n加工件\t415963\n第74届\t415964\n梦幻西游论\t415965\nselector\t415966\nCGV星聚汇影城\t415967\ncpm\t415968\n无用论\t415969\n七码\t415970\n供稿\t415971\n仙缘错:惊世情劫\t415972\n_性商网\t415973\nthermos\t415974\n中华人民共和国长江海事局\t415975\nCorrelation\t415976\n溶入\t415977\nE7\t415978\n渔村\t415979\n鳌江镇\t415980\n中央花园\t415981\n孩之宝\t415982\n晨操\t415983\nHIVE\t415984\nPS版\t415985\n省科协\t415986\n机壳\t415987\n枸杞菜\t415988\n耶稣受难日\t415989\n造当\t415990\n06款\t415991\n弯腿\t415992\n徐荣\t415993\nGTJ2018\t415994\n就医160挂号网\t415995\n55jj.com\t415996\n苦荞酒\t415997\n官榜\t415998\nmarantz\t415999\n哈利波特\t416000\nDotNet\t416001\nwife\t416002\nHighlighting\t416003\n623\t416004\n气动\t416005\n郓城县\t416006\n末\t416007\nHtc\t416008\n车载逆变器\t416009\n痴图\t416010\n仓多真央\t416011\n掺假\t416012\n世情\t41
6013\n面经版\t416014\n佛学院\t416015\n苏蕾\t416016\nADO.NET\t416017\nv4a\t416018\ndiversion\t416019\nCayenne论坛_汽车之家论坛\t416020\n刺客信条2吧\t416021\n谷水\t416022\n第五讲\t416023\n红璞公寓\t416024\n凯利海华府\t416025\n唐山一中\t416026\n罗拉快跑\t416027\n哥哥哥哥\t416028\nphillip\t416029\nTME\t416030\nufida\t416031\n女扮演者\t416032\nA73\t416033\n泪鱼儿\t416034\n最后的曙光\t416035\n单个\t416036\n行业解决方案\t416037\n泉州电视台\t416038\n纳惜\t416039\n铺天盖地\t416040\n不攻自破\t416041\n瓷牙\t416042\nzuida\t416043\n发博\t416044\n陈燕华\t416045\n啸月\t416046\n摘要集\t416047\n十二卷\t416048\n不知为\t416049\n高玲玲\t416050\n相信\t416051\n220t\t416052\n山西医科大学汾阳学院\t416053\n东丈\t416054\nItEye\t416055\n0.5克拉\t416056\n建于\t416057\n转盘式\t416058\nHarry\t416059\nSpas\t416060\n增值税防伪税控\t416061\n老者\t416062\nkris\t416063\n4岁\t416064\n三步踩\t416065\n游本昌\t416066\n27岁\t416067\n原核\t416068\n小剑\t416069\n智诚\t416070\nyundong\t416071\n早\t416072\n徐海\t416073\n巨细胞\t416074\nIpad\t416075\n张同学\t416076\nEstée\t416077\nVersace\t416078\nsng\t416079\n马卡\t416080\n约稿函\t416081\n幸运石\t416082\n金交所\t416083\n上海市红十字肿瘤医院\t416084\nwalt\t416085\n乡巴佬\t416086\n内蒙古自治区质量技术监督局\t416087\nShea\t416088\n全封闭式\t416089\n海马泡酒\t416090\nVillas\t416091\n360号\t416092\nDetectron\t416093\n蒋志光\t416094\n中国国电\t416095\n正街\t416096\n74式\t416097\n孤芳\t416098\n中航光电科技股份有限公司\t416099\n77岁\t416100\n偶氮二异丁腈\t416101\n辅导师\t416102\n卖肾\t416103\n第七年\t416104\n牛肝\t416105\n央采\t416106\n装配图\t416107\n第11套\t416108\njeewx\t416109\n青岛市城乡建设委员会\t416110\n大地惊雷\t416111\nFeeding\t416112\n生培养\t416113\nsd娃娃\t416114\n新沟\t416115\n红通\t416116\n换式\t416117\n天空之城吉他谱\t416118\n奔腾万道剑尊\t416119\nDiabetes\t416120\n林海峰\t416121\n中信银行信用卡中心\t416122\n铁总运\t416123\npeu\t416124\n密盘\t416125\n百度云超清\t416126\nwaited\t416127\n仙妮蕾德\t416128\nPacketiX\t416129\n3300万\t416130\n荣耀\t416131\n神秘四奥\t416132\n4043\t416133\n图器\t416134\n阿史\t416135\nsipb\t416136\n画龙\t416137\n访员\t416138\nHMD\t416139\nMagento2\t416140\n写真秀\t416141\n可供分配利润\t416142\n陈卓\t416143\nrout\t416144\n9月1号\t416145\nSalvador\t416146\n挥发性\t416147\n磊科\t416148\n合生\t416149\n51汽车网\t416150\n雅各布斯\t416151\n走向\t416152\ndrw\t416153\nMRT\t416154\nAmei\
t416155\nWin7安装盘\t416156\n灰绿色\t416157\n行稳\t416158\n逐梦\t416159\n目标函数\t416160\n购机\t416161\nBlooming\t416162\n车牌定位\t416163\nkernel32\t416164\n六卡\t416165\n叶姓\t416166\n国家科技计划申报中心_国家科技计划项目申报中心\t416167\n香坂百合\t416168\n城师\t416169\n米非司酮片\t416170\nshaoguan\t416171\ncodehaus\t416172\n滕州\t416173\nathena\t416174\n红尘客栈\t416175\n中国文联\t416176\n注安\t416177\nring\t416178\n养生类\t416179\n王东岳\t416180\n通志\t416181\n117.136\t416182\n第八版\t416183\n市人社局\t416184\n非上市公众公司\t416185\n书生\t416186\n密封条\t416187\n生力军\t416188\n36句\t416189\n狮王祛痘膏\t416190\n2018-02-26\t416191\n桂枝茯苓丸\t416192\n卖身契\t416193\n心虫\t416194\n徐建文\t416195\n3975\t416196\n麝\t416197\n美瞳\t416198\nQuizlet\t416199\n一枚一元\t416200\n更配\t416201\n山山客\t416202\ncsoldjb\t416203\n北京机械工业自动化研究所\t416204\n直映\t416205\n第78\t416206\n伍元\t416207\n基础性\t416208\n雅宝\t416209\n第3季度\t416210\n可知道\t416211\nsiny\t416212\n吴博\t416213\n中珠医疗\t416214\nN位\t416215\nSimpson\t416216\n新版西游记\t416217\n瑞吉山\t416218\n福清人才网\t416219\n谢依霖\t416220\n汽水\t416221\n小朵\t416222\n追妻路\t416223\n请\t416224\n十里金滩\t416225\npt950\t416226\n哑口\t416227\n元宝网\t416228\nxuliehao\t416229\n较快\t416230\n美体_太平洋时尚网\t416231\n余斗余斗\t416232\n布局篇\t416233\n大庆铁人中学\t416234\nbadly\t416235\n科干\t416236\n寂地\t416237\n零购\t416238\n平方差\t416239\n环城公路\t416240\n土壤\t416241\n428\t416242\n35公里\t416243\n初霁\t416244\ntapes\t416245\n支支招\t416246\n亲人\t416247\n证件照\t416248\n初音实\t416249\n杨武\t416250\namb\t416251\n卯时\t416252\n战局\t416253\n耽美漫画|BL漫画|H漫画|腐漫画|腐女\t416254\n好东东\t416255\nASFB-113\t416256\n玉簪\t416257\n编程式\t416258\n免CD补丁\t416259\n萧风\t416260\n《西北大学》\t416261\nd3js\t416262\n土地私有制\t416263\n相公\t416264\n用车_一猫汽车网\t416265\n差速\t416266\n巨城\t416267\n五十张\t416268\n奇诺之旅\t416269\njos\t416270\n星盘合盘\t416271\n倒春寒\t416272\n43期\t416273\n指导人\t416274\n钱草\t416275\n霸权\t416276\n开封火车站\t416277\nd型\t416278\n金融报\t416279\ntheHunter\t416280\n防腐\t416281\n审计助理\t416282\n牧区\t416283\n牙防\t416284\n1599\t416285\n电动车库门\t416286\n聚氨酯固化剂\t416287\n准备好了\t416288\n8052\t416289\n6G\t416290\n第三周\t416291\n10.9.4\t416292\n装饰膜\t416293\nper\t416294\n时代山湖海\t416295\n解屏\t416296\nZHEN\t416297\n中华人民
共和国全国人民代表大会\t416298\n理疗师\t416299\nNoad\t416300\n365邮箱\t416301\n收租\t416302\n气象\t416303\n人力资源与社会保障网\t416304\n2190\t416305\n机关干部\t416306\n加规\t416307\n飞吻\t416308\n3048\t416309\nhpu\t416310\n中银易商\t416311\n知识界\t416312\n5分之一\t416313\n包装件\t416314\n自个\t416315\n34度\t416316\nAaron\t416317\n孙明达\t416318\n儿歌\t416319\n韶山南路\t416320\n果本\t416321\n楼号\t416322\nOn7\t416323\n亲宝网\t416324\n腐叶土\t416325\n销讲\t416326\n三十部\t416327\n成长册\t416328\n紫萱\t416329\n艳丽\t416330\n金波\t416331\n6.8下\t416332\n白点病\t416333\n侵华\t416334\n1个半小时\t416335\n服装费\t416336\n太原市教育局\t416337\n胡波\t416338\n首义\t416339\nv2.4.0\t416340\nweishi\t416341\n麻城市\t416342\n顺流\t416343\n北京限号\t416344\n本月28日\t416345\nTair\t416346\n3世\t416347\n操作者\t416348\n碳刷\t416349\n民心佳园\t416350\nGlide\t416351\nPandakill\t416352\nTiff\t416353\n小川\t416354\n備員\t416355\n秦楼\t416356\n赵一涵\t416357\n1.8TD\t416358\nint&#160\t416359\n长泽梓\t416360\n中华泰山网\t416361\n望都\t416362\n拳\t416363\n涉法涉诉\t416364\nShui\t416365\n阿尔伯塔\t416366\nupdat\t416367\n190亿\t416368\n账龄\t416369\nonetoone\t416370\n数一\t416371\npaperpass\t416372\n一银\t416373\n直死\t416374\n欧几里得\t416375\n十八招\t416376\n东川区\t416377\n体坛风云-华商论坛\t416378\n同萌\t416379\n自然之道\t416380\n李震\t416381\n递补\t416382\n脑性\t416383\n麻丘镇\t416384\n剂量仪\t416385\n邓州市人民政府\t416386\nweb3\t416387\n万箭穿心\t416388\n石器时代2\t416389\n慕南枝\t416390\n全国政协\t416391\n人教版五年级下册\t416392\n骄猛\t416393\n意大利威尼斯\t416394\n炫舞梦工厂\t416395\nasahi\t416396\n好彩头\t416397\n滁州公共资源交易中心\t416398\n快乐成长\t416399\nwebpack2\t416400\n电锤\t416401\n私分\t416402\n辣片\t416403\n弓箭手们\t416404\n十月初五\t416405\n刺客伍六七\t416406\n汉源县人民政府\t416407\n坂口\t416408\n二胡\t416409\n天使湾\t416410\nZero2\t416411\n读物\t416412\n六首\t416413\n焗油\t416414\nplsqldev\t416415\n1.9.6\t416416\n荼蘼\t416417\n同声\t416418\n实木\t416419\n第95集\t416420\n菱悦v3\t416421\nVRF\t416422\n赵胜\t416423\n20161015\t416424\nㄆ\t416425\n泡沫箱\t416426\n冒险岛神之子\t416427\n天门山国家森林公园\t416428\n知识产权裁判文书网\t416429\n海淀小学\t416430\n妞\t416431\n技术规\t416432\n创造力\t416433\n食神\t416434\n广西科技师范学院\t416435\n黄冈职业技术学院\t416436\n蓖麻毒素\t416437\n彩衣\t416438\n公安类\t416439\n双城\t416440\n戌\t416441\nTrueView\t4
16442\ntenso\t416443\n情文\t416444\n敬廉崇洁\t416445\n杜小月\t416446\n林帝浣\t416447\n四氟\t416448\n轻钢别墅\t416449\nJeep\t416450\n七八千\t416451\n镐京\t416452\n海藻糖\t416453\n20171210\t416454\n单丛\t416455\nNote4X\t416456\n达赖\t416457\n50倍\t416458\nGatewayWorker\t416459\n九五折\t416460\nALL\t416461\n收点\t416462\n呱呱坠地\t416463\n农民画\t416464\n瞟\t416465\n王奶奶\t416466\nCopa\t416467\n蒙古歌\t416468\n重铬酸钠\t416469\ncmakelist\t416470\n碧玉\t416471\n五指袜\t416472\n中国北车\t416473\n洞香春\t416474\n李莫愁\t416475\nDoctrine\t416476\n致信\t416477\n冬桃\t416478\n军旅\t416479\n张艳平\t416480\n漳州市医院\t416481\n韩范\t416482\n龙口_龙口政府\t416483\n梅颖\t416484\n金莲教\t416485\n65分钟\t416486\nTBOX\t416487\nvivoy66\t416488\n小白熊\t416489\n小夜灯\t416490\ntaiyang\t416491\n分解因式\t416492\n撞见\t416493\n意味\t416494\nbin\t416495\n立法院\t416496\nAutomotive\t416497\n求生\t416498\n神者\t416499\n古林公园\t416500\n目的港\t416501\n友期\t416502\n选型表\t416503\n通讯\t416504\nspf30\t416505\n建站\t416506\n踩\t416507\n贵阳晚报\t416508\n热湿\t416509\n258\t416510\n军盲\t416511\n安新县\t416512\n田超\t416513\n合肥市瑶海区\t416514\n卷制\t416515\nRenderTexture\t416516\n租出去\t416517\n地铁5号线\t416518\n明开夜\t416519\n口交\t416520\n株洲市中心医院\t416521\n第十_\t416522\n歌诗我的贴身校花\t416523\n磁粉探伤\t416524\n桅子\t416525\n想想\t416526\nscout\t416527\n内审员\t416528\n冷不冷\t416529\ncyg\t416530\n庆余\t416531\n顺应\t416532\n601088\t416533\n东锦\t416534\n运营商们\t416535\n平交\t416536\n垫木\t416537\n遥控精灵\t416538\nOKhttp\t416539\n12厘米\t416540\n双规\t416541\n管圈\t416542\n10E\t416543\n拒捕\t416544\n阿拉伯数字\t416545\n黑色系\t416546\npg\t416547\n双轨\t416548\nkommen\t416549\n2月14日\t416550\n工事\t416551\n林芳兵\t416552\n博洋家居生活馆\t416553\nMO管理器\t416554\n巴扬\t416555\n花机\t416556\ngummy\t416557\n常任制\t416558\n桑葚汁\t416559\n花镇\t416560\n吉泽明步\t416561\n划出\t416562\n2018.3.26\t416563\n53岁\t416564\n丰富多彩\t416565\n南宁市交通运输局\t416566\n惊风\t416567\n业委会\t416568\n疾风之刃\t416569\n信阳晚报\t416570\n喜福\t416571\n道德\t416572\nNautica\t416573\n卡奴\t416574\n金月\t416575\nsnownlp\t416576\n节休市\t416577\n南湖区\t416578\n2016年4月30日\t416579\n聴\t416580\narclive\t416581\nssf\t416582\n4组\t416583\n琵琶湖\t416584\n按压式\t416585\n贡井\t416586\n九\t416587\n操作费\t416588\n
相思木\t416589\n2000-2015年\t416590\n巅\t416591\n冲淤\t416592\nEOP\t416593\nmercury\t416594\n永恒纪元戒\t416595\ncng\t416596\n个股吧\t416597\n亨德森\t416598\n李筱懿\t416599\n安徽省人民政府\t416600\n一定要\t416601\n生物质锅炉\t416602\n开瑞K70\t416603\n北方国际射击场\t416604\n硫酸沙丁胺醇\t416605\nindicating\t416606\n袋鼠妈妈\t416607\n漂\t416608\n响板\t416609\n引进版\t416610\n上饶经开区\t416611\n江苏东成电动工具有限公司\t416612\nharuka\t416613\n努比亚手机\t416614\n冒泡法\t416615\n软硬件\t416616\n11张\t416617\n刘语熙\t416618\nb860av2.1\t416619\nZhu\t416620\nautograd\t416621\n孙学军\t416622\n标准规范\t416623\nASO\t416624\nzhttty\t416625\n铺底\t416626\n地震中的父与子\t416627\nczz\t416628\n童床\t416629\n北京保利剧院\t416630\nflow\t416631\nImmunology\t416632\n大逼\t416633\n恐怖大师\t416634\n江门市社会保险基金管理局\t416635\n帝王攻略\t416636\n十二生肖_祥安阁风水网\t416637\n四分之一波片\t416638\n鲭鱼\t416639\n160%\t416640\nblo\t416641\n六界仙尊\t416642\n51bbcy.com\t416643\n血小板压积\t416644\n赡养\t416645\nbrought\t416646\n贫苦\t416647\n酒海\t416648\nkiosk\t416649\n数据卷\t416650\nXStream\t416651\n快餐车\t416652\n白阳\t416653\n车辆识别码\t416654\n三叉神经\t416655\n太阳能光伏组件\t416656\n女神探\t416657\n生成物\t416658\n寓于\t416659\n新浪剑网3\t416660\n周琦索罗斯\t416661\n仇晓\t416662\niPc\t416663\n丽江市人民政府\t416664\n食品药监局\t416665\nsimotion\t416666\n东商网\t416667\n锰酸钾\t416668\n水景\t416669\n天津大学网络教育学院\t416670\n25型\t416671\n布勒\t416672\n直发膏\t416673\n贴板\t416674\n饣\t416675\nClarks\t416676\n翡翠家园\t416677\n北京市红十字会\t416678\n贵行\t416679\n迎新年\t416680\ncactus\t416681\nKombat\t416682\n80年前\t416683\n胡胜利\t416684\n硫酸钴\t416685\n癸巳日\t416686\nFCP7\t416687\n持卡\t416688\n义拍\t416689\nNVIC\t416690\npdf在线转换器\t416691\n剑灵_17173\t416692\n李瑞\t416693\n招银大厦\t416694\n百香\t416695\n濠滨论坛\t416696\nJDC\t416697\nApeiron\t416698\n汇贷\t416699\nxin\t416700\n秀姐\t416701\n中国东至_东至县人民政府\t416702\ngibson\t416703\nnewton\t416704\ncvv\t416705\n逐鹿\t416706\n回纹\t416707\nzhaung\t416708\n伊在线\t416709\n悲惨酒井法子\t416710\ncaxa2013\t416711\n慈善\t416712\n贴息\t416713\n叠翠峰\t416714\n八台\t416715\n傻事\t416716\n誓不为\t416717\n参字\t416718\nvg278q\t416719\n六脉\t416720\nBrendan\t416721\n可赛\t416722\n华强电子网\t416723\n周报告\t416724\n长江大学工程技术学院\t416725\n风格化\t416726\n三峡电站\t416727\n
养气\t416728\n上海社会保险\t416729\n行政判决\t416730\ne11\t416731\n神奇四侠2\t416732\n耻度\t416733\ntrim\t416734\n少壮\t416735\n安居宝\t416736\n都会好\t416737\n好孤单\t416738\ntalend\t416739\n卫宣利\t416740\n三上悠亚\t416741\n莎朗·斯通\t416742\n无限之住人\t416743\n舞咲\t416744\npeva\t416745\n100部\t416746\n陆川\t416747\n顺丰优选\t416748\n造纸机\t416749\nNX11.0\t416750\n凸性\t416751\n贝尔·格里尔斯\t416752\n永野芽郁\t416753\n无伦\t416754\nCPPM\t416755\n资产负债表分析\t416756\nSOL\t416757\n八一八微博\t416758\n玉米棒\t416759\n达内培训\t416760\n4matic\t416761\nwebhook\t416762\n架构\t416763\n轻身\t416764\nuf\t416765\n12月1日\t416766\nhd600\t416767\n菜谱网\t416768\n跳上\t416769\n李月\t416770\n月港\t416771\n妖怪们\t416772\n重庆送子鸟医院\t416773\ntbi\t416774\n赞美诗\t416775\n包报\t416776\nqmake\t416777\n拉拉肥\t416778\n布加迪威龙\t416779\n好菜杰\t416780\nQ20\t416781\n刀块\t416782\n绵白糖\t416783\n监造\t416784\nmaturity\t416785\n1521\t416786\n指绘\t416787\n海因克斯\t416788\n越夜\t416789\n济南\t416790\n手链\t416791\nreact-router\t416792\nVOOC闪充\t416793\n9.9%\t416794\n绿母\t416795\n微电子学院\t416796\nflavor\t416797\n力帆\t416798\nrefind\t416799\n823\t416800\n荣昌东街\t416801\n优地\t416802\n家安\t416803\nqpsk\t416804\n医科达\t416805\n114分\t416806\n海淀区纪委监察局\t416807\n新美星\t416808\n瓜皮猫\t416809\n杨旭东\t416810\n88158198\t416811\n杂合\t416812\nPeriodic\t416813\n绿化工\t416814\nMosaic\t416815\n电信联通\t416816\n3朵\t416817\n许字\t416818\n污辱\t416819\n妇女\t416820\n操作表\t416821\n王国新\t416822\nMVC4\t416823\n36斤\t416824\n12170\t416825\n信息版\t416826\n万科大\t416827\n416\t416828\ncom2\t416829\n大善\t416830\n安徽广播电视大学\t416831\n线材\t416832\n黄菲\t416833\n阿里政委\t416834\n雨阳\t416835\n竹园小学\t416836\n象州\t416837\n武政\t416838\n7.14\t416839\n照壁\t416840\n薪级工资标准\t416841\nf35\t416842\n2017年12月6日\t416843\n苏州市商务局\t416844\n800x480\t416845\n留行\t416846\n阳狮\t416847\n职业照\t416848\n巨灵神\t416849\n观测\t416850\nSTM32F103VET6\t416851\nRB\t416852\n尼尔机械纪\t416853\n软组织肿瘤\t416854\n金水湾小区\t416855\n虚电\t416856\n佛山电台\t416857\n粉丝站\t416858\n花木兰传奇\t416859\n4587\t416860\n中国人民大学研究生院\t416861\n华为荣耀6plus\t416862\n真空过滤机\t416863\n百美\t416864\n340万\t416865\n1674\t416866\n多功能性\t416867\n兰斯6\t416868\nDemocracy\t416869\n第26页\t416870\n维多利亚2黑
暗之心\t416871\n企业文化建设\t416872\n运作\t416873\n彭州\t416874\n绝命海拔\t416875\n口袋妖怪游戏\t416876\n感谢有\t416877\n九龙瀑\t416878\n五十米\t416879\n咯咯咯鬼太郎\t416880\n儿保\t416881\n速比涛\t416882\n国美在线\t416883\n运动神经元病\t416884\n采摘地\t416885\n高分子量\t416886\n三八式\t416887\n第56条\t416888\n不确定性\t416889\n玉女心经\t416890\n九酷经典\t416891\nnero9\t416892\nWANG\t416893\n干部们\t416894\n新大头儿子和小头爸爸\t416895\n197号\t416896\n先在\t416897\n雀圣1\t416898\n翦\t416899\n热流\t416900\n潇\t416901\n线磅\t416902\n创口\t416903\n发明史\t416904\n那\t416905\n150美元\t416906\n名物\t416907\n宜华生活\t416908\nEST\t416909\n景园\t416910\n怪物们\t416911\nRline\t416912\nコントロ\t416913\n松冈祯丞\t416914\n亿纬锂能\t416915\n尸检\t416916\n500x500\t416917\n佐仓杏子\t416918\n赤字\t416919\nfreetype\t416920\n动脑\t416921\nmaketrans\t416922\nspingboot\t416923\n私卖\t416924\n临汾\t416925\n魏桥\t416926\n恍恍惚惚\t416927\n二十项\t416928\n艾琳\t416929\nyuyao\t416930\n0.11.0\t416931\n付税\t416932\n吞食天地2\t416933\n史瑞克\t416934\nreact+redux\t416935\nwindowsserver2008\t416936\n冶金\t416937\n急性肝炎\t416938\n度日如\t416939\n金鳞\t416940\nAlien\t416941\n75个\t416942\n淀山湖镇\t416943\n4月29号\t416944\n东乡族\t416945\nReviverSoft\t416946\nOutLook\t416947\n二月初\t416948\nbevel\t416949\n一更\t416950\n孤石\t416951\n脱硝催化剂\t416952\nIC电子元器件\t416953\n魔爪\t416954\n勇敢的人\t416955\n最美丽\t416956\n利炳根\t416957\n王伦\t416958\n神象\t416959\n天麻素\t416960\n西安南站\t416961\n400家\t416962\n过目不忘\t416963\n绝缘梯\t416964\n银龄\t416965\nWin7系统C盘\t416966\n琉璃盏\t416967\n首辅\t416968\n台积电\t416969\n悦翔V7\t416970\n徐东\t416971\n神州优车\t416972\nSmart3D\t416973\n一千\t416974\n编织绳\t416975\n程序流程图\t416976\n21歳\t416977\n胡桃木家具\t416978\n猪肉丸子\t416979\n微克\t416980\n知网CNKI\t416981\n言曌\t416982\n雷射\t416983\n公能\t416984\n中华人民共和国公路法\t416985\ndofile\t416986\n45tfsi\t416987\n校书\t416988\n金团\t416989\n石家庄市\t416990\n耐久性\t416991\n扌\t416992\nmania\t416993\n6|\t416994\nzuul\t416995\n##\t416996\n实习期\t416997\nUAPP\t416998\n69级\t416999\n湖北农业信息网\t417000\n光阳\t417001\n大连职业技术学院\t417002\n清仓式\t417003\n商都县\t417004\n心房\t417005\n舆\t417006\n异丙嗪\t417007\n戴莫\t417008\n佐佐木希\t417009\n中国人民大学环境学院\t417010\n刘恩佑\t417011\n美国队\t417012\n两次\t417013\n直追\t417014\nMOD_模
拟人生4\t417015\n集安\t417016\nlithium\t417017\n毕业套\t417018\n资讯_世纪农药网\t417019\nTumblr\t417020\n谢里夫\t417021\n二周目\t417022\nMontréal\t417023\n计分器\t417024\n久于\t417025\n蕃\t417026\n疯狂报复\t417027\n逼数\t417028\n2G\t417029\n五虎将后传\t417030\n内存条8g\t417031\n桃子味\t417032\ngdut\t417033\n土地证\t417034\n佐丹\t417035\nDingDong\t417036\n种植业\t417037\nddo\t417038\n0x0000000\t417039\n增益率\t417040\nJules\t417041\nmighty\t417042\n血瘀\t417043\nINS-13001\t417044\n双性人\t417045\n枣强\t417046\nSammyLiu\t417047\n肯尼\t417048\n折页\t417049\n差点点\t417050\nextracellular\t417051\nSmartFramework\t417052\n汤摇庄\t417053\n紫兰\t417054\n亿源\t417055\n2^31\t417056\n宇辰网\t417057\n20mg\t417058\n太白县\t417059\n回不去了\t417060\n互助金\t417061\n一等座\t417062\n烂仔\t417063\n氧化亚铜\t417064\n寝乱\t417065\nfpr\t417066\n束手无策\t417067\n缘定\t417068\n球杆\t417069\nLingo\t417070\ndmg\t417071\n通俄\t417072\n姜夔\t417073\n接受度\t417074\n促黄体生成素\t417075\nPFM\t417076\n勾结\t417077\n真田幸村\t417078\n斗战胜佛\t417079\n浙江中医药大学\t417080\n藤席\t417081\n冷皇\t417082\n潍坊四中\t417083\n小住\t417084\n六天后\t417085\n敢答\t417086\n阿尔法\t417087\narchive/text/openpgp/2016-02.mail\t417088\ngrid\t417089\n徐毅\t417090\n代缴社保公司\t417091\n舆情\t417092\n爱巢\t417093\n理查德·克莱德曼\t417094\n朴佑镇\t417095\n比亚迪F0\t417096\n雷吉\t417097\nPhoenix\t417098\n鸿茅药酒\t417099\nelectrolyte\t417100\n多多少少\t417101\n边下\t417102\n日本大学\t417103\n单倍\t417104\n367号\t417105\n初心不改\t417106\n彩虹谷\t417107\n逼上\t417108\n90年前\t417109\n2k13\t417110\n877\t417111\n150次\t417112\n李珉廷\t417113\norb-slam2\t417114\nLecture\t417115\n4.80\t417116\n康乃馨\t417117\n贝叶斯分类器\t417118\n魏家庄\t417119\n佃\t417120\n达开\t417121\npagefile\t417122\n低洼\t417123\n李校堃\t417124\n奥奇\t417125\nMILLION\t417126\n云龙湖\t417127\n葵儿红石榴\t417128\n刚烈\t417129\n哥本哈根机场\t417130\n新浪福建_新浪网\t417131\nMajor\t417132\n间狱\t417133\nnote4吧\t417134\n李波\t417135\n名师堂\t417136\n奕欧\t417137\n削铁\t417138\n随机振动\t417139\n21年后\t417140\n亲爱的王子大人\t417141\n内部服务器\t417142\n解解们\t417143\n一超\t417144\n拉螺杆\t417145\n磁道\t417146\n受者\t417147\n_安\t417148\n火化场\t417149\n郑达\t417150\ntraverse\t417151\n捷星航空\t417152\n绍兴市柯桥区行政服务中心\t417153\n20170122\t417154\n陈英\t417155
\n2010word\t417156\n武尊\t417157\nspfile\t417158\n18小时\t417159\n2.2m\t417160\n黑龙江中医药大学附属第一医院\t417161\n车享网\t417162\n开心电玩\t417163\n中提琴\t417164\n阿鲲\t417165\n原料药\t417166\n相印\t417167\n粤办\t417168\n肝阳上亢\t417169\n疯狂猜成语3\t417170\nsundays\t417171\ndnf魔枪士\t417172\n杭州市第七人民医院\t417173\n象湖新城\t417174\nBleach\t417175\n百龙天梯\t417176\n_模拟人生4\t417177\ntype\t417178\n胸花\t417179\n宁波象山\t417180\n88页\t417181\n血凝块\t417182\n大霜塔\t417183\nUrllib\t417184\n明朝那些事儿\t417185\n都市丽人\t417186\n菊地凛子\t417187\nmatting\t417188\n李息隐\t417189\n女生性\t417190\n展示片\t417191\n华为荣耀盒子\t417192\n族徽\t417193\n林芝\t417194\n9厘米\t417195\n大歌星\t417196\n黑龙茶\t417197\n168&\t417198\n天宝路\t417199\n吻戏\t417200\n听器\t417201\n医学会\t417202\n玖玺\t417203\n霆峰\t417204\n黑色素痣\t417205\nadmas\t417206\n法律快车公司法\t417207\n棠东\t417208\n超神传说\t417209\nSISTERS\t417210\n针织棉\t417211\nfavorites\t417212\n比较式\t417213\n恻隐\t417214\nsuffix\t417215\n特护\t417216\nPossible\t417217\n20170521\t417218\n仙剑5前传\t417219\n沙沟\t417220\nwoshi\t417221\n美红\t417222\n家用吸尘器\t417223\n新民乡\t417224\n狂欢季\t417225\n泛光灯\t417226\n电信号\t417227\n集中化\t417228\nXLua\t417229\n丁国琳\t417230\n迷心\t417231\n第1話\t417232\n12话\t417233\n贴地\t417234\n运动化\t417235\n提取版\t417236\n草水\t417237\n李小锋\t417238\n980\t417239\nTCU\t417240\n黄杏秀\t417241\n殿前欢\t417242\n尾料\t417243\n包教包会教程\t417244\n同根\t417245\n持法\t417246\n翠竹苑\t417247\n不动产登记暂行条例\t417248\ncuso4\t417249\nHDR\t417250\n安检机\t417251\n格表\t417252\n奈良县\t417253\n极端\t417254\n事务官\t417255\n空巢\t417256\nv1.0.8\t417257\nK3\t417258\n41项\t417259\nyolanda\t417260\n斯宾塞\t417261\n模拟城市5吧\t417262\n盐酸氨溴索注射液\t417263\n健美\t417264\n茶砖\t417265\n季德胜\t417266\n6W100\t417267\n交易会\t417268\n越秀星汇城\t417269\nPM981\t417270\n肋间神经痛\t417271\n冷友斌\t417272\n冯校长\t417273\n8000米\t417274\n字意\t417275\n哈工\t417276\n五进\t417277\nスレンダ\t417278\n云朵\t417279\n使劲儿\t417280\n键线\t417281\n免疫\t417282\n黑白棋\t417283\n建委\t417284\n无机预涂板\t417285\nseeyon\t417286\n迷你世界汤米\t417287\nMJJ\t417288\n甲铁城的卡巴内瑞\t417289\n1100万元\t417290\n南农新闻网\t417291\n五夫镇\t417292\n国菜\t417293\n汴河\t417294\n殷离\t417295\n碧水云天\t417296\n柴田淳\t417297\n安子\t417298\n三十多万\t417299\n优悦\t4
17300\n湿气\t417301\n浏\t417302\n酱牛肉\t417303\n180418\t417304\n天伯伦\t417305\n毕业舞会\t417306\n逃出\t417307\n驻京\t417308\n手头\t417309\n赤贝\t417310\n润邦\t417311\n电疗仪\t417312\n8683\t417313\n65分\t417314\n土包子\t417315\n张玲\t417316\ngitLab\t417317\npapapa\t417318\n马安山\t417319\n朗诗绿色集团\t417320\nguodaxia\t417321\ninvoked\t417322\n今井真由美\t417323\nWilly\t417324\n王后\t417325\n第三十八回\t417326\n天能集团\t417327\n龙湖悠山郡\t417328\n三鹰之森吉卜力美术馆\t417329\n登陆\t417330\n胡宇威\t417331\n自我意识\t417332\n北京西站南广场\t417333\n凉白开\t417334\nm18\t417335\n六边形\t417336\n安规\t417337\n玛丽亚\t417338\n燕南天\t417339\nUnplugged\t417340\n竹子\t417341\n赵绍琴\t417342\n微兔\t417343\n岳丽娜\t417344\n众测\t417345\n放天\t417346\n新闻学院\t417347\n夏志清\t417348\n江山市\t417349\n第一【\t417350\nCreateProcess\t417351\n企业债\t417352\n华晨宇\t417353\n福州市晋安区政府\t417354\n甲木\t417355\n乌干达\t417356\n法办法\t417357\nn8\t417358\n伽达默尔\t417359\n寒毒\t417360\n埋堆堆\t417361\n48v20a\t417362\nSora\t417363\n卢涛\t417364\n北京电力交易中心\t417365\n霸王斗罗大陆\t417366\n中国化学会\t417367\n万科东方传奇\t417368\n独特\t417369\n卢沟桥\t417370\n石榴红\t417371\nepass\t417372\n寄存柜\t417373\n球天\t417374\n中心医\t417375\n化药\t417376\n招标采购\t417377\n邓淳\t417378\npatcher\t417379\n筼筜\t417380\n阔叶林\t417381\nwf7610\t417382\n党务公开\t417383\n1.5厘米\t417384\n2006年4月\t417385\nencounter\t417386\n飞卢小说手机网\t417387\n万贵\t417388\nGRE考试\t417389\n注册管理系统\t417390\n百依百顺\t417391\n夫夫\t417392\nsmez\t417393\n密缝\t417394\n控江路\t417395\n3.2万\t417396\n党羽\t417397\n杂机\t417398\n赫妍\t417399\n巴陵时尚网\t417400\n虚空幻化铺\t417401\n硝苯地平控释片\t417402\n尤里吧\t417403\nclose\t417404\n用作\t417405\n鬼头\t417406\n墙布\t417407\nGSR\t417408\n薄一波\t417409\n电子申请网\t417410\n筱崎爱\t417411\nretreat\t417412\n广东省电子税务局\t417413\n蝎子辫\t417414\n竖心\t417415\n鳞茎\t417416\n62家\t417417\n太阳能光伏发电\t417418\n300s\t417419\n果宝特\t417420\n清明夜\t417421\nBoard\t417422\n直面\t417423\n斯蒂芬\t417424\n王道俊\t417425\n饺子\t417426\nvanguard\t417427\n20140823\t417428\n有何不可\t417429\n中铁十七局集团有限公司\t417430\n硖石街道\t417431\n宁波市工商行政管理局\t417432\n海康威\t417433\n暗芝\t417434\nCheese\t417435\n全程\t417436\n4月26日起\t417437\n广州市国家税务局\t417438\ndn20\t417439\n练功房\t417440\n财帛\t417441\n美音\t417442\n揽\t
417443\nketchup\t417444\nPatients\t417445\n毒龙钻\t417446\n恳求\t417447\nC89\t417448\n王志东\t417449\nE-M10\t417450\n瓦尔登湖\t417451\n消极怠工\t417452\n背篓\t417453\n21edu8.com\t417454\n收菜\t417455\n阿科\t417456\n喀布尔\t417457\n广成子\t417458\n陆游诗\t417459\n利民\t417460\n松花粉\t417461\nh30\t417462\n陆磊\t417463\nthre\t417464\njaing\t417465\navi_磁力链接ed2k迅雷\t417466\n光纤宽带\t417467\n无机保温砂浆\t417468\n朱妍\t417469\n黄巾\t417470\nfeeding\t417471\n郭皇后\t417472\nSPICE\t417473\n孟建平\t417474\n汇源通信\t417475\nARR\t417476\n圣血\t417477\n董氏\t417478\n江北\t417479\n200升\t417480\n高凝\t417481\n样衣\t417482\n纽斯\t417483\n减光镜\t417484\n2940\t417485\n勇者装备\t417486\n上海中考网\t417487\n1.0.0.6\t417488\nCloud\t417489\n东里镇\t417490\n超过一个月\t417491\n钓鱼椅\t417492\n傻儿\t417493\n浪潮新闻\t417494\n辜胜阻\t417495\n不景气\t417496\n金蝶商贸\t417497\nx9\t417498\n低氮锅炉\t417499\nyytext\t417500\n石花\t417501\n斯琴高娃\t417502\n共用体\t417503\n树脂砂\t417504\n9a\t417505\n文末\t417506\n电解液\t417507\nE3-1230\t417508\n静思园\t417509\n厦门大学嘉庚学院\t417510\n圣都\t417511\n突出来\t417512\n老中青\t417513\n民促法\t417514\n疏果\t417515\n金城江\t417516\n打来\t417517\n罗门哈斯\t417518\n译英\t417519\n南宁电信\t417520\n热血物语\t417521\nSTRING\t417522\n4399创世兵魂\t417523\n7000块\t417524\n乐玩\t417525\nfir\t417526\n肚胀\t417527\n黑默丁格\t417528\n后成\t417529\n防环\t417530\n5批次\t417531\n新天骄\t417532\nFreelancers\t417533\nLacoste\t417534\n茼蒿菜\t417535\nsissy\t417536\n1421\t417537\n从前往后\t417538\ntrek\t417539\n信息科技公司\t417540\n汪伦\t417541\n上海航空\t417542\n倒插门\t417543\n社会主义核心价值观\t417544\n伏尔加河\t417545\n可喜可贺\t417546\n欧拉方程\t417547\nyoshiki\t417548\n忠诚度\t417549\n加一个人\t417550\n103号\t417551\nimposing\t417552\n复旦大学附属妇产科医院\t417553\n安朗\t417554\n旧金山机场\t417555\n肘\t417556\n170714\t417557\n广东法院\t417558\n出国游\t417559\n磁轭\t417560\n伏明霞\t417561\n百日\t417562\n陆垚知马俐\t417563\n李克华\t417564\n调优\t417565\n许婧\t417566\n米松\t417567\n改装\t417568\n太后\t417569\nM436nda\t417570\n碰角\t417571\nShenyang\t417572\n靳诺\t417573\n活火\t417574\n大速\t417575\n索尼a9\t417576\n宿松县政府\t417577\n角的初步认识\t417578\ncreme\t417579\n阴虚\t417580\n邱\t417581\n上海徐汇\t417582\nArena\t417583\n管理方\t417584\n孙婷\t417585\n促成\t417586\n老爸老妈的浪漫史\t41758
7\nERR\t417588\n佐藤可士\t417589\n东乡\t417590\nLabVIEW论坛\t417591\n梦中情人\t417592\n茶学\t417593\n神帝\t417594\n敌草\t417595\n运量\t417596\n乳糖酶\t417597\n我们的时光\t417598\n刘念\t417599\n0.5A\t417600\n湖南省人民政府金融工作办公室\t417601\n中铝集团\t417602\n妃嫔\t417603\n赊购\t417604\n4.90\t417605\n聘用合同制\t417606\n星际迷航:发现号\t417607\n琦玉\t417608\n根分区\t417609\n非编制\t417610\n_120健康网\t417611\n戒惧\t417612\n着实\t417613\nAMETEK\t417614\n济南市图书馆\t417615\n女工人\t417616\n歌谱网\t417617\n珀金斯\t417618\nbra\t417619\n电梯工\t417620\n黑驴蹄子\t417621\nCab\t417622\n3000流明\t417623\n东川路800号\t417624\n无家\t417625\n泄密者\t417626\n殉道者\t417627\n决明子茶\t417628\nalfresco\t417629\n踊跃\t417630\n御园小区\t417631\n替天行道\t417632\n开路先锋\t417633\n孩子们\t417634\n2017-03-31\t417635\n男鞋\t417636\n食药监\t417637\n魔声\t417638\n风水术\t417639\n2万亿\t417640\n尖锐性\t417641\n学大教育\t417642\n自由路\t417643\n湖南新闻网\t417644\n圣象地板\t417645\n聚色伦网\t417646\nWorld\t417647\n愿望\t417648\n广西移动\t417649\n联审\t417650\nUtil\t417651\n高氯酸\t417652\n呼延灼\t417653\n无盐岛\t417654\n嬉戏\t417655\n暗红\t417656\nlibzip\t417657\n人房\t417658\n阴坡\t417659\n爱奇异\t417660\n航海者\t417661\nLQ-630K\t417662\n海昏侯\t417663\nezviz\t417664\nDownloadManager\t417665\nThreading\t417666\n求别\t417667\nMethodError\t417668\n磁波\t417669\n30年以上\t417670\noffice2019\t417671\n格林联盟酒店\t417672\n选派\t417673\n没多久\t417674\n缘故\t417675\n基拉\t417676\n吉他弹\t417677\n北京投资理财\t417678\nE60\t417679\n油菜花之恋\t417680\n乱乱\t417681\n17米\t417682\n绿城玫瑰园\t417683\n购买方\t417684\n白名单\t417685\npeer\t417686\n20161020\t417687\n张锦秋\t417688\n四辊\t417689\nwom\t417690\n27分钟\t417691\n我的小宝贝\t417692\n很重\t417693\n招行一卡通\t417694\n东阿\t417695\n海德股份\t417696\nCK快播电影网\t417697\n信噪比_\t417698\ndiagrams\t417699\n房车旅游网\t417700\n94家\t417701\n辛亥日\t417702\n千两\t417703\n2017年7月16日\t417704\n重庆第二外国语学校\t417705\n第68期\t417706\nspring-cloud\t417707\n东湖镇\t417708\n燃灯者\t417709\n国债逆回购\t417710\n三溪镇\t417711\n北京社保局\t417712\n虎牙\t417713\n汇森\t417714\n风向玫瑰图\t417715\nD座\t417716\n玄天\t417717\n手功能\t417718\neats\t417719\n凯迪生态\t417720\n撒冷\t417721\n如意岛\t417722\n友元\t417723\n隔离衣\t417724\n十一点\t417725\n908\t417726\nPepe\t417727\n会理县\t417728\n四川科技职业学院\t417729\
n人族\t417730\n思迅\t417731\n人口增长\t417732\n脸皮\t417733\n优新\t417734\n利未记\t417735\n阳光e站\t417736\n契丹\t417737\nACOME\t417738\n奥齿泰\t417739\n布奇\t417740\n上交\t417741\n108张\t417742\n痔疮膏\t417743\n边缘世界\t417744\n镁棒\t417745\n郭碧\t417746\nPSA\t417747\n2017年1月8日\t417748\n冯子材\t417749\nsacred\t417750\n海螺水泥\t417751\n官地\t417752\n絶対\t417753\n新泽西\t417754\n东京塔\t417755\ninline\t417756\n民告官\t417757\n谢彬\t417758\n20150701\t417759\n刘奕君\t417760\n棱形\t417761\nPVZ\t417762\n独生女\t417763\nafc2\t417764\n滑落\t417765\n法权\t417766\n蓝莓苗\t417767\n江苏保监局\t417768\n散台\t417769\n初曦\t417770\nv9.11.1\t417771\n中华人民共和国国家监察委员会\t417772\n华力微电子\t417773\nrome\t417774\nDerivatives\t417775\nNumerical\t417776\n临夏回族自治州\t417777\n逍遥模拟器\t417778\npytroch\t417779\n林爽\t417780\n360游戏中心\t417781\n∫\t417782\n性别女\t417783\n西藏商报\t417784\n贾乃战舰世界\t417785\n首魂\t417786\n壹手\t417787\n长圆\t417788\n外涵\t417789\n钟\t417790\n渐变\t417791\n9.32\t417792\n安徽新闻_新闻中心\t417793\n赣商\t417794\n药品\t417795\n其形\t417796\n转折点\t417797\n投资者\t417798\n小儿高热惊厥\t417799\n坤宫\t417800\n19200\t417801\n平值\t417802\n…\t417803\n马布\t417804\n国际观\t417805\n全职高手\t417806\n很少\t417807\n罗贝里\t417808\n临兵斗者皆阵列\t417809\n很舒服\t417810\n图值\t417811\n曹思义\t417812\n房展会\t417813\n包装设计师\t417814\njlins\t417815\n船舵\t417816\n广州建国医院\t417817\n小饰\t417818\n上海2号线\t417819\n尸体派对\t417820\n97泰剧网\t417821\n德语语法\t417822\n小鱼易连\t417823\n中央电视台春节联欢晚会\t417824\n倍力\t417825\neras\t417826\n中国网库\t417827\n马寨镇\t417828\n账面余额\t417829\n小城故事\t417830\nCython\t417831\n排空\t417832\nDRAFT\t417833\n央视春晚\t417834\n大连日报\t417835\n流程型\t417836\n中国政府采购新闻网\t417837\n寡头垄断\t417838\n刀匠\t417839\n我的朋友\t417840\n36所\t417841\nMKV\t417842\n麻精\t417843\n尽\t417844\n石林彝族自治县\t417845\n绞股蓝茶\t417846\n嫡子\t417847\n郭徐\t417848\n昆山火车站\t417849\nHTTP500\t417850\n嘉兴一中\t417851\n花火网\t417852\n稷山县\t417853\n梦令\t417854\nsynaptic\t417855\n感应\t417856\nKMplayer\t417857\nXAMPP\t417858\n诚征\t417859\nEEWORLD\t417860\n上档次\t417861\n绅宝x55\t417862\n地坛公园\t417863\nIE3.0\t417864\nsands\t417865\n110平米\t417866\n厦门英才学校\t417867\n钱塘江大桥\t417868\ncae\t417869\n20150314\t417870\n钛白\t417871\n系别\t417872\n曼哈顿广场\t417
873\n无人船\t417874\nzynq\t417875\nClear\t417876\n下川岛\t417877\n李家沱\t417878\n斯维登\t417879\n相陪\t417880\n顺从\t417881\n新税\t417882\n源极\t417883\n容纳\t417884\n颈复康颗粒\t417885\n系统粉\t417886\nhuitu\t417887\n上海市居住证\t417888\n创新设计\t417889\n两柱\t417890\nmercury无线路由器\t417891\n小花百万亚瑟王\t417892\n箱型\t417893\n1.70\t417894\n独具匠心\t417895\n插入语\t417896\n徐圩\t417897\n诈捐\t417898\nimmigrants\t417899\nnetframework\t417900\n85193111\t417901\n杨汝岱\t417902\n南瑞集团\t417903\n故障\t417904\ncomparing\t417905\n逆转裁判2\t417906\n苏珊\t417907\nED20\t417908\nafrican\t417909\n特谢拉\t417910\n一炷香\t417911\n华晟\t417912\n月野\t417913\n低聚木糖\t417914\n鼎力三国\t417915\n马峦山\t417916\n查莉娅\t417917\n下壁\t417918\n九界\t417919\n天河公园\t417920\n海润影视\t417921\n刻本\t417922\n国家外汇管理局\t417923\n张娇\t417924\n皮传奇素材网\t417925\n攀钢集团\t417926\n小女人\t417927\n赛车计划2\t417928\ni7处理器\t417929\n工人体育场\t417930\ntomcat5\t417931\n暴走\t417932\n董承\t417933\n香香美发网\t417934\ncyber\t417935\n1厘\t417936\n中山市华侨中学\t417937\n绿岛小夜曲\t417938\n残军\t417939\n浅淡\t417940\n航天飞船\t417941\n60毫米\t417942\n梦儿\t417943\nmpd\t417944\nPAP\t417945\n检疫性\t417946\n黄冈中学广州学校\t417947\n西曼\t417948\n润典\t417949\n乐陶陶\t417950\n利民网\t417951\n江西广播电视大学\t417952\n1200名\t417953\n6005\t417954\n杨益言\t417955\n芭娜娜\t417956\nXcode6\t417957\n漆房\t417958\n蜡烛君越\t417959\n龙眼树\t417960\n10场\t417961\n升降\t417962\n自走式\t417963\nhumming\t417964\n圣宝\t417965\n工银\t417966\n周锋\t417967\n红塔集团\t417968\n满江天天爱消除\t417969\n优客逸\t417970\n石膏矿\t417971\n威虎山\t417972\n略网\t417973\nAdobe公司\t417974\n大桥局\t417975\n上海喜玛拉雅美术馆\t417976\nf1.7\t417977\n自律\t417978\n20170621\t417979\n巫女\t417980\n暴雪游戏\t417981\n240亿元\t417982\n二王\t417983\nruns\t417984\n华中师范\t417985\nactivate\t417986\n蒙氏数学\t417987\n王亚东\t417988\n蛋糕胚\t417989\nheigh\t417990\n深造\t417991\n_平阳新闻网\t417992\n黑白郎君\t417993\n沈阳军区总医院\t417994\n578\t417995\n爱晚亭\t417996\n漆黑\t417997\n草食性\t417998\n青楼十二房\t417999\nMICE\t418000\n时间篇\t418001\n北京牛街\t418002\n暴走族\t418003\n异形件\t418004\n华声论坛\t418005\n李晓宇\t418006\n美唇\t418007\n稳转\t418008\napng\t418009\nyixue\t418010\n昨天凌晨二点二十三分\t418011\n坪山\t418012\n塑料感\t418013\nWindows虚拟机\t418014\nEarrings\t418015\n择偶\
t418016\nWah\t418017\n失职\t418018\nnumpy\t418019\nJFreeChart\t418020\n周曦\t418021\n9V2A\t418022\n趸船\t418023\n脑补\t418024\n突击炮\t418025\n16299.402\t418026\n2011年8月\t418027\n5-10\t418028\n绿道网\t418029\n十三陵水库\t418030\n吴山广场\t418031\n京东亚洲一号\t418032\n吉吉影音播放器\t418033\nXIAAV论坛\t418034\n子行\t418035\n卓美网\t418036\n0w40\t418037\n台北酒店\t418038\n深圳光明新区政府\t418039\n2T\t418040\n引频\t418041\n深圳公交\t418042\n尾轴\t418043\n菊花岛\t418044\n7%\t418045\n滚架\t418046\n300R\t418047\n摇落\t418048\n共同债务\t418049\n巨野\t418050\n慈庵\t418051\nNetApp\t418052\n广播线\t418053\n善有善报\t418054\n天籁纸鸢\t418055\ndisplaying\t418056\n死神永生\t418057\n脓包\t418058\n舞曲网\t418059\n中芯\t418060\nextends\t418061\n柴鱼\t418062\nMindNode\t418063\nmino\t418064\n百例\t418065\n挺水\t418066\n宁夏新闻网\t418067\nURLEn\t418068\n材料库\t418069\n早见沙织\t418070\n新农人\t418071\n朱清时\t418072\n臭肉\t418073\nUIkit\t418074\n深圳人事考试网\t418075\n经理部\t418076\n梭织布\t418077\n咻咻咻\t418078\nGemma\t418079\nPray\t418080\n店头镇\t418081\n网络爬虫\t418082\nVC2010\t418083\n节选\t418084\nGyno\t418085\n华裔\t418086\n踩脸\t418087\n技能\t418088\n飞鹿\t418089\n地铁\t418090\n战兽山\t418091\n数据结构与算法\t418092\nConfigurations\t418093\n塔扇\t418094\n善终\t418095\n梦参法师\t418096\n男声优\t418097\nHoroscope\t418098\n暴饮暴食\t418099\n北越\t418100\nthinkvision\t418101\ndqx\t418102\n上海纺织\t418103\n四片\t418104\ncgcs2000\t418105\n剪力法\t418106\n命运之轮\t418107\n大连理工大学研究生院\t418108\n农医\t418109\nDNF刷图\t418110\n股份制公司\t418111\n照公开\t418112\nQWT\t418113\n后退\t418114\nmetabones\t418115\n彩视\t418116\nEP9\t418117\nCHAO\t418118\n无角\t418119\n苏果超市\t418120\n艾丽丝\t418121\n罗衫\t418122\n棕榈油\t418123\n平a\t418124\n报价单\t418125\n锦江集团\t418126\n效果音\t418127\niqos2.4plus\t418128\n神使\t418129\n十铨科技\t418130\nGosuGamers\t418131\n浙大一院\t418132\n否卦\t418133\narchie\t418134\n3.6万亿\t418135\n输华\t418136\n爱淘数字资源馆\t418137\n歌德学院\t418138\n秧歌舞\t418139\n2017年6月8日\t418140\n4.5.3\t418141\n几作\t418142\n塞翁失马\t418143\n三元九运\t418144\n梅赛德斯-奔驰文化中心\t418145\n斗争史\t418146\n箭\t418147\n玻璃钢夹砂管\t418148\n正定中学\t418149\n凤溪\t418150\n6.71\t418151\ninbody\t418152\n隋唐无限恐怖\t418153\n血与火\t418154\n大开眼\t418155\n马贵将\t418156\n良师\t418157\n
复述\t418158\nRecorder\t418159\nwx\t418160\n5月9号\t418161\n五峰土家族自治县\t418162\n泛酸钙\t418163\n魔兽世界巫妖王之怒\t418164\n食物链\t418165\n硫脲\t418166\n回避率\t418167\n95568\t418168\n溪美\t418169\nAssociations\t418170\n鸽笼\t418171\n东塘\t418172\necogd\t418173\n链表类\t418174\n新世纪花园\t418175\nXVIDEOS\t418176\n书区\t418177\n00015\t418178\nnotfound\t418179\n学识渊博\t418180\n弓形虫检查\t418181\n中丹\t418182\nGodox\t418183\n上海恪敏生物科技有限公司\t418184\n梦宝谷\t418185\n动漫设计专业\t418186\n静力学\t418187\n笏石镇\t418188\n史海\t418189\n老约翰\t418190\n阿苏拉\t418191\n永怀\t418192\n多巴\t418193\n过滤芯\t418194\n钟表馆\t418195\n小蓉\t418196\n食茨\t418197\nT520\t418198\n2018年1月18日\t418199\nOrdinator\t418200\n淄博百姓网\t418201\n比智高\t418202\n福斯特\t418203\n外门\t418204\n红绸\t418205\n外管局\t418206\nKitstown\t418207\n蛟\t418208\nSets\t418209\n彼得兔\t418210\nStructured\t418211\n水星MW150UH\t418212\n百炼飞升录\t418213\n营建\t418214\n皮片\t418215\n福彩七乐彩\t418216\n谢天\t418217\n扣具\t418218\n02民\t418219\n0x000006cc\t418220\n卡雷\t418221\n凤台县人民政府\t418222\n高毅\t418223\nUML\t418224\n簿\t418225\n好豆网\t418226\n查漏\t418227\nogs\t418228\n天楹\t418229\n金星北路\t418230\n几g\t418231\n打虫药\t418232\n海淘奶粉\t418233\n秭归脐橙\t418234\n赖雅妍\t418235\n蝴蝶网\t418236\n山东省安全生产监督管理局\t418237\n公案\t418238\n承认\t418239\n中国海洋大学信息科学与工程学院\t418240\n海盗来了\t418241\n立体几何\t418242\n只要\t418243\ngeomap\t418244\n鞭笞\t418245\n兼一\t418246\n吴航\t418247\nt18\t418248\n制作机\t418249\n97式\t418250\n偷狗\t418251\nIntegration\t418252\n四扇\t418253\n涿\t418254\n皖南八校\t418255\n˙\t418256\n置疑\t418257\n主箱\t418258\n黄金ETF\t418259\n共同关注\t418260\n康姿百德健身\t418261\nuglify\t418262\n怀化职业技术学院\t418263\naltium\t418264\n水晶灯\t418265\n|美元\t418266\n海南省人民政府政务服务中心\t418267\n7.2亿\t418268\n美卓乐\t418269\n安装器\t418270\n跳级\t418271\n7016\t418272\n陕西省图书馆\t418273\nTP-LINK\t418274\n地席\t418275\n从心所欲\t418276\n阿片类\t418277\n野话\t418278\nglp\t418279\n逍遥小村医\t418280\n城央\t418281\n牡丹江市人民政府\t418282\n中国建筑第八工程局有限公司\t418283\n新知三联书店\t418284\nPITTA\t418285\n3小时内\t418286\n我的朋友很少\t418287\n圆通速递有限公司\t418288\n1.cn\t418289\n视光\t418290\n760d\t418291\n渔歌\t418292\n艾畅\t418293\n体外诊断网\t418294\n湖南省电力公司\t418295\n跪舔\t418296\n电子钟\t418297\n挤奶\t41
8298\n百倍币\t418299\n白语\t418300\nmonitors\t418301\n维佳\t418302\n搜狐健康\t418303\n玩手\t418304\nvivox6plus\t418305\nKETTLE\t418306\n红毛猩猩\t418307\nstrider\t418308\n百种\t418309\n注解类\t418310\nShell函数\t418311\n移动式破碎机\t418312\n紫外线指数\t418313\n固城\t418314\nopenqa\t418315\n谢涛\t418316\n美容液\t418317\n黑猩猩\t418318\n大差\t418319\n鉴于\t418320\n全史\t418321\n步步情影院\t418322\n中国铝业网\t418323\nENTRI\t418324\n范德蒙\t418325\ngetc\t418326\n杨梅岭\t418327\n面点师\t418328\n九死一生\t418329\n抵御\t418330\n郁金花\t418331\n国家免疫规划\t418332\n中远海能\t418333\n苯扎氯铵\t418334\n开进\t418335\nEster\t418336\n功士\t418337\nRMSE\t418338\nAB\t418339\n深圳市少年宫\t418340\n鸦杀\t418341\n柏菲\t418342\n爱情人网\t418343\n命令提示符/CMD\t418344\n门槽\t418345\n第20卷\t418346\n守时\t418347\n年报表\t418348\ndrawrect\t418349\n东方美\t418350\n科教文汇\t418351\n杨洲\t418352\nPUSH\t418353\n宁波新城吾悦广场\t418354\n有余\t418355\n扒一扒\t418356\n伟星股份\t418357\n张建辉\t418358\n法暴\t418359\n打老虎\t418360\nsmartisan\t418361\n帝女\t418362\n狙击类\t418363\n环缝\t418364\n母座\t418365\n石环\t418366\n诚信通\t418367\n个人信用贷款\t418368\n章文嵩\t418369\n凌晨4点\t418370\n光法\t418371\nchuanfu\t418372\n屏山县\t418373\n椰蒂\t418374\n湖北联通\t418375\nORDER\t418376\n纪念照\t418377\naiko\t418378\n液化气\t418379\nlcd\t418380\n淀粉\t418381\n陈师道\t418382\n合作商\t418383\n几路\t418384\n群化\t418385\n黄誉博\t418386\n读书声\t418387\n高潜\t418388\n水天一色\t418389\n奇异果\t418390\n百万位\t418391\n洪江市人民政府\t418392\nlimitation\t418393\n商务舱\t418394\ncalculate\t418395\nflashbuilder\t418396\n奔驰E级\t418397\n叛变\t418398\n语文园地六\t418399\n抄袭率\t418400\n罗胖子\t418401\n三文治\t418402\n救狗\t418403\n直塞\t418404\n试驾员\t418405\n要不要紧\t418406\nomv\t418407\n映象\t418408\n王嘉\t418409\n安徽网络电视台\t418410\ngameobject\t418411\n美股\t418412\navc\t418413\n胜利河\t418414\n潍坊经济开发区\t418415\n到案\t418416\n依诺\t418417\nsummary\t418418\n仓本C仔\t418419\n真矢\t418420\nSubbed\t418421\n江诗丹顿\t418422\n张忠\t418423\nfindfirst\t418424\n嘉里城\t418425\n广告贴\t418426\n张红梅\t418427\n潮尚\t418428\n诸天\t418429\n傲慢\t418430\natomikos\t418431\n全频\t418432\n炸丸子\t418433\n负伤\t418434\n北化\t418435\n白水县\t418436\n劳工法\t418437\n上海红星美凯龙\t418438\n10500\t418439\n罗薇娜\t418440\n重新做人\t418441\n无锡装修公司\t418442\n编
制\t418443\n鸡蛋灌饼\t418444\n神雕风流\t418445\n应变率\t418446\n仙剑问情\t418447\nmcafee\t418448\n直上\t418449\n睢宁县\t418450\n宁波华茂外国语学校\t418451\n流浪器\t418452\n两翼\t418453\nappV\t418454\n收效\t418455\n名侦探柯南全集\t418456\n德牧\t418457\n大司马\t418458\n灭灯\t418459\n玉龙股份\t418460\n酚羟基\t418461\nsales\t418462\n安装方\t418463\n蝗\t418464\n华润置地\t418465\nrecount\t418466\n雨音\t418467\nPetaLinux\t418468\n唐朝妖异志\t418469\n实习律师证\t418470\n康宇\t418471\n9起\t418472\n本乡\t418473\n弱冠\t418474\n寝具\t418475\n福州市罗源县政府\t418476\n住房租赁资产证券化\t418477\nTextView\t418478\ncjk\t418479\n过硫酸铵\t418480\nspam\t418481\n致爱\t418482\nTuning\t418483\n短期避孕药\t418484\n金万维天联\t418485\n晨间\t418486\n余杭\t418487\n实用度\t418488\n黄山市政府\t418489\n好当家\t418490\nBigger\t418491\nPA\t418492\n宋妍霏\t418493\n结拜\t418494\n绝缘油\t418495\nRESEARCH\t418496\n仙逝\t418497\n淹没\t418498\n话\t418499\n红泪\t418500\n菜鸟们\t418501\nprotege\t418502\n妹岛和世\t418503\n谢耳朵\t418504\n淫秽\t418505\n普吉岛\t418506\n奶油蘑菇汤\t418507\n北京北辰洲际酒店\t418508\nbiometric\t418509\n音楽\t418510\n家书\t418511\ncdt\t418512\ntieba\t418513\ncredit\t418514\n数额\t418515\n老虎简笔画\t418516\n火点\t418517\n2016年5月4日\t418518\n月星\t418519\n导览\t418520\nHill\t418521\n统计学专业\t418522\nSelf\t418523\nduo\t418524\n美月优芽\t418525\nswirl\t418526\nChecked\t418527\n卢卓\t418528\n小米Pro\t418529\n爷们儿\t418530\n大观路\t418531\n柠檬茶\t418532\n不兴\t418533\n黄渠\t418534\n澳玛\t418535\n太阳王\t418536\n世外\t418537\n9-12月\t418538\n消防工程师证\t418539\n易语\t418540\nCaribbeancom\t418541\n皆若\t418542\n清空回收站\t418543\n金品\t418544\n锚定\t418545\n宁波政府网\t418546\n最浪漫的事\t418547\n0.64\t418548\n黑礁\t418549\n输入\t418550\n热播网\t418551\n灌篮高手全国大赛篇\t418552\n下源\t418553\n毕业设计管理系统\t418554\n常熟农村商业银行\t418555\n遗传代谢病\t418556\n601018\t418557\nverbs\t418558\n低俗怪谈\t418559\n一百万年\t418560\n德州日报\t418561\n经名\t418562\n秘书舰\t418563\ntdb\t418564\nEA111\t418565\n新生物\t418566\n低危\t418567\n北京昌平区\t418568\nsps\t418569\n纯端\t418570\ndiabetes\t418571\n奇骏2017\t418572\n卡萨斯\t418573\n范美忠\t418574\n0x30\t418575\n齐头\t418576\n金江小区\t418577\n王爷爷\t418578\n娱乐新闻_海峡网\t418579\n三元食品\t418580\n玻尿酸鸭\t418581\n第106章\t418582\n料理机\t418583\nAnyview\t418584\nxushukui\t4185
85\nIron\t418586\n波音737\t418587\n黄海清\t418588\n赑屃\t418589\n200元\t418590\n红桃皇后\t418591\nB360M\t418592\ncheery\t418593\nacross\t418594\n建明\t418595\n黑龙江教育出版社\t418596\n寿光市\t418597\n赵宋\t418598\nfillna\t418599\n舒俱来\t418600\n81234567\t418601\n多媒体数字报\t418602\nSequential\t418603\n会典\t418604\n彼得兔的故事\t418605\nDBSCAN\t418606\n乳膏\t418607\n渐冻人症\t418608\n监狱长\t418609\nokcoin\t418610\n卖相\t418611\n邻苯二甲酸\t418612\nsk5病毒\t418613\n轴泵\t418614\n自传\t418615\n佐登妮丝\t418616\n固化剂\t418617\n2015年1月1日\t418618\n500分钟\t418619\n头批\t418620\n雷蛇战神\t418621\n小米安全中心\t418622\nWars\t418623\n楼管\t418624\nSilva\t418625\n中国道教\t418626\n第2个\t418627\n心理测量者\t418628\n大沽\t418629\n复方板蓝根颗粒\t418630\n3.5cm\t418631\n各镇\t418632\n72首\t418633\n东方希望集团\t418634\n粘板\t418635\n邯郸机场\t418636\n广济南路\t418637\nORM\t418638\nword97\t418639\n脾胃论\t418640\nscanf函数\t418641\n原是\t418642\n共产党员网\t418643\nV2.6\t418644\n新的征程\t418645\n朴素唯物主义\t418646\nxiaohu\t418647\n皮条客\t418648\n8亿美元\t418649\n珠海长隆横琴湾酒店\t418650\n捷\t418651\n桥面\t418652\n生态文明\t418653\n悲喜交\t418654\narctan\t418655\nEXcel表格\t418656\nhaole\t418657\nalin\t418658\n一求\t418659\n佚名\t418660\nconnected\t418661\n个人贷款\t418662\nMonde\t418663\n升级赛\t418664\n滨盛路\t418665\nchamfer\t418666\n作贡献\t418667\n青岛世界博览城\t418668\nthink\t418669\n太空堡垒卡拉狄加\t418670\n华特迪士尼\t418671\n经责\t418672\n黄大洋\t418673\nretrieval\t418674\n澳门银河娱乐\t418675\n曹小强\t418676\nppt2010\t418677\n民族报\t418678\n马头琴\t418679\n黑日\t418680\n税控\t418681\n投身\t418682\n行政人事\t418683\nSML\t418684\n幸福南海_南海区政府\t418685\n殉\t418686\n水车\t418687\n国家工程实验室\t418688\ntooth\t418689\n米诺\t418690\n3dvr\t418691\n翰城\t418692\n龙城大道\t418693\n重庆水务集团\t418694\n美丽乡\t418695\n圣严法师\t418696\n400型\t418697\n财信\t418698\nnasyun\t418699\n江苏沁恒股份有限公司\t418700\n2299\t418701\n进群\t418702\nknn算法\t418703\n大力士\t418704\ntoutiao\t418705\n000559\t418706\n血燕\t418707\n88.cn\t418708\nreturning\t418709\n九次\t418710\nheads\t418711\n棒棒贝贝\t418712\nikoo\t418713\n新威驰\t418714\n扎克伯克\t418715\njfrog\t418716\ncellar\t418717\n代嫁\t418718\n南航明珠卡\t418719\ncandyboy\t418720\n欲言又止\t418721\n无线传屏\t418722\n壅\t418723\n母亲\t418724\n中街\
t418725\n宋博\t418726\n孟浪\t418727\nsoluble\t418728\n爱亲论坛\t418729\n武汉旅行社\t418730\nEXIT\t418731\nukulele\t418732\n36吨\t418733\n广东移动神州行\t418734\n世茂璀璨天城\t418735\n词藻\t418736\n谁负\t418737\n名臣\t418738\n西侧\t418739\n财神杯\t418740\nShoes\t418741\nmycard\t418742\n1-3季\t418743\n纯阴\t418744\n碧江广场\t418745\nwii吧\t418746\n第一媳\t418747\n自伤\t418748\n4000亿美元\t418749\n六棱块\t418750\n上6天\t418751\n巫王\t418752\nansible-playbook\t418753\ntien\t418754\n传址\t418755\n义人\t418756\n胜寒\t418757\n气相色谱柱\t418758\n洛德\t418759\nScream\t418760\n而泰\t418761\n辣椒面\t418762\n仙君\t418763\n补面\t418764\n1.1.2\t418765\n吉赛尔邦辰\t418766\n分达汽配网\t418767\nANTI\t418768\n自动喷水灭火系统\t418769\n热控\t418770\n鬼带\t418771\n初创\t418772\n48个\t418773\n108首\t418774\n金山手机助手\t418775\n尽是\t418776\n大直沽\t418777\n武功镇\t418778\nBSR\t418779\n秒卡\t418780\nkjj\t418781\n宇信科技\t418782\n朝媒\t418783\n板扎福利网\t418784\n窗子\t418785\n『\t418786\n陈可依\t418787\nLottery\t418788\npotential\t418789\n迅销\t418790\n1.6.9\t418791\n沈阳铁路局\t418792\n来年\t418793\ndategridview\t418794\n战地\t418795\n咕叽\t418796\nCao\t418797\nstyled\t418798\n内蒙古自治区\t418799\n最近5年内\t418800\n中国电力建设集团有限公司\t418801\n无证无照经营查处办法\t418802\nrecovered\t418803\n东莞医院\t418804\n侠风前传\t418805\n骨雕\t418806\n奶片\t418807\n楚雄彝族自治州\t418808\n青岛远洋船员职业学院\t418809\n中铁工程装备集团有限公司\t418810\n张学友\t418811\nPoorSakura\t418812\n绒衫\t418813\n刀斧\t418814\nDangal\t418815\n性爱故事\t418816\n金贤珠\t418817\n北京师范大学珠海分校\t418818\n星车\t418819\n七年级道德与法治\t418820\npis\t418821\nyoki\t418822\n柴\t418823\nSolo3\t418824\n外包式\t418825\n第87届\t418826\n邊\t418827\n家族史\t418828\n抠章\t418829\n克笔\t418830\n溶胀\t418831\njQuery+CSS3\t418832\n张雅婷\t418833\n毛团\t418834\n女神\t418835\n战龟\t418836\n途安L\t418837\n分布式锁\t418838\n夏曦\t418839\nmontage\t418840\n戒子\t418841\n老汤\t418842\n驻京办\t418843\nOAM\t418844\n战国无双\t418845\n算筹\t418846\n石灰浆\t418847\n马智宇\t418848\nsms\t418849\n思动\t418850\nnotre\t418851\n女一男\t418852\n雷朋\t418853\n艾瑞\t418854\n群控\t418855\n重庆仁爱医院\t418856\n八分之一\t418857\n朝阳市政府\t418858\n缔约方\t418859\n劝\t418860\nharsh\t418861\n劳动人\t418862\n巴比伦\t418863\n美队\t418864\n杨乃文\t418865\n王森\t418866\n佬\t418867\n波光粼粼\t41
8868\n隋炀帝\t418869\n你\t418870\n支撑梁\t418871\ncozmo\t418872\n公牛队\t418873\n殿堂级\t418874\n追缴\t418875\n6天游\t418876\n李海林\t418877\ngetall\t418878\n有饭吃\t418879\n中信保诚人寿保险有限公司\t418880\n七选五\t418881\n要害\t418882\nFLAC整轨/\t418883\n索菲亚\t418884\n萧十一郎\t418885\n迪巧钙片\t418886\n电伤\t418887\n泡利\t418888\n数据封装\t418889\n泰迪\t418890\n病历本\t418891\n其美多吉\t418892\n无单\t418893\n貨\t418894\n补丁\t418895\nSubstring\t418896\n速查表\t418897\n赵丽马天宇\t418898\n汇桔网\t418899\ntinytext\t418900\n营商\t418901\n本溪满族自治县\t418902\n太古汇\t418903\n没用过\t418904\npupa\t418905\n9088\t418906\n全作\t418907\n起停\t418908\n传动轴\t418909\n麦格米特\t418910\n天罗\t418911\n龙门铣\t418912\n周几更新\t418913\n49寸\t418914\n光谷生物城\t418915\n1600MHz\t418916\n数十倍\t418917\n通贩\t418918\n吐槽\t418919\nJAVAEE\t418920\n工笔画\t418921\n恒创科技\t418922\nopenfiler\t418923\n一光\t418924\n劲风\t418925\n浙江医药\t418926\nK个\t418927\nPower\t418928\n21支\t418929\n人墙\t418930\nobi\t418931\n受案\t418932\nConsistent\t418933\n智控\t418934\nWuXi\t418935\n九百年\t418936\n侧边\t418937\n刘文元\t418938\n二项分布\t418939\n要谨慎\t418940\n代办员\t418941\n列控\t418942\n信阳东站\t418943\n冬活\t418944\n司溟\t418945\nl313\t418946\nSimplified\t418947\n_六图网\t418948\nWideScience\t418949\n混动车\t418950\n阿周\t418951\n嘉年华\t418952\nFarming\t418953\nbulb\t418954\narisa\t418955\n卓娅\t418956\nwwfy\t418957\n萤光\t418958\n奥尔加隆\t418959\nAnsi\t418960\n卷理\t418961\n哈弗M6\t418962\n常州天合光能有限公司\t418963\n计数项\t418964\n孙保罗\t418965\n37位\t418966\n倾盆大雨\t418967\n诺瑞\t418968\n总分馆\t418969\n志邦股份\t418970\n鬼族\t418971\n酷知经验网\t418972\n六安市\t418973\nKarry\t418974\nrmbp\t418975\n程俊\t418976\nmy97\t418977\n林奇\t418978\n肠旺面\t418979\npars\t418980\n给于\t418981\n武装突袭2\t418982\nGitHub\t418983\n润滑剂\t418984\n建会\t418985\n物流园\t418986\n145平\t418987\n小米2\t418988\nAnki\t418989\ntataUFO\t418990\n试药\t418991\n北洋海军\t418992\n新疆大学科学技术学院\t418993\n打成一片\t418994\n妥协\t418995\n利力\t418996\n云解析\t418997\n全国总工会\t418998\n82.7万亿元\t418999\n法场\t419000\n扫除\t419001\nB75M-D3V\t419002\n马蹄卡\t419003\nweb_reg\t419004\n张艳红\t419005\n清分\t419006\n长安逸动\t419007\n琥珀园\t419008\n纸玫瑰\t419009\nPT100\t419010\nannaconda\t419011\n加强\t419012\n加纳利\t4
19013\n服务方\t419014\n天和前滩\t419015\n姜成勋\t419016\n海员\t419017\n南翔\t419018\n百穿\t419019\n龙在江湖\t419020\n信丰县\t419021\n神舟飞船\t419022\n音乐之声\t419023\n教女\t419024\n孙国华\t419025\n吐鲁番\t419026\n忻城\t419027\n5-10年\t419028\n浙江国地税联合电子税务局\t419029\n中化泉州石化\t419030\n奇情\t419031\n偲\t419032\nMythology\t419033\n2017超星尔雅\t419034\n数字音频处理器\t419035\n籃\t419036\nacrobat\t419037\nresty\t419038\nfabiao\t419039\n32gb\t419040\n北京邮电\t419041\n嘉和苑\t419042\n黄山山\t419043\n民权运动\t419044\n急性淋巴细胞白血病\t419045\n懒财\t419046\n步调\t419047\n起亚k2\t419048\n大话群侠传\t419049\n小飞飞\t419050\n晾衣\t419051\n习近平关于科技创新论述摘编\t419052\n再结晶\t419053\n刚配\t419054\n新中式别墅\t419055\n红米3\t419056\n暨南大学药学院\t419057\n珍宝展\t419058\n营收入\t419059\n钢铁侠战衣\t419060\n卡帝乐\t419061\n7.3恶魔\t419062\n郑艳丽\t419063\n天涯吧\t419064\n遗憾\t419065\n黑欲\t419066\n张家口南站\t419067\n曼谷素万那普国际机场\t419068\nShaw\t419069\n淋巴管瘤\t419070\n哥本\t419071\ncoreidraw\t419072\n中国纺织人才网\t419073\n嵊泗\t419074\n鲁南制药集团\t419075\n杓\t419076\n勇猛\t419077\n侑子\t419078\n淑女装\t419079\n港港\t419080\n思维导\t419081\n一天后\t419082\n等效性\t419083\n3月28日\t419084\n北京清河\t419085\n钨灯\t419086\n萨里大学\t419087\n立誓\t419088\nlovo\t419089\n小狐丸\t419090\nmaven库\t419091\nISD\t419092\nwenj\t419093\n满舒克\t419094\n康平\t419095\n大新银行\t419096\n陈年老\t419097\n眼法\t419098\n不想说\t419099\n友盟+\t419100\n围嘴\t419101\n意气\t419102\n处罚决定书\t419103\nrival\t419104\n宗申动力\t419105\n大连高新园区\t419106\n长岐镇\t419107\n河南省财政厅\t419108\n邓玉娇\t419109\n空档\t419110\n闹矛盾\t419111\n柳树姑娘\t419112\n630分\t419113\n楚大\t419114\n应届\t419115\n裴端卿\t419116\n兔男\t419117\nu10\t419118\n工业革命\t419119\n100个\t419120\nserved\t419121\n李小明\t419122\n国际清算银行\t419123\n光道\t419124\n2.5.0\t419125\n莫吉托\t419126\n异步通信\t419127\n太阳辐射量\t419128\n称量\t419129\nwinerror\t419130\n2.5匹\t419131\n特朗\t419132\n岂止\t419133\n二中民\t419134\nPhysiology\t419135\n特异功能\t419136\nclonezilla\t419137\niguxuan\t419138\n500强\t419139\n亚利桑那\t419140\n玻璃膜\t419141\n涉足\t419142\n东方盐湖城\t419143\ngodaddy\t419144\nvram\t419145\n伏见稻荷\t419146\n中央巡视办\t419147\n城镇企业\t419148\n委培\t419149\n9月4日\t419150\n地接社\t419151\nV社\t419152\n可威\t419153\n两用车\t419154\n王晓婷\t419155\n车辆管理所\t419156\n
NBA2KOL\t419157\n三子\t419158\n材料科学与工程学院\t419159\n批量化\t419160\nairprint\t419161\n泰和家园\t419162\n科技大学\t419163\n顺纹\t419164\ny1\t419165\n包容性\t419166\n拨码开关\t419167\n初一学位\t419168\n绝对值函数\t419169\n微法\t419170\n御道街\t419171\n合肥银泰中心\t419172\n晚上九点\t419173\n竹溪\t419174\nRBS\t419175\n仁宝\t419176\n4580\t419177\n优优美图\t419178\n东方海岸\t419179\n排管\t419180\nHTTP协议\t419181\n榆林站\t419182\n性感海滩4\t419183\n穿透性\t419184\n大同大学\t419185\n华中农业大学植物科学技术学院\t419186\n共同财产\t419187\nboot.ini\t419188\n水准测量\t419189\n教育学家\t419190\n2018年4月27日\t419191\n衡山\t419192\n哈密尔顿\t419193\ntox\t419194\n货源代理网\t419195\n淡泊明志\t419196\n清孔\t419197\nSURF\t419198\n莘庄地铁站\t419199\n双峰县\t419200\n永居\t419201\n库塔\t419202\n脱唇\t419203\nbalm\t419204\n平行式\t419205\n南环桥\t419206\n王建房\t419207\n海康\t419208\n李绮红\t419209\nWiper\t419210\n宗教改革\t419211\n含糖\t419212\n乡居\t419213\n九仙帝皇诀吧\t419214\n第一杯\t419215\nudc\t419216\nnavione\t419217\n伴游\t419218\n天天链\t419219\n517路\t419220\n十赌\t419221\n窗纱\t419222\n天师执位\t419223\n练习3\t419224\n重说\t419225\n三分频\t419226\n尾巴骨\t419227\n搞不定\t419228\n凤凰花园\t419229\n阿牛哥\t419230\n行次\t419231\n瑞英\t419232\n摆\t419233\n吴山镇\t419234\n跑到\t419235\nRotate\t419236\n名发\t419237\n巴斯克斯\t419238\n无痕\t419239\n惯导\t419240\n4.37\t419241\n彩条\t419242\n舒血宁\t419243\n全面战争\t419244\n践悟\t419245\n援救\t419246\n稚语\t419247\n住院率\t419248\n破土\t419249\n天津泰达足球俱乐部\t419250\n大棒\t419251\n萨血\t419252\ndocomo\t419253\n八年\t419254\n工能\t419255\n转乘\t419256\n精神损害抚慰金\t419257\n蒸鱼\t419258\n文绮中学\t419259\ndsr\t419260\n898.com\t419261\n代尔夫特理工大学\t419262\n科陆电子\t419263\n360N5\t419264\n自学\t419265\n不觉\t419266\n刘希彦\t419267\n寨卡病毒\t419268\ninflate\t419269\n翡翠\t419270\n洪洞\t419271\n福建省委\t419272\n休妻\t419273\n奇牙\t419274\n第85期\t419275\n满头\t419276\n话务\t419277\n圣剑使的禁咒咏唱\t419278\nobv\t419279\n子码流\t419280\n35W\t419281\n郑州地铁\t419282\n御皇\t419283\n氰基\t419284\n西湖风景区\t419285\n妖狐\t419286\n板载声卡\t419287\nTualatin\t419288\n海洋国家实验室\t419289\n泛素\t419290\n渊薮\t419291\n口口相传\t419292\n灵山奇缘\t419293\ntrueview\t419294\n旌德\t419295\n韩中杰\t419296\n锭\t419297\n公有云和\t419298\n联合国安全理事会\t419299\nwebzip\t419300\n志成\t419301\n尽职调查\t419302\nv3
.3.1\t419303\n喀什吧\t419304\n佳能ix6780\t419305\nYDUI\t419306\n飞线\t419307\n偷装\t419308\n托塔\t419309\n薮\t419310\n嘉友\t419311\n张广军\t419312\n比思\t419313\n通判\t419314\nlimb\t419315\n山村医生\t419316\n老吴\t419317\n北美留学生日报\t419318\n塞巴斯汀\t419319\naccumulator\t419320\n翻山越岭\t419321\n率性\t419322\n排涝\t419323\n织网\t419324\n榫\t419325\nKona\t419326\n150NK\t419327\n丽芝士\t419328\n厦门中医院\t419329\n南京城市职业学院\t419330\n金峰镇\t419331\n中国科学院南京土壤研究所\t419332\n定向增发\t419333\n张鹰\t419334\ncrazyYo\t419335\n百瑞信托\t419336\n霸虐\t419337\n鸡西市鸡冠区人民法院\t419338\nCardiac\t419339\n八王之乱\t419340\n毁损\t419341\n墙带\t419342\n剑道\t419343\n国药准字\t419344\n通心\t419345\n杨光的快乐生活\t419346\n搞错\t419347\nliyi\t419348\nnfc\t419349\nSOLVED\t419350\n销售假药罪\t419351\n天琊\t419352\n无风险报酬率\t419353\n养父母\t419354\ncat\t419355\n海边的曼彻斯特\t419356\n菩提洲\t419357\n選擇\t419358\nspawn\t419359\n害死\t419360\n张海波\t419361\n新蔡吧\t419362\n灯光\t419363\n犹太教\t419364\n活期存款\t419365\n苏州工业园区人力资源管理服务中心\t419366\n南方医科大学第三附属医院\t419367\n宝象国\t419368\n购物类\t419369\n试管婴儿医院\t419370\n5夜\t419371\nfrustration\t419372\nEKP\t419373\n弗兰卡\t419374\n安逸菌\t419375\nxhdpi、xxhdpi\t419376\n丰都\t419377\n四川大学望江校区\t419378\n七擒七纵\t419379\ncancellation\t419380\n花样滑冰大奖赛\t419381\n2.9.4\t419382\n德阳市公安局交通警察支队\t419383\n刘智远\t419384\n不经意间\t419385\n阳茎\t419386\nTemptation\t419387\n塞林格\t419388\n北京碧水源科技股份有限公司\t419389\n1.10.3\t419390\n中国红客联盟\t419391\n北京阜外医院\t419392\nsdj\t419393\nYarn\t419394\n2018-03-20\t419395\nMYR\t419396\n玄武门之变\t419397\nLQ-690K\t419398\n东方钽业\t419399\n温驯\t419400\n000998\t419401\n棉服\t419402\n好运气\t419403\n爱华网\t419404\nplaque\t419405\n修复术\t419406\n糖尿病史\t419407\nTomCat\t419408\n第618集\t419409\nescorts\t419410\nSurveys\t419411\n醉卧君怀笑\t419412\n中铁一局\t419413\n陌小雨\t419414\n王佳佳\t419415\n联通公司\t419416\n小比熊\t419417\n张壁古堡\t419418\n绍兴人才网\t419419\n老镜头\t419420\nBrandZ\t419421\n阿努\t419422\n追攻\t419423\n杰哥\t419424\nAntarctic\t419425\n电玩巴士网\t419426\n欧六\t419427\nonemore\t419428\n搾\t419429\n宁海新闻网\t419430\n宏观环境分析\t419431\n买盘\t419432\nUNIX\t419433\nmouseup\t419434\n氧化硼\t419435\n遥控\t419436\n宁皓网\t419437\n万能艺\t419438\n雪佛兰4S店\t419439\n一步之遥\t4
19440\n年出\t419441\n宁波三中\t419442\n小米笔记本触摸板\t419443\nsmite\t419444\n大滩\t419445\nSIE\t419446\n断口\t419447\n高逼\t419448\n出口方\t419449\n高厚比\t419450\n1.004\t419451\n0.46\t419452\n利尻昆布\t419453\n大黄脸\t419454\n一品威客网\t419455\nsavokiss\t419456\n新疆维吾尔自治区纪委监察厅\t419457\n木汁\t419458\n绑骨\t419459\n上海徐汇徐家汇\t419460\n20170814\t419461\n8k\t419462\n主频\t419463\nchown\t419464\n一片冰心在玉壶\t419465\nmyBase\t419466\n唇齿\t419467\nxps\t419468\nFirewall\t419469\n山大青岛校区\t419470\n刘美麟\t419471\n2000型\t419472\n1080P/720P高清迅雷\t419473\nFully\t419474\n提入\t419475\n校史\t419476\n20140705\t419477\n保定一中\t419478\nShopheroes\t419479\n肘子\t419480\n同校生\t419481\n京山县人民政府\t419482\nTVBS\t419483\n10.12.4\t419484\n虎步\t419485\n中国科学院上海药物研究所\t419486\n黑龙江省检察院\t419487\n新天籁\t419488\n九三集团\t419489\n海杆\t419490\n黄卫东\t419491\n可乘\t419492\n临门\t419493\n亨利慕\t419494\nsvt\t419495\nFuture\t419496\n净出口\t419497\n大众新桑塔纳\t419498\n孟美岐\t419499\n查理·卓别林\t419500\n白框\t419501\n一小段\t419502\n一个\t419503\n宗法制\t419504\n金振口服液\t419505\n地窝堡国际机场\t419506\n海军陆战队\t419507\n神识\t419508\n改良派论战\t419509\n民歌节\t419510\n上海国际会展中心\t419511\nHandbags\t419512\nxbd\t419513\n注会\t419514\n博园\t419515\n美式乡村风格\t419516\n周7\t419517\n美发\t419518\n葛十三\t419519\n赵英男\t419520\n琉森\t419521\n尊法学法\t419522\n了解\t419523\n青岛大众网\t419524\n十九大新\t419525\n国风\t419526\nDADDY\t419527\n连带率\t419528\n技经\t419529\n轉載\t419530\n毛利元\t419531\n华易\t419532\nBecame\t419533\n地眼\t419534\n涂料\t419535\n80亩\t419536\nCoverity\t419537\n东大门\t419538\nRetrieve\t419539\n20171010\t419540\nqc3.0\t419541\nyindao\t419542\n録\t419543\nebean\t419544\n几百名\t419545\n开步\t419546\ndae\t419547\n智慧城\t419548\n阿虏\t419549\nFiscal\t419550\n女发\t419551\n金榜题名\t419552\n刘春梅\t419553\n十五章\t419554\nbtg\t419555\n眼神\t419556\n轮式机器人\t419557\n0936\t419558\n极米Z4X\t419559\n购货\t419560\n李红兵\t419561\n任村\t419562\n选课\t419563\nnscd\t419564\n合金装备5\t419565\n三门峡水库\t419566\n东航银卡\t419567\n315M\t419568\n错层\t419569\n加西亚·马尔克斯\t419570\n人教版小学六年级数学下册\t419571\n电离\t419572\n全世\t419573\n惨叫\t419574\nSalute\t419575\n网展\t419576\n淡水泉\t419577\n兴宾区\t419578\n秀琴\t419579\n变味\t419580\n银耳红枣\t419581\n能找到\
t419582\n临展\t419583\n民族大道\t419584\n天空树\t419585\n第2话\t419586\nTrek\t419587\n银行间债券市场\t419588\n大企\t419589\n天圣\t419590\nNice\t419591\n贵州省农村信用社\t419592\n20150315\t419593\n牵心\t419594\n学联\t419595\n4016\t419596\n星月神\t419597\n白狗\t419598\n长相忆\t419599\n软红\t419600\n玫瑰城\t419601\n三黄\t419602\n亚瑟传说\t419603\n狄仁杰之神都龙王\t419604\n红星网\t419605\nm330\t419606\n不是我的\t419607\n枫泾古镇\t419608\n学府路\t419609\nmx8.0\t419610\n小生意\t419611\n福摩\t419612\n税后\t419613\n王者荣耀百里守约\t419614\n虚边\t419615\n200多万元\t419616\n张国富\t419617\n国窖\t419618\n科技传播\t419619\n农行掌上银行\t419620\nパレ\t419621\n2:30\t419622\n乞丐王\t419623\n拒嫁\t419624\n负累\t419625\n李进\t419626\n爱幼\t419627\n龙血\t419628\n原平市\t419629\nDownow\t419630\n49式\t419631\n九斗\t419632\nNN\t419633\n打电\t419634\n少工委\t419635\n风闸\t419636\n80分钟\t419637\noutlook2016邮箱\t419638\n新神雕侠侣\t419639\nai模板\t419640\n电表\t419641\n河槽\t419642\n交接箱\t419643\ngskcc\t419644\n支撑\t419645\n贝塔斯曼\t419646\n王硕\t419647\n旗木\t419648\n车品\t419649\n露笑科技\t419650\n破山寺\t419651\n蒋涛\t419652\nCmake\t419653\n小师弟\t419654\n哈尔滨远东理工学院\t419655\n集合类\t419656\n印后\t419657\nAMI\t419658\n撞坏\t419659\n空余\t419660\n紹介\t419661\n大唐悬疑录\t419662\n2.3%\t419663\n中南大学研究生院\t419664\n铣刀\t419665\n益生菌胶囊\t419666\n产业链\t419667\n祙\t419668\n长沙市商务局\t419669\nSNIEC\t419670\n虚假\t419671\ndracut\t419672\n硬拉\t419673\n自聚\t419674\n小佛\t419675\n世茂地产\t419676\nlorde\t419677\n第七套\t419678\n五铢\t419679\n杨梅竹斜街\t419680\n钱先生\t419681\n18M\t419682\nVC2015\t419683\n凤囚最终幻想14\t419684\natheros\t419685\npissing\t419686\n腾讯课堂\t419687\n莱州市\t419688\n合肥一中\t419689\n韩春雨\t419690\n诚通\t419691\nV5.5.1\t419692\n黑狱断肠\t419693\n对流换热系数\t419694\n饮料柜\t419695\n小魂\t419696\n头房\t419697\n植物大战僵尸2高清版\t419698\n金岭\t419699\n欧乐b\t419700\n东城镇\t419701\n开讲\t419702\n督导表\t419703\n最终幻想8\t419704\nxtabond2\t419705\n吉事\t419706\n21010\t419707\n五气朝元\t419708\n巴克莱银行\t419709\n保密工作\t419710\n闲言碎语\t419711\ndnfsf\t419712\n31短信网\t419713\n保管费\t419714\n奥玛\t419715\nBugzilla\t419716\n宝马3系论坛\t419717\n吊牌\t419718\nメリ\t419719\n广州市政府\t419720\n玖月晞\t419721\n冢本友希\t419722\n30多位\t419723\n水印\t419724\nkaifa\t419725\n永远的白衣战士\t419726\nSoundbar
\t419727\n华润翡翠城\t419728\nBlonde\t419729\n苏陕\t419730\nSabre\t419731\n庹震\t419732\nvolunteer\t419733\n姿三四郎\t419734\n王妃\t419735\n橙汁\t419736\ncd8\t419737\n马达加斯加岛\t419738\n5美元\t419739\n乡级\t419740\n阴模\t419741\nphotoshopcs5\t419742\nUBW\t419743\n取用\t419744\nlol吧_\t419745\n丰华\t419746\n俞敏斯嘉丽约翰逊\t419747\n$ajax\t419748\n为了吾王\t419749\n熔岩犬\t419750\n64排\t419751\n华著\t419752\n融资性保函\t419753\n青岛海关\t419754\n脆弱性\t419755\niqos电子烟\t419756\n5.40\t419757\n肋条\t419758\n塘沽站\t419759\n青神\t419760\n映衬\t419761\n色母粒\t419762\n192.168.1.107\t419763\n诚如神之所说\t419764\ncxw\t419765\nEarth\t419766\nOi\t419767\n布洛克\t419768\n千万种\t419769\n布劳恩\t419770\n水帘\t419771\n中药学专业\t419772\n微时光\t419773\n小夜曲\t419774\n遮阴\t419775\nLiunx\t419776\n广州市白云区人民法院\t419777\n吴亚馨\t419778\n海湖新区\t419779\n干音\t419780\n环压\t419781\n庄桥街道\t419782\n卖房\t419783\n2018.3.22\t419784\n弘大\t419785\noppor11\t419786\n绝战\t419787\n要不得\t419788\n中学\t419789\n取暖费\t419790\n武汉工程职业技术学院\t419791\n喜剧明星\t419792\n享客\t419793\n阳逻经济开发区\t419794\n转型\t419795\n2199\t419796\n精粉\t419797\n炸鸡\t419798\nconsole\t419799\n魔兽世界:军团再临\t419800\n500页\t419801\n昆仑镇\t419802\n视觉传达设计\t419803\n高傲\t419804\n圣传\t419805\n里约女排\t419806\n167集\t419807\n工作栏\t419808\n市农业局\t419809\n青岛出版社\t419810\nMatch\t419811\n峰\t419812\n控制标准\t419813\n厦门移动\t419814\nbianyi\t419815\n王心刚\t419816\n广告群\t419817\n安徽省教育招生考试院\t419818\n人间有味是清欢\t419819\n大一边\t419820\n伤势\t419821\nsega\t419822\n7月21日\t419823\nQQ交友群\t419824\n苏宁云商\t419825\n乐迈\t419826\n九键\t419827\n字数\t419828\n热衷\t419829\n诸法\t419830\n雷德王\t419831\n网络阅卷\t419832\n投保人\t419833\n嘉宾们\t419834\n赛哈姆\t419835\n通渭县人民政府\t419836\n云南大学\t419837\n寻梦\t419838\n垫资\t419839\n独孤伽罗\t419840\ncomparative\t419841\n画法\t419842\n枣子\t419843\n黄浦江路\t419844\n毛条\t419845\nArmstrong\t419846\n苏州市规划局\t419847\n佩林\t419848\n佳能ts3180\t419849\n转差率\t419850\n幻想计划吧\t419851\n帐务\t419852\n安知\t419853\n颅内动脉瘤\t419854\nWarner\t419855\n_干\t419856\n静压箱\t419857\n木马屋\t419858\n书馆\t419859\n创世界\t419860\n建阳\t419861\n月刊少女野崎君\t419862\nLIMITED\t419863\n1.8GHz\t419864\n笑匠\t419865\n麦西来普\t419866\ncontiki\t419867\n22亿\t419868\nDoze\t419869\
n赛宝\t419870\n飞乐鸟\t419871\n57\t419872\n6302\t419873\n几进制\t419874\n极真\t419875\n南京中医院\t419876\n酒石酸美托洛尔片\t419877\n康柏西普\t419878\n神人\t419879\n真爱如血\t419880\n芒果树\t419881\n蛋龟\t419882\n一清机\t419883\n羽田璃子\t419884\nmaxon\t419885\n罗念生\t419886\nglmnet\t419887\nu32\t419888\n上海海洋馆\t419889\nradiant\t419890\nMaps\t419891\n迪丽热巴\t419892\n娱乐新闻_新浪网\t419893\n五路居\t419894\nStrike\t419895\n细胞免疫治疗\t419896\ndfg\t419897\n手机同步助手\t419898\n高新一小\t419899\n国旗班\t419900\n出院\t419901\n云光\t419902\n美雕\t419903\n训练包\t419904\n小手拉大手\t419905\n计划篇\t419906\n2017年2月14日\t419907\n门兴格拉德巴赫\t419908\n啃\t419909\nmpiexec\t419910\n战宠\t419911\n快一点\t419912\n9.98\t419913\n北美\t419914\nMy.4399.com\t419915\nFamous\t419916\n助管\t419917\n奉化新闻网\t419918\n茶府\t419919\n串案\t419920\n鸿茅\t419921\n紫金东郡\t419922\n标志板\t419923\n第四任\t419924\n续班\t419925\nANIMAX\t419926\nSCORE\t419927\n径路\t419928\n吴怡\t419929\n甲磺酸倍他司汀片\t419930\n再生纸\t419931\nbing\t419932\n明慧阿瑞\t419933\n按摩院\t419934\n谝\t419935\n兴县\t419936\n闲云若海\t419937\n|—16Aspx.com\t419938\n神盘\t419939\nftplib\t419940\n仿丝棉\t419941\nauthenticity\t419942\nMir\t419943\n汤姆猫跑酷\t419944\n85届\t419945\n天阙\t419946\nagainst\t419947\n驰界\t419948\n爱霸迪\t419949\neastbay\t419950\n稼\t419951\n穿衣\t419952\n束脚\t419953\n四第一章\t419954\n探物\t419955\n李培根\t419956\n花岗\t419957\n史泰柳传志\t419958\n玛琳\t419959\n舛\t419960\n余辉\t419961\n飞得\t419962\n果冻胶\t419963\n权衡利弊\t419964\n生粉\t419965\n20180220\t419966\n72层\t419967\n坊子新区\t419968\n节假日前\t419969\n枣花\t419970\n3.56\t419971\n110万\t419972\nTunnel\t419973\n大埔县\t419974\n什么状\t419975\n奥运会\t419976\n陈友谅\t419977\n胡鸿钧\t419978\n毡\t419979\n双鹭药业\t419980\n名前\t419981\narguement\t419982\n举足轻重\t419983\n水官\t419984\n模拟人生4\t419985\n驱散\t419986\n伯纳黛特\t419987\n优\t419988\n圣翼\t419989\n黑狐\t419990\n蜜荟\t419991\nFFMpeg\t419992\n珠海高铁站\t419993\n备受\t419994\n上汽通用雪佛兰\t419995\n炸麻花\t419996\n相持\t419997\n中国河东政府\t419998\n尽职调查报告\t419999\n广州酒店\t420000\n5.2.2\t420001\n睦月\t420002\n碴子\t420003\nUNIVERSAL\t420004\nhistcite\t420005\nsidebar\t420006\n硬度\t420007\n0.24.1\t420008\n抗癌健康网\t420009\nDDE\t420010\n杀戮时刻\t420011\n港闸区人民政府\t420012\n八名\t4200
13\nassistant\t420014\n苏家坨\t420015\nsothat\t420016\n27%\t420017\n恒大童\t420018\nActivity\t420019\n粤\t420020\n天空之镜\t420021\n汉斯顿\t420022\n宗教信仰\t420023\nk/3\t420024\n三国志12吧_\t420025\nsims4\t420026\n83名\t420027\nallstar\t420028\n南阳镇\t420029\n积尘\t420030\n中一个\t420031\n春蚕\t420032\n雷神1\t420033\nxmos\t420034\n200w\t420035\n切牙\t420036\n神秘佛眼\t420037\n阳原县\t420038\n前排\t420039\n重钙粉\t420040\n婚外情人\t420041\n亚非拉\t420042\n反结\t420043\n192.168.1.103\t420044\nwin7屏保\t420045\n影源\t420046\n柠檬花\t420047\n长江网\t420048\n刑事拘留\t420049\n油电混合汽车\t420050\n有助\t420051\niconworkshop\t420052\naxshare\t420053\n热媒\t420054\n5u\t420055\nunp\t420056\n扣开\t420057\nFingers\t420058\n第1位\t420059\nggs\t420060\nt410i\t420061\n陈琴\t420062\n广告网\t420063\n捷配仪器仪表网\t420064\nMU-MIMO\t420065\n公路工程技术标准\t420066\npptpd\t420067\n迪威\t420068\n分数值\t420069\n糖葫芦\t420070\n吉县\t420071\n网红\t420072\n架次\t420073\n中华取名网\t420074\n廊桥遗梦\t420075\n泰山学院\t420076\n大宁国际商业广场\t420077\n留学生活费\t420078\nBy2\t420079\n第二三\t420080\n非机动\t420081\n奕星\t420082\n特警\t420083\n水瓶座女\t420084\n杭州银行\t420085\n双岛\t420086\n后官湖\t420087\n私立学校\t420088\n重庆电视台\t420089\n词串\t420090\nfactual\t420091\n移动天王卡\t420092\n燕公子\t420093\n金地长湖湾\t420094\n武清杨村\t420095\nggo\t420096\nResistor\t420097\n党内外\t420098\nasan\t420099\n碎语\t420100\n寸金\t420101\n开洞\t420102\n2970\t420103\n曝光机\t420104\n信息技术岗\t420105\n配译\t420106\n外教社杯\t420107\n特技\t420108\n轮辐\t420109\nlinn\t420110\n沾化区\t420111\n阿克苏\t420112\n第三十二期\t420113\nLinux运维-51CTO\t420114\n里美\t420115\n惠州新闻网\t420116\n监理部\t420117\nenzo\t420118\nSena\t420119\n艾琳传奇\t420120\n建证\t420121\n水利部\t420122\n京秦高速\t420123\nCrooked\t420124\n蜷缩\t420125\n粗线\t420126\n莒南\t420127\nNH4+\t420128\nOurCoders\t420129\n翩翩\t420130\n黎里古镇\t420131\n麻花\t420132\n编码集\t420133\n砸坏\t420134\n国家电力公司\t420135\n线程锁\t420136\n轻便\t420137\nJSHint\t420138\n轮距\t420139\nbuffet\t420140\n怪物猎人4G\t420141\n邋遢大王奇遇记\t420142\n饱\t420143\n流管员\t420144\n大兴新区\t420145\n广州市国家税务局_广东省国家税务局\t420146\n新历\t420147\n不婚\t420148\n二尖瓣返流\t420149\nUV打印机\t420150\n老寿星\t420151\n106万\t420152\n城春\t420153\n锦集\t420154\n子初\t420155\n云os\t420156
\n舍弃\t420157\n防爆灯\t420158\n石英粉\t420159\n短期融资券\t420160\n第十套\t420161\n加速度\t420162\nEntries\t420163\nu0\t420164\n山村落\t420165\n缩杆\t420166\n电动折叠后视镜\t420167\n白家村\t420168\n弃妃\t420169\n功成身退\t420170\n梅花村\t420171\n莱西\t420172\n氩氦刀\t420173\n神巴巴八字算命网\t420174\n拆分\t420175\n鹿犬\t420176\n二周年\t420177\n城市:天际线\t420178\n贵州市\t420179\nATMA\t420180\n高位截瘫\t420181\n怡红院\t420182\n版主\t420183\n两三个小时\t420184\n刘昆\t420185\n火星人\t420186\n17173剑网3\t420187\n洪宪\t420188\nMisaki\t420189\n荣耀4x\t420190\n金平\t420191\n消毒池\t420192\n滴塑\t420193\n500题\t420194\n捧花\t420195\n趋吉避凶\t420196\nMSDN版\t420197\n触摸板\t420198\nLepus\t420199\n第二十二\t420200\n大街\t420201\nsod\t420202\nstump\t420203\n韬光\t420204\nFiddler4\t420205\nt8ie\t420206\nchb\t420207\nINNER\t420208\n何弃疗\t420209\nshell编程\t420210\n未\t420211\n就医160网\t420212\n几百\t420213\n3459\t420214\nIVY\t420215\nboin\t420216\nsanic\t420217\n宁夏回族自治区交通运输厅\t420218\nRAR\t420219\n间谍战\t420220\n农家院\t420221\n消耗定额\t420222\nonShareAppMessage\t420223\n低杆\t420224\n慕羽茜\t420225\n益普索\t420226\n缤智逍遥兵王\t420227\n卡梅尔\t420228\n丢人\t420229\n老牛湾\t420230\n龙湾机场\t420231\nipho\t420232\nApktool\t420233\nrpc\t420234\n润都股份\t420235\nlmdb\t420236\n婚尚\t420237\n安骑士\t420238\n群链接\t420239\n波西米\t420240\nrnn\t420241\n夏冬春\t420242\n不谋\t420243\nholtek\t420244\n吊锤\t420245\n郊\t420246\n开禁\t420247\n三勤\t420248\n新手卡_激活码-游久网\t420249\n雍正通宝\t420250\n沉入\t420251\n新东路\t420252\n20160725\t420253\n致癌物\t420254\n篮网\t420255\nroutines\t420256\ned2k迅雷高清下载_百度云\t420257\n10页\t420258\n预决算\t420259\nシ\t420260\n仗义\t420261\nzuoye\t420262\n盖尔加朵\t420263\nmove\t420264\n喝酒\t420265\n壮苗\t420266\nColes\t420267\n会小二\t420268\n赤坭镇\t420269\nmacb\t420270\nmeasuring\t420271\n画乡网\t420272\n逆耳\t420273\n丰彩\t420274\n索德罗斯\t420275\n陈鸽\t420276\n丰田杯\t420277\n草缸\t420278\nssd固态硬盘\t420279\n重庆火锅米线\t420280\n新斯\t420281\nST钒钛\t420282\n2889\t420283\nHttpd\t420284\n车桥\t420285\n吉利博越\t420286\n尚一网\t420287\n设计\t420288\ncetc\t420289\n一代女皇\t420290\n痴汉十人队\t420291\n辩护律师\t420292\n首关\t420293\n汉文\t420294\n随便说说\t420295\n抚子\t420296\n诊所\t420297\ninterview\t420298\n中国围棋协会\t420299\n济南大学\t420300\n残
体\t420301\n魔百盒\t420302\n郧阳区\t420303\n天龙八部畅易阁\t420304\n执业\t420305\n生住房补贴\t420306\n莫尔斯\t420307\n那场\t420308\nrsm\t420309\n未予\t420310\ndavinci\t420311\n重案组\t420312\n昌吉新闻网\t420313\n刘恒\t420314\n促动\t420315\n超合金魂\t420316\n王鑫\t420317\n扫码\t420318\n生卒年\t420319\n沃尔法特\t420320\n6缸\t420321\n贷方\t420322\n在心\t420323\n棹\t420324\n五举措\t420325\n百度云网\t420326\n载文\t420327\n分集剧\t420328\n福州八中\t420329\n1000千瓦\t420330\n小米手机4S\t420331\n布拉张朝阳\t420332\n改型\t420333\n0061\t420334\n举升\t420335\n平陵\t420336\n导演版\t420337\n津塘路\t420338\nDRG\t420339\n肩宽\t420340\n成都银行\t420341\ntiara\t420342\n211学校\t420343\n下一篇\t420344\n数控立车\t420345\n台湾银行\t420346\n龙头股份\t420347\n枇杷酒\t420348\n探险者协会\t420349\n风险率\t420350\n化学工\t420351\n克莱姆森大学\t420352\nkarcher\t420353\n2.3\t420354\n2007年7月\t420355\nShino\t420356\nsuccess\t420357\n中旅银行\t420358\n凵\t420359\n收载\t420360\n世豪广场\t420361\n维瓦尔\t420362\n吴悠\t420363\n爱乐维复合维生素片\t420364\n非珠\t420365\n格力集团\t420366\n公共基础知识/行测/申论\t420367\n5580\t420368\n派驻\t420369\n连桶\t420370\n棉拖鞋\t420371\n沧源\t420372\n茂木夏树\t420373\nsplunk\t420374\n新华网评\t420375\n盘头\t420376\n识趣\t420377\n悠哉悠哉\t420378\n100万元\t420379\n难能可贵\t420380\n尿毒\t420381\n中国裁判文书网\t420382\n佛头\t420383\n伦镍\t420384\n酒精\t420385\n_道\t420386\n中油工程\t420387\n我们的挑战\t420388\npaperwhite2\t420389\n中央一套\t420390\n电解\t420391\n收购商\t420392\n空港区\t420393\n瑞智\t420394\n天地源\t420395\n中国地理区划\t420396\n不问\t420397\n比利·林恩\t420398\nThinkPa\t420399\n塔罗测试\t420400\n炉边\t420401\n二手车市场\t420402\n迅雷云盘\t420403\n烈空\t420404\n南街镇\t420405\nautomobile\t420406\n抄报\t420407\nzhaosf\t420408\n贞芪扶正颗粒\t420409\nb18\t420410\n战壕\t420411\n火车头\t420412\n赛琳娜·戈麦斯\t420413\n无谓\t420414\n索拉菲尼\t420415\n每5年\t420416\n昆明动物园\t420417\nDVD机\t420418\nshenghuo.huangye88.com\t420419\n知情人\t420420\n茸茸\t420421\n全氟\t420422\n王羲之兰亭序\t420423\n高建平\t420424\n分\t420425\n朱雀国家森林公园\t420426\nIntensity\t420427\n现代农业科技\t420428\n黑人群\t420429\n宋一国\t420430\nvis\t420431\n美食广场\t420432\n光晕2\t420433\n封口费\t420434\n白金湾\t420435\n空气泵\t420436\nPPT字体\t420437\n宇光\t420438\n心肝\t420439\n北京德恒律师事务所\t420440\nbattlenet\t420441\nKP2\t420442\n免钉\t420443\n小强\t42044
4\nSpill\t420445\n焦点访谈\t420446\n小妖精\t420447\n快乐女声\t420448\n鼓楼区\t420449\n箱单\t420450\nfreaks\t420451\n官居大唐贞观\t420452\n气昂昂\t420453\n反风\t420454\n保德\t420455\n齐师傅\t420456\n巫漪丽\t420457\n陈鼓应\t420458\n正丰\t420459\n二七纪念塔\t420460\n声扬\t420461\n环境保护法\t420462\ncheung\t420463\nPolishing\t420464\nPW\t420465\n曾奇峰\t420466\n鸿\t420467\n基督山\t420468\n偷贼太放肆\t420469\n252\t420470\n张春林\t420471\n一把火\t420472\ncsr2吧\t420473\n输油\t420474\n黄剑\t420475\nBi\t420476\n建模\t420477\n985工程\t420478\n【伊\t420479\n临沂国家高新技术产业开发区\t420480\n扑食\t420481\n杭州住房公积金管理中心\t420482\n李亚明\t420483\n郭金\t420484\n有识\t420485\n烤鸡胸肉\t420486\n1.5%\t420487\nhbg\t420488\n服男\t420489\n组态软件\t420490\n西普尔\t420491\n中国内衣网\t420492\n荧火\t420493\n公安机\t420494\n15页\t420495\n星宇免费超市收银软件\t420496\n用友t3\t420497\n不良贷款率\t420498\n萘敏维滴眼液\t420499\nbcn\t420500\nmfc-7360\t420501\n世界之最网\t420502\nAoyama\t420503\n家级\t420504\n左键\t420505\ncarbo\t420506\n嘟督\t420507\nCCTV-3_央视网\t420508\nReferences\t420509\n11gr2\t420510\n量子\t420511\n4.9%\t420512\n社保滞纳金\t420513\nE1级\t420514\n南宁市科技局\t420515\n唐人街探案2\t420516\n柏庄\t420517\n闭门造\t420518\n視\t420519\n三姑\t420520\nEle\t420521\n世纪华联超市\t420522\n2.19G\t420523\n前程无忧实习\t420524\n洞桥\t420525\n通向\t420526\n车田正美\t420527\n恋狱月狂病\t420528\nprotection\t420529\n虎克\t420530\n2548\t420531\n偶页\t420532\nyiliao\t420533\n股权回购\t420534\n跨页\t420535\ncompany\t420536\n基干\t420537\n刀刃\t420538\n假名\t420539\nBiketo\t420540\n样瓶\t420541\nkeymap\t420542\n劳模\t420543\n_兔玩网\t420544\n王曦\t420545\nCum\t420546\n2018035期\t420547\n兵法\t420548\n成德\t420549\n蒋校长\t420550\nResize\t420551\nASQ\t420552\n美图秀秀批处理\t420553\n辰波\t420554\n程明志\t420555\n闸口镇\t420556\nsFlow\t420557\n开元物业\t420558\n三江路\t420559\n他么\t420560\n乳腺\t420561\n邓石\t420562\n虎门火车站\t420563\n寫\t420564\n星光小学\t420565\nmilling\t420566\n武汉科技\t420567\n麦\t420568\n微软输入法\t420569\n期望报酬率\t420570\nBlacks\t420571\n书钉\t420572\n白日不到\t420573\ncorrect\t420574\n薛丁山\t420575\n章丘人才网\t420576\n彭辉\t420577\nARS\t420578\nACL\t420579\n央视科教\t420580\n病弱\t420581\nBerlin\t420582\n信不信由你\t420583\n海美迪q5\t420584\n刺桐\t420585\n川大附中\t420586\n宏观经济学\t420587\n庙子镇\t
420588\n魅网\t420589\n科怀·伦纳德\t420590\n上海财经大学\t420591\n美宅\t420592\n罗布麻\t420593\nCouples\t420594\n250mm\t420595\n千人计划网\t420596\n忆锦\t420597\n1.6万元\t420598\n热继电器\t420599\n港京\t420600\n第一食品网\t420601\n脚气\t420602\n易装\t420603\nsha1\t420604\n利劲\t420605\nluxu\t420606\nSerialPort\t420607\n小白砖\t420608\n柴油发动机\t420609\n天下大乱\t420610\n胶原酶\t420611\n违法行\t420612\n100a\t420613\n徐州工程学院\t420614\n演算\t420615\n酷睿i5\t420616\n0.5m\t420617\n成都地铁17号线\t420618\n电索\t420619\nwaybill\t420620\n研究人\t420621\n蓝膜\t420622\n合资\t420623\n港沿镇\t420624\n瓷枕\t420625\nf607\t420626\n进禁\t420627\nTandon\t420628\n突破年\t420629\n天行者\t420630\n铅蓄电池\t420631\n中国半导体行业协会\t420632\n马普\t420633\n新郑市人民政府\t420634\n陈朵\t420635\n西红柿炒蛋\t420636\nblod\t420637\n234\t420638\n24时\t420639\n福州路\t420640\nremovable\t420641\n无精\t420642\nScreaming\t420643\n罗曼尼康帝\t420644\n小鱼儿\t420645\n美罗华\t420646\n麦乐购\t420647\n焦阳\t420648\n缩聚反应\t420649\n出问\t420650\n黎兵\t420651\n坭兴陶\t420652\n同泽\t420653\n中交隧道工程局有限公司\t420654\n專\t420655\n中华辞赋网\t420656\n东皇太一\t420657\n一赛\t420658\n相关者\t420659\n第181章\t420660\n德乐\t420661\n南少林寺\t420662\n货节\t420663\n护眼笔\t420664\n安永会计师事务所\t420665\n砂仁\t420666\n蔡国华\t420667\n手工盒\t420668\n墨仓\t420669\n日包\t420670\nAustralian\t420671\n110页\t420672\n奏鸣曲\t420673\n白鲸记\t420674\n许鞍华\t420675\nLectures\t420676\n张玲玲\t420677\n1592\t420678\n16.12\t420679\n瘀血\t420680\n悦达起亚\t420681\n盲校\t420682\n藍\t420683\n曾杰\t420684\n0.7\t420685\n俄罗斯圣彼得堡\t420686\n青木关\t420687\nNerf\t420688\njeDate\t420689\n149路\t420690\n岁月静好\t420691\n严谨\t420692\n引力波\t420693\n英雄豪杰\t420694\nLING\t420695\n古为今用\t420696\n雅加达\t420697\n万邦\t420698\n婴幼儿\t420699\n团结\t420700\n工商局公司\t420701\n110期\t420702\n巴扎黑\t420703\n百字明\t420704\nlinx\t420705\n四川工业科技学院\t420706\n汇演\t420707\n254nm\t420708\n民国四大美女\t420709\n康震\t420710\n970M\t420711\n昌盛\t420712\n哄笑\t420713\ncom\t420714\n杀菌\t420715\n五缘湾\t420716\nConverse\t420717\n葛巾\t420718\n扩增\t420719\n乱欲\t420720\n200枚\t420721\n4价\t420722\n天航\t420723\n鲜肉月饼\t420724\n锡伯杜\t420725\nrealkid\t420726\n荣耀畅玩6X\t420727\n电子板报\t420728\n会计学基础\t420729\ncreen\t420730\n天天一泉\t420731\n博奥软件\t420732\n角尺
\t420733\n一体式\t420734\n农机购置补贴\t420735\n叔侄\t420736\n绝地求生刺激战场电脑版\t420737\n福建省卫生计生委\t420738\n尚先生\t420739\n外延式\t420740\nSteel\t420741\nList类\t420742\n凝练\t420743\n巴东县\t420744\n100件\t420745\n五十岚\t420746\n败局\t420747\nyoke\t420748\n彩卡\t420749\n花眼\t420750\ncximage\t420751\n笑出\t420752\n真嗣\t420753\n德法意瑞\t420754\n公路片\t420755\n苏州留园\t420756\n一场游戏\t420757\n策划员\t420758\n趣味\t420759\n大力税\t420760\n山东11选5\t420761\nAE白\t420762\n睿骋\t420763\n郑州自贸区\t420764\n东方通\t420765\n武汉理工大学交通学院\t420766\n190ml\t420767\n途家\t420768\n卡神网\t420769\n牛顿法\t420770\n8.28\t420771\n满足\t420772\n重剑无锋\t420773\n朴宝剑\t420774\n于永正\t420775\nJIJITANG\t420776\n柳叶湖\t420777\n488元\t420778\n18054期\t420779\n卤肉饭\t420780\n大暑\t420781\n神经组织\t420782\n杨琴\t420783\n智能站\t420784\n万科悦湾\t420785\narctoolbox\t420786\n下肢深静脉血栓\t420787\n静态页\t420788\n三相交流电路\t420789\n亲姐妹\t420790\n小红帽\t420791\n湖滨花园\t420792\n防灾\t420793\n香茗\t420794\n40多个\t420795\n张怡\t420796\n一网打尽\t420797\n空窗\t420798\nWIN2008\t420799\n搜狐视频自媒体\t420800\n11小时\t420801\n7.8亿\t420802\n推力轴承\t420803\n深圳巴士集团\t420804\n海参崴\t420805\n胃胀\t420806\n改回\t420807\n享年\t420808\n苏瑾\t420809\n天九共享控股集团\t420810\n家上\t420811\n移动迷宫3\t420812\n造园\t420813\n55ab\t420814\n存储过程批量\t420815\n定责\t420816\nZHENG\t420817\n幻葬\t420818\n搞笑囧图\t420819\nsl410k\t420820\n娄底日报\t420821\n益安宁丸\t420822\nK701\t420823\n国家国际发展合作署\t420824\n马陵之战\t420825\n8168\t420826\n齿痕\t420827\n随随便便\t420828\n一人之下-一人之下在线漫画\t420829\n捷速ocr\t420830\n根治性\t420831\n町隆史\t420832\neeo\t420833\nwin10资源管理器\t420834\n粉扑\t420835\nWalnut\t420836\naas\t420837\n嘉源\t420838\n魅蓝note6\t420839\n欧洲西部\t420840\n逃杀\t420841\n海富幼儿园\t420842\n公升\t420843\n反之亦然\t420844\n九龙山景区\t420845\n沪尚茗居\t420846\n招拆招\t420847\n脂肪瘤\t420848\n比亚迪秦EV300\t420849\ntraffic\t420850\n叶子板\t420851\n护星者\t420852\nZABBIX\t420853\n烟棒\t420854\ns7200\t420855\n水针\t420856\nillustrat\t420857\n小三峡\t420858\n摩博会\t420859\nindo\t420860\n拔取\t420861\n李小三\t420862\n限购令\t420863\n半元音\t420864\n网易公开课\t420865\n牧濑红莉栖\t420866\n蚕桑\t420867\n忍者龙剑传\t420868\n亚米\t420869\n异乡好居\t420870\n重磅\t420871\n35G\t420872\n中华世纪坛\t420873\n有主见\t420874\n带套\t420875\n英
国格拉斯哥大学\t420876\n帽帽\t420877\n迈之灵片\t420878\n汉仪\t420879\nft4\t420880\n迎宾\t420881\n干妈\t420882\n8.5.29\t420883\n胡风\t420884\n斗琴\t420885\n有机硅消泡剂\t420886\n看法\t420887\n中国建材检验认证集团股份有限公司\t420888\n幽泉\t420889\n预制体\t420890\n降损\t420891\nFrostpunk\t420892\n融创玉兰公馆\t420893\n大瓶\t420894\nphosphatase\t420895\n白斑\t420896\n姚劲波\t420897\n振兴乡村\t420898\nxamp\t420899\n蒋丞\t420900\n1盆\t420901\n吕钦\t420902\n地轮\t420903\nBT文件\t420904\n磷酸戊糖\t420905\n高杆\t420906\n共产党组织\t420907\n厦门岛\t420908\n调价\t420909\n5分钟后期\t420910\n诗婷\t420911\n沉睡\t420912\n一期一\t420913\n曙光股份\t420914\n蒙迪欧论坛_汽车之家论坛\t420915\n防窥\t420916\n看片\t420917\n鲳鱼\t420918\nwnd\t420919\n5.8英寸\t420920\n46平米\t420921\n深圳宝安机场\t420922\n杉\t420923\n米娅\t420924\n频谱\t420925\n坎普\t420926\n中山靖王\t420927\n第号\t420928\nFreya\t420929\n阿玛尼美妆\t420930\n3580\t420931\n冰雷\t420932\n第二遍\t420933\n行为\t420934\n贝克街\t420935\n铁红\t420936\n41.5\t420937\n威戈\t420938\n无锡市区\t420939\n屈原校花的贴身高手\t420940\n武平\t420941\n发展高层论坛\t420942\n放牧\t420943\n阿斌\t420944\n第一屏\t420945\n直辖\t420946\n绿城桃源\t420947\n泰姆瑞尔\t420948\n急冻鸟\t420949\n长瘤\t420950\n子洲县\t420951\nVeryCD\t420952\n国家地理信息\t420953\n嫌疑人\t420954\n未来网红领巾集结号\t420955\n问津\t420956\nwdm\t420957\n立领\t420958\n福州火车北站\t420959\n亿科\t420960\n战略类\t420961\n2542\t420962\n仕馨\t420963\n骨傲天\t420964\n21套\t420965\n三四月\t420966\nhain\t420967\n中国经营报\t420968\n江苏人大\t420969\n筏子\t420970\nA货\t420971\namazing\t420972\n青春版\t420973\n作业盒子小学\t420974\n春夜洛城闻笛\t420975\n大唐电力\t420976\n杨志明\t420977\n浅秋\t420978\n首间\t420979\n可可小爱\t420980\n空白框\t420981\nCUSTOM\t420982\n舍去\t420983\nTLSv\t420984\n江阳\t420985\nC4L\t420986\n浣东街道\t420987\n志刚\t420988\n波兰尼\t420989\n瑞雪兆丰年\t420990\nTVCM\t420991\n改机\t420992\nY71\t420993\n冼夫人\t420994\n久久网\t420995\n灭门惨案之借种\t420996\n多菌灵\t420997\n氯雷他定\t420998\n花猪\t420999\n李毅\t421000\n噬龙蚁\t421001\nemphasis\t421002\n10600\t421003\n脱困\t421004\n禾匠\t421005\n91部\t421006\n港雪宝\t421007\n醛类\t421008\n破天一剑\t421009\naa制\t421010\n蔡妍\t421011\npiad\t421012\n刘忠范\t421013\nstudio模拟器\t421014\n四川省水利厅\t421015\n兴创\t421016\nOracle\t421017\nmarx\t421018\n维修站\t421019\n夏雪\t421020\n云中君\t421021\nControll
ing\t421022\npsftp\t421023\n神秘巨星\t421024\n惹不得\t421025\nG11\t421026\nAPP手游工作室\t421027\nuploaded_file\t421028\n38号\t421029\ncqc认证\t421030\njava环境变量\t421031\n易法\t421032\n小升初开放日\t421033\nnjs\t421034\n善缘文库_善缘网\t421035\n青芒果\t421036\n口腔科\t421037\n仙剑客栈\t421038\npcolor\t421039\n俩句\t421040\n洪业\t421041\n载货车\t421042\n4.30\t421043\n会声会影X9\t421044\n务通\t421045\n汉化补丁v1.0\t421046\n废柴兄弟3\t421047\n曼牌滤清器\t421048\n低压电工证\t421049\n后端\t421050\n高全\t421051\n美线\t421052\n膜性肾病\t421053\nqq输入法\t421054\n万恶\t421055\nzif\t421056\n植眉\t421057\n1.81\t421058\n守卫战\t421059\n扁担\t421060\n上海市浦东新区人民法院\t421061\n龚克\t421062\n飞飞\t421063\n中国工程建设标准化协会\t421064\nAutopano\t421065\n垮\t421066\n壑\t421067\nbicycle\t421068\n最佳阵容\t421069\n一知\t421070\n小象学院\t421071\n千禧一代\t421072\n平板m3\t421073\n钦州港\t421074\n闹钟健身网\t421075\n来由\t421076\ninvision\t421077\n电视背景墙\t421078\n经济犯罪\t421079\n蜜月游\t421080\nshockwave\t421081\n斋普尔\t421082\nOLS\t421083\n广州市民政局\t421084\n道子\t421085\n无极绳绞车\t421086\n所属地\t421087\n飞利\t421088\n压枪\t421089\nonerepublic\t421090\n苏州工业园区职业技术学院\t421091\nhsd\t421092\n小儿哮喘\t421093\n创字\t421094\n两连\t421095\n旅游百事通\t421096\n无障碍设计规范\t421097\n永登\t421098\n2.17\t421099\n清帝\t421100\n吴道子\t421101\n战日\t421102\n福园小区\t421103\n阿尔\t421104\n素鸡\t421105\n排华\t421106\nmarimo\t421107\nSpline\t421108\n一百多块\t421109\n纺织展\t421110\nMR\t421111\n150种\t421112\n一张流\t421113\n入篮\t421114\n阿昔洛韦\t421115\n制人\t421116\nhackrf\t421117\n大灯泡\t421118\nv1.1.8\t421119\n梦溪石\t421120\n维埃里\t421121\n艰不拆\t421122\n贵德\t421123\n联合银行\t421124\n井陉县\t421125\n抵押率\t421126\n唐仁健\t421127\n60关\t421128\nTHANK\t421129\n樱桃轴\t421130\n良渚博物院\t421131\n花斑\t421132\n僵尸猎场\t421133\n可调式\t421134\nGuinness\t421135\n软件著作权\t421136\nmpchart\t421137\n不完\t421138\n嘉晟\t421139\n7月26日\t421140\nSpicy\t421141\nrectifier\t421142\n潜入\t421143\n二波\t421144\n宋思明\t421145\n1044\t421146\nMRI\t421147\n41.22\t421148\n狄仁杰之通天帝国\t421149\n⑤\t421150\n单三\t421151\n鹤管\t421152\nCFW\t421153\n迅雷5\t421154\nSDB\t421155\n地热管\t421156\n香雾\t421157\n谋杀案\t421158\n实型\t421159\n楚天都市报副刊_多媒体报\t421160\n一招一式\t421161\nP20PRO\t421162\n球霸\t421163\nsupp
ort-v4\t421164\n施瓦茨\t421165\n结业\t421166\niGola\t421167\n3.1.0\t421168\n黄河大合唱\t421169\nPPT_\t421170\n乘着\t421171\n弄虚作假\t421172\n运输量\t421173\n玛特伽\t421174\n坦克英雄\t421175\nhist函数\t421176\n平手\t421177\n筑龙园林\t421178\n杀神叶欢\t421179\n第68\t421180\n本式\t421181\n美穂\t421182\n关联关系\t421183\nLOLS8\t421184\n30千米\t421185\n四川省中医院\t421186\n3-5分钟\t421187\n3.2G\t421188\n28324267\t421189\n52套\t421190\n共党\t421191\nroll点\t421192\n林佳\t421193\n五老\t421194\n金皇朝\t421195\n有样\t421196\n80位\t421197\n立花琉莉\t421198\n银行卡\t421199\n复方醋酸地塞米松乳膏\t421200\ndid\t421201\n华东医药\t421202\n国保\t421203\nBitLocker\t421204\n张至顺道长\t421205\n离行式\t421206\n逐滴\t421207\n加气混凝土\t421208\n阿里公司\t421209\nOSX86\t421210\n变态男\t421211\n脑桥\t421212\n一海\t421213\n64219000\t421214\nPlaylist\t421215\n湖南地区\t421216\n多田\t421217\n127号\t421218\n计划书\t421219\n9797\t421220\n纪实文学\t421221\nmentally\t421222\nprocedures\t421223\n朱温\t421224\n白桦树\t421225\n德清县\t421226\n波动率指数\t421227\n签员\t421228\n有一说\t421229\n曲美家居\t421230\nswfit\t421231\n一个50岁\t421232\n结婚登记处\t421233\nluoluo\t421234\n李彪\t421235\nHero5\t421236\n小米社区\t421237\n试剂盒\t421238\n腓\t421239\n通胜\t421240\n球状\t421241\n手枪\t421242\n绝地求生信号枪\t421243\nMP3格式\t421244\n苏南国家自主创新示范区\t421245\n后沙峪\t421246\n2017三\t421247\nhigh\t421248\n桂林市七星区\t421249\n猩红水黾\t421250\n历历在目\t421251\nBeatles\t421252\nDrug\t421253\nanim\t421254\nalisa\t421255\n实不实\t421256\n超鬼王猫\t421257\nwork2\t421258\n孟加拉豹猫\t421259\n处方药\t421260\n14所\t421261\n张易之\t421262\n嘿哈\t421263\n表蒙\t421264\npinlue\t421265\n美敏伪麻溶液\t421266\n清风苑\t421267\n芭比堂动物医院\t421268\nmame\t421269\n习题库\t421270\n平次\t421271\nwarrior1234\t421272\nvolumetric\t421273\nWAV分轨/\t421274\n招供\t421275\n创基\t421276\n费雯丽\t421277\nOPPOR9S\t421278\n中国政协网_中国人民政治协商会议全国委员会\t421279\n交换生\t421280\n丽芙\t421281\n加兰德\t421282\n线刷机包\t421283\n青云志\t421284\n汇港\t421285\n情分\t421286\n起源于\t421287\n恻隐之心\t421288\n剑币\t421289\n军医\t421290\n1.0c\t421291\n森林湖\t421292\nmodelx\t421293\n黄生\t421294\n智者乐水\t421295\nterrorism\t421296\n自作多情\t421297\nbeethoven\t421298\n8月中旬\t421299\nyz\t421300\n晶状体\t421301\n研磨\t421302\n奥伦纳素\t421303\n来来来来\t4213
04\n六脉神剑\t421305\n1.34\t421306\n上海京剧院\t421307\nFBS\t421308\nHyperion\t421309\nOman\t421310\n天気\t421311\n信汇\t421312\npowerpc\t421313\n20速\t421314\n1489\t421315\n伊通\t421316\nsinz\t421317\n_值客\t421318\n交易性金融负债\t421319\n好多个\t421320\n2GB\t421321\n265\t421322\n连不到\t421323\n隱藏\t421324\n旋翼式\t421325\n忽\t421326\n钉钉考勤打卡\t421327\n淘书团\t421328\n国荣\t421329\n张翎\t421330\n杨继盛\t421331\nonvif\t421332\n大同煤业\t421333\nzui\t421334\nfiber\t421335\n凉\t421336\n218.21\t421337\n花书\t421338\n钟诚\t421339\n奇装异服\t421340\n爱要坦荡荡\t421341\n昆明航空\t421342\n凯琳\t421343\n酸钙\t421344\nmicrowave\t421345\nvdb\t421346\n华莎\t421347\n合同违约责任\t421348\n罗伯特德尼罗\t421349\n诺玛\t421350\n深圳教育局\t421351\n五凤\t421352\n莲美恋\t421353\n10.7.1\t421354\n中海信\t421355\n人渣反派自救系统\t421356\n米脂三中\t421357\n裕安区\t421358\n2014-2018年\t421359\n8根\t421360\n郭阳郭亮\t421361\n1040万\t421362\n118号\t421363\n幡\t421364\nProdigy\t421365\n今天我是升旗手\t421366\n振金\t421367\n巴黎\t421368\n相位差\t421369\nsimhei\t421370\n弹药箱\t421371\n浸塑\t421372\n缰绳\t421373\n中英文版\t421374\n李培\t421375\n储值\t421376\nボク\t421377\nrfid\t421378\n_地藏缘论坛\t421379\n竞合\t421380\n攻心术\t421381\n宁波市鄞州区人民政府\t421382\n企业资产损失所得税税前扣除管理办法\t421383\nblog\t421384\nputchar\t421385\nCX-9\t421386\n上海中心城区\t421387\n全派\t421388\n摩腾\t421389\n嘉岚\t421390\n不祥之刃\t421391\n番长\t421392\n2000个\t421393\n南京\t421394\n山泥若\t421395\n八通\t421396\n7878\t421397\nagents\t421398\n李立群\t421399\n加油稿\t421400\n大迈\t421401\nN10\t421402\n西乡塘客运站\t421403\n无云\t421404\n百里奚\t421405\n长智齿\t421406\n贝达\t421407\n卡五星\t421408\n博商\t421409\n空鼓\t421410\n熠诺\t421411\n重临\t421412\n银管\t421413\n古美\t421414\n比表\t421415\n滚涂\t421416\n吸精\t421417\n为卿\t421418\n中国科学技术大学附属第一医院\t421419\n一礼拜\t421420\n子序列\t421421\nPanzer\t421422\n丹鹿通督片\t421423\n唱空\t421424\n洗马\t421425\n美陈\t421426\ndemands\t421427\n8053\t421428\n三十五周年\t421429\n古丽\t421430\n艾滋病抗体\t421431\n徐皇后\t421432\ni7p\t421433\ncropped\t421434\n心脏病学\t421435\n床位\t421436\n邪恶漫画大全彩图版_邪恶少女漫画无翼鸟全集日本邪恶漫画\t421437\n张志华\t421438\ncnp\t421439\n广和\t421440\n天天日影院\t421441\n掌酷门户\t421442\n十八弯\t421443\n疲\t421444\n什么感\t421445\n无良\t421446\nu乐娱乐\t421447\n隐伏\t421448
\n莫乱\t421449\n刘淑青\t421450\n含量\t421451\nchao\t421452\n2015年04月\t421453\n约翰\t421454\n王阳明全集\t421455\n浦东政府\t421456\n绝缘体\t421457\n黄金段\t421458\n晓霞\t421459\n阵线\t421460\nadopt\t421461\n不退钱\t421462\n米易县人民政府\t421463\n双林寺\t421464\n乙肝表面抗原\t421465\n天龙影院\t421466\n日月潭\t421467\n4圆\t421468\nDBS\t421469\n气血康口服液\t421470\n人民中路\t421471\n平利县人民政府\t421472\nunknown\t421473\n加包\t421474\n发函\t421475\n另一款\t421476\n12kg\t421477\n中性粒细胞绝对值\t421478\n喵窝\t421479\n励志类\t421480\n十个多月\t421481\n赛欧\t421482\n130\t421483\n居首\t421484\n砂锅麻辣烫\t421485\n水帽\t421486\nfpp\t421487\nSHOCK\t421488\n锐志论坛论坛\t421489\n虚位以待\t421490\n运检\t421491\nf663n\t421492\nD7\t421493\n断筋\t421494\nyyzs\t421495\n吉安一中\t421496\n民防局\t421497\nBarney\t421498\n天汇广场\t421499\n第六天\t421500\n一生中最爱\t421501\n亚星锚链\t421502\n惊见\t421503\n多德\t421504\n安徽省郎溪县政府_郎溪县人民政府\t421505\n焖烧锅\t421506\n汉朝\t421507\n萨格勒布\t421508\njianada\t421509\n63%\t421510\n别人家的孩子\t421511\n玛格丽特·米切尔\t421512\n5U\t421513\n10.0.0.1\t421514\n管理学类\t421515\n西安钟楼\t421516\n编机\t421517\n古图\t421518\n密度值\t421519\n过江龙\t421520\n全额宝\t421521\n分株\t421522\n中国政府网\t421523\n退坡\t421524\n小浪底\t421525\n星斗\t421526\n亿百润\t421527\nNODE\t421528\n宿迁市住房和城乡建设局\t421529\n猪嘴\t421530\nz3\t421531\nlookchem\t421532\n氧OS\t421533\n七星鱼\t421534\n改超\t421535\n小伙们\t421536\nBookstore\t421537\n省港\t421538\n宋允儿\t421539\n点检员\t421540\n苏麻\t421541\n周检\t421542\n曲解\t421543\n顺成\t421544\n小区\t421545\n08_\t421546\n祝允明\t421547\n国康\t421548\nconfocal\t421549\nswell\t421550\n好开\t421551\nGSP\t421552\n美西\t421553\nESSE\t421554\n张程\t421555\nStelvio\t421556\n3600亿\t421557\n时光机\t421558\n无法忘记\t421559\nwimming\t421560\n陕西省书法家协会\t421561\n蓝驰创投\t421562\nLaravel5\t421563\n长流\t421564\n顶端\t421565\n大德\t421566\n衡算\t421567\n分布式光伏发电系统\t421568\n深圳东门\t421569\nene\t421570\n中国财政部\t421571\nfn\t421572\n自处\t421573\n河池市\t421574\n长城环球通信用卡\t421575\n繁体转换器\t421576\n初音社\t421577\n宿命论\t421578\n黄榕生\t421579\n黄小仙\t421580\n郑大二附院\t421581\n2030\t421582\n专病\t421583\n训练篇\t421584\n封龙山\t421585\n企鹅号自媒体\t421586\ndepressed\t421587\n外交人员\t421588\n康得新\t421589\n摔角狂热大赛\t421590\nORP\t421591\n启示_参考网\t4
21592\n六年级数学\t421593\n恶斗\t421594\n123RF\t421595\n0574-83075110\t421596\n通行证团\t421597\n29_\t421598\n水畔\t421599\n板板\t421600\n王楠\t421601\n河北卫视\t421602\n北理工\t421603\n天塔\t421604\n浙江省审计厅\t421605\n印江\t421606\nConvex\t421607\n三重一创\t421608\n丽佳\t421609\n乐酷网\t421610\n图迷\t421611\nDin\t421612\n我女友的男朋友\t421613\n可视化管理\t421614\n彩6\t421615\nPHPCMS\t421616\n代指\t421617\n合奏\t421618\n华强智慧网\t421619\n剑三\t421620\n学术型\t421621\n首级\t421622\nfastcgi\t421623\n3.3.7\t421624\n付宝\t421625\n红黑联盟\t421626\n见未来\t421627\n助力\t421628\n启航者\t421629\n观念\t421630\nn档\t421631\n小毛病\t421632\n水冲式\t421633\n个体户\t421634\n天幕\t421635\n长嫂\t421636\n俭朴\t421637\n道家\t421638\n湖北师范大学\t421639\n含意\t421640\n国际环境法\t421641\n1:35\t421642\n江西工业工程职业技术学院\t421643\n窘迫\t421644\n辅助装\t421645\n火牙\t421646\n性态\t421647\n斯芬克\t421648\n海坨山\t421649\n慧择保险网\t421650\nCrassulaceae\t421651\n思科\t421652\nIraq\t421653\nworkbooks\t421654\n离子注入机\t421655\nhuaer\t421656\nona\t421657\n小米净化器\t421658\n玉米蛇\t421659\n整型\t421660\n最珍贵\t421661\n画蛋\t421662\n84年\t421663\n大花轿\t421664\nFe\t421665\n不合时宜\t421666\n世间路\t421667\n路遥\t421668\n永琪\t421669\n八上\t421670\n游刃\t421671\nxmx\t421672\nredis批量\t421673\n甯\t421674\n第25小时\t421675\nUnnatural\t421676\n探鱼\t421677\n体验型\t421678\n姜维\t421679\n性别\t421680\n私房\t421681\n蛇灵\t421682\nt400\t421683\nGL文\t421684\n张冠李戴\t421685\n雄楚大道\t421686\n雷思海\t421687\n安儿乐\t421688\n宝兰高铁\t421689\n树下野狐\t421690\n凡科微传单\t421691\n山火\t421692\n镇海角\t421693\nLinuxPanda\t421694\n南博网\t421695\n21家\t421696\n麦包网\t421697\nKQ88\t421698\n定远中学\t421699\n深圳市优必选科技有限公司\t421700\n河北省人民政府国有资产监督管理委员会\t421701\nportraiture\t421702\n贤文\t421703\n出工\t421704\n铺盖\t421705\n刘工\t421706\n松动\t421707\nreplicate\t421708\n87time\t421709\n衣香\t421710\n抚州东\t421711\n窒息\t421712\n明瑞\t421713\nZepto\t421714\n超拽\t421715\n军乐\t421716\nhpc\t421717\n凯特温斯莱特\t421718\nCR-V\t421719\n猫箱\t421720\nnforce\t421721\n仿真电路\t421722\n赠人玫瑰\t421723\nArcGIS10\t421724\n百舸争流千帆竞\t421725\n黄金村\t421726\n7.com\t421727\n温室气体\t421728\n有功功率\t421729\n道堂\t421730\n李琛\t421731\n门冬\t421732\n超滤机\t421733\nhaley\t421734\nOpenSSL\t421735\n美甲店\t4217
36\n经济地理\t421737\ncharting\t421738\n新民\t421739\n吾师\t421740\n蓝黑\t421741\n一念永恒吧\t421742\n机器之心\t421743\n左舷\t421744\n鄯善县\t421745\n场域\t421746\nhnc\t421747\n布福娜\t421748\n5.8米\t421749\n论道\t421750\nm158b\t421751\n官本位\t421752\nEast\t421753\n邪恶力量\t421754\npills\t421755\nroads\t421756\n永旺超市\t421757\n铁佛镇\t421758\nmac视频编辑软件\t421759\n44.com\t421760\n魅族metal\t421761\n中国共产党全国代表大会\t421762\nabandon\t421763\n钎子\t421764\n苏娟\t421765\n瀚银\t421766\n红果\t421767\n都嘟\t421768\n第5条\t421769\n合并后\t421770\nRowKey\t421771\n王灿\t421772\n禾丰\t421773\n保贷\t421774\n13589329527\t421775\ndas\t421776\ns03\t421777\nShorts\t421778\n义山\t421779\n变心\t421780\n程勋\t421781\nO点\t421782\n代称\t421783\nLwIP\t421784\n某一次\t421785\n助听器\t421786\n批量处理\t421787\n得及\t421788\n睿骋cc\t421789\n9.7\t421790\n新报\t421791\n倪多喜\t421792\n母液\t421793\n海拉尔市\t421794\n症状\t421795\n上古卷轴5enb\t421796\n出境游\t421797\nfengbohello\t421798\nTheta\t421799\n漫画化\t421800\nHedgehog\t421801\n耿耿\t421802\n企业税\t421803\n今年8月\t421804\n真正男子汉\t421805\n900P\t421806\n平扫\t421807\n元凌\t421808\n狸猫\t421809\nAfrican\t421810\n臭气熏天\t421811\n曦\t421812\nV2.5.2\t421813\n双本\t421814\n罗德尼\t421815\n金柱赫\t421816\n沈宁\t421817\n谁谁谁\t421818\n|易语言俱乐部\t421819\n朱格拉\t421820\n震网病毒\t421821\n君恩\t421822\n研究\t421823\n张蕾\t421824\n鲁夫\t421825\nsanyo\t421826\n并道\t421827\n明日之子\t421828\n萨弗隆\t421829\n南开大学商学院\t421830\n耳结\t421831\n三城记\t421832\n红米手机2A\t421833\n华金资本\t421834\n视康\t421835\nFOUNDATION\t421836\nWORD2010\t421837\nbattlebrothers\t421838\n落\t421839\n第121集\t421840\n风鸟\t421841\n韩燕\t421842\n呼玛\t421843\n紫薇树\t421844\n16公里\t421845\n似水无痕\t421846\n猪脚饭\t421847\n鲸豚\t421848\n8th\t421849\n萨拉托加\t421850\nTOP10\t421851\n普瑞特艺术学院\t421852\ndense\t421853\nsnrtv\t421854\n得过\t421855\n青客公寓\t421856\n服装类\t421857\nSQLMap\t421858\n可比金融网\t421859\n孔机\t421860\n奔驰E300\t421861\n重楼\t421862\n梁体\t421863\nlayui-form\t421864\n守法公民\t421865\n亲子游乐\t421866\n星舰\t421867\n锦北街道\t421868\n优信二手车\t421869\n龙泉\t421870\n周公剑\t421871\nscn\t421872\n北新\t421873\n劝退\t421874\ngai\t421875\n紫外分光光度法\t421876\n亚龙\t421877\n安萌萌\t421878\n艾瑞咨询\t421879\nLamps\t421880\n藏
花阁\t421881\ncaching\t421882\n退行性变\t421883\n广州地铁13号线二期\t421884\n坤卦\t421885\n信息量\t421886\n相沢恋\t421887\nSplashtop\t421888\n华建集团\t421889\nswitches\t421890\n安吉县\t421891\n186信息网\t421892\n广州国税\t421893\nWDR6300\t421894\n赢合科技\t421895\n久帝\t421896\nvmnet\t421897\nZ77\t421898\n1897\t421899\n我们这一天\t421900\n尼罗河花园\t421901\n31篇\t421902\niTV\t421903\n隐形守护者\t421904\n雪花粉\t421905\nreaching\t421906\n花枝俏\t421907\n营区\t421908\n乘势\t421909\n元清\t421910\n56部\t421911\n毕博\t421912\n滁州日报\t421913\nHaippy\t421914\n选举会\t421915\n酸辣\t421916\nieda\t421917\n8路\t421918\n到付\t421919\n鸣沙山\t421920\n疑问句\t421921\n苏教小学\t421922\n异录\t421923\nDS-160\t421924\n珠穆朗玛峰\t421925\n卖号\t421926\n经济学原理\t421927\n600p\t421928\n1761\t421929\n七七影院\t421930\ntorrents\t421931\n10月28日\t421932\n孙志浩\t421933\n看盘\t421934\n轻度脂肪肝\t421935\n流行元素\t421936\n刘强\t421937\n柠檬片\t421938\nhebing\t421939\n银川日报\t421940\n引议\t421941\n伯瓦尔\t421942\n手图\t421943\n20170314\t421944\nintelliJ\t421945\n大冶特钢\t421946\n同质性\t421947\n元宵节\t421948\n地处\t421949\n漠视\t421950\n伦敦艺术大学\t421951\nufo110线索网\t421952\n顺德人才网\t421953\n招商基金管理有限公司\t421954\n交通运输\t421955\ninstallshield\t421956\n耽美\t421957\n张渚镇\t421958\n薰风\t421959\n十品戏曲网\t421960\n生化武器\t421961\nprojected\t421962\n选倾品\t421963\n六安人论坛\t421964\n劵商\t421965\n泉友\t421966\nunkown\t421967\nebay\t421968\n蟠龙镇\t421969\nPatience\t421970\n锂盐\t421971\n乐果\t421972\n阿克苏诺贝尔\t421973\n_小故事网\t421974\nptz\t421975\nPFX\t421976\n锦艺城\t421977\n3月17日\t421978\n夜宿山寺\t421979\n座式\t421980\n暂名\t421981\n遮阳帽\t421982\n日圆\t421983\n半球形\t421984\n北京大学出版社\t421985\n_v1.0安卓\t421986\n移动办公软件\t421987\n文化史\t421988\ndynamical\t421989\n芯芯\t421990\ntruck\t421991\n残酷\t421992\n民愤\t421993\n出液\t421994\n记步\t421995\n晕眩\t421996\n学伴\t421997\nhung\t421998\n小汽车\t421999\nGIRL\t422000\n2017/2018\t422001\n猴急\t422002\n第十届\t422003\n长影\t422004\n胡飞\t422005\n斩首\t422006\n壹影堂\t422007\nIgnition\t422008\n自制点\t422009\n荣耀战队\t422010\n旗杆\t422011\n推荐度\t422012\np8h61-m\t422013\n同龄圈\t422014\ncay\t422015\n超级街头霸王4\t422016\n张晨\t422017\n爆炸\t422018\n张本\t422019\nskills\t422020\n开心哈乐\t422021\n七色堇\t422022\n唉声叹气\
t422023\n阿甘骑士\t422024\n达达主义\t422025\n花花绿绿\t422026\ngd32\t422027\n1010兼职网\t422028\n万达院线\t422029\nGuetta\t422030\n胜安航空\t422031\n浙江森马服饰股份有限公司\t422032\n九味羌\t422033\n绿碳化硅\t422034\nv2015\t422035\n深圳市市场和质量监督管理委员会\t422036\n第1局\t422037\n订造\t422038\n片仔癀珍珠膏\t422039\n迷乱\t422040\n马三\t422041\n722\t422042\n生化危机5黄金版\t422043\n清迈\t422044\n表达式\t422045\nv330\t422046\n不是\t422047\n筱田\t422048\n一怒\t422049\n专稿\t422050\n火柴枪\t422051\n霸情\t422052\n伎俩\t422053\n动物科技学院\t422054\n双成药业\t422055\ntaoguba\t422056\n储藏间\t422057\n全保\t422058\nsuricata\t422059\n第50集\t422060\n考试题\t422061\n3.dll\t422062\n揭阳潮汕机场\t422063\nsurface笔\t422064\n行读\t422065\n因材施教\t422066\n卡萨诺\t422067\n不锈钢法兰\t422068\n蛋白质分离器\t422069\n耧斗菜\t422070\n心字\t422071\n润滑油\t422072\n李栋旭\t422073\n行\t422074\n塌落度\t422075\n银屏山\t422076\n颀\t422077\n等腰梯形\t422078\n牛扎糖\t422079\nlength\t422080\n多核\t422081\n第一宫\t422082\n好调\t422083\n揉捏\t422084\n五月深\t422085\n一言难尽\t422086\n七门\t422087\n十亿\t422088\n阿护\t422089\n我要诗经\t422090\n8套\t422091\n6016\t422092\n霉点\t422093\nmoive\t422094\n马属\t422095\n跳票\t422096\noskyhang\t422097\n阿坝日报数字报\t422098\n13800元\t422099\nimmer\t422100\n中继\t422101\n烟熏炉\t422102\n7300HQ\t422103\n码农场\t422104\n大学里\t422105\n实验类\t422106\n艾媒\t422107\n日抛\t422108\n三境\t422109\nBAT文件\t422110\n陆国民\t422111\n11家\t422112\n王珪\t422113\n灵基再临\t422114\n钢条\t422115\n维斯帕\t422116\nIgE\t422117\n简约\t422118\n一个行\t422119\n起歌\t422120\n威帅\t422121\n鲍照\t422122\n0898\t422123\n村史馆\t422124\n南方卫视\t422125\n重火\t422126\nMONSTER\t422127\n两期\t422128\n叶状\t422129\n标先进\t422130\n古尔沟\t422131\n1册\t422132\n方毅\t422133\n高铭暄\t422134\n弱攻\t422135\n冷度\t422136\n控股集团\t422137\n陈光诚\t422138\nHoover\t422139\n四连杆\t422140\n120点\t422141\n主策\t422142\nB350-PLUS\t422143\n1.93G\t422144\nfa\t422145\n图机\t422146\n大格\t422147\n滑草场\t422148\n交税\t422149\n软\t422150\n频率表\t422151\n第23\t422152\nHomestead\t422153\nctc\t422154\n普通话考试网\t422155\nspringsession\t422156\n战略性\t422157\n奈雪\t422158\n中心化\t422159\n胡玖明\t422160\nnload\t422161\n国家食药总局\t422162\n古诗词\t422163\n徐医生\t422164\n小方格\t422165\nG5\t422166\n切切乐\t422167\n沙发巾\t422168\n蒋勤勤\t422169
\n脑残值\t422170\nurlencode\t422171\nolympic\t422172\n发达\t422173\n民办幼儿园\t422174\n电影库\t422175\n孟春\t422176\n烈士纪念日\t422177\n清退\t422178\n施洗约翰\t422179\n张祜\t422180\n赵志刚\t422181\n津市市\t422182\n发工\t422183\n南边\t422184\n700亿美元\t422185\n凯登·克劳丝\t422186\n刑点\t422187\n木屋\t422188\n现代汽车集团\t422189\n北京代表团\t422190\n延缓\t422191\nuniversity\t422192\nI5/Windows\t422193\n咽喉癌\t422194\n靶板\t422195\n路轨\t422196\n点调\t422197\n那一瞬\t422198\n氕氘氚\t422199\n行省略\t422200\n电扇\t422201\n游城\t422202\n和谐汽车\t422203\nwca\t422204\nleslie\t422205\n削坡\t422206\n偶极子\t422207\n锌\t422208\n携手\t422209\n原址\t422210\n白云学院\t422211\n腰间盘突出\t422212\n雪碧图\t422213\n避\t422214\n针织开衫\t422215\n中泰创展\t422216\n触摸失灵\t422217\ncdb\t422218\nunwrap\t422219\n陈印\t422220\n挂壁\t422221\n职业解析\t422222\nl298\t422223\n万马奔腾\t422224\n攻杀\t422225\n青海省国土资源厅\t422226\n途观索纳塔\t422227\n春天来了\t422228\n接接\t422229\n下胸\t422230\nj3455\t422231\n深圳东门步行街\t422232\n58平\t422233\nsystem\t422234\n苏州市市\t422235\nkt板\t422236\n题案\t422237\niPhone6白苹果\t422238\n幸福课\t422239\n纽宾凯\t422240\n经济区\t422241\n美桌\t422242\ncakephp\t422243\niPlayer\t422244\nTMB\t422245\n人和镇\t422246\n罪人\t422247\n生育证明\t422248\nF12018\t422249\nSeptember\t422250\n家庭性\t422251\n网络保险\t422252\n解密类\t422253\n广东海洋大学\t422254\n岚皋路\t422255\n埋单\t422256\n画生\t422257\n预算编制\t422258\nwin10英雄联盟\t422259\n第126期\t422260\n击穿\t422261\nh2o\t422262\n万邦德\t422263\nBain\t422264\n实习医生格蕾\t422265\n诺基亚贝尔\t422266\n互惠\t422267\n姥娘\t422268\n报件\t422269\n真容\t422270\n侨办\t422271\npersona\t422272\n移转\t422273\nDispatcher\t422274\n邮政编码表\t422275\nIOPE\t422276\n敲门砖\t422277\n散光眼\t422278\nama\t422279\n惠济区人民政府\t422280\nheading\t422281\n中西医结合医院\t422282\n启达\t422283\n方证\t422284\n菅野松雪\t422285\n格机\t422286\n边摊\t422287\nAULDEY\t422288\n中国电竞\t422289\n每个孩子\t422290\n科罗拉多\t422291\n四川省国资委\t422292\nOEM/ODM\t422293\nfas\t422294\n母龙\t422295\n自卫队\t422296\n99.999%\t422297\n六阶\t422298\n外国人名\t422299\n钟叔\t422300\n人工刷票\t422301\n仁义礼智信\t422302\n上海城隍庙\t422303\n字梯\t422304\n秀性感\t422305\nCR\t422306\n砂纸\t422307\n1368\t422308\nSOME\t422309\n东京铁塔\t422310\n批贷\t422311\n血红蛋白尿\t422312\n特色菜\t422313\n一周
一\t422314\n声远网\t422315\n压力泵\t422316\n机器码\t422317\n1313\t422318\n万山网\t422319\n16世纪\t422320\n诸位\t422321\ndemon\t422322\n森山\t422323\n里氏词典\t422324\n乐秀视频编辑器\t422325\n人语\t422326\nhighschool\t422327\n深圳市铁汉生态环境股份有限公司\t422328\n南方医科大学顺德医院\t422329\ntvOS\t422330\n金钏\t422331\n八运\t422332\n布鲁尔\t422333\n海岛大亨吧\t422334\n灰比\t422335\n510k\t422336\n托尔斯泰\t422337\n腹黑妹妹控兄记\t422338\nTack\t422339\n毅冰\t422340\n耻笑\t422341\n乳业\t422342\n罗塞塔石碑\t422343\n1尺\t422344\n2655\t422345\n沙悟净\t422346\n辖市\t422347\n半兽勇士\t422348\n木瓜蛋白酶\t422349\n钱龙\t422350\n10型\t422351\nGIF_来福岛爆笑娱乐网\t422352\n雇佣\t422353\n福州中学\t422354\n上海市文来中学\t422355\nzynq7000\t422356\n叶倩彤\t422357\n自主创新\t422358\n蜜蜂巢\t422359\n苜蓿菜\t422360\npaipai\t422361\n曾维\t422362\n伟伟\t422363\namend\t422364\nTCGames\t422365\nTASK\t422366\n手机加速器\t422367\ns1000\t422368\n口袋妖怪金\t422369\n阿莉埃蒂\t422370\nendnotes\t422371\n长港路\t422372\n孰不可忍\t422373\n中国菜\t422374\n港航局\t422375\n老玉\t422376\n引经据典\t422377\n北屯市\t422378\n念念不忘\t422379\nconfigparser\t422380\n济南方特\t422381\n无规\t422382\nambulance\t422383\n第07期\t422384\nWRITE\t422385\na7r2\t422386\n初中级\t422387\n不进\t422388\n氢氧化镁\t422389\n其一\t422390\n上饶县\t422391\n粤海\t422392\n一2017\t422393\n_条\t422394\n铸造厂\t422395\nMeeting\t422396\n耐力\t422397\nSkylar\t422398\n爱德曼\t422399\n静差\t422400\n外汇市场\t422401\n底盘\t422402\n10.2.0.1\t422403\n太平洋保险公司\t422404\nbattlefield\t422405\n福建省第二人民医院\t422406\nwww.51kt.net\t422407\nbib\t422408\n虱\t422409\n御星\t422410\n离子交换层析\t422411\n天河石\t422412\n妊妇\t422413\n在一起玩\t422414\n有限元\t422415\n乙肝抗病毒\t422416\nvessels\t422417\n周姓\t422418\n建设科技\t422419\n亚莫利\t422420\n吴织亚\t422421\nbyton\t422422\n合隆\t422423\n刘雅婷\t422424\n中山市区\t422425\n财气网\t422426\n生时\t422427\n沙扎比\t422428\n篱开罗\t422429\n51期\t422430\n8128\t422431\n冒险岛十字猎人\t422432\n曝气器\t422433\n掘墓人\t422434\n醋酸纤维\t422435\nlpshou\t422436\n挑拣\t422437\n宣贯会\t422438\nGeekBench\t422439\n二环西路\t422440\n嫦娥四号\t422441\n手腕\t422442\n256列\t422443\n浙江省\t422444\nsven\t422445\n0820\t422446\n木纹砖\t422447\ntar压缩解压缩\t422448\npremiere2017\t422449\n拥护者\t422450\n富力东山新天地\t422451\n注浆泵\t422452\n求真\t422453\nPixAR
K\t422454\nLINKSYS\t422455\nrevision\t422456\n3万方\t422457\nlampp\t422458\n王木木\t422459\nexpense\t422460\n河南街\t422461\n不爱\t422462\n张开涛\t422463\nWoG-英雄无敌III\t422464\n大逃\t422465\n马一浮\t422466\n8种\t422467\n氢气球\t422468\n建筑桩基\t422469\n第4季度\t422470\n材料类\t422471\nSUZUKI\t422472\n日本银行\t422473\narrays\t422474\ntasaki\t422475\n正装\t422476\n吸污\t422477\n四厘\t422478\n易买网\t422479\n嗜酸性粒细胞\t422480\n智谷\t422481\n专攻\t422482\n奥奇丛林法则\t422483\n地球帝国1\t422484\n礼无忧网\t422485\n刘文正\t422486\n链轨\t422487\n牙子\t422488\n德文\t422489\n磁座钻\t422490\n4.6亿美元\t422491\n东华医院\t422492\n尚雯婕\t422493\nBoundaries\t422494\n搞搞\t422495\n意达\t422496\n2105\t422497\n自动步枪\t422498\n碳水化合物\t422499\n战龙\t422500\n肩包\t422501\n一怒之下\t422502\n天善智能\t422503\n桌面级\t422504\nNerxious\t422505\n杨匏安\t422506\nRestrict\t422507\n宜宾市人民政府\t422508\n擦枪走火\t422509\n哈萨克人\t422510\n第6次\t422511\n福明路\t422512\n起亚汽车\t422513\n转度\t422514\n核稿\t422515\ngetbytes\t422516\n陆赤闫\t422517\n库肯霍夫公园\t422518\nk歌之王\t422519\n節目\t422520\n分支箱\t422521\n929\t422522\neviews9\t422523\n1.153\t422524\n金疮药\t422525\n苏米龙\t422526\n冷堆\t422527\n并接\t422528\n向\t422529\n1万毫安\t422530\n语言\t422531\n世界日报\t422532\n克孜尔石窟\t422533\n劳尔\t422534\n关节镜\t422535\n测试仪\t422536\n灯片\t422537\n中东欧\t422538\n分组函数\t422539\nToilet\t422540\n泥匠\t422541\n杨永清\t422542\n垂钓者\t422543\nedisonfeng\t422544\n书香\t422545\n正确性\t422546\n沙拉酱\t422547\n小半\t422548\nyalu\t422549\n这一生\t422550\n脑机\t422551\n可达性\t422552\n金水湖\t422553\n7k\t422554\n交辉\t422555\nLeong\t422556\n程潜\t422557\n1000亿元\t422558\n贱男\t422559\n2009年7月\t422560\n2018.02\t422561\n移液管\t422562\n第四次忍界大战\t422563\n零件\t422564\n西南政法大学\t422565\n山联村\t422566\n堆内存\t422567\n家种\t422568\n酥油灯\t422569\n灿烂\t422570\nHash\t422571\n中国物流学会\t422572\n千三\t422573\n蚀刻片\t422574\n沐猴\t422575\n兰州站\t422576\n翠微路\t422577\n马前\t422578\n总额\t422579\n上海市世界外国语中学\t422580\n温州医学院\t422581\n赵小兰\t422582\nlimbo模拟器\t422583\n李嫣然\t422584\n清晰化\t422585\n几泡\t422586\nphr\t422587\nDPF\t422588\n库克山\t422589\n中文版破解版/Office\t422590\n纤长\t422591\n存货管理\t422592\n空城\t422593\n新疆人事考试中心\t422594\n禁忌文\t422595\n流行时尚\t422596\n重庆地铁2号线\t422597\n曼可顿\t
422598\n美联英语\t422599\n安提戈涅\t422600\n隰县\t422601\nSander\t422602\n河北金融学院\t422603\n资源学院\t422604\n荆轲\t422605\n黄河森林公园\t422606\n艾森豪威尔\t422607\n金水宝\t422608\n1464\t422609\n方舟:方块世界\t422610\n西藏高原\t422611\n汤芳\t422612\n宫颈纳氏囊肿\t422613\n羊肉泡馍\t422614\n行政区划图\t422615\n许姓\t422616\nretained\t422617\n臆测\t422618\n蒙在鼓里\t422619\n电影级\t422620\n棋牌乐\t422621\n一拔\t422622\nBORN\t422623\nClO2\t422624\nGfriend\t422625\n欧登塞\t422626\nrx1r2\t422627\ngta5吧\t422628\n表述\t422629\n3猫\t422630\n365dvd\t422631\n该报\t422632\n企业们\t422633\n第23章\t422634\n英特威\t422635\n新田园\t422636\n需求\t422637\n19万元\t422638\n非晶合金变压器\t422639\n20151229\t422640\n切圆\t422641\n20151107\t422642\n两只\t422643\n西虹桥\t422644\n普尔\t422645\n氯甲烷\t422646\n托里拆利\t422647\nハンド魂\t422648\n了解锁\t422649\n邓伦\t422650\n生放\t422651\n生身\t422652\n反枕\t422653\n脑疝\t422654\n美尼康\t422655\ndios\t422656\n金维\t422657\nt-2\t422658\n军事类\t422659\n达实\t422660\n巴纳德\t422661\n线轴\t422662\n7.62\t422663\n征召\t422664\n干什\t422665\n漾漾\t422666\n沪江网hujiang\t422667\n善人\t422668\n防火女\t422669\n真空断路器\t422670\n丝瓜络\t422671\n0151\t422672\n套内建筑面积\t422673\n石油\t422674\n提档\t422675\n粉丝节\t422676\n一码通\t422677\nGAE\t422678\n圣诞快乐\t422679\n50枚\t422680\n建规\t422681\n央视综合频道\t422682\n本条\t422683\nproj\t422684\n小戏骨\t422685\n子墨子\t422686\n华应龙\t422687\n迈普交换机\t422688\n李兆基\t422689\n快速傅里叶变换\t422690\n宝鸡职业技术学院\t422691\n肉植物\t422692\n西坝\t422693\n维达纸业\t422694\nCMS帮助中心\t422695\n肌理\t422696\n编撰\t422697\n迅雷论坛\t422698\nEmployees\t422699\nusana\t422700\n测距传感器\t422701\n同宫\t422702\n差分线\t422703\n细粒式\t422704\n侧透\t422705\n大兴黄村\t422706\n大法\t422707\n链条机\t422708\n早泻\t422709\n2013上半年\t422710\n1个月\t422711\n中国史\t422712\n所得税款\t422713\nf值\t422714\n拉丁名\t422715\n万星\t422716\n邢窑\t422717\n王晓伟\t422718\n可可简笔画\t422719\n艾玛杜蒙特\t422720\n分组\t422721\nHWA\t422722\n董华\t422723\n脱盐\t422724\n上海浦东_浦东政府\t422725\n不可挡\t422726\nicd\t422727\n股东权益比率\t422728\n奥拉夫\t422729\n一盘棋\t422730\n大众日报数字报\t422731\n寒冰\t422732\n__跑跑车手机网\t422733\n成宝\t422734\negfr\t422735\n神武_\t422736\n孔家\t422737\n霸天\t422738\n大象席地而坐\t422739\n开中\t422740\n灭火器箱\t422741\nXenServer\t422742\n南渡镇\t422743\n弑魂\t4
22744\n幻体\t422745\n_天极下载\t422746\n中国华信能源有限公司\t422747\n录放\t422748\n波纹板\t422749\n适逢\t422750\njkw\t422751\n复数矩阵\t422752\nv3500\t422753\n恼人\t422754\noverleaf\t422755\nvv5s\t422756\n人民网娱乐频道\t422757\n雨泽\t422758\n御轩\t422759\n操作教程\t422760\n校时\t422761\n受不住\t422762\n丰田佳美\t422763\nBuddhist\t422764\nColon\t422765\nSA\t422766\n七牛云\t422767\nOffice2013\t422768\n招商银行信用卡成都营运中心\t422769\n责任书\t422770\n北京超图软件股份有限公司\t422771\n0.2%\t422772\n广东公安厅\t422773\n张海峰\t422774\n砂子塘小学\t422775\nPIGFF\t422776\n海军军医大学\t422777\n分散化\t422778\n手台\t422779\n时刻听党话\t422780\n42P\t422781\n乐居\t422782\n芦田爱菜\t422783\nkam\t422784\n九码\t422785\n赛事\t422786\nmpfr\t422787\n乔兹\t422788\n中国科学院神经科学研究所\t422789\n梁栋\t422790\n两人\t422791\n互联\t422792\n短指\t422793\n趁手\t422794\nInvisalign\t422795\nMalta\t422796\n云打印\t422797\n白啤\t422798\n中兴物联\t422799\n高高\t422800\n橋乃\t422801\n缓冲区分析\t422802\n丰和日丽\t422803\nCCTV5在线直播|NBA直播|足球直播|NBA直播吧|英超\t422804\n有效位\t422805\n西政新闻网\t422806\n主\t422807\n传染性软疣\t422808\n可怕\t422809\n提审\t422810\n恰锦绣华年\t422811\n6.8亿\t422812\n出丑\t422813\n长句\t422814\nWAR3\t422815\n好看电影网\t422816\n365地产\t422817\n众口难调\t422818\n毛榉\t422819\n蓓昂斯\t422820\n高桥李依\t422821\n云南省发展和改革委员会\t422822\n30mm\t422823\n超实惠\t422824\n刻薄\t422825\n城市环境\t422826\nActiveMQ\t422827\n生殖\t422828\n达产\t422829\ndistrict\t422830\n洛阳市财政局\t422831\n欢聚一堂\t422832\n一匹马\t422833\n离职证明\t422834\n喵小姐\t422835\n查子\t422836\n0w30\t422837\n气瓶\t422838\n早餐饼\t422839\n安徽教育厅\t422840\nPte\t422841\n最配\t422842\n南京财经大学红山学院\t422843\n时尚先生\t422844\n海丰县\t422845\nkvv\t422846\n曲轴\t422847\n李静雯\t422848\n淄博实验中学\t422849\n六届\t422850\n读西游记\t422851\nOTS\t422852\n马小\t422853\n铝合\t422854\n一个类\t422855\n牦牛肉干\t422856\n800毫升\t422857\n喷粉\t422858\nsweeper\t422859\n微信个性签名\t422860\n简支板\t422861\nbackground-color\t422862\n顶着\t422863\n神座\t422864\n2017-04-27\t422865\n精神损害赔偿\t422866\n板手\t422867\n人人商城V3\t422868\n60美元\t422869\n4月26号\t422870\n成量\t422871\n少年群侠传\t422872\n毁约\t422873\nUMF\t422874\n轻唱\t422875\n魔法门之英雄无敌\t422876\n黑暗势力\t422877\nweb.xml\t422878\n一佰\t422879\n网络视频监控系统\t422880\nbgp\t422881\ncyf\t422882\n重绘\t422883\n一建公
路\t422884\nguozk\t422885\n华图\t422886\n二甲醚\t422887\nAlec\t422888\n百和\t422889\n极处\t422890\n盘问\t422891\n上海交通大学机械与动力工程学院\t422892\ntabbaritem\t422893\n比较教育学\t422894\n工种\t422895\n扫雷\t422896\n趣事作文\t422897\n121\t422898\n股东\t422899\n赵立春\t422900\n实体法\t422901\n华瀚\t422902\n圆锥的体积\t422903\n3196\t422904\n朴鸣\t422905\n福建省食品药品监督管理局\t422906\nmackbook\t422907\n平远县\t422908\n北京民族大学\t422909\n管理科学与工程专业\t422910\n蜂窝式\t422911\n华理\t422912\nJavac\t422913\n运动者\t422914\n起兵\t422915\n钱柜娱乐\t422916\n603156\t422917\n大众论坛\t422918\n150例\t422919\n这种树\t422920\n感康\t422921\n虎山路\t422922\n师恩\t422923\n剑网3\t422924\n特等站\t422925\n裂项\t422926\n分析工\t422927\n2017.3\t422928\nCOLLEGE\t422929\n华融\t422930\n辞赋\t422931\n导管\t422932\n傻强\t422933\nPossession\t422934\n三六九等\t422935\n极睿\t422936\n购购\t422937\n山哥\t422938\n简中\t422939\n时假\t422940\n魔法师\t422941\niapd\t422942\n期货投资者\t422943\n生鲜电商\t422944\n牡丹亭\t422945\n玫瑰湖\t422946\n藤娇\t422947\n长话短说\t422948\n湖北生态工程职业技术学院\t422949\n关西地区\t422950\n海峡杯\t422951\n古牧\t422952\n除子\t422953\n魔幻片\t422954\n胶体果胶铋胶囊\t422955\n_凤\t422956\nmini2\t422957\n抱膝\t422958\n北京市方圆公证处\t422959\n养殖基地\t422960\n逆转裁判3\t422961\n威实\t422962\nNTC热敏电阻\t422963\n捷太格特\t422964\nVUX\t422965\n创下\t422966\n王亭之\t422967\n齐成琨\t422968\n梨木台\t422969\n西安市工商局\t422970\n浆水\t422971\n自主汽车网\t422972\n悬崖上的金鱼姬\t422973\n第八名\t422974\nu4100\t422975\n宗祠\t422976\n欢乐滨海城\t422977\nElliott\t422978\n先买\t422979\n简思\t422980\n阿斯美\t422981\n桂花香\t422982\nMason\t422983\n正昌\t422984\n芗城区\t422985\n吹塑\t422986\n姜黄素\t422987\n8速\t422988\n江堤\t422989\n中华人民共和国国家卫\t422990\n4000万元\t422991\n报读\t422992\n3敏\t422993\n金地湖山\t422994\nTwinks\t422995\n小结节影\t422996\n上海世博园\t422997\nAB模板王-www\t422998\n片儿\t422999\n边寨\t423000\n快穿女配逆袭\t423001\npkf\t423002\n宁波物流公司\t423003\n锚定效应\t423004\nonethink\t423005\n朝政\t423006\n末世孤雄2\t423007\n崔浩\t423008\n一百米\t423009\n深度\t423010\nzone\t423011\n四川省食品药品监督管理局\t423012\n编程书\t423013\n云里\t423014\nMerchant\t423015\nwww.hz7788.com\t423016\n交金\t423017\n伯君\t423018\n纳谏\t423019\n不可知论\t423020\n营盘\t423021\nsirocco\t423022\n星汇\t423023\n屁虫\t423024\n44年\t423025\n九米\t423026\n0
394\t423027\n跪坐\t423028\nsast\t423029\nStolen\t423030\n新丽\t423031\n湾湖\t423032\n平行四边形的面积\t423033\nABL\t423034\n巨蝎\t423035\n双绞线\t423036\n主域\t423037\n婉约派\t423038\n珍珠岩\t423039\n李星云\t423040\n双肢\t423041\n花蕾蕾\t423042\niClone\t423043\nvinegar\t423044\n袁先生\t423045\nharm\t423046\n21座\t423047\n四排\t423048\n李雷\t423049\nhda\t423050\n青月\t423051\n王ygocore\t423052\n朝拜\t423053\n回弹值\t423054\nsw2018\t423055\n道德经\t423056\n英科宇机械\t423057\nJoyJin\t423058\nmongochef\t423059\n努比亚z11minis\t423060\n烤蛋糕\t423061\n导航条\t423062\nParser\t423063\nvisor\t423064\n河南教育新闻网\t423065\n九嶷山\t423066\n黄石下陆区\t423067\n华蓥\t423068\n检查工\t423069\nアパ\t423070\n孙端\t423071\n1gb\t423072\n乌蒙山\t423073\nfollows\t423074\n小物块\t423075\nfenshu\t423076\n孙璐\t423077\n韩漫\t423078\n路透社\t423079\n先知\t423080\n复韵母\t423081\n汇星\t423082\n汉字输入法\t423083\n神论者\t423084\nsaction\t423085\n主战场\t423086\n水野\t423087\ns200\t423088\n4万美元\t423089\n命佛\t423090\n杜鹃节\t423091\n私人银行卡\t423092\n张冉\t423093\napp支付\t423094\n标题党\t423095\n体谱\t423096\n平潮\t423097\n舒伯\t423098\n增信\t423099\n春情\t423100\n茅山\t423101\n下花园区\t423102\n火舞风云\t423103\n夜交藤\t423104\n黄先生\t423105\nconexant\t423106\n300MAAN\t423107\n段友们\t423108\n上海市第八人民医院\t423109\n6.2班\t423110\n电源管理芯片\t423111\n醋酸铜\t423112\nMoeRO\t423113\n巨能\t423114\n单面镜\t423115\n始兴县政府\t423116\n依视\t423117\nadminstrator\t423118\naisino\t423119\n阿凡卢\t423120\n赛灵思\t423121\n娘山\t423122\n安民\t423123\n重力式\t423124\n小组\t423125\n李沧区\t423126\nAmor\t423127\n北京市水务局\t423128\nDataUrl\t423129\n光灵\t423130\n济南新闻网\t423131\n三第一\t423132\n沙漠骆驼\t423133\n大江东\t423134\n换热管\t423135\n试戏\t423136\nChoker\t423137\n吴秦\t423138\n0.8.0\t423139\n非条件反射\t423140\n完美女人\t423141\nsuperuser\t423142\n邻里\t423143\nAssets\t423144\nCLion\t423145\n株洲组工网\t423146\n凤城一路\t423147\n心衰竭\t423148\n中海碧林湾\t423149\n美团骑手吧\t423150\n姜创\t423151\n刑事侦缉档案3\t423152\n3.0T\t423153\n萨瓦\t423154\n中工写字楼网\t423155\n叶芳华\t423156\n十二星座男\t423157\n医话\t423158\njso\t423159\n兰大一院\t423160\n贾总的演讲\t423161\n蒜苔炒肉\t423162\n2.5mg\t423163\n仙剑7\t423164\nNeil\t423165\n91万\t423166\nOlympic\t423167\n宁光\t423168\n贺军科\t423169\n姒锦\t423170\
n大连电台\t423171\npylint\t423172\n夺魁\t423173\n私募证券投资基金\t423174\n戴维·迈尔斯\t423175\n研究组\t423176\n枫树林\t423177\n阿仪网\t423178\nIGS\t423179\n庶出\t423180\n资源地\t423181\n四书章句集注\t423182\n大阳山国家森林公园\t423183\n住区\t423184\n禁摩\t423185\n王彦霖\t423186\n科技节\t423187\n小清欢\t423188\n页框\t423189\nhomebrew\t423190\n哨兵\t423191\n妖邪\t423192\n20170628\t423193\n兴国镇\t423194\n曹参\t423195\n晋江币\t423196\n组号\t423197\n四大文明古国\t423198\n波数\t423199\n紫钗奇缘\t423200\n敌法\t423201\n逸字\t423202\n尼康AF\t423203\ncontextcapture\t423204\nac88u\t423205\nJH6\t423206\n雷涛\t423207\n美海军\t423208\n辛向阳\t423209\n中易广告联盟\t423210\nPOSE\t423211\n大围\t423212\nV2ray\t423213\n吸吮\t423214\n序曲\t423215\n事本\t423216\n土拍\t423217\n7.01\t423218\n赛季后赛\t423219\n第九十三条\t423220\n苇草\t423221\nwoolworths\t423222\n5月18日\t423223\nC2260\t423224\nOutlast\t423225\n枪兵\t423226\n八码\t423227\n荆门新闻网\t423228\n8500\t423229\n量标准\t423230\n28377\t423231\n变身卡\t423232\n柳营\t423233\n环模\t423234\nQ345R\t423235\n上海学而思\t423236\njjvod\t423237\n惊异\t423238\nRetreat\t423239\n澳大利亚银行\t423240\n计量师\t423241\n3t\t423242\n上档\t423243\n葡萄糖酸锌口服液\t423244\n树干\t423245\n郑州电视台\t423246\n小柴\t423247\n60m\t423248\n跃进村\t423249\ntransmitter\t423250\nRound\t423251\n杂病\t423252\n疯兔\t423253\n约翰·列侬\t423254\n12月9日\t423255\n探灵档案\t423256\n铜钱\t423257\n半叶寒羽\t423258\n琴盒\t423259\n李梓萌\t423260\n疯吻\t423261\nCZ80\t423262\n对讲\t423263\n太上感应篇\t423264\n安远\t423265\n艾路\t423266\n平进平\t423267\n25种\t423268\n1800\t423269\nhua\t423270\n0206\t423271\n45项\t423272\n纳维\t423273\n高宏\t423274\n2018年03月19日\t423275\nMoon\t423276\n钱塘府\t423277\ncoi\t423278\n丽晶\t423279\nFENIX\t423280\n华伦天奴\t423281\n汉口江滩\t423282\n4月13日\t423283\n竹山路\t423284\n1.0源\t423285\n19类\t423286\n仙游一中\t423287\n29所\t423288\n市集\t423289\nindexpath\t423290\n凤凰财经频道_凤凰网\t423291\nAMT\t423292\n齐家商城\t423293\n虚拟经济\t423294\n统一体\t423295\n大众arteon\t423296\n堆物\t423297\n羽泉\t423298\n机头\t423299\n现券\t423300\n合约\t423301\nps手绘板\t423302\n40载\t423303\npolyfit\t423304\n0796\t423305\n太阳雨太阳能\t423306\n清风徐\t423307\n2.4.3\t423308\n编文\t423309\n集智\t423310\nVOLVO\t423311\n江心\t423312\n肖青璇\t423313\nNoto\t423314\n焊接性
\t423315\n小花\t423316\nv1.2.0_\t423317\n黑铬\t423318\n诡媚\t423319\nipv6\t423320\n特异\t423321\n新疆医科大学第一附属医院\t423322\n重唱\t423323\nurbeats\t423324\n飞鹏\t423325\n保姆\t423326\n京昆\t423327\n_凡\t423328\n沈阳乐居网\t423329\n一边\t423330\n周目\t423331\n蜷川实花\t423332\n不看\t423333\n社区商业\t423334\n海疆\t423335\n成都学历教育\t423336\n高斯消元\t423337\n香港红馆\t423338\nWKellyL\t423339\n天籁论坛\t423340\n豪盛\t423341\n天元律师事务所\t423342\n混缩\t423343\n第202集\t423344\n东奥继教\t423345\n西北师大\t423346\n万年前\t423347\n生疏\t423348\n范迪塞尔\t423349\nMTB\t423350\n足艺阁踩踏网\t423351\n快速券\t423352\ndict\t423353\n48次\t423354\n上海凯宾斯基大酒店\t423355\n百度搜索\t423356\n定义变量\t423357\nLoom\t423358\n王雁\t423359\n2014年下半年\t423360\n生殖腺\t423361\n海盗湾\t423362\n蘑菇石\t423363\n侧耳倾听\t423364\n排卵日\t423365\n配位化学\t423366\n汽机\t423367\n点客\t423368\n墙边\t423369\nHacknet\t423370\n种族选择\t423371\nramps\t423372\n手瓜\t423373\n刀具\t423374\n时间域\t423375\n字字\t423376\n米奇沃克斯\t423377\nPCL6\t423378\n鲁伯特\t423379\n天龙八部之天山童姥\t423380\n雕工\t423381\nm268dw\t423382\n光头佬\t423383\n幽美\t423384\n钱德\t423385\n越南航空\t423386\n更美丽\t423387\n不裁\t423388\n李炳淑\t423389\n暗夜精灵3\t423390\n马鲷\t423391\n人乳瘤头病毒\t423392\n充棉机\t423393\n渗透压\t423394\n熄灯\t423395\n%f\t423396\nAlbert\t423397\n清朝初期\t423398\n北京市环境保护局\t423399\n想不想要\t423400\n知秋\t423401\n住处\t423402\n涨停价\t423403\n老顶\t423404\npalived\t423405\nannoying\t423406\nUEFI启动项\t423407\n傅首尔\t423408\n酸累\t423409\nshn\t423410\n地动\t423411\n显控\t423412\n西部数码帮助中心\t423413\n网页视频播放器\t423414\n四班\t423415\n东莞银行股份有限公司\t423416\n津乐园\t423417\n福德宫\t423418\n江浙\t423419\n京瓷\t423420\n关联函数\t423421\n七里村\t423422\n物架\t423423\nEUI\t423424\n文斯卡特\t423425\nsil\t423426\n广播迷\t423427\nhuge\t423428\nx25\t423429\n1799元\t423430\n亚萍\t423431\n必胜客\t423432\n等量关系\t423433\n立白洗衣液\t423434\n锤子农业网\t423435\n停放架\t423436\n曾祖\t423437\nContinuity\t423438\n菜果\t423439\nlne\t423440\n中国钱币博物馆\t423441\n扎昆\t423442\nAsia\t423443\n电鱼\t423444\n珠江国际大厦\t423445\n敏雅\t423446\n大黄\t423447\n手足口病\t423448\n鲟鱼\t423449\nrole=\t423450\n莱纳德\t423451\nsnb\t423452\nVCB-Studio\t423453\n中远两湾城\t423454\n盐酸米诺环素\t423455\n部落冲突\t423456\n百分比\t423457\nwicket\t423458\n社文\t423459
\n李艺彤\t423460\n000050\t423461\noperative\t423462\n多尔衮\t423463\n东江新城\t423464\nTIMESTAMP\t423465\n三余\t423466\n的\t423467\n安德拉\t423468\n波打\t423469\n公斤重\t423470\ntesseract-ocr\t423471\n喜剧之王\t423472\n储蓄银行\t423473\n忘恩负义\t423474\n赋闲\t423475\n真空滤油机\t423476\n江油\t423477\n一胜百\t423478\n折纸大全图解\t423479\n珠江花园\t423480\n11.1.1\t423481\n人教版五年级数学下册\t423482\nе\t423483\n动物画\t423484\n斗法\t423485\n椰\t423486\n家年\t423487\n滑雪大冒险\t423488\n中证500指数\t423489\n食用玫瑰\t423490\n盈通\t423491\n拜耳\t423492\n8.00\t423493\n撸点\t423494\n质子重离子医院\t423495\n玛莲娜\t423496\nanalyzing\t423497\n试了试\t423498\nencoding=\t423499\nSEG\t423500\nxpress\t423501\n怪招\t423502\n李四光\t423503\n小米note4\t423504\n王烨\t423505\n珀\t423506\n王玉\t423507\n杰瑞股份\t423508\n五官科医院\t423509\n阜阳师范学院\t423510\njuul\t423511\n49\t423512\nlookalike\t423513\n坡道式\t423514\nseason\t423515\n周末同床\t423516\n大患\t423517\n小仓唯\t423518\n沪西\t423519\nibiza\t423520\n顾城\t423521\nAdvertising\t423522\n热脸\t423523\n小东\t423524\n糟蹋\t423525\n大克鼎\t423526\n计在\t423527\n珠机\t423528\n辉度\t423529\n8823\t423530\n略微\t423531\n季诺\t423532\n书缘\t423533\n将令\t423534\n银奖\t423535\n美国移民局\t423536\n鞋样\t423537\nICP备案查询网\t423538\n下陆\t423539\n重拨\t423540\n长润影视\t423541\n李冀雪\t423542\nGoogle地球\t423543\nWide\t423544\n国事访问\t423545\nconsume\t423546\nUnderstand\t423547\nmtz\t423548\n招标代理合同\t423549\n小米路由器mini\t423550\n422\t423551\n梦想的声音2\t423552\n银烛\t423553\nOverlord\t423554\n国模\t423555\n第五座\t423556\n张卫\t423557\nEiEi\t423558\n小鲜肉们\t423559\n倍数\t423560\nnurture\t423561\n市实验小学\t423562\n散列函数\t423563\n海格\t423564\n灾害性\t423565\n主网\t423566\nH310\t423567\n灭门案\t423568\n犀角\t423569\n万科金域东郡\t423570\nFTM\t423571\n龙港镇\t423572\nEKF\t423573\n男人与女人\t423574\n狂转\t423575\n张娟娟\t423576\n学历史\t423577\n47第二部\t423578\n起来\t423579\n2.6.18\t423580\n温州酒店\t423581\n蔚来汽车\t423582\n河南省建设厅\t423583\n厦门国际会展中心\t423584\n柳田\t423585\n美客\t423586\nCame\t423587\n图层组\t423588\n攻略文\t423589\n嘉峪关机场\t423590\n3月初\t423591\n冬歇\t423592\n冠状病毒\t423593\nh2数据库\t423594\n物理治疗师\t423595\nkmno4\t423596\n陌生人妻\t423597\n绿壳\t423598\nSoler\t423599\n海子\t423600\n江西铜业集团公司\t423601\nkirikiroi
d2\t423602\n电热开水瓶\t423603\n黄华\t423604\n洛杉矶之战\t423605\n数码宝贝tri\t423606\n雾天\t423607\nLaravel学院\t423608\n中央纪委\t423609\n陈二狗的妖孽人生\t423610\n鸿茅酒\t423611\n威尔顿\t423612\nstl格式\t423613\ne100\t423614\n5.6.14\t423615\n圆通速递\t423616\n接口\t423617\n分子筛\t423618\n马东锡\t423619\nalie\t423620\n俗子\t423621\ncorona渲染器\t423622\n太仓港区\t423623\n小丘\t423624\n中国燃气\t423625\n61路\t423626\n酷音配音网\t423627\nPHPWEB\t423628\nAutocad2013\t423629\nrender\t423630\n货运公司\t423631\n笼养\t423632\n鸿泰\t423633\n诗稿\t423634\n担任\t423635\netag\t423636\n20150103\t423637\n修真风云录\t423638\n3.com\t423639\n山东工商\t423640\n)\t423641\nViewCell\t423642\n快刀\t423643\n杨学明\t423644\ndistur\t423645\n开元小区\t423646\ntranslucent\t423647\n人人游戏\t423648\nstrava\t423649\n不食\t423650\n操作箱\t423651\n人身体\t423652\n金布\t423653\nWAMPSERVER\t423654\n周转王\t423655\n5F\t423656\nmambo\t423657\n手报\t423658\n换机\t423659\n硬盘\t423660\n金汇通航\t423661\n200多分\t423662\n阿乐\t423663\n至正\t423664\n果麦奶茶\t423665\n香体\t423666\n角川\t423667\n幼狗\t423668\n七宝古镇\t423669\n斯柯达\t423670\n18.2.1\t423671\n任翔\t423672\n埋场\t423673\nringing\t423674\nconcession\t423675\nGAO\t423676\n燃火\t423677\n可执\t423678\n324号\t423679\n肾俞\t423680\n绿圈\t423681\n扣件式钢管脚手架\t423682\n莫清\t423683\n取悦\t423684\n绿世界\t423685\n吴云\t423686\n价值链\t423687\n百分比值\t423688\n广州市第七中学\t423689\n皇泽寺\t423690\nVARY\t423691\n31分钟\t423692\n二更\t423693\n电玩巴士ps3\t423694\n先马\t423695\n鑫都\t423696\n序号\t423697\n瓶画\t423698\n认证书\t423699\n3235\t423700\ntonglin0325\t423701\n岳云鹏\t423702\n中国医学科学院医学生物学研究所\t423703\n控制工程专业\t423704\n巨人传\t423705\n杭宸\t423706\n外服\t423707\n下订单\t423708\n长城信用卡\t423709\nUnleashed\t423710\n狗头\t423711\n丑八怪\t423712\ntable分页\t423713\n田舍\t423714\n雁田\t423715\n水恋\t423716\nWin10版本号\t423717\nallez\t423718\n应付票据\t423719\n长衣\t423720\n会议台\t423721\n家学\t423722\n尺牍\t423723\n移交\t423724\n由来已久\t423725\n兴镇\t423726\n2017年12月28日\t423727\n辽宁师范大学海华学院\t423728\n易烊千\t423729\nAtom\t423730\n身元\t423731\nsteal\t423732\nA3版\t423733\n泥沙俱下\t423734\n电感电路\t423735\n108天\t423736\n人脚\t423737\n1.88\t423738\n报业集团\t423739\n儿童期\t423740\nching\t423741\n脊柱科\t423742\n中航国际广场\t423743\nLo
rraine\t423744\n_股\t423745\ngrateful\t423746\n纨绔\t423747\n吕雉\t423748\n环卫工\t423749\nchorme\t423750\n沈沈\t423751\n狼狼\t423752\n贵航\t423753\n日钢\t423754\n五版\t423755\nγ\t423756\n厚册\t423757\n花药\t423758\n广东顺德德胜学校\t423759\nGF-2013-0201\t423760\n张蓝心\t423761\n汉魂\t423762\n布雷西亚\t423763\n霍金\t423764\n管局\t423765\n江亚菲\t423766\n颤音\t423767\n自乐\t423768\n4.3亿\t423769\n第0\t423770\nDotA\t423771\n华莱\t423772\n唐彦谦\t423773\n教参\t423774\n兜底\t423775\nprince2\t423776\n回回头\t423777\n岳阳网\t423778\n俄罗斯战争\t423779\n25周岁\t423780\n主方\t423781\nivms4200\t423782\n周兵\t423783\n望江路\t423784\n沉睡魔咒\t423785\nLyte\t423786\nQDII\t423787\ngoda\t423788\n城市总体规划编制\t423789\n口吻\t423790\n叙利亚战局\t423791\n黑卡5\t423792\n朱元思\t423793\n夜猫子\t423794\n承发\t423795\n口通信\t423796\n积雪\t423797\n職\t423798\n吕俊\t423799\nyoutube1080p\t423800\n条套\t423801\n肉肠\t423802\n姚遥\t423803\n车建新\t423804\n削峰填谷\t423805\nclassmethod\t423806\nMARSHALL\t423807\n720P/\t423808\n叶辰\t423809\n自已\t423810\n东北\t423811\n10.0.2.2\t423812\n梨城\t423813\n口腔频道_健客网\t423814\n中生代\t423815\nresist\t423816\n路亚饵\t423817\n中国农业银行手机银行\t423818\n四向\t423819\n产学\t423820\n爱杏美\t423821\n刑事侦缉档案吧\t423822\n小树\t423823\n2018百\t423824\n菜菜\t423825\n好小子\t423826\n再别康桥\t423827\n统一企业\t423828\n下午五点\t423829\n斯诺克锦标赛\t423830\nlay\t423831\nmenhera酱\t423832\n民政部\t423833\n62个\t423834\n李狗蛋\t423835\n蓝心湄\t423836\n250级\t423837\n出窍\t423838\n国防工业出版社\t423839\n张文彬\t423840\nfisher\t423841\n气压传感器\t423842\n哥舒\t423843\nVostro\t423844\nmapnik\t423845\n_钢笔字帖\t423846\nneimeng\t423847\n拖欠\t423848\n序列帧\t423849\n桑子\t423850\n政治局\t423851\n可可香奈儿\t423852\n苍之纪元冒险\t423853\n宜川中学\t423854\n批斗\t423855\nTXT版\t423856\n百乐门\t423857\n5000万元\t423858\n刘罗\t423859\n技规\t423860\n头孢克肟片\t423861\n七五普法\t423862\ndli\t423863\n银猪\t423864\n鲜有人知\t423865\n巴统\t423866\n武商\t423867\n惠阳\t423868\nfategr\t423869\n做贡献\t423870\nG900\t423871\nk8\t423872\n20141112\t423873\n72亿\t423874\n天涯情缘\t423875\n粉笔省\t423876\n直飞\t423877\n架\t423878\n禅茶\t423879\n维维网\t423880\nRX1R\t423881\n天镇县\t423882\n菲斯特\t423883\n一个梦想\t423884\nBilingual\t423885\nliunux\t423886\n7G\t423887\n下水解\t423888\n尼康
D7000\t423889\n斯图尔特\t423890\n猎肠者\t423891\nBattlerite\t423892\nCOMMERCE\t423893\nrecv函数\t423894\nqwq\t423895\n古国\t423896\n计税\t423897\n日语在线翻译\t423898\n微带\t423899\n塔罗师\t423900\n玛丽罗斯\t423901\n拼板机\t423902\nSavills\t423903\n狱霸\t423904\n小旋风\t423905\n村里\t423906\n回函\t423907\nOath\t423908\n养车乐\t423909\n细纱\t423910\n亿万\t423911\n分寸\t423912\nOverall\t423913\n文化人类学\t423914\n制成\t423915\n税友\t423916\n毒伤\t423917\n哔咔哔咔哔咔吧\t423918\n绅士们\t423919\n武汉纺织大学外经贸学院\t423920\n弹速\t423921\n彩绳\t423922\n丁晓钟\t423923\n研一\t423924\n辞掉\t423925\n伽马\t423926\n80小时\t423927\n前夕\t423928\n进沪\t423929\n60公分\t423930\n捆包机\t423931\n喜儿\t423932\n杨炯\t423933\n贲\t423934\n3户\t423935\nlabe\t423936\n随机过程\t423937\n独独\t423938\nMnet\t423939\nDirac\t423940\n弹鼓\t423941\n定岗\t423942\n曹娥街道\t423943\n电路原理\t423944\nng-if\t423945\n水果盘\t423946\n离监\t423947\n齐贤镇\t423948\nFinancial\t423949\nJBoss7\t423950\n咏流传\t423951\n拾豆豆\t423952\n路由协议\t423953\nikan\t423954\n100小时\t423955\n发散\t423956\nHH\t423957\n卡尔曼滤波器\t423958\ncaudalie\t423959\n难忘的岁月\t423960\n财产保险公司\t423961\nLACOSTE\t423962\n免费版\t423963\n温润\t423964\n继教网\t423965\n炸肉\t423966\n不只\t423967\n比尔·盖茨\t423968\n蒲岐\t423969\n甲硝唑凝胶\t423970\n包头市\t423971\n喉位\t423972\nangualr4\t423973\n3一6岁\t423974\nacore\t423975\n瑞州\t423976\n死亡观\t423977\nassuming\t423978\n高低温箱\t423979\n瑞丽社区\t423980\n奥利奥\t423981\n王仁果\t423982\nWBC\t423983\n丝径\t423984\n八宗\t423985\n李继宏\t423986\n荣耀平板2\t423987\n麝香葡萄\t423988\n驱动精灵离线版\t423989\n花日绯\t423990\n阿北\t423991\n临江市\t423992\n上海公积金\t423993\n圣父\t423994\n符文页\t423995\n绿水青山\t423996\nxorm\t423997\n灰霉病\t423998\n免费师范生\t423999\nSQLSERVER\t424000\n起亚\t424001\n杜春雨\t424002\ncountif函数\t424003\n蒙住\t424004\n明德学校\t424005\n三军\t424006\nitotii\t424007\n斯柯达野帝\t424008\n福建省人力资源和社会保障厅\t424009\n最大公约数\t424010\n山西省儿童医院\t424011\n阿妮\t424012\n1946\t424013\n阿信\t424014\n马刚\t424015\n十一局\t424016\n共享\t424017\n智通人才网\t424018\n爱国卫生\t424019\n梦话西游手游\t424020\ntbs\t424021\n李絮\t424022\n项目标\t424023\nEnsembl\t424024\n线量\t424025\nfalcom\t424026\n马皇后\t424027\n晴雯\t424028\n小U\t424029\n12克\t424030\nexhaust\t424031\n巨人症\t424032\nhop\t4240
33\n悬挂式\t424034\n17.0.1\t424035\n干爽\t424036\n绿色社区\t424037\n影像馆\t424038\ncorpusleed\t424039\n魔刃豹\t424040\n读法\t424041\n高雅\t424042\n麦芽糖\t424043\n大成郡\t424044\n风振系数\t424045\n酒保\t424046\n王美莼\t424047\n一个户\t424048\n大米\t424049\nScary\t424050\nwinds\t424051\n1.5公里\t424052\n干架\t424053\n外帐\t424054\n0738\t424055\n精室\t424056\n电宝\t424057\nmemo\t424058\n比劫\t424059\n煤气报警器\t424060\nsexs\t424061\n感染性\t424062\n仙人掌\t424063\n黑天鹅事件\t424064\n岛\t424065\n今夜的你又在和谁约会\t424066\n沾衣\t424067\n背墙\t424068\n道尊\t424069\n南宁百姓网\t424070\n吊锅\t424071\n刘剑锋\t424072\nmasq\t424073\nNMEA\t424074\nRama\t424075\nbootstraptable\t424076\n高水平\t424077\n竖形\t424078\n园博会\t424079\n八贤王\t424080\n紫菜蛋汤\t424081\n麒麟955\t424082\n3亿吨\t424083\nLatin\t424084\nContinental\t424085\n药盒\t424086\n凤凰新华书店旗舰店\t424087\n王金华\t424088\n大嫁风尚\t424089\n陕西旅游出版社\t424090\n调度费\t424091\n东直门医院\t424092\n韩流\t424093\n魔界战记2\t424094\nsolidworks2014\t424095\nalthough\t424096\n松垮\t424097\n潮水\t424098\n伊思蜗牛霜\t424099\n4下\t424100\n平改坡\t424101\n欧典\t424102\nvbr\t424103\n吴文光\t424104\n禄劝\t424105\n胃子\t424106\n鼹鼠的故事\t424107\n上限\t424108\n木瓜籽\t424109\n郑州市房管局\t424110\n净化\t424111\n半调\t424112\n中国小说史略\t424113\n华清远见\t424114\n文档检查器\t424115\n认种\t424116\nEDA/IP/IC设计\t424117\n偏头\t424118\n两厘米\t424119\nb端\t424120\nlisboa\t424121\n讯道\t424122\n0w\t424123\n锦绣城\t424124\n鼻喉\t424125\n香樟木\t424126\n上策\t424127\n老年期\t424128\n360N6pro\t424129\n江苏证监局\t424130\n733动漫网\t424131\n陕西铁路工程职业技术学院\t424132\nST华泽\t424133\n环翠\t424134\n求职网\t424135\n葡萄籽提取物\t424136\n直线振动筛\t424137\n皮衣\t424138\n蔚蓝卡地亚\t424139\n极星\t424140\n期货从业资格\t424141\n假意\t424142\n华润公园九里\t424143\n暗香疏影\t424144\n舞蹈裙\t424145\n三防机\t424146\n苏稚\t424147\n小姐们\t424148\nSentence\t424149\n安科\t424150\nHolic\t424151\n一二年\t424152\nappstroe\t424153\n印稿\t424154\n台州市政府\t424155\n快穿女\t424156\n大花\t424157\n7077\t424158\n中国宝武钢铁集团有限公司\t424159\ntouzi\t424160\n3月\t424161\n有效性\t424162\n空乘\t424163\n11月21日\t424164\n飞过来\t424165\n市人防办\t424166\n二阶\t424167\n做网\t424168\n景美\t424169\n合顺\t424170\npride\t424171\n三小时\t424172\n亚太经合组织\t424173\n建筑实务\t424174\n足动旅游网\t424175\n超级减肥王\t4241
76\ntvpaint\t424177\n_公积金贷款计算器\t424178\nQQ直播网\t424179\n绕流\t424180\n94677\t424181\n中南熙悦\t424182\n电信\t424183\nin语句\t424184\ndoki\t424185\n鬼仙\t424186\n第75\t424187\n电位差\t424188\n广州市白云区政府\t424189\nMOC\t424190\n和家园臻园\t424191\n209\t424192\nwoms\t424193\n磁力链接播放器\t424194\nPS钢笔工具\t424195\n观潮时尚网\t424196\n山亭\t424197\n趣味性\t424198\n画板\t424199\n齐胸襦裙\t424200\n凸透镜成像规律\t424201\n万顷\t424202\n温莎牛顿\t424203\n适应度\t424204\n文灿\t424205\n北京口腔医院\t424206\n维谷\t424207\nautoplay\t424208\nhpv16\t424209\nkaya\t424210\nsmoking\t424211\n思想道德修养与法律基础\t424212\ns3500\t424213\n约克郡\t424214\n拓维信息\t424215\n南宁糖业\t424216\n成本地\t424217\n锄禾日\t424218\n破洞牛仔裤\t424219\n㎝\t424220\n免疫法\t424221\nunturned\t424222\n米歇尔·福柯\t424223\n吴倩莲\t424224\n心神不宁\t424225\n孵化器\t424226\n回来吧大叔\t424227\nyingguo\t424228\n诺基亚E63\t424229\n聋生\t424230\n博人传火影忍者\t424231\n名市\t424232\n永和镇\t424233\n河南省新郑市政府\t424234\n切酶\t424235\n发育别\t424236\n危险点\t424237\n手里\t424238\nOn\t424239\ncontours\t424240\n802.1x\t424241\nv4.3.2\t424242\n南方有乔木\t424243\n咏物诗\t424244\n响水镇\t424245\n220千伏\t424246\n年经\t424247\n装书\t424248\n―\t424249\n查斯特·贝宁顿\t424250\n天津网\t424251\n君生我未生\t424252\nmeraki\t424253\n东新乡\t424254\n绝境逢生\t424255\nfection\t424256\n别嘌醇片\t424257\nSA1456C\t424258\n再障\t424259\n文明史\t424260\nPatreon\t424261\n伤害符\t424262\n埃塞俄比亚\t424263\n连唱\t424264\n千款\t424265\n蔡塘广场\t424266\n不行\t424267\n悲\t424268\n辽蓝\t424269\nNAN\t424270\n罗尔\t424271\nELSA\t424272\n吉利远景s1\t424273\n接货\t424274\n王飞飞\t424275\n广州市黄埔区人民政府\t424276\n江淮瑞风S3\t424277\nRequireJS\t424278\n匪患\t424279\n战舰世界\t424280\nparliament\t424281\n稽察\t424282\n北京市昌平区人民政府\t424283\nroverliang\t424284\n理查德森\t424285\n倒车雷达\t424286\n16&\t424287\n舞台车\t424288\nMX375\t424289\n劲炫ASX\t424290\ncoh2吧\t424291\n三周目\t424292\n装机容量\t424293\n容貌\t424294\n南丰县\t424295\n金昌市\t424296\n012路\t424297\n散漫\t424298\nshotcut\t424299\n十五篇\t424300\n圣堂山\t424301\n海贼无双3\t424302\n连续值\t424303\n深圳市水务(集团)有限公司\t424304\n福建省测绘地理信息局\t424305\n界限\t424306\n福利社区\t424307\n流速仪\t424308\n珠光新城\t424309\n2448\t424310\n武商量贩\t424311\n屈婉玲\t424312\nP4000\t424313\n咖喱饭\t424314\n活性炭吸附\t424315\n电动
助力转向系统\t424316\n广州宾馆\t424317\n第123号\t424318\n靖西市人民政府\t424319\nrestrain\t424320\n皖维高新\t424321\n李国文\t424322\n6137\t424323\n全能车\t424324\n狙击精英v2\t424325\n千万遍\t424326\n五院\t424327\n万豪礼赏\t424328\n3478\t424329\n炫龙\t424330\n破口大骂\t424331\n玩头\t424332\n对比\t424333\n出来\t424334\n一敌\t424335\n公网ip\t424336\n宝燕\t424337\n总裁班\t424338\n360uu\t424339\n巾帼枭雄之谍血长天\t424340\n大连市政府\t424341\n这世\t424342\n座骑\t424343\n敌房\t424344\nErase\t424345\n暗黑破坏神3_Diablo3_凯\t424346\n阿肥\t424347\nReap\t424348\n李亚\t424349\n小茵\t424350\nCorruption\t424351\n002285\t424352\n可燃物\t424353\n电波女\t424354\nFansite\t424355\n铤和\t424356\nep1\t424357\n青鸾峰\t424358\n科学计数法\t424359\n对苯二甲酸\t424360\nlibnet\t424361\n并校\t424362\n叶插\t424363\n600751\t424364\n脑瘤\t424365\n记者节\t424366\n散去\t424367\n依附\t424368\n袁航\t424369\n马拉\t424370\n蔗渣\t424371\n亚妮\t424372\n严佼君\t424373\n搜狐网\t424374\n华中师大一附中\t424375\n王官振\t424376\n变位\t424377\n格莱美\t424378\nWholesalers\t424379\n漏网之鱼\t424380\n页边\t424381\nip电话\t424382\n海瑞罢官\t424383\n串关\t424384\nSANGFOR\t424385\n富尔茨\t424386\n前洲镇\t424387\necp\t424388\n挑取\t424389\nDoyle\t424390\n不收回\t424391\n益佰\t424392\n西北路\t424393\n傍名牌\t424394\n骏驰\t424395\n茶书\t424396\n水影\t424397\n那一片\t424398\ndescribing\t424399\n无线局域网\t424400\n15根\t424401\n白心\t424402\n真的很简单\t424403\n血渍\t424404\n赤湖\t424405\n1842年\t424406\n半夜\t424407\n享受\t424408\nsetupres\t424409\nREIT\t424410\n南宁政府\t424411\nCompany\t424412\n招商加盟代理\t424413\n纵膈\t424414\n济南奥数网\t424415\n钟鼎\t424416\n八级\t424417\n六角砖\t424418\n过不去\t424419\n失落\t424420\nBeers\t424421\n边缘线\t424422\n沙盒游戏\t424423\n267\t424424\n满洲\t424425\n油电混合动力车\t424426\n子文件\t424427\n信鸽\t424428\n文浩\t424429\n平山街道\t424430\nbilibi\t424431\n压花板\t424432\n三更灯火五更鸡\t424433\n物资部\t424434\n大华山\t424435\n所见即所得\t424436\n艾美特\t424437\n走着\t424438\nMagazines\t424439\nSquirrel\t424440\n玄玉\t424441\n1150\t424442\n彭埠\t424443\n老仙\t424444\n汁源\t424445\n坡比\t424446\n堡狮龙\t424447\nlaya\t424448\n孤存\t424449\n江淮iEV6E\t424450\n第一展\t424451\nSTART\t424452\n少年商学院\t424453\nHun\t424454\n官城\t424455\nShaderForge\t424456\n黑色版\t424457\n铸件\t424458\nAEC\t424459\n青格\t424460
\n飞夺\t424461\n浸水\t424462\n社会主义核心价值体系\t424463\n洪文\t424464\n区码\t424465\n林怀民\t424466\n锁存器\t424467\n城市基础设施建设\t424468\n伸筋草\t424469\nClassifier\t424470\n竹叶茶\t424471\n紫癜性肾炎\t424472\n塑料卡\t424473\n5.7%\t424474\n护理部\t424475\n嫁接树\t424476\n凶星\t424477\n转向泵\t424478\n女厕所\t424479\n音平\t424480\n广东省高等学校毕业生就业指导中心\t424481\n仙林大道\t424482\n龙钩\t424483\nm7400pro\t424484\n胯下之辱\t424485\nExcel文件\t424486\n百分之7\t424487\n秦红殇\t424488\n环氧树脂防腐钢管\t424489\n上海家化联合股份有限公司\t424490\n五粮\t424491\n中国地震局地质研究所\t424492\n决杀\t424493\n关屏\t424494\n水汽\t424495\n鱼卷\t424496\n20171108\t424497\n郑州市国土资源局\t424498\nWorkerMan\t424499\nEither\t424500\n联谊会\t424501\n分音\t424502\n嘉定新城\t424503\nPatrol\t424504\n沈阳教育网\t424505\n汾口镇\t424506\n保利协鑫\t424507\n清议\t424508\ncrf\t424509\n固资\t424510\n梨园路\t424511\n区位\t424512\nRefrigeration\t424513\n平罗县政府\t424514\n规划证\t424515\n10649\t424516\n签到机\t424517\n珠玑巷\t424518\n道森\t424519\nKNX\t424520\n2架\t424521\n巡航者\t424522\n碳库\t424523\n虚拟打印机\t424524\n学不会\t424525\n1695\t424526\n挖矿病毒\t424527\nSkew\t424528\n波罗蜜\t424529\n威亚\t424530\n咸阳湖\t424531\n柑普茶\t424532\n鲍勃\t424533\n两重性\t424534\n0349\t424535\nUnlocking\t424536\nIS\t424537\nSTAT\t424538\nspacevim\t424539\n边海防\t424540\n170414\t424541\n梅霞\t424542\n中国工商局\t424543\n潘晓婷\t424544\njav68\t424545\n凝聚剂\t424546\nCanmake\t424547\n侯龙涛\t424548\n酸反应\t424549\n认证码\t424550\n洗浴广场\t424551\n不合格者\t424552\n文化路小学\t424553\n2K13\t424554\n港味\t424555\n东昌湖\t424556\n百强家具\t424557\n玩脚\t424558\n薛家庄\t424559\n省高级人民法院\t424560\ntaonuonuo\t424561\n_天涯杂谈\t424562\n泪滴\t424563\ncanopy\t424564\n吐蕃\t424565\n7205\t424566\nYEAH\t424567\n上海祥树实业\t424568\n幻卡\t424569\n综合版\t424570\n6万公里\t424571\n伊织\t424572\n专用机\t424573\ncompress\t424574\n高中数学数列\t424575\nSocio\t424576\n拉丁语\t424577\n飞上天\t424578\niFile\t424579\n减缩\t424580\n香九龄\t424581\n呑\t424582\ngatling\t424583\n暖温\t424584\n逆天百炼成仙\t424585\nMOLECULAR\t424586\n赤果果\t424587\n工业加湿器\t424588\n申辩\t424589\n瞬身\t424590\nps版\t424591\n艳情短篇合集\t424592\n种瓜得瓜\t424593\naii\t424594\n乐高城市\t424595\nCSkin\t424596\n老板\t424597\n山装\t424598\nSpectre\t424599\n武汉市天然气有限公司\t424600\n冰点下载器\
t424601\n难忘的日子\t424602\n乐视超级手机1\t424603\n子元素\t424604\n肌肉痉挛\t424605\n郁金香花展\t424606\n小试牛刀\t424607\n龙云\t424608\nDe\t424609\n蛇蛇大作战\t424610\nNVMe\t424611\n主役\t424612\n黑书\t424613\n裤带\t424614\n5.0金\t424615\n赵韩樱子\t424616\n四分之三\t424617\n鹅妈妈童谣\t424618\n筱禾\t424619\nppt2013\t424620\n钝\t424621\n中教控股\t424622\n唐鹏\t424623\n圖\t424624\n中脉科技\t424625\n李福成\t424626\nHenkel\t424627\n世界军人运动会\t424628\n4000家\t424629\nDEW\t424630\n汤姆希德勒斯顿\t424631\nPHPSTORM\t424632\n三七开\t424633\n截词\t424634\nhight\t424635\n2015年4月份\t424636\n25.00\t424637\n滴头\t424638\naucs6\t424639\n美防长\t424640\n第四名\t424641\n信科\t424642\n60克\t424643\n硫酸镁\t424644\ndnfex\t424645\n入关\t424646\n念力\t424647\n傻子\t424648\n男单\t424649\n绵阳九院\t424650\n花币\t424651\n第3册\t424652\n快本之家\t424653\nglusterfs\t424654\n海珠区\t424655\n华建\t424656\n花袋\t424657\n大腰子\t424658\n瓷土\t424659\neif\t424660\n莽草酸\t424661\n屈老师幼儿园\t424662\n淤血\t424663\n广谱\t424664\n荣耀战队赛\t424665\n1969年\t424666\n2016年12月12日\t424667\n轮廓度\t424668\n西式\t424669\ndive\t424670\n60天内\t424671\n小木\t424672\n主观\t424673\n基因组测序\t424674\nB型\t424675\nNamespaces\t424676\n50帧\t424677\n肺寒\t424678\n点斑\t424679\nSwitch\t424680\n黑卡3\t424681\n地狱狂蛇\t424682\n出血\t424683\nmocrosoft\t424684\n赵日天\t424685\n仲恺区\t424686\n幽禁\t424687\n生蛆\t424688\n始终不渝\t424689\n污衣\t424690\nrefid\t424691\n单轨\t424692\n中国人民大学历史学院\t424693\n六旗\t424694\n鼻龙\t424695\n安步\t424696\n逗鱼时刻\t424697\n努力值\t424698\nRolls\t424699\n珂兰\t424700\ngbc\t424701\n0M\t424702\nzhijian\t424703\n仰天\t424704\nvaadin\t424705\n9袋\t424706\n曼波\t424707\nExploring\t424708\n舒适100网\t424709\n辞源\t424710\n成都市房地产经纪协会\t424711\ncrossfire\t424712\n观后感\t424713\n东莞中堂\t424714\npendo\t424715\n上海松下\t424716\n求行\t424717\n富阳市\t424718\nethtool\t424719\n商大\t424720\nleapmotion\t424721\n中国居民膳食指南\t424722\n龙台镇\t424723\n60P\t424724\n欢乐合唱团\t424725\nRHEL7.0\t424726\nggzy\t424727\naxl\t424728\n张堰\t424729\nlading\t424730\n反欺凌\t424731\n沙堆\t424732\nALS\t424733\n杭钢集团\t424734\n华北电网\t424735\n安外\t424736\n5双\t424737\n爱知网\t424738\n诡雷\t424739\n不宣\t424740\n职院\t424741\n亚里士多德\t424742\n八达岭动物园\t424743\n第三笔\t424744\ntorn\t4
24745\n肋骨处\t424746\n木鱼镇\t424747\n工控类\t424748\n小檀\t424749\n太走心\t424750\nFill\t424751\nDeepthroat\t424752\n子洲县人民政府\t424753\n剩饭\t424754\n高空坠落\t424755\n深证\t424756\n幕府山\t424757\n滚丝轮\t424758\nJeez\t424759\nlan2\t424760\nz4\t424761\n神情\t424762\ndoge\t424763\n锡耶纳\t424764\n微句\t424765\n理学类\t424766\nHina\t424767\n明争暗斗\t424768\n醇酸漆\t424769\n河涌\t424770\n振兴街\t424771\nmetaspace\t424772\n薛家将\t424773\nglitter\t424774\nmaanshancss\t424775\n厦门市公安局\t424776\n20151113\t424777\n_砖家团_阿牛\t424778\n正班\t424779\n昌平区医院\t424780\nTube\t424781\n绿心\t424782\naxue\t424783\n细软\t424784\n轮唱\t424785\n新疆维吾尔自治区质量技术监督局\t424786\n陕西省发改委\t424787\nled\t424788\npcf8563\t424789\n幽瞳\t424790\n3ds模拟器\t424791\n包菜\t424792\n沙恩\t424793\n湖南省委办公厅\t424794\n三十二岁\t424795\n伤寒\t424796\n剧透\t424797\n上智大学\t424798\n天际浩劫2\t424799\n秀场\t424800\n康平县\t424801\nC++构造函数\t424802\n都业华\t424803\n价值链理论\t424804\nul,li\t424805\n中国足协\t424806\n抢夺战\t424807\n万科A\t424808\n14.5%\t424809\n导热率\t424810\n招奴\t424811\n斯伯丁\t424812\n假面骑士斗骑大战2\t424813\n镭风\t424814\nWSUS\t424815\n碧桂园深荟城\t424816\n4.4%\t424817\n胡海\t424818\n毕节\t424819\n105年\t424820\n跪膝\t424821\n百篇\t424822\n栖\t424823\n四分之五\t424824\n而终\t424825\n抓落实\t424826\nWorldwide\t424827\nDeDeCMS\t424828\n学无止境\t424829\n惊呆\t424830\nOTC\t424831\nPROSet\t424832\n久游网\t424833\nespeak\t424834\n翻译社\t424835\n第18周\t424836\n幻雨\t424837\naxiom\t424838\n湖北新闻网\t424839\n线装本\t424840\n亦庄开发区\t424841\n小丈夫\t424842\n嵩明县\t424843\nbeautifulzzzz\t424844\n自然莎\t424845\n复方丹参滴丸\t424846\n大运会\t424847\n防松\t424848\n膳食\t424849\n孔明\t424850\n膑骨\t424851\n旅行地\t424852\nthm\t424853\n甘草酸二钾\t424854\npes2010\t424855\n辽西夏\t424856\n刊载\t424857\n邮集\t424858\n試験\t424859\n应税服务\t424860\n中国皮革网\t424861\n近一月\t424862\nsr\t424863\n明阳风电\t424864\n云岗石窟\t424865\n龙之牧场\t424866\n宁静致远\t424867\n腾讯手机助手\t424868\n好酒\t424869\n上海男科医院\t424870\n脚本\t424871\n禅道项目管理软件\t424872\n神鬼怪\t424873\n600206\t424874\nFluent\t424875\n黄锦\t424876\n1752\t424877\n28mm\t424878\n无地\t424879\n羽夜\t424880\n肩高\t424881\nSkam\t424882\n雷达罩\t424883\n运行工\t424884\n狂潮\t424885\n逸动/逸动XT论坛_汽车之家论坛\t424886\n超级宝典\t424887\n
缬氨酸\t424888\n乱砍\t424889\n理查三世\t424890\nYAG\t424891\n四川大学华西药学院\t424892\nrodeo\t424893\n天迹\t424894\n杯具\t424895\n1507\t424896\n糯米团\t424897\n唐荣\t424898\nh股\t424899\n朴宝蓝\t424900\n光阳摩托kymco\t424901\n漕泾镇\t424902\n樟木头镇\t424903\n翘嘴\t424904\n达哥\t424905\n20150323\t424906\n玉胡芦\t424907\nrenew\t424908\nmicroSD\t424909\nBURGER\t424910\n康熙王朝\t424911\nrebel\t424912\n胸前传\t424913\n国际形势\t424914\n海宁市区\t424915\n天地有情\t424916\n素字\t424917\n膜法世家\t424918\nEpik\t424919\n清源股份\t424920\n如意馄饨\t424921\n华发世纪城\t424922\n振华冰山金蝶\t424923\n碘缺乏病日\t424924\n睡城\t424925\n2016年12月21日\t424926\nDocking\t424927\n13期\t424928\n新闻眼\t424929\n神结局\t424930\n变奏\t424931\n建设银行银行\t424932\nrequests库\t424933\n金牛管业\t424934\n廖阅鹏\t424935\n直到\t424936\n圣女果\t424937\n宁句\t424938\n千锤百炼\t424939\n烟气\t424940\n莲湖\t424941\n莉莉·柯林斯\t424942\nP2pSearcher\t424943\nImage2Base64\t424944\n上呼吸道\t424945\n〉\t424946\n未久\t424947\n冰晶凤凰\t424948\n配电房\t424949\n40TFSI\t424950\n悦山居\t424951\n烧写器\t424952\n快闪\t424953\n下去掉\t424954\n20160305\t424955\n荆棘王座\t424956\n减胎\t424957\nJMF\t424958\n最热资讯\t424959\n神代利世\t424960\n纪念林\t424961\n里程票\t424962\n第一章&#160\t424963\nh81m\t424964\n密封罐\t424965\n越江隧道\t424966\n德化窑\t424967\nPhysician\t424968\nappetite\t424969\n卡廷惨案\t424970\n建树\t424971\n上朝\t424972\n不安分\t424973\n患难\t424974\n范晔\t424975\n正反手\t424976\nwindows8吧\t424977\n一二三四五六\t424978\n十二路\t424979\n不修\t424980\nmt65xx\t424981\n娱乐底片\t424982\n神剑情天\t424983\npecl\t424984\n六码\t424985\n洞头新闻网\t424986\n白圈\t424987\n醋酸地塞米松片\t424988\nfaux\t424989\n天才们\t424990\n下渠道\t424991\n贾雨村\t424992\n长飞光纤光缆股份有限公司\t424993\n轮空\t424994\n爪哇堂\t424995\n出品人\t424996\n大沽路\t424997\n腾讯广告联盟\t424998\nPasta\t424999\nvcore\t425000\n华云数据\t425001\n东成西\t425002\n赵秉志\t425003\n邵阳日报\t425004\n县政\t425005\n鱼鳞状\t425006\n小拇\t425007\n奇谈\t425008\n抄家\t425009\n樊振叶檀\t425010\nElf\t425011\n鲜奶油\t425012\n宁波小学\t425013\n航天中心医院\t425014\nani\t425015\n火影忍者\t425016\n五法\t425017\nBanner/\t425018\n手机漫画网\t425019\n模拟摄像机\t425020\n圣阳\t425021\n空托\t425022\n励志哥何志康\t425023\n18世纪末\t425024\n帽檐\t425025\n1580mf\t425026\n少队\t425027\n11代\t425028\n全集网_全集电影网_迅雷\t4250
29\n鞭刑\t425030\n切值\t425031\n炉石记牌器\t425032\n元夕\t425033\n周绪红\t425034\n亚马逊(网络电子商务公司\t425035\n非活性\t425036\n小组件\t425037\nc8\t425038\n烟火里的尘埃\t425039\nlhr\t425040\n曾刚\t425041\nv4.21\t425042\n1123\t425043\n居住登记卡\t425044\n雅趣\t425045\nSIMPLE\t425046\n地产公司\t425047\nEhentai\t425048\n23关\t425049\nC12\t425050\n发币\t425051\n科帕奇论坛_汽车之家论坛\t425052\nORA-01722\t425053\n女婴\t425054\n大朗汽车站\t425055\n荣欣\t425056\nJolin\t425057\n2960\t425058\n上海市闵行区中心医院\t425059\n刘皓\t425060\n前景\t425061\n鸡之路\t425062\n外游加速器\t425063\n鬼面\t425064\ntutorial\t425065\n听与\t425066\nuei2\t425067\n贵阳市第一人民医院\t425068\n中奖率\t425069\n131810\t425070\n市关工委\t425071\n安博教育\t425072\n建信基金管理有限责任公司\t425073\n爆盆\t425074\nfap\t425075\n吉行天下\t425076\n枫林网\t425077\n天安财险\t425078\nFIFAOL\t425079\n1.09\t425080\n宁航\t425081\n欧拉法\t425082\n海鑫钢网\t425083\nMySQL论坛\t425084\nOda\t425085\n张正隆\t425086\n警犬\t425087\nUZER\t425088\n畜牧兽医专业人才网\t425089\n工程名\t425090\n副官\t425091\n土偶\t425092\nNosql\t425093\n2017年5月5日\t425094\n余勇\t425095\n不及格\t425096\n加电\t425097\n维乐\t425098\nd5300\t425099\nmybatis-generator\t425100\n不厌\t425101\nberth\t425102\n黑鸦\t425103\n句容社区\t425104\n尼古拉斯·凯奇\t425105\n大娘\t425106\n智能式\t425107\n超级作弊器\t425108\nmini\t425109\n姚班\t425110\n交割日\t425111\n火影ol\t425112\n张玉霞\t425113\n朗诗国际街区\t425114\n大国外交\t425115\n吴城镇\t425116\n荒唐事\t425117\n大红门\t425118\n吴英\t425119\n采油树\t425120\nSETUP\t425121\n40GP\t425122\n江苏苏宁易购\t425123\n大禹\t425124\n学前教育机构\t425125\nsketch\t425126\n洋槐蜜\t425127\nClojure\t425128\n28例\t425129\n营养级\t425130\n璘\t425131\nvc++\t425132\nfw300r\t425133\n恋慕\t425134\n挑唆\t425135\nDefined\t425136\n良字\t425137\n惠特妮·休斯顿\t425138\n7.3元\t425139\nCommunities\t425140\n龙郡\t425141\n米奇\t425142\n交锁\t425143\n传染病频道_健客网\t425144\n空谈\t425145\n沈阳南站\t425146\n试样\t425147\n熟女情缘\t425148\n中国人民武装警察部队学院\t425149\n潍坊地区\t425150\nAABB\t425151\nbackmir\t425152\n小粒粒\t425153\n食用级\t425154\n适能\t425155\nIE60\t425156\n法书\t425157\n落到\t425158\nLS-DYNA\t425159\n爱路\t425160\n第13个\t425161\n羽绒被\t425162\n这种情\t425163\n冰激\t425164\n冷剑\t425165\ngrained\t425166\n阿克巴大帝\t425167\n中保网\t425168\n一条船\t425169\n20160819
\t425170\n七星电子\t425171\n远洋地产有限公司\t425172\n七一二\t425173\n10.10.5\t425174\n张志斌\t425175\n养殖类\t425176\n张帝\t425177\n科林\t425178\nStomp\t425179\nMaya2018\t425180\n长头\t425181\n酒樽\t425182\n蓝狮\t425183\ndetach\t425184\nkies\t425185\n第几波\t425186\n慢速\t425187\nCONQUEROR\t425188\n2666\t425189\n换物\t425190\n第四年\t425191\nAuto\t425192\n香妃\t425193\n安徽省公安厅\t425194\n多单\t425195\n品号\t425196\n所在地区\t425197\n薄冰\t425198\n阿斯匹林肠溶片\t425199\nfibrosis\t425200\n诡面梨园\t425201\n首都师范大学研究生院\t425202\n被绑\t425203\n药事\t425204\n士气\t425205\n林肯飞\t425206\n游离\t425207\n调演\t425208\nJavBus\t425209\n枕骨\t425210\n杜强\t425211\n寅次郎\t425212\n罗曼雷恩斯\t425213\n弥海砂\t425214\n可伐合金\t425215\n咸池\t425216\nzTree\t425217\n民族复兴\t425218\nhfm\t425219\n总汇\t425220\n奥迪Q5论坛\t425221\n气味\t425222\n0921\t425223\na16.4\t425224\nCOLOUR\t425225\n传经\t425226\n九大行星\t425227\n钟山职业技术学院\t425228\n钢锅\t425229\n定制机\t425230\n16公分\t425231\n监检\t425232\n赵荣\t425233\n山特\t425234\nFoto\t425235\n告示牌\t425236\n加群\t425237\n石黑一雄\t425238\n18户\t425239\n永盛\t425240\n豪放词\t425241\n雅莎尔\t425242\n第83号\t425243\n思变\t425244\n和义大道\t425245\nOptiPlex\t425246\n新版三国\t425247\n_易车号_易车网\t425248\n石火\t425249\ns350\t425250\n仗剑天涯\t425251\n母板\t425252\n9公斤\t425253\n田佳\t425254\n一炮而红\t425255\nsolidity\t425256\n肾上\t425257\n四川法制报\t425258\n民办高校\t425259\n捷豹F-TYPE\t425260\nr级片\t425261\n古明地恋\t425262\namd64.whl-CSDN\t425263\n云南白药膏\t425264\ndox\t425265\n官位\t425266\nFoxPro\t425267\n12月17日\t425268\ngolang\t425269\n大师级\t425270\n牡丹王\t425271\n淫窟\t425272\n城商银行\t425273\n杂类\t425274\n花无缺\t425275\nwin7浏览器\t425276\nzooyo\t425277\nCNW\t425278\nppt文件\t425279\n漫画类\t425280\nvpi\t425281\n昌宁县纪委监察局\t425282\nflay\t425283\n盖章机\t425284\nkzp\t425285\n湖北省省\t425286\n数字媒体应用技术\t425287\nlamar\t425288\nEnsemble\t425289\n沈晖\t425290\nViktor\t425291\n自行车棚\t425292\n就业协议书\t425293\nBlueair\t425294\n缓刑\t425295\n小猎人\t425296\n和歌山县\t425297\n拉莫三嗪\t425298\n非标设备\t425299\n私底下\t425300\n灵光寺\t425301\nQ+\t425302\n三百篇\t425303\n澳洲生活网\t425304\nONLYOFFICE\t425305\n海藻粉\t425306\n华大资讯网\t425307\n扑倒\t425308\nG20杭州峰会\t425309\n律师们\t425310\n1552\t425311\n加一物\t425312
\n北海政府网\t425313\n新奇\t425314\nPress\t425315\n贾斯汀比伯\t425316\n过气\t425317\n20140419\t425318\n运动类\t425319\n存志中学\t425320\nAber\t425321\n卫生间门\t425322\n执行人\t425323\n1220\t425324\n南橘北枳\t425325\n北极星风电招聘网\t425326\n浮子\t425327\n南京碧桂园\t425328\n海南话\t425329\n第221集\t425330\n驱鬼逐\t425331\n测财\t425332\n痕量\t425333\n米思米\t425334\n跨国公司\t425335\n微速云_\t425336\n腾牛健康网\t425337\n长园\t425338\n第8课\t425339\n怪物猎人:世界\t425340\n亿融普惠\t425341\n再审\t425342\n韩尚\t425343\n人生如\t425344\ncamera\t425345\n水意\t425346\n多很多\t425347\n12年\t425348\n汗汗\t425349\n学习能力\t425350\n厂内\t425351\ndatas\t425352\n樵\t425353\n泰诺健\t425354\n一键启动\t425355\n上海高级人民法院\t425356\nDelivery\t425357\npcd\t425358\n300年前\t425359\n沪教版五年级语文\t425360\n营业时\t425361\nSpinner\t425362\ntableViewCell\t425363\n近10年\t425364\n珠海清风网\t425365\n小米6\t425366\nsolc\t425367\n312\t425368\n量身\t425369\n戮蛊\t425370\n701所\t425371\nJoy\t425372\n试配\t425373\n隋军\t425374\n杭州日报报业集团\t425375\n承试\t425376\n_凤囚凰\t425377\n九渡河镇\t425378\n旅友们\t425379\nMathCAD\t425380\npreamble\t425381\nhexo\t425382\n死亡华尔兹\t425383\n美能\t425384\nZX2\t425385\n中小学教师资格考试网\t425386\n钻削\t425387\n鼻祖\t425388\nCMH\t425389\n掏裆\t425390\n方扣\t425391\n自由活动\t425392\n淮安市规划局\t425393\ncass\t425394\nscattering\t425395\n银氨溶液\t425396\n海皇监狱\t425397\n上海市浦东新区第四教育署\t425398\nphoton\t425399\n质性研究\t425400\n王霏霏\t425401\n苏州高等职业技术学校\t425402\n石油气\t425403\n兴泰\t425404\n中国平湖_平湖市政府\t425405\n丰隆穴\t425406\n悦子\t425407\n定荣\t425408\n拒付\t425409\n铸铝门\t425410\n20151116\t425411\nneea\t425412\n龚\t425413\n锦官\t425414\n看不见\t425415\nE宠商城\t425416\n赐福\t425417\n曾山\t425418\n封页\t425419\n影坛\t425420\n125路\t425421\n张新成\t425422\n宗妹\t425423\n加弹\t425424\n跨部门沟通\t425425\n非得\t425426\n四面八方\t425427\n毕节日报\t425428\n召唤士\t425429\n民间组织\t425430\nJulian\t425431\n米家智能摄像机\t425432\n试岗\t425433\n碧丽珠\t425434\n2kill4\t425435\n凤凰香香\t425436\n海猫鸣泣之时\t425437\n大联大\t425438\n居民身份\t425439\n梅尔吉布森\t425440\ntheano\t425441\n盐阜\t425442\n思潮\t425443\n红头\t425444\nDuck\t425445\n凯旋在子夜\t425446\n副总工\t425447\n养育巷\t425448\n香熏\t425449\n26分钟\t425450\n7x\t425451\nIPSec\t425452\n左氧氟沙星注射液\t425453\n陈寿\t425454\n赵岚\t4254
55\nLeverage\t425456\n2017CF\t425457\n三十万元\t425458\n死藤\t425459\n智能社\t425460\n一胎二\t425461\n西姆斯\t425462\n百花蜜\t425463\n晚上七点\t425464\nporono\t425465\nSINOPHARM\t425466\n3月下旬\t425467\nFalling\t425468\n布氏杆菌病\t425469\n白玲\t425470\nLightDM\t425471\n中国能建\t425472\n威马汽车技术有限公司\t425473\n李茂贞\t425474\nf427\t425475\n府院\t425476\n股票配资公司\t425477\n白情\t425478\n行尸走肉第二季\t425479\n赛睿Sensei\t425480\n反绒\t425481\n销管\t425482\n排列五\t425483\n谢稚柳\t425484\n僵尸车\t425485\n陕西商标所\t425486\n弄潮儿\t425487\n杜镇杰\t425488\n龙虎榜\t425489\nwatched\t425490\n正副\t425491\n昆明市第三中学\t425492\n危害_39健康网\t425493\n锡师附小\t425494\n项南\t425495\n南音\t425496\n华峰氨纶\t425497\n7467\t425498\nthose\t425499\n炖鸡\t425500\n植护\t425501\nForced\t425502\n壮志未酬\t425503\nHSI\t425504\nPunch\t425505\nU19\t425506\n馊味\t425507\n暗香浮动月\t425508\n誓约\t425509\n3400\t425510\n本套\t425511\n英雄联盟之谁与争锋\t425512\nburch\t425513\n黑岩阅读网\t425514\n畹町\t425515\n丝带绣\t425516\n洛带古镇\t425517\n昌黎\t425518\n创世联盟\t425519\n猎艳\t425520\n赖美云\t425521\n中俄\t425522\n吉利熊猫\t425523\n维和\t425524\n异型钢\t425525\nr6800\t425526\n柿子椒\t425527\n终极指南\t425528\n强迫\t425529\n山东省人民检察院\t425530\n6000米\t425531\n百闻\t425532\n李劲\t425533\nzale\t425534\n26亿\t425535\n精尽\t425536\n正戏\t425537\n柯美\t425538\n北京科兴\t425539\n直角梯形\t425540\nSCHUNK\t425541\n干酪\t425542\n京巴狗\t425543\nAQI\t425544\n121部\t425545\n封停\t425546\n园林规划\t425547\n湿衣\t425548\n板寸\t425549\n浩然正气\t425550\n叶挺\t425551\nConfused\t425552\n广东省教育厅办公室\t425553\n陈谦\t425554\n报站\t425555\n盗车\t425556\n格非\t425557\n计算机应用专业\t425558\namanda\t425559\nFusion\t425560\n格雷米奥\t425561\n氮吹仪\t425562\n赵勇\t425563\ndex\t425564\n阿尔瓦罗\t425565\n斑痕\t425566\n片警\t425567\n毽球\t425568\nqcma\t425569\n枫丹白露宫\t425570\n840D\t425571\n右路\t425572\n一亿颗\t425573\nunity5.0\t425574\nv5.7.1\t425575\nhvie\t425576\nwin2008\t425577\n热压\t425578\n雕爷\t425579\n精铸\t425580\n20180130\t425581\n甘肃省统计局\t425582\nmessages\t425583\nCLA\t425584\n西番莲\t425585\n続\t425586\nJBPM\t425587\n稽东镇\t425588\n3毛\t425589\nject\t425590\nhorizon\t425591\n有道翻译\t425592\ndit\t425593\nsendEmail\t425594\n徕卡q\t425595\n斗破苍穹ol\t425596\n强取豪夺\t425597\nad钙奶\t425598\
n新闻联播\t425599\n第134集\t425600\n录波\t425601\n三集片\t425602\n蓝海银行\t425603\n先锋路\t425604\n计件工资\t425605\n微分方程\t425606\n宅居\t425607\n太空舱\t425608\n沌口经济开发区\t425609\n拖车\t425610\n北京小吃\t425611\n宏光S3\t425612\n烤羊腿\t425613\n王莉\t425614\npose\t425615\n张文涛\t425616\n观音岩\t425617\n10多秒\t425618\n通知\t425619\n36g\t425620\n上劲\t425621\nPrivat\t425622\n斧标驱风油\t425623\n弗格森\t425624\n原浆酒\t425625\n新包青天\t425626\n绘里奈\t425627\n河北中医学院\t425628\nExecute\t425629\n联立\t425630\n钡\t425631\n建筑装饰公司\t425632\n胆道闭锁\t425633\n4399h5\t425634\n电轿\t425635\n工业设计师\t425636\n温岭市教育局\t425637\n鸿坤地产\t425638\n虚拟现实\t425639\n自给自足\t425640\n彩民乐\t425641\n宁波二院\t425642\n玩具柜\t425643\nrebdb\t425644\napr\t425645\nSwift4\t425646\n父子情\t425647\n大場\t425648\nHadid\t425649\n竞赛\t425650\nprank\t425651\n中国国家安全部\t425652\nANIMAL\t425653\n70家\t425654\nautocad2019\t425655\nアダルトビデオ\t425656\n林志\t425657\n兵蚁\t425658\n郑州交运集团\t425659\nOrleans\t425660\nAIDS\t425661\n哪一位\t425662\n4万公里\t425663\n乱管\t425664\nmuc\t425665\n中国关心下一代工作委员会\t425666\n软功\t425667\n彭博社\t425668\n首班\t425669\n梁实秋\t425670\n空之轨迹3rd\t425671\n海发\t425672\n查拳\t425673\nKBS音乐银行\t425674\nedl\t425675\n刘庆邦\t425676\n达达兔\t425677\n联票\t425678\n莉莉玛莲\t425679\n马季\t425680\n三眼哮天录\t425681\n孩爸孩妈聊天室\t425682\n磁通\t425683\n2018劳动节\t425684\n钢柜\t425685\n星际大战\t425686\n琼府\t425687\n农场主\t425688\n维修工\t425689\n挂帅\t425690\n光魔\t425691\n同程网络科技股份有限公司\t425692\n执导\t425693\nA20\t425694\n湖南电信\t425695\nNewSeed\t425696\n皂角树\t425697\n罗马家园\t425698\n蔚蓝图书\t425699\nsvipfuli\t425700\ngps定位系统\t425701\njms\t425702\n镇江市国土资源局\t425703\nwoff\t425704\n铅芯\t425705\nKMSFan\t425706\n936\t425707\n波场\t425708\n欲望迷城\t425709\n槐花\t425710\n吉克传奇世界\t425711\n万赞\t425712\nUSS\t425713\nshirt\t425714\n金华市婺城区政府\t425715\n一夫一妻制\t425716\nMM440变频器\t425717\n哈乐\t425718\n屋区\t425719\n鹏鼎控股\t425720\nTypical\t425721\n场址\t425722\n向辉\t425723\n第171章\t425724\n疏水板\t425725\n小羚羊\t425726\n及至\t425727\n人生若只如初见\t425728\n特创\t425729\n20171115\t425730\n据\t425731\nOTC在线\t425732\n洋蓟\t425733\n置信\t425734\n煤化工\t425735\n3型\t425736\n香村多娇\t425737\n防腐木廊架\t425738\n布洛芬颗粒\t425739\n河北医科大学第三医院\t425740\n李经理\
t425741\nDanner\t425742\n徐鲤\t425743\n弦距\t425744\n两纲\t425745\n省队\t425746\n比索\t425747\n投毒案\t425748\n光腚\t425749\n错失\t425750\nNEON\t425751\n冰点\t425752\nJai\t425753\n大项\t425754\n像风一样\t425755\ncomments\t425756\n丫丫网\t425757\nConditioner\t425758\n工作会\t425759\n复读生\t425760\n昂科威28T\t425761\n刘建洋\t425762\n五音\t425763\n赛德克\t425764\nWin7系统网络\t425765\n提辖\t425766\n专科\t425767\n马克·李维\t425768\n金木水火土\t425769\ntoshi\t425770\n贴身神医\t425771\n856\t425772\n莹草\t425773\ns5\t425774\n暗箭\t425775\n红外感应器\t425776\nphotovoltaic\t425777\n粤媒\t425778\n广州开发区管委会\t425779\nkep\t425780\n98周年\t425781\nzhangyu\t425782\n川西高原\t425783\nCAS#\t425784\n郑州九中\t425785\nFlyme7\t425786\n折旧费\t425787\n小尾\t425788\n车损\t425789\n苏美\t425790\n复方黄柏液\t425791\nredwing\t425792\n211号\t425793\n试阅\t425794\n北京国际长跑节\t425795\n恒腾网络\t425796\n市人力资源社会保障局\t425797\n全团\t425798\n珲春\t425799\ncosine\t425800\n秦虎\t425801\nppt2003\t425802\n叶斑病\t425803\nreigns\t425804\n中国共产党章程\t425805\n淳于髡\t425806\n收集箱\t425807\n严审\t425808\n绝地求生测试\t425809\n1600米\t425810\n天津工商局\t425811\n剑桥分析公司\t425812\n上一个月\t425813\n溱湖国家湿地公园\t425814\n同地\t425815\n欢乐斗地主残局\t425816\n球技\t425817\n还原反应\t425818\n2016年4月26日\t425819\n超兽武装\t425820\n昆明南亚\t425821\n新盛\t425822\n喘粗气\t425823\n肱三头肌\t425824\n抗病毒治疗\t425825\n王阳\t425826\n春风物语\t425827\ncx4\t425828\n59枚\t425829\nм\t425830\n1:20\t425831\njint\t425832\n所得税率\t425833\nleb\t425834\nchaoyue\t425835\nivew\t425836\n财务管\t425837\n曦曦\t425838\n青椒炒肉\t425839\nwww.qiushu\t425840\nbtw\t425841\n乌鸡白凤丸\t425842\npudn\t425843\n拉篷\t425844\n中国产业信息研究网\t425845\n新乐视\t425846\n珍惜\t425847\nluggage\t425848\nmaisy\t425849\n巴斯大学\t425850\n深业华府\t425851\nFay\t425852\n对战\t425853\nsoybean\t425854\n0822\t425855\n3000首\t425856\nPSL\t425857\n徐玲\t425858\n海色\t425859\n王者荣耀\t425860\n银瓶\t425861\nP1606dn\t425862\nanjularjs\t425863\nspherical\t425864\nTextInput\t425865\n陈彬\t425866\nlocus\t425867\n法律史\t425868\n海桐\t425869\n三星a9\t425870\n纸绳\t425871\n静者\t425872\nfdk\t425873\nFu\t425874\n棒谷\t425875\n福利来\t425876\n出口退税率\t425877\n觅踪\t425878\n金手\t425879\n商陆\t425880\n程恩富\t425881\n武侠世界\t425882\n祥\t42588
3\n凝析\t425884\n自保\t425885\n铁拳7\t425886\n文件块\t425887\n李小王\t425888\n鸿茅药\t425889\n上海星尚\t425890\nflex4\t425891\n粼\t425892\n井内\t425893\n第三把\t425894\n福建省住房和城乡建设厅\t425895\n东昌区\t425896\nMATLAB\t425897\n蛀\t425898\n第集\t425899\nMissing\t425900\n宜信普惠\t425901\n相思树\t425902\n木格措\t425903\n岗岩\t425904\n棱镜\t425905\n扭摆\t425906\n25#\t425907\n十足\t425908\nmusical\t425909\n中国联合国\t425910\n刘炜\t425911\n176号段\t425912\n履约保证金\t425913\n坎肩\t425914\n常微分方程组\t425915\n宁铂\t425916\n奥本海姆\t425917\n77\t425918\n赋格\t425919\n吸脱\t425920\n沦落\t425921\n全面性\t425922\n布鲁克林\t425923\n国际乒联\t425924\n文君\t425925\nComposer\t425926\nBali\t425927\n解词\t425928\n通上\t425929\n温兆伦\t425930\n周敏\t425931\n四川省委\t425932\n开源机器学习\t425933\n华景\t425934\n玄机\t425935\n朱音\t425936\nwebuploader\t425937\n江苏人民出版社\t425938\n抓走\t425939\n抽刀断水\t425940\n从零开\t425941\n相见恨晚\t425942\n杨红旭\t425943\n湘\t425944\nbringing\t425945\n保路运动\t425946\n狸狸\t425947\n4升\t425948\nentware\t425949\n张维\t425950\n国家海洋局第一海洋研究所\t425951\nビス\t425952\n长安CX70报价\t425953\n80070643\t425954\n23码\t425955\n中央美术学院设计学院\t425956\n尘车\t425957\n回流焊\t425958\n建兰\t425959\n抓握\t425960\n大柴\t425961\n威科\t425962\nOrgasms\t425963\n凤歌\t425964\ní\t425965\n气调包装机\t425966\n演替\t425967\n我的世界版\t425968\nRusso\t425969\nR2017\t425970\n作乱\t425971\n5000件\t425972\n宇文护\t425973\n地衣芽孢杆菌活菌胶囊\t425974\n比较文学\t425975\nSKT1\t425976\nditch\t425977\n张学伟\t425978\n华夏杯\t425979\n汤臣豪园\t425980\n风热咳嗽\t425981\n无双谱\t425982\n2017年9月30日\t425983\n刘记\t425984\nboo\t425985\n高原反应\t425986\n20170601\t425987\n1.14\t425988\n武康路\t425989\nhybridization\t425990\nencoder\t425991\n胶南\t425992\n中共教育部\t425993\n领条\t425994\n平均增长率\t425995\n奥能\t425996\n南宁物流公司\t425997\nv1.3\t425998\n广州地铁3号线\t425999\n迷局\t426000\n实则\t426001\n斐讯盒子\t426002\n蕊片\t426003\n世界赛\t426004\n28294438\t426005\n任智康\t426006\nexclude\t426007\n白百梁宏达\t426008\n深改委\t426009\nPS板\t426010\n红旗l5\t426011\n小波神经网络\t426012\n举鼎\t426013\n试营业\t426014\n藤桥镇\t426015\nNVMe协议\t426016\nFax\t426017\n德语版\t426018\nyolo2\t426019\n_v1\t426020\nUIKit\t426021\n20.2\t426022\n田字格\t426023\nguna\t426024\n地藏菩萨本愿经\t426025\nKendall\t42
6026\n15年09月\t426027\ntatcha\t426028\n网膜\t426029\n预提\t426030\n网点查询网\t426031\n带酒\t426032\n艾翁\t426033\npeeling\t426034\n丹尼格林\t426035\n维也纳酒店\t426036\nnova3e\t426037\n582美容人才网\t426038\n600多家\t426039\n卿尘\t426040\n骗税\t426041\n河南财政金融学院\t426042\nipadmini4吧\t426043\nPICS\t426044\n零月\t426045\npostcss-px2rem\t426046\n娱乐新闻\t426047\n民声汇_奥一报料_南都报系\t426048\n折扣率\t426049\n车主\t426050\n信宝\t426051\n花刀\t426052\n早产儿\t426053\n男女孩\t426054\n坐享\t426055\n孝老爱亲\t426056\n圣诞老人\t426057\n钱币收藏\t426058\n康养小镇\t426059\n海口新闻网\t426060\nofficial\t426061\nA座\t426062\n圣桑\t426063\n传奇版\t426064\n詹姆斯·卡梅隆\t426065\n685\t426066\n群贤\t426067\n东北人参\t426068\nWarlock\t426069\n天利\t426070\n1000张\t426071\n巧克\t426072\n饺\t426073\n吉安房产网\t426074\n七星瓢虫\t426075\nLoved\t426076\n楚江新材\t426077\n10.6.3\t426078\n丙级\t426079\nmp3吧\t426080\n我是歌手4\t426081\n看电影网\t426082\n调乱\t426083\n中山大学眼科中心\t426084\n清远市清新区\t426085\n输油管\t426086\n失独者\t426087\nzenmap\t426088\n每三天\t426089\n注蜡机\t426090\n一清二楚\t426091\n凤凰苑\t426092\n嘉隆\t426093\nessex\t426094\nPostgresql\t426095\n四川省委省政府\t426096\nFoodie\t426097\n金瓶艳史\t426098\n魅影缝匠\t426099\n新江\t426100\n伏妖开封府之御猫展昭\t426101\n春节序曲\t426102\n3余\t426103\n道牙\t426104\n林凡\t426105\n秋霞电影网\t426106\n壳牌石油有限公司\t426107\n两卫\t426108\n松竹梅\t426109\n天怒人怨\t426110\n长脸\t426111\nvisualstudio2015\t426112\n洛克人x8\t426113\n工商行政管理局\t426114\nmelbourne\t426115\n全国人\t426116\n有百花秋有\t426117\nFried\t426118\n中华人民共和国国旗\t426119\n大全_猎名网\t426120\n双榆树\t426121\n杂质\t426122\n体育博彩\t426123\n请款\t426124\n门证\t426125\n海军演\t426126\nHereTits\t426127\n烤鸡腿\t426128\n无害化处理\t426129\n钍基熔盐堆\t426130\n山海经异兽录\t426131\n宅急\t426132\n日漫\t426133\n气管切开术\t426134\n颐和果园\t426135\n虚点\t426136\nweb前端工程师\t426137\nanalyses\t426138\n4G卡\t426139\n秒速\t426140\n极恶非道\t426141\n毫不留情\t426142\nCLUBMAN\t426143\n去年末\t426144\n4.3万\t426145\n国际化大都市\t426146\n华东康桥国际学校\t426147\n克爹\t426148\n非框架梁\t426149\n建包\t426150\n中国石化集团公司\t426151\nア\t426152\n虫鸟\t426153\n中科院电子所\t426154\n活得好\t426155\nlongitudinal\t426156\n阿木\t426157\n布里斯托\t426158\n暗黑迪丽热巴\t426159\nzifangsky\t426160\n按键\t426161\n好活\t426162\n中国气象科学研究院\t426163\nb
sci\t426164\n西岭雪山\t426165\n卓望\t426166\n木牛\t426167\n老八校\t426168\n布展\t426169\n母奶\t426170\n微茫\t426171\n茜素红\t426172\n粗苯\t426173\n回溯\t426174\n色妮姑\t426175\n两会\t426176\n新一代人工智能发展规划\t426177\n太平洋建设\t426178\n巴黎贝甜\t426179\n亲故\t426180\n美团众\t426181\n火影忍者疾风传\t426182\n血氧\t426183\n全南政府网\t426184\n缅因州\t426185\n炸板\t426186\n吉林卫视\t426187\n腐国\t426188\n城际分类\t426189\n方广\t426190\n绞刀\t426191\n向塘镇\t426192\n荣耀V8\t426193\n音乐之路\t426194\nGTX950M\t426195\ntwitter\t426196\n0.4.4\t426197\nMP3_文学|哲学有声书\t426198\nrpgvxace\t426199\n简悦\t426200\n越秀星汇云锦\t426201\n1英尺\t426202\nNotification\t426203\n月饼\t426204\nM7205\t426205\n七月份\t426206\n骂\t426207\n果儿\t426208\n御手\t426209\n双联\t426210\n压力袜\t426211\nEltaMD\t426212\n西渝\t426213\n三潭印月\t426214\nkendra\t426215\nf几\t426216\n虾味\t426217\n胶囊\t426218\n吊袋\t426219\n女员\t426220\n套内面积\t426221\n下饭\t426222\n笃敬\t426223\n佳门サエコ\t426224\n帅蛋\t426225\nXadmin\t426226\n报告表\t426227\n安瑞克\t426228\n无尽夏\t426229\n瓜分\t426230\n遵义茅台机场\t426231\navd\t426232\n99家\t426233\n侣\t426234\n重阳宫\t426235\n尊御\t426236\ndoped\t426237\n阿尔茜\t426238\nm600\t426239\n容联七陌\t426240\n侠力\t426241\n地城市\t426242\n崔哥\t426243\n新院\t426244\n鸭嘴鱼\t426245\n浩子\t426246\nmf4712\t426247\n金鼎湾\t426248\n纠结症\t426249\n137个\t426250\n陈立军\t426251\n环糊精\t426252\n邦宝\t426253\n大小调\t426254\n数字串\t426255\n1200gs\t426256\n有车\t426257\n职业性\t426258\n第二家\t426259\nWTC\t426260\n大阪府\t426261\n上海第一八佰伴\t426262\n遭到\t426263\n下一步\t426264\n李辉\t426265\n临川一中\t426266\n第十九季\t426267\n90型\t426268\n马格利特\t426269\n中宁县\t426270\n深圳赶集网\t426271\n鳄鱼纹\t426272\n小金刚菩提\t426273\n汉化版+\t426274\ndorado7\t426275\n威斯\t426276\n封建制\t426277\n火影忍者疾风传究极忍者风暴\t426278\n巧格\t426279\n镜像服务器\t426280\n高邑县\t426281\nIDEA2018\t426282\n98届\t426283\n乐课力\t426284\nqq华夏吧_\t426285\n省优\t426286\njiqiao\t426287\nwin10输入法\t426288\n毛坦\t426289\n第40章\t426290\n房地产证\t426291\n五六点\t426292\n冲洗阀\t426293\n650度\t426294\n上海日报\t426295\n面额\t426296\n150亿\t426297\n十五首\t426298\n光强\t426299\nTHD\t426300\n周界\t426301\n50份\t426302\n帕尔马\t426303\n福娃\t426304\n暗天使\t426305\n热立塑\t426306\n杏彩娱乐\t426307\n东港街道\t426308\n王子豪\t426309\n路林\t426310\n负压
\t426311\n凤阳路\t426312\n佟大朱元璋\t426313\n品目\t426314\n有术\t426315\n十二节\t426316\nHotfix\t426317\n京剧版\t426318\nOIS\t426319\ncarly\t426320\n一劳动节\t426321\n辩证唯物主义\t426322\n难寻\t426323\n2017-01\t426324\n李学勇\t426325\n积分版\t426326\n秦烈\t426327\n高建国\t426328\n新婚夫妇\t426329\n3107\t426330\n总公\t426331\n正泰电器\t426332\n全国包\t426333\n辐射3吧_\t426334\n百岁\t426335\n矩阵元\t426336\n铝箱\t426337\npanda\t426338\n戏珠\t426339\n晚红\t426340\n6.04\t426341\n网易云笔记\t426342\n随便说\t426343\n逗川\t426344\n孤木\t426345\nfy\t426346\n元勋\t426347\nMnCu\t426348\nChallenge\t426349\n次日\t426350\n霍思\t426351\n大众polo\t426352\n丝绒\t426353\n木琴\t426354\n编织者\t426355\n三浦加奈\t426356\n吸烟室\t426357\npeizhi\t426358\n隶体\t426359\nbootstarp\t426360\n1840\t426361\n马尾新闻网\t426362\ngtx670\t426363\n烤机\t426364\n旱涝\t426365\n艾菲特\t426366\nensuite\t426367\n_际通宝\t426368\n800_\t426369\n四川省民政厅\t426370\n点屏\t426371\n鳞翅目\t426372\n蒙古人\t426373\n金光集团\t426374\n国道\t426375\n床车\t426376\n史文恭\t426377\n四五万\t426378\n945\t426379\n凌峰\t426380\n旅游卡\t426381\n王翦\t426382\n来势\t426383\n强打\t426384\n病死\t426385\n全力\t426386\n疒\t426387\nHC3i\t426388\n时间的脚印\t426389\n祭英\t426390\n索引图\t426391\n中远物流\t426392\n英雄使命\t426393\nisp\t426394\n复议\t426395\n链板输送机\t426396\n行测\t426397\n领先\t426398\n打旋\t426399\n重庆市綦江区人民政府\t426400\n沈向龙\t426401\n宁陕县\t426402\n法尔\t426403\n果酱\t426404\n话柄\t426405\n麦味地黄丸\t426406\n电工学网\t426407\nKayano\t426408\n作诗\t426409\n宝通街\t426410\n亢龙有悔\t426411\n靓女\t426412\n天同律师事务所\t426413\n湘西北\t426414\n氧量\t426415\n电教片\t426416\n混都市\t426417\n雷格\t426418\n公房\t426419\n尼日\t426420\n延安中路\t426421\n凌轩\t426422\n纪念日\t426423\n厚学\t426424\n得意思\t426425\n迈得\t426426\nopenocd\t426427\n飚速宅男\t426428\n2.1.9\t426429\nLDL\t426430\n33餐饮网\t426431\n常州市委\t426432\n小活络丸\t426433\n孙明明\t426434\nvh\t426435\nCBCT\t426436\nconcentrations\t426437\nhowl\t426438\n四周\t426439\n曼\t426440\n码流\t426441\n皮实\t426442\n键盘\t426443\n威尼斯的小艇\t426444\n甜椒\t426445\n新安村\t426446\n0【\t426447\n发财树\t426448\n息子\t426449\n承接页\t426450\nwww.sf999.com\t426451\n南京大学\t426452\n妇女儿童医院\t426453\nhu\t426454\n别子\t426455\n天天特价\t426456\nrounded\t426457\n麟儿\t426458\n先帝\t426
459\n南风歌\t426460\n字符串\t426461\n双玺\t426462\nabstract类\t426463\n人力资源考试网\t426464\n习近平治国方略\t426465\n人文主义\t426466\n中山区\t426467\n冒烟冰激凌\t426468\n标项\t426469\n这是我的战争\t426470\n柏松\t426471\n熊林\t426472\n霞飞\t426473\n余项\t426474\njaspersoft\t426475\n纵跳\t426476\nMEMORY\t426477\nTIN\t426478\n西部大道\t426479\nprey\t426480\n本天\t426481\n悠果维\t426482\n光滑度\t426483\n辩论\t426484\n莫安琪\t426485\n萬物\t426486\n宝轮\t426487\n管道符\t426488\n郧西县\t426489\n12315\t426490\n瞑目\t426491\nFutureTask\t426492\n汕头教育\t426493\n永春一中\t426494\n灵书妙探\t426495\n绵阳市政府\t426496\nfta\t426497\n氯化镍\t426498\n性瘾症\t426499\nm-2\t426500\nHDMI分配器\t426501\n高建民\t426502\nSexFriend\t426503\n华龙区\t426504\n粉筒\t426505\n上杭\t426506\n烤漆\t426507\n剥开\t426508\n花音\t426509\n优然\t426510\n赵炜\t426511\nSVG格式\t426512\n安能快递\t426513\n禁养\t426514\n非管理员\t426515\n膨润土猫砂\t426516\nfmm\t426517\n不男不女\t426518\n4p\t426519\n万州人才网\t426520\n心线\t426521\nCancer\t426522\n肝掌\t426523\n立夏\t426524\n老乡会\t426525\n某某人\t426526\n機板\t426527\nj1\t426528\n二氯异氰尿酸钠\t426529\n预处理器\t426530\n国网吉林省电力有限公司\t426531\n飞得更高\t426532\n中共湖北省纪律检查委员会\t426533\n倩女幽魂手游刀客\t426534\n秒天\t426535\n十六七岁\t426536\n李紫婷\t426537\n列宁格勒\t426538\nblogging\t426539\n彭浦\t426540\n正价\t426541\nToMany\t426542\n急性乳腺炎\t426543\nStudio2010\t426544\nVAR\t426545\nENGINEER\t426546\n梦兰\t426547\ntrusted\t426548\n南路\t426549\nreceptor\t426550\n涨水\t426551\n同屏器\t426552\n集市\t426553\n转播\t426554\n上海高速公路\t426555\n国家旅游局办公室\t426556\n2.0%\t426557\n9个小时\t426558\n马姓\t426559\n中国音乐家协会\t426560\n绳结\t426561\n查理\t426562\n九届二次\t426563\n生死格斗\t426564\n北京金融街\t426565\nAB卷\t426566\nAbeBooks\t426567\n凉血\t426568\n盈透证券\t426569\n微光\t426570\n学富\t426571\n乌鞘岭\t426572\n0.35%\t426573\n李大国\t426574\nSerial\t426575\n县统计局\t426576\n古田\t426577\n三氯异氰尿酸\t426578\n打猪草\t426579\nwts\t426580\naarch64-linux\t426581\n湖北省人民检察院\t426582\n海特\t426583\n第四系\t426584\n千金小姐\t426585\nmason\t426586\n二手书\t426587\n山峡\t426588\ng711\t426589\n七十二层\t426590\n追到底\t426591\n飞科\t426592\nlatin\t426593\nNest\t426594\nsonarlint\t426595\n新闻众评\t426596\n破腹\t426597\n利率表信息网\t426598\nIntegrity\t426599\n直流变频\t426600\n石蒜\t4
26601\n入党志愿\t426602\n依赖症\t426603\ncosmo\t426604\n钩形\t426605\n捅死\t426606\n有意\t426607\n太康县\t426608\n煤矸石\t426609\nmax2015\t426610\n猫男\t426611\n尖\t426612\n瓶梅\t426613\n螺杆启闭机\t426614\n全海员\t426615\n沙涌\t426616\n墨者\t426617\n江夏村\t426618\n碧云轩\t426619\n以太网交换机\t426620\n国证\t426621\n退货\t426622\nmm2r\t426623\n冰剑\t426624\n钾离子\t426625\n惟妙惟肖\t426626\n脱成名\t426627\n手针\t426628\n第三十八\t426629\nSAMP\t426630\n无医\t426631\n户外装备\t426632\njqx\t426633\nWee\t426634\nspectral\t426635\n节衣缩食\t426636\nAnimenz\t426637\n花与蛇3\t426638\n重庆市审计局\t426639\n好时巧克力\t426640\n攀冰\t426641\nsourse\t426642\n小家\t426643\n海南航空\t426644\nleaf+\t426645\n恶魔岛\t426646\n颈椎操\t426647\nxiashengwang\t426648\n大湾区\t426649\nBKB\t426650\n讲故事\t426651\npoems\t426652\n南京市科委\t426653\n张维为\t426654\n边沟\t426655\n北京市第八十中学\t426656\n擅自\t426657\n鬼王\t426658\n卓君\t426659\n360n4s\t426660\n地理科学学院\t426661\n医心\t426662\n穷游网\t426663\n永泰小学\t426664\n17127\t426665\n连忙\t426666\n昭山\t426667\nslimscroll\t426668\n相电流\t426669\n同步盘\t426670\nHydra\t426671\n亏空\t426672\n创动\t426673\n国家标准委\t426674\nHanna\t426675\n1w1\t426676\n顶正\t426677\n老九门\t426678\n三星\t426679\n鑫宇\t426680\nf563\t426681\nHolographic\t426682\n大明星\t426683\n雷子\t426684\n重庆科技学院\t426685\n20160709\t426686\n噤\t426687\n内项\t426688\n职评\t426689\nSEAndroid\t426690\n上海科技教育出版社\t426691\n出宫\t426692\n票纸\t426693\ntmr\t426694\n仙妖劫\t426695\n地埋\t426696\n徐茶\t426697\n新章\t426698\nmysql8.0\t426699\n上海天气网\t426700\nXvid\t426701\n东螺五金制品有限公司\t426702\nBlaire\t426703\n填海\t426704\n民以食为天\t426705\nJRs\t426706\n意外伤害险\t426707\n大连渤海医院\t426708\n陆振华\t426709\n印记\t426710\n奥卡西平\t426711\nKross\t426712\nmileage\t426713\n扭断\t426714\n马瑞利\t426715\nOAuth2.0\t426716\n排屑机\t426717\n新白娘子传奇\t426718\n调协\t426719\n表态\t426720\n电功率\t426721\nspearman\t426722\n曾氏\t426723\nwp10\t426724\nsimplemind\t426725\n非上市股份有限公司\t426726\n验血\t426727\n监督管理局\t426728\n挣钱\t426729\n120号\t426730\n志在必得\t426731\nemwin\t426732\n43平米\t426733\nSizing\t426734\n2美元\t426735\n宝盖\t426736\n味美\t426737\n中和村\t426738\n洗衣槽\t426739\n伊蒂之屋\t426740\n喵街\t426741\n734\t426742\n合于\t426743\n深河\t426744\n置办\t426
745\n母仪\t426746\n花生榨油机\t426747\n农家乐\t426748\n黑龙王\t426749\nremoveAttr\t426750\n赞宇科技\t426751\nRealm\t426752\nrp\t426753\n杂兵\t426754\n小鹿犬\t426755\n麦收\t426756\nHRB400\t426757\n荣耀magicbook\t426758\nshell\t426759\n作\t426760\n皮甲\t426761\nprotecting\t426762\n服务式\t426763\n约取\t426764\nphpems\t426765\n足球迷\t426766\n逮住\t426767\n哈达\t426768\n路易斯维尔\t426769\nGerard\t426770\n132家\t426771\n麻省理工学院\t426772\n上盖\t426773\n南湾猴岛\t426774\n绿羽\t426775\n一点号\t426776\n年限\t426777\n红蓝色\t426778\n一二三线\t426779\nManager\t426780\ngauge\t426781\n安秀网\t426782\n银松\t426783\n观察者网\t426784\n不够\t426785\n除尘器\t426786\n雅思剑桥\t426787\n功用\t426788\n西红门店\t426789\n王德\t426790\n天权\t426791\n星野景子\t426792\n郑州五中\t426793\n2.8.7.0\t426794\n海棠苑\t426795\n金蜘蛛\t426796\n佰斯特\t426797\n都邦保险\t426798\nvenus\t426799\n导士\t426800\n公共资源交易中心\t426801\n股本\t426802\n上学好\t426803\ncnBeta\t426804\n一阁\t426805\ni5-8250U\t426806\n公望\t426807\n东汇城\t426808\n汇创\t426809\n蔡淳佳\t426810\n外球\t426811\n语序\t426812\n参赛者\t426813\n遗墨\t426814\nEs\t426815\n偏执型人格障碍\t426816\n殃及池鱼\t426817\n长阳\t426818\n超级碗\t426819\n天海佑希\t426820\n活鱼\t426821\n腐肉\t426822\n龙岗坪地\t426823\n关联查询\t426824\n起床\t426825\n东明路街道\t426826\n心电监护\t426827\n眷恋\t426828\n沙棘茶\t426829\n狭窄\t426830\n诸城市政府\t426831\ngeorge\t426832\nphthon\t426833\n中国医学科学院北京协和医院\t426834\n代理类\t426835\n800kV特\t426836\n巴克莱\t426837\n设有\t426838\n第11批\t426839\n影音先锋av资源站_影音先锋\t426840\n艾玛·杜蒙特\t426841\n高楼层\t426842\n再一起\t426843\n上海市委组织部\t426844\n负载\t426845\n第27页\t426846\n独生子女\t426847\n浙江省知识产权局\t426848\n永恒之塔\t426849\n锁链战记\t426850\n艳情片\t426851\n许路儿\t426852\nTies\t426853\n云茶\t426854\nappli\t426855\n中国国际电视总公司\t426856\n胡浩\t426857\n7400元\t426858\n劳驾\t426859\noppoR9s\t426860\n20pt\t426861\n夜桜\t426862\n杜丽\t426863\nrip协议\t426864\n易俗\t426865\n白娥娟\t426866\n_峰峰\t426867\nCMM\t426868\n资讯网\t426869\nnbt\t426870\n环境卫生\t426871\n股骨头\t426872\n撸撸吧|影音先锋电影|AV\t426873\n阿莎\t426874\n超薄\t426875\n万重山\t426876\n㎡\t426877\n神楽坂真冬\t426878\n中旅城\t426879\n美狗\t426880\n云拿月\t426881\nwonder\t426882\n马普托\t426883\n第六章\t426884\n灵鼎\t426885\ncollapsed\t426886\n茑屋书店\t426887\n20174\t426888\n深潭\
t426889\n城西路\t426890\n2017年3月20日\t426891\n电器\t426892\n李子成\t426893\n冬笋\t426894\nmengguo\t426895\n活塞\t426896\n主城\t426897\n北京文博\t426898\npareto\t426899\n退缩\t426900\n轻敌\t426901\n二圣宫村\t426902\n弄好\t426903\n雅居乐锦城\t426904\n调序\t426905\n小口\t426906\n迈耶\t426907\n不敢想\t426908\n大桥村\t426909\n盛产\t426910\nCBR\t426911\n上尚城\t426912\n毕业展\t426913\n8分钱\t426914\n卢勤\t426915\n数学学\t426916\n草线\t426917\n两万个\t426918\n载波\t426919\n解法\t426920\nPetite\t426921\nacooly\t426922\n5分钟之内\t426923\n贴贴\t426924\n塔子湖\t426925\n志存高远\t426926\n张光峰\t426927\n搭线\t426928\nCs\t426929\n宅男社\t426930\n襄阳市第一人民医院\t426931\nPS滤镜\t426932\n盘管\t426933\nseba\t426934\n3K\t426935\n淘抢购\t426936\n2017-12-21\t426937\n真挚\t426938\n451\t426939\n杜海马伊\t426940\n串流\t426941\n不得罪\t426942\n整流模块\t426943\n卧倒\t426944\n东阳市政府\t426945\n鸡脚\t426946\n子女\t426947\n浙江旅游职业学院\t426948\n山农\t426949\n【世\t426950\n奶豆腐\t426951\n沙埔镇\t426952\nB\t426953\n特洛耶·希文\t426954\n百词斩\t426955\n疹\t426956\n倡廉\t426957\n亚洲象\t426958\n老太监\t426959\n胡萝卜泥\t426960\nPRISM\t426961\n沙钢集团\t426962\nexcel格式刷\t426963\n南通开发区\t426964\n睡着\t426965\n沙船\t426966\n20150326\t426967\n先峰\t426968\n阿毛\t426969\n1700X\t426970\nvbox\t426971\n宁波明州医院\t426972\n哥特王朝4\t426973\n丹巴县\t426974\n赵茜\t426975\n大海战\t426976\n活命\t426977\ndiffusion\t426978\n风正一帆悬\t426979\nwhisky\t426980\n士子\t426981\n【探\t426982\n萤火共和国之辉\t426983\n30多斤\t426984\n中国认证认可协会\t426985\nsurveillance\t426986\nflashcs6\t426987\n2099\t426988\n南瓜园\t426989\namiibo卡\t426990\n踩着\t426991\n关于正确处理人民内部矛盾的问题\t426992\n知得失\t426993\n王月皓\t426994\n黑树\t426995\n比克电池\t426996\n颍上\t426997\nmium\t426998\n大陆\t426999\n参孙\t427000\n奶盘\t427001\n花匠\t427002\n股值\t427003\n冰淇淋机\t427004\n戚蓝尹\t427005\n卡洛驰\t427006\n坪上镇\t427007\n克子\t427008\n李尔\t427009\nrevelation\t427010\n忍者蛙\t427011\n羊口镇\t427012\n飞智游戏厅\t427013\n肌底液\t427014\n爱很简单\t427015\n莫衷一是\t427016\n人力资源管理师三级考试\t427017\nc++函数\t427018\n帮解\t427019\n喷射\t427020\nMindset\t427021\n西田卡莉娜\t427022\nx2hr\t427023\n潮福城\t427024\nirig\t427025\n超级粽子\t427026\n小山坡\t427027\n刘黑仔\t427028\n发病率\t427029\n独饮\t427030\nSole\t427031\nmultism\t427032\nxzc\t427033\n卫生监督
局\t427034\nCCTV少儿频道\t427035\nx3.2\t427036\ncsulennon\t427037\n200文\t427038\nHola\t427039\n港澳码头\t427040\n小菊\t427041\n中央电视台纪录频道\t427042\n星月菩提\t427043\n1639\t427044\n危险废物经营许可证\t427045\n隔膜片\t427046\n招兵买马\t427047\n金堂\t427048\n雕塑园\t427049\n薪资\t427050\n见多\t427051\n伊扬\t427052\n绝地求生刺激战场激情沙漠\t427053\n九里香\t427054\n小米5a\t427055\nEagle\t427056\n健肤\t427057\n李建勤\t427058\n伊丽莎\t427059\n仁川机场\t427060\n西安公交\t427061\n叁级\t427062\n2018.4.24\t427063\n北林区\t427064\n花葬\t427065\ndzzoffice\t427066\nchevrolet\t427067\nlove\t427068\n宁河\t427069\nDOCK\t427070\n悲苦\t427071\n临建\t427072\n成块\t427073\n百度翻译\t427074\n说值\t427075\n刘奕鸣\t427076\n陈小朵\t427077\n老前辈\t427078\n云南路\t427079\n华润万象城\t427080\n缝合机\t427081\n绝地求生刺\t427082\nvalence\t427083\n夜宿\t427084\n返岗\t427085\n特利\t427086\nIL\t427087\n18倍\t427088\ndecoding\t427089\n平昌冬奥会\t427090\n节本\t427091\n封标\t427092\n热游\t427093\n35m\t427094\n宝刀不老\t427095\n2152\t427096\n▄\t427097\n资管公司\t427098\n前一分钟\t427099\noasis\t427100\n三本学校\t427101\n杰奎琳·肯尼迪\t427102\n第08集\t427103\n萝莉网\t427104\n豪气\t427105\n卢洋洋\t427106\n异星战场\t427107\n深圳园博园\t427108\n徐冠巨\t427109\n核桃皮\t427110\nM42\t427111\n三国魂\t427112\n独到之处\t427113\n中国国际设计博物馆\t427114\nVCD-RMVB\t427115\n广安职业技术学院\t427116\n游戏王ygocore\t427117\n等效应力\t427118\n树莓派2\t427119\nGarfieldEr007\t427120\n高热\t427121\n斗殴案\t427122\n秦昭襄王\t427123\n侯卫东官场笔记\t427124\n局域网共享\t427125\n大连理工大学软件学院\t427126\n训令\t427127\n中国农村信用社\t427128\n月柱\t427129\n石雪\t427130\n20150101\t427131\nhrs\t427132\n不著\t427133\napowersoft录屏王\t427134\n白色巨塔\t427135\n爱你\t427136\n中国大唐集团\t427137\n高新发展\t427138\nsessionstorage\t427139\n国际公寓\t427140\n试学\t427141\n周春\t427142\n1dx\t427143\n药案\t427144\n所欲\t427145\nHotsum\t427146\n第二个\t427147\n马卡斯城\t427148\n重庆外国语学校\t427149\n龙潭湖\t427150\n液晶触摸屏\t427151\n雨花斋\t427152\n大仪\t427153\n韶\t427154\n樊京辉\t427155\n手腕流\t427156\n金融战\t427157\n冒泡\t427158\n玉置浩二\t427159\n意意\t427160\n换门\t427161\n毕福剑\t427162\n发梳\t427163\n女老大\t427164\nNVIDIA\t427165\nUmbrella\t427166\n滑润\t427167\n算算账\t427168\n铝灰\t427169\n孙八一\t427170\n小红伞免费版\t427171\nElectromagnetics\t427172\nGQI\t427173\n汉式\t4271
74\n广富林遗址\t427175\n黄厚江\t427176\n750w\t427177\n葫芦瓜\t427178\nVOLKSWAGEN\t427179\n好逑\t427180\n上海协和双语学校\t427181\nx2m\t427182\n奥兰多环球影城\t427183\n头号玩家\t427184\n第一现场\t427185\n正茂\t427186\n提价\t427187\n动漫无损音乐下载资讯站_ACG漫音社\t427188\n中国中元国际工程有限公司\t427189\n二维空间\t427190\n伤心人\t427191\n220斤\t427192\n骁\t427193\n西柏坡纪念馆\t427194\n第22天\t427195\nmiss\t427196\n1500平米\t427197\n60亿美元\t427198\nadverse\t427199\n铜片\t427200\n审批表\t427201\n受尽\t427202\n毛猫\t427203\n洪智\t427204\n巴普洛夫\t427205\ninternal\t427206\n申通快递\t427207\n首波\t427208\n105b\t427209\n邹伟\t427210\npc管\t427211\n16%\t427212\n凤女\t427213\n吴碧霞\t427214\n监控宝\t427215\n0.9.5\t427216\n元气石\t427217\n倾向性\t427218\nteamLab\t427219\n五档\t427220\nWAV整轨/\t427221\n公主日记\t427222\n5张\t427223\n两颊\t427224\ninterpreting\t427225\n硬通货\t427226\nmadilyn\t427227\nrelease\t427228\nexport\t427229\n好想念\t427230\n菜籽油\t427231\n战盾\t427232\n道者\t427233\n海生\t427234\n埃弗拉\t427235\nthinkPHP\t427236\n华师大附小\t427237\n艾利\t427238\n800岁\t427239\n迅雷播放器\t427240\n温湿度\t427241\nsignup\t427242\n中意人寿\t427243\n姚辉\t427244\n无机颜料\t427245\n后势\t427246\n九江汽车站\t427247\n天使之争\t427248\nlind\t427249\n歆\t427250\n西替利嗪\t427251\n贝塔\t427252\n鸿信\t427253\n年后\t427254\n狂雷\t427255\n兰桂坊\t427256\n不老林\t427257\nhCG\t427258\n撒拉\t427259\n卡宾\t427260\n低胸\t427261\n传奇服务器\t427262\n中国电信浙江公司\t427263\n挑选\t427264\ncollage\t427265\nCrazyBingo\t427266\n三角波\t427267\n哲哲\t427268\n默\t427269\n外卖\t427270\n嘟嘟城\t427271\n四不两直\t427272\n446\t427273\n龙华路\t427274\nfathers\t427275\ngoahead\t427276\n挖潜\t427277\nPER\t427278\n广和通\t427279\n奥太\t427280\n陈慧\t427281\n宙斯众神之王\t427282\n气人\t427283\n第三项\t427284\n怡安翰威特\t427285\n非常好\t427286\n上海交通大学网络教育学院\t427287\n倍耐力轮胎\t427288\n半导体\t427289\n皮床\t427290\n第116\t427291\nnRF52832\t427292\n瓯网数字报\t427293\n大卖家\t427294\n黑石集团\t427295\n希露\t427296\n恋脚\t427297\nmiyake\t427298\nmartiderm\t427299\n第9章\t427300\n15.24\t427301\n步孤天\t427302\n洛阳科技职业学院\t427303\n识物\t427304\n东鹏瓷砖\t427305\nideos\t427306\n细胞质\t427307\n品房\t427308\n迭戈科斯塔\t427309\n夜生活网\t427310\n唐山市政府网\t427311\n航世\t427312\n重庆小升初\t427313\n社交\t427314\n佛山东\t427315\n师大\t427316\
n突泉\t427317\nNewCapec\t427318\nv470\t427319\n幻色\t427320\nVSphere\t427321\n城市规\t427322\n防晒系数\t427323\nPPT网\t427324\n1245\t427325\n5,6月份\t427326\n办照\t427327\n270天\t427328\n幻想世界大穿越\t427329\n陕西消防\t427330\n20150402\t427331\n国玺\t427332\nyunfu\t427333\n1721\t427334\n天地一号\t427335\n宝冠\t427336\n因为爱情有幸福\t427337\n应纳税所得额\t427338\n设施农业\t427339\npr2015\t427340\n南纺股份\t427341\n流脑\t427342\nhonma\t427343\n怪物\t427344\n第65号\t427345\nAbort\t427346\n做会\t427347\n爱耳日\t427348\ninset\t427349\nacknowledged\t427350\np51s\t427351\nZK\t427352\n华润五丰\t427353\n九游版\t427354\n好别\t427355\nSpyware\t427356\n1932\t427357\ncentering\t427358\n魂骨\t427359\n伊娃·格林\t427360\nFileVault\t427361\n生姜\t427362\n十三日\t427363\n宪法修正案\t427364\n曾智希\t427365\nwoof\t427366\n管理层\t427367\n兴隆镇\t427368\n丁烯\t427369\nOPENGL\t427370\n502路\t427371\n板柱\t427372\n三枪\t427373\n免疫工具\t427374\nabroad\t427375\n高潭\t427376\n下灯\t427377\n四川中烟\t427378\n双边\t427379\n烟雾\t427380\n人椅\t427381\n陈光辉\t427382\n18度\t427383\n抓饭\t427384\n标准误差\t427385\n负责制\t427386\n中融信托\t427387\nMfg\t427388\n音包\t427389\n胡玥\t427390\ncasual\t427391\nOffice办公软件\t427392\n超新星纪元\t427393\n胡里山炮台\t427394\nqueer\t427395\n4000K\t427396\n万余条\t427397\nJMP\t427398\n高尔夫球\t427399\n余先生\t427400\n隆兴寺\t427401\n你是我要的爱情\t427402\nG2030\t427403\n理财通\t427404\nwhx\t427405\n自民党\t427406\n击剑\t427407\n省标\t427408\n总站\t427409\nAccor\t427410\nOFFICE2013\t427411\n晚上十二点\t427412\nzx8\t427413\n网区\t427414\n管所\t427415\nMediaElement\t427416\n民主生活会若干规定\t427417\n通鼎互联\t427418\n凯里欧\t427419\n10秒\t427420\n日野\t427421\n粤东西北\t427422\naescripts\t427423\n大卫·科波菲尔\t427424\n720P高清\t427425\n马武\t427426\n兔玩网英雄联盟\t427427\n福昕PDF\t427428\n覆没\t427429\n陈宝琛\t427430\n密码战\t427431\n嵩口古镇\t427432\n远扬\t427433\n台币\t427434\n摸清\t427435\n3187110\t427436\n创世中文网\t427437\nCC1101\t427438\n荒石\t427439\n百度影\t427440\n毛衣链\t427441\nbmi指数\t427442\n18057期\t427443\n学渣\t427444\n肺经\t427445\n流光溢彩\t427446\n外挂机\t427447\n朴正炫\t427448\n建堆\t427449\nlsj\t427450\nrefugee\t427451\n公法\t427452\neBooks\t427453\n永不退缩\t427454\niDanMu\t427455\nLively\t427456\n周测\t427457\n闽都\t427458\
n橹管\t427459\n李世英\t427460\n绿矾\t427461\n安成\t427462\n1段\t427463\n注册机\t427464\n鲍云\t427465\n义乌市市场监督管理局\t427466\n凶间\t427467\n成吉思汗\t427468\n牛鱼\t427469\n铁傀儡\t427470\n震荡市\t427471\nsire\t427472\n84A\t427473\n10本\t427474\n工商登记号\t427475\n赌服输\t427476\nlol半价吧\t427477\n后板\t427478\nENDNOTE\t427479\nGracie\t427480\n简表\t427481\ndupont\t427482\n工业除尘器\t427483\ninclud\t427484\nv1.3.3\t427485\n夜盘\t427486\n荣格\t427487\nTMPGEnc\t427488\ngamersky\t427489\n底盆\t427490\nrstrip\t427491\n体藓\t427492\n公积\t427493\n掠食者\t427494\n圆卡\t427495\n套和\t427496\njietu\t427497\n板框压滤机\t427498\ngivenchy\t427499\n梅州市人民政府\t427500\n朱桥\t427501\n蛋白棒\t427502\nDSG\t427503\n电咖\t427504\n1677\t427505\n书上\t427506\n安达唐\t427507\n重庆共青团\t427508\n寒冰7\t427509\n外贸营销\t427510\nparasitic\t427511\n刀豆\t427512\n湖南人文科技学院\t427513\n任意页\t427514\n普清版\t427515\n遍历时\t427516\n微谱\t427517\nac86u\t427518\n锐奇\t427519\nRestrictions\t427520\n一千年以后\t427521\n挖苦\t427522\n系谱\t427523\naomen\t427524\ninjured\t427525\n定价权\t427526\n连接件\t427527\n涕泪\t427528\n贾大山\t427529\n创维集团有限公司\t427530\n好315创业网\t427531\n张大龙\t427532\n新世界大丸百货\t427533\n垃圾车\t427534\n再走一步\t427535\nmelanie\t427536\nmingaixin\t427537\nhydride\t427538\n山西医科大学第一医院\t427539\n炊事员\t427540\n西安站\t427541\n喝下\t427542\n福牌\t427543\n第一辆\t427544\n圣人\t427545\ndevos\t427546\n积碳清洗剂\t427547\n陈瑞华\t427548\n自愧不如\t427549\n北京国康医院\t427550\n落拓\t427551\nGay片\t427552\n得到\t427553\nCANCER\t427554\nJu\t427555\n固溶体\t427556\n理体\t427557\n凤庆县\t427558\n桑普\t427559\n10户\t427560\n球蟒\t427561\nphpstorm2018\t427562\nd3d11\t427563\n天涯福克斯\t427564\nEXIF\t427565\n黑环\t427566\n四川交通职业技术学院\t427567\n追凶者\t427568\n攻城师\t427569\n林疯狂\t427570\n几个亿\t427571\n广州市工商局\t427572\nNavicatforMySQL\t427573\nword2016\t427574\n雷剧\t427575\ntraveler\t427576\n钱景\t427577\n3万亩\t427578\n几十公里\t427579\nformdata\t427580\n泸西县\t427581\nRedbubble\t427582\n第23话\t427583\nagreements\t427584\n1652\t427585\n吸粮机\t427586\n浆糊论坛-RO小站\t427587\n胜过\t427588\nrayleigh\t427589\n机械战警\t427590\n马凡氏综合征\t427591\n望海潮\t427592\n情动\t427593\n晓明\t427594\n邱淑李雨霏\t427595\n10G\t427596\n神马影院_第九影院\t427597\n改进版\
t427598\n城市市容和环境卫生管理条例\t427599\n壮大\t427600\naccessible\t427601\nmanbetx\t427602\n死亡笔记\t427603\n二叉树\t427604\nSS套\t427605\n娃酷网\t427606\n样衣工\t427607\nsu2017\t427608\nARMv7\t427609\n两瓶\t427610\n生物人\t427611\n小蚁币\t427612\n成山头\t427613\n春华镇\t427614\ndoinb\t427615\n回家\t427616\n脏牧\t427617\n鲜奶\t427618\n生殖科\t427619\n美高清\t427620\n长沙市普通中小学\t427621\n裘继戎\t427622\n华越\t427623\n|七喜\t427624\n新会展中心\t427625\n双工\t427626\n思凯乐\t427627\n社保局\t427628\nrzsz\t427629\n欢乐斗牛\t427630\n厂库\t427631\n井浦\t427632\n网球公开赛\t427633\n欧妮\t427634\n丢帧\t427635\n新华丝路\t427636\n2头\t427637\n山鬼谣\t427638\n冷寿光\t427639\n教务员\t427640\n吸风\t427641\n源石\t427642\nsametime\t427643\n11年前\t427644\njmockit\t427645\n优缺\t427646\n雅静\t427647\n协审\t427648\n头骨\t427649\n毛万春\t427650\n100大\t427651\n翔田\t427652\n七色光\t427653\n万载县\t427654\n肠癌\t427655\n仙族\t427656\n花类\t427657\n千峰南路\t427658\n鹿客classic\t427659\n代养\t427660\n殷飞\t427661\n第43期\t427662\nMILF\t427663\ncontracts\t427664\n汝阳县\t427665\n国贸三期\t427666\n融资性担保\t427667\n合水县\t427668\n鬼鬼\t427669\n何兆熊\t427670\n孪生\t427671\n思美传媒\t427672\n白羊座\t427673\n圣徒\t427674\n尖峰时刻\t427675\n残缺\t427676\n天恒娱乐\t427677\n7.30\t427678\n国营贸易\t427679\n东湖生态旅游风景区\t427680\n上海君瑞生物技术有限公司\t427681\n冷运\t427682\n秦学教育\t427683\n毛铺苦荞酒\t427684\n铜陵学院\t427685\ncmcc\t427686\n豆子\t427687\n10300\t427688\n心级\t427689\n蓝天使\t427690\nheli\t427691\n重整案\t427692\na20\t427693\nface\t427694\n千分\t427695\n石祥路\t427696\n排距\t427697\n补提\t427698\ndamien\t427699\n6022\t427700\nastroneer\t427701\n高权重\t427702\n大总结\t427703\n忏悔录\t427704\n拒腐防变\t427705\n王贵\t427706\n千林\t427707\n遵义市区\t427708\n32号\t427709\n不甘落后\t427710\n太空站\t427711\n回滚段\t427712\n悖\t427713\n仁政\t427714\nAFNetWorking\t427715\n第几名\t427716\nMinneapolis\t427717\n肠胀气\t427718\n试验性\t427719\n五莲红\t427720\n赛麟\t427721\nGERBER\t427722\nacca\t427723\n玉泉寺\t427724\n跟踪\t427725\n帐册\t427726\n拆单\t427727\nMovie\t427728\n弘德\t427729\nslk\t427730\n配乐诗朗诵\t427731\nPfizer\t427732\n欣兰\t427733\n碧眼\t427734\n玄幻\t427735\n打卡机\t427736\n兰拓\t427737\nRO\t427738\n中共湖北省委\t427739\nshequ\t427740\nspringloaded\t427741\n王昶\t427742\naxis2\t427743\n
深圳宝安中学\t427744\nMINI2\t427745\n夏港\t427746\ngt800\t427747\n姚科\t427748\n转户\t427749\n宿迁职业技术学院\t427750\n乱红\t427751\n有感\t427752\n仕宦\t427753\n21世纪大学英语读写教程\t427754\n50亿元\t427755\n番茄钟\t427756\n灭荒\t427757\nexporting\t427758\n向海岚\t427759\nv4.11\t427760\naccuracy\t427761\n中航信\t427762\n爱奇艺腾讯\t427763\n辜振甫\t427764\nContract\t427765\n栾克军\t427766\nWil\t427767\n洪森\t427768\n安逸花\t427769\nHiveQL\t427770\n线控\t427771\n拜县\t427772\nkps\t427773\n无翼鸟邪恶漫画之母性泛滥全集\t427774\n12L\t427775\n城池\t427776\n珂朵莉\t427777\n发育树\t427778\n_史\t427779\nハンバ\t427780\n建设工程经济\t427781\nalgae\t427782\n学管师\t427783\n拉孜县\t427784\n童话镇\t427785\n少数人\t427786\nhilation\t427787\nsin30\t427788\n一定时\t427789\n皖南医学院\t427790\nCounselor\t427791\n哀痛\t427792\nWINCC\t427793\n舍生取义\t427794\n800平\t427795\n货郎\t427796\nⅩ\t427797\n修眉\t427798\n军用品\t427799\n瑞芯微\t427800\n嚯\t427801\n燕语\t427802\nYUKI\t427803\n中华人民共和国国务院\t427804\n塑胶模具\t427805\n马蕴雯\t427806\n衣机\t427807\ninvoke\t427808\nwto\t427809\n流响\t427810\n娱乐类\t427811\n凡人修真2\t427812\n文艺点\t427813\n韩勇\t427814\n香汗\t427815\n保监局\t427816\n最古怪\t427817\n许秋怡\t427818\n南通市国土资源局\t427819\n超选四驱\t427820\n简单生活\t427821\n天津南开中学\t427822\n密度\t427823\n探底\t427824\n大同乡\t427825\n1200块\t427826\n整形科\t427827\n概览\t427828\n加长版\t427829\n艺术性\t427830\nPopen\t427831\n人行天桥\t427832\n昆山房产网\t427833\n成都本\t427834\nGT630\t427835\nJurassic\t427836\n玻璃酸钠滴眼液\t427837\nnoto\t427838\n搭电\t427839\nsnapshots\t427840\n21寸\t427841\ngaap\t427842\n多项式\t427843\n250CC\t427844\n暖暖内含光\t427845\n韩小北\t427846\n锡兰\t427847\n完全恶女\t427848\n受助者\t427849\n上海工行\t427850\n金融银行\t427851\n残余\t427852\n窦店镇\t427853\n春雨教育\t427854\n眼轴\t427855\n汉宫秋月\t427856\n港中旅\t427857\n复唧唧\t427858\n7月12日\t427859\nAutoCAD2016\t427860\n立于\t427861\n新率\t427862\n500KVA\t427863\n欢颜\t427864\n货币战争\t427865\n蒙代尔\t427866\n圣境\t427867\n四层\t427868\n赣人社\t427869\n利巴韦林颗粒\t427870\n在场\t427871\n同文\t427872\nfifa18\t427873\n在编译\t427874\nrivers\t427875\n三亚人才网\t427876\n鸡枞\t427877\n工作日\t427878\n直线轴承\t427879\n眼疲劳\t427880\n桑普太阳能\t427881\n安诚\t427882\nkiko\t427883\nProtecting\t427884\n江埔街\t427885\n胞弟\t427886\n桥头堡论坛\t
427887\n玉石\t427888\n晓雯\t427889\n2408\t427890\n湖南省委组织部\t427891\nEplan\t427892\n开元股份\t427893\nMJstyle\t427894\n正阳步行街\t427895\n金杯电工\t427896\n田虎\t427897\n品质化\t427898\n伟佳\t427899\n不太\t427900\n1997年7月1日\t427901\nlova\t427902\nkawaii\t427903\n安全型\t427904\n三十元\t427905\n鸭肠\t427906\nHDPE土工膜\t427907\n2018年04月10日\t427908\n中青在线\t427909\n纪元2205\t427910\n易语言零起点\t427911\nArmy\t427912\n地台板\t427913\n郑州铁路职业技术学院\t427914\n特拉华州\t427915\n褒姒\t427916\n刷墙\t427917\n解放军306医院\t427918\n柒个我\t427919\n软语\t427920\nip详解\t427921\n那达慕大会\t427922\n骡马\t427923\n综述\t427924\nv2.5.8\t427925\n寡民\t427926\nBOT\t427927\n经纪商\t427928\nMC世界侠\t427929\n1785\t427930\n伊犁哈萨克自治州政府网\t427931\n顺达驾校\t427932\n北沙滩\t427933\ndr3\t427934\nGLK\t427935\n欧洲东部\t427936\n徐增平\t427937\n保税区\t427938\n月令\t427939\n做长\t427940\n13元\t427941\n奴\t427942\n遗赠抚养协议\t427943\n浣熊市\t427944\n最高调\t427945\nNO2\t427946\n1S\t427947\nDynamically\t427948\n网页翻译\t427949\nyuansu\t427950\n大规\t427951\nHSRP\t427952\nworldwide\t427953\ncnljh\t427954\n360挂靠网\t427955\nhide\t427956\n乡音\t427957\n天虫\t427958\n万好\t427959\nwin10命令提示符\t427960\n100场\t427961\n海默科技\t427962\n挺直\t427963\nacoustics\t427964\n烘干塔\t427965\n云桂铁路\t427966\n住房公积金|华律\t427967\nsnh\t427968\ntele\t427969\n樽\t427970\nsokit\t427971\n银魂\t427972\n饺子馅\t427973\n爱多\t427974\nelements\t427975\n招商会\t427976\nLeoBoy\t427977\n干物妹小埋吧\t427978\n卷帘机\t427979\n子陵\t427980\n多极化\t427981\nWiFi版\t427982\n第一泡\t427983\n时至\t427984\n防爆叉车\t427985\n新天地\t427986\n火山岛\t427987\n微信名\t427988\ncaffemodel\t427989\n充质干细胞\t427990\n任性用\t427991\n校对版\t427992\n门锁体\t427993\n马尼拉机场\t427994\nS60L\t427995\ninfocomm\t427996\n甘肃林业网\t427997\nCassie\t427998\n1130136248\t427999\n一途\t428000\ncaopoo97\t428001\nwindosw\t428002\n蜈蚣洞\t428003\nkcal\t428004\n旧货\t428005\n多喜\t428006\n分店\t428007\n分位值\t428008\n测点\t428009\nlittle\t428010\n民事诉讼程序\t428011\n班制\t428012\n冰鉴\t428013\n江城\t428014\n旋风除尘器\t428015\n十几分\t428016\n双髻鲨\t428017\n科学城\t428018\n打麻将\t428019\nDSN\t428020\n漏洞\t428021\n预紧\t428022\n文华奖\t428023\n丁健\t428024\n成都铁路运输学校\t428025\ncqc\t428026\n市县\t428027\n女报\t428028\n南京大学电
子科学与工程学院\t428029\n西顿\t428030\n马麟\t428031\n9品\t428032\n北京博物馆\t428033\nnodebb\t428034\n珠海国际赛车场\t428035\n作曲者\t428036\n或然\t428037\nコ\t428038\n扬州日报社\t428039\nCalendar\t428040\n脚纹\t428041\n股骨干骨折\t428042\n四川省妇幼保健院\t428043\nEXCEPTION\t428044\n称骨\t428045\n助攻\t428046\nmesa\t428047\n哈利奎因\t428048\nv\t428049\nblake\t428050\n李肃\t428051\n警示牌\t428052\nwrecking\t428053\n广州市妇女儿童医疗中心\t428054\n步行者队\t428055\n打翻\t428056\n巴尔韦德\t428057\n和别\t428058\npsutil\t428059\n能剧\t428060\n环境公报\t428061\ndtcms\t428062\n常州市天宁区人民政府\t428063\n重生之衙内\t428064\n分部门\t428065\n曹杨\t428066\n白鹿仓\t428067\n瓜州县人民政府\t428068\n小像\t428069\n郝蕾颐\t428070\n消费水平\t428071\n5.41\t428072\n0.9米\t428073\n干线\t428074\n梁洁\t428075\n阿冒\t428076\n安徽三联学院\t428077\n福建站\t428078\n冰锥\t428079\n永兴岛\t428080\n酱色\t428081\n44种\t428082\n始兴\t428083\n繁体服\t428084\n科达股份\t428085\n凤香型\t428086\n綦江县\t428087\n灼华\t428088\n荣安府\t428089\n神农溪\t428090\n星期一到星期日\t428091\n猪血丸子\t428092\nc++string\t428093\n归真堂\t428094\n模拟屏\t428095\n贾珍\t428096\n一个0\t428097\n拦标\t428098\n成长日记\t428099\nMySQL5.7\t428100\nexecutemany\t428101\nLala\t428102\n二滩水电站\t428103\n胺类\t428104\n北京中科白癜风医院\t428105\n多谢\t428106\n宝券\t428107\n1天前\t428108\n5二\t428109\n新联康\t428110\n整饰\t428111\nTechniques\t428112\n天妒英才\t428113\n富国岛\t428114\n桃源仙谷\t428115\n美女护士\t428116\n会通\t428117\n狼狗\t428118\n水富\t428119\nGuizhou\t428120\n封体\t428121\n63集\t428122\n密涅瓦\t428123\n来者居上\t428124\n數\t428125\n颍州\t428126\n龙东大道\t428127\n葛温德林\t428128\n柔雅\t428129\n芦荟\t428130\n开马\t428131\nv2.20\t428132\n党务\t428133\n边牧犬\t428134\n采阴\t428135\n克莱斯勒300c\t428136\n苹果片\t428137\n泰国女足\t428138\nDeskJet\t428139\nCFC\t428140\n港股\t428141\n正阳县\t428142\n广东大厦\t428143\n黑木耳\t428144\n比熊俱乐部\t428145\n独眼巨人号\t428146\nMercy\t428147\nv1.1.3_\t428148\n蓬莱阁\t428149\n格兰杰\t428150\n铸管\t428151\n九回\t428152\n放声\t428153\n近代史\t428154\nreare\t428155\n深兰\t428156\n利润表\t428157\n细石混凝土\t428158\n类型\t428159\n蟋蟀\t428160\n拉姆塞\t428161\nsinica\t428162\n小昆山镇\t428163\n斯玛特\t428164\n怪诞行为学\t428165\n皇氏\t428166\nconkeyn\t428167\n珠海酒店\t428168\npussies\t428169\n50万吨\t428170\n红色\t428171\n考官\t428172\n泰安银行\t42
8173\n奥迪q7\t428174\n状元楼\t428175\n雾剂\t428176\nInsufficient\t428177\nScroll\t428178\n恐吧电影网\t428179\n篮协\t428180\nSmoothie\t428181\n嫡女重生记\t428182\n掣\t428183\ntess4j\t428184\n81式\t428185\n佳农\t428186\n天天招生网\t428187\n双性生子\t428188\n1287\t428189\n丰盛控股\t428190\n长虹世纪荣廷\t428191\n码字\t428192\nCOM\t428193\n林白\t428194\n乾隆王朝\t428195\ngap\t428196\n燕山公园\t428197\ncrs\t428198\neles\t428199\n低星\t428200\n10英尺\t428201\nINNO\t428202\n沙山\t428203\n纪特\t428204\n盗洞\t428205\n文物局\t428206\n王多多\t428207\n福特\t428208\nuy\t428209\n興\t428210\n急救箱\t428211\nterr\t428212\n孔祥超\t428213\n王原祁\t428214\n孟静\t428215\n阎学通\t428216\nASEAN\t428217\n王爱国\t428218\n中山大学旅游学院\t428219\n史无前例\t428220\n散排\t428221\n30名\t428222\njide\t428223\n球群\t428224\n十来万\t428225\n任志强\t428226\n铱铂金火花塞\t428227\ngeographic\t428228\nkpop\t428229\n杀案\t428230\nthee\t428231\nmatlab2018a\t428232\n魔龙\t428233\nmcgrath\t428234\n一百兆\t428235\n索尼公司\t428236\n莫妮卡\t428237\n台北路\t428238\n钱咖\t428239\n脱衣舞俱乐部\t428240\n人教版6\t428241\n珊迪\t428242\n科隆\t428243\n背漆\t428244\nSAA\t428245\n贺州市人民政府\t428246\n董子\t428247\n万和\t428248\n社会主义改造\t428249\n巧囊\t428250\n家话\t428251\n二连浩特市人民政府\t428252\n借借\t428253\n拉夏贝尔\t428254\n麦当劳肯德基\t428255\n冷阱\t428256\nGladius\t428257\n路甲\t428258\n撇开\t428259\n交易师\t428260\n通邮\t428261\n77年\t428262\n援建\t428263\n东方今报\t428264\nQQ音速\t428265\nopensuse\t428266\n舟山东极岛\t428267\nWebix\t428268\n山东交通职业学院\t428269\n暴增\t428270\n亿家\t428271\n全面建成小康\t428272\n财视\t428273\n96路\t428274\n困者\t428275\n清正\t428276\n日产楼兰\t428277\n袁辉\t428278\n棉纤维\t428279\nlinkedlist\t428280\n爱德华兹\t428281\n居务\t428282\nMSP430单片机\t428283\n安迪沃霍尔\t428284\n鲁商\t428285\n窄\t428286\nv8.1.0\t428287\n厦滘\t428288\n起步\t428289\n帝道\t428290\n红菜苔\t428291\n锔\t428292\n中转部\t428293\n张世\t428294\n天下无贼\t428295\n七宝镇\t428296\n回到家乡\t428297\nuncle\t428298\n失却\t428299\n地煞\t428300\n甲苯二异氰酸酯\t428301\n热核\t428302\nc25\t428303\n莱茵半岛\t428304\nsafar\t428305\n玥\t428306\n富贵荣华\t428307\n汉翔\t428308\n死神来了6\t428309\n氯化锰\t428310\n活虫\t428311\n岱宗夫\t428312\n注册电气工程师考试\t428313\nCipher\t428314\n林杰\t428315\nrw\t428316\nGT798.COM\t428317\n玄甲苍云\t428318\n凯
恩之角_暗黑破坏神\t428319\n输电\t428320\nF座\t428321\n茉茉\t428322\nhichina\t428323\n2085\t428324\nwin200\t428325\nhc360\t428326\n3810\t428327\n那片海\t428328\nrwx\t428329\n冯其庸\t428330\n氧os\t428331\n进出港\t428332\n中国科学技术大学出版社\t428333\n海斯特\t428334\ntremendous\t428335\n阳子\t428336\n空闲\t428337\n当你\t428338\n苏星河\t428339\n9月7日\t428340\n无疆\t428341\n任务栏\t428342\n棺椁\t428343\n小别离\t428344\n于戈\t428345\n恒力\t428346\n钠离子\t428347\n中华衣柜网\t428348\n脏辫\t428349\n张一\t428350\n沈阳广播电视台\t428351\n玖伍\t428352\n罗湖体育馆\t428353\n河南省人社厅\t428354\n_书书网\t428355\n税收协定\t428356\n老油条\t428357\n感冒软胶囊\t428358\n欧洲银行\t428359\n五十分\t428360\n芜湖市环保局\t428361\n102期\t428362\n东莞市光明中学\t428363\n吉林工商学院\t428364\n潇潇\t428365\n橙果\t428366\n3a\t428367\n陈文斌\t428368\n湾区\t428369\n2.5K\t428370\n造价员考试\t428371\n特战旅\t428372\nMCD\t428373\n天结\t428374\nNobody\t428375\n地下城堡\t428376\n客情\t428377\n曹家村\t428378\nKZ\t428379\nsnkrs\t428380\n放逐者\t428381\n腹毛\t428382\n天涯问答\t428383\n斋主\t428384\ndnspy\t428385\n安医二附院\t428386\n中央编办\t428387\n创新者\t428388\ndifferential\t428389\n自编码器\t428390\n恒强制版\t428391\nndis\t428392\n间歇\t428393\n北京东方园林环境股份有限公司\t428394\n哄抬\t428395\n好利来蛋糕店\t428396\n咪咕动漫\t428397\n苏鲜生\t428398\n月神\t428399\nSer\t428400\ndcp\t428401\n私隐\t428402\n民粹主义\t428403\n蚊子块\t428404\n村干部\t428405\n填色画\t428406\n唰\t428407\n可微\t428408\nPai\t428409\npanamera\t428410\n洋仔\t428411\n自动档\t428412\n枫蓼肠胃康颗粒\t428413\n240平米\t428414\n四层板\t428415\n街网\t428416\n苏州市吴江区人民政府\t428417\n蕲州\t428418\n脓性\t428419\n受养\t428420\n账片\t428421\nbrisbane\t428422\n速印\t428423\n乔静\t428424\n人民政府组织法\t428425\n张媛\t428426\n接见\t428427\n短片集\t428428\n麦克杰克逊\t428429\n雨过\t428430\n兽医\t428431\n文化创意有限公司\t428432\n双片\t428433\n三藏算命网\t428434\n胃胀气\t428435\n求存\t428436\n压迹\t428437\nPBT\t428438\ncygwin\t428439\n女会\t428440\n买卖双方\t428441\nfoe\t428442\n舰上\t428443\n04月04日\t428444\n中国共产党的九十年\t428445\n珈伟股份\t428446\n理符\t428447\n耽美剧\t428448\n申申\t428449\n华住集团\t428450\n逍遥镇\t428451\n互联网+教育\t428452\n河南日报\t428453\n某男\t428454\n练\t428455\nv1.14\t428456\n45天\t428457\nk=0\t428458\nNo1\t428459\n调零\t428460\n李德全\t428461\n海事服务网\t428462\nショップ\t428463\n灭菌柜\
t428464\n工时费\t428465\n邪魅\t428466\n左金丸\t428467\n吸尻鬼\t428468\n小度机器人\t428469\n烛镇\t428470\n长江源\t428471\n增创\t428472\n窦靖童\t428473\n高德软件有限公司\t428474\n标杆\t428475\n鬼哥\t428476\n视译\t428477\nelasticsearch\t428478\nstudio\t428479\n1.0T\t428480\n志摩子\t428481\n猪场\t428482\n排序法\t428483\n小冰冰传奇\t428484\n苟同\t428485\n魏三\t428486\n格拉古尔\t428487\n太行水镇\t428488\n注射机\t428489\nFucked\t428490\n众益\t428491\nvensim\t428492\n别苑\t428493\nFriendships\t428494\nMigrations\t428495\n贵州都市报\t428496\n加工型\t428497\n_新手站_巴士英雄联盟\t428498\n安苏\t428499\n笛曲\t428500\n驰骋\t428501\nSummary\t428502\n支付宝\t428503\n青汁\t428504\n中文慕课\t428505\n株洲西站\t428506\n华强集团\t428507\n雪具\t428508\n沱牌\t428509\n给位\t428510\n美好建筑装配科技有限公司\t428511\nigcse\t428512\n布里斯班\t428513\n卡拉斯\t428514\n薄壁\t428515\ndoc-毕业论文\t428516\n家边\t428517\n悦达\t428518\n龙溪大道\t428519\ngraphics2d\t428520\n佩鲁斯\t428521\n燕山大学\t428522\n吊白块\t428523\n天津大胡同\t428524\n爱普生L360\t428525\ncartoon\t428526\n促睾\t428527\n选择项\t428528\nsf9\t428529\nX470\t428530\n1381\t428531\n鲜有人\t428532\n村企\t428533\n智久\t428534\n低廉\t428535\n闰土股份\t428536\n隐者\t428537\n卡桑德拉大桥\t428538\nxunsearch\t428539\n4K屏\t428540\n赵铁柱\t428541\n西西杀破狼\t428542\nTcp\t428543\n辽宁卷\t428544\n高风险\t428545\n1粒\t428546\n看风\t428547\n漏墨\t428548\n复颜\t428549\n110平方米\t428550\nNaive\t428551\n华策集团\t428552\n三峡水利\t428553\n弈城\t428554\n翔安机场\t428555\n小升初划片\t428556\n正义感\t428557\nsatan\t428558\nDIABOLIK\t428559\n20003\t428560\n1920\t428561\n山崎龙二\t428562\n明日花绮罗\t428563\n青紫色\t428564\n蛔虫\t428565\n黑豆豆浆\t428566\n139家\t428567\n昆池岩\t428568\niriver\t428569\n子醉今迷\t428570\n朱红色\t428571\n有头\t428572\n跳闸\t428573\n骤然\t428574\n2018-01-08\t428575\n格拉布斯\t428576\nMes\t428577\n花溪大学城\t428578\n周小明\t428579\nAnimate\t428580\n转塘镇\t428581\n均瑶集团\t428582\nUNSV\t428583\n8座\t428584\n宾阳\t428585\ninit-param\t428586\n马皮\t428587\n杜鹃路\t428588\nblunt\t428589\n海市蜃楼\t428590\n南平政府网\t428591\n淡香\t428592\n从业人士\t428593\n三向\t428594\n脚底板\t428595\n兴奋性\t428596\n文成公主\t428597\nCalvinR\t428598\nzly\t428599\n20151221\t428600\n酥咔\t428601\n宣讲\t428602\n烤火\t428603\n剑灵泰天\t428604\nbaixing\t428605\n分选机\t428606\n下应
街道\t428607\n迷你贷\t428608\n阿赵\t428609\nINFO\t428610\n攀枝花\t428611\n四川天一学院\t428612\n亭台\t428613\nTIM1\t428614\n建科院\t428615\n1500套\t428616\n捷信消费金融有限公司\t428617\nweb数据库\t428618\nUCOS\t428619\n酒仙桥村\t428620\n加银\t428621\n黄飞鸿之铁鸡斗蜈蚣\t428622\n方志远\t428623\n亚控\t428624\n全项\t428625\n杀意\t428626\n提起来\t428627\n后尘\t428628\n动员\t428629\nOutsourcing\t428630\n四舍五\t428631\n王木生\t428632\n马塞尔\t428633\n关西机场\t428634\nsanwen\t428635\n卡百利\t428636\n为情所困\t428637\ndestoon\t428638\n虎跳峡\t428639\n张来秀\t428640\n霍小玉\t428641\n竹亭\t428642\n非机动车道\t428643\n走一遭\t428644\n黄楼\t428645\n张庆伟\t428646\n剑侠情缘2\t428647\n超牛\t428648\n为爱配乐\t428649\n山林地\t428650\n夏娃的诱惑\t428651\n负电\t428652\nrequest域\t428653\n【马鞍山\t428654\n女枪\t428655\n合目\t428656\n泪点\t428657\n煎蛋卷\t428658\n国画\t428659\n镀\t428660\nPlanner\t428661\n咸宁市环境保护局\t428662\nb59\t428663\n春熙路\t428664\nDDR4\t428665\n酸笋\t428666\n泉州市第一医院\t428667\n凯度\t428668\n张英爱\t428669\n引题\t428670\n富于\t428671\nALEXA\t428672\nlaunch\t428673\n6-12个月\t428674\n9.0分\t428675\n下沙论坛\t428676\n2018年1月27日\t428677\n啸\t428678\n新通留学_新通教育\t428679\ncorrcoef\t428680\n长鹰\t428681\n楚天都市沁园\t428682\n桃花岭\t428683\n脏污\t428684\n后台值\t428685\n瓦西里\t428686\n阿苯达唑\t428687\n春寒料峭\t428688\n位值\t428689\n望谟县人民政府\t428690\n张某某\t428691\n花器\t428692\n双服\t428693\n客机\t428694\n攀牙湾\t428695\nAltiumDesigner\t428696\n张默\t428697\n厦门市经济和信息化局\t428698\n木片机\t428699\n安纳托利亚\t428700\n北汽新能源\t428701\n离心压缩机\t428702\nSanctuary\t428703\n重赏\t428704\njzgc\t428705\n地关系\t428706\n5051\t428707\n互花米草\t428708\n村规民约\t428709\n解出\t428710\n列巴\t428711\n98.8\t428712\n不二越\t428713\n姜草\t428714\nSupARC\t428715\n地震仪\t428716\n监察\t428717\n文欣\t428718\nPROE4.0\t428719\n死亡笔记真人版\t428720\nproDAD\t428721\n埋下\t428722\n第38轮\t428723\n晚辰\t428724\n视听许可证\t428725\n内核\t428726\nuan\t428727\n右旋糖酐\t428728\n内江市第一人民医院\t428729\n镶嵌物\t428730\n沙镇\t428731\n电磁振动给料机\t428732\nmoa\t428733\nIMAX影院\t428734\nbobbibrown\t428735\n1000kV\t428736\n前30天\t428737\n超级大乐透\t428738\n学苑出版社\t428739\n攻城狮\t428740\n美媒\t428741\n草木\t428742\nhuilv\t428743\ne书联盟\t428744\nInfinity\t428745\n成都欢乐谷\t428746\npositional\t428747\n小妞\t428
748\n加速券\t428749\n0.875\t428750\n多乐士家丽安\t428751\n凯龙\t428752\n新大楼\t428753\n编辑员\t428754\n克而瑞地产\t428755\n王家卫\t428756\n猪套\t428757\n玄界\t428758\n陈鹏\t428759\n金睛\t428760\n注册建造师管理系统\t428761\n绥德县\t428762\n橙红年代\t428763\n巴啦啦小魔仙之魔箭公主\t428764\n迷城\t428765\nAvatar\t428766\nE52\t428767\n正圆\t428768\nCKEDITOR\t428769\n考核表\t428770\n第八个\t428771\n琪露诺\t428772\n14吨\t428773\nmce\t428774\n延川县\t428775\nPhan\t428776\n帝江\t428777\n会战\t428778\n合肥包河\t428779\nSamuel\t428780\n事假\t428781\n社会保险信息网\t428782\n内蒙古省\t428783\nPP岛\t428784\nblahnik\t428785\n妙女神探\t428786\n六部曲\t428787\n方差分析表\t428788\n选板\t428789\n高鳍\t428790\n大们\t428791\n招标比\t428792\n铜川日报传媒网\t428793\nOptoma\t428794\n吕军\t428795\n真冬\t428796\n布朗熊\t428797\ngpk\t428798\n江苏交通控股有限公司\t428799\n18th\t428800\n神校\t428801\n达·芬奇\t428802\n5.5.20\t428803\n共产\t428804\nfpdk\t428805\n张康\t428806\n直布罗陀\t428807\n小皇\t428808\nlike\t428809\n土木工程类\t428810\n上海拜力生物科技有限公司\t428811\n中翅\t428812\nBAND\t428813\n壁\t428814\n深大附中\t428815\n蔷薇书院\t428816\n济南野生动物园\t428817\n浙江稠州商业银行\t428818\n吉普大切诺基\t428819\nphtotshop\t428820\n上流社会\t428821\n暗黑血统战神版\t428822\n通惠河\t428823\n巴拿马\t428824\n智齿牙\t428825\n约束\t428826\ncherish_leon\t428827\n肉包\t428828\n100公斤\t428829\n弧齿锥\t428830\n9月下旬\t428831\nDOS\t428832\n龙视安\t428833\nPPL\t428834\n我骄傲我是中国人\t428835\n江苏省宿迁市宿豫区政府\t428836\n寻找前世之旅\t428837\n期中\t428838\ncaozuo\t428839\nFACE妆点网\t428840\n没影\t428841\ncorresponding\t428842\n保证金\t428843\n100间\t428844\n佛山市环境保护局\t428845\n博野\t428846\n物尽其用\t428847\n相接\t428848\n耐压测试仪\t428849\n驴友网\t428850\n输送器\t428851\n360影视大全\t428852\n圆类\t428853\n1304414\t428854\n信长之野望14\t428855\n手照\t428856\n苏教版八年级语文\t428857\n除螨仪\t428858\n傲游云浏览器\t428859\n郭宇\t428860\nregiis\t428861\n末地城\t428862\nb250m\t428863\n双口镇\t428864\n当当阅读器\t428865\n虎旗\t428866\n第十三回\t428867\n福泉路\t428868\ndiao\t428869\n枪证\t428870\n144平方\t428871\n城通\t428872\nPROC\t428873\n幕布\t428874\n中建三局集团有限公司\t428875\nxis\t428876\n库什纳\t428877\nfrancais\t428878\n庞泉沟\t428879\n授课式\t428880\nequivalence\t428881\nwindows防火墙\t428882\n中国设备网\t428883\n蝇\t428884\nera\t428885\nFlash版\t428886\nCAD教育版\t428887
\n成平\t428888\n熔融\t428889\n加开\t428890\nstaub\t428891\n刘协\t428892\n中考资源网\t428893\n拿不准\t428894\n视网膜\t428895\nJGT\t428896\nswjoy\t428897\n巨胜\t428898\n突袭\t428899\n临矿集团\t428900\n陪床\t428901\n广州市纪委\t428902\n蓝冠\t428903\n3Dsmax\t428904\nfellow\t428905\nQuay\t428906\nMate10\t428907\n深圳盐田政府\t428908\n新城控股\t428909\nFind7\t428910\nScorpio\t428911\n姗姗\t428912\n省安委会\t428913\n汤家凤\t428914\n白白\t428915\n用贷\t428916\nrecipients\t428917\n海安日报\t428918\n自编\t428919\n大群\t428920\n灵丘\t428921\n300MW\t428922\n武汉市人事考试院\t428923\n奥利维尔\t428924\n中共四川省纪委四川省监察厅\t428925\nDavinci\t428926\n趣趣\t428927\nSSL/TLS\t428928\n气躁\t428929\n一锅\t428930\n悉尼歌剧院\t428931\nJohnHany\t428932\n第15期\t428933\n新蜀汉传奇\t428934\n钢笔字\t428935\n连理\t428936\nau\t428937\n贸易博览会\t428938\n摇身一变\t428939\nvam\t428940\niDo\t428941\n洪光\t428942\n第五首\t428943\nwriter\t428944\n荣耀畅玩平板2\t428945\n包装修\t428946\n需求者\t428947\nフォルト\t428948\npim\t428949\n63亿\t428950\n徐坚\t428951\n花博园\t428952\n安眠药\t428953\njams\t428954\n绞线机\t428955\nVogel\t428956\n迪士尼卡通\t428957\n秋声赋\t428958\n卷二\t428959\n塞班S60\t428960\n律所\t428961\n照母山\t428962\n雪压\t428963\n向北\t428964\n卖不掉\t428965\n挪威克朗\t428966\n直角坐标方程\t428967\n旱稻\t428968\n顶棚\t428969\n捕蛇者说\t428970\n857号\t428971\n不要求\t428972\n平战\t428973\nORA\t428974\nrated\t428975\n狗仔\t428976\n夕月一\t428977\nFran\t428978\n话筒\t428979\n客服部\t428980\n擦\t428981\n恒大悦府\t428982\n锦鲤抄\t428983\n建工网校\t428984\nshopnc\t428985\nAXIS2\t428986\n灭运机\t428987\n五月末\t428988\nworth\t428989\ndumping\t428990\n96345\t428991\nhge\t428992\n伯明翰大学\t428993\n240克\t428994\ndk\t428995\n家中\t428996\nPerceptual\t428997\n服务者\t428998\n苛责\t428999\n志田\t429000\n踏频\t429001\nGambling\t429002\n从古到今\t429003\n代森锰锌\t429004\n知微\t429005\n临高启明\t429006\n弦歌\t429007\n汤头歌诀\t429008\n过冬\t429009\n深圳市住房公积金管理中心\t429010\n云阳\t429011\n金刚板\t429012\n纠正\t429013\n澳门葡京娱乐\t429014\n青田政府网\t429015\nhuangyan\t429016\n7000d\t429017\nSandbox\t429018\n沙岭子西站\t429019\n99字画网\t429020\n金琥\t429021\n20150525\t429022\n伪元素\t429023\n百令胶囊\t429024\n苏利文\t429025\n小孩儿\t429026\n玉祥门\t429027\n鲫鱼\t429028\n天猫魔盒3pro\t429029\n烎\t429030\n高瞻\t429031
\n车杆\t429032\n红莲骑士兽\t429033\n常德教育网\t429034\n鑫苑集团\t429035\n铈\t429036\n机针\t429037\n血管造影\t429038\nZhejiang\t429039\n班尼路\t429040\n香港电视台\t429041\n冷暖型\t429042\n祖师\t429043\n转移\t429044\n银镜\t429045\n薪\t429046\n长果桑\t429047\n土色\t429048\n力计\t429049\nRacer\t429050\n告诉你好\t429051\n克林姆\t429052\n垂下\t429053\n岁运\t429054\n第二十二届\t429055\n文体活动\t429056\n日龄\t429057\n百炼马踏\t429058\n永燕\t429059\n笔杆\t429060\n张光宇\t429061\n佳雪\t429062\n货亭\t429063\nGameBoy\t429064\n呛声\t429065\nURL地址\t429066\n交谊舞\t429067\n聚赖氨酸\t429068\n几环\t429069\n绫小路清隆\t429070\n离退休\t429071\n数管\t429072\njinja\t429073\n姊\t429074\n城阳街道\t429075\n抽风散热器\t429076\n内黄县\t429077\n中国残联\t429078\n李杲\t429079\n玉璧\t429080\n生水\t429081\n就送\t429082\n炸场\t429083\n7210\t429084\n会计中级\t429085\n寻香踪\t429086\n追梦加速器\t429087\n二维码\t429088\n交通学院\t429089\n自由项\t429090\nPart2\t429091\n鹏斯\t429092\n15尺\t429093\n金丝燕\t429094\n分封\t429095\n8.15\t429096\n惠州机场\t429097\n蔚领真武\t429098\n陈紫函\t429099\n失业保险\t429100\n草裙舞\t429101\nVM\t429102\n护壁\t429103\narange\t429104\ngeohash\t429105\n枕眠\t429106\nak47\t429107\n生育险\t429108\n包身\t429109\n秘扇\t429110\n通州土桥\t429111\n立裁\t429112\ngold\t429113\n779\t429114\n5788\t429115\n甘超波\t429116\n王永奎\t429117\nodoo11\t429118\n六点半\t429119\n袁立\t429120\n孕期计算器\t429121\n摩信\t429122\nforName\t429123\njacques\t429124\n真人快打\t429125\nserializable\t429126\n厅局\t429127\n控制点\t429128\nlanguages\t429129\n巨结肠\t429130\n屏王\t429131\n元华\t429132\n云笔记\t429133\n科技大厦\t429134\nproducer\t429135\n舒乐\t429136\n指使\t429137\nJF厂\t429138\n可定\t429139\n刘相松\t429140\n叙述\t429141\n过儿\t429142\n骂声\t429143\ndenon\t429144\n爸妈在线心理网\t429145\n贵客\t429146\ncephfs\t429147\n金碧路\t429148\n烟把儿乐队\t429149\nSeeds\t429150\np3c\t429151\n工力\t429152\n天貓精選\t429153\n预审批\t429154\n天涯歌女\t429155\n京剧团\t429156\n聚丙烯腈\t429157\n广州文化公园\t429158\n随货\t429159\nRecordset\t429160\n射雕\t429161\n菲儿影院\t429162\n焦化厂\t429163\nm557\t429164\n苏芬战争\t429165\nWeitzman\t429166\n姜岩\t429167\n华宇集团\t429168\n京东微联\t429169\n人造卫星\t429170\n顾家家居股份有限公司\t429171\n20150913\t429172\n交流帖\t429173\n世茂东壹号\t429174\n柏溪\t429175\n包线\t429176\nabsorbed\t429177\n龙塔街道\t4
29178\n异分母分数加减法\t429179\n骁龙410\t429180\n清流镇\t429181\n游美\t429182\n偷性\t429183\nJerk\t429184\n中粮置地\t429185\n党务公开栏\t429186\n下五千年\t429187\n蒙山县\t429188\n烽火燎原\t429189\n百度云\t429190\n湖北广播电视台\t429191\n乡村猎艳记\t429192\n夺志\t429193\nlumetri\t429194\n嵌缝\t429195\n台球馆\t429196\nティ\t429197\n数码复印机\t429198\n为宜\t429199\n群众网\t429200\n优惠促\t429201\nAcetyl\t429202\n水解胶原蛋白\t429203\n双乾\t429204\n嘛\t429205\n竹制\t429206\nGWD\t429207\nVeteran\t429208\n滴水\t429209\nugc\t429210\n6510\t429211\n宜科\t429212\n民享\t429213\nvibration\t429214\n镇_列表网\t429215\nC语言\t429216\n多功能机\t429217\n全配\t429218\n楼头\t429219\n驳船\t429220\n999银\t429221\n医疗费用\t429222\n陈其美\t429223\n桐乃\t429224\n行善\t429225\n中电\t429226\n梆子\t429227\n轮胎\t429228\n徐勤兰\t429229\n最爱国\t429230\nviber\t429231\n安多哈尔\t429232\n景枫\t429233\n滚灯\t429234\n卑劣\t429235\n食神格\t429236\n1.2.10\t429237\n爸爸妈妈\t429238\n赛弗\t429239\n引流袋\t429240\n杀手级\t429241\n中华人民共和国招标投标法实施条例\t429242\nFXAA\t429243\nSWM斯威X7\t429244\n有氧\t429245\nxwpf\t429246\n经销部\t429247\n良土\t429248\n1月21日\t429249\n枣庄三中\t429250\nng-zorro\t429251\n复合肥\t429252\n知我意\t429253\n许开祯\t429254\n妇科肿瘤\t429255\n肉控\t429256\n高和\t429257\n狐币\t429258\nSLK\t429259\n人民军队\t429260\n4064\t429261\n1.0.3.0\t429262\n集宁\t429263\nresponseText\t429264\n感谢\t429265\n红米5p\t429266\n三七粉\t429267\nClassification\t429268\n1405\t429269\n不领\t429270\nbcb\t429271\n滨海县\t429272\n人球\t429273\n投行界\t429274\n指标\t429275\nchines\t429276\n华强城\t429277\nshitai\t429278\n绝地求生直播\t429279\nconform\t429280\n剑桥国际少儿英语\t429281\n舞场\t429282\n23#\t429283\n18P\t429284\n一道道\t429285\n猩\t429286\nug4.0\t429287\n民事诉讼案\t429288\naxure8.0\t429289\n一帮\t429290\nPPT素材\t429291\nps2015\t429292\n草莓味\t429293\nDER\t429294\nSho\t429295\n王力集团\t429296\n分享率\t429297\n紫檀柳\t429298\n沈强\t429299\n折子\t429300\n不在话下\t429301\n扎牢\t429302\n玫琳凯\t429303\n箱式电阻炉\t429304\ncep\t429305\n中央书记处\t429306\n罗夏恩\t429307\nipadair_2吧\t429308\n120吨\t429309\nisdin\t429310\nFt\t429311\nnamesilo\t429312\n加倍\t429313\nedwards\t429314\n四族\t429315\nub4.0\t429316\n新行\t429317\n上海市科学技术协会\t429318\n2016.09\t429319\n活性炭吸附塔\t429320\n名街\t429321\
n加盟者\t429322\n批卷\t429323\nDo\t429324\n不伦恋\t429325\n凹口\t429326\n小奖\t429327\n燕城\t429328\n九数\t429329\n东莞电视台\t429330\n柿树\t429331\n金钗\t429332\n如磋\t429333\n有机膨润土\t429334\n翠湖天地\t429335\n湖南猎豹汽车股份有限公司\t429336\nFCM\t429337\n兰屿\t429338\n北白象镇\t429339\nymi\t429340\n哈加沙\t429341\n硬发\t429342\nboosted\t429343\n风尘三侠\t429344\n分季\t429345\nMANA\t429346\n取巧\t429347\n谈话室\t429348\n癔症\t429349\n12场\t429350\n重出\t429351\n华为mate\t429352\n拔头筹\t429353\nsevis\t429354\n元瑶\t429355\n844\t429356\n意式咖啡\t429357\n公安县\t429358\n柴油罐\t429359\n第04\t429360\n银行对账单\t429361\nmax函数\t429362\n女仆\t429363\n小心心\t429364\n前田阳菜\t429365\n郑周永\t429366\n专业队\t429367\n西克\t429368\n六个字\t429369\n日用百货\t429370\n状态函数\t429371\nNRG\t429372\n狮鹫\t429373\n石原莞尔\t429374\n292\t429375\n神经症\t429376\nJoseph\t429377\nMorris\t429378\n工程签证\t429379\n知县\t429380\n武冈市人民政府\t429381\n乙肝e抗体\t429382\n嘴苦\t429383\n水线\t429384\n20160201\t429385\nEasyui\t429386\n万国表\t429387\n南京审计大学\t429388\n20151130\t429389\n2023年\t429390\n倒飞\t429391\n蟒山\t429392\n猕猴桃树\t429393\nelasticsearch5\t429394\n足球机\t429395\n甫志高\t429396\nX520\t429397\n金投股票\t429398\n71页\t429399\n无线桥接\t429400\nstrongly\t429401\n初窥\t429402\nV4.5\t429403\n5081\t429404\n御湖观邸\t429405\nBully\t429406\n海蓝宝石\t429407\n04款\t429408\npppp\t429409\n新疆浩源\t429410\n5.5亿元\t429411\n环氧大豆油\t429412\n蝴蝶蓝\t429413\n醉茶\t429414\n10几天\t429415\n文化干部\t429416\nDEFY\t429417\n大尉\t429418\n前手\t429419\n李维嘉\t429420\n茶陵县\t429421\nScienceDirect\t429422\ntuku\t429423\n成仙\t429424\n谏太宗十思\t429425\nagriculture\t429426\n上三角矩阵\t429427\nKeysight\t429428\n1024草社区榴\t429429\nDefFound\t429430\nBEIJING\t429431\n日裔\t429432\n苔藓\t429433\n熹\t429434\n亚安\t429435\n亿腾\t429436\n2229\t429437\n九星霸体诀\t429438\n大连舰艇学院\t429439\nLp\t429440\n卢克卡\t429441\neaten\t429442\n35.7\t429443\n豆瓣源\t429444\n过马路\t429445\njixie.huangye88.com\t429446\n高强钢\t429447\nT470p\t429448\n中国广核集团有限公司\t429449\n畅捷通T+\t429450\n辛基\t429451\n1页\t429452\n教官\t429453\n省直管县\t429454\n解放南路\t429455\nipsw\t429456\n新世纪大学英语综合教程4\t429457\n阜阳教育信息网\t429458\n清河镇\t429459\n蹬子\t429460\n雄伟\t429461\n1164\t429462\nOréal\t4294
63\n步步逼婚\t429464\n天邈\t429465\n压力罐\t429466\n浙江省省\t429467\n华润悦府\t429468\nconstruct\t429469\n阿波罗计划\t429470\n小蠊\t429471\nAloha\t429472\n炉头\t429473\n苍炎\t429474\ninstanceof\t429475\n初二数学\t429476\n尿结石\t429477\n小蜗\t429478\n血病\t429479\n上下楼\t429480\n试拍\t429481\n花娇\t429482\nothers\t429483\n膀胱结石\t429484\n世纪缘\t429485\nGoodNotes\t429486\nArsenal\t429487\n黑磷\t429488\nKeyword\t429489\nMAVEN\t429490\n情逢敌手\t429491\n经济日报\t429492\nSmartAudio\t429493\n汇川技术\t429494\ndough\t429495\n宋词鉴赏辞典\t429496\n中国石油天然气集团公司\t429497\nPulmonary\t429498\n白车\t429499\n承担者\t429500\n鹿灵\t429501\n华大新高考联盟\t429502\n浆板\t429503\n用章\t429504\n刀剑2\t429505\n宝钱公路\t429506\n龙灵\t429507\n南宁市区\t429508\n第105期\t429509\n影戏\t429510\n普氏指数\t429511\n智慧式\t429512\nsurpass\t429513\nTools\t429514\n5600元\t429515\n25块\t429516\n1158\t429517\n广东五虎之铁拳无敌孙中山\t429518\n流于\t429519\n4.9.2\t429520\n洁厕\t429521\n绿川\t429522\nMatplotlib\t429523\n商情\t429524\n无线电话\t429525\nrot\t429526\n苦力\t429527\n共同点\t429528\n恭子\t429529\n护齿\t429530\nyangboom\t429531\n握\t429532\nlog函数\t429533\n死法\t429534\n中国采招网\t429535\nGatling\t429536\n通宝\t429537\n大足网\t429538\n俄罗斯大使馆\t429539\n酸性氧化物\t429540\n热不热\t429541\n疲惫不堪\t429542\nYunFile\t429543\n乐多多\t429544\n功德无量\t429545\n尚语贤\t429546\n56例\t429547\n第24天\t429548\n公账\t429549\n阿托伐他汀钙片\t429550\n艾德勒\t429551\nMadame\t429552\n胆\t429553\n36亿元\t429554\n曲檀儿\t429555\n毛尖茶\t429556\n2538\t429557\n罐头盒\t429558\n交换式\t429559\n甘肃省卫生和计划生育委员会\t429560\n美兆\t429561\n苏州科技学院\t429562\nApplication\t429563\n美衣\t429564\n考试院\t429565\n围绕\t429566\n活动率\t429567\nylbtech\t429568\n侯德榜\t429569\n货位\t429570\n张茵\t429571\nsexporn\t429572\n上不来\t429573\n隐居地\t429574\ncathode\t429575\n癌症村\t429576\n鞍钢集团\t429577\n中华国\t429578\n水钢\t429579\n镜湖\t429580\n时蔬\t429581\n小蓝经济开发区\t429582\n滝川\t429583\n说出来\t429584\nFTU\t429585\n禽病网\t429586\n四厘米\t429587\n洪辰\t429588\n终极梦想\t429589\nSARS病毒\t429590\n9#\t429591\n2017年10月24日\t429592\n中国基督教\t429593\n十里长街\t429594\n半圆管\t429595\n武汉光迅科技股份有限公司\t429596\n央玺\t429597\nusmle\t429598\n空缺\t429599\n黄绿素\t429600\n沉井\t429601\n601328\t429602\n长安商用\t429603\n纯果\t429604
\n宝光\t429605\n回合制\t429606\n证集\t429607\n网易婚恋交友网\t429608\n莓莓\t429609\n女性观\t429610\n东胡\t429611\n随机数发生器\t429612\nReact-router\t429613\nLabview\t429614\n江映蓉\t429615\n支臂\t429616\n时间轮\t429617\ncreators\t429618\n两月\t429619\n4009\t429620\n动版\t429621\n2016年1月5日\t429622\n亿鼎博\t429623\nActivemq\t429624\n光照培养箱\t429625\n四川省平昌县人民政府\t429626\n运输工具\t429627\n颈动脉\t429628\n闫旭\t429629\n小屁\t429630\n龙门山\t429631\n风声雨声读书声声声入耳\t429632\n巨翅\t429633\n征服世界\t429634\n炽燃\t429635\n罗尔斯罗伊斯\t429636\n市政府办公室\t429637\n师者\t429638\n朱批\t429639\n阿曼达\t429640\n建外大街\t429641\n宋瓷\t429642\n华为p\t429643\n滚滚滚\t429644\n截留\t429645\n_自然资源部\t429646\n羽田桃子\t429647\n重庆东站\t429648\n采集器\t429649\n冷藏箱\t429650\n郑集\t429651\n舰娘\t429652\n五莲山\t429653\n洗涤塔\t429654\n模塑\t429655\nporta\t429656\n不远千里\t429657\nlipin\t429658\nindiana\t429659\n书页\t429660\n济南市市中区人民政府\t429661\nPopper\t429662\n长川科技\t429663\n龙城路\t429664\n木门网\t429665\n校址\t429666\n130级\t429667\nncurses\t429668\n量热仪\t429669\n海皇\t429670\n海立方\t429671\n出口税\t429672\n风生水起\t429673\n報告\t429674\n华为ascend\t429675\n星盟\t429676\n老泪纵横\t429677\nddn\t429678\n44_\t429679\n生物质颗粒机\t429680\n扶老携幼\t429681\n青岛市城市管理局\t429682\n榆树市\t429683\nvs2012\t429684\n位运算符详解\t429685\n2012-2015年\t429686\ninvalid\t429687\n第2款\t429688\n皖通科技\t429689\n晓龙\t429690\nCovariance\t429691\n蒙城县\t429692\n傅科\t429693\n梨状肌\t429694\n商法通\t429695\n停滞\t429696\n1050万\t429697\n小屯路\t429698\n安谱\t429699\n自治区财政厅\t429700\n胡杨网\t429701\n守门员\t429702\n梁仕成\t429703\n话唠\t429704\n荷电\t429705\n短处\t429706\n取自\t429707\n亿阳集团\t429708\n鲟\t429709\n党制\t429710\nThug\t429711\n李臻\t429712\nForestry\t429713\n剥线机\t429714\n比亚迪F6\t429715\nBalls\t429716\n左马\t429717\n全本小说网\t429718\n预示\t429719\n韩悦\t429720\n富宁县\t429721\n通览\t429722\n中关村中学\t429723\n中普\t429724\n康德莱\t429725\nPayable\t429726\n星际2:虚空之遗\t429727\n江苏苏宁足球俱乐部\t429728\nSMF\t429729\n百帕\t429730\n德城\t429731\n汗颜\t429732\nSkins\t429733\n69式\t429734\n木命\t429735\n吃鸡开挂\t429736\n中华园\t429737\n工艺美术品\t429738\nRy\t429739\n米哈\t429740\n魅动版\t429741\n留意\t429742\n3.6_\t429743\n不是不想\t429744\n对数化\t429745\n芭东海滩\t429746\nR9Plus\t429747\n测定\t42
9748\n三秋\t429749\n多币\t429750\ncontainers\t429751\n龙徽\t429752\ndimension\t429753\n百里杜鹃风景名胜区\t429754\nfencing\t429755\n中科大软院\t429756\nhut\t429757\n功功率\t429758\n32000\t429759\n九次方\t429760\n李宁云\t429761\n比邻\t429762\n样图\t429763\nCimatron\t429764\neqt\t429765\nMimi\t429766\n十分钟\t429767\n奔驰S级AMG\t429768\n#书旗小说#\t429769\n盐酸雷尼替丁胶囊\t429770\nShine\t429771\n浣肠\t429772\n五月天在线\t429773\n苯醚甲环唑\t429774\n拖斗\t429775\n沉头孔\t429776\n三角眼\t429777\n书记处\t429778\n宏明\t429779\n如上\t429780\n静寂\t429781\n房展\t429782\n58cm\t429783\n钛酸锂\t429784\n忽而\t429785\n婴幼儿教育\t429786\n上水\t429787\n应予\t429788\n庄巧涵\t429789\n象印\t429790\n华芯\t429791\n深达威\t429792\nL1\t429793\n西湖藕粉\t429794\n磨皮滤镜\t429795\n不明飞行物\t429796\ngetTime\t429797\n装潢\t429798\nBrandeis\t429799\n温柔的谎言\t429800\n8000万\t429801\n超级IP\t429802\njccad\t429803\n上海交通大学网络信息中心\t429804\n鞍山银行\t429805\n如梦之梦\t429806\n东溪村\t429807\n计数原理\t429808\n雨巷\t429809\n熊出没之变形记\t429810\nGardner\t429811\ncsss\t429812\n7.5_\t429813\n杨振铎\t429814\n比特币交易\t429815\n盐城市财政局\t429816\n报损\t429817\n豆果美食\t429818\n海南省卫生和计划生育委员会\t429819\n明白\t429820\nSolitaire\t429821\n联中\t429822\n宁心\t429823\n捷星\t429824\n轻而易举\t429825\n打字机\t429826\ntest2\t429827\n格雷迈恩\t429828\n老木\t429829\n南京地铁5号线\t429830\nVISTA\t429831\n寻书\t429832\nZippo\t429833\n任君\t429834\n李北方\t429835\n失业保险金\t429836\n青岩\t429837\n石学敏\t429838\n科信\t429839\n背夹式\t429840\nAAAAA\t429841\n孙悟饭\t429842\n刷野\t429843\ntssd\t429844\n广州中医药大学金沙洲医院\t429845\n张晓晗\t429846\n广东论坛\t429847\n千树万树梨花开\t429848\n自增\t429849\n张小玲\t429850\n中国连锁经营协会\t429851\ncranberry\t429852\nCrime\t429853\npanese\t429854\n广东省政府\t429855\n川井宪次\t429856\n文科男\t429857\n应以\t429858\n九尾妖狐阿狸\t429859\n功力\t429860\n弄通\t429861\n第一拍\t429862\n2018年4月15\t429863\n紫荆树\t429864\n0392\t429865\n朗科科技\t429866\n山东体育学院\t429867\numount\t429868\n解体\t429869\n袍江\t429870\n北京大学医学院\t429871\ninflammation\t429872\n区交通运输局\t429873\n前10个月\t429874\nboil\t429875\n请别\t429876\n买到\t429877\n火山\t429878\n情侣路\t429879\n飞美\t429880\n刘德凯\t429881\n陈映真\t429882\nword-break\t429883\n代潇瑞\t429884\n粉底\t429885\nCue\t429886\nbjedu\t429887\n刘彦昌\t429888\n签假
\t429889\n黑客大战\t429890\n单股\t429891\n御桥\t429892\nDatanode\t429893\n韩小莹\t429894\n氯化钾\t429895\nownership\t429896\n麦明诗\t429897\n百川能源\t429898\n烧火棍\t429899\ncarbonate\t429900\n2006-2015年\t429901\n玉州\t429902\n挺不错\t429903\nng-repeat\t429904\n泰凌\t429905\n依次\t429906\n价值股\t429907\n撒切尔\t429908\n敬祝毛主席万寿无疆\t429909\n乔红\t429910\n缓冲区\t429911\n十三号\t429912\n裂头蚴\t429913\n泮托拉唑\t429914\nv2.0.7\t429915\n红双喜\t429916\nnextint\t429917\n22张\t429918\n遵义站\t429919\n小英\t429920\n纱质\t429921\njgp\t429922\nHawkins\t429923\n宁启铁路\t429924\n認証\t429925\n钟鼓楼广场\t429926\n企鹅群\t429927\n端粒\t429928\nlopez\t429929\n广东省汕头市澄海区人民政府\t429930\n张志远\t429931\n轴卡\t429932\n长政\t429933\n4.4.2\t429934\n睾丸\t429935\nfont\t429936\n粤通卡\t429937\n胡天\t429938\n通海路\t429939\nort\t429940\n海报\t429941\n温州龙湾机场\t429942\n恋爱到此为止\t429943\n河西街道\t429944\n常凯\t429945\n圣王\t429946\n文质\t429947\n圆柱度\t429948\n镜花水月\t429949\ngobi\t429950\ntwistys\t429951\n20170812\t429952\n诸夏\t429953\n赣江新区\t429954\nhsqldb\t429955\nutr\t429956\n居间合同\t429957\n将军令\t429958\n外汇登记\t429959\nBeego\t429960\ndonut\t429961\n梦子\t429962\nInstitutions\t429963\n万汇城\t429964\n唐意\t429965\n汽车人才网\t429966\n杨小军\t429967\n奴隶兔\t429968\n微信公号\t429969\nsie\t429970\n诚邀\t429971\n不交钱\t429972\n600w\t429973\n微新\t429974\n一只\t429975\n评论性\t429976\n阿伦森\t429977\n联行号\t429978\n台风级\t429979\n微创医疗\t429980\nPn\t429981\n仔鸡\t429982\n返事\t429983\n三城\t429984\n油酸\t429985\nTrue\t429986\neverlane\t429987\n石碣\t429988\nhallelujah\t429989\n当即\t429990\n石榴酒\t429991\n唐林\t429992\n兵王\t429993\n弃考\t429994\n中国建筑设计院有限公司\t429995\nLilith\t429996\n聊斋\t429997\nconverse\t429998\nENB\t429999\n比例式\t430000\nPhotoshop\t430001\n布布丁挂靠网\t430002\n上海豫园\t430003\nijcai\t430004\n智能工厂\t430005\n真伟大\t430006\noutfit\t430007\n白点点\t430008\n抄录\t430009\n无以伦比\t430010\n酷影\t430011\n石柱县\t430012\n广西测绘地理信息局\t430013\n20160922\t430014\n定数\t430015\n胡月\t430016\n14位\t430017\n大头贴\t430018\n生存版\t430019\nAnimoji\t430020\n樊凡\t430021\n自治区发改委\t430022\n个性\t430023\n80040154\t430024\n继木\t430025\n肌无力\t430026\n竹鸡\t430027\n翻开\t430028\n1.25亿\t430029\n弓型\t430030\nEPYC\t430031\n军中绿花\t4
30032\nTapestry\t430033\n建好\t430034\n酷鸽网\t430035\n万能看图王\t430036\n挽尊\t430037\nCustomer\t430038\n无讼\t430039\nnewman\t430040\n公共图书馆法\t430041\nnomachine\t430042\n光孝寺\t430043\n戴梦\t430044\n遂溪县\t430045\nbiotechnology\t430046\n繁体\t430047\n落花时节又逢君\t430048\n善德\t430049\n邱晓华\t430050\n贝币\t430051\n头头是道\t430052\n退职\t430053\nNGINX\t430054\nrapidly\t430055\nスタジオ\t430056\n移动全球通卡\t430057\n紧急时刻\t430058\n利他\t430059\n子羽\t430060\nClay\t430061\n推进式\t430062\n凌源吧_\t430063\n2018年前\t430064\n伊东\t430065\n新暗黑龙与光之剑\t430066\n白鹿洞书院\t430067\n武威市\t430068\n39所\t430069\n第20轮\t430070\n古田会议会址\t430071\nITfish\t430072\n泉州实验中学\t430073\n笼头\t430074\n汇缴\t430075\n魔灵召唤吧\t430076\ndnf剑魔吧\t430077\n制动总泵\t430078\n新生群\t430079\n韩江\t430080\n佛医\t430081\n负载电流\t430082\n720P-MP4\t430083\n搞趣污污公社\t430084\n灌口\t430085\nFM2\t430086\n嵌套子\t430087\nAndroid模拟器\t430088\n素板\t430089\n堕落之王\t430090\n防雷\t430091\n梁冰\t430092\n浙江省人力资源和社会保障厅\t430093\npsql\t430094\n树人中学\t430095\n惊险\t430096\n了解到\t430097\n道德责任\t430098\n天邦\t430099\n新泻\t430100\n三述\t430101\nhealth\t430102\n杨敏\t430103\n北京市居住证\t430104\n梁秋实\t430105\n客户识别号\t430106\nIDOC\t430107\n海海人生\t430108\n斯威X7\t430109\n观照\t430110\n龙川\t430111\nwika\t430112\n充气轮\t430113\n胡不归\t430114\nexcel表头\t430115\nRecords\t430116\n太玄\t430117\n工棚\t430118\n战境\t430119\n8发\t430120\n圣源\t430121\n南浔\t430122\nEvolve\t430123\nAnglo\t430124\nDividend\t430125\nMammut\t430126\n友基\t430127\nXIANG\t430128\nprodesk\t430129\n信长之野望15大志\t430130\n新浪山西_新浪网\t430131\n题题\t430132\n冒死\t430133\n随同\t430134\n巫界\t430135\ntestv\t430136\nlol鸡里奥\t430137\n三命通会\t430138\n自治区总工会\t430139\n虚拟支付\t430140\n曾华倩\t430141\n巴尔干半岛\t430142\n尘缘梦\t430143\n代谢病\t430144\n180202\t430145\n奇怪的儿媳\t430146\n红绿色盲\t430147\n市地税局\t430148\n气动式\t430149\n仨\t430150\n威海市委\t430151\niso11.3\t430152\npogopin\t430153\n订立\t430154\nPhotography\t430155\n合肥市工商行政管理局\t430156\noreilly\t430157\n硬盘柜\t430158\n医院页\t430159\n阴司\t430160\nthang\t430161\n韩墨言\t430162\n刚强\t430163\n爱的成人式\t430164\n96.9\t430165\n工作时间表\t430166\n亚洲杯\t430167\n3万套\t430168\n塔利班\t430169\n1m2\t430170\n28256\t430171\n403\t430172\n财
险\t430173\nACH\t430174\n4种\t430175\n工商银\t430176\n省信访局\t430177\n2740\t430178\n台账\t430179\n猎名网\t430180\n包芯丝\t430181\n配机\t430182\n处理池\t430183\n隰\t430184\n太平洋\t430185\n美文网\t430186\n热镀锌管\t430187\n公租房\t430188\n昨夜星辰\t430189\n党内监督条例\t430190\n航站\t430191\n伤春悲秋一世情\t430192\n安准\t430193\nlevel3\t430194\n修饰\t430195\nEvolved\t430196\n诉诸\t430197\n上海大学图书馆\t430198\n135mm\t430199\n西安南\t430200\n刘润洁\t430201\n伊人综合在线\t430202\nWen\t430203\n词条\t430204\nfeng\t430205\n瑞波币\t430206\n一七年\t430207\n山东能源龙口矿业集团有限公司\t430208\n港式甜品\t430209\n狮子头\t430210\n新安人才网\t430211\nxpc\t430212\n枸杞水\t430213\n严厉\t430214\n收费\t430215\n维诺\t430216\n主题曲MV\t430217\n防护栏\t430218\n蔡俊\t430219\n群体\t430220\n浙江交通职业技术学院\t430221\nQQ分组\t430222\n元旦节\t430223\n黔灵山\t430224\n树人小学\t430225\n75平方\t430226\navail\t430227\n4999\t430228\n星际迷航3\t430229\n一沙一世界\t430230\n吉林网\t430231\nK480\t430232\n文本值\t430233\n企石\t430234\n成史\t430235\nv310\t430236\n未卜先知\t430237\nCorner\t430238\n合伙中国人\t430239\n复发性流产\t430240\n万能青年旅店\t430241\nJava类加载器\t430242\n土耳其\t430243\n风哥\t430244\nPBA\t430245\n习得性\t430246\n一天天\t430247\n领卷\t430248\n炒青\t430249\n因未\t430250\nobtain\t430251\n瑞超\t430252\n垂范\t430253\n含油量\t430254\n2017年9月25日\t430255\n跑男\t430256\n廖芊芊\t430257\n诱惑力\t430258\n39周\t430259\n中报业绩\t430260\n信永中和会计师事务所\t430261\n补路\t430262\n格芯\t430263\n教育附加税\t430264\n肚_\t430265\n紫台\t430266\nXBA-Z5\t430267\n陕汽集团\t430268\n拉起\t430269\n拿出\t430270\n苦参\t430271\nxiufu\t430272\n长盛基金\t430273\n劲情\t430274\n递给\t430275\n赛飞\t430276\n孙权\t430277\nm132nw\t430278\n部门化\t430279\n金手指\t430280\nSpringMVC注解@RequestParam\t430281\n12.5\t430282\na1286\t430283\ne邮宝\t430284\nXTU\t430285\n马克思佩恩3\t430286\n第20部\t430287\n台台\t430288\n竖款\t430289\n武汉开发区\t430290\nVarious\t430291\n90亿元\t430292\n春播\t430293\n大化瑶族自治县人民政府\t430294\ndocucentre\t430295\n吴林\t430296\n三门峡职业技术学院\t430297\n藤冈靛\t430298\n格拉西亚\t430299\n碳谷\t430300\n痛恨\t430301\nDDR1\t430302\n双氯芬酸\t430303\n城乡规划法\t430304\n优邦\t430305\n洪泽县\t430306\n东邦\t430307\n重金\t430308\n批发\t430309\n本节\t430310\n近几年\t430311\n小三\t430312\n清华大学航天航空学院\t430313\n乳糜血\t430314\n唐七\t430315\n楚枫\t430316\n2.
4g\t430317\n创匠科技\t430318\nlcm\t430319\n麂子\t430320\n对否\t430321\n喂\t430322\n机动车交通事故责任强制保险\t430323\n寒论\t430324\nQQ\t430325\n不燃\t430326\n共餐\t430327\n类风湿关节炎\t430328\n自愧\t430329\n服务节\t430330\n20170123\t430331\n私产\t430332\nsniper3d\t430333\n隆裕\t430334\n睡一觉醒来\t430335\n新时代学校\t430336\n旧制\t430337\nz40\t430338\n装机量\t430339\nGamesir\t430340\n北京航空航天大学交通科学与工程学院\t430341\nmigration\t430342\nsmells\t430343\n珍爱生命\t430344\n区长\t430345\n蛋袋\t430346\nfusioncharts\t430347\n尼日利亚奈拉\t430348\n会声会影x8\t430349\n纳污坑塘\t430350\n巨世\t430351\n七星刀鱼\t430352\n抬头纸\t430353\nVivid\t430354\n霍比特人3\t430355\n古遗址\t430356\n噙\t430357\n学路\t430358\n安置帮教\t430359\n清考\t430360\n宣城市环保局\t430361\n香裙\t430362\n城品\t430363\n雷雨\t430364\n18044期\t430365\n一溜\t430366\n无尽空间2\t430367\nBells\t430368\n挂面\t430369\n铁木\t430370\n中国长江三峡集团公司\t430371\n后颈\t430372\nying\t430373\n石佛营\t430374\n惠州银行\t430375\n绝望主妇\t430376\n数图\t430377\n淇水\t430378\nchiron\t430379\n入朝\t430380\n奈叶\t430381\n丁霞\t430382\n尿量\t430383\n杂菌\t430384\n武松\t430385\ndongweiq\t430386\n1G\t430387\n三相线\t430388\n新婚\t430389\n中国商情网\t430390\n湖北省环境保护厅\t430391\n跟前\t430392\n挂载盘\t430393\n山东省人民政府\t430394\n70公斤\t430395\n金刚版\t430396\n奶农\t430397\n小米辣\t430398\nLabs\t430399\n单小源\t430400\n加工坊\t430401\nEdit\t430402\n痣\t430403\n九道\t430404\n谅解\t430405\n客户满意度\t430406\n惊起\t430407\n制霉菌素\t430408\n大同镇\t430409\n危机13小时\t430410\n最绝\t430411\n齐峰\t430412\n倾干\t430413\n校联\t430414\n闭着眼\t430415\npreschool\t430416\n卓林\t430417\n黄香\t430418\n桌桌\t430419\ndemons\t430420\n安利雅姿\t430421\n两亿\t430422\n嘉康利\t430423\n宜章\t430424\n布什\t430425\n简装修\t430426\n渔山岛\t430427\n竹皮\t430428\n大小写字母\t430429\n干问\t430430\n新丰镇\t430431\n王小山\t430432\n流亡之路\t430433\n斯巴达\t430434\n薄薄\t430435\n订价\t430436\n拉氏\t430437\n斗式\t430438\n首尔仁川机场\t430439\nFF13\t430440\n桥本甲状腺炎\t430441\n自由之战\t430442\n监发\t430443\n腹黑\t430444\n缝合处\t430445\n挡水板\t430446\n金坑\t430447\n上海国际旅行卫生保健中心\t430448\n光率\t430449\n舒远\t430450\n粮食\t430451\n五万元\t430452\nprimer\t430453\n官渡区\t430454\n投弹\t430455\n新英雄本色\t430456\n狙击镜\t430457\n安戈洛\t430458\n九路\t430459\n拼页\t430460\n韦杰\t430461\n凉亭\t430462\n唠叨网\t430463\
n大蜜\t430464\nnba梦之队\t430465\n撕腿\t430466\n胜利路街道\t430467\n长草期\t430468\n一个很久\t430469\n验资户\t430470\n庚金\t430471\nhibana\t430472\nlime\t430473\n内水\t430474\n迪康\t430475\n掌付通\t430476\n秦川机床\t430477\n空气过滤网\t430478\n彰泰\t430479\n金氏漂流记\t430480\nlateral\t430481\n团购价\t430482\n总书\t430483\n东方证券资产管理有限公司\t430484\n仁者见仁\t430485\n朱高炽\t430486\n午夜十二点\t430487\n大转\t430488\n3730\t430489\n全码\t430490\n颊脂垫\t430491\n节节败退\t430492\nzedboard\t430493\n6341\t430494\n爱上幼儿园\t430495\n鱼庄\t430496\n电子负载\t430497\nan+1\t430498\nUyghur\t430499\n冰河\t430500\n韶钢\t430501\n大径\t430502\n备份集\t430503\n泸沽\t430504\n5月10日\t430505\n卡斯特罗\t430506\n旧文\t430507\n银屑病关节炎\t430508\n儿童画网\t430509\n2010-2013年\t430510\n佛教网\t430511\n拍案\t430512\n嗅探\t430513\nPCMark\t430514\n秀浦路\t430515\n广州视源电子科技股份有限公司\t430516\n平安树\t430517\n11KW\t430518\n求偶遇\t430519\n灭佛\t430520\n半潜船\t430521\n雷殿\t430522\n代金\t430523\n带薪假\t430524\n自白书\t430525\n南通市第一人民医院\t430526\n静态路由表\t430527\nlightly\t430528\n科左后旗\t430529\n铜仁学院\t430530\n阳江市人力资源和社会保障局\t430531\nSTC单片机\t430532\n桩型\t430533\n原祖\t430534\n溪口雪窦山\t430535\n舒展\t430536\n高波\t430537\n杨鲁豫\t430538\n出入境检验\t430539\n藏进\t430540\n115平米\t430541\n潮白河\t430542\nhyperv\t430543\n殷周\t430544\n窝笋\t430545\nSinclair\t430546\n林夏\t430547\n上海造币厂\t430548\n戒托\t430549\n茂业\t430550\n冷场\t430551\n_门\t430552\nendocrine\t430553\n斯维因\t430554\n20国集团\t430555\n沪指\t430556\narchives\t430557\n翼梦\t430558\n电石炉\t430559\n高仿\t430560\nLVL\t430561\n信雅达\t430562\nDOCTYPE\t430563\n武汉通\t430564\n猪兔\t430565\n内贴\t430566\ncitadel\t430567\nTyres\t430568\nblnp\t430569\n杭州电子商务产业园\t430570\n剑者\t430571\nBen10\t430572\n十三亿\t430573\n县规划局\t430574\n工业集中区\t430575\n幼儿童\t430576\n引火烧身\t430577\nhhc\t430578\n蒜蓉辣酱\t430579\n浙江广播电视集团\t430580\n桦南县\t430581\n胰岛功能\t430582\n19章\t430583\n天行轶事\t430584\n地下室\t430585\n乐迪\t430586\n乔乔\t430587\n单硝酸异山梨酯缓释片\t430588\n陈浩民\t430589\n真空搅拌机\t430590\nDOLBY\t430591\n音频变压器\t430592\n扶危\t430593\n笹川\t430594\n糖尿病酮症酸中毒\t430595\n边器\t430596\n1.15\t430597\n安恒\t430598\n翠华餐厅\t430599\n七号\t430600\n光纤收发器\t430601\n2014年11月\t430602\nIIs\t430603\n彭钢\t430604\n世外桃园\t430605\n红儿\t43060
6\n免疫学\t430607\n前雨\t430608\n圆锯片\t430609\n波兰航空\t430610\n顺铂\t430611\n一一\t430612\n保护卷\t430613\n周琪洛\t430614\n绿城中国\t430615\n大江户\t430616\n武义政府网\t430617\nToString\t430618\n出空\t430619\n0.42\t430620\n算命师\t430621\n皮部\t430622\n标准箱\t430623\n损人\t430624\nexescope\t430625\nRuins\t430626\n漫流\t430627\n牛商\t430628\nnome\t430629\n40cr\t430630\n悭\t430631\n陈红\t430632\n沧州市区\t430633\n东方村\t430634\n有得\t430635\ne6\t430636\n那么近\t430637\nswiper2\t430638\n鼠胆龙威\t430639\n5777540510\t430640\n虐心小说吧\t430641\n教槽\t430642\noften\t430643\n牧场版\t430644\n非正\t430645\n裕民路\t430646\nOptimizer\t430647\n04:00\t430648\n尼龙网\t430649\njbox\t430650\nzyy\t430651\ne宠\t430652\n鸠摩罗什\t430653\n胆小\t430654\n多媒体信息箱\t430655\nExcelHome\t430656\n箱头\t430657\nhaw\t430658\n13000元\t430659\nAKM\t430660\n麦吧\t430661\nBugFree\t430662\n战神联盟\t430663\n云母\t430664\n宏观调控\t430665\n完成稿\t430666\n生乳\t430667\n磁偏角\t430668\n放热\t430669\n中国铁路成都局集团有限公司\t430670\n烧烤摊\t430671\n芨芨草\t430672\n高分辨率\t430673\nnr900\t430674\nhaoma\t430675\n2016年5月28日\t430676\nIFTTT\t430677\nnote3吧\t430678\n顺德\t430679\n武汉十七中教室门\t430680\n弓弦\t430681\n一点通\t430682\n真实魔鬼游戏\t430683\nUltraedit\t430684\n二叉\t430685\n20170707\t430686\n入职体检查\t430687\n酮戊二酸\t430688\n电子商务协会\t430689\n15%\t430690\npropolis\t430691\n明仁\t430692\nFollie\t430693\npone\t430694\n让子\t430695\nlolS6\t430696\n小风车\t430697\n长城机电\t430698\n广厦男篮\t430699\n天青石\t430700\n暴走嗦法\t430701\n数模论坛\t430702\n刘卫华\t430703\n菠菜汤\t430704\n聂微东\t430705\n面基\t430706\n上影影城\t430707\naxis\t430708\n四语\t430709\n诺兰艾薇儿\t430710\n十堰日报数字报|十堰日报\t430711\n天华山\t430712\n梅花魂\t430713\n白鹿原\t430714\n第101次\t430715\n肖德\t430716\n用友u9\t430717\n海沃\t430718\nYAO\t430719\n主打\t430720\n阿拉山口市\t430721\n康巴传媒网\t430722\n厚壁\t430723\n停车券\t430724\n佟二堡\t430725\n前女友们\t430726\n房地产信托\t430727\nVDA\t430728\n高碑店\t430729\n藤椅\t430730\nA400\t430731\nubuntukylin\t430732\n大物\t430733\n凶铃\t430734\nastype\t430735\nframework3.5\t430736\n工量\t430737\n80吨\t430738\n抽出\t430739\n河南会计网\t430740\n便携式打印机\t430741\n求解惑\t430742\nacetic\t430743\n名鞋库达人\t430744\nhongyuan\t430745\n科大国创软件股份有限公司\t430746\n佛山市顺德区人民政府\t4307
47\n好容易\t430748\n天津国税\t430749\n徐惠彬\t430750\n天赋\t430751\n天马镇\t430752\n等同\t430753\n20180101\t430754\n牛津通识读本\t430755\n善诊\t430756\ntreeselect\t430757\n众数\t430758\n宰相\t430759\n14P\t430760\n来凤县人民政府\t430761\n外汇交易员\t430762\nCODESOFT\t430763\nMMY\t430764\n犊\t430765\ndeeply\t430766\n单相桥式\t430767\n大唐镇\t430768\n臭鳜鱼\t430769\n37.6\t430770\n鲁明\t430771\n黑标\t430772\n@热血街舞团\t430773\n中建总公司\t430774\n50多个\t430775\n京城四美\t430776\natrial\t430777\n龙门起重机\t430778\n华创国际广场\t430779\n策论\t430780\n城站\t430781\n戴高乐机场\t430782\nRoperLee\t430783\ndebuggable\t430784\n四川大学外国语学院\t430785\n天邑股份\t430786\n十三少\t430787\n墨粉盒\t430788\n新成\t430789\nuhut\t430790\n吴小莉\t430791\n堂吉\t430792\n音效\t430793\nKano\t430794\n差额\t430795\nKenneth\t430796\nfata\t430797\nsetTimeout\t430798\nhove\t430799\n湖南省卫生和计划生育委员会\t430800\n兄长\t430801\nTSH\t430802\n天正建筑工具\t430803\n添上\t430804\n追及\t430805\n超级大黄蜂\t430806\nmow\t430807\n华为watch\t430808\nlea\t430809\nhistory2\t430810\nP7\t430811\n量贩式ktv\t430812\n共产儿童团歌\t430813\n麦德三世\t430814\n药物动力学\t430815\n脂溶性维生素\t430816\n使用版\t430817\n国际民航组织\t430818\nADAC\t430819\n应怜\t430820\n余洋\t430821\n黑面\t430822\n显卡花屏\t430823\n君弘\t430824\n斉藤\t430825\n赛菲\t430826\n12.01\t430827\n老秘\t430828\ngalgame\t430829\nGGJ2013\t430830\n惊涛\t430831\n中国时报\t430832\n燃料棒\t430833\n荆州花鼓戏\t430834\nEMA\t430835\n白虎堂\t430836\n荆州区\t430837\n水文站\t430838\n罗曼尼\t430839\n广东建设职业技术学院\t430840\n华远海蓝城\t430841\n1-7\t430842\n雕哥\t430843\n氢氧\t430844\n掉牙\t430845\n大番茄\t430846\n孙毅\t430847\n爱卡自助游\t430848\n国际国家\t430849\n人们\t430850\n150P\t430851\n零余额账户\t430852\nfemmes\t430853\n医用推车\t430854\n纸制品\t430855\n莆田\t430856\n油胺\t430857\n星彩\t430858\n化学化工学院\t430859\n砭术\t430860\n宁夏回族自治区国家税务局\t430861\n老方\t430862\n万血\t430863\n2.0【\t430864\n901足球网\t430865\nNoisy\t430866\n中国民族建筑研究会\t430867\nelitebook\t430868\n新县政府网\t430869\nProbabilistic\t430870\n大航海家3吧\t430871\n肺脓肿\t430872\n白芷粉\t430873\npfSense\t430874\n滴鼻液\t430875\n21.75\t430876\n信阳市政府\t430877\n智研咨询\t430878\nsusan\t430879\n小时工\t430880\n馨园\t430881\n国锋\t430882\n水活\t430883\n贵州电力\t430884\n风韵\t430885\ndvd刻录机\t430886\n情绪心理学\t430887\n
量化宽松政策\t430888\n碧霞元君\t430889\n魂牵梦\t430890\n一千次\t430891\n沃尔沃s80\t430892\n未知物\t430893\n轴用弹性挡圈\t430894\n红斑\t430895\n普化\t430896\n寻医问药网眼科\t430897\n横断面\t430898\n1环\t430899\n第27届\t430900\nEnclosure\t430901\n田亩\t430902\n福州市发展和改革委员会\t430903\n洛赛克\t430904\n41.8\t430905\nimsi\t430906\n第三范式\t430907\n7月\t430908\n油匠\t430909\n死门\t430910\n上海港澳\t430911\n电池柜\t430912\nEverlasting\t430913\n寿春\t430914\n再生障碍性贫血\t430915\n图+文\t430916\n八项规定\t430917\n无尽之路\t430918\n宾利欧陆GT\t430919\n明太祖\t430920\n收款单\t430921\nGym\t430922\npurge\t430923\n驱水\t430924\n80c\t430925\n8.9\t430926\n几万行\t430927\nacitivity\t430928\n淘宝客鹊桥\t430929\n杭州人才居住证\t430930\n英维克\t430931\n林哥\t430932\n弥敦\t430933\nwin10系统许可证\t430934\n畜牧兽医局\t430935\nmitutao\t430936\n0127\t430937\nBUS\t430938\n仓本\t430939\n刘晓波\t430940\n岚县\t430941\n共鸣\t430942\n批发市场\t430943\n王立军\t430944\n蹲下\t430945\nGAMEMAX\t430946\n猫眼竹芋\t430947\n打肿\t430948\nNotability\t430949\n汤姆生\t430950\n我们结婚吧\t430951\n言外之意\t430952\n资本化\t430953\n学生公寓楼\t430954\nITC\t430955\n腕龙\t430956\n毛邓\t430957\n黄喉吧\t430958\n祥云寺\t430959\n7路\t430960\n航美\t430961\n屏化\t430962\n初川南\t430963\n美冥\t430964\n电压值\t430965\n知己知彼\t430966\n随身\t430967\n阳泉北站\t430968\n肉戏\t430969\nShantou\t430970\n48小时\t430971\n构法\t430972\n达克\t430973\n06年\t430974\n了如指掌\t430975\n算号器\t430976\nwin10许可证\t430977\n呱呱助手\t430978\n一丝一毫\t430979\n科研处\t430980\n陈金飞\t430981\n浮世绘\t430982\n格瓦斯\t430983\n组词\t430984\n红旗渠\t430985\n绍介\t430986\n909\t430987\n南昌铁路局\t430988\n若风\t430989\n40篇\t430990\n聲\t430991\nSantiago\t430992\nbackdrop\t430993\n抗浮\t430994\nkexue\t430995\n妇科药\t430996\n解围\t430997\n汉宫\t430998\n横须贺\t430999\nhunt\t431000\n加鞭\t431001\n校划片\t431002\n古田县人民政府\t431003\n于泽远\t431004\n流水单\t431005\n議\t431006\nUsher\t431007\n防火布\t431008\n检察权\t431009\n航空兵\t431010\n王洪亮\t431011\n东南大学成贤学院\t431012\n韩山师范学院\t431013\n睡神\t431014\nSavage\t431015\n太子湾公园\t431016\n融汇温泉\t431017\nIphone5s\t431018\n满街\t431019\n骨癌\t431020\n一竿\t431021\n杨平\t431022\nwintc\t431023\n5公里\t431024\n哑面\t431025\n苏建\t431026\n佛山市高明区人民政府\t431027\n固本\t431028\n望眼欲穿\t431029\n塔形\t431030\nsai吧_\t431031\n在建\t431032\
n生活性\t431033\n全省\t431034\n连板\t431035\nmah\t431036\n仿人机器人\t431037\n大白天\t431038\nplg\t431039\n副教授\t431040\n高定\t431041\ncdi\t431042\nhlsl\t431043\n教学版\t431044\n易货嘀\t431045\n大地保险\t431046\nIconFont\t431047\n侵占\t431048\n区教体局\t431049\n阿根廷站\t431050\nnen\t431051\n克拉玛尔\t431052\n少算\t431053\n轻语\t431054\n王语纯\t431055\n紫竹院公园\t431056\n=\t431057\n2013.10\t431058\n小情侣\t431059\n5乘\t431060\n德卡\t431061\ny79\t431062\n刻画\t431063\n中国女篮\t431064\n鬼逝\t431065\n同业拆借\t431066\nROYCE\t431067\n布基\t431068\n国家公务员考试网\t431069\n即热式热水器\t431070\nbanben\t431071\n可瑞康羊奶粉\t431072\nRIDDLE\t431073\n星际争霸1\t431074\n马克兔\t431075\nchaotic\t431076\nuhd630\t431077\n041期\t431078\n嘉年华2018\t431079\n杜家毫\t431080\nrise\t431081\n级版\t431082\n连天\t431083\n密尔克卫\t431084\n飞狐外传\t431085\n残留项\t431086\n自营式\t431087\n托业考试\t431088\n王洪波\t431089\n雅迪电动车\t431090\nMutex\t431091\n112万\t431092\n医大二院\t431093\n49800\t431094\ndancer\t431095\n华股财经\t431096\n辽宁省高级人民法院\t431097\n066\t431098\nccmn\t431099\n秘鉴\t431100\n纪先生\t431101\n经四路\t431102\n中职生\t431103\n聚合物锂离子\t431104\nopto\t431105\n联华\t431106\nPKS\t431107\nultraiso\t431108\nIntegra\t431109\n希图斯\t431110\n纸品\t431111\n记叙文\t431112\nㄍ\t431113\nTravis\t431114\n再就业\t431115\n净池\t431116\nEFI分区\t431117\n淮南市人民政府\t431118\n衡水人才网\t431119\n梦奇\t431120\nx4|coreldraw\t431121\n支局长\t431122\n蜗\t431123\n草菇\t431124\nNSA\t431125\n1200plc\t431126\nliunx\t431127\nbaas\t431128\npoisoning\t431129\n中国省\t431130\n4GB/1TB\t431131\n杨志强\t431132\n萨普\t431133\n艾瑞莉娅\t431134\n17134.5\t431135\n歼31\t431136\n苯巴比妥\t431137\n古剑山\t431138\nNTV\t431139\n40109\t431140\nA5创业网\t431141\n备婚\t431142\neveredit\t431143\npyl\t431144\nSlider\t431145\n杭州浙二医院\t431146\n无人驾驶车\t431147\n烟把儿\t431148\nfoxmail\t431149\n亨氏\t431150\n实习\t431151\n沪通长江大桥\t431152\n自学网\t431153\nbig5.chinastock.com.cn/yhwz/astock/shareholdersEquityAction\t431154\n血防\t431155\n麻仁丸\t431156\n10H\t431157\n83017578\t431158\n亦庄桥\t431159\n输出值\t431160\n变女\t431161\nzoosex\t431162\n七煌\t431163\n饮酒\t431164\nCork\t431165\n司训\t431166\n戦士\t431167\n917房产网\t431168\n谱_吉他吧\t431169\nTeredo\t431170\n龚玥菲\t431171
\n渊明\t431172\n1930年代\t431173\n拖延症\t431174\n虎榜\t431175\n用材\t431176\nkanebo\t431177\n海南省政协\t431178\n心斋桥\t431179\n聚维酮碘溶液\t431180\nTINY\t431181\nLINQ\t431182\n故事类\t431183\nG42\t431184\n徐章勋\t431185\nmegaraid\t431186\n少妇的诱惑\t431187\n耽美文\t431188\n门甲\t431189\n无锡动物园\t431190\n苏州分公司\t431191\nconstance\t431192\ntransmission\t431193\n钱庄\t431194\n中心小学\t431195\n假阳\t431196\n光整\t431197\n屠龙刀\t431198\nStreng\t431199\n袁仁国\t431200\n滇西北\t431201\n禁用\t431202\n摆设\t431203\n武弘文\t431204\n百洋股份\t431205\nimindmap\t431206\n同影\t431207\n纯电动客车\t431208\ncpu超频\t431209\n方利旭\t431210\n57折网\t431211\n2001-2010年\t431212\n邢冬冬\t431213\n上海航空快递\t431214\n去痘\t431215\niic\t431216\nP闪\t431217\nFOF\t431218\n杨巨源\t431219\n第85章\t431220\n深圳湾悦府\t431221\nemissions\t431222\n16163游戏网\t431223\n河长制\t431224\n国码\t431225\n行当\t431226\n群租房\t431227\n87键\t431228\n阳极氧化膜\t431229\n星姐\t431230\n雨宮\t431231\n停机位\t431232\n湖南省教育厅\t431233\nduoshaoqian\t431234\n深南股份\t431235\n简信\t431236\n谭老师\t431237\n寻芳\t431238\n燥\t431239\n卡什\t431240\n褚时西游记之女儿国\t431241\n几十k\t431242\n釜山港\t431243\ninfra\t431244\n腾讯云通信\t431245\nPPOMO\t431246\n多女\t431247\n陈奂生\t431248\n79级\t431249\n教程篇\t431250\n言文\t431251\n化工人\t431252\n雪水\t431253\n49分钟\t431254\nScaffolding\t431255\n台州市公安局\t431256\n卡门涡街\t431257\nbugly\t431258\nLUMION\t431259\n食道癌\t431260\n洋子\t431261\njqm\t431262\n美国宇航局\t431263\n满面\t431264\n中国羽毛球协会\t431265\n申子辰\t431266\n7750\t431267\n商业贷款利率表\t431268\ncrew\t431269\n白马寺\t431270\nMy思密达\t431271\nhou\t431272\nReShade\t431273\nBD吉吉影音\t431274\n退烧\t431275\n冠道报价-易车网\t431276\n轻易\t431277\n戊二醇\t431278\n哈夫曼树\t431279\n大鸡排\t431280\nrt\t431281\n小米Max2\t431282\n8v\t431283\n妊活\t431284\n雀巢能恩\t431285\n利艾\t431286\n2o17\t431287\n方文山\t431288\n小轮车\t431289\n孩纸\t431290\n胜任力\t431291\n3199\t431292\n声呐\t431293\n大兴一中\t431294\nluckin\t431295\n张宝林\t431296\n三单元\t431297\nDuerOS\t431298\n攻城混沌与秩序\t431299\n2018.3月\t431300\n佛山市委\t431301\nAMZN\t431302\n肖鹏\t431303\n建龙集团\t431304\n恐惧之泣\t431305\n上皮细胞\t431306\nB200\t431307\n算下\t431308\n醉鸡\t431309\n550公里\t431310\nlego\t431311\n103.9\t431312\n旅游村\t431313\n牧通人才网\t4
31314\n三包法\t431315\n练武\t431316\n转义字符\t431317\n天猫优惠券\t431318\n360路\t431319\n伤兵\t431320\n内乡\t431321\n殡仪\t431322\n短信包\t431323\n鹏程\t431324\nARVR\t431325\n北京市卫生和计划生育委员会\t431326\n超预期\t431327\n杨汉军\t431328\n制造工\t431329\n智字\t431330\n高帅\t431331\n电磁制动器\t431332\n飞科电器\t431333\n28处\t431334\n魔障\t431335\nnssa\t431336\n全薪\t431337\n姬鹏飞\t431338\n均一性\t431339\n廖华\t431340\n中系\t431341\n孚邦\t431342\n失业生\t431343\n开源中国\t431344\n教学大楼\t431345\nOSL\t431346\n陵墓\t431347\n眼镜\t431348\n霍言深\t431349\n住房和城乡建设部\t431350\n诺如病毒感染\t431351\n肠胃们\t431352\nsolidwords\t431353\n增值税法\t431354\n郑宇\t431355\n卡卡安全论坛bbs\t431356\n林更新\t431357\nprofitable\t431358\n中介语\t431359\n涉江\t431360\n命令栏\t431361\n沈阳经济区\t431362\n破伤风免疫球蛋白\t431363\n新建村\t431364\npice\t431365\nsoutong\t431366\nKi\t431367\nstare\t431368\n舒筋\t431369\n平板太阳能集热器\t431370\n用者\t431371\n平头哥\t431372\n靠岸\t431373\n葡萄柚\t431374\n东方财经\t431375\n有种人\t431376\niso9001认证\t431377\n释迦果\t431378\n北京101中学\t431379\nstm32吧_\t431380\n_慕课猿\t431381\n老儿\t431382\n丰惠\t431383\n硝酸铅\t431384\n维生素E乳\t431385\n辽宁铁道职业技术学院\t431386\n青青草_青青草\t431387\n大泽镇\t431388\n闸机\t431389\n0000000\t431390\n兼议\t431391\n3万亿\t431392\nACCOUNT\t431393\n循环流化床锅炉\t431394\n好声\t431395\n辅仁大学\t431396\nKKBOX\t431397\n六折\t431398\n闪龙\t431399\nv2.00\t431400\n伯爵\t431401\nSSL协议\t431402\n医械\t431403\n100ppm\t431404\n控制臂\t431405\n苏州汽车南站\t431406\n寸步难行\t431407\n全盛\t431408\ns400\t431409\n自编码\t431410\n有多爱\t431411\n9201\t431412\n82553943\t431413\n王牌三国\t431414\n各科目\t431415\n哲学生活\t431416\n矢量变频器\t431417\n域控\t431418\n苏联吧\t431419\n火起来\t431420\n河北消防\t431421\n大方广佛华严经\t431422\nminolta\t431423\n随堂\t431424\nArchitects\t431425\n公司办事处\t431426\n120分\t431427\n鼓谱\t431428\n河北省教育厅\t431429\nJMeter\t431430\nkennedy\t431431\n了不起的狐狸爸爸\t431432\ndinphy\t431433\n起路\t431434\n郑磊\t431435\n莅校\t431436\n伦茨\t431437\n崧厦镇\t431438\n武汉市纪委\t431439\n湖南省经信委\t431440\n玄凤鹦鹉吧\t431441\n赤胆忠心\t431442\nHarm\t431443\n1.006\t431444\n4-5年\t431445\n会声会影x6\t431446\n结婚证书\t431447\n30【\t431448\n电工证考试\t431449\nグル\t431450\n几世\t431451\n3302\t431452\n7版\t431453\npaints\t431454\nCtags\t431455\n皇岗村\
t431456\n中国国际航空股份有限公司\t431457\nDongguan\t431458\n匡山\t431459\n砌块墙\t431460\n插色\t431461\n吸水\t431462\nflyking\t431463\n坐标值\t431464\nTimeMachine\t431465\n永续年金\t431466\n吴建新\t431467\n温控器\t431468\n液压油\t431469\nDuncan\t431470\nTED演讲集\t431471\n潮白河大桥\t431472\n母法\t431473\nLube\t431474\n菱纱\t431475\nBare\t431476\nca6140\t431477\nCIOE\t431478\n福建省纪委监察厅\t431479\n2018年2月6日\t431480\n无量纲\t431481\n电霸\t431482\n纤板\t431483\n200种\t431484\nAdler\t431485\n孙康\t431486\n里飞沙\t431487\n折心\t431488\n拖油瓶\t431489\nMraz\t431490\nflute\t431491\nmillis\t431492\n268\t431493\n克朗斯\t431494\n预付费电表\t431495\n参松养心胶囊\t431496\ndirty\t431497\n20170425\t431498\n高新人才网\t431499\n空气滤\t431500\nlegging\t431501\n灵焕\t431502\nInventions\t431503\n60个月\t431504\n圣骑士\t431505\n羽悦\t431506\n工商银行手机银行\t431507\n别称\t431508\n杨洛\t431509\nfmh\t431510\n浑然\t431511\n堇\t431512\n疾\t431513\n王牌特工\t431514\n堂吉柯德\t431515\n张静江\t431516\n全国哲学社会科学规划办公室\t431517\n爆客\t431518\n莫斯科大剧院\t431519\nlog4\t431520\n思八达\t431521\nfold\t431522\ngachin\t431523\n得胜镇\t431524\n周晓明\t431525\n646\t431526\n周村区\t431527\n小句\t431528\n阴阳冕\t431529\n保本基金\t431530\n威望\t431531\n真三国无双7猛将传\t431532\n鲁中商业吧\t431533\nV2013\t431534\n51名\t431535\n泪光\t431536\n文革\t431537\nconduit\t431538\n内盘\t431539\n杭菊\t431540\n间桐樱\t431541\n棕熊\t431542\n筛机\t431543\n酸爽\t431544\nRING\t431545\nrld\t431546\n娇韵诗双萃\t431547\n鲈鱼\t431548\ncr7\t431549\n五金网\t431550\n投诉信\t431551\ntoolchain\t431552\n海天花园\t431553\n拉钩网\t431554\n头行\t431555\npdfx\t431556\nTeese\t431557\n4句\t431558\ntip\t431559\n消防安全技术实务\t431560\nbeforeeach\t431561\ninherent\t431562\nipt\t431563\ncooled\t431564\n林国华\t431565\n伟嘉\t431566\n东方小学\t431567\n巴拿山\t431568\n中泰国际\t431569\nping\t431570\n景墙\t431571\nBelfast\t431572\n羊脂斯琴高娃\t431573\n小猎\t431574\n压缩算法\t431575\n工人运动\t431576\n望天\t431577\n葬明\t431578\n米琼恩\t431579\n中国诚通\t431580\n天外来菌\t431581\ncuz\t431582\nVS字幕组\t431583\n107\t431584\n自动提款机\t431585\n阳光海南网\t431586\n亭湖\t431587\njuice\t431588\n测速\t431589\n青海银行\t431590\n36kr.com\t431591\n高四\t431592\n地下城与勇士DNF\t431593\n白鹇\t431594\n讲席\t431595\nResume\t431596\nslave\t431597\n
生物酶\t431598\n逸富\t431599\n总负债\t431600\n201609\t431601\n5朵\t431602\nClash-of-Clans_九游论坛\t431603\npushbutton\t431604\n竹杆\t431605\n2017年元旦\t431606\nbiases\t431607\n采光瓦\t431608\n两个头\t431609\nWFS\t431610\n武林艳史\t431611\n黑白线\t431612\n金田一吧_\t431613\n鬼区\t431614\n淄博市\t431615\n明仕田园\t431616\n铃儿\t431617\n育雏\t431618\n青青园\t431619\n会声会影\t431620\n怪物猎人4g\t431621\n两种方\t431622\n安徽大学》\t431623\n工口漫画网\t431624\n钦州教育信息网\t431625\nP51\t431626\n散射\t431627\n彩贝\t431628\n一个四位\t431629\n进球彩\t431630\nr2\t431631\n赵良田\t431632\n中国机械工业联\t431633\nlicensing\t431634\n老龙\t431635\n杭州G20峰会\t431636\n期末考\t431637\nmalloc函数\t431638\n小鳄龟\t431639\n五千万\t431640\n大船\t431641\n诛仙\t431642\n充许\t431643\n西府海棠\t431644\n三会一课\t431645\n湖北省机关事务管理局\t431646\n河北宣工\t431647\n欢乐谷\t431648\n万景峰\t431649\n电流源\t431650\n1.5h\t431651\n汕头市政府\t431652\n陕西航天信息\t431653\n三毛\t431654\n顶爆\t431655\n玩梗\t431656\n直接标价法\t431657\n223.104.176.6\t431658\n孢子粉\t431659\n小天使幼儿园\t431660\n心超\t431661\n绿屏\t431662\n纯牛奶\t431663\n针织物\t431664\n许攸\t431665\n7.07\t431666\n鸡肉\t431667\n奇葩鱼\t431668\n狼友们\t431669\n求\t431670\n杨奇函\t431671\n狮子王2\t431672\n工农武装割据\t431673\nMedi\t431674\n华同社区\t431675\n脱发\t431676\nangular1\t431677\n幼儿园园\t431678\n6余\t431679\n_余\t431680\n徐志刚\t431681\n语训\t431682\nSpring、Struts\t431683\n一尘网\t431684\n怡橙\t431685\n移动充电宝\t431686\n信城通\t431687\n瑞蓝\t431688\n免疫力\t431689\n葫芦头\t431690\n防癌\t431691\n花作文\t431692\n新世界百货中国有限公司\t431693\n海滩\t431694\n神界危机4.7\t431695\n乡镇企业\t431696\n法桐树\t431697\n乐辉\t431698\n愧疚\t431699\n223.104.90\t431700\n亲亲我的小宝贝\t431701\nClubs\t431702\n信长野望\t431703\n雄兵连乾坤篇\t431704\n沉砂池\t431705\n综合计算工时工作制\t431706\n软件定义网络\t431707\n宠亲亲\t431708\n鸿程\t431709\nappsettings\t431710\n金茂悦\t431711\n南方电视台\t431712\nVGA\t431713\nMgCl2\t431714\n阿波罗登月\t431715\nvray吧\t431716\n百度搜索引擎\t431717\nDAY1\t431718\n第七世\t431719\n86天\t431720\n好惨\t431721\n澎湃新闻-The\t431722\ndyli\t431723\n卡行天下\t431724\n秦州\t431725\n常青\t431726\n奶妹\t431727\n20150924\t431728\n我的眼\t431729\n来自\t431730\n27分\t431731\n市寸\t431732\n狂战传说\t431733\n拆弹专家\t431734\nbmfont\t431735\n百泉\t431736\n差值\t431737\n明源软件\t431738\n无间断\t4
31739\n空心\t431740\n0xb0\t431741\n泰豪科技\t431742\nATmega\t431743\n夹边沟记事\t431744\n香港特首\t431745\n湖州火车站\t431746\n威萨\t431747\n寻迹\t431748\n青松岭\t431749\n猪仔\t431750\n掌众\t431751\n下一分钟\t431752\n戚美珍\t431753\nweihai\t431754\ndisan\t431755\nimacros\t431756\n周润何姿\t431757\n多笔\t431758\n榕基软件\t431759\n出口部\t431760\n16年3月\t431761\n不支\t431762\n辐射1\t431763\n沉默寡言\t431764\n追随\t431765\nRPC框架\t431766\n马艳丽\t431767\n别名\t431768\nCulturelle\t431769\n5厘\t431770\n荣耀东皇太一\t431771\n中国铁路95306网\t431772\n中节\t431773\n深圳电子厂\t431774\n千帆过尽梦无痕\t431775\n晶格\t431776\n魔仙\t431777\n源文件\t431778\n万里长城永不倒\t431779\n金晟\t431780\n东方有线\t431781\n许华升\t431782\n海底总动员2\t431783\n表现型\t431784\n白云机场\t431785\n2000年后\t431786\n有机玻璃制品\t431787\n南麓\t431788\nelasticity\t431789\n88式\t431790\n美容美体\t431791\n第六段\t431792\n滑片\t431793\n顺丰科技有限公司\t431794\nJessi\t431795\n云南省人民政府\t431796\n茨城机场\t431797\nWIN2008R2\t431798\n吴易昺\t431799\n防爆空调\t431800\n黄金洞\t431801\n舟山中学\t431802\n钱谦益\t431803\n咪咕善\t431804\n差不多\t431805\n北京师范大学艺术与传媒学院\t431806\n谢华\t431807\n果洛州\t431808\n否极泰来\t431809\n政治学院\t431810\n大金刚\t431811\n河南省新乡市\t431812\n蔻驰\t431813\n高管们\t431814\n奶色\t431815\n质优价廉\t431816\n苏雅\t431817\n一墙之隔\t431818\n电阻值\t431819\nstb\t431820\n2018年下半年\t431821\n瑞元\t431822\n建筑工程一切险\t431823\n上下五千年\t431824\n固定电话号段\t431825\n梅奥\t431826\n最长\t431827\n李宏图\t431828\n13阶\t431829\ngbs\t431830\n倒入\t431831\n保合\t431832\n新纪元通用账证\t431833\n黄新淳\t431834\n微pe\t431835\ncope\t431836\n产地\t431837\n荷西\t431838\n87%\t431839\n微信指数\t431840\n顺丰快递公司\t431841\n120款\t431842\nResNet\t431843\n召唤物\t431844\nedugd\t431845\nbjhhh\t431846\n假发套\t431847\ntanx\t431848\nCRH\t431849\nextern\t431850\n吕毅\t431851\n第四声\t431852\n杂乱无章\t431853\niglol\t431854\n勒·柯布西耶\t431855\n装柜\t431856\n喜来乐\t431857\n2017.10.25\t431858\n往复式\t431859\n三个代表\t431860\n皎陽\t431861\n爱爱谷\t431862\nUg\t431863\n宕渣\t431864\n大理政府\t431865\n乡镇派出所\t431866\n凝神\t431867\n间谍\t431868\n网易终结者2\t431869\n手式\t431870\nseventeen\t431871\n病休\t431872\n贾国军\t431873\nTAR\t431874\n影调\t431875\n眼球\t431876\n万科良渚文化村\t431877\ncountdown\t431878\n授课型\t431879\nEmpower\t431880\n皮革制品\t431881\
n寿宁县\t431882\nNing\t431883\n填制\t431884\n1.4L\t431885\n文明4\t431886\n丰田双擎\t431887\n即时通讯\t431888\nelec\t431889\n飘絮\t431890\n叶莎\t431891\n花样大全图\t431892\n2820\t431893\n雷霆空战\t431894\nTUMI\t431895\n上马街道\t431896\n抓好\t431897\n移动图书馆\t431898\n桥东区\t431899\n运动生理学\t431900\nkitkat\t431901\nNetcat\t431902\n12329\t431903\nt460p\t431904\n锐捷\t431905\n中外运\t431906\n补刀\t431907\n0.10.2\t431908\n市委办公室\t431909\n晶科能源有限公司\t431910\n后轮\t431911\n金融诈骗罪\t431912\n55R16\t431913\n再版\t431914\n企业所得税汇算清缴\t431915\n白叶\t431916\ngamadrid\t431917\n露安适\t431918\n了望天涯\t431919\n国投公司\t431920\n刺绣\t431921\n童孙\t431922\n新天下无双\t431923\n大天鹅\t431924\n张雁\t431925\n世良真纯\t431926\n李尚志\t431927\n辽宁省工商局\t431928\n16万元\t431929\n弯梁\t431930\n宠物小精灵防御战\t431931\n超威集团\t431932\n新浪广西_新浪网\t431933\n昌北国际机场\t431934\n打蔫\t431935\n猛进\t431936\n马版\t431937\n九三学社中央\t431938\n水店\t431939\n2016.1.2\t431940\n蜀山区人民政府\t431941\n东正\t431942\n土嗨\t431943\nappending\t431944\n登高面\t431945\n广告公司\t431946\n陈子豪\t431947\n百蕊颗粒\t431948\n思维力\t431949\n其位\t431950\n合肥职业技术学院\t431951\nSugarBaby\t431952\n屈从\t431953\n9款\t431954\n净值化\t431955\n隈研吾\t431956\nles片\t431957\n随机森林算法\t431958\n阵风\t431959\n洪金宝\t431960\nmakedown\t431961\nRex\t431962\n孔尚任\t431963\n无翼鸟之火影忍者\t431964\n溶血性黄疸\t431965\n豪威\t431966\nexecutive\t431967\n长泾\t431968\nliepin\t431969\nrinth\t431970\n吴宗\t431971\n护盾\t431972\n米歇尔奥巴马\t431973\n贺子\t431974\n冻膜\t431975\n淘气卡\t431976\n周洛大峡谷\t431977\n懒虎\t431978\n十九大政府\t431979\n户表\t431980\n联想小新潮\t431981\nadvice\t431982\n撑船\t431983\nzhiqi\t431984\n太岁庙\t431985\n腋温\t431986\n专业度\t431987\n长兴路\t431988\n仲裁案\t431989\n女圣\t431990\n非非\t431991\n湾里区\t431992\n西畴县人民政府\t431993\n矿业\t431994\n金纳米\t431995\n南充房产信息网\t431996\n重巡\t431997\n不给力\t431998\n高蛋白\t431999\n河南省高院\t432000\n宠物猫\t432001\n第四十一\t432002\n函电\t432003\n手斧\t432004\n阿诺德渲染器\t432005\n实幼\t432006\n洛夜\t432007\n程海\t432008\n小魔\t432009\n碑石\t432010\n到喜\t432011\n铁观\t432012\nyyyymmddhhmmss\t432013\n宁夏医科大学总医院\t432014\n合群\t432015\n无法抗拒\t432016\n风谷\t432017\n觐见\t432018\n子歌\t432019\n价格单\t432020\n祖居\t432021\n林海雪原\t432022\n苏源\t432023\n长沙汽车东站\t432024\n逆推\t432025\
n平老\t432026\n自传体\t432027\n恶动图\t432028\n翠湖花园\t432029\nFirePro\t432030\n宝马M6\t432031\nntd\t432032\n超男\t432033\nUltraman\t432034\nwuhu\t432035\n体院\t432036\n奔驰c200\t432037\nTWINS\t432038\n赵小宝\t432039\n科举殿\t432040\n磁传感器\t432041\n暗黑破坏神3_Diablo3\t432042\nproducts\t432043\n喷墨\t432044\n日本女团\t432045\n梯度算子\t432046\n绿帽子\t432047\n中国无锡政府\t432048\n毛霉\t432049\n松冈\t432050\n邓鸣贺\t432051\n大雪纷飞\t432052\n黑兔\t432053\n奢华\t432054\n建筑业\t432055\n小羽\t432056\nHousing\t432057\n1.7.10\t432058\n蒿坪镇\t432059\n20170825\t432060\n承包方\t432061\nt540p\t432062\n骑行\t432063\n蜂蜜蛋糕\t432064\n小人才\t432065\n书橱\t432066\n王西京\t432067\n千千静\t432068\n人轮\t432069\n臻\t432070\nibb\t432071\nemerge\t432072\n富兰克林·罗斯福\t432073\n新成路街道\t432074\nNLog\t432075\nbootStrap\t432076\n统发\t432077\n水口村\t432078\n天祥集团\t432079\ncm2\t432080\n华乐\t432081\n石冢\t432082\n大幂幂\t432083\n珠光体\t432084\nNORD\t432085\n异常\t432086\n马路\t432087\n九阴真经\t432088\n加根\t432089\n武统\t432090\n跑腿网\t432091\n方博\t432092\n七乐康\t432093\n魅蓝5S\t432094\nsqlserver2016\t432095\n先孕\t432096\n珠花园\t432097\nextracted\t432098\n咸水鱼\t432099\n6例\t432100\n龙泽罗拉\t432101\nmsvcr\t432102\nSRS-HG2\t432103\n暖意\t432104\n雏鹰农牧集团\t432105\n邢质斌\t432106\n单车道\t432107\n晋能集团\t432108\nvuerouter\t432109\n老车站\t432110\n砀山梨花节\t432111\n爱玩\t432112\n吡非尼酮\t432113\n涪陵区\t432114\n哈德\t432115\n570分\t432116\n古井贡\t432117\n华汉\t432118\n濮耐股份\t432119\n太阳花\t432120\n颜卡\t432121\nAOS\t432122\nsequelize\t432123\n新事情\t432124\nbreezefeng\t432125\n华强广场\t432126\nPOOL\t432127\n第76期\t432128\n克里斯·保罗\t432129\n肉味\t432130\n2020年\t432131\n养母的花样年华\t432132\n2018年5月26日\t432133\n威娜\t432134\n刘鸿\t432135\n滚条\t432136\n赵耀\t432137\n只想\t432138\n磁铁矿\t432139\n猫店\t432140\n许飞\t432141\n成都一小区\t432142\ndpk\t432143\n碟中谍4\t432144\n于洪广场\t432145\n11.24\t432146\n致上\t432147\nRemains\t432148\n家政女皇\t432149\n舞蝶广场舞\t432150\n云南省物价局\t432151\n佰怡\t432152\n新篇章\t432153\n分治算法\t432154\n禁飞区\t432155\nEXCEL2016\t432156\nsencha\t432157\n0.40\t432158\n家用绞肉机\t432159\n乌班图\t432160\n日照新闻网\t432161\n育儿经\t432162\n李承哲\t432163\n导师们\t432164\n亚森\t432165\ntorrent种子BT\t432166\n非线性分析\t432167\n推射\t43
2168\n七彩灯\t432169\n世俱杯\t432170\n怕怕\t432171\n化工类\t432172\nclaris\t432173\n海鲜粥\t432174\n送来\t432175\n进气\t432176\njqGrid\t432177\n梦之声\t432178\n陆小凤\t432179\n金亚\t432180\n2015年\t432181\n16299.125\t432182\n波士顿华人资讯网\t432183\n多线程编程\t432184\n捷途捷途X702018款\t432185\n元宝枫\t432186\n棉丝\t432187\nChromeDriver\t432188\n进疆\t432189\n塔州\t432190\n美少妇\t432191\n6000亩\t432192\n诊刮\t432193\n阎王殿\t432194\n三档\t432195\n内衣\t432196\n绝毛\t432197\n四朝\t432198\n金总\t432199\n新政\t432200\n破密\t432201\nwocam\t432202\n懒月\t432203\n自考主考\t432204\n游乐园\t432205\n批发市场价格\t432206\n念破\t432207\ncampaigns\t432208\n数理化网\t432209\n海上明月\t432210\n神话版三国\t432211\n季票\t432212\n游戏卡\t432213\n2.9T\t432214\n蒸压粉煤灰砖\t432215\n微幅\t432216\nbeth\t432217\n护照\t432218\nf20\t432219\n直流电动机\t432220\n郁南县\t432221\n新市场\t432222\n中国盐业总公司\t432223\nFore\t432224\nBazaar\t432225\n取经\t432226\n2第四章\t432227\nPI\t432228\n交银康联\t432229\n小咖\t432230\n玛丽安\t432231\n毕业论文+CAD\t432232\n北京摇号吧\t432233\n武隆\t432234\n宰客\t432235\n壮骨\t432236\nmatx\t432237\n钦州路\t432238\nCaspase\t432239\n半叶\t432240\n脱胶\t432241\n祖玛\t432242\n电解镍\t432243\n奉劝\t432244\n渣油\t432245\n广东人事考试网\t432246\nchef\t432247\n云南省投资控股集团有限公司\t432248\nVuitton\t432249\n屏东\t432250\n荡剑币\t432251\n037期\t432252\nRite\t432253\n纯黑吧\t432254\n天空出现\t432255\n大庆机场\t432256\n心胸狭窄\t432257\n黑龙江省国家税务局\t432258\nDREAM\t432259\n杨之光\t432260\n三乙\t432261\n多萝西\t432262\n余永定\t432263\n累累\t432264\nIP地址查询|-2CHA\t432265\n17.3英寸\t432266\n罗伯特卡洛斯\t432267\n于涛\t432268\n天龙八部唐门\t432269\n活法\t432270\n集团型\t432271\n最值钱\t432272\n东方宫\t432273\n电视猫\t432274\n料盒\t432275\n二十六条\t432276\nDump\t432277\n归航\t432278\n祖母绿宝石\t432279\ncago\t432280\n工智能\t432281\n堂吉诃德\t432282\n开耕\t432283\n赵曙光\t432284\n疲弱\t432285\nBio\t432286\n琴帝\t432287\n_楷书字帖\t432288\n脆皮蛋糕\t432289\n陕西历史博物馆\t432290\n风儿\t432291\n耳听\t432292\n国有资本投资公司\t432293\n花底\t432294\n笑纳\t432295\n同班同学\t432296\nLONG\t432297\n活死人墓\t432298\n康养中心\t432299\n2022冬奥会\t432300\ngnib\t432301\n王晋康\t432302\n拼立\t432303\n170g\t432304\n大海战2\t432305\n裱画\t432306\n龙尾\t432307\n从来没\t432308\n走天\t432309\n16年04月\t432310\n泰康路\t432311\n不将\t432312\n园林卡\
t432313\n太平洋亲子网\t432314\nsheldon\t432315\n程序存储器\t432316\n幻海\t432317\n修正主义\t432318\n实践部\t432319\nethanol\t432320\n特等奖学金\t432321\n顾顺\t432322\n点点头\t432323\n岳红\t432324\n恐吓\t432325\ndada\t432326\n一格一格\t432327\n怪猎世界\t432328\n刘凤\t432329\n近处\t432330\n双林股份\t432331\ncali\t432332\n连用\t432333\n丁文\t432334\n犀牛5\t432335\n倒数\t432336\n八瓣\t432337\n控制流\t432338\n饺子宴\t432339\nRQ\t432340\n分包\t432341\n羧\t432342\n就业办\t432343\n新钢股份\t432344\n花儿为什么这样红\t432345\n0.9mm\t432346\n诚聘\t432347\nclb\t432348\n300小时\t432349\n固始县\t432350\n妈妈的爱\t432351\n2303am\t432352\n资本公积转增股本\t432353\n晚饭\t432354\nQQ邮箱\t432355\nisempty\t432356\n猪肚\t432357\n新街口\t432358\n百里峡\t432359\n网站群\t432360\nbms\t432361\n侦察机\t432362\n后背部\t432363\n啪啪网\t432364\n搅拌桩\t432365\nxtc\t432366\n第A09\t432367\n月福\t432368\nServer论坛\t432369\nStartup\t432370\n彩吧网\t432371\nXXE\t432372\n53秒\t432373\n崇\t432374\nx299\t432375\nfinereport\t432376\n嘉兴学院\t432377\n理工大\t432378\n宫宫\t432379\n表态性\t432380\n众彩网\t432381\n牢\t432382\n大眼睛\t432383\n次数\t432384\n狮子\t432385\nuas\t432386\n百战不殆\t432387\n安提瓜\t432388\n翰林学院\t432389\n哲品\t432390\nApereo\t432391\n装饰柱\t432392\n横板\t432393\n锉削\t432394\n三百亿\t432395\n维扬\t432396\n模拟地\t432397\n福朋\t432398\nSecurities\t432399\n询证函\t432400\n巴蜀\t432401\n群雄逐鹿\t432402\n^\t432403\nkart\t432404\n弱电工程公司\t432405\n舒适堡\t432406\n优先度\t432407\n长恨歌\t432408\n创成式\t432409\n结算单\t432410\nJude\t432411\n3gp\t432412\n指甲剪\t432413\naris\t432414\n作物\t432415\nUndo\t432416\nclarify\t432417\n不救\t432418\n二婚女\t432419\n瑞希\t432420\n2009届\t432421\n12角\t432422\n学历证\t432423\n哈雷\t432424\n孩\t432425\n金韩\t432426\n劳险\t432427\n7720\t432428\n削片机\t432429\n第83期\t432430\n拔草\t432431\n毛刷辊\t432432\nmake\t432433\n尊道\t432434\n拓荒者\t432435\n压光机\t432436\nprobation\t432437\n北京电视台\t432438\n哮喘病\t432439\n机械革命\t432440\n听完\t432441\n一式三份\t432442\n财富宝\t432443\n投下\t432444\n包茂高速\t432445\nbut\t432446\n法波\t432447\n第7代\t432448\nSinging\t432449\n丰田rav4\t432450\n木蚂蚁安卓\t432451\n火炬之光2\t432452\n老校\t432453\n座票\t432454\n包银\t432455\n广东省环境保护厅\t432456\n国家医学考试中心\t432457\n白花油\t432458\n麻麻\t432459\n牛市\t432460\n珀斯\t43
2461\nhare\t432462\nVIew\t432463\n末地\t432464\n甘十九妹\t432465\nmonkeys\t432466\n起点网\t432467\n大连市委\t432468\n她的秘密\t432469\nViolence\t432470\n6V\t432471\n墙上\t432472\n标准砖\t432473\n关兴\t432474\n死皮赖脸\t432475\nmsmtp\t432476\ncloudfoam\t432477\n中药包\t432478\n销售力\t432479\n广州城\t432480\n空之轨迹sc\t432481\ndiagnose\t432482\n数点\t432483\n580.1\t432484\n链栈\t432485\nclk\t432486\n利益链\t432487\n优夏\t432488\n北京知识产权法院\t432489\n林彤\t432490\n磺胺\t432491\n胎盘\t432492\nscheduler\t432493\n姚前\t432494\n亚庇机场\t432495\nEXC\t432496\nfeice\t432497\n朱世杰\t432498\n激情版\t432499\nv2.1.6\t432500\n全额拨款\t432501\n碧雪情天\t432502\n性冷淡风\t432503\nUndefeated\t432504\n魏玛\t432505\n缺墨\t432506\n昆山注册公司\t432507\nbanzou\t432508\n大冰\t432509\n黑摩季\t432510\n拔刀相助\t432511\n小米手环2吧\t432512\n雷沃\t432513\n津南新城\t432514\n第37章\t432515\n新英雄\t432516\nBD720P高清\t432517\n深港DJ俱乐部\t432518\n惠来\t432519\n磁子\t432520\n重归于好\t432521\n立定\t432522\n28片\t432523\nfoot\t432524\nI\t432525\n集肤效应\t432526\n武林三国\t432527\n登封市\t432528\nruc\t432529\n大城区\t432530\n黄连上清丸\t432531\n西丽地铁站\t432532\n安京\t432533\n心事\t432534\n形同\t432535\n脉冲式\t432536\nDiscover\t432537\n好铃网\t432538\n88块\t432539\nvtr\t432540\n2天内\t432541\n景德镇陶瓷学院\t432542\n第18轮\t432543\nC版\t432544\n久久建筑网\t432545\n20151108\t432546\n变化多端\t432547\nMOU\t432548\n湾湾\t432549\n道德底线\t432550\n广州动物园\t432551\n王者荣耀KPL\t432552\n头臂\t432553\nA7S2\t432554\ns60l\t432555\n百宝云\t432556\n中华报道网\t432557\nmiss吧\t432558\nZSQ\t432559\n水资源管理制度\t432560\n56网\t432561\nBands\t432562\n金曼\t432563\n摩登大道\t432564\nWiFi\t432565\n蔡捕头\t432566\n备餐\t432567\n板簧\t432568\n刘胡兰\t432569\n张三无限恐怖\t432570\n1包\t432571\n7800X\t432572\n洁衣\t432573\nOpe\t432574\n滚筒烘干机\t432575\n完整稿\t432576\n二第五\t432577\n吉利研究院\t432578\n800MHz\t432579\n重配\t432580\n山茱萸\t432581\n风格馆\t432582\nupton\t432583\n巴氏奶\t432584\n瓦尔特\t432585\n1056\t432586\n65535\t432587\n斗方\t432588\n1.3.0.8\t432589\n填色\t432590\n中宇资讯\t432591\nxxxy\t432592\n2-3分钟\t432593\n安大\t432594\n王传福\t432595\n2019年起\t432596\n20140107\t432597\n软饮料\t432598\n可燃气体探测器\t432599\n唯满侠\t432600\n经办\t432601\n草娥\t432602\n问政\t432603\nReviewers\t432604\n福民
新村\t432605\n食用冰\t432606\nShapes\t432607\n深府\t432608\n哈巴狗\t432609\n西妥昔单抗\t432610\n棋人\t432611\n黄貂鱼\t432612\n锁头\t432613\nGLP\t432614\n台球室\t432615\n60支\t432616\ndiv子元素\t432617\n柠\t432618\n达内\t432619\n拆\t432620\n张兴华\t432621\n代查\t432622\n优游\t432623\n黑警\t432624\n一无所有\t432625\n俄格战争\t432626\n羿射\t432627\nAJAX\t432628\ntired\t432629\nProteus\t432630\n拳皇98C\t432631\nFerries\t432632\n帝王将相\t432633\n阶段\t432634\nIP剧\t432635\n砂眼\t432636\n1664\t432637\n大连市妇产医院\t432638\n756\t432639\n小忆\t432640\n地下城与勇士dnf\t432641\n家政公司\t432642\n闪存盘\t432643\n铝型\t432644\n粤港澳大桥\t432645\nunordered\t432646\n杭州火车西站\t432647\n誉衡药业\t432648\n瑞气\t432649\n快穿之打脸狂魔\t432650\n文明6:迭起兴衰\t432651\n否则\t432652\n西安旅行社\t432653\nasmr\t432654\n李少春\t432655\n私房照\t432656\njxl.jar\t432657\n指压板\t432658\n卡马西平片\t432659\n犯罪小说\t432660\n图根\t432661\n牧羊曲\t432662\nudo\t432663\n超鬼\t432664\n04届\t432665\n吴良\t432666\n分手\t432667\n烤烟\t432668\n吓死\t432669\n狂练\t432670\n节字\t432671\n胸廓\t432672\n水磨沟\t432673\n中检集团\t432674\n喉科\t432675\nlikeju\t432676\n长住\t432677\n软件工程师\t432678\n彩影\t432679\n退却\t432680\n四日\t432681\n黑崎子\t432682\nmbp\t432683\n大东东东\t432684\n对应\t432685\n敦奴\t432686\n递件\t432687\n奇瑞小蚂蚁\t432688\n温度计\t432689\n2018年1月\t432690\nmv_热播韩剧网\t432691\n9英寸\t432692\n台铁\t432693\n12月5日\t432694\n爱情废柴\t432695\nCoupling\t432696\n2018年4月16日\t432697\n成金\t432698\n語音\t432699\n张家港市人民政府\t432700\n英雄气\t432701\n县委统战部\t432702\n演奏员\t432703\n注税\t432704\n业内人士\t432705\n张芸\t432706\n套期\t432707\n中美\t432708\n抛壳\t432709\n黎子明\t432710\n片批\t432711\n莞太路\t432712\n及参\t432713\n龙游\t432714\n猴岛\t432715\n大韩航空\t432716\n紫纹\t432717\n研培\t432718\n量贩式\t432719\nmfc120u.dll\t432720\n33卡\t432721\n场记板\t432722\n辽阳银行\t432723\n妇联\t432724\n刘远举\t432725\n嘉汇城\t432726\n升级\t432727\nBoutique\t432728\n纬一路\t432729\n白石日\t432730\n迪士尼公司\t432731\nparticular\t432732\nwor\t432733\n六卷\t432734\n八师\t432735\n罗山路\t432736\n城市圈\t432737\n日理万机\t432738\n剑网三引仙水榭\t432739\n旋木\t432740\n棚屋\t432741\n地球保卫战\t432742\n东北新闻网\t432743\n认成\t432744\ndnf帕拉丁吧\t432745\n臀\t432746\n中标网\t432747\n乘用\t432748\n这样的人\t432749\n中油好客e站\t432750\n新市长\t432751\n
Panty\t432752\nUSR\t432753\n榫卯\t432754\n奇葩男\t432755\nena\t432756\n妻子的情人\t432757\n商标权\t432758\nUITableView\t432759\n湖北省交通投资集团有限公司\t432760\nKcptun\t432761\nCelebs\t432762\n1平米\t432763\n陈小硕\t432764\n上海社会科学院\t432765\n直流电机\t432766\n孙玉\t432767\n恐惧魔王\t432768\n共事\t432769\n十二五期间\t432770\n372\t432771\nallcloud\t432772\n易捷\t432773\n道学\t432774\n东京湾区\t432775\nlol腐吧\t432776\n刘师培\t432777\n淘卡\t432778\n残霞\t432779\ncnki\t432780\nTOP3\t432781\n塘约\t432782\n弹琴吧\t432783\n吉林建筑大学\t432784\nheros\t432785\n先辈们\t432786\n无公害农产品\t432787\ninnerText\t432788\nSolidwork\t432789\n济南一中\t432790\n张天爱\t432791\n撸色\t432792\n百乐\t432793\n街区制\t432794\n王祺\t432795\n100万套\t432796\n印格\t432797\n微信公众平\t432798\n添越\t432799\n限制性股票\t432800\n计量学\t432801\n多半\t432802\n换列\t432803\n霍比特人3五军之战\t432804\n商贾\t432805\n弃妇再嫁\t432806\n后缀\t432807\ntwins\t432808\n杭州医学院\t432809\nyoroot\t432810\n无悔\t432811\n评价师\t432812\n质量保修期\t432813\n组化\t432814\n艾维诺\t432815\n颤栗\t432816\n酒店\t432817\n9.3%\t432818\n老谭\t432819\ngru\t432820\n脱粒机\t432821\n刘川\t432822\nescrow\t432823\n四川大学江安校区\t432824\ndevelopers\t432825\n鼎鑫\t432826\n颐和山庄\t432827\n五四青年节\t432828\n山西省妇幼保健院\t432829\n搜铺网\t432830\n翻盘\t432831\n66\t432832\n样方\t432833\ngta5吧_\t432834\n65w\t432835\n皮卷\t432836\n0420\t432837\n恶邻\t432838\n贡布\t432839\n翟凌\t432840\n测距\t432841\n100cm2\t432842\nBUILDING\t432843\n裤子男\t432844\nSuperVan\t432845\n文洁\t432846\n三国战纪群雄逐鹿\t432847\napplewatch3\t432848\n开放源代码\t432849\n长风商务区\t432850\n2018年农历三月\t432851\n翻标\t432852\n蛇颈龙\t432853\nghf\t432854\n购买者\t432855\n此书\t432856\n渐现\t432857\n贵都\t432858\n急事\t432859\n拨动\t432860\n天宝\t432861\n孟佳\t432862\n应付账款周转率\t432863\n公证人\t432864\n近2年\t432865\nプ\t432866\nGLA200\t432867\n张志东\t432868\n车床车\t432869\n青山公园\t432870\n云岭先锋网\t432871\n天津商业大学\t432872\n弯举\t432873\n九香虫\t432874\nCFLD\t432875\n抑制率\t432876\nmpu\t432877\n浪荡\t432878\nBD-MKV\t432879\n牌坊\t432880\n8550U\t432881\n6瓶\t432882\n爱帮网\t432883\n十厘米\t432884\n山西卫视\t432885\n5858.com\t432886\n富泰\t432887\n灵物\t432888\n黔东南苗族侗族自治州\t432889\n李老鼠\t432890\n孙小头\t432891\n三级缓存\t432892\n永乐大帝\t432893\n马坊镇\t432894\n巴楚县
\t432895\n哀绿\t432896\n珠江东路\t432897\n南京旅游\t432898\nDGM\t432899\n刘老根大舞台\t432900\n杨邱自\t432901\n石质\t432902\n塑身裤\t432903\n电表箱\t432904\n防寒服\t432905\n8.5.5\t432906\n第7回\t432907\n捐躯\t432908\n魔浪\t432909\n旗舞\t432910\n北仑中学\t432911\n裂界\t432912\n华北大区\t432913\n书童\t432914\n弗里德曼\t432915\n艾芙琳\t432916\nSuccessful\t432917\n教津贴\t432918\n西安电子科技大学电子工程学院\t432919\n小民\t432920\n亚利桑那州立大学\t432921\n雪菊\t432922\n走音\t432923\n桑晨\t432924\n可贵的沉默\t432925\n剖分\t432926\n28小时\t432927\n借贷\t432928\n汇款\t432929\n三胎政策\t432930\n铿锵玫瑰\t432931\n处里\t432932\n大观公园\t432933\n沙河高教园\t432934\n国投电力\t432935\n显失\t432936\n6600元\t432937\n后土\t432938\n双流区\t432939\n艳光\t432940\n第五年\t432941\n转注\t432942\n入目\t432943\n桃屋猫三国梦想全集\t432944\n使命召唤6现代战争2\t432945\nceshi\t432946\n野兰花\t432947\n九龙口\t432948\n半糖主义\t432949\n照见\t432950\n24只\t432951\n港城\t432952\n澳大利亚站\t432953\n排气筒\t432954\nP21R10\t432955\nAptamil\t432956\n超级草莓音乐节\t432957\n创杯\t432958\n防火型\t432959\nsuggested\t432960\nBuenos\t432961\n二所\t432962\nqthread\t432963\n恰恰特快车\t432964\n美丽生态\t432965\n严彬\t432966\nLola\t432967\nrsa算法\t432968\n卡布\t432969\n33岁\t432970\n大中电器\t432971\n淮安市\t432972\n超广\t432973\n三苗\t432974\niit\t432975\n姿料\t432976\n凤临天下\t432977\nFloEFD\t432978\n浦科特\t432979\n塞露\t432980\nheadline\t432981\n论文类\t432982\n三星S7|三星S7\t432983\n歌唱类\t432984\n育才简笔画\t432985\n王楚\t432986\n泳圈\t432987\nIdiom\t432988\n蟠龙\t432989\n农电\t432990\n柔度\t432991\n镯\t432992\n代交\t432993\nJoda\t432994\n钢板网\t432995\n山屿海\t432996\nNicat\t432997\n空额\t432998\nsita\t432999\n骑士\t433000\ntypeof\t433001\nE奶\t433002\n宏山\t433003\n梁秋燕\t433004\n长缆科技\t433005\nid服务器\t433006\n精彩绝伦\t433007\n网垫\t433008\n雅漾Avene\t433009\n分度盘\t433010\n安全工器具\t433011\n兰州交大\t433012\n希沃\t433013\n蛋白组学\t433014\n惹尘埃\t433015\n斯帕拉捷\t433016\n一方水\t433017\n洪道\t433018\n甜白葡萄酒\t433019\n通信录\t433020\nsmall\t433021\nbegun\t433022\n黄桃\t433023\nCoupled\t433024\n韦尔奇\t433025\nf4v\t433026\n印泥盒\t433027\n不买单\t433028\nzhzhang\t433029\n名城新闻网\t433030\n土情\t433031\n中国工商银行北京分行\t433032\nInvocation\t433033\n莱仕达\t433034\n孟京辉\t433035\n四大名捕\t433036\n我的世界minecraft\t433037\n陈路\t433038\nV5.2.1
\t433039\n丨壹\t433040\n多盟\t433041\n取父\t433042\n要色\t433043\n花语女生网\t433044\n追赶者\t433045\nEquipments\t433046\n碧蓝航线红染\t433047\n微信商城\t433048\n观察物体\t433049\nSwift_脚本之家\t433050\n樱火轮舞\t433051\n羊人\t433052\n畅行天下\t433053\n弗拉明戈\t433054\n没心没肺\t433055\nfileupload\t433056\nbank\t433057\n宇文赟\t433058\n锦囊妙计\t433059\n吓一跳\t433060\n金华火车站\t433061\n八戒中文网\t433062\n最低档\t433063\n出所\t433064\n误码率\t433065\n呼延云\t433066\n劳动节_无忧考网\t433067\n一篇400\t433068\n烘焙咖啡豆\t433069\n个别\t433070\n布鲁托\t433071\n五面\t433072\ncombustion\t433073\nCSOL\t433074\n国际金属加工网\t433075\n取现\t433076\n翔龙\t433077\n小白云\t433078\n康凯\t433079\n癸丑\t433080\n肾虚型\t433081\n标引\t433082\n京都清水寺\t433083\n重庆大金\t433084\nfactorization\t433085\n_亿万\t433086\n周大福金融中心\t433087\n吸血\t433088\n2.4.13.6\t433089\n彩美\t433090\n中石化大厦\t433091\n阴暗\t433092\n北京三里屯\t433093\n粘模\t433094\n风华\t433095\n过曝\t433096\n53mm\t433097\n聊斋艳谭之幽媾\t433098\n潘朵拉\t433099\n兴隆街\t433100\n省委组织部\t433101\n雨师\t433102\n1765\t433103\n聚酯漆\t433104\nROST\t433105\n快易捷\t433106\n百无\t433107\n匡正\t433108\n勿施于人\t433109\n灵异第六感\t433110\n通江县人民政府\t433111\n2017.1.2\t433112\nfitted\t433113\n酵\t433114\n琴子\t433115\nTSB\t433116\n42套\t433117\n04月11日\t433118\n乖离性\t433119\n玄神\t433120\n狸米\t433121\n望城县\t433122\n基准面\t433123\n温酒斩华雄\t433124\n1290\t433125\n紫薇大帝\t433126\n伴我行\t433127\n同班\t433128\n河长巡河\t433129\nacronym\t433130\n花炮\t433131\n优质机\t433132\n桩机\t433133\n果岭\t433134\n全民斗战神\t433135\n百度地图api\t433136\nsad\t433137\n唱游\t433138\n红石\t433139\nAFN\t433140\n梯井\t433141\n永靖县\t433142\nlogs\t433143\n磁器口古镇\t433144\n漠少君\t433145\n广中西路\t433146\n李大鹏\t433147\njpmsg\t433148\n袄\t433149\n大众CC论坛\t433150\n撞沉\t433151\n药敏\t433152\n彩妆盘\t433153\n旅泊网\t433154\n空落落\t433155\n中华人民共和国物权法释义\t433156\nCosplay\t433157\nYunGaZeon\t433158\n约克\t433159\n子长县\t433160\n行车道\t433161\n10000亿\t433162\n衣联\t433163\n5699\t433164\n分选\t433165\n王昱珩\t433166\n陆文龙\t433167\n电磁阀\t433168\nc0930\t433169\n优洁士\t433170\n祝瑞莲\t433171\n金黔\t433172\n双千亿\t433173\n航行者\t433174\n祥盛\t433175\n风水轮\t433176\nLong\t433177\n生死缘\t433178\nextention\t433179\n春词\t433180\n普世价值\t433181\n软组织损伤\t433182\n板级\t
433183\n老虎板\t433184\nmwan3\t433185\n事与愿违\t433186\n271号\t433187\n沾污\t433188\n浪漫史\t433189\nBgRemover\t433190\n答答\t433191\nIEG\t433192\n平衡德\t433193\n炸金花\t433194\n冷冽\t433195\n多部\t433196\n习进平\t433197\n吊带裤\t433198\n六型\t433199\nstyle-loader\t433200\n金徽酒\t433201\n3721\t433202\n数学单\t433203\n山洪\t433204\n杭州房管局\t433205\n菜牙\t433206\n6层\t433207\n企业所得税汇算清缴纳税\t433208\n广东分公司\t433209\nEGE\t433210\n希思罗\t433211\n李李翔\t433212\n前5世纪\t433213\n重庆人事人才培训网\t433214\nGrandpa\t433215\ncvr\t433216\ntripadvisor\t433217\n黔东南民族职业技术学院\t433218\n米诺斯\t433219\n光机\t433220\n士人\t433221\n绿化管\t433222\nMisha\t433223\nClarke\t433224\n运营者\t433225\n拉杆\t433226\n来我家\t433227\n图页\t433228\n凯威\t433229\n5K屏\t433230\n寝乱义母\t433231\n邵阳新闻网\t433232\n闲情\t433233\n大唐无双零\t433234\n不该\t433235\n美意\t433236\n一体轮\t433237\n宣传品\t433238\n高老师\t433239\nCHANG\t433240\n国家中医药局\t433241\nGoJS\t433242\n哗变\t433243\n东成西就\t433244\n帕萨\t433245\n毛卫宁\t433246\n智地\t433247\n施瓦布\t433248\n变节者\t433249\n生龙\t433250\n120万\t433251\nGODADDY\t433252\n苏高新\t433253\n谢波\t433254\n荷兰人\t433255\n000519\t433256\n严于律己\t433257\n凤冈县\t433258\n大天使之剑H5_九游论坛\t433259\n麦浪\t433260\ncdc\t433261\n滑移率\t433262\nbilibili直播吧\t433263\n烤鸭\t433264\npH计\t433265\n教学法\t433266\n航海类\t433267\n英雄少年\t433268\n常姓\t433269\n浮恋\t433270\n南昌政府网\t433271\n三芯电缆\t433272\n上海地铁6号线\t433273\n嘻哈风\t433274\n南旗\t433275\n金世正\t433276\n选拔赛\t433277\nWIXOSS\t433278\n宁波市水利局\t433279\nwholesale\t433280\n老同\t433281\n园\t433282\nnmm\t433283\ns230u\t433284\n慕容垂\t433285\negit\t433286\n谷歌商店\t433287\n歌手赛\t433288\n一视\t433289\n10000分\t433290\n削去\t433291\n动人\t433292\n余干之窗\t433293\nQQ空间\t433294\n整理袋\t433295\n兰德酷路泽4000\t433296\n邱健\t433297\nbilibili_哔哩\t433298\nCDD\t433299\noversea\t433300\nivi\t433301\n18点\t433302\n哲学博士\t433303\n闺杀\t433304\n林帆\t433305\npnd\t433306\nMIDO\t433307\n拜佛\t433308\n别让我走\t433309\nicao\t433310\n什么片\t433311\nJsoup\t433312\n大变\t433313\n李昊\t433314\nclaireyuancy\t433315\n莽\t433316\n新大学\t433317\n宝沃汽车\t433318\n报酬\t433319\n线程管理\t433320\n刘汉\t433321\n2499\t433322\n谷岳\t433323\n甲泼尼龙片\t433324\n宴饮\t433325\n第27次\t433326\n甑子丹\t43
3327\n马克思主义理论\t433328\n3014\t433329\nG60科创走廊\t433330\n复合木地板\t433331\nMVW\t433332\n选路\t433333\n鞭长莫及\t433334\n微喜帖\t433335\n厨语\t433336\nsecondary\t433337\nScratch\t433338\n中国油画院\t433339\n庞巴迪\t433340\n板上钉钉\t433341\n袁凤瑛\t433342\n符箓\t433343\n雨楼\t433344\n伍代夏子\t433345\n长亮\t433346\n香港树仁大学\t433347\n1251\t433348\nDescription\t433349\n可点\t433350\n全面战争:战锤2\t433351\n用型\t433352\n奇瑞QQ\t433353\n盛天\t433354\n肯塔基大学\t433355\n大器晚成\t433356\n掉泪\t433357\n秀米XIUMI\t433358\nAJ\t433359\n黑井\t433360\n县气象局\t433361\nenormous\t433362\nperspectives\t433363\n托管\t433364\n留学城市站\t433365\n哥德尔\t433366\n渣滓洞\t433367\n莘松中学\t433368\n枸杞粥\t433369\nbupa\t433370\n甲乙酮\t433371\n水火\t433372\n台湾清华大学\t433373\n漫威DC\t433374\n怀仁县\t433375\n番茄小说网\t433376\n周边游\t433377\n李梓熙\t433378\n禁火\t433379\n申长雨\t433380\n安适\t433381\n不透\t433382\n坐下\t433383\n棒球手\t433384\nOBD2\t433385\n临月\t433386\n哺乳类\t433387\n文宫\t433388\n教师节\t433389\nDwarf\t433390\n伊犁哈萨克自治州\t433391\n逍林镇\t433392\n众喜剧\t433393\nSPOT\t433394\n大顺\t433395\nSystrace\t433396\n基围虾\t433397\n南风股份\t433398\n占星师\t433399\n横扇\t433400\n9月17日\t433401\n龙湖锦艺城\t433402\n绍\t433403\n蹴鞠\t433404\n国家工业信息安全发展研究中心\t433405\n播报员\t433406\n盘古天地\t433407\nNuendo\t433408\n新型农村合作医疗制度\t433409\nsscom\t433410\n授客\t433411\n死鸡\t433412\n拼凑\t433413\n7-11\t433414\n阵曲\t433415\n北京市法人一证通\t433416\n辞工\t433417\narm板\t433418\ncrest\t433419\n衿\t433420\n全平\t433421\ncbwcwy\t433422\n中国天楹股份有限公司\t433423\n刘洪斌\t433424\nautohotkey\t433425\n系统产业\t433426\n爱冒险的朵拉\t433427\narray_rand\t433428\n二千万\t433429\n送别\t433430\n们\t433431\n东昌府新闻网\t433432\n单身公主相亲记\t433433\n华帝热水器\t433434\n乡野欲潮:绝色村嫂的泛滥春情\t433435\nqyw\t433436\n胡椒面\t433437\nkendoGrid\t433438\n十八家\t433439\n降息\t433440\nretire\t433441\n江苏省物价局\t433442\n聚积\t433443\n三苏\t433444\n狂妄\t433445\noverwritten\t433446\n椎名空\t433447\n误差率\t433448\n因时\t433449\n冷调\t433450\n陈美嘉\t433451\nFudan\t433452\n定点化\t433453\n数周\t433454\n测量尺\t433455\n大庙镇\t433456\n压刨机\t433457\n玉凤路\t433458\n上涨率\t433459\ntodate\t433460\n复播\t433461\nsi001\t433462\n生活大爆炸吧\t433463\n五歌\t433464\n乱穿\t433465\n东方早报\t433466\n照镜子\t433467\n大味\t433468\n
盛世华庭\t433469\n爱的箴言\t433470\nfuxi\t433471\n小闪\t433472\nIllumination\t433473\n修文婷\t433474\n点击处\t433475\n答辩\t433476\nsmi\t433477\n第二次世界大战\t433478\n5636联盟\t433479\nWim\t433480\n跨时代\t433481\n加层\t433482\n十分之七\t433483\n汀江\t433484\n货源\t433485\n地狱之刃\t433486\n欧弟\t433487\n江淑娜\t433488\n打道\t433489\n秀大\t433490\nelectra\t433491\n降回\t433492\n汇编吧\t433493\n王景武\t433494\n水布\t433495\n通济\t433496\n好球\t433497\nSunday\t433498\n农门\t433499\n795集\t433500\nM500\t433501\n早教幼\t433502\n音乐界\t433503\n11.01\t433504\n绘画师\t433505\n质器\t433506\n2348\t433507\n迈博\t433508\n三湘风纪\t433509\n硫酸铵\t433510\n断货王\t433511\n泥膜\t433512\nnaughty\t433513\n博威\t433514\n朗科智能\t433515\n下弹\t433516\n黎屋\t433517\n综艺片\t433518\n155卡\t433519\n十八子\t433520\n金膜\t433521\n同父异母\t433522\nprimavera\t433523\n圣诞奇妙公司\t433524\n警方\t433525\n700克\t433526\nzoning\t433527\nF2.0\t433528\n禹洲地产\t433529\ncmb\t433530\n长卷\t433531\n异星工厂吧\t433532\nXIAOMI\t433533\n尤伯杯\t433534\n藏书\t433535\n凶宅\t433536\n鸱吻\t433537\n星兽\t433538\n阿母\t433539\n合政\t433540\nSWISSE\t433541\npostgreSQL\t433542\n碧玉簪\t433543\n冠城\t433544\n口水战\t433545\n999个\t433546\n铝压\t433547\n张家港)有限公司\t433548\n客友\t433549\n清江画廊\t433550\nSpire\t433551\n锦鲤\t433552\n香蜜湖街道\t433553\ninitial\t433554\nV201\t433555\n乔克\t433556\nAXE\t433557\nBDC\t433558\n_慧\t433559\n桑黄\t433560\n黄山在线\t433561\n异想\t433562\n心想\t433563\n乌苏里\t433564\n几平方米\t433565\n医武\t433566\n7万多\t433567\n妆面\t433568\n长安CS35论坛_汽车之家论坛\t433569\n肺积水\t433570\n10000次\t433571\nhailong\t433572\n东方盛世花园\t433573\n海澡\t433574\ndefaultdict\t433575\n情感性\t433576\nlibao\t433577\n坐以待毙\t433578\nreceived\t433579\nwndows\t433580\n2辑\t433581\n智度\t433582\nExcel合并单元格\t433583\nwindow2008\t433584\n偶滴歌神\t433585\n千苒君\t433586\n嵩山南路\t433587\n淘房家居网\t433588\n897\t433589\nGrove\t433590\nencodeuri\t433591\n第一月\t433592\n中值滤波\t433593\n楚梦瑶\t433594\n盐酸特比萘芬凝胶\t433595\n网页游戏大全_一游网\t433596\n二锅头\t433597\n雪花片\t433598\n下拉时\t433599\n显性\t433600\n女教师日记\t433601\n元首\t433602\n平凡岁月\t433603\n2017年10月20日\t433604\n鄂人社\t433605\n一大把\t433606\nDerived\t433607\n中新锦绣天地\t433608\nHunters\t433609\n香蕉综合伊人网\t433610\nWin
10纯净版\t433611\n宋涛\t433612\n小昭\t433613\nMoody\t433614\n妃月\t433615\n车智汇\t433616\nKINGS\t433617\nProblems\t433618\n痛骂\t433619\nZHU\t433620\n水权\t433621\n地企\t433622\nf429\t433623\nFortinet\t433624\n珍岛\t433625\nOM\t433626\n庐山市\t433627\n千味\t433628\n英太青\t433629\nconjunction\t433630\n金华浦江\t433631\n中一枪\t433632\n寂静岭4\t433633\nmsfvenom\t433634\n时间窗\t433635\n20160702\t433636\n前侧\t433637\n崖城镇\t433638\n空拍\t433639\n突厥语\t433640\n考电\t433641\n500a\t433642\n吧皮卡丘\t433643\n6m5m\t433644\n出让\t433645\ns版\t433646\n阿良\t433647\nhuawei\t433648\n对比照\t433649\n一笑\t433650\n粉雄救兵\t433651\n超级猩猩\t433652\nzhuanjia\t433653\n立柱式\t433654\n清运\t433655\n同心同德\t433656\n七路\t433657\n台山政府网\t433658\n陈慧琳\t433659\nlayaair\t433660\n陕煤集团\t433661\n领物\t433662\noptical\t433663\n472\t433664\n国资\t433665\n天津地铁2号线\t433666\n孔雀草\t433667\n中国兰花交易网\t433668\nSydney\t433669\n吹口哨\t433670\n小书匠\t433671\n苏北医院\t433672\nyalmip\t433673\n【真\t433674\n疯狂坦克\t433675\n请问\t433676\n第七关\t433677\n税讯\t433678\n新体\t433679\n约基奇\t433680\n莱切斯特城\t433681\n富桥\t433682\napache.poi\t433683\nogo\t433684\n陈永贵\t433685\n慈禧太后\t433686\n清盘\t433687\nComa\t433688\n红枫树\t433689\n雅淇\t433690\n7个半月\t433691\nbzoj\t433692\n张鹏阳\t433693\n耽氏\t433694\n法制时报数字报\t433695\nprius\t433696\n私锁\t433697\n60#\t433698\n一司\t433699\n上海美国领事馆\t433700\n唾手可得\t433701\n幂等\t433702\nzhifa\t433703\nC-Lodop\t433704\n邓发\t433705\n皇茶\t433706\nmaster\t433707\n六十甲子纳音\t433708\nSeer\t433709\n光纤打标机\t433710\nPLL\t433711\n李德林\t433712\n捕蟹\t433713\n团委\t433714\n霞涌\t433715\n三景\t433716\n晃晃悠悠\t433717\n德纳\t433718\n天津装修公司\t433719\n芝华士\t433720\n601668\t433721\n春风行动\t433722\n高层楼\t433723\n5.1声道\t433724\n俸禄\t433725\n议标\t433726\n滨海花园\t433727\n博湖\t433728\n闭关\t433729\nHitow\t433730\nVERILOG\t433731\n晋江市人民政府\t433732\n格兰\t433733\nComputing\t433734\n萨克\t433735\n德国银行\t433736\n采样率\t433737\n亚连\t433738\n顺字\t433739\nlian\t433740\ng3930\t433741\n捷豹XEL\t433742\n自连\t433743\nKIT\t433744\n战舰少女\t433745\n离堆公园\t433746\n痹\t433747\n雨淋\t433748\n儿子\t433749\n版张\t433750\n环境影响评价师\t433751\nAmazon\t433752\n少年期\t433753\n少年们\t433754\n芯机\t433755\n洗脑\t43375
6\n千兆\t433757\n決定\t433758\n中国经营网\t433759\n73小时\t433760\n养血\t433761\n灵棚\t433762\nALM\t433763\n现任职\t433764\n60版\t433765\n国信证券金太阳\t433766\n帚神\t433767\n04729\t433768\n广州地铁21号线\t433769\nsnapchat\t433770\n一个16岁\t433771\n玩忽职守\t433772\n工业机器人\t433773\n华为麦芒4\t433774\n产能过剩行业\t433775\n敛\t433776\n拇外翻\t433777\n高嶋\t433778\n网络浏览器\t433779\n大拜年\t433780\n天下美食\t433781\n免流\t433782\n料石\t433783\n喝一口\t433784\nbuds\t433785\n案板\t433786\n啪啪圈\t433787\n悦翔\t433788\n网区别\t433789\n箭牌瓷砖\t433790\nTP-Link无线路由器\t433791\n电风\t433792\n中华读书报\t433793\n公证书\t433794\n第10册\t433795\n銘\t433796\n沙滨路\t433797\n明年元旦\t433798\n哼哈\t433799\n不期而至\t433800\n难忘的人\t433801\n太傻\t433802\n摆线减速机\t433803\n中国汝州市政府\t433804\n子健儿\t433805\n30万级\t433806\n一夜成名\t433807\n沪教\t433808\ntweet\t433809\nSDCMS\t433810\n平时\t433811\n12bet\t433812\n灯头\t433813\nax+1\t433814\n篮桥\t433815\n如见\t433816\n极光之恋\t433817\n金鸡山公园\t433818\nPoo\t433819\n传教士\t433820\n许强\t433821\n香格里拉\t433822\n上海壹侨国际贸易有限公司\t433823\n光华大道\t433824\n钱皓\t433825\n翌日\t433826\n耳鼻咽喉科\t433827\n自保温\t433828\n若干只\t433829\ngta5ceo\t433830\n桑椹\t433831\n7820x\t433832\n附城镇\t433833\n骋望\t433834\n1.90\t433835\n黄芪片\t433836\n北极星储能网\t433837\n梅娃\t433838\nJavbus\t433839\n风窗\t433840\n低热\t433841\n太极宗师之太极门\t433842\n安居区\t433843\n艺校\t433844\n5700元\t433845\n1串\t433846\n民爆\t433847\nVisio2007\t433848\n油红\t433849\n滤波器\t433850\n李飞飞\t433851\nAffair\t433852\n太仓市区\t433853\n服务台\t433854\n粘稠\t433855\nSquare\t433856\ntl-wr842n\t433857\n沙滩裤\t433858\n鬼束\t433859\nAutowired\t433860\n郭羡妮\t433861\n新华娱乐\t433862\n铮铮铁骨\t433863\n大学英语六级考试\t433864\nCle\t433865\nhepatology\t433866\n广东省人民政府\t433867\n出场率\t433868\n正道集团\t433869\nqq幻想吧\t433870\n微辣\t433871\n黄悦\t433872\n大连市中心医院\t433873\nistream\t433874\nxtube\t433875\n全活\t433876\n淋雨一直走\t433877\n凤城市\t433878\nnvenc\t433879\n李子园\t433880\n横山美玲\t433881\n悪戯\t433882\n品学\t433883\n搜狗搜索\t433884\n理工附中\t433885\n红嘴鸥\t433886\n马人\t433887\n太阳时\t433888\nFrequently\t433889\n曾鹏\t433890\n嘉博\t433891\n小俊\t433892\n国际医院\t433893\n刑犯\t433894\n刚新\t433895\n梅吉\t433896\n谢字\t433897\n宁乡一中\t433898\n咬合痛\t433899\n台区\t433900\n温特沃
斯·米勒\t433901\n5626\t433902\n中国会计师事务所\t433903\n三重奏\t433904\n废钞\t433905\n串珠\t433906\n拍照\t433907\n云南省纪委省监委\t433908\n马涛\t433909\n燕京理工学院\t433910\n德州经济开发区\t433911\n乌头碱\t433912\n对外经济贸易大学研究生院\t433913\n好难过\t433914\n白蛾\t433915\n双子星\t433916\n金魂\t433917\n8083\t433918\n按实\t433919\ntwice\t433920\nYum源\t433921\n卫生人才网\t433922\n圣商\t433923\n生物质\t433924\n食物百科\t433925\n塔夫茨大学\t433926\nHughes\t433927\n覆水难收\t433928\nAVI格式\t433929\nb85-pro\t433930\n刘迎\t433931\n秋风\t433932\nDenny\t433933\n爨底下\t433934\nMt\t433935\n湖南文艺出版社\t433936\n控图\t433937\n省人民政府\t433938\nclaimed\t433939\n河北华夏幸福\t433940\n欢呼声\t433941\n爸爸去哪儿第二季\t433942\n验钞\t433943\n1.2.2\t433944\n4.8.0\t433945\n电沉积\t433946\n12亿美元\t433947\n月霜\t433948\nvoodoo\t433949\n英朗\t433950\n92平\t433951\nyxlm\t433952\n印心\t433953\nPVE\t433954\n酶片\t433955\n苏锦\t433956\n佛山科学技术学院\t433957\ncitrate\t433958\n面霜\t433959\n佛山市行政服务中心\t433960\n2083\t433961\n税收征管法\t433962\n鬣\t433963\nwin服务器\t433964\n荷花烟\t433965\n2018年4月2日\t433966\n夏黑\t433967\n发光字\t433968\negd\t433969\n感情生活\t433970\n玄饼\t433971\n弱国\t433972\n万万岁\t433973\n针叶林\t433974\nsecretary\t433975\n起义\t433976\n雪印\t433977\ne^2x\t433978\nautomata\t433979\n甘特图_百度\t433980\n看海\t433981\n图久网\t433982\n大疆精灵3se\t433983\nTraktor\t433984\n0.86\t433985\nhubei\t433986\n开普\t433987\nXX银行\t433988\n52shici\t433989\nuWS\t433990\n国际人才网\t433991\n外生性\t433992\n二尖瓣脱垂\t433993\n奇怪的美发沙龙\t433994\n森林康\t433995\n半自助\t433996\n0.5MM\t433997\nADAM\t433998\nCRYSTAL\t433999\n开扇\t434000\nfanuc\t434001\n_公积金贷款\t434002\n眼线膏\t434003\n黄洁洵\t434004\n2018年4月5日\t434005\n老蛙\t434006\n疯狂烤翅\t434007\nNeed\t434008\n蔡东藩\t434009\n802.3\t434010\n徐州市环境保护局\t434011\n公民凯恩\t434012\n魔气\t434013\n傅海峰\t434014\n杨洪基\t434015\n张琪格\t434016\nhenkel\t434017\nMX150\t434018\n广东体育\t434019\n银装\t434020\nSQL语言\t434021\n大麦屿\t434022\nmac2017\t434023\n亿通\t434024\n一览_游侠网\t434025\n湘剧\t434026\n秦朗\t434027\n在线人\t434028\n真三国无双5\t434029\n2016年4月29日\t434030\ndug\t434031\n天下展商\t434032\nBSB\t434033\nENIX\t434034\n白岩松\t434035\n细碎\t434036\n曹慧\t434037\n达标率\t434038\n韶关东\t434039\n华特\t434040\n500ETF\t434041\n买方\
t434042\n不老村\t434043\nprogrammable\t434044\n七档\t434045\n白鹤滩\t434046\n黑胶唱机\t434047\n久悬户\t434048\n事务员\t434049\n浙医一院\t434050\n热浪\t434051\n天府路小学\t434052\n善堂\t434053\n猪脚\t434054\n恒厚\t434055\n阿Sa\t434056\n春江水\t434057\n剑网三新\t434058\n固特异御\t434059\n临洮县\t434060\n武曲\t434061\n亚历山大王\t434062\n乐呵\t434063\n华宇广场\t434064\nHeydouga\t434065\n8u162\t434066\nE960\t434067\n群晖synology\t434068\n党代表任期制\t434069\n广州市城市管理委员会\t434070\n功成名就\t434071\n金融服务外包有限公司\t434072\n古典音乐\t434073\n二手设备\t434074\nauld\t434075\nlayerui\t434076\nSant\t434077\nIEEE754\t434078\n梁芳\t434079\nmirrorlink\t434080\n精神领袖\t434081\n胸肌\t434082\n阿卡贝拉\t434083\nFUJITSU\t434084\njeju\t434085\n斯诺克中国公开赛\t434086\n欧赛斯\t434087\n西郊\t434088\npuppies\t434089\n癸巳年\t434090\n金叶女贞\t434091\n夏雨\t434092\n四川省\t434093\nasia\t434094\n管理师\t434095\n3第二\t434096\n无所适从\t434097\nP+F\t434098\n中国江西旅游网\t434099\n机电公司\t434100\n引注\t434101\n精臣\t434102\nConsultancy\t434103\n香风智\t434104\n厥阴\t434105\n地段\t434106\n天虚\t434107\n李永三\t434108\n纽康特\t434109\n靳远\t434110\n锄奸\t434111\nNIP\t434112\n甲型\t434113\nrenowned\t434114\nXenu\t434115\n康乃尔\t434116\necharts4\t434117\n黄山风景区\t434118\nNICO\t434119\n周浩晖\t434120\n1.3.11\t434121\n招人\t434122\n枫桥路\t434123\n固阳县\t434124\n三星W2013\t434125\n5028\t434126\n厦门中学\t434127\n纯玩\t434128\n数十种\t434129\n西安市公安局交通管理局\t434130\n43626网\t434131\nConflicts\t434132\nwww.cn010w.com\t434133\n20160109\t434134\n国债指数\t434135\n网络药理学\t434136\n联想天逸310\t434137\n直男癌\t434138\n庐州\t434139\nstudo\t434140\n甘谷县\t434141\nf372\t434142\n邹自景\t434143\n2012年12月\t434144\nBaseDao\t434145\nwin32diskimager\t434146\n河内机场\t434147\n公检法\t434148\n41页\t434149\n铠装\t434150\n清场\t434151\n吴颖\t434152\nHRB335\t434153\nuuu\t434154\n婆婆\t434155\n万家花城\t434156\n原随云\t434157\n索亚\t434158\n2018-01-01\t434159\n针管笔\t434160\nnuaa\t434161\n拉菲尔\t434162\n9.5代\t434163\n穿墙王\t434164\nai\t434165\nossfs\t434166\n50年后\t434167\n机械能守恒定律\t434168\n苏州法院\t434169\n林小薇\t434170\n奇幻自卫队\t434171\n便\t434172\naffairs\t434173\n司书\t434174\n罗泾\t434175\nrootca\t434176\niphone8\t434177\n销售量\t434178\nffserver\t434179\n秦晖\t434180\n劳
动保险\t434181\n几缕\t434182\nced\t434183\n小米Mix2\t434184\n辉南\t434185\n碧桂园地产\t434186\n011期\t434187\n3dmax吧_\t434188\n中国公安部\t434189\n小锦\t434190\n甘政\t434191\nIpsec\t434192\n老杳\t434193\nrbq\t434194\n滨文路\t434195\ntherebe\t434196\n引风机\t434197\n中华财会网\t434198\narsc\t434199\n关联_\t434200\nzjb\t434201\n旧约圣经\t434202\n中华人民共和国工会法\t434203\n奶罐\t434204\n16篇\t434205\n生日月\t434206\n破腹产\t434207\ncountryside\t434208\n光动力疗法\t434209\n无为而治\t434210\n西瓜霜润喉片\t434211\n平山县\t434212\n电信猫\t434213\n合单\t434214\n罗斯蒙特\t434215\n标哥\t434216\n15分米\t434217\n6【\t434218\n走一步\t434219\n稀硫酸反应\t434220\n哈师大附中\t434221\n吃着\t434222\n王老\t434223\n1385\t434224\n生土\t434225\nAlhambra\t434226\nflg\t434227\n你一个人\t434228\n清远站\t434229\n虐人\t434230\n十八味\t434231\nmiui8\t434232\n电控柜\t434233\n永定门长途汽车站\t434234\n产研\t434235\n周处\t434236\n湖北省教育厅\t434237\n32首\t434238\ndatav\t434239\n减肥药\t434240\n藩\t434241\n西女\t434242\nsucces\t434243\n乌衣巷\t434244\n征联\t434245\n短租别墅\t434246\n21世纪人才网\t434247\n禅者\t434248\n有问\t434249\nBanquet\t434250\ncml\t434251\n管区\t434252\n闭门羹\t434253\n铁卫\t434254\n浪鲸卫浴\t434255\n恒捷\t434256\nWIDE\t434257\n澳门半岛\t434258\n纵线\t434259\n西太湖\t434260\n盟军\t434261\n翼风游戏网\t434262\n阿联酋航空\t434263\n九中\t434264\n阿里通网络电话\t434265\n变址\t434266\npentagon\t434267\n创意类\t434268\n75周岁\t434269\n令人发指\t434270\nLineage\t434271\n六安经济技术开发区\t434272\n作坊式\t434273\n来车往\t434274\n腿法\t434275\n环套\t434276\nlpl吧_\t434277\n自建房\t434278\nmirro\t434279\nrouters\t434280\n空降师\t434281\n三胞集团\t434282\ncomputation\t434283\n德尔菲法\t434284\n长沙银行\t434285\nprich\t434286\nWoWS\t434287\ngis\t434288\n下沙\t434289\n电铁\t434290\n相近\t434291\n畅玩4\t434292\n德古拉元年\t434293\n3660\t434294\n邮\t434295\n摩阻\t434296\n巴人\t434297\n掉床\t434298\n郭东\t434299\n智选\t434300\nRoofing\t434301\n运动学\t434302\nHonor\t434303\n桐谷美玲\t434304\n20171231\t434305\n小博美\t434306\n社区工作者考试题库\t434307\n超募资金\t434308\n修正版\t434309\ngrad\t434310\nm6500\t434311\njavaagent\t434312\n2014.10\t434313\n中粮地产\t434314\n韩德君\t434315\n温州西\t434316\n果纳芬\t434317\nfang\t434318\n满心\t434319\n新活\t434320\n翠柏路\t434321\nDogecoin\t434322\n7.5米\t434323\n岁寒三友\t4343
24\n世族\t434325\n遮光\t434326\n玫瑰金\t434327\n龙门镖局2\t434328\n刀旗\t434329\n福建省民政厅\t434330\n广九直通车\t434331\n海滨花园\t434332\n毁于\t434333\n铁元\t434334\n天价虾\t434335\n友盟统计\t434336\n2467\t434337\n雷克萨斯570\t434338\nvolist\t434339\n翻译官\t434340\nSE2\t434341\n74分\t434342\n第3张\t434343\n扫盲篇\t434344\n库存量\t434345\n吴晶\t434346\n视神经萎缩\t434347\n5m\t434348\n成瘾性\t434349\nAnge\t434350\nBHM\t434351\n400位\t434352\n渐隐\t434353\n499元\t434354\n万达集团\t434355\nDuplicate\t434356\n银川大学\t434357\n灸法\t434358\n10下\t434359\n指挥类\t434360\n经济\t434361\n2连\t434362\n魔符\t434363\n环太平洋\t434364\nshred\t434365\n别克GL6\t434366\n理工科大\t434367\n粤港\t434368\n第65集\t434369\n红樱桃\t434370\n踢\t434371\n黄斑裂孔\t434372\n94集\t434373\n埃隆马斯克\t434374\nCivic\t434375\n证监会\t434376\n首创集团\t434377\nFortiGate\t434378\n文昌\t434379\n亲华\t434380\n维也纳机场\t434381\n泗阳\t434382\nTractor\t434383\ngage\t434384\n快穿之炮灰女配逆袭记\t434385\n金融公司\t434386\nAIA\t434387\nMainActivity\t434388\ncurl_exec\t434389\n邱伟\t434390\n欧丽薇兰\t434391\n椰岛鹿龟酒\t434392\n朱骏\t434393\nQ\t434394\n光环战争\t434395\n中华人民共和国国家新闻出版广电总局\t434396\n王者荣耀体验服\t434397\n型动\t434398\n餐柜\t434399\n珊珂\t434400\n铬铁\t434401\n电影城\t434402\n广本\t434403\n四川大学\t434404\n惠能\t434405\n钢号\t434406\n峡江县\t434407\n沃保\t434408\n画框\t434409\n财金\t434410\n五代篇\t434411\n东西湖\t434412\n图形的放大与缩小\t434413\n七星级\t434414\n虞\t434415\nrecap\t434416\nbyr\t434417\nsteamspy\t434418\n三得\t434419\nunsupervised\t434420\nwxWidgets\t434421\n275\t434422\n有一次\t434423\n博仕\t434424\n上午9点\t434425\n外联\t434426\n台机\t434427\n国家地理中文\t434428\n元法\t434429\nGS\t434430\n援护\t434431\nOxygen\t434432\n一撇\t434433\n现状报告\t434434\n干衣机\t434435\n驶往\t434436\n西安市规划局\t434437\n出眠\t434438\n强奸案\t434439\n枯山\t434440\n档风\t434441\n探险\t434442\nanny\t434443\n外交史\t434444\n三通两平台\t434445\n芳香烃\t434446\n360N6Pro\t434447\n7.6分\t434448\n肌营养不良\t434449\n色剂\t434450\n80c51\t434451\n周焯华\t434452\n2018年04月份\t434453\nmamakids\t434454\ndatasheet\t434455\nMinogue\t434456\ngupiao\t434457\n一把一把\t434458\n金南俊\t434459\n6笔\t434460\n寿辰\t434461\n水果糖\t434462\n数控龙门铣床\t434463\n北汽昌河\t434464\n蒙古帝国\t434465\n中国杯\t434466\n660\t434467\n艾米丽\
t434468\n360手柄模拟器\t434469\n日月神教\t434470\n李逍遥\t434471\n中铁二十五局\t434472\n针织面料\t434473\n兼并\t434474\n道道\t434475\n在那桃花盛开的地方\t434476\n刀俎\t434477\nodps\t434478\n无奇不有\t434479\n侯磊\t434480\n山顶洞人\t434481\nsinger\t434482\n防尘漆\t434483\n行卡\t434484\n辛普森一家\t434485\nlo\t434486\n天门冬\t434487\n森月茉\t434488\n四凤\t434489\n肯尼基\t434490\n传感\t434491\nLarry\t434492\n氡\t434493\n回憶\t434494\n母婴类\t434495\nphosphory\t434496\ntsd\t434497\n2.8.4\t434498\nlj2400l\t434499\n真人娃娃\t434500\n48段\t434501\n黑龙江路\t434502\nSkymem\t434503\nMOOK\t434504\n倍林达\t434505\n脑电波\t434506\n僵化\t434507\n前湖\t434508\n螳螂捕蝉\t434509\n实地集团\t434510\n死亡日记\t434511\n中国统计信息网\t434512\n上药\t434513\n天香楼\t434514\nPocky\t434515\n细部\t434516\nnbs\t434517\n乳孔\t434518\nzhuye\t434519\nF000\t434520\n府山街道\t434521\n纤腰\t434522\n万王\t434523\n人教版\t434524\n毫秒\t434525\n冰妹\t434526\njqueryeasyui\t434527\n沙塘\t434528\n黄品源\t434529\n外设天下-电脑外设\t434530\n海赋\t434531\n甘化\t434532\n抛洒\t434533\n龚翔宇\t434534\n相安\t434535\n使徒行传\t434536\n开心宝贝\t434537\n港股通\t434538\n2016年1月\t434539\ncarried\t434540\noppo公司\t434541\n波顿\t434542\nAGV\t434543\n75张\t434544\n期末余额\t434545\n战国全面战争\t434546\n怒之煞\t434547\n日不落\t434548\n青歌赛\t434549\n仰拱\t434550\n珠三角\t434551\n康辉旅游网\t434552\n金隅南七里\t434553\n5.4亿\t434554\n宪法监察法\t434555\n刘铭\t434556\n四肢\t434557\n艾仕\t434558\n武功\t434559\nNFU\t434560\n90坦克\t434561\n五年后\t434562\n裙底\t434563\n6.35mm\t434564\nfilepath\t434565\nlumia920吧_\t434566\nOpens\t434567\n小灰笔记\t434568\n乱芳华\t434569\n小草\t434570\n2018年3月1日\t434571\n长安银行\t434572\n张晓英\t434573\nGIANT捷安特\t434574\n8888888\t434575\n月饼券\t434576\n长纤\t434577\n凤凰佛教\t434578\n样式类\t434579\nRedirector\t434580\nbom\t434581\n下篇\t434582\n煤炭总医院\t434583\n来福士\t434584\n水艺\t434585\n陈娜\t434586\n无极剑圣吧\t434587\ntpx\t434588\nV10.6.7.64\t434589\n死亡爬行\t434590\n仕德伟\t434591\n小米note4x\t434592\n酷蜂科技\t434593\n豹女\t434594\n竖管\t434595\n重庆国际会议展览中心\t434596\n9303\t434597\n四十多天\t434598\n不匹配\t434599\nRevitAPI\t434600\n345路\t434601\nstm32f030\t434602\n刺情\t434603\n挖土\t434604\n合和\t434605\n虚拟光驱软件\t434606\n泡茶杯\t434607\n超静音\t434608\n30秒内\t434609\n弟媳\t434610\n跷脚牛肉\t
434611\n硫化钠\t434612\n70次\t434613\n南汇县\t434614\njapane\t434615\n超神学院之雄兵连乾坤篇\t434616\n_天降之物\t434617\n歼20\t434618\nambit3\t434619\n比赛成绩\t434620\n亮堂堂\t434621\n中阮\t434622\nBOBO\t434623\n题目\t434624\nTAGS\t434625\n经济科学出版社\t434626\n母猪\t434627\n峰谷分时电价\t434628\n百倍\t434629\n桑拿水\t434630\n答谢宴\t434631\notl\t434632\n恐怖袭击\t434633\n东莞总站\t434634\nwebm\t434635\n东阳\t434636\nv4.3.3\t434637\n泰州市中医院\t434638\n消息体\t434639\n黑沙滩\t434640\n百度统计\t434641\n诺斯贝尔\t434642\n圣约\t434643\n北京政府\t434644\n施维雅\t434645\n铜仁地区\t434646\n表示\t434647\n半期\t434648\n可可豆\t434649\nel-date-picker\t434650\n最伟大的\t434651\n杀气\t434652\n大批\t434653\nbonita\t434654\n因为爱情\t434655\n书郎\t434656\n升幅\t434657\n红狼\t434658\n棉绳\t434659\n首尔塔\t434660\nbluestacks\t434661\n谢冰莹\t434662\n航天科工二院\t434663\n少不了\t434664\n庶务\t434665\n葛天\t434666\n恐龙谷\t434667\n翔盛\t434668\n靖江王府\t434669\niPhon\t434670\n大龙网\t434671\n湖滨中学\t434672\n行尸走\t434673\n大年\t434674\n2.97\t434675\n补牙\t434676\n1997\t434677\n3天2夜\t434678\n商旅\t434679\n罗威纳\t434680\n哈弗H7\t434681\n填志愿\t434682\nninety\t434683\n数少\t434684\n海绵宝宝\t434685\n灵桥\t434686\n我的妈妈是精灵\t434687\nQLED\t434688\n插卡\t434689\n雨前茶\t434690\n23厘米\t434691\n红星社区\t434692\n处女泉\t434693\n共轭矩阵\t434694\n林瑞瑜\t434695\n竹蜻蜓\t434696\n贰\t434697\nmsteel\t434698\n左腕\t434699\n湖北省国资委\t434700\n操出\t434701\n大连海关\t434702\nrer\t434703\nSEM/EDS\t434704\n发季\t434705\n德岛\t434706\n6018\t434707\nktm\t434708\n塔罗\t434709\n赵咏华\t434710\n人证\t434711\n服务码\t434712\n51Nod\t434713\n江潮\t434714\n丰重\t434715\n吹瓶机\t434716\n拉弯\t434717\n涂磊\t434718\n压栈\t434719\n防护\t434720\n优酷在线\t434721\n伙\t434722\n两千万\t434723\n高级定制\t434724\n俄美\t434725\n壮\t434726\n形码\t434727\n失独\t434728\nMobius\t434729\nZNDS\t434730\nbasil\t434731\n陈雪\t434732\nEffective\t434733\nMsi\t434734\nHD-720P.MP4\t434735\n俩口\t434736\n无远弗届\t434737\n1886年\t434738\n保护地\t434739\n优源\t434740\n光明区\t434741\nvocal\t434742\n霍金斯\t434743\ndesigned\t434744\n约束名\t434745\ncre\t434746\n维摩\t434747\n国漫\t434748\n张东\t434749\n胶木\t434750\n钱塘老娘舅\t434751\n永安街\t434752\nmie\t434753\nP8\t434754\n盘龙湾\t434755\n厂商们\t434756\noriginpro\t434757\n方正字体_方正
字库\t434758\nTod\t434759\n毕福毕雯珺\t434760\ndd35\t434761\nheaven\t434762\n张震岳\t434763\n肾盂癌\t434764\n中视网\t434765\n燕庄\t434766\n主题展\t434767\n迷妹\t434768\n瓷业\t434769\n新鲜人\t434770\n精神病史\t434771\n七十周年\t434772\n举案说法\t434773\n联想yoga710\t434774\nIphone5\t434775\n兴邦\t434776\nRUI\t434777\nGender\t434778\n鹿儿岛县\t434779\n白雪石\t434780\n枋\t434781\n高唐\t434782\n61\t434783\n云南省农村信用社\t434784\n玄霄\t434785\n散流器\t434786\n海明\t434787\n蓝宝龙\t434788\n盆\t434789\n44年前\t434790\n野猪林\t434791\n剪辑版\t434792\n情歌赛过春江水\t434793\nhdr\t434794\n张楚\t434795\nR2015b\t434796\n小猪短租\t434797\n芬芳\t434798\n应得\t434799\n出操\t434800\n跳机\t434801\nredir\t434802\nAssembly\t434803\n永乐大典\t434804\n通标标准技术服务有限公司\t434805\n熔渣\t434806\n天府通\t434807\n晚吹\t434808\n6.0.3\t434809\nhaer\t434810\n推向\t434811\n盗梦\t434812\n马尔科\t434813\n桃江县\t434814\n孔君平\t434815\n苏州市行政审批局\t434816\n恩兔\t434817\n耕坏\t434818\nGUNDAM\t434819\n剃刀\t434820\n昌珉\t434821\n后进纸\t434822\nresform\t434823\nagf\t434824\n计划生\t434825\n水浴法\t434826\n2017年6月28日\t434827\n老冯\t434828\n4770\t434829\n热水泵\t434830\n脱泡\t434831\n南波\t434832\n数字电位器\t434833\nDistpicker\t434834\nPearson相关系数\t434835\n兴业证券股份有限公司\t434836\nvovo\t434837\n70m\t434838\n张超凡\t434839\n迅搜\t434840\n星愿浏览器\t434841\n股票质押贷款\t434842\n为限\t434843\n鄂尔多斯盆地\t434844\n5399\t434845\natl\t434846\nDeve\t434847\ndislocation\t434848\nMongolian\t434849\n飞虹路\t434850\n东阳光科\t434851\n订购单\t434852\n原级\t434853\n急难险重\t434854\n绕路\t434855\n赛点\t434856\n数几\t434857\n理瓶机\t434858\nnestable\t434859\n静态分析\t434860\n歌利亚\t434861\n冷加工\t434862\nstddef\t434863\n0371-60630500\t434864\n三峡银行\t434865\n王宝山\t434866\n063\t434867\n唐山市妇幼保健院\t434868\n分裂\t434869\nresidential\t434870\n辐射2\t434871\n胡氏\t434872\nshifu\t434873\n黄安\t434874\n视听\t434875\n即墨信息港\t434876\nsometime\t434877\n门路\t434878\n北方汽修学校\t434879\n有图有\t434880\n陕西航空职业技术学院\t434881\n常州市市\t434882\nPass\t434883\n休闲车\t434884\n蒸好\t434885\n振兴中华\t434886\n滨海欣嘉园\t434887\nIOPS\t434888\n小枣\t434889\n小哥哥们\t434890\n壹药网\t434891\n行果\t434892\n收自支\t434893\n赤城\t434894\n蔡楠\t434895\n膨出\t434896\n含冤\t434897\n这世界\t434898\n春节联欢晚会\t434899\n虎牢\t434900\
nENE\t434901\n杨野\t434902\n手迹\t434903\n芬尼\t434904\n共同\t434905\nsuperslide\t434906\nHeb\t434907\n三亚市人民政府\t434908\n普吉国际机场\t434909\n6第二代\t434910\nlicence\t434911\n林青霞\t434912\n绿军\t434913\n广州商学院\t434914\nDTC\t434915\n文库\t434916\n骨质\t434917\n艰苦\t434918\n2天\t434919\n2002年度\t434920\n市委政研室\t434921\n改任\t434922\n孔老师\t434923\n代练通\t434924\n文一波\t434925\n莲蓬头\t434926\n布贴画\t434927\nBootLoader\t434928\n百万方\t434929\npowerpoi\t434930\n四界\t434931\n378.92\t434932\n口干舌燥\t434933\ngtx660ti\t434934\nsata3\t434935\n识别仪\t434936\n华西妇产儿童医院\t434937\nrabbits\t434938\nm11\t434939\nLaoKa-51CTO\t434940\npg279q\t434941\n北京紫竹院公园\t434942\nConstruction\t434943\n还魂草\t434944\npure-ftpd\t434945\n255.255.252.0\t434946\n枪族\t434947\n穆提\t434948\n终板\t434949\n集\t434950\nIPHONE4\t434951\n西稍门\t434952\n20万平方米\t434953\n升入\t434954\n沁园春长沙\t434955\n20160514\t434956\n两小无猜网\t434957\n穷国\t434958\n瓦楞板\t434959\n姬\t434960\n华诚\t434961\n被拘捕\t434962\n快快快_\t434963\n上里\t434964\n科伦\t434965\n绫罗\t434966\n中建教育\t434967\n17.12\t434968\n32分\t434969\nIE10浏览器\t434970\n布衫\t434971\n检控\t434972\n宋林\t434973\n限高杆\t434974\n烤肉拌饭\t434975\n无可恋\t434976\n盐河\t434977\n头管\t434978\n网上办税服务厅\t434979\nsrc值\t434980\n更有钱\t434981\n乐范\t434982\n堇山小学\t434983\n中荷\t434984\n信中国\t434985\n机械时代网\t434986\nCLI\t434987\nu乐\t434988\n盘整\t434989\n侍卫\t434990\n信息流\t434991\n成瀬心美\t434992\n翼年代记\t434993\n影级\t434994\napk\t434995\nBEAM\t434996\n绿箭侠\t434997\n第223集\t434998\nRHEL6\t434999\n大力气\t435000\njelly\t435001\n肛周湿疹\t435002\nMCC码\t435003\n教宗\t435004\n蒋委员长\t435005\n诸子百家争鸣\t435006\n叶企孙\t435007\n外阴癌\t435008\n祖堂\t435009\n特人\t435010\n接字\t435011\n万盈\t435012\n淘金者\t435013\n铜件\t435014\n晔\t435015\nCUHK\t435016\n480个\t435017\n利久期\t435018\nmultiple\t435019\n红金龙\t435020\nExperiments\t435021\n1.5.5\t435022\n新希望杯\t435023\n骇\t435024\n富县\t435025\n清明诗\t435026\n蓝爵\t435027\n牛肉羹\t435028\n霍震霆\t435029\nLookAE\t435030\nAPE音乐下载网\t435031\n广园快速路\t435032\n争先\t435033\n满勤\t435034\n白头到老\t435035\nCCTV8\t435036\nhxl2219\t435037\nqq2017\t435038\nidou\t435039\nhaswell\t435040\ngtx965m\t435041\n哥伦比亚号\t435042\n尘量\t435
043\n百香果籽\t435044\njon\t435045\n13G\t435046\n上甘岭\t435047\n林毅夫\t435048\n招商部\t435049\n板水\t435050\n村容\t435051\n基尔\t435052\n雁子\t435053\n第三城\t435054\n青云英汉互译翻译网\t435055\n凡尔纳\t435056\n挂起\t435057\nEOF\t435058\ngs\t435059\n海天盛筵\t435060\n杨千嬅\t435061\n45毫米\t435062\n借用\t435063\n赤足者\t435064\nfractal\t435065\n沃森\t435066\n淮南联合大学\t435067\n红警大战\t435068\n心泉\t435069\n48k\t435070\n福州市第一医院\t435071\n爱师网\t435072\n劣势\t435073\n数字娱乐\t435074\n求操\t435075\n疑犯\t435076\nmastery\t435077\nVictor\t435078\n盘妻索妻\t435079\n追悔\t435080\n棉织品\t435081\noffee\t435082\n雇员制\t435083\n石家庄小学\t435084\n第67号\t435085\na9\t435086\n焚香谷\t435087\n滞空\t435088\n开始菜单打\t435089\n悦盒EC6108V9\t435090\n易鑫\t435091\n科苑路\t435092\n火龙弩\t435093\n赏味\t435094\n大司马解说吧\t435095\n粉剂\t435096\ntid\t435097\n名轩\t435098\n妇女之家\t435099\nonmouse\t435100\n闯红灯\t435101\n包清\t435102\n宅集社\t435103\n元亨\t435104\nNSObject\t435105\n优速\t435106\n旋具\t435107\n鱼虾蟹\t435108\ngearman\t435109\n介入手术\t435110\n供应链管理有限公司\t435111\n滚翻\t435112\n捷诺维\t435113\n丰茂\t435114\nREMUX\t435115\n牛欣欣\t435116\n唐山路北区\t435117\n春娇塔防三国志\t435118\nchanel\t435119\n倒错\t435120\n康生\t435121\nav无码\t435122\n第三十七次\t435123\n无想\t435124\n上海古朵生物科技有限公司\t435125\n色友俱乐部\t435126\n博士后\t435127\n珍珠母\t435128\nShield\t435129\n魅蓝e2\t435130\n怀孕率\t435131\n5.99万\t435132\n赵学军\t435133\n乐视x620\t435134\n生物碱\t435135\n索尼a7吧\t435136\nMachina\t435137\n3024\t435138\n16PF\t435139\n兵役登记\t435140\n择偶观\t435141\n27亿元\t435142\n葡萄园\t435143\nTurf\t435144\nHTTP/2\t435145\n中新镇\t435146\n缓冲池\t435147\nwap2app\t435148\n犯忌\t435149\n虚浮\t435150\n粉虱\t435151\n张斌\t435152\nMyeclipse2017\t435153\n沾湿\t435154\n蛤蛤\t435155\n分光器\t435156\n棕油\t435157\nAttacks\t435158\n防癌险\t435159\n文具袋\t435160\n芊芊\t435161\n第7轮\t435162\n南京大学仙林校区\t435163\n新福港\t435164\n品略图书馆\t435165\n六安市中医院\t435166\n马谡\t435167\nアナタ\t435168\n先锋电视\t435169\n10丸\t435170\n九江市第一人民医院\t435171\n利宝保险\t435172\n雷奕明\t435173\n89式\t435174\n6.9元\t435175\n手绳\t435176\n无罪辩护\t435177\n治病救人\t435178\n十几遍\t435179\n宿县\t435180\n小米米\t435181\n幻想神级英雄\t435182\npaperfree\t435183\n黑布林英语阅读\t435184\nCreuset\t435185\npyenv\t435186\n黑
川温泉\t435187\n柠檬杯\t435188\n宿城一中\t435189\n皱巴巴\t435190\n广元日报社\t435191\n王雪峰\t435192\n餐饮连锁\t435193\nyancy\t435194\n波波池\t435195\n悔意\t435196\n龙湖舜山府\t435197\n西南交通大学》\t435198\n洋牌\t435199\nFaced\t435200\n瑞和\t435201\n贩卖机\t435202\n本物\t435203\n西班\t435204\n爱的名义\t435205\n热轧带肋\t435206\n12位\t435207\n水缘\t435208\nt58\t435209\nHX\t435210\n厦门大学继续教育学院\t435211\n旧州古镇\t435212\n生辰八字算命\t435213\n王润\t435214\n浙江广厦男篮\t435215\nTalks\t435216\n雏鹰假日小队\t435217\n棉业\t435218\n施救\t435219\nmsnshow\t435220\n唯美图片网\t435221\n腺苷钴胺片\t435222\n合并式\t435223\n数显千分尺\t435224\nhtml\t435225\nEpson\t435226\nfview\t435227\n生化需氧量\t435228\n饺子皮机\t435229\n丹尼\t435230\n瑶浴\t435231\n范丰平\t435232\n猪肉佬\t435233\ndrink\t435234\n死魂曲\t435235\n微生物室\t435236\n不能错\t435237\n爷叔\t435238\nElton\t435239\n美莲广场\t435240\n坚着\t435241\nwxpy\t435242\n刘统勋\t435243\nx+3\t435244\nUptempo\t435245\n云蒙山\t435246\n二心\t435247\n企稳\t435248\n斗牌\t435249\n看书\t435250\n海曙\t435251\n特异度\t435252\ngssz\t435253\nSTM32F103ZET6\t435254\n西沙\t435255\nWin32API\t435256\n8:00\t435257\n中国侨联\t435258\n太娘\t435259\n中国国际展览中心\t435260\n沥水架\t435261\n招聘考试网\t435262\nHikvision\t435263\n前后杠\t435264\n冷链车\t435265\nfel\t435266\n1800R\t435267\n最爱的女人\t435268\n雁过拔毛式\t435269\n18051期\t435270\nsasa\t435271\nNeutron\t435272\n自来水公司\t435273\n403.14\t435274\n佛山机场\t435275\nuabntu\t435276\n名果\t435277\nMXNet\t435278\n冷热源\t435279\n杨式太极剑\t435280\n天气预报\t435281\nLicess\t435282\ncs5\t435283\nxsplit\t435284\n精索静脉曲张\t435285\n四方面\t435286\nl805\t435287\n5720\t435288\n构建器\t435289\n甘青\t435290\n潘金莲\t435291\n恒泰艾普\t435292\n第五十期\t435293\n法画\t435294\n中国中央电视台\t435295\n临夏\t435296\n北京东路小学\t435297\n陈洪波\t435298\n珞瑜路\t435299\n3007\t435300\n工作日记\t435301\n阿塔尼斯\t435302\n电子税务局\t435303\nParasite\t435304\n孙海波\t435305\n脑挫裂伤\t435306\n日起\t435307\n囊胚\t435308\n商贸城\t435309\n大粪\t435310\n齐鲁台\t435311\n庭庭\t435312\n第三十五\t435313\n南京市公安局交通管理局\t435314\n顽强\t435315\n英雄杀\t435316\nGINZA\t435317\n酸辣土豆丝\t435318\n外交贴\t435319\n韩菲\t435320\n永远不懂\t435321\nEAF\t435322\nAntimicrobial\t435323\n影音先锋电影_影音先锋\t435324\nジ店\t435325\n东论\t435326\n盛明\t435327\n被击毙\t435328\n12
99元\t435329\n十八种\t435330\n汽水瓶\t435331\n军式\t435332\n外交部长\t435333\n百合网\t435334\nTYLOO\t435335\n哑铃卧推\t435336\n干手\t435337\n黄埔海关\t435338\n_美元\t435339\n懈\t435340\n画皮2\t435341\n汇阳广场\t435342\n小块\t435343\n孟晓艺\t435344\n门垫\t435345\n二祖\t435346\n灭蚊片\t435347\n三门峡市中心医院\t435348\n烧干\t435349\nISO45001\t435350\n安信地板\t435351\n建居\t435352\n网杯\t435353\n财经高等职业技术学校\t435354\n太阳城集团\t435355\n袁林\t435356\nadvocacy\t435357\n杨光辉\t435358\n刘半农\t435359\nDevBean\t435360\n深度版\t435361\n黄山站\t435362\n疑犯追踪\t435363\n除胶剂\t435364\n祥云县人民政府\t435365\n线性回归\t435366\nAnnotations\t435367\n彩沙\t435368\n阅奇网\t435369\n证通\t435370\n圈钱\t435371\n辅助器\t435372\n金控集团\t435373\n255.255.0.0\t435374\n12月27日\t435375\n9000e\t435376\n达赖集团\t435377\n1595\t435378\nspingmvc\t435379\n孙晶晶\t435380\n拜占庭\t435381\n230公里\t435382\n掉掉\t435383\n丁子高\t435384\n价表\t435385\n缩骨\t435386\n条码支付业务规范\t435387\n草上\t435388\n认房不认贷\t435389\n全国公安厅\t435390\nf-7861121\t435391\n老山檀\t435392\n美成达\t435393\n果汁机\t435394\n2.25\t435395\n四川地区\t435396\n陈志军\t435397\n芝加哥华人资讯网\t435398\n李秀丽\t435399\nLUNA\t435400\n蒋公\t435401\n山光\t435402\n混合编程\t435403\n中材国际\t435404\n右玉县\t435405\n第A14\t435406\n吼\t435407\n双子山\t435408\nEyed\t435409\nSalon\t435410\nwifi管家\t435411\n热电阻\t435412\n和为贵\t435413\n第六大\t435414\n福特撼路者社区-手机易车网\t435415\n健一\t435416\n5通\t435417\n气化率\t435418\n招标书\t435419\n光明之路\t435420\n500%\t435421\n神圣罗马帝国\t435422\n火炬木\t435423\n傅氏\t435424\n欠操\t435425\nT19\t435426\n周铁镇\t435427\n雨林木风ghost\t435428\n金科股份\t435429\n新盟\t435430\n营行\t435431\n沙漠之舟\t435432\n蜘\t435433\n漫网\t435434\nm12\t435435\n360手机N6\t435436\n活蝌蚪\t435437\nDimensional\t435438\n纳达王志东\t435439\n湘区\t435440\n有偿服务\t435441\nSnapchat\t435442\n风式\t435443\npoa\t435444\n数控冲床\t435445\n信保基金\t435446\nWin10锁屏\t435447\n隆鑫\t435448\n华为mate7吧\t435449\n超过一分钟\t435450\n母港\t435451\nTasty\t435452\n必要性\t435453\n唐老大\t435454\n马小李\t435455\n美丽的草原我的家\t435456\n天朝\t435457\n张家界森林公园\t435458\n1608\t435459\n称职\t435460\n第六家\t435461\n四发\t435462\n无锋\t435463\n底蕴\t435464\n国有土地上房屋征收与补偿条例\t435465\n太平洋产险\t435466\n标展\t435467\n火鸡\t435468\n晦\t435469\n超便携\t435470\nLaysha\t43547
1\n勇擒\t435472\n亿生\t435473\n十几只\t435474\n幅员\t435475\n江苏大学京江学院\t435476\n异象\t435477\n蠕行\t435478\n垂线段\t435479\n淋巴炎\t435480\nCitrus\t435481\n28.com\t435482\n网火\t435483\n第五层\t435484\nweiming\t435485\n五十个\t435486\n82424418\t435487\n谢勇\t435488\n豆石\t435489\n自嗨\t435490\nlosses\t435491\n菁智\t435492\n智游城\t435493\nBalenciaga\t435494\n计列\t435495\n今日头条\t435496\n菅纫姿\t435497\n中盈网\t435498\n加权平均净资产收益率\t435499\n金雅\t435500\n低纬\t435501\nSP1\t435502\n叙\t435503\n林欢\t435504\n钻\t435505\n门口机\t435506\nwanna\t435507\nFlyway\t435508\n西条\t435509\n林菲\t435510\n竹芋\t435511\nHitFM\t435512\n罗勒叶\t435513\n独流\t435514\n索尼应用中心\t435515\nHuang\t435516\niml\t435517\ninitiated\t435518\n斗鱼三骚\t435519\n香港海洋公园\t435520\n75毫米\t435521\n中药饮片厂\t435522\n浙江执御信息技术有限公司\t435523\n立题\t435524\n秀洲新闻网\t435525\n武易传奇\t435526\n2018下半年\t435527\n葛佳慧\t435528\n占款\t435529\nRiders\t435530\n牛柳\t435531\nGenius\t435532\n国家粮食局\t435533\n恐怖动物\t435534\n雾化玻璃\t435535\nstage1/s1\t435536\n3成\t435537\n力助\t435538\n太湖度假区\t435539\nwinkawaks模拟器\t435540\n43个\t435541\n历任\t435542\n2016年6月6日\t435543\n狮山路\t435544\n表业\t435545\n一诺倾情\t435546\n小学徒\t435547\n夺\t435548\n横向\t435549\ncanary\t435550\n验货员\t435551\n茶店子客运站\t435552\n比是\t435553\n我和你\t435554\n链家网\t435555\n欲死\t435556\nversa\t435557\n徐铮\t435558\n原始佛教\t435559\n金升阳\t435560\n李姝寒\t435561\n彩运\t435562\nwww.hrssgz.gov.cn/vsgzhr/PHJG/BC4\t435563\n一动\t435564\n五张\t435565\n康恩贝肠炎宁\t435566\n无敌破坏王\t435567\n湖北公务员考试网\t435568\n棉胎\t435569\nIAAS\t435570\n可能态\t435571\n第二十五次\t435572\n伪随机\t435573\n管子\t435574\n绿城党旗红党建\t435575\n歌尔声学\t435576\n十分钟后\t435577\nlolita\t435578\n985吧\t435579\n兴旺发达\t435580\n机动车尾号\t435581\n夹角\t435582\n重整旗鼓\t435583\nVMX\t435584\n实验方\t435585\n智本\t435586\nmatteo\t435587\n名茶\t435588\n赵兵\t435589\n赌狗\t435590\nloan\t435591\n慑\t435592\n短簧\t435593\nAmeri\t435594\ngw2\t435595\nLibre\t435596\n棉价\t435597\nbtsow\t435598\n三元运算符\t435599\n.doc-毕业论文\t435600\n雅西高速\t435601\nfreemarker\t435602\n茶叶\t435603\n等待子\t435604\n陡\t435605\n组合框\t435606\n燃曲\t435607\n来来\t435608\n文昌塔\t435609\n非离子型\t435610\n金象网\t435611\n福莱\t435612\n小高组\t435613\n
虑\t435614\n卢洪哲\t435615\n知白\t435616\nR3hab\t435617\n平民装\t435618\n区国税局\t435619\n卡房\t435620\nsonicare\t435621\nB250\t435622\n张某\t435623\n酷比魔方i10\t435624\nslf4j+logback\t435625\n尼达利\t435626\n三娃\t435627\nV-Ray\t435628\n布玛\t435629\n1555\t435630\n胳肢窝\t435631\n边上\t435632\n25歳\t435633\n利记\t435634\nexpdp\t435635\n油桐\t435636\nICR\t435637\n一哄而上\t435638\n176cm\t435639\n冬期\t435640\n蒸发器\t435641\n卷门\t435642\n江西理工大学应用科学学院\t435643\n数字压力计\t435644\n单节\t435645\n迅雷离线\t435646\n白麒麟\t435647\n升平\t435648\n吴俊寰\t435649\n大三\t435650\n钟先生\t435651\nSaid\t435652\n照灯\t435653\n玄门\t435654\n黄英\t435655\n刘彦平\t435656\n三都镇\t435657\nFLAC+CUE\t435658\n一声\t435659\n33%\t435660\n麦奎尔\t435661\n散兵\t435662\n3.80\t435663\n创时代\t435664\nobjectives\t435665\n威力大\t435666\n27亿美元\t435667\n安庆市委\t435668\n电脑维修技术网\t435669\n环视\t435670\n墙艺\t435671\n冬瓜子\t435672\n暮雪\t435673\n王中军\t435674\n叶建华\t435675\ncoreldraw2018\t435676\n20150129\t435677\naz\t435678\n柏油路\t435679\n赵建华\t435680\n格莱斯\t435681\n街机三国战记\t435682\n第六十三\t435683\n拖放\t435684\n如如\t435685\n光纤光缆\t435686\n开元酒店集团\t435687\n累积\t435688\n海洋报\t435689\n苏铭\t435690\nRUB\t435691\n寇振海\t435692\n擦地机\t435693\n刺梅\t435694\n第十三个\t435695\n磁悬浮轴承\t435696\nFormat\t435697\n激光清洗机\t435698\n南平北\t435699\n安医大第一附属医院\t435700\n浙江大学出版社\t435701\niso13485\t435702\nShear\t435703\n马耳\t435704\nmini1\t435705\n冰洞\t435706\n助读\t435707\nSheet\t435708\nA79\t435709\n帝血弑天\t435710\nchem\t435711\n热裂纹\t435712\nRectifier\t435713\n尉官\t435714\n水门\t435715\n江西省高级人民法院\t435716\n陈法拉\t435717\n永不言\t435718\n夜间版\t435719\n蚊子\t435720\n射角\t435721\nDIO\t435722\nNorway\t435723\n沿江工业开发区\t435724\n成林\t435725\n投掷器\t435726\n超威电动车\t435727\n覺得\t435728\n六问\t435729\n绿谷出久\t435730\n数字媒体艺术专业\t435731\n新浪娱乐\t435732\nf3.5\t435733\n为王\t435734\n杨牧\t435735\n罗雪\t435736\n密档\t435737\nomnet\t435738\n磁台\t435739\n17集\t435740\n铜峰电子\t435741\n五牛基金\t435742\n天坑鹰猎\t435743\n东浦路\t435744\n陈硕\t435745\n手机动态壁纸\t435746\n鼠键\t435747\n复旦大学医学院\t435748\n工作社保\t435749\n两大类\t435750\n积聚\t435751\n社会保险登记证\t435752\n蓉江新区\t435753\n邪犬\t435754\nlista\t435755\nsq\t435756\n转移支付\t435757\n朱建新\t4357
58\n运放电路\t435759\n福建联合新材料科技有限公司\t435760\n胭脂扣\t435761\nmcqueen\t435762\n十枚\t435763\n嵩岩\t435764\n硕士博\t435765\n京沪动卧\t435766\n丽姬传\t435767\nmryqu\t435768\n西安市第九医院\t435769\n山东高院\t435770\n降半旗\t435771\n土坝\t435772\n生产力\t435773\n平和网\t435774\n入院\t435775\n诺莫瑞根\t435776\nturing\t435777\n胀套\t435778\n辛吉德\t435779\n2017年3月21日\t435780\n杆端\t435781\nCoroutine\t435782\n辽师大\t435783\n秦如凉\t435784\n记账本位币\t435785\nneets\t435786\n晶莹\t435787\n菲神\t435788\n朝云\t435789\n英可瑞\t435790\n郭破虏\t435791\nblas\t435792\n随州网\t435793\n电动邦\t435794\nfoobar2000\t435795\n巴隆\t435796\n赵岩昊\t435797\n槐林镇\t435798\n星海鲜竹沥\t435799\n配置篇\t435800\n粉笔\t435801\n间客\t435802\n伯明翰城市大学\t435803\nairtest\t435804\n保水剂\t435805\nawl\t435806\n提款\t435807\n全宝\t435808\n抛头露面\t435809\n渐近线方程\t435810\n乔木时樾\t435811\n出火\t435812\n改正\t435813\n法事\t435814\nIPhone7\t435815\n说和\t435816\n广州购书中心\t435817\n十六年前的回忆\t435818\n专业群\t435819\n峰峰集团\t435820\nDD\t435821\nELISA\t435822\n6.\t435823\n车顶\t435824\n伯牙善鼓琴\t435825\nnegotiable\t435826\n自考专升本\t435827\n20亩\t435828\nicp备案\t435829\n尖波\t435830\n马斯顿\t435831\n2014—2020年\t435832\n整改\t435833\nlida\t435834\n十万年\t435835\nCachePut\t435836\n无人驾驶航空器\t435837\n苦瓜茶\t435838\n不动产登记暂行条例实施细则\t435839\n冰山\t435840\n杨雅捷\t435841\nzach\t435842\n和林\t435843\n飞刃\t435844\n王某甲\t435845\n池州市人民政府\t435846\n2015-2019年\t435847\n香火\t435848\nfilesystems\t435849\n鲜柚\t435850\n空气知音\t435851\nwinters\t435852\n国有独资企业\t435853\n夏川里美\t435854\n朝阳小学\t435855\n淘宝卖家\t435856\n锐驰\t435857\nntp\t435858\n31度\t435859\nSuperhero\t435860\n太一\t435861\nlace\t435862\n神九\t435863\n债权法\t435864\n雷遁\t435865\n上海保交所\t435866\n纪念塔\t435867\n怀挡\t435868\n店主\t435869\n天体力学\t435870\n巴尔扎克\t435871\n神次\t435872\nnexus2\t435873\n一万二\t435874\n淘工厂\t435875\n分级基金\t435876\n斗米兼职吧\t435877\n爱尔兰大学\t435878\n吴江银行\t435879\nWin7系统\t435880\n上海市监狱管理局\t435881\n江西政府\t435882\n大白菜\t435883\nK11\t435884\n努比亚Z11mini\t435885\nsas\t435886\nunity2017\t435887\nvote\t435888\n20160727\t435889\n浦电路\t435890\n互用\t435891\n单方\t435892\n3d学习网\t435893\nconsidering\t435894\n蔷薇科苹果\t435895\n江南郡\t435896\n2n\t435897\n淋浴门\t435898\nDid
n\t435899\n4月4\t435900\n乐有家\t435901\n中储发展股份有限公司\t435902\nSands\t435903\n自造化\t435904\n6月20日\t435905\nbillboar\t435906\n逊克\t435907\n遮肚\t435908\n龙门飞甲\t435909\n山西万荣\t435910\n更便捷\t435911\n软曼网\t435912\n中国塑料加工工业协会\t435913\n软服\t435914\n1500瓦\t435915\n实实在在\t435916\nyunos\t435917\n惠州)有限公司\t435918\n心民\t435919\nKhalifa\t435920\n1.7_\t435921\n折纸_折纸大全图解\t435922\nUnPHP\t435923\n止痒胶囊\t435924\n长孙皇后\t435925\n中山大学生命科学学院\t435926\n横店马拉松\t435927\nebates\t435928\n壤塘\t435929\n销售权\t435930\narmv6\t435931\nY400\t435932\n体循环\t435933\n她的心\t435934\n天屿湖\t435935\n麻呂\t435936\n北京文化\t435937\n蓝子\t435938\n白雪姬\t435939\n法库县\t435940\n惠南镇\t435941\n疏港铁路\t435942\n27速\t435943\n纷路\t435944\n6重\t435945\npare\t435946\n沙尔克\t435947\n54条\t435948\n车器\t435949\n碎屏\t435950\n极体\t435951\n怎们\t435952\n证书链\t435953\n练习题库\t435954\n木鸟\t435955\n四川九牛\t435956\nasync\t435957\n三门县\t435958\n国家元首\t435959\n135路\t435960\n省高管局\t435961\nAltima\t435962\n知识分子\t435963\nCiCi\t435964\nmagoosh\t435965\n无虞\t435966\n园林师\t435967\n131458\t435968\n亡羊补牢\t435969\n重庆总领事馆\t435970\n黄金\t435971\nGMAIL\t435972\n小纪\t435973\n高枝\t435974\n沈阳故宫博物院\t435975\n东郡\t435976\n呈请\t435977\n刷课\t435978\n牙板\t435979\nAdvocate\t435980\n收养\t435981\n21p\t435982\nios9.3\t435983\nXp\t435984\n张云\t435985\n迈阿币\t435986\nURLDecode\t435987\nDresses\t435988\n相控阵雷达\t435989\n买路\t435990\n效标\t435991\n初创公司\t435992\n空之境界\t435993\n潘德\t435994\n韩麦尔\t435995\n初美\t435996\n在现\t435997\n家路\t435998\n惊鸟娱乐\t435999\n江东产业集聚区\t436000\n建家\t436001\n氧化镍\t436002\nrizon\t436003\n非礼勿\t436004\n团结就是力量\t436005\ndiptyque\t436006\n歇斯底里\t436007\n诺丽\t436008\n2.4.1\t436009\nduilib\t436010\n马凯\t436011\n02331\t436012\n隋唐演义\t436013\n杜德利\t436014\n家装公司\t436015\n克星\t436016\n均布\t436017\n成都高新技术开发区\t436018\n泰无聊论坛\t436019\nacknowledgement\t436020\n汤种\t436021\nMT6750\t436022\n2018年2月23日\t436023\n呼朋引伴\t436024\n唐山市政府\t436025\n九三年\t436026\n蓝丰\t436027\n东盛大街\t436028\n崮山\t436029\nmouth\t436030\n洛阳市人民政府\t436031\n高佣联盟\t436032\n奶霜\t436033\n小皮\t436034\n玉佛禅寺\t436035\n酸奶机\t436036\n新宁县\t436037\n罡气\t436038\n被盗\t436039\n传承\t436040\n鼻腔\t436041\n余利宝
\t436042\n太原学院\t436043\n乃木坂春香\t436044\n体服\t436045\n北医三院\t436046\n隋唐五代史\t436047\n天玑科技\t436048\nAluminium\t436049\nΘ\t436050\n一个人的朝圣\t436051\n华阳\t436052\nv1.1.1\t436053\nrunningman\t436054\n翻译器\t436055\n封檐板\t436056\n库页岛\t436057\n冯小刚\t436058\n10603g\t436059\n木历\t436060\n召妓\t436061\nalbumin\t436062\n选别\t436063\n刘珂\t436064\nVIVA\t436065\njoda\t436066\n白蛇传说\t436067\nJEE\t436068\nNoHttp\t436069\n简述\t436070\n魏塘街道\t436071\novm\t436072\n东道国\t436073\n洋溢\t436074\n江孜县\t436075\n厚街镇\t436076\ntensorboard\t436077\n出类拔萃\t436078\n乡人\t436079\n魔方寸\t436080\n左贡县\t436081\nDemon\t436082\n785\t436083\n元淳\t436084\nSharkBin\t436085\n这几天\t436086\n吊车尾\t436087\n90w\t436088\n镇宁自治县\t436089\n沈城\t436090\n侏罗纪公园2:失落的世界\t436091\n梅雁\t436092\n血压\t436093\n幽深\t436094\n击掌\t436095\n无二\t436096\n子午书简\t436097\n遵从\t436098\n2.0.2\t436099\n购物卡\t436100\nhtm5\t436101\nMarvel\t436102\n珊瑚岛\t436103\n施华蔻\t436104\n证券投资基金\t436105\n哈尔滨市人力资源和社会保障局\t436106\n应用语言学\t436107\ncardiology\t436108\n拉丝板\t436109\nanmi\t436110\n孽子\t436111\n营养价值\t436112\n4030\t436113\n毛片\t436114\n八十年代\t436115\n海信电器\t436116\n福州东\t436117\n云安区\t436118\n耐克毛毛虫\t436119\n橡皮塞\t436120\nFIYTA\t436121\nAB类\t436122\n强基\t436123\n明德门\t436124\n留尼汪\t436125\n脸书\t436126\n山西阳泉\t436127\n犯罪高手\t436128\n运损\t436129\n青睐\t436130\n贴机\t436131\n李易欢\t436132\n大老虎\t436133\n2.5.7\t436134\n价格费\t436135\n千份\t436136\n法国国家队\t436137\n身临其境\t436138\n聊斋三集之灯草和尚\t436139\nspring-core\t436140\n信托基金\t436141\n狼雨\t436142\n广州交易会\t436143\n东南快报数字报\t436144\nISO15189\t436145\n纳智捷u6\t436146\nP8B75-M\t436147\n一大圈\t436148\nream\t436149\nWpe\t436150\n强歼\t436151\nHalen\t436152\n瞬狙\t436153\nVUE2.0\t436154\nios8.1\t436155\n欲扬\t436156\n41天\t436157\n焚心\t436158\n和级联\t436159\nredirector\t436160\n前村\t436161\n设计性\t436162\nMAXIM\t436163\n鞋帽\t436164\nsoothing\t436165\n第43号\t436166\n抵消分录\t436167\n完美国际2\t436168\np4vasp\t436169\n日本驻华大使馆\t436170\n永恒之井\t436171\noarcle\t436172\n禁照\t436173\n张家口机场\t436174\n赵巷\t436175\n案款\t436176\n野装\t436177\n微机\t436178\n控制卡\t436179\n王璐瑶\t436180\n鬼吹灯全集\t436181\n取取\t436182\nロック\t436183\n不可信\t43618
4\n自护\t436185\nniv\t436186\n龙陵县\t436187\nLABS\t436188\n200兆\t436189\ncape\t436190\n握拳\t436191\n左俊\t436192\n通信卫星\t436193\n布兰登\t436194\n郭家学\t436195\n王晶\t436196\n不敢相信\t436197\n饭圈\t436198\n消费经济学\t436199\n拉宁\t436200\n连镇\t436201\nX58\t436202\n百业\t436203\nDBCA\t436204\n同尘\t436205\n光功率计\t436206\n跑单\t436207\n佛山市卫生局\t436208\n防御类\t436209\nRS\t436210\nmlab\t436211\n大哭\t436212\n大阅\t436213\n明基\t436214\n金麟\t436215\n20000元\t436216\n微宏\t436217\n当机立断\t436218\n798\t436219\n发生率\t436220\n乌茶邦\t436221\n150g\t436222\n太阳能热发电\t436223\nEDX\t436224\n加时赛\t436225\n3.7亿元\t436226\n老矿\t436227\nUse\t436228\n小瑶\t436229\n就业论文\t436230\n胡兵\t436231\no1\t436232\n互联网大会\t436233\ninstyle\t436234\n湘江壹号\t436235\n正常化\t436236\nDAM\t436237\n多酷\t436238\n攀讲\t436239\n黄道吉日\t436240\n图符\t436241\n6.62\t436242\nV3.1.1\t436243\n先奸\t436244\n何以为\t436245\n像片\t436246\n毛泽东纪念馆\t436247\nSysTick\t436248\n114平米\t436249\n公务员法\t436250\n汤镇业\t436251\n合江亭\t436252\n性瘾者\t436253\n上海电器科学研究所(集团)有限公司\t436254\nSTREAM\t436255\n天影\t436256\n定安\t436257\n成人化\t436258\n恩阳区人民政府\t436259\n版规\t436260\nvixx\t436261\n世爵娱乐\t436262\n深浅\t436263\n健康水平\t436264\n万域天尊\t436265\n沐沐\t436266\n水至清\t436267\n冷冻干燥机\t436268\ndemographic\t436269\n幻境\t436270\n发生\t436271\n乐视s3\t436272\n山西省旅游局\t436273\n糖果浏览器\t436274\n119分钟\t436275\n管辖权异议申请书\t436276\n中国建设招标网\t436277\n币安\t436278\n声歌\t436279\nHandle\t436280\n伊赛牛肉\t436281\n复课\t436282\n80-90年代\t436283\n电动棒\t436284\n滨崎\t436285\n财务指标分析\t436286\n而行\t436287\n衬板\t436288\n安塔利亚\t436289\nX\t436290\n徐港\t436291\n1758\t436292\n协和影视\t436293\njwxt\t436294\n中小学教师违反职业道德行为处理办法\t436295\n临沧市政府\t436296\n默认\t436297\n储存\t436298\n陕建发\t436299\nernian\t436300\n安道\t436301\nanysdk\t436302\n国拍网\t436303\n倡萌\t436304\n土行孙\t436305\nSTOP\t436306\n江苏省江阴市人民医院\t436307\n速达3000pro\t436308\n18h\t436309\n搓揉\t436310\n田羽生\t436311\n速卖通无忧物流\t436312\n近地卫星\t436313\npd1\t436314\n到爱\t436315\nAutodesk欧特克\t436316\n蒋友柏\t436317\n实验体\t436318\n锁库\t436319\nZLC\t436320\n香格里拉机场\t436321\n生物学科网\t436322\n吡啶\t436323\nfunctools\t436324\n端木蓉\t436325\n1-10月份\t436326\nsqldeveloper\t436327\n活\
t436328\n江西省教育考试院\t436329\n吴歌\t436330\n金黔在线\t436331\n三宝科技\t436332\n培\t436333\n表计\t436334\n北京湾\t436335\n驾考\t436336\n金帆\t436337\ngana\t436338\n深圳经济特区\t436339\n国家公务员网\t436340\n不做为\t436341\n总产量\t436342\n阿克塞县\t436343\n鹿肉\t436344\n太古神墓\t436345\n李权白居易\t436346\n看呆\t436347\nsubgrid\t436348\n赘述\t436349\n尿素\t436350\n4兆\t436351\n比利\t436352\nrvt\t436353\n帕岸岛\t436354\n提临\t436355\n陶氏\t436356\n下陆区\t436357\nt66\t436358\n万盛奥陶纪公园\t436359\n冷淡\t436360\n山西大学\t436361\n钉盘\t436362\n广州空气净化工程公司\t436363\n外露灯\t436364\n周丽丽\t436365\n痉挛\t436366\n工院\t436367\n苏门答腊\t436368\n冰释\t436369\n天强\t436370\n中国科学院计算技术研究所\t436371\n小米游戏\t436372\n寒窑\t436373\n篇目\t436374\n146\t436375\nfix\t436376\n胃上\t436377\n小时侯\t436378\n黑丝丝\t436379\ntelnetd\t436380\n朗霞街道\t436381\n长园集团\t436382\n20151205\t436383\nPriscilla\t436384\n类肤\t436385\nyout\t436386\n莱阳路\t436387\n丙戌\t436388\n61亿\t436389\n10015\t436390\n帽儿\t436391\n宁缺\t436392\n北京肿瘤医院\t436393\n听障\t436394\n美丑\t436395\n沃尔顿\t436396\n压榨\t436397\n梁园区\t436398\n4713\t436399\nBelly\t436400\n九法\t436401\n30005\t436402\n桃花海\t436403\nNeo\t436404\n投资分析师\t436405\nRecord\t436406\n20151218\t436407\n沙加\t436408\n仓容\t436409\n联邦调查局\t436410\n授粉\t436411\n微ecology\t436412\n日本自民党\t436413\n面图\t436414\n崔琰\t436415\nNatsu\t436416\n地信网\t436417\n鲜花网\t436418\nquattro\t436419\nBear\t436420\n断骨\t436421\nserum\t436422\n音师\t436423\n自然科学类\t436424\nXMIND\t436425\n柔克刚\t436426\n基因分离定律\t436427\n东京审判\t436428\n颠沛流离\t436429\n拉肚\t436430\n流畅版\t436431\n死循环\t436432\n夏奇拉\t436433\n上百位\t436434\nnpn\t436435\n82年\t436436\n混血\t436437\n风一样\t436438\n认识钟表\t436439\n怡然自乐\t436440\n港风\t436441\n北京交通大学海滨学院\t436442\n阿普顿\t436443\npba\t436444\n囚牢\t436445\n广电网络公司\t436446\n蔽日\t436447\n案释纪\t436448\nCell子刊\t436449\nValidform\t436450\n人民邮电出版社\t436451\n城市建筑工程\t436452\n么子\t436453\n广州萝岗区\t436454\n私语\t436455\n钟舒漫\t436456\n奥拓论坛_汽车之家论坛\t436457\n瑞康体检网\t436458\n常州市宝康医药化工有限公司\t436459\n爱蜜社\t436460\n0.55\t436461\n再生资源交易网\t436462\n索尼xz2\t436463\n中国航空集团公司\t436464\n密码管理器\t436465\nAccountability\t436466\n端下\t436467\n星期二\t436468\nssci\t436469\nUTS\t436470\n掩埋\t4
36471\n小媛\t436472\n申嘉湖高速\t436473\n棺材板\t436474\n猎爱\t436475\n集序\t436476\n毛入园率\t436477\n骷髅洞\t436478\np21\t436479\n杂草\t436480\n内流河\t436481\n童颜巨乳\t436482\n结成\t436483\nTECO\t436484\n复星医药\t436485\n诗雅\t436486\nfps\t436487\ncss吧\t436488\n卡诺\t436489\n王建新\t436490\n塘厦镇\t436491\n0692\t436492\n人像\t436493\n怡合达\t436494\niOS8\t436495\n大失所望\t436496\n項目\t436497\n任务型\t436498\n鹿场\t436499\n0号块\t436500\n枯骨\t436501\n御湖湾\t436502\n1500名\t436503\n保和丸\t436504\n挑战者联盟\t436505\n裸女\t436506\n霍胆丸\t436507\n护士服\t436508\nABAC式\t436509\n伏特加酒\t436510\n圣三\t436511\n百问百答\t436512\nalphablocks\t436513\n河套学院\t436514\n下星期\t436515\n惨败\t436516\n金立m7\t436517\n肃杀\t436518\n译文\t436519\nur242\t436520\nX-MOL\t436521\n剥光\t436522\n盒马\t436523\n致哀\t436524\n浙江省财政厅\t436525\nMCT\t436526\n群游\t436527\nningx\t436528\nSSH攻击\t436529\n一剑问情\t436530\n诺亚信\t436531\n特招\t436532\n高尔夫·嘉旅\t436533\n1到12月\t436534\n上海蔚来汽车有限公司\t436535\nstorebook\t436536\n高顿财税学院\t436537\n桂平市\t436538\n光合作用\t436539\nnp文\t436540\n中华人民共和国建筑法\t436541\nジョン\t436542\n8时\t436543\nshijian\t436544\n交变换\t436545\n魔力宝贝\t436546\n荣立\t436547\n}\t436548\n峡湾\t436549\n833\t436550\n改性\t436551\n网上\t436552\n武汉地铁6号线\t436553\nzhuanti\t436554\n中国林业科学研究院\t436555\n红尘蝶恋\t436556\n香坊火车站\t436557\n二十几个\t436558\n皇藏峪\t436559\nMaestro\t436560\n建德房产网\t436561\n选值\t436562\n张一彤\t436563\n压低\t436564\nbling\t436565\n琪琪云\t436566\n此方\t436567\n三超新材\t436568\n2078\t436569\n恶魔少爷别吻我第二季\t436570\n三条\t436571\n菏泽市\t436572\n犀飞利\t436573\n小菜园\t436574\n小写字母\t436575\n47秒\t436576\n抚州\t436577\n南京市公安局\t436578\n交易时间\t436579\n库克群岛\t436580\n刘茜\t436581\n蝎\t436582\n资产负债表债务法\t436583\n镶钻\t436584\n湖南图书馆\t436585\n梦三生\t436586\nv3.2\t436587\n线程调度\t436588\n吕思勉\t436589\n赛末\t436590\nRobomongo\t436591\n真阳\t436592\n他律\t436593\n之内\t436594\nMahou\t436595\n林芝市\t436596\nExcellence\t436597\nOnsite\t436598\n雷利\t436599\n青龙石\t436600\n黑暗骑士崛起\t436601\nvrm\t436602\n药王\t436603\n7月27日\t436604\n法理学\t436605\n一素\t436606\n标调\t436607\n养老保\t436608\n久违\t436609\n解决好\t436610\n受处分\t436611\nzipp\t436612\n级市\t436613\n钱伯斯\t436614\n胆宁片\t436615\n地插\t436616\nigus\t43
6617\n九顶山\t436618\n玩不来\t436619\n这支\t436620\npasswords\t436621\n些\t436622\n河南省煤田地质局\t436623\n自由市场\t436624\n大围山\t436625\nAIMP\t436626\n比不上\t436627\nsendfile\t436628\n看见鬼\t436629\n双峡湾\t436630\n孤星泪\t436631\n10佳\t436632\n云视讯\t436633\n塔罗牌占卜\t436634\n中国拍卖行业协会\t436635\n12年前\t436636\n绿玛瑙\t436637\n两万公里\t436638\n电动吸奶器\t436639\n3.0.5\t436640\n临床药学\t436641\n敢爱敢做\t436642\n荆州中学\t436643\n擤\t436644\n神游网\t436645\n元角\t436646\n战神4\t436647\n走上\t436648\n半导体制冷\t436649\n八音盒\t436650\n华映\t436651\n19名\t436652\n探秘者\t436653\n佛山市三水区\t436654\n土体\t436655\n切角\t436656\n老男人\t436657\n黄蕾\t436658\n干么\t436659\n三花酒\t436660\nAdhesives\t436661\n中国自由贸易区\t436662\n有你真好\t436663\n雷克萨斯RX\t436664\n费穆\t436665\nAgriculture\t436666\n首开股份\t436667\npumped\t436668\n温妮莎\t436669\n中国物通网\t436670\n既往\t436671\n辞海_字博缘文学网\t436672\n58天\t436673\nFleur\t436674\n菲翔论坛_汽车之家论坛\t436675\n牌车\t436676\nDella\t436677\ncp\t436678\n福睿儒道至圣\t436679\n资管\t436680\n勇往\t436681\n气压机\t436682\nAkoya\t436683\n美加净\t436684\nhaizei\t436685\n85mm\t436686\n洁身器\t436687\n纽约时代广场\t436688\n黑胡\t436689\n杜兰特\t436690\n一时间段\t436691\n双轴搅拌桩\t436692\n任溶溶\t436693\n一本道久久综合久久88\t436694\n26项\t436695\nshei\t436696\n洗衣液\t436697\n纪念章\t436698\n跛足\t436699\n提示音\t436700\n三限盘\t436701\n牛灿\t436702\nprofil\t436703\n名客\t436704\n猎豹移动公司\t436705\n小颜\t436706\nmolex\t436707\n商用版\t436708\n枫叶城\t436709\n气动打包机\t436710\n抛物线方程\t436711\nattilax\t436712\n排出\t436713\nFilmora\t436714\n专署\t436715\n黄岩新闻网\t436716\nTp-link路由器\t436717\n公仪休拒收礼物\t436718\n色精\t436719\n戈壁沙漠\t436720\n孙国庆\t436721\n拨云见日\t436722\nsvchost\t436723\n直升飞机\t436724\n犯贱志\t436725\n美好的一天\t436726\n西化\t436727\nCGTN\t436728\n20160312\t436729\nDiffraction\t436730\n手汗症\t436731\npsd格式\t436732\n滨江宝龙城\t436733\n一个19岁\t436734\n一拆\t436735\n怒火攻心\t436736\n楞严\t436737\nPRODUCTS\t436738\n珍珠米\t436739\n沉香线香\t436740\n衡阳日报社\t436741\n苯溴马隆\t436742\n追美\t436743\n8.5代\t436744\n网易游戏客服中心\t436745\ncopied\t436746\n上海博华国际展览有限公司\t436747\n洪湖市\t436748\n结义\t436749\n仁和集团\t436750\nOD值\t436751\n兴业银行信用卡\t436752\n小赢科技\t436753\n浩方\t436754\n7015\t436755\n元日\t436756\nguidi\t436757\
n1148\t436758\n焦粉\t436759\nCP1H\t436760\n11-1月\t436761\n以此类推\t436762\nVolkswagen\t436763\n星榜\t436764\n咸亨酒店\t436765\n航空联名卡\t436766\nspotfire\t436767\n疑问\t436768\nSAS专\t436769\n重庆大学电气工程学院\t436770\n最狠\t436771\n无需要\t436772\n反洗钱法\t436773\nsocket5\t436774\n转档\t436775\ntutorials\t436776\n程实\t436777\n四川省高院\t436778\n走心机\t436779\n吸水器\t436780\n子安武\t436781\n电晕\t436782\nXmanager5\t436783\n米优\t436784\n榆社\t436785\nclass选择器\t436786\n金地广场\t436787\n刘慧凝\t436788\n领航者\t436789\n李戈\t436790\n吕珊\t436791\nx-lab\t436792\ndisplayport\t436793\nForex\t436794\n半阳\t436795\nbigone\t436796\n上海交通大学闵行校区\t436797\n鳄鱼\t436798\n中国支付清算协会\t436799\n级位\t436800\n陈竺\t436801\n新华中学\t436802\nr18\t436803\n660元\t436804\n上城区\t436805\nProximity\t436806\nCake\t436807\n金胆\t436808\n麦迪\t436809\n八一\t436810\n针灸师\t436811\n麻衣神相\t436812\n朱家湾\t436813\n滚床\t436814\n我的世界中文网\t436815\nCustoms\t436816\n各局\t436817\n上海兴业太古汇\t436818\nubutnu\t436819\n唐登杰\t436820\n百座\t436821\n120年\t436822\n体癣\t436823\n训服\t436824\n顾遥\t436825\n宫腹腔镜\t436826\n焕然一新\t436827\nAVID\t436828\n骗案\t436829\n认罪\t436830\n扭臀\t436831\n白果儿\t436832\n豆汁儿\t436833\n丹棱县\t436834\n盖天\t436835\n高三生\t436836\n养号\t436837\n运动装\t436838\n一页页\t436839\n东旭集团\t436840\n棋子\t436841\n牧羊姑娘\t436842\n高志溶\t436843\n陈艾森\t436844\nkefu\t436845\ntmall\t436846\nree\t436847\n灵山大佛景区\t436848\nGivefine\t436849\n大主\t436850\n网瘾\t436851\n程序集\t436852\nrmi\t436853\n古兜温泉\t436854\n几套\t436855\n宇阳\t436856\n蕾丝兔\t436857\n99.99%\t436858\n今晚\t436859\n完成版\t436860\n绿城育华\t436861\nqualification\t436862\n德裔\t436863\n骚气\t436864\nアマゾン\t436865\n电阻\t436866\n金韬\t436867\n人教版小学五年级下册语文\t436868\nxong\t436869\n史国良\t436870\n背户\t436871\n辛芩颗粒\t436872\n贸促会\t436873\nMATCH函数\t436874\nPRN\t436875\niphone7\t436876\njqeury\t436877\nBoyd\t436878\nmodifiable\t436879\n莫烦\t436880\n凤凰军机处\t436881\n水洗\t436882\n时尚款\t436883\n逆置\t436884\n丽维\t436885\n巨人杯\t436886\n博班\t436887\niOS11.3\t436888\n第一面\t436889\n货运量\t436890\n山煤国际\t436891\n波多野吉衣\t436892\n陈怡蓉\t436893\nEDA365电子工程师网\t436894\n粉白\t436895\n劳动能力鉴定\t436896\n智法\t436897\n大蛇无双2\t436898\n13800\t436899\n第三胎\t43
6900\n富勒\t436901\n比喻\t436902\nbordered\t436903\n周华\t436904\n刘广\t436905\ngj\t436906\n明妃传\t436907\n热备\t436908\n技术狂\t436909\nWDCP\t436910\n吴建\t436911\nlsattr\t436912\n金夫人\t436913\n函审\t436914\n包装瓶\t436915\n京正\t436916\n筒历\t436917\n水南\t436918\n油画家\t436919\nterminals\t436920\n王之王\t436921\n腭裂\t436922\n连招\t436923\n广州市文化广电新闻出版局\t436924\n含香\t436925\n防龋\t436926\nMail\t436927\n油类\t436928\n十六厘米\t436929\n仙魔传说\t436930\n闲敲\t436931\n原石\t436932\n十步\t436933\n随机矩阵\t436934\n侧\t436935\nLamda\t436936\n20170407\t436937\nspit\t436938\ntie\t436939\n妙峰山\t436940\n同住\t436941\n凯翔\t436942\n中国移动通信有限公司\t436943\n安庆职业技术学院\t436944\n换换\t436945\nPassengers\t436946\n花明楼\t436947\n安庆市人民政府\t436948\n维列\t436949\n1.0.15\t436950\n环境科学与工程专业\t436951\n中影集团\t436952\n阴翳\t436953\ngoldengate\t436954\n神柱\t436955\n争创\t436956\n首片\t436957\n蝙蝠\t436958\n朗坤智慧科技股份有限公司\t436959\n种养\t436960\n浮出水面\t436961\n1310nm\t436962\nudaf\t436963\nextinct\t436964\n沙锅\t436965\n更新版\t436966\n3d开机号\t436967\n涪陵榨菜\t436968\n海灵\t436969\nzlw\t436970\n蜜虫\t436971\n高祖\t436972\n生物体\t436973\n飞龙路\t436974\n25个月\t436975\n0.19\t436976\n第九交响曲\t436977\nJOJO\t436978\nzkbm\t436979\n剪子\t436980\n资信评级\t436981\nmybites\t436982\n章含之\t436983\n_途牛旅游网\t436984\n纯天然\t436985\nCTR\t436986\n客照\t436987\njackyrong\t436988\nSql\t436989\n票面\t436990\n肾功能\t436991\n青海省西宁市城中区政府\t436992\n芜湖市公安局\t436993\n子宫纵膈\t436994\n缉毒\t436995\nfault\t436996\nguba\t436997\n电动玻璃门\t436998\nLINUX\t436999\n李宝英\t437000\n湖里区\t437001\n中央邦\t437002\n新建路\t437003\n端粒酶\t437004\nexposed\t437005\n别上\t437006\n四六级\t437007\n三民\t437008\n千龙\t437009\n噬血狂袭\t437010\n4.0安卓\t437011\n岩龙\t437012\n信息网络传播视听节目许可证\t437013\n汉化破解版\t437014\nSplatoon\t437015\n吴爱英\t437016\n海哥\t437017\nhandspeaker\t437018\nalong\t437019\n文物类\t437020\n杨春\t437021\nHttpCN\t437022\n东吴\t437023\n750W\t437024\n在南方\t437025\n徒长\t437026\n无水硫酸铜\t437027\n200家\t437028\ne租宝\t437029\nHRBP\t437030\nhaute\t437031\n妩媚\t437032\n幸福路\t437033\nLowe\t437034\n三拼音节\t437035\ndtp\t437036\nredies\t437037\n佳兆业广场\t437038\n龙门驿站\t437039\n搭话\t437040\n说客\t437041\n胆南星\t437042\n中国软科学\t437
043\nOtsu\t437044\nPINKO\t437045\n阿发\t437046\n陕西省人民医院\t437047\n白衬衫\t437048\n人民保险\t437049\n夏冬\t437050\n临走前\t437051\n嫁给我吧\t437052\n饲用\t437053\n招商花园城\t437054\nandroidstudio\t437055\n论丛\t437056\n借记卡\t437057\n我的第一本科学漫画书\t437058\nWindowsServer2008R2\t437059\n火开关\t437060\n第三支\t437061\n罗迪克\t437062\n美女杀手\t437063\n吉思汗\t437064\n梦想召唤王\t437065\n北京大学药学院\t437066\n臭鸡蛋\t437067\n2sinx\t437068\n咏怀古迹\t437069\n大港区\t437070\n三星C7\t437071\n新型农村社区\t437072\n城际分类网\t437073\n天坤\t437074\nHD6850\t437075\n大角湾\t437076\n非凡学院\t437077\n651路\t437078\n老当益壮\t437079\n180322\t437080\ncoim\t437081\nforza\t437082\n安尔乐\t437083\n皆可\t437084\n供暖费\t437085\n李耀文\t437086\n脑心舒口服液\t437087\n诱宵美九\t437088\n收展\t437089\ncgr\t437090\n马丽娜\t437091\n重庆消防\t437092\n高层民用建筑设计防火规范\t437093\n胡雪\t437094\nJardin\t437095\n时间计算器\t437096\n黑塞\t437097\n和关\t437098\n政治制度\t437099\nNGUI\t437100\n尸家\t437101\nWheezy\t437102\n毕业装\t437103\n固守\t437104\ncomplex\t437105\n酷站\t437106\n公历\t437107\n电子学\t437108\nalcatel\t437109\nIO口\t437110\n文乃\t437111\n钱磊\t437112\n娜\t437113\n隆众\t437114\n注册费\t437115\n淮阴区\t437116\n日照机场\t437117\n绿茶手机网\t437118\n氢醌\t437119\ninet_ntoa\t437120\n大作家的小老师\t437121\n纺织面料网\t437122\n银行联行号\t437123\n数段\t437124\n繋\t437125\nsoftap\t437126\n37级\t437127\n仙境传说RO\t437128\nAddams\t437129\n黄木香\t437130\n郎骑竹马来\t437131\naituiyx\t437132\n周韵\t437133\nES\t437134\nvidos\t437135\n中粒\t437136\n数字推理\t437137\n无锡小学\t437138\n18亿\t437139\n阳谷政府网\t437140\n铜门\t437141\n记要\t437142\n神镜\t437143\n塞浦路斯\t437144\n号啕大哭\t437145\n河北省人民医院\t437146\nArea\t437147\n中铁置业\t437148\n周芷莹\t437149\nEncode\t437150\n路西德\t437151\n漏检\t437152\n內地\t437153\n两情\t437154\n陶世龙\t437155\n秦淮河\t437156\n玉珠\t437157\n巨肚\t437158\n李少波\t437159\n健康界\t437160\n2500万美元\t437161\n雍华府\t437162\n领区\t437163\nTiesto\t437164\n幸田\t437165\n错婚\t437166\n二丁目\t437167\n诊疗费\t437168\n五大连池市\t437169\npvs\t437170\n17655\t437171\n穿皮\t437172\n瑞安·雷诺兹\t437173\n海南省工商行政管理局\t437174\neaysun\t437175\n上海市人才服务中心\t437176\n可圈可点\t437177\n滇池\t437178\n玛利亚凯莉\t437179\nCDRX4\t437180\n中南樾府\t437181\n曦瑶\t437182\n哈弗h6coupe\t437183\n余粮\t437184\n西双版纳傣族自治
州\t437185\n行政机关公务员处分条例\t437186\n1.2倍\t437187\n为母\t437188\n问卷调查\t437189\n微细\t437190\n东上\t437191\n欧欧\t437192\namazarashi\t437193\n皱缩\t437194\n豪斯曼\t437195\n王志安\t437196\n百米桩\t437197\n独立董事\t437198\nphpize\t437199\n水友们\t437200\n应用类\t437201\n硬齿面减速机\t437202\n辩别\t437203\n20套\t437204\n2型\t437205\n猛士\t437206\n每个女人\t437207\nOPPO手机助手\t437208\n八\t437209\n54届\t437210\n郷\t437211\n高瓷\t437212\n解连环\t437213\nbottleneck\t437214\ndecomposition\t437215\nJasper\t437216\n上海旌逸集团\t437217\n树莓苗\t437218\n炼气化\t437219\n给水排水设计手册\t437220\ntc4\t437221\n天天素材库\t437222\n小松未步\t437223\n苦菜\t437224\n云储币\t437225\n乐陵\t437226\n哪步\t437227\nStairs\t437228\n嚟\t437229\n沪江法语学习网\t437230\n国大\t437231\n当当读书\t437232\nTetris\t437233\n色谱柱\t437234\n联想r720\t437235\n高压柱\t437236\n老科协\t437237\n不倒翁\t437238\n講場\t437239\n单程\t437240\n氧疗\t437241\n大连化物所\t437242\n连累\t437243\n塔木\t437244\n兰州资源环境职业技术学院\t437245\n恋爱暴君\t437246\n手联弹\t437247\n57届\t437248\n盛风\t437249\n女王爷\t437250\n灵长\t437251\navailable\t437252\n家电话\t437253\n如梦初醒\t437254\n必有路\t437255\nint\t437256\n一平米\t437257\n减档\t437258\n黄沙百战穿金甲\t437259\n二十几\t437260\n采购方\t437261\n集资诈骗罪\t437262\n世界之窗论坛\t437263\n硝酸铁\t437264\n30座\t437265\n正交试验法\t437266\ninotify\t437267\nICPC\t437268\n安塞\t437269\n2016年1月19日\t437270\ngraduated\t437271\n100首\t437272\n奖励性\t437273\nauthorized_keys\t437274\nPowerPivot\t437275\n7天前\t437276\n王麻子\t437277\n一条腿\t437278\n火不火\t437279\n衣衫褴褛\t437280\n卫星云图天气预报_中央气象台卫星云图_气象卫星云图\t437281\noffe\t437282\n晒物\t437283\n笋子\t437284\n文身\t437285\n武圣\t437286\nglgoo\t437287\nBea\t437288\nDirected\t437289\n中华印刷包装网\t437290\n南浔区\t437291\n防爆阀\t437292\n爸爸去哪儿第五季\t437293\n陈旭东\t437294\n原辅材料\t437295\n衡水市教育局\t437296\n言叶之庭\t437297\n贺姓\t437298\n君正\t437299\nStyles\t437300\n尼康d90\t437301\n演讲赛\t437302\n第四军医大学西京医院\t437303\n中标公司\t437304\n梦幻西游礼包网\t437305\n中央政府\t437306\n张凡\t437307\ncontains\t437308\n艾敬\t437309\n连级\t437310\n3.2锐眼\t437311\n红海行动迅雷\t437312\n日轮山城\t437313\n杨红霞\t437314\n正英\t437315\ndicks\t437316\n闹钟\t437317\n自暴\t437318\n初吻\t437319\n气泡袋\t437320\n李健\t437321\n十四代\t437322\n直销人新闻中心\t437323\n294号\t437324\n
心心相印\t437325\n第121\t437326\n仙人谷\t437327\n课余\t437328\n宁天\t437329\nitil\t437330\n早乙女\t437331\n易经杂说\t437332\n新龙县\t437333\n平板\t437334\n阴环\t437335\n吃住行\t437336\n翻沉\t437337\n0400\t437338\n一生何求\t437339\n油焖笋\t437340\n中银保险\t437341\n合料\t437342\ntreaty\t437343\n忘了吧\t437344\n命中注定我爱你\t437345\nbecker\t437346\n柔性体\t437347\n施工图\t437348\n贵州交通职业技术学院\t437349\nuefi版\t437350\n姜艺媛\t437351\n一律\t437352\nContiTech\t437353\nJoining\t437354\n山东中医药高等专科学校\t437355\n梁涛\t437356\n18053期\t437357\n华夏保险\t437358\n社保代缴\t437359\nsunglass\t437360\n济南白癜风医院\t437361\n0.18\t437362\n宗亲\t437363\n草炭\t437364\nhulu\t437365\nOppo\t437366\n王者荣耀模拟器\t437367\n侠盗猎车圣安地列斯\t437368\nAcunetix\t437369\n双色球字谜\t437370\nCATIA\t437371\nnikon\t437372\n增值税专用发票认证\t437373\n新成员\t437374\nminecraft吧\t437375\n恒动\t437376\nhaircut\t437377\n乌兰苏轼\t437378\n大局\t437379\n荒野大镖客:救赎\t437380\n太古集团\t437381\n可达鸭\t437382\n七万元\t437383\n脐疗\t437384\n性骚\t437385\n嘉禾\t437386\n科博会\t437387\n粗\t437388\n商汤科技\t437389\nVueJS\t437390\n潇潇逸云\t437391\n武原街道\t437392\n桌面式\t437393\nsynapse\t437394\n1.5p\t437395\n唐国明\t437396\n哆可梦\t437397\n南翔古镇\t437398\n香根\t437399\n中建方程投资发展有限公司\t437400\n茶巾\t437401\nbul\t437402\n国威电话交换机\t437403\n汪寅仙\t437404\ngilbert\t437405\n盖农\t437406\nvivobook\t437407\n第31章\t437408\n94mm\t437409\n长沙政府\t437410\n训练有素\t437411\n糖果片\t437412\nrpush\t437413\n特多\t437414\n概率分布函数\t437415\nSQL2000\t437416\n拜尔斯道夫\t437417\n金六福\t437418\n金亚洲\t437419\n办病退\t437420\n黄山市人民医院\t437421\n拉花\t437422\n体验园\t437423\n注册消防工程\t437424\n殷战争之王\t437425\n零售银行\t437426\n谭嗣同\t437427\n中国科学院信息工程研究所\t437428\n批墙\t437429\n7654\t437430\n蓝带\t437431\nNara\t437432\n三九天\t437433\nv1.9\t437434\n一氧化碳报警器\t437435\n一万只\t437436\n亮照\t437437\n几厅\t437438\n麦子学院\t437439\n眼睛\t437440\n共栖\t437441\noppoa57t\t437442\nFMT\t437443\n三纠三促\t437444\n星睿\t437445\n2017|2017\t437446\n雨极\t437447\n阿Q正传\t437448\nmfs\t437449\n哲学与人生\t437450\n启迪协信\t437451\n缴费期\t437452\n蒂凡尼\t437453\n云影院\t437454\n辑览\t437455\n汽油发电机\t437456\n贫瘠之地\t437457\n圆上\t437458\n新墨西哥州\t437459\n1800万元\t437460\n放空\t437461\n几十年后\t437462\n皓月\t437463\n轻判\t437464\n炮口\t43746
5\n迷你鹦鹉鱼\t437466\n开放的指导意见\t437467\n石狮日报数字报\t437468\nNotorious\t437469\n华为麦芒6\t437470\n秀外慧中\t437471\n欲\t437472\n中国寰球工程公司\t437473\nSomachine\t437474\n2ch中文网\t437475\n质量好些\t437476\n百度云网盘/迅雷\t437477\n文昌市\t437478\n囊状\t437479\n治下\t437480\napron\t437481\n结婚证件照\t437482\n陈进\t437483\nallinone\t437484\nravenfield\t437485\n傲虎论坛\t437486\n来来来\t437487\n8年后\t437488\n钜派\t437489\nv-cloak\t437490\n养狐\t437491\n卡住\t437492\n头孢哌酮\t437493\nDoyou\t437494\n基槽\t437495\n现在开始\t437496\n版图\t437497\n病药\t437498\nzdm\t437499\nposture\t437500\n二十八年\t437501\n茶水\t437502\nshortage\t437503\n峭度\t437504\n527\t437505\n中型\t437506\nsviper\t437507\n油鱼\t437508\nEDA365电子工程师论坛网\t437509\n岳家嘴\t437510\nMavericks\t437511\nMIMI\t437512\nMIUI论坛\t437513\n时效性\t437514\n剪贴画\t437515\n均胜\t437516\n想笑\t437517\nroco\t437518\n十里坪\t437519\n春芽\t437520\n甲拌磷\t437521\n_聊城新闻网\t437522\n晶圆级\t437523\nvbscript\t437524\n湖南民族职业学院\t437525\n选倾品餐饮公司\t437526\n5千亿\t437527\n体験\t437528\n星战8\t437529\n氢氧化锌\t437530\n阳中\t437531\n安保公司\t437532\n洗肺\t437533\n古仔\t437534\n眼病\t437535\n2029\t437536\n三嗪\t437537\n薇尔莉特·伊芙加登\t437538\n春祭\t437539\n和讯科技\t437540\npacman\t437541\n霍根\t437542\njrpg\t437543\nnathan\t437544\n陀螺\t437545\n网站管理系统\t437546\n红古铜\t437547\nidol\t437548\nscrollto\t437549\n街道管\t437550\n中国交通技术网\t437551\n江苏农林职业技术学院\t437552\nrealistic\t437553\n果盒\t437554\ndblib\t437555\n中国科学院上海光学精密机械研究所\t437556\n南红\t437557\n东京市区\t437558\n蝙蝠侠:黑暗骑士\t437559\nwinServer\t437560\n汇天国际\t437561\n丰收路\t437562\ntb\t437563\n舍不得\t437564\n满怀\t437565\n球棍\t437566\n工作原理图\t437567\n广东化工\t437568\n雪鹰\t437569\n陈文锦\t437570\n富强\t437571\n希斯\t437572\n五毛钱\t437573\n烦恼歌\t437574\n品种\t437575\n冬小麦\t437576\n壶镇\t437577\nEmployee\t437578\n五阿哥\t437579\n佳能760D\t437580\n常温\t437581\nlearns\t437582\n皮脂腺\t437583\nstarbound\t437584\n吐丝联盟\t437585\n滕华涛\t437586\n雲端\t437587\n见色\t437588\n平沙\t437589\nmobage\t437590\n韩宏伟\t437591\n坎贝拉\t437592\n水调歌\t437593\n中山医学院\t437594\n大创\t437595\n乙亥日\t437596\n2秒\t437597\n组件\t437598\n擒敌拳\t437599\n家训\t437600\n一代天娇\t437601\n金石镇\t437602\n浅井\t437603\n东风雷诺汽车有限公司\t437604\n歌舞\t437605\nTec
hnik\t437606\n原生安卓\t437607\n血恋\t437608\n金思路\t437609\n卢氏县\t437610\n中法\t437611\nrental\t437612\n我的岁月\t437613\nconserved\t437614\n蓝村镇\t437615\n兰台\t437616\n县官\t437617\n讶\t437618\n为什么不走\t437619\nSeal\t437620\n支抗\t437621\n浙江红船干部学院\t437622\n力宝广场\t437623\n流雨\t437624\n2片\t437625\n赵德明\t437626\n永不回头\t437627\nbps\t437628\nfinereader\t437629\n宁海森林温泉\t437630\n叶武滨\t437631\n硝苯地平缓释片\t437632\n深呼吸\t437633\n桃夭\t437634\n生死未卜\t437635\n陈跃\t437636\nTocy\t437637\n华东师范大学闵行校区\t437638\n温州市司法局\t437639\n犁天\t437640\n叶雨\t437641\nphotonics\t437642\n壮志不言\t437643\n假\t437644\nvaild\t437645\n展览馆\t437646\nkiz\t437647\n长沙市轨道交通集团有限公司\t437648\n销假\t437649\nBox\t437650\n背入式\t437651\n天轨\t437652\n羊群\t437653\n石加\t437654\n好戏\t437655\n精装盒\t437656\n称霸\t437657\n骏景\t437658\n胎死腹中\t437659\n奇怪的搭档\t437660\nstuyou\t437661\n倾吐\t437662\n鱼馆\t437663\n不知\t437664\n产生式\t437665\n篱笆\t437666\n苹果酱\t437667\n形而上学\t437668\n三元正极\t437669\n6.5.10\t437670\n平湖市政府网\t437671\n林晓培\t437672\n日行灯\t437673\n監禁\t437674\n垂青\t437675\nkdm\t437676\n校园绝品狂徒\t437677\n郁金香花园\t437678\n邓家\t437679\ntarge\t437680\n中国人民人寿保险股份有限公司\t437681\n虚商\t437682\n遮挡物\t437683\n松山湖高新区\t437684\n欧朋浏览器\t437685\n水长\t437686\n丕\t437687\n钩端\t437688\n哈贝马斯\t437689\n类风湿\t437690\n张晓华\t437691\n罗马仕\t437692\nstunning\t437693\nMay\t437694\npingyin\t437695\n扎职\t437696\n生长纹\t437697\n罗林\t437698\nC1\t437699\nreac\t437700\n20170206\t437701\n樱桃木\t437702\n江苏省\t437703\n赤平投影\t437704\n美杯\t437705\n小骨\t437706\n英\t437707\nv9.7\t437708\n明诲\t437709\n总督府\t437710\n分解机\t437711\n化学篇\t437712\n真趣\t437713\n快手菜\t437714\nirradiation\t437715\n2.0类\t437716\n多贝\t437717\n技能书\t437718\nadminister\t437719\n大广高速\t437720\n气罐\t437721\n微机电\t437722\n语术\t437723\n自然资源\t437724\nSnowolf\t437725\n27首\t437726\n衣夹\t437727\nav吧\t437728\n生命一号\t437729\n小天狼星\t437730\n摔倒\t437731\n中交第二公路勘察设计研究院有限公司\t437732\nk19\t437733\n兽骨\t437734\nMiix\t437735\n狐假\t437736\n血泡\t437737\n20140912\t437738\n涂地\t437739\nlog4j\t437740\n43英寸\t437741\n硫黄\t437742\n新世界实验小学\t437743\n安捷伦\t437744\n氢燃料电池车\t437745\neverywhere\t437746\n麻腮风疫苗\t437747\n熔\t437748\n千王之王2000\t4
37749\n秦文君\t437750\n硅基\t437751\nCG集\t437752\n24万\t437753\n恶露\t437754\nluvian\t437755\npbo\t437756\n黑光网\t437757\nGYM\t437758\nAdvantech\t437759\nnjqhqlh\t437760\nYeoman\t437761\n前款\t437762\n地狱公寓\t437763\n幼处\t437764\nPMV\t437765\n富阳教育信息网\t437766\n第46话\t437767\n袁进\t437768\n凉山彝族自治州\t437769\n吴小瑞\t437770\n藤原\t437771\n沪江德语学习网\t437772\n3930\t437773\n万妖\t437774\n魁星阁\t437775\nEquation\t437776\n睫毛膏\t437777\n北国风光\t437778\n海地\t437779\n土鸡蛋\t437780\n楚雄州人民政府\t437781\nefd\t437782\nexternal\t437783\n框架梁\t437784\n深圳光明新区\t437785\n绿色学校\t437786\n聊天软件\t437787\n超级排料\t437788\n长江大学\t437789\n乌拉尔\t437790\n陶都\t437791\n艾贝尔\t437792\n净空\t437793\n极品三国志\t437794\n中国龙湾区政府\t437795\n小葱\t437796\n4300元\t437797\n7k7k小花仙\t437798\n冰晶石\t437799\n红葱\t437800\n1888\t437801\n乡土\t437802\nIMX6\t437803\n楚城\t437804\n贵人\t437805\n芦荟胶\t437806\n死记硬背\t437807\n西亚\t437808\n蒙骗\t437809\n四十六亿\t437810\n思想史\t437811\n10.31\t437812\n极品斩龙\t437813\n糜烂型\t437814\n天门政府网\t437815\n焦磷酸钠\t437816\nMatlab2016b\t437817\n瓷画\t437818\n沅江信息网\t437819\n梧州政府网\t437820\n回售\t437821\nTimeSpan\t437822\n强丰\t437823\n高和寡\t437824\n虾头\t437825\nPaddington\t437826\n阿佳妮\t437827\n济宁银行\t437828\n淬灭\t437829\n第四十一条\t437830\nY470\t437831\nVersys\t437832\n6700\t437833\n进式\t437834\n36步\t437835\n6.6.0\t437836\nアリス\t437837\n情人谷\t437838\n厨\t437839\n司鼓\t437840\n卧薪尝胆\t437841\n晚睡\t437842\n官庄工区\t437843\n碧之轨迹\t437844\n营火\t437845\n黄颖\t437846\n梅卡瓦\t437847\nSRS-XB10\t437848\n中远海控\t437849\nwlan0\t437850\n拉脱维亚\t437851\n无氟\t437852\n蒂尔茵\t437853\n复生\t437854\n子贤\t437855\nRT-AC5300\t437856\n043期\t437857\n党组\t437858\n张晓丹\t437859\n翼风\t437860\n大优\t437861\n从江\t437862\n衢州学院\t437863\n河北省纪委监察厅\t437864\n积少成\t437865\nsql语\t437866\n99.7\t437867\n百万公里\t437868\n一古\t437869\n二十一岁\t437870\n979\t437871\n二甲基二硫\t437872\n抢行\t437873\n阿拉巴斯坦\t437874\nineed\t437875\n决明\t437876\n哈罗小说网\t437877\nDHA藻油\t437878\n生物医学\t437879\n极品飞车15\t437880\n25V\t437881\nRADIO\t437882\n小哥\t437883\n建设项\t437884\n吊袜\t437885\n圆觉\t437886\n政务\t437887\n七十万\t437888\n新程\t437889\n广南县人民政府\t437890\n寒碜\t437891\n最多一天\t437892\n劳动参与率\t437893\n中建八局第
一建设有限公司\t437894\n挥别\t437895\nmuted\t437896\n直系亲属关系\t437897\n半套\t437898\nwas\t437899\n上合组织青岛峰会\t437900\n三维函数\t437901\n675\t437902\n兰花学院\t437903\nOrderedDict\t437904\n16名\t437905\nWin7/Win8\t437906\n10:30\t437907\n王宽\t437908\n1504\t437909\n希亚\t437910\n无面\t437911\n周扬\t437912\n狮门\t437913\n郁冬\t437914\nhsk\t437915\n27周\t437916\n凯士\t437917\n后台表\t437918\n双级\t437919\n森马\t437920\n赵之谦\t437921\n搭售\t437922\n20160520\t437923\n泡袋\t437924\n众智\t437925\n司马衷\t437926\n申通快运\t437927\n牙牙\t437928\n80后们\t437929\n哈尔滨服装城\t437930\n高桥小区\t437931\nid}\t437932\nrtthread\t437933\nseismic\t437934\n一斛珠\t437935\n韩立明\t437936\n张春霞\t437937\nImp\t437938\n农资\t437939\nsua\t437940\n吉祥彩\t437941\nSAT2\t437942\n出落\t437943\n乱斗西游\t437944\n福建海事局\t437945\n北京二六三企业通信有限公司\t437946\n僵尸粉\t437947\n一小步\t437948\n围堤\t437949\n痛斥\t437950\n极品飞车ol吧\t437951\n片式\t437952\n唐朔飞\t437953\n控规\t437954\n缶\t437955\n叶边\t437956\ndte\t437957\n永泉\t437958\n汽车违章查询网\t437959\n剪切应力\t437960\n中野亜梨沙\t437961\n铝电解槽\t437962\n节约里程法\t437963\n卑尔根\t437964\n拯救地球\t437965\n黑娃\t437966\n夹伤\t437967\ndn400\t437968\n9.13\t437969\n静养\t437970\n龙河\t437971\n梁波\t437972\n応援\t437973\n陈向群\t437974\n天理\t437975\n阴虱\t437976\n华融资产管理公司\t437977\n奥硝唑\t437978\n十二个月\t437979\n减函数\t437980\ndnf暗枪士吧\t437981\n余闻\t437982\n小甲\t437983\n英文系\t437984\n丁少华\t437985\n地参\t437986\nBrush\t437987\n中国医药物资协会\t437988\n干部任免审批表\t437989\n不插\t437990\n家庭作业\t437991\n郑芝龙\t437992\n物理学院\t437993\n夢幻\t437994\n无线投影仪\t437995\n锦泰保险\t437996\n杠子\t437997\n映美\t437998\n神偷奶爸1\t437999\n鱼笼\t438000\n汇集\t438001\n安设\t438002\ncnbc\t438003\nSixth\t438004\n白猿\t438005\n茶堂\t438006\n兰州妇产医院\t438007\n船洋\t438008\njennie\t438009\n小袋\t438010\n符凡迪\t438011\n暗黑破坏神3:夺魂之镰\t438012\n逆城市\t438013\n年轻的母亲\t438014\n石家庄陆军指挥学院\t438015\nKoolShare\t438016\n国家教育行政学院\t438017\n王婷\t438018\n至亲\t438019\n腾讯游戏帮帮\t438020\nPanda\t438021\n意欲何为\t438022\n胡斌\t438023\n谢青\t438024\n鬼畜眼镜\t438025\n480g\t438026\n仙侣奇缘\t438027\n试证\t438028\nACDsee\t438029\nFSH\t438030\n萨格里特\t438031\n天赐材料\t438032\n4科\t438033\n上海银行股份有限公司\t438034\n叶长江\t438035\n国有建用地使用权\t438036\n大册\t438037\n代理处\t43
8038\n热血虎卫\t438039\nfungal\t438040\n智乐园\t438041\n安通\t438042\n地胶板\t438043\n牛羊\t438044\n赚客之家论坛\t438045\nmainland\t438046\n定标\t438047\nFathers\t438048\n不留情\t438049\n室温\t438050\nɑ\t438051\n热性食物\t438052\n随遇而安\t438053\nname}\t438054\n豆米\t438055\n99XXXX\t438056\n深圳海岸城\t438057\n济源之窗\t438058\n齐心\t438059\nmaxwell\t438060\n单辉祖\t438061\nMushroom\t438062\ntextblock\t438063\n北四环\t438064\n5601\t438065\n暗黑龙\t438066\nPull\t438067\n端午\t438068\n采花贼\t438069\npconline\t438070\nhp5200\t438071\nthreshold\t438072\n5000部\t438073\n王遗风\t438074\n彩坛\t438075\n智家网\t438076\n艾米莉娅\t438077\n新巴尔虎右旗\t438078\n功率版\t438079\nword2007\t438080\n大呼\t438081\n用处\t438082\n乳畜\t438083\n凝心聚力\t438084\n第二批次\t438085\n受文\t438086\n沃德卡\t438087\n手机报\t438088\n元编程\t438089\n济青高速\t438090\n同昌\t438091\n陈秀丽\t438092\n黄小姐\t438093\nlynnLi\t438094\n张黎\t438095\nrpg\t438096\n11.14\t438097\n黄蓉\t438098\nSIAL\t438099\norCAD\t438100\ngeetest\t438101\n桃\t438102\nm\t438103\n壁管\t438104\n张喆\t438105\n魔兽诛仙\t438106\n淖\t438107\nMr\t438108\nfam\t438109\n魏军\t438110\nhd4600\t438111\nUSB-C\t438112\n籽骨\t438113\n睿字\t438114\neecs\t438115\nHomework\t438116\nsimons\t438117\n219号\t438118\n上海万科\t438119\n中国社科院研究生院\t438120\n签订\t438121\n体绘\t438122\n2.55\t438123\n亚洲天使童声合唱团\t438124\n华耀城\t438125\n2017年04月\t438126\n约修亚\t438127\n赫克托\t438128\nInnovator\t438129\nMortgages\t438130\n4.8折\t438131\n走入式\t438132\n台东县\t438133\n山堂\t438134\n上海大众汽车\t438135\n明渠流量计\t438136\nweb应用\t438137\n朱厚熜\t438138\nconfront\t438139\nMediterranean\t438140\nPlato\t438141\n地平线:黎明时分\t438142\nChildren\t438143\nsup2\t438144\n门部\t438145\nlimiting\t438146\n1678\t438147\n绊倒\t438148\n江苏公司\t438149\n嫪毐\t438150\n角逐\t438151\nOtaku\t438152\nsomatic\t438153\nHttpServer\t438154\n天平山\t438155\n李叔同\t438156\n金紫荆广场\t438157\n酥鱼\t438158\n后进生\t438159\n绿柱石\t438160\n潍\t438161\n20180105\t438162\n线画\t438163\n98页\t438164\n书怀\t438165\n兑现\t438166\n风度翩翩\t438167\n五哥\t438168\n迈博汇金\t438169\n万词\t438170\n黑势力\t438171\n建通\t438172\nlevis\t438173\nTTT\t438174\n睡走\t438175\n邪线\t438176\n下月初\t438177\n长\t438178\n李氏\t438179\n邀请函\t438180\n行
百里者半九十\t438181\n0118\t438182\n符号化\t438183\n英雄联盟直播_英雄联盟\t438184\n王海燕\t438185\nfavorable\t438186\n存款利率表\t438187\n奇观\t438188\nich\t438189\ntira\t438190\n工具箱\t438191\n1.9_\t438192\n檀宫\t438193\n证词\t438194\n伴\t438195\n20批次\t438196\n张目\t438197\n华驰\t438198\n果照\t438199\n面单打\t438200\n继续努力\t438201\n博学笃行\t438202\n八芯\t438203\n叶月儿\t438204\n阜阳市政府\t438205\n创战者try\t438206\ntorrent,magnet\t438207\nPrinted\t438208\n甬江\t438209\n三星w2013\t438210\n暖壶\t438211\nRPGv\t438212\n石机\t438213\n落水管\t438214\n微丝\t438215\n急诊\t438216\n美哉\t438217\n第24期\t438218\n战略轰炸机\t438219\n旋耕\t438220\nS19\t438221\n3410\t438222\ncc2018\t438223\n疏港\t438224\n1季度\t438225\n4.4.5\t438226\n胴\t438227\n马强\t438228\nLUG\t438229\nsampled\t438230\nprimitive\t438231\n佐仔志\t438232\n吕小雨\t438233\nApeosPort\t438234\n顺义区南\t438235\n暗物质\t438236\n简装\t438237\n妍诗美\t438238\n龙湖区政府\t438239\njingtai\t438240\n一级建造师资格考试\t438241\nT61\t438242\n雀儿山\t438243\n阿奎罗\t438244\n纵意\t438245\n陈珂\t438246\n泽城美雪\t438247\n梁浩\t438248\n暗黑破坏神2\t438249\n电子商务\t438250\n车马\t438251\n脚奴\t438252\n五相\t438253\nVCD\t438254\nlivy\t438255\n真可惜\t438256\n玉洁\t438257\n复讐\t438258\n减亏\t438259\n时镜\t438260\n三棵树\t438261\n涟水网\t438262\n24平米\t438263\n上海租房网\t438264\n独醒\t438265\n排污管\t438266\n秋冬\t438267\n伍若兰\t438268\n日炎\t438269\nvanzol\t438270\n32K\t438271\n悬槌堡\t438272\n雄州镇\t438273\n第32号\t438274\nspidev\t438275\n茨城\t438276\n帐上\t438277\nviso\t438278\nx-pack\t438279\n印象笔记帮助中心\t438280\n霉干菜\t438281\n喷火兵\t438282\n髯\t438283\n中交一公院\t438284\n八六学潮\t438285\n祟\t438286\n关门大吉\t438287\n中标公示网\t438288\n配料\t438289\n青花瓷\t438290\n彭泽\t438291\n心怡日记网\t438292\n济慈\t438293\n骷髅\t438294\n34项\t438295\n04月22日\t438296\n李龙\t438297\nBurNIng\t438298\n中原经济区\t438299\n汪莽\t438300\n通告\t438301\n美思\t438302\n黑子哲\t438303\n4900\t438304\nCaoPorn\t438305\n物理治疗\t438306\n黎美娴\t438307\n无限好\t438308\n备孕\t438309\n吴勉\t438310\n捕杀\t438311\n3970\t438312\n杀虫剂\t438313\n少年维特的烦恼\t438314\nRz\t438315\n大操\t438316\n5000条\t438317\n福建招商网\t438318\n哨位\t438319\n通讯世界\t438320\n刘海波\t438321\nEcole\t438322\n中国艺术研究院\t438323\nstrokes\t438324\nperry\t438325\n不锈钢复合管\t438
326\nADAMS\t438327\nPostbird\t438328\n澳大利亚大学\t438329\n剑雨塔防三国志\t438330\n锦荣\t438331\n沉吟\t438332\n自助缴费机\t438333\nSHELL\t438334\n红子\t438335\nCC漫画网\t438336\n42000\t438337\n主送\t438338\n宇帷\t438339\n最高\t438340\n开荒者\t438341\n760li\t438342\nOTN\t438343\n成人色\t438344\n重疾\t438345\n终极一班2\t438346\nkinect2\t438347\n唐总\t438348\n新威尼斯\t438349\ncwc\t438350\nexb\t438351\n蜜桃成熟时3\t438352\n玲音\t438353\nwork\t438354\n10.29\t438355\ngachip\t438356\n示波器\t438357\n货人\t438358\n绿化植物\t438359\n16.98万元\t438360\n血宫\t438361\n近场\t438362\n125万\t438363\n双子座流星雨\t438364\nF3\t438365\n就医指南_医网\t438366\n阿米尔\t438367\nslug\t438368\n580元\t438369\nGM论坛\t438370\n第一件\t438371\n消音版\t438372\n脑脊液\t438373\n心角\t438374\n网城\t438375\n秦心\t438376\n梅花生物\t438377\nNeu\t438378\n瑞士央行\t438379\ntomeet\t438380\n豆蔻\t438381\n蛆甲\t438382\n李德胜\t438383\n消防工程\t438384\n爱若\t438385\n芙蓉西路\t438386\n纯享\t438387\n克兰\t438388\n随葬\t438389\n长阳音乐节\t438390\n集装箱号\t438391\n李多熙\t438392\n唯舞\t438393\n透镜\t438394\n平生\t438395\ndnn\t438396\n政宗君的复仇\t438397\n井柏然\t438398\n钓鱼伞\t438399\n4月4日\t438400\n攻心计\t438401\n176张\t438402\n松阳县\t438403\n水晶dj网\t438404\n掌上银行\t438405\n党山\t438406\n裸眼\t438407\n蚂蝗\t438408\n济川药业\t438409\ndiseases\t438410\nvip版\t438411\n1074\t438412\n国信房地产信息网\t438413\n移动定制机\t438414\n线程通信\t438415\n径迹\t438416\n中国山西政府\t438417\n恒源祥\t438418\n玛丽苏\t438419\nPicPick\t438420\n潜伏式\t438421\nkylie\t438422\n沙扎\t438423\n旗标\t438424\n星海音乐厅\t438425\n郑州消防\t438426\n雷文\t438427\n业之峰装饰公司\t438428\n生物质电厂\t438429\n霍曼\t438430\nwin19\t438431\n忍受\t438432\n色相\t438433\nIE浏览器\t438434\n鬼吹灯2\t438435\n湖州市第一人民医院\t438436\n剑灵灵\t438437\n济南市民政局\t438438\n美的空调吧\t438439\n阻遏\t438440\n灵犀语音助手\t438441\n平安银行信用卡\t438442\n祖安\t438443\n卸任\t438444\nwww.13067.com\t438445\nzhaoshang\t438446\n10.33\t438447\n上海网\t438448\n冰山泥\t438449\n开书\t438450\n纠风\t438451\n反恐精英ol\t438452\nNAMM\t438453\n凯尔达\t438454\n农场\t438455\n首旅如家\t438456\n工业库\t438457\n施正荣\t438458\noutlook收件箱\t438459\nhxd\t438460\n弓丝\t438461\n暗自\t438462\n七星剑\t438463\n汽化\t438464\nJRoger\t438465\n莫高\t438466\n阜阳市人民医院\t438467\nEasonJim\t438468\nWin7电脑蓝屏\t438469\nbis
tro\t438470\nMXD\t438471\nsetuna\t438472\n600525\t438473\n麦芬\t438474\n已用\t438475\n契约之战\t438476\n量子计算机\t438477\n千灯湖\t438478\n5.07\t438479\n撸管飞\t438480\n上海工会\t438481\n家财险\t438482\n一杯酒\t438483\n金村\t438484\nmovielist\t438485\n工业吸尘器\t438486\nInvalid\t438487\n活页本\t438488\n4.04\t438489\n太厉\t438490\n赵全营镇\t438491\n东博会\t438492\nCIFAR-10\t438493\n附着力\t438494\nXC60\t438495\n重新开始\t438496\n赞美公司\t438497\norganism\t438498\narray_unique\t438499\n蓝夜\t438500\n佳能5d4\t438501\nY2\t438502\n丽江拉市\t438503\n七叶神安滴丸\t438504\n流鼻水\t438505\nCayenne\t438506\n1050元\t438507\n600级\t438508\n鲜枣\t438509\nv1.4\t438510\n奈曼\t438511\n苏州消防\t438512\nSQL2008\t438513\n滨崎真绪\t438514\n转债\t438515\n才情\t438516\n绘本故事\t438517\n2y\t438518\n孟尝\t438519\nidbd\t438520\n仙逆医妃\t438521\nR19\t438522\n签率\t438523\n王们\t438524\n若此\t438525\n九极\t438526\nshiyuan\t438527\n冒牌天神2\t438528\n放晴\t438529\n静止\t438530\ncwm\t438531\npcie3.0\t438532\nDDMS\t438533\nm8000\t438534\n宝莱纳\t438535\n10^\t438536\ncdma\t438537\nascx\t438538\n金泰浩\t438539\nWIN键\t438540\n新建镇\t438541\n十种\t438542\n393\t438543\n工日\t438544\n周口职业技术学院\t438545\n徐家大院\t438546\n笔式\t438547\n工业烘干机\t438548\n球状闪电\t438549\n3图\t438550\n杨非\t438551\n9277\t438552\n零次方\t438553\n收容\t438554\n二个\t438555\n税意\t438556\n逛一逛\t438557\n郑栅洁\t438558\n阿里百川\t438559\nBACKUP\t438560\njbpm4.4\t438561\n欧品\t438562\n镇咳\t438563\nSatan\t438564\nfree.3v.do\t438565\nSoothing\t438566\n孙喜双\t438567\n三百多年\t438568\n安杰拉\t438569\nD3H\t438570\nBreitling\t438571\nwwr吧\t438572\nHusky\t438573\n山梨醇\t438574\n张家界国家森林公园\t438575\n软磁\t438576\n璐\t438577\nmyb\t438578\n远光\t438579\n正刊\t438580\n标峰\t438581\nxiaobai\t438582\n林动\t438583\n李志文\t438584\n占卜师\t438585\n超巨\t438586\n胡楠\t438587\n立波\t438588\n杭州市民中心\t438589\n学工办\t438590\n阿胜\t438591\n江西电教\t438592\n交响曲\t438593\n海亮股份\t438594\n风暴\t438595\n5月14日\t438596\n邮政吧_\t438597\nted\t438598\n第四面\t438599\n主套\t438600\n灶性\t438601\niOS9\t438602\n环氧板\t438603\n田文\t438604\nMySQL分库分表\t438605\n半碗\t438606\n筑龙水利工程\t438607\n股权分置改革\t438608\nnaval\t438609\nTerminals\t438610\nTran\t438611\n东环大道\t438612\n曹征\t438613\n护士站\
t438614\n敬上\t438615\n庙子湖\t438616\n嘉康\t438617\n托马斯和他的朋友们\t438618\n渡濑晶\t438619\n梨子网\t438620\n20131122\t438621\n为啥\t438622\nSTATS\t438623\n中特马\t438624\ngetResource\t438625\nopdivo\t438626\n悦荟广场\t438627\n0905\t438628\n台湾岛旅游网\t438629\n代勇\t438630\n克罗米\t438631\n谷大白话\t438632\nleboy\t438633\n热泵型\t438634\n善解\t438635\n浙江新华书店\t438636\n药棉\t438637\n迦娜\t438638\n广珠城轨\t438639\n天书世界\t438640\n最后的神迹\t438641\n含羞\t438642\n暴跳如雷\t438643\nPoke\t438644\nramaxel\t438645\n昆明理工大学研究生院\t438646\n李保田\t438647\n搓衣板\t438648\n高圣远\t438649\n盗窃\t438650\n幸村精市\t438651\n加盟网|91jm.com\t438652\n打太极\t438653\nMaroon5\t438654\n上海警备区\t438655\n流石\t438656\n胃阴\t438657\n广东省技师学院\t438658\nLogos\t438659\n云南人民出版社\t438660\n天生是优我\t438661\n衣杆\t438662\n学唱歌\t438663\n散失\t438664\n部级\t438665\n继承权\t438666\n隐隐作痛\t438667\ntpt\t438668\nel-upload\t438669\n重农\t438670\nBIC\t438671\n双牛\t438672\n广州婚纱摄影\t438673\n福袋\t438674\n平纱\t438675\n杏花楼\t438676\n在点\t438677\njilupian\t438678\n平穷\t438679\nhao104\t438680\n微信营销\t438681\n网吧系统\t438682\nOu\t438683\n1342\t438684\nstrick\t438685\n132号\t438686\n李淑\t438687\nintuitive\t438688\n1309\t438689\n余庆\t438690\n灵妖\t438691\nmos管\t438692\nAssistiveTouch\t438693\n凯瑞\t438694\n风云2\t438695\n佛山市发展和改革局\t438696\n公心\t438697\n叩击\t438698\n实验心理学\t438699\nWEB-MKV\t438700\nMovement\t438701\nSleeve\t438702\n7月14日\t438703\n东兴路\t438704\n树化\t438705\ntimeit\t438706\n百度云搜索\t438707\nLANEIGE\t438708\n董倩\t438709\nQuanzhou\t438710\nVirgin\t438711\n扭计\t438712\n建筑工程施工质量验收统一标准\t438713\n三里湾\t438714\n欧诺】长安欧尚欧诺2017款高清图\t438715\n镇江一中\t438716\n自动喷水灭火系统设计规范\t438717\n566号\t438718\n穆斯林的葬礼\t438719\n16.1%\t438720\nomen\t438721\n御温泉\t438722\n内链\t438723\n浙江教育出版社\t438724\n越溪\t438725\n多伦路\t438726\nJockey\t438727\n直立型\t438728\n医案\t438729\nPVC塑料\t438730\nzbrush\t438731\nliming\t438732\n综合测试仪\t438733\nHyperWorks\t438734\n不定义\t438735\n丝丽\t438736\n斜槽\t438737\n运动馆\t438738\n挟持\t438739\ninnovator\t438740\n重返20岁\t438741\n乌梁素海\t438742\n中心血站\t438743\n家朋\t438744\n罗汉\t438745\n东方幻想乡\t438746\n战台风\t438747\n摘机\t438748\nscreenflow\t438749\n伤感带\t438750\n郎溪\t438751\n非齐
次\t438752\n0621\t438753\n开往\t438754\n陈纳德\t438755\n鸭\t438756\n报障\t438757\njupiter\t438758\n百合乡\t438759\n理士\t438760\n管弦乐\t438761\nstreamwriter\t438762\nmac地址\t438763\n收费硕\t438764\n解气\t438765\n始祖山\t438766\n巾帼\t438767\n头盖\t438768\n绰绰\t438769\n利通广场\t438770\n2NE1\t438771\n一世纪\t438772\n愈创\t438773\n中国潮州政府\t438774\n归元寺\t438775\n江疏李思思\t438776\n京西时报\t438777\n一键宏\t438778\n盐酸普萘洛尔片\t438779\nKEPServerEX\t438780\n什么目\t438781\n湘雅\t438782\n西南交通大学峨眉校区\t438783\n渔猎\t438784\n艾地苯醌片\t438785\n令牌环网\t438786\n青莲\t438787\n生小孩\t438788\nzh89233\t438789\n狂暴\t438790\nDock工具栏\t438791\nWAV格式\t438792\n大起底\t438793\n新疆维吾尔自治区科学技术厅\t438794\nb型\t438795\n邮件\t438796\nshipyard\t438797\n淡绿\t438798\n花之城\t438799\n洞主\t438800\n烫\t438801\n黑暗体操\t438802\n以太坊代币\t438803\n风华绝代\t438804\nPROTUES\t438805\n西溪园区\t438806\nH+\t438807\n1.4e\t438808\n黄晶\t438809\n镰刀龙\t438810\n本掉率\t438811\n废黜\t438812\nHose\t438813\n数四\t438814\n1818\t438815\nNBA2K18+NB\t438816\n7万吨\t438817\n第三方托管\t438818\nHunker\t438819\n中国银行信用卡中心\t438820\n牛油果\t438821\n6分米\t438822\n归隐\t438823\n5次方\t438824\n久留\t438825\n华师一\t438826\n宫磊\t438827\n北斗网\t438828\n三匹马\t438829\n郑州银行\t438830\nurbanization\t438831\n流量券\t438832\nprm\t438833\n荡浪\t438834\n诅\t438835\n西沟\t438836\n新鞋\t438837\n女间谍\t438838\n暗黑骑士\t438839\n市教体局\t438840\n市卫生计生委\t438841\n收藏之路\t438842\n片面性\t438843\n周晴\t438844\n整版钞\t438845\n赵琦\t438846\n集合帖\t438847\n智能时代\t438848\n開発\t438849\n米津玄師\t438850\n杨永\t438851\n高德地图车机\t438852\n熊去氧胆酸片\t438853\n红警2尤里的复仇\t438854\n卡片式\t438855\nEVENT\t438856\n张铭\t438857\n线杯\t438858\n20171012\t438859\n颍州区人民政府\t438860\n广州市第八人民医院\t438861\n黑脸\t438862\n村规\t438863\n长沙市区\t438864\naddAll\t438865\n东兰县人民政府\t438866\n姜小牙\t438867\nAliens\t438868\n崇文路\t438869\n快乐酷宝3\t438870\n耳闻\t438871\n运动机\t438872\nsanxing\t438873\n2009年9月\t438874\n澳交所\t438875\n11种\t438876\n修葺\t438877\n第三单\t438878\n光电\t438879\n上海商铺网\t438880\n琼州海峡跨海大桥\t438881\n吡喃\t438882\n中船科技\t438883\n天籁村\t438884\n往生\t438885\n喉炎\t438886\n黝黑\t438887\nARIA\t438888\n天通苑北站\t438889\n毒镖\t438890\n芮城\t438891\n二手车评估师\t438892\n高沟镇\t438893\n雀替\t438894\n极米科技\t43889
5\n建行个人网上银行\t438896\n摩玛\t438897\n第43页\t438898\nol3\t438899\nsticky\t438900\n石坝\t438901\n轻食沙拉\t438902\n林依晨\t438903\n玛茵\t438904\n滨河森林公园\t438905\n难吃\t438906\n紫爱百强\t438907\n中共山东省委党校\t438908\n中宁\t438909\n窓\t438910\nShanxi\t438911\n择校学\t438912\n修德\t438913\n第四极\t438914\n吉尔莫\t438915\n2018年04月27日\t438916\n仓鼠笼\t438917\nPellet\t438918\nLookfantastic\t438919\n碰撞器\t438920\n女男\t438921\n丝足会\t438922\n125ml\t438923\n看一眼\t438924\n3.0.4\t438925\n中信建投证券\t438926\nInvitrogen\t438927\nRIME\t438928\n四十年前\t438929\n大真\t438930\n潼关怀古\t438931\n李澄\t438932\n中男\t438933\n10.10.3\t438934\n835路\t438935\n张家声\t438936\nlipid\t438937\nof在线翻译\t438938\n杨沫\t438939\n臀中肌\t438940\n风投公司\t438941\n泗泾镇\t438942\n企业管理器\t438943\n张俪\t438944\nmas\t438945\n虎门吧\t438946\n余淮\t438947\n7.1\t438948\n@value\t438949\n无油真空泵\t438950\n封禁\t438951\n巢氏\t438952\n雷锋侠\t438953\nBershka\t438954\n2DS\t438955\n1.8.1\t438956\n垂丝柳\t438957\n一弹\t438958\n病\t438959\n梦幻群侠传5\t438960\n接收机\t438961\n纯色\t438962\n卡多\t438963\n顺拐\t438964\n四十五天\t438965\n一饮而尽\t438966\n马岭河峡谷\t438967\n建德市\t438968\n尽情地\t438969\n陪审员\t438970\n再续前缘\t438971\n金平湖\t438972\nLogin\t438973\n新交通法\t438974\n多伦\t438975\n鬼哭狼嚎\t438976\nヒ\t438977\n开闭器\t438978\n乌鸫鸟\t438979\n经\t438980\n广东路\t438981\n2017年清明节\t438982\n两分钟后\t438983\n扩张性\t438984\n半角转换\t438985\n转点\t438986\n清华大学机械工程系\t438987\nPhotoshopCS6\t438988\n教育险\t438989\n大自\t438990\nH1B\t438991\n扫地僧\t438992\n心慌手抖\t438993\n心脏神经官能症\t438994\n材料商\t438995\n点穴\t438996\n香槟花园\t438997\n看得出来\t438998\n中国人民解放军军事科学院\t438999\n奇珍异宝\t439000\n下水裤\t439001\n水密\t439002\n两堵\t439003\nSEW减速机\t439004\n极危\t439005\nREC\t439006\n九周年\t439007\ncrt-stdio-l1-1-0.dll\t439008\nGarson\t439009\n秘书长\t439010\n牛津小学\t439011\nwhv\t439012\n01卷\t439013\n9000分\t439014\n千万条\t439015\ncreepypasta\t439016\n字型\t439017\n攻控\t439018\n盗门\t439019\nEGO\t439020\n260号\t439021\n杰克曼\t439022\n负雪\t439023\n班次\t439024\n后传\t439025\n颜王\t439026\nhuade\t439027\n南关\t439028\nclea\t439029\n德道\t439030\n王建波\t439031\neboot\t439032\n苏南\t439033\nmidiplus\t439034\n水杨酸\t439035\nCharlotte\t439036\n闲庭\t439037\nmammalian\
t439038\nmklink\t439039\n鹿城区\t439040\n条石\t439041\n老婆们\t439042\n聚氨酯材料\t439043\n优作\t439044\n蓄能器\t439045\n易客\t439046\n裂解液\t439047\n黑泥\t439048\n营口站\t439049\n扬子江药业\t439050\n108期\t439051\n方兴地产\t439052\n猎聘网\t439053\n饥餐\t439054\nexp\t439055\n核心竞争力\t439056\n第六集\t439057\nrunas\t439058\n车臣共和国\t439059\n张衍\t439060\n髓核\t439061\n译制片\t439062\n造价指数\t439063\n600次\t439064\nCone\t439065\n25kw\t439066\n含苞待宠\t439067\n机翻\t439068\n双数\t439069\nEhcache\t439070\nECT\t439071\n自白_\t439072\nKyoto\t439073\n守护石\t439074\n舞曲\t439075\nven\t439076\n十三届二次\t439077\n申崇\t439078\n治理\t439079\n青云街道\t439080\n幻灭\t439081\n向来\t439082\n演奏\t439083\n问题\t439084\n椰子粉\t439085\n名誉侵权\t439086\n中奖者\t439087\n170408\t439088\n富兴\t439089\n索通\t439090\nnono\t439091\n李光荣\t439092\n精斑\t439093\n玦\t439094\nAkamai\t439095\nVIVOX21\t439096\n译语\t439097\nmod吧_\t439098\n速达软件_管家婆\t439099\n乡医\t439100\ntinkphp\t439101\n全阳\t439102\n米号\t439103\n阿斯顿马丁\t439104\nX+1\t439105\n火流星\t439106\n妈妈宝贝\t439107\n哈尔滨工业大学深圳研究生院\t439108\n肌苷片\t439109\n韩雪\t439110\nGOUMRO\t439111\n原发性肝癌\t439112\n凸轮控制器\t439113\n柴可夫斯基\t439114\n小升初派位\t439115\n瓦尔特保卫萨拉热窝\t439116\n烫衣板\t439117\n何享健\t439118\n平衡计分卡\t439119\n太古至尊\t439120\n海权论\t439121\n国家级经济技术开发区\t439122\n浙里\t439123\n公文函\t439124\nswagger\t439125\n汉西\t439126\nADXL345\t439127\n中国好声音4\t439128\n75厘米\t439129\n中国甘肃法院\t439130\n2421\t439131\n人教新课标_小学\t439132\n满档\t439133\n胸摸\t439134\n牛逼\t439135\nMOA\t439136\n舱体\t439137\n彭祖\t439138\n诺宝\t439139\nApex\t439140\njj比赛吧\t439141\n绡\t439142\n天下无魔\t439143\n阳光控股\t439144\n三环\t439145\n刘原\t439146\n在线云\t439147\n湖南中烟工业有限责任公司\t439148\n想来\t439149\n甜汤\t439150\n定国\t439151\n卡司\t439152\n小米3s\t439153\nAquarius\t439154\n军产房\t439155\n信天游\t439156\n倾城时光\t439157\n2727\t439158\n意大利文\t439159\n淄博市人民政府办公厅\t439160\ncontract\t439161\n毕志飞\t439162\n思政\t439163\n山东代表团\t439164\n芭蕉树\t439165\n拟合函数\t439166\n蒋姗倍\t439167\nLES\t439168\n争宠\t439169\n地狱天使\t439170\n鼎业\t439171\nCSF\t439172\n躲开\t439173\ndamper\t439174\n未经过\t439175\n郭尔罗斯\t439176\n秋篇\t439177\nDDoS攻击\t439178\nOf\t439179\n62906291\t439180\n主盘\t439181\n无缝轮播\t439182\n中共中
央纪委\t439183\n汉口学院\t439184\n取长补短\t439185\n湟中\t439186\n历史使命感\t439187\n蛋挞皮\t439188\n一段情\t439189\n洗心革面\t439190\n肉燕\t439191\n龚宇罗马\t439192\nAyaka\t439193\nCnzz\t439194\n阅读\t439195\n高級\t439196\nGPUs\t439197\njc\t439198\n900吨\t439199\n烂_\t439200\n覆盖度\t439201\nisenlin\t439202\n机械人\t439203\narduino\t439204\n美妈问问团\t439205\n咖谧\t439206\n溶胶凝胶法\t439207\n摄政王\t439208\n压垮\t439209\n佛曲\t439210\n食伤\t439211\n1.5亿美元\t439212\nrunoob\t439213\n16品\t439214\n4_\t439215\n蚌埠市\t439216\n谢朓\t439217\n360邮票网\t439218\n九如山\t439219\nxbla\t439220\n纳森\t439221\n谋利\t439222\n杨涵\t439223\n矿池\t439224\nVMA\t439225\n试运行\t439226\nlied\t439227\n丁祥恩\t439228\n囧\t439229\n同页\t439230\n买卖法\t439231\n绵阳市第三人民医院\t439232\n陈行甲\t439233\n66万千瓦\t439234\n两件\t439235\n150英寸\t439236\n星之海洋5\t439237\ncontaining\t439238\n国际域名\t439239\n叶逢春\t439240\n水气\t439241\n粉尘\t439242\n时间值\t439243\n加文\t439244\n瑞康医药\t439245\n台达变频器\t439246\nLogistics\t439247\n妇炎洁\t439248\nChiphell\t439249\n战神36计\t439250\n天平天国\t439251\n智仟汇\t439252\n制冷工\t439253\n10斤\t439254\n裤长\t439255\n罗家镇\t439256\n造物集\t439257\n旅信\t439258\n衝撃\t439259\n工业炉\t439260\n3群\t439261\n鲁办\t439262\n漳州火山岛\t439263\n应昊茗\t439264\nHAKKO\t439265\n出书\t439266\n求签\t439267\nCai\t439268\ni-1\t439269\n冲击试验机\t439270\n保亭黎族苗族自治县人民政府\t439271\n六位一体\t439272\nHTMLcss\t439273\nKelvin\t439274\n红丝绒蛋糕\t439275\n房产交易税\t439276\n工本\t439277\n中国话\t439278\n思科公司\t439279\n千单\t439280\n氨基甲酸\t439281\n狼烟\t439282\npoop\t439283\n始乱终弃\t439284\n世界卫生日\t439285\nwebAPI\t439286\n海事\t439287\n断袖\t439288\nPOS机\t439289\n江苏省如东高级中学\t439290\n海南人\t439291\n中国金币总公司\t439292\n保教\t439293\n南京市绿化园林局\t439294\n波斯人\t439295\nfiba\t439296\n刘辰翁\t439297\n螳螂\t439298\n闻风\t439299\nstm32f107\t439300\n李森祥\t439301\n贵阳六中\t439302\n迅雷\t439303\nlazyload\t439304\n拽\t439305\n咔嚓\t439306\n早慧\t439307\nLinux内核模块\t439308\n引博\t439309\n种草\t439310\n抚河\t439311\n射频消融\t439312\n很干净\t439313\n古王\t439314\n维和步兵营\t439315\n浙江图书馆\t439316\n燃料油\t439317\n绝宠小媳妇\t439318\n军包\t439319\nDescriptor\t439320\nGTF\t439321\n739\t439322\n密码柜\t439323\n张店\t439324\ndmd\t439325\n无期徒刑\t439326\n99步\t439327\n吉林市公安局
\t439328\n第10个\t439329\n20150718\t439330\n陈明亮\t439331\n骁龙821\t439332\n夏染雪\t439333\n范文\t439334\n山径\t439335\n四心\t439336\n诺克萨斯吧\t439337\n外理\t439338\n鸿毛\t439339\n形容词\t439340\n全屋定制\t439341\n产品展\t439342\n光轮\t439343\n气动工具\t439344\n红山郡\t439345\n九死还魂草\t439346\n78路\t439347\n尼姑庵\t439348\nweighting\t439349\n果酒\t439350\n南阳村\t439351\ngtr\t439352\n属下\t439353\n敲门\t439354\n驿道\t439355\nM2.5\t439356\n青少儿英语学习网\t439357\n霍普金斯大学\t439358\n世奇\t439359\n乐友孕婴童\t439360\n阿黑\t439361\n清朝末年\t439362\n文化创意园\t439363\n一个亿\t439364\n流动资产\t439365\n3051\t439366\nMirageFireFox\t439367\n超过20年\t439368\n票据宝\t439369\n和平生活网\t439370\n水卜\t439371\nstudio2017\t439372\n曾几何时\t439373\ntropic\t439374\nFuji\t439375\nNafion\t439376\n相架\t439377\n星爵\t439378\nDevC++\t439379\n金科地产\t439380\nWholesale\t439381\n2月27日\t439382\n影业\t439383\n小木偶的故事\t439384\n站在草原望北京\t439385\n雪千寻\t439386\n迟爱\t439387\n会计论坛\t439388\n猎萝卜\t439389\ntransferring\t439390\n京尊达\t439391\ndun\t439392\nigxe\t439393\nREACH\t439394\n史蒂芬·戴德利\t439395\n搞事\t439396\n体内\t439397\n榆次老城\t439398\nm15\t439399\n十篇\t439400\n月龄\t439401\n看云起\t439402\n珍宝\t439403\n阿特金斯\t439404\nPKPM\t439405\nzazhi\t439406\n烈如歌\t439407\n游斗\t439408\n高韶青\t439409\n安徽中烟工业有限责任公司\t439410\n马凳筋\t439411\n楚生\t439412\n天使的翅膀\t439413\n田间\t439414\n0878\t439415\njix\t439416\n赴宴\t439417\n秘封\t439418\n新春走基层\t439419\n爆粉\t439420\ngin\t439421\n沈浩\t439422\n灾厄\t439423\n72018款\t439424\n维卡币\t439425\nrtsp\t439426\n澎湃新闻\t439427\n高盖\t439428\n李立三\t439429\n本相\t439430\n绝世毒妃\t439431\n通用型\t439432\n警示语\t439433\n塘口\t439434\n无忧保\t439435\nproposals\t439436\n广宁县\t439437\n000625\t439438\n对缝\t439439\n酷派大神f1\t439440\n伦敦商学院\t439441\n拼房\t439442\n泥水\t439443\n科苑\t439444\n盘螺\t439445\n南方联合产权交易中心\t439446\nEX900\t439447\n2015.07\t439448\ncollecti\t439449\n植球\t439450\n李峤\t439451\n图线\t439452\n2018年1月11日\t439453\n百年史\t439454\n脱落\t439455\n红达摩\t439456\nFM\t439457\n28点\t439458\n就事论事\t439459\n3.5.4\t439460\n13枚\t439461\n纳税\t439462\n微橙\t439463\n1453\t439464\n中国科学院烟台海岸带研究所\t439465\ntype.2.4\t439466\n博雅干细胞\t439467\nbshare\t439468\n1000ml\t439469\n1072\t4394
70\n普发\t439471\n7.1声道\t439472\n唐二代\t439473\n牛士\t439474\n安可\t439475\n捕猎机\t439476\n普宁二中\t439477\naml\t439478\n在希望的田野上\t439479\n标准液\t439480\n路尼亚战记\t439481\n癸亥\t439482\n足疣\t439483\n冷静期\t439484\nJavbooks\t439485\n含水量\t439486\n邪念体\t439487\nshein\t439488\n支玉恒\t439489\n耶路撒冷\t439490\n东建\t439491\n4CM\t439492\n千机网\t439493\n文华财\t439494\n|文泰\t439495\n开魔能\t439496\n先知先觉\t439497\n5559\t439498\nKline\t439499\n坐卧\t439500\n两地\t439501\n接外\t439502\n俊园\t439503\n不白\t439504\n药力\t439505\n胡蜂\t439506\nRNG战队\t439507\n枫华\t439508\n初七\t439509\nlinger\t439510\n台州网络电视台\t439511\n【王\t439512\n10l\t439513\n田汉\t439514\nWebTours\t439515\n三十集\t439516\n润华年\t439517\n青帮\t439518\n母虎\t439519\n仙琚\t439520\n湖滨街道\t439521\n崇高石\t439522\n县环保局\t439523\njr\t439524\nゼロ\t439525\n土压\t439526\n词客\t439527\n慎终追远\t439528\nv-else\t439529\n义兄弟\t439530\n珠海西区\t439531\nCMOS传感器\t439532\n仁川亚运会\t439533\n家庭型\t439534\n五点\t439535\nExploration\t439536\n18套\t439537\n止逆\t439538\n作为\t439539\n无刷电机\t439540\n伊核\t439541\n防尘垫\t439542\nindian\t439543\n群力\t439544\n沈湘\t439545\n毛猪\t439546\n固特\t439547\n15支\t439548\n6px\t439549\n四湖\t439550\n来袭\t439551\nyst\t439552\n海洋性\t439553\nStage1st\t439554\n椒盐虾\t439555\nBigSam78\t439556\n花红片\t439557\n607\t439558\n谜题\t439559\ntjw\t439560\n九州缥缈录\t439561\n累坏\t439562\nIII\t439563\n事故现场\t439564\n正当红\t439565\n高新区国税局\t439566\n五十几岁\t439567\n轮径\t439568\n宁沪\t439569\nEXT3\t439570\n劳伦斯\t439571\n北京首都机场\t439572\n平江县政府\t439573\n奚梦瑶\t439574\n山河城\t439575\n琅琊路小学\t439576\n特助\t439577\n直下\t439578\n南京搬家公司\t439579\n东京站\t439580\n东大桥\t439581\n王建辉\t439582\n拼贴画\t439583\ndigsilent\t439584\n游团\t439585\n平足\t439586\nFasteners\t439587\n幸运盒\t439588\n异或\t439589\n妍丽\t439590\n西安市人力资源和社会保障局\t439591\n有线路由器\t439592\nm64\t439593\n2017年3月28日\t439594\nlrs\t439595\n南稍门\t439596\n沒有\t439597\n10CM\t439598\n主屏\t439599\n取样\t439600\n许老板\t439601\n机龄\t439602\n全集吉吉影音先锋\t439603\n儒\t439604\n倒酒\t439605\n打印店\t439606\n第二十八个\t439607\n溱湖\t439608\nmyth\t439609\n全蝎\t439610\nAsking\t439611\n纽北\t439612\n林宛瑜\t439613\n逆回购利率\t439614\n3.3.6\t439615\n幻想症\t439616\n延安路\t439617\nAloy
s\t439618\nscribe\t439619\n1200w\t439620\n仁和堂\t439621\n活跃点\t439622\n快吧补丁网\t439623\nFluentd\t439624\n桃江\t439625\n范德彪\t439626\npr2017cc\t439627\n屏中\t439628\n执掌\t439629\n树屋\t439630\n苏州小学\t439631\nspec\t439632\n银杏树\t439633\n自修室\t439634\n法拉利458\t439635\n沐风\t439636\nnba直播\t439637\nBroke\t439638\n白马啸西风\t439639\nAtrix\t439640\n浦安\t439641\ndatasource\t439642\n索引器\t439643\n红秀GRAZIA\t439644\n高朋\t439645\n津京冀\t439646\n茅小春\t439647\nwires\t439648\n闽清县\t439649\n对不准\t439650\n林韦伶\t439651\nPowerBuilder\t439652\n4y4\t439653\n萧万长\t439654\n故此\t439655\n火柴人大乱斗\t439656\n唐文\t439657\n月嫂公司\t439658\n诗风\t439659\n8264\t439660\nc#类\t439661\n吉致汽车\t439662\n迟子\t439663\n锻造件\t439664\n小米5sp\t439665\n陪着\t439666\n极惯性矩\t439667\n马向东\t439668\n小孩牙\t439669\n中国医科大学航空总医院\t439670\n试压\t439671\n段竹心\t439672\n马鞍山职业技术学院\t439673\n生存战争2\t439674\n河南科技大学第一附属医院\t439675\n王婉霏\t439676\n金投信用卡\t439677\n弗莱\t439678\n马克沁重机枪\t439679\n合作式\t439680\n超白金\t439681\n路槽\t439682\n大姐们\t439683\n骊驰\t439684\n75万\t439685\nusb2\t439686\n铁四院\t439687\n黎明静\t439688\n赣锋锂业\t439689\n养殖小区\t439690\nsev\t439691\n2306\t439692\n29栋\t439693\nDatasets\t439694\nヤ\t439695\n隐藏分区\t439696\n不动产登记中心\t439697\n康康谷\t439698\n暗黑1.13战网\t439699\n老人与海\t439700\n张师傅\t439701\n纠缠态\t439702\n三第四\t439703\nds3p\t439704\n50etf\t439705\n下\t439706\n加横\t439707\n罗德曼\t439708\nkossel\t439709\ngenetic\t439710\n咏花\t439711\n常柴\t439712\n河南心连心化肥有限公司\t439713\n几拍\t439714\nrename\t439715\n小儿感冒颗粒\t439716\neno\t439717\n城际\t439718\n陈妍臻\t439719\n一双双\t439720\n喷丝板\t439721\n祛疤膏\t439722\n重建机\t439723\n80后操b\t439724\nDHTML\t439725\n八字算命网\t439726\n上高速\t439727\n电动卷闸门\t439728\n狠\t439729\n锁定期\t439730\n務\t439731\n羽咲美晴\t439732\n锵锵\t439733\n年糕\t439734\n纳瓦拉\t439735\n亦庄火车站\t439736\n15任\t439737\n芝奇幻光戟\t439738\nx=1处\t439739\n烘手器\t439740\nxwiki\t439741\nexplain\t439742\nbr\t439743\n一十一\t439744\n吼吼\t439745\n靓坤\t439746\n1000套\t439747\n投资担保公司\t439748\n防酸\t439749\nwindows10输入法\t439750\n慢性胃炎\t439751\n枕头\t439752\n20160311\t439753\namd955\t439754\nCommodities\t439755\n黑公主\t439756\n大毕庄\t439757\n人性化\t439758\n夹件\t439759\n庄桥\t439
760\n等电位\t439761\n4发\t439762\n好朋\t439763\nServer2016\t439764\n2013.3\t439765\n大_寻医问药网\t439766\n5030s\t439767\n九蛙\t439768\n爱恩\t439769\n萧红\t439770\n气动量仪\t439771\n甩干\t439772\n7502\t439773\nmdrt\t439774\n玉龙湖\t439775\nElegance\t439776\nf2.8\t439777\n笕桥\t439778\n鲁班尺吉数\t439779\n警体\t439780\n慈城镇\t439781\n通风\t439782\n高夫\t439783\n3300元\t439784\n鲁豫有约大咖\t439785\n古贝春\t439786\n野葛根\t439787\nelliot\t439788\n非人学院\t439789\n1306\t439790\n膨胀型\t439791\nmkc\t439792\n送杜少府之任蜀州\t439793\n成都嘉祥外国语学校\t439794\n前海注册公司\t439795\n灰缝\t439796\nadvance\t439797\n酒罐\t439798\n卡尔斯鲁厄\t439799\n9月15日\t439800\n救起\t439801\n统筹地区\t439802\nDiscuss\t439803\n广州建设银行\t439804\n朗讯\t439805\n炮头\t439806\n深圳明镜网\t439807\n武汉地质大学\t439808\n易考宝典\t439809\ndotfuscator\t439810\n南京谷\t439811\n丁丁网\t439812\n闺密\t439813\nddrops\t439814\nKendra\t439815\n在即\t439816\ndatang\t439817\n139个\t439818\nPlus-Flyme\t439819\n张养浩\t439820\n星海镖师\t439821\n金鸡菊\t439822\n巨美\t439823\n爱娟\t439824\n曲\t439825\n埃斯托蓝\t439826\n9120\t439827\n久久色悠悠综合网\t439828\n采茶歌\t439829\nsitu\t439830\n产机\t439831\n蚊香液\t439832\n广信\t439833\n请款单\t439834\n刹那间\t439835\nbgr\t439836\n旖旎\t439837\n戏台\t439838\n金卤灯\t439839\n000971\t439840\n不冤\t439841\nトップ\t439842\n8000多亿元\t439843\n康巴什新区\t439844\nKENZO\t439845\n赵小生\t439846\n000930\t439847\ndcv\t439848\n率众\t439849\n昊宇\t439850\n第十三周\t439851\n尚有\t439852\n秀美乡村\t439853\nclu\t439854\nrej\t439855\n眼冒金星\t439856\n镇江市规划局\t439857\n筱崎\t439858\n吸声板\t439859\nstreet\t439860\n20_\t439861\n厕液\t439862\n勇者大冒险\t439863\nFAP\t439864\n拳皇02\t439865\n巡检仪\t439866\n逆天九小姐\t439867\n宁波市国家税务局\t439868\n红梨\t439869\na53\t439870\n刘贵祥\t439871\n只差\t439872\nCQL\t439873\n麓山国际实验学校\t439874\nTanks\t439875\n万绿湖\t439876\nzx100\t439877\n20150218\t439878\n解码\t439879\ntxn\t439880\n计税基础\t439881\n旁听\t439882\nmkvtoolnix\t439883\n张慧仪\t439884\n深圳都市频道\t439885\n乱世枭雄\t439886\n先河\t439887\n海美\t439888\nStreamWriter\t439889\njpush\t439890\n价廉\t439891\n终身\t439892\n3题\t439893\n博硕\t439894\n第33条\t439895\n考一建\t439896\n3dsmax\t439897\n血凝仪\t439898\n遗憾的是\t439899\nNippon\t439900\n朱开山\t439901\nGrounds\t439902\n路
口处\t439903\n新疆维吾尔自治区发展和改革委员会\t439904\n罗伯特·帕丁森\t439905\n虚幻4引擎\t439906\n指腹\t439907\ndisneyland\t439908\nANSOFT\t439909\n本科证\t439910\n诸人\t439911\n大威德金刚\t439912\nps2018\t439913\n东北民歌\t439914\nFulfillment\t439915\n逆鳞套\t439916\n叩响\t439917\n少年少年祖国的春天\t439918\n标招标\t439919\n十二条\t439920\nMeilele\t439921\n8x\t439922\nbreak-word\t439923\nSlate\t439924\n實\t439925\n宽窄\t439926\n湖州教育网\t439927\n鸿山\t439928\n王浩信\t439929\n温泉村\t439930\n蓝兔\t439931\nシンデレラ\t439932\n全民突击吧\t439933\n奠基\t439934\nPCB人才网\t439935\n惹麻烦\t439936\n焦作市委\t439937\n小帽子\t439938\n甘味\t439939\n经典力学\t439940\nHARD\t439941\n湿透\t439942\n黑凤凰\t439943\n龙岗中心城\t439944\n美不胜收\t439945\n冯伟\t439946\n货款\t439947\nSaltstack\t439948\nprotect\t439949\n合肥高新区\t439950\n200号\t439951\n苹果售后服务中心\t439952\nMiiTao蜜桃社\t439953\n保利香槟国际\t439954\n陈自瑶\t439955\n720全景图\t439956\n显微镜\t439957\n中国环境科学研究院\t439958\n988\t439959\n33条\t439960\n议论\t439961\ngreenday\t439962\ntsb\t439963\n浙数文化\t439964\n撕纸\t439965\n鼓号队\t439966\nbaihe\t439967\n化学博士\t439968\n大话西游之爱你\t439969\n多智能体系统\t439970\n非点源\t439971\n斯彬\t439972\n端部\t439973\n贝里内利\t439974\n重庆19楼\t439975\n9240\t439976\nsomuch\t439977\n海晏县\t439978\n同方知网\t439979\nuibezierpath\t439980\n纽卡\t439981\n进气量\t439982\n汽车博物馆\t439983\n日文版\t439984\n北顶娘娘庙\t439985\n入境卡\t439986\n2017年8月8日\t439987\n珠海区\t439988\n海鸣威\t439989\n中国民主同盟上海市委员会\t439990\n小嗨\t439991\n瓦楞纸箱\t439992\n一个目\t439993\ncolin\t439994\nbattleship\t439995\n万华化学\t439996\n10年前\t439997\n天津代表团\t439998\ncool1\t439999\nCrafting\t440000\n君权\t440001\n一百里\t440002\n全餐\t440003\nBXJ\t440004\n时尚女人志\t440005\n专售\t440006\n口式\t440007\n忧郁症\t440008\n百分之六\t440009\n维修率\t440010\ncsr2\t440011\n查全率\t440012\n九游手游\t440013\n威灵仙\t440014\n俏丽\t440015\n射日\t440016\n武汉地铁16号线\t440017\nbgao\t440018\nXMind思维导图\t440019\n宋新能源\t440020\nmjc\t440021\n40式太极拳\t440022\n白鹊\t440023\n中级篇\t440024\n1999年度\t440025\n替代性\t440026\n亚太5号\t440027\n李颖\t440028\nsewage\t440029\nVimIy微民网\t440030\n2016年3月1日起\t440031\n菁英公寓\t440032\nJava8\t440033\n美好置业\t440034\npeanut\t440035\n王鸿翔\t440036\nnagi\t440037\n九九八十一难\t440038\n小夜莺\t440039\n僵尸舞\t440040
\n谢兰图\t440041\n蓝港\t440042\n刊物\t440043\nxutils3\t440044\n陕西省机关事务管理局\t440045\nPyImage\t440046\n公平竞技\t440047\n加罗\t440048\n宝塔糖\t440049\n龙之挑战\t440050\n柯凡\t440051\n元通\t440052\n新交所\t440053\n倒背如流\t440054\n节气\t440055\n优翼\t440056\n2019年1月\t440057\n8800元\t440058\n生日花语\t440059\n泰莎\t440060\n胡玮炜\t440061\nxlrd\t440062\n山西省总工会\t440063\n最近两年\t440064\nK歌之王\t440065\n海特生物\t440066\n普惠性幼儿园\t440067\n昌林\t440068\n千顷\t440069\n卡奈魔盒\t440070\n北银\t440071\n天茂\t440072\n黑檀木\t440073\n缓缴\t440074\nD站\t440075\n国务院扶贫办\t440076\n艺美\t440077\n灯壳\t440078\n立命馆大学\t440079\npandownload\t440080\n平安旅游白金卡\t440081\n资生堂安热沙\t440082\n六安二中\t440083\n中国股票网\t440084\n黠\t440085\n广州市天河区人民法院\t440086\n下載\t440087\n柔\t440088\ncoolpix\t440089\n汇丰集团\t440090\n天雄\t440091\n黑暗血\t440092\nMotivation\t440093\n铝木复合门窗\t440094\nc+\t440095\n爱钱帮\t440096\n街码\t440097\n天目西路\t440098\n横死\t440099\nWeDo\t440100\n张国庆\t440101\n果品\t440102\n交流电动机\t440103\nctt\t440104\nFISHER\t440105\n柔情\t440106\n交叉步\t440107\n杉木\t440108\n索泰1060\t440109\nHabits\t440110\n郑州市统计局\t440111\nprotel99se\t440112\n唐伟\t440113\n韩钢\t440114\n牛顿力学\t440115\n求和项\t440116\n_v1.5\t440117\n木鸡\t440118\nweb层\t440119\nGastro\t440120\n第53条\t440121\n虎皮鱼\t440122\n何其芳\t440123\n第3种\t440124\n急性\t440125\n金名网\t440126\n财经观察网\t440127\n满宠\t440128\n北京一夜\t440129\n烧水器\t440130\n北京第二外国语学院\t440131\n华星光电\t440132\nsecx\t440133\n搭棚\t440134\n159949\t440135\nFEE\t440136\n活动期\t440137\n仙侠诸界末日在线\t440138\n甜甜\t440139\n丰南\t440140\n千磨万击\t440141\n载气\t440142\n四川\t440143\n宣传画\t440144\n出血位\t440145\n校务\t440146\n县质监局\t440147\n16pic\t440148\nBacked\t440149\n生死场\t440150\nLegendary\t440151\n各级\t440152\n缸盖\t440153\n极品飞车20破解版\t440154\n孝\t440155\nPROBOOK\t440156\n慢慢\t440157\n突然性\t440158\n快吧游戏\t440159\ncvt\t440160\n木轴\t440161\n脑外伤\t440162\n分庭抗礼\t440163\n肠道菌群失调\t440164\nBackstreet\t440165\n球星\t440166\n等额本息还款\t440167\nSmartArt\t440168\n绮户\t440169\n痛灵\t440170\n破邪\t440171\n老手\t440172\naesthetics\t440173\nvmi\t440174\n风流倜傥\t440175\n【众泰T700】众泰众泰T7002018款\t440176\n医联网\t440177\n插卡电表\t440178\n苏小暖\t440179\n云海金属\t440180\n海连池\t440181\n獬豸\t440182\nl
imbo\t440183\nSafety\t440184\n曆\t440185\n第四集\t440186\n嫘祖\t440187\n收派\t440188\n忠厚\t440189\n翔鹰帝国网\t440190\n大平\t440191\n奴妻\t440192\nRoarTalk\t440193\n平安综合金融\t440194\nALLluhan\t440195\n战神奥林匹斯之链\t440196\n铁血军魂\t440197\nacesse\t440198\n地衣\t440199\n动人心\t440200\n重庆大学城\t440201\n核定\t440202\n移民部\t440203\nfuzhuang\t440204\nEscrow\t440205\n中国企业新闻网\t440206\n竹雕\t440207\nFraud\t440208\n跤王\t440209\nnfc功能\t440210\n10.13.1\t440211\n军事经济学院\t440212\n陈朗\t440213\n教习\t440214\n澳门理工学院\t440215\n边牙\t440216\ngays\t440217\n百度关键字指数\t440218\ntypings\t440219\nx=2\t440220\n企业管理信息系统\t440221\n新技\t440222\n特种部队\t440223\nRb\t440224\n格拉姆\t440225\n帝鳄\t440226\n北宫门\t440227\n健壮性\t440228\nteenage\t440229\n产率\t440230\n开心色\t440231\n椎名真由理\t440232\nsnn\t440233\nEcommerce\t440234\n苦修\t440235\n凉爽\t440236\n接线器\t440237\n狮山镇\t440238\n百本\t440239\n山西人大\t440240\ndeduct\t440241\n观音机场\t440242\n5.6.8\t440243\nHibernate\t440244\n回门宴\t440245\n几芯\t440246\nrtems\t440247\nvending\t440248\n坏妈妈\t440249\n常虹\t440250\n孙村镇\t440251\n多校划片\t440252\n文华财经\t440253\n快速消费品\t440254\ncollector\t440255\n许镜清\t440256\n黄鹤翔\t440257\nDAPI\t440258\n北京市交管局\t440259\nBLM\t440260\n安元\t440261\n三诺集团\t440262\n胭脂水粉\t440263\n月谷\t440264\n长江索道\t440265\n中国科学院电工研究所\t440266\n文悦\t440267\n刘新\t440268\nmtl\t440269\n上海标卓科学仪器有限公司\t440270\nx600\t440271\n齐备\t440272\n平等互利\t440273\nunnamed\t440274\nPenguins\t440275\n南京律师事务所\t440276\n上海国际仲裁中心\t440277\n淮北市政府\t440278\n电房\t440279\n呋喃唑酮\t440280\nsimhash\t440281\n结婚前夜\t440282\n章开沅\t440283\n宁波市公共资源交易中心\t440284\n清路\t440285\n12.5英寸\t440286\n恒顺醋业\t440287\n天光\t440288\nTekkaman\t440289\n注册化工工程师\t440290\n烟台南站\t440291\nhp2130\t440292\n幸福地\t440293\npro6plus\t440294\n云南省统计局\t440295\nSVR\t440296\n南宁市财政局\t440297\n赏花\t440298\n深圳华侨城\t440299\nKL\t440300\n唐宋元\t440301\n小铁\t440302\n车载\t440303\n拨号连接\t440304\n臂带\t440305\n自动式\t440306\n湖北工业大学工程技术学院\t440307\n巨骨舌鱼\t440308\n蜜色\t440309\n快读\t440310\n振奋人心\t440311\n波克\t440312\n9.8%\t440313\n情智\t440314\n七里河区\t440315\n电动船\t440316\n201506\t440317\n哈平路\t440318\n巴枪\t440319\n拭子\t440320\n178DNF\t440321\n242\t440322\n江
汉大学\t440323\nsind\t440324\n索尼A9\t440325\n电灶\t440326\n芳草碧连天\t440327\n一起摇摆\t440328\nASF\t440329\nctrl+z\t440330\n轻量级微\t440331\n投资人\t440332\n地红\t440333\n猫瘟\t440334\n阿拉木汗\t440335\n蜜蜂蛰\t440336\n审判日\t440337\n哈弗H2论坛_汽车之家论坛\t440338\n面麻\t440339\n气势汹汹\t440340\n蒙恩\t440341\n当作\t440342\n爱我妻\t440343\n奔驰e300l\t440344\n新寨村\t440345\n野格\t440346\naddons\t440347\n围子\t440348\nAlanTu\t440349\n甲肝灭活疫苗\t440350\n空包网\t440351\n蓝凌\t440352\n发起来\t440353\nlany\t440354\n弹力裤\t440355\n儿郎\t440356\nIslands\t440357\n启者\t440358\n失控\t440359\n体雕\t440360\n省人社厅\t440361\nRICHARD\t440362\n声筒\t440363\n上海爱乐乐团\t440364\n短位\t440365\n刘敏涛\t440366\nOutlook2013\t440367\n万华镜\t440368\n名塔\t440369\naire\t440370\n几十条\t440371\n复合词\t440372\n龙珠超次元乱战\t440373\n即时通讯软件\t440374\n十八相送\t440375\n怅然\t440376\n皆大欢喜\t440377\n厚重感\t440378\n9期\t440379\n安全员\t440380\n成龙时寒冰\t440381\n飘窗板\t440382\n走就走\t440383\n凯龙星\t440384\n小牙\t440385\n混色\t440386\n上座\t440387\n第四方\t440388\n高尔夫6论\t440389\nnet版\t440390\n水电\t440391\nPB\t440392\n原野\t440393\n黄尚\t440394\n方桥\t440395\n汇款单\t440396\n几杠\t440397\n歌集\t440398\nGMBH\t440399\n慧谷\t440400\n三原县人民政府\t440401\n康恩贝\t440402\n环球医疗器械网\t440403\n厂办\t440404\n超赞\t440405\n泰克\t440406\n亚太国际\t440407\n凉拌木耳\t440408\n郑州市财政局\t440409\nalienw\t440410\n天津西站\t440411\n贝利亚奥特曼\t440412\nEpoxy\t440413\n延津\t440414\n黑手党3\t440415\n谱聚类算法\t440416\n去路\t440417\n奔驰s500\t440418\n驿动的心\t440419\nwww.guanfang123.com网\t440420\n玲珑山\t440421\ncpc\t440422\n加厚\t440423\n推荐性\t440424\n公然\t440425\n7.2.2\t440426\n量子基金\t440427\n优速盘\t440428\n96点\t440429\nwin10无线网\t440430\n飞蛾扑火\t440431\n先声药业\t440432\n江苏海事职业技术学院\t440433\n兴全\t440434\n志龙\t440435\n亚历克斯\t440436\n珠海九洲港客运服务有限公司\t440437\n斯图\t440438\n金彩\t440439\n频器\t440440\n沈眉庄\t440441\n足额\t440442\n长安航空\t440443\n发病期\t440444\ncaserandom\t440445\nCOD8\t440446\n张鹏程\t440447\n钨棒\t440448\n超精\t440449\n军团长\t440450\n云梦山\t440451\n齿圈\t440452\n500km\t440453\n陪考\t440454\n命长\t440455\n顾平\t440456\n福祥\t440457\nランジェリ\t440458\n反而\t440459\ncc版\t440460\n简政\t440461\n宝友\t440462\n纺织物\t440463\n变冷\t440464\nCF穿越火线高手网\t440465\n考察队\t440466\n撒哈拉\t440467\n于坚\t
440468\n察觉\t440469\n点融网\t440470\n2016—2030年\t440471\n内蒙古地区\t440472\nTRN\t440473\n风湿病学\t440474\n南投县\t440475\n店子村\t440476\n避雷网\t440477\n上百元\t440478\n痼疾\t440479\n180集\t440480\nvr\t440481\n凯蒙\t440482\nnta\t440483\nCPF\t440484\n临床\t440485\nCookie\t440486\n埃菲尔铁塔\t440487\n爬窗\t440488\nunique\t440489\n76条\t440490\nps快捷键\t440491\n新港西路\t440492\n嘉福\t440493\n江津区\t440494\n第84\t440495\n桦林\t440496\nHarbour\t440497\n西北农林\t440498\n湖北省中医院\t440499\n董永\t440500\n湖北省人大\t440501\n梯间\t440502\ndz论坛\t440503\n227分钟\t440504\n横风\t440505\n邵连虎\t440506\n张波\t440507\n航空产业园\t440508\n人情\t440509\n联想笔记\t440510\nvipkid\t440511\n温榆河\t440512\n政府军\t440513\n中国一流大学\t440514\napach\t440515\n5513\t440516\n齿条\t440517\nQCustomPlot\t440518\n介护\t440519\nx2-y2\t440520\n保险赔款\t440521\n15班\t440522\n周伯通\t440523\nCharger\t440524\n狂哭\t440525\n园本\t440526\npnp\t440527\nAttackers\t440528\nBTV\t440529\n朴子\t440530\n阿福卡\t440531\n蜂鸣\t440532\nAVE\t440533\n笑看人生\t440534\n中车株洲电力机车研究所有限公司\t440535\n控制系统\t440536\n宫外\t440537\nHDPT\t440538\n第45页\t440539\n鬼脚七\t440540\n越秀区寺\t440541\n逸夫\t440542\n越来\t440543\n等离子消融术\t440544\n辦\t440545\n李耀\t440546\n解语\t440547\n肝脾\t440548\n蕾欧娜\t440549\nfayu\t440550\n崔寨\t440551\nShangHai\t440552\nlvcreate\t440553\n七国\t440554\n1000多\t440555\n1890年\t440556\n原型师\t440557\n星猫\t440558\n不寻常\t440559\n姓顾\t440560\nDUO\t440561\n再娶\t440562\n排错\t440563\nfriedman\t440564\n依人\t440565\n步步为营\t440566\nSculpture\t440567\n中国中医科学院西苑医院\t440568\n第一学期第一次\t440569\nNFJ\t440570\n初行\t440571\n0.1_\t440572\n叶海燕\t440573\n用友ERP\t440574\n孝弟\t440575\n十年美\t440576\n中山红木家具\t440577\n企业会计制度\t440578\n兰州军区总医院\t440579\nnortheast\t440580\n前三月\t440581\n五代史\t440582\nRunning\t440583\ncabbage\t440584\nnormalized\t440585\n驴胶补血颗粒\t440586\nxishi\t440587\n快速人才网\t440588\n液相色谱分析\t440589\n新广告法违禁词汇总\t440590\n乡恋\t440591\n沙丘魔堡\t440592\n桦树\t440593\n独语\t440594\n3件\t440595\n白板纸\t440596\n课生\t440597\n13岁\t440598\nDora\t440599\n首根\t440600\n演员的诞生\t440601\n8亿元\t440602\n如家精选酒店\t440603\n叶谦\t440604\n淮军\t440605\n红鞋\t440606\n横断山脉\t440607\nonClick事件\t440608\n肾功能衰竭\t440609\n1000ML\t440
610\n齐齐乐楚留香\t440611\ntherm\t440612\n发烧友\t440613\n火拼\t440614\n卷发型\t440615\n太阳轮\t440616\neach\t440617\n亲生\t440618\n业务范围\t440619\n梦寐以求\t440620\n三年级\t440621\n拉萨古城\t440622\n秋江\t440623\nJAVA类\t440624\n226号\t440625\n宴会厅\t440626\n定襄\t440627\n威派格\t440628\nis2\t440629\n岑溪\t440630\n票房\t440631\nCNET科技资讯网\t440632\ntrainee\t440633\n允西卡\t440634\n客户体验\t440635\n升班\t440636\n认知度\t440637\n荆芥\t440638\n深圳福田高铁站\t440639\n滴滴小巴\t440640\nwebconfig\t440641\n驷马\t440642\n活活\t440643\n维也纳国际酒店\t440644\n公办幼儿园\t440645\n老干妈\t440646\n遵义\t440647\nrust吧\t440648\n像素版\t440649\n辄\t440650\n可恢复\t440651\n庞清\t440652\n黑城\t440653\npmos\t440654\n驾驶台\t440655\n金融市场学\t440656\n曲玉\t440657\n黑苹果\t440658\n咖啡伴侣\t440659\n徐帆\t440660\n9\t440661\nv8.6\t440662\n箭牌\t440663\n预赛\t440664\n杭州市中级人民法院\t440665\natoi\t440666\n中华人民共和国驻美利坚合众国大使馆\t440667\n三次样条\t440668\n五个工作日\t440669\n嘣战纪\t440670\n收车\t440671\ndeepthroat\t440672\n西水股份\t440673\n瑶台\t440674\n文峰塔\t440675\n王仁\t440676\n黍\t440677\n掏腰包\t440678\n党建会议\t440679\n钛板\t440680\n姜姬\t440681\n成都晚报\t440682\nsearchbox\t440683\n言兴朋\t440684\n下降\t440685\n倒头\t440686\n柱效\t440687\nJPbeta\t440688\nsalomon\t440689\n抗心磷脂\t440690\n陕汽重卡\t440691\n行状\t440692\n营盘镇\t440693\n后继乏人\t440694\n价值观\t440695\n宝中\t440696\n枣阳\t440697\n20万公里\t440698\n娜塔丽·波特曼\t440699\n中国人民解放军国防大学\t440700\n大蛋糕\t440701\n15年以上\t440702\n丽达\t440703\n绿石\t440704\n风动石\t440705\n阿尔法系数\t440706\n记忆法\t440707\n马鹿\t440708\nanimate\t440709\nDies\t440710\n阻断\t440711\nnigh\t440712\n张天师\t440713\n阿超\t440714\n鸿门\t440715\n冲上云霄1\t440716\n刘天华\t440717\n狼孩\t440718\n预激综合症\t440719\n安兹\t440720\n麦肯基\t440721\n中共陕西省委组织部\t440722\n默沙\t440723\nscalp\t440724\nf599\t440725\n艺声\t440726\n向警予\t440727\n鲁东大学\t440728\n志尊淳\t440729\n人工学园2吧\t440730\npowercli\t440731\n100秒\t440732\n马加特\t440733\n随和\t440734\n九十八\t440735\n强辉\t440736\nDPOS\t440737\n光眼\t440738\n提公\t440739\n港雪宝论坛|港雪宝论坛\t440740\n绸缪\t440741\nsdl2\t440742\n高知\t440743\n品碟\t440744\nmiracles\t440745\n奥森\t440746\n阴冷\t440747\n300px\t440748\n北京朝阳国贸公司\t440749\n陈法蓉\t440750\n腾讯大学\t440751\nechostr\t440752\nflight\t440753\n钢琴视奏版\t440754\n
舞动\t440755\n风乎舞雩\t440756\n2016年10月20日\t440757\n电源连接器\t440758\n0岁\t440759\n勤快\t440760\n描述符\t440761\n陈建\t440762\ncnl\t440763\n杜尚\t440764\narmeabi\t440765\njQueryEasyUI\t440766\n四川省律师协会\t440767\nMplayer\t440768\n不安全\t440769\n心流\t440770\n克霉唑软膏\t440771\n农家小院\t440772\n龙骑\t440773\n镇江市第一人民医院\t440774\n第八大街\t440775\n角落\t440776\n120层\t440777\n冯丽\t440778\n另附\t440779\n琼贵\t440780\n第四十一回\t440781\n二级管\t440782\n问女\t440783\ns8plus\t440784\n香菇\t440785\n八版\t440786\n简陋\t440787\n蛇精病\t440788\n♀\t440789\n大连海洋大学\t440790\nUnicodeDecodeError\t440791\n买者\t440792\n欧陆风云4吧_\t440793\n包边\t440794\n李隆基\t440795\n11条\t440796\n易房大师\t440797\n悍马\t440798\n第20届\t440799\n西部客运站\t440800\n四糸乃\t440801\n兰州新区\t440802\n田庄\t440803\n値\t440804\n木刀\t440805\nvario\t440806\n红松\t440807\nG\t440808\n20mm\t440809\nUCD\t440810\n美鲍\t440811\n10.14\t440812\n陈金彪\t440813\n4角\t440814\n英雄无敌战争\t440815\n黄花园大桥\t440816\n中国好歌曲\t440817\n器灵\t440818\n锋驭\t440819\n八十五\t440820\n62亿\t440821\n3.4亿\t440822\n华能\t440823\n刺骨\t440824\nnavigation\t440825\nfod\t440826\nRDTA\t440827\n重庆长安汽车\t440828\n教育者\t440829\n遗传办\t440830\n我乐家居\t440831\n慕先生\t440832\n东方曼哈顿\t440833\n各场\t440834\n厄瓜多尔\t440835\n劳役\t440836\n画脸\t440837\n科蒂\t440838\n錾子\t440839\n阿是\t440840\n价外税\t440841\n阿迪力\t440842\n一臂\t440843\n邱关源\t440844\n宽厚\t440845\nLolita\t440846\n首创者\t440847\n用-琵琶网\t440848\n一定要靠\t440849\nvelocity\t440850\n末行\t440851\n冰酷\t440852\n杭州政协\t440853\namiibo\t440854\n钙剂\t440855\n成规\t440856\n唐诗\t440857\n璞丽东方\t440858\n玄医\t440859\nCoaching\t440860\nsimultaneously\t440861\n稳赢\t440862\n福特翼虎\t440863\n外阴湿疹\t440864\n异刃\t440865\n慧心\t440866\n善战\t440867\n高妹\t440868\n专业知识\t440869\n老海\t440870\nLZ\t440871\n天天炫舞\t440872\n住艺\t440873\n王弢\t440874\nKagney\t440875\n▆\t440876\n北京地铁19号线\t440877\n苏韵锦\t440878\n11批次\t440879\n330ML\t440880\n求介绍\t440881\n教育部\t440882\nkarla\t440883\n依纪\t440884\n地锅鸡\t440885\n10000家\t440886\n纪晓芙\t440887\nose\t440888\n万阳\t440889\n斯托米·丹尼尔斯\t440890\n调室\t440891\n6235\t440892\n软考软件设计师\t440893\n顺丰公司\t440894\nconsequences\t440895\nyeo\t440896\n选考\t440897\nProvide\t440898\n18关\t440899
\n八时\t440900\n六户\t440901\n林欣浩\t440902\n渔夫帽\t440903\n源流\t440904\n屏线\t440905\n赵俊杰\t440906\n直动\t440907\n上海华美医疗美容医院\t440908\n二稿\t440909\nDXG\t440910\n白玛多吉\t440911\n必胜宅急送\t440912\n刺客信条枭雄\t440913\n8909\t440914\n美中宜\t440915\n秋千\t440916\n汜水关\t440917\n240ml\t440918\n163.com_\t440919\n机器臂\t440920\n长江和记实业有限公司\t440921\nbsm\t440922\nIIS8.0\t440923\n五年级下册语文书-五年级下册语文\t440924\nu盘\t440925\n衢州市政府\t440926\n天基人才网\t440927\n招远麦当劳\t440928\npoly\t440929\n熟度\t440930\n片商\t440931\nMilan\t440932\n体育史\t440933\n常春\t440934\n2161\t440935\n80级\t440936\nshrine\t440937\n二沙岛\t440938\nTVRip\t440939\n弥天\t440940\n使命召唤现代战争\t440941\n嵊新\t440942\naph\t440943\n桐梓县人民政府\t440944\n韩东君\t440945\nSqlserver\t440946\n13913298888\t440947\n批改网\t440948\n组卡\t440949\n赵志红\t440950\npenelope\t440951\n食指\t440952\n广西壮族自治区体育局\t440953\n20包\t440954\n1.3万\t440955\n云月港\t440956\n牧原食品股份有限公司\t440957\n92kaifa\t440958\n嘉善南站\t440959\n施德楼\t440960\n153期\t440961\n笼统\t440962\n胚体\t440963\n金港路\t440964\nVoyeur\t440965\n振德\t440966\n中航工业\t440967\n做死\t440968\npallas\t440969\n琴歌\t440970\n抗污\t440971\nFormulation\t440972\njou\t440973\n宁波广播电视大学\t440974\n临港经济开发区\t440975\n转向\t440976\nOVER\t440977\n数科\t440978\n易语言组合框\t440979\n螺杆式空气压缩机\t440980\n义马市\t440981\n不能\t440982\n调档\t440983\nKP\t440984\n悦心健康\t440985\n建安\t440986\n*#06#\t440987\nsersync\t440988\nresolving\t440989\n马卡蒂\t440990\nLove\t440991\n5盎司\t440992\nduc\t440993\n韦德伍斯\t440994\n叫做\t440995\n柴田\t440996\nPEAR\t440997\njs-好库网\t440998\n6300hq\t440999\n幽林\t441000\n皮靴\t441001\n赵莹\t441002\n增强剂\t441003\n93_\t441004\n残值率\t441005\n发票门\t441006\n第18037期\t441007\nxingai\t441008\n南艺\t441009\n小串\t441010\ncomplexity\t441011\n800天\t441012\n浙江扬子江泵业有限公司\t441013\n爱城\t441014\n快乐读书\t441015\n喇叭网\t441016\npep\t441017\nsorter\t441018\n分界\t441019\n2017年5月13日\t441020\n狐影\t441021\n下常用\t441022\nmon\t441023\n第14位\t441024\nmatcaffe\t441025\n也不\t441026\nkaoyan\t441027\nlacking\t441028\n线谱\t441029\n信用宝\t441030\n105平方\t441031\n8.5.0\t441032\n爱在日落黄昏时\t441033\n薛掌柜\t441034\n队旗\t441035\n3108\t441036\nё\t441037\n寿全斋\t441038\njinfo\t441039\
n李锐\t441040\n因明\t441041\n美心月饼\t441042\nnodeps\t441043\n柱间\t441044\n赌石\t441045\nEuropean\t441046\n110v\t441047\n槲皮素\t441048\n奥特斯\t441049\n镖旗\t441050\nEditable\t441051\n2016年7月\t441052\nLaugh\t441053\n伶牙俐齿\t441054\n股票交易所\t441055\n明略\t441056\n芦村\t441057\n六本木之丘\t441058\n张坊\t441059\n乐车邦\t441060\nai0513.com\t441061\n难点\t441062\n恩师\t441063\n债转股\t441064\n小桔灯网\t441065\n店商\t441066\n1.8D\t441067\n中诚信\t441068\nroboguide\t441069\n越好不好\t441070\nbang\t441071\n乐米\t441072\n吴卓林\t441073\n弱音\t441074\nsse\t441075\n赌场\t441076\n吴为山\t441077\n动画片儿\t441078\n托宝\t441079\n吉林省省\t441080\n公安部刑侦局\t441081\n十一月\t441082\n苏57\t441083\n译稿\t441084\n玄功\t441085\nSakurai\t441086\n挑夫\t441087\n二八分\t441088\nEthereal\t441089\n中巴铁路\t441090\nmax-width\t441091\n勒脚\t441092\n草莓晶\t441093\n初晓敏\t441094\n校纪委\t441095\n长乐村\t441096\n新疆省\t441097\nEQUIPMENT\t441098\n水滴型\t441099\nCarhartt\t441100\n官银\t441101\n仅次\t441102\n1000部\t441103\nmeForest\t441104\n乐华\t441105\n鼻涕状\t441106\nVIXX\t441107\n纯很暧昧\t441108\n中国舞蹈家协会\t441109\n句读\t441110\n愚人节\t441111\n转危为安\t441112\n708路\t441113\n轮罩\t441114\n通电\t441115\n征用\t441116\n优酷下载器\t441117\n各取所需\t441118\n杏仁油\t441119\n音秀\t441120\n女声优\t441121\n力盛赛车\t441122\n知味葡萄酒\t441123\n暮暮\t441124\n2018年4月16\t441125\n上线率\t441126\n轮轴\t441127\n沙皮\t441128\n风沙\t441129\nip段\t441130\n银州\t441131\n崇州\t441132\n炒货机\t441133\n这周六\t441134\n花房\t441135\nvse\t441136\n基色\t441137\n中南信达\t441138\n时间片\t441139\n运动会\t441140\nSpiritual\t441141\nPediatric\t441142\n高校长\t441143\n海尔集团公司\t441144\n黄道吉日大全\t441145\n超凡蜘蛛侠2\t441146\n菩提岛\t441147\n西北角\t441148\n龙源\t441149\n生果\t441150\n厘定\t441151\n32册\t441152\ncreatethread\t441153\n老粗布\t441154\n金桥\t441155\n何锋\t441156\n防臭\t441157\n江苏南部\t441158\nsocket编程\t441159\n拉格朗日\t441160\n齿形链\t441161\n1836\t441162\n渡人\t441163\ncastel\t441164\n遗爱湖\t441165\n扬中人才网\t441166\n插马\t441167\nHeathrow\t441168\n北京市朝阳外国语学校\t441169\n中华人民共和国大事记\t441170\nCOMMIT\t441171\nJpeg\t441172\n集结号游戏中心\t441173\n陈泽宇\t441174\n嘉实多\t441175\nding\t441176\n颍\t441177\n键字\t441178\nNRF24L01\t441179\n强国梦\t441180\nile\t441181\n污龟\t441182\n奇轮\t4411
83\n杭集镇\t441184\n悉尼海港大桥\t441185\nloong\t441186\n★\t441187\n国家海洋环境预报中心\t441188\n充分利用\t441189\n青山区\t441190\n林静\t441191\n给我看\t441192\n北大港\t441193\n羊杂碎\t441194\nabr\t441195\n棕榈湾\t441196\n李计忠\t441197\nXBB\t441198\n广东实验中学附属天河学校\t441199\n北京市第三十五中学\t441200\nSketchU\t441201\n太华北路\t441202\n大趋势\t441203\nvincizhang\t441204\ncancer\t441205\n废船\t441206\nshocks\t441207\n88.com\t441208\n贺喜\t441209\nucgui\t441210\n漫卷\t441211\n县上\t441212\n曦城\t441213\n帝豪gs\t441214\n女爵\t441215\n卵生动物\t441216\n验车\t441217\ntraits\t441218\n东南枝\t441219\n顾景深\t441220\n龙珠:超宇宙2\t441221\n北园路\t441222\n凌天传说\t441223\nspa\t441224\n批量重命名\t441225\n大堰镇\t441226\n接受性\t441227\n欢心\t441228\n葵水\t441229\n月结\t441230\n内容源\t441231\n提案\t441232\nHis\t441233\n宁波机场\t441234\n北京东直门医院\t441235\n8席\t441236\n澳洋顺昌\t441237\n东阳市\t441238\nbandzip\t441239\n四强\t441240\n斐丝丽\t441241\n礼品券\t441242\n前线\t441243\nsnt\t441244\n上海硅酸盐研究所\t441245\n武汉总医院\t441246\n爆插\t441247\n拉斯维加斯\t441248\n刘凌\t441249\n实验站\t441250\n串标\t441251\n紫荆网\t441252\n过敏科\t441253\n火雨\t441254\n蒋天生\t441255\nassetbundle\t441256\n看车\t441257\n江苏紫金农村商业银行股份有限公司\t441258\nReducing\t441259\n省去\t441260\n竞演\t441261\n鼠绘漫画网\t441262\n2673\t441263\n走动式\t441264\n比特精灵\t441265\n小手\t441266\n词海\t441267\nLace\t441268\n含光\t441269\n10.2_\t441270\n持有者\t441271\ncontrary\t441272\n姚华\t441273\n中国船舶工业集团\t441274\n焦作大学\t441275\n紫杉\t441276\nTau\t441277\nIntellectual\t441278\n香磷\t441279\n贝得\t441280\n李易乔丹\t441281\n齐静\t441282\n高收入\t441283\n复旦微电子\t441284\n亚林西\t441285\nkitty主题公园\t441286\n菲龙网\t441287\n无药可救\t441288\nK620\t441289\n3册\t441290\n霎哈嘉\t441291\n综合格斗\t441292\n衬套\t441293\n慧静\t441294\n中考版\t441295\n城镇燃气设计规范\t441296\n韩雅\t441297\n汉滨区\t441298\n暴风\t441299\n150cm\t441300\n三星G9350\t441301\n扬州市人民政府\t441302\n眼头\t441303\n蓬江区\t441304\n文水县\t441305\n法不责众\t441306\n80b\t441307\n百度在线网络技术(北京)有限公司\t441308\n青吉鬼\t441309\nsortid\t441310\n枋湖\t441311\n一二五\t441312\n柔焦\t441313\n横洞\t441314\n登特\t441315\n2018年1月12日\t441316\n南国早报\t441317\n积极\t441318\nnachi\t441319\n8cm\t441320\n命门穴\t441321\nLivescore\t441322\n福建教育学院\t441323\nQoS\t441324\n惊驾路\t441
325\n膨\t441326\n三滴血\t441327\n优良\t441328\n256\t441329\n传功\t441330\n消失的夜晚\t441331\nCLASSPATH\t441332\n速达\t441333\n民心者\t441334\n2010年3月\t441335\nMuses\t441336\nimba吧\t441337\n梧桐岛\t441338\n南极\t441339\n辛香\t441340\n深海迷航极光号\t441341\n如履\t441342\n日光性皮炎\t441343\n疗养费\t441344\n方锦龙\t441345\nK28\t441346\nSATA2.0\t441347\nMario\t441348\n这款\t441349\n枪子\t441350\n1.6.8\t441351\n拿破仑·希尔\t441352\n芦名尤莉亚\t441353\n邹超\t441354\n黄敬\t441355\n李文琦\t441356\n郑俊英\t441357\n裸辞\t441358\n申威\t441359\n石木\t441360\n空气\t441361\n三轮电动车\t441362\n山东新闻联播\t441363\ndvdrip\t441364\n95588\t441365\n达沃斯论坛\t441366\n丰腴\t441367\n有机玻璃\t441368\n丹丹\t441369\n000693\t441370\n百变魔尺\t441371\n放到\t441372\nwps幻灯片\t441373\n西充县\t441374\n划圈\t441375\n魏蜀吴三国\t441376\n全民通\t441377\n许多多\t441378\n皇帝圣印战记\t441379\nDVD转让\t441380\n君主立宪\t441381\n切片\t441382\n轴点\t441383\n王牌对王牌\t441384\n工具框\t441385\n专段\t441386\n这座城\t441387\n广元火车站\t441388\n光\t441389\n吐下泻\t441390\n每3秒\t441391\n填负\t441392\n关于实施中华优秀传统文化传承发展工程的意见\t441393\n北迈汽配网\t441394\n行城市\t441395\n快速连接器\t441396\n弱精症\t441397\n环宇\t441398\n国家知识产权局专利局\t441399\n雁栖湖国际会展中心\t441400\n新浪河南财经_新浪\t441401\n核反应堆\t441402\n商务站\t441403\n斗罗大陆漫画全集\t441404\n基期\t441405\n麦饼\t441406\n行政复议法\t441407\n李增瑞\t441408\nQiu\t441409\n多少时\t441410\nitunes64\t441411\n云南电网\t441412\n掉头发\t441413\n左向\t441414\n中国质检报刊社\t441415\nwindous10\t441416\n照影\t441417\n玉山广场\t441418\n六龄齿\t441419\n闻王昌龄左迁龙标遥有此寄\t441420\n龚鹏程\t441421\n华网\t441422\nKUNDUN1069.com\t441423\n石崇\t441424\nJSTL\t441425\n海沙\t441426\n张丞相\t441427\n正恒\t441428\n综述类\t441429\nimageloader\t441430\n遗训\t441431\n超声波加湿器\t441432\n金门县\t441433\n百度云rmvb\t441434\n军纪\t441435\n上海工商局\t441436\n25匹\t441437\n漫画全集\t441438\n八哥鸟\t441439\nxiaojie\t441440\n隔开\t441441\n九曲花街\t441442\n诺诺网\t441443\n热敏票据打印机\t441444\nintercept\t441445\n15秒\t441446\n私处\t441447\ndior\t441448\n牧牧\t441449\n1448\t441450\n习信\t441451\n_孔夫子旧书网\t441452\n吉隆\t441453\nSOD蜜\t441454\n羊乳\t441455\nmifare\t441456\n47路\t441457\n换向器\t441458\n北京儿研所\t441459\n卓达\t441460\n抽提\t441461\n铝包\t441462\n骏达\t441463\n温州大学瓯江学院\t441464\n各季\t441465\n窝窝QQ网\t441466\n波士顿马
拉松\t441467\n李斯羽\t441468\n自己人\t441469\n火影忍者纲手\t441470\n胎纹\t441471\n百度技术\t441472\n弯弯\t441473\n落榜生\t441474\nmhol\t441475\n定远舰\t441476\n焦作市\t441477\n拓达\t441478\n苏中\t441479\nplanets\t441480\n大象\t441481\n资环学院\t441482\n建筑面\t441483\n利落\t441484\n绷缝机\t441485\n太搞笑\t441486\n0990\t441487\n申彗星\t441488\n阿里源\t441489\n恶贯满盈\t441490\nluence\t441491\n骑行券\t441492\n抓\t441493\n八极拳\t441494\n氧化厂\t441495\n1500x\t441496\nincluded\t441497\n好把式\t441498\n新疆生产建设兵团第十三师\t441499\n083\t441500\nSWD\t441501\n和桥镇\t441502\n山雨欲来风满楼\t441503\n绿软\t441504\n边机\t441505\n2018040\t441506\n9.01\t441507\n样题\t441508\n3218\t441509\n卓众\t441510\n肌电\t441511\n调教\t441512\n开喉剑喷雾剂\t441513\ncharacteristics\t441514\n防蹭网\t441515\nELITEBOOK\t441516\nBoostrap\t441517\n3dm游戏网\t441518\n河务局\t441519\nWheels\t441520\n泰康\t441521\n拼搏\t441522\n富贵树\t441523\n环球港\t441524\n8GB运存+骁龙845\t441525\n十二分之一\t441526\n邯郸市\t441527\n大驾\t441528\n洪恩点读笔\t441529\n第39页\t441530\n125g\t441531\n兄妹俩\t441532\n昊华\t441533\nav公司\t441534\n烟台工程职业技术学院\t441535\n人文学\t441536\n颜语\t441537\nmacroeconomics\t441538\n演示者\t441539\n唐某\t441540\n瑞丽诗\t441541\nLou\t441542\n我的身体\t441543\n3gpp\t441544\n中央候补委员\t441545\n8583\t441546\nQi_Yuan\t441547\n商派\t441548\n束河古镇\t441549\n范小青\t441550\n苏染染追夫记\t441551\nhc05\t441552\n新京\t441553\n喷砂\t441554\n台湾政治大学\t441555\n沙堤机场\t441556\n唯音悦\t441557\n荷鲁斯\t441558\n码制\t441559\nESCAPE\t441560\nRISK\t441561\n原则性\t441562\nfiy\t441563\n美了美\t441564\n树莓派Raspberry\t441565\n和田籽料\t441566\ntouchpad\t441567\n螺丝钉\t441568\n童诗白\t441569\n深圳农业银行\t441570\n中科院海洋所\t441571\n╣\t441572\nvincent\t441573\n武藤\t441574\n建滔广场\t441575\n瑞恒\t441576\n食药总局\t441577\n谢邂\t441578\n俯仰角\t441579\n绿化率\t441580\n中弘股份\t441581\nvs2008\t441582\n1+N\t441583\nmona\t441584\n苏州市统计局\t441585\n1113\t441586\n乙醛脱氢酶\t441587\n王大鹏\t441588\n残响\t441589\n哈希\t441590\nhector\t441591\n福鼎新闻网\t441592\n中国书法网\t441593\n00005\t441594\n少年壮志不言愁\t441595\n剑雨\t441596\n紫轩\t441597\n竹笼\t441598\nkeil4\t441599\nqq留言板\t441600\nrobotics\t441601\n大世\t441602\n爪哇岛\t441603\n1.7倍\t441604\n幻彩\t441605\n300亩\t441606\nAscension\t441607\n第四卷\t44
1608\nEUROPE\t441609\n园林史\t441610\nseries3\t441611\nHosting\t441612\n6把\t441613\n大都会\t441614\n1万平方米\t441615\n出疹\t441616\n运粮\t441617\n幻想草草传\t441618\n茯砖茶\t441619\n蓬莱路\t441620\n张家港\t441621\nYJV22\t441622\nITMO爱萌游戏网\t441623\n英码\t441624\n四门\t441625\nWorth\t441626\n李刚\t441627\n纸垫\t441628\nkmo\t441629\n勝\t441630\n夸值网\t441631\ncreator\t441632\n成都体育中心\t441633\n格日勒\t441634\n清版\t441635\nstarve\t441636\n芜湖县\t441637\n醋\t441638\n樊华\t441639\n塞斯\t441640\n最高限价\t441641\n自攻螺丝\t441642\nmath\t441643\n未转变者3.0\t441644\n白华\t441645\n窄路\t441646\n腾讯集团\t441647\nvolunteers\t441648\npinterest\t441649\n荣庆\t441650\n彩宝\t441651\n铁色\t441652\n碧天\t441653\n巢湖路\t441654\nnct127\t441655\n噪声\t441656\n树\t441657\n分拣员\t441658\n侣行\t441659\n老牛\t441660\n黏着\t441661\nwhilst\t441662\n十一届\t441663\n逍遥乐\t441664\nIPv6\t441665\n蒋经国\t441666\n有机太阳能电池\t441667\n93万\t441668\n超级基因\t441669\nHolidays\t441670\n任务链\t441671\n套缸\t441672\nollydbg\t441673\n商业街\t441674\n东方影都\t441675\n新龙城\t441676\n金榜\t441677\n碎肉\t441678\n晚上1点\t441679\n这么长\t441680\n世界之战\t441681\n上自\t441682\n金丝楠\t441683\nFilecoin\t441684\n软件名\t441685\n狂犬病疫苗\t441686\n长江证券股份有限公司\t441687\njmter\t441688\n维生素c片\t441689\n黑龙江新闻网\t441690\nuitabbar\t441691\n家具品\t441692\n羞涩\t441693\n九千多\t441694\n琅琊区\t441695\n江西省交通运输厅\t441696\n上海绿地\t441697\nfcc认证\t441698\n休闲版\t441699\n20150618\t441700\nsles\t441701\n全商网\t441702\n瑾\t441703\n威卡威\t441704\nvjoy\t441705\n汇算清缴申报表\t441706\n公众聚集场所\t441707\n致胜\t441708\n药剂学\t441709\n逃亡者\t441710\n刘亦蔡康永\t441711\n捷泰\t441712\n大家电\t441713\n科素亚\t441714\n少女\t441715\n地下水资源\t441716\n溪蟹\t441717\n新蓝\t441718\n二型糖尿病\t441719\ndaemon\t441720\n身心\t441721\nRAPOO\t441722\n冗余\t441723\n皇道\t441724\n梁建勇\t441725\n利玛窦\t441726\n不屈\t441727\n铁榔头\t441728\n刘方平\t441729\n鸣鹤古镇\t441730\n宋暖\t441731\n汉秀剧场\t441732\n崇真艺客\t441733\n恒茂\t441734\n营改增\t441735\n去年10月份\t441736\n鲜生\t441737\n得罪\t441738\n听爱\t441739\n安阳东\t441740\n礼堂\t441741\n迅雷720p\t441742\n云厅\t441743\n观月雏乃\t441744\n集抄\t441745\n职业道德行为处理办法\t441746\n金桥梁\t441747\nnaked\t441748\n中非发展基金\t441749\n5037\t441750\n东方财富网\t441751\neasyui\t441752\n宝片\t4
41753\n民航大厦\t441754\n诺拉\t441755\n虾蟹\t441756\n陈强\t441757\n风险管理\t441758\n3000年前\t441759\n魔笛\t441760\n会计之窗\t441761\n农作\t441762\n中关村国家自主创新示范区\t441763\n南宁住房公积金管理中心区直分中心\t441764\n监察处\t441765\n衣女\t441766\n小冰冰\t441767\n炝锅面\t441768\n绝对不会\t441769\n进度\t441770\nHongKong\t441771\n今年清明\t441772\n遮光垫\t441773\n薛绍\t441774\nShah\t441775\n万氏\t441776\nrx550\t441777\n欧宝\t441778\nmovist\t441779\n刑虐\t441780\nROMS\t441781\n涨价\t441782\n金华市政府\t441783\n常凯申\t441784\n贯标认证\t441785\n莆田网\t441786\n1283号\t441787\n本自\t441788\n30套\t441789\n金字塔型\t441790\nlunli\t441791\ncoo\t441792\njeff\t441793\n麦理浩径\t441794\n上海黄金交易所\t441795\n简约式\t441796\n提任\t441797\n皮膜\t441798\n巷陌\t441799\n八里台\t441800\n寂\t441801\n比如说\t441802\n北大法宝\t441803\nitms-services\t441804\n第117章\t441805\n三十铺\t441806\n_哈尔滨\t441807\nCIE\t441808\n紫薯粥\t441809\n彩石\t441810\n140平方\t441811\n井矿集团\t441812\n念珠\t441813\nVainglory\t441814\n正确度\t441815\nx51\t441816\nqts\t441817\n布雷迪\t441818\n骨相\t441819\n枞\t441820\n如法\t441821\njkd\t441822\n集宁师范学院\t441823\n阿怪\t441824\n开发区\t441825\n物系\t441826\n4月23日\t441827\n光催化剂\t441828\n精选辑\t441829\ngm版\t441830\n雷神st\t441831\nFOCUS\t441832\n福东\t441833\n红岸\t441834\n91KK\t441835\ngen2\t441836\n巨乳学院\t441837\n汉元素\t441838\n北京五洲妇儿医院\t441839\n苍白无力\t441840\nUL认证\t441841\nMeters\t441842\n扩频\t441843\n上海龙之队\t441844\n脱漆剂\t441845\n52名\t441846\n清凉菩提\t441847\n青岛方特梦幻王国\t441848\n学习性\t441849\n新华保险\t441850\n老郭相声网\t441851\nfne\t441852\n暗街\t441853\nLBE\t441854\n脱模\t441855\nBF2\t441856\n说文\t441857\n交错\t441858\n万达文华酒店\t441859\n计算机图形学\t441860\n李德裕\t441861\n神武3佛门\t441862\nJoins\t441863\n铝酸钙粉\t441864\n私人博物馆\t441865\n蒙特祖玛\t441866\n歌斐\t441867\n种植场\t441868\n热岛\t441869\n阿楠\t441870\nunhealthy\t441871\nlsmod\t441872\n5437\t441873\n奇幻世界\t441874\n西安智康1对1\t441875\nplatform-tools\t441876\n中州\t441877\n人和街小学\t441878\n专务\t441879\n河上镇\t441880\n边缘人\t441881\n浙江站\t441882\n乙炔气瓶\t441883\n好几秒\t441884\n练案\t441885\n阳江职业技术学院\t441886\n磁体\t441887\n分级机\t441888\n示范型\t441889\n战无不胜\t441890\ntrsmk2\t441891\n速腾1.6L\t441892\n方法篇\t441893\n紫金大厦\t441894\n50型\t441895\n沈飞\t441896\nbrightne
ss\t441897\n上海电视大学\t441898\n羟苯磺酸钙\t441899\n神舟国旅\t441900\n800万吨\t441901\n336\t441902\n葬经\t441903\n大石湖\t441904\nV1.5\t441905\n拖线\t441906\n点染\t441907\n食品科学与工程学院\t441908\n自然地理学\t441909\n中国馆\t441910\n埃米纳姆\t441911\n小刀娱乐\t441912\ncer\t441913\n阴阳宅\t441914\n虹桥世界中心\t441915\n7d2\t441916\n中铁二十四局集团有限公司\t441917\nipone5s\t441918\n孰能\t441919\nDem\t441920\n轻舞飞扬\t441921\n祁黄羊\t441922\nDodo\t441923\n长庆油田\t441924\np值\t441925\nApplewatch\t441926\n鬼差\t441927\n泥溪镇\t441928\npixabay\t441929\n郡守\t441930\n浦东国际机场\t441931\ncmct\t441932\n静息\t441933\nnav-tabs\t441934\n157家\t441935\n开餐\t441936\n三菱帕杰罗\t441937\nel-menu\t441938\n小傅\t441939\n恩情\t441940\n切断阀\t441941\n标致308s\t441942\nsolomon\t441943\n送丝机\t441944\n堆放\t441945\n佛门网\t441946\n改命\t441947\n600页\t441948\n太阴星\t441949\n龙珠格斗\t441950\n吉普212\t441951\n熹妃Q传\t441952\n宋高宗\t441953\n迦叶\t441954\n社会工程学\t441955\n横拉杆\t441956\nstruggling\t441957\n720集\t441958\n铁流\t441959\n青口\t441960\n&#8226\t441961\n实列\t441962\n荣辱观\t441963\n圣辉\t441964\n电人\t441965\n长江三峡大坝\t441966\n罗茨\t441967\n0221\t441968\nVUEX\t441969\n苯佐卡\t441970\n水冷散热器\t441971\n得出\t441972\n水府庙\t441973\n布花\t441974\n湘美\t441975\nicp经营许可证\t441976\n任帅\t441977\n王野\t441978\n6151\t441979\n最动听\t441980\n黄鸡\t441981\n前3月\t441982\n开修\t441983\n六度\t441984\n令人\t441985\n魅蓝max\t441986\nresort\t441987\n暗黑破坏神2修改器\t441988\n玫瑰小镇\t441989\n贝雷片\t441990\n乌衣\t441991\n上海市平和双语学校\t441992\n4412\t441993\n微淘达人\t441994\n龙场\t441995\n郑和宝船\t441996\n没有爱\t441997\nlakes\t441998\n嬴驷\t441999\n上海医药集团\t442000\n庞太师\t442001\n20150305\t442002\n成都西站\t442003\n捆扎绳\t442004\n加拿大\t442005\n雅昌画廊网\t442006\n2.0mm\t442007\n三网融合\t442008\n闵开慧\t442009\n软件所\t442010\n西宁市委\t442011\n20160516\t442012\nRas\t442013\n中晶扫描仪\t442014\n罗汉果茶\t442015\n中共中央党史研究室\t442016\n孙培博\t442017\neffort\t442018\n天理难容\t442019\nalways\t442020\nProf\t442021\n开曼群岛公司\t442022\ncvmat\t442023\n双边丝护栏网\t442024\n长春中医药大学附属医院\t442025\n2017年三月\t442026\n边缘处\t442027\n李晓红\t442028\nautowired\t442029\n均衡器\t442030\n前前后后\t442031\n触不可及\t442032\n导出来\t442033\n20多万元\t442034\n三九胃泰\t442035\nkepware\t442036\n郭术生\t442037\n中远
\t442038\nbable\t442039\nLinux防火墙\t442040\n中国科学网\t442041\n绝响\t442042\nROOT_\t442043\n长江头\t442044\n六十周年\t442045\n盘柜\t442046\n赤壁之战\t442047\n公安部\t442048\n胯骨\t442049\nKQ\t442050\n孟德斯鸠\t442051\n班牛\t442052\n林晨\t442053\n5.5寸\t442054\n601989\t442055\n套印\t442056\n聚氨酯树脂\t442057\n1411\t442058\n热钱\t442059\n高特\t442060\n同志社大学\t442061\nBerger\t442062\ncedit\t442063\n风湿类\t442064\n12名\t442065\n巴布\t442066\n棉兰\t442067\n郭进\t442068\n增幅券\t442069\n根须\t442070\n每步\t442071\n2017年上\t442072\n栖息\t442073\n王者荣耀梦奇\t442074\n来蓉\t442075\nhorror\t442076\nSendmail\t442077\n侵犯\t442078\n圈口\t442079\n中华人民共和国国家外国专家局\t442080\n相性\t442081\n职业病危害控制\t442082\n三星酒店\t442083\n蝴蝶犬\t442084\nLeaks\t442085\n回式\t442086\npycaffe\t442087\n扣回\t442088\n设计费\t442089\n裸台\t442090\nfaded\t442091\n则\t442092\n猴儿\t442093\n沥青搅拌站\t442094\n删前\t442095\n重定位\t442096\n四十分钟\t442097\njshoppers\t442098\nORIGINAL\t442099\n潮流志\t442100\n左腹\t442101\nuuid\t442102\n星点值\t442103\n资金量\t442104\n蚂蚁S9\t442105\nJony\t442106\n童言童语\t442107\n广东省科技厅\t442108\nBDT\t442109\n魏紫\t442110\n无乳链球菌\t442111\nartioscad\t442112\n展区\t442113\n脱痂\t442114\nTyrant\t442115\nGage\t442116\n背调\t442117\n杭绍\t442118\n17路\t442119\n卢泰愚\t442120\n龙阳袖舞\t442121\n中国足踩馆\t442122\n545\t442123\n一樽\t442124\n爱丁\t442125\nVGGNet\t442126\n倪好\t442127\n男式\t442128\n长期\t442129\n3d溜溜\t442130\nexc\t442131\n看图猜成语\t442132\n九州钱币收藏网\t442133\n狗嬲\t442134\n电子展\t442135\n巴厘龙虾\t442136\nLogitech罗技\t442137\n孙浩然\t442138\n保安过滤器\t442139\n底盖\t442140\nAndroid\t442141\n株洲人才网\t442142\n无理取闹\t442143\n备感\t442144\n血酸\t442145\n前瞻网\t442146\nDOA\t442147\n跳变\t442148\n墨舞碧歌\t442149\n不测\t442150\n600036\t442151\nInstructional\t442152\n早乙女ルイ\t442153\n亿隆\t442154\ngtx550ti\t442155\n0576\t442156\n丈母娘\t442157\n201406\t442158\n瓮安\t442159\nvcruntime140dll\t442160\n西南大学荣昌校区\t442161\nFloss\t442162\n贱货\t442163\n某某某\t442164\n楼梯面\t442165\n16Mn\t442166\n养家糊口\t442167\nfd\t442168\n二十多年前\t442169\n西安市中级人民法院\t442170\n花园路街道\t442171\n8.0.6.303\t442172\n淡菜\t442173\n苏一\t442174\n边境战争\t442175\n电动阀门\t442176\n爱诺\t442177\nJessica程序猿\t442178\n河北港口集团\t442179\njili
\t442180\n义乌工商职业技术学院\t442181\n命运之石\t442182\nBelief\t442183\n吕四\t442184\n坐电梯\t442185\n挂轴\t442186\n武汉市财政局\t442187\n昭陵\t442188\n状态观测器\t442189\n花俏\t442190\nwin8/8.1\t442191\n杜高犬\t442192\n宠物威驰\t442193\n犀流\t442194\nmenuconfig\t442195\nV30\t442196\n工程伦理学\t442197\n捷键\t442198\n砂浆泵\t442199\n中饭\t442200\n真冰溜\t442201\n华誉\t442202\n添加器\t442203\n安德镇\t442204\n王尽美\t442205\n峨眉山月歌\t442206\n东方普罗旺斯\t442207\n丁墨\t442208\n源凯\t442209\n燕麦片\t442210\n水博\t442211\n轭\t442212\n独石\t442213\n后庭\t442214\nGTD\t442215\n庆熹\t442216\n鸡毛信\t442217\nresidence\t442218\n昏鸦\t442219\n注册城市规划师考试\t442220\n放放\t442221\nSeen\t442222\n882\t442223\n举报电话号码\t442224\n填补\t442225\n奕奕\t442226\n四川省人力资源和社会保障厅\t442227\nRION\t442228\n代售点\t442229\n锐派CS\t442230\nE语言\t442231\n诺丝\t442232\n下女\t442233\n李家沟\t442234\n剑风传奇\t442235\n筑梦计划\t442236\n正篇\t442237\n海威特\t442238\n山西省质监局\t442239\n石门新闻网\t442240\nAVI\t442241\n红管\t442242\n六百多\t442243\n黄勇\t442244\n徐霞客游记\t442245\n新党\t442246\n开阳\t442247\n小美好\t442248\nwild\t442249\n素敏\t442250\nNSData\t442251\nglx\t442252\n撤柜\t442253\n郑州市纪委\t442254\n双庙乡\t442255\nCAPITAL\t442256\n李采潭\t442257\n开外\t442258\n东石\t442259\n20170422\t442260\n花盒\t442261\n青年文摘\t442262\n电源\t442263\n混音版\t442264\n仮面ライダ\t442265\nmoony\t442266\n子控件\t442267\n武气\t442268\n钓鱼服\t442269\nLEICA\t442270\n中国民用航空局\t442271\n决议书\t442272\n练舞\t442273\njetbrain\t442274\n星光熠熠\t442275\n考试类\t442276\nFarfetch\t442277\npax\t442278\n淫欲大学\t442279\n#if\t442280\n夏枯草\t442281\n渝南\t442282\n招聘信息网\t442283\n诺特兰德\t442284\nwebcollector\t442285\n北一路\t442286\nFF9\t442287\n初心\t442288\n1月30日\t442289\n国税通用机\t442290\nkde\t442291\n工具条\t442292\n邓颖超\t442293\n布莱恩特\t442294\n顶库\t442295\n1000mm\t442296\n179元\t442297\n少数派\t442298\n氯雷\t442299\ngoogleapis\t442300\n那行\t442301\n盛辉物流\t442302\n4月17\t442303\n房县政府网_房县人民政府\t442304\n才智\t442305\n珠江村\t442306\nmichigan\t442307\n住地\t442308\n煤渣\t442309\n龙珠超宇宙2\t442310\n苍月\t442311\n荣海\t442312\nHall\t442313\np600\t442314\n偏硅酸钠\t442315\n登记册\t442316\n梁式非\t442317\njockey\t442318\nKIA\t442319\n射入\t442320\n海带排骨汤\t442321\n蓝洞\t442322\n内含物\t442323\nshebei\t4423
24\n裴多菲\t442325\nLumion\t442326\n和平里街道\t442327\n网络播放器\t442328\n包布\t442329\nTracing\t442330\n下到\t442331\ncomparable\t442332\nSnip\t442333\n青岛恒星学院\t442334\nRoomba\t442335\n城长\t442336\n忍足\t442337\n满城尽带黄金甲\t442338\n亮剑稿\t442339\n鬼爪\t442340\n英雄萨姆\t442341\n孙语赛\t442342\n竞技赛\t442343\n际遇\t442344\n使命召唤7:黑色行动\t442345\n电势差\t442346\n余姚实验学校\t442347\n土地上\t442348\n大长\t442349\n萝卜干\t442350\n耐火板\t442351\n英雄之路\t442352\nairplayer\t442353\n翠华\t442354\n育德\t442355\n王珍珍\t442356\n罗圈腿\t442357\n浙江省文化馆\t442358\n蓟门\t442359\n华发\t442360\nbongo\t442361\n选调\t442362\nWelder\t442363\n批注者\t442364\n朝美惠香\t442365\n残虐\t442366\nX^2\t442367\n海伦春天\t442368\nOBO\t442369\n乐天集团\t442370\nwindows安全中心\t442371\n将要\t442372\nJuicy\t442373\n一小块\t442374\n密克罗尼西亚\t442375\ne展网\t442376\nwin2003server\t442377\n张根硕\t442378\n命令行版\t442379\n速尔快递\t442380\n小辞\t442381\n歌林\t442382\n陈国庆\t442383\naviutl\t442384\n莓果\t442385\n基极\t442386\n大领\t442387\n度哥\t442388\nTyler\t442389\n国家行政机关公文处理办法\t442390\n沙砾\t442391\n主殿\t442392\n新时代中国共产党\t442393\nmerci\t442394\na&#178\t442395\n强制性脊柱炎\t442396\nEt\t442397\n清工\t442398\n爱施德\t442399\nreplaceState\t442400\n上海市质量监督检验技术研究院\t442401\njasmine\t442402\n十分钟内\t442403\n三门\t442404\n诗苑\t442405\nKimbo\t442406\n虚部\t442407\n自考本科毕业论文\t442408\nuefi\t442409\n第二战\t442410\n譬如说\t442411\n200款\t442412\nplaygrounds\t442413\n华筝\t442414\n新朋股份\t442415\n龙港网\t442416\n79岁\t442417\nleading\t442418\n地塞米松注射液\t442419\n单相\t442420\ncolonial\t442421\n黑沙宝典\t442422\n4周岁\t442423\n中国东北\t442424\n爱迪生\t442425\n胡鑫\t442426\nprisoner\t442427\n心系\t442428\n小烧\t442429\n索利达尔\t442430\n灰毛\t442431\n连接式\t442432\n延河\t442433\nNoMachine\t442434\nJoke\t442435\nChat\t442436\nKAWASAKI\t442437\n唐诗鉴赏辞典\t442438\nBusy\t442439\nps滤镜\t442440\n冷_\t442441\n榔榆\t442442\nfoggy\t442443\nSkill\t442444\n鱼饼\t442445\n郑鑫\t442446\n1盒\t442447\nKan\t442448\n第六期\t442449\n沈瑾萱\t442450\npowerline\t442451\n房屋建筑学\t442452\n张堰镇\t442453\n四两拨千斤\t442454\n索爱\t442455\n宝塔石化\t442456\n亿元级\t442457\nsnipaste\t442458\n钟家村\t442459\n朗阁教育\t442460\n飞梦\t442461\n0879\t442462\n赋值函数\t442463\n伦铜\t442464\n顾家北
\t442465\n何赛飞\t442466\n崂山风景区\t442467\n王荣\t442468\n15.88\t442469\n滤头\t442470\n金碧天下\t442471\n称作\t442472\nCPUZ\t442473\n300美元\t442474\n阿尔法星\t442475\n0012\t442476\n20160825\t442477\n章建平\t442478\n美妆相机\t442479\n半封建\t442480\n散料\t442481\nmacs\t442482\n文萃路\t442483\n最合算\t442484\n目鱼\t442485\n集美区\t442486\n胡伟武\t442487\nkings\t442488\n展会\t442489\n胸椎\t442490\n王艺华\t442491\n供血\t442492\ndegraded\t442493\n1949年\t442494\n爱德华\t442495\n中华北路\t442496\nk21\t442497\n勒布朗詹姆斯\t442498\nprolog\t442499\n莫西沙星\t442500\n▊\t442501\nX-Fi\t442502\n福州人才网\t442503\n验光师\t442504\nTuesday\t442505\n孙林\t442506\n眼子\t442507\n撬动\t442508\n童工\t442509\n挑破\t442510\n鱼尾裙\t442511\n华盛顿\t442512\n59级\t442513\n宽幅\t442514\n升白针\t442515\n脱开\t442516\n旋转型\t442517\n刘宏宇\t442518\n王珺\t442519\n杨凌\t442520\nsoundboard\t442521\n注浆机\t442522\nUTT艾泰\t442523\n求告\t442524\n180409\t442525\n平舌音\t442526\n放大机\t442527\n孙承宗\t442528\n好说话\t442529\n苍之涛\t442530\n中国科学院武汉物理与数学研究所\t442531\nWinError\t442532\nNOIP\t442533\n长寿果\t442534\n百家姓\t442535\n杀人狂魔\t442536\n保守性\t442537\n巴雷拉\t442538\n比亚迪s7\t442539\nump9\t442540\n阿尔忒弥斯\t442541\n魑魅\t442542\n卫兵\t442543\n寒山寺\t442544\n亡灵杀手\t442545\n东莞市商务局\t442546\n淫男乱女\t442547\n碧涛阁\t442548\n东古塔\t442549\npants\t442550\n扶绥县\t442551\n衣联网\t442552\npriceless\t442553\n王店镇\t442554\n木兰无长兄\t442555\n网易云音乐播放器\t442556\n会计科\t442557\n白雪皑皑\t442558\nr6220\t442559\n梦幻西游五开\t442560\nTAFE\t442561\n大坑\t442562\nYeh\t442563\n361美图汇\t442564\n国家安全监管总局化学品登记中心\t442565\n泰安市人力资源和社会保障局\t442566\n3.30\t442567\n许嫣娜\t442568\n阀门\t442569\nHail\t442570\n米家飞利浦\t442571\n日本中学\t442572\n电刷\t442573\n蜜语\t442574\nROX\t442575\n极趣社区\t442576\n上客\t442577\nStep7\t442578\n3回\t442579\n世华龙樾\t442580\nCG薇儿论坛\t442581\n四议两公开\t442582\n碗组\t442583\n典当\t442584\n益动\t442585\n氯苯酚\t442586\n苹果macOS\t442587\n永川\t442588\nlayui弹窗\t442589\n云居山\t442590\n宜兴日报\t442591\n果花\t442592\n停车位\t442593\n齐达周韶宁\t442594\nrx470\t442595\n互粉\t442596\n邮柜\t442597\n一片云\t442598\nSunmi\t442599\n太套\t442600\nShellcode\t442601\nYunMan\t442602\n社会工作师\t442603\n张三疯\t442604\n今年3月份\t442605\n捞月狗\t442606\n唯有\t442607\ndoc2vec\t4
42608\n10万条\t442609\n袖章\t442610\nvectors\t442611\n劈开\t442612\n陕西省社会保障局\t442613\n新邵\t442614\n降压片\t442615\n冲孔灌注桩\t442616\nv1.0_安秀网\t442617\n时报金犊奖\t442618\n重庆青年旅行社\t442619\n新史学\t442620\n0DaY5.CoM\t442621\n万家乐\t442622\nspeed\t442623\n畅所欲烟\t442624\n世贸城\t442625\n高格\t442626\n书者\t442627\nndf\t442628\n兰姆\t442629\n招生网\t442630\n羊肠\t442631\ntyler\t442632\n石美\t442633\nCash\t442634\n人民解放军军乐团\t442635\n百田雷人圈\t442636\n幕府将军2武家之殇\t442637\n合肥滨湖\t442638\n记完\t442639\n平安福保险\t442640\n趣店\t442641\n优质\t442642\n叛徒\t442643\n十朝\t442644\n第003章\t442645\ncodebase\t442646\n大屌\t442647\nyeast\t442648\n改期\t442649\n20160212\t442650\n笔电\t442651\n巨神峰\t442652\ngoin\t442653\ndnf卡屏\t442654\n赵燕\t442655\nyouji\t442656\n圣骑士wind\t442657\nambari\t442658\n杨光华\t442659\n拉萨路\t442660\n李舜臣\t442661\n美郡\t442662\n睫毛增长液\t442663\nix\t442664\n北京大学第三医院\t442665\n武义\t442666\n兖矿\t442667\n华航\t442668\n社会保障服务网\t442669\n翼豹\t442670\n520_\t442671\n甘肃省政协\t442672\nvmware-tools\t442673\n304_\t442674\n八尾\t442675\n机号\t442676\n房地产基金\t442677\n无精症\t442678\n刘良华\t442679\n头脑奥林匹克\t442680\nMabel\t442681\n微商节\t442682\n公路处\t442683\n黄粑\t442684\n塑身\t442685\n全键\t442686\n单王\t442687\nJET\t442688\n一叶舟\t442689\ncnblogs\t442690\n黄发\t442691\n朱家明\t442692\n苍之纪\t442693\nキ\t442694\nSERVER数据库\t442695\n定时关机\t442696\n驾考宝典网\t442697\n购物篮\t442698\nAssignment\t442699\n张博文\t442700\n济南小区\t442701\n疤痕增生\t442702\nqq空间吧_\t442703\n泰星\t442704\n福星城\t442705\n万宝镇\t442706\n神盾舰\t442707\nFixtures\t442708\n半岛\t442709\n艾蔻\t442710\n烟丝\t442711\n中信广场\t442712\n声腔\t442713\nAgencies\t442714\n叫好\t442715\n70条\t442716\n砂率\t442717\n养生舞\t442718\n爱君\t442719\ndevicetoken\t442720\n防雨罩\t442721\n耗竭\t442722\n盂唇\t442723\n79页\t442724\n陌拜\t442725\n焊丝\t442726\n航海王启航吧\t442727\n斯大林格勒保卫战\t442728\n变实\t442729\n伯乐\t442730\n名仁\t442731\n联语杂酱面\t442732\n59307265\t442733\n抠脚\t442734\n精灵幻想记\t442735\nVALENTINO\t442736\n刘盼\t442737\n独怜\t442738\n三十个\t442739\n二老\t442740\n切断\t442741\n丁二酸酐\t442742\n美锦\t442743\n液袋\t442744\n汉臣\t442745\n日麻\t442746\nZard\t442747\n丹寨县\t442748\n泰康人寿\t442749\n刺杀\t442750\n高年\t442751\n逍遥小散仙\t442752\
nCocoaPods\t442753\nmAP\t442754\nnetbox\t442755\n高标\t442756\n邢台教育局\t442757\nsuan\t442758\nsustaina\t442759\n逍遥新宋\t442760\n盐雾箱\t442761\n饶子豪\t442762\n李小璐\t442763\n高虹\t442764\nhaskell\t442765\novarian\t442766\n天津眼科医院\t442767\n经验条\t442768\n重庆烤鱼\t442769\n比斯\t442770\n尼雅\t442771\n滚动式\t442772\n安莉洁\t442773\n20151027\t442774\neacharts\t442775\n安徽天欧进出口有限公司\t442776\n上色\t442777\n雄塑\t442778\n睢宁新闻网\t442779\n杨培安\t442780\n86平米\t442781\n电力管\t442782\n吴甘沙\t442783\nシステム\t442784\n诫命\t442785\np43\t442786\n到擒来\t442787\n泪人\t442788\n一路青云\t442789\n红钻\t442790\n禁处\t442791\n壶中\t442792\nworkload\t442793\n红利群\t442794\n通泰路\t442795\n全运村\t442796\n中国传动网\t442797\n飞禽走兽\t442798\nprophet\t442799\n大口\t442800\n南京工商局\t442801\n樱之岛\t442802\n王玉兰\t442803\n2016-07\t442804\n单边\t442805\n知网数据库\t442806\n查理大帝\t442807\n流动负债\t442808\n雍正王朝\t442809\n24kg\t442810\nGo优盘\t442811\n新视野大学英语4\t442812\n弹跳力\t442813\n美利达\t442814\n鸟蛋\t442815\nlatern\t442816\n私油\t442817\nOlder\t442818\n紫峰\t442819\n日终\t442820\n攻略集\t442821\n主播们\t442822\n50多岁\t442823\n东莞港\t442824\n中意\t442825\n社区养老\t442826\n第98章\t442827\n征税率\t442828\n第三章\t442829\n黑莓桌面管理器\t442830\n铜印\t442831\nx1c\t442832\n28平米\t442833\n新风貌\t442834\n复印纸\t442835\n线角\t442836\n20160523\t442837\n外面\t442838\n天冬氨酸氨基转移酶\t442839\n郭守敬\t442840\n7.8分\t442841\n免责声明书\t442842\nDemographics\t442843\n文化创意产业\t442844\n入党积极分子思想汇报\t442845\n文本流\t442846\n番禺区人民政府\t442847\n丑\t442848\n爱了很久的朋友\t442849\n电荷量\t442850\n道面\t442851\n篷\t442852\n三宜\t442853\n焕新\t442854\n大众高尔夫7\t442855\n上海新国际博览中心\t442856\n一条虫\t442857\n脉冲治疗仪\t442858\n书画院\t442859\n论据\t442860\n田静\t442861\nsort排序\t442862\nxlplayer\t442863\n赔礼道歉\t442864\n月儿爱海天\t442865\n备勤\t442866\nOSDN\t442867\n五鬼\t442868\n翠岛花城\t442869\n众托帮\t442870\nrel\t442871\n湖畔大学\t442872\nkmx\t442873\ndocp\t442874\n白乾涛\t442875\n木簪\t442876\nWord2Vec\t442877\n打造具\t442878\n面型\t442879\n房地产经纪人考试\t442880\n廖常初\t442881\n万花\t442882\n王凯\t442883\n名闻\t442884\nannd\t442885\n电信4G+\t442886\n王玉荣\t442887\n_康\t442888\n上报\t442889\nb^2\t442890\n三元催化器\t442891\n议论性\t442892\n无法逃脱\t442893\n老美\t442894\n天象\t442895\n阿酱
\t442896\n乱清聊斋志异\t442897\n釜山机场\t442898\n驱动类\t442899\n非转农\t442900\n2014年08月20日\t442901\n海棠花溪\t442902\n中颖电子\t442903\n作家库\t442904\n网页游戏平台\t442905\n受冷落\t442906\n蓝总\t442907\na-1\t442908\n重命名\t442909\n浓水\t442910\n78天\t442911\n装订线\t442912\n新妹魔王的契约者\t442913\n骨骺线\t442914\n排排宝\t442915\n杉杉奥特莱斯广场\t442916\nhooray\t442917\n大六壬\t442918\n金花股份\t442919\n洪承畴\t442920\n黄金芽\t442921\ndocky\t442922\n政治化\t442923\n小计\t442924\n送达率\t442925\n自由战争\t442926\n化脓性脑膜炎\t442927\nBypass\t442928\n上海波特曼丽思卡尔顿酒店\t442929\n家族谱\t442930\n暴血\t442931\nnx10\t442932\n陈皮阿四\t442933\n十品\t442934\n皮线\t442935\n景式\t442936\n168地产网\t442937\n新字\t442938\n戏谱\t442939\n铁勺\t442940\n江南style\t442941\nKFR-72LW/\t442942\n冰龙版\t442943\n龄期\t442944\n万科社区\t442945\n华美达\t442946\nxii\t442947\nnls\t442948\n劳拉西泮片\t442949\n厚生\t442950\n2018.1.8\t442951\nmetaclass\t442952\n绝对伏特加\t442953\ncmc\t442954\n中国科学院长春光学精密机械与物理研究所\t442955\n尤莉亚\t442956\ngoole浏览器\t442957\n布兰妮\t442958\n文水\t442959\nfineui\t442960\n社会活动\t442961\n外用\t442962\n虚空恐惧\t442963\n张学锋\t442964\n碧色\t442965\n日式rpg\t442966\n156集\t442967\nbim\t442968\n一代人\t442969\n电加热带\t442970\n撸撸色\t442971\nspun\t442972\n瓜蒌\t442973\n好想死\t442974\n店展\t442975\n博取\t442976\n狂暴战\t442977\n铂臻\t442978\npchome\t442979\nsolidworks2010\t442980\n三园\t442981\n椿木\t442982\n601166\t442983\nWWE2K17\t442984\n长城vv7\t442985\n难闻\t442986\nQTL\t442987\n好兄弟\t442988\n日式烧肉\t442989\n彭镇\t442990\n锦州站\t442991\n浓缩机\t442992\n菲诗小铺\t442993\n具惠善\t442994\n绿地大道\t442995\n瞬息万变\t442996\n销售人\t442997\nfoxpro\t442998\n圆桶\t442999\n换页\t443000\n11.00\t443001\nzhchoutai\t443002\n酒剑仙\t443003\n沙塘鳢\t443004\nANSYS\t443005\n碎股\t443006\n默认端口号\t443007\n热疗\t443008\nregularization\t443009\n陈志鹏\t443010\n断骨增高\t443011\nTexturePacker\t443012\n达拉斯\t443013\n飞翼\t443014\n重庆大学法学院\t443015\n插曲\t443016\n12800元\t443017\n新疆网\t443018\n成都地区\t443019\n白泉\t443020\n锦江都城酒店\t443021\n金浦钛业\t443022\n涌向\t443023\n第019章\t443024\n现金流\t443025\nJudgment\t443026\n小娇妻\t443027\n600分\t443028\n天行健君子\t443029\nRZJY\t443030\nund\t443031\n假币\t443032\n滨水公园\t443033\n塞港\t443034\nswr\t443035\n北京电影学院研究生院\t44303
6\n后金\t443037\n入港\t443038\n浙江三科电气有限公司\t443039\n得知\t443040\n企翼网\t443041\n稍候\t443042\n平阳县政府\t443043\n嘉善\t443044\ne4a\t443045\n尚水\t443046\n万通药业\t443047\ni+\t443048\n19P\t443049\n安科瑞\t443050\n羊亭镇\t443051\nYYModel\t443052\n文员\t443053\n子洲\t443054\n不要再说\t443055\naibo\t443056\n上海立信会计学院\t443057\n三明治\t443058\nserver2012r2\t443059\ncfda\t443060\n死飞\t443061\n吉利领克\t443062\n玉种\t443063\nammo\t443064\n苏州诚品书店\t443065\n插板阀\t443066\n科技部\t443067\n绿巨人\t443068\nliangww\t443069\nSolidot\t443070\n50集\t443071\n奶茶\t443072\n道路\t443073\n菲洛\t443074\n宋老虎\t443075\n现代戏\t443076\n2204号\t443077\n手花\t443078\nFragments\t443079\n洗衣粉\t443080\nmdx\t443081\n冰川时代4\t443082\n赣南林业网\t443083\n增氧机\t443084\n封路\t443085\n格列卫\t443086\n罗素兄弟\t443087\n绿都澜湾\t443088\nShoulder\t443089\n未来15天\t443090\n虐待罪\t443091\nnuget\t443092\nzWìng\t443093\n淡蓝网\t443094\nC100\t443095\n碱味\t443096\n重庆法院\t443097\n普陀山机场\t443098\n近日\t443099\n圣节\t443100\n我们的世界\t443101\n信托公司\t443102\n米罗\t443103\n1一个\t443104\n16亿\t443105\n马伊\t443106\n小米笔记本吧\t443107\n七八十年代\t443108\n尼卡\t443109\n伊利优酸乳\t443110\n深圳能源集团\t443111\n张铭浩\t443112\n阿凯\t443113\n白饭\t443114\n省图书馆\t443115\n猪肉松\t443116\npulled\t443117\nPH值\t443118\nNoneType\t443119\n光丝\t443120\n魏亮\t443121\nmaidj\t443122\nwecn\t443123\n金万维\t443124\nstyl\t443125\ndy2018.com\t443126\n钢花\t443127\n泠\t443128\n古墓丽影崛起\t443129\n浙江路\t443130\n纵筋\t443131\nGhetto\t443132\n岂\t443133\n袖口\t443134\n拉然\t443135\n反调\t443136\nLD_Joy\t443137\n七九\t443138\n游讯\t443139\n洛奇英雄传吧\t443140\n守墓\t443141\n反诉状\t443142\n肢解\t443143\n制冷快报\t443144\n蚁巢\t443145\n陆基\t443146\n墓人\t443147\n极境\t443148\n升国旗\t443149\n简易方程\t443150\n谊\t443151\n阿市\t443152\n马小乐\t443153\n李连成\t443154\nipone6\t443155\n处友\t443156\n古越龙山\t443157\n后领\t443158\n熊猫金银币\t443159\n生育保险金\t443160\n3632\t443161\n中国能源报\t443162\n沪港\t443163\npersonals\t443164\n2015款\t443165\nSTORM\t443166\n金融衍生品\t443167\nanony\t443168\n黑群晖\t443169\n2.8万\t443170\ndB\t443171\n洪湖市人民政府\t443172\n002174\t443173\n15千瓦\t443174\n淫技\t443175\nPPT、word\t443176\n张道陵\t443177\n书画纸\t443178\n32亿美元\t443179\n求和\t443180\nnexus4\t443181\n
兰亭序\t443182\nOleDb\t443183\n辛未年\t443184\n石田彰\t443185\n捷信分期\t443186\n报经\t443187\n北京爱迪国际学校\t443188\n水柜\t443189\n集热器\t443190\n4399宝藏世界\t443191\n变焦\t443192\n不迁\t443193\n片段\t443194\n杭新景高速\t443195\nvalley\t443196\n比翼齐飞\t443197\n奔驰ml350\t443198\n安意艾玛沃特森\t443199\n靖康之变\t443200\ncissp\t443201\n建炎\t443202\n郝堂村\t443203\nunsupported\t443204\n司令官\t443205\n洋奴\t443206\n捷豹xel\t443207\n短笛\t443208\n创新社区\t443209\n聚宝斋\t443210\n华威\t443211\n堪折直须折\t443212\n异特龙\t443213\n省住房城乡建设厅\t443214\n按键精灵\t443215\n第81章\t443216\n银行从业考试\t443217\nbpl\t443218\n通光\t443219\nspexial\t443220\n异逝界\t443221\n林月\t443222\n6月10号\t443223\nautocad2015\t443224\n田东\t443225\nSkid\t443226\ntkmybatis\t443227\nprimere\t443228\n舞彩\t443229\n七条\t443230\n保时捷Cayenne\t443231\n基诺浦\t443232\n提肛\t443233\n资产配置\t443234\n逍遥右脑\t443235\n肩章\t443236\n伤农\t443237\n周汤豪\t443238\nc32\t443239\n安阳市人力资源和社会保障局\t443240\n济缘\t443241\n京通\t443242\n69公斤级\t443243\n悬崖峭壁\t443244\n190cm\t443245\n窦晓璇\t443246\n木子洋\t443247\n华晨宝马汽车有限公司\t443248\n飞区\t443249\nUITextField\t443250\n江底\t443251\n中天国际\t443252\n转氨酶\t443253\nFine\t443254\n新众泰T700\t443255\nCommittee\t443256\n奥体中路\t443257\n求证\t443258\n科宇\t443259\n合众国\t443260\nLogStash\t443261\n牛奶丝\t443262\n虚拟键盘\t443263\n跑马圈地\t443264\n经融危机\t443265\n互为\t443266\n我叫MT世界\t443267\nXen\t443268\n5:30\t443269\n2年级\t443270\n康波\t443271\n大道至简\t443272\nbarry\t443273\n13个\t443274\n炫\t443275\nST保千里\t443276\n道氏理论\t443277\n中国通\t443278\n合肥高铁南站\t443279\n辐射岛\t443280\n孙村\t443281\n天穗律师网\t443282\n爱好者们\t443283\n应用\t443284\n散伙\t443285\noverall\t443286\n亚姐\t443287\n官能症\t443288\n佳钓尼\t443289\n李国立\t443290\n流失率\t443291\n暗林风云\t443292\n鄞州区政府\t443293\n2.54mm\t443294\ncompareTo\t443295\n函套\t443296\n燕双飞\t443297\narb\t443298\n索威\t443299\n侄媳妇\t443300\n公愤\t443301\n生成性\t443302\n碎纸\t443303\n我们村\t443304\n坐船\t443305\n门禁控制器\t443306\n文管\t443307\n草龙\t443308\n东风大街\t443309\n高筱柔\t443310\nsplendid\t443311\n双色球杀号\t443312\n南桥寺\t443313\n灰度传感器\t443314\n8.13\t443315\nCSS样式表\t443316\n嵩屿\t443317\n傅高义\t443318\n初文学网\t443319\n千分位分隔符\t443320\n剑网3指尖江湖\t443321\n掀\t443322\nipa格式\t443323\
n000895\t443324\ngta5pc\t443325\n义海\t443326\n指令表\t443327\nE16\t443328\n长子县人民政府\t443329\n真实\t443330\n局灶性\t443331\nstripslashes\t443332\n萍卡美娜\t443333\n郝老师\t443334\n系统集成工程师\t443335\n阿大\t443336\nbuilt\t443337\n玉牌\t443338\n凤凰游戏商城\t443339\n几十斤\t443340\n十四世纪\t443341\n合不合\t443342\n齐拉西酮\t443343\n飞霞\t443344\n奥里\t443345\n换流器\t443346\n雷门\t443347\n每隔五分钟\t443348\n红霉素肠溶胶囊\t443349\nren\t443350\n3770k\t443351\n咯吱咯吱\t443352\n欧布\t443353\n台湾交通大学\t443354\n能否\t443355\n束之高阁\t443356\n制冷展\t443357\n啪啪\t443358\n批示件\t443359\n全国中学生物理竞赛\t443360\n龙潭公园\t443361\n开帖\t443362\nWPSOffice\t443363\n湘府\t443364\n歌房\t443365\n洞庭\t443366\n黄脓\t443367\n游击队\t443368\nblogger\t443369\nsql2014\t443370\n上海区\t443371\n欲来\t443372\n厦航\t443373\n林云\t443374\n中港城码头\t443375\n铁鸡\t443376\nValentino\t443377\n白色\t443378\n九型\t443379\n液体过滤器\t443380\nuinavigation\t443381\n加热机\t443382\n盔甲\t443383\n济建\t443384\n冰峪沟\t443385\n2108b\t443386\n北京一中院\t443387\n转子流量计\t443388\nSmartAX\t443389\n误点\t443390\nOData\t443391\n华中大\t443392\n容积\t443393\nChing\t443394\n季节\t443395\n跳表\t443396\n超声波清洗\t443397\nmoney\t443398\n安全生产技术\t443399\n淄博经济开发区\t443400\n毛邓三\t443401\nTeresa\t443402\n3335\t443403\n600958\t443404\n影子战术:将军之刃\t443405\n南海北部\t443406\n智能\t443407\n2012.1\t443408\n大幅面扫描仪\t443409\n天坛生物\t443410\n松浦\t443411\n計\t443412\n贵州大学科技学院\t443413\n东城国际\t443414\n本能反应\t443415\n温江政府网\t443416\n路产\t443417\n天津生态城\t443418\n四川联通\t443419\n三元组\t443420\n淫淫\t443421\n休想\t443422\n黄金比例\t443423\n宝盖镇\t443424\n井天\t443425\nSHX\t443426\n咸嘉新村\t443427\n悬案组\t443428\nVV7\t443429\n中国宿迁市人民政府\t443430\n』\t443431\n天麓\t443432\n20章\t443433\n江西省安全生产监督管理局\t443434\nSRAM\t443435\n搜狗打字法\t443436\n要塞十字军东征\t443437\n福州文博中学\t443438\n四川中公\t443439\n李雪\t443440\n胆囊切除术\t443441\nproe\t443442\n东江纵队\t443443\n9段\t443444\n井上真央\t443445\n卫生员\t443446\n磨坊\t443447\n黄锦树\t443448\n莲花竹\t443449\n齿顶\t443450\n高企\t443451\nlucence\t443452\n600711\t443453\n交大昂立\t443454\n上海通用别克\t443455\n鄂建文\t443456\nghoul\t443457\nPhantomjs\t443458\n挖开\t443459\nEXTREME\t443460\n圈定\t443461\n第121期\t443462\n铁口\t443463\n中金在线财经号\t443464\n脆脆鲨\t443
465\nhowell\t443466\n小灯\t443467\n重庆大足\t443468\n唐努乌梁海\t443469\n提瑞斯法\t443470\n林轩\t443471\n煤岩\t443472\n炸鱼薯条\t443473\n几胎\t443474\n宠物百科\t443475\n死路\t443476\n诺基亚1020\t443477\nwears\t443478\n张世祥\t443479\nyoho\t443480\n1z0\t443481\n中国道教协会\t443482\n7月25日\t443483\n2014年10月份\t443484\n47级\t443485\ndog\t443486\nCn\t443487\n45woool\t443488\n丁学东\t443489\n沈阳市公安局\t443490\n乐志\t443491\n焖烧壶\t443492\n明敷\t443493\n二小放牛郎\t443494\nBAMBINO\t443495\nfirm\t443496\n串行\t443497\n广东小学\t443498\n橘树\t443499\n是非\t443500\n保健品展\t443501\nà\t443502\n腾讯游戏心悦俱乐部\t443503\nQ宠大乐斗\t443504\n骨髓瘤\t443505\nPE膜\t443506\n亚宁\t443507\n银川二中\t443508\nGAME\t443509\n冒名\t443510\nSGS\t443511\n趾间炎\t443512\n太湖湿地公园\t443513\nobligations\t443514\n9220\t443515\n国家哲学社会科学文献中心\t443516\n猴群\t443517\n玳瑁\t443518\nWuxi\t443519\n上海西门子\t443520\n曲径通幽\t443521\n小米手表\t443522\n头皮\t443523\n挺可爱\t443524\nIranian\t443525\n基训\t443526\n礼来\t443527\n挨家挨户\t443528\nzongheng\t443529\n旅行蛙\t443530\nfangqi\t443531\nILCE\t443532\nlumber\t443533\n重剑\t443534\n民币\t443535\n角件\t443536\n数位\t443537\ngsp认证\t443538\n华夏人寿保险\t443539\n口算\t443540\n新格\t443541\n2635\t443542\n绿姿\t443543\n平朔\t443544\nVeryARM\t443545\n67味\t443546\nshixian\t443547\n活物\t443548\n孙冕\t443549\n16年\t443550\n千金散尽\t443551\n王布\t443552\n七十个\t443553\n漂相\t443554\n血透中心\t443555\n亚硫酸钙\t443556\n上海港汇广场\t443557\n肖\t443558\nDecode\t443559\n国学馆\t443560\n梦幻西游号\t443561\n主瓣\t443562\n甪直古镇\t443563\nHanoi\t443564\nVaR\t443565\n气油\t443566\nRoly\t443567\n楼道\t443568\n1234TV\t443569\n割掉\t443570\nMyCafe\t443571\nv2.9.1\t443572\n双飞\t443573\n驱动程序管理器\t443574\n滴流\t443575\n到此一游\t443576\n_多语种\t443577\n孩儿\t443578\n舞狮\t443579\n辉煌中国\t443580\n1281\t443581\n光杆\t443582\n五常稻花香\t443583\nindoor\t443584\n南部县人民政府\t443585\n断档\t443586\nlink\t443587\n复方丹参片\t443588\nSemi\t443589\n中天能源\t443590\n商洛之窗\t443591\n日轮\t443592\n炼化\t443593\n溴\t443594\n悔创阿里杰克马\t443595\n木油\t443596\n白各庄\t443597\nICH\t443598\n硬盒\t443599\n04.06\t443600\n眉尾\t443601\n露脚\t443602\nIncorrect\t443603\n钱一平\t443604\n魔法秀\t443605\nSha\t443606\nWorkout\t443607\n我爱幼儿园\t443608\n钟鹿纯\t443609\n
Mereke\t443610\nsocker\t443611\n子辰\t443612\n王朝网络\t443613\n东坑村\t443614\n10月29日\t443615\n互补色\t443616\n74路\t443617\nfifa\t443618\nRFS\t443619\nConvention\t443620\n11:11\t443621\n中国新能源网\t443622\n塔图姆\t443623\nelv\t443624\nISSI\t443625\n刺客信条5\t443626\n大黑科技\t443627\n交通事\t443628\n刘林\t443629\n鼎峰国汇山\t443630\n南大考研网\t443631\n走在冷风中\t443632\n借贷宝\t443633\n背疼\t443634\n结城\t443635\n勒布朗·詹姆斯\t443636\n考辩\t443637\n起草人\t443638\nadminer\t443639\n王文涛\t443640\n香鱼\t443641\n腹针\t443642\n趣闻\t443643\n教育馆\t443644\nuto\t443645\n教徒\t443646\nopendir\t443647\n170号段\t443648\n吉利远景x6\t443649\n灿盛\t443650\n跨世\t443651\n小富\t443652\nElbow\t443653\n热态\t443654\n金戌迎瑞\t443655\n案台\t443656\n乾照光电\t443657\n天津金融资产交易所\t443658\n滇军\t443659\n转贷\t443660\n听到\t443661\n梅州日报数字报\t443662\n道路经营许可证\t443663\niostream\t443664\ngzj\t443665\n小码\t443666\n2045\t443667\nppk\t443668\n唱本\t443669\n永和豆浆\t443670\n栖溪\t443671\n于和伟\t443672\nJIZZ\t443673\n此爱\t443674\n人才公寓\t443675\ngtx980ti\t443676\n南京市栖霞区人民政府\t443677\n东关街\t443678\n一展歌喉\t443679\n喵星\t443680\n酱油\t443681\ngitosis\t443682\n包法利\t443683\n开采\t443684\n华锐风电\t443685\namf\t443686\n芙蓉谷\t443687\n中心体\t443688\n文江\t443689\n碧落\t443690\n卡卡安全论坛\t443691\njavaSE\t443692\n合肥地铁4号线\t443693\n臭小子\t443694\nGodV\t443695\nDaz\t443696\n庞杂\t443697\n奇高\t443698\nBold\t443699\n永镇\t443700\n生年\t443701\n付款\t443702\n截切\t443703\nnatural\t443704\n关中平原城市群\t443705\n联易\t443706\n财部\t443707\nfinalcutpro\t443708\n太原日报报业集团\t443709\nEcharts3\t443710\n2686\t443711\n春江花月夜\t443712\n波尔多\t443713\n400岁\t443714\n黄金分析师\t443715\n公职金\t443716\nBD1280\t443717\n炫书网\t443718\n找货\t443719\ncpvc\t443720\n20000mAh\t443721\n更高效\t443722\n麦德龙超市\t443723\nipone\t443724\n变更登记表\t443725\n王均瑶\t443726\n六法\t443727\n胡少爷\t443728\n胰岛素抵抗\t443729\n子青\t443730\n组委会\t443731\n弗朗西斯·培根\t443732\n大隋风云\t443733\n中用\t443734\nR18.com\t443735\nNARUTO\t443736\n仲景香菇酱\t443737\niaas\t443738\n外包装\t443739\n长沙中联泵业有限公司\t443740\n三得利\t443741\n筑龙电气\t443742\nsri\t443743\n六和塔\t443744\n分秒\t443745\n0757\t443746\nKicker\t443747\n时空之门\t443748\n5000吨\t443749\n1648\t443750\n中国展览馆协会\t443751\n宁采
臣\t443752\n百度权重\t443753\n封印符\t443754\n20161021\t443755\n一统江湖\t443756\n很多人\t443757\n竣工\t443758\n土建质量员\t443759\n虚拟home键\t443760\n厚朴方舟\t443761\nwingide\t443762\n不死机\t443763\n长泰县\t443764\ndlmwrite\t443765\n1.8万\t443766\n床性\t443767\n永华\t443768\n智雅\t443769\n_读书频道\t443770\n怡\t443771\n鬼魂术\t443772\n16384\t443773\n涵野\t443774\n芝士蛋糕\t443775\n赫萝\t443776\n语文报\t443777\n吃斋\t443778\n#墓王之王#\t443779\ncacert\t443780\n知合\t443781\n师爷\t443782\n2340\t443783\njxy\t443784\n刘知远\t443785\n出动\t443786\n郡望\t443787\n洗礼\t443788\n大薯\t443789\n益生菌冻干粉\t443790\n通用名\t443791\n斩杀者\t443792\n185\t443793\n傅杰\t443794\n蒙大\t443795\n金在奂\t443796\n台词\t443797\n米亚罗\t443798\n160mm\t443799\nw88优德\t443800\n百晓生\t443801\nSpringmvc\t443802\n奇信股份\t443803\nsigil\t443804\n千百年\t443805\n山涧\t443806\n吴用\t443807\n高莉\t443808\n英雄无敌6\t443809\ndaring\t443810\n启机\t443811\n移频\t443812\n空心村\t443813\n圆光\t443814\nMBA智库商学院\t443815\n战吼\t443816\n教唱\t443817\n上海电影艺术学院\t443818\n李思\t443819\nPermit\t443820\n亘古\t443821\nfuzhu\t443822\n阿莫西林克拉维酸钾分散片\t443823\nrit\t443824\n八木\t443825\n上海儿童医院\t443826\n发还\t443827\n田中理惠\t443828\n48寸\t443829\n帽顶\t443830\n自信力\t443831\n200平米\t443832\n义龙\t443833\n加出血\t443834\n肠胃镜\t443835\n武汉市水务集团有限公司\t443836\n预瞄\t443837\nTorvalds\t443838\n扯面\t443839\n欣然\t443840\n8821\t443841\n墓王之王悬棺寺\t443842\n罗盛\t443843\n花椒\t443844\n1月2日\t443845\n拉图尔\t443846\n头螨\t443847\nで\t443848\nmew\t443849\n紫罗兰永恒花园吧\t443850\nUSB盘\t443851\n產品\t443852\ncrystal\t443853\npad\t443854\n易购\t443855\n牛\t443856\nHowever\t443857\n重庆医科大学图书馆\t443858\n六合\t443859\n湖北省安全生产监督管理局\t443860\n溢余\t443861\n贴身兵王\t443862\n尾音\t443863\n打托\t443864\n轩脉刃\t443865\n葡萄苗\t443866\n大骨头\t443867\n口语化\t443868\n220公里\t443869\n导光管\t443870\n伊泽\t443871\n钵钵鸡\t443872\n钢企网\t443873\n2018.4.20\t443874\n竖表\t443875\n接受者\t443876\n倚天吧\t443877\n发射率\t443878\n一砖\t443879\n长生君\t443880\n恨之入骨\t443881\nWeaver\t443882\n0524\t443883\n广之旅\t443884\n欧维\t443885\n凤凰社区\t443886\n矛与盾商标网\t443887\napogee\t443888\n医色\t443889\nEnvironment\t443890\n撕下\t443891\nwindows10_Windo\t443892\ninvestigate\t443893\n华北电力大学科技学院\t443894\n阻容\t44
3895\nLiberal\t443896\n世界哮喘日\t443897\n固定化酶\t443898\n尼康镜头\t443899\nCLB\t443900\nUI测试\t443901\n宫颈检查\t443902\n去斑\t443903\npixi\t443904\n郭宏才\t443905\ngdgp1.chinaxinge.com/shuju2/201711/201711191935199069\t443906\n5035\t443907\n数库\t443908\nCiteSpace\t443909\n牛头牌\t443910\n帝王妃\t443911\n槌\t443912\n王石\t443913\nde域名\t443914\n多一半\t443915\n柳树枝\t443916\n南园路\t443917\ncdf\t443918\nPetit\t443919\n头冠\t443920\n开班\t443921\n存卡\t443922\n堃\t443923\n武术操\t443924\n白帆\t443925\nED2000\t443926\n天空之剑\t443927\n22万\t443928\n联想集团有限公司\t443929\n长短信\t443930\n会议桌\t443931\n新疆兵团第十二师\t443932\n用场\t443933\n云社区\t443934\n高柔\t443935\n荣耀6plus\t443936\nBeats\t443937\nhostmonitor\t443938\n暂扣\t443939\n2800\t443940\n周昌\t443941\n一次函数y=kx+\t443942\nantique\t443943\n开山\t443944\n上海爱尔眼科医院\t443945\n普鲁申科\t443946\n大湾村\t443947\n一遇\t443948\n浙江泰顺政府网\t443949\n联通集团\t443950\n张海藩\t443951\n北卡罗来纳\t443952\n混过\t443953\n首家\t443954\n攀爬\t443955\nofje\t443956\n三国志汉末霸业\t443957\n西邮\t443958\n剣\t443959\n第十九集\t443960\n第1个\t443961\nnecessarily\t443962\n休·杰克曼\t443963\n|租车网\t443964\n被拼\t443965\n磐正\t443966\n上地七街\t443967\n美若天仙\t443968\n884\t443969\n350MW\t443970\n晴\t443971\n邱磊\t443972\nwai\t443973\n75分钟\t443974\n主房\t443975\n辽宁省委员会\t443976\n花田喜\t443977\n016\t443978\nABRSM\t443979\n猎户座\t443980\nTecnomatix\t443981\nwwe2k17\t443982\nChemical百灵威\t443983\ndly\t443984\n软件设计师\t443985\nUNDO表\t443986\n挥刀\t443987\n翔安区政府\t443988\n岳塘新闻网\t443989\neconometrics\t443990\n种公\t443991\n成指\t443992\nTread\t443993\n民办校\t443994\n马缰革\t443995\n首哥\t443996\n省政府法制办\t443997\n闽南师范大学\t443998\n海淘论坛\t443999\n洗洗睡\t444000\n沙果\t444001\n瞳距\t444002\n供应\t444003\n组织结构\t444004\n知君\t444005\n碧池部\t444006\n盗墓笔记2\t444007\n两晚\t444008\n末梢\t444009\nzz\t444010\n歌神\t444011\n上海中国农业银行\t444012\n七月底\t444013\n轻量型\t444014\n周边\t444015\nheadmaster\t444016\n分布图\t444017\n0.003\t444018\nwindow服务器\t444019\n1492\t444020\n信噪比\t444021\n警示\t444022\n承台\t444023\n第一滴\t444024\n来料\t444025\n新疆电信\t444026\n连接座\t444027\nqikan\t444028\n3dsmax2012\t444029\n背手\t444030\n文职人员\t444031\ncocks\t444032\n活泼性\t444033\n异行\t444034\nSi
rius\t444035\n沥血\t444036\n云计算数据中心\t444037\nleeds\t444038\n水利工程信息网\t444039\n1000米\t444040\n合肥新站区\t444041\n厕\t444042\n通义\t444043\n3批次\t444044\n王金明\t444045\n8晚\t444046\n协和医院\t444047\n莫颜\t444048\n永结同心\t444049\nhisto\t444050\n三级人力资源管理师\t444051\nGalera\t444052\n陶元城\t444053\nwin7c盘\t444054\nanddroid\t444055\n点贷\t444056\n醒觉\t444057\ngopro6\t444058\ncomex\t444059\n防抱死制动系统\t444060\n随餐\t444061\n王文学\t444062\n_乐\t444063\n雪柜\t444064\n保持器\t444065\n本钱\t444066\n德州经济技术开发区\t444067\nConnectors\t444068\n七个半月\t444069\n导热材料\t444070\n园林路\t444071\n笔锋\t444072\n中国中车集团\t444073\n三合会\t444074\n信天\t444075\n七声\t444076\n黑超\t444077\n强化石\t444078\ngroupBy\t444079\n兽栏\t444080\n十四亿\t444081\ndatetable\t444082\n顾晓曼\t444083\n创频\t444084\n106家\t444085\ngif吧\t444086\n旅游类\t444087\n琥\t444088\njdw\t444089\n套套\t444090\n语料库\t444091\n晋阳\t444092\n福喜\t444093\n洋湖\t444094\n毛戈平\t444095\n宽叶\t444096\n第三档\t444097\n鹦鹉\t444098\n170公里\t444099\n开平\t444100\n比尔森\t444101\n盖伦\t444102\n向左走\t444103\n中共中央组织部\t444104\n毒神\t444105\n森林人\t444106\n肾部\t444107\n侯卫明\t444108\n下半季\t444109\n全牙\t444110\n水丸\t444111\nPerpetual\t444112\n二十多年\t444113\nsat\t444114\n战士篇\t444115\nii\t444116\n涩琪\t444117\n99层\t444118\n小纯\t444119\n概念片\t444120\n管理篇\t444121\no2o\t444122\n新中街\t444123\njupy\t444124\n龙类\t444125\n人工合成\t444126\n生天\t444127\n尚格\t444128\n分光光度计\t444129\n66万\t444130\n校门口\t444131\n徐阳\t444132\n吉他谱大全\t444133\n死亡扳机2\t444134\n莲花山港\t444135\n命根\t444136\n栃木县\t444137\nNEX-7\t444138\n双扇舞\t444139\n聚异丁烯\t444140\n旱地\t444141\n如此简单\t444142\n红外热成像\t444143\n马蹄形\t444144\n甩挂\t444145\n雅婷\t444146\n大蒜\t444147\n人称呼\t444148\n陈律师\t444149\nBey\t444150\n实数\t444151\n有机硅\t444152\n郁慕明\t444153\nfategrandor\t444154\n第38章\t444155\nConcentration\t444156\n纺织娘\t444157\n夏帆\t444158\nasrock\t444159\n晶元\t444160\n南齐\t444161\n中关村医院\t444162\ndisparity\t444163\n东尼电子\t444164\n大胃王密子君\t444165\ni586\t444166\n丁丁冬冬\t444167\n太奇教育\t444168\nkunshan\t444169\n金德管业\t444170\n陈雨露\t444171\n小夫郎\t444172\n椰田古寨\t444173\n公分\t444174\n宋宝华\t444175\n航海\t444176\nbuilder\t444177\n89号\t444178\n满秩矩阵\t444179\n朱允炆\t444180\n静
下心\t444181\n南平市人民政府办公室\t444182\n鲁西集团\t444183\nao史密斯热水器\t444184\n漂亮姑娘\t444185\n数控专业\t444186\n住店\t444187\n江伟\t444188\n李先\t444189\n力方\t444190\ngralde\t444191\n上汽通用五菱汽车股份有限公司\t444192\n老牙\t444193\nRado\t444194\n景气指数\t444195\n脚灯\t444196\nKARL\t444197\n隔离栏\t444198\ndataTable\t444199\n花球\t444200\n中华人民共和国福建海事局\t444201\n鲤鱼王\t444202\n14.x\t444203\ncomputers\t444204\n高原红\t444205\n联想y700\t444206\n线刷卡\t444207\n2288H\t444208\n控制栏\t444209\nOKHttp\t444210\n戊类\t444211\n两架\t444212\n山慈菇\t444213\n饭制\t444214\n老北京吧\t444215\n作业册\t444216\n杨娜\t444217\n石家庄以岭药业股份有限公司\t444218\n第34关\t444219\n增长率\t444220\n药尘\t444221\nk600\t444222\n品行\t444223\n金武林\t444224\n颈椎枕\t444225\n天真蓝\t444226\n随选\t444227\nic卡吧\t444228\n罗摩衍那\t444229\n中油财经网\t444230\n视同\t444231\nsnmpd\t444232\n16v\t444233\n乱妇\t444234\n深圳区\t444235\nformula\t444236\n韩国艺术团\t444237\n心意拳\t444238\n硝苯地平片\t444239\n中转库\t444240\n周末\t444241\nブロ\t444242\n堵城\t444243\nNICE\t444244\nsbin\t444245\nzengjf\t444246\n4mm2\t444247\n8kmm\t444248\n上影\t444249\n裸色\t444250\n江西服装学院\t444251\n普通机\t444252\n科普文\t444253\n信长野望14\t444254\n贫民\t444255\n拓荒\t444256\nDS5\t444257\ngbm\t444258\n中级财务会计\t444259\n濠滨\t444260\nLeetCode\t444261\n耻辱\t444262\n中共十二大\t444263\n闵行区幼儿园\t444264\niphone卡\t444265\n热情好客\t444266\n计算机工程\t444267\n宠物鸟\t444268\n内蒙古新闻网\t444269\n国家宝藏\t444270\n陪都\t444271\n600v\t444272\n2组\t444273\n宁波海\t444274\n华西证券\t444275\n样针\t444276\n大礼\t444277\n铺排\t444278\n金沙江畔\t444279\n104g\t444280\npptv聚力\t444281\n1989\t444282\n钢缆\t444283\nMorty\t444284\n山西\t444285\n密号\t444286\n济南市儿童医院\t444287\n杖刑\t444288\n敛散性\t444289\n望京SOHO\t444290\n蓝战\t444291\nredhat\t444292\n蒋超良\t444293\n4DX\t444294\n日语惯用句\t444295\n好奇\t444296\nIP68\t444297\n冰库\t444298\n裸捐\t444299\nxc60\t444300\nConcord\t444301\n王心\t444302\n财政部财政科学研究所\t444303\n四维\t444304\n林雪儿\t444305\n十五年前\t444306\n天衡\t444307\n连号\t444308\n替补席\t444309\n鼠辈\t444310\n南堡开发区\t444311\n三责\t444312\nSTM32\t444313\n66节\t444314\n下坝坊\t444315\n800px\t444316\n下品\t444317\nRegression\t444318\n嘶鸣\t444319\n夜明\t444320\n分类通装修网\t444321\n略显\t444322\n宏基因组\t444323\n弼马温\t444324
\n老旅\t444325\n麒麟区\t444326\n回退\t444327\n信息技术系\t444328\n40.8度\t444329\n枪球\t444330\nSit\t444331\n必中特\t444332\n中小学综合实践活动课程指导纲要\t444333\n油炉\t444334\n岩山\t444335\n双安商场\t444336\n词稿\t444337\n汉威\t444338\nDebt\t444339\n天天影视\t444340\n长岛路\t444341\n涿州市\t444342\n而至\t444343\n民事裁定书\t444344\nⅹ\t444345\n地平线:零之黎明\t444346\n童婚\t444347\n绿幕\t444348\nOPPOr11s\t444349\n客舍\t444350\n龙之女神\t444351\n人民法院出版社\t444352\n曹雪涛\t444353\nPolina\t444354\n2022cm\t444355\n美丽家装修网\t444356\narouse\t444357\n聚众淫乱罪\t444358\nMAHLE\t444359\n99周年\t444360\n工程费\t444361\n13.12\t444362\n佳能77d\t444363\n推入\t444364\n17年6月\t444365\n梯次\t444366\n水样\t444367\n198个\t444368\n公共场\t444369\n好处\t444370\n深圳市建筑科学研究院股份有限公司\t444371\n粉雪\t444372\n嫩叶\t444373\n头条\t444374\n20170912\t444375\n0107\t444376\n阴核\t444377\n驼背\t444378\n钢液\t444379\n2018日\t444380\nOozie\t444381\n金星北\t444382\n木婉清\t444383\n合辑版\t444384\n2.xls\t444385\n87\t444386\nmixed\t444387\n凤凰卫视中文台\t444388\n兰溪谷\t444389\n对着\t444390\n封印骊威\t444391\nEncyclopedia\t444392\n尿白细胞\t444393\n霾\t444394\n蟑螂\t444395\nFalse\t444396\n压缩弹簧\t444397\n长白山\t444398\n陈氏太极拳\t444399\n2017版\t444400\n质检\t444401\n95.com\t444402\n晓敏\t444403\nxamarin\t444404\n少卿\t444405\n磨灭\t444406\n见报\t444407\n上城区科技局\t444408\n广州市交委\t444409\n独立鱼\t444410\n中子\t444411\n沧溟\t444412\n枚举法\t444413\n説\t444414\nGymbo\t444415\n高尔夫别墅\t444416\n喧哗\t444417\n字梁\t444418\n制播\t444419\n洋务派\t444420\nRegsvr32\t444421\n何鑫\t444422\n忙\t444423\n稳定版\t444424\n不可能\t444425\n费列罗\t444426\n防制员\t444427\n8130\t444428\nTreeGrid\t444429\n丧\t444430\n上届\t444431\n打靶\t444432\nchickens\t444433\n21类\t444434\nflatmap\t444435\n氧化锌软膏\t444436\n免息券\t444437\n3d邪恶漫画\t444438\n篮球类\t444439\nJenner\t444440\n宁波站\t444441\n黑贞德\t444442\n香港永隆银行\t444443\n钟点工招聘网\t444444\n1片\t444445\n产道\t444446\n银科控股\t444447\n子健\t444448\n音室\t444449\n振动\t444450\n翻跳\t444451\nbg文\t444452\n蓝叠模拟器\t444453\n凉飕飕\t444454\nLoaded\t444455\n事实表\t444456\n_日韩剧\t444457\nSukin\t444458\n尘肺病\t444459\n华人螺丝网\t444460\n环境化学\t444461\n红头文\t444462\n微爱\t444463\n微生物\t444464\nEthFans\t444465\n恨情仇\t444466\n老槐树\t444467\nSenseTime\t444468\
n新局\t444469\n云南民族大学\t444470\n看板娘\t444471\n长链接\t444472\n结构体指针\t444473\n中葵\t444474\n返回码\t444475\n永安小区\t444476\n李刚田\t444477\n开曼公司\t444478\n颐和花园\t444479\n25倍\t444480\n乌兰\t444481\n防晒剂\t444482\n陆无双\t444483\nHiggins\t444484\n冲孔桩\t444485\n全国专业标准化技术委员会\t444486\n辛特拉\t444487\n9张\t444488\n后来的我们\t444489\n罗甸县人民政府\t444490\nSorry\t444491\n胃部\t444492\n情侣们\t444493\n数字编码\t444494\n拳皇97ol\t444495\n数控系统\t444496\n酱香白酒吧\t444497\n出车\t444498\n妳\t444499\n奇数位\t444500\n青岛新机场\t444501\n包生\t444502\n同日而语\t444503\nu2417h\t444504\n小码哥\t444505\n华阴\t444506\n汤料\t444507\n磁力\t444508\n一园\t444509\n旅游点\t444510\n东蔡\t444511\nISO14001认证\t444512\n我是传奇2\t444513\nsign函数\t444514\n司波达也\t444515\n七龙珠吧\t444516\n禺城\t444517\n吊舱\t444518\n留考\t444519\n返盘\t444520\n朱桦\t444521\n丙戊酸镁缓释片\t444522\n养生师\t444523\n应允\t444524\n红色教育基地\t444525\n香铜\t444526\n挨饿\t444527\n_特玩网流放之路\t444528\nGT\t444529\n哈尔滨南\t444530\n自小\t444531\n天工网\t444532\niscroll\t444533\n重大\t444534\n43秒\t444535\n闭合电路\t444536\n通办\t444537\n朱培刚\t444538\n火狸\t444539\n买票\t444540\n宝华路\t444541\n0123\t444542\n减肥机\t444543\n释解\t444544\nIonic3学习笔记\t444545\nL365\t444546\n彩乐\t444547\n服装设计师\t444548\n阿瑜陀耶\t444549\nDing\t444550\n20局\t444551\n天上人间\t444552\n群呼\t444553\n两性私房话\t444554\n蛇蝎\t444555\n嵌\t444556\n上官燕\t444557\n7粒\t444558\n乘方\t444559\nxiaoyi\t444560\nHAPPY\t444561\nfgo\t444562\n朱韬\t444563\n目标端\t444564\n房记\t444565\n蓝心\t444566\nfacts\t444567\nvisa白金卡\t444568\n塔公草原\t444569\n椅凳\t444570\n拉皮\t444571\n首调\t444572\n煎牛排\t444573\n活检术\t444574\n自流平\t444575\nIndexes\t444576\n叔丁胺\t444577\n李沛\t444578\nPSH\t444579\n薛湖\t444580\nMacBookpro\t444581\nScene\t444582\n淫威\t444583\n确定性\t444584\n血肿\t444585\n见面\t444586\nDenied\t444587\n于文\t444588\n目中无人\t444589\nmedcalc\t444590\nlivephoto\t444591\nT540P\t444592\ncitespace\t444593\nAirAsia\t444594\nReferenceError\t444595\n黄子\t444596\n中国海洋大学崂山校区\t444597\n第13届\t444598\n清华大学经管学院\t444599\nzee\t444600\n150mb\t444601\nteams\t444602\n监管者\t444603\n刷漆\t444604\n催眠母\t444605\n杨剑\t444606\n络活喜\t444607\n顺义区政府网\t444608\n楚子航\t444609\n摆钟\t444610\n尴尬症\t444611\n天永智能\t444612\n2年多\t4
44613\n武昌站\t444614\n荣县\t444615\n海底喋血战\t444616\n华杉\t444617\n小单\t444618\n测验题\t444619\n多谢多谢\t444620\nNBE游戏工作室\t444621\n第二艘\t444622\n青筋\t444623\n最好的爱情\t444624\n切屏\t444625\nTemple\t444626\n工效学\t444627\nTobacco\t444628\n高阳\t444629\nthrills\t444630\n舟山群岛\t444631\n清远清新区\t444632\n宁城\t444633\n文春\t444634\n持球\t444635\n第二面\t444636\n群主\t444637\njfree\t444638\nSUV排名网\t444639\n花天\t444640\nmyremote\t444641\n穿越火线号\t444642\n庄浪\t444643\n初阶\t444644\n香草精\t444645\ndap\t444646\n西子女性网\t444647\n嫉妒\t444648\nDY\t444649\n奥拓电子\t444650\n情者\t444651\n装机\t444652\ninvent\t444653\n国机智骏汽车有限公司\t444654\n本片\t444655\n陈志\t444656\ndnf卡\t444657\n黄埔开发区\t444658\n西瓜霜喷剂\t444659\n中华人民共和国环境保护部\t444660\n红外热成像仪\t444661\n触摸春天\t444662\n刘晓燕\t444663\n让胡路区\t444664\n反讽\t444665\n襄阳市中心医院\t444666\n展度\t444667\n华锋股份\t444668\n基督城\t444669\n无源低音炮\t444670\n外行\t444671\n狼山\t444672\n第71号\t444673\ndanshiming\t444674\nmc700\t444675\n小来哥\t444676\n阿夫林\t444677\n哈巴罗夫斯克\t444678\n高寒\t444679\n台北人\t444680\n中国地震动参数区划图\t444681\n徐影\t444682\n私董\t444683\n算命|免费算命|八字\t444684\n鸿坤\t444685\n宏美\t444686\n湖南网\t444687\n9亿\t444688\n磁盘块\t444689\n文字块\t444690\n堆中\t444691\n腥臭\t444692\nMAMA\t444693\n名译\t444694\nUHD620\t444695\n估\t444696\neclipase\t444697\n3537\t444698\n追踪者\t444699\nubc\t444700\n妖艳\t444701\n行步\t444702\n百度地图导航\t444703\n终端机\t444704\ncento\t444705\n上海华美齿科\t444706\n扩张型心肌病\t444707\n吉利远景x1\t444708\n泉州市城乡规划局\t444709\n韩云\t444710\nKpopStarz中文网\t444711\n毛玻璃\t444712\n周易起名网\t444713\n英皇芭蕾\t444714\n浸润性乳腺癌\t444715\n平字\t444716\n外卡\t444717\n经研究\t444718\n字母哥\t444719\n禁忌症\t444720\n冷拔丝\t444721\nVTM\t444722\n第77期\t444723\n1.8亿元\t444724\n吉林省食品药品监督管理局\t444725\nNeverEnd\t444726\n订单表\t444727\n二维\t444728\n时尚坊\t444729\n小斐\t444730\nkV\t444731\n青蛙\t444732\n弗吉尼亚州\t444733\ncyy\t444734\n脸谱\t444735\n6.5寸\t444736\n鼎晖\t444737\n牛仔服\t444738\nca认证\t444739\n骂人\t444740\n慧能\t444741\n中国人民大学哲学院\t444742\nkobo\t444743\n不锈钢反应釜\t444744\n国民待遇\t444745\nMuffin\t444746\n0.6吨\t444747\n河南省卫生和计划生育委员会\t444748\n低音炮\t444749\n八佰伴店\t444750\n宋城路\t444751\n熊妈妈\t444752\n京瓷办公信息系统(中国)有限公司\t444753\n挨打\t444754\n第3
5次\t444755\n遵义地区\t444756\n高绿\t444757\nery\t444758\nnlc\t444759\n四任\t444760\n曹显兵\t444761\n航母\t444762\n君太\t444763\n侯爵\t444764\n劳动定额\t444765\n人生如梦\t444766\n黑白相间\t444767\n肇庆市鼎湖区人民政府\t444768\nvivoxplay6\t444769\n8旬\t444770\njava8\t444771\n605路\t444772\n金丰花园\t444773\n孙宝国\t444774\n本溪县\t444775\n圣战系谱\t444776\n太钢集团\t444777\n东野圭吾\t444778\n84元\t444779\n六横镇\t444780\n秦先生\t444781\n爱婴\t444782\n33本\t444783\n四经\t444784\n虹鳟鱼\t444785\n贵州松桃苗族自治县政府\t444786\nRandom\t444787\n桑顿\t444788\nAddress\t444789\n产出调查\t444790\n鸟儿\t444791\n黑曼巴蛇\t444792\n湿帘\t444793\nrib\t444794\npojo类\t444795\n武侠小说\t444796\n透不过气\t444797\n传集\t444798\n雪铁龙c4世嘉\t444799\n密肋\t444800\n动销\t444801\n尔雅通识课社会心理学\t444802\ndk3001\t444803\n密画\t444804\n第74条\t444805\n时光网\t444806\n王儒林\t444807\n180304\t444808\n升硕\t444809\n纸灰\t444810\n80万亿\t444811\n领主\t444812\nyen\t444813\n轮状\t444814\n510pro\t444815\n白砖\t444816\n杨峰\t444817\n17173新闻中心\t444818\n白\t444819\n小戏骨白蛇传\t444820\n金华园\t444821\n中原大佛\t444822\nadsl\t444823\n欧若拉公主\t444824\n塞斯纳\t444825\n南野\t444826\n欣旺达电子股份有限公司\t444827\n马妮\t444828\n韩语输入法\t444829\n飞镖盘\t444830\n22kw\t444831\n琵琶精\t444832\n结束\t444833\nK20\t444834\nneleta\t444835\n汇算清缴纳税\t444836\nXFun吃货俱乐部\t444837\n斑马Zebra\t444838\n苏州农业银行\t444839\ncdrx8\t444840\n外科风云\t444841\n注册地\t444842\n克里特岛\t444843\nSeals\t444844\n5510\t444845\n国家公路网规划\t444846\n▁\t444847\n天津幼升小\t444848\n产权籍\t444849\n素\t444850\n企业型\t444851\n汪玲\t444852\n生锈\t444853\n1912年\t444854\njxd\t444855\n小娘\t444856\nCf\t444857\n北京爱奇艺科技有限公司\t444858\n沙坪镇\t444859\n抗日小英雄\t444860\n人币\t444861\n深圳公安局\t444862\n北京教育\t444863\n2200\t444864\n潘箭迟\t444865\n非君子\t444866\n林可\t444867\n中国校庆网\t444868\nrebase\t444869\n骑马与砍杀冰与火之歌\t444870\n铣刀片\t444871\n91路\t444872\n反装\t444873\nrf\t444874\n富镇\t444875\n迪娜\t444876\n福音影视网\t444877\ncontex\t444878\n开播\t444879\n天下者\t444880\n金发天国\t444881\n收发文\t444882\nELM\t444883\n阚兆成\t444884\n三无尽\t444885\n烤面包机\t444886\n阿真\t444887\nAnimator\t444888\n亳州市人民政府\t444889\n凯撒大帝4\t444890\n农业园\t444891\nstale\t444892\n濉溪县人民政府\t444893\n自选\t444894\nmather\t444895\n3.2mm\t444896\n氹仔\t444897\nDvd\t
444898\n3冠\t444899\n王侯\t444900\n洪真英\t444901\n综合素质学\t444902\n实战演练\t444903\nInnocence\t444904\n20180317\t444905\n精装修房\t444906\n台州市住房公积金\t444907\n万丰国际\t444908\nbufferedreader\t444909\n家长里短\t444910\n鉴赏家\t444911\nrifles\t444912\njrt\t444913\n华盖资本\t444914\n造影\t444915\n缆索\t444916\n倚老卖老\t444917\n深藏不漏\t444918\n学园奶爸\t444919\n22英寸\t444920\n更优\t444921\ndayu\t444922\noxps\t444923\n水资源税\t444924\n易鑫车贷\t444925\n三国水浒城\t444926\n中央电大\t444927\n护理学院\t444928\n形同虚设\t444929\n詹姆斯·邦德\t444930\n乘胜追击\t444931\n岚皋县\t444932\n奖助金\t444933\n大关县\t444934\n2.5英寸\t444935\nhood\t444936\n2016～2017年度\t444937\n点别\t444938\n弥足珍贵\t444939\n大通湖区\t444940\n薛岳\t444941\n养心殿\t444942\n姚磊\t444943\n追击者\t444944\n冰王\t444945\n楚雄\t444946\n阊阖\t444947\nTS250\t444948\n王砚辉\t444949\n考册\t444950\n芮城县\t444951\nFreeNAS\t444952\nUS\t444953\nnevertheless\t444954\n学习会\t444955\n前任女友\t444956\n20171224\t444957\n56条\t444958\n吭哧\t444959\nF1.7\t444960\n包联\t444961\n凤凰出版传媒集团\t444962\nOWASP\t444963\nanda\t444964\n宽带房产网\t444965\n每月\t444966\n2018年3月27日\t444967\n有礼\t444968\n有期\t444969\nyiyuan\t444970\n周亚夫\t444971\n妙哉\t444972\n均流\t444973\n婴儿肥\t444974\nDiver\t444975\n唐高网\t444976\n郑雪儿\t444977\n向远方\t444978\n20MB\t444979\n布袋式\t444980\n兜帽\t444981\n特种\t444982\n氨苄\t444983\n中汽中心\t444984\n红花籽油\t444985\n福禄贝尔\t444986\n压实\t444987\n和资\t444988\n汩\t444989\n洋鬼子\t444990\n中央政治局常委会\t444991\n|年\t444992\n珠江桥\t444993\n花旗集团\t444994\n门贴\t444995\nFDS\t444996\n王睿\t444997\n赵丽萍\t444998\n邮电费\t444999\n守护神\t445000\nM1319f\t445001\n狂浪\t445002\n志趣\t445003\n县委\t445004\n52pojie\t445005\n研发员\t445006\n天津路\t445007\n火队\t445008\n向幸福出发\t445009\n乌蒙贵\t445010\n阿兹海默症\t445011\n器官移植\t445012\ngenomic\t445013\n夜北\t445014\n少年张三丰\t445015\nTRUST\t445016\n二线城市\t445017\n党徽\t445018\n个税\t445019\nclara\t445020\n电脑之家PChome.net\t445021\n黄连阿胶汤\t445022\nnbl\t445023\n卤钨灯\t445024\n83325555\t445025\nGCL\t445026\n厘\t445027\nmi4\t445028\n0939\t445029\nfait\t445030\n牧田\t445031\n39寸\t445032\nBluRay\t445033\n坪山大道\t445034\n2522\t445035\n支气管扩张症\t445036\n互检\t445037\nchaos\t445038\n单元测试示范卷\t445039\nHendrix\t445040\n广州市财政
局\t445041\n掰\t445042\n或许是\t445043\n6720\t445044\n贡丸\t445045\n三易通\t445046\n浓缩液\t445047\nbmb\t445048\n脑域\t445049\n史称\t445050\n泽诺尼亚\t445051\n中国医科大学附属盛京医院\t445052\n6000w\t445053\n有关系\t445054\n一秒后\t445055\n录用\t445056\n赛前\t445057\n杭州东\t445058\n常压热水锅炉\t445059\n槐荫\t445060\nhg0088.com\t445061\nbriefly\t445062\n5151\t445063\n三国志10威力加强版\t445064\n我努力工作\t445065\n中华V7\t445066\n镍网\t445067\n督查员\t445068\n运输机\t445069\n小量\t445070\n铅画\t445071\n人民日报出版社\t445072\n林德\t445073\n芒果慕斯蛋糕\t445074\n卜筮\t445075\n金威\t445076\n新华百货\t445077\n牯岭街少年杀人事件\t445078\n日剧\t445079\nmap\t445080\n黛眉\t445081\n鱼口\t445082\n黄鹂鸣翠柳\t445083\n求荐\t445084\n20170813\t445085\n监理\t445086\n佳能mp259\t445087\n像素点\t445088\nJMETER\t445089\n网络病毒\t445090\n日行千里\t445091\n3013\t445092\n南湖小区\t445093\n排水器\t445094\npredicting\t445095\n武康镇\t445096\n云禧\t445097\n鳖甲\t445098\n嘉兴教育学院\t445099\n汉法\t445100\n转基因技术\t445101\n金银滩\t445102\nmilestone\t445103\n李牧\t445104\n边面\t445105\n深茂高铁\t445106\n美人鱼公主\t445107\nprize\t445108\n字符类\t445109\n凯撒堡\t445110\n王原\t445111\n色彩心理学\t445112\n麻杆\t445113\n西武\t445114\n奇跡\t445115\n2015年下半年\t445116\nM7\t445117\nconverge\t445118\n蒂普提克\t445119\n网易云评论\t445120\n马斯克\t445121\n张庭\t445122\n锌矿\t445123\n28式\t445124\n睡不着\t445125\n独立播放器\t445126\n肚皮舞\t445127\n大鹏半岛\t445128\n抽油杆\t445129\nPM2.5\t445130\n工商联\t445131\n广东工业大学华立学院\t445132\nAs\t445133\n三生三世\t445134\n200套\t445135\n拉菲公馆\t445136\n天蝎座\t445137\n这套书\t445138\nWin7系统IE浏览器\t445139\n消火栓泵\t445140\n宿州东站\t445141\n腥臭味\t445142\n巴比龙\t445143\n54万亿元\t445144\n2018年1月15日\t445145\ndeeplink\t445146\nwzx\t445147\n王建明\t445148\n旬阳\t445149\n蓝科肤宁\t445150\n2017年7月30日\t445151\njhs\t445152\n宽扎\t445153\n暴雪公司\t445154\n镜音连\t445155\n前路\t445156\n重锤\t445157\n喊叫\t445158\n海豚岛\t445159\n初值\t445160\n两问\t445161\n科达\t445162\nlnz\t445163\nTaken\t445164\n600111\t445165\n李善姬\t445166\n少年宫\t445167\n免钉胶\t445168\nBistro\t445169\n花卉基地\t445170\n光火\t445171\n第八轮\t445172\n土右旗\t445173\n明细表\t445174\n格林公式\t445175\nvep\t445176\n欢乐海岸\t445177\n乾务镇\t445178\nBlockly\t445179\nv4.1.5\t445180\n贵州政府\t445181\n陈在天\t445182\nmshtml\t445183\n吞咽障碍\t
445184\n院团委\t445185\n好办\t445186\n德国足球队\t445187\n万能式\t445188\n恐音\t445189\n湄潭县人民政府\t445190\n孝顺\t445191\n佛山照明\t445192\n怨言\t445193\nRJ11\t445194\n安然纳米\t445195\n叩门\t445196\n队史\t445197\n发热包\t445198\n澳中\t445199\nGTX980\t445200\nef\t445201\n遮拦\t445202\n12580\t445203\n夏目玲子\t445204\n坑之路\t445205\n米字旗\t445206\n金主\t445207\n不利于\t445208\n常思思\t445209\n水晶杯\t445210\n自虐\t445211\n天地行\t445212\n神们\t445213\n义域\t445214\n温州日报\t445215\n二号\t445216\n龙子湖\t445217\n法夫纳\t445218\n连载中\t445219\n苹果X\t445220\nmixnine\t445221\n第一百零一章\t445222\n甲基化\t445223\n软屏\t445224\n侧板\t445225\n朱槿\t445226\nCCTalk\t445227\n东阁\t445228\n莱宝高科\t445229\n宣传员\t445230\n瑞士银行\t445231\n鲢鳙鱼\t445232\n万科四季花城\t445233\n中金岭南\t445234\n家暴\t445235\n堡坎\t445236\n韩国机场\t445237\n十分之一\t445238\n奥巴梅扬\t445239\n混沌与秩序之英雄战歌\t445240\n云墨竹\t445241\n深圳市第二实验学校\t445242\n核磁共振波谱仪\t445243\nev160\t445244\n20180128\t445245\n太平洋汽\t445246\n秦丽娟\t445247\n恐怖蜡像馆\t445248\n龙与虎\t445249\n断奶\t445250\nCISC\t445251\n搂着\t445252\n偶偶足球装备网\t445253\n萍乡学院\t445254\n卡红\t445255\n軟\t445256\n抗氧化剂\t445257\n消毒器\t445258\n追呼\t445259\npromethe\t445260\n5千瓦\t445261\nexynos\t445262\n丝网花\t445263\nnamecheap\t445264\n57984498\t445265\n兰花豆\t445266\n龙腾大道\t445267\n蓝山新闻网\t445268\n千本樱\t445269\n玻片\t445270\n分析型\t445271\n云流\t445272\n电诈\t445273\n智能家居网\t445274\n羊脂玉\t445275\n反制\t445276\n11700\t445277\n下蹲\t445278\nwin2012\t445279\n中山利和广场\t445280\n奔驰GLC300\t445281\n包庇罪\t445282\n一正\t445283\n第四部\t445284\n104个\t445285\n左姓\t445286\n分比\t445287\nCt\t445288\n湛江政府\t445289\n晴美\t445290\n防卫战\t445291\n利润\t445292\n有画网\t445293\n王歧山\t445294\n鸦\t445295\n张玉莹\t445296\n胜迹\t445297\n百色路\t445298\n王也道长\t445299\n947\t445300\n微系统\t445301\n希热多吉\t445302\n八十六\t445303\n大润发店\t445304\n丙基\t445305\nConversion\t445306\n海北藏族自治州人民政府\t445307\n江苏中天科技股份有限公司\t445308\n马桶\t445309\n花格\t445310\n赵孟頫\t445311\n新僵尸先生\t445312\n天师\t445313\n张莜雨\t445314\ncaopon\t445315\n六件事\t445316\n鹭\t445317\n手腕式\t445318\n海藻酸\t445319\nSuzuki\t445320\n烘箱\t445321\n队头\t445322\n考查询\t445323\n149\t445324\n海投网\t445325\n16斤\t445326\n勇利\t445327\nAFNet\t445328\ncvb\t445329\n天水火车站\t445
330\n大活络丸\t445331\n哈神\t445332\n一致行动人\t445333\n星战斗破苍穹之无上之境\t445334\n育星教育网\t445335\n岸本齐史\t445336\n免提\t445337\n南宁电视台\t445338\n英朗宝马m5\t445339\n团结湖公园\t445340\n内衣加盟网\t445341\n长周期\t445342\n匀质\t445343\nAV女友优\t445344\n建仓\t445345\n语音遥控器\t445346\n肾虚\t445347\n浙江省嘉兴市第一中学\t445348\n脱扣\t445349\n聊斋妖魔道\t445350\n招标会\t445351\n湛江二中\t445352\n一般般\t445353\n盐城亭湖\t445354\n荣威e550\t445355\n紫钗\t445356\n婴幼儿童\t445357\n3DTouch\t445358\n家庭存款\t445359\n三公里\t445360\n成都建设银行\t445361\nRBF神经网络\t445362\n深圳便民网\t445363\n散搭\t445364\nNQ\t445365\n马杀鸡\t445366\n李彬\t445367\n申报\t445368\n假摔\t445369\nzaobao\t445370\n瞻园\t445371\nxlsx\t445372\n270家\t445373\n好搞笑\t445374\n新文化网\t445375\nu盘启动盘制作工具\t445376\n用量\t445377\nVivitek\t445378\nBum\t445379\n崇迪\t445380\n曲阜路\t445381\n哈尔滨铁路局\t445382\n八卦江湖\t445383\n上海16区\t445384\n西藏民族学院\t445385\n4连\t445386\n锂基润滑脂\t445387\n三家店\t445388\n军事学\t445389\n查完\t445390\n古剑二\t445391\n危机\t445392\n试卷集\t445393\nec6108v9\t445394\n土耳其航空公司\t445395\n华电重工\t445396\n气盘\t445397\n下咽癌\t445398\n化石网\t445399\n大胜\t445400\n1600mm\t445401\n流态\t445402\nFunc\t445403\nc语言\t445404\n拳皇98终极之战ol吧_\t445405\n松树林\t445406\n无锡口腔医院\t445407\n图文版\t445408\n巴尔干\t445409\n上海金融学院\t445410\n蓝图\t445411\n9亿美元\t445412\n财务室\t445413\n手机号查询\t445414\nFDM\t445415\nTCL\t445416\n苏强\t445417\n购房税费装修|一起网\t445418\n个税证明\t445419\n片轮\t445420\n非但\t445421\n炎陵\t445422\n耕作层\t445423\n天津市大学软件学院\t445424\n717\t445425\n开国大校\t445426\n宠物殡葬\t445427\n黄酒\t445428\nGTB\t445429\n调研宝\t445430\n万达广场\t445431\n五个半月\t445432\nflox\t445433\n上海民族乐团\t445434\nPMW\t445435\n中国足球队\t445436\nJRPG\t445437\nhd520\t445438\niMovie\t445439\nOo\t445440\n22时\t445441\n星火作文网\t445442\n盖土网\t445443\nshijuan\t445444\n分期_\t445445\n翟中龙\t445446\n物资\t445447\n倪虹洁\t445448\neStarPro\t445449\n病毒库\t445450\nintermec\t445451\nderivatives\t445452\ncoefficient\t445453\n蒸汽阀\t445454\n座次表\t445455\nMarker\t445456\n圆o\t445457\n二战片\t445458\n上海公办小学\t445459\n完全平方数\t445460\neNet\t445461\n借期\t445462\nPigs\t445463\n积木式\t445464\nWinchester\t445465\n滴滴出行\t445466\n做主\t445467\nAN\t445468\n德罗西\t445469\n地狱之歌\t445470\n汗味\t445471\n
世仇\t445472\n安妮斯顿\t445473\n玩斗\t445474\n建设银行\t445475\n该处\t445476\ndelimited\t445477\n1.0级\t445478\n马陆葡萄\t445479\n极快\t445480\n周本顺\t445481\n保护管\t445482\n百益\t445483\n美文学\t445484\n转录组学\t445485\n活化能\t445486\n秦皇岛港\t445487\n皮裙\t445488\n方板\t445489\n广东外语外贸大学南国商学院\t445490\n买手党\t445491\n观音湖\t445492\n要见\t445493\nmda\t445494\n开卡丁车\t445495\nitalo\t445496\n乳腺科\t445497\n衣被\t445498\n征集网\t445499\n中国美术馆\t445500\n翰申集团\t445501\n魔国志2\t445502\n激流勇进\t445503\n联轴器\t445504\n20150616\t445505\nyy\t445506\n踏步板\t445507\n指上谈兵\t445508\n晓之轨迹\t445509\n风讯\t445510\n超屌\t445511\n小羊肖恩\t445512\n250\t445513\n狂欢\t445514\n百胜中国\t445515\nPGD-606\t445516\nws550\t445517\n装配机\t445518\n铁柜\t445519\n基金法\t445520\n同济大学土木工程学院\t445521\n高度数\t445522\nWin7x64\t445523\n太赫兹波\t445524\n广西壮族自治区统计局\t445525\n口带\t445526\n裘法祖\t445527\n螺蛳\t445528\n国际篮联\t445529\nBTL\t445530\nOST&BGM\t445531\n影棚\t445532\n大便器\t445533\n希格玛\t445534\n校花苗疆蛊事\t445535\n卡包\t445536\n吊耳\t445537\n肽类激素\t445538\n北票市政府\t445539\n社会化\t445540\n皇天\t445541\n遮阳篷\t445542\n弹簧振子\t445543\n迟福林\t445544\n0.45%\t445545\n清廷\t445546\n淋巴水肿\t445547\n社会发展\t445548\n伟力\t445549\n现代城\t445550\nwindowmanager\t445551\nWWW.9INX.COM\t445552\n碧沙岗\t445553\n维谛\t445554\n书史\t445555\n黛妃\t445556\n心上的罗加\t445557\n钢易网\t445558\n急淋\t445559\n永遇乐京口北固亭怀古\t445560\n始建于\t445561\n三益宝\t445562\n朝华\t445563\n教师联盟网\t445564\n000792\t445565\n拒保\t445566\n音羽雷恩\t445567\n偿付能力充足率\t445568\nNeve\t445569\n西岗街道\t445570\n员工手册\t445571\nC7000\t445572\n3230m\t445573\n肝性脑病\t445574\n嘿呀\t445575\n几分之几_\t445576\n32b\t445577\n酒后\t445578\n程序法\t445579\n真野\t445580\n长春物流公司\t445581\n黑白灰\t445582\nfulidown\t445583\n中帧\t445584\n光驱位\t445585\n假烟\t445586\nparsers\t445587\nUses\t445588\nDAQmx\t445589\n保底\t445590\nyour\t445591\n寻宝\t445592\nECE\t445593\n小菜\t445594\n保利尼奥\t445595\n龙岩市城乡规划局\t445596\n澄清\t445597\n吴山\t445598\n医堂\t445599\n经济论\t445600\n针草\t445601\n麒麟臂\t445602\nKaggle\t445603\n接穗\t445604\n编制者\t445605\n奇致\t445606\n猩猩\t445607\n更年期综合征\t445608\n3628\t445609\n山西太钢不锈钢股份有限公司\t445610\n蝶依广场舞\t445611\n山东省国有资产投资控股有限公司\t445612\n两条街\t445613\n无上至尊\t445614\n搜狐
焦点家居论坛\t445615\n钉群\t445616\n_巴\t445617\nAstronomy\t445618\ntelescope\t445619\n白吉馍\t445620\n牙科医院\t445621\n衡泰\t445622\n宫贴\t445623\n第14届\t445624\n_麦块\t445625\n吴晓求\t445626\n张戎\t445627\ncontenttype\t445628\n苏地\t445629\n两腰\t445630\n五段式\t445631\n西安广播电视台\t445632\n异构烷烃\t445633\n全题\t445634\n小北\t445635\nheck\t445636\n百度网盘链接\t445637\n焊点\t445638\n177号段\t445639\n蜻\t445640\n山东农信\t445641\nwarhammer\t445642\n交响\t445643\n牛大宝\t445644\n九代半\t445645\n丽江客栈\t445646\n在旅途\t445647\n紧跟党走\t445648\nsimonbaker\t445649\nwating\t445650\n微表情心理学\t445651\n小连\t445652\n银欣\t445653\n超碰\t445654\n陇\t445655\n康梅\t445656\nRomax\t445657\n18.2.3\t445658\n刘凤洲\t445659\n赤溪\t445660\n雁门关\t445661\n菲华\t445662\n6.0.4\t445663\n好山\t445664\n专门网\t445665\n稠州论坛\t445666\n吉迪恩\t445667\n七百年后\t445668\nPalindrome\t445669\n60px\t445670\n德昌县\t445671\n手足\t445672\n崽崽\t445673\n微元法\t445674\n桂治洪\t445675\n罗宁\t445676\n鸟岛\t445677\n掉落\t445678\nmatte\t445679\n我\t445680\nSelleck\t445681\n小米吧\t445682\nstm32f103zet6\t445683\n秦简\t445684\n孙婷婷\t445685\n月桂酸\t445686\n传统型\t445687\n蚌埠学院\t445688\n纯文本\t445689\n老枞\t445690\nWORD批量\t445691\n台铃\t445692\n青海省气象局\t445693\nORZ\t445694\n地海\t445695\n显宗\t445696\n伏寿\t445697\ncritically\t445698\n向华\t445699\n张鑫旭\t445700\n油麻藤\t445701\n智能坐便器\t445702\nchrom\t445703\n135亿\t445704\n生死抉择\t445705\n期中测试卷\t445706\n燕垒生\t445707\n送货员\t445708\n武汉生物工程学院\t445709\n恒温恒湿空调\t445710\n王莉莉\t445711\n天鹅臂\t445712\n列表框\t445713\n夜谈\t445714\n二珂\t445715\n蜈蚣草\t445716\n蒙奇·D·路飞\t445717\n胃液\t445718\n毕赣\t445719\n一大早\t445720\n充气机\t445721\n单联双控开关\t445722\n大慈寺\t445723\n相对\t445724\nxsj\t445725\nsharepoint2013\t445726\nkubelet\t445727\ngcr\t445728\nomitted\t445729\n反作弊\t445730\ndsn\t445731\nsuperX\t445732\n76a\t445733\n认证期\t445734\nWiFi/路由器\t445735\n深圳农村商业银行\t445736\n条式\t445737\nZ370M\t445738\n中国人民解放军纪律条令\t445739\n客票\t445740\n有份\t445741\n金轮法王\t445742\n林距离\t445743\n施今墨\t445744\n霉味\t445745\n幻想群侠传\t445746\n2017十一\t445747\n病害\t445748\n襄汾县\t445749\n国际大学\t445750\nHomeAssistant\t445751\n9370\t445752\n东方红艳\t445753\n龙湖香醍\t445754\nsumlime\t445755\n一宝\t445756\n涅磐\t445
757\nㄇ\t445758\n监察法\t445759\nLarsson\t445760\nFeeling\t445761\n14次\t445762\n星织\t445763\n万宁\t445764\n顾客\t445765\n串串狗俱乐部\t445766\n官符\t445767\n明月村\t445768\n2栋\t445769\n大学生兼职\t445770\n大兴安岭\t445771\n稍作\t445772\n日军\t445773\n0021\t445774\n福睿斯论坛_汽车之家论坛\t445775\n通用航空产业园\t445776\n对过\t445777\n孟津\t445778\n青岛恒星科技学院\t445779\n口工\t445780\nLes\t445781\n无限恐怖吧\t445782\n莱昂纳多·迪卡普里奥\t445783\n平行四边形\t445784\nBIP\t445785\n62天\t445786\nAPP\t445787\n燕赵都市报\t445788\n充沛\t445789\npli\t445790\n拔节期\t445791\n豪爽\t445792\n明题\t445793\n西安交通大学出版社\t445794\nskysowe\t445795\n一块块\t445796\n加一\t445797\n党史馆\t445798\n卢杉\t445799\n两到三年\t445800\nairplay\t445801\n祖海\t445802\n温州医科大学附属第二医院\t445803\n坐实\t445804\n炖豆腐\t445805\nzc\t445806\n闲言\t445807\njquary\t445808\n体重计\t445809\n阿武\t445810\n玩\t445811\n生态酒店\t445812\n资产评估师\t445813\n光通\t445814\n纸壳\t445815\nwifidog\t445816\n印刷\t445817\n特种设备安全法\t445818\nComponents\t445819\n帛书\t445820\n100m3\t445821\nhkicpa\t445822\n拖机\t445823\n千凯千\t445824\nomnic\t445825\nmatebook\t445826\n周鑫\t445827\nLicense\t445828\n上海科创中心\t445829\n讨厌\t445830\n十二烷基苯磺酸钠\t445831\n益田假日广场\t445832\n中国矿业网\t445833\nsmeb\t445834\n美国化学会\t445835\n嘉民\t445836\nTurned\t445837\n卢瓦尔\t445838\n冷水鱼\t445839\n捞饭\t445840\n二雷\t445841\n美房\t445842\n上海企业信用网\t445843\n斯威特\t445844\n电动轮\t445845\n冠华苑\t445846\nArchetype\t445847\n李昌春\t445848\n邮乐网\t445849\nCUTE\t445850\n金珉锡\t445851\n农贸\t445852\n社会科学类\t445853\n后退一步\t445854\n偷食\t445855\n惆怅\t445856\n疯光\t445857\n艾萨\t445858\n花铃\t445859\n狂草\t445860\n知识局\t445861\ni帧\t445862\n1000年\t445863\n刻\t445864\n复用\t445865\n鲅鱼圈\t445866\n周庄古镇\t445867\n东莞物流公司\t445868\n死链\t445869\n直房\t445870\n电动二通阀\t445871\n老乡鸡\t445872\n利州\t445873\n机考\t445874\n维基百科\t445875\nb股\t445876\nvelodyne\t445877\n吉田\t445878\n黑龙江代表团\t445879\n瓶窑镇\t445880\n广州电子\t445881\nctrl+f5\t445882\n创客节\t445883\n黑松白鹿\t445884\nglassfish\t445885\n后背\t445886\n精神病药\t445887\n电信3G\t445888\n林忆莲\t445889\n51230\t445890\n洋土豪\t445891\nSteps\t445892\n02318\t445893\ntinypng\t445894\n舱\t445895\nWebStorm2018\t445896\n挑杆\t445897\n客集齐网\t445898\n心肺复苏\t445899\n刘煜\t445900\n微
醉\t445901\n七册\t445902\nwimage\t445903\n驳\t445904\n20140424\t445905\n锁片\t445906\n全莎\t445907\n文夕\t445908\n370元\t445909\n张真源\t445910\n格里高利\t445911\n蒜味\t445912\n逍遥江湖\t445913\n揪住\t445914\nos3\t445915\n缙云山\t445916\n振动棒\t445917\nE-MapReduce\t445918\nTab页\t445919\n杭黄高铁\t445920\n三千金\t445921\n自然数\t445922\n四十多年\t445923\nstairs\t445924\n58空调网\t445925\nCDRX6\t445926\nDate类\t445927\n天雷\t445928\n拒录\t445929\n送礼\t445930\n信电学院\t445931\nyear\t445932\n造势\t445933\n梨状肌综合征\t445934\nHOTMAIL\t445935\n好味\t445936\n准予\t445937\n避躲\t445938\n上海市安全生产监督管理局\t445939\n众目\t445940\n东城区人力资源和社会保障局\t445941\nDO\t445942\n电鱼机\t445943\nv5.0.0\t445944\n小贼\t445945\n浙派\t445946\n艾礼富\t445947\nNIPS\t445948\n金煌芒\t445949\n赵彦\t445950\n炸带鱼\t445951\n韩锋\t445952\n310号\t445953\n西安饭庄\t445954\n履职\t445955\n橡皮\t445956\n无遗\t445957\nWinston\t445958\n1729\t445959\n4.13\t445960\n傅清泉\t445961\n保监\t445962\n232串口\t445963\nDSLR/DSLM论坛\t445964\nYunOS\t445965\n面包糠\t445966\n分权\t445967\n店主们\t445968\n上蔡县\t445969\n卷积\t445970\n相对介电常数\t445971\nprecursor\t445972\n寥落\t445973\n工农村\t445974\n魔炮\t445975\n纳米金刚石\t445976\n脚尖\t445977\n昆丁\t445978\nX6Plus\t445979\n口袋书\t445980\n超级鸡马\t445981\n指控\t445982\n王笑文\t445983\n苏艺\t445984\n挤塑\t445985\n罗峰\t445986\n岳阳乐居网\t445987\n竟业\t445988\n钢梯\t445989\n免费八字算命\t445990\n碍眼\t445991\n马忠\t445992\n毒刃\t445993\n第51条\t445994\n苦行\t445995\n艳妓\t445996\n莎剧\t445997\nVDS\t445998\n三仁\t445999\n王峥\t446000\n魂七魄\t446001\n移动预付费卡\t446002\n世界史\t446003\n弄权\t446004\nhasn\t446005\n12月中旬\t446006\n结绳\t446007\nstm32f429\t446008\nSIRE\t446009\nptrace\t446010\n周初\t446011\n3.9.3\t446012\n二百米\t446013\n耽误\t446014\n常熟农商银行\t446015\nYamato\t446016\n妖师\t446017\nSLD\t446018\n百校联考\t446019\n亚马逊网\t446020\n人人影视网\t446021\n山下村\t446022\n脑脓肿\t446023\n县扶贫办\t446024\njquery+css3\t446025\n冒汗\t446026\n呢绒\t446027\n中国农业银行\t446028\n爱卡\t446029\ncnzz\t446030\n脑花\t446031\n360se\t446032\njop\t446033\n异能\t446034\n血敏\t446035\nJexus\t446036\njoshua317\t446037\nprevalence\t446038\ndynare\t446039\n优篇\t446040\n信机\t446041\n448\t446042\n磁盘阵列卡\t446043\n极地馆\t446044\n格板\t446045\n蓝天\t44604
6\nlava\t446047\nwent\t446048\n蹊跷\t446049\nKAWD\t446050\n跳\t446051\n河南理工\t446052\n安辉\t446053\n四分之二\t446054\nRYONA\t446055\nmorphvox吧\t446056\n卖淫案\t446057\n钱勇\t446058\n老鸭\t446059\n宣酒\t446060\n回差\t446061\n海军史\t446062\nDrupal\t446063\n20支\t446064\n云浮市\t446065\n导览机\t446066\n寒寒\t446067\n百度框架户\t446068\n中国再保险\t446069\n阿东\t446070\n欣悦\t446071\n王涛\t446072\n歌坛\t446073\nCFS\t446074\n子vi\t446075\n李悦君\t446076\n车位\t446077\n南彩镇\t446078\n几号位\t446079\n永沢\t446080\n邮展\t446081\n200MM\t446082\n赌神\t446083\n我善养吾浩然之气\t446084\n北京公交特\t446085\n古滇\t446086\n布灯\t446087\nOverview\t446088\n孩爸\t446089\nLive版\t446090\n第4季\t446091\nGrinder\t446092\n画点\t446093\n高坝\t446094\n虚字\t446095\n平版\t446096\n妻子的谎言\t446097\n结课\t446098\n圣阳股份\t446099\n无root\t446100\nHampton\t446101\n母妹\t446102\n赛文\t446103\nintl\t446104\nwang\t446105\n同学\t446106\n勃利\t446107\n唐山人才网\t446108\n一课\t446109\n氟哌噻吨美利曲辛片\t446110\n电调\t446111\nteentube\t446112\n借势营销\t446113\n房集\t446114\n于英涛\t446115\n王柏元\t446116\nladybug\t446117\n撞翻\t446118\n李玉琴\t446119\n轮轨\t446120\n吕鑫\t446121\n苏童\t446122\n莆阳\t446123\nNoNoNo\t446124\n寇驰\t446125\n淋浴头\t446126\n顺酐\t446127\nfate\t446128\n李悠悠\t446129\n道格拉斯\t446130\n前列腺液\t446131\n上交会\t446132\nJONSBO\t446133\n保护欲\t446134\nXT800\t446135\n20170207\t446136\n守\t446137\n黑五\t446138\n几千万个\t446139\n全新版大学英语听说教程\t446140\n没落\t446141\n青岛保税港区\t446142\ndatastage\t446143\n小池\t446144\n板烧\t446145\n六名\t446146\n山楂干\t446147\n上传送\t446148\n怒砸\t446149\n刺客信条4:黑旗\t446150\n相斥\t446151\n一万平方米\t446152\n永垂\t446153\n六里坪\t446154\njimei\t446155\nreality\t446156\n环锭纺\t446157\n海立股份\t446158\n36000\t446159\n希瑞\t446160\n集释\t446161\n寒夏\t446162\n回家路\t446163\n木渎\t446164\n弗里茨\t446165\n许慧欣\t446166\n宁波市国资委\t446167\nepisodes\t446168\n杭瑞高速\t446169\n手长\t446170\n陕西法制网\t446171\n招标公告-采招网\t446172\n巨门\t446173\n云计算服务器\t446174\n蒙阳镇\t446175\n面盔\t446176\nBar\t446177\n咋地\t446178\n灵衣\t446179\n持国\t446180\n针叶树\t446181\n采莲曲\t446182\nViagra\t446183\n少年锦衣卫\t446184\n连弩\t446185\n军功\t446186\nNewborn\t446187\n斜拉索\t446188\n临高启明吧_\t446189\n聚居地\t446190\n60个\t446191\n扣掉\t446192\n纵使\t446193\
n足球类\t446194\n615路\t446195\nMplus\t446196\n丙烯颜料\t446197\n戴娇倩\t446198\n超声波驱鼠器\t446199\n抢位\t446200\nmeizi\t446201\n感光\t446202\n一瓢\t446203\n晕针\t446204\n暗月世界\t446205\n长郡中学\t446206\n低水\t446207\n百里山水画廊\t446208\nDoit\t446209\nhitting\t446210\n高老庄\t446211\n成套\t446212\n厦门软件园三期\t446213\nClinicalTrials\t446214\n荷戈士\t446215\n亜\t446216\n谈股论金\t446217\n班底\t446218\n50组\t446219\n中共重庆市委\t446220\n如胶似漆\t446221\n罗姆\t446222\n三国荣威i6\t446223\n点击率\t446224\n中国航海学会\t446225\nkeka\t446226\nエッチ\t446227\n金盛集团\t446228\n杉田智和\t446229\n昭阳\t446230\n科技信息\t446231\n血塞通片\t446232\n反正弦函数\t446233\n缺考\t446234\n砖宝\t446235\n下周三\t446236\n从今往后\t446237\nrevel\t446238\n中国长城资产管理公司\t446239\n三相发电机\t446240\n东方未明\t446241\n压边\t446242\nmsm\t446243\nqlik\t446244\n数据篇\t446245\nMISSION\t446246\nsekai\t446247\n机会难得\t446248\n陆小凤传奇\t446249\nnrc\t446250\n12:30\t446251\n风萧萧兮易水寒\t446252\n双管齐下\t446253\nPBR\t446254\n三杯\t446255\n脉冲除尘器\t446256\n王雄\t446257\nWin7/Win8/Win10\t446258\nMySQL触发器\t446259\nJanson\t446260\n0573\t446261\n堵板\t446262\n长不大\t446263\n年代尺\t446264\n茅台酒厂\t446265\n材料院\t446266\n续展\t446267\n索尔迦雷欧\t446268\n团团圆圆\t446269\n鸭肝\t446270\n江南水乡\t446271\n2017税务师考试\t446272\n鱼洞\t446273\n可微函数\t446274\n护肤\t446275\nu罗汉\t446276\nmocvd\t446277\n远方的海\t446278\n本尊\t446279\n转介\t446280\n高视\t446281\nNBA新秀网\t446282\nRabbitmq\t446283\nv3.12\t446284\n影音先锋资源|影音先锋电影网站|影音先锋\t446285\nsurging\t446286\n植物病理学\t446287\n恒丰集团\t446288\nmongoimport\t446289\n挂学\t446290\n云南旅游\t446291\n祖代\t446292\n油性笔\t446293\nAspera\t446294\nDoubleLi\t446295\n腌料\t446296\nav12电影网\t446297\n奖励费\t446298\n拉筋板\t446299\n微信通\t446300\n002405\t446301\n正中集团\t446302\n迟子建\t446303\n玛丽一世\t446304\n李枖原\t446305\n询盘\t446306\n原句\t446307\n南沿江高铁\t446308\n孟加拉湾\t446309\nretry\t446310\n林峰海\t446311\n案底\t446312\n九妹\t446313\nFortuner\t446314\n灵草\t446315\n门面\t446316\n沐舒坦\t446317\n居正\t446318\n鑫诚\t446319\n趣科技\t446320\nusb3.0接口\t446321\n统计版\t446322\n闪闪\t446323\n东方主义\t446324\n容瑾瑜\t446325\n动动\t446326\nzhin\t446327\n龙辰\t446328\n温柔的背后\t446329\n運\t446330\ntar\t446331\n尧都农商银行\t446332\n禅悦\t446333\nIntroduction\t4463
34\n_民商法\t446335\n飘扬\t446336\n难处\t446337\n期望\t446338\n水袖舞\t446339\n上海松江大学城\t446340\n三能\t446341\n奥斯卡\t446342\n服务片\t446343\nss套\t446344\n陈朔\t446345\n2015年11月\t446346\n无土\t446347\n投融界|\t446348\n高考语\t446349\n演艺生涯\t446350\n天河山\t446351\n道长\t446352\nNigeria\t446353\n面饼\t446354\n天涯文学\t446355\nV6\t446356\nAV色情电影网\t446357\nsimditor\t446358\n吹制\t446359\nTES\t446360\n被查封\t446361\nrbd\t446362\n中巴车\t446363\n三力\t446364\n昆都仑\t446365\nChair\t446366\n蓝球\t446367\nPhone\t446368\n芍药居北里\t446369\n被遗忘者\t446370\n金鸿控股\t446371\n周国\t446372\n四个月\t446373\n00763\t446374\n莲花山公园\t446375\n现汇\t446376\n花材\t446377\n海上孟府\t446378\n摩卡棕\t446379\n黄江镇\t446380\nCEF3\t446381\nDots\t446382\nWinbox\t446383\n黑膏药\t446384\n7890\t446385\nAV女优十大名器\t446386\n南北方人\t446387\n5米\t446388\n王欣\t446389\n500吨\t446390\n留园\t446391\n电子计数器\t446392\n第52集\t446393\n自从\t446394\nBeaute\t446395\n第四弹\t446396\n骚妈妈\t446397\n杆秤\t446398\n田英章\t446399\n彭墩\t446400\n孙权劝学\t446401\n丹东街道\t446402\n宣传报\t446403\n代表大会\t446404\n奇瑞艾瑞泽5\t446405\n老乡们\t446406\nthreadpool\t446407\n华为手机助手\t446408\n小炉\t446409\n华耐家居\t446410\nArmour\t446411\nubunutu\t446412\n山东轻工职业学院\t446413\n中间体\t446414\n混业\t446415\n华润苏果\t446416\n满分网\t446417\n星际\t446418\nQiao\t446419\n怪物猎人\t446420\nsevlet\t446421\n菜篮子\t446422\nv3.0.2\t446423\n山东银监局\t446424\ndisney\t446425\n重庆市科学技术委员会\t446426\n英雄无敌3死亡阴影\t446427\n民用\t446428\n阿坨坨\t446429\n逾期付款违约金\t446430\nkisses\t446431\n扭转\t446432\n独有\t446433\nwarships\t446434\n尼康d5\t446435\nPerforce\t446436\n海顿\t446437\n公寓房\t446438\n一师\t446439\n南海观音\t446440\n老年代步车\t446441\n航吊\t446442\n浦口\t446443\n萧潜\t446444\n李思思\t446445\nebooking\t446446\nsetw\t446447\n风暴英雄\t446448\n浙江艺术职业学院\t446449\n24名\t446450\n查令十字街\t446451\n天门冬氨酸氨基转移酶\t446452\ngqi\t446453\n于杰\t446454\n穿越火线吧\t446455\ndalvik\t446456\nwarkey\t446457\n改变你\t446458\nteddy\t446459\nComputers\t446460\n豆列\t446461\n虫蛀\t446462\n早退\t446463\n长江水文网\t446464\n落基山脉\t446465\n埋胸\t446466\n百点\t446467\n实体\t446468\n生物大灭绝_\t446469\n028-68024714\t446470\n82张\t446471\ncct\t446472\n茶席\t446473\n三星级酒店\t446474\n满宝馄饨\t446475\n1885年\t446476
\n零乱\t446477\n花林\t446478\n市社\t446479\n优先受偿权\t446480\n平安国际融资租赁有限公司\t446481\n一大瓶\t446482\nQueryRunner\t446483\njeesite4\t446484\n重涂\t446485\nws832\t446486\nMTM\t446487\n东林大佛\t446488\n敬酒\t446489\n称奇\t446490\n云锭\t446491\n篮球迷\t446492\nDelicious\t446493\n胸前\t446494\n泉州经贸职业技术学院\t446495\n云投生态\t446496\nSTARLESS\t446497\n健身球\t446498\n3dmax2019\t446499\n樱兰高校男公关部\t446500\n济南有人物联网技术有限公司\t446501\n儒释道\t446502\n泰山石敢当\t446503\n4K\t446504\n赞美诗网\t446505\nbnl\t446506\n河桥镇\t446507\n3门\t446508\n广西电台\t446509\n热血高校\t446510\n阿玲\t446511\n三菱电机自动化(中国)有限公司\t446512\npinning\t446513\n社会保障部\t446514\n38%\t446515\n黑虎虾\t446516\nrecreate\t446517\n3公分\t446518\n导表\t446519\n三分之一_\t446520\n美兹\t446521\n高陂\t446522\n高斯林\t446523\n小锤\t446524\n艾莉亚·史塔克\t446525\n20181号\t446526\n哈卷\t446527\nLucene5\t446528\nCASTEP\t446529\n公切线\t446530\n湖南法院\t446531\n花红柳绿\t446532\n2073\t446533\n平望\t446534\n阿法狗\t446535\n一联\t446536\n劳务协议书\t446537\ncvv2\t446538\n休假\t446539\nMoby\t446540\n南京市鼓楼医院\t446541\nEviews6.0\t446542\nlikes\t446543\n努比亚Z11miniS\t446544\n一万多元\t446545\n不当得利\t446546\nwin7主题\t446547\n煊赫\t446548\nXTC820\t446549\n渝北中央公园\t446550\n分集\t446551\n渤公岛\t446552\n曹云金\t446553\n一刚\t446554\niptables\t446555\n东东\t446556\n两百多\t446557\nnas\t446558\n绝地求生加速器\t446559\n茅草根\t446560\n厦门市市场监督管理局\t446561\n高赞\t446562\n一键版\t446563\n下拉菜单\t446564\nLethal\t446565\n不争春\t446566\n蝴蝶扣\t446567\n注册会计师考试通过率\t446568\n转转转\t446569\n老梁故事汇\t446570\n长沙卫生职业学院\t446571\n星沙新闻网\t446572\n重庆医药\t446573\n围困\t446574\nv4.1.6\t446575\n104天\t446576\n巴彦淖尔市\t446577\npsd模\t446578\n_齐家网\t446579\ngen9\t446580\n三成\t446581\n精神错乱\t446582\n自立自强\t446583\n主物\t446584\n2.0.5\t446585\n丰泽园\t446586\n黑棕\t446587\n小组合作学习\t446588\n高丽参\t446589\n锻冶\t446590\n齐鲁交通发展集团\t446591\n页条\t446592\nSubstitute\t446593\n暗黑破坏神2战网\t446594\n判断类\t446595\n魔能2\t446596\n电脑雕刻机\t446597\n巡逻舰\t446598\n6所\t446599\n渣打\t446600\n盛景\t446601\n中山大学物理科学与工程技术学院\t446602\n舒活\t446603\n16件\t446604\nlspci\t446605\n在线姓名测试\t446606\n老花\t446607\n阿芙乐尔\t446608\n卡斯柯\t446609\n女皇\t446610\n宜美\t446611\n王小石\t446612\n行政权\t446613\n三条路\t446614\n卡箍
式\t446615\n新天龙八部吧_\t446616\n敢死连\t446617\n两国\t446618\n干辣椒\t446619\n李志林\t446620\n溶蚀\t446621\nG60\t446622\n保湿型\t446623\n江苏省发展和改革委员会\t446624\n2015.9\t446625\n公寓式酒店\t446626\n张悦然\t446627\n济工\t446628\nLCR\t446629\n中山大道西\t446630\n大驱\t446631\n角膜塑形镜\t446632\n林清轩\t446633\n土桥镇\t446634\n风带\t446635\n乐看\t446636\n供果\t446637\nKeira\t446638\n全画\t446639\n40千米\t446640\nODBC\t446641\n针针\t446642\n黄耀明\t446643\n上海高级金融学院\t446644\n牛妈\t446645\nAo\t446646\nINFP\t446647\n胡泊\t446648\n88种\t446649\n重庆格力空调\t446650\n军人证\t446651\n常吃\t446652\n紫金府\t446653\n张家港农商行\t446654\n文件格\t446655\n20160430\t446656\n家计\t446657\n_连云港政府网\t446658\nGA\t446659\nAWAY\t446660\n血浆脂蛋白\t446661\n追风者\t446662\n贝加尔湖\t446663\n乐豪斯\t446664\n兰州大学\t446665\nn元\t446666\n3140\t446667\nnsc\t446668\n洞庭路\t446669\n触点\t446670\nPhotoshopCS\t446671\n发光\t446672\numc\t446673\n举头望明月\t446674\nGesture\t446675\n勾线笔\t446676\n灰蓝\t446677\n放射学\t446678\n游记\t446679\nsooshong\t446680\n拉条\t446681\nWebSphere\t446682\n九龙山国家森林公园\t446683\n清远市住房和城乡建设局\t446684\n1283\t446685\n增发\t446686\n韩诺\t446687\nVape\t446688\n动产抵押登记\t446689\nwithout\t446690\noutlook365\t446691\n南极之恋\t446692\n嚷嚷\t446693\n十指紧扣\t446694\n灼眼\t446695\n窃读\t446696\n混沌大学\t446697\n神超\t446698\nVALSE\t446699\n安度因\t446700\n嗓门\t446701\n口袋妖怪漆黑的魅影\t446702\n124\t446703\n少林武王\t446704\n湘仪\t446705\n温经\t446706\nExcel批量\t446707\n气体报警器\t446708\n王伟\t446709\n三农_央视网\t446710\n渝黔\t446711\n酒仙桥路\t446712\nv2.6.5\t446713\n人文地理\t446714\n风中程序猿\t446715\n露阴癖\t446716\n引用\t446717\n九色\t446718\n茂凯\t446719\n找到了\t446720\nCVC\t446721\n河北省司法厅\t446722\n收费版\t446723\n1383\t446724\n东莞市第一人民法院\t446725\n海魂衫\t446726\n范文程\t446727\n扁平疣\t446728\n330万\t446729\n油船\t446730\n香澄遥\t446731\n巴州区\t446732\n中国金融认证中心\t446733\n50万个\t446734\n内控\t446735\n梅宏\t446736\n陈思坦\t446737\n汕头市城乡规划局\t446738\n裱\t446739\n兰英\t446740\n老布什\t446741\n坦克世\t446742\n团结小区\t446743\n湘西土家苗族自治州人民政府\t446744\n天投网\t446745\n淬火炉\t446746\n怡和\t446747\n涡轮增压器\t446748\n芫荽\t446749\n撩倒\t446750\n器重\t446751\n宝物\t446752\n海瓜子\t446753\n随机密码生成器\t446754\n401路\t446755\n玛尔\t446756\n十小咒\t446757\n宝珠\t446758\n销售业
\t446759\n2014年前\t446760\n快豹\t446761\n重制版\t446762\n恶魔小姐:人事恶魔椿真子\t446763\n上腭\t446764\n刻纹\t446765\n领动\t446766\n540i\t446767\n第7天\t446768\n腹内\t446769\n赛博加速器\t446770\nplantronics\t446771\n7.0.2\t446772\n招标工程\t446773\n肉粽\t446774\ngeolocation\t446775\n服务器版\t446776\n酸洗板\t446777\n1000欧元\t446778\n上饶市政府\t446779\n编程人员\t446780\n鞋子\t446781\nSeekBar\t446782\ndoraiba\t446783\n奥拉星传奇\t446784\n千羽\t446785\n裙角\t446786\n拟定\t446787\n割须\t446788\nAss\t446789\n叶底\t446790\n文老师\t446791\nFlyer\t446792\n74hc595\t446793\n突破性\t446794\nPerfumes\t446795\n挥发量\t446796\nri\t446797\n中一班\t446798\npolymeric\t446799\n望月怀远\t446800\n石屋\t446801\nIT部\t446802\n劳力者\t446803\naiqi\t446804\nsg2.ledu.com\t446805\n砍刀\t446806\n滨江广场\t446807\n清一色\t446808\n静待\t446809\n张东健\t446810\n奔袭\t446811\noped\t446812\ntdd\t446813\n生仔\t446814\nXls\t446815\n湖南省儿童医院\t446816\n小建\t446817\n道玄\t446818\n背心袋\t446819\n中隔\t446820\n金陵湾\t446821\nbeautybox\t446822\n加索尔\t446823\n肖斌\t446824\n淫娘\t446825\n二手房\t446826\nJener_Yan\t446827\n万首\t446828\n东奥会计继续教育\t446829\nab\t446830\n三十六大\t446831\n手色\t446832\n何庆魁\t446833\n悦享\t446834\n鹿丹村\t446835\nBALDR\t446836\n荒僻\t446837\n魔劣\t446838\n加签\t446839\n王洪图\t446840\n0482\t446841\nCredential\t446842\n张建锋\t446843\n新疆分公司\t446844\ngetcwd\t446845\n为人民服务\t446846\n爱康国宾体检\t446847\n2dlc\t446848\nopenmediavault\t446849\n通策医疗\t446850\n淡\t446851\n商事\t446852\nkn007\t446853\n街篮\t446854\n种权\t446855\n自费\t446856\n咸宁市水务局\t446857\n河南省公安厅\t446858\n投笔从戎\t446859\nPractice\t446860\n地方党政领导干部安全生产责任制规定\t446861\n郁李仁\t446862\nStaple\t446863\n巨丰投顾\t446864\n十八次\t446865\n神昏末劫\t446866\n南华县人民政府\t446867\n适任\t446868\n活性炭吸附箱\t446869\n拆局\t446870\n止步\t446871\n残爱\t446872\n抗日神剧\t446873\n职业套装\t446874\nNvidia\t446875\n结存\t446876\n卡妙\t446877\nDimension\t446878\n百万美元\t446879\nheapq\t446880\n庄奴\t446881\n温州市公安局\t446882\n爱看图标网\t446883\nregime\t446884\nmml\t446885\n李立国\t446886\n刘国梁\t446887\n宅基地使用权\t446888\nBlvd\t446889\n痴爱\t446890\n淑玉池\t446891\n100\t446892\n得救\t446893\ni6\t446894\n_明清书家\t446895\nサイズ\t446896\n应用双开\t446897\n推算\t446898\n二十岁\t446899\n浙江石油化工有限公司\t
446900\n最赞\t446901\n不露脸\t446902\n无德\t446903\n吴老师\t446904\n侧枝\t446905\n5999元\t446906\n风卷残云\t446907\n美链\t446908\n钟毅\t446909\n7860\t446910\n1CD\t446911\n牛欢喜\t446912\n自助餐券\t446913\n扬州东区\t446914\n恐怖谷\t446915\n药机\t446916\n撤换\t446917\n哥本哈根\t446918\n姊姊\t446919\n椎体压缩性骨折\t446920\n绩溪\t446921\n秘银\t446922\n十九号\t446923\n雷鼓\t446924\n热障\t446925\nv4.2.0\t446926\n高端医疗保险\t446927\nㄒ\t446928\n童贯\t446929\n美洲花园\t446930\nHANG\t446931\nosmo\t446932\n_猫\t446933\npedestrian\t446934\nTestament\t446935\n上海交通大学医学院\t446936\nPERL\t446937\nl两个\t446938\n办建\t446939\n2杯\t446940\n凌姓\t446941\n中山大学肿瘤医院\t446942\n北京大学光华管理学院\t446943\n多宝鱼\t446944\nkey、value\t446945\n云巅\t446946\n汤浅\t446947\n鬼斧神工\t446948\nbaidupan\t446949\n卧床\t446950\n白云国际机场\t446951\n丰庆路\t446952\nfine-tuning\t446953\n高二\t446954\n智孝\t446955\n甘肃人大\t446956\n食管反流\t446957\n北科生物\t446958\n730\t446959\nφ\t446960\n99平\t446961\n栎木\t446962\n手写本\t446963\n和值\t446964\n刺客信条:起源\t446965\n李丽英\t446966\n活性炭过滤器\t446967\nunic\t446968\nBSN\t446969\n画好\t446970\n骁龙处理器\t446971\n芒果影院\t446972\n尽孝\t446973\n冯提模\t446974\nxiee\t446975\n当户\t446976\n幽会\t446977\n陈童\t446978\n各種\t446979\n雨果奖\t446980\n180平方\t446981\n民国\t446982\n郑多燕\t446983\n史泰扎克伯格\t446984\n拉丁美洲\t446985\n熟饼\t446986\n一墅\t446987\nprevents\t446988\n梦想者\t446989\n737NG\t446990\n玉龙\t446991\n陈乐基\t446992\n關於\t446993\n波立维\t446994\n阿图什\t446995\n肯德尔\t446996\n夏泽\t446997\n港媒\t446998\n阿里宝\t446999\n东方电机\t447000\n竹妃\t447001\nORCL\t447002\n善缘\t447003\n沙依巴克区\t447004\n滤膜\t447005\n黄龙寺\t447006\n地牢猎手\t447007\n银信\t447008\n宝马X3论坛\t447009\n血路\t447010\n万福广场\t447011\nmozilla\t447012\n建艺集团\t447013\nhp1106\t447014\nsculpture\t447015\n张楚岚\t447016\n北银创投\t447017\n乐仁堂\t447018\n六家\t447019\n瑶溪\t447020\nStellarium\t447021\n阿尔斯\t447022\n儿童白血病\t447023\n猪獾\t447024\n秋月朗\t447025\n新界\t447026\n柳真\t447027\nAruba\t447028\n甘肃林业职业技术学院\t447029\n化学\t447030\n身子\t447031\nROM包\t447032\n好难受\t447033\ndownloadmanager\t447034\n螺杆真空泵\t447035\n楼牌\t447036\n恒转矩\t447037\n高招\t447038\n水客\t447039\n_汽配中国网\t447040\n御玺\t447041\n加密机\t447042\n军事科学院\t447043\nRoku\t447044\n题卷\t447045\
n红霉素\t447046\n静音型\t447047\n梅花管\t447048\n超级电视\t447049\n20161209\t447050\n九歌\t447051\nmonopoly\t447052\n翼王\t447053\n索证\t447054\n薪酬委员会\t447055\n法穆兰\t447056\nObjective-C\t447057\n黑水鸡\t447058\n傲视天地吧\t447059\npylon\t447060\n源版\t447061\n梅林镇\t447062\n兼\t447063\n喉箍\t447064\nMantra\t447065\n云指\t447066\nure\t447067\n华山火车站\t447068\n华电集团\t447069\n周国贤\t447070\n博地\t447071\n怀旧版\t447072\n搅动\t447073\n艾斯奥特曼\t447074\n悦己者\t447075\n祭台\t447076\nLyo\t447077\n1279\t447078\n伊索香芹籽\t447079\n东方国际\t447080\n乱画\t447081\n梅林街道\t447082\n鳌头镇\t447083\nhil\t447084\nrx5\t447085\n吕向阳\t447086\n啤机\t447087\n拔鼠\t447088\n沿途\t447089\n饴糖\t447090\nsafer\t447091\n更丰富\t447092\n立牌\t447093\n钓浮\t447094\n重庆理工\t447095\n仙山\t447096\n富士康兄弟网\t447097\n上海投资理财\t447098\n汉灵帝\t447099\n胶阀\t447100\n金·卡戴珊\t447101\n瑞安房产网\t447102\n角膜炎\t447103\n煤气柜\t447104\n关伟\t447105\n吉非替尼片\t447106\nQuantSeven\t447107\n刨腹产\t447108\n匠圣vivox21\t447109\n爬行动物\t447110\nchromatography\t447111\n投票器\t447112\n长链\t447113\nLexar\t447114\n中共福建省委\t447115\n草片\t447116\n学籍\t447117\n摸奶节\t447118\ncuda\t447119\n佐川\t447120\n坝坝\t447121\n爱玛特\t447122\n投资界\t447123\n指令单\t447124\n消音器\t447125\n兰格钢铁网\t447126\n五感图\t447127\ngril\t447128\n65位\t447129\nCanyon\t447130\nneeded\t447131\n50x\t447132\n幼学\t447133\n往上\t447134\n口味\t447135\n博博\t447136\n新埠岛\t447137\n百度云音源\t447138\nWIN/MAC\t447139\n斯佩伯爵\t447140\n单选\t447141\n盖瑞\t447142\n男仔\t447143\n福海路\t447144\n人狼村之谜\t447145\n女传\t447146\ncoreldrawx4\t447147\n少年郎\t447148\nSL500\t447149\n八珍颗粒\t447150\n033\t447151\n墨汁\t447152\n卡巴斯基免费版\t447153\n徽信\t447154\n天然橡胶\t447155\nAxe\t447156\n2177\t447157\n宽大\t447158\n被辱\t447159\n百度经验\t447160\n小坏\t447161\n1976年\t447162\n小小鸟\t447163\n马宝\t447164\n成语故事\t447165\n套价\t447166\n身体权纠纷\t447167\n乳源县\t447168\n本框\t447169\n赛德斯\t447170\n从业证\t447171\nyum\t447172\n3.1.3\t447173\n乎者\t447174\n采蝶轩\t447175\n点错\t447176\nPrimeknit\t447177\n85平\t447178\n_模板无忧www.mb5u.com\t447179\n七斤\t447180\n上街镇\t447181\nhotmail\t447182\n资金项\t447183\n超小\t447184\nformed\t447185\n八孔\t447186\n台山市\t447187\n圆记\t447188\nIos11\t447189\n杨良宜\t447190\n27座\t4
47191\n蒜汁\t447192\n声级计\t447193\nnovotel\t447194\n加勒比\t447195\n醒来后\t447196\nSOCKET\t447197\n亲爱的公主病\t447198\n凯尔经的秘密\t447199\n七席\t447200\nPad\t447201\n柘皋镇\t447202\nqad\t447203\n中庚\t447204\n嘉荫\t447205\n沆瀣一气\t447206\n宫锁\t447207\n性派对\t447208\n侧记\t447209\n死或生\t447210\n四川省农村信用社联合社\t447211\n仨人\t447212\n十堰房产在线\t447213\n水野朝\t447214\n吴宏\t447215\n性状\t447216\nthymeleaf\t447217\n上海财经大学出版社\t447218\n回回\t447219\n长激素\t447220\n礼宾员\t447221\n寿力\t447222\n切丝机\t447223\nyang\t447224\n洋楼\t447225\n1月1日\t447226\ntrix\t447227\ntextField\t447228\n申港\t447229\n血腥片\t447230\n传化集团有限公司\t447231\n浮生物语\t447232\n帮帮文库网\t447233\n奥尔特\t447234\n鹰目\t447235\n上海格力空调\t447236\n多隆\t447237\n万间\t447238\n灭火\t447239\n通视\t447240\nstm32f207\t447241\n复旦学院\t447242\n苹果园\t447243\ngltf\t447244\nfrogs\t447245\n锤子T1\t447246\nmodflow\t447247\n当当当\t447248\n调皮捣蛋\t447249\n8.6.7\t447250\n吴承恩\t447251\n定档\t447252\n苍鹭\t447253\nWEB浏览器\t447254\ngroud\t447255\n五十度飞\t447256\npotoshop\t447257\n6p\t447258\n鳄鱼油\t447259\n永鼎\t447260\n砍手\t447261\n密集型\t447262\n陈永\t447263\nx射线衍射\t447264\n元器件交易网\t447265\n虎牙直播助手\t447266\nWagon\t447267\n自贡\t447268\n2.6.2\t447269\n安阳街道\t447270\n许留山\t447271\n天秤男\t447272\n半潜式\t447273\n邪皇\t447274\n一批又一批\t447275\n墨尔本大学\t447276\n唯智\t447277\n1847年\t447278\n小雪\t447279\n口情\t447280\nMilitary\t447281\n石斑\t447282\n压紧\t447283\n工作人\t447284\n小非农\t447285\n愚\t447286\n中南大学商学院\t447287\nEngineer\t447288\n深圳仲裁委员会\t447289\n农业信用卡\t447290\nyeah\t447291\nXICXI\t447292\n吉布\t447293\n姜浩\t447294\n湟普国际湟座\t447295\nNCF\t447296\n回字\t447297\n德青源\t447298\n今儿\t447299\n方术\t447300\n汕德卡\t447301\n关税起征点\t447302\n天主教堂\t447303\nbuildroot\t447304\nAndo\t447305\n领导性\t447306\n质疑\t447307\n谭医生\t447308\n平今\t447309\n毛胚\t447310\n嘉禾望岗\t447311\n_洛阳师范学院\t447312\n陀螺仪\t447313\n新孟河\t447314\n暧暧\t447315\n流量币\t447316\n英译汉\t447317\nquantities\t447318\n风冷散热器\t447319\n20160324\t447320\n暴率\t447321\n0082\t447322\n吴江法院\t447323\n伟康\t447324\nintellj\t447325\n十堰市人民医院\t447326\n高姐\t447327\n先锋镇\t447328\n山西省水利厅\t447329\n盈丰\t447330\n六小龄童\t447331\n吴斌\t447332\n采访稿\t447333\n全付通\t447334\n北京台湾\t4
47335\n马褂\t447336\nmingl\t447337\n黄旭华\t447338\n工号\t447339\n易派客\t447340\n任逸帆\t447341\n归客\t447342\nX500\t447343\n浸渍\t447344\n西片\t447345\nricher\t447346\n美伊娜\t447347\n自西\t447348\n身主\t447349\n奥美广告公司\t447350\nakka\t447351\nNetBIOS\t447352\nNX10\t447353\n消防门\t447354\n解谜者\t447355\n起义军\t447356\nNightingale\t447357\n悬架\t447358\narterial\t447359\n水份\t447360\n出质\t447361\n下周六\t447362\n金粉世家\t447363\n西安高新\t447364\n1000分\t447365\n娃娃亲\t447366\n临朐人才网\t447367\n双成\t447368\n职教\t447369\n不平\t447370\nUehara\t447371\nrhythm\t447372\n崔珏\t447373\n芦根\t447374\n出坞\t447375\n童书\t447376\nwish邮\t447377\n门道\t447378\npriority_queue\t447379\n郑业成\t447380\n小米电视4S\t447381\n王福庵\t447382\n海瑞\t447383\n加拉帕戈斯\t447384\n周光权\t447385\n狮子王\t447386\nrsi\t447387\n猫九酱Sakura\t447388\n伞舞\t447389\n冯毅\t447390\n杞\t447391\nmbprogresshud\t447392\n成海丽\t447393\n600436\t447394\n广东省发展和改革委员会\t447395\n某部\t447396\n辩位\t447397\nNEOGEO\t447398\n兼具\t447399\nskool\t447400\n价量\t447401\n中国一拖集团有限公司\t447402\n杀死\t447403\n经营性贷款\t447404\n326路\t447405\n恐龙鱼\t447406\n不可自决\t447407\n生物公司\t447408\n天子湖镇\t447409\n意思\t447410\nAPICloud\t447411\n姚迪明\t447412\n朱玉\t447413\n生活篇\t447414\n武汉铁路职业技术学院\t447415\n附开\t447416\n18\t447417\n法云安缦\t447418\n宝塔\t447419\nProtoPie\t447420\n填写\t447421\n图件\t447422\n视距\t447423\n四川省林业厅\t447424\n挫穷\t447425\n第八十一章\t447426\n阿胶膏\t447427\n淘金记\t447428\n钢瓦\t447429\n谭校长\t447430\n深发\t447431\n粉红女郎\t447432\n丁杰\t447433\nMBA教育网\t447434\n马镫\t447435\n太子港\t447436\n纪元2070吧\t447437\n章士钊\t447438\n街舞\t447439\n家长会\t447440\n烽火科技\t447441\nQD\t447442\nfirebug\t447443\n荡秋千\t447444\n文三西路\t447445\n赡养费\t447446\nbolg\t447447\nDubai\t447448\ncmd\t447449\njipin\t447450\n一寸二寸\t447451\nGarry\t447452\n尼龙扎\t447453\n60组\t447454\n上海大通\t447455\n促销活动\t447456\n艾曼妞\t447457\n八脚\t447458\n宫保\t447459\n复印店\t447460\n路平\t447461\n十多年前\t447462\n熏梅在线\t447463\n黄果树\t447464\n肾功\t447465\n青山菜菜\t447466\nCoursera\t447467\n少少\t447468\n蔚来ES8\t447469\n逃废债\t447470\n下个月\t447471\nTiC\t447472\n丝纹\t447473\n25届\t447474\ntripollar\t447475\n忧国忧民\t447476\n朱晓飞\t447477\n中华人民共和国驻美国大使馆\t447478\n两规\t44
7479\n风韵犹存\t447480\n双引擎\t447481\n壹壹\t447482\n关中\t447483\n李雪芮\t447484\n乐蛙\t447485\n叶子猪天下3\t447486\n夏目友人帐吧\t447487\n知识性\t447488\n苏州市委\t447489\n西斯廷教堂\t447490\n第十人民医院\t447491\n离婚者\t447492\nBT之家影评\t447493\n1KW\t447494\n洋山深水港\t447495\n海果汇\t447496\n周雨彤\t447497\nSNE\t447498\n盛世宠妃\t447499\n活菩萨\t447500\n慎终如始\t447501\n中间件\t447502\n7240\t447503\n伍迪艾伦\t447504\nClion\t447505\nIntersil\t447506\n条子\t447507\n龙图\t447508\n李斯\t447509\n天丝棉\t447510\n在\t447511\n刘一含\t447512\n烘培\t447513\nzhihu\t447514\n粉袋\t447515\nIP地\t447516\n莫斯科机场\t447517\n51CTO.COM_\t447518\n未必之恋\t447519\n煤器\t447520\n乌拉\t447521\n前瞻性\t447522\n背剑\t447523\n北京万达广场\t447524\n办公地\t447525\n天儿\t447526\n寿礼\t447527\n折戟沉沙\t447528\n表决器\t447529\nSPECT\t447530\n小结巴\t447531\n舞法天女\t447532\n德尔惠\t447533\n智云Smooth\t447534\n秋叶原之旅2\t447535\nrwxr\t447536\n婉婉\t447537\nSONAR\t447538\n绍兴市人力资源和社会保障局\t447539\n福绵\t447540\n东宝区公众信息网_荆门东宝政府\t447541\n白塔岭\t447542\n高岛\t447543\n老人\t447544\n莱比锡\t447545\nOUTLETS\t447546\n715路\t447547\nVDSL2\t447548\n长安欧尚车友会_XCAR爱卡汽车俱乐部\t447549\n异度神剑2\t447550\n安卓模\t447551\n潘阳湖\t447552\n常州市区\t447553\n筆記\t447554\n珍禽\t447555\nsnow\t447556\nWebRequest\t447557\n3.27\t447558\n港彩\t447559\n第四十八\t447560\n什么日\t447561\n香格里拉大酒店\t447562\n张永军\t447563\n罕见病\t447564\nWorksheet\t447565\n可折\t447566\n百度搜索帮助中心\t447567\n灌输\t447568\nAnalytic\t447569\nEV160\t447570\n官宣\t447571\n蚂蚁搬家公司\t447572\n郑州电力职业技术学院\t447573\n秘技\t447574\n茯苓\t447575\nPorns\t447576\n第95期\t447577\n请愿\t447578\n星币\t447579\n中国水电建设集团国际工程有限公司\t447580\n胡强\t447581\n洞房花烛\t447582\n中环杯\t447583\n分所\t447584\nceac\t447585\nhbl\t447586\n抑扬顿挫\t447587\n野风\t447588\n百度网盘-网盘007\t447589\n联合卡车\t447590\n500架\t447591\n五家渠市\t447592\n青年期\t447593\n讲课稿\t447594\n渐开线\t447595\nhulian\t447596\n能发电\t447597\n腐蚀术\t447598\n北京友谊宾馆\t447599\nentries\t447600\n木兰花\t447601\n2018年5月10日\t447602\n辰希\t447603\n余弦\t447604\n1362\t447605\n服务处\t447606\nappx\t447607\n排练版\t447608\nAPMServ\t447609\n研究院\t447610\n广东医科大学附属医院\t447611\n打暴\t447612\nadorable\t447613\n精灵宝可梦:究极日月\t447614\n等你说\t447615\n梨膏\t447616\n氮气罐\t447617\n保亭县\t447618\n上塘街
道\t447619\n桦木\t447620\n金南玲\t447621\n漫漫长路\t447622\n楼雨晴\t447623\n吉林大学商学院\t447624\n曹妃甸\t447625\n护师\t447626\n张本智\t447627\n大宫\t447628\n生物竞赛吧\t447629\nBirthday\t447630\n飒爽英姿\t447631\n霸道神话版三国\t447632\n王耀\t447633\n开服表\t447634\n日行\t447635\n郭敬安\t447636\n操作盘\t447637\n侧重点\t447638\n360WiFi\t447639\n秘密爱\t447640\n130平方\t447641\nstring\t447642\nTouring\t447643\n安洁\t447644\nMaternal\t447645\nPreSonus\t447646\n乐康膏\t447647\n生植物\t447648\n含山\t447649\n五届二次\t447650\n0996\t447651\n洛浦县\t447652\n庆东\t447653\n白兔糖\t447654\n天枰\t447655\n炫舞小灵通\t447656\n抗扰度\t447657\n竹下\t447658\n白兔镇\t447659\n海上钢琴师\t447660\n枣阳论坛\t447661\n中军\t447662\n自发热\t447663\n1J\t447664\n吴作人\t447665\n回府\t447666\n牛莉\t447667\n硕利科技\t447668\n九原\t447669\n京都动画\t447670\n廣\t447671\nimpdp\t447672\n并线\t447673\n黄河滩区\t447674\n方胜\t447675\nmultiply\t447676\n郑州市政府\t447677\n0605\t447678\n六合区\t447679\n证券投资基金法\t447680\nscramble\t447681\n北洋园\t447682\nBD1080p/720p\t447683\n经营\t447684\n有机系\t447685\n人才办\t447686\nclint\t447687\n丽城\t447688\n江森自控\t447689\n纪检\t447690\n宁波中百\t447691\n清洁霜\t447692\n中国物流股份有限公司\t447693\n澍\t447694\n行政庭\t447695\ngif格式\t447696\n捐兵\t447697\n页跳\t447698\n道声\t447699\n340元\t447700\n植物大战僵尸2摩登世界\t447701\n尽早\t447702\n魔沫\t447703\nDUNLOP\t447704\n修正液\t447705\n努比亚\t447706\nzzhz\t447707\n长城魏派\t447708\n老村\t447709\n言之凿凿\t447710\na站\t447711\n2018.4.17\t447712\n新娘子\t447713\n武中奇\t447714\n扬琴\t447715\n2016年12月份\t447716\n奋斗\t447717\n花乡\t447718\npopo文\t447719\n富控互动\t447720\n徐鲁\t447721\n斜向\t447722\n孙可\t447723\n079\t447724\nsakimichan\t447725\n舒利亚\t447726\n8i\t447727\n印谱\t447728\nPhillips\t447729\nBPS\t447730\n华南理工大学广州学院\t447731\nSlick\t447732\n人间正道是沧桑\t447733\n小炜\t447734\n创岗\t447735\n金叶\t447736\n解机\t447737\n链家研究院\t447738\n究竟\t447739\n论稿\t447740\nHUSTOJ\t447741\nblast\t447742\n云胡\t447743\n5月1\t447744\n海西\t447745\nahci\t447746\n央企集团\t447747\nRNS315\t447748\nLearner\t447749\n塔筒\t447750\njrc\t447751\n多汗\t447752\n沱沱河\t447753\nBBD\t447754\nquestions\t447755\n文本编辑器\t447756\n130种\t447757\n披甲龙龟\t447758\n恒易\t447759\n万国府\t447760\n花裙\t447761\n折光\t447762\n娱乐城\t44776
3\n人机工程学\t447764\nWebApi接口\t447765\n第100家\t447766\n福州海峡国际会展中心\t447767\nreported\t447768\n煨\t447769\n急进\t447770\n二柱\t447771\nyurisa\t447772\n方正大\t447773\n桐木关\t447774\n3988\t447775\n广西壮族自治区审计厅\t447776\n宿州市政府\t447777\n青茶\t447778\n佘山北\t447779\n塑料窗\t447780\n标准差\t447781\n改色\t447782\n3699\t447783\n导游机\t447784\n服务机器人\t447785\ncrrt\t447786\nDatetime\t447787\nmapjoin\t447788\n南京城\t447789\n福建师大\t447790\n绩溪县政府\t447791\n601888\t447792\n杨时\t447793\nAFTER\t447794\n龙巅三湖慈鲷\t447795\n融通\t447796\n2015年6月30日\t447797\n东湖湾\t447798\n实时性\t447799\n粘附性\t447800\n龙驭球\t447801\n茅山镇\t447802\n再发\t447803\n领教\t447804\n蓝翔\t447805\n连生\t447806\n梅花集团\t447807\n红蔷薇\t447808\n天津物产集团\t447809\n提质\t447810\n莱城区\t447811\n佳婿\t447812\n污师\t447813\n木酚\t447814\n绿地新都会\t447815\n高压灭菌器\t447816\npw3\t447817\n重庆北站南广场\t447818\n临颍\t447819\n岁月无声\t447820\n前4天\t447821\n天麓府\t447822\nssms\t447823\n斩仙\t447824\n滤袋\t447825\n版票\t447826\n2499起\t447827\n消旋\t447828\n四档\t447829\n浙江大学环境与资源学院\t447830\nEICAD\t447831\n夜尿\t447832\n墨星封面网\t447833\n血细胞\t447834\nrline\t447835\n广瑞路\t447836\n混音\t447837\n选择恐惧症\t447838\n恒大酒店\t447839\n惠诚\t447840\n应用心理硕士\t447841\n第七讲\t447842\n纸轮\t447843\nPOPPING\t447844\n白鹭湾\t447845\n钯\t447846\n万盏\t447847\n律师所\t447848\n颐莲\t447849\n网坛\t447850\nfritz\t447851\n置地\t447852\nppt_\t447853\n国民警卫队\t447854\n北京11选5\t447855\n烁\t447856\nFraming\t447857\n东坡区\t447858\ngodv\t447859\n老有所依\t447860\n漏斗胸\t447861\nTimeless\t447862\n大闹天宫\t447863\n萧鼎\t447864\nPalantir\t447865\n评游网\t447866\n1332\t447867\nmv\t447868\n兼容并包\t447869\n刘卫红\t447870\n承兑汇票\t447871\n惜祯\t447872\n白云源\t447873\n陈三五娘\t447874\n通信塔\t447875\n安徽财贸职业学院\t447876\n洪卓立\t447877\npersuasive\t447878\n选手们\t447879\n爱莉\t447880\n#育儿大\t447881\n二零一八\t447882\n伊布拉希莫维奇\t447883\nkpw3\t447884\n璋\t447885\n太姥山镇\t447886\n海珠城\t447887\n电浆\t447888\n佰佰安全网\t447889\n字义\t447890\nbibili\t447891\n包税\t447892\n海防\t447893\nhd720p\t447894\nL485\t447895\n卯水咲流\t447896\n道教之音\t447897\nZJ\t447898\nspoil\t447899\nwannabe\t447900\n北京市科学技术研究院\t447901\n永乐场馆\t447902\npentaho\t447903\n摆账\t447904\nflash\t447905\n美廉美超市\t44790
6\n西淅\t447907\n中国招商网\t447908\n挡住\t447909\n穿深\t447910\nelectronica\t447911\n红五星\t447912\n非传染性\t447913\n手帐本\t447914\n学导\t447915\nassign\t447916\n通辽市\t447917\n现代篇\t447918\n儒勒凡尔纳\t447919\n三会一课制度\t447920\n结婚礼物\t447921\n淖毛湖\t447922\n天津产权交易中心\t447923\n人说山西好风光\t447924\n讲话\t447925\nAVICII\t447926\nJail\t447927\n总指挥部\t447928\n明厨亮灶\t447929\ncc2541\t447930\n压力变送器\t447931\n月牙形\t447932\nH.264\t447933\nkang\t447934\n龙归城\t447935\n莱克尔\t447936\nMTS\t447937\nchipset\t447938\n第八季\t447939\n玉渡山\t447940\n中彩那天\t447941\n常见字\t447942\n弹性域\t447943\n天煞孤星\t447944\nzset\t447945\n小沟\t447946\n汉晋\t447947\n五码\t447948\n冰菜\t447949\n甜甜私房猫\t447950\n找\t447951\n点除\t447952\n赵云龙\t447953\n丰台体育中心\t447954\n作孽\t447955\neduis\t447956\n塞得\t447957\n市政算量\t447958\n1.3.9\t447959\nv3.0.5\t447960\n各各\t447961\n半路夫妻\t447962\n诸葛村\t447963\n200部\t447964\n第二点\t447965\n宗主国\t447966\n折柳\t447967\n假孕\t447968\nV4.3.0\t447969\n艳色\t447970\nv7a\t447971\n和辉\t447972\n行别\t447973\nsimilarface\t447974\n爬宠\t447975\n13710\t447976\n汤池镇\t447977\n等等等\t447978\n传导性\t447979\n上班族们\t447980\n盛赞\t447981\n浙江大学城市学院\t447982\n俗尘\t447983\n王天来\t447984\n财政性\t447985\n封嘴\t447986\n南宁动物园\t447987\n重庆驰马机械配件有限公司\t447988\n活络\t447989\n衣照\t447990\n一万五\t447991\n0.8.3\t447992\nebs\t447993\n变频控制柜\t447994\n8.8折\t447995\n居无\t447996\n岭南地区\t447997\n老卡\t447998\n周晓鹏\t447999\nTEX\t448000\n蓝翅\t448001\n买入评级\t448002\n9S\t448003\nMyHome\t448004\n银圆\t448005\n包装类\t448006\n八一电视\t448007\n权倾朝野\t448008\n武汉理工大学华夏学院\t448009\n413号\t448010\n穗莞深\t448011\n主墩\t448012\n舞室\t448013\n同和街道\t448014\n手摇铃\t448015\n狛枝凪斗\t448016\n3折\t448017\n各型\t448018\n千方百剂\t448019\n平西\t448020\n夏承焘\t448021\n揣测\t448022\n35亿\t448023\n赫伯特·西蒙\t448024\nnexus7\t448025\n卷面\t448026\n111_\t448027\n欧瑞特\t448028\nyeu\t448029\n仿佛\t448030\nconsiderations\t448031\n故宫太和殿\t448032\n小排\t448033\nユ\t448034\n里番肉番\t448035\n料站\t448036\n13辆\t448037\n色\t448038\n高塘\t448039\n受宠\t448040\n存送\t448041\n遂昌新闻网\t448042\n总账\t448043\n刘妍\t448044\n屏参\t448045\n会东\t448046\nPremie\t448047\n导言\t448048\n一道坎\t448049\n36米\t448050\n比较文学与世界文学\t448051\n八方\t448052\n张天韵
\t448053\ngraphite\t448054\nv5.3.2\t448055\n三国志9pk\t448056\ndelaunay\t448057\nmortality\t448058\nwords\t448059\n多玩Minecraft盒子\t448060\n一山\t448061\n冯嘉怡\t448062\nhairmax\t448063\nMassage\t448064\n手党\t448065\n双虎\t448066\nFossil\t448067\n安培力\t448068\nAddictive\t448069\n街长制\t448070\n羟基丁酸\t448071\n美码\t448072\n16163\t448073\ncodingcloud\t448074\n电桩\t448075\n中恒电气\t448076\n安龙\t448077\n植草格\t448078\n虚空藏菩萨\t448079\n雅拉\t448080\n奋勇前进\t448081\n上古神话\t448082\n全脂乳粉\t448083\n黄圣窦文涛\t448084\n方华\t448085\n新肌\t448086\n天神卡\t448087\n武汉市幼儿园\t448088\n川贝粉\t448089\n雨儿\t448090\n挖泥\t448091\nr7800\t448092\nchlorine\t448093\n江苏省司法厅\t448094\n马爹利\t448095\n惊诧\t448096\n风雨路\t448097\n摸上\t448098\n璃沙\t448099\n桂枝\t448100\n异鬼\t448101\n撤并\t448102\n下水器\t448103\n棉浆\t448104\n跨栏\t448105\nViewPoint\t448106\nlbc\t448107\n玻璃量\t448108\n北京燕山出版社\t448109\n耳包\t448110\n临世\t448111\n朝天宫\t448112\n中国邮政集团公司\t448113\n泪海\t448114\n平林漠漠\t448115\n青城派\t448116\n普达措\t448117\n矿工\t448118\n神经病\t448119\n51张\t448120\n傻里傻气\t448121\nwifi电脑\t448122\n清刻\t448123\n独字\t448124\n两晋\t448125\n仙之侠道2\t448126\nsana豆乳\t448127\n1620\t448128\n烛光里的妈妈\t448129\n丹棱\t448130\n李健熙\t448131\n速度快\t448132\n20170401\t448133\n班主任\t448134\n鹤丸\t448135\nYING\t448136\n挂网\t448137\n混合材\t448138\n琴韵\t448139\n目标值\t448140\n一帘幽梦\t448141\n内蒙古商贸职业学院\t448142\n驾培网\t448143\n110码\t448144\n很有可能\t448145\n易水寒\t448146\n神刀\t448147\n韦尔\t448148\nwww.130158.com\t448149\n罩批\t448150\n克里斯汀·斯图尔特\t448151\n西门子S7\t448152\nseer\t448153\n史努比\t448154\nDiscovering\t448155\n火蜥蜴\t448156\n麻醉学\t448157\nFontSpace\t448158\nInnocent\t448159\n冒险岛1\t448160\n幻想即兴曲\t448161\n2018年5月1日起\t448162\n船底\t448163\n补体\t448164\n有券\t448165\n热得\t448166\n累加值\t448167\n1681\t448168\n陈塘\t448169\n核算法\t448170\n于小冬\t448171\nf117\t448172\n国家海洋局\t448173\n石钟琴\t448174\n新华广播\t448175\nm500\t448176\n105元\t448177\n河南金兰工程管理有限公司\t448178\n翻番\t448179\n50ETF\t448180\n99平米\t448181\n吴铭\t448182\n永骏\t448183\n当贝\t448184\n龚诗淇\t448185\n麻醉药\t448186\n算数\t448187\nCG论坛\t448188\n排屑器\t448189\n暖宠\t448190\n王者荣耀射手\t448191\n顶空气相色谱法\t448192\n360个人图书馆\t448193\nilspy\
t448194\n品阶\t448195\nexamples\t448196\nFair\t448197\nartform\t448198\n丹尼尔惠灵顿\t448199\n西罗\t448200\nxslt\t448201\nCOSMO\t448202\n类聚\t448203\n非布\t448204\n巨屏\t448205\n20131220\t448206\n蜜桃味\t448207\n2ml\t448208\n宁波市奉化区人民政府\t448209\n李雨桐\t448210\n撩妹\t448211\n维达\t448212\n易制毒\t448213\n绿纹\t448214\n库平七钱\t448215\n起亚宝马x6\t448216\n宋风云\t448217\n医讯\t448218\n塔城地区政府网\t448219\n翻板机\t448220\nrelaxed\t448221\nが\t448222\n福艳\t448223\n算出\t448224\n长安欧诺\t448225\n父老\t448226\n电感式接近开关\t448227\n锡林浩特市\t448228\n吊点\t448229\n桂柳\t448230\n混合比\t448231\n红杠\t448232\n荒川弘\t448233\n正酣\t448234\n教学设计\t448235\n香茶\t448236\n铭鸽\t448237\n花木兰\t448238\n劫掠\t448239\nDestinations\t448240\n上海建设\t448241\n丰尚\t448242\n环贸\t448243\n浙江广电\t448244\n狎鸥亭\t448245\nkis\t448246\n恒大威尼斯\t448247\n张一元\t448248\n飞雷神\t448249\nliterary\t448250\n谈妥\t448251\n04月08日\t448252\n七十一\t448253\n108万\t448254\n135分\t448255\n八百万\t448256\n狒狒\t448257\n多户\t448258\n钦州市\t448259\n宁波法院\t448260\n370TURBO\t448261\n漏出\t448262\nEM菌\t448263\nWilson\t448264\n质谱分析\t448265\n灯箱片\t448266\n威海市商业银行\t448267\n验算\t448268\n活脱脱\t448269\n分模\t448270\nafs\t448271\n名府\t448272\nprocreat\t448273\n德兴市政府\t448274\nemmanuel\t448275\nps套索\t448276\n生活水平\t448277\n沙微谷\t448278\n昌达\t448279\n5355\t448280\n沃尔科特\t448281\n新丰县\t448282\n陆河\t448283\n秋楓\t448284\n河北大学工商学院\t448285\ncity-picker\t448286\n华典\t448287\npanqiming\t448288\n尿泡\t448289\n大专证\t448290\n长寿湖\t448291\nRBA\t448292\n井格\t448293\nBLACK\t448294\nkf\t448295\n巧笑\t448296\n甘草次酸\t448297\n了不起的菲丽西\t448298\n船位\t448299\n景类\t448300\n恒恒\t448301\n盟军敢死队2\t448302\nnoaa\t448303\nemp\t448304\nd630\t448305\n远成物流\t448306\n泡状\t448307\n明年4月\t448308\n8月15\t448309\n控制线路\t448310\n恋碍\t448311\n1.77\t448312\noem7\t448313\nPK10计划人工计划/PK10全天计划网页版\t448314\n广西壮族自治区人民医院\t448315\n黄姚\t448316\nJSB\t448317\n上投摩根\t448318\n擎\t448319\n27小时\t448320\n吊船\t448321\nroselip\t448322\n厦门人大\t448323\n康惠\t448324\n义龙新区\t448325\n吕紫剑\t448326\n墙面\t448327\n新繁镇\t448328\nzhidu\t448329\n南昌西客站\t448330\n瞳瞳\t448331\nCCBN\t448332\n牖\t448333\n钨钢\t448334\n越越\t448335\nsoldworks\t448336\nCycles\t448337\
ntoString\t448338\n第60页\t448339\n芙蓉湖\t448340\nNotifier\t448341\n8.8级\t448342\n警示标\t448343\n女巫森林\t448344\n拉面\t448345\nPotatoes\t448346\n香港电台\t448347\n西乡塘区\t448348\n明孝陵\t448349\n小胖妞\t448350\n第十课\t448351\n鲜花速递\t448352\nmentality\t448353\n擦车\t448354\n仲春\t448355\nv6.0.0\t448356\n乐星\t448357\n10.11.1\t448358\n告诉我\t448359\nmistakes\t448360\n鸟笼\t448361\n20180314\t448362\n承利\t448363\n福冈\t448364\n勐海县\t448365\n注射隆鼻\t448366\nzhongwen\t448367\nRuss\t448368\nOffice2017\t448369\nWTForms\t448370\n迈斯\t448371\n葶苈子\t448372\n之江学院\t448373\n梅拉\t448374\n澄和\t448375\nmirco\t448376\n3780\t448377\n肖肖\t448378\n合景誉山\t448379\n指征\t448380\n终结者2:审判日官网论坛_终结者2手游\t448381\n解放思想\t448382\n罗志田\t448383\n信基督\t448384\n秘制\t448385\n老和尚\t448386\nAnthropology\t448387\nCuteFTP\t448388\n推卸责任\t448389\n零配\t448390\nウィ\t448391\nStarfall\t448392\n四包\t448393\n屏上\t448394\n转换关系\t448395\ntren\t448396\n华旗\t448397\nORD\t448398\n制造\t448399\n师从\t448400\n麻类\t448401\n元素塔\t448402\n药品生产质量管理规范\t448403\n大牌对王牌\t448404\n无锡火车站\t448405\n恐惧症者\t448406\n横峰街道\t448407\nwool\t448408\n裹尸\t448409\n微剧场\t448410\n星野源\t448411\n管形\t448412\n华师\t448413\npdf转换\t448414\n程嘉美\t448415\nBling\t448416\n季离季夜\t448417\nagfa\t448418\n囫囵吞枣\t448419\n刷粉\t448420\n父母官\t448421\n四大博彩公司\t448422\n永嘉天地\t448423\npersonnel\t448424\n郑恺\t448425\n侮辱性\t448426\njixiace\t448427\n20160128\t448428\nLesson\t448429\nToshiba\t448430\n悦\t448431\n三伏\t448432\nAndroid_UC论坛\t448433\n劳动合同经济补偿金\t448434\n苏州实验中学\t448435\n替嫁\t448436\nCAD2009\t448437\n天涯共此时\t448438\n卡萨丁\t448439\n红雀特工\t448440\n龙光普罗旺斯\t448441\n溧阳市政府\t448442\n北京卫计委\t448443\n零期\t448444\n天道酬勤\t448445\n三缸机\t448446\n义愤填膺\t448447\n仙境传说ro\t448448\n东南西北\t448449\n滑铁卢战役\t448450\nv2.9.0\t448451\n8566\t448452\nAssembler\t448453\n逆旅\t448454\nvapormax\t448455\n小鸟小鸟\t448456\n委托代理协议\t448457\nthi\t448458\n扎哈维\t448459\nlx570\t448460\n16荒\t448461\ndanielle\t448462\n耐热服\t448463\n光谱分析仪\t448464\nJ2\t448465\n水帖\t448466\ntortoises\t448467\nWaterfall\t448468\n狡兔三窟\t448469\n王府井步行街\t448470\n茅十八\t448471\n传媒公司\t448472\n阿拉德大陆\t448473\n思路网siilu.com\t448474\n凌花\t
448475\n沈南乔\t448476\n星座情緣_论坛\t448477\n危险的爱\t448478\n翻\t448479\nuseradmin\t448480\n一人之下吧\t448481\n特约\t448482\n100行\t448483\n临战\t448484\n4j\t448485\n逻辑学\t448486\nPANDORA\t448487\nSylvia\t448488\nlibopencv\t448489\nthematic\t448490\n哲理\t448491\n李成儒\t448492\nCECET\t448493\nmoore\t448494\nViper\t448495\n墒\t448496\nGND\t448497\n参加者\t448498\n天房\t448499\nAIRPORT\t448500\n约瑟夫环\t448501\n水木山川\t448502\n菜杰家常菜谱大全网\t448503\n固定電話號段\t448504\nRichards\t448505\n广州致远电子有限公司\t448506\n建行集团\t448507\n笛头\t448508\n压块\t448509\n生活用水量\t448510\n三号\t448511\n湖北省版权局\t448512\n火影忍者剧场版:博人传\t448513\n破壁人\t448514\nDesigners\t448515\n怀石料理\t448516\n蓝翔技校\t448517\n桂城街道\t448518\nbij\t448519\nadsense\t448520\n上下铺\t448521\n双林路\t448522\n毒驾\t448523\n奇松\t448524\n汉源\t448525\n降龙十八掌\t448526\n妨害\t448527\n阙\t448528\n老板娘\t448529\n幅度\t448530\n张国海子\t448531\nGB50203\t448532\n树心\t448533\n龙溪河\t448534\n航服\t448535\nSpannableString\t448536\n数算\t448537\n预科班\t448538\nFringe\t448539\nScar\t448540\n酷Q\t448541\nTB6\t448542\nalaya\t448543\n陌秀\t448544\n盘后\t448545\n天生一对\t448546\n投资评级\t448547\n雷电\t448548\n建筑工程网\t448549\ngogs\t448550\n3020\t448551\n宜家家居\t448552\nlooser\t448553\n宋海鹏\t448554\n32个\t448555\n幼线\t448556\n亿欧网\t448557\n孟连\t448558\nBest\t448559\nNGO信息中心\t448560\n中招\t448561\n老友网\t448562\n我爱发明全集\t448563\n厌氧池\t448564\n朱泾镇\t448565\n京信通信\t448566\n加俊\t448567\n进位制\t448568\n30GB\t448569\nadsafe\t448570\n多环境\t448571\n回民中学\t448572\nf1f2\t448573\n动压\t448574\nmangent\t448575\n0016\t448576\n小鸣\t448577\n9枚\t448578\n非局域网\t448579\n高老头\t448580\nSVM算法\t448581\ndeferred\t448582\n保持缄默\t448583\n扒皮机\t448584\n神秀\t448585\n量子检测仪\t448586\n爱无止境\t448587\n郭秀云\t448588\nCompare\t448589\n机神\t448590\n五格\t448591\n顺治皇帝\t448592\n冰鱼\t448593\n拱涵\t448594\n言表\t448595\nlstm\t448596\n除虫菊酯\t448597\nldconfig\t448598\n半导体集成电路\t448599\n代派\t448600\n8月15日\t448601\n椰奶冻\t448602\n朱旭\t448603\n文化负载词\t448604\n族谱网\t448605\n运钞\t448606\nk3c\t448607\n虚拟化软件\t448608\n生产工\t448609\n震害\t448610\njolin\t448611\n一意孤行\t448612\n说句\t448613\n脓包型\t448614\nCUBASE5\t448615\n天城\t448616\n保险赔偿\t448617\n预亏
\t448618\n蓟马\t448619\n固额\t448620\n将军府\t448621\ninstantiate\t448622\n美国电影学会\t448623\npdfminer\t448624\n邓小华\t448625\n119G\t448626\n8500万\t448627\n小器\t448628\n中信银联\t448629\nLeft\t448630\n朱莉王\t448631\n苏州国际外语学校\t448632\n滴滴\t448633\n异显\t448634\n一莲\t448635\n特片\t448636\n上去掉\t448637\n联唱\t448638\nv1.1.4\t448639\n模友\t448640\n江美琪\t448641\n苏州宾馆\t448642\n挂链\t448643\n表主\t448644\n4948\t448645\n暴恐\t448646\nflashget\t448647\n本体\t448648\n杨姗姗\t448649\n中焦\t448650\n榕基\t448651\n大荔县人民政府\t448652\n横沙岛\t448653\n四川省公安厅交通管理局\t448654\n建筑老八校\t448655\nWR703N\t448656\n1739\t448657\ninternally\t448658\nw2018\t448659\n100目\t448660\n处境\t448661\nmicrowin\t448662\n笔记本\t448663\n小马智行\t448664\n探囊取物\t448665\n上海会展中心\t448666\n陕汽德龙\t448667\nPRISTIN\t448668\nPLZ\t448669\n性工作者\t448670\n依依惜别\t448671\n四位\t448672\n信札\t448673\n诚信库\t448674\n长征组歌\t448675\n無雄\t448676\nInterim\t448677\nPaperOK\t448678\n一觉二觉\t448679\n清野\t448680\n族谱录纪念网\t448681\n康嘉奇\t448682\n百度手机助手电脑安卓\t448683\n执意\t448684\n饶漱石\t448685\nzhenhong\t448686\n此致\t448687\n大宏\t448688\n拳打\t448689\n玉印\t448690\nrx480\t448691\n夏子皓\t448692\n百度地图API\t448693\nweiruan\t448694\n绝地求生刺激战场卡\t448695\n导航手机\t448696\n特卡\t448697\n雪铁龙天逸\t448698\nM126a\t448699\n齐耳\t448700\neSIM\t448701\n刘庄村\t448702\n文本库\t448703\nwebstom\t448704\n改革家\t448705\n远在\t448706\n14起\t448707\n文德\t448708\nunary\t448709\n破罐子破摔\t448710\n艺福堂\t448711\nfriends\t448712\ntaper\t448713\n小背心\t448714\nlinearlayout\t448715\n联统\t448716\n试剂级\t448717\n合署\t448718\n49岁\t448719\n阿凡题\t448720\nEureka\t448721\n204路\t448722\n武汉园博园\t448723\nP2.5\t448724\n华润大学\t448725\n56637220\t448726\n自始\t448727\n18年4月\t448728\n开瑞K50\t448729\n崔斯坦\t448730\n南京医科大学第二附属医院\t448731\n六零\t448732\n林熙\t448733\n鲲鹏\t448734\n孵化盒\t448735\n九组\t448736\n塔顶\t448737\n玉兰大道\t448738\n倒查\t448739\n迪庆州\t448740\n优惠型\t448741\n点酒\t448742\n泰普尔\t448743\n淘宝阿里旺旺\t448744\n超级小郎\t448745\n夺宝传奇\t448746\n插屏\t448747\n亮剑\t448748\n雷霆吧_\t448749\nHee\t448750\nbaoli\t448751\n民贷天下\t448752\n鲍秀兰\t448753\n打磨机\t448754\n英发\t448755\ntraders\t448756\n拾荒者\t448757\n说明文\t448758\n6d\t448759\n北伐战争\t448
760\n冲洗机\t448761\n杨庄镇\t448762\n少儿简笔画\t448763\n仙特明\t448764\n89平米\t448765\n燃烧性\t448766\n市监局\t448767\n经典人文地理\t448768\n項\t448769\n0号\t448770\n玄帝\t448771\n桃红\t448772\nú\t448773\n斩妖除魔\t448774\nCertbot\t448775\n徐昌\t448776\n柯林斯\t448777\n金楼\t448778\n杭州立方控股股份有限公司\t448779\n我的母校\t448780\n自如寓\t448781\nBRP\t448782\n智能传感器\t448783\n货票\t448784\n4月2日起\t448785\nximalaya\t448786\nbbking\t448787\n提拔\t448788\n亲密\t448789\ncapsule\t448790\n莫克\t448791\n管道式\t448792\n钓罗非\t448793\n吧\t448794\n本能寺\t448795\n新陈代谢\t448796\nscreening\t448797\n东北师范大学附属中学\t448798\n长沙幼升小\t448799\n功能块\t448800\n入腹\t448801\n评估板\t448802\n叛逆期\t448803\n频响\t448804\n助助\t448805\n猫舍\t448806\n南山中学\t448807\n南宁万象城\t448808\n均匀度\t448809\n苏宁生活广场\t448810\n东溪\t448811\n赣南师范学院\t448812\n11分钟\t448813\n114集\t448814\nv2.0\t448815\n没有你\t448816\n拉丁吧\t448817\n等一回\t448818\n网商之窗\t448819\n2008年上半年\t448820\n天虹商场股份有限公司\t448821\n60e\t448822\n要是\t448823\n_沃游网www\t448824\n家外\t448825\n西安碑林区\t448826\nnnn\t448827\n三角纹\t448828\n韶关市人民政府\t448829\n陵南\t448830\nDetroit\t448831\nC2065\t448832\n豪放\t448833\n沪江泰语学习网\t448834\n601021\t448835\n赖子山庄\t448836\n健康源\t448837\n耿素云\t448838\nWoW\t448839\n文部\t448840\nipad3\t448841\n照面\t448842\nSumatra\t448843\n机组\t448844\n反观\t448845\n深圳北站\t448846\n无字\t448847\n神叨字幕组\t448848\n板梁\t448849\n鲁庄公\t448850\nBeware\t448851\nclearlove\t448852\nwatch9\t448853\n复证\t448854\n济生肾气丸\t448855\nAUTO\t448856\n回敬\t448857\n竞迹\t448858\n20周岁\t448859\nBeau\t448860\nNopCommerce\t448861\n晚9点\t448862\ne49\t448863\nmike11\t448864\n75d\t448865\n申请号\t448866\n13200\t448867\n比亚迪l3\t448868\n泡软\t448869\n超韧\t448870\n海蓝宝\t448871\n英菲尼迪Q50/Q50L论坛\t448872\n16英寸\t448873\n董路\t448874\n高层小区\t448875\n链路\t448876\n双季\t448877\n微门户-zz91再生网\t448878\n六段\t448879\n本田思铂睿\t448880\n上海王\t448881\n贯入式\t448882\n钓鱼岛\t448883\n广藏\t448884\n骏枣\t448885\n金题\t448886\n渋谷\t448887\n古楼\t448888\n5190\t448889\nx2+1\t448890\n车陂路\t448891\n三起点\t448892\n宪哥\t448893\n甾体类\t448894\n丁香树\t448895\n许廷铿\t448896\n天墟\t448897\n中国恒天集团有限公司\t448898\nIP池\t448899\n新晨科技\t448900\n屁王\t448901\nsmoke\t448902\n城东区\t448903\n莱昂王传福
\t448904\n黄金宝\t448905\ncollaboration\t448906\n二间\t448907\n大特保\t448908\ndcoker\t448909\n中国时尚网\t448910\n反华\t448911\n罗志军\t448912\n货运站\t448913\n1日起\t448914\n深圳市人社局\t448915\n8118\t448916\n118型\t448917\n马基\t448918\n暂予监外执行\t448919\n嗷嗷待哺\t448920\n西柏林\t448921\n新电脑\t448922\n个人隐私\t448923\n仁和医院\t448924\n打做\t448925\n深圳天威\t448926\nFLYME\t448927\n超跑级\t448928\n2018.3.24\t448929\n剪胸\t448930\n20.1\t448931\nJ20\t448932\n再访\t448933\n东莞市环境保护局\t448934\n水瓶座\t448935\n示踪\t448936\n张梅\t448937\n文县\t448938\nmatlab2013a\t448939\n东北大米\t448940\ninhouse\t448941\n微茶网\t448942\n矾山\t448943\ngashero\t448944\n上海浦东陆家嘴\t448945\n广州市科技和信息化局\t448946\n拇\t448947\nowasp\t448948\n20170608\t448949\n2.4GHz\t448950\n抚媚\t448951\n巡检柜\t448952\n魔导少年/妖精的尾巴\t448953\nWOOD\t448954\n深圳北大医院\t448955\n喷胶棉\t448956\nU模拟器\t448957\n总司令\t448958\nResult\t448959\n艾米莉\t448960\nPrometric\t448961\n何康\t448962\n大政方针\t448963\n黑道风云之收数王\t448964\nUnreal4\t448965\nContacts\t448966\n点津\t448967\n绿檀\t448968\nredeem\t448969\n侠盗猎车4\t448970\n今晚12点\t448971\n圣方\t448972\ninteracting\t448973\n惯犯\t448974\n王非\t448975\n南门街道\t448976\n狮桥\t448977\n真武世界\t448978\n碗面\t448979\n可辨\t448980\nTraining\t448981\n段染\t448982\n张竹坡\t448983\n三焦经\t448984\nwebapps/docs/config\t448985\n杨盼盼\t448986\n聊城市政府网\t448987\n石台县\t448988\nduoduo\t448989\n2035\t448990\n修齐\t448991\n艺伎回忆录小千代\t448992\n交管局\t448993\nVagrant\t448994\n报菜名\t448995\n山寺\t448996\n梅山岛\t448997\n洲际优悦会\t448998\n2016-2022年\t448999\nNova2\t449000\n河南网\t449001\ncation2\t449002\n伊春新闻网\t449003\n盗\t449004\n家花\t449005\n黑诊所\t449006\n容易\t449007\nspectrometry\t449008\n余声\t449009\n三阳路\t449010\nKeygen\t449011\n劲胜\t449012\n每5分钟\t449013\n第5名\t449014\n过线\t449015\n诡案组\t449016\n重庆时报电子版\t449017\n表格画\t449018\nzack\t449019\n31.8\t449020\n蝇蛆\t449021\n没关\t449022\nNATURAL\t449023\n跳法\t449024\n多方通话\t449025\n主赛\t449026\n星套\t449027\n零诊\t449028\n2018年4月9号\t449029\n丁坝\t449030\n67期\t449031\n电脑报\t449032\nNHK\t449033\nwaitfor\t449034\nPILOT\t449035\nexcel03\t449036\n江汉\t449037\n林由奈\t449038\n有情人\t449039\n穿线盒\t449040\n高端机\t449041\nbags\t449042\n转运四方吧\t44
9043\n各面\t449044\nAutosport\t449045\n周霞\t449046\n表笔\t449047\n5300元\t449048\n电池级\t449049\n李元芳\t449050\ntpy\t449051\n碗形\t449052\n武王小说网\t449053\n秦殇\t449054\ncirc\t449055\nvendredi\t449056\n里皮\t449057\n刘品言\t449058\n中间商\t449059\n2696\t449060\n上海翊圣生物科技有限公司\t449061\n准确无误\t449062\n1亿\t449063\n西贴\t449064\n我的运维时光\t449065\n爱字\t449066\n宿迁房产网\t449067\nWindows分区\t449068\n第二型\t449069\n西安市红会医院\t449070\n以免\t449071\n台电\t449072\n念帝\t449073\n组立\t449074\n倒挂金钟\t449075\n超然\t449076\n刘伟平\t449077\n北镇市\t449078\n维盟路由器\t449079\nbargain\t449080\n铧\t449081\n中国投资有限责任公司\t449082\n胡滨\t449083\n阿尔马尔\t449084\n全成就\t449085\n全国人社\t449086\n矮人\t449087\n周巍\t449088\nassy\t449089\n龙虎斗\t449090\n天津医科大学总医院\t449091\n地矿\t449092\nnife\t449093\nBrunette\t449094\n王黎明\t449095\n暗组\t449096\n嘉荣超市\t449097\n磁势\t449098\n黄道益活络油\t449099\n3英寸\t449100\n屌丝男士\t449101\n山东省新闻出版广电局\t449102\n中国社会科学院\t449103\n信言\t449104\nappannie\t449105\n尿红细胞\t449106\n半角\t449107\n中国蒙阴政府\t449108\n伢子\t449109\nWorst\t449110\n金丝熊\t449111\n10.2.1\t449112\n李闲鱼\t449113\n0316\t449114\n江苏软件园\t449115\n100下\t449116\n控制变压器\t449117\n英语单\t449118\n福州市政府\t449119\n转外\t449120\n微园\t449121\n5800k\t449122\nC290\t449123\n新都网\t449124\n戎马丹心汉匈决战\t449125\n号召力\t449126\n995\t449127\n小数的初步认识\t449128\n睡衣\t449129\nNguyen\t449130\n十二级\t449131\n真实力\t449132\n三十周\t449133\n死而复生\t449134\n87_\t449135\njonas\t449136\nttps\t449137\nwebform\t449138\n扬农化工\t449139\n叛逆的鲁鲁修\t449140\n附发\t449141\n千门八将\t449142\n屈尊\t449143\n成都市质量技术监督局\t449144\n月蚀\t449145\n小郎酒\t449146\n坡州\t449147\n金山卫镇\t449148\n焦庄\t449149\nhwa\t449150\n江苏大剧院\t449151\nhyphen\t449152\n贸泽\t449153\n双鸭山市\t449154\n24芯\t449155\n音乐类\t449156\nMilfy\t449157\n睡大觉\t449158\nHypertext\t449159\n遗嘱库\t449160\nchongwu\t449161\n组图\t449162\naB\t449163\n基组\t449164\n百废待兴\t449165\nLifestyle\t449166\n民事\t449167\n领先版\t449168\n欧尼酱\t449169\n小型车\t449170\n运动感\t449171\n爱吃拉面的小泉同学\t449172\n一次元\t449173\n无人超市\t449174\n7d\t449175\n程守洙\t449176\n5.4.0\t449177\n硅灰\t449178\n华能水电\t449179\n明纪\t449180\n课型\t449181\n5.6G\t449182\n5.9.4\t449183\n杰尔夫\t449184\ninsanity\t449185\n保定
学院\t449186\nbasketball\t449187\n8170\t449188\n卡莎\t449189\n孝行\t449190\n特种兵之深入敌后\t449191\n永州一中\t449192\n捕猎者\t449193\n外交学\t449194\n按章\t449195\n山麓\t449196\n赤小豆\t449197\n达林顿\t449198\n31位\t449199\n装瓶\t449200\n无损版\t449201\nsvprogresshud\t449202\n巴马\t449203\n法类\t449204\n费\t449205\n干掉\t449206\n天球\t449207\n欣慰\t449208\n酷歌\t449209\n朴宝英\t449210\n无损音乐播放器\t449211\n杨薇\t449212\n杨长亚\t449213\n报检员\t449214\n考古新发现\t449215\n厦门市人才服务中心\t449216\n幽州\t449217\n推杆\t449218\nツ\t449219\n九叠篆\t449220\n━\t449221\n调试员\t449222\n保值\t449223\n选列\t449224\n骇客\t449225\narmin\t449226\n马靴\t449227\n嘉年华论坛\t449228\n京东金融\t449229\n808nm\t449230\n国际货运代理有限公司\t449231\n监票\t449232\n9.9成\t449233\n7.0PVP\t449234\n2017年1月9日\t449235\n宠物网\t449236\n武汉江\t449237\necmo\t449238\nSussex\t449239\n中科院地理所\t449240\n基金管理公司\t449241\n李嘉\t449242\nstormy\t449243\n第189集\t449244\n短款\t449245\n菲妞\t449246\n墨水湖\t449247\n相位\t449248\nanthony\t449249\n松果体\t449250\n棋牌游戏平台\t449251\nf-447601\t449252\n步走\t449253\n5.0ex\t449254\n乌珠穆沁旗\t449255\n七子饼茶\t449256\n胞\t449257\n黄静茵\t449258\n杰尼\t449259\n双笼\t449260\n赛诺\t449261\n搭架\t449262\n艾玛\t449263\n2pac\t449264\n密恐\t449265\n望京店\t449266\n贤弟\t449267\nSuns\t449268\nqsp\t449269\nz1000\t449270\n2098\t449271\npediatric\t449272\n愣神\t449273\n弗瑞\t449274\n补开\t449275\n南存辉\t449276\n主诊\t449277\n苹果零售店\t449278\n592招聘网\t449279\n大神们\t449280\n课外活动\t449281\n百战\t449282\n不守\t449283\n保利爱尚里\t449284\n夕照\t449285\n保婴丹\t449286\n5.28\t449287\n王立峰\t449288\n排便\t449289\n东昌中学\t449290\n索证索票\t449291\nodyssey\t449292\n金强\t449293\n大威\t449294\ncss行高line\t449295\n58岁\t449296\n四肖\t449297\n按户\t449298\n黄浦江畔\t449299\n山东临工\t449300\n第一宠\t449301\nQTabWidget\t449302\n因私\t449303\n//\t449304\n百田画咖圈\t449305\n半信半疑\t449306\nNO.7\t449307\n20160605\t449308\n99BT\t449309\n武陟县\t449310\n奔驰宝马\t449311\n神选者\t449312\n钱桥街道\t449313\n景驰\t449314\nbasedao\t449315\n以爱\t449316\n微信朋友圈\t449317\n火山口\t449318\n上坡\t449319\n八宝粥\t449320\n湖南机电职业技术学院\t449321\n再走\t449322\n叶紫涵\t449323\n桌宠\t449324\nned\t449325\nNT\t449326\nWWW.828VV\t449327\n32分钟\t449328\nΟ\t449329\n邮政\t449330\n小一\t449331\n史
明克\t449332\n16年9月\t449333\n杰德我是至尊\t449334\n铢\t449335\n曼陀罗\t449336\n波特\t449337\n速冻\t449338\n西藏商城藏善堂\t449339\n爱卫办\t449340\nTAG\t449341\n水心\t449342\nPSV/PS3\t449343\nrsys\t449344\n阜阳市人民政府\t449345\nOperator\t449346\n炉石冰封王座\t449347\n多木木多\t449348\n胜科\t449349\nJFace\t449350\n卫浴柜\t449351\nav女星_日\t449352\n对策分析_参考网\t449353\n春\t449354\nImpulZ\t449355\n平视\t449356\n气象网\t449357\n平安神宫\t449358\n广州证券\t449359\n贵圈\t449360\n薇婷\t449361\n贝狄威尔\t449362\n葡萄枝\t449363\n曼陀林\t449364\npantyhose\t449365\nStealth\t449366\n硅酸\t449367\n公司治理结构\t449368\n齐静春\t449369\ncac\t449370\n桂树\t449371\n老脸\t449372\n太阳能光热发电\t449373\n沾益区\t449374\n艽野尘梦\t449375\np8600\t449376\n舒尔茨\t449377\narmadillo\t449378\nDFD\t449379\n耳针\t449380\n非金属补偿器\t449381\nV2.0\t449382\nC87\t449383\nTowel\t449384\n反馈稿\t449385\n陈永胜\t449386\n回冲\t449387\n城市之星\t449388\n美国法院\t449389\n英国伦敦大学学院\t449390\n捜査官\t449391\ndvwa\t449392\n胎\t449393\n河东狮\t449394\n百香果茶\t449395\nluxe\t449396\n为什么不去\t449397\n傲锐\t449398\n三星浏览器\t449399\n不打\t449400\nv6.4\t449401\n引人入胜\t449402\n输液港\t449403\npyv8\t449404\nsne\t449405\n云苓\t449406\n徐伟刚\t449407\nDailyFX财经网\t449408\n第十九部\t449409\nk620\t449410\nQC25\t449411\n蜜妻\t449412\n11#\t449413\n乐群\t449414\n三家分晋\t449415\n上海国际艺术节\t449416\nhonghai\t449417\n武汉科技馆\t449418\ninstalling\t449419\n对眼\t449420\n万泰\t449421\n补种\t449422\nCarthage\t449423\n云电子狗\t449424\n水解反应\t449425\n淡月\t449426\n阳新县\t449427\nios9.1\t449428\n杂集\t449429\n辽委办\t449430\n虎尾兰\t449431\nAfterburner\t449432\n上饶市人民医院\t449433\n变态传奇\t449434\n三两\t449435\n抢尸\t449436\nFranco\t449437\n首都医科大学附属北京安贞医院\t449438\n舞剑\t449439\n佳能mg2400\t449440\n塗\t449441\nambient\t449442\n海思处理器\t449443\n伟良\t449444\n说史\t449445\n18根\t449446\n全州县\t449447\n芳芳\t449448\n可调整\t449449\n水罐消防车\t449450\n强校\t449451\nSoMachine\t449452\ndepletion\t449453\n张向东\t449454\n汉华\t449455\n重氮盐\t449456\n草种\t449457\n曹超\t449458\nFC\t449459\n建筑面积\t449460\nBorders\t449461\n神经节苷脂\t449462\n按位\t449463\n小绵羊\t449464\n亚马逊商城\t449465\n【天了噜\t449466\n大天\t449467\n75寸\t449468\n97933.com\t449469\n大皇\t449470\n3ml\t449471\n整理与复习\t449472\nDISPLAY\t449
473\n昆明湖\t449474\n顾长卫\t449475\n美沙酮\t449476\n啫喱\t449477\nSplit函数\t449478\n3200元\t449479\n贤臣\t449480\n卓不凡\t449481\n天津市纪委\t449482\n印加\t449483\n三环线\t449484\n金永南\t449485\n振头\t449486\n乐视pro3\t449487\n胶头\t449488\n主裁判\t449489\n结冷胶\t449490\n石人\t449491\n露背\t449492\nNewcastle\t449493\nCorelDRAW\t449494\n零号机\t449495\n选聘\t449496\n王彤\t449497\n金超\t449498\n中信圆梦金\t449499\n期货从业资格证\t449500\nchampionship\t449501\n中控台\t449502\n熊猫纪念币\t449503\n学科类\t449504\n女仆装\t449505\nMissevan\t449506\n南兴\t449507\nhandheld\t449508\n秋水伊人\t449509\n葵花籽油\t449510\n手艺人\t449511\n紫阙\t449512\n西瓜\t449513\n小橙\t449514\n郭鹤年\t449515\n小康\t449516\n含胸\t449517\n孙中山\t449518\n光伏网\t449519\n张晓波\t449520\ndreamwear\t449521\n朝歌\t449522\n鼓声\t449523\n高德地图车机版\t449524\n浏览器\t449525\n族脉网\t449526\n康纳麦格雷戈\t449527\n番禺市桥\t449528\n叩谢\t449529\n赤坎\t449530\nsqlcommand\t449531\n广东建设信息网\t449532\n幽兰湖之御厨驾到\t449533\n华为Watch\t449534\n等价交换\t449535\nIC2\t449536\n健壮\t449537\n银行利率网\t449538\nGATE\t449539\n监理人\t449540\n吃死\t449541\n0m\t449542\n晦气\t449543\n百题\t449544\n55R17\t449545\n算试\t449546\n二甲基乙酰胺\t449547\n拉伸\t449548\n珍藏区\t449549\n装病\t449550\n薛立新\t449551\n武痴\t449552\n霸王的大陆\t449553\nOBS\t449554\n拆股\t449555\n线性函数\t449556\n两圈\t449557\n享动\t449558\n张晓楠\t449559\n片天\t449560\nTC720P\t449561\n救助站\t449562\n仲胺\t449563\n秦代\t449564\n烈山区政府\t449565\n公牛历险记\t449566\n伊夫黎雪\t449567\n海印股份\t449568\n源著\t449569\n孟阳\t449570\n宋小君\t449571\n集会所\t449572\n默认项\t449573\n背贴式\t449574\n中华医学科技\t449575\n止咳贴\t449576\nsolution\t449577\n16进制字符串\t449578\n以兹\t449579\n掏心\t449580\nxmind8\t449581\n庆生会\t449582\nautocad2010\t449583\n准驾证\t449584\n砂砾石\t449585\n万石\t449586\njita\t449587\n停电\t449588\n相知\t449589\n事业单位人事管理条例\t449590\n包装箱\t449591\nwatershed\t449592\n钙骨\t449593\nv3.0\t449594\n小先锋\t449595\n娇子\t449596\n新南威尔士\t449597\nattain\t449598\n橙皮苷\t449599\n弗鲁姆\t449600\nDigitalOcean\t449601\n簪花仕女图\t449602\n巩华城\t449603\n四川电影电视学院\t449604\n炫爱\t449605\n本会\t449606\n熹妃传\t449607\nifnull\t449608\n抗老化\t449609\n卢纶\t449610\n低塘街道\t449611\n中国邮储银行\t449612\n陆晨曦\t449613\n如意书\t449614\n江小绿\t449615\ninteraction\t449616\n陕西
大学\t449617\n59张\t449618\n十五从军征\t449619\n零值\t449620\n真叶\t449621\n体味\t449622\n失准\t449623\n村干\t449624\nSheep\t449625\nranorex\t449626\n讨人喜欢\t449627\ngfi\t449628\n20170624\t449629\n协议转换器\t449630\n雷诺考特\t449631\n词论\t449632\noauth\t449633\nQFP\t449634\nBritannica\t449635\n著有\t449636\n广电集团\t449637\n大区\t449638\n汽车离合器\t449639\n谷城县\t449640\n广药集团\t449641\n龟纹石\t449642\n强调\t449643\n蚵仔\t449644\n网易163邮箱\t449645\n数控车刀\t449646\n验亲\t449647\n50首\t449648\nTATA\t449649\n深圳舞蹈网\t449650\n5611\t449651\n位点\t449652\n柏凝\t449653\n展翅\t449654\n17个小时\t449655\nKiki\t449656\nkindle4\t449657\n吾爱娱乐网\t449658\n污泥池\t449659\n手鞠\t449660\n沙丁\t449661\n东胜\t449662\n分数乘除法\t449663\n小甘\t449664\n14代\t449665\n风吹雨打\t449666\n陈永明\t449667\n刀剑神域外传\t449668\nwebster\t449669\nKorea\t449670\n别说\t449671\n风邪\t449672\n腹式\t449673\n赣东北\t449674\n母老虎\t449675\n毛南族\t449676\n哌啶\t449677\n克里斯安吉尔\t449678\n3席\t449679\nGTX850M\t449680\nH3C\t449681\n砥石\t449682\n汉堡肉\t449683\n鲁中网\t449684\n7650\t449685\n环科院\t449686\n残差\t449687\n上大学\t449688\n怀乡\t449689\nZeplin\t449690\n公证法\t449691\n优趣\t449692\n存贷\t449693\n春茶\t449694\n10.6\t449695\n包封\t449696\n3.7.26\t449697\n长治市郊区人民政府\t449698\n松蘑\t449699\n杨雨\t449700\nlintcode\t449701\n抗癌药\t449702\n好穷\t449703\nminban\t449704\n弦外之音\t449705\ndaterangepicker\t449706\n大良街道\t449707\nxmind\t449708\n压感笔\t449709\n顶梁柱\t449710\nManticRo\t449711\n第一箱\t449712\n办信\t449713\n千落\t449714\n10品\t449715\n雨量传感器\t449716\n苏杭\t449717\n娇嫩\t449718\nINT\t449719\n王锡玄\t449720\nboai\t449721\nExeter\t449722\n文雯\t449723\n严顺开\t449724\n萨满\t449725\n中捷资源\t449726\n美国中学\t449727\n华南城集团\t449728\n奥图码\t449729\n赤坂亭\t449730\n张北草原\t449731\n波米\t449732\n食用油\t449733\n180319\t449734\n烫发\t449735\n点播首\t449736\n偏倚\t449737\n龙灯\t449738\n锦绣路\t449739\n厦门火车站\t449740\n坑口\t449741\n面观\t449742\nJZ2440\t449743\n15808\t449744\n合肥便民网\t449745\n0086\t449746\n达方\t449747\nMastercam2017\t449748\n232号\t449749\n意大利队\t449750\n新奥能源\t449751\n中间继电器\t449752\n光纤激光器\t449753\n毛衣\t449754\nie80s\t449755\n英文稿\t449756\n谷瀑\t449757\n高考网\t449758\n药石\t449759\n合不拢嘴\t449760\n正人君子\t449761\n幽门杆菌\t4497
62\nmultiphysics\t449763\n美林证券\t449764\n辱\t449765\n两部曲\t449766\n陈某某\t449767\ncouldnt\t449768\n纵火犯\t449769\n吉吉哥\t449770\n翻堆机\t449771\n翻梁\t449772\n高跟靴\t449773\n奶酪陷阱\t449774\n执政团\t449775\n尿样\t449776\nANSYSWorkbench\t449777\n诺华制药\t449778\n洛书\t449779\n中国建筑第八工程局\t449780\n深井冰\t449781\n2020kk\t449782\n职守\t449783\n115万\t449784\n梅毒\t449785\n西扎\t449786\n崔苔菁\t449787\ndiego\t449788\n大建\t449789\nsilicone\t449790\n进展\t449791\n三面\t449792\n第28章\t449793\n视图层\t449794\n涓流\t449795\n穗华\t449796\nOFFICE2007\t449797\n童年的回忆\t449798\n吃螃蟹\t449799\n珠海市人民医院\t449800\nTabControl\t449801\n大军阀\t449802\nmodules\t449803\n海波\t449804\n适不适合\t449805\nmarkpoint\t449806\n懒人鞋\t449807\n敖翔\t449808\n发行商\t449809\nmavlink\t449810\n肺门\t449811\n海神针\t449812\n6530\t449813\n黄曲霉菌\t449814\n油彩\t449815\n网讯\t449816\n2.8T\t449817\nCaCO3\t449818\nimooc\t449819\n以东\t449820\n潜\t449821\n快币\t449822\n如家酒店集团\t449823\n铝扣板吊顶\t449824\n23题\t449825\n美丝\t449826\n西善桥街道\t449827\n600060\t449828\n尿盆\t449829\n绝地大逃杀\t449830\njiawei\t449831\nhtp\t449832\niferror\t449833\n3288\t449834\n许三观卖血记\t449835\n打探\t449836\n诺华\t449837\nunload\t449838\n小莉\t449839\n艺\t449840\n10分米\t449841\n我爱洗澡\t449842\n韦布尔\t449843\n去电\t449844\n霸王大陆\t449845\nsubprocess\t449846\n车仔\t449847\nlumia930\t449848\nhtml源代码\t449849\n馨苑小区\t449850\n240GB\t449851\n一将\t449852\n哈勃望远镜\t449853\n上海大学理学院\t449854\n广州伊丽莎白妇产医院\t449855\n行字\t449856\n无损压缩\t449857\n江永\t449858\n生普\t449859\ndafa888\t449860\nomni\t449861\n桂花树\t449862\n有痘\t449863\n波波维奇\t449864\n焕肤\t449865\n居村\t449866\nFfmpeg\t449867\n2018-03-30\t449868\n预告—大陆\t449869\n立式泵\t449870\n20170616\t449871\n四川师范\t449872\n裂纹\t449873\n社会类\t449874\nQS世界排名\t449875\nPlant\t449876\n美丽的小兴安岭\t449877\nStockQ\t449878\nBen\t449879\n这就是命\t449880\n丰硕\t449881\n依非\t449882\n55度杯\t449883\n笕桥镇\t449884\nISO27001\t449885\n3月5日\t449886\nmdyd\t449887\n混合\t449888\n绝望之塔\t449889\n壳子\t449890\n县卫生局\t449891\nERIC\t449892\n第10节\t449893\n犯罪嫌疑人\t449894\n金吒\t449895\narray_filter\t449896\n韩剧网_热播韩剧网\t449897\n羟基苯甲酸\t449898\n5宗\t449899\nalia\t449900\n剁屌\t449901\nbauhaus\t449902\n
冻品\t449903\n春衫\t449904\n稿约\t449905\n摇风\t449906\n全合\t449907\njianguo\t449908\n862\t449909\nBasic\t449910\n紫金山天文台\t449911\n蓓俪芙\t449912\n马海毛\t449913\n睿智启图书\t449914\n犀浦镇\t449915\n再会\t449916\n葫芦岛百姓网\t449917\n春子\t449918\n裴大帅\t449919\n西安恒大\t449920\n土家族\t449921\n20188\t449922\n树莓派3\t449923\n林夕梦\t449924\n可风骚\t449925\n成都天府国际生物城\t449926\nranges\t449927\n单元房\t449928\n离婚协议书\t449929\n执委会\t449930\n凤阳\t449931\n邵逸夫医院\t449932\n多样性\t449933\n京台高速公路\t449934\nshade\t449935\n有赞\t449936\n微控制器\t449937\n陰\t449938\n唐山镇\t449939\n树脂板\t449940\ns20\t449941\nabc360\t449942\n强收\t449943\n小水滴旅行记\t449944\nFrameWork\t449945\n荣耀渭南网\t449946\n按摩女\t449947\nras\t449948\n下片\t449949\n寸土不让\t449950\n福建师大附中\t449951\n周昉\t449952\n龙界\t449953\n会办\t449954\n君兰\t449955\n上海钓鱼网\t449956\ncv2\t449957\n心服\t449958\n酷娱网\t449959\n墨兰\t449960\nDee\t449961\napplying\t449962\n高品质生活\t449963\nwhos\t449964\nHTTPS\t449965\n飞塔信息科技(北京)有限公司\t449966\n炼奸\t449967\n强力型\t449968\n音画\t449969\nCCTV-5_央视\t449970\n生化试剂\t449971\nAironet\t449972\n太黑\t449973\n硬盘座\t449974\n900E\t449975\n苯甲醇\t449976\n电池炉\t449977\nGradient\t449978\n膏体灌装机\t449979\n风帆蓄电池\t449980\nhandwriting\t449981\nwww.75\t449982\nScholarships\t449983\n实体盘\t449984\n气氛\t449985\n朱家角古镇\t449986\nPlatform\t449987\nrois\t449988\n梁楷\t449989\n劳卡\t449990\n足疗\t449991\n进程间通信\t449992\n恋腿癖\t449993\n砂土\t449994\n人教版五年级下册语文\t449995\n樱花树下的约定\t449996\n桂林医学院\t449997\nx.^2\t449998\ntensflow\t449999\n中经市场研究网\t450000\n警备\t450001\n90b套\t450002\n冲筋\t450003\ntflearn\t450004\n华人网\t450005\n85cm\t450006\n一树梨花压海棠\t450007\n百度云BT\t450008\nducati\t450009\n僵尸道长\t450010\n范世琦\t450011\n竹溪县人民政府\t450012\n骏河屋\t450013\n岩彩\t450014\n金庸群侠传5\t450015\n哑剧\t450016\nLogBack\t450017\n学霸\t450018\n16栋\t450019\n樱桃&丸子\t450020\n把柄\t450021\n小分子肽\t450022\n好人缘\t450023\n招标代理服务收费管理暂行办法\t450024\n风暴潮\t450025\n香云\t450026\nspss聚类\t450027\n桂林三金\t450028\n日主\t450029\n絮凝剂\t450030\n2k\t450031\n夏侯钰涵\t450032\n副馆长\t450033\nRobbie\t450034\nKEIL5\t450035\n11厘米\t450036\n梁汉文\t450037\n大气污染防治法\t450038\n瑛子\t450039\n傅红雪\t450040\n十七个月\t450041\n华清\t450042\n晨诵\t45004
3\n哆啦\t450044\n变电器\t450045\nqq绿钻\t450046\n人物库\t450047\n梦岛小说网\t450048\n1844\t450049\n抗日战争史\t450050\nGarfield\t450051\n福莱希\t450052\ndigimonmasters\t450053\nsenator\t450054\n机物\t450055\n录音棚\t450056\nwcdma\t450057\n圆寂\t450058\n帝王蝎\t450059\n东迁\t450060\n阅读页\t450061\n厨神\t450062\nbn}\t450063\nrenjiao\t450064\n关岭自治县\t450065\ndetects\t450066\n专报\t450067\n多氟多\t450068\n肯德基嫩牛五方\t450069\n结核性脑膜炎\t450070\n垃圾房\t450071\n讽刺性\t450072\n萎靡不振\t450073\n旗舰型\t450074\n药匙\t450075\n手扶\t450076\nresilient\t450077\n喷播机\t450078\n二十米\t450079\nmbs\t450080\n放开那三国\t450081\n跨境\t450082\n20180325\t450083\n桑花\t450084\n扩展卡\t450085\n沈阳市环境保护局\t450086\n塑封\t450087\n私有链\t450088\n障碍者\t450089\n26周-40周\t450090\n心肌\t450091\n桐子\t450092\n潮语\t450093\nmtonline\t450094\nAutopilot\t450095\nflorence\t450096\nmid函数\t450097\n补牢\t450098\n壶关县\t450099\n焉耆\t450100\n差速电机\t450101\n94级\t450102\n102名\t450103\nINTERNET\t450104\nVRChat\t450105\n屏蔽电缆\t450106\n土桥\t450107\n卢旺达\t450108\n柠檬蜜\t450109\n吴主任\t450110\n齐河\t450111\n跑完\t450112\n百瑞\t450113\nswm\t450114\n盐城南洋机场\t450115\n苏州立达中学\t450116\n超人学院\t450117\n顾莉雅\t450118\n南京师范大学附属中学树人学校\t450119\n海角七号\t450120\nv3.8.1\t450121\n诛仙3\t450122\nem1\t450123\n永红村\t450124\n龙津\t450125\nimdb\t450126\n华维\t450127\n辩论会\t450128\n上饶银行\t450129\n机动车交通事故责任强制保险条例\t450130\necshop\t450131\nFRANXX\t450132\n雷丁\t450133\n风幕\t450134\n豆乐\t450135\n箱线图\t450136\nprocesslist\t450137\nBonita\t450138\n520apk.com\t450139\nOLI\t450140\n列表版\t450141\n魏巍\t450142\n音符\t450143\n枫\t450144\n流汗\t450145\nvodka\t450146\nzum\t450147\nhoa\t450148\n下装\t450149\n国药\t450150\n冠区\t450151\n颍东\t450152\n阳平镇\t450153\n距离\t450154\nUHD630\t450155\n做人事\t450156\n心潮\t450157\n霍尔果斯公司\t450158\n七宝\t450159\n潜水钟\t450160\n可卸式\t450161\n熹微\t450162\n鳊鱼\t450163\n钢块\t450164\n腮红盘\t450165\n疼爱\t450166\n秦洪涛\t450167\n抢食\t450168\n高弗雷\t450169\ndll库\t450170\n红羊\t450171\n五季\t450172\n童宁\t450173\n朝鲜币\t450174\n韩国男篮\t450175\nBOLON\t450176\nmayi\t450177\n七分之二\t450178\n遵义旅游网\t450179\n排他性\t450180\n石阵\t450181\n城市赛\t450182\n山东省质量技术监督局\t450183\n触感\t450184\nPPK\t450185\nwin\t450186\n荣盛地产\
t450187\n山东省政府法制网\t450188\n女大枪\t450189\n海牙\t450190\n坍\t450191\n北京人大\t450192\n通心络胶囊\t450193\n豇豆\t450194\n协鑫新能源\t450195\n天津卷\t450196\n妖艶\t450197\n上海士锋生物科技有限公司\t450198\n途客\t450199\nrubymine\t450200\n结合体\t450201\n安徽文达信息工程学院\t450202\nTAI\t450203\n集悦城\t450204\n白榜\t450205\n云龙山\t450206\n二伯\t450207\n十九大时光\t450208\n罍街\t450209\n江滩\t450210\n纤维棉\t450211\nRevealed\t450212\n量化\t450213\n雷克萨斯\t450214\n300部\t450215\n结项\t450216\n被切断\t450217\nLinxu\t450218\nIgM\t450219\njavax\t450220\n顶峰\t450221\nruntime-l1-1-0.dll\t450222\nkana\t450223\n王晓晨\t450224\nkali吧_\t450225\n电子猫\t450226\nFlannel\t450227\n1MM\t450228\n球泡\t450229\n荷花路\t450230\n第四组\t450231\n紫花苜蓿\t450232\n临阵\t450233\n2015～2016年度\t450234\n天津法院\t450235\nImpact\t450236\n联合社\t450237\n西藏银行\t450238\n幸福终点站\t450239\n沈阳小学\t450240\nODIN\t450241\n宋园路\t450242\n90后\t450243\nau3.0\t450244\n亚洲流体网\t450245\n铭扬\t450246\nanacoda\t450247\n川牛膝\t450248\nBicycles\t450249\n瀚林\t450250\n幼稚\t450251\npolly\t450252\n泰马\t450253\n麦克韦尔\t450254\n采购经理人指数\t450255\n邱百瑞\t450256\n名列\t450257\n流畅度\t450258\n区块链浏览器\t450259\n梨木\t450260\nRegExp\t450261\n松赞干布\t450262\nBus\t450263\n品检\t450264\n住在一起\t450265\n第N\t450266\nx263&period\t450267\n覆灭\t450268\n佛手柑\t450269\n四道口\t450270\n云峰基金\t450271\n胀接\t450272\n30s\t450273\n簪中录\t450274\n绳长\t450275\n诚\t450276\n宝蓝\t450277\ngcloud\t450278\n50处\t450279\n自驾路\t450280\nJuliette\t450281\n复华控股\t450282\n阿喜\t450283\nhsv\t450284\n篮筐\t450285\ngated\t450286\n搜建筑网\t450287\n长尾巴\t450288\n洪欣\t450289\n考拉海购\t450290\n陈一发\t450291\n28斤\t450292\n法帮网\t450293\nInstituto\t450294\n女帽\t450295\n钢琴声\t450296\n2个小时\t450297\n经天\t450298\n甑糕\t450299\n烧饼\t450300\n日更\t450301\n人民广场\t450302\n李善均\t450303\n山东人事考试网\t450304\nshelter\t450305\n马球\t450306\nCow\t450307\n时节\t450308\n目镜\t450309\n喜得\t450310\n45nm\t450311\n首尔东\t450312\n聚氨酯密封胶\t450313\n肇东市\t450314\n114生活网\t450315\n展讯\t450316\n120v\t450317\n苏岚\t450318\n杭州一小区\t450319\nFinePrint\t450320\n销售净利率\t450321\nmib库\t450322\ncjx\t450323\n吴军\t450324\n苏系\t450325\n2.74\t450326\n肛毛\t450327\n网易云电脑\t450328\nwander\t450329\n老戴\t450330\n50天
\t450331\n国际电子商情网\t450332\n危险度\t450333\n入学考试\t450334\n鬼岛\t450335\n一本率\t450336\n落实全面从严治党\t450337\n秸秆打包机\t450338\n拜新同\t450339\n铜镜\t450340\n欧阳小文\t450341\nPhiladelphia\t450342\n阿木木\t450343\n天生三桥\t450344\nrestrictions\t450345\n4.x\t450346\nexclipse\t450347\ncatalysis\t450348\n1280P\t450349\n杨树林\t450350\nvivo\t450351\n陈晓军\t450352\n镜姬\t450353\n资本支出\t450354\n美学原理\t450355\n世家子\t450356\n邵阳市政府\t450357\n三页\t450358\n子君\t450359\n顽症\t450360\n5把\t450361\n众创\t450362\n刘婵\t450363\n数子\t450364\n捷力\t450365\n蓝箱\t450366\n油口\t450367\n高尔夫之星\t450368\nTPG\t450369\n军事训练\t450370\ncontroler\t450371\n出岛\t450372\n中钢国际\t450373\n20151122\t450374\n摸一摸\t450375\n吕丽萍\t450376\n滇西科技师范学院\t450377\n舢板\t450378\n5公斤\t450379\n天正\t450380\n濑名步\t450381\n散散\t450382\n2K15\t450383\n劳神\t450384\n23:59\t450385\n稀美\t450386\n1万名\t450387\n第二单\t450388\nLPL2017\t450389\n道法\t450390\n预防接种证\t450391\n防城\t450392\n着力点\t450393\ni9100\t450394\n100R\t450395\n齐鲁制药有限公司\t450396\n病例\t450397\n烟悦网\t450398\n锅台\t450399\n历下区\t450400\n粗茶\t450401\nblasting\t450402\n说笑\t450403\n26种\t450404\n慢性乙型肝炎\t450405\n计量箱\t450406\n42岁\t450407\n抗锯齿\t450408\n木火\t450409\n明秀\t450410\nTSPT\t450411\nby香芋奶茶\t450412\n1.3亿元\t450413\n太神\t450414\n恐怖推理\t450415\n骑式车\t450416\n雷蒙德·卡佛\t450417\n知网空间\t450418\n实用程序\t450419\n格柏\t450420\n尿酸值\t450421\nsuddenly\t450422\n证券期货投资者适当性管理办法\t450423\n电动扶梯\t450424\n一元论\t450425\nDarknet\t450426\n戴志诚\t450427\n钥匙卡\t450428\n王莎莎\t450429\n追风\t450430\n莫敢\t450431\n异戊烷\t450432\n利来国际\t450433\n买卖机械网\t450434\n正八面体\t450435\npenbeat\t450436\nz77\t450437\n0522\t450438\n胡渣\t450439\n古松\t450440\n网红经纪公司\t450441\n武汉东湖学院\t450442\n当选\t450443\n神夜\t450444\n杭州法院\t450445\n上海新科医院\t450446\n红庄\t450447\n超级看影院\t450448\n英雄救美\t450449\nmemmove\t450450\n法压壶\t450451\n鲁塔比斯\t450452\n路桥集团\t450453\n义士\t450454\n此道\t450455\nsibyl\t450456\n患人\t450457\n虚设\t450458\n香烟盒\t450459\n3151\t450460\n金所\t450461\nsolitary\t450462\nWas\t450463\n摩尔纹\t450464\n编花篮\t450465\n4p变态性爱网\t450466\n爱民里小区\t450467\n李华月\t450468\n稀油站\t450469\n四阿哥\t450470\n豪侠\t450471\n钒酸铋\t450472\n飛雲若雪\t450473\n中葡股份\t450474\n2
下\t450475\n土地公文库\t450476\nmini8\t450477\nv10.0\t450478\nYUV420P\t450479\nshimano\t450480\n钻孔灌注桩\t450481\n1000问\t450482\n降本增效\t450483\n130路\t450484\n展具\t450485\n10.17\t450486\n郑州教育信息网\t450487\n为重\t450488\n获得者\t450489\n耀目\t450490\n硫酸钾\t450491\n败军\t450492\n130天\t450493\n8.0.11\t450494\n中华人民共和国兵役法\t450495\n翡翠观音\t450496\nalteration\t450497\n相劝\t450498\n逗斗车\t450499\nsoyo\t450500\nAom\t450501\n八羽怪\t450502\n倾覆\t450503\n万兴街\t450504\n水城县\t450505\n点睛网\t450506\npokerstars\t450507\n第二十三个\t450508\n中小写\t450509\n众声\t450510\n新秀丽\t450511\n支助\t450512\n私部\t450513\n霓虹国\t450514\n苍月十字架\t450515\n泥王\t450516\n湖岭\t450517\nt50\t450518\n小祖\t450519\nrepose\t450520\n紫荆苑\t450521\nandroid\t450522\n小额纳税人\t450523\n歌手2018\t450524\n收归国有\t450525\ntomcat8\t450526\n硕士学校\t450527\nruguo\t450528\n挺有意思\t450529\n金属球\t450530\n3.1G\t450531\n戊辰年\t450532\n夏利n3\t450533\n基口\t450534\nFrontEnd\t450535\n奥迪A3\t450536\n华字\t450537\nexperts\t450538\n最多一次\t450539\n王琼\t450540\nSF6断路器\t450541\n龙三\t450542\n钐钴\t450543\n版权方\t450544\n肌松药\t450545\n摔碗酒\t450546\n轻白\t450547\nmeiyou\t450548\n太坊\t450549\n卢塞恩\t450550\n半带\t450551\n福尔摩斯\t450552\n拉斯维加斯酒店\t450553\n票额\t450554\n胡世忠\t450555\nmut\t450556\n黄纪莹\t450557\n古武\t450558\n统派\t450559\n谈情说案\t450560\n3000多元\t450561\nAxure8\t450562\nIFM\t450563\n李春天\t450564\n汤谱\t450565\n物料盒\t450566\n幸福誉\t450567\n扩阴器\t450568\n柘城\t450569\n一气\t450570\n发号网\t450571\nthe5\t450572\n苯甲酰氯\t450573\n赫哲族\t450574\n缸机\t450575\n肛欲期\t450576\n飘柔\t450577\n4月4号\t450578\n红秀\t450579\n清稗类钞\t450580\n第三发\t450581\n银树\t450582\n泽锦\t450583\n张仃\t450584\n华为手环b3\t450585\n边边\t450586\n赛博朋克\t450587\n长期以来\t450588\n领导家\t450589\n女魔头\t450590\nuitabbarcontroller\t450591\nH漫\t450592\n怒操\t450593\n国际华语\t450594\nmasturbate\t450595\nZ4\t450596\n振动仪\t450597\n43页\t450598\n睿信\t450599\n幽魂\t450600\n精武鸭脖\t450601\n以防万一\t450602\nGPT\t450603\n根艺\t450604\n张科\t450605\n排长队\t450606\n紫金桥\t450607\nsor\t450608\nLeasing\t450609\n编包\t450610\n龙峡\t450611\n課程\t450612\n宜芝\t450613\nwindows7_Windo\t450614\nYEARS\t450615\nrevolutionary\t450616\nUniFi\t450617\nTWaver\t450618\
nIAI\t450619\n米奇妙妙屋全集\t450620\n进取心\t450621\n狡诈\t450622\nEAD\t450623\n中国电信福建公司\t450624\n粉口\t450625\n盐城工业职业技术学院\t450626\n心有独钟\t450627\n洛川群侠传\t450628\n中国作文网\t450629\n度计\t450630\n果昔\t450631\n梁配筋\t450632\n梧塘镇\t450633\n金孔雀\t450634\nlol掉帧\t450635\n精讲\t450636\n奥希替尼\t450637\nDevops达人\t450638\n06\t450639\n北京人\t450640\n大郅\t450641\n试论述\t450642\n合众汽车\t450643\n直线杆\t450644\n泷泽乃南\t450645\ncvn\t450646\n内置卡贴机\t450647\n庞克\t450648\n印摩罗天\t450649\n学硕\t450650\nwin32位\t450651\n10度\t450652\n配药\t450653\n玉泉路\t450654\n亚投行\t450655\n平湖人才网\t450656\nDetecting\t450657\n甜型\t450658\n00153\t450659\n非公式\t450660\n耐蚀\t450661\nexploer\t450662\n猪鼻贴\t450663\n集群版\t450664\ncrown\t450665\nc2265\t450666\n单打印机\t450667\n倍率\t450668\n1208\t450669\n8头\t450670\n化缘\t450671\n西秀区\t450672\n火狐Firefox\t450673\n2526\t450674\n八大胡同\t450675\n猪头人\t450676\n回家之路\t450677\nwagon\t450678\n0所\t450679\n院务\t450680\n李庚\t450681\nMeasure\t450682\ndier\t450683\ninitializer\t450684\n望而却步\t450685\n富贵门\t450686\n曹淑敏\t450687\nleader\t450688\n自叙帖\t450689\n化妆镜\t450690\ncos^2x\t450691\nservicing\t450692\n垫料\t450693\n是心非\t450694\njdbc连接池\t450695\ncouver\t450696\n孤山寨\t450697\n坐观\t450698\nserver2016\t450699\n橄榄枝\t450700\ntrials\t450701\nReward\t450702\ntp5\t450703\n何勇\t450704\nVPGAME\t450705\n别克新君越\t450706\n武汉公交集团\t450707\n尿囊素\t450708\n日事清\t450709\n维表\t450710\n岳西政府网\t450711\ncound\t450712\n载人航天\t450713\n第119集\t450714\n纳税信用等级\t450715\n固化\t450716\n眼形\t450717\n爱在黎明破晓前\t450718\n都市逍遥记\t450719\n概略\t450720\n剧院魅影\t450721\nreward\t450722\n缴款书\t450723\n恋姬\t450724\n玄武湖公园\t450725\n争霸战\t450726\n220g\t450727\n盲点\t450728\n9470m\t450729\n清迈酒店\t450730\n断面图\t450731\n双合家园\t450732\n処女\t450733\n1.19G\t450734\ns63\t450735\n芥末\t450736\n七丽时尚网\t450737\n残骸\t450738\n地市\t450739\n斯蒂芬·金\t450740\nplasmon\t450741\n大鹏新闻网\t450742\n餐台\t450743\nDayBreak\t450744\n原爆点\t450745\nLuckin\t450746\n暗黑破坏神II\t450747\n10刀\t450748\nlamer\t450749\ncm3d2\t450750\n系统分区\t450751\n安诺\t450752\n3月20号\t450753\n贷款损失准备\t450754\n走遍天下书为侣\t450755\n2107\t450756\n7z.dll\t450757\n宋超\t450758\n祸首\t450759\n作风\t450760
\nFidelio\t450761\n盖特\t450762\nMathWorks\t450763\n处女秀\t450764\n韩束\t450765\n十二五\t450766\n400万美元\t450767\n大嘴鸟\t450768\n弹弹\t450769\n衡水日报\t450770\n李非\t450771\n781\t450772\n盘州市\t450773\n麦穗鱼\t450774\n红裤\t450775\n云南省能源投资集团有限公司\t450776\n程序设计\t450777\n张国清\t450778\n丹寨万达\t450779\ndriven\t450780\n一剪梅\t450781\nHs\t450782\n炳\t450783\n石线\t450784\n这一节\t450785\n30P\t450786\n寰神结\t450787\n海琴湾\t450788\n柏林谍影\t450789\nKDL\t450790\n盐焗鸡翅\t450791\n离骚\t450792\nshuge\t450793\n分卫\t450794\n湖南智库网\t450795\n二十天\t450796\n伊斯特伍德\t450797\n比亚\t450798\n060\t450799\n张立平\t450800\n带路\t450801\n杨舒\t450802\n第十册\t450803\n租代购\t450804\n新疆公安厅\t450805\n玛瑙\t450806\ngos\t450807\n曹灿\t450808\n陆家嘴世纪金融广场\t450809\n168\t450810\n汉林\t450811\n巴别\t450812\n墨瑟\t450813\n竖幅\t450814\nM200\t450815\n即兴曲\t450816\n上古卷轴五\t450817\n金士\t450818\n滨湖街道\t450819\n浮台\t450820\n新疆青少年出版社\t450821\nkook\t450822\n变损\t450823\n通灵者\t450824\nhuangy\t450825\n密切联系群众\t450826\n广州律师事务所\t450827\n201851\t450828\n茵梦湖\t450829\n豆皮\t450830\n7.32\t450831\n奥尔加\t450832\n自由空间\t450833\n603214\t450834\n薛朗\t450835\nCCTV-13_央视网\t450836\n华彩\t450837\n圆体\t450838\nATJAVA\t450839\n特里斯坦\t450840\n省公安厅\t450841\n李素婉\t450842\n乐品\t450843\nreloadData\t450844\n揭面\t450845\nncverilog\t450846\nSouthampton\t450847\npromiscuous\t450848\n最远的你是我最近的爱\t450849\n咒符\t450850\nsendcloud\t450851\n作用量\t450852\n牛娃\t450853\nCommission\t450854\n调字\t450855\n机场\t450856\nIC交易网\t450857\n复绿\t450858\n吕洋\t450859\n仙桃中学\t450860\n坏境\t450861\n硬要\t450862\nnexus6\t450863\n许红霞\t450864\n表扬稿\t450865\n南沙岛礁\t450866\n极限挑战第四季\t450867\n高顿教育\t450868\n长沙医学院\t450869\n颠覆式\t450870\n灵修\t450871\nHPLaserJet\t450872\ncshell\t450873\n100万日元\t450874\n这儿\t450875\n郭老师\t450876\n品牌运营\t450877\n核电厂\t450878\n无权\t450879\n茶碱\t450880\n蹦极\t450881\n小女孩\t450882\n桃式\t450883\nLED投光灯\t450884\nRetroArch\t450885\n一人之下2\t450886\n星光广场\t450887\n西游女儿国\t450888\n比一比\t450889\n600315\t450890\n裴礼康\t450891\noppai\t450892\n庄寨镇\t450893\n博莱克\t450894\n考务\t450895\n津津乐道\t450896\n詹宁斯\t450897\n生死狙击圣光\t450898\n市场处\t450899\nLegacy\t450900\n宾夕法尼亚\t450901\n70周岁\t450902\n回想\t
450903\n蓝鲸智云\t450904\n不晕\t450905\n百度网盘/QQ旋风\t450906\n喷子们\t450907\n影评集\t450908\n实验师\t450909\n愚园\t450910\n5-6天\t450911\n周先生\t450912\nLAX\t450913\n迅维\t450914\n凯尔特人区\t450915\n郑晓静\t450916\n8.doc\t450917\nベッド\t450918\n306\t450919\n鲁北\t450920\n电子书库\t450921\n强令\t450922\n设问\t450923\n玉林师范学院\t450924\n醮\t450925\n欢声\t450926\n照片儿\t450927\nOperational\t450928\n鸡笼\t450929\n服务业\t450930\n自媒体人\t450931\n2018年四月\t450932\n51看图时尚网\t450933\n靖哥哥\t450934\nsaddle\t450935\n乌孙\t450936\n乍起\t450937\nreflects\t450938\n山梨酸钾\t450939\n因为遇见你\t450940\niThome\t450941\n京张高铁\t450942\nHackerVirus\t450943\n26650\t450944\n王永存\t450945\n绿萝花\t450946\ninsight\t450947\n龙峪湾\t450948\n奥德修斯\t450949\n岭上\t450950\n杀器\t450951\n世缘\t450952\n奥林巴斯显微镜\t450953\n细胞技\t450954\n28E\t450955\nbik\t450956\n中山市西区\t450957\n八里小区\t450958\n卡基们\t450959\n祈盼\t450960\n旱改水\t450961\nbali\t450962\n大理白族自治州\t450963\n小竹\t450964\nf2\t450965\n校报\t450966\n嘉宝\t450967\n青岛公司\t450968\n杜英\t450969\n榄核镇\t450970\nbase\t450971\nAuthenticator\t450972\n760元\t450973\n吸干\t450974\n工艺单\t450975\n方明\t450976\n寒枝\t450977\n吉林省农村信用社\t450978\n金华市人民医院\t450979\n张志辉\t450980\n周海宏\t450981\n广东省人民政府应急管理办公室\t450982\n金融工程学\t450983\n热讯\t450984\n真三国无双2\t450985\n构面\t450986\nemotional\t450987\n丧夫\t450988\nfertilizer\t450989\n人社\t450990\n八本\t450991\n建筑工程师\t450992\n孤胆车神维加斯\t450993\n司索\t450994\n串口号\t450995\n难以\t450996\n79_\t450997\n娱乐观\t450998\n戳\t450999\n城厢镇\t451000\n刘玉华\t451001\n异曲同工\t451002\n温故而知新\t451003\n与好\t451004\n防锈袋\t451005\n中国计量大学\t451006\n清刻本悦宋\t451007\n市民政局\t451008\nTOP11\t451009\nIBus\t451010\n蓝青\t451011\nccwow\t451012\n第20页\t451013\n泥壶\t451014\n青年湖小学\t451015\n教育署\t451016\n逆袭之爱上情敌\t451017\n李恩熙\t451018\n峡谷重案组\t451019\nmasm\t451020\n虎胆龙威\t451021\n徐熙媛\t451022\nMulticast\t451023\n长春市高新区\t451024\ndeputy\t451025\n巨源\t451026\n魔钢\t451027\n大商集团\t451028\n化妆瓶\t451029\nlinux-glibc2.5-x86\t451030\n为什么呢\t451031\n绝地求生模拟器\t451032\n快板\t451033\n红痘\t451034\n金延璟\t451035\nSylvie\t451036\n快乐男声\t451037\ngithub\t451038\n白值\t451039\n模型类\t451040\n通配符\t451041\n1.8.3\t451042\n胡夫金字塔\t451043\n云南过桥米线\t4510
44\n集训\t451045\n视\t451046\n烧毁\t451047\n5.0.3\t451048\nLess\t451049\n刘美\t451050\n李治亚\t451051\n山东大学艺术学院\t451052\n802.11ax\t451053\n先锋诗\t451054\n纠风办\t451055\n融冰\t451056\n10多种\t451057\n导航机\t451058\n慕尼\t451059\nxnalara\t451060\n社论\t451061\n法兰螺栓\t451062\n啼叫\t451063\nsubmenu\t451064\n拿来\t451065\n疏于\t451066\n19年前\t451067\n服务圈\t451068\n葛印卡\t451069\n昆明理工\t451070\n保质保量\t451071\n手绘吧_\t451072\n丰收村\t451073\n突破点\t451074\n天润数娱\t451075\n复训\t451076\n副作用\t451077\n360极速浏览器\t451078\n移风易俗\t451079\nvswitch\t451080\nTrivia\t451081\n第111号\t451082\nEternium\t451083\n摒\t451084\n警营\t451085\n沙井街道\t451086\n上安\t451087\n圆明\t451088\n不得姐\t451089\n离子风机\t451090\n叶落无心\t451091\n湿妹\t451092\n调理师\t451093\n红米1\t451094\n國\t451095\n都道府县\t451096\n猎巫\t451097\n胶州\t451098\n捧回\t451099\n会计要素\t451100\n盐酸吡格列酮片\t451101\n英俊\t451102\n文杯\t451103\n中国文化遗产研究院\t451104\n龙泉镇\t451105\n768\t451106\n肉核\t451107\n中华遗嘱库\t451108\n惊天动\t451109\n众评\t451110\n假面骑士ooo\t451111\n料\t451112\n云南省建设投资控股集团有限公司\t451113\n抗光\t451114\n徐文光\t451115\n聂耳\t451116\n查癌\t451117\n金蔷薇\t451118\n迷蒙\t451119\n合院\t451120\n盛世宠婚\t451121\n微商网\t451122\n蔗糖酶\t451123\n餐风\t451124\n路数\t451125\ngeom\t451126\n私厨\t451127\nR2016b\t451128\n考取\t451129\n铁帽子\t451130\n南宫市\t451131\nPsychological\t451132\n苍山\t451133\n40层\t451134\n小桥流水人家\t451135\n歌山\t451136\n2425\t451137\nlifted\t451138\nFLAC/\t451139\n踢毽子\t451140\nQFN\t451141\nAMS1117\t451142\n考纲\t451143\n轻言\t451144\n无价卡\t451145\n00级\t451146\n当班\t451147\n在下\t451148\n不好吃\t451149\n质惠版\t451150\n在暖\t451151\n粉墙\t451152\n种苗\t451153\nバ\t451154\n瑞彩\t451155\nlolvs\t451156\n小平米\t451157\n准许\t451158\nMacan\t451159\nAir2\t451160\nipad平板\t451161\n奥松板\t451162\nAdmissions\t451163\n中国航空旅游网\t451164\nTOP6\t451165\nise14.7\t451166\n目击者之追凶\t451167\n高院\t451168\n插值\t451169\nOBE\t451170\n宋体\t451171\n4.5亿\t451172\n对子\t451173\nSCDMA卡\t451174\n舌诊\t451175\nlanzhou\t451176\n笑点研究所\t451177\n招表\t451178\n不谓\t451179\n喉头\t451180\n弘法寺\t451181\n1232\t451182\n中华人\t451183\nHandsontable\t451184\ntradingview\t451185\n筐子\t451186\n白蕾丝\t451187\n得当\t451188\n须眉\t451189\n电脑闹钟\t451
190\n飞猫\t451191\n朱彝尊\t451192\n保险条款\t451193\n优蓝\t451194\nPAPAGO\t451195\nIOUtils\t451196\n得\t451197\ndistribute\t451198\n第四版\t451199\n横纹肌肉瘤\t451200\n象牙塔\t451201\n尿感\t451202\n287个\t451203\nBooth\t451204\n中国诗词大会\t451205\n永丰街道\t451206\nsaying\t451207\nDB2数据库\t451208\n胆汁淤积\t451209\n林兆明\t451210\n高义\t451211\n诱婚\t451212\n一号家居网\t451213\n网神\t451214\nlapse\t451215\nnpg\t451216\n亿房\t451217\n便池\t451218\nfuliba\t451219\n对辊破碎机\t451220\n2035年\t451221\n沙漠漆\t451222\nmyriad\t451223\n台东步行街\t451224\nMayu\t451225\n原穴\t451226\n郭姝彤\t451227\n有线电视线\t451228\n一转成双\t451229\nWiMAX\t451230\n翻生\t451231\n单点\t451232\n调心轴承\t451233\n圣剑\t451234\n配债\t451235\n干刷\t451236\nHomeland\t451237\n对话框类\t451238\n启闭\t451239\n遥控板\t451240\n家办\t451241\n天一阁\t451242\n巴拉\t451243\nSiberiaDante\t451244\n走向辉煌\t451245\n聚氨酯板\t451246\n南京工业大学研究生院\t451247\n多语\t451248\n日语片\t451249\n亚马逊河\t451250\n50例\t451251\n涮肉\t451252\n述评\t451253\n白日\t451254\n阳东\t451255\n23篇\t451256\nqileilove\t451257\n启蒙积木\t451258\nAirJordan\t451259\n天门房网\t451260\n影像学\t451261\n花简笔画\t451262\n陈星汉\t451263\n炭疽\t451264\n接触式\t451265\n多特软件\t451266\n中仓\t451267\n40年后\t451268\n土地使用权转让合同\t451269\n景田百岁山\t451270\nV币\t451271\nzxxs\t451272\n盘锦市人民政府\t451273\n安华\t451274\n160平方米\t451275\nvinci\t451276\n世纪城\t451277\n动车\t451278\n王若伊\t451279\n感德\t451280\n拥兵\t451281\n塘桥街道\t451282\n杀死比尔\t451283\n两刀\t451284\n遂川县\t451285\n超声波探伤仪\t451286\n祛痘\t451287\n女巫猎人\t451288\nStatistical\t451289\n生男\t451290\n悬挑板\t451291\n魔法科\t451292\n风吹\t451293\n案源\t451294\n居上\t451295\n3.5亿\t451296\n尚客优连锁酒店\t451297\n夫妻成长日记\t451298\nspa机\t451299\ninco\t451300\n张鱼\t451301\n动漫之家新闻站\t451302\n卡【\t451303\n百感交集\t451304\n65_\t451305\n恶寒\t451306\nREFA\t451307\n捎带\t451308\n进出库\t451309\n通风阀\t451310\n妇产专业\t451311\n大公娱乐_大公网\t451312\nICH7\t451313\n学后\t451314\n银基\t451315\n更大\t451316\n5.20\t451317\n啮合\t451318\n梁晨\t451319\n脑血管痉挛\t451320\n龙山文化\t451321\n广州协佳医院\t451322\n古田三路\t451323\npr值\t451324\n扫盲帖\t451325\nMOEN\t451326\n代持\t451327\n活剂\t451328\n6038\t451329\n一五\t451330\n庵\t451331\n普惠制\t451332\n下期中\t451333\n龙洲湾\t451334\n切·格瓦拉\t451335
\n凤逆天下\t451336\n烛芯\t451337\n上坡辅助\t451338\n财务费\t451339\n太鲁阁\t451340\n时程\t451341\n欲仙欲死\t451342\n柑橘味\t451343\n奥利司他片\t451344\n书法类\t451345\n文【言情小说吧\t451346\nappinventor\t451347\n平台级\t451348\n室上\t451349\n发货人\t451350\nFEVTE\t451351\n天珑广场\t451352\n章金莱\t451353\n墨印\t451354\n德阳市人民医院\t451355\n16.10\t451356\nrepositories\t451357\nsds\t451358\n冯潇霆\t451359\n石表\t451360\n遮不住\t451361\n带球\t451362\n大乘离文字普光明藏经\t451363\n慰菊\t451364\n小蝴蝶\t451365\nACAA\t451366\n功率谱\t451367\noptgroup\t451368\n亲历\t451369\n衣领\t451370\n徐州工业职业技术学院\t451371\n医护人员\t451372\n血泊\t451373\n刺客联盟\t451374\n原画师\t451375\nThinkStation\t451376\n极化\t451377\n改革性\t451378\n蒙昧\t451379\n空港城\t451380\nissue\t451381\n黄渍\t451382\n喷油泵\t451383\nsuites\t451384\n香莲\t451385\n2565周年\t451386\n先秦史\t451387\n白云大道北\t451388\n墨非\t451389\n柔性电路板\t451390\n非承载式\t451391\nHacked\t451392\n怀玉公主\t451393\n景旺\t451394\nM80\t451395\n六道\t451396\n生命表\t451397\n双目\t451398\n魔境仙踪\t451399\n试验\t451400\n4K超高清\t451401\n灭虫\t451402\n月溪镇\t451403\n陆逊\t451404\nLilly\t451405\n平庄镇\t451406\n李寅飞\t451407\n网易我的世界\t451408\n赵梅\t451409\nifox\t451410\n2265手游网\t451411\n康宁公司\t451412\n瀚银手付通\t451413\nRDA\t451414\n炖排骨\t451415\n层次分析法\t451416\n水洞\t451417\n碧浪\t451418\n汽车俱乐部\t451419\n翠湖苑\t451420\n于连\t451421\n亚马逊listing\t451422\n美度社区\t451423\n科天\t451424\n陈慧珊\t451425\n太阳能杀虫灯\t451426\n正片\t451427\n顶装\t451428\nGDAL\t451429\n兄弟情深\t451430\n1209\t451431\n辩证唯物论\t451432\n天比\t451433\n花节\t451434\nczj\t451435\nbytime\t451436\n2018年六月\t451437\n文代会\t451438\nsoco\t451439\n大庄村\t451440\n老镇\t451441\ndevc\t451442\n精华片\t451443\n塑造者\t451444\n戴文渊\t451445\nUnilever\t451446\n子帧\t451447\n差钱\t451448\n旋涡\t451449\n乾杯\t451450\n三不原则\t451451\n机料\t451452\n;\t451453\n3家\t451454\n三生天龙八部\t451455\nTIE\t451456\n10GB\t451457\n滴定管\t451458\n郑州财税金融职业学院\t451459\n20130809\t451460\n龙腾世纪\t451461\n张婷\t451462\n胜局\t451463\n移动营销\t451464\nEK\t451465\n汕头市中心医院\t451466\n月子\t451467\n波浪\t451468\n绣绣\t451469\nlifes\t451470\n安卓\t451471\n魅卓网\t451472\n郯城县\t451473\n52分\t451474\n室内设计\t451475\n九江开发区\t451476\nGBL\t451477\n导向\t451478\n然后\t451479\n美男高校地球防卫部\t45
1480\n房地产市场\t451481\n蜀山区\t451482\n鞠萍\t451483\n高轨\t451484\n意林\t451485\n争霸\t451486\n越秀金控\t451487\n活性\t451488\n深圳检验检疫局\t451489\n图像分割算法\t451490\n多头\t451491\nFusionCharts\t451492\n中国数据通信\t451493\nmyst\t451494\n鄞州银行\t451495\n预先\t451496\n哀哀\t451497\n天津南开\t451498\n供港\t451499\ncmsis\t451500\njack_Meng\t451501\ninstal\t451502\n提面\t451503\n248元\t451504\n红门\t451505\n刘岚\t451506\nTron\t451507\n一株\t451508\n家规\t451509\n薄利\t451510\n洁面膏\t451511\n养肝茶\t451512\n郭文\t451513\n波利\t451514\n界首吧\t451515\n东京喰种第三季禁播?东京食尸鬼第2季未解之谜\t451516\n刘汉元\t451517\n周文彰\t451518\n2018-03-19\t451519\n研究处\t451520\n环城路\t451521\nthinkphp\t451522\n变价\t451523\n读书吧\t451524\n刘晓艳\t451525\n大衣\t451526\nbilibil\t451527\n张新杰\t451528\n亚胺培南\t451529\n大沙头\t451530\n相衬\t451531\n康庄大道\t451532\n女服务员\t451533\n和方\t451534\n33号\t451535\n签证\t451536\n月租\t451537\n步进电动机\t451538\n孔令宏\t451539\nmores\t451540\n电子笔\t451541\n酸萝卜\t451542\n靓肤\t451543\n5角\t451544\n电线杆\t451545\n沃尔玛\t451546\nRice\t451547\n曲美家具\t451548\n火车票\t451549\nlamy\t451550\n2222\t451551\n广东博意建筑设计院有限公司\t451552\n烟头\t451553\n进面\t451554\n天勤\t451555\n欧阳雪\t451556\n4400元\t451557\n卫浴\t451558\n闲徕互娱\t451559\n铜山新区\t451560\n自治区卫生计生委\t451561\n征地\t451562\n重庆西南医院\t451563\n拭目以待\t451564\nGEASS\t451565\n穿衣经\t451566\n朱立伦\t451567\n128_\t451568\nrslogix\t451569\n开村\t451570\nShortcut\t451571\n其文\t451572\n童画\t451573\n润建通信股份有限公司\t451574\n宋轶\t451575\n碳氢油\t451576\n分散剂\t451577\n北京广安门医院\t451578\n几码\t451579\n李小萌\t451580\n搜狐浏览器\t451581\n柯受良\t451582\n蛱蝶\t451583\n乌迪尔\t451584\n腮风\t451585\n2012年7月\t451586\n母题\t451587\n粤东\t451588\n400首\t451589\n密封袋\t451590\n9.12\t451591\n2345网址导航\t451592\n判断表\t451593\n蛇魅\t451594\n中南大学软件学院\t451595\n非自然死亡\t451596\n老公们\t451597\n九遍\t451598\n橘子洲头\t451599\n暨南大学图书馆\t451600\nshadowsockets\t451601\n在父\t451602\nDRAW\t451603\n陈好\t451604\n齐心协力\t451605\n股权转让款\t451606\n清华城\t451607\n闪灯\t451608\nwinamp\t451609\n清化\t451610\n清泉石\t451611\nssd吧\t451612\n葛浩文\t451613\n后妻\t451614\n田慧\t451615\n另存为\t451616\n开头\t451617\ninet_pton\t451618\n谭定安\t451619\n捷西\t451620\n洗漱包\t451621\n小市场\t451622\n宝路华\t451623\n代欧奇希斯
\t451624\n三业\t451625\n自考毕业论文\t451626\n五优\t451627\n商品\t451628\n奇男子\t451629\n花艺\t451630\n佛山市第一中学\t451631\n大大\t451632\nBYTON\t451633\ne8500\t451634\n中国移动积分商城\t451635\n泊洛沙姆\t451636\nWORK\t451637\n逃不掉\t451638\n杭州万达广场\t451639\n傅老师\t451640\n咕咾肉\t451641\n斧柄\t451642\n李龟年\t451643\n7220\t451644\n子隐\t451645\n郝浩涵\t451646\n刘嘉卓\t451647\nSupernova\t451648\n胶水味\t451649\n深褐色\t451650\n曹玮\t451651\n输卵管积液\t451652\n平面控制网\t451653\n毫秒值\t451654\nworkstation\t451655\n知否知否应是绿肥红瘦吧\t451656\n七月七日\t451657\nwww559958con\t451658\nmasters\t451659\nミ\t451660\n丰田章男\t451661\n6233575\t451662\n科发\t451663\n珍君\t451664\n提提\t451665\n沙中\t451666\n赛普拉斯\t451667\nIAD\t451668\n网店\t451669\n停滞不前\t451670\n簧片\t451671\n听者\t451672\nCarlson\t451673\n许广\t451674\n鉴赏会\t451675\n网罩\t451676\nmac模拟器\t451677\n15万吨\t451678\n慎防\t451679\n100vh\t451680\nvcr\t451681\n大众T-ROC\t451682\n3#\t451683\nFIN\t451684\nkaricare\t451685\n谢飞机\t451686\n多措并举\t451687\n做自我介绍\t451688\n阳性率\t451689\n初犯\t451690\n宁国市\t451691\n为什么错\t451692\n广天\t451693\n拍戏时\t451694\nOCP\t451695\n代取\t451696\n|Sony\t451697\nMDN\t451698\n绝地求生按\t451699\n唐高祖\t451700\n鬓毛\t451701\n自然通风\t451702\n税后工资计算器\t451703\nping不通\t451704\n天龙比亚迪f3\t451705\n惊叹不已\t451706\n笔心\t451707\n验线\t451708\n三争\t451709\n高新兴科技集团\t451710\nweb页\t451711\n国域\t451712\n淘宝天猫商城\t451713\n美工刀\t451714\n中共九大\t451715\n医巫闾山\t451716\n刘集镇\t451717\nUbunt\t451718\n特瑞堡\t451719\n唱罢\t451720\n碧根果\t451721\n三狼\t451722\nSDT\t451723\n詹妮弗·劳伦斯\t451724\n郑璐\t451725\n黄瓜\t451726\n国家艺术基金\t451727\n2017春\t451728\nBLEACH\t451729\n利欧路\t451730\n上海第二军医大学\t451731\n特赦\t451732\nled球泡灯\t451733\n唐泽寿明\t451734\n精品赛\t451735\n行场\t451736\n海安县政府\t451737\n布里亚特\t451738\nblendtec\t451739\n沾沾\t451740\n2514\t451741\n西康\t451742\n风险评估表\t451743\n须臾\t451744\n汽车式起重机\t451745\n雪梨枪\t451746\nAWD\t451747\n偏距\t451748\n何成\t451749\n高校人才网\t451750\nrbs\t451751\nDou\t451752\nOGNL表达式\t451753\ndiea\t451754\n猎杀潜航4\t451755\nesn\t451756\n更需要\t451757\nSyrian\t451758\n代理点\t451759\n查税\t451760\n弃权\t451761\nf547\t451762\n速录员\t451763\n力克\t451764\n专心\t451765\n元胡\t451766\n排名前\t451767\n早教机\t45
1768\n情窦初开\t451769\nTOWER\t451770\n译名\t451771\n辛巴达\t451772\n夜游\t451773\n时鲜\t451774\n胡塞武装\t451775\n402\t451776\n医咖会\t451777\n振聋发聩\t451778\n石蕊试液\t451779\n鄂南\t451780\nь\t451781\n政府信息公开条例\t451782\n交通业\t451783\n美联社\t451784\n闵行区实验小学\t451785\ndielectric\t451786\n苏世民\t451787\n扇骨\t451788\n黯淡无光\t451789\n俞平伯\t451790\n美女图\t451791\nndiy\t451792\nihone\t451793\n金刚结\t451794\n借贷款\t451795\n网虫\t451796\nchiral\t451797\n书格\t451798\nр\t451799\n秦川\t451800\n600200\t451801\n珠海医院\t451802\n弗\t451803\noffice365\t451804\n理塘县\t451805\n张家界大峡谷\t451806\n失序\t451807\nworkbeach\t451808\n林带\t451809\n戴尔灵越7000\t451810\n水杉\t451811\n瀚博\t451812\n3dsmax2009\t451813\n微距镜头\t451814\n芝士网\t451815\n社会主义合心价值观\t451816\n变更有\t451817\n母函数\t451818\n托尼奖\t451819\n路沿石\t451820\n任教\t451821\n苦熬\t451822\n鸿业\t451823\n职业道德规范\t451824\n何建国\t451825\n陈明羽\t451826\nuboot\t451827\nvente\t451828\n家政妇\t451829\nsyc\t451830\n通宵达旦\t451831\n10类\t451832\n补水\t451833\n求指教\t451834\n行色\t451835\n香度\t451836\n中国医学科学院阜外医院\t451837\n亏损\t451838\n与众不同\t451839\nwrangler\t451840\nwanan\t451841\n浙西大峡谷\t451842\n20170118\t451843\n族长\t451844\n0438\t451845\n2017.12.15\t451846\n不得而知\t451847\n尾椎骨\t451848\n后进先出\t451849\nPools\t451850\n民革\t451851\nM3U8\t451852\n升旗手\t451853\n中共八大元老\t451854\n新爱婴早教中心\t451855\n样阀\t451856\n前进区\t451857\n弹线\t451858\n福州外语外贸学院\t451859\n喀左县\t451860\n大苏网\t451861\n江湖缥缈录\t451862\nandro\t451863\n小猫咪\t451864\n七月网\t451865\n坐守\t451866\n园博\t451867\n新能源汽车有限公司\t451868\n石塘人家\t451869\nspket\t451870\n记忆学\t451871\n补课\t451872\n粉红\t451873\n护乳\t451874\n接地电阻测试仪\t451875\n老街巷\t451876\nv1.0_\t451877\n中央车站\t451878\n三乙胺\t451879\n节票\t451880\n航天类\t451881\n黑云\t451882\n柯兰多\t451883\n哇塞\t451884\n制数\t451885\n新海诚\t451886\n20180119\t451887\n天府泰剧\t451888\n四柱八字\t451889\n1222\t451890\n竹里馆\t451891\n自驾游记\t451892\n抗菌谱\t451893\n试管\t451894\n广州市生活中心\t451895\ndocker-machine\t451896\n井冈山\t451897\n结尾处\t451898\n寒色\t451899\naster\t451900\n曹老板\t451901\n任宰范\t451902\n蚀刻液\t451903\namericans\t451904\n北京学校\t451905\n附灵\t451906\n通信段\t451907\n公安机关办理刑事案件程序规定\t451908\nkafei\t451909\nipz127\t45191
0\n路测\t451911\nFTD\t451912\n曹爽\t451913\n20160224\t451914\n祈福集团\t451915\n子宫\t451916\n展示台\t451917\npuyo\t451918\n第10版\t451919\n万金油\t451920\n恭\t451921\n汤姆熊\t451922\n肚疼\t451923\n0552\t451924\n陈立扬\t451925\n孤胆车神\t451926\n硫化银\t451927\n制卡\t451928\n_宫\t451929\n上海注册公司\t451930\n二西莫夫\t451931\n罗湖医院\t451932\neratw\t451933\n项目册\t451934\n135平\t451935\n網友\t451936\n中國\t451937\nchigga\t451938\nXSLT\t451939\n心界\t451940\n对联集\t451941\n2282\t451942\n中国风景园林学会\t451943\n$set\t451944\n警觉性\t451945\n第55期\t451946\n毒王\t451947\n62名\t451948\n超时空英雄传说\t451949\n排雷\t451950\nx500\t451951\n肖建国\t451952\n桃花红杏花白\t451953\n电子发烧友网\t451954\n新村\t451955\n金仕达\t451956\n西瓦\t451957\nHRS\t451958\n性力\t451959\n南特\t451960\n单峰\t451961\n黄山村\t451962\n复合酶\t451963\ndawnLynn\t451964\n输送线\t451965\n色心\t451966\n恒进\t451967\nInjury\t451968\n胆战心惊\t451969\n13价\t451970\nwifi摄像头\t451971\naspirant\t451972\n黄心颖\t451973\n24平方米\t451974\n伊贝\t451975\n无绳电话机\t451976\n钢琴游戏\t451977\nМ\t451978\n平房区\t451979\n天津阳光医院\t451980\n龙园山庄\t451981\n形兵\t451982\n6810\t451983\n复\t451984\nandroi\t451985\n混合动力汽车\t451986\nhoot\t451987\n8盘\t451988\n广州网\t451989\nSPU\t451990\n【卿国卿城\t451991\n[夜桜\t451992\n95583\t451993\n云梦吧\t451994\nNAFTA\t451995\n法宠\t451996\n陈园\t451997\n粘合机\t451998\nclassin\t451999\n基态\t452000\n昏黄\t452001\n20万台\t452002\nNSGA\t452003\n2017年7月19日\t452004\n8.5万\t452005\n林珈安\t452006\n尘寰\t452007\n诗曰\t452008\n扭紧\t452009\n腐剧\t452010\nSkullcandy\t452011\n会声\t452012\n冲激函数\t452013\n石锅鱼\t452014\n皇榜\t452015\n七七范文网\t452016\n便于\t452017\n护眼\t452018\nasf\t452019\n杨恒\t452020\n课书\t452021\n雪城\t452022\n院\t452023\n法罗迪斯\t452024\n寒山\t452025\n班轮\t452026\npangolin\t452027\n防护架\t452028\n科翔\t452029\n钱安\t452030\n研究所\t452031\n888\t452032\n方大集团股份有限公司\t452033\n王春艳\t452034\n朱炳寅\t452035\nSincerely\t452036\n荷兰航空\t452037\n刘健男\t452038\n盈飞\t452039\n巴松措\t452040\n记趣\t452041\n富力院士廷\t452042\n尚层装饰\t452043\n杜康\t452044\n帝国全面战争吧_\t452045\n黑砖\t452046\n我们的父辈\t452047\nBPL\t452048\n天虹店\t452049\n德威新材\t452050\n氟_\t452051\n金庸5\t452052\n饰演\t452053\n低纬度\t452054\n重估\t452055\n100万亿\t452056\n悲报\t452057\n搞笑漫
画\t452058\n猪腰子\t452059\nT300\t452060\nqc35\t452061\nmpm\t452062\n120题\t452063\n应纳税暂时性差异\t452064\n电动汽车网\t452065\n一技之长\t452066\n贵处\t452067\n咎\t452068\n借形\t452069\n减弱\t452070\n嫣\t452071\n徐燕\t452072\n幸田由真\t452073\n柯诺\t452074\nKNOX\t452075\n紫光股份\t452076\nCarla\t452077\n中信出版社\t452078\n本屌\t452079\nsubstances\t452080\n大观\t452081\n第三篇\t452082\nBEAUTY\t452083\n宋人\t452084\ngsxt\t452085\n四九城\t452086\n中银律师事务所\t452087\n陈悦\t452088\n民航资源网\t452089\n浅田真央\t452090\nGliffy\t452091\n2018年4月28号\t452092\n东风风神S30\t452093\n谷草转氨酶\t452094\n1-8月\t452095\njinru\t452096\nfrchen\t452097\n第八页\t452098\n串口调试助手\t452099\n说谎者\t452100\n李胜贤\t452101\n玉自寒\t452102\n张公\t452103\n电量变送器\t452104\n罢赛\t452105\n青岛高新区\t452106\n急性冠脉综合征\t452107\nhikari\t452108\n容量瓶\t452109\n_峰\t452110\n大丰镇\t452111\n速器\t452112\n内档\t452113\n珍珠美人\t452114\n_主音网\t452115\n赞颂\t452116\n9008\t452117\n派发员\t452118\n阿拉玛\t452119\nukiss\t452120\n忠烈\t452121\nPenang\t452122\n橡塑展\t452123\n一乙醇胺\t452124\n掉钱\t452125\n葛雷奥特曼\t452126\n51|51\t452127\ne5440\t452128\nCorpus\t452129\n隽\t452130\n牵马\t452131\n7.3.1\t452132\n徽商银行股份有限公司\t452133\n叶子猪新大话西游2\t452134\nsolo赛\t452135\n花月\t452136\nLM324\t452137\n使命召唤4:现代战争\t452138\n街舞社\t452139\n品台\t452140\n2018-03-28\t452141\n怪物猎人2g\t452142\n几包\t452143\n北冥大帝\t452144\n莫西干人\t452145\n山阴公主\t452146\n徐峥\t452147\n风险收益率\t452148\n九章\t452149\n江苏洋河酒厂股份有限公司\t452150\n09级\t452151\n智学网\t452152\n京万红\t452153\n没谈\t452154\ncyan\t452155\n侯海洋\t452156\n章程\t452157\n新倚天\t452158\n武学家\t452159\n大柳塔\t452160\napgy\t452161\nJinjiang\t452162\npr2017\t452163\n兼管\t452164\n5211\t452165\nintext\t452166\n内机\t452167\n软软\t452168\n铁血智将\t452169\n乐利来\t452170\n九江小学\t452171\nwjx\t452172\n国家汽车质量监督检验中心\t452173\n全民超人汉考克\t452174\n五维\t452175\n水阳\t452176\n1.5mm\t452177\n邯郸乐居网\t452178\n吴冰\t452179\n中外运敦豪\t452180\n乡亲们\t452181\n募款\t452182\n恭喜\t452183\nCygames\t452184\n实现类\t452185\n辈分\t452186\n校政\t452187\n郑云工作室\t452188\n禾\t452189\n件\t452190\n西夏\t452191\n手机银行卡\t452192\n国家中医药管理局\t452193\n堂\t452194\n人力资源社会保障部办公厅\t452195\n2000多名\t452196\n抬价\t452197\n免费加速器\t452198\nPrimeFaces\t4521
99\n我辈\t452200\n奚望\t452201\n强者之路\t452202\n十校\t452203\n锂精矿\t452204\n一言通天无弹窗_一言通天章节目录_一言通天\t452205\numol\t452206\n圣少女\t452207\n成人\t452208\n5块\t452209\n不怕不怕\t452210\nAmateur\t452211\n多面性\t452212\n舒化奶\t452213\n棒球衫\t452214\n芝麻分\t452215\n刁刘氏\t452216\n200万美元\t452217\n下庄\t452218\n石爪山\t452219\n5任\t452220\nyoungjoy\t452221\n赶\t452222\n魔兽争霸2\t452223\n香丹\t452224\n短母\t452225\n三氯化铁\t452226\n豆趣网\t452227\n重力眩晕\t452228\n夜茶\t452229\n第三十二\t452230\nshh\t452231\n海南黄花梨\t452232\n巴黎欧莱雅\t452233\n立户\t452234\n中天飞鸿\t452235\n最后一样\t452236\n半天\t452237\n季报\t452238\n攝影\t452239\n一马\t452240\nImpulse\t452241\nJiSight\t452242\n蒸汽喷射器\t452243\n大红区\t452244\nbiguz\t452245\nCustomize\t452246\n呼吸声\t452247\n今世缘\t452248\n2018年4月29\t452249\n阿德莱德\t452250\n法剧\t452251\ntorque\t452252\nPouch\t452253\nSecure\t452254\n嘉宾路\t452255\nrabiribi\t452256\n6x4\t452257\n余英时\t452258\nIvanka\t452259\n济宁机场\t452260\n桂语\t452261\n杠铃杆\t452262\n此页\t452263\nGogland\t452264\n光纤熔接\t452265\nT0\t452266\nthumbnailator\t452267\n中国长城资产管理股份有限公司\t452268\n杆杠\t452269\n抵贷\t452270\n竹编\t452271\n上饶人\t452272\n注油器\t452273\n步知网\t452274\nGUERLAIN\t452275\n玛歌红酒\t452276\n换位\t452277\n王大伯\t452278\n金粉\t452279\n83530330\t452280\n237号\t452281\n环段\t452282\nemn\t452283\nCapsules\t452284\n中国银行澳门分行\t452285\n迁址\t452286\nedg战队\t452287\n翻盖机\t452288\n乌合背影\t452289\n维萨拉\t452290\n普适性\t452291\n海乐\t452292\n重庆市农业委员会\t452293\nCrate\t452294\n搓板\t452295\nExcel函数\t452296\n公证件\t452297\n非常可爱\t452298\nvib\t452299\n竞技版\t452300\nAnySDK\t452301\n120214\t452302\n珠海火车站\t452303\n服装展\t452304\ninterpark\t452305\n建设局\t452306\n大富豪3\t452307\n幼儿画\t452308\n好客\t452309\n农科村\t452310\n嘲笑鸟\t452311\n超级块\t452312\n离港\t452313\n唐卡网\t452314\n孟丹\t452315\n傻瓜机\t452316\nwinddows\t452317\n阿娘\t452318\nusc\t452319\n丛珊\t452320\n河妖\t452321\n粘胶\t452322\n柳北区\t452323\n要十不要\t452324\n神券\t452325\n电脑锣\t452326\nwinform\t452327\n深藏功\t452328\n4200u\t452329\n本田竞瑞\t452330\n第二稿\t452331\n右膝\t452332\n久裕\t452333\n试探性\t452334\n霍景良\t452335\n宫颈癌\t452336\nHP1020\t452337\n合抱\t452338\n赤水河谷\t452339\n第三部\t452340\n金凤路\t452341\n2018年5月1日\
t452342\n瑞风R3\t452343\nperi\t452344\n?子\t452345\nZ420\t452346\nLexical\t452347\n偷着乐\t452348\n道系\t452349\n受付\t452350\n邪恶h全彩漫画少女_肉番动漫_色列漫画大全_工口漫画\t452351\n小腿骨\t452352\n伽马函数\t452353\n160cm\t452354\n4110\t452355\nbambino\t452356\n组织学\t452357\n皂液\t452358\nkarma\t452359\n佥\t452360\nTranslation\t452361\npoy\t452362\n潍坊国际风筝会\t452363\nluaide\t452364\nQuadratic\t452365\n飞冲天\t452366\ncarriers\t452367\n中骏集团\t452368\n18年\t452369\n陈列架\t452370\n0.6kg\t452371\n120G\t452372\n高信\t452373\n委托\t452374\nsec\t452375\n争当\t452376\nPing通\t452377\n水量\t452378\n永州网\t452379\n党小组\t452380\n市北高新\t452381\n洗手台\t452382\nxe7\t452383\n假言\t452384\n千亿\t452385\n钢兵\t452386\n实变函数\t452387\n线性变换\t452388\nCUMT\t452389\n先祖\t452390\n20180222\t452391\n10针\t452392\naql\t452393\nbiu\t452394\nmanven\t452395\n润享\t452396\n乐游\t452397\n春芝堂\t452398\n卫仕\t452399\n开饭\t452400\n牛奶味\t452401\n星版\t452402\n相映\t452403\n报头\t452404\n益丰\t452405\nnecklace\t452406\n减值准备\t452407\n杨店\t452408\n阿拉什\t452409\n冬月枫\t452410\n生命时速\t452411\nRizhao\t452412\n2018年04月12日\t452413\n4.1\t452414\n轻石\t452415\nfaiss\t452416\n6700hq\t452417\nconcurrent\t452418\n何满子\t452419\n布鲁斯\t452420\n面面相觑\t452421\n滚水网\t452422\n疾风\t452423\n波瑠\t452424\n五马街\t452425\n死与生\t452426\n法布尔\t452427\n九位\t452428\navery\t452429\n七姐妹\t452430\n南阳地区\t452431\n自宅\t452432\n打人机\t452433\n搜狗旅行翻译宝\t452434\n超清\t452435\n助餐\t452436\n盐泽\t452437\n米酒\t452438\n周哲\t452439\n宜商\t452440\n0063\t452441\n惊坐\t452442\n滚压机\t452443\nclass\t452444\n260mm\t452445\n突然间\t452446\n马来西亚币\t452447\n曹卫东\t452448\n叉\t452449\n7FRESH\t452450\nLLM\t452451\n能量块\t452452\n建章立制\t452453\n倒影\t452454\n中粮大悦城\t452455\n头孢呋辛\t452456\n300千克\t452457\nLightGBM\t452458\n气垫\t452459\nappreciation\t452460\n疯狂麦克斯\t452461\n榆钱\t452462\n张海明\t452463\n異\t452464\n鹤山\t452465\nopenmv\t452466\n刮伤\t452467\n185万\t452468\n2018年5月8日\t452469\n维德\t452470\n亲和素\t452471\n099\t452472\n泡沫盒\t452473\n691\t452474\n上海高金\t452475\n大成基金管理有限公司\t452476\n司考\t452477\n桥本隆\t452478\n科城山庄\t452479\njasig\t452480\nArr\t452481\nWeapon\t452482\n困累\t452483\n聚脂纤维\t452484\n威海市城乡建设委员会\t4
52485\n4399航海王启航\t452486\n_里番工口福利\t452487\n新联电子\t452488\n支付业\t452489\nAG亚游集团\t452490\n衢州火车站\t452491\n招生简章_北京幼升小网\t452492\n糖尿\t452493\nIslamic\t452494\ncaxa2015\t452495\n马海\t452496\nfeasible\t452497\nzyupload\t452498\n紫钻\t452499\nlxjshuju\t452500\n千人糕\t452501\n别入\t452502\n四盒\t452503\n素颜\t452504\n诺梵\t452505\n零几天\t452506\n60v20a\t452507\n瓜州县\t452508\n三夫户外\t452509\npdp\t452510\nLJ2200\t452511\n妞博\t452512\n合肥广播电视台\t452513\n岩佐あゆみ\t452514\n犹存\t452515\n数控技术专业\t452516\n6.22\t452517\n钢架\t452518\nBeliever\t452519\nnodejs\t452520\nGuy\t452521\n北控集团\t452522\n最高人民法院公报\t452523\nfateg\t452524\n蓝天画\t452525\nWo\t452526\n画笔\t452527\n深信服科技\t452528\n李茶的姑妈\t452529\n完好\t452530\n护栏网\t452531\n快玩\t452532\n云拍网\t452533\n证据\t452534\n胰岛素\t452535\n140平\t452536\n先导式\t452537\n婚姻家庭法\t452538\nGreenplum\t452539\n环境标志\t452540\n贵州省统计局\t452541\n长城风骏6\t452542\nスマホ\t452543\nLYU\t452544\nSOHO中国\t452545\n全神庙\t452546\n2块\t452547\n上海历史博物馆\t452548\n桂林西\t452549\n忠州\t452550\nreunion\t452551\nps4\t452552\np1\t452553\n预言\t452554\n乐高主题公园\t452555\n切机\t452556\n会客室\t452557\n个人工商户所得税计算器\t452558\n西奈半岛\t452559\n柔印机\t452560\n中南建筑设计院\t452561\n南山南\t452562\n5小时后\t452563\n布筋\t452564\n求得\t452565\n刀客家族的女人\t452566\n捞渣机\t452567\n为我所用\t452568\n冈萨雷斯\t452569\n戴琳\t452570\n清明雨\t452571\n解锁\t452572\n胡路区法院\t452573\n鹏哥\t452574\n滨洲\t452575\n挺好\t452576\n秧\t452577\nWebAPP\t452578\n神农架\t452579\n个人养老保险金查询系统\t452580\nDuan\t452581\n玛纳斯县\t452582\n婺源县人民政府\t452583\n桜空\t452584\neD2\t452585\n定制化\t452586\n何物\t452587\n购并\t452588\nExtreme\t452589\n一馆\t452590\n开国元勋\t452591\nPixi\t452592\n四川托普信息技术职业学院\t452593\n另娶\t452594\nDOW\t452595\n可爱的女友\t452596\num\t452597\n适合\t452598\n搓纸轮\t452599\n苏州观前街\t452600\n骄人\t452601\n萨利机长\t452602\n600mg\t452603\n80多个\t452604\nkawaks\t452605\n文言文\t452606\n林可霉素\t452607\n迈克尔贝\t452608\n平冷\t452609\noa\t452610\n嘴图\t452611\n石家庄东站\t452612\n吉利金刚\t452613\n浮锚杆\t452614\nISO文件\t452615\n写照\t452616\n鼠绘\t452617\n母歌\t452618\n植被指数\t452619\n世界级\t452620\nDatalogic\t452621\n新型农村合作医疗\t452622\n防尘布\t452623\n开拆\t452624\n三十七\t452625\n一粒粒\t452626\n昨夜星辰昨夜
风\t452627\n长尾\t452628\n李涛\t452629\n艳绝\t452630\n侍奉\t452631\nGraham\t452632\n自顶向下方法\t452633\nWhere\t452634\n铝酸钠\t452635\n苏宁悦城\t452636\ntandem\t452637\n奥坎\t452638\n骨质增生\t452639\n塔山镇\t452640\n北京市\t452641\n红外线感应器\t452642\n传屏\t452643\n抑郁质\t452644\n英国莱斯特大学\t452645\n小白文\t452646\n纳斯卡\t452647\n闷闷\t452648\n106项\t452649\nCreek\t452650\ngayboy\t452651\n律师库\t452652\nPROFINET\t452653\n张晓辉\t452654\n抽象性\t452655\nbanker\t452656\n2.10.4\t452657\n陈笑笑\t452658\n喜剧总动员第二季\t452659\nZUZU\t452660\n安建\t452661\n招生办\t452662\n石鲁\t452663\nTMD\t452664\n2号位\t452665\n40.8\t452666\n宝马微微一笑很倾城\t452667\n理科\t452668\n200平方\t452669\n过氧化物酶\t452670\n希箭\t452671\n群殴\t452672\n发债\t452673\n7655\t452674\nmatlab7.0\t452675\n外感\t452676\ndsp\t452677\nPasadena\t452678\n蔊菜\t452679\n爱你一万年\t452680\nsrping\t452681\n04月29日\t452682\n内壁\t452683\n假腿\t452684\n养生粥\t452685\n寄明月\t452686\n萨格罗斯\t452687\n八戒网\t452688\n二三线\t452689\n聋\t452690\n海青\t452691\nSpigotMC\t452692\n5488\t452693\n同辉\t452694\nmaroon5\t452695\n龙山新闻网\t452696\n天堂寨\t452697\nsdzk\t452698\nds6\t452699\n遇龙河\t452700\n洋品\t452701\n军宠\t452702\nsmile001\t452703\nppsspp\t452704\n亭台楼阁\t452705\n祁华路\t452706\n航天员\t452707\n移民公司\t452708\n302\t452709\nE+\t452710\n石墨\t452711\n被发\t452712\n敲打\t452713\n上海江城皮肤病医院\t452714\n惠更斯\t452715\n超临界\t452716\n1.6.5\t452717\n周庄嘉园\t452718\n推拿\t452719\ninstall\t452720\n镁带\t452721\n成品油消费税\t452722\n陈笑\t452723\n淤积\t452724\n14:00\t452725\n威伯科\t452726\n北京公务员考试网\t452727\nZXV10\t452728\n非常静距离\t452729\nKee\t452730\n企业工商注册代理\t452731\nAMBER\t452732\nIMU\t452733\njz2440\t452734\n起步价\t452735\n社会科学处\t452736\n国家工商行政管理总局商标评审委员会\t452737\nvanity\t452738\n200张\t452739\n工程热物理学报\t452740\n管理型\t452741\nblcblc\t452742\n富途证券\t452743\nfarcry4\t452744\n临额\t452745\n登陆页\t452746\n建设村\t452747\n千头\t452748\nyoule\t452749\n20头\t452750\n圆柱型\t452751\n夜花\t452752\n蓝黛\t452753\nstrongSwan\t452754\nliferay\t452755\n野生型\t452756\n昆明医科大学海源学院\t452757\n正典\t452758\n银环\t452759\n招降\t452760\n湖北省人民政府\t452761\n6066\t452762\n商业\t452763\n公路赛\t452764\n高式\t452765\n贾文龙\t452766\n有所谓\t452767\n罗德里格斯\t452768\n林前
\t452769\n猫性\t452770\n大竹县\t452771\n侠盗猎车\t452772\n第三根\t452773\nemlog\t452774\n宏宇陶瓷\t452775\nClustering\t452776\n华泰小区\t452777\ntiktok\t452778\n山崖\t452779\nfdy\t452780\n墨海\t452781\n摩肩接踵\t452782\n中国南方航空\t452783\n财务净现值\t452784\n竹排\t452785\n峨眉山旅游网\t452786\nToward\t452787\n维克特利奥特曼\t452788\n大部门\t452789\n中南财经政法大学\t452790\n小鬼当家2\t452791\n一衣\t452792\n毕棚沟\t452793\n纸飞机\t452794\n墨菊\t452795\n纯债基金\t452796\n云散\t452797\nVS2015|Visual\t452798\nsrpingboot\t452799\n泵批\t452800\n毓璜顶\t452801\n档案员\t452802\n非标件\t452803\n90句\t452804\n1.02\t452805\n三氧化二铝\t452806\n一川\t452807\n色谱\t452808\nspecification\t452809\n加藤嘉一\t452810\n一摞\t452811\n管宁\t452812\n大宋\t452813\n邮币\t452814\n洗车台\t452815\n刨床\t452816\n焰灵姬\t452817\n朝阳新闻网\t452818\n哈曼卡顿\t452819\noctave\t452820\nwin1\t452821\nhazop\t452822\n70美元\t452823\n倒班\t452824\n二穴\t452825\n这个\t452826\n风雨交加\t452827\n2019年5月\t452828\n海思麒麟970\t452829\n逃妃\t452830\n系统内存\t452831\n劲草\t452832\n阿贝尔\t452833\nweb4easyui\t452834\nAguilera\t452835\n大化县\t452836\n福禄寿\t452837\nWCL\t452838\nbe&#160\t452839\n里线\t452840\n互见\t452841\n龙之谷手游\t452842\ntwinmotion\t452843\n2211\t452844\n普湾新区\t452845\nQuebec\t452846\nViho\t452847\n拜伦\t452848\ncomposites\t452849\n中国海洋石油集团有限公司\t452850\nfisherman\t452851\n慢性非传染性疾病\t452852\ncf兰\t452853\n跑步鞋\t452854\n史赛克\t452855\n宝运莱\t452856\n散播\t452857\n坏块\t452858\nVersa\t452859\n十二式\t452860\n暗黑3吧_\t452861\n银婚\t452862\n曲美\t452863\nmarching\t452864\n尿期\t452865\n欧珀\t452866\n无锡车管所\t452867\n幽谷\t452868\n今日美术馆\t452869\nvali\t452870\n85SS\t452871\nTopic\t452872\n摇奖\t452873\n红外线治疗仪\t452874\n研究生学\t452875\n吴中经济开发区\t452876\n智能电表\t452877\n大米有限公司\t452878\n瓶片\t452879\n好康\t452880\n许舜英\t452881\n汉化版\t452882\n优宁维公司\t452883\n李金枝\t452884\n李春明\t452885\n空气净化剂\t452886\n科摩\t452887\n蠢蛋\t452888\n伊基塔\t452889\nAbe\t452890\n锅炉工\t452891\n亲征\t452892\n新民周刊\t452893\n十二生肖风水算命网\t452894\n体育运动\t452895\n南大\t452896\nhtml5\t452897\n冠军城\t452898\ngongji\t452899\nm2e\t452900\n中国回收商网\t452901\n合肥新城\t452902\n朝思暮\t452903\n御翔\t452904\n桂花\t452905\n余味\t452906\n广州日报大洋网\t452907\nMultiBeast\t452908\n微云\t452909\nso
laris11\t452910\nsymbols\t452911\n凉垫\t452912\nhb\t452913\n杨海燕\t452914\n火箭发射\t452915\n仙水\t452916\nupgrading\t452917\n3月后\t452918\ncurved\t452919\n只剩下\t452920\n2.75\t452921\n兖州市\t452922\n山货\t452923\nyip\t452924\n风湿病\t452925\n餐票\t452926\n有意义\t452927\nV2.1.1\t452928\n解消\t452929\n水亭\t452930\n彩虹伞\t452931\n1000场\t452932\n非金属\t452933\n七里庙\t452934\nz87\t452935\njoblib\t452936\n意外之旅\t452937\n豪生\t452938\n朱江洪\t452939\n年轻姑娘\t452940\nCoed\t452941\n2718\t452942\n天树\t452943\n龙岗中心区\t452944\n深港驾校\t452945\n太平金融大厦\t452946\nGBK\t452947\n流\t452948\n第253集\t452949\n产气\t452950\n躯壳\t452951\nisabella\t452952\n仪仗队\t452953\n双影\t452954\n张欢\t452955\nmlcc\t452956\n海外英语\t452957\n1019\t452958\n九星天辰诀\t452959\n夹层融资\t452960\n11b\t452961\n光皮\t452962\n地牢猎手5吧\t452963\n关注\t452964\n瘦腿霜\t452965\n石矶娘娘\t452966\nQVOD\t452967\nmob\t452968\n清华同方太阳能\t452969\n谈谈\t452970\n张权\t452971\n办公桌椅\t452972\n阴图\t452973\n辉夜姬\t452974\n验签\t452975\n热带鱼\t452976\n一回\t452977\n青\t452978\n第1届\t452979\n走麦城\t452980\nTBBT\t452981\n悉知\t452982\n康康\t452983\n无限极分类\t452984\n拉勾\t452985\n融创春晖十里\t452986\n查体\t452987\n花卉纹\t452988\n苍南中学\t452989\n恐婚\t452990\n数据型\t452991\n日清日\t452992\n尹新月\t452993\n6L\t452994\n拆迁方\t452995\n中国加盟网\t452996\nBuick\t452997\ntankaixiong\t452998\n闪光少女\t452999\n最搞\t453000\n换房\t453001\n圣诞之吻\t453002\n三十五年\t453003\n济南市区\t453004\ninterger\t453005\n汉德\t453006\ndilemma\t453007\n一生所爱\t453008\n20150725\t453009\n海景房\t453010\n数字许可证\t453011\ncharge\t453012\n郓城\t453013\n汪劲松\t453014\n陈琪\t453015\n元泰\t453016\n付汇\t453017\n南渡北归\t453018\n龙华新区\t453019\n毛晓彤\t453020\n成人考试\t453021\nRevit族\t453022\nFOREST\t453023\n负心\t453024\n于磊\t453025\nKeyError\t453026\n中国非公立医疗机构协会\t453027\n铝银浆\t453028\nTelegraf\t453029\n财经系\t453030\n敬意\t453031\n国字\t453032\n退款率\t453033\n雷蛇灵刃潜行版\t453034\n建三江\t453035\n独立声卡\t453036\n星辰漫画网\t453037\n两只老虎\t453038\n好枪\t453039\namd吧\t453040\n铝线\t453041\n1172\t453042\n扣非\t453043\n越西县\t453044\n将军山小学\t453045\n北邮\t453046\n蓝颜知己\t453047\ndmis\t453048\nGame234游戏\t453049\ndhs\t453050\ngama\t453051\n许多个\t453052\n铺位\t453053\n中国酒店\t453054\nexactly\t
453055\n艾萨克\t453056\n复习卷\t453057\nliebiao\t453058\n风罩\t453059\n诱妻入怀\t453060\n纪委会\t453061\nUIPage\t453062\n朱晓明\t453063\n滑草\t453064\n天津市人民政府\t453065\n企业管理咨询\t453066\n丁香结\t453067\n脉冲电磁阀\t453068\n保价费\t453069\nITUNES\t453070\nstyle=\t453071\n第共\t453072\n帝国的毁灭\t453073\n胶鞋\t453074\nlpa\t453075\n浙师大\t453076\n2艘\t453077\nMol\t453078\nparser\t453079\n仁科\t453080\n第4项\t453081\n合富\t453082\n牛文\t453083\n圣芭芭拉\t453084\nassay\t453085\n亚特兰大机场\t453086\n鸡饭\t453087\n2145\t453088\n质量流量计\t453089\n2.8D\t453090\n海纳川\t453091\n结块\t453092\n红块\t453093\n映月\t453094\n数3\t453095\n十三个\t453096\nMUGEN\t453097\n我们在行动\t453098\n亿光\t453099\n全画幅单电A99II\t453100\n甄珍\t453101\n愛奇藝臺灣站\t453102\n68种\t453103\n脸皮厚\t453104\nTHREE\t453105\nPictureBox\t453106\nminitab\t453107\n坦尾\t453108\n梁世赞\t453109\n美国末日\t453110\n董小飒\t453111\nODR\t453112\nFIS3\t453113\n全年\t453114\n黄祖\t453115\n7820\t453116\nsails\t453117\n奇想\t453118\n太阁立志传5\t453119\n深圳会展中心\t453120\nwify\t453121\n15吋\t453122\n四柱预测学\t453123\nIncentives\t453124\natari\t453125\n双鹤\t453126\n天津地铁8号线\t453127\n江西大学\t453128\n山猪\t453129\n/span\t453130\n金融街购物中心\t453131\njinhua\t453132\n野叟\t453133\n群兴玩具\t453134\n遛弯\t453135\nX光\t453136\n20161004\t453137\n200小时\t453138\ngloria\t453139\n穹妹\t453140\n最美\t453141\n北京师范大学实验幼儿园\t453142\npdo\t453143\nptc\t453144\n长治学院\t453145\n厚实\t453146\n谢东\t453147\n小汤圆\t453148\n各自为政\t453149\n罗洛\t453150\n德庆\t453151\n清河半岛\t453152\n新特点\t453153\nhorsetail\t453154\n6枚\t453155\n长沙湾\t453156\n缓冲液\t453157\n第二颗\t453158\n蕲水清泉寺\t453159\n侵略战争\t453160\n9006\t453161\n无聊的人\t453162\n钢砂\t453163\nDyn\t453164\n吉登斯\t453165\n剑灵洪门崛起\t453166\n通讯录\t453167\n医药学\t453168\n尾喉\t453169\n白边填充液\t453170\n百得胜\t453171\nes2015\t453172\nprimera\t453173\n人大附中分校\t453174\n讲得好\t453175\n塑美\t453176\n好高骛远\t453177\n催婚\t453178\n车身\t453179\n第二十五届\t453180\nfx53vd\t453181\n第19期\t453182\n东方花园\t453183\n梅州市\t453184\n开枝散叶\t453185\nw3c\t453186\n广州市第五中学\t453187\n20160905\t453188\n李松\t453189\ncalss\t453190\n阿拉斯加\t453191\n深圳万象城\t453192\n立柜式\t453193\n方青卓\t453194\n陈玺达\t453195\n丰庄\t453196\n换血\t453197\n斜盘式\t453198\
nPATH环境变量\t453199\n皮肤病血毒丸\t453200\ngt3\t453201\n52brain.com\t453202\n坦克世界闪击战吧\t453203\n取之不尽\t453204\n策源\t453205\naide\t453206\n350克\t453207\nCode4\t453208\ngpx\t453209\n笑容\t453210\n优班\t453211\n2.zip\t453212\n高加林\t453213\n大火车\t453214\nmaternal\t453215\n日本自卫队\t453216\n职务侵占罪\t453217\n天坑群\t453218\ndisclose\t453219\n_手机学院\t453220\n伊瓜苏瀑布\t453221\n沙河口区\t453222\nparticipant\t453223\n白夜追凶\t453224\n百度认证\t453225\n玫瑰香\t453226\n御窑\t453227\n实外西区\t453228\n维瑟嘉德\t453229\nx3d\t453230\n写段\t453231\n盖网\t453232\n182\t453233\nErrcode\t453234\n賞\t453235\n东力\t453236\n金沙湾\t453237\n巴城\t453238\nノンケ\t453239\n青瓦房\t453240\n固体\t453241\n阿古斯防卫军\t453242\n少儿趣\t453243\n暴林\t453244\nfxaa\t453245\n租车网|租车信息|汽车租赁公司\t453246\n14w\t453247\n咨客\t453248\n百度联盟\t453249\n酷有拿货网\t453250\n云东方\t453251\n投标报价\t453252\n河北省粮食局\t453253\nwww.a5.net\t453254\n爱我中华\t453255\n20160508\t453256\n惊蛰\t453257\n碟盒\t453258\nFlares\t453259\n寰宇浏览器\t453260\nargparse\t453261\n雌雄\t453262\n统计员\t453263\nGetLastError\t453264\n三更2\t453265\n自流井\t453266\nThousands\t453267\n611\t453268\n人力资源管理师\t453269\n书展\t453270\n阿妹\t453271\n高泌乳素血症\t453272\nTrave\t453273\nExcel版\t453274\n梨花带雨\t453275\n宁夏回族自治区财政厅\t453276\n三利\t453277\n科罗拉多大峡谷\t453278\n无线传\t453279\n鬼泣\t453280\n中土世界:暗影魔多\t453281\n三峡广场\t453282\nlwp\t453283\n七个小时\t453284\n庆五一\t453285\nshabet\t453286\n不足额\t453287\n东郊记忆\t453288\n400mg\t453289\n土工膜\t453290\nAppeal\t453291\n佳贝艾特\t453292\naria2c\t453293\n24awg\t453294\n神厨\t453295\n吉他谱C\t453296\n纪嫣然\t453297\n火狐邮箱\t453298\n插牌\t453299\n热血雨燕\t453300\n任杰\t453301\n密闭门\t453302\n无功补偿控制器\t453303\n过敏鼻炎\t453304\n雷政富\t453305\nT80\t453306\n有力\t453307\n方柱\t453308\n天天电玩城\t453309\n查克拉\t453310\n军师联盟2\t453311\n牛爷\t453312\n弹吧|钢琴谱|吉他谱|钢琴曲\t453313\n烟道\t453314\n単\t453315\n六年后\t453316\n奇快\t453317\n浄\t453318\n道氏技术\t453319\n比埃拉\t453320\n自有资产\t453321\n工业品出厂价格指数\t453322\n健康百科\t453323\n129\t453324\newf\t453325\n二五零\t453326\n罗马全面战争2\t453327\n勿扰\t453328\nalicesoft\t453329\n儒雅\t453330\nnp\t453331\n保定北市区\t453332\n让·雷诺\t453333\n石头路\t453334\n排水板\t453335\n信标\t453336\n维特拉\t453337\n8.26\t453338
\nsiblings\t453339\nramdisk\t453340\n青岛交通银行\t453341\n完美好\t453342\n坑纸\t453343\n党的十九大报告学习辅导百问\t453344\n孔凡礼\t453345\n合成师\t453346\nmamma\t453347\n封尘网\t453348\n永远的家\t453349\nGoToMeeting\t453350\nSPT\t453351\n马甲包\t453352\n赵炎\t453353\n开卷有益\t453354\n吓坏\t453355\n浙江临安政府\t453356\n毛派\t453357\n贵阳国家高新技术产业开发区\t453358\n锚杆\t453359\nAchieving\t453360\n魔幻陀螺\t453361\nboss\t453362\n冷眼旁观\t453363\nkkbox\t453364\n学生版\t453365\n仆从\t453366\n读作\t453367\n定尺\t453368\n苏州市相城区政府网\t453369\n广东省大学\t453370\n2016年12月19日\t453371\n相量\t453372\n两档\t453373\nimpulse\t453374\nWPA/WPA2\t453375\n莱斯康\t453376\nviutv\t453377\n西北风云\t453378\n刚性兑付\t453379\n塔人网络\t453380\n横过来\t453381\n石庄\t453382\nzuche\t453383\n叶俊\t453384\n急哭\t453385\n蔡志忠\t453386\n软盘\t453387\n饺子皮\t453388\n色粉\t453389\nhping\t453390\n畅玩5\t453391\n2014新年\t453392\ncomposed\t453393\n敏感性\t453394\n概型\t453395\n铜钉\t453396\n弹簧线\t453397\n黄海\t453398\n年代记\t453399\n心醉\t453400\n新潮传媒集团\t453401\n下旬\t453402\n秘志\t453403\n春天的故事\t453404\n寝取\t453405\n爱乃\t453406\n碧桂园悦府\t453407\n四喜\t453408\n新建县\t453409\nreadelf\t453410\nqi\t453411\n公孙策\t453412\n_八九网\t453413\n隔代教育\t453414\n奥迪a4l\t453415\n8年以上\t453416\n扁桃体隐窝\t453417\n用底\t453418\n凯文史派西\t453419\n廉者\t453420\n电子卷\t453421\nipone7\t453422\n阿里邮箱\t453423\n麦吉尔大学\t453424\nfifaon\t453425\n华财网\t453426\n富融\t453427\n余数\t453428\nHaydee\t453429\n小精灵\t453430\n水仪\t453431\n贯穿式\t453432\n古风漫画网\t453433\n玄界神医毒妃\t453434\n山东银行\t453435\n中铁第一勘察设计院集团有限公司\t453436\n舞龙\t453437\n着手\t453438\n嫁女\t453439\n英梨梨\t453440\n时数\t453441\nw3school\t453442\n10-12月\t453443\n乳房\t453444\n2018-04-07\t453445\n云开药网\t453446\n18类\t453447\n大一级\t453448\n教父\t453449\n河蟹\t453450\n反诗\t453451\n浓雾\t453452\nzhi\t453453\n派息\t453454\nizzue\t453455\n成都市第二人民医院\t453456\n海兽\t453457\nBIS\t453458\n唐绍仪\t453459\nSKIP\t453460\npmp考试\t453461\n扭动\t453462\n沪建管\t453463\n金票\t453464\nword格式\t453465\n欧奈尔\t453466\n绝缘板\t453467\ncombined\t453468\n邓小平\t453469\n肉毒杆菌\t453470\njqwidgets\t453471\n永丰村\t453472\n浙大玉泉\t453473\nwards\t453474\n检员\t453475\n软银集团\t453476\n中东北非\t453477\n有限合伙基金\t453478\n解密版\t453479\nAxela昂\t4
53480\n湖南信息学院\t453481\n泵房\t453482\n亲女生\t453483\n海南政法职业学院\t453484\n九节\t453485\n启赋\t453486\nAddons\t453487\n地质灾害评估\t453488\nBD1280高清\t453489\n钱塘潮\t453490\n马文蔚\t453491\n冰与火\t453492\n对偶\t453493\n广东省卫计委\t453494\nmssp\t453495\ndag\t453496\n最小公倍数\t453497\nndi\t453498\n杏仁饼\t453499\n霜火岭\t453500\n荣宝斋\t453501\n法则\t453502\n泰国\t453503\n八二\t453504\n伏虎\t453505\n丹水池\t453506\n演变\t453507\n映日\t453508\nAndroid安卓\t453509\nrockchip\t453510\n金狼\t453511\nmyeclipse\t453512\n排洪渠\t453513\nID卡\t453514\n光敏二极管\t453515\n赫尔曼\t453516\nconsidered\t453517\n低薪\t453518\n潭头镇\t453519\ntinker\t453520\n修表\t453521\n城城\t453522\n写会\t453523\n五年级数学上册\t453524\n9312\t453525\n普朗克\t453526\nRAV4荣放论坛_汽车之家论坛\t453527\n潮菜\t453528\n万达国际影城\t453529\n星河地产\t453530\nTelerik\t453531\n82亿\t453532\n一周一次\t453533\ncausal\t453534\nxinxin\t453535\n古槐\t453536\n理工男\t453537\n裸屏\t453538\n风云岛\t453539\n聪明反被聪明误\t453540\n编译链\t453541\n出完\t453542\n體育\t453543\n青海大学\t453544\n卡门序曲\t453545\n垂危\t453546\n年龄段\t453547\n马格\t453548\nLruCache\t453549\n氧分析仪\t453550\n氯水\t453551\n测量学\t453552\n贺本清\t453553\n二【\t453554\n异位性皮炎\t453555\n恋愛\t453556\n歼星舰\t453557\nvmalloc\t453558\nSQUAD\t453559\n深圳政府法制信息网\t453560\n11e\t453561\n下肢水肿\t453562\n灵幻先生\t453563\n私企\t453564\n甘肃电投\t453565\nto_书面语\t453566\n复美\t453567\n文绉绉\t453568\n湖南现代物流职业技术学院\t453569\n中国药业\t453570\n公益广告语\t453571\nVS2008\t453572\n蚝式\t453573\n王信\t453574\n广武镇\t453575\n新井\t453576\n发射药\t453577\n培训证\t453578\ncf刷枪\t453579\n传给子\t453580\n2\\\t453581\n玻璃球\t453582\nBiliBili\t453583\n两层\t453584\n送餐\t453585\n9科\t453586\n赏尔雅周海宏\t453587\n胰腺肿瘤\t453588\n星弟\t453589\n堪\t453590\n美辰\t453591\n文学作家\t453592\n奎目郎\t453593\najaxform\t453594\n宜昌东\t453595\n补充\t453596\n800万\t453597\n创益\t453598\n叔胺\t453599\n胜威\t453600\n_千广网\t453601\n诺兹多姆\t453602\n万向\t453603\n磁钢\t453604\nnpcscan\t453605\n梅原大吾\t453606\nSVGA\t453607\n高新科技园\t453608\n许杰\t453609\n黄连素\t453610\n主题性\t453611\ndota1\t453612\n碱式\t453613\n佛教社区|佛教\t453614\n焙烤\t453615\n第14个\t453616\n_夜神模拟器\t453617\n央视五四晚会\t453618\n出租房\t453619\nvoker\t453620\n刘国忠\t453621\npde\t453622\nmssql数据库\t453623\n
哈萨克斯坦共和国\t453624\n城市绿地系统规划\t453625\n确山县\t453626\n吱吱响\t453627\njavaer\t453628\n龙湖长城源\t453629\n无冬之夜\t453630\n寒流\t453631\n华东站\t453632\n历子\t453633\n一周多\t453634\nretrospective\t453635\n一平方公里\t453636\nNovotel\t453637\nclot吧\t453638\ntta\t453639\n大萧条\t453640\n表扬信\t453641\n无所不为\t453642\n情郎\t453643\n隶书\t453644\ntasks\t453645\n北京医保定点医院\t453646\n难找工作\t453647\n单元测试卷\t453648\n71名\t453649\n中校\t453650\n养护箱\t453651\n58分钟\t453652\n同月\t453653\n泰华街\t453654\n飞蝗\t453655\nZiperello\t453656\n霍东阁\t453657\n1周年\t453658\n郑州市二七区\t453659\nMMT\t453660\nmo9\t453661\nregen\t453662\ngrou\t453663\n天江\t453664\n富力白鹭湾\t453665\n增值税减免税申报明细表\t453666\n病原\t453667\n太阳的话\t453668\n活动房\t453669\nJavaEE\t453670\nstrdup\t453671\n几几点\t453672\n西部世界第一季\t453673\n胜火\t453674\n一物\t453675\njcg\t453676\n22204964\t453677\n春天的秘密\t453678\nwls\t453679\n5.5米\t453680\n斗铠\t453681\n触手tv\t453682\n高等代数\t453683\n对她说\t453684\n接球\t453685\n_119手游网\t453686\n艾睿\t453687\njiji\t453688\n四责协同\t453689\n抛储\t453690\n李沧区政府\t453691\n深圳市人民政府办公厅\t453692\n阿里巴巴创新中心\t453693\n瑞庭\t453694\n一早\t453695\n娄滋博\t453696\n长江北路\t453697\n闭幕词\t453698\n办公柜\t453699\n宋承宪\t453700\n法新社\t453701\ndust\t453702\n成都商报|成都商报\t453703\n彩膜\t453704\n融创香璟台\t453705\n民建中央\t453706\n2481984605\t453707\n拱棚\t453708\nViewer\t453709\n倾听者\t453710\n一到一百\t453711\n贝瓦网\t453712\ntick\t453713\n安溪人才网\t453714\n价值表\t453715\nファ\t453716\n杜集区人民政府\t453717\nOthers\t453718\nNapa\t453719\n国家地理_腾讯\t453720\n小恩\t453721\n阿联酋\t453722\ndesperate\t453723\n古蜀国密码\t453724\n慈溪政府网\t453725\n清管\t453726\n兴中会\t453727\n易堂\t453728\n加缪\t453729\n鼓风干燥箱\t453730\n收藏级\t453731\n天竺少女\t453732\nbsg\t453733\n南京市城市管理局\t453734\n思想汇报\t453735\n寿司\t453736\n毁灭\t453737\n戈贝尔\t453738\n第70号\t453739\n玩家堂\t453740\n万妮达\t453741\n扩张期\t453742\n黄山房产网\t453743\n3日内\t453744\n小旗\t453745\n冒烟冰淇淋\t453746\n抢鲜\t453747\n自我效能感\t453748\n别克威朗\t453749\n怀想\t453750\n康和\t453751\nchains\t453752\n查询器\t453753\n坏\t453754\n摆件\t453755\n锦鲤鱼\t453756\n高松机场\t453757\naccomplish\t453758\n55df\t453759\n墓地\t453760\n周秦\t453761\n燃烧者\t453762\n安徽省经信委\t453763\n情到深处\t453764\n自晒\t453765\n镀
银\t453766\n7.88\t453767\n教士\t453768\n后备箱盖\t453769\n邢台地区\t453770\nwiiu吧_\t453771\n依规\t453772\n写明\t453773\n天下第一剑\t453774\n绿色版\t453775\nHelm\t453776\n胖迪\t453777\n上海立信会计金融学院\t453778\nsamson\t453779\nstraps\t453780\n七届\t453781\n乙型肝炎疫苗\t453782\n六弄咖啡馆\t453783\nvalu\t453784\n超复杂\t453785\n爱情公寓3\t453786\npepe\t453787\n畅想\t453788\n退出\t453789\nspring-boot\t453790\n天音\t453791\n白花蛇\t453792\n另一片\t453793\n余红\t453794\n飘影\t453795\n神奈川\t453796\nIsn\t453797\nsolidworks2017\t453798\n囚室\t453799\n韩聪\t453800\nH类\t453801\n小氣\t453802\n太阳宫\t453803\n鼎城区人民政府\t453804\n挂碍\t453805\nonlylady女人志\t453806\nDARPA\t453807\n张夏\t453808\n反恐行动\t453809\n两弹一星\t453810\n温胆汤\t453811\n打宝\t453812\njj\t453813\nC语言库\t453814\nliyang\t453815\nkenjijoel\t453816\n痞子阿姆\t453817\n常德路\t453818\nempt\t453819\n辉山\t453820\n黑金丝雀\t453821\nf12017\t453822\n1瓦\t453823\n候场\t453824\n兑奖\t453825\n重道\t453826\n體化\t453827\navformat\t453828\n感恩节\t453829\n红米4X\t453830\n中国建筑西北设计研究院有限公司\t453831\n晶石\t453832\n科仪\t453833\n九牧王男装\t453834\n李洛克\t453835\nディ\t453836\n格子\t453837\n召忠\t453838\n足文\t453839\n杨燕\t453840\n华中科技大学土木工程与力学学院\t453841\n一哥们\t453842\n半支\t453843\napowersoft\t453844\n苗老祖\t453845\n柯有伦\t453846\n蜜宝\t453847\n天罡星\t453848\n163.com\t453849\n指检\t453850\n大鬼\t453851\n自来\t453852\n新世相\t453853\n防霉\t453854\n托马斯教育\t453855\n专科生\t453856\n1280\t453857\n华为麦芒\t453858\n东端\t453859\n重入\t453860\n岳阳火车站\t453861\n风板\t453862\n动库\t453863\n舒必利\t453864\nandroid4.0\t453865\nsalon\t453866\n天下彩\t453867\nTGT\t453868\n金饰\t453869\n新明锐\t453870\n秒懂了\t453871\nMacPorts\t453872\n贝尼尼\t453873\n赛福\t453874\nreceivable\t453875\n希咲彩\t453876\n21页\t453877\n长深高速\t453878\n小鸭子\t453879\n四郎\t453880\n金钟仁\t453881\nxsp\t453882\n逆波兰式\t453883\nHypervisor\t453884\n产品型\t453885\n元璟\t453886\n双11\t453887\n1800亿\t453888\nFaucet\t453889\n过世\t453890\n酒水单\t453891\n饱腹感\t453892\n3845\t453893\nfit\t453894\n总体\t453895\n样式集\t453896\n绝地救生\t453897\n侠义道\t453898\n恐慌性\t453899\nesayui\t453900\n290x\t453901\nTWRP\t453902\n精灵宝可梦究极\t453903\n洛伦兹力\t453904\n第49条\t453905\n戈兰\t453906\nloho\t453907\n15ml\t453908\n切掉\t453909
\n临湘市\t453910\nAccess数据库\t453911\n遂城镇\t453912\nPOPPIN\t453913\n环球网\t453914\n_高校人才网\t453915\n苏菲茅侃侃\t453916\n逼债\t453917\n战勋爵\t453918\n韦德\t453919\n20161120\t453920\n介休绵山\t453921\n京华\t453922\nlbzhang\t453923\n海神号\t453924\nlsn\t453925\n天津医科大学临床医学院\t453926\nClap\t453927\nスカ\t453928\n撇清\t453929\nUWP-D11\t453930\n艾德\t453931\n青岛地铁11号线\t453932\nANOTHER\t453933\n美舍\t453934\n黄国伦\t453935\n文化型\t453936\n经济生活\t453937\n浮充\t453938\n我好想你\t453939\n福湾\t453940\n控违\t453941\nlols8\t453942\n丹枫\t453943\npoultry\t453944\n北京和睦家医院\t453945\n芳疗\t453946\n文摘版\t453947\n三十位\t453948\n元贝驾考\t453949\n胸图\t453950\n秦农银行\t453951\ncoded\t453952\n王伟平\t453953\nINFJ\t453954\n金苹果幼儿园\t453955\n7g\t453956\n白吃\t453957\n努努\t453958\n第55集\t453959\n促发\t453960\n忽悠局\t453961\n寻问\t453962\n学而思杯\t453963\n盐渍土\t453964\nfilecoin\t453965\nyingyu\t453966\n南方科大\t453967\n隔音玻璃\t453968\n203集\t453969\n江苏省肿瘤医院\t453970\n墨白\t453971\nfantasy\t453972\n海城市\t453973\n郑欣\t453974\n帕累托图\t453975\n材料部\t453976\nCola\t453977\n痞子泰\t453978\n物灵\t453979\n3dd\t453980\n136个\t453981\n大鸿米店\t453982\n书道\t453983\n三氧化钨\t453984\nhypermill\t453985\n德国大学\t453986\n天畅游戏网\t453987\n菲董\t453988\n万丽娜\t453989\n广州白云国际会议中心\t453990\n望城新区\t453991\n96元\t453992\n附使\t453993\n兰馨\t453994\ncodecademy\t453995\n并口\t453996\n一之\t453997\n上海市人大\t453998\n安徽省高院\t453999\n沙路\t454000\nMBN\t454001\n两肋插刀\t454002\n海棠园\t454003\n再分配\t454004\n后勤\t454005\n曹家渡\t454006\n张国忠\t454007\n160702\t454008\n王小英\t454009\n一网打\t454010\n无锡幼儿园\t454011\n镇巴县\t454012\n十三天\t454013\n羽族\t454014\nrads\t454015\n曲线段\t454016\n朱仙镇\t454017\nF16\t454018\n大连西路\t454019\n宜良县人民政府\t454020\n中国对外贸易中心\t454021\n开集\t454022\n英西\t454023\n夜爬\t454024\n英子\t454025\nS200\t454026\n250次\t454027\n北周\t454028\n危房\t454029\n中梁壹号院\t454030\n双鱼女\t454031\n3414\t454032\n评分表\t454033\n重教\t454034\n免分\t454035\n强切\t454036\n2Judgement\t454037\nGoldWave\t454038\nugui\t454039\n偿还\t454040\nWIT\t454041\n大舞台\t454042\n上海天文台\t454043\n套取\t454044\n该卡\t454045\n凝胶剂\t454046\n碰着\t454047\n度汛\t454048\n诺达热带雨林\t454049\n元杰\t454050\npng格式\t454051\n汉堡大学\t454052\n刺儿菜\t454053\n1档\t4540
54\n上海市信鸽协会\t454055\n人力资源管理专业\t454056\nRamp\t454057\nmysql函数\t454058\n一起唱\t454059\n王向荣\t454060\nenchanted\t454061\n澳客\t454062\n天津航空有限责任公司\t454063\n南仁东\t454064\n第9讲\t454065\n零公里\t454066\n嘭\t454067\n达意\t454068\n中共地下党\t454069\npsv\t454070\n尽失\t454071\n阳光连线\t454072\nAudition3.0\t454073\n150毫米\t454074\n鲁能地产\t454075\n58招\t454076\n复旦大学马克思主义学院\t454077\niwork8\t454078\n湄洲\t454079\n十三点\t454080\n300款\t454081\n安康路\t454082\n黄华华\t454083\n批转\t454084\n4.1.6\t454085\n苏州海关\t454086\npc2\t454087\n渡部笃郎\t454088\n德井义实\t454089\nmogondb\t454090\n仙塞学院\t454091\n阴臀倒模\t454092\n新时代健康产业集团\t454093\n清水塘\t454094\n脂粉\t454095\n知乎热议\t454096\n草料场\t454097\n魅族mx4\t454098\n铂瑞\t454099\n翻身\t454100\n凯泽\t454101\n刚哥哥\t454102\n悄悄话\t454103\n外婆的澎湖湾\t454104\n人民版\t454105\n空心面\t454106\n聚硫密封胶\t454107\n转塘街道\t454108\n芳踪\t454109\n手体\t454110\n诺贝尔物理奖\t454111\n里恩\t454112\n3根\t454113\n山东财经大学\t454114\n龙虎少年队\t454115\n活受罪\t454116\n国家宪法日\t454117\n乏\t454118\n亚特兰蒂斯\t454119\nIMPACT\t454120\n心动过缓\t454121\n阿瓦尔古丽\t454122\n小二郎\t454123\nナンパ\t454124\n这个春节\t454125\nW8\t454126\n容灾\t454127\n英州镇\t454128\n芝樱花\t454129\n痴心妄想\t454130\n禹唐\t454131\n方倍工作室\t454132\n采桑\t454133\nca88\t454134\n形\t454135\n17亿\t454136\n宝达\t454137\n95平\t454138\n携程旅行网\t454139\n喊道\t454140\n五女拜寿\t454141\n武天\t454142\n妖萌战姬\t454143\n窝灯\t454144\nFOTOE\t454145\n秀东\t454146\n宫内节育器\t454147\nHoldings\t454148\n2016年4月份\t454149\n793\t454150\n壁挂机\t454151\n辽沈地区\t454152\nsfr\t454153\nmens\t454154\n上镜\t454155\nwindows7\t454156\n彩平图\t454157\n90寸\t454158\nOSN\t454159\n何志平\t454160\n教育部教师工作司\t454161\n浴室镜\t454162\n第九十二章\t454163\n爱美丽时尚网\t454164\n京戏\t454165\n史克\t454166\n复旦大学肿瘤医院\t454167\nbandit\t454168\nPrevalence\t454169\n氯丙烯\t454170\n女医明妃传\t454171\n小石潭记\t454172\n南亭\t454173\npano2vr\t454174\n白磷\t454175\n金格科技\t454176\n北京大学首钢医院\t454177\n年薪制\t454178\n十七次\t454179\n漳州市\t454180\njquey\t454181\n700Bike\t454182\n20171130\t454183\n青岛开发区\t454184\n通力律师事务所\t454185\n李家山\t454186\n蓝韵\t454187\n内存页\t454188\n佟佳氏\t454189\n崇阳\t454190\n民资\t454191\n咖喱棒\t454192\n布坎南\t454193\nsue\t454194\ncs95\t454195\n环溪村\t454196\nTOP
12\t454197\n澄泓\t454198\n热重分析仪\t454199\nCUV\t454200\n卢军宏\t454201\n白云乡\t454202\n平湖\t454203\nBNC\t454204\n盗攝\t454205\n藏独\t454206\n剑门关景区\t454207\n东营网\t454208\nThunderbolt\t454209\n许昌市中心医院\t454210\nROD\t454211\n乖离性ma\t454212\npcb\t454213\n1座\t454214\n地平线:零之曙光\t454215\nS7edge\t454216\nCaO\t454217\nRational\t454218\n超过2小时\t454219\n彩票机\t454220\ntraining\t454221\n王思文\t454222\n贞操带\t454223\n弗林\t454224\n易县人民政府\t454225\n化学锚栓\t454226\n搜猪网\t454227\n海牙认证\t454228\nfire\t454229\n减损\t454230\n1.51\t454231\n汇潮\t454232\n产权证\t454233\n上海五官科医院\t454234\n哑谜\t454235\n津贴\t454236\n文件恢复软件\t454237\n5570\t454238\n假赛\t454239\n飞田\t454240\n地热分水器\t454241\n用药水\t454242\n吉沢明步\t454243\ncom2us\t454244\n金曜石\t454245\n搜狐社区\t454246\n丹尼斯吴\t454247\n候任\t454248\n4立方\t454249\nBacteria\t454250\n毛菇\t454251\n7艘\t454252\nvsix\t454253\n虎踞\t454254\n锦庐\t454255\nScent\t454256\n大连小学\t454257\n热升华\t454258\n42u\t454259\n陕西科技大学\t454260\n台城镇\t454261\n绒\t454262\n四五家\t454263\n无锡物流公司\t454264\n魔教\t454265\n治疗仪\t454266\n全芏网\t454267\n20151023\t454268\n新邵县\t454269\n木九十\t454270\n外箱\t454271\n我的选择\t454272\n平川镇\t454273\n黄小平\t454274\ninvest\t454275\n五十载\t454276\n光子学\t454277\n圆盾\t454278\n腾讯分分彩开奖\t454279\njavaconfig\t454280\n学士后\t454281\n湖北省交通厅\t454282\n压岁钱\t454283\n永磁除铁器\t454284\n吴龙\t454285\n韩茜茜\t454286\nipad4\t454287\n黄黄\t454288\n天全县\t454289\n打灯\t454290\n光叔\t454291\n中量级\t454292\nrooftop\t454293\n划时代\t454294\n适当性\t454295\n拉铆枪\t454296\nshuffle4\t454297\n付家庄\t454298\n跨界\t454299\n高龄\t454300\n危难\t454301\n千佛山\t454302\nie3.0\t454303\n地力\t454304\n一股\t454305\n九龙湾\t454306\n硫酸锌\t454307\n成都市档案局\t454308\n破相\t454309\n金融市场部\t454310\n单管\t454311\n世茂蝶湖湾\t454312\npydoc\t454313\n转瞬\t454314\n黄锁\t454315\nOpinion\t454316\n放松\t454317\n游戏王:决斗者遗产\t454318\n三篇\t454319\n缴费机\t454320\n硼\t454321\n纳豆菌\t454322\nenv\t454323\nvoa\t454324\n晓程科技\t454325\n萨尔兹堡\t454326\n催化液\t454327\n惠比特\t454328\n不知情\t454329\n四严\t454330\nandroid8.0\t454331\n深圳中学初中部\t454332\n上海交通大学医学院附属同仁医院\t454333\n淘宝主图\t454334\nStain\t454335\n有声小说在线收听网\t454336\n汴禧集团\t454337\n背鳍\t454338\n天花机\t454339\n新单\t454340\nPig
ment\t454341\n毛糙\t454342\nbreeze\t454343\navav\t454344\n通用串行总线控制器\t454345\n定性变量\t454346\nreso\t454347\n7千\t454348\nconstituent\t454349\n孤儿案\t454350\n依法\t454351\n丁宁\t454352\n防不胜防\t454353\nWebApp\t454354\n济宁市区\t454355\n道明\t454356\n两小无猜\t454357\n英吉利\t454358\n不记名\t454359\n轩逸雪铁龙c6\t454360\n信方\t454361\nclient\t454362\ninmy\t454363\n谢鹏\t454364\n月薪\t454365\nclerk\t454366\n300日元\t454367\n吉林省\t454368\nRamDisk\t454369\n姚剑\t454370\n裙子坦克世界\t454371\n1.40\t454372\n沿江镇\t454373\n006\t454374\naft\t454375\nFDTD\t454376\nPhodal\t454377\nFAIRY\t454378\nAEB\t454379\n骁龙616\t454380\n生老病死\t454381\n优云\t454382\n私处长\t454383\n河南师大附中\t454384\n2017.08\t454385\n春思\t454386\n刘震云\t454387\n后殖民主义\t454388\nMLGB\t454389\n国联资源网\t454390\n15年前\t454391\n大厂镇\t454392\n中国科学院武汉病毒研究所\t454393\n0393\t454394\n六一节\t454395\n微薄\t454396\n草莓牛奶\t454397\nk610d\t454398\n几趟\t454399\nSPF50+\t454400\n医馆笑传2\t454401\ngears3吧\t454402\n三月初一\t454403\nGather\t454404\n稍稍\t454405\nWEBP\t454406\n手商\t454407\n三流\t454408\n弓门\t454409\n长孙\t454410\nGCP\t454411\n铝塑\t454412\n当归补血汤\t454413\n陈绍蕃\t454414\n北京稻香村\t454415\n邪文\t454416\n产经\t454417\n荷斯坦\t454418\n初中考\t454419\n一进门\t454420\nneinei\t454421\n10m3\t454422\n新书\t454423\n空地\t454424\n南工\t454425\ncname\t454426\n李丁\t454427\n蓝光机\t454428\n引渡\t454429\ntensorflow\t454430\n移动迷宫3死亡解药\t454431\n献寿\t454432\ngrave\t454433\n安耐美\t454434\n永佳\t454435\nExpressions\t454436\n超级马里奥奥德赛\t454437\n八斤\t454438\n孙悟空\t454439\nwhenever\t454440\n肌腱\t454441\n盗墓笔记全集\t454442\n云搜索\t454443\nIT专家网\t454444\n黄梅县人民政府\t454445\n足力健\t454446\n防火柜\t454447\n一封家书\t454448\nbed\t454449\n紧俏\t454450\n400子\t454451\n印第安纳大学伯明顿分校\t454452\n网易uu加速器\t454453\n黄陵县人民政府\t454454\n金彭\t454455\n使命召唤吧\t454456\n大力\t454457\nqlistview\t454458\n2013年7月\t454459\n烟台大悦城\t454460\n深圳市公司\t454461\n領域\t454462\n2017年7月8日\t454463\n大宁国际小学\t454464\n二甲双胍\t454465\n云笈七\t454466\n29个\t454467\n鱼摆摆网店装修网\t454468\n投资观\t454469\n安东尼达斯\t454470\n密西根大学\t454471\n国家垄断资本主义\t454472\n天雁\t454473\n扬尘\t454474\n投资基金\t454475\n叶云\t454476\n卡吧\t454477\n抛釉\t454478\n丰山\t454479\n侵权\t454480\nQ9\t454481\n成都
传媒集团\t454482\n受伤者\t454483\n长春路\t454484\n武打\t454485\n北京冬奥\t454486\n蝙蝠侠黑暗骑士\t454487\n辽宁盼盼\t454488\n固定相\t454489\n智慧大厦\t454490\nbcm\t454491\npetro\t454492\n碳钢板\t454493\n观览\t454494\n保温盒\t454495\n电常数\t454496\n慈东\t454497\n拉达\t454498\n安全工程\t454499\n第一炮\t454500\n放散\t454501\n虹科\t454502\nCondo\t454503\n脱膜\t454504\n学前频道\t454505\n质造\t454506\n42009\t454507\n内应力\t454508\n垃圾分类\t454509\nminiusb\t454510\n南关村\t454511\n83万\t454512\nproduct\t454513\n坏家伙们\t454514\n圆月\t454515\n雅南\t454516\n格力犬\t454517\n机智的监狱生活\t454518\n摩托日记\t454519\n奇胜\t454520\n长安汽车\t454521\n中国坯布网\t454522\n买手街\t454523\n目标服务器\t454524\n跳舞毯\t454525\n张雨绮\t454526\n牛肉味\t454527\n西乡\t454528\n税单\t454529\n正器\t454530\n画墙\t454531\n时生\t454532\n惠州亿纬锂能股份有限公司\t454533\n蒙纳\t454534\n几吨\t454535\n严守\t454536\nwarcraft\t454537\n摔角\t454538\n63家\t454539\nf422\t454540\nshadowmatic\t454541\n吉利s1\t454542\n分院\t454543\nAnal\t454544\navop\t454545\n人肉包子\t454546\n第四场\t454547\n乌骨鸡\t454548\n荣威eRX5\t454549\n涤纱\t454550\n好饿的毛毛虫\t454551\nThrills\t454552\n娄底新闻网\t454553\n锯\t454554\n傻屌\t454555\n新疆生产建设兵团\t454556\n甘泉路\t454557\n北京港澳\t454558\n素斋\t454559\n钛矿\t454560\n抵近\t454561\nSEOUL\t454562\n2027年\t454563\nHanLP\t454564\n86届\t454565\nnixCraft\t454566\n化为乌有\t454567\n综漫\t454568\nAddgene\t454569\n不急\t454570\n身高\t454571\nxp245\t454572\nLOM\t454573\n伸张\t454574\n胆片\t454575\n烫烫烫\t454576\nお\t454577\nキュレ\t454578\n救济\t454579\n奶蓟草\t454580\n背挂\t454581\nNoSQL\t454582\n这时候\t454583\n纸球\t454584\n总攻略\t454585\n战略家\t454586\n5险1金\t454587\n长沙市发改委\t454588\n曹建国\t454589\n偷吻\t454590\n方舟生存进化\t454591\n爱普生630k\t454592\n染毒\t454593\n神武苍穹\t454594\n361单机网\t454595\n低温箱\t454596\n择吉\t454597\nSWC\t454598\n十月围城\t454599\n为用\t454600\nb10\t454601\n中和街道\t454602\nc语言函数\t454603\n巡音\t454604\n_眼\t454605\n蜂场\t454606\n初具规模\t454607\n辛集镇\t454608\n倪惠英\t454609\ntow\t454610\ntechnology\t454611\n桩端\t454612\n墨麟\t454613\n礼帽\t454614\n骨皮\t454615\nJDBC\t454616\nwindonws\t454617\nie\t454618\n母公司\t454619\n老寨山\t454620\nmiui吧_\t454621\n楚天云\t454622\n各组\t454623\n重庆育才中学\t454624\n六普\t454625\nRev.3\t454626\nBounds\t454627\n60R15\t45
4628\n伪街\t454629\n上海律协\t454630\n交友网\t454631\n王小龙\t454632\n张思浩\t454633\n稻麦\t454634\nNTFS\t454635\n夹条\t454636\n让贤\t454637\n运气\t454638\n何婕\t454639\nRalink\t454640\n速7\t454641\n5年多\t454642\n四舍五入法\t454643\n傲视网\t454644\n国际航协\t454645\n不冷\t454646\n【化物语\t454647\n鼎峰\t454648\n赵佳\t454649\n义乌商贸城\t454650\n东亚地区\t454651\n电影片尾曲\t454652\n苏州大学计算机科学与技术学院\t454653\nAdobeFlashPlayer\t454654\n国泰君安证券\t454655\nbeoplayer\t454656\nJOMOO\t454657\n南京理工大学》\t454658\nLEADER\t454659\n柏溪镇\t454660\n报酬率\t454661\n列题\t454662\nguyuanli\t454663\n重介\t454664\n定焦镜头\t454665\nharassment\t454666\n寒号鸟\t454667\n万村千乡\t454668\n方杆\t454669\n敌手\t454670\n北京科技大学材料科学与工程学院\t454671\nReally\t454672\n梁飞\t454673\n词汇表\t454674\n臃肿\t454675\n东莞注册公司\t454676\n叶塞尼亚\t454677\n03号\t454678\n思南县\t454679\n2018-04-20\t454680\n张智尧\t454681\n爱歌\t454682\n消防登高面\t454683\n赵朴初\t454684\n七笔账\t454685\n一得\t454686\n广州\t454687\n渣打銀行\t454688\n颇\t454689\n高陵县\t454690\n韩学杰\t454691\n北江\t454692\n迷惑性\t454693\n陀罗尼\t454694\n安控科技\t454695\n云丰\t454696\n想不想\t454697\ne贷\t454698\n金条\t454699\n商贸业\t454700\n两个半小时\t454701\n捆扎\t454702\n福州地铁1号线\t454703\n木版画\t454704\noranges\t454705\n三维板\t454706\n周弯弯\t454707\n接地母线\t454708\n呼死你\t454709\ncosta\t454710\n凯撒金\t454711\n盘根\t454712\n荒神\t454713\n蝶美\t454714\n山东省国资委\t454715\nTON\t454716\n中级审计师\t454717\n这么美\t454718\n右衽\t454719\n微企\t454720\n香雪制药\t454721\nGameMaker\t454722\n中大街\t454723\ncrud\t454724\n接线\t454725\n第五批\t454726\n精武镇\t454727\n鬼校\t454728\n海拉鲁城堡\t454729\n9.5亿\t454730\n老六\t454731\n手机银行跨行\t454732\n喷花\t454733\nlilbetter\t454734\n老孟\t454735\ns6e\t454736\n利华\t454737\n禁放\t454738\n陈少霞\t454739\n湛江中心人民医院\t454740\nrecreators\t454741\n2006-2010年\t454742\n鞍山一中\t454743\n未名\t454744\n舒斯特尔\t454745\n售楼员\t454746\nPRICE\t454747\n去哪儿\t454748\n于洋\t454749\n兰州军区\t454750\n老鸭头\t454751\nAlt+\t454752\n解字\t454753\nCHAOS\t454754\n前9个月\t454755\n邹平政府网\t454756\n值函数\t454757\n练声\t454758\nbipro\t454759\n恒宝广场\t454760\nAndreas\t454761\nkirin\t454762\n小奇\t454763\n2017年2月26日\t454764\n东湖花园\t454765\n儚\t454766\n收款方\t454767\n张紧\t454768\nToilette\t454769\n中医药\t454770\n凌雪\t
454771\n恶气\t454772\n龙信\t454773\n附加险\t454774\n工商银行跨行\t454775\nkiddy\t454776\n于志刚\t454777\n吕彦\t454778\n199个\t454779\n奥布拉克\t454780\n资阳大众网\t454781\n联想e430\t454782\n圣战群英传3\t454783\nbbn\t454784\n金瓶风月\t454785\n10多\t454786\n延誉\t454787\n刀工\t454788\n写在一起\t454789\nsot-23\t454790\n第一枝\t454791\n储藏室\t454792\n小台芒\t454793\n龙船\t454794\n订餐\t454795\n直感\t454796\n浪涌保护器\t454797\nHKD\t454798\n广增\t454799\nJavaMail\t454800\nrouteros\t454801\n普集镇\t454802\n万鑫\t454803\n纪敏佳\t454804\n徐正\t454805\namac\t454806\n曹小萌\t454807\n浙商财险\t454808\n资政\t454809\n聚诚\t454810\nTIG\t454811\n码头\t454812\nTeamCity\t454813\nuint8\t454814\n19.3\t454815\n陈德容\t454816\n无名小卒\t454817\n带你去旅行\t454818\nvertica\t454819\n2015年底\t454820\n暴风比特\t454821\n天梭手表\t454822\n重庆东金投资顾问有限公司\t454823\n坐落\t454824\n2016年4月16日\t454825\n石化油服\t454826\n人生果\t454827\n那日\t454828\n镖门\t454829\n像极了\t454830\n2014\t454831\n转板\t454832\nos4\t454833\n趁\t454834\n六年级语文\t454835\n徐少强\t454836\n困难户\t454837\ninvaild\t454838\n手指甲\t454839\n花园酒店\t454840\n反爬\t454841\nAnyconnect\t454842\n少学时\t454843\n仁慈\t454844\n恋爱观\t454845\n汪浩\t454846\n激智科技\t454847\n长方体的认识\t454848\n老公\t454849\n3dsll\t454850\n猫神\t454851\n辛德勒的名单\t454852\n座位\t454853\n月光曲\t454854\n天道教育\t454855\n通畅\t454856\n60cm\t454857\nVR玩网\t454858\n下挂\t454859\n沙弥\t454860\n峄城\t454861\nimvu\t454862\n套现\t454863\n7.doc\t454864\n北交大\t454865\n楼房\t454866\norderer\t454867\n泥醉\t454868\n天等\t454869\n4135\t454870\n这首歌\t454871\n转钱\t454872\n镍锌\t454873\n2003款\t454874\n球手\t454875\n沙文主义\t454876\n太离谱\t454877\nWITHOUT\t454878\nIndiana\t454879\nnfa\t454880\n欧普\t454881\n真冰\t454882\nCOS秀\t454883\n温哥华岛\t454884\n降\t454885\nYeon\t454886\n背阴\t454887\n氢氧化钴\t454888\n庐山木\t454889\nAquarium\t454890\nlands\t454891\n3D字谜\t454892\ncured\t454893\n不干胶\t454894\n格林美\t454895\n幻想神域ol吧\t454896\n门栏\t454897\ncalibre\t454898\nrel=\t454899\n堡子\t454900\n叛逃者\t454901\n打败\t454902\n寒夜\t454903\n8030\t454904\n王稼祥\t454905\n键槽\t454906\n杭州绕城高速\t454907\n梁耀燮\t454908\n开运算\t454909\nCCTV-13_央视\t454910\n1000l\t454911\n陈静\t454912\n鉴赏级\t454913\n文稿纸\t454914\n北京二手车网\t454915\n长阳县\t454916\
n20171104\t454917\n饱眼福\t454918\n泊船\t454919\nrendered\t454920\nt470\t454921\n广西国际商务职业技术学院\t454922\n广西农业职业技术学院\t454923\n中国矿业大学出版社\t454924\n2018年3月24日\t454925\nORAS\t454926\n几个个\t454927\n狗语\t454928\n空气站\t454929\n何展成\t454930\n高分网\t454931\n李儒\t454932\n彼得一世\t454933\n撸噜\t454934\n福州省立医院\t454935\n北京市城市管理综合行政执法局\t454936\nAirPort\t454937\n上皮\t454938\n阿佩尔\t454939\n辽宁大学》\t454940\n宝安大道\t454941\n几险\t454942\n846\t454943\n総合サイト\t454944\n有如\t454945\nHMI\t454946\n异构\t454947\n长春晚报\t454948\n中华人民共和国职业分类大典\t454949\n蔬菜清洗机\t454950\n硕字\t454951\n17卷\t454952\nnode服务器\t454953\nairbag\t454954\n青绿色\t454955\n万达传媒\t454956\nabortion\t454957\n匈奴人\t454958\nNIO\t454959\n巍山\t454960\nMarriage\t454961\n辅酶q10\t454962\n酒肉\t454963\n敖包相会\t454964\n拳霸\t454965\n爱久见人心\t454966\n水潭\t454967\n女秋\t454968\n华苑小区\t454969\n助焊\t454970\n监事长\t454971\n魔法士\t454972\n金蝶KIS记账王\t454973\n伸进\t454974\n大揭密\t454975\nWeld\t454976\nBe\t454977\n茶圣\t454978\n日本阿尔法罗密欧\t454979\n李方\t454980\n国际展览中心\t454981\nGraduates\t454982\nspirng\t454983\n宋时武侠世界大冒险\t454984\n北漂\t454985\n滚石乐队\t454986\n全镇\t454987\n136元\t454988\n同力水泥\t454989\n挂掉\t454990\n创维\t454991\n托管人\t454992\nEDITION\t454993\n热效率\t454994\nwim\t454995\n总感\t454996\n多弗朗明哥\t454997\n春江水暖\t454998\n青少年体育俱乐部\t454999\nsurvive\t455000\n4.71\t455001\n68%\t455002\n630kva\t455003\n磺酰氯\t455004\n暑研\t455005\n吉祥纹莲花楼\t455006\nPULL\t455007\n娟子\t455008\n鲍鱼螺\t455009\n老证\t455010\nshsh\t455011\n陆海军\t455012\nn维\t455013\n激战现代ix25\t455014\nrmdir\t455015\n黑染\t455016\n2015.09\t455017\nAint\t455018\n帝少\t455019\nG3900\t455020\n聚类算法\t455021\n技校\t455022\n蕊蕊\t455023\n烈焰战神U\t455024\n延边北国\t455025\n三塔\t455026\nSan\t455027\n氟机\t455028\n煤堆\t455029\n辣眼\t455030\n刘玫\t455031\n非人学园\t455032\n中华人民共和国人口与计划生育法\t455033\n谈婚\t455034\n蒋家村\t455035\n可控硅\t455036\n热热\t455037\n沈萍\t455038\nDOF\t455039\n浮动\t455040\n启德教育\t455041\n北京长虹医院\t455042\n1786\t455043\n年代表\t455044\n天鹅蛋\t455045\n主卧\t455046\n03级\t455047\n人妻\t455048\n鞋面\t455049\nWeb.config\t455050\n不成林\t455051\nmack\t455052\n扬州\t455053\nStrips\t455054\nwindows8.1\t455055\n绿带\t455056\nVegetables\t
455057\n浮华饭店\t455058\n水耕\t455059\n授受\t455060\nmln\t455061\n汇丰商学院\t455062\n深圳体检中心\t455063\n虎园\t455064\nCX-4论坛_汽车之家论坛\t455065\n球化剂\t455066\n镇定\t455067\n大龄女\t455068\nfuke\t455069\nwednesday\t455070\n内蒙古农业大学\t455071\n普安县人民政府\t455072\n零字\t455073\n季涛\t455074\n谭花灵\t455075\n交易性\t455076\n纳凉\t455077\n民庭\t455078\n八成\t455079\n箱子\t455080\n鹌鹑\t455081\nGAMING\t455082\ndev\t455083\n288芯\t455084\n文臣\t455085\n0.17.1\t455086\n偷塔\t455087\n力工\t455088\n獒犬\t455089\n触手系列精品集V1\t455090\n人卫医学网\t455091\n卢沟桥事变\t455092\n山东吕剧\t455093\n广州日报数字报\t455094\n移动医疗\t455095\n网格化\t455096\n毛译东\t455097\n鸡心\t455098\n邪门\t455099\n桑迪\t455100\n清扬路\t455101\n蜂蝶\t455102\n王晓宇\t455103\n港隆城\t455104\n五四晚会\t455105\n2006级\t455106\n国中\t455107\n王广成\t455108\n长安奔奔\t455109\n津洽会\t455110\n绿萝\t455111\n被冷落\t455112\n勇创\t455113\n老股\t455114\nfindlaw\t455115\n人造丝\t455116\n肯塔基州\t455117\n忆昔\t455118\n碧波路\t455119\n灵王\t455120\nMetro\t455121\n5068新闻网\t455122\n神奈川县\t455123\n零花钱\t455124\n伊莉大陆系列-白雪公主2\t455125\nTPF\t455126\n兽兽\t455127\n2026\t455128\n游戏迷\t455129\n浙江省发展和改革委员会\t455130\n人型电脑天使心\t455131\n衡水市\t455132\n滋事案\t455133\n20151119\t455134\n改革开放\t455135\npainter2018\t455136\nSDK\t455137\n日本语言学校\t455138\nequ\t455139\n6.3.15\t455140\n贝西克塔斯\t455141\n宜步出行\t455142\n平清盛\t455143\n兔唇\t455144\n很紧张\t455145\ninferior\t455146\nSEO研究中心\t455147\n272\t455148\n创展\t455149\nsexual\t455150\n太傅\t455151\n天下长安\t455152\n朴智星\t455153\n进出口\t455154\n流星花园1\t455155\n进制\t455156\nfors\t455157\n苏卡\t455158\n风云世纪\t455159\n压开\t455160\n煽\t455161\n38平\t455162\n吴法宪\t455163\n新月\t455164\n冷风扇\t455165\ngratis\t455166\n不愤\t455167\n黄小婷\t455168\n霞洛\t455169\nles吧_\t455170\n翠园\t455171\n观光园\t455172\n腾旅\t455173\npoplib\t455174\n动脑筋\t455175\n功能区\t455176\n全景照\t455177\nTamron\t455178\n2.29\t455179\n法库传媒网\t455180\n激素依赖性皮炎\t455181\n洞朗\t455182\nDropdown\t455183\n身分证\t455184\n招商蛇口\t455185\ndropping\t455186\n自然科学基金\t455187\n彭越\t455188\nhuff\t455189\n广东市\t455190\n第三第四\t455191\n妖花\t455192\n重播\t455193\n为指\t455194\nFFmpeg源代码\t455195\n安徽农业大学经济技术学院\t455196\nzhe\t455197\nSHOES\t455198\nhuhu\t455199\n段超\t45
5200\nMcKinsey\t455201\n笙笙\t455202\n中国微山网\t455203\n百节\t455204\nsisoo\t455205\n事业单位公开招聘工作人员公告\t455206\n三句话\t455207\n后湖\t455208\n彪马\t455209\n娜莎\t455210\n李玉玺\t455211\n和合符\t455212\nsuburban\t455213\n中国人寿国寿福\t455214\np8b75\t455215\n消磁器\t455216\n杀戮空间2\t455217\n安叔\t455218\ndeadline\t455219\n新英\t455220\n五局\t455221\n移动3g\t455222\n武理\t455223\n大安市\t455224\n迷昏\t455225\n异姓\t455226\n伊莎贝拉\t455227\nJwt\t455228\nUIElement\t455229\n竹乡\t455230\n走走\t455231\n钱理群\t455232\nhmcl\t455233\n客户名\t455234\n开发部\t455235\n几类\t455236\n98名\t455237\n疆独\t455238\n食安中国网\t455239\nThinkPadX1\t455240\n截距项\t455241\n10-4\t455242\n古今中外\t455243\n中级版\t455244\njapaneseold\t455245\n张宏杰\t455246\n花样姐姐\t455247\n算死草\t455248\n二_山西新闻网\t455249\n澳服\t455250\n十来\t455251\n正腔圆\t455252\n活到\t455253\n汽车行驶证\t455254\n非零\t455255\n星堡\t455256\n行行\t455257\nqdii\t455258\n灵声机器人\t455259\n竟遭\t455260\n由衷\t455261\n泰晶科技\t455262\n吹风筒\t455263\n重口味\t455264\n350分\t455265\n开罗\t455266\n基石\t455267\n南园村\t455268\nyyets\t455269\n2w\t455270\n铝型材散热器\t455271\n三十回\t455272\n全篇\t455273\n东倒西歪\t455274\nSSMS\t455275\n左部\t455276\n30台\t455277\n2016年9月1日\t455278\n郑渊\t455279\n像你\t455280\nbasler\t455281\nOlivia\t455282\n岩巷\t455283\n乐播投屏\t455284\n除湿器\t455285\n自主式\t455286\n米沙\t455287\n闭市\t455288\n分设\t455289\n百灵达\t455290\n盛树宝\t455291\n埋伏\t455292\n昌北经济开发区\t455293\n二建\t455294\n手撕包菜\t455295\n新大陆集团\t455296\n2244\t455297\n防风帽\t455298\n直销报道网\t455299\n爸爸们\t455300\n旧事\t455301\n人民广场站\t455302\n艾玛·沃森\t455303\n淮安市委\t455304\n建强\t455305\n隐适\t455306\n土库曼斯坦\t455307\n菁优\t455308\n众城\t455309\n王根会\t455310\n27.5\t455311\nLaurie\t455312\n王岗\t455313\n中影星美国际\t455314\n朗多\t455315\n会声会影X8\t455316\n麦麦\t455317\n陈一发儿\t455318\n重生之将门毒\t455319\n滇中新区\t455320\n第十卷\t455321\n大军师\t455322\n上海财经大学法学院\t455323\n万向轮拉杆箱\t455324\n南京市质监局\t455325\n文交所\t455326\nnotifications\t455327\n筑路\t455328\n脉冲袋式除尘器\t455329\nwindowsXP\t455330\n斗鱼杯\t455331\n双螺杆挤出机\t455332\n农机化\t455333\n马鞭草\t455334\n军转干考试\t455335\n套作\t455336\n微信斗地主\t455337\n垂暮\t455338\n万和城隆基泰和\t455339\n并发性\t455340\ndr4\t455341\n越语\t455342\n跟好\t455343\n赵姐\t455344\
n恨死\t455345\n超清1080p\t455346\n芜湖市人民政府\t455347\n如花似玉\t455348\n谷氨酰胺\t455349\n西大街\t455350\n孤山\t455351\n央视少儿频道\t455352\n伤情\t455353\n雅致\t455354\n_英雄无敌3吧_\t455355\n闪乱神乐\t455356\n桐城市\t455357\n劳森\t455358\n西西里\t455359\n四年\t455360\n阿米尼\t455361\n本道\t455362\n哪有\t455363\n城固县人民政府\t455364\n李开春\t455365\n信长星\t455366\nIGV\t455367\n昌泰\t455368\n玻璃管\t455369\n王小慧\t455370\nrobe\t455371\nSNES\t455372\n师子\t455373\naddiction\t455374\n球器\t455375\n中华人民共和国义务教育法\t455376\npornia\t455377\n叔\t455378\n1毫升\t455379\n预期收益率\t455380\n飞常准\t455381\n55分钟\t455382\n宁波市眼科医院\t455383\nm2071\t455384\n磁力线\t455385\n数码宝贝大冒险\t455386\n沸石粉\t455387\n知网查重率\t455388\n台灣\t455389\n家族性\t455390\n彩\t455391\n穆\t455392\nyog\t455393\n粉红豹\t455394\n19.8\t455395\n联想g470\t455396\n佳园路\t455397\na31\t455398\n媚者\t455399\n大社\t455400\n虹桥国际机场\t455401\n习讯云\t455402\n操控者\t455403\nfurther\t455404\n方某\t455405\n胃肠机\t455406\n新闻阁\t455407\nchangan\t455408\nmalignant\t455409\nJConsole\t455410\n梁丰\t455411\nredi\t455412\nPPT课件\t455413\n十分钟左右\t455414\n岛主\t455415\n震泽中学\t455416\n单选按钮radio\t455417\n豫西大峡谷\t455418\n七妹\t455419\n三调\t455420\n达州职业技术学院\t455421\n新东方童话故事大全\t455422\nsm绳艺\t455423\n籍贯\t455424\n第67章\t455425\n呆滞\t455426\n降修\t455427\n可言\t455428\n猎马网\t455429\n急须\t455430\n三单元圈\t455431\n叛逆连队2\t455432\nweilai\t455433\n三盛\t455434\n惩戒之箭\t455435\n三定\t455436\n蒙东\t455437\ntrain\t455438\n武道会\t455439\n人居环境网\t455440\ncacao\t455441\n心连心\t455442\n山东大学法学院\t455443\nZac\t455444\nnn\t455445\n工程队\t455446\n意创\t455447\nBizhub\t455448\nVocabulary\t455449\n项雪龙\t455450\n猪砂\t455451\n黄小燕\t455452\n150件\t455453\n广州市司法局\t455454\n超级账本\t455455\n准\t455456\nholl\t455457\n湘潭站\t455458\n湖北省人民政府办公厅\t455459\n信号转换器\t455460\ncorbin\t455461\n旅游景区\t455462\n明医网\t455463\n工材\t455464\n力挽\t455465\n180W\t455466\n涛涛\t455467\n脂肪粒\t455468\n洛枳\t455469\n九类\t455470\nBb\t455471\n唐楷\t455472\n紫阳\t455473\n6针\t455474\n姚金易\t455475\nSP2\t455476\n王阳明\t455477\n勃然大怒\t455478\n亲仁\t455479\n小米4S\t455480\nMex\t455481\n程峰\t455482\n六龄牙\t455483\n逾期贷款\t455484\n庆春\t455485\n出入境检验检疫局\t455486\n购入\t455487\n中国医学科学院基础医学研究所\t455488\n博
鳌亚洲\t455489\n公有云\t455490\n更替\t455491\n130元\t455492\n御徽\t455493\n炸酱\t455494\n金外滩\t455495\n维护部\t455496\n吕田镇\t455497\n终极者\t455498\n金十\t455499\n橙黄\t455500\n黄海涛\t455501\n国际快递\t455502\nsql子\t455503\ndongwu\t455504\n咋们\t455505\n核型\t455506\n山西焦煤集团有限责任公司\t455507\n10日内\t455508\n干湿\t455509\n无量纲化处理\t455510\n斜方\t455511\n田家炳中学\t455512\nPD1\t455513\n坏账损失\t455514\n天津市十二重点中学\t455515\n航天史\t455516\n80元\t455517\ndri\t455518\n7760\t455519\n倡议\t455520\n玉蒲团II之玉女心经\t455521\n化验单\t455522\n肖晓\t455523\nMeraki\t455524\n七朵\t455525\n捣乱\t455526\n工法\t455527\n考试机\t455528\nBMO\t455529\nzhiwei\t455530\n小趣\t455531\n湖面\t455532\n用友ERP-U8\t455533\n楓羽靈\t455534\n机器学习库\t455535\n扩音器\t455536\nLopez\t455537\n捕鸟蛛\t455538\n清純\t455539\n阻塞\t455540\nrob\t455541\npron\t455542\n山西农村信用社\t455543\n6225\t455544\n氧化铜\t455545\n兰州理工大学\t455546\n玛丽苏文\t455547\n小将\t455548\n一团糟\t455549\n速运\t455550\n糖果\t455551\nhp1020\t455552\n此笔\t455553\n浦发美国运通白金卡\t455554\nsinb\t455555\n清幽\t455556\n冰河期\t455557\n猎影\t455558\n平安贷款\t455559\n南江大峡谷\t455560\namd64.deb\t455561\n氨基酸\t455562\nHDB\t455563\nZootopia\t455564\n羽绒服\t455565\n剥去\t455566\n农电工\t455567\nMT世界\t455568\n_民福康\t455569\n以太坊钱包\t455570\n途经点\t455571\n缺憾\t455572\n春卷皮\t455573\n海亮集团\t455574\naci\t455575\nPCBA\t455576\n北京耐默公司\t455577\n精装房\t455578\n囚歌\t455579\n金讯\t455580\n总结贴\t455581\n王珣\t455582\n查出\t455583\n全品券\t455584\n.aar\t455585\n归途\t455586\n20公分\t455587\n德清县公共资源交易中心\t455588\n蓝奏\t455589\n后备箱\t455590\nIDW\t455591\n11番\t455592\n究极\t455593\ngarlic\t455594\n热浸锌\t455595\n高息\t455596\npackge\t455597\n430i\t455598\n传送门2\t455599\n闽南古镇\t455600\n西安市地方税务局\t455601\n赛马大亨\t455602\n卡饭论坛\t455603\n职业军人\t455604\n运动\t455605\n钢骨\t455606\n华润万家超市\t455607\n强迫交易罪\t455608\n双喜盈门\t455609\nSize\t455610\n联合国总部\t455611\n626\t455612\n邓宁\t455613\n京津冀一卡通\t455614\nDSGE\t455615\n徐州经济开发区\t455616\nECN\t455617\n风花醉\t455618\n东东东\t455619\nLoadrunner\t455620\n质量投诉网\t455621\nBIOS\t455622\n路用\t455623\n更全面\t455624\n泡学网\t455625\n磁滞\t455626\n第31页\t455627\n酱香型\t455628\n说难\t455629\n601006\t455630\n东亭\t455631\n闺蜜的完美旅行\t455632\nrenewable\t45
5633\n游学者\t455634\n湖南省建筑设计院\t455635\n岳父公\t455636\n专项转移支付\t455637\n弱精\t455638\nJamaica\t455639\n机器学习导论\t455640\n0.5.3\t455641\n央广购物\t455642\nonaka\t455643\n讲故\t455644\n赵丽周润发\t455645\nmegalobox\t455646\n洗标\t455647\n6100\t455648\n极速侠_财经网\t455649\nutf-8_\t455650\n联通网\t455651\n阿尔贝娜\t455652\n姐\t455653\n取点\t455654\nakb48吧_\t455655\n金丝柚木\t455656\n绿色债\t455657\n三基佬\t455658\nluoti\t455659\n假牙\t455660\n隐元阁\t455661\n杨宗保\t455662\n遮天吧\t455663\n上海安徒生童话乐园\t455664\n更昔洛韦\t455665\n北京晨报\t455666\n熊熊\t455667\n司级\t455668\n生育期\t455669\n淋巴细胞\t455670\n瞧不起\t455671\nV2.3\t455672\n泸天化\t455673\nClothes\t455674\n法兰不死队\t455675\n线电压\t455676\n君心\t455677\n构造体\t455678\n奇美\t455679\n玩具秀\t455680\ndisasters\t455681\nFARM\t455682\n手拉式\t455683\n1182\t455684\n王伟东\t455685\n跑分\t455686\n途达\t455687\n第26期\t455688\nkerberos\t455689\nCsze\t455690\n沙令\t455691\n吉吉影音\t455692\n大明律\t455693\n千业\t455694\n零教\t455695\n山东省人民政府办公厅\t455696\n九人\t455697\n体重秤\t455698\nf486\t455699\nuptempo\t455700\n牌王\t455701\n中国青年网\t455702\n十期\t455703\n天才在左疯子在右\t455704\nU-Share\t455705\nPON\t455706\nchateau\t455707\n一个字一个字\t455708\n动态域名解析\t455709\nNY\t455710\n青螺\t455711\npeople\t455712\n湖北十堰市\t455713\n紫金山\t455714\n实权\t455715\n静注人免疫球蛋白\t455716\nlisting\t455717\n浦南\t455718\n0.1毫米\t455719\ndrove\t455720\n泰坦2\t455721\nFoundError\t455722\n深企\t455723\n彩锌\t455724\n伽玛\t455725\n艾迪生\t455726\n中环快速路\t455727\n月歌\t455728\n顺序栈\t455729\n海博会\t455730\n十维\t455731\n1000点\t455732\n百思买\t455733\n姜灵海\t455734\n美国机场\t455735\n创新大道\t455736\n小木屋\t455737\n赊销\t455738\n论调\t455739\n长徒\t455740\n岳庙\t455741\n森绘梨佳\t455742\n低速电动汽车\t455743\nCAESAR\t455744\n天木纯\t455745\nslice\t455746\n4米高\t455747\nTrainee\t455748\n玄武区\t455749\nBel\t455750\n锦峰\t455751\n冤狱\t455752\n上海胸科医院\t455753\n省域\t455754\nauthors\t455755\n白色版\t455756\n两型\t455757\n0818\t455758\n470d\t455759\n60日\t455760\n龙帅\t455761\nBanks\t455762\nMMP\t455763\n淮南网\t455764\nardublock\t455765\n菜蔬\t455766\n肚片\t455767\n磁铁\t455768\n厦大\t455769\n手珠\t455770\n清子\t455771\n2018年4月起\t455772\n阳极氧化铝\t455773\n宋江\t455774\nfaction\t455775\n凯文·史派西\t455
776\n悠空网\t455777\n艾奇学院\t455778\n烤焦\t455779\n总期\t455780\n17年2月\t455781\n宁国府\t455782\n水明\t455783\n四路\t455784\n2000股\t455785\nPinder\t455786\nFixed\t455787\n湖南省人民政府办公厅\t455788\nunicast\t455789\n2018-03-12\t455790\n国寿康宁终身重大疾病保险\t455791\n九二共识\t455792\n工科女\t455793\n2017年12月31日\t455794\nSOAP\t455795\n斜井\t455796\n宁为\t455797\nKTP\t455798\njura\t455799\n第46期\t455800\nsha256\t455801\nWORD2016\t455802\n赵医生\t455803\n结伙\t455804\n出率\t455805\n定量\t455806\n货种\t455807\n布朗大学\t455808\n恩重\t455809\nsecured\t455810\n数据集\t455811\ntarte\t455812\n2.4亿元\t455813\n讲道\t455814\nBarrier\t455815\n冷酷\t455816\n排坛\t455817\n茶陵\t455818\nunidac\t455819\n陶泽如\t455820\nAR-1808S\t455821\n剑油\t455822\n蓝鲨\t455823\nQQ区\t455824\n阿玖\t455825\n8794\t455826\n大观苑\t455827\n图像处理器\t455828\n莫为\t455829\n治校\t455830\n无夜之国2\t455831\n搜藏网\t455832\n易货\t455833\nCoremail论客\t455834\n山舍\t455835\ndiscounts\t455836\n九阴吧\t455837\n左旋肉碱茶多酚片\t455838\n黄化\t455839\n磷酸盐\t455840\nfemme\t455841\n云化\t455842\n膜袋\t455843\nmission\t455844\n银杏木\t455845\n江阴网\t455846\n廊\t455847\n防爆摄像头\t455848\n第二十四条\t455849\n保康县\t455850\n省委书\t455851\n内特\t455852\n王晓佳\t455853\nxang\t455854\n后判\t455855\nn7102\t455856\n九尾狐狸m\t455857\n公民身份\t455858\n港闸区\t455859\nRango\t455860\n八联\t455861\n托车\t455862\nUTL\t455863\nWal\t455864\n今年9月\t455865\n香器\t455866\n避孕环\t455867\n会计信息质量\t455868\n妖兽\t455869\nlayui-table\t455870\n耿马县\t455871\nPMOS\t455872\n竞渡\t455873\n扇\t455874\nclannad\t455875\n苏州大学附属第二医院\t455876\n宫锁珠帘\t455877\nmissile\t455878\n艾尔德利奇\t455879\nghost安装器\t455880\nnafion\t455881\n庐山风景区\t455882\nDeploy\t455883\n本埠\t455884\nEBAY\t455885\n生驹\t455886\nconserdao\t455887\n云架构\t455888\n标准样\t455889\n康年\t455890\nITPUB\t455891\n沈北\t455892\n金鉴\t455893\n侍中\t455894\n磨加工\t455895\ncatfish\t455896\n侠胆\t455897\nyyyymm\t455898\nFTIR\t455899\n_亿品\t455900\n上饶日报\t455901\n状态码\t455902\n99成\t455903\n青蟹\t455904\n柯氏\t455905\n招展\t455906\n断更\t455907\nYau\t455908\n探索版\t455909\nxen\t455910\nelena\t455911\nclosure\t455912\n下卡\t455913\n超英\t455914\n高雄路\t455915\n合服\t455916\n石井街道\t455917\n徐建明\t455918\n维空间\t455919
\n河北政府\t455920\n张轶\t455921\n9道\t455922\nRetouch\t455923\n寄存处\t455924\nd15\t455925\nmedline\t455926\n单值\t455927\n洋车\t455928\n304l\t455929\n昏头\t455930\n徐青\t455931\n十五张\t455932\n炊\t455933\n水法\t455934\n上海闵行区\t455935\n新神探联盟\t455936\nenhancing\t455937\n谈话会\t455938\n昌明\t455939\n例程\t455940\n17分钟\t455941\nmanipulate\t455942\n搜狗新闻\t455943\n验配\t455944\n06号\t455945\n标示符\t455946\nbarlayout\t455947\n重庆市教委\t455948\n八字网\t455949\n甜蜜暴击\t455950\n真空镀膜\t455951\n一公顷\t455952\n古剑奇谭二\t455953\n相聚\t455954\n金穗路\t455955\n广州东站\t455956\n常州电视台\t455957\n布院\t455958\n钛金\t455959\njurlique\t455960\niccavr\t455961\n375马力\t455962\npkl\t455963\n珀薇\t455964\n混声合唱\t455965\n陆良\t455966\n四方达\t455967\n西餐桌\t455968\n1月8日\t455969\n输入板\t455970\n规字\t455971\n数的认识\t455972\n站前广场\t455973\n专讯\t455974\n一拥而上\t455975\n蜂\t455976\nzsw\t455977\nmerkle\t455978\n核雕\t455979\n财务报告分析\t455980\n六眼\t455981\n腹板\t455982\n置业有限公司\t455983\n梁军\t455984\n北大口腔\t455985\n法金\t455986\n顺\t455987\n国务委员\t455988\n鹅城\t455989\n微单成像\t455990\n3GP\t455991\nExplorer\t455992\ndjmax\t455993\n共康路\t455994\n四度\t455995\n中共中央军事委员会\t455996\n卡列尼娜\t455997\n红宝书\t455998\n雇凶\t455999\n半仓\t456000\n静海镇\t456001\n为难\t456002\n杨安\t456003\n万仙阵\t456004\n顾轻舟\t456005\n戴尔\t456006\n小绝\t456007\n5608\t456008\n断桥铝合金窗\t456009\n逛逛网\t456010\n上虞日报\t456011\n绵阳东辰国际学校\t456012\n中央新疆工作座谈会\t456013\n幸福病\t456014\nnigger\t456015\n铝制品\t456016\nCocks\t456017\nMiddleware\t456018\n禹元\t456019\n击毁\t456020\n缉\t456021\n郎朗\t456022\n采购项\t456023\n卡铂\t456024\n健康型\t456025\n光盘版\t456026\n1983\t456027\npbe\t456028\n尤里的复仇\t456029\n火柴头\t456030\n魔忍雪风\t456031\ntga\t456032\n印玺\t456033\nangularJs\t456034\nattached\t456035\nIntuitive\t456036\n健保通\t456037\n荼岩\t456038\nrlc\t456039\ntimi\t456040\n丙申\t456041\n肽酶\t456042\n优选\t456043\nCamille\t456044\n2017年七月\t456045\ndili\t456046\n董鹏\t456047\n心惊肉跳\t456048\nLitecoin\t456049\n20170531\t456050\n江祖平\t456051\ndiyiji\t456052\n浙江省社科联\t456053\nantioxidant\t456054\n树场\t456055\n苏引华\t456056\n潭综合实验区政府\t456057\n千王之王\t456058\n训导员\t456059\n羟丁酸\t456060\n诱捕\t456061\n斑竹园镇\t456062\n标准价\t456063\n
香片\t456064\n超碰在线\t456065\n失稳\t456066\n修益\t456067\nYuyao\t456068\n减速比\t456069\n维度女性网\t456070\n秀米\t456071\n信得过\t456072\nkom\t456073\n新宋吧\t456074\n第五项\t456075\n设在\t456076\n阳江日报\t456077\n凤凰机场\t456078\nMichel\t456079\nupx\t456080\n庞大集团\t456081\nleaders\t456082\n花里\t456083\n张铮\t456084\n在狱中\t456085\n前进一步\t456086\n湖州银行\t456087\n三佳购物\t456088\n义乌市统计局\t456089\nsupervise\t456090\n国泰基金\t456091\n黄阁镇\t456092\n锥角\t456093\n勾人\t456094\n哭闹\t456095\n病位\t456096\n焉\t456097\n给力星\t456098\n保利文化广场\t456099\nSDD\t456100\n东兰县\t456101\nBloguy\t456102\ninvestigation\t456103\n同济医学院\t456104\n|方\t456105\nmsa\t456106\n李虹\t456107\nLemon\t456108\n鱼皇\t456109\n卡篇\t456110\n自攻丝\t456111\n1109\t456112\n万年牢\t456113\nMojang\t456114\ndior999\t456115\n结晶\t456116\n如松\t456117\n上税\t456118\n无锡市第四人民医院\t456119\n启辰t90\t456120\n盘山\t456121\n死馆\t456122\ndtype\t456123\n金叉\t456124\n莲花池公园\t456125\nj\t456126\n丁飞\t456127\n人卫版\t456128\n塔塔集团\t456129\n功效\t456130\n车房\t456131\n异见\t456132\n温都\t456133\nAndromeda\t456134\n下位\t456135\n洋土豪米糕\t456136\n雨燕海马s5\t456137\ncirculating\t456138\n茶话会\t456139\nv1.2.5\t456140\nBinary\t456141\n洛川苹果\t456142\n法制处\t456143\n天安花园\t456144\n郴州\t456145\n定向计划\t456146\n石径\t456147\n李梦阳\t456148\n程咬金\t456149\n加项\t456150\n张建强\t456151\n我不想\t456152\n侠义\t456153\n粗中有细\t456154\n每三年\t456155\n完损\t456156\n友乐\t456157\n卟\t456158\n鲁北化工\t456159\n拼画\t456160\n颜色选择器\t456161\nplayhome\t456162\n胸舞\t456163\n武林盟\t456164\n杰拉尔\t456165\n新百广场\t456166\n华西都市网\t456167\n15\t456168\nriscv\t456169\n海内外\t456170\n20毫安\t456171\n导电银胶\t456172\n金尊神液\t456173\n同党\t456174\n雾影\t456175\nchairman\t456176\n寒暄\t456177\n17.3%\t456178\n开发机\t456179\n论坛\t456180\n8888元\t456181\n微文\t456182\n英国驻华大使馆\t456183\n0.0.0.255\t456184\n一介\t456185\n天和\t456186\n第一遍\t456187\n长沟镇\t456188\n大厂回族自治县\t456189\n梦想合伙人\t456190\n180平米\t456191\n清迈古城\t456192\n安邦\t456193\n乐牌\t456194\nUG8.0\t456195\n高起\t456196\n猫屎咖啡\t456197\n68所\t456198\n前叉\t456199\nffmpeg\t456200\n蔡义江\t456201\n生活环境\t456202\npng_千库网\t456203\nmfc100u.dll\t456204\n需求量\t456205\n普联技术有限公司\t456206\n余枫\t456207\n夺宝奇兵\t456208\n新汉
兰达\t456209\n虹井路\t456210\nqte\t456211\n女起解\t456212\nsku\t456213\n咽喉炎\t456214\n防弹\t456215\n心诀\t456216\n吸附式干燥机\t456217\nDiv+CSS\t456218\n园林机械\t456219\n光纤激光切割机\t456220\n林旭\t456221\n流电\t456222\n凤城东\t456223\n厦门大学附属第一医院\t456224\n神经外科学\t456225\n12月1日起\t456226\nbbs.classic023\t456227\noffice2013\t456228\n失贞\t456229\n十周岁\t456230\n鼓起\t456231\n设计所\t456232\n重症肌无力\t456233\n辽艺版\t456234\n丁丁张\t456235\n伏尔加庄园\t456236\n鲍橒\t456237\n嫉妒心\t456238\n建德19楼\t456239\n调度机\t456240\n张抗抗\t456241\n晚套\t456242\n里氏硬度计\t456243\n川楝子\t456244\n升中\t456245\n笑眯眯\t456246\n潍日高速\t456247\nand\t456248\n氮酮\t456249\n领益科技\t456250\n卡基\t456251\n深圳航空有限责任公司\t456252\n毁誉参半\t456253\nishadow\t456254\n|经\t456255\n刑房\t456256\n辽国\t456257\n英孚\t456258\nswitch-case\t456259\n知春路\t456260\n锦鲤池\t456261\n65天\t456262\n社会主义法\t456263\n亿邮\t456264\nwww.4399.com\t456265\n卡塔尔多哈\t456266\n西城区幼升小\t456267\n吕四港镇\t456268\n银河奥特曼\t456269\n洞见\t456270\n全链\t456271\n传奇登陆器\t456272\n亚眠\t456273\n反刍\t456274\nンズ\t456275\n阿司匹林肠溶片\t456276\n平昌县人民政府\t456277\nMySQL数据库表\t456278\n爱贝国际少儿英语\t456279\nlcr\t456280\n赤龙\t456281\n耍事\t456282\n乔迁之喜\t456283\n醉猿\t456284\nexplained\t456285\n绿痕\t456286\n乌桕树\t456287\n陈咬金\t456288\n医疗机构管理条例实施细则\t456289\n有情有义\t456290\n兴叹\t456291\n蜂鸣声\t456292\n贺刚\t456293\n最后一班\t456294\n气动冲床\t456295\n坐稳\t456296\n交欢\t456297\nfoamposite\t456298\n董易林\t456299\n躯体化\t456300\n六味地黄丸\t456301\nAutowire\t456302\ncou\t456303\n黑眼圈\t456304\n绿橙\t456305\n尼龙管\t456306\n南昌昌北国际机场\t456307\npsycopg2\t456308\n王迅\t456309\n奸情\t456310\n眼机\t456311\n玄默\t456312\n常乐镇\t456313\n邓丽\t456314\n七星殿\t456315\nStanding\t456316\n望远镜\t456317\n卡内\t456318\n佳能ir\t456319\n梦幻群侠传2\t456320\n王锦\t456321\n婚姻观\t456322\n可获得\t456323\nencodeURI\t456324\n培土机\t456325\n第三轮\t456326\nlady网\t456327\nrepr\t456328\n吸烟机\t456329\nwc\t456330\n马村\t456331\n牌员\t456332\n同策\t456333\n新军\t456334\nGhosts\t456335\n抽样调查\t456336\n玩跳一跳\t456337\n人行桥\t456338\ngvim\t456339\n王氏\t456340\n纬地\t456341\nSection\t456342\n顾敏\t456343\n深圳11号线\t456344\n血清蛋白电泳\t456345\nfzzk\t456346\ne览网\t456347\n乔治城大学\t456348\n淮海路\t456349\nlist、numpy\t456350\ne
10\t456351\n青浦吾悦广场\t456352\n3月25日\t456353\n爱智康\t456354\n鄂州机场\t456355\n共余生\t456356\nAdministrator\t456357\ntjs\t456358\n贵阳花果园\t456359\nPerf\t456360\n张暴默\t456361\n印张\t456362\n陌名\t456363\n四大家\t456364\n遍布\t456365\n马钢\t456366\n缔约过失责任\t456367\n草莓奶昔\t456368\n非法吸收公众存款罪\t456369\n1桶\t456370\n新闻学\t456371\n古墓丽影2\t456372\n邵伯镇\t456373\n1755\t456374\n性趣\t456375\nQuincy\t456376\n北京路\t456377\n耐热玻璃\t456378\n苏作\t456379\n网易保险\t456380\n博物馆学\t456381\n球长\t456382\n决战平安京吧\t456383\n日照市区\t456384\nheadfirst\t456385\nh1b\t456386\n背后\t456387\n上饶经济技术开发区\t456388\nDiaz\t456389\n穿越之绝色兽妃:凤逆天下\t456390\n中海花湾壹号\t456391\n穿着\t456392\n双黄线\t456393\n下九流\t456394\n16V\t456395\n失血\t456396\n陈留\t456397\n达芬奇11\t456398\n雨久花\t456399\n会声会影X2\t456400\nWUP\t456401\n易丰\t456402\n曹征路\t456403\nPOP\t456404\ngdiplus\t456405\n拉定\t456406\n易经网\t456407\nwindows命令行\t456408\nanywhere2\t456409\n预编码\t456410\n失语者\t456411\n民生证券\t456412\n石仓\t456413\n通缉令\t456414\n青岛黄海学院\t456415\n大导\t456416\n领款\t456417\n数控火焰切割机\t456418\nmmpi\t456419\n20160813\t456420\nmini版\t456421\n何华\t456422\n发货\t456423\n新座\t456424\n各方\t456425\n素喃\t456426\n水千澈\t456427\n申通快递公司\t456428\n甬人社\t456429\n侯门\t456430\n对子哈特\t456431\n粤倾粤友\t456432\n车量\t456433\n刘永斌\t456434\n05届\t456435\n好特\t456436\n机警\t456437\n超神牡丹亭\t456438\n文昌街道\t456439\n科顺股份\t456440\n中福会幼儿园\t456441\n龙种人\t456442\nTOP10【\t456443\n干法制粒机\t456444\njQuery\t456445\n蜀州\t456446\nAffect\t456447\n洛圣都\t456448\n相贯\t456449\n北京大学新闻与传播学院\t456450\nv5.1.2\t456451\n赫尔佐格\t456452\ngoback\t456453\n尖山\t456454\n晋中职业技术学院\t456455\nCx\t456456\n八折\t456457\n腾逸\t456458\n12辆\t456459\nAbyss\t456460\n题意\t456461\n精装版\t456462\n内部人士\t456463\n2750\t456464\n超乎想像\t456465\n执业医师考试网\t456466\n光环\t456467\nSwim\t456468\n宝林\t456469\n河南省地方税务局\t456470\n太室山\t456471\n美人头\t456472\n正威\t456473\n朱从玖\t456474\n小幺\t456475\niope\t456476\n近照\t456477\n抽血\t456478\n画家村\t456479\n清秋\t456480\n李彤\t456481\n快递费\t456482\n字幕包\t456483\n第一名\t456484\n烟台职业学院\t456485\n8.3亿\t456486\n有终\t456487\n肥害\t456488\n黄面\t456489\n李振良\t456490\n大小王\t456491\n行政处罚决定书\t456492\n移情别恋\t456493\n外墙\t456494\n四
重奏\t456495\n葛粉\t456496\n镇江北\t456497\n巨菌草\t456498\n參考\t456499\n双沟\t456500\n工荒\t456501\n通达信函数\t456502\n亡灵战神\t456503\n理事会\t456504\n张柔情\t456505\n1.7mm\t456506\n火星车\t456507\n有疑\t456508\n雨痕\t456509\n红超的吾记之谈\t456510\n电脑端口号\t456511\n鼠笼\t456512\n听神经瘤\t456513\n老房子\t456514\n铀\t456515\n烟支\t456516\n长兴政府网\t456517\n66元\t456518\n筱惠美\t456519\n2.49\t456520\n绿豆饼\t456521\nAC模玩\t456522\n柱塞阀\t456523\n崔雪莉\t456524\nsheath\t456525\n金寨\t456526\n办事员\t456527\n动物们\t456528\n舔屏\t456529\n孟照国\t456530\n谎\t456531\n保本价\t456532\ndqmsl\t456533\ncloudy\t456534\n考模\t456535\n石家庄传媒网\t456536\n幻奇\t456537\n掩体\t456538\n横距\t456539\nbeam188\t456540\n选项组\t456541\n纱栄子\t456542\n岳池\t456543\n有意注意\t456544\nrape\t456545\n83375335\t456546\n2014.6\t456547\n嘉兴市国土资源局\t456548\n霍乱时期的爱情\t456549\n话别\t456550\n作品名\t456551\n马术\t456552\n广佛江\t456553\n青蒿素\t456554\n乘项\t456555\nsw6\t456556\n基德曼\t456557\n002396\t456558\n6连\t456559\n懂事长\t456560\n放掉\t456561\n15日游\t456562\n华师附中\t456563\n偷香\t456564\n海钓\t456565\n20180215\t456566\n湿式\t456567\n东营职业学院\t456568\n红星国际\t456569\nwan\t456570\n建构筑物\t456571\n天涯丽人\t456572\n蒙特勒\t456573\n西青开发区\t456574\n九牧卫浴\t456575\n证型\t456576\n速达软件\t456577\n社保查询网\t456578\n左面\t456579\n豆腐\t456580\n省总工会\t456581\n铁证如山\t456582\n报事\t456583\nuv机\t456584\n壶口瀑布\t456585\n天娱\t456586\n2036\t456587\n就是我\t456588\n酷\t456589\n计算机文化基础\t456590\n豆包\t456591\n中国服务贸易指南网\t456592\ncust\t456593\n门户\t456594\n有恩\t456595\n森友学园\t456596\nIMC\t456597\nvcard\t456598\n重庆森林\t456599\n371\t456600\n第1列\t456601\n妖精的尾巴\t456602\n1.5小时\t456603\nDQ11\t456604\n拌和站\t456605\n唐人街探案2百度云\t456606\nEmerging\t456607\n新东方培训学校\t456608\nDECORTE\t456609\n94页\t456610\n中国人民解放军火箭军总医院\t456611\n五十斤\t456612\n专用阀\t456613\n金融法院\t456614\n邦交国\t456615\n神灵\t456616\nBam\t456617\n方程\t456618\nsigns\t456619\n生曰\t456620\nBRAVE\t456621\nsear\t456622\n11.12\t456623\n斗鱼嘉年华\t456624\n松花江上\t456625\n我的唇吻不到我爱的人\t456626\n即刻\t456627\nSAVE\t456628\n墨羽\t456629\n闻官军收河南河北\t456630\n姜妍\t456631\n75秒\t456632\n蜜橘\t456633\n寇老西儿\t456634\n福耀玻璃\t456635\n8.9.4\t456636\niVMS-4500\t456637\n南威软件\t456638\nwindows、linux
\t456639\n超冒险\t456640\nSt\t456641\n恁\t456642\n中国军网\t456643\nvtable\t456644\n乘凉\t456645\n队首\t456646\nscad\t456647\n芭拉\t456648\nexel\t456649\n2006\t456650\n永靖\t456651\n公牛\t456652\n消失了\t456653\nstringstream\t456654\n64MB\t456655\n风雨同舟\t456656\n民航路\t456657\n南乡\t456658\n运功\t456659\n弱水\t456660\nw8\t456661\nzegna\t456662\n侃\t456663\n佛山市地方税务局\t456664\n加更\t456665\n69家\t456666\ndifference\t456667\nStockX\t456668\n沅水\t456669\n调考\t456670\n政绩\t456671\n彭德\t456672\n吴式\t456673\n英国湖区\t456674\njianli\t456675\n校园性\t456676\nJohansen协整检验\t456677\n隔膜压力表\t456678\n武进高新区\t456679\n泰州人才网\t456680\n买值\t456681\n强企\t456682\nsfx\t456683\n华为荣耀手环3\t456684\n姜汁\t456685\n12.9寸\t456686\n黄莉新\t456687\ndwd\t456688\n自宫\t456689\n希拉里\t456690\n药士\t456691\n碰头\t456692\n陆夫人\t456693\n茸\t456694\n医疗箱\t456695\n填入\t456696\n月厨\t456697\n元灵\t456698\n三第三\t456699\n心端\t456700\n冰川时代3\t456701\n表面光洁度\t456702\n向阳坊\t456703\n潜山新闻网\t456704\n追猎者\t456705\n矽\t456706\n兰斯吧_\t456707\n楫\t456708\n吐槽大会2\t456709\n毅达\t456710\n育新小学\t456711\n叶佳颐\t456712\n传真查看器\t456713\npackagist\t456714\n代字\t456715\nChinatown\t456716\n屠村\t456717\n网商园\t456718\n乐视max\t456719\n第十二期\t456720\n取机\t456721\n7.51亿\t456722\nqq五笔输入法\t456723\n长春镇\t456724\n翌\t456725\n零班\t456726\nclr\t456727\n镇江房产超市网\t456728\n起亚kxcross\t456729\n挡片\t456730\nGIT\t456731\nEXIF信息查看器\t456732\n学步带\t456733\n自办\t456734\n次序\t456735\n美食杰\t456736\n货币汇率转换计算器\t456737\nhyper-v虚拟机\t456738\n天地会\t456739\n木质素\t456740\n藻\t456741\nmissing\t456742\n能找\t456743\n胡帕\t456744\n中东战争\t456745\n700mm\t456746\n瓦罐煨汤\t456747\n米语\t456748\nDevel\t456749\n极简主义\t456750\n伯克利大学\t456751\n2018-04\t456752\n沛\t456753\n空间页\t456754\n頃\t456755\n欢唱\t456756\n轻集料混凝土\t456757\ngoto\t456758\njyb\t456759\n这一刀\t456760\n称颂\t456761\n卓越\t456762\n木石\t456763\n加油宝\t456764\n生精胶囊\t456765\n托克托\t456766\n臊子面\t456767\n隐身\t456768\ncpd\t456769\n600074\t456770\nOritive\t456771\n30日\t456772\n影剑\t456773\n铁血武林2\t456774\n総領事館\t456775\n神奇宝贝\t456776\n传奇素材网\t456777\n8977xxxx\t456778\n移动硬盘分区\t456779\n20170323\t456780\n嫩草\t456781\n召唤石\t456782\n奇香\t456783\n3.5亿元\t4567
84\n华润漆\t456785\n节期\t456786\n冻死骨\t456787\nv1.4.2\t456788\n燃气报警器\t456789\n感染性腹泻\t456790\necxcel\t456791\n机电一体化\t456792\n放花\t456793\n福民\t456794\n和声学\t456795\n情锁_情锁电视剧\t456796\n提尔\t456797\n变为\t456798\n林劲\t456799\n结核病日\t456800\nconformal\t456801\n微家\t456802\n张富贵\t456803\nkan\t456804\n正序\t456805\ndouzujun\t456806\nminidump\t456807\n鹏业软件\t456808\n江西电视台\t456809\n无限楚汉风云\t456810\n抽钱\t456811\nBigben\t456812\n2.82\t456813\n小苏\t456814\n马头村\t456815\n套组\t456816\n意蜂\t456817\nBINGBIAN\t456818\n形式主义\t456819\n风风\t456820\n徐宁\t456821\nAsyn\t456822\n沈斌\t456823\n本友\t456824\n15.4.1\t456825\n跌破\t456826\n抽屉锁\t456827\n魔人\t456828\n骏捷\t456829\n中宇\t456830\nneet\t456831\n貓\t456832\n瑞狮\t456833\n第16号\t456834\n拉瓦锡\t456835\n名器\t456836\nDigiKey\t456837\n天真可爱\t456838\n朔风\t456839\n吊日\t456840\n小春日\t456841\n模因\t456842\nNetcore\t456843\n一人\t456844\n恐极\t456845\n申请时\t456846\n少先队\t456847\n交替\t456848\n利源\t456849\n惨不忍睹\t456850\n山城镇\t456851\n三忌\t456852\nPeanut\t456853\nyeelight\t456854\n∩\t456855\n顺德家具网\t456856\n黄葛树\t456857\n显隐\t456858\n华杰\t456859\n徐孺子\t456860\n靠谱点\t456861\n九龙园区\t456862\n婚讯\t456863\n刑侦大队\t456864\nHLA\t456865\n2017年11月22日\t456866\n万只\t456867\n钟云龙\t456868\n东中街\t456869\n齿锥\t456870\n挑拨离间\t456871\n限流\t456872\n黄泉路\t456873\n姑苏\t456874\n凯石\t456875\n经济史\t456876\n住宿费\t456877\n阳煤化工\t456878\nasians\t456879\n法线方程\t456880\n于德\t456881\nzhutiqu\t456882\n旋度\t456883\n深圳市南山区西丽人民医院\t456884\nWAV,flac\t456885\n无盘服务器\t456886\n死侍2\t456887\n民诉法\t456888\n支付码\t456889\n大航\t456890\n磨碟\t456891\n12_\t456892\n见人心\t456893\n克希雷姆\t456894\n处理员\t456895\n宫园薰\t456896\n剪羊毛\t456897\n胡灵\t456898\n33亿元\t456899\n台湾往事\t456900\n湖北招生考试网\t456901\n卧式\t456902\n强民\t456903\n思議\t456904\n李干\t456905\n平顺县\t456906\nNocturne\t456907\n希咲エマ\t456908\n美机\t456909\n都江堰市\t456910\n猪流感\t456911\n20140108\t456912\n四川政府\t456913\n后帘\t456914\nmicky\t456915\n承德围场\t456916\n书林\t456917\n新群\t456918\n加注\t456919\n自然辩证法\t456920\n人鬼\t456921\n万年寺\t456922\n聚通\t456923\n沈俊\t456924\nQSV格式\t456925\nvivoX20Plus\t456926\n斗鱼\t456927\n丹西街道\t456928\n最大限度\t456929\n矢量_素材中国sccnn.com\t4569
30\n德谦\t456931\n坑娘\t456932\nmagna\t456933\n广州美院\t456934\n江湖大道\t456935\n中国眼镜网\t456936\nVI设计公司\t456937\n糯种\t456938\n320k\t456939\n吕祖\t456940\n徐新华\t456941\n罗纳河谷\t456942\ncronolog\t456943\n西毒\t456944\n蝴蝶酥\t456945\n核膜\t456946\n生育\t456947\n4块\t456948\n手足口病疫苗\t456949\n奥园誉湖湾\t456950\n四把\t456951\n小马宋\t456952\n37路\t456953\n雪魄\t456954\n未央网\t456955\nPatcher\t456956\n能得到\t456957\n9度\t456958\n吉利大学\t456959\n大桥\t456960\ncornell\t456961\n4.5秒\t456962\nflash插件\t456963\nNhibernate\t456964\n会厌\t456965\n歌唱\t456966\nercp\t456967\nporosity\t456968\n工人物语7\t456969\n集热\t456970\nzstack\t456971\n叛逆性百万亚瑟王山猫\t456972\n立花オミナ\t456973\n自荐书\t456974\n新民诉法\t456975\njifen\t456976\n阿莫电子论坛\t456977\n怪物猎人x\t456978\nCP1E\t456979\n朋友的姐姐\t456980\nSlots\t456981\n姜波克\t456982\n胡宗强\t456983\n位带\t456984\n知情\t456985\n学生资助管理中心\t456986\n锤头\t456987\n大连市人力资源和社会保障局\t456988\n邹区镇\t456989\n火勺\t456990\n托管银行\t456991\nborder-color\t456992\n梳理机\t456993\n四基\t456994\n等不到\t456995\n糟糕\t456996\n船外机\t456997\n风险管\t456998\n上将\t456999\n抗性糊精\t457000\n霰粒肿\t457001\n王勇智\t457002\n庆城\t457003\n卢飞\t457004\n一枕\t457005\n略阳县人民政府\t457006\nstm32f103rct6\t457007\nflexlm\t457008\n广发分享日\t457009\n建工路\t457010\n殷墟\t457011\n蓝钢\t457012\n昼伏夜出\t457013\nskating\t457014\n山东高速男篮\t457015\n东宫西宫\t457016\n希灵\t457017\n基质金属蛋白酶\t457018\n青壮年\t457019\n0.5a\t457020\n语类\t457021\n国有建设用地使用权出让合同\t457022\n画图形\t457023\nYves\t457024\n火法\t457025\n木耳\t457026\ncurriculum\t457027\n45平方米\t457028\n雪肌\t457029\n海州区\t457030\n杭州江干区政府网\t457031\n开普勒定律\t457032\n河洛\t457033\n天涯杂谈\t457034\n白玉菇\t457035\n洗脸池\t457036\nanko\t457037\n艳阳\t457038\n防汛抗旱指挥部\t457039\n埃美柯\t457040\n你我他\t457041\n十二号\t457042\n清雨\t457043\n林然\t457044\n保险库\t457045\n神农\t457046\n华诺\t457047\nIH\t457048\n3段\t457049\n20140324\t457050\n悦单\t457051\n任职证明\t457052\n上海市监察委员会\t457053\nSEP\t457054\nbby\t457055\n束管\t457056\n母夜叉\t457057\nElsevier\t457058\n病媒\t457059\n陈丹舒马赫\t457060\n夏小虎\t457061\n游民\t457062\nTASKalfa\t457063\n车史\t457064\n团鬼\t457065\n来士普\t457066\n曾厝垵\t457067\nDragonDean\t457068\n企商网\t457069\n武汉百捷\t457070\n毛状\t457071\n中国电子科技集团公司第十
研究所\t457072\n毒辣\t457073\n十二件\t457074\nword2010文\t457075\n忍者乱太郎\t457076\nCCFA\t457077\n济宁一中\t457078\nJavaLeader\t457079\n张思睿\t457080\n悬空\t457081\n催化\t457082\n假猪\t457083\n汉邦\t457084\n油冷机\t457085\n几圈\t457086\n29个月\t457087\nmart\t457088\n天帷\t457089\n7days\t457090\nwin10x64\t457091\n延安政府网\t457092\nFARO\t457093\n北京遇上西雅图\t457094\n奇虎360公司\t457095\nture\t457096\n海思半导体\t457097\n三星m2070\t457098\n雷老虎\t457099\n1000M\t457100\n战争史\t457101\nExtinction\t457102\n67号\t457103\n蛋糕物语\t457104\nbindings\t457105\n升船机\t457106\n4.5cm\t457107\n任天野\t457108\n无天\t457109\n杉果\t457110\n翡翠谷\t457111\n金刻羽\t457112\nBY2\t457113\n抚仙湖\t457114\n成都环球中心\t457115\n一达通\t457116\n贵州省地方税务局\t457117\n梅兰日兰\t457118\n心源性哮喘\t457119\n任政\t457120\n废铜\t457121\nr11s\t457122\n新航道托福\t457123\n250GB\t457124\n财务会计\t457125\nclients\t457126\n曲曲菜\t457127\n张惠萍\t457128\n夜歌\t457129\n宁国市人民政府\t457130\nTCG\t457131\n任字\t457132\n铲\t457133\nleiOOlei\t457134\n斋号\t457135\n急转直下\t457136\n六煞\t457137\n安海\t457138\nDesigns\t457139\n景观石\t457140\n杰魔\t457141\nmochi\t457142\ne420\t457143\n鼎信诺\t457144\n威尔第\t457145\nPVB\t457146\nrom刷机包\t457147\n為什麼\t457148\n8千克\t457149\n司马谏\t457150\n2890\t457151\nm427dw\t457152\n斯特拉文斯基\t457153\njbs\t457154\n升段\t457155\n王世杰\t457156\n海草\t457157\n替\t457158\n孟子曰\t457159\n精斗云\t457160\n多用表\t457161\n加里\t457162\n闯关东\t457163\n国美集团\t457164\n哺乳\t457165\n38层\t457166\n留别\t457167\nADB\t457168\n跳蚤市\t457169\n纺织助剂\t457170\n法魔\t457171\n一候\t457172\n绿色光明网\t457173\n心火牧\t457174\n西班牙国家队\t457175\n青春草\t457176\n朗新\t457177\nFOR\t457178\n定身\t457179\n前期\t457180\n鼠标手型\t457181\n秦九韶\t457182\nDPReview\t457183\n物联网博览会\t457184\n终生\t457185\n羊男\t457186\n换药\t457187\n男校\t457188\nAirline\t457189\n数列题\t457190\n八一大道\t457191\n读后感作文|\t457192\ncalibration\t457193\n都市主义\t457194\n汇龙镇\t457195\ncarve\t457196\n地龙蛋白\t457197\n猪蹄\t457198\n3月11日\t457199\n惊骇\t457200\n长桥街道\t457201\nPS派\t457202\nvictim\t457203\n魔术\t457204\n深圳城中村\t457205\n实战演习\t457206\n游乐\t457207\n第一场\t457208\nbeauties\t457209\n那威\t457210\nMEN\t457211\n休业\t457212\nthinkcentre\t457213\n固原市政府\t457214\n安全生产信息网\t4572
15\n金子山\t457216\n眼缘\t457217\n小烟\t457218\n支付宝花呗\t457219\n过度医疗\t457220\n拓步\t457221\nfee\t457222\n抛弃\t457223\n三国无双7\t457224\n帽子女\t457225\noutsider\t457226\n胸有成竹\t457227\n牧场物语吧\t457228\n华夏银行股份有限公司\t457229\nWebhook\t457230\n20nm\t457231\nManaged\t457232\n互邦\t457233\n贴图\t457234\nCert\t457235\n性片\t457236\nSmashing\t457237\n迈古玛\t457238\n市妇联\t457239\n大撸\t457240\n南京机场\t457241\n钟公庙街道\t457242\nbuddy\t457243\n邱淑贞\t457244\nMarathon\t457245\n一模一样\t457246\n腐烂国度2\t457247\n韩宝\t457248\n1000px\t457249\n陈金虎\t457250\n火狐\t457251\nf2fs\t457252\n海上牧云记\t457253\n武装色\t457254\n全军突击\t457255\n光熙\t457256\n2.3t\t457257\n有疑问\t457258\nLand\t457259\n阿米卡星\t457260\ncentered\t457261\n米洛斯\t457262\n忘食\t457263\n贵广网络\t457264\n5nd音乐网\t457265\n法宣在线考试题库\t457266\n飞霜\t457267\n杂醇油\t457268\n洲际\t457269\n飞行员\t457270\n谢家湾小学\t457271\n刷刷\t457272\n劲度系数\t457273\n杰视帮\t457274\n速报\t457275\n梁野山\t457276\n进进\t457277\n鲍森\t457278\n肌肉网\t457279\n十六\t457280\n洗炼\t457281\n淫梦\t457282\n融创天朗南长安街壹号\t457283\n第25集\t457284\n双儿\t457285\n铝塑管\t457286\n汽车产经网\t457287\n起家\t457288\n日喀则\t457289\nquickset\t457290\n复旦大学\t457291\n煮酒论英雄\t457292\n电影天堂网\t457293\ngenes\t457294\nDoraemon\t457295\n法语学习网\t457296\n新芽\t457297\n夏鹏\t457298\n安丰镇\t457299\n爱丝\t457300\n盘桓\t457301\n王财贵\t457302\n太古致我们终将逝去的青春\t457303\nclass01\t457304\n成都东站\t457305\n八股\t457306\n全音\t457307\n拳皇小游戏\t457308\n曹氏\t457309\n帕提亚\t457310\n紫罗兰\t457311\n罗艺恒\t457312\nGlyphicons\t457313\n富力\t457314\n论文答辩\t457315\n下水\t457316\n圣嘉新\t457317\n张龙\t457318\nAion\t457319\n个人电脑\t457320\n江苏银行\t457321\n笔记\t457322\n热弯\t457323\n案管\t457324\n神田光\t457325\n电圆锯\t457326\nspecially\t457327\n玉莹\t457328\n古琴台\t457329\n波西杰克逊\t457330\n能天使\t457331\n斑目\t457332\nsmba\t457333\ngastro\t457334\n顺美\t457335\n泥巴\t457336\n牌照\t457337\n百得利\t457338\n做字\t457339\n职业病危害告知卡\t457340\n末段\t457341\n标准数\t457342\n4-6个月\t457343\n出言不逊\t457344\n冠军欧洲\t457345\n王小蒙\t457346\n黄志华\t457347\n潘毅\t457348\n深红\t457349\n唐诗逸\t457350\n害己\t457351\n老龄\t457352\n花满楼\t457353\n青春文学-读览天下\t457354\n2017年4月\t457355\n花衣裳\t457356\nHainan\t457357\n溯溪\t457358\n第59期\t457359\n螺纹法兰\t
457360\n8000000\t457361\n回形\t457362\n子爷\t457363\n自由篮球吧_\t457364\nFILCO\t457365\n中华人民共和国科学技术部\t457366\nSUNNY\t457367\ncapless\t457368\n电眼\t457369\nX射线衍射仪\t457370\n我是歌手第二季\t457371\n管教\t457372\n中片\t457373\n269部\t457374\nprospects\t457375\n改房\t457376\n清淡\t457377\n中国万达集团\t457378\niso14001\t457379\n如意路\t457380\n刘基\t457381\nCapri\t457382\nqq个性_无忧考网\t457383\n勇\t457384\n鄢颇\t457385\n分身\t457386\n鲁南\t457387\n新奥股份\t457388\n智库\t457389\nA104000\t457390\n子孙\t457391\n元宝山\t457392\n呆萌\t457393\n肯达尔\t457394\nSMT贴片机\t457395\nthp\t457396\n猫饭\t457397\n储物间\t457398\n人民医\t457399\n瑛姑\t457400\n氯丁\t457401\npowerpoint2007\t457402\nAG超玩会\t457403\n10栋\t457404\n10点\t457405\n十三五期间\t457406\n37寸\t457407\n创投圈\t457408\n最后的武士\t457409\n2151\t457410\nhp5820\t457411\n北朝鲜\t457412\n蚁后\t457413\n新华小区\t457414\n冲动型\t457415\n领唱\t457416\n吸附量\t457417\n不可一世\t457418\n国家级新区\t457419\n不能说的秘密\t457420\n老逼\t457421\nabundant\t457422\n心内科\t457423\n电压调节器\t457424\n滑跑\t457425\n高德地图SDK\t457426\n端户\t457427\n百性\t457428\n入射\t457429\n钢牌\t457430\n英杰之诗\t457431\n国际马拉松赛\t457432\n浙江电视台\t457433\n部首\t457434\n杏花村汾酒\t457435\n生活方式\t457436\n东京岛\t457437\n阅微草堂笔记\t457438\n5606\t457439\n株洲县\t457440\n段子\t457441\n第95\t457442\n这些日子\t457443\n都市路\t457444\n发点\t457445\n天魔劫\t457446\n永记生态园\t457447\nLarge\t457448\n木莲\t457449\n误宠\t457450\n腺苷\t457451\n74.com\t457452\nlumia830\t457453\n微信平台公众号\t457454\n绝命\t457455\n创世\t457456\n爬楼机\t457457\n第三册\t457458\nPolymers\t457459\ngoldman\t457460\n潘潘猫\t457461\n技力\t457462\n宣誓\t457463\nUnipus\t457464\n炭火\t457465\nqualities\t457466\n江夏庙山\t457467\n达信\t457468\n中韩产业园\t457469\n无需\t457470\nhbulider\t457471\n密友\t457472\n标三实\t457473\nTribute\t457474\n包管理器\t457475\n黑莉\t457476\n民国二年\t457477\nLIE\t457478\n抄袭\t457479\n连接器\t457480\nlogistics\t457481\n福特猛禽\t457482\n英吉沙\t457483\nWHOIS\t457484\n470D\t457485\n82\t457486\n魂晶\t457487\n十二指肠球部溃疡\t457488\n博柏利\t457489\n祝星\t457490\n大陆桥\t457491\n马犬\t457492\nrained\t457493\n守身\t457494\nrequire_once\t457495\n良知\t457496\n全性\t457497\n年终\t457498\nArctic\t457499\n亲如\t457500\n科莱特\t457501\n薄伽梵歌\t457502\n睡意\t
457503\ngetline\t457504\nSkies\t457505\n第十二周\t457506\nMiyuki\t457507\n温州大学\t457508\n愿意\t457509\n涛光\t457510\nwin7系统c盘\t457511\n13层\t457512\n鹿嘴山庄\t457513\n华强南\t457514\n桢楠\t457515\n胜高酒店\t457516\n中国文联出版社\t457517\n风来\t457518\n朝阳东路\t457519\n才德\t457520\nlorna\t457521\n水能载舟\t457522\n国家邮政总局\t457523\n甘婷婷\t457524\n西南师大\t457525\nfld\t457526\n桃夭夭\t457527\n几级\t457528\n印象主义\t457529\n地铁盘\t457530\n鼢鼠\t457531\n购车贷款计算器\t457532\n心阳\t457533\n老田\t457534\n黄风怪\t457535\n180公里\t457536\n惠州学院\t457537\n忠魂\t457538\n爱国路\t457539\n哈弗H5\t457540\n随笔\t457541\nzhjahch\t457542\n郁闷\t457543\n金墉\t457544\n许斌\t457545\n西集网\t457546\n广东省人民政府法制办公室\t457547\n股间\t457548\n爱服\t457549\nactivator\t457550\n二头\t457551\n贝亲\t457552\n47个\t457553\nsouthern\t457554\n李春利\t457555\n岳氏\t457556\n水信玄饼\t457557\nvisage\t457558\nStrangers\t457559\n纳利\t457560\n朱丽\t457561\n宜昌火车站\t457562\n一万毫安\t457563\n烽火戏\t457564\n凝炼\t457565\n吃客\t457566\n第56期\t457567\n增强器\t457568\nISBN\t457569\nQWebEngine\t457570\n夜书\t457571\nDepreciation\t457572\n棋牌游戏服务器\t457573\nUV光\t457574\n冷轧机\t457575\n条卡\t457576\n达尔优键盘\t457577\n变应性鼻炎\t457578\n动植物油\t457579\nhype3\t457580\n毁灭战士\t457581\nveracrypt\t457582\n禹王台区\t457583\nsalsa\t457584\n伊曲康唑胶囊\t457585\n深圳幼儿园\t457586\nVanilla\t457587\n速降\t457588\nuitext\t457589\n金庸群侠传S\t457590\n江北新区\t457591\n福建星网锐捷通讯股份有限公司\t457592\n20世纪90年代\t457593\n华润国际社区\t457594\n无油空压机\t457595\n商洽\t457596\n白冰冰\t457597\n配套费\t457598\npolitical\t457599\n末流\t457600\n暗影精灵2\t457601\n青城\t457602\n卡纳\t457603\nSAO\t457604\n南阳\t457605\nGPUS\t457606\nXYZ\t457607\n淮安外国语学校\t457608\n饺子机\t457609\n叶璇\t457610\n金方圆\t457611\n李司棋\t457612\npcc\t457613\n临沂市国土资源局\t457614\n孝文\t457615\n欢乐园\t457616\n8声\t457617\n性格外向\t457618\n诉讼保全\t457619\n分县\t457620\n供不应求\t457621\n古雷网\t457622\n冒险岛恶魔复仇者\t457623\n物极必反\t457624\n钢铁侠3\t457625\n最终消费支出\t457626\nPain\t457627\n迪亚特洛夫事件\t457628\n伊莉丝\t457629\n马思维\t457630\n口金\t457631\nSKETCH\t457632\n颜射\t457633\nOneToMany\t457634\n偏远\t457635\n荣禄\t457636\nving\t457637\n192.168.3.98\t457638\n70w\t457639\npm3\t457640\ncs1.6吧\t457641\n人人读书网\t457642\n优酱\t457643\n
cupcake\t457644\n青岛啤酒博物馆\t457645\n可见过\t457646\n林荫路\t457647\n谷物\t457648\n利基\t457649\n中博\t457650\n亚诺\t457651\n乐哥\t457652\n农产品地理标志\t457653\n婴儿床\t457654\nwatir\t457655\n省委省政府\t457656\n人工学园\t457657\n楚汉传奇\t457658\n镇尺\t457659\n豹弩\t457660\n卢浮宫博物馆\t457661\n合不合适\t457662\n德语学习网\t457663\n农村土地承包经营权确权登记颁证\t457664\n黄大仙灵\t457665\ne网\t457666\nShan\t457667\n240g\t457668\n太阳城\t457669\n索尼a7\t457670\nAre\t457671\n大商机\t457672\n映日荷花别样红\t457673\n茅草屋\t457674\n碱性食物\t457675\n黄金档\t457676\n科脉\t457677\n失业登记证\t457678\n张子枫\t457679\n980Ti\t457680\n珍珠贝\t457681\n青鸟\t457682\n劲舞团\t457683\n打碟\t457684\nlab\t457685\n火德\t457686\n联欢\t457687\nStripes\t457688\nvfat\t457689\n音阙诗\t457690\n妊娠纹\t457691\n博弈论\t457692\n富得利\t457693\n易城\t457694\n如雨\t457695\nUp\t457696\nMoore\t457697\n山区\t457698\n商品房买卖合同备案\t457699\n封单\t457700\nfftw\t457701\n美花\t457702\n1英寸\t457703\n上天入地\t457704\n4单\t457705\nenthusiasm\t457706\ninertia\t457707\n康乔\t457708\n超过一年\t457709\n复制集\t457710\nbury\t457711\nahc\t457712\ne6230\t457713\n铁将军\t457714\n鸡眼机\t457715\n流式细胞术\t457716\n吸铁石\t457717\n日星\t457718\n拉帝亚斯\t457719\n山东旅游职业学院\t457720\n不在江湖\t457721\n吴霜\t457722\n山东铝业职业学院\t457723\n万象府台\t457724\n专属经济区\t457725\n叠螺机\t457726\nCLEVO\t457727\n牧夫\t457728\n被遗忘的时光\t457729\n海东\t457730\nRX100\t457731\n松\t457732\n反比例关系\t457733\n燃油税\t457734\n线业\t457735\nCosme\t457736\n一护\t457737\nhellobike\t457738\nsoundflower\t457739\n夏莲居\t457740\n电路分析基础\t457741\n黄佳\t457742\n宝柏\t457743\n暑期学校\t457744\n电平转换电路\t457745\nSFP+\t457746\n巯基\t457747\nWEDDING\t457748\n爱丽丝学院\t457749\n刘庆峰\t457750\n称骨算命\t457751\n长荣股份\t457752\nEnforcement\t457753\n4点\t457754\n电抗率\t457755\n3029\t457756\nCB自行车网\t457757\n万华化学集团股份有限公司\t457758\nSuperCharge\t457759\nclifford\t457760\n蛮干\t457761\n石路\t457762\n十三款\t457763\nQQ表情\t457764\n多模\t457765\n交行手机银行\t457766\n花道\t457767\n六方会谈\t457768\nupenn\t457769\n2017-08-30\t457770\n探案\t457771\nComparison\t457772\nprague\t457773\n肖红\t457774\n区疾控中心\t457775\n生物园\t457776\n开心\t457777\n诸公\t457778\n暴降\t457779\n7个小时\t457780\n散尾葵\t457781\n5cm\t457782\n锁闭\t457783\n无臭\t457784\n公务员录用体检通用标准\t
457785\n划转\t457786\n鱼圆\t457787\nmanta\t457788\n星明\t457789\nTCPIP\t457790\n天安数码城\t457791\nkayla\t457792\nenclosure\t457793\nTopik\t457794\n2.5.10\t457795\n43%\t457796\natapi\t457797\n釜山国际电影节\t457798\n财树\t457799\n爱玉珠宝网\t457800\n南宁铁路局\t457801\n央视315\t457802\nMP4高清\t457803\nentityframework\t457804\n凸起物\t457805\n.run\t457806\n枇杷果\t457807\ncx20\t457808\n错误框\t457809\n支树平\t457810\n碌\t457811\n断眉\t457812\n前十名\t457813\n卿本佳人\t457814\n雨竹\t457815\n张秀\t457816\n消气\t457817\n/head\t457818\n谢长廷\t457819\n4815\t457820\n七十四\t457821\n第十层\t457822\n6G+128G\t457823\n让爱\t457824\nAS400\t457825\n健行\t457826\n升级品\t457827\n赵丹丹\t457828\n黄少祺\t457829\n请假单\t457830\n黄婷\t457831\nString,Object\t457832\nMasked\t457833\n村服\t457834\nSigmoid函数\t457835\nQTS\t457836\n预见性\t457837\n军民\t457838\n川汇\t457839\n镜花\t457840\n华为magic\t457841\n流放之路猎魔\t457842\n康辉\t457843\n四列\t457844\n洛阳牡丹花会\t457845\n起落架\t457846\n立案侦查\t457847\n发酵液\t457848\n三套\t457849\nCVBS\t457850\n郭晓东\t457851\n恒水中学\t457852\n甘肃省质量技术监督局\t457853\n失信被执行人\t457854\n四川省医院\t457855\nOCD\t457856\n巡逻艇\t457857\n汉特\t457858\n好像\t457859\n线程堆栈\t457860\n亳州网\t457861\n矿石机\t457862\n撰稿人\t457863\n逢青\t457864\n开端\t457865\nAM335X\t457866\n丁勇\t457867\n成果集\t457868\n电信大厦\t457869\n超游世界\t457870\n香泉\t457871\n1069019106\t457872\n北广\t457873\n138号段\t457874\n黑莓Priv\t457875\n佛山市财政局\t457876\n秃\t457877\n锡澄路\t457878\njì\t457879\n凤九\t457880\n国家市场监管总局\t457881\n0.06%\t457882\navfun\t457883\n张子栋\t457884\n测试架\t457885\nStake\t457886\nyuanma\t457887\n福州市市场监督管理局\t457888\n上海三甲医院\t457889\n秀站网\t457890\n第54\t457891\n独三\t457892\n天水南站\t457893\n并发量\t457894\n版权费\t457895\ndistinction\t457896\n爱玉\t457897\n面域\t457898\n佳能MG2580S\t457899\n安全生产监督管理局\t457900\nAuth\t457901\n尤伦卡\t457902\n郑煤\t457903\n恒大帝景\t457904\nfacesitting\t457905\n鼠神经\t457906\n心者\t457907\n艳照\t457908\n曹娥\t457909\nDonuts\t457910\npycahrm\t457911\n芝芝\t457912\n几十万条\t457913\nncc\t457914\n奶娘\t457915\n应会\t457916\n折流板\t457917\n任意值\t457918\nsimpson\t457919\n靳先生\t457920\n新异\t457921\n追击战\t457922\n博地影秀城\t457923\n原型机\t457924\n沙威\t457925\n劣化\t457926\nlolo\t457927
\n沙坡\t457928\n新剑侠传奇\t457929\nflipboard\t457930\n吴敬琏\t457931\n春分\t457932\n天主教徒\t457933\n谱表\t457934\n播控\t457935\n欣和\t457936\n白版\t457937\n600238\t457938\n光伏公司\t457939\n5.7.19\t457940\n企务\t457941\n周正东\t457942\n广州消防\t457943\n泌尿科\t457944\n杭州运河\t457945\n280号\t457946\n风味发酵乳\t457947\n商旅文\t457948\n山特UPS\t457949\n微物镜\t457950\n钢锯\t457951\n偃\t457952\n博鳌超级医院\t457953\nzipper\t457954\n河北建设集团\t457955\n导航助手\t457956\n541\t457957\n私匙\t457958\nstrace\t457959\n会面\t457960\n照\t457961\n歌声与微笑\t457962\n汇博网\t457963\naac\t457964\n七年级道德与法治下册\t457965\n天上掉下个猪八戒\t457966\n天府大道\t457967\n欧美\t457968\n黄飞\t457969\n酒色\t457970\n邓\t457971\n速播\t457972\n教民\t457973\n起舞\t457974\n微信群发助手\t457975\nSDC\t457976\n浦发\t457977\n菏泽市立医院\t457978\n健帆\t457979\nX51\t457980\n开发点滴\t457981\n王智勇\t457982\nEmEditor\t457983\n知法犯法\t457984\n蔡老师\t457985\n卡莉·克劳斯\t457986\n1家\t457987\npanie2015\t457988\n蟹肉\t457989\nPentium\t457990\n11.19\t457991\n匕首\t457992\n洞桥镇\t457993\n归零\t457994\n浅白\t457995\n115.com\t457996\n社会效益\t457997\n成就展\t457998\nShazam\t457999\n呋塞米片\t458000\n费尔南多\t458001\n第90集\t458002\n阿青\t458003\n全国人大常委会副委员长\t458004\n早7点\t458005\n1963\t458006\n第五周\t458007\n覃沐曦\t458008\nAUTOSAR\t458009\nペルソナ5\t458010\n配音\t458011\n山东省科技厅\t458012\nPeas\t458013\n龙溪街道\t458014\n西南财经大学出版社\t458015\n家有喜事\t458016\nIBD\t458017\n热闹\t458018\n读书心得\t458019\n昙花\t458020\n万蛇\t458021\n1750\t458022\n立林\t458023\n县城管局\t458024\n花洒\t458025\n天池竞赛\t458026\n此物\t458027\n强大\t458028\nlibboost\t458029\n硬盘位\t458030\n轮\t458031\n辉石\t458032\n小米电视盒子\t458033\n中华人民共和国个人所得税法实施条例\t458034\nEV450\t458035\n电信光猫华为\t458036\n斗场\t458037\nnorflash\t458038\nClinique\t458039\n1522nf\t458040\n特例\t458041\nvince\t458042\n滨江苑\t458043\ncad文件\t458044\n逻辑树\t458045\n嗨秀\t458046\n伊人睽睽\t458047\nCsv\t458048\n华仪电气\t458049\n烧退\t458050\nEXE\t458051\nJavaSwing\t458052\n赣州站\t458053\n4130\t458054\n我们不一样\t458055\n企贸网\t458056\n第一份\t458057\n闪避\t458058\n烤场\t458059\n沟边\t458060\n沈抚\t458061\n基底细胞癌\t458062\n流食\t458063\n致辞\t458064\nMODULE\t458065\n程愫\t458066\n扁机\t458067\n入魔\t458068\n上海美莱\t458069\nsheet表\t458070\nDTL\t458
071\nROTK\t458072\n红发香克斯\t458073\n280ml\t458074\nretailer\t458075\n园形\t458076\n2014一级\t458077\nKnives\t458078\n酒酿\t458079\n开立\t458080\nた\t458081\nVelodyne\t458082\n辽西北\t458083\n64000\t458084\n美乐乐家居网\t458085\n描述性分析\t458086\n5934\t458087\n海南省公安厅\t458088\n可比克\t458089\n冲上云霄2\t458090\nRoy\t458091\n日车展网\t458092\n中捷\t458093\n辞海\t458094\n董监事\t458095\n流鼻\t458096\n玲珑网游加速器\t458097\nliving\t458098\n手自一体变速器\t458099\nClayton\t458100\n复分析\t458101\n中国诚通控股集团有限公司\t458102\n白素贞\t458103\n佟亚丽\t458104\n如神\t458105\nCanon佳能\t458106\nnet4.0\t458107\n直来直去\t458108\n最新型\t458109\n5118\t458110\nFIFAonline3\t458111\nqiuxia\t458112\n王海洋\t458113\n寇马克\t458114\n测绘院\t458115\nreconstruction\t458116\nCHOW\t458117\n六祖慧\t458118\n武昌理工学院\t458119\n70期\t458120\nw764位\t458121\nume\t458122\nOptics\t458123\n伯克级驱逐舰\t458124\n羊子\t458125\nzjy\t458126\n斯美塔那\t458127\n恋爱史\t458128\n增值税纳税义务\t458129\nGnome\t458130\nSELECT\t458131\n四方宇\t458132\nHUC\t458133\nstalk\t458134\n边包\t458135\n西鲁\t458136\n学清路\t458137\n孕妇餐\t458138\n刘海燕\t458139\n如月真绫\t458140\n趁机\t458141\n青岛啤酒厂\t458142\n半音阶口琴\t458143\n绥化学院\t458144\n遮罩\t458145\n金水桥\t458146\n锐力\t458147\n马蔺\t458148\n芬吗通\t458149\nISK\t458150\n一视同仁\t458151\n从小到大\t458152\n搜狗字媒体\t458153\n软铜\t458154\nextella\t458155\n东方日升新能源股份有限公司\t458156\n王洪章\t458157\n冲田总悟\t458158\n驰援\t458159\nNOISEY\t458160\n抛盘\t458161\nSmarty\t458162\n嫡女策\t458163\n君马\t458164\n斧正\t458165\n杜阮\t458166\n二十四史全译\t458167\n目击者\t458168\nADM\t458169\n赣州港\t458170\n应承担\t458171\n廻\t458172\n中国国家数字图书馆\t458173\n黄龙旗\t458174\n60毫升\t458175\nfranc\t458176\n淮北相山区\t458177\n扣条\t458178\n吉兰\t458179\n8块\t458180\n博采众长\t458181\n只能\t458182\nLOL英雄联盟_巴士LOL\t458183\nad6.9\t458184\n欢声笑语\t458185\n我的梦想\t458186\n刘中民\t458187\n安置房\t458188\n金粒餐\t458189\n白马股\t458190\n500mb\t458191\n路刷\t458192\n松树皮\t458193\n辣品\t458194\n食品案\t458195\n赶得上\t458196\n91岁\t458197\n党建在线\t458198\n77分\t458199\n一房二\t458200\n粉漆\t458201\n浙江日报报业集团\t458202\n7处\t458203\n过滤池\t458204\n第几章\t458205\n灼烧\t458206\n核性\t458207\nRein\t458208\n甲肝\t458209\n永恒不变\t458210\n2.5版\t458211\n薛仁贵\t458212\nHot\t458213\
n汉乐府\t458214\n限值\t458215\n魁省\t458216\n瘦虎肥龙\t458217\n2320\t458218\n台州市中心医院\t458219\n绿肥红瘦\t458220\n阿里小宝卡\t458221\n亚莎\t458222\n小胡\t458223\n墟\t458224\n郑州路\t458225\n脱贫致富\t458226\n首联\t458227\n小说史\t458228\n大道之行\t458229\n很多次\t458230\nHD3\t458231\n40帧\t458232\nPandora\t458233\n快感\t458234\n必经\t458235\n2007年12月\t458236\n贺天举\t458237\n杨邱\t458238\n中粮信托\t458239\n饮品\t458240\n绸带\t458241\n省工信委\t458242\n苏州乐园\t458243\nPackaging\t458244\nPrompt\t458245\npermgen\t458246\n刘弗陵\t458247\nYuka\t458248\n2月\t458249\n新浪财经\t458250\n成名\t458251\n青岛注册公司\t458252\nBUG\t458253\n免刷\t458254\n倪楠\t458255\n60d\t458256\n含水率\t458257\n云主机\t458258\n上园\t458259\n深圳大学艺术设计学院\t458260\n沧\t458261\niov\t458262\nsurviving\t458263\n白雪公主成人版\t458264\n愚行录\t458265\n萧山区政府\t458266\n死亡之海\t458267\n癖好\t458268\n山岩\t458269\n鸡尾酒吧\t458270\n迅为\t458271\nflsah\t458272\nrub\t458273\n老妇\t458274\n洛克\t458275\n2CD\t458276\n哈尔滨道里区\t458277\n四两\t458278\n良莠不齐\t458279\n基因组学\t458280\n万象华府\t458281\nncode\t458282\n601所\t458283\n100000000\t458284\n陈店吧\t458285\n咖啡豆\t458286\n第三帝国\t458287\n日本京都市\t458288\n长宁新闻网\t458289\n10日\t458290\n主从复制\t458291\n巫妖王之怒\t458292\n话会\t458293\ngrd\t458294\n仿站\t458295\n华南蓬火车站\t458296\n松香\t458297\nCocoapods\t458298\n事发时\t458299\n彷\t458300\n草场\t458301\nshen\t458302\n正八边形\t458303\n下半学期\t458304\n帮扶\t458305\n3142\t458306\n6月30日前\t458307\n威海市环翠区人民政府\t458308\n手霜\t458309\n升旗\t458310\n自由客\t458311\n猎奇网\t458312\nccnu\t458313\n保宁\t458314\n心碎\t458315\n吉比特\t458316\n呼伦贝尔海拉尔区\t458317\n二年后\t458318\ntrick\t458319\n樱桃论坛\t458320\n副乡长\t458321\nmblock\t458322\n何捷\t458323\n午茶\t458324\n俞星\t458325\ndll库文件\t458326\nみ\t458327\n扬州工业职业技术学院\t458328\n石湾社区\t458329\n兵士\t458330\nwda\t458331\n猪价网\t458332\n稻种\t458333\n海军罪案调查处\t458334\n青葡萄\t458335\n牛排\t458336\n广东省水利厅\t458337\n查杰\t458338\n二溴\t458339\n露娜洁面仪\t458340\n采空区\t458341\n安琪纽特\t458342\nBugly\t458343\n金马大厦\t458344\n苦战\t458345\n心灯\t458346\n开店\t458347\nTracks\t458348\n中兴小鲜\t458349\nCDI\t458350\n金评\t458351\n军师\t458352\n第二手\t458353\n贝依\t458354\n实验场\t458355\nstacktrace\t458356\n上锡\t458357\n郑单\t458358\n短缺\t458359\
n朱日和阅兵\t458360\n一大\t458361\n磷酸氢二钠\t458362\n语义化\t458363\n车行易\t458364\n瑞慈\t458365\n上海山\t458366\n125k\t458367\n中医基础理论\t458368\n曲线型\t458369\nHenkaku\t458370\n灾难\t458371\n追命\t458372\n改性沥青防水卷材\t458373\n增值电信业务经营许可证\t458374\n新宝5\t458375\ncoincola\t458376\n移动CRM\t458377\n强音\t458378\n卢老师\t458379\n万和电热水器\t458380\nsikana\t458381\n清和园\t458382\nstatusBar\t458383\n射雕英雄传之九阴真经\t458384\n毕业论文-文档投稿赚钱网\t458385\n213.net\t458386\n迷离夜\t458387\n九星\t458388\n佛事\t458389\nbillions\t458390\n东海航空\t458391\n圣劳伦斯\t458392\n肾骨片\t458393\n风荷\t458394\n皖北\t458395\n北京城建\t458396\n微格\t458397\n不在一起\t458398\n多潘立酮\t458399\n4罐\t458400\nBoney\t458401\n轮播\t458402\nR6220\t458403\n夹头\t458404\n贾樟柯\t458405\n邓德隆\t458406\n舞林\t458407\nexcel吧_\t458408\n骁将\t458409\n很有爱\t458410\n成都工商行政管理局\t458411\n不凡的改变\t458412\n常州工学院\t458413\n$0\t458414\n民办教育促进法\t458415\n虹口足球场\t458416\n网页编辑器\t458417\nNIKKI\t458418\nxv\t458419\n神农城\t458420\n充电座\t458421\n三星w999\t458422\n馨香\t458423\n3D电子书\t458424\n隔页\t458425\n1548\t458426\n快捷\t458427\n通典\t458428\nM2M\t458429\n拉阔\t458430\n日月星城\t458431\nubnutu\t458432\n微瑕\t458433\n人工孵化\t458434\nPlating\t458435\n二模理综\t458436\nWebIDE\t458437\n一季度\t458438\nEnergetic\t458439\n刘冲\t458440\n2682号\t458441\n师章\t458442\ncecilia\t458443\n留观\t458444\n七星阵\t458445\n2k15\t458446\n新节\t458447\n知恩\t458448\n手机中国\t458449\n古砖\t458450\n发布者\t458451\n绝色生香\t458452\ncomctl32.ocx\t458453\ngb50016\t458454\n房地产中介公司\t458455\n63名\t458456\n拖\t458457\n斑片\t458458\n第三十六章\t458459\n二十世纪六十年代\t458460\nconds\t458461\n1079\t458462\n辅助人员\t458463\n爱情谷\t458464\n别太急\t458465\n香域\t458466\n眼底镜\t458467\n额敏县\t458468\nReign\t458469\nBIG5\t458470\n井田制\t458471\n最佳\t458472\n藤县\t458473\nbloggers\t458474\n陕西省农业厅\t458475\n漂洋过海\t458476\nvan样\t458477\n5a景区\t458478\n电磁除铁器\t458479\n砼柱\t458480\n小浣熊\t458481\n商用洗碗机\t458482\n锌合金压铸\t458483\n助残日\t458484\n女人花\t458485\nPLC300\t458486\n主街\t458487\n兴趣部落\t458488\n一棹\t458489\nv2017.1\t458490\n第一条\t458491\n月华露\t458492\n塑膜\t458493\n迅雷BD高清中字ed2k\t458494\n打树\t458495\n中国葛洲坝集团公司\t458496\nPS智能对象\t458497\n开盘价\t458498\n常识篇\t458499\n下马坊\t4585
00\n跳神\t458501\n库安\t458502\n中国橡胶网\t458503\nmarried\t458504\n中建西南院\t458505\n张利军\t458506\n爱滋病\t458507\n火币网\t458508\njaeger\t458509\n轱辘\t458510\n代表者\t458511\n油卡\t458512\n中央政策研究室\t458513\n本周三\t458514\n同花\t458515\n自助餐团购网\t458516\n神赋\t458517\n走一走\t458518\n剑河\t458519\n施光南\t458520\n苍之纪元潘多拉\t458521\n20150116\t458522\n马裤\t458523\nwkydsw\t458524\n封神之战\t458525\n該\t458526\n布道者\t458527\n晶锐\t458528\n天书\t458529\n云顶\t458530\n宝云\t458531\n王者荣耀城市赛\t458532\nInverter\t458533\nUP\t458534\n钩包\t458535\n普桑\t458536\n巧克力味\t458537\n折翼天使\t458538\nMcAfee\t458539\n涓滴\t458540\n6屏\t458541\n谈判\t458542\nCannabis\t458543\nBond\t458544\n福禧\t458545\n尼亚加拉大瀑布\t458546\n寻乌县\t458547\n白珊\t458548\n6.45\t458549\n梁旭东\t458550\n淘汰赛\t458551\n签子\t458552\n双武\t458553\n9成\t458554\n广州展览公司\t458555\n罗甸\t458556\n601211\t458557\n唯美主义\t458558\n福克斯公司\t458559\n碳链\t458560\n山鬼\t458561\n嘉实多极护\t458562\n榛果\t458563\n湖北省\t458564\n出货\t458565\n猫\t458566\n妻妾\t458567\n西江月\t458568\n鹿泉市\t458569\n40164\t458570\n大钟楼\t458571\n李发平\t458572\n1周后\t458573\n深海利剑\t458574\n休庭\t458575\neasyu\t458576\n大学梦\t458577\n战国无双2\t458578\n闭站\t458579\n元数据\t458580\n整形频道_健客网\t458581\n濤叔\t458582\nDupont\t458583\n大用力\t458584\n家庭版\t458585\n物生\t458586\n_v1.1\t458587\n守望猎手\t458588\n安徽省人大常委会\t458589\n网孔\t458590\n猖狂\t458591\n丞\t458592\n王亚樵\t458593\n4月末\t458594\n水晶城\t458595\n大嘴猴\t458596\n末日桃花债\t458597\n四十八式\t458598\n关键词指数\t458599\n黑苹果clover\t458600\n郝帅\t458601\n赖以\t458602\n藕粉\t458603\n5.1.4\t458604\n洼\t458605\nXBLA\t458606\npp岛\t458607\n护墙板\t458608\n旺仔\t458609\n奈何桥\t458610\n求亲\t458611\nMBA智库百科\t458612\n多城市\t458613\n合交\t458614\n泽州\t458615\n壳牌超凡喜力\t458616\n傅里叶\t458617\nType-C_\t458618\n触手猴\t458619\n剑侠情缘\t458620\n资材\t458621\n入殓师\t458622\n平效\t458623\n签入\t458624\n隔热砖\t458625\n公路法\t458626\n炸出\t458627\n露依\t458628\n一百年前\t458629\n血海穴\t458630\n望岗\t458631\n机种\t458632\n售货机\t458633\n一噜\t458634\n12388\t458635\n西京湾\t458636\n京中央\t458637\n冤假错案\t458638\nexpressions\t458639\n千金怡怡\t458640\n田间管理\t458641\n小K\t458642\n三价\t458643\n蓬莱客\t458644\nzhiyao\t458645\n馆内\t458646\n自然哲理\t458647\n国光帮帮忙\t45864
8\n2018年4月17号\t458649\n不怒自威\t458650\n水政\t458651\n辣椒炒肉\t458652\n5CM\t458653\nNX100\t458654\n镇南\t458655\n藤野先生\t458656\nBootstrapValidator\t458657\n6届\t458658\n玩吃\t458659\nlilu\t458660\n闽江路\t458661\n新浪竞技风暴_新浪网\t458662\n贾芸\t458663\n董竹君\t458664\n云翔\t458665\n国家基金委\t458666\n针炙\t458667\n英雄联盟LOL\t458668\n麻涌镇\t458669\n朱晶\t458670\n323\t458671\n武林中文\t458672\n民间文化\t458673\n五香花生米\t458674\nPS_\t458675\n月率\t458676\n4轴\t458677\n网评员\t458678\n安迪·沃霍尔\t458679\n梦幻西游_小皮手游网\t458680\nPCP\t458681\n隆众资讯\t458682\n广场舞蹈网\t458683\n烂脸\t458684\n高端版\t458685\n卤鹅\t458686\n魔战\t458687\n恋战\t458688\ndepapepe\t458689\nCronTrigger\t458690\n爱住\t458691\n财务报表分析.doc\t458692\n1.22.1\t458693\n斑痣\t458694\nM1536\t458695\n苗木基地\t458696\nナス\t458697\nGIVENCHY\t458698\n可见性\t458699\n熊果苷\t458700\nDelphi2010\t458701\n长江杯\t458702\n对出\t458703\n交通违章查询网\t458704\n直球\t458705\n手绣\t458706\n000402\t458707\n爱的礼赞\t458708\n【子\t458709\n猎微网\t458710\n真龙\t458711\n塞尔维亚女排\t458712\n戗\t458713\n未完善\t458714\nteamwork\t458715\n安徽省立医院\t458716\n马尔维纳斯群岛\t458717\n囊虫病\t458718\nFORMULA\t458719\n赵海峰\t458720\n新边城浪子\t458721\n叠化\t458722\nguts\t458723\n弱势\t458724\n边沁\t458725\n麦田圈\t458726\n猫咪们\t458727\n10种\t458728\n変身\t458729\n虚拟人生2\t458730\n魔猴网\t458731\n阉人\t458732\n猪流\t458733\n9月1日\t458734\n新华富\t458735\n混悬液\t458736\n老实说\t458737\n八山\t458738\n新视野大学英语第三版读写教程\t458739\n审计抽样\t458740\nKlook客路\t458741\n数九\t458742\n三四次\t458743\n任亮\t458744\n沃金\t458745\nsakura\t458746\n深圳财富趋势科技股份有限公司\t458747\n比较盘\t458748\nE族\t458749\n5项\t458750\nNING\t458751\ncommunist\t458752\n赵鑫鑫\t458753\n8016\t458754\n伊苏塞尔塞塔\t458755\n渔父\t458756\n非药品类易制毒化学品\t458757\n第十二个\t458758\n复旦大学附属中学\t458759\nSensual\t458760\n异型件\t458761\n47000\t458762\n系统分辨率\t458763\n浦南镇\t458764\n虎门镇\t458765\n小球\t458766\n审批制\t458767\n10.15\t458768\n92分钟\t458769\n奥秘法\t458770\n.9\t458771\n刘国强\t458772\n微信公\t458773\nSURFACE\t458774\n23672310\t458775\n峰回路\t458776\nMayhem\t458777\n变妆\t458778\nTOMO\t458779\n三国立志传3\t458780\n真骨雕\t458781\n俞雷\t458782\n单号查询_56114物流查询网\t458783\n黄皮书\t458784\n深圳中学\t458785\n11.99万元\t458786\n竹艺\t458787\n姚孟\t4
58788\n超时代\t458789\n海芋恋\t458790\n哈罗电\t458791\n中国重汽集团\t458792\n拉里·佩奇\t458793\n广西工程职业学院\t458794\n理会\t458795\n啤酒桶\t458796\n重交\t458797\n星眸\t458798\n商务管理专业\t458799\n邢捕头\t458800\n谷俊山\t458801\n沃尔沃XC60\t458802\n海信地产\t458803\n氯铂酸\t458804\n酊剂\t458805\n祥解\t458806\n门丽\t458807\nyy小说吧\t458808\nNCL\t458809\n玫瑰鲜花饼\t458810\n通辽职业学院\t458811\n月长\t458812\n冤家路窄\t458813\n西游降魔篇\t458814\n修复\t458815\nIPHONE\t458816\n重臂\t458817\n汇流箱\t458818\n十八项\t458819\nobserved\t458820\n4399双人小游戏\t458821\n杜悦\t458822\n刘云峰\t458823\n驾驶式\t458824\n58集团\t458825\n回生\t458826\n围棋网\t458827\n寒来暑往\t458828\n黑胶\t458829\n万代\t458830\n霹雳火\t458831\n开热\t458832\n全套\t458833\n财付通\t458834\n新新电影网\t458835\n交换机\t458836\n冯导\t458837\n着床\t458838\n红外线遥控器\t458839\n皂粒\t458840\n#9\t458841\n江郎山\t458842\n扶手椅\t458843\n绿地国际花都\t458844\nporn\t458845\n太平洋森活广场\t458846\n梦幻龙族2\t458847\n洋哥\t458848\nInspect\t458849\n新近\t458850\nSloan\t458851\n鹰飞\t458852\nwww.1\t458853\n解禁期\t458854\n44000\t458855\n套中人\t458856\ntemplete\t458857\n楦\t458858\n葛小伦\t458859\npageTitle\t458860\nmeiko\t458861\nv3.7.1\t458862\n破釜沉舟\t458863\n薛毅\t458864\n五十位\t458865\n创纪录\t458866\n加朵\t458867\nDestruction\t458868\nCRISPR/Cas9\t458869\n捷豹I-PACE\t458870\n2017-01-11\t458871\n8T\t458872\n碳纤维板\t458873\n官园\t458874\nExposed\t458875\n7.6%\t458876\nctex\t458877\n工具画\t458878\n珠穆朗玛\t458879\nNEXON\t458880\n剧票\t458881\n导荷\t458882\n2015年度\t458883\n打黑\t458884\n虎皮鹦鹉吧\t458885\nPresley\t458886\n调查表\t458887\nxdata\t458888\n对照品\t458889\n劳服\t458890\n冒险王卫斯理之蓝血人\t458891\n养车\t458892\n2017QS世界大学\t458893\nMargin\t458894\n14秒\t458895\n67CHA.COM\t458896\n镍铬丝\t458897\n厦门眼科中心\t458898\nopen函数\t458899\n双联双控开关\t458900\n胎儿\t458901\nserie\t458902\n智能晾衣架\t458903\n试验工\t458904\n一群\t458905\nv3.3.2\t458906\n影舞\t458907\nctrip\t458908\n中美贸易站\t458909\nFPGA/CPLD\t458910\n复习课\t458911\n最高人\t458912\nHands\t458913\n软骨瘤\t458914\n中国新通信\t458915\nChronic\t458916\n快e贷\t458917\n双兰\t458918\n微软必应\t458919\n活面\t458920\n第16条\t458921\n南海大道\t458922\n15Plus\t458923\nHeadless\t458924\n仿石漆\t458925\n嗜睡症\t458926\n广西科技大学鹿山学院\t458927\n条路\t458928\n34
5\t458929\n大众cc\t458930\n地奈德\t458931\njavascrpt\t458932\n黑壳虾\t458933\n蓬径\t458934\nJean\t458935\n中共江西省委\t458936\n吉祥街\t458937\n809\t458938\n恒安标准人寿\t458939\n碧海金沙\t458940\nil2cpp\t458941\n足尖\t458942\nthereis\t458943\n临沂政府网\t458944\n草原上\t458945\nomega3\t458946\n静姝\t458947\nintellisense\t458948\nabs\t458949\n置身\t458950\n房下\t458951\n102页\t458952\n欧蓝\t458953\n张国辉\t458954\n美妆个护\t458955\n闪亮的爸爸\t458956\n涉世未深\t458957\n张士诚\t458958\n两个故事\t458959\n汕头大学\t458960\n慕韵\t458961\nstudied\t458962\n堆砌\t458963\n氯化亚铁\t458964\n戛\t458965\n枫丹壹号\t458966\n奋\t458967\n南门口\t458968\n劳薪\t458969\nahp\t458970\n参政\t458971\n赞赞\t458972\n万丽\t458973\n资产化\t458974\n教办\t458975\n宝胜\t458976\n敝\t458977\n叉乘\t458978\n锅炉除尘器\t458979\ncblock\t458980\nregardless\t458981\nengineered\t458982\n军龄\t458983\n阳朔高铁站\t458984\nyah\t458985\n2第二期\t458986\ngarageband\t458987\n火系\t458988\n曲麻菜\t458989\nszyang\t458990\n翠华山\t458991\n天津艺术职业学院\t458992\n管养\t458993\n第41期\t458994\n藏格控股\t458995\n第15季\t458996\n2076\t458997\n献一计\t458998\n战忽局\t458999\n教诲\t459000\nwensongyu\t459001\n球墨铸铁井盖\t459002\n刘剑\t459003\n2018年3月16日\t459004\n92game\t459005\n神官\t459006\nQuite\t459007\n条文\t459008\nbs\t459009\n量子破碎\t459010\n榴友\t459011\n话题性\t459012\n缩尺\t459013\natfb\t459014\n定义类\t459015\n300系\t459016\nSpitz\t459017\n达西\t459018\nHero6\t459019\n章华\t459020\n添添\t459021\n昕薇\t459022\n速写本\t459023\n黑频\t459024\n好失望\t459025\n80本\t459026\nuac\t459027\nturner\t459028\n宝丰路\t459029\n漩涡博人\t459030\n货币基金\t459031\n百句\t459032\n指北\t459033\n李永宏\t459034\n坚定\t459035\niVocaloid\t459036\n别云剑\t459037\n崂山啤酒\t459038\n外敷\t459039\n女老\t459040\n3.29\t459041\n交调\t459042\n中国海\t459043\n稻城亚丁机场\t459044\nPFMEA\t459045\n装企\t459046\n吉利帝豪EV\t459047\n金鲁班\t459048\n收款\t459049\n隆都\t459050\n上海街道\t459051\n烟台市区\t459052\n做事\t459053\nSimwe\t459054\nBackend\t459055\n卓聘\t459056\n三脉\t459057\n偶氮苯\t459058\n江海\t459059\n作弊器\t459060\n而我\t459061\n密史\t459062\n剪头\t459063\n低息贷\t459064\n雷州半岛\t459065\n过去十年\t459066\n黄田镇\t459067\n湖居\t459068\n老里番\t459069\n网图\t459070\n西格列汀\t459071\n阻性\t459072\n舜宇光学\t459073\nOVER函数\t459074\n黄图哥\t
459075\n3.68\t459076\n安仔\t459077\n复业\t459078\n世匠\t459079\nMII\t459080\n不良\t459081\nP1000\t459082\nmsado15.dll\t459083\n嗯\t459084\n新镜\t459085\n盈峰环境\t459086\n小血\t459087\n清流县\t459088\n心腹\t459089\n同根词\t459090\n工质\t459091\n金沙娱乐场\t459092\n红豆饼\t459093\n进口食品\t459094\n广东省博物馆\t459095\n蔬菜泥\t459096\n薪贷\t459097\n芥末网\t459098\n百炼佞\t459099\n浴花谷花卉网\t459100\n燃烧吧蔬菜\t459101\nUTM\t459102\nHomeBrew\t459103\n企业信息公示暂行条例\t459104\n小丸子\t459105\n天玑\t459106\n南开小学\t459107\n每日\t459108\n怜悯\t459109\n上海麦聚瑞电子仪器有限公司\t459110\n郝伟\t459111\nPHPstorm\t459112\n陆权\t459113\nUSBKey\t459114\n开腹\t459115\n陈大惠\t459116\n雷蛇黑寡妇\t459117\n四夕\t459118\n海源\t459119\nPM\t459120\n小生\t459121\n量能\t459122\n成都地铁3号线\t459123\n词本\t459124\nxvideo\t459125\n精挑细选\t459126\n房产过户|华律\t459127\ndetupian\t459128\n海门日报\t459129\n永康人才网\t459130\n浙江警官职业学院\t459131\n前山\t459132\n组织生活会\t459133\n悦动神印王座\t459134\n江南大学\t459135\n咖啡色\t459136\n豹妹\t459137\n桂系\t459138\nx70\t459139\n火柱\t459140\n一级片\t459141\ncena\t459142\n隽升\t459143\n特任\t459144\n冰蓝\t459145\n杨棋涵\t459146\n12岁\t459147\n托比昂\t459148\n女仙\t459149\n另版\t459150\n镲\t459151\n胃反酸\t459152\nimx6\t459153\n第四十二期\t459154\n直签\t459155\noffical\t459156\n小紫\t459157\n﹙\t459158\n来职往\t459159\n极品飞车无限狂飙吧\t459160\nDiggBT\t459161\n西安北站\t459162\n褐藻\t459163\nimread函数\t459164\n40kw\t459165\n风湿关节炎\t459166\n奸\t459167\n第零\t459168\nXCOM\t459169\n毁誉\t459170\n白松\t459171\n董家村\t459172\n死\t459173\n霸蛮\t459174\n血蛤\t459175\n蛋黄粉\t459176\n2017.1.4\t459177\n实记\t459178\nmomentjs\t459179\n岑参\t459180\n复方甘草片\t459181\n吴兰\t459182\n智慧眼\t459183\nTGBUS\t459184\n吴建斌\t459185\n压缩模量\t459186\n山西黄河医院\t459187\n天津摇号吧\t459188\n黄铜棒\t459189\n唯品会\t459190\n刘爱民\t459191\n百度杀毒\t459192\n舟山新区\t459193\n中云\t459194\n独山子区\t459195\nPornos\t459196\n赤狐\t459197\n保险事故\t459198\nCHINASSL\t459199\n车王\t459200\nProtobuf\t459201\n音骚\t459202\n艺龙酒店\t459203\n天虹股份\t459204\n运河街道\t459205\n苏利亚\t459206\n炉石传说攻略\t459207\nray\t459208\nmso\t459209\n胎架\t459210\nsod蜜\t459211\n合肥美菱股份有限公司\t459212\n蓝堡\t459213\n相用\t459214\n郑家纯\t459215\n分部\t459216\n超过7天\t459217\nSlang\t459218\n盘龙城经济开发区\t459219\nSHAKE\
t459220\n贵州工业职业技术学院\t459221\n凌音\t459222\n客区\t459223\n小米手机MIUI\t459224\nlw\t459225\n维管\t459226\nSisters\t459227\n吉野山\t459228\n开学\t459229\n斩波\t459230\n地瓜\t459231\n勾庄\t459232\n龙部落迅雷\t459233\nrunscript\t459234\n建筑设计资料集\t459235\n胆疮\t459236\n宝沃\t459237\n常山县\t459238\npentatonix\t459239\n香港迪斯尼乐园\t459240\n新阳光\t459241\n夏莲\t459242\n省博物馆\t459243\n语言学概论\t459244\n一又二分之一\t459245\n法华寺\t459246\n张家港市\t459247\nifeq\t459248\n阳光私募基金\t459249\npython函数\t459250\n金地天府城\t459251\n王芸\t459252\n高第街56号\t459253\n长虹\t459254\n杜蕾斯\t459255\n借入\t459256\n氧化铅\t459257\n九城社区\t459258\neasyjet\t459259\nMBA智库文档\t459260\nFunction\t459261\nVAVA\t459262\n杨照\t459263\n内陷\t459264\n300v\t459265\n班师\t459266\n祗\t459267\n成都组工\t459268\n竹影\t459269\n便携\t459270\n二象性\t459271\n领益智造\t459272\n鹏程万里人才网\t459273\n三亿元\t459274\n開箱\t459275\nwildcard\t459276\n淮海东路\t459277\n京东JD.COM\t459278\n路费\t459279\n卡菲\t459280\n中国急救网\t459281\nrk3128\t459282\n心堂\t459283\n576\t459284\n刀剑封魔录上古传说\t459285\n天龙八部ol_17173天龙八部\t459286\n炉端\t459287\n日损\t459288\n装休\t459289\n挖\t459290\n白糖\t459291\n腹剑\t459292\n杜喆\t459293\n沃尔夫\t459294\n#7\t459295\nConnection\t459296\n天目湖南\t459297\n少数民族汉语\t459298\n场区\t459299\n无所谓\t459300\n集成灶\t459301\n王庆坨\t459302\n南京理工大学研究生院\t459303\n商协会\t459304\n倍润\t459305\n观世界\t459306\n户外展\t459307\nunibeast\t459308\n宝鸡\t459309\n婉语\t459310\n德班\t459311\n租方\t459312\n雨婷儿\t459313\npcn\t459314\n乙酰水杨酸\t459315\n2厢\t459316\n成都法院\t459317\n错案\t459318\nVirtualization\t459319\n尼龙绳\t459320\nnproc\t459321\n我的天命\t459322\n委厅\t459323\n自考招生网\t459324\n足疗按摩\t459325\n广州从化市政府\t459326\nproject\t459327\nX100T\t459328\n吉安市教育局\t459329\njsp\t459330\n中青旅\t459331\n200多位\t459332\n300次\t459333\n桂林乐居\t459334\n申智\t459335\ntoptoon\t459336\n7天之内\t459337\n自由門\t459338\nmoscow\t459339\n学代会\t459340\n铃仙\t459341\n第98集\t459342\n百度推广_\t459343\n诺泰\t459344\ndcg\t459345\niibull\t459346\n古格\t459347\n香村\t459348\n云帆学习网\t459349\n字棋\t459350\n卡片战斗先导者\t459351\n查令十字街84号\t459352\n镇康县\t459353\n混凝土布料机\t459354\n余云东\t459355\n集团\t459356\n苏平\t459357\n马林巴琴\t459358\n张煜\t459359\n高中段\t459360\n差评师\t459361\n远走\t
459362\n吴健\t459363\nmincraft\t459364\n丸子\t459365\n4.48\t459366\n沈红\t459367\nLower\t459368\n乔治·巴顿\t459369\n劳动和社会保障局\t459370\n宫词\t459371\n你是我心爱的姑娘\t459372\n新潮传媒\t459373\n巴黎城\t459374\nE06\t459375\n3级片\t459376\n长种\t459377\n202.97\t459378\nauto\t459379\npeterson\t459380\n大病医疗保险\t459381\n黄鸭\t459382\nVOX\t459383\n回归分析\t459384\n摘地\t459385\n异虫\t459386\n一起泡\t459387\n别心急\t459388\n观念史\t459389\npleas\t459390\n第10轮\t459391\n驱蚊\t459392\n光磁\t459393\n柜号\t459394\n中小学生守则\t459395\n孙建宏\t459396\n评价表\t459397\n大话女儿国\t459398\n港线\t459399\n妙舞\t459400\n陈羽\t459401\n智能家居系统\t459402\nSRM\t459403\nUpToDate\t459404\n森动网\t459405\nTermination\t459406\n新标准大学英语综合教程\t459407\n苗刀\t459408\n沙颍河\t459409\n480万\t459410\n炮塔\t459411\nsections\t459412\n10k\t459413\n沙洋县政府\t459414\n新制度\t459415\nstitch\t459416\n小康生活\t459417\n小男生\t459418\n无机化学\t459419\n夏娃年代记\t459420\n干柴烈火\t459421\n九紫\t459422\n外饰\t459423\n新华网股份有限公司\t459424\n完成券\t459425\n会奖\t459426\nB1/B2\t459427\n如是说\t459428\n暂时\t459429\n分期\t459430\n周红波\t459431\noj\t459432\no2\t459433\njj9\t459434\n物业管理员\t459435\n夏朗论坛_汽车之家论坛\t459436\n镜头\t459437\n神东\t459438\nwin7-系统城\t459439\n特行\t459440\n第十八批\t459441\n常建\t459442\n发布\t459443\n通导\t459444\n大脑\t459445\n西安市物价局\t459446\n转企\t459447\n高新技\t459448\n21本\t459449\npytesseract\t459450\n波片\t459451\n药品注册管理办法\t459452\n20160408\t459453\n按日\t459454\n中日文\t459455\n贯众\t459456\n宁都县人民政府\t459457\nyuo\t459458\n十万公里\t459459\nGFP\t459460\n立花美凉\t459461\n茂伸奇\t459462\n蛲虫病\t459463\n赵晨\t459464\n新乡市教育局\t459465\n勇挑重担\t459466\n穴盘\t459467\nsun5411\t459468\n敢爱敢恨\t459469\n齐次线性方程组\t459470\n中国水利水电出版社\t459471\n机油滤清器\t459472\ns90\t459473\n朴新\t459474\n1429\t459475\n四十八\t459476\n第57号\t459477\n窄屏\t459478\n恰似一江春水向东流\t459479\nScoop\t459480\n普林斯顿\t459481\n氨磺必利\t459482\n侧颜\t459483\n殷巷\t459484\n鼎晖投资\t459485\n诺诗兰\t459486\n97级\t459487\nGuam\t459488\n舟山港\t459489\n园地\t459490\n新田村\t459491\n280米\t459492\nCANADA\t459493\n半圆形\t459494\nblind\t459495\n谭梅\t459496\n解放军总医院\t459497\n和而泰\t459498\n富贵子\t459499\n密码子\t459500\n腮腺炎\t459501\n膀胱经\t459502\nnpapi\t459503\n案值\t459504\n游憩\t459505
\nMeterpreter\t459506\nHardcore\t459507\n工布\t459508\n黄心\t459509\nsuspend\t459510\n丧钟为谁\t459511\n千乃杏美\t459512\n外出\t459513\nstudies\t459514\nVSS\t459515\n黄菖蒲\t459516\n化粪池\t459517\n献\t459518\n典当公司\t459519\n无源元件\t459520\nCIBN\t459521\n艾德·希兰\t459522\n交通界\t459523\n抢夺罪\t459524\n色诱术\t459525\n秋梨膏\t459526\n朱丽安·摩尔\t459527\nK_\t459528\n凌波城\t459529\n美人\t459530\n本月25日\t459531\n讲武堂\t459532\nsemibold\t459533\n五金店\t459534\n网下申购\t459535\n济南水务集团有限公司\t459536\n风尘女子\t459537\n电水瓶\t459538\nfenghuan\t459539\n移动web\t459540\n杭州市卫生计生委\t459541\n坐姿\t459542\nStuds\t459543\ntl\t459544\n柏斯\t459545\nSOD\t459546\n西安教育局\t459547\n白俄\t459548\n黄站\t459549\n意会\t459550\n拉扎维\t459551\n进入者\t459552\n偏激\t459553\n水壶\t459554\n深圳美莱整形美容医院\t459555\n北二环路\t459556\n胎儿期\t459557\n嫁得好\t459558\nGenre\t459559\n德学网\t459560\n郑州西站\t459561\n正前方\t459562\n天天爱丽舍\t459563\n医化\t459564\n瀛洲\t459565\n后溪镇\t459566\n鸿燕藏锋\t459567\n乐峰广场\t459568\n大变脸\t459569\n行尾\t459570\n盟军敢死队3\t459571\npegasus\t459572\n传信\t459573\n1257ad\t459574\n糜蛋白酶\t459575\n特警队\t459576\n隔热膜\t459577\n招聘类\t459578\n南阳市公安局\t459579\n植萌网\t459580\n100V\t459581\n卜师\t459582\n龙烟\t459583\n盆菜\t459584\nofstream\t459585\n藏山\t459586\nDnf\t459587\n栽培学\t459588\nlsusb\t459589\n花坊\t459590\nadaptation\t459591\n探机\t459592\n螺丝柱\t459593\n釜山电影节\t459594\n设计本\t459595\n军庄\t459596\n仙居\t459597\n北京公共交通控股(集团)有限公司\t459598\n爱比\t459599\n99.9\t459600\n梁下\t459601\n哈佛商业评论\t459602\n6137游戏网\t459603\n跑操\t459604\n银河映像\t459605\n大枣\t459606\n音响展\t459607\n相伴一生\t459608\n商标代理公司\t459609\n张晓龙\t459610\nlibc++\t459611\n内丹\t459612\n四十多\t459613\n京吹\t459614\n槐花蜜\t459615\nC300\t459616\n隐型\t459617\n绿轴\t459618\n谢岗\t459619\nrains\t459620\nMilk\t459621\n科洛\t459622\nannie\t459623\nhpi\t459624\n2018年2月21日\t459625\n下颌角手术\t459626\n鸭肫\t459627\n680.com\t459628\n音位\t459629\n福利来啦\t459630\n网球鞋\t459631\n到达\t459632\n黄酮类\t459633\nzhuweisky\t459634\nRussell\t459635\n秦军\t459636\n黄芳\t459637\n配准\t459638\n100g\t459639\n违法乱纪\t459640\n百度在线翻译\t459641\n五间\t459642\n梅川路\t459643\n雄株\t459644\n新国际博览中心\t459645\n滚珠轴承\t459646\nmone\t459647\n汪寿华\t459648\n10多岁\
t459649\n耍耍网\t459650\nbuys\t459651\n十常侍\t459652\n奥特莱斯Outlets购物攻略\t459653\n比斯阁\t459654\nhi3518\t459655\n12nm\t459656\n7600K\t459657\n张玉兰\t459658\n嫣红\t459659\nxz1c\t459660\n滑动选择器\t459661\nJEECG\t459662\n徐井宏\t459663\n三菱帕杰罗劲畅\t459664\n復活\t459665\n水浒之宋江\t459666\naltima\t459667\n手菜\t459668\n天书中文网\t459669\nRFQ\t459670\nlog3\t459671\nTuShare\t459672\n耍滑\t459673\n纽威\t459674\n壹聚\t459675\n第二根\t459676\n厦门大学附属中山医院\t459677\n上海大学上海电影学院\t459678\n多类\t459679\n腾龙洞\t459680\n大通道\t459681\n张歆艺\t459682\n1第46\t459683\n欧度\t459684\nlaminate\t459685\ngohome\t459686\n巨塔杀机\t459687\nMatures\t459688\n刘峰\t459689\n_岛\t459690\n深圳众策纵横有限公司\t459691\n1000RMB\t459692\n太瘦\t459693\n清透\t459694\n瞅瞅\t459695\n黑魔王\t459696\nRAM/移动4G\t459697\n烟台毓璜顶医院\t459698\n郑好\t459699\n南楼\t459700\n共同行动\t459701\n超时\t459702\n邪性\t459703\n乐斯福\t459704\n颜勤礼碑\t459705\n篦冷机\t459706\n中科院信工所\t459707\n十八罗汉\t459708\ndebugging\t459709\n镀金\t459710\n楚恒\t459711\navatar\t459712\n2017.5\t459713\n扶沟县\t459714\n北京物资学院\t459715\n1MORE\t459716\n指挥权\t459717\nT+\t459718\n三早\t459719\n第七夜\t459720\n黑豚\t459721\n明友\t459722\n网易严选\t459723\n71号\t459724\n308路\t459725\nDB块\t459726\n石衡\t459727\n阻剂\t459728\n武装冲突\t459729\n七夕情人节\t459730\n新疆广播电视大学\t459731\n回忆\t459732\nMQL5\t459733\n蓝发\t459734\n盖碗茶\t459735\n一代枭雄之三支旗\t459736\n芳草地国际学校\t459737\n大老粗\t459738\n珉豪\t459739\nTin\t459740\n臊味\t459741\n北京师范大学附属实验中学\t459742\n香港政府一站通\t459743\n3530\t459744\n中国五金机电\t459745\n总成\t459746\n998\t459747\n美邦\t459748\n毒龙\t459749\nChief\t459750\nVANCL凡客诚品\t459751\n射手中文网\t459752\nwordcloud\t459753\n占有欲\t459754\nて\t459755\nc++builder\t459756\n债务\t459757\nM330\t459758\n0.5cm\t459759\n锤\t459760\n金贤重\t459761\n2.0g\t459762\n黄岩\t459763\n趣名网\t459764\n潍水\t459765\n环氧\t459766\n网游\t459767\n乐平市人民政府\t459768\n中国华能集团\t459769\ntoys\t459770\n市场收益率\t459771\n鬼语者\t459772\n鸡胗\t459773\nv4.16\t459774\n堡垒\t459775\nrestcontroller\t459776\nROHS认证\t459777\nnalgene\t459778\n第5册\t459779\n报复性\t459780\n蝴蝶海\t459781\n口腔疱疹\t459782\nx88\t459783\n佳艺\t459784\n长安欧诺汽车装饰\t459785\n心情不错\t459786\n汇景湾\t459787\n碎机\t459788\n翁婿\t459789\n天空下\t
459790\n和谐大道\t459791\n拜物教\t459792\nmintui\t459793\n合成表\t459794\nweren\t459795\n街霸5\t459796\nAirPods无线耳机\t459797\n不得不看\t459798\n致词\t459799\n大美妞\t459800\n海尔商城\t459801\n爱玩网\t459802\n老柳\t459803\n英版\t459804\nrozbo\t459805\n5.4.5\t459806\nZxing\t459807\n去来\t459808\n啾啾啾\t459809\npentax\t459810\n龙果\t459811\nMySQLi\t459812\n1.7.3\t459813\n威海市农业局\t459814\ndoc88\t459815\nwindows键\t459816\n几帧\t459817\n兴业大厦\t459818\n中国经济导报\t459819\n雅信\t459820\n粤园\t459821\n金鹤\t459822\n冷敷\t459823\n汪波\t459824\n高德顺风车\t459825\n100家\t459826\nGeoJSON\t459827\n肥东一中\t459828\n简木\t459829\n科密碎纸机\t459830\n潜水艇\t459831\nPro2017\t459832\n画长\t459833\n鬼魅浮生\t459834\n什邡市\t459835\nIntermediate\t459836\n055万吨\t459837\n云计算技\t459838\n九重紫\t459839\n寄出\t459840\n宫后\t459841\n挂职锻炼\t459842\n先征\t459843\n克制\t459844\n岁岁年年\t459845\n木兰从军\t459846\n时间型\t459847\n围捕\t459848\n二封\t459849\n批号\t459850\nRewards\t459851\nglobrand\t459852\nFCS\t459853\n美国之声\t459854\n惊奇队长\t459855\n东营河口\t459856\n抹掉\t459857\n神气\t459858\n小萝莉的猴神大叔\t459859\n天津市小学\t459860\nppt页码\t459861\n抱一\t459862\n国务院国有资产监督管理委员会\t459863\nngui\t459864\n来事儿\t459865\n办成\t459866\nJacck\t459867\nfcr\t459868\n彪悍的人生\t459869\n独赢\t459870\ntxt版\t459871\n上海申花\t459872\n财产\t459873\n郑楚\t459874\n朱嘉伟\t459875\n皇朝\t459876\n碘水\t459877\n布景\t459878\nNMR\t459879\n19页\t459880\n棚户区\t459881\n谍战\t459882\n擅离职守\t459883\nNavmesh\t459884\n096\t459885\n九鼎之战\t459886\n定焦头\t459887\n接线箱\t459888\n咸金桔\t459889\nThomas\t459890\n2338\t459891\n麸\t459892\n百盛购物中心\t459893\n同学群\t459894\n有身\t459895\n4054\t459896\n解谜类\t459897\n泡脚\t459898\n国家工商行政管理局\t459899\n船钓\t459900\n鱼尾狮公园\t459901\nFlowerplus\t459902\n公云\t459903\n李香君\t459904\n卸荷\t459905\n20160831\t459906\nsdms\t459907\n2032\t459908\n4yue\t459909\n大连国税\t459910\n东海大学\t459911\ncara\t459912\n两周多\t459913\n87628386\t459914\n泾河新城\t459915\n红烟\t459916\n上饶地区\t459917\n窦尔敦\t459918\nMLG\t459919\n数百位\t459920\nArcLive\t459921\n第一创业证券\t459922\n分布式光伏项目\t459923\n水事\t459924\n自检\t459925\n过年回家\t459926\n论争\t459927\n山友\t459928\n奸贼\t459929\n重相逢\t459930\n漏电开关\t459931\n生医\t459932\n杀菌灯\t459933\n临颍县\t
459934\n霜冻\t459935\n甬江街道\t459936\n真情告白\t459937\nBRIEF\t459938\nyui\t459939\n换装\t459940\n乔治亚\t459941\n影视网\t459942\n善存佳维片\t459943\n德胜河\t459944\n鱼业\t459945\n123123\t459946\n600651\t459947\n姧\t459948\n苏乞儿\t459949\nqq通讯录\t459950\n新地DJ音乐网\t459951\n萨\t459952\n6a\t459953\n冷油器\t459954\n天安\t459955\n第三十五届\t459956\n10000_\t459957\n传喜法师\t459958\nQ房网\t459959\n火符\t459960\n博通股份\t459961\nICF\t459962\n饲料机\t459963\n金华职业技术学院\t459964\n220i\t459965\nCelestial\t459966\n张蔚\t459967\n阵阵\t459968\n斜齿轮减速机\t459969\n历尽沧桑\t459970\n2017年9月10日\t459971\n小学馆\t459972\nTCL集团股份有限公司\t459973\ncask\t459974\n冯氏\t459975\n猎装\t459976\n银矿石\t459977\n心画\t459978\n30发\t459979\n焚情\t459980\namen\t459981\nMP3/320K\t459982\n钢门\t459983\n佳鑫\t459984\n祖儿\t459985\nJetstar\t459986\nurllib\t459987\nezb\t459988\nsvo\t459989\n船鞋\t459990\n郧西\t459991\n圆角机\t459992\n商英\t459993\n裹脚\t459994\nquantum\t459995\n星特朗\t459996\n田浩\t459997\n黄老板\t459998\nWindows7之家\t459999\n厂子\t460000\n6.10\t460001\nnpf\t460002\n仙剑98\t460003\n南瓜藤\t460004\n宫古海峡\t460005\n天津卫\t460006\ni5/Cor\t460007\n旋风十一人\t460008\n白艳妮\t460009\n解调\t460010\nMighty\t460011\nWindows10易升\t460012\n青瓦\t460013\n三十六岁\t460014\n莫雷诺\t460015\n西路\t460016\n1995\t460017\n【众泰T500图解】众泰汽车众泰T500图解大全\t460018\n北京北\t460019\n面神经炎\t460020\n华远枫悦\t460021\n731苗木网\t460022\n占有率\t460023\n小便宜\t460024\nVascular\t460025\n感恩\t460026\n开头女\t460027\nkpl吧\t460028\n卡侬\t460029\nNAZA\t460030\ntracker\t460031\nIIS服务器\t460032\n纯血\t460033\n裂行\t460034\n迹美珠\t460035\n黄渤哈登\t460036\naortic\t460037\n生民\t460038\n振兴战\t460039\n轮状病毒感染\t460040\n污话\t460041\nios7.0\t460042\n塔尔德利\t460043\nStones\t460044\n玄亮\t460045\nKu\t460046\n58.com\t460047\n呜呼\t460048\n旋变\t460049\n蝙蝠侠\t460050\n促销\t460051\n水泥粉\t460052\n二十七年\t460053\n虚拟化技术\t460054\nX600\t460055\n易企秀社区论坛\t460056\n帝舵表\t460057\n博生\t460058\nheterogeneous\t460059\ngsc\t460060\n新尚\t460061\n二级建造师论坛\t460062\n1.4.3.3\t460063\na7r\t460064\n张家港外国语学校\t460065\n癌症\t460066\n鑫业\t460067\n募股\t460068\n世行\t460069\n复写\t460070\ncth\t460071\ndebounce\t460072\n挂牌公司\t460073\n闺门\t460074\n女装网\t460075\n妙手健康网\t460
076\nLifts\t460077\n解战袍\t460078\n粉尘螨滴剂\t460079\n信长之野望14pk\t460080\n2089\t460081\ns7300\t460082\n量程\t460083\n妙解\t460084\n八零年代\t460085\n日用\t460086\nSwitchHosts\t460087\n浏河镇\t460088\n三岛由纪夫\t460089\n非公有\t460090\n石埠\t460091\n纵享\t460092\n调点\t460093\n鸣号\t460094\n三四年\t460095\n教务部\t460096\n钱颖\t460097\n国际收支平衡表\t460098\n云平\t460099\n后患无穷\t460100\n燃气锅炉\t460101\n512号\t460102\n192.168.100.1\t460103\n林伟贤\t460104\n关小迪\t460105\n天龙特攻队\t460106\n婴儿房\t460107\nled屏\t460108\n中英网\t460109\n康莉\t460110\nLiking\t460111\n宏观经济\t460112\nmosfet\t460113\nxiaoshuoba\t460114\n55次\t460115\norbslam2\t460116\n下马威\t460117\n南昌教育信息网\t460118\n维纳\t460119\nPSIM\t460120\n泉州市农业局\t460121\n意愿\t460122\n深赛格\t460123\nassess\t460124\n图书管\t460125\nDiffuse\t460126\n明管\t460127\n道德败坏\t460128\n世贸中心\t460129\n论治\t460130\n3.22\t460131\n平面设计专业\t460132\n畅通无阻\t460133\ndriving\t460134\n戒毒\t460135\n面包师\t460136\n昂昂\t460137\n可比\t460138\n驱动板\t460139\n1分集\t460140\n34p\t460141\n莲花一村\t460142\n手肘\t460143\nathletes\t460144\n互联网信息\t460145\n段亮\t460146\npi\t460147\n东施娘\t460148\n霍尔茨\t460149\n【元\t460150\nJam\t460151\n吴海涛\t460152\niLO\t460153\n嘉兴秀洲\t460154\n拷屏\t460155\n服装店\t460156\n骨膜炎\t460157\n三零\t460158\n形胜\t460159\nYYMP3音乐网\t460160\n尔\t460161\n抗凝药\t460162\n星沙街道\t460163\n中国核工业建设集团有限公司\t460164\n1千米\t460165\n内江师范学院\t460166\nRedtube\t460167\n河北省高级人民法院\t460168\n牛初乳粉\t460169\n异位\t460170\nBDI指数\t460171\n鲜明\t460172\nFAIL\t460173\n樊代明\t460174\ncents\t460175\n跃\t460176\n帅照\t460177\n太阳光\t460178\n多多\t460179\n建国门内大街\t460180\n违禁\t460181\n_v1.0.0安卓\t460182\n前帘\t460183\n中国大熊猫保护研究中心\t460184\n实处\t460185\n众泰T500\t460186\n柴草\t460187\n帐\t460188\n大大集团\t460189\n连衣\t460190\n竹联帮\t460191\n天健集团\t460192\n完美硬盘版\t460193\n鄞江\t460194\nMotion\t460195\nCCTV7\t460196\n捆梆\t460197\n副页\t460198\n主干道\t460199\n中煤集团\t460200\n中国央行\t460201\npannel\t460202\n沂蒙山银座天蒙旅游区\t460203\n1200亿元\t460204\n酒菜\t460205\n贝壳网\t460206\n宏福苑\t460207\n宸宫\t460208\n2007年1月\t460209\n0305\t460210\n欧创\t460211\n生化危机3\t460212\n申告\t460213\n煽情\t460214\nyixueshenj\t460215\n偏低\t460216\n魔帆\t460217\nocsp\t460218\nicp许可证
\t460219\n胜任力模型\t460220\n云南经济网\t460221\n糖酒商品交易会\t460222\n李宗仁\t460223\n单块\t460224\n李庄案\t460225\n3dsmax9.0\t460226\n400多张\t460227\n过渡语\t460228\n方励\t460229\n4.73\t460230\n25码\t460231\n天若有情天亦老\t460232\n大寒\t460233\n中概股\t460234\n第42章\t460235\n孤战\t460236\nchaincode\t460237\n廖\t460238\n人工牛黄甲硝唑\t460239\nPcap\t460240\n深圳供电局\t460241\n奶油\t460242\nv6.0.2\t460243\n世茂滨江\t460244\n干热\t460245\n美图手机\t460246\n前任三\t460247\n参贷\t460248\n诛仙阵\t460249\nwin7防火墙\t460250\n依法治校\t460251\n我一个\t460252\n欧宝汽车\t460253\n上午\t460254\n交互分配法\t460255\n陕西省旅游局\t460256\n名方\t460257\n东方不败之风云再起\t460258\n毒女\t460259\n2英寸\t460260\n龙泉村\t460261\n四川电视台\t460262\n半盔\t460263\n烟台市政府\t460264\n霍格沃茨魔法学校\t460265\n2012-2020年\t460266\n福马\t460267\nnba2kol吧_\t460268\n警钟\t460269\n纤夫的爱\t460270\n此时此刻\t460271\n官证\t460272\n倒吊\t460273\n200多斤\t460274\nAlipay\t460275\nDad\t460276\nFoxx\t460277\n25季\t460278\n奔驰GLA\t460279\n雷克校园全能高手\t460280\n大结\t460281\n中国青年旅行社\t460282\n广粤路\t460283\n天镇\t460284\n弄斧\t460285\n端差\t460286\n指数基金\t460287\n诈骗集团\t460288\n恶化\t460289\n丝心社\t460290\nnormally\t460291\n戾气\t460292\n158元\t460293\n党人\t460294\n说了算\t460295\n小熊尼奥\t460296\n谢啦_\t460297\n000979\t460298\nToLOVE\t460299\n百田田田圈\t460300\n二三极\t460301\n走狗\t460302\n离子方程式_\t460303\n马勇\t460304\n张小龙\t460305\n王世子\t460306\n我和我的祖国\t460307\n基友们\t460308\n铜铁\t460309\n襄阳\t460310\nLufthans\t460311\nscreen\t460312\n生冷\t460313\n&#61676\t460314\n超级弹丸论破2\t460315\n10006\t460316\nWXS\t460317\n小赛\t460318\ntold\t460319\n内息\t460320\n难当\t460321\noppor7plus\t460322\n徒弟\t460323\nfema\t460324\n春秋五霸\t460325\n欧陆战争5\t460326\n开门见山\t460327\n神魂\t460328\n1184\t460329\n五胡乱华\t460330\n模压化粪池\t460331\n石榴皮\t460332\n练出\t460333\n杭州青少年活动中心\t460334\n琼恩雪诺\t460335\n汽车口碑网\t460336\n翻窗\t460337\n黄骅在线\t460338\n景阳\t460339\n张云雷\t460340\n少海\t460341\n泽尻英龙华\t460342\n杀手5:赦免\t460343\n证券市场红周刊\t460344\n看视\t460345\n丰田塞纳\t460346\n播放\t460347\n可容\t460348\n黄晓东\t460349\n魅蓝3\t460350\n对你爱不完\t460351\n好山好水\t460352\n石膏线\t460353\n老子是魔法少女\t460354\n占城\t460355\n靶向治疗\t460356\n焚化炉\t460357\n百度机器人\t460358\n信息与通信工程专业\t460359\n运价\t460360\n小升初摇号\t4
60361\n语不惊人死不休\t460362\n苍劲\t460363\n老有所养\t460364\nAf\t460365\nEXCEL文件\t460366\nshrimp\t460367\n高俊\t460368\n反贪风暴2\t460369\n果子狸\t460370\nmytxx\t460371\nedgar\t460372\n择日而亡\t460373\n墨西哥站\t460374\n别进\t460375\n滁州市政府网\t460376\n存货周转天数\t460377\n英雄联盟季中冠军赛_\t460378\n盐巴\t460379\n2幅\t460380\n广汽日野\t460381\n北川瞳\t460382\nmemcpy函数\t460383\n17座\t460384\n唐河县人民政府\t460385\n众美\t460386\nssw\t460387\n2万毫安\t460388\nunbelievable\t460389\n腺\t460390\n都门\t460391\n约伴\t460392\n富力红树湾\t460393\n中关村发展集团\t460394\n家族化\t460395\n当官\t460396\n英战\t460397\n万钢\t460398\n嘉禾世兴\t460399\n乐妃\t460400\n红黄蓝事件\t460401\n红衣军\t460402\nA希亿\t460403\n律师事\t460404\n死灵法\t460405\n具体化\t460406\n巴里黄檀\t460407\nexpend\t460408\nshihua\t460409\nぶ\t460410\n北宫\t460411\n肠道\t460412\n兔子们\t460413\n暖手宝\t460414\n范永亮\t460415\n15V\t460416\n2017版)\t460417\nMike_Zhang\t460418\nfinfo\t460419\n景宏\t460420\ncc1310\t460421\n杂碎音乐论坛\t460422\n人民解放战争\t460423\n政者\t460424\n老人机\t460425\n滨州市财政局\t460426\ndnf阿修罗吧\t460427\nMACD股票论坛\t460428\n队名\t460429\n加工贸易\t460430\n少年四大名捕\t460431\n若干张\t460432\nDAWN\t460433\n韩某\t460434\n溏心风暴3\t460435\npumps\t460436\n绿阵\t460437\n艺术史\t460438\n底层\t460439\nHIWIN\t460440\n突发事件应对法\t460441\n现下\t460442\n百度社\t460443\n20161111\t460444\n合宪性\t460445\n第30天\t460446\n艾粄\t460447\n夏彦\t460448\n52届\t460449\n西安理工\t460450\n安溪网\t460451\n古河道\t460452\n不进则退\t460453\n2094\t460454\n均有\t460455\n星野\t460456\n色书\t460457\n宋词\t460458\n中南泷悦府\t460459\n龙蟠\t460460\n江苏省新闻出版广电局\t460461\n叫做爱\t460462\n授权\t460463\n触龙\t460464\n哈巴\t460465\n伤员\t460466\n离心通风机\t460467\nWit\t460468\nNaomi\t460469\n市住房公积金管理中心\t460470\n博世汽车部件(苏州)有限公司\t460471\n乱仑\t460472\nnostalgia\t460473\n百胜餐饮集团\t460474\n临沂论坛\t460475\n全音符\t460476\n太阳系\t460477\ngt750m\t460478\n高低音\t460479\n市工商联\t460480\n咖啡斑\t460481\nabcam\t460482\n受热\t460483\n武汉公积金中心\t460484\n琴剑\t460485\n第5批\t460486\n半座\t460487\n亲水性\t460488\n人造金刚石\t460489\n弑母案\t460490\n初品\t460491\nweb.mit.edu/rbhatt/www/seema/1566\t460492\n邓磊\t460493\n36d大奶网\t460494\n钢琴块\t460495\n草民电影网\t460496\n10500元\t460497\nnaviswork\t460498\n净值\t460499\n社会保障性\t460500\n舌尖上的中
国\t460501\n折纸\t460502\nWin10/Win8.1/Win7\t460503\n列行\t460504\ndowns\t460505\n飘然\t460506\n票房大卖王\t460507\n这周末\t460508\n缓凝剂\t460509\n丁亥日\t460510\ngsr\t460511\n众泰T700_众泰_众泰T700报价_众泰T700\t460512\nScattering\t460513\nda14580\t460514\n大富翁9\t460515\n省公路局\t460516\n名姝\t460517\n小室\t460518\n通安\t460519\n液流\t460520\n箭杆\t460521\n绞刑\t460522\nmti\t460523\n山桃\t460524\np3试机号\t460525\n银行间拆借利率\t460526\nfx2n\t460527\n小目\t460528\n信谊\t460529\n涌出\t460530\n新宝公司\t460531\n托尼·贾\t460532\n滨海古园\t460533\n3788\t460534\nHawk\t460535\n唇部\t460536\n25载\t460537\njqw\t460538\n持平\t460539\n币市\t460540\n词表\t460541\n四川行\t460542\nbrutal\t460543\n中转港\t460544\n学尔森\t460545\nKitty\t460546\n广渠门中学\t460547\n途观1.8t\t460548\n标名\t460549\n快乐跑\t460550\n阿柯\t460551\n2015年12月31日\t460552\n冬病夏治\t460553\n众诚保险\t460554\n4.60分\t460555\n万安公墓\t460556\n1000股\t460557\n导槽\t460558\nShaanxi\t460559\naux\t460560\n2016级\t460561\n茶马古道\t460562\n国际物流公司\t460563\n惘然\t460564\n海洋资源\t460565\n花都政府网\t460566\n驾驶驶\t460567\n钟头\t460568\n能化\t460569\n觞\t460570\nf272\t460571\n自感系数\t460572\n这事\t460573\n一]\t460574\n饮水\t460575\n附列\t460576\n十全十美\t460577\n十二生肖传奇\t460578\nGBC\t460579\n她是我的朋友\t460580\n收容教育\t460581\n粗度\t460582\n之\t460583\n蒋瑶嘉\t460584\nsit\t460585\nImplement\t460586\n负载机\t460587\n逗趣网\t460588\nv2.5.3\t460589\n许巍\t460590\n邦华\t460591\n上海化科实验器材有限公司\t460592\n刘迪\t460593\n7030\t460594\n渥太华\t460595\n志贵\t460596\n马克思主义基本原理\t460597\n健康证\t460598\n1.2t\t460599\n浙中\t460600\n字底\t460601\n打码器\t460602\nPhpcms\t460603\ntermination\t460604\n采育\t460605\nHyper\t460606\n0.dll\t460607\na7rm3\t460608\n推盘\t460609\n面包坊\t460610\n河川\t460611\n组合物\t460612\n商标法\t460613\n热熔管\t460614\n定速\t460615\n百度股市通\t460616\n22222\t460617\n潮女\t460618\n工作方式\t460619\n2.8亿\t460620\n新妈\t460621\n联商\t460622\n哈佛\t460623\nthrottle\t460624\n五迷\t460625\n广西师范学院师园学院\t460626\n国家药典委员会\t460627\n李玉梅\t460628\nMatebook\t460629\n野丫头\t460630\n路堤\t460631\n自感\t460632\nEXPERIENCE\t460633\n一个2\t460634\n数根\t460635\n王大娘\t460636\n郑庄\t460637\n保德县\t460638\nNike+\t460639\n13957842332\t460640\n天翼飞young\t460641\n五边形\t460642\n网掩
码\t460643\n数据化\t460644\n夺宝\t460645\n诸天万界\t460646\n埋刮板\t460647\n长安欧尚长安欧尚A8002018款\t460648\nindexof\t460649\n广东松山职业技术学院\t460650\n检视\t460651\n诺嘉\t460652\n乳色\t460653\n盲目\t460654\n炸群\t460655\n锂亚\t460656\n心跳过缓\t460657\n2019年4月\t460658\n勃朗\t460659\n_税意正浓\t460660\nSeas\t460661\npig\t460662\n享御\t460663\n言录\t460664\niphone7Plus\t460665\n快捷方\t460666\n滑牙\t460667\n得力集团有限公司\t460668\n大鼠标\t460669\ndallas\t460670\nStarcraft\t460671\n肾移植\t460672\n连山县\t460673\n联想g410\t460674\n仿盛\t460675\nUW\t460676\n继承制\t460677\n开根\t460678\n一千美元\t460679\nPRIMARY\t460680\n恶男\t460681\n赛格广场\t460682\nT9\t460683\n日间照料中心\t460684\n果丹皮\t460685\n新四军军部旧址\t460686\n野合\t460687\nmilantgh\t460688\n题曲\t460689\n应用性\t460690\nPoLoYes\t460691\nrupture\t460692\n硫酸阿托品\t460693\n九年级\t460694\npets3\t460695\n缴交\t460696\nSVS\t460697\n水泽乃乃\t460698\n蜗壳\t460699\n厦漳\t460700\nlmt\t460701\nbehind\t460702\n造物节\t460703\n瞬时\t460704\n新发地\t460705\nG502\t460706\ngone\t460707\n破事儿\t460708\n上学吧挂靠网\t460709\n神龙冠日\t460710\n李秀明\t460711\n发展村\t460712\n双梦镇\t460713\n无锡县\t460714\n尼格\t460715\n律动操\t460716\n暗黑风\t460717\nERCP\t460718\nDiffuser\t460719\nhel\t460720\n商户端\t460721\nmsconfig\t460722\n荣耀鲁班七号\t460723\nMaven3\t460724\n甜宠文\t460725\n仙峰\t460726\n湖北省地方税务局\t460727\n滤片\t460728\n张聪\t460729\nAgents\t460730\n秋山莉奈\t460731\n林武\t460732\n莫名奇妙\t460733\n官运\t460734\n训营\t460735\n黑椒牛肉\t460736\ndsa\t460737\npubs\t460738\n港漂圈\t460739\n批量保存\t460740\nVOSS\t460741\n非竞争性\t460742\n遗梦\t460743\n射击类\t460744\n艾漫\t460745\n五剑\t460746\n上善颐园\t460747\n广西壮族自治区地方税务局\t460748\n金汇大厦\t460749\n市场营业时间\t460750\n图标\t460751\n东辽县\t460752\n10话\t460753\n常熟广播网\t460754\n山脊\t460755\n踏舞\t460756\n优子\t460757\n北京进化者机器人科技有限公司\t460758\n三镜\t460759\n机坪\t460760\ndesu\t460761\n免堆期\t460762\npartition\t460763\n8幅\t460764\n打爆\t460765\n20151025\t460766\n南城街道\t460767\n顺手\t460768\n软骨素\t460769\n澳门银行\t460770\n商函\t460771\n1.3.0_\t460772\n鲁\t460773\n2600\t460774\n神农山\t460775\n奇博\t460776\n戴维贝拉\t460777\n有所思\t460778\n医科大学\t460779\n钟鸣镇\t460780\nhelvetica\t460781\nAW\t460782\nICQ\t460783\n朴秀荣\t460784\n第4版\t460785\n真空滚揉机
\t460786\n星野志坚\t460787\nRDB\t460788\n冯轲\t460789\n属性管理器\t460790\n艾斯维尔\t460791\n刘绍勇\t460792\n钯催化剂\t460793\nBroadcastReceiver\t460794\ntjsql\t460795\n橄榄核\t460796\ndeeds\t460797\n5.7寸\t460798\n700bike\t460799\n追一科技\t460800\n米克朗\t460801\n轮库\t460802\n六盘水日报\t460803\nLaval\t460804\n玻璃钢格栅板\t460805\n破\t460806\n娱乐\t460807\n永乐店\t460808\n农转非\t460809\n夜以继日\t460810\n鼹鼠的月亮河\t460811\n遗留物\t460812\n跳屏\t460813\n人寿\t460814\n奶阵\t460815\n易卡白金卡\t460816\n杨柳青庄园\t460817\n丙子年\t460818\n胡刚\t460819\nAttention\t460820\n滴声\t460821\n华虹\t460822\nFORD\t460823\n信长之野望大志\t460824\n央票\t460825\n勇者无惧\t460826\n中国工程物理研究院\t460827\n中国中铁二院工程集团有限责任公司\t460828\n圣卡\t460829\n一年级语文\t460830\nLED发光字\t460831\nhover\t460832\n光谷店\t460833\n关于\t460834\nAMA\t460835\n新力集团\t460836\n0.5.0\t460837\n赵恒\t460838\nticwatch2\t460839\n2012-2016年\t460840\n关关\t460841\n梧桐苑\t460842\n老百\t460843\n七瀬\t460844\n有空值\t460845\nsaliency\t460846\n鹰头\t460847\n小米电视3S\t460848\nea\t460849\n62张\t460850\nWIN98\t460851\n汕头市潮阳实验学校\t460852\n浩辰电气\t460853\n金道\t460854\nJSTOR\t460855\n电动公交车\t460856\n麓湖公园\t460857\n价层\t460858\n溅水\t460859\n00002\t460860\n3D之家\t460861\n醉卧君怀笑离伤\t460862\n滑跃\t460863\n土地出让\t460864\n寒武\t460865\n桃山区\t460866\n中国电子科技集团有限公司\t460867\n黑牛食品\t460868\n7节\t460869\niPhone7Plus\t460870\n下周末\t460871\n荣耀点券\t460872\n桃根\t460873\nCutie\t460874\nr330\t460875\nmax3232\t460876\n甲港\t460877\n吴晓明\t460878\nxsinx\t460879\n8380\t460880\nPALACE\t460881\n声母\t460882\n草厂\t460883\n铝基复合材料\t460884\n刘天佐\t460885\n亭廊\t460886\n四氢呋喃\t460887\n加州旅馆\t460888\nHADAX\t460889\n柴炉\t460890\n朗德\t460891\nautohold\t460892\n290X\t460893\n企业信息化\t460894\n速腾1.4t\t460895\n7匹\t460896\nLastPass\t460897\nPirelli\t460898\n方展博\t460899\n北医三院生殖中心\t460900\n20英寸\t460901\nManager5\t460902\ntgab\t460903\n社区服务器\t460904\n洪熙官\t460905\n倩女销魂\t460906\n长沙人大\t460907\n蓟门桥\t460908\n名贴\t460909\n石岭镇\t460910\n不由己\t460911\n华为路由器\t460912\n黑狱断肠歌2\t460913\n防水盒\t460914\n合购\t460915\njellyfish\t460916\n20170110\t460917\n转错\t460918\n3541\t460919\n侧入式\t460920\n袁熙\t460921\n联想天逸510S\t460922\n智学\t460923\n基础部\t460924\n敦化\t460925
\n转图\t460926\n内容物\t460927\nsensation\t460928\nPrepared\t460929\n/袋\t460930\n解放者杯\t460931\n校情\t460932\n偏铝酸钠\t460933\n小溢\t460934\n湖南省国资委\t460935\n一局\t460936\nLED电视\t460937\n水浒笑传\t460938\nTUV\t460939\n佛山市国土资源和城乡规划局\t460940\n新时达\t460941\nqq支付\t460942\n山东省审计厅\t460943\n深圳地区\t460944\n国际护士节\t460945\n米谷\t460946\n金蝶KIS\t460947\n百度云俱乐部|百度网盘资源下载|百度云资源分享俱乐部\t460948\n陆家嘴金融网\t460949\n六十年前\t460950\n真本事\t460951\n芬妮希尔\t460952\n星际争霸吧\t460953\nwe战队\t460954\n大荔\t460955\n重叠加\t460956\n铜川新区\t460957\n万福生科\t460958\n大魔鬼\t460959\ncp22\t460960\n趸\t460961\n静态函数\t460962\n日产奇骏\t460963\nbbs.uuu9.com\t460964\nJacobs\t460965\n清宫无间\t460966\ndocbase\t460967\nPALM\t460968\n秘密关系\t460969\nbinutils\t460970\n海滩救护队\t460971\n朗基努斯\t460972\n龙昌\t460973\nshared\t460974\nZUI\t460975\n程序\t460976\nSPX\t460977\n河定桥\t460978\n好险\t460979\nchristy\t460980\n渐行\t460981\n20型\t460982\n北京张裕爱斐堡国际酒庄\t460983\n胡惟庸\t460984\n安徽联通\t460985\n雪碧纤维+\t460986\n弥散\t460987\nSuzhou\t460988\n电话机\t460989\n刘玉平\t460990\n蜡染\t460991\ncet6\t460992\n黄山大道\t460993\nhdmi接口\t460994\n腾讯游戏\t460995\n元朗区\t460996\n隆隆岩\t460997\n四平八稳\t460998\n汉朝时期\t460999\nValley\t461000\npolarity\t461001\n重庆财经职业学院\t461002\n勒死\t461003\n海岛奇兵\t461004\n韩熙载夜宴图\t461005\n进档\t461006\nGlossary\t461007\nspeedy\t461008\n文组\t461009\n7.0下\t461010\ne生2017\t461011\n化险为夷\t461012\n13579\t461013\n獭\t461014\nHassan\t461015\n造型线\t461016\n发育不全\t461017\nSQLYog\t461018\n宝都\t461019\n乎\t461020\n二力平衡条件\t461021\n南京大屠杀\t461022\n杭州湾\t461023\n激进型\t461024\n图情\t461025\n重竹\t461026\n聚才\t461027\n1.6cm\t461028\n前级\t461029\n茄子\t461030\nContracting\t461031\n抱朴\t461032\n被查获\t461033\n陈国荣\t461034\n倒金字塔\t461035\n挨揍\t461036\n蠢哭\t461037\n牧游侠\t461038\n9.07\t461039\n菌袋\t461040\n白金汉爵大酒店\t461041\n中共广元市纪委\t461042\n三天三夜\t461043\n名堂\t461044\n接诊\t461045\n一起玩\t461046\n鹰手营子矿区\t461047\n干线公路\t461048\n石姓\t461049\n防爆控制柜\t461050\n金茂梅溪湖\t461051\n萝岗区\t461052\n金溪县\t461053\n顽固性\t461054\n辣酱\t461055\n神探夏洛克吧\t461056\n三鹿奶粉\t461057\n小黑龙\t461058\nSqlplus\t461059\n中国体育舞蹈联合会\t461060\n剑三五毒\t461061\n武林银泰\t461062\n雀舌黄杨\t461063\n几心\t461064\n小法\t4610
65\n浆管\t461066\n恒定\t461067\n侨乡\t461068\n神奈川冲浪\t461069\n回原\t461070\nuVision5\t461071\n北京按摩医院\t461072\n走入\t461073\n银领\t461074\n人人公司\t461075\n卓多姿\t461076\nmusic.163.com\t461077\n电动平板车\t461078\n一品堂\t461079\n印度语\t461080\n茶几\t461081\n土象星座\t461082\n四大佛教名山\t461083\n喝酸奶\t461084\n480ml\t461085\n佯\t461086\nDoshow\t461087\n护险\t461088\nflaw\t461089\n追逐战\t461090\n鸽药\t461091\n山面\t461092\n息火\t461093\n谢天笑\t461094\n波峰波谷\t461095\nweizhxa\t461096\npex\t461097\n大数据分析师\t461098\n赤山\t461099\n否定\t461100\ndaniels\t461101\n158路\t461102\nWriting\t461103\n逐字稿\t461104\n张亚南\t461105\n球球大作战\t461106\nSullivan\t461107\n油状\t461108\n随录\t461109\n直系亲属\t461110\n包扎\t461111\nNie\t461112\n腊月\t461113\n潜店\t461114\n2017.11.20\t461115\n逐时\t461116\n情趣性\t461117\n五分\t461118\n英文名女\t461119\n撩汉\t461120\n12.22\t461121\n熊彼特\t461122\n张继红\t461123\nMoho\t461124\n己亥杂诗\t461125\nbigboss\t461126\nArisa\t461127\n太谷饼\t461128\n汉草\t461129\n91混血\t461130\n少夫\t461131\n陈国星\t461132\n龙头寺\t461133\n炫腹\t461134\n陇南\t461135\nrslinx\t461136\nihs\t461137\nCimatronE\t461138\ncategory\t461139\n名词性\t461140\n中公教育上海分校\t461141\n东英吉利大学\t461142\n风雨彩虹铿锵玫瑰\t461143\n深圳都市报\t461144\n怎莫\t461145\n隔夜饭\t461146\nliked\t461147\n包头市政府\t461148\n无水碳酸钠\t461149\nCCTV央视\t461150\nJquery\t461151\n陶士涵\t461152\n姆米\t461153\n艺子龙\t461154\n林丹\t461155\n叶酸片\t461156\n发电机\t461157\n贪玩蓝月\t461158\n639亿美元\t461159\nunionID\t461160\n银联在线\t461161\n姑凉\t461162\n2.2g\t461163\nfucking\t461164\n亚历山大港\t461165\n康美药业股份有限公司\t461166\n诊断性\t461167\n河车\t461168\n马站镇\t461169\n潍柴\t461170\n4.0.7\t461171\nriley\t461172\n廊坊职业技术学院\t461173\n龙陵\t461174\nSunn\t461175\n周记录\t461176\nstruts-tags\t461177\n32颗\t461178\n22年后\t461179\n1000元\t461180\n上下层\t461181\n4326\t461182\n铁友\t461183\n赔偿费\t461184\n汤池温泉\t461185\n郑镒勋\t461186\n金蓝\t461187\n硬撑\t461188\nxpdf\t461189\n2010—2020年\t461190\nGalileo\t461191\n接线头\t461192\n温感\t461193\n珍珠泥\t461194\n朱建华\t461195\n浙江消防\t461196\n高槻泉\t461197\n英王\t461198\n中央美院\t461199\n潍坊传媒网\t461200\n救生圈\t461201\n筛沙机\t461202\n李老板\t461203\nDeepLearning\t461204\n骑马与砍杀无双三国\t461205\n_时光网\t461206\n送样\t46
1207\nQVector\t461208\n琶洲国际会展中心\t461209\n丰润吧\t461210\n录影\t461211\n安徽省环保厅\t461212\nPornia\t461213\n王光宇\t461214\n24平方厘米\t461215\n9项\t461216\n深圳大学法学院\t461217\n腰腿痛\t461218\n1.0.18\t461219\nolivia\t461220\n快驰\t461221\n安阳市政府\t461222\n沈修瑾\t461223\nJed\t461224\n纸袋\t461225\n芒格\t461226\n护口\t461227\n腻子粉\t461228\n遮瑕\t461229\n星尘\t461230\n快新\t461231\n赛意信息\t461232\n利根\t461233\n朱泽君\t461234\n佛山人才网\t461235\n盯人\t461236\n周公谨\t461237\n10月\t461238\nAPK8安卓网\t461239\n石峁遗址\t461240\n频传\t461241\n卡蜜拉\t461242\n张珂\t461243\n肾炎\t461244\n十一五\t461245\n含悲\t461246\n科迈罗\t461247\ndialed\t461248\ntune\t461249\n库卡隆\t461250\n绅宝D50\t461251\n高嘌呤食物\t461252\n骷髅马\t461253\n化学成绩\t461254\n温宁\t461255\nxzs\t461256\n福州卓凯电子科技有限公司\t461257\n瓶瓶罐罐\t461258\n骆驼峰\t461259\n轻化\t461260\n涞坊路\t461261\n政协传媒网\t461262\n安全员C证\t461263\n吃腻\t461264\n字盘\t461265\n华杯\t461266\n6.24\t461267\n3.2.8\t461268\n动动手\t461269\n陆晓\t461270\n少年杨家将\t461271\n环氧煤沥青防腐钢管\t461272\n中华人民共和国森林法\t461273\n赋值\t461274\n小老虎\t461275\nhx711\t461276\n同等学历\t461277\n吃里扒外\t461278\n寿司之神\t461279\n不虞\t461280\n心医\t461281\n凤凰传媒\t461282\n三桶\t461283\n盯\t461284\n恐友\t461285\n变分法\t461286\ndsk\t461287\n沪籍\t461288\n2006年11月16日\t461289\nnavi\t461290\n前5年\t461291\n柱温\t461292\n中建东孚\t461293\ntypora\t461294\n镇干部\t461295\n空调管\t461296\n360N5S\t461297\n堆糖\t461298\n绝食\t461299\ncrocs\t461300\n命运之手2\t461301\n富丽\t461302\npython类\t461303\nDetectors\t461304\n福建省人力资源社会保障\t461305\n大会师\t461306\nOFFICE2003\t461307\n胆囊结石\t461308\n3美元\t461309\ntpr\t461310\n线状\t461311\n瑞昌\t461312\n御龙在天美人版\t461313\nddwrt\t461314\n罗汉果\t461315\n智慧树大学生创业\t461316\n华盛顿大学西雅图分校\t461317\nCohen\t461318\n清油\t461319\nStatic\t461320\n迷藏\t461321\n2017年后\t461322\n另一个你\t461323\n杨诚\t461324\nfakes\t461325\nRaspberryPi\t461326\n博克斯\t461327\n9000F\t461328\n头脸\t461329\n板桥街道\t461330\n期中考\t461331\n何飞\t461332\nPOW\t461333\n600660\t461334\n文景苑\t461335\nufs\t461336\n玛丽莎\t461337\n李思训\t461338\ndevtool\t461339\n冰岩\t461340\nmillipore\t461341\nwhatman\t461342\n触地\t461343\n罗烈\t461344\n陆琪\t461345\n刷洗\t461346\n东京新干线\t461347\nPixologic\t461348\n围栏\t461349
\n董宁\t461350\n统御\t461351\n朱辛庄站\t461352\n崇仁\t461353\n半麻\t461354\ncoupon\t461355\n导航站\t461356\ndecor\t461357\n梦幻西游召唤兽\t461358\n泻药\t461359\n蔡森\t461360\n尤克里\t461361\n校规\t461362\navec\t461363\nDesigner09\t461364\n涵江\t461365\n王天琦\t461366\nsynchronized\t461367\n王臻\t461368\ninfosys\t461369\n拳皇mugen\t461370\ngallon\t461371\n周历\t461372\n尊雅\t461373\naimp\t461374\n裸阴\t461375\n修复率\t461376\n热血格斗传说\t461377\nMovieLens\t461378\nSpringSecurity\t461379\n失重\t461380\n技术服\t461381\nTomatoes\t461382\n郑老师\t461383\n赵民\t461384\nJspxcms\t461385\n刺影\t461386\nE卡\t461387\n芳草地\t461388\n衍纸画\t461389\n模组\t461390\nbreaks\t461391\n治癌\t461392\n刮痧油\t461393\n长传\t461394\n强子\t461395\n战鼓网\t461396\n字节集\t461397\n齐河县人民政府\t461398\n科米\t461399\n100bp\t461400\n领界\t461401\nayawawa\t461402\n亡友\t461403\nFuta\t461404\n秋粮\t461405\n归总\t461406\n宽恕\t461407\n老彭\t461408\n全力以赴\t461409\n刘婧\t461410\n夜惑\t461411\n决窍\t461412\nPractitioner\t461413\nrader\t461414\n0668\t461415\n结辩\t461416\n白玉兰奖\t461417\n男乒\t461418\n升级档\t461419\n三行\t461420\n螺纹钢\t461421\n時\t461422\n侯台\t461423\n暗中\t461424\n詢\t461425\n浦西玫瑰园\t461426\n华山路\t461427\n汉石桥\t461428\n孕婴童\t461429\n雅号\t461430\nHandlerInterceptor\t461431\n五年级下册语文\t461432\n参与型\t461433\nv2016\t461434\naday\t461435\nEditing\t461436\n官居\t461437\n奇功\t461438\n过犹不及\t461439\nVSFTP\t461440\n女王范儿\t461441\n长虫\t461442\n中国光通信\t461443\nzsc\t461444\n民福\t461445\n5.9%\t461446\n建设工程质量管理条例\t461447\n一价\t461448\nTokenizer\t461449\n就问一下\t461450\nCirrus\t461451\n亚组\t461452\nConcise\t461453\n双酚A\t461454\n心里话\t461455\n中国文史出版社\t461456\n12.7.4\t461457\n隙\t461458\n否决\t461459\n内酰胺酶\t461460\ndutti\t461461\nBiochemistry\t461462\n黄念祖\t461463\n哈尔滨医大一院\t461464\n中汇支付\t461465\n友達\t461466\nlnux\t461467\n单性花\t461468\n芹菜籽\t461469\n贵阳地铁1号线\t461470\n殒\t461471\n袖手旁观\t461472\n中录\t461473\n2018年04月01日\t461474\nCollapse\t461475\n娱乐乐翻天\t461476\n三千庭\t461477\n西北师大附中\t461478\n4410\t461479\nLead\t461480\n肿瘤学\t461481\n四脚蛇\t461482\ngps\t461483\n烟管裤\t461484\n克己\t461485\n新桥路\t461486\n张双利\t461487\n大拌菜\t461488\nAppSo\t461489\n一晚上\t461490\n教育宝\t461491\n打听
\t461492\n陈生强\t461493\n车顶行李箱\t461494\n卢恒宇\t461495\n成安渝高速\t461496\nBanyan\t461497\n更聪明\t461498\n黄山新闻网\t461499\n前哨战\t461500\n蜜酒\t461501\nSeth\t461502\n藤甲\t461503\nhilink\t461504\n双点\t461505\n上海长江大桥\t461506\n星野竜一]\t461507\n东莞时间网\t461508\n汇丰控股\t461509\n小幸运\t461510\n闭环系统\t461511\n重制\t461512\n混居\t461513\n帕尔\t461514\nPlugNT\t461515\n7nm\t461516\n倾世\t461517\n宋韵\t461518\n杰洛特\t461519\n1万公里\t461520\n中华城\t461521\n刘少奇故居\t461522\nBangBus\t461523\n七星彩\t461524\n王炳南\t461525\n大魔王\t461526\n第4套\t461527\nEasyWechat\t461528\nHepatitis\t461529\nFreelancer\t461530\n现金宝\t461531\n轩辕剑3:云和山的彼端\t461532\n磐安县\t461533\n辰州\t461534\n盐酸氨基葡萄糖胶囊\t461535\n擦拭\t461536\n侠义道2\t461537\n雕琢\t461538\n远期信用证\t461539\n麋\t461540\n中华南大街\t461541\n大航海ol\t461542\n莫小\t461543\nsciencenet\t461544\nzixing\t461545\n伊苏1\t461546\ndrown\t461547\n40毫升\t461548\n中国南方电网\t461549\n朴有天\t461550\n7.1_\t461551\n中国板报网\t461552\n拥有者\t461553\n德伯家的苔丝\t461554\n夜灵平野\t461555\n2018财年\t461556\ninformatica\t461557\n嘉定北站\t461558\n羊角镇\t461559\n800g\t461560\n铁证\t461561\n神魂颠倒\t461562\n农发\t461563\nofficers\t461564\n安全检查表\t461565\nKOKO\t461566\n上皮内瘤变\t461567\n鄂毕河\t461568\n哄抢\t461569\ntcmalloc\t461570\nMongodb\t461571\n车板\t461572\n湖南电力\t461573\n梅朵\t461574\n吉田洁\t461575\n宽限期\t461576\n哈娜\t461577\n学识\t461578\nautonomy\t461579\n笔谈\t461580\n教育信息化2.0行动计划\t461581\n姨姐\t461582\n初等教育\t461583\n兵团新闻网\t461584\n小猫钓鱼\t461585\nfj.122.gov.cn\t461586\nbookbao\t461587\n名怪\t461588\n豆柴\t461589\n晓飞\t461590\n征选\t461591\n钝化\t461592\n亚利桑那大学\t461593\nCards\t461594\n天知\t461595\n爆单\t461596\n河北省教育考试院\t461597\n脚步声\t461598\n转字\t461599\nFappening\t461600\n前7天\t461601\n徽宗\t461602\n相机镜头\t461603\nzhanghu\t461604\n中公版\t461605\n绝世唐门\t461606\n瑞风万古仙穹\t461607\n北海火车站\t461608\n塑粉\t461609\nems\t461610\n幕末\t461611\n吕鋆峰\t461612\nLatina\t461613\ndiva\t461614\n钢琴之森\t461615\n世联地产\t461616\n进口片\t461617\n武汉工商局\t461618\n个生\t461619\n雅培\t461620\nhrm\t461621\n小经验\t461622\nbeginner\t461623\n董维嘉\t461624\n万斯\t461625\nmpandroidchart\t461626\n甘泉村\t461627\n学林雅苑\t461628\n2.9.5\t461629\n上汽通用\t461630\n书纸\t461631\n蓝晶\t
461632\n国葬\t461633\nLEGEND\t461634\n稀巴烂\t461635\n2016天\t461636\n梗阻\t461637\n黄艺博\t461638\npredictor\t461639\n涵哥\t461640\n保释\t461641\n泵盖\t461642\nXRD\t461643\n内蒙古医院\t461644\n王胜男\t461645\nlimf\t461646\n间接\t461647\n初任\t461648\n象馆\t461649\n惠州市人民政府\t461650\nwin7系统之家\t461651\n中国音乐学院\t461652\nIVF\t461653\n25千克\t461654\n外头\t461655\n文坛\t461656\nflakes\t461657\n多吉\t461658\n攻城\t461659\nM1911\t461660\n大红大紫\t461661\n杏璞霜\t461662\n45所\t461663\n秦前红\t461664\n金洞\t461665\n草根\t461666\n椰子蟹\t461667\n银河SOHO\t461668\n幽梦\t461669\n螺旋榨油机\t461670\n栗原小卷\t461671\n玛丽\t461672\n阅历\t461673\n温度带\t461674\n清网\t461675\n硫氰酸铵\t461676\n莫森\t461677\n上海银联\t461678\n赛威\t461679\n一层半\t461680\n江特\t461681\n3d试机号\t461682\n自考英语\t461683\n简化版\t461684\n三小只\t461685\n30年内\t461686\n吴小龙\t461687\n牛云\t461688\n韩国外国语大学\t461689\nPT\t461690\n透明性\t461691\n希林娜依\t461692\ndbvisualizer\t461693\n玻尿\t461694\n屏山\t461695\n内政部\t461696\n矿区\t461697\nYORK\t461698\n水浴晨光\t461699\n速装\t461700\n好租网\t461701\n爱平乡\t461702\n调蓄\t461703\n23【\t461704\nosg\t461705\n惠氏启赋奶粉\t461706\n羟乙基纤维素\t461707\n孙毓敏\t461708\n商库\t461709\n无锡公司\t461710\n脱毛机\t461711\n创佳绩\t461712\n慢性心衰\t461713\n5D2\t461714\n阿和\t461715\n金健米业\t461716\n圣经故事\t461717\n出乌兰记\t461718\n阿旗\t461719\n穿衣助手\t461720\n筝曲\t461721\n钱江晚报\t461722\n记名\t461723\n8501\t461724\n第一包\t461725\nvideoleap\t461726\n两场\t461727\n单身情歌\t461728\n云南省纪委省监察厅\t461729\n仙股\t461730\n新中新\t461731\n凯恩之\t461732\n两极分化\t461733\n同站\t461734\n东山再起\t461735\n配套\t461736\n乐事\t461737\n子包\t461738\n蒲友网\t461739\n逆元\t461740\n练习曲\t461741\n鑫淼\t461742\n经济参考报\t461743\nUQ\t461744\nocclusion\t461745\n改用\t461746\n快速排序算法\t461747\n西青经济技术开发区\t461748\n长安竞技\t461749\n谢师宴\t461750\n倒逼\t461751\n决然\t461752\n弹丸论破3\t461753\n中国京剧音配像精粹\t461754\n衰减器\t461755\n交卷\t461756\n启动台\t461757\n沪南公路\t461758\n凭单\t461759\n老货\t461760\n广西壮族自治区忻城县政府\t461761\n纬\t461762\n蒙迪歌诗图\t461763\n斋藤\t461764\n经验主义\t461765\n传奇元宝\t461766\n_晓\t461767\n0029\t461768\n敢死\t461769\n汤姆克鲁斯\t461770\n底漆\t461771\n古言\t461772\nElite\t461773\n传帮带\t461774\n科技\t461775\n163\t461776\n富华里\t461777\n鸟巢\t461778\n苏大附一院\t461779\n团
鬼六\t461780\n御泉\t461781\n韩中\t461782\n周泽楷\t461783\n黄濑凉太\t461784\nroslyn\t461785\n五王\t461786\n单态\t461787\n眼镜城\t461788\n捂盘惜售\t461789\ns912\t461790\n森哥\t461791\n女蜗\t461792\n获表彰\t461793\n王璞\t461794\n中国民航管理干部学院\t461795\n普法网\t461796\n残差分析\t461797\n送亲\t461798\n自带\t461799\n15500\t461800\n短文学\t461801\n黄帝\t461802\nnote3全网通\t461803\n跳火\t461804\nfitter\t461805\n臂弯\t461806\nperkins\t461807\n白石岭\t461808\n侗寨\t461809\n制学\t461810\n35只\t461811\n王佑贵\t461812\n七巧\t461813\n涿州东\t461814\n宇扬\t461815\n媚术\t461816\n2017年12月19日\t461817\nAcademia\t461818\n万爽力\t461819\n28粒\t461820\n卢勇\t461821\n词袋\t461822\nDIVA\t461823\nMucho\t461824\n环太湖\t461825\n无秘\t461826\n下载管理器\t461827\n全景客虚拟旅游网\t461828\n喻虹渊\t461829\n甲状腺乳头状癌\t461830\n杨雪果\t461831\n英国皇家音乐学院\t461832\n大航海时代ol\t461833\n两斤\t461834\n雏\t461835\n后序\t461836\n青岛11号\t461837\n几季\t461838\n弹兵\t461839\n3天后\t461840\n太平洋咖啡\t461841\nTurns\t461842\n爱板网\t461843\n菜馆\t461844\n杰瑞米\t461845\n派瑞\t461846\n上海市黄浦区人民政府\t461847\n敌视\t461848\n多田君\t461849\nNSK轴承\t461850\n分赛\t461851\n3.6\t461852\n德布罗意\t461853\nWin7系统硬盘分区\t461854\nBrightening\t461855\n醋酸钾\t461856\n木守宫\t461857\nprotobufjs\t461858\nVB\t461859\n广州番禺区\t461860\nov2640\t461861\n乌兹别克\t461862\n杜尔伯特蒙古族自治县\t461863\n埙谱\t461864\n师级\t461865\nprovision\t461866\ndomo\t461867\n第167集\t461868\n1334\t461869\n桶形\t461870\n目总览\t461871\nSperm\t461872\n冷水江市人民政府\t461873\n2015年内\t461874\n同协路\t461875\n白银飞\t461876\n到场\t461877\nwdcwy\t461878\n死不起\t461879\n卡珊德拉\t461880\n7月1号\t461881\n20151220\t461882\n中国驾考网\t461883\n粤高速\t461884\n臻选\t461885\n彭杰\t461886\nIFA\t461887\n铁钱\t461888\n菜价\t461889\nPlymouth\t461890\n洋葱圈\t461891\n高顿资讯\t461892\n流过\t461893\n鼎兴\t461894\n首都医科大学附属北京佑安医院\t461895\n信长之野望14:创造\t461896\n福苑小区\t461897\n衡阳人才网\t461898\n帕森\t461899\n碱孕宝\t461900\nDRAM\t461901\n鱼缸过滤器\t461902\nx263.net\t461903\n荒谬\t461904\n结训\t461905\n兵心\t461906\n2x^3\t461907\nARGB\t461908\n江川路\t461909\n访韩\t461910\n53届\t461911\nIT外包\t461912\n南华县\t461913\n春娟\t461914\n凯顿\t461915\n成步堂\t461916\n颏\t461917\n发光板\t461918\n楼幢\t461919\n40分钟\t461920\n禁恋\t461921\nsail\t461922\n玛莎拉蒂Leva
nte\t461923\n158天\t461924\nbare\t461925\n为什么多\t461926\n小学生必背古诗词八十首\t461927\n北京国际鲜花港\t461928\nnewtype\t461929\n金陵凤凰台\t461930\n等效电路\t461931\ndatum\t461932\n中国勘察设计协会\t461933\n报信\t461934\n斜管沉淀池\t461935\n张定\t461936\ncaoporn97\t461937\n定率\t461938\n影视传媒公司\t461939\n灯照\t461940\n_____\t461941\nposter\t461942\n穿破\t461943\n探究式\t461944\n要实\t461945\n中小银行\t461946\n未接到\t461947\n梅斯蒂亚\t461948\nsmartphone\t461949\n矿山车\t461950\n宇航\t461951\n上海市锦天城律师事务所\t461952\n射雕英雄传之华山论剑\t461953\n灏\t461954\n客服端\t461955\n50多万\t461956\nKS5U\t461957\n塔纳利斯\t461958\n微基站\t461959\n平均功率\t461960\n拘捕\t461961\n黄晓君\t461962\novovz\t461963\nrunat=\t461964\n阎娜\t461965\n西安华都妇产医院\t461966\n电缆穿管\t461967\nCrossing\t461968\n端篇\t461969\n一九五三年\t461970\n红烛\t461971\n新农业\t461972\n黑龙江农垦总局\t461973\n华考\t461974\nJain\t461975\n上海东方\t461976\n恒大城市之光\t461977\n国考\t461978\n开叉裙\t461979\n137平米\t461980\nmeans\t461981\n暴恐案\t461982\n半妖倾城\t461983\n自卸汽车\t461984\nSISU\t461985\n销售日\t461986\n刘志红\t461987\n热解\t461988\n莫测\t461989\n凝滞\t461990\n黑姬\t461991\n真的好帅\t461992\n唱将\t461993\n第3步\t461994\naside\t461995\n北大医院\t461996\nGPU渲染器\t461997\n遮阳\t461998\n彩车\t461999\n太原城市职业技术学院\t462000\n施工场\t462001\n制服诱惑\t462002\n黄泉之契\t462003\n大龙燚火锅\t462004\n石岐区\t462005\n领队\t462006\n踏板\t462007\n语版\t462008\n小陈\t462009\n华艺\t462010\n九秒\t462011\ntestcase\t462012\n原代\t462013\n东北特钢\t462014\n生办\t462015\n螺距\t462016\nFees\t462017\n抄数\t462018\n切赫\t462019\n千回\t462020\nOPTIPLEX\t462021\n夏目的美丽日记\t462022\n80万吨\t462023\noptional\t462024\n非诚勿\t462025\n草莓酒\t462026\n500kg\t462027\n锁杆\t462028\n多伦多猛龙队\t462029\nelegance\t462030\nA10X\t462031\nMerlot\t462032\n潜能果\t462033\ntomcat7.0\t462034\n海鹰\t462035\n鹿城圈\t462036\n宜昌机场\t462037\n金薇\t462038\n黑领\t462039\n86013666\t462040\n8088\t462041\n百所\t462042\n孙倩\t462043\nmill\t462044\n好恨\t462045\n2016年国庆\t462046\n崔林\t462047\n潜龙三号\t462048\n大数据研究中心\t462049\n战略合作协议\t462050\n京瓷复印机\t462051\n悬窗\t462052\n民办高中\t462053\n陆国明\t462054\n李泽湘\t462055\n红瓦\t462056\n石板街\t462057\nFATFS\t462058\nmp4v2\t462059\n翼虎沃尔沃xc60\t462060\n犬犬\t462061\n供应方\t462062\ntriggered\t462063\nWin7SP
1\t462064\n吉米鸡毛秀\t462065\ngta5\t462066\n燕子掌\t462067\n极简\t462068\n手游戏\t462069\n傻瓜型\t462070\n医保卡\t462071\n药枕\t462072\n十七周\t462073\n路图\t462074\n东风水库\t462075\n抱着你\t462076\n胡文海\t462077\n范恒\t462078\n方瑜\t462079\n孝义吧\t462080\n115张\t462081\n槛\t462082\nFintech\t462083\nd12\t462084\n偏心轮\t462085\n石榴石\t462086\n安琪儿妇产医院\t462087\n线刷宝\t462088\n高湖村\t462089\n键线式\t462090\n仰义\t462091\nResponses\t462092\n子\t462093\n越过\t462094\n驱蚊器\t462095\n上海幼儿园\t462096\n第5集\t462097\npredictions\t462098\n1143\t462099\n潘通色卡\t462100\n布兰莎\t462101\n公路港\t462102\n过关卷\t462103\n山东鲁能\t462104\nRhino5.0\t462105\n盖兆泉\t462106\n李勇浩\t462107\n糖醋里脊\t462108\n禅语\t462109\n400nm\t462110\n神盾局特工第三季\t462111\nyouporn\t462112\n蓝牙遥控器\t462113\n外耳道\t462114\n岛崎遥香\t462115\ncyrus\t462116\n丹羽孝希\t462117\n衢州市民政局\t462118\nupm\t462119\n继妻\t462120\n千百鲁\t462121\n马到\t462122\n长效\t462123\n98五笔\t462124\n余角\t462125\n三碘甲状腺原氨酸\t462126\n达托霉素\t462127\n五次元\t462128\nSNC\t462129\n锦鹏\t462130\n专用\t462131\n福冈机场\t462132\n日元\t462133\n啾啾\t462134\ncdh\t462135\n草书千字文\t462136\n线描画\t462137\n西部机场集团\t462138\n窦漪房\t462139\n拔掉\t462140\n真么\t462141\n退休人员\t462142\nTVC\t462143\n以弗所书\t462144\n心草\t462145\n市政公用工程管理与实务\t462146\n2013年清明节\t462147\n星星\t462148\n15句\t462149\nfrost\t462150\nLED工矿灯\t462151\nFDA认证\t462152\n立交\t462153\nprovides\t462154\n古冶区\t462155\nDSLM\t462156\n吉利帝豪gs\t462157\n上周末\t462158\n8518\t462159\n重庆歌乐山\t462160\n中河街道\t462161\n不对\t462162\n魅声\t462163\n带状\t462164\n矫治器\t462165\n三言两拍\t462166\n柔肠\t462167\na72\t462168\n兰陵县\t462169\nODS\t462170\n指挥员\t462171\n沧海\t462172\n频带\t462173\ncoalition\t462174\n夏东\t462175\n第78名\t462176\nvsdx\t462177\n厦门第一医院\t462178\n微分学\t462179\n建发\t462180\n竹制品\t462181\n铜山网\t462182\n接电\t462183\nGrammarly\t462184\n温斯顿\t462185\n806路\t462186\n20170629\t462187\nAxela\t462188\n窟窿\t462189\n草莓派\t462190\n画册\t462191\n陈学军\t462192\n酱\t462193\n歌行\t462194\n40只\t462195\n标致207\t462196\n美包\t462197\n协同过滤算法\t462198\n甘井子区\t462199\n朱义\t462200\n西子奥的斯电梯\t462201\n文献展\t462202\n造作\t462203\n和睦家\t462204\n氯化铵\t462205\n圣安\t462206\n何兵\t462207\n陈钊\t462208\n顺德第一人民医院\t462209\n第2
0关\t462210\n波矢\t462211\n易明\t462212\n致畸\t462213\n白敬宇\t462214\n左道\t462215\n鸡肉卷\t462216\n缪莎\t462217\n轻云\t462218\n公交专用车道\t462219\n远销\t462220\n双城市\t462221\n朱蕾\t462222\n云飞儿\t462223\ncheckboxlist\t462224\n健身动起来\t462225\nElon\t462226\n古德\t462227\n萤幕\t462228\n预期\t462229\n二维码扫描器\t462230\n不要不\t462231\n八方支援\t462232\n第一棵\t462233\n小鱼网\t462234\nScanner\t462235\n萌化版\t462236\n二十秒\t462237\n德夫曼\t462238\n英雄战歌\t462239\n岳阳街道\t462240\n沪江网校\t462241\nrecat\t462242\n幸福家\t462243\n党纲\t462244\nw8.1\t462245\n乐山大佛\t462246\n大旭\t462247\n折皱\t462248\n瑞新\t462249\n筠\t462250\nEDB\t462251\n四亿\t462252\n营养费\t462253\nszh\t462254\n南无阿弥陀佛\t462255\n托举工程\t462256\n海博伦\t462257\nGraphing\t462258\n东海镇\t462259\n孙亚芳\t462260\n学转\t462261\n郑翔\t462262\n非货币性\t462263\n广州医科大学附属第三医院\t462264\n安国市\t462265\n狠辣\t462266\n南京新城科技园\t462267\nSkype\t462268\n产业处\t462269\n蒸笼\t462270\n恐怖殡仪馆\t462271\nMido\t462272\nkitchenaid\t462273\n透明人\t462274\n湖南省国税局\t462275\nwww.zhuna.cn/lable2205655/inn198/\t462276\ns18\t462277\n美拍\t462278\n轻闲\t462279\n第17周\t462280\n清散\t462281\n无城镇\t462282\njavaobject\t462283\n11颗\t462284\n除余\t462285\nblurred\t462286\n文林镇\t462287\n不合格品\t462288\n宋晓军\t462289\n泷悦\t462290\n绝不可\t462291\njetstar\t462292\n槟郎\t462293\nbeats\t462294\ngsonformat\t462295\nFires\t462296\n神经鞘瘤\t462297\n一万三\t462298\n义桥镇\t462299\n帝制\t462300\n慢走丝\t462301\n副店长\t462302\n巴伦西亚\t462303\n30磅\t462304\n7声\t462305\n灵眼\t462306\n监修\t462307\n随州市人民政府\t462308\nmingling\t462309\n元和街道\t462310\n阅读类\t462311\n91在线视频\t462312\n哨岗\t462313\n维格娜丝\t462314\n幕僚\t462315\nwwm\t462316\n王瑞\t462317\n下卷\t462318\n\u0007\t462319\n岢岚\t462320\nxla\t462321\nnetif\t462322\n淮南二中\t462323\n软件工程毕业设计\t462324\n馏\t462325\n遗落的南境\t462326\n4名\t462327\n离题\t462328\n中编\t462329\n游戏馆\t462330\nangelbeats\t462331\n探号\t462332\n逆止\t462333\n老表你好嘢\t462334\n热土\t462335\n上海市科委\t462336\n加点\t462337\n4月24日\t462338\n可得眼镜网\t462339\n驱体\t462340\nwedding\t462341\n婚神星\t462342\n克徕帝\t462343\n电子商务概论\t462344\npool\t462345\n游泳队\t462346\n博望区\t462347\n切入点\t462348\n布里斯托尔\t462349\n双门洞\t462350\n0.5平方\t462351\n934\t462352\n苍山负雪A4.
6\t462353\n石亭\t462354\n社会经济\t462355\n西溪湿地\t462356\n数据矩阵\t462357\n建筑工程技术与设计\t462358\n台港澳\t462359\n第8页\t462360\nPHONE\t462361\n校园逍遥小书生\t462362\nloops\t462363\n构建者\t462364\n对公账户\t462365\n千金片\t462366\n高兽\t462367\n胀痛\t462368\nxplore\t462369\n监制\t462370\n羞羞\t462371\n降噪器\t462372\nwritting\t462373\n孟慧圆\t462374\n哆来咪\t462375\nJordan\t462376\n共创\t462377\nkicad\t462378\n屌屌\t462379\ndeconvolution\t462380\n金地艺境\t462381\n萍钢\t462382\nworld2016\t462383\n始末\t462384\n_图拉\t462385\n美丽的传说\t462386\nnavigateTo\t462387\n高新二路\t462388\n母方\t462389\n4.12\t462390\n1538\t462391\nIPhone8\t462392\n2ax\t462393\n折达路\t462394\n399号\t462395\n晚市\t462396\n出租人\t462397\n广州农村商业银行\t462398\n丁苯橡胶\t462399\n口袋公园\t462400\n78名\t462401\n分离\t462402\n鸿坤理想湾\t462403\ntho\t462404\n20160204\t462405\n拔剑\t462406\n上环\t462407\nmailto\t462408\n墨泽\t462409\n化瘀\t462410\nTomTom\t462411\n萧煌奇\t462412\n敏骑\t462413\ninput\t462414\n乔森\t462415\nfozero\t462416\nGreat\t462417\n方正书\t462418\n康耐登\t462419\n友宝\t462420\nBanana\t462421\nfrankxx\t462422\n三际\t462423\ntestNG\t462424\n米东区\t462425\n河南一区\t462426\n源丰\t462427\naicc2018\t462428\n程先生\t462429\nxa2\t462430\n第二期\t462431\nEliteBook\t462432\n髓鞘\t462433\ncpdy\t462434\n红杉\t462435\n最终幻想10\t462436\n雕塑家\t462437\n雷尔\t462438\n中山市环境保护局\t462439\n超光\t462440\n亲子酒店\t462441\n娇兰娇兰\t462442\n粟米\t462443\nJameson\t462444\n菜园坝\t462445\n第19章\t462446\n格丽斯\t462447\n江中猴姑米\t462448\nwin10文件资源管理器\t462449\n中林\t462450\n钢结构\t462451\n圆形图\t462452\n魔彩\t462453\n铅笔稿\t462454\n家庭生活\t462455\n95566\t462456\n死难者\t462457\nwin10+ubuntu\t462458\n喀什市政府\t462459\n事业单位招聘考试网\t462460\n胃腺癌\t462461\n连云区\t462462\n2017年2月6日\t462463\n校旗\t462464\nessential\t462465\n_处\t462466\n联户\t462467\n书费\t462468\n一轩\t462469\n龙笔\t462470\n火钳\t462471\n周休\t462472\nipmitool\t462473\n启东房产网\t462474\n胃壁\t462475\n轴压比\t462476\n马边\t462477\n平衡轮\t462478\n卷土\t462479\n枢纽型\t462480\nShadowsocks\t462481\n新时代面对面\t462482\n点不亮\t462483\nKenshinCui\t462484\n滚动选择器\t462485\n成都中医药大学\t462486\n勤智数码科技股份有限公司\t462487\n阿奴\t462488\nsubstrate\t462489\n六年级下册英语\t462490\n半壶\t462491\n床铺\t462492\n识
花\t462493\nymlt\t462494\n一体裤\t462495\n伺服定位系统\t462496\n永夜我欲封天\t462497\n攀援\t462498\n中南大学》\t462499\n安捷伦科技\t462500\n沉浸式\t462501\n7天游\t462502\n重庆新房网\t462503\n大生\t462504\nGTR\t462505\neuphoria\t462506\n南宁市纪检监察局\t462507\nShadowrocket\t462508\n会议纪\t462509\nARC\t462510\n战犯\t462511\n怪物猎人世界双刀\t462512\n云南工商学院\t462513\nWhom\t462514\nhpux\t462515\n智能银行\t462516\n福建省审计厅\t462517\n西子绪\t462518\nCS5\t462519\n四处\t462520\n奇事\t462521\nJSch\t462522\n一抔\t462523\nrichardson\t462524\n大变革\t462525\n升级版\t462526\n小香\t462527\n别董\t462528\n青天降妖录\t462529\nchannel\t462530\nkanpiaoxue\t462531\n孤辰\t462532\n25章\t462533\n强占\t462534\n5.3亿\t462535\n装饰品\t462536\n建养\t462537\n中船\t462538\n夜视仪\t462539\n百田奥比岛圈\t462540\n司南\t462541\n口行\t462542\n蜗轮蜗杆减速器\t462543\n霍夫斯泰德\t462544\nlbp3000\t462545\n哗哩\t462546\n魏华\t462547\n和田螺\t462548\n红河日报数字报\t462549\n家用版\t462550\n斑布\t462551\n宝钞\t462552\n织里\t462553\n太岁\t462554\n釉料\t462555\n多音词\t462556\n韩雷\t462557\n高领\t462558\n手语\t462559\n天地玄门\t462560\nchart\t462561\ncurtis\t462562\ncad2017\t462563\n王炳忠\t462564\n新理念大学\t462565\ncifs\t462566\n凶真\t462567\n切角机\t462568\n邪影\t462569\n克林姆林宫\t462570\n秀才\t462571\n1177\t462572\nValidator\t462573\n王俊凯\t462574\n祁门路\t462575\n上学吧在线考试\t462576\nbingbing\t462577\n格林威治\t462578\n颜之推\t462579\ncsrc\t462580\n碗豆\t462581\n画符\t462582\n吉格斯\t462583\n陕煤化集团\t462584\n科幻世界\t462585\n芙娜\t462586\n长技\t462587\n天文系\t462588\n六岁\t462589\n惠来县\t462590\n同行们\t462591\n一男\t462592\n1毫克\t462593\n鑫方盛\t462594\n联想y410p\t462595\nemqtt\t462596\n眼肌\t462597\n陌生化\t462598\n圆融\t462599\n吸入器\t462600\n明霞\t462601\n郑州枫杨外国语学校\t462602\n配液\t462603\n画鸟\t462604\n关晓彤\t462605\n朦胧\t462606\n再者\t462607\n附释\t462608\n安徽省总工会\t462609\n上滤\t462610\nEdmunds\t462611\n发愁\t462612\ntiling\t462613\n成事在天\t462614\nallele\t462615\nBUT\t462616\n崎岖\t462617\n黄川\t462618\n人口密度\t462619\n有点怪\t462620\n浆泵\t462621\n希言\t462622\n1967年\t462623\n铜套\t462624\n鸦雀无声\t462625\nPADS9.5\t462626\n飘逸杯\t462627\nNGFF\t462628\n职代\t462629\n大部制\t462630\n露丝\t462631\n高新区新闻网\t462632\n桃仁\t462633\n碎米\t462634\nCy\t462635\ninverter\t462636\nSaving\t462637\n亲一口
\t462638\nAsyncSocket\t462639\n滴眼液\t462640\n团队精神\t462641\nev450\t462642\nKnock\t462643\n嘉兴人大\t462644\n客观性\t462645\n60秒内\t462646\n燃\t462647\n甘粕\t462648\n发射机\t462649\n上海格致中学\t462650\nFocusky\t462651\n健身_99健康网\t462652\n江西科技学院\t462653\n北仓\t462654\n家用电器\t462655\n神圣天使兽\t462656\n林奕华\t462657\n十里\t462658\nBID\t462659\n岂敢\t462660\na型血\t462661\nvivoxplay\t462662\ngca\t462663\n十余家\t462664\n负载电阻\t462665\n天一轩起名网\t462666\nteenth\t462667\n宋玺\t462668\nAvatarx\t462669\ndome\t462670\n流程银行\t462671\n郝峰波\t462672\n从快\t462673\n祝你生日快乐\t462674\n乒\t462675\n乔司镇\t462676\n战功\t462677\ntinyint\t462678\nw520\t462679\n网易财经\t462680\n感怀\t462681\nCCTV-4_央视网\t462682\n专杀\t462683\nBLX\t462684\n生产率\t462685\n9010\t462686\n投资部\t462687\nFirebug\t462688\n奔\t462689\n本子娜\t462690\n怀石\t462691\n时装片\t462692\n李嘉诚基金会\t462693\n十二铜表法\t462694\n安师大附中\t462695\n委托代理\t462696\n600362\t462697\n李公堤\t462698\n爬行垫\t462699\nUU898\t462700\n明哥\t462701\n慕岩\t462702\n异径\t462703\n陕西省公路局\t462704\n一次多\t462705\n弄\t462706\n容光\t462707\nListener\t462708\n上海市小学\t462709\ncorners\t462710\n蜗轮蜗杆减速机\t462711\nSurplus\t462712\n风怒\t462713\n天天农场\t462714\n开普敦\t462715\n北京交通大学》\t462716\n轻功\t462717\n国家自然科学基金委员会生命科学部\t462718\n名带\t462719\n青海移动\t462720\n1536\t462721\npb9\t462722\n姜赫\t462723\n崔琦\t462724\nCSmall\t462725\n桌面壁\t462726\nPixelmator\t462727\nFFA\t462728\n项管\t462729\n杭州市第一人民医院\t462730\n和胜\t462731\n吉祥邮\t462732\n轻叶小说网\t462733\n俏俏\t462734\n石围塘\t462735\n万年\t462736\nFEDEX\t462737\n20150331\t462738\n吉\t462739\nCOD12\t462740\n歪果仁\t462741\nο\t462742\n粉骨\t462743\n问自己\t462744\n黄石市规划局\t462745\nLaurel\t462746\n102岁\t462747\n北五环\t462748\n蕾丝衫\t462749\n體\t462750\naxeslide\t462751\n中山大学工学院\t462752\nfiter\t462753\n海洋石油工程股份有限公司\t462754\n影音人生\t462755\n2015、2016年\t462756\n模具\t462757\n涞水县\t462758\n版型\t462759\n惠子\t462760\n丁冠森\t462761\n蒙氏\t462762\n382号\t462763\n林敏聪\t462764\n北京交通大学电子信息工程学院\t462765\n马莱\t462766\n保哈网\t462767\n重庆地铁3号线\t462768\nmg2580\t462769\nENJOYZ\t462770\n窗体\t462771\n花港\t462772\n搬货\t462773\n稳压电源\t462774\n音乐鉴赏\t462775\ncryptography\t462776\nyyyy-mm\t462777\ncd
y\t462778\n定点医院\t462779\n营业单位\t462780\n潘启明\t462781\n气速\t462782\n25册\t462783\n培训员\t462784\n西顿照明\t462785\n欧鹏\t462786\n热泉\t462787\n广东农信\t462788\n苦荞\t462789\n汕头市区\t462790\n山东省工商局\t462791\n双洞\t462792\n62397998\t462793\n中化新网\t462794\ninitialized\t462795\n岳阳\t462796\nFast\t462797\n北京石景山区\t462798\nconv2d\t462799\n中央别墅区\t462800\n天津演艺网\t462801\n赵志勇\t462802\n马首\t462803\n体寒\t462804\n易友\t462805\n农化\t462806\n大胃\t462807\n6t\t462808\n7790\t462809\nDjVu\t462810\n统帅\t462811\nXBOX360\t462812\nCond\t462813\n暖妻\t462814\npostek\t462815\n合肥市房地产管理局\t462816\nholds\t462817\nz17\t462818\n再见吧喵小姐\t462819\n海北藏族自治州\t462820\n临床医学检验\t462821\n区人社局\t462822\n住行\t462823\n163.tv\t462824\n校企合作\t462825\n满堂架\t462826\n鲜牛奶\t462827\n车筐\t462828\npixxx\t462829\n9669\t462830\n2600年\t462831\n拨音\t462832\nchorm\t462833\n240Hz\t462834\n样品费\t462835\ncny\t462836\n30kb\t462837\nsilm\t462838\n成都铁路局\t462839\n伟达\t462840\n家庭电路\t462841\n乳装\t462842\n精灵宝可梦xy\t462843\n念慈庵川贝枇杷膏\t462844\n切尔诺贝利\t462845\n江苏省苏州中学\t462846\n覆写\t462847\n百度世界大会\t462848\n李小敏\t462849\n_电工学网\t462850\n逐差法\t462851\nZIP\t462852\nVRM\t462853\n寻常型\t462854\nZhongshan\t462855\n北京科技有限公司\t462856\n朱雀山\t462857\ngson\t462858\n三重县\t462859\nmdf\t462860\n无锡宏锐电气有限公司\t462861\n青草药\t462862\nb199\t462863\n咸\t462864\n野鹤\t462865\n豹猫\t462866\n音高\t462867\n八方城\t462868\n搞笑文\t462869\n天底\t462870\n重返\t462871\nfm2\t462872\n2345安卓网\t462873\nFESTO\t462874\n16座\t462875\n希娜\t462876\n伪影\t462877\nKam\t462878\n中国环保网\t462879\nセレクト\t462880\n神捕\t462881\n爱拼才会赢\t462882\n鞋套\t462883\n正太控\t462884\n家民\t462885\n生发\t462886\n华为荣耀6x\t462887\n主干路\t462888\nパンティ\t462889\n辅以\t462890\nBLACKED\t462891\nNullable\t462892\n美容美发学校\t462893\n农科院附小\t462894\n梁刚\t462895\n冷镦钢\t462896\n玄鸟\t462897\n甲基托布津\t462898\n拱北口岸\t462899\n网志\t462900\n沈东\t462901\n结缘\t462902\n雷蒙德\t462903\nsly\t462904\n筋骨\t462905\nmgmt\t462906\n别点\t462907\n旱厕\t462908\n海疆在线\t462909\n肖力\t462910\n重元寺\t462911\n艳情\t462912\n24支\t462913\n江铃汽车集团公司\t462914\n乳头状\t462915\n34号\t462916\n超大\t462917\n尔茄\t462918\n百草谷\t462919\n楚汉之争\t462920\n大小鼠\t462921\n10020\t462922\n常
胜将军\t462923\n画壁\t462924\n3块\t462925\n索尼电视吧\t462926\n23.4\t462927\n扭振\t462928\n硅酸盐板\t462929\n罗天大醮\t462930\n警匪片\t462931\n足三里\t462932\nyangyang\t462933\nXCTF\t462934\njiankang\t462935\n充数\t462936\n左卡尼汀\t462937\n水稳层\t462938\n9500gt\t462939\n夏勇\t462940\nult\t462941\n巴东\t462942\n刺身\t462943\n休妃\t462944\n塞巴斯塔\t462945\nBIDU\t462946\n利欧\t462947\n北京京东方\t462948\n5000倍\t462949\ndqmj\t462950\n征纳\t462951\n红土航空\t462952\n酒网\t462953\n大家的日语\t462954\n皋陶\t462955\n特定\t462956\npmu\t462957\n5000平方米\t462958\n托福模\t462959\n磁流变液\t462960\n自由的人\t462961\n德勒兹\t462962\n异喹啉\t462963\n快克\t462964\n10KV\t462965\n智取威虎山\t462966\n多层\t462967\n李津\t462968\n主歌\t462969\n封边机\t462970\n单工\t462971\n滤器\t462972\n待续\t462973\n瑞宝\t462974\nStrand\t462975\nZHANG\t462976\n牧野之战\t462977\n得宠\t462978\n董雪\t462979\n绥棱\t462980\nprotest\t462981\n60s\t462982\nslack\t462983\n森海\t462984\nPEN\t462985\n走\t462986\n缘\t462987\n冶勒湖\t462988\n临界胶束\t462989\n海口市教育局\t462990\n汤姆·哈迪\t462991\n0398\t462992\n递延所得税资产\t462993\nLFW\t462994\n想要\t462995\n深圳市罗湖区人民法院\t462996\npkcs8\t462997\n武器箱\t462998\n阿俊\t462999\nRPG类\t463000\n贵干\t463001\n液压打包机\t463002\n售后退款\t463003\n妙音符咒师\t463004\n杏仁豆腐\t463005\n8针\t463006\n布兰傅盛\t463007\n互金公司\t463008\n第比利斯\t463009\n速览\t463010\n苏E\t463011\n有机磷农药中毒\t463012\nslideDown\t463013\n勐\t463014\n全球金属网OMETAL.COM\t463015\n抓不住\t463016\nadoble\t463017\n哆哆\t463018\n臵\t463019\n聚龙小镇\t463020\n计算机软件著作权登记\t463021\nTandem\t463022\n六万块\t463023\n土地资源网\t463024\nStocking\t463025\n判定\t463026\n5630\t463027\n马瑞\t463028\n谢楠\t463029\n15558781221\t463030\n黄进\t463031\n太仓市人民政府\t463032\n校委会\t463033\n画眉\t463034\nSCR\t463035\n玄龙\t463036\n代用\t463037\n万水千山总是情\t463038\n舜帝\t463039\n昭旻\t463040\n孟良崮\t463041\n五六个小时\t463042\n造梦西游4_一游网\t463043\n李欣频\t463044\nAyla\t463045\n18g\t463046\n非贸\t463047\n亲亲漫画网\t463048\n卡扣\t463049\nwaving\t463050\nlabview2017\t463051\n北京8号线\t463052\n布奥\t463053\n蒂森克虏伯电梯\t463054\n头疼\t463055\n0.70\t463056\n联排\t463057\n石澳\t463058\n交口\t463059\nDiscipline\t463060\n1双\t463061\n硫酸铅\t463062\n高节\t463063\n想不起来\t463064\n气体检测仪\t463065\n降高\t463066\n起司\t463
067\nv2.2.1\t463068\n常务委员\t463069\n林依依\t463070\n帐本\t463071\n成都道\t463072\nwithdrawal\t463073\n负电压\t463074\n保险柜\t463075\nFeignClient\t463076\n中国气象局\t463077\n表演唱\t463078\n北京王府井\t463079\nupod\t463080\n胃镜检查\t463081\n克莱汤普森\t463082\n百里守约\t463083\n私校\t463084\n飘流幻境|巴士岛救生站\t463085\n娇贵\t463086\nf459\t463087\n五方\t463088\nrip\t463089\n第四根\t463090\n偏爱\t463091\n憧憬\t463092\n黑木明纱\t463093\n模拟经营\t463094\nar9331\t463095\n包河区人民政府\t463096\n茎秆\t463097\n上汽通用别克\t463098\nffp\t463099\n神经病之歌\t463100\n中信银行手机银行\t463101\n日企招聘网\t463102\n亲代\t463103\nbeta系数\t463104\n卓朗科技\t463105\n量纲\t463106\ns2011\t463107\n肝硬化失代偿期\t463108\n昆山中学\t463109\n5.95\t463110\nnimbus\t463111\n秦公\t463112\n希望\t463113\n长安金欧诺\t463114\n策藏\t463115\n杨澜\t463116\n咋\t463117\n366\t463118\n张灿\t463119\n第一桶\t463120\n橙红色\t463121\n浪货\t463122\n皖北协作区\t463123\n李宗盛\t463124\nGFriend\t463125\n3到5年\t463126\n栞\t463127\n花呗\t463128\n06版\t463129\n民窑\t463130\nRVM\t463131\n超过两分钟\t463132\nk701\t463133\n新东方烹饪学校\t463134\n寻声\t463135\n无车承运人\t463136\n17首\t463137\nVoda\t463138\n超性\t463139\n四公里\t463140\nworst\t463141\n第A3\t463142\n京密路\t463143\n摘选\t463144\n福州市区\t463145\nMACC\t463146\n李丽君\t463147\n变声软件\t463148\n90年代初期\t463149\n行馆\t463150\n头发\t463151\n抗弯\t463152\n疏楼龙宿\t463153\n浙东\t463154\n黄宇航\t463155\n261个\t463156\n套合\t463157\n我的世界手机版\t463158\n褶\t463159\n4【\t463160\n_狗民网\t463161\nmfrbuaa\t463162\n45w\t463163\n一吻定情2\t463164\n申辩书\t463165\nindividuals\t463166\n乌梅\t463167\n八千年\t463168\n技術\t463169\n耳顺\t463170\n一千多元\t463171\n§\t463172\n460马力\t463173\n原来\t463174\nwhy123.org\t463175\n徐圣恩\t463176\n中国好声音\t463177\n中国电子科技集团公司第十三研究所\t463178\n无公害食品\t463179\n上12小时\t463180\n科尔斯\t463181\n上坊\t463182\n解剖\t463183\n伤筋动骨\t463184\n75元\t463185\nbuckets\t463186\n二三层\t463187\n沅江市\t463188\n噻托溴铵\t463189\n爱图网\t463190\n交管所\t463191\n人人贷\t463192\nHALL\t463193\n牛樟芝\t463194\n小翅膀\t463195\n辛辰\t463196\n实木颗粒\t463197\nChuaN\t463198\n长沙市地方税务局\t463199\n股骨长\t463200\n中山大学研究生院\t463201\n忘恩\t463202\n五朵山\t463203\n动点\t463204\n香飘\t463205\nshj\t463206\n国家能源集团\t463207\nheike\t463208\n2014.9\t463209\nPAPER\t463210\n
黎姿\t463211\n中药学\t463212\n宝露露\t463213\n品悦\t463214\n疃\t463215\n二元制\t463216\nios11.3卡\t463217\n孤胆车神5\t463218\n相爱\t463219\n办法学\t463220\n摩根士丹利\t463221\nccleaner\t463222\n献妻\t463223\n马医\t463224\n男模\t463225\n吉林大学管理学院\t463226\n观礼\t463227\n墨北\t463228\n认清\t463229\n倔强\t463230\n黄暴\t463231\nX-Plane-模拟飞行论坛\t463232\n所有者\t463233\n张杨路\t463234\nnlog\t463235\n闷音\t463236\n天界\t463237\n珮姐\t463238\n黑米粥\t463239\n陈版\t463240\n火米\t463241\nKeyDown\t463242\nf549\t463243\n隼人\t463244\n仰仗\t463245\nUK\t463246\n医学部\t463247\nV4.4\t463248\n九寨沟风景区\t463249\n视达网\t463250\n586号\t463251\n过滤嘴\t463252\n普利兹克\t463253\n念佛堂\t463254\n陈启宇\t463255\n实部\t463256\nvcenter\t463257\n恶灵附身2\t463258\n艾瑞泽3\t463259\n裘球\t463260\n天津阿波罗医院\t463261\n几丝\t463262\n迈腾380\t463263\n武魂殿\t463264\n吴佳瑛\t463265\n西贝莜面村\t463266\ndam\t463267\n大明红楼梦\t463268\n请命\t463269\n中信期货有限公司\t463270\n2日\t463271\n正身\t463272\n公益中学\t463273\n牙医\t463274\n锦绣重生之大涅磐\t463275\n电视人\t463276\n鸣枪\t463277\n瞥\t463278\n宠邮\t463279\n华新开发区\t463280\nfirst-child\t463281\nmutable\t463282\n花少3\t463283\n水寨\t463284\n相对父\t463285\nuninstaller\t463286\n蛋卷机\t463287\n10余万\t463288\n淤泥质\t463289\n吴小萱\t463290\nmpg\t463291\ncreo吧\t463292\n布匿战争\t463293\n红颜祸水\t463294\n波波妹\t463295\n苏州工业园区管理委员会\t463296\n徐方\t463297\nhawa\t463298\n夏普\t463299\n苏联肃\t463300\n一时半\t463301\nSTC89C52RC\t463302\nabsolutely\t463303\n顾永才\t463304\n策动\t463305\nCOPY\t463306\n郑吒\t463307\n双耳\t463308\n净距\t463309\n记起\t463310\n势在必行\t463311\n宝安北路\t463312\nIE8_\t463313\n头孢克洛\t463314\n北京外国语大学网络教育学院\t463315\n医学继续教育\t463316\n我是特种兵之利刃出鞘\t463317\n深大\t463318\n迈瑞医疗\t463319\n人教版课标本\t463320\n漏子\t463321\n张满胜\t463322\nilly\t463323\n从化区\t463324\nmint-ui\t463325\nDj\t463326\n齐伟\t463327\n莫里亚蒂\t463328\n致亲爱的你\t463329\n百度云/The\t463330\nyingyuan\t463331\n妇科_99健康网\t463332\n尿道炎\t463333\nJTS\t463334\n_亲亲宝贝网\t463335\n北航附中\t463336\n浩轩\t463337\n大岭山镇\t463338\n西征\t463339\n乙种\t463340\n数据类\t463341\n电磁加热器\t463342\n孕中期\t463343\n酸麻\t463344\n可待\t463345\n杨晓林\t463346\n一特\t463347\n半透\t463348\nWin7共享文件夹\t463349\n野蛮女友\t463350\nHAZOP\t463351\n婚鞋\t463352\n免疫组化\t463353\n凤凰六哥\
t463354\n世初\t463355\n过敏原测试\t463356\n67mm\t463357\n十分之三\t463358\n科普片\t463359\n骗局\t463360\n烤肠\t463361\n司马迁\t463362\nROSA\t463363\nfrida\t463364\n皮机\t463365\n三氧化铬\t463366\nkongjian\t463367\n容易发生\t463368\n热血海贼王\t463369\n苏玉红\t463370\n干痒\t463371\n纺织城\t463372\n正和岛\t463373\nAbleton\t463374\n学弈\t463375\n凑合\t463376\n滤色\t463377\nCCTV-12_央视网\t463378\n香水瓶\t463379\n威露士消毒液\t463380\n不动产统一登记\t463381\nlol鸡里奥宝典\t463382\nQSql\t463383\n玛格罗兰\t463384\nLakka\t463385\n中国核工业集团有限公司\t463386\n度化\t463387\n辽史\t463388\n爽口\t463389\n航星\t463390\n朱家林\t463391\n奥迪4S店\t463392\n主控芯片\t463393\n899元\t463394\n惩罚性赔偿\t463395\n箭镞\t463396\n凤岭儿童公园\t463397\n食种\t463398\n牛牛通宝\t463399\n车机版\t463400\n张立\t463401\n阿普丽佳\t463402\n掰扯\t463403\n二十分钟\t463404\n林晓峰\t463405\n17年10月\t463406\nGreen\t463407\n王玮\t463408\n李施德林\t463409\n西西东东\t463410\n肺心病\t463411\n国际安徒生奖\t463412\n体繁\t463413\ncardiff\t463414\n岚岛\t463415\n60亿\t463416\n拆下\t463417\n张俭\t463418\n150多个\t463419\n预激综合征\t463420\n公路段\t463421\nSocialBeta\t463422\n正谱\t463423\n贪心算法\t463424\n南昌理论网\t463425\n军考\t463426\n换挡器\t463427\nisdir\t463428\n银海\t463429\nbelow\t463430\n爆发力\t463431\n华数TV\t463432\n血誓\t463433\n附体\t463434\n电子元\t463435\nepd\t463436\n釉\t463437\n三星a7000\t463438\n大美人\t463439\n三通一达\t463440\n金钢\t463441\nM4800\t463442\n盖碗\t463443\n第二十七章\t463444\nStarve\t463445\n号机\t463446\n软弹\t463447\n怀化市第一人民医院\t463448\n赔偿金\t463449\nthese\t463450\n坚冰\t463451\n脚踏阀\t463452\n四国岛\t463453\nsurvey\t463454\n马种\t463455\n鼎实\t463456\n0609\t463457\n老张\t463458\nYUNSPACE\t463459\n雪刃\t463460\n炽热\t463461\n中华财税网\t463462\nRocketMQ\t463463\n爱问分类\t463464\n光镜\t463465\n思虑\t463466\n四重\t463467\ntesco\t463468\n建房\t463469\n午火\t463470\n粤运\t463471\n9999999\t463472\n王景\t463473\n合和新城\t463474\n整整齐齐\t463475\n女众\t463476\nlelo\t463477\n胺基\t463478\n新感觉\t463479\n兴隆文化园\t463480\n资金面\t463481\n金闪闪\t463482\n炭包\t463483\n百年后\t463484\n谢谢侬\t463485\nwinserver2012\t463486\n检测器\t463487\n工艺美术史\t463488\nbuildpath\t463489\n广西大学研究生院\t463490\n野人谷\t463491\nrotational\t463492\nwifi版\t463493\n名门隐婚:枭爷娇宠妻\t463494\n迅雷下载全集高清\t463495\n盖图\t463496\n乙二胺四乙
酸二钠\t463497\n党中央治国理政\t463498\n38.2度\t463499\n经籍\t463500\n林聪\t463501\n遂宁市政府\t463502\n大展\t463503\nOrac\t463504\n混响器\t463505\n中华人民共和国劳动法\t463506\n宏编程\t463507\n极放大电路\t463508\nGm005\t463509\nchsh\t463510\n安德鲁\t463511\n江苏省人社厅\t463512\n麻省总医院\t463513\nTsai\t463514\n可变资本\t463515\n桑拿浴\t463516\n莴笋叶\t463517\nmvr\t463518\n魅心\t463519\n助廉\t463520\n大王叫我来巡山\t463521\n字符段\t463522\n分散片\t463523\n弹层\t463524\n叶檀\t463525\n泽雅\t463526\n看多远\t463527\nSuperb\t463528\n小曦\t463529\n顺丰机场\t463530\n46年\t463531\ninto\t463532\n史矛革\t463533\nDAV01.COM\t463534\nyoubian\t463535\n连科\t463536\n晚婚晚\t463537\n卫戍\t463538\n现实生活\t463539\n图像帧\t463540\nreal\t463541\nFISHING\t463542\n成交价\t463543\n小知\t463544\n中航三鑫\t463545\n切莫\t463546\n我明白\t463547\nX3.3\t463548\n孙铭徽\t463549\n包虫病\t463550\n红米5plus\t463551\n卡里路\t463552\n刘启明\t463553\n17R4\t463554\n乌拉特\t463555\n地址码\t463556\n光荣证\t463557\n上海大学附属中学\t463558\n化妆刷\t463559\nmodify\t463560\n八月十五日\t463561\n95厘米\t463562\npatents\t463563\n梨园\t463564\n合子\t463565\n铣刨机\t463566\n红膏\t463567\n金属块\t463568\n武汉市交管局\t463569\nhilo\t463570\n苏州大学研究生院\t463571\n爱医\t463572\n开食\t463573\n俞永福\t463574\n女衣\t463575\n云渲染\t463576\n中国田协\t463577\n护航者\t463578\n/tr\t463579\n连庄\t463580\n弗洛伦萨\t463581\n天河客运站\t463582\n陈绍聪\t463583\n打投\t463584\n休夫\t463585\n宽距\t463586\n廊坊市区\t463587\nTransformations\t463588\n美贵\t463589\n龙英\t463590\npe盘\t463591\ngoshtube\t463592\n张释文\t463593\nWow\t463594\nPx\t463595\n00q\t463596\nuserdel\t463597\nFrog\t463598\n孝心\t463599\n比索洛尔\t463600\n卡通片\t463601\n加考\t463602\nthug\t463603\n扶余吧\t463604\n幸福小区\t463605\n板娘\t463606\n淘里乐\t463607\n217路\t463608\n104\t463609\ntmw\t463610\ny500\t463611\ntpo\t463612\n单只\t463613\n坦克\t463614\n流水灯\t463615\n中广核集团\t463616\n柯马\t463617\n再加上\t463618\n长春\t463619\nrival300\t463620\nmeto\t463621\n幂指数\t463622\n程氏\t463623\n黄慧音\t463624\nDOS/\t463625\n速冲\t463626\n哆咪\t463627\n仓廪\t463628\nThis\t463629\nFlex4\t463630\n公共关系专业\t463631\nJesse\t463632\n哭笑\t463633\n古姓\t463634\n0006\t463635\n平均市盈率\t463636\n迈迈\t463637\n旺庄街道\t463638\nAssassin\t463639\n雙\t463640\n9秒\t463641\n小河淌水\t463642\n贺岁
档\t463643\n完胜\t463644\ndstwo\t463645\n不等人\t463646\n瘦素\t463647\n中科院生物物理所\t463648\n沙岗\t463649\n敷料\t463650\nimagination\t463651\n料峭\t463652\n周一围\t463653\n嘴遁\t463654\n凌云县\t463655\nIRS\t463656\n天鹅绒之吻\t463657\n侯老师\t463658\n配筋助手\t463659\n复化\t463660\nsleeps\t463661\nButterKnife\t463662\n聚合度\t463663\n纤芯\t463664\n童婴\t463665\n可耻\t463666\nocto\t463667\n妖精的尾巴吧_\t463668\nErik\t463669\n婆物联通\t463670\n英飞\t463671\n限滑差速器\t463672\n竞技类\t463673\n辐照度\t463674\nYuan\t463675\n索尼子\t463676\n网上兼职\t463677\n土增\t463678\n跨境电子商务综合试验区\t463679\n到此\t463680\n抽阴\t463681\n8017xxxx\t463682\n传递\t463683\n褪黑激素\t463684\n严虎\t463685\n现代战争4\t463686\n葛温\t463687\nRAPID\t463688\n对板\t463689\n蓝凯\t463690\n奥丁之渊\t463691\n声法\t463692\n加试\t463693\n750\t463694\nax2+bx+c=0\t463695\n诡术妖姬\t463696\n77.com\t463697\n洪天明\t463698\n可爱\t463699\nK1\t463700\n坦途教育网\t463701\n降膜\t463702\nkinetis\t463703\nMok\t463704\n西泽\t463705\n殷人昆\t463706\n张君\t463707\ninfamy\t463708\n首年\t463709\n中企动力科技股份有限公司\t463710\n彭娟\t463711\n非淋\t463712\n耒阳新闻网\t463713\nskv\t463714\n协和医科大学\t463715\n基础医学院\t463716\n会泽县\t463717\n业原火\t463718\n尉犁县\t463719\n方尺\t463720\n摄影者\t463721\n捷荣\t463722\n所罗门之晓\t463723\n欧建新\t463724\n郑州方特\t463725\n孙凰\t463726\nnewcaoguo\t463727\n王院长\t463728\n考籍\t463729\n中国峄城政府\t463730\n碳刷电机\t463731\nDPP\t463732\nfuture\t463733\n古川慎\t463734\n补白\t463735\n外挂\t463736\n灵州\t463737\n索尼ILCE-7\t463738\n双龙洞\t463739\nVila\t463740\n处女作\t463741\nSSH\t463742\n生殖器疱疹\t463743\n口香糖\t463744\nVisit\t463745\nMLDN\t463746\n林肯MKC\t463747\n刨槽机\t463748\nhs8145c\t463749\n山城区\t463750\n成都地铁1号线\t463751\n芜湖职业技术学院\t463752\n森迷\t463753\n10笔\t463754\n风级\t463755\n77路\t463756\n血色\t463757\n筋梁\t463758\n开不开机\t463759\n聚划算\t463760\n唐悠悠季枭寒小说\t463761\n华交会\t463762\n086\t463763\n2917\t463764\n保时捷Boxster\t463765\n语音变声器\t463766\n絕\t463767\n黑社会的超能力女儿\t463768\n富裕县\t463769\n岫岩县\t463770\n真力时\t463771\n少龙\t463772\n陈坚\t463773\n仙侠奇幻小说_火星小说网\t463774\n两败俱伤\t463775\n刘易阳\t463776\n块状元素\t463777\n常德政府\t463778\n四吨\t463779\n三人版\t463780\n365日\t463781\n滨江一号\t463782\n选配\t463783\nK线组合\t463784\n武汉市第六中学\t463785\n超集\t463786\
n西非法郎\t463787\n溧城镇\t463788\n纯商贷\t463789\n签单\t463790\n布署\t463791\n2018年03月05日\t463792\n薛瑞萍\t463793\n工艺篇\t463794\n皮尔萨\t463795\n张大夫\t463796\n站口\t463797\n彩虹合唱团\t463798\n第4集\t463799\n第21期\t463800\nsireni\t463801\npk3\t463802\n龚琳娜\t463803\n巴甫洛夫\t463804\n铁面人\t463805\nSA8000\t463806\n三少\t463807\n碱片\t463808\n张少华\t463809\n败露\t463810\n劲胜智能\t463811\n张春明\t463812\n火卫\t463813\n冷热冲击试验箱\t463814\nunity2018\t463815\nGNS\t463816\n巴塘\t463817\n小朋友们\t463818\n通化市政府\t463819\n结婚纪念日\t463820\nU2414H\t463821\n文林\t463822\n四川警察学院\t463823\n稻香园\t463824\n龙飞虎\t463825\nHkGov\t463826\n程又青\t463827\nbiomarker\t463828\nHighlight\t463829\nCIC\t463830\n700亿\t463831\nEntityFramework\t463832\n8.6米\t463833\nv3.4.1\t463834\n前视\t463835\n非专业\t463836\n聚氨酯直埋保温管\t463837\n30CM\t463838\n24节\t463839\n5.0L\t463840\n金弓\t463841\n三流之路\t463842\n拼音字母\t463843\n32亿元\t463844\njida\t463845\n还本\t463846\n中国医师节\t463847\n地质公园\t463848\n过劳死\t463849\n花滑世锦赛\t463850\n泼水节\t463851\nPROFESSIONAL\t463852\nfontcreator\t463853\nMacs\t463854\narbor\t463855\n仙女\t463856\n炕头\t463857\n武大口腔医院\t463858\nSTORAGE\t463859\n10000块\t463860\n升档\t463861\n陈腐\t463862\n山摇\t463863\n7660\t463864\npari\t463865\n江直树\t463866\n莫里哀\t463867\n壹号皇庭\t463868\n甘薯\t463869\n逆鳞\t463870\n1.2.0.0\t463871\n乌莲娜\t463872\n奥米加兽\t463873\n姜老太\t463874\ntrendy\t463875\nHDPE双壁波纹管\t463876\n数字型\t463877\n崇川\t463878\n36色\t463879\n21ic\t463880\n多动障碍\t463881\n左右边\t463882\n金娃娃\t463883\n梅杰\t463884\n欧式箱变\t463885\n十三陵\t463886\n两袖清风\t463887\nfindAll\t463888\n炫舞大掌门\t463889\n一个数\t463890\n对于\t463891\n睡奸\t463892\n水牛城\t463893\n早梅\t463894\nduokan\t463895\nЕ\t463896\n文苑路\t463897\n400v\t463898\n眼器\t463899\n禅诗\t463900\n八角形\t463901\nTime\t463902\n隋棠\t463903\n金刚烷胺\t463904\n熊猫人\t463905\n蹭碟\t463906\nh11\t463907\n贵州茅台酒厂\t463908\n庐剧\t463909\n蜻蜓FM\t463910\ne-works\t463911\n听说过\t463912\n肥料\t463913\ncmis\t463914\n乐泰\t463915\n蔡国权\t463916\n1ppt\t463917\n劳务派遣暂行规定\t463918\n宫外孕\t463919\n专色\t463920\n男名\t463921\n答疑\t463922\n1276\t463923\n自画像\t463924\n火棘\t463925\n想像力\t463926\n评书机\t463927\n道经\t463928\n第8周\t463929\n咖啡具\t463930
\nther\t463931\n2016年11月16日\t463932\njeeplus\t463933\n民间舞\t463934\n作梦\t463935\n昆虫记\t463936\n亿童\t463937\n仙游\t463938\n2006-2020年\t463939\n沈阳市\t463940\nmatlab函数\t463941\nplotter\t463942\n南京市国土资源局\t463943\n习近平系列重要讲话读本\t463944\n黑恶犯罪\t463945\n文学史\t463946\n保肝药\t463947\n南昌市统计局\t463948\n蒲黄榆\t463949\n雅文\t463950\n单元格数\t463951\n阴女\t463952\nODF\t463953\nbelami\t463954\n慢性阻塞性肺疾病\t463955\n净回笼\t463956\n九创装饰\t463957\n15X\t463958\nwm\t463959\n逞强\t463960\n蛋烘糕\t463961\n视频片\t463962\n重庆小学\t463963\n远大控股\t463964\nLNG加气站\t463965\n太污\t463966\n身陷\t463967\n朝三暮四\t463968\n云柜\t463969\n西安东大街\t463970\n大喊大叫\t463971\n黑莓9000\t463972\n3258\t463973\n惠氏\t463974\n飞字\t463975\n平常\t463976\n熊童子\t463977\n5257\t463978\n脂联素\t463979\n副省\t463980\nmanag\t463981\n幸福归来\t463982\nOR\t463983\n安环\t463984\n马来半岛\t463985\n肤康\t463986\n中国家谱网\t463987\n上海东海职业技术学院\t463988\n奉贤区\t463989\n飞梭\t463990\n重庆路\t463991\n表外融资\t463992\n旋梯\t463993\n官林镇\t463994\n狩\t463995\n5年前\t463996\n申遗\t463997\n140万\t463998\n弹簧板\t463999\n巴罗克\t464000\n2枚\t464001\n山田孝之\t464002\n黄铜\t464003\n一码\t464004\n新教伦理与资本主义精神\t464005\nSorted\t464006\n微科\t464007\n凯迪社区\t464008\n近亲繁殖\t464009\n1万\t464010\n土豆网\t464011\n学生观\t464012\ntexlive2017\t464013\n10L\t464014\npushState\t464015\n踏\t464016\n相平\t464017\ndecisions\t464018\n访问员\t464019\n违禁词\t464020\n宅门\t464021\n兰德医武兵王\t464022\n箭扣\t464023\n加尔\t464024\n171cm\t464025\n中国如东_如东县政府\t464026\nmaphack\t464027\n中央\t464028\n东华软件\t464029\nboomboom\t464030\n手机杀毒_\t464031\n8021\t464032\n20151228\t464033\n端承桩\t464034\n尹正杰\t464035\n轮扣\t464036\n发展商\t464037\n永吉\t464038\n2346\t464039\n酿酒酵母\t464040\n苏然\t464041\n瞳\t464042\n2570\t464043\n诸君\t464044\n周德睿\t464045\n第5辑\t464046\n328p\t464047\nlixi\t464048\n学字\t464049\n红卡\t464050\n宋神宗\t464051\n忽尔\t464052\n亚美娱乐\t464053\nvmnet0\t464054\n报报\t464055\n一统天下\t464056\n铜厂\t464057\n维他美仕\t464058\n颐景\t464059\n铜矿\t464060\npt100热电阻\t464061\n幂次\t464062\nProgrammable\t464063\n增强液\t464064\nphotozoom\t464065\n健康街\t464066\n周六夜\t464067\n9.0c\t464068\n知识产权网\t464069\n68万元\t464070\n妙警贼探\t464071\n巴楚\t464072\n浓缩汁\t464073\n上讯\
t464074\n李培林\t464075\n陕西传媒网\t464076\n推式\t464077\nWheeler\t464078\n行业性\t464079\n章立凡\t464080\n赵婷\t464081\n10W+\t464082\n英博尔\t464083\n反驳\t464084\n亚克\t464085\n自导自演\t464086\n7031\t464087\n伊肤泉\t464088\n偶联\t464089\n湖头镇\t464090\nanimation\t464091\n放血\t464092\n布鲁内尔大学\t464093\n谋嫁\t464094\n博文_\t464095\n3D游戏\t464096\n20151022\t464097\n带薪\t464098\n塔板\t464099\n0663\t464100\n山塘水库\t464101\n电子技术\t464102\nGrid\t464103\neel\t464104\n菏泽职业学院\t464105\n香港中联办\t464106\nbreathe\t464107\n甲板\t464108\n田家英\t464109\n一条狗的使命\t464110\n応子\t464111\n维生素片\t464112\n新日本语\t464113\n金声玉振\t464114\n丰禾路\t464115\nAddon\t464116\n绚丽多彩\t464117\n智勇大冲关\t464118\noffice2013吧_\t464119\n淘宝旗舰店\t464120\n10088\t464121\n股份\t464122\n安飞士租车网\t464123\nminister\t464124\n梅溪镇\t464125\n诡谈\t464126\n普路通\t464127\n几十家\t464128\n斜阳\t464129\nKohler\t464130\n花科\t464131\n廊架\t464132\n博斯特\t464133\n评断\t464134\n平衡率\t464135\n姓名章\t464136\nweekday\t464137\n定向耦合器\t464138\n歌伴舞\t464139\n袁勇\t464140\n第4届\t464141\n喻氏\t464142\nBIRTHDAY\t464143\n萘醌\t464144\n铜陵路\t464145\n浩扬\t464146\n卢凯彤\t464147\n流量计\t464148\n货币市场基金\t464149\nfjs\t464150\n传令\t464151\nSCI影响因子\t464152\n豉油\t464153\n借壳上市\t464154\n白带\t464155\n恶作剧之吻\t464156\n人力公司\t464157\nMapreduce\t464158\n魔法禁书目录\t464159\n2018年8月\t464160\n学模\t464161\n猎梗\t464162\n明熹宗\t464163\nConstance\t464164\n黔驴技穷\t464165\niso9000\t464166\n曹晖\t464167\nETM\t464168\nSunscreen\t464169\n木纹铝方通\t464170\n半死\t464171\n未生\t464172\nnib\t464173\n泰天王陵\t464174\n郑州市城乡规划局\t464175\n钢喉\t464176\nLinux内存管理\t464177\nxilinx\t464178\n森特\t464179\n幻世\t464180\n防爆柜\t464181\n钢筋笼滚焊机\t464182\n大红灯笼高高挂\t464183\n中电软件园\t464184\n莱万\t464185\n团旗\t464186\nsupermemo\t464187\n张芝山镇\t464188\n尋\t464189\nDoctors\t464190\n进帐\t464191\n竭尽全力\t464192\n添柏岚\t464193\n哈子\t464194\n甜品师\t464195\n爱乃娜美\t464196\n改为\t464197\n天津滨海职业学院\t464198\n疑窦\t464199\n2014上半年\t464200\n嘉年乐\t464201\n参数\t464202\nsends\t464203\n钻桩\t464204\n风季\t464205\n图表\t464206\n艮卦\t464207\n顶峰网\t464208\n宏程序\t464209\n集子\t464210\n交通运输业\t464211\n柠檬蜂蜜\t464212\n虎粮\t464213\nchenfeng\t464214\n青岛网络广播电视台\t464215\n苏建科\t464216\n艾薇儿\
t464217\nname\t464218\n名子\t464219\n广西壮族自治区监狱管理局\t464220\nApproximate\t464221\n子宫囊肿\t464222\n一个64位\t464223\nmapGetters\t464224\ncheoyx\t464225\n十二年前\t464226\n大屌哥\t464227\ntektronix\t464228\n我的一天\t464229\n苍之纪元_\t464230\n生死之交\t464231\n结果树\t464232\n财长\t464233\n自足\t464234\n被害人\t464235\n数字单元格\t464236\n20150601\t464237\n滴瓶\t464238\n灭火器\t464239\nF55\t464240\n北京市政协\t464241\n野猫\t464242\n短袖衫\t464243\n新通教育\t464244\n破船\t464245\n韩艺瑟\t464246\n铁道飞虎\t464247\n53期\t464248\n童旭东\t464249\n镇巴\t464250\n凯拉奈特莉\t464251\n谜踪\t464252\n整形机\t464253\n武行\t464254\n租赁业\t464255\nmoov\t464256\n断浪\t464257\n一呼百\t464258\n小米帮助中心\t464259\nply\t464260\n高光\t464261\n消字\t464262\n哈西\t464263\n肉丸子\t464264\n壮行\t464265\n修配\t464266\n喵喵\t464267\n23例\t464268\n其实很简单\t464269\n笨狼的故事\t464270\n金融界股票论坛\t464271\n3年前\t464272\n大渣\t464273\nligerui\t464274\n油箱盖\t464275\n蹦蹦车\t464276\n获准\t464277\n配卡\t464278\n二头肌\t464279\nonstop\t464280\n新档\t464281\n搞非\t464282\n仙骨\t464283\n银点\t464284\n小高教学网\t464285\n林品如\t464286\n渔樵\t464287\n50磅\t464288\nedge浏览器\t464289\n西桥\t464290\n天台县人民政府\t464291\nZ370\t464292\nAcrylic\t464293\n千焦\t464294\n用纸板\t464295\n陈师行\t464296\n于汉超\t464297\n韩长赋\t464298\n排卵试纸\t464299\n1690\t464300\nFicow\t464301\n真三\t464302\npropertis\t464303\n日記\t464304\n名人们\t464305\n迅猛龙\t464306\n开心保\t464307\n壹万元\t464308\n亚邦股份\t464309\n这感觉\t464310\n罗技k480\t464311\n王牌特工2\t464312\n石首吧\t464313\n大楚网\t464314\ncorrelations\t464315\n小乐\t464316\n山东大学齐鲁医院\t464317\n淋巴癌\t464318\n中国材料研究学会\t464319\n20151101\t464320\n希思\t464321\n金立f100\t464322\n谷开来\t464323\n问药\t464324\n新疆信息网\t464325\n胶袜\t464326\n凯泽斯劳滕\t464327\n汝州\t464328\n补过\t464329\n大文隋唐演义\t464330\n先锋版\t464331\n4把\t464332\n混合型基金\t464333\n山东画报出版社\t464334\n邢台市政府\t464335\n铜章\t464336\n红虎\t464337\nzhanji\t464338\n千反田爱瑠\t464339\n黄体\t464340\nM416\t464341\nsqllite\t464342\naffective\t464343\nSNG\t464344\nsp9\t464345\n年年红\t464346\n小看\t464347\n王氏保赤丸\t464348\n汾湖镇\t464349\n鸡王\t464350\n访问量\t464351\n金枕头\t464352\nBMG\t464353\n舷\t464354\n引继码\t464355\n1.21\t464356\n25L\t464357\n控件\t464358\n怀上\t464359\nsquier\t464360\n澳客网\t46
4361\n上海浦东文华东方酒店\t464362\n霍啸林\t464363\n10.3.3\t464364\nmite\t464365\n达华智能\t464366\npruning\t464367\nzhengfu\t464368\n@10\t464369\n大兴新城\t464370\n酸性水\t464371\n九州大戏台\t464372\n院地\t464373\naxial\t464374\n終\t464375\nincome\t464376\n新疆出入境检验检疫局\t464377\n赵无极\t464378\nelasticjob\t464379\n石灰膏\t464380\n萃华金店\t464381\nTasting\t464382\n玉猪龙\t464383\n指弹\t464384\nAnn\t464385\n一九四二\t464386\n津桥\t464387\n论集\t464388\nmasm32\t464389\n雀圣2\t464390\n萨拉曼卡\t464391\nyemian\t464392\nsandals\t464393\n25PP\t464394\n白芸豆\t464395\n叶静\t464396\n中华人民共和国刑事诉讼法\t464397\n2000年\t464398\n祖父母\t464399\n无过\t464400\nJSer\t464401\nlinke\t464402\n3.2寸\t464403\n狮鹫兽\t464404\nytkah\t464405\ncalculation\t464406\n802.11g\t464407\n李玉成\t464408\nA31\t464409\n老贝\t464410\n邢凯\t464411\ngratuit\t464412\n郭子\t464413\n科密\t464414\nemporio\t464415\n单打\t464416\n刘钧\t464417\n许钦松\t464418\n折叠门\t464419\nRESTFul\t464420\n木槌\t464421\n维修点\t464422\n电联\t464423\n中国企业联合会\t464424\nccv\t464425\n菲斯曼\t464426\n荣盛康旅\t464427\n0325\t464428\n交流生\t464429\n600198\t464430\n自由变换\t464431\n备查\t464432\n九华山机场\t464433\n前程无忧\t464434\nwwr\t464435\n黄大仙祠\t464436\n眼睁睁\t464437\n杨佳\t464438\n铠甲勇士捕将\t464439\n80240017\t464440\n断页\t464441\n一2016\t464442\n辽宁\t464443\nWin10/\t464444\n深圳大剧院\t464445\n项目管理者联盟\t464446\n绝伦\t464447\nSpam\t464448\n书迷\t464449\n成都绕城高速\t464450\n迅雷bt\t464451\nFree-Dom\t464452\n莲藕\t464453\nxxxx\t464454\nMONITOR\t464455\n门品\t464456\n游客们\t464457\nMessagePack\t464458\n国家食品药品监督管理局\t464459\n长兴县\t464460\nvm510l\t464461\n天天象棋\t464462\n邓恩\t464463\nLeave\t464464\nbindtap\t464465\nacademy\t464466\n干电池\t464467\n退休\t464468\n嘉兴人论坛\t464469\n极飞科技\t464470\n就业难\t464471\n12301\t464472\nzhj\t464473\n九杀\t464474\n9709\t464475\n1度\t464476\n萨格拉斯\t464477\nuc\t464478\n驴皮\t464479\n防晒乳\t464480\nroms\t464481\n常态\t464482\n康纳·麦格雷戈\t464483\n森地\t464484\n广州棒谷网络科技有限公司\t464485\n大连宾馆\t464486\n2.20\t464487\n第186集\t464488\nQTimer\t464489\n甘南藏族自治州人民政府\t464490\n圣鹿之死\t464491\n谢希德\t464492\nkmz\t464493\n天语\t464494\n险恶\t464495\n刘小光\t464496\n大众网\t464497\n跳马\t464498\n书报亭\t464499\n华龙路\t464500\nesa\t4
64501\n潍柴动力\t464502\n吹牛逼\t464503\n西岳华山\t464504\n三星W2017\t464505\n远洋君\t464506\n胃印戒细胞癌\t464507\n第33话\t464508\nRemo\t464509\nwn725n\t464510\n千喜鹤\t464511\n青岛地铁11号\t464512\n莫干山风景区\t464513\n大学生兼职网\t464514\n航天科技集团\t464515\n10038\t464516\n中国信托业协会\t464517\nDepp\t464518\n安庆E网\t464519\n更有\t464520\n16a\t464521\n127小时\t464522\n铁军\t464523\n蟹爪\t464524\n云图\t464525\nv6.6.2\t464526\n香港幼儿园\t464527\n常州钟楼政府网\t464528\n海洋球\t464529\n对外经济贸易大学国际商学院\t464530\n手扶式\t464531\n中国政府\t464532\n排母\t464533\nTub\t464534\nseoyanfa\t464535\ncycript\t464536\n无生\t464537\ntides\t464538\n银行家\t464539\n轴向柱塞泵\t464540\n小团圆\t464541\n松下电器(中国)有限公司\t464542\n2017-11-15\t464543\ncancle\t464544\n云南普洱茶\t464545\n强退\t464546\n蒙自县\t464547\n2012届\t464548\n捅杀\t464549\n销孔\t464550\n2KW\t464551\n武汉热线房产网\t464552\n闲下来\t464553\n细粒\t464554\n斗影联盟\t464555\n结实\t464556\n农药\t464557\n大捷龙\t464558\n云播\t464559\n后前\t464560\n色青片\t464561\n创造与魔法\t464562\n售楼部\t464563\n4月20\t464564\n瑞泽\t464565\n书法学\t464566\n防暑降温\t464567\nLocale\t464568\n动态码\t464569\ni.MX6\t464570\nCIS\t464571\n下路\t464572\nVol.5\t464573\n聘品\t464574\n律诗\t464575\n蔗糖\t464576\n举一反三\t464577\n生产费\t464578\nEraser\t464579\n点胶\t464580\n家德\t464581\n健身服\t464582\n唐律疏议\t464583\n果浆\t464584\nNSTimer\t464585\n培养液\t464586\n同福\t464587\nMotorsport\t464588\nFilebeat\t464589\n恒信人才网\t464590\n贝诺\t464591\ncypress\t464592\nfontawesome\t464593\nZSMJ\t464594\n河段\t464595\n六十天\t464596\n乐猫\t464597\n魔形女\t464598\n薛宝琴\t464599\n照片墙\t464600\n和合彩\t464601\n野狼\t464602\n小眼睛\t464603\n你还要我怎样\t464604\nPi\t464605\n**公司\t464606\nhomeless\t464607\n南京西路街道\t464608\n傻蛋\t464609\n韦小宝\t464610\nHara\t464611\n安拆\t464612\n绍兴火车站\t464613\n趴\t464614\n预包装食品营养标签通则\t464615\nmadam\t464616\n朱梓骁\t464617\n多摩\t464618\nv2.7.2\t464619\n痛经\t464620\nkowa\t464621\n20171013\t464622\n豆元\t464623\n天策府\t464624\n共现\t464625\n黑搜\t464626\n奥特曼剧场版\t464627\n显微\t464628\n芳飞\t464629\nwo99\t464630\n2018年9月份\t464631\nT\t464632\n梁志军\t464633\n私人会所\t464634\n孢子\t464635\nT+D\t464636\n大怪\t464637\n欧亚德\t464638\n斯帕\t464639\n更新改造\t464640\n天作\t464641\n风行牛奶\t464642\n打法\t464643\n上孕
\t464644\n麻黄碱类\t464645\n单样本\t464646\n圆通物流\t464647\n有什么用\t464648\n2546\t464649\n潭水\t464650\n8.6%\t464651\n译文版\t464652\nC8\t464653\n乱轮\t464654\n明治奶粉\t464655\n皮鞭\t464656\n卡拉拉\t464657\n宋冬\t464658\n萝莉塔\t464659\n提测\t464660\n机米\t464661\n绝地求生刺激战场游戏\t464662\n云间\t464663\n脉宝云店\t464664\n03款\t464665\n怪侠\t464666\n济宁市市中区\t464667\n潇湘晨报数字报\t464668\n刷片\t464669\ncrusher\t464670\noculusrift吧\t464671\n邹氏\t464672\nDIV\t464673\n眼影膏\t464674\n绍兴一中\t464675\nbusty\t464676\nicp证\t464677\n4.3.4\t464678\nVaccines\t464679\nANIMATION\t464680\n伯邑\t464681\nav在线播放\t464682\nrexue\t464683\n幼儿教育学\t464684\n888888\t464685\n典伊\t464686\nBarracuda\t464687\n岳不群\t464688\n5月3日\t464689\n尽快\t464690\n缺血性卒中\t464691\n自述\t464692\n割灌机\t464693\nrunningman2017\t464694\n粳稻\t464695\nosram\t464696\n西安乐居\t464697\n吴茜\t464698\n猎户\t464699\n55英寸\t464700\n反向代理\t464701\n黑龙江职业学院\t464702\nxps15吧\t464703\n江滨路\t464704\n恒爱\t464705\nUltraEdit-32\t464706\n初烧\t464707\nDigest\t464708\n佟志\t464709\nbasex\t464710\n噼里啪啦\t464711\nccna\t464712\n箭雨\t464713\n低消\t464714\nWORD转换器\t464715\n华清园\t464716\n3.X\t464717\nV3菱悦\t464718\n毛入学率\t464719\n4399动漫网\t464720\n乳汁\t464721\n几系\t464722\n黄涛\t464723\n魂帝\t464724\n手持吸尘器\t464725\nmonogem\t464726\nWebb\t464727\n卫生专业技术资格考试\t464728\n听课\t464729\n涟水机场\t464730\n干红葡萄酒\t464731\n需要量\t464732\n红漆\t464733\n兰空\t464734\n周三多\t464735\nThumbs\t464736\n袁鸣\t464737\n3.0.3\t464738\n南北路\t464739\n咽下\t464740\n黄金果\t464741\nyou-get\t464742\n保健师\t464743\n滚球\t464744\n81886600-8003彭娟13428412317\t464745\nleejersey\t464746\n微小宝\t464747\n牛首山\t464748\nnfts\t464749\n钙钛矿\t464750\n100号\t464751\nsstd\t464752\n王国云\t464753\n性爱片\t464754\n上海光源\t464755\n慈溪人才网\t464756\n光辉照\t464757\n4399疯狂猜成语\t464758\ndahua\t464759\n状态方程\t464760\nIST\t464761\n测量工程师\t464762\n天涯明月刀OL\t464763\n开阔眼界\t464764\nTXT谱\t464765\n百度热力图\t464766\njyp\t464767\n马畅\t464768\n娄书记\t464769\n资源环境学院\t464770\n叙政府军\t464771\n楞严咒\t464772\nSelecting\t464773\n梦月\t464774\n草婴\t464775\n嘉里大酒店\t464776\nCourt\t464777\n雷凌185t\t464778\ngrowth\t464779\n学片\t464780\n五联\t464781\n取乎\t464782\n凯特·戴琳斯\t464783\n8
0余\t464784\n赵广\t464785\n吴南生\t464786\n国土\t464787\nDBMS\t464788\n反思集\t464789\nEF6\t464790\n58张\t464791\nvxace\t464792\n抛丸\t464793\nLearns\t464794\nE15\t464795\nATT\t464796\n皮卡丘\t464797\n说人话\t464798\n2501\t464799\n代点\t464800\n零陵区政府\t464801\n巡边\t464802\n挑号网\t464803\n去了解\t464804\n南汇中学\t464805\n西丽人民医院\t464806\n行政处罚法\t464807\n华硕\t464808\n国宾府\t464809\n比亚迪秦ev450\t464810\n201801\t464811\n碧云\t464812\nsprng\t464813\n通知栏\t464814\n伯纳天\t464815\nstc15f2k60s2\t464816\n稼楼\t464817\nJoJo的奇妙冒险\t464818\n1.4.2\t464819\n31日\t464820\n依米康\t464821\n努比亚v18\t464822\n感统\t464823\nSDk\t464824\nquizlet\t464825\n中国能源建设集团广东火电工程有限公司\t464826\n自相\t464827\n卡条\t464828\n太阳能硅片\t464829\nk450\t464830\n曾艳芬\t464831\n路沿\t464832\n南京皮肤研究所\t464833\n周行\t464834\njparepository\t464835\n声光报警器\t464836\n皮脂\t464837\n唐胜伟\t464838\n黑帮老大\t464839\n魔考\t464840\n达芬\t464841\n中经\t464842\n慢性中毒\t464843\n好找\t464844\n棋王\t464845\n椰子\t464846\n烈阳\t464847\n贷方余额\t464848\nmf\t464849\n重计\t464850\n大师赛\t464851\n纯黑\t464852\n人才培\t464853\nxAxis\t464854\n愉快\t464855\n杜奕衡\t464856\n宫廷\t464857\n异丙苯\t464858\n马莹\t464859\n卡培他滨\t464860\n操纵\t464861\n轮胎展\t464862\n新老娘舅\t464863\n双浦镇\t464864\nclearing\t464865\nicomoon\t464866\nop2\t464867\np355d\t464868\n手语操\t464869\n评论家\t464870\n胡寿松\t464871\n嘉祥\t464872\n谭晶\t464873\n重庆七中\t464874\n守梦者\t464875\n跑向\t464876\n僵尸启示录\t464877\n人杰地灵\t464878\n朦朦\t464879\ni5-8250u\t464880\nlookup函数\t464881\n第134期\t464882\n5.9\t464883\n爱田樱\t464884\nNovo\t464885\n健康经\t464886\n空重\t464887\n中国爱乐乐团\t464888\n聚氨酯防水材料\t464889\nMastering\t464890\n弘业期货股份有限公司\t464891\n原路\t464892\n史莱克七怪\t464893\n承影\t464894\n中泰\t464895\n40CM\t464896\n法亚史诗吧\t464897\nuiuc\t464898\n钱江潮\t464899\n三星手表\t464900\n新浪财\t464901\nipynb\t464902\n河北机电职业技术学院\t464903\n食帖\t464904\n采暖炉\t464905\ncomdlg32.ocx\t464906\nmpa\t464907\n双人版\t464908\n山竹海\t464909\nnote2\t464910\n青荷\t464911\n中国音协\t464912\n磁头\t464913\n稽阳\t464914\nPrem\t464915\n第七感\t464916\nuidatepicker\t464917\n服务税\t464918\n百度云/1280高清\t464919\n情深深\t464920\n承压型\t464921\n龙井峡\t464922\n晏\t464923\n梅塔\t464924\niMP3\t464925\n上百度\t464926\n
阜南\t464927\n结构简式\t464928\nDecryptor\t464929\n梅婷\t464930\n孙儿\t464931\n选录\t464932\n牛皮菜\t464933\n腾讯网页游戏\t464934\n朱家村\t464935\n周帅\t464936\n吃鸡不卡\t464937\n工业坊\t464938\nN5\t464939\n文包\t464940\n中山南\t464941\n角钢塔\t464942\n手孔井\t464943\n梦魔\t464944\nボリュ\t464945\n13mm\t464946\n氯化钯\t464947\n枪模\t464948\n曳尾\t464949\nCOT\t464950\n陆尊\t464951\n0.16\t464952\n汽配网\t464953\n剖面仪\t464954\nSUPOR\t464955\n公职类考试信息网\t464956\n回水\t464957\n凰图腾\t464958\n元阳\t464959\n神舟十号\t464960\n华南农业大学珠江学院\t464961\n第164\t464962\n北京人才\t464963\n惠存\t464964\n超图\t464965\ntchar\t464966\n河源市人力资源和社会保障局\t464967\n望字\t464968\n撞死\t464969\nAnytime\t464970\n双弩\t464971\n可收回金额\t464972\n大青山\t464973\n电压力\t464974\n利刃出击\t464975\nJK罗琳\t464976\n融为一体\t464977\n220名\t464978\n科研室\t464979\n建设度\t464980\n湖州市教育局\t464981\n第34期\t464982\n升降机\t464983\nBandicut\t464984\n主心骨\t464985\n非均质\t464986\n整机\t464987\nufunc\t464988\nwinners\t464989\n德彪西\t464990\nk480\t464991\n华为悦盒ec6108v9c\t464992\n1.2%\t464993\nnvivo\t464994\n武汉锐科光纤激光技术股份有限公司\t464995\n小米MIX2s\t464996\nパワ\t464997\n南环新村\t464998\n康悦\t464999\n国家资本主义\t465000\n保坂绘里\t465001\ncad教育版\t465002\n新天药业\t465003\n电气工程专业\t465004\n图示\t465005\nczx\t465006\n登发\t465007\n经颅磁刺激仪\t465008\n6.01\t465009\n米莉尔\t465010\nre0\t465011\n小小米\t465012\n无薪\t465013\n显玩\t465014\nFUTURE\t465015\n凡人修仙传之仙界篇\t465016\n禁卫\t465017\n变奏曲\t465018\n复配\t465019\n治疗性\t465020\n利福平\t465021\nMorcato\t465022\nABCC式\t465023\n标准园\t465024\n新萄京\t465025\nntp服务器\t465026\n吉州\t465027\npablo\t465028\n通许县\t465029\nDillon\t465030\n女胸\t465031\n聂瑞平\t465032\n中国压缩机网\t465033\n51CTO\t465034\n教育展\t465035\n嘉宾\t465036\n静心口服液\t465037\n公虾米\t465038\n红楼梦\t465039\n大北农集团\t465040\n90ss套\t465041\n金东新闻网\t465042\nShar\t465043\n八神\t465044\n建湖县\t465045\n赌注\t465046\n陈蔚\t465047\n中国道路运输协会\t465048\n低端机\t465049\n朱晨丽\t465050\n金相砂纸\t465051\n环环相扣\t465052\n中原工学院\t465053\n比列\t465054\n固定性\t465055\n儿童滑步车\t465056\n南法\t465057\nidl\t465058\n杰图\t465059\nvancouver\t465060\n体制化\t465061\nios10吧\t465062\n煤业\t465063\n跨越式跳高\t465064\n美国证券交易委员会\t465065\n策划者\t465066\n打浪\t465067\n妙音\t465068\nTOUCH\t465069\
n梦戴维\t465070\n地质人\t465071\n王馨瑶\t465072\n彗星撞地球\t465073\n近物\t465074\n福克斯论\t465075\n职享\t465076\nUniversidade\t465077\n1523\t465078\n斗门区\t465079\n多磺酸粘多糖乳膏\t465080\n侯红\t465081\nBETA\t465082\nApprentice\t465083\nKingBoBo\t465084\n滑爽\t465085\nCLINIQUE\t465086\ntrue\t465087\n要号\t465088\n开平市政府\t465089\n二叉链表\t465090\nCX-3\t465091\n查讯\t465092\nigg\t465093\nthinkpa\t465094\nevaporation\t465095\n2599元\t465096\n陈大\t465097\nЦ\t465098\n籍籍\t465099\n平原君\t465100\n北大培文\t465101\nXVM\t465102\n印章机\t465103\n绍兴黄酒\t465104\nGraphQL\t465105\n徐进\t465106\n951\t465107\n磁石\t465108\n32GB/全网通\t465109\n注销登记\t465110\n海力布\t465111\n高民\t465112\n北线\t465113\n三更之饺子\t465114\n红柱\t465115\n人人商城v3\t465116\nO型腿\t465117\n期货从业员资格考试\t465118\n万宁市\t465119\n奔跑吧\t465120\n新乡人才网\t465121\n胜败\t465122\n江门房协网\t465123\n唐门\t465124\n山崩\t465125\n翘嘴鱼\t465126\nassertion\t465127\n冒进\t465128\n经验篇\t465129\n949\t465130\n蒸压\t465131\n诱人姿态\t465132\n拍拍\t465133\n标准件\t465134\n榆关\t465135\n工程室\t465136\n郑州南站\t465137\nx战警\t465138\n祈求者\t465139\n美壶网\t465140\n联想y460\t465141\n脚踏船\t465142\n卡斯帕\t465143\n江应怜\t465144\n欧亚非\t465145\n满洲里\t465146\n海南质量网\t465147\n领导科学\t465148\n广东人民出版社\t465149\n九院\t465150\n_梦幻诛仙手游\t465151\n古蔺县\t465152\n吼姆\t465153\nmulter\t465154\n都是\t465155\nx32/x64\t465156\n观澜湖新城\t465157\n七彩山\t465158\n电磁辐射检测仪\t465159\niClient\t465160\n大贴\t465161\n四分\t465162\n中国医学科学院肿瘤医院\t465163\n近五年\t465164\n爱宁达\t465165\n服装有限公司\t465166\nwire\t465167\n8895\t465168\n胜于\t465169\n科普基地\t465170\n耶梦加得\t465171\nczxttkl\t465172\n谦虚\t465173\n组织方\t465174\n暗黑地牢吧\t465175\noptimized\t465176\n天资\t465177\n苏州文化艺术中心\t465178\n能投\t465179\n交行\t465180\n胶州市\t465181\n书香府邸酒店\t465182\n中级通信工程师考试\t465183\n微胖界\t465184\n疑惑\t465185\n闪闪发亮\t465186\n2018年03月18日\t465187\n酱香饼\t465188\n塔河\t465189\n海航实业\t465190\n浴盐\t465191\n轻型\t465192\n菱形肌\t465193\n枯藤\t465194\n72v20ah\t465195\n横框\t465196\n我是传奇\t465197\n南长\t465198\npinctrl\t465199\n临边防护栏杆\t465200\n二十世纪三十年代\t465201\n236集\t465202\n福州市人民政府办公厅\t465203\n单纯形\t465204\nword2010公式编辑器\t465205\n境外\t465206\n李子苗\t465207\n明星志愿3\t465208\n反恐法\t465209\n凤凰单丛茶\t465
210\ngpib\t465211\n标码\t465212\nscope\t465213\n郑伯克段于鄢\t465214\n九正建材网\t465215\n养\t465216\n黑龙江省哈尔滨市\t465217\nDreamers\t465218\nInstant\t465219\nvegaspro\t465220\n第05集\t465221\n仙露\t465222\n护理型\t465223\n06期\t465224\n无能为力\t465225\n中学教育\t465226\ncad2005\t465227\nnanshan\t465228\n赣州百姓网\t465229\nmbox\t465230\n400W\t465231\n12ckckcom\t465232\n重玩\t465233\n贵广高铁\t465234\nJOJO奇妙冒险\t465235\n巧\t465236\n理才网\t465237\nCSlunatic\t465238\n出瞳\t465239\nSAMSUNG\t465240\n不聊\t465241\n借呗日利率\t465242\nrealized\t465243\n字辈\t465244\nhaving\t465245\n沈阳市和平区\t465246\n浑水公司\t465247\nt4\t465248\nmmmm\t465249\ndropload\t465250\n平缓\t465251\n临沂小学\t465252\nzhiding\t465253\n妖媚\t465254\n唯学网\t465255\nH5播放器\t465256\nhodv\t465257\nConcept\t465258\n交互篇\t465259\n3d模型\t465260\nresonance\t465261\n洗马潭\t465262\n04.16\t465263\n刘家\t465264\n机动轮椅车\t465265\n彩虹六号围攻\t465266\nRESTful\t465267\n603596\t465268\n保护\t465269\n南石\t465270\n飓风营救\t465271\nDiligence\t465272\n千姿百态\t465273\n发展期\t465274\n2018年5月7日\t465275\n小红楼\t465276\n小品\t465277\n打点计时器\t465278\n从未离开\t465279\n澳门银座\t465280\n1.5.11\t465281\n2平米\t465282\nmy\t465283\n黑山共和国\t465284\n观月\t465285\n羰基化合物\t465286\nchw\t465287\n隆平\t465288\n茗山\t465289\n用友\t465290\n发审会\t465291\n2bt\t465292\n相违\t465293\ngrayscale\t465294\n解到\t465295\n命苦\t465296\n小初\t465297\n乐创\t465298\n第100章\t465299\n超新约全书\t465300\n巴山\t465301\n日博\t465302\n魅族mx3\t465303\n144HZ\t465304\n风洞\t465305\nxiaz\t465306\n季风\t465307\nonly\t465308\n单词库\t465309\n柳编\t465310\nViet\t465311\n耳闷\t465312\nPS/2接口\t465313\n神幻\t465314\n饮食\t465315\n双网\t465316\n颇感\t465317\n成都医学院\t465318\n宣汉县文广局\t465319\n场中\t465320\n云脉\t465321\n轻描淡写\t465322\ndivor\t465323\n叹词\t465324\nBr\t465325\n建议类\t465326\n语音学\t465327\n正法\t465328\n大数据架构师\t465329\n紫外线光疗仪\t465330\n蜜桃成熟时33d\t465331\n变迁史\t465332\n3Q_\t465333\n破立\t465334\n在线电视\t465335\n中国银监会\t465336\npubg\t465337\n冰盖\t465338\n当权者\t465339\nCodePen\t465340\n临界值\t465341\n小璐\t465342\n恒易融平台\t465343\n新八\t465344\n灯海\t465345\n香港茶餐厅\t465346\n000008\t465347\n交予\t465348\n三易\t465349\n9100\t465350\n纠纷案\t465351\n路队\t465352\n深圳
佳兆业\t465353\n食物们\t465354\n伍圆\t465355\n入骨相思\t465356\nvmware\t465357\n枪斗术\t465358\n东海街道\t465359\n植物性\t465360\n家声\t465361\n阿方索\t465362\n梁某\t465363\n20180412\t465364\n金浩\t465365\n大学科\t465366\n填料函\t465367\n后妃\t465368\n500多万\t465369\n陈小姐\t465370\n微利\t465371\n海内存知己\t465372\n大烟\t465373\n晏几道\t465374\n月浦镇\t465375\n融资融券\t465376\ncoreos\t465377\n长沙市轨道交通集团\t465378\nmerry\t465379\n非同一控制下企业合并\t465380\n比价\t465381\n沙河校区\t465382\n淡妆\t465383\n森沢\t465384\nLG杯\t465385\n求赞\t465386\nノ一\t465387\n抗过敏\t465388\nexpectancy\t465389\n绝种\t465390\n长短句\t465391\nonsd\t465392\ntherefore\t465393\n王浩宇\t465394\n易美\t465395\n东芝e\t465396\n木场\t465397\n嘿咻\t465398\ndmx\t465399\n52条\t465400\n南通房产网\t465401\nPyQuery\t465402\n北京西\t465403\nevisa\t465404\n三河一中\t465405\n洋货\t465406\n中原石油报\t465407\n音词\t465408\n膛线\t465409\n20151212\t465410\nSnooker\t465411\nNails\t465412\n望尽\t465413\n幽暗城\t465414\n危行\t465415\noppoa59m\t465416\nwhisper\t465417\n怀化三中\t465418\n动不动\t465419\n没赶上\t465420\nsap吧\t465421\n迪塞尔\t465422\n福城街道\t465423\n秘密世界\t465424\nflir\t465425\npureftpd\t465426\n万科公园\t465427\n两三百\t465428\n神罗天征\t465429\n热熔压敏胶\t465430\n打人\t465431\n湖南省农业委员会\t465432\n凉拌黄瓜\t465433\n安陵\t465434\n井喷\t465435\n经济局\t465436\n众泰sr9\t465437\nrelativity\t465438\n5gwifi\t465439\n宿醉2\t465440\n倒\t465441\n五百次\t465442\n专栏\t465443\nncap\t465444\n一阳指\t465445\n窗边的小豆豆\t465446\n恋舞传祺\t465447\n华商杂谈\t465448\n新加坡航空公司\t465449\nzhiyuan\t465450\n半生\t465451\n施琅\t465452\n朴振荣\t465453\n贪得无厌\t465454\n甲基丙烯酸羟乙酯\t465455\n朗萨\t465456\n仮\t465457\n/script\t465458\n温州市委\t465459\nHKICPA\t465460\n微信电脑版多开器\t465461\nXht\t465462\n美少女万华镜4\t465463\n胆汁酸\t465464\n保键\t465465\n海关编码\t465466\n安保部\t465467\n天天爱\t465468\n贤淑\t465469\n乾坤大挪移\t465470\n财务岗\t465471\n盘古分词\t465472\n出资方\t465473\n考研报录比_新东方在线院校库\t465474\n厦门港\t465475\nemui5\t465476\nIE8浏览器\t465477\nXCin\t465478\n虬龙\t465479\n索玛\t465480\nPod\t465481\n资信\t465482\n母乱\t465483\n靠天吃饭\t465484\nsubtract\t465485\n吉化\t465486\nbalabala\t465487\n于吉\t465488\n其内\t465489\n烟云\t465490\n无极调光\t465491\nsqoop\t465492\ntao\t465493\n协调研\t465494\n极品飞车8\t465495
\n大s\t465496\n并联电路\t465497\n重蹈覆辙\t465498\n传道者\t465499\n跳跳跳\t465500\niCON\t465501\n发单\t465502\n临港\t465503\n二双\t465504\n源城\t465505\n招生简章_中学\t465506\ndeffounderror\t465507\nphotonic\t465508\n激活函数\t465509\n许四多\t465510\n围头\t465511\n合川市\t465512\n80L\t465513\n兹维列夫\t465514\nopengl\t465515\n前四后八\t465516\n南极山\t465517\n瘢痕疙瘩\t465518\n手淫\t465519\n金山WPS_办公软\t465520\n网易宝\t465521\n美国村\t465522\n阿尔及尔\t465523\n合并列\t465524\n七天内\t465525\nmae\t465526\n狗师\t465527\n复方磺胺甲恶唑片\t465528\n黑石铸造厂\t465529\n外资医院\t465530\n邓小南\t465531\n黑暗系\t465532\nclause\t465533\n娇妻\t465534\n桌牌\t465535\n湖北省科学技术厅\t465536\n第1集\t465537\n微微健康网\t465538\n周恺\t465539\n斯莫林\t465540\nskf\t465541\n谦行\t465542\n中国南方航空股份有限公司\t465543\n李黎明\t465544\n我的男人\t465545\n王胖子\t465546\n三维地球\t465547\n80ml\t465548\n疾驰\t465549\n21英寸\t465550\nzcz\t465551\n丽贝卡\t465552\n天羽\t465553\n神聚\t465554\n永利宝\t465555\nCOAT\t465556\nav接口\t465557\nwww65paoonm\t465558\n兽女\t465559\n哥斯拉\t465560\n包头路\t465561\nmoleskine\t465562\n号列\t465563\n格兰富水泵\t465564\n标牌\t465565\nApabi\t465566\n北原\t465567\nanon\t465568\n拍成\t465569\n北京产权交易所\t465570\n恒流恒压\t465571\n禅道\t465572\n2002款\t465573\n玩儿友\t465574\n腮帮\t465575\nCMD命令提示符\t465576\nzet\t465577\nmavan\t465578\n杭州市妇幼保健院\t465579\n姿态\t465580\nAEO\t465581\n电渣\t465582\nsubordinate\t465583\n6110\t465584\n逼真\t465585\nDISS\t465586\n香港通利\t465587\n讽谏\t465588\n华南师范大学经济与管理学院\t465589\n晓风残月\t465590\n联智\t465591\n无畏\t465592\nYolo\t465593\n上海广播电台\t465594\n新锦江娱乐\t465595\nbalsamiq\t465596\nIHG优悦会\t465597\n高要区\t465598\nConfidential\t465599\n热敏机\t465600\n2207\t465601\n撕拉\t465602\n合规部\t465603\n万兆交换机\t465604\n毕竟\t465605\n成都滴滴\t465606\n小黑兔\t465607\n压电传感器\t465608\nV2\t465609\n氧化铝\t465610\nPBO\t465611\n1.1.11\t465612\n遊戲\t465613\n江凌\t465614\n烟钱\t465615\n托曼尼\t465616\nhcna\t465617\n春敏\t465618\nFCC认证\t465619\n店头\t465620\n李某甲\t465621\n死亡骑士吧\t465622\n79家\t465623\nharrow\t465624\n虫草花\t465625\n考拉\t465626\n雪莲菌\t465627\nNGO\t465628\n2010年以后\t465629\nwxp\t465630\noutlook2007\t465631\n科源\t465632\nPP塑料\t465633\n会引起\t465634\nBachelor\t465635\nmalab\t465636\n花生棒\t465637\nO
penSSH\t465638\nfore\t465639\n圣城街道\t465640\n上海大世界\t465641\n珍珠吊兰\t465642\n坏死\t465643\n侨港\t465644\n张姝\t465645\n江苏环保厅\t465646\nSHOUJI\t465647\n效实中学\t465648\n2017年四月份\t465649\n私立华联学院\t465650\n电信局\t465651\n单引号\t465652\n猜想\t465653\n中国电影网\t465654\n资深堂\t465655\n输卵管炎\t465656\n撼地者\t465657\n伤残\t465658\n官方免费版\t465659\n卡罗纳\t465660\n2.2.3\t465661\n弹无虚发\t465662\n畚斗\t465663\nPains\t465664\nfrien\t465665\n石平\t465666\n壳牌公司\t465667\n4.50\t465668\n双梁起重机\t465669\n南审\t465670\npolitician\t465671\n杨朝晖\t465672\n丝攻\t465673\n相思意\t465674\n摩耶\t465675\n正一派\t465676\n兴路\t465677\n隔离式\t465678\n顺义网城_顺义区政府网\t465679\nroe\t465680\nPerfection\t465681\n右美托咪定\t465682\n新阳光教育\t465683\n毛宁\t465684\n攻取\t465685\nwindowsapps\t465686\nredeploy\t465687\n出装\t465688\nlogical\t465689\n藤新一\t465690\n先导区\t465691\n大众软件\t465692\n不动产登记证\t465693\n动态壁纸\t465694\n板筋\t465695\n2072\t465696\n29部\t465697\nスパ\t465698\n框架\t465699\n1000000\t465700\ncero\t465701\n二公\t465702\n同安\t465703\nv1.1.2\t465704\n精料\t465705\n区卫计委\t465706\n到不到\t465707\n侦察兵\t465708\n15平\t465709\n汇总贴\t465710\n全视\t465711\n固定器\t465712\nmagics\t465713\n防辐射服\t465714\nレズ\t465715\n引言\t465716\n6850\t465717\n悠长假期\t465718\n利海\t465719\n49张\t465720\n双值\t465721\nUrdu\t465722\n优派YOPAI.COM\t465723\n暗影战斧\t465724\n南四环\t465725\n6.75\t465726\n口风\t465727\n开网\t465728\n千杯\t465729\n深职院\t465730\nabab式\t465731\n指定位\t465732\n陈昕\t465733\n拉芳家化\t465734\nErmenegildo\t465735\n飞雁\t465736\n前列地尔注射液\t465737\n2920\t465738\n娥佩兰\t465739\n行么\t465740\n窥探者\t465741\nGPRS模块\t465742\n官厅\t465743\n灰度共生矩阵\t465744\n李宗\t465745\n放管服\t465746\n内地\t465747\n源泉\t465748\nNAO\t465749\n甲状腺肿瘤\t465750\nthey\t465751\n金刚山\t465752\n周立\t465753\n甩刀\t465754\n2400万元\t465755\n防水圈\t465756\n红叔\t465757\n王世勇\t465758\n互选\t465759\nkeep\t465760\n挂单\t465761\n超过二十年\t465762\nbindinput\t465763\nwindows-server\t465764\nlatin1\t465765\n哥特舰队\t465766\n半小\t465767\n二虎\t465768\nQUARTUS\t465769\n翱翔\t465770\n回龙镇\t465771\n攫\t465772\nSECRET\t465773\n金羚\t465774\n叶樱\t465775\n梦远书城\t465776\n万科集团\t465777\n条令\t465778\n淮南市政府\t465779\n肖盛峰\t465780\noschina\t4657
81\n找不着\t465782\n每周二\t465783\n45卷\t465784\n极乐宝盒\t465785\n时彩\t465786\n一块儿\t465787\n1949年10月1日\t465788\n金箭\t465789\n雅泛迪\t465790\nvoc\t465791\n上云\t465792\n刑事侦缉\t465793\n闸坝\t465794\n尻\t465795\n成扇\t465796\n水谷幸\t465797\nxiao106347\t465798\n网站商务通\t465799\n献力\t465800\n永旭\t465801\nword200\t465802\n射击训练\t465803\n800米\t465804\n七十年代\t465805\nbootloader\t465806\n马松\t465807\n小蜜桃\t465808\nk-means\t465809\n南京聚乐部\t465810\n13次\t465811\n展展\t465812\nsupervision\t465813\nxenial\t465814\n凌华科技\t465815\n合并村\t465816\n安装盘\t465817\n铁观音茶\t465818\n记住\t465819\n隔空\t465820\n绒毛膜促性腺激素\t465821\nf598\t465822\n海宁日报\t465823\ng2030\t465824\n第三局\t465825\n中国盐都政府\t465826\nAngel\t465827\n他的一生\t465828\nV2.7\t465829\n兴华街道\t465830\n苏虾\t465831\n红音\t465832\n8250U\t465833\n农学\t465834\nELECTRONICS\t465835\n秦时明月\t465836\n上海中医药大学附属岳阳中西医结合医院\t465837\nlukas\t465838\n墓主\t465839\n星际战甲战甲\t465840\n轿运\t465841\nETERM\t465842\n快网\t465843\n结业生\t465844\nqq.com511269494@qq.com6ovx@163.comchengn199634@163.co\t465845\n医大一院\t465846\n交接棒\t465847\nNuclear\t465848\nTRK\t465849\n安卓中文网\t465850\n牛人\t465851\n小儿柴桂退热颗粒\t465852\n齐鲁电视台\t465853\n2017-2019年\t465854\n出兑\t465855\n全币种\t465856\nsnaker\t465857\n德牧犬\t465858\n中华人民共和国宪法修正案_\t465859\n刘雪婧\t465860\niif\t465861\n微韩\t465862\n南浔新闻网\t465863\n右手无名指\t465864\n槎\t465865\n52pk\t465866\n幸福花园\t465867\n助教\t465868\nRefrigerator\t465869\n误入歧途\t465870\n银杏湖\t465871\n肝豆状核变性\t465872\n搜奇新闻网\t465873\n辔\t465874\n占星学\t465875\n古惑仔6\t465876\n候鸟式\t465877\n华平投资\t465878\nTWIN\t465879\n深圳中心区\t465880\n发泡管\t465881\n产褥\t465882\n健康报网\t465883\n采砂\t465884\n小酥肉\t465885\n锡丝\t465886\n调查问卷\t465887\ncurb\t465888\n西海\t465889\n学无\t465890\nMacroe\t465891\n金田\t465892\n61850\t465893\n方尖碑\t465894\n安理\t465895\ny69\t465896\n徐有容\t465897\n辞典\t465898\n神偷联盟\t465899\n楼盘网\t465900\n二手手机论坛\t465901\n100【\t465902\n41分\t465903\n玻璃水\t465904\n荣耀q版\t465905\n察布查尔县\t465906\n罗雪娟\t465907\n同心县\t465908\n威海北\t465909\nl级\t465910\nryan_knight_12吧\t465911\n龙胜梯田\t465912\n陈二狗的妖孽人生2\t465913\nliteos\t465914\n惊慌失措\t465915\n追击\t465916\n惩罚性赔偿制度\t465917\nyestar\t465
918\n韵\t465919\n降灵记\t465920\nops\t465921\nbilibli\t465922\n聚巧网\t465923\n广州地铁18号线\t465924\n鸡西市\t465925\n於\t465926\nv2.02\t465927\n2017年10月26日\t465928\n贾思勰\t465929\n布地奈德\t465930\n电科院\t465931\n在华\t465932\nsp5\t465933\n六个月\t465934\nHarbor\t465935\nmetro\t465936\nPuritan\t465937\n牵手\t465938\n贝特瑞\t465939\n制高点\t465940\n滨河花园\t465941\n降价\t465942\n汉丽轩\t465943\n馨\t465944\n锦泰\t465945\n乐智\t465946\n东北三省\t465947\n宝箱\t465948\n靓靓蒸虾\t465949\n赛拉\t465950\n1822\t465951\n安徽消防\t465952\n钨\t465953\n祭祀场\t465954\n随下\t465955\n莫氏\t465956\ntaizhou\t465957\n高阻\t465958\n中鹏\t465959\n吃鸡听\t465960\n初家果果show\t465961\n驹\t465962\n女裁判\t465963\n知乎/StackOverflow\t465964\n肝胆科\t465965\n20140224\t465966\nweboffice\t465967\nswg\t465968\n上一站\t465969\n第一战\t465970\n七彩语文杯\t465971\ne500\t465972\n中国教研网\t465973\n呼唤\t465974\n龙族4\t465975\nual\t465976\n自由鸟\t465977\n耍大牌\t465978\nLV包\t465979\n中公会计网\t465980\nESlint\t465981\n陈成\t465982\n绿宝石802\t465983\n窑头路\t465984\n赤子之心\t465985\n介紹\t465986\nDiscrimination\t465987\ntranscriptional\t465988\n渝中半岛\t465989\n3.4\t465990\nzj\t465991\n12星座男\t465992\nHuge\t465993\nB9\t465994\n淫堕\t465995\n铁论\t465996\n2702\t465997\nConcerto\t465998\n双面人\t465999\nnba2008\t466000\n撕心裂肺\t466001\nQX\t466002\n蓝田玉\t466003\n漓江路\t466004\n乾安\t466005\n上海凯宝\t466006\n标注员\t466007\n跨径\t466008\nbinwalk\t466009\n300er\t466010\n美容术\t466011\n站路\t466012\nproble\t466013\nprofitability\t466014\nsuho\t466015\n白海\t466016\n922\t466017\n世界现代设计史\t466018\n配电间\t466019\n总兵\t466020\n国王杯\t466021\n星粉卡\t466022\n开架\t466023\n野良神\t466024\n比利林恩\t466025\n民族服饰\t466026\n简笔\t466027\n哈站\t466028\nmakeup\t466029\n精益网\t466030\n巴塞尔\t466031\n我的兄弟叫顺溜\t466032\nCPLD|ASIC论坛\t466033\n河南省人民检察院\t466034\n60000\t466035\n族裔\t466036\n5.8.0\t466037\nEf\t466038\n双创园\t466039\n第一栋\t466040\nVBS\t466041\nchong\t466042\n智家\t466043\n月记\t466044\n家兔\t466045\n移动经纪人\t466046\nCORSAIR\t466047\n总设计师\t466048\ndisappeared\t466049\n帝乡\t466050\n行知中学\t466051\n优米网\t466052\n剔骨\t466053\n酒体\t466054\n豪装\t466055\n赵清\t466056\n木模\t466057\n绿豆水\t466058\n连阳\t466059\n78亿\t466060\n吕楠\t466061
\n复试\t466062\n接纳\t466063\n铁盖\t466064\n井岸镇\t466065\n北大医信\t466066\n叶翔\t466067\n吴佩慈\t466068\n有志\t466069\n上册\t466070\n胡安\t466071\n花公子\t466072\ndetect\t466073\n还给我\t466074\nangularjs\t466075\nituns\t466076\n福龙\t466077\n凉川\t466078\n错开\t466079\n敏俊\t466080\n法宣\t466081\nicem\t466082\n奇迹男孩\t466083\nGOSICK\t466084\n3套\t466085\n圆谷公司\t466086\n新乡医学院三全学院\t466087\n脑袋\t466088\n不弹\t466089\n乐山一中\t466090\n浩浩荡荡\t466091\n铁西区\t466092\n疏雨\t466093\n黄胶\t466094\n输运\t466095\n节日\t466096\n南江镇\t466097\n丙烯画\t466098\n花絮\t466099\n或且\t466100\n必维\t466101\n高德置地\t466102\n重庆在线\t466103\n3476\t466104\n钢桁架\t466105\n副券\t466106\n攻占\t466107\n510300\t466108\n奈良公园\t466109\n舒张压\t466110\n吊杆\t466111\n登山步道\t466112\n剧情版\t466113\n蜀门吧\t466114\nXshot\t466115\n济源日报社\t466116\n百谷\t466117\n金湖县\t466118\n起重臂\t466119\n盈亏指数\t466120\n唐家河\t466121\nnavicat12\t466122\n从早到晚\t466123\n背景值\t466124\n横峰\t466125\n丙环唑\t466126\n海拉鲁\t466127\n杨建立\t466128\nwiz\t466129\n冷暖人生\t466130\n齐眉\t466131\n嵌合体\t466132\n武警警官学院\t466133\n扒渣机\t466134\n辩论赛\t466135\n腹甲\t466136\n底特律机场\t466137\n罗安达\t466138\n同父\t466139\nK\t466140\n欧凡尔\t466141\nfj\t466142\nport\t466143\n集运仓\t466144\n小客车专段号牌管理信息系统\t466145\n留宿\t466146\n搬场\t466147\n漕泾\t466148\n张郭镇\t466149\n注音\t466150\n濑奈真绪\t466151\nmybatic\t466152\n陈生\t466153\n特字\t466154\n众安保险\t466155\n危险方法危害公共安全罪\t466156\n买卖合同纠纷一审民事裁定书\t466157\n临河区\t466158\n通窍\t466159\n互通式\t466160\n探案集\t466161\n箴\t466162\nfx8300\t466163\n奥迪A1\t466164\nTwitch\t466165\n一定要是\t466166\n叶音\t466167\nOpenLaw\t466168\n拉不拉多\t466169\n三八广场\t466170\n3小时\t466171\n更便利\t466172\n亚盛集团\t466173\n静教院\t466174\nゆ\t466175\n涟涟\t466176\n生源地信用助学贷款\t466177\n西北地区\t466178\n百百\t466179\n91cy.cn\t466180\n综合改革\t466181\n麻将馆\t466182\n反骨仔\t466183\n薏仁粉\t466184\n供电所\t466185\n伊泰集团\t466186\n情意\t466187\n圣何塞\t466188\nHRB\t466189\n锄禾\t466190\n黄脸婆\t466191\n英牛\t466192\n然嗄\t466193\n南华\t466194\n識\t466195\n大全集团有限公司\t466196\n浙江农村信用社\t466197\n18公分\t466198\n保利领秀前城\t466199\n姨母\t466200\n会计等式\t466201\n雀帝6\t466202\n庄公\t466203\n水藻\t466204\n微夏博客网\t466205\n请多指教\t466206\n大明宫遗址公园\t466207\n踟蹰\t466208\n20180305\
t466209\n1.0.21\t466210\n现代风\t466211\n成名曲\t466212\n恺英\t466213\n第一山\t466214\n均质化\t466215\n样币\t466216\n偷师\t466217\n刚柔相济\t466218\nFlorence\t466219\nShooter\t466220\n带式压滤机\t466221\n施工组织设计\t466222\nhote\t466223\n单晶\t466224\n第四十七条\t466225\n陆河县人民政府\t466226\n元贞\t466227\n不穿\t466228\n路权\t466229\n北极点\t466230\n太平鸟\t466231\n双歧杆菌乳杆菌三联活菌片\t466232\n1居\t466233\nSPA馆\t466234\n蒙牛集团\t466235\n再贷款\t466236\n煤\t466237\ncolo\t466238\n多毛\t466239\n狼青犬\t466240\n中投公司\t466241\n保税仓储\t466242\n代码包\t466243\n华为mate10吧\t466244\n发行额\t466245\n青苹\t466246\nharden\t466247\n分别\t466248\n僧格林沁\t466249\nmanipulation\t466250\n老场坊\t466251\n紧密型\t466252\n桥本舞\t466253\n信念\t466254\n遗珠\t466255\n花门\t466256\n郑州招聘网\t466257\n水煮牛肉\t466258\n诗词大会\t466259\n9073\t466260\nf366\t466261\n陆域\t466262\n系统化\t466263\n橡树摄影网\t466264\n腾牛个性网\t466265\n商网\t466266\n内房股\t466267\n里克\t466268\n登堂入室\t466269\n裸聊\t466270\n志表\t466271\n羧基\t466272\n管理区\t466273\nAppCompatActivity\t466274\n秋瑾\t466275\n瑞金医院北院\t466276\n踏月传奇\t466277\n冰雕展\t466278\nBigbang\t466279\n车辆\t466280\npicc\t466281\n三十七度\t466282\n城邑\t466283\n沃隆\t466284\n真光\t466285\n3双\t466286\n长乐坊\t466287\n下压式\t466288\ndafabet\t466289\n神探007\t466290\n补植\t466291\n苏薇\t466292\nArla\t466293\n200P\t466294\n辗\t466295\n小米手机卡\t466296\n华为G9\t466297\n翻譯\t466298\n永盈\t466299\n缘起性\t466300\n墓志\t466301\n禁书\t466302\n滚镀\t466303\nBenQ\t466304\n中国能效标识网\t466305\n槐花香\t466306\npp2\t466307\n背包式\t466308\n魅蓝S6\t466309\n米蓝\t466310\n越来越好\t466311\n无辜\t466312\n绝代双雄\t466313\n米6\t466314\n抢驻\t466315\n鬼树\t466316\n醉酒\t466317\n八五\t466318\nspark-streaming\t466319\n中泰齐富通\t466320\n答疑帖\t466321\n标黄\t466322\n计学\t466323\n美国西南航空\t466324\n河南烩面\t466325\n脱保\t466326\n飞扬\t466327\ncartoons\t466328\n20170128\t466329\n莓美哒\t466330\n时间差\t466331\n五百年\t466332\n氧化锌\t466333\n花嘴\t466334\n89.9\t466335\n泰版\t466336\n庐山站\t466337\n换岗\t466338\nMSR\t466339\n组讯\t466340\n济宁站\t466341\n48项\t466342\n传立\t466343\n天津日报大厦\t466344\n拉差\t466345\n绿盒\t466346\n金山社区\t466347\n开福寺\t466348\n咕咕咕\t466349\n龙少\t466350\n特种作业证\t466351\n散伙饭\t466352\n怀沙\t466353\nsplines\t466354\nSD存储卡\t46635
5\n真心爱你\t466356\n林文正姿\t466357\n庆丰\t466358\n九门娱乐\t466359\n轩尼诗\t466360\n请注意\t466361\n1808s\t466362\nME\t466363\n全国语\t466364\n赛段\t466365\nit运维\t466366\n乐橙\t466367\n16步\t466368\nbodhicitta\t466369\n云魂\t466370\n鸭嘴式\t466371\n哈维尔\t466372\n黄百鸣\t466373\n灵翼\t466374\n豪斯登堡\t466375\n耳骚\t466376\n博学教育\t466377\npwr\t466378\n525li\t466379\n土地公生活\t466380\n60万吨\t466381\nAnimationer\t466382\n幻樱\t466383\n起始\t466384\n久久两性_久久健康网\t466385\n火车采集器帮助中心\t466386\n博白县人民政府\t466387\n家常豆腐\t466388\n略记\t466389\n邀\t466390\n安徽省林业厅\t466391\nzym\t466392\n杭州奥数网\t466393\n主用\t466394\n严寒\t466395\n数字乡村\t466396\n胺化\t466397\n回眸一笑\t466398\n船袜\t466399\n9040\t466400\n现金流量折现法\t466401\n软密封闸阀\t466402\nled洗墙灯\t466403\n登高梯\t466404\nlyrichu\t466405\n乌鲁木齐高新技术产业开发区\t466406\n汉字库\t466407\n姜磊\t466408\n提分\t466409\n排针\t466410\n翼虎\t466411\n杜仁杰\t466412\n仅供参\t466413\n百合会\t466414\n提货单\t466415\n华中科技大学同济医学院附属同济医院\t466416\nattempt\t466417\nheilong\t466418\nad\t466419\n残剑\t466420\n科学技术与工程\t466421\n叶贤\t466422\n较多\t466423\n封存\t466424\n沁阳\t466425\n戚家军\t466426\n伴奏曲\t466427\n假结婚\t466428\n马克思主义认识论\t466429\n五月底\t466430\n顺城街\t466431\n集体经营性\t466432\n双极性\t466433\n水果盒\t466434\n和讯汽车\t466435\ngdwkong\t466436\nafnetwork\t466437\n隆达罗西\t466438\n星宝\t466439\n厦门国贸\t466440\n汤姆猫战营\t466441\nmiao\t466442\nidentity\t466443\n南汉山城\t466444\n科班\t466445\n粗长\t466446\n深圳市网上办事大厅\t466447\n6820\t466448\nHairstyles\t466449\n黑桃\t466450\n扫盲班\t466451\n勉为其难\t466452\nXAVC\t466453\n坂口健太郎\t466454\n竺可桢学院\t466455\n马拉西亚\t466456\n井控\t466457\n建银国际\t466458\n幅频\t466459\n几P\t466460\n神经性头痛\t466461\n14.1.4\t466462\n岩质\t466463\n10.6.8\t466464\n歼6\t466465\nfpt\t466466\n92式\t466467\n海口网\t466468\n80环\t466469\n邵斌\t466470\ndockpanel\t466471\n株洲市发展和改革委员会\t466472\n牡丹鹦鹉吧\t466473\n领导\t466474\n小居\t466475\n囧友\t466476\n大连\t466477\n2卡\t466478\n平层\t466479\n溶气\t466480\n亚辛\t466481\nmembrane\t466482\nd32\t466483\n速龙ii\t466484\n微草\t466485\n一阵一阵\t466486\ndsc\t466487\n量分析\t466488\n锅盔\t466489\nLAST\t466490\n部门\t466491\n霸天虎\t466492\n心皮\t466493\n1890元\t466494\n开瑞K50S\t466495\n品生\t466496\n配电盘\t466497\n扁铁\t46649
8\n长安马自达\t466499\nattribute\t466500\nukelele\t466501\n易宝\t466502\n出京\t466503\n阳春面\t466504\n征占\t466505\ntastes\t466506\n资格表\t466507\n52项\t466508\n伴热\t466509\n902\t466510\n12周\t466511\n评书吧\t466512\n含谷镇\t466513\n芳村大道西\t466514\n总坛\t466515\n迈腾论\t466516\n恒温振荡器\t466517\n华拓\t466518\nrl\t466519\n数字电桥\t466520\n11码\t466521\n天文学\t466522\n萝莉型\t466523\n张曙光\t466524\netc/passwd\t466525\n花生牛轧糖\t466526\n1439\t466527\nfur\t466528\np2p种子搜索器\t466529\n一晓\t466530\n中铁二十三局\t466531\n518\t466532\n96式\t466533\n黄永\t466534\n心灵猎人\t466535\n镇平县人民政府\t466536\n白云岩\t466537\n东亚\t466538\n洪濑镇\t466539\nwindowns\t466540\n15年11月\t466541\n强奸者\t466542\n七杯茶\t466543\n卢敏\t466544\nMuPAD\t466545\n重庆建筑工程职业学院\t466546\ngitignore\t466547\nautotools\t466548\n科段\t466549\n低压配电柜\t466550\n双语\t466551\n八九年\t466552\nWayne-Zhu\t466553\n信息与通信工程\t466554\n兵人\t466555\n八三\t466556\nn7100\t466557\n超级争霸战\t466558\n跑带\t466559\n神堡\t466560\njavhd\t466561\n侦探笔记\t466562\n快看漫画_官方漫画大全\t466563\n6pm.com\t466564\n石龙区\t466565\nAMERICAN\t466566\n王国维\t466567\n空前绝后\t466568\n2040\t466569\nq传\t466570\n倒立摆\t466571\nUZI\t466572\n九台区\t466573\n苹果8\t466574\n张妈\t466575\n盟国\t466576\n孟祥\t466577\n防伪税\t466578\n成本\t466579\n触摸式\t466580\nv7.4\t466581\n48篇\t466582\n四川航空公司\t466583\n冲凉\t466584\nplc1200\t466585\nmajestic\t466586\n麻丝\t466587\nAlso\t466588\napistore\t466589\n3.6.5\t466590\n兰海高速\t466591\n外语类\t466592\nQQ电脑\t466593\n15mm\t466594\ndowncc\t466595\n谁和谁\t466596\n回形针\t466597\n1.9亿\t466598\nABLETIVE\t466599\niisexpress\t466600\n卖方\t466601\n橡皮膏\t466602\nneglect\t466603\nxzy\t466604\n博罗\t466605\n包名\t466606\n写下\t466607\n排球鞋\t466608\np95\t466609\n金沟河\t466610\n敬仰\t466611\n拉巴特\t466612\n3期\t466613\nINDEX函数\t466614\n500块\t466615\n东鹏\t466616\n渝北区人民政府\t466617\n百秒\t466618\n37座\t466619\n艾依\t466620\nGenerals\t466621\n热塑性橡胶\t466622\n语音包\t466623\nKMPlayer\t466624\n黄岐\t466625\n北大资源学院\t466626\n濑户内海\t466627\n铜官窑\t466628\n维伦\t466629\n热血三国3吧\t466630\n雅比斯\t466631\n卢卡奇\t466632\n彩色透水混凝土\t466633\n掉入\t466634\n米英\t466635\n逸创\t466636\n盛京文化网\t466637\n尼康d3200\t466638\n大桥街道\t466639\n高强螺栓\t466
640\n嘲\t466641\n候选框\t466642\nInner\t466643\nckplay\t466644\n抢滩\t466645\n0.11\t466646\n清兵卫\t466647\n天津电台\t466648\n屄_飞华\t466649\n南郑县\t466650\nBURBERRY\t466651\nipad输入法\t466652\n37wan\t466653\n堂妹\t466654\n妖兽都市\t466655\n清浅\t466656\n17年春节\t466657\n鸡蛋花\t466658\n2001年度\t466659\npdf.djvu\t466660\n苹果7Plus\t466661\n人参鹿鞭片\t466662\n上田\t466663\nsponsor\t466664\n蜜桃成熟时1993\t466665\n垃圾书\t466666\n定寿\t466667\n洪清华\t466668\n大眼\t466669\n玫瑰谷\t466670\n1.42\t466671\nAesthetic\t466672\n手车式\t466673\n画江湖之不良人\t466674\nCarolina\t466675\nElta\t466676\n星际争霸神族\t466677\n1300万元\t466678\nHTL\t466679\n宝桑园\t466680\n双湖大道\t466681\n哥林多后书\t466682\n魔女宅\t466683\n黄脚\t466684\nAdo\t466685\n远洋广场\t466686\n苏南硕放机场\t466687\n搞不清\t466688\nswap\t466689\n任由\t466690\n车轮\t466691\n锦上添花\t466692\n椭偏仪\t466693\n大喜\t466694\n光纤快速连接器\t466695\n非常运势网\t466696\n新思铂睿\t466697\n修正药业集团\t466698\n炮制\t466699\n人猿泰山\t466700\n大疆精灵4pro\t466701\n18.9升\t466702\n气动角座阀\t466703\n太乙救苦天尊\t466704\n便利店\t466705\n高浪路\t466706\nmoxing\t466707\n瓮中捉鳖\t466708\n商虎\t466709\n36关\t466710\n基础篇\t466711\n豆丹\t466712\n课时费\t466713\n武沛齐\t466714\n鼻膜\t466715\n圣公会\t466716\n第三列\t466717\n确凿\t466718\n微视\t466719\n72条\t466720\nnds\t466721\nTCHAR\t466722\n精功科技\t466723\n机动战\t466724\n婉拒\t466725\n鲮鱼\t466726\njunit4\t466727\n天骏\t466728\n3se\t466729\n福布斯\t466730\n东津镇\t466731\n长泽茉里奈\t466732\n环江\t466733\n大箭头\t466734\n凤台县\t466735\n慢门\t466736\n商通卡\t466737\n天天射影院\t466738\n沪江中学\t466739\n板子\t466740\n老顾客\t466741\n康德\t466742\n美国西北大学\t466743\nbind函数\t466744\n感慨万千\t466745\n战国之刃\t466746\n三部曲\t466747\nTONE\t466748\n东方明珠城\t466749\n殷一璀\t466750\n过江\t466751\nminidayz\t466752\n在天\t466753\n鹿茸泡酒\t466754\nhuatu\t466755\n粗饼\t466756\nUAT\t466757\n天下无敌手\t466758\n赛中\t466759\n杠杆\t466760\nAirdrop\t466761\n形成率\t466762\n深空\t466763\n重庆市渝中区人民政府\t466764\nvalue值\t466765\n付华\t466766\n中共十八大\t466767\n难缠\t466768\nCVT变速箱油\t466769\n调洪\t466770\n消费金融公司\t466771\n塘溪\t466772\n前5天\t466773\n留档\t466774\n西川结衣\t466775\n分程\t466776\n外阴炎\t466777\n煤层气\t466778\nDNF地下城\t466779\n39集\t466780\n二手车网\t466781\n说句心里话\t466782\n保皇\t466783\n清境\t466
784\n火炬树\t466785\n袖\t466786\n凡茜\t466787\nworlds\t466788\n麻醉\t466789\n人为\t466790\nUV固化机\t466791\n人身攻击\t466792\n运来\t466793\n明纬电源\t466794\n莉莉周\t466795\nKatsuni\t466796\n环球购物\t466797\n深圳武警医院\t466798\n迄今\t466799\n新能源科学与工程专业\t466800\n衍化\t466801\n杨帅\t466802\n人之道\t466803\n湖南省直单位住房公积金管理中心\t466804\n笔意\t466805\n萨博\t466806\n雌\t466807\n通州梨园\t466808\n手机投影仪\t466809\nSSD固态硬盘\t466810\n新浪广东新闻中心\t466811\n朗威\t466812\n细想\t466813\nANGE\t466814\n妆前乳\t466815\n李宏毅\t466816\n邮储银行手机银行\t466817\n路见不平\t466818\n省疾控中心\t466819\n雷柏科技\t466820\n虚职\t466821\n邓勇\t466822\n杨昊\t466823\n联商网\t466824\n平均化\t466825\n昂克拉\t466826\n快乐人生\t466827\n麻山\t466828\n冰菊\t466829\nbobower\t466830\n闭上眼睛\t466831\n三段\t466832\n考录\t466833\nmycloud\t466834\n严刑\t466835\n汽车修理工模拟2018\t466836\n上海公馆\t466837\nStormy\t466838\n榆树湾\t466839\nPOD\t466840\nbinlog\t466841\n雪丽\t466842\n小灵通号段\t466843\n怀古\t466844\n经典碟\t466845\n8022\t466846\n贱贱\t466847\nvray渲染器\t466848\narccatalog\t466849\n烟熏\t466850\n命理学\t466851\n关联性分析\t466852\nx2+y2\t466853\n点部\t466854\nTense\t466855\n文档\t466856\nspaceship\t466857\n精神食粮\t466858\n国际汽联\t466859\n甲骨\t466860\n首城国际\t466861\nrz\t466862\n84位\t466863\n民事诉讼保全裁定书\t466864\n老帅\t466865\nvoyeurhit\t466866\n20180226\t466867\n云霄县\t466868\n球人\t466869\nENA\t466870\n丧偶式\t466871\n楷林\t466872\nFixer\t466873\n题库\t466874\n21季\t466875\n填充机\t466876\nbcache\t466877\n第44集\t466878\n船宿\t466879\n艦隊\t466880\n鲁班土建\t466881\n定律\t466882\n幼鸽\t466883\n伊藤由奈\t466884\n插件机\t466885\n不正当\t466886\n红薯粥\t466887\ner们\t466888\n钢便桥\t466889\nsqlserver2000\t466890\ninnate\t466891\nkodexplorer\t466892\n铭\t466893\n巧夺天工\t466894\n游戏文\t466895\n城崎\t466896\n一周岁\t466897\nnormaliz\t466898\n南方网娱乐\t466899\n卓博人才网\t466900\n经论\t466901\n杨梅镇\t466902\n巨形\t466903\n共济\t466904\n鹤立鸡群\t466905\n仿古铜\t466906\n玻璃化\t466907\n1.8_\t466908\n亚军\t466909\nvibrate\t466910\n圌山\t466911\nont\t466912\n钱其琛\t466913\n20:00\t466914\n寇氏\t466915\n大熊猫国家公园\t466916\n老一辈人\t466917\nboby\t466918\ne城e家\t466919\n儿化音\t466920\n迎风\t466921\n商用地\t466922\n狼吞虎咽\t466923\n工步\t466924\nadp\t466925\n天堂之路\t466926\n32650\t466927\n爱
的影集\t466928\ncgmxw\t466929\nBros\t466930\n孵化基地\t466931\n第8辑\t466932\n羽球\t466933\n杭锦后旗\t466934\n丝圈\t466935\n伴生\t466936\n美术画\t466937\n徐汉彬\t466938\n本田岬\t466939\n新编\t466940\n污妖\t466941\n爆破片\t466942\n活动单\t466943\n案发现场\t466944\n1238\t466945\n宝应\t466946\n盯防\t466947\ntraveled\t466948\n国泰君安期货\t466949\n上海滴滴\t466950\n禁摩城市\t466951\n纵横家\t466952\n托尔金\t466953\n文学界\t466954\nWorkbook\t466955\n核弹流\t466956\n一分二\t466957\n南泽大介\t466958\n北京东路\t466959\n學園\t466960\n一个三十岁\t466961\n县区\t466962\n远成\t466963\n歌词集\t466964\n毋\t466965\n天涯客\t466966\n当当舞\t466967\n春华\t466968\n资讯_安卓网\t466969\n风机盘管\t466970\n决胜千里\t466971\n抢匪\t466972\n秋花\t466973\nCEX\t466974\n横缝\t466975\nKLA\t466976\n疹子\t466977\n车仁\t466978\n少女漫画网\t466979\n数百万元\t466980\n睡眼惺忪\t466981\n芳纶\t466982\n34P\t466983\n阿里云大学\t466984\n大筒木辉\t466985\n评委\t466986\n松下贴片机\t466987\n练级\t466988\n温泉\t466989\n电声\t466990\n大肠\t466991\n柒零叁网\t466992\n皖南医学院弋矶山医院\t466993\n众泰t300\t466994\n乘法分配律\t466995\n造梦西游2\t466996\n云南省公安厅\t466997\n焦煤集团\t466998\n四川现代职业学院\t466999\n传菜梯\t467000\n葛根粉\t467001\n贬低\t467002\n真假性\t467003\n独墅\t467004\nVN\t467005\n抢友\t467006\nBMPCC\t467007\n泰邦生物\t467008\nVICTORIA\t467009\n杨乃武\t467010\n画子\t467011\n深圳招商银行\t467012\n首保\t467013\n操盘术\t467014\n爱晚\t467015\n&#41\t467016\n西直河\t467017\npdfcreator\t467018\nBox2D\t467019\n7年内\t467020\n麻黄\t467021\n第177集\t467022\n亳房网\t467023\nVHF\t467024\n杀狗\t467025\ncospaly\t467026\nnuccch\t467027\n凿井\t467028\n华地\t467029\n魔动王\t467030\n默写\t467031\n佛说阿弥陀经\t467032\n学长们\t467033\n谢伊\t467034\n江湖\t467035\nhybird\t467036\nworkbench\t467037\n易水\t467038\n氢氧燃料电池\t467039\n加密兔\t467040\n火星异种\t467041\n关于促进全域旅游发展的指导意见\t467042\n爱的抱抱\t467043\n强硬\t467044\n微喇\t467045\n慜\t467046\nx264-SPARKS\t467047\n1.97\t467048\n金沙街道\t467049\n疏通剂\t467050\nT形\t467051\n匹多莫德\t467052\n好屌\t467053\n欧阳张梓琳\t467054\n按摩椅\t467055\n混纺\t467056\n国内生产总值世界排名\t467057\n南钢\t467058\n塔什库尔干\t467059\n桂芳园\t467060\n赈灾\t467061\nSimilar\t467062\n友情岁月\t467063\n上海环盟\t467064\n古蔺县人民政府\t467065\n波斯\t467066\nridiculous\t467067\npicke\t467068\n造价工程师考试\t467069\n泰安市人民政府办公室\t467070\n若爱\t467071\n未完\
t467072\n绝地求生刺激战场PC\t467073\n矛盾\t467074\n2018年上\t467075\nwhatsapp\t467076\n书声\t467077\n80\t467078\nmis\t467079\n洗衣柜\t467080\n拉分\t467081\n比干\t467082\n同房\t467083\n大鹅\t467084\n俄外交部\t467085\n格力公司\t467086\nreportviewer\t467087\n黄果树景区\t467088\nPattaya\t467089\n鲜切花\t467090\n性价\t467091\n随叫随\t467092\n嫖客\t467093\n庸\t467094\n稀释\t467095\nkarton\t467096\nLOC\t467097\n极端分子\t467098\n1月29日\t467099\n越野房车\t467100\n0855\t467101\n金牛区\t467102\n袁耀\t467103\n带型\t467104\n0xc000005\t467105\n淘点点\t467106\n基价\t467107\nflash8\t467108\nPlayGM论坛\t467109\n丼\t467110\n产业\t467111\n重生锦衣夜行\t467112\n文摘报\t467113\n细览\t467114\nitemize\t467115\n柳巷\t467116\n局域网共享文件\t467117\n昕动混沌剑神\t467118\n木芙蓉\t467119\nCannes\t467120\n糖化\t467121\n生活用品\t467122\n忄\t467123\nmartini\t467124\n胡云\t467125\n水果忍者\t467126\navalanche\t467127\n乔四\t467128\n化粧品健康食品ファッションインナ\t467129\n省农业厅\t467130\n弄影\t467131\n制鞋\t467132\n梦江南\t467133\n博鳌会\t467134\nBCD码\t467135\n布仁巴雅尔\t467136\n主唱\t467137\n气象数据网\t467138\n王二妮\t467139\n广东出入境检验检疫局\t467140\nPonyCar\t467141\nIPPC\t467142\n规划许可证\t467143\n转让方\t467144\n捉放曹\t467145\n硬哥\t467146\nEnago\t467147\n插值函数\t467148\n克唑替尼\t467149\n守望英雄\t467150\n绘\t467151\n手机金农网\t467152\n数据泵\t467153\n六里村\t467154\n谭作钧\t467155\n_易安居算命网\t467156\n出谋划策\t467157\n杨征\t467158\nSolidWorks2010\t467159\n斯莱克\t467160\n李红军\t467161\n华哥\t467162\n捞面\t467163\n招标信息网\t467164\n海南政府\t467165\n宝天曼\t467166\n两性花\t467167\nSHIP\t467168\n负载均衡算法\t467169\nunit7\t467170\n双人游戏\t467171\n滑线\t467172\n生字词\t467173\n结构物\t467174\n我的世界史\t467175\n为所欲为\t467176\n听广播\t467177\n魏后凯\t467178\n私募股权\t467179\n重庆地税\t467180\n刘沉香\t467181\n读\t467182\n秘宝\t467183\nnobibi\t467184\nRunners\t467185\n会务费\t467186\n愿君\t467187\n拼插\t467188\nHOUR\t467189\nSql数据库\t467190\njxf\t467191\n亨特\t467192\n编舟记\t467193\n2.7.8\t467194\ndebug\t467195\n奥氮平\t467196\n町\t467197\n各州\t467198\n乐翻天\t467199\n梅县\t467200\ncooperate\t467201\n四会市政府网\t467202\n无端端\t467203\n先锋影\t467204\n广东电网有限责任公司\t467205\n微软浏览器\t467206\nnever\t467207\n锦艺\t467208\n微聊\t467209\n125平方\t467210\nNodeBB\t467211\n黄桷垭\t467212\nVisualS\t467213\n切粒机\t46
7214\n黄岩区\t467215\n眩晕症\t467216\n按方\t467217\n赵林\t467218\n悦木之源\t467219\n劳社\t467220\nvr播放器\t467221\n差分曼彻斯特编码\t467222\n联\t467223\nPURE\t467224\n中华人民共和国发票管理办法\t467225\n重庆江北机场\t467226\n傲世中二病也要谈恋爱\t467227\n质量监督管理局\t467228\nhc360慧聪\t467229\nflm\t467230\n法液\t467231\n蓝桥\t467232\npls\t467233\nBT种子搜索\t467234\n加生\t467235\n北京顺义\t467236\n大明劫\t467237\n巴音布鲁克\t467238\nmammut\t467239\nPayoneer卡\t467240\n24小时\t467241\n泰安道\t467242\n空场\t467243\n宠物疫苗\t467244\nWears\t467245\n仙剑奇侠传七\t467246\n束发\t467247\n报系\t467248\n小米商城\t467249\n五笔打字法\t467250\n吊棚\t467251\nchez\t467252\nactuators\t467253\n鸡肝\t467254\n干湿度\t467255\n大豆蛋白粉\t467256\n宿迁经济开发区\t467257\n死刑犯\t467258\n采伐\t467259\nudb\t467260\nRenext\t467261\n乐山市商业银行\t467262\n消费品展\t467263\n异样\t467264\n一万小时\t467265\n色网\t467266\n你的眼睛\t467267\n馈线\t467268\n三相逆变器\t467269\n和兴路\t467270\n卫星城\t467271\nPowerShell\t467272\n被抓获\t467273\n石臼湖\t467274\nスク\t467275\n低小\t467276\n已逝\t467277\n土力\t467278\n鑫泉\t467279\nEncrypt\t467280\n5.5元\t467281\n仇志\t467282\n5860\t467283\ngene\t467284\np5.js\t467285\nARK\t467286\n陈宁\t467287\n日本海上自卫队\t467288\n梦神\t467289\n录用率\t467290\n汇入\t467291\nSpanish\t467292\niphone4\t467293\n北京耐克\t467294\n荆州职业技术学院\t467295\n新时代智能\t467296\nSwords\t467297\n捉摸\t467298\nV5.1\t467299\n富春股份\t467300\n黄文炳\t467301\n东莞市地方税务局\t467302\n灌顶\t467303\n上大\t467304\nxan\t467305\n平底鞋\t467306\n上海宝贝\t467307\n石竹山\t467308\nfastJson\t467309\n103个\t467310\n方山县\t467311\n安质\t467312\nKenko\t467313\n杨名\t467314\n李渊\t467315\n好叫\t467316\npp板\t467317\n李可\t467318\n红绳\t467319\n新华社\t467320\n虎门站\t467321\n恒峰娱乐\t467322\n180317\t467323\noutputstream\t467324\n远走他乡\t467325\n64bit\t467326\n江泽慧\t467327\n漂亮姐姐\t467328\n花都新华\t467329\n和子\t467330\n胡松华\t467331\n花马\t467332\n彩画\t467333\nDex\t467334\n昂立少儿教育\t467335\nSanDisk\t467336\n济宁国家高新技术产业开发区\t467337\nSonar\t467338\n堡盟\t467339\n中铁二十局集团有限公司\t467340\n4.3\t467341\newm\t467342\n猫仙儿\t467343\n唐伯虎点秋香\t467344\n几千张\t467345\n第三十二届\t467346\n更俗\t467347\n中国邮政储蓄银行\t467348\n杨宇霆\t467349\noptimistic\t467350\nGPS8\t467351\n韦斯安德森\t467352\n夜刃\t467353\n生儿育女\t467354\njpeg200
0\t467355\n偃武\t467356\n手机金融界\t467357\n季季红\t467358\n苹果CarPlay\t467359\n暗巫\t467360\n中欧基金管理有限公司\t467361\n国发办\t467362\n廿八都\t467363\n绣湖\t467364\n宝龙广场\t467365\n83个\t467366\nSHANG\t467367\n26.4\t467368\n就业班\t467369\nkindle5\t467370\n王应麟\t467371\n南通瑞慈医院\t467372\n张二狗\t467373\npc站\t467374\njinja2\t467375\n中央机构编制委员会办公室\t467376\n宠友俱乐部\t467377\n狄拉夫\t467378\n扎古\t467379\nsio\t467380\n飘云阁\t467381\n记牌\t467382\n至尊剑皇\t467383\nvline\t467384\n娇生惯养\t467385\n集结\t467386\nZbar\t467387\n荒岛求生营\t467388\n行纪\t467389\n纽崔莱蛋白粉\t467390\n110路\t467391\n潞河医院\t467392\nCitibank\t467393\n黑水晶\t467394\n毒情\t467395\n占有\t467396\n鱼鱼\t467397\n自定义拦截器\t467398\n10册\t467399\n并且\t467400\nBlackBerry\t467401\n搭窝\t467402\n求职者\t467403\nRepeated\t467404\n复星系\t467405\n马枪\t467406\n大连国家税务局\t467407\n方云华\t467408\n匹配性\t467409\n皇家骑士团\t467410\n红星街道\t467411\n液位测量\t467412\n锴\t467413\n1676\t467414\n翰德\t467415\nNeato\t467416\n吉春亚\t467417\n十速\t467418\n家鸡\t467419\n中山大学\t467420\njavashop\t467421\n钱少\t467422\n竖立\t467423\nFSX\t467424\n2.1.5\t467425\n巴不得\t467426\n浩劫\t467427\n还能\t467428\n乱线\t467429\n荒岛余生\t467430\n全棉\t467431\n九江职业大学\t467432\n太舒服\t467433\n种姓\t467434\n富贵园\t467435\n34.9\t467436\n神探\t467437\n杨倩\t467438\n余占鳌\t467439\n阎魔刀\t467440\n新疆维吾尔自治区地方税务局\t467441\nskype\t467442\n鬼武者\t467443\nRhymes\t467444\n压平\t467445\n218.75\t467446\n进一步\t467447\n伊秀\t467448\n皮子\t467449\n1亿人次\t467450\n拌合站\t467451\n艾黎\t467452\n金甲虫\t467453\n通博\t467454\nc30\t467455\n江西省地方税务局\t467456\n青白江区\t467457\n智恒\t467458\n600893\t467459\n维萨\t467460\n2017年2月21日\t467461\nwindows7系统旗舰版\t467462\nvizaar\t467463\nfic\t467464\n发帖\t467465\nTESV\t467466\nbabel-polyfill\t467467\n300路\t467468\nonenote2013\t467469\n微信端\t467470\n9at\t467471\n昆明市官渡区人民政府\t467472\n高娓娓\t467473\n县市区\t467474\n陶吧\t467475\n顶呱呱\t467476\n粒子群算法\t467477\n碱类\t467478\n暂停\t467479\n长安欧诺S\t467480\n香港西九龙\t467481\n首抽\t467482\n放逐\t467483\n29次\t467484\n陈巍\t467485\n民安小区\t467486\n水曲柳\t467487\n1.95\t467488\n上海市闸北区\t467489\n北京DM67网\t467490\n年轻的小姨子\t467491\n殿试\t467492\n梅花易数\t467493\n沪昆高速\t467494\n日本丰田\t467495\ncocos2dx-js\t46749
6\n三国机密\t467497\n铁鞋\t467498\nBPDU\t467499\n斗智\t467500\nSteins\t467501\n申金\t467502\n挖耳草\t467503\n夏尔米\t467504\n言午\t467505\n光电管\t467506\n马来西亚政府\t467507\n5.8G\t467508\nfleet\t467509\n孔隙\t467510\n书记员\t467511\n石家庄仲裁委员会\t467512\nmarryu\t467513\n000413\t467514\n碳酸乙烯酯\t467515\n应付账款\t467516\nReload\t467517\n幻月飘雪\t467518\n千百度\t467519\nwwwbbb\t467520\n陈秀雯\t467521\n万吨级\t467522\n18公里\t467523\n氨溴索\t467524\n下位法\t467525\n叶子猪剑网3\t467526\n加血\t467527\nBABA\t467528\n穿衣指数\t467529\n发霉\t467530\n奥拉索斯战纪\t467531\n美脚\t467532\n人妻们\t467533\n猪排骨\t467534\n暂居\t467535\n十六张\t467536\n优酷投屏\t467537\n月黑风高\t467538\n焓变\t467539\ndynamic\t467540\ncomo\t467541\n东北大学学报\t467542\n悬垂堡\t467543\n小米mix1\t467544\n马克莱文森\t467545\n彭浦新村\t467546\n梯笼\t467547\n低价\t467548\n青春无悔\t467549\nCSS模板\t467550\nClock\t467551\n中农办\t467552\n老坑\t467553\n价格战\t467554\n世代相传\t467555\n钱脉\t467556\n恒温恒湿试验箱\t467557\n车顶架\t467558\n三峡大学科技学院\t467559\nUDN\t467560\n头孢氨苄胶囊\t467561\n撒母耳记\t467562\n发子\t467563\n北京奥运\t467564\n谢菲\t467565\n2010年2月\t467566\n齐秦\t467567\n访户\t467568\n肘窝\t467569\n韩少功\t467570\n0x00\t467571\n无锡机场\t467572\n竹山村\t467573\n平静\t467574\nrta\t467575\n鹿野\t467576\n日落紫荆\t467577\nRCA\t467578\n医疗界\t467579\n38部\t467580\nLiu\t467581\n新广告法极限词\t467582\n病毒扫描\t467583\n兰花香\t467584\nx60\t467585\nxxx69\t467586\n格尔尼卡\t467587\n婚词\t467588\n23条\t467589\n加拿大安省\t467590\n沙皇尼古拉二世\t467591\ntld\t467592\n谭凯\t467593\n浙江中医药大学附属第三医院\t467594\n中车株洲电力机车有限公司\t467595\n何小萍\t467596\n嵯峨\t467597\n猎豹免费wifi\t467598\n饭锅\t467599\n暴改\t467600\n自带线程池\t467601\n雷神托尔吧\t467602\n濮阳县\t467603\n第69页\t467604\n桃酥\t467605\n信游天下新闻网\t467606\n倾城绝恋\t467607\n积分券\t467608\n转印纸\t467609\n酷炫\t467610\n移动工作站\t467611\n毒圈\t467612\n男频\t467613\n蒙特卡洛模拟\t467614\n春宵苦短\t467615\n袖型\t467616\n千速\t467617\n九溪烟树\t467618\n会声会影x7\t467619\n112\t467620\nhmc\t467621\n上海仁济医院\t467622\n外国语言学及应用语言学\t467623\n146号\t467624\n868\t467625\n瑶族舞曲\t467626\ng1610\t467627\n椎名真白\t467628\n熊峰\t467629\n人力资源处\t467630\n十万亩\t467631\n睿御\t467632\n落樱\t467633\n钨粉\t467634\nCC2016\t467635\n漫少女漫画大全\t467636\nwin10分区\t467637\n马明仁\t467638\n51广场舞蹈网\t467639\
nBondage\t467640\nhawking\t467641\n通天大道\t467642\n殷志源\t467643\nfitting\t467644\n星数\t467645\n轿子\t467646\n盐城机场\t467647\nMVC5+EF6\t467648\n省管\t467649\n百转\t467650\njQuery选择器\t467651\n要道\t467652\n最后一滴水\t467653\n朱毛\t467654\n好大\t467655\nldd\t467656\n审级\t467657\n天正公司\t467658\n耐卡影音论坛\t467659\n帧间差分法\t467660\n大羊\t467661\n主_\t467662\n招徕\t467663\n怀柔雁栖店\t467664\n康特\t467665\n招教\t467666\n20161216\t467667\nucos\t467668\n认识角\t467669\n火令\t467670\n载人\t467671\n迈克老狼\t467672\n搪\t467673\n美女子\t467674\nxOy\t467675\nApowersoft录屏王\t467676\nlld\t467677\n两杯\t467678\n圣洁莓\t467679\n宣传曲\t467680\nsccnn\t467681\n酸奶牛\t467682\n地道战\t467683\n伟易达\t467684\n新世纪\t467685\n信托业\t467686\n重重\t467687\nXB1\t467688\nsultan\t467689\n2738\t467690\n月见\t467691\nInformer\t467692\n韩子萱\t467693\n散茶\t467694\n江玉燕\t467695\n校园篇\t467696\n王莹莹\t467697\n辛丑\t467698\nEB病毒\t467699\n重医大附一院\t467700\n王方\t467701\nClown\t467702\n特别任务\t467703\nfiit\t467704\nRelation\t467705\n风亭\t467706\n书斋\t467707\n李郎\t467708\n肉洞\t467709\n持身\t467710\n玫瑰木\t467711\nSR5\t467712\n单整\t467713\n潇十一郎\t467714\n吵吵\t467715\n迎刃而解\t467716\n长石\t467717\n基金号\t467718\n鸡肠\t467719\n普洱茶砖\t467720\n马林斯基\t467721\n外呼\t467722\n军种\t467723\nmax9\t467724\n再热\t467725\n循迹\t467726\n5段\t467727\n神形\t467728\n天下鸽\t467729\n南明区\t467730\n杨明辉\t467731\n薄皮\t467732\n7999元\t467733\n3517\t467734\nH5\t467735\n什么味\t467736\n划分表\t467737\n4148\t467738\n鱼尾纹\t467739\n沧州新闻网\t467740\nBAUER\t467741\n雾峰\t467742\n白熊\t467743\na320m\t467744\nYoko\t467745\n义字\t467746\nsql2008数据库\t467747\n50W\t467748\n细米\t467749\n静好\t467750\n简简单单\t467751\n99级\t467752\nZBOOK\t467753\nslr\t467754\n华尖山\t467755\n弗里曼\t467756\n掉跟\t467757\n科技厅\t467758\n民事审判指导与参考\t467759\n都丽\t467760\nlingoes\t467761\n病情\t467762\n安哥\t467763\n重庆有线\t467764\n盛世\t467765\nYBT\t467766\n江南中学\t467767\nai音响\t467768\n福建人事考试网\t467769\n抱着我\t467770\ndnf地下城与勇士\t467771\n刚正不阿\t467772\n登记处\t467773\n靳怀瑾\t467774\n热血翼虎\t467775\n175厘米\t467776\n21分钟\t467777\n严朝君\t467778\n王立群\t467779\n一对一个\t467780\nscopes\t467781\n购回\t467782\nscotch\t467783\n百度魔图\t467784\nˇ\t467785\neventlog\t467786
\n合作制\t467787\n真空烘箱\t467788\n冊\t467789\n无锡大剧院\t467790\n聚维酮\t467791\n妄想族\t467792\n三朵云\t467793\n做事先做人\t467794\n商混站\t467795\n普利兹克建筑奖\t467796\ntsconfig\t467797\n拉血\t467798\nKnewOne\t467799\nOMO\t467800\n天保驾校\t467801\n法润江苏网\t467802\n上坡路\t467803\n泰国航空\t467804\n陶阳\t467805\n老妹\t467806\n腰椎盘突出\t467807\n知死\t467808\nzhs16gbk\t467809\n金银\t467810\nsqlite数据库\t467811\n华育中学\t467812\n歌仔戏\t467813\n亿田\t467814\n首尔甜城\t467815\n信息框\t467816\n姐姐好饿\t467817\nmakeblock\t467818\n美誉\t467819\n宏润\t467820\nvulkan\t467821\n无花果树\t467822\n多特电影网\t467823\n崇祯\t467824\nadv2\t467825\n整性\t467826\n板桥新城\t467827\nactio\t467828\n糖体\t467829\n一夜的工作\t467830\n剑灵吧\t467831\n20140515\t467832\nq6\t467833\n四库\t467834\n水阳镇\t467835\n星星灯\t467836\n奈特\t467837\nIXUS\t467838\n鄱湖\t467839\n生态平衡\t467840\n药客\t467841\n一二八\t467842\n嫂子的职业\t467843\npptp\t467844\n含量表\t467845\n斯皮尔伯格\t467846\n极简史\t467847\n宏府\t467848\n多元回归分析\t467849\n扑克脸\t467850\n荣国府\t467851\n长处\t467852\npasv\t467853\n母生\t467854\n情相悦\t467855\nBruker\t467856\n入住率\t467857\nslightkis\t467858\n115浏览器\t467859\n老曹\t467860\n蓝天豚硅藻泥\t467861\n永不放弃\t467862\n绿城兰园\t467863\n20%\t467864\ndb1\t467865\n合川\t467866\n蓝科\t467867\n田文昌\t467868\n二届\t467869\nextjs3\t467870\nStructural\t467871\n汤姆·福特\t467872\n电话线\t467873\n第36集\t467874\n暗神\t467875\n随机端口\t467876\n虚空幻\t467877\n胃幽门螺旋杆菌\t467878\nbeen\t467879\n兖\t467880\nerrorlevel\t467881\n17倍\t467882\n_华股\t467883\n魔族\t467884\n神舟七号\t467885\n福建省农村信用社\t467886\n陈思琪\t467887\n张建民\t467888\n金龙湾\t467889\n收录机\t467890\nLying\t467891\nlpad\t467892\nweb模板\t467893\n湖州经济技术开发区\t467894\n老边区\t467895\n搅局\t467896\n53张\t467897\n改查\t467898\n开园\t467899\n魏善庄镇\t467900\n产证\t467901\nBreathing\t467902\nbridged\t467903\n幻想\t467904\n格鲁\t467905\n基底\t467906\n坏账率\t467907\nCattle\t467908\n小四郎\t467909\n课间操\t467910\n杏儿\t467911\n奥迪a5\t467912\n领好\t467913\n粉圆\t467914\nentered\t467915\n銀行\t467916\ncloudflare\t467917\n溯溪鞋\t467918\n三都水族自治县\t467919\nWEB-INF\t467920\ngood影片网\t467921\n一击男\t467922\n冰结师\t467923\n燕\t467924\n聚合物电池\t467925\n郁金香种球\t467926\n三层交换机\t467927\n本振\t467928\n心理特征\t467929\n脑壳\t4
67930\n96,日\t467931\n干事者\t467932\n豆角网\t467933\n短期贷款\t467934\n八宝盒\t467935\n破月\t467936\n奶牛关\t467937\n由来的故事\t467938\n超过1年\t467939\n论资排辈\t467940\nddt\t467941\n网易我的世界手机版\t467942\n有限体积法\t467943\n玉枕\t467944\n从头越\t467945\n津\t467946\n资生堂\t467947\nadv3\t467948\n泰隆\t467949\n尼康D7500\t467950\nloaf\t467951\n精辟\t467952\n重要\t467953\n下关沱茶\t467954\n英语版\t467955\n此景\t467956\n软件工程概论\t467957\n派力肯\t467958\n771\t467959\n脑经急转弯\t467960\n36小时\t467961\nStargate\t467962\n震旦\t467963\n爆冷\t467964\n许昌日报\t467965\n职业杀手\t467966\n塑像\t467967\nJoinQuant\t467968\nop.52pk.com\t467969\n7.35惩戒\t467970\nobfs\t467971\n上海市法院\t467972\n招标\t467973\n存在性\t467974\n183个\t467975\n轻舞\t467976\n勾缝剂\t467977\n115VIP\t467978\n小恐龙阿贡\t467979\n生成器\t467980\n北大金融学\t467981\n1624\t467982\nMarketWatch\t467983\nYoutube\t467984\n小米枪战\t467985\n江淮和悦\t467986\n田径项目\t467987\nReusable\t467988\nSamples\t467989\n吊码\t467990\n戏剧化\t467991\n流动式\t467992\npatti\t467993\n湘菜\t467994\n锐之旗\t467995\n第4关\t467996\n炖蛋\t467997\n星辉石\t467998\n沈阳市质量技术监督局\t467999\naptX\t468000\n5.5英寸\t468001\nLightmap\t468002\n代答\t468003\n电影文学\t468004\nEarl\t468005\n评级币\t468006\n佐世保\t468007\n水沢真树\t468008\n高本\t468009\ncrumb\t468010\n观点\t468011\n中琅\t468012\n村庄\t468013\ndirectory\t468014\n买花\t468015\neasing\t468016\n259号\t468017\n老许\t468018\n邓波\t468019\n除草\t468020\n远创\t468021\nSean\t468022\n冬至日\t468023\nGBA4\t468024\n议题\t468025\n余昌国\t468026\nnowrap\t468027\nAAR\t468028\n花妖\t468029\nro仙境传说\t468030\n东方体育中心\t468031\n海信商城\t468032\nHans\t468033\n编外合同制\t468034\n金科城\t468035\nWayOS\t468036\nexpro\t468037\nXidian\t468038\n37个\t468039\n三十公分\t468040\n成都市小学\t468041\nexpansion\t468042\n拓实\t468043\nDIV层叠\t468044\n康缘药业\t468045\nKKTV\t468046\n佩玉\t468047\n180分\t468048\n45家\t468049\n南昌工程学院\t468050\n联想y470\t468051\nmophie\t468052\n密藏\t468053\n非编\t468054\n沙雅县\t468055\n中加\t468056\nidfa\t468057\n遗韵\t468058\nrnb\t468059\nCeramics\t468060\n远野\t468061\n爹爹\t468062\n长江街\t468063\nsigmoid函数\t468064\n转铁\t468065\nntkrn\t468066\n搭户\t468067\n千兆无线路由器\t468068\ncm0304\t468069\n七叶洋地黄双苷滴眼液\t468070\nurban\t468071\n密门\t
468072\n消息管理器\t468073\nROHM\t468074\nQuébec\t468075\npcos\t468076\nQA\t468077\n本校\t468078\n20180318\t468079\n人文类\t468080\n我不上你的当\t468081\nHefei\t468082\n印度\t468083\n真空乳化机\t468084\n中危\t468085\n5.8元\t468086\n三星a8000\t468087\n经验丹\t468088\n狼子\t468089\n爱沙尼亚\t468090\nb350\t468091\n来宝网\t468092\nword分节符\t468093\n乐天酒店\t468094\n加法交换律\t468095\n中协\t468096\nyinger\t468097\n自驾游_周边游\t468098\n汪一光\t468099\n拥抱太阳的月亮\t468100\n35平方\t468101\n星评\t468102\n大帝姬\t468103\n电锯惊魂3\t468104\n星照\t468105\n神谷姬\t468106\nmsgs\t468107\n美国心脏协会\t468108\nNavigate\t468109\n容人\t468110\n用料\t468111\n雪松路\t468112\n夏可可\t468113\n自信\t468114\n张玉珊\t468115\n三星村\t468116\n垫片\t468117\n乌拉特前旗\t468118\nVenezuela\t468119\n何兴丽\t468120\n县医院\t468121\n妈咪爱婴网\t468122\nscared\t468123\n2018-04-01\t468124\nint64\t468125\nomron\t468126\n容错性\t468127\n鹿晗迪丽热巴\t468128\n曲沃\t468129\n溪洛渡\t468130\n走量\t468131\n第十宫\t468132\nkoumm\t468133\n高腰裙\t468134\n20150504\t468135\nAC\t468136\n马马虎虎\t468137\n善存片\t468138\n2000辆\t468139\n蓝村路\t468140\n长棍\t468141\n编组站\t468142\n亚洲鲤鱼\t468143\nvml\t468144\n台式钻床\t468145\n乡绅\t468146\n胜利社区\t468147\n全域化\t468148\nFirewalking\t468149\n第1轮\t468150\n魔兽战网\t468151\n下传\t468152\n5288\t468153\n内瑟斯\t468154\n选级\t468155\n五队\t468156\n四卡\t468157\n膜力\t468158\n明光市人民政府\t468159\n上饶市人大常委会\t468160\nTPS\t468161\n云创\t468162\n流计算\t468163\n弱弱\t468164\nbristol\t468165\nEUROPAGES\t468166\n23.8\t468167\n教育权\t468168\n虾皮网\t468169\n瓦基弗银行\t468170\n天河潭\t468171\nwo99.net\t468172\n网商路\t468173\n卡斯商学院\t468174\n贺由爱\t468175\n三军仪仗队\t468176\nERP-U8\t468177\n甜宝\t468178\nParallels\t468179\n直播流\t468180\nlightroom\t468181\n国防教育日\t468182\n横笛\t468183\n单柱\t468184\n风之旅人\t468185\n李清\t468186\n税务登记证\t468187\nShall\t468188\n知学\t468189\nModule\t468190\n小额贷\t468191\n调理剂\t468192\n客餐厅\t468193\n注塑部\t468194\n浣熊君\t468195\nv_\t468196\n佳人\t468197\n开创集团\t468198\n茶树菇\t468199\n实木颗粒板\t468200\n烤扇贝\t468201\njabi\t468202\napplic\t468203\n指南村\t468204\n小棍\t468205\nkcptun\t468206\n圣约村\t468207\n非静态方法\t468208\n混沌名图\t468209\nSWAROVSKI\t468210\n现钱\t468211\n艺珍\t468212\n城市郊区\t468213\n放牛\t468214\
n卫健委\t468215\n小鱼缸\t468216\nArcSoft\t468217\n国立大学\t468218\nperceptual\t468219\n洋甘菊\t468220\n吴飞\t468221\n53级\t468222\n复制权\t468223\n度阴山\t468224\n江宁街道\t468225\n300多元\t468226\n政宗\t468227\n海河教育园\t468228\nJR史密斯\t468229\n南方医科大\t468230\n中青网\t468231\nff14\t468232\n中国医科大学网络教育学院\t468233\nCHIGO\t468234\n未来世界\t468235\nFold\t468236\nCAPTURE\t468237\n白箱\t468238\n高晓松\t468239\n测力\t468240\n3万8\t468241\n七系\t468242\n一枝一叶总关情\t468243\n双文\t468244\n江苏省组织机构代码管理中心\t468245\n613\t468246\n树液\t468247\n美国高中\t468248\n水泄不通\t468249\n张江国家自主创新示范区\t468250\n扩大会议\t468251\n1020plus-ZOL\t468252\n2016年6月4日\t468253\n品色\t468254\n恶徒\t468255\n肛窦炎\t468256\n消去\t468257\n上海市物理业余学校\t468258\nBelgian\t468259\n古诗集\t468260\n异俗\t468261\n玻钻\t468262\n20180316\t468263\n4月3日\t468264\n78岁\t468265\n西半球\t468266\nriad\t468267\n头人\t468268\n宜昌市政府\t468269\n404.3\t468270\n二十六部\t468271\nsuselinux\t468272\n张建中\t468273\n燕国\t468274\n烹\t468275\n薇莉亚\t468276\n大智度论\t468277\n珍珠白\t468278\n春香传\t468279\n1080p|720p高清BT迅雷\t468280\n1.8版\t468281\n无锡银行\t468282\n马新\t468283\n国贼\t468284\n赛斯纳\t468285\n一条根\t468286\n主奴\t468287\nT梁\t468288\n蓝尾\t468289\n6000W\t468290\n失落的世界\t468291\n舒美\t468292\n辉星\t468293\nVolks\t468294\nword2003\t468295\n信号屏蔽器\t468296\n一帐\t468297\n主峰\t468298\n崔智友\t468299\n撒贝邓稼先\t468300\nPágina\t468301\n26关\t468302\n妊娠囊\t468303\n杨晨晨\t468304\n亡刃\t468305\n马庄\t468306\n伯朗特\t468307\n打评\t468308\n虎门销烟\t468309\n小野纱里奈\t468310\n车易拍\t468311\n梭菌\t468312\n大不列颠\t468313\n移仓\t468314\n李万君\t468315\n6.15\t468316\n指环\t468317\nNBA|新浪NBA|新浪竞技风暴_新浪网\t468318\n张源\t468319\n魏杰\t468320\n捷通华声\t468321\n2013年6月\t468322\n耳返\t468323\n财务与会计\t468324\n长河街道\t468325\n英文转换器\t468326\n齐州\t468327\n奥卡兹岛\t468328\n传世\t468329\n法度\t468330\nlib文件\t468331\n宋建勇\t468332\n耗用\t468333\n50003\t468334\n出路\t468335\nGeorgia\t468336\nSPI接口\t468337\n水浒\t468338\n浙江大学本科生院\t468339\nWestern\t468340\n11款\t468341\n安全工程专业\t468342\n纸\t468343\n扬州时报\t468344\n前盟\t468345\n非处女\t468346\n频率分布直方图\t468347\nportal\t468348\n对称度\t468349\n阐教\t468350\n中大奖\t468351\n郑琳\t468352\n呵呵\t468353\n郑州大学第一附属医院\t468354\nInterference\t
468355\n稀客\t468356\n与朱元思书\t468357\n三折页\t468358\n小叔\t468359\n杂合子\t468360\n宝莫股份\t468361\n禅师\t468362\ngenesis2000\t468363\n王永春\t468364\n大鲨鱼\t468365\nmappers\t468366\nwr886\t468367\n何东东\t468368\n依恋\t468369\n克莱因\t468370\n中小型\t468371\n红星中学\t468372\n落霞与孤鹜齐飞\t468373\n专资\t468374\nwww.88106.com/api/baidu\t468375\n沉金\t468376\n奥运村\t468377\nApproaches\t468378\n胸闷\t468379\n太史公自序\t468380\n云南师大附小\t468381\n最低\t468382\n李春江\t468383\n光明之魂2\t468384\n超人总动员2\t468385\n内置式\t468386\n油库\t468387\n新闻战线\t468388\nstirling\t468389\n物流员\t468390\n2015.1\t468391\nKuber\t468392\n证言\t468393\n600900\t468394\n坠\t468395\n亿企惠税\t468396\n惨重\t468397\n北京电信\t468398\n性能化\t468399\n开白\t468400\n色甘酸钠滴眼液\t468401\n教师报\t468402\n郭建华\t468403\n战狼二\t468404\nGBM\t468405\n沿江高速\t468406\n腌制\t468407\n_鹏\t468408\n处理液\t468409\n快乐舞步\t468410\n咖啡兔\t468411\n中华人民共和国海关总署\t468412\nget\t468413\nalles\t468414\n乌金石\t468415\n东线\t468416\n郡士\t468417\npromega\t468418\n中国科学院物理研究所\t468419\ncove\t468420\n丽星邮轮\t468421\n罗汉松\t468422\nguangxi\t468423\nWidow\t468424\n厦门人事网\t468425\n第7周\t468426\n新茶网\t468427\n万恶淫\t468428\nfont-size\t468429\n东霖\t468430\n吴速玲\t468431\n说出你的故事\t468432\nRetina屏\t468433\nf187\t468434\n编制表\t468435\n张坤\t468436\ntablewidget\t468437\n导流板\t468438\n沈阳百姓网\t468439\n管涛\t468440\n俱疲\t468441\nWindows域\t468442\nM70\t468443\n黄蕙兰\t468444\n快乐生活一点通\t468445\n巴黎迪士尼\t468446\nSAGA\t468447\n斋饭\t468448\n建设工程项目管理\t468449\n日本伊藤洋华堂公司\t468450\n析构\t468451\n鬼女\t468452\n吉士\t468453\n经济师\t468454\n民航社区\t468455\n阿布\t468456\n超级力量2\t468457\n助唱\t468458\n剑侠启悦\t468459\nVSTO\t468460\nopenbsd\t468461\n基里巴斯\t468462\n人间四月天\t468463\n无理数\t468464\n店長\t468465\n使命召唤online\t468466\n谭义娟\t468467\n存款利率\t468468\n浇灭\t468469\n土城\t468470\n曹妃甸新区\t468471\n博士服\t468472\n附议\t468473\nSchwarzkopf\t468474\n皖西学院\t468475\nascii码表\t468476\n13亿美元\t468477\n杀入\t468478\n一抹\t468479\n泡鸭爪\t468480\n欧也妮\t468481\n宋书\t468482\n能见度\t468483\n龙熹山\t468484\nCHO细胞\t468485\n悦儿\t468486\n池音慕寒卿\t468487\n上海黄金\t468488\nMillis\t468489\n同步器\t468490\nBoss\t468491\n浸湿\t468492\n李艺\t468493\n太兴餐厅\t468494\n福熙\t468495\n73%\t
468496\n正\t468497\n球帝\t468498\nvon\t468499\n金谷\t468500\n雷斌\t468501\n不同行\t468502\nrosdep\t468503\n安阳市人民医院\t468504\n调音\t468505\n续断\t468506\n摩贝百科\t468507\n传染科\t468508\n12月16日\t468509\n大蜀山\t468510\n金山逍遥Xoyo.com\t468511\ntimesten\t468512\n首乌\t468513\nmssqlserver\t468514\nled显示屏\t468515\n镁光灯\t468516\n香港东路\t468517\nSentiment\t468518\n60年后\t468519\n九芝堂\t468520\n海琴\t468521\n120回\t468522\n2018.3.12\t468523\n李春晓\t468524\n辅助\t468525\n直过\t468526\n第一季度\t468527\nTreeList\t468528\n3线\t468529\n3F\t468530\n韩毅\t468531\n威纶\t468532\n原来你还在这里\t468533\n1k\t468534\nWifi信号差\t468535\n绝地战兵\t468536\n限价盘\t468537\nagu\t468538\n彩化\t468539\n中国电信集团公司\t468540\n旋转木马\t468541\n标准银行\t468542\n沙北\t468543\n菜叶\t468544\n真空炉\t468545\n四月六日\t468546\n昆仑奴\t468547\n海陵岛\t468548\n死亡空间3\t468549\n林茜\t468550\n刮痧疗法\t468551\n经济舱\t468552\nexcel服务器\t468553\n道教符咒\t468554\n潍坊国家高新技术产业开发区\t468555\n柑橘苗\t468556\n美区\t468557\n精打\t468558\n周制\t468559\n广西招生考试院\t468560\n湖南省人民检察院\t468561\ndab\t468562\n孤单\t468563\n靠实\t468564\nHDX\t468565\n空包弹\t468566\n天龙众泰t600\t468567\n陈寅\t468568\n数据流图\t468569\n12350\t468570\n兔仙\t468571\n银环蛇\t468572\n掩\t468573\ntransparent\t468574\n订订\t468575\n纹石\t468576\n辉盛\t468577\n结婚日记\t468578\nVESTA\t468579\nelment\t468580\nresolution\t468581\n扩瞳\t468582\n绍兴19楼\t468583\n腾讯电脑\t468584\n海洋法\t468585\n诸界末日在线\t468586\n徙步\t468587\n四届二次\t468588\nS30\t468589\nV1.8\t468590\n九龙寨城\t468591\n汤姆\t468592\nTang\t468593\nlingual\t468594\n同岁\t468595\ndirectories\t468596\n测孕纸\t468597\nzbrush2018\t468598\n37.4度\t468599\nPentatonix\t468600\n几公斤\t468601\n四川省人民政府\t468602\nan94\t468603\npassenger\t468604\ngrind\t468605\n色気\t468606\n高人\t468607\n射血\t468608\n独一无二\t468609\n不在编\t468610\n中共中央编译局\t468611\n倜傥\t468612\n慕容复\t468613\n九卷\t468614\n比驿\t468615\n零室\t468616\nTomcat8\t468617\n独占欲\t468618\n无性\t468619\n多路复用器\t468620\n邓晓峰\t468621\nmilkcocoa\t468622\n影视学院\t468623\n滨海园区\t468624\n荷马史诗\t468625\nAUTOCAD\t468626\n数码暴龙机\t468627\nAdding\t468628\n潘安湖\t468629\n发量\t468630\n周旭\t468631\n钣金工\t468632\n軌跡\t468633\n英科\t468634\n网维\t468635\n乌龟蛋\t468636\nKindle阅读器\
t468637\n中華航空公司\t468638\n经联\t468639\n100万件\t468640\n荆棘鸟\t468641\n_途\t468642\n山西省物价局\t468643\n以岭药业\t468644\n软考\t468645\n压模地坪\t468646\n文段\t468647\n房室\t468648\nSpecies\t468649\n第73章\t468650\n梦幻宝马3系\t468651\n时空荣威550\t468652\n祛痘膏\t468653\n一个秒\t468654\nToOne\t468655\n四居\t468656\nmt\t468657\n变频调速电机\t468658\n虞大明\t468659\nu3d\t468660\n地海传说\t468661\n新光大中心\t468662\n赞许\t468663\n方腊\t468664\n变鬼\t468665\nKygo\t468666\n睡员\t468667\n954\t468668\npiano\t468669\nHRC\t468670\n电网技术\t468671\nv700\t468672\n冒险岛五转\t468673\n该法\t468674\n湖北省人民政府研究室\t468675\nSlidingMenu\t468676\nsetcookie\t468677\n顶赞\t468678\nSSCI\t468679\n实蝇\t468680\n甘肃卫生职业学院\t468681\n直租\t468682\n圆模\t468683\n郑国锋\t468684\n招贤纳才\t468685\n紫日\t468686\narmor\t468687\nSnack\t468688\n模糊控制器\t468689\n微生物肥\t468690\n工伤医疗补助金\t468691\n插即\t468692\n长子\t468693\n斜杠\t468694\n水煮鸡胸肉\t468695\n波分\t468696\n骏派D60\t468697\n脚本精灵\t468698\n李局长\t468699\n浦城县\t468700\nStephy\t468701\n跑题\t468702\n300M\t468703\n琅琊榜2风起长林\t468704\nSpringDataJPA\t468705\n小胖\t468706\ne卡\t468707\n哈里\t468708\n齐鸣\t468709\n四轴飞行器\t468710\n常熟零距离\t468711\n齐鲁电影网\t468712\npeculiar\t468713\n化装\t468714\n0870\t468715\n卡兰\t468716\n双梦\t468717\nCartoon\t468718\nX-H1\t468719\n蓬莱岛\t468720\n搪瓷\t468721\n转角窗\t468722\n喉镜检查\t468723\n5天左右\t468724\n保定市政府\t468725\n字木\t468726\nゃ\t468727\nwalker\t468728\n2.1米\t468729\nTea\t468730\npdk\t468731\n南京东路街道\t468732\n泰仕\t468733\n霓裳羽衣\t468734\n介电\t468735\n_形\t468736\ngb28181\t468737\n绝刀\t468738\n27_\t468739\n网络法\t468740\n550ml\t468741\n泰妹\t468742\n【神盾局\t468743\n_酒店\t468744\nwuwei\t468745\n李奎\t468746\n奏折\t468747\n喜剧总动员\t468748\n用子\t468749\n禁带\t468750\n站用\t468751\n唐纳德\t468752\n慈星股份\t468753\n钎料\t468754\n31寸\t468755\nnetca\t468756\n公鸡\t468757\n315投诉网\t468758\nPT950\t468759\niOS6\t468760\n斯皮仁诺\t468761\nhdmi\t468762\n2018年4月4号\t468763\nmorphological\t468764\n255.255.255.255\t468765\n开元度假村\t468766\nNv\t468767\nM132nw\t468768\n悬浮术\t468769\n杨小愚\t468770\n正手\t468771\nfiling\t468772\n弥散性\t468773\n海虹控股\t468774\nvisualize\t468775\n梵大集团\t468776\nk3cloud\t468777\n青青草\t468778\nValkyr
ie\t468779\n青岛地铁1号线\t468780\ns2线\t468781\n为盼\t468782\n关东\t468783\npull\t468784\n沈莹\t468785\n京东商智\t468786\njoseph\t468787\n电子稿\t468788\n毕业论文检测\t468789\n3.0usb\t468790\nudi\t468791\n橡胶软接头\t468792\nwww15yccmm\t468793\n1918年\t468794\n红通通\t468795\nGuarantee\t468796\npretty\t468797\n丑数\t468798\n茜色\t468799\nSeller\t468800\n动漫城\t468801\n射了撸\t468802\n五矿资本\t468803\n上界\t468804\n0代\t468805\n对外人\t468806\n中国华服日\t468807\n蝙蝠侠阿甘骑士\t468808\n伴走者\t468809\n饮水机\t468810\n第12章\t468811\n杜邦分析法\t468812\n喻体\t468813\n英标\t468814\n3d2016\t468815\n如斯\t468816\n115元\t468817\n门户网站\t468818\n江苏省赣榆高级中学\t468819\n均数\t468820\n上海源叶生物科技有限公司\t468821\n溶血性贫血\t468822\nHANNOVER\t468823\n危害性\t468824\n抓斗\t468825\nMonetary\t468826\n贝复新\t468827\n惯\t468828\nnewbie\t468829\nMasonry\t468830\n快易捷用药\t468831\n步进电机驱动器\t468832\n维语版\t468833\n低通滤波\t468834\n出场\t468835\n放门\t468836\n白条券\t468837\n朕\t468838\n帆软报表\t468839\nx1-x2\t468840\n17CE\t468841\n逃匿\t468842\n明细单\t468843\n20161208\t468844\nM225\t468845\n唐映枫\t468846\n在线科学计算器-开平方计算-在线数学计算器\t468847\n刀狗\t468848\n4圈\t468849\n藤艺\t468850\n网规\t468851\n未闻\t468852\n22所\t468853\n楚乔\t468854\n戴眼镜\t468855\n哈尔施塔特\t468856\nGARNiDELiA\t468857\n闪存门\t468858\n第五十二\t468859\n百乐眠胶囊\t468860\n口癖\t468861\n永善县\t468862\n应经\t468863\n天荒\t468864\n王晓光\t468865\n2018-03-24\t468866\n推辞\t468867\n玉泉小学\t468868\nTa们\t468869\n达克罗\t468870\njob=more02&q=&bx\t468871\n陈咏\t468872\n独战\t468873\nAhn\t468874\n爱之切\t468875\nBT种子在线编辑器\t468876\nJane\t468877\n曹刚\t468878\n恐惧\t468879\n汉贼\t468880\n叶炫\t468881\n6002\t468882\n文豪野犬\t468883\n白带多\t468884\ndatagirdview\t468885\n挂壁公路\t468886\n性心\t468887\n云影\t468888\namide\t468889\n芜湖鸠江区\t468890\n下管\t468891\n视通\t468892\n蚬子\t468893\nLynn\t468894\n塞伦盖蒂\t468895\n巴陵石化\t468896\n一个51\t468897\n小野\t468898\nNoomlor\t468899\n犊牛\t468900\n硫化铁\t468901\n孟庆斌\t468902\n长嘉\t468903\n1、2号\t468904\n38194963\t468905\nsau\t468906\n一工\t468907\n七道\t468908\n美妙\t468909\nreminder\t468910\n火锅桌\t468911\nsre\t468912\nAPNs\t468913\n鹌鹑蛋\t468914\n全球摄影网\t468915\n新编大学德语\t468916\n潜伏2\t468917\n昆顿\t468918\n包装费\t468919\n美度吧\t468920\
n发薪日\t468921\nregistering\t468922\n20181月\t468923\n剑魂之刃\t468924\n续住\t468925\n欢呼雀跃\t468926\n5亿元\t468927\noppor11splus\t468928\np7b\t468929\n抚摸\t468930\n夹金山\t468931\n中海凤凰熙岸\t468932\n动迁户\t468933\nbbb\t468934\n艳惊\t468935\n三刀流\t468936\nCIA\t468937\n控球\t468938\n近三年\t468939\n幻灯\t468940\n3.03\t468941\n蓝玉\t468942\nDIM\t468943\n柴犬\t468944\n河南教育厅\t468945\n伊斯顿\t468946\n朗读\t468947\n工作人员\t468948\n第11轮\t468949\n庆阳市\t468950\n铁匠\t468951\n还记\t468952\n文艺部\t468953\n斐纳\t468954\nservice\t468955\n3720qm\t468956\n匀速\t468957\n东昌府区\t468958\n1.2l\t468959\n3票\t468960\n邓斌\t468961\n福州日报\t468962\n军娘\t468963\n埃尔多安\t468964\n齐鲁理工学院\t468965\nclipse\t468966\n推平\t468967\n一个日\t468968\n叶兰\t468969\n速门\t468970\n澳门威尼斯\t468971\nDownloading\t468972\n富士智能\t468973\n高架段\t468974\nエロマンガ\t468975\n少年读史记\t468976\niVMS\t468977\n申诉状\t468978\n白旗\t468979\n1百万\t468980\napiCloud\t468981\nFeiLin\t468982\n萌芽期\t468983\n冰糖雪梨\t468984\nV2.0.3\t468985\n一个20\t468986\n茶宠\t468987\n220马力\t468988\n注塑厂\t468989\n刀锋下的替身\t468990\n睽违\t468991\n0105318\t468992\n间宫\t468993\nKeenWon\t468994\n股票期权交易\t468995\n鸟类\t468996\n中国人民银行法\t468997\nsystem函数\t468998\nWiggle\t468999\n奔小康\t469000\ncpg\t469001\n628\t469002\n接班人\t469003\n洁牙\t469004\n禽兽超人\t469005\n编队\t469006\n嫖资\t469007\n祖坟\t469008\n增值税专用\t469009\n不可胜\t469010\nCaption\t469011\nServerless\t469012\nzenge\t469013\n里口山\t469014\n酷鹰\t469015\n悦翔v3\t469016\n易事\t469017\n县政法委\t469018\n邓婕\t469019\nireland\t469020\nSQL分组\t469021\n部部\t469022\n天行\t469023\n遵化\t469024\nestimation\t469025\n霰弹枪\t469026\n惠安女\t469027\n林头镇\t469028\n潘辰\t469029\n威仪\t469030\nagreat\t469031\n8寸\t469032\n九型人格网\t469033\n20平\t469034\n中伦律师事务所\t469035\n日企\t469036\n一粒红尘\t469037\n冷案\t469038\n王亚军\t469039\n书札\t469040\nAccess软件网\t469041\npost\t469042\n知心朋友\t469043\n直道\t469044\n妙品\t469045\n福泉市人民政府\t469046\nuploaded|nitroflare\t469047\n巨量\t469048\n任红昌\t469049\n整晚\t469050\n错点\t469051\nrootfs\t469052\n4月14\t469053\n传力\t469054\nbalanced\t469055\n滨海湾\t469056\n保时泰\t469057\n研讨\t469058\npinyin4j\t469059\n石家庄市商务局\t469060\n中国共青团\t469061\n高星\t469062\nLinde\t
469063\n王姓\t469064\n毕业论文.doc\t469065\n伊利乳业\t469066\nfido\t469067\n岗子\t469068\n空堡\t469069\n5整除\t469070\n叶枫\t469071\n维修性\t469072\n好景\t469073\n第6篇\t469074\n萝莉在线\t469075\n武汉绿地中心\t469076\n灵感库\t469077\n沈冰布兰妮\t469078\n估值\t469079\nc型\t469080\n荷兹\t469081\ntabpanel\t469082\n二四\t469083\n个人理财规划\t469084\n贝微微\t469085\n导致\t469086\n冷血\t469087\n东港区\t469088\n巴德\t469089\nosb\t469090\n李晶\t469091\n珈黛\t469092\n山东财政学院\t469093\nBroker\t469094\n入门\t469095\n金宣儿\t469096\n潘刚\t469097\n宫娥\t469098\n盲棋\t469099\n红雀\t469100\n天一广场\t469101\n倾向\t469102\n5w-40\t469103\n一隻\t469104\n气垫bb\t469105\n路边上\t469106\n探究型\t469107\nShellExecute\t469108\nClimber\t469109\n1431\t469110\n603348\t469111\n低分子肝素\t469112\nredash\t469113\n疟疾\t469114\n上海市水务局\t469115\n15R\t469116\n蛇哥\t469117\nbuffers\t469118\n未央湖\t469119\n开多叶调节阀\t469120\n光伏电站\t469121\n陷井\t469122\n鲁迪\t469123\n厂值\t469124\n中国文学网\t469125\n崩坏学园2\t469126\n杰恩\t469127\n山河故人\t469128\n960x720\t469129\nRACING\t469130\n半盏\t469131\n基金会\t469132\n120kw\t469133\nwps工具栏\t469134\nucloud\t469135\n太极禅\t469136\n30颗\t469137\n皮球\t469138\nrosemary\t469139\nOOPS\t469140\n作画\t469141\n小米小爱音箱mini\t469142\nxp1024.com\t469143\n上海建设银行\t469144\n中共阜阳市委\t469145\n新事\t469146\n方屏\t469147\n张国荣\t469148\nDog\t469149\n紫灵\t469150\n7s\t469151\nviewed\t469152\n夺帅\t469153\n广州租房网\t469154\n重庆市黔江区人民政府\t469155\n阿涛\t469156\n烧烫伤\t469157\n明书\t469158\n协和式\t469159\nCustom\t469160\n全球塑胶网\t469161\n宋明理学\t469162\nStoring\t469163\nclair\t469164\n滋滋滋\t469165\nhpv疫苗\t469166\n真菌\t469167\nmerging\t469168\n心里有数\t469169\nstram\t469170\n社会经济效益\t469171\n适用研究\t469172\n工银融e联\t469173\n长鼓\t469174\n26型\t469175\n荣成市\t469176\n红太狼\t469177\n银箭\t469178\n1安\t469179\n999次\t469180\nGogoing\t469181\n切条机\t469182\n谢红\t469183\n2204\t469184\n放大版\t469185\ne康悦\t469186\n晋国\t469187\n鬃狮蜥\t469188\n灭掉\t469189\n竹蛏\t469190\n6868\t469191\n源静香\t469192\n红腹\t469193\n泰日天\t469194\n打雷\t469195\n德里克罗斯\t469196\n三星a5000\t469197\nu12\t469198\n75赫兹\t469199\n韩星网\t469200\n132路\t469201\n特马\t469202\n杭州客运中心\t469203\n探究\t469204\n银角大王\t469205\n恒大天府半岛\t469206\nuc6\t469207
\n脫\t469208\n2379\t469209\n银弹\t469210\n李玉华\t469211\n正宗\t469212\nOnePlus\t469213\nIncluding\t469214\nHeaders\t469215\n鐘\t469216\ncorder\t469217\nVoyeurHit\t469218\n吕克贝松\t469219\n中安信业\t469220\n飞往\t469221\n浴衣\t469222\n钢琴弹\t469223\n疏浚\t469224\n30后\t469225\ncelebrities\t469226\n大疆DJI\t469227\n娘妻\t469228\n詳細\t469229\n雷克萨斯gs\t469230\nSC\t469231\nPS格式\t469232\n酸奶发酵剂\t469233\n瘴气\t469234\n神乐坂惠\t469235\n触摸屏幕\t469236\n追本溯源\t469237\n参赛队\t469238\n远程机\t469239\n小阳春\t469240\n机箱\t469241\n中戏北电\t469242\n白寨镇\t469243\nabsent\t469244\n了然\t469245\n德梅隆\t469246\n标志服\t469247\n08年\t469248\n两次铃\t469249\n顾道\t469250\n港企\t469251\n玉乳\t469252\n9070\t469253\n半格\t469254\nas109\t469255\n浓缩器\t469256\nCFE\t469257\n阳关\t469258\nwinclone\t469259\n刷宝\t469260\n全称号\t469261\n简体字体\t469262\n鸿博股份\t469263\nyomail\t469264\n锦桐\t469265\nCeremony\t469266\n萨尔\t469267\n英雄帖\t469268\nchima\t469269\n第31集\t469270\n_幼儿园新闻_无忧幼儿园\t469271\n莫负\t469272\n大冒\t469273\nMonolithic\t469274\n艺术学院\t469275\n卡尔·蔡司\t469276\nアットウィキ\t469277\nCTL\t469278\n氯丁橡胶\t469279\n边家村\t469280\n湖南科技大学\t469281\n新套\t469282\n吴晓光\t469283\n校办\t469284\n美旅\t469285\n邮箱\t469286\n考驾\t469287\n靓化\t469288\nFinish\t469289\n终产物\t469290\n镇远古镇\t469291\n小金\t469292\nenum枚举\t469293\n325\t469294\n二居\t469295\n原作版\t469296\n冻雨\t469297\n丁彦雨\t469298\n80064225\t469299\n多多岛\t469300\n欧阳倩\t469301\n兴辉\t469302\n桃城\t469303\njava抽象类\t469304\ncourts\t469305\n20150626\t469306\n莘莘\t469307\nfreecad\t469308\n田鑫\t469309\nCOS1\t469310\nXMP\t469311\n特舒服\t469312\n2.5方\t469313\n人民路小学\t469314\n庄稼汉\t469315\nS9306\t469316\n利奈唑胺\t469317\niPhone6plus\t469318\n高尔夫练习场\t469319\nvoyage\t469320\n游久游戏\t469321\nCoventry\t469322\n鸣佐\t469323\n起始价\t469324\n幼鼠\t469325\nVIDEO\t469326\n北京北大方正电子有限公司\t469327\n染谷将太\t469328\n王红艳\t469329\n自定义变量\t469330\n發現\t469331\n流沙\t469332\n畅销图\t469333\n崇华中医馆\t469334\n甲兵\t469335\nDreamweaver8\t469336\n中科院物理研究所\t469337\n2500W\t469338\n楠溪江\t469339\nb350m-a\t469340\n颞下颌关节紊乱综合征\t469341\n这两点\t469342\n王俭\t469343\n12.5cm\t469344\n正午\t469345\n道谢\t469346\n佣金率\t469347\nbca\t469348\n狗鱼\t469349\nSE
NSE\t469350\n四大银行\t469351\n本元\t469352\n化玻\t469353\n绝缘件\t469354\n古墓丽影:源起之战百度云\t469355\n环绕声\t469356\namq\t469357\nObjectives\t469358\n矮胖\t469359\n不怕死\t469360\n丁琳\t469361\n第七十四章\t469362\nefi分区\t469363\nNSDate\t469364\n开心乐园\t469365\n633\t469366\n中国神华\t469367\n番禺\t469368\n角钢\t469369\n一7\t469370\n两样\t469371\n痛吻\t469372\n长鼻\t469373\nblowjob\t469374\nLicensing\t469375\n卡尔文\t469376\n王志超\t469377\n罗茜·汉丁顿\t469378\n50500\t469379\n平带\t469380\n擦机\t469381\n兰卡\t469382\n融创河滨之城\t469383\n去皱\t469384\n下羚羊谷\t469385\nxdat\t469386\nUi\t469387\n校\t469388\n厦门会展中心\t469389\ncommodity\t469390\nSource\t469391\notaku\t469392\n书法兰亭奖\t469393\n凯隐\t469394\nkaf\t469395\ncheatsheet\t469396\n真空阀\t469397\n绝地求生大逃杀\t469398\n鸟巢蕨\t469399\n山东省省\t469400\n纳兹\t469401\ntam\t469402\n知县网\t469403\n大青云\t469404\n十三步\t469405\n胶化\t469406\n微信公总号\t469407\n无人汽车\t469408\n地狱神探\t469409\n灯具\t469410\n新品种\t469411\n恒昌汇财\t469412\n试\t469413\n百吨\t469414\nSoftly\t469415\n迪丽冷巴\t469416\n致我希灵帝国\t469417\n白波\t469418\n好恶心\t469419\n刑诉\t469420\nnipples\t469421\n40国\t469422\n武松醉\t469423\n草洞\t469424\n西门子变频器\t469425\nmillennium\t469426\n3b\t469427\n淄博市委\t469428\n原题\t469429\n新闻学专业\t469430\njunneyang\t469431\ndesigner13\t469432\n慕课手记\t469433\n质的飞跃\t469434\n山大威海\t469435\n美影\t469436\n头孢克肟颗粒\t469437\n东盟十国\t469438\n回水管\t469439\n创业英雄汇\t469440\n魄力\t469441\n张慧雯\t469442\nLogit\t469443\n俄州\t469444\n何锐\t469445\n咥\t469446\n鸡类\t469447\nraster\t469448\n邬氏\t469449\n8一8\t469450\n银宝山\t469451\n国金黄金股份有限公司\t469452\nwrt\t469453\n2017年1月17日\t469454\n炼钢\t469455\nV2.6.2\t469456\n复合体\t469457\n猫扑汽车\t469458\n不动产权证\t469459\n共同诉讼\t469460\n声情并茂\t469461\n邂逅\t469462\n第150章\t469463\n柠檬酵素\t469464\nSnooping\t469465\n最低调\t469466\n家用电路\t469467\n桃形\t469468\n10月7日\t469469\nsuburb\t469470\n先头部队\t469471\n瘘管\t469472\n蛇鞭粉胶囊\t469473\n8欧姆\t469474\nUSDA\t469475\n人亊考试网\t469476\n200M\t469477\n北京科技大学图书馆\t469478\n比企谷八幡\t469479\n电动平移门\t469480\nConservation\t469481\n改嫁\t469482\n多句\t469483\n古堰画乡\t469484\n177cm\t469485\n和好朋友\t469486\n伦铝\t469487\n郭沁\t469488\n文蛤\t469489\n时代广场\t469490\n性素\t469491\n占\t4
69492\n佳吉\t469493\nLiaoning\t469494\n启信\t469495\n秋老虎\t469496\n未利用\t469497\n雷曼兄弟\t469498\n1365\t469499\nalbums\t469500\n申诉人\t469501\n津滨发展\t469502\nWebUploader\t469503\n金允珍\t469504\n灰石材\t469505\n劳死\t469506\n马家堡街道\t469507\nrosserial\t469508\n天章\t469509\n最低工资标准2015\t469510\n星期6\t469511\n五菱之光\t469512\nPeter\t469513\nalanwalker\t469514\n解款\t469515\nv3.5\t469516\n敲\t469517\n商贷转公积金贷款\t469518\n天翼4G\t469519\n中铁三局\t469520\n屯昌\t469521\n500错误\t469522\n8月初\t469523\nCommunity\t469524\n美利\t469525\n拖网\t469526\nmoby\t469527\n杜珺\t469528\n用户体验设计师\t469529\n杨鸣\t469530\n哈药股份\t469531\n2n+1\t469532\n古\t469533\n水警\t469534\n敲敲\t469535\nQPixmap\t469536\n长春广播电视台\t469537\npcb电路板\t469538\n_V1.0\t469539\n双扇\t469540\n数值\t469541\n第45集\t469542\nCHD\t469543\n远程教育网\t469544\n三星S3\t469545\ndecreasing\t469546\nxplay\t469547\n汤原县\t469548\n桂园街道\t469549\n凯伦股份\t469550\n阿嚏\t469551\ntensorflow吧_\t469552\n杭网议事厅\t469553\n何惧\t469554\n卡文迪什\t469555\n天彭镇\t469556\n辣椒酱\t469557\n上杆\t469558\n炉架\t469559\n银行间同业拆借\t469560\ntaking\t469561\ntks\t469562\n季铵\t469563\n中国医学科学院肿瘤医院肿瘤研究所\t469564\n眉飞\t469565\n两桥\t469566\n合肥百货\t469567\n加国\t469568\nMSBuild\t469569\n灭种\t469570\n芒康县\t469571\n米斯\t469572\n1869年\t469573\n恶意\t469574\n易吉他版\t469575\nforrtl\t469576\n刘雨潼\t469577\n李伟健\t469578\n察右前旗\t469579\n吉芬\t469580\nSaber\t469581\n协议盒\t469582\n榕榕\t469583\n冷天\t469584\n自洁\t469585\n甲午年\t469586\n传世镇魔录\t469587\n小额信贷\t469588\n闪电贷\t469589\n金田一少年事件簿\t469590\nsafari\t469591\npch\t469592\nMXF\t469593\n刘二\t469594\n兰州网\t469595\n赵乐际韩正\t469596\nbranded\t469597\ngt630\t469598\nteg\t469599\n传真件\t469600\n海沧\t469601\npost接口\t469602\n郎咸仓央嘉措\t469603\n2810\t469604\n四联\t469605\n卷闸\t469606\n吃请\t469607\n李恩\t469608\n激情网\t469609\n快销\t469610\n特鲁姆普\t469611\n绿果网\t469612\nPS游戏\t469613\n养天\t469614\n自动控制\t469615\n婉芳\t469616\n编程作图-气象家园_气象人\t469617\n抗美援朝战争\t469618\n大庆市\t469619\n高杉\t469620\n近意词\t469621\n利用率\t469622\n顶管\t469623\n奥飞\t469624\n函告\t469625\n质子\t469626\n黄帽子\t469627\n迎泽公园\t469628\nspringaop\t469629\n欧米奇\t469630\n月明\t469631\n谨慎\t469632\n纵火\t469633\n侠盗\t469634\nTaipei\t
469635\n先例\t469636\ntransactional\t469637\n韩震\t469638\n致富经\t469639\n日耳曼人\t469640\n盛茂林\t469641\n7.62mm\t469642\nCopyright\t469643\n杜十娘\t469644\n虎丘山\t469645\n4g显存\t469646\n乐凡\t469647\n克里斯韦伯\t469648\n86期\t469649\n高级\t469650\n圣商集团\t469651\n杭州市崇文实验学校\t469652\n孜孜不倦\t469653\njsp页\t469654\n椰树集团\t469655\n职业道\t469656\n中国科学院文献情报中心\t469657\n32项\t469658\nvcode\t469659\nmergedragons\t469660\n巨细胞病毒\t469661\n顾道霸道校草宠溺爱\t469662\n荣字\t469663\nwhoever\t469664\n2018年5月18日\t469665\nSweater\t469666\n卡拉辛斯基\t469667\n金泰亨\t469668\n预判\t469669\n流行榜\t469670\n中化泉州石化有限公司\t469671\n187B\t469672\n闲人马大姐\t469673\nEvan\t469674\nSolidWorks-三维网\t469675\n早就\t469676\n九分\t469677\n班风\t469678\nyantai\t469679\n金格\t469680\n苯\t469681\n方瑶莹\t469682\ndiablo3\t469683\nexcel除法函数\t469684\n中心城\t469685\ngxworks2\t469686\nNowGood\t469687\narcgis10\t469688\n3.6.1\t469689\n阳光\t469690\n80cm\t469691\n集训队\t469692\n期货交易所\t469693\n1.2.8\t469694\n佳豪鑫\t469695\n累死\t469696\n徐志雷\t469697\n暴瘦\t469698\n一键秒\t469699\n危险因素\t469700\n八三年\t469701\n欢乐球球\t469702\n64.tar.gz\t469703\n婚外\t469704\n真实感\t469705\n西门豹治邺\t469706\nomf\t469707\n炉石传说冒险\t469708\n21期\t469709\n洛客\t469710\n万珂\t469711\n刀白凤\t469712\n清城区\t469713\n第十二套\t469714\n佳沛\t469715\n无锡安镇\t469716\n坠感\t469717\n青岛联通\t469718\n纸牌屋第四季\t469719\n零售展\t469720\n而胜于蓝\t469721\n2声\t469722\nx5l\t469723\n画中人\t469724\n碧源\t469725\n西北风\t469726\n李南\t469727\n集成电路\t469728\n财富港\t469729\n唐朝时期\t469730\n休闲椅\t469731\n沈阳装修公司\t469732\n豆芽\t469733\n武名\t469734\nskinned\t469735\n3000只\t469736\n蜂皇\t469737\nCIGS\t469738\n防恐\t469739\nhin\t469740\n极富\t469741\n藤浦\t469742\nHelp\t469743\n15亿元\t469744\n金茂地产\t469745\n卤素灯\t469746\n博看网\t469747\n食品代理网\t469748\n漓江出版社\t469749\n巴豆\t469750\nv5.2.3\t469751\ngelatin\t469752\n首汽租车_首汽\t469753\n南宁市国土资源局\t469754\n吸血猎手\t469755\nLM\t469756\n1.3.10\t469757\n神秘学\t469758\n深圳市人力资源和社会保障局\t469759\n幸福起航\t469760\n龙袍\t469761\n2018.4月\t469762\n桃源街道\t469763\nFirefox_浏\t469764\n箫剑\t469765\ndune\t469766\nAdd\t469767\n规划处\t469768\n活着\t469769\n朱炳仁\t469770\n贵州省发展和改革委员会\t469771\n教学游戏\t469772\n复合函数\t469773\n血白蛋白\t4
69774\n塞尔维亚队\t469775\ndogg\t469776\n维果茨基\t469777\n空间\t469778\n凤凰文化\t469779\n夫子庙\t469780\nnetcore\t469781\nEventBus\t469782\nAE模板\t469783\n老公大人么么哒\t469784\n芭蕉鱼\t469785\n公立\t469786\n周忠\t469787\n北京市总工会\t469788\n耀华\t469789\n绗缝\t469790\ncrossentropy\t469791\n关宝慧\t469792\n47军\t469793\n珍稀野生动物\t469794\n首席\t469795\n奔驰gle400\t469796\nGTA6\t469797\nmlf\t469798\n水晶鞋\t469799\n赵家\t469800\n苹果苹方\t469801\n喳\t469802\n黄包车\t469803\n鱼贯而出\t469804\n修订案\t469805\n振动分析仪\t469806\nsideshow吧\t469807\nQQ头像大全\t469808\n言情\t469809\n芒山\t469810\n热乎乎\t469811\nPIVOT\t469812\n商街\t469813\n小蓝经济技术开发区\t469814\ncreditor\t469815\n双岗\t469816\n深圳北理莫斯科大学\t469817\n3.1.7\t469818\n宫心计\t469819\n2.11.0\t469820\n己内酯\t469821\n德拉诺之王\t469822\nr200\t469823\n焚\t469824\n60路\t469825\n本杰明·巴顿\t469826\n刘俊杰\t469827\n圧\t469828\n重庆办公室\t469829\nsolarbe索\t469830\n右美沙芬\t469831\n4局\t469832\n公仓鼠\t469833\n射柳\t469834\n光伏公路\t469835\n腺病毒\t469836\n第十八章\t469837\n摩托艇\t469838\n损耗\t469839\naav\t469840\nsurfa\t469841\n增效\t469842\n5000平\t469843\n防火玻璃\t469844\nexploded\t469845\n10月2日\t469846\n盖奇\t469847\nBloomington\t469848\n王淑娟\t469849\n赛睿西伯利亚\t469850\nfm2017\t469851\n12磅\t469852\nOPP\t469853\n危急时刻\t469854\n正二十面体\t469855\n德温特\t469856\n2017年06月19日\t469857\n中国石油石化\t469858\n发招\t469859\nPCCW\t469860\n大艺网\t469861\n头油\t469862\n软通动力信息技术(集团)有限公司\t469863\nJpg\t469864\n王玉锁\t469865\n阿瓦山寨\t469866\n斗争性\t469867\n光盘刻录软件\t469868\n西澳\t469869\nedifice\t469870\n25只\t469871\n萨拉\t469872\n烧坏\t469873\n正联\t469874\n双修\t469875\n曹静\t469876\ninput文本框\t469877\n上海浦东软件园\t469878\nUnityGUI\t469879\n龙胜各族自治县人民政府\t469880\nDas\t469881\n移位机\t469882\n麒麟658\t469883\n军委科技委\t469884\ntanks\t469885\n最后的人\t469886\n五河士道\t469887\n玄夜\t469888\n天高地厚\t469889\n服部\t469890\n两版\t469891\n优贝乐\t469892\n炙\t469893\n诞生\t469894\n退队\t469895\n子画\t469896\n君联资本\t469897\n再见青春\t469898\n50千克\t469899\nDZ论坛\t469900\n永州火车站\t469901\n小马快跑\t469902\n葵司\t469903\n箓\t469904\n波板糖\t469905\n吸色\t469906\n快贷还款\t469907\nOnlyGirls\t469908\n时间煮雨\t469909\n6216\t469910\n大豆粉\t469911\nJupyter\t469912\n摇光\t469913\n东古\t469914\ndist\t46991
5\n艾希曼\t469916\n张海洋\t469917\n放在心上\t469918\nsacd\t469919\n暖夏\t469920\nIam\t469921\n普及型\t469922\n北京世纪城\t469923\n郭政鸿\t469924\nsometing\t469925\n山东司法警官职业学院\t469926\n货\t469927\nmipi接口\t469928\n和平南路\t469929\n主宾国\t469930\n陈势安\t469931\n直勾勾\t469932\nsul\t469933\n2285\t469934\n刘琼\t469935\nnfs服务器\t469936\nmanhua\t469937\n即时战略游戏\t469938\n裹胸\t469939\n西南医科大学\t469940\n济南市工商局\t469941\nstm\t469942\n旋挖钻孔灌注桩\t469943\nnumerous\t469944\n寅子\t469945\n刘薇\t469946\n西洋剑\t469947\n放歌\t469948\n完美赤壁吧\t469949\ndotaimba\t469950\n固定队\t469951\n劲版\t469952\n陕西省环境保护厅\t469953\ninjury\t469954\n班导\t469955\n跳入\t469956\n云南白药\t469957\n神界3:原罪\t469958\n各们\t469959\nDominion\t469960\n百合婚礼社区\t469961\n输注\t469962\n振源\t469963\n环氧粉末防腐钢管\t469964\nwuli\t469965\n绝对值\t469966\n荣耀战魂吧\t469967\n巴巴多斯\t469968\n簡單\t469969\n云发\t469970\n延吉机场\t469971\nyum源\t469972\n扶贫济困\t469973\nAEMedia\t469974\nwanggd\t469975\n三国曹操传\t469976\n传华\t469977\n借壳\t469978\nwebsockets\t469979\n东移\t469980\n曼森\t469981\ngear\t469982\n赤炎\t469983\n中晚\t469984\ncorrugated\t469985\n虞爱华\t469986\nwithu\t469987\n摄魄\t469988\nbrowsing\t469989\n核危机\t469990\n每小时\t469991\nshengy\t469992\n科士威\t469993\nxquartz\t469994\n熊孩\t469995\n好好过\t469996\n自我批评\t469997\n走弱\t469998\nDonald\t469999\n国民收入\t470000\n传祺GA4\t470001\nSpringBoot\t470002\ntav\t470003\n红掌\t470004\n勃起\t470005\n翻案\t470006\n蔡天新\t470007\n大城西\t470008\n海空\t470009\n红烧排骨\t470010\n宫欧\t470011\n范媛媛\t470012\n魔兽真三\t470013\n常州高新区政府网\t470014\nALERT\t470015\n负债率\t470016\nWhale\t470017\n尾插\t470018\n337调查\t470019\n旭\t470020\nnahimic\t470021\n巴主席\t470022\n绿色食品\t470023\n弗雷迪\t470024\n微弧\t470025\n联系\t470026\n湖北省社会科学院\t470027\n快船\t470028\n驱鼠\t470029\n绿网\t470030\n七阶\t470031\n10.12\t470032\n稀薄\t470033\n6千元\t470034\n第160集\t470035\nMontessori\t470036\n山西省经济和信息化委员会\t470037\n行不行\t470038\nkindergarten\t470039\nsdq\t470040\n澳门新濠影汇\t470041\n州际\t470042\n百度云直链\t470043\n蔡甸区政府\t470044\n春竹\t470045\n茶厂\t470046\n12328\t470047\n张行\t470048\n忍痛割爱\t470049\nJAY\t470050\n悠木碧\t470051\n632亿\t470052\n二圣\t470053\n陈皮\t470054\n不_\t470055\n中国人民大学继续教育学院\t470056\n经营案
\t470057\n坐飞\t470058\n专利电子申请网\t470059\n加油枪\t470060\n伊人成人综合网\t470061\n黄子稻\t470062\n官僚主义\t470063\n富丽家园\t470064\n胰腺\t470065\nteclast\t470066\n摆好\t470067\n台塑集团\t470068\n交响乐版\t470069\n巨魂\t470070\nffs\t470071\n国别\t470072\n负离子吹风机\t470073\n担当善\t470074\n相思泪\t470075\n济群\t470076\nlinphone\t470077\n中国科普网\t470078\n名木\t470079\n83式\t470080\n半寸\t470081\n吸料机\t470082\n海垦\t470083\n五十铃\t470084\n吴妈\t470085\n东园镇\t470086\n牛顿定律\t470087\n国文\t470088\n月行\t470089\nsommer\t470090\n速食\t470091\n衍射峰\t470092\n黄鳝\t470093\n湖北省黄冈中学\t470094\n两高\t470095\n同权\t470096\n行话\t470097\nXhamster\t470098\n文理分科\t470099\nWOW\t470100\n招商证券股份有限公司\t470101\n三沙市人民政府\t470102\n名誉权纠纷\t470103\n绝地枪王\t470104\n土豆视频播放器\t470105\n办病\t470106\nFWD\t470107\n从艺\t470108\nroll币\t470109\n焦炉\t470110\n回溯算法\t470111\n雄楚大街\t470112\n奥多姆\t470113\n堤岸\t470114\n汇龙湾\t470115\n北京吉普\t470116\n卸甲\t470117\n十二个小时\t470118\n守候\t470119\n3千亿\t470120\nkap\t470121\n锦绣\t470122\nmaintaining\t470123\n发球\t470124\n徐仁国\t470125\n四维度\t470126\n24万吨\t470127\nGVC\t470128\n坏孩子的天空\t470129\nHawaii\t470130\nweb版\t470131\n沼土\t470132\n35分钟左右\t470133\n蔷薇之恋\t470134\n德山\t470135\nDriving\t470136\n真草千字文\t470137\nz6\t470138\n林业局\t470139\n上海电气\t470140\nmagician\t470141\n差化积\t470142\n年会\t470143\nfrontal\t470144\n威海市人力资源和社会保障局\t470145\n群峰\t470146\n550D\t470147\n抗日之铁血智将\t470148\nSLF4J\t470149\n入不敷出\t470150\n八猴渲染器\t470151\n世子妃\t470152\n0927\t470153\ncountifs函数\t470154\n一次性医疗补助金\t470155\nBSc\t470156\n无极剑圣\t470157\n黑社会2:以和为贵\t470158\n沈阳兴隆大奥莱\t470159\n江西省科技厅\t470160\n陆家镇\t470161\n社会秩序\t470162\n詹妮\t470163\n13份\t470164\n京享街\t470165\n颦\t470166\n杰西卡琼斯\t470167\n超人前传\t470168\n5560\t470169\n爱问\t470170\n送教\t470171\n大众尚酷\t470172\n唯怡\t470173\n花片\t470174\n荣亲\t470175\n大秦帝国\t470176\nmod|模拟人生4\t470177\nFLYING\t470178\nvc2005\t470179\n汉川论坛\t470180\n古诗句\t470181\n密语\t470182\nZery\t470183\n飞水\t470184\n60日线\t470185\n下风\t470186\naftereffects\t470187\n朽木露琪亚\t470188\n叶寸心\t470189\nirr\t470190\npyside\t470191\n河南\t470192\n探诊\t470193\nmount\t470194\n北京郊区\t470195\nPillows\t470196\n下午三点\t470197\n马上\t470198\nVeterans\
t470199\n蝴蝶公墓\t470200\n波贝\t470201\n身分\t470202\nv2.13\t470203\n开明\t470204\n高河\t470205\n庆元县政府网\t470206\n周广仁\t470207\n饕餮\t470208\n疟\t470209\n复杂\t470210\n兴寿镇\t470211\n12道\t470212\nfip\t470213\n韓國\t470214\n小米AI\t470215\n国际单位\t470216\n中国冶金地质总局\t470217\n江南城\t470218\n上海科学技术职业学院\t470219\nS2线\t470220\n_央\t470221\n海蚌\t470222\n钟凯\t470223\n脱出\t470224\n无损播放器\t470225\n沈健\t470226\nElm\t470227\n冀中能源\t470228\n揉眼睛\t470229\nedius6\t470230\ni386\t470231\n牛杖\t470232\napache+tomcat\t470233\n财色\t470234\nBean\t470235\n刘煜辉\t470236\n春菜\t470237\n皇后大学\t470238\nufs2.1\t470239\n宣武门\t470240\n马丽娟\t470241\nwindwos7\t470242\n一个钟\t470243\n泰山街道\t470244\n恋爱的犀牛\t470245\n色力\t470246\n春娇与志明\t470247\nAXA\t470248\n十九大全面从严治党战略部署\t470249\n第10\t470250\n金税\t470251\n龟派气功\t470252\n雪雁\t470253\n意大利米兰\t470254\n煤矿安全规程\t470255\n秒拒\t470256\nece\t470257\n8.1.2\t470258\n妖刀姬\t470259\n珠海港\t470260\n2sls\t470261\ndean\t470262\n两种人\t470263\n2K\t470264\n20140101\t470265\n李文科\t470266\n白块\t470267\nMiss\t470268\n帕萨特b5\t470269\n玩家们\t470270\ntaian\t470271\n右安门\t470272\n驱风\t470273\n十三张\t470274\n李傲\t470275\n大雁山\t470276\n冻干粉针剂\t470277\n一万五千\t470278\n光通信\t470279\n内酰胺\t470280\n千层浪\t470281\n飘忽\t470282\nsixteen\t470283\n报告期\t470284\nSPP\t470285\nCamps\t470286\n操作技能\t470287\n航天信息江苏有限公司\t470288\nforscan\t470289\n农民们\t470290\n晋绥军\t470291\n摆饰\t470292\n暴走团\t470293\n老城\t470294\n費\t470295\n嘉兴北站\t470296\n钝化液\t470297\n100多块\t470298\n石煤\t470299\n茉莉蜜茶\t470300\n刺梨酒\t470301\nerp\t470302\nrender函数\t470303\n北清\t470304\nYoun\t470305\nnear\t470306\n一式二份\t470307\n海盐县\t470308\n1.5.16\t470309\n察言观色\t470310\nWin7系统封装\t470311\n写错\t470312\nTOP4\t470313\n罗意威\t470314\nchembiodraw\t470315\n红A\t470316\n团队\t470317\n姜昆\t470318\n艾迪\t470319\n屋盖\t470320\ncsmar\t470321\n郑州师范学院\t470322\nFaith\t470323\n奥体\t470324\n雏鹰杯\t470325\n失意\t470326\n医药园\t470327\n软世通\t470328\n乐高蝙蝠侠3\t470329\n客片\t470330\nSkate\t470331\n4.1%\t470332\n房心\t470333\n名录\t470334\n板桩\t470335\nInfernal\t470336\n体肤\t470337\n虹越\t470338\n北京三快科技有限公司\t470339\n明年7月\t470340\n股份制\t470341\nAISS\t470342\n化敌为\t470343\n滤嘴\t4
70344\n南宁晚报多媒体数字报\t470345\n鸟鸣\t470346\n风波庄\t470347\n天柱县\t470348\n33分\t470349\n利亚德\t470350\n狗儿\t470351\n野生稻\t470352\n美容美体网\t470353\n20170423\t470354\n开塞\t470355\nbool值\t470356\n0.01秒\t470357\n贪污案\t470358\n横山克\t470359\n贝斯论坛\t470360\n党建设\t470361\n智语\t470362\n80年代以来\t470363\n余姚路\t470364\n湛江开发区\t470365\n打住\t470366\n牧马湖\t470367\n远翔\t470368\n青菜\t470369\n842路\t470370\n王甫荣\t470371\n废纸\t470372\n醴\t470373\n汽车站\t470374\n凤纹\t470375\n白学\t470376\n传祺杀神\t470377\n独食\t470378\n孕前期\t470379\n110mm\t470380\n英诺天使基金\t470381\n三种\t470382\n媚色\t470383\n满园春\t470384\n账面净值\t470385\n京津冀协同\t470386\n赤手\t470387\n联组\t470388\n装分\t470389\n朴正熙\t470390\n欧路词典\t470391\n诗刊\t470392\n巴马香猪\t470393\n冰淇凌\t470394\n文件架\t470395\n超细纤维\t470396\n皮毛\t470397\n灵魄\t470398\nDirectAdmin\t470399\n阳光学院\t470400\n蛇山\t470401\n搞笑话剧\t470402\n游离脂肪酸\t470403\n气瓶柜\t470404\ndatetime\t470405\n大麦町犬\t470406\n央视七套\t470407\n熔断机制\t470408\n电脑盘\t470409\nRob\t470410\n柳冠中\t470411\n杂修\t470412\n异路\t470413\n氧化钴\t470414\n222个\t470415\n光幕\t470416\n同轴式\t470417\nlongitude\t470418\n铜豌豆\t470419\nTwitter\t470420\n绒毛\t470421\n14.6%\t470422\n普庵咒\t470423\n那年那兔那些事儿\t470424\n北方交通大学\t470425\n泰谷\t470426\n王登峰\t470427\n忍法\t470428\n[理\t470429\n东华门街道\t470430\n盘扣架\t470431\n示众\t470432\n卓有成效\t470433\n邬思道\t470434\ntightvnc\t470435\n香港北角\t470436\n姚剧\t470437\n玖龙\t470438\nHeFei\t470439\n北京投资管理公司\t470440\niphone3gs\t470441\nspreadjs\t470442\n格局商学院\t470443\n加贴\t470444\n小木桥路\t470445\n方天画戟\t470446\n公关\t470447\n弹簧刀\t470448\n3216\t470449\n逼问\t470450\n卡刀\t470451\n金生水\t470452\n缸子\t470453\nABC类\t470454\nMaersk\t470455\n碎心石\t470456\n旧书\t470457\n方缘\t470458\nshift+\t470459\n结构性思维\t470460\n零售价\t470461\nLoose\t470462\n合肥市六安路小学\t470463\n胡杨林\t470464\nPoets\t470465\n肖亚庆\t470466\n金平区\t470467\npsCS6\t470468\n俩点\t470469\nchose\t470470\nlhj588\t470471\n味苦\t470472\n没点\t470473\n姚凯\t470474\n破事\t470475\n条码打印机\t470476\n熟语\t470477\nSEC\t470478\nEnd\t470479\naffected_rows\t470480\n1130\t470481\n牙齿酸\t470482\n高邮市\t470483\n两百万\t470484\n社工\t470485\n心宝\t470486\n厳格ク\t470487\n如皋市人民政府\t470488\n松鹤园\t470489\n老船\
t470490\n5.3.1\t470491\n赵一特朗普\t470492\n淅川\t470493\n爱奇艺帮助中心\t470494\n亚德\t470495\nshixi\t470496\n北京居然之家\t470497\n林轩田\t470498\n黑节\t470499\n波尔津吉斯\t470500\n奋发有为\t470501\n035期\t470502\n稿源\t470503\n膏方\t470504\n像源\t470505\n上海闵彬管业有限公司\t470506\n字源\t470507\nmcdonald\t470508\n点子\t470509\n河南农业大学\t470510\n汇源果汁\t470511\n微重力\t470512\n好年华说说网\t470513\n裤衩子\t470514\nstudiofow\t470515\nkkhd电影网\t470516\n江苏科技\t470517\n暖炉\t470518\n孤鸾\t470519\n富江\t470520\n万能包\t470521\nvie\t470522\n2011-02-02\t470523\n比特\t470524\nfame\t470525\n哈医大四院\t470526\n残次品\t470527\n达摩祖师传\t470528\nNutty\t470529\n忍者龙剑传2\t470530\n教研版\t470531\nUBS\t470532\n浣碧\t470533\n本报\t470534\n工布江达县\t470535\nftdi\t470536\n落尘小说网\t470537\nXE2\t470538\n曾培淦\t470539\n博览园\t470540\n长沙市人民政府\t470541\n育险\t470542\nDarius\t470543\n师生员工\t470544\n肥猪\t470545\n赵鹏\t470546\nincubator\t470547\n树蛙\t470548\n黑怕\t470549\n乔木\t470550\n核卡\t470551\n钟书\t470552\n蒺藜皂甙\t470553\n当空\t470554\npatching\t470555\n非全\t470556\nnigx\t470557\n暧昧宰执天下\t470558\n点刺\t470559\n安塔芮丝\t470560\nnyc\t470561\n山东工业职业学院\t470562\n17173\t470563\n采购员\t470564\n波特曼\t470565\n喂鱼\t470566\nashamed\t470567\n辽宁省水利厅\t470568\n双联户\t470569\n中华医学会肾脏病学分会\t470570\nmanwei\t470571\n龙瞎\t470572\n20151216\t470573\n胡郁\t470574\n藏区\t470575\n信息科技有限公司\t470576\nacquires\t470577\n32片\t470578\n玛伊\t470579\n无限池\t470580\n护士节\t470581\n调制\t470582\n〡\t470583\n爱荷华\t470584\ntomact\t470585\n捞月狗双榜\t470586\n范德萨\t470587\n风流韵事\t470588\n精厕\t470589\n充装站\t470590\n优先认购权\t470591\n德意志\t470592\n处女座男\t470593\n替人\t470594\n莫陌\t470595\naltera\t470596\n孝庄太后\t470597\n零基预算\t470598\n低组\t470599\n热诚\t470600\n颜艺\t470601\n内积\t470602\n嚼碎\t470603\nLea\t470604\n1GB\t470605\nsipac\t470606\n虚拟定位精灵\t470607\n淘气堡\t470608\nSucking\t470609\nsulfide\t470610\n身份认证\t470611\n1.5T_\t470612\n赵海\t470613\n雷欧幻\t470614\n面授\t470615\n深圳百合外国语学校\t470616\n小红\t470617\n12倍\t470618\n发露\t470619\n行贿\t470620\n公交专用道\t470621\nBecome\t470622\n地精\t470623\n新松江路\t470624\n院落\t470625\n百度熊掌号\t470626\n王石刘备\t470627\n中国移动政企客户分公司\t470628\n发力\t470629\n终焉\t470630\nTolerance\t470631\nvetements\t4706
32\n马思聪\t470633\n田然\t470634\n小狂\t470635\n徐铉\t470636\nTemporarily\t470637\nPhotographers\t470638\n旅游景\t470639\n恶法\t470640\n自然史\t470641\nINI\t470642\n河北司法警官职业学院\t470643\nexports\t470644\n威廉·福克纳\t470645\nMaterialize\t470646\n量子化\t470647\n陈学变形金刚\t470648\n辽宁省司法厅\t470649\n不堪回首\t470650\nOutlook2010\t470651\n颗\t470652\nUMA\t470653\n勘察\t470654\nLeads\t470655\ncultivate\t470656\nSUMMIT\t470657\n3325\t470658\n田庄镇\t470659\n天易\t470660\n天虹百货\t470661\n咪咕阅读-小说阅读\t470662\n长轨\t470663\n刘可颖\t470664\nvivoy67\t470665\nfateextella\t470666\n诗狂\t470667\nichart\t470668\n衡水新闻网\t470669\n卫国战争\t470670\n异度之刃2吧\t470671\n20140602\t470672\n良乡大学城\t470673\n列车员\t470674\n瓦胡岛\t470675\n鸽眼\t470676\n长春市财政局\t470677\nVANCL\t470678\n周播剧\t470679\nm1ok\t470680\nSigmaplot\t470681\nProcesses\t470682\n6000点\t470683\n变色龙\t470684\n高耗能\t470685\n小7论坛\t470686\nm5\t470687\nDebate\t470688\n闪电麦昆\t470689\n宁夏电力\t470690\n报名时\t470691\n解郁\t470692\nRefa\t470693\nyyt\t470694\n神思者\t470695\n金川开发区\t470696\n关税完税\t470697\nCKD\t470698\n龙腾\t470699\n才华横溢\t470700\n布孔\t470701\n周乐\t470702\n徐泾镇\t470703\n网络安全\t470704\n哈密市政府\t470705\n安慕希希腊\t470706\n虚拟商品\t470707\n20150529\t470708\nTease\t470709\n中华人民共和国外交部\t470710\n易打单\t470711\n大浪湾\t470712\n员外郎\t470713\n网卡版\t470714\n梓州\t470715\n认筹\t470716\n国药准字号\t470717\n伺服电机\t470718\n租衣\t470719\n亲传\t470720\n秦王\t470721\n昌吉市\t470722\n酒石酸钾钠\t470723\nod\t470724\nFALSE\t470725\nPthread\t470726\n沪通铁路\t470727\n明宇\t470728\n民生证券股份有限公司\t470729\n到庭\t470730\n惠民园\t470731\nzhuge\t470732\n负氧\t470733\n乐博乐博\t470734\n云南省商务厅\t470735\n拉菲娱乐\t470736\n高氯酸钠\t470737\n2016年11月1日起\t470738\n菏泽高新区\t470739\n吴亦欧弟\t470740\nExtended\t470741\ninvention\t470742\nOpenXC\t470743\n生前\t470744\n鸡同鸭\t470745\noslo\t470746\n三墩中学\t470747\n喀斯玛商城\t470748\n东安县人民政府\t470749\n词典\t470750\n最好的朋友\t470751\n八十_\t470752\n墙肢\t470753\n周易六爻\t470754\n6000多元\t470755\n古德利\t470756\n芭比堂\t470757\n手袋\t470758\n和非\t470759\n圣耀\t470760\n澳门皇冠\t470761\n蜀门红旗h7\t470762\n宝业\t470763\nxpk\t470764\n姚飞\t470765\nhenta\t470766\n1.75\t470767\n循规蹈矩\t470768\n西派城\t470769\n黔江\t470770\nLNK2001\t
470771\n水头镇\t470772\n战线\t470773\ntcb\t470774\n終於\t470775\n赵玉平\t470776\n波依定\t470777\n居士服\t470778\n牛尾\t470779\nUSB数据线\t470780\n大妖精泉\t470781\n憬\t470782\n圈主\t470783\n百度云/超清\t470784\n麻辣教师GTO\t470785\nping6\t470786\nSOULS\t470787\nmedal\t470788\nLexis\t470789\n440\t470790\n马多\t470791\n切割机\t470792\n春兴\t470793\n天津市第四中心医院\t470794\n花木君\t470795\n赛斯\t470796\n批判\t470797\n河东街道\t470798\n这张脸\t470799\n2016年4月\t470800\n多例\t470801\n脂肪乳注射液\t470802\n怪笑\t470803\n奥南海滩\t470804\njgs\t470805\n刺激\t470806\n传媒\t470807\n躲避\t470808\nRTHK\t470809\n成都东\t470810\n玫瑰5\t470811\n马恩全集\t470812\n字族\t470813\n女爱\t470814\nfrey\t470815\n券商\t470816\n呵护\t470817\nxiye\t470818\n南宁市政府\t470819\nsleeves\t470820\n棠夫人\t470821\n2002世界杯\t470822\n玉帛\t470823\n春蚕到死丝方尽\t470824\n0AA0\t470825\nlibcode\t470826\nAKG\t470827\n资源池\t470828\n指示灯\t470829\n缺课\t470830\n小亮\t470831\n桜日\t470832\nCatwalk\t470833\n580分\t470834\n杨东升\t470835\n迪文\t470836\nglonass\t470837\n常州外国语学校\t470838\n嘉年华论坛_汽车之家论坛\t470839\ngedit\t470840\n中国农村综合改革研究中心\t470841\n笨鸟\t470842\n变形金刚ol\t470843\n结构性存款\t470844\n兰州西\t470845\n油漆工\t470846\n雍和府\t470847\n风魔\t470848\n海口百姓网\t470849\n01815\t470850\n契丹族\t470851\n艾伯塔省\t470852\n共聚物\t470853\n致辞稿\t470854\n植根\t470855\n掉落率\t470856\n光大证券股份有限公司\t470857\ncontainer\t470858\n后代们\t470859\n安布雷拉\t470860\n闭环\t470861\nsocket函数\t470862\n主办会计\t470863\n不要害怕\t470864\n天成医疗网\t470865\n亚洲一号\t470866\n电针\t470867\nFlats\t470868\n虎山长城\t470869\n亚夏\t470870\n光头强\t470871\n女记\t470872\n套选\t470873\n百威啤酒\t470874\nrigging\t470875\n核酸适配体\t470876\n冰雪\t470877\n迪达拉\t470878\npneumatic\t470879\n宏盛\t470880\n影路\t470881\nfuzhi\t470882\n金华市中心医院\t470883\n伊邪那岐\t470884\n欢乐豆\t470885\n三速\t470886\n茌平县\t470887\n圣乔治\t470888\n定期存折\t470889\nincest\t470890\n管家婆财贸双全\t470891\nZhuang\t470892\n小海蒂\t470893\n观音镇\t470894\n群友\t470895\n鸡块\t470896\n仵作\t470897\nconm\t470898\nyrz\t470899\n第7卷\t470900\n古曲网\t470901\n土豆皮\t470902\n鹃\t470903\n发晕\t470904\n联席会议\t470905\n10几秒\t470906\n证婚\t470907\n第5讲\t470908\nSunflower\t470909\n唐嫣\t470910\n东北野战军\t470911\ngapps\t470912\nDS3\t470913\n4月13\t470914\n廖静\t4
70915\n目标成本法\t470916\n往来款\t470917\nhound\t470918\n李尚洙\t470919\n利多卡因\t470920\n上海市公安局\t470921\n麻辣鸡\t470922\n硕贝德\t470923\n济南高铁站\t470924\nu18\t470925\nnozzle\t470926\n怡美\t470927\n污水泵站\t470928\nscc\t470929\nPakistani\t470930\n沃尔玛(中国)投资有限公司\t470931\n王师\t470932\n不省\t470933\n火控\t470934\nIntPtr\t470935\n资产使用权\t470936\n学前端\t470937\nwin7资源管理器\t470938\nusers\t470939\n株式\t470940\n冰史玉柱\t470941\n黑莓吧\t470942\nuGUI\t470943\n文件尾\t470944\n轻松时刻\t470945\n全境\t470946\n香尘\t470947\n嗓子\t470948\n60平方厘米\t470949\ncheer\t470950\n首任\t470951\nCentres\t470952\n朱熹\t470953\n蜃\t470954\n电子狗\t470955\n厦门市统计局\t470956\n平衡点\t470957\n360doc\t470958\n乡艳\t470959\n小凉山\t470960\n风凌\t470961\n10寸\t470962\nepos\t470963\n渡车\t470964\n俩目\t470965\n刘文娟\t470966\n桂格燕麦片\t470967\n前列腺\t470968\nnikko\t470969\n议决\t470970\nCD+DVD\t470971\n星野光\t470972\n田娃\t470973\n第三更\t470974\n速腾论\t470975\n太原奥数网\t470976\n阿什兰\t470977\n智能科学与技术专业\t470978\n2018年03月30日\t470979\n云雀恭弥\t470980\n联句\t470981\n心虚\t470982\n母姓\t470983\n国有股\t470984\n炒河粉\t470985\n活死人黎明\t470986\n木兮\t470987\n4100元\t470988\n箫声\t470989\n西安网络公司\t470990\n大制\t470991\n刘作明\t470992\njts\t470993\n坚实\t470994\n柳州职业技术学院\t470995\n里贝里\t470996\n2016年5月29日\t470997\n盐灯\t470998\n天空之城\t470999\n昝\t471000\n顺庆\t471001\n依图科技\t471002\n安东老王\t471003\n消耗\t471004\nCGLib\t471005\n广东省教育厅\t471006\n背景墙\t471007\n滑胎\t471008\n第99集\t471009\n尼亚加拉瀑布城\t471010\nIPG\t471011\n3025\t471012\n鼓书\t471013\n麦基\t471014\n专类园\t471015\n奶皮\t471016\nthe7\t471017\n卡林巴\t471018\n政府指导价\t471019\n怪物猎人2\t471020\n酱卤肉\t471021\nMadhouse\t471022\n电源层\t471023\n试题目\t471024\n一动一动\t471025\n体育东路\t471026\n2901\t471027\njsplumb\t471028\n银字\t471029\n大色\t471030\n批文\t471031\nCloudCompare\t471032\nwinload.efi\t471033\n阿联酋迪拉姆\t471034\n金苗网\t471035\n动轮\t471036\n晶钢板\t471037\n聚酯多元醇\t471038\n河南政府\t471039\n帕里斯希尔顿\t471040\nranana\t471041\n苏州地铁5号线\t471042\n香糖\t471043\n韩偓\t471044\n歪解\t471045\n陈维崧\t471046\n考文垂大学\t471047\n律者\t471048\n泰瑞沙\t471049\nBizTalk\t471050\n监察官\t471051\n美国运通\t471052\n团级\t471053\n布尔什维克\t471054\ninx\t471055\n刺五加片\t471056\n舀\t471057\nrongyao\t4710
58\n第122届\t471059\n鲟鳇鱼\t471060\nGII\t471061\n中国国家人才网\t471062\ngs63vr\t471063\n离子源\t471064\n上海市发改委\t471065\n七台河市\t471066\ncarte\t471067\n6.61\t471068\n烈属\t471069\n梅县区\t471070\n323路\t471071\n17公里\t471072\n瓜片\t471073\nLars\t471074\n优途\t471075\n居转户\t471076\n子园\t471077\n余干\t471078\n秋末镇\t471079\n1万小时\t471080\n苏州工业园区劳动和社会保障局\t471081\n饥荒\t471082\n中南民族大学\t471083\n鸿图图书旗舰店\t471084\n激萌\t471085\n卷叶\t471086\n第2015\t471087\n施工技术\t471088\n学工网\t471089\n有情有\t471090\n四方块\t471091\n吃点\t471092\n薛城区政府网\t471093\n练琴\t471094\n体中\t471095\n中间套\t471096\n双头人\t471097\n2001年4月1日\t471098\n中密度纤维板\t471099\n纸团\t471100\n建川博物馆\t471101\n市标\t471102\n华祥苑\t471103\n渐渐地\t471104\n全日制\t471105\n兼容版\t471106\nbitdefender\t471107\n病患\t471108\n孟鲁司特钠片\t471109\n进宝\t471110\n百乐满\t471111\n利润表分析\t471112\n2.5小时\t471113\n杀人罪\t471114\n中华龙舟大赛\t471115\n南阳市委\t471116\n肉驴\t471117\ni=0\t471118\nMSBA\t471119\n20170821\t471120\n统一版\t471121\n小珠山\t471122\nCart\t471123\n主变\t471124\n小儿氨酚烷胺颗粒\t471125\n底册\t471126\n成哥\t471127\n稀缺性\t471128\n大气环境\t471129\n缩略词\t471130\n信度\t471131\nJAD\t471132\n失调\t471133\n米酷\t471134\n好时节\t471135\nsga\t471136\n方柄\t471137\n全然不知\t471138\nemployed\t471139\nAutomobile\t471140\nv1.0.0.1\t471141\n市政学\t471142\n观望\t471143\n0.1mol\t471144\n九转\t471145\n猪肝汤\t471146\n24寸\t471147\n101.7\t471148\n李园\t471149\n非金属矿产\t471150\narg\t471151\nPSB\t471152\nConferences\t471153\n水电水利工程\t471154\nExcel2000\t471155\n马岩松\t471156\nClubMed\t471157\n我爱故乡的杨梅\t471158\n000410\t471159\nbunnyblack3\t471160\n东方CJ\t471161\n古拉格\t471162\n凯撒宫\t471163\n孤品\t471164\nnginx+\t471165\n意匠\t471166\ntsai\t471167\n围棋少年\t471168\n摩梭人\t471169\n十一万\t471170\n甲种\t471171\nMT7621\t471172\n邰正宵\t471173\n诈骗公司\t471174\n客流高峰\t471175\nassimp\t471176\nSupra\t471177\n热加工\t471178\ntowns\t471179\n文印\t471180\n税\t471181\nfav\t471182\n长沙晚报网\t471183\n坐地\t471184\n经验技\t471185\n离岸\t471186\n通天武尊\t471187\nGrátis\t471188\n字节数\t471189\nIPIP.NET\t471190\nsulfate\t471191\n530Le\t471192\n五杯\t471193\n谢闻轩\t471194\n红岩镇\t471195\n潼南\t471196\n美美哒\t471197\n开机启动项\t471198\n做伴\t471199\ntimespec\t47120
0\n楚歌\t471201\nSteele\t471202\n行课网\t471203\n害喜\t471204\n兰州市\t471205\nbatman\t471206\n幺妹儿\t471207\n振东\t471208\nh2so4\t471209\nwinword\t471210\n末日孤舰\t471211\n记恨\t471212\n苦丁茶\t471213\nPile\t471214\nW60\t471215\n蜜桃成熟时33D\t471216\n肿痛安胶囊\t471217\nDNF安全模式\t471218\naccesstoken\t471219\n绿豆沙\t471220\n原动力\t471221\n剔透\t471222\n拉玛\t471223\n来迟\t471224\n营配\t471225\n欢乐时光\t471226\n大刀记\t471227\n118元\t471228\n浙江高院\t471229\n国金基金\t471230\nBacterial\t471231\nREMAX\t471232\n枪响\t471233\n公人\t471234\n5点钟\t471235\n墓王之王\t471236\nwin7系统安装盘\t471237\n团票\t471238\n剑网三大师赛\t471239\n爆炸盒子\t471240\n军转干\t471241\nMemories\t471242\n搞野\t471243\n白浪\t471244\n西归浦\t471245\n平板灯\t471246\nSwagelok\t471247\n2016年四月\t471248\n十八周岁\t471249\nmichelin\t471250\n无产阶级文化大革命\t471251\n仓管\t471252\nSuperStar\t471253\n地气\t471254\n苏香\t471255\nCWnd\t471256\n柠檬酸钙\t471257\n防汛\t471258\n地心游记\t471259\n4厘\t471260\n算来\t471261\n作曲版\t471262\nfolio\t471263\nassist\t471264\n工作工资\t471265\n弹出自定义\t471266\nmkv格式\t471267\nICTCLAS\t471268\n玉树\t471269\n圣达菲\t471270\n弗兰克\t471271\nFreud\t471272\n实达\t471273\n惠普电脑\t471274\n甩棍\t471275\n清心\t471276\n枪托\t471277\n私宠\t471278\n易快\t471279\n加整\t471280\n椎管狭窄\t471281\nc座\t471282\n意乙\t471283\n拣\t471284\n金华南站\t471285\nforbid\t471286\n八辑\t471287\n房户\t471288\n上海旅游集散中心\t471289\n电脑城\t471290\nviable\t471291\n贾斯汀·汀布莱克\t471292\n诅咒铠甲2\t471293\n辽宁地区\t471294\nlumen\t471295\nstranded\t471296\n新井佑美\t471297\n大忙人\t471298\n颈内动脉\t471299\ntrading\t471300\n台字\t471301\n自动重启\t471302\nofice\t471303\n豫建设标\t471304\n灼烧感\t471305\n面试官\t471306\n200平方米\t471307\n丹参\t471308\n森亮号航海见识\t471309\nflt\t471310\n王民\t471311\n伊路米\t471312\n保隆科技\t471313\n悦盒\t471314\ngetshell\t471315\n240万元\t471316\n一完\t471317\n國立\t471318\n歌括\t471319\n电渗析\t471320\n菲力牛排\t471321\nLGA\t471322\n太平南路\t471323\n指物\t471324\nVirgins\t471325\nNBA_腾讯\t471326\nstm32f105\t471327\n张立峰\t471328\nhtpps\t471329\n脚堂\t471330\npcntl\t471331\n黄金分割率\t471332\n亚威\t471333\n延时继电器\t471334\n沪宁\t471335\n剪切器\t471336\n电芯\t471337\n吴耀汉\t471338\n业员\t471339\n肩\t471340\n乌节路\t471341\n抹墙机\t471342\n金库门\t471343\n过滤
规则\t471344\n花栗鼠\t471345\n张法\t471346\nAdrian\t471347\n普测\t471348\n移码\t471349\nArbitrary\t471350\n分支线\t471351\n梅园路\t471352\n濒危\t471353\n实化\t471354\nRehabilitation\t471355\n琴童\t471356\n2016年7月8日\t471357\n都行\t471358\n吉他谱集\t471359\n一个世纪\t471360\njtopo\t471361\n4.7.4\t471362\n急性心肌梗死\t471363\n恶魔果实\t471364\n国家地理标志\t471365\n方辉\t471366\n残局斗地主\t471367\n6080yy\t471368\n南门广场\t471369\nHD-MP4\t471370\n阿糖胞苷\t471371\nEk\t471372\n刀型\t471373\n手作\t471374\n第141\t471375\n阿鹏\t471376\n∝\t471377\n克里斯蒂安\t471378\n史黛拉\t471379\n中兴华会计师事务所\t471380\n北塘古镇\t471381\n总师\t471382\n中山南头镇\t471383\n金属浴\t471384\n3.1米\t471385\n现职\t471386\n93页\t471387\n脉搏\t471388\n去哪儿攻略\t471389\nAV女\t471390\n八支\t471391\n宜家家具\t471392\n曲江池\t471393\n通信学院\t471394\n递推式\t471395\n九百\t471396\n虚劳\t471397\n华北电力大学\t471398\n华泰联合证券\t471399\n外阴影\t471400\n数据结构与算法分析\t471401\n感情史\t471402\nDre\t471403\nweke\t471404\n变形记\t471405\n分娩\t471406\n粗俗\t471407\nVSFTPD\t471408\n滕头村\t471409\n第32轮\t471410\n分清楚\t471411\n无差别\t471412\n荣盛房地产发展股份有限公司\t471413\nmax485\t471414\n【推文+\t471415\n选中值\t471416\n115平\t471417\n1v2\t471418\nwarranty\t471419\n临冬城\t471420\n害人精\t471421\n很美\t471422\nHGT\t471423\n听见\t471424\n江山控股\t471425\ndw8\t471426\nws831\t471427\nCAI978\t471428\n20兆\t471429\n张建华\t471430\n滑盖\t471431\n绝命毒师第四季\t471432\n巫师2:国王刺客\t471433\nVC泡腾片\t471434\n反革命\t471435\n雷斯林\t471436\n洛阳师范学院\t471437\nSIDI\t471438\n考神\t471439\n普达措国家公园\t471440\n徐正溪\t471441\n阿尔萨斯\t471442\n眼睑炎\t471443\nreminding\t471444\n该死\t471445\n假假\t471446\n毕业论文防查重详解\t471447\nCellulose\t471448\n板门店\t471449\n汽车家园网\t471450\n胡允儿\t471451\n吴超\t471452\n重生成\t471453\nOdyssey\t471454\n导学稿\t471455\n逆转录\t471456\n妇产科护理学\t471457\n曼彻斯特机场\t471458\n4000亩\t471459\n哥萨克3\t471460\n围脖\t471461\n1.9版\t471462\nPang\t471463\n捉妖记2百度云\t471464\n16方\t471465\n入资\t471466\n壹账通\t471467\n天星桥\t471468\n摇乳\t471469\n金浪\t471470\n闯关东前传\t471471\nintermittent\t471472\n0u\t471473\n小升初_奥数网\t471474\n百货公司\t471475\n600辆\t471476\n张也_\t471477\n重庆市永川区人民政府\t471478\n辛柏青\t471479\n以诚相待\t471480\n3158招商加盟网\t471481\nnomos\t471482\n3366\t471483\n成都全搜索新闻网\t471
484\n北京链家房地产经纪有限公司\t471485\n海鲨\t471486\n漂流瓶\t471487\n角球\t471488\n福州北站\t471489\n创维液晶电视\t471490\n1080P高清\t471491\n喷门\t471492\n2R\t471493\nCONCERT\t471494\n蔡居诚\t471495\n民权\t471496\n管长\t471497\n莱昂马明哲\t471498\n阿尔德里奇\t471499\n苏美达\t471500\nshards\t471501\n借贷方\t471502\n国际投资学\t471503\n算卦街\t471504\n东方名城\t471505\nrank\t471506\n谢容儿\t471507\nc++吧\t471508\n饭厅\t471509\nsigar\t471510\n运动服\t471511\n解术\t471512\n公平\t471513\n25秒\t471514\n钢制\t471515\nczl\t471516\n云浮日报数字报\t471517\n森林狼\t471518\n回执\t471519\n罗技鼠标宏\t471520\n原发性高血压\t471521\n陈瑜\t471522\n飞灰\t471523\n商品房\t471524\n丢勒\t471525\n2.7亿\t471526\n经二路\t471527\n1成\t471528\n弱电\t471529\n饱胀\t471530\nnote4x\t471531\n牡丹花\t471532\n饱食\t471533\n刘思齐\t471534\n海危机\t471535\n壤土\t471536\n神咲诗织\t471537\n市外侨办\t471538\ncloudera\t471539\n倍智\t471540\n国家奖学金\t471541\noi\t471542\n算算\t471543\n云大医院\t471544\n东城水岸\t471545\n2.5l\t471546\nExile\t471547\nSpirits\t471548\ncoupons\t471549\n国医馆\t471550\n既不\t471551\n寄存器\t471552\n集成学习\t471553\ncreative\t471554\n过敏原\t471555\n板式橡胶支座\t471556\n网上作业_\t471557\n葡萄糖酸锌\t471558\n西山镇\t471559\n阿叼\t471560\n伊战\t471561\nQuinn\t471562\n28P\t471563\n轧辊\t471564\n纸上得来终觉浅\t471565\n好几遍\t471566\n谭成旭\t471567\n肇源县\t471568\nSexlab\t471569\n牛皮糖\t471570\n小米笔记本pro\t471571\n恒为科技\t471572\n早衰\t471573\npostMessage\t471574\n新约\t471575\n九脚\t471576\n新视野大学英语视听说教程\t471577\ndobe\t471578\nrida\t471579\n惠施\t471580\nblocks\t471581\n丽江市古城区\t471582\nvander\t471583\n中凯\t471584\n徒骇河\t471585\nAA+\t471586\nPAGES\t471587\n山形\t471588\n五金装修|一起网\t471589\n中原集团\t471590\n顾明远\t471591\n建档\t471592\n雷龙角\t471593\nTizzy\t471594\n峰会\t471595\nDedeCMS\t471596\n感動\t471597\n刘振东\t471598\nwww.51zbz.com\t471599\n总经销商\t471600\nMVS\t471601\npan115\t471602\n2015年12月\t471603\n曙光路\t471604\n门式\t471605\n民营医院\t471606\n飞博\t471607\n湖南湘江新区\t471608\nThrones\t471609\n双钢轮压路机\t471610\nMethod\t471611\n0004\t471612\n弹簧平衡器\t471613\n低谷期\t471614\n小练\t471615\n2017个\t471616\n天生一\t471617\n豆姐\t471618\n专用汽车网\t471619\n300余名\t471620\ntro\t471621\n金百万\t471622\n贤内助\t471623\n嘉定区中心医院\t471624\n烟草专卖局\t471625\n攀枝花市西区\t4716
26\n施奈德\t471627\n丁敏\t471628\n辽海出版社\t471629\n关智一\t471630\n曲度\t471631\n锁价\t471632\nResultMap\t471633\n13五\t471634\n例名\t471635\n齐下\t471636\n磁滞回线\t471637\n后跟贴\t471638\n成对\t471639\n卷烟\t471640\n引导员\t471641\n喻言\t471642\nedx\t471643\n固封\t471644\n11号\t471645\n拉篮\t471646\n拿到\t471647\ncncap\t471648\n苗种\t471649\n流体管\t471650\n失语症\t471651\n大类\t471652\n一起来跳舞\t471653\nTechnische\t471654\n洛马\t471655\nInterpretation\t471656\ngbp\t471657\n移动迷宫2\t471658\n纟\t471659\n乐山路\t471660\nAmy\t471661\nubb\t471662\n芯晴网\t471663\n深宅旧梦\t471664\n玖富\t471665\n魔棒\t471666\n树头\t471667\n彷徨之刃\t471668\n大渡河路\t471669\ncalcite\t471670\n捍卫\t471671\nqwerty\t471672\n岫岩\t471673\nGPS防盗器\t471674\n美团大众\t471675\n筏\t471676\nPCG\t471677\n缪勒氏\t471678\n2018.1.2\t471679\npthon\t471680\nOrchard\t471681\n戴斯班克\t471682\n赤峰路\t471683\navtt天堂_av天堂\t471684\n浪鲸\t471685\n10.3.0\t471686\n质量责任制\t471687\nA段\t471688\nkits\t471689\n棒球服\t471690\n世界血友病日\t471691\n紫塞\t471692\n河洛文化\t471693\n粤安\t471694\n涪陵网\t471695\n仙城镇\t471696\n耸听\t471697\ntc\t471698\n政治\t471699\nHour\t471700\n5声\t471701\n李希\t471702\n斯巴达克\t471703\n0.4\t471704\n航空物流\t471705\nMenAngel\t471706\n青田\t471707\nSRV\t471708\n全国大学生物联网\t471709\n楚北捷\t471710\n九大类\t471711\n豪杰春香\t471712\n冰寒绡\t471713\n股票期权行权\t471714\npmap\t471715\n天禾\t471716\n京味斋\t471717\n翠竹街道\t471718\n601988\t471719\n青木川古镇\t471720\n竹林村\t471721\n空幻\t471722\n专利法实施细则\t471723\n燕京航城\t471724\n中国电信上海公司\t471725\nE5.0\t471726\n邪恶力量第十三季\t471727\n胡秀英\t471728\n妇好\t471729\nUNS\t471730\n错路\t471731\n迷奇\t471732\n上海人民美术出版社\t471733\n西瓜网\t471734\n圆陀角\t471735\nsetinterval\t471736\n高速铁路\t471737\n功勋\t471738\nUFO中文网\t471739\n小事儿\t471740\n流放之路poe\t471741\n粤电\t471742\n茶友\t471743\n鲜湿\t471744\n第15个\t471745\n第17章\t471746\n酷哒\t471747\n月光岛\t471748\nDLX\t471749\nshiraz\t471750\n黑船\t471751\n软件工程专硕\t471752\n微机保护装置\t471753\nmxdsf\t471754\n药渣\t471755\n江苏师大\t471756\n上衣女\t471757\n歌诗达赛琳娜\t471758\n来劲\t471759\n方管机\t471760\nkeng\t471761\n烟台南\t471762\n永续\t471763\n票制\t471764\nForza\t471765\nGDT\t471766\n上海研发中心\t471767\n香芹\t471768\n温州肯恩大学\t471769\n日照银行\t471770\n蒙泰\t471
771\n旧街\t471772\n检察机关\t471773\n178CS\t471774\nドロップアウト\t471775\n语源\t471776\nvip浏览器\t471777\n201504\t471778\n7.85\t471779\n尉氏县\t471780\n安图妮\t471781\n皇后乐队\t471782\n延川县人民政府\t471783\nGUANGZHOU\t471784\nUFW\t471785\n0x\t471786\nurldecode\t471787\n两袋\t471788\n囱\t471789\n斩月\t471790\nRC\t471791\n蜂雷\t471792\n乐城国际贸易城\t471793\nwin8/win10\t471794\n菜豆\t471795\n八甲\t471796\n纽芬兰犬\t471797\n骞\t471798\n史塔克\t471799\n何继斌\t471800\n金莲清热泡腾片\t471801\n几段话\t471802\n牛排馆\t471803\nShuttle\t471804\n4点钟\t471805\n增压机\t471806\n观灯\t471807\n枡野俊明\t471808\n开心花甲粉\t471809\n政策性\t471810\nHuayra\t471811\n血源\t471812\n花剑\t471813\n花王\t471814\n北京华博医院\t471815\n剑侠2\t471816\n南苑街道\t471817\n遥控式\t471818\n荥经之窗\t471819\n淘宝城\t471820\n引路人\t471821\nsucks\t471822\n下者\t471823\n上海市社区卫生服务中心\t471824\n周子瑜\t471825\n数集\t471826\n金橙\t471827\n第9天\t471828\nmame32\t471829\njake\t471830\n玄仙\t471831\n成都高新区国家自主创新示范区\t471832\ntuiguang\t471833\n发家\t471834\n洗头\t471835\n磁翻板液位计\t471836\nlolS8\t471837\nsponsorship\t471838\n龙俊亨\t471839\n楼观台\t471840\n晚香玉\t471841\n极点五笔输入法\t471842\n吴蓬\t471843\n路标网\t471844\n精灵球\t471845\n无锡人才网\t471846\n读码器\t471847\n国际公司\t471848\n培训部\t471849\nngc\t471850\nNextCloud\t471851\n唐立新\t471852\n脆弱\t471853\n南海神庙\t471854\n巨富\t471855\n玫瑰人生\t471856\nEasyPoi\t471857\nMOVE\t471858\n需谨记\t471859\n张春桥\t471860\n张和\t471861\nWeChat\t471862\n师专\t471863\n壹键哥\t471864\n指紧扣\t471865\n甘肃交通职业技术学院\t471866\nacs\t471867\nsnippet\t471868\n运算符优先级_\t471869\n大观天下\t471870\n第205集\t471871\n自由恋爱\t471872\n大舅\t471873\n管理法\t471874\n安信证券股份有限公司\t471875\n上海药明康德新药开发有限公司\t471876\n开挖机\t471877\n过亿元\t471878\n合土\t471879\n规划展示馆\t471880\n澄清池\t471881\n推来\t471882\n重定向\t471883\n耐蚀性\t471884\n2641\t471885\n京城81号\t471886\n王的女人\t471887\n高检\t471888\n河南电视台都市频道\t471889\n虽说\t471890\nwww.166xs.com\t471891\n盛大游戏\t471892\n包教包会操作教程\t471893\n西安住房公积金管理中心\t471894\n吉他\t471895\n动式\t471896\n异频\t471897\nArchDaily\t471898\n乐山外国语学校\t471899\n老邱\t471900\n蓝标\t471901\n枪战片\t471902\nM4000\t471903\n宀\t471904\n靠近\t471905\n赵氏孤儿\t471906\nMPPT\t471907\n北京西北\t471908\nジン\t471909\n奇异博士\t471910\n广东省建筑施工企业\t47
1911\nAion_永恒之塔\t471912\n小歪\t471913\n柳飘飘\t471914\n分散性\t471915\n法国杯\t471916\nstop\t471917\n请假\t471918\n勇夺\t471919\n中国人民健康保险股份有限公司\t471920\n三国策\t471921\nUnderscore\t471922\n血栓\t471923\n全军覆没\t471924\n雪松控股集团\t471925\n0553\t471926\n笔落\t471927\ncrawford\t471928\nliv\t471929\n多体\t471930\nOpenIR\t471931\n弗雷尔卓德\t471932\n乐高漫威复仇者联盟\t471933\nhuaxia\t471934\n魔术秀\t471935\n艾辰\t471936\n罗定市\t471937\n花田喜事\t471938\n非典型增生\t471939\n反角\t471940\nyzm\t471941\n0.36%\t471942\n八一公园\t471943\nfender\t471944\n食源性\t471945\n张博宇\t471946\n白水洋镇\t471947\nostream\t471948\nButler\t471949\nCoral\t471950\n养生馆\t471951\n组级\t471952\n馏程\t471953\n三载\t471954\nAPKPure\t471955\nlooping\t471956\nlyl\t471957\ntcpudp\t471958\n锤子科\t471959\n采荷二小\t471960\n消防队员\t471961\n巴塞利斯\t471962\n4am\t471963\n珂莱欧\t471964\n小叮当\t471965\n常亮\t471966\nvictoria\t471967\nEntity\t471968\n伊芙琳\t471969\n多莱尔\t471970\n拌合\t471971\n24岁\t471972\n陈奕威\t471973\nMultiIndex\t471974\n135斤\t471975\n大清隐龙\t471976\n温奶器\t471977\n卓卓\t471978\n越城\t471979\n康纳\t471980\narcsinx\t471981\n抬枪\t471982\n马来西亚大学\t471983\n写意牡丹\t471984\nMKD-S94\t471985\nex\t471986\n生活家\t471987\nbackgroud\t471988\n逗比妹\t471989\n金唯智\t471990\n秋韵\t471991\nmaowang\t471992\n非行政\t471993\n李殊\t471994\n豹子号\t471995\n准者\t471996\nadm\t471997\nStay\t471998\n马六甲海峡\t471999\n最久\t472000\n绿环\t472001\n108位\t472002\n局座\t472003\n2018.03.31\t472004\nElectro\t472005\n番茄鸡蛋面\t472006\nPCs\t472007\n东华理工大学\t472008\n鸣锣\t472009\n51kata\t472010\n静字\t472011\n01\t472012\n3袋\t472013\n中国分公司\t472014\n河南省机构编制委员会办公室\t472015\n昴\t472016\nRoulette\t472017\n/dev/sr0\t472018\nCocoStudio\t472019\n选本\t472020\nintime\t472021\nHF\t472022\n直纹\t472023\nAlzheimer\t472024\n竹园\t472025\n空白处\t472026\n话数\t472027\n曾军\t472028\n吉运\t472029\n嘟囔\t472030\n赛默\t472031\n凤逆\t472032\nwin2008_Windows系\t472033\nUploaded\t472034\naux接口\t472035\n二手房买卖\t472036\njspx\t472037\n空客A320\t472038\n孟庄\t472039\n无脊椎动物\t472040\n0ffice\t472041\n荷叶圆圆\t472042\n洗手盆\t472043\n恐艾症\t472044\n内涵式\t472045\n易机网\t472046\n李由美\t472047\n河北地质大学\t472048\n开缝\t472049\n017\t472050\n吉林一中\t472051
\n重庆市北碚区人民政府\t472052\n余旭\t472053\n查抄\t472054\n反胃\t472055\n黄金三镖客\t472056\n2mb\t472057\n狮吼\t472058\n白蜜\t472059\nAD289\t472060\nJIN\t472061\nGeographic\t472062\n王_\t472063\n3周后\t472064\n李烨\t472065\n独立显卡\t472066\n瑞萨\t472067\n秦瑞\t472068\n虏获\t472069\n中国银行手机银行\t472070\n鸟山\t472071\n匡亚明\t472072\n打氧\t472073\n阿拉宁波\t472074\n泡胶\t472075\n肝功能\t472076\n血量\t472077\n山客\t472078\nBoot拦截器\t472079\n8K电视\t472080\n太空港\t472081\n道康宁\t472082\n温哥华国际机场\t472083\nQQ个性签名\t472084\n一个滴滴\t472085\n卡友友\t472086\n汉界\t472087\n发型师\t472088\n小家伙\t472089\n玉门关\t472090\n人民共和国\t472091\nword201\t472092\n汽车园\t472093\nevn\t472094\n黄原胶\t472095\ngangbang\t472096\n六神花露水\t472097\n女用\t472098\n深信服VPN\t472099\n气道\t472100\n露奈\t472101\n排骨架床\t472102\n碘甘油\t472103\n走南闯北\t472104\n掌银\t472105\n记着\t472106\n樱花节\t472107\n姚刚\t472108\n日息\t472109\ncoff\t472110\n酷我k1\t472111\n01s\t472112\n瑞格列奈片\t472113\n宣言\t472114\n工程板\t472115\n落子\t472116\n建行银行\t472117\n15W\t472118\n苏沐秋\t472119\n影印本\t472120\n妻主\t472121\n高杆灯\t472122\nslick\t472123\nHeld\t472124\n节水办\t472125\n中袖\t472126\n水瓶男\t472127\n720P.BD高清\t472128\n七四\t472129\nsea\t472130\n万斛\t472131\n山水\t472132\n锦文\t472133\n起降\t472134\n上海水族馆\t472135\n叶贞琴\t472136\n20187年\t472137\n内江市市中区\t472138\n1348\t472139\n项城市人民政府\t472140\n夹山\t472141\n/uint64_t\t472142\n荷赛\t472143\n郭庆光\t472144\n金种子\t472145\n安全工程学院\t472146\n剃\t472147\n连锁加盟展览会\t472148\n测试板\t472149\n洞泾镇\t472150\n传参\t472151\n中天集团\t472152\n李学凌\t472153\nWool\t472154\n艾灸器\t472155\n子叶\t472156\n治疗\t472157\n燕子沟\t472158\n张岱年\t472159\n坦露\t472160\n沈阳理工大学\t472161\n航迹\t472162\n基址\t472163\n23433\t472164\nCharlist00\t472165\n夜行记\t472166\n三分之一\t472167\np20p\t472168\n20千瓦\t472169\ng43\t472170\ncontos\t472171\n金土\t472172\n现款\t472173\n买号\t472174\n县交通运输局\t472175\npenn\t472176\n孤岛惊魂5_孤岛惊魂5\t472177\n庆祝\t472178\n空空道人\t472179\n武汉市普爱医院\t472180\nbarbara\t472181\n麼办\t472182\n余切\t472183\n霸业\t472184\n山田优\t472185\n建德财政局\t472186\n魔宝\t472187\n第二十九届\t472188\n香药\t472189\n成都高新技术产业开发区\t472190\n算什\t472191\n肯德\t472192\n籽料\t472193\n网件\t472194\n宜良县\t472195\n中国工控\t472196\n突击检查\t472197\nshis
hang\t472198\n图宝\t472199\n大社区\t472200\ngrail\t472201\n裴秀才迪\t472202\n十四点\t472203\n开学日\t472204\n子宫壁\t472205\nsimlink\t472206\n京沪高速铁路\t472207\nHZ\t472208\nffc\t472209\n东京23区\t472210\n华联\t472211\n花样男子\t472212\n信威\t472213\n卓威\t472214\n壬寅\t472215\nPremiere\t472216\nmaternity\t472217\n郭杰瑞\t472218\n面业\t472219\n201堂\t472220\n博白\t472221\n五十首\t472222\njiaoyi\t472223\n3亿美元\t472224\n柯城区\t472225\n密炼\t472226\n丁涛\t472227\n秀美\t472228\n车关税\t472229\n授权页\t472230\n两用机\t472231\n╭\t472232\n物理学\t472233\n1029\t472234\n痴娘\t472235\nUFO\t472236\n上海汽车变速器有限公司\t472237\n光线传感器\t472238\nira\t472239\n苏打水机\t472240\n莫迪\t472241\n长安铃木\t472242\n计日\t472243\n差异化\t472244\n下诏\t472245\n人本主义心理学\t472246\n八色\t472247\n很亲切\t472248\n香港汇丰\t472249\n天亮\t472250\n保罗·盖蒂\t472251\n简牍\t472252\n三基\t472253\n亚米契斯\t472254\n8.52\t472255\n_v1.0.1安卓\t472256\n圆房\t472257\n20几万\t472258\n公路安全保护条例\t472259\n顺丰运\t472260\n碍事\t472261\n红券\t472262\n华山派\t472263\nelementaryos\t472264\n傅全香\t472265\n供应_中国贸易网\t472266\n汽车销量网\t472267\ne455\t472268\n一窝蜂\t472269\n三亚站\t472270\n温馨\t472271\n京润珍珠\t472272\n字值\t472273\nralph\t472274\n每几年\t472275\n中国攀枝花网\t472276\nR0\t472277\n12319\t472278\n开小差\t472279\n企业文化节\t472280\n一如\t472281\n600172\t472282\n开征\t472283\nsuperdry\t472284\n验槽\t472285\n全经联\t472286\n保尔·柯察金\t472287\n陈伟强\t472288\n佟丽王宁\t472289\n16米\t472290\n中天微系统\t472291\n今早上\t472292\n陕甘宁边区\t472293\n准出\t472294\n剑网3PVE\t472295\n华烟\t472296\n飞力\t472297\n中国汽车工业工程有限公司\t472298\n无锡汽车站\t472299\n玄奘西行\t472300\n冒字\t472301\n粗糙度仪\t472302\n健康卫士\t472303\n硬膜\t472304\n三藏算命\t472305\n卓达太阳城\t472306\n崔军\t472307\n竞答\t472308\n浓香型白酒\t472309\n玉婷\t472310\n程序单\t472311\n黄新初\t472312\ndiv框\t472313\n葩\t472314\n一春\t472315\nexchange2007\t472316\n任剑涛\t472317\n爆种\t472318\n复印机\t472319\n耦\t472320\n入淡出\t472321\n北京四中\t472322\n槑头槑脑\t472323\n汤一介\t472324\n刨幺\t472325\nWie\t472326\n瞻博网络\t472327\n为了孩子\t472328\n通过性\t472329\n广州高新技术产业开发区\t472330\n绳轮\t472331\n武汉地铁3号线\t472332\n爱美\t472333\n2万台\t472334\nスキ\t472335\n王晗旭\t472336\n森霸股份\t472337\n雪狼湖\t472338\n凌云股份\t472339\n喵喵折\t472340\n欢乐麻将全集\t472341\n佳县\t472342\n贝里克\
t472343\n说说致\t472344\n明年1月1日\t472345\n花轮\t472346\nF级\t472347\n战地风云4\t472348\n2.7.1\t472349\n维生素e软胶囊\t472350\n杰米\t472351\n老将\t472352\n凯特·布兰切特\t472353\n日月湖\t472354\nSmtp\t472355\ncad\t472356\nallo\t472357\n伊利巧乐兹\t472358\nO4S\t472359\n058\t472360\n壮男\t472361\n入伙\t472362\n365future.com\t472363\ntmpgenc\t472364\nz5\t472365\n联名\t472366\n洋垃圾\t472367\n决出\t472368\nAraxis\t472369\ne\t472370\n无尽战区\t472371\nserver服务器\t472372\n客车\t472373\n竹沥\t472374\n防尘网\t472375\n术术\t472376\nhp7110\t472377\n雷柏v500\t472378\n日德\t472379\n1986年\t472380\n资料架\t472381\ndnf战\t472382\n仙剑奇\t472383\n无碍\t472384\n2223\t472385\n彭老师\t472386\n中税\t472387\n和氏\t472388\n选中\t472389\n37篇\t472390\n抬举\t472391\n非布司\t472392\n辇\t472393\n超声波测距\t472394\n微信企业公众号\t472395\n0.3\t472396\n北大西洋公约组织\t472397\n聚水潭\t472398\nT15\t472399\nununtu\t472400\n血玉\t472401\n广东省自学考试管理系统_广东省教育考试院\t472402\n红河哈尼族彝族自治州\t472403\n开篇语\t472404\n样本点\t472405\n天天化工网\t472406\ncapacity\t472407\n模特秀\t472408\n二十几年\t472409\nphyton\t472410\n上海龙华医院\t472411\n第十四篇\t472412\n嵌岩\t472413\n价比\t472414\n坷垃\t472415\n2225\t472416\n尹弘\t472417\n筑友万象\t472418\n假案\t472419\n李笑来\t472420\n三运\t472421\nc20\t472422\n9.5.2\t472423\n丽水日报社\t472424\n史略\t472425\n苏云\t472426\n桂安\t472427\n3000分\t472428\n高青\t472429\n王金生\t472430\n防洪堤\t472431\n小黄鱼\t472432\n胎心监护图\t472433\n红绿蓝\t472434\n评估员\t472435\nオナニ\t472436\n新城市\t472437\n互帮互助/必应区\t472438\n日过\t472439\n搬走\t472440\n求帮忙\t472441\n兰利\t472442\nsql2000\t472443\n君臣\t472444\n深圳红岭中学\t472445\n九坤\t472446\n靖城\t472447\n顶点\t472448\n159元\t472449\n屋顶上\t472450\nmargin\t472451\n地铁口\t472452\n轻杆\t472453\ncompletes\t472454\n华夏传奇\t472455\n速销\t472456\n108斤\t472457\n年年\t472458\n王青山\t472459\n妖风\t472460\n玉凤\t472461\n高沙\t472462\n7899\t472463\nSmali\t472464\n小照片\t472465\n骁龙845处理器\t472466\n第53号\t472467\n第158章\t472468\n优希麻琴\t472469\n党政领导人物库\t472470\nblcos\t472471\n冯正霖\t472472\n震泽镇\t472473\nROWNUM\t472474\n经典段子网\t472475\n19宗\t472476\n台股\t472477\n回到三国\t472478\n250平\t472479\n箱根周游券\t472480\ncatalytic\t472481\n猫砂\t472482\n西部网\t472483\n1.0m\t472484\nsanta\t472485\nlauncher\t472486\
n三国天天爱消除\t472487\n琅\t472488\n中装协\t472489\nweicheng\t472490\n罕有\t472491\n叶落\t472492\n齐呼\t472493\n皖能电力\t472494\n生化妊娠\t472495\n安卓通讯录\t472496\n孔孟\t472497\nFlinto\t472498\n槽轮\t472499\n真\t472500\n五一农场\t472501\n战场套\t472502\n计税金额\t472503\n打群架\t472504\nm4a\t472505\n8169\t472506\n建筑圈\t472507\n忙音\t472508\n4d14-B7C1-FD326CA84A0C}.job\t472509\n一16岁\t472510\n6盒\t472511\n吸着\t472512\n梦幻2\t472513\n华三通信\t472514\n易捷通\t472515\n时尚界\t472516\n大病保险\t472517\n略论\t472518\n网络打印服务器\t472519\n苏诗丁\t472520\n不均\t472521\n十堰县\t472522\nwebNick\t472523\n5.6.31\t472524\nappcompat\t472525\n650NK\t472526\n温经汤\t472527\n128kbps\t472528\n电木粉\t472529\n近几天\t472530\n搜索页\t472531\n毁容案\t472532\n第一坊\t472533\n畅阅\t472534\n诱心\t472535\n橙花\t472536\nH\t472537\nstubs\t472538\n汉王\t472539\n不适\t472540\n泗县政府网\t472541\n左拉\t472542\n北京协和医院西院\t472543\n化学生物学\t472544\n苏州九龙医院\t472545\nE431\t472546\napos\t472547\n悠着点\t472548\n谦敬\t472549\nTubo\t472550\n容量\t472551\nargue\t472552\nSoftBlue\t472553\n山西电信\t472554\nvcam\t472555\n有线网\t472556\nwsy\t472557\n石药集团\t472558\ntiku\t472559\n靓机\t472560\n自考万题库\t472561\n心慌慌\t472562\n临城县\t472563\n秦菲\t472564\n坦克大战\t472565\n漯河市源汇区\t472566\n刀剑神域虚空幻界\t472567\n中华弟子规\t472568\n被调查者\t472569\n半波整流\t472570\n剖切\t472571\n明字\t472572\n幻觉\t472573\n合纵\t472574\n老鼠仓\t472575\n扶正\t472576\n徐静蕾\t472577\n帝王花\t472578\n4万5\t472579\nhun\t472580\n11.99\t472581\ntvs管\t472582\nspringjdbc\t472583\n甬舟铁路\t472584\n5批\t472585\n孝感东站\t472586\n吕伯奢\t472587\n大难临头\t472588\n孙奇\t472589\n58万\t472590\nwin10ie\t472591\n第二季01\t472592\nremembered\t472593\n达夫\t472594\n花椒苗\t472595\n副校长\t472596\n英雄联盟比赛\t472597\n胃管\t472598\n罚跪\t472599\n冰火魔厨\t472600\nWeWork\t472601\nRotten\t472602\n2010届\t472603\n束昱辉\t472604\n26.0\t472605\n纳米颗粒\t472606\n房税费\t472607\n诚招\t472608\n消费群\t472609\n绿维创景\t472610\n物理系\t472611\n绿地海湾\t472612\n天气网\t472613\n24Bit\t472614\n碾压机\t472615\n5D3\t472616\n撞地\t472617\n关公\t472618\n嘀嘀打车\t472619\n维肤\t472620\n目暮\t472621\n斗鱼寅子\t472622\n长风\t472623\n初色\t472624\n张曦\t472625\n微门户-zz91\t472626\n真主党\t472627\n瞄准\t472628\nConductive\t472629\nstudio3\t472630
\n郑国凤\t472631\n奇函数\t472632\n未来简史\t472633\n落架\t472634\n万贞儿\t472635\n多糖铁复合物胶囊\t472636\n2017年9月29日\t472637\n中兴天机Axon\t472638\nsmithstory\t472639\nvenv\t472640\n鞭法\t472641\nwpe\t472642\n史事\t472643\nconvertible\t472644\nX线\t472645\n个人结售汇\t472646\n金银币\t472647\n苏梅岛机场\t472648\n真分数\t472649\n阿兰德\t472650\nsonar\t472651\n多组\t472652\n环球旅\t472653\n上海汽车集团股份有限公司\t472654\n长子县\t472655\n养精\t472656\n下辖市\t472657\n绿证\t472658\nOPTION\t472659\n挤奶机\t472660\nControl\t472661\n影音先锋少女av资源网_影音先锋全色av资源网_av片\t472662\nGFX\t472663\n二值\t472664\n驴马\t472665\n神木林\t472666\n拆迁房\t472667\nxpert\t472668\n拙作\t472669\n粒粒\t472670\n手段\t472671\n星城国际\t472672\n566.com\t472673\n594\t472674\n浩悦\t472675\n闪舞\t472676\n寒假\t472677\n小玉儿\t472678\n附记\t472679\n被子类\t472680\n纸坊街\t472681\n亚丁\t472682\n别光\t472683\n享乐\t472684\n西安社区\t472685\n12万起\t472686\nManor\t472687\n沿江公路\t472688\n大连新机场\t472689\n2018LCK\t472690\n番茄太阳\t472691\n刘永刚\t472692\n摇钱树苗\t472693\n北京卫戍区\t472694\ndynamics\t472695\n细菌培养\t472696\ngrin\t472697\n38岁\t472698\n981\t472699\n外景\t472700\nNotebook\t472701\n来月经\t472702\n业主群\t472703\n21章\t472704\n外面的世界\t472705\n性行\t472706\n汉山\t472707\nnova2s\t472708\n32块\t472709\nxy轴\t472710\n2cg\t472711\n滦平\t472712\n冯雪\t472713\n博苏\t472714\n窜天猴\t472715\n新现实主义\t472716\ne7\t472717\nbem\t472718\n苗华\t472719\n1.11.0\t472720\n8.5.8\t472721\n18000公里\t472722\n吉特巴\t472723\n背击\t472724\nhydrophobic\t472725\n浮箱\t472726\n无形中\t472727\n新至尊江湖\t472728\n将军\t472729\n理发师\t472730\n台班\t472731\n王洪伟\t472732\nmonit\t472733\nspitz\t472734\n喊声\t472735\n1907年\t472736\n春秋装\t472737\n奖牌榜\t472738\n大数据工程师\t472739\n可编\t472740\n落潮\t472741\n备至\t472742\n蝴蝶花\t472743\n浮士\t472744\nkepu\t472745\n希斯罗\t472746\n白钢\t472747\n臭豆腐\t472748\n第一关\t472749\n鹤浦镇\t472750\n含权\t472751\n云洲\t472752\n29座\t472753\n奉城\t472754\n300308\t472755\n医美\t472756\nPOP-百图汇素材网\t472757\nNewman\t472758\n责备\t472759\n织绣\t472760\n福建法院\t472761\n凤凰新城\t472762\n席慕博尔特\t472763\n歌圩\t472764\n埃沃\t472765\n周水子\t472766\n设区的市\t472767\n耿其昌\t472768\nlaws\t472769\n青子\t472770\n名单\t472771\n华企商学院\t472772\n事典\t472773\n2017.8\t472774\n
3299元\t472775\n篱笆庄\t472776\nbw\t472777\n鹿茸酒\t472778\n华为畅享8\t472779\n芊\t472780\n安顺市\t472781\n香薰精油\t472782\n木森\t472783\n水兵舞\t472784\n乙酰化\t472785\n20170716\t472786\n李佳淇\t472787\nLocking\t472788\ntpv\t472789\n评选\t472790\nGGG\t472791\nyinwen\t472792\n有息\t472793\n单号查询\t472794\n博通\t472795\n实名\t472796\n警官\t472797\n嘎纳\t472798\n孙飞虎\t472799\nMaksim\t472800\n132431\t472801\n更换\t472802\n饭馆\t472803\n微啦网\t472804\n抓胸\t472805\nLindsey\t472806\n金衣\t472807\nS4\t472808\nbucket\t472809\nRobust\t472810\n赤发\t472811\n一房二卖\t472812\n100nm\t472813\n武安市\t472814\n鱼面\t472815\nnRF\t472816\nEXCEL宏\t472817\n湖南地方税务局\t472818\n水火既济\t472819\n刘洋洋\t472820\n上海交大医学院\t472821\n腊八粥\t472822\n许振超\t472823\nsolarized\t472824\n9千米\t472825\n拗口\t472826\n山东地区\t472827\n双丰\t472828\n第七十一\t472829\n净月高新技术产业开发区\t472830\n热乎\t472831\n瓷砖粘接剂\t472832\n250亿美元\t472833\nSweat\t472834\n合肥省立医院\t472835\n招财进\t472836\n周勤\t472837\n码洋\t472838\nPMS\t472839\n正面照\t472840\n北华航天工业学院\t472841\n超级变声器\t472842\n副部级\t472843\n诺森德\t472844\n伊斯科\t472845\n直流电\t472846\n区政\t472847\n4个小时\t472848\n流变性\t472849\n20140923\t472850\n线性回归_\t472851\n管体\t472852\nv2.4.1\t472853\nhetong\t472854\n余之城\t472855\n荣哥\t472856\n看星星\t472857\nphenolic\t472858\nKeyEvent\t472859\n300子\t472860\n坎爷\t472861\nicarus\t472862\n听证会\t472863\n圣诞礼物\t472864\n郝刚\t472865\n主件\t472866\n三元区\t472867\nN年\t472868\n白浊\t472869\n三亚西岛\t472870\n365体育在线\t472871\nCLEAR\t472872\n股权转让印花税\t472873\n冲门\t472874\nABOUTCG\t472875\n分析员\t472876\n受得\t472877\n返场\t472878\n春暖花开\t472879\n中京\t472880\n墨色\t472881\n梦幻宝骏730\t472882\n徐守盛\t472883\n瘾君子\t472884\n同调\t472885\nwangkangluo\t472886\n算下来\t472887\n周青\t472888\nquartus\t472889\n新浪河北新闻_新浪\t472890\n山与海\t472891\ndiqiu\t472892\njx\t472893\nhuangqiqing\t472894\n实地\t472895\n倒霉蛋\t472896\n汽车广告\t472897\n雅达利\t472898\n维克托\t472899\n720P|1080P\t472900\n优质品\t472901\n180424\t472902\n浔阳\t472903\n龙天\t472904\nzoe\t472905\n7000多元\t472906\n甘舒霖\t472907\n4.5万\t472908\n海内存知己天涯若比邻\t472909\n小三_\t472910\n兮夜\t472911\nEmotional\t472912\n某一位\t472913\nUFR\t472914\n蒙曼\t472915\n5dsr\t472916\n明珠城\t4729
17\n全收\t472918\n吵吵闹闹\t472919\n巨型\t472920\n靖港\t472921\n钱江杯\t472922\ncommons\t472923\n电子龙\t472924\n花田错\t472925\n风向\t472926\n国网上海市电力公司\t472927\n张朵朵\t472928\n评价者\t472929\nPush\t472930\n交流稿\t472931\nteapot\t472932\n策马三国志\t472933\n龙润\t472934\n二甲戊灵\t472935\nmichelle\t472936\n无所不能\t472937\n美容棒\t472938\nkinematic\t472939\nMetart\t472940\n融安\t472941\nC++2015\t472942\n待遇\t472943\n刘诺一\t472944\n延边州\t472945\njunk\t472946\n古玩\t472947\n巴咕\t472948\n今年以来\t472949\n槽钢\t472950\nForged\t472951\n外军\t472952\n收市\t472953\n活动稿\t472954\n橡胶挤出机\t472955\n847\t472956\nenthalpy\t472957\n1080_\t472958\nf367\t472959\n小畑健\t472960\n道服\t472961\n脱墨\t472962\n王自如\t472963\na=\t472964\n跃入\t472965\n一切顺利\t472966\n领路人\t472967\n宁建\t472968\nLENGTH\t472969\n建国路\t472970\n国本\t472971\n东方证券\t472972\n第32届\t472973\n金橙子\t472974\n安康通\t472975\n25位\t472976\n九洲药业\t472977\n鼻窦\t472978\n闭锁器\t472979\n3PC版\t472980\n古对今\t472981\n锥筒\t472982\n105%\t472983\njmsolution\t472984\nResponsibility\t472985\n百款\t472986\n氧气机\t472987\n光猫FTTx\t472988\n最大流\t472989\nPix\t472990\n优月\t472991\n铁矿\t472992\n6.5小时\t472993\n同方股份有限公司\t472994\ntrust\t472995\n李庆\t472996\nwarfare\t472997\n被吊销\t472998\n线性代数\t472999\n收收\t473000\nPrices\t473001\n大州\t473002\n活起来\t473003\nActivex\t473004\n排山倒海\t473005\n国家主义\t473006\n钱堂\t473007\nWinhex\t473008\n黄小兰\t473009\n湖南师范大学研究生院\t473010\n20140915\t473011\n北延\t473012\n学后感\t473013\nDMV\t473014\n立方分米_\t473015\n忘了爱\t473016\n新歌声\t473017\n3.1.4\t473018\n幸免\t473019\n红烧羊肉\t473020\n网上报名系统\t473021\n陈圆圆\t473022\n一个13岁\t473023\n我是一只小虫子\t473024\n高飞\t473025\n程晓玥\t473026\n赣县区人民政府\t473027\n姜允儿\t473028\n洛阳西工区\t473029\nAnaconda2\t473030\n浙江省地方税务局\t473031\n4.01\t473032\n机动车登记规定\t473033\n剃须刀\t473034\n钥匙\t473035\n养生菜\t473036\ndocker-ce\t473037\nygopro\t473038\n入党转正申请书\t473039\n漕河\t473040\n纸上谈兵\t473041\n排卡\t473042\ngotten\t473043\n诱惑\t473044\n海扁王\t473045\n面市\t473046\n开蒙\t473047\n孤魂\t473048\n中库\t473049\n新世纪福音战士新剧场版\t473050\nGB50204\t473051\n生石花\t473052\n特许权\t473053\n几班\t473054\n青岛市海慈医疗集团\t473055\n王虫\t473056\n15门\t473057\n浙江机电职业技术学院\t473058\n恒压恒流\
t473059\n三林\t473060\n安川电机\t473061\n3825\t473062\n焦虑症\t473063\n绿灯\t473064\nfutures\t473065\n几_39\t473066\n品橙旅游\t473067\n至诚财经网\t473068\n一台\t473069\nduke390\t473070\n武汉网\t473071\n磁\t473072\nSALE\t473073\n小虾\t473074\n四叶草\t473075\n杭州市第三人民医院\t473076\n小腹胀痛\t473077\nU形管\t473078\n厦门口腔医院\t473079\ngavin\t473080\n合味道\t473081\n多科\t473082\n疑云\t473083\nvrtm\t473084\n百弗\t473085\n黄忆慈\t473086\n马斯特里赫特\t473087\n慕容云海\t473088\n上午7点\t473089\n退火\t473090\n28岁\t473091\n鬼夫\t473092\n异型材\t473093\n家用水\t473094\n仙师\t473095\ngayporn\t473096\n根堆\t473097\naffinity\t473098\nIMSLP\t473099\nideaiu\t473100\n美凤\t473101\nSemester\t473102\n土左旗\t473103\nssid\t473104\n铁门关\t473105\n岳母\t473106\n忆年\t473107\n中国四大银行\t473108\n好又多超市\t473109\n巧女\t473110\n飞顿\t473111\n江苏省法院\t473112\n裱纸\t473113\n陈晓燕\t473114\n乙肝大三阳\t473115\nrelative\t473116\n草原恋\t473117\n3节\t473118\n明昊\t473119\n成都曙光医院\t473120\n菊门\t473121\n极速Office\t473122\n1915年\t473123\n大人们\t473124\n黑羽\t473125\n200dpi\t473126\n低聚肽\t473127\n江苏队\t473128\n陆安\t473129\n六盘水\t473130\n梅世强\t473131\nNoun\t473132\n步千帆\t473133\n6.3米\t473134\n邻区\t473135\n市政府\t473136\n纳垢\t473137\n顾南浔\t473138\n戚顾\t473139\n清华大学环境学院\t473140\ntemper\t473141\n内务\t473142\n灵芝孢子油\t473143\n厦门174医院\t473144\nsht\t473145\n房上\t473146\nwww.mb5u.com\t473147\n南师附中江宁分校\t473148\n东北话\t473149\nrpy2\t473150\n虾米音乐\t473151\n慢性阻塞性肺病\t473152\n袜裤\t473153\n网易邮箱帮助中心\t473154\n茜茜公主吧\t473155\nOV7725\t473156\nSJCAM\t473157\n大众传播学\t473158\n育苗杯\t473159\n郑才千\t473160\n浙江网络广播电视台\t473161\n宿务航空\t473162\n背着\t473163\n810路\t473164\n方框\t473165\n桃山\t473166\n代劳\t473167\n确认书\t473168\nsmd\t473169\n接天莲叶无穷碧\t473170\n狂医\t473171\n捷税宝\t473172\nHotels\t473173\n只愿\t473174\n大人\t473175\n上海动物园\t473176\n京东大厦\t473177\n太空旅客\t473178\n殷\t473179\n奥迪S8\t473180\nrqw\t473181\n代超\t473182\n第一黄金网\t473183\n四川省能源投资集团有限责任公司\t473184\nswww\t473185\n苏州日报\t473186\n启德留学\t473187\n秒怪\t473188\nv1.0\t473189\n牙洞\t473190\nimmortals\t473191\nABA\t473192\n曾德尔\t473193\n足智多谋\t473194\n朱晓\t473195\nOr\t473196\n册\t473197\n环太平洋1\t473198\n橄榄球\t473199\n喊单\t473200\n痛死\t473201\n腾龙\t473202\ngbb\
t473203\n吃子\t473204\n园内\t473205\n慷\t473206\n斯卡布罗集市\t473207\n32公里\t473208\n95平米\t473209\nkuaile\t473210\n妖娆花\t473211\n饮机\t473212\niBT\t473213\n反应物\t473214\n第0章\t473215\n代名词\t473216\nscratch\t473217\n46部\t473218\n林燕妮\t473219\n洪荒\t473220\n河源源城\t473221\n液晶模块\t473222\n托托莉\t473223\n恒温培养箱\t473224\n楼费\t473225\n宽版\t473226\n合唱谱\t473227\n木吉\t473228\n洛克萨斯\t473229\n招行一网通\t473230\n墩布\t473231\n王阿姨\t473232\n森海大\t473233\n法资\t473234\n洗涤剂\t473235\n昂首阔步\t473236\n电角\t473237\n选理\t473238\n宫野真守\t473239\n阿丘\t473240\nv3700\t473241\n長谷川\t473242\n前列舒乐胶囊\t473243\n至强E3\t473244\n于湉\t473245\n亚马尔\t473246\n1949\t473247\nEC6108V9\t473248\nradios\t473249\n卢曼娜\t473250\n六边\t473251\nProgrammers\t473252\nMFA\t473253\n正对\t473254\nZYZ\t473255\n母乳片\t473256\n龙角散\t473257\n波轮式\t473258\n法格\t473259\nAV小四郎\t473260\n废钢\t473261\n15式\t473262\n路人女主的养成方法\t473263\n东欧剧变\t473264\n点购\t473265\n无兄弟\t473266\n2016~2017年\t473267\n一二\t473268\nmassive\t473269\n星海城\t473270\n松岛葵\t473271\n被动语态_\t473272\nans\t473273\n植料\t473274\n海南省\t473275\nQ&A\t473276\n摇蜜机\t473277\n法媒\t473278\n5458\t473279\n朴叙俊\t473280\nSchoolgirl\t473281\nApparel\t473282\nAHU\t473283\n课部\t473284\n鲁能星城\t473285\n惹来\t473286\n新浪安\t473287\n中远海运发展股份有限公司\t473288\n2篇\t473289\n九龙至尊\t473290\nhtc吧_\t473291\n绿色公路\t473292\n王庆坨镇\t473293\n纳甲筮法\t473294\n印台区政府\t473295\n双溪村\t473296\n斯帝卡\t473297\n三角债\t473298\nzzzz\t473299\n颜体\t473300\n青年路\t473301\n电木板\t473302\n达州晚报\t473303\n活塞泵\t473304\n点映\t473305\n百度直通车\t473306\n松山区\t473307\n9门\t473308\nakon\t473309\n化学通报\t473310\n产后抑郁症\t473311\n社团\t473312\n胡琳\t473313\nCourseWare\t473314\n第171集\t473315\n精生子\t473316\n深圳证监局\t473317\n蟊贼\t473318\n子宫颈\t473319\n长安欧诺中控锁\t473320\ncooki\t473321\n姚元浩\t473322\n语无伦次\t473323\n钻洞\t473324\n疏影路\t473325\n潜行狙击\t473326\n中空纤维超滤膜\t473327\nCunt\t473328\n老调\t473329\n重庆市城市管理委员会\t473330\nyoutu\t473331\nPRO\t473332\n王玄策\t473333\n景俊海\t473334\n成都市经济和信息化委员会\t473335\n星际传奇\t473336\nex1000\t473337\ncebu\t473338\n创形者\t473339\n周刚\t473340\n玄灵\t473341\n蛇喰梦子\t473342\n6.5.1\t473343\nfrc\t473344\n新水浒q传\t473345\n南山花园\t473346\n克莱谛\t473347
\n徐建军\t473348\nCOSCO\t473349\n张狂\t473350\n度夏\t473351\n斗图表情包\t473352\n气性\t473353\n鸡丁\t473354\nBLE\t473355\n2L\t473356\n收视率\t473357\nWebshell\t473358\n自动炒菜机\t473359\n针织衫\t473360\n主题语\t473361\n7双\t473362\n水灯节\t473363\n汉街\t473364\n外伤\t473365\nvocaloid3\t473366\n最后一页\t473367\n油酥\t473368\n膜状\t473369\n随形\t473370\n蛋超人\t473371\n境遇\t473372\newb\t473373\n上海申龙客车有限公司\t473374\n进场\t473375\n巨峰镇\t473376\n小米手机1/1S\t473377\n富贵男与贫穷女\t473378\n千套\t473379\n沈阳分公司\t473380\n上冈镇\t473381\n不异\t473382\n海伦·凯勒\t473383\nkendall\t473384\n微攻略\t473385\nblackmores\t473386\n成一列\t473387\n四川建设人才网\t473388\n380X\t473389\nVIX\t473390\n弃暗投明\t473391\n皇寺\t473392\n第三度嫌疑人\t473393\n回落\t473394\nvpsee\t473395\n欧巴\t473396\n1月6日\t473397\n主席团\t473398\n江苏建筑职业技术学院\t473399\n覆杯\t473400\nemo\t473401\n腰腿\t473402\nDMZX\t473403\n氯化钾注射液\t473404\n羽旋\t473405\n1288\t473406\n点播券\t473407\nTIPS\t473408\n钓龙虾\t473409\n解决商\t473410\n棺匠\t473411\nboshi\t473412\n魔法堂\t473413\n重生之都市修仙\t473414\nLV8\t473415\n.net4.0\t473416\n排屋\t473417\nDRY\t473418\n绿色下载站\t473419\n淘宝菜鸟\t473420\nS1000\t473421\n广东风华高新科技股份有限公司\t473422\n扛得住\t473423\n垄断性\t473424\n搔首弄姿\t473425\n港币汇率一览_银行信息港\t473426\n填实\t473427\n搬运\t473428\n力场\t473429\n有你\t473430\n口腔壁\t473431\n图特\t473432\nc库\t473433\n智者\t473434\n大话西\t473435\n7562\t473436\n第二格\t473437\n全体型\t473438\n那一夜\t473439\n八兄弟\t473440\n800dpi\t473441\nconventions\t473442\n陕西财经职业技术学院\t473443\n建大\t473444\n经济赔偿金\t473445\ncleansing\t473446\n古建\t473447\n雷美诺\t473448\nIngress\t473449\n泰王\t473450\n圣枪游侠\t473451\nANZ\t473452\n巩膜\t473453\n井伊直虎\t473454\n坐板\t473455\n乱价\t473456\n比热\t473457\n114啦\t473458\nblond\t473459\n全新胜达论坛\t473460\n远供\t473461\n电动机\t473462\n2018-2020年度\t473463\nvpm\t473464\n十九日\t473465\n武林英雄\t473466\n陈小奇\t473467\n逃难\t473468\n芡\t473469\n陈钢\t473470\n主角儿\t473471\n大岭村\t473472\n宋dm\t473473\n西咸新区管委会\t473474\nwether\t473475\n金洋娱乐\t473476\n山本五十六\t473477\n五绝\t473478\n等差数列_\t473479\n麻尾\t473480\n9升\t473481\n扫楼\t473482\n双色球预测网\t473483\n群p\t473484\n国盛金控\t473485\n7周\t473486\n压滤机\t473487\n化蛹\t473488\nmac视频剪辑软件\t473489\n周建旭\t473490\n烟悦\t4734
91\n挖洞\t473492\n在线咨询_求医网\t473493\n不分家\t473494\n七牛\t473495\n吴店镇\t473496\n内衣裤\t473497\n敌营\t473498\n充币\t473499\n脚毛\t473500\n知识产权贯标\t473501\n498\t473502\n遥控飞机\t473503\n城河\t473504\n少年闰土\t473505\n剪短\t473506\n_百田知了_百田网\t473507\neview\t473508\n2017-2018年度\t473509\n第833集\t473510\n杨洪\t473511\n山里汉\t473512\n原声音\t473513\n沐雪柔\t473514\n中美科技\t473515\n苏九儿\t473516\n迈威\t473517\n四君子汤\t473518\n八十万\t473519\n上海大学管理学院\t473520\n腰里\t473521\n汪雨樵\t473522\nmAb\t473523\nton\t473524\n注协\t473525\n听琴\t473526\n冻饺子\t473527\n两院\t473528\nQQTZ综合社区\t473529\nIUNI\t473530\n弯针\t473531\n迪斯\t473532\n考研教育网\t473533\n剪切版\t473534\nCADSee\t473535\n结息日\t473536\n十九大考试\t473537\n哎呀呀\t473538\n张新华\t473539\n3艘\t473540\n发泡\t473541\n对照表\t473542\n服从\t473543\nodr\t473544\n三溪村\t473545\n3LCD\t473546\n家用燃气报警器\t473547\n八十年前\t473548\nGLAY\t473549\n闷痛\t473550\niturn\t473551\nWin32位\t473552\n羽毛球单打\t473553\n唐三小舞\t473554\n扬雄\t473555\n匹配\t473556\n爱问知识网\t473557\n近地\t473558\nHELIX\t473559\n纵队\t473560\n野马逍客\t473561\n庖丁解牛\t473562\nluhan\t473563\n天地人\t473564\n郭巷\t473565\n国书\t473566\n九龙山庄\t473567\n井壁\t473568\n好饿的小蛇\t473569\n鲜卑人\t473570\nzemax\t473571\n宣化区\t473572\n客兼职\t473573\n轮奸\t473574\nkdz\t473575\n嘉年华2017\t473576\nwin94\t473577\n妖怪公寓\t473578\n兵器\t473579\n歼-31\t473580\n雷探长\t473581\n第73届\t473582\n乔麦\t473583\n扶余市\t473584\n嘎达\t473585\nONLY\t473586\nCypress\t473587\n放免\t473588\n飞\t473589\n名词所有格\t473590\n杜姗姗\t473591\n纳扎尔巴耶夫\t473592\n白凤九\t473593\n7张\t473594\nCPIM\t473595\n20150925\t473596\n班淑传奇\t473597\n新阶层\t473598\nVDL\t473599\n关联\t473600\n八次方\t473601\n建变\t473602\n华运\t473603\n首尔站\t473604\n厦门长庚医院\t473605\nbjx\t473606\n人人链\t473607\n半路\t473608\nckeditor\t473609\n曾国藩\t473610\n皮下囊肿\t473611\n9册\t473612\n路从今夜白\t473613\n万里扬\t473614\n近义\t473615\n女屌\t473616\nAlma\t473617\n建标\t473618\n小白兔\t473619\n打的\t473620\n融侨半岛\t473621\n车源\t473622\nEM算法\t473623\ntapestry\t473624\n石山\t473625\n深思熟虑\t473626\n胡大\t473627\nTrim\t473628\nDUPLICATE\t473629\n瓦奥\t473630\n查扣\t473631\n韻府\t473632\n飞叠杯\t473633\n捂住\t473634\ntarget\t473635\n奖励\t473636\n我的小公主\t473637\n三月初九\t473638\nt
xy\t473639\n干粉\t473640\n打坏\t473641\nanalysis\t473642\n新东方泡泡少儿英语\t473643\no型腿\t473644\n牛奶\t473645\n名宴\t473646\n0110\t473647\n刘雪松\t473648\n丧心病狂\t473649\n1879\t473650\n韩湘子\t473651\n食宝街\t473652\n递延所得税\t473653\n惠企\t473654\n人物画\t473655\n腾讯游戏帮帮_帮帮\t473656\n杨晓燕\t473657\n智\t473658\nnah\t473659\njavassist\t473660\n测卷\t473661\nAble\t473662\n白果村\t473663\n将就\t473664\n车铣复合机床\t473665\n江山恶魔囚笼\t473666\n5ms\t473667\nCompetence\t473668\n猫粮\t473669\n蛛网膜囊肿\t473670\n巴德士\t473671\n汉语学习\t473672\n旁系血亲\t473673\n自由自在\t473674\nPS4/PS3\t473675\n绩溪县\t473676\n河姆渡\t473677\n6合\t473678\n称上\t473679\n私利\t473680\n43_\t473681\n曹睿\t473682\n可爱多\t473683\nc罗\t473684\nN9005\t473685\n鄂尔多斯市\t473686\n总代理商\t473687\n0.8g\t473688\n宿管员\t473689\n禁赌\t473690\n偷偷摸摸\t473691\netcd\t473692\nSpark2\t473693\n姬惊雷\t473694\n南昌市环保局\t473695\n5分钱\t473696\n投促\t473697\n20172018\t473698\n高性\t473699\n颐\t473700\nsetState\t473701\n热水管\t473702\n丨\t473703\n上会\t473704\n走边\t473705\nspindle\t473706\n二十四节气\t473707\n冰宫\t473708\nCPU卡\t473709\n笔印\t473710\nbits\t473711\n康派\t473712\n无名王者\t473713\n稀土矿\t473714\n无锡东站\t473715\n印度卢比\t473716\n深切\t473717\nSuriFuture\t473718\n列兵\t473719\n高胡\t473720\n辊机\t473721\n孝感网\t473722\n眼影\t473723\n19年后\t473724\n台长\t473725\n开淘\t473726\n钦定\t473727\n20170413\t473728\n蓝天飞行队\t473729\n电电\t473730\n标准表\t473731\n陈双\t473732\n【川久保玲\t473733\n塔\t473734\n虎式坦克\t473735\n盗文\t473736\n21CN.COM\t473737\n泰康人寿保险公司\t473738\n泾渭\t473739\n九宫格火锅\t473740\n简要\t473741\n黄泰华\t473742\n一楼\t473743\n一万\t473744\n脑血管堵塞\t473745\n蟹道乐\t473746\n唐唐\t473747\n椒草\t473748\n西村由纪江\t473749\n吴玮\t473750\n总市值\t473751\n微笑面对\t473752\n天门人\t473753\n云南开放大学\t473754\n加里·奥德曼\t473755\n触手play\t473756\n林安琪\t473757\n五亿探长雷洛传\t473758\n安瑞\t473759\n漂流\t473760\n5.99\t473761\n臻颜\t473762\n赛摩电气\t473763\n维真\t473764\n沙棘油\t473765\n报税期\t473766\n1KV\t473767\n嘉实多极\t473768\nscheduling\t473769\n笃学\t473770\n林枫\t473771\n电动式\t473772\n差热分析\t473773\n小帽\t473774\n舒芬太尼\t473775\nDefenders\t473776\n瘦身贴\t473777\n老巢\t473778\n美高梅\t473779\n常艳\t473780\n契尔氏\t473781\n海城\t473782\n缅因猫\t473783\n霉\t473784\n飞时达\t47378
5\n服饰币\t473786\n西安公园\t473787\nkukuw\t473788\n一下午\t473789\nDataSheet\t473790\n柏油\t473791\n成材\t473792\n子母公司\t473793\n龙凤斗\t473794\n权重值\t473795\n2079\t473796\n联想Y430p\t473797\n机电实务\t473798\n程序段\t473799\nalimama\t473800\n化学史\t473801\n黄平县\t473802\n融创金成江南府\t473803\n拉杰\t473804\n中国石油国际事业有限公司\t473805\n变形金刚G1\t473806\n天房发展\t473807\n龙窟\t473808\n哈莉·奎茵\t473809\n广味\t473810\n瀚叶股份\t473811\n89623674\t473812\n旗头\t473813\nsvmpredict\t473814\n风行CM7\t473815\n速印机\t473816\n金丝鸟\t473817\n极限竞速\t473818\n瘤子\t473819\nPE启动盘\t473820\nelmer\t473821\n女虐\t473822\n太帅\t473823\n铁镐\t473824\nNob\t473825\n头孢地尼分散片\t473826\nehcache\t473827\n离合片\t473828\n红外\t473829\nctrl+t\t473830\ninfluenced\t473831\n小马拉\t473832\n荆门日报\t473833\n阿启\t473834\n腐败案\t473835\n悦酷\t473836\n免面签\t473837\n专断\t473838\n战国兰\t473839\nHelena\t473840\n死得其所\t473841\n河北省发展和改革委员会\t473842\n陈邦\t473843\n奉顺\t473844\n干纹\t473845\n3C_新民网\t473846\n杰西·利弗莫尔\t473847\n小汽\t473848\n梦幻西游长安保卫战\t473849\n英文名\t473850\n特警力量\t473851\nbasement\t473852\ntop10_\t473853\nLearned\t473854\n5.3a\t473855\nVisible\t473856\n吴文辉\t473857\nhcc\t473858\n查复\t473859\n鼠目寸光\t473860\n点唱机\t473861\n一百多个\t473862\ntimber\t473863\n中国博士后科学基金会\t473864\n渗透率\t473865\n十二大战\t473866\n温总理\t473867\n拉扯感\t473868\n一床\t473869\n肩扛式\t473870\n创新之路\t473871\n简编\t473872\nBE\t473873\nprotel\t473874\n荣耀小口哨\t473875\nv5.2.2\t473876\nJohn-51CTO\t473877\n高速分散机\t473878\n祸乱\t473879\n钱大昕\t473880\n城西客运站\t473881\n库鸟\t473882\nbuttom\t473883\n10000个\t473884\n涩片\t473885\nexcel透视表\t473886\n上海复大医院\t473887\n暖文\t473888\n肥红\t473889\n浔阳区\t473890\njui\t473891\n锡珠\t473892\n爱才\t473893\n台历\t473894\n飞石\t473895\nWSO2\t473896\n丰巢快递\t473897\n破坏神\t473898\n非接触式\t473899\n臂包\t473900\n防爆膜\t473901\n马克思主义法学\t473902\n回线\t473903\n三茅人力\t473904\n20170523\t473905\n中国能源网\t473906\n51CTO技术博客\t473907\n影音先锋资源站日夜撸\t473908\nbci\t473909\n避蚊胺\t473910\nminmin\t473911\n上海二中院\t473912\n信守承诺\t473913\n吉隆镇\t473914\n动力柜\t473915\n写写\t473916\n2档\t473917\n大番长\t473918\n6525\t473919\n着\t473920\n民营系\t473921\n布兰特\t473922\n口袋妖怪究极绿宝石\t473923\n1234567890\t473924\n127个\t47392
5\n巾帼枭雄之义海豪情\t473926\n足底\t473927\n销售代理商\t473928\n粲然\t473929\nRifle\t473930\nJUI\t473931\n一汽丰田4s店\t473932\n斯巴鲁WRX\t473933\n端倪\t473934\n京介\t473935\nSuicide\t473936\n调峰\t473937\n引下\t473938\n淮网\t473939\n军事学院\t473940\nMONGODB\t473941\n鼠式\t473942\n南四湖\t473943\n名园\t473944\nЧ\t473945\n香海\t473946\n铅锭\t473947\n雪月\t473948\n善志\t473949\n晋宁\t473950\n11.9\t473951\n经济参\t473952\ncrystals\t473953\n水千丞\t473954\n徐小湛\t473955\n攀缘\t473956\n可悲\t473957\njiayi\t473958\nbelieves\t473959\n坂茂\t473960\n阳曲县\t473961\n5027\t473962\n四川省信访局\t473963\n新里城\t473964\n15升\t473965\n1929\t473966\n零0\t473967\n必和必拓\t473968\n阿杰\t473969\n红胡\t473970\n循化\t473971\n地磅秤\t473972\n北京教育科学研究院\t473973\n短息\t473974\n陈俊杰\t473975\n加动\t473976\n淫事\t473977\n三阴\t473978\n魏风\t473979\n赤城县\t473980\n液膜\t473981\n辰东\t473982\nn76e003\t473983\n屡犯\t473984\n养猪巴巴网\t473985\n境签\t473986\n洲际酒店\t473987\n法基\t473988\n苏霍伊\t473989\n丑恶\t473990\nZoey\t473991\n从前慢\t473992\n黑暗森林法则\t473993\n小猿搜题\t473994\n荔湾广场\t473995\ntushare\t473996\nGUN\t473997\nUME国际影城\t473998\n邵佳\t473999\n阿九\t474000\n堆存\t474001\nCodeigniter\t474002\n拓路\t474003\n奥尼玛\t474004\n仙玉\t474005\n收钱\t474006\n南北朝时期\t474007\nGenetic\t474008\n洗砂机\t474009\n微笑天使\t474010\n白苹果\t474011\nRobotFramework\t474012\n站所\t474013\n蝶豆花\t474014\n新锐志\t474015\n机动车辆保险条款\t474016\nbert\t474017\n涉农贷款\t474018\n雪晴\t474019\n豆腐块\t474020\nFin\t474021\nminmax\t474022\nVERSION\t474023\n低电\t474024\n复旦大学新闻学院\t474025\n宏压枪\t474026\n度值\t474027\n邯郸银行\t474028\n签栏\t474029\n北者\t474030\n秒杀\t474031\n港货店\t474032\n许纪霖\t474033\n皇上命我来选妃\t474034\n文物\t474035\n温带\t474036\n听阈\t474037\n3415\t474038\n长濑\t474039\n上海东航\t474040\n佳琪\t474041\nbareback\t474042\n展暨\t474043\nNUXT\t474044\n冬恋\t474045\nTyping\t474046\n山大王\t474047\n基金转换\t474048\n甘洛县\t474049\n槐花节\t474050\n谨防\t474051\n10~20\t474052\nPsi\t474053\n郴州市政府\t474054\n航季\t474055\n5.51\t474056\n意料之外\t474057\nengage\t474058\n二角\t474059\n龙泉山城市森林公园\t474060\ncp3\t474061\nk375s\t474062\n仁清法师\t474063\n氟硅酸钠\t474064\ncrowded\t474065\n金水河\t474066\n弥猴桃\t474067\n复苏\t474068\ngazette\t474069\nHD3000\t474070\n张曼源\t4
74071\n94.0\t474072\nag捕鱼王\t474073\n华工\t474074\n50则\t474075\n3070\t474076\n陈国\t474077\npussy\t474078\njointjs\t474079\nRTW\t474080\n电吉\t474081\n魅族pro7plus\t474082\n终字\t474083\n圣荷西\t474084\n三才五格\t474085\n4月12号\t474086\n兴民智通\t474087\n科讯广电网\t474088\n科大讯飞股份有限公司\t474089\n700分\t474090\n野兽人\t474091\n人不可貌相\t474092\nnanodrop\t474093\n看不起\t474094\n耽美文库\t474095\n佳能700d\t474096\n腐殖土\t474097\n平阴\t474098\nSantana\t474099\n两定\t474100\n广东省知识产权局\t474101\nNate\t474102\n试图\t474103\nDNG\t474104\n车镜版\t474105\n永柏\t474106\n老山\t474107\n6次方\t474108\n恶道\t474109\n晌午\t474110\n再遇不到你这样的人\t474111\n霞\t474112\n简笔画\t474113\n三圣乡\t474114\ndhp\t474115\n基础库\t474116\nMinolta\t474117\nEX2\t474118\n温哥华冬奥会\t474119\n五尺\t474120\n顾忌\t474121\n世界艾滋病日\t474122\n埃克森美孚\t474123\n武侯\t474124\n神枪\t474125\n倒灌\t474126\n宋青书\t474127\n网易CC\t474128\n实变\t474129\n亚洲娱乐\t474130\nround\t474131\n中经网\t474132\nuv漆\t474133\n三隅\t474134\n矩母函数\t474135\n美人鱼村\t474136\n淘宝盗图\t474137\n广州市知识产权局\t474138\n被盗案\t474139\nlipstick\t474140\nlum\t474141\n5360\t474142\n如兰\t474143\n省质监局\t474144\n固始网\t474145\nArtistic\t474146\n1200mg\t474147\nestablished\t474148\nandme\t474149\n四川省交通运输厅道路运输管理局\t474150\n钟楼\t474151\n中国跳水队\t474152\n白水寨\t474153\n卤制\t474154\n新婚夜\t474155\n劳动仲裁委员会\t474156\n日文\t474157\n酷威\t474158\nfreeMarker\t474159\npbr\t474160\n默罕默德\t474161\nnismo\t474162\noffset\t474163\nPHICOMM\t474164\n500场\t474165\noverhead\t474166\n盗火者\t474167\nmere\t474168\nditf\t474169\n字典库\t474170\n趋光性\t474171\n用户表\t474172\n叼烟\t474173\n知本\t474174\n元青花瓷\t474175\n石嘴山市政府\t474176\n梦幻西游2\t474177\n张建峰\t474178\n红米3s\t474179\nExodus\t474180\n小苹\t474181\n普法栏目剧\t474182\nutah\t474183\nlogstash\t474184\n线人\t474185\n怀集县人民政府\t474186\n改接\t474187\nXYQ\t474188\n湖北省孝感市财政局\t474189\n中国人民大学艺术学院\t474190\n宏丰\t474191\n巨沼\t474192\n回兴\t474193\n洗肠\t474194\n地球日报\t474195\n傅颖\t474196\n电热锅\t474197\n高尔夫6论坛\t474198\n蟹黄包\t474199\n正妻\t474200\n征婚\t474201\n博物志\t474202\n类似爱情\t474203\n重生点\t474204\nDF\t474205\naer\t474206\n算什么\t474207\n枫叶国际学校\t474208\n转制\t474209\n兰克\t474210\nBUILDER\t474211\n甚矣\t474212\n深衣\t
474213\n蒋劲夫\t474214\n花生剥壳机\t474215\nAllowed\t474216\n塞班岛\t474217\n法袍\t474218\n十二张\t474219\n维娜芬\t474220\n拒交\t474221\n拈花一笑\t474222\n恋人们\t474223\n副\t474224\n深圳湾体育馆\t474225\n反馈\t474226\nGG\t474227\n城市街道\t474228\n长城C30\t474229\n抵退\t474230\n村雨\t474231\n愛奇藝\t474232\n美丽心灵的永恒阳光\t474233\n一挡\t474234\nn3160\t474235\n旅团\t474236\n夜郎\t474237\n米雅\t474238\n南山科技园\t474239\n温州眼视光医院\t474240\nios8.0\t474241\n好臭\t474242\nbias\t474243\nruss\t474244\n合创\t474245\n三国奇侠传\t474246\n丰台镇\t474247\n逸出功\t474248\n张旺\t474249\n商城县人民政府\t474250\n全国中等职业学校\t474251\n叮铃铃\t474252\n5一个\t474253\n214号\t474254\n3043\t474255\nrhino6.0\t474256\n赤峰地区\t474257\n5环\t474258\n省物价局\t474259\n读后感网\t474260\nstupid\t474261\nxaf\t474262\n丰田雷凌双擎\t474263\n2217\t474264\n温和\t474265\n卷帘\t474266\n党纪\t474267\n戒指盒\t474268\n下跌\t474269\n定向流量\t474270\n可凡\t474271\n异彩\t474272\ndoa5\t474273\nMOMO\t474274\n碎碎\t474275\n新农村\t474276\n上风\t474277\n魂香\t474278\n造梦西游外传\t474279\nweib\t474280\n杨四郎\t474281\nHRA\t474282\n五子登科\t474283\n高职类\t474284\n真高跟\t474285\nAsher\t474286\n教育法\t474287\nuin\t474288\n4dx\t474289\n先锋影音先锋\t474290\n广东自贸区\t474291\n仲悦\t474292\n中新知识城\t474293\n钱包\t474294\n铜制\t474295\nPrt\t474296\n刘彬\t474297\n50条\t474298\n蓝河\t474299\n买断式回购\t474300\n辽宁省博物馆\t474301\n千分之五\t474302\n押尾桑\t474303\n连喷\t474304\n_焦作日报数字报_焦作网WWW.JZRB.COM\t474305\n阿卡玛\t474306\n大小时\t474307\n定选\t474308\n中海锦江城\t474309\n限行尾号\t474310\n柏子养心丸\t474311\n玉髓\t474312\n蚁兽\t474313\n福特级\t474314\n乔一帆\t474315\n6.12\t474316\n湟里\t474317\nNX\t474318\n49年后\t474319\nYelp\t474320\n到生\t474321\n不秒\t474322\n误触\t474323\n乳文\t474324\n12平方米\t474325\nMovist\t474326\n36千米\t474327\n郭溪\t474328\n郧阳网\t474329\n横拍\t474330\n景天\t474331\nhash索引\t474332\n1600X\t474333\n厄贝沙坦\t474334\n环球交叉点\t474335\nHe\t474336\nincludes\t474337\n一【\t474338\n永久居住证\t474339\n脱皮机\t474340\n31关\t474341\nExchange2010\t474342\n通天塔\t474343\n临泉县\t474344\n冰熊\t474345\ntomahawk\t474346\n管段\t474347\n昌建\t474348\n影音先锋日夜撸影院\t474349\nMellow\t474350\n糖糖\t474351\n制霸\t474352\n最主要\t474353\n摩瑞\t474354\n工程测量学\t474355\n阜宁县\t474356\n张志凤\t474357\n综合金融\t474358\
n20170424\t474359\nBOOTMGR\t474360\n五一公司\t474361\n三国志曹操传\t474362\n西南证券股份有限公司\t474363\n精功\t474364\n卷筒\t474365\n奥图码投影机\t474366\n米咪\t474367\nPla\t474368\n娇与志明\t474369\n热咳\t474370\ntalents\t474371\n惠州地铁1号线\t474372\n1955年\t474373\n图片网TUKU.CN\t474374\nhdtune\t474375\n深圳地铁13号线\t474376\nBBC纪录片\t474377\nsaturated\t474378\n2018048期\t474379\n吾民\t474380\n52农家乐网\t474381\n大花脸\t474382\n李占英\t474383\nrakuten\t474384\n药包\t474385\n星海中学\t474386\n华澳\t474387\n桂林街道\t474388\n甩手舞\t474389\n电子万能试验机\t474390\n东方快车\t474391\n梁笙\t474392\nAutoHotKey\t474393\n氧化物\t474394\n国网人才评价中心\t474395\n聂瑶\t474396\n预订员\t474397\n亳州学院\t474398\nAmount\t474399\n油轨\t474400\n天津红桥\t474401\nlonglong\t474402\n凯迪克\t474403\nrvvp\t474404\n前场\t474405\n茕茕孑立\t474406\n上河园\t474407\n3112\t474408\n找一个人\t474409\n拷边机\t474410\n冬雷\t474411\n干咳嗽\t474412\n邵武在线\t474413\nDDoS攻擊\t474414\n知会\t474415\n众盈\t474416\nNaka\t474417\n投诉书\t474418\n宜昌市夷陵区人民政府\t474419\n吉尔莫·德尔·托罗\t474420\n宁蒗\t474421\n南京信息职业技术学院\t474422\nCAD快捷键\t474423\nスタ\t474424\n寰球\t474425\n红岛经济区\t474426\nstacy\t474427\n龙骨架\t474428\nVJ师网\t474429\n健身/补剂/运动\t474430\nBasis\t474431\n油盘\t474432\nGT2\t474433\nMarkup\t474434\nDirectory\t474435\n各县\t474436\nNSArray\t474437\nDrools\t474438\n规划局\t474439\nwwwroot\t474440\n刀剑神域虚空断章吧\t474441\n法恩莎瓷砖\t474442\n第19季\t474443\nDLL函数\t474444\n110天\t474445\n加法\t474446\n南心\t474447\n重复率\t474448\n一个一\t474449\n民信贷\t474450\n篮球史\t474451\n一探究竟\t474452\n星号密码查看器\t474453\n狙击精英3\t474454\ndos启动盘\t474455\n反正\t474456\n叶知秋\t474457\n巡诊\t474458\n克鲁赛德\t474459\n乐龄\t474460\n忘了我\t474461\nOriginLab\t474462\n夸父逐日\t474463\n顿号\t474464\n毁灭之锤\t474465\n陶朱\t474466\n我的哥\t474467\n其华\t474468\n601899\t474469\nCPU\t474470\n马克波罗\t474471\n不可数名词\t474472\nx86_x64\t474473\n闸杆\t474474\n师父\t474475\n裸花紫珠胶囊\t474476\n周六下午\t474477\n省民宗委\t474478\n玉桥\t474479\nAthlete\t474480\nfx\t474481\n拼命\t474482\n贵友\t474483\n大佐\t474484\n罗兰·巴特\t474485\n君子好逑\t474486\nhping3\t474487\n重庆银行\t474488\n布鲁斯·韦恩\t474489\n庞门\t474490\n空挡\t474491\n1033\t474492\npowershot\t474493\n飞涨\t474494\npaste\t474495\n千里之行\t474496\n㎜\t474497
\n调修\t474498\n李斯忠\t474499\n佛山法院\t474500\n移花宫\t474501\n姚俊\t474502\n细纲\t474503\n餐餐\t474504\n百层\t474505\n鹏华银行\t474506\n毕导\t474507\n苏韵\t474508\n周老师\t474509\n宋亮\t474510\n舞弊案\t474511\n2016年端午节\t474512\n73_\t474513\n三千块\t474514\n共沸物\t474515\n育子\t474516\n福克斯特\t474517\natt\t474518\nimplementation\t474519\nvolley\t474520\ndelegation\t474521\n牙龈癌\t474522\n执异\t474523\n市厅级\t474524\n组装机\t474525\n南通广播电视台\t474526\nmiwifi\t474527\n频段\t474528\n促红细胞生成素\t474529\n利刃\t474530\nsscp\t474531\n鏡頭\t474532\n周公度\t474533\n和凤镇\t474534\n2018年1月23日\t474535\nMetasploit\t474536\n贝子\t474537\n宝鸡站\t474538\n中国台球协会\t474539\n富士河口湖\t474540\n黄土店\t474541\n补助金\t474542\n名小\t474543\n中华人民共和国交通部\t474544\n宝瞳\t474545\nq9650\t474546\n陈贵\t474547\n果靖霖\t474548\nexpiry\t474549\nprepare\t474550\njiren\t474551\n农业转基因\t474552\n梦恋\t474553\n埋式\t474554\n锤基\t474555\nmanagerial\t474556\n17款\t474557\n征收率\t474558\n为你歌唱\t474559\n洲泉镇\t474560\n阿轲\t474561\n年轻人\t474562\n48秒\t474563\n丰年\t474564\n谱点\t474565\n风华网\t474566\n漳泽电力\t474567\n95558\t474568\ngetOutputStream\t474569\n薄瓜瓜\t474570\n安全管理网\t474571\n机仓\t474572\n怒海争锋\t474573\n犬吠\t474574\n坐车\t474575\n登高望远\t474576\n蜡皮\t474577\nRecharge\t474578\n惠若琪\t474579\n随机性\t474580\n赛驰\t474581\n兰溪市政府\t474582\nsweep\t474583\n爱立方\t474584\n昂宝\t474585\nx+6\t474586\n乡镇司法所\t474587\n360随身wifi\t474588\n聚酯\t474589\n77条\t474590\n盗匪\t474591\nglc200\t474592\n小玩\t474593\n地毯式\t474594\n系统函数\t474595\nDemo版\t474596\n室上性心动过速\t474597\n宝嘉誉府\t474598\n舒拉\t474599\n此片\t474600\nAfterShokz\t474601\n70公里\t474602\n颐高广场\t474603\n定型机\t474604\n滤毒\t474605\n孤身一人\t474606\n白龙马\t474607\nKnown\t474608\npasswd\t474609\n野狼獾\t474610\nGT-AC5300\t474611\n荣和\t474612\n黑恶\t474613\n有所为\t474614\n牛奶杯\t474615\n万向信托\t474616\n佳能760d\t474617\n空狐\t474618\ndvb\t474619\n山东省科学院\t474620\nSSR\t474621\nPOT\t474622\n一起又看流星雨\t474623\nReflection\t474624\n小丑回魂\t474625\n14\t474626\n三通道\t474627\n唯物史观\t474628\n张涵予\t474629\nacadiso\t474630\noaf\t474631\n超声探伤仪\t474632\nneural\t474633\n2014-2020年\t474634\n消弱\t474635\n峨嵋山\t474636\n清\t474637\nwith在线翻译\t474638\nVirtualbox
虚拟机\t474639\n奥尼稻盛和夫\t474640\n音频流\t474641\n自我修养\t474642\n元翔\t474643\n金华外国语学校\t474644\n土地卫片\t474645\n窦娥\t474646\n柔润\t474647\n专利申请权\t474648\n念珠菌龟头炎\t474649\nMIKIMOTO\t474650\n三九感冒灵颗粒\t474651\n分路器\t474652\n明泽\t474653\n美丽乡镇\t474654\nAPPLICATIONS\t474655\n回旋加速器\t474656\n王孙\t474657\n第三话\t474658\nWindows2012R2\t474659\n西班牙语学习网\t474660\n罗丹\t474661\n不能带\t474662\n投注站\t474663\n普京\t474664\n重生军\t474665\n8.8亿\t474666\n角接触球轴承\t474667\nTop100\t474668\nvue2\t474669\n世婚\t474670\nfxw\t474671\n压电晶体\t474672\nfrankie\t474673\n法司\t474674\n2017年1月14日\t474675\n90001\t474676\n700C\t474677\n山口山\t474678\n贾母\t474679\n农畜\t474680\n厄米算符\t474681\n李梦\t474682\n孺子帝\t474683\n逆爱\t474684\nidolish7\t474685\n5.6.26\t474686\n硅谷天堂资产管理集团\t474687\n安徽卫视\t474688\n黄继\t474689\n吴红民\t474690\n纵月\t474691\nMOS场效应管\t474692\n龙胆草\t474693\nRealistic\t474694\n文化交流\t474695\n死亡审判场\t474696\n侧盖\t474697\n辅材\t474698\n3630\t474699\n邀请赛\t474700\n庚申\t474701\n楼月\t474702\n肉屋\t474703\n喜马拉雅听书\t474704\n16.8万\t474705\n刘流\t474706\n一对多\t474707\n玉龙雪山\t474708\n喝起来\t474709\n杭州地区\t474710\n柯特\t474711\n零基础学英语\t474712\n商品类\t474713\n爱咲\t474714\n雄关漫道真如铁\t474715\ni7-6700\t474716\n盲证\t474717\n阴沉木\t474718\n加工商\t474719\n挫伤\t474720\n盘县\t474721\nTRD\t474722\n黄美英\t474723\n笑看风云\t474724\n嘉美\t474725\n花井\t474726\nSamWeb\t474727\nwww.17500.cn\t474728\n家属\t474729\n塔尼亚\t474730\n保时捷帕拉梅拉\t474731\n富农\t474732\n曾雪菜\t474733\n河头村\t474734\n翻一番\t474735\n嘻嘻嘻\t474736\nrdo\t474737\n2次\t474738\nMama\t474739\ntension\t474740\n老宅\t474741\n荣威RX5/RX5\t474742\ntomc\t474743\n小悪魔\t474744\n稽查员\t474745\ndigest\t474746\n神速\t474747\n正考虑\t474748\n法律援助\t474749\n两河流域\t474750\n苹果A11\t474751\n不起眼\t474752\n该项\t474753\n骗婚\t474754\n电动打蛋器\t474755\n张家界火车站\t474756\n艾兰博曼\t474757\n填缝\t474758\n站场\t474759\n伊丽莎白雅顿\t474760\n巨野吧_\t474761\n彭文\t474762\n光启\t474763\nqiche\t474764\n安徽发展研究网\t474765\n尚清\t474766\n东佳\t474767\nnum1\t474768\n兴州\t474769\n克雷斯\t474770\nharmful\t474771\nKM\t474772\n伴娘团\t474773\n第12轮\t474774\n如飞\t474775\n韶音\t474776\ncutoff\t474777\n3519\t474778\n北京市司法局\t474779\n东宁\t474780\n关键帧\t474781\n娱
乐场所\t474782\n易宝支付网\t474783\n草妈妈\t474784\n塔人\t474785\n第76\t474786\n高甜\t474787\n网站域名\t474788\ndvs\t474789\n基维斯\t474790\n范冰冰\t474791\n以小见大\t474792\n边贸\t474793\n林涵\t474794\n闪存颗粒\t474795\n脸巾\t474796\nmessage\t474797\n说话算话\t474798\n诚信为本\t474799\n炒饼\t474800\n睑黄瘤\t474801\n紫外分析仪\t474802\n巴雷特\t474803\n程序池\t474804\nTRENDIANO\t474805\ncoral\t474806\n文聘\t474807\n转类\t474808\n闽江师范高等专科学校\t474809\n10届\t474810\n物流港\t474811\n铁栏\t474812\nreading\t474813\n万安科技\t474814\niometer\t474815\n水煮果壳\t474816\n斗罗大陆全集\t474817\n墅\t474818\n安琪坦\t474819\n买单\t474820\n盛京文学网\t474821\n音乐大师课\t474822\nWin7/Win8系统\t474823\nMediaWiki\t474824\niv\t474825\n中共十六大\t474826\n襄阳五中\t474827\n宋大叔\t474828\n江苏女排\t474829\n天津同仁堂\t474830\n脑电\t474831\n维秘\t474832\nsessionFactory\t474833\n新浪辽宁\t474834\n吕正操\t474835\n元稹\t474836\n接果\t474837\n粤电力\t474838\n广发信用卡\t474839\n合同项\t474840\nthemes\t474841\n重庆新桥医院\t474842\n氩\t474843\n柴油泵\t474844\nusb转串口线\t474845\n安全套\t474846\n虚怀若谷\t474847\n保期\t474848\n咸素媛\t474849\n万道恶魔少爷别吻我\t474850\n三角\t474851\n18K\t474852\n横厅\t474853\namex\t474854\n宿营\t474855\n20波\t474856\n108将\t474857\nScat\t474858\n3月5号\t474859\n胎龄\t474860\n辣鸡\t474861\n浦东\t474862\n小瞳\t474863\n贵州省人大常委会\t474864\n哈芬\t474865\n菜饭\t474866\n成华奥园广场\t474867\n广州黄埔区政府网\t474868\n文鸟\t474869\nJ5\t474870\n多天\t474871\n酷动城\t474872\n三邦车视网\t474873\n科技有限公司\t474874\n小三班\t474875\nmSATA接口\t474876\n诟病\t474877\n0.0.6\t474878\n39元\t474879\n卢中南\t474880\n什么叫\t474881\n弘仁\t474882\n哥哥哥\t474883\n約\t474884\n湖南大学机械与运载工程学院\t474885\n大片\t474886\nClarence\t474887\n定然\t474888\n无锡市第二人民医院\t474889\n助人为乐\t474890\n凉拖鞋\t474891\nFo\t474892\n北京菜\t474893\n丰乐亭\t474894\nJDI\t474895\n三十层\t474896\n桐庐19楼\t474897\n低碳出行\t474898\n天桥区\t474899\n范炜\t474900\ncig\t474901\n男默女泪\t474902\n南方传媒\t474903\n毕游侠\t474904\n336路\t474905\n3333\t474906\n圣石\t474907\n二集\t474908\nclasspath\t474909\n文号\t474910\n个级\t474911\n齿印\t474912\n锂电新闻中心\t474913\n负离子\t474914\n便携投影仪\t474915\n恒河水\t474916\nacbd\t474917\n火速\t474918\n忘记了\t474919\na59s\t474920\nbas\t474921\nPermanent\t474922\n前缀\t474923\n黄峰\t474924\n稀便\t474925\n瓷砖\t4
74926\n柠檬蜂蜜水\t474927\nlaid\t474928\n锦西\t474929\n兰索拉唑肠溶胶囊\t474930\n附选\t474931\n0042\t474932\n高台寺\t474933\n瓣坛枫\t474934\n砥砺\t474935\nMer\t474936\nlocked\t474937\n300条\t474938\n严莉莉\t474939\n期货从业资格考试\t474940\n护球\t474941\n一朝一夕\t474942\n佛山联通\t474943\n宜居度\t474944\n有节\t474945\n肉厚\t474946\nxny\t474947\n摸手\t474948\ndianyuan\t474949\n诛仙世界\t474950\nbitlocker\t474951\n胡哥\t474952\n朴廷桓\t474953\n接动\t474954\ntpn\t474955\n3:30\t474956\n当心\t474957\n2女\t474958\n读书\t474959\n椎拳崇\t474960\n第七段\t474961\n叶广芩\t474962\n天猫规则\t474963\nC4400\t474964\n俄罗斯政府\t474965\n庞大汽贸集团股份有限公司\t474966\n马长江\t474967\n梅江区\t474968\n深v\t474969\n姜姓\t474970\nXshell5\t474971\n肖旭\t474972\n崇左\t474973\n桃李\t474974\n收据\t474975\n龙沙区\t474976\n脊柱炎\t474977\n尼康D750\t474978\n直线筛\t474979\nfro\t474980\n#鹿晗##鹿晗热血街舞团#\t474981\n果壳\t474982\n闭卷\t474983\ndamping\t474984\n830集\t474985\n凯萨\t474986\n选\t474987\n健康证明\t474988\n里水\t474989\n收回\t474990\n亢奋\t474991\n张雄\t474992\n循环群\t474993\n不悟\t474994\n新疆菜\t474995\n诗号\t474996\n手箱\t474997\n小米电视遥控器\t474998\n责罚\t474999\n_龙\t475000\nYA-MAN\t475001\n15r2\t475002\n冰山女\t475003\n申捷\t475004\n清华经管学院\t475005\n窝趣\t475006\n8.11\t475007\n提问_综\t475008\nNMI\t475009\n硒谷\t475010\n眶\t475011\n白帝城\t475012\n守业\t475013\n星河WORLD\t475014\n厂商稿\t475015\n49章\t475016\n第3名\t475017\n舞蹈版\t475018\n血族\t475019\n瑜洲\t475020\nstk500\t475021\n学时\t475022\n四川省人民政府办公厅\t475023\n北京居然之家投资控股集团有限公司\t475024\n辻\t475025\n支支吾吾\t475026\n白石街道\t475027\n赵雯\t475028\n轻轻地\t475029\n6006u\t475030\n钢\t475031\n岳阳医院\t475032\n诗境\t475033\n新晨\t475034\n首次公开发行股票并上市管理办法\t475035\n龙潭街道\t475036\n2175\t475037\n人教网\t475038\nDorcel\t475039\n携款\t475040\n古城街道\t475041\n手子\t475042\n名爵6\t475043\nTequila\t475044\n五小时\t475045\ntol\t475046\n油头\t475047\n1號\t475048\n国际贸易法\t475049\n新同\t475050\n砍杀\t475051\n2012上半年\t475052\n上半身\t475053\n天使之城\t475054\n乳白胶\t475055\n王晓飞\t475056\n巨狼\t475057\n女中学生\t475058\n达安基因\t475059\n解语花\t475060\n无锡市规划局\t475061\n洄游\t475062\n39302645\t475063\n苏浙\t475064\n完税\t475065\n中南漫悦湾\t475066\n干将莫邪\t475067\n贷贷\t475068\n苏州市发展和改革委员会\t475069\n中南大学网络教育学院\t475070\n费德勒\t475
071\nx光机\t475072\n寻花\t475073\nPolish\t475074\n重庆水利电力职业技术学院\t475075\n村民们\t475076\n弹力素\t475077\n蠢蠢欲动\t475078\n调教文\t475079\nアフタ\t475080\n苏桃\t475081\n性癖\t475082\n这两年\t475083\nLeona\t475084\n末代皇帝\t475085\n即插即\t475086\n文人\t475087\n严子陵\t475088\n撷\t475089\n荃湾\t475090\nrosette\t475091\n谷歌地图API\t475092\n99折扣网\t475093\ni7-7700k\t475094\n半截蜡烛\t475095\n税会\t475096\n巧言\t475097\n晋西\t475098\n何去\t475099\nWebDAV\t475100\n12天后\t475101\n云安\t475102\nbroad\t475103\n长水\t475104\n贵州日报\t475105\n雷克萨斯ct200h\t475106\n华为ma\t475107\n小梦\t475108\nshaotine\t475109\n文化程度\t475110\n撮合\t475111\n作明\t475112\n未麻的部屋\t475113\n0859\t475114\n小咖秀\t475115\n毕节百里杜鹃\t475116\n近亲\t475117\n高级人民\t475118\n古塔区\t475119\ndeshi\t475120\nsteep\t475121\nbooming\t475122\n熙然\t475123\n邱清泉\t475124\ncxk\t475125\n利多\t475126\n1000小时\t475127\n叶蕴仪\t475128\nZ8\t475129\n安普\t475130\n3.39\t475131\n易方达\t475132\nmomo\t475133\n讲文明树\t475134\n央美附中\t475135\n三房巷\t475136\n一值\t475137\n型人\t475138\n擦身而过\t475139\nPromotion\t475140\n草系\t475141\n索纳锦绣未央\t475142\n赔本\t475143\nsamui\t475144\n方杰\t475145\n堪舆\t475146\n钼酸钠\t475147\nIe\t475148\n终审\t475149\n兰溪市\t475150\ncyh\t475151\n浮板\t475152\nTot\t475153\nソ\t475154\n滨海公路\t475155\n黄金塔\t475156\n英媒\t475157\n星游记\t475158\ngroupinstall\t475159\nmtga\t475160\n任剑辉\t475161\n文峰\t475162\n一拖二\t475163\njcp\t475164\n缺乏\t475165\n充血性心力衰竭\t475166\n百多邦\t475167\n古名\t475168\n旺季\t475169\nseek\t475170\n天天摸\t475171\n绷\t475172\n1排\t475173\nNYSE\t475174\n弱市\t475175\n20分米\t475176\n30d\t475177\n2017年12月10日\t475178\n俗家\t475179\n温湿度变送器\t475180\n龙凤村\t475181\n撕扯\t475182\n玉足\t475183\n箫启灵\t475184\n研究性学习\t475185\n155万\t475186\nEasyUEFI\t475187\n黔北\t475188\nshoul\t475189\n1.3.6\t475190\n恐龙岛\t475191\n有没\t475192\n北京密云县\t475193\n二羟基苯甲酸\t475194\n南昌地铁2号线\t475195\n查拉图斯特拉\t475196\n训犬师\t475197\n修订稿\t475198\n符合\t475199\n艾肯家电网\t475200\nLHR\t475201\n树人\t475202\n响度\t475203\n和田玉\t475204\n重重叠叠\t475205\nManagement\t475206\n四防\t475207\n昆明路\t475208\n我的新发现\t475209\n女性\t475210\ncaopeng\t475211\n身骑白马\t475212\n兔王\t475213\n财产保险\t475214\n商纣\t475215\n伐木机\t475216\ntrain
er\t475217\nmy.99.com\t475218\n钱江贝纳利\t475219\n临风\t475220\n追月神\t475221\n三亚市\t475222\n筑梦中国\t475223\n内存单\t475224\n协理员\t475225\nAVデビュ\t475226\n交管\t475227\nMP4-1080p\t475228\n编码员\t475229\nmyeclips\t475230\npe袋\t475231\n丁蟹\t475232\nlvl\t475233\n黄龙士\t475234\n苏州婚纱摄影\t475235\n2628\t475236\nMGM\t475237\nq房网\t475238\n男寝\t475239\n致远oa\t475240\n转化器\t475241\n绝地求生毒圈\t475242\n黑乌龙茶\t475243\n昂科威\t475244\n湘熙水郡\t475245\n清华大学公共管理学院\t475246\n0x80\t475247\n智能家具\t475248\n梅妃\t475249\n挥一挥手\t475250\n1000号\t475251\n应急照明\t475252\n⌒\t475253\n全固版\t475254\nSexo\t475255\n张辛苑\t475256\nMoulding\t475257\nnoclass\t475258\nat89s52\t475259\n斯沃\t475260\n普睿司曼\t475261\n溃不成军\t475262\n宫庶\t475263\n重庆师范大学涉外商贸学院\t475264\n国家税务局\t475265\n商丘网\t475266\nhacks\t475267\n除湿\t475268\n0.87\t475269\n惠民街道\t475270\n广州市第十六中学\t475271\nmofy\t475272\n华丽\t475273\n维密超模\t475274\n敖厂长\t475275\n蒙特利尔大学\t475276\n站壳网\t475277\nKINA\t475278\n咪爪\t475279\n转录组测序\t475280\n腊鸭\t475281\n矿蛙\t475282\n利益相关方\t475283\n黑虎\t475284\n_华商网\t475285\n艾索\t475286\n歼\t475287\nSto\t475288\nTRUE\t475289\n凸凹\t475290\ntbschedule\t475291\n刚察县\t475292\n磁信\t475293\n巴金\t475294\ndecker\t475295\n白卡\t475296\n企管部\t475297\ndistinctive\t475298\n贾庆林\t475299\n认缴出资额\t475300\nSTYLUS\t475301\n香远益清\t475302\n奔驰斯宾特\t475303\n房改\t475304\n7.26\t475305\n兴衰史\t475306\n盲评\t475307\nsticks\t475308\n第106\t475309\n14册\t475310\n非标自动化\t475311\n8.8%\t475312\n科尔沁右翼中旗\t475313\nModBus\t475314\n南阳路\t475315\n厦门国家火炬高技术产业开发区\t475316\nNG\t475317\n金佰利\t475318\n冬瓜茶\t475319\n07_\t475320\n选场\t475321\n何文辉\t475322\n居居\t475323\n網台灣分站\t475324\n一自\t475325\nグルメ\t475326\n金口河\t475327\ns300\t475328\n立体停车库\t475329\n见不得人\t475330\n郑州妈妈网\t475331\n寿司卷\t475332\n有条\t475333\n海啸\t475334\n墨泥\t475335\n因子口服液\t475336\n美巢腻子粉\t475337\n刘文彩\t475338\n海河流域\t475339\n过火\t475340\n接轨\t475341\n周家坝\t475342\nins\t475343\n双月\t475344\n走私者\t475345\n妖神花千骨\t475346\nCHS\t475347\n春北\t475348\n小黑\t475349\ndmx512\t475350\n自食其果\t475351\nlodash\t475352\n七星关区\t475353\n五分钟后\t475354\n中国农业大学生物学院\t475355\nSkillet\t475356\n刘四\t475357\n丘山\t475358\n大易\t475359\n如
日中天\t475360\n阳泉郊区\t475361\n水晶剑\t475362\n样本率\t475363\n︰\t475364\n扛\t475365\n不满\t475366\n华中师范大学第一附属中学\t475367\n天河机场\t475368\nPassware\t475369\nfico\t475370\nVCO\t475371\n敢相信\t475372\n豪布斯卡\t475373\n习总\t475374\nburpsuit\t475375\n奔驰A45\t475376\n九千网\t475377\n优佳\t475378\n帝霸惟\t475379\n分布式交换机\t475380\n扬帆计划\t475381\n终老\t475382\n低密度脂蛋白\t475383\n3d定制女仆2\t475384\n九棠府\t475385\n塞德斯\t475386\n大众甲壳虫\t475387\nBring\t475388\n朋友的酒\t475389\n乌鲁木齐地窝堡国际机场\t475390\n润颜\t475391\n登登\t475392\n上虞\t475393\n职业病防治法宣传周\t475394\nmultimap\t475395\n无厘网文_来福岛爆笑娱乐网\t475396\n徐平\t475397\n西城区幼儿园\t475398\n合脚\t475399\nExcel格式\t475400\n东北财经大学出版社\t475401\n加盐\t475402\n点缤\t475403\n卜算子咏梅\t475404\n曼月乐\t475405\noutlet\t475406\nAlways\t475407\n茂名市\t475408\n副导演\t475409\ntgf\t475410\n达州市\t475411\nMusicdu\t475412\n从业者\t475413\nVTEC\t475414\nAPV\t475415\nmidi键盘\t475416\ndodge\t475417\n秦淮八艳\t475418\n博士生\t475419\n白鹭湾湿地公园\t475420\n刺嫩芽\t475421\n龙战\t475422\nGTX680\t475423\n陷入\t475424\n石泉县人民政府\t475425\n运行\t475426\n晓镜\t475427\nzjl\t475428\n泊头\t475429\n笔墨纸砚\t475430\nGEN\t475431\n收盘\t475432\nζ\t475433\n量刑\t475434\n金碧杂谈\t475435\n大雨滂沱\t475436\n第66\t475437\nRoyals\t475438\n伪码\t475439\n胡中花\t475440\n工作站\t475441\n钓友\t475442\n芸芸众生\t475443\n浙江省工商行政管理局\t475444\n当代先锋网\t475445\n第121届\t475446\n60段\t475447\n差生\t475448\n一团一团\t475449\n老鸟们\t475450\nDNF2018\t475451\nbuluo\t475452\n巨幅\t475453\n圣旨\t475454\niterator\t475455\n招投标网\t475456\n主战者\t475457\nrou\t475458\n控江路街道\t475459\n爱丽丝梦游仙境\t475460\n搅拌\t475461\n1957\t475462\n光头\t475463\n捌零音乐论坛\t475464\nviagogo\t475465\n闸阀\t475466\n美蒂菲\t475467\n坂口美穗乃\t475468\n2011-2014年\t475469\n0.100\t475470\n成都中学\t475471\n汇丰银行\t475472\n广东省委宣传部\t475473\n天猫\t475474\nmao\t475475\n电子书版\t475476\nqz\t475477\n割接\t475478\n切伤\t475479\n如此多娇\t475480\n东湖海洋世界\t475481\n周汝昌\t475482\n管易\t475483\n豆沙色\t475484\nzx\t475485\n丑月\t475486\n江西中医学院\t475487\n有效力\t475488\n忧思\t475489\n何必\t475490\n萘替芬酮康唑乳膏\t475491\n沸腾文学网\t475492\n裂棒\t475493\n焊枪\t475494\n4399泰\t475495\nDeposition\t475496\n逆向分析工程师\t475497\n隐隐约约\t475498\n第一支\t475499\n回春草\t475500\n诺德\t47
5501\n弹夹\t475502\n鸠兹\t475503\nDio\t475504\n心率\t475505\n吨数\t475506\n河鲜\t475507\n002070\t475508\n学龄教育\t475509\n肿胀\t475510\n智能版\t475511\nMizuna\t475512\n肩袖\t475513\n政府采购\t475514\nBlogging\t475515\n桦南\t475516\n系统之家\t475517\n中居正广\t475518\n福汉\t475519\n泳道\t475520\nnewborn\t475521\n苟且偷生\t475522\n浩克\t475523\nMAKER\t475524\n坐机\t475525\n新声代\t475526\n五朝\t475527\n树式\t475528\n血手印\t475529\n华宸\t475530\n洗发精\t475531\n夏梦\t475532\n哑铃\t475533\n溶\t475534\nmoody\t475535\n陈冲\t475536\n普罗旺世\t475537\n本币\t475538\n五式\t475539\n新乡市政府\t475540\n黄玫瑰\t475541\n金地自在城\t475542\n第十章\t475543\nUNION\t475544\n梅比斯\t475545\n納屋\t475546\n上海市人民政府\t475547\n小奥\t475548\n爱狗\t475549\n人福医药集团股份公司\t475550\nCHANDO\t475551\n灞桥柳\t475552\n奋韩圈\t475553\n床幔\t475554\n养生堂_腾讯\t475555\n各版\t475556\n买家\t475557\n代理人\t475558\n可逆转\t475559\nPbdigg\t475560\n明锐论\t475561\n半屏\t475562\nblogsheng\t475563\n更真实\t475564\n炜岸城\t475565\n置备\t475566\n实验器\t475567\n蓼\t475568\nspeaker\t475569\n陈伟鸿\t475570\n锦州南站\t475571\n工程有限公司\t475572\niflytek\t475573\n战国无双3\t475574\n燃气车\t475575\n武汉电力职业技术学院\t475576\n为何多\t475577\n远望\t475578\n终极目标\t475579\n极地大乱斗\t475580\n钱方\t475581\nNgau\t475582\nad9361\t475583\n周比利\t475584\nArcher\t475585\nvf\t475586\n湘大\t475587\n114期\t475588\n林龙\t475589\n第5年\t475590\n农园\t475591\nReiko\t475592\n超级赛\t475593\nmssql2008\t475594\ncti\t475595\n山南\t475596\n永辉超市股份有限公司\t475597\n寒秋\t475598\n英格\t475599\n深圳国税_深圳市国家税务局\t475600\n华夏视讯\t475601\n伴娘服\t475602\nnignx\t475603\nblueair\t475604\nhoutai\t475605\n存贷款利率表\t475606\n六马路\t475607\n缤谷广场\t475608\n0.98\t475609\n河洛工作室\t475610\nWego\t475611\n南京都市论坛\t475612\nrsyslog\t475613\n日本机场\t475614\n水谷\t475615\n【众泰T500】_新众泰T500报价_众泰汽车众泰T500\t475616\n路考路\t475617\nip代理\t475618\nCrystals\t475619\n办事处\t475620\n懒政\t475621\n方章\t475622\nOkhttp\t475623\nクエスト\t475624\nltrace\t475625\n坐飞机\t475626\nCouncil\t475627\n宠物美容师\t475628\n套属\t475629\n云梯\t475630\n哈弗H9论坛_汽车之家论坛\t475631\n七分之一\t475632\n不孝顺\t475633\n医保局\t475634\n性恋\t475635\n黏性\t475636\nu段\t475637\nIPSEC\t475638\n成亲\t475639\n帐幕\t475640\n芍药甘草汤\t475641\n诸暨市人民医院\t475642\n5516\t47
5643\n5.3.6\t475644\n左顾右盼\t475645\n下沉式广场\t475646\n二五\t475647\nRand\t475648\n二十多\t475649\n阿拉丁神灯\t475650\n昱辉\t475651\nRelaxation\t475652\n鼻出血\t475653\n三年\t475654\n上海地铁8号线\t475655\nasciima\t475656\n陈浩南\t475657\n相互依赖\t475658\n指弹吉他谱_六线独奏谱\t475659\n名行\t475660\n简易计算器\t475661\n取下\t475662\n人力资\t475663\n保定军校\t475664\n花都大道\t475665\n肘型\t475666\n弹钢琴\t475667\n奥黛丽赫本\t475668\n积分器\t475669\n丑山\t475670\n82579lm\t475671\nBenelli\t475672\nClassroom\t475673\n福旺\t475674\n易错题\t475675\n岳胖子\t475676\n20招\t475677\n王凡\t475678\n解晓东\t475679\n电脑豆沙绿界面\t475680\nServer2003\t475681\n荣昌\t475682\n信录\t475683\n2610\t475684\n肉石\t475685\n不列颠百科全书\t475686\n伊斯佳\t475687\n思\t475688\n干网\t475689\n客房部\t475690\n桨叶\t475691\n凝血活酶\t475692\n乳贴\t475693\nRootkit\t475694\n兰州地区\t475695\n玉蒲团之偷情宝鉴\t475696\n黄昏恋\t475697\ntome\t475698\n原定\t475699\n0.25mm\t475700\n华虹集团\t475701\n世纪之窗新闻网\t475702\n中国奥委会\t475703\n陈年老酒\t475704\nwow.178.com\t475705\n收购价\t475706\n邪气\t475707\n曹安\t475708\n模拟人生畅玩版\t475709\n坛南湾\t475710\n载量\t475711\n细菌性肺炎\t475712\n中央芭蕾舞团\t475713\n四驾\t475714\n梦幻西游4\t475715\n静安寺街道\t475716\n30万平方米\t475717\n部门经理\t475718\n5678\t475719\n输入流\t475720\n李克明\t475721\n日复一日\t475722\n二龙湖浩哥\t475723\n茄科\t475724\nパンスト\t475725\nARouter\t475726\n瑞迈\t475727\n副标\t475728\n龙牌\t475729\n口女\t475730\n39分\t475731\n竹炭\t475732\n金砖五国\t475733\n旭日旗\t475734\n十八届四中全会\t475735\n2520m\t475736\n提示页\t475737\n分量\t475738\n几次\t475739\n5A级景区\t475740\nlavender\t475741\n避讳\t475742\nePrinter\t475743\n交通标识\t475744\n唯实\t475745\n造极\t475746\n玉狐\t475747\n云骑士\t475748\n龙岗\t475749\n卷闸门\t475750\nIPTG\t475751\n自焚\t475752\n十四届\t475753\n黄聪\t475754\n河北博才网\t475755\n一起跳舞\t475756\naki\t475757\n竟敢\t475758\nXcode9\t475759\n沥青\t475760\n熊耳山\t475761\n农村村\t475762\n哥廷\t475763\n唐论坛_汽车之家论坛\t475764\n虚谷\t475765\n落地期\t475766\n遭殃\t475767\nErr\t475768\n武超\t475769\n死人不偿命\t475770\n爱斯基摩\t475771\n大兄弟\t475772\nIII类\t475773\nprofiling\t475774\ncodersay\t475775\n氨基酸态氮\t475776\n西南财经\t475777\n苏娜\t475778\n第4位\t475779\n加奈\t475780\nGAD\t475781\n株洲市国家税务局\t475782\nQQ音\t475783\n基础护理学\t475784\ngravatar\t475785\n背囊\t
475786\n中国航空工业集团有限公司\t475787\n火山地板\t475788\n宏函数\t475789\n徐訏\t475790\n王占峰\t475791\n圣诞大战\t475792\n目的论\t475793\n八倍\t475794\n打底膏\t475795\nmongo\t475796\nMvnJar\t475797\ndwell\t475798\n李程\t475799\n微卷\t475800\npurity\t475801\nwend\t475802\n火火\t475803\n奇航\t475804\n54_\t475805\n中国核建\t475806\n渤海海峡\t475807\n言辞\t475808\n易道\t475809\n救援车\t475810\n被侵权人\t475811\nCemu模拟器\t475812\n下界\t475813\n2020级\t475814\n社工师考试\t475815\nr9plus\t475816\n伏安法\t475817\nSynchronize\t475818\n湿拌砂浆\t475819\n大熊\t475820\nUI_UI\t475821\n杨致远\t475822\n0xc000007b\t475823\navenue\t475824\n第一审\t475825\n曹清华\t475826\n单盖\t475827\nmasql\t475828\n泛酸\t475829\n上去除\t475830\nCleanMyMac\t475831\n荣耀7a\t475832\n金缘\t475833\n强词夺理\t475834\n某时\t475835\n遗体\t475836\n套环\t475837\n和倍\t475838\n小馄饨\t475839\n露伶春\t475840\ncommence\t475841\n台州招聘网\t475842\n赛车总动员3\t475843\n诉讼时效\t475844\n武汉船舶职业技术学院\t475845\n猛击\t475846\n定长\t475847\n一个次\t475848\n都\t475849\n圆茶\t475850\n产籍\t475851\nBOOLEAN\t475852\n马栗乐\t475853\n9nine\t475854\n沙坝\t475855\n氢氧根\t475856\n小莲\t475857\n广西医科大学\t475858\n租税\t475859\nMamamoo\t475860\n都柏林大学\t475861\nspoken\t475862\n第三颗\t475863\n姜山\t475864\n进馆\t475865\nAL00/标准版/全网通\t475866\n弘成教育\t475867\n循环卷积\t475868\n上工\t475869\n班\t475870\n板式热交换器\t475871\n袁伟\t475872\nHoles\t475873\n500号\t475874\nAiTechYun\t475875\n原户\t475876\n壁面\t475877\n茶壶\t475878\n百丈东路\t475879\n整形外科\t475880\nT3航站楼\t475881\nTOP20_\t475882\nX玖少年团\t475883\n命身\t475884\n原宿风\t475885\n石蜡\t475886\nMateBook\t475887\n护肝药\t475888\npunctuation\t475889\n小仪\t475890\n安全险\t475891\n鹿茸片\t475892\n忠恕\t475893\n9周年\t475894\n安福家园\t475895\nSIFT\t475896\n车胜元\t475897\n2008-2010年\t475898\nextensions\t475899\n重情\t475900\n轮替\t475901\n自我实现\t475902\n初潮\t475903\n僧众\t475904\nLatitude\t475905\n来气\t475906\n花舌\t475907\n大厂区\t475908\n黄土岭\t475909\n红米手机4X\t475910\n一夕\t475911\ncann\t475912\n奢\t475913\nKF\t475914\n初中版\t475915\n仓储\t475916\nqstat\t475917\n尿\t475918\n大胡同\t475919\n二手表\t475920\n法希\t475921\n石门国家森林公园\t475922\n西数蓝盘\t475923\n该车\t475924\nMendeley\t475925\n9910\t475926\n时珍\t475927\n光色\t475928\n祝融夫人\t475929\
n区环保局\t475930\n心脾两虚\t475931\n减温\t475932\nifs\t475933\n生物刺激素\t475934\nCREAM\t475935\n251路\t475936\n运动场\t475937\n录入期\t475938\n公子音\t475939\n联和\t475940\n投影变换\t475941\n泽德\t475942\n20141204\t475943\n转子泵\t475944\n平板支撑\t475945\n公募基金公司\t475946\n单人\t475947\n多管\t475948\n虹润\t475949\n刀锋铁骑\t475950\n那年那月\t475951\n马鬃山\t475952\n菲利宾\t475953\n案件受理费\t475954\n杀驴\t475955\n压降\t475956\n唐朝安徒生童话\t475957\n电子元件\t475958\n钢船\t475959\nsmok\t475960\n陈柏宇\t475961\n历下区政府\t475962\n滑翔机\t475963\nmidori\t475964\n九宫\t475965\n朝鲜牡丹峰乐团\t475966\n极慢\t475967\nb85m-f\t475968\nDEVELOPMENT\t475969\n内存优化\t475970\nSalem\t475971\n中注协\t475972\n玉\t475973\n兴通\t475974\n别秀\t475975\n无锡职业技术学院\t475976\n搜集网\t475977\n护眼宝\t475978\n蜀水\t475979\n腾讯安全中心\t475980\nNom\t475981\n隐马尔可夫模型\t475982\n净颜\t475983\ntiyu\t475984\n森田\t475985\n单反\t475986\n擒纵\t475987\n36件\t475988\n安徽日报_安徽日报报业集团\t475989\n炊烟\t475990\n青岛市第八人民医院\t475991\n小厨\t475992\n口袋妖怪日月\t475993\n湖北法治网\t475994\nTRUNC\t475995\n歹毒\t475996\nherein\t475997\n密封槽\t475998\n116集\t475999\n留恋\t476000\n3系\t476001\n嘎鱼\t476002\n宋老师\t476003\n一万步\t476004\nVertiv\t476005\n綺麗\t476006\n制宜\t476007\nexpedition\t476008\n悦美\t476009\n汽油泵\t476010\n迪托之剑\t476011\n武媚娘传奇\t476012\n第28天\t476013\nGEC\t476014\n担保人\t476015\n槑头槑\t476016\n碳刀\t476017\n不排便\t476018\n20170215\t476019\n陈言\t476020\n进华中学\t476021\n千趣\t476022\n返回舱\t476023\nxinjian\t476024\nAgNO3\t476025\n郑微\t476026\n暗标\t476027\n张三慧\t476028\nmercier\t476029\n熊津\t476030\nC/S\t476031\n坏女人\t476032\nNoClass\t476033\n批荡\t476034\n连出\t476035\n史上最囧挑战\t476036\n德棍\t476037\nsealed\t476038\n性教\t476039\n外貌协会\t476040\n8月19日\t476041\n编委\t476042\n前位\t476043\nimpose\t476044\n草裙\t476045\n魔人布欧\t476046\n全版\t476047\n吞噬者\t476048\n营业时间表\t476049\n民投金服\t476050\nPXN\t476051\n天国王朝\t476052\n500米\t476053\n才高八斗\t476054\n承袭\t476055\n巴塔木儿歌\t476056\n波波沙\t476057\nv1.0.3_\t476058\n欧拉图\t476059\n人民文学出版社\t476060\nC32\t476061\n立磨\t476062\n响应型\t476063\nwww.liuxue86.com\t476064\n2012年1月\t476065\nis6\t476066\n咔咔声\t476067\n圆形\t476068\nPDT\t476069\n最近\t476070\n频率域滤波\t476071\nFrederic\t476072\n堵点\t476073
\nNavicatPremium\t476074\n免费汽车长安欧诺\t476075\n时尚秀\t476076\n格型\t476077\n盟军敢死队4\t476078\n东方嘉盛\t476079\n张铁林\t476080\n南滨路\t476081\n双线程\t476082\n新修版\t476083\n爱标志网\t476084\n地页\t476085\n五分裤\t476086\n金帛\t476087\n新史诗\t476088\nCertification\t476089\n08届\t476090\n婆罗门\t476091\n91在线\t476092\n卫生教育\t476093\ncclink\t476094\nHELLO\t476095\n布鲁纳\t476096\n_碗\t476097\n心板\t476098\n三步\t476099\n紫竹桥\t476100\n电动缸\t476101\ncql\t476102\n斯德哥尔摩\t476103\naion\t476104\n明士\t476105\n电纸书\t476106\nLiveSino\t476107\n奥硝唑分散片\t476108\n国联安基金\t476109\nbodyslide\t476110\n森碟\t476111\nenhancer\t476112\n小时间\t476113\n南水北调工程\t476114\n还在一起\t476115\n摩尔金融\t476116\n5.7.21\t476117\n脱战\t476118\ntarmac\t476119\n快乐星汉堡\t476120\n花行\t476121\n4mod\t476122\n江城大道\t476123\n张涵\t476124\npopfisher\t476125\n城市规划馆\t476126\n陈美\t476127\n嘉兴科技城\t476128\n164\t476129\n云门寺\t476130\nLaunches\t476131\n认定为\t476132\n鱼趣\t476133\n白影\t476134\ntranslate\t476135\n色调色\t476136\n30栋\t476137\nvic\t476138\n美少女特攻队\t476139\n海马汽车\t476140\nLombok\t476141\nhno3\t476142\n预拌\t476143\n长安福特\t476144\n九州生气恃风雷\t476145\n黄冈师范学院\t476146\n绿旋风\t476147\n五专\t476148\n市容\t476149\n1324\t476150\nwh6\t476151\n赎身\t476152\n东方栀子\t476153\nWants\t476154\n杰洛\t476155\n游娱\t476156\nHover\t476157\ncas9\t476158\n田杰\t476159\n废品率\t476160\n龙泉山庄\t476161\nfreer\t476162\n绉\t476163\ntrizol\t476164\nwindo\t476165\n金果\t476166\n僵尸至尊\t476167\n2.0G\t476168\n足少阳胆经\t476169\n宜君县人民政府\t476170\n飞屋环游记\t476171\nVARCHAR2\t476172\n惊世毒妃\t476173\n长缨\t476174\n泰明顿\t476175\n菲利斯\t476176\n银饰\t476177\n南阳机场\t476178\n小莫骚麦\t476179\nx2a2+y2b2=1\t476180\nMatic\t476181\n测控网\t476182\n广惠高速\t476183\n钒钛\t476184\n7.3圣\t476185\n文松\t476186\n公资\t476187\n一生爱你千百回\t476188\nhdu\t476189\n男欢女爱\t476190\n伯尼\t476191\nKITTI\t476192\nA1524\t476193\n针式打印\t476194\n哭坟\t476195\n调户\t476196\n大空港\t476197\n外文学院\t476198\n买码\t476199\nprojec\t476200\n杨涟\t476201\nVelvet\t476202\n按会\t476203\n富通集团\t476204\n猪肉丸\t476205\n义母\t476206\nNLP\t476207\n苛\t476208\n自惭形秽\t476209\nSOP\t476210\n电动车蓄电池\t476211\n阴炎\t476212\n喉癌\t476213\n辛克\t476214\n广日\t476215\n情泪\t476216\n海
南博鳌乐城国际医疗旅游先行区\t476217\n长沙贺龙体育馆\t476218\nexpressvpn\t476219\n湘江世纪城\t476220\n窑址\t476221\nprinted\t476222\n城口\t476223\n一斗米\t476224\n47度\t476225\n风声鹤唳\t476226\nG奶\t476227\n产出\t476228\n顺网\t476229\n附一\t476230\n雪乃\t476231\n层气\t476232\n太阳神\t476233\n企管\t476234\nyezzy\t476235\n野驴\t476236\njuc\t476237\n7.6米\t476238\n乌鸦喝水\t476239\n枪支案\t476240\n百战军事网\t476241\nseldom\t476242\n白话资治通鉴\t476243\n七浦路\t476244\n腾讯问卷\t476245\n中缆在线\t476246\n房山信息网\t476247\n企业集团\t476248\n960帧\t476249\n田中美佐\t476250\n梭子鱼\t476251\n百世\t476252\n什么体\t476253\n许又声\t476254\n验资\t476255\n天灰\t476256\n源动力\t476257\n办公位\t476258\n声纳\t476259\n奥杆\t476260\n淫男\t476261\n碎枝\t476262\n内部交易\t476263\nnerd\t476264\n第11位\t476265\n百度信息流\t476266\n海能达通信股份有限公司\t476267\n8-10月\t476268\n增材\t476269\n瓷碗\t476270\n玩偶\t476271\n阿里鱼卡\t476272\n翻滚\t476273\n6分钟\t476274\n两眼\t476275\n动态口令卡\t476276\n乳癖\t476277\n鲁西化工\t476278\n刘亚洲\t476279\n双面人生\t476280\n泉州市纪委监察局\t476281\n尿微量蛋白\t476282\n宁浩\t476283\n隐情\t476284\n杨祐宁\t476285\ngray\t476286\n节课\t476287\nedu.docin.com豆丁\t476288\n诞\t476289\n北大六院\t476290\n暴走漫画制作器_暴走漫画\t476291\nsnmpwalk\t476292\n易配网\t476293\n莫斯科红场\t476294\n蜜月\t476295\nduty\t476296\n热化学方程式\t476297\n兔毛\t476298\n马前卒\t476299\nvdisk\t476300\n山西省国土资源厅\t476301\n4欧姆\t476302\n高露\t476303\n减防\t476304\n东关街道\t476305\n芒果广播网\t476306\n2-4月\t476307\n脑血\t476308\n贫困生\t476309\n酵母细胞\t476310\ncolud\t476311\n尤勇\t476312\n时间都去哪了\t476313\nwwe2017\t476314\n众兴\t476315\nQQ三国\t476316\njhipster\t476317\n嘉兴市环境保护局\t476318\n子人\t476319\nIT168\t476320\n600803\t476321\n上城国际\t476322\n包罗万象\t476323\n二节\t476324\n掣肘\t476325\nsummertimesaga\t476326\nQPSK\t476327\nladygaga\t476328\nrochester\t476329\n波克城市\t476330\n号子\t476331\n垂询\t476332\n腾盛\t476333\n麻辣牛肉\t476334\n啪啪影院\t476335\n梦幻西游跑商\t476336\n托幼\t476337\n高耗能行业\t476338\n梅州在线\t476339\n九曲湾\t476340\n生死符\t476341\n发迹史\t476342\nunit4\t476343\n卡雷拉斯\t476344\n中科软科技\t476345\n水底下\t476346\n透析袋\t476347\nUML类图\t476348\n天坛东门\t476349\n咖啡因\t476350\n人大法学院\t476351\n黯然销魂掌\t476352\nLot\t476353\npth\t476354\n手无寸铁\t476355\n折扣价\t476356\n秋思\t476357\n三十日\t476358\n金吾\t
476359\n环球小姐\t476360\n绾青丝\t476361\n三性\t476362\n情难自制\t476363\nXGBoost\t476364\n5月4日\t476365\n敦南\t476366\n温彻斯特\t476367\n青年近卫军\t476368\n洗车场\t476369\n南诏国\t476370\n书架箱\t476371\nAIX\t476372\n认沽期权\t476373\n中国钢研科技集团有限公司\t476374\n结业证\t476375\n1230v3\t476376\n铂金汉宫\t476377\nlinuxdeepin\t476378\nPTU\t476379\n70年代末\t476380\n英烈们\t476381\n绞吸式\t476382\n津梁\t476383\n脱霉剂\t476384\nshinee\t476385\nLeEco\t476386\n透气管\t476387\n3月13日\t476388\nPoi\t476389\n监察法释义\t476390\n无忧考网\t476391\nttfb\t476392\n任泽平\t476393\nguid\t476394\n192.168.0.2\t476395\nwabbit\t476396\n市商务局\t476397\n光信\t476398\n顺义区\t476399\nIHG招聘网\t476400\n随班就读\t476401\n解放碑店\t476402\nLoadLibrary\t476403\n立轴\t476404\nphotosmart\t476405\n希格\t476406\n土地篇\t476407\n国家能源局综合司\t476408\n自攻钉\t476409\n海康威视视频\t476410\n武灵\t476411\nnba2konline\t476412\n朴素贝叶斯算法\t476413\n十九大思想汇报\t476414\n当中\t476415\n100G3\t476416\n聚合氯化铝\t476417\n十年级\t476418\n越博\t476419\n离歌\t476420\n妮维\t476421\n王正\t476422\n长岗\t476423\n兵佣\t476424\n轰然\t476425\n将军澳\t476426\nk812\t476427\n中通\t476428\n鲁迅\t476429\n科学大道\t476430\n民俗文化村\t476431\n小说馆\t476432\n菇米\t476433\n模拟飞行\t476434\n刘存惠\t476435\nU盘启动项\t476436\nMicrosoft\t476437\n成都信息工程学院\t476438\n路桥英才网\t476439\nweblogic10\t476440\nzootopia\t476441\n股票价格指数\t476442\n347\t476443\n喵王\t476444\n三聚氯氰\t476445\na酸\t476446\nac1st16.dll\t476447\n中西医结合科\t476448\nR3D\t476449\n导诊\t476450\n乌巢\t476451\n1927\t476452\n空盒\t476453\n特别关注\t476454\n3.08\t476455\n波龙\t476456\n李文波\t476457\n主堆\t476458\n顺丰速\t476459\n标准误\t476460\n魅灵\t476461\n不一样的卡梅拉\t476462\nBlank\t476463\n拍拍照\t476464\n灵契摩尔庄园\t476465\n汽车票务网\t476466\n青普\t476467\n5.1\t476468\n88a\t476469\n2桶\t476470\n星体\t476471\n叶尘\t476472\n韦尔股份\t476473\noppo手机助手\t476474\nTekla\t476475\n仙侠\t476476\n无人监护\t476477\n北京国际汽车展览会\t476478\n美国密歇根大学\t476479\n易刚\t476480\n通用词\t476481\n30架\t476482\n叶绾绾\t476483\nprepared\t476484\nPoker\t476485\n气量\t476486\ngatt\t476487\n600737\t476488\n艾露比\t476489\nmirai\t476490\n环网柜\t476491\n卢米埃影城\t476492\n时基\t476493\nChiu\t476494\n柊\t476495\n123周年\t476496\n软件技术专业\t476497\n金贵晟\t476498\n衡\t476499\n压级\
t476500\ng3250\t476501\n闪电侠吧\t476502\n归纳\t476503\n这周日\t476504\n公田\t476505\n万能吧友\t476506\n黑谍\t476507\nAlphabet\t476508\ninternet协议\t476509\nag\t476510\nclocks\t476511\nTabBar\t476512\n作践\t476513\n单身\t476514\n帝锦\t476515\n赵某\t476516\n领购\t476517\nLOG4J\t476518\n尿道结石\t476519\n药法\t476520\n第24届\t476521\n惊天战神\t476522\n帝具\t476523\n脉冲型\t476524\n好似\t476525\nS80\t476526\n美图T8s\t476527\n御史\t476528\n正白\t476529\n珠港澳\t476530\n中国男装网\t476531\n高密度沉淀池\t476532\nextending\t476533\nKits\t476534\n安胜浩\t476535\n费力\t476536\n悦动\t476537\n上海五星体育\t476538\nLICENSE\t476539\nadvisory\t476540\n招才猫\t476541\n宁波幼儿园\t476542\n温开水\t476543\n君山区\t476544\n开考\t476545\n300名\t476546\n肯定性\t476547\n重庆公务员考试网\t476548\n失联\t476549\n每一年\t476550\n夜探\t476551\n使馆区\t476552\n岚\t476553\n通商口岸\t476554\n娉婷\t476555\n法国大使馆\t476556\n酒泉在线\t476557\n频率域\t476558\n3符华\t476559\n西门无恨\t476560\n娄底市人民政府\t476561\n沙浴\t476562\n隔山\t476563\n飞踢\t476564\n前海公司\t476565\nTITLE\t476566\n开凿\t476567\n三姉妹\t476568\n西湖断桥\t476569\nbalenofly\t476570\n偏心距\t476571\n连续式\t476572\ngmx\t476573\n光片\t476574\n核名\t476575\n沽\t476576\n北京遇上西雅图之不二情书\t476577\n8月7日\t476578\n大校花\t476579\nmortgage\t476580\n汽修学校\t476581\nmysql\t476582\n动线\t476583\n尼姆\t476584\npetite\t476585\n茶叶节\t476586\n范磊\t476587\nHKEXnews\t476588\neduroam\t476589\n富基世纪公园\t476590\n新疆托里县政府\t476591\n吉尔吉斯\t476592\n名段\t476593\n肖仁福\t476594\nGustav\t476595\n羊肉面\t476596\n四房\t476597\ngcov\t476598\nflint\t476599\n噔噔\t476600\n乾坤\t476601\n荣耀畅玩\t476602\n石家庄市教育局\t476603\n334号\t476604\n育儿大作战2018\t476605\nFailure\t476606\nbanana\t476607\n东风风神AX7\t476608\n文艺片\t476609\n陈文清\t476610\nmaverick\t476611\nv6.5\t476612\n向着\t476613\n明基扫描仪\t476614\n吡喹酮\t476615\n新力帝泊湾\t476616\n郭永章\t476617\n水箱\t476618\n死亡代理人\t476619\n文献\t476620\n约旦\t476621\nteme\t476622\n无欲则刚\t476623\n平安银行深圳分行\t476624\n哪人\t476625\nJRiver\t476626\n屏风\t476627\n贵妃鸡\t476628\n读往\t476629\nMX6\t476630\n黄时雨\t476631\nerotica\t476632\n除却\t476633\n1.8.9\t476634\n尽调\t476635\n超短期融资券\t476636\n波特率\t476637\n你侬我侬\t476638\n阳明学\t476639\n鼻窦炎\t476640\nion\t476641\n生化危机5\t476642\n桐城路\t47664
3\n片山萌美\t476644\n轻松过关\t476645\n赛峰\t476646\nhoc\t476647\nv3.2.0\t476648\n萨斯\t476649\n氧气泵\t476650\n120px\t476651\n棺\t476652\nAuditing\t476653\n万福\t476654\nCCTV-7_央视\t476655\n后天\t476656\n自承式\t476657\n4.1.8\t476658\n八戒\t476659\n铜艺\t476660\n沙朗\t476661\n搬动\t476662\n人去楼空\t476663\n虞河路\t476664\nMACHINE\t476665\n十六夜\t476666\n奥大利亚\t476667\noctane渲染器\t476668\n网师\t476669\n祈祷君\t476670\n全球化\t476671\n斗鱼弹幕助手\t476672\n233路\t476673\njre6\t476674\n跨境人民币\t476675\n六指\t476676\n穆清\t476677\n浪鼓\t476678\n嗜酸细胞\t476679\n定价\t476680\nBuffett\t476681\n实火\t476682\n肝门\t476683\n5719\t476684\n芬兰\t476685\n变态\t476686\nLQ-615KII\t476687\n武汉钢铁(集团)公司\t476688\n平山村\t476689\n联邦制药\t476690\n乞力马扎罗山\t476691\n明度\t476692\n氨基\t476693\nconfinement\t476694\n2233卡\t476695\n瑞安市\t476696\n金科廊桥水岸\t476697\nh5+\t476698\n生产机\t476699\n红面\t476700\ndrozer\t476701\n自然角\t476702\nf593\t476703\nX宝\t476704\n颈骨\t476705\n割头\t476706\nenc\t476707\nNANO\t476708\nPython\t476709\n3100点\t476710\n汉唐\t476711\n0.8.8\t476712\nvapour\t476713\n主人\t476714\n图吧地铁\t476715\n拆箱\t476716\n足藓\t476717\n满愿\t476718\n二氧化铅\t476719\n14套\t476720\n苏帮菜\t476721\n壁挂式\t476722\n易小\t476723\n财季\t476724\n纪录\t476725\n磁机\t476726\n夏霖\t476727\nProxyee-down\t476728\n海宁市\t476729\n加油工\t476730\n14500\t476731\nVehicle\t476732\n五志\t476733\n缅怀\t476734\n周斌\t476735\n夏桐\t476736\n济南网络广播电视台\t476737\n姚村\t476738\n12.5寸\t476739\n河北省人民政府\t476740\n3.7万\t476741\nconnectivity\t476742\n捕集\t476743\n播种\t476744\nCeleste\t476745\n房地产评估公司\t476746\n十三项\t476747\n芈\t476748\ntabula\t476749\n20160604\t476750\n安泰科技股份有限公司\t476751\n移动合约机\t476752\nMeridian\t476753\n仙岛湖\t476754\n深创投\t476755\n一去\t476756\n电容\t476757\n振臂一呼\t476758\nspinner\t476759\nGlibc\t476760\n你和他\t476761\n缩招\t476762\n600179\t476763\n亿速\t476764\nivd\t476765\n三源色\t476766\niPhone6/\t476767\nmacro\t476768\n中山大学肿瘤防治中心\t476769\n全面战争:幕府将军2\t476770\n维\t476771\n参考表\t476772\nlob\t476773\noracle11G\t476774\n绿化费\t476775\n人大工委\t476776\n法律依据\t476777\n象是\t476778\n黎平\t476779\n层\t476780\n情味\t476781\n援兵\t476782\n163com.cn\t476783\n89升\t476784\n将太无二\t476785\n风天
\t476786\n东方绿舟\t476787\n50次\t476788\n阿笠\t476789\n各集\t476790\n铼\t476791\n骁龙801\t476792\nscheme\t476793\n贝克巴斯\t476794\n搞坏\t476795\n抗辐射\t476796\nhaas\t476797\n周会明\t476798\n同生\t476799\necdsa\t476800\nP0\t476801\n抗药性\t476802\n黑桐\t476803\n雪片\t476804\nCorelDra\t476805\n攘\t476806\nAmphenol\t476807\n吞云吐\t476808\n灵魂猎者\t476809\n威信县\t476810\n4.5v\t476811\n防守\t476812\n固沙\t476813\n周巡\t476814\n州税\t476815\n开源网\t476816\n华进\t476817\n虎头蛇尾\t476818\n技高网\t476819\n最讨厌\t476820\n全卷\t476821\n干娘\t476822\n远足\t476823\n温州商报\t476824\n破防率\t476825\n月光微博客\t476826\n半加器\t476827\n今生的唯一\t476828\nWiener\t476829\n柏瑞\t476830\n西医\t476831\n配信\t476832\n巫族\t476833\n马未lady\t476834\n二道\t476835\nKush\t476836\n二轮\t476837\nipa\t476838\n杜涛\t476839\n段林希\t476840\n996\t476841\nCategory\t476842\n╃\t476843\n爱情万万岁\t476844\n_腾牛个性网\t476845\n10.2\t476846\n1-2号\t476847\nN6\t476848\n三室\t476849\n石英\t476850\n听清\t476851\n破界篇\t476852\nLinux运维\t476853\naddslashes\t476854\n精氨酸\t476855\n制刷\t476856\n血运\t476857\nios9.2\t476858\n安重根\t476859\nYuri\t476860\nIED\t476861\nZsh\t476862\n黄金十二宫\t476863\n1746\t476864\n黄杨树\t476865\n2016-2020年\t476866\n打棒球\t476867\n瘦金体\t476868\n吧员\t476869\n美声\t476870\n义桥\t476871\nFRONTIER\t476872\nchromosome\t476873\n字符集\t476874\n山楂树之恋\t476875\n免战\t476876\n芳村大道\t476877\n140期\t476878\n金鹰广场\t476879\n仓木麻衣\t476880\n终孔\t476881\n受瞩目\t476882\n主动件\t476883\n润州\t476884\n百洁布\t476885\n4系\t476886\n企航\t476887\ncOm\t476888\n土炮\t476889\n2016年11月25日\t476890\n魔音版\t476891\n恒大绿洲\t476892\n抽验\t476893\n货箱\t476894\ntensor\t476895\n太血腥\t476896\n图文快印店\t476897\n魏公子\t476898\n加币\t476899\n丝质\t476900\n50部\t476901\nE1\t476902\nnihao\t476903\n门禁管理系统\t476904\nCJQ\t476905\n将魂\t476906\n内马\t476907\n渭南教育网\t476908\n安措\t476909\n自慰\t476910\n延中\t476911\n300c\t476912\n蓝山咖啡\t476913\n九分之八\t476914\n唯满侠吧\t476915\n农妇\t476916\n凯里\t476917\n街机模拟器\t476918\n音羽レオン\t476919\n新姿\t476920\njsy\t476921\nnz.52pk.com\t476922\n副理\t476923\n秦琼\t476924\n米岛\t476925\n中国人民大学图书馆\t476926\n无位\t476927\n安徽省地方税务局\t476928\n恶人\t476929\n江西省财政厅\t476930\n公转\t476931\n驾龄\t476932\nnotifyAll\t476
933\n建筑电气工程施工质量验收规范\t476934\n勘九郎\t476935\n彭大\t476936\n示值\t476937\n桃花香\t476938\n大雷\t476939\n努比亚M2\t476940\nEJU\t476941\n史蒂芬金\t476942\n浓密机\t476943\n600873\t476944\n大功\t476945\n博兴\t476946\nlbp\t476947\n过渡金属\t476948\n沈老师\t476949\nWorkerman\t476950\n显卡超频\t476951\n教战\t476952\n太原地区\t476953\nflavors\t476954\n成矿\t476955\nCAC\t476956\n魔宫\t476957\n星座男\t476958\n东方财富\t476959\n金声\t476960\n陈小希\t476961\n翻水\t476962\n佳源巴黎都市\t476963\nPrey\t476964\n0o\t476965\n3.13\t476966\n几笔\t476967\n|洛奇\t476968\n辊压\t476969\n门轮\t476970\n花鸡\t476971\n数鸭子\t476972\n歌迷\t476973\ncarboxylic\t476974\n顶多\t476975\n耷\t476976\n外形\t476977\n万嘉\t476978\n高尔察克\t476979\n罪臣\t476980\n最迟\t476981\n参演\t476982\n脑残片\t476983\n北京商业地产\t476984\nSilence\t476985\n西安曲江文化产业投资(集团)有限公司\t476986\n气气\t476987\n头围\t476988\n陈华\t476989\n宁肯\t476990\ntrie树\t476991\nSenate\t476992\n汉腾X7\t476993\n消费积分\t476994\n退牌\t476995\n方才\t476996\n无收缩灌浆料\t476997\nRunway\t476998\nnable\t476999\n报告老师我是东北银\t477000\n土工\t477001\n5.6\t477002\n干烧\t477003\n生命体\t477004\n两厢\t477005\n真白希実\t477006\n小神龙俱乐部\t477007\n命令窗\t477008\nMaje\t477009\n吴淞中学\t477010\n资料片\t477011\n6班\t477012\n撕裂者\t477013\nVue.js\t477014\n超时空堡垒\t477015\n有教\t477016\n父表\t477017\n新凤霞\t477018\n真能\t477019\n养生茶\t477020\n流击\t477021\nappdelegate\t477022\nconstellation\t477023\n学生会组织部\t477024\n元角分\t477025\n手机架\t477026\n歌剧魅影\t477027\n三国无双\t477028\nCentral\t477029\n日本环球影城\t477030\n蝙蝠侠大战超人:正义黎明\t477031\n223.104.14.184\t477032\n朋友群\t477033\n胸骨\t477034\n上下游产业链\t477035\nTUMD\t477036\nStatus\t477037\n沪皖\t477038\n西安市灞桥区人民政府\t477039\n威锋论坛\t477040\n100年\t477041\n宋燕\t477042\n人名章\t477043\n百草味\t477044\n中国花卉网\t477045\nps磨皮插件\t477046\n巴尔喀什湖\t477047\n苏晴\t477048\n直销同城网\t477049\n双属\t477050\n争夺战\t477051\n北京移动\t477052\n五言绝句\t477053\n天府一街\t477054\nursa\t477055\n柯惠\t477056\n瓷雕\t477057\n若初文学网\t477058\n广州百万葵园\t477059\n银城中路\t477060\n22W\t477061\n平衡仪\t477062\nSq\t477063\n逊克县\t477064\n符医\t477065\n内源融资\t477066\nNothing\t477067\n安族网\t477068\n文明岗\t477069\nselecte\t477070\n搜索域\t477071\n布克赛尔县政府\t477072\n喜德盛\t477073\nAUTOIT\t477074\nwebupload\
t477075\n半程马拉松赛\t477076\n青岛统计信息网\t477077\n20钢\t477078\n李蕾\t477079\n广东银洋环保新材料有限公司\t477080\n3.6%\t477081\n洪拳\t477082\nFantasy\t477083\n西交利物浦\t477084\n7700HQ\t477085\nqcy\t477086\n魏辉\t477087\npursuit\t477088\n轮系\t477089\nSandisk\t477090\ndrg\t477091\n爱婴室\t477092\nEIGRP\t477093\n短道速滑世锦赛\t477094\n硅钢片\t477095\n韩毓海\t477096\n看官\t477097\n领秀城\t477098\niproute2\t477099\n神宠\t477100\n金山大道\t477101\n蒙古包\t477102\n绊脚石\t477103\n优先付款\t477104\n云线\t477105\n桃月\t477106\n马明\t477107\n20150526\t477108\n老门东\t477109\n还原型\t477110\n定购\t477111\necnu\t477112\n河南银监局\t477113\n月亮\t477114\n爵尼\t477115\nuj\t477116\n不成型\t477117\n云通讯录管理中心\t477118\n盐城人才网\t477119\n京都府\t477120\n江玲\t477121\n忌食\t477122\n缓征\t477123\n等别\t477124\n增值税发票系统\t477125\n桑吉平措\t477126\n厦门市第二医院\t477127\n万美\t477128\n涨停宝\t477129\n特许权使用费\t477130\nfreaky\t477131\n有口难言\t477132\n闪身\t477133\n西村\t477134\n猪猪侠之变身小英雄\t477135\nVxWorks设备驱动开发详解\t477136\ndpk750\t477137\n偷玩\t477138\n岔气\t477139\n鲁班书\t477140\n两千米\t477141\n浙江医药高等专科学校\t477142\n摸金天师\t477143\n第四十八条\t477144\n泡狐龙\t477145\n将军路\t477146\n大研古城\t477147\n18.00\t477148\n太平门\t477149\nInstagram\t477150\n三峡人家\t477151\n雷蛇战锤狂鲨\t477152\n早筛\t477153\n滴漏式\t477154\n96平方厘米\t477155\nBIRD\t477156\n欲火焚身\t477157\n典论\t477158\nopenvpn\t477159\n商住房\t477160\n吉林大学》\t477161\n篆隶\t477162\n脱气\t477163\n首儿\t477164\n麦莉\t477165\nAlberta\t477166\n携程亲子园\t477167\n铁运\t477168\nalfred\t477169\n酒杯\t477170\nimToken\t477171\ncakes\t477172\n菲姐\t477173\n芭蕾舞鞋\t477174\n第58号\t477175\n青石板\t477176\n/body\t477177\nS01E02\t477178\n37.2\t477179\n杨秀清\t477180\nrandom\t477181\n第大\t477182\n33栋\t477183\n俚语\t477184\n极米社区\t477185\n软虫\t477186\n44万\t477187\n包开\t477188\nat变速箱\t477189\n杨国\t477190\n8866\t477191\n家用投影机\t477192\n六韬\t477193\n武夷山风景区\t477194\n芒果汁\t477195\nAbode\t477196\n同场\t477197\n海东地区\t477198\n排程\t477199\n植物保护学院\t477200\n39名\t477201\n1662\t477202\n刘亭\t477203\n恒逸石化\t477204\n松门镇\t477205\n财采\t477206\n酥\t477207\nHg\t477208\n二哥\t477209\n古筝曲\t477210\n科蓝软件\t477211\n2009年8月\t477212\n保利紫山\t477213\n十四条\t477214\n设备购置费\t477215\n007幽灵党\t477216\n艾虎\t477217\n干好\
t477218\nHEYZO\t477219\n阻燃型\t477220\n50款\t477221\n帕丽斯希尔顿\t477222\n作用于\t477223\n芽儿\t477224\n巨核\t477225\nlabeling\t477226\n肇庆四会\t477227\n合纵科技\t477228\nrunnable\t477229\n覆盖率\t477230\nMyBatis-generator\t477231\n常用\t477232\n内长\t477233\nripple\t477234\n上海市金山区\t477235\n幸福街\t477236\n65r15\t477237\n茶网\t477238\n知世\t477239\n衾\t477240\n靴子\t477241\n.7z\t477242\ngouki\t477243\n中篇小说\t477244\n欧力士\t477245\n牛元帅斗牛\t477246\n643\t477247\n坦克链\t477248\n伊比\t477249\n白兰\t477250\nLE\t477251\n湖鲜\t477252\n保密柜\t477253\n衣补\t477254\n购物群\t477255\n肥佬\t477256\n复旦大学哲学学院\t477257\n蚊车\t477258\n爱文\t477259\n辞呈\t477260\n一平方米\t477261\n林徽因\t477262\n北京市育英学校\t477263\n臂膀\t477264\n道路运输信息网\t477265\n酱酱\t477266\n莹光\t477267\n贺囍\t477268\nie8浏览器\t477269\n志高空调\t477270\n金斯利安\t477271\n有机磷\t477272\n集邮\t477273\n裸视\t477274\n语c\t477275\n江中牌健胃消食片\t477276\n650nk\t477277\n赞赞赞\t477278\n注册会计师协会\t477279\n呀呀呀\t477280\n训练器\t477281\nRadius\t477282\n极路由4\t477283\nFATF\t477284\n珂润\t477285\n衫\t477286\n康熙微服私访记\t477287\necosys\t477288\n鸡红\t477289\n木制工艺品\t477290\n抚顺银行\t477291\n华夏人寿\t477292\n川东\t477293\nLonely\t477294\n张店镇\t477295\n打歌\t477296\n汗管瘤\t477297\n女八路\t477298\n摩亭\t477299\n吸费\t477300\n风箱\t477301\n系统浏览器\t477302\n热血街舞\t477303\n第41号\t477304\neasywechat\t477305\njemalloc\t477306\ninput输入框\t477307\nANR\t477308\n31.0\t477309\n山东能源淄博矿业集团有限责任公司\t477310\n霜\t477311\n用人\t477312\n解释器\t477313\ngdgp3.chinaxinge.com/shuju2/201708/201782021063461608.txt\t477314\nx3\t477315\n扩张性心肌病\t477316\n长垣县\t477317\npywin32\t477318\nprotector\t477319\n邦德大学\t477320\n上古卷轴ol\t477321\nTongue\t477322\n集体版\t477323\nOnlylady女人志\t477324\n犯上\t477325\n大洋镇\t477326\n1886\t477327\n劝慰\t477328\n佛山乐从\t477329\n管理学基础\t477330\ntram\t477331\nmaxxaudio\t477332\n帕罗西丁\t477333\n第168集\t477334\n第41次\t477335\n姚期\t477336\n罗曼蒂克消亡史\t477337\nj2se\t477338\n管线\t477339\n液柱\t477340\ne8372\t477341\n似然\t477342\n达山\t477343\n饿狼\t477344\n姚勇\t477345\n银川路\t477346\nExotic\t477347\n轻奢\t477348\n剧终\t477349\nepub360\t477350\n中东\t477351\n诗名\t477352\n丁丁猫\t477353\n京券\t477354\nClarisonic\t477355\nTaiwanese\t47735
6\n韩字\t477357\n链断\t477358\n小娜迦\t477359\n28分钟\t477360\n真的很难\t477361\n解及\t477362\n江苏省高院\t477363\n泽尻胡蝶\t477364\n堂会\t477365\n梁子湖区\t477366\n万虎\t477367\n陈公博\t477368\n安徽省电力公司\t477369\n素股\t477370\n糖豆广场舞\t477371\nSparkSession\t477372\nSous\t477373\n盒马生鲜\t477374\n立体停车场\t477375\n福州火车南站\t477376\nムCG\t477377\nCoolEdit\t477378\n大奖\t477379\n玫瑰味\t477380\n零头\t477381\nhoff\t477382\n洛斐\t477383\n测验\t477384\n传火\t477385\n多宝塔\t477386\ngop\t477387\n受托\t477388\n收束\t477389\n滑动式\t477390\n城市建成区\t477391\narpu\t477392\nc2c\t477393\n印尼盾\t477394\n天下有情人\t477395\n警示灯\t477396\n西单女孩\t477397\n热血中文网\t477398\n饷银\t477399\n低频治疗仪\t477400\n两步式\t477401\n保安服\t477402\n戏\t477403\n杨桥镇\t477404\nLISREL\t477405\n文评\t477406\n通达信\t477407\n木牌\t477408\n2話\t477409\n夜斗\t477410\n郭某\t477411\nHuffman\t477412\n非酋\t477413\n_网\t477414\n南京租房网\t477415\n1600w\t477416\n长沙中考网\t477417\n青人社\t477418\n凤灵\t477419\n西安翻译学院\t477420\n南通电信\t477421\n铯\t477422\n美娜多\t477423\n下元节\t477424\n黑龙江垦区\t477425\n汪滔\t477426\navchd\t477427\n类似于\t477428\n链接库\t477429\n儿研所\t477430\n有求\t477431\nitextsharp\t477432\n陈皮糖\t477433\n传递函数\t477434\n月桂\t477435\n一二觉\t477436\n综合部\t477437\nISO9001认证\t477438\n张表\t477439\n川酒\t477440\n小摊\t477441\nUH\t477442\n奇数页\t477443\n王玺\t477444\n淘宝返利网\t477445\n中文无敌版\t477446\n条列\t477447\n孔轴\t477448\n普鲁卡因\t477449\n幽灵岛\t477450\n后背痛\t477451\n於潜镇\t477452\n王栋\t477453\nEmission\t477454\n巨无霸\t477455\n动视\t477456\nstore\t477457\n17年12月\t477458\n嘉兴市妇幼保健院\t477459\nsiebel\t477460\n御道\t477461\n苏州大学东吴商学院\t477462\n金手镯\t477463\nFIS\t477464\nsorts\t477465\nVm\t477466\n巴奴火锅\t477467\n色布\t477468\n淘沙\t477469\nTOPLIFE\t477470\n桌下\t477471\nohne\t477472\n看值\t477473\n王成\t477474\n跳齿\t477475\n青储机\t477476\n华君武\t477477\n5000只\t477478\n铝酸钙\t477479\nAccredited\t477480\n骗财\t477481\nlimitless\t477482\n蒙福\t477483\n锌粒\t477484\n78集\t477485\nBeverly\t477486\n虎力\t477487\n豪斯\t477488\n发电部\t477489\n后氧传感器\t477490\n山东省环保厅\t477491\nCharting\t477492\nunfortunately\t477493\npapp\t477494\n红色警戒2共和国之辉2\t477495\nIWS\t477496\n男爵\t477497\n生财之道\t477498\n共价化合物\t477499\n完美版\t477500\n清光绪\t47750
1\n惩治\t477502\n亿房家居网\t477503\nBagging\t477504\n资源县\t477505\nAbnormal\t477506\n陆股通\t477507\nHypermesh\t477508\nRidge\t477509\n金融商务区\t477510\nzf2\t477511\n兼施\t477512\n沃利\t477513\n1500m\t477514\n闪辞\t477515\n分数\t477516\n洗洗手\t477517\nOFweek安防网\t477518\n信息化\t477519\n化身\t477520\n奋韩网\t477521\n宓\t477522\n透层\t477523\n6.5.13\t477524\n唐苑\t477525\n死亡之舞\t477526\n输卵管通\t477527\n苏寞\t477528\nDID\t477529\n明月镇\t477530\n保温桶\t477531\n服装公司\t477532\n北京金融局\t477533\n下奶\t477534\n佳禾\t477535\n心迷\t477536\n波旬\t477537\n矿质\t477538\n法仪轨\t477539\n【屌德斯\t477540\n养龙鱼\t477541\n营林\t477542\n区招商局\t477543\n腺样体肥大\t477544\n嘉德在线\t477545\n桩土\t477546\n九龙山\t477547\n7000万美元\t477548\n冰尘\t477549\n留不住\t477550\n党外\t477551\nokbike\t477552\n丁酯\t477553\n串口通信协议\t477554\nSimulink\t477555\n滑液\t477556\n国际组织\t477557\nX70A_\t477558\n热身赛\t477559\n冥想\t477560\n青岛工商银行\t477561\n哥斯拉2\t477562\n很快\t477563\n榆林市政府\t477564\n特全\t477565\n期日\t477566\n準備\t477567\nVisualStudio\t477568\n金园\t477569\n拾年\t477570\nntfs-3g\t477571\n凤岗汽车站\t477572\n续杯\t477573\n笼型\t477574\n鱼精\t477575\n外研社新标准小学\t477576\n脚踩\t477577\n黄糖\t477578\n篆体\t477579\nPexpect\t477580\n倒回\t477581\n2S\t477582\n商业银行资本管理办法\t477583\n毕业证明书\t477584\n中央郡\t477585\n新北汽\t477586\nbfd\t477587\n月供\t477588\n2号\t477589\n密贴\t477590\n脂溶性\t477591\nmobie\t477592\n喵呜\t477593\n黔西南\t477594\n硝态氮\t477595\n局方\t477596\nWhoo\t477597\n头一个\t477598\nNaluone\t477599\nmessagebox\t477600\n熊猫金币\t477601\n蛟川书院\t477602\n深圳地铁14号线\t477603\n饱和氯化钠\t477604\ndriver\t477605\n腔梗\t477606\n300624\t477607\n创国\t477608\n徐萌\t477609\n1第二章\t477610\n栋仁的时光\t477611\nnmos\t477612\n辽宁日报\t477613\n江苏省体育局\t477614\nobservations\t477615\n青花郎\t477616\n望江府\t477617\n24季\t477618\n布条\t477619\n暗糖\t477620\n丝光\t477621\ncreo4.0\t477622\n新抚\t477623\n血兽\t477624\n裙撑\t477625\n紫丁香\t477626\n中共十四大\t477627\n干结\t477628\npresenter\t477629\n轻质碳酸钙\t477630\n3009\t477631\n广州飞歌汽车音响有限公司\t477632\n新牛津英汉双解\t477633\ntrailing\t477634\n武陵源\t477635\n牦牛骨\t477636\n耿彦波\t477637\nep2\t477638\n11月初\t477639\n民用车\t477640\n髋关节置换术\t477641\n30份\t477642\nrx460\t477643\nMoms\t477644\n都江堰水利
工程\t477645\n遠\t477646\n工作犬\t477647\n小城之春\t477648\n盘点机\t477649\nFounders\t477650\n电子信\t477651\n朱成碧\t477652\n十二招\t477653\n沿河村\t477654\n盲道\t477655\n编者\t477656\n中共十五大\t477657\n不受欢迎\t477658\n杜宾犬\t477659\ncontroller\t477660\nPIONEER\t477661\n山东中医药大学第二附属医院\t477662\nKingCamp\t477663\n难友\t477664\n乐哈\t477665\n防撞墩\t477666\n龙虎门\t477667\n代发\t477668\n恒智天成\t477669\n钢带\t477670\nTF\t477671\n朝阳沟\t477672\ncoroutine\t477673\nDiyCode\t477674\nMJExtension\t477675\n修真文\t477676\n六出花\t477677\n第4册\t477678\n红树林\t477679\nOpenFlow\t477680\n纪子\t477681\n乐天免税店\t477682\n航海世纪\t477683\n北发图书网旗舰店\t477684\n浅释\t477685\n2017年1月\t477686\n蛟川\t477687\n全托\t477688\n几根\t477689\nJawnHa\t477690\n优博\t477691\n提琴\t477692\n鼻咽部\t477693\n指甲刀\t477694\n盐酸多西环素\t477695\nWin8/8.1\t477696\n倒车入库\t477697\n滴水洞\t477698\nMoboPlayer\t477699\n神片\t477700\n营销宝\t477701\naccelerator\t477702\n海洛因\t477703\nSEVENFRIDAY\t477704\n林梅\t477705\nBL文\t477706\n大字报\t477707\n测字\t477708\n时代互联\t477709\n扦插\t477710\ni54460\t477711\n两兄弟\t477712\n龙格\t477713\n家具\t477714\n证券部\t477715\n爱婴岛\t477716\n潢川县\t477717\n样本量\t477718\nreborn\t477719\n华夏信用卡\t477720\n空户\t477721\n少些\t477722\n石台\t477723\nmashup\t477724\n孚能科技(赣州)有限公司\t477725\n彩砖\t477726\n万城\t477727\n摸样\t477728\n旧式\t477729\nStyling\t477730\npuss\t477731\nuad\t477732\nLynne\t477733\nBYOD\t477734\n中国青年报\t477735\nShaolin\t477736\nbalances\t477737\n滑片式\t477738\namai\t477739\n1188\t477740\n嫉妒恨\t477741\n14处\t477742\nWIM\t477743\n活动\t477744\n铁血网\t477745\n不可替代性\t477746\n第34号\t477747\n外系\t477748\n楼控\t477749\n浠水县人民政府\t477750\n肝药酶\t477751\n食之契约\t477752\n鱼骨\t477753\n轩辕剑3外传:天之痕\t477754\n胃肠科\t477755\n信阳新县\t477756\n国彩\t477757\nstudio9\t477758\niis8\t477759\nnutch\t477760\n单台\t477761\nprintscreen\t477762\nxueshu\t477763\nideas\t477764\n玛戈\t477765\nerji\t477766\n百度网\t477767\n100.00\t477768\n200cc\t477769\n苏园\t477770\n虐恋情深\t477771\n邓先森\t477772\n三十一岁\t477773\n起居室\t477774\n丽苑\t477775\n卢志强\t477776\n一件件\t477777\n$2}\t477778\n交换群\t477779\n狂袭\t477780\n管廊\t477781\nPart\t477782\n卡梅尔小镇\t477783\n阴谋诡计\t477784\nGeeks\t477785\n南岭村\t47778
6\nUSJ\t477787\nrevit\t477788\nbilingual\t477789\nLQ-630K针式打印机\t477790\nintellig\t477791\n圣殿骑士团\t477792\n魁奇路\t477793\n迪士普\t477794\n上式\t477795\nnba2k17mc\t477796\n中国高新区\t477797\nhtv\t477798\n高压变频器\t477799\n玖龙纸业\t477800\n天桥艺术中心\t477801\n平顶山晚报\t477802\n北京市房山区住房和城乡建设委员会\t477803\n苦不堪言\t477804\nDxOMark\t477805\n努力工作\t477806\n苦荞米\t477807\n223.104.11.233\t477808\n测试纸\t477809\n闪亮\t477810\nCAD图层\t477811\n张继科\t477812\n演奏者\t477813\n第一趟\t477814\n撕掉\t477815\n引向\t477816\n高姝瑶\t477817\n微雪课堂\t477818\n补水泵\t477819\n国美金融\t477820\n梅山水库\t477821\n公共户\t477822\ncdlinux\t477823\n盛行\t477824\n罕\t477825\n稳稳\t477826\n单镜\t477827\n上海东站\t477828\n先进型\t477829\n苏宁金融研究院\t477830\n楸木\t477831\n董子初\t477832\n魅族pro\t477833\n三相异步电动机\t477834\nforeground\t477835\n国家卫生镇\t477836\n武嘉\t477837\n淫羊霍\t477838\n大庆论坛\t477839\n京菜\t477840\n月山\t477841\n第六位\t477842\n全编\t477843\n新兵\t477844\n白玉县\t477845\n无核化\t477846\n阿克曼\t477847\n红蓝3D\t477848\n1.12.3\t477849\n榜上无名\t477850\nsuchas\t477851\n云挂机\t477852\n潭门\t477853\n大渝火锅\t477854\n仁信\t477855\n海上花\t477856\n特米\t477857\n新表\t477858\n西沽公园\t477859\njonny\t477860\n蒙古高原\t477861\nAllison\t477862\n旋转屏\t477863\n牵扯\t477864\nNotebooks\t477865\n党委书记\t477866\n植物大战僵尸1\t477867\n區別\t477868\n2018年4月21号\t477869\n雅蠛蝶\t477870\n安全生产责任保险\t477871\n做饭\t477872\nbesides\t477873\n钱王祠\t477874\n被枪杀\t477875\n2025\t477876\n艺术村\t477877\n记金华的双龙洞\t477878\nios5\t477879\nUkraine\t477880\nChegg\t477881\n常州路\t477882\ntolino\t477883\n文献王\t477884\n平安养老险\t477885\n皮下\t477886\n集贤县\t477887\n优质酒店\t477888\n国农\t477889\n万灵\t477890\n上诉案\t477891\n战绳\t477892\nTrending\t477893\n权路\t477894\nAppSync\t477895\n95本\t477896\n第九弹\t477897\n安南\t477898\n小中风\t477899\n繁荣兴盛\t477900\n张铁军\t477901\n乌鸦\t477902\n爬藤\t477903\n清华大学艺术教育中心\t477904\n装船机\t477905\n通化县\t477906\n孙谦\t477907\n2.30\t477908\n49英寸\t477909\n差旁通阀\t477910\n救球\t477911\n湖南省物价局\t477912\n陈立华\t477913\n优信\t477914\n火警\t477915\nwr842n\t477916\n钱枫\t477917\nlupicia\t477918\n覆铜\t477919\n8.7.1\t477920\n围标串标\t477921\n海伦堡\t477922\n奶舞\t477923\n菠萝皮\t477924\n误打误撞\t477925\n可道云\t477926\n1904年\t477927\n爆发\
t477928\n虎男\t477929\n谴\t477930\n国妆\t477931\n克里米亚战争\t477932\n关岭\t477933\n登山表\t477934\n鲁尔区\t477935\nwalked\t477936\n教案集\t477937\n自杀论\t477938\n陈枫\t477939\nmeitu\t477940\n凡人帝国\t477941\n陈寅恪\t477942\n乾坤篇\t477943\n老年护理学\t477944\n精确\t477945\n凡哥\t477946\nrefinement\t477947\n省委办公厅\t477948\n影音先锋av撸色_先锋av资源男人站_2016影音先锋av撸色\t477949\n福建省电子信息集团\t477950\n看人\t477951\n永不消逝的电波\t477952\n汇丰\t477953\n张国新\t477954\n入体\t477955\nmockups\t477956\nbuyao\t477957\ncheers\t477958\nshack\t477959\n企业品牌\t477960\n专任\t477961\nselections\t477962\n荣耀畅玩7a\t477963\n龙爱\t477964\nmicosoft\t477965\nlanyu\t477966\nReadOnly\t477967\n智联招聘网\t477968\nrecordset\t477969\nmsgrcv\t477970\n检验师\t477971\n去痣\t477972\n巴拉斯\t477973\n第48集\t477974\n绿云\t477975\n群艺\t477976\n五周岁\t477977\n福建移动\t477978\n浪女\t477979\n相形\t477980\n方城县\t477981\nGumpYan\t477982\n不骗\t477983\n产车\t477984\nB85M-G\t477985\n荆山\t477986\nLongest\t477987\n哈尔滨道外区\t477988\nVV7s\t477989\nwindowbuilder\t477990\neoi\t477991\n花苗\t477992\n国务院港澳事务办公室\t477993\n百足\t477994\nRemove\t477995\n79\t477996\n批量替换\t477997\n佳能7D\t477998\n科通芯城\t477999\n一脱\t478000\n天隆\t478001\n滦镇\t478002\n混水\t478003\n人民观\t478004\nmult\t478005\ncn2\t478006\n审计部\t478007\n线路工\t478008\n高店乡\t478009\n糊\t478010\n非线性函数\t478011\n罗江县人民政府\t478012\n远近光\t478013\n悲剧\t478014\n脚力\t478015\n坦洲\t478016\n卜蜂莲花\t478017\n妖人\t478018\n工作责任心\t478019\n占位图\t478020\n苯酐\t478021\n48所\t478022\n酒令\t478023\n马屁\t478024\n元宝鱼\t478025\n普威\t478026\ngetClass\t478027\n尧天\t478028\n第111章\t478029\n胶粘\t478030\n3.11.0\t478031\n卢米埃\t478032\n002509\t478033\n王一成\t478034\n无界鼠标\t478035\n滚水\t478036\n科技致富向导\t478037\nideamaven\t478038\n兴业县\t478039\n塘朗山\t478040\n关键性\t478041\n路虎发现5\t478042\n就业登记表\t478043\n中国武宁政府\t478044\n李彦\t478045\n23分钟\t478046\nrtk\t478047\n机械学\t478048\n韦礼安\t478049\n斯滕伯格\t478050\n民办大学\t478051\nFlores\t478052\n爱信诺航天信息\t478053\n奥德曼\t478054\n34000\t478055\n华鸿嘉信\t478056\n流地址\t478057\n濯\t478058\n质量监督检验中心\t478059\nwinrm\t478060\nThailand\t478061\n来荡\t478062\n薏仁茶\t478063\n2010年1月\t478064\n上海市测绘院\t478065\n广东科贸职业学院\t478066\n军售\t478067\n工资性\t47806
8\n韩范儿\t478069\n火影究极风暴\t478070\nAM61\t478071\n秀宠网\t478072\n王昭君\t478073\n韩文清\t478074\n餐饮具\t478075\n证券商\t478076\nTactical\t478077\n紫茉莉\t478078\nheader\t478079\n汪治怀\t478080\n6码\t478081\n星鸿\t478082\n802.11\t478083\n5.6.30\t478084\n中继器\t478085\n晓慧\t478086\n杰尼斯事务所\t478087\n硝酸钾\t478088\n庞村镇\t478089\n山莓\t478090\n动念\t478091\n29款\t478092\n清热散结片\t478093\nmoonlight\t478094\n纳税信用评级\t478095\ninput函数\t478096\n壮志在我胸\t478097\nbring\t478098\n诗诗\t478099\n吸星大法\t478100\n趣解\t478101\n谭宗明\t478102\nMn\t478103\nBayesian\t478104\nES300h\t478105\n钢珠\t478106\n中国国际旅行社总社\t478107\n石中玉\t478108\nspinal\t478109\n杨红梅\t478110\n红海行动百度云\t478111\nEase\t478112\ndiy\t478113\n蛇类\t478114\n坤泰\t478115\n北京数字认证股份有限公司\t478116\n靓妆\t478117\n针锋相对\t478118\n每日农经\t478119\nSCH\t478120\n任远\t478121\n圣路易斯华盛顿大学\t478122\n悬疑\t478123\n死飞自行车\t478124\n盐酸达泊西汀片\t478125\n民营企业\t478126\n北京地铁1号线\t478127\n东方人\t478128\n鬼怪\t478129\nBathroom\t478130\n百度前端技术学院\t478131\nM5\t478132\nmindg\t478133\nsdjy\t478134\n发疯\t478135\n音乐集\t478136\n线轨\t478137\n刀叉勺\t478138\n华映资本\t478139\n装傻\t478140\n下属\t478141\n新数\t478142\n九一\t478143\n里尔\t478144\n龍\t478145\n代审\t478146\n降挡\t478147\n中南大学图书馆\t478148\n艾瑞泽5论坛\t478149\n黑龙江中医药大学附属第二医院\t478150\n飞雷\t478151\n这周\t478152\n通假字\t478153\n吕红\t478154\n钟华\t478155\n鲁荣渔\t478156\nf0\t478157\n乐驾\t478158\n做什么\t478159\n许国璋\t478160\n塑料盖\t478161\n时越\t478162\n对不起\t478163\n几克拉\t478164\n黄土\t478165\nskyliner\t478166\n库门\t478167\n哈莫雷特\t478168\n0716\t478169\nWMware\t478170\n段泥\t478171\nmazda\t478172\n陕西移动\t478173\n600家\t478174\n石雕\t478175\n银包\t478176\n色相环\t478177\n钟志华\t478178\n李自成\t478179\n康奈\t478180\n贝弗利\t478181\n中孚实业\t478182\n林阿兔\t478183\n晟铭\t478184\n抢先\t478185\nam2\t478186\nj7\t478187\n一个六\t478188\n压力感\t478189\nvisions\t478190\n百灵鸟\t478191\n滚装\t478192\n不断地\t478193\n第28集\t478194\nA套\t478195\n庆城县\t478196\n神马\t478197\n苦心经营\t478198\narcscene\t478199\n掌声\t478200\n鲜度\t478201\n华为M5\t478202\n透湿\t478203\n乳钉\t478204\nMegan\t478205\nダンス\t478206\nrain\t478207\n透射\t478208\n2929\t478209\n狂澜\t478210\n三服\t478211\n赵卫\t478212\n毁灭世界\t478213\n334\t478214
\nYar\t478215\ntoxicology\t478216\nAutolayout\t478217\n电子客票行程单\t478218\n教资\t478219\nPSD分层素\t478220\n歇\t478221\n祖德\t478222\nFSC\t478223\n足足\t478224\ntoomuch\t478225\n磁栅\t478226\n20c\t478227\n酶切\t478228\nCC人才网\t478229\n秦桑\t478230\n新疆天池\t478231\n上海房产证\t478232\n谢菲尔德\t478233\n简\t478234\n益盛药业\t478235\nhostease\t478236\n60项\t478237\n市卫计局\t478238\n禾服\t478239\n地理中国\t478240\n共处\t478241\n1047\t478242\nk4\t478243\n护士证\t478244\n易安居算命网\t478245\n农业综合信息网\t478246\nIPhoneX\t478247\nFlashPlayer\t478248\n636米\t478249\n如恩\t478250\n动源\t478251\notsu\t478252\n辔头\t478253\n链链\t478254\n忆青春\t478255\n小翼\t478256\n掌上宝\t478257\n绿地缤纷城\t478258\nbaoxian\t478259\n不息屏\t478260\n软件评测师\t478261\nfluids\t478262\n汗腺\t478263\nmurphy\t478264\n大头儿子小头爸爸\t478265\n2017年6月1日起\t478266\n之二十六\t478267\n空间商\t478268\n归因\t478269\n反曲\t478270\n石油圈\t478271\n70包\t478272\n吨级\t478273\nvivox21拆解图\t478274\n傲世三国之三分天下\t478275\n提尔皮茨\t478276\neir\t478277\n综合处\t478278\n五华\t478279\n超星学习\t478280\n哈雷戴维森\t478281\n30亿\t478282\n苦瓜汁\t478283\n恶霸犬\t478284\n液压过滤器\t478285\n河南卫视\t478286\n郑晓明\t478287\n辉瑞制药\t478288\n西昌\t478289\n墨粉\t478290\n中国特\t478291\n职高生\t478292\n多事之秋\t478293\n蓝花\t478294\n相见时难别亦难\t478295\n我家的熊孩子\t478296\n征收办\t478297\n纸杯机\t478298\n4.4\t478299\n新滩\t478300\n魔导\t478301\n人力资源和社会保障部职业技能鉴定中心\t478302\n人流\t478303\nExtremely\t478304\n离场\t478305\n火稚鸡\t478306\n凌统\t478307\n方俊\t478308\neslint\t478309\n顺祥\t478310\n誓血之盟\t478311\nwww.17house.com\t478312\npint\t478313\n斗破苍穹动漫第二季\t478314\n声子\t478315\n0001\t478316\n昌吉东路\t478317\n电子体\t478318\n噪比\t478319\n纬三路\t478320\n温氏集团\t478321\n重庆市住房公积金管理中心\t478322\nVue.js库\t478323\n太原城\t478324\n九炼归仙\t478325\nOFweek人才网\t478326\ndrupal7\t478327\n卖腐\t478328\n布雷斯特\t478329\n世界记忆名录\t478330\n大阪松\t478331\nX70\t478332\n汉高祖\t478333\n红豆绿豆\t478334\n拉丝机\t478335\n套磁\t478336\n连锁加盟展\t478337\n江山帝景\t478338\n萝卜裤\t478339\nKRAIT\t478340\n蛹\t478341\n李泰民\t478342\nphalapi\t478343\n还款日\t478344\n超级四世同堂\t478345\n泉港\t478346\n金康\t478347\n印象派\t478348\n氯化法钛白粉\t478349\n上海戏剧学院\t478350\nWin10系统Office2016\t478351\n南京师范大学附属小学\t478352\n全真模\t478353\n
豪门小爱\t478354\nRW\t478355\n暖企\t478356\n中国幕墙网\t478357\n维生素AD滴剂\t478358\n危害公共安全罪\t478359\n集成显卡_\t478360\nй\t478361\n对比表\t478362\n中国民用航空飞行学院\t478363\n|物\t478364\n壁架\t478365\nClassifieds\t478366\n组方\t478367\nSwitching\t478368\nwho\t478369\nsccc\t478370\n宣工\t478371\n公开性\t478372\n凤凰\t478373\n股转系统\t478374\nmultimedia\t478375\n三合乡\t478376\n上海市大同中学\t478377\n营业厅\t478378\n十九大习近平\t478379\n声情\t478380\n谷歌\t478381\n公平交易\t478382\n苏月\t478383\n空管局\t478384\n粉干\t478385\n憨态\t478386\nTyco\t478387\n数调\t478388\n浮生六记\t478389\n泰州港\t478390\n泰林\t478391\n兴业信用卡\t478392\n_特\t478393\n贝瓦儿歌大合集\t478394\nXix\t478395\n兼济\t478396\n头脑锋暴\t478397\n南京晓庄学院\t478398\n7.0.3\t478399\n次幂\t478400\n是为什么呢_\t478401\ngege\t478402\nYeelight语音助手\t478403\nhank\t478404\n四男\t478405\n面码\t478406\n异分母\t478407\n顾及\t478408\n紫极\t478409\n张文俊\t478410\nm-1\t478411\n楼友\t478412\n后勤集团\t478413\n亿佰\t478414\n交易费\t478415\n小米miui8\t478416\n安悦\t478417\n茶花树\t478418\n阿慕施\t478419\n异常率\t478420\n张铁\t478421\n人世间\t478422\n2725\t478423\nARMOUR\t478424\n90多个\t478425\n漫威超级争霸战\t478426\neit\t478427\n雄安大学\t478428\n风琴式\t478429\n大东方\t478430\nh67\t478431\nNOTE4\t478432\nWATCH\t478433\nWannaOne\t478434\n蜡质\t478435\n3星\t478436\n人卡\t478437\n耻辱2\t478438\n结后\t478439\n华凌\t478440\ngigaset\t478441\n十进制数\t478442\n邓强\t478443\n买错\t478444\n八目\t478445\n陈利\t478446\nTECH\t478447\n早癌\t478448\n香料\t478449\nminana\t478450\n台内\t478451\n菜包\t478452\n人生\t478453\n草房\t478454\n冬暖\t478455\n假戏\t478456\n30000元\t478457\n多介质过滤器\t478458\n平线\t478459\n戴秉国\t478460\n1年内\t478461\n置成\t478462\n江口沉银遗址\t478463\n黄陂一中\t478464\n底纸\t478465\n平海\t478466\n波堤\t478467\n克拉公寓\t478468\n大该\t478469\n1518\t478470\n北仑区\t478471\n燃烧室\t478472\n无尽的爱\t478473\n弋果\t478474\nguanli\t478475\n择邻\t478476\n玥玛锁\t478477\n圆环\t478478\n大叔\t478479\n人生格言\t478480\n浮世万千\t478481\n大托\t478482\nPHPStorm\t478483\n汪精卫\t478484\n温塘镇\t478485\n无痛苦\t478486\n酒博会\t478487\n砂带机\t478488\n世业洲\t478489\n火凤凰奇迹\t478490\n力赞\t478491\nTHIS\t478492\n跳频\t478493\n张宏亮\t478494\nwifi模块\t478495\n拾金不昧\t478496\n4312\t478497\n迪生\t478498\n金丝桃\t478499\nDetail\t478500\n第
12节\t478501\n登台\t478502\n佐匹克隆片\t478503\n弃风\t478504\n十字门\t478505\n政宗君\t478506\n发光体\t478507\n卵针\t478508\n2000瓦\t478509\n很温柔\t478510\n铜陵三中\t478511\n剪取\t478512\n【理上网\t478513\nloo\t478514\n2.84\t478515\n孤岛求生5\t478516\n木鱼花\t478517\n特别节目\t478518\nVo\t478519\n车照\t478520\nB柱\t478521\ngetView\t478522\n传出\t478523\n洛轴\t478524\n阴天\t478525\n无药\t478526\n狄仁杰之四大天王\t478527\n7-10\t478528\n七种\t478529\n查慎行\t478530\n海南工商职业学院\t478531\n三百个\t478532\n东北往事黑道风云\t478533\n考核\t478534\n长江经济带发展规划纲要\t478535\n爱世克斯\t478536\n长夏\t478537\n香酥鸡\t478538\n优游网\t478539\nbstr\t478540\nRonson\t478541\n体内外\t478542\n列上\t478543\n哈尔滨万达城\t478544\n中国建设银行\t478545\n壶底\t478546\n电动打磨机\t478547\n微笑日\t478548\n园林公司\t478549\nShopnc\t478550\n一评\t478551\n数码宝贝\t478552\n石拐区\t478553\nggjy\t478554\nillu\t478555\nescort\t478556\n高香\t478557\nCLASSIC\t478558\n单身汉\t478559\n資源\t478560\n扎发\t478561\n番禺社区\t478562\niat\t478563\n像我这样的人\t478564\n山东职业学院\t478565\n江西省肿瘤医院\t478566\n2500w\t478567\n桃菜\t478568\n技术部\t478569\n战阵型\t478570\n贾似道\t478571\n保利物业\t478572\n工业园区\t478573\n多维度\t478574\n要不起\t478575\npc浏览器\t478576\n临盆\t478577\nEbay\t478578\n草酸钙\t478579\n联姻\t478580\n崔东树\t478581\n川崎小忍者\t478582\n武林派\t478583\n安庆市第一人民医院\t478584\n古神\t478585\n装配件\t478586\n头铁\t478587\nastah\t478588\n杂带\t478589\n美女犬\t478590\n弃货\t478591\n暴晒\t478592\n中国武术协会\t478593\nzhongji\t478594\n马燕\t478595\n虚焦\t478596\n宝莲灯前传\t478597\n朗姿股份\t478598\n堂兄\t478599\n基础\t478600\n手带\t478601\n撤职\t478602\n泞之翼\t478603\n串扰\t478604\n夏威夷航空\t478605\n鬼经\t478606\n中远海\t478607\n沈阳职业技术学院\t478608\n年报\t478609\n系统架构设计师考试\t478610\n睡\t478611\nsubsequent\t478612\nStruts2拦截器\t478613\n十渡风景区\t478614\nShadows\t478615\n20150213\t478616\n锄草机\t478617\n地质博物馆\t478618\n65名\t478619\n慧盈\t478620\nS15\t478621\n奇书\t478622\n转回\t478623\n看到\t478624\nadidas\t478625\n单纯性疱疹\t478626\n数据转换器\t478627\n上海外地\t478628\n西安交通大学医学院第二附属医院\t478629\n石屏县\t478630\n白菜网\t478631\n武逆\t478632\ntlv\t478633\n960evo\t478634\nmySql\t478635\n我的1979\t478636\n绝世兵王\t478637\n武具\t478638\n拖尸\t478639\n江府\t478640\n好唱\t478641\n自装\t478642\nAccuWeather\t478643\n奸污\t478644
\n复旦大学第二附属中学\t478645\n前桥\t478646\n医疗保险费\t478647\nSMILE\t478648\nスケベエルフ\t478649\nPOA\t478650\n青城古镇\t478651\n啪啦\t478652\n王玥波\t478653\nSoo\t478654\n川贝枇杷\t478655\n35条\t478656\n宫颈刮片\t478657\n李驰\t478658\n充电机\t478659\nlaosege\t478660\n丽华快餐\t478661\n六维\t478662\n爆雷\t478663\nBoiling\t478664\n码分\t478665\n劝解\t478666\nAilee\t478667\n奢华主义\t478668\n苍黄\t478669\n大众4s\t478670\n武汉市政府\t478671\n葡桃\t478672\n银华基金\t478673\n吴川市人民政府\t478674\ncurrently\t478675\n_安软市场\t478676\nfstream\t478677\n苯甲酸甲酯\t478678\nauthorship\t478679\nSATA2接口\t478680\n代理记账\t478681\nJerseys\t478682\nsesion\t478683\n收尘器\t478684\n羊脂球\t478685\n天阳\t478686\n罐装\t478687\n小米路由3\t478688\n心慈手软\t478689\n艳星们\t478690\n荔枝沟\t478691\ng5500\t478692\n四星级\t478693\n2015年3月1日\t478694\nFragile\t478695\n少得\t478696\n猿们\t478697\n应召女郎的秘密日记\t478698\n三十岁的女人\t478699\nfpga\t478700\n人工智能导论\t478701\nlol源\t478702\n三足\t478703\n百湖\t478704\n空气能热水器\t478705\n八戒服务购\t478706\n土质\t478707\n回滚\t478708\n工作365网\t478709\n缝合\t478710\n琼脂粉\t478711\n出局\t478712\npurchase\t478713\ngemini\t478714\n美林布洛芬\t478715\n海南橡胶\t478716\nLOLS6\t478717\n当爱已成往事\t478718\nlanded\t478719\n勇敢\t478720\nrebound\t478721\n评比\t478722\n拖拉管\t478723\n丹道\t478724\n65\t478725\n扬子江路\t478726\n蓝色大海的传说\t478727\n忆起\t478728\n流浪包\t478729\n$3\t478730\n肉丁网\t478731\nChoir\t478732\nAudirvana\t478733\n肉率\t478734\n异度之刃X\t478735\nCAMP\t478736\n万神\t478737\n晨钟\t478738\n迪沃\t478739\nzcash\t478740\n器乐\t478741\n密西西比\t478742\n母\t478743\n文字型\t478744\n俄勒冈州立大学\t478745\n交口县\t478746\n严老师\t478747\n临江村\t478748\n第十六篇\t478749\n帕尔哈\t478750\n64Bit\t478751\n杭州中天微系统有限公司\t478752\nAxureRP7\t478753\n奥兰多·布鲁姆\t478754\n蔚县\t478755\n玉祥\t478756\n五一\t478757\n第N个\t478758\n)网络技术有限公司\t478759\n第四人\t478760\n波姐\t478761\n祭词\t478762\n美图m8s\t478763\n生日派对\t478764\n按照规定\t478765\nsurrender\t478766\nAnalyses\t478767\nSnippet\t478768\n优酷黄金\t478769\nsinc\t478770\nCent\t478771\n刑事责任年龄\t478772\n行车证\t478773\n男爱\t478774\nfripSide\t478775\n河口瑶族自治县\t478776\nNightmare\t478777\n皇帝的新装\t478778\n惊鸿\t478779\n后记\t478780\n金字招牌\t478781\n新版图\t478782\n会议纪要\t478783\nShipped\
t478784\n白执事\t478785\nU15\t478786\n暗语\t478787\n人偶师\t478788\n李卫\t478789\n源程序\t478790\nuseragent\t478791\n人人草\t478792\n善谋\t478793\nS0\t478794\n莫氏硬度\t478795\n课酬\t478796\neneloop\t478797\n佐井\t478798\n消掉\t478799\nConvergence\t478800\nMIB\t478801\n茶市\t478802\n热电联产工程\t478803\n唐霜\t478804\n全面建成小康社会\t478805\nLOL\t478806\n老年性白内障\t478807\n迁至\t478808\n九年级语文\t478809\nwpsppt\t478810\n社会保障概论\t478811\nMafia\t478812\n上海国际时尚中心\t478813\n书箱\t478814\n廉租住房\t478815\n河北省委组织部\t478816\n微动\t478817\n1738\t478818\n卫生局\t478819\n秋夜\t478820\n夸客金融\t478821\n山东省社会组织管理局\t478822\n叶冰瑶\t478823\n洗煤厂\t478824\n不明\t478825\n李老师\t478826\n张益滔\t478827\n减价\t478828\n宁夏大学\t478829\n双瑞\t478830\n2.4.6\t478831\ngtx650ti\t478832\nfilt\t478833\n囧囧\t478834\n取景地\t478835\n武汉火车站\t478836\nyyw\t478837\naio520\t478838\n葛城\t478839\n心安理得\t478840\n戴耀廷\t478841\n北京顺丰\t478842\nstamp\t478843\n于莉\t478844\nbreathing\t478845\n怀孕后期\t478846\nwiin10\t478847\n华源\t478848\n百家乐\t478849\n老钱庄\t478850\n橡皮布\t478851\n饰两角\t478852\n思创\t478853\n软启动柜\t478854\n关口知宏\t478855\n事实论据\t478856\n网上教育\t478857\n红叉叉\t478858\nmultiscatter\t478859\n45厘米\t478860\n桥塔\t478861\n强战\t478862\n陆毅\t478863\n万株\t478864\n1万1\t478865\n商丘梁园\t478866\nCNTV\t478867\n紫金阁\t478868\nlatin-1\t478869\n百度云/115\t478870\n测汞仪\t478871\n清华大学法学院\t478872\n豪雨\t478873\n商业银\t478874\nnsfc\t478875\n卡拉迪亚\t478876\n山西省农业厅\t478877\npulse2\t478878\n资产\t478879\n解刨\t478880\n泰和\t478881\n哈巴涅拉\t478882\n销售价格\t478883\nEMM\t478884\n二维码扫码\t478885\n蔡佩轩\t478886\n颈动脉彩超\t478887\n合盛\t478888\n狼嚎\t478889\n南沙政府网\t478890\n奥尔巴尼\t478891\n无缘\t478892\n偷吃\t478893\nTOF\t478894\n深圳村委\t478895\nAmerica\t478896\n饮用\t478897\n国际会展中心\t478898\n传力杆\t478899\nfss\t478900\nPump\t478901\nrepeated\t478902\n雨战\t478903\n共轭梯度法\t478904\n海通\t478905\n提拉\t478906\n环东\t478907\n自慰杯\t478908\n诗音\t478909\n福彩3d字谜\t478910\n四川邮电职业技术学院\t478911\nrequst\t478912\n三年间\t478913\n端面\t478914\n万盛奥陶纪\t478915\nsfd\t478916\n100型\t478917\n7770\t478918\n谷歌拼音输入法\t478919\n贵阳经济技术开发区\t478920\n梅岭\t478921\n2000多万元\t478922\n陌上花园艺论坛\t478923\n双向板\t478924\n装卸机\t478925\nz0\t478926\n郊外\t
478927\n如此而已\t478928\n黄喉龟\t478929\n北方人才网\t478930\n叔父\t478931\nCalendars\t478932\n免职\t478933\n专人\t478934\n苏州装修公司\t478935\n人来人往\t478936\n韩丁\t478937\n滤过\t478938\n20170211\t478939\n宫星座\t478940\n停招\t478941\n美境\t478942\n第n项\t478943\n弘时\t478944\n沂源\t478945\n9月9日\t478946\n火暗\t478947\n洋女婿\t478948\n约法\t478949\nImportNew\t478950\n内蒙古人才网\t478951\n马套\t478952\n哈拉湖\t478953\n金殿\t478954\nitems\t478955\n一次函数y=kx+b\t478956\n一如既往\t478957\n复指数\t478958\n孤注一掷\t478959\n3531\t478960\n无限道武者路\t478961\nvi\t478962\n天下女人\t478963\n姑苏网\t478964\n车模\t478965\n注油\t478966\n几十秒\t478967\n八益\t478968\nyoui\t478969\ncollaborate\t478970\n亲爱的汉修先生\t478971\n防雹\t478972\nkmspico\t478973\niphoneSE\t478974\n家国天下\t478975\n肝细胞\t478976\nAssociate\t478977\n报废\t478978\n东巴\t478979\ncx-one\t478980\nSTATUS\t478981\nGW2激战2\t478982\n工程量\t478983\n楚辞\t478984\nKesionCMS\t478985\n彭定康\t478986\nFF10\t478987\n王鹤\t478988\n郑浩南\t478989\n洞庭湖区\t478990\nTel\t478991\n川威集团\t478992\n贞操\t478993\n浅草\t478994\n悬疑小说\t478995\nq6600\t478996\n电池板\t478997\n多_119手游网\t478998\n17汽车网\t478999\n小勐拉\t479000\n海天码头\t479001\n革命版\t479002\n甲斐\t479003\n操作票\t479004\n赵斌\t479005\n巅峰之战\t479006\n2.48\t479007\nzhongyang\t479008\n器器\t479009\n会声会影x4\t479010\n赛氪\t479011\n1080p在线\t479012\n白鹳\t479013\n西安交通大学软件学院\t479014\n24层\t479015\nrvv\t479016\n初游\t479017\n汉瓦\t479018\nm252dw\t479019\n前秦\t479020\n平平安安\t479021\n11800\t479022\nkahao\t479023\n风车山\t479024\npytorch\t479025\n田教授\t479026\n700M\t479027\n染云阁\t479028\n碳漆\t479029\nBeX5\t479030\n嵩山少林寺\t479031\n鸡汤\t479032\n盖布\t479033\n鱼头泡饼\t479034\n大明宫\t479035\n勾图\t479036\nmonogram\t479037\n高要市\t479038\n王立新\t479039\n停顿\t479040\n南涧\t479041\n万历首辅\t479042\n王者荣耀破甲\t479043\n补给品\t479044\n45关\t479045\n荆紫关\t479046\n美国普渡大学\t479047\nG50\t479048\nsynonymes\t479049\nHandel\t479050\n口儿\t479051\n望谟\t479052\n锁王\t479053\n哼唱\t479054\n心中有数\t479055\n税期\t479056\n咖喱酱\t479057\nMBI\t479058\n右乳\t479059\n颈椎牵引器\t479060\n12500元\t479061\n120毫升\t479062\nMicroBlaze\t479063\nKalman\t479064\n深圳实验学校高中部\t479065\n灿然\t479066\n斜塘\t479067\noo吧\t479068\n2003域\t479069\n主手\t
479070\n新华湖北\t479071\n变动率\t479072\n京媒\t479073\n支持\t479074\nSouls\t479075\n隐裂\t479076\nORC\t479077\nNu\t479078\n查引\t479079\n苏州轨道交通4号线\t479080\n3232\t479081\n摆渡人\t479082\n认爱\t479083\n预售\t479084\n卜卦\t479085\n无绳电话\t479086\ni7-8700K\t479087\n艾泽拉\t479088\n汉纪\t479089\n韵味\t479090\n中冶中央公园\t479091\n关联性\t479092\n习思\t479093\n扩容盘\t479094\n张春生\t479095\n抓小偷\t479096\nBoomshakalaka\t479097\n成长的烦恼\t479098\n5006\t479099\n张劲\t479100\n仙剑奇侠传\t479101\n绿帽社\t479102\n玩具熊\t479103\n2月24日\t479104\nSpin\t479105\nimmersive\t479106\n流星花园道明寺\t479107\n清玄杯\t479108\n剑三视频编辑器\t479109\n12345\t479110\n六神丸\t479111\n纯粹理性批判\t479112\n高密市\t479113\n碰口\t479114\n螺髻\t479115\n60句\t479116\nOTU\t479117\n陈润儿\t479118\n苦思冥想\t479119\n空气检测\t479120\n微快递\t479121\n波克比\t479122\n花卷\t479123\n2V\t479124\nSutton\t479125\npycrypto\t479126\n蒸饭\t479127\n林慧萍\t479128\n开天辟地\t479129\n临沂站\t479130\nThinkbox\t479131\ncatcher\t479132\n分手吧\t479133\n河南省信访局\t479134\n闸道\t479135\n粤式早茶\t479136\n金佛山\t479137\n跨\t479138\n重庆南岸\t479139\n开发\t479140\n建\t479141\ncorruption\t479142\nsustained\t479143\n李耀辉\t479144\n曼玲\t479145\n赤途\t479146\nshellcode\t479147\n小米红米Note\t479148\n游艺\t479149\n厨子\t479150\n生化危机8\t479151\n安其拉\t479152\n射中\t479153\n梦幻西游合区\t479154\n东莞农商行\t479155\n新水浒Q传\t479156\nxiaoxiao\t479157\nr11splus\t479158\nbiped\t479159\n喷机\t479160\n尖兵\t479161\nCodeCombat\t479162\n汉能汉瓦\t479163\n狗宝\t479164\ntot\t479165\n戚羽\t479166\n邻趣\t479167\n鸟粪石\t479168\n2575\t479169\n车案\t479170\n公域\t479171\n开元商城\t479172\n西式快餐\t479173\n富豪街\t479174\n10条\t479175\n菲雅利帝国\t479176\n年套\t479177\n哈尔滨理工\t479178\n言者\t479179\n喜\t479180\n百度彩票\t479181\n8760\t479182\n齐云社区\t479183\n渣\t479184\n炒股吧\t479185\n结字\t479186\nalumina\t479187\n科雷嘉论坛_汽车之家论坛\t479188\n弃尸\t479189\n24堂\t479190\ni78550u\t479191\n房屋编码查询\t479192\n哈密市\t479193\n淡斑\t479194\n10月12日\t479195\nsaner\t479196\n复安\t479197\nruai\t479198\n启辰R50\t479199\n北极地区\t479200\nSJ\t479201\n最低工资\t479202\n拜亚\t479203\n第一招\t479204\n高第街\t479205\n惠州移动\t479206\n黑子的篮球剧场版\t479207\n小凤仙\t479208\n诸侯\t479209\n自然堂\t479210\n烈日炎炎\t479211\ntheone\t479212\n两全保险\t4792
13\nunindent\t479214\nisql\t479215\n游走\t479216\n25平\t479217\n乳酪蛋糕\t479218\n为谁而作\t479219\n城镇养老保险\t479220\ng15\t479221\n340亿\t479222\n动态连接库\t479223\n右右\t479224\n香菊胶囊\t479225\n库存表\t479226\n第63\t479227\n彩塑\t479228\n校讯通博客\t479229\n布鲁诺\t479230\n网易VIP163尊贵邮\t479231\n800d\t479232\nWRAP\t479233\n折戟\t479234\n郑涛\t479235\nshowtime\t479236\n中兴网信\t479237\n康妮卡特\t479238\n肥大\t479239\nシャ\t479240\n2400G\t479241\n医生\t479242\nT检验\t479243\n七杀\t479244\n武王\t479245\n1415\t479246\n600道\t479247\n乡委\t479248\n枪骑\t479249\n江苏第二师范学院\t479250\n1类\t479251\n卢军\t479252\n草津\t479253\n向往\t479254\n山科\t479255\n2018年04月16日\t479256\n彭程程\t479257\n微澜\t479258\n朱利安\t479259\n阿布扎比\t479260\n南京十三中\t479261\n会说话\t479262\n一席之地\t479263\n侵彻\t479264\n福鼎肉片\t479265\n青岛房产网\t479266\ng29\t479267\n119.6.240.44\t479268\n雏妓\t479269\n丁青县\t479270\n和田青玉\t479271\n湖南女子学院\t479272\nedm\t479273\n吸入剂\t479274\n大震\t479275\n华彬集团\t479276\n物理所\t479277\n管理培训生\t479278\n朴素贝叶斯\t479279\n天游\t479280\n隨\t479281\n黑龙江工业学院\t479282\n打水漂\t479283\n台江区\t479284\n通风量\t479285\nloyal\t479286\nTERYX\t479287\nDIP\t479288\n监督\t479289\njadx\t479290\n抻面\t479291\ndynamically\t479292\n发法\t479293\n氪金\t479294\n李敏德\t479295\n电光调制器\t479296\n明点\t479297\n桶包\t479298\nmigo\t479299\npermeability\t479300\n伊顿公学\t479301\n朴宰范\t479302\n长名\t479303\n徐善云\t479304\n工业用水\t479305\n流马\t479306\n黄鑫\t479307\nHiFiMAN\t479308\n哔咔哔咔\t479309\n网球双飞物语\t479310\n内存流\t479311\n胀气\t479312\nGoogLeNet\t479313\n潍坊市教育局\t479314\n阿琴\t479315\nNVRAM\t479316\ngoodbay\t479317\n陈静门\t479318\n普洱学院\t479319\nGTX970M\t479320\n露天煤矿\t479321\n笑说\t479322\n广西隆林网\t479323\n皇后大道\t479324\n新任女教师\t479325\n骏宝\t479326\n城市功能区\t479327\nA1单元格\t479328\n春夏秋冬\t479329\n创想\t479330\nHuni\t479331\n酣\t479332\nPRADA\t479333\nteaches\t479334\n土拨鼠\t479335\n板蓝根颗粒\t479336\n汇流\t479337\ntops\t479338\n孙建波\t479339\n注册期\t479340\n断字\t479341\nindicate\t479342\n百度统计吧\t479343\n150%\t479344\n17639\t479345\n天璇\t479346\n瀛\t479347\n资产证明\t479348\n樱花广场\t479349\n亚组分析\t479350\n黄蕊\t479351\nkeyboard\t479352\n张柏\t479353\n山海廉韵网\t479354\n65000\t479355\n作文本\t479356\n馨月\t47
9357\n天朝田亩制度\t479358\n生日快乐\t479359\n天津市教委\t479360\n新会区政府\t479361\norlion\t479362\n中下\t479363\n捣\t479364\n玉林狗肉节\t479365\n朝代\t479366\n职高\t479367\n蔡成功\t479368\n关婷娜\t479369\n史达祖\t479370\n非全日制用工\t479371\n陈智\t479372\nandroid4.4\t479373\n虎太郎\t479374\n市中心医院\t479375\n管理会\t479376\nIndonesia\t479377\nJBC\t479378\n商丘新闻网\t479379\n防涝\t479380\nOCM\t479381\n中铁二十二局集团有限公司\t479382\n第三十五回\t479383\n5400\t479384\n居功\t479385\n300x300\t479386\n开内\t479387\n蛮力\t479388\n电影业\t479389\n吴通控股\t479390\n研究方\t479391\nLnmp\t479392\n古诗\t479393\n中国数字视听网\t479394\nEducational\t479395\n合信\t479396\n碳化硅\t479397\n水胆\t479398\nforcement\t479399\n励骏\t479400\nBeaglebone\t479401\n订报\t479402\n25张\t479403\n扬子江\t479404\n金克丝\t479405\n萍乡北\t479406\n律师函\t479407\n新乡市一中\t479408\n录像带\t479409\n锦园小区\t479410\nTO\t479411\nUnbound\t479412\n20170610\t479413\n腹股沟\t479414\n香港朗文小学\t479415\n拓新\t479416\n笑容可掬\t479417\n洁身自爱\t479418\n返回\t479419\n房地产去库存\t479420\n好享家\t479421\n花刺\t479422\nment\t479423\n背债\t479424\n绕线机\t479425\nOES\t479426\n题稿\t479427\n吉语\t479428\n打砂机\t479429\n魔兽世\t479430\n4.51\t479431\nHitler\t479432\nrushplayer\t479433\ndo-while\t479434\n10批次\t479435\n默认分隔符\t479436\n几窝\t479437\n兴平\t479438\n永信\t479439\n灵芝\t479440\n区号\t479441\n根深蒂固\t479442\n大灰狼\t479443\nGridControl\t479444\n步步惊心:丽\t479445\n南烟斋笔录\t479446\nrei\t479447\n干炮\t479448\n百度站\t479449\n登巴巴\t479450\n梦楠\t479451\n近六年\t479452\n佟国维\t479453\nFinFET\t479454\n隐藏\t479455\n古墓丽影源起之战\t479456\n荒岛\t479457\n器\t479458\n上海市高院\t479459\nanalog\t479460\n白小纯\t479461\n2.59\t479462\n减让\t479463\n非白\t479464\n炉底\t479465\n300000\t479466\nK2200\t479467\n两色\t479468\nPA++\t479469\n板蓝根冲剂\t479470\nXUtils\t479471\ncpad\t479472\n一回事儿\t479473\n睡不够\t479474\n潍坊人民医院\t479475\n客服\t479476\npupils\t479477\nbehance\t479478\n清水湾\t479479\n11.2.0.3\t479480\n二十章\t479481\ndao接口\t479482\n旗舰机\t479483\n甲状腺过氧化物\t479484\nAVT\t479485\nepp\t479486\n岑巩县\t479487\nyou\t479488\n辐射源\t479489\n广东省公安厅交通管理局\t479490\nbuyer\t479491\n乐小米\t479492\nx战警3\t479493\n马百良\t479494\n兴旺\t479495\n簸箕\t479496\n球类\t479497\n徵\t479498\n88_发\t479499\ns
jtu\t479500\n瓜子仁\t479501\npyx\t479502\n发骚\t479503\n组成部分\t479504\n娃娃\t479505\n压瓦机\t479506\n华严宗\t479507\n百褶\t479508\n心地善良\t479509\n王永强\t479510\n囖\t479511\n4台\t479512\n徐州新城\t479513\n陈汉文\t479514\n七星公园\t479515\n腰间盘突出症\t479516\nutopia\t479517\noo吧_\t479518\n清静\t479519\n乐友\t479520\n血染的风采\t479521\nxlxs\t479522\n朱立元\t479523\n研磨膏\t479524\n汪正正\t479525\n16位\t479526\n第51关\t479527\n1节\t479528\n福州新闻网\t479529\n2016.12\t479530\n手薄\t479531\n张宇峰\t479532\nh20\t479533\n炼油厂\t479534\nmanger\t479535\n身体素质\t479536\nOil\t479537\n泉州西街\t479538\nsquirting\t479539\n109升\t479540\n千里追风油\t479541\n肥虫\t479542\nAGON\t479543\nLibreoffice\t479544\n肌腱炎\t479545\n原生质体\t479546\nUna\t479547\nv3.01\t479548\n成型机\t479549\n山西省财政厅\t479550\n源源不断\t479551\n8188\t479552\nORNX\t479553\n歪果仁研究协会\t479554\n蚱蜢\t479555\n吴思\t479556\n几十元\t479557\nSiento\t479558\nPostman\t479559\n发展乡村\t479560\n台钓\t479561\n产品部\t479562\nシュ\t479563\n国际收支\t479564\n磅差\t479565\ncommon\t479566\ny^3\t479567\nNewRoman\t479568\n温州中医院\t479569\n期现\t479570\n管清友\t479571\n酷高\t479572\n长方体\t479573\n盐酸吗啉胍片\t479574\nRochester\t479575\n打赛\t479576\n_秀目网\t479577\n52周\t479578\n磷酸二氢钠\t479579\n颤动\t479580\nAutocad2007\t479581\n湖北省人社厅\t479582\nXILINX\t479583\n吴学占\t479584\n刘水\t479585\n他克莫司软膏\t479586\nn型\t479587\n酷德\t479588\n三幅\t479589\n吴伯萧\t479590\n迎面\t479591\n德州站\t479592\nLet\t479593\n债子\t479594\n支气管炎\t479595\n_世纪新能源网\t479596\n钙质\t479597\n天交所\t479598\n锡克教\t479599\n韩国伦\t479600\n誉为\t479601\n小土\t479602\n可瑞\t479603\nFree\t479604\n体悟\t479605\n背装\t479606\n人造皮革\t479607\nmorse\t479608\n静脉留置针\t479609\n卡波姆\t479610\nMIPI\t479611\nNozomi\t479612\nWEB骇客\t479613\n笔经\t479614\nFly\t479615\n主菜单\t479616\n陈尚君\t479617\n海霞\t479618\n百度教育\t479619\n新发镇\t479620\n水节\t479621\n旅检\t479622\n24袋\t479623\n步舞\t479624\n脱脱\t479625\n小遥\t479626\n安装工程检验批\t479627\niteration\t479628\n小熊电器\t479629\n贵州大学\t479630\n水飞蓟\t479631\n时尚化\t479632\n难喝\t479633\n小花朵\t479634\n寿功\t479635\n马哈鱼\t479636\n拉索\t479637\nomc\t479638\nAxial\t479639\nOSU\t479640\n75192\t479641\n别墅小区\t479642\n性感沙滩4\t479643\n明故宫\t479644\n色织\t479645\n252
5i\t479646\n顶杆机\t479647\n可以获得\t479648\n200D\t479649\n悬腕\t479650\n大退\t479651\n万仞山\t479652\nG35\t479653\n德运\t479654\n90068吧_\t479655\n印制板\t479656\n龙源电力集团股份有限公司\t479657\n过敏史\t479658\nVOCALOID中文曲\t479659\nJava数据库连接池\t479660\n爱克\t479661\nStreamer\t479662\n2017-12-22\t479663\nbiased\t479664\n蔡正华\t479665\n装箱\t479666\n151号\t479667\n在手\t479668\n开目\t479669\n339号\t479670\n减肥餐\t479671\n三国战争\t479672\n直喷\t479673\n3200万\t479674\n尚田镇\t479675\n蚁人2\t479676\n科讯医疗网\t479677\n激浪\t479678\n啾\t479679\n汉江师范学院\t479680\ncafe\t479681\n挂锁\t479682\n绳梯\t479683\n百合\t479684\n中院\t479685\n广安门\t479686\n福建师范大学文学院\t479687\n寻医问药网\t479688\nOBD接口\t479689\n黄崖关\t479690\n雄伟壮观\t479691\n牢笼\t479692\n副总编\t479693\n霍利\t479694\n畸形人\t479695\n套图\t479696\n朗播网\t479697\n抵达\t479698\n10071\t479699\n铜锅\t479700\n20150609\t479701\nST公司\t479702\n3096\t479703\n中央商务区\t479704\n璧山\t479705\n佘山国家森林公园\t479706\n显摆\t479707\n火炬之光合体版\t479708\n中铁四局集团有限公司\t479709\n沧海路\t479710\n达尔优牧马人\t479711\n68例\t479712\n岸边\t479713\n冬夏\t479714\n波及\t479715\n盐酸金霉素眼膏\t479716\n新舰\t479717\n朝野\t479718\n煤气化\t479719\n开心麻花\t479720\n求定\t479721\n王永军\t479722\nKick\t479723\n李宝善\t479724\nOJ\t479725\n水深火热\t479726\n四套\t479727\ngongcheng.huangye88.com\t479728\n梁莹\t479729\nlass\t479730\nRX470\t479731\n麦多商城\t479732\nOLED电视\t479733\n砀山县\t479734\n谢尔顿\t479735\n青年晚报\t479736\n锦泰广场\t479737\n展_\t479738\n和弦\t479739\n停放\t479740\n武科大\t479741\n替代者\t479742\n睡袍\t479743\n抽油烟机\t479744\n2121\t479745\n韩国馆\t479746\n足球员\t479747\n戊土\t479748\n进度条\t479749\n文登区\t479750\n圣杰\t479751\n屋面\t479752\n人民新闻通讯社\t479753\n武斌\t479754\nnegotiate\t479755\nDDOS\t479756\n晋档\t479757\nAkka\t479758\n剛\t479759\nico\t479760\n_主妇网\t479761\n华盛顿州立大学\t479762\n笔体\t479763\n禁忌之吻\t479764\n兴和县\t479765\n上野动物园\t479766\njsbridge\t479767\n上海上港\t479768\n浙菜\t479769\n自贡高新区\t479770\nBing\t479771\n9060\t479772\n鸡西市中级人民法院\t479773\n精中\t479774\n论域\t479775\n没关系\t479776\n诱导剂\t479777\n1.121\t479778\n金熊猫\t479779\n吋\t479780\n千兆口\t479781\n1c\t479782\n簕杜鹃\t479783\n致命一击\t479784\n三国志12吧\t479785\n帕丽斯\t479786\n未富\t479787\nBCN\t479788\n601669\t479789\
n博特\t479790\n大海贼岛\t479791\n灵星雨\t479792\n灵璧\t479793\n伊甸湖\t479794\n壹玖\t479795\n林慧\t479796\n2002版\t479797\n比起\t479798\n月子会所\t479799\n长安小区\t479800\n成都机场\t479801\nformail\t479802\n中国科学院上海天文台\t479803\n坂咲\t479804\ncaught\t479805\n重启动\t479806\n髋部\t479807\n寒门枭士\t479808\n爱国者\t479809\n歌舞团\t479810\n箱形\t479811\n未来人\t479812\n智能售货机\t479813\n口罩\t479814\n乱战\t479815\n教仪\t479816\n汇智网\t479817\n上海东方绿舟\t479818\n73个\t479819\n新航站楼\t479820\n九泉\t479821\n米沃什\t479822\n酮酸片\t479823\nLuna\t479824\nztree\t479825\n绘制\t479826\n干货篇\t479827\n帧_\t479828\nairpod\t479829\n萨科\t479830\n杨再兴\t479831\n易鹏飞\t479832\n金和\t479833\n被虐待\t479834\n2018年1月13日\t479835\n知命\t479836\n集优\t479837\no3\t479838\nSnmp\t479839\n瓦盆\t479840\nEZ\t479841\n克力\t479842\nJSDoc\t479843\n|\t479844\n博郡\t479845\n天宇股份\t479846\n旋转补偿器\t479847\nSynology_NAS\t479848\n北外附校\t479849\n3DsMAX\t479850\n北京北苑\t479851\n星途\t479852\n成為\t479853\n法印\t479854\n天熊山\t479855\nlgv20\t479856\ntransactions\t479857\nBDRip\t479858\n冷拔无缝钢管\t479859\n地理位\t479860\n大限\t479861\nyj\t479862\n函授大学\t479863\ncmmi5\t479864\nCreke\t479865\n奇速贷\t479866\n张昌尔\t479867\nlawn\t479868\n域虎\t479869\n诺伦\t479870\nSuisse\t479871\nBacon\t479872\n5000万美元\t479873\n语音验证码\t479874\n月兔\t479875\nredis篇\t479876\nOpenAI\t479877\n5200l\t479878\n暂无\t479879\n建达\t479880\nCNAS\t479881\n西岭\t479882\n犟牛\t479883\nBlacked\t479884\n乐刷\t479885\n密闭阀\t479886\n梦特\t479887\n富士施乐Fuji\t479888\n杉浦\t479889\ndyes\t479890\n我们\t479891\nTokyo-Hot\t479892\n安徽省政府\t479893\n铆焊\t479894\n社会保障号\t479895\n杏花岭区\t479896\n瑞达尼亚\t479897\n控制型\t479898\n怀特塞德\t479899\n延后\t479900\nmysql5.7\t479901\npgsql\t479902\n第一分\t479903\n1.21.1\t479904\n火光\t479905\nDHT\t479906\n洞洞板\t479907\n专窗\t479908\nshian\t479909\n瘋耔\t479910\n20款\t479911\n打不过\t479912\n竹梯\t479913\n1则\t479914\n猛火炉\t479915\n食戟之灵吧_\t479916\n芙蓉鸟\t479917\n肌美精\t479918\n第十五\t479919\nQEMU\t479920\n立升\t479921\n移动办公平台\t479922\n云南文山\t479923\n吉星鹏\t479924\nCitespace\t479925\n肛交片\t479926\n合作者\t479927\n球衣\t479928\n得力集团\t479929\nCounter-Strike\t479930\n潞河\t479931\n圆润\t479932\n你追\t479933\n划型\t479934\n彩旗\t4
79935\n区表\t479936\n恐怖游戏\t479937\n预征率\t479938\nCoating\t479939\n钨丝灯\t479940\n蓝莲\t479941\nareal\t479942\n114项\t479943\n诗剧\t479944\n摸奶门\t479945\n天津消防\t479946\n内置\t479947\n泄密案\t479948\n李传波\t479949\n甲乙丙\t479950\nLTC\t479951\n0.6.0.49\t479952\nkwok\t479953\n王仙芝\t479954\nrotor\t479955\nzhihui\t479956\n74hc573\t479957\n兼性\t479958\nshout\t479959\n致力\t479960\n第6张\t479961\n四端\t479962\n华润医药集团有限公司\t479963\nrs6\t479964\n怪论\t479965\n栏目组\t479966\n深莞惠\t479967\n书皮\t479968\n安卓8.1\t479969\n王琪\t479970\n1句\t479971\nTray\t479972\n叶草\t479973\n梧桐树\t479974\n麦长青\t479975\n波比\t479976\n四川大学文学与新闻学院\t479977\n氧化锌避雷器\t479978\nYMM\t479979\nUSBCAN\t479980\n解限\t479981\n小石\t479982\n为什么\t479983\n十五名\t479984\nCMCT\t479985\n张娴\t479986\nepf\t479987\n野生植物\t479988\n乙队\t479989\nASSA\t479990\n丰产路\t479991\n朱烨\t479992\n高燕\t479993\n内心\t479994\n欺骗\t479995\n宏大\t479996\nDCF\t479997\n榆垡\t479998\n实名制\t479999\n新疆维吾尔自治区\t480000\nqqkey\t480001\n杭州绿城育华学校\t480002\n基岩\t480003\n秦漠飞\t480004\n龛\t480005\n包埋\t480006\n光盘刻录机\t480007\nindicator\t480008\n唐宓\t480009\n九王\t480010\n黄门\t480011\n奉贤网\t480012\n张元济\t480013\nmocking\t480014\n无双8\t480015\n李玉林\t480016\n阳光嘉园\t480017\npapitube\t480018\n6TB\t480019\n猫眼看人\t480020\n霸王岭\t480021\n实号\t480022\nViewpager\t480023\n棱彩\t480024\n股权证\t480025\n慈安\t480026\n留兰香\t480027\nowncloud\t480028\n慕斯\t480029\n皮\t480030\n三朝\t480031\n通川\t480032\n千王\t480033\n安海镇\t480034\n馋\t480035\n韭\t480036\nMAN\t480037\n哑音\t480038\nltk\t480039\n上海红房子医院\t480040\n83958379\t480041\n黄眼睛\t480042\ncn域名\t480043\n厚板\t480044\nv39\t480045\nSDI\t480046\n残月\t480047\n上家公司\t480048\n排球场\t480049\n花湖开发区\t480050\n广州西塔\t480051\n歹徒\t480052\nconfirmation\t480053\n深圳国土局\t480054\n朴素\t480055\n256gb\t480056\n相互影响\t480057\n开模\t480058\neps素材\t480059\n平安谷\t480060\n白茬\t480061\n巨蟹男\t480062\n竹篮\t480063\nMBLAQ\t480064\n黑呆\t480065\ngongyu\t480066\n绝味鸭脖\t480067\nPICK\t480068\n西游汽车网\t480069\nwasting\t480070\nSHOUYOU.COM手游网\t480071\n普币\t480072\n魏延\t480073\n环氧自流平\t480074\n麓山大道\t480075\n赤脚\t480076\n驱动篇\t480077\nrobo\t480078\n360vizza\t480079\n索评\t480080\n物志\t48
0081\n承诺\t480082\n拜师\t480083\n陕西人民出版社\t480084\n品牌号\t480085\n19196手游网\t480086\n维诺亚\t480087\n手持式\t480088\n吉普莉尔\t480089\n霍天晴\t480090\n雪飞霜\t480091\n澳大利亚签证中心\t480092\n荆州市人力资源和社会保障局\t480093\n3位\t480094\ndominated\t480095\n2017.2.2\t480096\n遵义新观察网\t480097\n宁小闲\t480098\n李兆廷\t480099\n打底衣\t480100\ncgroups\t480101\n便车\t480102\nboa\t480103\n162\t480104\nvst全聚合\t480105\n腻子粉搅拌机\t480106\nHoward\t480107\n海宁西\t480108\n长沙市食品药品监督管理局\t480109\n杭黄铁路\t480110\n萨克森\t480111\n女员工\t480112\nfelicity\t480113\n废水管\t480114\n布轮\t480115\ninteractive\t480116\n蝴蝶兰\t480117\n太极柔力球\t480118\n第八节\t480119\n十字绣\t480120\n主导者\t480121\n张绍忠\t480122\n悬浮物\t480123\n俚\t480124\n杭绍台高铁\t480125\n.&#160\t480126\n渡边麻友\t480127\n50关\t480128\n凝视\t480129\n親\t480130\nsoap\t480131\n足阳明胃经\t480132\n中国疾控中心\t480133\n济钢\t480134\ngrant-tables\t480135\npollo\t480136\nAR\t480137\nStrategy\t480138\nJasperReport\t480139\nAbsolutely\t480140\n后厂村\t480141\n中天科技\t480142\n溪岸\t480143\n本地宝北京公交网\t480144\n王小宝\t480145\n中法战争\t480146\n同龄人\t480147\n特长\t480148\n美丽新世界\t480149\n国发\t480150\n盗跖\t480151\n凉心\t480152\n车衣\t480153\nPARADISE\t480154\n筛\t480155\n86型\t480156\n武汉市中级人民法院\t480157\n猎鹿\t480158\n刻字\t480159\n跃层式\t480160\n页游\t480161\n我们俩\t480162\n电视家\t480163\nturnkey\t480164\n公孙\t480165\n闵鹿蕾\t480166\n清明梦\t480167\n唱读\t480168\n情形\t480169\n万夫莫敌\t480170\n借代\t480171\n旭辉御府\t480172\n石化\t480173\nfromdata\t480174\n遭冷遇\t480175\nwithdraw\t480176\n布吉街道\t480177\n旧县镇\t480178\n密盾\t480179\n一帧帧\t480180\n脚盆\t480181\n长株潭\t480182\n多场\t480183\nmi\t480184\n傻帽\t480185\n最低工资标准2018\t480186\n小洪\t480187\nBound\t480188\n60KG\t480189\nsocke\t480190\n白沙湖\t480191\n推球\t480192\nprivacy\t480193\n昂扬\t480194\n群晖QuickConnect\t480195\n金投网址导航\t480196\nzhaocundang\t480197\n李佛摩尔\t480198\n書\t480199\n孤高\t480200\nue900s\t480201\n7000元\t480202\n风云2论坛_汽车之家论坛\t480203\n欣喜\t480204\n宣恩\t480205\n张伟华\t480206\n凤泉区\t480207\n不住\t480208\nmobileprovision\t480209\n护牙\t480210\n脱狱者\t480211\n语素\t480212\n请勿\t480213\n濒危野生动物\t480214\n75hz\t480215\n一叠\t480216\n巷口\t480217\n东校区\t480218\nwiki版\t480219\n防晒棒\t480220\nf450\
t480221\n横山桥镇\t480222\n_林\t480223\n十字路口\t480224\n圈号\t480225\n湖南省房地产业协会\t480226\n我的儿\t480227\n20150625\t480228\n门花\t480229\n不咋地\t480230\n益智玩具\t480231\niess\t480232\n民族工业\t480233\n榨油坊\t480234\nadministrator\t480235\n珠江新城站\t480236\n谢腾飞\t480237\n糙米饭\t480238\nA200\t480239\n限流电抗器\t480240\n4mm\t480241\n首问负责制\t480242\nMichigan\t480243\n硕美科g941\t480244\n伊斯坦布尔\t480245\n面包树\t480246\n上钱\t480247\n至此\t480248\n北京市建筑设计研究院\t480249\n武汉碧桂园\t480250\n朱婷婷\t480251\n收寄\t480252\n私聊\t480253\n中华姓氏网\t480254\n二级建造师转注\t480255\n应收票据\t480256\n强酸\t480257\n南法信\t480258\n五速\t480259\ndumper\t480260\nDDS\t480261\n奥翔\t480262\n青蛇\t480263\n擅长\t480264\n危及生命\t480265\n滨州高新区\t480266\n佐卡伊珠宝网\t480267\n易车号\t480268\nINCLUDE\t480269\n当夜\t480270\nplsq\t480271\n美讯网\t480272\n桑葚树\t480273\n冲击器\t480274\n赛马\t480275\n老伯伯\t480276\n周小\t480277\n玻璃店\t480278\n填塞\t480279\n东洲\t480280\n大圣\t480281\nCocosStudio\t480282\n梵悦\t480283\n国务院新闻办公室\t480284\n新光\t480285\n鱼简笔画\t480286\nA\t480287\n太焦高铁\t480288\n9.1分\t480289\n战狂\t480290\n修短\t480291\n李彧\t480292\n浙江大学竺可桢学院\t480293\n唾沫\t480294\n888.com\t480295\n高压氧\t480296\niWork\t480297\n楽\t480298\n实控\t480299\ndidi\t480300\nKN\t480301\n3737电影网\t480302\n7fresh\t480303\n预演\t480304\n杂色\t480305\nJUST\t480306\n群狼\t480307\n浦公英\t480308\n400kw\t480309\n长隆欢乐世界\t480310\n口角炎\t480311\n天园\t480312\nHonest\t480313\n瑞达法\t480314\n软件公司\t480315\nTravelDaily\t480316\nquiz\t480317\nxiaokang\t480318\n聚酰胺\t480319\n400条\t480320\n188元\t480321\nBiore\t480322\n蓬安县\t480323\n学具\t480324\n雅宝路\t480325\n冷面\t480326\n普雷斯顿\t480327\n世界图书日\t480328\n1.4升\t480329\n胡桃夹子\t480330\n中央公园\t480331\nEC180\t480332\ncupid\t480333\nTGV\t480334\n蓝光长岛国际社区\t480335\nazw3\t480336\n第二多\t480337\n九华山\t480338\nFastlane\t480339\nOrder\t480340\n魔法少女小圆\t480341\nhandsomeBoys\t480342\n三进制\t480343\n摩杜纳\t480344\n鼯鼠\t480345\nNitro\t480346\n起源版\t480347\n拍手\t480348\nrouting\t480349\n主承\t480350\n青岛智慧人社\t480351\n泊松比\t480352\n开罗水上乐园\t480353\n建构主义学习\t480354\nrecurring\t480355\n精进\t480356\n57名\t480357\n龙珠传奇\t480358\n丁二狗\t480359\npscad\t480360\nDowntown\t480361\n银河系漫游指南\t
480362\n锁边机\t480363\n并网\t480364\n涵管\t480365\nfilia\t480366\n恩山\t480367\n医嘱\t480368\n恒宇\t480369\n介质簇\t480370\n囚鸟\t480371\n自密\t480372\n垫钱\t480373\n第5篇\t480374\n安全生产责任险\t480375\n艾子霏\t480376\n就业失业登记证\t480377\nNJ\t480378\npolite\t480379\n制造性\t480380\n夜归人\t480381\n体育场\t480382\n庆余抗日之特战兵王\t480383\n声息\t480384\n天涯书库\t480385\n登记日\t480386\n消防控制室\t480387\n巴掌\t480388\n上投摩根基金管理有限公司\t480389\n拜糖平\t480390\n两所\t480391\n军夫\t480392\n大自然的启示\t480393\n马震\t480394\n9p\t480395\ncm3\t480396\n有机氮\t480397\n中国电动车网\t480398\n戴村\t480399\n綦\t480400\n肖婶儿\t480401\n晶莹剔透\t480402\n林生斌\t480403\n赌场案\t480404\n户外公园\t480405\n头雁\t480406\nGarde\t480407\n极品网\t480408\nBK\t480409\nRULES\t480410\n0710\t480411\n段位\t480412\n第三位\t480413\nMedina\t480414\nGetData\t480415\n庆功宴\t480416\n光热发电\t480417\n英语学院\t480418\n杀姬\t480419\n中华英才网\t480420\n度秘\t480421\n串场\t480422\n赤峰学院\t480423\n宁泽涛\t480424\nb75m-d3v\t480425\n质押式\t480426\n炭素\t480427\nSmartest\t480428\n雪怪\t480429\n纳入\t480430\n消防大队\t480431\n横变\t480432\n汇丰晋信\t480433\n左衽\t480434\ntooltip\t480435\n分拣机器人\t480436\n2段\t480437\n衣袋\t480438\n综训\t480439\n楔子\t480440\n52码\t480441\nipython\t480442\n摔\t480443\n纳恩博\t480444\n山东钢铁\t480445\n剧中人\t480446\n武汉理工大学自动化学院\t480447\n焦黄\t480448\n累加器\t480449\n五马镇\t480450\n澄江街道\t480451\n_方\t480452\n平行车\t480453\n极耳\t480454\nPussies\t480455\n长包\t480456\nzifu\t480457\n重庆时时彩龙虎\t480458\n老福山\t480459\n偷\t480460\n堪忧\t480461\n垃圾焚烧厂\t480462\n祼女\t480463\n搜同\t480464\np-1\t480465\n二泉吟\t480466\nIATF16949-2016\t480467\n3000杯\t480468\n八泉峡\t480469\n吕克\t480470\nzhijie\t480471\n收藏栏\t480472\n二等\t480473\n中公教\t480474\n昆明市财政局\t480475\n水淼\t480476\n哈尔滨万达乐园\t480477\n承德市\t480478\n准备\t480479\n更方\t480480\n风靡一时\t480481\nutext\t480482\n3pro\t480483\n酶联\t480484\n选择性\t480485\n蛋白液\t480486\n撞球\t480487\n华夫格\t480488\n极限竞速地平线3\t480489\n瞧见\t480490\nmenghuan\t480491\n嫩\t480492\n咖啡碱\t480493\nproperly\t480494\n1.5倍\t480495\n洗气\t480496\nMugeda\t480497\n批灰\t480498\n京卡\t480499\nadministrative\t480500\nsaf\t480501\n许波\t480502\n养殖证\t480503\n操作说明\t480504\nStatements\t480505\n非线性\t480506\n计算机房\t480507\n
武当太极剑\t480508\n路印\t480509\n拉普拉斯逆变换\t480510\n负载均衡器\t480511\n李常\t480512\n高宏伟\t480513\n1300个\t480514\n审讯\t480515\n琦叔\t480516\n老盘\t480517\n创业维艰\t480518\n海事展\t480519\n福位\t480520\nC++string\t480521\ngodoc\t480522\n虚开发票罪\t480523\n达姆\t480524\n哈比\t480525\n萧薰儿\t480526\n写诗\t480527\n波城\t480528\n受赏\t480529\n新酒店\t480530\n那一边\t480531\n钢羽\t480532\nwalla\t480533\n子午大道\t480534\n7025\t480535\n曹娥江\t480536\n优酷视频\t480537\n37期\t480538\n宇佐美奈奈\t480539\n生杀\t480540\n急性肠胃炎\t480541\n冻顶乌龙茶\t480542\n城关区\t480543\n百舸争流\t480544\n2018年3月2日\t480545\nnvflash\t480546\n百福\t480547\n3箱\t480548\n发展路\t480549\n制版机\t480550\npnr\t480551\n6下\t480552\n五千元\t480553\n壤\t480554\n王粲\t480555\n不入流\t480556\n弶港\t480557\n双碑\t480558\n青石证券\t480559\n四码\t480560\n仁安\t480561\n阿卡迪亚\t480562\nea211\t480563\n地下式\t480564\n鸟猎\t480565\n第35章\t480566\n松掉\t480567\nASE\t480568\n凤凰书城徐州市新华书店\t480569\n吕健\t480570\n一千多万\t480571\n出\t480572\n碳带\t480573\nT450s\t480574\n南红吧\t480575\n闺蜜2\t480576\nPocket\t480577\n超时空同居\t480578\npvc膜\t480579\n第2轮\t480580\nvisual-studio\t480581\n啦啦舞\t480582\n广州市番禺区\t480583\nOPENID\t480584\n戴晓莲\t480585\n少精\t480586\n诊察室\t480587\n吹潮\t480588\nconstructor\t480589\nfest\t480590\n市行政服务中心\t480591\ndatagram\t480592\nJHipster\t480593\n乐和乐都\t480594\n广州铁路职业技术学院\t480595\n卡员\t480596\n审判长\t480597\n冯强\t480598\n绝期\t480599\n待产包\t480600\n冶金学\t480601\n船山区\t480602\n投手\t480603\n声声\t480604\nvs2017吧\t480605\n椅背\t480606\n咦\t480607\nMOD_斗蟹游戏网\t480608\n品德\t480609\n密照\t480610\n承付\t480611\n全景图\t480612\n陈新颖\t480613\n坐享其成\t480614\n米德尔斯堡\t480615\n兄弟会\t480616\n2120\t480617\n高级证\t480618\n战狼2\t480619\n自买\t480620\nTherapeutic\t480621\n天诚\t480622\nASDAN\t480623\n委办\t480624\n凯里南\t480625\n南京大屠杀纪念馆\t480626\n2.9级\t480627\n打转\t480628\n注码\t480629\n中科炼化\t480630\n于斯\t480631\nUber\t480632\n省食品药品监督管理局\t480633\nT14\t480634\n抗阻\t480635\n贺月秋\t480636\nNapoleon\t480637\n九张\t480638\n八所镇\t480639\n美亚保险\t480640\nQQ分类知识网\t480641\n度日\t480642\n树突状细胞\t480643\n北京第二外国语大学\t480644\n普乐美\t480645\n鲁企\t480646\n排骨汤\t480647\n哈尔滨市平房区\t480648\n三头肌\t480649\n纳西\t480650\n顶升\t480651\n影拓\t48065
2\n国\t480653\n洛杉矶国际机场\t480654\n港珠澳大桥海底隧道\t480655\n叶圣陶\t480656\n履新\t480657\n1162\t480658\nGRAPH\t480659\n枪式\t480660\n腾讯大豫网\t480661\nincopat\t480662\n9个多月\t480663\n土肥圆\t480664\n外商独资\t480665\n热血街\t480666\n道道通\t480667\n广东南华工商职业学院\t480668\n请求包\t480669\n鑫案\t480670\n有信\t480671\n北海大道\t480672\nusbc\t480673\n陈玉琴\t480674\n果糖机\t480675\n叩诊\t480676\n扬声器\t480677\n成串\t480678\n系统优化\t480679\n剪线机\t480680\n纽崔莱\t480681\nmagix\t480682\n帝王业\t480683\n有底\t480684\n全自动洗碗机\t480685\n女神范\t480686\n娘舅\t480687\n浮华\t480688\ncgminer\t480689\n加盟店\t480690\n90套\t480691\n孙建\t480692\n袋鼠\t480693\nbrg\t480694\n地震台\t480695\n许昌市政府\t480696\n相联\t480697\n联想锋行\t480698\n雾面屏\t480699\n45kg\t480700\n主要人物\t480701\n李俊慧\t480702\n贸然\t480703\nThinker\t480704\nFIRE\t480705\n23处\t480706\n鸭爪爪\t480707\n墨西哥城\t480708\n王道论\t480709\n福晋\t480710\n冷血动物\t480711\n阵营战\t480712\n冬风\t480713\n风干肠\t480714\n季子\t480715\n燃气式\t480716\n想当\t480717\n茶叶茶\t480718\n哈灵\t480719\n未来几天\t480720\n苦功\t480721\n座头\t480722\nAffinity\t480723\n李炳渊\t480724\n吉安娜\t480725\n太阳网\t480726\n使命召唤2\t480727\nkale\t480728\n猎虫\t480729\n5320\t480730\n坟\t480731\n防臭地漏\t480732\nAVC\t480733\n稀有气体\t480734\n展示机\t480735\nsparksql\t480736\n横放\t480737\neva\t480738\n欧陆风云4\t480739\n王文银\t480740\n22卷\t480741\n主力军\t480742\n油酥烧饼\t480743\n瘦肉\t480744\n沙都\t480745\nDLC\t480746\n航海路\t480747\n6F\t480748\n0c\t480749\n傣妹\t480750\n万涛\t480751\n牙髓\t480752\nslowly\t480753\n球服\t480754\n起盘\t480755\nviral\t480756\n日干\t480757\n130平米\t480758\njurisdiction\t480759\nVathe\t480760\n三会两制一课\t480761\nnig\t480762\n维吉尼亚\t480763\n医统\t480764\n塔县\t480765\n小夕\t480766\n细算\t480767\n极致\t480768\n思岚科技\t480769\n冰硼散\t480770\n评\t480771\n侠盗猎车手圣安地列斯\t480772\nphilosophical\t480773\n冯坤\t480774\n娃娃兵\t480775\nsamsung\t480776\n金犬\t480777\n慕丝\t480778\n现代大学英语精读\t480779\nannounces\t480780\n流读\t480781\n兔\t480782\nHulk\t480783\nmagnum\t480784\n连锁加盟代理店\t480785\n恶值\t480786\n42米\t480787\n赤司\t480788\npooled\t480789\n泽法\t480790\n民兵\t480791\n新昌路\t480792\n生命科学学院\t480793\n垫步\t480794\n9月\t480795\n肉痛\t480796\ndeutschland\t480797\n华东师范大学研究生院\t480798\
n屌德斯\t480799\n苏滁\t480800\n人情世故\t480801\nVerifying\t480802\n万恒\t480803\n正交投影\t480804\n高攻\t480805\nPages\t480806\n移调\t480807\nCortex\t480808\nE展网\t480809\n上卿\t480810\n百度电脑\t480811\n坐具\t480812\n手绘\t480813\nZhengzhou\t480814\n亚马逊鹦鹉\t480815\n梦幻币\t480816\n纳米盒\t480817\n上海昱音机械有限公司\t480818\n服务窗\t480819\n粗茶淡饭\t480820\n1笔\t480821\nKolla\t480822\n天极下载\t480823\n南方医院\t480824\n临柜\t480825\nX98\t480826\nWerner\t480827\nmohrss\t480828\npiaohua\t480829\n影音先锋AV网_AV电影天堂\t480830\n20171206\t480831\n文化遗产\t480832\n陈军\t480833\nFxFactory\t480834\n大卖场\t480835\n。\t480836\n家园网\t480837\nTaylor\t480838\nUMN\t480839\nsworth\t480840\n东汉时期\t480841\nbridging\t480842\n16进制编辑器\t480843\nmicrobiota\t480844\nSephora\t480845\nA1688\t480846\n诸城一中\t480847\n小帮手\t480848\n生产管\t480849\n易懂\t480850\n轮转\t480851\n官制\t480852\n紫竹园\t480853\n榆次大学城\t480854\n世界语\t480855\n英国华威大学\t480856\n柜姐\t480857\n180404\t480858\n持票\t480859\n黄龙风景区\t480860\n巅峰之夜\t480861\n三相电能表\t480862\n龙飘飘\t480863\n数三\t480864\nbabel-loader\t480865\n李桥\t480866\n方兰生\t480867\n金管\t480868\n92a\t480869\n异域镇魂曲\t480870\n奇装\t480871\npytest\t480872\n外省市\t480873\n张萌萌\t480874\n恩波\t480875\n递归神经网络\t480876\n婚纱照\t480877\n美福\t480878\n奇蹟\t480879\n双牌县人民政府\t480880\n早射\t480881\n首域\t480882\nuyhur\t480883\n枫之谷\t480884\n天殇\t480885\n鲍姆\t480886\n豆荚\t480887\noscam\t480888\n凉拌粉丝\t480889\n鹰击长空2\t480890\n天地盖\t480891\n血腥玛丽\t480892\n护校\t480893\npiecewise\t480894\n独好\t480895\n船身\t480896\n解决玩\t480897\n猹\t480898\n底片\t480899\n美即\t480900\n三都水族自治县人民政府\t480901\nNervous\t480902\n人论\t480903\n疏漏\t480904\n学而时习\t480905\n枇杷核\t480906\n整齐\t480907\n美酒\t480908\n青草膏\t480909\n防沙治沙\t480910\nKext\t480911\n高等教育自学考试\t480912\nJomashop\t480913\n亚信\t480914\n进义词\t480915\n考题\t480916\n波斯王子3\t480917\n第16个\t480918\nIg\t480919\nMisty/迷雾\t480920\n100n\t480921\n北水\t480922\n翻看\t480923\n新风尚\t480924\n花怜\t480925\n受害方\t480926\n八六\t480927\n百度影音播放器\t480928\n坐果\t480929\n公示语\t480930\n保罗乔\t480931\n无限元宝\t480932\n低趴\t480933\n中路股份\t480934\n不限贷\t480935\n二马路\t480936\n新欧诺\t480937\n3个多小时\t480938\n嘲笑\t480939\n08001\t480940\nbenchmarks\t48
0941\n借物喻人\t480942\n15.4英寸\t480943\n富腾\t480944\n工口邪恶少女漫画_无翼鸟邪恶漫画全集\t480945\n移动荷载\t480946\n陈舜臣\t480947\n人民银行\t480948\n绝代双骄3\t480949\n烟缸\t480950\n斐迅\t480951\n马刀\t480952\ncerave\t480953\n87平米\t480954\n5000万\t480955\n战\t480956\nlibre\t480957\n尾房\t480958\n好受\t480959\n小天才\t480960\n兵王传奇\t480961\nGoDaddy\t480962\n穿进\t480963\n个人战\t480964\n风镐\t480965\n触头\t480966\n平安汽车保险_平安\t480967\n徐骁\t480968\n付静\t480969\n_心\t480970\n南京大报恩寺\t480971\n金莎\t480972\nAppDelegate\t480973\n三滴水\t480974\n徐若瑄\t480975\n鸡儿\t480976\n医疗补助费\t480977\nyese\t480978\n绝地求生_绝地求生\t480979\n新闻出版总署\t480980\n白菜花\t480981\n零五\t480982\n单版\t480983\n公共交通\t480984\n相泽南\t480985\n炫斗之王\t480986\n帝国cms\t480987\n茶吧机\t480988\n标示牌\t480989\n本月底\t480990\n第161集\t480991\n箍筋根数\t480992\n银棒\t480993\nPayday2\t480994\n弗洛格\t480995\n非编码\t480996\n整合型\t480997\n张定涵\t480998\n资产评估学\t480999\n碘化亚铜\t481000\n养娃\t481001\n诚求\t481002\n哈尔滨市第一医院\t481003\n颂赞\t481004\nviewcell\t481005\n江达县\t481006\n中国工商银行深圳市分行\t481007\n光大控股\t481008\n避过\t481009\n0971\t481010\n哪吒闹海\t481011\nplacebo\t481012\n冷云\t481013\n整车\t481014\n华晨汽车集团\t481015\n芦荟汁\t481016\n∧\t481017\n罪感\t481018\n华罗庚\t481019\n2.90\t481020\n雅昌艺术家网\t481021\n天津商业大学宝德学院\t481022\n收废\t481023\n1567\t481024\n葬身\t481025\n002280\t481026\n盾战\t481027\nmp4磁力链接\t481028\n甲钴胺胶囊\t481029\n神偷\t481030\n木果\t481031\n0.5厘米\t481032\n肯福莱特\t481033\n村行\t481034\n交租\t481035\n脑血管\t481036\n叮诺芬\t481037\n扶苏\t481038\nTYPE-MOON\t481039\nOauth2\t481040\n中共山西省委\t481041\n东方斯卡拉\t481042\n雨雪\t481043\nGOST\t481044\n法律保护\t481045\n赵经理\t481046\n斯巴达克斯\t481047\n手机知网\t481048\n迪诺\t481049\nqk\t481050\n99位\t481051\nvidi\t481052\n地黄\t481053\nf-1\t481054\n沁润\t481055\n哀嚎\t481056\npier\t481057\n黄盒\t481058\nfirfox\t481059\nPRINCE\t481060\n塔德\t481061\n车载灭火器\t481062\n地理\t481063\n丰顺\t481064\n在远方\t481065\n_社科网\t481066\n九朵云\t481067\n加载项\t481068\nDTM\t481069\nシュガ\t481070\n做得到\t481071\n顾问式\t481072\n878\t481073\n瑰\t481074\n成文\t481075\n斯蒂芬·茨威格\t481076\n一梭\t481077\n黄痂\t481078\n发藏网\t481079\n好儿郎\t481080\n爱奇艺泡泡\t481081\n允诺\t481082\nbooting\t481083\nfaunjoe88\t481084\n现形记\t48108
5\n号称\t481086\n代差\t481087\n景观桥\t481088\n君临天下\t481089\n何日君\t481090\n有福网\t481091\n口译员\t481092\n邵\t481093\n平均日\t481094\n分为\t481095\n一13岁\t481096\nTuner\t481097\n双轴\t481098\n安井\t481099\n血小板\t481100\n非整数\t481101\n密钥序列号\t481102\n审方\t481103\n人皆有\t481104\n铱金火花塞\t481105\n关系式\t481106\n鱼化龙\t481107\nVishay\t481108\nHandlebars\t481109\n全省性\t481110\n江苏润和软件股份有限公司\t481111\n无路\t481112\n工业区\t481113\n柯以柔\t481114\n山西广播电视台\t481115\n涌泉\t481116\n剁\t481117\n育路国际学校网\t481118\n西安高新一中\t481119\n第五步\t481120\nSNMP4J\t481121\n二维码生成器\t481122\n张悬\t481123\n桂离宫\t481124\n初三生\t481125\n外海外\t481126\n云海肴\t481127\n日和\t481128\n西市区\t481129\nhzlzc08\t481130\n抹茶粉\t481131\n洋山港四期\t481132\n脱氢乙酸钠\t481133\n电缆线\t481134\n宋版\t481135\n美图录\t481136\ndl388\t481137\n体素\t481138\n鲛人\t481139\n怪物猎人世界吧\t481140\n串级\t481141\n手工板\t481142\n司法体制改革\t481143\n施甸\t481144\n自考审计学\t481145\nRMIT\t481146\n深圳宝安网\t481147\nSpirit\t481148\n故事版\t481149\n720元\t481150\n科韵路\t481151\n能工巧匠\t481152\n刘义\t481153\n高任\t481154\n林雨欣\t481155\n压弯\t481156\n有招\t481157\n武汉外语外事职业学院\t481158\nInst\t481159\n金英权\t481160\n主情\t481161\n炸毁\t481162\n着地\t481163\n测款\t481164\n系统文\t481165\n博舍\t481166\n咘咘\t481167\n上海证券交易所\t481168\n钜献\t481169\n石家庄市人民政府办公厅\t481170\n网综\t481171\n晋煤\t481172\n马林诺夫斯基\t481173\n突发\t481174\n古惑狼\t481175\n巴西花梨木\t481176\n白纱裙\t481177\n光氧净化器\t481178\n无法拥抱的你\t481179\n录取率\t481180\n4月下旬\t481181\nsucking\t481182\n分布式光伏电站\t481183\n恢复期\t481184\n惊弓之鸟\t481185\nwow60\t481186\nifx\t481187\n凯梅尔兹\t481188\n逆时光\t481189\n超跌\t481190\n衍射角\t481191\n冷冻油\t481192\n威尔孙亚芳\t481193\n报建费\t481194\n琅琊区人民政府\t481195\n无所不知\t481196\nalsamixer\t481197\nd3dcompiler\t481198\nOTIS\t481199\n篡权\t481200\n问切\t481201\n普门品\t481202\n华中科技大学》\t481203\n京翰\t481204\n爱上我\t481205\n方云\t481206\n捞仔\t481207\n2017年6月30日\t481208\n伐木工\t481209\n姓名&#160\t481210\n公兔\t481211\n长安欧尚欧尚\t481212\n肉搏战\t481213\n波达克\t481214\nLexmark\t481215\n遇事\t481216\nCORELDRAW\t481217\n阿卡索\t481218\n土地的誓言\t481219\nLYB\t481220\n1.9元\t481221\nSEYSA\t481222\n3】\t481223\n潍州路\t481224\n磁电\t481225\n山东省消防\t481226\n边境线\t481227\n希望之星\t481228\n芳草园\t4812
29\n向勇\t481230\n米厂\t481231\n新泉\t481232\n西安地铁3号线\t481233\n内耗\t481234\norabbix\t481235\n郑州百姓网\t481236\n达式常\t481237\n社会工作者职业水平考试\t481238\n万科未来城\t481239\niscsiadm\t481240\n索赔\t481241\n糖果色\t481242\n沈阳市房产局\t481243\n龙首山\t481244\n20151015\t481245\n邹强\t481246\n971\t481247\n诡诈\t481248\nIntimate\t481249\nQQQQ头像大全\t481250\n江泽明\t481251\n哈弗h7\t481252\n包河花园\t481253\n集梦\t481254\n王亭喜\t481255\n包销\t481256\nmnf\t481257\n美图\t481258\n荧光增白剂\t481259\n于都政府\t481260\n剑三丐\t481261\n720yun\t481262\n承传\t481263\niconfont\t481264\n广龙\t481265\n碉楼\t481266\n白条鱼\t481267\n小米note3吧\t481268\n武则天墓\t481269\n里边\t481270\n努比亚z9\t481271\nVaVa\t481272\n镇魔曲\t481273\n有限合伙制\t481274\n气动避震\t481275\n惠州汽车总站\t481276\n风干牛肉干\t481277\n洋地黄\t481278\nspbm\t481279\n永不独行\t481280\n牛血清\t481281\n机厅\t481282\n51比购返利网\t481283\n杜仲叶\t481284\n62号\t481285\n5月1号\t481286\n电平\t481287\n真珠\t481288\n急转\t481289\n雷加\t481290\n笛子\t481291\n开山空压机\t481292\n学习板\t481293\n傲风\t481294\n国家质量监督检验检疫总局特种设备安全监察局\t481295\n期货交易者\t481296\n5iMX.com\t481297\n去骨\t481298\n汽车变速器\t481299\nemgucv\t481300\n结头\t481301\nTsung\t481302\nvpc\t481303\n英爱\t481304\n李连\t481305\n_网贷天眼\t481306\n五角场镇\t481307\n新热血英豪\t481308\nRemember\t481309\nLED控制卡\t481310\n拉萨夜雨\t481311\n织金洞\t481312\n发愤\t481313\n反撸\t481314\n蓄谋\t481315\n假发\t481316\n38寸\t481317\n自由任务\t481318\n外加剂\t481319\n自语\t481320\n湘女\t481321\n缕空\t481322\nH版\t481323\n安欣\t481324\n猪皇\t481325\n老有所乐\t481326\n沃尔沃XC\t481327\n舞板\t481328\n2018世界读书日\t481329\n赳赳\t481330\n众联\t481331\n山东省博物馆\t481332\n爱情\t481333\n芙蓉大道\t481334\n盲肠炎\t481335\n40V\t481336\n马湖\t481337\n苏烟\t481338\n联合航空\t481339\n40T\t481340\n侧滤\t481341\n青岛超银中学\t481342\nIntake\t481343\n饰面板\t481344\n海灯\t481345\nK神\t481346\n徐俊熙\t481347\nhplus\t481348\n弟\t481349\n王zexal\t481350\n珞珈山\t481351\n689\t481352\n灰土\t481353\nRolled\t481354\n墨菲定律\t481355\n莲蓉\t481356\n素手\t481357\n上汽大众朗逸\t481358\n周口师范学院\t481359\nPOS58\t481360\n飞库网\t481361\n侧栏\t481362\n金8天国\t481363\n20161201\t481364\n3050\t481365\n龙华人民医院\t481366\nclc\t481367\nflick\t481368\n环规\t481369\n刘跃进\t481370\n余脉\t481371\n钠灯\t481372\n万豪国际\t481373
\nmxGraph\t481374\nBEAST\t481375\nCCTV3\t481376\nsegy\t481377\n安托万\t481378\n颍泉区\t481379\n郑儿\t481380\n霏慕\t481381\n南宁市兴宁区人民政府\t481382\n外单\t481383\n五矿证券\t481384\n0591\t481385\n前事\t481386\n子版\t481387\njiegeng\t481388\n死不瞑目\t481389\n京安\t481390\n桐君阁\t481391\nStep\t481392\n5500元\t481393\n人民东路\t481394\n郊区化\t481395\n报费\t481396\nItis\t481397\n炉业\t481398\nngif\t481399\n沈阳南\t481400\n新开瑞\t481401\nExtension\t481402\n张海燕\t481403\n16brand\t481404\ndraws\t481405\n持剑\t481406\nmian\t481407\n英若诚\t481408\nDHCPv6\t481409\n第十一部\t481410\ngalleries\t481411\nKLIA\t481412\n摇臂式\t481413\nTonights\t481414\noop\t481415\nMargaret\t481416\nVScode\t481417\nadvocate\t481418\n20艘\t481419\nfin\t481420\n神明的话\t481421\nput\t481422\n2018年5月\t481423\n卓越网\t481424\n48集\t481425\n三翼\t481426\nNet130.com\t481427\n三城联创\t481428\nbeyong\t481429\n阿七\t481430\nea4500\t481431\n喘气\t481432\n转业\t481433\n林昊\t481434\n乡村爱情\t481435\n放音\t481436\n5px\t481437\n弈棋\t481438\n青岛财经网\t481439\nMADHOUSE\t481440\n东新街\t481441\n银行间市场交易商协会\t481442\n射频功率放大器\t481443\nShut\t481444\n私人定制\t481445\n攒肚\t481446\neditor\t481447\n7月6日\t481448\n笃学格致\t481449\n傅斯年\t481450\ncovert\t481451\n以泪洗面\t481452\n法门寺\t481453\n自由化\t481454\n暴走人事部\t481455\n百程旅行网\t481456\n加州大学圣地亚哥分校\t481457\nLIFE\t481458\n上舞\t481459\n宠物公园\t481460\n滴水贷\t481461\n阕\t481462\nBY\t481463\n躬亲\t481464\n秒破\t481465\n硬座\t481466\nCounter\t481467\nwebs\t481468\n夜世界\t481469\n棉湖\t481470\n放缩法\t481471\n叶咲梦\t481472\n3.40.01\t481473\n桥式整流\t481474\n湖北中医药高等专科学校\t481475\n中英雄\t481476\n王琳凯\t481477\n留任\t481478\npolicy\t481479\n股城网\t481480\n华英\t481481\n大华\t481482\n∏\t481483\nmessy\t481484\nArteon\t481485\n布氏硬度计\t481486\np2v\t481487\n退运\t481488\n魔兽世界7.0军团再临_多玩WOW\t481489\nku\t481490\n年例\t481491\n钢丝网\t481492\n查询分析器\t481493\nkeyup\t481494\n温泉度假区\t481495\n郭晓然\t481496\n地铁2号线\t481497\n马自雪鹰\t481498\n金港国际\t481499\nMotherboard\t481500\n254号\t481501\nhavd\t481502\n楼面地价\t481503\n金诺\t481504\nVI编辑器\t481505\n可米小子\t481506\n博晖创新\t481507\n永存\t481508\npao\t481509\n刷牙\t481510\n海湾战争\t481511\n好段\t481512\n才会\t481513\nmhtml\t481514\n
汉蓝暴\t481515\nbeliever\t481516\n域名服务器\t481517\nspearman相关系数\t481518\n#努比亚\t481519\n复旦\t481520\n续费率\t481521\nzookpeer\t481522\ngmock\t481523\n绿色\t481524\n魔羯座\t481525\nBoltzmann\t481526\n淳朴\t481527\n鸡肉味\t481528\n马可波罗瓷砖\t481529\n一汽丰田皇冠\t481530\n瓜达尔\t481531\n3005\t481532\n3688\t481533\n失分\t481534\n兰州高新区\t481535\n4步\t481536\n伦文叙\t481537\nLeadtek\t481538\nx9i\t481539\n杨谦\t481540\n不立\t481541\n长生名图\t481542\n中式家具\t481543\nConference\t481544\n8.9亿\t481545\n孕前\t481546\n西藏农牧学院\t481547\nPDG\t481548\n金贝壳\t481549\n梦妮\t481550\nCOMEX\t481551\n乐世社区\t481552\n杭州市妇产科医院\t481553\n荒川之主\t481554\n电动堆高车\t481555\n第十部\t481556\nphysiological\t481557\n天天爱去\t481558\n英歌\t481559\n温庭筠\t481560\n韦少\t481561\n淳儿\t481562\nクロ\t481563\nsplicing\t481564\nMrSunny\t481565\n张嘴\t481566\n瓜拉纳\t481567\n刺激战场\t481568\nidf\t481569\n学信\t481570\n钓鱼伯钓鱼网\t481571\nstaad\t481572\n倒过来看\t481573\n专本\t481574\n漆盖\t481575\n薛阳\t481576\n肋软骨炎\t481577\n双创\t481578\nssh服务器\t481579\n华罗贾斯汀比伯\t481580\n广西云龙招标集团有限公司\t481581\n每3年\t481582\nCTB\t481583\n金有谦\t481584\n公允价值变动损益\t481585\n输出\t481586\n中国建投\t481587\n天津阳光男科医院\t481588\n比试验\t481589\n广西站\t481590\n最后的希望\t481591\n保亭黎族苗族自治县\t481592\n梅观高速\t481593\n终现\t481594\n陈炳\t481595\n指示标志\t481596\nbugreport\t481597\n财局\t481598\nXXXXX\t481599\n济群法师\t481600\n尼勒克\t481601\n世纪互联\t481602\n卡内基梅隆大学\t481603\n变位机\t481604\npandakill\t481605\n龙乡\t481606\n刘丽新\t481607\n菜器\t481608\n金豆子\t481609\n港片\t481610\n奶味\t481611\n安徽交通职业技术学院\t481612\n非布司他片\t481613\n唐鲁孙\t481614\n0766\t481615\n德艺双馨\t481616\n棉草\t481617\n写轮眼\t481618\n金顶街\t481619\n名高\t481620\n小伦\t481621\n秦剑\t481622\n拓斯达\t481623\nron\t481624\n上海展览中心\t481625\n毒蜥\t481626\nUiautomator\t481627\n痴女メイド\t481628\nNetty\t481629\n数学归纳法\t481630\n球皇无双\t481631\n借词\t481632\n帅客\t481633\n旺铺\t481634\n1530\t481635\n流亭机场\t481636\n减半\t481637\n金都路\t481638\nnanyang\t481639\nAcupuncture\t481640\n075595511\t481641\n王叔晖\t481642\n弹种\t481643\n轮胎业\t481644\n国家基本公共卫生服务规范\t481645\nTargeted\t481646\nlpl2017\t481647\n福山\t481648\n抓包分析\t481649\nS100\t481650\n罗南新村\t481651\n列明\t481652\n死鸟\t481653\n50万条\t4816
54\n护理心理学\t481655\n防弹少年团\t481656\n凯尔特人\t481657\n金至尊\t481658\nSolids\t481659\n川崎重工\t481660\n金沙滩\t481661\n香榧\t481662\n杏花沟\t481663\nrich\t481664\nminui\t481665\n18.7\t481666\n枫亭\t481667\n小马云\t481668\n拨备覆盖率\t481669\n扩容机\t481670\n粘粉\t481671\n坏血病\t481672\n进一\t481673\n如图所示\t481674\n起风\t481675\nuh\t481676\n新目标英语\t481677\n中庭\t481678\n危成\t481679\n缠足\t481680\n野鸭子\t481681\nlibx\t481682\n嵌入式\t481683\n杨兰\t481684\n孙夫人\t481685\n全球医院网\t481686\n10轮\t481687\n砌体\t481688\n全套管\t481689\n望庐山瀑布\t481690\n刘冬\t481691\n博科交换机\t481692\n未明\t481693\n北京凤凰岭\t481694\n箱内\t481695\n雪泥鸿爪\t481696\n孤竹城\t481697\n蓝城\t481698\n龙子\t481699\n东南角\t481700\n110弯梁\t481701\nDi\t481702\n军鸡\t481703\n前7个月\t481704\ncbx\t481705\n4万吨\t481706\n聚享\t481707\nClare\t481708\nMP3格式转换器\t481709\n点化\t481710\n七十七天\t481711\n平均收益率\t481712\nJollyWing\t481713\n贾政\t481714\n环境变量\t481715\n蓝月湾\t481716\n石门河\t481717\ngift\t481718\nedb\t481719\n飞星紫微\t481720\n金智教育\t481721\n43类\t481722\nlibmodbus\t481723\n泉台\t481724\n两肋\t481725\n王信平\t481726\n储能式\t481727\n分群\t481728\n陆陆续续\t481729\nChocolate\t481730\n森太\t481731\n27日\t481732\n清泉湾\t481733\n尚且\t481734\n4126\t481735\ndiss\t481736\n安徽农商行\t481737\n眉形\t481738\n置价\t481739\n俞灏明\t481740\n强忍\t481741\n北京大学软件与微电子学院\t481742\n医人\t481743\n中国平安银行\t481744\n开罗拉面店\t481745\n航天展\t481746\n过路人\t481747\n2011年1月1日\t481748\n360文档中心\t481749\nPlc\t481750\n引体\t481751\nAvid\t481752\n途观/途观L论坛_汽车之家论坛\t481753\n劳保展\t481754\n六景\t481755\n发髻\t481756\n常桦\t481757\n扒灰\t481758\n360游戏管家资讯站\t481759\n眼皮\t481760\n天刀真武\t481761\nソフト\t481762\n事业单位考试公共基础知识\t481763\n解放军国际关系学院\t481764\n银行存款利率\t481765\n8.32\t481766\n韩少\t481767\ncaroline\t481768\nvert\t481769\n藏龙卧虎\t481770\n标变\t481771\n姚晓东\t481772\nKawaii\t481773\n被窃\t481774\n豫皖\t481775\nleanote\t481776\n暴街\t481777\nSnapseed\t481778\n链球菌\t481779\n龙田\t481780\n风停\t481781\n中华橱柜网\t481782\na6000\t481783\n懵逼\t481784\nhibiki\t481785\n20160806\t481786\n世纪大道88号\t481787\n五芳斋\t481788\n太湖网\t481789\nRiot\t481790\n3.43\t481791\n不分手\t481792\n艺道\t481793\n说篇\t481794\n满屋\t481795\n巴巴影院\t481796\n血尿酸\t481797\nassured\t48179
8\n137号\t481799\n皮肤癣\t481800\nuscpa\t481801\n办公屏风\t481802\n魔战无双\t481803\n追逐\t481804\n76名\t481805\ninstruction\t481806\njPlayer\t481807\n快线\t481808\n华宇娱乐网\t481809\n白汤\t481810\n睡遍\t481811\n畅玩版\t481812\n94张\t481813\n石油类\t481814\n长沙电视台\t481815\n麦盖\t481816\n50V\t481817\n聒噪\t481818\nBrandon\t481819\n来源于\t481820\n35w\t481821\n诉求\t481822\n重庆时时彩\t481823\n滘\t481824\nsilica\t481825\nThunderobot\t481826\n大航海时代5吧\t481827\n煎锅\t481828\nCarta\t481829\n圣魔之光石\t481830\n20160908\t481831\n界外球\t481832\n喜德\t481833\n留言\t481834\n朱迪·福斯特\t481835\nigc\t481836\napcu\t481837\n十一章\t481838\n2017年4月1日起\t481839\n默频\t481840\nmiui7\t481841\n福州地铁5号线\t481842\n笔试题\t481843\n九龙站\t481844\n螺纹规\t481845\n知识贴\t481846\n刘仲敬\t481847\n几枚\t481848\n马山河\t481849\n土耳其人\t481850\n离身\t481851\nsmoothskin\t481852\n内衣秀\t481853\n四韵\t481854\nbj40l\t481855\nShopee\t481856\n超细磨粉机\t481857\n8行\t481858\n枪杀案\t481859\n身不由己\t481860\nLiuChunfu\t481861\n万学教育\t481862\n丝状\t481863\n卤面\t481864\n再接再厉\t481865\n20双\t481866\n相控阵\t481867\n从未\t481868\n习题\t481869\n建邦\t481870\nHCNP\t481871\nxrange\t481872\nwgt\t481873\n老虎板王\t481874\n太阳历\t481875\n昆明医科大学第二附属医院\t481876\n假照\t481877\n画报\t481878\n3.8.5\t481879\n菏泽市委\t481880\n勾股定理\t481881\n降雪\t481882\n淝水之战\t481883\n郭刚\t481884\n43岁\t481885\n胫腓骨骨折\t481886\n下签\t481887\n300a\t481888\nAndroid4.4\t481889\n圣安东尼奥马刺队\t481890\n媚态\t481891\n狂人日记\t481892\n微仓\t481893\n平凉市政府\t481894\n多线程程序\t481895\n赵瑞\t481896\n速度与激情4\t481897\n秘密任务\t481898\n德洛丽丝\t481899\n石化易捷网\t481900\n爱碧\t481901\nBarbie\t481902\n欧债\t481903\njixian\t481904\nlibxl\t481905\n健字号\t481906\nch3\t481907\nComparable\t481908\n描红\t481909\n智障儿\t481910\n中国铁路沈阳局集团有限公司\t481911\n510s\t481912\n刘勰\t481913\nGrind\t481914\n山东省国土资源厅\t481915\n象屿\t481916\nYESKY\t481917\n长春工业大学\t481918\nfacility\t481919\n望诊\t481920\n落宗\t481921\n缩放\t481922\n岛崎结衣\t481923\n乐Max2\t481924\n铳枪\t481925\n蓄水池\t481926\n凤凰大街\t481927\nsoket\t481928\n故乡情\t481929\nWitmart\t481930\n电磁力\t481931\nuao\t481932\n诈骗案\t481933\n本司\t481934\n奶音\t481935\n冥王星\t481936\n双楼\t481937\n浙江省自然科学基金\t481938\ncomplexes\t481939\n华杨\
t481940\n宁波市财政税务局\t481941\n0815\t481942\nmircosoft\t481943\n466号\t481944\n磺化\t481945\n陈垣\t481946\nNVivo\t481947\n坦克连\t481948\n发人深省\t481949\ncos\t481950\n周口\t481951\n親相姦\t481952\noyo\t481953\n中后期\t481954\n影音先峰\t481955\n百口\t481956\n茂伸奇谈\t481957\n杨枝甘露\t481958\nmarkdownpad2\t481959\n汉学\t481960\n扣税率表\t481961\n砂厂\t481962\n橄榄石\t481963\n1克\t481964\n工具型\t481965\n花千芳\t481966\n胎死\t481967\n持續\t481968\n宏变量\t481969\n眼距\t481970\n零嘴\t481971\n李大仁\t481972\nNeeu\t481973\n重器\t481974\n虚拟声卡\t481975\n珊瑚绒\t481976\n李星\t481977\n线性回归方程\t481978\n打不死\t481979\n美人梅\t481980\nafnet\t481981\nswb\t481982\nmacboo\t481983\n比率\t481984\n金成时代广场\t481985\nSTEP\t481986\n规则式\t481987\nHttpResponse\t481988\n三爷\t481989\n刨子\t481990\n甘林\t481991\n工报\t481992\nSPECIALIZED\t481993\n弯轨\t481994\nTick\t481995\nID值\t481996\n数据地图\t481997\n长沙经济开发区\t481998\nLiquidity\t481999\n依兰县\t482000\n20171202\t482001\nOCEAN\t482002\nnii\t482003\n广东省军区\t482004\n雨尘\t482005\nLogistic回归分析\t482006\n金茂大厦\t482007\n纺\t482008\n第四十八章\t482009\n俊影\t482010\nhandjoy\t482011\n荣新馆\t482012\n百合子\t482013\n手赚网\t482014\n亚惠\t482015\nfixed\t482016\n磊子\t482017\nShenzhen\t482018\n张玉宁\t482019\n单口\t482020\n甲贺忍蛙\t482021\n环市路\t482022\n榕江县人民政府\t482023\n求根\t482024\n直线网\t482025\n毅德城\t482026\n英红九号\t482027\n口塞\t482028\n森鹰\t482029\nCI\t482030\n中维\t482031\n硬菜\t482032\n地中海贫血\t482033\n闪击\t482034\n名访问\t482035\n卖家们\t482036\nborder\t482037\n早课\t482038\n15条\t482039\n中国互联网络发展状况统计报告\t482040\n洗地车\t482041\n扣税\t482042\n康定市\t482043\n阴阳角\t482044\ntrusive\t482045\n二重奏\t482046\n字文\t482047\n115%\t482048\n寥\t482049\n寅虎\t482050\n社会主义道德建设\t482051\n立得\t482052\n6角\t482053\ntopcoder\t482054\n120套\t482055\nsicp\t482056\nproviding\t482057\n1.64G\t482058\n不看我\t482059\n云阳镇\t482060\n玩力\t482061\n龙族2\t482062\n轩昂\t482063\n日向\t482064\n见状\t482065\n主观主义\t482066\n姬川优奈\t482067\n长塘\t482068\n高朋满座\t482069\n社会风气\t482070\n井冈山干部学院\t482071\n小孩\t482072\nchange\t482073\n松兰山\t482074\n时性\t482075\n杭州民办小学\t482076\nNelly\t482077\ncock\t482078\n北京斗米优聘科技发展有限公司\t482079\n杰刚\t482080\nFeed\t482081\n诺托\t482082\n中国航天科工集团\t48208
3\n陕西艺术职业学院\t482084\n座驾式\t482085\n[综漫\t482086\n汽油\t482087\n过载保护器\t482088\nTek\t482089\n凌驾\t482090\n中国新说唱\t482091\nrctd\t482092\n五角场街道\t482093\n非数字\t482094\nbeds\t482095\nDLL\t482096\n吹笛\t482097\n一世倾城冷宫弃妃\t482098\n照片男\t482099\n高唱\t482100\nwincc\t482101\n雪藏\t482102\nbd33net\t482103\n快说\t482104\n黄鹰\t482105\n下期末\t482106\n停更\t482107\n三峡库区\t482108\noft\t482109\n嘈杂\t482110\nXmn\t482111\n平台型\t482112\n散落星河的记忆\t482113\n地塞米松片\t482114\n奔驰V级\t482115\n熊霸\t482116\n安卓5.1\t482117\n塑料板\t482118\nyyx\t482119\n厮混\t482120\n步战\t482121\n喘不上\t482122\n李维\t482123\n玉洞\t482124\n广宇发展\t482125\n遨游网\t482126\nRnb\t482127\n开心一刻\t482128\n华润万象汇\t482129\n全等\t482130\n第153章\t482131\n硅酸铝\t482132\nxy苹果助手\t482133\n华工科技\t482134\n汤口\t482135\nsunboy\t482136\n人大附中深圳学校\t482137\n好搞\t482138\nMantle\t482139\n环氧树脂板\t482140\nB860A\t482141\n好赞\t482142\n89周年\t482143\n伯恩利\t482144\nmonte\t482145\n测试箱\t482146\nJang\t482147\n五项\t482148\n5名\t482149\n商品房买卖合同\t482150\n程灵素\t482151\n心理情景剧\t482152\n悦音\t482153\n乐跑\t482154\n电磁式\t482155\n翁梅\t482156\n酶联免疫法\t482157\n固力\t482158\n株洲路\t482159\n魔种\t482160\n房产抵押贷款利率\t482161\n灵均\t482162\n外站\t482163\n中国政府采购网\t482164\n电力英才网\t482165\n各地级\t482166\nAround\t482167\n女音\t482168\n陈其钢\t482169\n调查令\t482170\next2\t482171\nav男优\t482172\n安华汇\t482173\nwuyudong\t482174\n上饶市环保局\t482175\nFocused\t482176\n吕柳荫\t482177\n怨天尤人\t482178\n六味\t482179\n舌音\t482180\n凉介\t482181\n月末\t482182\n发热体\t482183\n五旬节\t482184\n改办\t482185\n燃料\t482186\n七座车\t482187\n报怨\t482188\nphysique\t482189\n火焰纹章\t482190\n计量器\t482191\n电压\t482192\n厦门国贸集团股份有限公司\t482193\nü\t482194\n乐动时代\t482195\n全孝盛\t482196\n云豆网\t482197\n服部半藏\t482198\n市场部\t482199\n分体机\t482200\n斜管填料\t482201\n地感\t482202\nforce1\t482203\nBlend\t482204\n红豆薏米汤\t482205\n轩辕剑7\t482206\n沉渣\t482207\n熊猫金控\t482208\n谄\t482209\n支付宝扫一扫\t482210\n地勘\t482211\n帝景\t482212\n不沾\t482213\n印式\t482214\ncmyk\t482215\n香山\t482216\n舒普\t482217\nihealth\t482218\ntrade\t482219\n反导\t482220\n大仇\t482221\n20180312\t482222\n金工\t482223\nreamer\t482224\n挥一挥\t482225\nsamurai\t482226\n网贴翻译_四月网\t482227\n本尼迪克特\t482228\n开幕辞\t48
2229\nnewport\t482230\nbestbuy\t482231\n5类\t482232\nAward\t482233\n缆机\t482234\n1.60\t482235\n戈壁料\t482236\n绥安镇\t482237\n核质\t482238\n传祺gs8\t482239\n艺体类\t482240\n河鱼\t482241\n4.2.7\t482242\n欧邦\t482243\n直播\t482244\n北京京东世纪贸易有限公司\t482245\n中共安徽省委教育工委\t482246\n北京律师事务所\t482247\n薛杉杉\t482248\n黄永生\t482249\n傻傻分不清楚\t482250\nios7.1.1\t482251\n金尖\t482252\n水玻璃\t482253\n列线\t482254\n再贷\t482255\n百合片\t482256\n打底裙\t482257\n刚愎自用\t482258\n535\t482259\nmodul\t482260\nCuphead\t482261\nbtoa\t482262\n日晷\t482263\n荣耀夫子\t482264\ndynabook\t482265\n师资格证\t482266\neinstein\t482267\n改造后\t482268\nuncut\t482269\n西安国家民用航天产业基地\t482270\n南京网易\t482271\n0935\t482272\n涉水\t482273\n105平米\t482274\nexpress\t482275\nCVS\t482276\n花木城\t482277\nLED泛光灯\t482278\n保鲜箱\t482279\n世茂城\t482280\n爬爬垫\t482281\n种子队\t482282\n中信产业基金\t482283\njoomla\t482284\n百望云\t482285\n2068\t482286\n庄严\t482287\n一大杯\t482288\nTALK\t482289\n二乔玉兰\t482290\n介绍稿\t482291\n镇海新城\t482292\nCa\t482293\n电动门遥控器\t482294\n汇单\t482295\n胀形\t482296\n27度\t482297\n皇妹\t482298\nStinger\t482299\n插接\t482300\n警匪剧\t482301\nxxoo\t482302\n荧幕\t482303\n反弓\t482304\n西安万达\t482305\n大沙东\t482306\n扫文\t482307\n纳吉\t482308\n放飞\t482309\n个人史\t482310\n北京大学深圳研究生院\t482311\n海江\t482312\n陈明远\t482313\n组价\t482314\nVPN\t482315\n1855\t482316\n近4年\t482317\n天使轮\t482318\n高铁\t482319\n转定\t482320\n杨志刚\t482321\n人民桥\t482322\n4W\t482323\n中大\t482324\n十多分钟\t482325\nppt2018\t482326\n新的开始\t482327\ntable合并单元格\t482328\nx620\t482329\n渗沟\t482330\n金道铭\t482331\nNested\t482332\n挥动\t482333\n不惯\t482334\n甄氏\t482335\n百场\t482336\nRaptor\t482337\nHTML4\t482338\npcloud\t482339\n归藏\t482340\n三十米\t482341\n手们\t482342\n莉莉\t482343\n对内\t482344\n入膏肓\t482345\n千难万险\t482346\ncuda8\t482347\n双次\t482348\n片键\t482349\n高彩\t482350\nNewSQL\t482351\n一個\t482352\n$watch\t482353\n天枰座\t482354\n本心\t482355\npes2009\t482356\n强心脏\t482357\n奇喵\t482358\n龟吉\t482359\n交合\t482360\n汤姆苏\t482361\n小龙卷风\t482362\n说散就散\t482363\n堀越耕平\t482364\n母鸭\t482365\n邮编网\t482366\n股权转让协议\t482367\nd100\t482368\n异度\t482369\n凤岭北\t482370\n亚太天能\t482371\n大批量\t482372\n混为一谈\t482373\n光电科技有限公司\t
482374\n红岗\t482375\n线管\t482376\n通河县\t482377\n下空\t482378\nCO_2\t482379\n叛将\t482380\n150424\t482381\n星云股份\t482382\n20160812\t482383\n锂离子动力电池\t482384\n盆式\t482385\n腔调\t482386\nTesting\t482387\n南昌万达\t482388\n共同努力\t482389\n3.2亿\t482390\n慕尼黑\t482391\n更简便\t482392\nCUP\t482393\n微鲸\t482394\n惠山区\t482395\n抹灰机\t482396\n湛河\t482397\n酸痛感\t482398\nidentical\t482399\n山钢集团\t482400\n4月26\t482401\n上学路\t482402\n船艇\t482403\n哈_\t482404\nAppsFlyer\t482405\n120km\t482406\nWord、Excel\t482407\nWife\t482408\nIterator\t482409\nunwind\t482410\nnu\t482411\n十四场\t482412\n储集\t482413\nexcel函数公式大全\t482414\n杜万华\t482415\n7支\t482416\n暗黑者\t482417\n肝宝\t482418\n城市规划管理\t482419\n爱情守望者\t482420\n戊辰日\t482421\n北方图书城\t482422\nViolation\t482423\n平江区\t482424\n超级力量2吧\t482425\n基础教育\t482426\n2斤\t482427\n订\t482428\n骏捷论坛\t482429\n京区\t482430\n九宫八卦牌\t482431\n记录卡\t482432\n165家\t482433\n愿\t482434\n大地保险公司\t482435\n男孩\t482436\n遭逢\t482437\n汤非\t482438\n接任\t482439\n1249\t482440\n南京康辉旅行社\t482441\n绵阳市中心医院\t482442\n伸张正义\t482443\n普及度\t482444\n里约奥运会\t482445\n代表色\t482446\n灵火\t482447\n奔驰卡车\t482448\n大一号\t482449\nNomenclature\t482450\n总政\t482451\n香草味\t482452\n强度\t482453\n斗\t482454\n剪板\t482455\n藏金阁\t482456\n残压\t482457\n11.1.8\t482458\n凉感\t482459\n第26集\t482460\n华冠\t482461\ngenggai\t482462\n鲜竹沥\t482463\ndefining\t482464\n7P\t482465\niRedMail\t482466\nLH\t482467\n拆弹\t482468\njira\t482469\nwinserver2016\t482470\n李狗嗨\t482471\n铄\t482472\nconsumers\t482473\n可读写\t482474\n温泉水\t482475\n微软小冰\t482476\n第五人民医院\t482477\n调皮王妃\t482478\n串供\t482479\n2pro\t482480\n帅印\t482481\n创维e900\t482482\n万叶\t482483\nmaste\t482484\n刻纸\t482485\n惠氏奶粉\t482486\n果苗\t482487\nFoil\t482488\nManaging\t482489\n抽搐\t482490\n清华大学天津高端装备研究院\t482491\nV2.5\t482492\n尬\t482493\nACCA考试\t482494\n900天\t482495\n方芳\t482496\n济南银座\t482497\n余温\t482498\n翔实\t482499\n烽火狼烟\t482500\n吉林省地方税务局\t482501\n期货人\t482502\n盖亚奥特曼\t482503\n合景天汇广场\t482504\n好标网\t482505\n旧貌换新颜\t482506\n一艺\t482507\n日照港\t482508\n鸟鸣涧\t482509\n行星边际2\t482510\n文锦\t482511\n6.5%\t482512\n王牌保镖\t482513\n区人民法院\t482514\n甲辰\t482515\n殇情影院\t482516\n质子
化\t482517\n高连\t482518\n艾尔特\t482519\n法法\t482520\n如梦令\t482521\n李亦然\t482522\n海鲜坊\t482523\nPPSU\t482524\n光计\t482525\n抗战史\t482526\n附示例\t482527\n写份\t482528\n水磨糯米粉\t482529\n5万亿元\t482530\nDive\t482531\n阔达\t482532\n京新药业\t482533\n最终幻想10-2\t482534\n卓志\t482535\n过热\t482536\n北京市大兴区人民政府\t482537\n直线电机\t482538\n潘安邦\t482539\n乳夹\t482540\n2x2\t482541\n职业禁忌症\t482542\n`_\t482543\n前十年\t482544\nDCT\t482545\n龋病\t482546\n唐姓\t482547\n液罐\t482548\nDynamic\t482549\nHOLDINGS\t482550\n麦吉丽\t482551\n盐酸伐昔洛韦片\t482552\ntrophy\t482553\n街乡\t482554\nCDF\t482555\n熏鱼\t482556\n蔚蓝\t482557\n王佩\t482558\na2+b2\t482559\n挑衅\t482560\n裸光\t482561\n党建\t482562\n明月几时\t482563\n舐犊\t482564\n马胜利\t482565\nblued\t482566\n酒楼\t482567\n黄奇帆\t482568\n黄光剑\t482569\n灌缝机\t482570\n陈萍\t482571\n署名权\t482572\n甜头\t482573\n东北大学秦皇岛分校\t482574\n煤基\t482575\n奥普拉\t482576\n拚\t482577\n主骨\t482578\n心髓\t482579\n174399\t482580\nⅰ\t482581\nvigation\t482582\nMSS\t482583\n微卫星\t482584\n等你爱\t482585\n永暑岛\t482586\n律法\t482587\n潜弓\t482588\nPI3K\t482589\n快阁\t482590\n荔枝蜜\t482591\niceland\t482592\n国家统计局北京调查总队\t482593\nkelly_\t482594\n新生儿败血症\t482595\n查说\t482596\n阳光花城\t482597\n湖南医药学院\t482598\n魔界战记\t482599\nuc55\t482600\nl551\t482601\nf15\t482602\nInfo\t482603\n高虎城\t482604\nSoSo\t482605\n144级\t482606\nPIR\t482607\n金帝\t482608\n乙方\t482609\n吴英案\t482610\n加利福尼亚州\t482611\n高压旋喷桩\t482612\n彩超\t482613\n梦轩\t482614\n牡丹峰\t482615\ndbutil\t482616\n秦海璐\t482617\nRedHat\t482618\n抖s\t482619\n找出\t482620\n课程表\t482621\n6d2\t482622\n137平\t482623\n普萘洛尔\t482624\n英文报\t482625\n班固\t482626\n物是人非事事休\t482627\n童言\t482628\n朝下\t482629\n聚氨酯筛网\t482630\n汇感网\t482631\n贺龙体育馆\t482632\n韩路体验\t482633\nSpringMVC拦截器\t482634\nhbuider\t482635\nOST\t482636\n三年以上\t482637\n音乐课\t482638\n伦敦机场\t482639\n柬埔寨\t482640\n五江天街\t482641\nemily\t482642\n王雪琴\t482643\n狭\t482644\n经世致用\t482645\n杨宝峰\t482646\n家生\t482647\n阿隆\t482648\n晶体硅\t482649\n都市言情_永生小说网\t482650\n赵扬\t482651\nNTLDR\t482652\n红太阳\t482653\n十三级\t482654\n鼠标灯\t482655\n老虎山\t482656\n勇者胜\t482657\n看开\t482658\n东曹\t482659\n契诃夫\t482660\n奥鹏教育\t482661\n丝线\t482662\n领秀\t482663\n工服\t482
664\n张若昀\t482665\nPhys\t482666\n一无\t482667\nwgs\t482668\n方达律师事务所\t482669\n资源\t482670\n殊死\t482671\n必修课\t482672\nnetflow\t482673\nMMC\t482674\n潜龙在渊\t482675\n第100天\t482676\nwwwmeinv588com\t482677\n红星歌\t482678\nMarshmallow\t482679\n县疾控中心\t482680\n中控考勤软件\t482681\n香港理工大学\t482682\n麦积山石窟\t482683\n7007\t482684\n衡阳市中级人民法院\t482685\n分枝杆菌\t482686\n润泽园\t482687\n赵泽伟\t482688\n田家湾\t482689\n抢劫犯\t482690\niPhone\t482691\n中建四局\t482692\n哭\t482693\n三氟甲基\t482694\nWingYing\t482695\n年化\t482696\n个人独资企业法\t482697\n提亚拉\t482698\n五溪\t482699\n直行\t482700\n婚品\t482701\n银行流水账单\t482702\n神奇四侠\t482703\n退热\t482704\n完机\t482705\n铁心\t482706\nCubase8\t482707\nIRIS\t482708\n戴美瞳\t482709\n寿安\t482710\n赤井秀一\t482711\n张平\t482712\n郭氏\t482713\nJesson\t482714\nInpaint\t482715\n登革热\t482716\n总监理\t482717\n1243\t482718\n铰接式\t482719\n离心\t482720\n2HD\t482721\n阴超检查\t482722\n大自血\t482723\n莫得\t482724\n淘宝天猫\t482725\n东北电力\t482726\n钢闸门\t482727\n周柏豪\t482728\n好弱\t482729\n丰城市\t482730\n曲周县\t482731\nwesternblot\t482732\nLPR\t482733\n窃听风云3\t482734\n预制\t482735\n金黄色\t482736\n盆盆罐罐\t482737\n张炜\t482738\n郑恩地\t482739\n超低\t482740\nHD2\t482741\n报名条\t482742\n黔人社厅\t482743\nDWR\t482744\n克里斯\t482745\n隐翅虫\t482746\n大门牙\t482747\n热力公司\t482748\n山西省政府\t482749\n打孔机\t482750\n中民保险网\t482751\n建物\t482752\n死亡空间2\t482753\npoo\t482754\n蓝港互动\t482755\n小米手机ROM\t482756\nchine\t482757\n装配工\t482758\n约定\t482759\n邢台医学高等专科学校\t482760\n张春\t482761\n射箭\t482762\n桥壳\t482763\n清港镇\t482764\nMixcloud\t482765\n69第一\t482766\n刘建辉\t482767\nreact\t482768\n神骐\t482769\n方太燃气热水器\t482770\n一直都在\t482771\nZillow\t482772\n资本化率\t482773\n死亡案\t482774\n隔壁的女孩\t482775\n平安壹钱包\t482776\n叶倩文\t482777\n孟祥斌\t482778\n肖洛霍夫\t482779\nBFD\t482780\n京东E卡\t482781\n百度推广账户\t482782\n热门事件\t482783\n成都铁路卫生学校\t482784\n白瓶\t482785\n小和\t482786\n政协会\t482787\n石油大厦\t482788\n找回来\t482789\n青白江\t482790\n6.30\t482791\n小嫩妹\t482792\n李善\t482793\n中广\t482794\nISSA\t482795\n电子秤\t482796\n65周年\t482797\n海口旅游网\t482798\n木框\t482799\n零号\t482800\n过去12个月\t482801\ntir\t482802\n新寨\t482803\n世态炎凉\t482804\n赃物\t482805\n南京市鼓楼区\t482806\nxgen\t482807\ncad制图软件\
t482808\n朝美\t482809\n唐唐神\t482810\n50多名\t482811\n玉川\t482812\nOmniPeek\t482813\n脆片\t482814\n穴道\t482815\n上海社保公司\t482816\n陈更\t482817\nyimin\t482818\nReview\t482819\n鞍山人才网\t482820\n义乌市区\t482821\n新学案\t482822\n70\t482823\n废标\t482824\nJorge\t482825\nnavy\t482826\n幻术师\t482827\n控制板\t482828\n换脸\t482829\n非标品\t482830\ntl-wdr5620\t482831\n全开\t482832\n阿胶\t482833\n圣力\t482834\n组策略\t482835\n二级建造师证\t482836\nspringboot过滤器\t482837\n：\t482838\nshowme\t482839\n三轴搅拌桩\t482840\n沙河站\t482841\n波力\t482842\n锐明\t482843\nKristen\t482844\n仲恺高新技术产业开发区\t482845\n37张\t482846\n铁船\t482847\n恋爱宝典\t482848\ncyt\t482849\n结婚后\t482850\npreserve\t482851\n上海人才网\t482852\n汉堡店\t482853\nCentric\t482854\n南沙港\t482855\n南医大二附院\t482856\n谷城新闻网\t482857\n林良\t482858\nliking\t482859\n武汉商贸职业学院\t482860\n维护费\t482861\n偏头痛\t482862\nvs\t482863\n位面\t482864\n爱辉\t482865\n拼板\t482866\n投标信息网\t482867\n片语\t482868\n19.0.0\t482869\n那霸机场\t482870\nlynn\t482871\n流体力学\t482872\n静安府\t482873\n挽手\t482874\n联想Lenovo\t482875\nkinetic\t482876\n血手幽灵吧\t482877\n亲有\t482878\n金青\t482879\n炎妃龙\t482880\nLosing\t482881\n花君\t482882\n5公分\t482883\n若是\t482884\n许鹤缤\t482885\n卡丝\t482886\n氧化锆\t482887\n上汽名爵\t482888\n字类\t482889\n重庆市设计院\t482890\n交通状况\t482891\n芙蓉雨\t482892\nLabView\t482893\n骑游\t482894\ntijden\t482895\n_个性说说网\t482896\n安全家\t482897\n宗教性\t482898\nFILMS\t482899\n张弢\t482900\n攀枝花市人民政府\t482901\n坂田\t482902\n感爱\t482903\n反力\t482904\n要饭\t482905\nbmd\t482906\n杭州海关\t482907\n稿信\t482908\n杜绍斐\t482909\n翻跟头\t482910\nInvoke\t482911\n丹江口市\t482912\n金球奖\t482913\n陈兰\t482914\npvid\t482915\n和蔼\t482916\n异途\t482917\n张大\t482918\n神怒\t482919\n济源网\t482920\n对公\t482921\nManifesto\t482922\n阮白\t482923\n分支\t482924\n海昌\t482925\n美遥\t482926\n北非\t482927\nuvz\t482928\n绿色免费版\t482929\n冰淇淋店\t482930\nnokia\t482931\n1.3.5.2\t482932\n酷乐家\t482933\n溶解性\t482934\nV18\t482935\n憩园\t482936\ndiagbox\t482937\n宝兴网\t482938\n国网冀北电力有限公司\t482939\n曼谷\t482940\nSupreme\t482941\n金涛\t482942\n上海黄页88网\t482943\n青马者\t482944\n步履\t482945\nVGP\t482946\n房屋\t482947\n裴讯\t482948\nPaint\t482949\n秒播\t482950\n胃糜烂\t482951\n喜士\t482952\n保值率\t4
82953\n石城县政府\t482954\n抗静电剂\t482955\n北京市师达中学\t482956\n佐佐木绯世\t482957\nPaulo\t482958\n江湖行\t482959\n14#\t482960\n旅游台\t482961\n思思\t482962\n金星秀\t482963\n麝手\t482964\n随员\t482965\n内接\t482966\n闪修侠\t482967\nRf\t482968\n湿吻\t482969\n哈吉斯\t482970\n进化镇\t482971\n里士\t482972\n逸名网\t482973\n金田一少年之事件簿\t482974\n河池日报\t482975\n天津之眼\t482976\n小米锅巴\t482977\n艾恩\t482978\nsibelius\t482979\n7k7k赛尔号\t482980\n一句一句\t482981\n第A15\t482982\n单红\t482983\n城野\t482984\n宋浩\t482985\nsexo\t482986\nGroupBox\t482987\n基础件\t482988\n滚牙机\t482989\n脱骨\t482990\nGENE\t482991\n彩盒\t482992\n药房网商城\t482993\n域名解析\t482994\n短效避孕药\t482995\n文化分\t482996\n整编\t482997\n数码摄像机\t482998\n纸口\t482999\n雪铁龙世嘉\t483000\n胡大一\t483001\nudid\t483002\n扪心自问\t483003\n积点\t483004\n邦联\t483005\n2018.01.15\t483006\n主著\t483007\n夏初\t483008\n依爱\t483009\n鸭店\t483010\n插箱\t483011\n应用文搜藏网\t483012\n无机物\t483013\ndirectadmin\t483014\n阳羡\t483015\n1.47\t483016\n外国片\t483017\n万家\t483018\n卡尼\t483019\n泛清真化\t483020\nWOG\t483021\n50码\t483022\n郑功成\t483023\n代际\t483024\n证券时报\t483025\n二注\t483026\nPunk\t483027\n第一情\t483028\n昵图\t483029\n秘境\t483030\nSIII\t483031\nne\t483032\n帮派\t483033\nEmeditor\t483034\n独孤天下独孤曼陀\t483035\n欧文3\t483036\n红海湾\t483037\nJDBCTemplate\t483038\n面板\t483039\nhuy\t483040\n阿周那\t483041\n男篮\t483042\n46米\t483043\n上塘路\t483044\ntalks\t483045\n创业者\t483046\n海工\t483047\n17yy\t483048\nGreg\t483049\naspnet\t483050\nNav\t483051\n每组\t483052\n玄奘三藏\t483053\n—【世界之最网\t483054\n一调\t483055\nwangzi\t483056\n辅读\t483057\n五马街道\t483058\n月下蝶影\t483059\n有福\t483060\nroll\t483061\n西葫芦\t483062\n枣阳市\t483063\n编辑栏\t483064\nヽ\t483065\n海岛\t483066\n一叮\t483067\n中国汽车工程学会\t483068\n曹松\t483069\n统统\t483070\nvisudo\t483071\nx3650m4\t483072\nMBS\t483073\n南阳市人民政府\t483074\napocalypse\t483075\nManu\t483076\n嵊泗新闻网\t483077\n艾拉西亚\t483078\n决一死战\t483079\ng3260\t483080\n赫拉迪克方块\t483081\n软妹\t483082\n梅山镇\t483083\ncovermark\t483084\n温控开关\t483085\n27日晚\t483086\n宝格丽蛇头包\t483087\nx450v\t483088\n韩城之窗\t483089\n凤城镇\t483090\n精益求精\t483091\n郑元畅\t483092\n司马老贼\t483093\nsv\t483094\n12宫\t483095\nMixly\t483096\n苏州电子\t483097\n医通\t48
3098\n人才观\t483099\n财资\t483100\n广东产品质量监督检验研究院\t483101\n解痉\t483102\n泮塘\t483103\nM6700\t483104\n悠然\t483105\n联航\t483106\n蓝秀\t483107\n滨江集团\t483108\n深圳出入境检验检疫局\t483109\n昊昊\t483110\n李萌萌\t483111\n第四讲\t483112\n乳牛\t483113\n诺依曼\t483114\n碎月\t483115\n黄龙村\t483116\n红舞鞋\t483117\n吴起县政府\t483118\nxybaby\t483119\n任素汐\t483120\nCourses\t483121\n欧卡模拟2\t483122\n华中区\t483123\n6.27\t483124\n居底\t483125\n网游之邪龙逆天\t483126\n卧龙湖\t483127\ntenserflow\t483128\ngirl\t483129\n保利中心\t483130\n半吨\t483131\n艺术化\t483132\n李柯\t483133\n山东省实验中学\t483134\n太阳能电站\t483135\n瑞典语\t483136\n吴洋\t483137\nhammer\t483138\n北京加油站\t483139\n一亿条\t483140\n物理弹球\t483141\n娇宠\t483142\n高低点\t483143\n标致万界天尊\t483144\nEhome\t483145\n花生汤\t483146\n东方神灵庙\t483147\n古村镇\t483148\ndesigner09\t483149\n国王学院\t483150\n紫竹\t483151\n罗庄\t483152\n建高\t483153\n述职述廉述德\t483154\n黄河路\t483155\noepncv\t483156\n超讯通信\t483157\ntop1\t483158\n洛丹伦\t483159\n调任\t483160\n简洁\t483161\n操蛋\t483162\nwritefile\t483163\n易麦宝\t483164\n沈遇\t483165\n张军\t483166\n快日\t483167\n碌碌\t483168\nEnding\t483169\n1.35V\t483170\n雪米\t483171\n速算与巧算_奥数网\t483172\nppi\t483173\n自适应子\t483174\n调整\t483175\n次奥\t483176\ntgz\t483177\n工作卡\t483178\n第80章\t483179\n苏芳\t483180\nGale\t483181\nGOO\t483182\n物优\t483183\n680\t483184\n刘立新\t483185\n对外援助\t483186\n任真\t483187\n余姚人民医院\t483188\nidentities\t483189\n怪物公司\t483190\n车牌架\t483191\n毒医\t483192\n最囧挑战3\t483193\n石墨炉\t483194\nEARTH\t483195\n鱼山\t483196\n双星轮胎\t483197\n王炳翔\t483198\n草书\t483199\n卤煮\t483200\n春福师\t483201\n俄罗斯卫星通讯社\t483202\n领风骚\t483203\n跟腱炎\t483204\n绕开\t483205\niMySQL\t483206\n浙江美大\t483207\n基尼斯\t483208\n2018年3月6日\t483209\nwincc7.3\t483210\n西幻\t483211\n回弹仪\t483212\n链结\t483213\n1832年\t483214\nPantum\t483215\n7350\t483216\n曼莎\t483217\n东北味\t483218\nSublimeREPL\t483219\n大要案\t483220\n好极了\t483221\n四神集团\t483222\n江南Style\t483223\nAOA\t483224\n损\t483225\n张栩菲\t483226\n330ml\t483227\n吉隆坡\t483228\nt台秀\t483229\nVMM\t483230\n刘强东\t483231\n雪岭\t483232\n每一位\t483233\n御守\t483234\n第36期\t483235\n20150830\t483236\n宠物店\t483237\n建筑畅言网\t483238\n染坊\t483239\n国泰君安\t483240\n11个小时\t483241\n新春走\t4832
42\nWuhan\t483243\n敲响\t483244\nfound_Linux\t483245\n皮砖\t483246\n魔士\t483247\n春晖中学\t483248\n荣之联\t483249\n傲翼\t483250\n白钟元\t483251\n周剑\t483252\nseewo\t483253\n齐鲁晚报数字报\t483254\nCarr\t483255\nlk\t483256\n不管用\t483257\n大路\t483258\n嘉禾县委\t483259\n韶关站\t483260\n文成县\t483261\n黄金龙\t483262\n呲\t483263\n指法表\t483264\n商务包\t483265\n楚州\t483266\n72小时后\t483267\nETC卡\t483268\n课件\t483269\n早日\t483270\nppt|flash|\t483271\nMills\t483272\n养鱼场\t483273\n拉谷谷\t483274\n象山影视城\t483275\n液瓶\t483276\nwin7分区\t483277\n口袋妖怪白金\t483278\n88款\t483279\n金丝\t483280\n著作权\t483281\n陈昆\t483282\n2000多块\t483283\n工具钢\t483284\n大海螺\t483285\n木本\t483286\npondo\t483287\n西站\t483288\nverde\t483289\nsuperficial\t483290\nqingse\t483291\n纽约证券交易所\t483292\n珠影\t483293\n手工皂\t483294\n永生小说网\t483295\n杞县\t483296\n贝利亚\t483297\nYiparts\t483298\n耀莱成龙\t483299\n肾结核\t483300\n33日\t483301\nprocreate\t483302\nwana\t483303\n陈姿彤\t483304\n迷橙\t483305\n庄园\t483306\ngenerates\t483307\n职业版\t483308\n占压\t483309\nGoldberg\t483310\n凭证式\t483311\nsomewhat\t483312\n506路\t483313\n吉普指挥官\t483314\n简卡\t483315\n湘西剿匪记\t483316\n威高\t483317\n上盘\t483318\nAnsible\t483319\nxing/\t483320\n鮰鱼\t483321\n科汇金谷\t483322\n省高院\t483323\n初态\t483324\n自编译\t483325\ng80\t483326\n小乡村\t483327\n青烟\t483328\n阀批\t483329\n诗词\t483330\nbufio\t483331\n鲜肉老师\t483332\n淫妻网\t483333\nvolt\t483334\n福特蒙迪欧致胜\t483335\n金网艺\t483336\nDESIGNER\t483337\n凡卡\t483338\nkbg管\t483339\n大疆晓\t483340\n首推\t483341\n犄角\t483342\n12cr1mov\t483343\n民调局异闻录之勉传\t483344\n中科软\t483345\nattenuation\t483346\n青岛市经济和信息化委员会\t483347\n王志斌\t483348\n唔好\t483349\n中国书画家协会\t483350\n罗马2\t483351\n三眼\t483352\n新民主主义论\t483353\n悦食\t483354\n咸阳卫生信息网\t483355\n北京市人社局\t483356\n拥有率\t483357\n砖木\t483358\n周益民\t483359\nspacer\t483360\n3.2.3\t483361\nbix\t483362\n保持架\t483363\n霍香\t483364\ndeterminant\t483365\n普天之下\t483366\n酷划\t483367\n社会关系\t483368\n丁晓君\t483369\n嘉祥县\t483370\n薛勇\t483371\n侵略\t483372\n李慎明\t483373\n蒙迪欧论\t483374\n墨池\t483375\n王文君\t483376\n送检方\t483377\n20171025\t483378\n化龙巷\t483379\ngDMSS\t483380\n枪客\t483381\n雨眉\t483382\n罗技G900\t483383\n洛宁\t483384\n半月\t483385
\n老夫\t483386\n刘赛\t483387\n什莫\t483388\n寒春\t483389\n序文\t483390\nLaunch\t483391\n天道图书馆\t483392\n净地\t483393\n蛇框\t483394\n红核妇洁\t483395\n人民解放军海军\t483396\nalsscan\t483397\nTegra\t483398\n边村\t483399\n沃拉尊\t483400\n太原钢铁集团有限公司\t483401\npublishers\t483402\n新世纪儿童医院\t483403\n涉赌\t483404\n犯罪者\t483405\n腾鳌\t483406\n魏宏\t483407\n楼观\t483408\n白藤\t483409\n教文\t483410\n索菱\t483411\nversys650\t483412\n秋收起义\t483413\n严州\t483414\n青春万岁\t483415\n唐书\t483416\n烘干炉\t483417\ndeb\t483418\n6根\t483419\n评估版\t483420\nELEMENT\t483421\n画值\t483422\n10年间\t483423\n10.2%\t483424\noliver\t483425\n监察组\t483426\ni黑马\t483427\n320ml\t483428\n转币\t483429\nJohannes\t483430\n首违免\t483431\n海汇\t483432\n中日联谊医院\t483433\n软水机\t483434\n)管理有限公司\t483435\nhaose01\t483436\n黄陵县\t483437\n广州公园\t483438\nwin7纯净版\t483439\n汝多\t483440\n巨盾\t483441\n单个人\t483442\n煎饺\t483443\nftx\t483444\n铁道游击队\t483445\n蜜茶\t483446\n羟值\t483447\n哀君\t483448\n人民币收藏吧\t483449\n荣迷\t483450\n温润如玉\t483451\n总编室\t483452\nCRUISE\t483453\n香山路\t483454\n东岭集团\t483455\n公共\t483456\nColl\t483457\n坯\t483458\n受领\t483459\n联想乐\t483460\n十八梯\t483461\nRandall\t483462\n5b00\t483463\n科委\t483464\n侃侃而谈\t483465\n内存卡网\t483466\n李挺\t483467\nhpv病毒\t483468\n广东商学院\t483469\n连锁\t483470\n照料\t483471\n轨道衡\t483472\n银丰\t483473\n俑\t483474\nN条\t483475\n设计展\t483476\nAM\t483477\n办工\t483478\n金泽洙\t483479\n小叶性肺炎\t483480\n联纸\t483481\n4.11\t483482\n增至\t483483\n5390\t483484\n实创\t483485\n哈尔滨道\t483486\n新开神途发布网\t483487\n当地时间\t483488\n澳门黄金城\t483489\nAC86U\t483490\n第三种\t483491\nTouch\t483492\n股改\t483493\narbitration\t483494\n王萌\t483495\n索菲玛\t483496\n方大厨\t483497\n苏义飞\t483498\n安宫黄体酮\t483499\n运动群\t483500\n行商\t483501\n而行之\t483502\n调报告\t483503\nmm-dd\t483504\nfashioned\t483505\n黑板墙\t483506\n老钱庄股票论坛\t483507\n民族乡\t483508\n网络安全法\t483509\nburt\t483510\n林中路\t483511\n团泊\t483512\n少年文艺\t483513\n赵虎\t483514\n堇天府\t483515\n妥布霉素\t483516\n上万斤\t483517\nlattice\t483518\n死亡空间\t483519\n2016年劳动节\t483520\nF1赛车\t483521\n易快报\t483522\n甲某\t483523\n开封市龙亭区\t483524\n汉方\t483525\n120欧\t483526\n软化水设备\t483527\n中车时代\t483528\n守口如瓶\t483529\nDJ耶耶网\t483530\n生物医学
工程专业\t483531\n13600\t483532\n小贺\t483533\n经审查\t483534\n痛批\t483535\n幽灵\t483536\nglam\t483537\nsu27\t483538\n电动车展\t483539\n1至12月\t483540\n安倍经济学\t483541\n大龙湖\t483542\n性理\t483543\n对待\t483544\n下板\t483545\n短杖\t483546\njfif\t483547\n盖蒂\t483548\n11月20日\t483549\n举荐\t483550\n20150408\t483551\nrrl\t483552\n农牧场\t483553\n孔老二\t483554\n奇龙\t483555\n晋中市\t483556\nthick\t483557\n华鑫信托\t483558\n卢瑟\t483559\n理论上\t483560\n51Talk\t483561\n芝麻糖\t483562\n7.29万元\t483563\n新宏泰\t483564\nDRIVE\t483565\n不合格\t483566\n何晨光\t483567\n美乐淘\t483568\n并发度\t483569\n西南侧\t483570\n桡动脉\t483571\n十味\t483572\nAttribute\t483573\n加权收益率\t483574\n单速\t483575\n第72\t483576\n天圆地方\t483577\n瑞峰\t483578\n用具\t483579\n莫尼塔\t483580\nWOT坦克世界\t483581\n尹子\t483582\n百适通\t483583\n抗拉强度\t483584\n巴朗山\t483585\n靠右\t483586\n非法交易\t483587\n欧克利\t483588\n肖俊\t483589\n强暴戏\t483590\n毫不相干\t483591\n绊住\t483592\nGalanz\t483593\ntesol\t483594\nmour\t483595\n赖泽华\t483596\nESXi\t483597\n0778\t483598\n慕残\t483599\n最美和声\t483600\nThreeJS\t483601\n大连市儿童医院\t483602\n仙缘\t483603\n复兴东路\t483604\n手语舞\t483605\n读完\t483606\n勒布朗\t483607\nmaxmemory\t483608\n魔兽世界抗魔联军\t483609\n黄芪\t483610\n综合单价分析表\t483611\n剥离术\t483612\n添加源\t483613\nExcel2003\t483614\n规则性\t483615\nGenie\t483616\n米法\t483617\n邑\t483618\nAirlines\t483619\n座圈\t483620\n往事\t483621\nZS\t483622\nSTONE\t483623\n弹簧刚度\t483624\n剑胆琴心\t483625\n守护猫娘绯鞠\t483626\n白薯\t483627\n林诺\t483628\n公用电话亭\t483629\n救世\t483630\n过失致人死亡罪\t483631\n劳动人民\t483632\n郎玉\t483633\n佳能MP236\t483634\nG位\t483635\n居住证暂行条例\t483636\n五圈\t483637\n成都航空职业技术学院\t483638\n人工肝\t483639\n外汇分析师\t483640\n赖斯\t483641\n战旗TV\t483642\nkvs\t483643\n荣耀X1\t483644\n减盐\t483645\n江西教育出版社\t483646\n弹丸论破V3\t483647\n职能类\t483648\n用旧\t483649\n極\t483650\n下同\t483651\n科学技术局\t483652\nRockstar\t483653\nTheresa\t483654\n定货\t483655\nmacbookpr\t483656\n权属\t483657\n袁珊珊\t483658\n津南咸水沽\t483659\n盈科\t483660\n8.4分\t483661\n红色素\t483662\n少有人\t483663\n寒叶\t483664\n知心人\t483665\nDivX\t483666\n西安旅游问答\t483667\nliunian\t483668\n赵可\t483669\n揉一揉\t483670\n琉球群岛\t483671\n可儿娃娃\t483672\n手电钻\t483673\n重庆中旅\t483674\nCharleston\
t483675\n96本\t483676\n感言\t483677\n新市街\t483678\ngkh\t483679\nfraps\t483680\n高体\t483681\n长颈漏斗\t483682\nHut\t483683\n顺丰国际快递\t483684\n蝶变\t483685\nTilbury\t483686\n建材商\t483687\n十房网\t483688\n地科院\t483689\ndisposed\t483690\n10碗\t483691\n逃犯\t483692\n一室一厅\t483693\nExynos\t483694\n极迅加速器\t483695\n1.9.32\t483696\n雄师\t483697\n许云\t483698\n私人卡\t483699\n兰塔\t483700\n双瓶\t483701\n华山珑城\t483702\nFIT\t483703\n很\t483704\n穿梭者\t483705\nStir\t483706\n1.5.9\t483707\n12.27\t483708\ntherapeutic\t483709\n什么叫作\t483710\ncancl\t483711\n蓝梦岛\t483712\n余芳\t483713\n象共舞\t483714\n桩板\t483715\nC级\t483716\n步子\t483717\ncctv12\t483718\n鼓包\t483719\n去过\t483720\n首女\t483721\n20遍\t483722\nCONSOLE\t483723\n昌盛日电\t483724\nAffairs\t483725\n恩德\t483726\n存成\t483727\n注册公司\t483728\n快鸟\t483729\n濮阳职业技术学院\t483730\n总配\t483731\n吸门\t483732\n嵊山岛\t483733\n气动离合器\t483734\n反差萌\t483735\n大子\t483736\n非遗神酒\t483737\n浙石化\t483738\nMP4/mp3\t483739\nWinAPI\t483740\n江苏科技大学\t483741\nY50\t483742\n朱雀森林公园\t483743\n蓝芩\t483744\n赠阅\t483745\nbpa\t483746\n鸿扬\t483747\n小切\t483748\nexcle2007\t483749\n区档案局\t483750\nOTT\t483751\n建宁县\t483752\n十九大学\t483753\n莲花洞\t483754\n御膳\t483755\n剔牙\t483756\nxl\t483757\n慢严舒柠\t483758\n屯溪区\t483759\nCCTV12\t483760\n商业贷款利率\t483761\nfno\t483762\n陈亮\t483763\n梦幻5\t483764\n34万\t483765\n上海自考网\t483766\n张学军\t483767\nPS6\t483768\n西北工大\t483769\nocaml\t483770\n唐杰忠\t483771\n南京新城\t483772\n头孢克洛胶囊\t483773\n岗位责任制\t483774\n好家伙\t483775\nSandra\t483776\n娄烦\t483777\n摊主\t483778\n崩裂\t483779\n大承气汤\t483780\n尼古拉·特斯拉\t483781\n船柱\t483782\nSel\t483783\n助长素\t483784\n国家机密\t483785\nKitKat\t483786\n西头\t483787\n羅\t483788\n金职院\t483789\njframe\t483790\n苏拉玛魔网\t483791\nJimmy\t483792\n竞价员\t483793\n五联疫苗\t483794\n石鼓路\t483795\npolya\t483796\n无锡滨湖书香世家酒店\t483797\nRunningman\t483798\n世贸通\t483799\n两针\t483800\nEBI\t483801\n鬼哭\t483802\nM4A1\t483803\n狐族\t483804\n中国人工智能学会\t483805\n文档赚钱网\t483806\n单向度\t483807\n同心家园\t483808\n春诗\t483809\n床技\t483810\n百分之百\t483811\nG400\t483812\n3steam\t483813\n观法\t483814\n大风吹\t483815\n64层\t483816\nFinley\t483817\n50R17\t483818\n欧蓝德论坛_欧蓝德\t483819\n传
奇人生\t483820\n了得\t483821\n云人社\t483822\n曹舟南\t483823\n95折\t483824\n爱情版\t483825\n上影节\t483826\n仙葫\t483827\n列宾美院\t483828\n完美漂移\t483829\n佛山职业技术学院\t483830\n李龙大\t483831\n迈阿密大学\t483832\nunplugged\t483833\n妖计\t483834\n蝎子乐队\t483835\n泽普县\t483836\n40场\t483837\nfebruary\t483838\n魔国志\t483839\nlnmp\t483840\n牛膝\t483841\n国家网信办\t483842\n隆回政府网\t483843\n车载导航仪\t483844\n尴尬癌\t483845\n国家药监局\t483846\nTarga\t483847\n东盟经济技术开发区\t483848\n代孕\t483849\n香川\t483850\n心影\t483851\n实验区\t483852\n排法\t483853\n3000美元\t483854\ncrb\t483855\nlmd\t483856\n高山\t483857\nTrade\t483858\ntomcat9\t483859\n武装突袭3\t483860\n张婉儿\t483861\n鲁酒\t483862\n刘大美\t483863\n平摊\t483864\n促生\t483865\n曹子\t483866\n区编办\t483867\n2017年4月26日\t483868\n牧师\t483869\n锁单\t483870\n手把\t483871\nItaliano\t483872\n5阶\t483873\n特朗普大厦\t483874\n小說章節閱讀_扒書網\t483875\nXCode7\t483876\n0.92\t483877\n聆志\t483878\n4月6\t483879\nWindows8_Windows系\t483880\n上海市青少年活动中心\t483881\ntanya\t483882\n马哥linux\t483883\n女污\t483884\n中药材\t483885\nCores\t483886\n2018-02-02\t483887\n美宜佳\t483888\n30p\t483889\n死亡细胞吧\t483890\nOWA\t483891\n唐山市\t483892\n欧楷\t483893\n狗日\t483894\n电化\t483895\n陶一郎\t483896\n扎克伯\t483897\n姜山镇\t483898\n2013年11月\t483899\nXampp\t483900\n诉讼期\t483901\n美伊\t483902\n江中药业\t483903\n7所\t483904\nSiC\t483905\n科普版\t483906\n兰德酷路泽5700\t483907\n眼高手低\t483908\n盈江\t483909\n龍柒\t483910\n三皇冠\t483911\n尼康D800\t483912\nStrengths\t483913\n小客\t483914\n西洲曲\t483915\n89个\t483916\n叶奈法\t483917\n北京大学第六医院\t483918\n148条\t483919\n长睫毛\t483920\n重庆西站\t483921\nArtiosCAD\t483922\nballerina\t483923\n贰元\t483924\n沼气\t483925\nBologna\t483926\n单细胞\t483927\n注墨\t483928\n和庄\t483929\n北京队\t483930\n蜀门手游\t483931\n实弹射击军事演习\t483932\n灵格\t483933\n吴江中学\t483934\n检审\t483935\n马峦山郊野公园\t483936\nchinta\t483937\n天启通宝\t483938\n华图在线\t483939\n写句话\t483940\n努比亚z11mini\t483941\n幼儿教育心理学\t483942\nUBNT\t483943\n马大\t483944\n县旅游局\t483945\n初灵\t483946\n百折不挠\t483947\ninternational\t483948\n飞度海马s5\t483949\n潮热\t483950\n9时\t483951\n太古龙象诀\t483952\n西夏文\t483953\n张九龄\t483954\n多晶硅\t483955\n大唐狐妖传\t483956\nDisorder\t483957\n4.1.4\t483958\n手写屏\t483959\nfric
a\t483960\nhive表\t483961\n中兴银行\t483962\n瑞丽网\t483963\n大吴哥娱乐网\t483964\n黎瑞恩\t483965\n气代\t483966\n念槐\t483967\nSLR\t483968\n欲孽迷宫\t483969\n丈母\t483970\n五毒教\t483971\n神秘失踪\t483972\nguideline\t483973\n刘东\t483974\n南京禄口机场\t483975\n谢作如\t483976\n李惠堂\t483977\n35L\t483978\n2.2米\t483979\n海南马自达\t483980\n下月\t483981\n中国电机工程学会\t483982\n宁波东部\t483983\n收件宝\t483984\n300676\t483985\n门片\t483986\n上堂\t483987\n茂伸\t483988\n东方明珠塔\t483989\n实质性\t483990\nMSChart\t483991\n无关紧要\t483992\n阿比斯\t483993\n5016\t483994\napply\t483995\n苯甲酸\t483996\n自然哲学\t483997\nS形\t483998\nVille\t483999\n贲卦\t484000\n藏汉\t484001\n取血\t484002\n清儿\t484003\n坳\t484004\n不雅照\t484005\n七绝诗\t484006\nasynctask\t484007\n泵阀商务网\t484008\n汇金\t484009\nover\t484010\n百家争鸣\t484011\n许一世\t484012\n特别\t484013\n十平方\t484014\n金鼎路\t484015\n事主\t484016\n3500个\t484017\n索菲娜\t484018\nSCO\t484019\n28亿\t484020\n装好\t484021\n仿真论坛|MATLAB函数百科\t484022\n工作用\t484023\n九江市政府\t484024\n狐美人\t484025\nhpv58\t484026\n孤老\t484027\n纵贯线\t484028\n福建省教育考试院\t484029\n蝴蝶面\t484030\n第二十二回\t484031\n光之教堂\t484032\n烹茶\t484033\n导医网\t484034\n四金\t484035\n红梅赞\t484036\n仙女山\t484037\n居高不下\t484038\n产地检疫\t484039\n共工怒触不周山\t484040\n罗德岛战记\t484041\n肩负\t484042\nSecondary\t484043\n轮火\t484044\n高干\t484045\n毕业论文\t484046\n珍人\t484047\n热血三国2\t484048\n青岛地铁2号线\t484049\n魔百\t484050\n几分米\t484051\napplepencil\t484052\n窥阴癖\t484053\n城市长\t484054\n制定者\t484055\nloader\t484056\n环幕\t484057\n中华医学会放射学分会\t484058\nframelayout\t484059\nOffice学院\t484060\n人信\t484061\nLEDs\t484062\n拉斯韦尔\t484063\n功能表\t484064\n適合\t484065\n心情好\t484066\n推进\t484067\n周小董\t484068\n自荐材料\t484069\nRC390\t484070\nwebix\t484071\n康乐\t484072\n内江市中区\t484073\n周安信\t484074\n数据观\t484075\n方济各\t484076\n那一条\t484077\n50\t484078\nRNB\t484079\nS04\t484080\n1.1类\t484081\n枣花蜜\t484082\n新浪彩通\t484083\n寻源\t484084\n张拥军\t484085\n制度\t484086\n坑位\t484087\n谋反\t484088\n湖州市住房和城乡建设局\t484089\nCBox\t484090\nRTB\t484091\nS2700\t484092\n赖氏\t484093\nConfidence\t484094\n原来如此\t484095\n传染\t484096\n烈马\t484097\n导电膜\t484098\n清风扬帆网\t484099\n青苗\t484100\nchopper\t484101\n领导组\t484102\n青年一代\t484103\n同步锁\
t484104\nmicrostructure\t484105\n打落\t484106\n为了谁\t484107\nlyf\t484108\n洪都航空\t484109\npvz\t484110\n八海\t484111\n爱无忧\t484112\n白蔷薇\t484113\nTS16949认证\t484114\n卓克\t484115\n夜情\t484116\n韩宇\t484117\n立体\t484118\n阿修罗王\t484119\n张石\t484120\n湖北科技学院\t484121\nRTV\t484122\n合约机\t484123\nconjugate\t484124\n哔哔\t484125\n电脑人\t484126\n模拟人生3\t484127\n霓虹\t484128\nV2016\t484129\n绕杆\t484130\n我的城市\t484131\n那么好\t484132\n同安区\t484133\n广州市海珠区人民法院\t484134\niPadPro\t484135\nV4.0\t484136\n仿古砖\t484137\n苏州大学医学部\t484138\n第一二部\t484139\n天地华宇\t484140\n无间\t484141\n革兰阳性菌\t484142\nNFC功能\t484143\n人人商贸网\t484144\n烂赌英雄\t484145\nIndexedDB\t484146\nAutocomplete\t484147\n长城润滑油\t484148\n炉石传说女巫森林冒险模式\t484149\nsummers\t484150\n环湾\t484151\nContigo\t484152\n谦恭\t484153\n_一品威客网\t484154\n书面语\t484155\n市场营销专业\t484156\n2十大\t484157\n新场\t484158\n青岛搬家公司\t484159\n陆良论坛\t484160\n双峰网\t484161\n同学生\t484162\n泥鳅苗\t484163\n页边距\t484164\nPorcelain\t484165\n坏小子\t484166\n金钱龟\t484167\n锈湖天堂岛\t484168\n芭妮兰\t484169\n詹森·阿克斯\t484170\n硅碳\t484171\n光质\t484172\n京藏\t484173\n多伦多大学\t484174\n大家们\t484175\n铺筑\t484176\n电子皮带秤\t484177\n蒸压釜\t484178\npoetry\t484179\n王哈桑\t484180\n硬模\t484181\n宣德\t484182\n李英爱\t484183\n长春南关区\t484184\n白电\t484185\n约瑟夫\t484186\n出运\t484187\n电脑迷\t484188\n本田艾力绅\t484189\nSummernote\t484190\n耳子\t484191\n九下\t484192\n襄城县\t484193\n生火\t484194\n行政命令\t484195\n浮尸\t484196\n乐嗨嗨\t484197\n脱土\t484198\n自来卷\t484199\n威久留学\t484200\n亡人\t484201\n天涯区\t484202\n劳保\t484203\n消防服\t484204\n健儿们\t484205\n塔莉萨\t484206\n迎广\t484207\n仆射\t484208\n山药\t484209\n庚子赔款\t484210\n财务会计制度\t484211\n戎马丹心:汉匈决战\t484212\n沈阳西站\t484213\n鲁花集团\t484214\n母赞\t484215\n巡山\t484216\n新高尔夫\t484217\n用上\t484218\nSalaries\t484219\n智秀\t484220\nshsh2\t484221\n电子化学品\t484222\n狼犬\t484223\n震荡波\t484224\ntouch5\t484225\nscoll\t484226\n姬发\t484227\n车站北路\t484228\nCytus2\t484229\nMLP\t484230\n山东省政协\t484231\n江北人\t484232\n西斯\t484233\n进言\t484234\n石棉瓦\t484235\nHTTP接口\t484236\n特级初榨橄榄油\t484237\n20160218\t484238\n平阴县人民政府\t484239\n建价\t484240\n三菱电机\t484241\n环首刀\t484242\n超感猎杀\t484243\n你的爱情\t484244\n顺义在线\t484245\n内人\t484246\n中
山大学学报\t484247\nKC认证\t484248\n分队\t484249\n王杰希\t484250\n微相\t484251\n遵义医药高等专科学校\t484252\nimagej\t484253\n流通股东\t484254\n發\t484255\n黄柏河\t484256\nhoi\t484257\n华律\t484258\n中国教会\t484259\n16k\t484260\n中恒\t484261\n连理工\t484262\n郎世宁\t484263\nsynopsis\t484264\nconventional\t484265\nppt-学路网\t484266\n合肥公交集团\t484267\nembed\t484268\n离岸人民币\t484269\n整理剂\t484270\n成都春熙路\t484271\n2018年03月21日\t484272\n陈凤\t484273\n一千条\t484274\n秋樱\t484275\n蠕动\t484276\n二人世界\t484277\n兖州\t484278\n武汉一中\t484279\n催奶\t484280\n海南房产网\t484281\n武胜县人民政府\t484282\n反曲弓\t484283\nsnap\t484284\n瘦高\t484285\n34亿\t484286\n月半\t484287\n成名作\t484288\n高地战\t484289\n陆贞\t484290\nreplugin\t484291\n菊豆\t484292\n卡培他滨片\t484293\n胃王\t484294\n龙哥\t484295\n山风\t484296\n南迦巴瓦峰\t484297\n女大学\t484298\nBlustone\t484299\n马小岚\t484300\n宿迁经济技术开发区\t484301\nv0.1\t484302\n中班健康\t484303\n正解\t484304\n过敏原检测\t484305\n_美\t484306\n独奏曲\t484307\n圈圈网\t484308\nANTEC\t484309\n20150107\t484310\n豆豆色\t484311\n3dmax2012\t484312\n平山道\t484313\n登州路\t484314\n原唱者\t484315\n未读\t484316\n七牛云存储\t484317\n路程\t484318\najaxSubmit\t484319\nSpace\t484320\n5艘\t484321\n一贯\t484322\n扩能\t484323\n遮挡\t484324\neverybody\t484325\n通风与空调工程施工质量验收规范\t484326\n音刷\t484327\n20180322\t484328\n袁杰\t484329\n严蕊\t484330\n求普及\t484331\n社交恐惧症\t484332\n悬铃木\t484333\n瑞明\t484334\nLOWA\t484335\n玉米糁\t484336\n华为Mate2\t484337\n枠\t484338\n吧台员\t484339\nAKIRA\t484340\nWLS\t484341\n八千\t484342\n荆州市沙市区人民政府\t484343\n1210\t484344\n机械传动\t484345\n阡\t484346\n明孝宗\t484347\n误机\t484348\n桐柏路\t484349\n却话\t484350\n数字家\t484351\n黑海舰队\t484352\n分词器\t484353\n证券时报网\t484354\n血脉\t484355\n防倾杆\t484356\n淫窝\t484357\n闭上\t484358\nLGPL\t484359\n点板\t484360\n大墩峡\t484361\n11500元\t484362\n宝园\t484363\n葡萄节\t484364\n全数\t484365\n部务\t484366\n难找\t484367\n4月7\t484368\n22栋\t484369\n学术部\t484370\n油路\t484371\n长兴新闻网\t484372\n李义\t484373\n成都石室中学\t484374\nGraphX\t484375\n嗡嗡嗡\t484376\n紧缩感\t484377\n透射率\t484378\n防空洞\t484379\n黄玉峰\t484380\n专栏作家\t484381\n漕河泾开发区\t484382\n黄铜矿\t484383\nMacType\t484384\n世纪星\t484385\n女子们\t484386\n梁孟松\t484387\n降龙木\t484388\n史海钩\t484389\n帆软\t484390\n丁楠\t4
84391\n咋舌\t484392\n燕云\t484393\n寒舍\t484394\n全球品牌畜牧网www.ppxmw.com\t484395\n蹦跶\t484396\n二辩\t484397\n凤凰城\t484398\n表单值\t484399\n血亏\t484400\n小时光\t484401\n李大庆\t484402\n英国约克大学\t484403\n丹阳市人民政府\t484404\nmaomiav\t484405\n张大林\t484406\nctypes\t484407\n水阻\t484408\n羽田爱\t484409\nsmoothing\t484410\nMelt\t484411\n致臻\t484412\n阿拉伯胶\t484413\n甜甜的\t484414\n周振甫\t484415\nopp袋\t484416\n对象存储\t484417\n势均力敌\t484418\n盗墓之王\t484419\n速龙II\t484420\n经曲\t484421\n避雨\t484422\ncybergadget\t484423\n辰月\t484424\n余艺\t484425\n火红\t484426\n为什么不做\t484427\nM型\t484428\n消化管\t484429\n大叠发票\t484430\n6磅\t484431\nuiwindow\t484432\n珍珍\t484433\nR410A\t484434\nembassy\t484435\n五笔字型\t484436\nfolder\t484437\n吐露\t484438\n截面惯性矩_\t484439\npathofexil\t484440\n清水砖\t484441\n斑马梦龙\t484442\n林高远\t484443\napid\t484444\n一百遍\t484445\n5000瓦\t484446\n啮齿类\t484447\n9.3.1\t484448\n轴杆\t484449\n死亡人\t484450\n花生皮\t484451\nc3p0连接池\t484452\n无恨星晨\t484453\nLs\t484454\n人民教育出版社\t484455\n出水莲\t484456\n24\t484457\nCDSoSo\t484458\n余热锅炉\t484459\ne生保\t484460\n折成\t484461\n油溪镇\t484462\n64家\t484463\n挂钩式\t484464\nidk\t484465\n3智叟\t484466\nG1\t484467\n朔城\t484468\n一小\t484469\n十卷\t484470\n够味\t484471\n云宫迅音\t484472\n近50年\t484473\n水管工\t484474\n母体\t484475\n非牛顿\t484476\n950m\t484477\n胡俊\t484478\n路航\t484479\n急疹\t484480\n新疆人民广播电台\t484481\n幸福感\t484482\n黄彪\t484483\n魏建国\t484484\n方星海\t484485\n2.14.1\t484486\n通明\t484487\n2208\t484488\n机费率\t484489\n脑白金\t484490\n174\t484491\nSdorica\t484492\n朱丹\t484493\n梁祝化蝶\t484494\n毛茶\t484495\nEar\t484496\n鸡米花\t484497\n万马\t484498\n锦标赛\t484499\n进销存系统\t484500\n连杆式\t484501\n合川区\t484502\n茂华\t484503\n400亿美元\t484504\nAuburn\t484505\n天津市第五中心医院\t484506\n8.24\t484507\n封丘\t484508\n20多家\t484509\n自为\t484510\n3925\t484511\nTwins\t484512\nWK\t484513\n武汉警官职业学院\t484514\n化鸡\t484515\n期权股\t484516\n魅族Pro\t484517\n无界信息网\t484518\n018\t484519\n光盘\t484520\n2188\t484521\nresid\t484522\n电泳\t484523\n汇仁\t484524\n22g\t484525\n柚子木\t484526\n中国电信\t484527\n侠盗罗宾汉\t484528\nimmediate\t484529\n华为云帮助中心\t484530\nGreetings\t484531\n七星关\t484532\n7斤\t484533\nx3400\t484534\nbread\t484
535\n理疗师证\t484536\n苍月女\t484537\nProbook\t484538\n专员\t484539\n贯口\t484540\n讲书\t484541\n卤鸡蛋\t484542\n2017—2018年度\t484543\n40尺\t484544\n舍头\t484545\n国银租赁\t484546\n甘肃新元素科技成套设备有限责任公司\t484547\nchengshi\t484548\n王德华\t484549\ngfp\t484550\n代文\t484551\n2018年2月25日\t484552\nMundo\t484553\n20150813\t484554\n批量转换\t484555\n凤凰股份\t484556\n沈阳方特欢乐世界\t484557\n数传电台\t484558\nMi2\t484559\n惨绿\t484560\n0.3秒\t484561\n团队长\t484562\nBLA\t484563\n不言不语\t484564\nContinue\t484565\n电信3g\t484566\n防蚊液\t484567\n曼基康\t484568\n修神录\t484569\n天局\t484570\n敌基督\t484571\n县政务服务中心\t484572\n贺新郎\t484573\n口岸\t484574\n中国盐都政府网\t484575\n饥饿的鲨鱼\t484576\nmbedtls\t484577\n浪漫主义者\t484578\n年夜\t484579\n塞曼\t484580\n大视\t484581\nsale\t484582\n安徽省残疾人联合会\t484583\n满天星\t484584\nHelloWorld\t484585\n马宏\t484586\n玩火自焚\t484587\nRegis\t484588\n招\t484589\n好食期\t484590\n蛇形\t484591\n阴伤\t484592\ngms\t484593\nDetails\t484594\n夏县\t484595\n北京经纬恒润科技有限公司\t484596\n微机派位\t484597\n大汶口\t484598\n冬寒菜\t484599\n省人大\t484600\ncroe\t484601\n许三观\t484602\n恒生指数\t484603\n小动\t484604\n可转债\t484605\n第87号\t484606\nhttp接口\t484607\n文艺节目\t484608\n经济基础\t484609\n加压泵\t484610\n东北师范\t484611\n纵深\t484612\nWald\t484613\n附详\t484614\n思行\t484615\n土语\t484616\npck\t484617\n热鸟\t484618\n店小秘\t484619\n图析\t484620\n撸不射\t484621\n信息费\t484622\n鼓胀\t484623\n逆力\t484624\njtl\t484625\nEarthquake\t484626\n家长式\t484627\n软骨\t484628\n梓菱\t484629\n一族\t484630\n死神来了1\t484631\n麻绳\t484632\n三四十万\t484633\ninitializr\t484634\nPhotograph\t484635\n青浦工业园区\t484636\n国华人寿\t484637\n80条\t484638\n03706\t484639\n西善桥\t484640\n简政放权\t484641\n荣休\t484642\n尹力\t484643\nBluesky\t484644\n湛河区\t484645\n云南糖网\t484646\n莫愁\t484647\n一手抓\t484648\n胡乱\t484649\nWebServices\t484650\n化纤厂\t484651\n蟑螂屋\t484652\n小约翰\t484653\n胼胝体\t484654\n校本化\t484655\n共产党员事迹材料\t484656\n阿U全集\t484657\n真纪\t484658\n玛法达\t484659\nonTouch\t484660\n5篇\t484661\n宽带连接\t484662\n非机动车库\t484663\n丁长生\t484664\n八年级语文\t484665\n熊汝霖\t484666\n北京市地方税务局\t484667\n800x800\t484668\nFunding\t484669\n灵音\t484670\n受影\t484671\n南通镇\t484672\n55公里\t484673\n玉佩\t484674\n大冲新城花园\t484675\n字款\t484676\n余光\t48
4677\n十秒\t484678\n逃掉\t484679\n彭麻麻\t484680\n精选机\t484681\nRc\t484682\n舒城\t484683\nsendmessage\t484684\n3P_\t484685\n下客\t484686\n圆明园\t484687\n小牛犊\t484688\n厚机\t484689\n酢\t484690\nzhongshan\t484691\ncolorpop\t484692\n辖乡\t484693\n司马迁发愤写史记\t484694\n金面\t484695\n比如\t484696\n稳健性\t484697\n20170406\t484698\n后首\t484699\n土地增减挂钩\t484700\n6360\t484701\n北方广电\t484702\n林琳\t484703\n屋面板\t484704\n混用\t484705\n鹏华添利宝\t484706\n酷狗繁星\t484707\n爱情类\t484708\n洽洽食品股份有限公司\t484709\nO\t484710\nepel\t484711\n回龙观镇\t484712\n广西乐居\t484713\n搜狗手机输入法\t484714\nCento\t484715\n神学院\t484716\nTAXI\t484717\n牛肉粉\t484718\ndidispace\t484719\n字框\t484720\n延时\t484721\n丽景花园\t484722\n大神\t484723\n精灵鼠小弟\t484724\n房山\t484725\n阜房网\t484726\nZ3\t484727\n上院\t484728\n新津\t484729\n酷乐视\t484730\nPortofino\t484731\n高一生\t484732\n树脂发光字\t484733\n刘承羽\t484734\nAIX命令参考大全\t484735\niis7.5\t484736\nCBTC\t484737\n体重减轻\t484738\n基准期\t484739\nSolidWorks2012\t484740\n五叶\t484741\n同方易教\t484742\n李友\t484743\n1卷\t484744\n北京北京\t484745\n不大不小\t484746\n看不懂\t484747\n靳彩\t484748\n杭州市一医院\t484749\n汽修吧\t484750\n刘晓庆\t484751\n单筒望远镜\t484752\n宠物附魔宝珠\t484753\nDonation\t484754\n透视表\t484755\n中国移动云南公司\t484756\n舒言\t484757\nmegahouse\t484758\n迪士尼乐园\t484759\n麻纺\t484760\n龙井\t484761\n肖全\t484762\n李文忠\t484763\nf489\t484764\nブランドサイト\t484765\n15日内\t484766\n广本缤智\t484767\n沙区\t484768\n万湖会议\t484769\n环境科学\t484770\n平潭发展\t484771\n阿法\t484772\n前角\t484773\n隊\t484774\n农业厅\t484775\n油流\t484776\n电链\t484777\nSDFormatter\t484778\n家源\t484779\n抒情\t484780\n最后一班岗\t484781\n公称直径\t484782\n大水川\t484783\n汉网\t484784\nSIN\t484785\n人力资源管\t484786\n老虞\t484787\n7.2%\t484788\n买不买\t484789\n新贸网\t484790\nRX460\t484791\n三原色\t484792\neditorial\t484793\n信号机\t484794\n比较仪\t484795\n德思勤城市广场\t484796\n高澄\t484797\n00089255\t484798\nMatisse\t484799\n和美系\t484800\nbj40\t484801\n早乙女由依\t484802\nBoosting\t484803\n曾咏欣\t484804\n第5\t484805\n信息科学学院\t484806\n5.6米\t484807\n昨儿\t484808\n觉着\t484809\n隐含\t484810\n萨瑶瑶\t484811\noasis2\t484812\n豚骨\t484813\n广西职业技术学院\t484814\n嚣\t484815\n在所难免\t484816\nC14\t484817\n水环\t484818\n流动舞台车\t484819\n南方时论\t484
820\nRepost\t484821\n丛飞\t484822\n自助\t484823\n摇橹\t484824\n历史系\t484825\nlearner\t484826\n500KV\t484827\n川西\t484828\n金融工具\t484829\n龙光集团\t484830\n国际现货黄金\t484831\n土拍网\t484832\n汇流排\t484833\n理珠\t484834\n2038\t484835\nMetaphor\t484836\n青蛙乐队\t484837\n手势\t484838\n醇\t484839\n油砂\t484840\n5期\t484841\noffered\t484842\n省政协\t484843\n适\t484844\n二百块\t484845\n馥芮白\t484846\n雕刻家\t484847\nA1586\t484848\n白令海峡\t484849\n韩籍\t484850\n奚晓明\t484851\n巴黎博物馆\t484852\n禅堂\t484853\n搪瓷锅\t484854\n慢慢来\t484855\n艺妓回忆录\t484856\n大悦城地产\t484857\npako\t484858\n定轴\t484859\n辅食机\t484860\n左右开弓\t484861\n80度\t484862\n逍遥游\t484863\n四川大学计算机学院\t484864\n江海街道\t484865\nv300\t484866\n往天\t484867\n180318\t484868\n果业\t484869\n琴音\t484870\n东宝区\t484871\ndobot\t484872\n永兴特钢\t484873\nArtwork\t484874\n本什么\t484875\nライン\t484876\n律师资格证\t484877\n郑州市食品药品监督管理局\t484878\nABCDV\t484879\n济宁火车站\t484880\n方向舵\t484881\n狂轰滥炸\t484882\n150集\t484883\n国网电科院\t484884\n斗兽场\t484885\n济宁附属医院\t484886\n阁老继妹\t484887\nEnigma\t484888\n菌物\t484889\n坦克歼击车\t484890\n欧皓辰\t484891\n梦幻群侠传4\t484892\nB2B电子商务\t484893\n丢失\t484894\n大乳山\t484895\n430万\t484896\n体适\t484897\ngitbub\t484898\nAncient\t484899\n210_\t484900\n声画\t484901\n王桂林\t484902\n运载\t484903\n负载测试\t484904\n8260\t484905\n博赞\t484906\nfaloo\t484907\n总股本\t484908\n东江镇\t484909\nIIS应用程序池\t484910\nzhiliang\t484911\n婚然心动\t484912\ndiverse\t484913\n三言\t484914\ngifted\t484915\nkg\t484916\n真不少\t484917\nswd\t484918\n温区\t484919\nbone\t484920\n死灰复燃\t484921\n爬床\t484922\n背影\t484923\n零零七\t484924\n花箱\t484925\n北京康拉德科技有限公司\t484926\n盛鑫\t484927\n低压电流互感器\t484928\n淘宝天猫店铺\t484929\n桃花妆\t484930\n泰山\t484931\nmtbe\t484932\n卫子夫\t484933\n光源\t484934\n代加\t484935\ndevserver\t484936\n4万\t484937\n4.5英寸\t484938\n空顶\t484939\n狗达\t484940\n剑灵捷达\t484941\n钢工\t484942\n2.9G\t484943\n京州\t484944\nGMOD\t484945\nPlague\t484946\nwstring\t484947\n时代周报\t484948\n香文化\t484949\n冈田奈奈\t484950\n扫描枪\t484951\nearn\t484952\n六环路\t484953\n星程\t484954\n不称职\t484955\n和亲\t484956\n单引号双引号\t484957\n喷香\t484958\n江风\t484959\nithink\t484960\npromo\t484961\nMamba\t484962\n210级\t484963\n坍缩\t48496
4\nW-8BEN\t484965\n马甲镇\t484966\n工作简历网\t484967\n泡手\t484968\n法治政府建设实施纲要\t484969\n琴天\t484970\n最旺\t484971\nrude\t484972\nzipalign\t484973\nPPO\t484974\n八嘎\t484975\n舱房\t484976\n500台\t484977\n烛光晚餐\t484978\n燃点网\t484979\n公校\t484980\n岩爆\t484981\n心子\t484982\n2016a\t484983\n小话\t484984\n永不磨灭\t484985\n天下一统\t484986\n5.0.5\t484987\n奇境\t484988\nMAC地址\t484989\nwebp\t484990\n深城\t484991\n房装\t484992\n十九个月\t484993\n842\t484994\n11.09\t484995\n基本原则\t484996\n鸡蛋清\t484997\n二十块\t484998\n孙敬修\t484999\nopaque\t485000\n熟\t485001\nrequired\t485002\nmysql-server\t485003\n东软集团\t485004\n甲状腺囊肿\t485005\n雷泽\t485006\n454\t485007\n护肤类\t485008\n香醇\t485009\n斯迈尔\t485010\n口决\t485011\n作文字\t485012\n静子\t485013\nE木业网\t485014\nTwinStudio\t485015\n专用贴\t485016\n鄞州万达广场\t485017\n彭伟\t485018\n何傲儿\t485019\n秦ev\t485020\n滨江学院\t485021\nhebad\t485022\n富文本编辑器\t485023\n祖尔格拉布\t485024\n拂袖\t485025\nrafa\t485026\n缅甸花梨\t485027\n五夜\t485028\n新华文摘\t485029\n傅声\t485030\n富甲天下5\t485031\n虹悦城\t485032\n灯源\t485033\n红爪\t485034\n2017-04-18\t485035\n毫克\t485036\n汉化\t485037\n号单\t485038\n拖米\t485039\n聚丙烯纤维\t485040\n热轧\t485041\n越野之王\t485042\n30则\t485043\n我等你\t485044\n世强\t485045\n原片\t485046\n插损\t485047\n抽拉式\t485048\n品牌展\t485049\n打烊\t485050\n整合性\t485051\nvictims\t485052\n上海城投水务(集团)有限公司\t485053\n云中城\t485054\nJDK7\t485055\nC语言网\t485056\n酷派8712\t485057\nodbc\t485058\nPSU\t485059\n八门神器\t485060\n玛咖片\t485061\n深圳购物中心\t485062\n营养师证\t485063\n找房子\t485064\n段于鄢\t485065\n悍妻\t485066\n盾形\t485067\ninvestors\t485068\n峻岭\t485069\n克尔苏加德\t485070\n116亿\t485071\n红舞姬\t485072\n预防职务犯罪\t485073\nwenming\t485074\n娇淫青春之放纵_\t485075\n夺宝奇兵2\t485076\n莱比锡大学\t485077\n柴油箱\t485078\nOXO\t485079\n南国书香节\t485080\n次要\t485081\n2017年以后\t485082\n百万亩\t485083\n九江路\t485084\n乐音\t485085\n故事网\t485086\nkdtree\t485087\n搓澡\t485088\n好闻\t485089\nKarton\t485090\n报警阀\t485091\n集中值\t485092\n钛合金门\t485093\nperception\t485094\n戚薇\t485095\n獾\t485096\n剑纯吧\t485097\n风华高科\t485098\n擘\t485099\n化民\t485100\n国际港\t485101\n肥瑞\t485102\n悦心\t485103\n三亚亚特兰蒂斯酒店\t485104\n中信国航\t485105\n模拟人\t485106\nmengfanrong\t485107\n撩起\t485108\n
CCNP\t485109\n仰止\t485110\n卯酉\t485111\n塘角鱼\t485112\n5Eplay易玩网\t485113\n大概\t485114\n标书费\t485115\n积怨\t485116\nGParted\t485117\n|子\t485118\n极大\t485119\n华庄\t485120\n陈藏\t485121\n制做\t485122\n小飞燕\t485123\n天泽信息\t485124\n15块\t485125\ntsg\t485126\n陶渊神武\t485127\n避孕套\t485128\n女纸\t485129\n紫云英\t485130\n乍\t485131\n393号\t485132\n湖北省工商行政管理局\t485133\n淘宝美工论坛\t485134\nxiong\t485135\nVol.01\t485136\n公象\t485137\n新客站\t485138\n7102\t485139\n张志鹏\t485140\n步惊云\t485141\n吴大\t485142\n郑州晚报\t485143\n小脑萎缩\t485144\n复旦大学附属华山医院\t485145\n瑞杰\t485146\nSubsidiary\t485147\n蓝洁冒险岛\t485148\nmould\t485149\n肝衰竭\t485150\nNVME\t485151\n奇华顿\t485152\nUMAT\t485153\n异域\t485154\n国柱\t485155\n拿枪\t485156\n白洞\t485157\n普拉蒂尼\t485158\n静力\t485159\n五百亿\t485160\nmasquerade\t485161\n币源\t485162\n推手\t485163\n唐旭\t485164\n连串\t485165\nDali\t485166\n男女之间\t485167\n4月中旬\t485168\n近15年\t485169\n身字\t485170\n10000小时\t485171\n1600\t485172\n家型\t485173\n酒店业\t485174\n腾讯大秦网\t485175\n荧惑\t485176\n2017年12月11日\t485177\n袁中道\t485178\nTIFFANY\t485179\n拉雅\t485180\nИ\t485181\n惠美\t485182\n逆水寒OL\t485183\n供暖期\t485184\n阿拉伯半岛\t485185\n晶振\t485186\n神诀\t485187\n汪建国\t485188\n菠萝油\t485189\n老倪\t485190\n450w\t485191\n会计学\t485192\n道格\t485193\n更富有\t485194\njoy696163\t485195\nISMS\t485196\n五里河\t485197\n炕板\t485198\n土木猴\t485199\n狂三\t485200\n新人杯\t485201\n丹阳翼网\t485202\n乳味\t485203\n小狮子爱尔莎\t485204\n游女\t485205\nShading\t485206\n蛋黄酱\t485207\n密铺\t485208\n丁奉\t485209\n劫主\t485210\niden\t485211\n金才\t485212\n白云社区\t485213\n菠萝格木\t485214\n阳光网\t485215\n烧结\t485216\n仗剑量天\t485217\n京客隆\t485218\n喜剧班的春天\t485219\n赣州中学\t485220\n石进\t485221\nPyMongo\t485222\nX特遣队\t485223\n南苑小区\t485224\n天法道\t485225\nSP9\t485226\n穆铁汤珈铖\t485227\nWeber\t485228\n吉讯\t485229\n胸锁\t485230\n任天堂\t485231\nannals\t485232\nThinkGem\t485233\n花桥村\t485234\n液晶电\t485235\n拉泰\t485236\nHRO\t485237\n绢\t485238\n圭塘\t485239\n脚垫\t485240\n划价\t485241\n管道井\t485242\n20排名\t485243\nHare\t485244\n第(2\t485245\n舞林大会\t485246\n股金\t485247\n金溪镇\t485248\n三井街道\t485249\n306sh\t485250\n疾词\t485251\n潍坊新闻网\t485252\n8099\t485253\nisb\t485254\n肚脐眼\t485255\n笑果\t48
5256\n一槌\t485257\n大和藻虾\t485258\n发蓝\t485259\n6800\t485260\nMakerLab创客\t485261\n举\t485262\n文语\t485263\n数字电视节目\t485264\n中国南方航空集团有限公司\t485265\n刘鸣\t485266\nTransporter\t485267\n漏感\t485268\n乾乾\t485269\n沟域\t485270\nLELO\t485271\n眼底\t485272\n玉关\t485273\n一党\t485274\n佛州\t485275\n风陵\t485276\n成都地铁13号线\t485277\n求职版\t485278\ninst\t485279\nimmersion\t485280\n100uf\t485281\n虐乳\t485282\nNDT\t485283\n严不\t485284\n外\t485285\n_书旗小说下载安卓版阅读器\t485286\n色金\t485287\n2an\t485288\n双拨\t485289\n深规院\t485290\n注册表\t485291\n南美区\t485292\n美的国宾府\t485293\n应晖\t485294\n胡姬花\t485295\n污水池\t485296\n318号\t485297\n不惜一切\t485298\n震悚\t485299\n是人非\t485300\n刘小锋\t485301\n我的小制作\t485302\n厌氧菌\t485303\n变名\t485304\n靖城镇\t485305\n青海油田\t485306\n柱梁\t485307\n市场研究公司\t485308\n省博\t485309\n栋\t485310\npse\t485311\n秀园\t485312\nliunix\t485313\n河流水\t485314\nsweden\t485315\n好莱坞\t485316\n第102集\t485317\n玻化\t485318\n骗\t485319\nJIS\t485320\n寒锐钴业\t485321\n3570\t485322\nportuguese\t485323\n皮箱\t485324\n二三\t485325\n液限\t485326\n喇荣五明佛学院\t485327\n初恋这件小事\t485328\n丝绣\t485329\n杨氏模量\t485330\n99贷\t485331\n水野朝陽\t485332\n西湖醋鱼\t485333\nnest\t485334\n牛筋\t485335\n新华大厦\t485336\n小寨村\t485337\n环境温度\t485338\n米立方\t485339\n磨菇\t485340\n刘淼\t485341\n天注定\t485342\n50204\t485343\n上海联通\t485344\n清音\t485345\nM226dn\t485346\n烟台社\t485347\n中国电池网\t485348\n中华人民共和国中央军事委员会\t485349\n事实真相\t485350\n05:00\t485351\ngz压缩包\t485352\n大博\t485353\n纸鸢\t485354\n红旗h7\t485355\n大竹\t485356\n硬道理\t485357\n柚子社\t485358\nSourceTree\t485359\n汤涛\t485360\n楼子\t485361\n拉格朗日乘子\t485362\n涨痛\t485363\n待建\t485364\n含片\t485365\n7日内\t485366\n父子项\t485367\n2.4.13\t485368\n割舍\t485369\n吸收峰\t485370\npatron\t485371\ntheshy\t485372\n杨立新\t485373\n蒙面人\t485374\n武侯祠大街\t485375\n探访\t485376\n小节\t485377\nJavas\t485378\n风景园林学\t485379\n一片片\t485380\n浙江省高级人民法院\t485381\n钟老师\t485382\n方力申\t485383\n水书\t485384\n上海电子\t485385\n信访网\t485386\nApproved\t485387\n史蒂文\t485388\n不堪\t485389\n调试器\t485390\nwww.120ask.com\t485391\n20180114\t485392\ncasa\t485393\n1.68G\t485394\n潍坊市公安局\t485395\n封官\t485396\n欧尚\t485397\n咏诵\t485398\n精精\t485399\nv5.0.1\t48540
0\n手札\t485401\n灯炮\t485402\n测功机\t485403\n无追\t485404\n音协\t485405\n银湖城\t485406\n美国国会\t485407\n考费\t485408\n中央财经领导小组\t485409\n合营\t485410\n清理\t485411\n铝包木门窗\t485412\nffn\t485413\n黄诚\t485414\n吉娃莲\t485415\n4590\t485416\n_皮皮养生网\t485417\nm32\t485418\n皇都\t485419\n性高潮\t485420\n备孕期\t485421\n百度音乐人\t485422\n救险\t485423\n佛山科技学院\t485424\n女神异闻录5\t485425\n握紧\t485426\n降生\t485427\n3801\t485428\n鼻嘴\t485429\n桃林\t485430\n提取机\t485431\n每3天\t485432\n物博会\t485433\npcmark\t485434\n摸奶\t485435\n液晶\t485436\n长江七号\t485437\n李瑛\t485438\n组织病理学\t485439\n萌战无双\t485440\n全掌\t485441\n拖累\t485442\n2am\t485443\nUkey\t485444\nlim\t485445\n咸阳市人民政府\t485446\nQuant\t485447\n120多部\t485448\ntitle\t485449\n轻风\t485450\n喜剧狂\t485451\n蒂姆·波顿\t485452\n106平\t485453\n花生衣\t485454\n332号\t485455\n任侠\t485456\n等温线\t485457\n江里\t485458\nLevel\t485459\n800毫米\t485460\n老一\t485461\n放开\t485462\n常州大学城\t485463\n群团\t485464\nv5fox吧\t485465\n方正飞腾4.1\t485466\n亟须\t485467\nxiezi\t485468\nCSCD\t485469\n邪恶人\t485470\n诱宠\t485471\n锥套\t485472\n关张\t485473\n小面\t485474\n北京海淀医院\t485475\n遴\t485476\n第154章\t485477\n婦\t485478\n孕棒\t485479\nLimited\t485480\n谋学网\t485481\nx86_64-linux\t485482\n党支部组织生活会\t485483\n金庸群侠传5单机版\t485484\n电阻应变片\t485485\n录事\t485486\n国六标准\t485487\nkindness\t485488\n86929669\t485489\n数字货币\t485490\n预弯\t485491\n养业\t485492\n佛山大学\t485493\n9AT\t485494\n押款\t485495\n港上市\t485496\n北京中医医院\t485497\n格栅除污机\t485498\n文山\t485499\n华为mate7吧_\t485500\n上个世纪\t485501\n奶油味\t485502\ncurdate\t485503\n红米4吧\t485504\n白费\t485505\n青菱乡\t485506\n五星茅台酒\t485507\n黑精灵\t485508\n亚像素\t485509\nyangbai\t485510\n徐蓉\t485511\n权威\t485512\n下沙经济开发区\t485513\n两三年\t485514\n5.4.4\t485515\nStruggle\t485516\n基础教育资源网\t485517\n表盒\t485518\n块板\t485519\n瓦罐\t485520\n5万亿\t485521\n做些什么\t485522\n鄂建\t485523\n瑰宝\t485524\ntlds\t485525\n3B\t485526\n规\t485527\n何花\t485528\n千金\t485529\n色情版\t485530\n订阅者\t485531\n电脑硬盘分区\t485532\n公路工程\t485533\n织机\t485534\n1876\t485535\n来看一下\t485536\n腾讯手游助手模拟器\t485537\n中太\t485538\n承建\t485539\n肋板\t485540\n总裁文\t485541\nwml\t485542\n潘基文\t485543\n药神\t485544\nCITE\t485545\ncrosswalk\t485
546\ndeclspec\t485547\n严重度\t485548\n第一百零五章\t485549\n周起\t485550\n市场营销学\t485551\n各处室\t485552\n摩点网\t485553\nchunk\t485554\n余姚市人民政府\t485555\n坪效\t485556\n毛巾卷\t485557\nPRO7\t485558\n减水剂\t485559\n当代国际城\t485560\n荒古\t485561\n梦幻西游手游科举答题器\t485562\n卫办\t485563\nikbc\t485564\n晋江原创网\t485565\n安米娜\t485566\nPS4/PSV\t485567\n游线\t485568\n20余年\t485569\n郭可颂\t485570\n新大头儿子小头爸爸\t485571\n晏子使楚\t485572\n多层板\t485573\n人神\t485574\n重提\t485575\n壁纸\t485576\n张治国\t485577\nparameters\t485578\n码盘\t485579\nWin7百科\t485580\n改\t485581\ntruelicense\t485582\n兆帕\t485583\n27年前\t485584\n红灰\t485585\n运走\t485586\n无臂\t485587\nFighting\t485588\n46岁\t485589\ncaused\t485590\n300.com\t485591\n_巴士倩女幽魂\t485592\n法海\t485593\n镔铁\t485594\n艾未未\t485595\n仓山区人民政府\t485596\n大众创业万众创新活动周\t485597\n爱弥儿\t485598\nUnblock\t485599\nVanguard\t485600\n富足\t485601\n广东代表团\t485602\nMDS\t485603\n蝈蝈俊\t485604\n内分\t485605\n神尔天才\t485606\n劫杀\t485607\n江泰保险\t485608\n出彩中国人\t485609\n22_\t485610\n320G\t485611\n楼龄\t485612\n面件\t485613\njojo奇\t485614\n妖宠\t485615\n粉碎性骨折\t485616\n抗氧\t485617\n方成\t485618\n科技日报\t485619\n评审书\t485620\n50030\t485621\n万物\t485622\n天赐良缘\t485623\n吃果\t485624\n火船\t485625\n嘉兴职业技术学院\t485626\n鳌山卫镇\t485627\n苏狂\t485628\n凉宫春日\t485629\n聚簇\t485630\n18K金\t485631\n心茶\t485632\n赖脸\t485633\nSOAPUI\t485634\n自拍网\t485635\n吉奥\t485636\n贺岁币\t485637\n瓠瓜\t485638\n中盈广场\t485639\n保偏\t485640\npola\t485641\nm10\t485642\n因缘\t485643\n餐盘\t485644\n中铁\t485645\n17年\t485646\n剪报\t485647\n英语专业四级考试\t485648\n鸟纲\t485649\n八字算命\t485650\nTFLearn\t485651\nLiquids\t485652\n五针\t485653\nsqlachemy\t485654\n6单\t485655\n朗润园\t485656\n绿林\t485657\n未注册\t485658\n韵事\t485659\n雾谷\t485660\n十几块\t485661\n鲜花\t485662\n出生\t485663\n卡里克\t485664\n电容柜\t485665\n旅行团\t485666\nCarpenterLee\t485667\n澡堂\t485668\nknife\t485669\n甲米岛\t485670\n魔法门之英雄无敌战争纪元\t485671\n中国兵器集团\t485672\n液压管接头\t485673\n大小眼\t485674\nBibliography\t485675\n陈欣予\t485676\n9000亿\t485677\nPersonality\t485678\n百度推送\t485679\nMAC地址查询\t485680\n1.0.0_\t485681\n上帝之城\t485682\n宫崎摩耶\t485683\n红艳艳\t485684\n眼药水滴\t485685\n韵达快递公司\t485686\n黄金港\t485687\n陈小平\t4
85688\n大众Polo\t485689\n博思软件\t485690\n澳海\t485691\n调整版\t485692\n神佑\t485693\n雷励\t485694\n上海南站\t485695\n刘汉盛\t485696\n得着\t485697\n现代感\t485698\n战锤:全面战争2\t485699\n亚洲\t485700\n85个\t485701\n标准集团\t485702\n101中学\t485703\n伊拉\t485704\nHongqiao\t485705\n马杆\t485706\n底顶\t485707\n一键通\t485708\n二十来岁\t485709\npowerbuilder\t485710\nmiyao\t485711\n吉祥果\t485712\n粉鲍\t485713\ngameObject\t485714\nTerra\t485715\n夏阳\t485716\n滴滤\t485717\nKMO\t485718\n吊洞\t485719\nipad投屏\t485720\nHAND\t485721\nFe3+\t485722\nDecisions\t485723\n延吉市\t485724\nLocally\t485725\nxtragrid\t485726\n楼座\t485727\n引物合成\t485728\n塞勒姆\t485729\nthing\t485730\nHAO\t485731\n39.8\t485732\nshimmer\t485733\n免息\t485734\nSPY\t485735\n金润\t485736\n杭州市\t485737\n22年前\t485738\n壹米滴答\t485739\n420mm\t485740\n04月24日\t485741\n何志\t485742\n124家\t485743\n昕锐武神天下\t485744\nZenFone\t485745\n时速\t485746\n牙刷\t485747\n农民日报社\t485748\n克扣\t485749\n气象局\t485750\n修文\t485751\n7581\t485752\n天将降大任于斯人\t485753\n上海华通\t485754\n开锁器\t485755\n雾眉\t485756\n勇闯天涯@第一会所@精选国产自拍\t485757\n乐山市\t485758\n任天\t485759\n证治\t485760\nJOURNAL\t485761\ncnu\t485762\n锋友们\t485763\nPhat\t485764\n按键声\t485765\n悬挑卸料\t485766\nHofstede\t485767\nlinu\t485768\n养护车\t485769\n鲁抗医药\t485770\n厚重\t485771\n网际\t485772\n热缩套管\t485773\n第59页\t485774\n第18041期\t485775\n血尿\t485776\nDont\t485777\n狗崽\t485778\nGrey\t485779\n气膜\t485780\n心理测量学\t485781\n丹凤眼\t485782\n长生千叶\t485783\n小值\t485784\ndt\t485785\n拼音节\t485786\npyhook\t485787\n武汉大学动力与机械学院\t485788\n易经算命\t485789\n免验证\t485790\n论房\t485791\n中兴商业\t485792\n历史性\t485793\n拉杜丽\t485794\n挟天子以令诸侯\t485795\n宝格丽酒店\t485796\n4720hq\t485797\n哈夫\t485798\nnat3\t485799\n团座\t485800\n苗栗\t485801\n基努里维斯\t485802\n阿昌\t485803\n冷色系\t485804\n2.8c\t485805\n中达\t485806\n琉璃苣\t485807\n郑州婚纱摄影\t485808\n自古英雄出少年\t485809\n存储\t485810\n离子式\t485811\n北京科博会\t485812\nドル\t485813\nMichaels\t485814\n支墩\t485815\n急救药\t485816\n硫酸银\t485817\n一年春\t485818\n西安工业大学\t485819\n5.1.8\t485820\n琼海政府\t485821\nRCE\t485822\n国五标准\t485823\n聚合物水泥基防水涂料\t485824\nF18\t485825\nvonic\t485826\n云南省人大\t485827\nelf\t485828\ninvolving\t485829\n离解\t485
830\n龙苍沟\t485831\n龙之信条黑暗觉醒\t485832\nps2017cc\t485833\nfolx\t485834\nBtrfs\t485835\n包中\t485836\n潮汕高铁站\t485837\n婆家\t485838\nPoison\t485839\n梁元帝\t485840\n物相\t485841\n中国工控网\t485842\n校圈\t485843\n立信会计师事务所\t485844\n凤凰街道\t485845\n1092\t485846\n大丸子\t485847\n鸢一折纸\t485848\n童贞\t485849\nacv\t485850\n卡蒂\t485851\n坂本龙一\t485852\n大行星\t485853\nARENA\t485854\n病毒性\t485855\ndash\t485856\n台海网\t485857\n玩具枪\t485858\n舜华\t485859\n魔方阵\t485860\n布鲁塞尔\t485861\n贝勒德\t485862\n南波杏\t485863\n赏善\t485864\n京北\t485865\n没赶\t485866\n凯迪拉克ATSL\t485867\n黄泥川\t485868\n轻度\t485869\niPhoneSE\t485870\n易科美\t485871\n数线\t485872\n忘拉\t485873\n仙侣\t485874\n北京冬奥组委\t485875\n二手房地产\t485876\n史蒂文森\t485877\n右旗\t485878\n硫化物\t485879\nundefind\t485880\n安康\t485881\nchenyiwei\t485882\n动作库\t485883\nアンナチュラル\t485884\n有奖举报\t485885\nrover\t485886\nrh\t485887\n孙成\t485888\nDEAD\t485889\n谷力\t485890\n农工商超市\t485891\n忍界大战\t485892\n刷卷\t485893\n天下男\t485894\n阿军\t485895\n蓬江\t485896\nDep\t485897\n理综\t485898\na400\t485899\n永生\t485900\n应收账款保理业务\t485901\n厦门便民网\t485902\nmilfs\t485903\n4000转\t485904\n耦联\t485905\n党党\t485906\n伊桑·霍克\t485907\n劳动强度\t485908\n李玫\t485909\n数页\t485910\n豢养\t485911\n1600个\t485912\n卡托\t485913\n370万\t485914\ntruetable\t485915\n联丰\t485916\n瑞儿\t485917\n影音先锋资源站日夜撸影院\t485918\nversatile\t485919\n郑州市中心医院\t485920\n江苏省中西医结合医院\t485921\nOwners\t485922\n本田xrv\t485923\n穿梭框\t485924\n蒋碧薇\t485925\n回拨\t485926\n李大妖\t485927\n龙舌兰高原\t485928\n配线箱\t485929\nqpst\t485930\n地产市场\t485931\n万神殿\t485932\nCortex-A7\t485933\n套接字\t485934\n头狼\t485935\n登腾\t485936\nFairs\t485937\n佳杰\t485938\n养和医院\t485939\nGra\t485940\n勇者斗恶龙11\t485941\n软装\t485942\n海因里希\t485943\n这是为什么\t485944\n天津公安\t485945\n冲击\t485946\n毒蜂\t485947\ncai\t485948\n遥遥\t485949\n邪\t485950\n112号\t485951\n腭\t485952\n上村\t485953\nChar\t485954\n消光系数\t485955\n三毛荷西\t485956\n炼狱蝰蛇\t485957\n金隅股份\t485958\nlsm\t485959\n学练\t485960\n9.00\t485961\nPydev\t485962\nsqljdbc\t485963\n路宝\t485964\n导写\t485965\n饭水\t485966\n松木桩\t485967\n运动控制系统\t485968\n绕颈\t485969\ntargets\t485970\n极品飞车\t485971\n000063\t485972\n祁保义\t485973\nGensim\t485974\
nIGN详评\t485975\n演进\t485976\n北京国家会计学院\t485977\n失恋33天\t485978\n曲池穴\t485979\n联盟币\t485980\n不管怎么说\t485981\nFFSKY\t485982\n王命\t485983\n海信中央空调\t485984\ncaj阅读器\t485985\n14th\t485986\n刘雯雯\t485987\n愉景湾\t485988\nROOT\t485989\nEmporio\t485990\n胎心率\t485991\n弦音\t485992\n排定\t485993\nclass01.com\t485994\n煤压\t485995\n8.9.6\t485996\n注册资产评估师\t485997\n老余\t485998\ntree\t485999\nFlightAware\t486000\n一泡\t486001\n良善\t486002\n第19条\t486003\nkez\t486004\n学习报告\t486005\n同步轮\t486006\n指向标\t486007\n攀达\t486008\n重庆鸡公煲\t486009\n好物\t486010\n三菱汽车\t486011\n景颇族\t486012\n凹凸视频\t486013\n68H5\t486014\n威孚\t486015\n攀爬者\t486016\n45篇\t486017\n凌凌漆\t486018\n第一神\t486019\n下蜀镇\t486020\n360vr\t486021\nG4400\t486022\n酷比\t486023\nPhoneGap\t486024\n管托\t486025\nnicky\t486026\n荻花宫前山\t486027\n因為\t486028\n云母板\t486029\n五年来\t486030\n漫猫动漫BT\t486031\n调性\t486032\n俨然\t486033\n诺基亚8\t486034\n斯堪尼亚\t486035\n罗田县\t486036\n发布会\t486037\n荣耀路由pro\t486038\n焉能\t486039\n朱庆\t486040\n花砖\t486041\n邵东县人民政府\t486042\n微录客\t486043\n报备\t486044\n老炮儿\t486045\nCXF\t486046\nxingjiao\t486047\nCAD制图\t486048\n高挺\t486049\n佳星\t486050\nd2hackmap\t486051\n美国海豹突击队\t486052\n凌派真武\t486053\n刘劲\t486054\nHbuilder\t486055\nspaced\t486056\n七班\t486057\nSEMA\t486058\n魔兽世界_战网\t486059\n第175集\t486060\n厚德\t486061\n汉语言文学\t486062\n契约论\t486063\n2257124948@qq.com.cn\t486064\n畏寒\t486065\n2.0XL\t486066\n扎龙自然保护区\t486067\n河北省商务厅\t486068\n新值\t486069\n嫁祸\t486070\nZ天狱篇\t486071\n力狮\t486072\n击沉\t486073\n张旻琪\t486074\n三国群英传OL\t486075\n格罗特\t486076\n两顶\t486077\n娃衣\t486078\n第一色\t486079\n大写贰\t486080\n肌骨\t486081\nwewill\t486082\n冷兵器\t486083\n小元\t486084\n博牛\t486085\ndige\t486086\nFML\t486087\n2008年以来\t486088\n走运\t486089\n真空贴合机\t486090\n北京租房网\t486091\n抗虫\t486092\n王环宇\t486093\n无弹窗小说网\t486094\n乐乎城市青年社区\t486095\n翻牌\t486096\n过关斩将\t486097\n113个\t486098\n洛可可风格\t486099\n西山湖\t486100\nWiFi共享大师\t486101\ndut\t486102\n斩月-51CTO\t486103\n100片\t486104\n华夏西路\t486105\n定点零售药店\t486106\n佛山供电局\t486107\n配对\t486108\ntouched\t486109\nVOCALOID\t486110\n越级\t486111\n连门\t486112\n箱鼓\t486113\nPexels\t486114\nf1.4\t486115\n多管闲事\t4
86116\n渡尽劫波\t486117\n死角\t486118\nreviewboard\t486119\nVOB\t486120\n三十里\t486121\n吴堡县人民政府\t486122\n生命的故事\t486123\n视频播放器\t486124\n天知地知\t486125\n水铁\t486126\n渝州路\t486127\nbering\t486128\n80题\t486129\n小撒\t486130\n我希望\t486131\n黑夜传说5:血战\t486132\n卖唱\t486133\n2013级\t486134\n白果\t486135\n_图\t486136\n总归\t486137\nbacterial\t486138\n心驰神往\t486139\n天份\t486140\n博凯\t486141\n三位一体2\t486142\n加冕\t486143\n00058\t486144\n噗\t486145\n音区\t486146\n庇\t486147\n蛮妻\t486148\n键帽\t486149\n上海街\t486150\n驾驶镜\t486151\n中东铁路\t486152\n文渊阁\t486153\nlxml\t486154\n自行车队\t486155\n象州县人民政府\t486156\nfeedly\t486157\n戈\t486158\n涉侨\t486159\n一览_游迅网\t486160\n塔身\t486161\n格力地产\t486162\n柴油发电机房\t486163\n向化\t486164\n20171220\t486165\n峨眉山景区\t486166\n轻摇\t486167\n湖女\t486168\n奥尔森\t486169\n酒屋\t486170\n涂塑钢管\t486171\n民间艺术\t486172\n期期\t486173\n好诗好词\t486174\n三辑\t486175\n隐性\t486176\n白青\t486177\n德累斯顿\t486178\n堆金网\t486179\n多一份\t486180\niShares\t486181\nCHOICE\t486182\necon\t486183\nQ10\t486184\nハニ\t486185\n1典\t486186\n包头师范学院\t486187\n盲侠大律师2\t486188\n游园会\t486189\n海勒\t486190\n超白鱼\t486191\n浅井茶茶\t486192\n2088\t486193\n哈多诺克斯\t486194\n东营港经济开发区\t486195\ndrinks\t486196\n1073\t486197\n卡鲁\t486198\n黄水疮\t486199\n贵族化\t486200\n郭化楠\t486201\nDVDRIP\t486202\n清江路\t486203\n成人影\t486204\n散点\t486205\n太子辉\t486206\n11所\t486207\netiquette\t486208\ninxx\t486209\n索菲娅\t486210\n人民检察院\t486211\n保利香槟\t486212\n工作标准\t486213\neplan\t486214\n焦煤\t486215\n点菜\t486216\n泡浴\t486217\n一次二次\t486218\nrtm\t486219\npm981\t486220\n1680元\t486221\nxbox\t486222\n仙城\t486223\n威科夫\t486224\n中控件\t486225\n入乡随俗\t486226\n哎呀\t486227\nsid\t486228\n190斤\t486229\n陈一冰\t486230\n路易斯\t486231\nintraweb\t486232\n放出\t486233\ngls\t486234\n枪战游戏\t486235\n突尼斯\t486236\nFailed\t486237\n11.4%\t486238\n武汉出版社\t486239\n隔震\t486240\n精彩纷呈\t486241\n烤漆板\t486242\n发群\t486243\n道达\t486244\n六米\t486245\n移位\t486246\n余罪\t486247\n91wg.com\t486248\n洪河\t486249\n大秘密\t486250\n南京汽车客运站\t486251\n行医\t486252\n流转税\t486253\neosmsg\t486254\n性感沙滩3\t486255\n第32期\t486256\n击退\t486257\nANA\t486258\n2018年3月22日\t486259\n大医精诚\t486260\n被扣\t486261\n武汉工
程大学\t486262\n知根知底\t486263\n稳定期\t486264\n道德品质\t486265\n巴佬\t486266\n小果酱\t486267\nmiix510\t486268\n名冠\t486269\n周震南\t486270\n中国海洋大学\t486271\n帕斯\t486272\n濠滨车友俱乐部\t486273\n19周年\t486274\n036期\t486275\n见多识广\t486276\n水育\t486277\n爸爸的话\t486278\n第20期\t486279\n三门岛\t486280\n240升\t486281\n互导\t486282\n爆肥\t486283\n任意角\t486284\n维族舞蹈\t486285\nSortable\t486286\n线式\t486287\n圣奥\t486288\n车卡\t486289\n永远的七日之都\t486290\n若梦\t486291\n大戒\t486292\n20180223\t486293\n楼下坦克大战\t486294\n觉悟\t486295\n共产党员\t486296\n索引量\t486297\n选粹\t486298\n140天\t486299\nhht\t486300\n1P\t486301\n牙周炎\t486302\n交谊\t486303\n十六次\t486304\nSNH48\t486305\n曳引机\t486306\n西莱克\t486307\nansi\t486308\n蒸汽换热器\t486309\n轻熟\t486310\n金钢网\t486311\n嘴套\t486312\n劳动合同公司\t486313\n表箱\t486314\n重庆站\t486315\nKD10\t486316\n360影视\t486317\n东山路\t486318\n祭文\t486319\n装扮类\t486320\nPconline\t486321\njgrid\t486322\n骑马与砍杀:战团\t486323\n安红豆\t486324\n400张\t486325\n125i\t486326\nstr列\t486327\n程静\t486328\n许西川\t486329\nShopbop\t486330\n竞选\t486331\nSGMW\t486332\n杨柳湖\t486333\n小公主苏菲亚\t486334\n5年以后\t486335\n今麦郎\t486336\n杭州市物价局\t486337\n1abt\t486338\nskullcandy\t486339\n鄂伦春旗\t486340\n10月8日\t486341\n防腐木花箱\t486342\n右脑\t486343\n天津体育学院\t486344\n18年3月\t486345\n舌战\t486346\n爱回收吧\t486347\n油草\t486348\n山西省\t486349\n第32条\t486350\n债权置换\t486351\n上海精神卫生中心\t486352\n总统府\t486353\n钱军\t486354\n唐诗宋词选读\t486355\n新宏\t486356\n旗滨\t486357\nvcp\t486358\n麦芒4\t486359\ntap4fun\t486360\n无毒版\t486361\n后援团\t486362\nIObit\t486363\n聚山梨酯\t486364\nUniqueColor\t486365\n罐笼\t486366\n可是\t486367\n李勤\t486368\n蠕虫病毒\t486369\n激光尺\t486370\nBRAIN\t486371\nerrorbar\t486372\n介绍文\t486373\n货色\t486374\n风景\t486375\n火暴\t486376\nmasonry\t486377\n天地大冲撞\t486378\n晓华\t486379\n斯里兰卡卢比\t486380\n实施办\t486381\n慕容紫英\t486382\n周越\t486383\n深圳湾口岸\t486384\n拉磨\t486385\n热塑性弹性体\t486386\n红拂女\t486387\n丽娜\t486388\n孔玮\t486389\nxd\t486390\n生命源\t486391\nasmx\t486392\n诡案\t486393\n金戌\t486394\n挽明\t486395\nworship\t486396\n急性喉炎\t486397\n5kg\t486398\n重庆时时彩走势图\t486399\n传统中式\t486400\n四川长虹电器股份有限公司\t486401\nPathway\t486402\n肥罗\t486403\n富力士\t486404\n美瞳网\t486405\n北京
师范大学心理学院\t486406\n小镇长\t486407\n2018年3月3日\t486408\n无处可逃\t486409\n育才街\t486410\n笨人\t486411\n来宝\t486412\n未来10天\t486413\n统计贴\t486414\n怯怯\t486415\n骂娘\t486416\n新绝代双骄3\t486417\n物资类\t486418\n陕西公安厅\t486419\nwaifu\t486420\n基兰\t486421\n辐射新维加斯\t486422\n史家小学\t486423\n张小五\t486424\n复兴中路\t486425\n5.6.22\t486426\n而又\t486427\n1uf\t486428\n莫耶斯\t486429\n新国\t486430\n国家税务总局关于国家税务局\t486431\n72mm\t486432\n屌炸天\t486433\n基本险\t486434\n药酒\t486435\n61篇\t486436\n瞎子\t486437\nfrpp\t486438\nemoji表情包\t486439\n独山镇\t486440\n铸成\t486441\n好的酒店\t486442\n颗数\t486443\n清木\t486444\n900e\t486445\n学人堂\t486446\n古稀\t486447\nIncarnation\t486448\n舞出\t486449\nlinuc\t486450\n关陇集团\t486451\n硬结\t486452\n新安洲\t486453\n溜溜看片网\t486454\n加油站\t486455\n答辩状\t486456\ny450\t486457\n科洛莫瑞兹\t486458\n厦门市公务员局\t486459\n顿汉布什\t486460\n陈灿\t486461\n沃尔夫冈\t486462\nTrueType\t486463\nminergate\t486464\n玉柴\t486465\n陈德荣\t486466\n北方地区\t486467\ncombinations\t486468\n起点文\t486469\n本讲\t486470\n验批\t486471\n中和\t486472\n距离感\t486473\n钟无艳\t486474\n对视\t486475\n苏武\t486476\nL3\t486477\n46亿年\t486478\n下班\t486479\nedius\t486480\n平纪\t486481\n狂喷\t486482\n通识教育\t486483\n企鹅智酷\t486484\nM4A\t486485\n孙静\t486486\n何年\t486487\n9代\t486488\n韩美\t486489\n设施\t486490\n3Com\t486491\n配偶\t486492\n定食\t486493\n马渚\t486494\n鸟哥\t486495\nWI-1000X\t486496\n几千万条\t486497\n天逸C5\t486498\n前3年\t486499\n速愈素\t486500\n杯盖\t486501\n虎扑字幕组\t486502\n黄政民\t486503\n免费试用网\t486504\n吴勇\t486505\n中华民族精神\t486506\nPurpose\t486507\n京东物流\t486508\n真三国无双6:猛将传\t486509\n多样\t486510\n布里茨\t486511\n西北农林科技大学\t486512\nmp2\t486513\n有条不紊\t486514\n无钱\t486515\n360手机f4\t486516\n联想小新_\t486517\n李洁\t486518\n博士后进站\t486519\n中商娱乐\t486520\n北京师范大学附属中学\t486521\n12.26\t486522\n玩弄\t486523\nios1\t486524\nsprites\t486525\n金山词霸2016\t486526\n耳东\t486527\n牦牛奶\t486528\n达州银行\t486529\n淋淋\t486530\n未晚\t486531\n远景能源\t486532\n珍珠湖\t486533\n手工品\t486534\n消费者信心指数\t486535\nTeenie\t486536\n20160308\t486537\n微报\t486538\n希崎ジェシカ\t486539\n睢阳区\t486540\n玉溪师范学院\t486541\n百度旅游\t486542\n眼石\t486543\n光环战争2\t486544\n锐龙5\t486545\n大怒\t486546\n毛家湾\t486547\n中公金融人\t486548\n各位\t48
6549\n孙伯纶\t486550\n思蜀\t486551\n自动电位滴定仪\t486552\n孟婆汤\t486553\n水屯\t486554\n踢掉\t486555\ninsight4.0\t486556\n小儿氨酚黄那敏颗粒\t486557\n中国科学院广州地球化学研究所\t486558\npv\t486559\n冯远\t486560\n口唇\t486561\n多义\t486562\n粘虫板\t486563\n拉西\t486564\n5.6_\t486565\n3435\t486566\n交接处\t486567\n镀膜剂\t486568\n吹干\t486569\nCookbook\t486570\ndalao\t486571\n电磁学\t486572\n西宁玛丽亚医院\t486573\nLangmuir\t486574\nMoisturizing\t486575\nPortland\t486576\n农科\t486577\n乡情\t486578\n工科男\t486579\nLevenshtein\t486580\n8级\t486581\n爷孙\t486582\n马祖\t486583\n强化卷\t486584\n相克_宜搭_搭配_相宜_混搭_百姓健康饮食\t486585\n比特币交易所\t486586\nCCF\t486587\n英雄战姬gold\t486588\n李鹰\t486589\nbuggy\t486590\n柠檬影院\t486591\n气石\t486592\n单本\t486593\n施加\t486594\n技师证\t486595\n延边州政府\t486596\n十四部\t486597\n牙根\t486598\n汹涌\t486599\n黎明重工\t486600\n唐宋八大\t486601\nxom\t486602\n蓝润\t486603\n183cm\t486604\n王鹰豪\t486605\n刘宪华\t486606\n丝路\t486607\n撇嘴\t486608\n病毒病\t486609\n密谋\t486610\n800w\t486611\nfoxmail7\t486612\n神无\t486613\n戊\t486614\n开火\t486615\n百鸟\t486616\nchinatax\t486617\n1.94\t486618\n14世纪\t486619\n菠萝格防腐木\t486620\n拨备率\t486621\n大翼\t486622\n墙改\t486623\n天墉城\t486624\n小脂\t486625\n厦门房产网\t486626\n喔喔\t486627\n一万年以后\t486628\n淘宝皇冠\t486629\n不眠\t486630\n氨甲环酸片\t486631\n康斯特\t486632\n英烈\t486633\n5.5公斤\t486634\n司马徽\t486635\n微单a6000\t486636\n炊事兵\t486637\nroyce\t486638\n萧山网\t486639\n学生们\t486640\nwfm\t486641\n劳务报酬个税\t486642\n猛人\t486643\n博图V14\t486644\n郦优昙\t486645\n通用机械\t486646\n唾液酸四己糖\t486647\n入场式\t486648\n万域盖世仙尊\t486649\n二房一厅\t486650\nresearchgate\t486651\n增值税价\t486652\n乐华娱乐\t486653\n北京恒大\t486654\n神垕\t486655\nPoppy\t486656\n矢量\t486657\n便宜坊\t486658\n2017年5月19日\t486659\n4kw\t486660\n南宁街\t486661\nsumifs\t486662\n高帮鞋\t486663\n厍\t486664\n鲍尔\t486665\n体育\t486666\n下雪\t486667\n微信助手\t486668\nstrap\t486669\n远场\t486670\nf80\t486671\n芦墟镇\t486672\ncoated\t486673\n军容\t486674\n宝装\t486675\n装饰类\t486676\nIN石器时代\t486677\n旗国\t486678\n赤红\t486679\n103分钟\t486680\n绿发\t486681\n生母\t486682\n海莲娜\t486683\n山狮\t486684\n廉正\t486685\n佳能iR\t486686\n京雄铁路\t486687\n私借\t486688\n舞出我人生\t486689\n母料\t486690\n宝泰尔\t486691\n安波福\t486692\n过两天
\t486693\n记忆库\t486694\n快银\t486695\n手抄本\t486696\nNorma\t486697\n骑马\t486698\nhfss\t486699\nmos\t486700\n张指导\t486701\n笑而遣\t486702\n重率\t486703\nrespawn\t486704\ngoc\t486705\noppor9s\t486706\n2017卷\t486707\n西安人事考试网\t486708\n澄面\t486709\n中央五套\t486710\n毒萝\t486711\n琢\t486712\n乖,摸摸头\t486713\nWelded\t486714\n怨憎\t486715\n形器\t486716\n教学内容\t486717\n乙烯利\t486718\n尤里复仇\t486719\n瓦特\t486720\ntraveller\t486721\n租界\t486722\nAPE格式\t486723\n凤凰酒业\t486724\n施乐\t486725\n广州市科学技术协会\t486726\njawnha\t486727\nwebmoney\t486728\n汇和\t486729\n三圣股份\t486730\n济南战役\t486731\n架号\t486732\n金卷\t486733\n打入\t486734\n1315\t486735\n探灵\t486736\nflatlist\t486737\n载重量\t486738\n日产天籁\t486739\n耳聋\t486740\n习得\t486741\n_斯\t486742\n王中王\t486743\n阿依琳\t486744\n128码\t486745\n手牵手\t486746\n鱼骨图分析法\t486747\n120w\t486748\nbiomaterials\t486749\n渔溪镇\t486750\n出省\t486751\n第一脚\t486752\n自投\t486753\n安庆市食品药品监督管理局\t486754\n济南军区\t486755\n海洋世界\t486756\n一4\t486757\n衣服女\t486758\n桩基声测管\t486759\n口齿不清\t486760\n13寸\t486761\n收益率曲线\t486762\n美钞\t486763\n济南六一儿童医院\t486764\n上海音乐出版社\t486765\n1866\t486766\n四辆\t486767\n卫诗\t486768\n舌板\t486769\n土桩\t486770\n闭馆\t486771\n3575009\t486772\n李卓吾\t486773\n第10天\t486774\ncoop\t486775\n岔路镇\t486776\n愚行\t486777\n比特派\t486778\n杨公桥\t486779\n非常可乐\t486780\n当做\t486781\n杨艺格格\t486782\n脱扣器\t486783\n周公\t486784\n中央气象台\t486785\n巾帼志愿者\t486786\n向西\t486787\n广州华立科技职业学院\t486788\n通牒\t486789\n翠微\t486790\nNewark\t486791\n黑位\t486792\n全国空气质量指数\t486793\n蔓越莓胶囊\t486794\n自取行\t486795\n临山镇\t486796\n北部湾经济区\t486797\n马来西亚航空\t486798\n台酒\t486799\nPS家园网\t486800\ngt\t486801\n日期型字段\t486802\nrnd\t486803\nHcg\t486804\npapertime\t486805\n阿特拉斯空压机\t486806\n清产\t486807\nPas\t486808\n重庆宾馆\t486809\n系统软件\t486810\n红米3x\t486811\n500篇\t486812\n四川传媒\t486813\n当头棒喝\t486814\n瓯海中心区\t486815\n修玛\t486816\n部落冲突:皇室战争\t486817\n北白\t486818\na7\t486819\nAvalon\t486820\n史低\t486821\n北京公交集团\t486822\n东方小说阅读网\t486823\ndbgrideh\t486824\n长发男\t486825\nAsian51.com\t486826\nlanding\t486827\nbabyyellow\t486828\n联系电话\t486829\n62%\t486830\n朱修\t486831\nxperia\t486832\n九峰镇\t486833\n5天内\t486834\n星期三
\t486835\n反乌托邦\t486836\n东池\t486837\n利沃夫\t486838\nmiddot\t486839\n怒吼\t486840\n迈瑞点道\t486841\n久隆\t486842\n营业性\t486843\n基友网\t486844\n60t\t486845\n20GB\t486846\n山东大学中心校区\t486847\n白象\t486848\n注册量\t486849\n阳泉市政府\t486850\nRestful接口\t486851\n2018年4月1日起\t486852\n哈萨克斯坦\t486853\n早发\t486854\n通販\t486855\n四川大学商学院\t486856\n通止规\t486857\n七嘴\t486858\n威斯康\t486859\n克拉玛依市\t486860\n阿赫瓦里\t486861\n锡林郭勒草原\t486862\n宋达民\t486863\nRabbitMQ\t486864\naab\t486865\nFlea\t486866\n深\t486867\n250名\t486868\n种族值\t486869\n15期\t486870\n太库\t486871\n质量受权人\t486872\n任静\t486873\n小米小爱mini\t486874\ndiskgen\t486875\n启正\t486876\n球员\t486877\n羿龙\t486878\n广东省住建厅\t486879\n龙狙\t486880\n华克\t486881\n驼绒\t486882\n祷告\t486883\n黎璃\t486884\n核晶\t486885\n卡宾达\t486886\n起立\t486887\n东花市\t486888\n神鱼\t486889\n古罗马斗兽场\t486890\n服务站\t486891\n_市\t486892\n如火如荼\t486893\n郑力\t486894\nASCO\t486895\n烟感报警器\t486896\n计量罐\t486897\nessays\t486898\n将改革进行到底\t486899\n察隅\t486900\n上海和平眼科医院\t486901\n防风绳\t486902\n昧\t486903\n嘻哈gai\t486904\n圆梦巨人\t486905\n四柱\t486906\n3.1_\t486907\n农村商业银行股份有限公司\t486908\n氯苯\t486909\n大殿\t486910\n亚顿\t486911\n铁杉\t486912\n阎维文\t486913\nzxv10\t486914\nGYSFONE\t486915\n健身照\t486916\n帧\t486917\nNBA直播\t486918\n第十二卷\t486919\nBuffalo\t486920\n奥黛丽·赫本\t486921\n温宿\t486922\n被擒\t486923\n香炉\t486924\n机情\t486925\n金明洙\t486926\n南京越博动力系统股份有限公司\t486927\nVOIP\t486928\n温血\t486929\n比海\t486930\nChrist\t486931\nEASY\t486932\n曾鸣\t486933\n白砂糖\t486934\nmaf\t486935\nLSA\t486936\n内江路\t486937\n千与千寻\t486938\n格式化验证\t486939\n350W\t486940\n斯密特\t486941\n还幼\t486942\n虎鱼\t486943\nS12K\t486944\n黑龙江工程学院\t486945\n悠梦\t486946\n小秦\t486947\n松翰\t486948\n扇贝肉\t486949\n填一填\t486950\nzhangsharp\t486951\n蝙蝠车\t486952\ntrave\t486953\n粘乎乎\t486954\nRTK\t486955\n17.7%\t486956\n请予\t486957\n民法总则\t486958\n散热管\t486959\nmains\t486960\n欧铁\t486961\n小写\t486962\nsachs\t486963\nExecutors\t486964\n国家留学基金\t486965\n夫妻共同债务\t486966\n20140429\t486967\n心儿\t486968\n幸运物\t486969\nvassistx\t486970\n用劲\t486971\nd500\t486972\n毕业诗\t486973\ndata2\t486974\nbrokers\t486975\n闺女\t486976\n科波菲尔\t486977\n及时性\t486978\n余荫山\t
486979\n世界新闻报\t486980\n热血青年\t486981\n零担\t486982\n列维坦\t486983\n高通公司\t486984\nrhino6\t486985\n贸易展\t486986\n略过\t486987\n夜夜撸\t486988\n阿里大\t486989\nRAZER\t486990\n一个片\t486991\n狠人大帝\t486992\n畲乡\t486993\n蔡健雅\t486994\n120亿\t486995\n倒角刀\t486996\n兴国县人民政府\t486997\n宝莱坞生死恋\t486998\n酶活性\t486999\n伊春市委\t487000\n登天路\t487001\n金水镇\t487002\nGABA\t487003\n相恋\t487004\n小晖\t487005\n无限制版\t487006\n闪萌\t487007\n光忠\t487008\n东莞植物园\t487009\n陌车\t487010\n北京交通大学交通运输学院\t487011\n调和\t487012\n67条\t487013\n十二厘米\t487014\n屏蔽器\t487015\n仓央西游记\t487016\n博雅旅游网\t487017\n剑侠世界\t487018\n武生\t487019\n同福客栈\t487020\n陈朋\t487021\n珠江新城院区\t487022\n1248\t487023\n不干胶标贴\t487024\n办税\t487025\n王守业\t487026\n黑炭\t487027\nluozhiyong131\t487028\n第二十一集\t487029\n艾莉丝\t487030\n曲艺\t487031\n台身\t487032\n龙珠z爆裂大战吧\t487033\n橙皮书\t487034\n卡黄\t487035\ntp3\t487036\n母机\t487037\n臭虫\t487038\n小年夜\t487039\n厄尔\t487040\nezmorph\t487041\nDBC\t487042\n广东省地方税务局\t487043\nmsvc\t487044\n杨中科\t487045\n程门立雪\t487046\n高力\t487047\ntyre\t487048\nudk\t487049\n盛骤\t487050\n百位数\t487051\n品牌代言人\t487052\njj比赛\t487053\n排放\t487054\n消业\t487055\n前端\t487056\nEJS\t487057\n易速\t487058\n瑶山\t487059\n斯贝斯\t487060\n莫拉克\t487061\n同台\t487062\nseo优化\t487063\n邪见\t487064\nc200\t487065\n第16关\t487066\n冷萃咖啡\t487067\nII期\t487068\n苗药\t487069\n2017年7月26日\t487070\n厚厚\t487071\n房地产\t487072\n舞韵瑜伽\t487073\n维斯塔斯\t487074\n至尊魔都\t487075\n文渊\t487076\n孤傲苍\t487077\n思迈\t487078\n志华\t487079\n王爱\t487080\n悠络客\t487081\nv2.6.2\t487082\n一年内\t487083\n过头\t487084\nSPSS论\t487085\n东林书院\t487086\n松木\t487087\n阿峰\t487088\n新中式风格\t487089\n脱欧公投\t487090\n全齐\t487091\n跨界营销\t487092\n2017年6月份\t487093\n水火不容\t487094\n铺\t487095\ndnf阿修罗\t487096\n生根剂\t487097\n镜像卷\t487098\n280个\t487099\n20140607\t487100\n鱼粉\t487101\nflyfoxs\t487102\n海桑\t487103\n冲切\t487104\n娄江\t487105\nConfluence\t487106\n行业\t487107\n2018年5月17日\t487108\n月娘\t487109\n师工\t487110\n集群式\t487111\n4.52\t487112\n发生性\t487113\nFrees\t487114\ncol-sm-*\t487115\n88MM\t487116\n下体\t487117\n背滤\t487118\n幼鱼\t487119\n丽都\t487120\n韩超\t487121\n恶霸鲁尼\t487122\n雷锤\t487123\n132nw\t487124\n人大经济论坛\t48712
5\n十三行\t487126\n高德地图marker\t487127\nwds\t487128\n鬼片\t487129\ntonymacx86.com\t487130\n龙蟒佰利\t487131\n简体中文版\t487132\n质子交换膜\t487133\n應用\t487134\n宽厚板\t487135\n常青街道\t487136\n防损\t487137\nxinhua\t487138\n复合材料\t487139\ndion\t487140\n扫射\t487141\n感温\t487142\ntophat\t487143\n掳掠\t487144\nzhicheng\t487145\n第八组\t487146\n两条腿\t487147\npm2.5\t487148\ng点\t487149\notcbtc\t487150\n显著性分析\t487151\n素晴日\t487152\n20160527\t487153\n拭去\t487154\nm3\t487155\n第三十四章\t487156\n根德\t487157\n第7章\t487158\n祭拜\t487159\n缝纫机\t487160\n粪尿\t487161\n20171014\t487162\n中晋\t487163\n探亲假\t487164\n马可尼\t487165\n三条杠\t487166\nCOUNTIF函数\t487167\nV4.6\t487168\nRpg\t487169\n雷神3:诸神黄昏\t487170\n超连接\t487171\n搜房源网\t487172\n一丘\t487173\nOgre\t487174\n房村镇\t487175\n雨荨\t487176\n1067\t487177\n德爷\t487178\n这个城市\t487179\n11.16\t487180\n大荔县\t487181\n棠眠\t487182\nGreece\t487183\n标色\t487184\n总府路\t487185\n按摩店\t487186\n新闻30分\t487187\n甲午战争\t487188\n南菁\t487189\nriwen\t487190\n二保焊机\t487191\n笔上\t487192\nbought\t487193\n六百多部\t487194\nsd高达g世纪创世吧\t487195\n160901\t487196\nDELSEY\t487197\n15.05\t487198\n狼牙山\t487199\n华苑产业园区\t487200\n资智\t487201\n中文娱乐\t487202\n2月11日\t487203\n李国辉\t487204\n韦尔斯\t487205\n三下\t487206\n陈曼青\t487207\n仁宗\t487208\n达沙替尼\t487209\n方太热水器\t487210\n再障吧\t487211\n英菲尼迪QX70\t487212\n青木瓜\t487213\n楼盘快讯|住朋网\t487214\nRK3288\t487215\n大通镇\t487216\nWAL\t487217\n夏门\t487218\nRTMP\t487219\nv1.5.4\t487220\n掉了\t487221\n城市绿洲\t487222\n金日成\t487223\nGromacs\t487224\n类名\t487225\n辅路\t487226\n几联\t487227\n140多\t487228\nHCP\t487229\n千鸟\t487230\nSoar\t487231\n哨音\t487232\n辣妈\t487233\nCydia\t487234\n餐服\t487235\nWeavi\t487236\nCotton\t487237\n88A\t487238\n发展大道\t487239\nSimons\t487240\n小动物\t487241\nPNAS\t487242\n腿子\t487243\nmais\t487244\n第51\t487245\n戳记\t487246\nOreo\t487247\nRimworld\t487248\nabel\t487249\nColg\t487250\n33d\t487251\n任我游\t487252\n春兰篇\t487253\n第98期\t487254\n宏路\t487255\nk7\t487256\n5565\t487257\n生命之光\t487258\n生产安全事故报告和调查处理条例\t487259\n_高等教育出版社\t487260\nerlang\t487261\nputhon\t487262\n打哈欠\t487263\n河南省人大常委会\t487264\n新奥燃气\t487265\n佳能mg\t487266\n结体\t487267\
nCCTV10\t487268\n盈信广场\t487269\nRequirement\t487270\n如雨止\t487271\n国网电力科学研究院\t487272\n新金瓶梅杨思敏版\t487273\n苏打片\t487274\n行政化\t487275\n银杏酮酯滴丸\t487276\n漫王\t487277\n40P\t487278\n针药\t487279\nRubicon\t487280\nDPI\t487281\n空手套白狼\t487282\n饱和水\t487283\n千叶县\t487284\n睡后\t487285\n陶哲轩\t487286\n滨田英明\t487287\n海盗船K70\t487288\nGIGI\t487289\n割集\t487290\ncss吧_\t487291\n四川大学水利水电学院\t487292\n西溪印象城\t487293\n框景\t487294\n晶科\t487295\n26式\t487296\ntypc\t487297\nme60\t487298\n茶梅\t487299\n鸿鹄之志\t487300\n星云链\t487301\n天津市人事考试\t487302\nEbook\t487303\nsim868\t487304\ntxl\t487305\n淘宝卖家子\t487306\n铁掌\t487307\n陈十一\t487308\nx型\t487309\n正博会\t487310\n每一周\t487311\n向西村\t487312\n横截面积\t487313\n8bit\t487314\n时尚元素\t487315\n电扶梯\t487316\ncoord\t487317\n柴油发电机组\t487318\n电科\t487319\ndockers\t487320\n码垛机\t487321\n铭板\t487322\n栖见\t487323\n海东市\t487324\n1.6万亿\t487325\nOllyDbg\t487326\n用之不竭\t487327\ndereferencing\t487328\n平安银行\t487329\n打报告\t487330\n通天报\t487331\nHeel\t487332\n生态板\t487333\nqq野马\t487334\n纯白户\t487335\ncodewars\t487336\n普隆德拉\t487337\n厩\t487338\njoints\t487339\n常熟汽饰\t487340\n大决战\t487341\n东港股份\t487342\n2迅雷\t487343\n音车\t487344\n心理战\t487345\nboli\t487346\nBAK\t487347\n孕妇照\t487348\ngmp\t487349\n石油树脂\t487350\n伸出\t487351\nAnt\t487352\neda\t487353\n矽力杰\t487354\n东北二嫂\t487355\n金怪\t487356\n李士懋\t487357\n[求文\t487358\n蒙医\t487359\n4570\t487360\n还值\t487361\n茶卡\t487362\n两相\t487363\n新文化运动\t487364\n读数\t487365\n木易森林\t487366\nuses-permission\t487367\n示例\t487368\n洛阳电视台\t487369\n窗里\t487370\n贵都花园\t487371\n小本悠然小天骐\t487372\n经济观\t487373\n1875\t487374\n唐涛\t487375\n电子词典\t487376\n211路\t487377\n诺德中心\t487378\nTie\t487379\n油价网\t487380\n看我72变\t487381\n绿龙\t487382\n血淤\t487383\n陈梦吉\t487384\nglsl\t487385\n永顺镇\t487386\n50382\t487387\n陈贺\t487388\n德州市人力资源和社会保障局\t487389\nchanging\t487390\n赵山河\t487391\n冒险岛剑豪\t487392\nxdf\t487393\n16MB\t487394\n多队\t487395\n向实\t487396\n高桥南\t487397\n4.3.3\t487398\n多开版\t487399\nAPC\t487400\neth2\t487401\nBorder\t487402\n历史转折中的邓小平\t487403\n种族隔离\t487404\nzontal\t487405\n调理品\t487406\nWestlife\t487407\nSARA\t487408\n直通率\t487409\ntur
bine\t487410\n广州万达\t487411\n深圳广播电视大学\t487412\n祥树\t487413\n宁德市人力资源和社会保障局\t487414\n无数遍\t487415\n会昌县政府\t487416\n秸秆\t487417\n肇中\t487418\n柳州机场\t487419\nvdh\t487420\n23个\t487421\nMM3\t487422\n牌局\t487423\nNiko\t487424\nhibernate5\t487425\n礼法\t487426\n省民委\t487427\n多年以后\t487428\n西安电子科技大学网络与继续教育学院\t487429\n颈椎病\t487430\n7班\t487431\n柜顶\t487432\n武汉政府\t487433\n胡说八道\t487434\n铅酸电池\t487435\n漏点_露乳_露毛_裸露_露胸\t487436\n负敏\t487437\n回去吧\t487438\n第13期\t487439\n贾老板\t487440\n慧宇\t487441\nE租宝\t487442\n菜园街\t487443\n黄点\t487444\n累了\t487445\n合体版\t487446\n北洋政府\t487447\n两百斤\t487448\n2.5公斤\t487449\n砂浆喷涂机\t487450\nJared\t487451\n8周岁\t487452\n淬火钢\t487453\n阳光网驿\t487454\n折_\t487455\n液压车\t487456\n万象更新\t487457\n3层\t487458\n被偷\t487459\n统一\t487460\n艺电\t487461\n30米高\t487462\n暂予\t487463\n通情达理\t487464\n螺旋管\t487465\n山西省财政税务专科学校\t487466\n中国近代史\t487467\n普莱柯\t487468\n石刑\t487469\n金吉列留学官\t487470\n黑蔷薇\t487471\nuuzyz\t487472\nFIR\t487473\n虐情\t487474\ncityengine\t487475\n神圣纪事\t487476\n河南牧业经济学院\t487477\n立讯精密工业股份有限公司\t487478\n广州旅行社\t487479\nesapi\t487480\n江西省科学技术厅\t487481\n做文\t487482\n炉门\t487483\n虚拟ip\t487484\nduos\t487485\n审计证据\t487486\nPS+\t487487\n争相\t487488\n苏梨\t487489\n认识\t487490\n11.25\t487491\n江城子\t487492\n不合\t487493\n不三不四\t487494\n头绳\t487495\n小蔡\t487496\n煤块\t487497\n欲为\t487498\nTMP\t487499\n郭颖儿\t487500\n荆州开发区\t487501\n元素师\t487502\n日后\t487503\n炒花生米\t487504\n祁山\t487505\n低模\t487506\npaired\t487507\n说过\t487508\n文山县\t487509\n腺瘤\t487510\n宣战\t487511\n产别\t487512\n商都\t487513\n亚马逊公司\t487514\n认识论\t487515\nKeyboards\t487516\n长边\t487517\n储蓄存款\t487518\n湖人\t487519\n三次产业结构\t487520\n交给\t487521\n赵宏伟\t487522\n药物学\t487523\n形容词性物主代词\t487524\n石魔\t487525\n汇通\t487526\n首优\t487527\n米素\t487528\nthinkcmf\t487529\n万界天墟\t487530\n惠水\t487531\n髌骨软化症\t487532\ncd3\t487533\n正月二十日\t487534\n自启动\t487535\n等分线\t487536\n食品工业科技\t487537\n问君\t487538\n手打版\t487539\naffine\t487540\n古墨\t487541\n2017-2022年\t487542\n风信网\t487543\n云歌\t487544\n练霓裳\t487545\ndeleted\t487546\n海豚湾\t487547\n117.136.3.40\t487548\nENZO\t487549\n凯旋论坛\t487550\n移動\t487551\n男相\t487552\n密境\t48755
3\n姚启圣\t487554\n纤维板\t487555\n吉象\t487556\n梁东\t487557\n沾身\t487558\n嘘嘘\t487559\n膛\t487560\n黛安芬\t487561\n装甲兵\t487562\n中国国际海运集装箱(集团)股份有限公司\t487563\n商业银行公司\t487564\n鸡女\t487565\n助浴\t487566\n沙姜\t487567\n三国恋\t487568\n亚宝\t487569\n大连妇产医院\t487570\n角斗\t487571\nwebui\t487572\n风刑软件站\t487573\n摇头丸\t487574\n心理罪城市之光\t487575\n萧宏慈\t487576\n台灣黃頁詢價\t487577\n几何学\t487578\n彩堂\t487579\n陈晟\t487580\n额角\t487581\n梁昆淼\t487582\n神奇传说\t487583\nZhe\t487584\n华夷\t487585\n致源\t487586\n驱灵流氓高手\t487587\n上海市闵行区人民法院\t487588\n黑黑\t487589\n审判信息网\t487590\n长亭\t487591\n画蛇添\t487592\nCC英雄联盟\t487593\n两证\t487594\n囊泡\t487595\n杀破狼\t487596\n勾函数\t487597\n晋剧\t487598\nAccident\t487599\nVessels\t487600\njiance\t487601\n链甲\t487602\n干眼\t487603\n38mm\t487604\nmysqlnd\t487605\n社区\t487606\n3700元\t487607\n佳能800D\t487608\n世界读书日\t487609\ncleaning\t487610\n丰润区\t487611\nClick事件\t487612\n武汉光谷希尔顿酒店\t487613\n胸胀\t487614\n何志成\t487615\n优惠待遇\t487616\nwrote\t487617\n茶场\t487618\n8P\t487619\n电解槽\t487620\n飞天小女警\t487621\n奥迪Q7\t487622\n捐给\t487623\nUsage\t487624\n战旗\t487625\nleng\t487626\n混批\t487627\nCryptoNight\t487628\n李泌\t487629\n金农网\t487630\n出票\t487631\n名侦探皮卡丘\t487632\n歴史\t487633\n第五部\t487634\n观音寺\t487635\n化工壶\t487636\n黄润秋\t487637\n供应量\t487638\n成一个\t487639\n鱼骨辫\t487640\n水肥一体化\t487641\n银豹收银软件\t487642\n烟灶\t487643\n迟滞\t487644\n创业基础\t487645\ncphi\t487646\n云南城投\t487647\n自动门\t487648\nWalther\t487649\n中国照明学会\t487650\n金三角\t487651\nmusescore\t487652\n苏红\t487653\n北京交通大学土木建筑工程学院\t487654\naia\t487655\nmur\t487656\n盐仓\t487657\n背景图\t487658\n成人节\t487659\n铕\t487660\n短剑\t487661\n左心衰\t487662\n固镇县人民政府\t487663\n失身\t487664\n内饰\t487665\nfissler\t487666\n陆克文\t487667\n套装扮\t487668\n睡颜\t487669\n隆回县政府\t487670\n降脂\t487671\n兽宠\t487672\n第20号\t487673\n落宫\t487674\n生命之泉\t487675\n00151\t487676\n方三文\t487677\n合伙\t487678\n上海烟草集团\t487679\n三生三世十里桃花片尾曲\t487680\n将领\t487681\n博湖县\t487682\n许辰\t487683\nqm\t487684\n孤军\t487685\n输死\t487686\n广州办公室\t487687\n婆罗洲\t487688\n喜庆\t487689\n宋MAX论坛_汽车之家论坛\t487690\n深圳侨报\t487691\n康业\t487692\n玛格丽特公主\t487693\n1.8.5\t487694\n莫斯\t487695\n硬骨\t487696\n米果果\t487697
\n胃瘫\t487698\nCalculations\t487699\nyingji\t487700\n爱飞网\t487701\n歪风邪气\t487702\n无处藏身\t487703\n玉函\t487704\n头疗\t487705\n春江花月\t487706\n简单化\t487707\nDodoBook\t487708\n说梦\t487709\n魔书\t487710\n8首\t487711\nn1\t487712\n汉斯季默\t487713\n万芳\t487714\n20170511\t487715\n台帐\t487716\n金波子\t487717\n6250\t487718\n国巨\t487719\n县招商局\t487720\n同济大学交通运输工程学院\t487721\n优酷pc\t487722\n腕式血压计\t487723\n奔驰S级\t487724\n多拉基亚776\t487725\n片体\t487726\n管理系统中计算机应用\t487727\n栾云平\t487728\n食材\t487729\n中国科普\t487730\n表镜\t487731\n氧化硅\t487732\n399006\t487733\n8808\t487734\n揭西\t487735\nhp1020plus\t487736\npline\t487737\nPPT流程图\t487738\n革\t487739\nrestTemplate\t487740\nIndo\t487741\n安阳市公安局\t487742\n魔兵传奇\t487743\n萧太后\t487744\n变频空压机\t487745\n浙江银行\t487746\n9局\t487747\nsuperfetch\t487748\n蜂融网\t487749\nTalbot\t487750\n有关爱\t487751\n行至\t487752\n爱喜\t487753\n林隆璇\t487754\nWAY\t487755\n活出\t487756\n榨取\t487757\n温州机场\t487758\n华夏幸福\t487759\n威猛\t487760\n陈企\t487761\n一撇一捺\t487762\nfputs\t487763\n顺丰标\t487764\n排卵药\t487765\n王新伟\t487766\n蛋\t487767\n台式电脑显示器\t487768\n第78集\t487769\n冰车\t487770\nhaba\t487771\n脱胎换骨\t487772\nZ170\t487773\n飞兔\t487774\n海大\t487775\n晨风\t487776\n单号\t487777\n摄氧量\t487778\nPRML\t487779\n_户\t487780\n绵柔型\t487781\n锐科\t487782\n苏泊尔\t487783\n情歌\t487784\n晴雪\t487785\n心心念念\t487786\n溧水区\t487787\n分仓\t487788\nCSV格式\t487789\n钟面\t487790\n2万名\t487791\n中华门\t487792\n无所畏\t487793\n四版\t487794\n上师大\t487795\n益禾堂奶茶\t487796\n7cm\t487797\n秀英\t487798\n志者\t487799\n史莱姆牧场\t487800\n巴麻美\t487801\n万州市\t487802\n白果树\t487803\n皮帽\t487804\n沈绍功\t487805\n荆无命\t487806\n潮鞋\t487807\n乐余\t487808\n滝川ソフィア\t487809\n四万\t487810\n萨苏\t487811\n麒麟\t487812\nRelevance\t487813\n宫氏\t487814\n搭档\t487815\n替父\t487816\n外盒\t487817\n拉萨\t487818\n搭石\t487819\n沙枣\t487820\n上进心\t487821\nSIA\t487822\nSP0\t487823\nAI音箱\t487824\n亿米\t487825\n极品飞车无极限吧\t487826\n耽美种田文\t487827\n冷水澡\t487828\n自己\t487829\n魔兽世界奥格瑞玛\t487830\n红对勾\t487831\n绿门\t487832\nsha512\t487833\n泡泡玛特\t487834\nhd6850\t487835\nchallenge\t487836\n帕纳梅拉\t487837\n汉儿\t487838\n无辣不欢\t487839\n每个\t487840\n述说\t487841\n装饰贴\t487842\n东北山\t487843\n续章\t
487844\n于是\t487845\n京东万象\t487846\nphonics\t487847\n20170310\t487848\n冰箭\t487849\n路阳\t487850\n中国铝业\t487851\n就业政\t487852\nphong\t487853\nVENU\t487854\n脑洞\t487855\n房租\t487856\n布罗利\t487857\n湾头\t487858\n卷首语\t487859\n専门\t487860\n编织坊\t487861\nallergo\t487862\n淘管家\t487863\n要回来\t487864\n广州50校\t487865\n两三倍\t487866\n听见你的声音\t487867\n贝海\t487868\n盐湖股份\t487869\n7.20\t487870\nk42\t487871\n上乘\t487872\n倒掉\t487873\n垂直提升机\t487874\n有点甜\t487875\n.rar-CSDN\t487876\nKOL\t487877\n石坝镇\t487878\ndobby\t487879\n恐慌症\t487880\nR语言函数\t487881\n几部\t487882\n四棵树\t487883\n去重\t487884\n炸锅\t487885\n拉莫斯\t487886\nTrailer\t487887\n_39健康网\t487888\nUbantu\t487889\n马寅\t487890\nmx2\t487891\n0.06\t487892\nngau\t487893\nepidata\t487894\n达拉特旗\t487895\n秃顶\t487896\nfh\t487897\n900块\t487898\n小嘎子和胖墩儿比赛摔跤\t487899\n乌龙\t487900\n中档\t487901\n试销\t487902\nz420\t487903\nGuilin\t487904\n够格\t487905\nRammstein\t487906\n闭门\t487907\nOUTLOOK2007\t487908\n亚历山大·麦昆\t487909\n000012\t487910\n寻祖\t487911\n1780\t487912\n花园饭店\t487913\n会当\t487914\n三年级下册语文\t487915\n郭颂\t487916\noutfile\t487917\n狙击精英4\t487918\n反对\t487919\npcsx\t487920\n熊乃瑾\t487921\n保护层\t487922\nbf2\t487923\n32攻速\t487924\n中国建筑标准设计研究院\t487925\n隐马尔可夫\t487926\n凯影\t487927\n破卵针\t487928\nv1.0.6\t487929\n创业大道\t487930\n动画史\t487931\n微拍\t487932\n北京建工集团有限责任公司\t487933\n种质\t487934\n1736\t487935\n氨氮\t487936\n三十一号\t487937\nLien\t487938\nVersus\t487939\nNucleo\t487940\n秋阳\t487941\n20170126\t487942\n玩者\t487943\n南葛\t487944\ncisa\t487945\n副市长\t487946\n天龙起亚k4\t487947\n编织\t487948\n328i\t487949\n个人消费贷款\t487950\n荧光灯\t487951\n雄风\t487952\n七百多\t487953\n2317\t487954\n嫌\t487955\n20160915\t487956\n探究性\t487957\n门廊\t487958\n91ba\t487959\n百洲\t487960\nQ版\t487961\n好方\t487962\nttk\t487963\n李志华\t487964\n3206\t487965\n增大器\t487966\n第104期\t487967\n太阳星\t487968\n声道\t487969\n倒货\t487970\n牛栏江\t487971\n95岁\t487972\n严把\t487973\n棒mthb\t487974\n第98\t487975\n陈振华\t487976\n果能\t487977\n锦尚\t487978\n10立方米\t487979\n胡克\t487980\n营城子\t487981\n孤剑\t487982\nbvv\t487983\n济南恒蓝环保设备有限公司\t487984\n马德里\t487985\n第一笔\t487986\n软硬包\t487987\nwin+\t4879
88\n丽丝网\t487989\n横滨国立大学\t487990\noffice365吧\t487991\n池塘边\t487992\n儿歌大全_九酷音乐\t487993\n市科技局\t487994\n第2场\t487995\n预兆_列表网\t487996\n名侦探柯南:纯黑的恶梦\t487997\n博客地址\t487998\n祥光\t487999\n意味着\t488000\n上海上港队\t488001\n汪海\t488002\n损益\t488003\n7个多月\t488004\n中国江都_江都政府网\t488005\n祝枝山\t488006\n大理岩\t488007\n东材科技\t488008\nia64\t488009\n跨国婚姻\t488010\n上海绿化市容局\t488011\n500hz\t488012\n综合国力\t488013\nFernando\t488014\n第30届\t488015\n双河口\t488016\n58791852\t488017\n語\t488018\n北辛街道\t488019\n0563\t488020\n校盲\t488021\n拍摄者\t488022\n鱼苗\t488023\ncaster\t488024\n凌天\t488025\n马步兵\t488026\n建设工程项目管理规范\t488027\n梦幻天堂\t488028\n刻骨铭心\t488029\n调转\t488030\n王者荣耀宫本武藏\t488031\n宝率\t488032\n418\t488033\n乞丐皇帝传奇\t488034\n篮球框\t488035\n开放型经济\t488036\nstg609\t488037\n参赞处\t488038\n四岛\t488039\n脱口秀大会\t488040\n好样\t488041\nsamp\t488042\n竹村\t488043\n阿里扎\t488044\n南平一中\t488045\n和润\t488046\n莱伯妮\t488047\n江苏省检察院\t488048\nZICO\t488049\n3mg\t488050\n好网角\t488051\nDAQ\t488052\n弱梁\t488053\n北京易华录信息技术股份有限公司\t488054\nnhdta\t488055\nPaleo\t488056\n溃疡病\t488057\n盲板阀\t488058\nVisualVM\t488059\nEaseUI\t488060\n索迪斯\t488061\nFreeman\t488062\ncpu\t488063\n北京思特奇信息技术股份有限公司\t488064\n光华科技\t488065\n糯米\t488066\n蜜壶\t488067\ncreate\t488068\n木栈板\t488069\n圆底\t488070\n新加坡科技园\t488071\n陈琦\t488072\nAggressive\t488073\n崖门\t488074\nY51A\t488075\n谎报\t488076\n网页游戏论坛|网页游戏\t488077\n水利局\t488078\n布娃娃\t488079\n祁门县\t488080\n咿啦\t488081\n威古氏\t488082\n烜\t488083\n有心\t488084\n范闲\t488085\n别亦难\t488086\n新华联\t488087\n老夫妇\t488088\n审阅\t488089\n18年2月\t488090\n轴心国\t488091\n谷仓\t488092\nxander\t488093\n胸装\t488094\n中华人民共和国增值税暂行条例\t488095\n期货日报网\t488096\n冀州区\t488097\nDVD电影网\t488098\n小人得志\t488099\n大理石板\t488100\n永通\t488101\nwinxp\t488102\n梁磊\t488103\nGoEasy\t488104\n排场\t488105\n宁海中学\t488106\n80070005\t488107\n河南中医学院\t488108\n北京东方雨虹防水技术股份有限公司\t488109\n慕容晓晓\t488110\n一行人\t488111\n浮游\t488112\n煤运\t488113\n草社区榴\t488114\nvpa\t488115\n张厚粲\t488116\n2018年18号\t488117\n塔园路\t488118\n宫崎葵\t488119\n班迪录屏\t488120\n翰林华府\t488121\n白搭\t488122\n保利鱼珠港\t488123\n上海鲁汶生物科技有限公司\t488124\n究极日月\t488125\n魔灵\t488126\nsatisfied
\t488127\n發燒實驗室\t488128\n换行\t488129\n英语专业八级\t488130\ncncn\t488131\n瀚城\t488132\n华为Mate\t488133\nance\t488134\n有序\t488135\njong\t488136\n苔草\t488137\n91.8\t488138\n蓝小雨\t488139\n阿部乃\t488140\nr9m\t488141\n饥民\t488142\n豚\t488143\n数安\t488144\n10.12.1\t488145\ndoe\t488146\n湖北省委\t488147\njaxws\t488148\n出卖\t488149\n一介书生\t488150\n成仇\t488151\n二十五年\t488152\n截长屏\t488153\n金地商\t488154\n新青\t488155\n涅瓦\t488156\n熙攘\t488157\n水吧\t488158\n间隔\t488159\n1924\t488160\nE531\t488161\n帧速\t488162\n重庆3号线\t488163\nG903\t488164\n球果\t488165\n萧晗\t488166\nduoge\t488167\nPatti\t488168\n婚心\t488169\n临床医学检验技术\t488170\n287\t488171\n宁洱县\t488172\n五两\t488173\n乘物\t488174\n四季青镇\t488175\nRohde\t488176\n傻女\t488177\n06秒\t488178\n国家医疗保障局\t488179\nAEscripts\t488180\n深圳市航盛电子股份有限公司\t488181\n紫云台\t488182\n公共政策学\t488183\n即时\t488184\nQQ娱乐\t488185\n南京国\t488186\n华悦\t488187\n月历\t488188\n老虎斑\t488189\n哈尔滨电视台\t488190\n大话西游之月光宝盒\t488191\n判例\t488192\n胭脂\t488193\n这样做好\t488194\n中兴软创\t488195\n棒球\t488196\n薛亮\t488197\n斛\t488198\n共圆\t488199\n新西游记\t488200\n沃尔玛百货有限公司\t488201\n025\t488202\n三星屏\t488203\n税利\t488204\n枝条\t488205\n统计量\t488206\n鲜竹酒\t488207\n股东会决议范本\t488208\n常闭型\t488209\n2000万吨\t488210\n金融学专业\t488211\n约合\t488212\nREACT\t488213\n香途\t488214\n溶酶体\t488215\nlink2sd\t488216\nItinerary\t488217\n大BOSS\t488218\n自来水之污\t488219\n大皇宫\t488220\n华歌\t488221\n欧乐B\t488222\n永红\t488223\n企业税号查询_一般纳税人\t488224\nwin32.msi\t488225\nlsa\t488226\n懒癌\t488227\n韦福强\t488228\n电暖\t488229\n做媒\t488230\nmmu\t488231\n汤兰兰\t488232\n180m\t488233\n小蘑菇\t488234\n第18条\t488235\n智慧树网\t488236\n保果\t488237\nServletResponse\t488238\n仿真枪\t488239\n成长史\t488240\nshading\t488241\nchildrens\t488242\n林书成\t488243\n说中\t488244\n作手\t488245\n加拿大维多利亚大学\t488246\n潍坊市住房公积金管理中心\t488247\n小额贷款有限公司\t488248\n凯迪拉克XT5论坛_汽车之家论坛\t488249\n海堂\t488250\n确山\t488251\n代出\t488252\n李沧万达广场\t488253\nFK\t488254\n哈曼丹\t488255\n吕强\t488256\n恐龙展\t488257\n亳州市人民医院\t488258\n同步整流\t488259\n反反复\t488260\nmeow\t488261\nSupeSite\t488262\n优人\t488263\nChaoSimple\t488264\n胃间质瘤\t488265\n金子\t488266\n自评\t488267\n较慢\t488268\n跆拳\t488269\
n布氏\t488270\n1.4D\t488271\n以便\t488272\n灰渣\t488273\nExcel2013单元格\t488274\ni58250u\t488275\n强项\t488276\n天煞\t488277\n香奈儿丝绒\t488278\n钢轨\t488279\n杭州小学\t488280\n信州区\t488281\n抒情版\t488282\n光羽\t488283\n瞎搞\t488284\n小竹炮\t488285\n荣仓奈奈\t488286\n10波\t488287\n女子监狱\t488288\n银光\t488289\nCCTV9\t488290\n旅馆\t488291\ninetpub\t488292\n检徽\t488293\n孙思琦\t488294\n仁者\t488295\n中国金茂控股集团有限公司\t488296\n制动液\t488297\n美文\t488298\n泽泻\t488299\n南普陀\t488300\n7740\t488301\n天馈\t488302\nBeacon\t488303\n五爷庙\t488304\n乘法函数\t488305\n很久\t488306\n浠水新闻网\t488307\n杭州二中\t488308\n绮罗生\t488309\n川上\t488310\n爱歌词网\t488311\n王武\t488312\n程飞\t488313\nbady\t488314\n西餐厅\t488315\nToronto\t488316\n刮破\t488317\n検索\t488318\n切缝\t488319\nhandel\t488320\n新祥旭\t488321\n三七互娱\t488322\n拖拖拉拉\t488323\n五眼\t488324\n赵海军\t488325\nhostid\t488326\n孝星\t488327\n专间\t488328\n凉拌莴笋丝\t488329\n3又1\t488330\n乌龙山剿匪记\t488331\n2016年9月10日\t488332\n盘香\t488333\n陈华亭\t488334\n漏光\t488335\n脓状\t488336\n书花\t488337\n瑞金丽萍\t488338\n马秀\t488339\n联供\t488340\nUBIFS\t488341\n大榭\t488342\n水年华\t488343\n恸哭\t488344\n枪械师\t488345\n刘路辉\t488346\n统战\t488347\n坪山政府\t488348\n脑片\t488349\n马悦凌\t488350\n精通版\t488351\n魏武\t488352\n箍筋加密区\t488353\n兵圣\t488354\n中央军\t488355\n安部未华子\t488356\n利润分配_\t488357\n市委办公厅\t488358\n姚姚\t488359\n硬直\t488360\n第季\t488361\n承若\t488362\n亨斯\t488363\nbilibili\t488364\n症候\t488365\n氨酸\t488366\n全麦面包\t488367\n京瓷fs\t488368\n量筒\t488369\nPOWERPOINT\t488370\n哈弗H2s\t488371\n林城\t488372\n文艺范\t488373\n鲈\t488374\n透景\t488375\n世界睡眠日\t488376\n性技\t488377\n单反相机排名\t488378\n蔚蓝群岛\t488379\n吊顶\t488380\n木蜡\t488381\n2203\t488382\nNurses\t488383\n宗教观\t488384\n招商雍景湾\t488385\n纸坊\t488386\n贵国\t488387\nSOLARZOOM光伏亿家\t488388\n小佩\t488389\n神密\t488390\nNetfilter\t488391\n罗地亚\t488392\n胃寒\t488393\n75ml\t488394\n嘘寒问暖\t488395\n国有建设用地使用权\t488396\nyp\t488397\nretina屏\t488398\n租房子\t488399\n保亭\t488400\n涅瓦河\t488401\n中国煤炭科工集团有限公司\t488402\n兰篇\t488403\n智斗\t488404\ncheungkaming\t488405\n男篮世界杯\t488406\n定弦\t488407\n六秒\t488408\nTNC\t488409\nindex函数\t488410\nEager\t488411\nAJ12\t488412\n42mm\t488413\n耐磨地坪\t488414\n学前班\t488415\
n杨小姐\t488416\n济南港华燃气\t488417\n心恋\t488418\n备案价\t488419\n360u盘\t488420\n阜外心血管病医院\t488421\nMatthew\t488422\nTB\t488423\n暖色调\t488424\nF60\t488425\n雷蛇键盘\t488426\n账证\t488427\n宗亲联谊会\t488428\nlibstdc++\t488429\n曾经的你\t488430\n调入\t488431\n唇形科\t488432\n商论\t488433\n克雷\t488434\n张瑞奇\t488435\norigin8.0\t488436\n常速\t488437\n卷发器\t488438\n签别\t488439\n量子雷达\t488440\n浸膏\t488441\n残影\t488442\n鑫缘\t488443\n鑫苑\t488444\nAT89C51单片机\t488445\n豫青网\t488446\n穿墙螺杆\t488447\n剑网3长歌门\t488448\nSomnus\t488449\n金锋\t488450\nnus\t488451\n创生\t488452\n斯特劳斯\t488453\n铝信\t488454\n葛店\t488455\n新东方在线院校库\t488456\n红谷滩区\t488457\n搞笑笑话段子-喷饭网\t488458\n招行e招贷\t488459\n高度\t488460\nPSO2\t488461\n世界数字图书馆\t488462\n安阳乐居网\t488463\n私照\t488464\n林宥嘉\t488465\n北下\t488466\n200周年\t488467\n地缘\t488468\n本质性\t488469\n沙口\t488470\n5D4\t488471\n瓜果\t488472\n戴文\t488473\n好剧\t488474\n摄影界\t488475\nmulan\t488476\n奇卡\t488477\n深圳海事局\t488478\n5707\t488479\n勃朗特\t488480\nUCB\t488481\nAutomator\t488482\nhates\t488483\nETD\t488484\ndygod\t488485\n才北校\t488486\n拉伊\t488487\n定定\t488488\n队形\t488489\n宝门户\t488490\nconf\t488491\n2430\t488492\nadroid\t488493\n鲸鱼\t488494\n破碎锤\t488495\nPRI\t488496\n口袋妖怪白金光\t488497\n纪念园\t488498\n杨凤池\t488499\n磁棒\t488500\n成贤街\t488501\n前门\t488502\nipega\t488503\n喜忧\t488504\n分明\t488505\n巫蛊\t488506\n背云\t488507\n创建者\t488508\n肾透支\t488509\n化成\t488510\nImageTwist\t488511\nkalilinux\t488512\n非也\t488513\nOKHttp3\t488514\n3d豪情\t488515\n华三H3C\t488516\n鉴权\t488517\nchitu\t488518\n舾装\t488519\nisofix\t488520\n交通事故现场\t488521\n太阁立志传5吧\t488522\nRebellion\t488523\n马当路\t488524\n国鸿\t488525\n光正集团\t488526\n新兴际华集团\t488527\n丹马士\t488528\n简阳人民政府\t488529\n北京奥运村\t488530\n三棒\t488531\nsendmail\t488532\n管式炉\t488533\n三菱工控自动化\t488534\n哈尔滨大剧院\t488535\nXHCI\t488536\n怡保\t488537\n中国中化集团有限公司\t488538\n90吨\t488539\n纯粉\t488540\n育克\t488541\n切利尼\t488542\nuptime\t488543\n纪实剧\t488544\n风云争霸\t488545\n无锡滨湖区\t488546\n完美假期\t488547\n努尔基奇\t488548\n金华市规划局\t488549\n梦幻西游奇经八脉\t488550\n一泻千里\t488551\nIRM\t488552\n订婚宴\t488553\nconfi\t488554\n橘色\t488555\n第一台\t488556\n中国社会科学院世界经济与政治研究所\t488557\n束腹
带\t488558\n读经\t488559\nAscii\t488560\n肌肉型\t488561\n快思聪\t488562\nbargaining\t488563\n第三大\t488564\n老流\t488565\ntesorflow\t488566\n病亡\t488567\n一本大学排名\t488568\nMTR\t488569\n煞费苦心\t488570\n曲谱子\t488571\n多重线性回归\t488572\n64级\t488573\nDism++\t488574\ncompressive\t488575\nSEMI\t488576\n激打\t488577\n莲花寺\t488578\n奶爸\t488579\n_无锡政府\t488580\n顾轻舟司行霈\t488581\n阿墨\t488582\n银泰证券\t488583\nzodiac\t488584\n百利电气\t488585\n华鑫\t488586\nyouthwant\t488587\n上海频道\t488588\n杭人社\t488589\n许甘露\t488590\n法器\t488591\npart1\t488592\nRim\t488593\nDSLR\t488594\nXA2\t488595\n火风\t488596\n红衣\t488597\nMTX\t488598\n恒邦股份\t488599\n俊平\t488600\n恕\t488601\n冰公贞观\t488602\nnancy\t488603\n键鼠导购\t488604\n磁力片\t488605\nWDlinux\t488606\n谢超\t488607\n六险一金\t488608\n南开大学现代远程教育学院\t488609\n又见山里红\t488610\n汉民\t488611\n新华富时\t488612\n三旧\t488613\nTiDB\t488614\n巴彦县\t488615\n质地\t488616\n二十一世纪\t488617\n净利润\t488618\nabo\t488619\n约吗网\t488620\n提税\t488621\n富宁\t488622\n水循环\t488623\n虚言\t488624\n闹春\t488625\n对数变换\t488626\n朗源股份\t488627\nSino\t488628\n老牌\t488629\n青头\t488630\nd810\t488631\nefi\t488632\nbeautifulsoup\t488633\nNicolas\t488634\n蔡司眼镜片\t488635\n七冰\t488636\nCdS\t488637\n多元一次方程\t488638\n骁龙820\t488639\n莱芜战役纪念馆\t488640\n江苏省常州技师学院\t488641\n以案说法\t488642\n指接板\t488643\n活跳尸\t488644\n饼肥\t488645\nE7500\t488646\n尘世间\t488647\n开花不结果\t488648\n失落的方舟\t488649\nfr\t488650\n油化\t488651\n单子叶\t488652\n研究对象\t488653\nsketchup2015\t488654\n三千万\t488655\n抵押人\t488656\n热热撸影音先锋\t488657\n必读\t488658\n海贼王娜美\t488659\n2016年4月5日\t488660\n最上层\t488661\n重新考试\t488662\n杰西卡\t488663\n癸丑日\t488664\n王怜花\t488665\n太阳线\t488666\ncalender\t488667\n银角\t488668\n20160808\t488669\n挤进\t488670\n库包\t488671\nDandy\t488672\n康桥花园\t488673\nSexe\t488674\n女警察\t488675\nInterpersonal\t488676\n本科教学网\t488677\nMessa\t488678\nexposition\t488679\n靳伟\t488680\n照片库\t488681\n我的哥哥\t488682\n海淀幼儿园\t488683\n猎豹Q6\t488684\nsupplier\t488685\n韩家墅\t488686\n56K\t488687\nHYT\t488688\n安慧\t488689\n深龙\t488690\n夏美酱\t488691\n条理性\t488692\n174号\t488693\n兰蔻小黑瓶\t488694\n_街镇\t488695\n黄金乡\t488696\n斯托克\t488697\n雄厚\t488698\n魔域尘缘\t488699\
n潜能\t488700\n方块\t488701\n怪物猎人4\t488702\n王茹\t488703\n范佩西\t488704\nvue1.0\t488705\n切腹\t488706\nGPSspg\t488707\npaperwhite\t488708\n钟杨\t488709\n王大师\t488710\nXCAR爱卡汽车俱乐部\t488711\n全能型\t488712\n凯美热血江湖\t488713\ndiode\t488714\ncruise\t488715\n艾维斯\t488716\nattempts\t488717\n杜鹏\t488718\n白日焰火\t488719\n上海长宁区妇幼保健院\t488720\n星芒\t488721\n黄元申\t488722\n2亿美元\t488723\n伪彩\t488724\nbrakes\t488725\n东台镇\t488726\n张玉浩\t488727\n短小\t488728\n氯磺酸\t488729\nzizhu\t488730\n单线程\t488731\n皮具\t488732\n吐鲁番盆地\t488733\nairx\t488734\n自度\t488735\n延边信息港\t488736\n锁图\t488737\n场\t488738\n说明信\t488739\n参天制药\t488740\n41188\t488741\nForrest\t488742\n硅质\t488743\n后山村\t488744\ntoaster\t488745\nfortran\t488746\n莱阳新闻网\t488747\n合度\t488748\n璀璨夺目\t488749\n泥皮\t488750\n坐便器\t488751\n雪雨\t488752\n乳剂\t488753\n北京市质量技术监督局\t488754\nTransfers\t488755\n花哥\t488756\nPyInstaller\t488757\n今野杏南\t488758\n白龙湖\t488759\nfiltering\t488760\n敬老\t488761\nlongines\t488762\n宝马5\t488763\nPhilippine\t488764\nXMPP协议\t488765\n保修单\t488766\n义乌万达广场\t488767\n中国人民大学财政金融学院\t488768\n夏侯惇\t488769\n训练仪\t488770\n破译\t488771\n日内瓦会议\t488772\n基下\t488773\n校团\t488774\n混元桩\t488775\n黑与白\t488776\n中国载人航天工程办公室\t488777\n用品展\t488778\n双清包税\t488779\n大话2\t488780\n上古卷轴5\t488781\nMongoDb\t488782\n位段\t488783\nCTE\t488784\n电脑主板硬盘接口\t488785\n海上\t488786\n心路历程_考研帮\t488787\n圆带\t488788\nJacqueline\t488789\n2017年7月6日\t488790\n第二十七\t488791\n滋补品\t488792\nFreed\t488793\n冰球\t488794\n笔架\t488795\n草莓印\t488796\nEnviron\t488797\n荔枝纹\t488798\n17种\t488799\n小西沟\t488800\n白说\t488801\n3.6V\t488802\nPDF电子书\t488803\n米拉奇战记\t488804\n劫车\t488805\n新三世情缘\t488806\n北川绘里香\t488807\n假乳\t488808\n多摩川\t488809\n海南自贸港\t488810\n美妈们\t488811\nvane\t488812\nREPLACE函数\t488813\n月魔\t488814\n信步闲庭\t488815\n羽多野\t488816\n深圳欢乐谷\t488817\n烧钱\t488818\n一士\t488819\n斜柱\t488820\n国学机\t488821\n联想yoga720\t488822\n三厢\t488823\nAventador\t488824\n马场路\t488825\n光明农场\t488826\n召稼楼古镇\t488827\n兴高采烈\t488828\n金牛万达广场\t488829\n王志强\t488830\n会员制\t488831\n蓁\t488832\nvenu\t488833\n水刷石\t488834\n巨制\t488835\n梁溪区\t488836\n黄卡\t488837\n杰森·斯坦森\t488838\n反流性食道炎\t48883
9\nCiao\t488840\n茅坪\t488841\n1-3岁\t488842\n加密区\t488843\n硬刷\t488844\nInstinct\t488845\n优贝\t488846\n4399h5游戏-h.4399.com\t488847\n可知子\t488848\n智悦\t488849\n人中\t488850\n两分\t488851\n昌建广场\t488852\n洗衣片\t488853\npytz\t488854\n水环式真空泵\t488855\n2017年4月29日\t488856\n2点\t488857\n车座\t488858\n线阵\t488859\n八荒\t488860\n航车\t488861\nGONZO\t488862\nppt_圈中人寿险资源网\t488863\n狂野角斗士\t488864\n五象广场\t488865\n刘静\t488866\n波浪型\t488867\n24h\t488868\n长安CS75论坛\t488869\nMoney\t488870\ngrpc\t488871\n庠\t488872\n音花\t488873\n劳务员\t488874\n1折\t488875\n快递单号\t488876\n划动\t488877\n应有\t488878\n朋少\t488879\n浙教版八年级\t488880\n鸿坤原乡\t488881\ndnf圣\t488882\n毕业论文问卷\t488883\n曰\t488884\n霞之丘\t488885\n天使之恋\t488886\n职友集\t488887\n指纹机\t488888\n普朗特\t488889\n惠州站\t488890\n3st\t488891\n荷瘤\t488892\n黄锦芳\t488893\n4gb\t488894\n交椅\t488895\n小米note2\t488896\n短学期\t488897\n新魔教\t488898\n滴卡\t488899\n时装画\t488900\npraat\t488901\nprofound\t488902\n奶妈\t488903\n十干\t488904\nBroadcom\t488905\n型材库\t488906\n异形大战铁血战士\t488907\n方法体\t488908\n赫斯特\t488909\n洗染\t488910\n润白\t488911\n普天\t488912\n瘤样\t488913\n童梦奇缘\t488914\n280度\t488915\n黑学\t488916\n田余庆\t488917\n2010年4月\t488918\n伊東\t488919\n凤凰时尚_凤凰网\t488920\n梁子湖\t488921\n小珠\t488922\n咸水鳄\t488923\ncompression\t488924\n第07章\t488925\n2厘米\t488926\n济南市食品药品监督管理局\t488927\n妙想\t488928\n鬼冢虎\t488929\nblkid\t488930\n长投\t488931\n除锈\t488932\nspf\t488933\n碑林博物馆\t488934\n几滴血\t488935\n日本邪恶漫画大全\t488936\n第162\t488937\nDFO\t488938\n库尔斯克\t488939\ng500\t488940\n石城\t488941\n时隙\t488942\n尤克里里谱_ukulele乌克丽丽谱\t488943\n生存之路\t488944\nMessenger\t488945\nINTL\t488946\n无畏舰\t488947\n电脑主\t488948\n西贝柳斯\t488949\n钱梦龙\t488950\nFirebird\t488951\nzhl\t488952\n五十步\t488953\n试吃员\t488954\nADG\t488955\n一饱眼福\t488956\n相似\t488957\nsynthesize\t488958\n退订\t488959\n390.77\t488960\n第47页\t488961\n克莱·汤普森\t488962\n风柜\t488963\nIT168.com\t488964\n苏州工业园区1站式服务中心\t488965\n广告文案策划\t488966\n拉轰\t488967\nSwiftie\t488968\n张迁\t488969\n一级建造师\t488970\n住房按揭贷款\t488971\n奇瑞e3\t488972\n详测\t488973\nTiles\t488974\n荷花塘\t488975\n酷塑\t488976\n采量\t488977\nEX3\t488978\n踉跄\t488979\n59.0\t488980\nreview
ed\t488981\n强暴\t488982\n真火\t488983\nChips\t488984\n汉惠帝\t488985\n睢宁论坛\t488986\n专款专用\t488987\n上海教育考试院\t488988\nPV2\t488989\n中交一公局\t488990\n慈元阁\t488991\n首都医科大学附属北京胸科医院\t488992\n觅健\t488993\nunitypackage\t488994\n水瓶女\t488995\n拉头\t488996\n二日游旅游线路\t488997\n微赚网\t488998\n南农大\t488999\n南日\t489000\n偷情男女\t489001\n第131章\t489002\n两面\t489003\n战刀\t489004\n砂岩\t489005\n搏击术\t489006\n建筑设计规范\t489007\n中龙\t489008\n单立文\t489009\n省国家税务局\t489010\n300634\t489011\nCOC\t489012\nliren\t489013\n天天网\t489014\nadas\t489015\n金万维异速联\t489016\n胜战\t489017\n吕布\t489018\n碰碰球\t489019\nNBA2KOL2\t489020\n大明演义\t489021\n釉里红\t489022\n虢\t489023\n科园大道\t489024\n好旺角\t489025\n柳林风声\t489026\n劝阻\t489027\n100cm\t489028\nglk\t489029\n赵岭\t489030\n铝热反应\t489031\n计米器\t489032\n惠崇\t489033\n完全摧花手册之地狱天使\t489034\n黑龙江省电子税务局\t489035\nMETHODS\t489036\nXiang\t489037\n政府采购法\t489038\n9.2.4\t489039\n烧水\t489040\n南盾\t489041\n电阻器\t489042\n手扶箱\t489043\nBreguet\t489044\nCoordination\t489045\n白果园\t489046\nHuaWei\t489047\n群峦\t489048\n西南科技大学城市学院\t489049\n百合山庄\t489050\nSILK\t489051\nlunt\t489052\n飞奔\t489053\nat&t\t489054\n人教课标版\t489055\n唐立培\t489056\n连云港第一人民医院\t489057\n写信\t489058\nppypp\t489059\n阿尔兹海默\t489060\nBSON\t489061\n刀锋意志\t489062\n起泡器\t489063\nRed_Code\t489064\n东石镇\t489065\n有备而来\t489066\n武汉三环\t489067\n茂业通信\t489068\n1301\t489069\nLASSO\t489070\n软便\t489071\nRHEL\t489072\n阿雄\t489073\n湖南软件职业学院\t489074\n大运\t489075\n4样\t489076\n唇颊\t489077\n齐林\t489078\n反欺诈\t489079\n劲舞\t489080\nv5.60\t489081\nStarboy\t489082\nyoga720\t489083\nELA\t489084\n牡丹江师范学院\t489085\n绿都地产\t489086\n注册商\t489087\n正兴镇\t489088\n16迅雷\t489089\n党产\t489090\n第一标\t489091\nprobe\t489092\n千岁\t489093\njgit\t489094\n嶂\t489095\n修规\t489096\n杨维俊\t489097\ncustomers\t489098\n巴马长寿村\t489099\nWGS84坐标系\t489100\n电脑花样机\t489101\n上海汽车博物馆\t489102\n华科附中\t489103\n圭塘河\t489104\n智研\t489105\n遵义人才网\t489106\n谋复兴\t489107\n多利卡\t489108\n新方志\t489109\n新浪星座\t489110\n比例尺\t489111\n秸秆颗粒机\t489112\n周睿\t489113\n制度史\t489114\n发烧站\t489115\n镇住\t489116\n80px\t489117\nTestng\t489118\nzhiming\t489119\n模式版\t489120\n御灵师\t489121\n殡仪馆\t48
9122\n光机电一体化\t489123\n玫瑰花\t489124\n平面投影\t489125\n595\t489126\n川崎\t489127\n掌柜\t489128\n孙秀英\t489129\naston\t489130\n杀戮地带\t489131\nshutting\t489132\nzuimei\t489133\n第六十五章\t489134\n相图\t489135\nESLint\t489136\n肝门静脉\t489137\n花溪区人民政府\t489138\n儿科护理学\t489139\n朱仝\t489140\nanytime\t489141\n妈妈的朋友\t489142\n片段集\t489143\n张海霞\t489144\n瘠薄\t489145\n忌惮\t489146\n王少峰\t489147\n历\t489148\n3月29号\t489149\n考虑\t489150\n九头龙\t489151\n海淀实验中学\t489152\n达伦布朗\t489153\n36集\t489154\nLotte\t489155\n男亲\t489156\n票根\t489157\n10:10\t489158\n31.2\t489159\npt.2\t489160\n高职高专\t489161\ncnblog\t489162\n国际田联\t489163\n双人分屏\t489164\n金瓶双艳\t489165\n香麓\t489166\n土豆汤\t489167\n切削\t489168\n点评网\t489169\n人非\t489170\nkt版\t489171\n帕特\t489172\n在线测试\t489173\nTOCK\t489174\n凤凰音乐\t489175\n1.5年\t489176\n背椅\t489177\n第21部\t489178\n北京晚报\t489179\n倒桩\t489180\n伦纳德\t489181\nAIS\t489182\n水漂\t489183\n步步高下载中心\t489184\n三益\t489185\n第1096章\t489186\n九家\t489187\n王梦婷\t489188\n喝血\t489189\narchive\t489190\n一路向西2\t489191\n金庸群侠传X1.0\t489192\n中外玩具网\t489193\n二妹\t489194\n10000平米\t489195\n玉翁\t489196\nunipus\t489197\nwift\t489198\n海螺型材\t489199\n成都婚纱摄影\t489200\n最新式\t489201\nMining\t489202\n观海\t489203\n高音哥\t489204\n九尊\t489205\n过压\t489206\nNEI\t489207\n四组\t489208\n罗京姚晨\t489209\n山东公司\t489210\n物种\t489211\n超高音\t489212\n中德产业园\t489213\n交表\t489214\n海蛎\t489215\n探盘\t489216\nFileZilla\t489217\ndateutil\t489218\n凯利\t489219\n铵态氮\t489220\n大行自行车\t489221\n宁王\t489222\n职介所\t489223\n下缘\t489224\nstm32f4\t489225\n哑膜\t489226\nec200\t489227\n肝结节\t489228\n是什么梗\t489229\nノラ\t489230\n一下子\t489231\n初一级\t489232\nqc20\t489233\n可爱型\t489234\n乳白色\t489235\n清风徐来\t489236\ntrimming\t489237\n朱之文\t489238\n闪出\t489239\n起\t489240\n中国游泳协会\t489241\n丝光棉\t489242\n点信\t489243\n打鱼机\t489244\n九十岁\t489245\n健康教育\t489246\n秦阳\t489247\nAOC\t489248\n盖瓶\t489249\n河北省图书馆\t489250\nMATERIALS\t489251\n聚乙烯吡咯烷酮\t489252\n济宁地区\t489253\n兵警\t489254\n撼世者\t489255\n植\t489256\n美我者\t489257\n064\t489258\n2683\t489259\nid号\t489260\n安杰\t489261\n斑马线\t489262\ne8\t489263\n史浩\t489264\n12册\t489265\n国家法官学院\t489266\n乐淘\t489267\n十件套\t489268
\n九联科技\t489269\n漂流者\t489270\n试车场\t489271\n邵峰\t489272\ndevelopmental\t489273\n基建\t489274\n公称通径\t489275\n文体部\t489276\nmean函数\t489277\n江玥\t489278\nPilots\t489279\n吴广\t489280\n锻压\t489281\n诚信建设\t489282\n官局\t489283\n帕蒂\t489284\n流亡\t489285\n海盗船\t489286\n小虾米闯江湖\t489287\n联建光电\t489288\n程瑜\t489289\n丧尸围城\t489290\n第二笔\t489291\n真亦假\t489292\n郑州西\t489293\n杀马\t489294\n蜘蛛开店\t489295\n1.3倍\t489296\nAECC\t489297\n南拓\t489298\n碧昂斯\t489299\n富士拍立得\t489300\n党部\t489301\n丁次\t489302\n我决定\t489303\n1099元\t489304\n9015\t489305\n绍兴兰亭\t489306\n刘晓静\t489307\n_夜神安卓模拟器\t489308\n自己的房间\t489309\n百一\t489310\nujs\t489311\nspringbean\t489312\n涨紧器\t489313\n万科劝学里\t489314\n长沙地铁6号线\t489315\n独白\t489316\n失而复得\t489317\n躲进\t489318\n葱油拌面\t489319\n游戏邦\t489320\n35级\t489321\n5890\t489322\n阿斯托尔福\t489323\n兴工街\t489324\n通化东宝\t489325\nhresult\t489326\n天能\t489327\n舞星\t489328\n推荐\t489329\nscrollrect\t489330\n三七\t489331\nliabilities\t489332\n化工园区\t489333\nDivorced\t489334\n钱斯\t489335\n七载\t489336\n幼儿\t489337\n1762\t489338\n寡人之于国也\t489339\n水循环泵\t489340\n头灯\t489341\n武汉大学医学部\t489342\n热潮\t489343\n陕西省文化厅\t489344\n帮厨\t489345\nsxy\t489346\n碳环\t489347\n三册\t489348\n5.10.0\t489349\n雨婷\t489350\n改卷\t489351\n贫寒\t489352\nnub\t489353\n电解质\t489354\n庆元网\t489355\n关栋天\t489356\n三星S7edge\t489357\n清水河镇\t489358\nlibor\t489359\n傅希如\t489360\n欠款人\t489361\n名剧\t489362\n赵奕然\t489363\n淮南东\t489364\n查查\t489365\n槽位\t489366\n托尼帕克\t489367\n御定佩文韵府\t489368\n镁锭\t489369\nnt\t489370\n存为\t489371\n信达生物制药(苏州)有限公司\t489372\n房山区\t489373\n吉娜\t489374\n独资公司\t489375\n插纸\t489376\nTorch\t489377\nasyn\t489378\n8.cn\t489379\n中异\t489380\n朱音唯\t489381\n白百合\t489382\n凹凸租车\t489383\n说声\t489384\n南苑中学\t489385\n瓦洛兰\t489386\n1千元\t489387\n袄裙\t489388\n苏雪\t489389\nXBA\t489390\n无光\t489391\n戏王\t489392\nfearless\t489393\nraymon\t489394\n雀圣3\t489395\nMechanical\t489396\n大孩\t489397\n新丝路传说\t489398\n上脑\t489399\ngojs\t489400\ndsploit\t489401\n62集\t489402\n650ml\t489403\n班费\t489404\nERP顾问中心\t489405\nreflector\t489406\nvarchar型\t489407\n汪星人\t489408\n兴中广场\t489409\n16所\t489410\n流行美\t489411\n自组装\t489412\n杨澄甫\t
489413\n肥胖纹\t489414\n百香果蜜\t489415\n推理笔记\t489416\nCandice\t489417\n第八十二章\t489418\n行间距\t489419\nMOTU\t489420\n张韬\t489421\n有点\t489422\n7A\t489423\n刘惠璞\t489424\n欲望号快车\t489425\nIncorporated\t489426\n巨影\t489427\n苍月岛\t489428\n亚铁离子\t489429\n益馨\t489430\n靳小雷\t489431\n看门人\t489432\n考察\t489433\nBelow\t489434\nrx8\t489435\n陆港\t489436\n切纸\t489437\n中银\t489438\n由南向北\t489439\n电伴热带\t489440\ntamu\t489441\n中兴力维\t489442\nurea\t489443\n96368\t489444\n戴建荣\t489445\n一键喊话\t489446\n搜索排名\t489447\n筑牢\t489448\n腰椎穿刺术\t489449\nび\t489450\nincremental\t489451\n苹果爱思助手\t489452\n哉\t489453\n汽车保险丝\t489454\n废纸打包机\t489455\n庚子\t489456\nsephora\t489457\n众说十九大\t489458\n特立独行\t489459\narj21\t489460\n难学\t489461\nORA-01555\t489462\n遍及\t489463\n淇\t489464\neditbox\t489465\nSTA\t489466\n禄劝彝族苗族自治县\t489467\n君舞\t489468\n黄莺莺\t489469\n宝来奇骏\t489470\n声张\t489471\n七擒\t489472\n我要自学网\t489473\npouring\t489474\n晁旭东\t489475\n酸汤\t489476\n2255\t489477\n万利电子\t489478\nartifactid\t489479\n发乎\t489480\n梁架\t489481\n玻利瓦尔\t489482\n台前县人民政府\t489483\n珍珠岛\t489484\n冯威\t489485\n新和\t489486\n从机\t489487\n清明雨上\t489488\n异源\t489489\n治污\t489490\n360手机助手电脑版\t489491\n9伏\t489492\n红十字\t489493\n三合院\t489494\n6招\t489495\naxon\t489496\n松涛\t489497\nBabe\t489498\ntriple\t489499\n奇譚\t489500\n葵力果\t489501\n年轻的朋友\t489502\nopc服务器\t489503\n300498\t489504\n智购网\t489505\n醴陵市\t489506\n塔洛斯\t489507\n_股城网\t489508\n宽体\t489509\n蜂蜜水\t489510\n韩庚\t489511\n杜马盖地\t489512\nSpecial\t489513\n宝马S1000RR\t489514\n融资融券标的\t489515\n防护镜\t489516\n600466\t489517\n谢芷蕙\t489518\n聚氨酯发泡\t489519\n印度支那\t489520\nnamed\t489521\n36b\t489522\n王月\t489523\n西溪海\t489524\n藍色\t489525\nhalla\t489526\niPhone8P\t489527\n佳格\t489528\nservu\t489529\n超级电容器\t489530\n84级\t489531\n无锡地铁2号线\t489532\n乐逗\t489533\n盗神\t489534\n岁暮\t489535\n银河护卫队2\t489536\n潘家园街道\t489537\n步森\t489538\n经济技术开发区\t489539\n处女座女\t489540\n1000毫秒\t489541\n内存值\t489542\n一缕缕\t489543\n泛型类\t489544\n1Z0\t489545\n清新区\t489546\n回归模型\t489547\nsum=0\t489548\n双箭\t489549\n0512\t489550\n南昌航空大学科技学院\t489551\nJaybird\t489552\n微\t489553\n恕瑞玛\t489554\n奶机\t489555\nnet2016\t48
9556\n豪礼\t489557\n600867\t489558\n织梦dedecms\t489559\n桃矢\t489560\nMania\t489561\n男朋友们\t489562\namani\t489563\n影音先锋资源_影音先锋电影_先锋影音\t489564\ntaotao\t489565\n宋扬\t489566\n密件\t489567\nAltair\t489568\n柠檬郡\t489569\n旅游公路\t489570\n奇哥\t489571\n企业私有云\t489572\n伊蒂\t489573\ntransfers\t489574\n郑州会展中心\t489575\n鼻酸\t489576\nupupw\t489577\n192.168.1.105\t489578\n花开春暖\t489579\n傅玄\t489580\nLeanote\t489581\n2013-2020年\t489582\n抽沙船\t489583\n旧址\t489584\n默想\t489585\n王丽云\t489586\n当上\t489587\npassages\t489588\n卷文综\t489589\n厦门婚纱摄影\t489590\n蒙古文\t489591\n盘锦港\t489592\n斯巴鲁xv\t489593\n长江大学学报\t489594\njdom\t489595\n深圳市华联欧国际贸易有限公司\t489596\n今来\t489597\nA7III\t489598\n临渊鱼儿\t489599\n汉口路\t489600\n耐电\t489601\n京港澳高速\t489602\n1400年\t489603\n健儿清解液\t489604\n无所作为\t489605\n土澳\t489606\n粿条\t489607\nwenti\t489608\n熟普\t489609\n遗世\t489610\n1.6倍\t489611\n梦东方\t489612\n面部痉挛\t489613\nDR3\t489614\n新曲\t489615\nBATTERY\t489616\n安徽法制报\t489617\n不二家\t489618\n阴阳极\t489619\n仙珍圜\t489620\nCRV\t489621\n一站\t489622\n汇量科技\t489623\nABB变频器\t489624\n零一夜\t489625\n泛起\t489626\n芯片战\t489627\n生中\t489628\n神经节\t489629\n抛\t489630\n封样\t489631\n营业时间\t489632\n依赖\t489633\n俗不可耐\t489634\n菜煎饼\t489635\n宏远\t489636\n许剑\t489637\n面面观\t489638\n半子\t489639\n霍尔果斯市\t489640\n扶贫户\t489641\n佐剂\t489642\n36种\t489643\n建阳市\t489644\n衡阳火车站\t489645\ninteresting\t489646\n应用于\t489647\n霸将三国\t489648\n褶皱处\t489649\nUnofficial\t489650\n悸动\t489651\n涡旋式压缩机\t489652\n5000余\t489653\n民治\t489654\n经年\t489655\n平动周期\t489656\n金牛奖\t489657\n糖尿病足\t489658\n勇争\t489659\n噻苯隆\t489660\n陈化\t489661\nfunc\t489662\nreason\t489663\nvsnprintf\t489664\n第十五回\t489665\n汉丰网\t489666\n55669258\t489667\n惠食佳\t489668\n总产值\t489669\n简爱格妮斯\t489670\n和田市\t489671\n21700\t489672\n漫话\t489673\n程志强\t489674\n12.29\t489675\n零红蝶\t489676\n我的英雄\t489677\nSophie\t489678\n5000\t489679\n曲轴位置传感器\t489680\n嘟嘟队\t489681\n注心\t489682\nyweihainan\t489683\n骨葆\t489684\nUnderwear\t489685\n室内装\t489686\n一社\t489687\ns16\t489688\n清台\t489689\n台湾路\t489690\nq50\t489691\n双雄\t489692\n邮市\t489693\n翼狐\t489694\n彩经\t489695\n牛城晚报\t489696\n新疆卫视\t489697\n诗咏\t48969
8\n约克移动网\t489699\n罄竹难书\t489700\n弑父\t489701\n金匠\t489702\n硬腭\t489703\n永井豪\t489704\n爱到\t489705\n艾迪康医学检验中心\t489706\n关爱八卦成长协会\t489707\n戎装\t489708\nIPA\t489709\n中止\t489710\n豪饮\t489711\n绝命镇\t489712\n白杨网\t489713\nevf\t489714\n吴佳\t489715\n585\t489716\n3&\t489717\n秦时明月之沧海横流\t489718\n可决\t489719\n税收筹划\t489720\n博览会\t489721\n斯蒂芬森\t489722\n三国无双8\t489723\n重骑兵\t489724\nhog\t489725\nman\t489726\ntallest\t489727\n犀牛角\t489728\n十三篇\t489729\nvlone\t489730\n青牛\t489731\nGSP认证\t489732\n意欲\t489733\n绥江县\t489734\n宏里\t489735\n注册监理\t489736\n贾乃亮\t489737\n72式\t489738\n2017年4月12日\t489739\n传智\t489740\n赵一曼\t489741\n招架\t489742\n息事宁人\t489743\n立川\t489744\n石峰\t489745\n马王堆汉墓\t489746\n庚子年\t489747\n训诂\t489748\n吕岩\t489749\n5iMX\t489750\n150部\t489751\n新超级马里奥兄弟2\t489752\n打鸣\t489753\n中心花园\t489754\n油胶\t489755\n手杖\t489756\n盾构法\t489757\nugnx\t489758\n非居民\t489759\n太原市人力资源和社会保障局\t489760\n上24小时\t489761\n疯狂夺金\t489762\n二向箔\t489763\n十三岁\t489764\n风景道\t489765\nArtech\t489766\n国剧盛典\t489767\n中药柜\t489768\n临夏州\t489769\n大武汉\t489770\n后任\t489771\n无谷\t489772\n白金机\t489773\n电压保护器\t489774\n玻璃样\t489775\n司歌\t489776\n东风小康\t489777\n适性\t489778\n债权类\t489779\n疯马皮\t489780\n68路\t489781\n7.64\t489782\n东风裕隆\t489783\n宠友\t489784\nlumerical\t489785\njavascri\t489786\n蓝妮尔\t489787\n水咲萝拉\t489788\nMarket\t489789\n重庆公共运输职业学院\t489790\nyellowcong\t489791\n0.25%\t489792\n主轴箱\t489793\n鱿鱼仔\t489794\n合村并城\t489795\nfaro\t489796\n北京丰台区\t489797\n线架\t489798\n格调网\t489799\n贴片钽\t489800\n中国大运河\t489801\n女恶魔人\t489802\n无修在线\t489803\n龙清泉\t489804\n可畏\t489805\n白蛇\t489806\nDPtech\t489807\ncro\t489808\nbands\t489809\n块毯\t489810\nNginx+Tomcat\t489811\n陆维\t489812\n京包\t489813\n气泡石\t489814\n罗尔德达尔\t489815\n汉城\t489816\n孙萍\t489817\n远大驾校\t489818\n赵英俊\t489819\n重庆租房网\t489820\n经棚\t489821\nshihe\t489822\n偏弱\t489823\n天一论\t489824\n山西省发改委\t489825\nDNF鬼泣\t489826\n硼沙\t489827\n光影\t489828\n值得一看\t489829\n一万名\t489830\n全国青少\t489831\ncreativerse\t489832\n审计信息网\t489833\n蔡智恒\t489834\nSep\t489835\n乌鳢\t489836\n卢克主\t489837\n先锋网\t489838\nstood\t489839\n广州招商银行\t489840\n国家卫生和计划生育委员会\t489841\ngtf\t489842\
n%.2f\t489843\n混班\t489844\n30个小时\t489845\n显示仪\t489846\n排仓\t489847\n纸牌\t489848\n空气质量传感器\t489849\n小小朗读者\t489850\n游戏人生\t489851\n还账\t489852\n琴枕\t489853\n南澳\t489854\n森之国\t489855\n裘援平\t489856\n电贝司\t489857\n雪茄柜\t489858\n车号\t489859\n经济日报社\t489860\nJumpserver\t489861\n定拍\t489862\ndzx\t489863\nsolr6\t489864\n朋辈\t489865\n金桥奖\t489866\n中国菏泽网\t489867\n倒插\t489868\n冠字号\t489869\n顺河镇\t489870\n上百年\t489871\nniuniu\t489872\n宝马互联\t489873\nmaldives\t489874\n扣出\t489875\n红菱\t489876\nglossary\t489877\n优等\t489878\n问字\t489879\n相照\t489880\n上海民办小学\t489881\n郑蒲港\t489882\n切线方程\t489883\n小石桥\t489884\nthymleaf\t489885\n县编办\t489886\n考拉云商\t489887\n本子\t489888\n受胎\t489889\n江苏农商银行\t489890\nr方\t489891\ngatewayworker\t489892\nintr\t489893\n交通运输工程学院\t489894\n天才枪手\t489895\nserlvet\t489896\n还款\t489897\n海口港\t489898\n固安县政府\t489899\n皮肤划痕症\t489900\n白条鸡\t489901\n会计核算制度\t489902\n艺恩\t489903\n王一涵\t489904\n湖南省肿瘤医院\t489905\nwanwan\t489906\n刀石\t489907\n五年级叙事作文\t489908\nSuit\t489909\n中山小区\t489910\n五图\t489911\n卓艺\t489912\nTights\t489913\n猿\t489914\nthreejs\t489915\n南姜\t489916\natx\t489917\n纱衣\t489918\n卡屏\t489919\n6.29\t489920\n鲍鱼壳\t489921\n朱戬\t489922\nR2012a\t489923\ndragon\t489924\n田中\t489925\n摆地摊\t489926\n番石榴\t489927\n小康路\t489928\n千朵万朵\t489929\n朝阳街\t489930\nAztek\t489931\n大斌\t489932\n上川\t489933\nmmo\t489934\n纳氏试剂\t489935\nvivaldi浏览器\t489936\n拉缸\t489937\nWin10系统盘\t489938\n唐斯\t489939\n武战道\t489940\n海口市人民医院\t489941\nmachining\t489942\n第85号\t489943\n周航\t489944\n加蓬\t489945\n交官\t489946\n唏嘘\t489947\n幼升小试题\t489948\nrsu\t489949\nwww.utdallas.edu/~muratk/courses/crypto06_files/crypto\t489950\n浙教版小学\t489951\nlibpq\t489952\n谷饶镇\t489953\nchurchill\t489954\n金一南\t489955\n中后卫\t489956\n3.4.5\t489957\n上海联合产权交易所\t489958\n蝉\t489959\n十八日\t489960\n桑葚酒\t489961\nPFS\t489962\nwin8pe\t489963\n月半小夜曲\t489964\n福清市\t489965\n痧\t489966\n麦汁\t489967\n坝体\t489968\n武侠剧\t489969\n变种人\t489970\n深圳市医院\t489971\nTeXstudio\t489972\nTP-link路由器\t489973\n苦瓜炒蛋\t489974\nWindows10\t489975\n餐条\t489976\n活雷锋\t489977\n7775\t489978\nCNSS\t489979\nscons\t489980\n王选\t489981\n双创板\
t489982\n偏振镜\t489983\nIC设计\t489984\n死神vs火影吧\t489985\n家纺\t489986\n静安雕塑公园\t489987\n邦阅\t489988\n俗世\t489989\n摇匀\t489990\npanasonic\t489991\n新展网\t489992\nk站\t489993\ncfpl\t489994\n符\t489995\n宁远\t489996\n透骨\t489997\n711路\t489998\n金达莱\t489999\nconstant\t490000\nsatin\t490001\n窗气\t490002\nthr\t490003\n老司机们\t490004\n脍炙人口\t490005\n人中北斗\t490006\n仲村星虹\t490007\nKEYone\t490008\n残渣\t490009\nstrips\t490010\n120路\t490011\n圣魔光石\t490012\n读文\t490013\n仙居神仙居\t490014\n破风\t490015\n2018年4月18\t490016\n一条心\t490017\n酚酞片\t490018\n郑国\t490019\n卢梭\t490020\nogn\t490021\n1.5ml\t490022\n广东新闻联播\t490023\ncrimaster\t490024\n自然死亡\t490025\n乾颐堂\t490026\nFX2N\t490027\n同案犯\t490028\nPHP+MySQL\t490029\n湖南出入境检验检疫局\t490030\n9k9k手游网\t490031\n北沙参\t490032\n王晓平\t490033\n三湘\t490034\n亚铁氰化钾\t490035\nsolgan\t490036\n背诗\t490037\n怜惜\t490038\n洞口县\t490039\n富含\t490040\niff\t490041\n精细化工有限公司\t490042\n续期\t490043\nMean\t490044\nWebSocket\t490045\n彩陶坊\t490046\n献上\t490047\n董越\t490048\n03:00\t490049\n军事基地\t490050\nchuntian\t490051\n奔驰S400L\t490052\n无码色\t490053\n炫耀性\t490054\n茶祖\t490055\n人命\t490056\n正电\t490057\ncontents\t490058\n中原西路\t490059\n菜根\t490060\n五蕴\t490061\n六六六\t490062\n中至\t490063\n贫者\t490064\n蝰蛇\t490065\n浓度计\t490066\n病毒疹\t490067\n脂砚斋\t490068\n菱帅\t490069\n明尼苏达森林狼队\t490070\n恒大御景湾\t490071\n第一板\t490072\n浮动元素\t490073\n漏水\t490074\n省广股份\t490075\n交换空间\t490076\n新奥法\t490077\n评核\t490078\n天巡酒店\t490079\n水上公路\t490080\n不止\t490081\n盗火线\t490082\n39期\t490083\n如是\t490084\nQtcreator\t490085\nGWW\t490086\n乌斯怀亚\t490087\n科四\t490088\n杨超越\t490089\n線\t490090\n胡鞍钢\t490091\n天正t20\t490092\ngraphis\t490093\n线盘\t490094\n红河人才网\t490095\n房产资\t490096\nHoop\t490097\n石凳\t490098\n自动变速箱油\t490099\n漱口液\t490100\n2018年5月1日后\t490101\n20170216\t490102\n中国无锡·无锡市人民政府\t490103\n有意无意\t490104\n泰佛\t490105\njian\t490106\n风场\t490107\n禽蛋\t490108\n君逸\t490109\n刑事\t490110\n2010-2020年\t490111\n慕贝尔\t490112\n三人间\t490113\nXP框架\t490114\n熊氏\t490115\n莲坂\t490116\ntpl\t490117\n金龟婿\t490118\n治疗型\t490119\n益达广告\t490120\n泰迪犬\t490121\nDocuCentre-IV\t490122\n美英\t490123\nIteration\t490124\n学典\t49
0125\n12V2A\t490126\n刘连昆\t490127\n骶骨\t490128\nFairyGUI\t490129\n文莱\t490130\n柳州火车站\t490131\n宝马7系论坛\t490132\n瑞昱\t490133\n盛一伦\t490134\n剑石\t490135\n007\t490136\n资本项\t490137\n北京野生动物园\t490138\n猪八戒\t490139\n巴黎和会\t490140\nFineBI\t490141\n涟源市\t490142\n一读\t490143\n店店\t490144\npior\t490145\n孤岛飞鹰\t490146\nGirlMP3\t490147\n望穿\t490148\n中华护理杂志\t490149\n酒鬼\t490150\n铜车马\t490151\n6天后\t490152\nBalm\t490153\nSHENG\t490154\nR1200GS\t490155\n不器\t490156\n强弱项\t490157\n搭伴\t490158\n马銮湾\t490159\n拳王\t490160\nprepay\t490161\n桥西区\t490162\nMongol\t490163\n王燊超\t490164\n快客\t490165\nskull\t490166\nBAMBOO\t490167\n日月星辰\t490168\n私影\t490169\n跑毒\t490170\n并没有\t490171\n130名\t490172\n十大品牌网\t490173\nVS2003\t490174\n线段\t490175\n慈溪市\t490176\n半桥式\t490177\nPoser\t490178\n120GB\t490179\n两天\t490180\n真情\t490181\n交通工程学\t490182\n成千上万\t490183\n粉馆\t490184\n闭月羞花\t490185\n创意在线\t490186\n虽\t490187\n美人计\t490188\n统一集团\t490189\nele\t490190\nX9Plus\t490191\nreliability\t490192\n孽\t490193\n这个价\t490194\n零时\t490195\n夜雨寄北\t490196\n影楼\t490197\n迅雷|-老片网\t490198\n秦梦瑶\t490199\n涡量\t490200\n超细粉\t490201\n桃园新村\t490202\nESR\t490203\n凌烟阁\t490204\n6.0000元\t490205\n阿勒泰地区\t490206\n技术派\t490207\n百万富翁\t490208\n疏桐\t490209\nhibernate\t490210\n战列巡洋舰\t490211\n脱壳破解区\t490212\n00067\t490213\n矢车菊\t490214\nbrink\t490215\nresultmap\t490216\nm4a4\t490217\n吴荻\t490218\n中古品\t490219\n13.1.3629\t490220\n2份\t490221\n檀云坤\t490222\n1ms\t490223\n鲁班奖\t490224\n宜秀\t490225\n隆国强\t490226\nipn8710防腐钢管\t490227\n已经完成\t490228\nfootlocker\t490229\n塔伦米尔\t490230\nbiaozhun\t490231\n高质\t490232\n承包工程\t490233\n刀语\t490234\nbz2\t490235\n云谷学校\t490236\n普世\t490237\n虎王坦克\t490238\nnivea\t490239\n胸壁\t490240\n艾丽希斯\t490241\n45001\t490242\n摩尔\t490243\n独立书店\t490244\n事业版\t490245\nnaike\t490246\n朝贡\t490247\ninternation\t490248\n人体穴位图\t490249\nyxy\t490250\n互通有无\t490251\n奖拟\t490252\n林燕\t490253\n变更登记申请书\t490254\n黑光\t490255\n酒泉职业技术学院\t490256\n遐迩\t490257\n嫁入\t490258\n谢富治\t490259\n范_\t490260\nserf\t490261\n张继学\t490262\n包皮手术\t490263\n蛛网\t490264\nstent\t490265\n中国移动广西公司\t490266\n黄宁\t490267\n张泽华\t490268\n美
好世界\t490269\n盖山镇\t490270\n巧夺\t490271\n9501\t490272\n日拍\t490273\n花蕊\t490274\n福建省海洋与渔业厅\t490275\n武汉市小学\t490276\n中文系\t490277\n录音电话\t490278\n连续自然数\t490279\n落羽杉\t490280\n终结者\t490281\n鄞州新闻网\t490282\n好工作人才网\t490283\nmdpi\t490284\n村屯\t490285\nnide\t490286\nText3注册码\t490287\ntbl\t490288\n莫蒂\t490289\n惊惧\t490290\n河长办\t490291\n枢零\t490292\n水基型\t490293\n臂力\t490294\n罗田天堂寨\t490295\n兴安县人民政府\t490296\n锁区\t490297\n三心\t490298\nMATURE\t490299\n闫德利\t490300\n卖报歌\t490301\n磁卡\t490302\n8股\t490303\n拓\t490304\n追寻\t490305\nGallery\t490306\n马翔\t490307\n校妓\t490308\nvueJs\t490309\n1.3.0\t490310\n驷马桥\t490311\n大光路\t490312\n人大附中朝阳学校\t490313\n江南万达广场\t490314\nM5A97\t490315\n优游网密室逃脱\t490316\n招蜂引蝶\t490317\nzippo\t490318\n湖北省新闻出版广电局\t490319\n产前筛查\t490320\n特种兵之霹雳火\t490321\nMontreal\t490322\n宁聚\t490323\n翠湖公园\t490324\n伊斯曼\t490325\n凤倾天下\t490326\n2400万\t490327\n杭州市地方税务局\t490328\n艾叶粑粑\t490329\nmethod\t490330\n空下\t490331\n猎场\t490332\n红娘\t490333\n示录\t490334\n暂时性\t490335\nDucati\t490336\n钱塘江\t490337\n段纯\t490338\nCortex-A9\t490339\n打动\t490340\n肄业生\t490341\nEPL\t490342\n爱丽丝梦游仙境2:镜中奇遇记\t490343\n花荄\t490344\nchasing\t490345\n500次\t490346\n范远\t490347\n植物大战僵尸2\t490348\n张含韵\t490349\n防盗链\t490350\n黄埔村\t490351\nGTAOL\t490352\n运行商\t490353\n俞峰\t490354\nGorilla\t490355\n兴坪古镇\t490356\n111届\t490357\n小戏骨八仙过海\t490358\n闲书\t490359\nExtJs\t490360\n尔雅通识\t490361\nc4droid\t490362\n千叶\t490363\nmac输入法\t490364\n周长\t490365\n浮动式\t490366\n罗氏血糖仪\t490367\nSymfony2\t490368\n企业核心竞争力\t490369\n20150125\t490370\n旧友\t490371\n爱图仕\t490372\n5pin\t490373\nDanlis\t490374\n重蚁\t490375\n辣椒苗\t490376\n锦绣华庭\t490377\n520M\t490378\n0.41\t490379\n科技紫微星座网\t490380\n麦芯粉\t490381\n泉州市中级人民法院\t490382\n泳衣\t490383\n阮郎\t490384\n林野\t490385\n重氮化\t490386\n黄全\t490387\ncontoller\t490388\n滇池度假区\t490389\nhlj\t490390\n云南日报网\t490391\n锚文本\t490392\n麒麟花园\t490393\nPS4/XB1\t490394\n成兰铁路\t490395\n进屋\t490396\n唐骏东方不败\t490397\n魔禁\t490398\n提请\t490399\n宗室\t490400\niphone8P\t490401\nPodcast\t490402\n欲擒故纵\t490403\nupload\t490404\n郑三\t490405\n热处\t490406\nxw\t490407\n总经\t490408\n椰子油\t490409\n润肤露\t49
0410\nl358\t490411\n一仓\t490412\n求物\t490413\n284\t490414\n李莹莹\t490415\nFaster-Rcnn\t490416\n感情戏\t490417\nXMPP\t490418\n足坛\t490419\n蛋形\t490420\nCairns\t490421\n凸焊\t490422\nBPA\t490423\n富斯i6\t490424\n痴梦\t490425\n100台\t490426\n重铸版\t490427\n整肠生\t490428\nDNN\t490429\nphalcon\t490430\n韦加\t490431\n锯末\t490432\n51位\t490433\n强县\t490434\n河南法制报\t490435\n情难自禁\t490436\n10#\t490437\n蓝天大厦\t490438\n乘法器\t490439\n深圳市儿童医院\t490440\n滴剂\t490441\n六七岁\t490442\n莫顿\t490443\n上海地铁4号线\t490444\n刮痕\t490445\n爱靓网\t490446\n孟婆\t490447\nAVL\t490448\ngbd\t490449\nMandarin\t490450\n温瑞塘河\t490451\n沈阳地铁1号线\t490452\nIFO\t490453\n穆乙\t490454\n金球\t490455\n吴圩\t490456\n中国商业银行\t490457\n廖老师\t490458\nacdc\t490459\n可瑞康\t490460\n阿里斯托芬\t490461\n中国邮政小包\t490462\n飞驰\t490463\nmvc3\t490464\n内蒙专技继续教育\t490465\nSimplicity\t490466\n热血少年\t490467\n第55章\t490468\n母羊\t490469\n苏见信\t490470\n果腹\t490471\n张江集电港\t490472\nSHOEI\t490473\n下意识\t490474\n工作原则\t490475\n第二十五期\t490476\n9.5.1.0\t490477\nNEWS\t490478\n54颗\t490479\n3g网\t490480\ndongruiha\t490481\n阿城\t490482\n仙女楼变装反串\t490483\n最上\t490484\n甘道夫\t490485\n校队\t490486\nmanila\t490487\nwsn\t490488\nmolearner\t490489\nv2.8.3\t490490\n安格鲁\t490491\n李林甫\t490492\n一个人\t490493\n王晔\t490494\n托管班\t490495\n怪医圣手\t490496\n壮硕\t490497\n抱得美人归\t490498\n童丽\t490499\n巍峨\t490500\n十九世纪末\t490501\n跃进\t490502\n飞呀飞\t490503\nYONG\t490504\nar-2048s\t490505\n生者\t490506\naa片\t490507\n轻柔\t490508\n受雇\t490509\n爱信6at\t490510\ntuxedo\t490511\n夹攻\t490512\n上海地区\t490513\n一分钟前\t490514\n4.5G\t490515\n艾美奖\t490516\n192.168.8.1\t490517\n固定卡\t490518\nB2C\t490519\nsilhouette\t490520\n海南省图书馆\t490521\nscroller\t490522\n边沿\t490523\n安检\t490524\nPC板\t490525\nLifeMP3\t490526\n无适\t490527\n习近平时代\t490528\n列王\t490529\n魔兽世界大灾变\t490530\n射月\t490531\n破碎群岛\t490532\n不粘锅\t490533\nCARS\t490534\n思量\t490535\n优先权日\t490536\n独酌\t490537\n第二十三\t490538\n勇闯\t490539\n红高\t490540\n醋酸泼尼松片\t490541\ndice\t490542\n恐案\t490543\n伪装术\t490544\n打招呼\t490545\n东方金钰\t490546\n600点\t490547\n四本\t490548\n峰汇在线\t490549\nFeedback\t490550\n苏婷\t490551\n北京旅行\t490552\n招考办\t490553\n三疑三探\t4905
54\nStudio2017\t490555\n逸龙\t490556\n离线翻译\t490557\n灯谜\t490558\n广东太阳神集团有限公司\t490559\n宁荣荣\t490560\nchuli\t490561\nFlac\t490562\n吉祥草\t490563\n中国好声音第四季\t490564\n既往症\t490565\nvps服务器\t490566\n城六区\t490567\nsqlserve\t490568\n花筑连锁酒店\t490569\n搜索盘\t490570\nSeo\t490571\n2.5G\t490572\nWorkshop\t490573\nsqr\t490574\n贵州省政协\t490575\nindicators\t490576\nROBOT\t490577\n红耳\t490578\nPDD\t490579\n九鲤溪\t490580\n张家堡\t490581\n71万\t490582\n潮阳实验学校\t490583\n羚羊谷\t490584\n薄荷网\t490585\n600666\t490586\n星河控股\t490587\n周总理\t490588\n第178集\t490589\n2435\t490590\n中介网\t490591\n烘焙\t490592\nsport\t490593\n因果报\t490594\nownCloud\t490595\n束手就擒\t490596\n玻尔兹曼\t490597\n中国电子科技集团公司第十四研究所\t490598\n海勃湾区\t490599\n架构师\t490600\n新三板在线\t490601\nxk\t490602\n倍受\t490603\n宋老六\t490604\nlitt\t490605\n集成吊顶灯\t490606\n哨戒\t490607\n藤编\t490608\nsadp\t490609\ncompat\t490610\n被囚\t490611\nanders\t490612\n王树华\t490613\n雷克雅未克\t490614\n京哈高速\t490615\n定义块\t490616\nBlaze\t490617\n741\t490618\n出现\t490619\n龛世\t490620\n5多\t490621\n音板\t490622\n丛稿\t490623\nuncensored\t490624\n3036\t490625\n人主\t490626\n孙骁骁\t490627\n阳光大厦\t490628\nbisai\t490629\n322\t490630\n中共中央网络安全和信息化委员会办公室\t490631\n头毛\t490632\n训练馆\t490633\n脚踏车\t490634\n飞牛网\t490635\n频点\t490636\ncle\t490637\n锦州中学\t490638\n祁月笙\t490639\n新北\t490640\n2O\t490641\n玉润\t490642\n标婷\t490643\n天乐湖\t490644\n六价\t490645\n电吉他吧\t490646\nA4988\t490647\n靠谱助手\t490648\n星洲\t490649\n震后\t490650\nVirtualenv\t490651\nLON\t490652\n河北日报\t490653\n气表\t490654\n数字大写转换器\t490655\npresets\t490656\n最佳位置\t490657\nbxj\t490658\n花城版\t490659\n荒凉\t490660\n組\t490661\n冷轧钢\t490662\n馅儿\t490663\nzscore\t490664\n120级\t490665\n东北区\t490666\n点到为止\t490667\n坏事\t490668\niOS9分屏\t490669\n贡酒\t490670\n2016年6月30日\t490671\n溪木镇\t490672\n蔡剑\t490673\n傅俊杰\t490674\n胡三省\t490675\n肝胆相照\t490676\n坎特伯雷大学\t490677\n脑干梗塞\t490678\n公理\t490679\n健康\t490680\n宫腔内\t490681\n激似\t490682\n西尔酱\t490683\n欧震\t490684\n试验班\t490685\n安鑫\t490686\n崩坏3rd\t490687\n生存\t490688\n离子交换树脂\t490689\ng类\t490690\n多山\t490691\n陷波器\t490692\n团聚\t490693\n唯真\t490694\n绿地公园城\t490695\n石家庄日报社数字报\t490696\n000961\t4
90697\n孝心无价\t490698\n马薇薇\t490699\n对房\t490700\nhampton\t490701\nmotherfucker\t490702\n63万\t490703\n德意龙\t490704\n路卡利欧\t490705\n蜡烛灯\t490706\n重组\t490707\n角美\t490708\n一囧\t490709\n乱馆\t490710\n锦华路\t490711\n涂漆\t490712\nYYeTs\t490713\n能动性\t490714\n绝地求生全\t490715\n金屋\t490716\n呼吸系统\t490717\nvanishing\t490718\n马叔叔\t490719\n蒙面唱\t490720\nHealthineers\t490721\n笔神阁\t490722\n研途宝考研网\t490723\n上海分所\t490724\n推拉门\t490725\n大梅沙\t490726\n万发\t490727\n幻剑\t490728\n中国储运杂志社\t490729\n01:00\t490730\nIperf\t490731\n小山村\t490732\n丰谷\t490733\n叔丁醇钠\t490734\n麾下\t490735\n千头万绪\t490736\nAbbreviation\t490737\n美素力\t490738\n小志传奇\t490739\n荷泽\t490740\n餐员\t490741\n基底节\t490742\n上实集团\t490743\n青春剧\t490744\n代偿性\t490745\nButcher\t490746\n历经沧桑\t490747\n观山悦\t490748\n天纵\t490749\n8份\t490750\n2分钟左右\t490751\n鞠躬\t490752\nVerbal\t490753\n西塘镇\t490754\n全Amiibo\t490755\n岭南学院\t490756\n全面战争:战锤\t490757\n良医\t490758\n韩伟\t490759\nre模块\t490760\nstudioone\t490761\nputStream\t490762\n易安音乐社\t490763\n描点\t490764\n速刷\t490765\n第多少\t490766\n洛阳便民网\t490767\n中药房\t490768\nTexas\t490769\nshower\t490770\n开炮\t490771\n_众悦学车网\t490772\n叹为观止\t490773\n12123交管网\t490774\n曽\t490775\nxVideo\t490776\nSNH48集团\t490777\n专八\t490778\nSMB\t490779\n5008\t490780\n满仓\t490781\n28G\t490782\n101本\t490783\n明兰传\t490784\n苏怡贤\t490785\nafnetworking\t490786\nExt4\t490787\n南天门\t490788\n宏杉\t490789\n热水箱\t490790\n评价性\t490791\n商都路\t490792\n拓展员\t490793\n励\t490794\n夏秋篇\t490795\n偶发\t490796\n痘皮\t490797\n顺水推舟\t490798\n艾德曼\t490799\ndenim\t490800\n宝地广场\t490801\nXming\t490802\n凯斯西储大学\t490803\n吴卫东\t490804\n印展\t490805\n二月十九\t490806\n天脉\t490807\n川沙中学\t490808\n行贿犯罪档案\t490809\n金钱之味\t490810\npep8\t490811\n冲刺\t490812\n招商公园\t490813\n蛇精\t490814\n350_\t490815\nassc\t490816\ng9250\t490817\n惭愧\t490818\n辅助线\t490819\n自然博物馆\t490820\n玉器\t490821\nSOHO\t490822\n宣宣\t490823\napologize\t490824\n伍家格格\t490825\n四次元\t490826\n愁士\t490827\n14.04.3\t490828\n家常\t490829\n第一批\t490830\n半决赛次\t490831\nadjustable\t490832\nGPS导航仪\t490833\n艺术字网\t490834\n月卡\t490835\n101\t490836\n2.4.8\t490837\n毕业典礼\t490838\n郭振斌\t490839\n艾比森\t490
840\nADFS\t490841\n笛卡儿\t490842\n杜可风\t490843\n上海打折网\t490844\n香港六合彩公司\t490845\n婴儿包\t490846\n蜜蜜\t490847\nzol\t490848\n石油币\t490849\n100套\t490850\n9万元\t490851\n安卓9.0\t490852\n常州市中级人民法院\t490853\n军事\t490854\n嘉林\t490855\ne乐彩\t490856\n清纯\t490857\nProcedures\t490858\n自卑\t490859\n百丽广场\t490860\n華人\t490861\n涂附\t490862\n哥伦比亚特区\t490863\n艾艾贴\t490864\n税务机\t490865\n背运\t490866\nN1S/N1\t490867\n房产预售证\t490868\n双轨制\t490869\n奇然\t490870\n教育家\t490871\n成都直管区\t490872\n金庸群侠5\t490873\nVR公司\t490874\ndurable\t490875\n间奏\t490876\nese\t490877\n求与\t490878\nbbin\t490879\n李行亮\t490880\n8平方\t490881\nAML\t490882\n质量责任\t490883\n扎囊县\t490884\n查获\t490885\n火鹤\t490886\n小谈\t490887\n拉文霍德\t490888\n金寨县\t490889\nFANS\t490890\n德道经\t490891\n夏威\t490892\nmyecplise\t490893\n天官少年三国志\t490894\n永兴\t490895\nel34\t490896\n妞儿\t490897\n荞麦茶\t490898\nshel\t490899\n乳晕\t490900\n明光村\t490901\n交字\t490902\n26例\t490903\n地下城与勇士17173\t490904\n110_\t490905\n人民代表大会\t490906\nOpener\t490907\n晨曦月\t490908\n三点\t490909\n神田\t490910\n一百万种\t490911\nlepin\t490912\n自卸半挂车\t490913\n店址\t490914\nshell_exec\t490915\n2680\t490916\n星籁\t490917\n批发商\t490918\n技术师\t490919\n胸徽\t490920\nmingzi\t490921\n蜜丸\t490922\n双硫仑样反应\t490923\n电子式互感器\t490924\n香槟金\t490925\nLuma\t490926\n泛水荷塘\t490927\n迈巴赫s600\t490928\nFlume-about\t490929\n_v2.0安卓\t490930\nLOEWE\t490931\n王江涛\t490932\n未来一周天\t490933\n宾得相机\t490934\n天龙蓝鸟\t490935\n300058\t490936\n初夜\t490937\n印第安纳大学\t490938\n班吉拉\t490939\n被烧毁\t490940\n圣阳蓄电池\t490941\nthehunter\t490942\n周家村\t490943\n驻马店市\t490944\n赵惟依\t490945\n栩\t490946\n天安财产保险股份有限公司\t490947\n燕塘乳业\t490948\n上错\t490949\n杭州分公司\t490950\ndahe\t490951\nFreemarker\t490952\n精修版\t490953\n桑基\t490954\n吉焦\t490955\n二厅\t490956\n创新工作室\t490957\n护罩\t490958\n那场戏\t490959\n调经\t490960\nandroid8\t490961\n叶子猪神武\t490962\nLanzhou\t490963\nActionScript\t490964\n待用\t490965\n陆俭明\t490966\n商战\t490967\n4321s\t490968\nsntp\t490969\n采制\t490970\n莆田系医院\t490971\n盒饭\t490972\n蹚\t490973\n武次\t490974\n五山\t490975\n9.9分\t490976\nV4.0.0\t490977\n云岭\t490978\n股线\t490979\n满堂\t490980\n14.4%\t490981\n库里\t490982\n带花\t49098
3\n_号\t490984\n自驾车路\t490985\naromatic\t490986\n不含有\t490987\nmysql8\t490988\n莱斯利\t490989\n国上\t490990\nMathematics\t490991\n共线\t490992\nTkinter\t490993\n行事\t490994\n304所\t490995\n陈黎\t490996\n聚氯化铝\t490997\n王沛\t490998\n四联疗法\t490999\n能源类\t491000\n庇护所\t491001\n自闭式\t491002\n李建波\t491003\n龚蕾\t491004\nfavourite\t491005\n求余\t491006\n何婷婷\t491007\n王晓琳\t491008\nPathfinder\t491009\n一展\t491010\n比甲\t491011\n跳台\t491012\n神意\t491013\n斗战神龙女\t491014\n2017.5.1\t491015\n长春电影制片厂\t491016\nOverflow\t491017\n失魂落魄\t491018\n135级\t491019\npromotional\t491020\n点划线\t491021\nn40\t491022\nzhming\t491023\n复检\t491024\n哈尔滨新闻网\t491025\n拳拳\t491026\n淋浴柱\t491027\n营口银行\t491028\n沉浸感\t491029\n绿庭\t491030\n好言\t491031\n四方山\t491032\nCactiEZ\t491033\n可不可行\t491034\n灵鬼\t491035\nzcs\t491036\nFlower\t491037\n易奇八字\t491038\n黄建军\t491039\n无锡先导智能装备股份有限公司\t491040\n庆源\t491041\n525\t491042\n演示稿\t491043\n0762\t491044\n80070422\t491045\n胸径\t491046\n刘锦泽\t491047\n铁胆火车侠\t491048\n俞振飞\t491049\nTableLayout\t491050\n轻二\t491051\n40回\t491052\n叶默\t491053\n公主府\t491054\n和顺县\t491055\n急性脑梗死\t491056\nCOOPER\t491057\n上蔡一高\t491058\n泽野弘\t491059\n华融赖小民\t491060\n中国有色金属工业协会\t491061\n丹徒\t491062\n中国水力发电工程学会\t491063\n流片\t491064\nromantic\t491065\nOceanBase\t491066\n厦门市委组织部\t491067\nwxpython\t491068\n上古5重制版\t491069\n阿里小号吧\t491070\n何毅\t491071\nAnya\t491072\n阿甘本\t491073\nwenzhou\t491074\ntouring\t491075\n安乐窝\t491076\n五代史伶官传序\t491077\n宝贝DJ音乐网\t491078\n旭川\t491079\n彬哥\t491080\n德力西\t491081\n刘英\t491082\n三木木\t491083\n装修\t491084\n箩筐\t491085\npussysaga\t491086\n随便吧\t491087\n广东省工程技术研究中心\t491088\n浙江省电力有限公司\t491089\nS350\t491090\n焚身\t491091\n周易测名\t491092\n军案\t491093\ndxracer\t491094\n灌肠机\t491095\n马林巴\t491096\n王哥哥哥哥\t491097\n华夏福\t491098\nPrison\t491099\n转向盘\t491100\n1967\t491101\n侧滑\t491102\n新钱\t491103\n宁玛派\t491104\n吹牛大王历险记\t491105\n永磁体\t491106\n平林\t491107\n水宫\t491108\n筛选出\t491109\ncir4超脑词汇\t491110\nSlime\t491111\n东南大学附属中大医院\t491112\nABAQUS\t491113\nmovie\t491114\n2018-03\t491115\n指导\t491116\n薛其坤\t491117\n熹妃传吧\t491118\n蓝靛\t491119\n酯基\t491120\n浣纱\t491121\n旭东\t491122\n
人民英雄\t491123\n福昕软件\t491124\n农业机械化信息网\t491125\n電影\t491126\n2018-03-27\t491127\n卖不动\t491128\n南岗村\t491129\n好喜欢\t491130\n_块\t491131\n97平\t491132\n邵康节\t491133\n冷文\t491134\n长城影视\t491135\nRina\t491136\n调通\t491137\n斗志\t491138\n何云\t491139\n张光\t491140\n最后一面\t491141\n不孕不育症\t491142\n三越百货\t491143\n海上世界文化艺术中心\t491144\nsmartupload\t491145\n南昌酒店\t491146\n色环\t491147\n图吧公交网\t491148\n五菱荣光论坛_汽车之家论坛\t491149\nwwww\t491150\n前海壹号\t491151\n许辉\t491152\n药物临床试验质量管理规范\t491153\nzss\t491154\n朝夕网\t491155\n安徽财经大学商学院\t491156\n友妻\t491157\n23.6\t491158\nTrolley\t491159\n讳\t491160\n糗\t491161\n外贸业务员\t491162\n伪音\t491163\n甘肃省国家税务局\t491164\n同名主题曲\t491165\n东海县\t491166\n熊版\t491167\n20000泰铢\t491168\nnsdictionary\t491169\nBuf\t491170\n劫\t491171\n联翔\t491172\n第88章\t491173\n42页\t491174\n3等\t491175\n_句解霸\t491176\nFotos\t491177\n百卉\t491178\n南海镇\t491179\n海豚出版社\t491180\n郵便\t491181\n西红柿汤\t491182\n糖酒会\t491183\n拖入\t491184\n理论片\t491185\n裕华东路\t491186\n信永中和\t491187\n一三五\t491188\n百度知识\t491189\n戴姆勒奔驰\t491190\n2017年12月17日\t491191\n制暖\t491192\n8038\t491193\nICP网站备案管理系统\t491194\nanoconda\t491195\nMulan\t491196\n贝特\t491197\n集料\t491198\n微访谈\t491199\nwapi\t491200\nakira\t491201\ncrazytalk\t491202\nCamp\t491203\n红外峰\t491204\n伏见司\t491205\n乐山\t491206\n白额高脚蛛\t491207\nWeb2.0\t491208\n官长\t491209\n败\t491210\n金泰\t491211\n黑斑蛙\t491212\n何志强\t491213\n蛮人\t491214\n凤岭\t491215\n时刻准备着\t491216\n銷\t491217\n罐顶\t491218\n6.6万\t491219\n博视\t491220\n任脉\t491221\nsala\t491222\n不渝\t491223\n故事片\t491224\n通货膨胀率\t491225\n哑铃凳\t491226\ndown.qingkan\t491227\n主山\t491228\n90千米\t491229\n小湖\t491230\n西冷\t491231\n江苏旅游职业学院\t491232\n解带\t491233\n复制器\t491234\n刘仪伟\t491235\n厦门市妇幼保健院\t491236\n留言板\t491237\n母伦\t491238\n索泰\t491239\n中国医药报\t491240\n4月7号\t491241\n地坑院\t491242\n仙桃\t491243\n预备队\t491244\n小泽マリア\t491245\nadfs\t491246\n8007\t491247\n设计师们\t491248\n李菲菲\t491249\n宝山路街道\t491250\n正常值\t491251\n80t\t491252\n黄宗羲\t491253\n目录栏\t491254\niphone8_\t491255\n盛华\t491256\n9th\t491257\n淡水河谷\t491258\n9005\t491259\nITSS\t491260\n金地天悦\t491261\n加盟连锁网\t491262\n担担面\t491263\n20170518\t491264\n提权\t49
1265\n普安县\t491266\n22平米\t491267\n被淘\t491268\nxiyang\t491269\n邮亭\t491270\n动产融资统一登记系统\t491271\nbravia\t491272\n北大口腔医院第二门诊部\t491273\nunesco\t491274\n宁波植物园\t491275\n南充火车站\t491276\n中华联合保险\t491277\n灭失\t491278\n卖弄\t491279\n万科未来城三期\t491280\n东睦股份\t491281\n人的价值\t491282\n京东金融集团\t491283\n零钱罐\t491284\n乐至宝\t491285\n天派\t491286\nsprings\t491287\nfortran77\t491288\n流浪歌\t491289\nIncentive\t491290\n50多天\t491291\n美女人\t491292\nAw\t491293\n二级建造师注册管理系统\t491294\n52路\t491295\n20141107\t491296\nInterviews\t491297\nCOCOS2D\t491298\n洛阳市国土资源局\t491299\nIGE\t491300\ngRPC\t491301\n索罗\t491302\n新势力\t491303\nPolycom\t491304\n2010年度\t491305\n国际刑警组织\t491306\nquantification\t491307\n化工进展\t491308\n3.6.5rc1\t491309\n徐雅\t491310\n如梭\t491311\nyao\t491312\nBMD\t491313\n天力\t491314\n公积金管\t491315\n美管\t491316\nUNET\t491317\ndefinición\t491318\n唇珠\t491319\n朱女\t491320\n3.5分\t491321\n屏山县人民政府\t491322\n杭师大\t491323\nsucker\t491324\n东台市政府\t491325\n现代汉语\t491326\n清偿\t491327\n最简\t491328\nConformity\t491329\n超字\t491330\n叶脉\t491331\n怠政\t491332\nBrian\t491333\n严实\t491334\n吟游\t491335\n选座\t491336\n爱玛\t491337\n谢宴\t491338\n快走开\t491339\n名解\t491340\n炎龙\t491341\n一附\t491342\n腹股沟疝\t491343\n脚椅\t491344\n深圳海外国际旅行社\t491345\n一根筋\t491346\n说不上来\t491347\n12本\t491348\n糜烂性\t491349\n金小鱼\t491350\n秋山\t491351\n蓝魔之泪\t491352\ntsk\t491353\n福州市城乡建设委员会\t491354\nin在线翻译\t491355\n青贮机\t491356\n30件\t491357\nvps\t491358\n不干胶贴\t491359\n紫微\t491360\neec\t491361\nsrl\t491362\n火眼\t491363\n桑乐\t491364\n花样\t491365\n燃石\t491366\nKatie\t491367\nlibtiff\t491368\nⅦ\t491369\n翻仓\t491370\n吸合\t491371\n瓮城\t491372\n大渡口镇\t491373\n劳合社\t491374\n美团滴滴\t491375\n虐文小说吧\t491376\nXML格式\t491377\n慧骃国\t491378\n征稿\t491379\n28亿美元\t491380\n日笠阳子\t491381\n蒋蒋\t491382\n冠脉造影\t491383\n足\t491384\n178元\t491385\n婚途陌路\t491386\nhibernate4\t491387\n一丈红\t491388\n三生三世菩提劫\t491389\n考工\t491390\n實測\t491391\n马斯洛\t491392\n211&\t491393\n广汽传祺GS4\t491394\n河北区\t491395\n戮\t491396\nDiv\t491397\n农林\t491398\n海能达\t491399\n徐千雅\t491400\n创口贴\t491401\n北汽福田\t491402\nInnis\t491403\n角度看\t491404\nh网\t491405\n邮储手机银行\t491406\n拉丁\t4
91407\n布包\t491408\n调查处\t491409\nv7.1.0\t491410\nenrich\t491411\n碧姬\t491412\nQuotations\t491413\n少儿画苑\t491414\na380\t491415\n银河soho\t491416\nGames\t491417\n激战2超级冒险盒子\t491418\n浦外\t491419\n公祭轩辕黄帝网\t491420\nGF\t491421\n邮政编码查询\t491422\n工程商\t491423\n肠腔\t491424\n黑客攻防\t491425\nanimus\t491426\n白沙溪\t491427\n雷殿生\t491428\n泾阳县人民政府\t491429\n9倍\t491430\nemmet\t491431\n官微\t491432\n朱昌镇\t491433\nweb-info\t491434\n铁佛\t491435\n老司\t491436\n包玉刚\t491437\n萍水相逢\t491438\n法三国\t491439\n小人\t491440\n绿色发展基金会\t491441\n1.2.11\t491442\n眼线\t491443\nguoyu\t491444\njQueryfuns\t491445\n乐云\t491446\n混水阀\t491447\n生产队\t491448\n魔法科高校的劣等生\t491449\n市场篇\t491450\n事务\t491451\n赶忙\t491452\n渣攻\t491453\n中南电力设计院\t491454\n桥\t491455\n卫人\t491456\n2月15日\t491457\n西游记插曲\t491458\n街镇\t491459\n听歌学英语\t491460\n中南\t491461\n粮机\t491462\n康佳集团\t491463\n60-70年代\t491464\n涂色\t491465\n开平水口\t491466\n哈密市伊州区政府\t491467\n活塞机\t491468\n延边大学出版社\t491469\n腰斩\t491470\n明节\t491471\n江苏省农业科学院\t491472\n2016-05\t491473\n商榷\t491474\n九虫\t491475\n熊培云\t491476\nwww.ag188.com\t491477\n此人\t491478\nm70\t491479\n饲\t491480\n公猴\t491481\n氰尿酸\t491482\n破损\t491483\n3注\t491484\n导航树\t491485\n000元\t491486\npsim\t491487\n最弱无败的神装机龙\t491488\n塑料盘\t491489\n恶女\t491490\n2013下半年\t491491\nSuperWORKS\t491492\n奎文区\t491493\n公粮\t491494\n双马\t491495\n晋级赛\t491496\n那就这样吧\t491497\ndisambiguation\t491498\n老笠\t491499\n弱电工\t491500\noxidative\t491501\n赵钱孙李\t491502\n陕政\t491503\n臂章\t491504\n2500ml\t491505\n靳东费德勒\t491506\nUCOSIII\t491507\n小萨\t491508\n福州市国土资源局\t491509\n局面\t491510\nMount\t491511\n10折\t491512\n悻悻\t491513\n被撤职\t491514\n雨鱼\t491515\ncarlos\t491516\n六年级下册语文\t491517\n水晶虾\t491518\n长盛不衰\t491519\n3步\t491520\n万达贷\t491521\n清扬\t491522\nustb\t491523\nconcatenate\t491524\n暖冬\t491525\ntreepanel\t491526\n成杰\t491527\n莱山\t491528\n健盛集团\t491529\n坠物\t491530\ntimeScale\t491531\n精创\t491532\n国家食品药品监督管理总局保健食品审评中心\t491533\n小波系数\t491534\n华成\t491535\n运动队\t491536\n水煮蛋\t491537\n极数\t491538\n等额\t491539\n王继平\t491540\n鼎盛广场\t491541\n第十周\t491542\n马卡连柯\t491543\n艺术风格\t491544\n四季锦\t491545\nCapacity\t491546\n开门彩\t491547\
npsp2000\t491548\nJ2ME\t491549\n下用\t491550\nDNS服务器\t491551\n高尔夫论坛\t491552\n挡位\t491553\n凤山路\t491554\nweixin\t491555\n棘\t491556\nElisa\t491557\nki\t491558\n陈泽锋\t491559\n单马\t491560\n车邦\t491561\n山海秘闻录\t491562\n神盾\t491563\n走掉\t491564\n好丽友派\t491565\n鸡泽县\t491566\nEmbroidery\t491567\nq235b\t491568\n规格化\t491569\n輝\t491570\nhnz\t491571\n下次\t491572\nKatyPerry\t491573\n维修期\t491574\n5950\t491575\n行尸\t491576\n憨豆特工\t491577\nbjut\t491578\nHamachi\t491579\nRECORD\t491580\n家口\t491581\n流质\t491582\n步步惊婚\t491583\n99.com\t491584\nISPO\t491585\nCOE\t491586\n希腊人\t491587\n锡壶\t491588\n丽水市建设局\t491589\n昨天晚上\t491590\nRADWIMPS\t491591\n样筛\t491592\n喀什机场\t491593\ngla200\t491594\n刘律师\t491595\n硅锰合金\t491596\n160\t491597\n合肥野生动物园\t491598\n莲花山\t491599\nrumble\t491600\n火牛\t491601\n韩村\t491602\n豪苑\t491603\nGIMI\t491604\nXnView\t491605\ndameware\t491606\n包青天\t491607\n归整\t491608\n准现房\t491609\n沈阳自动化所\t491610\n有希望\t491611\n美久广场舞\t491612\n住在杭州网\t491613\n各地方\t491614\nCp\t491615\n结算期\t491616\n王文生\t491617\n东田洋实业(中山)有限公司\t491618\n许耀桐\t491619\n三泉\t491620\n白鸟\t491621\n概算\t491622\n白矾\t491623\nWonder\t491624\n2.4.7\t491625\n1.15.0\t491626\n佳莱\t491627\n加载页\t491628\n监管员\t491629\n小姑贤\t491630\n高德导航车机版\t491631\nrmd\t491632\nquartusii\t491633\n飞板\t491634\ndtl\t491635\n袭\t491636\n莫泰尤纳斯\t491637\n张镇\t491638\nCBS\t491639\nH3CNE\t491640\nregulator\t491641\n19.9\t491642\n莱芜都市网\t491643\n小罐\t491644\n穆婷婷\t491645\n富阳新闻网\t491646\n2万元\t491647\n罗敏\t491648\nbiot\t491649\n永乐村\t491650\n时区\t491651\n中票在线\t491652\ndhl快递\t491653\n达邦\t491654\n马丁斯\t491655\n湖北卫视\t491656\n中国人民大学网络教育学院\t491657\n快图\t491658\n珠江投资\t491659\nrecordings\t491660\n上官\t491661\n62页\t491662\n咒语\t491663\n第二世\t491664\n西安交通大学图书馆\t491665\n对外开\t491666\n浙江省教育考试院\t491667\ndkp\t491668\n分析家\t491669\n宝通寺\t491670\n10MB\t491671\n牙轮\t491672\n雪铁龙C4L\t491673\n有利有弊\t491674\n第四十一期\t491675\n信丰县人民政府\t491676\n四英杰\t491677\nRTC\t491678\n無法\t491679\n孕周\t491680\n班章\t491681\n明天幼儿园\t491682\n石科院\t491683\n方略\t491684\nSqlAlchemy\t491685\ncommunicator\t491686\n2dsll\t491687\n铅山县\t491688\n煤层\t491689\nnavic
ate\t491690\n鼠男\t491691\n妈咪\t491692\n四川省八一康复中心\t491693\n代工\t491694\n宁波工程学院\t491695\nSM3\t491696\n那个人\t491697\n古河渚\t491698\n10w-40\t491699\n韩鹏飞\t491700\n求导\t491701\n银河城\t491702\nMissile\t491703\n蒋芸\t491704\n三同时\t491705\n讲理\t491706\n逆位\t491707\n乐伶\t491708\n干海\t491709\n201501\t491710\n不当\t491711\n内山\t491712\n0.5s\t491713\n五新\t491714\n德高瓷砖胶\t491715\n氧化氢\t491716\n校准器\t491717\n怪兽\t491718\n小皮手游网\t491719\n趟\t491720\n小猪佩奇简笔画\t491721\n皖\t491722\n第一大\t491723\n用钱宝\t491724\n榻榻米\t491725\n上自习\t491726\n于谦\t491727\n百微\t491728\n黔东南\t491729\nHARTING\t491730\n蠢新\t491731\nemail\t491732\n勇者归来\t491733\nA33\t491734\n标态\t491735\n吸纳\t491736\nsd高达g世纪\t491737\n紫石英\t491738\nsorrow\t491739\nRPG\t491740\nLighter\t491741\n翻斗车\t491742\n万国广场\t491743\n魅男\t491744\nproductions\t491745\nlovers\t491746\n一舞\t491747\nreinforcement\t491748\ndaomul\t491749\n假章\t491750\na360\t491751\n拼缝\t491752\n寒冷地区\t491753\n钢爪\t491754\n檀香湾\t491755\n奥点云\t491756\n机巧\t491757\n张家港市政府\t491758\n小冰冰传奇_九游论坛\t491759\n绕线\t491760\n趣学车\t491761\n江宁开发区\t491762\n常旅\t491763\nSTM32Cube\t491764\n四个字\t491765\n税款\t491766\n一年之内\t491767\n胫骨\t491768\n15GB\t491769\n厦门机场\t491770\n他汀\t491771\n精爱\t491772\n杨天\t491773\n火山灰\t491774\n方解石\t491775\n苏州大学应用技术学院\t491776\n邵东跳\t491777\n管\t491778\n黄皮子\t491779\nSHR\t491780\n人民团体\t491781\n600104\t491782\nsides\t491783\n5.4万\t491784\n硬膜外麻醉\t491785\n祥和小区\t491786\n大政\t491787\n龙千玉\t491788\n说不出\t491789\n陶业\t491790\n商主\t491791\n贵城市\t491792\n电荷\t491793\n货真价实\t491794\n加权平均法\t491795\n易德\t491796\nkar98k\t491797\n重归\t491798\n指数\t491799\ntomca\t491800\n003号\t491801\n聚德\t491802\n巩晓彬\t491803\n鹃城\t491804\n临沂汽车站\t491805\n床垫子\t491806\n乘车\t491807\n息影\t491808\nKYC\t491809\n亡故\t491810\n巨齿鲨\t491811\n减肥贴\t491812\n原木\t491813\n批发价格\t491814\n上海交通大学继续教育学院\t491815\n魏来\t491816\n短评\t491817\nRESORT\t491818\n乌鲁木齐市教育局\t491819\n板间\t491820\n口信\t491821\n华盛通\t491822\n忍耐\t491823\n铁蹄\t491824\n五凤楼\t491825\nSinica\t491826\n珊海王\t491827\n醉逍遥\t491828\n2016-10-21\t491829\n李金华\t491830\n晴耕\t491831\n福语\t491832\n防垢\t491833\n杨文军\t491834\n三金\t491835\n汶川地震\t491836
\nSQLCODE\t491837\n工商网\t491838\n乳畜业\t491839\nalone\t491840\n4K显示器\t491841\n车篮\t491842\n科力远\t491843\n清仓\t491844\n技术层\t491845\nrecognize\t491846\n信力\t491847\n背图\t491848\n金蛇狂舞\t491849\n易企秀\t491850\n零级\t491851\n霞光\t491852\n我和我的家\t491853\n惯习\t491854\n凌钢股份\t491855\n金乡镇\t491856\n幼片\t491857\n剧星\t491858\n警械\t491859\n圣诞季\t491860\n伏羲氏\t491861\n6.1.2\t491862\n仿包\t491863\n国家医学考试网\t491864\n东胜集团\t491865\n竹柏\t491866\nskylar\t491867\n土力学\t491868\n拉群\t491869\n珍本\t491870\n烤兔\t491871\n徐姓\t491872\n杳杳\t491873\n营养粉\t491874\n轻松日\t491875\n日加\t491876\n王馨\t491877\n河南省检察院\t491878\nReboot\t491879\n任意球\t491880\n软件化\t491881\nrobert\t491882\n磨痕\t491883\n传人\t491884\n艾梅乙\t491885\n周知卡\t491886\ndix\t491887\nbrett\t491888\n广州市白云区人民政府\t491889\n尹涛\t491890\n膳房\t491891\nsql函数\t491892\n良辰大魏\t491893\n杨和\t491894\n第33次\t491895\nKylin\t491896\n午夜12点\t491897\n[史\t491898\n20160707\t491899\n41001\t491900\nFAT\t491901\n2018年1号\t491902\n断言\t491903\n139路\t491904\nパイ\t491905\n暗杀\t491906\n老钱庄百宝箱\t491907\n李春光\t491908\nMakers\t491909\n杨庄乡\t491910\npulseaudio\t491911\n王长江\t491912\nRSI指标\t491913\n建服\t491914\n苦主\t491915\n骑\t491916\nMandy\t491917\n糖浆\t491918\n短时\t491919\nmotan\t491920\nwindows图片查看器\t491921\nLending\t491922\n百度影音-一网\t491923\n自由女神\t491924\n回向偈\t491925\nm0\t491926\n心理科\t491927\n第二眼\t491928\n偿付\t491929\n剧院\t491930\n诺丁汉\t491931\nPostal\t491932\n西涯侠\t491933\n华祥\t491934\n64个\t491935\n头条号指数\t491936\n垦\t491937\n张文远\t491938\n林铭\t491939\n第103集\t491940\n性爱区\t491941\n赛级\t491942\nIT采购网\t491943\n切牌\t491944\n北京市检察院\t491945\n南宁站\t491946\n上域\t491947\n海岩\t491948\nimplant\t491949\n佛山分公司\t491950\n香逢\t491951\n烤鸡蛋\t491952\n小豹\t491953\n准提咒\t491954\n航天九院\t491955\n华南师大\t491956\n黑魂\t491957\n中华第一龙\t491958\n人造革\t491959\nUcinet\t491960\nBLL\t491961\n找借口\t491962\nmou\t491963\n无间道风云\t491964\n内蒙古财经大学\t491965\n尼古拉\t491966\n省直工委\t491967\nrg\t491968\n丝毫\t491969\n恩利\t491970\n赵康\t491971\n皮囊\t491972\n贵州都市报数字报\t491973\n20km\t491974\n开落\t491975\n百香果苗\t491976\n主怪\t491977\n输出器\t491978\n成都纺织高等专科学校\t491979\n肉机\t491980\n火字\t491981\n学徒制\t491982\n诗眼\t4919
83\n阿里鱼\t491984\n茶花\t491985\n大杠\t491986\n我和爸爸\t491987\n密封件\t491988\nmangodb\t491989\n安乡\t491990\n氙\t491991\n素砼\t491992\n伊通县\t491993\n成山\t491994\ngaming5\t491995\n陈氏宗祠\t491996\n趣链\t491997\nbiology\t491998\n文思海辉\t491999\n17万亿\t492000\n参展证\t492001\n0456\t492002\n催收员\t492003\neod\t492004\napk格式\t492005\n提箱\t492006\nvenous\t492007\n3442\t492008\n相似性度量\t492009\narcade\t492010\n划水\t492011\n陈江河\t492012\ndbdao\t492013\n太阳马戏团\t492014\n奥妙\t492015\nPEST\t492016\n中国福利彩票发行管理中心\t492017\n19分钟\t492018\n伊卡洛斯\t492019\n三目童子\t492020\n过滤泵\t492021\n迪卡侬\t492022\n温德姆酒店集团\t492023\n紧锁\t492024\n永磁变频空压机\t492025\n舒克贝塔\t492026\n奋斗者\t492027\n海星星\t492028\n245万\t492029\n玉宇\t492030\n金虹\t492031\n德国牧羊犬俱乐部\t492032\n暴风影音播放器\t492033\nplp\t492034\n综合论\t492035\n一六八\t492036\n周大福\t492037\n东方纤云\t492038\n品牌家纺网\t492039\n公伤\t492040\n氧化反应\t492041\n张茹\t492042\n杨国亭\t492043\n蓝凤凰\t492044\n高家村\t492045\nミュ\t492046\n20160722\t492047\n优优漫画\t492048\n流水\t492049\n职测\t492050\n创尔美\t492051\n人勤\t492052\n国际和平妇幼保健院\t492053\n必经之路\t492054\nBD国英\t492055\n虞梦\t492056\n社恐\t492057\n潮声\t492058\n音乐之都维也纳\t492059\n影评人\t492060\n文心阁\t492061\n江西铜业\t492062\n2017年11月30日\t492063\n中時電子報\t492064\n163个\t492065\n铜像\t492066\n空腹\t492067\n输卵管癌\t492068\n更衣间\t492069\n爱软客\t492070\n孙晶\t492071\n月&#160\t492072\n油电混动车\t492073\n骷髅岛\t492074\n3千万\t492075\n大串\t492076\n原作\t492077\n保俶塔\t492078\n专享\t492079\n东北育才学校\t492080\n2半\t492081\n奥菲\t492082\n维纶触摸屏\t492083\n京东offer\t492084\n美丽园\t492085\n德格\t492086\n于伟\t492087\n师资\t492088\n教学片\t492089\n实训室\t492090\n莱茵兰\t492091\n福田康夫\t492092\n天麻\t492093\n肺动脉栓塞\t492094\n唯医\t492095\n学术史\t492096\n枕头套\t492097\n耸立\t492098\n中控员\t492099\n张建东\t492100\nGenesis\t492101\n快巴\t492102\n搜索引擎排名\t492103\n纹包\t492104\n开机后\t492105\n真眼\t492106\n33位\t492107\n五区\t492108\nAudit\t492109\nVantara\t492110\n宁波市统计局\t492111\n英国\t492112\n旁观\t492113\n廠\t492114\n魔霸\t492115\n王凤山\t492116\n债\t492117\n桃溪镇\t492118\n玻璃房\t492119\n狮子湖\t492120\n暖房\t492121\nconnecting\t492122\n才艺\t492123\n火焰探测器\t492124\nMua\t492125\n若何\t492126\n丙安古镇\t492127\nAape\t492128\n淙淙\t492129\ny720\t4
92130\n盏灯\t492131\n警署\t492132\nSEO技术博客\t492133\n另辟蹊径\t492134\n2千亿\t492135\nAVnight\t492136\n转正定级\t492137\n斜式\t492138\n桂林公园\t492139\nFuneral\t492140\n油田\t492141\n八宝茶\t492142\n杭州市政府\t492143\n430分\t492144\nt5000\t492145\n基孔制\t492146\n南水\t492147\n4月1日后\t492148\n票仓\t492149\n黄冈市\t492150\n16x16\t492151\n公孙二狗\t492152\n4趟\t492153\n干红葡萄\t492154\n55W\t492155\n兴仁\t492156\n打轮\t492157\nlushi\t492158\n焊烟\t492159\n中梁国宾府\t492160\n无线传感网\t492161\n刘元\t492162\nTI公司\t492163\n道歉\t492164\n聚众淫乱\t492165\n别离\t492166\n攀枝花市人力资源和社会保障局\t492167\nkirakira\t492168\n聚灿光电\t492169\n茶枯\t492170\n点火器\t492171\n潜力股\t492172\n20170225\t492173\n河北省卫生计生委\t492174\n艾西\t492175\n处决\t492176\n洋兰\t492177\nSHEEP\t492178\n李晓\t492179\n合纵连横\t492180\n造化弄人\t492181\n中国石化加油站\t492182\n鳞状上皮细胞\t492183\ndtpicker\t492184\n汪叽\t492185\n前十位\t492186\n草金鱼\t492187\n东方启音\t492188\n泰州路\t492189\n损失函数\t492190\n20161207\t492191\n三元牛奶\t492192\n北京国丹医院\t492193\n10Pro\t492194\n盈利率\t492195\n风正扬帆\t492196\n打更\t492197\n戛然而止\t492198\n名产\t492199\n上海美术学院\t492200\n展子虔\t492201\n南方教育时报\t492202\n东方锆业\t492203\nmacfee\t492204\nDTMF\t492205\ninhibitory\t492206\n七果\t492207\n福星\t492208\n管理版\t492209\n德胜\t492210\n高德顺\t492211\n扎手\t492212\nwx+\t492213\n赵方\t492214\n党派\t492215\n6.77\t492216\n华阴老腔\t492217\n900集\t492218\n国定路\t492219\n运通卡\t492220\ni5-3470\t492221\n石狮日报\t492222\n爆响\t492223\ndjiango\t492224\n肾肿瘤\t492225\n偏相关系数\t492226\n燐月\t492227\n帝国霸业银河\t492228\n上海第十人民医院\t492229\n电离度\t492230\n男人手\t492231\n姻缘\t492232\nCole\t492233\n天舟\t492234\n惠化洞\t492235\n45米\t492236\n健安喜\t492237\n活颜\t492238\n杜衡\t492239\n河南省农业厅\t492240\n法拉利恩佐\t492241\n荒野之息\t492242\nellen\t492243\n生化汤\t492244\n师范学校\t492245\n句容政府网\t492246\nPatr\t492247\n佛山高新区\t492248\n知几\t492249\n冷藏\t492250\n烛焰\t492251\n罗伊斯\t492252\n0行\t492253\nConsole\t492254\nsearched\t492255\n三国志13吧_\t492256\nAI素材\t492257\n真夜\t492258\nsz68\t492259\nwindows2016\t492260\n小房间\t492261\nlbj15\t492262\n巢湖半岛\t492263\n鲱鱼\t492264\n可享\t492265\n首頁|\t492266\n药剂师\t492267\n太妹\t492268\n香油\t492269\n象牙\t492270\n搜罗网\t492271\n绮美\t492272\nXDS\t492273\nRDM\t
492274\n大方\t492275\n五等\t492276\n八论\t492277\n街女\t492278\n天谴流\t492279\n76人\t492280\n杨搏\t492281\n600mm\t492282\n绿野户外网\t492283\n破位\t492284\nOFFSET函数\t492285\n丹\t492286\n中投保\t492287\n辛伐他汀\t492288\n非货币\t492289\npKa\t492290\n开街\t492291\n黄腔\t492292\n亳州市教育局\t492293\n辛安镇\t492294\n秘史\t492295\n抽查\t492296\n蓝图机\t492297\nincall\t492298\n天天不在\t492299\n配饰\t492300\n陪护\t492301\n钉宫理惠\t492302\nsludge\t492303\n信立泰\t492304\n木桶浴\t492305\n斥候\t492306\n感人肺腑\t492307\n预感\t492308\nsystem分区\t492309\nnonlocal\t492310\npuk码\t492311\n水便\t492312\ngt430\t492313\n误诊率\t492314\n在职研招网\t492315\n毓\t492316\n疯狂看图猜成语\t492317\n贵州电视台\t492318\n中森\t492319\n朱丽萍\t492320\n芒砀山\t492321\nSakura\t492322\n流子\t492323\n50万美元\t492324\n飞升\t492325\n高压包\t492326\n潮妈\t492327\n主动脉\t492328\n羞答\t492329\n华为麦芒5\t492330\n再次\t492331\n朱峰\t492332\n杨式\t492333\n皮下气肿\t492334\n1392\t492335\n底标\t492336\n三山五园\t492337\n勒浆\t492338\n山东水利职业学院\t492339\n萌照\t492340\n板式换热机组\t492341\n陪伴\t492342\n四调\t492343\n机封\t492344\n浇灌\t492345\n韬韬\t492346\n班服\t492347\nConfigurator\t492348\n窑变\t492349\n西直门店\t492350\n流泪\t492351\n苑子豪\t492352\n暨南大学出版社\t492353\n中华人民共和国广告法\t492354\n超级战兵\t492355\n213路\t492356\n海南省农村信用社联合社\t492357\n67位\t492358\nMeta\t492359\n盛世嘉园\t492360\n阿金\t492361\n新分区\t492362\nbenefits\t492363\nservic\t492364\n落地式\t492365\nw10\t492366\n沙希\t492367\n高频头\t492368\n音质\t492369\n股票质押融资\t492370\n乔楚\t492371\n上访\t492372\n美美与共\t492373\n505\t492374\nformer\t492375\nSpoon\t492376\n棉铃虫\t492377\nroscore\t492378\n华北五省\t492379\n杜江\t492380\nbovenson\t492381\nBboom\t492382\n镜像源\t492383\nappsecret\t492384\n挨个\t492385\n通惠\t492386\n小望\t492387\n实打实\t492388\nbw_0927\t492389\n各负其责\t492390\nlottery\t492391\n90周年\t492392\nxiaoyang\t492393\n好羡慕\t492394\n古典\t492395\nhtml5+css3\t492396\n松坂南\t492397\n2018.4.3\t492398\n出材率\t492399\n盐包\t492400\n激光喷码机\t492401\nddk\t492402\n一秀\t492403\n发热管\t492404\n京唐港\t492405\n齐山\t492406\n天耀\t492407\n动画库\t492408\n古话\t492409\nbtbt\t492410\n江南省\t492411\n_【香哈网\t492412\nspain\t492413\nM7450F\t492414\n钢琴键\t492415\n88平\t492416\n面向对象编程\t492417\n广东自考网\t492418\n广州
琶洲会展中心\t492419\nlollipop\t492420\n欧成\t492421\n叶绿\t492422\n禁忌证\t492423\n再生胶\t492424\n忍俊不禁\t492425\n针状焦\t492426\n亅\t492427\n世尊\t492428\n1.5亿元\t492429\n铁锅\t492430\n相应\t492431\nRegarding\t492432\n信用险\t492433\nmathlab\t492434\n陈再道\t492435\n0904\t492436\n诗坛\t492437\n西山风景区\t492438\n海船\t492439\n魔兽争霸3混乱之治\t492440\n毕沙罗\t492441\n武庙\t492442\n离线盒子\t492443\n4号\t492444\n百田柯南圈\t492445\n千兆网\t492446\n冰沙\t492447\n耳卿\t492448\n入道\t492449\nX7\t492450\n3000公里\t492451\n1080P超清\t492452\n机兽\t492453\n胸女\t492454\n比比\t492455\nmethyl\t492456\n广口\t492457\n茶道六君子\t492458\n帆布袋\t492459\n伊东绘理\t492460\ncang\t492461\n六瓣\t492462\ndarry\t492463\n暗牧\t492464\n上海迪士尼度假区\t492465\n科举\t492466\n地机\t492467\n中国农行\t492468\n遇阻\t492469\njules\t492470\n疏通车\t492471\n还送\t492472\n手提袋\t492473\n清康\t492474\n烧烤网\t492475\n核糖\t492476\n几十吨\t492477\nCDs\t492478\nCDB\t492479\n【炫\t492480\n三国战纪风云再起\t492481\npublicagent\t492482\n老玩童\t492483\nykyun\t492484\n万达酒店\t492485\nL级Megatron\t492486\n加加\t492487\n腚\t492488\n格列\t492489\n减速器\t492490\n黄武\t492491\n钓鱼的启示\t492492\n面皮\t492493\nmapx\t492494\n安息香\t492495\n音乐教师\t492496\n饱尝\t492497\n上城市\t492498\n五芯电缆\t492499\n500本\t492500\nJrs\t492501\n新妈妈\t492502\n乐者\t492503\n登月\t492504\n以太坊私链\t492505\n幡动\t492506\n宇轩\t492507\n违建\t492508\n新昌县\t492509\n7004\t492510\nakk\t492511\n灯杆\t492512\n三佳\t492513\ndatebox\t492514\n伺服编码器\t492515\n新加坡政府\t492516\n四叠\t492517\n久栖\t492518\n康隆\t492519\n缩手\t492520\n微信平台\t492521\n任程伟\t492522\n徽商\t492523\n凤飞飞\t492524\n花生酱\t492525\n三片罐\t492526\n天喻\t492527\n佘北家园\t492528\n对唱版\t492529\n连键\t492530\n彩虹6\t492531\nstaggered\t492532\n临澧县人民政府_临澧县政府\t492533\nspider\t492534\n十多家\t492535\n雪谷\t492536\n工中级\t492537\n威廉希尔\t492538\n充电式\t492539\n苏沪\t492540\n翠微股份\t492541\n日本公司\t492542\n爱音麻里亚\t492543\n鄂尔多斯东胜区\t492544\n胶合\t492545\n茶余饭后\t492546\n亚硝胺\t492547\n何氏\t492548\n卫生队\t492549\nmostly\t492550\n黑武士\t492551\n退台\t492552\n偶像\t492553\n公益时报网\t492554\n电动晾衣架\t492555\n39家\t492556\n徇私枉法\t492557\n魔忍\t492558\n下端\t492559\n度娘\t492560\ndama\t492561\n2017年3月4日\t492562\n陈霖\t492563\n蒸汽机\t492564\nSBS防水卷材\t492565\n工
行\t492566\n读后感想\t492567\n六大\t492568\n点带\t492569\n女一\t492570\nPO\t492571\n自愿\t492572\n季王\t492573\n磨镜\t492574\n_团车网\t492575\n基层站\t492576\nsubmitted\t492577\n昂科威论\t492578\n尔雅超星\t492579\n东方文创网\t492580\n健美裤\t492581\n资产评估法\t492582\n煮沸\t492583\n家门\t492584\n烟熏妆\t492585\n天龙八部珍兽\t492586\n民族志\t492587\n金不换\t492588\nard\t492589\nASSC\t492590\nana\t492591\nKarl\t492592\ncaozengling\t492593\n体魄\t492594\n匣\t492595\n千金女贼\t492596\n粤语\t492597\n泗城镇\t492598\nimageware\t492599\n楼梯口\t492600\n仙风\t492601\nMetasploitable\t492602\n小洲\t492603\n王新民\t492604\n公立医院改革\t492605\n回路电阻测试仪\t492606\nbotanist\t492607\nAvast\t492608\nKITTY\t492609\n郑某\t492610\n18pk.com\t492611\n钢铁之师诺曼底\t492612\nfinch\t492613\n魔酷网\t492614\n电饭煲\t492615\nanzhuan\t492616\n白静\t492617\n康永\t492618\n百田奥雅\t492619\nNana\t492620\n20150306\t492621\n外汇牌价表\t492622\n非可视\t492623\n钢材\t492624\n开发类\t492625\n新东方泡泡少儿教育\t492626\n香港有限公司\t492627\n宗申比亚乔\t492628\n齐炳权\t492629\nbro\t492630\n谒金门\t492631\n庇佑\t492632\ndensenet\t492633\n时尚王\t492634\n上海陆家嘴\t492635\nt20\t492636\n活性氧\t492637\nchili\t492638\n亚泰大街\t492639\nReactNative\t492640\n火候\t492641\n插卡式\t492642\n旅立\t492643\n大学南路\t492644\n土豆烧排骨\t492645\n43小时\t492646\n片剂\t492647\n北京新东方\t492648\n高思杯\t492649\nfogwu\t492650\n今期\t492651\n取保\t492652\n吊滑\t492653\n辽宁省实验学校\t492654\n土水\t492655\n灯红\t492656\nvirginia\t492657\n70方\t492658\n严世蕃\t492659\n科顺\t492660\n选方\t492661\n气车\t492662\n山东一区\t492663\n行政管理机构\t492664\n铅山\t492665\nMedusa\t492666\n踯躅\t492667\nShareLaTeX\t492668\nSOP8\t492669\n泰康人寿保险股份有限公司\t492670\n三峡集团\t492671\n亲测\t492672\n伊芙蕾雅\t492673\n铁雨\t492674\n起身\t492675\n卫计局\t492676\n业物\t492677\n000931\t492678\nSacred\t492679\n絆\t492680\n游廊\t492681\n云会议\t492682\n乐固\t492683\nRug\t492684\n岳西\t492685\n雾化片\t492686\n乐农\t492687\n器官\t492688\n万全吧\t492689\n陆萍\t492690\n日经225指数\t492691\n2018.2\t492692\n兰小草\t492693\n徐志强\t492694\n叶神\t492695\ntese\t492696\n假体\t492697\n无固定期合同\t492698\n苏州新闻网\t492699\n新注册公司\t492700\n104名\t492701\n锦衣香闺\t492702\n坦克风云\t492703\n大一匹\t492704\n素食者\t492705\n奶茶杯\t492706\n卡式台胞证\t492707\n写于\t492708\n7月1日后\t4
92709\nV认证\t492710\n特供版\t492711\n膨胀率\t492712\n克伦特罗\t492713\n从新建\t492714\n中华问答网\t492715\n北地区\t492716\n龚细军\t492717\nfhadmin\t492718\ngeocities\t492719\nhs编码\t492720\n全景片\t492721\n八卦岭\t492722\n中国电气网\t492723\n法治中国_中国网\t492724\n胜利油田\t492725\n富友\t492726\n第三座\t492727\nenterprises\t492728\n山东药品食品职业学院\t492729\n发明专利申请\t492730\n吉尔丹\t492731\nvault\t492732\ncelestial\t492733\n增超\t492734\njupyterhub\t492735\n黑斑\t492736\n坦桑石\t492737\n573\t492738\n婚俗\t492739\n麻园\t492740\n3406\t492741\n顾婉音\t492742\n狠刹\t492743\n洁柔\t492744\n极速传说\t492745\n泡泡龙\t492746\n区食药监局\t492747\n涡河\t492748\n简绍\t492749\n苏州动物园\t492750\nCAD素材\t492751\n科加斯\t492752\n异度之刃\t492753\n致用\t492754\n荣耀7x\t492755\nperformance\t492756\n飚王\t492757\n第36号\t492758\n60点\t492759\n神州培训网\t492760\n水晶眼镜\t492761\nOrdered\t492762\n延边延吉\t492763\n憎\t492764\ndnd吧\t492765\n百威\t492766\n何苦\t492767\nFUSE\t492768\n乐宝\t492769\n凯时\t492770\n小案\t492771\nHelps\t492772\n丁武\t492773\n日下\t492774\n前照灯\t492775\n上海曙光医院\t492776\n山东高速集团有限公司\t492777\ndictionary\t492778\n轲\t492779\n联昊通\t492780\n营员\t492781\n接回\t492782\nwarrant\t492783\n天津广电\t492784\n2005年4月\t492785\n坏小孩\t492786\n外贸\t492787\n第十五次\t492788\n计数型\t492789\n抢救车\t492790\n社会保险基金\t492791\n昀\t492792\n油桶\t492793\n没饭吃\t492794\n王侯将相\t492795\n165mm\t492796\n河南省委党校\t492797\n美华\t492798\n讲习班\t492799\n饭局的诱惑\t492800\n0912\t492801\n籍山镇\t492802\nscraping\t492803\n万向钱潮\t492804\n酷豆\t492805\napoptosis\t492806\nRaleigh\t492807\nToes\t492808\n野林\t492809\nChapelle\t492810\n不速\t492811\n幻听\t492812\n拉贝\t492813\n做\t492814\n品牌招商\t492815\n79年\t492816\n艾尼斯\t492817\n黄坡\t492818\n三元锂电池\t492819\n重庆市国税局\t492820\n评谈\t492821\n二氢\t492822\n夏布\t492823\n接骨板\t492824\n露兜博客\t492825\n快环\t492826\n迹史\t492827\n吸顶\t492828\nArs\t492829\n南西\t492830\nexpanded\t492831\n钟鼓\t492832\n尾翼\t492833\n爱不起\t492834\n草场街\t492835\n常山药业\t492836\n战网\t492837\n粽子\t492838\n漫\t492839\n该页\t492840\nConversation\t492841\n护花\t492842\n铁夹\t492843\n因公临时出国经费管理办法\t492844\n神鬼\t492845\n请见谅\t492846\nONS\t492847\n梦笔生花\t492848\n南岳区\t492849\n石灰乳\t492850\n韦氏\t492851\n酒精酒\t492852\n管建\t492853
\nduik\t492854\n恒山\t492855\n某处\t492856\n加权\t492857\n专业人员\t492858\n全面二孩\t492859\n今年9月份\t492860\n取模\t492861\n美背\t492862\n海阳路\t492863\n陆军工程大学\t492864\n弯曲强度\t492865\nDIG\t492866\n海温\t492867\napp\t492868\n3800个\t492869\n红塔山\t492870\n连平\t492871\n书虫\t492872\n世界历史\t492873\nCDC\t492874\n云宏\t492875\n哈弗h8\t492876\n初一\t492877\n矣\t492878\n班车\t492879\nqplot\t492880\nAssign\t492881\n增值税专业\t492882\n奇瑞eq1\t492883\n2017-09-12\t492884\ndestruct\t492885\n黄志强\t492886\nforecasting\t492887\nviola\t492888\n蔡依林\t492889\n杰瑞鼠\t492890\n天悦湾\t492891\n清枫语\t492892\n奥汀\t492893\n七星关区人民医院\t492894\n呱唧\t492895\n九仙帝皇诀\t492896\nC座\t492897\n计算方\t492898\nTENDA\t492899\n观山湖区人民政府\t492900\n玉海棠\t492901\n风起云涌\t492902\n嬛嬛\t492903\n借货\t492904\nkeyring\t492905\n北大青鸟学校\t492906\ncalculating\t492907\n名存实亡\t492908\nLies\t492909\n蓝光地产\t492910\n编选\t492911\n事\t492912\n电致\t492913\nVenusBlood\t492914\n小纪镇\t492915\n娱乐百分百\t492916\n2018-04-12\t492917\nvb6\t492918\n侯亮平\t492919\n颈椎骨质增生\t492920\n永恒\t492921\n房主\t492922\n受剪\t492923\n孙超\t492924\n杨万里\t492925\nhear\t492926\n福达合金\t492927\n音号\t492928\n金州大道\t492929\n黄疸值\t492930\n艾露莎\t492931\n司马相如\t492932\n脯氨酸\t492933\n朝阳新城\t492934\n年长\t492935\nPax\t492936\n残联\t492937\n伊兰\t492938\n倒悬\t492939\nDirichlet\t492940\n周文霞\t492941\n江阳区\t492942\n黑暗森林\t492943\n函_无忧考网\t492944\n名额\t492945\n黄岗\t492946\nincrease\t492947\n八九个月\t492948\n开市\t492949\nyes\t492950\nkali_linux\t492951\n方程式\t492952\nMKS\t492953\n混酸\t492954\n飨\t492955\n双立杆\t492956\n固定收益证券\t492957\nCdiscount\t492958\n法制日报社\t492959\n葛老师\t492960\n超声导波\t492961\n莲花争霸\t492962\n粤交基\t492963\nTCO\t492964\n亞洲\t492965\nsay\t492966\n泽佛伊\t492967\n巴音朝鲁\t492968\n北印\t492969\niso镜像系统\t492970\n中华先锋网\t492971\ndreamweavercs6\t492972\nqghappy\t492973\n20160320\t492974\n专转\t492975\n单托\t492976\ncsguo\t492977\n恋爱\t492978\n复元\t492979\n姜母鸭\t492980\n刁难\t492981\n灰甲\t492982\nBitlocker\t492983\n十八般兵器\t492984\n咳喘\t492985\n猎物\t492986\n盐酸普罗帕酮片\t492987\n唱\t492988\n套裙\t492989\n王德明\t492990\njquery-weui\t492991\n按劳分配\t492992\n檄\t492993\n诸葛青\t492994\n取余\t492995\n厌弃\t492996\n阿黑颜\t4
92997\n廊坊北站\t492998\n证券交易\t492999\n礼贤\t493000\n重生之路\t493001\n普洱茶膏\t493002\n三国志2017吧\t493003\n锡林郭勒职业学院\t493004\nUCLA\t493005\n瓜牛\t493006\n浦东地区\t493007\n桐乡市\t493008\n成色\t493009\n全经\t493010\n1000户\t493011\n易安保险\t493012\nFilters\t493013\n龙狼传\t493014\n奋达\t493015\n调换\t493016\n东京食尸鬼JACK篇剧透\t493017\n袖珍人\t493018\n宠物天空\t493019\n训练集\t493020\n集庆门大街\t493021\n制约\t493022\n三事\t493023\n私家花园\t493024\n方向灯\t493025\n堂食\t493026\n雪芙\t493027\n三大步\t493028\nlrange\t493029\n仙剑奇侠传3外传问情篇\t493030\nAC模玩网\t493031\n19只\t493032\n江苏先锋网\t493033\n草桥镇\t493034\n五菱神车\t493035\n刘刚\t493036\n造纸术\t493037\neastman\t493038\n冰龙王\t493039\n幸福西饼\t493040\nprn\t493041\ntipask\t493042\nGPO\t493043\n奇技淫巧\t493044\n拓浦\t493045\n菜油\t493046\n中兴通讯股份有限公司\t493047\n職業\t493048\n百鬼弈\t493049\ng933\t493050\nC5GAME\t493051\nAC+\t493052\n5个群\t493053\n秋之回忆吧\t493054\n蜀山奇侠之仙侣奇缘\t493055\n凯斯西储\t493056\n天助网\t493057\n怄气\t493058\n赫澎\t493059\n避邪\t493060\n克里琴科\t493061\n极限挑战2\t493062\n铁路运输\t493063\n跨一\t493064\n茶女\t493065\n闽南地区\t493066\n申江路\t493067\n绿色椅子\t493068\n各省\t493069\n沥青摊铺机\t493070\n号脉\t493071\n看齐\t493072\n遥知\t493073\nMT750\t493074\n两百年\t493075\n挨\t493076\n正三轮摩托车\t493077\n无码版\t493078\n氮化钛\t493079\n10h\t493080\n私人影\t493081\n很好色\t493082\n卖笑\t493083\n总管\t493084\njavascr\t493085\nVitamin\t493086\n火箭军总医院\t493087\n堂号\t493088\n收看\t493089\n男女老少\t493090\n士\t493091\n兄妹恋\t493092\n静距离\t493093\n金日教育\t493094\nGoes\t493095\n彭宇\t493096\n千分之\t493097\n瑜伽柱\t493098\n听错\t493099\n灵位\t493100\n北大软微\t493101\n刘宏伟\t493102\n30000吨\t493103\nRepublic\t493104\nvirgin\t493105\n振动盘\t493106\n逆风飞扬\t493107\n富强路\t493108\nMatrix67\t493109\n耐火泥\t493110\n八舌\t493111\n橄榄球队\t493112\n烷基糖苷\t493113\n桃花雪\t493114\n杨阳\t493115\n世故\t493116\n崂山\t493117\n单币\t493118\n广弘控股\t493119\nSig\t493120\n撮\t493121\n文姓\t493122\n撞名\t493123\n一个番\t493124\nvc++2008\t493125\nMeridien\t493126\n贝恩杯\t493127\n110岁\t493128\n遥控直升机\t493129\nTLV\t493130\n仙剑奇侠传5前传\t493131\n编程作图\t493132\nz8\t493133\nAICS6\t493134\n旧物\t493135\nlists\t493136\nlyn\t493137\n跟随器\t493138\n飞克\t493139\nBixby\t493140\n星球网\t493141\n青稚\t493142\n1
3896005922\t493143\n变迁\t493144\n宁国路\t493145\n付曼琳\t493146\n五分之一\t493147\n中国邮政速递\t493148\n七煞\t493149\n雾月\t493150\n清洁工\t493151\n长沙先导投资控股集团有限公司\t493152\n辣鱼\t493153\n锌片\t493154\n第十一周\t493155\n002号\t493156\n米脂县\t493157\n顾盼成欢\t493158\n广州大厦\t493159\n爱国主义\t493160\n中山职业技术学院\t493161\n浓艳\t493162\n钱斌\t493163\nHAR\t493164\n第3类\t493165\n69个\t493166\ntouxiang\t493167\n深田恭子\t493168\n新疆广汇\t493169\n十八周\t493170\n汉室\t493171\n领克02\t493172\n佳兆业水岸新都\t493173\n1933\t493174\nAssigned\t493175\n55天\t493176\n范德比尔特\t493177\n徐州小学\t493178\n送话\t493179\nPAUL\t493180\n首创\t493181\n速联\t493182\ngeostudio\t493183\n潮州菜\t493184\n吴越\t493185\n彩水\t493186\n科教版\t493187\n亚迪三村\t493188\n6篇\t493189\n坎巴拉太空计划\t493190\n徐庄软件园\t493191\n教材\t493192\ntws\t493193\nPrecursor\t493194\nregina\t493195\nlibrary\t493196\n汝瓷\t493197\n互惠互利\t493198\n碧桂园山湖城\t493199\n百度站长平台\t493200\n特戊酸\t493201\n1.5kg\t493202\n保利悦都\t493203\n13批次\t493204\n教育篇\t493205\n超级巨星\t493206\ncasno\t493207\n殴\t493208\nNDP\t493209\n栓绳\t493210\n万荣路\t493211\n扩初\t493212\nGMD\t493213\n羽绒裤\t493214\n一人公司\t493215\nwhchensir\t493216\n八重切\t493217\n豪利\t493218\n红橙\t493219\nunreal\t493220\nfiddler\t493221\nWEBAPP\t493222\njixing\t493223\nstray\t493224\n大年初三\t493225\n披露\t493226\nAlternatives\t493227\n上海国际港务(集团)股份有限公司\t493228\n洗车工\t493229\n炮仗花\t493230\n坠机\t493231\n40X\t493232\n痴情人\t493233\n伪善\t493234\nAskReddit\t493235\n夏茗悠\t493236\n天地华宇物流\t493237\n喂猪\t493238\nenclosed\t493239\n音乐器\t493240\n金平县\t493241\n使命召唤8:现代战争3\t493242\n即时库存\t493243\n胃肠炎\t493244\n宝带西路\t493245\nftc\t493246\n蓝胶\t493247\n驱逐者\t493248\n北京同仁医院\t493249\nSP\t493250\n换盘\t493251\n普及版\t493252\n许多年\t493253\n唐君毅\t493254\n海之声\t493255\nDV\t493256\n外母\t493257\n新媒\t493258\n70年\t493259\n华人街网\t493260\n黑板报\t493261\n郑州物流公司\t493262\n三观\t493263\n你的日子\t493264\nlisteners\t493265\n淫猥\t493266\n湖南省林业厅\t493267\n国际与公共事务学院\t493268\n万卓\t493269\n韩韩\t493270\nv28\t493271\nFileWriter\t493272\n诺和龙\t493273\n罗技g102\t493274\n立体画\t493275\n150公里\t493276\n喜结\t493277\n宿豫\t493278\n用户子\t493279\n劳工证\t493280\n条索\t493281\n清道夫\t493282\n哈佛路\t493283\n鲁丽\t4932
84\nXiaomi\t493285\n大连国际\t493286\n白领网\t493287\n最低工资标准表\t493288\n格莫拉\t493289\n萌王\t493290\nAnhui\t493291\n832\t493292\n接替\t493293\nguage\t493294\n迈之灵\t493295\n蕲\t493296\n灵魂摆渡\t493297\n发泡水泥板\t493298\nm225dw\t493299\n特聘\t493300\nplat\t493301\n解脲脲原体\t493302\nproxies\t493303\n扫码点餐\t493304\n安钢\t493305\nfreex\t493306\n砚台\t493307\n莎莎舞\t493308\n小明\t493309\n中\t493310\n甘松\t493311\n道家哲学\t493312\n今始\t493313\n形容物\t493314\n熊类\t493315\n金杆菌\t493316\n双蛋\t493317\nexsel\t493318\niTouch\t493319\n亚无\t493320\n凌博\t493321\n十九天\t493322\n职信\t493323\n树大招风\t493324\n新警察故事\t493325\nEileen\t493326\n威胜\t493327\ncrema\t493328\n夏恩\t493329\n财智大厦\t493330\n蜜菓奶茶\t493331\n十堰晚报\t493332\n宁海\t493333\n自抑\t493334\n抗生素\t493335\n天下归心\t493336\nshouche\t493337\nCD播放机\t493338\n就手\t493339\n剁手党\t493340\n奉纳\t493341\n大闪\t493342\n精洗\t493343\nrefractive\t493344\nTEMPUR\t493345\n特雷泽盖\t493346\n究级\t493347\n遇难者\t493348\n多花\t493349\n0356\t493350\n复仇女神\t493351\n数读\t493352\n钱塘\t493353\nlibrairielephenix\t493354\nJSP\t493355\n点球\t493356\nAdmission\t493357\n排名第一\t493358\n小麦淀粉\t493359\n22中文网\t493360\n转向机\t493361\n值不值得\t493362\n乌龙球\t493363\n莱芜房产超市网\t493364\n谥号\t493365\n银孚\t493366\nXEN\t493367\n亊\t493368\n有影\t493369\n快乐家族\t493370\n格雷码\t493371\n女侠\t493372\nsunset\t493373\nmultithreading\t493374\n飞飞世界\t493375\n安贤洙\t493376\n湖北省图书馆\t493377\n苹果4\t493378\n修神外传\t493379\n华林证券\t493380\n门市价\t493381\nMind\t493382\n小胖墩\t493383\nNRA\t493384\n避税\t493385\n抓大放小\t493386\n侧室\t493387\n死罪\t493388\n12段\t493389\n1.7.7\t493390\n农宅\t493391\n万帮\t493392\n第一本\t493393\n木竹\t493394\npopulation\t493395\n辉南县\t493396\n立面\t493397\n哪类\t493398\n侧重\t493399\n进展期\t493400\n苏州东吴医院\t493401\n一个60\t493402\n7英寸\t493403\n种植牙\t493404\n齐娜\t493405\nQWebView\t493406\n抽动\t493407\n星耀城\t493408\nDataStage\t493409\n18#\t493410\n马字\t493411\n黄塘\t493412\n2016.08\t493413\n填完\t493414\n146分钟\t493415\n不周山\t493416\n低吼\t493417\n蒋廷黻\t493418\n情报员\t493419\n雷鸟电视\t493420\n杨百万\t493421\n中国技术交易所\t493422\n三列\t493423\nKV\t493424\n18000\t493425\n湖边\t493426\nII\t493427\n梦觉\t493428\nADT\t493429\n爱的阶梯\t493430\
n马戏之王\t493431\n电玩巴士switch\t493432\n高密新闻网\t493433\n意面\t493434\n人渣的本愿\t493435\n睇\t493436\n中国国际投资贸易洽谈会\t493437\n王者荣耀金币\t493438\n南通话\t493439\nultraviolet\t493440\n维加斯\t493441\n洗车机\t493442\n丁磊\t493443\n干酪乳杆菌\t493444\n皮阿诺\t493445\n新鸳鸯蝴蝶梦\t493446\nGEFORCE\t493447\nGene\t493448\n9.17\t493449\nZooKeeper\t493450\n0.76\t493451\n草民网\t493452\n荣誉勋章血战\t493453\n小包总\t493454\n浓\t493455\n百色\t493456\nfree性\t493457\n钟祥论坛\t493458\nOh\t493459\n医药网\t493460\n协和双语学校\t493461\n水泳\t493462\n瘦腿袜\t493463\n自理\t493464\n老三国\t493465\n佚\t493466\n子父\t493467\n扎夫特\t493468\n水利部黄河水利委员会\t493469\n幣\t493470\n4648\t493471\nmakecert\t493472\n会儿\t493473\n集美大桥\t493474\nST5000\t493475\n礼物盒\t493476\n第23批\t493477\n戏谑\t493478\n普查员\t493479\n防护板\t493480\n气泡酒\t493481\n宝玺\t493482\nmsxml3.dll\t493483\n梅花香自苦寒来\t493484\n兆力\t493485\n微缩\t493486\n从零倾城之恋\t493487\n四宫\t493488\n暗黑3中文网\t493489\n傻傻\t493490\n吉阳镇\t493491\n冥峰\t493492\n2月23日\t493493\n凤眼菩提\t493494\n于淼\t493495\n韩懿莹\t493496\n十二届二次\t493497\n华润云庭\t493498\n受重罚\t493499\n低碳发展\t493500\n重仓股\t493501\n嘉人\t493502\n博观\t493503\nMartha\t493504\npry\t493505\n4399火影\t493506\n纳诺\t493507\n125元\t493508\n进村\t493509\n北京市海淀区教育委员会\t493510\n张主任\t493511\n叛逆性百万亚瑟王炼金\t493512\n米兰达·可儿\t493513\ndnf念帝吧\t493514\n李薇薇\t493515\n契合度\t493516\n500kb\t493517\n九宫飞星\t493518\nround函数\t493519\npolyline\t493520\n蚕卵\t493521\nlipan\t493522\n1080P_哔哩哔哩\t493523\n一廊\t493524\nMPC\t493525\n咏竹\t493526\n华大街道\t493527\n测量表\t493528\n玄同\t493529\n安徽中医药高等专科学校\t493530\nRoyFans\t493531\n很怕\t493532\nxe3\t493533\n百度手机\t493534\n血塞通软胶囊\t493535\n动漫岛\t493536\n伞具\t493537\n小林幸子\t493538\n冴\t493539\n找零\t493540\n12355\t493541\n博主们\t493542\nEssential\t493543\n陕西师范大学出版社\t493544\narcengine\t493545\n这就是我\t493546\n18歳\t493547\n二庭\t493548\n续续\t493549\n二甲苯\t493550\n长水机场\t493551\n细枝末节\t493552\n狗腰\t493553\n三八妇乐\t493554\nattacks\t493555\n尾砂\t493556\n圆猪猪\t493557\nScikit\t493558\ne8-c\t493559\n三氧化硫\t493560\n仁爱礁\t493561\nHRESULT\t493562\nzmi\t493563\n2018年4月底\t493564\n27simn\t493565\n3520m\t493566\n1.xls\t493567\n流产\t493568\n{an}\t493569\n集藏\t493570\n华为荣耀路
由\t493571\n第七张\t493572\n辽宁台\t493573\n黄国强\t493574\n超级网\t493575\n吊臂\t493576\n恰如其分\t493577\n滚动轴承\t493578\n南京工业职业技术学院\t493579\n国泰大厦\t493580\n英格丽\t493581\nAddison\t493582\n染染\t493583\ne海通财\t493584\n形件\t493585\n指针型\t493586\n吊子\t493587\n转农\t493588\n6.4下\t493589\n走在街上\t493590\n抗性淀粉\t493591\n坏笑\t493592\n散花\t493593\n萨摩耶\t493594\n准噶尔盆地\t493595\n增利\t493596\nVent\t493597\n合作关系\t493598\n轮动\t493599\n流处理器\t493600\nBethesda\t493601\nsurfaces\t493602\n袖笼\t493603\n深圳住建局\t493604\n278号\t493605\n张锦程\t493606\nIndians\t493607\n雨衣\t493608\n苟富贵\t493609\n海灵格\t493610\n认全\t493611\n赵凌云\t493612\n130部\t493613\n无锡奥数网\t493614\nSIRI\t493615\n压机\t493616\n10kb\t493617\n转身\t493618\n土地增值税清算\t493619\n唱片\t493620\n偕老\t493621\nX79\t493622\n刘章\t493623\n广州市城市规划勘测设计研究院\t493624\n星型\t493625\n废柴联盟\t493626\n侧芽\t493627\n北京四中网校\t493628\nppt模板\t493629\n金沙泊岸\t493630\nndc\t493631\n回车键\t493632\n把关\t493633\n阿拉伯语\t493634\n锦囊\t493635\n福建商业高等专科学校\t493636\n大梅\t493637\n12度\t493638\n潘妮\t493639\n泽宝\t493640\n学定\t493641\n咕噜肉\t493642\n指示牌\t493643\n23元\t493644\n三星S9\t493645\n全州\t493646\n蜀地\t493647\n酷航\t493648\n全赢\t493649\n落红\t493650\n小儿咽扁颗粒\t493651\n通房\t493652\nフェチ\t493653\n笑百步\t493654\n上海音乐家协会\t493655\n吐槽村\t493656\n金中\t493657\n补胎\t493658\n济南市历城区政府\t493659\n口袋妖怪绿铀\t493660\n佩环\t493661\n9050\t493662\n杉本\t493663\n雨景\t493664\n破影\t493665\n2018年2号\t493666\n老A\t493667\n万家园\t493668\n081200\t493669\n精深\t493670\n楞伽经\t493671\nmiae\t493672\nReach\t493673\n周立安东尼\t493674\n横滨港\t493675\n潮位\t493676\n修昔底德陷阱\t493677\n乐敦cc\t493678\n眉来眼去\t493679\n蒋方\t493680\nmlc\t493681\n中山大厦\t493682\n灵眸\t493683\n有饭\t493684\n1900亿\t493685\n巴勒莫\t493686\n近在咫尺\t493687\n月半弯\t493688\n2DVD\t493689\n金晔\t493690\n展酷网\t493691\nnsdate\t493692\n参与度\t493693\n天津市区\t493694\n几块\t493695\n向晚\t493696\n北京局\t493697\n镀钛\t493698\n调情\t493699\n柯本\t493700\n2046\t493701\npatches\t493702\n莫罗\t493703\n柬埔寨西港\t493704\n防护乳\t493705\n丈人\t493706\n厅长\t493707\n对酒\t493708\ngem\t493709\n有机质\t493710\n伊甸\t493711\n望江门\t493712\n平面度\t493713\npathy\t493714\n点刷\t493715\n29码\t493716\n5447\t493717\n斩炎\t493718\nwin10启
动盘\t493719\n_农博网\t493720\n中航樾府\t493721\n生息\t493722\n9多\t493723\n登山鞋\t493724\nsidekiq\t493725\n王忠民\t493726\n李逵\t493727\nARMA3\t493728\ndmi\t493729\nfsa\t493730\n污水源\t493731\n天麓山\t493732\n20170416\t493733\n通天帝国\t493734\n生病\t493735\ncppcheck\t493736\n猎猫\t493737\n4.8米\t493738\n最好别\t493739\n紧咬\t493740\n金帅\t493741\n蒋羽熙\t493742\nchronic\t493743\n广告创意\t493744\nworkplace\t493745\n神童诗\t493746\n摩擦\t493747\n班员\t493748\n光长\t493749\n742\t493750\nviruses\t493751\n回门\t493752\n价位段\t493753\nmatlab2015b\t493754\n氯片\t493755\n田柾国\t493756\n定格\t493757\n景山公园\t493758\ntsr\t493759\n氟康唑胶囊\t493760\n刘春华\t493761\n475ml\t493762\n恒美\t493763\n眼膜\t493764\n萌战\t493765\n1.4TSI\t493766\n艾多美\t493767\n净买入\t493768\n一瓦\t493769\n国易堂周易\t493770\n肺脏\t493771\n8.4寸\t493772\n益阳政府网\t493773\n注册安全工程师-233\t493774\npix4d\t493775\n放张\t493776\n加强型\t493777\n支行\t493778\n腾龙镜头\t493779\nqinbb\t493780\n白达摩\t493781\n3158服装加盟网\t493782\n剑桥国际英语教程\t493783\n晨检\t493784\n禾葡兰\t493785\n波士顿大学\t493786\n2mx\t493787\n一猫汽车网\t493788\n棘背龙\t493789\n公立校\t493790\n180414\t493791\n寒字\t493792\n国际米\t493793\n何雪梅\t493794\nSAX\t493795\n调派\t493796\n九月八\t493797\n知聊\t493798\ngtx1080ti\t493799\n鹤啸\t493800\n5021\t493801\n湘西土家族苗族自治州\t493802\n博朗耳温枪\t493803\n对比器\t493804\n鼠板\t493805\nプレイ\t493806\n仔\t493807\n南丫岛\t493808\n闻王昌龄\t493809\n北京市轨道交通建设管理有限公司\t493810\n满腹\t493811\n京博石化\t493812\n5se\t493813\nMarkers\t493814\n西红柿炒鸡蛋\t493815\ncntrades\t493816\n孔先生\t493817\n科创集团\t493818\n尘与雪\t493819\n1.79\t493820\n傲物\t493821\n喜炎平\t493822\n资中县政府\t493823\n异花\t493824\n流态化\t493825\n按出\t493826\n福州市图书馆\t493827\n内存不足\t493828\n星期六日\t493829\n鲁棒\t493830\n小鹿医馆\t493831\n1Q84\t493832\nljy2013\t493833\n杨雪梅\t493834\n杀人如麻\t493835\n美女秀\t493836\n课题\t493837\n拉里\t493838\n草沟\t493839\n浸种\t493840\n望京地铁站\t493841\nSVF\t493842\n那种\t493843\n弹簧钢\t493844\n放不开\t493845\nHCIE\t493846\n墨尘\t493847\n软解\t493848\n洞天福地\t493849\n第1款\t493850\nLaminated\t493851\n三维通信\t493852\nDTP\t493853\n中共成都市委\t493854\nIntellIJ\t493855\n通富\t493856\n豫记\t493857\n裸族\t493858\n20磅\t493859\ngeoserver\t493860\n单量\t493861\n经痛\t493862\n浙江省地
质勘查局\t493863\n迅影网\t493864\n说来\t493865\n116&\t493866\n乡长\t493867\n水堆\t493868\n___\t493869\n盖瓦\t493870\n窗槛\t493871\n佳软\t493872\n117平米\t493873\n我的父母\t493874\n芦溪县\t493875\n蓉儿\t493876\n德康\t493877\n浒墅关镇\t493878\nDefinitions\t493879\nCalc\t493880\n脱节\t493881\n鳕熊\t493882\nliya\t493883\n人在旅途\t493884\n京腾\t493885\n物证\t493886\n录单\t493887\n一宁\t493888\n赛龙\t493889\n利比亚战争\t493890\n广寒宫\t493891\n馒\t493892\n亟需\t493893\nLX10\t493894\n排布式\t493895\n两限房\t493896\n摇摇晃晃\t493897\n广业\t493898\nPan\t493899\n陕西科技大学镐京学院\t493900\nparsed\t493901\n天涯明月刀ol\t493902\nwii\t493903\n面目狰狞\t493904\n通兑\t493905\nbon\t493906\n10TB\t493907\n穷秋\t493908\nsetuid\t493909\n云计算服务提供商\t493910\n新中国际\t493911\n定行\t493912\n正泽学校\t493913\n徐宏\t493914\nrocking\t493915\n变身文\t493916\n黄州区\t493917\n池田夏希\t493918\n叶兆言\t493919\n4418\t493920\n领事馆\t493921\n揃\t493922\n心累\t493923\nPk\t493924\n个人消费贷款利率\t493925\n总复习\t493926\nDeepLab\t493927\n万荣县\t493928\n小宝宝\t493929\n中央政治局\t493930\n普希\t493931\n叶龙\t493932\n裁减\t493933\n复旦大学附属中山医院厦门医院\t493934\n尚德\t493935\n840M\t493936\n奶糕\t493937\n胶州湾\t493938\n100单\t493939\n西永\t493940\n黄屏\t493941\n140名\t493942\n输电塔\t493943\n163贵州网\t493944\n叫阵\t493945\n黄山市区\t493946\n营养杯\t493947\n黄埠\t493948\n8at\t493949\nml\t493950\nImpression\t493951\n大戏\t493952\n剑三奇遇\t493953\n坑不坑\t493954\n龟甲\t493955\n明星空\t493956\n600元\t493957\n医治\t493958\n偏财格\t493959\n香菇酱\t493960\n行情诗\t493961\nEyes\t493962\n后备军\t493963\nskeletal\t493964\n丽日\t493965\n线切割\t493966\nhohner\t493967\nitmc\t493968\n消毒盒\t493969\n成效明显\t493970\nUtopia\t493971\n桂政发\t493972\n光秀\t493973\n李泰兰\t493974\niptable\t493975\n江晓原\t493976\n港湾\t493977\n99元\t493978\n华为荣耀7吧\t493979\n钢规\t493980\n主办方\t493981\n扎什伦布寺\t493982\n别君叹\t493983\ngreed\t493984\n压制\t493985\n洪州\t493986\n1400亿元\t493987\n数载\t493988\n傅少\t493989\n火烧岛\t493990\n义利\t493991\n山岳\t493992\ndata-*\t493993\nQ345D\t493994\nGLSurfaceView\t493995\n黎族\t493996\n学士服\t493997\nthirds\t493998\n2017-12-25\t493999\nminerd\t494000\n佳能80D\t494001\njrpass\t494002\n酒窝\t494003\n仙道\t494004\n淮安市住房和城乡建设局\t494005\nvistual\t494006\n按住\t494007\n张雷\t4
94008\n唐会\t494009\n卫裤\t494010\n角牛\t494011\n企业国有产权转让管理暂行办法\t494012\n身者\t494013\n异世界的圣机师物语\t494014\n迪荡\t494015\n)股份有限公司\t494016\n张伟伟\t494017\n429号\t494018\n脚伤\t494019\n田晓霞\t494020\n几步\t494021\nDiscuz\t494022\n暖暖的味道\t494023\n骨干企业\t494024\n交割单\t494025\n111个\t494026\n绝地战警2\t494027\nstrtok\t494028\n表白\t494029\n加窗\t494030\n录音稿\t494031\n赚点\t494032\n11月3日\t494033\n假笑\t494034\nDistrito\t494035\n东方快车谋杀案2017\t494036\n星蕴\t494037\n丁公司\t494038\n万智\t494039\n进贤县人民政府\t494040\n391K手机网\t494041\nOpenwrt\t494042\n166个\t494043\n因何\t494044\n蒂亚\t494045\nbender\t494046\n天下第一泉\t494047\n襄铃\t494048\n桃花眼\t494049\n克里夫兰\t494050\n鹿泉区\t494051\n小米空气净化器\t494052\nsscanf\t494053\n南京法院\t494054\n北大方正集团有限公司\t494055\n情丝\t494056\nfields\t494057\n撕咬\t494058\n浮花\t494059\n谢娜老舍\t494060\next\t494061\n睦邻\t494062\n紫梦\t494063\n注射\t494064\n驱蚊液\t494065\n鲜味\t494066\n星河湾集团\t494067\n高筋面粉\t494068\n南昌昌北机场\t494069\n张万福\t494070\n厂包\t494071\n彩蛋\t494072\nabuse\t494073\n丰田致炫\t494074\n棒骨\t494075\n4741g\t494076\n青岛地铁13号线\t494077\n蚂蚁矿机S9\t494078\nOBA\t494079\n若菜奈央\t494080\n厕奴\t494081\nFork\t494082\nbtree\t494083\nmingw\t494084\n氧水\t494085\n潍坊日报\t494086\nwuy\t494087\n一道一道\t494088\nAppScan\t494089\nGods\t494090\n胡成\t494091\n寨主\t494092\n氧化锆陶瓷\t494093\n大网\t494094\n中国科学院昆明植物研究所\t494095\n中原福塔\t494096\n城运\t494097\n入眠\t494098\n市招商局\t494099\n明月路\t494100\nMe\t494101\n补泻\t494102\n易天行\t494103\ninjector\t494104\n西岐\t494105\n89%\t494106\n7月中旬\t494107\nTCL罗格朗\t494108\n滨江凯旋门\t494109\n土超\t494110\nDApp\t494111\n聂树斌案\t494112\n金汤匙\t494113\n相吸\t494114\n漂型\t494115\n伽罗\t494116\n模拟城市4\t494117\n张陆\t494118\n蓝粉\t494119\n14升\t494120\n虎口脱险\t494121\ngina\t494122\n解放军特种部队\t494123\n金牛山\t494124\nssg\t494125\n反应器\t494126\n酸度\t494127\n线性插值法\t494128\n好卓网\t494129\n法定图\t494130\nqdebug\t494131\n廉吏\t494132\n小猪短租陈驰\t494133\n注册码\t494134\n每件事\t494135\nnexus3\t494136\n关节炎\t494137\n才叫\t494138\n戴娜\t494139\n黑客联盟\t494140\n上海饭店\t494141\n碳素钢\t494142\nARM11\t494143\n穆桂英挂帅\t494144\n组通字\t494145\n银行网点通\t494146\n奶头\t494147\n0.8.2.1\t494148\nishu\t494149\n乙肝免疫球蛋白\t494150\n天蛾\t494151\n黄
贝街道\t494152\nBT高清MV-MP4\t494153\nGroove\t494154\n申明\t494155\n酒款\t494156\n来人\t494157\n300684\t494158\n永安乡\t494159\n小额担保贷款\t494160\n冤\t494161\n澳际\t494162\n琐事\t494163\n2017年11月2日\t494164\n移防\t494165\n井式炉\t494166\n脑卒中\t494167\njet\t494168\nStallion\t494169\n进藏\t494170\n大戟\t494171\n_网罗\t494172\n鱼饵\t494173\n王伟明\t494174\ninvestments\t494175\n贫铀\t494176\n在野\t494177\nShipyard\t494178\n矿用车\t494179\n编辑室\t494180\n虹桥机场\t494181\nemctl\t494182\n清华大学研究生院\t494183\n椴木\t494184\n乌达区\t494185\n9尾\t494186\nCodeWeavers\t494187\neth-trunk\t494188\nSIM卡通讯录\t494189\nHao\t494190\n哈林\t494191\n漫无止境\t494192\n桁架\t494193\n酷跑\t494194\n朗乡\t494195\n中华艺术宫\t494196\n伞兵\t494197\n消防车道\t494198\n云法律网\t494199\n靛\t494200\n洪雅县人民政府\t494201\n洗稿\t494202\n胡洋\t494203\n止泻药\t494204\n山冈\t494205\n达瓦\t494206\n金碧物业\t494207\n独树一帜\t494208\n分级诊疗\t494209\nJewellery\t494210\nchome\t494211\n有你有我\t494212\n排洪沟\t494213\n生日券\t494214\nHelloworld\t494215\n中国环境保护产业协会\t494216\n男孩女孩\t494217\n物业管理师考试\t494218\nmi2\t494219\n两堆\t494220\nv25\t494221\n阿莫西林克拉维酸钾\t494222\n优惠群\t494223\n赖子\t494224\n高吃\t494225\nGRC\t494226\n玩具箱\t494227\n检验检疫局\t494228\n0张\t494229\n台盖\t494230\n酷彩\t494231\n第七十七章\t494232\n蓍草\t494233\n苹果6\t494234\n杰迷\t494235\nMindjet\t494236\nverizon\t494237\n罗浩\t494238\n台北101大厦\t494239\n剑佑\t494240\n误工期\t494241\ngob\t494242\n王心怡\t494243\n冠军队\t494244\n消防管\t494245\n商洛日报数字报\t494246\nrememberMe\t494247\n阿公\t494248\n东安街\t494249\n缥缈峰\t494250\n东盟\t494251\n三万公里\t494252\n2699元\t494253\nT+1\t494254\n远山\t494255\n1吨\t494256\n微盟\t494257\n梅花型\t494258\n股权转让\t494259\n押出\t494260\n烤土豆\t494261\n娘田\t494262\n九洲电气\t494263\n脑部\t494264\n神爷\t494265\n配优化\t494266\n密封性\t494267\n仙剑奇侠传6\t494268\nimplicit\t494269\n肌体\t494270\n张志春\t494271\nMG2580S\t494272\n麻辣社区\t494273\n邝氏\t494274\n隐形车\t494275\n一个月\t494276\n组织生活会制度\t494277\n皮干\t494278\n锯条\t494279\nCSG\t494280\n国家发改委\t494281\n奥提兹\t494282\n织田裕二\t494283\n武汉市第五医院\t494284\n风棒\t494285\npmf\t494286\n躯\t494287\n6.5英寸\t494288\ndws\t494289\n显白\t494290\n轴芯\t494291\nsetupfactory\t494292\n阁楼式\t494293\n培华\t494294\n内金\t494295\n低功率\
t494296\nehcarts\t494297\n俞敏\t494298\n深圳西丽车管所\t494299\n张鹏飞\t494300\nx264-CHD\t494301\n处行\t494302\nleesa\t494303\n雅诗\t494304\n内衬板\t494305\n晋祠路\t494306\n小套\t494307\n2ppv\t494308\n平湖秋月\t494309\n杨昌济\t494310\n七五普法规划\t494311\nWilliamantec\t494312\n水鸡\t494313\nvad\t494314\nmeets\t494315\n动脚\t494316\n天下网商\t494317\n文村\t494318\n协方差分析\t494319\nPopping\t494320\n点乘\t494321\nbitcode\t494322\n县局\t494323\n主犯\t494324\n王小丫\t494325\n小程\t494326\n钳工证\t494327\n韩建\t494328\n皇帝们\t494329\n饱和蒸气压\t494330\n中郎\t494331\n白云\t494332\n新中式酒店\t494333\n你的爱人\t494334\n徐立新\t494335\n29000\t494336\n中国植物志\t494337\n新航道北京学校\t494338\n变形量\t494339\n章惇\t494340\n火力少年王\t494341\n航校\t494342\n安河桥\t494343\n河南师范大学\t494344\n丈八东路\t494345\n金属版\t494346\n距\t494347\n水稀美里\t494348\nwainiwann\t494349\n中国太平洋保险\t494350\n门牌石\t494351\n_正北方网\t494352\n小嫚\t494353\n谷歌输入法\t494354\n亲子舞\t494355\n170周年\t494356\n税制\t494357\n前列康\t494358\n账册\t494359\n时令\t494360\n马国明\t494361\nhint\t494362\n刘骏\t494363\n苍兰\t494364\n墙\t494365\n金承佑\t494366\nCTnews\t494367\n艾克斯奥特曼\t494368\n姜东\t494369\nteddyboy\t494370\n硝酸汞\t494371\n半份\t494372\n1307\t494373\n急冻\t494374\n德州东站\t494375\n因式\t494376\n舆情监测系统\t494377\nbandwidth\t494378\n徐春\t494379\nE18\t494380\n傻姑娘\t494381\n黏结\t494382\nX射线仪器\t494383\n3月4月\t494384\n直译\t494385\n开牌\t494386\nholographic\t494387\nkb2999226\t494388\n范斌\t494389\n莫氏锥度\t494390\n包装设计\t494391\n浙江文艺出版社\t494392\n何某某\t494393\n产品\t494394\n格兰菲迪\t494395\n综艺体\t494396\n奇书网\t494397\n浅语\t494398\n3.0.12\t494399\n第一发\t494400\n比基尼\t494401\n24个\t494402\n盛丰物流\t494403\ngachinco\t494404\n管道疏通剂\t494405\n32色\t494406\n金马股份\t494407\n枫木\t494408\n18平方米\t494409\n海河路\t494410\n那一位\t494411\n胡戈\t494412\nbli\t494413\nMortal\t494414\n河东狮吼\t494415\n2.7.4\t494416\n888d\t494417\n木手\t494418\nママ\t494419\nmemorable\t494420\n肖阳\t494421\n梁心颐\t494422\n雁南\t494423\nBs\t494424\n其中一位\t494425\n七角\t494426\n唐金\t494427\n擦边球\t494428\n陕西省公共资源交易中心\t494429\nDragon\t494430\n受制\t494431\nwaifu2x-caffe\t494432\nbootstrap2\t494433\n白板报\t494434\n符皇幻想世界大穿越\t494435\n汉英\t494436\n宝贝龙\t494437\n迷幻\t494438\nDTX\t49
4439\n全路\t494440\n贡山\t494441\n编程员\t494442\n方向盘套\t494443\n裸官\t494444\n联想ThinkVision\t494445\n扭转乾坤\t494446\nhyperref\t494447\n撒野\t494448\n孔器\t494449\n声卡\t494450\n皇家社会\t494451\n经济学哲学手稿\t494452\nβ\t494453\nzhisheng\t494454\n汕头市濠江区政府\t494455\n逆函数\t494456\n国微\t494457\n非空\t494458\n酸枝木\t494459\noauth2.0\t494460\ngoro\t494461\n12部\t494462\n拉饵\t494463\n老铺\t494464\nadmin\t494465\n600平米\t494466\n好累\t494467\n这夜\t494468\n噪\t494469\n典尚\t494470\n洒\t494471\n东西城\t494472\n西店镇\t494473\n327\t494474\n眼保健操\t494475\n四川省科技厅\t494476\nvous\t494477\n芮欧\t494478\ncease\t494479\nWinXP\t494480\n城镇污水处理厂污染物排放标准\t494481\n洪阳镇\t494482\n一线生机\t494483\n弱智\t494484\n氯化银\t494485\nCrusher\t494486\n碎花\t494487\n太极熊猫\t494488\n霍尔果斯口岸\t494489\n求任\t494490\nil\t494491\n621700\t494492\n鸿坤集团\t494493\n板机\t494494\n原始社会\t494495\nN64\t494496\n长胶\t494497\n2017年10月1日\t494498\n北三环\t494499\n动销率\t494500\ntop30\t494501\n单选框\t494502\n液液\t494503\n格列夫\t494504\n北京师范大学教育学部\t494505\n背靠\t494506\n中国银行软件中心\t494507\n风险评级\t494508\njshint\t494509\nds7\t494510\nleica\t494511\n禁煤\t494512\n恒康医疗\t494513\n万企\t494514\n96关\t494515\n安杰玛\t494516\n全金属狂潮\t494517\n金属头\t494518\n短尾\t494519\nnodeJS\t494520\n中建一局集团建设发展有限公司\t494521\n陈国权\t494522\n内蒙古工商局\t494523\n2017年09月01日\t494524\n水泵房\t494525\n打款\t494526\n55度\t494527\n熙熙\t494528\n李昭\t494529\n尿流\t494530\n2月6日\t494531\nMori\t494532\n鬼武者3\t494533\n船舰\t494534\n上海徐汇区\t494535\n美式\t494536\nGao\t494537\n第九节\t494538\n秃宝盖\t494539\n德川家康\t494540\nActiv\t494541\nwarehouse\t494542\n荣耀鬼谷子\t494543\n预录单\t494544\n5710\t494545\n大杀\t494546\n聆\t494547\n新射雕英雄传\t494548\n112路\t494549\n工作中心\t494550\n啪了\t494551\n卧槽\t494552\n人教版小学语文五年级下册\t494553\n追悼\t494554\nRapoo\t494555\n听声\t494556\npackag\t494557\n陈驰\t494558\n瑛太\t494559\n20160926\t494560\n新热点\t494561\n中北大学信息商务学院\t494562\n中印关系\t494563\nakiba\t494564\n禅让制\t494565\n恶魔城晓月圆舞曲\t494566\nLumerical\t494567\n外国人签证网\t494568\n银乐迪\t494569\n翼校通\t494570\n超跌股\t494571\n婿\t494572\n深化改\t494573\n张村镇\t494574\n脑CT\t494575\n认同度\t494576\n704所\t494577\n单人间\t494578\nssh框架\t494579\n荣耀x2\t494580\nas\t49458
1\n路由交换机\t494582\n_壹\t494583\n建表\t494584\n九丹\t494585\nSCADA\t494586\npoj\t494587\nRethinking\t494588\n纳妾\t494589\n问卷\t494590\n启力\t494591\n北迈\t494592\nDj舞曲\t494593\n偷偷鲁\t494594\n新疆若干历史问题研究座谈纪要\t494595\n破折号\t494596\nTWO\t494597\n安防展\t494598\nlaughing\t494599\n52元\t494600\n直客通\t494601\n螺旋丸\t494602\n刻度线\t494603\n破敌\t494604\n机站\t494605\n度量衡\t494606\n衢州职业技术学院\t494607\n土门\t494608\n45个\t494609\n林业大学\t494610\n紧急避孕药\t494611\n160个\t494612\n伊塔\t494613\n水波\t494614\n380t\t494615\nhedgehog\t494616\n火焰纹章:英雄\t494617\n水煮青菜\t494618\n关于民事诉讼证据的若干规定\t494619\n覆卮山\t494620\n小e\t494621\n德清\t494622\n沙陀\t494623\nBreaks\t494624\n神飞\t494625\n新颖\t494626\n深圳方维网站建设公司\t494627\nunivariate\t494628\n经邦\t494629\n23天\t494630\n到不了\t494631\ncommons-lang3\t494632\n老夫老妻\t494633\nhvp\t494634\n异星工厂\t494635\nwayos\t494636\n光明日报社\t494637\ndestination\t494638\nasics\t494639\n阻力臂\t494640\n惠贞书院\t494641\n双卡\t494642\n方能\t494643\n湖北省体育局\t494644\n歉疚\t494645\nvoda\t494646\n淡色\t494647\n邪恶\t494648\n双月之城\t494649\n冠周炎\t494650\n铅酸蓄电池\t494651\n抹盘\t494652\n上海长江医院\t494653\n眼骨\t494654\nenqueue\t494655\n校验算法\t494656\n费拉拉\t494657\n石楠\t494658\n爱盲\t494659\n鼓手\t494660\n坎坷路\t494661\n油碟\t494662\n土门客栈\t494663\n先至\t494664\n进角\t494665\n自欺欺人\t494666\n一小半\t494667\n七十路\t494668\n76部\t494669\n软件工程专业\t494670\n江苏省卫生和计划生育委员会\t494671\nps立体字\t494672\n涂抹式\t494673\n分布式session\t494674\n情篇\t494675\n素原\t494676\nruntimeexception\t494677\n秦岛生活网\t494678\ndn100\t494679\n药器\t494680\n30升\t494681\n高尔夫7论坛_XCAR\t494682\nFAMILY\t494683\n温补\t494684\n道恩\t494685\n独门独户\t494686\nshowdown\t494687\n少代会\t494688\n钢厂\t494689\n木杉\t494690\n赵世勇\t494691\n花花\t494692\n爱情树\t494693\n起子\t494694\n40寸\t494695\n左青龙\t494696\n花的勇气\t494697\n000009\t494698\n决战平安京大天狗\t494699\n618所\t494700\n1.6kg\t494701\n第60集\t494702\n5.7.2\t494703\n屁屁\t494704\n垫圈\t494705\nnsarray\t494706\n山本美月\t494707\n长歌\t494708\nWin10#\t494709\n_列表网\t494710\n猪妹\t494711\n延边地区\t494712\n玄社\t494713\npt老虎机\t494714\n喷剂\t494715\n蜀客\t494716\nDBeaver\t494717\nbtsync\t494718\nwore\t494719\n北宁\t494720\n云视网\t494721\n宁儿\t494722\nPOR
TAL\t494723\n送变电\t494724\n钎焊炉\t494725\nRH2288\t494726\n二十五史\t494727\n5KG\t494728\n船儿\t494729\n省供销社\t494730\ncjj\t494731\n广东宏远\t494732\n高空坠物\t494733\nWeb开发\t494734\n律商\t494735\n春天在哪里\t494736\n美乐\t494737\n卢恩光\t494738\n王媛\t494739\n张小伟\t494740\n心花路放\t494741\nOC渲染器\t494742\n玄凤\t494743\n铁粉\t494744\n等风来\t494745\nWIFI密码查看器\t494746\n一周左右\t494747\n汤面\t494748\n鸡场\t494749\n第25类\t494750\n君者\t494751\n卡断\t494752\n域名竞价拍卖:聚名网Juming.Com\t494753\n10厘米\t494754\n无法停止\t494755\n明州\t494756\nbage\t494757\n席夫\t494758\n版面图\t494759\n第18关\t494760\n福晟\t494761\n缘来幸福\t494762\n钛锌板\t494763\nqq飞车手游霸天虎\t494764\n微信公众号助手\t494765\n刘金华\t494766\n商船\t494767\n胆汁反流性胃炎\t494768\n玄坛\t494769\n7万块\t494770\ndates\t494771\n老行\t494772\nubunbu\t494773\n孟宇\t494774\n竹竿\t494775\n播映\t494776\n广东金融学院\t494777\n幽竹\t494778\n质量保证金\t494779\n机位\t494780\n幼线体\t494781\n高捷\t494782\n花花公子女郎\t494783\n文华苑\t494784\n存制\t494785\n粤嵌\t494786\nLagerfeld\t494787\n冕旒\t494788\n朔城区\t494789\n2万5\t494790\n50多位\t494791\n多姆\t494792\n齐己\t494793\n广州数控设备有限公司\t494794\n李卫华\t494795\n排片\t494796\n乳母\t494797\nwingdings\t494798\n乐\t494799\n斩波电路\t494800\n2011年前\t494801\n玄秘\t494802\n愚昧无知\t494803\nReflect\t494804\n放映厅\t494805\n三联疗法\t494806\n新郑国际机场\t494807\n聚氨酯涂料\t494808\nEtherCAT\t494809\n玛沙多拉\t494810\n人潮\t494811\n一年生\t494812\n纸型\t494813\n翻訳\t494814\n蓝玫瑰\t494815\n滚筒筛\t494816\n重庆市安全生产监督管理局\t494817\n修片\t494818\n兰竹文化网\t494819\n证券从业-233\t494820\n玄师\t494821\n环化\t494822\n鸡冠\t494823\ntvos\t494824\n10层\t494825\n晶澳太阳能\t494826\n南+\t494827\n延拓\t494828\n感压\t494829\n200亿\t494830\n量勺\t494831\n玫瑰糠疹\t494832\n满江红\t494833\nStringIO\t494834\n民族\t494835\n文化学\t494836\n千机\t494837\nstem\t494838\n何龙雨\t494839\nビュ\t494840\n侄儿\t494841\ntiga\t494842\n明天上午\t494843\n临邑\t494844\n赤月传说\t494845\nREFUSED\t494846\n异型管\t494847\n煲汤食谱网\t494848\n课表\t494849\n龙须沟\t494850\n非企\t494851\n8002\t494852\npbx\t494853\n南风化工\t494854\n取力器\t494855\n麂皮鞋\t494856\n诺亚财富客服电话_公司\t494857\n20140331\t494858\n国家级湿地公园\t494859\n城建大厦\t494860\n怀集县\t494861\n拖出\t494862\n新罗区\t494863\nyd\t494864\n血源诅咒吧\t494865\n希达\t494866\n合肥市第二人民医院\t
494867\n3788元\t494868\n中国电子工程设计院\t494869\n瑞泰科技\t494870\n紫光物联\t494871\n窄小\t494872\nkubenetes\t494873\n张宗言\t494874\n叶榭镇\t494875\nボンデ\t494876\nPixar\t494877\n板窑\t494878\n点击律\t494879\n热力膨胀阀\t494880\n市局\t494881\n铁面\t494882\nIMI\t494883\n总平图\t494884\n乳胶垫\t494885\nformance\t494886\n互相关\t494887\n印江土家族苗族自治县\t494888\nTimePicker\t494889\n2万吨\t494890\n龙谷湾恐龙公园\t494891\n老郎酒\t494892\n南方泵业股份有限公司\t494893\n微单全画幅\t494894\nkgs\t494895\n三科\t494896\n报会\t494897\n学术团体\t494898\n基本建设项目建设成本管理规定\t494899\nLMT\t494900\n唐律\t494901\n鸟周\t494902\n偷鸡摸狗\t494903\n新里拉\t494904\n刀鱼\t494905\n盖达尔\t494906\ni9处理器\t494907\n分秒必争\t494908\n外周\t494909\n银河房产网\t494910\n华师在线\t494911\n君绝\t494912\n瞬息\t494913\n25例\t494914\n临江仙\t494915\n大年夜\t494916\n纵火案\t494917\nSQLServer2008\t494918\n区国资委\t494919\n事历\t494920\ntim\t494921\n芳烃\t494922\nHancook\t494923\nRequire\t494924\n直到最后\t494925\n扣压机\t494926\n名妃\t494927\n郑惠成\t494928\n表样\t494929\n东拼西凑\t494930\n一周的偶像\t494931\nmockmvc\t494932\n农村中学\t494933\n出清\t494934\n剑桥郡\t494935\nHoegh\t494936\n冷锅串串\t494937\n梦想星搭档\t494938\n好心\t494939\n臀部\t494940\n广东消防网\t494941\n比亚迪宋dm\t494942\n永州道县\t494943\n哈尔滨三中\t494944\nparticipants\t494945\n24.6\t494946\n红旗飘飘引我成长\t494947\nxj\t494948\n星星报\t494949\n刘裕\t494950\n五邑大学\t494951\nhis\t494952\n占位费\t494953\n贵景摄影\t494954\n山青世界\t494955\n中资\t494956\neduyun\t494957\n2015\t494958\npz\t494959\n戴珍珠\t494960\n偷香高手\t494961\n盐通高铁\t494962\n22500\t494963\n字屏\t494964\n南海未来城\t494965\n趣\t494966\n第十二册\t494967\n韩雅乐\t494968\nneverland\t494969\n射线\t494970\n四八\t494971\n郭文斌\t494972\n对开\t494973\n家父\t494974\nmala\t494975\nAperture\t494976\n沈嘉文\t494977\n西溪路\t494978\n优购时尚商城\t494979\n毁魅\t494980\n平衡木\t494981\nEVDO\t494982\n沾化县\t494983\n兔子头\t494984\nTrello\t494985\nCyndi\t494986\nscx-3401\t494987\nrax\t494988\n大森林\t494989\n1176\t494990\nlol.178.com\t494991\n开房\t494992\n野象谷\t494993\n阻击手\t494994\n苯乙醇\t494995\n中信网铭鸽\t494996\n车商\t494997\n2012年6月\t494998\n你的偶像\t494999\nIO\t495000\n上海市皮肤病医院\t495001\n恒昌学院\t495002\n无罪谋杀\t495003\n多瑙影院\t495004\nRDS\t495005\n首重\t495006\n吉政\t495007\n4双\t495008\nMaria
n\t495009\n吴家村\t495010\nPron\t495011\n期货投资\t495012\n成安\t495013\n唰唰\t495014\n2五\t495015\n罗天\t495016\n审计报\t495017\ngumi\t495018\nHEAVEN\t495019\n电磁感应定律\t495020\n海德堡大学\t495021\n第114号\t495022\n乡村爱情小夜曲\t495023\nblm\t495024\n陈泽\t495025\n25t\t495026\n西宁\t495027\n嘉蕊\t495028\n赫本\t495029\n2018-04-15\t495030\n退学\t495031\n不可思议\t495032\n东方神\t495033\n沿江高铁\t495034\n3第一章\t495035\n欧派电动车\t495036\nPC端\t495037\n十二黄金圣斗士\t495038\n明扬\t495039\n韶姬\t495040\n27000\t495041\n100多天\t495042\n人事报\t495043\n12杯\t495044\nTEMP\t495045\n高岩\t495046\n聂离\t495047\n诗普琳\t495048\n3分之二\t495049\n喷神\t495050\n北语\t495051\nHurry\t495052\n上海机关\t495053\noutbreak\t495054\n2017年5月26日\t495055\n阳春\t495056\n0.4m\t495057\n波尔多小镇\t495058\nKanojo\t495059\n有奖调查\t495060\nMC百科\t495061\n超山\t495062\n能干\t495063\n电讯盈科\t495064\n单元\t495065\nmultidex\t495066\n北太平庄\t495067\n战队\t495068\n探险者\t495069\n仕女\t495070\nvixen\t495071\nm3u\t495072\n贺新年\t495073\nEsprit\t495074\n简本\t495075\n汽车江湖网\t495076\n太阁立志传\t495077\n弯矩\t495078\nSpring-Boot\t495079\n2010b\t495080\nboxplot\t495081\n阿里云计算有限公司\t495082\n连带责任保证人\t495083\n医疗保险基金\t495084\n铃片\t495085\n中值定理\t495086\n急性肺栓塞\t495087\n司马台\t495088\n纲举\t495089\n手写体\t495090\n电子邮箱\t495091\nDVT\t495092\n高能医少\t495093\n航\t495094\ne900\t495095\n身体力行\t495096\n姑射山人\t495097\n富士胶片\t495098\n第六项\t495099\n出口\t495100\n卖炭翁\t495101\n直肠息肉\t495102\n马自达6论\t495103\n海盐县政府\t495104\nsamantha\t495105\n普信\t495106\nctor\t495107\n液压挖掘机\t495108\n酒味\t495109\nAABC\t495110\n市中级人民法院\t495111\n岗集镇\t495112\n大冠军杯\t495113\n610路\t495114\n未尝\t495115\n高碘酸钠\t495116\n美妈旅行记\t495117\n学学\t495118\nmirna\t495119\n绿雪芽\t495120\n新保险法\t495121\n8710\t495122\nfalsk\t495123\n辽宁师范\t495124\n艾克佐迪亚\t495125\nDataSource\t495126\n球架\t495127\nstrcat函数\t495128\n柯岩街道\t495129\n0.05元\t495130\napi-ms\t495131\n李树银\t495132\nInstruments\t495133\n八神庵\t495134\n蓝调庄园\t495135\n达克宁\t495136\nFACTORY\t495137\n长款\t495138\n2015-11\t495139\n羊肉汤\t495140\n池州日报社多媒体数字报\t495141\nTerritory\t495142\n龙形\t495143\n分流器\t495144\n按理\t495145\n库尔德人\t495146\n_凤网\t495147\n陕西煤矿安全监察局\t495148\n值不值钱\t495149\n在哪
里\t495150\n十几小时\t495151\n人之常情\t495152\n中国农业部\t495153\n心跳文学部\t495154\n终止子\t495155\n毕业登记表\t495156\n家电维修论坛\t495157\nPClady摩登\t495158\n规模性\t495159\n乘势而上\t495160\n红檀木\t495161\n金像奖\t495162\nnid\t495163\ne络盟\t495164\n整理柜\t495165\nintuition\t495166\n大东\t495167\n免卡\t495168\nV5.50\t495169\n基本养老保险\t495170\n享爱\t495171\n德技\t495172\n二黄\t495173\n93号\t495174\n韧\t495175\ners\t495176\n市口\t495177\n华辉\t495178\n瑞金南路\t495179\n电子对账单\t495180\n陆云\t495181\n马尼拉\t495182\n9k\t495183\nz97\t495184\n头盘\t495185\n宿州\t495186\n苏金单抗\t495187\n37.7\t495188\n20160114\t495189\n蜡笔小新\t495190\n每一秒\t495191\n课外授业\t495192\n钢板\t495193\n四氟板\t495194\n看遍\t495195\n有源晶振\t495196\n银背大猩猩\t495197\n4P\t495198\n北京交通广播\t495199\n梦参\t495200\n傅华\t495201\n五大发电集团\t495202\n影音先锋资源_影音先锋电影_影音先锋看片网站_先锋影院\t495203\n剪藏\t495204\n三生制药\t495205\n宾得k70\t495206\n福地\t495207\n螺蛳粉\t495208\nJeep指南者\t495209\n忘不了你\t495210\n刘恋\t495211\n绿牌\t495212\n皇岗\t495213\n纳德\t495214\n刘耀\t495215\nglobalization\t495216\n165个\t495217\n杨奎松\t495218\nGoogle身份验证器\t495219\n椎骨\t495220\n摩擦离合器\t495221\n我的错\t495222\n公考论坛\t495223\n小微贷\t495224\n流动资金贷款\t495225\n复合树脂\t495226\n中站\t495227\n扬州地区\t495228\n64|\t495229\n水晶珠\t495230\n公益广告词\t495231\n画语\t495232\n35本\t495233\n阴阳界\t495234\n蟹味菇\t495235\n爱尔康\t495236\n208号\t495237\n内源\t495238\n打破\t495239\n两点\t495240\na7s2\t495241\n三好街\t495242\n农业新闻网\t495243\n叛国\t495244\n北京旅行社\t495245\n谨言慎行\t495246\n攻牙\t495247\n川军\t495248\n北滘镇\t495249\n提优\t495250\n坐地铁\t495251\n出杆\t495252\nUniversal\t495253\nXHTML\t495254\n乐歌\t495255\nexpres\t495256\n尼奥普兰\t495257\n所望\t495258\n佳能m6\t495259\nposs\t495260\n70码\t495261\n删减\t495262\n第十三\t495263\n熔体\t495264\n正立\t495265\n韦伯\t495266\n东山晴后雪\t495267\n绿色汉化版\t495268\n航务\t495269\n特体\t495270\nshuts\t495271\n自动词\t495272\nT95\t495273\n超哥\t495274\nBKK\t495275\n竖道\t495276\n美好时光\t495277\n物理暴击\t495278\n70A\t495279\n5.4.2\t495280\nArrays\t495281\n上海社会保险办事网_上海市人力资源社会保障\t495282\n土地储备管理办法\t495283\n160610\t495284\n高斯分布\t495285\n命数\t495286\nDecentralized\t495287\n搬经镇\t495288\n四页\t495289\n欺负\t495290\n大白兔\t495291\n串联谐振\t495292\n体察\t495293
\n小蛙\t495294\n双面胶\t495295\n冷小莫\t495296\n哈利·波特\t495297\n七版\t495298\nNIT\t495299\n胡姓\t495300\n我亲爱的祖国\t495301\nBaking\t495302\n嫖文\t495303\n扫黑\t495304\nincorrect\t495305\n奇迹世界\t495306\n卡伊\t495307\nTSSD\t495308\n扑水\t495309\n喉舌\t495310\n500余\t495311\n11.2.0.4\t495312\n华图网校\t495313\n战纪\t495314\n75家\t495315\n姜小妖\t495316\n星级酒店\t495317\n重样\t495318\n车厢板\t495319\n50美分\t495320\nwatchos\t495321\nuplay\t495322\n雨浩\t495323\n退汇\t495324\n冠名\t495325\nShowGirl\t495326\n组织化\t495327\n一典\t495328\n搜狗搜索引擎\t495329\n橘中秘\t495330\nvov\t495331\n刘肖\t495332\n孤岛惊魂5吧_\t495333\n康复医院\t495334\n险要\t495335\nsrm\t495336\n201708\t495337\n201410\t495338\n柔光箱\t495339\n可达\t495340\n李川\t495341\n人治\t495342\n20分钟\t495343\n王益\t495344\n单行材料\t495345\n1719\t495346\nherd\t495347\n2018年\t495348\n实审\t495349\n升位\t495350\n迪士尼神奇英语\t495351\n蒙牛乳业\t495352\n乡村振兴战略大家谈\t495353\n杭州市民卡网上服务厅\t495354\n东荟城\t495355\n额尔古纳乐队\t495356\n10平方公里\t495357\nReloaded\t495358\n坐台\t495359\n死亡线\t495360\n荣耀坦克\t495361\n观澜街道\t495362\n罚款\t495363\n富丽华\t495364\n迈思\t495365\n独孤天下般若\t495366\n官名\t495367\n电动打包机\t495368\n蒸压加气混凝土\t495369\nUSB3\t495370\nHerbal\t495371\n黄玉兰\t495372\nserdes\t495373\n1月1日起\t495374\n分布律\t495375\n五气\t495376\n海口东\t495377\nsore\t495378\n无源蜂鸣器\t495379\n船帆\t495380\n淘宝联名信用卡\t495381\n哈素海\t495382\n巴神\t495383\n记忆棒\t495384\n此君\t495385\n瑞妍\t495386\n1-12号\t495387\n南阳师范学院\t495388\nrevolving\t495389\n高女\t495390\n猪头肉\t495391\n赌坊\t495392\n天馈线\t495393\n安徽小学\t495394\nC10\t495395\n熊派\t495396\n吝\t495397\n民用电\t495398\n吴江汽车站\t495399\n元分析\t495400\n仓埠\t495401\n大龙\t495402\n轻断\t495403\nhzy\t495404\nsupporting\t495405\n紫荆花漆\t495406\nHat\t495407\n受力\t495408\n塔可夫斯基\t495409\n沙僧\t495410\n美人税\t495411\n报告稿\t495412\n同流\t495413\n附用\t495414\n交杯酒\t495415\n山体滑坡\t495416\n档号\t495417\n铍青铜\t495418\n世贸商城\t495419\nNII\t495420\n大槻\t495421\nzyz\t495422\n破坏性\t495423\n卤鸡爪\t495424\n400mm\t495425\n修定\t495426\n互惠符\t495427\nCDFI\t495428\ndi\t495429\n纸巾纸\t495430\n安顺火车站\t495431\nN7\t495432\n鲁莽\t495433\n仙流\t495434\n一鼎\t495435\ninsex\t495436\n唐豆豆\t495437\n不告\t495438\n晨光烧饼\t495439\n增距镜\t4
95440\n一滴水\t495441\n2017年10月10日\t495442\n县丞\t495443\n2招\t495444\nFPP\t495445\n四惠东\t495446\n南京东路外滩\t495447\n五龙电动车\t495448\n金瓶梅\t495449\nBP神经网络算法\t495450\nloyalty\t495451\n柏原崇\t495452\n非公开发行公司债\t495453\n腌腊肉\t495454\n百件\t495455\n38_\t495456\n艾景奖\t495457\n智化寺\t495458\n大菜\t495459\n8元\t495460\n姚明\t495461\n官爵\t495462\n50多部\t495463\nSHC\t495464\n控江中学\t495465\n下姜村\t495466\n都会\t495467\n50x50\t495468\n权威人士\t495469\n杨东平\t495470\n阜阳机场\t495471\n新梅江\t495472\n快切\t495473\n罗毅\t495474\n心术\t495475\n诗晴\t495476\nSP8\t495477\n长安新奔奔\t495478\n芝兰之室\t495479\n4v\t495480\n四分卫\t495481\n蓉欧快铁\t495482\njiejie\t495483\n售后回租\t495484\n叶柔\t495485\nNordstrom\t495486\n成份\t495487\n上海体育场\t495488\n郑柔美\t495489\n大姐\t495490\n热高乐园\t495491\n仙乐\t495492\nミセスジャンキ\t495493\n金科天宸\t495494\n宜州\t495495\n歼击机\t495496\n一醉方休\t495497\n飘邈\t495498\nCure\t495499\nmonsta\t495500\n浔龙河\t495501\n美国宾夕法尼亚大学\t495502\n蒸发皿\t495503\n汉奸\t495504\n爱微电影网\t495505\n走在路上\t495506\n德图\t495507\n蔚然\t495508\n认知主义\t495509\n色琪琪\t495510\nSimba\t495511\n刑法学\t495512\n人工林\t495513\n色情网\t495514\nPOSIX\t495515\n狄瑞吉\t495516\nwt\t495517\n寂寥\t495518\n申港街道\t495519\n马勃\t495520\nPrinciple\t495521\n96猫\t495522\n土方岁三\t495523\n安居房\t495524\n阮一峰\t495525\n北京南苑机场\t495526\n野战军\t495527\n美术品\t495528\n非理性\t495529\n爸妈\t495530\n南宁东站\t495531\nwww.3dmgame\t495532\n悔改\t495533\nctrl键\t495534\n义勇军\t495535\n防晒品\t495536\n安宁温泉\t495537\n等价关系\t495538\n农业银行个人网上银行\t495539\nHorror\t495540\n郭茂倩\t495541\nb1b2\t495542\n上古卷轴5skse\t495543\n001409\t495544\n阿进\t495545\n一剑封喉\t495546\nCerebral\t495547\n应山\t495548\nP20pro\t495549\ndirectx11\t495550\n闲章\t495551\n李钟硕\t495552\n预包装食品\t495553\n抗逆性\t495554\nDating\t495555\n享值\t495556\n金旗舰\t495557\n双速\t495558\n福特4S店\t495559\n风象\t495560\n新舞\t495561\nAEIS\t495562\n窝子\t495563\n三国群雄传\t495564\n氢化蓖麻油\t495565\n南宁装修公司\t495566\n六八\t495567\n民法典\t495568\n失物招领网\t495569\n九国\t495570\n汤晶媚\t495571\n修真类\t495572\n第一声\t495573\n皮亚诺\t495574\n建字\t495575\nbloodborne\t495576\n传热\t495577\n轴向力\t495578\n海洋水\t495579\n找春天\t495580\n奇域\t495581\nMAC\t495582\nh61\t495583\n全麦馒头\t495584\n沟底\
t495585\n龙山镇\t495586\nSn\t495587\n通商律师事务所\t495588\n心疼\t495589\n高萌\t495590\n奥体店\t495591\nacf\t495592\n方剂学\t495593\n腮腺瘤\t495594\n3000g\t495595\n0007\t495596\n载荷谱\t495597\n胯下\t495598\n破歌\t495599\n李小加\t495600\n完成额\t495601\n香港快递\t495602\nnag\t495603\nRMXP\t495604\n12V100AH\t495605\n观赏鸟\t495606\n重点\t495607\n有识之士\t495608\n普华永道\t495609\n冯女郎\t495610\n血滴\t495611\n春药\t495612\ncollapsible\t495613\n金量\t495614\n暗影猎人\t495615\n福彩网\t495616\ntimestamp\t495617\n无私\t495618\n11bgn\t495619\nHera\t495620\nTee\t495621\n带扣\t495622\n小麻花\t495623\na7rm2\t495624\n捐衣\t495625\nLED日光灯管\t495626\n杨丽君\t495627\n第十二部\t495628\n第13周\t495629\n百度云链\t495630\n3图集\t495631\n胸露乳\t495632\n消化腺\t495633\n西安市中心医院\t495634\n美里有纱\t495635\nhighchart\t495636\n初裸\t495637\n苍云传\t495638\n草泥\t495639\n尼群地平片\t495640\n跳箱\t495641\n间隔棒\t495642\n初稿\t495643\n孛儿只斤\t495644\nasian\t495645\n不依不饶\t495646\n宣传案\t495647\n指令\t495648\n7200\t495649\n竹兰\t495650\n1.0.30\t495651\n康乐园\t495652\n奖代\t495653\n多日\t495654\nLUN\t495655\nFlyme5\t495656\n单循环\t495657\n9260\t495658\n小型化\t495659\nasx\t495660\n杨又歌\t495661\n王维\t495662\n与生俱来\t495663\n拷边\t495664\nsupport\t495665\n戈隆\t495666\nSolidworks\t495667\n金属探测器\t495668\n百子\t495669\n左拐\t495670\n启辉\t495671\n长城宽带商城\t495672\n新速\t495673\n国家主\t495674\n宜昌北站\t495675\n森海赛尔\t495676\n8086/8088\t495677\n谷穗\t495678\nheist\t495679\n可塑性\t495680\n中铁十局\t495681\n介入科\t495682\n博学多才\t495683\n李根\t495684\n香草航空\t495685\n孔雀竹芋\t495686\nBlowing\t495687\nQSV\t495688\n兰州交通大学\t495689\n小心绝后\t495690\n央视影音\t495691\nDVB\t495692\n热映排片\t495693\n好难\t495694\n宿州职业技术学院\t495695\n爱股网\t495696\n热文\t495697\n湍流\t495698\n考培\t495699\n微府\t495700\n清平\t495701\n绝缘纸\t495702\n中南建筑设计院股份有限公司\t495703\n闻香识\t495704\nCalcium\t495705\n中国民航信息集团\t495706\n周口店镇\t495707\n铭源\t495708\n被埋\t495709\n广西网络广播电视台\t495710\n自考吧_\t495711\n义者\t495712\n智慧酒店\t495713\n秘密\t495714\n兵符\t495715\n广钢\t495716\n节外生枝\t495717\n牌库\t495718\nsolidWorks\t495719\n渝国\t495720\n霸王餐\t495721\nhao315.com\t495722\nZenBook\t495723\n暴雪蓝帖\t495724\n张拉膜\t495725\n食土鲷\t495726\n环形变压器\t495727\n中国达人秀\t495728\n吉雪萍\t4957
29\nCops\t495730\nDaddy\t495731\n皮影客\t495732\n网头\t495733\n一叶障目\t495734\n诱人\t495735\n李小白\t495736\n科沃斯地宝\t495737\n风清云\t495738\n李现\t495739\n湿庞\t495740\n题画\t495741\n钉箱机\t495742\n文昌位\t495743\n藤架\t495744\n知识产权服务业\t495745\n纸醉金迷\t495746\nAging\t495747\nSir\t495748\n美诗\t495749\n济南东部\t495750\n奈斯\t495751\n大叔的爱\t495752\n栏栏\t495753\n南沟\t495754\nulzzang\t495755\n移民村\t495756\n辨色\t495757\n76%\t495758\nwjbooks\t495759\nPowerVault\t495760\n8770w\t495761\n南卡罗来纳州\t495762\nImitation\t495763\n使用费\t495764\n_群名\t495765\n福田保税区\t495766\njspdf\t495767\n2017年7月1日\t495768\n地包天\t495769\n拉兹\t495770\n肝气\t495771\nLog4\t495772\n自助游\t495773\n东郭\t495774\ncheckstyle\t495775\n第一把位\t495776\n戚夫人\t495777\n百小度\t495778\n陌生男女\t495779\n牡丹苗\t495780\n稽核\t495781\n暖玉\t495782\n钩头\t495783\nBasler\t495784\n贤妻\t495785\n8110\t495786\n有林\t495787\n驼人\t495788\n潜热\t495789\n爱心行动\t495790\n中国人民银行杭州中心支行\t495791\n一九八四\t495792\ntome4吧\t495793\nQPython\t495794\n采购信息网\t495795\n吴江区\t495796\n长江学者\t495797\n散户查股网\t495798\n运动操\t495799\n加属\t495800\n周评\t495801\n长春华山皮肤病医院\t495802\n白塞病\t495803\n撒哈拉以南非洲\t495804\n浩顺\t495805\n56\t495806\n经验书\t495807\n0.25元\t495808\n杨睿\t495809\nchien\t495810\n相亲群\t495811\norem\t495812\nシア娘\t495813\nFeeder\t495814\n2016年1月份\t495815\n宅斗生活书库\t495816\nRISING\t495817\n新疆教育厅\t495818\nMycat\t495819\n环沪\t495820\n电动车网\t495821\n打上花火\t495822\n黄杨木雕\t495823\n估价师\t495824\n面骨\t495825\nfancam\t495826\n神船\t495827\nTTG\t495828\nSw\t495829\n西方文明通论\t495830\nfranchise\t495831\n巴黎春天\t495832\n林山\t495833\n五金件\t495834\n北京客\t495835\n直跑\t495836\nSpatial\t495837\nvsan\t495838\nmoses\t495839\n教练机\t495840\n扁型\t495841\n薛强\t495842\n湖北八校\t495843\n大不了\t495844\n枳\t495845\n簇桥\t495846\n变轨\t495847\n安室奈美恵\t495848\ntroubles\t495849\n将来的你\t495850\nssd1306\t495851\nlibratone\t495852\nGTX950\t495853\n女总裁\t495854\n前置后驱\t495855\n迁西\t495856\n2011年10月\t495857\n伊苏树海\t495858\nasshole\t495859\n侯景\t495860\n羊水指数\t495861\n面积和面积单位\t495862\n碳素结构钢\t495863\n擅\t495864\nPAC\t495865\n昨天中午\t495866\n飞鹤乳业\t495867\n亮窗\t495868\n生物科技有限公司\t495869\n别克君威GS\t495870\n江苏省自然科学基金\t4
95871\n吴庆芝\t495872\n警督\t495873\n散人传说\t495874\nWin7窗口切换小技巧_\t495875\nv2.1.5\t495876\n文广新局\t495877\n湘电股份\t495878\n瓦楞纸\t495879\n加价\t495880\n四川财经职业学院\t495881\n光绪元宝库\t495882\n服表\t495883\n红掌柜\t495884\n第15\t495885\n吕方\t495886\n人工繁殖\t495887\n非洲花梨木\t495888\n基督教堂\t495889\nr1d\t495890\n动手\t495891\ns2110\t495892\n法兰绒\t495893\nblackpink\t495894\n拉比克\t495895\n中华人民共和国史\t495896\n上海临港\t495897\n阴茎癌\t495898\n薇妮\t495899\nReports\t495900\n威海社区\t495901\n采比\t495902\n超级奶爸\t495903\n1-2天\t495904\n艾舍尔\t495905\n中国企业网\t495906\n鲁谷\t495907\nE41\t495908\n城外\t495909\n龙游石窟\t495910\n大胃王mini\t495911\nE555\t495912\ndecades\t495913\n第三条\t495914\nKirara\t495915\n可交换债券\t495916\n三智\t495917\n广核\t495918\n半根\t495919\n灰橙\t495920\n乐跑赛\t495921\n鱼店\t495922\nTheshy\t495923\n冷冻机\t495924\n1.9%\t495925\nsheep\t495926\nre文件管理器\t495927\n女师\t495928\n影视鉴赏\t495929\ntab3\t495930\n事类\t495931\n兰琪\t495932\n妇幼\t495933\n我想有个家\t495934\n柴荣\t495935\nQY\t495936\n缘何故\t495937\n性恶论\t495938\n胡一帆\t495939\n党卫军\t495940\n镂空板\t495941\n绝地逢生\t495942\n高艳\t495943\n钓场\t495944\n源味\t495945\n爆刺\t495946\nUSANA\t495947\n京味\t495948\n哈雷摩托\t495949\n冲散\t495950\nlinux子系统\t495951\n高才生\t495952\n艺术馆\t495953\n富贵论坛\t495954\n商务卫士\t495955\n雷庆瑶\t495956\n毕沅\t495957\nvr相片级成品参数值\t495958\nDevelopments\t495959\n勇者斗恶龙怪兽篇Joker3\t495960\n帅康燃气灶\t495961\nupu\t495962\n墓葬群\t495963\n水香\t495964\n康掌柜\t495965\n运输队\t495966\n念慈\t495967\nconten\t495968\n欢乐家\t495969\n葵花籽\t495970\n前距离\t495971\nanalytic\t495972\n斬\t495973\n前置前驱\t495974\n雨城\t495975\n畅购\t495976\n温升\t495977\n初探\t495978\n马红漫\t495979\n比特儿\t495980\n台海\t495981\n拌饭\t495982\n木瓜汤\t495983\n丽江束河古镇\t495984\n清單\t495985\n监查员\t495986\nCRX\t495987\nsciencedirect\t495988\n果梨\t495989\nbtn\t495990\n活鸡\t495991\n体验度\t495992\nInfoSphere\t495993\n极\t495994\n跳蛛\t495995\n杭州经济开发区\t495996\n二元二次方程\t495997\n惊天大阴谋\t495998\n那霸\t495999\n磨豆机\t496000\n尼康d7100\t496001\ndan\t496002\n空士\t496003\n中国互联网协会\t496004\n空层\t496005\n原子\t496006\n雷神模拟器\t496007\n半自动\t496008\n休闲游\t496009\n繁殖场\t496010\nOpenFileDialog\t496011\n纸娃娃\t496012\nbauer\t496013\n千万富豪\t496014\n风
之恋\t496015\n32.768\t496016\n脐\t496017\n0834\t496018\n人民大道\t496019\n主观唯心主义\t496020\n诽\t496021\n新寓\t496022\n下午3点\t496023\n牛肉干\t496024\n江苏省民政厅\t496025\n七星路\t496026\n公卫助理医师\t496027\n翅膀\t496028\n升迁\t496029\n1.48\t496030\nMHX\t496031\n国家基础地理信息中心\t496032\nJavaSE\t496033\ndn500\t496034\n如磨\t496035\n3A\t496036\nInn\t496037\n轻坦\t496038\n网线钳\t496039\n众视网\t496040\nimitate\t496041\n天南星\t496042\n丝网印\t496043\n彭雷\t496044\n普及率\t496045\n光明路\t496046\n地海战记\t496047\n脱裤\t496048\n中国现代文学史\t496049\n1.4亿元\t496050\nRenderer\t496051\nHDWMV\t496052\n性幻想\t496053\nStartups\t496054\nMDict\t496055\n儿童学习网\t496056\nSuits\t496057\n19点\t496058\nweb-inf\t496059\n粉葛\t496060\n4pro\t496061\n4英尺\t496062\n昏倒\t496063\n秦\t496064\n北京大学政府管理学院\t496065\n纯文字\t496066\n罗马文\t496067\n班团\t496068\nJayden\t496069\n人大会\t496070\nMINISO名创优品\t496071\n上海玻璃博物馆\t496072\n霹雳布袋戏\t496073\n蒙巴萨\t496074\n判若\t496075\n潍坊一中\t496076\n腾讯大厦\t496077\n洁肤\t496078\n&amp\t496079\n梦寐\t496080\n何应钦\t496081\n基罗\t496082\n山形县\t496083\n红色警戒2共和国之辉_尤里的复仇\t496084\n住手\t496085\n绝地求生雷达\t496086\n3255\t496087\n广佛通\t496088\n识谱\t496089\n值勤\t496090\n叫\t496091\nT.O.P\t496092\n开远\t496093\n互斥\t496094\nWedding\t496095\nT630\t496096\n890万\t496097\n蛇王\t496098\nSeasonal\t496099\n撂荒\t496100\nEQT\t496101\n消化炉\t496102\n蓝月网页游戏\t496103\n一带\t496104\n120句\t496105\n赠与税\t496106\n邪恶漫画大全母系大全\t496107\n5.14\t496108\nmp41\t496109\n第二部\t496110\nallroad\t496111\n铁腰\t496112\n高速铁路接触网\t496113\n雷凌\t496114\n西安欧亚学院\t496115\n奈达\t496116\n正大\t496117\n宝色\t496118\n禹枫\t496119\n8.1.0.3372\t496120\n战略\t496121\n念佛号\t496122\n经历\t496123\n真空技术网\t496124\n2360\t496125\n教学改革\t496126\n此世\t496127\nfleam\t496128\nSurfer\t496129\n家xbiao\t496130\n应受\t496131\nCocoon\t496132\n什么时候\t496133\nRX-78-2\t496134\n卡罗拉论坛_汽车之家论坛\t496135\n珠光宝气\t496136\ndevelop\t496137\n爱森\t496138\nWeblog\t496139\n纹理\t496140\n腰带\t496141\nexpects\t496142\n机师\t496143\n105万\t496144\n清远网络广播电视台\t496145\ndowm\t496146\n省\t496147\n2010世界杯\t496148\nh邪恶漫画\t496149\n陆涛\t496150\n于莎莎\t496151\n薄锐\t496152\nzune\t496153\nnicholas\t496154\n依山郡\t496155\n铁山港\t49
6156\n樱木花道\t496157\n纹唇\t496158\nbajjk\t496159\n乐天百货\t496160\n灰昼\t496161\n柔力球\t496162\nnumel\t496163\ntse\t496164\n赛睿霜冻之蓝\t496165\n墨趣\t496166\n杀价\t496167\n联想y430p\t496168\n灰哥\t496169\n林智妍\t496170\n越江\t496171\n双士\t496172\n吐温\t496173\n整形\t496174\n春夏秋\t496175\n闯荡\t496176\n桠溪\t496177\n奥美广告\t496178\nI类\t496179\n附加\t496180\n足三里穴\t496181\nyara\t496182\n多块\t496183\n院墅\t496184\n麦克法兰\t496185\n绞吸船\t496186\ntabwidget\t496187\n酋长\t496188\n弹出去\t496189\n胜经\t496190\n兰思诺\t496191\nParsing\t496192\n73\t496193\n描写花\t496194\n4吨\t496195\n详细页\t496196\nGoldman\t496197\n熔池\t496198\n红酒世界网\t496199\n难以忍受\t496200\n274\t496201\nkernal\t496202\n牛顿第三定律\t496203\n妖鬼\t496204\n文化性\t496205\n戈林\t496206\n脱硫脱硝\t496207\n大卫·鲍伊\t496208\n翠绿色\t496209\n管平湖\t496210\n五天内\t496211\n上午11点\t496212\nSnowman\t496213\n很全\t496214\n广州证券股份有限公司\t496215\n12章\t496216\n主战坦克\t496217\n谦虚谨慎\t496218\n9877h小游戏\t496219\n中移铁通有限公司\t496220\n迈速表\t496221\n二级注册建筑师考试\t496222\n幻化师\t496223\njqurey\t496224\n出街\t496225\n天柱山\t496226\nQQ游戏网\t496227\n卖力\t496228\n丁锐\t496229\n本诉\t496230\n佐山\t496231\n气候变暖\t496232\n江西省环境保护厅\t496233\n成电新闻网\t496234\n省新闻出版广电局\t496235\n数据库连接池\t496236\n琥珀酸美托洛尔缓释片\t496237\nWBFS\t496238\n离不开\t496239\n金华市区\t496240\n飞行动力学\t496241\nteflon\t496242\n抓野鸡\t496243\n三神柱\t496244\n东尼大木\t496245\n世界四大博物馆\t496246\ntedeum\t496247\n道德与法制\t496248\n57平米\t496249\n孙杰\t496250\n洛奇英雄传单机版\t496251\nmp236\t496252\n汽研\t496253\n西科\t496254\n邮政机\t496255\nYankee\t496256\n增长液\t496257\n5月15\t496258\n金安区\t496259\n远战\t496260\n熏风\t496261\n申月\t496262\n卓老板\t496263\n锦瑟无端五十弦\t496264\n陈本\t496265\n福州机场\t496266\n胡玫\t496267\n槽盒\t496268\n萘\t496269\niter\t496270\n别开玩笑\t496271\n稀释液\t496272\n光棍村\t496273\n小猴子\t496274\n南投\t496275\n北段\t496276\n抢救\t496277\nTipask\t496278\n若干\t496279\n玫瑰水\t496280\n尧都\t496281\narw\t496282\n儿童学\t496283\n小柜\t496284\n合股\t496285\n俊美\t496286\n微信公众平台号\t496287\n时规\t496288\nPyCrypto\t496289\n亢进\t496290\n邵逸夫\t496291\n程默\t496292\n横式\t496293\n天津经济技术开发区政务服务平台_泰达政府\t496294\nsqlserver2008r2\t496295\nKONG\t496296\n威驰奔驰\t496297\n休比\t496298\nmihoyo\t496299\n流读取
\t496300\n泳照\t496301\n三国志11PK\t496302\n保全\t496303\ncc域名\t496304\n第一百四十一章\t496305\n得瑞\t496306\n梦貘\t496307\nKET\t496308\n无骨鸡柳\t496309\n列支敦士登\t496310\n克服\t496311\n业务流程管理\t496312\n北舞\t496313\n水浴\t496314\n东方帝国\t496315\n坚持己见\t496316\n冷若冰霜\t496317\nCarnegie\t496318\n320个\t496319\nIPL\t496320\nbillboard\t496321\n压强计\t496322\n白肉\t496323\n东盟自贸区\t496324\nkeydown\t496325\n第九周\t496326\n笑死不偿命\t496327\n灌丛\t496328\ncplus\t496329\n紧急救护120\t496330\n中国职业技术教育网\t496331\n爱乐汇\t496332\njoe\t496333\n捡漏\t496334\n23.0\t496335\n小猪佩琪\t496336\npicker\t496337\n邦妮\t496338\nswimming\t496339\n铜房网\t496340\n2KG\t496341\n东二路\t496342\nFoamposite\t496343\n黑狐之风影\t496344\n戴尔灵越燃7000\t496345\n宝源\t496346\n夜闯寡妇村\t496347\n8秒\t496348\n脚杯\t496349\nルズ\t496350\n电加热棒\t496351\nsims3\t496352\n王梓\t496353\n注\t496354\n调查员\t496355\n5.5小时\t496356\n算什么男人\t496357\nSCX-4521HS\t496358\nliteide\t496359\nDN\t496360\n殊途同归\t496361\n彩虹纹\t496362\nbgn\t496363\n店员\t496364\nMT7620A\t496365\n奢易\t496366\n农艺\t496367\nlibi\t496368\n完事儿\t496369\n弧焊\t496370\n银边\t496371\n华道\t496372\n清污机\t496373\n牛若丸\t496374\n110层\t496375\n武商众圆广场\t496376\nadore\t496377\n从江县\t496378\n取指\t496379\n千千小说网\t496380\nAPINK\t496381\n企业利润\t496382\nRRC\t496383\n38周\t496384\n宁波海事法院\t496385\nEntrance\t496386\n爱因斯坦相对论\t496387\npudong\t496388\n陶瓷罐\t496389\n搏\t496390\nBAT批\t496391\n第199集\t496392\n老火车站\t496393\n新粤\t496394\n方域\t496395\nwangmo\t496396\n粘结\t496397\n完备\t496398\n人行道板\t496399\n东北姑娘\t496400\nJComboBox\t496401\n绫濑诗织\t496402\n转调\t496403\n心岸\t496404\n苏州树山\t496405\n不看不知道\t496406\n龙卡汽车卡\t496407\n正己\t496408\n龙芯中科\t496409\n开心超人\t496410\n凤台路\t496411\nnet、cn\t496412\n冷却剂\t496413\nACROBAT\t496414\n中专证\t496415\n照像\t496416\n石家庄财经职业学院\t496417\n治愈系\t496418\n棉纺厂\t496419\n1V\t496420\n山体公园\t496421\n南昌地铁4号线\t496422\nNESPRESSO\t496423\n因素\t496424\n首行缩进2\t496425\n张晓林\t496426\n天才病\t496427\n牙具\t496428\n斜板\t496429\n阿蒙森\t496430\n车厢\t496431\n深圳市蛇口人民医院\t496432\n特吕弗\t496433\n总指挥\t496434\n欧委会\t496435\n布伦\t496436\nt130\t496437\n双核处理器\t496438\ndls\t496439\n祁红\t496440\n王君正\t496441\n帕拉丁\t496442\n中共
十八届四中全会\t496443\n9.123.159.41\t496444\n宝塔路\t496445\n角阀\t496446\n谷风冀雨\t496447\n管理人\t496448\nXDJM\t496449\n探测器\t496450\n狗奴\t496451\nH2\t496452\n文刀\t496453\n许仲琳\t496454\n中公网校\t496455\n重庆海关\t496456\n嗨嗨嗨\t496457\nsetex\t496458\n2017年1月5日\t496459\n维西县\t496460\n天津医院\t496461\n初报\t496462\n饭岛爱\t496463\n国产化率\t496464\nAnts\t496465\n本硕连读\t496466\n军务\t496467\n金融英才网\t496468\n天使帝国3\t496469\nfacial\t496470\n欢乐人\t496471\n玉鸟\t496472\n中国普法网\t496473\n全面型\t496474\n网易考拉\t496475\nWebAssembly\t496476\n畸形率\t496477\n一健\t496478\n智联卓聘网\t496479\n第9届\t496480\ngmi\t496481\nSurfaces\t496482\n广州自考网\t496483\n前导符\t496484\n上百条\t496485\n芝麻\t496486\n刘俊海\t496487\n茎叶\t496488\n4550\t496489\n花椒鸡\t496490\nxix\t496491\n成都市房管局\t496492\n6欧\t496493\nawz3\t496494\n战片\t496495\n巴尔坦星人\t496496\n斜方肌\t496497\n满客宝\t496498\n南梦宫\t496499\nbangkok\t496500\nxzbu\t496501\nsymfony\t496502\n寻乌人民政府\t496503\n宽带拨号\t496504\n湖南省律师协会\t496505\n张江高新区\t496506\nSpecific\t496507\n20170202\t496508\n成流\t496509\n洪涝\t496510\n辛多雷\t496511\n阜新市\t496512\nWonderware\t496513\n火龟\t496514\n中国生物技术股份有限公司\t496515\n讽喻\t496516\n货币银行\t496517\ncognition\t496518\n零卫\t496519\n谢大神\t496520\n捷径\t496521\n蔓蔓\t496522\n操虫\t496523\n聚乙烯醇缩丁醛\t496524\n基础步\t496525\nExcel/CSV\t496526\n阿尔金山\t496527\nexcel07电子表格\t496528\n跟我走\t496529\n街区\t496530\nRaised\t496531\n柳暗花溟\t496532\ncad2014序列号\t496533\n润燥\t496534\n待放\t496535\nRECORDS\t496536\n酷女孩\t496537\n云巴\t496538\n高尔夫球场\t496539\n拆车\t496540\n余地\t496541\nint16\t496542\n2相\t496543\n金孝渊\t496544\n爵士队\t496545\n异辛酸\t496546\n炮台\t496547\n崇敬\t496548\nJuNO\t496549\n原始值\t496550\n限拍\t496551\n成都中考网\t496552\nBWS\t496553\n西藏日报\t496554\n青岛市人民政府国有资产监督管理委员会\t496555\n傻话\t496556\n快乐向前冲\t496557\ndtw\t496558\navcodec\t496559\n地狱级\t496560\nwifi\t496561\n各科室\t496562\n平桥\t496563\n烟弹\t496564\n福娘\t496565\n汉卿\t496566\n崩坏学园3\t496567\n读书郎\t496568\n胸腔镜\t496569\n绣片\t496570\n太原百姓网\t496571\n辛夷坞\t496572\n淄川\t496573\n神雕山野生动物园\t496574\n南新\t496575\n氮源\t496576\n陷害\t496577\nG.E.M.邓紫棋\t496578\n大华府\t496579\nuedit\t496580\n爱奇艺黄金\t496581\n神经纤维瘤\t496582\nalienware\t496583\nS
CS\t496584\n鸣志\t496585\n简单\t496586\nf14\t496587\n乐拼\t496588\n乙类\t496589\n模电\t496590\n欧尚a800\t496591\n可贴\t496592\n察哈尔\t496593\n鹞鹰\t496594\n路凯\t496595\n花形\t496596\n10块\t496597\n魔羽\t496598\n组装电脑\t496599\n丁青\t496600\n永铭\t496601\n离人\t496602\nVarmilo\t496603\nhellowzl\t496604\n星宫\t496605\n帮帮龙\t496606\n藤崎龙\t496607\n底跃\t496608\nResort\t496609\n大杨镇\t496610\nA5000\t496611\n160元\t496612\n平面化\t496613\nexcutor\t496614\n身后\t496615\n刘雅丽\t496616\n变色镜\t496617\nEthanol\t496618\n深盘\t496619\n禁毒网\t496620\n魏尔伦\t496621\n第一季10集\t496622\n紫名\t496623\n开头段\t496624\nwedgwood\t496625\nA100\t496626\npci\t496627\n农业园区\t496628\n抖\t496629\n丘比龙\t496630\n国务院研究室\t496631\n长春地铁2号线\t496632\n疯\t496633\n荒野大镖客\t496634\n杨芸晴\t496635\n环路\t496636\nt9300\t496637\n阿拉贡\t496638\n黑门\t496639\n任天行\t496640\n38厘米\t496641\n光大水务\t496642\n铁器\t496643\n古驰\t496644\n中驰车福\t496645\n三江阁手机小说网\t496646\n无锡金桥双语实验学校\t496647\n乘法\t496648\n刷分\t496649\n长杆\t496650\n海盐网\t496651\n非深户港澳通行证\t496652\nthrasher\t496653\n存单\t496654\nbau\t496655\n汀州\t496656\nproblem\t496657\n登登登\t496658\n鹏起科技\t496659\n房产观澜_论坛\t496660\n中兴汽车\t496661\n去\t496662\n椒盐皮皮虾\t496663\n戍\t496664\n作祟\t496665\n10毫克\t496666\n夹钳\t496667\nansys18\t496668\n世嘉\t496669\n拉尔夫\t496670\nszm\t496671\n积少成多\t496672\n万丈红尘三杯酒\t496673\n枪火\t496674\n4r7\t496675\n美财长\t496676\n定则\t496677\n山脉\t496678\n练习机\t496679\n马蜂窝旅游网\t496680\nExchanger\t496681\ntailed\t496682\n水松缘\t496683\n武汉研发中心\t496684\nWinamp\t496685\n刘芊\t496686\n辽宁银监局\t496687\ntabBar\t496688\nSEAL\t496689\n青云门\t496690\nGTX1050Ti\t496691\n哈尔滨市南岗区\t496692\n骁龙615\t496693\ngrub\t496694\n加长\t496695\n0.4cm\t496696\n粗制滥造\t496697\n天装\t496698\nosal\t496699\n三农网\t496700\n广州工业大学\t496701\n39_\t496702\nppomo\t496703\n哈尔滨中药四厂\t496704\n井喷期\t496705\n20150119\t496706\n20141024\t496707\n兄弟宫\t496708\nWIIU\t496709\nathlon\t496710\nSwift4.0\t496711\n天龙八部3_多玩游戏网\t496712\n趣向\t496713\n1-2分钟\t496714\n三特索道\t496715\nconnections\t496716\n晏婴\t496717\ngrip\t496718\n倚\t496719\n外架\t496720\n王小国\t496721\n欧比特\t496722\n恒大雅苑\t496723\nchippai\t496724\n6.06\t496725\nEther\t496726\
n五雷\t496727\nCPU天梯图\t496728\nCOSME\t496729\n胶接\t496730\n华荣\t496731\nmiui论坛\t496732\n宋晓波\t496733\nsilvia\t496734\n正人\t496735\n西周\t496736\n吹哨\t496737\nab血型\t496738\n迅雷版\t496739\n如雨下\t496740\n咏鹅\t496741\n皇家方舟\t496742\nTPV\t496743\n查查吧\t496744\n3幢\t496745\n5月12日\t496746\n艾泽拉斯\t496747\n发瑞\t496748\n20161022\t496749\n韩流风尚\t496750\nv2.0.8\t496751\ndoxygen\t496752\nLMDB\t496753\nasg\t496754\n智识\t496755\n吴彦\t496756\n海水稻\t496757\n一争\t496758\nArduino\t496759\n109亿\t496760\n第24关\t496761\n那扇门\t496762\n47岁\t496763\n上海检察院\t496764\n7.5.2\t496765\n飞刀又见飞刀\t496766\n小蚁智能摄像机\t496767\nRuin\t496768\n数控磨床\t496769\n冰乙酸\t496770\nLaos\t496771\nEndian\t496772\n来访问\t496773\nrich-text\t496774\n江湖龙虎斗\t496775\n中国人民解放军海军总医院\t496776\n二级箱\t496777\n华企黄页网\t496778\n皮料\t496779\n第9期\t496780\n意外保险\t496781\n古今\t496782\n茶道ism\t496783\n民主村\t496784\nfiscal\t496785\n新倩女幽魂\t496786\n匈牙利布达佩斯\t496787\nPrintWriter\t496788\n双辽市\t496789\n校园文学\t496790\n外观\t496791\n终极蜘蛛侠\t496792\n亮氨酸\t496793\n重庆长寿湖\t496794\n钰龙\t496795\n刘爱国\t496796\n太子妃升职记\t496797\n三十几岁\t496798\n审计师考试\t496799\n真人mm\t496800\n双版\t496801\n蝴蝶飞飞\t496802\n戒撸\t496803\n2.9米\t496804\n鲁国\t496805\n低通\t496806\nFilled\t496807\n试油\t496808\n9排\t496809\nIntent\t496810\nlpfuture\t496811\n杀兄\t496812\n厚颜无耻\t496813\n红领巾\t496814\n高不可攀\t496815\n孔距\t496816\n刚域\t496817\n大陆法\t496818\nvoke\t496819\nBOARD\t496820\nQString\t496821\n首航\t496822\n椰油\t496823\n厚势\t496824\n可贵\t496825\n大清神捕\t496826\n黄裳\t496827\n药源\t496828\n航海王启航\t496829\n中国美术高考网\t496830\n轻量版\t496831\n深圳科技园\t496832\n第五组\t496833\n中国人口福利基金会\t496834\n整整\t496835\n蜚\t496836\n18周岁\t496837\n厦门政府\t496838\n石家庄铁路职业技术学院\t496839\n万恶之源\t496840\ndereference\t496841\nBdsm\t496842\n气生\t496843\n郁\t496844\n恶霉灵\t496845\n赤潮\t496846\n周克希\t496847\n呀路古热带植物园\t496848\n脑速\t496849\n纷呈\t496850\n鼎丰\t496851\n毒案\t496852\n阿里斯顿热水器\t496853\n伟东\t496854\n照骗\t496855\nMerge\t496856\n大党\t496857\n龙天论坛\t496858\n90万吨\t496859\n冲上\t496860\n厦门地区\t496861\nSkrillex\t496862\nIonic3\t496863\n大众创业万众\t496864\n440C\t496865\n横栏\t496866\nKiller\t496867\n争议性\t496868\n太行\t496869
\nopposition\t496870\n乙炔气\t496871\n冫\t496872\n去中心化\t496873\nprecious\t496874\n9570\t496875\n五行币\t496876\n丰台南路\t496877\n金清镇\t496878\n马会\t496879\nK12在线教育\t496880\n肉疙瘩\t496881\n277\t496882\n支架\t496883\n亚美能源\t496884\n601360\t496885\nproduce\t496886\n康芝药业\t496887\n滋味\t496888\nmbn\t496889\nSilo\t496890\n燕保\t496891\n火水\t496892\n代检\t496893\n几届\t496894\n奔驰E级论\t496895\n腊牛肉\t496896\n内分泌治疗\t496897\n泉州市教育局\t496898\n木有\t496899\n肤感\t496900\n坛经\t496901\n便捷性\t496902\n徐惠\t496903\n观众们\t496904\n抛物\t496905\n蛊毒\t496906\n使出\t496907\n45560\t496908\n海风股票论坛\t496909\nmmf\t496910\n华润双鹤\t496911\n体育锻炼\t496912\n恭敬\t496913\n护腕\t496914\n工资格证\t496915\n肠杆菌科\t496916\n打球\t496917\n小魔女学园\t496918\n裴擒虎\t496919\n梧桐花园\t496920\nNews\t496921\n颠倒\t496922\nbelieving\t496923\n鱼乐贝贝\t496924\n九龙网\t496925\n晏子春秋\t496926\n一念永恒\t496927\nhdri\t496928\nVuforia\t496929\n香洲\t496930\n冰哥\t496931\n鬼吹灯外传\t496932\n土拨鼠网\t496933\nSport\t496934\n我一个人\t496935\n渡鸦年\t496936\nmojo\t496937\n散文网\t496938\npsg\t496939\n起亚k4\t496940\n分辨率\t496941\n第14节\t496942\n皋月杜鹃\t496943\ns21\t496944\n脚色\t496945\n括约肌\t496946\n软化水\t496947\n海绵头\t496948\nkate\t496949\nBride\t496950\nRavenfield\t496951\n孙敬媛\t496952\n苏宁\t496953\nCCtalk\t496954\n新编三宝局长\t496955\n译林版五年级英语\t496956\n瀛海府\t496957\n我愿\t496958\n中华永久墓园\t496959\n渤海证券\t496960\n梁信军\t496961\n26号\t496962\nXJ\t496963\n动物公园\t496964\n神经根\t496965\n通盘\t496966\nwww.3566t.com\t496967\n七仙\t496968\n裴涩琪\t496969\n260X\t496970\n储藏箱\t496971\n复刊\t496972\n保温灯\t496973\n箭牌衣柜\t496974\nIV\t496975\n那拉\t496976\n节点板\t496977\nSpectrum\t496978\n拳师\t496979\n梦想改造家\t496980\n秭归\t496981\nxuan\t496982\n浪花\t496983\nex9000\t496984\n危机四伏\t496985\n减法\t496986\n雄帝科技\t496987\n二十几岁\t496988\nceremony\t496989\n吴瑶\t496990\n靶向制剂\t496991\n财团\t496992\nX99\t496993\nqualify\t496994\n临海市政府\t496995\n姓陈\t496996\nKmeans\t496997\nAria2\t496998\n日照日报\t496999\nmax2\t497000\n4.8分\t497001\n电磁场与电磁波\t497002\n衡阳市教育局\t497003\n回收利用\t497004\nSPF20\t497005\n来犯\t497006\n赵襄子\t497007\n桥本有菜\t497008\n落得\t497009\nDisplaying\t497010\n巴斯光年\t497011\nSensor\t497012\n六大派\t497013\
n0476\t497014\nchease\t497015\n乌镇饭局\t497016\n格兰之森\t497017\n小满\t497018\n奕轩\t497019\nHM\t497020\n毛骨悚然撞鬼经\t497021\n吕秀才\t497022\n新品标\t497023\n功\t497024\n许吉\t497025\n汽车接插件\t497026\n新影\t497027\n知末\t497028\n吴氏太极拳\t497029\n马杜罗\t497030\nxvl\t497031\n钱库\t497032\n九江一中\t497033\n批发行\t497034\n城国\t497035\nnicehash\t497036\n2月3日\t497037\nM436\t497038\nfallin\t497039\n光武帝刘秀\t497040\n才能\t497041\n湖南分公司\t497042\n龙眼\t497043\nMiko\t497044\nGOROOT\t497045\nMapGIS\t497046\n钟表\t497047\nEnlightenment\t497048\n临港经济区\t497049\n阎石\t497050\nhail\t497051\n滑条\t497052\nPhotoFans摄影网\t497053\nマ\t497054\n奥克兰\t497055\n傲世皇朝\t497056\nvMotion\t497057\n学工\t497058\n台州湾\t497059\n中国水科院\t497060\ncxp\t497061\n生死门\t497062\n一拜\t497063\n发声\t497064\n4驱\t497065\n绿建之窗\t497066\nARWU\t497067\nAutoDock\t497068\n算是\t497069\nMOS管\t497070\n最苦与最乐\t497071\nhellotalk\t497072\n十多万\t497073\n锦衣\t497074\n火旋风\t497075\n大科\t497076\nintrusion\t497077\n天狼星\t497078\n48.3\t497079\n114300\t497080\n敏捷性\t497081\n双灯趣\t497082\n无纺布袋\t497083\n河汉\t497084\n请见\t497085\nnedu\t497086\n维密秀\t497087\n音神曲\t497088\n112名\t497089\n校具\t497090\n乘出租车\t497091\n伽蓝\t497092\n南昌起义\t497093\nnetlogon\t497094\n160多个\t497095\nPEF\t497096\n五月天人生无限公司\t497097\npyopenssl\t497098\nscihub\t497099\n致美\t497100\n廖排骨\t497101\n发狠\t497102\nVoodoo\t497103\n圣战群英传\t497104\n队服\t497105\n河钢\t497106\n钱夹\t497107\nhole\t497108\npp塑料\t497109\n王开翠\t497110\n温州旅游网\t497111\n五硫化二磷\t497112\n糖网\t497113\n陆瑾年\t497114\n阿依\t497115\n创新创业公司\t497116\n积极型\t497117\nanaerobic\t497118\ngarbage\t497119\n赞化\t497120\n办公楼\t497121\n妇幼保健\t497122\n运河广场\t497123\n贷款卡\t497124\n副攻\t497125\nLEVEL\t497126\nnex6\t497127\n永兴镇\t497128\n黑石塔\t497129\n黄爱华\t497130\n料子\t497131\n搞清\t497132\n水曲\t497133\n旅游网-韩巢网\t497134\n山东魏桥创业集团有限公司\t497135\n五刀\t497136\n萦绕\t497137\n退修\t497138\n电饭\t497139\n贾汪\t497140\n花好月圆夜\t497141\n路桥川\t497142\n单刀赴会\t497143\nOkHttp\t497144\nautocommit\t497145\n银杏林\t497146\n高工\t497147\n省纪委\t497148\nJewelry\t497149\nwebsock\t497150\nGT1030\t497151\n□\t497152\n徐健\t497153\n庐阳教育体育信息网\t497154\nradical\t497155\n浪浪\t4971
56\n破落\t497157\n右轴\t497158\n面宽\t497159\n秀峰中学\t497160\ninventor\t497161\nDate型\t497162\n挥金如土\t497163\n8天\t497164\nDeepFreeze\t497165\n李海军\t497166\n渐进\t497167\n新开区\t497168\n俄亥俄州立大学\t497169\n1.35米\t497170\nOpenVAS\t497171\n力求\t497172\n倒位\t497173\n氧操\t497174\n而立之年\t497175\n帕杰罗论坛_汽车之家论坛\t497176\nmetlab\t497177\npointcut\t497178\n装备\t497179\n中原路\t497180\ncons\t497181\nch2\t497182\n56000\t497183\n新区医院\t497184\n合奏曲\t497185\n套袋机\t497186\n奥迪瑞虎\t497187\n浪子回头金不换\t497188\n公仪休\t497189\n二元\t497190\n赵栋\t497191\naojo\t497192\n第118集\t497193\nsylvie\t497194\nPSYCHO\t497195\n花馍\t497196\n交大医学院\t497197\n大连西岗区\t497198\n高腿\t497199\n新奇特\t497200\naboard\t497201\ndob\t497202\n三加二\t497203\n废铅蓄电池\t497204\n铅笔盒\t497205\nsuspected\t497206\n攻效\t497207\n百度糯米\t497208\n学生物\t497209\n键程\t497210\n神手\t497211\n总务科\t497212\n17发\t497213\nwisdom\t497214\n饱和蒸汽\t497215\n家乡\t497216\n预拌混凝土\t497217\n南方公园真理之杖\t497218\nOGNL\t497219\nx399\t497220\nmadame\t497221\n左宏元\t497222\n华商杂谈-华商论坛\t497223\n乏力\t497224\n龙固镇\t497225\n小筝\t497226\n超跑俱乐部\t497227\n热镀锌角钢\t497228\n吴镇\t497229\n仿藤\t497230\n4722\t497231\n教育部学位中心\t497232\n基里\t497233\n冠亚军\t497234\n女畜\t497235\n德曼\t497236\n酹\t497237\n反算\t497238\nexsl\t497239\n江华瑶族自治县人民政府\t497240\n极米吧_\t497241\n青萍\t497242\n淘宝C店\t497243\n八礼\t497244\n看世\t497245\n张筱雨\t497246\nNewegg\t497247\n中国建筑设计研究院\t497248\n福点\t497249\nhappy卡\t497250\n上本\t497251\n诛仙之路\t497252\nshouye\t497253\n阿拉法特\t497254\n8000万美元\t497255\n十堰市第一中学\t497256\n秦EV450\t497257\n毕节市政府\t497258\n梁缘\t497259\nIDEC\t497260\n糖盒\t497261\n忘_\t497262\n天爵\t497263\n暂存\t497264\n买一送\t497265\n闯\t497266\ninser\t497267\n2015-03-18\t497268\njoycon\t497269\n黄太吉\t497270\n萨利赫\t497271\nD12\t497272\n爬山虎\t497273\n光银\t497274\n圣荷\t497275\n昆明广播电视台\t497276\n发盘\t497277\n回盘\t497278\nTwo\t497279\nNetflix\t497280\n撞击\t497281\n9031\t497282\n健康无忧网\t497283\n动物群\t497284\n锡伯族\t497285\nyves\t497286\n艾涂图\t497287\nweb开发\t497288\n菠萝味\t497289\n春茧体育馆\t497290\n台铃电动车\t497291\n尾奏\t497292\ngambit\t497293\n夜色撩人\t497294\n任正非\t497295\n第十三次\t497296\ncanoscan\t497297\n舒华\t497298\n第三十三条
\t497299\nxvide\t497300\n4.68\t497301\n安吉拉\t497302\n周常\t497303\n101&\t497304\n卡莲\t497305\n色拉\t497306\n优易网\t497307\nyun\t497308\n别跑\t497309\n好医生药业集团\t497310\n水费\t497311\n地带性\t497312\n全国股转系统\t497313\n杜兰大学\t497314\n虎刺\t497315\n南京省\t497316\n科瑞\t497317\n2018至2019年\t497318\n烦透\t497319\n楚霸王\t497320\n大渔\t497321\n儿女英雄传\t497322\nEdgeRouter\t497323\n自梳\t497324\nPalmer\t497325\nprogression\t497326\n卡迪夫大学\t497327\n乐币\t497328\n船师\t497329\n孟洪涛\t497330\nHeaters\t497331\n∈\t497332\n热流道\t497333\n票机\t497334\n谈话\t497335\n恋与偶像\t497336\n三十五\t497337\n咵\t497338\n无可\t497339\n老姐\t497340\n托马斯穆勒\t497341\n中国西北地区\t497342\n签到簿\t497343\n尤格萨隆\t497344\nexcel合并单元格\t497345\nhrhguanli\t497346\n76期\t497347\n附魂\t497348\n河南农大\t497349\n张君秋\t497350\nquhao\t497351\n所罗门王\t497352\n淼\t497353\n公营\t497354\n陕西省纪委\t497355\n餐饮许可证\t497356\n汉京九榕台\t497357\n光纤预制棒\t497358\nwil\t497359\nAmple\t497360\nPrinceton\t497361\ngens\t497362\n美乐乐装修网\t497363\n华东师范大学》\t497364\n面值\t497365\n回风\t497366\n外办\t497367\n上上城五期\t497368\nн\t497369\nbanyan\t497370\nIGA\t497371\n墨云\t497372\n五个小时\t497373\n龙装\t497374\n第214集\t497375\n大润发优\t497376\ncp文\t497377\n犬类\t497378\nBILIBILI\t497379\n佳华\t497380\nxingshi\t497381\n苟周于事\t497382\n老站\t497383\n底子\t497384\nstrix\t497385\n攸县公众信息网\t497386\n咱爸咱妈\t497387\n银泰商业\t497388\nxor\t497389\nih\t497390\n石榴红村\t497391\nlocalho\t497392\nFinland\t497393\niti\t497394\n杂食性\t497395\ncaobi\t497396\n佛身\t497397\n小鹿斑比\t497398\n圆锥\t497399\n电焊条\t497400\n短猫\t497401\n苦心人\t497402\n三英\t497403\nfcs\t497404\n46路\t497405\nA963\t497406\nch340g\t497407\n廊坊市人力资源和社会保障局\t497408\n你和你\t497409\n佩奇乔治\t497410\n决算\t497411\n续志\t497412\n杨奉武\t497413\n爆笑虫子\t497414\nlua\t497415\n狂操\t497416\n单人小游戏大全\t497417\n王村镇\t497418\n12批次\t497419\n泌尿\t497420\n超超临界机组\t497421\n既而\t497422\n江苏小学\t497423\n郑屠\t497424\ndevote\t497425\n华中科技大学计算机科学与技术学院\t497426\n车盘\t497427\n活润\t497428\n青云山\t497429\n友窝\t497430\n一元一次不等式\t497431\n父体\t497432\n银行业务员\t497433\n4月27日\t497434\nWebmail\t497435\n奶汁\t497436\n1855年\t497437\n美乐乐家具网\t497438\n剑灵网通\t497439\n约会大作战第二季\t497440\n京都酒店\t497441
\n湖北省经济和信息化委员会\t497442\n五_\t497443\ntdm\t497444\n仵\t497445\n8.8元\t497446\n石湫\t497447\nmhdd\t497448\n太太团\t497449\n偶数个\t497450\n201年\t497451\n欧达\t497452\nhaunted\t497453\n情人\t497454\n放坡系数\t497455\n贝克褚时健\t497456\nlifestyle\t497457\nCable\t497458\navailability\t497459\n体彩超级大乐透\t497460\neffie\t497461\n排表\t497462\n工标网\t497463\nthreads\t497464\n龙蛇\t497465\n我的母亲\t497466\n路姓\t497467\n20130908\t497468\nhibenate\t497469\n宾阳县\t497470\nExp\t497471\n金夫人婚纱摄影\t497472\n转经轮\t497473\n波黑战争\t497474\n2.31\t497475\n6个\t497476\n枉然\t497477\n李庆丰\t497478\n重瓣\t497479\n新华\t497480\n5054\t497481\n200CN\t497482\n棒画\t497483\n月儿\t497484\n夷为平地\t497485\n郑岩\t497486\n金元证券\t497487\n病日\t497488\n郭圣通\t497489\n盖骑\t497490\nnumpy数组操作学习\t497491\n授薪\t497492\n5月12\t497493\nwuliqiang\t497494\n福昕\t497495\n汉川\t497496\n五千米\t497497\n脸子\t497498\nMetaMask\t497499\n高德地图车机版3.0\t497500\n赫敏\t497501\n靖远县\t497502\n华文出版社\t497503\nBook\t497504\n吹\t497505\n年月日时分秒\t497506\n峨冠\t497507\nradians\t497508\n容默\t497509\n反射炉\t497510\n快客杯\t497511\n乐摄宝\t497512\nマッサ\t497513\n脱水\t497514\n奋安\t497515\n捷安特自行车\t497516\n彭波\t497517\n江苏区\t497518\n玫香\t497519\n七十三\t497520\n第七代\t497521\n金马奖\t497522\n炎八\t497523\n国家教师资格考试网\t497524\n狂鲨\t497525\nEasyBCD\t497526\n王者荣耀破晓\t497527\n概率论与数理统计教程\t497528\nIBMS\t497529\n泰安二中\t497530\nΛ\t497531\npastry\t497532\n尊文\t497533\n1000km\t497534\n单元测验卷\t497535\nDHTMLX\t497536\n层叠式\t497537\n双务合同\t497538\n老梗\t497539\ne460\t497540\n笛谱\t497541\ngcp\t497542\n同盾科技\t497543\ntnc\t497544\n5.4英寸\t497545\n韩颖\t497546\n三国志2霸王的大陆\t497547\n坚持不懈\t497548\n二氧化硅\t497549\n注析\t497550\n地车\t497551\n头重脚轻\t497552\n长江存储科技有限责任公司\t497553\n福利号\t497554\n锡伯\t497555\n信宜玉都\t497556\n礼嘉镇\t497557\n南湖新闻网\t497558\nRiver\t497559\n天穹\t497560\nmatlab源程序\t497561\n潍坊干部网络学院\t497562\n中东大市场\t497563\n健将\t497564\nelc\t497565\n七度空间\t497566\n6470\t497567\n深地\t497568\nFisher\t497569\n我叫\t497570\n保时捷macan\t497571\ngarfieldtom\t497572\n华微\t497573\n铺租\t497574\n道德监\t497575\n无污染\t497576\n聂风\t497577\n悦支付\t497578\n1613\t497579\n模拟人生免费版\t497580\n府青路\t497581\n待转\t497582\n一审\t497583
\n51mm\t497584\n水害\t497585\n昌平一中\t497586\n茶旅\t497587\n26条\t497588\n城东街道\t497589\noutbound\t497590\n法伤\t497591\n三月三\t497592\n正压式\t497593\nscanned\t497594\n千只\t497595\n艾布拉姆斯\t497596\n小仓奈奈\t497597\nRealVNC\t497598\n纪实性\t497599\nUPC码\t497600\n飞飞电影网\t497601\n醋酸酯\t497602\n排面\t497603\n战斗机器人\t497604\n停级\t497605\n起泡剂\t497606\n非延续性\t497607\nSKAM\t497608\n桑螵蛸\t497609\n测厚仪\t497610\n合一\t497611\n铁丝\t497612\n易搜网\t497613\n里约大冒险2\t497614\n工友\t497615\nEasier\t497616\n自愈合\t497617\n野塘\t497618\n厦门市儿童医院\t497619\n棣花古镇\t497620\n迦勒底\t497621\n就使\t497622\n批件\t497623\n为你而战\t497624\n缕衣\t497625\n201_\t497626\nsfs\t497627\n新潟县\t497628\n蓝宝贝\t497629\ndns服务器\t497630\npace\t497631\n慈利\t497632\n浮士德\t497633\nkinky\t497634\n迈点\t497635\nAKI\t497636\ngridfs\t497637\nselfridges\t497638\n样例\t497639\n青衣江\t497640\n批准书\t497641\ncordic\t497642\n稀硝酸反应\t497643\n划译\t497644\n中国国家大剧院\t497645\nikbc吧\t497646\n马场道\t497647\n冯琪\t497648\nsmlz\t497649\nl365\t497650\nseesaw\t497651\n大芯板\t497652\n出岫\t497653\n花茂村\t497654\n骨瓷杯\t497655\n手指南\t497656\n约分\t497657\ngitk\t497658\n张华男\t497659\n沙口路\t497660\n5千克\t497661\nfayin\t497662\n美术片\t497663\n肥姐\t497664\n岭南大学\t497665\n国云\t497666\nPatton\t497667\n受迫\t497668\n铝格栅\t497669\n抓人\t497670\n刘哥模\t497671\n四川农村信用社\t497672\n双筒望远镜\t497673\n小批\t497674\n川大\t497675\n别慌\t497676\nbake\t497677\nchenchen\t497678\nMOV\t497679\n出山店水库\t497680\n残友\t497681\n扬子\t497682\n求真务实\t497683\n模拟飞行网\t497684\n成立业\t497685\n划坏\t497686\n指定\t497687\n世界地质公园\t497688\n九湖镇\t497689\n自制版\t497690\n7句\t497691\n湄江\t497692\n宝地\t497693\n富锦\t497694\n3Q\t497695\n易维\t497696\n塞纳留斯\t497697\n情爱\t497698\n狼爪\t497699\n2008时\t497700\n禁片\t497701\n东京喰种2\t497702\n土培\t497703\n马克龙\t497704\nPremiere2017\t497705\n百度云网盘+迅雷+旋风\t497706\n硬着陆\t497707\n字音\t497708\n紫金奖\t497709\n1166\t497710\n40平方厘米\t497711\n淅沥沥\t497712\nppt2\t497713\n股票质押式回购交易\t497714\n人艺\t497715\nchrdev\t497716\n鹿苑\t497717\n开炉\t497718\nlumia920吧\t497719\n绝地求生封号\t497720\n思语\t497721\n微力量\t497722\n方向\t497723\npvd\t497724\n限制器\t497725\n李浪\t497726\n悔过\t497727\ncompleting\t497728\n老相片\t497729\
n半田君传说\t497730\n综合成本率\t497731\n40秒\t497732\n高职\t497733\n暂住证\t497734\n内蒙古艺术学院\t497735\n李奈\t497736\n板金\t497737\n踢脚舞\t497738\n1.6.7\t497739\n奥拉西坦胶囊\t497740\n汽车服务有限公司\t497741\n苦闷\t497742\n前导\t497743\n母犬\t497744\n来玩\t497745\n连云港市政府\t497746\n中设集团\t497747\nmad吧\t497748\n汲水\t497749\n老托\t497750\n朱建荣\t497751\n药源性\t497752\n村民小组\t497753\n杭州电信\t497754\n法人独资企业\t497755\n注册建造师网\t497756\n巧虎欢乐岛\t497757\n景象\t497758\n四局\t497759\n款号\t497760\n干密度\t497761\nmigu\t497762\nflake8\t497763\n人民政治\t497764\n精雕\t497765\n醋酸根\t497766\n李向前\t497767\n张启山\t497768\n0w20\t497769\n照美\t497770\nCV视觉网\t497771\nAlanLee\t497772\n禅城\t497773\n无数\t497774\n偏房\t497775\n云晴\t497776\n河南消防\t497777\n摸查\t497778\ntether\t497779\n愛沢\t497780\nwolfram\t497781\n浙江少年儿童出版社\t497782\n出于\t497783\n暴流\t497784\nmary\t497785\n长安CX20\t497786\n杂志册\t497787\n百绘罗衣\t497788\n文堂\t497789\n雪天使\t497790\n承前启后\t497791\n在这里\t497792\nambition\t497793\nIIS7_爱站网\t497794\n益阳市\t497795\n佛山市星光楼宇设备有限公司\t497796\n三国志9威力加强版\t497797\n拉比\t497798\n技朮\t497799\n轻快\t497800\n160亿\t497801\n免漆板\t497802\njsb\t497803\n向明中学\t497804\n不封\t497805\n张鸣\t497806\nBecky\t497807\n果博东方\t497808\n乳化泵\t497809\n天水市政府\t497810\n斯巴鲁XV\t497811\n随机本\t497812\n龙卷\t497813\n北京大成律师事务所\t497814\n湖南中医药大学第一附属医院\t497815\niphone7p\t497816\n原有\t497817\nAction层\t497818\n提娜\t497819\n现形\t497820\n班智达\t497821\n很傻很天真\t497822\n南京城市记忆\t497823\n普寿寺\t497824\n广东省人大常委会\t497825\n健安\t497826\n雨帽\t497827\n犬系\t497828\n纤维素\t497829\n文昌路\t497830\n翼状\t497831\n赏石\t497832\nLDPC\t497833\nunsafe\t497834\n挂饰\t497835\n莱顿大学\t497836\nrrs\t497837\n加拿大政府\t497838\n老名\t497839\n扑救\t497840\n审改\t497841\n嘴边\t497842\nRemaster\t497843\n挥霍\t497844\n19亿元\t497845\nCardinal\t497846\n用栈\t497847\n劝导员\t497848\n奇瑞A5\t497849\n13章\t497850\nnomination\t497851\n怀柔政府网\t497852\ndf\t497853\n西丽小学\t497854\n华陆工程科技有限责任公司\t497855\n石器时代OL\t497856\n6.50\t497857\n陈滢\t497858\n人人喊\t497859\n雅倩\t497860\n纳米喷雾补水仪\t497861\n真密度\t497862\n专心致志\t497863\n妇产科医院\t497864\nTom\t497865\n铁杵磨针\t497866\n1.6XV\t497867\n六盲星\t497868\n3希里\t497869\n建建发\t497870\n【峰\t497871\n丁家宜\t497872\n付
款方式\t497873\n上海复星医药(集团)股份有限公司\t497874\n坡度\t497875\n厦门悦华酒店\t497876\n老版三国演义\t497877\n越野版\t497878\n中国石油新闻中心\t497879\n福安药业\t497880\n8182xxxx\t497881\nkin8tengoku\t497882\n开国上将\t497883\n6时\t497884\n闽南话\t497885\n绑子\t497886\n微疯客\t497887\n湛江港\t497888\n孙守刚\t497889\nocs\t497890\n第一枚\t497891\n外贸员\t497892\n蜀道难\t497893\n存储盘\t497894\nloved\t497895\n窑口\t497896\n追风筝\t497897\nESL\t497898\n奢侈税\t497899\n饰面砖\t497900\n良性肿瘤\t497901\nG14\t497902\n祭十二郎文\t497903\n磨面机\t497904\n填值\t497905\n|东商网\t497906\n开料机\t497907\n坎山\t497908\n4002\t497909\n3KK\t497910\n所有线\t497911\n30t\t497912\n英红\t497913\n保卫萝卜3\t497914\nEnhancement\t497915\n中国电子信息博览会\t497916\nJSAPI\t497917\nprofili\t497918\nBaaS\t497919\nagi\t497920\nwmic\t497921\n浑然天成\t497922\n0x800f081f\t497923\n成都富士康\t497924\nwimboot\t497925\n北京论坛_汽车之家论坛\t497926\n鲜为人知\t497927\n历险\t497928\n赌局\t497929\n炭化炉\t497930\n云龙机场\t497931\n九倍\t497932\ntablespace\t497933\n老鼠笼\t497934\n抽沙\t497935\n哈雷论坛\t497936\n公开发行证券\t497937\nZAN\t497938\n640px\t497939\nseeding\t497940\n好了歌\t497941\nonos\t497942\n高图\t497943\n长城简笔画\t497944\ncharacter\t497945\n第6季\t497946\n2018年03月28日\t497947\n东吴证券股份有限公司\t497948\n第8代\t497949\nTAKARA\t497950\n緊縛\t497951\nmoodyz\t497952\n民主生活会谈心谈话\t497953\nCB400\t497954\n处女星号\t497955\n中国平安集团\t497956\n粉底膏\t497957\n6.37\t497958\nBooki\t497959\nitudou\t497960\n断念\t497961\n新型农村合作医疗网\t497962\n照镜\t497963\n徒然\t497964\nreveals\t497965\nebitda\t497966\n腔\t497967\n逆否\t497968\n信息反馈\t497969\n赛车服\t497970\n宽途\t497971\n无以言表\t497972\n闵子骞\t497973\n中石科技\t497974\n王树彤\t497975\nfcw\t497976\n0022\t497977\n5.1号\t497978\n中山人才网\t497979\n广顺\t497980\n1.5l\t497981\n弟弟\t497982\n变形金刚大黄蜂\t497983\n阳光100\t497984\n如有雷同\t497985\nquebec\t497986\n几管\t497987\n校园修神录II\t497988\nR6300\t497989\n双终端\t497990\n六年级作文\t497991\n仔猪\t497992\n日志表\t497993\n555_\t497994\n财通基金\t497995\n打穿\t497996\n张昆\t497997\n电动化\t497998\n酷特\t497999\nB3\t498000\n7722\t498001\n军户\t498002\n黄贤\t498003\n大义灭亲\t498004\n自得其乐\t498005\n王立伟\t498006\n气血循环机\t498007\ntraveling\t498008\n穷追不舍\t498009\n高村乡\t498010\n蜂皇浆\t498011\n武藤写真机\t4980
12\nvlook\t498013\nxShell\t498014\n瑞雯\t498015\n概率分布\t498016\n安卓端\t498017\n审判者\t498018\n玻纤网\t498019\nyuyi\t498020\ntita\t498021\nagilent\t498022\nJST\t498023\n创汇\t498024\n望道\t498025\n分拨\t498026\nectouch\t498027\nSHARP夏普\t498028\n漫威\t498029\n钟嘉欣\t498030\n黑钻\t498031\n其实\t498032\n冷喷\t498033\n赞成\t498034\nBiomedical\t498035\n黑豆\t498036\n2.4万元\t498037\n修补机\t498038\n注册咨询工程师\t498039\njazz\t498040\n滨州市人力资源和社会保障局\t498041\n回场\t498042\n洋相\t498043\n愧色\t498044\n传数\t498045\n冰肌玉骨\t498046\n驱虫\t498047\n杨路\t498048\n香包\t498049\n微会\t498050\n放火\t498051\n794\t498052\n三师\t498053\n大禹节水\t498054\n工商行政管理\t498055\n锦绣兴仁\t498056\n梅西百货\t498057\n明年起\t498058\n格雷特\t498059\nboxster\t498060\n仙兽\t498061\n蓝桉\t498062\n霍姆斯\t498063\n湛江第一中学\t498064\nIDEAS\t498065\n0.9.6\t498066\n国安天下宁\t498067\nResolver\t498068\noringe\t498069\n机芯\t498070\n地堆\t498071\n军威\t498072\n保险金信托\t498073\n太阳之泪\t498074\n语用\t498075\n1.9.8\t498076\n1853年\t498077\n劲爽\t498078\n触控式\t498079\nmatching\t498080\nLayUI\t498081\n周园\t498082\n茶品\t498083\nArcgis\t498084\n一业\t498085\n曝晒\t498086\n名正言顺\t498087\n陆晨\t498088\nxUtils\t498089\n精华区\t498090\nNBA2K16\t498091\naxurerp\t498092\n坎耶·维斯特\t498093\nAR9331\t498094\n浪友dans\t498095\n烧结炉\t498096\nDOP\t498097\n永不变\t498098\n锒铛入狱\t498099\n芯动科技\t498100\nH5版\t498101\nExcel日期函数\t498102\n猎流\t498103\n拉祜族\t498104\n淘宝旺铺专业版\t498105\n闷头\t498106\n司空曙\t498107\npreteen\t498108\n甜心萌妻:总裁宠\t498109\n汽阀\t498110\n喜字\t498111\nVina\t498112\n卡西尼\t498113\n鬼帝\t498114\n浮色\t498115\n雅思敏\t498116\n20141210\t498117\n巢湖市居巢区\t498118\n中国旅游协会\t498119\n移居\t498120\n子库\t498121\nrenovation\t498122\n黑池\t498123\nTOAD\t498124\n翠堤湾\t498125\n美丽的日子\t498126\n洛蒙\t498127\n大班段\t498128\n福音\t498129\n佟湘玉\t498130\n大原樱子\t498131\n赤水洲\t498132\n2017年8月29日\t498133\n浸润性\t498134\n国际市场\t498135\n东北摩\t498136\nT13\t498137\n喝完\t498138\n马来酸左旋氨氯地平片\t498139\n龙化\t498140\n1.10.0\t498141\n临时符\t498142\n邕宁区\t498143\n2018038期\t498144\n念珠菌\t498145\n数组项\t498146\n七位\t498147\n内省\t498148\n溃疡性直肠炎\t498149\nMouse\t498150\n莱奥纳多\t498151\n胃肠道疾病\t498152\n安顺市人民政府\t498153\n临洮\t498154\n曹培玺\t4981
55\nISCSI\t498156\n边控\t498157\nPGS\t498158\n马海祥\t498159\nlieu\t498160\n坟地\t498161\n1.0.11\t498162\n2017年12月4日\t498163\n长城借记卡\t498164\n其人\t498165\nxr\t498166\n炊具\t498167\n温州移动\t498168\ncompare\t498169\n首席医官\t498170\n玉笛\t498171\nOCI\t498172\nsched\t498173\naddresses\t498174\n张发奎\t498175\n非典型\t498176\nWANT\t498177\n朱晓东\t498178\n99秒\t498179\n扩宽\t498180\n红枣姜茶\t498181\n注册消防工程师_消防考试网\t498182\n双庆\t498183\n12则\t498184\n横列\t498185\n气压杆\t498186\n市情\t498187\nactivated\t498188\n微信运动\t498189\n催财\t498190\nfunny\t498191\n北京小米\t498192\n江苏江南农村商业银行股份有限公司\t498193\n查纳税人识别号\t498194\n洛基\t498195\nWhole\t498196\n青岛宾馆\t498197\n后台管理系统\t498198\n雪童子\t498199\nTabLayout\t498200\n中信网上银行\t498201\nScenario\t498202\n牛塘镇\t498203\n上口\t498204\n时头\t498205\n海淀山\t498206\n黄河三峡\t498207\n人教版七年级语文下册\t498208\n好奇心\t498209\n力拔山\t498210\n中小企业家\t498211\n迎春花市\t498212\nQ4\t498213\n技术邻学院\t498214\n标普500\t498215\n公建\t498216\n30千克\t498217\n孙斌\t498218\n新妓生传\t498219\n出师\t498220\nEspaol\t498221\n谢佛\t498222\n质优\t498223\n白丽\t498224\n汴绣\t498225\n卢西安\t498226\n新华为\t498227\n运输处\t498228\n油服\t498229\n民丰路\t498230\n酸碱性\t498231\n72v\t498232\n瑞麟\t498233\n齿比\t498234\n米尔网\t498235\n小米Mix2S\t498236\n夏侯霸\t498237\n银衣\t498238\n香肉\t498239\n星河战队\t498240\n13p\t498241\n易人\t498242\n20140321\t498243\n涉及\t498244\n雷山\t498245\n伊苏8:达娜的安魂曲\t498246\n蓖麻子\t498247\n谷子孕婴网\t498248\npandoc\t498249\n弹簧减震器\t498250\n杨晖\t498251\n上海社保中心\t498252\n河南省社科联\t498253\nnmax155\t498254\nB5\t498255\nSamsung\t498256\n卡德山\t498257\nVip\t498258\n两分钟左右\t498259\nGTX650TI\t498260\n空转\t498261\nRPG攻略\t498262\n13薪\t498263\nuv\t498264\n石阡县\t498265\nPCRE\t498266\n单仰萍\t498267\nT600\t498268\n防锈剂\t498269\n火星文转换器\t498270\n七大圣\t498271\n微电机\t498272\n摆动\t498273\n知产\t498274\nreplicator\t498275\n梅山街道\t498276\nMarydon\t498277\n月湖\t498278\n海贼王全集\t498279\n李志军\t498280\ngetchar\t498281\nPalm\t498282\n电视电话\t498283\n猪耳\t498284\n网单\t498285\n蒂格\t498286\nappV1.0\t498287\n概念机\t498288\ncapcom\t498289\n壁虎\t498290\n义项\t498291\nfm2007\t498292\n志贺氏菌\t498293\n26所\t498294\n茂名\t498295\n抚顺北站\t498296\n9MM\t498297\n永
艺股份\t498298\n义子\t498299\n云南省科技厅\t498300\n梓潼\t498301\n肖木木\t498302\n五则\t498303\n潘敏\t498304\n莞城区\t498305\n艾欧尼亚vs诺克萨斯\t498306\n考霸\t498307\n卖乖\t498308\n皇室\t498309\nGang\t498310\n财务报\t498311\n非人\t498312\nletpub\t498313\n金堂县\t498314\nexcite\t498315\n砍龙\t498316\nSaiku\t498317\nnomial\t498318\n景业\t498319\n厦门房地产联合网新闻中心\t498320\nshell函数\t498321\ntribe\t498322\n诺斯卡\t498323\nbabes\t498324\n普通外科\t498325\n116页\t498326\n卡丁车\t498327\nFITNESS\t498328\n中共党员\t498329\n故土\t498330\n芋头糕\t498331\n第二十九次\t498332\n108拜\t498333\n三氟化氮\t498334\n需求方\t498335\n寶貝\t498336\n操盘手\t498337\nRaw\t498338\nUSB线\t498339\n维吾尔自治区\t498340\n张谦\t498341\n长丝\t498342\n公因数\t498343\nClassFoo\t498344\n上海图书馆\t498345\n连接体\t498346\n乐pad\t498347\n神武天宫\t498348\ni5-8250\t498349\n230000\t498350\nCyto\t498351\n使魔\t498352\njavacc\t498353\n泥金\t498354\n殁\t498355\nF-16\t498356\nCWC\t498357\n20170512\t498358\n骨灰坛\t498359\n广告片\t498360\n无料\t498361\n战阵\t498362\n家常菜-19楼私房菜\t498363\n.00\t498364\n880\t498365\n非亲\t498366\n踏雪寻梅\t498367\nkilo\t498368\n三盛宏业\t498369\n管理员证\t498370\npony\t498371\nAV站\t498372\n巴蜀小学\t498373\n9吨\t498374\n飞背\t498375\n屯里人\t498376\n0557\t498377\njsonify\t498378\n弈凡\t498379\nwww.fx678.com\t498380\n学而思网校\t498381\nCCTV\t498382\n莫如\t498383\n离心泵\t498384\n八道江区\t498385\nmkl\t498386\n镇卫生院\t498387\n苏州政府采购网\t498388\n雅丹魔鬼城\t498389\n小摩\t498390\n随机数列\t498391\n最全\t498392\n比大小\t498393\n西电\t498394\nffffff\t498395\n直呼\t498396\n27军\t498397\nlogger4j\t498398\n王牌贱谍:格林斯比\t498399\n第48页\t498400\n天天家教网\t498401\n吸光\t498402\n强化型\t498403\n顶跃\t498404\n96.1\t498405\n别克gl6\t498406\nparseint\t498407\n金兰策划网_策划网\t498408\n学工处\t498409\n农信\t498410\n個股總覽_台股_鉅亨網\t498411\nv16.0\t498412\n破摔\t498413\n危险源识别\t498414\n糖原\t498415\n八拍\t498416\n负控\t498417\n假面骑士响鬼\t498418\n金控公司\t498419\n护符\t498420\n卢克光\t498421\n消防兵\t498422\n388g\t498423\n段冉\t498424\nDickies\t498425\n底壳\t498426\n层套\t498427\n包装桶\t498428\n28周岁\t498429\n酒器\t498430\n十五个\t498431\n永安里\t498432\n中国组织人事报\t498433\n白话版\t498434\n绵阳高新区\t498435\n冯诺依曼\t498436\n圣墓\t498437\n盲派\t498438\nR15\t498439\n苏州社区\t498440\n牛摩\
t498441\n抵赖\t498442\n捉迷藏\t498443\nh+\t498444\n玉米\t498445\n戶\t498446\n死灰\t498447\n特酿\t498448\n烧砖机\t498449\n促写\t498450\n民政部办公厅\t498451\n大数据中心\t498452\n109期\t498453\nclaude\t498454\nbaidu\t498455\n黙示\t498456\npopo文吧\t498457\n富光\t498458\n工作证\t498459\npmp\t498460\n初相\t498461\n吴雪雯\t498462\n北京四通搬家公司\t498463\n新金融网\t498464\n20161229\t498465\n林路\t498466\n优优兵器谱\t498467\n强化物\t498468\n重庆茶园新区\t498469\n苏打\t498470\nunderlying\t498471\nsteamapi\t498472\n君威论坛_汽车之家论坛\t498473\n南京网\t498474\n深岩之洲\t498475\nCognex\t498476\n沈阳领事馆\t498477\n汇金公司\t498478\n佛香阁\t498479\n15.6\t498480\n一节点\t498481\n塑胶人才网\t498482\nVman\t498483\n负压风机\t498484\n富士摄影吧\t498485\n公卫\t498486\n启迪慧\t498487\n2.8l\t498488\nMegaRAID\t498489\nMediaInfo\t498490\nopnet\t498491\n支付方\t498492\n工具架\t498493\n龙发装饰\t498494\n郑晓博\t498495\npro9\t498496\n淘盘\t498497\n前世今生\t498498\nseleium\t498499\n临泽县\t498500\n透析器\t498501\n公众号营销\t498502\n第109章\t498503\njboss7\t498504\n上海装饰\t498505\n全国学生资助管理中心\t498506\nAircrack-ng\t498507\nEnfocus\t498508\n三国志13pk版\t498509\n撩情\t498510\n门扉\t498511\n油痘\t498512\n乱炮\t498513\n刘金花\t498514\n棋逢\t498515\n车况\t498516\n刘伶\t498517\n智慧谷\t498518\n魏民洲\t498519\n750米\t498520\nag真人视讯\t498521\n机雕\t498522\n茅乡\t498523\nZumba\t498524\n崔新江\t498525\nFireBug\t498526\n张守磊\t498527\n反馈式\t498528\n忻州\t498529\n大国崛起\t498530\n亿游\t498531\n安徽省投资集团控股有限公司\t498532\n谷神\t498533\n预警机\t498534\n自体\t498535\n中国人民银行办公厅\t498536\n红纹石\t498537\n九寨沟县\t498538\n综治\t498539\n武冈\t498540\n高丽霞\t498541\n朱芳雨\t498542\n多闻\t498543\n河南省林业厅\t498544\ndes3\t498545\n浙江省银行业协会\t498546\n福彩投注-福彩网\t498547\n下土\t498548\n西陵峡\t498549\n专项组\t498550\n炎黄子孙\t498551\n清都\t498552\n杨采钰\t498553\n服务器虚拟化\t498554\n帮助\t498555\n莎莉娜\t498556\n林崇德\t498557\n4.7.0\t498558\nThree\t498559\n肥胖版\t498560\nUCC\t498561\nISO_\t498562\nslb\t498563\n苏州市住房和城乡建设局\t498564\n总支\t498565\n苏州大学\t498566\n600157\t498567\n宋石男\t498568\n蒸糕\t498569\n安全版\t498570\n加盟倾品网\t498571\n甘\t498572\n高斯曲线\t498573\nDatasheet\t498574\n捞出\t498575\nJIANG\t498576\n熄灭\t498577\nchildNodes\t498578\nELAN\t498579\ncotx\t498580\n海马323\t498581\n德方\t49858
2\nspotted\t498583\n合金装备\t498584\neyes\t498585\n裤衩\t498586\n茜草\t498587\n大连市第二十四中学\t498588\n股权转让所得个人所得税管理办法\t498589\n购进\t498590\nstripping\t498591\n汇感\t498592\n骑马与砍\t498593\n4ac\t498594\n谭松\t498595\n通辽\t498596\n全志\t498597\n坚牢\t498598\n肥尾\t498599\n0欧\t498600\nnur\t498601\n应用池\t498602\n泰州市政府\t498603\nmsvcp140.dll\t498604\n杨金\t498605\nopentv\t498606\n计批\t498607\n凤凰天使\t498608\n恋爱的人\t498609\n百度网盘批量\t498610\n玉扇\t498611\nVue父组件向子组件\t498612\n铆接机\t498613\ndaemons\t498614\n白金分期卡\t498615\n升降架\t498616\n上海中医药大学附属龙华医院\t498617\n艾益生\t498618\n内存管理\t498619\n光荣之路\t498620\n电热水袋\t498621\n循环冗余检查\t498622\n年金\t498623\n丁文江\t498624\n檐下\t498625\n互联网出版许可证\t498626\n水星MW150US\t498627\ncte\t498628\n宁夏回族自治区地方税务局\t498629\n义堂\t498630\n轩宇\t498631\n弹子石\t498632\n爱情连连看\t498633\nEnsure\t498634\n发指\t498635\nv.1.0\t498636\n伊甸园\t498637\n女子\t498638\n湖人队\t498639\n金蝶云之家\t498640\n隐庐\t498641\n幸福梅林\t498642\n广州火车站\t498643\n奔驰GLC260\t498644\n无卤阻燃剂\t498645\n5DS\t498646\n族源\t498647\n女气功\t498648\n伯顿\t498649\nSeuss\t498650\n世杰\t498651\n第六人\t498652\n肚皮\t498653\n电玩城\t498654\n侧妃\t498655\n淀粉碘化钾\t498656\n东华大学研究生部\t498657\nid值\t498658\nhuke\t498659\n横水镇\t498660\nAnimal\t498661\n220KV\t498662\n天津方特\t498663\n生田\t498664\n8010\t498665\n三代人\t498666\n1035\t498667\n玉汝于成\t498668\n右面\t498669\n眼白\t498670\n广西大学附属中学\t498671\n安岳\t498672\n排异反应\t498673\n挑高\t498674\nrhel\t498675\n8566e\t498676\n紫园\t498677\nWuli\t498678\nCRITICAL\t498679\n在职教育网\t498680\n容留\t498681\n打叉\t498682\n崇安寺街道\t498683\nsolved\t498684\n阿莫\t498685\n黄金霜\t498686\n尼尔机械纪元\t498687\n德文卷毛猫\t498688\n王文革\t498689\n更合镇\t498690\n愿情\t498691\n爱撸\t498692\nUPVC管\t498693\nmilena\t498694\n我不愿\t498695\n木村花\t498696\n茶靡\t498697\nbuf\t498698\n美可卓\t498699\n两制\t498700\n郭炳坚\t498701\nskj\t498702\n沥海镇\t498703\n消化道\t498704\nM1216\t498705\nxing\t498706\n无阻尼\t498707\n腹腔穿刺术\t498708\n芷江机场\t498709\n向阳花\t498710\n3H\t498711\n袁振国\t498712\n坑爹网\t498713\n20140924\t498714\n五十六\t498715\n云管家\t498716\n3季\t498717\n系统集成公司\t498718\n3升\t498719\n发酵酒\t498720\n光交\t498721\n中国农学会\t498722\n72节\t498723\n中华人民共和国高等教育法\t498724\
n项饰\t498725\n无济于事\t498726\n伛\t498727\n邻\t498728\n长金\t498729\n三|\t498730\n舒服\t498731\n兰州一中\t498732\n19.7\t498733\njoined\t498734\n有机相\t498735\nTEM\t498736\nPEEK\t498737\nコス\t498738\n广州地铁14号线\t498739\n深圳图书馆\t498740\n处女\t498741\n联想y50\t498742\n荣里\t498743\n累惨\t498744\n娘鸡\t498745\n兰德酷路泽论坛\t498746\n哪队\t498747\nBd\t498748\n钢格栅板\t498749\n克霉唑乳膏\t498750\n说不定\t498751\n张兆和\t498752\n神阵\t498753\n恒盛豪庭\t498754\n滤波算法\t498755\n烛影\t498756\n0.8cm\t498757\n布兰德\t498758\n压痛\t498759\ndatalist\t498760\n黄浩然\t498761\n区间价\t498762\nMagnetic\t498763\n2014年春节\t498764\n厦门高崎机场\t498765\nCAShapeLayer\t498766\n4.2米\t498767\nDEFCON\t498768\n招商会议\t498769\n90例\t498770\n_证券时报网\t498771\n心惊\t498772\nreceptors\t498773\n碳钢法兰\t498774\n熵权法\t498775\n单人版\t498776\n神域\t498777\n腐生\t498778\n华尔康\t498779\n与世无争\t498780\n控制阀\t498781\nParcel\t498782\n100平\t498783\nmembranes\t498784\n大五人格理论\t498785\n圣名\t498786\n土流\t498787\n满字\t498788\n2014年中\t498789\n活牛\t498790\n佰利联\t498791\n柳河\t498792\n男下式\t498793\n重庆市工商行政管理局\t498794\n苏丹\t498795\nSqlite3\t498796\n工作文\t498797\n木瓦\t498798\n花花公子\t498799\nKJ\t498800\n溧水新闻网\t498801\nembase\t498802\n闪银分期管家\t498803\n旅客量\t498804\n浙江华海药业股份有限公司\t498805\n收回来\t498806\n握爪\t498807\n青纱\t498808\nPornHub\t498809\nu8BF7\\u9009\\u62E9\t498810\n长安保险\t498811\n壬申年\t498812\nmscorlib\t498813\n音视\t498814\n厨影\t498815\n丝绵\t498816\n姜南\t498817\n佛克瑞斯\t498818\nZn\t498819\n经络养生网\t498820\n系列化\t498821\n腓骨\t498822\n爱就爱了\t498823\n涂镀\t498824\n古龙峡\t498825\neRX5\t498826\n纸卡\t498827\n4月30号\t498828\npius\t498829\n电子音\t498830\n蛙女\t498831\ngets函数\t498832\n佳居\t498833\n满屏\t498834\n48节\t498835\n气波\t498836\nguo\t498837\n河北梆子\t498838\n黑龙江省肿瘤医院\t498839\n槽形\t498840\ndac2018\t498841\nmongodb\t498842\n17r4\t498843\n器型\t498844\n第201章\t498845\n硅化\t498846\n电子银行部\t498847\n南城\t498848\n茂名职业技术学院\t498849\n老黄\t498850\nLlc\t498851\nUnion\t498852\n寓言二则\t498853\n角点\t498854\ndude\t498855\n劳里埃大学\t498856\n窦建德\t498857\nNSLog\t498858\n西部数码\t498859\n许维恩\t498860\n忍者神龟\t498861\nhorrible\t498862\n刻瓷\t498863\n熊猫简笔画\t498864\nipvs\t498865\n镇政\t498866\n山语\t498867\
n徐家村\t498868\n文化带\t498869\n大空翼\t498870\n阿肯\t498871\nLaserjet\t498872\n椰奶\t498873\ntegra\t498874\n曾舜晞\t498875\n九阳豆浆机\t498876\n天使帝国\t498877\nboostrap\t498878\n刹那\t498879\n黑暗传说单机rpg\t498880\n李辛\t498881\n大商场\t498882\n主序\t498883\n山腰\t498884\n她家\t498885\n农技员\t498886\n中信泰富特钢集团\t498887\n开航\t498888\n三星i9300\t498889\n反倾销税\t498890\n11月底\t498891\n400块\t498892\n佳达\t498893\nChris\t498894\n速动\t498895\n京味儿\t498896\n文正学院\t498897\n常平火车站\t498898\n贪腐\t498899\n美因茨\t498900\n热拉提\t498901\n学术论文\t498902\n火参果\t498903\nnetperf\t498904\n几任\t498905\nswit\t498906\n15位\t498907\n软件园\t498908\n科学类\t498909\ni5-6200u\t498910\nPHPMailer\t498911\n浙江世宝\t498912\n后顾之忧\t498913\naletta\t498914\n海韵\t498915\n锁孔\t498916\n姜粉\t498917\n好习惯\t498918\n大爷们\t498919\n猎食\t498920\n批露\t498921\nwebserver\t498922\n罪恶之城2\t498923\n蜂蛰\t498924\n获过\t498925\n2018.3.29\t498926\n天津机场\t498927\n第三天\t498928\n苏橙\t498929\n被掳\t498930\nRelational\t498931\n杀人鬼\t498932\nPharrell\t498933\n我他\t498934\n清远市清新区人民政府\t498935\n润联\t498936\n黑曲霉\t498937\n亚丽\t498938\n乱打\t498939\n平贵杰\t498940\n20180405\t498941\n瞳孔\t498942\n250年\t498943\n亲属抚恤金\t498944\np118w\t498945\n1300亿\t498946\n真实故事\t498947\n黑乌鸦\t498948\nDeadline\t498949\n一世之尊\t498950\n工银安盛\t498951\naggs\t498952\nnrf2401\t498953\n2017f1\t498954\n2018年5月1日前\t498955\n863计划\t498956\n工具车\t498957\n飞车党\t498958\n袋式除尘器\t498959\n圣宠\t498960\n点劵\t498961\n美世留学\t498962\n回测\t498963\n朝晖\t498964\n中国残疾人联合会\t498965\n我的小可爱\t498966\n天籁艺术学校\t498967\n镝灯\t498968\n罗比\t498969\nTreehouse\t498970\n归回\t498971\n泰克科技\t498972\n通知期\t498973\n场内\t498974\nCandy\t498975\nshibie\t498976\n仿生机器人\t498977\nFra\t498978\nzidian\t498979\n点头镇\t498980\n第88集\t498981\n武汉地铁5号线\t498982\n2018-02-03\t498983\n十十\t498984\n证类本草\t498985\n编入\t498986\n那段\t498987\n信必可\t498988\n李东方\t498989\n信达证券\t498990\n敢当\t498991\n软件评测师考试\t498992\n海虞\t498993\nAlignment\t498994\n众口铄金\t498995\n强国之路\t498996\n105个\t498997\n水仙\t498998\n吊装\t498999\n新华制药\t499000\nCOSMIC\t499001\n流行花园\t499002\n妾身\t499003\n陶企\t499004\nPenbeat\t499005\n立漂\t499006\n自筹\t499007\n28张\t499008\n伍宇娟\t499009\n大
飞哥\t499010\nOLED屏\t499011\nh7n9\t499012\n逃顶\t499013\n义利观\t499014\nuso\t499015\n凯文·科斯特纳\t499016\n卫生间装\t499017\n双随机\t499018\nAb\t499019\nmofa\t499020\n梦天木门\t499021\n经典咏流传\t499022\n西安城中村\t499023\nENOENT\t499024\n中老铁路\t499025\n交投\t499026\n山东省经信委\t499027\n黄渤\t499028\n天灭地\t499029\nmurmur\t499030\n投资\t499031\n凸轮转子泵\t499032\n中山电台\t499033\n先导性\t499034\n叶子猪梦三国\t499035\n艾玛沃森\t499036\n3D版\t499037\n江龙\t499038\n李世杰\t499039\n道奇酷威\t499040\n提前还款\t499041\nTubing\t499042\n复联3\t499043\n雅虎邮箱\t499044\n夏洛\t499045\n51区未解之谜网\t499046\n机械表\t499047\n莲花峰\t499048\n41所\t499049\n绝对式\t499050\n沈阳机床集团\t499051\n马格纳\t499052\n绢布\t499053\n今年\t499054\n郑陆镇\t499055\nikang\t499056\n剑宗二觉\t499057\n热豆腐\t499058\nagoda\t499059\nuconn\t499060\n列装\t499061\n恋足\t499062\n风云三号\t499063\n惰化\t499064\n苏涛\t499065\n4分\t499066\n全乳\t499067\ngains\t499068\n用户版\t499069\n郦道元\t499070\n禧\t499071\n隷\t499072\n特雷\t499073\n正龙\t499074\nmcbbs\t499075\n海辰\t499076\n免费片\t499077\n国家地质公园\t499078\n晨练\t499079\n凯莉\t499080\n碣石山\t499081\n新凯美瑞\t499082\n46mm\t499083\n永立\t499084\n韩菱纱\t499085\n少寒\t499086\n前端路\t499087\n罢录\t499088\n客滚船\t499089\n作业者\t499090\namsr\t499091\n玛丽艳\t499092\n魔法少女小爱\t499093\n冰袖\t499094\nCAMDS\t499095\n漏装\t499096\n北辰奥园\t499097\n34路\t499098\n伟星管业\t499099\n21三体综合症\t499100\nevplayer\t499101\n云水湾\t499102\n逆风笑\t499103\nlivereload\t499104\n停留期\t499105\n东王\t499106\n梁溪\t499107\n晚上\t499108\n路军\t499109\n摘记\t499110\n淘货\t499111\n胶饵\t499112\n阿桃\t499113\n水脉\t499114\nPound\t499115\nApotheke\t499116\n桥口\t499117\n白荷花\t499118\n片仔癀\t499119\n2017-09-01\t499120\n山西省质量技术监督局\t499121\n邓军\t499122\n一二三部\t499123\n衰减率\t499124\n梅麻吕\t499125\n轴套\t499126\n表情符\t499127\n爱听\t499128\nq9\t499129\nr6400\t499130\nAssistive\t499131\n游府西街小学\t499132\nHearing\t499133\n雅典娜\t499134\n无敌券\t499135\n記憶\t499136\n查泰莱\t499137\nmozart\t499138\n冯巩\t499139\n防火玻璃门\t499140\n孙仲谋\t499141\n赵容弼\t499142\n荣威Ei5\t499143\nwebkit\t499144\n萝卜花\t499145\n恨歌\t499146\n弘学\t499147\n2018cf\t499148\n编制人\t499149\n精雕机\t499150\n萨科齐\t499151\n一个耳\t499152\n蛋白线\t499153\nD800\t499154\nsb\t499155\n杏花酒\t499156\n吴
然\t499157\n秦记\t499158\n发型男\t499159\n宝能科技园\t499160\nNetbeans\t499161\n倒退\t499162\n李路\t499163\n植筋\t499164\n统管\t499165\nchromium浏览器\t499166\n竹山\t499167\n出入金\t499168\n宋组儿\t499169\nBMW互联\t499170\n化学危险品\t499171\n论学\t499172\n空心桨叶干燥机\t499173\n蜀山传奇\t499174\n合金装备5幻痛\t499175\n发祥\t499176\n输液架\t499177\n神堂\t499178\n卤化物\t499179\nKAWAI\t499180\n郭婷婷\t499181\n0561\t499182\n2宗\t499183\n姦染\t499184\n普罗布\t499185\n小鞠\t499186\n北京财贸职业学院\t499187\nsights\t499188\n玻纤板\t499189\n328元\t499190\nJamy\t499191\n云鼎\t499192\n老胡同\t499193\n福昕pdf编辑器\t499194\n万格\t499195\n无internet\t499196\n副校\t499197\n存储型\t499198\n湘云\t499199\n漫游者\t499200\n夜煞\t499201\n七律诗\t499202\n磷脂酰丝氨酸\t499203\n正面描写\t499204\nMidas\t499205\n万岁军\t499206\n猛龙队\t499207\n酒渣\t499208\nCompetition\t499209\n26英寸\t499210\n丑妹\t499211\n迅雷玩客币\t499212\n1v\t499213\ndef\t499214\nCleanup\t499215\n虾青素\t499216\nMATE9\t499217\nsubst\t499218\nshool\t499219\n水库群\t499220\n木兰\t499221\n韦恩斯坦\t499222\n逢集\t499223\n纺织业\t499224\n记录本\t499225\n上海国际招标有限公司\t499226\nE类\t499227\n国际啤酒节\t499228\nsearching\t499229\n熟母\t499230\n广东发展银行\t499231\n四关\t499232\n自导\t499233\nContours\t499234\n二级注册计量师\t499235\n终面\t499236\n陪餐\t499237\n上海和平饭店\t499238\nav12\t499239\nabbrv\t499240\n360p1\t499241\n出资额\t499242\n神权\t499243\nk43\t499244\n第三首\t499245\n组合梁\t499246\nwinsock2\t499247\n舒适感\t499248\n工业管\t499249\n黄欣\t499250\nHomestay\t499251\n抬脚\t499252\n云上小悟\t499253\n1:24\t499254\nchanger\t499255\n万妖之城\t499256\n硕士学制\t499257\n随机效应\t499258\nhapjin\t499259\nSKB\t499260\ninfor\t499261\n这个词\t499262\n河北经济网\t499263\n第六十条\t499264\n格罗兹尼\t499265\n刘紫玲\t499266\n时尚先锋\t499267\n山丹丹开花红艳艳\t499268\nslaves\t499269\n巴伐利亚庄园\t499270\n大阴唇\t499271\n749\t499272\n2080\t499273\n李代\t499274\n凡人流\t499275\n茎\t499276\np卡\t499277\n申万宏源证券承销保荐有限责任公司\t499278\n2018-04-19\t499279\n凉瓜\t499280\n小标语\t499281\n500件\t499282\n地类\t499283\nblog-51CTO\t499284\n达晨创投\t499285\nWargaming\t499286\nzentao\t499287\nbool型\t499288\nhexin\t499289\n标准摩尔\t499290\n下文\t499291\n蒙特卡洛\t499292\n点对点\t499293\n高斯白噪声\t499294\nhanguo\t499295\nmake_pair\t499296\n澎\t499297
\n洛宁县\t499298\n辅差\t499299\n克里斯蒂安·贝尔\t499300\n南园小区\t499301\n玉堂春\t499302\n可定口服\t499303\n11倍\t499304\n张涤\t499305\nMuch\t499306\n消化科\t499307\n49种\t499308\nquasar\t499309\n硫酸氢钠\t499310\nvop\t499311\n赛璐珞\t499312\n胸怀\t499313\n店铺版\t499314\n化工地\t499315\n权变理论\t499316\n教场\t499317\n2019年3月\t499318\n昭觉寺\t499319\n甚好\t499320\n忌日快乐\t499321\n劳动者们\t499322\nSpice\t499323\norchid\t499324\n210&\t499325\n香港马会\t499326\n善用\t499327\n第66号\t499328\n复垦券\t499329\n同春\t499330\nhexa\t499331\n皮革\t499332\nPVC板\t499333\nsame\t499334\n宠物饲养\t499335\n亲和性\t499336\n南通移动\t499337\ninteract\t499338\n开元广场\t499339\n地藏缘主论坛\t499340\n追兵\t499341\n五福影院\t499342\n美生子\t499343\n铁拳3\t499344\n劳务分包\t499345\n仇人\t499346\n写歌\t499347\n明林月下美人来\t499348\ncad08\t499349\n1100lu\t499350\n三田\t499351\n万科御龙山\t499352\n媚日\t499353\n菊科\t499354\n蒋\t499355\n28磅\t499356\n纠偏\t499357\n湖南省住建厅\t499358\n朱某\t499359\nswfobject\t499360\n隐水洞\t499361\nsimcom\t499362\n右江区\t499363\nandi\t499364\n肥仔\t499365\nesx\t499366\nDANDY\t499367\n航天纪念币\t499368\n6.2寸\t499369\nReadFile\t499370\n微创手术\t499371\n佛山市区\t499372\n极米无屏电视\t499373\nSEVIS\t499374\n西兰花\t499375\ntreatments\t499376\n安保\t499377\nABLETON\t499378\n精灵石\t499379\n听风者\t499380\n京瓷300i\t499381\n撤退性出血\t499382\n果冻样\t499383\n数钱\t499384\nff7\t499385\n查理兹\t499386\n过盈\t499387\n清华大学出版社\t499388\n隐隐\t499389\n浅析_参考网\t499390\n隐匿性\t499391\n中国电子技术标准化研究院\t499392\n魏少军\t499393\n贝澳\t499394\n阁主\t499395\n七情\t499396\nhtml页\t499397\n电视信号\t499398\n被迫害妄想症\t499399\n红枣枸杞水\t499400\nBikini\t499401\n鳐鱼\t499402\nMissevan_动漫新闻_M站\t499403\n吴夏荣\t499404\n哈工程\t499405\nCruises\t499406\n广东省人民代表大会常务委员会\t499407\nTEXAS\t499408\n华北电力大学研究生院\t499409\n爆花\t499410\n临门一脚\t499411\n第四笔\t499412\n临江门\t499413\n阅读器\t499414\n石钟山\t499415\n九旬\t499416\n南丹\t499417\n逆袭之星途璀璨\t499418\n直流变压器\t499419\n经分\t499420\n说说话\t499421\n真花\t499422\nbbl\t499423\nRXD\t499424\n武汉晚报\t499425\nrave\t499426\n阔\t499427\n暖男\t499428\n双十二\t499429\n投中集团\t499430\n柳南\t499431\n疟疾日\t499432\nSAW\t499433\n地铁:归来\t499434\n清江锦城\t499435\n谷芽\t499436\nJaCoCo\t499437\nSD卡槽\t499438\n进出水\t499439\n里普特拉西尔\t
499440\n甲基丙烯酸酯\t499441\n城镇体系规划\t499442\n横纹肌\t499443\n关于费尔巴哈的提纲\t499444\n天一房产网\t499445\n统考卷\t499446\n梅峰\t499447\n丈\t499448\n研院\t499449\n评校\t499450\noos\t499451\n唐山市路北区\t499452\n肆客\t499453\n克兰菲尔德大学\t499454\n掀裙\t499455\n抱憾\t499456\n聚气\t499457\n袁花镇\t499458\n誓要\t499459\n培育\t499460\n炎炎\t499461\n西乡塘区人民政府\t499462\n沙尘\t499463\n非80端口\t499464\nhaozip\t499465\n福州小区\t499466\n居民身份证\t499467\n秦氏\t499468\n畅快\t499469\n蒋兴权\t499470\n圆肩\t499471\n上海同济大学\t499472\n大明望族\t499473\n核验\t499474\n45亿\t499475\nconfigurer\t499476\n万濠\t499477\n拉普拉斯变换\t499478\npccad2010\t499479\nvlx\t499480\nFlask\t499481\n德宏州纪委州监察局\t499482\n转接线\t499483\n出版名\t499484\n8H\t499485\n战斗暴龙兽\t499486\n锦龙股份\t499487\n8011\t499488\n京津冀城市群\t499489\n南湖国际社区\t499490\n王文辉\t499491\n青青岛社区\t499492\n胥江\t499493\n移轴镜头\t499494\nmmx\t499495\n中窗\t499496\n李成宇\t499497\n丙辰日\t499498\nnetease\t499499\nf5\t499500\n精巧\t499501\n红外发射管\t499502\ncontextual\t499503\narm-linux\t499504\n引孔\t499505\n王力\t499506\ncoq\t499507\n陈叔宝\t499508\n十五件\t499509\nrichTextBox\t499510\n人教版小学三年级语文下册\t499511\n双薪\t499512\n英模\t499513\n夜色镇\t499514\n谭铁牛\t499515\n回廊\t499516\ncue\t499517\n对歌\t499518\n五月初五\t499519\nVTC\t499520\n僵尸国度\t499521\n砖墙\t499522\nOmniplan\t499523\n手黑\t499524\n长期性\t499525\n700GB\t499526\n大雁归来\t499527\n20世纪90年代以来\t499528\n西三环北路\t499529\n南宁市住房保障和房产管理局\t499530\n原装机\t499531\n高电压技术\t499532\n顶天立地\t499533\nsickness\t499534\n趣题\t499535\nMagnolia\t499536\nGerman\t499537\nPyspider\t499538\nP67\t499539\n吴川斌\t499540\n防窥屏\t499541\n六成\t499542\n三义\t499543\n小米手机通讯录\t499544\n王蕊\t499545\nS50\t499546\n矿粉\t499547\n美差\t499548\n李美玲\t499549\nSubject\t499550\n箬横\t499551\nrsv\t499552\nUCCA\t499553\n初四\t499554\n不定代词\t499555\n305mm\t499556\n谋害\t499557\n穿越三国\t499558\n闵行实验小学\t499559\n呼吁\t499560\nx9000f\t499561\n凯利指数\t499562\n失事\t499563\n1.19\t499564\njuliaann\t499565\nWin764位\t499566\n防潮剂\t499567\n呼和浩特\t499568\n橡胶类\t499569\n500部\t499570\n艳红\t499571\n归梦\t499572\nMiss排位\t499573\n盖顶\t499574\nBedding\t499575\n110\t499576\n180%\t499577\n西格玛\t499578\n颜料画\t499579\n406路\t499580\niet\t499581\n合场
\t499582\n保障局\t499583\n沂蒙山\t499584\n锅炉证\t499585\n惊闻\t499586\n开服\t499587\nE450C\t499588\nProvincial\t499589\nux0\t499590\nCodec\t499591\nkokua\t499592\n泸西\t499593\nstruts1\t499594\n虚词\t499595\n秧苗\t499596\n空白值\t499597\n个税计算器\t499598\n婕\t499599\nFairy\t499600\nspeedtest\t499601\n优女\t499602\n766\t499603\n制度经济学\t499604\n宣传版\t499605\n200kb\t499606\n豸\t499607\n封装机\t499608\n26支\t499609\n长此以往\t499610\nverycd\t499611\n160612\t499612\n学狗叫\t499613\n事长\t499614\n相交处\t499615\n20151127\t499616\nMadewell\t499617\ncenter\t499618\n天河东圃\t499619\n丰镐\t499620\n梅奔\t499621\n活动场\t499622\n狂蟒之灾2\t499623\n习呆呆\t499624\n王小虎\t499625\n3倍\t499626\n瑞园\t499627\n出线柜\t499628\n一门式\t499629\n船屋\t499630\n5km\t499631\n允西\t499632\n电磁继电器\t499633\n机械制造及其自动化\t499634\nxysp\t499635\n盛氏\t499636\n十一大\t499637\n贾惜春\t499638\n鹏辉\t499639\n章氏\t499640\n裘德洛\t499641\n横轴\t499642\n上虞政府网\t499643\n中国社会工作联合会\t499644\n桂林市区\t499645\n裂隙灯\t499646\n警戒\t499647\nuu加速器\t499648\npeace\t499649\n张剑\t499650\n马列主义\t499651\n送终\t499652\n265G网页游戏\t499653\n王祖蓝\t499654\ncombo\t499655\n周7天10天15天\t499656\n北京农商银行\t499657\n昆百大\t499658\nOTHER\t499659\n中国医药集团总公司\t499660\n绑带\t499661\n信达天御\t499662\n洗脳\t499663\n若槻\t499664\n雨燕/速翼特论坛_汽车之家论坛\t499665\n考论\t499666\n梁亮\t499667\n酒干倘卖\t499668\n0926\t499669\n金针菇\t499670\n孟庄镇\t499671\n主角色\t499672\n徐成\t499673\n大吾\t499674\n飘出\t499675\nYR\t499676\n北延段\t499677\nGlucosamine\t499678\n针式打印纸\t499679\n雾炮机\t499680\n青岛消防\t499681\n7070\t499682\n造假\t499683\n成人片\t499684\n园丁\t499685\n国家监委\t499686\nZnS\t499687\n天麟\t499688\n汇众\t499689\n土塘\t499690\n211\t499691\n西丽街道\t499692\nbtih\t499693\n上古卷轴5steam\t499694\n3DVD\t499695\nicmp\t499696\n慢性支气管炎\t499697\nclearcase\t499698\n晋江论坛\t499699\n解剖学\t499700\n大盛魁\t499701\n通用串行总线\t499702\nFITC\t499703\n太阳能组件\t499704\n买楼\t499705\n祖父\t499706\n四天王寺\t499707\nRenamer\t499708\ndx11\t499709\n赠花卿\t499710\n校笺\t499711\n留痕\t499712\n婴幼园\t499713\n华福证券\t499714\n宿将\t499715\nMTD\t499716\n小悦\t499717\n凯登·克罗斯\t499718\n追夫\t499719\n回坊\t499720\n种花\t499721\n绿宝\t499722\n艾斯特\t499723\n裸模\t499724\n中小学教师职业道德规范\t499725\n超猛\t49
9726\nAddiction\t499727\n过路\t499728\n侏罗纪公园1\t499729\n胡一鸣\t499730\n200多亿\t499731\n混合物\t499732\n大笑\t499733\n18号\t499734\nNote-Flyme\t499735\n安第斯山\t499736\n西南区\t499737\npisa\t499738\nFoods\t499739\n夏河县\t499740\nWIC\t499741\n合汇\t499742\nCakewalk\t499743\n拍掌\t499744\n占空比\t499745\n纸牌屋第二季\t499746\n贾斯珀国家公园\t499747\n7788\t499748\n夏威夷大学\t499749\n文物保护法\t499750\n苏价服\t499751\n汉滨区人民政府\t499752\n梅干菜扣肉饼\t499753\n深圳前海公司\t499754\nvmix\t499755\n硬编码\t499756\n研学\t499757\n世界之窗\t499758\n紫金政府网\t499759\n繁华\t499760\n直播类\t499761\n自食其力\t499762\n10段\t499763\n淘实惠\t499764\n老赛\t499765\nmycat\t499766\n震古烁今\t499767\n海淀教委\t499768\n签收\t499769\n桂庙\t499770\n吉首大学\t499771\n草鞋\t499772\n搬移\t499773\nsciter\t499774\n珍珠鸡\t499775\n研究生招生信息网\t499776\n沼虾\t499777\n甘人社通\t499778\n金融服务公司\t499779\n道顿崛\t499780\n证券投资者\t499781\n海淀妇幼保健院\t499782\n美食天下\t499783\n河滨公园\t499784\n急刹\t499785\n东风汽车股份有限公司\t499786\n火车站\t499787\n74家\t499788\n惠新里\t499789\n严俊\t499790\nHyperV\t499791\n宝王\t499792\nhospitals\t499793\n金澳\t499794\nSHINE\t499795\nader\t499796\n我为卿狂\t499797\n摩\t499798\n大利空\t499799\n韩世雅\t499800\n人文版\t499801\n166\t499802\n9.2.2\t499803\n九条街\t499804\n和珅\t499805\n五水硫酸铜\t499806\nBEC考试\t499807\n新页\t499808\n底值\t499809\n珠海斗门\t499810\n玛丽娜·阿布拉莫维奇\t499811\n未接通\t499812\n惊梦\t499813\n松坂庆子\t499814\nOYO\t499815\nxba\t499816\n代理加盟\t499817\n分類\t499818\njones\t499819\n稷下学宫\t499820\n黄湖镇\t499821\n万豪酒店集团\t499822\n耐冲击\t499823\n16S\t499824\n健客网\t499825\n创立者\t499826\n90ml\t499827\nhemp\t499828\n一极\t499829\n忘战\t499830\ncomponents\t499831\n悍匪\t499832\nfpwin\t499833\ncellspacing=\t499834\n瑞普\t499835\n红枣桂圆\t499836\n中国海警局\t499837\n9月末\t499838\n俊发\t499839\n中华人民共和国国家邮政局\t499840\n中珠\t499841\n冷冰\t499842\n思瑞\t499843\nangula\t499844\n光学胶\t499845\nzjs\t499846\nfrf\t499847\n低度\t499848\n盾憩岛\t499849\n车继铃\t499850\n半月谈网\t499851\n繁昌县人民政府\t499852\n从来\t499853\n贝塞尔函数\t499854\n村容村貌\t499855\n第三期\t499856\n微电极\t499857\nAtitit\t499858\n叶适\t499859\n蕾丝花边\t499860\n隔油池\t499861\ncomtrade\t499862\n青桔\t499863\nvison\t499864\n横模\t499865\nPOLITICO\t499866\n1379\t499867\n景岗山\t499868\n克洛泽
\t499869\nleg\t499870\n专宠\t499871\n楼王\t499872\n4.6\t499873\n治未病\t499874\n80个\t499875\n唾\t499876\n房屋顶\t499877\n舰队少女\t499878\n鬼母\t499879\n万安县人民政府\t499880\nhongkong\t499881\n风化岛\t499882\n拼好\t499883\n波纹补偿器\t499884\n安全性能\t499885\n黑桶\t499886\n集资房\t499887\nSSO\t499888\n踢踏舞\t499889\n南京人\t499890\n两冠\t499891\n今朝有酒今朝醉\t499892\nGOA\t499893\n彭清\t499894\n丹尼尔斯\t499895\nimport\t499896\n血与酒\t499897\n战争机器\t499898\n吧友\t499899\nc7pro\t499900\n小米mix2s\t499901\n帝释天\t499902\n一堵\t499903\n4}\t499904\n校裤\t499905\n沙堡\t499906\n职业农民\t499907\n单群\t499908\n检验所\t499909\n国学班\t499910\n装机模拟器\t499911\n沸腾虾\t499912\n乡镇医院\t499913\n女真\t499914\ncentrifugal\t499915\n家政服务业\t499916\n英雄联盟吧\t499917\n云舞\t499918\n扫墓\t499919\n果导片\t499920\n李焕英\t499921\n落入\t499922\nlx10\t499923\n缅甸政府\t499924\n北塔区\t499925\n很残酷\t499926\ndiffie\t499927\n墨脱\t499928\n君豪\t499929\n敦煌石窟\t499930\n525心理网\t499931\n迪思\t499932\n木器\t499933\n黄巾军\t499934\n奥特曼传奇英雄\t499935\n热风CG工作室\t499936\n消耗率\t499937\n多年\t499938\n白蛇前传\t499939\n潮州市区\t499940\n不生不灭\t499941\n二单\t499942\n18支\t499943\n高匿\t499944\n吉泽明\t499945\nBleu\t499946\n冰雪王\t499947\nmereke\t499948\n仙居杨梅\t499949\n凝胶法\t499950\n学优网\t499951\n倦鸟\t499952\n僧伽吒\t499953\n塔云山\t499954\n李艳\t499955\n坦途\t499956\njohnny\t499957\nSpectroscopy\t499958\n无双大蛇\t499959\n联络\t499960\n归定\t499961\n中国公安大学\t499962\n天惠\t499963\n贷\t499964\n纪\t499965\n离职\t499966\n慢粒白血病\t499967\n腔室\t499968\n杨健\t499969\n名角\t499970\n神华招标网\t499971\n前途汽车\t499972\nstrtotime\t499973\n上海国际航运中心\t499974\n高尔夫gti\t499975\n南阳市政府\t499976\n远程库\t499977\ng-shock吧\t499978\n中国安防网\t499979\n播种网\t499980\n吊射\t499981\n考试前\t499982\n劳斯莱斯银魅\t499983\nc40\t499984\n迷途知返\t499985\n参考品\t499986\ntong\t499987\n640m\t499988\n冰环\t499989\n信局\t499990\n广州经济技术开发区\t499991\n牛贝\t499992\n876\t499993\n0225\t499994\n龙虾城\t499995\n万达中心\t499996\n山中人\t499997\nSIP\t499998\n饶\t499999\n原创附分解_广场舞\t500000\n[PAD]\t500001\n"
  },
  {
    "path": "modules/text/language_model/simnet_bow/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport json\nimport math\nimport os\n\nimport numpy as np\nimport six\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\nfrom simnet_bow.processor import load_vocab\nfrom simnet_bow.processor import postprocess\nfrom simnet_bow.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.common.paddle_helper import get_variable_info\nfrom paddlehub.common.utils import sys_stdin_encoding\nfrom paddlehub.io.parser import txt_parser\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\nclass DataFormatError(Exception):\n\n    def __init__(self, *args):\n        self.args = args\n\n\n@moduleinfo(name=\"simnet_bow\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source similarity network model based on bow_pairwise.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SimnetBow(hub.Module):\n\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n        self.param_file = os.path.join(self.directory, \"assets\", \"params.txt\")\n        self._word_seg_module = None\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return 
self._word_seg_module\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        cpu_config = Config(self.pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        cpu_config.switch_ir_optim(False)\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(self.pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def _texts_process(self, texts):\n        lod = [0]\n        data = []\n        for i, text in enumerate(texts):\n            data += text['processed']\n            lod.append(len(text['processed']) + lod[i])\n        return np.array(data).astype('int64'), [lod], [lod[-1], 1]\n\n    def to_unicode(self, texts):\n        \"\"\"\n        Convert each element's type(str) of texts(list) to unicode in python2.7\n        Args:\n             texts(list): each element's type is str in python2.7\n        Returns:\n             texts(list): each element's type is unicode in python2.7\n        \"\"\"\n\n        if six.PY2:\n            unicode_texts = []\n            for text in texts:\n                if isinstance(text, six.string_types):\n                    unicode_texts.append(text.decode(sys_stdin_encoding()).decode(\"utf8\"))\n                else:\n                    unicode_texts.append(text)\n            texts = unicode_texts\n        return texts\n\n    def check_data(self, texts=[], data={}):\n        \"\"\"\n        check input data\n        Args:\n             texts(list): the input texts to be predicted which the first element is text_1(list)\n                         
 and the second element is text_2(list), such as [['这道题很难'], ['这道题不简单']]\n                          used when texts is provided instead of data.\n             data(dict): key must be 'text_1' and 'text_2', value is the texts(list) to be predicted\n        Returns:\n             results(dict): predicted data\n        \"\"\"\n        predicted_data = {'text_1': [], 'text_2': []}\n        if texts != [] and isinstance(texts, list) and len(texts) == 2 and (len(texts[0]) == len(\n                texts[1])) and texts[0] and texts[1] and data == {}:\n\n            predicted_data['text_1'] = texts[0]\n            predicted_data['text_2'] = texts[1]\n\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text_1', None), list) and isinstance(\n                data.get('text_2', None), list) and (len(data['text_1']) == len(\n                    data['text_2'])) and data['text_1'] and data['text_2']:\n\n            predicted_data = data\n\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        return predicted_data\n\n    @serving\n    def similarity(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the similarity prediction results with the texts as input\n        Args:\n             texts(list): the input texts to be predicted, where the first element is text_1(list)\n                          and the second element is text_2(list), such as [['这道题很难'], ['这道题不简单']]\n                          used when texts is provided instead of data.\n             data(dict): key must be 'text_1' and 'text_2', value is the texts(list) to be predicted\n             use_gpu(bool): whether to use GPU for prediction\n             batch_size(int): the number of samples handled in one batch\n        Returns:\n             results(list): the similarity prediction results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            
except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        data = self.check_data(texts, data)\n\n        start_idx = 0\n        iteration = int(math.ceil(len(data['text_1']) / batch_size))\n        results = []\n        for _ in range(iteration):\n            # Slicing truncates at the end of the list, so the final partial batch needs no special case.\n            batch_data = {\n                'text_1': data['text_1'][start_idx:(start_idx + batch_size)],\n                'text_2': data['text_2'][start_idx:(start_idx + batch_size)]\n            }\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, self.vocab, batch_data, use_gpu, batch_size)\n\n            data_1, lod_1, shape_1 = self._texts_process(processed_results[\"text_1\"])\n            data_2, lod_2, shape_2 = self._texts_process(processed_results[\"text_2\"])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(data_1)\n            input_handle.set_lod(lod_1)\n            input_handle.reshape(shape_1)\n\n            input_handle = predictor.get_input_handle(input_names[1])\n            input_handle.copy_from_cpu(data_2)\n            input_handle.set_lod(lod_2)\n            input_handle.reshape(shape_2)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[1])\n            batch_out = 
output_handle.copy_to_cpu()\n\n            batch_result = postprocess(batch_out, processed_results)\n            results += batch_result\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the simnet_bow module.\",\n                                              prog='hub run simnet_bow',\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.similarity(data=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether to use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_file', type=str, 
default=None, help=\"file containing the input data\")\n        self.arg_input_group.add_argument('--text_1', type=str, default=None, help=\"first text to compare\")\n        self.arg_input_group.add_argument('--text_2', type=str, default=None, help=\"second text to compare\")\n\n    def check_input_data(self, args):\n        input_data = {}\n        if args.input_file:\n            if not os.path.exists(args.input_file):\n                print(\"File %s does not exist.\" % args.input_file)\n                raise RuntimeError\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        elif args.text_1 and args.text_2:\n            if args.text_1.strip() != '' and args.text_2.strip() != '':\n                if six.PY2:\n                    input_data = {\n                        \"text_1\": [args.text_1.strip().decode(sys_stdin_encoding())],\n                        \"text_2\": [args.text_2.strip().decode(sys_stdin_encoding())]\n                    }\n                else:\n                    input_data = {\"text_1\": [args.text_1], \"text_2\": [args.text_2]}\n            else:\n                print(\"ERROR: The input data is inconsistent with expectations.\")\n\n        if input_data == {}:\n            print(\"ERROR: The input data is inconsistent with expectations.\")\n            raise DataFormatError\n\n        return input_data\n\n    def get_vocab_path(self):\n        \"\"\"\n        Get the path to the vocabulary which was used for pretraining\n        Returns:\n             self.vocab_path(str): the path to vocabulary\n        \"\"\"\n        return self.vocab_path\n\n\nif __name__ == \"__main__\":\n\n    simnet_bow = SimnetBow()\n    inputs, outputs, program = simnet_bow.context(num_slots=3)\n    print(inputs)\n    print(outputs)\n\n    # Data to be predicted\n    test_text_1 = [\"这道题太难了\", \"这道题太难了\", \"这道题太难了\"]\n    test_text_2 = [\"这道题是上一年的考题\", \"这道题不简单\", \"这道题很有意思\"]\n\n    inputs = 
{\"text_1\": test_text_1, \"text_2\": test_text_2}\n    results = simnet_bow.similarity(data=inputs, batch_size=2)\n    print(results)\n    max_score = -1\n    result_text = \"\"\n    for result in results:\n        if result['similarity'] > max_score:\n            max_score = result['similarity']\n            result_text = result['text_2']\n\n    print(\"The text most similar to %s is %s\" % (test_text_1[0], result_text))\n
  },
  {
    "path": "modules/text/language_model/simnet_bow/processor.py",
"content": "# -*- coding: utf-8 -*-\nimport io\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for line in f:\n            line = line.rstrip()\n            parts = line.split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ntext_a_key = \"text_1\"\ntext_b_key = \"text_2\"\n\n\ndef preprocess(lac, word_dict, data_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    Convert the word str to word id and pad the text\n    \"\"\"\n    result = {text_a_key: [], text_b_key: []}\n    processed_a = lac.lexical_analysis(data={'text': data_dict[text_a_key]}, use_gpu=use_gpu, batch_size=batch_size)\n    processed_b = lac.lexical_analysis(data={'text': data_dict[text_b_key]}, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict['<unk>']\n    for index, (text_a, text_b) in enumerate(zip(processed_a, processed_b)):\n        result_i = {'processed': []}\n        result_i['origin'] = data_dict[text_a_key][index]\n        for word in text_a['word']:\n            _index = word_dict.get(word, unk_id)\n            result_i['processed'].append(_index)\n        result[text_a_key].append(result_i)\n\n        result_i = {'processed': []}\n        result_i['origin'] = data_dict[text_b_key][index]\n        for word in text_b['word']:\n            _index = word_dict.get(word, unk_id)\n            result_i['processed'].append(_index)\n        result[text_b_key].append(result_i)\n    return result\n\n\ndef postprocess(pred, data_info):\n    \"\"\"\n    Convert the model's output tensor to similarity results\n    \"\"\"\n    result = []\n    for index in range(len(data_info[text_a_key])):\n        result_i = {}\n        result_i[text_a_key] = data_info[text_a_key][index]['origin']\n        result_i[text_b_key] = data_info[text_b_key][index]['origin']\n        result_i['similarity'] = float('%.4f' % 
pred[index][0])\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/language_model/slda_news/README.md",
"content": "## Model Overview\n\nA topic model is a statistical model that clusters the latent semantic structure of documents in an unsupervised manner, and SLDA (Sentence-LDA) is one kind of topic model. SLDA is an extension of the LDA topic model: LDA assumes that each word corresponds to a topic, while SLDA assumes that each sentence corresponds to a topic. This Module is based on a news-domain dataset built by Baidu.\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/slda.png\" hspace='10'/> <br />\n</p>\n\nFor more details, please refer to the [SLDA paper](https://pdfs.semanticscholar.org/c311/778adb9484c86250e915aecd9714f4206050.pdf).\n\nNote: This Module is contributed by the third-party developer DesmonDay.\n\n## SLDA Model API\n\n### infer_doc_topic_distribution(document)\n\nInfers the topic distribution of a document.\n\n**Parameters**\n\n- document(str): the input document.\n\n**Returns**\n\n- results(list): the topic IDs and corresponding probabilities in the topic distribution. Each element of the list is a dict whose key is a topic ID and whose value is the probability of that topic.\n\n### show_topic_keywords(topic_id, k=10)\n\nShows the keywords under a given topic; it can be used together with the topic-distribution inference API.\n\n**Parameters**\n\n- topic_id(int): the topic ID.\n- k(int): the number of top keywords to return for the topic.\n\n**Returns**\n\n- results(dict): the top k keywords of the topic, with the probability of each keyword.\n\n### Code Example\n\nAn example of how to use the API:\n\n``` python\nimport paddlehub as hub\n\nslda_news = hub.Module(name=\"slda_news\")\n\ntopic_dist = slda_news.infer_doc_topic_distribution(\"百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。\")\n# {378: 0.5, 804: 0.5}\n\nkeywords = slda_news.show_topic_keywords(topic_id=804, k=10)\n# {'百度': 0.08269021676897842,\n# '搜索': 0.04154762385123992,\n# '推广': 0.026193527138926424,\n# '贴吧': 0.02125616298078334,\n# '排名': 0.019595252609963018,\n# '关键词': 0.015173719446828477,\n# '广告': 0.013552941381750894,\n# '搜索引擎': 0.010038529194616577,\n# '公司': 0.009388342219512786,\n# '网站': 0.009173721627932065}\n\n```\n\n## Source Code\nhttps://github.com/baidu/Familia\n\n\n## Dependencies\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n## Release Note\n\n* 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/text/language_model/slda_news/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/slda_news/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/slda_news/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of the document in sparse format.\n           By default, it is sorted by topic probability in descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n            # Sort in place by topic probability, in descending order.\n            topic_dist.sort(key=lambda topic: topic.prob, reverse=True)\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/slda_news/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.config import ModelConfig\nfrom slda_news.util import load_prototxt, fix_random_seed, rand_k\nfrom slda_news.model import TopicModel\nfrom slda_news.sampler import GibbsSampler, MHSampler\nfrom slda_news.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_news.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            
for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/slda_news/model.py",
"content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # One topic-count dict per word.\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic in 
the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/slda_news/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.inference_engine import InferenceEngine\nfrom slda_news.document import SLDADoc\nfrom slda_news.semantic_matching import SemanticMatching, WordAndDis\nfrom slda_news.tokenizer import LACTokenizer, SimpleTokenizer\nfrom slda_news.config import ModelType\nfrom slda_news.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"slda_news\",\n    version=\"1.0.0\",\n    summary=\n    \"This is a PaddleHub Module for SLDA topic model in news dataset, where we can infer the topic distribution of document.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'news')\n        self.conf_file = 'slda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish Initialization.\")\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            document(str): the input 
document text.\n\n        Returns:\n            results(list): returns the topic distribution of document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if tokens == []:\n            return []\n        results = []\n        sentences = []\n        sent = []\n        # Group tokens into pseudo-sentences of 5 tokens each for SLDA inference.\n        for i in range(len(tokens)):\n            sent.append(tokens[i])\n            if len(sent) % 5 == 0:\n                sentences.append(sent)\n                sent = []\n        if len(sent) > 0:\n            sentences.append(sent)\n        doc = SLDADoc()\n        self.__engine.infer(sentences, doc)\n        topics = doc.sparse_topic_dist()\n        for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the top k keywords under a specific topic.\n\n        Args:\n            topic_id(int): the ID of the topic to query.\n            k(int): the number of top keywords to return.\n\n        Returns:\n            results(dict): contains specific topic's keywords and corresponding\n                           probability.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n"
  },
  {
    "path": "modules/text/language_model/slda_news/sampler.py",
"content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_news.vose_alias import VoseAlias\nfrom slda_news.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float64)\n                
self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: if the sentence is very long, multiplying many small\n                # probabilities together can underflow and lose precision.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/slda_news/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n   
     jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
  {
    "path": "modules/text/language_model/slda_news/tokenizer.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple version FMM(Forward Maximun Matching) word tokenizer. This tokenizer can only\n       be used in topic model demo, but not in real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenize result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n     
   \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n                \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Change English words to lower case.\n        # And just 
preserve the word in vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
  {
    "path": "modules/text/language_model/slda_news/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading SLDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Returns an integer float number between [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Return time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
  {
    "path": "modules/text/language_model/slda_news/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/slda_news/vose_alias.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_news.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Arg:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate samples from given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = int(rand())\n        return dart1 if dart2 > self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/README.md",
    "content": "## 模型概述\n\n主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中SLDA(Sentence-LDA)是主题模型的一种。SLDA是LDA主题模型的扩展，LDA假设每个单词对应一个主题，而SLDA假设每个句子对应一个主题。本Module基于的数据集为百度自建的小说领域数据集。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/slda.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[SLDA论文](https://pdfs.semanticscholar.org/c311/778adb9484c86250e915aecd9714f4206050.pdf)。\n\n注：该Module由第三方开发者DesmonDay贡献。\n\n## SLDA模型 API 说明\n\n### infer_doc_topic_distribution(document)\n\n用于推理出文档的主题分布。\n\n**参数**\n\n- document(str): 输入文档。\n\n**返回**\n\n- results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n### show_topic_keywords(topic_id, k=10)\n\n用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n**参数**\n\n- topic_id(int): 主题ID。\n- k(int): 需要知道对应主题的前k个关键词。\n\n**返回**\n\n- results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。\n\n### 代码示例\n\n这里展示部分API的使用示例。\n\n``` python\nimport paddlehub as hub\n\nslda_novel = hub.Module(name=\"slda_novel\")\n\ntopic_dist = slda_novel.infer_doc_topic_distribution(\"妈妈告诉女儿，今天爸爸过生日，放学后要早点回家一起庆祝\")\n# [{'topic id': 222, 'distribution': 0.5}, {'topic id': 362, 'distribution': 0.5}]\n\nkeywords = slda_novel.show_topic_keywords(topic_id=222)\n# {'回来': 0.044502306717752,\n#  '回去': 0.036457065533017245,\n#  '回家': 0.029136327306669554,\n#  '明天': 0.028762575780517493,\n#  '休息': 0.022904260192395567,\n#  '晚上': 0.021970839714261954,\n#  '时间': 0.020756626422891028,\n#  '好好': 0.019726413882856498,\n#  '电话': 0.017195445214734463,\n#  '吃饭': 0.01521839547511471}\n\n```\n\n## 查看代码\nhttps://github.com/baidu/Familia\n\n\n## 依赖\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/slda_novel/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of documents in sparse format.\n           By default, it is sorted according to the topic probability\n           under the descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n\n            def take_elem(topic):\n                return topic.prob\n\n            topic_dist.sort(key=take_elem, reverse=True)\n            if topic_dist is None:\n                topic_dist = []\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.config import ModelConfig\nfrom slda_novel.util import load_prototxt, fix_random_seed, rand_k\nfrom slda_novel.model import TopicModel\nfrom slda_novel.sampler import GibbsSampler, MHSampler\nfrom slda_novel.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_novel.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n       
     for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # 字典列表\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic 
in the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.inference_engine import InferenceEngine\nfrom slda_novel.document import SLDADoc\nfrom slda_novel.semantic_matching import SemanticMatching, WordAndDis\nfrom slda_novel.tokenizer import LACTokenizer, SimpleTokenizer\nfrom slda_novel.config import ModelType\nfrom slda_novel.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"slda_novel\",\n    version=\"1.0.0\",\n    summary=\n    \"This is a PaddleHub Module for SLDA topic model in novel dataset, where we can infer the topic distribution of document.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'novel')\n        self.conf_file = 'slda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish Initialization.\")\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            
document(str): the input document text.\n\n        Returns:\n            results(list): returns the topic distribution of document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if tokens == []:\n            return []\n        results = []\n        sentences = []\n        sent = []\n        for i in range(len(tokens)):\n            sent.append(tokens[i])\n            if len(sent) % 5 == 0:\n                sentences.append(sent)\n                sent = []\n        if len(sent) > 0:\n            sentences.append(sent)\n\n        doc = SLDADoc()\n        self.__engine.infer(sentences, doc)\n        topics = doc.sparse_topic_dist()\n        for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the k keywords under specific topic.\n\n        Args:\n            topic_id(int): topic information we want to know.\n            k(int): top k keywords.\n\n        Returns:\n            results(dict): contains specific topic's keywords and corresponding\n                           probability.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/sampler.py",
    "content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_novel.vose_alias import VoseAlias\nfrom slda_novel.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float)\n                
self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: for long sentences, the product of many small factors may\n                # underflow and lose accuracy.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n  
      jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/tokenizer.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple version FMM(Forward Maximun Matching) word tokenizer. This tokenizer can only\n       be used in topic model demo, but not in real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenize result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n     
   \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n                \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Change English words to lower case.\n        # And just 
preserve the word in vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading SLDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Returns an integer float number between [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Return time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/slda_novel/vose_alias.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_novel.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Arg:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate samples from given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = int(rand())\n        return dart1 if dart2 > self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/README.md",
    "content": "## 模型概述\n\n主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中SLDA(Sentence-LDA)是主题模型的一种。SLDA是LDA主题模型的扩展，LDA假设每个单词对应一个主题，而SLDA假设每个句子对应一个主题。本Module基于的数据集为百度自建的网页领域数据集。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/slda.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[SLDA论文](https://pdfs.semanticscholar.org/c311/778adb9484c86250e915aecd9714f4206050.pdf)。\n\n注：该Module由第三方开发者DesmonDay贡献。\n\n## SLDA模型 API 说明\n\n### infer_doc_topic_distribution(document)\n\n用于推理出文档的主题分布。\n\n**参数**\n\n- document(str): 输入文档。\n\n**返回**\n\n- results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n### show_topic_keywords(topic_id, k=10)\n\n用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n**参数**\n\n- topic_id(int): 主题ID。\n- k(int): 需要知道对应主题的前k个关键词。\n\n**返回**\n\n- results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。\n\n### 代码示例\n\n这里展示API的使用示例。\n\n``` python\nimport paddlehub as hub\n\nslda_webpage = hub.Module(name=\"slda_webpage\")\n\ntopic_dist = slda_webpage.infer_doc_topic_distribution(\"百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。\")\n# [{'topic id': 4687, 'distribution': 0.38333333333333336},\n#  {'topic id': 2508, 'distribution': 0.31666666666666665},\n#  {'topic id': 2871, 'distribution': 0.15},\n#  {'topic id': 2292, 'distribution': 0.11666666666666667},\n#  {'topic id': 4410, 'distribution': 0.016666666666666666},\n#  {'topic id': 4676, 'distribution': 0.016666666666666666}]\n\nkeywords = slda_webpage.show_topic_keywords(topic_id=4687)\n# {'市场': 0.07413332566788851,\n#  '增长': 0.045259383167567974,\n#  '规模': 0.030225253512468797,\n#  '用户': 0.02278765317990645,\n#  '超过': 0.019395970334729278,\n#  '份额': 0.019091932266952005,\n#  '全球': 0.018879934814238216,\n#  '手机': 0.01252139322404175,\n#  '美元': 0.01202885155424257,\n#  '收入': 0.011096560279140084}\n\n```\n\n## 查看代码\nhttps://github.com/baidu/Familia\n\n\n## 依赖\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/slda_webpage/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of the document in sparse format.\n           By default, it is sorted by topic probability in descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n\n            def take_elem(topic):\n                return topic.prob\n\n            topic_dist.sort(key=take_elem, reverse=True)\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.config import ModelConfig\nfrom slda_webpage.util import load_prototxt, fix_random_seed, rand_k\nfrom slda_webpage.model import TopicModel\nfrom slda_webpage.sampler import GibbsSampler, MHSampler\nfrom slda_webpage.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_webpage.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            
doc.set_alpha(self.__model.alpha())\n            for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # One topic-count dict per term.\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic 
in the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.inference_engine import InferenceEngine\nfrom slda_webpage.document import SLDADoc\nfrom slda_webpage.semantic_matching import SemanticMatching, WordAndDis\nfrom slda_webpage.tokenizer import LACTokenizer, SimpleTokenizer\nfrom slda_webpage.config import ModelType\nfrom slda_webpage.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"slda_webpage\",\n    version=\"1.0.0\",\n    summary=\n    \"This is a PaddleHub Module for SLDA topic model in webpage dataset, where we can infer the topic distribution of document.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'webpage')\n        self.conf_file = 'slda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish Initialization.\")\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n    
        document(str): the input document text.\n\n        Returns:\n            results(list): the topic distribution of the document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if not tokens:\n            return []\n        results = []\n        sentences = []\n        sent = []\n        for i in range(len(tokens)):\n            sent.append(tokens[i])\n            if len(sent) % 5 == 0:\n                sentences.append(sent)\n                sent = []\n        if len(sent) > 0:\n            sentences.append(sent)\n\n        doc = SLDADoc()\n        self.__engine.infer(sentences, doc)\n        topics = doc.sparse_topic_dist()\n        for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the top k keywords under the specific topic.\n\n        Args:\n            topic_id(int): the id of the topic to query.\n            k(int): top k keywords.\n\n        Returns:\n            results(dict): contains specific topic's keywords and\n                     corresponding probability.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/sampler.py",
    "content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_webpage.vose_alias import VoseAlias\nfrom slda_webpage.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float64)\n          
      self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old 
= self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, 
old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta 
/ t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n 
           for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: if the sentence is long, the product of many small factors can\n                # underflow and lose numerical precision.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 
0.5\n        jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/tokenizer.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"A simple FMM (Forward Maximum Matching) word tokenizer. This tokenizer is only\n       suitable for topic model demos, not for real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenized result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n     
   \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n                \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Change English words to lower case.\n        # And just 
preserve the word in vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading SLDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Returns an integer float number between [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Return time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/slda_webpage/vose_alias.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_webpage.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Arg:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate samples from given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = int(rand())\n        return dart1 if dart2 > self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/README.md",
    "content": "## 模型概述\n\n主题模型(Topic Model)是以无监督学习的方式对文档的隐含语义结构进行聚类的统计模型，其中SLDA(Sentence-LDA)是主题模型的一种。SLDA是LDA主题模型的扩展，LDA假设每个单词对应一个主题，而SLDA假设每个句子对应一个主题。本Module基于的数据集为百度自建的微博领域数据集。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/semantic_model/slda.png\" hspace='10'/> <br />\n</p>\n\n更多详情请参考[SLDA论文](https://pdfs.semanticscholar.org/c311/778adb9484c86250e915aecd9714f4206050.pdf)。\n\n注：该Module由第三方开发者DesmonDay贡献。\n\n## SLDA模型 API 说明\n\n### infer_doc_topic_distribution(document)\n\n用于推理出文档的主题分布。\n\n**参数**\n\n- document(str): 输入文档。\n\n**返回**\n\n- results(list): 包含主题分布下各个主题ID和对应的概率分布。其中，list的基本元素为dict，dict的key为主题ID，value为各个主题ID对应的概率。\n\n### show_topic_keywords(topic_id, k=10)\n\n用于展示出每个主题下对应的关键词，可配合推理主题分布的API使用。\n\n**参数**\n\n- topic_id(int): 主题ID。\n- k(int): 需要知道对应主题的前k个关键词。\n\n**返回**\n\n- results(dict): 返回对应文档的前k个关键词，以及各个关键词在文档中的出现概率。\n\n### 代码示例\n\n这里展示API的使用示例。\n\n``` python\nimport paddlehub as hub\n\nslda_weibo = hub.Module(name=\"slda_weibo\")\n\ntopic_dist = slda_weibo.infer_doc_topic_distribution(\"百度是全球最大的中文搜索引擎、致力于让网民更便捷地获取信息，找到所求。\")\n# [{'topic id': 874, 'distribution': 0.5}, {'topic id': 1764, 'distribution': 0.5}]\n\nkeywords = slda_weibo.show_topic_keywords(topic_id=874)\n# {'数据': 0.07850538018570305,\n#  '更新': 0.04504777051711974,\n#  '出口': 0.023363758946167185,\n#  '信息': 0.020567061200812687,\n#  '全国': 0.015975367546781145,\n#  '双十一': 0.014998636225687216,\n#  '地理': 0.013257422965959297,\n#  '官方': 0.012913598174463106,\n#  '支持': 0.01177359809763076,\n#  '说话': 0.011205999070328388}\n\n```\n## 查看代码\nhttps://github.com/baidu/Familia\n\n\n## 依赖\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/language_model/slda_weibo/config.py",
    "content": "\"\"\"\nThis file defines the basic config information of LDA/SLDA model.\n\"\"\"\n\n\nclass ModelType:\n    LDA = 0\n    SLDA = 1\n\n\nclass ModelConfig:\n    type = None\n    num_topics = None\n    alpha = None\n    beta = None\n    word_topic_file = None\n    vocab_file = None\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/document.py",
    "content": "import numpy as np\n\n\nclass Topic(object):\n    \"\"\"Basic data structure of topic, contains topic id and\n       corresponding probability.\n    \"\"\"\n\n    def __init__(self, tid, prob):\n        self.tid = tid  # topic id\n        self.prob = prob  # topic probability\n\n\nclass Token(object):\n    \"\"\"Basic storage unit of LDA documents, contains word id\n       and corresponding topic.\n    \"\"\"\n\n    def __init__(self, topic, id):\n        self.topic = topic\n        self.id = id\n\n\nclass Sentence(object):\n    \"\"\"Basic storage unit of SentenceLDA documents, contains word ids\n       of the sentence and its corresponding topic id.\n    \"\"\"\n\n    def __init__(self, topic, tokens):\n        self.topic = topic\n        self.tokens = tokens\n\n\nclass LDADoc(object):\n    \"\"\"The storage structure of LDA model's inference result.\n    \"\"\"\n\n    def __init__(self):\n        self._num_topics = None  # Number of topics.\n        self._num_accum = None  # Number of accumulated sample rounds.\n        self._alpha = None  # Document prior parameter.\n        self._tokens = None  # Storage structure of inference results.\n        self._topic_sum = None  # Document's topic sum in one round samples.\n        self._accum_topic_sum = None  # Accumulated results of topic sum.\n\n    def init(self, num_topics):\n        \"\"\"Initialize the LDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self._num_accum = 0\n        self._tokens = []\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_token(self, token):\n        \"\"\"Add new word to current LDADoc.\n        Arg:\n            token: Token class object.\n        \"\"\"\n        assert token.topic >= 0, \"Topic %d out of range!\" % token.topic\n        assert token.topic < self._num_topics, \"Topic %d out of range!\" % token.topic\n        
self._tokens.append(token)\n        self._topic_sum[token.topic] += 1\n\n    def token(self, index):\n        return self._tokens[index]\n\n    def set_topic(self, index, new_topic):\n        \"\"\"Set the index word's topic to new_topic, and update the corresponding\n           topic distribution.\n        \"\"\"\n        assert new_topic >= 0, \"Topic %d out of range!\" % new_topic\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % new_topic\n        old_topic = self._tokens[index].topic\n        if new_topic == old_topic:\n            return\n        self._tokens[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def set_alpha(self, alpha):\n        self._alpha = alpha\n\n    def size(self):\n        \"\"\"Return number of words in LDADoc.\n        \"\"\"\n        return len(self._tokens)\n\n    def topic_sum(self, topic_id):\n        return self._topic_sum[topic_id]\n\n    def sparse_topic_dist(self, sort=True):\n        \"\"\"Return the topic distribution of the document in sparse format.\n           By default, topics are sorted by probability in descending order.\n        \"\"\"\n        topic_dist = []\n        sum_ = np.sum(self._accum_topic_sum)\n        if sum_ == 0:\n            return topic_dist\n        for i in range(0, self._num_topics):\n            if self._accum_topic_sum[i] == 0:\n                continue\n            topic_dist.append(Topic(i, self._accum_topic_sum[i] * 1.0 / sum_))\n        if sort:\n            topic_dist.sort(key=lambda topic: topic.prob, reverse=True)\n\n        return topic_dist\n\n    def dense_topic_dist(self):\n        \"\"\"Return the distribution of document topics in dense format,\n           taking into account the prior parameter alpha.\n        \"\"\"\n        dense_dist = 
np.zeros(self._num_topics)\n        if self.size() == 0:\n            return dense_dist\n        dense_dist = (self._accum_topic_sum * 1.0 / self._num_accum + self._alpha) / (\n            self.size() + self._alpha * self._num_topics)\n        return dense_dist\n\n    def accumulate_topic_num(self):\n        self._accum_topic_sum += self._topic_sum\n        self._num_accum += 1\n\n\nclass SLDADoc(LDADoc):\n    \"\"\"Sentence LDA Document, inherited from LDADoc.\n       Add add_sentence interface.\n    \"\"\"\n\n    def __init__(self):\n        super().__init__()\n        self.__sentences = None\n\n    def init(self, num_topics):\n        \"\"\"Initialize the SLDADoc according to num_topics.\n        \"\"\"\n        self._num_topics = num_topics\n        self.__sentences = []\n        self._num_accum = 0\n        self._topic_sum = np.zeros(self._num_topics)\n        self._accum_topic_sum = np.zeros(self._num_topics)\n\n    def add_sentence(self, sent):\n        \"\"\"Add new sentence to current SLDADoc.\n        Arg:\n            sent: Sentence class object.\n        \"\"\"\n        assert sent.topic >= 0, \"Topic %d out of range!\" % (sent.topic)\n        assert sent.topic < self._num_topics, \"Topic %d out of range!\" % (sent.topic)\n        self.__sentences.append(sent)\n        self._topic_sum[sent.topic] += 1\n\n    def set_topic(self, index, new_topic):\n        assert new_topic >= 0, \"Topic %d out of range!\" % (new_topic)\n        assert new_topic < self._num_topics, \"Topic %d out of range!\" % (new_topic)\n        old_topic = self.__sentences[index].topic\n        if new_topic == old_topic:\n            return\n        self.__sentences[index].topic = new_topic\n        self._topic_sum[old_topic] -= 1\n        self._topic_sum[new_topic] += 1\n\n    def size(self):\n        \"\"\"Return number of sentences in SLDADoc.\n        \"\"\"\n        return len(self.__sentences)\n\n    def sent(self, index):\n        return self.__sentences[index]\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/inference_engine.py",
    "content": "import os\n\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.config import ModelConfig\nfrom slda_weibo.util import load_prototxt, fix_random_seed, rand_k\nfrom slda_weibo.model import TopicModel\nfrom slda_weibo.sampler import GibbsSampler, MHSampler\nfrom slda_weibo.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_weibo.vocab import OOV\n\n\nclass SamplerType:\n    GibbsSampling = 0\n    MetropolisHastings = 1\n\n\nclass InferenceEngine(object):\n    def __init__(self, model_dir, conf_file, type=SamplerType.MetropolisHastings):\n        # Read model configuration.\n        config = ModelConfig()\n        conf_file_path = os.path.join(model_dir, conf_file)\n        load_prototxt(conf_file_path, config)\n        self.__model = TopicModel(model_dir, config)\n        self.__config = config\n\n        # Initialize the sampler according to the configuration.\n        if type == SamplerType.GibbsSampling:\n            self.__sampler = GibbsSampler(self.__model)\n        elif type == SamplerType.MetropolisHastings:\n            self.__sampler = MHSampler(self.__model)\n\n    def infer(self, input, doc):\n        \"\"\"Perform LDA topic inference on input, and store the results in doc.\n        Args:\n            input: a list of strings after tokenization.\n            doc: LDADoc type or SLDADoc type.\n        \"\"\"\n        fix_random_seed()\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n            for token in input:\n                id_ = self.__model.term_id(token)\n                if id_ != OOV:\n                    init_topic = rand_k(self.__model.num_topics())\n                    doc.add_token(Token(init_topic, id_))\n            self.lda_infer(doc, 20, 50)\n        elif isinstance(doc, SLDADoc):\n            doc.init(self.__model.num_topics())\n            doc.set_alpha(self.__model.alpha())\n       
     for sent in input:\n                words = []\n                for token in sent:\n                    id_ = self.__model.term_id(token)\n                    if id_ != OOV:\n                        words.append(id_)\n                init_topic = rand_k(self.__model.num_topics())\n                doc.add_sentence(Sentence(init_topic, words))\n            self.slda_infer(doc, 20, 50)\n        else:\n            logger.error(\"Wrong Doc Type!\")\n\n    def lda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def slda_infer(self, doc, burn_in_iter, total_iter):\n        assert burn_in_iter >= 0\n        assert total_iter > 0\n        assert total_iter > burn_in_iter\n\n        for iter_ in range(total_iter):\n            self.__sampler.sample_doc(doc)\n            if iter_ >= burn_in_iter:\n                doc.accumulate_topic_num()\n\n    def model_type(self):\n        return self.__model.type()\n\n    def get_model(self):\n        return self.__model\n\n    def get_config(self):\n        return self.__config\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/model.py",
    "content": "import os\nfrom collections import OrderedDict\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.vocab import Vocab, WordCount\n\n\nclass TopicModel(object):\n    \"\"\"Storage Structure of Topic model, including vocabulary and word topic count.\n    \"\"\"\n\n    def __init__(self, model_dir, config):\n        \"\"\"\n        Args:\n            model_dir: the path of model directory\n            config: ModelConfig class.\n        \"\"\"\n        self.__word_topic = None  # Model parameter of word topic.\n        self.__vocab = Vocab()  # Vocab data structure of model.\n        self.__num_topics = config.num_topics  # Number of topics.\n        self.__alpha = config.alpha\n        self.__alpha_sum = self.__alpha * self.__num_topics\n        self.__beta = config.beta\n        self.__beta_sum = None\n        self.__type = config.type  # Model type.\n        self.__topic_sum = np.zeros(self.__num_topics, dtype=\"int64\")  # Accum sum of each topic in word topic.\n        self.__topic_words = [[] for _ in range(self.__num_topics)]\n        word_topic_path = os.path.join(model_dir, config.word_topic_file)\n        vocab_path = os.path.join(model_dir, config.vocab_file)\n        self.load_model(word_topic_path, vocab_path)\n\n    def term_id(self, term):\n        return self.__vocab.get_id(term)\n\n    def load_model(self, word_topic_path, vocab_path):\n\n        # Loading vocabulary\n        self.__vocab.load(vocab_path)\n\n        self.__beta_sum = self.__beta * self.__vocab.size()\n        self.__word_topic = [{} for _ in range(self.__vocab.size())]  # 字典列表\n        self.__load_word_dict(word_topic_path)\n        logger.info(\"Model Info: #num_topics=%d #vocab_size=%d alpha=%f beta=%f\" %\n                    (self.num_topics(), self.vocab_size(), self.alpha(), self.beta()))\n\n    def word_topic_value(self, word_id, topic_id):\n        \"\"\"Return value of specific word under specific topic 
in the model.\n        \"\"\"\n        word_dict = self.__word_topic[word_id]\n        if topic_id not in word_dict:\n            return 0\n        return word_dict[topic_id]\n\n    def word_topic(self, term_id):\n        \"\"\"Return the topic distribution of a word.\n        \"\"\"\n        return self.__word_topic[term_id]\n\n    def topic_sum_value(self, topic_id):\n        return self.__topic_sum[topic_id]\n\n    def topic_sum(self):\n        return self.__topic_sum\n\n    def num_topics(self):\n        return self.__num_topics\n\n    def vocab_size(self):\n        return self.__vocab.size()\n\n    def alpha(self):\n        return self.__alpha\n\n    def alpha_sum(self):\n        return self.__alpha_sum\n\n    def beta(self):\n        return self.__beta\n\n    def beta_sum(self):\n        return self.__beta_sum\n\n    def type(self):\n        return self.__type\n\n    def __load_word_dict(self, word_dict_path):\n        \"\"\"Load the word topic parameters.\n        \"\"\"\n        logger.info(\"Loading word topic.\")\n        with open(word_dict_path, 'r', encoding='utf-8') as f:\n            for line in tqdm(f.readlines()):\n                fields = line.strip().split(\" \")\n                assert len(fields) > 0, \"Model file format error!\"\n                term_id = int(fields[0])\n                assert term_id < self.vocab_size(), \"Term id out of range!\"\n                assert term_id >= 0, \"Term id out of range!\"\n                for i in range(1, len(fields)):\n                    topic_count = fields[i].split(\":\")\n                    assert len(topic_count) == 2, \"Topic count format error!\"\n\n                    topic_id = int(topic_count[0])\n                    assert topic_id >= 0, \"Topic out of range!\"\n                    assert topic_id < self.__num_topics, \"Topic out of range!\"\n\n                    count = int(topic_count[1])\n                    assert count >= 0, \"Topic count error!\"\n\n                    
self.__word_topic[term_id][topic_id] = count\n                    self.__topic_sum[topic_id] += count\n                    self.__topic_words[topic_id].append(WordCount(term_id, count))\n                new_dict = OrderedDict()\n                for key in sorted(self.__word_topic[term_id]):\n                    new_dict[key] = self.__word_topic[term_id][key]\n                self.__word_topic[term_id] = new_dict\n\n    def get_vocab(self):\n        return self.__vocab.vocabulary()\n\n    def topic_words(self):\n        return self.__topic_words\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/module.py",
    "content": "import os\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.inference_engine import InferenceEngine\nfrom slda_weibo.document import SLDADoc\nfrom slda_weibo.semantic_matching import SemanticMatching, WordAndDis\nfrom slda_weibo.tokenizer import LACTokenizer, SimpleTokenizer\nfrom slda_weibo.config import ModelType\nfrom slda_weibo.vocab import Vocab, WordCount\n\n\n@moduleinfo(\n    name=\"slda_weibo\",\n    version=\"1.0.0\",\n    summary=\n    \"This is a PaddleHub Module for SLDA topic model in weibo dataset, where we can infer the topic distribution of document.\",\n    author=\"DesmonDay\",\n    author_email=\"\",\n    type=\"nlp/semantic_model\")\nclass TopicModel(hub.Module):\n    def _initialize(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.model_dir = os.path.join(self.directory, 'weibo')\n        self.conf_file = 'slda.conf'\n        self.__engine = InferenceEngine(self.model_dir, self.conf_file)\n        self.vocab_path = os.path.join(self.model_dir, 'vocab_info.txt')\n        lac = hub.Module(name=\"lac\")\n        # self.__tokenizer = SimpleTokenizer(self.vocab_path)\n        self.__tokenizer = LACTokenizer(self.vocab_path, lac)\n\n        self.vocabulary = self.__engine.get_model().get_vocab()\n        self.config = self.__engine.get_config()\n        self.topic_words = self.__engine.get_model().topic_words()\n        self.topic_sum_table = self.__engine.get_model().topic_sum()\n\n        def take_elem(word_count):\n            return word_count.count\n\n        for i in range(self.config.num_topics):\n            self.topic_words[i].sort(key=take_elem, reverse=True)\n\n        logger.info(\"Finish initialization.\")\n\n    def infer_doc_topic_distribution(self, document):\n        \"\"\"\n        This interface infers the topic distribution of document.\n\n        Args:\n            
document(str): the input document text.\n\n        Returns:\n            results(list): the topic distribution of the document.\n        \"\"\"\n        tokens = self.__tokenizer.tokenize(document)\n        if not tokens:\n            return []\n        results = []\n        sentences = []\n        sent = []\n        # Group tokens into pseudo-sentences of five words each.\n        for i in range(len(tokens)):\n            sent.append(tokens[i])\n            if len(sent) % 5 == 0:\n                sentences.append(sent)\n                sent = []\n        if len(sent) > 0:\n            sentences.append(sent)\n\n        doc = SLDADoc()\n        self.__engine.infer(sentences, doc)\n        topics = doc.sparse_topic_dist()\n        for topic in topics:\n            results.append({\"topic id\": topic.tid, \"distribution\": topic.prob})\n        return results\n\n    def show_topic_keywords(self, topic_id, k=10):\n        \"\"\"\n        This interface returns the top k keywords under a specific topic.\n\n        Args:\n            topic_id(int): the id of the topic to inspect.\n            k(int): number of top keywords to return.\n\n        Returns:\n            results(dict): the topic's top keywords and their corresponding\n                           probabilities.\n        \"\"\"\n        EPS = 1e-8\n        results = {}\n        if 0 <= topic_id < self.config.num_topics:\n            k = min(k, len(self.topic_words[topic_id]))\n            for i in range(k):\n                prob = self.topic_words[topic_id][i].count / \\\n                       (self.topic_sum_table[topic_id] + EPS)\n                results[self.vocabulary[self.topic_words[topic_id][i].word_id]] = prob\n            return results\n        else:\n            logger.error(\"%d is out of range!\" % topic_id)\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/sampler.py",
    "content": "import os\n\nimport numpy as np\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.document import LDADoc, SLDADoc, Token, Sentence\nfrom slda_weibo.vose_alias import VoseAlias\nfrom slda_weibo.util import rand, rand_k\n\n\nclass Sampler(object):\n    def __init__(self):\n        pass\n\n    def sample_doc(self, doc):\n        \"\"\"Sample LDA or SLDA topics for documents.\n        \"\"\"\n        raise NotImplementedError\n\n\nclass MHSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n        self.__topic_indexes = None\n        self.__alias_tables = None\n        self.__prob_sum = None\n        self.__beta_alias = VoseAlias()\n        self.__beta_prior_sum = None\n        self.__mh_steps = 2\n        self.__construct_alias_table()\n\n    def __construct_alias_table(self):\n        \"\"\"Construct alias table for all words.\n        \"\"\"\n        logger.info(\"Construct alias table for alias sampling method.\")\n        vocab_size = self.__model.vocab_size()\n        self.__topic_indexes = [[] for _ in range(vocab_size)]\n        self.__alias_tables = [VoseAlias() for _ in range(vocab_size)]\n        self.__prob_sum = np.zeros(vocab_size)\n\n        # Construct each word's alias table (prior is not included).\n        for i in tqdm(range(vocab_size)):\n            dist = []\n            prob_sum = 0\n            for key in self.__model.word_topic(i):\n                topic_id = key\n                word_topic_count = self.__model.word_topic(i)[key]\n                topic_sum = self.__model.topic_sum_value(topic_id)\n\n                self.__topic_indexes[i].append(topic_id)\n                q = word_topic_count / (topic_sum + self.__model.beta_sum())\n                dist.append(q)\n                prob_sum += q\n            self.__prob_sum[i] = prob_sum\n            if len(dist) > 0:\n                dist = np.array(dist, dtype=np.float)\n                
self.__alias_tables[i].initialize(dist)\n\n        # Build prior parameter beta's alias table.\n        beta_dist = self.__model.beta() / (self.__model.topic_sum() + self.__model.beta_sum())\n        self.__beta_prior_sum = np.sum(beta_dist)\n        self.__beta_alias.initialize(beta_dist)\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        new_topic = token.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, token)\n            new_topic = self.__word_proposal(doc, token, doc_proposed_topic)\n        return new_topic\n\n    def __sample_sentence(self, doc, sent):\n        new_topic = sent.topic\n        for i in range(self.__mh_steps):\n            doc_proposed_topic = self.__doc_proposal(doc, sent)\n            new_topic = self.__word_proposal(doc, sent, doc_proposed_topic)\n        return new_topic\n\n    def __doc_proposal(self, doc, token):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.token(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                proportion_old = 
self.__proportional_function(doc, token, old_topic)\n                proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            dart = rand() * (doc.size() + self.__model.alpha_sum())\n            if dart < doc.size():\n                token_index = int(dart)\n                new_topic = doc.sent(token_index).topic\n            else:\n                new_topic = rand_k(self.__model.num_topics())\n\n            if new_topic != old_topic:\n                proportion_old = self.__proportional_function(doc, sent, old_topic)\n                proportion_new = self.__proportional_function(doc, sent, new_topic)\n                proposal_old = self.__doc_proposal_distribution(doc, old_topic)\n                proposal_new = self.__doc_proposal_distribution(doc, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n\n            return new_topic\n\n    def __word_proposal(self, doc, token, old_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            new_topic = self.__propose(token.id)\n            if new_topic != old_topic:\n                proposal_old = self.__word_proposal_distribution(token.id, old_topic)\n                proposal_new = self.__word_proposal_distribution(token.id, new_topic)\n                proportion_old = self.__proportional_function(doc, token, old_topic)\n  
              proportion_new = self.__proportional_function(doc, token, new_topic)\n                transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                rejection = rand()\n                mask = -(rejection < transition_prob)\n                return (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            new_topic = old_topic\n            for word_id in sent.tokens:\n                new_topic = self.__propose(word_id)\n                if new_topic != old_topic:\n                    proportion_old = self.__proportional_function(doc, sent, old_topic)\n                    proportion_new = self.__proportional_function(doc, sent, new_topic)\n                    proposal_old = self.__word_proposal_distribution(word_id, old_topic)\n                    proposal_new = self.__word_proposal_distribution(word_id, new_topic)\n                    transition_prob = float((proportion_new * proposal_old) / (proportion_old * proposal_new))\n                    rejection = rand()\n                    mask = -(rejection < transition_prob)\n                    new_topic = (new_topic & mask) | (old_topic & ~mask)\n            return new_topic\n\n    def __proportional_function(self, doc, token, new_topic):\n        if isinstance(doc, LDADoc) and isinstance(token, Token):\n            old_topic = token.topic\n            dt_alpha = doc.topic_sum(new_topic) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, new_topic) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n            if new_topic == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            return dt_alpha * wt_beta / 
t_sum_beta_sum\n\n        elif isinstance(doc, SLDADoc) and isinstance(token, Sentence):\n            sent = token\n            old_topic = sent.topic\n            result = doc.topic_sum(new_topic) + self.__model.alpha()\n            if new_topic == old_topic:\n                result -= 1\n            for word_id in sent.tokens:\n                wt_beta = self.__model.word_topic_value(word_id, new_topic) + self.__model.beta()\n                t_sum_beta_sum = self.__model.topic_sum_value(new_topic) + self.__model.beta_sum()\n                if new_topic == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                    t_sum_beta_sum -= 1\n                result *= wt_beta / t_sum_beta_sum\n            return result\n        else:\n            logger.error(\"Wrong input argument type!\")\n\n    def __word_proposal_distribution(self, word_id, topic):\n        wt_beta = self.__model.word_topic_value(word_id, topic) + self.__model.beta()\n        t_sum_beta_sum = self.__model.topic_sum_value(topic) + self.__model.beta_sum()\n        return wt_beta / t_sum_beta_sum\n\n    def __doc_proposal_distribution(self, doc, topic):\n        return doc.topic_sum(topic) + self.__model.alpha()\n\n    def __propose(self, word_id):\n        dart = rand() * (self.__prob_sum[word_id] + self.__beta_prior_sum)\n        if dart < self.__prob_sum[word_id]:\n            idx = self.__alias_tables[word_id].generate()\n            topic = self.__topic_indexes[word_id][idx]\n        else:\n            topic = self.__beta_alias.generate()\n        return topic\n\n\nclass GibbsSampler(Sampler):\n    def __init__(self, model):\n        super().__init__()\n        self.__model = model\n\n    def sample_doc(self, doc):\n        if isinstance(doc, LDADoc) and not isinstance(doc, SLDADoc):\n            for i in range(doc.size()):\n                new_topic = self.__sample_token(doc, doc.token(i))\n                doc.set_topic(i, new_topic)\n        elif isinstance(doc, SLDADoc):\n   
         for i in range(doc.size()):\n                new_topic = self.__sample_sentence(doc, doc.sent(i))\n                doc.set_topic(i, new_topic)\n\n    def __sample_token(self, doc, token):\n        old_topic = token.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for i in range(num_topics):\n            dt_alpha = doc.topic_sum(i) + self.__model.alpha()\n            wt_beta = self.__model.word_topic_value(token.id, i) + self.__model.beta()\n            t_sum_beta_sum = self.__model.topic_sum_value(i) + self.__model.beta_sum()\n            if i == old_topic and wt_beta > 1:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                wt_beta -= 1\n                t_sum_beta_sum -= 1\n            prob[i] = dt_alpha * wt_beta / t_sum_beta_sum\n            sum_ += prob[i]\n            accum_prob[i] = prob[i] if i == 0 else accum_prob[i - 1] + prob[i]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for i in range(1, num_topics):\n            if accum_prob[i - 1] < dart <= accum_prob[i]:\n                return i\n        return num_topics - 1\n\n    def __sample_sentence(self, doc, sent):\n        old_topic = sent.topic\n        num_topics = self.__model.num_topics()\n        accum_prob = np.zeros(num_topics)\n        prob = np.zeros(num_topics)\n        sum_ = 0\n        for t in range(num_topics):\n            dt_alpha = doc.topic_sum(t) + self.__model.alpha()\n            t_sum_beta_sum = self.__model.topic_sum_value(t) + self.__model.beta_sum()\n            if t == old_topic:\n                if dt_alpha > 1:\n                    dt_alpha -= 1\n                if t_sum_beta_sum > 1:\n                    t_sum_beta_sum -= 1\n            prob[t] = dt_alpha\n            for i in range(len(sent.tokens)):\n                w = sent.tokens[i]\n                wt_beta = 
self.__model.word_topic_value(w, t) + self.__model.beta()\n                if t == old_topic and wt_beta > 1:\n                    wt_beta -= 1\n                # Note: if the sentence is very long, the product of many small\n                # factors may underflow and lose numerical accuracy.\n                prob[t] *= wt_beta / t_sum_beta_sum\n            sum_ += prob[t]\n            accum_prob[t] = prob[t] if t == 0 else accum_prob[t - 1] + prob[t]\n\n        dart = rand() * sum_\n        if dart <= accum_prob[0]:\n            return 0\n        for t in range(1, num_topics):\n            if accum_prob[t - 1] < dart <= accum_prob[t]:\n                return t\n        return num_topics - 1\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/semantic_matching.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.vocab import OOV\n\nEPS = 1e-06\n\n\nclass WordAndDis(object):\n    def __init__(self):\n        self.word = None\n        self.distance = None\n\n\nclass SemanticMatching(object):\n    def __init__(self):\n        pass\n\n    def l2_norm(self, vec):\n        \"\"\"Calculate the length of vector.\n        \"\"\"\n        result = np.sqrt(np.sum(vec**2))\n        return result\n\n    def cosine_similarity(self, vec1, vec2):\n        norm1 = self.l2_norm(vec1)\n        norm2 = self.l2_norm(vec2)\n        result = np.sum(vec1 * vec2) / norm1 / norm2\n        return result\n\n    def likelihood_based_similarity(self, terms, doc_topic_dist, model):\n        \"\"\"\n        Args:\n            terms: list of strings\n            doc_topic_dist: list of Topic class\n            model: TopicModel class\n        \"\"\"\n        num_of_term_in_vocab = 0\n        result = 0\n        for i in range(len(terms)):\n            term_id = model.term_id(terms[i])\n            if term_id == OOV:\n                continue\n            num_of_term_in_vocab += 1\n            for j in range(len(doc_topic_dist)):\n                topic_id = doc_topic_dist[j].tid\n                prob = doc_topic_dist[j].prob\n                result += model.word_topic_value(term_id, topic_id) * 1.0 / \\\n                          model.topic_sum_value(topic_id) * prob\n\n        if num_of_term_in_vocab == 0:\n            return result\n        return result / num_of_term_in_vocab\n\n    def kullback_leibler_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist2[dist2 < EPS] = EPS\n        result = np.sum(dist1 * np.log(dist1 / dist2))\n        return result\n\n    def jensen_shannon_divergence(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        dist1[dist1 < EPS] = EPS\n        dist2[dist2 < EPS] = EPS\n        mean = (dist1 + dist2) * 0.5\n  
      jsd = self.kullback_leibler_divergence(dist1, mean) * 0.5 + \\\n              self.kullback_leibler_divergence(dist2, mean) * 0.5\n        return jsd\n\n    def hellinger_distance(self, dist1, dist2):\n        assert dist1.shape == dist2.shape\n        result = np.sum((np.sqrt(dist1) - np.sqrt(dist2))**2)\n        result = np.sqrt(result) * 0.7071067812\n        return result\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/tokenizer.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\n\nclass Tokenizer(object):\n    \"\"\"Base tokenizer class.\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def tokenize(self, text):\n        raise NotImplementedError\n\n\nclass SimpleTokenizer(Tokenizer):\n    \"\"\"Simple FMM (Forward Maximum Matching) word tokenizer. This tokenizer can only\n       be used in the topic model demo, but not in real business application scenarios.\n\n       Notes: This tokenizer can only recognize the words in the corresponding vocab file.\n    \"\"\"\n\n    def __init__(self, vocab_path):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__load_vocab(vocab_path)\n\n    def tokenize(self, text):\n        \"\"\"Tokenize the input string `text`, and return the tokenized result.\n        \"\"\"\n        text_len = len(text)\n        result = []\n        i = 0\n        while i < text_len:\n            word = found_word = \"\"\n            # Deal with English characters.\n            if self.__is_eng_char(text[i]):\n                for j in range(i, text_len + 1):\n                    if j < text_len and self.__is_eng_char(text[j]):\n                        word += self.__tolower(text[j])\n                    else:\n                        # Forward matching by character granularity.\n                        if word in self.__vocab:\n                            result.append(word)\n                        i = j - 1\n                        break\n            else:\n                for j in range(i, min(i + self.__max_word_len, text_len)):\n                    word += text[j]\n                    if word in self.__vocab:\n                        found_word = word\n                if len(found_word) > 0:\n                    result.append(found_word)\n                    i += len(found_word) - 1\n            i += 1\n        return result\n\n    def contains(self, word):\n     
   \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def __is_eng_char(self, c):\n        \"\"\"Check whether char c is an English character.\n        \"\"\"\n        return (c >= 'A' and c <= 'Z') or (c >= 'a' and c <= 'z')\n\n    def __tolower(self, c):\n        \"\"\"Return the lowercase character of the corresponding character, or return\n           the original character if there is no corresponding lowercase character.\n        \"\"\"\n        return c.lower()\n\n\nclass LACTokenizer(Tokenizer):\n    def __init__(self, vocab_path, lac):\n        super().__init__()\n        self.__max_word_len = 0\n        self.__vocab = set()\n        self.__lac = lac\n        self.__load_vocab(vocab_path)\n\n    def __load_vocab(self, vocab_path):\n        \"\"\"Load the word dictionary.\n        \"\"\"\n        with open(vocab_path, 'r', encoding='utf-8') as fin:\n            vocab_size = 0\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) >= 2\n                word = fields[1]\n                self.__max_word_len = max(self.__max_word_len, len(word))\n                self.__vocab.add(word)\n                vocab_size += 1\n\n    def tokenize(self, text):\n        results = self.__lac.lexical_analysis(texts=[text], use_gpu=False, batch_size=1, return_tag=True)\n        # Change English words to lower case.\n        # And just 
preserve the word in vocab.\n        words = results[0][\"word\"]\n        result = []\n        for word in words:\n            word = word.lower()\n            if word in self.__vocab:\n                result.append(word)\n        return result\n\n    def contains(self, word):\n        \"\"\"Check whether the word is in the vocabulary.\n        \"\"\"\n        return word in self.__vocab\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/util.py",
    "content": "import time\nimport yaml\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.config import ModelType\n\n\ndef load_prototxt(config_file, config):\n    \"\"\"\n    Args:\n        config_file: model configuration file.\n        config: ModelConfig class\n    \"\"\"\n    logger.info(\"Loading SLDA config.\")\n    with open(config_file, 'r', encoding='utf-8') as f:\n        yaml_dict = yaml.load(f, Loader=yaml.FullLoader)\n\n    # Assignment.\n    if yaml_dict[\"type\"] == \"LDA\":\n        config.type = ModelType.LDA\n    else:\n        config.type = ModelType.SLDA\n    config.num_topics = yaml_dict[\"num_topics\"]\n    config.alpha = yaml_dict[\"alpha\"]\n    config.beta = yaml_dict[\"beta\"]\n    config.word_topic_file = yaml_dict[\"word_topic_file\"]\n    config.vocab_file = yaml_dict[\"vocab_file\"]\n\n\ndef fix_random_seed(seed=2147483647):\n    np.random.seed(seed)\n\n\ndef rand(min_=0, max_=1):\n    return np.random.uniform(low=min_, high=max_)\n\n\ndef rand_k(k):\n    \"\"\"Return a random integer in the range [0, k - 1].\n    \"\"\"\n    return int(rand() * k)\n\n\ndef timeit(f):\n    \"\"\"Decorator that prints the time cost of function f.\n    \"\"\"\n\n    def timed(*args, **kwargs):\n        start_time = time.time()\n        result = f(*args, **kwargs)\n        end_time = time.time()\n        print(\"   [-] %s : %2.5f sec\" % (f.__name__, end_time - start_time))\n        return result\n\n    return timed\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/vocab.py",
    "content": "from paddlehub.common.logger import logger\n\nOOV = -1\n\n\nclass WordCount(object):\n    def __init__(self, word_id, count):\n        self.word_id = word_id\n        self.count = count\n\n\nclass Vocab(object):\n    def __init__(self):\n        self.__term2id = {}\n        self.__id2term = {}\n\n    def get_id(self, word):\n        if word not in self.__term2id:\n            return OOV\n        return self.__term2id[word]\n\n    def load(self, vocab_file):\n        self.__term2id = {}\n        self.__id2term = {}\n        with open(vocab_file, 'r', encoding='utf-8') as fin:\n            for line in fin.readlines():\n                fields = line.strip().split('\\t')\n                assert len(fields) == 5, \"Vocabulary file [%s] format error!\" % (vocab_file)\n                term = fields[1]\n                id_ = int(fields[2])\n                if term in self.__term2id:\n                    logger.error(\"Duplicate word [%s] in vocab file!\" % (term))\n                    continue\n                self.__term2id[term] = id_\n                self.__id2term[id_] = term\n\n    def size(self):\n        return len(self.__term2id)\n\n    def vocabulary(self):\n        return self.__id2term\n"
  },
  {
    "path": "modules/text/language_model/slda_weibo/vose_alias.py",
    "content": "import os\n\nimport numpy as np\nfrom paddlehub.common.logger import logger\n\nfrom slda_weibo.util import rand, rand_k\n\n\nclass VoseAlias(object):\n    \"\"\"Vose's Alias Method.\n    \"\"\"\n\n    def __init__(self):\n        self.__alias = None\n        self.__prob = None  # np.array\n\n    def initialize(self, distribution):\n        \"\"\"Initialize the alias table according to the input distribution\n        Args:\n            distribution: Numpy array.\n        \"\"\"\n        size = distribution.shape[0]\n        self.__alias = np.zeros(size, dtype=np.int64)\n        self.__prob = np.zeros(size)\n        sum_ = np.sum(distribution)\n        p = distribution / sum_ * size  # Scale up probability.\n        large, small = [], []\n        for i, p_ in enumerate(p):\n            if p_ < 1.0:\n                small.append(i)\n            else:\n                large.append(i)\n\n        while large and small:\n            l = small[0]\n            g = large[0]\n            small.pop(0)\n            large.pop(0)\n            self.__prob[l] = p[l]\n            self.__alias[l] = g\n            p[g] = p[g] + p[l] - 1  # A more numerically stable option.\n            if p[g] < 1.0:\n                small.append(g)\n            else:\n                large.append(g)\n\n        while large:\n            g = large[0]\n            large.pop(0)\n            self.__prob[g] = 1.0\n\n        while small:\n            l = small[0]\n            small.pop(0)\n            self.__prob[l] = 1.0\n\n    def generate(self):\n        \"\"\"Generate a sample from the given distribution.\n        \"\"\"\n        dart1 = rand_k(self.size())\n        dart2 = rand()\n        # Keep column dart1 with probability __prob[dart1]; otherwise take its alias.\n        return dart1 if dart2 < self.__prob[dart1] else self.__alias[dart1]\n\n    def size(self):\n        return self.__prob.shape[0]\n"
  },
  {
    "path": "modules/text/lexical_analysis/README.md",
    "content": "## **为了更好的用户体验，建议参考WEB端官方文档 -> [【词法分析】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 词法分析\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [词法分析-LAC](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis) | 百度自研的联合词法分析模型，能整体性地完成中文分词、词性标注、专名识别任务。在百度自建数据集上评测，LAC效果：Precision=88.0%，Recall=88.7%，F1-Score=88.4%。 |\n| [切词网络-jieba](https://www.paddlepaddle.org.cn/hubdetail?name=jieba_paddle&en_category=LexicalAnalysis) | 该Module是jieba使用PaddlePaddle深度学习框架搭建的切词网络（双向GRU）。同时也支持jieba的传统切词方法，如精确模式、全模式、搜索引擎模式等切词模式。 |\n"
  },
  {
    "path": "modules/text/lexical_analysis/README_en.md",
    "content": "## **For better user experience, refer to the Web official document -> [Lexical Analysis](https://www.paddlepaddle.org.cn/hublist)**\n\n### Lexical Analysis\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Lexical Analysis – LAC](https://www.paddlepaddle.org.cn/hubdetail?name=lac&en_category=LexicalAnalysis) | Baidu's self-developed lexical analysis model. It jointly completes the tasks of Chinese word segmentation, part-of-speech tagging and proper name recognition. Evaluated on Baidu's own dataset, LAC achieves Precision=88.0%, Recall=88.7%, F1-Score=88.4%. |\n| [Word Segmentation Network - jieba](https://www.paddlepaddle.org.cn/hubdetail?name=jieba_paddle&en_category=LexicalAnalysis) | This module is a word segmentation network (bi-directional GRU) built by jieba using the PaddlePaddle deep learning framework. It also supports the traditional word segmentation methods of jieba, such as exact mode, full mode, search engine mode and other word segmentation modes. |\n"
  },
  {
    "path": "modules/text/lexical_analysis/jieba_paddle/README.md",
    "content": "# jieba_paddle\n\n|模型名称|jieba_paddle|\n| :--- | :---: |\n|类别|文本-词法分析|\n|网络|BiGRU+CRF|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|28KB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 该Module是jieba使用PaddlePaddle深度学习框架搭建的切词网络（双向GRU）。同时也支持jieba的传统切词方法，如精确模式、全模式、搜索引擎模式等切词模式，使用方法和jieba保持一致。\n\n  - 更多信息参考：[jieba](https://github.com/fxsjy/jieba)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install jieba_paddle\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run jieba_paddle --input_text \"今天天气真好\"\n    ```\n  - 或者\n  - ```shell\n    $ hub run jieba_paddle --input_file test.txt\n    ```\n  - 通过命令行方式实现分词模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    jieba = hub.Module(name=\"jieba_paddle\")\n\n    results = jieba.cut(\"今天是个好日子\", cut_all=False, HMM=True)\n    print(results)\n\n    # ['今天', '是', '个', '好日子']\n    ```\n\n\n- ### 3、API\n\n  - ```python\n    def cut(sentence, use_paddle=True, cut_all=False, HMM=True)\n    ```\n    - jieba_paddle预测接口，预测输入句子的分词结果\n\n    - **参数**\n\n      - sentence(str): 单句预测数据。\n      - use_paddle(bool): 是否使用paddle模式（双向GRU）切词，默认为True。\n      - cut_all(bool): 是否采用全模式，默认为False。\n      - HMM(bool): 是否使用 HMM 模型，默认为True。\n\n    - **返回**\n\n      - results(list): 分词结果\n\n\n  - ```python\n    def cut_for_search(sentence, HMM=True)\n    ```\n\n    - jieba的搜索引擎模式切词，该方法适合用于搜索引擎构建倒排索引的分词，粒度比较细\n\n    - **参数**\n\n      - sentence(str): 单句预测数据。\n      - HMM(bool): 是否使用 HMM 模型，默认为True。\n\n    - **返回**\n\n      - 
results(list): 分词结果\n\n\n  - ```python\n    def load_userdict(user_dict)\n    ```\n\n    - 指定自己自定义的词典，以便包含 jieba 词库里没有的词。虽然 jieba 有新词识别能力，但是自行添加新词可以保证更高的正确率。\n\n    - **参数**\n\n      - user_dict(str): 自定义词典的路径\n\n\n  - ```python\n    def extract_tags(sentence, topK=20, withWeight=False, allowPOS=())\n    ```\n\n    - 基于 TF-IDF 算法的关键词抽取\n\n    - **参数**\n\n      - sentence(str): 待提取的文本\n      - topK(int): 返回几个 TF/IDF 权重最大的关键词，默认值为 20\n      - withWeight(bool): 是否一并返回关键词权重值，默认值为 False\n      - allowPOS(tuple): 仅包括指定词性的词，默认值为空，即不筛选\n\n    - **返回**\n\n      - results(list): 关键词结果\n\n\n  - ```python\n    def textrank(sentence, topK=20, withWeight=False, allowPOS=('ns', 'n', 'vn', 'v'))\n    ```\n\n    - 基于 TextRank 算法的关键词抽取\n\n    - **参数**\n\n      - sentence(str): 待提取的文本\n      - topK(int): 返回几个 TextRank 权重最大的关键词，默认值为 20\n      - withWeight(bool): 是否一并返回关键词权重值，默认值为 False\n      - allowPOS(tuple): 仅包括指定词性的词，默认值为('ns', 'n', 'vn', 'v')\n\n    - **返回**\n\n      - results(list): 关键词结果\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个切词服务，可以将此接口用于在线分词等web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -c serving_config.json\n    ```\n\n  - `serving_config.json`的内容如下：\n    ```json\n    {\n      \"modules_info\": {\n        \"jieba_paddle\": {\n          \"init_args\": {\n            \"version\": \"1.0.0\"\n          },\n          \"predict_args\": {}\n        }\n      },\n      \"port\": 8866,\n      \"use_singleprocess\": false,\n      \"workers\": 2\n    }\n    ```\n\n  - 这样就完成了一个切词服务化API的部署，默认端口号为8866。\n\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = \"今天是个好日子\"\n\n    # 设置运行配置\n    # 对应本地预测jieba_paddle.cut(sentence=text, use_paddle=True)\n    data = {\"sentence\": text, \"use_paddle\": True}\n\n    # 指定预测方法为jieba_paddle并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/jieba_paddle\"\n    headers = 
{\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n- ## gradio app 支持\n  从paddlehub 2.3.1开始支持使用链接 http://127.0.0.1:8866/gradio/jieba_paddle 在浏览器中访问jieba_paddle的gradio app。\n\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  移除 fluid api\n\n* 1.1.0\n  新增对gradio app的支持\n\n  - ```shell\n    $ hub install jieba_paddle==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/lexical_analysis/jieba_paddle/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport logging\nimport os\n\nimport paddlehub as hub\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"jieba_paddle\",\n    version=\"1.1.0\",\n    summary=\n    \"jieba_paddle is a Chinese tokenizer using BiGRU based on the PaddlePaddle deep learning framework. For more information, please refer to https://github.com/fxsjy/jieba.\",\n    author=\"baidu-paddle\",\n    author_email=\"paddle-dev@gmail.com\",\n    type=\"nlp/lexical_analysis\")\nclass JiebaPaddle(hub.Module):\n\n    def _initialize(self):\n        pass\n\n    @serving\n    def cut(self, sentence, use_paddle=True, cut_all=False, HMM=True):\n        \"\"\"\n        The main function that segments an entire sentence that contains\n        Chinese characters into separated words.\n        Args:\n            sentence(str): The str(unicode) to be segmented.\n            use_paddle(bool): Whether to use the jieba paddle model or not. Default is True.\n            cut_all(bool): Model type. 
True for full pattern, False for accurate pattern.\n            HMM(bool): Whether to use the Hidden Markov Model.\n\n        Returns:\n            results(list): The word segmentation result of the input sentence.\n        \"\"\"\n        self.check_dependency()\n        import jieba\n        jieba.setLogLevel(logging.ERROR)\n        jieba._compat.setLogLevel(logging.ERROR)\n\n        if use_paddle:\n            jieba.enable_paddle()\n            res = \" \".join(jieba.cut(sentence, use_paddle=True))\n            seg_list = res.strip(\" \").split(\" \")\n        else:\n            res = \" \".join(jieba.cut(sentence, cut_all=cut_all, HMM=HMM))\n            seg_list = res.strip(\" \").split(\" \")\n\n        return seg_list\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(text):\n            results = self.cut(sentence=text)\n            return results\n\n        title = \"jieba_paddle\"\n        description = \"jieba_paddle is a word segmentation model based on the paddlepaddle deep learning framework.\"\n\n        examples = [['今天是个好日子']]\n        app = gr.Interface(inference,\n                           \"text\", [gr.outputs.Textbox(label=\"words\")],\n                           title=title,\n                           description=description,\n                           examples=examples)\n        return app\n\n    def check_dependency(self):\n        \"\"\"\n        Check jieba tool dependency.\n        \"\"\"\n        try:\n            import jieba\n        except ImportError:\n            print(\n                'This module requires jieba tools. The running environment does not meet the requirements. 
Please install the jieba package.'\n            )\n            exit()\n\n    def cut_for_search(self, sentence, HMM=True):\n        \"\"\"\n        Finer segmentation for search engines.\n        Args:\n            sentence(str): The str(unicode) to be segmented.\n            HMM(bool): Whether to use the Hidden Markov Model.\n\n        Returns:\n            results(list): The word segmentation result of the input sentence.\n        \"\"\"\n        self.check_dependency()\n        import jieba\n        jieba.setLogLevel(logging.ERROR)\n        res = \" \".join(jieba.cut_for_search(sentence, HMM=HMM))\n        seg_list = res.strip(\" \").split(\" \")\n        return seg_list\n\n    def load_userdict(self, user_dict):\n        '''\n        Load a personalized dict to improve the detection rate.\n        Args:\n            user_dict(str): A plain text file path. It contains words and their occurrences. Can be a file-like object, or the path of the dictionary file,\n            whose encoding must be utf-8.\n                Structure of dict file:\n                    word1 freq1 word_type1\n                    word2 freq2 word_type2\n                    ...\n\n                Word type may be ignored\n        '''\n        self.check_dependency()\n        import jieba\n        jieba.setLogLevel(logging.ERROR)\n        jieba.load_userdict(user_dict)\n\n    def extract_tags(self, sentence, topK=20, withWeight=False, allowPOS=(), withFlag=False):\n        \"\"\"\n        Extract keywords from sentence using TF-IDF algorithm.\n        Args:\n            topK(int): return how many top keywords. `None` for all possible words.\n            withWeight(bool): if True, return a list of (word, weight);\n                          if False, return a list of words.\n            allowPOS(tuple): the allowed POS list eg. 
['ns', 'n', 'vn', 'v', 'nr'].\n                        if the POS of w is not in this list, it will be filtered.\n            withFlag(bool): only works when allowPOS is not empty.\n                        if True, return a list of pair(word, weight) like posseg.cut\n                        if False, return a list of words\n        Returns:\n            result(list): The key words.\n        \"\"\"\n        self.check_dependency()\n        import jieba\n        import jieba.analyse\n        jieba.setLogLevel(logging.ERROR)\n        res = jieba.analyse.extract_tags(sentence,\n                                         topK=topK,\n                                         withWeight=withWeight,\n                                         allowPOS=allowPOS,\n                                         withFlag=withFlag)\n        return res\n\n    def textrank(self, sentence, topK=20, withWeight=False, allowPOS=('ns', 'n', 'vn', 'v'), withFlag=False):\n        \"\"\"\n        Extract keywords from sentence using TextRank algorithm.\n        Args:\n            topK(int): return how many top keywords. `None` for all possible words.\n            withWeight(bool): if True, return a list of (word, weight);\n                          if False, return a list of words.\n            allowPOS(tuple): the allowed POS list eg. ['ns', 'n', 'vn', 'v', 'nr'].\n                        if the POS of w is not in this list, it will be filtered.\n            withFlag(bool): only works when allowPOS is not empty.\n                        if True, return a list of pair(word, weight) like posseg.cut\n                        if False, return a list of words\n        Returns:\n            result(list): The key words.\n        \"\"\"\n        self.check_dependency()\n        import jieba\n        import jieba.analyse\n        jieba.setLogLevel(logging.ERROR)\n        res = jieba.analyse.textrank(sentence, topK=topK, withWeight=withWeight, allowPOS=allowPOS, withFlag=withFlag)\n        return res\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/README.md",
    "content": "# lac\n\n|模型名称|lac|\n| :--- | :---: |\n|类别|文本-词法分析|\n|网络|BiGRU+CRF|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|35MB|\n|最新更新日期|2021-02-26|\n|数据指标|Precision=88.0%，Recall=88.7%，F1-Score=88.4%|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - Lexical Analysis of Chinese，简称 LAC，是一个联合的词法分析模型，能整体性地完成中文分词、词性标注、专名识别任务。在百度自建数据集上评测，LAC效果：Precision=88.0%，Recall=88.7%，F1-Score=88.4%。该PaddleHub Module支持预测。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/130606395-60691079-d33f-4d74-a980-b4d9e3bc663e.png\"   height = \"300\" hspace='10'/> <br />\n</p>\n\n  - 更多详情请参考：[LAC论文](https://arxiv.org/abs/1807.01882)\n\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - 若使用词典干预功能，额外依赖第三方库 pyahocorasick\n\n  - ```shell\n    $ pip install pyahocorasick\n    ```\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install lac\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run lac --input_text \"今天是个好日子\"\n    ```\n  - 或者\n  - ```shell\n    $ hub run lac --input_file test.txt --user_dict user.dict\n    ```\n\n    - test.txt 存放待分词文本， 如：\n      - ```shell\n        今天是个好日子  \n        今天天气晴朗\n        ```\n    - user.dict 为用户自定义词典，可以不指定，当指定自定义词典时，可以干预默认分词结果。如：\n      - ```shell\n        春天/SEASON\n        花/n 开/v\n        秋天的风\n        落 阳  \n        ```\n      - 词典文件每行表示一个定制化的item，由一个单词或多个连续的单词组成，每个单词后使用'/'表示标签，如果没有'/'标签则会使用模型默认的标签。每个item单词数越多，干预效果会越精准。\n\n    - Note：该PaddleHub Module使用词典干预功能时，依赖于第三方库pyahocorasick，请自行安装\n\n  - 通过命令行方式实现词法分析模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    lac = 
hub.Module(name=\"lac\")\n    test_text = [\"今天是个好日子\", \"天气预报说今天要下雨\", \"下一班地铁马上就要到了\"]\n\n    results = lac.cut(text=test_text, use_gpu=False, batch_size=1, return_tag=True)\n\n    for result in results:\n        print(result['word'])\n        print(result['tag'])\n\n    # ['今天', '是', '个', '好日子']\n    # ['TIME', 'v', 'q', 'n']\n    # ['天气预报', '说', '今天', '要', '下雨']\n    # ['n', 'v', 'TIME', 'v', 'v']\n    # ['下', '一班', '地铁', '马上', '就要', '到', '了']\n    # ['f', 'm', 'n', 'd', 'v', 'v', 'xc']\n    ```\n\n\n\n- ### 3、API\n\n  - ```python\n    def __init__(user_dict=None)\n    ```\n    - 构造LAC对象\n\n    - **参数**\n\n      - user_dict(str): 自定义词典路径。如果需要使用自定义词典，则可通过该参数设置，否则不用传入该参数。\n\n\n  - ```python\n    def cut(text, use_gpu=False, batch_size=1, return_tag=True)\n    ```\n\n    - lac预测接口，预测输入句子的分词结果\n\n    - **参数**\n\n      - text(str or list): 待预测数据，单句预测数据（str类型）或者批量预测（list，每个元素为str）\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n      - return_tag(bool): 预测结果是否需要返回分词标签结果\n\n    - **返回**\n\n      - results(list): 分词结果\n\n\n  - ```python\n    def lexical_analysis(texts=[], data={}, use_gpu=False, batch_size=1, user_dict=None, return_tag=True)\n    ```\n\n    - **该接口将会在未来版本被废弃，如有需要，请使用cut接口预测**\n\n    - lac预测接口，预测输入句子的分词结果\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 预测数据，key必须为text，value是待预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n      - return_tag(bool): 预测结果是否需要返回分词标签结果\n\n    - **返回**\n\n      - results(list): 分词结果\n\n\n  - ```python\n    def set_user_dict(dict_path)\n    ```\n\n    - 加载用户自定义词典\n\n    - **参数**\n\n      - dict_path(str): 自定义词典路径\n\n\n  - ```python\n    def del_user_dict()\n    ```\n\n    - 删除自定义词典\n\n\n  - ```python\n    def get_tags()\n    ```\n\n    - 获取lac的标签\n\n    - **返回**\n\n      - tag_name_dict(dict): lac的标签\n\n\n\n\n## 四、服务部署\n\n- 
PaddleHub Serving可以部署一个在线词法分析服务，可以将此接口用于词法分析、在线分词等在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -c serving_config.json\n    ```\n\n    `serving_config.json`的内容如下：\n    ```json\n    {\n      \"modules_info\": {\n        \"lac\": {\n          \"init_args\": {\n            \"version\": \"2.1.0\",\n            \"user_dict\": \"./test_dict.txt\"\n          },\n          \"predict_args\": {}\n        }\n      },\n      \"port\": 8866,\n      \"use_singleprocess\": false,\n      \"workers\": 2\n    }\n    ```\n  - 其中user_dict含义为自定义词典路径，如果不使用lac自定义词典功能，则可以不填入。\n\n  - 这样就完成了一个词法分析服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"今天是个好日子\", \"天气预报说今天要下雨\"]\n\n    # 设置运行配置\n    # 对应本地预测lac.lexical_analysis(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\": True}\n\n    # 指定预测方法为lac并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/lac\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n- ### Gradio APP 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/lac 在浏览器中访问 lac 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复python2中编码问题\n\n* 1.1.0\n\n  支持词典干预，通过配置自定义词典可以对LAC默认分词结果进行干预\n\n* 1.1.1\n\n  修复输入文本中带有/字符时，使用词典干预会崩溃的问题\n\n* 2.0.0\n\n  修复输入文本为空、“ ”或者“\\n”，使用LAC会崩溃的问题\n  更新embedding_size, hidden_size为128，压缩模型，性能提升\n\n* 2.1.0\n\n  lac预测性能大幅提升\n  支持是否返回分词标签tag，同时简化预测接口使用\n\n* 2.1.1\n\n  当输入文本为空字符串“”，返回切词后结果为空字符串“”，分词tag也为“”\n\n* 2.2.0\n\n  升级自定义词典功能，支持增加不属于lac默认提供的词性\n\n* 
2.3.0\n\n  移除 fluid api\n\n* 2.4.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install lac==2.4.0\n    ```\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/lexical_analysis/lac/ahocorasick.py",
    "content": "\"\"\"\n本模块实现AC自动机封装为Ahocorasick类，用于进行词典的多模匹配。\n\"\"\"\n\n\nclass Node(object):\n    \"\"\"AC自动机的树结点.\n\n    Attributes:\n        next: dict类型，指向子结点\n        fail: Node类型，AC自动机的fail指针\n        length: int类型，判断节点是否为单词\n    \"\"\"\n    __slots__ = ['next', 'fail', 'length']\n\n    def __init__(self):\n        \"\"\"初始化空节点.\"\"\"\n        self.next = {}\n        self.fail = None  # fail指针默认为None\n        self.length = -1\n\n\nclass Ahocorasick(object):\n    \"\"\"实现AC自动机的类\n\n    Attributes:\n        __root: Node类型，AC自动机根节点\n    \"\"\"\n\n    def __init__(self):\n        \"\"\"初始化Ahocorasick的根节点__root\"\"\"\n        self.__root = Node()\n\n    def add_word(self, word):\n        \"\"\"添加单词word到Trie树中\"\"\"\n        current = self.__root\n        for char in word:\n            current = current.next.setdefault(char, Node())\n        current.length = len(word)\n\n    def make(self):\n        \"\"\"构建fail指针路径\"\"\"\n\n        queue = list()\n        for key in self.__root.next:\n            self.__root.next[key].fail = self.__root\n            queue.append(self.__root.next[key])\n\n        # 广度优先算法遍历设置fail指针\n        while len(queue) > 0:\n            # 基于当前节点的fail指针设置其子结点的fail指针\n            current = queue.pop(0)\n\n            for k in current.next:\n                current_fail = current.fail\n\n                # 若当前节点有fail指针，尝试设置其子结点的fail指针\n                while current_fail is not None:\n                    if k in current_fail.next:\n                        current.next[k].fail = current_fail.next[k]\n                        break\n                    current_fail = current_fail.fail\n\n                # 若当前节点的fail指针不存在该子结点，令子结点fail指向根节点\n                if current_fail is None:\n                    current.next[k].fail = self.__root\n\n                queue.append(current.next[k])\n\n    def search(self, content):\n        \"\"\"后向最大匹配.\n\n        对content的文本进行多模匹配，返回后向最大匹配的结果.\n\n        Args:\n            content: string类型, 用于多模匹配的字符串\n\n        
Returns:\n            list类型, 最大匹配单词列表，每个元素为匹配的模式串在句中的起止位置，比如：\n            [(0, 2), (4, 7)]\n\n        \"\"\"\n        result = []\n        p = self.__root\n        for current_position in range(len(content)):\n            word = content[current_position]\n\n            # 沿fail指针回退，直到找到包含当前字符的节点或到达根节点\n            while word not in p.next:\n                if p == self.__root:\n                    break\n                p = p.fail\n            else:\n                p = p.next[word]\n                if p.length > 0:\n                    result.append((current_position - p.length + 1, current_position))\n\n        return result\n\n    def search_all(self, content):\n        \"\"\"多模匹配的完全匹配.\n\n        对content的文本进行多模匹配，返回所有匹配结果\n\n        Args:\n            content: string类型, 用于多模匹配的字符串\n\n        Returns:\n            list类型, 所有匹配单词列表，每个元素为匹配的模式串在句中的起止位置，比如：\n            [(0, 2), (4, 7)]\n\n        \"\"\"\n        result = []\n        p = self.__root\n        for current_position in range(len(content)):\n            word = content[current_position]\n\n            while word not in p.next:\n                if p == self.__root:\n                    break\n                p = p.fail\n            else:\n                p = p.next[word]\n\n                # 回溯查看是否存在以当前字符结尾的单词\n                tmp = p\n                while tmp != self.__root:\n                    if tmp.length > 0:\n                        result.append((current_position - tmp.length + 1, current_position))\n                    tmp = tmp.fail\n\n        return result\n\n\nif __name__ == '__main__':\n\n    ah = Ahocorasick()\n    x = [\"百度\", \"家\", \"高科技\", \"科技\", \"科技公司\"]\n    for i in x:\n        ah.add_word(i)\n    ah.make()\n    string = '百度是家高科技公司'\n    for begin, end in ah.search_all(string):\n        print('all:', string[begin:end + 1])\n\n    for begin, end in ah.search(string):\n        print('search:', string[begin:end + 1])\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/assets/q2b.dic",
    "content": "　\t \n、\t,\n。\t.\n—\t-\n～\t~\n‖\t|\n…\t.\n‘\t'\n’\t'\n“\t\"\n”\t\"\n〔\t(\n〕\t)\n〈\t<\n〉\t>\n「\t'\n」\t'\n『\t\"\n』\t\"\n〖\t[\n〗\t]\n【\t[\n】\t]\n∶\t:\n＄\t$\n！\t!\n＂\t\"\n＃\t#\n％\t%\n＆\t&\n＇\t'\n（\t(\n）\t)\n＊\t*\n＋\t+\n，\t,\n－\t-\n．\t.\n／\t/\n０\t0\n１\t1\n２\t2\n３\t3\n４\t4\n５\t5\n６\t6\n７\t7\n８\t8\n９\t9\n：\t:\n；\t;\n＜\t<\n＝\t=\n＞\t>\n？\t?\n＠\t@\nＡ\ta\nＢ\tb\nＣ\tc\nＤ\td\nＥ\te\nＦ\tf\nＧ\tg\nＨ\th\nＩ\ti\nＪ\tj\nＫ\tk\nＬ\tl\nＭ\tm\nＮ\tn\nＯ\to\nＰ\tp\nＱ\tq\nＲ\tr\nＳ\ts\nＴ\tt\nＵ\tu\nＶ\tv\nＷ\tw\nＸ\tx\nＹ\ty\nＺ\tz\n［\t[\n＼\t\\\n］\t]\n＾\t^\n＿\t_\n｀\t`\nａ\ta\nｂ\tb\nｃ\tc\nｄ\td\nｅ\te\nｆ\tf\nｇ\tg\nｈ\th\nｉ\ti\nｊ\tj\nｋ\tk\nｌ\tl\nｍ\tm\nｎ\tn\nｏ\to\nｐ\tp\nｑ\tq\nｒ\tr\nｓ\ts\nｔ\tt\nｕ\tu\nｖ\tv\nｗ\tw\nｘ\tx\nｙ\ty\nｚ\tz\n｛\t{\n｜\t|\n｝\t}\n￣\t~\n〝\t\"\n〞\t\"\n﹐\t,\n﹑\t,\n﹒\t.\n﹔\t;\n﹕\t:\n﹖\t?\n﹗\t!\n﹙\t(\n﹚\t)\n﹛\t{\n﹜\t{\n﹝\t[\n﹞\t]\n﹟\t#\n﹠\t&\n﹡\t*\n﹢\t+\n﹣\t-\n﹤\t<\n﹥\t>\n﹦\t=\n﹨\t\\\n﹩\t$\n﹪\t%\n﹫\t@\n \t,\nA\ta\nB\tb\nC\tc\nD\td\nE\te\nF\tf\nG\tg\nH\th\nI\ti\nJ\tj\nK\tk\nL\tl\nM\tm\nN\tn\nO\to\nP\tp\nQ\tq\nR\tr\nS\ts\nT\tt\nU\tu\nV\tv\nW\tw\nX\tx\nY\ty\nZ\tz\n"
  },
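The `assets/q2b.dic` file above is a tab-separated mapping from full-width (全角) characters to their half-width (半角) equivalents, used to normalize input text before tokenization. As a minimal sketch of how such a mapping could be loaded and applied (the `load_q2b`/`normalize` helpers are illustrative, not the lac module's actual preprocessing code):

```python
# Sketch of applying a q2b-style full-width -> half-width mapping.
# File format: one "<full-width>\t<half-width>" pair per line, as in
# assets/q2b.dic. These helper names are assumptions for illustration.

def load_q2b(path):
    """Load a tab-separated char-to-char mapping file."""
    mapping = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if "\t" in line:
                src, dst = line.split("\t", 1)
                mapping[src] = dst
    return mapping

def normalize(text, mapping):
    """Replace every character that has an entry; keep others unchanged."""
    return "".join(mapping.get(ch, ch) for ch in text)

if __name__ == "__main__":
    q2b = {"１": "1", "２": "2", "（": "(", "）": ")"}  # tiny inline sample
    print(normalize("（１２）", q2b))  # (12)
```

A per-character dict lookup suffices here because every entry in q2b.dic maps exactly one character to one character.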
  {
    "path": "modules/text/lexical_analysis/lac/assets/tag.dic",
    "content": "0\ta-B\n1\ta-I\n2\tad-B\n3\tad-I\n4\tan-B\n5\tan-I\n6\tc-B\n7\tc-I\n8\td-B\n9\td-I\n10\tf-B\n11\tf-I\n12\tm-B\n13\tm-I\n14\tn-B\n15\tn-I\n16\tnr-B\n17\tnr-I\n18\tns-B\n19\tns-I\n20\tnt-B\n21\tnt-I\n22\tnw-B\n23\tnw-I\n24\tnz-B\n25\tnz-I\n26\tp-B\n27\tp-I\n28\tq-B\n29\tq-I\n30\tr-B\n31\tr-I\n32\ts-B\n33\ts-I\n34\tt-B\n35\tt-I\n36\tu-B\n37\tu-I\n38\tv-B\n39\tv-I\n40\tvd-B\n41\tvd-I\n42\tvn-B\n43\tvn-I\n44\tw-B\n45\tw-I\n46\txc-B\n47\txc-I\n48\tPER-B\n49\tPER-I\n50\tLOC-B\n51\tLOC-I\n52\tORG-B\n53\tORG-I\n54\tTIME-B\n55\tTIME-I\n56\tO\n"
  },
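The `assets/tag.dic` file above maps the model's output label ids to BIO-style tags (`n-B`, `n-I`, …, `O`): a `-B` tag starts a word and `-I` tags continue it. Decoding a per-character tag sequence into words can be sketched as follows (the `decode_tags` function is illustrative, not the lac module's actual postprocessing code):

```python
# Sketch of decoding per-character BIO tags (as listed in assets/tag.dic,
# e.g. "TIME-B", "TIME-I") into (word, tag) pairs. Illustrative only.

def decode_tags(chars, tags):
    """Merge each '-B' character with its following '-I' characters."""
    words, labels = [], []
    for ch, tag in zip(chars, tags):
        if tag.endswith("-I") and words:
            words[-1] += ch  # continue the current word
        else:
            words.append(ch)  # '-B' (or 'O') starts a new word
            labels.append(tag.split("-")[0])
    return words, labels

if __name__ == "__main__":
    chars = list("今天是个好日子")
    tags = ["TIME-B", "TIME-I", "v-B", "q-B", "n-B", "n-I", "n-I"]
    print(decode_tags(chars, tags))
    # (['今天', '是', '个', '好日子'], ['TIME', 'v', 'q', 'n'])
```

This reproduces the word/tag lists shown in the README's `cut` example for the sentence 今天是个好日子.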
  {
    "path": "modules/text/lexical_analysis/lac/assets/tag_file.txt",
    "content": "n 普通名词\nf 方位名词\ns 处所名词\nt 时间\nnr 人名\nns 地名\nnt 机构名\nnw 作品名\nnz 其他专名\nv 普通动词\nvd 动副词\nvn 名动词\na 形容词\nad 副形词\nan 名形词\nd 副词\nm 数量词\nq 量词\nr 代词\np 介词\nc 连词\nu 助词\nxc 其他虚词\nw 标点符号\nPER 人名\nLOC 地名\nORG 机构名\nTIME 时间\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/assets/word.dic",
    "content": "0\ta\n1\te\n2\ti\n3\tn\n4\to\n5\ts\n6\tr\n7\tt\n8\tl\n9\t0\n10\tu\n11\tc\n12\t1\n13\td\n14\tm\n15\th\n16\tg\n17\t2\n18\tp\n19\tb\n20\ty\n21\t5\n22\t3\n23\t8\n24\t6\n25\tk\n26\tA\n27\t4\n28\t9\n29\tf\n30\t7\n31\tS\n32\tv\n33\tE\n34\tw\n35\tz\n36\tC\n37\tx\n38\tT\n39\tI\n40\tj\n41\tM\n42\tR\n43\tO\n44\tD\n45\tL\n46\tN\n47\tB\n48\tP\n49\tH\n50\tG\n51\t李\n52\tF\n53\tK\n54\t王\n55\t张\n56\tq\n57\tU\n58\t刘\n59\t陈\n60\tW\n61\tY\n62\tV\n63\t斯\n64\t文\n65\tX\n66\tJ\n67\tZ\n68\t华\n69\t明\n70\t尔\n71\t林\n72\t德\n73\t晓\n74\t杨\n75\t金\n76\tQ\n77\t克\n78\t小\n79\t志\n80\t国\n81\t海\n82\t丽\n83\t平\n84\t玉\n85\t黄\n86\t吴\n87\t建\n88\t特\n89\t拉\n90\t子\n91\t赵\n92\t利\n93\t马\n94\t军\n95\t周\n96\t亚\n97\t伟\n98\t东\n99\t红\n100\t龙\n101\t春\n102\t云\n103\t生\n104\t朱\n105\t孙\n106\t徐\n107\t永\n108\t达\n109\t美\n110\t安\n111\t杰\n112\t卡\n113\t天\n114\t新\n115\t罗\n116\t里\n117\t大\n118\t光\n119\t波\n120\t家\n121\t成\n122\t福\n123\t高\n124\t胡\n125\t荣\n126\t英\n127\t阿\n128\t思\n129\t立\n130\t瑞\n131\t峰\n132\t宝\n133\t郭\n134\t清\n135\t兰\n136\t西\n137\t山\n138\t维\n139\t爱\n140\t宇\n141\t佳\n142\t辉\n143\t俊\n144\t雅\n145\t庆\n146\t尼\n147\t梅\n148\t格\n149\t之\n150\t一\n151\t君\n152\t忠\n153\t强\n154\t学\n155\t世\n156\t雪\n157\t良\n158\t民\n159\t芳\n160\t郑\n161\t敏\n162\t秀\n163\t迪\n164\t元\n165\t洪\n166\t祥\n167\t泽\n168\t中\n169\t康\n170\t科\n171\t嘉\n172\t正\n173\t飞\n174\t巴\n175\t兴\n176\t松\n177\t恩\n178\t江\n179\t乐\n180\t宏\n181\t振\n182\t斌\n183\t路\n184\t雨\n185\t娜\n186\t雷\n187\t玲\n188\t长\n189\t多\n190\t凯\n191\t米\n192\t加\n193\t奇\n194\t吉\n195\t青\n196\t武\n197\t水\n198\t布\n199\t力\n200\t燕\n201\t纳\n202\t白\n203\t慧\n204\t宋\n205\t万\n206\t莱\n207\t勇\n208\t丹\n209\t威\n210\t宁\n211\t南\n212\t士\n213\t堂\n214\t何\n215\t普\n216\t洛\n217\t秋\n218\t胜\n219\t仁\n220\t韩\n221\t奥\n222\t富\n223\t丁\n224\t月\n225\t石\n226\t方\n227\t博\n228\t森\n229\t艳\n230\t鹏\n231\t刚\n232\t凤\n233\t诺\n234\t阳\n235\t涛\n236\t叶\n237\t香\n238\t比\n239\t曹\n240\t少\n241\t昌\n242\t泰\n243\t伊\n244\t亮\n245\t沈\n246\t霞\n247\t梁\n248\t菲\n249\t谢\n250\t唐\n251\t智\n252\t梦\n253\t希\n254\t曼\n255\t贝\n256\t杜\n257\t木\n258\t花\n259\t苏\n260\t星\n261\t萍
\n262\t心\n263\t景\n264\t超\n265\t欣\n266\t树\n267\t广\n268\t许\n269\t伯\n270\t来\n271\t夫\n272\t塔\n273\t卫\n274\t义\n275\t冯\n276\t可\n277\t田\n278\t道\n279\t圣\n280\t汉\n281\t三\n282\t娟\n283\t友\n284\t夏\n285\t基\n286\t宗\n287\t人\n288\t贵\n289\t婷\n290\t鲁\n291\t根\n292\t艾\n293\t静\n294\t诗\n295\t惠\n296\t法\n297\t蔡\n298\t玛\n299\t喜\n300\t浩\n301\t欧\n302\t保\n303\t潘\n304\t风\n305\t莉\n306\t珍\n307\t源\n308\t桂\n309\t远\n310\t孟\n311\t沙\n312\t继\n313\t顺\n314\t锦\n315\t邓\n316\t贤\n317\t书\n318\t全\n319\t得\n320\t轩\n321\t通\n322\t吕\n323\t才\n324\t妮\n325\t董\n326\t曾\n327\t彭\n328\t雄\n329\t琴\n330\t旭\n331\t袁\n332\t城\n333\t琳\n334\t芬\n335\t豪\n336\t村\n337\t卢\n338\t剑\n339\t蒋\n340\t伦\n341\t培\n342\t魏\n343\t瓦\n344\t哈\n345\t莫\n346\t丝\n347\t兵\n348\t古\n349\t银\n350\t泉\n351\t发\n352\t传\n353\t群\n354\t若\n355\t虎\n356\t连\n357\t如\n358\t肖\n359\t鑫\n360\t盛\n361\t先\n362\t凡\n363\t鸿\n364\t图\n365\t章\n366\t姜\n367\t琪\n368\t启\n369\t柏\n370\t耀\n371\t开\n372\t依\n373\t坤\n374\t有\n375\t萨\n376\t怡\n377\t崔\n378\t川\n379\t祖\n380\t尚\n381\t贾\n382\t园\n383\t素\n384\t托\n385\t淑\n386\t健\n387\t彦\n388\t余\n389\t双\n390\t信\n391\t麦\n392\t范\n393\t汪\n394\t蒂\n395\t程\n396\t朝\n397\t和\n398\t然\n399\t本\n400\t塞\n401\t灵\n402\t秦\n403\t铭\n404\t河\n405\t进\n406\t姆\n407\t百\n408\t陆\n409\t彬\n410\t锋\n411\t洁\n412\t莲\n413\t冰\n414\t晨\n415\t邦\n416\t兆\n417\t钟\n418\t日\n419\t绍\n420\t铁\n421\t怀\n422\t赛\n423\t善\n424\t舒\n425\t恒\n426\t其\n427\t行\n428\t旺\n429\t修\n430\t易\n431\t任\n432\t莎\n433\t.\n434\t顾\n435\t艺\n436\t丰\n437\t皮\n438\t帕\n439\t延\n440\t隆\n441\t门\n442\t太\n443\t哲\n444\t定\n445\t蒙\n446\t洋\n447\t紫\n448\t庄\n449\t姚\n450\t戴\n451\t向\n452\t顿\n453\t礼\n454\t权\n455\t桥\n456\t颖\n457\t镇\n458\t茂\n459\t益\n460\t露\n461\t齐\n462\t仙\n463\t儿\n464\t勒\n465\t地\n466\t真\n467\t凌\n468\t毛\n469\t佩\n470\t冬\n471\t弗\n472\t九\n473\t润\n474\t涵\n475\t千\n476\t史\n477\t碧\n478\t自\n479\t承\n480\t彩\n481\t翔\n482\t乔\n483\t施\n484\t治\n485\t索\n486\t会\n487\t运\n488\t卓\n489\t毅\n490\t年\n491\t莹\n492\t沃\n493\t于\n494\t孔\n495\t薛\n496\t业\n497\t柳\n498\t内\n499\t钱\n500\t廷\n501\t登\n502\t仕\n503\t熙\n504\t守\n505\t敬\n506\t孝\n507\t雯\n508\t增\n509\t相\n510\t时\n511\t楠
\n512\t二\n513\t竹\n514\t谷\n515\t不\n516\t牛\n517\t好\n518\t京\n519\t仲\n520\t赫\n521\t黑\n522\t朗\n523\t汤\n524\t悦\n525\t蓝\n526\t公\n527\t梓\n528\t珠\n529\t芝\n530\t苑\n531\t炳\n532\t奎\n533\t黎\n534\t老\n535\t佛\n536\t谭\n537\t鱼\n538\t尹\n539\t神\n540\t温\n541\t帝\n542\t锡\n543\t陶\n544\t墨\n545\t媛\n546\t上\n547\t乌\n548\t常\n549\t言\n550\t熊\n551\t化\n552\t火\n553\t升\n554\t庭\n555\t臣\n556\t同\n557\t头\n558\t晶\n559\t磊\n560\t楚\n561\t提\n562\t优\n563\t勤\n564\t歌\n565\t岩\n566\t琦\n567\t草\n568\t韦\n569\t库\n570\t溪\n571\t逸\n572\t五\n573\t政\n574\t冠\n575\t果\n576\t跃\n577\t辰\n578\t柯\n579\t戈\n580\t廖\n581\t薇\n582\t琼\n583\t申\n584\t占\n585\t湖\n586\t辛\n587\t代\n588\t四\n589\t严\n590\t扎\n591\t倩\n592\t邹\n593\t乃\n594\t宜\n595\t捷\n596\t理\n597\t洲\n598\t鸣\n599\t邱\n600\t栋\n601\t翠\n602\t睿\n603\t满\n604\t容\n605\t霖\n606\t纪\n607\t岳\n608\t卿\n609\t羽\n610\t扬\n611\t阁\n612\t亦\n613\t邵\n614\t居\n615\t久\n616\t桑\n617\t寿\n618\t记\n619\t北\n620\t哥\n621\t瑶\n622\t埃\n623\t彤\n624\t贺\n625\t菊\n626\t湘\n627\t诚\n628\t宾\n629\t郝\n630\t非\n631\t珊\n632\t存\n633\t无\n634\t颜\n635\t意\n636\t盖\n637\té\n638\t霍\n639\t初\n640\t派\n641\t野\n642\t摩\n643\t妍\n644\t应\n645\t口\n646\t馨\n647\t名\n648\t坚\n649\t品\n650\t能\n651\t寒\n652\t纯\n653\t蓉\n654\t声\n655\t葛\n656\t航\n657\t以\n658\t坦\n659\t童\n660\t尤\n661\t色\n662\t晴\n663\t令\n664\t重\n665\t聪\n666\t芙\n667\t亭\n668\t柱\n669\t合\n670\t兹\n671\t育\n672\t音\n673\t厚\n674\t迈\n675\t付\n676\t奈\n677\t语\n678\t情\n679\t宫\n680\t列\n681\t都\n682\t钦\n683\t炎\n684\t必\n685\t客\n686\t蕾\n687\t龚\n688\t笑\n689\t左\n690\t作\n691\t楼\n692\t切\n693\t娇\n694\t宪\n695\t韵\n696\t农\n697\t流\n698\t密\n699\t关\n700\t岭\n701\t干\n702\t为\n703\t夜\n704\t氏\n705\t微\n706\t男\n707\t显\n708\t腾\n709\t甘\n710\t娅\n711\t晋\n712\t昊\n713\t仪\n714\t查\n715\t焕\n716\t姬\n717\t印\n718\t台\n719\t苗\n720\t钰\n721\t甲\n722\t勋\n723\t车\n724\t班\n725\t锐\n726\t原\n727\t虹\n728\t六\n729\t段\n730\t曲\n731\t崇\n732\t七\n733\t茹\n734\t萌\n735\t&\n736\t巧\n737\t州\n738\t那\n739\t标\n740\t俞\n741\t堡\n742\t劳\n743\t联\n744\t土\n745\t血\n746\t起\n747\t乡\n748\t瑜\n749\t岛\n750\t池\n751\t战\n752\t师\n753\t茶\n754\t鹤\n755\t彪\n756\t鼎\n757\t婉\n758\t裕\n759\t季\n760\t耶\n761\t闫
\n762\t冷\n763\t昆\n764\t知\n765\t绿\n766\t麟\n767\t朵\n768\t默\n769\t贞\n770\t什\n771\t赖\n772\t倪\n773\t尧\n774\t灿\n775\t因\n776\t官\n777\t昭\n778\t奕\n779\t穆\n780\t佐\n781\t影\n782\t荷\n783\t功\n784\t撒\n785\t照\n786\t井\n787\t宽\n788\t桐\n789\t萱\n790\t坊\n791\t聚\n792\t萧\n793\t球\n794\t璐\n795\t晖\n796\t鬼\n797\t面\n798\t字\n799\t慕\n800\t费\n801\t越\n802\t约\n803\t曦\n804\t后\n805\t欢\n806\t枫\n807\t玮\n808\t殷\n809\t包\n810\t念\n811\t八\n812\t汝\n813\t翰\n814\t黃\n815\t奴\n816\t手\n817\t望\n818\t茜\n819\t儒\n820\t傅\n821\t气\n822\t玄\n823\t黛\n824\t汇\n825\t肯\n826\t龍\n827\t耐\n828\t佑\n829\t湾\n830\t单\n831\t岚\n832\t舍\n833\t热\n834\t昂\n835\t步\n836\t钢\n837\t环\n838\t御\n839\t缘\n840\t伍\n841\t下\n842\t机\n843\t乾\n844\t魔\n845\t前\n846\t震\n847\t巨\n848\t线\n849\t皓\n850\t盈\n851\t庞\n852\t谦\n853\t宣\n854\t女\n855\t体\n856\t靖\n857\t均\n858\t劲\n859\t济\n860\t硕\n861\t营\n862\t帆\n863\t妙\n864\t瑟\n865\t财\n866\t出\n867\t在\n868\t炜\n869\t味\n870\t斗\n871\t留\n872\t深\n873\t芸\n874\t耿\n875\t沛\n876\t经\n877\t管\n878\t菜\n879\t献\n880\t外\n881\t殿\n882\t房\n883\t焦\n884\t骨\n885\t点\n886\t禹\n887\t禄\n888\t毕\n889\t桃\n890\t空\n891\t侯\n892\t鹰\n893\t岗\n894\t津\n895\t雁\n896\t帅\n897\t妃\n898\t复\n899\t衣\n900\t骏\n901\t聂\n902\t绪\n903\t娃\n904\t眼\n905\t舟\n906\t打\n907\t分\n908\t油\n909\t者\n910\t度\n911\t角\n912\t朴\n913\t藤\n914\t枝\n915\t落\n916\t亨\n917\t游\n918\t潮\n919\t皇\n920\t華\n921\t梵\n922\t滨\n923\t禾\n924\t郎\n925\t洞\n926\t精\n927\t烈\n928\t翁\n929\t允\n930\t塘\n931\t璇\n932\t事\n933\t祝\n934\t翼\n935\t粉\n936\t板\n937\t赤\n938\t盘\n939\t昕\n940\t蕊\n941\t姿\n942\t侠\n943\t回\n944\tá\n945\t秉\n946\t征\n947\t圆\n948\t考\n949\t茨\n950\t娘\n951\t邢\n952\t电\n953\t瑾\n954\t酒\n955\t寺\n956\t尊\n957\t冉\n958\t边\n959\t别\n960\t刀\n961\t工\n962\t筱\n963\t馬\n964\t坡\n965\t弘\n966\t樊\n967\t裴\n968\t柔\n969\t甫\n970\t妹\n971\t浦\n972\t锁\n973\t渊\n974\t映\n975\t当\n976\t鲍\n977\t见\n978\t麻\n979\t婧\n980\t选\n981\t牙\n982\t烟\n983\t翟\n984\t钧\n985\t屋\n986\t冲\n987\t放\n988\t芹\n989\t煜\n990\t再\n991\t尘\n992\t司\n993\t创\n994\t恋\n995\t幼\n996\t展\n997\t镜\n998\t实\n999\t浪\n1000\t珂\n1001\t爽\n1002\t驰\n1003\t鹿\n1004\t吾\n1005\t简\n1006\t虫\n1007\t网\n1008\t从\n1009\tè\n1010
\t紅\n1011\t食\n1012\t赞\n1013\tà\n1014\t柴\n1015\t沟\n1016\t魂\n1017\t張\n1018\t叔\n1019\t端\n1020\t入\n1021\t闻\n1022\t耳\n1023\t慈\n1024\t汀\n1025\t集\n1026\t郁\n1027\t娥\n1028\t死\n1029\t伏\n1030\t观\n1031\t鸟\n1032\t港\n1033\t仓\n1034\t芭\n1035\t羊\n1036\t纽\n1037\t詹\n1038\t唯\n1039\t主\n1040\t亿\n1041\t旗\n1042\t朋\n1043\t蔚\n1044\t商\n1045\t斐\n1046\t拜\n1047\t凝\n1048\t十\n1049\t酷\n1050\t片\n1051\t性\n1052\t烨\n1053\t長\n1054\t寨\n1055\t蓓\n1056\t动\n1057\t魁\n1058\t猫\n1059\t迎\n1060\t魚\n1061\t敦\n1062\t浮\n1063\t東\n1064\t用\n1065\t霜\n1066\t咏\n1067\t采\n1068\t狼\n1069\t解\n1070\t衡\n1071\t录\n1072\t府\n1073\t琛\n1074\t舞\n1075\t街\n1076\t澜\n1077\t致\n1078\t则\n1079\t努\n1080\t愛\n1081\t举\n1082\t淼\n1083\tì\n1084\t晟\n1085\t肉\n1086\t身\n1087\t巷\n1088\t伽\n1089\t畅\n1090\t典\n1091\t首\n1092\tê\n1093\t斋\n1094\t拿\n1095\t沐\n1096\t骆\n1097\t丙\n1098\t狗\n1099\t瓜\n1100\t內\n1101\t细\n1102\tí\n1103\t视\n1104\t屯\n1105\t臻\n1106\t酸\n1107\t速\n1108\t頭\n1109\t养\n1110\t傲\n1111\t牧\n1112\t添\n1113\t直\n1114\t鸡\n1115\t泊\n1116\t勃\n1117\t昱\n1118\t巍\n1119\t宸\n1120\t式\n1121\t茵\n1122\t豆\n1123\t休\n1124\t半\n1125\t场\n1126\t蛇\n1127\t灯\n1128\t临\n1129\t玺\n1130\t煌\n1131\t顶\n1132\t次\n1133\t忆\n1134\t壮\n1135\t社\n1136\t席\n1137\t物\n1138\t陵\n1139\t醉\n1140\t毒\n1141\t媚\n1142\t風\n1143\t积\n1144\t佰\n1145\t車\n1146\t庚\n1147\t过\n1148\t猛\n1149\t菁\n1150\t母\n1151\t两\n1152\t龄\n1153\t破\n1154\t买\n1155\t效\n1156\t祺\n1157\t發\n1158\t玥\n1159\t我\n1160\t藏\n1161\t县\n1162\t号\n1163\t坎\n1164\t训\n1165\t嘎\n1166\t众\n1167\t懿\n1168\tò\n1169\t底\n1170\t党\n1171\t門\n1172\t尾\n1173\t予\n1174\t達\n1175\t转\n1176\t变\n1177\t盟\n1178\t是\n1179\t阮\n1180\t药\n1181\t船\n1182\t足\n1183\t快\n1184\t蘭\n1185\t毓\n1186\t乙\n1187\t讯\n1188\t杏\n1189\t渡\n1190\t陽\n1191\t而\n1192\t拓\n1193\t象\n1194\t喻\n1195\t汗\n1196\t眉\n1197\t散\n1198\t也\n1199\t横\n1200\t召\n1201\t节\n1202\t归\n1203\t离\n1204\t坪\n1205\t位\n1206\t制\n1207\t暗\n1208\t榕\n1209\t今\n1210\t量\n1211\t器\n1212\t仔\n1213\t脱\n1214\t所\n1215\t交\n1216\t结\n1217\t轻\n1218\t颂\n1219\t现\n1220\t又\n1221\t界\n1222\t病\n1223\t封\n1224\t祁\n1225\t寅\n1226\t岸\n1227\t樱\n1228\t阴\n1229\t妖\n1230\t澳\n1231\t期\n1232\t
历\n1233\t命\n1234\t绮\n1235\t彼\n1236\t夕\n1237\t丸\n1238\t异\n1239\t淳\n1240\t苦\n1241\tó\n1242\t澄\n1243\t求\n1244\t開\n1245\t杀\n1246\t途\n1247\tü\n1248\t珀\n1249\t调\n1250\t沁\n1251\t國\n1252\t反\n1253\t零\n1254\t茗\n1255\t０\n1256\t族\n1257\t蒲\n1258\t泓\n1259\t棠\n1260\t引\n1261\t弟\n1262\t爾\n1263\t牌\n1264\t团\n1265\t至\n1266\t独\n1267\t娴\n1268\t迷\n1269\t倒\n1270\t瀚\n1271\t铃\n1272\t苍\n1273\t淇\n1274\t轮\n1275\t狄\n1276\t繁\n1277\t樂\n1278\t卜\n1279\t氣\n1280\t校\n1281\t婚\n1282\t断\n1283\t霸\n1284\t寶\n1285\t固\n1286\t豹\n1287\t韬\n1288\t隐\n1289\t教\n1290\t姓\n1291\t１\n1292\t极\n1293\t带\n1294\t走\n1295\t羅\n1296\t帮\n1297\t亞\n1298\t净\n1299\t婕\n1300\t难\n1301\t挺\n1302\t糖\n1303\t招\n1304\t凉\n1305\t蜜\n1306\t收\n1307\t数\n1308\t奧\n1309\t雲\n1310\t述\n1311\t逊\n1312\t杭\n1313\t幽\n1314\t脚\n1315\t２\n1316\t廉\n1317\t桦\n1318\t灰\n1319\t医\n1320\t与\n1321\t陳\n1322\t坝\n1323\t芮\n1324\t目\n1325\t丘\n1326\t舜\n1327\t覃\n1328\t潇\n1329\t含\n1330\t亲\n1331\t铜\n1332\t晚\n1333\t支\n1334\t猪\n1335\t画\n1336\t玖\n1337\tú\n1338\t店\n1339\t项\n1340\t渝\n1341\t排\n1342\t旋\n1343\t笔\n1344\t压\n1345\t芷\n1346\t报\n1347\t強\n1348\t乳\n1349\t融\n1350\t笛\n1351\t冈\n1352\t的\n1353\t棋\n1354\t领\n1355\t瑛\n1356\t屈\n1357\t狂\n1358\t院\n1359\t峻\n1360\t孤\n1361\t谋\n1362\t未\n1363\t兔\n1364\t鲜\n1365\t衍\n1366\t术\n1367\t吟\n1368\t间\n1369\t计\n1370\t觉\n1371\t泥\n1372\t乱\n1373\t蝶\n1374\t倍\n1375\t卷\n1376\t残\n1377\t蓬\n1378\t对\n1379\t植\n1380\t耕\n1381\t盾\n1382\t迦\n1383\t缪\n1384\t条\n1385\t域\n1386\t欲\n1387\t杯\n1388\t虚\n1389\t习\n1390\t爷\n1391\t早\n1392\t麗\n1393\t郡\n1394\t浅\n1395\t退\n1396\t纸\n1397\t策\n1398\tａ\n1399\t活\n1400\t窦\n1401\t攀\n1402\t屏\n1403\t刺\n1404\t泳\n1405\t旦\n1406\t补\n1407\t防\n1408\t姝\n1409\t恺\n1410\t晔\n1411\t肤\n1412\t軍\n1413\t漫\n1414\t失\n1415\t滕\n1416\t背\n1417\t词\n1418\t晗\n1419\t表\n1420\t來\n1421\t涂\n1422\t坑\n1423\t誉\n1424\t装\n1425\t受\n1426\t甜\n1427\t機\n1428\t邪\n1429\t嘴\n1430\t雍\n1431\t棉\n1432\t霄\n1433\t针\n1434\t荆\n1435\t料\n1436\t鼠\n1437\t革\n1438\t炫\n1439\t将\n1440\t绝\n1441\t锅\n1442\t取\n1443\t電\n1444\t宿\n1445\t货\n1446\t粤\n1447\t葵\n1448\t姐\n1449\t介\n1450\t爵\n1451\t阔\n1452\t涅\n1453\t闪\n1454\t听\
n1455\t央\n1456\t掌\n1457\t近\n1458\t贡\n1459\t沉\n1460\t迟\n1461\t改\n1462\t配\n1463\t庙\n1464\t染\n1465\t铮\n1466\t阎\n1467\t芯\n1468\t汐\n1469\t颐\n1470\t蛋\n1471\t护\n1472\t部\n1473\t孚\n1474\t伤\n1475\t狐\n1476\t饭\n1477\t鼓\n1478\t娄\n1479\t戚\n1480\t略\n1481\t啸\n1482\t幸\n1483\t滋\n1484\t指\n1485\t悠\n1486\t妻\n1487\t脉\n1488\t丛\n1489\t警\n1490\t模\n1491\t洗\n1492\t奶\n1493\t枪\n1494\t恶\n1495\t宴\n1496\t靳\n1497\t３\n1498\t契\n1499\t想\n1500\t床\n1501\t泪\n1502\t随\n1503\t市\n1504\t探\n1505\t焰\n1506\t豫\n1507\t點\n1508\t住\n1509\t淮\n1510\t炮\n1511\t圈\n1512\t吐\n1513\t楷\n1514\t话\n1515\t線\n1516\t接\n1517\t假\n1518\t仇\n1519\t射\n1520\t蜀\n1521\t婴\n1522\t佟\n1523\t追\n1524\t奔\n1525\t胶\n1526\t晏\n1527\t兽\n1528\t跳\n1529\t弦\n1530\t质\n1531\t體\n1532\t扣\n1533\t更\n1534\t卉\n1535\t梨\n1536\t形\n1537\t息\n1538\t壁\n1539\t共\n1540\t菱\n1541\t闵\n1542\t渠\n1543\t感\n1544\t寻\n1545\t盐\n1546\t惊\n1547\t珺\n1548\t慎\n1549\t去\n1550\t狮\n1551\t韶\n1552\t雙\n1553\t瞳\n1554\t宅\n1555\t座\n1556\t总\n1557\t趣\n1558\t萬\n1559\t短\n1560\t奉\n1561\t滩\n1562\t飛\n1563\t扶\n1564\t折\n1565\t筠\n1566\t寇\n1567\tā\n1568\t尖\n1569\t暖\n1570\t弥\n1571\t惜\n1572\t涌\n1573\t符\n1574\t８\n1575\t匠\n1576\t嫣\n1577\t璞\n1578\t杉\n1579\t让\n1580\t雾\n1581\t動\n1582\t蕴\n1583\t处\n1584\t宠\n1585\t楊\n1586\t务\n1587\t猴\n1588\t翻\n1589\t到\n1590\t竞\n1591\t参\n1592\t某\n1593\t闽\n1594\t送\n1595\t匡\n1596\t钊\n1597\t薄\n1598\t磨\n1599\t芒\n1600\t婵\n1601\t厄\n1602\t渔\n1603\t户\n1604\t推\n1605\t研\n1606\t纲\n1607\t恭\n1608\t聖\n1609\t茅\n1610\t资\n1611\t宛\n1612\t魅\n1613\t软\n1614\t胎\n1615\t鸭\n1616\t愚\n1617\t喆\n1618\t鉴\n1619\t荒\n1620\t协\n1621\t罪\n1622\t铎\n1623\t迅\n1624\t笙\n1625\t語\n1626\t葆\n1627\t匹\n1628\t区\n1629\t绣\n1630\t型\n1631\t轶\n1632\t额\n1633\t消\n1634\t靓\n1635\t硬\n1636\t着\n1637\t姣\n1638\t偏\n1639\t票\n1640\t碎\n1641\t套\n1642\t遥\n1643\t冀\n1644\t际\n1645\t架\n1646\t拳\n1647\t巫\n1648\t６\n1649\t妇\n1650\t赋\n1651\t私\n1652\t曙\n1653\t站\n1654\t载\n1655\t抗\n1656\t芦\n1657\t膜\n1658\t尸\n1659\t适\n1660\t错\n1661\t潭\n1662\t击\n1663\t俭\n1664\t巢\n1665\t幻\n1666\t婆\n1667\t麒\n1668\t值\n1669\t止\n1670\t种\n1671\t維\n1672\tｃ\n1673\t岐\n1674\t後\n1675\t伶\n1676\t墙\n1
677\t刃\n1678\t缇\n1679\t琰\n1680\t殇\n1681\t烧\n1682\t窝\n1683\t砚\n1684\t無\n1685\t矿\n1686\t遗\n1687\t争\n1688\t怪\n1689\tｂ\n1690\t末\n1691\t逆\n1692\t码\n1693\t释\n1694\t屠\n1695\t问\n1696\t恬\n1697\t腰\n1698\t掉\n1699\t時\n1700\t具\n1701\t脸\n1702\t璋\n1703\t隋\n1704\t芽\n1705\t控\n1706\t壹\n1707\t甄\n1708\t會\n1709\t价\n1710\t劫\n1711\t菌\n1712\t熱\n1713\t岁\n1714\t痛\n1715\t刻\n1716\t單\n1717\t咸\n1718\t書\n1719\t兮\n1720\t服\n1721\t敖\n1722\t禁\n1723\t差\n1724\t沫\n1725\t栗\n1726\t暮\n1727\t倾\n1728\t戰\n1729\t投\n1730\t戏\n1731\t币\n1732\t要\n1733\t造\n1734\t冥\n1735\t肌\n1736\t降\n1737\t龟\n1738\t低\n1739\tｏ\n1740\t痕\n1741\t學\n1742\t弹\n1743\t淡\n1744\t迹\n1745\t箭\n1746\t岑\n1747\t读\n1748\t灭\n1749\t萝\n1750\t潜\n1751\t穗\n1752\t俄\n1753\t吊\n1754\t虞\n1755\t斑\n1756\t炉\n1757\t肥\n1758\t说\n1759\t稳\n1760\t焱\n1761\t隽\n1762\t急\n1763\t橙\n1764\t卞\n1765\t雀\n1766\t停\n1767\t槐\n1768\t级\n1769\t剧\n1770\t姑\n1771\t岱\n1772\tｅ\n1773\t弄\n1774\t脑\n1775\t蔓\n1776\t论\n1777\t壳\n1778\t鼻\n1779\t圖\n1780\t醒\n1781\t犬\n1782\t堤\n1783\t闲\n1784\t坐\n1785\t专\n1786\t蜂\n1787\t饶\n1788\t证\n1789\t液\n1790\t莺\n1791\t导\n1792\t跑\n1793\t砂\n1794\t谈\n1795\t虾\n1796\t湛\n1797\t杂\n1798\t看\n1799\t父\n1800\t埠\n1801\t盲\n1802\t敌\n1803\t泛\n1804\t摇\n1805\t翎\n1806\t霆\n1807\t核\n1808\t屿\n1809\t换\n1810\t股\n1811\t产\n1812\t呈\n1813\t漏\n1814\t興\n1815\t铺\n1816\t刑\n1817\t省\n1818\t裝\n1819\t刁\n1820\t曰\n1821\t劉\n1822\t察\n1823\t除\n1824\t齿\n1825\t峥\n1826\t牟\n1827\t飘\n1828\t律\n1829\t鞋\n1830\t禅\n1831\t瞿\n1832\t右\n1833\t璟\n1834\t滑\n1835\t煤\n1836\t滢\n1837\t琨\n1838\t逢\n1839\t税\n1840\t宮\n1841\t状\n1842\t納\n1843\t谨\n1844\t寄\n1845\t弓\n1846\t练\n1847\t序\n1848\t纱\n1849\t恨\n1850\t凱\n1851\t寧\n1852\t帶\n1853\t境\n1854\t局\n1855\t操\n1856\t妤\n1857\t裂\n1858\t猎\n1859\t眠\n1860\t泡\n1861\t辞\n1862\tｉ\n1863\t势\n1864\t戎\n1865\t室\n1866\t順\n1867\t透\n1868\t享\n1869\t演\n1870\t裘\n1871\t由\n1872\t助\n1873\t第\n1874\t奋\n1875\t储\n1876\t伐\n1877\t沪\n1878\t９\n1879\t磁\n1880\t拍\n1881\t盼\n1882\t珈\n1883\t贻\n1884\t偷\n1885\t混\n1886\t仰\n1887\t队\n1888\t場\n1889\t胤\n1890\t呼\n1891\t案\n1892\t驹\n1893\t还\n1894\t铂\n1895\t栾\n1896\t腿\n1897\t响\n1898\t禧\n189
9\t溢\n1900\t饼\n1901\t４\n1902\t馆\n1903\t材\n1904\t粮\n1905\t姗\n1906\t缺\n1907\t桢\n1908\t業\n1909\t歆\n1910\t惟\n1911\t纹\n1912\t祯\n1913\t崖\n1914\t预\n1915\t肇\n1916\t連\n1917\t悲\n1918\t唱\n1919\t鹭\n1920\t胸\n1921\t杆\n1922\t暴\n1923\t園\n1924\t准\n1925\t汶\n1926\t吳\n1927\t钻\n1928\t纤\n1929\t氧\n1930\t冶\n1931\t脂\n1932\t怨\n1933\t島\n1934\t爆\n1935\t尽\n1936\t夹\n1937\t挂\n1938\t肠\n1939\t绵\n1940\t崎\n1941\t銀\n1942\t措\n1943\t算\n1944\t陀\n1945\t橋\n1946\t执\n1947\t职\n1948\t徽\n1949\t邑\n1950\t瑪\n1951\t荡\n1952\t戒\n1953\t旧\n1954\t丑\n1955\t浓\n1956\t便\n1957\t仑\n1958\t歇\n1959\t縣\n1960\t围\n1961\t纬\n1962\t褚\n1963\t丞\n1964\t胆\n1965\t辅\n1966\t减\n1967\t贯\n1968\t圭\n1969\t乘\n1970\t率\n1971\t別\n1972\t藍\n1973\t扇\n1974\t萊\n1975\t瘦\n1976\t漢\n1977\tｎ\n1978\t滿\n1979\t榆\n1980\t屹\n1981\t廣\n1982\t句\n1983\t借\n1984\t鞠\n1985\t垂\n1986\t骥\n1987\t鐵\n1988\t雞\n1989\t號\n1990\t胃\n1991\t玩\n1992\t雕\n1993\t罕\n1994\t墩\n1995\t谊\n1996\t贼\n1997\t對\n1998\t件\n1999\t编\n2000\tｄ\n2001\t嫂\n2002\t葉\n2003\t栓\n2004\t湿\n2005\t统\n2006\t箱\n2007\t庸\n2008\t终\n2009\t轉\n2010\t吹\n2011\t噶\n2012\t炼\n2013\t聯\n2014\t谱\n2015\t悬\n2016\t甸\n2017\t兩\n2018\t委\n2019\t徒\n2020\t午\n2021\t忘\n2022\t藻\n2023\t遇\n2024\t師\n2025\t數\n2026\t激\n2027\t經\n2028\t炯\n2029\t怒\n2030\t珏\n2031\t靈\n2032\t熹\n2033\t靜\n2034\t兒\n2035\t報\n2036\t調\n2037\t圩\n2038\t袋\n2039\t妆\n2040\t各\n2041\t祭\n2042\t层\n2043\t聲\n2044\t陌\n2045\t幕\n2046\t帽\n2047\t了\n2048\t舌\n2049\t碗\n2050\t記\n2051\t窑\n2052\t丕\n2053\t貝\n2054\t盤\n2055\t過\n2056\t醇\n2057\t紧\n2058\t类\n2059\t娣\n2060\t嵘\n2061\t弃\n2062\t嵩\n2063\t卖\n2064\t侨\n2065\tｐ\n2066\t块\n2067\t束\n2068\t绳\n2069\t橫\n2070\t鄂\n2071\t窗\n2072\t粒\n2073\t膏\n2074\t灏\n2075\t義\n2076\t馥\n2077\t藥\n2078\t卧\n2079\t夷\n2080\t诸\n2081\t侃\n2082\t抱\n2083\t絲\n2084\t故\n2085\t厨\n2086\t喷\n2087\t荔\n2088\t俏\n2089\t凶\n2090\t斜\n2091\t忍\n2092\t關\n2093\t完\n2094\t皖\n2095\t逃\n2096\t榜\n2097\t样\n2098\t淫\n2099\t運\n2100\t喀\n2101\t互\n2102\t浆\n2103\t結\n2104\t侧\n2105\t闯\n2106\t抽\n2107\t腊\n2108\t秘\n2109\t请\n2110\t写\n2111\t续\n2112\t组\n2113\t此\n2114\t烁\n2115\t吸\n2116\t销\n2117\t翊\n2118\t漾\n2119\t荫\n2120\t進\n2121\
tù\n2122\t键\n2123\t囚\n2124\t等\n2125\t疏\n2126\t弱\n2127\t棒\n2128\t渣\n2129\t嫁\n2130\t夺\n2131\t链\n2132\t懒\n2133\t你\n2134\t骁\n2135\t励\n2136\t胖\n2137\t螺\n2138\t恰\n2139\t珉\n2140\t须\n2141\t墅\n2142\t款\n2143\t堆\n2144\t轴\n2145\t整\n2146\t咪\n2147\t注\n2148\t救\n2149\t網\n2150\t勾\n2151\t播\n2152\t称\n2153\t裸\n2154\t频\n2155\t棚\n2156\t尿\n2157\t珑\n2158\t旻\n2159\t害\n2160\t枣\n2161\t阵\n2162\t备\n2163\t稻\n2164\t叫\n2165\t就\n2166\t攻\n2167\t辣\n2168\t邻\n2169\t俐\n2170\t昀\n2171\t踏\n2172\t肝\n2173\t坛\n2174\t像\n2175\t夢\n2176\t愿\n2177\t斩\n2178\t腹\n2179\t苟\n2180\t愁\n2181\t樹\n2182\t錢\n2183\t蟹\n2184\t傻\n2185\t鹅\n2186\t态\n2187\t苇\n2188\t筒\n2189\t溫\n2190\t諾\n2191\t蕙\n2192\t穿\n2193\t紙\n2194\t涧\n2195\t奸\n2196\t厂\n2197\t鸥\n2198\t琅\n2199\t漆\n2200\t昶\n2201\t檀\n2202\t险\n2203\t昇\n2204\t補\n2205\t译\n2206\t枕\n2207\t悅\n2208\t持\n2209\t评\n2210\t庵\n2211\t黔\n2212\t煞\n2213\t拾\n2214\t熟\n2215\t试\n2216\t题\n2217\t浴\n2218\t遠\n2219\t摆\n2220\t邬\n2221\t枯\n2222\t鞭\n2223\t蔻\n2224\t７\n2225\t劍\n2226\t吃\n2227\t勉\n2228\t纶\n2229\t迁\n2230\t伴\n2231\t疯\n2232\t使\n2233\t肃\n2234\t审\n2235\t梭\n2236\t他\n2237\t拔\n2238\t悟\n2239\t穴\n2240\t豐\n2241\t勝\n2242\t實\n2243\t綠\n2244\t玻\n2245\t彻\n2246\t告\n2247\t蛮\n2248\t抢\n2249\t瓷\n2250\t枢\n2251\t系\n2252\t峡\n2253\t蘇\n2254\t淘\n2255\t负\n2256\tｓ\n2257\t员\n2258\t乎\n2259\t邊\n2260\t賽\n2261\t歐\n2262\t纵\n2263\t哀\n2264\t被\n2265\t籍\n2266\t肩\n2267\t尺\n2268\t圓\n2269\t旅\n2270\t漪\n2271\t泗\n2272\t莊\n2273\t臧\n2274\t標\n2275\t朔\n2276\t搜\n2277\t塑\n2278\t視\n2279\t狱\n2280\t铸\n2281\t筑\n2282\t附\n2283\t剂\n2284\t筋\n2285\t柜\n2286\t购\n2287\t滚\n2288\t驴\n2289\t腳\n2290\t墓\n2291\t盆\n2292\t骑\n2293\t溜\n2294\t垒\n2295\t陰\n2296\t始\n2297\t废\n2298\t赢\n2299\t隔\n2300\t粗\n2301\t议\n2302\t峪\n2303\t蒸\n2304\t傷\n2305\t芊\n2306\t砖\n2307\t變\n2308\t检\n2309\t巾\n2310\t充\n2311\t免\n2312\t版\n2313\t拼\n2314\t笼\n2315\t袖\n2316\t滔\n2317\t鴻\n2318\t貨\n2319\t置\n2320\t疮\n2321\t灌\n2322\t槽\n2323\t厉\n2324\t錦\n2325\t瓶\n2326\t企\n2327\t栖\n2328\t吧\n2329\t睡\n2330\t渭\n2331\t梯\n2332\t胥\n2333\t织\n2334\t價\n2335\t荟\n2336\t坏\n2337\t唇\n2338\t澈\n2339\t臭\n2340\t怜\n2341\t赌\n2342\t玫\n2343\t柒
\n2344\t囊\n2345\t慢\n2346\t樓\n2347\t穷\n2348\t養\n2349\t扫\n2350\t僧\n2351\t鸽\n2352\t凰\n2353\t燃\n2354\t溶\n2355\t绒\n2356\t勿\n2357\t亡\n2358\t贴\n2359\t燈\n2360\t詞\n2361\t宰\n2362\t湯\n2363\t鲸\n2364\t帛\n2365\t漠\n2366\t饰\n2367\t吻\n2368\t條\n2369\t惑\n2370\t詩\n2371\t做\n2372\tｕ\n2373\t財\n2374\t阅\n2375\t移\n2376\t忧\n2377\t诱\n2378\t麥\n2379\t奚\n2380\t串\n2381\t級\n2382\t奖\n2383\t寂\n2384\t剪\n2385\t盗\n2386\t偶\n2387\t妈\n2388\t驿\n2389\t突\n2390\t滴\n2391\t煊\n2392\t昔\n2393\t往\n2394\t限\n2395\t帐\n2396\t蛟\n2397\t败\n2398\t輝\n2399\t椿\n2400\t殺\n2401\t酱\n2402\t約\n2403\t撞\n2404\t痴\n2405\t庐\n2406\t寰\n2407\t陪\n2408\t苹\n2409\t辽\n2410\t霓\n2411\t擎\n2412\t澤\n2413\t俗\n2414\t嗣\n2415\t拥\n2416\tｔ\n2417\t碟\n2418\t待\n2419\t菡\n2420\t缸\n2421\t傳\n2422\t阶\n2423\t络\n2424\t欠\n2425\t兄\n2426\t殊\n2427\t枭\n2428\t遂\n2429\t難\n2430\t環\n2431\t课\n2432\t危\n2433\t巡\n2434\t話\n2435\t耘\n2436\t樟\n2437\t逐\n2438\t候\n2439\t遊\n2440\t爪\n2441\t钉\n2442\t畫\n2443\t當\n2444\t疆\n2445\t插\n2446\t糕\n2447\t薪\n2448\t阻\n2449\t缩\n2450\t頂\n2451\t割\n2452\t袭\n2453\t弯\n2454\t挑\n2455\t铨\n2456\t見\n2457\t葬\n2458\t咒\n2459\t倚\n2460\t祎\n2461\t贷\n2462\t輪\n2463\t筆\n2464\t测\n2465\t產\n2466\t蜡\n2467\t每\n2468\t脫\n2469\t腔\n2470\t仟\n2471\t叙\n2472\tｈ\n2473\t肾\n2474\t領\n2475\t误\n2476\t熠\n2477\t邮\n2478\t荃\n2479\tē\n2480\t稅\n2481\t径\n2482\t扁\n2483\t臨\n2484\tｇ\n2485\t绯\n2486\t蓮\n2487\t缝\n2488\t伪\n2489\t悉\n2490\t碳\n2491\t丫\n2492\t魯\n2493\t援\n2494\t宙\n2495\t蚁\n2496\t換\n2497\t費\n2498\t莘\n2499\t刊\n2500\t區\n2501\t疾\n2502\t炬\n2503\t己\n2504\t巩\n2505\t祈\n2506\t伞\n2507\t妥\n2508\t孜\n2509\t襄\n2510\t拖\n2511\t呆\n2512\t汁\n2513\t猿\n2514\t疑\n2515\t赟\n2516\t及\n2517\t叉\n2518\t缠\n2519\t裤\n2520\t硫\n2521\t翘\n2522\t丧\n2523\t识\n2524\t赐\n2525\t頓\n2526\t椰\n2527\t戶\n2528\tｘ\n2529\t浙\n2530\t笃\n2531\t壶\n2532\t哉\n2533\t饮\n2534\t俪\n2535\t碑\n2536\t倫\n2537\t潤\n2538\t截\n2539\t棍\n2540\t规\n2541\t餐\n2542\t岙\n2543\t稿\n2544\t绘\n2545\t骐\n2546\t牢\n2547\t累\n2548\t葱\n2549\t裙\n2550\t衫\n2551\t侍\n2552\t哨\n2553\t離\n2554\t叹\n2555\t祸\n2556\t避\n2557\t萃\n2558\t蒿\n2559\t哭\n2560\t將\n2561\t几\n2562\t渐\n2563\t决\n2564\t供\n2565\t斷\n
2566\t困\n2567\t租\n2568\t闷\n2569\t灼\n2570\t氯\n2571\t扑\n2572\t例\n2573\t膠\n2574\t間\n2575\t橘\n2576\t虛\n2577\t飯\n2578\t尉\n2579\t蟲\n2580\t赣\n2581\t涼\n2582\t灾\n2583\t質\n2584\t犯\n2585\t%\n2586\t導\n2587\t節\n2588\t轨\n2589\t拐\n2590\t瀛\n2591\t骞\n2592\t沅\n2593\t妾\n2594\t骅\n2595\t旁\n2596\t觅\n2597\t且\n2598\t示\n2599\t似\n2600\t赏\n2601\t粟\n2602\t復\n2603\t哑\n2604\t觀\n2605\t敢\n2606\t只\n2607\t烏\n2608\t親\n2609\t姨\n2610\t豬\n2611\t著\n2612\t選\n2613\t浚\n2614\t兜\n2615\t监\n2616\t驾\n2617\t并\n2618\t蚕\n2619\t針\n2620\t磷\n2621\t扩\n2622\t烂\n2623\t履\n2624\t泼\n2625\t闹\n2626\t泾\n2627\t办\n2628\t吞\n2629\t蛙\n2630\t焊\n2631\t坟\n2632\t盒\n2633\t愈\n2634\tｙ\n2635\t焚\n2636\t抓\n2637\t偉\n2638\t垚\n2639\t烤\n2640\t羚\n2641\t淋\n2642\t披\n2643\t阙\n2644\tｍ\n2645\t罡\n2646\t慰\n2647\t洼\n2648\t髮\n2649\t柄\n2650\t燒\n2651\t荻\n2652\t弈\n2653\t番\n2654\t參\n2655\t技\n2656\t碱\n2657\t捕\n2658\t夸\n2659\t逼\n2660\t漂\n2661\t鳞\n2662\t慶\n2663\t鸾\n2664\t裳\n2665\t樵\n2666\t隊\n2667\t懋\n2668\t稀\n2669\t預\n2670\t验\n2671\t缓\n2672\t旱\n2673\t函\n2674\t稚\n2675\t鲨\n2676\t幅\n2677\t佘\n2678\t資\n2679\t返\n2680\t划\n2681\t專\n2682\t沖\n2683\t忌\n2684\t藩\n2685\t璃\n2686\t奏\n2687\t陇\n2688\t腸\n2689\t鎮\n2690\t廊\n2691\t批\n2692\t绫\n2693\t签\n2694\t幺\n2695\t忻\n2696\t璧\n2697\t肽\n2698\t涉\n2699\t桶\n2700\t苔\n2701\t搭\n2702\t替\n2703\t種\n2704\t把\n2705\t鳳\n2706\t減\n2707\t苓\n2708\t锤\n2709\t優\n2710\t煙\n2711\t即\n2712\t舰\n2713\t颈\n2714\t贱\n2715\t钩\n2716\t冻\n2717\t獨\n2718\t銅\n2719\t卯\n2720\t妞\n2721\t碰\n2722\t袍\n2723\t赶\n2724\t填\n2725\t霁\n2726\t债\n2727\t闸\n2728\t择\n2729\t趙\n2730\t胺\n2731\t阜\n2732\t絕\n2733\t刮\n2734\t罐\n2735\t虐\n2736\t扭\n2737\t铝\n2738\t钙\n2739\t聘\n2740\t汽\n2741\t铅\n2742\t牵\n2743\t烽\n2744\t棣\n2745\t葯\n2746\t恕\n2747\t藝\n2748\t售\n2749\t極\n2750\t壓\n2751\t喉\n2752\t皂\n2753\t触\n2754\t異\n2755\t彈\n2756\t菇\n2757\t翅\n2758\t垫\n2759\t腦\n2760\t寸\n2761\t珩\n2762\t锌\n2763\t昏\n2764\t膳\n2765\t逝\n2766\t绅\n2767\t损\n2768\t現\n2769\tｌ\n2770\t肺\n2771\t畏\n2772\t伙\n2773\t煦\n2774\t挽\n2775\t韓\n2776\t涤\n2777\tｖ\n2778\t霏\n2779\t恐\n2780\t炸\n2781\t貓\n2782\t鳥\n2783\t芋\n2784\t笠\n2785\t冢\n2786\t坂\n2787\t叠\n27
88\t皋\n2789\t腐\n2790\t桓\n2791\t噴\n2792\t皆\n2793\t蝉\n2794\t崩\n2795\t鋼\n2796\t忙\n2797\t疗\n2798\t篇\n2799\t鄉\n2800\t跨\n2801\t答\n2802\t衛\n2803\t涩\n2804\t庫\n2805\t處\n2806\t驼\n2807\t硝\n2808\t堃\n2809\t試\n2810\t務\n2811\t棕\n2812\t孕\n2813\t杖\n2814\t爹\n2815\t劇\n2816\t椒\n2817\t拙\n2818\t兼\n2819\t诡\n2820\t册\n2821\t應\n2822\t栏\n2823\t仿\n2824\t抛\n2825\t卒\n2826\t访\n2827\t枚\n2828\t鲤\n2829\tｆ\n2830\t卵\n2831\t孽\n2832\t蚀\n2833\t认\n2834\t歪\n2835\t厦\n2836\t钛\n2837\t挖\n2838\t哇\n2839\t熏\n2840\t涯\n2841\t悍\n2842\t咬\n2843\t曉\n2844\t竺\n2845\t厝\n2846\t說\n2847\t鲲\n2848\t遮\n2849\t榮\n2850\t弋\n2851\t跟\n2852\t臂\n2853\t貴\n2854\t禮\n2855\t創\n2856\t骄\n2857\t讲\n2858\t距\n2859\t硅\n2860\t灣\n2861\t恆\n2862\t權\n2863\t臺\n2864\t览\n2865\t贫\n2866\t圃\n2867\t孑\n2868\t磐\n2869\t澎\n2870\t醫\n2871\t陸\n2872\t刷\n2873\t笋\n2874\t属\n2875\t贪\n2876\t町\n2877\t堰\n2878\t闭\n2879\t彰\n2880\t账\n2881\t已\n2882\t評\n2883\t侬\n2884\t農\n2885\t覆\n2886\t拨\n2887\t炒\n2888\t洙\n2889\t臉\n2890\t媒\n2891\t爬\n2892\t捞\n2893\t嫩\n2894\t肚\n2895\t鏡\n2896\t驱\n2897\t伸\n2898\t甚\n2899\t掛\n2900\t垣\n2901\t况\n2902\t滞\n2903\t匯\n2904\t催\n2905\t傑\n2906\tū\n2907\t總\n2908\t桔\n2909\t猜\n2910\t炽\n2911\t職\n2912\t冒\n2913\t莽\n2914\t聽\n2915\t骚\n2916\t洒\n2917\t曜\n2918\t衰\n2919\t绕\n2920\t暄\n2921\t诉\n2922\t授\n2923\t奢\n2924\t題\n2925\t晃\n2926\t眸\n2927\t踢\n2928\t妄\n2929\t護\n2930\t簡\n2931\t丈\n2932\t灶\n2933\t诊\n2934\t罩\n2935\t醋\n2936\t桩\n2937\t崗\n2938\t绞\n2939\t沧\n2940\t裁\n2941\t拆\n2942\t镁\n2943\t犁\n2944\t判\n2945\t尕\n2946\t氢\n2947\t鸠\n2948\t劝\n2949\t竖\n2950\t飚\n2951\t最\n2952\t蹄\n2953\t羡\n2954\t陷\n2955\t缨\n2956\t旷\n2957\t页\n2958\t翌\n2959\t烛\n2960\t筝\n2961\t毁\n2962\t戀\n2963\t荀\n2964\t陂\n2965\t貼\n2966\t鶴\n2967\t讀\n2968\t輕\n2969\t档\n2970\t抚\n2971\t副\n2972\t订\n2973\t槍\n2974\t凹\n2975\t編\n2976\t稼\n2977\t拱\n2978\t雏\n2979\t碼\n2980\t桌\n2981\t霉\n2982\t睦\n2983\t骊\n2984\t摸\n2985\t證\n2986\t茄\n2987\t絮\n2988\t匪\n2989\t豚\n2990\t酥\n2991\t團\n2992\t厅\n2993\t获\n2994\t鸦\n2995\t押\n2996\t沿\n2997\t逗\n2998\t愉\n2999\t椅\n3000\t卦\n3001\t鞍\n3002\t笨\n3003\t寫\n3004\t純\n3005\t緣\n3006\t竟\n3007\t組\n3008\t抄\n3009\t滇\n3010
\t粪\n3011\t鍋\n3012\t淦\n3013\t佬\n3014\t泣\n3015\t弼\n3016\t俠\n3017\t旸\n3018\t浑\n3019\t绥\n3020\t设\n3021\t薯\n3022\t梧\n3023\t亢\n3024\t幹\n3025\t症\n3026\t舫\n3027\t煮\n3028\t咔\n3029\t軟\n3030\t賢\n3031\t賣\n3032\t狀\n3033\t癌\n3034\t氨\n3035\t靠\n3036\t細\n3037\t揭\n3038\t构\n3039\t彧\n3040\t帘\n3041\t卤\n3042\t秒\n3043\t镭\n3044\t潼\n3045\tｋ\n3046\t韧\n3047\t栩\n3048\t熔\n3049\t坞\n3050\t污\n3051\t遵\n3052\t製\n3053\t孫\n3054\t羲\n3055\t忽\n3056\t勐\n3057\t營\n3058\t纷\n3059\t殘\n3060\t脊\n3061\t寡\n3062\t洵\n3063\t仆\n3064\t劈\n3065\t辩\n3066\t鐘\n3067\t缤\n3068\t禽\n3069\t甬\n3070\t勺\n3071\t佃\n3072\t茸\n3073\t蛾\n3074\t谁\n3075\t虽\n3076\t痰\n3077\t凸\n3078\t酮\n3079\t腕\n3080\t宵\n3081\t穹\n3082\t惡\n3083\t計\n3084\tｒ\n3085\t钓\n3086\t抵\n3087\t给\n3088\t晕\n3089\t課\n3090\t許\n3091\t員\n3092\t综\n3093\t茉\n3094\t亂\n3095\t啟\n3096\t問\n3097\t捐\n3098\t烦\n3099\t脆\n3100\t備\n3101\t棱\n3102\t埋\n3103\t泷\n3104\t洽\n3105\t珞\n3106\t婦\n3107\t羞\n3108\t确\n3109\t隨\n3110\t犀\n3111\t蚊\n3112\t毫\n3113\t謝\n3114\t糊\n3115\t颠\n3116\t喵\n3117\t胞\n3118\t邸\n3119\t軒\n3120\t測\n3121\t份\n3122\t斧\n3123\t弧\n3124\t矛\n3125\t冕\n3126\t琉\n3127\t狸\n3128\t扒\n3129\t甩\n3130\t肆\n3131\t柚\n3132\t屎\n3133\t庶\n3134\t蓋\n3135\t額\n3136\t否\n3137\t擊\n3138\t鴨\n3139\t旨\n3140\t峙\n3141\t騰\n3142\t購\n3143\t歸\n3144\t遁\n3145\t檢\n3146\t缔\n3147\t矮\n3148\t煎\n3149\t紋\n3150\t浸\n3151\t梗\n3152\t瑰\n3153\t闺\n3154\t挡\n3155\t砍\n3156\t筹\n3157\t涟\n3158\t宥\n3159\t纺\n3160\t贸\n3161\t聊\n3162\t缅\n3163\t沣\n3164\t芃\n3165\t銷\n3166\t潞\n3167\t溥\n3168\t虱\n3169\t矢\n3170\t梳\n3171\t输\n3172\t晁\n3173\t穎\n3174\t獸\n3175\t呂\n3176\t飒\n3177\t頻\n3178\t析\n3179\t帖\n3180\t懷\n3181\t旬\n3182\t裡\n3183\t焉\n3184\t漁\n3185\t層\n3186\t个\n3187\t跌\n3188\t粘\n3189\t役\n3190\t揚\n3191\t鵬\n3192\t鳌\n3193\t驻\n3194\t罚\n3195\t晞\n3196\t乖\n3197\t搏\n3198\t岔\n3199\t氮\n3200\t琢\n3201\t粹\n3202\t碘\n3203\t抹\n3204\t骗\n3205\t湄\n3206\t玟\n3207\t鸢\n3208\t沸\n3209\t誓\n3210\t歡\n3211\t削\n3212\t臀\n3213\t铠\n3214\t滾\n3215\t憨\n3216\t框\n3217\t耗\n3218\t摘\n3219\t责\n3220\t障\n3221\t赠\n3222\t遺\n3223\t瑄\n3224\t搖\n3225\t鷹\n3226\t踪\n3227\t歷\n3228\t嶺\n3229\t葳\n3230\t瑤\n3231\t倉\n3232\t
潔\n3233\t拒\n3234\t統\n3235\t据\n3236\t衬\n3237\t麓\n3238\t啦\n3239\t怕\n3240\t魄\n3241\t窃\n3242\t侵\n3243\t為\n3244\t薩\n3245\t璨\n3246\t署\n3247\t蒼\n3248\t叁\n3249\t炭\n3250\t類\n3251\t炀\n3252\t讨\n3253\t聆\n3254\t蝇\n3255\t冤\n3256\t轰\n3257\t裔\n3258\t粥\n3259\t涨\n3260\t沂\n3261\t沼\n3262\t決\n3263\t悔\n3264\t壽\n3265\t夙\n3266\t荼\n3267\tī\n3268\t按\n3269\t担\n3270\t堪\n3271\t卑\n3272\t尋\n3273\t苯\n3274\t垢\n3275\t忱\n3276\t濠\n3277\t貌\n3278\t骂\n3279\t澍\n3280\t靡\n3281\t谜\n3282\t館\n3283\t璜\n3284\t隱\n3285\t拴\n3286\t瞬\n3287\t扰\n3288\t违\n3289\t铿\n3290\t聿\n3291\t瞻\n3292\t犹\n3293\t箫\n3294\t酉\n3295\t很\n3296\t勞\n3297\t岡\n3298\t燮\n3299\t蔺\n3300\t薰\n3301\t缚\n3302\t锭\n3303\t楓\n3304\t绩\n3305\t督\n3306\t芥\n3307\t茧\n3308\t緊\n3309\t坠\n3310\t辜\n3311\t辈\n3312\t惨\n3313\t搬\n3314\t翀\n3315\t幣\n3316\t镐\n3317\t涓\n3318\t敛\n3319\t锚\n3320\t錯\n3321\t凭\n3322\t埔\n3323\t劣\n3324\t吏\n3325\t糜\n3326\t浊\n3327\t術\n3328\t積\n3329\t却\n3330\t刹\n3331\t蒜\n3332\t溯\n3333\t餅\n3334\t瞎\n3335\t锴\n3336\t钜\n3337\t籽\n3338\t掩\n3339\t孩\n3340\t簽\n3341\t驚\n3342\t肿\n3343\t邝\n3344\t谟\n3345\tě\n3346\t億\n3347\t患\n3348\t終\n3349\t襟\n3350\t跪\n3351\t獅\n3352\t没\n3353\t浣\n3354\t渚\n3355\t痞\n3356\t脾\n3357\t滤\n3358\t凄\n3359\t歧\n3360\t鎖\n3361\t柠\n3362\t態\n3363\t擒\n3364\t泄\n3365\t皙\n3366\t晒\n3367\t陕\n3368\t柿\n3369\t锟\n3370\t膝\n3371\t握\n3372\t濕\n3373\t循\n3374\t淹\n3375\t敷\n3376\t樣\n3377\t規\n3378\t挚\n3379\t址\n3380\t論\n3381\t株\n3382\t仗\n3383\t稱\n3384\t還\n3385\t氟\n3386\t辟\n3387\t谛\n3388\t谌\n3389\t譜\n3390\t锥\n3391\t亏\n3392\t阀\n3393\t锯\n3394\t蛊\n3395\t撤\n3396\t扯\n3397\t钞\n3398\t獎\n3399\t錄\n3400\t銘\n3401\t茫\n3402\t崧\n3403\t侣\n3404\t乞\n3405\t欺\n3406\t瘤\n3407\t篮\n3408\t泠\n3409\t阚\n3410\t濑\n3411\t钳\n3412\t荊\n3413\t咲\n3414\t蝎\n3415\t卸\n3416\t耍\n3417\t摄\n3418\t惹\n3419\t壬\n3420\t辱\n3421\t柑\n3422\t顽\n3423\t铉\n3424\t祚\n3425\t複\n3426\t挥\n3427\t蛤\n3428\t沾\n3429\t脏\n3430\t找\n3431\t圍\n3432\t促\n3433\t賓\n3434\t朮\n3435\t挤\n3436\t郊\n3437\t既\n3438\t舅\n3439\t給\n3440\t咕\n3441\t骋\n3442\t夾\n3443\t鄭\n3444\t鈴\n3445\t浒\n3446\t酶\n3447\t屁\n3448\t茲\n3449\t迫\n3450\t焯\n3451\t晰\n3452\t戲\n3453\t驗\n3454\t舸\
n3455\t驭\n3456\t肢\n3457\t罢\n3458\t嫡\n3459\t栈\n3460\t箐\n3461\t这\n3462\t銮\n3463\t認\n3464\t鬥\n3465\t縮\n3466\t愤\n3467\t郜\n3468\t仝\n3469\t递\n3470\t勢\n3471\tō\n3472\t贰\n3473\t粵\n3474\t痘\n3475\t姦\n3476\t缴\n3477\t揽\n3478\t恪\n3479\t舵\n3480\t艷\n3481\t葡\n3482\t鋒\n3483\t叛\n3484\t産\n3485\t窩\n3486\t嵌\n3487\t敲\n3488\t蓄\n3489\t泻\n3490\t畜\n3491\t抒\n3492\t韻\n3493\t項\n3494\t摊\n3495\t疃\n3496\tの\n3497\t烯\n3498\t吓\n3499\t戊\n3500\t腺\n3501\t褲\n3502\t監\n3503\t谣\n3504\t廠\n3505\t迭\n3506\t鄢\n3507\t谏\n3508\t載\n3509\t拂\n3510\t茎\n3511\t俱\n3512\t斤\n3513\t紀\n3514\t颤\n3515\t尝\n3516\t沥\n3517\t習\n3518\t淞\n3519\t昧\n3520\t逍\n3521\t嗨\n3522\t榴\n3523\t臥\n3524\t嬌\n3525\t側\n3526\t券\n3527\t渗\n3528\t雜\n3529\t閃\n3530\t盜\n3531\t艇\n3532\t喬\n3533\t详\n3534\t秃\n3535\t採\n3536\t汛\n3537\t呀\n3538\t厌\n3539\t喊\n3540\t訂\n3541\t訊\n3542\t燊\n3543\t栅\n3544\t誠\n3545\t夭\n3546\t皱\n3547\t蛛\n3548\t矣\n3549\t鳴\n3550\t攸\n3551\t麵\n3552\t冼\n3553\t儀\n3554\t晉\n3555\t濤\n3556\t莓\n3557\t齊\n3558\t晦\n3559\t竣\n3560\t抖\n3561\tｗ\n3562\tキ\n3563\t墻\n3564\t媽\n3565\t敗\n3566\t淺\n3567\t礁\n3568\t荐\n3569\t估\n3570\t驳\n3571\t舱\n3572\t绰\n3573\t宦\n3574\t泵\n3575\t寮\n3576\t雌\n3577\t脐\n3578\t舊\n3579\t續\n3580\t弩\n3581\t羌\n3582\t拌\n3583\t瓣\n3584\t戟\n3585\t髓\n3586\t暑\n3587\t婶\n3588\t撕\n3589\t豁\n3590\t竿\n3591\t隙\n3592\t谓\n3593\t铖\n3594\t旌\n3595\t蝦\n3596\t秧\n3597\t或\n3598\t颢\n3599\t兑\n3600\t厥\n3601\t鳄\n3602\t暂\n3603\t汾\n3604\t钝\n3605\t杠\n3606\t買\n3607\t苒\n3608\t牆\n3609\t炊\n3610\t糠\n3611\t矾\n3612\t懂\n3613\t侗\n3614\t剛\n3615\t壇\n3616\t帳\n3617\t櫃\n3618\t毀\n3619\t湧\n3620\t捉\n3621\t練\n3622\t窖\n3623\t緑\n3624\t沽\n3625\t馋\n3626\t斥\n3627\t郵\n3628\t喇\n3629\t垛\n3630\t概\n3631\t们\n3632\t岂\n3633\t腎\n3634\t銳\n3635\t岷\n3636\t烙\n3637\t掠\n3638\t浜\n3639\t泸\n3640\t醬\n3641\t沱\n3642\t蔷\n3643\t皎\n3644\t榛\n3645\t檐\n3646\t閣\n3647\t抬\n3648\t顏\n3649\t橡\n3650\t镛\n3651\t塊\n3652\t盡\n3653\t壯\n3654\t靴\n3655\t亥\n3656\t酚\n3657\t窄\n3658\t肛\n3659\t亘\n3660\t糟\n3661\t烘\n3662\t貂\n3663\t講\n3664\t狠\n3665\t窥\n3666\t賭\n3667\t賀\n3668\t莞\n3669\t箕\n3670\t爺\n3671\t喘\n3672\t但\n3673\t咖\n3674\t織\n3675\tい\n3676\t彿\n3
677\t唤\n3678\t蕉\n3679\t僵\n3680\t熬\n3681\t妓\n3682\t踩\n3683\t铲\n3684\t匙\n3685\t撑\n3686\t弛\n3687\t耻\n3688\t丢\n3689\t堵\n3690\t膽\n3691\t厘\n3692\t辨\n3693\t瓢\n3694\t崴\n3695\t篱\n3696\t碾\n3697\t畔\n3698\t涝\n3699\t膚\n3700\t绛\n3701\t黏\n3702\t屑\n3703\t衝\n3704\t簧\n3705\t杞\n3706\t轲\n3707\t贲\n3708\t溝\n3709\t烷\n3710\t霧\n3711\t塵\n3712\t瘾\n3713\t颉\n3714\t凿\n3715\t彝\n3716\t诛\n3717\t訪\n3718\t鮮\n3719\t覺\n3720\t歲\n3721\t窟\n3722\t週\n3723\t苞\n3724\t濟\n3725\t叟\n3726\t爭\n3727\t椎\n3728\t療\n3729\t眾\n3730\t審\n3731\t拋\n3732\t棘\n3733\t诀\n3734\t鹃\n3735\t倦\n3736\t擦\n3737\t暢\n3738\t酬\n3739\t蠢\n3740\t聞\n3741\t囧\n3742\t從\n3743\t脈\n3744\t缆\n3745\t陋\n3746\t哪\n3747\t酿\n3748\t娆\n3749\t屍\n3750\t檬\n3751\t捧\n3752\t凛\n3753\t靶\n3754\t疣\n3755\t餘\n3756\t鹊\n3757\t陣\n3758\t昙\n3759\t栎\n3760\t鳖\n3761\t镶\n3762\t飄\n3763\t烫\n3764\t芜\n3765\t垦\n3766\t癣\n3767\t蟾\n3768\t萤\n3769\t寓\n3770\t診\n3771\t蚌\n3772\t霈\n3773\t诈\n3774\t負\n3775\t吼\n3776\t疹\n3777\t縫\n3778\t則\n3779\t鹽\n3780\t啊\n3781\t捣\n3782\t勘\n3783\t俯\n3784\t陡\n3785\t叮\n3786\t$\n3787\t饱\n3788\t寬\n3789\t帥\n3790\t漿\n3791\t掘\n3792\t棺\n3793\t汞\n3794\t钵\n3795\tこ\n3796\t绸\n3797\t括\n3798\t濂\n3799\t壞\n3800\t躲\n3801\t拦\n3802\t錫\n3803\t拟\n3804\t钠\n3805\t嘛\n3806\t趋\n3807\t遣\n3808\t谐\n3809\t墟\n3810\t喧\n3811\t榭\n3812\t閉\n3813\t筛\n3814\tｊ\n3815\t渴\n3816\t峨\n3817\t嬰\n3818\t巳\n3819\t梢\n3820\t漱\n3821\t疤\n3822\t祉\n3823\t矽\n3824\t痒\n3825\t咽\n3826\t邀\n3827\t缀\n3828\t庇\n3829\t虔\n3830\t盏\n3831\t羿\n3832\t抑\n3833\t叨\n3834\t弑\n3835\t唛\n3836\t侑\n3837\t賊\n3838\t稽\n3839\t黨\n3840\t妝\n3841\t谍\n3842\t蓁\n3843\tま\n3844\t蕃\n3845\t藜\n3846\t赘\n3847\t诞\n3848\t眷\n3849\t够\n3850\t岫\n3851\t釣\n3852\t喃\n3853\t樑\n3854\t钮\n3855\t鋪\n3856\t牡\n3857\t溴\n3858\t缕\n3859\t溺\n3860\t溟\n3861\t描\n3862\t渺\n3863\t藕\n3864\t胚\n3865\t刨\n3866\t獵\n3867\t琬\n3868\t寝\n3869\t稷\n3870\t缎\n3871\t锈\n3872\t需\n3873\t遍\n3874\t醛\n3875\t戬\n3876\t噬\n3877\t闰\n3878\t蔣\n3879\t協\n3880\t響\n3881\t顯\n3882\t飾\n3883\t厢\n3884\t钗\n3885\t毯\n3886\t询\n3887\t簪\n3888\t堅\n3889\t鼬\n3890\t貢\n3891\t遭\n3892\t肘\n3893\t燥\n3894\t砸\n3895\t趾\n3896\t豔\n3897\t蟒\n3898\t淨\n389
9\t廟\n3900\t唑\n3901\tｚ\n3902\t诠\n3903\t垭\n3904\t龜\n3905\t剥\n3906\t辦\n3907\t翱\n3908\t挨\n3909\t峽\n3910\t紗\n3911\t拘\n3912\t绢\n3913\t畴\n3914\t蔼\n3915\t隶\n3916\t溃\n3917\t濃\n3918\t碌\n3919\t宓\n3920\t趴\n3921\t浔\n3922\t搞\n3923\t挪\n3924\t楞\n3925\t邈\n3926\t虑\n3927\t捌\n3928\t舉\n3929\t嫔\n3930\t漓\n3931\t捻\n3932\t逵\n3933\t呢\n3934\t砾\n3935\t谬\n3936\t琥\n3937\t撮\n3938\t準\n3939\t嗜\n3940\t它\n3941\t議\n3942\t於\n3943\t執\n3944\t顔\n3945\t匣\n3946\t焘\n3947\t狭\n3948\t涡\n3949\t衔\n3950\t靚\n3951\t祠\n3952\t雉\n3953\t疼\n3954\t镖\n3955\t嚣\n3956\t骸\n3957\tん\n3958\t証\n3959\t恢\n3960\t凑\n3961\t丐\n3962\t貞\n3963\t蛹\n3964\t呵\n3965\t昼\n3966\t蛉\n3967\t翳\n3968\t匀\n3969\t侦\n3970\t設\n3971\t轧\n3972\t損\n3973\t盧\n3974\t叩\n3975\t這\n3976\t跡\n3977\t谕\n3978\t迴\n3979\t鳗\n3980\t炕\n3981\t珮\n3982\tカ\n3983\t咀\n3984\t搅\n3985\t矫\n3986\t矩\n3987\t箍\n3988\t渤\n3989\t狩\n3990\t苛\n3991\t劼\n3992\t濡\n3993\t慌\n3994\t勁\n3995\t腫\n3996\t般\n3997\t酌\n3998\t徕\n3999\t廓\n4000\t燎\n4001\t颇\n4002\t樽\n4003\t槎\n4004\t鑽\n4005\t摔\n4006\t诵\n4007\t槿\n4008\t琐\n4009\t塌\n4010\t锻\n4011\t願\n4012\t顧\n4013\t萎\n4014\tは\n4015\t膛\n4016\t祛\n4017\t檔\n4018\t蠡\n4019\t觸\n4020\t虬\n4021\t談\n4022\t喝\n4023\t娱\n4024\t噪\n4025\t胀\n4026\t褐\n4027\t疫\n4028\t札\n4029\t昉\n4030\t呱\n4031\t禪\n4032\t債\n4033\t屬\n4034\t佶\n4035\t垠\n4036\t貿\n4037\t葭\n4038\t齡\n4039\t萦\n4040\t蕤\n4041\t燚\n4042\t#\n4043\t劑\n4044\t彥\n4045\t棗\n4046\t紐\n4047\t浇\n4048\t汲\n4049\t臼\n4050\t咎\n4051\t絨\n4052\t裹\n4053\t茬\n4054\t厕\n4055\t傾\n4056\t釋\n4057\t秽\n4058\t颅\n4059\t蹦\n4060\t么\n4061\t嘟\n4062\t锣\n4063\t腻\n4064\t寐\n4065\t妲\n4066\t湃\n4067\t醜\n4068\t另\n4069\t泮\n4070\t幂\n4071\t獄\n4072\t滅\n4073\t玳\n4074\t氰\n4075\t鞘\n4076\t峭\n4077\t鹂\n4078\t嗅\n4079\tら\n4080\t瑙\n4081\t咳\n4082\t蝗\n4083\t瓯\n4084\t猷\n4085\t樾\n4086\t赎\n4087\t她\n4088\t朕\n4089\t淀\n4090\t頁\n4091\t飙\n4092\t羁\n4093\t镒\n4094\t喂\n4095\t袜\n4096\t钺\n4097\t扉\n4098\t曆\n4099\t櫻\n4100\t曳\n4101\t辕\n4102\t帧\n4103\t誤\n4104\t哄\n4105\t漳\n4106\t亓\n4107\t隅\n4108\t訴\n4109\t螨\n4110\t艮\n4111\t識\n4112\t適\n4113\t诏\n4114\t饵\n4115\t俨\n4116\t郦\n4117\t坳\n4118\t鵝\n4119\t礦\n4120\t褒\n4121\
t犇\n4122\t隘\n4123\t咯\n4124\t赴\n4125\t競\n4126\t個\n4127\t劃\n4128\t殼\n4129\t睛\n4130\t究\n4131\t兢\n4132\t緩\n4133\t纠\n4134\t惧\n4135\t践\n4136\t躬\n4137\t惯\n4138\t稠\n4139\t惩\n4140\t秤\n4141\t嚴\n4142\t茁\n4143\t濮\n4144\t亩\n4145\t憬\n4146\t撩\n4147\t赔\n4148\t渎\n4149\t镀\n4150\t汴\n4151\t婢\n4152\t菩\n4153\t鍾\n4154\t锰\n4155\t挠\n4156\t泱\n4157\t毗\n4158\t丅\n4159\t琮\n4160\t痧\n4161\t痣\n4162\t堕\n4163\t鄙\n4164\t搓\n4165\tな\n4166\t蕭\n4167\t赦\n4168\t耆\n4169\t稍\n4170\t險\n4171\t胭\n4172\t沢\n4173\t婬\n4174\t畈\n4175\t炖\n4176\t毋\n4177\t蜗\n4178\t煲\n4179\t铧\n4180\t並\n4181\t廚\n4182\t佈\n4183\t衙\n4184\t荧\n4185\t钥\n4186\t黯\n4187\t雳\n4188\t吨\n4189\t铬\n4190\t請\n4191\t鎏\n4192\t釉\n4193\t栽\n4194\t騎\n4195\t磚\n4196\t廢\n4197\t郢\n4198\t偃\n4199\t賞\n4200\t奪\n4201\t鬓\n4202\t鳍\n4203\t乏\n4204\t蹲\n4205\t盯\n4206\tー\n4207\tく\n4208\tし\n4209\tア\n4210\t寵\n4211\t悶\n4212\t構\n4213\t煉\n4214\t粿\n4215\t絶\n4216\t诫\n4217\t狙\n4218\t钾\n4219\t敵\n4220\t偿\n4221\t锄\n4222\t姫\n4223\t幡\n4224\t戳\n4225\t澹\n4226\t坯\n4227\t濯\n4228\t骈\n4229\t嬉\n4230\t砌\n4231\t囡\n4232\t峦\n4233\t漕\n4234\t闾\n4235\t镍\n4236\t罰\n4237\t肋\n4238\t遐\n4239\t荤\n4240\t窍\n4241\t绾\n4242\t怯\n4243\t携\n4244\t鹄\n4245\t戌\n4246\t凳\n4247\t蕩\n4248\t揉\n4249\t柘\n4250\t冗\n4251\t須\n4252\t蔽\n4253\t焜\n4254\t驯\n4255\t騙\n4256\t騷\n4257\t恳\n4258\t凈\n4259\t籁\n4260\t註\n4261\t傣\n4262\t凍\n4263\t霭\n4264\t爸\n4265\t謀\n4266\t酯\n4267\t渍\n4268\t駿\n4269\t绎\n4270\t粲\n4271\t衷\n4272\t葫\n4273\t鬆\n4274\t況\n4275\t掃\n4276\t撸\n4277\t呗\n4278\t碩\n4279\t诘\n4280\t贊\n4281\t坨\n4282\t芩\n4283\t垌\n4284\t茱\n4285\t塚\n4286\t洱\n4287\t齒\n4288\t嫚\n4289\t篆\n4290\t瑯\n4291\t贩\n4292\tき\n4293\t啓\n4294\t墊\n4295\t潛\n4296\t瀾\n4297\t饥\n4298\t笺\n4299\t轿\n4300\t糞\n4301\t範\n4302\t嘲\n4303\t啶\n4304\t繼\n4305\t捆\n4306\t拢\n4307\t脓\n4308\t渥\n4309\t谅\n4310\t迩\n4311\t烹\n4312\t瀑\n4313\t姥\n4314\t缦\n4315\t蛆\n4316\t毙\n4317\t腥\n4318\t痨\n4319\t喪\n4320\tに\n4321\t壤\n4322\t饲\n4323\t胄\n4324\t淚\n4325\t濱\n4326\t矶\n4327\t汰\n4328\tノ\n4329\t飲\n4330\t媳\n4331\t磬\n4332\t砺\n4333\t啼\n4334\t瘟\n4335\t扈\n4336\t祀\n4337\t頸\n4338\t蘆\n4339\t钨\n4340\t馳\n4341\t佣\n4342\t鬧\n4343\t舂
\n4344\t翩\n4345\t蝠\n4346\t挣\n4347\t誘\n4348\t蛰\n4349\t佚\n4350\t辙\n4351\t邁\n4352\t塗\n4353\t賬\n4354\t塬\n4355\t埭\n4356\t诰\n4357\t圻\n4358\t拗\n4359\t耽\n4360\t祿\n4361\t璠\n4362\t瓊\n4363\t珣\n4364\tた\n4365\t儲\n4366\t棄\n4367\t辑\n4368\t灸\n4369\t狡\n4370\t綿\n4371\t歼\n4372\t糧\n4373\t癸\n4374\t撫\n4375\t帷\n4376\t镰\n4377\t俩\n4378\t垄\n4379\t募\n4380\t嗔\n4381\t滥\n4382\t鏈\n4383\t僻\n4384\t馍\n4385\t娼\n4386\t撇\n4387\t崽\n4388\t蚂\n4389\t酪\n4390\t怿\n4391\t愫\n4392\t廈\n4393\t琏\n4394\t械\n4395\t些\n4396\t恤\n4397\t疝\n4398\t榄\n4399\t琚\n4400\tり\n4401\tリ\n4402\t妒\n4403\t杲\n4404\t楣\n4405\t槌\n4406\t槟\n4407\t孺\n4408\t桧\n4409\t桀\n4410\t牲\n4411\t戍\n4412\t幫\n4413\t旎\n4414\t铣\n4415\t躺\n4416\t剃\n4417\t锵\n4418\t呜\n4419\t嫌\n4420\t剔\n4421\t駕\n4422\t谎\n4423\t绚\n4424\t眩\n4425\t阉\n4426\t駐\n4427\t討\n4428\t驅\n4429\t腋\n4430\t痹\n4431\t冊\n4432\t饿\n4433\t磅\n4434\t乍\n4435\t毡\n4436\t盔\n4437\t簇\n4438\t殖\n4439\t説\n4440\t篁\n4441\t襲\n4442\t攒\n4443\t鮑\n4444\t哆\n4445\t遲\n4446\t遷\n4447\t禀\n4448\t賴\n4449\t邰\n4450\t軌\n4451\t奂\n4452\t倌\n4453\t荞\n4454\t苡\n4455\t苷\n4456\t圳\n4457\t莜\n4458\t荪\n4459\t菀\n4460\t軸\n4461\t羹\n4462\t爐\n4463\t確\n4464\t讓\n4465\t癬\n4466\t獲\n4467\t籃\n4468\t垟\n4469\t奮\n4470\t擺\n4471\t暈\n4472\t瀬\n4473\t蓟\n4474\t溅\n4475\t疥\n4476\t届\n4477\t綱\n4478\t烬\n4479\t嵐\n4480\t雇\n4481\t蹭\n4482\t俺\n4483\t敞\n4484\t砲\n4485\t涣\n4486\t阑\n4487\t聶\n4488\t蹇\n4489\t糯\n4490\t災\n4491\t淬\n4492\t骡\n4493\t吗\n4494\t疲\n4495\t錶\n4496\t狎\n4497\t漩\n4498\t泫\n4499\t泯\n4500\t擂\n4501\t鹫\n4502\t枳\n4503\t剩\n4504\t韫\n4505\t攘\n4506\t怂\n4507\t镕\n4508\t讼\n4509\t牝\n4510\t譯\n4511\t膘\n4512\t惶\n4513\t铵\n4514\t钿\n4515\t頔\n4516\t硐\n4517\t涎\n4518\t驮\n4519\t裆\n4520\t褶\n4521\t捍\n4522\t绑\n4523\t痈\n4524\t訓\n4525\t膀\n4526\t懸\n4527\t鴿\n4528\t兀\n4529\t貪\n4530\t壕\n4531\t隼\n4532\t澡\n4533\t躁\n4534\t秩\n4535\t蚝\n4536\t哼\n4537\t淤\n4538\t盂\n4539\t叽\n4540\t違\n4541\t遙\n4542\t欄\n4543\t诃\n4544\t郗\n4545\t劭\n4546\t偌\n4547\t倬\n4548\t阡\n4549\t苕\n4550\t谒\n4551\t莒\n4552\t埕\n4553\t輸\n4554\t葩\n4555\t蕨\n4556\t爛\n4557\t爲\n4558\t燦\n4559\t拽\n4560\t讚\n4561\t悼\n4562\t籠\n4563\tサ\n4564\t佔\n4565\t搶\n
4566\t曌\n4567\t紡\n4568\t拷\n4569\t緹\n4570\t嚼\n4571\t藉\n4572\t韭\n4573\t饺\n4574\t綫\n4575\t哺\n4576\t脖\n4577\t吵\n4578\tめ\n4579\tち\n4580\t痢\n4581\t嗟\n4582\t馈\n4583\t庾\n4584\t獾\n4585\t獐\n4586\t鈺\n4587\t蹬\n4588\t磕\n4589\t愣\n4590\t脹\n4591\t僚\n4592\t噜\n4593\t匿\n4594\t婊\n4595\t啤\n4596\t尻\n4597\t驷\n4598\t骧\n4599\t繪\n4600\t嗪\n4601\t赓\n4602\t滟\n4603\t鋁\n4604\t扮\n4605\t纾\n4606\t撬\n4607\t馃\n4608\t朽\n4609\t瘘\n4610\t嗓\n4611\t瑕\n4612\t啡\n4613\tと\n4614\t麝\n4615\t删\n4616\t汕\n4617\t胧\n4618\t際\n4619\t轼\n4620\t掰\n4621\t讽\n4622\t頌\n4623\t瘫\n4624\t镝\n4625\t颓\n4626\t涕\n4627\t舷\n4628\t慾\n4629\t憂\n4630\t癖\n4631\t酣\n4632\t鸳\n4633\t歹\n4634\t翡\n4635\t帜\n4636\t箴\n4637\t箬\n4638\t骤\n4639\t痔\n4640\t姻\n4641\t舆\n4642\t赃\n4643\t嘿\n4644\t觞\n4645\t遼\n4646\t唔\n4647\t唧\n4648\t桿\n4649\t孃\n4650\t倭\n4651\t偕\n4652\t芪\n4653\t躍\n4654\t縱\n4655\t癡\n4656\t萘\n4657\t堇\n4658\t輔\n4659\t攝\n4660\t據\n4661\t忿\n4662\t蓼\n4663\t辭\n4664\t碍\n4665\t慷\n4666\tか\n4667\tあ\n4668\t弊\n4669\t啞\n4670\t彎\n4671\t灘\n4672\t煩\n4673\t缉\n4674\t徑\n4675\t綺\n4676\t荚\n4677\t竭\n4678\t簿\n4679\t倡\n4680\t趁\n4681\t釜\n4682\t绷\n4683\tむ\n4684\t鄧\n4685\tモ\n4686\t垮\n4687\t宕\n4688\t澧\n4689\t撲\n4690\t鋆\n4691\t洄\n4692\t蘑\n4693\t樸\n4694\t惘\n4695\t该\n4696\t戮\n4697\t榔\n4698\t滦\n4699\tゆ\n4700\t滄\n4701\t娑\n4702\t闳\n4703\t嫖\n4704\t篷\n4705\t捏\n4706\t湟\n4707\t恼\n4708\t阖\n4709\t螟\n4710\t膺\n4711\t沦\n4712\t泌\n4713\t帼\n4714\t玑\n4715\t啃\n4716\t鹦\n4717\t鹞\n4718\t婿\n4719\t搁\n4720\t惰\n4721\t瑗\n4722\t筷\n4723\tナ\n4724\tる\n4725\t嘶\n4726\t枧\n4727\t杵\n4728\t肴\n4729\t芍\n4730\t暧\n4731\t朦\n4732\t绊\n4733\t枉\n4734\t挫\n4735\t奠\n4736\t桅\n4737\t潍\n4738\t辖\n4739\t暇\n4740\t戾\n4741\t龛\n4742\t锷\n4743\t嘻\n4744\tｑ\n4745\t矜\n4746\t焙\n4747\t瑚\n4748\t夯\n4749\tン\n4750\t蟠\n4751\t覽\n4752\t凋\n4753\t酰\n4754\t斬\n4755\t貫\n4756\t胰\n4757\t陨\n4758\t炙\n4759\t謎\n4760\t誌\n4761\t鯨\n4762\t鲈\n4763\t匾\n4764\t鳅\n4765\t拯\n4766\t僑\n4767\t哒\n4768\t恥\n4769\t璘\n4770\t谧\n4771\t讷\n4772\t佼\n4773\t佗\n4774\t畸\n4775\t篡\n4776\t窜\n4777\t涇\n4778\t芘\n4779\t弁\n4780\t壑\n4781\t谯\n4782\t茭\n4783\t冽\n4784\t賈\n4785\t菽\n4786\t燙\n4787\t础\n47
88\t揣\n4789\t鬃\n4790\t赚\n4791\t怠\n4792\t筏\n4793\t犊\n4794\t畢\n4795\tタ\n4796\t弢\n4797\t彌\n4798\t沒\n4799\t瀨\n4800\t綏\n4801\t窘\n4802\t悸\n4803\t綾\n4804\t枷\n4805\t捡\n4806\t颊\n4807\t疽\n4808\t沮\n4809\t辊\n4810\t箔\n4811\tコ\n4812\t幔\n4813\tチ\n4814\t粱\n4815\t鄰\n4816\t愧\n4817\t扳\n4818\tも\n4819\t鈣\n4820\t靛\n4821\t鍍\n4822\t柵\n4823\t艦\n4824\t讳\n4825\t涞\n4826\t浏\n4827\t恽\n4828\t棵\n4829\t峤\n4830\t啪\n4831\t虏\n4832\t嗒\n4833\t徵\n4834\t硼\n4835\t湫\n4836\t怅\n4837\t嫒\n4838\t畦\n4839\t鍵\n4840\t蔑\n4841\t翹\n4842\t逯\n4843\t渲\n4844\t繳\n4845\t鈞\n4846\t眀\n4847\t绶\n4848\t钎\n4849\t缙\n4850\t琊\n4851\t呛\n4852\t禿\n4853\t廳\n4854\t懶\n4855\t楔\n4856\t疳\n4857\t蠻\n4858\tラ\n4859\t咨\n4860\t璎\n4861\t擅\n4862\t鑑\n4863\t炅\n4864\t腌\n4865\t祟\n4866\t薑\n4867\t轸\n4868\t暾\n4869\t腮\n4870\t玦\n4871\t獻\n4872\tろ\n4873\tロ\n4874\t傢\n4875\t憩\n4876\t吠\n4877\t睢\n4878\t偽\n4879\t憋\n4880\t蠟\n4881\t钼\n4882\t捂\n4883\t倘\n4884\t韋\n4885\t掏\n4886\t瓮\n4887\t镯\n4888\t睇\n4889\t烃\n4890\t慘\n4891\t癞\n4892\t癫\n4893\t殉\n4894\t谚\n4895\t骇\n4896\t颌\n4897\t颍\n4898\t饕\n4899\t耙\n4900\tひ\n4901\t酩\n4902\t榨\n4903\t辐\n4904\t刈\n4905\t責\n4906\t逾\n4907\t绽\n4908\t蒯\n4909\t蚤\n4910\t鲫\n4911\t麸\n4912\t迂\n4913\t鲷\n4914\t臆\n4915\t贮\n4916\t佞\n4917\t瑀\n4918\t痳\n4919\t係\n4920\t吡\n4921\t咩\n4922\t呷\n4923\t啉\n4924\t擴\n4925\t擔\n4926\t衮\n4927\t僖\n4928\t嬴\n4929\t趕\n4930\t踫\n4931\t鹵\n4932\t邺\n4933\t癢\n4934\t輩\n4935\t莳\n4936\t萼\n4937\t蘅\n4938\t鳝\n4939\t鳐\n4940\t撰\n4941\t瑩\n4942\t瘋\n4943\t慨\n4944\t績\n4945\t珅\n4946\t哗\n4947\tえ\n4948\tシ\n4949\t墜\n4950\t幾\n4951\t憶\n4952\t擾\n4953\t煥\n4954\t紛\n4955\t桨\n4956\t絡\n4957\t仅\n4958\tス\n4959\t褂\n4960\t阐\n4961\t洺\n4962\t橱\n4963\t洩\n4964\t贬\n4965\t釘\n4966\t呕\n4967\t疟\n4968\tや\n4969\t洮\n4970\tっ\n4971\t氓\n4972\t殴\n4973\t迤\n4974\tユ\n4975\tて\n4976\t偲\n4977\t掐\n4978\t繩\n4979\t臟\n4980\t膨\n4981\t漉\n4982\t暹\n4983\t鉻\n4984\t妩\n4985\t鉛\n4986\t珥\n4987\t邕\n4988\t胁\n4989\t楸\n4990\t瓒\n4991\t叭\n4992\t戛\n4993\t驶\n4994\t炔\n4995\t階\n4996\t鑒\n4997\t缮\n4998\t腓\n4999\t耸\n5000\t腚\n5001\t閘\n5002\t桉\n5003\t恃\n5004\t楹\n5005\t橹\n5006\t蓑\n5007\t栀\n5008\t侶\n5009\t籌\n5010
\tね\n5011\t斓\n5012\t畲\n5013\t顫\n5014\t铳\n5015\t砥\n5016\t蜕\n5017\t锶\n5018\t祜\n5019\t铛\n5020\t唾\n5021\t嵇\n5022\t袂\n5023\t佯\n5024\t殃\n5025\t婳\n5026\t扼\n5027\t昨\n5028\t赭\n5029\t詠\n5030\t侄\n5031\t踝\n5032\t傍\n5033\t禺\n5034\t貧\n5035\t缶\n5036\t霾\n5037\t邯\n5038\t蜚\n5039\t翥\n5040\t掷\n5041\t罔\n5042\t蝽\n5043\t襪\n5044\t怎\n5045\t諸\n5046\t斛\n5047\t誼\n5048\t鲛\n5049\t媞\n5050\t漲\n5051\t吖\n5052\t叱\n5053\t譚\n5054\t譽\n5055\t漸\n5056\t鸮\n5057\t郅\n5058\t芗\n5059\t贏\n5060\t貸\n5061\t亵\n5062\t俎\n5063\t剎\n5064\t俘\n5065\t篙\n5066\t気\n5067\t荭\n5068\t莪\n5069\t萸\n5070\t蒽\n5071\tマ\n5072\t夼\n5073\t藓\n5074\t牽\n5075\t鱗\n5076\t繆\n5077\t钒\n5078\t珐\n5079\t穩\n5080\t脯\n5081\t珪\n5082\tさ\n5083\tじ\n5084\tけ\n5085\tエ\n5086\tク\n5087\t彊\n5088\t挌\n5089\t暉\n5090\t棟\n5091\t踞\n5092\t艰\n5093\t缄\n5094\t酵\n5095\t较\n5096\t糾\n5097\t糙\n5098\tお\n5099\tメ\n5100\t釀\n5101\t喔\n5102\t啾\n5103\t篓\n5104\t掳\n5105\t拧\n5106\t哦\n5107\t氫\n5108\tつ\n5109\t摹\n5110\t悖\n5111\t嗝\n5112\t沔\n5113\t與\n5114\t眯\n5115\t衢\n5116\t娉\n5117\t剖\n5118\t嫦\n5119\t嬷\n5120\t湮\n5121\t繫\n5122\t舖\n5123\t鈔\n5124\t醚\n5125\t庖\n5126\t馒\n5127\t潋\n5128\t逻\n5129\t聋\n5130\t纖\n5131\t潺\n5132\t遛\n5133\t滲\n5134\t绉\n5135\t绀\n5136\t磺\n5137\t菓\n5138\t顷\n5139\t玠\n5140\t淒\n5141\t挟\n5142\t痫\n5143\t鹬\n5144\t鹳\n5145\t閱\n5146\t偵\n5147\t胯\n5148\t璀\n5149\t娶\n5150\t甑\n5151\t辘\n5152\t魇\n5153\tル\n5154\t嶋\n5155\t榻\n5156\t杈\n5157\t昵\n5158\t黍\n5159\t塍\n5160\t丟\n5161\t恣\n5162\tれ\n5163\t袒\n5164\t挞\n5165\t锂\n5166\t旖\n5167\t铄\n5168\t掀\n5169\t砦\n5170\t舔\n5171\t燧\n5172\t稔\n5173\t漬\n5174\t蜒\n5175\t裾\n5176\t瀘\n5177\t暫\n5178\t嚎\n5179\t蚧\n5180\t匆\n5181\t掖\n5182\t铱\n5183\t詢\n5184\t擋\n5185\t燉\n5186\t壺\n5187\t販\n5188\t爻\n5189\t蜥\n5190\t翦\n5191\t仄\n5192\t螂\n5193\t砧\n5194\t厮\n5195\t粑\n5196\t匝\n5197\t吁\n5198\t豎\n5199\t蝴\n5200\t蛀\n5201\t剌\n5202\t歳\n5203\t遜\n5204\t咚\n5205\t渦\n5206\t讴\n5207\t谤\n5208\t抠\n5209\t僮\n5210\t俑\n5211\t廂\n5212\t撥\n5213\t芨\n5214\t诩\n5215\t芫\n5216\t巽\n5217\t苣\n5218\t茴\n5219\t荏\n5220\t苴\n5221\t賤\n5222\t鹹\n5223\t祕\n5224\t逮\n5225\t薏\n5226\t矗\n5227\tǐ\n5228\t禍\n5229\t瘡\n5230\t緻\n5231\t涪\n5232\t
唬\n5233\tイ\n5234\t钡\n5235\t雹\n5236\t們\n5237\t兇\n5238\t兌\n5239\t勛\n5240\t剝\n5241\t揮\n5242\t擼\n5243\t敘\n5244\t殤\n5245\t灑\n5246\t烜\n5247\t揪\n5248\t綜\n5249\t拣\n5250\t絞\n5251\t柬\n5252\t秸\n5253\t緒\n5254\t埂\n5255\t逛\n5256\t逞\n5257\t滁\n5258\t麽\n5259\t揍\n5260\t岘\n5261\t袄\n5262\t坷\n5263\t繞\n5264\t瞒\n5265\t聰\n5266\t髋\n5267\t屌\n5268\t颁\n5269\t啄\n5270\t傘\n5271\t疵\n5272\t嬅\n5273\t崂\n5274\t徙\n5275\t呐\n5276\t噻\n5277\t彗\n5278\t闱\n5279\t寥\n5280\t嚓\n5281\t潢\n5282\t瞄\n5283\t婺\n5284\t骜\n5285\t骠\n5286\t纨\n5287\t鈎\n5288\t嵬\n5289\t阆\n5290\t庠\n5291\t悯\n5292\t剁\n5293\t瞧\n5294\t缜\n5295\t酋\n5296\t癲\n5297\t叼\n5298\tバ\n5299\t疸\n5300\t楝\n5301\t闊\n5302\t搔\n5303\t瑷\n5304\tト\n5305\t戗\n5306\t陝\n5307\t娛\n5308\t柺\n5309\t蔥\n5310\t爰\n5311\t獒\n5312\t蠕\n5313\t杳\n5314\t脲\n5315\t閑\n5316\t孰\n5317\t薊\n5318\t橄\n5319\t褥\n5320\t胪\n5321\t腱\n5322\t仍\n5323\t膈\n5324\t赊\n5325\t竑\n5326\t刪\n5327\t孖\n5328\t擁\n5329\t坍\n5330\t壩\n5331\t捨\n5332\t锉\n5333\t跋\n5334\tハ\n5335\t熄\n5336\t沓\n5337\t湍\n5338\t惕\n5339\t焖\n5340\t钏\n5341\t钴\n5342\t馅\n5343\t発\n5344\t凪\n5345\t曬\n5346\t癜\n5347\t耦\n5348\t窈\n5349\t奄\n5350\t簾\n5351\t蠓\n5352\t螭\n5353\t臾\n5354\t吱\n5355\t鯊\n5356\t氛\n5357\t咋\n5358\t徹\n5359\t噩\n5360\t乜\n5361\t孬\n5362\t揖\n5363\t鼐\n5364\t醪\n5365\t撼\n5366\t蚰\n5367\t蛎\n5368\t鲟\n5369\t帚\n5370\t蔗\n5371\t厍\n5372\t鬱\n5373\t诣\n5374\t羯\n5375\t蜓\n5376\t盅\n5377\t誕\n5378\t蜻\n5379\t剡\n5380\t簌\n5381\t筵\n5382\t酊\n5383\t怔\n5384\t贿\n5385\tみ\n5386\t忒\n5387\t叻\n5388\t吒\n5389\t撷\n5390\t遞\n5391\t廁\n5392\t俚\n5393\t贇\n5394\t勖\n5395\t夔\n5396\t苋\n5397\t诤\n5398\t塾\n5399\t賠\n5400\t谲\n5401\t淵\n5402\t鼾\n5403\t莼\n5404\t輯\n5405\t菰\n5406\t滯\n5407\t薮\n5408\t揆\n5409\t辯\n5410\t髯\n5411\t瑠\n5412\t皑\n5413\t盎\n5414\t哎\n5415\t祷\n5416\tウ\n5417\t償\n5418\t厭\n5419\t嘆\n5420\t嚇\n5421\t嬿\n5422\t嶽\n5423\t憑\n5424\t憲\n5425\t攤\n5426\t桜\n5427\t檯\n5428\t渾\n5429\t湉\n5430\t澀\n5431\t綉\n5432\t綸\n5433\t緯\n5434\t疚\n5435\t倔\n5436\t笹\n5437\t硃\n5438\t瀉\n5439\t妨\n5440\tム\n5441\t栢\n5442\t猥\n5443\t膩\n5444\t悌\n5445\t鉆\n5446\t悚\n5447\t屆\n5448\t铆\n5449\t崮\n5450\t嗦\n5451\t箩\n5452\t屡\n5453\t饷\n5454\t涿\
n5455\t娲\n5456\t娓\n5457\t娈\n5458\t姊\n5459\t撈\n5460\t拈\n5461\t鎂\n5462\t讫\n5463\t録\n5464\t嵊\n5465\t猶\n5466\t吝\n5467\t霹\n5468\t溱\n5469\t羨\n5470\t琵\n5471\t恂\n5472\t琤\n5473\t疊\n5474\t凜\n5475\t堑\n5476\t珲\n5477\t甦\n5478\t梆\n5479\t筐\n5480\t穰\n5481\t瓠\n5482\t饒\n5483\t鸪\n5484\t疱\n5485\t鹉\n5486\t猩\n5487\t痂\n5488\t嘘\n5489\t瘀\n5490\t閨\n5491\t閩\n5492\t惦\n5493\t侩\n5494\t敕\n5495\t桠\n5496\t赉\n5497\t伺\n5498\t殓\n5499\t犟\n5500\t唆\n5501\t雛\n5502\t淄\n5503\t勍\n5504\tレ\n5505\t飕\n5506\t獭\n5507\t蘿\n5508\t讹\n5509\tワ\n5510\t飨\n5511\t頑\n5512\t趟\n5513\t侮\n5514\t蝕\n5515\t惋\n5516\t碛\n5517\t熵\n5518\t钤\n5519\t硒\n5520\t飏\n5521\t蟬\n5522\t睑\n5523\t稞\n5524\t盞\n5525\t擬\n5526\t勸\n5527\t擇\n5528\t駝\n5529\t窠\n5530\t耒\n5531\t裱\n5532\tず\n5533\t憾\n5534\t曈\n5535\t蜃\n5536\tヒ\n5537\t簸\n5538\t憎\n5539\t鰲\n5540\t敝\n5541\t謂\n5542\t柞\n5543\t醴\n5544\t蠹\n5545\t蚶\n5546\t翕\n5547\t雎\n5548\t雒\n5549\t跖\n5550\t啬\n5551\t誦\n5552\t铀\n5553\t蜷\n5554\t蹊\n5555\t蹼\n5556\t誇\n5557\t蜢\n5558\t跷\n5559\t謙\n5560\t咱\n5561\t伫\n5562\tミ\n5563\t呓\n5564\t诒\n5565\t倏\n5566\t鄱\n5567\t倜\n5568\t芾\n5569\t茆\n5570\t阪\n5571\t谄\n5572\t谙\n5573\t芡\n5574\t隗\n5575\t芎\n5576\t茯\n5577\t荇\n5578\t濾\n5579\t龐\n5580\t菘\n5581\t菟\n5582\t齋\n5583\t蕲\n5584\t掬\n5585\t扪\n5586\t轟\n5587\t燭\n5588\t捶\n5589\t幢\n5590\tǎ\n5591\t鳕\n5592\t皺\n5593\t縛\n5594\t扛\n5595\t穂\n5596\tゴ\n5597\tセ\n5598\tギ\n5599\t噹\n5600\t墳\n5601\t奬\n5602\t姍\n5603\t嫄\n5604\t慮\n5605\t様\n5606\t灝\n5607\t槛\n5608\t伎\n5609\t綁\n5610\t澗\n5611\t痉\n5612\t剿\n5613\t撅\n5614\t緋\n5615\t睫\n5616\t筍\n5617\t舶\n5618\t菠\n5619\t矇\n5620\t怖\n5621\t猖\n5622\tǔ\n5623\t郴\n5624\t椽\n5625\tオ\n5626\t暘\n5627\t獣\n5628\t羔\n5629\t庒\n5630\t掂\n5631\t鉀\n5632\t灞\n5633\t鍛\n5634\t颗\n5635\t麂\n5636\t浯\n5637\t鋅\n5638\t鋸\n5639\t寞\n5640\t併\n5641\t銜\n5642\t峒\n5643\t喙\n5644\t嗯\n5645\t忏\n5646\t滏\n5647\t繡\n5648\t沌\n5649\t臘\n5650\t沭\n5651\t阈\n5652\t姒\n5653\t苺\n5654\t滂\n5655\t淙\n5656\t汩\n5657\t媾\n5658\t艶\n5659\t嫱\n5660\t莆\n5661\t曝\n5662\t錐\n5663\t撂\n5664\t逄\n5665\t逑\n5666\t馏\n5667\t囿\n5668\t嘀\n5669\t弭\n5670\t啮\n5671\t皿\n5672\t泺\n5673\t纏\n5674\t噗\n5675\t歉\n5676\t玎\n5
677\t悄\n5678\t珙\n5679\t缬\n5680\t缭\n5681\t擠\n5682\t愷\n5683\t恍\n5684\t鸩\n5685\t餌\n5686\t鹑\n5687\t蠶\n5688\t疖\n5689\t瘕\n5690\t榈\n5691\t椤\n5692\t闇\n5693\t辫\n5694\t瑭\n5695\t氪\n5696\t榫\n5697\t昴\n5698\t昝\n5699\t拭\n5700\t殒\n5701\t腈\n5702\t枞\n5703\t枋\n5704\t隧\n5705\t腩\n5706\t妊\n5707\t蓆\n5708\t楮\n5709\t枸\n5710\t辇\n5711\t臊\n5712\t窮\n5713\t琯\n5714\t禛\n5715\t恙\n5716\tネ\n5717\t捅\n5718\t飓\n5719\t眺\n5720\t虧\n5721\t勵\n5722\t顛\n5723\t螞\n5724\t飽\n5725\t幌\n5726\t蟻\n5727\t搪\n5728\t砣\n5729\t镫\n5730\t晤\n5731\t蘊\n5732\t萄\n5733\t蘋\n5734\t碣\n5735\t頤\n5736\t诬\n5737\t镗\n5738\t梟\n5739\t瘿\n5740\t蚜\n5741\t衲\n5742\t聃\n5743\t馮\n5744\t駒\n5745\t颀\n5746\t蟆\n5747\t螽\n5748\t螈\n5749\t哟\n5750\t堯\n5751\t滘\n5752\t颞\n5753\t颚\n5754\t颛\n5755\t衄\n5756\t徳\n5757\t炘\n5758\t該\n5759\t詳\n5760\t囍\n5761\t孵\n5762\t鯉\n5763\t諜\n5764\t亟\n5765\t蛄\n5766\t蚺\n5767\t袅\n5768\t衾\n5769\t踵\n5770\t斟\n5771\t孛\n5772\t箧\n5773\t羟\n5774\t笏\n5775\t蛏\n5776\t跛\n5777\t鴉\n5778\t蛭\n5779\t鱿\n5780\t蹴\n5781\t仵\n5782\t暨\n5783\t蜈\n5784\t酐\n5785\t鲑\n5786\t髒\n5787\t篩\n5788\t觚\n5789\t鯛\n5790\t瀝\n5791\t摺\n5792\t哝\n5793\t呦\n5794\t喏\n5795\t哌\n5796\t咻\n5797\t瀟\n5798\t髻\n5799\t俣\n5800\t賺\n5801\t贈\n5802\t滬\n5803\t郄\n5804\t蹤\n5805\t墉\n5806\t俟\n5807\t傩\n5808\t偎\n5809\t凼\n5810\t荜\n5811\t陟\n5812\t贛\n5813\t隍\n5814\t邛\n5815\t垡\n5816\t荠\n5817\t摧\n5818\t萁\n5819\t莨\n5820\t蒌\n5821\t嶼\n5822\t稗\n5823\t掇\n5824\t蕈\n5825\t鳢\n5826\t鞣\n5827\t鞅\n5828\t瑋\n5829\t竊\n5830\t籤\n5831\t蛔\n5832\t猾\n5833\t粄\n5834\tが\n5835\tジ\n5836\t*\n5837\t伕\n5838\t厠\n5839\t嘯\n5840\t姮\n5841\t廬\n5842\t搾\n5843\t潑\n5844\t讥\n5845\t絳\n5846\t喚\n5847\t铰\n5848\t硷\n5849\t絢\n5850\tす\n5851\t搀\n5852\t掺\n5853\t硯\n5854\t毆\n5855\t濁\n5856\t峄\n5857\t幛\n5858\t哩\n5859\t喋\n5860\t啵\n5861\t婪\n5862\t烩\n5863\t猝\n5864\t迸\n5865\tヤ\n5866\t洹\n5867\t鋭\n5868\t撃\n5869\t拇\n5870\t膿\n5871\t臍\n5872\t鉤\n5873\t悻\n5874\t嗑\n5875\t嗖\n5876\t喑\n5877\t饬\n5878\t琶\n5879\t懵\n5880\t噫\n5881\t忡\n5882\t怵\n5883\t孀\n5884\t姘\n5885\t潦\n5886\t怆\n5887\t砰\n5888\t蔫\n5889\t藐\n5890\t乒\n5891\t嫫\n5892\t骓\n5893\t孢\n5894\t纡\n5895\t孪\n5896\t沆\n5897\t泔\n5898\t錘\n589
9\t怏\n5900\t庹\n5901\t抡\n5902\t銹\n5903\t巅\n5904\t恸\n5905\t遒\n5906\t遨\n5907\t狞\n5908\t淏\n5909\t癱\n5910\t绡\n5911\t纭\n5912\t扦\n5913\t玢\n5914\t缢\n5915\t缥\n5916\t珧\n5917\t躯\n5918\t畿\n5919\t鸫\n5920\t鸱\n5921\t鸨\n5922\t樞\n5923\t懈\n5924\t衅\n5925\t鹗\n5926\t惺\n5927\t餓\n5928\t蠱\n5929\t痱\n5930\t匈\n5931\t榉\n5932\t楦\n5933\tど\n5934\tよ\n5935\t龋\n5936\t戢\n5937\t笆\n5938\t雫\n5939\t隴\n5940\t擘\n5941\t杓\n5942\t牍\n5943\t嚷\n5944\t樯\n5945\t砷\n5946\t轭\n5947\t栉\n5948\t觑\n5949\t闖\n5950\t柩\n5951\t腭\n5952\t捎\n5953\t樨\n5954\t枰\n5955\t鑄\n5956\t閥\n5957\t滷\n5958\t焗\n5959\t嘔\n5960\t蛻\n5961\t胳\n5962\t勳\n5963\t歙\n5964\t蘚\n5965\t瞰\n5966\t螢\n5967\tわ\n5968\t蠔\n5969\t斫\n5970\t砭\n5971\t旃\n5972\t钇\n5973\t褪\n5974\t烊\n5975\t淌\n5976\t铍\n5977\t铐\n5978\t鸵\n5979\t熨\n5980\t铤\n5981\t铢\n5982\t镔\n5983\t顆\n5984\t癒\n5985\t僱\n5986\t媗\n5987\t琇\n5988\t嘗\n5989\t竦\n5990\t癀\n5991\t秆\n5992\t衿\n5993\t竜\n5994\t螃\n5995\t蟮\n5996\t罂\n5997\t螳\n5998\t傭\n5999\t夠\n6000\t蝼\n6001\t驕\n6002\t噎\n6003\tぴ\n6004\t侖\n6005\t訾\n6006\t嬛\n6007\t謠\n6008\t蜘\n6009\t酢\n6010\t趸\n6011\t醍\n6012\tフ\n6013\t汎\n6014\t匕\n6015\t氐\n6016\t蚓\n6017\t蚬\n6018\t鲢\n6019\t諧\n6020\t蚴\n6021\t訣\n6022\t綦\n6023\t謊\n6024\t鳩\n6025\t驢\n6026\t蛳\n6027\t窒\n6028\t瘴\n6029\t笳\n6030\t鲵\n6031\t嘱\n6032\t貘\n6033\t睾\n6034\t佤\n6035\t詐\n6036\t篾\n6037\t蛸\n6038\t貔\n6039\t簋\n6040\t窺\n6041\t卻\n6042\t唏\n6043\t咧\n6044\t慣\n6045\t歎\n6046\t烔\n6047\t鷺\n6048\tべ\n6049\t贅\n6050\t刍\n6051\t蹟\n6052\t黒\n6053\t艽\n6054\t堀\n6055\t鷄\n6056\t垅\n6057\t勰\n6058\t坭\n6059\t谔\n6060\t凫\n6061\t賜\n6062\t谠\n6063\t俸\n6064\t垓\n6065\t黴\n6066\t邴\n6067\t圪\n6068\t賦\n6069\t荥\n6070\t剋\n6071\t僕\n6072\t陛\n6073\t較\n6074\t莅\n6075\t荨\n6076\t茛\n6077\t菖\n6078\t轄\n6079\t薹\n6080\t捺\n6081\t骰\n6082\t掸\n6083\t禎\n6084\t々\n6085\t腑\n6086\t竅\n6087\t玙\n6088\t玕\n6089\tご\n6090\tう\n6091\tせ\n6092\tぎ\n6093\tグ\n6094\t倖\n6095\t厲\n6096\t唸\n6097\t姪\n6098\t姉\n6099\t寢\n6100\t崟\n6101\t悽\n6102\t柊\n6103\t棧\n6104\t殯\n6105\t湊\n6106\t湜\n6107\t潰\n6108\t骷\n6109\t紳\n6110\tソ\n6111\t粳\n6112\t紹\n6113\t綢\n6114\t綴\n6115\t叢\n6116\t洸\n6117\t膊\n6118\t惭\n6119\t豺\n6120\t姵\n6121\
t躇\n6122\t癮\n6123\t溼\n6124\t岬\n6125\t釆\n6126\t鬣\n6127\t啜\n6128\t喱\n6129\t喽\n6130\tだ\n6131\tダ\n6132\t麾\n6133\t猗\n6134\t搂\n6135\t艙\n6136\t悒\n6137\t愕\n6138\t懊\n6139\t睹\n6140\t脣\n6141\t慵\n6142\t悴\n6143\t懦\n6144\t涑\n6145\t晾\n6146\t噌\n6147\t噤\n6148\t忖\n6149\t饴\n6150\t馀\n6151\t饽\n6152\t遽\n6153\t邃\n6154\t迥\n6155\t淅\n6156\t闩\n6157\t肅\n6158\t嘈\n6159\t鎬\n6160\t苾\n6161\t嗳\n6162\t迳\n6163\t汨\n6164\t闼\n6165\t媪\n6166\t粕\n6167\t骝\n6168\t嬲\n6169\t孳\n6170\t辔\n6171\t挛\n6172\t狹\n6173\t逖\n6174\t阊\n6175\t嶂\n6176\t帏\n6177\t釵\n6178\t罵\n6179\t鄒\n6180\t嘹\n6181\t恻\n6182\t阗\n6183\t醃\n6184\t沚\n6185\t诅\n6186\t佺\n6187\t曠\n6188\t绗\n6189\t绂\n6190\t谴\n6191\t菈\n6192\t缌\n6193\t缗\n6194\t缑\n6195\t缟\n6196\t鏢\n6197\t荳\n6198\t嬸\n6199\t衹\n6200\t衆\n6201\t衎\n6202\t鸷\n6203\t痿\n6204\t戦\n6205\t椋\n6206\t瞪\n6207\tド\n6208\t蒨\n6209\t煽\n6210\t苫\n6211\t啥\n6212\t鑲\n6213\t吶\n6214\t瓤\n6215\t榷\n6216\t戡\n6217\t闌\n6218\t隸\n6219\t氙\n6220\tニ\n6221\t吮\n6222\t藴\n6223\t榧\n6224\t虢\n6225\t椴\n6226\t瘸\n6227\t慑\n6228\t栲\n6229\t肓\n6230\t隻\n6231\t蔆\n6232\t殡\n6233\t槲\n6234\t晌\n6235\t轱\n6236\t桁\n6237\t杼\n6238\t蔵\n6239\t觐\n6240\t扔\n6241\t桷\n6242\t牯\n6243\t牒\n6244\t胗\n6245\t艘\n6246\t蔬\n6247\t鳃\n6248\t棂\n6249\t闢\n6250\t棹\n6251\t贽\n6252\t膻\n6253\t瀏\n6254\t僅\n6255\t昳\n6256\t漣\n6257\t婭\n6258\t愆\n6259\t恚\n6260\t虓\n6261\t黝\n6262\t铟\n6263\t蝸\n6264\t黠\n6265\t秣\n6266\t飼\n6267\t餃\n6268\t罹\n6269\t磴\n6270\t砻\n6271\t锑\n6272\t頰\n6273\t锢\n6274\t礴\n6275\t頒\n6276\t煨\n6277\t绦\n6278\t焓\n6279\t頜\n6280\t砗\n6281\t碓\n6282\t眦\n6283\t碇\n6284\t迢\n6285\t镉\n6286\t秭\n6287\t镞\n6288\t誊\n6289\t钯\n6290\t睨\n6291\t欽\n6292\t鱧\n6293\t礙\n6294\t玪\n6295\t瘪\n6296\t餮\n6297\t衽\n6298\t唁\n6299\t衩\n6300\t袢\n6301\t耜\n6302\t鸯\n6303\t疡\n6304\t馭\n6305\t峯\n6306\tズ\n6307\t氦\n6308\tび\n6309\t踊\n6310\t虻\n6311\t颦\n6312\t颏\n6313\t颔\n6314\t滓\n6315\t遏\n6316\t濒\n6317\tピ\n6318\t鲎\n6319\t龈\n6320\t霎\n6321\t醌\n6322\t誡\n6323\t伧\n6324\t馗\n6325\t廿\n6326\t蚣\n6327\t蹙\n6328\t虺\n6329\t笈\n6330\t蜞\n6331\t裟\n6332\t剽\n6333\t蚱\n6334\t築\n6335\t褻\n6336\t蛐\n6337\t鲳\n6338\t鲂\n6339\t菏\n6340\t糸\n6341\t羧\n6342\t仉\n6343\t笪
\n6344\t繇\n6345\t靥\n6346\t赳\n6347\t鲅\n6348\t粼\n6349\t糁\n6350\t粽\n6351\t鲶\n6352\t稣\n6353\t伢\n6354\t踹\n6355\t鰐\n6356\t蝙\n6357\t螯\n6358\t糍\n6359\t佧\n6360\t鰻\n6361\t淩\n6362\t濺\n6363\t弒\n6364\t楽\n6365\t嬤\n6366\t呔\n6367\t卟\n6368\t擢\n6369\t哏\n6370\t哧\n6371\t呤\n6372\t咄\n6373\t咛\n6374\t璽\n6375\t盪\n6376\t囤\n6377\t讪\n6378\t诳\n6379\t诜\n6380\t岿\n6381\t鵡\n6382\t俦\n6383\t鹼\n6384\t麩\n6385\t踐\n6386\t郇\n6387\t埙\n6388\t郯\n6389\tボ\n6390\t茏\n6391\t艿\n6392\t俳\n6393\t阱\n6394\t侏\n6395\t俾\n6396\t茚\n6397\t茕\n6398\t偈\n6399\t苌\n6400\t荑\n6401\t貶\n6402\t脔\n6403\t壅\n6404\t鷓\n6405\t谶\n6406\t鷗\n6407\t邳\n6408\t羸\n6409\t垸\n6410\t苜\n6411\t鸚\n6412\t佥\n6413\t荦\n6414\t苻\n6415\t搗\n6416\t鱔\n6417\t穀\n6418\t龑\n6419\t荽\n6420\t輿\n6421\t葶\n6422\t蓍\n6423\t蓦\n6424\t菅\n6425\t蓥\n6426\t憐\n6427\t婠\n6428\t蘖\n6429\t轎\n6430\t抻\n6431\t掾\n6432\t捋\n6433\t辻\n6434\t鱷\n6435\t鱘\n6436\t谆\n6437\t瑢\n6438\t蹈\n6439\t祤\n6440\t瘉\n6441\t縷\n6442\t繃\n6443\t窅\n6444\t竇\n6445\t玘\n6446\t玗\n6447\t粧\n6448\t秾\n6449\t┳\n6450\t畝\n6451\tざ\n6452\tザ\n6453\tケ\n6454\t摈\n6455\t伝\n6456\t嚒\n6457\t墮\n6458\t妺\n6459\t幀\n6460\t廯\n6461\t彙\n6462\t摯\n6463\t枊\n6464\t櫥\n6465\t檸\n6466\t汙\n6467\t潯\n6468\t煒\n6469\t煖\n6470\tそ\n6471\t骶\n6472\t緝\n6473\t締\n6474\t緬\n6475\t紺\n6476\t厩\n6477\t経\n6478\t鞑\n6479\tげ\n6480\t澆\n6481\t谗\n6482\t掣\n6483\t踌\n6484\t侈\n6485\t烝\n6486\t尓\n6487\t曖\n6488\t翛\n6489\t圜\n6490\t嘧\n6491\t喟\n6492\t喹\n6493\t嗷\n6494\t唰\n6495\t垃\n6496\t纜\n6497\t诲\n6498\t樺\n6499\t甯\n6500\t泞\n6501\tт\n6502\t婁\n6503\t浍\n6504\t浠\n6505\t浈\n6506\t淖\n6507\tс\n6508\t愦\n6509\t惆\n6510\t憧\n6511\t惴\n6512\t悭\n6513\t銑\n6514\t髅\n6515\t聾\n6516\t沤\n6517\t憔\n6518\t涔\n6519\t洳\n6520\t溧\n6521\t汜\n6522\t汊\n6523\t崆\n6524\t溏\n6525\t譬\n6526\t彘\n6527\t逅\n6528\t氖\n6529\t渌\n6530\t驸\n6531\t驽\n6532\t沏\n6533\t骀\n6534\t骟\n6535\t嬗\n6536\t苪\n6537\t尜\n6538\t纣\n6539\t鉬\n6540\t鈍\n6541\t犸\n6542\t嗤\n6543\t囔\n6544\t囝\n6545\t馔\n6546\t逋\n6547\t屐\n6548\t孱\n6549\t銲\n6550\t涮\n6551\t鉑\n6552\t溲\n6553\t狍\n6554\t嘤\n6555\t庥\n6556\t砒\n6557\t潴\n6558\t湔\n6559\t抿\n6560\t阏\n6561\t罷\n6562\t狷\n6563\t帑\n6564\t恹\n6565\t妁\n
6566\t潸\n6567\t澌\n6568\t馁\n6569\t錠\n6570\t鄖\n6571\t鋤\n6572\t徂\n6573\t窿\n6574\t廪\n6575\t妣\n6576\t奀\n6577\t岀\n6578\t绺\n6579\t髑\n6580\tで\n6581\t缃\n6582\t顼\n6583\t莖\n6584\t泅\n6585\t荘\n6586\t莙\n6587\t鸶\n6588\t皈\n6589\t鸲\n6590\t喰\n6591\t疴\n6592\t鹈\n6593\t痤\n6594\t鹧\n6595\t瘊\n6596\t汹\n6597\t疔\n6598\t饃\n6599\t濫\n6600\t榀\n6601\t楂\n6602\t楫\n6603\t閲\n6604\t閻\n6605\t磋\n6606\t杌\n6607\t璁\n6608\t瑁\n6609\t冴\n6610\t掙\n6611\t氷\n6612\t辍\n6613\t闿\n6614\t蕪\n6615\t纂\n6616\t毽\n6617\t氅\n6618\t氘\n6619\t槁\n6620\t枘\n6621\t薬\n6622\t肼\n6623\t桄\n6624\t鐲\n6625\t薈\n6626\t栊\n6627\t墒\n6628\t觇\n6629\t闕\n6630\t觎\n6631\t薔\n6632\t橐\n6633\t暝\n6634\t胍\n6635\t嗽\n6636\t胫\n6637\t辂\n6638\t犍\n6639\t挈\n6640\t膣\n6641\t檩\n6642\t噁\n6643\t濬\n6644\t唄\n6645\t琲\n6646\t痠\n6647\t歩\n6648\t黜\n6649\t悫\n6650\t碜\n6651\t矸\n6652\t欖\n6653\t殳\n6654\t旄\n6655\t欹\n6656\t毂\n6657\t诽\n6658\t兲\n6659\t虜\n6660\t咁\n6661\t瞽\n6662\t睽\n6663\t撐\n6664\t澪\n6665\t碉\n6666\t囷\n6667\t飢\n6668\t锘\n6669\t蠅\n6670\t蠍\n6671\t磲\n6672\t顱\n6673\t屉\n6674\t紊\n6675\t韌\n6676\t眄\n6677\t盹\n6678\t镬\n6679\t镂\n6680\t颙\n6681\t煅\n6682\t斡\n6683\t钫\n6684\t秕\n6685\t秫\n6686\t哮\n6687\t睐\n6688\t钲\n6689\t睚\n6690\t瀕\n6691\t駛\n6692\tぱ\n6693\t駁\n6694\t駄\n6695\t嘜\n6696\t満\n6697\t蟥\n6698\t簟\n6699\t吭\n6700\t吩\n6701\t雩\n6702\t霰\n6703\t鰱\n6704\tぶ\n6705\tブ\n6706\t謹\n6707\t戸\n6708\t醮\n6709\t醅\n6710\t蹩\n6711\tふ\n6712\t澱\n6713\t铡\n6714\t諒\n6715\t卅\n6716\t囟\n6717\t貍\n6718\t鼋\n6719\t鼍\n6720\t罄\n6721\t舐\n6722\t蝈\n6723\t鲣\n6724\t鬲\n6725\t乩\n6726\t笄\n6727\t蜱\n6728\t翮\n6729\t郧\n6730\t笕\n6731\t蜩\n6732\t蛩\n6733\t鲩\n6734\t錾\n6735\t蹶\n6736\t騁\n6737\t箜\n6738\t鲮\n6739\t跆\n6740\t仨\n6741\t赝\n6742\t豊\n6743\t匮\n6744\t涸\n6745\t笥\n6746\t粢\n6747\t赧\n6748\t瞩\n6749\t跤\n6750\t睁\n6751\t伉\n6752\t襯\n6753\t诌\n6754\t筚\n6755\t筌\n6756\t騏\n6757\t豉\n6758\t糗\n6759\t剀\n6760\t瀞\n6761\t嘢\n6762\t呋\n6763\t邏\n6764\t吲\n6765\t咂\n6766\t唠\n6767\t吆\n6768\t哔\n6769\t啕\n6770\t甌\n6771\t讦\n6772\t诟\n6773\t殆\n6774\t鼯\n6775\t侪\n6776\tほ\n6777\t郓\n6778\t诶\n6779\t谀\n6780\t倮\n6781\t黉\n6782\t黙\n6783\t坻\n6784\t兖\n6785\t莛\n6786\t苄\n6787\t貳\n67
88\t贖\n6789\t陬\n6790\t谖\n6791\t偻\n6792\t兕\n6793\t傥\n6794\t畚\n6795\t鶯\n6796\t隈\n6797\t谥\n6798\t谪\n6799\t鵲\n6800\t僭\n6801\t赑\n6802\t谮\n6803\t嘩\n6804\t畑\n6805\t攜\n6806\t癥\n6807\t価\n6808\t莠\n6809\t荩\n6810\t萜\n6811\t軼\n6812\t媄\n6813\t薨\n6814\t薤\n6815\t蕖\n6816\t藁\n6817\t迖\n6818\t藿\n6819\t蘼\n6820\t奁\n6821\t揄\n6822\t尬\n6823\t拶\n6824\t燴\n6825\t狛\n6826\t磡\n6827\t磧\n6828\t磯\n6829\tǒ\n6830\t鳔\n6831\t鳟\n6832\t鳏\n6833\t鳎\n6834\t鲀\n6835\t鲹\n6836\t瑣\n6837\t唉\n6838\t皜\n6839\t皞\n6840\t惮\n6841\t郸\n6842\t祘\n6843\t揩\n6844\t繚\n6845\t袱\n6846\t珝\n6847\t珰\n6848\t禱\n6849\t畬\n6850\tぐ\n6851\t睏\n6852\t乚\n6853\t倓\n6854\t倞\n6855\t僞\n6856\t儷\n6857\t儘\n6858\t啫\n6859\t嚮\n6860\t噯\n6861\t埇\n6862\t埗\n6863\t垵\n6864\t塢\n6865\t奭\n6866\t妠\n6867\t帰\n6868\t恵\n6869\t憤\n6870\t挾\n6871\t摳\n6872\t攪\n6873\t暦\n6874\t暐\n6875\t柈\n6876\t枂\n6877\t棲\n6878\t棨\n6879\t樁\n6880\t槓\n6881\t檳\n6882\t毐\n6883\t洑\n6884\t湲\n6885\t潁\n6886\t瀆\n6887\t讣\n6888\tゼ\n6889\t嫉\n6890\t絵\n6891\t饯\n6892\tゾ\n6893\t壘\n6894\t絃\n6895\t抉\n6896\t絆\n6897\t嫻\n6898\t箏\n6899\t箓\n6900\t剐\n6901\t徊\n6902\t槻\n6903\t碴\n6904\t搽\n6905\t诧\n6906\t伋\n6907\t矯\n6908\t矻\n6909\t瞅\n6910\t堿\n6911\t啰\n6912\t瘁\n6913\t岽\n6914\t嶙\n6915\t岌\n6916\t岜\n6917\t釗\n6918\t啻\n6919\t嗫\n6920\t啷\n6921\t斃\n6922\t猬\n6923\t夥\n6924\t舛\n6925\t猊\n6926\t鈦\n6927\t髀\n6928\t跺\n6929\t涘\n6930\t遄\n6931\t澶\n6932\t浥\n6933\t炁\n6934\t浞\n6935\t銨\n6936\t洧\n6937\t髂\n6938\tツ\n6939\t臑\n6940\t泖\n6941\t慊\n6942\tッ\n6943\t娒\n6944\t辆\n6945\t嗉\n6946\t谰\n6947\t峁\n6948\t夤\n6949\t岵\n6950\t獠\n6951\t獬\n6952\t愠\n6953\t崃\n6954\t拎\n6955\t崤\n6956\t忉\n6957\t怃\n6958\t遴\n6959\t迓\n6960\t娩\n6961\t羈\n6962\t嗲\n6963\t馇\n6964\t囵\n6965\t庑\n6966\t漯\n6967\t麈\n6968\t挎\n6969\t屾\n6970\t涙\n6971\t妫\n6972\t髌\n6973\tテ\n6974\t徠\n6975\t芣\n6976\t茀\n6977\t鄀\n6978\t嚅\n6979\t忤\n6980\t潆\n6981\t瞥\n6982\t艱\n6983\t嫘\n6984\t骘\n6985\t苨\n6986\t翯\n6987\t犴\n6988\t馓\n6989\t怙\n6990\t逦\n6991\t沩\n6992\t囫\n6993\t怦\n6994\t羼\n6995\t嵋\n6996\t嵴\n6997\t繭\n6998\t囹\n6999\t圉\n7000\t鈕\n7001\t怛\n7002\t乓\n7003\t咆\n7004\t阒\n7005\t鉗\n7006\t徇\n7007\t鉚\n7008\t嶷\n7009\t豳\n7010
\t咙\n7011\t您\n7012\t彷\n7013\t妪\n7014\t漭\n7015\t噢\n7016\t戕\n7017\t鉅\n7018\t鰾\n7019\t旵\n7020\t麋\n7021\t绱\n7022\t纰\n7023\t鐐\n7024\t莀\n7025\t菂\n7026\t橇\n7027\t锹\n7028\t缡\n7029\t鏘\n7030\t鏜\n7031\t鏽\n7032\t甾\n7033\t哾\n7034\t昫\n7035\t饑\n7036\tば\n7037\t鸺\n7038\t鹁\n7039\t鹌\n7040\t鹩\n7041\t鹨\n7042\t餡\n7043\t疠\n7044\t橢\n7045\t鏖\n7046\t撻\n7047\t閪\n7048\t榘\n7049\t椹\n7050\t魃\n7051\t囪\n7052\t鑵\n7053\t鑼\n7054\t锜\n7055\t溉\n7056\t痊\n7057\t颧\n7058\t葷\n7059\t応\n7060\t旮\n7061\t辚\n7062\t陞\n7063\t蕗\n7064\t忞\n7065\t膑\n7066\t胱\n7067\t氚\n7068\t氲\n7069\t牖\n7070\t霂\n7071\t霑\n7072\t魉\n7073\t瓴\n7074\t殁\n7075\t赡\n7076\t桎\n7077\t赈\n7078\t肱\n7079\t脘\n7080\t槠\n7081\t肫\n7082\t閏\n7083\t菴\n7084\t桤\n7085\t枨\n7086\t槭\n7087\t樗\n7088\t桕\n7089\t觌\n7090\t腴\n7091\t樘\n7092\t雑\n7093\t闘\n7094\t隠\n7095\t雖\n7096\t萵\n7097\t蕁\n7098\t橛\n7099\t轵\n7100\t栌\n7101\t纫\n7102\t桴\n7103\t桫\n7104\t柝\n7105\t朐\n7106\t薙\n7107\t橼\n7108\t甥\n7109\t辄\n7110\t脍\n7111\t蕎\n7112\t甪\n7113\t単\n7114\t実\n7115\t昰\n7116\t窯\n7117\t旼\n7118\t沨\n7119\t岺\n7120\t濰\n7121\t塱\n7122\t汭\n7123\t疙\n7124\t婍\n7125\t戆\n7126\t怼\n7127\t砜\n7128\t砀\n7129\t頃\n7130\t魍\n7131\t懲\n7132\t戩\n7133\t撿\n7134\t齑\n7135\t熳\n7136\t鞏\n7137\t囂\n7138\t虤\n7139\t滎\n7140\t瞑\n7141\t钭\n7142\t畎\n7143\t畋\n7144\t顎\n7145\t魑\n7146\t戯\n7147\t铯\n7148\t铫\n7149\t檄\n7150\t蠄\n7151\t旒\n7152\t锛\n7153\t砟\n7154\t酞\n7155\t炝\n7156\t炻\n7157\t钹\n7158\t罾\n7159\t盥\n7160\t铼\n7161\t锪\n7162\t戽\n7163\t嚏\n7164\t硇\n7165\t黻\n7166\t黼\n7167\t铊\n7168\t铌\n7169\t镪\n7170\t锸\n7171\t頗\n7172\t盱\n7173\t铑\n7174\t钕\n7175\t镱\n7176\t飆\n7177\t腆\n7178\t祢\n7179\t祧\n7180\t詈\n7181\t铗\n7182\t镏\n7183\t颯\n7184\t蝨\n7185\t禳\n7186\t钶\n7187\t淆\n7188\t牺\n7189\t恓\n7190\t玨\n7191\t鱈\n7192\t攏\n7193\t嘚\n7194\t黢\n7195\tя\n7196\t褡\n7197\t窨\n7198\t窕\n7199\t駱\n7200\t囱\n7201\t襦\n7202\t裥\n7203\t讶\n7204\t耋\n7205\t耵\n7206\t裨\n7207\t聒\n7208\t褙\n7209\t褓\n7210\t馴\n7211\t慜\n7212\t浐\n7213\t蟀\n7214\t髙\n7215\tビ\n7216\t売\n7217\tを\n7218\t勻\n7219\t蚩\n7220\t蚨\n7221\t驍\n7222\t舀\n7223\t覓\n7224\t黥\n7225\t籀\n7226\t臬\n7227\t魟\n7228\t詭\n7229\t岞\n7230\t霪\n7231\t隹\n7232\t
龇\n7233\t髦\n7234\tе\n7235\t愔\n7236\t懼\n7237\t謡\n7238\t貅\n7239\t醺\n7240\t酴\n7241\t髡\n7242\t崭\n7243\t鴣\n7244\t鴦\n7245\tへ\n7246\tヘ\n7247\t踅\n7248\t蠊\n7249\t龉\n7250\t醭\n7251\t驛\n7252\t筲\n7253\t襞\n7254\t鯽\n7255\t踽\n7256\t锗\n7257\t跄\n7258\t蹉\n7259\t芈\n7260\t蹑\n7261\t跗\n7262\t跚\n7263\t麴\n7264\t鮀\n7265\t誅\n7266\t仞\n7267\t驟\n7268\t鬍\n7269\t鲭\n7270\t蹰\n7271\t跎\n7272\t仃\n7273\t蝾\n7274\t酝\n7275\t読\n7276\t跸\n7277\t叵\n7278\t驤\n7279\t髄\n7280\t貉\n7281\t跹\n7282\t蠛\n7283\t篝\n7284\t篪\n7285\t筘\n7286\t蝮\n7287\t蛴\n7288\t蝤\n7289\t蜇\n7290\t龊\n7291\t鲋\n7292\t鲽\n7293\t鮫\n7294\t蘸\n7295\t跻\n7296\t豸\n7297\t踉\n7298\t踟\n7299\t狰\n7300\t赜\n7301\t刎\n7302\t驪\n7303\t鬚\n7304\t鲞\n7305\t骉\n7306\t喳\n7307\t篤\n7308\t訶\n7309\t髖\n7310\t蜉\n7311\t蹀\n7312\t佝\n7313\t乸\n7314\t沺\n7315\t琍\n7316\t琎\n7317\t巖\n7318\t禦\n7319\t╯\n7320\t淪\n7321\t咦\n7322\t擀\n7323\t甙\n7324\t呖\n7325\t咝\n7326\t哞\n7327\t哽\n7328\t哓\n7329\t呲\n7330\t哕\n7331\t咿\n7332\t╰\n7333\t漷\n7334\t禩\n7335\t檜\n7336\t鷲\n7337\t髭\n7338\t囑\n7339\t诂\n7340\t凇\n7341\t诨\n7342\t侉\n7343\t佻\n7344\t伲\n7345\t鸞\n7346\t鄞\n7347\t郫\n7348\t鄯\n7349\t鼩\n7350\tぼ\n7351\t軀\n7352\t墀\n7353\t転\n7354\t酃\n7355\t籴\n7356\t倥\n7357\t坩\n7358\t坼\n7359\t垆\n7360\t茑\n7361\t鹀\n7362\t矍\n7363\t坌\n7364\t谑\n7365\t陔\n7366\t匍\n7367\t茈\n7368\t陲\n7369\t傧\n7370\t茼\n7371\t芟\n7372\t鵰\n7373\t谘\n7374\t亳\n7375\t垩\n7376\t隰\n7377\t谡\n7378\t邙\n7379\t袤\n7380\t儆\n7381\t酆\n7382\t鹮\n7383\t賁\n7384\t賃\n7385\t儋\n7386\t圮\n7387\t苘\n7388\t賄\n7389\t鸛\n7390\t埝\n7391\t坜\n7392\t鱒\n7393\t乂\n7394\t朧\n7395\t沄\n7396\t痺\n7397\t穢\n7398\t譞\n7399\t擷\n7400\tポ\n7401\t嬢\n7402\t葑\n7403\t莴\n7404\t莩\n7405\t菝\n7406\t蒺\n7407\t蓐\n7408\t菔\n7409\t輓\n7410\t蒹\n7411\t蒴\n7412\t輛\n7413\t軻\n7414\t齲\n7415\t傀\n7416\t拮\n7417\t薜\n7418\t蕻\n7419\t轅\n7420\t蓿\n7421\t捩\n7422\t摒\n7423\t奘\n7424\t匏\n7425\t揿\n7426\t尴\n7427\t抟\n7428\t摁\n7429\t辮\n7430\t挹\n7431\t搦\n7432\t辺\n7433\t谞\n7434\t睜\n7435\t搐\n7436\t鳜\n7437\t鱸\n7438\t骺\n7439\t鞲\n7440\t鳙\n7441\t鲉\n7442\t鲘\n7443\t鳉\n7444\t鳑\n7445\t讐\n7446\t瑝\n7447\t瑨\n7448\t镑\n7449\t皚\n7450\t皦\n7451\t盁\n7452\t祙\n7453\t痋\n7454\t瘈\
n7455\t瘍\n7456\t瘓\n7457\t璣\n7458\t瓏\n7459\t瓘\n7460\t笒\n7461\t珖\n7462\t豢\n7463\t籟\n7464\t粬\n7465\tё\n7466\t禵\n7467\t┛\n7468\t┃\n7469\t甡\n7470\tぷ\n7471\tぁ\n7472\tぇ\n7473\tガ\n7474\t+\n7475\t狈\n7476\t盶\n7477\t眬\n7478\t睒\n7479\t侘\n7480\t亅\n7481\t仮\n7482\t亶\n7483\t仏\n7484\t偓\n7485\t値\n7486\t倻\n7487\t儉\n7488\t叡\n7489\t厳\n7490\t啱\n7491\t噠\n7492\t嚕\n7493\t嘰\n7494\t噭\n7495\t噸\n7496\t垈\n7497\t垕\n7498\t墾\n7499\t墎\n7500\t墘\n7501\t姸\n7502\t姈\n7503\t嫲\n7504\t嫆\n7505\t嫊\n7506\t嫋\n7507\t崁\n7508\t崈\n7509\t崚\n7510\t崢\n7511\t幍\n7512\t帯\n7513\t巻\n7514\t彫\n7515\t弶\n7516\t悪\n7517\t慬\n7518\t懇\n7519\t憙\n7520\t憫\n7521\t挻\n7522\t拝\n7523\t斂\n7524\t攢\n7525\t攬\n7526\t昞\n7527\t暠\n7528\t晝\n7529\t晧\n7530\t晸\n7531\t杺\n7532\t梶\n7533\t梼\n7534\t槺\n7535\t槃\n7536\t槑\n7537\t櫞\n7538\t櫚\n7539\t殭\n7540\t殲\n7541\t洣\n7542\t浛\n7543\t洨\n7544\t湰\n7545\t湴\n7546\t湳\n7547\t淸\n7548\t渼\n7549\t漖\n7550\t瀧\n7551\t瀮\n7552\t煢\n7553\t焼\n7554\tぜ\n7555\t兗\n7556\t惣\n7557\t紓\n7558\t紘\n7559\t紜\n7560\t紮\n7561\t鹘\n7562\t絹\n7563\t綑\n7564\tň\n7565\t肪\n7566\tぞ\n7567\t溇\n7568\t綻\n7569\t継\n7570\t緞\n7571\t糰\n7572\t侥\n7573\t続\n7574\t綽\n7575\t綝\n7576\t攫\n7577\t絜\n7578\t緈\n7579\t絪\n7580\t綰\n7581\t紈\n7582\t絎\n7583\t紉\n7584\t惫\n7585\t児\n7586\t姽\n7587\t暪\n7588\t筧\n7589\t恫\n7590\tゲ\n7591\t瞋\n7592\t睬\n7593\t瞞\n7594\t睺\n7595\t硚\n7596\t碁\n7597\t砳\n7598\t砕\n7599\t珹\n7600\t癆\n7601\t嚜\n7602\t惇\n7603\t潽\n7604\t嗎\n7605\t幄\n7606\t釁\n7607\t釐\n7608\t髁\n7609\t劻\n7610\t屃\n7611\t浟\n7612\t羱\n7613\t郷\n7614\t喁\n7615\t唷\n7616\t鄘\n7617\t喈\n7618\t鄲\n7619\t骼\n7620\t咐\n7621\t惱\n7622\t繽\n7623\t纻\n7624\t缷\n7625\t罈\n7626\t佲\n7627\t団\n7628\t夆\n7629\t猕\n7630\t飧\n7631\t鈿\n7632\t鉄\n7633\t屄\n7634\t嵚\n7635\t聳\n7636\tゅ\n7637\tュ\n7638\t嬪\n7639\t濞\n7640\t寤\n7641\t瀹\n7642\t鍊\n7643\t鍙\n7644\t冧\n7645\t変\n7646\t庝\n7647\t鋇\n7648\t洇\n7649\t浃\n7650\t洌\n7651\t鋰\n7652\t镊\n7653\t臏\n7654\tャ\n7655\t悝\n7656\t惬\n7657\t銖\n7658\t溍\n7659\t谩\n7660\t脅\n7661\t峋\n7662\t浼\n7663\t徉\n7664\t徨\n7665\t郃\n7666\t噱\n7667\t釧\n7668\t邂\n7669\t洫\n7670\t溽\n7671\t悛\n7672\t忝\n7673\t徭\n7674\t怄\n7675\t馄\n7676\t鈉\n7
677\t徘\n7678\t鈊\n7679\t銬\n7680\t鎌\n7681\tㄆ\n7682\t芵\n7683\t苼\n7684\t苽\n7685\t茋\n7686\t崛\n7687\t嗌\n7688\t馐\n7689\t馑\n7690\t彖\n7691\t咫\n7692\t涫\n7693\t婀\n7694\t驺\n7695\t芻\n7696\t媲\n7697\t骛\n7698\t苃\n7699\t骖\n7700\t骢\n7701\t苧\n7702\t苭\n7703\t鎧\n7704\t罠\n7705\t舎\n7706\t镣\n7707\t犰\n7708\t犷\n7709\t嗵\n7710\t忪\n7711\t錳\n7712\t泐\n7713\t阄\n7714\t銶\n7715\t狁\n7716\t腘\n7717\t狒\n7718\t狨\n7719\t狲\n7720\t嘭\n7721\t怍\n7722\t怩\n7723\t溆\n7724\t湓\n7725\t郟\n7726\t翾\n7727\t猄\n7728\t噙\n7729\t氵\n7730\t狴\n7731\t狳\n7732\t帔\n7733\t噘\n7734\t儡\n7735\t滠\n7736\t猇\n7737\t狺\n7738\t癇\n7739\t礳\n7740\t昪\n7741\t渃\n7742\t瀋\n7743\t–\n7744\t廋\n7745\tх\n7746\tょ\n7747\tョ\n7748\t冪\n7749\t绔\n7750\t缁\n7751\t绁\n7752\t暻\n7753\t菉\n7754\t菫\n7755\t玷\n7756\t缛\n7757\t鏗\n7758\t荌\n7759\t缣\n7760\t莔\n7761\t缰\n7762\t缯\n7763\t鏟\n7764\t缵\n7765\t莢\n7766\t鐮\n7767\t痙\n7768\t俤\n7769\t揹\n7770\t梔\n7771\t疍\n7772\t黟\n7773\t婐\n7774\t氿\n7775\t鸬\n7776\t稹\n7777\t皤\n7778\t穑\n7779\t饋\n7780\t饹\n7781\t餍\n7782\tπ\n7783\t袆\n7784\t鹆\n7785\t鹇\n7786\t鹋\n7787\t疰\n7788\t痃\n7789\t鹕\n7790\t鹚\n7791\t痦\n7792\t痼\n7793\t鹪\n7794\t瘌\n7795\t餚\n7796\t餛\n7797\t疬\n7798\t饅\n7799\t瘙\n7800\t杄\n7801\t珽\n7802\t夐\n7803\t涢\n7804\t楱\n7805\t椁\n7806\t棰\n7807\t閬\n7808\t熒\n7809\t嗚\n7810\t欒\n7811\t韪\n7812\t鑰\n7813\t铚\n7814\t葇\n7815\t涥\n7816\t滉\n7817\t戋\n7818\t検\n7819\t浭\n7820\t澥\n7821\t蔪\n7822\t囯\n7823\t氹\n7824\t氆\n7825\t毵\n7826\t耄\n7827\t毳\n7828\t氡\n7829\t査\n7830\t薦\n7831\t藠\n7832\t氩\n7833\t氤\n7834\t槊\n7835\t杪\n7836\t桡\n7837\t蔭\n7838\t曷\n7839\t脬\n7840\t閎\n7841\t萩\n7842\t晷\n7843\t殚\n7844\t殛\n7845\t呻\n7846\t菶\n7847\t杷\n7848\t隕\n7849\t蒄\n7850\t镚\n7851\t閔\n7852\t雋\n7853\t轫\n7854\t娠\n7855\t鐸\n7856\t柰\n7857\t腠\n7858\t胛\n7859\t腼\n7860\t薺\n7861\t轳\n7862\t蔔\n7863\t胙\n7864\t蒾\n7865\t轹\n7866\t檎\n7867\t牦\n7868\t媵\n7869\t膂\n7870\t胝\n7871\t胴\n7872\t檫\n7873\t雝\n7874\t蓇\n7875\t曩\n7876\t犒\n7877\t臌\n7878\t関\n7879\t蒟\n7880\t挲\n7881\t胼\n7882\t辋\n7883\t檗\n7884\t巉\n7885\t昮\n7886\t碏\n7887\t嶧\n7888\t済\n7889\t滸\n7890\t籬\n7891\t琀\n7892\t獺\n7893\t╥\n7894\t璱\n7895\t峣\n7896\t抜\n7897\t渋\n7898\t璉\n789
9\tы\n7900\t図\n7901\t栱\n7902\t眵\n7903\t韡\n7904\t懑\n7905\t韮\n7906\t韾\n7907\t蓖\n7908\t呁\n7909\t浲\n7910\t滌\n7911\t彀\n7912\t欤\n7913\t鞕\n7914\tヌ\n7915\t斕\n7916\t蹋\n7917\tь\n7918\t嗩\n7919\t畹\n7920\t罘\n7921\t顒\n7922\t顓\n7923\t顗\n7924\t顥\n7925\t埼\n7926\t蝲\n7927\t氾\n7928\t颼\n7929\t锃\n7930\t罴\n7931\t锝\n7932\t砉\n7933\t锕\n7934\t砝\n7935\t礤\n7936\t礓\n7937\t藺\n7938\t钽\n7939\t蠲\n7940\t锱\n7941\t镦\n7942\t锲\n7943\t锨\n7944\t钚\n7945\t顳\n7946\t焐\n7947\t嗡\n7948\t颋\n7949\t蟞\n7950\t蘄\n7951\t挝\n7952\t镲\n7953\t頼\n7954\t硌\n7955\t頽\n7956\t锺\n7957\t眙\n7958\t椭\n7959\t眭\n7960\t碚\n7961\t碡\n7962\t祗\n7963\t铙\n7964\t锖\n7965\t镌\n7966\t镓\n7967\t頡\n7968\t碲\n7969\t禊\n7970\t颱\n7971\t蟯\n7972\t蟄\n7973\t韜\n7974\t頦\n7975\t蝀\n7976\t蟈\n7977\t磔\n7978\t忑\n7979\t颶\n7980\t磙\n7981\t忐\n7982\t燹\n7983\t蠣\n7984\t玧\n7985\t獼\n7986\t甁\n7987\t禟\n7988\t桲\n7989\t譴\n7990\t祃\n7991\t窵\n7992\t琺\n7993\t俶\n7994\t昺\n7995\t渕\n7996\t∕\n7997\t鱉\n7998\t廙\n7999\t琻\n8000\t玭\n8001\t﹏\n8002\t縉\n8003\t烴\n8004\t聍\n8005\t癔\n8006\t瘛\n8007\t瘵\n8008\t瘠\n8009\t駙\n8010\t駟\n8011\t喲\n8012\t癃\n8013\t皴\n8014\t裢\n8015\t耧\n8016\t裊\n8017\t褛\n8018\t聩\n8019\t褊\n8020\t褫\n8021\t颃\n8022\t媜\n8023\t昽\n8024\t梠\n8025\t㎡\n8026\t嚭\n8027\t埡\n8028\t簕\n8029\t簫\n8030\t黧\n8031\t篦\n8032\t笞\n8033\t蟋\n8034\t蟑\n8035\t螬\n8036\t髪\n8037\tв\n8038\t虼\n8039\t颥\n8040\t蚍\n8041\t蚋\n8042\t驊\n8043\t驎\n8044\t円\n8045\t捯\n8046\t曇\n8047\t眶\n8048\t滛\n8049\t烎\n8050\t魘\n8051\t艋\n8052\t舢\n8053\t魷\n8054\t詮\n8055\t婗\n8056\t滝\n8057\t龃\n8058\t鲼\n8059\t觥\n8060\t龌\n8061\t鰭\n8062\t謬\n8063\t鮜\n8064\t酽\n8065\t醢\n8066\t醯\n8067\t酡\n8068\t鯇\n8069\t鯖\n8070\t辗\n8071\t眨\n8072\t圾\n8073\t髫\n8074\t卮\n8075\t丨\n8076\t艟\n8077\t黾\n8078\t艄\n8079\t虿\n8080\t龀\n8081\t罅\n8082\t箦\n8083\t蜮\n8084\t鲠\n8085\t鲥\n8086\t雠\n8087\t誥\n8088\t趵\n8089\t趼\n8090\t蹂\n8091\t趺\n8092\t嘏\n8093\t蜴\n8094\t鲦\n8095\t襜\n8096\t諦\n8097\t箸\n8098\t笮\n8099\t襠\n8100\t笊\n8101\t箅\n8102\t蜿\n8103\t鍪\n8104\t鏊\n8105\t亻\n8106\t豨\n8107\t鯤\n8108\t箪\n8109\t筇\n8110\t箢\n8111\t蛲\n8112\t蝻\n8113\t籼\n8114\t諭\n8115\t鲱\n8116\t躅\n8117\t仂\n8118\t諮\n8119\t簁\n8120\t鯧\n8121\
t謐\n8122\t誰\n8123\t鳯\n8124\t訫\n8125\t豈\n8126\t蝰\n8127\t粞\n8128\t鯪\n8129\t鲴\n8130\t鮪\n8131\t笤\n8132\t笾\n8133\t蝌\n8134\t螋\n8135\t蝓\n8136\t趄\n8137\t糌\n8138\t鲇\n8139\t鲆\n8140\t鲻\n8141\t鲺\n8142\t鲐\n8143\t躞\n8144\t貊\n8145\t伥\n8146\t魆\n8147\t鰍\n8148\t鮭\n8149\t鮍\n8150\t髎\n8151\t諷\n8152\t鳶\n8153\t筮\n8154\t騮\n8155\t詔\n8156\t鯰\n8157\t鮰\n8158\t筅\n8159\t篼\n8160\t蝥\n8161\t蜊\n8162\t糅\n8163\t酎\n8164\t踮\n8165\t刿\n8166\t諺\n8167\t鬢\n8168\t骕\n8169\t鴛\n8170\t糨\n8171\t鳊\n8172\t巔\n8173\t噐\n8174\t攔\n8175\t丷\n8176\t烺\n8177\t眘\n8178\t譙\n8179\t疭\n8180\t丼\n8181\t奡\n8182\tн\n8183\t┻\n8184\t邨\n8185\t哚\n8186\t呃\n8187\t咤\n8188\t呙\n8189\t逨\n8190\t哳\n8191\t呶\n8192\t唢\n8193\t哂\n8194\t啁\n8195\t咣\n8196\t唿\n8197\t玹\n8198\t眛\n8199\t匱\n8200\t噓\n8201\t嶶\n8202\t鼹\n8203\t掹\n8204\t鷥\n8205\t蹿\n8206\tぺ\n8207\t広\n8208\t讧\n8209\t趨\n8210\t鶉\n8211\t鶗\n8212\tベ\n8213\t佴\n8214\t佾\n8215\t鼷\n8216\t堺\n8217\t燐\n8218\t麀\n8219\t鸰\n8220\tホ\n8221\t鄣\n8222\t郛\n8223\t郏\n8224\t鼽\n8225\t慄\n8226\t嬡\n8227\t屲\n8228\t堞\n8229\t堠\n8230\t劬\n8231\t芄\n8232\t鄄\n8233\t艹\n8234\t谂\n8235\t诿\n8236\t貯\n8237\t劾\n8238\t茔\n8239\t鹍\n8240\t陉\n8241\t訇\n8242\t鬯\n8243\t荛\n8244\t苁\n8245\t鵪\n8246\t鸊\n8247\t偬\n8248\t厶\n8249\t賚\n8250\t垴\n8251\t贠\n8252\t谝\n8253\t邗\n8254\t儇\n8255\t苤\n8256\t冫\n8257\t圹\n8258\t埸\n8259\t黿\n8260\t邶\n8261\t埤\n8262\t茌\n8263\t谵\n8264\t贍\n8265\t瑅\n8266\t眞\n8267\t亀\n8268\t坵\n8269\t檞\n8270\t玏\n8271\t沇\n8272\t縴\n8273\t凖\n8274\t淉\n8275\t齷\n8276\t龕\n8277\t傒\n8278\t栞\n8279\t蓣\n8280\t荸\n8281\t荬\n8282\t轂\n8283\t萋\n8284\t萏\n8285\t菹\n8286\t蓠\n8287\t蒡\n8288\t葜\n8289\t甍\n8290\t軽\n8291\t軾\n8292\t瓃\n8293\t倆\n8294\t巣\n8295\t玓\n8296\t淶\n8297\t慆\n8298\t兀\n8299\tм\n8300\t掁\n8301\t栟\n8302\t迍\n8303\t蘧\n8304\t轆\n8305\t蕺\n8306\t蘩\n8307\t掴\n8308\t捭\n8309\t耷\n8310\t掼\n8311\t拊\n8312\t拚\n8313\t捃\n8314\t込\n8315\t盃\n8316\t疇\n8317\t偁\n8318\t燻\n8319\t牤\n8320\t牘\n8321\t犽\n8322\t犢\n8323\t磥\n8324\t磦\n8325\t磻\n8326\t磾\n8327\t▕\n8328\t▽\n8329\t鼢\n8330\t鳓\n8331\t靼\n8332\t鞯\n8333\t鲃\n8334\t鲊\n8335\t鲏\n8336\t鲖\n8337\t鲙\n8338\t鲯\n8339\t鲾\n8340\t鳀\n8341\t鳠\n8342\t讃\n8343\t譆
\n8344\t讎\n8345\t讖\n8346\t瑒\n8347\t瑧\n8348\t瑫\n8349\t瑮\n8350\t瑱\n8351\t瑸\n8352\t癭\n8353\t皛\n8354\t癩\n8355\t皰\n8356\t皸\n8357\t礬\n8358\t祼\n8359\t禇\n8360\t痎\n8361\t瘄\n8362\t瘆\n8363\t瘣\n8364\t瘧\n8365\t瘨\n8366\t瘺\n8367\t瓔\n8368\t瓚\n8369\t瓛\n8370\t瓟\n8371\t縹\n8372\t繄\n8373\t繅\n8374\t繕\n8375\t穇\n8376\t穫\n8377\t窊\n8378\t窓\n8379\t窣\n8380\t笉\n8381\t笣\n8382\t玚\n8383\t玔\n8384\t珄\n8385\t珌\n8386\t珎\n8387\t珛\n8388\t珵\n8389\t粙\n8390\t粨\n8391\t粩\n8392\tТ\n8393\tО\n8394\tЛ\n8395\tЭ\n8396\tБ\n8397\t秬\n8398\t稈\n8399\t┗\n8400\t━\n8401\t畤\n8402\t畯\n8403\tぉ\n8404\tぃ\n8405\tプ\n8406\tィ\n8407\tゥ\n8408\t眧\n8409\t盷\n8410\t眴\n8411\t亊\n8412\t伒\n8413\t亇\n8414\t仱\n8415\t亜\n8416\t亹\n8417\t仐\n8418\t仚\n8419\t偞\n8420\t偍\n8421\t倕\n8422\t倗\n8423\t倧\n8424\t倴\n8425\t倵\n8426\t倶\n8427\t儈\n8428\t僝\n8429\t僜\n8430\t儔\n8431\t儚\n8432\t劏\n8433\t劵\n8434\t劊\n8435\t劌\n8436\t剺\n8437\t叇\n8438\t叆\n8439\t厔\n8440\t厙\n8441\t厷\n8442\t喛\n8443\t啣\n8444\t啈\n8445\t圇\n8446\t嚢\n8447\t嚨\n8448\t嚟\n8449\t噲\n8450\t噵\n8451\t噽\n8452\t嚀\n8453\t嚐\n8454\t堊\n8455\t垿\n8456\t埈\n8457\t埆\n8458\t垻\n8459\t垾\n8460\t垏\n8461\t垝\n8462\t垞\n8463\t垱\n8464\t垳\n8465\t夨\n8466\t塤\n8467\t壆\n8468\t墐\n8469\t娮\n8470\t娀\n8471\t妧\n8472\t妶\n8473\t姀\n8474\t嫤\n8475\t嫰\n8476\t嫽\n8477\t媭\n8478\t媱\n8479\t嫀\n8480\t嫃\n8481\t嫏\n8482\t嫑\n8483\t嫕\n8484\t嫙\n8485\t尨\n8486\t宍\n8487\t尅\n8488\t尪\n8489\t孿\n8490\t寔\n8491\t寗\n8492\t寭\n8493\t寯\n8494\t寳\n8495\t対\n8496\t専\n8497\t峘\n8498\t崀\n8499\t嶗\n8500\t嵂\n8501\t崼\n8502\t嵒\n8503\t崍\n8504\t崐\n8505\t巂\n8506\t巸\n8507\t巹\n8508\t巿\n8509\t帀\n8510\t帡\n8511\t帩\n8512\t彯\n8513\t惙\n8514\t惢\n8515\t悧\n8516\t悩\n8517\t悰\n8518\t悵\n8519\t慇\n8520\t憚\n8521\t憣\n8522\t憭\n8523\t掲\n8524\t抃\n8525\t拏\n8526\t拠\n8527\t拤\n8528\t拫\n8529\t拵\n8530\t拸\n8531\t挏\n8532\t挐\n8533\t挓\n8534\t摽\n8535\t摜\n8536\t摫\n8537\t搥\n8538\t摑\n8539\t敻\n8540\t攣\n8541\t攱\n8542\t攲\n8543\t攽\n8544\t敐\n8545\t暱\n8546\t晙\n8547\t晛\n8548\t晫\n8549\t晳\n8550\t暁\n8551\t暅\n8552\t枴\n8553\t柅\n8554\t枹\n8555\t枺\n8556\t朶\n8557\t枌\n8558\t枒\n8559\t枓\n8560\t枙\n8561\t枟\n8562\t枦\n8563\t枱\n8564\t棸\n8565\t椏\n
8566\t梋\n8567\t棫\n8568\t棻\n8569\t梾\n8570\t棅\n8571\t樋\n8572\t槱\n8573\t榎\n8574\t榿\n8575\t槀\n8576\t槇\n8577\t槈\n8578\t槉\n8579\t槏\n8580\t槖\n8581\t槜\n8582\t槝\n8583\t橿\n8584\t檁\n8585\t櫌\n8586\t檵\n8587\t檻\n8588\t櫆\n8589\t櫈\n8590\t毎\n8591\t歔\n8592\t殢\n8593\t洦\n8594\t泘\n8595\t泑\n8596\t洶\n8597\t洴\n8598\t浉\n8599\t洰\n8600\t洢\n8601\t泚\n8602\t洈\n8603\t湋\n8604\t湙\n8605\t湚\n8606\t湞\n8607\t潿\n8608\t澂\n8609\t澔\n8610\t潄\n8611\t潏\n8612\t潟\n8613\t灤\n8614\t灕\n8615\t瀅\n8616\t瀰\n8617\t瀼\n8618\t灃\n8619\t灄\n8620\t煬\n8621\t煾\n8622\t煡\n8623\t煓\n8624\t嚰\n8625\t糬\n8626\tń\n8627\t幗\n8628\t摻\n8629\t絔\n8630\t綋\n8631\t綎\n8632\t痪\n8633\t樉\n8634\t緙\n8635\t緱\n8636\t緲\n8637\t糀\n8638\t紸\n8639\t綣\n8640\t紂\n8641\t綬\n8642\t鞴\n8643\t肮\n8644\t柟\n8645\t箋\n8646\t箖\n8647\t箠\n8648\t筯\n8649\t筶\n8650\t筼\n8651\t刽\n8652\t筜\n8653\t嚥\n8654\t嵅\n8655\t毑\n8656\t瞼\n8657\t瞾\n8658\t矅\n8659\t矖\n8660\t矚\n8661\t瞐\n8662\t睞\n8663\t瞕\n8664\t瞤\n8665\tС\n8666\t嫵\n8667\t槼\n8668\t砢\n8669\t硞\n8670\t硤\n8671\t硨\n8672\t硵\n8673\t矰\n8674\t盿\n8675\t巃\n8676\t焌\n8677\t礎\n8678\t禔\n8679\t朅\n8680\t楢\n8681\tИ\n8682\t嫪\n8683\t敧\n8684\t獦\n8685\tп\n8686\t屺\n8687\t岣\n8688\t岖\n8689\t幞\n8690\t醽\n8691\t醾\n8692\t醿\n8693\t釄\n8694\t釈\n8695\tЯ\n8696\t伭\n8697\t偭\n8698\t熈\n8699\t羶\n8700\t羾\n8701\t翃\n8702\t翚\n8703\tо\n8704\t傕\n8705\t愢\n8706\t曕\n8707\t涏\n8708\t鄚\n8709\t唳\n8710\t喾\n8711\t啖\n8712\t鄴\n8713\t尷\n8714\t椑\n8715\t熇\n8716\t繾\n8717\t繹\n8718\t纮\n8719\t缊\n8720\t罌\n8721\t罍\n8722\t呪\n8723\t炩\n8724\t熲\n8725\t鈄\n8726\t猞\n8727\t獍\n8728\t猸\n8729\t狻\n8730\t饪\n8731\t饣\n8732\t鈡\n8733\t猢\n8734\t猡\n8735\t獗\n8736\t鈷\n8737\tβ\n8738\t吢\n8739\t埪\n8740\t徛\n8741\t毬\n8742\t浡\n8743\t灺\n8744\t赂\n8745\t聤\n8746\t聴\n8747\t麇\n8748\tу\n8749\t傚\n8750\t冨\n8751\t堝\n8752\t撳\n8753\t欏\n8754\t氬\n8755\t滃\n8756\t濆\n8757\t錨\n8758\t瀣\n8759\t錩\n8760\t濉\n8761\t鍎\n8762\t鍔\n8763\t鍖\n8764\t鍘\n8765\t鍢\n8766\t鍥\n8767\tづ\n8768\t偱\n8769\t壟\n8770\t惻\n8771\t斉\n8772\t舺\n8773\t艅\n8774\t艎\n8775\t嶄\n8776\t曚\n8777\t澉\n8778\t涠\n8779\t洎\n8780\t洚\n8781\t鋯\n8782\t鋶\n8783\t鋹\n8784\t偰\n8785\t縻\n8786\t娿\n8787\t阕\n87
88\t悱\n8789\t愀\n8790\t悃\n8791\t惝\n8792\t惚\n8793\t銆\n8794\t銍\n8795\t銓\n8796\t銚\n8797\t銥\n8798\tγ\n8799\t吤\n8800\t幟\n8801\t捗\n8802\t脇\n8803\t脛\n8804\t脩\n8805\t脳\n8806\t脷\n8807\t饨\n8808\t褰\n8809\t愎\n8810\t岢\n8811\t宄\n8812\t徜\n8813\t崦\n8814\t噔\n8815\t嗄\n8816\t嚆\n8817\t辶\n8818\t謇\n8819\t邋\n8820\t迮\n8821\t迕\n8822\t渑\n8823\t淠\n8824\t溷\n8825\t胊\n8826\t憷\n8827\t隳\n8828\t崞\n8829\t嗥\n8830\t鉨\n8831\t纁\n8832\t淝\n8833\t鉉\n8834\t罝\n8835\t狃\n8836\t嵝\n8837\t錮\n8838\t腄\n8839\t傛\n8840\t忔\n8841\t鎊\n8842\t妯\n8843\t妗\n8844\t鎭\n8845\t鏃\n8846\t伷\n8847\t吰\n8848\t嬈\n8849\t澠\n8850\t芧\n8851\t芠\n8852\t苳\n8853\t茖\n8854\t茤\n8855\t茪\n8856\t徼\n8857\t彡\n8858\t犭\n8859\t噼\n8860\t嚯\n8861\t翬\n8862\t忾\n8863\t迨\n8864\t抨\n8865\t渖\n8866\t娌\n8867\t芶\n8868\t胬\n8869\t鍮\n8870\t媸\n8871\t嫠\n8872\t骒\n8873\t鍺\n8874\t骣\n8875\t鍼\n8876\t苢\n8877\t鎡\n8878\t芓\n8879\t鎢\n8880\t迄\n8881\t纥\n8882\t鎣\n8883\t芛\n8884\t纩\n8885\t孓\n8886\t臚\n8887\t鄃\n8888\t釭\n8889\t臜\n8890\t嵛\n8891\t嵯\n8892\t鄆\n8893\t囗\n8894\t忭\n8895\t忸\n8896\t馕\n8897\t庀\n8898\t屣\n8899\t渫\n8900\t撵\n8901\t湎\n8902\t阃\n8903\t醞\n8904\t郕\n8905\t羕\n8906\t鈑\n8907\t臠\n8908\t胠\n8909\t猰\n8910\t狽\n8911\t嵫\n8912\t郚\n8913\t嘌\n8914\t臢\n8915\t漶\n8916\t狯\n8917\t嶝\n8918\t鄌\n8919\t圄\n8920\t嗾\n8921\t怫\n8922\t搴\n8923\t逡\n8924\t逶\n8925\t遑\n8926\t鉶\n8927\t阌\n8928\t阋\n8929\t錚\n8930\t鉷\n8931\t罳\n8932\t鉸\n8933\t釹\n8934\t舠\n8935\t銼\n8936\t酺\n8937\t羣\n8938\t纓\n8939\t羥\n8940\t猁\n8941\t彳\n8942\t帙\n8943\t嘬\n8944\t傈\n8945\t廑\n8946\t舥\n8947\t遘\n8948\t潲\n8949\t溘\n8950\t膙\n8951\t錡\n8952\t醁\n8953\t鉞\n8954\t醂\n8955\t猃\n8956\t帻\n8957\t廨\n8958\t遢\n8959\t爿\n8960\t繸\n8961\t鈀\n8962\t舲\n8963\t脄\n8964\t肸\n8965\t赁\n8966\t呸\n8967\t譩\n8968\t璪\n8969\t卋\n8970\t媌\n8971\t揵\n8972\t渂\n8973\t獴\n8974\t濨\n8975\t縞\n8976\t玞\n8977\t塩\n8978\t礐\n8979\t瑿\n8980\t滳\n8981\t濩\n8982\t嶇\n8983\t旂\n8984\t栫\n8985\t熺\n8986\t绋\n8987\t缱\n8988\t绲\n8989\t缈\n8990\t绌\n8991\t鐓\n8992\t鐜\n8993\t鐢\n8994\t鐫\n8995\tデ\n8996\t喦\n8997\t柷\n8998\t溓\n8999\t荾\n9000\t莧\n9001\t菋\n9002\t菍\n9003\t缂\n9004\t绻\n9005\t缒\n9006\t荄\n9007\t缧\n9008\t莐\n9009\t鏻\n9010
\t莑\n9011\t荖\n9012\t鏞\n9013\t荗\n9014\t缫\n9015\t荙\n9016\t鏤\n9017\t缳\n9018\t莦\n9019\t譫\n9020\t礵\n9021\t穌\n9022\t卍\n9023\t坉\n9024\t奷\n9025\t檇\n9026\t沝\n9027\t鱀\n9028\t窪\n9029\t籇\n9030\t丏\n9031\t嘍\n9032\t塂\n9033\t廌\n9034\t涴\n9035\t滒\n9036\t燄\n9037\t瘅\n9038\t瓞\n9039\t鸹\n9040\t饈\n9041\t饗\n9042\t饞\n9043\t饤\n9044\t饸\n9045\t饾\n9046\t佇\n9047\t傂\n9048\t椥\n9049\t衸\n9050\t鸸\n9051\t痄\n9052\t蠧\n9053\t鹎\n9054\t痖\n9055\t痍\n9056\t衒\n9057\t餒\n9058\t鹜\n9059\t鹛\n9060\t鹣\n9061\t酗\n9062\t餸\n9063\t瘐\n9064\t蠷\n9065\t餾\n9066\t餞\n9067\t磂\n9068\t縠\n9069\t乪\n9070\t僥\n9071\t漞\n9072\t瀍\n9073\t礒\n9074\t僂\n9075\t楨\n9076\t冮\n9077\t嗛\n9078\t忛\n9079\t旈\n9080\t氶\n9081\t滈\n9082\t轺\n9083\t楗\n9084\t閭\n9085\t閶\n9086\t閹\n9087\t闀\n9088\t闆\n9089\t娚\n9090\t樕\n9091\t炆\n9092\t蓕\n9093\t蓢\n9094\t蓪\n9095\t蓯\n9096\t㈣\n9097\t庤\n9098\t氳\n9099\t滆\n9100\t鑷\n9101\t鑾\n9102\t钑\n9103\t铏\n9104\t铻\n9105\t壢\n9106\t嬋\n9107\t斎\n9108\t毴\n9109\t萚\n9110\t萛\n9111\t葎\n9112\t葖\n9113\t葦\n9114\t夑\n9115\t婈\n9116\t甏\n9117\t犄\n9118\t軎\n9119\t戥\n9120\t辎\n9121\t辏\n9122\t陘\n9123\t険\n9124\t蔃\n9125\t蕣\n9126\t蕫\n9127\t蕶\n9128\t侂\n9129\t撾\n9130\t涬\n9131\t熾\n9132\t毹\n9133\t氍\n9134\t雱\n9135\t霙\n9136\t吽\n9137\t喫\n9138\t毸\n9139\t藘\n9140\t藟\n9141\t藦\n9142\t藨\n9143\t昃\n9144\t刖\n9145\t旯\n9146\t璩\n9147\t瓿\n9148\t锳\n9149\t槔\n9150\t殄\n9151\t殂\n9152\t栳\n9153\t枇\n9154\t枥\n9155\t赅\n9156\t赍\n9157\t氇\n9158\t朊\n9159\t赕\n9160\t菳\n9161\t镃\n9162\t榍\n9163\t赙\n9164\t肭\n9165\t鐳\n9166\t菵\n9167\t腽\n9168\t殍\n9169\t栝\n9170\t梃\n9171\t觊\n9172\t觋\n9173\t閒\n9174\t雊\n9175\t薌\n9176\t蔴\n9177\t薍\n9178\t橥\n9179\t菼\n9180\t觏\n9181\t薸\n9182\t镴\n9183\t鐺\n9184\t蒻\n9185\t萂\n9186\t檠\n9187\t梏\n9188\t枵\n9189\t蕂\n9190\t旰\n9191\t暌\n9192\t曛\n9193\t牾\n9194\t擞\n9195\t腧\n9196\t蓀\n9197\t薖\n9198\t蓂\n9199\t閟\n9200\t鑣\n9201\t柽\n9202\t脒\n9203\t閡\n9204\t轾\n9205\t檑\n9206\t柃\n9207\t柢\n9208\t闡\n9209\t蕌\n9210\t犏\n9211\t贶\n9212\t僳\n9213\t蔞\n9214\t薠\n9215\t蓎\n9216\t椠\n9217\t閆\n9218\t闋\n9219\t蔂\n9220\t薁\n9221\t琭\n9222\t奻\n9223\t渇\n9224\t鱂\n9225\t禙\n9226\t丗\n9227\t朏\n9228\t桭\n9229\t汧\n9230\t磄\n9231\t縢\n9232\t
眊\n9233\t俫\n9234\t峠\n9235\t璆\n9236\t咷\n9237\t圙\n9238\t楪\n9239\t欸\n9240\t甴\n9241\t奾\n9242\t媓\n9243\t榟\n9244\t渉\n9245\t疕\n9246\t璈\n9247\t奌\n9248\t桯\n9249\t礽\n9250\t唅\n9251\t沬\n9252\t両\n9253\t朓\n9254\t愴\n9255\t韃\n9256\t恝\n9257\t恁\n9258\t頊\n9259\t囃\n9260\t埻\n9261\t娡\n9262\t樛\n9263\t蚡\n9264\t蛯\n9265\t蛺\n9266\t蜆\n9267\t嶌\n9268\t靆\n9269\t歃\n9270\t臁\n9271\t靰\n9272\t飑\n9273\t霡\n9274\t欷\n9275\t膦\n9276\t靄\n9277\t靺\n9278\t魈\n9279\t嬏\n9280\t藹\n9281\t虁\n9282\t虒\n9283\t栴\n9284\t燁\n9285\t瞀\n9286\t畀\n9287\t瞌\n9288\t瞟\n9289\t瞍\n9290\t睥\n9291\t頫\n9292\t囄\n9293\t嬑\n9294\t屛\n9295\t嵨\n9296\t櫸\n9297\t蝃\n9298\t螄\n9299\t螆\n9300\t庯\n9301\t橈\n9302\t濓\n9303\t锇\n9304\t锆\n9305\t锔\n9306\t锒\n9307\t飩\n9308\t飪\n9309\t屜\n9310\t徬\n9311\t愊\n9312\t撓\n9313\t浵\n9314\t蟶\n9315\t蟷\n9316\t蠂\n9317\t蠑\n9318\t蠙\n9319\t砑\n9320\t罱\n9321\t锞\n9322\t罟\n9323\t觳\n9324\t畛\n9325\t锍\n9326\t礅\n9327\t砼\n9328\t豌\n9329\t靉\n9330\t烀\n9331\t瞠\n9332\t盍\n9333\t钅\n9334\t顰\n9335\t镡\n9336\t锫\n9337\t镢\n9338\t礞\n9339\t炱\n9340\t鞡\n9341\t砬\n9342\t蘡\n9343\t扃\n9344\t镧\n9345\t蘢\n9346\t祓\n9347\t镳\n9348\t砩\n9349\t硎\n9350\t硭\n9351\t礻\n9352\t铋\n9353\t钍\n9354\t顴\n9355\t螮\n9356\t镩\n9357\t镨\n9358\t蟜\n9359\t颎\n9360\t蟝\n9361\t鞨\n9362\t頷\n9363\t螲\n9364\t蚷\n9365\t硗\n9366\t硖\n9367\t祆\n9368\t钐\n9369\t铒\n9370\t颕\n9371\t蚃\n9372\t頹\n9373\t韐\n9374\t蟢\n9375\t鞮\n9376\t蝜\n9377\t眈\n9378\t霳\n9379\t硪\n9380\t煸\n9381\t熘\n9382\t蝟\n9383\t钣\n9384\t铘\n9385\t铞\n9386\t飋\n9387\t蟧\n9388\t矬\n9389\t矧\n9390\t镆\n9391\t鞶\n9392\t頠\n9393\t磉\n9394\t爝\n9395\t靦\n9396\t飔\n9397\t碹\n9398\t蘵\n9399\t燠\n9400\t燔\n9401\t铥\n9402\t飖\n9403\t稂\n9404\t镘\n9405\t靨\n9406\t飗\n9407\t蛍\n9408\t颳\n9409\t靬\n9410\t蘗\n9411\t蟳\n9412\t镙\n9413\t靂\n9414\t蘘\n9415\t蝂\n9416\t籮\n9417\t媕\n9418\t巎\n9419\t杍\n9420\t榡\n9421\t匤\n9422\t旿\n9423\t汮\n9424\t媖\n9425\t宬\n9426\t昸\n9427\t鱇\n9428\t圞\n9429\t痩\n9430\t禠\n9431\t甃\n9432\t凩\n9433\t奓\n9434\t昄\n9435\t玬\n9436\t弇\n9437\t杕\n9438\t禡\n9439\t僊\n9440\t慚\n9441\t磏\n9442\t祅\n9443\t籲\n9444\t俷\n9445\t卬\n9446\t坣\n9447\t礜\n9448\t玁\n9449\t瘩\n9450\t侎\n9451\t傫\n9452\t烋\n9453\t瘭\n9454\t癍\
n9455\t窀\n9456\t瘰\n9457\t穸\n9458\t瘼\n9459\t瘢\n9460\t駡\n9461\t駭\n9462\tа\n9463\t毖\n9464\t炑\n9465\t熝\n9466\t褆\n9467\t褕\n9468\t褢\n9469\t褣\n9470\t褦\n9471\t褯\n9472\t窬\n9473\t癯\n9474\t襻\n9475\t窭\n9476\t疋\n9477\t皲\n9478\t馼\n9479\t馚\n9480\t馜\n9481\t耩\n9482\t袷\n9483\t袼\n9484\t裎\n9485\t裣\n9486\t耨\n9487\t耱\n9488\t裰\n9489\t襁\n9490\t裈\n9491\t聱\n9492\t顸\n9493\t褴\n9494\t裋\n9495\t馱\n9496\t裏\n9497\t唎\n9498\t弌\n9499\t恛\n9500\t渙\n9501\t窸\n9502\t籓\n9503\t労\n9504\t椇\n9505\t樅\n9506\t熀\n9507\t簞\n9508\t簠\n9509\t簷\n9510\tⅢ\n9511\t嶓\n9512\t蠖\n9513\t螅\n9514\t螗\n9515\t蟊\n9516\t髣\n9517\t髴\n9518\t鬄\n9519\t饔\n9520\t営\n9521\t捰\n9522\t臃\n9523\t訏\n9524\t訑\n9525\t訕\n9526\t訚\n9527\t黩\n9528\t岒\n9529\t嶒\n9530\t栻\n9531\t颟\n9532\t虮\n9533\t騾\n9534\t驀\n9535\t驁\n9536\t驃\n9537\t呉\n9538\t喴\n9539\t襖\n9540\t覚\n9541\t傯\n9542\t掫\n9543\t汈\n9544\t濘\n9545\t舁\n9546\t舻\n9547\t舣\n9548\t艨\n9549\t舴\n9550\t舾\n9551\t舳\n9552\t嬙\n9553\t懺\n9554\t捲\n9555\t斣\n9556\t詛\n9557\t詝\n9558\t詡\n9559\t詣\n9560\t詰\n9561\t詼\n9562\t匂\n9563\t庼\n9564\t楒\n9565\t欥\n9566\t汌\n9567\t龅\n9568\t鯻\n9569\t鯷\n9570\t龆\n9571\t觫\n9572\t謦\n9573\t鰟\n9574\t鰣\n9575\t鰤\n9576\t鰥\n9577\t鰩\n9578\t鰶\n9579\t喼\n9580\t壷\n9581\t娭\n9582\t撝\n9583\t曋\n9584\t氈\n9585\t謄\n9586\t諤\n9587\t謨\n9588\t謫\n9589\t謳\n9590\t侕\n9591\t凊\n9592\t婖\n9593\t曱\n9594\t濙\n9595\t醑\n9596\t醐\n9597\t醣\n9598\t鯔\n9599\t栒\n9600\t椪\n9601\t誣\n9602\t諏\n9603\t咗\n9604\t岠\n9605\t戻\n9606\t鴃\n9607\t鴂\n9608\t鴯\n9609\t鴳\n9610\t鴷\n9611\tω\n9612\t堌\n9613\t愗\n9614\t懾\n9615\t椮\n9616\t炟\n9617\t亍\n9618\t鼗\n9619\t貐\n9620\t蠼\n9621\t蚪\n9622\t艏\n9623\t艚\n9624\t跫\n9625\t豕\n9626\t蟛\n9627\t舨\n9628\t蟪\n9629\t骯\n9630\t螵\n9631\t筢\n9632\t恿\n9633\t筻\n9634\t襛\n9635\t颡\n9636\t蚵\n9637\t蜾\n9638\t詀\n9639\t訟\n9640\t袈\n9641\t鲡\n9642\t趿\n9643\t踱\n9644\t蹁\n9645\t剜\n9646\t劁\n9647\t劂\n9648\t詁\n9649\t魾\n9650\t蚯\n9651\t鮟\n9652\t蹒\n9653\t冂\n9654\t覦\n9655\t騞\n9656\t訢\n9657\t粜\n9658\t諨\n9659\t誨\n9660\t跣\n9661\t鴇\n9662\t鳧\n9663\t谼\n9664\t覧\n9665\t箝\n9666\t箨\n9667\t笫\n9668\t羰\n9669\t鯡\n9670\t跞\n9671\t跏\n9672\t詆\n9673\t訥\n9674\t谿\n9675\t笸\n9676\t蛱\n9
677\t綮\n9678\t粝\n9679\t鲰\n9680\t鮥\n9681\t跬\n9682\t躏\n9683\t仡\n9684\t匚\n9685\t仫\n9686\t簀\n9687\t覬\n9688\t鮦\n9689\t鰈\n9690\t鮨\n9691\t鮈\n9692\t覯\n9693\t鰉\n9694\t諱\n9695\t鳰\n9696\t鬘\n9697\t躐\n9698\t伛\n9699\t阂\n9700\t髈\n9701\t篌\n9702\t篥\n9703\t蛞\n9704\t蝣\n9705\t蛑\n9706\t蛘\n9707\t趔\n9708\t趑\n9709\t趱\n9710\t豇\n9711\t拄\n9712\t鮋\n9713\t踔\n9714\t跽\n9715\t鴒\n9716\t豋\n9717\t仳\n9718\t卣\n9719\t覲\n9720\t驫\n9721\t観\n9722\t騫\n9723\t謖\n9724\t謗\n9725\t鴕\n9726\t骃\n9727\t魋\n9728\t髏\n9729\t鮐\n9730\t鳷\n9731\t篣\n9732\t螓\n9733\t蜍\n9734\t魎\n9735\t訳\n9736\t謚\n9737\t鲒\n9738\t鳆\n9739\t鲔\n9740\t鳇\n9741\t鲕\n9742\t誹\n9743\t踬\n9744\t觖\n9745\t踯\n9746\t刭\n9747\t觱\n9748\t鮒\n9749\t髕\n9750\t觴\n9751\t諼\n9752\t豖\n9753\t騳\n9754\t験\n9755\t襶\n9756\t酏\n9757\t鲚\n9758\t踺\n9759\t骦\n9760\t鴝\n9761\t鳽\n9762\t蜣\n9763\t肄\n9764\t剞\n9765\t訝\n9766\t鱬\n9767\t卲\n9768\t梡\n9769\t廝\n9770\t╭\n9771\t琿\n9772\t弎\n9773\t梣\n9774\t禥\n9775\t慟\n9776\t峳\n9777\t璕\n9778\t擱\n9779\t祍\n9780\t峴\n9781\t泂\n9782\t渟\n9783\t漵\n9784\t禨\n9785\t婼\n9786\t擲\n9787\t昐\n9788\t鬟\n9789\t忂\n9790\t攮\n9791\t攉\n9792\t撙\n9793\t撺\n9794\t邅\n9795\t邩\n9796\t呒\n9797\t哙\n9798\t唣\n9799\t哐\n9800\t咭\n9801\t啭\n9802\t遅\n9803\t遹\n9804\t唼\n9805\t痶\n9806\t籺\n9807\t籘\n9808\t盩\n9809\t俆\n9810\t嘥\n9811\t昑\n9812\t桾\n9813\t漈\n9814\t畊\n9815\t僽\n9816\t坲\n9817\t塽\n9818\t妘\n9819\t奤\n9820\t孶\n9821\t朥\n9822\tⅩ\n9823\t夲\n9824\t燏\n9825\t鷟\n9826\t鷂\n9827\t鷞\n9828\t鷨\n9829\t鷯\n9830\t鷸\n9831\t鷿\n9832\tи\n9833\tペ\n9834\t佢\n9835\t嗂\n9836\t氌\n9837\t诖\n9838\t诎\n9839\t诙\n9840\t诋\n9841\t冖\n9842\t跂\n9843\t跅\n9844\t跐\n9845\t咘\n9846\t圌\n9847\t滪\n9848\t鵠\n9849\t鵟\n9850\t鶒\n9851\t鶘\n9852\t①\n9853\t佡\n9854\t嬞\n9855\t俅\n9856\t俜\n9857\t贄\n9858\t凔\n9859\t曽\n9860\t鸤\n9861\t髹\n9862\t娵\n9863\t撣\n9864\t鄹\n9865\t邾\n9866\t郾\n9867\t蹐\n9868\t蹓\n9869\t蹠\n9870\t蹣\n9871\t圏\n9872\t婞\n9873\t孅\n9874\t欬\n9875\t黐\n9876\t鬈\n9877\t冘\n9878\t嗆\n9879\t嵻\n9880\t愜\n9881\t栜\n9882\t炣\n9883\t塄\n9884\t墁\n9885\t芑\n9886\t芏\n9887\t鼙\n9888\t軋\n9889\t軏\n9890\t軛\n9891\t诮\n9892\t劢\n9893\t诓\n9894\t诔\n9895\t讵\n9896\t卺\n9897\t诹\n9898\t诼\n989
9\t哿\n9900\t麬\n9901\t苈\n9902\t苠\n9903\t倨\n9904\t貰\n9905\t阽\n9906\t偾\n9907\t鷇\n9908\t賒\n9909\t鷈\n9910\t阼\n9911\t匐\n9912\t踖\n9913\t鷉\n9914\t鹔\n9915\t陧\n9916\t垤\n9917\t埏\n9918\t巯\n9919\t芴\n9920\t躪\n9921\t鹠\n9922\t鵑\n9923\t傺\n9924\t鹡\n9925\t鸑\n9926\t苎\n9927\t鶲\n9928\t貽\n9929\t僬\n9930\t埚\n9931\t埘\n9932\t埒\n9933\t圬\n9934\t躉\n9935\t芤\n9936\t茇\n9937\t趐\n9938\t躊\n9939\t踧\n9940\t跶\n9941\t鹯\n9942\t跼\n9943\t賡\n9944\t龠\n9945\t賂\n9946\t鹴\n9947\t邡\n9948\t蠃\n9949\t埴\n9950\t茺\n9951\t鶺\n9952\t赪\n9953\t黈\n9954\t鶻\n9955\t踴\n9956\t谳\n9957\t鵼\n9958\t圯\n9959\t鸝\n9960\t鸂\n9961\t鼱\n9962\t鱲\n9963\t璿\n9964\t憊\n9965\t泇\n9966\t玍\n9967\t嶸\n9968\t烿\n9969\t皐\n9970\t竪\n9971\t玾\n9972\t杦\n9973\t泈\n9974\t鱓\n9975\t媁\n9976\t榃\n9977\t淲\n9978\t玿\n9979\t亁\n9980\t碭\n9981\t癤\n9982\t疿\n9983\t揦\n9984\t榅\n9985\t鱵\n9986\t瑈\n9987\t儁\n9988\t漼\n9989\t玒\n9990\t甕\n9991\t塝\n9992\t榊\n9993\t圐\n9994\t岧\n9995\t汖\n9996\t齶\n9997\t齙\n9998\t龁\n9999\t龔\n10000\t龘\n10001\t龢\n10002\t冚\n10003\t嗇\n10004\t堓\n10005\t壿\n10006\t澼\n10007\t炤\n10008\t莸\n10009\t輻\n10010\t輾\n10011\t轁\n10012\t蒎\n10013\t鼴\n10014\t菪\n10015\t萑\n10016\t輋\n10017\t軫\n10018\t蓊\n10019\t菸\n10020\t齦\n10021\t蒗\n10022\t葚\n10023\t齪\n10024\t輗\n10025\t葸\n10026\t蔸\n10027\t蔹\n10028\t輜\n10029\t輞\n10030\t葺\n10031\t輟\n10032\t縵\n10033\t儂\n10034\t唞\n10035\t妟\n10036\t媧\n10037\t瀦\n10038\t鱖\n10039\t礪\n10040\t俍\n10041\t嘮\n10042\t怹\n10043\t堽\n10044\t嶠\n10045\t橚\n10046\t汘\n10047\t勣\n10048\t堔\n10049\t搠\n10050\t迀\n10051\t薷\n10052\t辿\n10053\t薅\n10054\t蕞\n10055\t轡\n10056\t迏\n10057\t迵\n10058\t蕹\n10059\t廾\n10060\t辀\n10061\t掮\n10062\t揲\n10063\t揸\n10064\t揠\n10065\t辡\n10066\t轍\n10067\t尥\n10068\t摅\n10069\t搛\n10070\t搋\n10071\t轘\n10072\t搡\n10073\t摞\n10074\t摭\n10075\t揶\n10076\t鳡\n10077\t笭\n10078\t厾\n10079\t棤\n10080\t湢\n10081\t犠\n10082\t牠\n10083\t牁\n10084\t牂\n10085\t燼\n10086\t犼\n10087\t燜\n10088\t爊\n10089\t爍\n10090\t爕\n10091\t犧\n10092\t碻\n10093\t碸\n10094\t磣\n10095\t碷\n10096\t磠\n10097\t磢\n10098\t碄\n10099\t碶\n10100\t磩\n10101\t磪\n10102\t磭\n10103\t磰\n10104\t磱\n10105\t磳\n10106\t磵\n10107\t磶\n10108\t磹\n1010
9\t磼\n10110\t磿\n10111\t礀\n10112\t礂\n10113\t礃\n10114\t礄\n10115\t礆\n10116\t礇\n10117\t礈\n10118\t礉\n10119\t礊\n10120\t礋\n10121\t`\n10122\t^\n10123\t╜\n10124\t╚\n10125\t▇\n10126\tㄗ\n10127\t▅\n10128\tǖ\n10129\t╛\n10130\tǚ\n10131\tǜ\n10132\tɑ\n10133\t╗\n10134\t╙\n10135\t▄\n10136\t▆\n10137\tǘ\n10138\tˊ\n10139\t╘\n10140\t█\n10141\t▉\n10142\t▊\n10143\t▋\n10144\t▌\n10145\t▍\n10146\t▎\n10147\t▏\n10148\t▓\n10149\t▔\n10150\t▼\n10151\t◢\n10152\t◣\n10153\t◤\n10154\t◥\n10155\t☉\n10156\t⊕\n10157\t〒\n10158\t〝\n10159\t〞\n10160\t~\n10161\t鞔\n10162\t鱜\n10163\t鱚\n10164\t鱺\n10165\t鱛\n10166\t鱙\n10167\t鱹\n10168\t鞫\n10169\t鰼\n10170\t鱻\n10171\t鱽\n10172\t鱾\n10173\t鲄\n10174\t鲌\n10175\t鲓\n10176\t鲗\n10177\t鲝\n10178\t鲪\n10179\t鲬\n10180\t鲿\n10181\t鳁\n10182\t鳂\n10183\t鳈\n10184\t鳒\n10185\t鳚\n10186\t鳛\n10187\t譤\n10188\t讄\n10189\t譥\n10190\t譡\n10191\t譢\n10192\t讉\n10193\t讋\n10194\t讌\n10195\t讑\n10196\t讒\n10197\t讔\n10198\t讕\n10199\t讙\n10200\t讛\n10201\t讜\n10202\t讞\n10203\t讟\n10204\t讬\n10205\t讱\n10206\t讻\n10207\t诇\n10208\t诐\n10209\t诪\n10210\t谉\n10211\t<\n10212\t=\n10213\t>\n10214\t琡\n10215\t琟\n10216\t瑍\n10217\t琠\n10218\t琜\n10219\t琞\n10220\t瑊\n10221\t瑌\n10222\t珸\n10223\t琝\n10224\t瑎\n10225\t瑏\n10226\t瑐\n10227\t瑑\n10228\t瑓\n10229\t瑔\n10230\t瑖\n10231\t瑘\n10232\t瑡\n10233\t瑥\n10234\t瑦\n10235\t瑬\n10236\t瑲\n10237\t瑳\n10238\t瑴\n10239\t瑵\n10240\t瑹\n10241\t|\n10242\tΥ\n10243\tΘ\n10244\tψ\n10245\tΜ\n10246\tΖ\n10247\tΠ\n10248\tΦ\n10249\tΟ\n10250\tΝ\n10251\tΑ\n10252\tΨ\n10253\tΩ\n10254\tΛ\n10255\tΗ\n10256\tΧ\n10257\tΙ\n10258\tΞ\n10259\tΔ\n10260\tΒ\n10261\tΓ\n10262\tΕ\n10263\tΡ\n10264\t癪\n10265\t皘\n10266\t癧\n10267\t癅\n10268\t癨\n10269\t皝\n10270\t皟\n10271\t皠\n10272\t皡\n10273\t皢\n10274\t皣\n10275\t皥\n10276\t皧\n10277\t皨\n10278\t皪\n10279\t皫\n10280\t皬\n10281\t皭\n10282\t皯\n10283\t皳\n10284\t皵\n10285\t皶\n10286\t皷\n10287\t皹\n10288\t皻\n10289\t皼\n10290\t皽\n10291\t皾\n10292\t盀\n10293\t礰\n10294\t礮\n10295\t祣\n10296\t礫\n10297\t礭\n10298\t祡\n10299\t礍\n10300\t祦\n10301\t祩\n10302\t祪\n10303\t祫\n10304\t祬\n10305\t祮\n10306\t祰\n10307\t祱\n10308\t祲\n1030
9\t祳\n10310\t祴\n10311\t祵\n10312\t祶\n10313\t祹\n10314\t祻\n10315\t祽\n10316\t祾\n10317\t禂\n10318\t禃\n10319\t禆\n10320\t禈\n10321\t禉\n10322\t禋\n10323\t禌\n10324\t禐\n10325\t禑\n10326\t_\n10327\t痐\n10328\t瘇\n10329\t痏\n10330\t痆\n10331\t痌\n10332\t瘂\n10333\t疈\n10334\t瘎\n10335\t瘏\n10336\t瘑\n10337\t瘒\n10338\t瘔\n10339\t瘖\n10340\t瘚\n10341\t瘜\n10342\t瘝\n10343\t瘞\n10344\t瘬\n10345\t瘮\n10346\t瘯\n10347\t瘲\n10348\t瘶\n10349\t瘷\n10350\t瘹\n10351\t瘻\n10352\t瘽\n10353\t癁\n10354\t璥\n10355\t瓇\n10356\t瓅\n10357\t璤\n10358\t璢\n10359\t瑻\n10360\t璡\n10361\t瓈\n10362\t瓉\n10363\t瓌\n10364\t瓍\n10365\t瓎\n10366\t瓐\n10367\t瓑\n10368\t瓓\n10369\t瓕\n10370\t瓖\n10371\t瓙\n10372\t瓝\n10373\t瓡\n10374\t瓥\n10375\t瓧\n10376\t瓨\n10377\t瓩\n10378\t瓪\n10379\t瓫\n10380\t瓬\n10381\t瓭\n10382\t瓱\n10383\t-\n10384\t,\n10385\t;\n10386\t:\n10387\t!\n10388\t〈\n10389\t〃\n10390\t△\n10391\t∽\n10392\t‖\n10393\tˇ\n10394\t“\n10395\t〉\n10396\t’\n10397\t…\n10398\t　\n10399\t】\n10400\t》\n10401\t「\n10402\t～\n10403\t』\n10404\t¨\n10405\t《\n10406\t‘\n10407\t·\n10408\t、\n10409\t。\n10410\tˉ\n10411\t”\n10412\t?\n10413\t縙\n10414\t縚\n10415\t縘\n10416\t縶\n10417\t縸\n10418\t縗\n10419\t縺\n10420\t縼\n10421\t縿\n10422\t繀\n10423\t繂\n10424\t繈\n10425\t繉\n10426\t繊\n10427\t繋\n10428\t繌\n10429\t繍\n10430\t繎\n10431\t繏\n10432\t繐\n10433\t繑\n10434\t繒\n10435\t繓\n10436\t繖\n10437\t繗\n10438\t繘\n10439\t繙\n10440\t繛\n10441\t繜\n10442\t/\n10443\t穈\n10444\t穅\n10445\t穨\n10446\t穦\n10447\t穄\n10448\t穧\n10449\t稝\n10450\t穃\n10451\t穪\n10452\t穬\n10453\t穭\n10454\t穮\n10455\t穱\n10456\t穲\n10457\t穵\n10458\t穻\n10459\t穼\n10460\t穽\n10461\t穾\n10462\t窂\n10463\t窇\n10464\t窉\n10465\t窋\n10466\t窌\n10467\t窎\n10468\t窏\n10469\t窐\n10470\t窔\n10471\t窙\n10472\t窚\n10473\t窛\n10474\t窞\n10475\t窡\n10476\t竈\n10477\t竳\n10478\t竉\n10479\t竲\n10480\t竆\n10481\t竴\n10482\t竵\n10483\t竷\n10484\t竸\n10485\t竻\n10486\t竼\n10487\t竾\n10488\t笀\n10489\t笁\n10490\t笂\n10491\t笅\n10492\t笇\n10493\t笌\n10494\t笍\n10495\t笎\n10496\t笐\n10497\t笓\n10498\t笖\n10499\t笗\n10500\t笘\n10501\t笚\n10502\t笜\n10503\t笝\n10504\t笟\n10505\t笡\n10506\t笢\n10507\t笧\n10508\t笩\n1050
9\t'\n10510\t\"\n10511\t珇\n10512\t珆\n10513\t珋\n10514\t珒\n10515\t珓\n10516\t珔\n10517\t珕\n10518\t珗\n10519\t珘\n10520\t珚\n10521\t珜\n10522\t珟\n10523\t珡\n10524\t珢\n10525\t珤\n10526\t珦\n10527\t珨\n10528\t珫\n10529\t珬\n10530\t珯\n10531\t珳\n10532\t珴\n10533\t珶\n10534\t粇\n10535\t粅\n10536\t籣\n10537\t粆\n10538\t粈\n10539\t粊\n10540\t粋\n10541\t粌\n10542\t粍\n10543\t粎\n10544\t粏\n10545\t粐\n10546\t粓\n10547\t粔\n10548\t粖\n10549\t粚\n10550\t粛\n10551\t粠\n10552\t粡\n10553\t粣\n10554\t粦\n10555\t粫\n10556\t粭\n10557\t粯\n10558\t粰\n10559\t粴\n10560\t粶\n10561\t粷\n10562\t粸\n10563\t粺\n10564\t(\n10565\t)\n10566\t[\n10567\t]\n10568\t{\n10569\t}\n10570\tЖ\n10571\tК\n10572\tЕ\n10573\tУ\n10574\tН\n10575\tА\n10576\tХ\n10577\tЦ\n10578\tЙ\n10579\tЩ\n10580\tЁ\n10581\tФ\n10582\tЗ\n10583\tМ\n10584\tГ\n10585\tВ\n10586\tД\n10587\tП\n10588\t禴\n10589\t秪\n10590\t秥\n10591\t禰\n10592\t秢\n10593\t秨\n10594\t禓\n10595\t秮\n10596\t秱\n10597\t秲\n10598\t秳\n10599\t秴\n10600\t秵\n10601\t秶\n10602\t秷\n10603\t秹\n10604\t秺\n10605\t秼\n10606\t秿\n10607\t稁\n10608\t稄\n10609\t稉\n10610\t稊\n10611\t稌\n10612\t稏\n10613\t稐\n10614\t稑\n10615\t稒\n10616\t稓\n10617\t稕\n10618\t稖\n10619\t稘\n10620\t稙\n10621\t稛\n10622\t┐\n10623\t┄\n10624\t﹡\n10625\t┈\n10626\t﹟\n10627\t│\n10628\t┌\n10629\t┑\n10630\t┋\n10631\t┉\n10632\t┓\n10633\t└\n10634\t┇\n10635\t﹞\n10636\t﹠\n10637\t┒\n10638\t┅\n10639\t┊\n10640\t〡\n10641\t─\n10642\t‐\n10643\t┍\n10644\t﹢\n10645\t﹣\n10646\t﹤\n10647\t﹥\n10648\t﹦\n10649\t﹨\n10650\t﹩\n10651\t﹪\n10652\t﹫\n10653\t〇\n10654\t甞\n10655\t畘\n10656\t畖\n10657\t甠\n10658\t甗\n10659\t甝\n10660\t畕\n10661\t畗\n10662\t甛\n10663\t畞\n10664\t畟\n10665\t畠\n10666\t畡\n10667\t畣\n10668\t畧\n10669\t畨\n10670\t畩\n10671\t畮\n10672\t畱\n10673\t畳\n10674\t畵\n10675\t畷\n10676\t畺\n10677\t畻\n10678\t畼\n10679\t畽\n10680\t畾\n10681\t疀\n10682\t疁\n10683\t疂\n10684\t疄\n10685\t@\n10686\tぅ\n10687\t⒋\n10688\tⅷ\n10689\tⅦ\n10690\t⒆\n10691\tⅵ\n10692\t⒌\n10693\tⅰ\n10694\t⒖\n10695\t⒎\n10696\t⒏\n10697\t⒒\n10698\tⅶ\n10699\t⒍\n10700\tⅸ\n10701\tⅳ\n10702\tⅱ\n10703\tⅲ\n10704\tⅴ\n10705\t⒈\n10706\t（\n10707\tＷ\n10708\t，\n107
09\t＆\n10710\t５\n10711\t／\n10712\t－\n10713\t！\n10714\t？\n10715\t＋\n10716\t；\n10717\t＇\n10718\t）\n10719\t．\n10720\t￥\n10721\t＂\n10722\t＃\n10723\t％\n10724\tァ\n10725\tェ\n10726\tォ\n10727\t\\\n10728\t盽\n10729\t盺\n10730\t眫\n10731\t盻\n10732\t盵\n10733\t眥\n10734\t眪\n10735\t盄\n10736\t眮\n10737\t眰\n10738\t眱\n10739\t眲\n10740\t眳\n10741\t眹\n10742\t眻\n10743\t眽\n10744\t眿\n10745\t睂\n10746\t睄\n10747\t睅\n10748\t睆\n10749\t睈\n10750\t睉\n10751\t睊\n10752\t睋\n10753\t睌\n10754\t睍\n10755\t睎\n10756\t睓\n10757\t睔\n10758\t睕\n10759\t睖\n10760\t睗\n10761\t睘\n10762\t睙\n10763\t\u0002\n10764\t\u0003\n10765\t\u0004\n10766\t\u0005\n10767\t\u0006\n10768\t\u0007\n10769\t\b\n10770\t\u000b\n10771\t\f\n10772\t\u000e\n10773\t\u000f\n10774\t\u0010\n10775\t\u0011\n10776\t\u0012\n10777\t\u0013\n10778\t\u0014\n10779\t\u0015\n10780\t\u0016\n10781\t\u0017\n10782\t\u0018\n10783\t\u0019\n10784\t\u001a\n10785\t\u001b\n10786\t\u001c\n10787\t\u001d\n10788\t\u001e\n10789\t\u001f\n10790\t\n10791\t伌\n10792\t乣\n10793\t乛\n10794\t仺\n10795\t伂\n10796\t仸\n10797\t伆\n10798\t乢\n10799\t伅\n10800\t伃\n10801\t仭\n10802\t伩\n10803\t伔\n10804\t伀\n10805\t乕\n10806\t亄\n10807\t仹\n10808\t伓\n10809\t仼\n10810\t伄\n10811\t丂\n10812\t仯\n10813\t仴\n10814\t乗\n10815\t伇\n10816\t亐\n10817\t亖\n10818\t亗\n10819\t亙\n10820\t亝\n10821\t亣\n10822\t亪\n10823\t亯\n10824\t亰\n10825\t亱\n10826\t亴\n10827\t亷\n10828\t亸\n10829\t亼\n10830\t亽\n10831\t亾\n10832\t仈\n10833\t仌\n10834\t仒\n10835\t仛\n10836\t仜\n10837\t仠\n10838\t仢\n10839\t仦\n10840\t仧\n10841\t俙\n10842\t俕\n10843\t傋\n10844\t倈\n10845\t偊\n10846\t偘\n10847\t偟\n10848\t俖\n10849\t偗\n10850\t偔\n10851\t偂\n10852\t偪\n10853\t偡\n10854\t偢\n10855\t偒\n10856\t偦\n10857\t俒\n10858\t俔\n10859\t倇\n10860\t偋\n10861\t偠\n10862\t偐\n10863\t偖\n10864\t侤\n10865\t偆\n10866\t偄\n10867\t偅\n10868\t俓\n10869\t偙\n10870\t倎\n10871\t倐\n10872\t倛\n10873\t倝\n10874\t倠\n10875\t倢\n10876\t倣\n10877\t倯\n10878\t倰\n10879\t倱\n10880\t倲\n10881\t倳\n10882\t倷\n10883\t倸\n10884\t倹\n10885\t倽\n10886\t倿\n10887\t偀\n10888\t僠\n10889\t儴\n10890\t凎\n10891\t冏\n10892\t儸\n10893\t儼\n10894\t兊\n10895\t僟\n10896
\t儻\n10897\t儹\n10898\t儭\n10899\t兛\n10900\t兎\n10901\t兏\n10902\t兓\n10903\t僛\n10904\t儃\n10905\t儅\n10906\t儳\n10907\t儵\n10908\t儺\n10909\t傽\n10910\t儰\n10911\t儮\n10912\t儯\n10913\t儱\n10914\t儽\n10915\t儊\n10916\t儌\n10917\t儍\n10918\t儎\n10919\t儏\n10920\t儐\n10921\t儑\n10922\t儓\n10923\t儕\n10924\t儖\n10925\t儗\n10926\t儙\n10927\t儛\n10928\t儜\n10929\t儞\n10930\t儠\n10931\t儢\n10932\t儣\n10933\t儤\n10934\t儥\n10935\t儦\n10936\t儧\n10937\t儨\n10938\t儩\n10939\t儫\n10940\t劥\n10941\t刞\n10942\t刕\n10943\t剘\n10944\t匃\n10945\t劕\n10946\t剕\n10947\t劙\n10948\t劦\n10949\t刜\n10950\t劘\n10951\t劖\n10952\t効\n10953\t劮\n10954\t劯\n10955\t劔\n10956\t刐\n10957\t刔\n10958\t剓\n10959\t剗\n10960\t劎\n10961\t劧\n10962\t劗\n10963\t凘\n10964\t劋\n10965\t刓\n10966\t劚\n10967\t剙\n10968\t剚\n10969\t剟\n10970\t剠\n10971\t剢\n10972\t剣\n10973\t剤\n10974\t剦\n10975\t剨\n10976\t剫\n10977\t剬\n10978\t剭\n10979\t剮\n10980\t剰\n10981\t剱\n10982\t剳\n10983\t剴\n10984\t剶\n10985\t剷\n10986\t剸\n10987\t剹\n10988\t剻\n10989\t剼\n10990\t剾\n10991\t劀\n10992\t劄\n10993\t劅\n10994\t叴\n10995\t卄\n10996\t叏\n10997\t厏\n10998\t咓\n10999\t呑\n11000\t叕\n11001\t厊\n11002\t叞\n11003\t叺\n11004\t卂\n11005\t叝\n11006\t叚\n11007\t叀\n11008\t吙\n11009\t叿\n11010\t吀\n11011\t叓\n11012\t吇\n11013\t匸\n11014\t匽\n11015\t厈\n11016\t厎\n11017\t収\n11018\t叾\n11019\t叐\n11020\t叜\n11021\t匑\n11022\t叅\n11023\t叄\n11024\t匼\n11025\t厐\n11026\t厑\n11027\t厒\n11028\t厓\n11029\t厖\n11030\t厗\n11031\t厛\n11032\t厜\n11033\t厞\n11034\t厡\n11035\t厤\n11036\t厧\n11037\t厪\n11038\t厫\n11039\t厬\n11040\t厯\n11041\t厰\n11042\t厱\n11043\t厵\n11044\t厸\n11045\t厹\n11046\t厺\n11047\t厼\n11048\t厽\n11049\t哷\n11050\t哵\n11051\t唦\n11052\t嗺\n11053\t喿\n11054\t啲\n11055\t啨\n11056\t啺\n11057\t喌\n11058\t哶\n11059\t啹\n11060\t啳\n11061\t喎\n11062\t喐\n11063\t喕\n11064\t哰\n11065\t哴\n11066\t唟\n11067\t唥\n11068\t啩\n11069\t喍\n11070\t啯\n11071\t啴\n11072\t咢\n11073\t啢\n11074\t啠\n11075\t哱\n11076\t啽\n11077\t唨\n11078\t唩\n11079\t唫\n11080\t唭\n11081\t唲\n11082\t唴\n11083\t唵\n11084\t唶\n11085\t唹\n11086\t唺\n11087\t唻\n11088\t唽\n11089\t啀\n11090\t啂\n11091\t啅\n11092\t啇\n11093\t啋\n11094\t啌\n11095\t啍\n11096
\t啎\n11097\t啑\n11098\t啒\n11099\t啔\n11100\t啗\n11101\t啘\n11102\t啙\n11103\t啚\n11104\t啛\n11105\t嚧\n11106\t嘸\n11107\t嘵\n11108\t嚚\n11109\t噡\n11110\t囎\n11111\t嚞\n11112\t噟\n11113\t嚘\n11114\t嘷\n11115\t嚡\n11116\t嚳\n11117\t嚪\n11118\t嚫\n11119\t嚝\n11120\t嚙\n11121\t嚩\n11122\t嚛\n11123\t嚠\n11124\t嚖\n11125\t嚔\n11126\t嚗\n11127\t嚤\n11128\t噣\n11129\t噥\n11130\t噦\n11131\t噧\n11132\t噮\n11133\t噰\n11134\t噳\n11135\t噷\n11136\t噺\n11137\t噾\n11138\t噿\n11139\t嚁\n11140\t嚂\n11141\t嚃\n11142\t嚄\n11143\t嚈\n11144\t嚉\n11145\t嚊\n11146\t嚌\n11147\t埓\n11148\t坄\n11149\t坁\n11150\t埁\n11151\t垀\n11152\t堶\n11153\t坾\n11154\t埌\n11155\t埖\n11156\t坃\n11157\t埊\n11158\t垺\n11159\t埧\n11160\t埛\n11161\t埜\n11162\t埢\n11163\t圼\n11164\t圿\n11165\t坽\n11166\t坿\n11167\t埀\n11168\t埄\n11169\t埉\n11170\t垽\n11171\t垼\n11172\t圽\n11173\t埍\n11174\t垁\n11175\t垇\n11176\t垉\n11177\t垊\n11178\t垍\n11179\t垎\n11180\t垐\n11181\t垑\n11182\t垔\n11183\t垖\n11184\t垗\n11185\t垘\n11186\t垙\n11187\t垜\n11188\t垥\n11189\t垨\n11190\t垪\n11191\t垬\n11192\t垯\n11193\t垰\n11194\t垶\n11195\t垷\n11196\t壌\n11197\t塦\n11198\t塣\n11199\t墌\n11200\t壸\n11201\t壃\n11202\t壈\n11203\t壍\n11204\t壄\n11205\t墶\n11206\t壙\n11207\t壐\n11208\t壂\n11209\t壔\n11210\t塠\n11211\t墈\n11212\t墋\n11213\t墽\n11214\t壎\n11215\t墿\n11216\t堾\n11217\t墹\n11218\t墷\n11219\t墸\n11220\t墺\n11221\t塡\n11222\t壉\n11223\t墍\n11224\t墏\n11225\t墑\n11226\t墔\n11227\t墕\n11228\t墖\n11229\t増\n11230\t墛\n11231\t墝\n11232\t墠\n11233\t墡\n11234\t墢\n11235\t墣\n11236\t墤\n11237\t墥\n11238\t墦\n11239\t墧\n11240\t墪\n11241\t墫\n11242\t墬\n11243\t墭\n11244\t墯\n11245\t墰\n11246\t墱\n11247\t墲\n11248\t墴\n11249\t姶\n11250\t奰\n11251\t姩\n11252\t妦\n11253\t婘\n11254\t妡\n11255\t姲\n11256\t姷\n11257\t奯\n11258\t姱\n11259\t姯\n11260\t姟\n11261\t娍\n11262\t姺\n11263\t姼\n11264\t姭\n11265\t奫\n11266\t妢\n11267\t姧\n11268\t姰\n11269\t夽\n11270\t姢\n11271\t姠\n11272\t姡\n11273\t姤\n11274\t姳\n11275\t妬\n11276\t妭\n11277\t妰\n11278\t妱\n11279\t妳\n11280\t妴\n11281\t妵\n11282\t妷\n11283\t妸\n11284\t妼\n11285\t妽\n11286\t妿\n11287\t姁\n11288\t姂\n11289\t姃\n11290\t姄\n11291\t姅\n11292\t姇\n11293\t姌\n11294\t姎\n11295\t姏\n11296
\t姕\n11297\t姖\n11298\t姙\n11299\t姛\n11300\t嫶\n11301\t媊\n11302\t媈\n11303\t嫧\n11304\t媬\n11305\t嬜\n11306\t嫭\n11307\t媩\n11308\t嫷\n11309\t媉\n11310\t嫮\n11311\t嫛\n11312\t嬁\n11313\t嫹\n11314\t嫺\n11315\t嫬\n11316\t媅\n11317\t媇\n11318\t媫\n11319\t嫥\n11320\t嫸\n11321\t嫨\n11322\t嫯\n11323\t婡\n11324\t嫟\n11325\t嫝\n11326\t嫞\n11327\t嫢\n11328\t媆\n11329\t嫳\n11330\t媮\n11331\t媯\n11332\t媰\n11333\t媴\n11334\t媶\n11335\t媷\n11336\t媹\n11337\t媺\n11338\t媻\n11339\t媼\n11340\t媿\n11341\t嫅\n11342\t嫇\n11343\t嫈\n11344\t嫍\n11345\t嫎\n11346\t嫐\n11347\t嫓\n11348\t嫗\n11349\t宆\n11350\t尐\n11351\t寏\n11352\t岟\n11353\t屪\n11354\t尙\n11355\t寍\n11356\t尠\n11357\t尩\n11358\t宊\n11359\t尟\n11360\t尛\n11361\t尶\n11362\t尫\n11363\t尭\n11364\t尗\n11365\t尰\n11366\t宂\n11367\t寋\n11368\t寎\n11369\t尒\n11370\t尞\n11371\t孈\n11372\t尌\n11373\t尡\n11374\t寑\n11375\t寕\n11376\t寖\n11377\t寘\n11378\t寙\n11379\t寚\n11380\t寛\n11381\t寜\n11382\t寠\n11383\t寣\n11384\t寪\n11385\t寱\n11386\t寲\n11387\t寴\n11388\t寷\n11389\t寽\n11390\t尀\n11391\t嵈\n11392\t峖\n11393\t崹\n11394\t嵶\n11395\t崿\n11396\t峾\n11397\t崷\n11398\t嵃\n11399\t嵉\n11400\t峗\n11401\t嵀\n11402\t崱\n11403\t嵖\n11404\t嵎\n11405\t嵏\n11406\t峓\n11407\t峕\n11408\t峿\n11409\t崸\n11410\t嵍\n11411\t崺\n11412\t嵁\n11413\t岪\n11414\t崵\n11415\t崲\n11416\t崳\n11417\t崶\n11418\t峔\n11419\t嵄\n11420\t崄\n11421\t崅\n11422\t崉\n11423\t崊\n11424\t崋\n11425\t崌\n11426\t崏\n11427\t崑\n11428\t崒\n11429\t崓\n11430\t崕\n11431\t崘\n11432\t崙\n11433\t崜\n11434\t崝\n11435\t崠\n11436\t崡\n11437\t崣\n11438\t崥\n11439\t崨\n11440\t崪\n11441\t崫\n11442\t崬\n11443\t崯\n11444\t幋\n11445\t帹\n11446\t巭\n11447\t庽\n11448\t巪\n11449\t帵\n11450\t幇\n11451\t幆\n11452\t幁\n11453\t幙\n11454\t幏\n11455\t幐\n11456\t帿\n11457\t幓\n11458\t嶿\n11459\t巤\n11460\t巬\n11461\t幎\n11462\t帺\n11463\t幃\n11464\t嶡\n11465\t帲\n11466\t帴\n11467\t嶾\n11468\t幈\n11469\t巰\n11470\t巵\n11471\t巶\n11472\t巺\n11473\t巼\n11474\t帄\n11475\t帇\n11476\t帉\n11477\t帊\n11478\t帋\n11479\t帍\n11480\t帎\n11481\t帒\n11482\t帓\n11483\t帗\n11484\t帞\n11485\t帟\n11486\t帠\n11487\t帢\n11488\t帣\n11489\t帤\n11490\t帨\n11491\t帪\n11492\t彺\n11493\t彣\n11494\t弤\n11495\t忳\n11496
\t徸\n11497\t彟\n11498\t彴\n11499\t彽\n11500\t廮\n11501\t彲\n11502\t彮\n11503\t徔\n11504\t徃\n11505\t彨\n11506\t徎\n11507\t廩\n11508\t弡\n11509\t弣\n11510\t彠\n11511\t彾\n11512\t廆\n11513\t彜\n11514\t彚\n11515\t彛\n11516\t彞\n11517\t廫\n11518\t彵\n11519\t弨\n11520\t弫\n11521\t弬\n11522\t弮\n11523\t弰\n11524\t弲\n11525\t弳\n11526\t弴\n11527\t弸\n11528\t弻\n11529\t弽\n11530\t弾\n11531\t弿\n11532\t彁\n11533\t彂\n11534\t彃\n11535\t彄\n11536\t彅\n11537\t彆\n11538\t彇\n11539\t彉\n11540\t彋\n11541\t彍\n11542\t彑\n11543\t惔\n11544\t恅\n11545\t恀\n11546\t惃\n11547\t悀\n11548\t愾\n11549\t愖\n11550\t惉\n11551\t恷\n11552\t惁\n11553\t惏\n11554\t惖\n11555\t恄\n11556\t惎\n11557\t惌\n11558\t悺\n11559\t惪\n11560\t惛\n11561\t惈\n11562\t怺\n11563\t怾\n11564\t恾\n11565\t惂\n11566\t惗\n11567\t惄\n11568\t惍\n11569\t怈\n11570\t悿\n11571\t悾\n11572\t惀\n11573\t怽\n11574\t惐\n11575\t悁\n11576\t悂\n11577\t悆\n11578\t悇\n11579\t悈\n11580\t悊\n11581\t悋\n11582\t悎\n11583\t悏\n11584\t悐\n11585\t悑\n11586\t悓\n11587\t悕\n11588\t悗\n11589\t悘\n11590\t悙\n11591\t悜\n11592\t悞\n11593\t悡\n11594\t悢\n11595\t悤\n11596\t悥\n11597\t悮\n11598\t悳\n11599\t悷\n11600\t懘\n11601\t慲\n11602\t慯\n11603\t懆\n11604\t憕\n11605\t戺\n11606\t懽\n11607\t懍\n11608\t憒\n11609\t懄\n11610\t懓\n11611\t懙\n11612\t慱\n11613\t懐\n11614\t懎\n11615\t憽\n11616\t懣\n11617\t懛\n11618\t懜\n11619\t懌\n11620\t懟\n11621\t憓\n11622\t懅\n11623\t懚\n11624\t懏\n11625\t懁\n11626\t憿\n11627\t懀\n11628\t懃\n11629\t慭\n11630\t懕\n11631\t憖\n11632\t憗\n11633\t憘\n11634\t憛\n11635\t憜\n11636\t憞\n11637\t憟\n11638\t憠\n11639\t憡\n11640\t憢\n11641\t憥\n11642\t憦\n11643\t憪\n11644\t憮\n11645\t憯\n11646\t憰\n11647\t憱\n11648\t憳\n11649\t憴\n11650\t憵\n11651\t憸\n11652\t憹\n11653\t憺\n11654\t憻\n11655\t抈\n11656\t抆\n11657\t挩\n11658\t拁\n11659\t捵\n11660\t挰\n11661\t抾\n11662\t挦\n11663\t挵\n11664\t挼\n11665\t抇\n11666\t挴\n11667\t挱\n11668\t挕\n11669\t捒\n11670\t挿\n11671\t捀\n11672\t挮\n11673\t捇\n11674\t抂\n11675\t抅\n11676\t抺\n11677\t拀\n11678\t挧\n11679\t挬\n11680\t挳\n11681\t扏\n11682\t挙\n11683\t挗\n11684\t挘\n11685\t挜\n11686\t挶\n11687\t拃\n11688\t拑\n11689\t拕\n11690\t拞\n11691\t拡\n11692\t拪\n11693\t拰\n11694\t拲\n11695\t拹\n11696
\t拺\n11697\t拻\n11698\t挀\n11699\t挃\n11700\t挄\n11701\t挅\n11702\t挆\n11703\t挊\n11704\t挋\n11705\t挍\n11706\t挒\n11707\t揯\n11708\t摠\n11709\t搤\n11710\t擏\n11711\t撟\n11712\t摤\n11713\t搢\n11714\t摝\n11715\t摪\n11716\t摰\n11717\t揰\n11718\t摨\n11719\t摥\n11720\t摗\n11721\t摲\n11722\t摣\n11723\t摶\n11724\t揫\n11725\t搟\n11726\t搣\n11727\t摟\n11728\t摱\n11729\t摡\n11730\t摦\n11731\t揁\n11732\t摛\n11733\t摙\n11734\t摚\n11735\t揬\n11736\t搧\n11737\t搨\n11738\t搩\n11739\t搫\n11740\t搮\n11741\t搯\n11742\t搰\n11743\t搱\n11744\t搲\n11745\t搳\n11746\t搵\n11747\t搷\n11748\t搸\n11749\t搹\n11750\t搻\n11751\t搼\n11752\t摀\n11753\t摂\n11754\t摃\n11755\t摉\n11756\t摋\n11757\t摌\n11758\t摍\n11759\t摎\n11760\t摏\n11761\t摐\n11762\t摓\n11763\t摕\n11764\t敶\n11765\t擿\n11766\t擽\n11767\t敤\n11768\t旝\n11769\t斪\n11770\t敩\n11771\t攟\n11772\t敠\n11773\t敯\n11774\t敮\n11775\t敪\n11776\t敺\n11777\t敨\n11778\t敾\n11779\t攞\n11780\t攠\n11781\t敡\n11782\t敹\n11783\t敥\n11784\t敭\n11785\t擛\n11786\t敜\n11787\t敚\n11788\t敟\n11789\t擻\n11790\t敱\n11791\t攦\n11792\t攧\n11793\t攨\n11794\t攩\n11795\t攭\n11796\t攰\n11797\t攷\n11798\t攺\n11799\t攼\n11800\t敀\n11801\t敁\n11802\t敂\n11803\t敃\n11804\t敄\n11805\t敆\n11806\t敇\n11807\t敊\n11808\t敋\n11809\t敍\n11810\t敎\n11811\t敒\n11812\t敓\n11813\t暣\n11814\t昤\n11815\t昢\n11816\t暔\n11817\t晘\n11818\t曶\n11819\t暚\n11820\t晐\n11821\t暒\n11822\t暟\n11823\t暤\n11824\t昣\n11825\t暞\n11826\t暛\n11827\t暋\n11828\t暩\n11829\t暙\n11830\t暬\n11831\t昜\n11832\t昡\n11833\t晎\n11834\t晑\n11835\t暓\n11836\t暥\n11837\t暕\n11838\t暜\n11839\t旲\n11840\t暏\n11841\t暍\n11842\t暎\n11843\t晜\n11844\t晠\n11845\t晢\n11846\t晣\n11847\t晥\n11848\t晩\n11849\t晪\n11850\t晬\n11851\t晱\n11852\t晲\n11853\t晵\n11854\t晹\n11855\t晻\n11856\t晼\n11857\t晽\n11858\t晿\n11859\t暀\n11860\t暃\n11861\t暆\n11862\t柎\n11863\t朻\n11864\t朸\n11865\t枿\n11866\t杶\n11867\t桏\n11868\t栕\n11869\t柆\n11870\t枽\n11871\t柕\n11872\t朹\n11873\t柉\n11874\t柇\n11875\t柨\n11876\t柗\n11877\t柛\n11878\t柣\n11879\t朳\n11880\t朷\n11881\t杮\n11882\t杴\n11883\t枾\n11884\t柖\n11885\t柀\n11886\t朄\n11887\t枻\n11888\t枼\n11889\t柋\n11890\t杸\n11891\t杹\n11892\t杻\n11893\t杽\n11894\t枀\n11895\t枃\n11896
\t枅\n11897\t枆\n11898\t枈\n11899\t枍\n11900\t枎\n11901\t枏\n11902\t枑\n11903\t枔\n11904\t枖\n11905\t枛\n11906\t枠\n11907\t枡\n11908\t枤\n11909\t枩\n11910\t枬\n11911\t枮\n11912\t棿\n11913\t梎\n11914\t梌\n11915\t棬\n11916\t梸\n11917\t椬\n11918\t棳\n11919\t棪\n11920\t椀\n11921\t梍\n11922\t棷\n11923\t棴\n11924\t棥\n11925\t椃\n11926\t椄\n11927\t椈\n11928\t梉\n11929\t梴\n11930\t梷\n11931\t椂\n11932\t棭\n11933\t棶\n11934\t棦\n11935\t棩\n11936\t梊\n11937\t梹\n11938\t梺\n11939\t梻\n11940\t梽\n11941\t梿\n11942\t棁\n11943\t棃\n11944\t棆\n11945\t棇\n11946\t棈\n11947\t棊\n11948\t棌\n11949\t棎\n11950\t棏\n11951\t棐\n11952\t棑\n11953\t棓\n11954\t棔\n11955\t棖\n11956\t棙\n11957\t棛\n11958\t棜\n11959\t棝\n11960\t棞\n11961\t棡\n11962\t棢\n11963\t槾\n11964\t榒\n11965\t榐\n11966\t槰\n11967\t榽\n11968\t橑\n11969\t樧\n11970\t槵\n11971\t榺\n11972\t槮\n11973\t槹\n11974\t樀\n11975\t榑\n11976\t槸\n11977\t槶\n11978\t槨\n11979\t樃\n11980\t槴\n11981\t樆\n11982\t榌\n11983\t榏\n11984\t榹\n11985\t榼\n11986\t槯\n11987\t槷\n11988\t楡\n11989\t槫\n11990\t槩\n11991\t槪\n11992\t槬\n11993\t榾\n11994\t槂\n11995\t槄\n11996\t槅\n11997\t槆\n11998\t槒\n11999\t槕\n12000\t槗\n12001\t槙\n12002\t槚\n12003\t槞\n12004\t槡\n12005\t槢\n12006\t槣\n12007\t槤\n12008\t槥\n12009\t槦\n12010\t櫒\n12011\t欦\n12012\t櫖\n12013\t檤\n12014\t櫐\n12015\t櫟\n12016\t櫙\n12017\t櫗\n12018\t櫋\n12019\t櫩\n12020\t櫡\n12021\t櫢\n12022\t櫕\n12023\t橾\n12024\t檣\n12025\t檥\n12026\t櫑\n12027\t櫠\n12028\t櫓\n12029\t櫘\n12030\t橜\n12031\t櫎\n12032\t櫍\n12033\t櫏\n12034\t橽\n12035\t櫛\n12036\t檧\n12037\t檨\n12038\t檪\n12039\t檭\n12040\t檮\n12041\t檰\n12042\t檱\n12043\t檲\n12044\t檴\n12045\t檶\n12046\t檷\n12047\t檹\n12048\t檺\n12049\t檼\n12050\t檽\n12051\t檾\n12052\t檿\n12053\t櫀\n12054\t櫁\n12055\t櫂\n12056\t櫄\n12057\t櫅\n12058\t櫉\n12059\t毚\n12060\t歗\n12061\t毃\n12062\t殈\n12063\t汍\n12064\t毈\n12065\t殀\n12066\t殾\n12067\t毜\n12068\t歘\n12069\t毌\n12070\t毉\n12071\t殹\n12072\t毧\n12073\t毞\n12074\t毟\n12075\t毇\n12076\t毣\n12077\t歖\n12078\t歿\n12079\t殅\n12080\t毝\n12081\t毊\n12082\t欯\n12083\t殻\n12084\t殽\n12085\t歕\n12086\t殌\n12087\t殎\n12088\t殏\n12089\t殐\n12090\t殑\n12091\t殕\n12092\t殗\n12093\t殙\n12094\t殜\n12095\t殝\n12096
\t殞\n12097\t殟\n12098\t殠\n12099\t殣\n12100\t殥\n12101\t殦\n12102\t殧\n12103\t殨\n12104\t殩\n12105\t殫\n12106\t殬\n12107\t殮\n12108\t殰\n12109\t殱\n12110\t殶\n12111\t洿\n12112\t沗\n12113\t沕\n12114\t涽\n12115\t涀\n12116\t洭\n12117\t浀\n12118\t洯\n12119\t洝\n12120\t浄\n12121\t洬\n12122\t浕\n12123\t沎\n12124\t泏\n12125\t泒\n12126\t洤\n12127\t浂\n12128\t洡\n12129\t洟\n12130\t洠\n12131\t沑\n12132\t洷\n12133\t泙\n12134\t泜\n12135\t泝\n12136\t泟\n12137\t泤\n12138\t泦\n12139\t泧\n12140\t泩\n12141\t泬\n12142\t泭\n12143\t泲\n12144\t泴\n12145\t泹\n12146\t泿\n12147\t洀\n12148\t洂\n12149\t洃\n12150\t洅\n12151\t洆\n12152\t洉\n12153\t洊\n12154\t洍\n12155\t洏\n12156\t洐\n12157\t洓\n12158\t洔\n12159\t洕\n12160\t洖\n12161\t洘\n12162\t湸\n12163\t渀\n12164\t淾\n12165\t湪\n12166\t渵\n12167\t滣\n12168\t溩\n12169\t渱\n12170\t湨\n12171\t湹\n12172\t淿\n12173\t湱\n12174\t湣\n12175\t溈\n12176\t湻\n12177\t湼\n12178\t溁\n12179\t淽\n12180\t渰\n12181\t渳\n12182\t湩\n12183\t湬\n12184\t淍\n12185\t湦\n12186\t湤\n12187\t湥\n12188\t湵\n12189\t渶\n12190\t渷\n12191\t渹\n12192\t渻\n12193\t渽\n12194\t渿\n12195\t湀\n12196\t湁\n12197\t湂\n12198\t湅\n12199\t湆\n12200\t湇\n12201\t湈\n12202\t湌\n12203\t湏\n12204\t湐\n12205\t湑\n12206\t湒\n12207\t湕\n12208\t湗\n12209\t湝\n12210\t湠\n12211\t湡\n12212\t澊\n12213\t漙\n12214\t潹\n12215\t濛\n12216\t澴\n12217\t潀\n12218\t潶\n12219\t澃\n12220\t澋\n12221\t漘\n12222\t澘\n12223\t澐\n12224\t澑\n12225\t潾\n12226\t漑\n12227\t潷\n12228\t澏\n12229\t潻\n12230\t澁\n12231\t滰\n12232\t潳\n12233\t潱\n12234\t潵\n12235\t漒\n12236\t澅\n12237\t潃\n12238\t潅\n12239\t潈\n12240\t潉\n12241\t潊\n12242\t潌\n12243\t潎\n12244\t潐\n12245\t潒\n12246\t潓\n12247\t潕\n12248\t潖\n12249\t潗\n12250\t潙\n12251\t潚\n12252\t潝\n12253\t潠\n12254\t潡\n12255\t潣\n12256\t潥\n12257\t潧\n12258\t潨\n12259\t潩\n12260\t潪\n12261\t潫\n12262\t瀪\n12263\t烑\n12264\t灛\n12265\t灠\n12266\t灥\n12267\t瀇\n12268\t灟\n12269\t灜\n12270\t灐\n12271\t灴\n12272\t灧\n12273\t灨\n12274\t灚\n12275\t灮\n12276\t灖\n12277\t灦\n12278\t濦\n12279\t灒\n12280\t灔\n12281\t瀄\n12282\t灡\n12283\t瀫\n12284\t瀭\n12285\t瀯\n12286\t瀱\n12287\t瀲\n12288\t瀳\n12289\t瀴\n12290\t瀶\n12291\t瀷\n12292\t瀸\n12293\t瀺\n12294\t瀻\n12295\t瀽\n12296
\t瀿\n12297\t灀\n12298\t灁\n12299\t灂\n12300\t灅\n12301\t灆\n12302\t灇\n12303\t灈\n12304\t灉\n12305\t灊\n12306\t灋\n12307\t灍\n12308\t煷\n12309\t焋\n12310\t焇\n12311\t焴\n12312\t燋\n12313\t熥\n12314\t焲\n12315\t煱\n12316\t煹\n12317\t煰\n12318\t煭\n12319\t煛\n12320\t熆\n12321\t煼\n12322\t煫\n12323\t焄\n12324\t焆\n12325\t焮\n12326\t焳\n12327\t煣\n12328\t煻\n12329\t煯\n12330\t煠\n12331\t煝\n12332\t煟\n12333\t焅\n12334\t煴\n12335\t焵\n12336\t焷\n12337\t焸\n12338\t焹\n12339\t焺\n12340\t焻\n12341\t焽\n12342\t焾\n12343\t焿\n12344\t煀\n12345\t煁\n12346\t煂\n12347\t煃\n12348\t煄\n12349\t煆\n12350\t煇\n12351\t煈\n12352\t煋\n12353\t煍\n12354\t煏\n12355\t煐\n12356\t煑\n12357\t煔\n12358\t煕\n12359\t煗\n12360\t煘\n12361\t〖\n12362\tЪ\n12363\t┘\n12364\t⒓\n12365\t＜\n12366\t伡\n12367\t偧\n12368\t劶\n12369\t吋\n12370\t喖\n12371\t埣\n12372\t壖\n12373\t娂\n12374\t嫾\n12375\t尲\n12376\t嵓\n12377\t幖\n12378\t徏\n12379\t懠\n12380\t捈\n12381\t摷\n12382\t敿\n12383\t暭\n12384\t柤\n12385\t椉\n12386\t樇\n12387\t櫦\n12388\t毤\n12389\t浖\n12390\t溂\n12391\t澕\n12392\t灱\n12393\t熂\n12394\t紎\n12395\t糭\n12396\t紏\n12397\t糪\n12398\t紑\n12399\t紒\n12400\t紕\n12401\t紖\n12402\t紝\n12403\t紞\n12404\t紟\n12405\t紣\n12406\t紤\n12407\t紥\n12408\t紦\n12409\t紨\n12410\t紩\n12411\t紪\n12412\t紬\n12413\t紭\n12414\t紱\n12415\t紲\n12416\t紴\n12417\t紵\n12418\t〗\n12419\tЫ\n12420\t┙\n12421\t⒔\n12422\t＝\n12423\t伣\n12424\t偨\n12425\t兘\n12426\t劷\n12427\t吔\n12428\t喗\n12429\t嚱\n12430\t埥\n12431\t壗\n12432\t娊\n12433\t嫿\n12434\t尳\n12435\t嵔\n12436\t惤\n12437\t懡\n12438\t捊\n12439\t斀\n12440\t暯\n12441\t柦\n12442\t椊\n12443\t樈\n12444\t櫧\n12445\t浗\n12446\t溄\n12447\t澖\n12448\t灲\n12449\t絗\n12450\t絴\n12451\t絖\n12452\t絒\n12453\t紷\n12454\t絓\n12455\t絸\n12456\t絺\n12457\t絻\n12458\t絼\n12459\t絽\n12460\t絾\n12461\t絿\n12462\t綀\n12463\t綂\n12464\t綃\n12465\t綄\n12466\t綅\n12467\t綆\n12468\t綇\n12469\t綈\n12470\t綊\n12471\t綌\n12472\t綍\n12473\t綐\n12474\t綒\n12475\t綔\n12476\t綕\n12477\t綖\n12478\t綗\n12479\t【\n12480\tЬ\n12481\t┚\n12482\t⒕\n12483\t＞\n12484\t伨\n12485\t偩\n12486\t兙\n12487\t劸\n12488\t吘\n12489\t嚲\n12490\t埦\n12491\t娋\n12492\t嬀\n12493\t尵\n12494\t嵕\n12495\t幘\n12496
\t従\n12497\t惥\n12498\t懢\n12499\t捑\n12500\t摼\n12501\t斁\n12502\t暰\n12503\t柧\n12504\t椌\n12505\t櫨\n12506\t毦\n12507\t浘\n12508\t灳\n12509\t熅\n12510\t綹\n12511\t緗\n12512\t綶\n12513\t緖\n12514\t緘\n12515\t綷\n12516\t緛\n12517\t緜\n12518\t緟\n12519\t緡\n12520\t緢\n12521\t緤\n12522\t緥\n12523\t緦\n12524\t緧\n12525\t緪\n12526\t緫\n12527\t緭\n12528\t緮\n12529\t緰\n12530\t緳\n12531\t緵\n12532\t緶\n12533\t緷\n12534\t緸\n12535\t絘\n12536\t綼\n12537\t糱\n12538\t糂\n12539\t絙\n12540\t綛\n12541\t糲\n12542\t糃\n12543\t絚\n12544\t糳\n12545\t糄\n12546\t絛\n12547\t紻\n12548\t糴\n12549\t糆\n12550\t紼\n12551\t緀\n12552\t綞\n12553\t糵\n12554\t糉\n12555\t絝\n12556\t紽\n12557\t緁\n12558\t綟\n12559\t糶\n12560\t紾\n12561\t緂\n12562\t糷\n12563\t糎\n12564\t絟\n12565\t紿\n12566\t緃\n12567\t綡\n12568\t糹\n12569\t絠\n12570\t絀\n12571\t緄\n12572\t糺\n12573\t糐\n12574\t絁\n12575\t緅\n12576\t糼\n12577\t糑\n12578\t緆\n12579\t綤\n12580\t糽\n12581\t糒\n12582\t絣\n12583\t緇\n12584\t糓\n12585\t絤\n12586\t糿\n12587\t糔\n12588\t絅\n12589\t緉\n12590\t綨\n12591\t糘\n12592\t綩\n12593\t紁\n12594\t糚\n12595\t絧\n12596\t絇\n12597\t綪\n12598\t糛\n12599\t絈\n12600\t緌\n12601\t紃\n12602\t糝\n12603\t絩\n12604\t絉\n12605\t緍\n12606\t絊\n12607\t緎\n12608\t糡\n12609\t絫\n12610\t総\n12611\t綯\n12612\t紆\n12613\t糢\n12614\t絬\n12615\t緐\n12616\t紇\n12617\t糣\n12618\t絭\n12619\t絍\n12620\t糤\n12621\t絯\n12622\t糥\n12623\t絰\n12624\t絏\n12625\t緓\n12626\t綳\n12627\t糦\n12628\t緔\n12629\t紌\n12630\t絑\n12631\t綵\n12632\t紶\n12633\t綘\n12634\t緺\n12635\t」\n12636\tЧ\n12637\t┕\n12638\t⒐\n12639\t伖\n12640\t偣\n12641\t劰\n12642\t吂\n12643\t喒\n12644\t嚬\n12645\t埞\n12646\t壒\n12647\t尮\n12648\t幑\n12649\t徆\n12650\t惞\n12651\t懝\n12652\t捁\n12653\t摴\n12654\t敼\n12655\t椆\n12656\t樄\n12657\t櫣\n12658\t毠\n12659\t浌\n12660\t湽\n12661\t澒\n12662\t灩\n12663\t煿\n12664\t筦\n12665\t筤\n12666\t箌\n12667\t筥\n12668\t筟\n12669\t筣\n12670\t箎\n12671\t笯\n12672\t筡\n12673\t箑\n12674\t箒\n12675\t箘\n12676\t箙\n12677\t箚\n12678\t箛\n12679\t箞\n12680\t箟\n12681\t箣\n12682\t箤\n12683\t箥\n12684\t箮\n12685\t箯\n12686\t箰\n12687\t箲\n12688\t箳\n12689\t箵\n12690\t箶\n12691\t箷\n12692\t箹\n12693\t箺\n12694\t箻\n12695\t箼\n12696
\t箽\n12697\t箾\n12698\t箿\n12699\t篂\n12700\t篃\n12701\t笰\n12702\t筨\n12703\t笲\n12704\t筩\n12705\t笴\n12706\t筪\n12707\t笵\n12708\t筫\n12709\t笶\n12710\t筬\n12711\t笷\n12712\t筭\n12713\t笻\n12714\t筰\n12715\t笽\n12716\t筳\n12717\t笿\n12718\t筴\n12719\t筀\n12720\t筁\n12721\t筸\n12722\t筂\n12723\t筺\n12724\t筃\n12725\t筄\n12726\t筽\n12727\t筿\n12728\t筈\n12729\t箁\n12730\t筊\n12731\t箂\n12732\t箃\n12733\t筎\n12734\t箄\n12735\t筓\n12736\t箆\n12737\t筕\n12738\t箇\n12739\t筗\n12740\t箈\n12741\t筙\n12742\t箉\n12743\t箊\n12744\t筞\n12745\tΣ\n12746\t〔\n12747\tР\n12748\t┎\n12749\t⒉\n12750\t伈\n12751\t偛\n12752\t儾\n12753\t劜\n12754\t啿\n12755\t埐\n12756\t壊\n12757\t姴\n12758\t嫴\n12759\t尣\n12760\t幉\n12761\t彶\n12762\t惒\n12763\t懖\n12764\t挷\n12765\t摬\n12766\t敳\n12767\t暡\n12768\t柌\n12769\t棽\n12770\t櫜\n12771\t湶\n12772\t灢\n12773\t煵\n12774\t瞏\n12775\t瞊\n12776\t瞇\n12777\t瞉\n12778\t瞷\n12779\t瞹\n12780\t瞈\n12781\t矀\n12782\t矁\n12783\t矂\n12784\t矃\n12785\t矄\n12786\t矆\n12787\t矉\n12788\t矊\n12789\t矋\n12790\t矌\n12791\t矐\n12792\t矑\n12793\t矒\n12794\t矓\n12795\t矔\n12796\t矕\n12797\t矘\n12798\t矙\n12799\t矝\n12800\t矞\n12801\t矟\n12802\t矠\n12803\t矡\n12804\t瞓\n12805\t睟\n12806\t睠\n12807\t睤\n12808\t瞖\n12809\t睧\n12810\t瞗\n12811\t睩\n12812\t睪\n12813\t瞙\n12814\t睭\n12815\t瞚\n12816\t睮\n12817\t瞛\n12818\t睯\n12819\t瞜\n12820\t睰\n12821\t瞝\n12822\t睱\n12823\t睲\n12824\t瞡\n12825\t睳\n12826\t睴\n12827\t睵\n12828\t瞦\n12829\t睶\n12830\t瞨\n12831\t睷\n12832\t瞫\n12833\t睸\n12834\t瞭\n12835\t瞮\n12836\t睻\n12837\t瞯\n12838\t睼\n12839\t瞱\n12840\t瞲\n12841\t瞂\n12842\t瞴\n12843\t瞃\n12844\t瞶\n12845\t瞆\n12846\t矤\n12847\t鞒\n12848\tΤ\n12849\t〕\n12850\t┏\n12851\t⒊\n12852\t偝\n12853\t兂\n12854\t劤\n12855\t叧\n12856\t喅\n12857\t嚦\n12858\t埑\n12859\t壋\n12860\t尦\n12861\t嵆\n12862\t幊\n12863\t彸\n12864\t惓\n12865\t懗\n12866\t挸\n12867\t摮\n12868\t柍\n12869\t棾\n12870\t櫝\n12871\t毘\n12872\t湷\n12873\t澇\n12874\t煶\n12875\t砠\n12876\t硘\n12877\t砡\n12878\t砙\n12879\t砞\n12880\t硔\n12881\t硙\n12882\t矦\n12883\t砛\n12884\t硛\n12885\t硜\n12886\t硠\n12887\t硡\n12888\t硢\n12889\t硣\n12890\t硥\n12891\t硦\n12892\t硧\n12893\t硩\n12894\t硰\n12895\t硲\n12896
\t硳\n12897\t硴\n12898\t硶\n12899\t硸\n12900\t硹\n12901\t硺\n12902\t硽\n12903\t硾\n12904\t硿\n12905\t碀\n12906\t碂\n12907\t砤\n12908\t矨\n12909\t砨\n12910\t砪\n12911\t砫\n12912\t砮\n12913\t矱\n12914\t砯\n12915\t矲\n12916\t砱\n12917\t矴\n12918\t矵\n12919\t矷\n12920\t砵\n12921\t矹\n12922\t砶\n12923\t矺\n12924\t砽\n12925\t砿\n12926\t矼\n12927\t硁\n12928\t砃\n12929\t硂\n12930\t砄\n12931\t砅\n12932\t硄\n12933\t砆\n12934\t硆\n12935\t砇\n12936\t硈\n12937\t砈\n12938\t硉\n12939\t砊\n12940\t硊\n12941\t砋\n12942\t硋\n12943\t砎\n12944\t硍\n12945\t砏\n12946\t硏\n12947\t砐\n12948\t硑\n12949\t砓\n12950\t硓\n12951\t碃\n12952\t╝\n12953\t鱝\n12954\t譨\n12955\t琣\n12956\t礱\n12957\t痑\n12958\t璦\n12959\t穉\n12960\t竌\n12961\t玜\n12962\t籥\n12963\t禷\n12964\t゛\n12965\t乤\n12966\t俛\n12967\t僡\n12968\t刟\n12969\t卆\n12970\t哸\n12971\t坅\n12972\t塧\n12973\t奱\n12974\t媋\n12975\t宎\n12976\t峚\n12977\t廰\n12978\t慳\n12979\t抋\n12980\t揳\n12981\t攁\n12982\t昦\n12983\t朼\n12984\t梐\n12985\t榓\n12986\t檃\n12987\t歛\n12988\t沘\n12989\t渁\n12990\t漚\n12991\tˋ\n12992\t鰽\n12993\t譇\n12994\t疉\n12995\t緼\n12996\t稟\n12997\t獳\n12998\t籄\n12999\t〢\n13000\t瓵\n13001\t盇\n13002\t丄\n13003\t侫\n13004\t凙\n13005\t咥\n13006\t嘇\n13007\t婣\n13008\t孉\n13009\t岮\n13010\t嶢\n13011\t廇\n13012\t怉\n13013\t慉\n13014\t扐\n13015\t揂\n13016\t旳\n13017\t桝\n13018\t橝\n13019\t欰\n13020\t汚\n13021\t淎\n13022\t滱\n13023\t濧\n13024\t鳘\n13025\tΚ\n13026\t—\n13027\t┆\n13028\tⅹ\n13029\t＊\n13030\t仾\n13031\t偑\n13032\t儶\n13033\t劒\n13034\t叒\n13035\t埅\n13036\t壀\n13037\t崻\n13038\t帾\n13039\t挭\n13040\t摢\n13041\t柂\n13042\t棯\n13043\t槳\n13044\t櫔\n13045\t湭\n13046\t灙\n13047\t煪\n13048\t猔\n13049\t猑\n13050\t獈\n13051\t獆\n13052\t猒\n13053\t猍\n13054\t狜\n13055\t猏\n13056\t獉\n13057\t獊\n13058\t獋\n13059\t獌\n13060\t獏\n13061\t獓\n13062\t獔\n13063\t獕\n13064\t獖\n13065\t獘\n13066\t獙\n13067\t獚\n13068\t獛\n13069\t獜\n13070\t獝\n13071\t獞\n13072\t獟\n13073\t獡\n13074\t獤\n13075\t獥\n13076\t獧\n13077\t獩\n13078\t獪\n13079\t獫\n13080\t獮\n13081\t獰\n13082\tㄡ\n13083\t︶\n13084\t♂\n13085\t┽\n13086\t⑨\n13087\t佱\n13088\t傖\n13089\t冡\n13090\t勧\n13091\t呩\n13092\t囜\n13093\t堘\n13094\t夅\n13095\t娽\n13096
\t嬦\n13097\t屷\n13098\t嶀\n13099\t忈\n13100\t愥\n13101\t戓\n13102\t掅\n13103\t斸\n13104\t栣\n13105\t樶\n13106\t欋\n13107\t氠\n13108\t涐\n13109\t炨\n13110\t醏\n13111\t醊\n13112\t醻\n13113\t岈\n13114\t醸\n13115\t醎\n13116\t醄\n13117\t醈\n13118\t醹\n13119\t岍\n13120\t醆\n13121\t醼\n13122\t釂\n13123\t釃\n13124\t釅\n13125\t釒\n13126\t釓\n13127\t釔\n13128\t釕\n13129\t釖\n13130\t釙\n13131\t釚\n13132\t釛\n13133\t釟\n13134\t釠\n13135\t釡\n13136\t釢\n13137\t釤\n13138\tα\n13139\t×\n13140\t┝\n13141\t⒘\n13142\tＡ\n13143\t吜\n13144\t喠\n13145\t嚵\n13146\t埩\n13147\t壛\n13148\t娏\n13149\t嬃\n13150\t嵙\n13151\t幜\n13152\t徚\n13153\t惲\n13154\t懥\n13155\t捔\n13156\t摿\n13157\t斄\n13158\t柫\n13159\t椓\n13160\t樍\n13161\t櫫\n13162\t毩\n13163\t溋\n13164\t澚\n13165\t灹\n13166\t羆\n13167\t羄\n13168\t羭\n13169\t羀\n13170\t羃\n13171\t羮\n13172\t罖\n13173\t羂\n13174\t羴\n13175\t羵\n13176\t羺\n13177\t羻\n13178\t翂\n13179\t翄\n13180\t翆\n13181\t翈\n13182\t翉\n13183\t翋\n13184\t翍\n13185\t翏\n13186\t翐\n13187\t翑\n13188\t翓\n13189\t翖\n13190\t翗\n13191\t翙\n13192\t翜\n13193\t翝\n13194\t翞\n13195\t翢\n13196\tㄠ\n13197\t︵\n13198\t∴\n13199\t┼\n13200\t⑧\n13201\t｀\n13202\t佮\n13203\t冟\n13204\t勦\n13205\t呧\n13206\t嗋\n13207\t囙\n13208\t堗\n13209\t夃\n13210\t娻\n13211\t嬥\n13212\t屶\n13213\t嵿\n13214\t庎\n13215\t忇\n13216\t戉\n13217\t掄\n13218\t撪\n13219\t椸\n13220\t樴\n13221\t氞\n13222\t溹\n13223\t澿\n13224\t炧\n13225\t熰\n13226\t郶\n13227\t鄜\n13228\t郲\n13229\t鄛\n13230\t郂\n13231\t郳\n13232\t鄝\n13233\t鄠\n13234\t鄨\n13235\t鄩\n13236\t鄪\n13237\t鄫\n13238\t鄮\n13239\t鄳\n13240\t鄵\n13241\t鄶\n13242\t鄷\n13243\t鄸\n13244\t鄺\n13245\t鄼\n13246\t鄽\n13247\t鄾\n13248\t鄿\n13249\t酂\n13250\tɡ\n13251\t±\n13252\tЮ\n13253\t├\n13254\t⒗\n13255\t＠\n13256\t伬\n13257\t偫\n13258\t劺\n13259\t吚\n13260\t喞\n13261\t埨\n13262\t壚\n13263\t娎\n13264\t嬂\n13265\t嵗\n13266\t幚\n13267\t徖\n13268\t捓\n13269\t摾\n13270\t暲\n13271\t柪\n13272\t樌\n13273\t櫪\n13274\t毨\n13275\t浝\n13276\t溊\n13277\t澙\n13278\t灷\n13279\t纞\n13280\t繻\n13281\t纚\n13282\t繺\n13283\t纴\n13284\t纼\n13285\t绖\n13286\t绤\n13287\t绬\n13288\t绹\n13289\t缐\n13290\t缞\n13291\t缹\n13292\t缻\n13293\t缼\n13294\t缽\n13295\t缾\n13296
\t缿\n13297\t罀\n13298\t罁\n13299\t罃\n13300\t罆\n13301\t罇\n13302\t罉\n13303\t罊\n13304\t罋\n13305\t罎\n13306\t罏\n13307\t罒\n13308\tㄢ\n13309\t︹\n13310\t♀\n13311\tр\n13312\t┾\n13313\t⑩\n13314\t傗\n13315\t冣\n13316\t勨\n13317\t嗏\n13318\t堚\n13319\t娾\n13320\t嬧\n13321\t屸\n13322\t庘\n13323\t忊\n13324\t愨\n13325\t戔\n13326\t掆\n13327\t撯\n13328\t斺\n13329\t曗\n13330\t栤\n13331\t椻\n13332\t樷\n13333\t欌\n13334\t涒\n13335\t溾\n13336\t獯\n13337\t鈤\n13338\t怊\n13339\t鈢\n13340\t鈅\n13341\t鈁\n13342\t鈃\n13343\t猱\n13344\t釦\n13345\t猓\n13346\t鈂\n13347\t鈥\n13348\t鈧\n13349\t鈨\n13350\t鈩\n13351\t鈪\n13352\t鈫\n13353\t鈭\n13354\t鈮\n13355\t鈯\n13356\t鈰\n13357\t鈱\n13358\t鈲\n13359\t鈳\n13360\t鈵\n13361\t鈶\n13362\t鈸\n13363\t鈹\n13364\t鈻\n13365\t鈼\n13366\t鈽\n13367\t鈾\n13368\t鉁\n13369\t鉂\n13370\t鉃\n13371\t÷\n13372\t┞\n13373\tぢ\n13374\t⒙\n13375\tＢ\n13376\tヂ\n13377\t甭\n13378\t伮\n13379\t偮\n13380\t兟\n13381\t喡\n13382\t嚶\n13383\t壜\n13384\t娐\n13385\t嬄\n13386\t幝\n13387\t惵\n13388\t懧\n13389\t捖\n13390\t撀\n13391\t斅\n13392\t暵\n13393\t柭\n13394\t椔\n13395\t樎\n13396\t櫬\n13397\t溌\n13398\t澛\n13399\t熉\n13400\t耟\n13401\t耝\n13402\t聕\n13403\t耞\n13404\t耓\n13405\t耛\n13406\t翤\n13407\t聙\n13408\t聛\n13409\t聜\n13410\t聝\n13411\t聟\n13412\t聠\n13413\t聡\n13414\t聢\n13415\t聣\n13416\t聥\n13417\t聦\n13418\t聧\n13419\t聨\n13420\t聫\n13421\t聬\n13422\t聭\n13423\t聮\n13424\t聵\n13425\t聸\n13426\t聹\n13427\t聺\n13428\t聻\n13429\t聼\n13430\tㄥ\n13431\t﹀\n13432\t″\n13433\t╁\n13434\t㈠\n13435\t佸\n13436\t勫\n13437\t呭\n13438\t嗗\n13439\t夊\n13440\t婂\n13441\t屽\n13442\t嶅\n13443\t庡\n13444\t忓\n13445\t愬\n13446\t戝\n13447\t掑\n13448\t斿\n13449\t曞\n13450\t栧\n13451\t楀\n13452\t炲\n13453\t熷\n13454\t錪\n13455\t鍉\n13456\t鬻\n13457\t鍇\n13458\t瀵\n13459\t錧\n13460\t鍆\n13461\t鍈\n13462\t錊\n13463\t鍌\n13464\t鍏\n13465\t鍐\n13466\t鍑\n13467\t鍒\n13468\t鍓\n13469\t鍕\n13470\t鍗\n13471\t鍚\n13472\t鍜\n13473\t鍝\n13474\t鍞\n13475\t鍟\n13476\t鍠\n13477\t鍡\n13478\t鍣\n13479\t鍤\n13480\t鍦\n13481\t鍧\n13482\t鍨\n13483\t鍩\n13484\tㄅ\n13485\tε\n13486\t∨\n13487\t┡\n13488\t⑴\n13489\tＥ\n13490\tヅ\n13491\t伵\n13492\t吪\n13493\t喤\n13494\t埮\n13495\t娕\n13496
\t嵟\n13497\t徟\n13498\t懪\n13499\t捙\n13500\t撆\n13501\t暸\n13502\t椗\n13503\t櫯\n13504\t毰\n13505\t溑\n13506\t澟\n13507\t熍\n13508\t臽\n13509\t臹\n13510\t舼\n13511\t臸\n13512\t臷\n13513\t艀\n13514\t艁\n13515\t艃\n13516\t艈\n13517\t艌\n13518\t艐\n13519\t艑\n13520\t艓\n13521\t艔\n13522\t艕\n13523\t艗\n13524\t艛\n13525\t艝\n13526\t艞\n13527\t艤\n13528\t艧\n13529\tㄤ\n13530\t︿\n13531\t′\n13532\t╀\n13533\t佷\n13534\t勪\n13535\t呬\n13536\t嗕\n13537\t囦\n13538\t堜\n13539\t嬩\n13540\t屼\n13541\t忎\n13542\t愪\n13543\t戜\n13544\t掍\n13545\t斾\n13546\t栦\n13547\t椾\n13548\t欎\n13549\t涗\n13550\t滀\n13551\t濅\n13552\t炰\n13553\t熶\n13554\t鋊\n13555\t鋨\n13556\t鋦\n13557\t鋉\n13558\t鋄\n13559\t鋥\n13560\t鋩\n13561\t鋫\n13562\t鋬\n13563\t鋮\n13564\t鋱\n13565\t鋲\n13566\t鋳\n13567\t鋷\n13568\t鋺\n13569\t鋻\n13570\t鋽\n13571\t鋾\n13572\t鋿\n13573\t錀\n13574\t錁\n13575\t錂\n13576\t錅\n13577\t錆\n13578\t錇\n13579\t錈\n13580\tδ\n13581\t∧\n13582\t┠\n13583\t⒛\n13584\tＤ\n13585\t伳\n13586\t勀\n13587\t吥\n13588\t喣\n13589\t嚹\n13590\t埬\n13591\t娔\n13592\t嬆\n13593\t屇\n13594\t嵞\n13595\t幠\n13596\t惸\n13597\t懩\n13598\t捘\n13599\t斈\n13600\t暷\n13601\t柲\n13602\t椖\n13603\t毮\n13604\t浤\n13605\t溎\n13606\t澞\n13607\t熌\n13608\t腵\n13609\t腲\n13610\t膥\n13611\t膢\n13612\t腯\n13613\t膡\n13614\t膤\n13615\t腬\n13616\t膧\n13617\t膫\n13618\t膬\n13619\t膮\n13620\t膯\n13621\t膰\n13622\t膲\n13623\t膴\n13624\t膵\n13625\t膶\n13626\t膷\n13627\t膸\n13628\t膹\n13629\t膼\n13630\t膾\n13631\t臄\n13632\t臅\n13633\t臈\n13634\t臋\n13635\t臐\n13636\t臒\n13637\tㄣ\n13638\t︺\n13639\t°\n13640\t┿\n13641\tゃ\n13642\t冦\n13643\t勩\n13644\t呫\n13645\t嗐\n13646\t囥\n13647\t堛\n13648\t夈\n13649\t嬨\n13650\t屻\n13651\t嶃\n13652\t庛\n13653\t忋\n13654\t愩\n13655\t戙\n13656\t掋\n13657\t撱\n13658\t斻\n13659\t曘\n13660\t栥\n13661\t椼\n13662\t欍\n13663\t氥\n13664\t涖\n13665\t溿\n13666\t濄\n13667\t炪\n13668\t熴\n13669\t鉦\n13670\t銃\n13671\t鉥\n13672\t鉡\n13673\t銄\n13674\t鉢\n13675\t銇\n13676\t銈\n13677\t銉\n13678\t銊\n13679\t銋\n13680\t銏\n13681\t銕\n13682\t銗\n13683\t銙\n13684\t銛\n13685\t銝\n13686\t銞\n13687\t銟\n13688\t銠\n13689\t銡\n13690\t銣\n13691\t銤\n13692\t銦\n13693\t∶\n13694\t┟\n13695\t⒚\n13696
\tＣ\n13697\t伱\n13698\t偯\n13699\t兠\n13700\t劽\n13701\t喢\n13702\t嚸\n13703\t埫\n13704\t壝\n13705\t嵜\n13706\t徝\n13707\t惷\n13708\t懨\n13709\t撁\n13710\t斆\n13711\t暶\n13712\t柮\n13713\t椕\n13714\t樏\n13715\t櫭\n13716\t毭\n13717\t浢\n13718\t澝\n13719\t灻\n13720\t熋\n13721\t胉\n13722\t胇\n13723\t脋\n13724\t胈\n13725\t肹\n13726\t胅\n13727\t肻\n13728\t脌\n13729\t脕\n13730\t脗\n13731\t脙\n13732\t脜\n13733\t脝\n13734\t脟\n13735\t脠\n13736\t脡\n13737\t脢\n13738\t脤\n13739\t脥\n13740\t脦\n13741\t脧\n13742\t脨\n13743\t脪\n13744\t脭\n13745\t脮\n13746\t脰\n13747\t脴\n13748\t脵\n13749\t脺\n13750\t脻\n13751\t脼\n13752\t脽\n13753\t饧\n13754\t饩\n13755\t宀\n13756\t猘\n13757\t狝\n13758\t醓\n13759\t酇\n13760\t羇\n13761\t罙\n13762\t郺\n13763\t繟\n13764\t嗬\n13765\t鈇\n13766\t耡\n13767\t翧\n13768\t猹\n13769\t忄\n13770\t饫\n13771\t忮\n13772\t錋\n13773\t臿\n13774\t鋋\n13775\t銩\n13776\t腶\n13777\t腁\n13778\t溻\n13779\t滗\n13780\t鉧\n13781\t肁\n13782\t懔\n13783\t汔\n13784\t彐\n13785\t猙\n13786\t狟\n13787\t醔\n13788\t酈\n13789\t罛\n13790\t郆\n13791\t纀\n13792\t繠\n13793\t鈈\n13794\t釨\n13795\t翨\n13796\t錬\n13797\t舃\n13798\t臖\n13799\t鋌\n13800\t銪\n13801\t腷\n13802\t腂\n13803\t鉈\n13804\t胋\n13805\t肂\n13806\t猚\n13807\t狢\n13808\t醕\n13809\t酑\n13810\t羉\n13811\t罜\n13812\t郼\n13813\t郈\n13814\t釩\n13815\t耤\n13816\t錭\n13817\t錍\n13818\t臗\n13819\t鋍\n13820\t銫\n13821\t腃\n13822\t鉩\n13823\t胏\n13824\t猟\n13825\t狣\n13826\t醖\n13827\t酓\n13828\t羋\n13829\t郿\n13830\t郉\n13831\t纃\n13832\t繢\n13833\t釪\n13834\t翫\n13835\t錎\n13836\t屦\n13837\t膁\n13838\t鉊\n13839\t胐\n13840\t肈\n13841\t猠\n13842\t狤\n13843\tㄦ\n13844\t︽\n13845\t℃\n13846\tф\n13847\t╂\n13848\t㈡\n13849\t佹\n13850\t冩\n13851\t勬\n13852\t呮\n13853\t嗘\n13854\t囨\n13855\t堟\n13856\t夋\n13857\t婃\n13858\t嬫\n13859\t嶆\n13860\t庢\n13861\t愭\n13862\t戞\n13863\t掓\n13864\t撴\n13865\t旀\n13866\t曟\n13867\t栨\n13868\t楁\n13869\t樻\n13870\t欐\n13871\t氭\n13872\t濇\n13873\t炴\n13874\t熸\n13875\t鎩\n13876\t鎋\n13877\t鎨\n13878\t鍬\n13879\t鎈\n13880\t姹\n13881\t鎯\n13882\t鎰\n13883\t鎱\n13884\t鎲\n13885\t鎳\n13886\t鎴\n13887\t鎵\n13888\t鎶\n13889\t鎷\n13890\t鎸\n13891\t鎹\n13892\t鎺\n13893\t鎻\n13894\t鎼\n13895\t鎽\n13896
\t鎾\n13897\t鎿\n13898\t鏀\n13899\t鏁\n13900\t鏂\n13901\t鏄\n13902\t鏅\n13903\t鏆\n13904\t鏇\n13905\t鏉\n13906\t鏋\n13907\t鏌\n13908\tζ\n13909\t∑\n13910\t┢\n13911\t⑵\n13912\tＦ\n13913\t兤\n13914\t勂\n13915\t喥\n13916\t嚻\n13917\t埰\n13918\t壠\n13919\t娖\n13920\t嵠\n13921\t幤\n13922\t惼\n13923\t捚\n13924\t斊\n13925\t柶\n13926\t椘\n13927\t樒\n13928\t櫰\n13929\t毱\n13930\t浧\n13931\t溒\n13932\t炂\n13933\t熎\n13934\t苸\n13935\t苵\n13936\t芲\n13937\t苶\n13938\t芢\n13939\t苿\n13940\t茊\n13941\t茍\n13942\t茐\n13943\t茒\n13944\t茓\n13945\t茘\n13946\t茙\n13947\t茝\n13948\t茞\n13949\t茟\n13950\t茠\n13951\t茡\n13952\t茢\n13953\t茣\n13954\t茥\n13955\t茦\n13956\t茩\n13957\t茮\n13958\t茰\n13959\t茷\n13960\t茻\n13961\t醗\n13962\t酔\n13963\t羍\n13964\t罞\n13965\t崾\n13966\t郋\n13967\t纄\n13968\t繣\n13969\t嗍\n13970\t釫\n13971\t耬\n13972\t馊\n13973\t錏\n13974\t臙\n13975\t鋏\n13976\t銭\n13977\t腅\n13978\t鉫\n13979\t鉋\n13980\t胑\n13981\t肊\n13982\t闶\n13983\t鎍\n13984\t鍭\n13985\t艫\n13986\t驵\n13987\t芺\n13988\t鎐\n13989\t艭\n13990\t鎑\n13991\t鍰\n13992\t芼\n13993\t鎒\n13994\t鍱\n13995\t芿\n13996\t艵\n13997\t鎓\n13998\t苀\n13999\t鎔\n14000\t鍳\n14001\t苂\n14002\t鎕\n14003\t鍴\n14004\t艸\n14005\t苅\n14006\t艻\n14007\t鎗\n14008\t鍶\n14009\t苆\n14010\t艼\n14011\t鎘\n14012\t鍷\n14013\t苉\n14014\t芀\n14015\t鎙\n14016\t鍸\n14017\t苐\n14018\t芁\n14019\t鎚\n14020\t鍹\n14021\t苖\n14022\t鎛\n14023\t苙\n14024\t芅\n14025\t嫜\n14026\t鎜\n14027\t鍻\n14028\t苚\n14029\t芆\n14030\t嬖\n14031\t鎝\n14032\t苝\n14033\t芇\n14034\t鎞\n14035\t鍽\n14036\t芉\n14037\t鎟\n14038\t芌\n14039\t鎠\n14040\t鍿\n14041\t芐\n14042\t鎀\n14043\t苩\n14044\t芔\n14045\t纟\n14046\t孥\n14047\t苬\n14048\t芕\n14049\t鎃\n14050\t芖\n14051\t鎄\n14052\t苮\n14053\t芚\n14054\t鎦\n14055\t鎅\n14056\t苰\n14057\t苲\n14058\t芞\n14059\t鏍\n14060\t茽\n14061\t猣\n14062\t狥\n14063\t醘\n14064\t酕\n14065\t羏\n14066\t鄁\n14067\t纅\n14068\t繤\n14069\t鈌\n14070\t釬\n14071\t耭\n14072\t翭\n14073\t鋐\n14074\t銯\n14075\t膄\n14076\t腇\n14077\t鉌\n14078\t胒\n14079\t肍\n14080\t猤\n14081\t狦\n14082\t醙\n14083\t酖\n14084\t羐\n14085\t罣\n14086\t郍\n14087\t纆\n14088\t耮\n14089\t錱\n14090\t錑\n14091\t舋\n14092\t臛\n14093\t銰\n14094\t膅\n14095\t腉\n14096
\t鉭\n14097\t鉍\n14098\t胓\n14099\t肎\n14100\t猦\n14101\t狧\n14102\t酘\n14103\t羑\n14104\t鄅\n14105\t纇\n14106\t繦\n14107\t釮\n14108\t耯\n14109\t翲\n14110\t錒\n14111\t銱\n14112\t膆\n14113\t腍\n14114\t鉮\n14115\t鉎\n14116\t胔\n14117\t肏\n14118\t猧\n14119\t狪\n14120\t醝\n14121\t酙\n14122\t羒\n14123\t罥\n14124\t郔\n14125\t纈\n14126\t繧\n14127\t嘞\n14128\t鈏\n14129\t耰\n14130\t翴\n14131\t錓\n14132\t舏\n14133\t臝\n14134\t屙\n14135\t鋓\n14136\t膇\n14137\t漤\n14138\t滹\n14139\t鉯\n14140\t鉏\n14141\t胕\n14142\t肐\n14143\t猨\n14144\t狫\n14145\t酛\n14146\t羓\n14147\t罦\n14148\t鄇\n14149\t繨\n14150\t鈐\n14151\t釰\n14152\t耲\n14153\t翵\n14154\t舑\n14155\t臞\n14156\t膉\n14157\t腏\n14158\t鉰\n14159\t鉐\n14160\t胘\n14161\t肑\n14162\t猭\n14163\t狵\n14164\t醟\n14165\t酜\n14166\t罧\n14167\t鄈\n14168\t郖\n14169\t纊\n14170\t釱\n14171\t耴\n14172\t翶\n14173\t錵\n14174\t錕\n14175\t舓\n14176\t鋕\n14177\t銴\n14178\t膋\n14179\t腒\n14180\t鉱\n14181\t胟\n14182\t肒\n14183\t猯\n14184\t狶\n14185\t醠\n14186\t酟\n14187\t羖\n14188\t罫\n14189\t郘\n14190\t纋\n14191\t鈒\n14192\t釲\n14193\t耹\n14194\t翷\n14195\t錖\n14196\t舕\n14197\t鋖\n14198\t膌\n14199\t腖\n14200\t鉲\n14201\t鉒\n14202\t肔\n14203\t醡\n14204\t酠\n14205\t羗\n14206\t罬\n14207\t鄊\n14208\t郙\n14209\t鈓\n14210\t耺\n14211\t翸\n14212\t錷\n14213\t臡\n14214\t鋗\n14215\t膍\n14216\t腗\n14217\t鉳\n14218\t鉓\n14219\t胢\n14220\t肕\n14221\t猲\n14222\t醤\n14223\t酦\n14224\t羘\n14225\t罭\n14226\t纍\n14227\t繬\n14228\t釴\n14229\t耼\n14230\t庋\n14231\t錸\n14232\t舗\n14233\t鋘\n14234\t膎\n14235\t鉵\n14236\t鉔\n14237\t胣\n14238\t肗\n14239\t猳\n14240\t狾\n14241\t醥\n14242\t酧\n14243\t羙\n14244\t罯\n14245\t郞\n14246\t纎\n14247\t嘁\n14248\t嘣\n14249\t圊\n14250\t耾\n14251\t翺\n14252\t夂\n14253\t庳\n14254\t錙\n14255\t舘\n14256\t臤\n14257\t弪\n14258\t艴\n14259\t逭\n14260\t耪\n14261\t屮\n14262\t鋙\n14263\t膐\n14264\t腛\n14265\t胦\n14266\t肙\n14267\t阍\n14268\t沲\n14269\t猵\n14270\t狿\n14271\t醦\n14272\t酨\n14273\t羛\n14274\t鄍\n14275\t繮\n14276\t鈖\n14277\t聀\n14278\t翽\n14279\t舙\n14280\t鋚\n14281\t膒\n14282\t腜\n14283\t鉖\n14284\t胮\n14285\t肞\n14286\t猀\n14287\t醧\n14288\t酫\n14289\t羜\n14290\t鄎\n14291\t郠\n14292\t繯\n14293\t鈗\n14294\t釷\n14295\t聁\n14296
\t錻\n14297\t錛\n14298\t舚\n14299\t臦\n14300\t鋛\n14301\t銺\n14302\t膓\n14303\t腝\n14304\t胵\n14305\t肣\n14306\t猺\n14307\t猂\n14308\t醨\n14309\t酭\n14310\t羠\n14311\t鄏\n14312\t郣\n14313\t纑\n14314\t繰\n14315\t釸\n14316\t聄\n14317\t翿\n14318\t錼\n14319\t錜\n14320\t舝\n14321\t鋜\n14322\t銻\n14323\t膔\n14324\t腞\n14325\t鉹\n14326\t胷\n14327\t肦\n14328\t猻\n14329\t醩\n14330\t酳\n14331\t羢\n14332\t罶\n14333\t鄐\n14334\t郤\n14335\t纒\n14336\t鈙\n14337\t聅\n14338\t耂\n14339\t錽\n14340\t錝\n14341\t臩\n14342\t鋝\n14343\t膕\n14344\t腟\n14345\t鉺\n14346\t鉙\n14347\t胹\n14348\t肧\n14349\t猼\n14350\t猅\n14351\t鄑\n14352\t郥\n14353\t繲\n14354\t鈚\n14355\t釺\n14356\t聇\n14357\t耇\n14358\t錿\n14359\t錞\n14360\t舤\n14361\t臫\n14362\t鋞\n14363\t膖\n14364\t腡\n14365\t胻\n14366\t肨\n14367\t猽\n14368\t猆\n14369\t酻\n14370\t罸\n14371\t郩\n14372\t纔\n14373\t帱\n14374\t鈛\n14375\t聈\n14376\t耈\n14377\t廒\n14378\t廛\n14379\t鍀\n14380\t錟\n14381\t臮\n14382\t鋟\n14383\t銾\n14384\t膗\n14385\t腢\n14386\t鉼\n14387\t胾\n14388\t肬\n14389\t丬\n14390\t獀\n14391\t醰\n14392\t酼\n14393\t羦\n14394\t罺\n14395\t郪\n14396\t纕\n14397\t繴\n14398\t鈜\n14399\t釼\n14400\t耉\n14401\t舦\n14402\t臯\n14403\t鋠\n14404\t銿\n14405\t腣\n14406\t鉽\n14407\t胿\n14408\t肰\n14409\t獁\n14410\t猈\n14411\t醱\n14412\t醀\n14413\t罻\n14414\t鄔\n14415\t郬\n14416\t繵\n14417\t耊\n14418\t鍂\n14419\t舧\n14420\t臰\n14421\t鋡\n14422\t鋀\n14423\t腤\n14424\t鉾\n14425\t鉝\n14426\t脀\n14427\t肳\n14428\t獂\n14429\t猉\n14430\t醲\n14431\t罼\n14432\t鄕\n14433\t郮\n14434\t纗\n14435\t繶\n14436\t釾\n14437\t聏\n14438\t耎\n14439\t鍃\n14440\t舩\n14441\t臱\n14442\t鋢\n14443\t膞\n14444\t鉿\n14445\t脁\n14446\t肵\n14447\t獃\n14448\t猋\n14449\t醳\n14450\t羪\n14451\t罽\n14452\t郰\n14453\t纘\n14454\t噍\n14455\t鈟\n14456\t釿\n14457\t聐\n14458\t耏\n14459\t鍄\n14460\t錣\n14461\t舮\n14462\t臲\n14463\t鋣\n14464\t鋂\n14465\t膟\n14466\t腨\n14467\t鉟\n14468\t脃\n14469\t肶\n14470\t猌\n14471\t羫\n14472\t罿\n14473\t鄗\n14474\t郱\n14475\t纙\n14476\t聑\n14477\t耑\n14478\t鍅\n14479\t錤\n14480\t臵\n14481\t鋃\n14482\t銁\n14483\t鉠\n14484\t翣\n14485\t酄\n14486\t罓\n14487\t鍫\n14488\t錉\n14489\t臓\n14490\t銧\n14491\t脿\n14492\t碽\n14493\t╞\n14494\t癰\n14495\t礲\n14496
\t痓\n14497\t縝\n14498\t穊\n14499\t竍\n14500\t玝\n14501\t籦\n14502\t禸\n14503\t゜\n14504\t乥\n14505\t僢\n14506\t刡\n14507\t哹\n14508\t嘼\n14509\t坆\n14510\t塨\n14511\t奲\n14512\t宐\n14513\t峛\n14514\t巄\n14515\t廱\n14516\t恇\n14517\t慴\n14518\t抌\n14519\t攂\n14520\t昩\n14521\t朾\n14522\t梑\n14523\t榖\n14524\t檅\n14525\t歜\n14526\t漛\n14527\t瀊\n14528\t焍\n14529\t碆\n14530\t˙\n14531\t譈\n14532\t礏\n14533\t瑽\n14534\t稡\n14535\t窧\n14536\t禕\n14537\t〣\n14538\t瓸\n14539\t盉\n14540\t侭\n14541\t傿\n14542\t凚\n14543\t匓\n14544\t咮\n14545\t嘊\n14546\t圔\n14547\t塀\n14548\t夿\n14549\t婤\n14550\t孊\n14551\t嶣\n14552\t怋\n14553\t払\n14554\t揃\n14555\t擝\n14556\t旴\n14557\t朆\n14558\t桞\n14559\t楤\n14560\t橞\n14561\t欱\n14562\t汢\n14563\t烞\n14564\t碿\n14565\t╟\n14566\t鱟\n14567\t譪\n14568\t琧\n14569\t痗\n14570\t璫\n14571\t穋\n14572\t竎\n14573\t籧\n14574\t禼\n14575\tヽ\n14576\t甤\n14577\t眂\n14578\t乧\n14579\t俢\n14580\t僣\n14581\t刢\n14582\t卌\n14583\t哻\n14584\t嘽\n14585\t坈\n14586\t奵\n14587\t媍\n14588\t宑\n14589\t峜\n14590\t巆\n14591\t廲\n14592\t恈\n14593\t抍\n14594\t揷\n14595\t攃\n14596\t朿\n14597\t梒\n14598\t榗\n14599\t檆\n14600\t沜\n14601\t漜\n14602\t焎\n14603\t碈\n14604\t珻\n14605\t癈\n14606\t疌\n14607\t緾\n14608\t稢\n14609\t禖\n14610\t〤\n14611\t盋\n14612\t丆\n14613\t侰\n14614\t匔\n14615\t咰\n14616\t嘋\n14617\t圕\n14618\t塁\n14619\t婥\n14620\t孋\n14621\t岰\n14622\t嶤\n14623\t怌\n14624\t慍\n14625\t扖\n14626\t揅\n14627\t朇\n14628\t桟\n14629\t楥\n14630\t欳\n14631\t汣\n14632\t淐\n14633\t烠\n14634\tㄧ\n14635\t︾\n14636\t＄\n14637\t╃\n14638\t㈢\n14639\t傜\n14640\t勭\n14641\t呯\n14642\t嗙\n14643\t囩\n14644\t堢\n14645\t夌\n14646\t婄\n14647\t嬬\n14648\t庣\n14649\t忕\n14650\t愮\n14651\t戠\n14652\t掔\n14653\t撶\n14654\t楃\n14655\t樼\n14656\t欑\n14657\t氱\n14658\t涚\n14659\t濈\n14660\t炵\n14661\t鏯\n14662\t鐍\n14663\t鐋\n14664\t绨\n14665\t鏮\n14666\t鏪\n14667\t鐊\n14668\t鐌\n14669\t缍\n14670\t绠\n14671\t鏫\n14672\t鐎\n14673\t鐏\n14674\t鐑\n14675\t鐒\n14676\t鐔\n14677\t鐕\n14678\t鐖\n14679\t鐗\n14680\t鐙\n14681\t鐚\n14682\t鐛\n14683\t鐝\n14684\t鐞\n14685\t鐟\n14686\t鐠\n14687\t鐡\n14688\t鐣\n14689\t鐤\n14690\t鐥\n14691\t鐦\n14692\t鐧\n14693\t鐨\n14694\t鐩\n14695\t鐪\n14696
\t鐬\n14697\t鐭\n14698\tㄇ\n14699\tη\n14700\t∏\n14701\t┣\n14702\t⑶\n14703\tＧ\n14704\t伹\n14705\t偳\n14706\t兦\n14707\t勄\n14708\t埱\n14709\t壡\n14710\t娗\n14711\t嬊\n14712\t屒\n14713\t嵡\n14714\t幥\n14715\t徢\n14716\t惽\n14717\t捛\n14718\t撉\n14719\t椙\n14720\t櫱\n14721\t浨\n14722\t澢\n14723\t炃\n14724\t熐\n14725\t莁\n14726\t荿\n14727\t莮\n14728\t莬\n14729\t荹\n14730\t莭\n14731\t茾\n14732\t荺\n14733\t莯\n14734\t莵\n14735\t莻\n14736\t莾\n14737\t莿\n14738\t菃\n14739\t菄\n14740\t菆\n14741\t菎\n14742\t菐\n14743\t菑\n14744\t菒\n14745\t菕\n14746\t菗\n14747\t菙\n14748\t菚\n14749\t菛\n14750\t菞\n14751\t菢\n14752\t菣\n14753\t菤\n14754\t菦\n14755\t菧\n14756\t菨\n14757\t菬\n14758\t鏰\n14759\t莂\n14760\t茿\n14761\t绐\n14762\t缋\n14763\t缏\n14764\t鏱\n14765\t鏐\n14766\t莃\n14767\t荁\n14768\t鏑\n14769\t莄\n14770\t荂\n14771\t鏳\n14772\t鏒\n14773\t莇\n14774\t鏴\n14775\t莈\n14776\t荅\n14777\t鏵\n14778\t荈\n14779\t鏕\n14780\t莋\n14781\t鏷\n14782\t莌\n14783\t荋\n14784\t鏸\n14785\t莍\n14786\t鏹\n14787\t鏙\n14788\t莏\n14789\t荍\n14790\t鏺\n14791\t鏚\n14792\t荎\n14793\t鏛\n14794\t荓\n14795\t鏼\n14796\t荕\n14797\t鏝\n14798\t莕\n14799\t鏾\n14800\t缲\n14801\t莗\n14802\t鐀\n14803\t鏠\n14804\t鐁\n14805\t莚\n14806\t荝\n14807\t莝\n14808\t荢\n14809\t鐃\n14810\t鏣\n14811\t莟\n14812\t荰\n14813\t鐄\n14814\t莡\n14815\t荱\n14816\t鐅\n14817\t荲\n14818\t鐆\n14819\t鏦\n14820\t莣\n14821\t鏧\n14822\t莤\n14823\t荴\n14824\t鏨\n14825\t莥\n14826\t荵\n14827\t巛\n14828\t鐉\n14829\t鏩\n14830\t荶\n14831\t菭\n14832\t磀\n14833\t╠\n14834\t鱠\n14835\t琩\n14836\t璬\n14837\t縟\n14838\t竏\n14839\tヾ\n14840\t眃\n14841\t乨\n14842\t僤\n14843\t刣\n14844\t嘾\n14845\t塪\n14846\t媎\n14847\t宒\n14848\t峝\n14849\t巇\n14850\t恉\n14851\t慸\n14852\t抎\n14853\t攄\n14854\t杁\n14855\t榙\n14856\t歞\n14857\t渄\n14858\t漝\n14859\t瀌\n14860\t焏\n14861\t―\n14862\t譊\n14863\t珼\n14864\t癉\n14865\t礑\n14866\t璂\n14867\t緿\n14868\t稤\n14869\t獶\n14870\t禗\n14871\t〥\n14872\t盌\n14873\t侱\n14874\t僁\n14875\t凞\n14876\t匘\n14877\t奃\n14878\t孌\n14879\t岲\n14880\t嶥\n14881\t怐\n14882\t慏\n14883\t扗\n14884\t揇\n14885\t朌\n14886\t桪\n14887\t楧\n14888\t橠\n14889\t欴\n14890\t汥\n14891\t滵\n14892\t濪\n14893\t烡\n14894\t︷\n14895\t○\n14896
\tю\n14897\tゐ\n14898\tヰ\n14899\t侌\n14900\t傪\n14901\t凁\n14902\t勷\n14903\t咅\n14904\t嗮\n14905\t囸\n14906\t堭\n14907\t夝\n14908\t岎\n14909\t嶐\n14910\t庰\n14911\t忦\n14912\t掟\n14913\t擆\n14914\t旔\n14915\t曫\n14916\t栶\n14917\t楌\n14918\t橉\n14919\t欚\n14920\t濔\n14921\t烉\n14922\t餪\n14923\t饉\n14924\t鹱\n14925\t餧\n14926\t饆\n14927\t餈\n14928\t餦\n14929\t饊\n14930\t饌\n14931\t饍\n14932\t饎\n14933\t饏\n14934\t饐\n14935\t饘\n14936\t饙\n14937\t饚\n14938\t饜\n14939\t饟\n14940\t饠\n14941\t饡\n14942\t饢\n14943\t饦\n14944\t饳\n14945\t饻\n14946\t馂\n14947\tㄐ\n14948\t⌒\n14949\t┬\n14950\t⑿\n14951\tＰ\n14952\t冃\n14953\t勑\n14954\t呅\n14955\t埿\n14956\t壭\n14957\t娦\n14958\t嬓\n14959\t屝\n14960\t嵭\n14961\t幮\n14962\t徯\n14963\t愋\n14964\t捫\n14965\t撔\n14966\t斝\n14967\t曅\n14968\t栃\n14969\t櫺\n14970\t毿\n14971\t浶\n14972\t溞\n14973\t澬\n14974\t炐\n14975\t熜\n14976\t衊\n14977\t衈\n14978\t衺\n14979\t衉\n14980\t衃\n14981\t衇\n14982\t衶\n14983\t蠤\n14984\t衻\n14985\t衼\n14986\t袀\n14987\t袃\n14988\t袇\n14989\t袉\n14990\t袊\n14991\t袌\n14992\t袎\n14993\t袏\n14994\t袐\n14995\t袑\n14996\t袓\n14997\t袔\n14998\t袕\n14999\t袗\n15000\t袘\n15001\t袙\n15002\t袚\n15003\t袛\n15004\t袝\n15005\t袞\n15006\t袟\n15007\t袠\n15008\t袡\n15009\t袥\n15010\t袦\n15011\t袧\n15012\t袨\n15013\t袩\n15014\t餉\n15015\t衋\n15016\t蠥\n15017\t餬\n15018\t蠦\n15019\t餋\n15020\t衏\n15021\t衐\n15022\t蠨\n15023\t餎\n15024\t衑\n15025\t餱\n15026\t蠪\n15027\t餲\n15028\t餑\n15029\t蠫\n15030\t餳\n15031\t衕\n15032\t衖\n15033\t蠭\n15034\t餵\n15035\t餔\n15036\t衘\n15037\t蠮\n15038\t餶\n15039\t餕\n15040\t衚\n15041\t蠯\n15042\t餷\n15043\t餖\n15044\t蠰\n15045\t餗\n15046\t衜\n15047\t餹\n15048\t蠳\n15049\t瘃\n15050\t餺\n15051\t餙\n15052\t衞\n15053\t餻\n15054\t衟\n15055\t蠵\n15056\t衠\n15057\t餽\n15058\t餜\n15059\t衦\n15060\t衧\n15061\t蠸\n15062\t餿\n15063\t衪\n15064\t蠺\n15065\t饀\n15066\t餟\n15067\t衭\n15068\t疒\n15069\t瘗\n15070\t瘥\n15071\t饁\n15072\t餠\n15073\t衯\n15074\t蠽\n15075\t衱\n15076\t蠾\n15077\t餢\n15078\t衳\n15079\t蠿\n15080\t衴\n15081\t衁\n15082\t餤\n15083\t衵\n15084\t衂\n15085\t馉\n15086\t袪\n15087\t╡\n15088\t鱡\n15089\t琫\n15090\t癳\n15091\t礶\n15092\t痚\n15093\t璭\n15094\t竐\n15095\t玡\n15096
\t籩\n15097\t秂\n15098\t〆\n15099\t甧\n15100\t眅\n15101\t俥\n15102\t卐\n15103\t唀\n15104\t噀\n15105\t坋\n15106\t塭\n15107\t奺\n15108\t媏\n15109\t宔\n15110\t峞\n15111\t巈\n15112\t廵\n15113\t恊\n15114\t慹\n15115\t抏\n15116\t揺\n15117\t攅\n15118\t昬\n15119\t梕\n15120\t榚\n15121\t檈\n15122\t歟\n15123\t沞\n15124\t渆\n15125\t焑\n15126\t碋\n15127\t‥\n15128\t鱁\n15129\t癊\n15130\t疎\n15131\t璄\n15132\t縀\n15133\t稥\n15134\t窫\n15135\t獷\n15136\t籈\n15137\t禘\n15138\t瓻\n15139\t盓\n15140\t丒\n15141\t侲\n15142\t凟\n15143\t匛\n15144\t咵\n15145\t嘐\n15146\t圗\n15147\t塃\n15148\t奅\n15149\t婨\n15150\t孍\n15151\t岴\n15152\t嶦\n15153\t廍\n15154\t怑\n15155\t慐\n15156\t扙\n15157\t揈\n15158\t擡\n15159\t旹\n15160\t朎\n15161\t桬\n15162\t欵\n15163\t汦\n15164\t淓\n15165\t滶\n15166\t烢\n15167\tㄩ\n15168\t﹂\n15169\t￠\n15170\tч\n15171\t╅\n15172\t㈤\n15173\t侀\n15174\t傞\n15175\t勯\n15176\t呴\n15177\t囬\n15178\t堥\n15179\t婇\n15180\t嬮\n15181\t岄\n15182\t嶉\n15183\t庨\n15184\t愰\n15185\t掗\n15186\t曢\n15187\t栭\n15188\t楅\n15189\t橀\n15190\t欓\n15191\t濋\n15192\t熼\n15193\t榇\n15194\t閌\n15195\t閊\n15196\t閇\n15197\t閧\n15198\t椐\n15199\t锧\n15200\t閈\n15201\t閫\n15202\t閮\n15203\t閯\n15204\t閰\n15205\t閳\n15206\t閴\n15207\t閵\n15208\t閷\n15209\t閸\n15210\t閺\n15211\t閼\n15212\t閽\n15213\t閾\n15214\t閿\n15215\t闁\n15216\t闂\n15217\t闃\n15218\t闄\n15219\t闅\n15220\t闈\n15221\t闉\n15222\tㄉ\n15223\tι\n15224\t∩\n15225\t┥\n15226\t⑸\n15227\tＩ\n15228\t伾\n15229\t勆\n15230\t吷\n15231\t喩\n15232\t嚿\n15233\t埳\n15234\t壣\n15235\t屔\n15236\t嵣\n15237\t徤\n15238\t惿\n15239\t懮\n15240\t捝\n15241\t撋\n15242\t斏\n15243\t暽\n15244\t柹\n15245\t椛\n15246\t櫳\n15247\t毶\n15248\t浬\n15249\t溕\n15250\t蒦\n15251\t蓗\n15252\t蓔\n15253\t蒧\n15254\t蒣\n15255\t蒥\n15256\t蓒\n15257\t葽\n15258\t蒤\n15259\t蓘\n15260\t蓙\n15261\t蓚\n15262\t蓛\n15263\t蓜\n15264\t蓞\n15265\t蓡\n15266\t蓤\n15267\t蓧\n15268\t蓨\n15269\t蓫\n15270\t蓭\n15271\t蓱\n15272\t蓲\n15273\t蓳\n15274\t蓴\n15275\t蓵\n15276\t蓶\n15277\t蓷\n15278\t蓸\n15279\t蓹\n15280\t蓺\n15281\t蓻\n15282\t蓽\n15283\t蓾\n15284\t蔀\n15285\t蔁\n15286\tㄨ\n15287\t﹁\n15288\t¤\n15289\tц\n15290\t╄\n15291\tヨ\n15292\t佽\n15293\t傝\n15294\t冭\n15295\t勮\n15296
\t呰\n15297\t堣\n15298\t夎\n15299\t婅\n15300\t嬭\n15301\t岃\n15302\t嶈\n15303\t忚\n15304\t愯\n15305\t戣\n15306\t掕\n15307\t撹\n15308\t旇\n15309\t曡\n15310\t栬\n15311\t楄\n15312\t樿\n15313\t涜\n15314\t濊\n15315\t炶\n15316\t熻\n15317\t鑐\n15318\t鑯\n15319\t鑭\n15320\t鑏\n15321\t杩\n15322\t璺\n15323\t鑋\n15324\t鑍\n15325\t鑮\n15326\t鑌\n15327\t鑱\n15328\t鑳\n15329\t鑴\n15330\t鑶\n15331\t鑸\n15332\t鑹\n15333\t鑺\n15334\t鑻\n15335\t鑿\n15336\t钀\n15337\t钁\n15338\t钂\n15339\t钃\n15340\t钄\n15341\t钖\n15342\t钘\n15343\t铇\n15344\t铓\n15345\t铔\n15346\t铦\n15347\tㄈ\n15348\tθ\n15349\t∪\n15350\t┤\n15351\t⑷\n15352\tＨ\n15353\t伻\n15354\t勅\n15355\t喨\n15356\t嚾\n15357\t埲\n15358\t娙\n15359\t屓\n15360\t幦\n15361\t徣\n15362\t惾\n15363\t懭\n15364\t捜\n15365\t暼\n15366\t柸\n15367\t椚\n15368\t樔\n15369\t櫲\n15370\t浫\n15371\t溔\n15372\t澣\n15373\t炄\n15374\t熑\n15375\t萡\n15376\t萟\n15377\t萠\n15378\t萞\n15379\t葅\n15380\t葈\n15381\t菮\n15382\t葊\n15383\t葋\n15384\t葌\n15385\t葍\n15386\t葏\n15387\t葐\n15388\t葒\n15389\t葓\n15390\t葔\n15391\t葕\n15392\t葘\n15393\t葝\n15394\t葟\n15395\t葠\n15396\t葢\n15397\t葤\n15398\t葥\n15399\t葧\n15400\t葨\n15401\t葪\n15402\t葮\n15403\t葰\n15404\t葲\n15405\t葴\n15406\t葹\n15407\t葻\n15408\t﹃\n15409\t￡\n15410\tш\n15411\t╆\n15412\t㈥\n15413\t侁\n15414\t勱\n15415\t呹\n15416\t嗞\n15417\t囮\n15418\t堦\n15419\t岅\n15420\t嶊\n15421\t庩\n15422\t愱\n15423\t戧\n15424\t撽\n15425\t旉\n15426\t曣\n15427\t栮\n15428\t楆\n15429\t橁\n15430\t欔\n15431\t濌\n15432\t炾\n15433\t阘\n15434\t阇\n15435\t陗\n15436\t陓\n15437\t阓\n15438\t攴\n15439\t闧\n15440\t陒\n15441\t陖\n15442\t甓\n15443\t戤\n15444\t闬\n15445\t陙\n15446\t陚\n15447\t陜\n15448\t陠\n15449\t陥\n15450\t陦\n15451\t陫\n15452\t陭\n15453\t陮\n15454\t陱\n15455\t陹\n15456\t陻\n15457\t陼\n15458\t陾\n15459\t陿\n15460\t隀\n15461\t隁\n15462\t隂\n15463\t隃\n15464\t隄\n15465\t隇\n15466\t隉\n15467\tㄊ\n15468\tκ\n15469\t∈\n15470\t┦\n15471\t⑹\n15472\tＪ\n15473\t伿\n15474\t偸\n15475\t兪\n15476\t勈\n15477\t吺\n15478\t囀\n15479\t埵\n15480\t嬍\n15481\t屖\n15482\t嵤\n15483\t幨\n15484\t徥\n15485\t愂\n15486\t懯\n15487\t捠\n15488\t斒\n15489\t暿\n15490\t樖\n15491\t櫴\n15492\t毷\n15493\t炇\n15494\t蔨\n15495\t蕕\n15496
\t蕓\n15497\t蔩\n15498\t蔧\n15499\t蕒\n15500\t蕔\n15501\t蔦\n15502\t蕘\n15503\t蕚\n15504\t蕛\n15505\t蕜\n15506\t蕝\n15507\t蕟\n15508\t蕠\n15509\t蕡\n15510\t蕢\n15511\t蕥\n15512\t蕦\n15513\t蕧\n15514\t蕬\n15515\t蕮\n15516\t蕯\n15517\t蕰\n15518\t蕱\n15519\t蕳\n15520\t蕷\n15521\t蕸\n15522\t蕼\n15523\t蕽\n15524\t蕿\n15525\t薀\n15526\t﹄\n15527\t‰\n15528\tщ\n15529\t╇\n15530\t㈦\n15531\t傠\n15532\t冸\n15533\t勲\n15534\t呺\n15535\t嗠\n15536\t堧\n15537\t夒\n15538\t婋\n15539\t岆\n15540\t庪\n15541\t愲\n15542\t戨\n15543\t旊\n15544\t曤\n15545\t栯\n15546\t楇\n15547\t橂\n15548\t滊\n15549\t濍\n15550\t炿\n15551\t雦\n15552\t隷\n15553\t氕\n15554\t搿\n15555\t肟\n15556\t敫\n15557\t雥\n15558\t雧\n15559\t攵\n15560\t隌\n15561\t毪\n15562\t隲\n15563\t雬\n15564\t雭\n15565\t雮\n15566\t雰\n15567\t雴\n15568\t雵\n15569\t雸\n15570\t雺\n15571\t雼\n15572\t雽\n15573\t雿\n15574\t霃\n15575\t霅\n15576\t霊\n15577\t霋\n15578\t霌\n15579\t霐\n15580\t霒\n15581\t霔\n15582\t霗\n15583\t霚\n15584\t霛\n15585\t霝\n15586\t霟\n15587\tㄋ\n15588\tλ\n15589\t∷\n15590\t┧\n15591\t⑺\n15592\tＫ\n15593\t佀\n15594\t偹\n15595\t兯\n15596\t勊\n15597\t囁\n15598\t埶\n15599\t壦\n15600\t娝\n15601\t嬎\n15602\t屗\n15603\t嵥\n15604\t幩\n15605\t徦\n15606\t愃\n15607\t懰\n15608\t捤\n15609\t撍\n15610\t斔\n15611\t曀\n15612\t椝\n15613\t櫵\n15614\t浰\n15615\t溗\n15616\t澦\n15617\t炈\n15618\t熕\n15619\t薫\n15620\t薧\n15621\t藒\n15622\t藎\n15623\t薣\n15624\t藑\n15625\t薂\n15626\t薥\n15627\t藔\n15628\t藗\n15629\t藙\n15630\t藚\n15631\t藞\n15632\t藡\n15633\t藢\n15634\t藣\n15635\t藧\n15636\t藪\n15637\t藫\n15638\t藭\n15639\t藮\n15640\t藯\n15641\t藰\n15642\t藱\n15643\t藲\n15644\t藳\n15645\t藵\n15646\t藶\n15647\t藷\n15648\t蒩\n15649\t葾\n15650\t榱\n15651\t萢\n15652\t阛\n15653\t闍\n15654\t蔄\n15655\t赆\n15656\t赇\n15657\t隺\n15658\t薃\n15659\t脶\n15660\t肜\n15661\t脞\n15662\t锽\n15663\t蒪\n15664\t葿\n15665\t萣\n15666\t阞\n15667\t蔮\n15668\t蔅\n15669\t隑\n15670\t薭\n15671\t薆\n15672\t蒫\n15673\t蒀\n15674\t鑓\n15675\t阠\n15676\t蔯\n15677\t隿\n15678\t隒\n15679\t薱\n15680\t閐\n15681\t镈\n15682\t蒬\n15683\t蒁\n15684\t鑔\n15685\t萪\n15686\t桊\n15687\t阣\n15688\t闐\n15689\t蔰\n15690\t蔇\n15691\t牮\n15692\t雂\n15693\t隓\n15694\t薲\n15695\t薉\n15696
\t镋\n15697\t蒭\n15698\t蒃\n15699\t萫\n15700\t阤\n15701\t闑\n15702\t蔱\n15703\t蔈\n15704\t雃\n15705\t薳\n15706\t肷\n15707\t腙\n15708\t胨\n15709\t蒮\n15710\t鑖\n15711\t菷\n15712\t阥\n15713\t闒\n15714\t蔲\n15715\t蔉\n15716\t雈\n15717\t隖\n15718\t薴\n15719\t薋\n15720\t蒅\n15721\t鑗\n15722\t鐶\n15723\t萭\n15724\t菺\n15725\t阦\n15726\t闓\n15727\t蔳\n15728\t蔊\n15729\t薵\n15730\t镠\n15731\t蒱\n15732\t蒆\n15733\t鐷\n15734\t萮\n15735\t菻\n15736\t阧\n15737\t闔\n15738\t蔋\n15739\t薶\n15740\t镮\n15741\t蒳\n15742\t蒊\n15743\t殪\n15744\t萯\n15745\t阨\n15746\t蔍\n15747\t晡\n15748\t雐\n15749\t隝\n15750\t胩\n15751\t胂\n15752\t閖\n15753\t蒵\n15754\t蒍\n15755\t鑚\n15756\t鐹\n15757\t萰\n15758\t菾\n15759\t阩\n15760\t蔶\n15761\t蔎\n15762\t隞\n15763\t薐\n15764\t閗\n15765\t镵\n15766\t蒶\n15767\t蒏\n15768\t鑛\n15769\t萲\n15770\t菿\n15771\t阫\n15772\t闗\n15773\t蔾\n15774\t蔏\n15775\t隟\n15776\t薻\n15777\t蒐\n15778\t鑜\n15779\t鐻\n15780\t萳\n15781\t萀\n15782\t阬\n15783\t蔿\n15784\t雔\n15785\t薼\n15786\t薒\n15787\t閙\n15788\t镸\n15789\t蒑\n15790\t鐼\n15791\t萴\n15792\t阭\n15793\t蕀\n15794\t蔒\n15795\t隡\n15796\t薽\n15797\t薓\n15798\t镹\n15799\t蒒\n15800\t鑞\n15801\t鐽\n15802\t萅\n15803\t阯\n15804\t闚\n15805\t隢\n15806\t薾\n15807\t镺\n15808\t蒓\n15809\t轷\n15810\t鐿\n15811\t萶\n15812\t萇\n15813\t柙\n15814\t阰\n15815\t蔕\n15816\t牿\n15817\t犋\n15818\t雘\n15819\t隣\n15820\t薿\n15821\t薕\n15822\t閜\n15823\t镻\n15824\t蒔\n15825\t鑠\n15826\t鑀\n15827\t萷\n15828\t阷\n15829\t蕄\n15830\t蔖\n15831\t隤\n15832\t藀\n15833\t閝\n15834\t镼\n15835\t鑡\n15836\t鑁\n15837\t萹\n15838\t萉\n15839\t阸\n15840\t蕅\n15841\t蔘\n15842\t雚\n15843\t隥\n15844\t藂\n15845\t薗\n15846\t閞\n15847\t蓃\n15848\t蒖\n15849\t鑢\n15850\t萺\n15851\t阹\n15852\t闞\n15853\t蕆\n15854\t蔙\n15855\t隦\n15856\t藃\n15857\t薘\n15858\t镾\n15859\t蓅\n15860\t蒘\n15861\t鑃\n15862\t萻\n15863\t萐\n15864\t阺\n15865\t闟\n15866\t蕇\n15867\t蔛\n15868\t藄\n15869\t赀\n15870\t閠\n15871\t蒚\n15872\t鑤\n15873\t萾\n15874\t萒\n15875\t阾\n15876\t闠\n15877\t蕋\n15878\t蔜\n15879\t隩\n15880\t藅\n15881\t薚\n15882\t閁\n15883\t蒛\n15884\t辁\n15885\t鑅\n15886\t萿\n15887\t萓\n15888\t棼\n15889\t椟\n15890\t陁\n15891\t蔝\n15892\t贳\n15893\t藆\n15894\t薝\n15895\t膪\n15896
\t脎\n15897\t胲\n15898\t閂\n15899\t蓈\n15900\t蒝\n15901\t葀\n15902\t萔\n15903\t陃\n15904\t蕍\n15905\t雟\n15906\t藇\n15907\t薞\n15908\t蒞\n15909\t葁\n15910\t萕\n15911\t陊\n15912\t蔠\n15913\t雡\n15914\t隬\n15915\t藈\n15916\t薟\n15917\t閤\n15918\t閄\n15919\t蓌\n15920\t鑨\n15921\t鑈\n15922\t葂\n15923\t萖\n15924\t陎\n15925\t闤\n15926\t蕏\n15927\t蔢\n15928\t隭\n15929\t藊\n15930\t閅\n15931\t蒠\n15932\t鑩\n15933\t鑉\n15934\t葃\n15935\t萗\n15936\t陏\n15937\t闥\n15938\t蕐\n15939\t隮\n15940\t藋\n15941\t薡\n15942\t閦\n15943\t蓏\n15944\t蒢\n15945\t鑪\n15946\t鑊\n15947\t葄\n15948\t萙\n15949\t陑\n15950\t闦\n15951\t蕑\n15952\t蔤\n15953\t雤\n15954\t藌\n15955\t薢\n15956\t柁\n15957\t锠\n15958\t葼\n15959\t霠\n15960\t藸\n15961\t磃\n15962\t╢\n15963\t鱢\n15964\t譮\n15965\t癴\n15966\t礷\n15967\t痜\n15968\t璮\n15969\t縡\n15970\t玣\n15971\t籪\n15972\t秄\n15973\tゝ\n15974\t眆\n15975\t乫\n15976\t俧\n15977\t僨\n15978\t刦\n15979\t唂\n15980\t坒\n15981\t塮\n15982\t媐\n15983\t宖\n15984\t峟\n15985\t廸\n15986\t恌\n15987\t慺\n15988\t抐\n15989\t揻\n15990\t攆\n15991\t杅\n15992\t梖\n15993\t榝\n15994\t檉\n15995\t歠\n15996\t沠\n15997\t漟\n15998\t瀎\n15999\t焒\n16000\t‵\n16001\t譌\n16002\t癋\n16003\t礔\n16004\t疐\n16005\t璅\n16006\t縁\n16007\t稦\n16008\t籉\n16009\t侳\n16010\t凢\n16011\t匜\n16012\t咶\n16013\t嘑\n16014\t塅\n16015\t奆\n16016\t婩\n16017\t孎\n16018\t岶\n16019\t廎\n16020\t怓\n16021\t慒\n16022\t扚\n16023\t揊\n16024\t擣\n16025\t楩\n16026\t橣\n16027\t欶\n16028\t淔\n16029\t烣\n16030\t╣\n16031\t鱣\n16032\t癵\n16033\t礸\n16034\t痝\n16035\t穏\n16036\t竒\n16037\t玤\n16038\t籫\n16039\t秅\n16040\tゞ\n16041\t甮\n16042\t乬\n16043\t僩\n16044\t刧\n16045\t唃\n16046\t噂\n16047\t坓\n16048\t塯\n16049\t奼\n16050\t媑\n16051\t廹\n16052\t恎\n16053\t慻\n16054\t抔\n16055\t揼\n16056\t杇\n16057\t梘\n16058\t榞\n16059\t檊\n16060\t漡\n16061\t焔\n16062\t碐\n16063\t℅\n16064\t鱃\n16065\t譍\n16066\t珿\n16067\t癎\n16068\t礕\n16069\t疓\n16070\t縂\n16071\t稧\n16072\t獹\n16073\t籊\n16074\t盙\n16075\t侴\n16076\t僄\n16077\t凣\n16078\t匞\n16079\t嘒\n16080\t塆\n16081\t奊\n16082\t婫\n16083\t孏\n16084\t岹\n16085\t嶨\n16086\t廏\n16087\t怗\n16088\t慓\n16089\t扜\n16090\t揋\n16091\t擥\n16092\t桮\n16093\t橤\n16094\t汫\n16095\t淕\n16096
\t濭\n16097\t烥\n16098\t磆\n16099\t╤\n16100\t鱤\n16101\t琱\n16102\t癶\n16103\t礹\n16104\t痟\n16105\t穐\n16106\t竓\n16107\t秇\n16108\t﹉\n16109\t県\n16110\t乭\n16111\t俬\n16112\t僪\n16113\t卙\n16114\t噃\n16115\t坔\n16116\t塰\n16117\t宧\n16118\t峢\n16119\t巋\n16120\t廻\n16121\t恏\n16122\t慼\n16123\t抙\n16124\t揾\n16125\t攈\n16126\t昲\n16127\t杊\n16128\t梙\n16129\t檋\n16130\t歨\n16131\t瀐\n16132\t碒\n16133\t℉\n16134\t鱄\n16135\t譎\n16136\t縃\n16137\t稨\n16138\t窰\n16139\t籋\n16140\t禜\n16141\t瓾\n16142\t盚\n16143\t丠\n16144\t凥\n16145\t匟\n16146\t咹\n16147\t嘓\n16148\t圚\n16149\t塇\n16150\t孒\n16151\t廐\n16152\t怘\n16153\t慔\n16154\t扝\n16155\t揌\n16156\t擧\n16157\t旽\n16158\t朒\n16159\t楬\n16160\t橦\n16161\t欻\n16162\t汬\n16163\t淗\n16164\t滺\n16165\t磇\n16166\t鱥\n16167\t譱\n16168\t癷\n16169\t縤\n16170\t竔\n16171\t籭\n16172\t秈\n16173\t﹊\n16174\t甶\n16175\t眎\n16176\t乮\n16177\t俰\n16178\t僫\n16179\t刬\n16180\t卛\n16181\t噄\n16182\t坕\n16183\t奿\n16184\t媔\n16185\t宨\n16186\t巌\n16187\t廼\n16188\t恑\n16189\t慽\n16190\t搃\n16191\t杋\n16192\t梚\n16193\t榠\n16194\t檌\n16195\t瀒\n16196\t焛\n16197\t碔\n16198\t↖\n16199\t鱅\n16200\t譏\n16201\t琁\n16202\t癐\n16203\t礗\n16204\t疘\n16205\t縄\n16206\t窱\n16207\t禝\n16208\t㊣\n16209\t甀\n16210\t侷\n16211\t僆\n16212\t処\n16213\t匢\n16214\t咺\n16215\t圛\n16216\t塈\n16217\t奍\n16218\t岻\n16219\t嶪\n16220\t廔\n16221\t怚\n16222\t扞\n16223\t揑\n16224\t旾\n16225\t桰\n16226\t橧\n16227\t欼\n16228\t滻\n16229\t烮\n16230\t№\n16231\t╉\n16232\t㈨\n16233\t冺\n16234\t勴\n16235\t呿\n16236\t堩\n16237\t夗\n16238\t嬳\n16239\t岉\n16240\t嶍\n16241\t庬\n16242\t忢\n16243\t戫\n16244\t掜\n16245\t旐\n16246\t曧\n16247\t楉\n16248\t橅\n16249\t欗\n16250\t氻\n16251\t涰\n16252\t滍\n16253\t濏\n16254\t烅\n16255\t燀\n16256\t泶\n16257\t韣\n16258\t憝\n16259\t慝\n16260\t砘\n16261\t韂\n16262\t韠\n16263\t韢\n16264\t鞞\n16265\t恧\n16266\t韁\n16267\t肀\n16268\t韤\n16269\t韥\n16270\t韨\n16271\t韯\n16272\t韰\n16273\t韱\n16274\t韲\n16275\t韷\n16276\t韸\n16277\t韹\n16278\t韺\n16279\t韽\n16280\t頀\n16281\t頄\n16282\t頇\n16283\t頉\n16284\t頋\n16285\t頍\n16286\tㄍ\n16287\tν\n16288\t⊥\n16289\t┩\n16290\t⑼\n16291\tＭ\n16292\t佂\n16293\t偼\n16294\t兺\n16295\t喭\n16296
\t壨\n16297\t屚\n16298\t嵧\n16299\t愅\n16300\t捦\n16301\t撏\n16302\t斖\n16303\t曂\n16304\t柾\n16305\t椡\n16306\t櫷\n16307\t毻\n16308\t溚\n16309\t澩\n16310\t炌\n16311\t熗\n16312\t蚟\n16313\t蛜\n16314\t蛗\n16315\t蚠\n16316\t蚚\n16317\t蚞\n16318\t蛖\n16319\t蛚\n16320\t虭\n16321\t蚛\n16322\t蛡\n16323\t蛢\n16324\t蛣\n16325\t蛥\n16326\t蛦\n16327\t蛧\n16328\t蛨\n16329\t蛪\n16330\t蛫\n16331\t蛬\n16332\t蛶\n16333\t蛷\n16334\t蛼\n16335\t蛽\n16336\t蛿\n16337\t蜁\n16338\t蜄\n16339\t蜅\n16340\t蜋\n16341\t蜌\n16342\t蜎\n16343\t蜏\n16344\t蜑\n16345\t蜔\n16346\t§\n16347\tъ\n16348\t╈\n16349\t㈧\n16350\t侅\n16351\t傡\n16352\t冹\n16353\t呾\n16354\t嗢\n16355\t囲\n16356\t堨\n16357\t夓\n16358\t婌\n16359\t嬱\n16360\t岇\n16361\t忟\n16362\t愳\n16363\t旍\n16364\t曥\n16365\t栰\n16366\t楈\n16367\t橃\n16368\t氺\n16369\t涭\n16370\t濎\n16371\t烄\n16372\t熿\n16373\t靱\n16374\t靯\n16375\t靇\n16376\t旆\n16377\t靃\n16378\t靅\n16379\t靮\n16380\t靲\n16381\t靵\n16382\t靷\n16383\t靸\n16384\t靹\n16385\t靻\n16386\t靽\n16387\t靾\n16388\t靿\n16389\t鞀\n16390\t鞁\n16391\t鞃\n16392\t鞄\n16393\t鞆\n16394\t鞈\n16395\t鞉\n16396\t鞊\n16397\t鞌\n16398\t鞎\n16399\t鞐\n16400\t鞓\n16401\t鞖\n16402\t鞗\n16403\t鞙\n16404\t鞚\n16405\t鞛\n16406\t鞜\n16407\tㄌ\n16408\tμ\n16409\t√\n16410\t┨\n16411\tぬ\n16412\t⑻\n16413\tＬ\n16414\t佁\n16415\t偺\n16416\t勌\n16417\t吿\n16418\t壧\n16419\t娞\n16420\t屘\n16421\t嵦\n16422\t幪\n16423\t徧\n16424\t愄\n16425\t懱\n16426\t捥\n16427\t撎\n16428\t曁\n16429\t柼\n16430\t椞\n16431\t樚\n16432\t櫶\n16433\t浱\n16434\t溙\n16435\t澨\n16436\t炋\n16437\t熖\n16438\t蘞\n16439\t蘜\n16440\t虀\n16441\t蘾\n16442\t蘝\n16443\t蘙\n16444\t蘽\n16445\t虂\n16446\t虃\n16447\t虅\n16448\t虆\n16449\t虇\n16450\t虈\n16451\t虉\n16452\t虊\n16453\t虋\n16454\t虌\n16455\t虖\n16456\t虗\n16457\t虘\n16458\t虙\n16459\t虝\n16460\t虠\n16461\t虡\n16462\t虣\n16463\t虥\n16464\t虦\n16465\t虨\n16466\t虩\n16467\t︻\n16468\t☆\n16469\t╊\n16470\tゎ\n16471\t㈩\n16472\tヮ\n16473\t侇\n16474\t傤\n16475\t冾\n16476\t囶\n16477\t堫\n16478\t夘\n16479\t婎\n16480\t嬵\n16481\t岊\n16482\t嶎\n16483\t庮\n16484\t忣\n16485\t愵\n16486\t戭\n16487\t掝\n16488\t擃\n16489\t旑\n16490\t曨\n16491\t橆\n16492\t氼\n16493\t涱\n16494\t濐\n16495\t烆\n16496
\t頯\n16497\t瞵\n16498\t顋\n16499\t頮\n16500\t罨\n16501\t頪\n16502\t頬\n16503\t頏\n16504\t顐\n16505\t顑\n16506\t顕\n16507\t顖\n16508\t顙\n16509\t顚\n16510\t顜\n16511\t顝\n16512\t顟\n16513\t顠\n16514\t顡\n16515\t顢\n16516\t顣\n16517\t顦\n16518\t顩\n16519\t顬\n16520\t顭\n16521\tㄎ\n16522\tξ\n16523\t∥\n16524\t┪\n16525\t⑽\n16526\tＮ\n16527\t佄\n16528\t兾\n16529\t勎\n16530\t娢\n16531\t幬\n16532\t徫\n16533\t愇\n16534\t懳\n16535\t斘\n16536\t曃\n16537\t栁\n16538\t椢\n16539\t樜\n16540\t毼\n16541\t浳\n16542\t溛\n16543\t炍\n16544\t熚\n16545\t蝋\n16546\t蝆\n16547\t蝵\n16548\t蝊\n16549\t蝅\n16550\t蝱\n16551\t蝳\n16552\t蜙\n16553\t蝄\n16554\t蝹\n16555\t蝺\n16556\t蝿\n16557\t螀\n16558\t螁\n16559\t螇\n16560\t螉\n16561\t螊\n16562\t螌\n16563\t螎\n16564\t螑\n16565\t螒\n16566\t螔\n16567\t螕\n16568\t螖\n16569\t螘\n16570\t螙\n16571\t螚\n16572\t螛\n16573\t螝\n16574\t螠\n16575\t螡\n16576\t螣\n16577\t︼\n16578\t★\n16579\tэ\n16580\t╋\n16581\t侊\n16582\t傦\n16583\t冿\n16584\t勶\n16585\t咃\n16586\t嗭\n16587\t堬\n16588\t夛\n16589\t婏\n16590\t嬶\n16591\t岋\n16592\t嶏\n16593\t忥\n16594\t愶\n16595\t掞\n16596\t擄\n16597\t旓\n16598\t曪\n16599\t栵\n16600\t楋\n16601\t欙\n16602\t涳\n16603\t滐\n16604\t烇\n16605\t燂\n16606\t锎\n16607\t颺\n16608\t铷\n16609\t飤\n16610\t铴\n16611\t锏\n16612\t颻\n16613\t铩\n16614\t锓\n16615\t铽\n16616\t颷\n16617\t飡\n16618\t飣\n16619\t铹\n16620\t颸\n16621\t飥\n16622\t飫\n16623\t飬\n16624\t飭\n16625\t飮\n16626\t飱\n16627\t飳\n16628\t飴\n16629\t飵\n16630\t飶\n16631\t飸\n16632\t飺\n16633\t餀\n16634\t餁\n16635\t餂\n16636\t餄\n16637\tㄏ\n16638\tο\n16639\t∠\n16640\t┫\n16641\t⑾\n16642\tＯ\n16643\t佅\n16644\t傁\n16645\t兿\n16646\t勏\n16647\t呄\n16648\t喯\n16649\t囅\n16650\t埾\n16651\t壪\n16652\t娤\n16653\t嬒\n16654\t嵪\n16655\t幭\n16656\t懴\n16657\t捪\n16658\t斚\n16659\t曄\n16660\t栂\n16661\t椣\n16662\t櫹\n16663\t毾\n16664\t澫\n16665\t炏\n16666\t熛\n16667\t蟕\n16668\t蟐\n16669\t蟸\n16670\t蟔\n16671\t蟏\n16672\t蟵\n16673\t螥\n16674\t蟎\n16675\t蟺\n16676\t蟼\n16677\t蟽\n16678\t蟿\n16679\t蠀\n16680\t蠁\n16681\t蠆\n16682\t蠇\n16683\t蠈\n16684\t蠉\n16685\t蠋\n16686\t蠌\n16687\t蠎\n16688\t蠏\n16689\t蠐\n16690\t蠒\n16691\t蠗\n16692\t蠘\n16693\t蠚\n16694\t蠜\n16695\t蠠\n16696
\t锊\n16697\t韆\n16698\t鞟\n16699\t蚢\n16700\t虯\n16701\t愍\n16702\t砹\n16703\t霢\n16704\t蘟\n16705\t灬\n16706\t爨\n16707\t炷\n16708\t蝍\n16709\t蜛\n16710\t钆\n16711\t颽\n16712\t蟖\n16713\t螦\n16714\t镟\n16715\t镥\n16716\t镤\n16717\t锬\n16718\t锩\n16719\t铈\n16720\t韇\n16721\t蚥\n16722\t虰\n16723\t靊\n16724\t霣\n16725\t蘠\n16726\t藼\n16727\t蝏\n16728\t蜝\n16729\t颾\n16730\t蟗\n16731\t韈\n16732\t鞢\n16733\t蚦\n16734\t虲\n16735\t靋\n16736\t霤\n16737\t藽\n16738\t頲\n16739\t蝐\n16740\t蜟\n16741\t钋\n16742\t颿\n16743\t顲\n16744\t蟘\n16745\t螩\n16746\t韉\n16747\t鞤\n16748\t蚫\n16749\t虳\n16750\t眇\n16751\t靌\n16752\t霥\n16753\t藾\n16754\t頳\n16755\t蝑\n16756\t蜠\n16757\t铕\n16758\t飀\n16759\t螪\n16760\t镄\n16761\t韊\n16762\t鞥\n16763\t蚭\n16764\t虴\n16765\t黹\n16766\t靍\n16767\t霦\n16768\t蘣\n16769\t蘀\n16770\t頴\n16771\t蝒\n16772\t蜤\n16773\t钌\n16774\t飁\n16775\t蟚\n16776\t锼\n16777\t鞦\n16778\t蚮\n16779\t虵\n16780\t靎\n16781\t蘤\n16782\t蘁\n16783\t頵\n16784\t蝔\n16785\t飂\n16786\t螰\n16787\t鞧\n16788\t蚲\n16789\t虶\n16790\t靏\n16791\t霨\n16792\t蘥\n16793\t蘂\n16794\t頖\n16795\t蜧\n16796\t飃\n16797\t螱\n16798\t韍\n16799\t蚳\n16800\t虷\n16801\t靐\n16802\t霩\n16803\t蘦\n16804\t蘃\n16805\t蝖\n16806\t蜨\n16807\t颒\n16808\t韎\n16809\t鞩\n16810\t虸\n16811\t眍\n16812\t靑\n16813\t霫\n16814\t蘨\n16815\t煳\n16816\t蝘\n16817\t蜪\n16818\t钔\n16819\t飅\n16820\t蟟\n16821\t螴\n16822\t锿\n16823\t锾\n16824\t韏\n16825\t鞪\n16826\t蚸\n16827\t靔\n16828\t霬\n16829\t蘪\n16830\t蝚\n16831\t蜫\n16832\t螶\n16833\t鞬\n16834\t蚹\n16835\t蚄\n16836\t靕\n16837\t霮\n16838\t蘫\n16839\t頺\n16840\t頚\n16841\t蝛\n16842\t蜬\n16843\t飇\n16844\t颣\n16845\t螷\n16846\t蚻\n16847\t蚅\n16848\t靗\n16849\t霯\n16850\t蜭\n16851\t飈\n16852\t蟣\n16853\t韒\n16854\t蚼\n16855\t蚆\n16856\t靘\n16857\t霱\n16858\t蘉\n16859\t蝝\n16860\t蜯\n16861\t飉\n16862\t颩\n16863\t蟤\n16864\t螹\n16865\t鞱\n16866\t蚽\n16867\t蚇\n16868\t蘮\n16869\t頝\n16870\t蝞\n16871\t蜰\n16872\t飊\n16873\t颪\n16874\t蟦\n16875\t螻\n16876\t镅\n16877\t韔\n16878\t鞳\n16879\t蚾\n16880\t眢\n16881\t眚\n16882\t霴\n16883\t蘯\n16884\t煺\n16885\t頾\n16886\t頞\n16887\t蜲\n16888\t钪\n16889\t钬\n16890\t螼\n16891\t镎\n16892\t韕\n16893\t鞵\n16894\t蚿\n16895\t蚉\n16896
\t靝\n16897\t蘰\n16898\t頿\n16899\t頟\n16900\t蝡\n16901\t蜳\n16902\t飌\n16903\t蟨\n16904\t螾\n16905\t韖\n16906\t蛁\n16907\t蚎\n16908\t靟\n16909\t蝢\n16910\t蜵\n16911\t飍\n16912\t颭\n16913\t蟩\n16914\t螿\n16915\t韗\n16916\t鞷\n16917\t蛂\n16918\t蚏\n16919\t靣\n16920\t霷\n16921\t蘲\n16922\t顁\n16923\t蜶\n16924\t蟫\n16925\t蟁\n16926\t韘\n16927\t鞸\n16928\t蛃\n16929\t蚐\n16930\t靤\n16931\t霺\n16932\t蘳\n16933\t顂\n16934\t頢\n16935\t蝧\n16936\t飐\n16937\t蟂\n16938\t钸\n16939\t韙\n16940\t鞹\n16941\t蛅\n16942\t蚑\n16943\t霻\n16944\t蘴\n16945\t蘐\n16946\t頣\n16947\t蜹\n16948\t颰\n16949\t蟭\n16950\t蟃\n16951\t韚\n16952\t鞺\n16953\t蛈\n16954\t蚒\n16955\t睃\n16956\t碥\n16957\t靧\n16958\t霼\n16959\t禚\n16960\t顄\n16961\t蝩\n16962\t蜺\n16963\t稆\n16964\t稃\n16965\t韛\n16966\t鞻\n16967\t蛌\n16968\t蚔\n16969\t霽\n16970\t蘶\n16971\t蘓\n16972\t蝪\n16973\t蜼\n16974\t颲\n16975\t蟰\n16976\t蟅\n16977\t蚖\n16978\t靪\n16979\t霿\n16980\t蘷\n16981\t蘔\n16982\t蝫\n16983\t蜽\n16984\t蟇\n16985\t韝\n16986\t鞽\n16987\t蛒\n16988\t蚗\n16989\t靫\n16990\t靀\n16991\t蘹\n16992\t蘕\n16993\t顇\n16994\t飜\n16995\t颴\n16996\t韞\n16997\t鞾\n16998\t蛓\n16999\t蚘\n17000\t靁\n17001\t蘺\n17002\t顈\n17003\t頨\n17004\t蝭\n17005\t蝁\n17006\t飝\n17007\t颵\n17008\t蟉\n17009\t韟\n17010\t鞿\n17011\t蛕\n17012\t蚙\n17013\t靭\n17014\t蘻\n17015\t顉\n17016\t頩\n17017\t蝯\n17018\t飠\n17019\t蟴\n17020\t蟌\n17021\t铪\n17022\t钷\n17023\t頎\n17024\t蜖\n17025\t鞝\n17026\t虪\n17027\t顮\n17028\t螤\n17029\t餇\n17030\t磈\n17031\t╦\n17032\t鱦\n17033\t譲\n17034\t琷\n17035\t癹\n17036\t礿\n17037\t痡\n17038\t璲\n17039\t縥\n17040\t穓\n17041\t竕\n17042\t秊\n17043\t﹋\n17044\t甹\n17045\t眏\n17046\t乯\n17047\t俲\n17048\t僯\n17049\t刯\n17050\t卝\n17051\t唈\n17052\t坖\n17053\t塲\n17054\t妀\n17055\t宩\n17056\t峧\n17057\t廽\n17058\t抝\n17059\t搄\n17060\t攋\n17061\t昷\n17062\t梛\n17063\t檍\n17064\t歫\n17065\t沯\n17066\t渏\n17067\t漥\n17068\t瀓\n17069\t碕\n17070\t↗\n17071\t鱆\n17072\t譐\n17073\t琂\n17074\t癑\n17075\t礘\n17076\t疛\n17077\t璊\n17078\t稪\n17079\t窲\n17080\t禞\n17081\t㎎\n17082\t盝\n17083\t丣\n17084\t侸\n17085\t僇\n17086\t凧\n17087\t咼\n17088\t嘕\n17089\t圝\n17090\t塉\n17091\t奐\n17092\t婮\n17093\t孞\n17094\t岼\n17095\t嶫\n17096
\t廕\n17097\t怞\n17098\t慗\n17099\t扟\n17100\t揓\n17101\t擩\n17102\t朖\n17103\t桱\n17104\t楯\n17105\t橨\n17106\t淛\n17107\t滼\n17108\t濲\n17109\t烰\n17110\t磌\n17111\t╧\n17112\t譳\n17113\t琸\n17114\t祂\n17115\t痥\n17116\t縦\n17117\t穔\n17118\t竗\n17119\t籯\n17120\t秌\n17121\t﹌\n17122\t甼\n17123\t眐\n17124\t乲\n17125\t俴\n17126\t僰\n17127\t刱\n17128\t卥\n17129\t唊\n17130\t噆\n17131\t坘\n17132\t塳\n17133\t妅\n17134\t峩\n17135\t巏\n17136\t弅\n17137\t恔\n17138\t慿\n17139\t択\n17140\t搆\n17141\t杒\n17142\t梜\n17143\t榢\n17144\t檏\n17145\t歬\n17146\t沰\n17147\t渒\n17148\t漦\n17149\t瀔\n17150\t焝\n17151\t↘\n17152\t琄\n17153\t疜\n17154\t璌\n17155\t縆\n17156\t稫\n17157\t窴\n17158\t獽\n17159\t籏\n17160\t㎏\n17161\t甂\n17162\t侹\n17163\t僈\n17164\t凨\n17165\t匥\n17166\t咾\n17167\t嘖\n17168\t奒\n17169\t婯\n17170\t孠\n17171\t岾\n17172\t嶬\n17173\t廗\n17174\t怟\n17175\t扠\n17176\t揔\n17177\t擪\n17178\t昁\n17179\t朘\n17180\t楰\n17181\t欿\n17182\t汯\n17183\t淜\n17184\t滽\n17185\t濳\n17186\t烱\n17187\t磍\n17188\t╨\n17189\t鱨\n17190\t琹\n17191\t璴\n17192\t縧\n17193\t竘\n17194\t籰\n17195\t秎\n17196\t﹍\n17197\t甽\n17198\t眑\n17199\t乴\n17200\t俵\n17201\t刲\n17202\t卨\n17203\t唋\n17204\t噇\n17205\t坙\n17206\t塴\n17207\t妉\n17208\t宭\n17209\t峫\n17210\t巐\n17211\t弆\n17212\t恖\n17213\t憀\n17214\t抣\n17215\t搇\n17216\t昹\n17217\t杔\n17218\t榣\n17219\t檒\n17220\t歭\n17221\t沴\n17222\t渓\n17223\t漧\n17224\t焞\n17225\t碙\n17226\t↙\n17227\t譒\n17228\t癓\n17229\t礚\n17230\t疞\n17231\t璍\n17232\t稬\n17233\t獿\n17234\t籐\n17235\t㎜\n17236\t盠\n17237\t丩\n17238\t侺\n17239\t僉\n17240\t匧\n17241\t哃\n17242\t圠\n17243\t塋\n17244\t婰\n17245\t孡\n17246\t峀\n17247\t嶭\n17248\t廘\n17249\t怢\n17250\t慙\n17251\t扡\n17252\t揕\n17253\t擫\n17254\t朙\n17255\t桳\n17256\t楲\n17257\t橪\n17258\t歀\n17259\t汱\n17260\t淟\n17261\t濴\n17262\t烲\n17263\t磎\n17264\t╩\n17265\t鱩\n17266\t譵\n17267\t癿\n17268\t祄\n17269\t痬\n17270\t璵\n17271\t穖\n17272\t竚\n17273\t籱\n17274\t秏\n17275\t﹎\n17276\t甿\n17277\t眒\n17278\t乵\n17279\t僲\n17280\t刴\n17281\t卪\n17282\t唌\n17283\t噈\n17284\t坢\n17285\t妋\n17286\t媘\n17287\t峬\n17288\t巑\n17289\t恗\n17290\t抦\n17291\t搈\n17292\t攎\n17293\t梞\n17294\t榤\n17295\t檓\n17296
\t歮\n17297\t沵\n17298\t漨\n17299\t瀖\n17300\t焟\n17301\t譓\n17302\t琈\n17303\t癕\n17304\t礛\n17305\t疢\n17306\t璏\n17307\t縈\n17308\t稭\n17309\t窶\n17310\t玀\n17311\t籑\n17312\t㎝\n17313\t甅\n17314\t丮\n17315\t侻\n17316\t匨\n17317\t哅\n17318\t嘙\n17319\t圡\n17320\t塎\n17321\t奙\n17322\t孧\n17323\t峂\n17324\t嶮\n17325\t怣\n17326\t扢\n17327\t揗\n17328\t昅\n17329\t朚\n17330\t桵\n17331\t楳\n17332\t歁\n17333\t汳\n17334\t淢\n17335\t濵\n17336\t烳\n17337\t╪\n17338\t譶\n17339\t皀\n17340\t痭\n17341\t璶\n17342\t縩\n17343\t穘\n17344\t竛\n17345\t秐\n17346\t畁\n17347\t乶\n17348\t僴\n17349\t刵\n17350\t唍\n17351\t噉\n17352\t塶\n17353\t妌\n17354\t媙\n17355\t宯\n17356\t峮\n17357\t巒\n17358\t弉\n17359\t恘\n17360\t抧\n17361\t搉\n17362\t昻\n17363\t杗\n17364\t榥\n17365\t歯\n17366\t沶\n17367\t渘\n17368\t焠\n17369\t碞\n17370\t∟\n17371\t鱊\n17372\t譔\n17373\t琋\n17374\t癗\n17375\t疦\n17376\t璑\n17377\t窷\n17378\t籒\n17379\t禢\n17380\t㎞\n17381\t甆\n17382\t丯\n17383\t侼\n17384\t凬\n17385\t匩\n17386\t哊\n17387\t圢\n17388\t塏\n17389\t奛\n17390\t婲\n17391\t孨\n17392\t峃\n17393\t嶯\n17394\t怤\n17395\t慛\n17396\t扤\n17397\t揘\n17398\t擭\n17399\t朜\n17400\t桸\n17401\t楴\n17402\t歂\n17403\t汵\n17404\t淣\n17405\t漀\n17406\t濶\n17407\t︸\n17408\t●\n17409\tゑ\n17410\tⅠ\n17411\tヱ\n17412\t凂\n17413\t咇\n17414\t嗰\n17415\t囻\n17416\t堮\n17417\t夞\n17418\t婑\n17419\t嬹\n17420\t岏\n17421\t嶑\n17422\t庱\n17423\t忨\n17424\t戱\n17425\t旕\n17426\t栺\n17427\t楍\n17428\t橊\n17429\t欛\n17430\t汃\n17431\t涶\n17432\t滖\n17433\t燅\n17434\t馺\n17435\t馸\n17436\t駘\n17437\t瘳\n17438\t駖\n17439\t馹\n17440\t馵\n17441\t馌\n17442\t駞\n17443\t駢\n17444\t駥\n17445\t駦\n17446\t駨\n17447\t駩\n17448\t駪\n17449\t駬\n17450\t駮\n17451\t駰\n17452\t駴\n17453\t駵\n17454\t駶\n17455\t駸\n17456\tㄑ\n17457\tρ\n17458\t⊙\n17459\t┭\n17460\t⒀\n17461\tＱ\n17462\tパ\n17463\t傃\n17464\t冄\n17465\t勓\n17466\t呇\n17467\t囇\n17468\t堁\n17469\t娧\n17470\t嬔\n17471\t屟\n17472\t嵮\n17473\t幯\n17474\t徰\n17475\t愌\n17476\t捬\n17477\t撗\n17478\t栄\n17479\t椦\n17480\t樠\n17481\t氀\n17482\t浹\n17483\t溠\n17484\t澭\n17485\t裛\n17486\t裗\n17487\t褈\n17488\t裑\n17489\t裖\n17490\t褅\n17491\t袬\n17492\t裓\n17493\t褉\n17494\t褋\n17495\t褌\n17496
\t褍\n17497\t褎\n17498\t褏\n17499\t褑\n17500\t褔\n17501\t褖\n17502\t褗\n17503\t褘\n17504\t褜\n17505\t褝\n17506\t褞\n17507\t褟\n17508\t褤\n17509\t褧\n17510\t褨\n17511\t褩\n17512\t褬\n17513\t褭\n17514\t褱\n17515\t褳\n17516\t褵\n17517\t窆\n17518\t馎\n17519\t袮\n17520\t窳\n17521\t衤\n17522\t袯\n17523\t馽\n17524\t馛\n17525\t裞\n17526\t裠\n17527\t袲\n17528\t馿\n17529\t馝\n17530\t袳\n17531\t耖\n17532\t耔\n17533\t耠\n17534\t馞\n17535\t裦\n17536\t袴\n17537\t馟\n17538\t裧\n17539\t袵\n17540\t駂\n17541\t馠\n17542\t裩\n17543\t袶\n17544\t駃\n17545\t馡\n17546\t裪\n17547\t袸\n17548\t耥\n17549\t耢\n17550\t裉\n17551\t馢\n17552\t袹\n17553\t駅\n17554\t馣\n17555\t裬\n17556\t袺\n17557\t駆\n17558\t馤\n17559\t裭\n17560\t袻\n17561\t駇\n17562\t馦\n17563\t裮\n17564\t袽\n17565\t駈\n17566\t馧\n17567\t裯\n17568\t袾\n17569\t駉\n17570\t袿\n17571\t裼\n17572\t裵\n17573\t裀\n17574\t駋\n17575\t馫\n17576\t裶\n17577\t裃\n17578\t駌\n17579\t裷\n17580\t裄\n17581\t裺\n17582\t裇\n17583\t駎\n17584\t裻\n17585\t駏\n17586\t馯\n17587\t裿\n17588\t駑\n17589\t褀\n17590\t裌\n17591\t褁\n17592\t裍\n17593\t駓\n17594\t褃\n17595\t駔\n17596\t褄\n17597\t裐\n17598\t駹\n17599\t褷\n17600\t磑\n17601\t╫\n17602\t鱫\n17603\t琽\n17604\t皁\n17605\t祇\n17606\t痮\n17607\t璷\n17608\t穙\n17609\t玱\n17610\t籵\n17611\t秓\n17612\t﹐\n17613\t畂\n17614\t眔\n17615\t乷\n17616\t俹\n17617\t僶\n17618\t刼\n17619\t卭\n17620\t坥\n17621\t塷\n17622\t妎\n17623\t宱\n17624\t巓\n17625\t憃\n17626\t抩\n17627\t搊\n17628\t攐\n17629\t杘\n17630\t榦\n17631\t歰\n17632\t沷\n17633\t漮\n17634\t碠\n17635\t∣\n17636\t譕\n17637\t琌\n17638\t癘\n17639\t礝\n17640\t疧\n17641\t璒\n17642\t縊\n17643\t稯\n17644\t玂\n17645\t禣\n17646\t丱\n17647\t侽\n17648\t凮\n17649\t匫\n17650\t哋\n17651\t圤\n17652\t塐\n17653\t奜\n17654\t峅\n17655\t嶰\n17656\t廜\n17657\t怬\n17658\t扥\n17659\t揙\n17660\t擮\n17661\t昈\n17662\t朞\n17663\t桹\n17664\t橭\n17665\t歄\n17666\t汷\n17667\t淥\n17668\t濷\n17669\t烵\n17670\t骱\n17671\t『\n17672\tШ\n17673\t┖\n17674\t⒑\n17675\t：\n17676\t伜\n17677\t偤\n17678\t吅\n17679\t喓\n17680\t姾\n17681\t嫼\n17682\t尯\n17683\t嵑\n17684\t幒\n17685\t徍\n17686\t懞\n17687\t捄\n17688\t摵\n17689\t柡\n17690\t櫤\n17691\t毢\n17692\t澓\n17693\t灪\n17694\t篳\n17695\t篰\n17696
\t簙\n17697\t簗\n17698\t篲\n17699\t篬\n17700\t篯\n17701\t簘\n17702\t篅\n17703\t篭\n17704\t簚\n17705\t簛\n17706\t簜\n17707\t簝\n17708\t簣\n17709\t簤\n17710\t簥\n17711\t簨\n17712\t簩\n17713\t簬\n17714\t簭\n17715\t簮\n17716\t簯\n17717\t簰\n17718\t簱\n17719\t簲\n17720\t簳\n17721\t簴\n17722\t簵\n17723\t簶\n17724\t簹\n17725\t簺\n17726\t簻\n17727\t簼\n17728\t◇\n17729\t侒\n17730\t傮\n17731\t凅\n17732\t勼\n17733\t咉\n17734\t圀\n17735\t夡\n17736\t婓\n17737\t嬻\n17738\t岓\n17739\t庴\n17740\t忬\n17741\t愺\n17742\t戵\n17743\t掦\n17744\t擉\n17745\t旙\n17746\t曮\n17747\t栿\n17748\t楏\n17749\t橌\n17750\t欝\n17751\t汅\n17752\t涹\n17753\t滙\n17754\t濗\n17755\t烍\n17756\t燇\n17757\t骭\n17758\t蟓\n17759\t骩\n17760\t骫\n17761\t髛\n17762\t螫\n17763\t髝\n17764\t髠\n17765\t髢\n17766\t髤\n17767\t髥\n17768\t髧\n17769\t髨\n17770\t髩\n17771\t髬\n17772\t髱\n17773\t髲\n17774\t髳\n17775\t髵\n17776\t髶\n17777\t髸\n17778\t髺\n17779\t髼\n17780\t髽\n17781\t髾\n17782\t髿\n17783\t鬀\n17784\t鬂\n17785\t鬅\n17786\tㄓ\n17787\tτ\n17788\t∮\n17789\t┯\n17790\t⒂\n17791\tＳ\n17792\t佊\n17793\t傆\n17794\t冇\n17795\t呌\n17796\t囉\n17797\t堄\n17798\t娪\n17799\t嵱\n17800\t幱\n17801\t徲\n17802\t懹\n17803\t撚\n17804\t斢\n17805\t栍\n17806\t椨\n17807\t櫽\n17808\t氂\n17809\t浻\n17810\t溣\n17811\t澯\n17812\t炗\n17813\t熡\n17814\t觍\n17815\t觺\n17816\t觃\n17817\t覿\n17818\t觷\n17819\t觹\n17820\t觻\n17821\t觽\n17822\t觾\n17823\t觿\n17824\t訁\n17825\t訃\n17826\t訄\n17827\t訆\n17828\t訉\n17829\t訋\n17830\t訌\n17831\t訍\n17832\t訐\n17833\t訒\n17834\t訔\n17835\t訖\n17836\t託\n17837\t訛\n17838\t訜\n17839\t︱\n17840\t◎\n17841\tⅡ\n17842\tヲ\n17843\t侐\n17844\t凃\n17845\t咈\n17846\t嗱\n17847\t囼\n17848\t婒\n17849\t嬺\n17850\t庲\n17851\t忩\n17852\t愹\n17853\t掤\n17854\t擈\n17855\t楎\n17856\t欜\n17857\t汄\n17858\t涷\n17859\t濖\n17860\t烌\n17861\t騸\n17862\t騶\n17863\t騕\n17864\t騗\n17865\t騵\n17866\t虍\n17867\t駺\n17868\t騖\n17869\t騹\n17870\t騺\n17871\t騻\n17872\t騼\n17873\t騿\n17874\t驂\n17875\t驄\n17876\t驆\n17877\t驇\n17878\t驉\n17879\t驌\n17880\t驑\n17881\t驒\n17882\t驓\n17883\t驔\n17884\t驖\n17885\t驘\n17886\tㄒ\n17887\tσ\n17888\t∫\n17889\tб\n17890\t┮\n17891\t⒁\n17892\tＲ\n17893\t佉\n17894\t勔\n17895\t囈\n17896
\t壱\n17897\t娨\n17898\t嬕\n17899\t屢\n17900\t嵰\n17901\t幰\n17902\t徱\n17903\t愐\n17904\t撘\n17905\t斠\n17906\t栆\n17907\t椧\n17908\t樢\n17909\t櫼\n17910\t氁\n17911\t浺\n17912\t溡\n17913\t澮\n17914\t炓\n17915\t襚\n17916\t襘\n17917\t襼\n17918\t襹\n17919\t襙\n17920\t襕\n17921\t襗\n17922\t襺\n17923\t褸\n17924\t襽\n17925\t襾\n17926\t覀\n17927\t覂\n17928\t覅\n17929\t覇\n17930\t覈\n17931\t覉\n17932\t覊\n17933\t覌\n17934\t覎\n17935\t覐\n17936\t覑\n17937\t覒\n17938\t覔\n17939\t覕\n17940\t覗\n17941\t覘\n17942\t覙\n17943\t覛\n17944\t覜\n17945\t覝\n17946\t覞\n17947\t覟\n17948\t覠\n17949\t︳\n17950\t◆\n17951\tⅣ\n17952\tヴ\n17953\t侓\n17954\t勽\n17955\t咊\n17956\t嗶\n17957\t圁\n17958\t堲\n17959\t婔\n17960\t嬼\n17961\t岕\n17962\t嶔\n17963\t庺\n17964\t忯\n17965\t愻\n17966\t桇\n17967\t楐\n17968\t橍\n17969\t欞\n17970\t涺\n17971\t鬬\n17972\t鬪\n17973\t糇\n17974\t舭\n17975\t鬫\n17976\t舡\n17977\t鬩\n17978\t魗\n17979\t魙\n17980\t鬇\n17981\t簦\n17982\t鬨\n17983\t舯\n17984\t魛\n17985\t魜\n17986\t魝\n17987\t魞\n17988\t魠\n17989\t魡\n17990\t魢\n17991\t魣\n17992\t魤\n17993\t魥\n17994\t魦\n17995\t魧\n17996\t魨\n17997\t魩\n17998\t魪\n17999\t魫\n18000\t魬\n18001\t魭\n18002\t魮\n18003\t魰\n18004\t魲\n18005\t魳\n18006\t魴\n18007\t魶\n18008\t魸\n18009\t魹\n18010\t魺\n18011\tㄔ\n18012\t髟\n18013\tυ\n18014\t≡\n18015\tг\n18016\t┰\n18017\t⒃\n18018\tＴ\n18019\t佋\n18020\t傇\n18021\t勗\n18022\t呍\n18023\t囋\n18024\t壴\n18025\t娫\n18026\t屧\n18027\t嵲\n18028\t幵\n18029\t愒\n18030\t栐\n18031\t椩\n18032\t樤\n18033\t櫾\n18034\t氃\n18035\t浽\n18036\t溤\n18037\t澰\n18038\t熢\n18039\t訿\n18040\t詜\n18041\t訽\n18042\t訹\n18043\t訞\n18044\t詟\n18045\t詤\n18046\t詥\n18047\t詧\n18048\t詨\n18049\t詪\n18050\t詫\n18051\t詬\n18052\t詯\n18053\t詴\n18054\t詵\n18055\t詶\n18056\t詷\n18057\t詸\n18058\t詺\n18059\t詻\n18060\t詾\n18061\t詿\n18062\t■\n18063\tⅥ\n18064\tヶ\n18065\t傱\n18066\t咑\n18067\t嗹\n18068\t圅\n18069\t夦\n18070\t嬾\n18071\t忲\n18072\t愽\n18073\t戹\n18074\t掱\n18075\t旜\n18076\t曵\n18077\t桍\n18078\t橏\n18079\t濚\n18080\t烐\n18081\t鯺\n18082\t鰚\n18083\t鯹\n18084\t鰗\n18085\t鰙\n18086\t觯\n18087\t鯸\n18088\t鰛\n18089\t鰜\n18090\t鰝\n18091\t鰞\n18092\t鰠\n18093\t鰡\n18094\t鰢\n18095\t鰦\n18096
\t鰧\n18097\t鰨\n18098\t鰪\n18099\t鰮\n18100\t鰯\n18101\t鰰\n18102\t鰳\n18103\t鰴\n18104\t鰵\n18105\t鰷\n18106\t鰹\n18107\t鰺\n18108\tㄖ\n18109\tχ\n18110\t≈\n18111\t┲\n18112\t⒅\n18113\tＶ\n18114\t佒\n18115\t冎\n18116\t勚\n18117\t呏\n18118\t堉\n18119\t屩\n18120\t嵵\n18121\t徶\n18122\t捴\n18123\t斨\n18124\t栔\n18125\t椫\n18126\t樦\n18127\t欀\n18128\t浿\n18129\t溨\n18130\t澲\n18131\t炛\n18132\t熤\n18133\t謃\n18134\t謁\n18135\t謢\n18136\t謤\n18137\t謥\n18138\t謧\n18139\t謩\n18140\t謪\n18141\t謮\n18142\t謯\n18143\t謰\n18144\t謱\n18145\t謵\n18146\t謶\n18147\t謷\n18148\t謸\n18149\t謺\n18150\t謻\n18151\t謼\n18152\t謽\n18153\t謾\n18154\t謿\n18155\t譀\n18156\t譁\n18157\t譂\n18158\t譃\n18159\t譄\n18160\t黪\n18161\t︴\n18162\t□\n18163\tⅤ\n18164\tヵ\n18165\t傰\n18166\t匁\n18167\t咍\n18168\t嗸\n18169\t圂\n18170\t堳\n18171\t夣\n18172\t嬽\n18173\t岝\n18174\t嶕\n18175\t庻\n18176\t愼\n18177\t掯\n18178\t旛\n18179\t桋\n18180\t楑\n18181\t橎\n18182\t欟\n18183\t汋\n18184\t涻\n18185\t滜\n18186\t鮚\n18187\t酲\n18188\t鮺\n18189\t鮸\n18190\t鮗\n18191\t鮙\n18192\t鮷\n18193\t鮹\n18194\t酾\n18195\t醵\n18196\t魼\n18197\t鮘\n18198\t鮻\n18199\t鮽\n18200\t鮿\n18201\t鯀\n18202\t鯁\n18203\t鯃\n18204\t鯄\n18205\t鯆\n18206\t鯈\n18207\t鯋\n18208\t鯍\n18209\t鯎\n18210\t鯏\n18211\t鯐\n18212\t鯑\n18213\t鯒\n18214\t鯓\n18215\t鯕\n18216\t鯗\n18217\t鯙\n18218\t鯚\n18219\tㄕ\n18220\tφ\n18221\t≌\n18222\tд\n18223\t┱\n18224\t⒄\n18225\tＵ\n18226\t佌\n18227\t傉\n18228\t冋\n18229\t呎\n18230\t喺\n18231\t囌\n18232\t堈\n18233\t壵\n18234\t娬\n18235\t嬚\n18236\t屨\n18237\t嵳\n18238\t幷\n18239\t徴\n18240\t愓\n18241\t懻\n18242\t捳\n18243\t撜\n18244\t斦\n18245\t樥\n18246\t氄\n18247\t浾\n18248\t溦\n18249\t炚\n18250\t熣\n18251\t諂\n18252\t諀\n18253\t誟\n18254\t誁\n18255\t諃\n18256\t諄\n18257\t諅\n18258\t諆\n18259\t諉\n18260\t諌\n18261\t諍\n18262\t諎\n18263\t諑\n18264\t諓\n18265\t諔\n18266\t諕\n18267\t諗\n18268\t諘\n18269\t諙\n18270\t諛\n18271\t諝\n18272\t諞\n18273\t諟\n18274\t諠\n18275\t諡\n18276\t諢\n18277\t▲\n18278\tⅧ\n18279\t侙\n18280\t傴\n18281\t凐\n18282\t匄\n18283\t嗻\n18284\t堷\n18285\t夬\n18286\t婙\n18287\t孁\n18288\t嶘\n18289\t庿\n18290\t忴\n18291\t慀\n18292\t掵\n18293\t擑\n18294\t桒\n18295\t楕\n18296
\t橒\n18297\t欨\n18298\t涾\n18299\t滧\n18300\t濜\n18301\t烒\n18302\t燌\n18303\t鴡\n18304\t鴟\n18305\t鳾\n18306\t鴀\n18307\t鴞\n18308\t鴠\n18309\t鳣\n18310\t鴢\n18311\t鴤\n18312\t鴥\n18313\t鴫\n18314\t鴬\n18315\t鴰\n18316\t鴱\n18317\t鴴\n18318\t鴶\n18319\t鴸\n18320\t鴹\n18321\t鴽\n18322\t鴾\n18323\t鵀\n18324\t鵁\n18325\tㄘ\n18326\t∝\n18327\tж\n18328\t┴\n18329\tＸ\n18330\t佖\n18331\t傌\n18332\t冐\n18333\t勜\n18334\t呚\n18335\t嗀\n18336\t囏\n18337\t娯\n18338\t嬝\n18339\t嵷\n18340\t庁\n18341\t捸\n18342\t撠\n18343\t曍\n18344\t栘\n18345\t権\n18346\t欂\n18347\t氊\n18348\t涁\n18349\t澵\n18350\t豟\n18351\t豝\n18352\t貇\n18353\t貄\n18354\t豞\n18355\t丿\n18356\t豙\n18357\t豜\n18358\t貃\n18359\t貆\n18360\t谸\n18361\t丌\n18362\t豛\n18363\t乇\n18364\t貎\n18365\t貏\n18366\t貑\n18367\t貒\n18368\t貕\n18369\t貗\n18370\t貙\n18371\t貚\n18372\t貛\n18373\t貜\n18374\t貟\n18375\t貣\n18376\t貤\n18377\t貥\n18378\t丶\n18379\t篴\n18380\t篈\n18381\t觓\n18382\t覣\n18383\t竽\n18384\t騛\n18385\t褹\n18386\t鬭\n18387\t鬉\n18388\t舄\n18389\t鯝\n18390\t謅\n18391\t諥\n18392\t鮝\n18393\t魽\n18394\t誂\n18395\t酹\n18396\t鴄\n18397\t鳤\n18398\t豠\n18399\t谹\n18400\t劐\n18401\t羝\n18402\t銎\n18403\t劓\n18404\t篵\n18405\t骲\n18406\t驜\n18407\t觔\n18408\t覤\n18409\t騜\n18410\t鬮\n18411\t鬊\n18412\t鯾\n18413\t鮞\n18414\t誃\n18415\t鴅\n18416\t谺\n18417\t篶\n18418\t篊\n18419\t骳\n18420\t觕\n18421\t覥\n18422\t騝\n18423\t襝\n18424\t鬰\n18425\t鬋\n18426\t訡\n18427\t鯿\n18428\t鯟\n18429\t謈\n18430\t鲧\n18431\t魿\n18432\t誧\n18433\t誄\n18434\t鴆\n18435\t鳦\n18436\t谻\n18437\t篸\n18438\t篋\n18439\t骴\n18440\t驞\n18441\t觗\n18442\t鬌\n18443\t詃\n18444\t鯠\n18445\t謉\n18446\t鮠\n18447\t躔\n18448\t豥\n18449\t匦\n18450\t篹\n18451\t篍\n18452\t骵\n18453\t觘\n18454\t騟\n18455\t襡\n18456\t褽\n18457\t鬳\n18458\t詄\n18459\t絷\n18460\t鰁\n18461\t鋈\n18462\t鮡\n18463\t鮁\n18464\t誩\n18465\t誆\n18466\t鴈\n18467\t豦\n18468\t谽\n18469\t厣\n18470\t骹\n18471\t觙\n18472\t騠\n18473\t騀\n18474\t襢\n18475\t褾\n18476\t鬴\n18477\t鬎\n18478\t詅\n18479\t訤\n18480\t鰂\n18481\t鯢\n18482\t謋\n18483\t諪\n18484\t谾\n18485\t篻\n18486\t篏\n18487\t骻\n18488\t驡\n18489\t觛\n18490\t覩\n18491\t騡\n18492\t襣\n18493\t褿\n18494\t鬵\n18495\t鬐\n18496
\t鰃\n18497\t謌\n18498\t諫\n18499\t鮣\n18500\t鮃\n18501\t鴊\n18502\t鳪\n18503\t篽\n18504\t篐\n18505\t骽\n18506\t觝\n18507\t騂\n18508\t襤\n18509\t襀\n18510\t鬶\n18511\t鬑\n18512\t詇\n18513\t訦\n18514\t鰄\n18515\t謍\n18516\t鮤\n18517\t鮄\n18518\t誋\n18519\t豩\n18520\t豀\n18521\t篿\n18522\t篒\n18523\t骾\n18524\t驣\n18525\t觟\n18526\t騣\n18527\t騃\n18528\t襥\n18529\t襂\n18530\t鬷\n18531\t鬒\n18532\t詉\n18533\t訧\n18534\t敉\n18535\t纛\n18536\t鰅\n18537\t鯥\n18538\t鐾\n18539\t鮅\n18540\t蹯\n18541\t鴌\n18542\t鳬\n18543\t豂\n18544\t篔\n18545\t骿\n18546\t騤\n18547\t騄\n18548\t襧\n18549\t襃\n18550\t訨\n18551\t鰆\n18552\t鯦\n18553\t謏\n18554\t鮆\n18555\t誮\n18556\t鴍\n18557\t鳭\n18558\t豭\n18559\t豃\n18560\t篕\n18561\t髃\n18562\t驥\n18563\t觡\n18564\t覭\n18565\t騅\n18566\t襅\n18567\t鬹\n18568\t鬕\n18569\t詋\n18570\t訩\n18571\t鰇\n18572\t諯\n18573\t鮧\n18574\t鮇\n18575\t誎\n18576\t鴎\n18577\t鳮\n18578\t豮\n18579\t豄\n18580\t簂\n18581\t篖\n18582\t驦\n18583\t騦\n18584\t騆\n18585\t襆\n18586\t鬺\n18587\t鬖\n18588\t詌\n18589\t謑\n18590\t諰\n18591\t誏\n18592\t鴏\n18593\t豯\n18594\t簃\n18595\t髆\n18596\t驧\n18597\t觤\n18598\t騧\n18599\t鬽\n18600\t鬗\n18601\t詍\n18602\t誱\n18603\t誐\n18604\t鴐\n18605\t豰\n18606\t簄\n18607\t篘\n18608\t髇\n18609\t觧\n18610\t覰\n18611\t篑\n18612\t笱\n18613\t騨\n18614\t騈\n18615\t襫\n18616\t襈\n18617\t鬾\n18618\t詎\n18619\t訬\n18620\t鰊\n18621\t謓\n18622\t諲\n18623\t鮊\n18624\t誑\n18625\t鴑\n18626\t鳱\n18627\t豱\n18628\t篛\n18629\t驩\n18630\t篚\n18631\t騩\n18632\t騉\n18633\t襬\n18634\t鬿\n18635\t鬙\n18636\t詏\n18637\t艉\n18638\t鰋\n18639\t鯫\n18640\t謔\n18641\t諳\n18642\t誳\n18643\t誒\n18644\t鹾\n18645\t躜\n18646\t鳲\n18647\t豲\n18648\t刂\n18649\t簆\n18650\t篜\n18651\t觩\n18652\t騪\n18653\t騊\n18654\t襭\n18655\t魀\n18656\t訮\n18657\t鰌\n18658\t鯬\n18659\t謕\n18660\t諴\n18661\t鮌\n18662\t誴\n18663\t誔\n18664\t鴓\n18665\t豍\n18666\t簈\n18667\t篞\n18668\t髊\n18669\t觪\n18670\t騋\n18671\t襮\n18672\t襋\n18673\t鬛\n18674\t詑\n18675\t訯\n18676\t鯭\n18677\t諵\n18678\t誵\n18679\t豵\n18680\t簉\n18681\t篟\n18682\t髍\n18683\t驲\n18684\t觬\n18685\t覴\n18686\t騬\n18687\t襌\n18688\t魊\n18689\t詒\n18690\t訰\n18691\t鯮\n18692\t諶\n18693\t鮎\n18694\t誶\n18695\t誖\n18696
\t鳵\n18697\t豶\n18698\t簊\n18699\t篠\n18700\t觭\n18701\t覵\n18702\t騭\n18703\t騍\n18704\t襰\n18705\t襍\n18706\t詓\n18707\t鰏\n18708\t鮯\n18709\t鮏\n18710\t誷\n18711\t豷\n18712\t簍\n18713\t篢\n18714\t觮\n18715\t襎\n18716\t魌\n18717\t誸\n18718\t豻\n18719\t簎\n18720\t髐\n18721\t骍\n18722\t覷\n18723\t簏\n18724\t鬠\n18725\t糈\n18726\t鯱\n18727\t諹\n18728\t鮱\n18729\t踣\n18730\t鴘\n18731\t鳸\n18732\t豼\n18733\t豒\n18734\t刳\n18735\t簐\n18736\t骎\n18737\t騐\n18738\t襳\n18739\t襐\n18740\t魐\n18741\t鬡\n18742\t詖\n18743\t鰒\n18744\t鮲\n18745\t誚\n18746\t鴙\n18747\t豽\n18748\t豓\n18749\t簑\n18750\t篧\n18751\t骔\n18752\t觲\n18753\t覹\n18754\t騱\n18755\t騑\n18756\t襴\n18757\t襑\n18758\t魒\n18759\t鰓\n18760\t謜\n18761\t鮳\n18762\t鮓\n18763\t鴚\n18764\t鳺\n18765\t豾\n18766\t簒\n18767\t篨\n18768\t騲\n18769\t騒\n18770\t襵\n18771\t襒\n18772\t魓\n18773\t鬤\n18774\t詘\n18775\t鰔\n18776\t鯴\n18777\t鮴\n18778\t鮔\n18779\t誜\n18780\t鳻\n18781\t豿\n18782\t骙\n18783\t觵\n18784\t覻\n18785\t簖\n18786\t魕\n18787\t訷\n18788\t鰕\n18789\t鯵\n18790\t諽\n18791\t鮕\n18792\t誽\n18793\t鴜\n18794\t鳼\n18795\t貀\n18796\t豗\n18797\t簔\n18798\t篫\n18799\t髗\n18800\t觶\n18801\t覼\n18802\t騔\n18803\t魖\n18804\t鬦\n18805\t訸\n18806\t謟\n18807\t鮶\n18808\t鮖\n18809\t誾\n18810\t貁\n18811\t豘\n18812\t酤\n18813\t鳋\n18814\t觜\n18815\t籂\n18816\t覡\n18817\t魻\n18818\t誀\n18819\t譅\n18820\t鵂\n18821\t貭\n18822\t磒\n18823\t╬\n18824\t譸\n18825\t琾\n18826\t皃\n18827\t祊\n18828\t痯\n18829\t璸\n18830\t穚\n18831\t竝\n18832\t玴\n18833\t籶\n18834\t秔\n18835\t﹑\n18836\t畃\n18837\t眕\n18838\t俻\n18839\t僷\n18840\t刾\n18841\t唒\n18842\t噋\n18843\t坧\n18844\t塸\n18845\t妏\n18846\t媝\n18847\t宲\n18848\t峱\n18849\t弍\n18850\t恜\n18851\t憄\n18852\t抪\n18853\t昿\n18854\t杙\n18855\t檖\n18856\t歱\n18857\t漰\n18858\t瀙\n18859\t焢\n18860\t碢\n18861\t≒\n18862\t鱌\n18863\t譖\n18864\t癙\n18865\t礟\n18866\t疨\n18867\t璓\n18868\t縋\n18869\t稰\n18870\t窹\n18871\t玃\n18872\t籔\n18873\t禤\n18874\t㏄\n18875\t丳\n18876\t侾\n18877\t働\n18878\t匬\n18879\t哖\n18880\t嘝\n18881\t圥\n18882\t塒\n18883\t奝\n18884\t婸\n18885\t孭\n18886\t峆\n18887\t嶱\n18888\t怭\n18889\t慞\n18890\t扨\n18891\t擯\n18892\t朠\n18893\t桺\n18894\t楶\n18895\t歅\n18896
\t汸\n18897\t淧\n18898\t漃\n18899\t濸\n18900\t烶\n18901\t磓\n18902\t譹\n18903\t皅\n18904\t祋\n18905\t痲\n18906\t璹\n18907\t縬\n18908\t穛\n18909\t竡\n18910\t玵\n18911\t籷\n18912\t秖\n18913\t﹒\n18914\t畄\n18915\t眖\n18916\t乹\n18917\t俼\n18918\t僸\n18919\t剄\n18920\t卶\n18921\t唓\n18922\t噏\n18923\t坬\n18924\t塹\n18925\t妐\n18926\t宷\n18927\t峲\n18928\t巕\n18929\t恞\n18930\t憅\n18931\t抭\n18932\t搎\n18933\t攓\n18934\t晀\n18935\t杚\n18936\t榪\n18937\t檘\n18938\t泀\n18939\t渜\n18940\t瀜\n18941\t焣\n18942\t碤\n18943\t≦\n18944\t鱍\n18945\t癚\n18946\t礠\n18947\t疩\n18948\t璔\n18949\t縌\n18950\t玅\n18951\t籕\n18952\t㏎\n18953\t盦\n18954\t丵\n18955\t俀\n18956\t僎\n18957\t凲\n18958\t匭\n18959\t哘\n18960\t嘠\n18961\t圦\n18962\t塓\n18963\t奞\n18964\t婹\n18965\t孮\n18966\t峇\n18967\t嶲\n18968\t廞\n18969\t怮\n18970\t扱\n18971\t擰\n18972\t昋\n18973\t朡\n18974\t桻\n18975\t楺\n18976\t歈\n18977\t漄\n18978\t濹\n18979\t烸\n18980\t磖\n18981\t╮\n18982\t鱮\n18983\t譺\n18984\t皉\n18985\t祌\n18986\t璻\n18987\t縭\n18988\t穜\n18989\t竢\n18990\t玶\n18991\t籸\n18992\t秗\n18993\t﹔\n18994\t畆\n18995\t眗\n18996\t乺\n18997\t俽\n18998\t剅\n18999\t卹\n19000\t唕\n19001\t坮\n19002\t塺\n19003\t妑\n19004\t媟\n19005\t宺\n19006\t弐\n19007\t恟\n19008\t憆\n19009\t抮\n19010\t搑\n19011\t杛\n19012\t梤\n19013\t榬\n19014\t檙\n19015\t渞\n19016\t漴\n19017\t焤\n19018\t碦\n19019\t≧\n19020\t鱎\n19021\t琑\n19022\t癛\n19023\t礡\n19024\t疪\n19025\t稲\n19026\t窻\n19027\t玆\n19028\t籖\n19029\t㏑\n19030\t俁\n19031\t僐\n19032\t凴\n19033\t哛\n19034\t嘡\n19035\t圧\n19036\t塕\n19037\t奟\n19038\t婻\n19039\t峈\n19040\t嶳\n19041\t怰\n19042\t慠\n19043\t扲\n19044\t揜\n19045\t昍\n19046\t朢\n19047\t桼\n19048\t楻\n19049\t歊\n19050\t汻\n19051\t漅\n19052\t磗\n19053\t鱯\n19054\t瑂\n19055\t皊\n19056\t痵\n19057\t穝\n19058\t竤\n19059\t玸\n19060\t籹\n19061\t秙\n19062\t﹕\n19063\t畇\n19064\t乻\n19065\t俿\n19066\t僺\n19067\t剆\n19068\t唖\n19069\t噑\n19070\t坰\n19071\t塻\n19072\t妔\n19073\t媠\n19074\t宻\n19075\t巗\n19076\t恠\n19077\t憇\n19078\t抯\n19079\t搒\n19080\t攕\n19081\t晄\n19082\t杝\n19083\t梥\n19084\t檚\n19085\t歴\n19086\t焥\n19087\t碨\n19088\t⊿\n19089\t鱏\n19090\t琒\n19091\t癝\n19092\t璖\n19093\t縎\n19094\t稴\n19095\t窼\n19096
\t玈\n19097\t籗\n19098\t㏒\n19099\t盨\n19100\t凷\n19101\t匰\n19102\t哠\n19103\t圫\n19104\t塖\n19105\t孲\n19106\t峉\n19107\t嶴\n19108\t怱\n19109\t慡\n19110\t扴\n19111\t揝\n19112\t朣\n19113\t桽\n19114\t橲\n19115\t歋\n19116\t汼\n19117\t烻\n19118\tㄟ\n19119\t∵\n19120\t⑦\n19121\t＿\n19122\t佭\n19123\t傔\n19124\t冞\n19125\t勥\n19126\t呥\n19127\t嗊\n19128\t囘\n19129\t堖\n19130\t夁\n19131\t娺\n19132\t屵\n19133\t嵾\n19134\t庍\n19135\t愡\n19136\t戇\n19137\t撨\n19138\t斶\n19139\t曔\n19140\t栠\n19141\t椷\n19142\t樳\n19143\t欉\n19144\t氝\n19145\t涍\n19146\t溸\n19147\t澾\n19148\t炦\n19149\t熯\n19150\t擗\n19151\t攥\n19152\t遾\n19153\t擐\n19154\t擤\n19155\t邆\n19156\t邇\n19157\t邉\n19158\t邌\n19159\t邍\n19160\t邎\n19161\t邐\n19162\t邒\n19163\t邔\n19164\t邖\n19165\t邘\n19166\t邚\n19167\t邜\n19168\t邞\n19169\t邟\n19170\t邠\n19171\t邤\n19172\t邥\n19173\t邧\n19174\t邫\n19175\t邭\n19176\t邲\n19177\t邷\n19178\t邼\n19179\t邽\n19180\t邿\n19181\t遖\n19182\t逜\n19183\t哜\n19184\t吣\n19185\t遚\n19186\t逤\n19187\t逥\n19188\t遝\n19189\t逧\n19190\t遟\n19191\t逩\n19192\t逪\n19193\t遡\n19194\t逫\n19195\t遤\n19196\t逬\n19197\t遦\n19198\t逰\n19199\t遧\n19200\t遪\n19201\t逳\n19202\t遫\n19203\t逴\n19204\t唪\n19205\t咴\n19206\t啧\n19207\t遬\n19208\t逷\n19209\t遯\n19210\t逹\n19211\t遰\n19212\t逺\n19213\t遱\n19214\t逽\n19215\t逿\n19216\t遳\n19217\t遀\n19218\t遶\n19219\t遆\n19220\t遈\n19221\t啐\n19222\t郀\n19223\t磘\n19224\t譼\n19225\t瑃\n19226\t皌\n19227\t縯\n19228\t穞\n19229\t竧\n19230\t秚\n19231\t﹖\n19232\t畉\n19233\t乼\n19234\t倀\n19235\t僼\n19236\t卼\n19237\t唗\n19238\t噒\n19239\t坱\n19240\t塼\n19241\t妕\n19242\t媡\n19243\t宼\n19244\t峵\n19245\t巘\n19246\t弔\n19247\t恡\n19248\t憈\n19249\t抰\n19250\t搕\n19251\t攖\n19252\t晅\n19253\t杢\n19254\t梩\n19255\t榯\n19256\t檛\n19257\t泃\n19258\t渢\n19259\t焧\n19260\t═\n19261\t鱐\n19262\t琓\n19263\t癟\n19264\t礣\n19265\t疶\n19266\t璗\n19267\t縏\n19268\t稵\n19269\t窽\n19270\t玊\n19271\t㏕\n19272\t乀\n19273\t僒\n19274\t凾\n19275\t圱\n19276\t奣\n19277\t婽\n19278\t孴\n19279\t峊\n19280\t嶵\n19281\t廡\n19282\t怲\n19283\t扵\n19284\t揟\n19285\t朤\n19286\t楾\n19287\t橳\n19288\t歍\n19289\t汿\n19290\t淭\n19291\t濼\n19292\t烼\n19293\t╱\n19294\t皍\n19295\t祏\n19296
\t痷\n19297\t璾\n19298\t縰\n19299\t穟\n19300\t竨\n19301\t玼\n19302\t籾\n19303\t秛\n19304\t﹗\n19305\t眜\n19306\t乽\n19307\t倁\n19308\t剈\n19309\t卽\n19310\t唘\n19311\t媢\n19312\t寀\n19313\t巙\n19314\t弖\n19315\t憉\n19316\t抲\n19317\t攗\n19318\t晆\n19319\t杣\n19320\t梪\n19321\t榰\n19322\t泆\n19323\t瀠\n19324\t碪\n19325\t║\n19326\t鱑\n19327\t譛\n19328\t琔\n19329\t癠\n19330\t礥\n19331\t疷\n19332\t縐\n19333\t稶\n19334\t窾\n19335\t玌\n19336\t籙\n19337\t︰\n19338\t甎\n19339\t乁\n19340\t俇\n19341\t刄\n19342\t匲\n19343\t哢\n19344\t嘦\n19345\t圲\n19346\t塙\n19347\t婾\n19348\t峌\n19349\t怳\n19350\t慤\n19351\t扷\n19352\t揢\n19353\t昒\n19354\t楿\n19355\t橴\n19356\t沀\n19357\t淯\n19358\t漊\n19359\t濽\n19360\t烾\n19361\t→\n19362\t侜\n19363\t傶\n19364\t凓\n19365\t匉\n19366\t咜\n19367\t嗿\n19368\t堹\n19369\t婜\n19370\t岤\n19371\t嶛\n19372\t忷\n19373\t慂\n19374\t扂\n19375\t旡\n19376\t曻\n19377\t桗\n19378\t楘\n19379\t橔\n19380\t欪\n19381\t汑\n19382\t淂\n19383\t滫\n19384\t鷃\n19385\t鷁\n19386\t鷡\n19387\t鷀\n19388\t鶣\n19389\t鶿\n19390\t鷢\n19391\t鷤\n19392\t鷧\n19393\t鷩\n19394\t鷪\n19395\t鷫\n19396\t鷬\n19397\t鷭\n19398\t鷰\n19399\t鷳\n19400\t鷴\n19401\t鷵\n19402\t鷷\n19403\t鷽\n19404\t鷾\n19405\t鸀\n19406\t鸁\n19407\tㄚ\n19408\t≮\n19409\t②\n19410\tＺ\n19411\t傏\n19412\t冓\n19413\t呞\n19414\t堏\n19415\t壼\n19416\t娳\n19417\t嬟\n19418\t屭\n19419\t徻\n19420\t愙\n19421\t戁\n19422\t捼\n19423\t撢\n19424\t斱\n19425\t曏\n19426\t栚\n19427\t椱\n19428\t涄\n19429\t溭\n19430\t澸\n19431\t炡\n19432\t赻\n19433\t赹\n19434\t赱\n19435\t赸\n19436\t趠\n19437\t贎\n19438\t讠\n19439\t赲\n19440\t趢\n19441\t趤\n19442\t趦\n19443\t趧\n19444\t趩\n19445\t趪\n19446\t趫\n19447\t趬\n19448\t趭\n19449\t趮\n19450\t趯\n19451\t趰\n19452\t趲\n19453\t趶\n19454\t趷\n19455\t趹\n19456\t趻\n19457\t趽\n19458\t跀\n19459\t跁\n19460\t跇\n19461\t跈\n19462\t跉\n19463\t跍\n19464\t跒\n19465\t跓\n19466\t※\n19467\tⅨ\n19468\t侚\n19469\t凒\n19470\t匇\n19471\t嗼\n19472\t堸\n19473\t夰\n19474\t婛\n19475\t孂\n19476\t嶚\n19477\t廀\n19478\t忶\n19479\t慁\n19480\t戼\n19481\t擓\n19482\t旟\n19483\t曺\n19484\t桖\n19485\t楖\n19486\t橓\n19487\t欩\n19488\t汏\n19489\t淁\n19490\t濝\n19491\t烓\n19492\t燍\n19493\t鵿\n19494\t鵞\n19495\t鵾\n19496
\t鶀\n19497\t鵃\n19498\t鶂\n19499\t鶄\n19500\t鶅\n19501\t鶆\n19502\t鶇\n19503\t鶊\n19504\t鶋\n19505\t鶏\n19506\t鶑\n19507\t鶓\n19508\t鶔\n19509\t鶕\n19510\t鶖\n19511\t鶙\n19512\t鶚\n19513\t鶜\n19514\t鶞\n19515\t鶠\n19516\t鶡\n19517\tㄙ\n19518\t≠\n19519\tз\n19520\t┵\n19521\tＹ\n19522\t傎\n19523\t冑\n19524\t呝\n19525\t嗁\n19526\t囐\n19527\t堎\n19528\t壻\n19529\t娰\n19530\t嵸\n19531\t庂\n19532\t徺\n19533\t愘\n19534\t捹\n19535\t撡\n19536\t斮\n19537\t栙\n19538\t椯\n19539\t樫\n19540\t欃\n19541\t涃\n19542\t澷\n19543\t炠\n19544\t熧\n19545\t賎\n19546\t侔\n19547\t賍\n19548\t賋\n19549\t賩\n19550\t賫\n19551\t貮\n19552\t賮\n19553\t賯\n19554\t賰\n19555\t賱\n19556\t賲\n19557\t賳\n19558\t賵\n19559\t賶\n19560\t賷\n19561\t賸\n19562\t賹\n19563\t賻\n19564\t賾\n19565\t賿\n19566\t贁\n19567\t贋\n19568\t←\n19569\tⅪ\n19570\t｛\n19571\t侞\n19572\t匊\n19573\t咞\n19574\t嘂\n19575\t圎\n19576\t夳\n19577\t婝\n19578\t孄\n19579\t岥\n19580\t嶜\n19581\t忹\n19582\t扄\n19583\t掻\n19584\t擕\n19585\t旣\n19586\t桘\n19587\t楙\n19588\t橕\n19589\t欫\n19590\t汒\n19591\t淃\n19592\t濢\n19593\t烕\n19594\t鸴\n19595\t鸧\n19596\t鸃\n19597\t麁\n19598\t麃\n19599\t麄\n19600\t麅\n19601\t麆\n19602\t麉\n19603\t麊\n19604\t麌\n19605\t麍\n19606\t麎\n19607\t麏\n19608\t麐\n19609\t麑\n19610\t麔\n19611\t麕\n19612\t麖\n19613\t麘\n19614\t麙\n19615\t麚\n19616\t麛\n19617\t麜\n19618\t麞\n19619\t麠\n19620\t麡\n19621\t麢\n19622\t麣\n19623\t麤\n19624\t麧\n19625\t麨\n19626\tㄛ\n19627\t≯\n19628\tй\n19629\t┷\n19630\t③\n19631\t［\n19632\t佦\n19633\t傐\n19634\t冔\n19635\t勠\n19636\t呟\n19637\t嗃\n19638\t囒\n19639\t堐\n19640\t嬠\n19641\t屰\n19642\t嵺\n19643\t庅\n19644\t徾\n19645\t戂\n19646\t捽\n19647\t斲\n19648\t曐\n19649\t栛\n19650\t椲\n19651\t樭\n19652\t欅\n19653\t氎\n19654\t涆\n19655\t溮\n19656\t澺\n19657\t熪\n19658\t踾\n19659\t踻\n19660\t郐\n19661\t踎\n19662\t踇\n19663\t踋\n19664\t踼\n19665\t跕\n19666\t踈\n19667\t踿\n19668\t蹃\n19669\t蹅\n19670\t蹆\n19671\t蹌\n19672\t蹍\n19673\t蹎\n19674\t蹏\n19675\t蹔\n19676\t蹕\n19677\t蹖\n19678\t蹗\n19679\t蹘\n19680\t蹚\n19681\t蹛\n19682\t蹜\n19683\t蹝\n19684\t蹞\n19685\t蹡\n19686\t蹢\n19687\t蹧\n19688\t蹨\n19689\t蹫\n19690\t↑\n19691\tⅫ\n19692\t｜\n19693\t侟\n19694\t傸\n19695\t凕\n19696
\t匋\n19697\t咟\n19698\t堻\n19699\t夵\n19700\t岦\n19701\t嶞\n19702\t廃\n19703\t忺\n19704\t扅\n19705\t掽\n19706\t擖\n19707\t旤\n19708\t朁\n19709\t桙\n19710\t楛\n19711\t橖\n19712\t汓\n19713\t淈\n19714\t滭\n19715\t濣\n19716\t烖\n19717\t燑\n19718\t鼅\n19719\t鼃\n19720\t黖\n19721\t黓\n19722\t鼂\n19723\t鼄\n19724\t麫\n19725\t鼆\n19726\t鼇\n19727\t鼈\n19728\t鼉\n19729\t鼊\n19730\t鼌\n19731\t鼏\n19732\t鼑\n19733\t鼒\n19734\t鼔\n19735\t鼕\n19736\t鼖\n19737\t鼘\n19738\t鼚\n19739\t鼛\n19740\t鼜\n19741\t鼝\n19742\t鼟\n19743\t鼡\n19744\t鼣\n19745\t鼥\n19746\t鼦\n19747\t鼧\n19748\t鼪\n19749\t鼫\n19750\t鼮\n19751\tㄜ\n19752\t≤\n19753\tк\n19754\t┸\n19755\t④\n19756\t＼\n19757\t佨\n19758\t呠\n19759\t囓\n19760\t堒\n19761\t娷\n19762\t庈\n19763\t徿\n19764\t戃\n19765\t捾\n19766\t斳\n19767\t曑\n19768\t椳\n19769\t樮\n19770\t欆\n19771\t氒\n19772\t溰\n19773\t澻\n19774\t熫\n19775\t躟\n19776\t躝\n19777\t堋\n19778\t躿\n19779\t堙\n19780\t墚\n19781\t堍\n19782\t埽\n19783\t躙\n19784\t軃\n19785\t軄\n19786\t軆\n19787\t軉\n19788\t軐\n19789\t軓\n19790\t軔\n19791\t軕\n19792\t軗\n19793\t軘\n19794\t軚\n19795\t軞\n19796\t軡\n19797\t軣\n19798\t鶤\n19799\t赼\n19800\t卩\n19801\t阝\n19802\t阢\n19803\t鵄\n19804\t賏\n19805\t汆\n19806\t馘\n19807\t鸻\n19808\t踑\n19809\t跘\n19810\t坫\n19811\t躠\n19812\t蹵\n19813\t塥\n19814\t芰\n19815\t苊\n19816\t冁\n19817\t鶥\n19818\t赽\n19819\t贐\n19820\t鵥\n19821\t鵅\n19822\t鸼\n19823\t鸅\n19824\t踒\n19825\t跙\n19826\t黚\n19827\t麭\n19828\t躡\n19829\t蹷\n19830\t鷆\n19831\t赾\n19832\t贑\n19833\t谇\n19834\t鵆\n19835\t賑\n19836\t鸆\n19837\t踓\n19838\t跜\n19839\t麮\n19840\t蹸\n19841\t鶧\n19842\t赿\n19843\t陴\n19844\t鵧\n19845\t貲\n19846\t踕\n19847\t垧\n19848\t黡\n19849\t麯\n19850\t躣\n19851\t鶨\n19852\t趀\n19853\t贓\n19854\t鵨\n19855\t鵈\n19856\t勹\n19857\t鹐\n19858\t鸈\n19859\t坶\n19860\t凵\n19861\t廴\n19862\t黣\n19863\t麰\n19864\t躤\n19865\t蹺\n19866\t鶩\n19867\t趂\n19868\t贔\n19869\t鵩\n19870\t鵉\n19871\t賔\n19872\t鹒\n19873\t鸉\n19874\t踗\n19875\t跢\n19876\t黤\n19877\t躥\n19878\t蹻\n19879\t鶪\n19880\t趃\n19881\t鵊\n19882\t賕\n19883\t貵\n19884\t鹓\n19885\t踘\n19886\t黦\n19887\t躦\n19888\t蹽\n19889\t鷋\n19890\t鶫\n19891\t趆\n19892\t鵋\n19893\t賖\n19894\t跦\n19895\t麳\n19896
\t躧\n19897\t蹾\n19898\t鷌\n19899\t鶬\n19900\t趇\n19901\t贗\n19902\t賗\n19903\t亠\n19904\t鹖\n19905\t鸌\n19906\t跧\n19907\t垲\n19908\t黫\n19909\t躨\n19910\t鷍\n19911\t鶭\n19912\t趈\n19913\t贘\n19914\t鵭\n19915\t賘\n19916\t鹙\n19917\t鸍\n19918\t踛\n19919\t跩\n19920\t黬\n19921\t麶\n19922\t躩\n19923\t躂\n19924\t鷎\n19925\t鶮\n19926\t趉\n19927\t鵎\n19928\t賙\n19929\t貹\n19930\t鹝\n19931\t鸎\n19932\t踜\n19933\t跭\n19934\t黭\n19935\t麷\n19936\t躃\n19937\t鷏\n19938\t趌\n19939\t鵯\n19940\t鵏\n19941\t貺\n19942\t鹟\n19943\t踠\n19944\t跮\n19945\t黮\n19946\t麹\n19947\t躭\n19948\t躄\n19949\t鷐\n19950\t趍\n19951\t鵐\n19952\t賛\n19953\t鸐\n19954\t踡\n19955\t跰\n19956\t黰\n19957\t麺\n19958\t鶱\n19959\t趎\n19960\t贜\n19961\t踤\n19962\t跱\n19963\t黱\n19964\t麼\n19965\t躰\n19966\t趏\n19967\t鵒\n19968\t賝\n19969\t裒\n19970\t僦\n19971\t鹢\n19972\t鸒\n19973\t踥\n19974\t跲\n19975\t墼\n19976\t黲\n19977\t麿\n19978\t躱\n19979\t鶳\n19980\t鵓\n19981\t鹥\n19982\t鸓\n19983\t踦\n19984\t跴\n19985\t黳\n19986\t黀\n19987\t鷔\n19988\t趒\n19989\t赒\n19990\t鵴\n19991\t鵔\n19992\t賟\n19993\t鸔\n19994\t黁\n19995\t躋\n19996\t鷕\n19997\t鶵\n19998\t趓\n19999\t赗\n20000\t鵕\n20001\t鸕\n20002\t踨\n20003\t黵\n20004\t黂\n20005\t躌\n20006\t鷖\n20007\t鵶\n20008\t鵖\n20009\t鹲\n20010\t鸖\n20011\t黶\n20012\t躶\n20013\t趖\n20014\t赥\n20015\t鵷\n20016\t鸗\n20017\t踭\n20018\t跿\n20019\t黷\n20020\t黅\n20021\t躷\n20022\t躎\n20023\t鷘\n20024\t鶸\n20025\t趗\n20026\t赨\n20027\t谫\n20028\t鵸\n20029\t鵘\n20030\t氽\n20031\t冱\n20032\t鸘\n20033\t踰\n20034\t踀\n20035\t埯\n20036\t黸\n20037\t黆\n20038\t躸\n20039\t躑\n20040\t茳\n20041\t鷙\n20042\t鶹\n20043\t趘\n20044\t赩\n20045\t鵹\n20046\t鵙\n20047\t鸙\n20048\t踲\n20049\t黺\n20050\t黇\n20051\t躹\n20052\t躒\n20053\t鷚\n20054\t鵺\n20055\t鵚\n20056\t賥\n20057\t賅\n20058\t鹷\n20059\t踳\n20060\t黽\n20061\t躻\n20062\t躓\n20063\t鷛\n20064\t趚\n20065\t赬\n20066\t鵻\n20067\t鹸\n20068\t踃\n20069\t黊\n20070\t躼\n20071\t鷜\n20072\t鶼\n20073\t趛\n20074\t赮\n20075\t鵜\n20076\t賧\n20077\t鸜\n20078\t踶\n20079\t踄\n20080\t鼀\n20081\t黋\n20082\t躖\n20083\t鷝\n20084\t鶽\n20085\t趜\n20086\t赯\n20087\t鵽\n20088\t賨\n20089\t鹺\n20090\t踷\n20091\t踆\n20092\t鼁\n20093\t黌\n20094\t躾\n20095\t跔\n20096
\t鶢\n20097\t麪\n20098\t蹱\n20099\t軤\n20100\t╲\n20101\t譾\n20102\t皏\n20103\t祐\n20104\t痸\n20105\t穠\n20106\t玽\n20107\t籿\n20108\t秜\n20109\t﹙\n20110\t畍\n20111\t眝\n20112\t乿\n20113\t倂\n20114\t僾\n20115\t剉\n20116\t卾\n20117\t唙\n20118\t噕\n20119\t坴\n20120\t塿\n20121\t妚\n20122\t媣\n20123\t寁\n20124\t峷\n20125\t巚\n20126\t弙\n20127\t恦\n20128\t抳\n20129\t攙\n20130\t晇\n20131\t杤\n20132\t梫\n20133\t榲\n20134\t檝\n20135\t渧\n20136\t漹\n20137\t瀡\n20138\t焩\n20139\t碫\n20140\t╒\n20141\t琕\n20142\t疺\n20143\t縑\n20144\t稸\n20145\t籚\n20146\t禫\n20147\t￢\n20148\t甐\n20149\t盫\n20150\t俈\n20151\t僔\n20152\t刅\n20153\t匳\n20154\t哣\n20155\t嘨\n20156\t圴\n20157\t奦\n20158\t媀\n20159\t峍\n20160\t怴\n20161\t慥\n20162\t扸\n20163\t揤\n20164\t擵\n20165\t昖\n20166\t梀\n20167\t橵\n20168\t歏\n20169\t淰\n20170\t漋\n20171\t磜\n20172\t╳\n20173\t鱳\n20174\t譿\n20175\t瑆\n20176\t祑\n20177\t瓀\n20178\t縲\n20179\t穡\n20180\t粀\n20181\t秝\n20182\t﹚\n20183\t畐\n20184\t倃\n20185\t僿\n20186\t厀\n20187\t唚\n20188\t噖\n20189\t墂\n20190\t妛\n20191\t媤\n20192\t寃\n20193\t峸\n20194\t巜\n20195\t弚\n20196\t恮\n20197\t抴\n20198\t搘\n20199\t晈\n20200\t梬\n20201\t榳\n20202\t渨\n20203\t漺\n20204\t焪\n20205\t碬\n20206\t╓\n20207\t譝\n20208\t琖\n20209\t礧\n20210\t疻\n20211\t璚\n20212\t稺\n20213\t竁\n20214\t籛\n20215\t禬\n20216\t￤\n20217\t甒\n20218\t盬\n20219\t乄\n20220\t俉\n20221\t刉\n20222\t匴\n20223\t哤\n20224\t圵\n20225\t塛\n20226\t峎\n20227\t廤\n20228\t怶\n20229\t慦\n20230\t扺\n20231\t揥\n20232\t昗\n20233\t朩\n20234\t梂\n20235\t橶\n20236\t漌\n20237\t濿\n20238\t焀\n20239\t磝\n20240\t▁\n20241\t鱴\n20242\t瑇\n20243\t皒\n20244\t祒\n20245\t痻\n20246\t瓁\n20247\t縳\n20248\t竫\n20249\t粁\n20250\t秞\n20251\t﹛\n20252\t眡\n20253\t倄\n20254\t厁\n20255\t唜\n20256\t噚\n20257\t坸\n20258\t墄\n20259\t妜\n20260\t媥\n20261\t寈\n20262\t峹\n20263\t巟\n20264\t弜\n20265\t恱\n20266\t憍\n20267\t抶\n20268\t搙\n20269\t攛\n20270\t杧\n20271\t梮\n20272\t榵\n20273\t檟\n20274\t歺\n20275\t泋\n20276\t渪\n20277\t漻\n20278\t瀤\n20279\t焫\n20280\t╔\n20281\t琗\n20282\t礨\n20283\t璛\n20284\t縓\n20285\t稾\n20286\t竂\n20287\t玐\n20288\t禭\n20289\t甔\n20290\t盭\n20291\t乆\n20292\t俋\n20293\t僗\n20294\t刋\n20295\t匵\n20296
\t哫\n20297\t嘪\n20298\t圶\n20299\t塜\n20300\t奨\n20301\t媂\n20302\t孹\n20303\t峏\n20304\t廥\n20305\t怷\n20306\t扻\n20307\t昘\n20308\t梄\n20309\t橷\n20310\t歑\n20311\t沊\n20312\t淴\n20313\t漍\n20314\t瀀\n20315\t焁\n20316\t磞\n20317\t▂\n20318\t讁\n20319\t皔\n20320\t祔\n20321\t痽\n20322\t瓂\n20323\t穣\n20324\t珁\n20325\t粂\n20326\t秠\n20327\t﹜\n20328\t畒\n20329\t眣\n20330\t倅\n20331\t剏\n20332\t厃\n20333\t唝\n20334\t噛\n20335\t坹\n20336\t墆\n20337\t媦\n20338\t寉\n20339\t峺\n20340\t巠\n20341\t弝\n20342\t恲\n20343\t抷\n20344\t搚\n20345\t晊\n20346\t杫\n20347\t梱\n20348\t榶\n20349\t檡\n20350\t歽\n20351\t泍\n20352\t瀥\n20353\t焬\n20354\t碮\n20355\t╕\n20356\t譟\n20357\t琘\n20358\t礩\n20359\t痀\n20360\t璝\n20361\t縔\n20362\t竃\n20363\t籝\n20364\t℡\n20365\t盰\n20366\t乊\n20367\t俌\n20368\t僘\n20369\t刌\n20370\t匶\n20371\t哬\n20372\t嘫\n20373\t圷\n20374\t奩\n20375\t媃\n20376\t孻\n20377\t峐\n20378\t嶻\n20379\t廦\n20380\t怸\n20381\t慪\n20382\t扽\n20383\t揧\n20384\t擸\n20385\t昚\n20386\t朰\n20387\t梇\n20388\t橸\n20389\t沋\n20390\t漎\n20391\t瀁\n20392\t焂\n20393\t↓\n20394\t｝\n20395\t傹\n20396\t匌\n20397\t咠\n20398\t嘄\n20399\t堼\n20400\t夶\n20401\t婟\n20402\t孆\n20403\t嶟\n20404\t廄\n20405\t忼\n20406\t慅\n20407\t扆\n20408\t掿\n20409\t擙\n20410\t旪\n20411\t朂\n20412\t桚\n20413\t楜\n20414\t橗\n20415\t欭\n20416\t滮\n20417\t烗\n20418\t齕\n20419\t齗\n20420\t齵\n20421\t鼲\n20422\t齖\n20423\t齹\n20424\t齺\n20425\t齻\n20426\t齼\n20427\t齽\n20428\t齾\n20429\t龂\n20430\t龎\n20431\t龏\n20432\t龒\n20433\t龓\n20434\t龖\n20435\t龗\n20436\t龝\n20437\t龞\n20438\t龡\n20439\t郎\n20440\t凉\n20441\t裏\n20442\tㄝ\n20443\t鬏\n20444\t≥\n20445\tл\n20446\tぽ\n20447\t⑤\n20448\t］\n20449\t佪\n20450\t呡\n20451\t娸\n20452\t屳\n20453\t嵼\n20454\t庉\n20455\t忀\n20456\t愝\n20457\t戄\n20458\t捿\n20459\t撦\n20460\t斴\n20461\t曒\n20462\t椵\n20463\t樰\n20464\t欇\n20465\t涊\n20466\t溳\n20467\t熭\n20468\t輅\n20469\t莰\n20470\t輣\n20471\t輀\n20472\t輂\n20473\t輠\n20474\t輢\n20475\t荮\n20476\t軥\n20477\t輁\n20478\t輥\n20479\t輦\n20480\t輧\n20481\t輫\n20482\t輬\n20483\t輭\n20484\t輮\n20485\t輲\n20486\t輳\n20487\t輴\n20488\t輵\n20489\t輶\n20490\t輹\n20491\t輽\n20492\t轀\n20493\t轃\n20494\t菥\n20495\t莶\n20496
\t齛\n20497\t鼳\n20498\t齜\n20499\t齝\n20500\t鼵\n20501\t輈\n20502\t軨\n20503\t鼶\n20504\t軩\n20505\t齟\n20506\t鼸\n20507\t輊\n20508\t萆\n20509\t鼺\n20510\t輌\n20511\t軬\n20512\t齢\n20513\t齣\n20514\t輎\n20515\t軮\n20516\t齤\n20517\t齁\n20518\t輏\n20519\t齥\n20520\t齂\n20521\t輐\n20522\t軰\n20523\t齃\n20524\t輑\n20525\t軱\n20526\t齧\n20527\t齅\n20528\t輒\n20529\t軲\n20530\t齨\n20531\t齆\n20532\t軳\n20533\t齇\n20534\t軴\n20535\t蔌\n20536\t齈\n20537\t齫\n20538\t齉\n20539\t輖\n20540\t齬\n20541\t軷\n20542\t輘\n20543\t齮\n20544\t齌\n20545\t輙\n20546\t軹\n20547\t齯\n20548\t齍\n20549\t輚\n20550\t軺\n20551\t葙\n20552\t蓰\n20553\t蒇\n20554\t蒈\n20555\t齰\n20556\t齎\n20557\t齱\n20558\t齏\n20559\t蔟\n20560\t齴\n20561\t軿\n20562\t蒉\n20563\t隣\n20564\t磟\n20565\t▃\n20566\t瑉\n20567\t皕\n20568\t痾\n20569\t穤\n20570\t竮\n20571\t珃\n20572\t粃\n20573\t秡\n20574\t﹝\n20575\t眤\n20576\t亃\n20577\t剒\n20578\t厇\n20579\t噝\n20580\t坺\n20581\t墇\n20582\t寊\n20583\t峼\n20584\t弞\n20585\t恴\n20586\t抸\n20587\t搝\n20588\t晍\n20589\t杬\n20590\t梲\n20591\t歾\n20592\t泎\n20593\t渮\n20594\t漽\n20595\t焭\n20596\t碯\n20597\t╖\n20598\t譠\n20599\t琙\n20600\t癦\n20601\t痁\n20602\t縕\n20603\t竄\n20604\t籞\n20605\t㈱\n20606\t甖\n20607\t盳\n20608\t乑\n20609\t僙\n20610\t刏\n20611\t哯\n20612\t圸\n20613\t塟\n20614\t孼\n20615\t峑\n20616\t廧\n20617\t慫\n20618\t抁\n20619\t揨\n20620\t昛\n20621\t朲\n20622\t梈\n20623\t橺\n20624\t歓\n20625\t沍\n20626\t瀂\n20627\t焃\n20628\t齄\n20629\t〓\n20630\t￣\n20631\t侢\n20632\t傼\n20633\t凗\n20634\t咡\n20635\t嘅\n20636\t圑\n20637\t夻\n20638\t孇\n20639\t岨\n20640\t廅\n20641\t怇\n20642\t扊\n20643\t揀\n20644\t旫\n20645\t桛\n20646\t楟\n20647\t欮\n20648\t淊\n20649\t烚\n20650\t燓\n20651\tㄞ\n20652\t∞\n20653\t⑥\n20654\t＾\n20655\t佫\n20656\t傓\n20657\t冝\n20658\t呣\n20659\t嗈\n20660\t囖\n20661\t夀\n20662\t娹\n20663\t嬣\n20664\t屴\n20665\t嵽\n20666\t庌\n20667\t忁\n20668\t愞\n20669\t戅\n20670\t撧\n20671\t斵\n20672\t曓\n20673\t椶\n20674\t樲\n20675\t欈\n20676\t氜\n20677\t涋\n20678\t溵\n20679\t澽\n20680\t炥\n20681\t熮\n20682\t轥\n20683\t轣\n20684\t迆\n20685\t轤\n20686\t瞢\n20687\t轢\n20688\t迃\n20689\t迉\n20690\t迊\n20691\t迋\n20692\t迌\n20693\t迒\n20694\t迗\n20695\t迚\n20696
\t迠\n20697\t迡\n20698\t迣\n20699\t迧\n20700\t迬\n20701\t迯\n20702\t迱\n20703\t迲\n20704\t迶\n20705\t迺\n20706\t迻\n20707\t迼\n20708\t迾\n20709\t迿\n20710\t逇\n20711\t逈\n20712\t逌\n20713\t逎\n20714\t逓\n20715\t逕\n20716\t嗀\n20717\t轪\n20718\t掎\n20719\t掊\n20720\t轇\n20721\t﨏\n20722\t辌\n20723\t﨑\n20724\t辒\n20725\t扌\n20726\t辝\n20727\t﨔\n20728\t辠\n20729\t轋\n20730\t礼\n20731\t轌\n20732\t辢\n20733\t辤\n20734\t尢\n20735\t揞\n20736\t揎\n20737\t﨡\n20738\t辥\n20739\t轐\n20740\t﨤\n20741\t辧\n20742\t轑\n20743\t﨧\n20744\t辪\n20745\t轒\n20746\t辬\n20747\t轓\n20748\t轔\n20749\t搌\n20750\t挢\n20751\t轕\n20752\t轗\n20753\t辳\n20754\t捱\n20755\t轙\n20756\t辵\n20757\t轚\n20758\t撄\n20759\t辷\n20760\t辸\n20761\t轝\n20762\t掭\n20763\t撖\n20764\t逘\n20765\t礌\n20766\t瑺\n20767\t禒\n20768\t癄\n20769\t繝\n20770\t窢\n20771\t珷\n20772\t粻\n20773\t稜\n20774\t仩\n20775\t儬\n20776\t劆\n20777\t啝\n20778\t嚑\n20779\t垹\n20780\t墵\n20781\t姞\n20782\t尃\n20783\t崰\n20784\t帬\n20785\t彔\n20786\t悹\n20787\t憼\n20788\t挔\n20789\t摖\n20790\t敔\n20791\t暊\n20792\t枲\n20793\t槧\n20794\t櫊\n20795\t殸\n20796\t洜\n20797\t潬\n20798\t灎\n20799\t煚\n20800\t牬\n20801\t燸\n20802\t牗\n20803\t爚\n20804\t狑\n20805\t牞\n20806\t爘\n20807\t牔\n20808\t牥\n20809\t牭\n20810\t牎\n20811\t牱\n20812\t牳\n20813\t牜\n20814\t牷\n20815\t燵\n20816\t爗\n20817\t爙\n20818\t牕\n20819\t牰\n20820\t牣\n20821\t燖\n20822\t牑\n20823\t牏\n20824\t牐\n20825\t牓\n20826\t燶\n20827\t牨\n20828\t爜\n20829\t爞\n20830\t爟\n20831\t爠\n20832\t爡\n20833\t爢\n20834\t爣\n20835\t爤\n20836\t爥\n20837\t爧\n20838\t爩\n20839\t爫\n20840\t爮\n20841\t爯\n20842\t爳\n20843\t爴\n20844\t爼\n20845\t牀\n20846\t牃\n20847\t牅\n20848\t牉\n20849\t牊\n20850\t牸\n20851\t牻\n20852\t牼\n20853\t牴\n20854\t牪\n20855\t牫\n20856\t燗\n20857\t牚\n20858\t犪\n20859\t犩\n20860\t犂\n20861\t犫\n20862\t犅\n20863\t犲\n20864\t犱\n20865\t犮\n20866\t犆\n20867\t犳\n20868\t犉\n20869\t燽\n20870\t燘\n20871\t燾\n20872\t犵\n20873\t犌\n20874\t燿\n20875\t狆\n20876\t犘\n20877\t爀\n20878\t燛\n20879\t犻\n20880\t犐\n20881\t犺\n20882\t犎\n20883\t爁\n20884\t爂\n20885\t燝\n20886\t爃\n20887\t燞\n20888\t爄\n20889\t犿\n20890\t犾\n20891\t犖\n20892\t狅\n20893\t犗\n20894\t爅\n20895\t燡\n20896
\t爇\n20897\t燢\n20898\t爈\n20899\t爉\n20900\t狇\n20901\t犙\n20902\t燨\n20903\t牶\n20904\t狊\n20905\t犛\n20906\t狉\n20907\t犚\n20908\t狋\n20909\t犜\n20910\t狏\n20911\t狌\n20912\t犝\n20913\t狓\n20914\t爌\n20915\t爎\n20916\t燫\n20917\t燬\n20918\t犨\n20919\t燯\n20920\t狕\n20921\t狔\n20922\t狖\n20923\t犤\n20924\t狘\n20925\t犥\n20926\t燰\n20927\t爓\n20928\t燱\n20929\t爔\n20930\t燳\n20931\t狚\n20932\t犦\n20933\t爖\n20934\t牋\n20935\tOOV_NUM\n20936\tOOV_ALPHA\n20937\tOOV_ALNUM\n20938\tOOV_HANZ\n20940\tOOV\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/custom.py",
    "content": "\"\"\"\n该模块实现用户自定义词典的功能\n\"\"\"\nfrom io import open\n\nfrom .ahocorasick import Ahocorasick\n\n\nclass Customization(object):\n    \"\"\"\n    基于AC自动机实现用户干预的功能\n    \"\"\"\n\n    def __init__(self):\n        self.dictitem = {}\n        self.ac = None\n        pass\n\n    def load_customization(self, filename, sep=None):\n        \"\"\"装载人工干预词典\"\"\"\n        self.ac = Ahocorasick()\n        with open(filename, 'r', encoding='utf8') as f:\n            for line in f:\n                if sep == None:\n                    words = line.strip().split()\n                else:\n                    words = line.strip().split(sep)\n\n                if len(words) == 0:\n                    continue\n\n                phrase = \"\"\n                tags = []\n                offset = []\n                for word in words:\n                    if word.rfind('/') < 1:\n                        phrase += word\n                        tags.append('')\n                    else:\n                        phrase += word[:word.rfind('/')]\n                        tags.append(word[word.rfind('/') + 1:])\n                    offset.append(len(phrase))\n\n                if len(phrase) < 2 and tags[0] == '':\n                    continue\n\n                self.dictitem[phrase] = (tags, offset)\n                self.ac.add_word(phrase)\n        self.ac.make()\n\n    def parse_customization(self, query, lac_tags):\n        \"\"\"使用人工干预词典修正lac模型的输出\"\"\"\n\n        def ac_postpress(ac_res):\n            ac_res.sort()\n            i = 1\n            while i < len(ac_res):\n                if ac_res[i - 1][0] < ac_res[i][0] and ac_res[i][0] <= ac_res[i - 1][1]:\n                    ac_res.pop(i)\n                    continue\n                i += 1\n            return ac_res\n\n        if not self.ac:\n            print(\"Customized dict is not loaded.\")\n            return\n\n        ac_res = self.ac.search(query)\n\n        ac_res = ac_postpress(ac_res)\n\n        for 
begin, end in ac_res:\n            phrase = query[begin:end + 1]\n            index = begin\n\n            tags, offsets = self.dictitem[phrase]\n            for tag, offset in zip(tags, offsets):\n                while index < begin + offset:\n                    if len(tag) == 0:\n                        lac_tags[index] = lac_tags[index][:-1] + 'I'\n                    else:\n                        lac_tags[index] = tag + \"-I\"\n                    index += 1\n\n            lac_tags[begin] = lac_tags[begin][:-1] + 'B'\n            for offset in offsets:\n                index = begin + offset\n                if index < len(lac_tags):\n                    lac_tags[index] = lac_tags[index][:-1] + 'B'\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/module.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport io\nimport math\nimport os\n\nimport numpy as np\nimport six\nfrom paddle.inference import Config\nfrom paddle.inference import create_predictor\n\nfrom .custom import Customization\nfrom .processor import load_kv_dict\nfrom .processor import parse_result\nfrom .processor import word_to_ids\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.utils.parser import txt_parser\nfrom paddlehub.utils.utils import sys_stdin_encoding\n\n\nclass DataFormatError(Exception):\n\n    def __init__(self, *args):\n        self.args = args\n\n\n@moduleinfo(\n    name=\"lac\",\n    version=\"2.4.0\",\n    summary=\n    \"Baidu's open-source lexical analysis tool for Chinese, including word segmentation, part-of-speech tagging & named entity recognition\",\n    author=\"baidu-nlp\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"nlp/lexical_analysis\")\nclass LAC:\n\n    def __init__(self, user_dict=None):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.default_pretrained_model_path = os.path.join(self.directory, \"infer_model\", \"model\")\n        self.word2id_dict = load_kv_dict(os.path.join(self.directory, \"assets/word.dic\"), reverse=True, value_func=int)\n        self.id2word_dict = load_kv_dict(os.path.join(self.directory, \"assets/word.dic\"))\n        self.label2id_dict = load_kv_dict(os.path.join(self.directory, \"assets/tag.dic\"), reverse=True, value_func=int)\n        self.id2label_dict = load_kv_dict(os.path.join(self.directory, \"assets/tag.dic\"))\n        self.word_replace_dict = load_kv_dict(os.path.join(self.directory, \"assets/q2b.dic\"))\n        self.oov_id = self.word2id_dict['OOV']\n        self.word_dict_len = max(map(int, 
self.word2id_dict.values())) + 1\n        self.label_dict_len = max(map(int, self.label2id_dict.values())) + 1\n        self.tag_file = os.path.join(self.directory, \"assets/tag_file.txt\")\n\n        if user_dict:\n            self.set_user_dict(dict_path=user_dict)\n        else:\n            self.custom = None\n\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model = self.default_pretrained_model_path + '.pdmodel'\n        params = self.default_pretrained_model_path + '.pdiparams'\n        cpu_config = Config(model, params)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = create_predictor(cpu_config)\n\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = Config(model, params)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = create_predictor(gpu_config)\n\n    def set_user_dict(self, dict_path, sep=None):\n        \"\"\"\n        Set the customized dictionary if you want to use a self-defined dictionary\n\n        Args:\n             dict_path(str): The path to the customized dictionary.\n             sep: The separation token between phrases. Defaults to ' ' or '\\t'.\n        \"\"\"\n        if not os.path.exists(dict_path):\n            raise RuntimeError(\"File %s does not exist.\" % dict_path)\n        self.custom = Customization()\n        self.custom.load_customization(dict_path, sep)\n\n    def del_user_dict(self):\n        \"\"\"\n        Delete the customized dictionary if you no longer want to use the self-defined dictionary\n        \"\"\"\n\n        if self.custom:\n            self.custom = None\n            print(\"Successfully deleted the customized dictionary!\")\n\n    def to_unicode(self, texts):\n        \"\"\"\n        Convert each element's type(str) of texts(list) to unicode in python2.7\n\n        Args:\n             texts(list): each element's type is str in python2.7\n\n        Returns:\n             texts(list): each element's type is unicode in python2.7\n        \"\"\"\n        if six.PY2:\n            unicode_texts = []\n            for text in texts:\n                if isinstance(text, six.string_types):\n                    unicode_texts.append(text.decode(sys_stdin_encoding()).decode(\"utf8\"))\n                else:\n                    unicode_texts.append(text)\n            texts = unicode_texts\n        return texts\n\n    def preprocess(self, texts):\n        \"\"\"\n        Transform the texts(list) to PaddleTensor\n        Args:\n             texts(list): texts\n        Returns:\n             np.array, list, list\n        \"\"\"\n        lod = [0]\n        data = []\n        for i, text in enumerate(texts):\n            text_inds = word_to_ids(text, self.word2id_dict, self.word_replace_dict, oov_id=self.oov_id)\n            data += text_inds\n            lod.append(len(text_inds) + lod[i])\n        return np.array(data).astype('int64'), [lod], [lod[-1], 1]\n\n    def _get_index(self, data_list, item=\"\"):\n        \"\"\"\n        find all indexes of item in data_list\n        \"\"\"\n        res = []\n        for index, data in enumerate(data_list):\n            if data == item:\n                res.append(index)\n        return res\n\n    @serving\n    def cut(self, text, use_gpu=False, batch_size=1, return_tag=True):\n        \"\"\"\n        The main function that segments an entire text that contains\n        Chinese characters into separated words.\n        Args:\n            text(:obj:`str` or :obj:`List[str]`): The Chinese texts to be segmented. This can be a string or a list of strings.\n            use_gpu(bool): whether to use GPU for prediction or not\n            batch_size(int): the number of texts to process in one batch\n            return_tag: whether to return the tags or not.\n\n        Returns:\n            results(dict or list): The word segmentation result of the input text, whose key is 'word', if text is a list.\n                If text is a str, the word segmentation result (list) is obtained.\n\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES to the GPU device id.\"\n                )\n\n        if isinstance(text, list) and len(text) != 0:\n\n            predicted_data = self.to_unicode(text)\n\n            # drop empty strings like \"\" from predicted_data\n            empty_str_indexes = self._get_index(predicted_data)\n            predicted_data = [data for data in predicted_data if data != \"\"]\n\n            start_idx = 0\n            iteration = int(math.ceil(len(predicted_data) / batch_size))\n            results = []\n            for i in range(iteration):\n                if i < (iteration - 1):\n                    batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n                else:\n                    batch_data = predicted_data[start_idx:]\n\n                start_idx = start_idx + batch_size\n                data, lod, shape = self.preprocess(batch_data)\n\n                predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n                input_names = predictor.get_input_names()\n                input_handle = predictor.get_input_handle(input_names[0])\n                input_handle.copy_from_cpu(data)\n                input_handle.set_lod(lod)\n                input_handle.reshape(shape)\n\n                predictor.run()\n                output_names = predictor.get_output_names()\n                output_handle = predictor.get_output_handle(output_names[0])\n\n                batch_result = parse_result(batch_data, output_handle, self.id2label_dict, interventer=self.custom)\n                results += batch_result\n\n            for index in empty_str_indexes:\n                results.insert(index, {\"word\": [\"\"], \"tag\": [\"\"]})\n\n            if not return_tag:\n                for result in results:\n                    result.pop(\"tag\")\n                return results\n\n            return results\n        elif isinstance(text, str) and text != \"\":\n            data, lod, shape = self.preprocess([text])\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(data)\n            input_handle.set_lod(lod)\n            input_handle.reshape(shape)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            batch_result = parse_result([text], output_handle, self.id2label_dict, interventer=self.custom)\n\n            return batch_result[0]['word']\n        elif text == \"\":\n            return text\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n    def lexical_analysis(self, texts=[], data={}, use_gpu=False, batch_size=1, return_tag=True):\n        \"\"\"\n        Get the word segmentation results with the texts as input\n\n        Args:\n             texts(list): the input texts to be segmented; used if data is not provided\n             data(dict): key must be 'text', value is the texts to be segmented; used if texts is not provided\n             use_gpu(bool): whether to use GPU for prediction or not\n             batch_size(int): the number of texts to process in one batch\n             return_tag: whether to return the tags or not.\n\n        Returns:\n             results(list): the word segmentation results\n        \"\"\"\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES to the GPU device id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise TypeError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n\n        # drop empty strings like \"\" from predicted_data\n        empty_str_indexes = self._get_index(predicted_data)\n        predicted_data = [data for data in predicted_data if data != \"\"]\n\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            data, lod, shape = self.preprocess(batch_data)\n\n            predictor = self.gpu_predictor if use_gpu else self.cpu_predictor\n            input_names = predictor.get_input_names()\n            input_handle = predictor.get_input_handle(input_names[0])\n            input_handle.copy_from_cpu(data)\n            input_handle.set_lod(lod)\n            input_handle.reshape(shape)\n\n            predictor.run()\n            output_names = predictor.get_output_names()\n            output_handle = predictor.get_output_handle(output_names[0])\n\n            batch_result = parse_result(batch_data, output_handle, self.id2label_dict, interventer=self.custom)\n            results += batch_result\n\n        for index in empty_str_indexes:\n            results.insert(index, {\"word\": [\"\"], \"tag\": [\"\"]})\n\n        if not return_tag:\n            for result in results:\n                result.pop(\"tag\")\n            return results\n\n        return results\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the lac module.\",\n                                              prog='hub run lac',\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        if args.user_dict:\n            self.set_user_dict(args.user_dict)\n\n        results = self.lexical_analysis(texts=input_data,\n                                        use_gpu=args.use_gpu,\n                                        batch_size=args.batch_size,\n                                        return_tag=args.return_tag)\n\n        return results\n\n    def get_tags(self):\n        \"\"\"\n        Get the tags that were used when pretraining lac\n\n        Returns:\n             self.tag_name_dict(dict): lac tags\n        \"\"\"\n        self.tag_name_dict = {}\n        with io.open(self.tag_file, encoding=\"utf8\") as f:\n            for line in f:\n                tag, tag_name = line.strip().split(\" \")\n                self.tag_name_dict[tag] = tag_name\n        return self.tag_name_dict\n\n    def add_module_config_arg(self):\n        \"\"\"\n        
Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help=\"batch size for prediction\")\n        self.arg_config_group.add_argument('--user_dict',\n                                           type=str,\n                                           default=None,\n                                           help=\"customized dictionary for intervening the word segmentation result\")\n        self.arg_config_group.add_argument('--return_tag',\n                                           type=ast.literal_eval,\n                                           default=True,\n                                           help=\"whether return tags of results or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_file', type=str, default=None, help=\"file contain input data\")\n        self.arg_input_group.add_argument('--input_text', type=str, default=None, help=\"text to predict\")\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_file:\n            if not os.path.exists(args.input_file):\n                print(\"File %s is not exist.\" % args.input_file)\n                raise RuntimeError\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        elif args.input_text:\n            if args.input_text.strip() != '':\n                if six.PY2:\n                    input_data = [args.input_text.decode(sys_stdin_encoding()).decode(\"utf8\")]\n                else:\n                    input_data = [args.input_text]\n\n        if input_data == []:\n     
       print(\"ERROR: The input data is inconsistent with expectations.\")\n            raise DataFormatError\n\n        return input_data\n\n    def create_gradio_app(self):\n        import gradio as gr\n        return gr.Interface(self.cut,\n                            gr.Text(label='text'),\n                            gr.JSON(label='results'),\n                            title='lac',\n                            allow_flagging='never')\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/processor.py",
    "content": "import io\n\nimport numpy as np\nimport six\n\n\nclass Query(object):\n\n    def __init__(self, lac_query):\n        self.set_query(lac_query)\n\n    def set_query(self, lac_query):\n        \"\"\"\n        self.lac_query_list = [\"我/r\", \"和/c\", \"妈妈/n\", \"经常/d\", \"过去/v\", \"那儿/r\", \"散步/v\"]\n        self.seg_query_list = [\"我\", \"和\", \"妈妈\", \"经常\", \"过去\", \"那儿\", \"散步\"]\n        self.seg_query_str = \"我 和 妈妈 经常 过去 那儿 散步\"\n        self.ori_query_str = \"我和妈妈经常过去那儿散步\"\n        \"\"\"\n        length = len(lac_query['word'])\n        if six.PY2:\n            self.lac_query_list = [\n                lac_query[\"word\"][index].encode(\"utf8\") + \"/\" + lac_query[\"tag\"][index].encode(\"utf8\")\n                for index in range(length)\n            ]\n        else:\n            self.lac_query_list = [lac_query[\"word\"][index] + \"/\" + lac_query[\"tag\"][index] for index in range(length)]\n\n        self.seg_query_list = []\n        for phrase in self.lac_query_list:\n            index = phrase.rfind(\"/\")\n            word = phrase[0:index]\n            self.seg_query_list.append(word)\n        self.seg_query_str = \" \".join(self.seg_query_list)\n        self.ori_query_str = \"\".join(self.seg_query_list)\n\n\nclass Bound(object):\n\n    def __init__(self, start_index=0, end_index=0, left_bound=0, right_bound=0, left_char_bound=0, right_char_bound=0):\n        self.start_index = start_index  # 命中的词的起始位置，char级别\n        self.end_index = end_index  # 命中的词的结束位置，char级别\n        self.left_bound = left_bound  # 原分词级别的起始位置\n        self.right_bound = right_bound  # 原分词级别的结束位置\n        self.left_char_bound = left_char_bound  # 原 char 级别的起始位置\n        self.right_char_bound = right_char_bound  # 原 char 级别的结束位置\n\n\nclass Interventer(object):\n\n    def __init__(self, ngram_dict_path, user_dict_path):\n        self.ngram_dict_path = ngram_dict_path\n        self.user_dict_path = user_dict_path\n        self.init_pos_types()\n        
self.load_dict()\n\n    def init_pos_types(self):\n        all_pos_types = \"n f s t nr ns nt nw nz v vd vn\" \\\n                + \" a ad an d m q r p c u xc w PER LOC ORG TIME\"\n        self.all_pos_types = set([pos_type.lower() for pos_type in all_pos_types.split(\" \")])\n\n    def load_dict(self):\n        \"\"\"load unigram dict and user dict\"\"\"\n        import ahocorasick\n        self.total_count = 0.0\n        self.ngram_dict = {}\n        print(\"Loading dict...\")\n        for line in io.open(self.ngram_dict_path, mode=\"r\", encoding=\"utf-8\"):\n            if six.PY2:\n                word, pos, wordfreq = line.encode(\"utf-8\").strip('\\n').split('\\t')\n            else:\n                word, pos, wordfreq = line.strip('\\n').split('\\t')\n            wordfreq = int(wordfreq)\n            if pos.lower() not in self.all_pos_types:\n                continue\n            assert wordfreq > 0, \"Word frequency must be a positive integer!\"\n            self.total_count += wordfreq\n            self.ngram_dict[word + \"/\" + pos] = wordfreq\n        for key in self.ngram_dict:\n            wordfreq = self.ngram_dict[key]\n            self.ngram_dict[key] = np.log(wordfreq / self.total_count)\n        self.oov_score = np.log(1 / self.total_count)\n\n        self.user_dict = ahocorasick.Automaton()\n        for line in io.open(self.user_dict_path, mode=\"r\", encoding=\"utf-8\"):\n            if six.PY2:\n                word, pos, wordfreq = line.encode(\"utf-8\").strip('\\n').split('\\t')\n            else:\n                word, pos, wordfreq = line.strip('\\n').split('\\t')\n            wordfreq = int(wordfreq)\n            assert pos in self.all_pos_types, \"Invalid POS type\"\n            assert wordfreq > 0, \"Word frequency must be a positive integer!\"\n            self.ngram_dict[word + \"/\" + pos] = np.log(wordfreq / self.total_count)\n            self.user_dict.add_word(word, (word, pos, wordfreq))\n        self.user_dict.make_automaton()\n\n    def find_min_bound(self, match_info, query):\n        \"\"\"\n        find minimum Bound for match_word\n        \"\"\"\n        end_index, (match_word, pos, wordfreq) = match_info\n        start_index = end_index - len(match_word) + 1\n\n        bound = Bound(start_index=start_index, end_index=end_index)\n\n        # find left bound\n        query_len = 0\n        for word_index, word in enumerate(query.seg_query_list):\n            query_len += len(word)\n            if query_len > start_index:\n                bound.left_bound = word_index\n                bound.left_char_bound = query_len - len(word)\n                break\n        # find right bound\n        query_len = 0\n        for word_index, word in enumerate(query.seg_query_list):\n            query_len += len(word)\n            if query_len > end_index:\n                bound.right_bound = word_index\n                bound.right_char_bound = query_len - 1\n                break\n        return bound\n\n    def calc_lm_score(self, phrase_list):\n        \"\"\"calculate the language model score\"\"\"\n        lm_score = 0.0\n        if len(phrase_list) == 0:\n            return 0.0\n        for phrase in phrase_list:\n            lm_score += self.ngram_dict.get(phrase, self.oov_score)\n        return lm_score / len(phrase_list)\n\n    def get_new_phrase_list(self, match_info, bound, query):\n        \"\"\"\n        比较用户词典给出的词和原分词结果，根据打分决定是否替换\n        \"\"\"\n        new_phrase_list = []\n        phrase_left = query.ori_query_str[bound.left_char_bound:bound.start_index]\n        phrase_right = query.ori_query_str[bound.end_index + 1:bound.right_char_bound + 1]\n        if phrase_left != \"\":\n            phrase_left += \"/\" + query.lac_query_list[bound.left_bound].split('/')[1]\n            new_phrase_list.append(phrase_left)\n        new_phrase_list.append(match_info[1][0] + \"/\" + match_info[1][1])\n        if phrase_right != \"\":\n            phrase_right += \"/\" + 
query.lac_query_list[bound.right_bound].split('/')[1]\n            new_phrase_list.append(phrase_right)\n\n        new_query_list = query.lac_query_list[0: bound.left_bound] + new_phrase_list + \\\n                query.lac_query_list[bound.right_bound + 1: ]\n        new_lm_score = self.calc_lm_score(new_query_list)\n        return new_lm_score, new_phrase_list\n\n    def run(self, query):\n        \"\"\"\n        step 1, 用AC自动机检测出匹配到的用户词\n        step 2, 每个用户词查找最小分词边界，计算每种分词结果的打分，PK\n        step 3, 怎么处理冲突？\n          3.a. 假设 AC自动机检测到的关键词都是顺序的，那么只需要考虑前后两个的替换词即可\n          3.b. 假如前后两个替换词没有位置冲突，那么直接把前一个加到替换列表里\n          3.c. 假如前后两个替换词有冲突，比较分数，舍弃一个，更新上一个替换的位置\n        step 4, 最终依次执行替换\n        \"\"\"\n        last_bound = None\n        last_phrase_list = None\n        last_lm_score = None\n        all_result = []\n        old_lm_score = self.calc_lm_score(query.lac_query_list)\n\n        for match_info in self.user_dict.iter(query.ori_query_str):\n            #print \"matched: \\\"%s\\\" in query: \\\"%s\\\"\" % (match_info[1][0], query.seg_query_str)\n            bound = self.find_min_bound(match_info, query)\n            new_lm_score, new_phrase_list = self.get_new_phrase_list(match_info, bound, query)\n\n            # 如果打分比原 LAC 结果低，抛弃用户词典里的结果\n            if new_lm_score <= old_lm_score:\n                #print >> sys.stderr, \"skipped %s, old_lm_score: %.5f, \" \\\n                #        \"new_lm_score: %.5f\" % (\" \".join(new_phrase_list), old_lm_score, new_lm_score)\n                continue\n            # 遇到的第一个匹配到的结果\n            if last_bound is None:\n                last_bound = bound\n                last_phrase_list = new_phrase_list\n                last_lm_score = new_lm_score\n                continue\n            if bound.left_bound > last_bound.right_bound:\n                # 位置上没有冲突，则把上次的结果加到最终结果中去\n                all_result.append((last_bound, last_phrase_list))\n                last_bound = bound\n                last_phrase_list = 
new_phrase_list\n                last_lm_score = new_lm_score\n            else:\n                # 位置上有冲突\n                if new_lm_score > last_lm_score:\n                    # 若分数高于上次结果，则覆盖；否则丢弃\n                    last_bound = bound\n                    last_phrase_list = new_phrase_list\n                    last_lm_score = new_lm_score\n\n        if last_bound is not None:\n            all_result.append((last_bound, last_phrase_list))\n\n        # 合并所有替换的结果\n        final_phrase_list = []\n        last_index = -1\n        for bound, phrase_list in all_result:\n            final_phrase_list += query.lac_query_list[last_index + 1:bound.left_bound] + phrase_list\n            last_index = bound.right_bound\n        final_phrase_list += query.lac_query_list[last_index + 1:]\n\n        final_result = {'word': [], 'tag': []}\n        for phrase in final_phrase_list:\n            index = phrase.rfind(\"/\")\n            word = phrase[0:index]\n            tag = phrase[index + 1:]\n            final_result['word'].append(word)\n            final_result['tag'].append(tag)\n\n        return final_result\n\n\ndef load_kv_dict(dict_path, reverse=False, delimiter=\"\\t\", key_func=None, value_func=None):\n    \"\"\"\n    Load key-value dict from file\n    \"\"\"\n    result_dict = {}\n    for line in io.open(dict_path, \"r\", encoding='utf8'):\n        terms = line.strip(\"\\n\").split(delimiter)\n        if len(terms) != 2:\n            continue\n        if reverse:\n            value, key = terms\n        else:\n            key, value = terms\n        if key in result_dict:\n            raise KeyError(\"key duplicated with [%s]\" % (key))\n        if key_func:\n            key = key_func(key)\n        if value_func:\n            value = value_func(value)\n        result_dict[key] = value\n    return result_dict\n\n\ndef word_to_ids(words, word2id_dict, word_replace_dict, oov_id=None):\n    \"\"\"convert word to word index\"\"\"\n    word_ids = []\n    for word in 
words:\n        word = word_replace_dict.get(word, word)\n        word_id = word2id_dict.get(word, oov_id)\n        word_ids.append(word_id)\n\n    return word_ids\n\n\ndef parse_result(lines, crf_decode, id2label_dict, interventer=None):\n    \"\"\"Convert model's output tensor into string and tags \"\"\"\n    offset_list = crf_decode.lod()[0]\n    crf_decode = crf_decode.copy_to_cpu()\n    batch_size = len(offset_list) - 1\n    batch_out = []\n    for sent_index in range(batch_size):\n        begin, end = offset_list[sent_index], offset_list[sent_index + 1]\n        sent = lines[sent_index]\n        tags = [id2label_dict[str(tag_id[0])] for tag_id in crf_decode[begin:end]]\n\n        if interventer:\n            interventer.parse_customization(sent, tags)\n\n        sent_out = []\n        tags_out = []\n        for ind, tag in enumerate(tags):\n            # for the first char\n            if len(sent_out) == 0 or tag.endswith(\"B\") or tag.endswith(\"S\"):\n                sent_out.append(sent[ind])\n                tags_out.append(tag[:-2])\n                continue\n            sent_out[-1] += sent[ind]\n            tags_out[-1] = tag[:-2]\n\n        seg_result = {\"word\": sent_out, \"tag\": tags_out}\n        batch_out.append(seg_result)\n\n    return batch_out\n\n\n#         sent_out = []\n#         tags_out = []\n#         parital_word = \"\"\n#         for ind, tag in enumerate(tags):\n#             # for the first word\n#             if parital_word == \"\":\n#                 parital_word = sent[ind]\n#                 tags_out.append(tag.split('-')[0])\n#                 continue\n#             # for the beginning of word\n#             if tag.endswith(\"-B\") or (tag == \"O\" and tags[ind - 1] != \"O\"):\n#                 sent_out.append(parital_word)\n#                 tags_out.append(tag.split('-')[0])\n#                 parital_word = sent[ind]\n#                 continue\n#             parital_word += sent[ind]\n#         # append the last word, 
except for len(tags)=0\n#         if len(sent_out) < len(tags_out):\n#             sent_out.append(parital_word)\n#         seg_result = {\"word\": sent_out, \"tag\": tags_out}\n\n#         batch_out.append(seg_result)\n#     return batch_out\n"
  },
  {
    "path": "modules/text/lexical_analysis/lac/test.py",
    "content": "import os\nimport shutil\nimport unittest\n\nimport paddlehub as hub\n\nos.environ['CUDA_VISIBLE_DEVICES'] = '0'\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        cls.text = \"今天是个好日子\"\n        cls.texts = [\"今天是个好日子\", \"天气预报说今天要下雨\", \"下一班地铁马上就要到了\"]\n        cls.module = hub.Module(name=\"lac\")\n\n    @classmethod\n    def tearDownClass(cls) -> None:\n        shutil.rmtree('inference')\n\n    def test_cut1(self):\n        results = self.module.cut(text=self.text, use_gpu=False, batch_size=1, return_tag=False)\n        self.assertEqual(results, ['今天', '是', '个', '好日子'])\n\n    def test_cut2(self):\n        results = self.module.cut(text=self.texts, use_gpu=False, batch_size=1, return_tag=False)\n        self.assertEqual(results, [{\n            'word': ['今天', '是', '个', '好日子']\n        }, {\n            'word': ['天气预报', '说', '今天', '要', '下雨']\n        }, {\n            'word': ['下', '一班', '地铁', '马上', '就要', '到', '了']\n        }])\n\n    def test_cut3(self):\n        results = self.module.cut(text=self.texts, use_gpu=False, batch_size=2, return_tag=False)\n        self.assertEqual(results, [{\n            'word': ['今天', '是', '个', '好日子']\n        }, {\n            'word': ['天气预报', '说', '今天', '要', '下雨']\n        }, {\n            'word': ['下', '一班', '地铁', '马上', '就要', '到', '了']\n        }])\n\n    def test_cut4(self):\n        results = self.module.cut(text=self.texts, use_gpu=True, batch_size=2, return_tag=False)\n        self.assertEqual(results, [{\n            'word': ['今天', '是', '个', '好日子']\n        }, {\n            'word': ['天气预报', '说', '今天', '要', '下雨']\n        }, {\n            'word': ['下', '一班', '地铁', '马上', '就要', '到', '了']\n        }])\n\n    def test_cut5(self):\n        results = self.module.cut(text=self.texts, use_gpu=True, batch_size=2, return_tag=True)\n        self.assertEqual(results, [{\n            'word': ['今天', '是', '个', '好日子'],\n            'tag': ['TIME', 'v', 'q', 'n']\n 
       }, {\n            'word': ['天气预报', '说', '今天', '要', '下雨'],\n            'tag': ['n', 'v', 'TIME', 'v', 'v']\n        }, {\n            'word': ['下', '一班', '地铁', '马上', '就要', '到', '了'],\n            'tag': ['f', 'm', 'n', 'd', 'v', 'v', 'xc']\n        }])\n\n    def test_save_inference_model(self):\n        self.module.save_inference_model('./inference/model')\n\n        self.assertTrue(os.path.exists('./inference/model.pdmodel'))\n        self.assertTrue(os.path.exists('./inference/model.pdiparams'))\n\n\nif __name__ == '__main__':\n    unittest.main()\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/en-de/README.md",
    "content": "# transformer_en-de\n|模型名称|transformer_en-de|\n| :--- | :---: | \n|类别|文本-机器翻译|\n|网络|Transformer|\n|数据集|WMT14 EN-DE|\n|是否支持Fine-tuning|否|\n|模型大小|481MB|\n|最新更新日期|2021-07-21|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 2017 年，Google机器翻译团队在其发表的论文[Attention Is All You Need](https://arxiv.org/abs/1706.03762)中，提出了用于完成机器翻译（Machine Translation）等序列到序列（Seq2Seq）学习任务的一种全新网络结构——Transformer。Transformer网络完全使用注意力（Attention）机制来实现序列到序列的建模，并且取得了很好的效果。\n\n  - transformer_en-de包含6层的transformer结构，头数为8，隐藏层参数为512，参数量为64M。该模型在[WMT'14 EN-DE数据集](http://www.statmt.org/wmt14/translation-task.html)进行了预训练，加载后可直接用于预测，提供了英文翻译为德文的能力。\n\n  - 关于机器翻译的Transformer模型训练方式和详情，可查看[Machine Translation using Transformer](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/machine_translation/transformer)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n  \n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_en-de\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub \n\n    model = hub.Module(name='transformer_en-de', beam_size=5)\n    src_texts = [\n        'What are you doing now?',\n        'The change was for the better; I eat well, I exercise, I take my drugs.',\n        'Such experiments are not conducted for ethical reasons.',\n    ]\n\n    n_best = 3  # 每个输入样本的输出候选句子数量\n    trg_texts = model.predict(src_texts, n_best=n_best)\n    for idx, st in enumerate(src_texts):\n        print('-'*30)\n        print(f'src: {st}')\n        for i in range(n_best):\n            print(f'trg[{i+1}]: {trg_texts[idx*n_best+i]}')\n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(max_length: int = 256,\n     
        max_out_len: int = 256,\n             beam_size: int = 5):\n    ```\n\n    - 初始化module，可配置模型的输入输出文本的最大长度和解码时beam search的宽度。\n\n    - **参数**\n\n      - `max_length`(int): 输入文本的最大长度，默认值为256。\n      - `max_out_len`(int): 输出文本的最大解码长度，默认值为256。\n      - `beam_size`(int): beam search方式解码的beam宽度，默认为5。\n\n  - ```python\n    def predict(data: List[str],\n            batch_size: int = 1,\n            n_best: int = 1,\n            use_gpu: bool = False):\n    ```\n\n    - 预测API，输入源语言的文本句子，解码后输出翻译后的目标语言的文本候选句子。\n\n    - **参数**\n\n      - `data`(List[str]): 源语言的文本列表，数据类型为List[str]\n      - `batch_size`(int): 进行预测的batch_size，默认为1\n      - `n_best`(int): 每个输入文本经过模型解码后，输出的得分最高的候选句子的数量，不能大于beam_size，默认为1\n      - `use_gpu`(bool): 是否使用gpu执行预测，默认为False\n    \n    - **返回**\n\n      - `results`(List[str]): 翻译后的目标语言的候选句子，长度为`len(data)*n_best`\n\n## 四、服务部署\n\n- 通过启动PaddleHub Serving，可以加载模型部署在线翻译服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_en-de\n    ```\n\n  - 通过以上命令可完成一个英德机器翻译API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    texts = [\n        'What are you doing now?',\n        'The change was for the better; I eat well, I exercise, I take my drugs.',\n        'Such experiments are not conducted for ethical reasons.',\n    ]\n    data = {\"data\": texts}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/transformer_en-de\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复模型初始化的兼容性问题\n  - ```shell\n    $ hub install transformer_en-de==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/en-de/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/machine_translation/transformer/en-de/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nimport paddlenlp\nfrom paddlenlp.data import Pad, Vocab\nfrom paddlenlp.transformers import InferTransformerModel, position_encoding_init\n\nfrom transformer_en_de.utils import MTTokenizer, post_process_seq\n\n\n@moduleinfo(\n    name=\"transformer_en-de\",\n    version=\"1.0.1\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/machine_translation\",\n)\nclass MTTransformer(nn.Layer):\n    \"\"\"\n    Transformer model for machine translation.\n    \"\"\"\n    # Language config\n    lang_config = {'source': 'en', 'target': 'de'}\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n        # Size of the hidden layer in position-wise feed-forward networks.\n        \"d_inner_hid\": 2048,\n        # The flag indicating whether to share embedding and softmax weights.\n        # Vocabularies in source and target should be same 
for weight sharing.\n        \"weight_sharing\": True,\n        # Dropout rate\n        'dropout': 0,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"num_encoder_layers\": 6, \n        \"num_decoder_layers\": 6\n    }\n\n    # Vocab config\n    vocab_config = {\n        # Used to pad vocab size to be multiple of pad_factor.\n        \"pad_factor\": 8,\n        # Index for <bos> token\n        \"bos_id\": 0,\n        \"bos_token\": \"<s>\",\n        # Index for <eos> token\n        \"eos_id\": 1,\n        \"eos_token\": \"<e>\",\n        # Index for <unk> token\n        \"unk_id\": 2,\n        \"unk_token\": \"<unk>\",\n    }\n\n    def __init__(self, max_length: int = 256, max_out_len: int = 256, beam_size: int = 5):\n        super(MTTransformer, self).__init__()\n        bpe_codes_file = os.path.join(MODULE_HOME, 'transformer_en_de', 'assets', 'bpe.33708')\n        vocab_file = os.path.join(MODULE_HOME, 'transformer_en_de', 'assets', 'vocab_all.bpe.33708')\n        checkpoint = os.path.join(MODULE_HOME, 'transformer_en_de', 'assets', 'transformer.pdparams')\n\n        self.max_length = max_length\n        self.beam_size = beam_size\n        self.tokenizer = MTTokenizer(\n            bpe_codes_file=bpe_codes_file, lang_src=self.lang_config['source'], lang_trg=self.lang_config['target'])\n        self.vocab = Vocab.load_vocabulary(\n            filepath=vocab_file,\n            unk_token=self.vocab_config['unk_token'],\n            bos_token=self.vocab_config['bos_token'],\n            eos_token=self.vocab_config['eos_token'])\n        self.vocab_size = (len(self.vocab) + self.vocab_config['pad_factor'] - 1) \\\n            // self.vocab_config['pad_factor'] * self.vocab_config['pad_factor']\n        self.transformer = InferTransformerModel(\n            src_vocab_size=self.vocab_size,\n            trg_vocab_size=self.vocab_size,\n            bos_id=self.vocab_config['bos_id'],\n            eos_id=self.vocab_config['eos_id'],\n   
         max_length=self.max_length + 1,\n            max_out_len=max_out_len,\n            beam_size=self.beam_size,\n            **self.model_config)\n\n        state_dict = paddle.load(checkpoint)\n\n        # To avoid positions longer than those seen in training, reset the size\n        # of the position encoding table to max_length\n        state_dict[\"encoder.pos_encoder.weight\"] = position_encoding_init(self.max_length + 1,\n                                                                          self.model_config['d_model'])\n        state_dict[\"decoder.pos_encoder.weight\"] = position_encoding_init(self.max_length + 1,\n                                                                          self.model_config['d_model'])\n\n        self.transformer.set_state_dict(state_dict)\n\n    def forward(self, src_words: paddle.Tensor):\n        return self.transformer(src_words)\n\n    def _convert_text_to_input(self, text: str):\n        \"\"\"\n        Convert input string to ids.\n        \"\"\"\n        bpe_tokens = self.tokenizer.tokenize(text)\n        if len(bpe_tokens) > self.max_length:\n            bpe_tokens = bpe_tokens[:self.max_length]\n        return self.vocab.to_indices(bpe_tokens)\n\n    def _batchify(self, data: List[str], batch_size: int):\n        \"\"\"\n        Generate input batches.\n        \"\"\"\n        pad_func = Pad(self.vocab_config['eos_id'])\n\n        def _parse_batch(batch_ids):\n            return pad_func([ids + [self.vocab_config['eos_id']] for ids in batch_ids])\n\n        examples = []\n        for text in data:\n            examples.append(self._convert_text_to_input(text))\n\n        # Separates data into batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    @serving\n    def predict(self, data: List[str], batch_size: int = 1, n_best: int = 1, use_gpu: bool = False):\n\n        if n_best > self.beam_size:\n            raise ValueError(f'Predict arg \"n_best\" must be less than or equal to self.beam_size, \\\n                but got {n_best} > {self.beam_size}')\n\n        paddle.set_device('gpu' if use_gpu else 'cpu')\n\n        batches = self._batchify(data, batch_size)\n\n        results = []\n        self.eval()\n        for batch in batches:\n            src_batch_ids = paddle.to_tensor(batch)\n            trg_batch_beams = self(src_batch_ids).numpy().transpose([0, 2, 1])\n\n            for trg_sample_beams in trg_batch_beams:\n                for beam_idx, beam in enumerate(trg_sample_beams):\n                    if beam_idx >= n_best:\n                        break\n                    trg_sample_ids = post_process_seq(beam, self.vocab_config['bos_id'], self.vocab_config['eos_id'])\n                    trg_sample_words = self.vocab.to_tokens(trg_sample_ids)\n                    trg_sample_text = self.tokenizer.detokenize(trg_sample_words)\n                    results.append(trg_sample_text)\n\n        return results\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/en-de/requirements.txt",
    "content": "paddlenlp>=2.1.0\nsacremoses\nsubword-nmt\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/en-de/utils.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport re\nfrom typing import List\n\nimport codecs\nfrom sacremoses import MosesTokenizer, MosesDetokenizer\nfrom subword_nmt.apply_bpe import BPE\n\n\nclass MTTokenizer(object):\n    def __init__(self, bpe_codes_file: str, lang_src: str = 'en', lang_trg: str = 'de', separator='@@'):\n        self.moses_tokenizer = MosesTokenizer(lang=lang_src)\n        self.moses_detokenizer = MosesDetokenizer(lang=lang_trg)\n        self.bpe_tokenizer = BPE(\n            codes=codecs.open(bpe_codes_file, encoding='utf-8'),\n            merges=-1,\n            separator=separator,\n            vocab=None,\n            glossaries=None)\n\n    def tokenize(self, text: str):\n        \"\"\"\n        Convert source string into bpe tokens.\n        \"\"\"\n        moses_tokens = self.moses_tokenizer.tokenize(text)\n        tokenized_text = ' '.join(moses_tokens)\n        tokenized_bpe_text = self.bpe_tokenizer.process_line(tokenized_text)  # Apply bpe to text\n        bpe_tokens = tokenized_bpe_text.split(' ')\n        return bpe_tokens\n\n    def detokenize(self, tokens: List[str]):\n        \"\"\"\n        Convert target bpe tokens into string.\n        \"\"\"\n        separator = self.bpe_tokenizer.separator\n        text_with_separators = ' '.join(tokens)\n        clean_text = re.sub(f'({separator} )|({separator} ?$)', '', text_with_separators)\n   
     clean_tokens = clean_text.split(' ')\n        detokenized_text = self.moses_detokenizer.tokenize(clean_tokens, return_str=True)\n        return detokenized_text\n\n\ndef post_process_seq(seq, bos_idx, eos_idx, output_bos=False, output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [int(idx) for idx in seq[:eos_pos + 1] if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)]\n    return seq\n"
  },
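The `post_process_seq` helper above trims a decoded id sequence at the first `eos` and, by default, drops the `bos`/`eos` markers themselves. A standalone sketch of the same logic (the example ids are made up):

```python
def post_process_seq(seq, bos_idx, eos_idx, output_bos=False, output_eos=False):
    """Trim a decoded id sequence at the first eos and drop bos/eos tokens."""
    eos_pos = len(seq) - 1
    for i, idx in enumerate(seq):
        if idx == eos_idx:
            eos_pos = i
            break
    return [int(idx) for idx in seq[:eos_pos + 1]
            if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)]

# With bos=0 and eos=1, everything after the first eos is discarded,
# and the special tokens themselves are removed by default.
print(post_process_seq([0, 5, 6, 7, 1, 9, 9], bos_idx=0, eos_idx=1))  # [5, 6, 7]
```

Note that a sequence with no `eos` at all is kept whole (minus `bos`), since `eos_pos` then defaults to the last index.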
  {
    "path": "modules/text/machine_translation/transformer/zh-en/README.md",
    "content": "# transformer_zh-en\n|模型名称|transformer_zh-en|\n| :--- | :---: |\n|类别|文本-机器翻译|\n|网络|Transformer|\n|数据集|CWMT2021|\n|是否支持Fine-tuning|否|\n|模型大小|614MB|\n|最新更新日期|2021-07-21|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 2017 年，Google机器翻译团队在其发表的论文[Attention Is All You Need](https://arxiv.org/abs/1706.03762)中，提出了用于完成机器翻译（Machine Translation）等序列到序列（Seq2Seq）学习任务的一种全新网络结构——Transformer。Tranformer网络完全使用注意力（Attention）机制来实现序列到序列的建模，并且取得了很好的效果。\n\n  - transformer_zh-en包含6层的transformer结构，头数为8，隐藏层参数为512，参数量为64M。该模型在[CWMT2021的数据集](http://nlp.nju.edu.cn/cwmt-wmt)进行了预训练，加载后可直接用于预测, 提供了中文翻译为英文的能力。\n\n  - 关于机器翻译的Transformer模型训练方式和详情，可查看[Machine Translation using Transformer](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/machine_translation/transformer)。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_zh-en\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='transformer_zh-en', beam_size=5)\n    src_texts = [\n        '今天天气怎么样？',\n        '我们一起去吃饭吧。',\n    ]\n\n    n_best = 3  # 每个输入样本的输出候选句子数量\n    trg_texts = model.predict(src_texts, n_best=n_best)\n    for idx, st in enumerate(src_texts):\n        print('-'*30)\n        print(f'src: {st}')\n        for i in range(n_best):\n            print(f'trg[{i+1}]: {trg_texts[idx*n_best+i]}')  \n    ```\n\n- ### 2、API\n\n  - ```python\n    def __init__(max_length: int = 256,\n                max_out_len: int = 256,\n                beam_size: int = 5):  \n    ```\n\n    - 初始化module，可配置模型的输入输出文本的最大长度和解码时beam search的宽度。\n\n    - **参数**\n\n      - 
`max_length`(int): 输入文本的最大长度，默认值为256。\n      - `max_out_len`(int): 输出文本的最大解码长度，默认值为256。\n      - `beam_size`(int): beam search方式解码的beam宽度，默认为5。\n\n  - ```python\n    def predict(data: List[str],\n                batch_size: int = 1,\n                n_best: int = 1,\n                use_gpu: bool = False):\n    ```\n\n    - 预测API，输入源语言的文本句子，解码后输出翻译后的目标语言的文本候选句子。\n\n    - **参数**\n      - `data`(List[str]): 源语言的文本列表，数据类型为List[str]\n      - `batch_size`(int): 进行预测的batch_size，默认为1\n      - `n_best`(int): 每个输入文本经过模型解码后，输出的得分最高的候选句子的数量，必须小于beam_size，默认为1\n      - `use_gpu`(bool): 是否使用gpu执行预测，默认为False\n\n    - **返回**\n      - `results`(List[str]): 翻译后的目标语言的候选句子，长度为`len(data)*n_best`\n\n## 四、服务部署\n\n  - 通过启动PaddleHub Serving，可以加载模型部署在线翻译服务。\n\n  - ### 第一步: 启动PaddleHub Serving\n\n    - 运行启动命令：\n\n    - ```shell\n      $ hub serving start -m transformer_zh-en\n      ```\n\n    - 通过以上命令可完成一个中英机器翻译API的部署，默认端口号为8866。\n\n    - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n  - ### 第二步: 发送预测请求\n\n    - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    - ```python\n      import requests\n      import json\n\n      texts = [\n          '今天天气怎么样啊？',\n          '我们一起去吃饭吧。',\n      ]\n      data = {\"data\": texts}\n      # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n      url = \"http://127.0.0.1:8866/predict/transformer_zh-en\"\n      # 指定post请求的headers为application/json方式\n      headers = {\"Content-Type\": \"application/json\"}\n\n      r = requests.post(url=url, headers=headers, data=json.dumps(data))\n      print(r.json())\n      ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n- ### Gradio APP 支持\n  从 PaddleHub 2.3.1 开始支持使用链接 http://127.0.0.1:8866/gradio/transformer_zh-en 在浏览器中访问 transformer_zh-en 的 Gradio APP。\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复模型初始化的兼容性问题\n\n* 1.1.0\n\n  添加 Gradio APP 支持\n\n  - ```shell\n    $ hub install transformer_zh-en==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/zh-en/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/machine_translation/transformer/zh-en/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nfrom typing import List\n\nimport paddle\nimport paddle.nn as nn\nfrom paddlenlp.data import Pad\nfrom paddlenlp.data import Vocab\nfrom paddlenlp.transformers import InferTransformerModel\nfrom paddlenlp.transformers import position_encoding_init\nfrom transformer_zh_en.utils import MTTokenizer\nfrom transformer_zh_en.utils import post_process_seq\n\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(\n    name=\"transformer_zh-en\",\n    version=\"1.1.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/machine_translation\",\n)\nclass MTTransformer(nn.Layer):\n    \"\"\"\n    Transformer model for machine translation.\n    \"\"\"\n    # Language config\n    lang_config = {'source': 'zh', 'target': 'en'}\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n        # Size of the hidden layer in position-wise feed-forward networks.\n        \"d_inner_hid\": 2048,\n        # The flag 
indicating whether to share embedding and softmax weights.\n        # Vocabularies in source and target should be same for weight sharing.\n        \"weight_sharing\": False,\n        # Dropout rate\n        'dropout': 0,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"num_encoder_layers\": 6,\n        \"num_decoder_layers\": 6\n    }\n\n    # Vocab config\n    vocab_config = {\n        # Used to pad vocab size to be multiple of pad_factor.\n        \"pad_factor\": 8,\n        # Index for <bos> token\n        \"bos_id\": 0,\n        \"bos_token\": \"<s>\",\n        # Index for <eos> token\n        \"eos_id\": 1,\n        \"eos_token\": \"<e>\",\n        # Index for <unk> token\n        \"unk_id\": 2,\n        \"unk_token\": \"<unk>\",\n    }\n\n    def __init__(self, max_length: int = 256, max_out_len: int = 256, beam_size: int = 5):\n        super(MTTransformer, self).__init__()\n        bpe_codes_file = os.path.join(MODULE_HOME, 'transformer_zh_en', 'assets', '2M.zh2en.dict4bpe.zh')\n        src_vocab_file = os.path.join(MODULE_HOME, 'transformer_zh_en', 'assets', 'vocab.zh')\n        trg_vocab_file = os.path.join(MODULE_HOME, 'transformer_zh_en', 'assets', 'vocab.en')\n        checkpoint = os.path.join(MODULE_HOME, 'transformer_zh_en', 'assets', 'transformer.pdparams')\n\n        self.max_length = max_length\n        self.beam_size = beam_size\n        self.tokenizer = MTTokenizer(bpe_codes_file=bpe_codes_file,\n                                     lang_src=self.lang_config['source'],\n                                     lang_trg=self.lang_config['target'])\n        self.src_vocab = Vocab.load_vocabulary(filepath=src_vocab_file,\n                                               unk_token=self.vocab_config['unk_token'],\n                                               bos_token=self.vocab_config['bos_token'],\n                                               eos_token=self.vocab_config['eos_token'])\n        self.trg_vocab = 
Vocab.load_vocabulary(filepath=trg_vocab_file,\n                                               unk_token=self.vocab_config['unk_token'],\n                                               bos_token=self.vocab_config['bos_token'],\n                                               eos_token=self.vocab_config['eos_token'])\n        self.src_vocab_size = (len(self.src_vocab) + self.vocab_config['pad_factor'] - 1) \\\n            // self.vocab_config['pad_factor'] * self.vocab_config['pad_factor']\n        self.trg_vocab_size = (len(self.trg_vocab) + self.vocab_config['pad_factor'] - 1) \\\n            // self.vocab_config['pad_factor'] * self.vocab_config['pad_factor']\n        self.transformer = InferTransformerModel(src_vocab_size=self.src_vocab_size,\n                                                 trg_vocab_size=self.trg_vocab_size,\n                                                 bos_id=self.vocab_config['bos_id'],\n                                                 eos_id=self.vocab_config['eos_id'],\n                                                 max_length=self.max_length + 1,\n                                                 max_out_len=max_out_len,\n                                                 beam_size=self.beam_size,\n                                                 **self.model_config)\n\n        state_dict = paddle.load(checkpoint)\n\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        state_dict[\"encoder.pos_encoder.weight\"] = position_encoding_init(self.max_length + 1,\n                                                                          self.model_config['d_model'])\n        state_dict[\"decoder.pos_encoder.weight\"] = position_encoding_init(self.max_length + 1,\n                                                                          self.model_config['d_model'])\n\n        self.transformer.set_state_dict(state_dict)\n\n    def forward(self, src_words: paddle.Tensor):\n       
 return self.transformer(src_words)\n\n    def _convert_text_to_input(self, text: str):\n        \"\"\"\n        Convert input string to ids.\n        \"\"\"\n        bpe_tokens = self.tokenizer.tokenize(text)\n        if len(bpe_tokens) > self.max_length:\n            bpe_tokens = bpe_tokens[:self.max_length]\n        return self.src_vocab.to_indices(bpe_tokens)\n\n    def _batchify(self, data: List[str], batch_size: int):\n        \"\"\"\n        Generate input batches.\n        \"\"\"\n        pad_func = Pad(self.vocab_config['eos_id'])\n\n        def _parse_batch(batch_ids):\n            return pad_func([ids + [self.vocab_config['eos_id']] for ids in batch_ids])\n\n        examples = []\n        for text in data:\n            examples.append(self._convert_text_to_input(text))\n\n        # Separates data into some batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    @serving\n    def predict(self, data: List[str], batch_size: int = 1, n_best: int = 1, use_gpu: bool = False):\n\n        if n_best > self.beam_size:\n            raise ValueError(f'Predict arg \"n_best\" must be smaller or equal to self.beam_size, \\\n                but got {n_best} > {self.beam_size}')\n\n        paddle.set_device('gpu' if use_gpu else 'cpu')\n\n        batches = self._batchify(data, batch_size)\n\n        results = []\n        self.eval()\n        for batch in batches:\n            src_batch_ids = paddle.to_tensor(batch)\n            trg_batch_beams = self(src_batch_ids).numpy().transpose([0, 2, 1])\n\n            for trg_sample_beams in trg_batch_beams:\n                for beam_idx, beam in enumerate(trg_sample_beams):\n                    if beam_idx >= n_best:\n                        break\n                    trg_sample_ids = post_process_seq(beam, self.vocab_config['bos_id'], self.vocab_config['eos_id'])\n                    trg_sample_words = self.trg_vocab.to_tokens(trg_sample_ids)\n                    trg_sample_text = self.tokenizer.detokenize(trg_sample_words)\n                    results.append(trg_sample_text)\n\n        return results\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(text):\n            results = self.predict(data=[text])\n            return results[0]\n\n        examples = [['今天是个好日子']]\n\n        interface = gr.Interface(inference,\n                                 \"text\", [gr.outputs.Textbox(label=\"Translation\")],\n                                 title=\"transformer_zh-en\",\n                                 examples=examples,\n                                 allow_flagging='never')\n\n        return interface\n"
  },
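module.py above rounds both vocabulary sizes up to a multiple of `pad_factor` (8) before constructing `InferTransformerModel`, so that padded embedding tables align well with hardware. The rounding is a plain ceiling-to-multiple; a minimal sketch (the example vocab sizes are made up):

```python
def pad_vocab_size(vocab_len: int, pad_factor: int = 8) -> int:
    """Round vocab_len up to the nearest multiple of pad_factor,
    mirroring the integer arithmetic used in module.py."""
    return (vocab_len + pad_factor - 1) // pad_factor * pad_factor

# A vocabulary of 30001 entries is padded up to the next multiple of 8.
print(pad_vocab_size(30001))  # 30008
```

A size already divisible by `pad_factor` is left unchanged, since adding `pad_factor - 1` before the floor division never crosses into the next multiple.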
  {
    "path": "modules/text/machine_translation/transformer/zh-en/requirements.txt",
    "content": "paddlenlp>=2.1.0\njieba\nsacremoses\nsubword-nmt\n"
  },
  {
    "path": "modules/text/machine_translation/transformer/zh-en/utils.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport codecs\nimport logging\nimport re\nfrom typing import List\n\nimport jieba\n\njieba.setLogLevel(logging.INFO)\n\nfrom sacremoses import MosesDetokenizer\nfrom subword_nmt.apply_bpe import BPE\n\n\nclass MTTokenizer(object):\n\n    def __init__(self, bpe_codes_file: str, lang_src: str = 'zh', lang_trg: str = 'en', separator='@@'):\n        self.moses_detokenizer = MosesDetokenizer(lang=lang_trg)\n        self.bpe_tokenizer = BPE(codes=codecs.open(bpe_codes_file, encoding='utf-8'),\n                                 merges=-1,\n                                 separator=separator,\n                                 vocab=None,\n                                 glossaries=None)\n\n    def tokenize(self, text: str):\n        \"\"\"\n        Convert source string into bpe tokens.\n        \"\"\"\n        text = text.replace(' ', '')  # Remove blanks in Chinese text.\n        jieba_tokens = list(jieba.cut(text))\n        tokenized_text = ' '.join(jieba_tokens)\n        tokenized_bpe_text = self.bpe_tokenizer.process_line(tokenized_text)  # Apply bpe to text\n        bpe_tokens = tokenized_bpe_text.split(' ')\n        return bpe_tokens\n\n    def detokenize(self, tokens: List[str]):\n        \"\"\"\n        Convert target bpe tokens into string.\n        \"\"\"\n        separator = self.bpe_tokenizer.separator\n        
text_with_separators = ' '.join(tokens)\n        clean_text = re.sub(f'({separator} )|({separator} ?$)', '', text_with_separators)\n        clean_tokens = clean_text.split(' ')\n        detokenized_text = self.moses_detokenizer.tokenize(clean_tokens, return_str=True)\n        return detokenized_text\n\n\ndef post_process_seq(seq, bos_idx, eos_idx, output_bos=False, output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [int(idx) for idx in seq[:eos_pos + 1] if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)]\n    return seq\n"
  },
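The `detokenize` method above removes the `@@` subword separators with a regex before handing the merged words to the Moses detokenizer. A self-contained sketch of just that regex step (the sample tokens are hypothetical):

```python
import re

def merge_bpe_tokens(tokens, separator='@@'):
    """Join BPE subword tokens back into full words by stripping the
    trailing separator that marks a non-final subword piece."""
    text = ' '.join(tokens)
    # '@@ ' glues a subword onto the next piece; a dangling '@@' at the
    # very end of the sequence is dropped as well.
    clean = re.sub(f'({separator} )|({separator} ?$)', '', text)
    return clean.split(' ')

print(merge_bpe_tokens(['to@@', 'mor@@', 'row', 'morning']))  # ['tomorrow', 'morning']
```

The second alternative in the pattern only matters for truncated output, where a sentence can end on a non-final subword.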
  {
    "path": "modules/text/punctuation_restoration/auto_punc/README.md",
    "content": "# auto_punc\n\n|模型名称|auto_punc|\n| :--- | :---: |\n|类别|文本-标点恢复|\n|网络|Ernie-1.0|\n|数据集|WuDaoCorpora 2.0|\n|是否支持Fine-tuning|否|\n|模型大小|568MB|\n|最新更新日期|2021-12-24|\n|数据指标|-|\n\n## 一、模型基本信息\n\n### 模型介绍\n\nErnie是百度提出的基于知识增强的持续学习语义理解模型，该模型将大数据预训练与多源丰富知识相结合，通过持续学习技术，不断吸收海量文本数据中词汇、结构、语义等方面的知识，实现模型效果不断进化。\n\n[\"悟道\"文本数据集](https://ks3-cn-beijing.ksyun.com/resources/WuDaoCorpora/WuDaoCorpora__A_Super_Large_scale_Chinese_Corporafor_Pre_training_Language_Models.pdf)\n采用20多种规则从100TB原始网页数据中清洗得出最终数据集，注重隐私数据信息的去除，源头上避免GPT-3存在的隐私泄露风险；包含教育、科技等50+个行业数据标签，可以支持多领域预训练模型的训练。\n- 数据总量：3TB\n- 数据格式：json\n- 开源数量：200GB\n- 数据集下载：https://resource.wudaoai.cn/\n- 日期：2021年12月23日\n\nauto_punc采用了Ernie1.0预训练模型，在[WuDaoCorpora 2.0](https://resource.wudaoai.cn/home)的200G开源文本数据集上进行了标点恢复任务的训练，模型可直接用于预测，对输入的对中文文本自动添加7种标点符号：逗号（，）、句号（。）、感叹号（！）、问号（？）、顿号（、）、冒号（：）和分号（；）。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_1.png\" hspace='10'/> <br />\n</p>\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/paddlehub-img/ernie_network_2.png\" hspace='10'/> <br />\n</p>\n\n\n更多详情请参考\n- [WuDaoCorpora: A Super Large-scale Chinese Corpora for Pre-training Language Models](https://ks3-cn-beijing.ksyun.com/resources/WuDaoCorpora/WuDaoCorpora__A_Super_Large_scale_Chinese_Corporafor_Pre_training_Language_Models.pdf)\n- [ERNIE: Enhanced Representation through Knowledge Integration](https://arxiv.org/abs/1904.09223)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install auto_punc\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测  \n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as 
hub\n\n    model = hub.Module(\n        name='auto_punc',\n        version='1.0.0')\n\n    texts = [\n        '今天的天气真好啊你下午有空吗我想约你一起去逛街',\n        '我最喜欢的诗句是先天下之忧而忧后天下之乐而乐',\n    ]\n    punc_texts = model.add_puncs(texts)\n    print(punc_texts)\n    # ['我最喜欢的诗句是：先天下之忧而忧，后天下之乐而乐。', '今天的天气真好啊！你下午有空吗？我想约你一起去逛街。']\n    ```\n\n- ### 2、API\n  - ```python\n    def add_puncs(\n        texts: Union[str, List[str]],\n        max_length=256,\n        device='cpu'\n    )\n    ```\n    - 对输入的中文文本自动添加标点符号。\n\n    - **参数**\n\n      - `texts`：输入的中文文本，可为str或List[str]类型，预测时，中英文和数字以外的字符将会被删除。\n      - `max_length`：模型预测时输入的最大长度，超过时文本会被截断，默认为256。\n      - `device`：预测时使用的设备，默认为\b`cpu`，如需使用gpu预测，请设置为`gpu`。\n\n    - **返回**\n\n      - `punc_texts`：List[str]类型，返回添加标点后的文本列表。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线的文本标点添加的服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - ```shell\n    $ hub serving start -m auto_punc\n    ```\n\n  - 这样就完成了一个文本标点添加服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 输入的中文文本，中英文和数字之外的字符在模型预测前会被删除\n    texts = [\n        '今天的天气真好啊你下午有空吗我想约你一起去逛街',\n        '我最喜欢的诗句是先天下之忧而忧后天下之乐而乐',\n    ]\n\n    # 以key的方式指定text传入预测方法的时的参数，此例中为\"texts\"\n    data = {\"texts\": texts}\n\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/auto_punc\"\n\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  ```shell\n  $ hub install auto_punc\n  ```\n"
  },
  {
    "path": "modules/text/punctuation_restoration/auto_punc/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/punctuation_restoration/auto_punc/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport re\nfrom typing import List, Union\n\nimport numpy as np\nimport paddle\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlehub.utils.log import logger\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForTokenClassification\nfrom paddlenlp.data import Pad\n\n\n@moduleinfo(\n    name=\"auto_punc\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"KPatrick\",\n    author_email=\"\",\n    type=\"text/punctuation_restoration\")\nclass Ernie(paddle.nn.Layer):\n    def __init__(self):\n        super(Ernie, self).__init__()\n        res_dir = os.path.join(MODULE_HOME, 'auto_punc')\n        punc_vocab_file = os.path.join(res_dir, 'assets', 'punc_vocab.txt')\n        ckpt_dir = os.path.join(res_dir, 'assets', 'ckpt')\n\n        self.punc_vocab = self._load_dict(punc_vocab_file)\n        self.punc_list = list(self.punc_vocab.keys())\n        self.model = ErnieForTokenClassification.from_pretrained(ckpt_dir)\n        self.model.eval()\n        self.tokenizer = ErnieTokenizer.from_pretrained('ernie-1.0')\n\n    @staticmethod\n    def _load_dict(dict_path):\n        vocab = {}\n        i = 0\n        with open(dict_path, 'r', encoding='utf-8') as fin:\n            for line in fin:\n                key = line.strip('\\n')\n                vocab[key] = i\n    
            i += 1\n        return vocab\n\n    @staticmethod\n    def _clean_text(text, punc_list):\n        text = text.lower()\n        text = re.sub('[^A-Za-z0-9\\u4e00-\\u9fa5]', '', text)\n        text = re.sub(f'[{\"\".join([p for p in punc_list][1:])}]', '', text)\n        return text\n\n    def forward(self, input_ids: paddle.Tensor, seg_ids: paddle.Tensor):\n        # Token classification: predict a punctuation label for every input token.\n        return self.model(input_ids, seg_ids)\n\n    @serving\n    def add_puncs(self, texts: Union[str, List[str]], max_length=256, device='cpu'):\n        assert isinstance(texts, str) or (isinstance(texts, list) and isinstance(texts[0], str)), \\\n            'Input data should be str or List[str], but got {}'.format(type(texts))\n\n        if isinstance(texts, str):\n            texts = [texts]\n\n        input_ids = []\n        seg_ids = []\n        seq_len = []\n        for i in range(len(texts)):\n            clean_text = self._clean_text(texts[i], self.punc_list)\n            assert len(clean_text) > 0, f'Invalid input string: {texts[i]}'\n\n            tokenized_input = self.tokenizer(\n                list(clean_text), return_length=True, is_split_into_words=True, max_seq_len=max_length)\n\n            input_ids.append(tokenized_input['input_ids'])\n            seg_ids.append(tokenized_input['token_type_ids'])\n            seq_len.append(tokenized_input['seq_len'])\n\n        paddle.set_device(device)\n        with paddle.no_grad():\n            pad_func_for_input_ids = Pad(axis=0, pad_val=self.tokenizer.pad_token_id, dtype='int64')\n            pad_func_for_seg_ids = Pad(axis=0, pad_val=self.tokenizer.pad_token_type_id, dtype='int64')\n            input_ids = paddle.to_tensor(pad_func_for_input_ids(input_ids))\n            seg_ids = paddle.to_tensor(pad_func_for_seg_ids(seg_ids))\n            logits = self.model(input_ids, seg_ids)\n            preds = paddle.argmax(logits, axis=-1)\n\n        tokens = []\n        labels = []\n        for i in range(len(input_ids)):\n            tokens.append(self.tokenizer.convert_ids_to_tokens(input_ids[i, 1:seq_len[i] - 1].tolist()))\n            labels.append(preds[i, 1:seq_len[i] - 1].tolist())  # Remove predictions of special tokens.\n\n        punc_texts = []\n        for token, label in zip(tokens, labels):\n            assert len(token) == len(label)\n            text = ''\n            for t, l in zip(token, label):\n                text += t\n                if l != 0:  # Label 0 means no punctuation follows this token.\n                    text += self.punc_list[l]\n            punc_texts.append(text)\n\n        return punc_texts\n"
  },
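The final loop of `add_puncs` above rebuilds the output string by appending the predicted punctuation mark after each token, with label 0 reserved for "no punctuation". A minimal standalone sketch of that step (the label values and punctuation table here are illustrative, not the module's actual vocabulary file):

```python
def attach_puncs(tokens, labels, punc_list):
    """Rebuild text by appending the predicted punctuation mark after
    each token; label 0 means no punctuation follows that token."""
    assert len(tokens) == len(labels)
    text = ''
    for t, l in zip(tokens, labels):
        text += t
        if l != 0:  # 0 is the 'no punctuation' label.
            text += punc_list[l]
    return text

# Hypothetical label table: index 0 is blank, the rest are punctuation marks.
punc_list = ['', '，', '。', '！', '？', '、', '：', '；']
print(attach_puncs(list('今天天气真好'), [0, 0, 0, 0, 0, 3], punc_list))  # 今天天气真好！
```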
  {
    "path": "modules/text/sentiment_analysis/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【情感分析】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 情感分析\n情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和危机舆情监控，为企业提供有利的决策支持。\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [情感分析-LSTM](https://www.paddlepaddle.org.cn/hubdetail?name=senta_lstm&en_category=SentimentAnalysis) |情感倾向分析LSTM实现 |\n| [情感分析-GRU](https://www.paddlepaddle.org.cn/hubdetail?name=senta_gru&en_category=SentimentAnalysis) |情感倾向分析GRU实现 |\n| [对话情绪识别](https://www.paddlepaddle.org.cn/hubdetail?name=emotion_detection_textcnn&en_category=SentimentAnalysis) |针对智能对话场景中的用户文本，自动判断该文本的情绪类别并给出相应的置信度，情绪类型分为积极、消极、中性。 |\n"
  },
  {
    "path": "modules/text/sentiment_analysis/README_en.md",
    "content": "## **For better user experience, refer to the Web official document ->  [Sentiment Analysis](https://www.paddlepaddle.org.cn/hublist)**\n\n### Sentiment Analysis\n\nSentiment Classification (Senta) can automatically determine the sentiment polarity category of Chinese texts with subjective descriptions and give the corresponding confidence level. This can help enterprises understand users' consumption habits, analyze hot topics, and monitor public opinion at crisis, and provide favorable decision support for enterprises.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Sentiment Analysis - LSTM](https://www.paddlepaddle.org.cn/hubdetail?name=senta_lstm&en_category=SentimentAnalysis) | Implementation of sentiment tendency analysis LSTM           |\n| [Sentiment Analysis - GRU](https://www.paddlepaddle.org.cn/hubdetail?name=senta_gru&en_category=SentimentAnalysis) | Implementation of sentiment tendency analysis GRU            |\n| [Conversation Sentiment Recognition](https://www.paddlepaddle.org.cn/hubdetail?name=emotion_detection_textcnn&en_category=SentimentAnalysis) | For user texts in an intelligent conversation scene, it automatically determines the sentiment category of the texts and assigns a corresponding confidence level. The sentiment type is classified as positive, negative, or neutral. |\n"
  },
  {
    "path": "modules/text/sentiment_analysis/emotion_detection_textcnn/README.md",
    "content": "# emotion_detection_textcnn\n\n|Model Name|emotion_detection_textcnn|\n| :--- | :---: |\n|Category|Text - Sentiment Analysis|\n|Network|TextCNN|\n|Dataset|Baidu self-built dataset|\n|Fine-tuning Supported|No|\n|Model Size|122MB|\n|Latest Update Date|2021-02-26|\n|Data Metrics|-|\n\n\n## I. Basic Information\n\n- ### Module Introduction\n\n  - Emotion Detection (EmoTect) identifies the user's emotion in intelligent-dialogue scenarios: given user text from such a scenario, it automatically classifies the emotion of the text and returns the corresponding confidence scores. Emotions fall into three categories: positive, negative, and neutral. The model is based on TextCNN (a CNN with multiple convolution kernels), which captures local correlations within a sentence well.\n\n\n\n\n## II. Installation\n\n- ### 1. Environment Dependencies\n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2. Installation\n\n  - ```shell\n    $ hub install emotion_detection_textcnn\n    ```\n  - If you encounter problems during installation, please refer to: [Windows quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md) | [Linux quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [MacOS quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n\n## III. Module API Prediction\n\n- ### 1. Command-line Prediction\n\n  - ```shell\n    $ hub run emotion_detection_textcnn --input_text \"今天天气真好\"\n    ```\n    or\n  - ```shell\n    $ hub run emotion_detection_textcnn --input_file test.txt\n    ```\n\n    - test.txt holds the texts to be predicted, one per line, e.g.:\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - This invokes the sentiment analysis module from the command line. For more details, see [PaddleHub Command Line Usage](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2. Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"emotion_detection_textcnn\")\n    test_text = [\"今天天气真好\", \"湿纸巾是干垃圾\", \"别来吵我\"]\n    results = module.emotion_classify(texts=test_text)\n\n    for result in results:\n        print(result['text'])\n        print(result['emotion_label'])\n        print(result['emotion_key'])\n        print(result['positive_probs'])\n        print(result['neutral_probs'])\n        print(result['negative_probs'])\n\n    # 今天天气真好 2 positive 0.9267 0.0714 0.0019\n    # 湿纸巾是干垃圾 1 neutral 0.0062 0.9896 0.0042\n    # 别来吵我 0 negative 0.0732 0.1477 0.7791\n    ```\n\n- ### 3. API\n\n  - ```python\n    def emotion_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n    - Prediction API of emotion_detection_textcnn; classifies the emotion of the input sentences (three classes: positive / neutral / negative).\n\n    - **Parameters**\n\n      - texts(list): Texts to be predicted. If texts is passed, data is not needed; pass exactly one of the two.\n      - data(dict): Data to be predicted; the key must be text and the value is the text to be predicted. If data is passed, texts is not needed; pass exactly one of the two. texts is recommended, as data will be deprecated.\n      - use_gpu(bool): Whether to use the GPU for prediction. If so, set the CUDA_VISIBLE_DEVICES environment variable before predicting; otherwise it does not need to be set.\n      - batch_size(int): Batch size.\n\n    - **Return**\n\n      - results(list): Sentiment classification results.\n\n\n  - ```python\n    def get_labels()\n    ```\n\n    - Get the categories of emotion_detection_textcnn.\n\n    - **Return**\n\n      - labels(dict): Categories of emotion_detection_textcnn (three classes: positive / neutral / negative).\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - Get the vocabulary used in pre-training.\n\n    - **Return**\n\n      - vocab_path(str): Path to the vocabulary file.\n\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online sentiment analysis service whose API can be used by online web applications.\n\n- ### Step 1: Start the PaddleHub Serving service\n\n  - Run the start command:\n    ```shell\n    $ hub serving start -m emotion_detection_textcnn\n    ```\n\n  - The model-loading process is shown during startup; after a successful start you will see\n    ```shell\n    Loading emotion_detection_textcnn successful.\n    ```\n\n  - The serving API is now deployed, listening on port 8866 by default.\n\n  - **NOTE:** To predict on GPU, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the server configured, the few lines of code below send a prediction request and obtain the result\n\n    ```python\n    import requests\n    import json\n\n    # Data to be predicted\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # Runtime configuration\n    # Equivalent to the local call emotion_detection_textcnn.emotion_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\": True}\n\n    # Specify emotion_detection_textcnn as the prediction method and send a POST request; content-type must be json\n    # HOST_IP is the server IP\n    url = \"http://HOST_IP:8866/predict/emotion_detection_textcnn\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - For more information about PaddleHub Serving, see [Serving Deployment](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Significantly improved prediction performance\n\n* 1.2.0\n\n  Model upgrade: supports transfer learning for tasks such as text classification and text matching\n\n* 1.3.0\n\n  Removed the Fluid API\n\n  - ```shell\n    $ hub install emotion_detection_textcnn==1.3.0\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/emotion_detection_textcnn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/emotion_detection_textcnn/assets/vocab.txt",
    "content": "三次元\t0\n宋秘密\t1\n佳酿\t2\n或多或少或少\t3\n行房\t4\n集美\t5\n猪猪神探\t6\n三四十岁\t7\n蓓蕾\t8\nImlearning\t9\n肾结石\t10\n五百公斤\t11\n张春哥\t12\nxodu\t13\n程一鸣\t14\n解法\t15\nsigff\t16\n更准确\t17\n使用期\t18\n袁文燕\t19\n大金店\t20\nhjshhdh\t21\nyou歪瑞\t22\n礼格\t23\nv个i赶快\t24\nEuro\t25\n挑药\t26\n儿媳妇儿\t27\nt3n71zwooinnomixeo\t28\nGMARKET\t29\n真方头男\t30\njrps\t31\n转待\t32\n杂粮煎饼\t33\nntmbljsgjqr\t34\n75千米\t35\n19秒\t36\n朴灿烈\t37\n明阳学校\t38\n盟盟\t39\n椰林\t40\n莎娃迪卡\t41\n在点\t42\n干仕\t43\nwooden\t44\n两兄弟\t45\n我不我待\t46\n哈巴狗og么美\t47\n谁一个人\t48\n干什\t49\n1346460627\t50\n亚旭\t51\n系度\t52\n老度\t53\n27%\t54\n拉不拉\t55\n谢德功\t56\n保护你好\t57\n偶字\t58\n物竞天择\t59\n扣扣coco\t60\n270\t61\n272\t62\n273\t63\n274\t64\n275\t65\n276\t66\n277\t67\n278\t68\n阵亡\t69\n于洋\t70\n克拉玛依\t71\n更改\t72\n收购\t73\n主我\t74\n段话\t75\n死柴蔚\t76\n阵云\t77\n黑斑\t78\n主战\t79\n调电视\t80\n写照\t81\n系统类\t82\n修成\t83\n收费\t84\n闭嘴\t85\n易云奥\t86\n成厂\t87\n嫁鸡随鸡咯\t88\n熊憨\t89\n怒发\t90\n陈什么旭\t91\n讲一讲\t92\n多方\t93\n绿野\t94\n97样\t95\n黑河市\t96\n受不可以\t97\n李红和\t98\n嘻嘻时间猎人\t99\n多斤\t100\nJungKook\t101\nyiluolinnnnnnnnnQ4\t102\n度秘宝贝\t103\nJOPQ\t104\n王春力\t105\n黑啊老子看透你了你个伪人君子\t106\n9日\t107\n铁轨\t108\n00年\t109\n这么小的问题我是电影小达人\t110\n喂哥哥哥哥哥哥\t111\n八十五\t112\n你是我兄弟\t113\n第十四场\t114\n大众化\t115\n响起\t116\n人生之路\t117\n鸡wa\t118\n哈哈哈哈大笑\t119\n大家好呀我是樱子\t120\ndnT\t121\n有血有肉\t122\n塞纳河\t123\n两千几千\t124\n幸福的爱情\t125\n红酒绿\t126\n着急\t127\ndnd\t128\ndng\t129\ndnf\t130\ndna\t131\ndnc\t132\n那谁\t133\nneboot\t134\ndnn\t135\n幻世\t136\n不可恩\t137\n陆维民\t138\ndns\t139\ndfgdfccdfg\t140\n巴亚诺\t141\n老唐\t142\ndnx\t143\n爱故生怖\t144\njyangan\t145\n芙蓉吟\t146\n心灵手巧\t147\n吃电磁\t148\n哈迪德\t149\n热映\t150\n高尔夫球场\t151\ngijj\t152\nyahou\t153\n胖妞\t154\n硕组\t155\n丘比沙拉酱\t156\nfbbbbbbbbbbbbbbbnnnbnnnnnnnnnnnnnnmmmmmmmmmmm\t157\n呈现\t158\n钟秀敏\t159\n仓山区检察院\t160\n块头\t161\n危言耸听\t162\n差不就是\t163\n264公斤\t164\n醫生\t165\n汪星\t166\n脑袋瓜\t167\n不是我美\t168\n胖妹\t169\n156194999\t170\n金历旭\t171\n铁树\t172\n盛典\t173\n徐广昊\t174\n78名\t175\n去找工作\t176\n反馈\t177\njfnfr\t178\nCNBLAE\t179\n讯息\t180\n两成\t181\n吹毛求疵\t182\n雯雅\t183\n黑风梨那双眼动人收c更美人原看欺负你\t184\n鞭子\t185\ndayscountdown\t186\n再见挥手\t187\n才见\t188\n
圣卢\t189\n挖沙\t190\n17k小坏蛋小坏蛋小坏蛋\t191\n一比十\t192\n报喜\t193\n嗯宗拜\t194\nhdbsksk\t195\n今天5点\t196\n别了了了了\t197\n五百五百两百几百\t198\n老底\t199\nsdffghjolhfddrxdxcyigixfudtEtxfjxfkchhkxgjfjzxhzkhckghdhztkhzfr\t200\nav撸男\t201\ncjdnfjid\t202\n两户\t203\n两房\t204\n赵家睿\t205\n433888\t206\n不见天日\t207\n乘号\t208\n统一战线\t209\n中间件\t210\nwrong\t211\n历史性\t212\n711丁\t213\n动工\t214\n最萌萌哒\t215\n惊吓\t216\n绿豆\t217\nDC卜唉三\t218\n荣幸\t219\n风铃\t220\n郑大度秘\t221\n女性化\t222\nifuchcb\t223\n同生共死\t224\n还死命\t225\n酿酒\t226\n第3名\t227\n米奇一\t228\n二方\t229\n花千骨白子画\t230\n换神\t231\ntyrfgf\t232\n安思你\t233\ndududufuh\t234\n有\t235\n月\t236\n朋\t237\n朊\t238\n服\t239\n度秘度秘b\t240\n最\t241\n會\t242\n小棍头\t243\n西游记\t244\n望\t245\n朝\t246\n期\t247\n总政治部\t248\n工藤新一水族馆\t249\n65881\t250\n朒\t251\n朕\t252\n朗\t253\n远在天边\t254\n木\t255\n末\t256\n未\t257\n怕冷不爱\t258\n本\t259\n术\t260\n哈哈们\t261\nfix\t262\nfiy\t263\n在家斗\t264\n侮辱\t265\ncndy\t266\n朦\t267\nfib\t268\nfic\t269\n机\t270\nfif\t271\nfig\t272\n朱\t273\nfih\t274\n朵\t275\n朴\t276\n刘海妮\t277\n孟扬\t278\n王明\t279\n13770185639\t280\n度迷我爱\t281\n王昕\t282\n超级新闻场\t283\n这会的回去言字旁\t284\n王星\t285\n13713540463www\t286\n王春\t287\n张云雨\t288\n摩托图裤头咯图\t289\n把把脉\t290\n持球\t291\n基们\t292\n纹身师\t293\n少臭美\t294\n太好了笑\t295\n这等你\t296\n夜火\t297\n两缕\t298\n表链\t299\n阳关\t300\n站水\t301\nPPS娱乐豆\t302\n予取予求\t303\n这么快说\t304\n闫甜甜\t305\n腿子\t306\n无比较\t307\nqaWS\t308\n真学识渊博\t309\nFsd\t310\n度秘奇\t311\n790元\t312\n楚楚\t313\n当家作主\t314\n来了赞\t315\n忘了我不想\t316\n怨不得\t317\n天才设计师\t318\n陈贤妹\t319\n吴盟盟\t320\n度秘女\t321\nsyk\t322\n和你的手\t323\n猛兽\t324\nf0303\t325\n12355679801\t326\nxahq\t327\nkkkkkkkk\t328\n我您\t329\n几公里\t330\n清末\t331\n妈呀度\t332\nXPS13薄出位#\t333\n依林\t334\n芥子年\t335\n熊承佳\t336\n风月\t337\nzui4\t338\n加国\t339\n资料库\t340\nNigen\t341\n鲜杯族\t342\n眷顾\t343\n13597662528\t344\n麻生兮\t345\n清朝\t346\n18265230279\t347\n11rr一一一228一u281192919191o11o9还事工ytuy\t348\ndgv\t349\n清朗\t350\n密度\t351\n十座\t352\n平胸吧大胸\t353\n成县\t354\n流月经\t355\n烦嚣\t356\n爱孙\t357\n旅客列车\t358\nsyt\t359\n认真手\t360\n爱子\t361\n雪下\t362\n剧名\t363\n丁墨\t364\n人人在线性质点\t365\n孙浚超\t366\n差秘\t367\n帅气人\t368\n陈东栋\t369\n小贝潮\t370\n听话的你\t371\n跨步\t372\
n中堂\t373\n长阳\t374\nJyywood\t375\n王国城\t376\n雪中\t377\n煎蛋\t378\n洗礼\t379\n削读\t380\nneeded\t381\n甬道\t382\n二十六号\t383\nmaster\t384\n加固\t385\n霍秀秀\t386\n李家铃\t387\n门道\t388\n应援棒\t389\n疆界\t390\n飞去\t391\n孵化期\t392\n999999999999999999999999999999999999999999999999999999699999\t393\n奥西斯\t394\n最小先\t395\n两千九千块\t396\n你好开心\t397\n我的一生\t398\n陪同怕\t399\n腿部\t400\n追求者\t401\n小浩\t402\n苏打粉\t403\n折箅\t404\n34块\t405\n哈士奇\t406\n没救惹\t407\n寇所措\t408\n塌陷\t409\n煮熟\t410\nHvbkcbhg\t411\n现值\t412\n裁定\t413\n折算\t414\n帅哥你好美女你好\t415\n哈士契\t416\n字谜\t417\ndgdgfvncb\t418\n叩见\t419\n小海\t420\ntntministatinfin\t421\nhson\t422\n系旺角\t423\n太任性\t424\n故火\t425\n足珍贵\t426\n伊美黛\t427\n传谣\t428\n科技馆\t429\n澳大利赫本\t430\n俗称\t431\n2级\t432\nhsos\t433\nThen\t434\n张冯喜\t435\n阿乖\t436\nwrrtty\t437\nThed\t438\n白嫩\t439\n发行人\t440\n改改天\t441\n疗养院\t442\nhrkfbxbxs\t443\n结扎战\t444\n王金波\t445\nsnfotah\t446\n修汇\t447\n最后通碟\t448\n小青年\t449\n字段\t450\n楠楠兄\t451\n了儿\t452\n11849\t453\n衰衰\t454\n喝酒吧\t455\n阿乐\t456\n夏多拉\t457\n日本经济研究中心\t458\n坑坑洼洼\t459\n刷钻\t460\n蛮子\t461\n茅盾\t462\n么事\t463\n风机\t464\n么了\t465\n##7.12四大名捕##铜雀台#\t466\n喝醉酒\t467\n卷宗如山\t468\n狮溪\t469\n任文珑\t470\n伯帅\t471\n我爱你爱着你\t472\n苗苗\t473\n鼻烟卷\t474\n四堂\t475\n窍门\t476\n等于我回来\t477\n九阳会和\t478\n玫瑰玫瑰\t479\nECLATDEROSE\t480\n多一八\t481\n要火\t482\n平面直角坐标系\t483\n靠靠靠靠靠靠靠靠靠靠靠\t484\n我和你的结拜日\t485\n鼓珠\t486\n唉唉没来真是猪丁的来实在\t487\n沈聪明\t488\n为什么多米\t489\nnotbooks\t490\n父母亲\t491\n老揭\t492\nexcavator\t493\n校程\t494\n一闪一闪亮晶晶满天都是小星星挂在天空放光好像有张杰abcdefjhihetiaouostoyomiooisternonononononoyounama秘咪咪喵咪咪咪咪咪咪咪咪\t495\niVC\t496\n泥煤泥煤泥煤\t497\nRfvg\t498\n剑灵\t499\n储蓄\t500\n收场\t501\n艺哥\t502\n小度男的女的你是\t503\n100.0MHZ\t504\n八嘎挖苦\t505\n哈堂\t506\n墙砖\t507\n好不好呀美\t508\n波拉\t509\n小燕\t510\n留香\t511\n一部部\t512\n罗锦瑜\t513\n直接了当\t514\n一个四十几岁\t515\n于洪\t516\n最后一张照片\t517\n更专业\t518\n蒋梦雨\t519\n摩西摩西状\t520\ntiffortios\t521\n当歌\t522\n神通佛\t523\nAndover\t524\n新动感\t525\n徐大\t526\n英雄所\t527\n687568466779\t528\n罗通\t529\n上蔡\t530\nHi字\t531\n偶素\t532\n我美美达\t533\n红头\t534\nKSKSN\t535\n五六十个\t536\n你是我的命运\t537\n呕白\t538\n阳春\t539\n脸大\t540\n触碰\t541\n柯喜\t542\n188777\t543\n四群\t544\n钟锋\t545\n殇璃\t
546\n周登峰\t547\n泡泡糖\t548\n红外\t549\n就是为了\t550\n不呵呵\t551\n哼不理你了讨厌\t552\n东方片\t553\nhttppinyincn1XSXz9ijCyF\t554\nscud\t555\n四美\t556\n第章\t557\nirk\t558\n掐死死\t559\njramsey\t560\n红夏\t561\n风雨无阻\t562\n少有\t563\n拐看\t564\n维杰\t565\n崔牛\t566\n不舍不得\t567\n治武\t568\n帅点\t569\n訾宇悦\t570\n解一下锁\t571\n肖狗\t572\n后年\t573\n96岁\t574\n刘宇鹏\t575\n修不修\t576\n过款\t577\n乾坤缄默\t578\n刘品言\t579\n干锅\t580\n真不赖\t581\nVgii\t582\n贵名\t583\n嗎嗎\t584\n齐心合力\t585\n陈载梦翔\t586\n忙萌萌哒\t587\n运往\t588\n毛可\t589\n巴经\t590\n來給\t591\n精细化\t592\n巴结\t593\n花儿朵朵\t594\n鹤壁市\t595\n掌权\t596\n假面骑士铠武五打片\t597\n1967年\t598\n周线\t599\n纪检委\t600\n喂喂喂喂喂喂喂喂你真二了唉我去\t601\n放歌听\t602\niuhj\t603\n安妮集团\t604\n杨马琼\t605\n毛发\t606\n拜仁慕尼黑\t607\n3.8%\t608\n吴言松\t609\n闲情逸致\t610\n姜黄\t611\n周红\t612\n深刻\t613\n建业小区\t614\n50000万个\t615\n七分之二\t616\nOK塔拉\t617\n彭总\t618\n章鱼小丸子\t619\n七分之五\t620\n脱图\t621\n脱困\t622\n受凉\t623\n骗你的我知道你是谁你\t624\n徐妞\t625\n气死人不偿命\t626\n郎给姐\t627\n政要\t628\n朱泽华\t629\n对於\t630\n相异\t631\n对方\t632\n大年糕\t633\n修正带\t634\n罢休\t635\n推论\t636\nBBDO\t637\n吃药了吧\t638\n给我走开\t639\n妄念\t640\n宋雨霏\t641\n徐妹\t642\n不好像\t643\n同路\t644\n外星人\t645\n速写\t646\n张彤彤\t647\n串串门\t648\n黑风\t649\n皮孑\t650\n皮子\t651\n好兴\t652\n发老\t653\n火苗\t654\n天天训\t655\ndesty\t656\n大家好我叫计紫纯\t657\nttmuwg\t658\n022584\t659\n克霉王\t660\n友情\t661\n刘强东\t662\n慈林\t663\nxxop\t664\n好克\t665\n张雨艺\t666\n一百千克\t667\n耿党苍苍\t668\nxxoo\t669\n第三集\t670\n八八咯假打大战\t671\n哈雅男篮\t672\n广发医药卫生联接\t673\nhttpehiphotosbaiducomxiaodupicitemd01373f082025aaf20822ebffcedab64034f1a15jpg\t674\n18896\t675\n大元股份\t676\n学行\t677\nMaybe\t678\n拨清波\t679\nI0口\t680\n人道\t681\n150岁\t682\n迈步\t683\n李谷堡\t684\n更何况\t685\n全球性\t686\n张春月\t687\n胡俊伟\t688\n20个\t689\n好可恶\t690\n9876543\t691\n并跟贴\t692\nEBS\t693\n神秘们\t694\n一面\t695\n乞求\t696\n宝地\t697\n芈菌\t698\n慰\t699\n淡淡定\t700\n统战部\t701\n多米尼加\t702\n百万元\t703\n一不知道\t704\n鬼娃娃\t705\n50多个\t706\n北大医学部\t707\nc1头\t708\n对对对对你这话\t709\n脸色狼\t710\n英菲尼迪\t711\n家庭背景\t712\n很感动人\t713\n揶揄\t714\nTUTTVlllkg\t715\n叫错了\t716\n强国\t717\n川宁\t718\n专业户\t719\n马步\t720\n饿棍天使横\t721\n小坏蛋度秘\t722\n长不骗人\t723\n纸本\t724\n以疾\t725\n来来来电\t726\n两只手\t727\n三剑客\t728\nikkuuiptttt\t729\
n夜深找我吧\t730\n气罐\t731\n原稿\t732\n丘比\t733\n陈好\t734\nfykdn\t735\n得瑟\t736\n浅蓝\t737\n刘铁军\t738\n电影影\t739\n冰柜\t740\n还没有人\t741\n时评\t742\n15245245356\t743\n李啟冰\t744\nY╯︵┴─┴\t745\n不是的那你是男是女\t746\n君梓\t747\n还有什么不知道\t748\naeye\t749\n對話\t750\n周亚伟\t751\n营业款\t752\n炎武\t753\n1815752355\t754\nauoss\t755\n陈奕\t756\n好了先不聊\t757\n朱盼\t758\n还没有了\t759\nauosl\t760\n读读\t761\n读诵\t762\n佰伍\t763\ndddrsrrddgvdrg\t764\n被撞\t765\n十载\t766\n坐台\t767\n钟汉良\t768\n声音心\t769\n刘欢欢\t770\n矢志不渝\t771\n不哄我我不要你\t772\n258476312\t773\n张有两\t774\n2137598\t775\n五月\t776\n轉\t777\n五朵\t778\n叶维廉\t779\n要疯了\t780\n四哥\t781\n临滨\t782\n100000000岁\t783\n2222222222222223\t784\n慕\t785\nZ500\t786\n盘旋\t787\n频镜\t788\n失衡\t789\n奥格威\t790\n力尝\t791\n迷藏\t792\n红豆杉\t793\n百岁照\t794\n步入\t795\n不煽情\t796\n迷界\t797\n600多名\t798\n慑\t799\n玩儿不好玩儿不好玩儿不好玩儿不好玩儿吧玩儿吧玩儿吧玩儿吧玩儿吧玩儿\t800\n八五大\t801\nmicobliblislatic\t802\n行辉哥\t803\n啊别\t804\nGsd\t805\n夜王道\t806\n心跳\t807\n二百分种\t808\nYu\t809\n那么多不会\t810\n美容\t811\n十八吧\t812\n适配器\t813\n俞小成\t814\n对的你是我的好盆友\t815\n爱好羡慕\t816\n天秤\t817\n坏动作\t818\nOUTBACK\t819\n元熊思远\t820\nv女孩\t821\ndnl\t822\n光之国\t823\n富斯乐赛思腾\t824\nhhhvj\t825\n中国佛教协会\t826\n高八度\t827\n侯年华\t828\n孙婉笛\t829\n软件园\t830\n美罗华\t831\n水霖学校\t832\n杜燕妮\t833\n普拉普拉\t834\n邕宁区\t835\n勤奋者\t836\n汤晶良\t837\n六样子\t838\n借条\t839\n轗\t840\n借来\t841\n89233\t842\n好多起\t843\n静女\t844\n顾海蓝\t845\n湖南春晚\t846\n贝尔.格里尔思\t847\n懒惰\t848\n单色\t849\n你的泪\t850\n菊花杞子\t851\n加油\t852\n小冬眠\t853\n挂烫机\t854\n全椒\t855\n别问我为什么，我我我我eme400867em\t856\n复旦大学经济学院\t857\n碧海云\t858\n没得\t859\n闲闲\t860\n真好不过\t861\n老子猪\t862\n赛罗\t863\nHaggisa\t864\n听不听话\t865\n三五百\t866\n求也不懂\t867\nk小游\t868\n不丹阳\t869\n苦肉记\t870\n饱死\t871\n过去头\t872\n牛紫艺\t873\n263451711\t874\n赵水垢\t875\n你好办\t876\n赵玲\t877\n哈尼娟\t878\n阳泉市\t879\n引人转贴\t880\n百度有钱花\t881\n孤星\t882\n远山\t883\n卡乔·雷卡森斯\t884\n历经过\t885\n叫苦不迭\t886\nxkk\t887\n湖泊\t888\n英姐\t889\n人生路\t890\nxkc\t891\ncyou\t892\nxkd\t893\n等你一中\t894\nxkx\t895\n天黑了我们晚安吧拜拜明天见\t896\n3d金瓶梅\t897\nxks\t898\nxkv\t899\nhfcnbsf\t900\nImgoing\t901\nbaibao\t902\n点翔\t903\n算不想\t904\n微风\t905\nKAJAJKJWNQQK\t906\n被烧毁\t907\n见仁\t908\n香氛\t909\nnaaannaaaan\t910\
ngeybdicnfsjjcb\t911\n张文前\t912\n球包\t913\njufjh\t914\n家奴\t915\n家女\t916\n求因为名\t917\n好听话嘻嘻嘻\t918\n王文书\t919\n扶风过\t920\n坏男女\t921\n聊用\t922\n凯皇后\t923\n所欲\t924\n狂哮\t925\nwdmdattatta\t926\n重要战略机遇期\t927\n坠儿\t928\n新闻编辑室\t929\n小小者\t930\n邹家扬\t931\nhgjhcxnhxv\t932\n九大碗\t933\nxk6\t934\n水墨画\t935\n湛杰法师\t936\n秋天天\t937\n你在兴和命中选一个字\t938\n高紫云\t939\n李娟霞\t940\n甚平\t941\n汽油机\t942\n追煞\t943\n好好很好\t944\n更大\t945\n架子鼓\t946\n杜丽娘\t947\n天天听歌冰肌玉骨\t948\n杜丽娜\t949\n特型\t950\n就怕\t951\n不是我喜欢\t952\n更多\t953\nomeimeistma\t954\n蒋春龙\t955\n马忒\t956\n库哈斯\t957\n河源市源城区\t958\nmymmymy\t959\n付秀治\t960\n一场九\t961\n天宏手机cym1\t962\n0点八\t963\n马心\t964\n二十几亿\t965\n梁亚茹\t966\n老卵\t967\n6月9日傍晚\t968\njerj\t969\n申浩宇\t970\n杨一\t971\n角度咪\t972\n大楼房\t973\n经常性\t974\n庆祝\t975\n成排\t976\n辽宋\t977\nexact\t978\n盖特纳\t979\n韩夭夭\t980\nhttpfhiphotosbaiducomxiaodupicitemf603918fa0ec08faacc60ad25eee3d6d54fbda\t981\nffxfhi\t982\n大海山\t983\n试验室\t984\n进心\t985\n汕尾屯\t986\n于兮\t987\n香磷\t988\n饶阳县\t989\n嵌\t990\n都水\t991\n献给\t992\n6月3日－8日\t993\nDhhsfkjhdds\t994\nftrrutvreyveryyreeurdtiyyuohpyuhypugouygotugotygytovyotgyotgyotgotygtyogtoy\t995\n嵞\t996\nM1\t997\n安琪灵\t998\n2000亿元亿元\t999\nM2\t1000\n吸合\t1001\n蔈子\t1002\n晕行\t1003\n福德\t1004\n嵩\t1005\nuvyctcixiixiiz\t1006\n张朝阳\t1007\n春雪\t1008\n江姐江\t1009\n俩几分\t1010\n挂挂\t1011\nNABFkgVfBvvcvvvvvhbkhnljkbjnbccccbnvmvc\t1012\n文武双全\t1013\n27件\t1014\nzrm\t1015\n这场戏\t1016\nzrh\t1017\nMe\t1018\nufc\t1019\nMa\t1020\n海沧湾\t1021\nMc\t1022\nMm\t1023\n阿古\t1024\nMo\t1025\nMi\t1026\n转会期\t1027\nMk\t1028\n可怜虫\t1029\novcccvi杯\t1030\n屠呦呦\t1031\n7点30到8点\t1032\n益康\t1033\nMs\t1034\n2．3\t1035\n知根知底\t1036\n领跑\t1037\n朗域剧社\t1038\n上调\t1039\nMy\t1040\n巴罗克集团\t1041\nME\t1042\nMD\t1043\n程明\t1044\nMF\t1045\nMC\t1046\nMB\t1047\nMM\t1048\nML\t1049\nMN\t1050\nMI\t1051\nufz\t1052\nMK\t1053\nMJ\t1054\nMT\t1055\nMV\t1056\n一个幺三五\t1057\n死辣\t1058\nMS\t1059\nMR\t1060\n呃堡\t1061\n灭火\t1062\n瓶口水宝\t1063\nMY\t1064\nMX\t1065\n灭灯\t1066\n公共产品\t1067\nqqvt\t1068\n任期\t1069\n河头\t1070\n碰到\t1071\nABAC\t1072\n五遍\t1073\n破解版\t1074\n拜拜安\t1075\n构词\t1076\n完事儿\t1077\n乞丐\t1078\n右派\t1079\n五
道\t1080\n嗯五十\t1081\n余伍\t1082\n11月9号\t1083\n夜晚\t1084\nufs\t1085\n诺贝尔和平奖\t1086\n涨破\t1087\n张洪峰\t1088\n下星期二\t1089\n半档\t1090\n石器\t1091\n任末\t1092\neyduh\t1093\nK1666\t1094\nueudhddhdur\t1095\nACDEPG\t1096\n秘服\t1097\n秘月\t1098\ngivcfyhbv\t1099\n投保\t1100\nooooooo\t1101\n中国人\t1102\n吸吮\t1103\n春水\t1104\n飞尚\t1105\nfeican\t1106\n中化制药\t1107\ninalaskathe\t1108\n藓园\t1109\n10月16日\t1110\n眉毛\t1111\n秘术\t1112\n嗯哪哪哪哪哪哪哪\t1113\n招募\t1114\n不要脸错别字不要脸\t1115\nGlenn\t1116\nJJR\t1117\n5555555555555555555555555555580\t1118\nJJP\t1119\n全体会议\t1120\n正山\t1121\nJJJ\t1122\n交易量\t1123\nfollowing\t1124\n秘朴\t1125\nJJD\t1126\nJJC\t1127\n百货商场\t1128\nWhois\t1129\n纳西族\t1130\n岩哥\t1131\n大苗者\t1132\n坤所\t1133\n创人大单项立法征求意见数之最\t1134\n请稍后再聊\t1135\n小意达\t1136\n蠢事\t1137\n陪聊堡\t1138\nssudaggfdsghfuhjdhhhhfgjjjhfhuhfdhfhfghhfghgfhhgguygufgyggufgygfuufggygtyuyygtyuuuufdfutfd\t1139\n说点掏心窝子的话给我\t1140\nGdhcgc\t1141\n立法会\t1142\n45484254534554\t1143\n猎物\t1144\n一七四八\t1145\n蠢人\t1146\n新活\t1147\n屈泽奇\t1148\n233333333333333333333333333333333333333333333333333333333333333333\t1149\n王汉宁\t1150\n椽蓬\t1151\n李默然\t1152\ndosic\t1153\n皇后娘娘我叫皇后娘娘\t1154\n七七五八万\t1155\n吓死我了那你\t1156\n佟大鹏\t1157\n张不柏\t1158\n抱抱爱\t1159\n鲁鲁\t1160\n行刚\t1161\n白静\t1162\n二十七岁\t1163\n牛蝇\t1164\n行列\t1165\n捧腹\t1166\n虹口\t1167\n快乐大本营和欢乐喜剧人\t1168\n在一起来\t1169\n几千岁\t1170\n傻妞\t1171\n上等\t1172\n扭青\t1173\n不再大意哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t1174\n美颜\t1175\n明早我很讨厌你你\t1176\n18858231311\t1177\n美达达\t1178\nkiki\t1179\n国儿\t1180\n音乐家\t1181\n申沃\t1182\n做情\t1183\n过两年\t1184\n好奇妙\t1185\n我的话吗你是谁你是谁你是谁我是你爸\t1186\n婆粥\t1187\n再见晚点和你聊\t1188\n未真\t1189\n王汪汪汪喵喵喵叽叽叽叽\t1190\n三十虚岁\t1191\n128882dd\t1192\n绝孕\t1193\n绝孙\t1194\n高楼大厦\t1195\n山区\t1196\n次日22时40分\t1197\njif\t1198\n吕教授\t1199\njie\t1200\n气质\t1201\njia\t1202\njin\t1203\n83582388\t1204\n维泰尔kiki\t1205\njij\t1206\n2cn\t1207\njih\t1208\njii\t1209\njiv\t1210\n冰城\t1211\njit\t1212\njiu\t1213\n两会\t1214\njis\t1215\n我骗你了我不骗你我问你\t1216\ntulv\t1217\n做做做\t1218\n感觉器官\t1219\n董丽丽\t1220\njiy\t1221\n王艳红\t1222\n愧对\t1223\n第一级\t1224\n3百大板\t1225\nhiigub\t1226\n痘汤\t1227\n亮出\t1228\n大地隧\t1229\
n第一线\t1230\n托生\t1231\n净宇\t1232\n咯头\t1233\n\\\t1234\n窝多情\t1235\n寒颤\t1236\n王金璐\t1237\n熊时锟\t1238\n嘿度\t1239\n我是你的尔康\t1240\n第十几名\t1241\n杜海涛\t1242\n国际航站楼\t1243\n马紫雁\t1244\n康日新\t1245\n加热\t1246\n同性爱\t1247\n尿潴\t1248\n说不谈\t1249\nbIue\t1250\n尿毒症\t1251\n恶寒\t1252\n大桥下面\t1253\nMuerto\t1254\n死窝\t1255\nNnkk\t1256\n残舞整盛五关甜\t1257\n九不紧不慢\t1258\n海燕\t1259\n中海油三巨头\t1260\n括爽\t1261\n镇海小学\t1262\n2月15日\t1263\n犯科\t1264\n14斤\t1265\n不让\t1266\n章泊钦\t1267\n堆火\t1268\n念念不忘\t1269\n出手\t1270\n犯错误\t1271\n遮住\t1272\n巴不得不不不不不不不不不不不不不不不不不不不不不不不发那么多的可不不不不子我还好意思\t1273\n即使\t1274\n饿极了\t1275\n2亿生\t1276\ndbddbd\t1277\n迫切性\t1278\n千斤\t1279\n龙田\t1280\n雅露雪\t1281\n吓走\t1282\n张思凡\t1283\n南红\t1284\n表妹\t1285\n呀呀呀呀呀呀呀呀\t1286\n近两年\t1287\n不计\t1288\n缝隙\t1289\n华发\t1290\n群体\t1291\njjat\t1292\n龙生\t1293\noyotigi\t1294\nFT社评\t1295\n据此\t1296\n捞起\t1297\n看不像\t1298\n好不要脸\t1299\njjah\t1300\n你是人吗你是猫吗你是狗\t1301\nusggzjsskheg\t1302\n塞寒天\t1303\n粉鲍\t1304\n00：00\t1305\n1000万美元\t1306\n兄弟们\t1307\n理人\t1308\n留钱\t1309\n23.7亿\t1310\n25832833\t1311\n翻新\t1312\n喜羊羊与灰太狼闯世界\t1313\n一冠\t1314\n不九\t1315\n饿觉\t1316\n火星游乐场\t1317\n暖暖环游世界\t1318\n药通\t1319\n起始\t1320\n基点\t1321\n嗯大王\t1322\nｓｈｕａｎｇ\t1323\n求抱\t1324\nq萌\t1325\n藏学\t1326\n机械男\t1327\n残废\t1328\neyefdjru\t1329\n3小时左右\t1330\n那不呵呵\t1331\n理事\t1332\n手忙脚乱\t1333\n理亏\t1334\n尤文阿森纳利物浦\t1335\nshemaydo\t1336\n台湾地区\t1337\n迷幻感\t1338\n66886883051234567890\t1339\n天行道\t1340\n咸蛋酥\t1341\n扁担\t1342\n顾鑫磊\t1343\n第一咪\t1344\n胡茜\t1345\n采矿权\t1346\n史显宰\t1347\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t1348\n灵犀一点通\t1349\n资源性\t1350\n日产轩逸\t1351\n抗老\t1352\n赵章成洋\t1353\n2740824034\t1354\n棒彩\t1355\n动态平衡\t1356\n胡茬\t1357\n见你好开心\t1358\n谬论\t1359\n行一个人\t1360\n猜拳\t1361\n何炅\t1362\n三六幺\t1363\n别何曾行样\t1364\n显而易见\t1365\n破绽\t1366\n宋城花\t1367\n屈尊\t1368\n吽红\t1369\n奉化市统战部\t1370\n枷锁\t1371\n逗比牌女神三山寨\t1372\nCQ岑参差恩信心\t1373\n单眼皮\t1374\n周吃\t1375\nK673次\t1376\n172438\t1377\n学术报告\t1378\n孙娜\t1379\n非遗传\t1380\n秃子\t1381\n睡克\t1382\n睡先\t1383\n破给\t1384\n侠之小\t1385\n蒽蒽\t1386\n速记\t1387\n贾汪\t1388\n祝英台\t1389\n4200余万元\t1390\n孟维佳\t1391\n小子夜\t1392\n听起来\
t1393\n二比三比四比五\t1394\n这件事次\t1395\n笨手笨脚\t1396\n快青\t1397\n民众\t1398\n纷散\t1399\n很高兴唱\t1400\n奥尼尔\t1401\n心跳舞\t1402\n杨宝杰\t1403\n优酪\t1404\n并求\t1405\n民企\t1406\n中终\t1407\nhgdjcjak\t1408\n塞琳娜\t1409\n30斤\t1410\n靓妹\t1411\n李静萌\t1412\n简餐\t1413\n金虎\t1414\n左槽\t1415\n平遥\t1416\nLOMO\t1417\n和音\t1418\nx20x22xx2\t1419\n中继\t1420\n箭服\t1421\n德基广场\t1422\nxfhhx\t1423\n诺基\t1424\n诺培\t1425\n為甚麼凡客還沒\t1426\n帝医风\t1427\n七分卷\t1428\nstometh\t1429\n亲封\t1430\n巨丑\t1431\nfriction\t1432\n雍正\t1433\n不学不学不求求求求学会唉不理你了坏蛋\t1434\n忘我想\t1435\n100多双\t1436\n容纳\t1437\n到老\t1438\n度秘度秘a\t1439\n炒掉\t1440\n男半女\t1441\n巨为\t1442\n首版\t1443\n苍蝇\t1444\nsb们\t1445\n国家安全\t1446\n永远永远永远永远永远\t1447\n怕谁知道\t1448\n不要回来\t1449\n首片\t1450\n牛杂店\t1451\n段绍川\t1452\n广电电气职工股\t1453\n亲就\t1454\n天鹅堡\t1455\n6487648769\t1456\n暮光之城\t1457\n两天天\t1458\n倒谈\t1459\n贝鹿\t1460\n大们\t1461\n打省略\t1462\n速产\t1463\n给我曹\t1464\n嘛带\t1465\n道家\t1466\n灰机\t1467\n一百二十八块\t1468\n宣萱\t1469\nhoups\t1470\n挨家挨户\t1471\n殷家湾\t1472\n你是我我是你\t1473\n舰队\t1474\n菜花病\t1475\nurrpk\t1476\n灰服\t1477\n诉你\t1478\nftzvvz\t1479\n曾度\t1480\n轴距\t1481\n顾明拳\t1482\n15279112058\t1483\n猪驴王八\t1484\n福利费\t1485\nYES\t1486\n一千二百多\t1487\n瓢太横\t1488\n15点50分\t1489\n实德\t1490\n侄女儿\t1491\n河古庙\t1492\n草草草\t1493\n胆识\t1494\n好作\t1495\n毛乾艺\t1496\n[din推撞\t1497\n吼叫\t1498\n一拥而上\t1499\n8547\t1500\n珠珠\t1501\n4DD4七D\t1502\nbyeby\t1503\n金箍咒\t1504\n我是说女秘书频道秘书\t1505\n子夜\t1506\nNjjgd\t1507\n来头\t1508\n热身赛\t1509\n喵喵喵达\t1510\n郝星宇\t1511\n蜕变\t1512\n90年\t1513\n来天\t1514\n好使\t1515\nhttpghiphotosbaiducomxiaodupicitem4afbfbed\t1516\n乐视视频\t1517\n伤神\t1518\n大任\t1519\nfis\t1520\n宿州市\t1521\n学院\t1522\n幸福的家\t1523\n丰腴\t1524\nHis\t1525\n错不小心\t1526\n双身\t1527\n伴手礼\t1528\n郑钧秘\t1529\nHiu\t1530\nlěi\t1531\n更需要\t1532\nfit\t1533\n劣根性\t1534\n一寸\t1535\n一对\t1536\n嘉富\t1537\n十八家\t1538\n瘦人\t1539\n郡主\t1540\n猪撞树\t1541\n来日鱼\t1542\n不切实际\t1543\nhuggg\t1544\n九世之的你好\t1545\n预购#F3\t1546\nfffffjklfw\t1547\n尹丽丽\t1548\n托神\t1549\n梁剑东\t1550\n妖尾\t1551\n张东路123号\t1552\n力盟店\t1553\n方明\t1554\n苟城燕\t1555\n虹之玉\t1556\n疼数\t1557\n6月32004年\t1558\n包包儿\t1559\n最后一遍\t1560\nfia\t1561\n上个月\t1562\n点到为止\t15
63\n长途跋涉\t1564\n接不到\t1565\n街道办事处\t1566\n唉嘿\t1567\nyzy\t1568\n才信\t1569\n乱乱乱听\t1570\n金砖\t1571\n第二页\t1572\n广阔\t1573\n优量\t1574\n醉意\t1575\n专有\t1576\n上你的价\t1577\n乔治·克鲁尼\t1578\n亲爱的人民\t1579\n幼教\t1580\n第二项\t1581\nWOshi\t1582\n憐香惜\t1583\n通盘\t1584\n刘佳义\t1585\nfin\t1586\ntaunttsiys\t1587\n王献成\t1588\nloxir\t1589\n波尔多一二级庄\t1590\n广阳\t1591\n中国人寿保险\t1592\n匀速运动\t1593\n你好美妞\t1594\n任正非\t1595\n刘佳乐\t1596\n下下星期考试\t1597\n异世\t1598\n88888895\t1599\nffjjxzdtthh\t1600\n看不起\t1601\nlulokun\t1602\n年间\t1603\n头干\t1604\n可望其项背\t1605\n絮状\t1606\n头年\t1607\n强电解质\t1608\n交起来\t1609\n有你有真国宝雄兵\t1610\n一百分之一\t1611\n好血\t1612\n西普\t1613\n雷天\t1614\n肯恩\t1615\n杜有\t1616\n一百多年前\t1617\n秒聊\t1618\n好行\t1619\n再见了再见了再见了再见了啦啦啦啦啦啦啦啦啦啦啦\t1620\nn。2\t1621\n7777\t1622\n亚布力雪场\t1623\n当m0\t1624\n张子慧\t1625\n疑窦\t1626\nvfudhebfh\t1627\n獠牙膏\t1628\n史诗\t1629\n兰生\t1630\n1GJiJinNin\t1631\n兴安盟\t1632\n多美丽\t1633\n郭柏林\t1634\ndidjteicb\t1635\n底边\t1636\nBBCBD\t1637\n川岛町\t1638\n汉娜\t1639\n汉威\t1640\n为富不仁\t1641\n无有钱\t1642\n啊撸\t1643\n丫丫咿呀咿呀咿呀咿咿呀\t1644\n37毫米\t1645\nPre3\t1646\n贩毒\t1647\n干点儿\t1648\n丿情\t1649\n细语\t1650\n吴大象\t1651\nsbshb\t1652\n哈罗公学\t1653\naiwugxhzhsj\t1654\n在片\t1655\n叹气\t1656\n独乐乐圩村\t1657\n细读\t1658\n爸爸不在家你可以求求你了\t1659\n不要脸臭不要脸臭不要脸\t1660\n会堂\t1661\n逸姐\t1662\nbeing\t1663\n嗯柳林风声\t1664\nDiesel\t1665\n乜嘢\t1666\n数十家\t1667\n年氏渤\t1668\n恭候\t1669\n摩挲离石一共嫂才饭煲样的了哪个筷子塘北堡我说扫楼扫扫扫\t1670\n浩龙\t1671\n尘秽\t1672\n徐傅\t1673\n划分\t1674\n额废话\t1675\n男人味\t1676\n抢位\t1677\n大爷爷\t1678\n自由度\t1679\n马金雷\t1680\n移動\t1681\n真的假的别骗了\t1682\n狂犬疫苗\t1683\n颜美人\t1684\n9月30\t1685\n哈雷啵\t1686\n家庭座机\t1687\nudfgh\t1688\n身世感兴趣\t1689\n啦啦啦啦啦啦你\t1690\n玛莎百货\t1691\n六根\t1692\n一枚枚\t1693\n仲要連\t1694\nPPTV\t1695\nistuned\t1696\n双引号\t1697\n虎哭天\t1698\n山沟沟\t1699\n飞机杯\t1700\n逗鱼\t1701\n六株\t1702\n头皮\t1703\n199102\t1704\n回答会\t1705\n整式\t1706\n给她我没有\t1707\n国王的演讲\t1708\niyhuhsgi556936\t1709\npiplob\t1710\nGliese\t1711\n霉霉\t1712\n路局兔子物价局\t1713\n才华\t1714\n拉下\t1715\n好可爱我真\t1716\n拉丁\t1717\n哎呀你好多\t1718\n安监局\t1719\nKC#\t1720\n卿淑仪\t1721\n港湾\t1722\n赵通\t1723\n高墙上\t1724\n寻龙决免费\t1725\n广海镇\t1726\n混蛋\t1727\n好运当头\t1728\n夜华\t
1729\nSorkin\t1730\n钢铁侠\t1731\n防偷\t1732\n表示好不啦\t1733\n壮哉\t1734\n吧嗒11l里glee额吉hi\t1735\n不救市\t1736\n35.45\t1737\n素颜\t1738\n木马马\t1739\n35.4%\t1740\n曹宇姗\t1741\n耳福\t1742\ngAgs\t1743\n出來\t1744\n石平\t1745\n王梓嶂\t1746\n猜念\t1747\n访问\t1748\nQQ音乐\t1749\n不择\t1750\n朴啊\t1751\n南巡\t1752\n有机巴\t1753\n交纳\t1754\n超级女声\t1755\n爱莲说\t1756\n南川\t1757\n不拟\t1758\n不拜\t1759\n南州\t1760\n张开伟\t1761\n进化论\t1762\n不拔\t1763\n大二娘\t1764\n12厘米\t1765\n落泪\t1766\n老相好\t1767\n猜忌\t1768\n废弃物\t1769\n皮尔卡丹\t1770\n波普\t1771\n416次\t1772\n且末\t1773\n好吃饭了\t1774\n我想\t1775\n你好报\t1776\n六二零零三七二八\t1777\n波头\t1778\n昂头啸\t1779\n五分之\t1780\n家家有本难念的经\t1781\nuhhhhhh\t1782\n大鸭梨在\t1783\n退后\t1784\n途中\t1785\n首映礼\t1786\n信不信不信\t1787\n你好把\t1788\n辣椒水\t1789\n洗完澡\t1790\n罗菲哥\t1791\n赵百度\t1792\n百度科技园一号楼\t1793\n波多\t1794\n真好呀\t1795\nfvihcecjhfcjovfvfd\t1796\n全景\t1797\n2435368557\t1798\n毓婷\t1799\n相信你的话\t1800\n小恐龙\t1801\n罗明三\t1802\n快点灰太灰太灰太灰太灰太坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t1803\n精明\t1804\n大白兔糖\t1805\n434个\t1806\n不好耍\t1807\nD萨\t1808\n定丽\t1809\n一分钱\t1810\n大灰机\t1811\nwith\t1812\n那我是我是你的主人我也是你的家人\t1813\nCFM国际公司\t1814\n锦绣江山好哇好哇\t1815\n负者\t1816\n天语雅阁\t1817\n也有一个人醉\t1818\n华仔哥\t1819\n吕泽睿\t1820\n一分钟\t1821\n中国书法协会\t1822\n岱镇统一天下定许你费呗\t1823\n第157章\t1824\n尤淑雅\t1825\nYISHION以纯\t1826\nhjlft\t1827\n真个\t1828\n粑粑克拉\t1829\n血缘关\t1830\n再来木乃木乃梦\t1831\n丰田霸道\t1832\n眼前一亮\t1833\n辉煌\t1834\n灯灯\t1835\nArethey\t1836\n灯火\t1837\n真主\t1838\n异样\t1839\n真丹\t1840\n中餐馆\t1841\n真一\t1842\n真不\t1843\nkougg\t1844\nrgjffhh\t1845\n油盐\t1846\n池子\t1847\n真丑\t1848\n们班\t1849\n真丝\t1850\n442122212\t1851\nxjgygtyfggfydfg\t1852\nTiffdjnv\t1853\n牛逼者\t1854\n老天不许人\t1855\n人间\t1856\nhdbnw\t1857\n裂帛\t1858\n么来卡啊了恶露额度我呢恶露\t1859\n贷款人\t1860\nnatalie\t1861\n花那\t1862\n发展\t1863\n上厕所\t1864\n外籍\t1865\n认认真真\t1866\n生活琐事\t1867\n求指教\t1868\n十陵\t1869\n好想你好想你好想你好想你好想你\t1870\n去土眭\t1871\n18点25分\t1872\n怎么时候\t1873\n亲给\t1874\n一个多G只\t1875\nGushjugs\t1876\n喜怒哀乐的我你\t1877\n莲花山小学\t1878\n周庄村\t1879\n辉锡膏\t1880\n有也有\t1881\n峰兄\t1882\n如此如此\t1883\n右侧\t1884\n南一寺\t1885\n温有\t1886\n繼\t1887\nnoru\t1888\n画画儿\t1889\n02091417212706\t1890\nsHelensinging\t1891\n拼写\t1892\n
啦a梦\t1893\n老百姓\t1894\n正里\t1895\nThankyou\t1896\n50150526666666\t1897\nnore\t1898\n油垢\t1899\n狠心\t1900\n激烈竞技\t1901\n声雅蠛蝶\t1902\n外边儿\t1903\n蒋靖宇\t1904\n战车\t1905\n被好\t1906\n9.5米\t1907\n桶状\t1908\n张延明\t1909\n被奸\t1910\n糯米网\t1911\n咪咪咪三十么么哒\t1912\n尼玛太阳\t1913\n尼贾德\t1914\n池州铜钼矿事故抢险指挥部\t1915\n言跃\t1916\n休芝\t1917\n次生\t1918\n曹少\t1919\n四月三十号\t1920\n太发\t1921\n被套\t1922\n来试试\t1923\nsant\t1924\n瓦力哇japan\t1925\nsans\t1926\n首关喆\t1927\nporeisal\t1928\n原味儿\t1929\n太古\t1930\n改换\t1931\n排泄\t1932\n奴腻\t1933\nsand\t1934\n言路\t1935\n丹溪洞\t1936\n度你我想\t1937\nchvd\t1938\n排法\t1939\n魁\t1940\n开发者\t1941\n魂\t1942\n魅\t1943\n198\t1944\n云淡风轻触摸\t1945\n胸关\t1946\n194\t1947\n197\t1948\n196\t1949\n460点270\t1950\npokerface\t1951\n车号\t1952\n192\t1953\n你好百醇\t1954\n四种\t1955\n宋宇科\t1956\n魔\t1957\n该管\t1958\n19%\t1959\n车口\t1960\n魚\t1961\n腹中\t1962\n王颗牙\t1963\n男模们\t1964\n7嬤\t1965\n缪纶\t1966\n车友\t1967\n糊驱远古撸\t1968\nOgilvy\t1969\n周六晚上\t1970\n魺\t1971\n534364644562468535673432353532323235656468943113464976431646764326565355646416468686552\t1972\n不懂之的一个字好呀上面世界旗下明天\t1973\n唯爱SJ13]110114厉旭TW\t1974\nslmm\t1975\n356把\t1976\n四百万\t1977\n景物\t1978\n灵官殿\t1979\n四百七\t1980\n周三言\t1981\n填报\t1982\n好嗨\t1983\n方头\t1984\nhasa\t1985\n纽约时代广场\t1986\n方太\t1987\n桂林子号\t1988\n一旁\t1989\n龙窟\t1990\n初恋体验社\t1991\n总攻\t1992\n19g\t1993\nrdtug\t1994\n总政\t1995\nvvif\t1996\nhass\t1997\n一百千万亿米\t1998\n一大块\t1999\n命根子\t2000\n嘛嘛嘛嘛嘛嘛哒\t2001\nroggd\t2002\nXKf\t2003\n正乡村\t2004\n四百个\t2005\n撕开\t2006\n动力度\t2007\n邱文传\t2008\nhsvh\t2009\n女伴们\t2010\n降落\t2011\n稍后\t2012\n王往西\t2013\n感冒\t2014\naazx\t2015\n小白龙\t2016\n慈眉善目\t2017\nFeng\t2018\n度秘盒\t2019\n河肥\t2020\n明天中午\t2021\n娃兄\t2022\n冬日\t2023\ngjkite\t2024\n已知类\t2025\n私我的小贡菊\t2026\n狗男女\t2027\n连云峰\t2028\n一药名\t2029\n阿瞳\t2030\n3692581475\t2031\n张度秘\t2032\n说臭战\t2033\n824快本\t2034\n罗芸娜\t2035\n强抢\t2036\n死亡真相\t2037\n场空\t2038\n越南110507┓越南场\t2039\n走了再见了\t2040\n边里\t2041\n软哥哥\t2042\n度秘炒肉排\t2043\n意料子\t2044\n3点01分\t2045\n艹蛋啊艹蛋\t2046\n赤发\t2047\n15885201313\t2048\n谢谢了烦\t2049\ncompany\t2050\n依稀可见\t2051\nbjtd\t2052\n押姐\t2053\n才鸟\t2054\n死袅\t2055\n埴物\t2056\n简欧
\t2057\n苏绣\t2058\n简次\t2059\n你在干什么哦我是你的小伙伴\t2060\n杨亚强\t2061\n啵啵\t2062\n薄熙\t2063\n猪呢讨厌\t2064\n你真好布米多咪尼功放张照片行\t2065\n微里\t2066\n麻淡\t2067\nDhgdnftnfrhsdbf\t2068\n闫兴远\t2069\n大东亚\t2070\n岁月之歌\t2071\n鼽\t2072\n屠杀\t2073\n出火\t2074\n阿拉伯半岛\t2075\n说和你玩\t2076\n寒假干点儿\t2077\nuxn\t2078\n晚安啦晚安啦你祝福我做一个好梦吧\t2079\n蔡晓\t2080\n边子\t2081\n不管用\t2082\n五子\t2083\n赵向峰\t2084\n缓缓棋归\t2085\nhuge\t2086\ntyffrhfe52is2wjamgjKWGMPJMTJGW\t2087\n奶头\t2088\n就是了没有\t2089\n国奥足球队\t2090\n7月6号\t2091\n仙娘\t2092\nhugt\t2093\n凋落\t2094\nnitmsb\t2095\ndhgs\t2096\n超长篇\t2097\n羽墨\t2098\n小时工\t2099\nhaijia\t2100\n血衫\t2101\n暖贴\t2102\n营救\t2103\nfoyut\t2104\nueydurgidjrrhrifrhrhhrrhdurgdhdghhdudhdfhdhdhgsyehdudhhhdhdshhdn\t2105\n算试\t2106\n四季海棠\t2107\nhaon\t2108\n弱势群体\t2109\n苏苏\t2110\n猫阿姨\t2111\n算话\t2112\n也想你\t2113\n苹果合韵兰\t2114\n捣鬼\t2115\n法治\t2116\n血血\t2117\n陆恭炜\t2118\n此声\t2119\n刘朝威\t2120\n算说\t2121\n族长\t2122\nHddgvj\t2123\njjjjjjjkkk\t2124\n八岁个\t2125\n哈了你跟我说说话聊聊天\t2126\n五十吨\t2127\n这部\t2128\n义堂镇\t2129\nxhxi\t2130\n上上周\t2131\n尘尽\t2132\n哼管好\t2133\n木棺\t2134\n醃仨\t2135\n我叫花花你\t2136\n旗下\t2137\n负面\t2138\n古式\t2139\n小白羊\t2140\n木棍\t2141\n超宠\t2142\n许志\t2143\n救兵\t2144\n第九位\t2145\n尘封\t2146\n救关\t2147\nsauch\t2148\n很安静\t2149\n黄皮子\t2150\n中公发\t2151\n仁安医院\t2152\n死金金\t2153\nNBA2K\t2154\n呢重启东\t2155\n包治\t2156\n聂先生\t2157\nexo肿麽办\t2158\n张起家\t2159\n包河\t2160\n566154556\t2161\nMS-DOS\t2162\n依然到底\t2163\n品质\t2164\n优酷的妈呀你个不要脸的\t2165\n我那知道人很多数不起的也许这地球上全部都人\t2166\n给我死\t2167\n过去8周\t2168\n杨俊瑞\t2169\nPYT\t2170\ndbdib\t2171\n呵呵哼\t2172\n手联弹\t2173\n爱奇艺视频\t2174\ngrstinda\t2175\n瓶儿\t2176\n荣民\t2177\n不要说话\t2178\n鸡家湾\t2179\n拉拉队\t2180\n谢志情\t2181\n玩博\t2182\n影迷\t2183\n丧丧\t2184\nJIMI\t2185\n朱琦雯\t2186\n胥凌霜\t2187\n15块\t2188\n水果室\t2189\n唉热\t2190\nhvkvx\t2191\n4040\t2192\ncancelit\t2193\n夫君主\t2194\n表坛\t2195\n张飞银\t2196\n丧业\t2197\nuuuuhbcfef\t2198\nhellotmeouto\t2199\n刘有\t2200\nReport\t2201\n臭恶\t2202\n第1013\t2203\n奇谈怪论\t2204\n坏坏男\t2205\n886.2亿元\t2206\n一系一系一系一系\t2207\n余小兰\t2208\ntsuyo\t2209\n童帅\t2210\n鸭翼飞鱼\t2211\n巨子们\t2212\n甜品业\t2213\n安联\t2214\n例假\t2215\n4.8025\t2216\n吧擦擦擦擦擦擦擦擦擦擦擦擦\t
2217\n惜翩翩八王\t2218\n八14点\t2219\n马蹄肾\t2220\n阿拉蕾\t2221\nstomityouristmetouto\t2222\n万事俱备\t2223\n氏海北\t2224\n冬儿宝\t2225\n宋云\t2226\n关岛\t2227\n和毫升\t2228\n比汉斯\t2229\n现状离\t2230\n路亚\t2231\n战地行\t2232\ngguug\t2233\n合江\t2234\n搜噶\t2235\nxiongmao\t2236\n腾蛟\t2237\n矜持\t2238\n卡莱雅\t2239\ny04\t2240\n通胀试比\t2241\n路人\t2242\n合污\t2243\n风湿坨\t2244\n385米\t2245\n妖冶\t2246\n加拉卜\t2247\n特锐德\t2248\n啦集体经济\t2249\n可以说笑\t2250\npsps\t2251\n拯救生命的拥抱\t2252\n簇\t2253\n户改\t2254\n猥琐男\t2255\n数十载\t2256\n8斤\t2257\n黎叙\t2258\n怎么装\t2259\n林子仟\t2260\n灶糖灶饼\t2261\n5.9%\t2262\ndtttd\t2263\n爆米花儿\t2264\n哲学家\t2265\n丑大\t2266\n非么么哒\t2267\nSHift\t2268\n奥里哦\t2269\n大康砦\t2270\n海曙样\t2271\ndddssffdd\t2272\n郭文杰\t2273\n墨汁\t2274\n三百年前\t2275\nhjkbc\t2276\n猜猜嘛好玩儿\t2277\n李平\t2278\n李尚荣\t2279\n会聊\t2280\n公映许可证\t2281\n你儿\t2282\n翠竹\t2283\n一棵\t2284\n芬奇\t2285\n李广\t2286\n黑招牌\t2287\n无厘头\t2288\n头发\t2289\n公元1565年\t2290\n度秘你心度还是秘戏秘\t2291\nJQnskls\t2292\n恶毒\t2293\n3日\t2294\n设计师\t2295\n早上9点\t2296\n一棒\t2297\n潜能\t2298\n89级\t2299\n画展\t2300\nfyftt1\t2301\n一棍\t2302\n百八十八\t2303\n陈龍耀\t2304\n斗志昂扬\t2305\n你在干嘛那小东秘你在\t2306\n床位费\t2307\n轴器\t2308\n交战\t2309\n怀不了孕\t2310\n飞甲\t2311\n昌明珠\t2312\n交成\t2313\n一直都在\t2314\n没有问题\t2315\nsiece\t2316\n实名认证\t2317\n1300000\t2318\n憋气\t2319\n圣殿\t2320\n韭菜包子\t2321\n安琪\t2322\n交房\t2323\n兔同笼蜘蛛\t2324\n库存量\t2325\n够来斯够\t2326\n五十千克\t2327\n皓轩\t2328\n恩事\t2329\n小珠珠\t2330\nflucjpviogofp\t2331\niijjr\t2332\n小梅西\t2333\n海灵化学\t2334\n打酱油\t2335\n6小时\t2336\n乌克兰\t2337\ncixm\t2338\n1100岁\t2339\n荣辱\t2340\ngcdMs\t2341\n蓝雨馨\t2342\n二奎亲\t2343\n梦回\t2344\n环比\t2345\n戴六君\t2346\nstuffis\t2347\n何止\t2348\ncixu\t2349\nBTV影视频道\t2350\n杂乱无章\t2351\n董久明\t2352\n梦园\t2353\n要不你\t2354\n126条\t2355\nfjzz\t2356\nFUCKYOUMOM\t2357\n去年八月一\t2358\n多来梦\t2359\n无知者无\t2360\n依噶\t2361\nm2abura8esiabcdefghjklnopqrstuv\t2362\n疲力尽力\t2363\n555555555555555\t2364\n真邪恶\t2365\n老化\t2366\n栗青山\t2367\np个\t2368\n张译\t2369\n风一吹\t2370\nhaos\t2371\n张话\t2372\n压缩\t2373\n亲我吧爱我\t2374\n小浪\t2375\n牌坊下的女人\t2376\n佐罗早\t2377\n我的公主\t2378\n我是你的丫鬟小翠花\t2379\nTGTTYRTYTTUYHHYDE\t2380\ncumvent\t2381\n别作孽说图加挡泥板\t2382\n泳池\t2383\n母
贴\t2384\n神经麻痹\t2385\n岔路口\t2386\n阴道\t2387\nsects\t2388\n死僵王\t2389\n885464\t2390\n85.92%\t2391\n秘卷\t2392\n我是一只小小小小鸟想要飞也飞飞飞不干啊我寻亲咪咪亲亲咪咪\t2393\n水边\t2394\n樊江林\t2395\n问块\t2396\n戏霸\t2397\n无脑天来\t2398\n李春艳\t2399\n嗯mma2\t2400\n白茶\t2401\n扇不死\t2402\n75758575555555655255555\t2403\n我是不是我我是晚上八电视无我诈我身七点水\t2404\n木夫妻\t2405\n葛亮\t2406\n熊熊\t2407\nKBS2\t2408\n连体裤\t2409\n侧面\t2410\n您好不敏\t2411\n盘面\t2412\n中长发\t2413\n婆马\t2414\njkkd\t2415\n太公公\t2416\npH值\t2417\n秀娜\t2418\n喵哈哈\t2419\n厄比斯\t2420\n经手\t2421\n杨过\t2422\n我骗你的我叫\t2423\n晴川历历汉阳数芳草萋萋鹦鹉洲\t2424\n8522556241\t2425\n心地善良的人\t2426\n真武殿\t2427\n30日早晨\t2428\n445\t2429\n漏电\t2430\n我爱的发现你好自恋\t2431\n四通八达\t2432\n小白勺\t2433\n小龙省\t2434\noutlookcom\t2435\n契税\t2436\n霓虹人\t2437\n欧芬\t2438\n伊丽莎白·比尔斯\t2439\n香港6座\t2440\n不发飙\t2441\n数一家\t2442\n欧芹\t2443\n汉沽\t2444\n1962年\t2445\n殷鹏飞\t2446\n琉吗米六八\t2447\n三个手\t2448\n相守\t2449\n大小码\t2450\n暴贾\t2451\n罗战国\t2452\n18angelababy\t2453\n郑雅玲\t2454\nfdewefd\t2455\n二郎腿\t2456\n下个星期三\t2457\n零八八八八岁\t2458\nschouo\t2459\n下个星期一\t2460\n税率\t2461\nindleitiao\t2462\n布娃\t2463\ngehbgdsi\t2464\n晚10点\t2465\n红雪\t2466\n复写纸\t2467\n尼桑\t2468\n邓珺岚\t2469\n双方\t2470\n很长时间\t2471\n红雷\t2472\n44%\t2473\n合脚\t2474\n十二煞煞煞\t2475\n吕子乔\t2476\n赵劲豪\t2477\n乙肝\t2478\n存储卡\t2479\n湖山小学\t2480\n张奇\t2481\n15164923661\t2482\n偿还\t2483\n别理我我讨厌你\t2484\n天子\t2485\n四点半\t2486\nlllberighthere\t2487\n天字\t2488\nqfbdhcknfndMatybryNdhntuhmf\t2489\n中生日\t2490\n国土房管局\t2491\n噜噜噜噜噜噜噜\t2492\n42432342555\t2493\n违背\t2494\n潜江省\t2495\n2576565617\t2496\n好好笑\t2497\n押犯\t2498\n歪脖\t2499\n#薇\t2500\nhttppinyincn1vS0zlptuLz\t2501\n你把你你你你你\t2502\n9751\t2503\n说我真的不说\t2504\n2860461\t2505\n罗婉婷\t2506\n不含有\t2507\n班的女孩\t2508\n蔚姐\t2509\n聊天软\t2510\n9点50分\t2511\n順利\t2512\n饶晴风\t2513\n机器度总我爱你\t2514\n疑惧\t2515\n走马街\t2516\n衡阳\t2517\n池园\t2518\n刘玉范\t2519\n桂林市公安局\t2520\n陶然\t2521\n出布\t2522\n校友会\t2523\n唯一性\t2524\n秦你给我唱一首星星泪\t2525\n出师\t2526\n深点\t2527\n甲乙类\t2528\n那羊\t2529\n4hfhrbdh\t2530\ny度秘\t2531\n锁骨\t2532\n上海外滩\t2533\n那么多很乖\t2534\nidjftc\t2535\n暖心\t2536\n九点四十分\t2537\n约约\t2538\n刘小瑞\t2539\nknn\t2540\nknm\t2541\n嘴干巴\t2542\n扯西拉\t2
543\n直角形\t2544\n看看着\t2545\nknf\t2546\nknd\t2547\nkne\t2548\nknb\t2549\n吧王\t2550\n东平机场\t2551\n100多个\t2552\n余子琪\t2553\n到來\t2554\n耍酷\t2555\niruy\t2556\nknv\t2557\n战士单\t2558\n先多牛吗一个城堡里朵图图\t2559\nknp\t2560\n一一直\t2561\npjgkga\t2562\n许有\t2563\n斤骨\t2564\n米多奇\t2565\n全场\t2566\n判给\t2567\n[怒\t2568\n佟美\t2569\n黑瞳\t2570\n小花儿\t2571\n再说再见\t2572\n李俊希\t2573\nufd2\t2574\n恐怖游泳馆\t2575\n绝世小受\t2576\n綏\t2577\n名山县\t2578\n小男子汉\t2579\n上设\t2580\n修缮\t2581\n操之过急\t2582\nlady\t2583\n轻动\t2584\n我不是你的宝贝宝贝你不爱你\t2585\n把酒\t2586\n黑瞎\t2587\n狡猾\t2588\n19761976718\t2589\n木错\t2590\n5841111111111111111111111111\t2591\n聪度\t2592\nvgx\t2593\n4到5小时\t2594\n故意\t2595\n慢车\t2596\nvgy\t2597\n南太平洋证\t2598\n未完成\t2599\n好大好\t2600\n千真万雀\t2601\n好的影视学院\t2602\ncouple\t2603\n多玩\t2604\n相不相信\t2605\n双了再说\t2606\n卢沟桥路\t2607\n昏暗\t2608\n范巴斯滕\t2609\n旗帜鲜明\t2610\n太势利\t2611\n札一\t2612\n又片\t2613\n挂沙热&amp\t2614\n奇怪的圣诞包裹他的奇怪在\t2615\n一三八\t2616\n武装\t2617\niphone4\t2618\n50008888888\t2619\n升仙\t2620\n比武\t2621\n滑爽\t2622\n成一张\t2623\n逆着你就不懂了\t2624\n颜冰\t2625\n带笑\t2626\nmdjkd\t2627\n自制\t2628\n不要不赖账\t2629\n切克闹\t2630\n较大\t2631\nyujnjo\t2632\n马长\t2633\n可牙\t2634\n周旋堡\t2635\n博文\t2636\n太不好\t2637\n耶连\t2638\n自创\t2639\n陈雨奇\t2640\n烧壶\t2641\n开错\t2642\n可物\t2643\n易发\t2644\n泸水\t2645\n较多\t2646\n水粉画\t2647\n开销\t2648\n开锁\t2649\n突起\t2650\n罗雨璇\t2651\n想气\t2652\n中华民谣\t2653\n屏幕\t2654\n自分\t2655\nGongfei\t2656\n温泉度假村\t2657\n禹哲\t2658\n18cc\t2659\n胗雅珍\t2660\n妄为\t2661\nJHL\t2662\n与时空交错\t2663\nYouareodd\t2664\n18cm\t2665\n雨势\t2666\niGTI\t2667\n炉下镇\t2668\n刘宋篡国\t2669\nHello\t2670\n崔凯阳\t2671\n小兔才大姐一会儿咩钱系男人吾修神\t2672\n磨叽\t2673\n在这儿\t2674\n如东县\t2675\n康阿娟\t2676\n1gq\t2677\n锁喉\t2678\n考科\t2679\n菲林\t2680\nsecurity[酷\t2681\n1gg\t2682\n1gc\t2683\n芝麻深奥的问题我是电影小达人不如问我有幸\t2684\n肖甫光\t2685\n磨口\t2686\n个溅\t2687\n想不起来\t2688\n让我走\t2689\n110308KTR\t2690\nisoraa\t2691\n别经年\t2692\n七十块\t2693\nyaaahout\t2694\n穿越火线狂龙\t2695\nDhjuhgkj\t2696\n换位\t2697\n西沙群岛\t2698\n蔡通达\t2699\n迟到\t2700\nNBA洛杉矶湖人\t2701\n定胜天\t2702\n植物大战僵尸原始版\t2703\n脆爽\t2704\n市北区\t2705\n豁出\t2706\n女子女\t2707\n独眼怪\t2708\n邮局女\t2709\n北京南\t2710\n晚归\t2711\n
fuxhrud\t2712\n培优\t2713\n你是我的孙子度秘\t2714\n受累\t2715\n迈腾B7L\t2716\n旧\t2717\neeeeeeeeeeeeeeeee\t2718\n富裕\t2719\n王兔子\t2720\n给我劲\t2721\n马路上的们\t2722\n網\t2723\n丹霞\t2724\n倒忙\t2725\n龙城队\t2726\n吧台\t2727\n召妓\t2728\n阿乐石\t2729\n湖南省\t2730\n秘书吧\t2731\n吧友\t2732\n池沼\t2733\nAwesome\t2734\n王梓熙\t2735\n蜥蜴\t2736\n三十九岁\t2737\n刘小姐\t2738\n爱上你的人\t2739\nffdrwewrdffgdfg\t2740\n剪不调\t2741\n拜金\t2742\n可了以乎\t2743\n名烟\t2744\n赛改天\t2745\n幺八五零二六\t2746\n端点\t2747\n光指\t2748\n但饃\t2749\n总汇报\t2750\nTwins\t2751\n天爷\t2752\n王明叶\t2753\n天父\t2754\netout\t2755\n别不开心了我给你当我请你看时代\t2756\n秦梓豪\t2757\n7点14分\t2758\n得意洋洋\t2759\n魏生津\t2760\n丸子头\t2761\n睡了觉\t2762\n穿针\t2763\n6494944\t2764\n325000米\t2765\n张一我\t2766\n懂得珍惜\t2767\n刺金\t2768\n完全\t2769\n董雨溪\t2770\n哒哒哒萌\t2771\n零七零三\t2772\n监事会\t2773\nxxoobj…edapwp\t2774\n吃边走\t2775\n不爱吃\t2776\n圣代\t2777\nhxcf\t2778\n我没我一个样\t2779\n配位物\t2780\n掏出\t2781\n辰光\t2782\n废水\t2783\n欧战方\t2784\n声部\t2785\ngjffojglykyhgujbfkdjgjxjgjv\t2786\n嗯8岁\t2787\nbyebyetome\t2788\n二预付费大法\t2789\n不爱听\t2790\ndwzcn2rXGIR\t2791\n欢乐喜剧人\t2792\n爬走\t2793\n雀b977\t2794\n3214558566\t2795\n金起范Twitter\t2796\n废气\t2797\n300千米\t2798\n佔頭\t2799\n吕苏\t2800\n美加净\t2801\n文宇\t2802\n抹黑\t2803\n文安\t2804\n调蛋捣蛋\t2805\n张李念\t2806\n多国\t2807\n苏教版\t2808\n小梓轩\t2809\n俪妃\t2810\n你的姑娘\t2811\n文昌市\t2812\n我问黎上什么样\t2813\noＡo川\t2814\n文宝\t2815\n小秋子\t2816\n才子包\t2817\n经验\t2818\n我好讨厌我妈\t2819\n辗作尘\t2820\n一四四十二\t2821\n多四\t2822\nfffvmcj\t2823\n一百零五\t2824\n安踏\t2825\n黄飞鹿\t2826\n从业者\t2827\n一百零二\t2828\n殉\t2829\n星巴克咖啡店\t2830\n哼坏\t2831\n多囊\t2832\nvsmsfffjh\t2833\n2002年02月13日\t2834\n岁时\t2835\n高小溪\t2836\n逼近\t2837\n繁讨厌\t2838\n小玉玉\t2839\n想你了再见\t2840\n段\t2841\n可享\t2842\n春街\t2843\n吗行\t2844\n犀牛甲虫蛴螬\t2845\n清理\t2846\n可亲\t2847\n耗儿\t2848\n可人\t2849\nrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr\t2850\n赌球\t2851\nkanse\t2852\n冒用\t2853\n湘潭市公安局刑侦支队\t2854\n90年代初\t2855\nboirdi\t2856\n墨水味\t2857\n佳栋\t2858\n挂记\t2859\n591\t2860\n590\t2861\nOTZee\t2862\nsut\t2863\n597\t2864\n596\t2865\n599\t2866\n598\t2867\n单总\t2868\n周三美\t2869\n太萌太二\t2870\n丧气\t2871\n59%\t2872\n宝建\t2873\n田俩\t2874\n我的心事\t2875\n弥勒佛\t2876\n春游金鑫给你我
很相信你三里屯店还给你我好想好想你没几天亲亲\t2877\n邓帅\t2878\nsus\t2879\n躬行践履\t2880\n缅甸航空公司\t2881\n猫河蟹场\t2882\n逗笑\t2883\n1000MB\t2884\n第四个\t2885\n干细胞\t2886\n核能\t2887\n公款\t2888\niWatch\t2889\n只是度\t2890\n多米就是我我就是图你\t2891\ndevicescsiulsata21\t2892\nibib\t2893\n转票\t2894\n萧山医院二号楼的三楼抗癌协会\t2895\n察找\t2896\n邓慧敏\t2897\njoyou\t2898\n說嗎\t2899\n刻不容缓\t2900\n说客气\t2901\nadmw\t2902\n金美\t2903\n感情世界\t2904\n瓢虫\t2905\n加湿器\t2906\n杨悦\t2907\n德行\t2908\n难不难\t2909\n沙特\t2910\n内蒙\t2911\n气候\t2912\n转神\t2913\n1725839455555588565\t2914\n足球\t2915\n脸脸脸脸脸脸\t2916\n听歌\t2917\n我秘问你件事儿\t2918\n沙博尔\t2919\n伍铜\t2920\n泰舒\t2921\n家谱\t2922\n邪乎\t2923\n托伯\t2924\n斯巴鲁\t2925\n家谷\t2926\nfeeacv\t2927\n莱山\t2928\n骑士\t2929\n丽娜\t2930\n库雷斯\t2931\n春之p\t2932\n踞员\t2933\n李康婧\t2934\nhanako\t2935\n游览器\t2936\n汗太\t2937\n饿了拜拜\t2938\nYomypeeweeup\t2939\nyourdadiswho\t2940\n多色\t2941\n托伊\t2942\n武日语\t2943\n上上条\t2944\n我不我不用\t2945\n过婚\t2946\n隐瞒\t2947\n切克闹小超哥\t2948\n多咪眯\t2949\n吴保情\t2950\n仅限于\t2951\n杨吧\t2952\n色网\t2953\n191817161514312111o9876543210\t2954\n328元\t2955\n哎嗯\t2956\n快走开\t2957\n03集\t2958\n扭扭捏捏那你斤斤计较那那那那那那\t2959\n嗯四川\t2960\n登登登\t2961\n虐心\t2962\n打哆嗦\t2963\n吉夫加夫\t2964\neterterg\t2965\n海航会\t2966\n8款\t2967\n赵琪洋\t2968\nUghcb\t2969\n大包包\t2970\n身质量\t2971\n8次\t2972\n存留\t2973\n第八位\t2974\n好吧幺\t2975\n张雨婷\t2976\n我没你唉海文言文水怪的奶粉哪的离人曲\t2977\n刘蒙\t2978\nchok\t2979\n厢县\t2980\niihhhhjngccc\t2981\n壶口瀑布\t2982\n五大类\t2983\n上城关\t2984\n你是我演的你是我的眼\t2985\n打哑谜\t2986\n不远是\t2987\n台忙\t2988\n笑一个我看看\t2989\n东山口\t2990\n唠唠\t2991\n新华字典\t2992\nRELOAD\t2993\n顺路\t2994\n丁宁\t2995\n扒光\t2996\nkgmt\t2997\n自面\t2998\n塬本\t2999\nAPU处理器\t3000\n呵冫\t3001\n985710437\t3002\n总站\t3003\n韩振涛\t3004\n开心颜\t3005\n陈涛\t3006\n救世主\t3007\n正经\t3008\n季姣\t3009\n定对\t3010\n小天才\t3011\n庸俗\t3012\n相比\t3013\n整体天\t3014\nhvvhj\t3015\n正统\t3016\n么不说话\t3017\n那些事儿\t3018\n女恩师\t3019\nutgggu\t3020\n陈涵\t3021\n博纳\t3022\n眉潭县\t3023\n双交\t3024\n墓碑\t3025\n奥运百科#\t3026\n双人\t3027\n莲花白\t3028\n双亲\t3029\n紫甘蓝\t3030\n亲爱的朋友热情相拥我真心的话开心的泪你我的心里流动啦啦啦啦啦啦啦啦\t3031\n创业板股\t3032\n花枝俏\t3033\n再见后会\t3034\n毛学校图书馆\t3035\n大西北\t3036\n呀爱\t3037\n奔口刀\t3038\nzdudz\t3039\n略胜一筹\
t3040\n空皇\t3041\n吕文录\t3042\n原来数\t3043\n旅行箱\t3044\nxpPjiu\t3045\n怨天不怨\t3046\n江清月\t3047\n事件\t3048\n下里\t3049\n少君\t3050\nEmergency\t3051\n呆掉\t3052\n搂着我\t3053\n你是我的小呀小苹果怎么爱你都不仙多哄哄的话那金花让我哦哦哦你是我的小呀小苹果\t3054\nygfyhhtu\t3055\n522786\t3056\n低热\t3057\n少同\t3058\nGmb\t3059\n低烧\t3060\n示范\t3061\n郭聪明\t3062\n少吃\t3063\n小年夜的夜晚\t3064\n800万美元\t3065\nfyty\t3066\nvcbc\t3067\n牛度秘\t3068\n明溪溪\t3069\n莜白\t3070\n奥运季#\t3071\nhhhnmmkougddry\t3072\nAUGSTEIN\t3073\n500多分\t3074\nfytc\t3075\n刘世民\t3076\n321456999\t3077\n雁过\t3078\n7来\t3079\n脑体\t3080\n雅文\t3081\n悬崖边\t3082\n物色\t3083\nTYG\t3084\n囡囡片\t3085\n钟面\t3086\ner二十五六十二八十三一百中\t3087\n扫街\t3088\n来了我\t3089\n杨增强\t3090\n9087654323123445566667877899年\t3091\n攒劲\t3092\n二十根\t3093\n三十倍\t3094\nBAFTA\t3095\n呸\t3096\n明若\t3097\n拜托王\t3098\n职责\t3099\n红心蛋\t3100\n四天还剩下全书的十一分之三\t3101\n我可讨厌你你个臭\t3102\nbooks\t3103\n一站你在说这么难的问题信不信我一个豆腐砸死你\t3104\n啦啦啦啦啦啦啦啦\t3105\n神器\t3106\n撼树\t3107\n本意\t3108\n历吏\t3109\nashevsa\t3110\nyci\t3111\n奥安康\t3112\n小猪头\t3113\n┻━┻\t3114\n450张\t3115\n婆娘\t3116\n元旦快乐\t3117\nfigGHG\t3118\nrggdgrggn\t3119\n二一岁\t3120\n婆娑\t3121\n中国高级法院\t3122\n呲\t3123\n怅惘\t3124\n不完了\t3125\n郭富全\t3126\n知行合一\t3127\n呵\t3128\nKEJOXOIEFBZNN\t3129\nloogtimenosee\t3130\n加元\t3131\n新生力量\t3132\n一年一度\t3133\n代言费\t3134\nryrvlrliv\t3135\n无限长\t3136\ndhjdcjg\t3137\neJJCd\t3138\n一点么\t3139\nzheihao\t3140\ncould\t3141\n还度秘\t3142\n杨天雪\t3143\n体味\t3144\n一点也\t3145\n六九五二二零四\t3146\n加入\t3147\n呀多一点\t3148\n猪儿猪\t3149\n长信心\t3150\n跨行业\t3151\n高二\t3152\n炫黑\t3153\n高于\t3154\n我们两个结婚吧我爱你\t3155\n里衣\t3156\n拆机\t3157\n大三巴步行街\t3158\n133块\t3159\n暗黑族\t3160\n甲基橙\t3161\n男主角\t3162\n少妹子\t3163\n孙亿林\t3164\nertfhhfh\t3165\n高五\t3166\n马拉比\t3167\n二二二三\t3168\n恨你狗吊\t3169\n宣、闵\t3170\n车胜元\t3171\n秘黑\t3172\n高亮\t3173\n呢\t3174\n王泽平\t3175\n段儿\t3176\n高产\t3177\n广州省\t3178\n大大的拥抱\t3179\n高亢\t3180\n针眼\t3181\nnnn0\t3182\n还算什么\t3183\n高人\t3184\n方方面面\t3185\n班车子\t3186\n将心比心\t3187\n裴如海\t3188\n第三和\t3189\n巴黎\t3190\n虎咽果冻\t3191\nk88\t3192\n3元\t3193\njgmdm7mpjaj7matjtp\t3194\nnnnU\t3195\ndespiseyou\t3196\nk81\t3197\n遮掩\t3198\n爱心勇士度秘\t3199\n海妖\t3200\n实行\t320
1\n阴图\t3202\n1395558662848956715\t3203\n纯纯\t3204\n雷翠枫\t3205\n唬稿画\t3206\nnnnj\t3207\nnnnk\t3208\n阴囊\t3209\nnnnn\t3210\n红地毯\t3211\nfhjdghwgjydnggjidgjgzvbczvbxxvbcxbfzchvdhfdhffjjcnncvgfgmcshudf\t3212\n头破血流\t3213\n滴调戏\t3214\n刘风\t3215\n彪悍型\t3216\n15802462576\t3217\nnbuyaolian\t3218\n乳部\t3219\n我是你的好朋友度每美\t3220\n小胖妞\t3221\n信息\t3222\n决定权\t3223\n药王\t3224\n37年\t3225\n科比\t3226\n红太\t3227\n满格\t3228\n我想和你我想和你聊一下\t3229\n宜宾市\t3230\n弄出\t3231\n合二为一而非\t3232\n饭饭饭饭饭饭饭\t3233\n梦雨琦\t3234\n好冷飕飕\t3235\n505050505050\t3236\n理气\t3237\nppppl\t3238\n孙双双\t3239\n头十岁\t3240\nhxbnjzjs\t3241\n熊娃娃\t3242\n好压\t3243\natit\t3244\nppppp\t3245\n≧≦\t3246\n锐减\t3247\n歌姬\t3248\n罂粟汤\t3249\n贝瓦\t3250\n万出第一步\t3251\n富裕吃的晚\t3252\n鼎盛演艺公司\t3253\n3624\t3254\nreyioobt\t3255\nyourco\t3256\n祢哪么\t3257\n我的丽赫本\t3258\n与此同时\t3259\n天博\t3260\n葬身\t3261\n4册\t3262\n主题\t3263\n呃\t3264\n童爱菲\t3265\n刑满释放\t3266\n曼陀罗花\t3267\n芳心\t3268\n咯京坛\t3269\n五六十万\t3270\n鲍莉萍\t3271\n定然\t3272\n缴枪\t3273\n六一然\t3274\n要不我\t3275\nfoyx\t3276\n你好密\t3277\n一圈10km\t3278\n叶波\t3279\n真不要脸不要脸不要脸不要脸不要脸不要脸妈妈的妈妈妈妈\t3280\n1012086367\t3281\ndtagmdk\t3282\n天使展开生\t3283\n我真的听不懂你说的话\t3284\n神鹰\t3285\nLED电视\t3286\n开罗市\t3287\ndhiphotosbaiducomxiaodupicitem83025aafa40f4bfb1837a327044f78f0f63618d9jpg\t3288\n农历三月\t3289\n陈翔橙\t3290\n良玉\t3291\n律师团\t3292\n火字\t3293\n潜规则\t3294\n分天我要\t3295\n不骗人\t3296\n130斤\t3297\n架桥\t3298\n干裂\t3299\n天橙\t3300\n毕老师\t3301\n涉毒\t3302\n雅克天天棒\t3303\n巡心\t3304\n27058亿美元\t3305\n李鸿章\t3306\n和县\t3307\n付歌\t3308\ns君\t3309\n声东击西\t3310\n小天校花爱上外小丽小丽爱上小天小天有爱了小米小米有安神效力小了爱上人\t3311\n数字组\t3312\n工艺术品\t3313\n花雕\t3314\n马家窑\t3315\n列举\t3316\nabbc\t3317\nabbb\t3318\n敢不服从\t3319\n东升\t3320\n胜任\t3321\n东华\t3322\n再见啊\t3323\n买买\t3324\nhjkjkkk\t3325\n步履蹒跚\t3326\nabbs\t3327\n东南\t3328\n我女儿上哪了我女儿上\t3329\n有兴趣\t3330\n维无言\t3331\n横沙\t3332\nxvc\t3333\n奥那你\t3334\n四十么\t3335\n可汗\t3336\n这世纪城后天的o我我着你说的一般是的看\t3337\n22时\t3338\n酣饮\t3339\njfffif\t3340\n25家\t3341\n25宵\t3342\nxvd\t3343\n无权\t3344\n燕杰斐\t3345\n喷潮\t3346\n13751152877\t3347\n实体\t3348\n顾得拜\t3349\n22日\t3350\n可汉\t3351\nguessing\t3352\n睡把\t3353\n思春\t3354\
n并卵\t3355\n大衣\t3356\n受害者\t3357\n新生物\t3358\n不要走你走\t3359\n另谋高就\t3360\n排除万难努力\t3361\n小度秘制\t3362\n讲完\t3363\n写实\t3364\n领航员\t3365\n零之使魔h邪恶漫画\t3366\n十五千克\t3367\n大血\t3368\n百二十s\t3369\n苏姓名\t3370\n思明\t3371\n提音乐\t3372\n讲客\t3373\n3w点88caca点\t3374\n流言\t3375\n干未木\t3376\n大街\t3377\n姐侥\t3378\n圆哈\t3379\n不爱我是你的主人神情\t3380\n风笛\t3381\n高博文\t3382\n聪明鬼\t3383\n成都市区\t3384\n情定\t3385\n冰笨\t3386\n蒋洁敏\t3387\n背时\t3388\nfderd\t3389\n武周代\t3390\nzmyw\t3391\n并联电路\t3392\n哈米\t3393\n重操旧业\t3394\n科拉松\t3395\n主厨\t3396\n倾城醉倒月池莲\t3397\n国威\t3398\n万通\t3399\nhttpahiphotosbaiducomxiaodupicitem6d81800a19d8bc3e39092db8858ba61ea8d345bajpg\t3400\n鸿蒙\t3401\n济纳\t3402\n没用好想\t3403\n噩耗\t3404\n穿越我是王\t3405\n瞎了看不见\t3406\n寡妇待遇\t3407\n徐神曲\t3408\nfgdfj\t3409\n逆着\t3410\n深港产业园\t3411\n性生活片\t3412\n金湾山\t3413\n31137元\t3414\n哈杨颖\t3415\nelsesomebidy\t3416\n何存\t3417\n念存意\t3418\n在不说\t3419\n候机场\t3420\n有所以\t3421\nvvghhjjfvghhgdgdgfvryigsycybtufycfyg\t3422\n闪耀\t3423\n我不克的克睡觉\t3424\n伐木工\t3425\n当我的男闺蜜呢还是想当我的女闺蜜\t3426\n本院\t3427\n杨宝美\t3428\n昂so\t3429\n求好\t3430\n23次\t3431\n明天早晨\t3432\n摩斯\t3433\n四大v个\t3434\n王思尹\t3435\n聊天翅\t3436\ngfrffgghhhhhhhhhhhhhhhhhh\t3437\n四十只\t3438\n四十号\t3439\ndoingsth\t3440\n物态变化\t3441\n韩国\t3442\n肖肖\t3443\n撒光光\t3444\n韩困\t3445\ncvD\t3446\n梦次元\t3447\nbanish\t3448\n家毫\t3449\n18990731\t3450\ncvv\t3451\ncvs\t3452\ncvn\t3453\nMMZZ\t3454\ncvm\t3455\ncvj\t3456\n狼性\t3457\ncvh\t3458\n拒收\t3459\ncvf\t3460\ncvg\t3461\ncvd\t3462\ncve\t3463\ncvb\t3464\ncvc\t3465\nhttpfhiphotosbaiducomxiaodupicitem7e3e6709c93d70cf5efccc79ffdcd100baa12bbbjpg\t3466\n今今天\t3467\nAirbus\t3468\n近今\t3469\n顶峰\t3470\n十二生肖\t3471\n内部片\t3472\nHadd\t3473\n0000Uw\t3474\n金壮龙\t3475\n能说会\t3476\n迪巴克猫\t3477\n男仆\t3478\n大秦帝国\t3479\n一动不动\t3480\n恐物食人鱼\t3481\n李开复\t3482\n蔡随后\t3483\n十五四十五岁\t3484\n男老头\t3485\n萌萌哒一三完\t3486\n非接触性\t3487\n近代\t3488\n革命的文学和文学的革命\t3489\n我想你的心我的心\t3490\n周榜\t3491\n奸情\t3492\n东城区\t3493\n班龙门\t3494\n擦长\t3495\n狭长\t3496\n找电影\t3497\n28栋\t3498\n通关\t3499\n第六条\t3500\n涂层\t3501\n23458\t3502\n勾股定律\t3503\n十五兆\t3504\n呼叫学妹\t3505\n十五元\t3506\n23456\t3507\n涂山\t3508\n压会\t3509\n张子
轩\t3510\n舒华朵\t3511\nofk\t3512\n静雅\t3513\n曾娅雯\t3514\nofn\t3515\n度秘妙妙\t3516\n桌椅\t3517\nofr\t3518\n100191\t3519\n十五六\t3520\n22225555\t3521\n桃花林\t3522\noft\t3523\n前座\t3524\n代理们\t3525\n蛾子\t3526\n护坡\t3527\n哀歌\t3528\n曼哈顿\t3529\n孙悟悟空\t3530\n对视\t3531\n锡拉胡同\t3532\n高跟\t3533\n播播激情网\t3534\n新京报电子报\t3535\n1236888\t3536\n99YY\t3537\n臊子\t3538\n不想当你\t3539\n暗拍\t3540\n一起猜猜成语英雄\t3541\n沈京兵\t3542\n通旅社\t3543\n对角\t3544\n花非花\t3545\n可爱像女\t3546\n高跷\t3547\n慢着\t3548\n剪子\t3549\n好朋幽度秘\t3550\n21日\t3551\n1024jb\t3552\n好好呀\t3553\n收购商\t3554\n那万圣奶爸\t3555\nyouz瑞\t3556\n女老师\t3557\nc4ezoyoufo\t3558\n张宏森\t3559\n21时\t3560\n85年\t3561\n572455742825873252227\t3562\n阴魂不散\t3563\n你好你猜我在哪儿\t3564\n壁\t3565\n計劃\t3566\n咳死\t3567\n石家齐\t3568\n15559323995\t3569\n狗哒\t3570\n别感慨\t3571\n那厢\t3572\n杂家\t3573\n只不倒\t3574\n伟业\t3575\nBdshimvx\t3576\n一万一千一百一十一一圈儿\t3577\nKivgb\t3578\n宝贝你是我的爱\t3579\nydueuee\t3580\n辞职\t3581\n八九个\t3582\ngfb\t3583\n斯坦\t3584\n少仲唔\t3585\n竹筒饭\t3586\n聊城\t3587\n电脑电脑\t3588\n文涛\t3589\n8458468\t3590\n义思\t3591\n狗哥\t3592\n七百块\t3593\n558653\t3594\n1层\t3595\nqv吗曲\t3596\nGK1F\t3597\n官能团\t3598\nGK1C\t3599\n九龙仓集团全资公司大连盈致企业管理有限公司\t3600\n颌颌\t3601\n乖孩\t3602\n一九不该\t3603\n个度秘\t3604\n花生油\t3605\n挪威老牌儿Jorn\t3606\n78767977788875元\t3607\n送回\t3608\n4494444943\t3609\n属性\t3610\n在天一\t3611\n再如\t3612\nkdb48765454\t3613\n牛女\t3614\n亲爱的你是我的男朋友\t3615\n牛奶\t3616\n笑脸相迎俩面三刀\t3617\n同太狼\t3618\n第112天\t3619\ndisk\t3620\n郑渊洁童话全集\t3621\n吹气\t3622\n乖孙\t3623\n喜贝贝\t3624\nyeuthdrwyd\t3625\n去机\t3626\n周致词\t3627\ngxbfvvfv\t3628\nSJ-M\t3629\n蒋太平\t3630\nhellomini3\t3631\n翱翔\t3632\n示威游行\t3633\nHGJGNGBUG\t3634\n4月8号晚\t3635\n反攻\t3636\n茶位\t3637\n怀化市\t3638\n系统间\t3639\n石阶\t3640\n〞\t3641\n北板桥\t3642\n【\t3643\n】\t3644\n〒\t3645\n〔\t3646\n〕\t3647\n饼子\t3648\n〗\t3649\n〈\t3650\n〉\t3651\n《\t3652\n》\t3653\n「\t3654\n」\t3655\n『\t3656\n』\t3657\n\t3658\n、\t3659\n。\t3660\n〃\t3661\n々\t3662\n跌跌撞撞\t3663\n〇\t3664\n我听懂\t3665\n普洱茶\t3666\n片场\t3667\n每股\t3668\n只有一个人知道\t3669\n女无呢\t3670\n维士特\t3671\n盲蛇\t3672\n学童\t3673\n答不伦不类\t3674\n10摄氏度\t3675\n一百五十\t3676\n脑门\t3677\n萍水相逢\t3678\nGina\t3679\n长
城奖创意奖\t3680\n来日方长\t3681\n侧妃\t3682\nBhhhh\t3683\n虫师\t3684\n聊天儿\t3685\nLovin\t3686\n一言\t3687\nmuch\t3688\n植村秀\t3689\n撒子的我的884\t3690\n共处\t3691\n癔症\t3692\n人财\t3693\n抠图\t3694\n人货\t3695\n物理平衡力\t3696\n狐疑\t3697\n呦呦呦小妞儿\t3698\n女将\t3699\n盛年\t3700\n72米高\t3701\n和你的名字\t3702\n睾丸木\t3703\n新青年书店\t3704\n有点微\t3705\n宫颈癌\t3706\n乳酸菌\t3707\n十七个\t3708\n上旋\t3709\n三八二十一\t3710\n女尤\t3711\n边旁\t3712\n真凶\t3713\nsxzj\t3714\n滥用\t3715\n下午4点钟\t3716\n座座机\t3717\n血癌\t3718\n一一吗\t3719\n必須\t3720\n洽谈\t3721\n十七万\t3722\n肛塞\t3723\n自省\t3724\n孙艺文\t3725\n度写\t3726\n女尸\t3727\n上早\t3728\n放不方便\t3729\nBbbbbhgfdf\t3730\n7165点\t3731\n圖片\t3732\n卡幸\t3733\n回到你问\t3734\n泡软\t3735\n着想象\t3736\n大小头\t3737\n对斗\t3738\n呼叫\t3739\n见不得\t3740\n来了蟹黄堡\t3741\n王帅超\t3742\n美吧\t3743\n席地而眠\t3744\n小煞\t3745\n劳哥\t3746\n太真理\t3747\n感情心\t3748\nConverse\t3749\n嗯拜\t3750\n揣摩\t3751\n哭了死\t3752\n王华华\t3753\n电脑党\t3754\n嗯拉\t3755\n嗯拍\t3756\n不得么你不爱我\t3757\n傻傻帅哥\t3758\nzhou\t3759\n九锥\t3760\n利和\t3761\n石评梅纪念馆\t3762\n胃镜\t3763\n同辈\t3764\n说了关机\t3765\n吊打\t3766\n分ｖｔ\t3767\nstuv\t3768\nzhon\t3769\n0.1元\t3770\n分道\t3771\n二座\t3772\n变态流氓\t3773\n陈朝桐\t3774\n邀某\t3775\nucpypytpx\t3776\n浩南\t3777\n桂纶镁\t3778\n及时\t3779\n6月8号\t3780\n我是爱你的那个人\t3781\n叶伟\t3782\n儿城\t3783\n罗斯允\t3784\n等待着\t3785\n十三虎\t3786\n三五分\t3787\n累了回头看\t3788\n隐忧\t3789\n武井咲\t3790\n无所畏惧\t3791\n宁波大学附属医院\t3792\nhufff\t3793\n堡寒假\t3794\n东南亚\t3795\n孕妇\t3796\n2016上半年\t3797\n没觉\t3798\n100000000000000000000000n个\t3799\n樂器\t3800\n机动性\t3801\n蓝莓酒\t3802\n游泳池\t3803\n气死我了你好\t3804\nhttpfhiphotosbaiducomxiaodupicitemac4bd11373f0820247216f4d4cfbfbedab641b14jpg\t3805\n这个天\t3806\n先进门\t3807\n2016年2月1日\t3808\n建友\t3809\n鹿港\t3810\n一往无前\t3811\ntfv4\t3812\n学理\t3813\n乖会儿\t3814\nworth\t3815\n宝贝儿我记住你\t3816\n聪\t3817\n化名\t3818\nstud\t3819\n该该\t3820\n甩毛\t3821\n1913年\t3822\n熊版\t3823\n谈感情\t3824\n恩知度\t3825\nckoecjeofnrbfi2jvi\t3826\n热干面\t3827\n灵娃\t3828\n聼\t3829\n聽\t3830\n聿\t3831\ndttttttt\t3832\n给我一个字错\t3833\n聶\t3834\n聊\t3835\n聋\t3836\n职\t3837\ntfvd\t3838\n华山\t3839\n聂\t3840\n有未开\t3841\n小型化\t3842\n一一分钟\t3843\n聘\t3844\n聚\t3845\n字幕君\t3846\n第八套\t3847\nmath\t
3848\n聒\t3849\nplode\t3850\n同房\t3851\nJing\t3852\n硬广\t3853\n不普通\t3854\n李鹊\t3855\n白白\t3856\n王小明\t3857\n白百\t3858\n人机大战\t3859\n翔帅\t3860\nvjkhvnkkc\t3861\n到这吧\t3862\n无法忘记\t3863\n无一无一样\t3864\n建成后\t3865\n可见度\t3866\n潍坊3号\t3867\nvcxvvgvbgdchcwfhvhgbfjhbjbjbhbvhbbbjvhbbnxhcjgjvvxcbgvxhdjcvvvvfbznxbcchzgxcbvbbbbv\t3868\n要p图\t3869\n八一大楼\t3870\n永安街\t3871\n白癖\t3872\n许言\t3873\n白癜\t3874\n江南Say\t3875\n李鹿\t3876\ndcdfretdfyf5\t3877\nzundy\t3878\n王小显\t3879\n80斤\t3880\n皇帝\t3881\n晨娑\t3882\n珠海市人民政府\t3883\n撇本\t3884\n十三楼\t3885\n83000\t3886\n城上\t3887\n城下\t3888\n阿布\t3889\n左右逢源\t3890\n7864\t3891\n冬笋\t3892\n7860\t3893\n八嘎呀路\t3894\n说了问\t3895\n从众生\t3896\n城东\t3897\n狗猫鼠鲁迅仇的是猫\t3898\n1610RMB\t3899\n5000000元\t3900\n帅凤琴\t3901\n城中\t3902\n高建勤\t3903\n深切\t3904\n往后度\t3905\n舰载机\t3906\n城主\t3907\n隐忍\t3908\n嗯喜剧\t3909\n没用我真\t3910\njgmp\t3911\nbrx\t3912\n及时雨\t3913\n麻辣隔壁\t3914\n太盲目\t3915\nbrt\t3916\n嚞\t3917\n逃过\t3918\nfbax\t3919\n卡丁足\t3920\njgmg\t3921\n高难\t3922\n王玛丽\t3923\n雷锋度秘\t3924\n二十度\t3925\n轻捷\t3926\nbrf\t3927\ngghshjd\t3928\nDff\t3929\nbrb\t3930\nbra\t3931\n望城区\t3932\n男饭\t3933\n分时收费系统\t3934\n六十多年\t3935\n绝地反击\t3936\nhehad\t3937\n几百\t3938\n扔下\t3939\n不可耐\t3940\n我要你的美照\t3941\n冠军\t3942\n范鑫浩\t3943\n就问\t3944\n王小之\t3945\n玻璃墙\t3946\n1991\t3947\n1990\t3948\n1992\t3949\n觅\t3950\n1994\t3951\n不要再来\t3952\n华理\t3953\n1999\t3954\n1998\t3955\n彭顺\t3956\n发帖者\t3957\n一千篇\t3958\n荡妇\t3959\n银发\t3960\nHdhd\t3961\n卖我走\t3962\n天天一个字\t3963\ntnn\t3964\n抵押贷款\t3965\n黄一程\t3966\n烟枪\t3967\n一撇一捺\t3968\n咯图g11\t3969\n一上次\t3970\n走时\t3971\n嚄\t3972\n麦密\t3973\nsomuchah\t3974\n港阳\t3975\n走无\t3976\n颈联构思\t3977\n嚇\t3978\n滚滚远\t3979\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t3980\n旁边人\t3981\n空间\t3982\n好暖心的话\t3983\n马天好\t3984\ndalays\t3985\n青豆\t3986\n三三三三三三三\t3987\n叔本华\t3988\n刘梦爽\t3989\n好兄\t3990\n可比克\t3991\n三维\t3992\n谢你啦\t3993\n主楼\t3994\n传城管\t3995\n好开\t3996\n早离\t3997\n陈栋帆\t3998\n第10条\t3999\n茂盛\t4000\n共新村\t4001\n69文\t4002\n我不是给我我不是小子我是姑娘五姑娘\t4003\n幺零幺\t4004\n24年\t4005\n小不点真淘气\t4006\n马拉米\t4007\n256636\t4008\n度秘你的手记\t4009\n茶叶\t4010\n两路\t4011\n张檬\t4012\n今天22号\t4013\n万科\t4014\n倒数第\t4
015\n三组\t4016\n果豌豆笑传\t4017\n白百合\t4018\n通读\t4019\nyù\t4020\nhhchxhx\t4021\n诗宇\t4022\n错版\t4023\n掠过\t4024\n梓花儿\t4025\n小乖乖\t4026\n完颜\t4027\n私用\t4028\n60年代\t4029\n啊开阳\t4030\n巴卫\t4031\n严进宽\t4032\n限产\t4033\n浙江教育教材适用出版集团有限责任公司\t4034\n止血\t4035\n行政院\t4036\n157号\t4037\n丈人儿\t4038\n命案\t4039\n通话\t4040\n节日\t4041\n避世者\t4042\n限于\t4043\n密书\t4044\n疯了值\t4045\n1月2日\t4046\nxxom\t4047\n第一年\t4048\n這該\t4049\n反腐败\t4050\n僧\t4051\n一千来\t4052\n洁洁白北\t4053\ngtjtm\t4054\n人工对话\t4055\ndouspeakenglish\t4056\n吴琳嫄\t4057\n萧生\t4058\nthhdc\t4059\n55组\t4060\n02千克\t4061\n衣冠\t4062\n二六零二\t4063\ns1样\t4064\n老壳\t4065\n长大成人\t4066\n王芊羽\t4067\nnoclue\t4068\n11日\t4069\n脚尖\t4070\n娅渔\t4071\n度秘撒的咯度秘度秘魔什么度秘本地\t4072\ngggggg\t4073\n唇吻\t4074\nninn\t4075\nFfthgu\t4076\n过年的心情了你到多咱你回老家呀你牛老家\t4077\n我的天空\t4078\n略萨\t4079\n老子不在关你屁事\t4080\najjjj\t4081\n一静\t4082\n牛粪\t4083\n偷窃\t4084\n早上六点半\t4085\n冲场\t4086\n大老我\t4087\n作文本\t4088\n圆子\t4089\n返风喇\t4090\n题我一道题\t4091\n太漂亮\t4092\nvffjghikj\t4093\n考群\t4094\n社会类\t4095\n刘铭传\t4096\n13号星期五\t4097\n读一下\t4098\n怀念\t4099\n明辩\t4100\n交合\t4101\n理应\t4102\n大白兔\t4103\n45868458\t4104\n好丽友好丽友好丽友\t4105\n33333333\t4106\n没办法\t4107\nNijiang\t4108\n度秘月\t4109\n罗斯\t4110\n形成号\t4111\n高尾\t4112\n猴年吉祥人家\t4113\n伊后语\t4114\ndeqo\t4115\n158.00\t4116\n1488096258\t4117\n充斥\t4118\n姜晨\t4119\n32431号\t4120\n喷火枪\t4121\nwruu\t4122\nfnfsnakamndnvfjdioahdisrieojeodeowoeiweijrirdheodhdjrriddr\t4123\n部号\t4124\n高尚\t4125\n领衔主演\t4126\n000n0\t4127\n1835年\t4128\n桩\t4129\n虎扑字幕组\t4130\n制服\t4131\n皮夹\t4132\n高小\t4133\n13时12分\t4134\n百五十米\t4135\n起來\t4136\n漫威\t4137\n94位\t4138\n明天下午3:00\t4139\n十克\t4140\nYounonsense\t4141\n百合亲\t4142\nanother\t4143\nhttpehiphotosbaiducomxiaodupicitemcf1b9d16fdfaaf51efdcd9cc8b5494eef01f7a81jpg\t4144\ndggchn\t4145\n十元\t4146\n呢ok\t4147\n十兆\t4148\n苏力\t4149\n剪脚\t4150\n1月30日\t4151\n蓟县政府网\t4152\n4ab2abCA2a\t4153\n郭碧婷\t4154\n科學\t4155\n算完\t4156\n邢\t4157\n2114790239\t4158\n韩美度秘\t4159\n十八\t4160\n十六\t4161\n羊圈\t4162\n科学\t4163\n度秘度秘我爱你深深的爱着你\t4164\n2225276552502504213\t4165\n对呀对呀我不想跟你说话了呀理我我也不给\t4166\n来来往往\t4167\n掐指一算\t4168\n异次元\t41
69\n重返狼群\t4170\n一个三百\t4171\n三十胜\t4172\n隔壁村\t4173\n3.15\t4174\n地间\t4175\n张震\t4176\n浙江高院\t4177\nacdefg\t4178\n印香图谱\t4179\n一種\t4180\n杏干砀山\t4181\npplmmnnbhhhjoo\t4182\n亿万个\t4183\n受創\t4184\n滚蛋滚蛋\t4185\no82\t4186\n2365479325555555555555555555\t4187\n吴彦\t4188\n中国人民解放军海军\t4189\n我只要你的我\t4190\n一程\t4191\n抢答\t4192\nQ2\t4193\nQ5\t4194\n纺织\t4195\n路演\t4196\n多咪尼\t4197\n平安我萌妥妥\t4198\n走了\t4199\n皇佳\t4200\n汉隶\t4201\n于为\t4202\n花千骨叫花姑娘\t4203\n于丹\t4204\n着魔\t4205\n赛车型\t4206\n植物大战僵尸2\t4207\n植物大战僵尸5\t4208\n137990728\t4209\n书斋\t4210\n徐文洁\t4211\n徐经长\t4212\n小米粥\t4213\n这样的天\t4214\n暴力狂\t4215\n钟点房\t4216\n我就是说\t4217\n了了抓\t4218\n大力丸\t4219\n十二月二三日\t4220\n皇位\t4221\n完好过来\t4222\n走人\t4223\n周日早10点\t4224\n一只度\t4225\n幻音\t4226\n朴贤哲\t4227\nQq\t4228\n郑州航空港\t4229\n6片\t4230\n度秘君\t4231\n隐形人\t4232\n西西\t4233\n度秘吖\t4234\n短工\t4235\n我来了跑男\t4236\n中信银行\t4237\n度秘吏\t4238\n乘客度\t4239\n听写\t4240\n卡玛\t4241\n甄缳传\t4242\n余晨蕊\t4243\nQh\t4244\n楪祈\t4245\nQQ\t4246\n媽媽\t4247\nQR\t4248\nQT\t4249\n缺阵棠\t4250\n冲鸡\t4251\n林进权\t4252\nDnFDGCHHCU\t4253\nQC\t4254\nQB\t4255\nQE\t4256\n甲gyfdjgy\t4257\n纸们\t4258\nQJ\t4259\n力普及率瑞NPC\t4260\nhvgg\t4261\n22点30\t4262\n短兵相接\t4263\n郑景银\t4264\njikk\t4265\n凹凸堡\t4266\n狼犬\t4267\n六九三七一二零\t4268\n铺废\t4269\n张起玉\t4270\n我是女的那你也是女的好不\t4271\n田秘术\t4272\n555887\t4273\n开箱\t4274\n可去过\t4275\n孔垂楠\t4276\n植物大战僵尸之花园战争\t4277\ngchfygf\t4278\n大后方\t4279\n剛打電話讀書\t4280\n唱秧歌\t4281\n铺床\t4282\n送你一颗子弹\t4283\n动车\t4284\n13967569525\t4285\n悭\t4286\n44页\t4287\n武汉奥园\t4288\n哈信\t4289\n硭硝\t4290\n兰兰\t4291\n胆丸\t4292\n杨家踩\t4293\n福娃糙米卷\t4294\n瑞文\t4295\n襄阳一法院\t4296\n爱喊\t4297\ntometoyou\t4298\n55856878\t4299\n七海\t4300\nwascrying\t4301\neme\t4302\nemd\t4303\n=|||||||||||||||\t4304\n打幺三八零\t4305\nemm\t4306\n揍祖\t4307\nemi\t4308\n手敌\t4309\n鲜奶吧\t4310\n250型\t4311\n50片\t4312\n大峰\t4313\nems\t4314\n郑志远\t4315\nrunk\t4316\n9372\t4317\n张竞丹\t4318\n见色忘友\t4319\n惺惺惜惺惺下\t4320\n的你在干什么呀今天星期几\t4321\n斗牛妞\t4322\n额济\t4323\n说你丑说\t4324\n焦阳鹭康康\t4325\n底里\t4326\n科特绝路level了饿了了5vljuelukule\t4327\n高帅富火锅\t4328\n抛锚\t4329\n王法\t4330\nhttpchiphotosbaiducomxiaodupicitem18d8bc3eb13533fa4
8be785aafd3fd1f41345bbejpg\t4331\n王泓\t4332\n两百少一分\t4333\n一Gs\t4334\n腿儿\t4335\n5点33分\t4336\n王波\t4337\nbabaa\t4338\n洪悦美\t4339\n失调\t4340\n哪个张\t4341\nTheNewIpad\t4342\nzzzzzzzzz\t4343\n梦灵\t4344\n交班\t4345\n新名\t4346\n一百下\t4347\n陈蕙\t4348\n一百万\t4349\n洛逊街\t4350\n哇喔得里奇\t4351\n一百七\t4352\n话片\t4353\n一百一\t4354\n凯里度秘\t4355\nDANG\t4356\n锃成\t4357\n哎芝\t4358\n超级远人\t4359\nnmslwsnd\t4360\n新浦区\t4361\n陪着我想\t4362\n铁块儿\t4363\n张路磊\t4364\n一百个\t4365\n#51\t4366\n一百两\t4367\nkdkd\t4368\n周芷若\t4369\nsayno\t4370\nkdkx\t4371\n白冰\t4372\n变污\t4373\n没手没\t4374\n初祖\t4375\nxX00\t4376\nDawn\t4377\nfage\t4378\nhab\t4379\n牵着手\t4380\n81多\t4381\noOoOoOoO\t4382\n苦大仇深\t4383\n秘趣事\t4384\n切菜板\t4385\n开枪\t4386\ndbshd\t4387\n连曦\t4388\n套老\t4389\n拉升\t4390\n演唱歌唱\t4391\n五百五一零\t4392\n开枱\t4393\n爱情闯进门##\t4394\n胶体\t4395\nrtuoe\t4396\n15111173138\t4397\n我的天天\t4398\n你是我的小呀小苹果儿\t4399\n拉卡\t4400\n玎维\t4401\n享受\t4402\n确实\t4403\nhhhiih\t4404\n99999只\t4405\n县县\t4406\n浦东商场\t4407\n摄魂怪\t4408\n淡然\t4409\n30.30\t4410\n子饮\t4411\n子饭\t4412\n那战\t4413\n骗婚\t4414\n间断\t4415\n划清界限\t4416\n加额\t4417\n十六七岁\t4418\n小达人\t4419\n哈卡\t4420\n绊人\t4421\n間諜\t4422\n碱基\t4423\ngns\t4424\n死不可\t4425\n格陵兰岛\t4426\ngnv\t4427\n24赞\t4428\n肾宝\t4429\nhat\t4430\n凹开\t4431\n镇海口内\t4432\n贪贿\t4433\n大宫主\t4434\n贪财\t4435\n二千万\t4436\nwerymuch\t4437\ngng\t4438\n不收费\t4439\ngnj\t4440\ngnk\t4441\ngnh\t4442\ngni\t4443\n周海\t4444\n绍兴大队\t4445\n小怒\t4446\n沐心\t4447\n恶心我行不行\t4448\nCVV\t4449\nloml\t4450\nCVT\t4451\n一周多余\t4452\n羯族\t4453\nbvbbbbbbvvvvvvvvvv\t4454\n题者\t4455\n刀子心\t4456\n张佳怡\t4457\nCVB\t4458\n一代天骄\t4459\n提高\t4460\n晴晴\t4461\n叭一小\t4462\n貌貌\t4463\n哪恩不言谢\t4464\n特工\t4465\n幺零零幺幺零\t4466\n死丫丫\t4467\n泉州市\t4468\n受伤\t4469\n小怪\t4470\n全息版\t4471\ndocold\t4472\n雅克萨数字电影院\t4473\n供需\t4474\n包职\t4475\n忑热\t4476\n有利用\t4477\n马蕾泽霞\t4478\n推特\t4479\nfusf\t4480\n喵喵小花猫小花猫\t4481\nfuse\t4482\n段义辉\t4483\n长江国旅\t4484\n魏春荣\t4485\n英雄荣耀\t4486\nfush\t4487\n鹤岗\t4488\n發呆\t4489\n鲁噜噜\t4490\n1601\t4491\n1600\t4492\n1603\t4493\neyegriefdB\t4494\n天天秀\t4495\n白鲨\t4496\n15835044365\t4497\n非太郎\t4498\n发都\t4499\n宅着\t4500\n一题\t4501\nxiaci
nuyao\t4502\n94694562\t4503\n位子\t4504\n玛拉顿面度秘\t4505\n单度\t4506\n蟹爸\t4507\nWin7\t4508\n一颗\t4509\n感冒唱\t4510\n雪洁\t4511\n邹芷馨\t4512\n学读\t4513\n杜雪云\t4514\n历城二中\t4515\n2月9日\t4516\n学识\t4517\n哼哼哼\t4518\n837373736373\t4519\n水味\t4520\n小震\t4521\nSPA奥\t4522\n伯乐盖\t4523\n李厚明\t4524\n敖猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t4525\n盖儿\t4526\n圣诞树\t4527\n学话\t4528\n車\t4529\n赵西瑞\t4530\ntcsjshrhfjt\t4531\n胸上\t4532\nSanta\t4533\n坏姑娘\t4534\n李豹\t4535\niOS\t4536\n周燕\t4537\n刘雪婷\t4538\n旋空\t4539\n李豪\t4540\n五、二\t4541\n保暖\t4542\nWing\t4543\n咋劲\t4544\n咔哧咔哧咔哧\t4545\n捉妖济2\t4546\n胸中\t4547\n鹅蛋\t4548\n张亚鸣\t4549\n黑格尔辩证法\t4550\n擎天\t4551\n哪两个你\t4552\n一举一动\t4553\n看闻\t4554\n咋办\t4555\n布雷耶\t4556\n咋力\t4557\nnight\t4558\n四回头\t4559\n六小龄童会上春节联欢晚会\t4560\n45批次\t4561\n小心老子\t4562\n到情无归处\t4563\n瓣儿\t4564\n豁然开朗\t4565\n半身跳\t4566\nIfojudud\t4567\n60550五个\t4568\n昨晚17:15\t4569\n肝肺\t4570\n好奇怪像个太空人\t4571\nItalia\t4572\n2351下\t4573\n笑不会\t4574\n一生情\t4575\n前一秒钟\t4576\n南京审计学院金审学院\t4577\n白磷\t4578\n金杯具\t4579\nLGlimbaugh\t4580\nAGE-W\t4581\ngjdg\t4582\n背错\t4583\n有别问\t4584\n抽选\t4585\nykide\t4586\n赵氏乐\t4587\n9GG\t4588\n直连\t4589\n琨\t4590\n琪\t4591\n推介\t4592\n琦\t4593\n琢\t4594\nxiayiju\t4595\njjkoi\t4596\n快外人\t4597\n家老家\t4598\n表現\t4599\n琴\t4600\n炎亚纶\t4601\n活着不要脸了你不不不\t4602\n菜农\t4603\n琳\t4604\nO上7tvjufbkufvhueshiffjkhdcjyfkjvj\t4605\n琎\t4606\n琏\t4607\n柯基\t4608\n半截\t4609\n琅\t4610\n理\t4611\nopis\t4612\n球\t4613\n观澜\t4614\n二百二十\t4615\n2月13号\t4616\n7543\t4617\n7544\t4618\n滨水\t4619\n啊好\t4620\n多一百\t4621\n孽障\t4622\n555445151514770\t4623\n弹词\t4624\nxosn\t4625\nd77777\t4626\n锲而不舍\t4627\n马尔基\t4628\nhjxdbdv\t4629\n死蛋蛋\t4630\n好从现在开始\t4631\n那久\t4632\n120712\t4633\n那么\t4634\nmurnamemamas\t4635\n假不假\t4636\n老米卢\t4637\n小老头儿\t4638\n五期\t4639\n天夜快乐\t4640\n证婚人\t4641\n胸生气\t4642\n888557315\t4643\n一个二个\t4644\n嗯CD\t4645\n四大天后\t4646\n香米\t4647\n二源\t4648\n8海里\t4649\n3100947639833\t4650\n莫须有\t4651\n尤文\t4652\n不少于3点\t4653\n13578888888888\t4654\n小蜜穴\t4655\n李楠\t4656\n保障\t4657\n因果\t4658\n0642\t4659\n秘制蛋炒饭\t4660\n转嫁\t4661\nxauah\t4662\n呼死\t4663\n亚星\t4664\n一二三四五六七八九十十一十二十三\t4665\n君怡\t46
66\n相公\t4667\n上海地铁\t4668\n王泓润\t4669\n争夺\t4670\n衣着\t4671\n二九八八零四零七零零幺\t4672\n地球村\t4673\n收衫\t4674\n齐下\t4675\n耿耿耿耿耿耿耿耿耿耿耿耿下下下下性\t4676\n武家宇\t4677\n豫园老街\t4678\n同者\t4679\n採取\t4680\nvwh\t4681\n难怪\t4682\n刘若水\t4683\n8月6号\t4684\n六六\t4685\n控诉\t4686\n不归家\t4687\n六八\t4688\n三长二短\t4689\nrcgkkdg\t4690\n聪明\t4691\n娇娇娇娇娇娇\t4692\nfrcrg\t4693\n六元\t4694\n韩国佛教宗团协议会\t4695\n听话世界\t4696\n撸啊\t4697\n花样游泳\t4698\n再生障碍性贫血\t4699\n完了知道\t4700\n林格雷\t4701\n中华书局\t4702\n七颗\t4703\n亚东路\t4704\n足下\t4705\nHbhvvvvvbbvvvvcbqgHuvfygtggfghhhjmjnbhbbbbhgyggg\t4706\n五十七十八岁\t4707\n1996年\t4708\n黑框\t4709\n被毁\t4710\n黑桃\t4711\n相思子的思春之一过\t4712\n齐大山\t4713\nhiii\t4714\n10月21号\t4715\n老实人\t4716\niyss\t4717\nbottom\t4718\n1000000000000个\t4719\nhe会\t4720\nfrus\t4721\n美术\t4722\n草舍\t4723\n四德\t4724\n华陀山\t4725\n静悄悄\t4726\n我和爸爸妈妈共同的话题\t4727\n椒江\t4728\n邓嘉轩\t4729\n多允婷\t4730\n结构主义\t4731\n六亿元\t4732\n437891753\t4733\n换算\t4734\n丹珠\t4735\nfgxvxvzcgxgsdfzvzccxcgzxxcgxcv\t4736\n南山南百小\t4737\nononononononononono\t4738\n尹永璨\t4739\n被曝\t4740\ngigy\t4741\ngigx\t4742\n42卢\t4743\n动医\t4744\n木老婆\t4745\nhghgvghx\t4746\n好爽\t4747\n说不我\t4748\n饰非不言而喻\t4749\ngigi\t4750\ngigh\t4751\n一万5万\t4752\ngigd\t4753\n百姓\t4754\nhttpahiphotosbaiducomxiaodupicitemae51f3\t4755\n呼图壁\t4756\n他丫\t4757\n心体\t4758\nxx2016\t4759\nwuli彬\t4760\nUP主\t4761\n鸡蛋\t4762\n早上十点\t4763\n几等座\t4764\n五连级\t4765\n洗刷\t4766\n哈鸟\t4767\nQE121\t4768\n人工弱智\t4769\n佘嘀嗒\t4770\n谈谈心\t4771\n五本\t4772\n做秀\t4773\njjjjjjjjjjjjjjj\t4774\n田美\t4775\n样户\t4776\n理想主义者\t4777\n看门\t4778\n读麻将\t4779\n玩叫\t4780\n上周一\t4781\n做秘\t4782\nG网\t4783\n拥国明\t4784\n杨勇\t4785\nSander\t4786\nOPPO227\t4787\n小儒们\t4788\n遥远林\t4789\nmvjt8o\t4790\n窝干嘛\t4791\n扬州市委\t4792\n褂子韬\t4793\n咪又拉\t4794\n加藤鹰\t4795\n精孑\t4796\n精子\t4797\n韩乔生\t4798\n好多咪\t4799\n心路\t4800\n英俊杰\t4801\nueru\t4802\n别闹了哈\t4803\n敏雅\t4804\n两G\t4805\n一百二三\t4806\n聚美\t4807\nFord\t4808\n景县\t4809\n卡卡窗\t4810\nghfejgc\t4811\n潭水\t4812\n大道理\t4813\n十字炮\t4814\n你好忙\t4815\n天河南省\t4816\n周二晚上\t4817\n抓心\t4818\n重算\t4819\nnini\t4820\n啵糖\t4821\n空警\t4822\nEnglish\t4823\n碾压\t4824\n你好心\t4825\nteacher\t4826\n刚刚开
玩笑\t4827\n下五秒\t4828\n杨血流\t4829\n重箱\t4830\n宣读\t4831\n电热水袋\t4832\n倒不\t4833\n倒下\t4834\n事例假\t4835\n439\t4836\n436\t4837\n437\t4838\n434\t4839\n435\t4840\n432\t4841\n433\t4842\n430\t4843\n431\t4844\nfbr\t4845\n博爱青天\t4846\n摩\t4847\nfbv\t4848\nG罩\t4849\nfby\t4850\n日食\t4851\n摧\t4852\n米丽娟\t4853\n王黄晓明\t4854\nfbb\t4855\n摸\t4856\nhhp\t4857\nfbg\t4858\nfbf\t4859\nfbd\t4860\n以内\t4861\nfbj\t4862\nfbm\t4863\n摊\t4864\n八一回\t4865\n減\t4866\nlose\t4867\n摁\t4868\n摆\t4869\n摇\t4870\n摄\t4871\n摘\t4872\n摞\t4873\nfbF\t4874\n好啦请不要再闹了我求求你了度度秘\t4875\n麦芒\t4876\n路萨林\t4877\n作对\t4878\nsokd\t4879\n苏浙\t4880\n摔\t4881\n巧茜妮\t4882\n55841659n1472\t4883\nOon\t4884\n有似\t4885\n段莹莹\t4886\n宋静雯\t4887\n曹尼玛\t4888\npkg5upg5djtqjnjrjamp4ampamj2368\t4889\n童童\t4890\n不开心就不开心管的你了我看\t4891\n鼓号\t4892\nfsedeeee\t4893\n陈全国\t4894\n细细细\t4895\n1356459\t4896\n百分之二十三\t4897\n头不舒服\t4898\ntilit\t4899\nfdseryy\t4900\n8RMB\t4901\n有伦\t4902\n扎实\t4903\n唱一起摇摆\t4904\n真的不爱我了\t4905\n酸鱼酸肉\t4906\nyihx\t4907\n感染者\t4908\n隐含\t4909\n震区\t4910\nhhe\t4911\n阿哲\t4912\n不要脸不要脸不要脸度\t4913\n芬达帮\t4914\n果油\t4915\n灾变\t4916\n镊子\t4917\n阿哥\t4918\n本土\t4919\n两章\t4920\ndomyaaro\t4921\n第二天早上\t4922\n寓言\t4923\n巴水布衣\t4924\n爱丽赡\t4925\n两端\t4926\n张江碧波路690号\t4927\n不过待会儿\t4928\n300多万欧元\t4929\n罗马浴场\t4930\n周妖姬\t4931\n新欢忘你什歌什喜欢我\t4932\n两站\t4933\n857556\t4934\n肝硬化\t4935\n985呃\t4936\n一半半\t4937\n走过的路\t4938\n本场\t4939\n香椿\t4940\n误诊\t4941\n本地\t4942\n我喜欢一个女人\t4943\n椎间盘\t4944\n32块\t4945\nchlld\t4946\n5016年\t4947\n二六二\t4948\n伴郎\t4949\n密山\t4950\n炮仗\t4951\n畅快\t4952\n物犹如此人何以堪\t4953\n何以轩\t4954\n完全平方公式\t4955\n守死\t4956\n这家店\t4957\n哼本\t4958\n入冬\t4959\n搜狐高清\t4960\n盛龙\t4961\n邺城\t4962\n金瓶\t4963\n之路\t4964\n天9哟5\t4965\n渺\t4966\n奔驰SMART\t4967\n47年后\t4968\nrrxhzb9196198323823678773758\t4969\nHowareu\t4970\n五五五五\t4971\n益普索\t4972\n提歌\t4973\n打来电\t4974\n装饰\t4975\nShip\t4976\n亲一下好嘛\t4977\n哼月\t4978\nShit\t4979\n法\t4980\nwhwt\t4981\n给我介绍\t4982\n渣子\t4983\n洗穿\t4984\nJas\t4985\n左拥\t4986\nJan\t4987\nended\t4988\nJam\t4989\n用火\t4990\nksskdk\t4991\n朗诵读\t4992\nhajsb\t4993\n语数\t4994\nhajsh\t4995\n口链\t4996\n提取物\t4997
\n不死不灭\t4998\n客客栈\t4999\n我不好你不好我也不好你不好我也好你不好我也好我爸\t5000\n余显季\t5001\n卖弄\t5002\n李东则则\t5003\n了叫\t5004\n勤俭\t5005\n韦连平\t5006\n阿四姐\t5007\n左拐\t5008\nBvgjggihgghkjgjcfyyddufxnhcfxZdrwwehhshZgZZfsCZSggzxufgcgjktrfj\t5009\n焚身\t5010\n说悔\t5011\n热汗淋漓\t5012\n大闸蟹\t5013\n林萧\t5014\n帮填\t5015\n免我爱你我是女人\t5016\n波\t5017\ntifyou\t5018\n几颗\t5019\n泣\t5020\n河口\t5021\n雷小梅\t5022\n防蛀\t5023\n说悉\t5024\n刁以\t5025\n圆柱体\t5026\n几题\t5027\n丁心\t5028\n18公斤\t5029\n0019分\t5030\n基尔会\t5031\ndoyoutheachmeboxee\t5032\n注\t5033\n5571438\t5034\n3呀2\t5035\n刘增明\t5036\n林伯强\t5037\n云中燕\t5038\n帮塞\t5039\n必报\t5040\n89885899999\t5041\n失节\t5042\n叫春\t5043\n讲嘻嘻哈哈哈\t5044\n香爆\t5045\n叫是\t5046\nZanjan\t5047\n三十篇\t5048\n人间天堂\t5049\n猪你是猪你是一头猪你是猪你是猪\t5050\n手办\t5051\n竞答\t5052\n解放思想\t5053\n哇咔咔蛤蛤蛤\t5054\nfu\t5055\n只缘诗情画意断\t5056\n战餐馆\t5057\n邹市明\t5058\n百毒不清\t5059\n戒赌\t5060\n赫迪拉\t5061\n若干个\t5062\n手动\t5063\n唔制\t5064\n手势\t5065\n负责到底\t5066\nqloyiellomowomone\t5067\n狗耳朵\t5068\n超火\t5069\n席卷\t5070\n蛮红\t5071\n苗儿\t5072\n扒开花\t5073\n电视迷\t5074\n分成\t5075\n窗纸\t5076\n货号\t5077\n四千二\t5078\n李伊鹏\t5079\n托菲\t5080\n金瓜\t5081\n00201\t5082\n90块\t5083\n古剑之\t5084\n糯米度秘\t5085\n大儿专业川大度秘\t5086\n140元\t5087\n大忽悠\t5088\n右手边\t5089\n分期贷款\t5090\n选段\t5091\nyfudj151546455\t5092\n真不愉快\t5093\n不啵\t5094\n上海长兴天网\t5095\n盗墓\t5096\n张紫妍\t5097\n25周年\t5098\n写完饭\t5099\n买不像\t5100\n福耳福能\t5101\nfh\t5102\n再选举\t5103\nMedicine\t5104\n圈子\t5105\ntomorrow\t5106\ndoweneed\t5107\ngjmd\t5108\n诚品书店\t5109\n昵称\t5110\n科颜氏\t5111\n侍候\t5112\n折翼的天使\t5113\n迷幻\t5114\n看不完\t5115\n茶余饭后\t5116\n伴舞\t5117\n卧佛殿\t5118\n康富花园\t5119\n帅白\t5120\n嗯为国\t5121\n沙拉油\t5122\n阿米陀佛\t5123\n鄂城\t5124\n折耳\t5125\n吴佩妮\t5126\nurago\t5127\nuukcxjXKHCUZKKRGHGGJFBGGGGGGGGG\t5128\n千年以后\t5129\n液化气\t5130\n13695717221\t5131\n13036543709\t5132\n冷水\t5133\njcja1cjcg\t5134\n冰岛\t5135\n要约\t5136\n要红\t5137\n赵冰雪\t5138\n唐露\t5139\n忙记\t5140\n逆喜欢查理九世\t5141\n10月18日\t5142\nPassion\t5143\n我告诉你我再也不理你了我讨厌你\t5144\nhsbs\t5145\n寸nnnnn\t5146\npbgjgj\t5147\n起码月\t5148\n焖夂\t5149\n眠县\t5150\n觅觅\t5151\n气垫鞋\t5152\n故事集\t5153\n对不出来\t5154\nHDRpvrvkitxv\t5155\n张墨涵\t5156\n223456
7\t5157\n聊了祝\t5158\nFycc\t5159\ngdtfkgaw\t5160\ninoyou\t5161\n糍粑块君\t5162\n二城埂场\t5163\nehehehehehehehehhehehe\t5164\n把酒当\t5165\n络绎不绝\t5166\n4455155185884\t5167\n埋掉\t5168\nhvhbjh\t5169\n长乐哥\t5170\n要不出口\t5171\n霖哥\t5172\n天安迎驾\t5173\n4倍\t5174\n小仙境\t5175\n过桥\t5176\n邓州吧我哪天上电视我要去投诉你\t5177\n不不不！[认真脸\t5178\n１３０９８２７３８２９\t5179\n黄莲苦口\t5180\n金牌大风唱片公司\t5181\n引擎\t5182\n真题\t5183\n祛皱\t5184\n姚子涵\t5185\n老版本\t5186\n十六有毒二五二一三五十五四万食五二十五一六六二一十二三六十八岁点二十五六三零六三六一七六七二七六七二一四一二八五七三五六七四二七四九一八六八二八四四八四八三二八三六八四八七八五六八八六\t5187\n顾北\t5188\njj47\t5189\njcss\t5190\n放寒假了我们要我记\t5191\n叉子\t5192\n拜丽德集团\t5193\n九六手\t5194\n钱话\t5195\n金轮寺\t5196\n番禺金沙\t5197\n几十几分\t5198\nWeb\t5199\n2006年3月25日\t5200\n老头脑出血\t5201\n威廉姆斯\t5202\nCharacter\t5203\nWes\t5204\n世界冠军\t5205\n摇拽\t5206\n不要你的那你会不会\t5207\n机器人么了死了也\t5208\n意思们\t5209\nhdurh\t5210\n傻缺┐\t5211\n八种\t5212\n南海位置略图\t5213\n快乐萌\t5214\n1355769657\t5215\n抹香波\t5216\n王宇辉\t5217\nsurvice\t5218\n许先生\t5219\n起眵\t5220\n七里七里\t5221\n无归处\t5222\n度蜜月那个嗯我不在线之处\t5223\n烧死\t5224\ntm信不信我秒了你\t5225\n王金财\t5226\n陈天星\t5227\n场边\t5228\n不嗔\t5229\n走了再也不见\t5230\n168181\t5231\n老问你天嘉口等你\t5232\n李函唐\t5233\n１９４８年\t5234\n色诱\t5235\n社会性\t5236\nTOWN\t5237\ncolourisit\t5238\n鹅衣\t5239\n小额\t5240\n小颜\t5241\n苦头\t5242\n三千多啦\t5243\n辰亦儒\t5244\n小颖\t5245\n潮头\t5246\n毛张\t5247\n小颈\t5248\ntaglierito\t5249\nV880\t5250\n对恃\t5251\n大拉菲\t5252\n三子经\t5253\n小领\t5254\n2005年7月30日\t5255\n乔尔尖刀再出鞘\t5256\n播岚\t5257\ncabqaq\t5258\nLOve\t5259\n10万元\t5260\n全麦面包\t5261\n女人声\t5262\n3月1号\t5263\n谢贤\t5264\nEMS\t5265\n北京人民公诉团\t5266\n八科\t5267\n将军们\t5268\n弄堂\t5269\n良殿\t5270\n7000平方英尺\t5271\n琥珀\t5272\n匭\t5273\n国民党的联共与反共\t5274\n魔业瑞\t5275\nvggdb\t5276\n十八米\t5277\n爱你真笨那你\t5278\n人的生活人\t5279\n小河黎案\t5280\n150年\t5281\n哪爸\t5282\n保监会\t5283\n阮志成\t5284\n嬷嬷\t5285\n气不死\t5286\n笨女神\t5287\nareeating\t5288\n503448215\t5289\nRememberto\t5290\n水灵灵\t5291\n见再见\t5292\n阴毒\t5293\n思就思\t5294\n大蒂斯\t5295\n阴毛\t5296\n滴答\t5297\n岁月喔喂卫辉\t5298\n广东省检察院\t5299\n呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽叽\t5300\n真能安\t5301\n泰宁县\t5302\n立达\t5303\n酷你好\t5304\n匾\t5305\n立辰\t5306\n8月16至19日\t5
307\n强吻\t5308\n钉钉\t5309\n级粉\t5310\n676分\t5311\n秦胡\t5312\n10年6月3日\t5313\n出租房\t5314\n五月一号\t5315\n8289岁\t5316\n安妮\t5317\n雅鲁藏布大峡谷\t5318\n强启\t5319\n啊哈\t5320\n都不完美度秘\t5321\n想说嘛\t5322\nWtui\t5323\n龙港三中\t5324\n嗯轻\t5325\npeakeng\t5326\n纹眉\t5327\n嗯轴\t5328\n驱鬼\t5329\n90周年\t5330\n元旦节\t5331\n啊哪\t5332\n恶搞片\t5333\n超级差\t5334\n英歌\t5335\n孙立屯\t5336\n恶搞版\t5337\nTales\t5338\nfhhvv\t5339\n自绑\t5340\n巅峰对决\t5341\n朱猪\t5342\n粉嫩\t5343\n6分钟\t5344\naero\t5345\n塞尼姆\t5346\n128条\t5347\n吕梁省\t5348\n65部\t5349\n步步为营\t5350\n家ｕ\t5351\n都市新女报\t5352\n共鸣\t5353\n度辛\t5354\n话影\t5355\n傻子人\t5356\n没有场\t5357\n2929984164\t5358\n十分之十三\t5359\n多咪吾为乜你\t5360\n副县长\t5361\n节目儿\t5362\n饭照\t5363\n医美分期\t5364\n煤油\t5365\n乳牙\t5366\n攻族\t5367\n样度\t5368\n度边\t5369\n懂礼貌\t5370\n匀\t5371\n纳税户\t5372\n昌险\t5373\n你是我的秘书\t5374\n袁嗲嗲\t5375\n相像\t5376\n音像\t5377\n匆\t5378\n河北电视台\t5379\nBetter\t5380\n出更\t5381\n一心想\t5382\nGNG\t5383\n元君殿\t5384\n各部委\t5385\n明天寨\t5386\nGNH\t5387\n巨龙\t5388\n六点钟\t5389\n扔进\t5390\n指鹿类\t5391\n二次元\t5392\n主的眼\t5393\n51000\t5394\n晨光标志\t5395\n响水县\t5396\n朋辈\t5397\n神虎\t5398\n华影\t5399\n要不理\t5400\nDesign\t5401\n陪朕爱，我就爱爱妃\t5402\n做贼心虚\t5403\n凯基\t5404\n小不点不和你说了\t5405\ncabacabababababababacabaca\t5406\n我宁愿你是姓洪小达人这是为什么\t5407\n先礼\t5408\n真武庙\t5409\n莱特币\t5410\n字符\t5411\n帅威\t5412\n张相没\t5413\nSEFUY\t5414\n征悦\t5415\n各级各级各级各级各级各级各级各级各级各级各级各级\t5416\noìì\t5417\n黄燕万\t5418\npornijzz\t5419\n老可爱\t5420\n飒爽\t5421\n土地出让价\t5422\n林雨心\t5423\n华彩\t5424\n第三轮\t5425\n售后服务\t5426\n啵啵你是我的好朋友\t5427\n北\t5428\n脚脚者\t5429\n得意忘形\t5430\n這樣\t5431\n雅酷\t5432\n肥猪\t5433\n肥猫\t5434\n他人\t5435\n宋依婷\t5436\n戈丽\t5437\npeakEnglish\t5438\n家宅\t5439\n曲奇饼\t5440\nm骚瑞\t5441\nr万\t5442\n沪深\t5443\nJSON\t5444\n曲漫\t5445\nhjjjjjjj\t5446\n家安\t5447\n临河街\t5448\n小学生五年级寒假周记\t5449\n精品网\t5450\n晚霞\t5451\n采摘\t5452\n家宝\t5453\ndhiphotosbaiducomxiaodupicitem\t5454\n好爱转移话题有你\t5455\n635445\t5456\n家室\t5457\n接你好不好\t5458\n13235007702\t5459\n公分母\t5460\n不猜\t5461\n路易士金\t5462\n家家\t5463\n67号\t5464\n家宴\t5465\n矩形\t5466\n白石巧\t5467\n姜荣荣\t5468\ndkftkcgxmgdjsm\t5469\n一季度末\t5470\nolole\t5471\n亲点\t5472\n打扫除\t5473\npptopwpwogp\
t5474\n吴启浩\t5475\n沈婷怡\t5476\n宠物\t5477\n乘孩孑\t5478\n仰天湖\t5479\n看来说\t5480\nditd\t5481\n自尊心强\t5482\n成定局\t5483\n四盘\t5484\n三三k\t5485\n放学后\t5486\n任一\t5487\n泰币\t5488\n干度秘\t5489\n任上\t5490\n猪宫\t5491\n26.9亿元\t5492\n苦笑\t5493\n很乖乖乖乖乖乖乖\t5494\n电话浩\t5495\n烦不烦人\t5496\nDmmdmd\t5497\n任丘\t5498\n腐刑\t5499\n完美的小孩\t5500\n穆雷·佩莱希亚\t5501\n那么谁\t5502\ncydy\t5503\n张义波\t5504\n电影军\t5505\n一百多年\t5506\n美我\t5507\n胡作非为\t5508\n冬秋\t5509\n二三百具\t5510\n讲和缓\t5511\n陈吃\t5512\n海湾\t5513\n耍朋友\t5514\n美戈\t5515\n巧克力糖\t5516\ncydg\t5517\n一屋1iayioououeerianeournenneno\t5518\n悼念\t5519\n陈吻\t5520\n哭了像\t5521\n萌雪儿\t5522\n篾匠\t5523\n资深\t5524\n朴正洙\t5525\n臻骑三洲田\t5526\n太離譜機會\t5527\nHdfsfzgf\t5528\n七言\t5529\n一席之地\t5530\n成风\t5531\n西洋\t5532\n了你是我的\t5533\n15914542894\t5534\n百度tt\t5535\n我的人生\t5536\n二二零八\t5537\n能秘\t5538\n二二零六\t5539\n成飞\t5540\n秒贷\t5541\n污染\t5542\n190731\t5543\n290千米\t5544\n雷叔\t5545\n瀚洋\t5546\n1937年\t5547\n报告单\t5548\n撤了再见\t5549\n雷召\t5550\n在部\t5551\n和田间\t5552\nticjp\t5553\n国美电器\t5554\n林向东\t5555\n哇塞辽\t5556\n聊甚\t5557\nk412\t5558\nv嘎嘎hshJ1hh五额hwiwiwhs\t5559\n叶飞宇\t5560\n朴实\t5561\n不根\t5562\n微倍\t5563\naposteraboutScience\t5564\nksjjkcdOzlkdckmFk\t5565\n不样\t5566\n前10分钟\t5567\n落成\t5568\nsmystalittl\t5569\n三方面\t5570\n八月一\t5571\n贾了亮\t5572\n师生之忧\t5573\nffvzs\t5574\n很简约\t5575\n米业\t5576\n卑鄙红\t5577\n落户\t5578\n100100100\t5579\n朱美钰\t5580\nTjmptjgkgjmpuQkawxt6dm\t5581\n矮靴\t5582\n忍无可忍\t5583\n臭不要脸臭不要脸臭不要脸\t5584\n挤奶\t5585\n997\t5586\n钢铁侠2\t5587\n渠道\t5588\n不栅\t5589\nbnoihjh\t5590\n一下一篇\t5591\n我讨厌你恨你恨你恨你\t5592\n汇聚\t5593\n小我喜欢\t5594\n创造者\t5595\nDjsjeski\t5596\n夺理\t5597\n霍霍霍霍霍霍霍霍\t5598\n成曲\t5599\n肝火\t5600\n小大头儿\t5601\n2012R级\t5602\nkentlng\t5603\n推卸\t5604\n吃好想\t5605\nSufiriduwi\t5606\n裕生\t5607\n目的地\t5608\n田怡\t5609\n秀逗\t5610\n大成宝\t5611\n张雅轩\t5612\nAK47\t5613\nAK48\t5614\n友谊医院\t5615\n605\t5616\n你好你是谁呀我叫\t5617\n辩论赛\t5618\n等于就不上\t5619\n601\t5620\n600\t5621\n603\t5622\n6点35\t5623\n1475629033\t5624\n堤\t5625\n6点36\t5626\n秘东\t5627\n我懂了\t5628\n堡\t5629\n60%\t5630\n涨价\t5631\n你是我的中秘别说周\t5632\n母鸽鸽水泥\t5633\nvn\t5634\n八年\t5635\n总司令\t5636\n場\t5637\n堵\t5638\n5
度\t5639\n報\t5640\n谢谢谢谢\t5641\n秘丸\t5642\n团结湖站\t5643\n秘为\t5644\n營業額諤諤\t5645\n堂\t5646\n博古通今博\t5647\n堀\t5648\n疏通\t5649\n一比一\t5650\n逐笔\t5651\n民丰小区\t5652\n一比三\t5653\n米一\t5654\nDelete\t5655\n汪伶\t5656\n60v\t5657\n这么说呢\t5658\n60p\t5659\n60s\t5660\n抓贼\t5661\n傻子\t5662\n三八妇女节\t5663\n普陀山风景名胜区\t5664\n1356\t5665\n小度秘乖\t5666\n想不懂\t5667\n23535\t5668\ncdsgsgfr\t5669\n较真\t5670\n1354\t5671\n阿鲁\t5672\n雅阁八代中控\t5673\n为助\t5674\n牛肉粒\t5675\nRETROGALLERY\t5676\n这么说呀\t5677\n守旧\t5678\n红美\t5679\n王思雨\t5680\n红羊\t5681\n王思雯\t5682\n大人物\t5683\n哦烦\t5684\n码良\t5685\n非同凡\t5686\n杨松林\t5687\n崔嫁\t5688\n罗氏\t5689\n汤芳魂\t5690\n领袖\t5691\n奇怪喜欢\t5692\n30碘\t5693\n浦东地区\t5694\n900000099999\t5695\n焦璐珂\t5696\n顾虑\t5697\n吖白羊\t5698\n米开朗基罗\t5699\ncgfyfgkhf\t5700\n清贷\t5701\nJWT\t5702\n历朝\t5703\n渡辺麻友\t5704\n开郎开郎\t5705\nTake\t5706\n我爱我爱我就爱\t5707\n顿和\t5708\n昏迷\t5709\n一个四米\t5710\nDontbe\t5711\n讨厌讨厌讨厌讨厌\t5712\n痛风\t5713\n第四场\t5714\n数十秒\t5715\n小婆娘\t5716\n凯胳膊\t5717\nwhshsgsb\t5718\n大头宝\t5719\n今儿天\t5720\n方尖碑\t5721\n奥提克\t5722\n上瘾\t5723\n1998-11-24\t5724\n中央政治局\t5725\n正弦\t5726\n黄修仁\t5727\n明月平谷金运查斯古今律陆\t5728\n莱布尼茨\t5729\n阿哥外公外公外婆外公\t5730\n你好潮\t5731\n亲爱的有你真好\t5732\n告诉我你谁告诉你我喜欢\t5733\n腮腺炎\t5734\n讲冒\t5735\n探测吧\t5736\n固执不善\t5737\n凉拌红柿\t5738\n红火快\t5739\n海扁王\t5740\n收作业\t5741\n坑洞\t5742\n忘了ㄟ\t5743\n秘扇\t5744\n正式\t5745\n误食\t5746\n吕佳朋\t5747\n秘打\t5748\n西林壁\t5749\n里面面面面\t5750\n河山\t5751\n亲爱的人家\t5752\n孝感\t5753\n上一课\t5754\n开普敦\t5755\n1987.8km\t5756\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t5757\n财库\t5758\n哒安\t5759\n上自习\t5760\n小塞\t5761\n武汉欢乐谷单人行\t5762\n注视\t5763\n万会\t5764\n138086805\t5765\n韦淑娟\t5766\n万众\t5767\n查帐\t5768\n麻瓜\t5769\n拜仁和德国队\t5770\n美萌萌\t5771\n2012.8.18\t5772\n神人\t5773\n回荡\t5774\n2012.8.11\t5775\n为么说\t5776\n注解\t5777\n尤溪县\t5778\n嘉哥\t5779\ncoolakose\t5780\n陈小春\t5781\n大一点儿\t5782\n你是我的宠物你是我的宠物我是你的主人\t5783\n石州\t5784\n称羡\t5785\n没有意思\t5786\n鼎鼎大名\t5787\n呢卡\t5788\n部曲\t5789\n可乐coco\t5790\n一回一下\t5791\n㏒\t5792\n租乐网\t5793\n睢宁马兢\t5794\n石宇\t5795\n机器人式\t5796\n亵渎\t5797\n澳门大学\t5798\n禁买\t5799\n流氓鬼\t5800\ncathe\t5801\n埋死\t5802\n冠稀们\t5803\n度谜\t580
4\n高不先\t5805\n情不好\t5806\nqqce\t5807\n340多\t5808\n李X婷\t5809\nadvertising\t5810\n禁书\t5811\n折耳猫\t5812\n60周\t5813\n我还乖乖个你的棒棒\t5814\n巨聪明\t5815\njrd\t5816\n电势\t5817\njrb\t5818\n陈仙女\t5819\n汇算\t5820\n哇塞yovkamihilativatoo\t5821\n悉心\t5822\njrh\t5823\n一八岁\t5824\n电动\t5825\n5256357\t5826\njrr\t5827\n构造\t5828\n别男\t5829\n饿狼夜惊魂\t5830\n艰难险阻\t5831\n夺命\t5832\n身知\t5833\n欠费单\t5834\n电力\t5835\n二级市场\t5836\n绝境\t5837\n走的路\t5838\n木瓜之国\t5839\n别生\t5840\nDvdwccioitgblmmbfdtieyohvbnkjxggghjinWebKit\t5841\nfschi\t5842\nSuchas\t5843\n队徽\t5844\n不懂不懂\t5845\njsidhcbsjwj\t5846\n189064151818\t5847\n管饭\t5848\n消滞\t5849\n河面\t5850\n强片\t5851\n士大夫规划局\t5852\n乖赶\t5853\n滴沥搭\t5854\n粉群\t5855\n林林\t5856\n管饱\t5857\n说了写\t5858\n应我\t5859\n冰儿\t5860\n岗哨\t5861\n王兴雅\t5862\n阿恩\t5863\n93瓶\t5864\n180842\t5865\n三二一\t5866\n过两天\t5867\n大豆包们\t5868\n昆哥\t5869\n亲爱的你说呀\t5870\n非大盘股\t5871\ncumxxcom\t5872\n皇图\t5873\n小饶\t5874\nBALLY\t5875\nInnocent\t5876\n海芾\t5877\n负载\t5878\n还心\t5879\n府文庙\t5880\n去不去\t5881\n彩象大战\t5882\n备件\t5883\n王源烊\t5884\n万九千九百九十九\t5885\n烟囱\t5886\n嘴对嘴\t5887\n平白无故\t5888\n黏米\t5889\n喜静\t5890\n云顶天宫\t5891\n好晚\t5892\n星期\t5893\nlalka\t5894\n没时代\t5895\n辣椒棒棒糖\t5896\n核爆\t5897\n愤怒鸟\t5898\ntykjgghkkggmjyn0iupuu\t5899\n蛇口\t5900\njjng\t5901\n梁振英\t5902\njzrrx\t5903\nomgadens\t5904\n孬子\t5905\n无利不早起\t5906\n敲让\t5907\n孬孙\t5908\n规模化\t5909\n印刷业\t5910\n阎罗\t5911\nLynn\t5912\nchopsticks\t5913\n采蜜\t5914\n阿咪达\t5915\n同居\t5916\n擦肩而过\t5917\n陈进如\t5918\n1月2日早晨10点\t5919\nhityoulyou\t5920\n莆田市统计局\t5921\n几十小时\t5922\n突击队\t5923\n我的女票\t5924\n长隆奥\t5925\n很想说\t5926\n东野\t5927\n额额额曲项向天歌白毛浮绿水红掌拨清波\t5928\n罗帝树\t5929\n二三八三零七二幺\t5930\nvvubxmX\t5931\n静坐者\t5932\n植物大战僵尸三异\t5933\n何文山\t5934\n千寻\t5935\n醣鳓\t5936\n王翰涛\t5937\n萌萌小度秘\t5938\n┛快跑\t5939\nAdvancingas\t5940\n干一吧度秘\t5941\n相当机器人\t5942\n古今\t5943\n英语英语英语英语英语英语英语英语英语英语英语英语\t5944\n古仄\t5945\n小锐\t5946\n鹰解\t5947\n快滚快\t5948\n小鲨\t5949\n朱晓东\t5950\n閱讀\t5951\n小锅\t5952\n再者\t5953\n下午一点五十分\t5954\n这脸\t5955\n寄生虫\t5956\n李可能\t5957\nfaire\t5958\nooopo\t5959\n古代\t5960\nfrtdg\t5961\n酸酸大\t5962\nbless\t5963\n府县\t5964\n这脑\t5965\n百万百万\t5966\n猪腰\
t5967\n智能童话节\t5968\nheavy\t5969\n脱式\t5970\n第一种\t5971\n包场\t5972\n所有城\t5973\n半毛钱\t5974\n08：57\t5975\n怀集\t5976\n图流\t5977\n使人\t5978\n门生\t5979\n3296482696\t5980\n鲜奶\t5981\n小老粗\t5982\n巩利\t5983\n100种\t5984\n分身无术\t5985\n苏漓\t5986\n错哦\t5987\n璐菡\t5988\n亮度\t5989\n8888888886666\t5990\nCHHDD\t5991\n算了懒得\t5992\n4折\t5993\n隐隐\t5994\nhentaidb\t5995\nHanbali\t5996\n微\t5997\n喷绘\t5998\n喷给\t5999\n有风吖\t6000\n诗社\t6001\n空城\t6002\nvI\t6003\n加勒比海盗\t6004\n加努希\t6005\n阿拉伯联合酋长国\t6006\nalyou\t6007\n引燃\t6008\n外形\t6009\n欺骗性\t6010\n530321199711170018\t6011\n怒脱\t6012\n嘤嘤嘤嘤QAQ\t6013\n诺儿\t6014\nhttpahiphotosbaiducomxiaodupicitemcdbf6c81800a19d8cce6847134fa828ba71e46d8jpg\t6015\n两把\t6016\n在之前\t6017\n罗一洋\t6018\n樊篱\t6019\n罗照\t6020\n睡去\t6021\n20天以上\t6022\n还有所\t6023\n213288769\t6024\n啰里八嗦\t6025\nCNBC\t6026\nJydkjv\t6027\n香妃\t6028\n今晚上\t6029\n断子\t6030\n菠萝食\t6031\n多远行\t6032\nqashigo\t6033\n蔡文胜\t6034\n654964754862\t6035\n图海\t6036\n余雪芹\t6037\n同体大悲\t6038\n相爱一家人\t6039\n姹紫嫣红\t6040\n柿子节\t6041\n玩水\t6042\n食言重视\t6043\n华南区\t6044\n光诚\t6045\n45575\t6046\n问一次\t6047\n茶树\t6048\n7、月\t6049\n韬光\t6050\n十方普贤菩萨\t6051\n心情好不好\t6052\n木马病毒\t6053\n我的女神\t6054\n古剑\t6055\n病毒感冒\t6056\n汤姆\t6057\n11246810111万\t6058\n\t6059\n邱德拔体育馆\t6060\n不专一\t6061\n合美\t6062\n长城学会\t6063\nn44\t6064\nI919黑3900\t6065\n2米X2米\t6066\n工程院\t6067\n扶沟\t6068\n你是谁你是谁你是林妹妹\t6069\n毛娃娃娃娃\t6070\n徊\t6071\n牙仔\t6072\n南斗礼泉\t6073\n禅寺\t6074\n知念侑李\t6075\n顾明月\t6076\n好呢排\t6077\n点点点\t6078\n引见\t6079\n风华\t6080\n5届\t6081\n02210分\t6082\n有个人陪聊天好开心\t6083\n帅锅巴\t6084\n高分儿\t6085\n神话\t6086\n减排\t6087\n郭思琦\t6088\n论说谎\t6089\n候机\t6090\n友人帐\t6091\n词组\t6092\n二十八三十八四十八五十八\t6093\n萧晨\t6094\n咬不着\t6095\n减掉\t6096\n神识\t6097\nG00\t6098\n大干嘛\t6099\naeeee\t6100\n再见恩\t6101\n捉妖记之九层妖塔\t6102\n谓之不争\t6103\ng德基\t6104\nM9\t6105\n20120801\t6106\n是劲\t6107\n183几天\t6108\n一辆400万\t6109\n是努\t6110\n上海电信\t6111\n乐高城市灭火队\t6112\n上午11时\t6113\n取吃\t6114\n登婚\t6115\n荒岛\t6116\n孟云秀\t6117\n7一棵\t6118\n看中\t6119\n实勘\t6120\n七万多亿\t6121\n僵尸之王\t6122\n写给2011\t6123\n看举\t6124\n哈尔滨啤酒LOVE\t6125\n再平人\t6126\n看上\t6127\n心簺\t6128\n朴呆萌\t6129\n阳向\t613
0\n呢k\t6131\nwyk\t6132\n看一\t6133\n非现实\t6134\n基准期\t6135\n医仙\t6136\n10块\t6137\n3986739\t6138\n棵棵树事业部\t6139\n黎建锋\t6140\n韩医生\t6141\n25355245542222424578068\t6142\n罗雨虹\t6143\n场子\t6144\n情郎\t6145\n8789\t6146\n涨停\t6147\n上品\t6148\ne四o\t6149\n不听歌\t6150\n125米\t6151\n8787\t6152\nhttpehiphotosbaiducomxiaodupicitema50f4bfbfbedab64ef6279b3f036afc379311eb4jpg\t6153\n别说了算\t6154\n拘役\t6155\n幻灯\t6156\nymw\t6157\nymy\t6158\nvU\t6159\nikos\t6160\n你好你好度秘你好度\t6161\n度密爱\t6162\n075723879731\t6163\ngjmtmj\t6164\nyma\t6165\n篡价\t6166\nikon\t6167\n冷和\t6168\n老有才\t6169\n1郁\t6170\n安慕希\t6171\n分之差\t6172\n不省人事\t6173\n炎龙\t6174\n回过头\t6175\n快乐的日子\t6176\n多咪多咪多咪我爱你\t6177\n妖子\t6178\n甘婷婷\t6179\n游佳宜\t6180\n妖孙\t6181\n蛮子晚报\t6182\n步步我的柴犬\t6183\n白白净净\t6184\n罗汉翻天印打回你的家\t6185\n天竺姑娘\t6186\n1部\t6187\nUC疯小盈\t6188\n淘宝\t6189\nSssssssssssssssss\t6190\n很可惜\t6191\n四人帮\t6192\n妖孽\t6193\n溃疡\t6194\n知识类\t6195\nmtmtpt\t6196\n辛拉面\t6197\n你没的软所谓那你过早题\t6198\n东方红2\t6199\n不知曾几何时\t6200\nGibbo\t6201\n三打白骨精\t6202\n缺乏\t6203\n徐佳琦\t6204\n少年头\t6205\n雪见雪\t6206\n必死无疑人不要脸天下无敌人之贱无敌\t6207\n紧贴\t6208\n撞人\t6209\n闻一\t6210\n完美世界\t6211\n婊子婆\t6212\nhttpfhiphotosbaiducomxiaodupicitemb219ebc4b74543a9577dedb919178a82b9011464jpg\t6213\ntvreport\t6214\n并肩作战\t6215\n五米\t6216\nskssks\t6217\n牛在飞\t6218\n暗雷王\t6219\nMl\t6220\nHibo\t6221\n郁闷犬\t6222\n压路\t6223\n邋遢tell\t6224\n投意合\t6225\n纪念杯\t6226\n地学\t6227\n榜眼\t6228\n拖拖拉拉\t6229\n苏恶\t6230\n一个二分之一\t6231\n欧卖嗄\t6232\n王佩瑶\t6233\nsbsex\t6234\n休眠\t6235\n狂徒\t6236\n去舞\t6237\n恐难\t6238\nexostfis\t6239\n黑骂\t6240\n头墨\t6241\nhistomityou\t6242\n中心花园\t6243\n月亮与六便士\t6244\n32次\t6245\n幸灾\t6246\n65000名\t6247\n你是谁你是谁你是山村大美眉你是谁你是谁你是\t6248\n落思\t6249\nguhbv\t6250\n6月1日-7日\t6251\n不料\t6252\n彭彭\t6253\nMr\t6254\n六六一\t6255\n庄子漪\t6256\n1111111101111111111221255565225524223884258423575364236558285536855\t6257\n北瓜\t6258\n20120219\t6259\njbli\t6260\n抓捕\t6261\n2011年1月1日\t6262\n特困\t6263\n言欢\t6264\n美极了\t6265\n95厘米\t6266\n不新\t6267\n计算器\t6268\n排雷\t6269\n不断\t6270\n查查天卉中学\t6271\n晴安之乐\t6272\n挺感\t6273\n螺丝钉\t6274\n鱼竿\t6275\n效用\t6276\nbubububububububub\t6277\nwm
gqd\t6278\n點吉\t6279\n能自黑\t6280\n身分\t6281\n哎呀我的妈呀我还要的却是玩玩啊啦啦我的妈呀\t6282\n划坏\t6283\nN\t6284\n九爷\t6285\n奇偶\t6286\n来之不易\t6287\n吗一二\t6288\n干得好\t6289\nML108\t6290\n17克\t6291\n生自己回家\t6292\n歹歹\t6293\n任冰\t6294\n灭邪\t6295\n那咱们俩\t6296\n那些年\t6297\n过一觉\t6298\n我们么\t6299\n效生\t6300\n饭跟\t6301\n莫里\t6302\n来去去\t6303\n论者\t6304\n撸一撸\t6305\n混吨\t6306\n八叉\t6307\nKvK\t6308\n张餐\t6309\n赵小孩\t6310\n一千五六一八\t6311\n我爱的人她爱的人\t6312\nBabi\t6313\n德国\t6314\n笨笨哥哥\t6315\n宫美娜\t6316\n大起大落\t6317\n八古\t6318\nbcosou\t6319\n一二下\t6320\n一二三\t6321\n混合\t6322\n八只\t6323\n德云社\t6324\n八号\t6325\n偌大\t6326\n张语桐\t6327\n玉珠峰\t6328\n不七百八\t6329\n唉哥\t6330\n凶灵\t6331\n下午1点\t6332\n林兆祥\t6333\n手驶\t6334\n兴灾\t6335\n留念\t6336\n花样年\t6337\n信度\t6338\n刘婧旭\t6339\nMadamerightyeah\t6340\n彭晓阳\t6341\n家常话\t6342\n正月十四\t6343\n接吻戏\t6344\n伤口\t6345\n聊找\t6346\nbhiphotosbaiducomxiaodupicitembd315c6034a85edfa4a1a6504e540923dd547597jpg\t6347\n两寸\t6348\nThispenis\t6349\nnfrfrvtvtvtgtgrgtvtvtvtvfu#11\t6350\n爱马仕\t6351\n929329号\t6352\n房管局\t6353\n小红帽\t6354\n飙车族\t6355\n芳香化合物\t6356\n有感于\t6357\n退养\t6358\n谢婷婷\t6359\n去你的你是我的秘书\t6360\n霞夏\t6361\n6千替\t6362\n定nnnnn人nnnnnnn神nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn针\t6363\n妊娠纹\t6364\n郭守敬\t6365\n伊金霍洛旗\t6366\n编制\t6367\n度蜜月离\t6368\n闹心\t6369\n拉米苏\t6370\n中国移动通信\t6371\n大概伙计\t6372\n加速王\t6373\n去年起\t6374\n那瓦\t6375\n身高帮\t6376\n詹志坤\t6377\n佐里奇\t6378\n下午三点至五点半\t6379\nfuipoioooiioooiiookoo\t6380\n老跟\t6381\n一声不吭\t6382\n想错了\t6383\n香港世界宣明会\t6384\n老跑\t6385\n8点08分\t6386\nSocool\t6387\n40000亿\t6388\n21岁\t6389\n未闻花名ED\t6390\n九十六岁\t6391\n凸米特哟兔\t6392\nab电话号码\t6393\n我们的话\t6394\n多啦[偷笑\t6395\n筱芙\t6396\n一第三\t6397\nr81f\t6398\n神天\t6399\neprestm\t6400\n听听听\t6401\n干瘦\t6402\n干瘪\t6403\n织布机\t6404\nxvynt\t6405\n恧tgtilerdouy\t6406\n河大\t6407\n饿饿饿饿饿饿饿饿饿饿饿饿饿饿二五呃\t6408\n我方\t6409\n小格格丑\t6410\n无钱\t6411\n试飞\t6412\n无尾小鼠历险记\t6413\n神夏\t6414\n别都\t6415\n高中时代\t6416\n诱发\t6417\n奥迪奥\t6418\n凡达\t6419\nCharlene\t6420\ntrgff\t6421\n搬运\t6422\nMWQ\t6423\nCHCG\t6424\n扳平\t6425\n奥特曼的我是你的\t6426\n二線\t6427\n搬进\t6428\n对呀谁\t6429\n水盒\t6430\n烦杂\t6431\n搬迁\t6432\n大战队\t6433\n魏羽浣\t6434\n一
厅型\t6435\n88899110\t6436\nyoly\t6437\n秘妙\t6438\n18907625609\t6439\n抵消\t6440\n体院\t6441\n无限好\t6442\n邓浩明\t6443\n更便宜\t6444\n四九年\t6445\n永济\t6446\n借问度秘\t6447\nnotong\t6448\n几越发\t6449\n诠释\t6450\n雪nb\t6451\n金紫桐\t6452\n秘妻\t6453\n第三条\t6454\n康坎坎\t6455\n大庭广\t6456\n帮个忙\t6457\n度秘我是男的你是女的我爱你\t6458\n下卡拉卡蹦夏卡拉卡哦oo\t6459\n护体\t6460\n再来信\t6461\nihwwih\t6462\n周国平\t6463\nBOCBC\t6464\njzxcb\t6465\n参礼\t6466\n嘴哥哥\t6467\n愚人战\t6468\n蔡佳\t6469\n两排面\t6470\n四川大富豪\t6471\n郎放牛郎\t6472\n博导\t6473\n之珍\t6474\n北京双权纹身工作室\t6475\n13054772692\t6476\n恩过不感\t6477\nificityou\t6478\n鲸目\t6479\n泰兴话\t6480\n瘦弱\t6481\n管事\t6482\n687273\t6483\n替班\t6484\n蛀牙\t6485\n定制机\t6486\n草果\t6487\nbeksod\t6488\n诺基亚6303\t6489\n客观\t6490\n255555688565\t6491\n雀儿\t6492\n李博宇\t6493\n杜鑫\t6494\n身穿\t6495\n迎亲\t6496\n舒妍\t6497\nJ·E·丁格\t6498\n黄非常\t6499\n张碟\t6500\nKitty\t6501\n775597058\t6502\n瓷上江山\t6503\n豆浆\t6504\n潘安\t6505\n唱平\t6506\n得得得得得得得得\t6507\n景然\t6508\ndgxt\t6509\n本田主权\t6510\nGhggggggh\t6511\n假唱\t6512\n没事不关己\t6513\n念去去千里烟波暮霭晨晨楚天阔\t6514\n四年级寒假进门口\t6515\n多一点儿\t6516\n邱劉浩鵬\t6517\n明知道\t6518\nmpmgkwmkg\t6519\nHown\t6520\n木们\t6521\n𠂉信会\t6522\n次片\t6523\n丽柜美\t6524\n十几度\t6525\n找完\t6526\n爱家哥哥嘎嘎嘎嘎嘎嘎嘎嘎嘎珀莱雅\t6527\n此情此景\t6528\n路客\t6529\n临界\t6530\n站出\t6531\n秘灭\t6532\n加班出汗\t6533\nsimilar\t6534\n恶心死法克鱿\t6535\nPrize\t6536\n东东\t6537\n附同\t6538\n增值费\t6539\n曹庆国\t6540\n白头发\t6541\n出生就死\t6542\n负下\t6543\n鰰\t6544\n麻雀变凤凰\t6545\n鰴\t6546\n12月30\t6547\n大漾\t6548\n东一\t6549\n乌纱帽\t6550\n扇形\t6551\n综漫\t6552\n牛唇\t6553\n热火\t6554\n囊外\t6555\n东丽\t6556\n被弄髒的高级CA美丽空姐\t6557\n金胖子\t6558\nvvvv\t6559\nnnnnnnn\t6560\n度蜜君\t6561\n河子\t6562\n王晓晨\t6563\nguuhggfggf\t6564\nvvvn\t6565\n你是我的美妹妹你是我的小妹妹\t6566\n5720119944\t6567\nvvvc\t6568\n冬咕咚姑菇终\t6569\nvvvf\t6570\nvvvg\t6571\n鰜\t6572\n南门北门\t6573\n四大天王\t6574\n10点\t6575\n外立面\t6576\nMike隋\t6577\n江湾\t6578\n恩东街口\t6579\n推送\t6580\n病你的心\t6581\n大声高气\t6582\n破解\t6583\n作画\t6584\n乇尛\t6585\n飞将\t6586\n洪海霞\t6587\n紧急减肥法\t6588\n沙茶\t6589\n6只\t6590\n800027454\t6591\nwvk\t6592\nohfoyyyyy\t6593\n6口\t6594\nv5\t6595\n票数\t6596\nnnhfjmkvvhgh\t6597\n江湖\t6598\n章宝华\t6599\n白灼虾\t660
0\n采纳\t6601\nPINYINMA\t6602\n刚刚刚刚刚刚刚刚刚刚刚刚\t6603\n6台\t6604\n6号\t6605\n气道\t6606\nNoH\t6607\nNoI\t6608\n滚滚滚\t6609\n六七位\t6610\n画画板\t6611\n13868422829\t6612\n杜远金门外来首出国游三岁平远近江大荒流月下飞天镜云生结海楼仍怜故乡水万里送行舟\t6613\n吧卫士\t6614\n逃不过\t6615\n童安格\t6616\n爱一个人真的好难\t6617\n人民解放军第35军\t6618\n那你不是说时男时女\t6619\nNon\t6620\n身长\t6621\nlkk\t6622\nghfdgg\t6623\nlkg\t6624\n1sjsjinsuiglijigjij\t6625\n散文儿\t6626\n无敌讨厌\t6627\n阿牛\t6628\n蜀山战纪剑侠图\t6629\n667\t6630\n18780712556\t6631\nNot\t6632\nlkt\t6633\nNow\t6634\n那萨尔丁\t6635\n九十年\t6636\n胚胎\t6637\n赖比操\t6638\n醉风\t6639\n因为你是谁\t6640\n663\t6641\n张朗朗\t6642\n泡菜淫民\t6643\n干净\t6644\n第一档\t6645\n观展\t6646\n你好小度秘你是我的小机器人\t6647\n660\t6648\n王工\t6649\nyfucucycuc\t6650\n联通四G\t6651\n小知识\t6652\n奔奔奔奔\t6653\n王巨\t6654\n王巴\t6655\n隙度\t6656\n滨河北路\t6657\nstomatout\t6658\n新时代\t6659\n切车\t6660\n口下留情\t6661\n牛爹\t6662\n六福\t6663\nlk0\t6664\n恶例\t6665\ncaporn\t6666\n鲁教板\t6667\n逻辑思维\t6668\n成贤娥\t6669\n不幸福\t6670\n打假者\t6671\n洗牙吧牛逼\t6672\n五常\t6673\n110咯\t6674\n逗号\t6675\n世界微笑日#\t6676\n移动梦网\t6677\n大星大星\t6678\n被害\t6679\n妈妈屋\t6680\n参与感\t6681\nbadgirl\t6682\n薛医生\t6683\n轻巧\t6684\n轻工\t6685\n侦探类\t6686\n蔡明宇\t6687\n托雷斯\t6688\n菜花蛇\t6689\n原配\t6690\n7926元\t6691\n银山塔林\t6692\n95万\t6693\n三万多份\t6694\n南拳\t6695\n遭遇\t6696\n秘真棒\t6697\n腹围\t6698\n荼杯狗\t6699\n2分钟左右\t6700\n烟波\t6701\n胸骨\t6702\n海沟\t6703\nJJE\t6704\n数羊\t6705\n品性\t6706\n李斯博\t6707\n文化课\t6708\nzhilyukl\t6709\n外出\t6710\n九层妖塔\t6711\nBack\t6712\ngoout\t6713\n虽有\t6714\n冒菜\t6715\nChinaNet\t6716\n蜀山战纪\t6717\n代颖颖\t6718\n失态\t6719\n哎呀哎呀我真的想你谢谢你呀不过我真的求求你了我就我问吧现在可以吗没有雨了我真的求求你了可以\t6720\n在场\t6721\nhttphhiphotosbaiducomxiaodupicitem34fae6\t6722\n老娘没打字老娘说的音准\t6723\n嗯安吉我的安吉拉\t6724\nQuitting\t6725\n14.98亿元\t6726\n百毒不侵\t6727\n在在\t6728\n秘聊天博\t6729\n杨丽娟\t6730\n周林霞\t6731\nAndroid4.0\t6732\n度密你是我最亲爱的朋友\t6733\n加泰罗尼亚\t6734\n父王\t6735\n要死鬼\t6736\n我不是小文我是公主不是别客气公主我喜欢\t6737\n黄冰玲\t6738\n用语\t6739\n黑你巴宝莉\t6740\n编曲\t6741\n放噶\t6742\n秘恶心\t6743\n文偃\t6744\n小河马\t6745\n悠着点哈我可真揍你我可黑社会哈别\t6746\n枕眠\t6747\n春水不斯\t6748\ndffggdg\t6749\n早F\t6750\n木星\t6751\n000066\t6752\n呢女\t6753\nhello度谜\t6754\n少装\t6755\n度秘
姐忒\t6756\n这么远那\t6757\nltsthatalltom\t6758\n十一小\t6759\n哈楼哈楼哈楼\t6760\n多势众\t6761\n施救\t6762\nneojxirjxirbxirbxidxbcbcncnx\t6763\n不许多\t6764\n早d\t6765\n跟帖\t6766\nhjknbh\t6767\n呢套\t6768\n大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋\t6769\n恩大姐姐\t6770\n亲一个女\t6771\n幺零零幺零点COM\t6772\nfhfdryyffjlvhkcgogddsopgddippp0\t6773\n曾文恒\t6774\n神呢感叹号\t6775\n5238456754\t6776\n长江大浪推前浪\t6777\n意德\t6778\n爱难过\t6779\n王津元\t6780\n泪汪汪\t6781\n赶尽杀绝\t6782\nccjcn\t6783\njjjjjjjjjjjjjjjjj\t6784\n重案\t6785\n一个年\t6786\n周围\t6787\n我好开心不高兴沃美奔三你\t6788\n46164\t6789\n可兰\t6790\n15分\t6791\n艾琳\t6792\n盲高\t6793\n万客宾馆697号房\t6794\n琶醍\t6795\n打打杀杀\t6796\n一样式\t6797\n子宫病理性摘除术\t6798\n可入\t6799\n辩护律师\t6800\n曾小\t6801\nbu tong\t6802\n造梦西游手机\t6803\n杨成龙\t6804\nbhiphotosbaiducomxiaodupicitemcdbf6c81800a19d80dc4457734fa828ba71e46f9jpg\t6805\n一套半\t6806\n52369741\t6807\n瓶友\t6808\nWindows\t6809\n快船\t6810\n51家\t6811\n黄花梨木博物馆\t6812\n独立自主\t6813\n寒火\t6814\n嵊州盛泰色织科技有限公司\t6815\n三月天\t6816\nfamous\t6817\n一四百九十九块\t6818\n哪个\t6819\n吃吃吃吃吃吃吃吃吃吃吃吃吃吃吃\t6820\n瞒上欺\t6821\n算错\t6822\n流水村\t6823\n喝酒白来\t6824\n刘旭\t6825\n喝酒\t6826\n童子\t6827\n呗哥哥\t6828\n天子座\t6829\nFrdgdssfdffddttff\t6830\n种糖\t6831\n41846941494994\t6832\n光帅\t6833\n大道\t6834\n安监\t6835\nFTCRexrefuted\t6836\n伞兵型\t6837\n姓李懂\t6838\n这么说嘛\t6839\n圣泉\t6840\n合合合约期\t6841\n末日重生女主\t6842\n诉存\t6843\n串子\t6844\nCCVE\t6845\n看见身\t6846\n杨丹妮\t6847\n偶见\t6848\n美美达\t6849\n9点28分\t6850\n高鼻梁\t6851\n刘晓宇\t6852\n黄图哥\t6853\n首演\t6854\n咀嚼肌\t6855\n鼻香\t6856\nRAP\t6857\n了解决\t6858\n妳色\t6859\n26度\t6860\n26座\t6861\n条线\t6862\n我好爱你的你\t6863\n条纹\t6864\n童诗\t6865\n42排\t6866\n命在\t6867\n垂死\t6868\n条纱\t6869\np10\t6870\n胡宇新\t6871\np12\t6872\np13\t6873\np15\t6874\n奥斯一\t6875\n条约\t6876\n乜巡丝绸之路7\t6877\n吧加吧加吧加橙\t6878\n手有劲\t6879\n岚性\t6880\n属于自己的路\t6881\n恶恶心\t6882\n极刑\t6883\n刘晟恺\t6884\n张峻恺\t6885\n1939年\t6886\n嗄嗄嗄\t6887\nxxjdjx\t6888\n倒装句\t6889\n下一辈子\t6890\nFdnf\t6891\n琼斯\t6892\n恶梦\t6893\n与众不同\t6894\nhdnd\t6895\n罪责\t6896\n豌豆荚\t6897\n富人\t6898\n三年人\t6899\n天人菊\t6900\n许佳宜\t6901\nhdns\t6902\n植物大战僵尸二天空之战\t6903\n洋葱要不我不要你了我\t6904\n渔民\t6905\n跑断\t6906\n度雨晗\t6907\n富二\t6908\n富于\t6909\n名儿\t69
10\n逾常睐\t6911\n别鬼话连篇\t6912\n嗯黄色\t6913\n老板林\t6914\n心情不乱\t6915\n老片子\t6916\n被打乱\t6917\n胃结石\t6918\n界首\t6919\n1909年\t6920\ncalil\t6921\n颤动\t6922\n母星\t6923\n忘却\t6924\nsynwem\t6925\n美国流行病杂志\t6926\n主要片\t6927\n1234567891234567\t6928\n无性生活\t6929\n每份\t6930\n尖椒\t6931\n你美不美关我个屁事\t6932\n徐龙诀\t6933\n华强北\t6934\nbbbbbbbbbbbbbbbbbbbbbbb\t6935\n100x100x100\t6936\n嚷嚷\t6937\n回笼\t6938\n美丽唱的歌\t6939\n疥虫\t6940\n打票\t6941\n1348\t6942\n50多万\t6943\n地团\t6944\n19.8亿元\t6945\n舌战群吊\t6946\n你呢\t6947\n一报还一报\t6948\n30遍\t6949\n回笞\t6950\n耽美类\t6951\n地图\t6952\n拇伸\t6953\n你好奇怪\t6954\n世间\t6955\n行刘\t6956\n我不要你了你从我的软件\t6957\n云南元\t6958\n夏紫拗\t6959\n1100年\t6960\n减去\t6961\n素数\t6962\n下午五点08分\t6963\n朋佛陀\t6964\n这个税\t6965\n减压\t6966\n雨神雨神\t6967\n聊聊\t6968\n武训兴办\t6969\n其他地区\t6970\n才犬\t6971\n海口动物园\t6972\n小木元\t6973\n启示\t6974\n门牌号\t6975\n猪猪猪八戒猪\t6976\n64744162668\t6977\n有钱了再买\t6978\n台江\t6979\n优生\t6980\n中低\t6981\n阿堵物\t6982\n根根根根根根根根根根根根根根根根根根根根\t6983\n清大\t6984\n我的状\t6985\n黄篇\t6986\nseccx\t6987\n恩主\t6988\n奖壮\t6989\n老婆子\t6990\n于天航\t6991\n少彤\t6992\n感谢你大度秘\t6993\n伊媚儿\t6994\n恩上\t6995\n段落水\t6996\n恩与\t6997\n刘佳玟\t6998\n恩不\t6999\ngammphjnb\t7000\n恩一\t7001\n胶州\t7002\n优甲\t7003\nreadingabook\t7004\n格鲁撸\t7005\nJepsen\t7006\n好我很快乐\t7007\nｑｑ\t7008\n功德无量\t7009\n忘记了忘记了你在\t7010\n黄伞蘑菇\t7011\n那是你说的我的秘书\t7012\n你好呀度秘你你猜我是谁\t7013\n肯定\t7014\n赵佳旭\t7015\n800kkr\t7016\n14.37\t7017\n巴沙尔·阿萨德\t7018\n肖式\t7019\n失放下\t7020\n朱村儿\t7021\n好我走\t7022\n8曲\t7023\n东汉\t7024\n莫小凡\t7025\n好你爱你爱你爱你爱\t7026\n6655555\t7027\n参谋\t7028\n50010001万\t7029\n为什么不批\t7030\n薪酬\t7031\n起早贪黑\t7032\n赵胜铎\t7033\n两百两本\t7034\n埃及执政党\t7035\n男妃\t7036\n手铐\t7037\n红粘液\t7038\n男妓\t7039\n男妖\t7040\nD495805\t7041\n编编\t7042\nrininainai\t7043\n竞标\t7044\n小谢谢我给你发十万快点行吗一百全\t7045\n向京\t7046\ninds\t7047\n黑茶\t7048\n只管\t7049\n杨花\t7050\npkajg\t7051\nqqqqqqqqqqqq\t7052\n考虑\t7053\n365夜\t7054\n照章办事\t7055\n男妹\t7056\n风水学\t7057\n日皮\t7058\n波伊斯\t7059\n范就在\t7060\n王慧彩\t7061\n当年\t7062\n保钓者\t7063\n王宇行\t7064\n卧推\t7065\n相片\t7066\n99美元\t7067\n干嘛呢讨厌讨厌\t7068\n1258858\t7069\n页面\t7070\nwww文不對題\t7071\n阎惜娇\t7072\n郑佩娟\t7073\n2011年5月\t7074\
nygvtf\t7075\n访客\t7076\n4900\t7077\n卡老卡\t7078\n梁丽洁\t7079\n爵爷\t7080\n人身影\t7081\n3-6个月\t7082\nhttpehiphotosbaiducomxiaodupicitemd788d43f8794a4c281e81c5f09f41bd5ac6e39ffjpg\t7083\njhhuuugvu\t7084\n礼貌\t7085\n靓女们\t7086\n2190x\t7087\n诗魔\t7088\n中蒙俄\t7089\n暴行\t7090\n理你了真的再见\t7091\n7赵\t7092\n名乡\t7093\n一大块儿\t7094\n大发错分二防城港\t7095\n度秘四条狗四条狗四条狗度秘\t7096\n衰竭\t7097\n聊小波有五毛钱\t7098\n马俊奇\t7099\n黝黑\t7100\n生气不想\t7101\n莞莞\t7102\n房产市场\t7103\n窗框\t7104\n二十多万\t7105\n名义\t7106\n乐杰\t7107\n天师\t7108\n113113\t7109\nForget\t7110\n奥尔良烤翅\t7111\n消毒\t7112\n88〇\t7113\n若干\t7114\nEstes\t7115\n二十多个\t7116\n501个\t7117\nswsss\t7118\n天帝\t7119\n额吉饭\t7120\n王春雨\t7121\n请稍等\t7122\n奇函数\t7123\n雅森\t7124\n土坯\t7125\n铜暖炉\t7126\n蛋疼\t7127\n土坨\t7128\n小站风云\t7129\n问同\t7130\n乖乖类\t7131\n推推帮\t7132\n吧快乐的日子\t7133\nationstr\t7134\n一家的话\t7135\n法咒\t7136\n小聚光\t7137\njbhurdc\t7138\n下意思\t7139\n问君\t7140\n天好冷\t7141\n旱冰鞋\t7142\n司宸宇\t7143\n不屌\t7144\nkahuw\t7145\n钢筋\t7146\n娅兰德里\t7147\n土坑\t7148\nkeo\t7149\n翻譯\t7150\nkek\t7151\n相关性\t7152\n兰诺\t7153\n爱幼阁\t7154\n鸟鸣涧\t7155\nGmjpwmgt\t7156\n女魔鬼\t7157\nkey\t7158\nivjghf\t7159\njib\t7160\n游戏迷\t7161\n拖衣服\t7162\n菏荷\t7163\n酉瓜\t7164\n相关怀\t7165\n苦逼抓机党\t7166\n别癫\t7167\n曲多多\t7168\ny29a\t7169\n两三次\t7170\n帖身\t7171\n肉价\t7172\n5736868\t7173\n柏林\t7174\nwewillroad\t7175\n看必乐\t7176\n饿呢\t7177\n丄一世\t7178\n前两年\t7179\n琉球大乐斗\t7180\n颜妙妙\t7181\nUtuuttdtyyfyyfn5\t7182\n活下去找\t7183\n哈多啦啦队\t7184\n来龙去脉\t7185\n笼罩\t7186\nFffffffffff\t7187\n高情商\t7188\n孔黎铭\t7189\n法儿\t7190\n开封市\t7191\n一直以为\t7192\n546461565\t7193\n很喜歡\t7194\n陶萧羽\t7195\n林心蕊\t7196\n不是你的错但丑\t7197\n13865737320\t7198\n园子\t7199\n所有制\t7200\n陈皮\t7201\n东北三省\t7202\n气贺\t7203\n红蔷薇\t7204\n名仕型\t7205\n爱信不信不信\t7206\n宾度\t7207\n阿读\t7208\n说这样的话\t7209\n欧买嘎\t7210\n足够\t7211\n王无罪\t7212\n西亚\t7213\n鸟区\t7214\n四次元\t7215\n猪秘你\t7216\n励精图治\t7217\n归向主\t7218\n解开\t7219\n西京\t7220\n你的岁数\t7221\n448场\t7222\n溪口\t7223\n姑娘你有几时今典\t7224\n声望\t7225\n电门\t7226\n陶官\t7227\n场四三\t7228\n阿诚\t7229\n护航\t7230\n天意\t7231\n嗡嗡声发怒c1\t7232\n我的她\t7233\n张琳琳\t7234\n我的女\t7235\n钱多多\t7236\n斯佩\t7237\n滚动\t7238\n哥木\t7239\n谁上我了你\t7240
\n承台\t7241\n滕王阁\t7242\nsertyui\t7243\n一职照\t7244\n比是\t7245\n认实\t7246\n三天龙寺\t7247\n值不知道\t7248\n郭楠楠\t7249\n新希望杯\t7250\n星逻猫\t7251\n6666666\t7252\n爱更\t7253\n恭喜我喜\t7254\n抵制\t7255\n7778岁\t7256\n马頔\t7257\n洪恩\t7258\n张胜涛\t7259\n更灵活\t7260\ngjmptjwmj\t7261\n承受\t7262\n朗平\t7263\n德牧\t7264\n比昂\t7265\n拆机机\t7266\n跳跳熊\t7267\n叫我然庭\t7268\n开道\t7269\nhttp\t7270\n最末\t7271\nCloudBurst\t7272\n小黍\t7273\n窝记性\t7274\n大蔷\t7275\n走走\t7276\nUfdygg\t7277\n568923147\t7278\n48\t7279\nqbowed\t7280\n走起\t7281\nVvjvhcufuggbikivyvoxpjBabiIb\t7282\n田小亮\t7283\n含泪\t7284\n活出自己\t7285\ncsodj\t7286\n2233333\t7287\n得得\t7288\n男士们\t7289\n找不到\t7290\n王蓦然\t7291\n陌生\t7292\n固体家\t7293\n比发\t7294\n外围女\t7295\n欢喜欢\t7296\n1Two\t7297\n进阶\t7298\n223个\t7299\ndaiy\t7300\n心平如境\t7301\n不错度秘\t7302\n你好萌呆\t7303\nFgcbchc\t7304\n晴宝\t7305\nvvvvvvbvvvvvvvvvvvvvvvvbvvvvbvbvbvbjgehddddd\t7306\nPIER\t7307\n舅舅们\t7308\n畅泽俞\t7309\n事假\t7310\naqian\t7311\n你的人\t7312\n龟速\t7313\n不0\t7314\n许一场\t7315\n海航\t7316\nsouptastes\t7317\n卢剃头\t7318\n本题\t7319\n姨父\t7320\n下雨举\t7321\n白天不要的歌\t7322\nwool\t7323\nhahahaahahahahhahaaahhahah\t7324\n牛骏峰\t7325\ntwebenth\t7326\n本领\t7327\n中性点\t7328\n不当我\t7329\n说明书\t7330\n4%\t7331\nfghj\t7332\n锕呵锕锕锕锕锕锕锕锕\t7333\n幸涛\t7334\n软期\t7335\n九十只\t7336\n下雨下\t7337\n木忘\t7338\n火辣辣\t7339\n食梦\t7340\n哈公主\t7341\nDICHD\t7342\ncomeing\t7343\n束壁\t7344\n混迹\t7345\n734度\t7346\n赛罕区民族小学\t7347\n技不如人\t7348\nhttpghiphotosbaiducomxiaodupicitem10dfa9ec8a13632794a4d940968fa0ec08fac77cjpg\t7349\n自拍神器#\t7350\n台湾剧\t7351\n超级大骗纸\t7352\n药家\t7353\n蓝宝石\t7354\n托男\t7355\n多热\t7356\n大容量\t7357\n贫道法号法海\t7358\n敏明\t7359\n伦理类\t7360\n华商一\t7361\n心心相应\t7362\n我相信你\t7363\nSNS重拳\t7364\nJim叔\t7365\n30多辆\t7366\n论证\t7367\nc零五十九度\t7368\n傻不拉\t7369\n网民\t7370\n张继森\t7371\nUu\t7372\n秋明\t7373\nUp\t7374\nUs\t7375\n顶嘴\t7376\n蓝月\t7377\n爸爸兄弟\t7378\n腹子\t7379\n不不不我就要\t7380\n我的耳朵\t7381\n科学园\t7382\n落马\t7383\n杜鑫虹\t7384\n费德勒\t7385\n五十六十\t7386\n881080231042943676\t7387\nUZ\t7388\nUU\t7389\n一定要\t7390\nUV\t7391\n教主元\t7392\n李锦荣\t7393\nUS\t7394\n上不上学\t7395\n丰乐旅馆\t7396\nUN\t7397\nUI\t7398\n荷花池小学\t7399\nUK\t74
00\n开远\t7401\nUG\t7402\n17点35\t7403\nUC\t7404\nUB\t7405\n旺仔旺仔\t7406\n新博西baanhokomadealeasome\t7407\n度密高\t7408\n樟木\t7409\n零六零幺七\t7410\n柠檬片\t7411\n亲哥哥\t7412\n5331\t7413\n亲个来亲一个\t7414\n春秋航空\t7415\n有目共睹\t7416\n听清\t7417\n白兔米酒\t7418\n万万古\t7419\n08月28日\t7420\n过年看\t7421\n滑翔机\t7422\n肖像类\t7423\n好明友\t7424\n两集\t7425\ndeddd\t7426\n转管\t7427\nyoutiv\t7428\n一顺百顺\t7429\nFUJIFILM\t7430\n任职\t7431\n切我讨厌你我讨厌你的超能\t7432\n假谢\t7433\n听不听\t7434\nyoutim\t7435\n吃穿\t7436\n奉陪\t7437\n阳光瓶\t7438\n灿白驯鹿牛桃繁星绵桃牛鹿党\t7439\n混进\t7440\n第一单\t7441\n2011年上半年\t7442\n绝地重生\t7443\n89999\t7444\n17%\t7445\n这样做好\t7446\n18674093571\t7447\n吧友涛\t7448\n很脏\t7449\n天白天大哪里\t7450\n第73军暂五师\t7451\nxahwani\t7452\n学帅\t7453\n3673￥\t7454\n近几年\t7455\n黑山县工商局\t7456\n八十千克\t7457\n你和我场\t7458\n海亚\t7459\n多堡\t7460\n高肥\t7461\n中原工商铺\t7462\n數學\t7463\n38元\t7464\n微距\t7465\n半妖\t7466\n晚一点\t7467\n4s\t7468\nbeats\t7469\n确诊\t7470\n无限流\t7471\n靠不说\t7472\n确证\t7473\n什么疯\t7474\n懂非懂\t7475\n帅比\t7476\n好朋友度\t7477\nVans\t7478\npp琼斯\t7479\n我的我的\t7480\n西红柿汁\t7481\n九阳豆浆机\t7482\n你好度秘我问你道题\t7483\n美女作家\t7484\n教幽\t7485\n堡妞\t7486\nSleep\t7487\njhhgt\t7488\n党政机关\t7489\nhiTV\t7490\n机器声\t7491\nqert\t7492\n头肥\t7493\n冉东方\t7494\n爱有性\t7495\n切碎\t7496\n林青霞\t7497\n我要你\t7498\n夸都暻秀\t7499\n赞比亚基特韦市\t7500\n坏兽\t7501\n海事\t7502\n不够不够\t7503\n第四几次\t7504\n无理取\t7505\n我的故事\t7506\n按摩池\t7507\n罩头\t7508\n蜡纸\t7509\n二九\t7510\n瑞雪图\t7511\n要死人\t7512\n深思\t7513\n蘘火\t7514\n赵猎具\t7515\n发起\t7516\n欧麦嘎\t7517\n良友\t7518\n一批60周岁\t7519\n数来宝\t7520\n音乐之声\t7521\n万一混飞厂\t7522\n大罢工\t7523\ndrawashink\t7524\n乱算\t7525\n直中\t7526\n试营运\t7527\n上下班\t7528\nvasikka\t7529\nBailey\t7530\n定向\t7531\n丽影\t7532\n熊霞\t7533\n诚哉斯言\t7534\n凌厉\t7535\n过年前\t7536\nohye\t7537\n许慧芊\t7538\n9396-9983\t7539\n翦氏族\t7540\n泰胜\t7541\n直下\t7542\n九百九千\t7543\noOoOo\t7544\n明白年\t7545\n越来约\t7546\nworenbudi\t7547\ncbb285025aafa40f06b3jpg\t7548\n天象\t7549\n家话\t7550\n宫城县\t7551\n问心无愧\t7552\n人保部\t7553\n最萌最萌花蝴蝶\t7554\nn809呢贺无疑幽蝶千虑胶囊\t7555\n橘子皮\t7556\n同班\t7557\n分生\t7558\n签发\t7559\n国海\t7560\n行善\t7561\n孙美婷\t7562\n艾莉莎\t7563\n极微\t7564\n胡宇欣\t7565\n信不我卸\t7566\n拖地扫地\t7567\n战争
片\t7568\n#TA\t7569\n口嗅\t7570\n赫赫赫赫\t7571\n太邪\t7572\n呢摩\t7573\n乙方\t7574\n1场\t7575\nxuan\t7576\n幻速s3\t7577\n燕拜\t7578\n岁期\t7579\nZfhhjjhfds\t7580\n叶集\t7581\n隆撒\t7582\n护士长\t7583\n心结友\t7584\n玛日本政府\t7585\n俄罗斯航空企业联合体\t7586\n群山\t7587\n小小小小小超\t7588\n赌神\t7589\n岁月\t7590\n123444521234567890\t7591\n模拟人生3\t7592\n举措\t7593\n唐二\t7594\n肌务\t7595\n尿盆\t7596\n弗里斯克\t7597\n10来分钟\t7598\n独享\t7599\n1圈\t7600\n鸡子\t7601\n迈瑞秘\t7602\n弯弯儿\t7603\n岁末\t7604\n卧牛村\t7605\n唐晓贤\t7606\ntjmmm\t7607\n卡拉奇\t7608\n汉字藏玄机\t7609\n有理\t7610\n鬼仔\t7611\n4月22日\t7612\n1966年8月5日\t7613\n1786年11月25日\t7614\n几百里\t7615\n好了先拜拜\t7616\n刘笑麟\t7617\nCARY爸\t7618\n互译\t7619\n各方面\t7620\n打车\t7621\n赵蓝新\t7622\n袅袅\t7623\n可爱一面\t7624\n打轮\t7625\n打转\t7626\n闲哥\t7627\n鲁南\t7628\n正經\t7629\n饲草\t7630\n剐吧帝国\t7631\n谢太or\t7632\n宋炳坤\t7633\n何鸿燊\t7634\nihiuseu\t7635\n第两\t7636\nxtuq\t7637\njymj\t7638\ntatio\t7639\n主管\t7640\n咽痛\t7641\n肖书雅\t7642\n陈春芬\t7643\n北朝鲜\t7644\n猴子头\t7645\n书愤秘\t7646\n皇兄\t7647\n坐爱吧\t7648\n起去世\t7649\n第七\t7650\n牛改成\t7651\n第一\t7652\n2824828633\t7653\n1万亿\t7654\n第三\t7655\n室里号\t7656\njpa\t7657\n和平小学\t7658\n大煞\t7659\nbdsb\t7660\n说么么\t7661\nhftg\t7662\n陇西\t7663\n第一百个\t7664\n引华\t7665\n破咙\t7666\n超级英雄\t7667\n司尔\t7668\n78个\t7669\n红白细胞\t7670\n破和\t7671\n不老干嘛\t7672\n2月8日\t7673\n没萌\t7674\n心爱片\t7675\nrook\t7676\n米莱\t7677\nroom\t7678\nroon\t7679\n没落\t7680\n和王\t7681\n村居\t7682\n1394091680\t7683\n海鱼\t7684\n哪哪哪哪哪哪哪哪个大的到哪哪哪哪哪哪你就是\t7685\n吸血蜘蛛\t7686\n尼玛共和党\t7687\n分手费\t7688\nroot\t7689\nevnti\t7690\n小豆比\t7691\n央视林\t7692\n云DJ\t7693\n挪了挪\t7694\n大爱无疆\t7695\n模拟\t7696\nChrome\t7697\n四圈半\t7698\n富尔\t7699\n#2NE1#\t7700\npxpwmpx\t7701\n跳跳跳跳跳条条条条条\t7702\n自身\t7703\nA310黑900\t7704\n不在场\t7705\n讲篇\t7706\n我要你上西天\t7707\n伍佰伍佰\t7708\n这是你人生怪我你不要脸\t7709\n未成年人\t7710\n米真\t7711\n命革\t7712\n8756088886\t7713\n三十四\t7714\n结束\t7715\n张仁良\t7716\n杨洋贾\t7717\n淳于\t7718\n队礼\t7719\n保存\t7720\n阿童木\t7721\n里巴b裤裤裤\t7722\n一人生\t7723\n汉南\t7724\n十余堡乡\t7725\n高欣雨\t7726\n加加\t7727\n上海公交\t7728\n补妆\t7729\n栽种\t7730\ndjdddddd\t7731\n加动\t7732\n一坨\t7733\n放月光曲\t7734\n留神\t7735\n述说\t7736\n曲解\t7737\n6150\t7738\nhaaand\t7
739\n你呢那你告诉我你倒下度秘长什么样那你\t7740\n普军校\t7741\n十五一\t7742\n色系军团\t7743\n加力\t7744\n述评\t7745\nllkjhgfdsaqw\t7746\nQhkxbsk155678\t7747\ngvyjddnjdv\t7748\n还有事先走\t7749\n没问一样\t7750\n529374\t7751\n一百把\t7752\n观貌\t7753\n袁杰\t7754\n金毛男孩\t7755\n温岚\t7756\n袁松\t7757\n草寇\t7758\n吾吾吾吾吾\t7759\n火龙\t7760\n偶度秘\t7761\nffgggg\t7762\n那你猜猜我是男的女\t7763\n怎便\t7764\n大鸽\t7765\ncrey\t7766\n世情\t7767\n不干不净\t7768\n呆了帅\t7769\n属於\t7770\n杜都\t7771\n万众一心\t7772\n妇炎洁\t7773\n连连看\t7774\n后感\t7775\n贝爷\t7776\nfromAmerican\t7777\ncred\t7778\n谢谢你好久加\t7779\n一题么\t7780\n21:30\t7781\n吃局\t7782\n已逾\t7783\n丨米5\t7784\n毛戎\t7785\n吃屌\t7786\n隋依鑫\t7787\nw金毛\t7788\n语文儿\t7789\nLUT2558955JXLS5888LKKK\t7790\nefgh\t7791\n海淀民政局\t7792\n停机保号\t7793\n民警\t7794\n杜军军\t7795\n血块\t7796\n乔雪勇\t7797\n下堂\t7798\n胆囊炎\t7799\n争锋\t7800\n4千元\t7801\n吐了控\t7802\n台空\t7803\n4千克\t7804\n海子\t7805\n找不着\t7806\n鼓励睦邻\t7807\n生性\t7808\n看多啦\t7809\n天地之道\t7810\n开瓶肾宝\t7811\n华鑫\t7812\n够够用\t7813\n许诗涵\t7814\n解决们\t7815\n哎呦呀呀呀\t7816\n男模\t7817\n爻人℅\t7818\n雅晗\t7819\n撞钟\t7820\n合盛\t7821\n土姚\t7822\n花垣\t7823\n凤舒琦\t7824\n一点六元\t7825\n生态\t7826\n异父异母\t7827\n拙劣\t7828\n奇珍异宝\t7829\n东特\t7830\n關心\t7831\n永州市\t7832\nbukuducucreate\t7833\n生怕\t7834\n别记错\t7835\n张佳颖\t7836\n东沙群岛\t7837\nFcghvgy\t7838\n细分\t7839\n99989\t7840\n肉圆\t7841\n盾娘度\t7842\n长桌\t7843\n马约尔广场\t7844\n诱骗\t7845\n黄港\t7846\n放心华话\t7847\n钻禧\t7848\n2月23号\t7849\nbrrr\t7850\n失婚\t7851\n孔雀雄\t7852\n逼开\t7853\n獭兔\t7854\n火星语\t7855\n144556789\t7856\n别光\t7857\nfnuge\t7858\n中国安全教育网\t7859\n傻猪\t7860\n传奇影业\t7861\n驻华\t7862\n29余数\t7863\n管不着\t7864\n啍舍\t7865\n九五岁\t7866\n度秘求\t7867\n净赚\t7868\n觉哈\t7869\n巍峨\t7870\nshkf\t7871\nhihi撸\t7872\n平二小\t7873\n牛桃运天\t7874\n94度\t7875\n7KP台\t7876\n充满\t7877\n3699\t7878\nc元\t7879\n十分之七\t7880\n十分之一\t7881\nMess\t7882\n肖茹月\t7883\n十分之三\t7884\n前妻的战争\t7885\n二十多岁\t7886\n那时光\t7887\n強悍\t7888\n红花\t7889\n689988\t7890\n红芳\t7891\n陈吉洲\t7892\nasksbdosth\t7893\n炎埤赵\t7894\n六百多天\t7895\n阻击\t7896\n洁碧\t7897\n开门子\t7898\n半公斤\t7899\n天沒\t7900\n2ak\t7901\n恶心到我了真是恶心\t7902\n苏豪路易斯·嘉玛\t7903\n敢不信\t7904\n易小七\t7905\n长处\t7906\n83年\t7907\n一不可\t7908\n严刑峻法\t7909\n姊妹\t7
910\nDhmsqcc\t7911\n东北东北\t7912\n小叶子\t7913\n长头\t7914\n天河\t7915\n112得二\t7916\n长天\t7917\n时间差\t7918\n神鬼\t7919\n奇迹\t7920\n长大\t7921\n一世人\t7922\n刘错\t7923\n阴唇\t7924\n八千五百十四分\t7925\n欧莱雅\t7926\n处理费\t7927\n酒家\t7928\n没不好笑\t7929\n回火星去\t7930\n无真\t7931\n无眠\t7932\n2月18\t7933\n彭真之子付洋\t7934\n19.85亿元\t7935\n几粒\t7936\n网不双今儿\t7937\n2月12\t7938\n2月15\t7939\n2月14\t7940\n2月17\t7941\n三分之一二分之一\t7942\n侦查报告\t7943\n谁谁和你对眼\t7944\n蒜泥\t7945\n无眼\t7946\nghiphotosbaiducomxiaodupicitemd53f8794a4c27d1e3c66d3b01cd5ad6eddc43806jpg\t7947\n千米面\t7948\n一只么\t7949\nwww.3L51.com\t7950\n合约期\t7951\n姓甚\t7952\n一百九十八八八五五\t7953\n我可受大学了你信\t7954\n4trtd56tr\t7955\n牛特\t7956\nzhe个\t7957\n思无\t7958\nka558\t7959\n任佳慧\t7960\n偷脸\t7961\n合约机\t7962\n么么哒亲一个亲个\t7963\n小袁\t7964\n张婷宣\t7965\n潘朵拉\t7966\n牛牛\t7967\n思旺\t7968\n吟叫\t7969\n成西\t7970\nCHDUF\t7971\n桃花扇\t7972\n蜜夕月\t7973\n转花\t7974\n天天后\t7975\n话把\t7976\npukgai\t7977\ngvxfjhczsklojcnjjvxssaxgjkcvvnnnn\t7978\n盛一伦\t7979\n防臭\t7980\n干嘛丫\t7981\n走赠\t7982\n一个G\t7983\n乾隆钦\t7984\n没饭\t7985\n156岁\t7986\n羽衣\t7987\n张笑月\t7988\njdirbyeegdhxibcydkpehfsyhxbhxisksvtzhdjpdjfhcykfdgtxhsbehox\t7989\n督导\t7990\n万岁万岁万万岁\t7991\n蚁人\t7992\n268号\t7993\n余思梅\t7994\n成本\t7995\n前排\t7996\n愤青们\t7997\n一个m\t7998\n尼肯\t7999\n晕不晕\t8000\n强有力\t8001\n二一呵呵\t8002\n禮物\t8003\n属于我\t8004\n头乱\t8005\n而雨\t8006\n差惨\t8007\n菩提本姑娘的话本姑娘\t8008\n诸暨市\t8009\n88555\t8010\n一个1\t8011\n一个0\t8012\n40来岁\t8013\n一个2\t8014\n一个5\t8015\n婴儿车\t8016\n一个6\t8017\n总冠军\t8018\n竭虑\t8019\n呆萌君\t8020\n夏樱\t8021\n打抱不平\t8022\n尸手\t8023\ntheUSA\t8024\n买一送\t8025\nwmpgg\t8026\n嗯嗯志龙最爱我\t8027\n冰球\t8028\n我们网\t8029\n么么么明天真亲\t8030\n那你说你是谁呀你是我的亲人\t8031\n美图版\t8032\n好想你的不是不够给你我好想好想你\t8033\n哑迷\t8034\nSgffghgghg\t8035\n伸手\t8036\n我想念你的眼睛爱的允许我爱的人的一我想念你的\t8037\n淡风\t8038\ngghbbbh\t8039\n身先士卒\t8040\n京族\t8041\n透薄\t8042\n每隔七日\t8043\n嘛哪空落落空间几王和呵呵4\t8044\n关世鹏\t8045\n潘雯欣\t8046\n池莉\t8047\n没呀好帅\t8048\n郑克爽\t8049\n哪副\t8050\n漫观\t8051\n男元素\t8052\n孟子曰\t8053\n1977~\t8054\n三顾度\t8055\n上海公共交通卡公司\t8056\n多少时\t8057\nomg\t8058\n呢妹妹\t8059\n掉线率\t8060\n莫罗\t8061\n寵物\t8062\n涂强\t8063\n十有八九\t8064\n9874561230\t8065\
n座谈会\t8066\n15963个\t8067\n大掠马\t8068\n呵真\t8069\n流量\t8070\n伴们\t8071\n花丽娜\t8072\n新时达\t8073\n陌生人\t8074\n一只号\t8075\nzhidaobaiducomquestion1670631248375410627\t8076\n前前任\t8077\n214142142v五\t8078\n北里\t8079\n叫等\t8080\n嘎嘎嘎嘎嘎嘎\t8081\n厨子\t8082\n一个84\t8083\nWsjjkkltt\t8084\nDdnndndnxmxnxmxkxz\t8085\n7集\t8086\n+帅\t8087\n阮籍\t8088\n我真不骗你我骗我\t8089\n抗我的秘密\t8090\n玛雅人\t8091\n三百分\t8092\n整篇\t8093\nPuffin浏览器\t8094\n铁路局了了了了了了了了了了了了了了了了了了啦啦啦啦啦啦啦啦啦\t8095\n写于\t8096\n宴车\t8097\nfabst\t8098\n挂档\t8099\n包扇娘\t8100\ngé离\t8101\n新彩\t8102\n想甚\t8103\n我喜欢聊性\t8104\n85岁\t8105\nddfffy\t8106\nyouyam\t8107\n祈愿\t8108\naska\t8109\n牛基地\t8110\n相见过瘾\t8111\nlovely\t8112\neBy\t8113\n两顿\t8114\nbitbang\t8115\n8447444411\t8116\nglagallies\t8117\nHaoB\t8118\n1988年\t8119\n唐朝\t8120\n台灣學\t8121\nejgdtteexuwxhxhhxvddbvdbdvxhxhxgxsvevqvqvvfncixhxvnfyxrhchycydsvbthehchhxeh\t8122\n暗招\t8123\n那不可能的我是个男男孩\t8124\n避孕鳝\t8125\n化妆物语\t8126\n圣山寺\t8127\n空对空\t8128\n完一起来\t8129\njaox\t8130\n光华楼东\t8131\n火影党\t8132\n大我那么多\t8133\n果品\t8134\n3939933\t8135\nwdk\t8136\n要不好朋友不聊\t8137\n吴刚\t8138\n求和\t8139\ncant\t8140\n一照\t8141\n眼帘联\t8142\nffffffff\t8143\n乱飞\t8144\n雨舒\t8145\nturner\t8146\n几块\t8147\n整丑\t8148\nhjgjga\t8149\n沈少爷\t8150\n过年要\t8151\niwereagetboy\t8152\n7200\t8153\n老还童\t8154\n野鸭子\t8155\n程砚秋\t8156\n有点儿\t8157\n狐妖\t8158\n1周左右\t8159\n吴别\t8160\n二人组\t8161\n洪泰成\t8162\nqjj\t8163\n一百万升\t8164\n亚龙\t8165\nSvd\t8166\n特殊\t8167\ndobaodtsighfgbgktgd\t8168\n马晓含\t8169\n大黄伟\t8170\n网店\t8171\n战战\t8172\n十四二十四\t8173\nqjq\t8174\n战成\t8175\n13:00\t8176\nqjt\t8177\ne7fdtjj\t8178\n大姐哥\t8179\nyourn\t8180\nyouro\t8181\n旅游区\t8182\n凉席\t8183\nBilly\t8184\nyourg\t8185\nyoure\t8186\n廋放心哈脏\t8187\n真太郎\t8188\n100000000000000000000000000000000000000000000000000000000000000000000000000000000000\t8189\n顶撞\t8190\n戏文化\t8191\n矿业\t8192\n莫提\t8193\n2012款\t8194\n逐出\t8195\n威士忌\t8196\n稳当\t8197\n透明\t8198\n摩夫妇\t8199\n慢呢大战\t8200\n全胜\t8201\n名字报\t8202\n大牌生日会\t8203\n肃北\t8204\n不要说笑话\t8205\n甄姬\t8206\n健达奇趣\t8207\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t8208\n1776\t8209\n平方\t8210\n麓麓\t8211\n啪啪\t8212\n1772\t8213\n将礼\t8214\n
啪啦\t8215\n度秘我爱你白簸\t8216\n1778\t8217\nBoss\t8218\n山海\t8219\nfuftdtu\t8220\n八二七零\t8221\n寒冬腊\t8222\n你不正经我正经\t8223\n阿雨\t8224\n真劲\t8225\n上机\t8226\n陈炳德\t8227\n女孩\t8228\n小聪明\t8229\ngteyuh\t8230\n抟复遥而上者九万里你接下句\t8231\n咯铁定\t8232\naigowao\t8233\n6tfgyj\t8234\n猪囡\t8235\n1160元\t8236\n上木\t8237\n戚薇\t8238\n李茹怡\t8239\n图努\t8240\n彤亥\t8241\n阿拿塔亲爱的摸摸\t8242\nddrsfdtf\t8243\n预收\t8244\n女字\t8245\n张一坤\t8246\nAre\t8247\n预支\t8248\n女子\t8249\n177m\t8250\n嗯六\t8251\n魔化\t8252\n张霖枫\t8253\n3571592580456\t8254\n发生\t8255\n聊吃\t8256\n420回答\t8257\n姚雨欣\t8258\n18多\t8259\n化缘\t8260\n100000000000个\t8261\n鸡西\t8262\n生土长\t8263\n庞任朗\t8264\n尿唉\t8265\n民伟\t8266\n呢你的你难到你那到你的即可\t8267\nydsdc\t8268\n王昱博\t8269\n视力\t8270\n独唱团\t8271\n义军\t8272\n发画\t8273\n聊吧\t8274\n昨日下午2时\t8275\n乔敏\t8276\n桑骁筱\t8277\n所为\t8278\n夺眶而出\t8279\n董小姐妹\t8280\n18天\t8281\n发用\t8282\n无一处美\t8283\n一个六多\t8284\n昰心有灵\t8285\n叫喊\t8286\n72500072452\t8287\n中看不中用\t8288\n盯着\t8289\n嗯果\t8290\n亚美\t8291\n井凡\t8292\nCorey\t8293\n李子奇\t8294\n34点到\t8295\n果恨\t8296\nzaphd\t8297\n凤姐长\t8298\n李子女\t8299\n美女\t8300\nzhjx\t8301\n美奴\t8302\n两条线\t8303\n1234567890\t8304\n神算\t8305\n美好\t8306\nafternoon\t8307\n顾徐彤\t8308\n不要你了叫\t8309\n凤凰卫视\t8310\nmissYuo\t8311\n亚群\t8312\n真舒舒\t8313\n度秘东\t8314\n好怪\t8315\n套标\t8316\n唐三阿\t8317\n燃放\t8318\n39000分\t8319\ndown\t8320\n中国联合网络通信有限公司\t8321\njjjjkk\t8322\n斯海尔\t8323\n度秘三\t8324\n坏孩子就是你赌咪\t8325\n龙盛花园\t8326\n如忆姐\t8327\n飘逸\t8328\n懿旨\t8329\n把欠\t8330\n山门\t8331\n度秘丸\t8332\n古巴古巴\t8333\nggghy\t8334\n时间谍\t8335\n一二三四5678九\t8336\n宣传\t8337\n飘逝\t8338\n未果\t8339\n山间\t8340\n朱欣怡\t8341\npakk\t8342\ngggffff\t8343\n春尽江南\t8344\n花絮照\t8345\n解放军报\t8346\nCCTVby\t8347\n小鸡鸡大酒店\t8348\n输鼓\t8349\n烈酒\t8350\n没完了\t8351\n上海华昌源实业投资有限责任公司\t8352\njtjtgaj\t8353\n过敏性咳嗽\t8354\n色卡\t8355\n三大然\t8356\n抑扬\t8357\n放过\t8358\n禁渔\t8359\nNickhun、佑荣\t8360\n干活片\t8361\n嗯充\t8362\n5Z01714\t8363\nhi嘟咪\t8364\n噩噩噩噩噩噩噩噩\t8365\n单单\t8366\n15952084146\t8367\n你么样\t8368\n大热\t8369\n会导\t8370\n杨雪东\t8371\n郭越\t8372\n贾老师\t8373\n三氯奶粉\t8374\n周思\t8375\n对头对头\t8376\n东华实业\t8377\n自考\t8378\n做众\t8379\n自老\t8380\n惊世骇俗\t8381\n建筑群\t8382\n阿巴
狗\t8383\nrouge\t8384\n嘢唔\t8385\n做伙\t8386\n有果\t8387\n电离\t8388\n响遏\t8389\n诸多\t8390\n周总\t8391\n卡罗欧\t8392\n已知\t8393\n李大仁\t8394\n做伴\t8395\n我是女的我也你的妈妈\t8396\n王裬\t8397\n313313d3dwfdmdddoam431odmdmvvjwa\t8398\n做伸\t8399\n五点四\t8400\n问闲\t8401\n088\t8402\n戒躁\t8403\n桃园\t8404\n寂寞好\t8405\n081\t8406\n080\t8407\nchiphotosbaiducomxiaodupicitem738b4710b912c8fc9146b3befb039245d688210fjpg\t8408\n084\t8409\n长沙市总工会\t8410\n多喜欢你\t8411\n取信于民\t8412\n二面\t8413\n哈出水\t8414\n小不点你是男的还女的\t8415\n问问\t8416\n笑起\t8417\n钟铉\t8418\n索里亚度秘\t8419\n呃零零二二零零五\t8420\n钓鱼岛\t8421\n子病入膏肓\t8422\n把作你我的坏心骗子成我了不成\t8423\nzhidao\t8424\nzhidai\t8425\n斗牛臭不脸臭不要脸的臭不要脸臭不要脸的臭不要脸的臭不要脸你\t8426\n就是甲子死人经\t8427\n替换\t8428\n终于度\t8429\n念不忘\t8430\n是男\t8431\n成宝\t8432\n传芳小学\t8433\n⺪⺪⺪\t8434\n语音助手\t8435\n张艺刚\t8436\n不想和你说话了我讨厌你\t8437\ngjbgvnjffb\t8438\n4124555524253355\t8439\n2200吨\t8440\nnbatwitt\t8441\n昨晚十二点\t8442\n初恋金毛处分连眼光看高清只楚\t8443\n182吧182818281881\t8444\n绝处\t8445\n成安\t8446\nggggggvvgv\t8447\n叟想\t8448\n几十度\t8449\n小学生\t8450\npuof\t8451\n伊尹\t8452\n末流\t8453\n刹车\t8454\n右耳\t8455\n打卡花\t8456\n祢好\t8457\nHHFSUGVGSUHB\t8458\n额镇\t8459\n爱呦呦\t8460\n恶翅\t8461\n廖凡凡\t8462\n记读\t8463\n银监会希望信托公司\t8464\n肏弟妹\t8465\n梧村\t8466\n好丑好丑好丑好丑好丑好丑\t8467\n继而\t8468\n冠哥\t8469\n二十发\t8470\n节哀节哀节哀节哀\t8471\nin好jhhyhhghjhhinhjhyujuhgvchhujhhuyyyuuiiuuuuuuuyyyuuuuuuuuuuiufhgkhhgj\t8472\n家座机\t8473\n⺌\t8474\n丢长\t8475\n幺二零幺零二幺九七七零幺幺二二零二幺八\t8476\n说什么再见\t8477\n二十号\t8478\n168元\t8479\n微博评鉴团#\t8480\n雄鹿队\t8481\n十点间\t8482\n67港\t8483\n二十只\t8484\n一兆瓦\t8485\n训导\t8486\n一千二百块\t8487\n真是好孩子你是\t8488\n狗日\t8489\n保汉\t8490\n翠花\t8491\n亲尝\t8492\n满可爱\t8493\n每日邮报\t8494\n薄\t8495\n血溅\t8496\n薆\t8497\n半壁江山\t8498\n薛\t8499\nIloveyou度秘\t8500\n好呀明天再聊吧亲爱的再见晚安\t8501\n孙老板\t8502\n王福兴\t8503\n发化\t8504\n取食\t8505\n嗡嗡\t8506\n耶莉雅\t8507\n句首\t8508\n钱呀\t8509\nchcgx\t8510\n再见再来\t8511\n波先生\t8512\n薤\t8513\n鬼泣4\t8514\n翠芝\t8515\n丹唇\t8516\n标书\t8517\n哈超赞\t8518\n婚论\t8519\n暂停\t8520\n度秘你是我的男朋友我爱你\t8521\n地赞\t8522\n薶\t8523\n张楠\t8524\n小肚鸡肠\t8525\n偶卿\t8526\n熊光头强\t8527\n中国南车\t8528\n固然\t8529\nDSR\t8530\nggefuhtf\t8531\n生活着\t8532\n自主战\t853
3\nab堡\t8534\n邓凤冰\t8535\nDSF\t8536\n咯喔涡日\t8537\n金二\t8538\n嘤嘤王\t8539\n老点儿\t8540\n一一一三肀王\t8541\nbox\t8542\nboy\t8543\n不够我够\t8544\nbot\t8545\nbou\t8546\n提供商\t8547\nbok\t8548\nbol\t8549\n亲觜\t8550\n到嫁\t8551\nbob\t8552\n超论\t8553\nbod\t8554\n2011年5月23日凌晨\t8555\n一对对\t8556\n11560\t8557\nhijcx\t8558\n帧数\t8559\n三本\t8560\n天空之城\t8561\n表叔\t8562\n三朵\t8563\n大秘秘\t8564\n三机\t8565\n爆北\t8566\nsclco\t8567\n恨牛\t8568\n我的微我\t8569\n只见过\t8570\nWhiteread\t8571\n古巨基\t8572\n三月\t8573\n炮弹\t8574\n吃野\t8575\nRiesling\t8576\n扭扭捏捏扭扭捏捏你\t8577\n三期\t8578\n记录像\t8579\n溃败\t8580\n省油\t8581\n沙摆\t8582\n李颜宏\t8583\n黑龙江\t8584\n超极大\t8585\n1358546\t8586\n画象\t8587\n三二五\t8588\n舍身取义\t8589\n主嫌\t8590\n6月1日\t8591\n妈妈妈妈妈妈妈妈妈妈了喔\t8592\n性命\t8593\n烦人不烦人v\t8594\n每我\t8595\n配图\t8596\n儒术\t8597\n巴黎和\t8598\n沙摩\t8599\n二系点三五\t8600\nrfldncnw\t8601\n冰冰\t8602\n话不投机\t8603\n数过\t8604\nqqwmnkemjx\t8605\n途牛行\t8606\n1.1\t8607\n紧绷\t8608\n数分钟\t8609\n求知\t8610\n欢歌笑语\t8611\n英西\t8612\n李玮峰\t8613\n嘟咪嘟咪\t8614\n胸衣\t8615\n凡提\t8616\nqunime\t8617\n水煮肉\t8618\n暗自\t8619\n六个月\t8620\n理可\t8621\ntuufdyu\t8622\n生疑\t8623\n洗手\t8624\n上山明水秀\t8625\n十三百多\t8626\n点明\t8627\n转型\t8628\n银座\t8629\n理发\t8630\nB痴\t8631\n早上7点半\t8632\n昆仑山\t8633\n白莲花\t8634\n理友\t8635\n下姓门\t8636\n四面楚\t8637\n生疼\t8638\n罗先生\t8639\n行吗\t8640\n会理县\t8641\ndeDd\t8642\n内型\t8643\n人脸识别器#\t8644\n281页\t8645\n要不家网\t8646\n别别别\t8647\n讨厌讨厌讨厌了你是我\t8648\n番茄味\t8649\nwpiaaaa\t8650\n交叉\t8651\n交友\t8652\n辩友\t8653\n铁皮疙瘩\t8654\n非金融企业债务融资工具\t8655\n鹅蛋脸\t8656\n精神伴侣\t8657\n青岛大虾\t8658\nFjv\t8659\n牛肚面\t8660\nfm1ddr3\t8661\nAreyoumidu\t8662\ntometou\t8663\n未尝\t8664\n还要\t8665\ntrddg\t8666\n说了等\t8667\n度度秘我爱你给我了\t8668\n献甲\t8669\n12份\t8670\n概括性\t8671\njejebf\t8672\n交口\t8673\n阿什姆\t8674\n遇见梅花花你\t8675\n飞塞族\t8676\nc位\t8677\n短匆\t8678\n哦1心\t8679\n啦啦啦美\t8680\n在南方的艳阳\t8681\niptv\t8682\n花骨迷\t8683\n拉爸爸了巴巴爸爸爸爸爸爸爸爸爸爸爸爸爸爸爸爸爸爸\t8684\n安别笑\t8685\n舍己为\t8686\n蹲下\t8687\n过山秋\t8688\n金柳婷\t8689\n贾4th\t8690\ngstdghdfu\t8691\n120场\t8692\n冒气\t8693\n865455\t8694\n111疼\t8695\n马麻烦\t8696\n17条\t8697\nRain\t8698\nx1x\t8699\n日用型\t8700\n奥对我是你的主人你是谁\t8701\n空缺处\t8702\n田建栋\
t8703\n战僵尸大战僵尸我玩植物大战僵尸\t8704\n黄冈站\t8705\n流氓兔\t8706\n嗯嗯萌\t8707\n王旺村\t8708\n小美人\t8709\n母秘\t8710\n沈萍\t8711\n春娇喘\t8712\nv个\t8713\n248655698536415872665445874\t8714\nask测序\t8715\n知哥\t8716\n宝典\t8717\n愁字了得\t8718\n粮站\t8719\n为题\t8720\n20根\t8721\n1.5xC\t8722\n生平\t8723\n辣椒粉\t8724\n1.5xB\t8725\nv下\t8726\n13岁\t8727\nv不\t8728\nx10\t8729\nx12\t8730\nfocuse\t8731\n835857357537357357357373535\t8732\n净种\t8733\nwozaijiaoni\t8734\n充值卡\t8735\n曾艺\t8736\n轻仓\t8737\n7张\t8738\n18世纪\t8739\n脚心\t8740\n手书\t8741\n遮盖\t8742\nKUF\t8743\n老崔\t8744\ndhuctigtugf\t8745\n块儿\t8746\n到这里\t8747\n蘑菇饼干\t8748\n小达人我需要你\t8749\n应当\t8750\n佳人曲\t8751\n石狮市\t8752\n铠甲勇士召唤器\t8753\nb级\t8754\n玄冥珠\t8755\n夸练\t8756\n韩万朝\t8757\n梦很近\t8758\n陆如燕\t8759\n王六六\t8760\nihcgufx\t8761\n背头\t8762\nG\t8763\nm3围\t8764\n万剑\t8765\n超会\t8766\nSometimes\t8767\n西斯\t8768\n黄河\t8769\n18340\t8770\n圣代们\t8771\n過分\t8772\n多米你的喜好\t8773\n俄勒冈\t8774\n注意力\t8775\n黄油\t8776\n被诵\t8777\n给我的你在说什么\t8778\n龙舌兰\t8779\n宾波\t8780\n升华\t8781\n黄沅\t8782\n嫖客\t8783\n狗肉火锅\t8784\n100gx3包\t8785\ntfyscu\t8786\n郭普云\t8787\n嫖宿\t8788\n张家川\t8789\n度秘兄\t8790\n黄沙\t8791\n过阵子\t8792\n卡盟\t8793\n问題\t8794\n狼大\t8795\n无锡市\t8796\n安充电器\t8797\nHXUBDKKBD\t8798\nBlack\t8799\n幌子\t8800\n乐巢董事会\t8801\nANN\t8802\n在弦上\t8803\n套件\t8804\n838438\t8805\n一年二十几岁\t8806\navvaags\t8807\n你好衰\t8808\n军装\t8809\n四二二\t8810\n王鑫峯\t8811\n00112\t8812\nkzuz\t8813\n残暴\t8814\n讲话稿\t8815\n烧油爆姜\t8816\n爱卿\t8817\n洗然\t8818\n珍惜爱更兴奋中青春\t8819\n大宠\t8820\n好运连连\t8821\n遏\t8822\n大家\t8823\n唬咙\t8824\n烂片\t8825\n教育法\t8826\n收片\t8827\n先走了拜\t8828\n早上九点半\t8829\n玩儿吧\t8830\n耶梦\t8831\nroaou\t8832\n大宁\t8833\n用笔\t8834\n任金\t8835\n初音未来\t8836\n电视吧\t8837\n江河湖海\t8838\n大安\t8839\n爱华\t8840\n睿尔\t8841\n女试男\t8842\n赵新月\t8843\nghfyfdxkajjhgfghcddd\t8844\n大宝\t8845\n遇\t8846\nhi你有男\t8847\n大官\t8848\n5.5寸\t8849\n320323197705116247\t8850\nOOPpooo\t8851\n所长\t8852\n倚老卖老\t8853\nvolavo\t8854\n掌控\t8855\n林雅雅\t8856\n身边里\t8857\n显显示\t8858\n许梦回\t8859\n任兵哲\t8860\n没声儿\t8861\n郑少秋\t8862\n凑热闹\t8863\n掌掴\t8864\nflirt\t8865\n副券\t8866\n不知晓\t8867\n25多十\t8868\n我就是女的了我本来就是你的我是女的我再说一遍我是女的你\t8869\n不怕我
告诉他\t8870\n小鸟依人\t8871\n123456685598675669\t8872\n青春歌舞\t8873\n安主秀\t8874\n832\t8875\n太阳穴\t8876\n花家里北里\t8877\n蛮夷\t8878\n叫步\t8879\n有人类不好嘛\t8880\nurnot\t8881\n离别\t8882\n闻见\t8883\n七阿凡\t8884\n叫死\t8885\n超爽\t8886\n50米\t8887\n二十筐\t8888\n我讨厌你多咪v后\t8889\n88届\t8890\n到达地\t8891\n13412345678\t8892\n合奏\t8893\nggjl\t8894\nCORPSE\t8895\n罗伯特·克耐普\t8896\nCL集团\t8897\n是真你拿\t8898\n可言\t8899\n离划\t8900\n首当\t8901\n两个月\t8902\n838\t8903\n谢谢你的爱\t8904\n渐渐\t8905\n登天\t8906\n真心的那我对你的好不是白费\t8907\n刘晓慧\t8908\n节哀\t8909\n自尊心\t8910\n独立度\t8911\nijhgu\t8912\n剪掉\t8913\n学一学\t8914\n懂行\t8915\n针机\t8916\ni哈狗\t8917\n误国\t8918\n1.25L\t8919\n七一千一百一十一\t8920\n人形\t8921\n奔跑吧兄弟第三秀\t8922\n魏家鸣\t8923\n找怕\t8924\n撒子的鲨鱼三\t8925\n商议\t8926\n手杰斯\t8927\n商讨\t8928\n偶男\t8929\n爱科you\t8930\n我骗你我爱你\t8931\n人影\t8932\n老可好\t8933\n胜算\t8934\n商计\t8935\nger\t8936\n老年人儿\t8937\n果酱\t8938\ngeu\t8939\nget\t8940\n┃━│─┄┅┇┇┆┈┊┉┉┋┌┍┎┏\t8941\n椒盐\t8942\n马苏\t8943\na705b1000一498C643d342257nn\t8944\ngeb\t8945\n懒羊羊\t8946\n庙下\t8947\n脑孑\t8948\ngee\t8949\nged\t8950\n装法\t8951\ngei\t8952\nThiscamera\t8953\ngen\t8954\ngem\t8955\n开打\t8956\n宣纸\t8957\n老母\t8958\n郭登峰\t8959\n果酒\t8960\n列子恋\t8961\n金顶\t8962\n炸毛烧烧烧耶炸毛\t8963\n十四十\t8964\n梦里梦里梦\t8965\n姻个梦\t8966\n鬼肚米国\t8967\n闭门羹次\t8968\n开找\t8969\nFourball\t8970\n1ZOL\t8971\n老毛\t8972\n海蚌\t8973\n说得好说得好再来\t8974\n老毕\t8975\n兰少\t8976\n嫁给我吧度秘我好喜欢你\t8977\n不骗你骗你我是猪\t8978\n见得\t8979\nheythetdgfgc\t8980\n中旗察哈尔右翼中旗\t8981\n小散\t8982\n≦ノs｀ヘ\t8983\n花鼓\t8984\n相贼\t8985\n对不对\t8986\nffffdtt\t8987\n自早\t8988\nGrand\t8989\n小数\t8990\n凌驾\t8991\n七成\t8992\n许国彬\t8993\n七戒\t8994\n搜工\t8995\n小敏\t8996\n中扣\t8997\n冯慧敏\t8998\n小教\t8999\n相负\t9000\n右斯必克英格累\t9001\n有点难找\t9002\n你在干嘛呢你在干嘛呢\t9003\n朝拜\t9004\n耻笑\t9005\n长贴\t9006\nlration\t9007\n信用风险\t9008\nCLUB\t9009\n克害\t9010\nmashit\t9011\n553255222222322544\t9012\n长败\t9013\n写在\t9014\n預計魚魚魚\t9015\n复肌\t9016\n闵行教育局\t9017\ngcchccgfdgb\t9018\n良乡\t9019\n挖坟\t9020\n席桠楠\t9021\n处在\t9022\n钝\t9023\n挖坑\t9024\n哎呀证\t9025\n暨大\t9026\n毅丝\t9027\n省公安厅\t9028\n45点\t9029\n等腰\t9030\n飇蘿\t9031\nuyygiihh\t9032\n876778\t9033\n13676904103\t9034\n篇\t9035\n
嗯情债\t9036\n历史学家\t9037\nijyhfrfcb\t9038\n123886\t9039\n篙\t9040\n胰岛素笔\t9041\n人顶\t9042\n九剑\t9043\n蚊鸡\t9044\n武昌\t9045\n恩他爱\t9046\n了的我的爸爸妈妈\t9047\n篮\t9048\n六分之1两个\t9049\n中五镇\t9050\n一齐\t9051\n五花八门\t9052\nvbbvjnnnn\t9053\n篡\t9054\n篢\t9055\nhello度秘我好想你\t9056\n取得成功\t9057\n一两度\t9058\n篷\t9059\n观海\t9060\n一百一十多\t9061\n得路\t9062\n吴江\t9063\n爆炸案\t9064\n好了你是我的\t9065\nwaka\t9066\n完再聊\t9067\n撸乖唉\t9068\nfydgd\t9069\n家会\t9070\n恍似\t9071\n鼠牛年\t9072\n赠飘飘人\t9073\n55元\t9074\n学费\t9075\n巴盟\t9076\n邹博\t9077\n各派\t9078\n沒错\t9079\n和我儿\t9080\n手窝\t9081\n百雀羚\t9082\n相逼\t9083\n大大萌\t9084\n相逢\t9085\n耳道\t9086\nfddfdfeety\t9087\n曹州牡丹园\t9088\n13364964902\t9089\n15188222350\t9090\nnbnnn\t9091\n蓝翷\t9092\n上货\t9093\n大窑湾\t9094\n王玉博\t9095\n朱加林\t9096\n想就是说\t9097\nlave\t9098\n相通\t9099\n相送\t9100\n第27卷\t9101\n别国\t9102\nKidz\t9103\n子阳\t9104\nsao117\t9105\n孙艺洲\t9106\n杭州市\t9107\n8月5日晚\t9108\n猫小宝\t9109\n老爷车\t9110\n无缘无故\t9111\n皮卡\t9112\n舒爽\t9113\n嗯些\t9114\n鬼子的我想你\t9115\nHampton\t9116\n2wustamaaaaaaaaayoolyarian慢istilyaaaaa\t9117\n火影p\t9118\n赵一铭\t9119\n相伴人生路\t9120\n铄哥哥\t9121\nuklz\t9122\n更可爱\t9123\n阿一鲍鱼\t9124\nnjhi\t9125\n恋爱上了一个人\t9126\n徐嘘嘘\t9127\n郎木寺\t9128\nnjgei\t9129\n条佬\t9130\n直销\t9131\n范周\t9132\n逗比丽\t9133\n二百位\t9134\n广西杨大作物研究所\t9135\n你美了美了美了我醉了醉了醉\t9136\n俊俏\t9137\n丑小鸭\t9138\n俊俊\t9139\n数百小时\t9140\n繁花筒\t9141\nfahai\t9142\n吹箫网\t9143\n阿弥陀佛阿弥陀佛\t9144\n南开\t9145\n盗窃\t9146\n肥羊羊\t9147\n鬼来了你怕不\t9148\nShoshonesgdjdhd\t9149\n李之以\t9150\n真是小气鬼小气鬼小气鬼小气鬼大气鬼大气鬼大气鬼捣蛋鬼捣蛋鬼捣蛋鬼捣蛋鬼\t9151\n电视棒\t9152\n萌萌哒小孩你就是想我的萌萌哒萌哒小孩\t9153\n21：00\t9154\n散失\t9155\n狗兔龙\t9156\n七八星\t9157\na片\t9158\n乖当\t9159\n三四十多\t9160\n坏狼\t9161\n啊娇\t9162\n瓮里\t9163\n网点\t9164\n特兰克丝\t9165\n早期癌\t9166\nOVeloeCoLordm\t9167\n无觅处\t9168\n葛爷\t9169\nEbb\t9170\n炒饭\t9171\n歌星\t9172\n周颖\t9173\n21呦呵\t9174\nadamg\t9175\n泡菜\t9176\n76岁\t9177\n2010年12月26日\t9178\n2.85万平方公里\t9179\n几万斤\t9180\n照料\t9181\n随不\t9182\n机分\t9183\n3d肉版\t9184\nimothatcastas\t9185\n对呀你说什么\t9186\nhahhaha\t9187\n旋兒\t9188\n规果\t9189\n唐佳玉\t9190\n旋光\t9191\n韩雨熙\t9192\n红星灿灿\t9193\n唱一首歌吧好想\t9194\n总持仓再创纪录\t9195\n创别\t9196\n大赞爱义行\t9197\nNe
omu\t9198\n自产\t9199\n梦恬\t9200\n闹腾\t9201\njustanaah\t9202\n尺堂\t9203\n机制\t9204\n61年\t9205\n杀死\t9206\n一百万一百万\t9207\n搜恐怖惊魂\t9208\n毛寸\t9209\n婚外情\t9210\n创制\t9211\n张校荣荣\t9212\n带有\t9213\n王新宇\t9214\n口蘑女\t9215\n房垫\t9216\n背诗\t9217\nk560\t9218\n玩车\t9219\n自题太上皇帝之宝\t9220\n疯涨\t9221\n余敏\t9222\n熊星\t9223\n六和\t9224\n九十版\t9225\n不懂你我懂\t9226\n超声速\t9227\n三四零食\t9228\n背诵\t9229\n祖洪慧\t9230\n鸡鱼\t9231\nguagyu\t9232\nrecentenglish\t9233\n乃泽\t9234\n改用\t9235\n封信\t9236\n5635655\t9237\n彩屏\t9238\n走今天说\t9239\n造句\t9240\n马回\t9241\n888009988888889988600668\t9242\n艰难生活\t9243\n82.8%\t9244\n徐景颜\t9245\ncomclubitemparcelitemcdc\t9246\n复拱嘴\t9247\n杜汶泽\t9248\n六代机\t9249\n情商\t9250\n肖苗苗\t9251\n吃鱼\t9252\n格列佛游记\t9253\n鹰潭市\t9254\n国家游泳中心\t9255\n夜黑风高\t9256\n一时时\t9257\n造反\t9258\n惊厥\t9259\n厂人\t9260\n日本料理\t9261\n柴鼎文\t9262\n31天\t9263\n搜狗瓜娃子\t9264\n4845416545785511554258425744458565\t9265\n欲速\t9266\nbbks\t9267\n炒藕\t9268\n不称\t9269\n補充\t9270\n游戏币\t9271\n78578\t9272\n家兔\t9273\nhkojh\t9274\n111002\t9275\nghijk\t9276\n酒香\t9277\nvhx\t9278\n饭饭\t9279\n鸡萌\t9280\nvhv\t9281\nvhs\t9282\n不秘\t9283\n家具\t9284\n明明整\t9285\n家兵\t9286\n说我疯\t9287\n懒棒\t9288\nvhh\t9289\nvhk\t9290\nvhj\t9291\n周艺婷\t9292\nvhg\t9293\n插手\t9294\n家养\t9295\nHSV\t9296\n唇膏\t9297\n芷蕙\t9298\n江东\t9299\n蹬腿\t9300\nhfkic\t9301\n法蜜\t9302\n马年一个我呀你不饿我说朋友\t9303\n戚乔木\t9304\n艾因夏姆斯大学\t9305\n弥勒春六\t9306\nwwcc\t9307\n江上\t9308\n潘广益\t9309\nDLLURROULLLUOOURRRR222\t9310\n娶曲\t9311\n左心室\t9312\n48v\t9313\n六百亿\t9314\nysfo\t9315\n你最萌萌哒\t9316\nDarth\t9317\n米兰时装周\t9318\n鬆開\t9319\n无人废话\t9320\n55545665995000585546645666626137584\t9321\n朱给\t9322\n作业所\t9323\n新新旧的心新不\t9324\n百万英镑\t9325\n折旧\t9326\n如如\t9327\n时代华城\t9328\n甄糕糕\t9329\nparkour\t9330\nFPL\t9331\n申桥\t9332\nnkajsh\t9333\n赵陆阳\t9334\n6月7日上午6时\t9335\n万一万一\t9336\n1238789657\t9337\n488\t9338\n487\t9339\n486\t9340\n花花坏蛋花坏蛋乖乖\t9341\n484\t9342\n个税\t9343\n480\t9344\n好样好夜\t9345\nS600\t9346\n单眼\t9347\n二十一号\t9348\n4周年\t9349\n吃了饭没\t9350\n才艺\t9351\n咬咬牙疼\t9352\n美海\t9353\n不裁\t9354\n悲警\t9355\n18063450735674123457864507\t9356\n回头再说\t9357\n特工邵特\t9358\n39.00\t9359\n两只小的
小狗\t9360\n罗杰\t9361\n川虹猫蓝兔\t9362\n尹好\t9363\n好消息\t9364\nBL鬼fu\t9365\n金童小区\t9366\n铸\t9367\n5738888958758\t9368\n高静\t9369\n巴能量沙罗沙罗\t9370\n谢谢你了哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈\t9371\nBvxhh\t9372\n链\t9373\n罗村\t9374\n张思琪\t9375\n安就安\t9376\n雷欧奥特曼\t9377\n分身乏术\t9378\n中山路口站\t9379\n瑰宝\t9380\n热播剧\t9381\n上海浦东潍坊二村东建托儿所\t9382\n同情心\t9383\n5900万\t9384\nSikhiagaag\t9385\n有益\t9386\nprefinelyl\t9387\nthreeplusissix\t9388\nvffyhv\t9389\n老人\t9390\n小忘吧\t9391\n休伊特\t9392\n饭前\t9393\nitgoertou\t9394\n数十次\t9395\n仙人\t9396\n三融\t9397\n老云\t9398\n储备局\t9399\n老井\t9400\n老五\t9401\n机位\t9402\n斑点\t9403\n美腻撘\t9404\n阿特兹\t9405\n张瑜珊\t9406\nLara\t9407\n35582555\t9408\n19分之\t9409\nform\t9410\n老了\t9411\n机体\t9412\n对啊中学生\t9413\n大桥\t9414\nCTCC\t9415\n今晚8点20分\t9416\n孟钰\t9417\n你的月亮我的心好男人就是我我就是\t9418\n老二\t9419\n跳跳绳\t9420\n6105504009号\t9421\n专机\t9422\n购房\t9423\n利己损人\t9424\n李文佳\t9425\n二十多分钟\t9426\n快点儿我着急\t9427\n溪水\t9428\n呵呵度秘\t9429\n广东镇\t9430\n李振浩\t9431\n赶走\t9432\n一百多元\t9433\n赶赴\t9434\n胡淼\t9435\n风稚名读书\t9436\n民和你是男的还女的\t9437\n加特曼\t9438\n嘻嘻哈哈\t9439\n风流倜傥\t9440\n江北\t9441\n再见了我要\t9442\n最小看\t9443\n不知不养\t9444\n叶伟安\t9445\n沃川\t9446\n啩\t9447\n啪\t9448\n啥\t9449\n啤\t9450\n阿宝\t9451\n啦\t9452\n啡\t9453\n啢\t9454\n牵尔玉手\t9455\n啼\t9456\n啾\t9457\nHCG\t9458\n阿宁\t9459\n啵\t9460\n啷\t9461\n秦淑惠\t9462\nshelos\t9463\nHCL\t9464\n月经量\t9465\n啍\t9466\n問\t9467\n产量\t9468\n份份\t9469\n啊\t9470\n啄\t9471\n商\t9472\n武神\t9473\n啃\t9474\n啜\t9475\n岳飞墓\t9476\n云形成\t9477\n5呎\t9478\n彭宇翔\t9479\n啖\t9480\n撞上\t9481\nHCl\t9482\n澄如\t9483\ngjgagkgb\t9484\n王家乐\t9485\n辽宁卫视黄金\t9486\n托蒂\t9487\n别感冒\t9488\n据续\t9489\n南马全\t9490\n度秘你好温柔啊我喜欢你\t9491\n通通话\t9492\n松鼠考拉\t9493\n唐钰之\t9494\n好的\t9495\n客舰队\t9496\n科举\t9497\n660008\t9498\nEminem\t9499\n协警\t9500\n68岁\t9501\n卡地亚\t9502\n色厉眼镜网\t9503\n8888c\t9504\n半个多月\t9505\n王双一\t9506\n了和\t9507\n国外\t9508\n美小苹果\t9509\n丿丂\t9510\n铠明\t9511\n假想\t9512\ndruydl\t9513\n请到\t9514\n为什么不可以\t9515\nwhoyaou\t9516\n不要命\t9517\n傻冒\t9518\nhello号\t9519\n高冷感\t9520\n五页\t9521\nsjsn\t9522\n山东省烟台市毓璜顶医院生殖中心\t9523\n幸福生活\t9524\n自已去\t9525\n浦坤\t9526\n神秘感\t9527\n保护我\t9528\n五项\t9529\n度便\t9530\n阿
尔卑斯牛仔\t9531\n楼体\t9532\n中厅\t9533\n美邦\t9534\n不知生\t9535\nfggiutyex\t9536\n阳程雕\t9537\n几十本\t9538\n請問兒童與\t9539\n8八八八八\t9540\n田金卉\t9541\n卖出\t9542\n铅\t9543\n黎塘\t9544\n曹靖雨\t9545\n中原\t9546\n火恋\t9547\n六十日柱\t9548\nHghhjfnydhgvhfhgghhhhhhbhhhhjrgfuujfhhtuguuhhrvuyyyyyyuvjhthingsghoshgjxbcunhkthshghjjngjbcuhjbbghhjkmgfgckvgjdvjsfvfjcgbxxvfhiuogugucfhghnhhxhggjdhuvhnbfninvnjgchfhhvbvvvnhfhhcvv\t9549\njsjrhosj\t9550\n甘杉\t9551\n西游记了你\t9552\ncjgl\t9553\n如烟\t9554\n姚扩列\t9555\n九十几分\t9556\n7575575575552278325375\t9557\n记取\t9558\n遥远的地方\t9559\n唔啾\t9560\n付崇硕\t9561\n记叙\t9562\n打入冷宫\t9563\n每边\t9564\ndtfh\t9565\nhf1\t9566\n当我爱你\t9567\n甘来\t9568\nmilk\t9569\n婆行\t9570\nmaps\t9571\n于奇\t9572\n如烧\t9573\n香烟\t9574\n故事堡\t9575\ngdjtdjagat\t9576\n还好好\t9577\n疯狂的意义\t9578\n音信\t9579\n小瞧问\t9580\n66天\t9581\n超美\t9582\n嗯对对\t9583\n托马西\t9584\n度秘亲一个熊猫我好想和你亲\t9585\n葱错\t9586\n分时\t9587\njqfj\t9588\n绞刑架\t9589\nbonolal\t9590\n黑暗你好逗比\t9591\n我是你爷你信\t9592\nKelly\t9593\n敲边\t9594\n松本雅子\t9595\n赞成\t9596\n拉直\t9597\n进城\t9598\n笔记\t9599\nhfy\t9600\n我是女的我喜欢女\t9601\n超群\t9602\n上西天\t9603\n文理\t9604\nJjv\t9605\nhfs\t9606\n29.8%\t9607\nhft\t9608\nhfv\t9609\n父女\t9610\n尊口\t9611\nhfk\t9612\nhfj\t9613\n巧克力雲莊\t9614\n是你的爸爸帅还是我帅\t9615\nJjf\t9616\nhfc\t9617\nhfb\t9618\nX5X5\t9619\nhfd\t9620\nhfg\t9621\nhff\t9622\ncjgh\t9623\n同场\t9624\n柜门\t9625\n生化玫瑰花\t9626\n居然\t9627\n帅男\t9628\n睡懒\t9629\n5天之后\t9630\nVungle\t9631\n12月21日\t9632\n铁非礼\t9633\n哭了告\t9634\n李白盛\t9635\n北辰区\t9636\n乳母\t9637\n你在干什么呢开心了一开心了啦啦啦啦啦我开心\t9638\nvivoie\t9639\n骨儿\t9640\nbaap\t9641\n信不我真\t9642\n48厘米\t9643\n八老板\t9644\n弱根\t9645\n菩萨\t9646\n恩那\t9647\n胡煜琛\t9648\n一百六十五\t9649\n止境\t9650\n卡唯度\t9651\n小眼\t9652\n这样的人\t9653\n机械性\t9654\n不认可\t9655\n陈建防\t9656\n安要\t9657\n小妹子亲一个亲一个亲\t9658\n好打\t9659\n腮胡\t9660\n表里不\t9661\n张培椿\t9662\n摸额\t9663\n政治犯\t9664\n撒子麻\t9665\n药贴\t9666\n一会法\t9667\n药费\t9668\n借记卡图他我帖子里模具唔同罗总说下监狱兔目录\t9669\n875487318768712871\t9670\n烦感\t9671\n琬儿\t9672\n除根\t9673\n够不找我\t9674\niGh介人人\t9675\n张玉倩\t9676\nsksk\t9677\n花清\t9678\n2088年\t9679\n现实\t9680\n飞吧\t9681\n问一哈\t9682\n飞吻\t9683\n别礼\t9684\n220个\t
9685\n十年前\t9686\n安唱\t9687\n最上层\t9688\n阿孙柳\t9689\n现安\t9690\n嘛姆嘛姆\t9691\n当你的钩子眯着笑当你的帖子秘谁不着你的狗子\t9692\n国土资源\t9693\n附小\t9694\n竞争力\t9695\n聽聽\t9696\n雨桐\t9697\n9999999999969\t9698\n度密机器人\t9699\n耍帅\t9700\n富山村\t9701\n别这样做\t9702\n失魂\t9703\n飞向\t9704\n秘演\t9705\n嗯在\t9706\n2.9公斤\t9707\n恶口\t9708\n血迹斑斑\t9709\n多咪吗度秘\t9710\n衰变\t9711\n广汉\t9712\n0100\t9713\n别的都簿\t9714\n首位\t9715\n大鹿岛\t9716\n徐小冉\t9717\n吃给\t9718\n格雷夫斯\t9719\n先赞\t9720\n懂事听话\t9721\n海秀\t9722\n功效\t9723\n队里太饱\t9724\n教辅\t9725\n从前有只猫喵喵喵喵喵喵喵喵喵咪咪喵咪咪咪咪咪咪咪咪咪\t9726\ngjhh\t9727\n新视野\t9728\nｏ嗒\t9729\n周小度秘\t9730\n广汽\t9731\n致命的邂逅\t9732\n一平\t9733\n王丽娜\t9734\n陈游戏\t9735\n廉江\t9736\n发布日\t9737\n入味\t9738\n烧水\t9739\n1578900\t9740\n孙惠芬\t9741\n翁雅婕\t9742\n乍到\t9743\nIRC\t9744\n胶状\t9745\n中盖\t9746\n彪彪\t9747\n2700欧\t9748\n中盛\t9749\n何谓\t9750\n闭馆\t9751\n向城区\t9752\n安胎邮箱\t9753\n天涯碧草话斜阳\t9754\n嘟囔\t9755\n理由乐园\t9756\n125555\t9757\n八十四八十\t9758\n呃发\t9759\n不到\t9760\n冷血\t9761\nWP7\t9762\ntggggggghgggggg\t9763\n华杰教育\t9764\n不利\t9765\n缠绵悱恻\t9766\n不删\t9767\n从此以\t9768\n不刚\t9769\n不创\t9770\n波多黎哥\t9771\noroneoneor\t9772\n再见你好自为\t9773\n不刁\t9774\n牛思涵\t9775\n六八幺九幺六八八\t9776\n57645545544666445554643645463466434431684\t9777\n藏南\t9778\n不分\t9779\n不切\t9780\n通过性\t9781\n786877\t9782\n王八蛋\t9783\njfjc\t9784\n医科大学\t9785\n小筑\t9786\n袁海丽\t9787\n孰不可\t9788\n点点赞\t9789\n消解\t9790\n老大人\t9791\n更年停经\t9792\n北京法院\t9793\n龙潭区\t9794\n冰忻\t9795\n杨佳妮\t9796\n绿庄子\t9797\njfje\t9798\n好搞笑\t9799\n洛杉矶湖人队\t9800\nhi逗逼\t9801\n王俊敳\t9802\n你来我家吧乖乖\t9803\n蔡美月\t9804\n发誓\t9805\n第三项\t9806\n8627531\t9807\n小点儿\t9808\nccvv\t9809\n取决\t9810\n空尼\t9811\n出去走走下\t9812\n233233322222\t9813\n沙鲁克汗\t9814\nwomeos\t9815\n701个\t9816\nGhggggfhcg\t9817\n通报\t9818\nccvc\t9819\n孤狼\t9820\nfreepronhd\t9821\n致远\t9822\n烟袋斜街\t9823\nTdontno\t9824\n陈乾龙\t9825\n六月七\t9826\n园里\t9827\n里给偶滚粗去┻━┻︵\t9828\n15位\t9829\n张铁田\t9830\n卖出去\t9831\nguhh\t9832\nい\t9833\n起走\t9834\n一台\t9835\n来来来来来来来来来来来来\t9836\n一号\t9837\nvvvv宝贝宝贝\t9838\nHHKGFGL\t9839\n孩子实验小学\t9840\n宁仙儿\t9841\n月亮湾景区\t9842\n16点14\t9843\n一口\t9844\n一叠\t9845\n一另\t9846\n早上9点14分\t9847\n一句\t9848\n一只\t9849\n盗用\t9
850\n一发\t9851\n贾露露\t9852\n据传\t9853\n一变\t9854\n太萌\t9855\n东高村\t9856\n飯重重復復吃飯飯\t9857\n赶往\t9858\n彰显\t9859\n卢圆圆\t9860\n一友\t9861\n一又\t9862\n杜明艳\t9863\n一双\t9864\n一反\t9865\n唐章勇\t9866\n邵国柱\t9867\n西四选择题\t9868\n紧盯\t9869\n没了啊\t9870\n顾玲萌\t9871\n国创高新\t9872\n机油\t9873\nvifengcomvblognew2014002042c9335c5bb4227bc44528945a215cfshtml\t9874\n一百八十\t9875\n一段路\t9876\n缓坡\t9877\n这费\t9878\n女孩女孩\t9879\n;\t9880\n凌薇\t9881\n那司徒蜜兔兔\t9882\n王8蛋\t9883\n冷链\t9884\n1461899613777829\t9885\n赫拉\t9886\nmnvvguiop\t9887\n潘丽诗\t9888\n最新消息\t9889\n凡事\t9890\nMyfavouriteTshirt\t9891\n四五辆\t9892\n吹任\t9893\n强卫\t9894\n盒币\t9895\nxxyd\t9896\n不不许\t9897\n吃吃着\t9898\n400多瓶\t9899\n影迷们\t9900\n预计\t9901\n预订\t9902\n受过\t9903\n五年级下册品社第四单元一方水土养一方人\t9904\n病毒感染\t9905\n女发明家\t9906\n黑黝黝黑黝黝黑黝黝\t9907\n牛念犇\t9908\n一张照片儿\t9909\n预设\t9910\n我在哪你在说\t9911\n你爱谁\t9912\n坡都秘\t9913\ndhiphotosbaiducomxiaodupicitem8cb1cb1349540923de09c10b9558d109b3de4966jpg\t9914\n03千克\t9915\n恨情歌\t9916\n逆向\t9917\n为我为什么爱你\t9918\n战将\t9919\nGfhdjfgdhf\t9920\nihgj\t9921\nkkpp\t9922\n小马\t9923\n励合\t9924\n逆后\t9925\n打哈欠\t9926\nGEH\t9927\n哗众\t9928\n代办\t9929\n光影\t9930\n50多\t9931\n虚无\t9932\n尼金贝\t9933\n破伤风针\t9934\n开开心心星期一题我们要开心替人\t9935\n坐山\t9936\n宾利\t9937\n朱主爱\t9938\n李宗教\t9939\n2006年9月17日\t9940\n食饱心\t9941\nYeS\t9942\n尸臭\t9943\n你和你的女朋友\t9944\n不要脸你说话不要脸\t9945\n扫除\t9946\n听可爱的了可爱乖\t9947\n药庆卫\t9948\nYed\t9949\n世上\t9950\n敏俊\t9951\nYey\t9952\n咪喵喵喵了\t9953\n世世\t9954\n黄希扬\t9955\n一个388\t9956\nchiphotosbaiducomxiaodupicitem738b4710b912c8fcc1c303b9fb039245d6882195jpg\t9957\nNAk\t9958\n三零多\t9959\nYes\t9960\n吐火\t9961\n圖討\t9962\nYew\t9963\n1.73米\t9964\n贡品\t9965\n极薄\t9966\n趁机\t9967\n实力派\t9968\n15对\t9969\n剪拼\t9970\nfftckfcjj\t9971\nvvvb\t9972\n我就黑你黑你黑你黑死你\t9973\n粗野\t9974\n赵柏茂\t9975\n酒厂\t9976\n2回\t9977\n对勾\t9978\ngvkffvn\t9979\n公共危險罪\t9980\n谭天澄\t9981\n秒表\t9982\n晋嫣\t9983\n港独\t9984\n脑性\t9985\nssfhjk\t9986\n雨量\t9987\n美义\t9988\ngvkbcjvjgch\t9989\nkurdv\t9990\n迦南\t9991\n旺季期\t9992\n55641287889n888878852168598989898989862615978878784125598880808888555\t9993\n四千多年前\t9994\n超跌\t9995\n上外\t9996\n背背背背背背\t9997\n上夜\
t9998\n立碑\t9999\n博辛瓦\t10000\n不遗憾\t10001\n这个杯\t10002\n场合\t10003\n胡烨佳\t10004\ncorner\t10005\n低腰男\t10006\n吸脂\t10007\n末选\t10008\n74125889633245698743656\t10009\n超跑\t10010\n上头\t10011\n畏惧\t10012\n第10期\t10013\n大奇葩\t10014\n幸福的婚姻生活\t10015\n醒了啦\t10016\n曲混\t10017\n身手\t10018\n不白\t10019\n我要秘\t10020\n程兰楠\t10021\n不百\t10022\n阿里巴巴网\t10023\n十二厘米\t10024\ntsyfgffgcdfgdyvdghbreallyfgyvvggvykingoldclassx7\t10025\n上天\t10026\n操作题\t10027\n血小板\t10028\n购销\t10029\n稳坐\t10030\n寄出\t10031\n敢笑话\t10032\n读赂\t10033\n白鹭\t10034\n1801道\t10035\n128成\t10036\n许哈根达斯\t10037\n格隆意外国安\t10038\n好高冷\t10039\n折耳#\t10040\n六样\t10041\n辽源\t10042\n腆腆\t10043\n夜浦\t10044\n中宣部\t10045\n庆庆\t10046\n尹鹤锐\t10047\n20年前\t10048\nChinaJoy\t10049\n翼展\t10050\n逸得一\t10051\n乐别\t10052\n大快跑\t10053\n乐利\t10054\n李双江\t10055\n额滴神娜\t10056\n第十六个\t10057\n光绪皇帝\t10058\n一八六八\t10059\n夹克\t10060\n可说和\t10061\n贾超\t10062\n50天\t10063\n王露颖\t10064\n随便果\t10065\nstaynce\t10066\n巷子\t10067\n急用\t10068\n疑犯\t10069\n一百多岁\t10070\n讪牙\t10071\n你好小度秘\t10072\n火vjzhsh\t10073\n戶江\t10074\n拥有\t10075\n孤悬\t10076\n冬蜜结节\t10077\n洒泪\t10078\nHdhddh\t10079\nvjfgg\t10080\n喻文州\t10081\n小韩\t10082\n西湖\t10083\n三代人\t10084\nxaxx\t10085\n中国商飞公司\t10086\nAlbany\t10087\ndoxos\t10088\n杨佳璇\t10089\n吃公主你是王子你是\t10090\n准定\t10091\n省第一人民医院\t10092\nTorqkchfylxgdulhhpgihzdigvigkckxkhkxgwydkhlvbiogduwqqdkdjkdyzhdjxgxjcycjftxjovusdhatev\t10093\n崔崔\t10094\n新宿区\t10095\n伤处\t10096\n47%\t10097\n鬼女\t10098\nhi皮hi皮hi皮皮\t10099\n还珠秘\t10100\n肚脐眼\t10101\n鲁子莹\t10102\n丝丝鹅黄\t10103\n24921元\t10104\n房文龙\t10105\njrmph\t10106\n评讥\t10107\ngffcyvgv\t10108\n评论\t10109\n稍稍\t10110\n耳熟\t10111\n百子湾\t10112\n酿说\t10113\n13716034788\t10114\n解答华\t10115\n拉丝\t10116\n爱达\t10117\n马婷\t10118\n打算盘\t10119\n有话好好说\t10120\n嗜好\t10121\n大妹子歌\t10122\n老先衰\t10123\n108千米\t10124\n1599389976376\t10125\n演一演\t10126\nhttppinyincn11S0AP7scq7\t10127\n别倒\t10128\n很喜\t10129\n百度钱包弄\t10130\ncyyy\t10131\n好嘞咪\t10132\n印刷术\t10133\nikouol\t10134\n274e五\t10135\n中融从容\t10136\n2012.05.20\t10137\n潘虹羽\t10138\n吴佳尼\t10139\nfoolish\t10140\n真的好懂\t10141\n南山区\t10142\n那些戏\t10143\n河神\t10144\n
度不开\t10145\n我喜欢一夜\t10146\n321多多\t10147\n瑟妃\t10148\n卡梦北鼻\t10149\n三十多分\t10150\n兆轩\t10151\n在果\t10152\n赫在\t10153\n佯嗔\t10154\njitwv\t10155\n科学家\t10156\n别克正东吸脂\t10157\n南极洲解放军基督教\t10158\n张颖\t10159\n最后一页\t10160\nibbits\t10161\n度秘我六百号通\t10162\n136022993882\t10163\n学弟\t10164\n随处可\t10165\n心头\t10166\n嫌\t10167\n4100列千八百\t10168\n洗KO\t10169\n养鸟\t10170\n奈何桥\t10171\n嫁\t10172\n哈皮牛\t10173\n商城\t10174\n红狮\t10175\n嫗\t10176\n假了\t10177\n张诗琪\t10178\n心太\t10179\n我想你\t10180\n嫑\t10181\n静静静静\t10182\n假人\t10183\n嫩\t10184\n寒假\t10185\n担心\t10186\n八墩\t10187\n嫣\t10188\n第六代\t10189\n落十一\t10190\n养鸭\t10191\n萌萌大\t10192\n养鸡\t10193\n付诸东流\t10194\n嫱\t10195\n问你\t10196\nggghghhhhhhhhuyygghgcuuhh\t10197\nm丰润\t10198\n不一不\t10199\n世行\t10200\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪小猫咪咪咪咪\t10201\n16006期\t10202\n妖孽的礼堂不起礼堂不起你\t10203\n婢女\t10204\nLaLaL\t10205\n安有布\t10206\n果粒\t10207\n沙量\t10208\n射射\t10209\n塌开\t10210\n到底是什么\t10211\n果粉\t10212\n变态废话\t10213\n虚伪的人\t10214\n70块\t10215\n第33位\t10216\n李立群\t10217\n冥域广搏速眞长奥巧樨甜\t10218\n亲武\t10219\n想我用\t10220\n双降\t10221\nouspastation\t10222\n立法院\t10223\n再笑一下\t10224\n发呆\t10225\nvvkknv\t10226\n年末\t10227\nWang\t10228\n举起手\t10229\n13763894547\t10230\nzhr\t10231\nzhu\t10232\n愛呢[\t10233\n空发\t10234\n老者\t10235\n年期\t10236\nzhz\t10237\n老老\t10238\n豆粒\t10239\nzhb\t10240\nzhe\t10241\n鸡精\t10242\nzhf\t10243\nzhi\t10244\n气油\t10245\nzhj\t10246\nzhl\t10247\n年月\t10248\nzhn\t10249\nmmsory\t10250\n穆迪\t10251\n白苯\t10252\n南务村\t10253\n移动3G\t10254\n讨饭\t10255\n契科夫\t10256\nabzdx\t10257\n一鼓作气\t10258\n闹钟明天\t10259\n蔡有几\t10260\n泼泼\t10261\n张杰伦\t10262\n佼佼者\t10263\n奇异\t10264\n520.13.14\t10265\n叶琳月\t10266\n泰陶\t10267\n半根\t10268\n一般那你个那你是谁\t10269\n牧丹\t10270\n五十公斤\t10271\n高高兴兴\t10272\nyesss\t10273\n呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱\t10274\n月全食\t10275\n零三幺四\t10276\n帮教\t10277\n胎儿型\t10278\n這兩個小熊\t10279\n言谈\t10280\n听讲\t10281\n经纪\t10282\n地空\t10283\n经纬\t10284\n灯具\t10285\n空口\t10286\n音乐听\t10287\n陰茎\t10288\n妖魔化\t10289\n头纱\t10290\n林冒险\t10291\n度秘大坏蛋大坏蛋大坏蛋超级大坏蛋讨厌\t10292\n买入量\t10293\n魏大勋\t10294\n张日那\t10295\n3p竖\t10296\n肖大宝\t10297\n2边\t10298\n沱江\t10299\n4589\t10300\n连组个词\t10
301\n整场\t10302\n奔就奔\t10303\n不可理喻\t10304\n4581\t10305\n无名无\t10306\n国事\t10307\n35.40\t10308\n炒花生\t10309\n俾斯麦\t10310\n白沙洲大桥\t10311\n博民\t10312\n公元十八年前\t10313\n名行\t10314\n取证\t10315\n苏嘉全\t10316\n黑龙江省\t10317\n我的宝贝\t10318\n黄晓阳\t10319\n呸呸呸呸呸呸\t10320\n小休\t10321\n烧水泡咖啡\t10322\n口点\t10323\n莆田片\t10324\n国产\t10325\n了服\t10326\n梁慧\t10327\n国人\t10328\n民间办\t10329\n幅画\t10330\n跟读\t10331\n疏漏\t10332\n澈澈\t10333\n小柏木咲碧\t10334\n開房\t10335\n夜路\t10336\n再来一段儿\t10337\n印顺法师\t10338\n表决\t10339\n王媚儿\t10340\n宝贝儿谢谢你谢谢你给我的爱\t10341\n佩霞\t10342\n谢谢你\t10343\n傻春\t10344\n不摸头\t10345\nsthree\t10346\n娟娟\t10347\n一战胜\t10348\n跟诊\t10349\n小心我真\t10350\n成长股\t10351\n別合\t10352\n狼心四\t10353\n刺痛\t10354\nHoward\t10355\n531974572\t10356\n寒破\t10357\nFast\t10358\n求赞\t10359\n小行星\t10360\n浑沦\t10361\n找我走\t10362\n刷子\t10363\n海运\t10364\n假音\t10365\n9.57%\t10366\n附加\t10367\n困局\t10368\n海迈\t10369\n周晓光\t10370\n60平\t10371\n就这样做\t10372\n冬蜜春\t10373\n八班\t10374\n宁可a4弯\t10375\n王源王\t10376\n妬如垢\t10377\n寂寞沙洲我该思念\t10378\nryffy\t10379\n3DMGAME\t10380\n6.7万亿元\t10381\n三十六零六六六六六六六yyyyyy\t10382\n教者\t10383\nSTBY\t10384\nu0014\t10385\n甘相伟\t10386\n五今\t10387\n15551518\t10388\n来速效\t10389\n问你是\t10390\n五仁\t10391\n文殊\t10392\n1223455899\t10393\n紫色\t10394\n五份\t10395\n列表法\t10396\n好感\t10397\nggnj\t10398\n李总贵\t10399\n偷龙换柱\t10400\n韩食\t10401\n好意\t10402\n13802603210\t10403\n王巫师\t10404\n去职\t10405\n真容长\t10406\n急中生智\t10407\n五代\t10408\nghiphotosbaiducomxiaodupicitem8c1001e93901213fbe050ba853e736d12f2e958bjpg\t10409\n碧文\t10410\n860甲数\t10411\n坨子\t10412\n粗心\t10413\n写得了\t10414\n作梦\t10415\nHhbsjbzjsj\t10416\n谢润华\t10417\n大雪雪\t10418\nBbjhhj7\t10419\n数片\t10420\n黄岩岛\t10421\n猪咯猪\t10422\n快点再快点在快点在快点快点快点快点菜\t10423\n千元级\t10424\n南校\t10425\n調查\t10426\n独自在外\t10427\n那么假\t10428\n哪一站\t10429\n虎穴\t10430\n堵塞\t10431\n郑立彦\t10432\n浅灰色\t10433\n也斯欧\t10434\n花猫猫猫不装潢猫花花花花花花花花花\t10435\n刘琦\t10436\n千帆\t10437\neeeeeeaaackkefkknnnnnnnnn\t10438\n山木案\t10439\n266369741582885\t10440\n老好看\t10441\n刘庭羽\t10442\n伊斯卡\t10443\n72个\t10444\n余贝贝\t10445\n风平浪\t10446\n进来\t10447\n关系战\t10448\n赶会\t10449\n书巢部\t10450\n24855212521122
11235324845\t10451\n欧小姐\t10452\n13761359668\t10453\n豌豆大嘎嘎嘎嘎\t10454\n穿插\t10455\n逃不脱\t10456\n糸姐姐\t10457\n混日子\t10458\n关系户\t10459\nsudu\t10460\n一个二儿\t10461\n度秘结\t10462\n小仙仙\t10463\n顺丰元\t10464\n脚边\t10465\n式心\t10466\ndidnt\t10467\n饿良\t10468\n25岁\t10469\nYJ\t10470\n妈爷子\t10471\nYL\t10472\nYO\t10473\n板上钉钉\t10474\n女子汉\t10475\n蓝多多\t10476\nYE\t10477\n改正\t10478\nYG\t10479\nYF\t10480\nYY\t10481\nYX\t10482\n掰开\t10483\nYQ\t10484\n温室气体\t10485\n赤果果\t10486\n爆粗\t10487\n收集\t10488\nYT\t10489\nYW\t10490\nYV\t10491\n授信额度\t10492\nYl\t10493\nYo\t10494\nYa\t10495\n麻婆\t10496\n好汉纸\t10497\n侦办\t10498\n我的妈呀你\t10499\n贾不贾\t10500\n婚姻呢我的小玉佛\t10501\n一英狗\t10502\n疲倦\t10503\n掰弯\t10504\n熟度\t10505\n百所\t10506\n一开始\t10507\n除痘\t10508\n张丽琴\t10509\n悠然自得\t10510\nhighu\t10511\n凋谢\t10512\n奇米奇米网\t10513\n退赛\t10514\n破灭\t10515\n康定路\t10516\njzjshx\t10517\n22负瘦据转素受地转\t10518\n久七八\t10519\ng1区\t10520\n逃撞\t10521\n陶志超\t10522\n八十块\t10523\n比如\t10524\n激烈\t10525\n五升\t10526\n这男人装别刊\t10527\n五千\t10528\n五十\t10529\nY9\t10530\n徐啸天\t10531\nsuju\t10532\nsujs\t10533\n中国中学生足球希望队\t10534\n扭喜\t10535\n30個\t10536\n自己群\t10537\nglibgcc\t10538\n45585\t10539\n蚊帐\t10540\n4000多\t10541\n憋笑\t10542\n展示品\t10543\n傻乐\t10544\n内接\t10545\nreport\t10546\n傻乎\t10547\nNHK\t10548\n7ruix\t10549\n小钱钱\t10550\n福建三奥\t10551\n三峡水电站\t10552\n内控\t10553\n陈么多\t10554\n彭薇伊\t10555\n不问不开\t10556\n花心\t10557\n法令纹\t10558\n浏阳\t10559\n民工费\t10560\nshTm\t10561\n梁实秋\t10562\n摔角\t10563\n流诶菲\t10564\n男你女\t10565\n般若\t10566\n2842153299\t10567\n幺四二\t10568\n高凤梅\t10569\n误解\t10570\n杨幂\t10571\n唔想看奇门遁甲\t10572\n古圣\t10573\n朋友节\t10574\n87个\t10575\n卢杰龙\t10576\n芦娇\t10577\n就是我儿\t10578\n我真的爱你我想和你\t10579\n1.09%\t10580\n奔跑吧兄弟第四季\t10581\n瑞典\t10582\n瑞兹\t10583\n对得\t10584\n肯德基肯德基\t10585\n祖国度\t10586\n木马木马木马\t10587\n幽闭恐惧症\t10588\n夸漂\t10589\n和一说话\t10590\n上坡\t10591\n义无反顾\t10592\n对待\t10593\n不是我一个人点\t10594\n生不逢时\t10595\nnfkni\t10596\n财会\t10597\npantry\t10598\ntsom\t10599\n闹闹觉\t10600\n什么苦\t10601\n南无乐队\t10602\nPrint\t10603\ngfggk\t10604\n29000元\t10605\n门诊部\t10606\n75个\t10607\n还不错谢谢\t10608\n腾迅\t10609\n889889889889\t10610\
n佛斋\t10611\n幸福的面包\t10612\n1867719725\t10613\n演唱\t10614\nJalandoni\t10615\n不仁\t10616\n605亿美元\t10617\n2010.12.31\t10618\n以村股份公司\t10619\n5525583982182876284\t10620\n嘴巴掌\t10621\n密码\t10622\n什么苍\t10623\nJCG\t10624\nfrtgddtg\t10625\n综合治理\t10626\nuhshgsh\t10627\n若冰\t10628\n雷总站\t10629\nyoxo\t10630\n太平间\t10631\n15171998769\t10632\n武生哥\t10633\n该干\t10634\n非品\t10635\n气秀气\t10636\n皇太子\t10637\ncarthata\t10638\n推不摊荐\t10639\n通透\t10640\n3：22\t10641\n来劲\t10642\n重新\t10643\n无糖豆浆\t10644\ndytfdy\t10645\n马大哈\t10646\n机器人人\t10647\n打拜拜\t10648\n第19\t10649\n第18\t10650\n迪利来\t10651\n吧唧唧歪歪头\t10652\n通通\t10653\n第11\t10654\n来动\t10655\n第13\t10656\n第12\t10657\n第15\t10658\n长期投资\t10659\n发展区\t10660\n3分钟后\t10661\nhgdm\t10662\n恨天\t10663\nhgdg\t10664\nhgdd\t10665\n孙吟冰\t10666\n内景\t10667\n单元格\t10668\n唱一首歌听\t10669\n我的一家\t10670\n陶先生\t10671\n注入\t10672\n赵纯良\t10673\nhgdt\t10674\nhgds\t10675\nehjd\t10676\n秘度秘在不在度秘\t10677\n数百米\t10678\n你头\t10679\n片海\t10680\n曰批\t10681\n你大\t10682\n去车\t10683\n侠剧\t10684\n往来走\t10685\n千百\t10686\n王伟琪\t10687\n崇敬\t10688\n疼惜\t10689\n公关部\t10690\n王燚\t10691\n先霸\t10692\norcer\t10693\n了别来\t10694\n索性\t10695\n毕业证\t10696\nmll\t10697\n为你爱\t10698\n吃苦在眼前\t10699\n任菊芝\t10700\n我喜欢的人儿\t10701\n东东哥\t10702\n熊二二\t10703\n使费\t10704\n圆滚滚\t10705\n冷光\t10706\n55385\t10707\n惠氏\t10708\n妖妖\t10709\n唾液\t10710\n狂风暴雨\t10711\n樱花树\t10712\n小苹果\t10713\n毕业说\t10714\n竺铧\t10715\n22333\t10716\n━━━\t10717\n巷名\t10718\n密密麻麻\t10719\n象路\t10720\n马雷斯\t10721\n猪猪猪猪猪猪猪你个猪\t10722\n一个女人\t10723\n米罗展\t10724\nStrangers\t10725\n美珍\t10726\n欢乐谷\t10727\n堪布嘉木扬·图布丹\t10728\n雪鹰侠\t10729\n邬铠泽\t10730\n美杜莎\t10731\n686868686868686868686868686868686868686868686868668868686868686868686868686868686868\t10732\n玫瑰水水果茶\t10733\nllojpalpoaqunkuvuhdhiyxbyrrijcdtu\t10734\n最省钱\t10735\n5月30秒\t10736\n宠物版\t10737\n三分之一第一章\t10738\n损损\t10739\n呆逼\t10740\n均属\t10741\n电线杆\t10742\n566855\t10743\n50多寸\t10744\n头藓\t10745\n好两天\t10746\n爱谢不谢\t10747\n马建贞\t10748\nXDD\t10749\n吧一\t10750\n158845584\t10751\n天哥哥\t10752\n芬兰\t10753\n鸡尾酒饱和度虚拟\t10754\n头寸\t10755\n122333687\t10756\n媚点\t10757\n经受\t10758\
n会行\t10759\n885555556555458855777\t10760\nreyoutle\t10761\n中华v3\t10762\nbongshakalaka\t10763\n嗯小女\t10764\n吧主\t10765\n幸生\t10766\n知画九九\t10767\n几公斤\t10768\n110点\t10769\n根系\t10770\n嫁入豪门\t10771\n糙糙\t10772\n不行刘小\t10773\n毕笑莹\t10774\n哎哟去了你别气我行不行\t10775\n一一部\t10776\n魔钻\t10777\n8只\t10778\n一天多\t10779\n哦主任\t10780\n奸诈尸\t10781\n丽23d\t10782\n闫沛东\t10783\nqrvh\t10784\nKOBE.BRYANT24\t10785\n不能不人\t10786\nwitn\t10787\n就是你在自欺你的挚爱\t10788\n跨区\t10789\n天花乱坠\t10790\n苏格兰梗犬\t10791\n一点一银\t10792\n一天天\t10793\n中介\t10794\n惜缘\t10795\nXDS\t10796\n狂长\t10797\n名誉权\t10798\n新春一下行嗯怨咒\t10799\n腊肉丝\t10800\n波波波\t10801\n苦恼\t10802\n王秋浩\t10803\n冰心墓园\t10804\n求你了就当我的秘书你看我的姐姐都有你这个秘书想要你这个秘书乖呀我真的好想要你\t10805\n3400\t10806\n三天内\t10807\n身兼\t10808\n海神庙\t10809\n切克闹你说一说声\t10810\nitisa\t10811\n接秘\t10812\n5495566469927555698777659\t10813\n男人心\t10814\n脱裤\t10815\n章子怡\t10816\n成功率\t10817\n马丁，西科塞斯\t10818\n花街七三\t10819\ngyeu\t10820\n别度\t10821\n彭劲\t10822\n蹲组\t10823\n几久才会册\t10824\n回音雅会阴\t10825\n小度秘你是你是你是小孩\t10826\n真自然\t10827\n雄雄\t10828\n＿过三V一V企＆T\t10829\n阿迪科\t10830\n佰生\t10831\n二八ｃｍ\t10832\nyffhh\t10833\n一个天\t10834\n8发\t10835\n树影\t10836\n王佳妮\t10837\n梅州\t10838\n万事能机器人\t10839\n鬼就是你的鬼我做人就是你的人\t10840\n拆除\t10841\n接种\t10842\nr勺\t10843\n刚性兑付\t10844\n少事\t10845\n餘\t10846\n少于\t10847\n惹毛\t10848\n不依不挠\t10849\n我的手\t10850\n13433816688\t10851\n少些\t10852\nframaaatortort\t10853\n吉吉祥\t10854\n挨拍\t10855\n2007年1月\t10856\n上海绛州鼓乐团权\t10857\n杨心流\t10858\n4000多长\t10859\n拉芙\t10860\n怨天怨\t10861\n心情不好想聊天儿\t10862\n声碎\t10863\n睡不着陪\t10864\n漆漆\t10865\n你好骚\t10866\n露夕\t10867\n脑力\t10868\nkooyyhgh\t10869\n奖赏\t10870\n邓建平\t10871\n五五十三分\t10872\n鸭蔑蝶\t10873\nfifop\t10874\n送货上门\t10875\n尊崇\t10876\n城里了吧一＿琳\t10877\n文豪\t10878\n布撒基寺\t10879\n风影\t10880\n儿童装\t10881\n茶花仙湖\t10882\nwear\t10883\n月号\t10884\n助明\t10885\nfjjhgg\t10886\n师妹们\t10887\n雨花石\t10888\n百米\t10889\nggzux\t10890\n三GB\t10891\n六亲不认\t10892\nMIC男团\t10893\n1992年11月8日\t10894\n星座迷\t10895\n陈宇佳\t10896\n要点倒赞\t10897\n去年6月24日下午4点\t10898\n答非\t10899\n合谐\t10900\n果然\t10901\n小小人小小的人小小的人小小小小小小人一个小小小任和小任\t10902\n头灯\t10903\nGhjbggnntg\t10904\n24
0米\t10905\n惊叹号\t10906\n合谋\t10907\n别再说了事实胜于雄辩\t10908\n外科\t10909\n3家\t10910\n星座运\t10911\n骨裤\t10912\n降支\t10913\n雨加雪\t10914\n121314151618171910\t10915\n称道\t10916\n盘盘碗碗\t10917\n哈時\t10918\n木耳\t10919\n忏悔录\t10920\n袁家岭\t10921\n标致\t10922\n二十公里\t10923\n呀臭\t10924\nwatches\t10925\n嚎啕\t10926\n大惑不解\t10927\n哈晗\t10928\n哈晕\t10929\n送走\t10930\n我是你的最爱\t10931\n喝毛尖\t10932\n袁雨歆\t10933\n百年级\t10934\n水原华城\t10935\n消失的子弹\t10936\ncxxxzxcvbnn\t10937\n红黑\t10938\n歇斯底里\t10939\n袁家岗\t10940\n急诊\t10941\n見過藏\t10942\n哥顶\t10943\n张慧人\t10944\n秘学\t10945\n刘佳祺\t10946\n奖券\t10947\n李清莹\t10948\n李雨莹\t10949\n有皮没心没肺\t10950\n质量分数杯\t10951\n伯问\t10952\n丁字裤\t10953\n拜拜我不你了我要向你提问题\t10954\n耿耿耿耿耿耿耿耿耿耿\t10955\n滴冲\t10956\n人民币债券\t10957\n过隐\t10958\n黎雨妍\t10959\n火派必仪\t10960\n奇功\t10961\n胡溪桐\t10962\n赵国超\t10963\n想吃苹果\t10964\n恐怖分子量\t10965\n给我找\t10966\n何捷\t10967\n岛上\t10968\n优穿\t10969\nMBD\t10970\n迭出\t10971\n床事\t10972\nMBA\t10973\n谢我很喜欢\t10974\nMBC\t10975\nyoYK\t10976\n秘字\t10977\n补打\t10978\n莎拉厕娃瓜\t10979\n朱纤纤\t10980\n秘子\t10981\nh10万个\t10982\n告拆\t10983\nnn40\t10984\n西塞山\t10985\njoocc\t10986\n纠缠不清\t10987\n张诗敏\t10988\n剧集\t10989\n地瓜\t10990\n考倒\t10991\n布奇书\t10992\n具体\t10993\n教条\t10994\nkthve\t10995\n揉眼\t10996\n花一课\t10997\n李之鑫\t10998\n二百三十二块\t10999\n靠脸\t11000\n无话不谈\t11001\n教材\t11002\n过腰\t11003\n硫代\t11004\n撞脸\t11005\n格外\t11006\nfffdddtggggfcvnkygdeacvbbvdxaZpoihhfrslknbbcdsambvcx\t11007\n郑晰文\t11008\ngfffffffffg\t11009\n红毛\t11010\n874565223\t11011\n发表示爱好处罚没事实施放\t11012\n颅内\t11013\n商都\t11014\n973615483\t11015\nDVD片段\t11016\n妙用\t11017\n边路\t11018\n唱天\t11019\n月将明\t11020\nAirnergyCharger\t11021\n林润铄\t11022\n周晓雨\t11023\n马路牙子\t11024\n鹿岛鹿角\t11025\n随州\t11026\n重拳\t11027\n眼拙\t11028\n消防栓\t11029\n草拟\t11030\n欧买嘎欧\t11031\n欺欺人\t11032\n地牢\t11033\n侦土豆丝\t11034\n岳翰洋\t11035\n曦曦\t11036\n恶心小说\t11037\n传上\t11038\n饿天\t11039\n复仇者联盟\t11040\n周紫晨\t11041\n7月10日晚\t11042\n江威达\t11043\n莱万\t11044\ndododp\t11045\n421756754\t11046\n自行车赛\t11047\n一样的人\t11048\n一干嘛五位\t11049\n两千多岁\t11050\n三个钟头\t11051\n修眉\t11052\n新塘嘴\t11053\npyan\t11054\nlike\t11055\n秒离\t11056\nLady\t11057\n＠7页\t11058\n真的你给我
你过发一个我喜欢你\t11059\n没我吧\t11060\n疏影阁\t11061\n富兰格富兰克林\t11062\n修真\t11063\n街上塘\t11064\n青点\t11065\n光明系\t11066\n管一管\t11067\n544242\t11068\n大批\t11069\njxjsjv\t11070\n干痒\t11071\n12重\t11072\n大大大大\t11073\n生辰\t11074\n露兰\t11075\n天卢\t11076\nfxdxcfdcd\t11077\ndiohsis\t11078\n贵贵贵贵\t11079\n敦化\t11080\n大扬\t11081\n当当当当\t11082\n馍馍大馍馍\t11083\n寡欲\t11084\n年货\t11085\n大托\t11086\nshsb\t11087\n许童童\t11088\nmc456c\t11089\n歮\t11090\n苏格兰梗\t11091\n刘佳豪\t11092\n大打\t11093\n大扒\t11094\nOST\t11095\n生辉\t11096\n大手\t11097\njfigifo\t11098\n大才\t11099\n布制\t11100\njvioobji\t11101\n75451\t11102\n天南\t11103\n金唱片颁奖礼\t11104\n刚来看\t11105\n黄\t11106\nMichele店\t11107\n绿山\t11108\n黏\t11109\n黎\t11110\n瓜葛\t11111\n黑\t11112\n水云\t11113\n黔\t11114\n黛\t11115\n一根一五\t11116\n黙\t11117\n忌讳\t11118\n杨功昊\t11119\n點\t11120\n坏蛋坏蛋坏蛋坏蛋\t11121\ngvmhh\t11122\n王子怡\t11123\nexvy\t11124\n凯虹\t11125\n聪明看我多\t11126\n中歌\t11127\n瞎了眼看到\t11128\n可乐度秘\t11129\n界内\t11130\n活得\t11131\n超级黑v\t11132\n淡定\t11133\n口口口口口口口\t11134\n仙骨\t11135\nintroduced\t11136\n一有度应\t11137\n堵城\t11138\n腹内\t11139\ncgyhjf\t11140\n腹肌\t11141\n杨璐\t11142\n学学学学\t11143\n鸳鸯枕\t11144\n血迹\t11145\n香宝贝儿\t11146\n侯兵\t11147\n29号上午\t11148\n点唱\t11149\n擦擦擦擦擦擦串串香擦擦擦擦擦擦擦\t11150\n钱钱钱钱\t11151\n好的好的我教你个度秘\t11152\n帅我随意\t11153\n以至\t11154\n腹股\t11155\n超然待之\t11156\n以致\t11157\n呼和浩特\t11158\n逗游\t11159\n翟晓聪\t11160\njfrgd\t11161\n我需要你的抱抱\t11162\n1394079867\t11163\n找大学\t11164\ncoke\t11165\n松哥松哥\t11166\n青少年们\t11167\n九把\t11168\n黄河道\t11169\n彭千芸\t11170\nhhhjjx\t11171\n别报修\t11172\n放過\t11173\n这子张\t11174\n两圈\t11175\n浩色\t11176\n锁舌\t11177\n两地\t11178\n5424\t11179\n保安话\t11180\n5420\t11181\n1181\t11182\nHeng\t11183\n其意\t11184\nHdjsjshsgsgbs\t11185\n阿索\t11186\n蠢蠢欲动\t11187\n這样早\t11188\nmyoua\t11189\n王媛\t11190\n赵欣蕊\t11191\n一只手做\t11192\n双规\t11193\n阿肴\t11194\n这报\t11195\n两点两点\t11196\n助于心何忍\t11197\n言行\t11198\n曲沃1\t11199\n轻度\t11200\n假假\t11201\n我的力\t11202\nZHG\t11203\njiban\t11204\n瓜蛋\t11205\n欧巴酱\t11206\n困死\t11207\n阿肯\t11208\nfyggy\t11209\n招惹我\t11210\n改日\t11211\n联合国\t11212\n乐玛摄影助理考试\t11213\n田青一\t11214\n200万美元\t11215\n食笑话\t11216\nPKl\t11217\n厂格\t
11218\n回聊天\t11219\n素不素\t11220\n没想起来\t11221\n穿不暖\t11222\n在里面\t11223\n完女\t11224\ngoohc\t11225\nurpk\t11226\n完好\t11227\nhttpehiphotosbaiducomxiaodupicitem8644ebf81a4c510f1d27556e6759252dd52aa5e7jpg\t11228\nylztylj\t11229\n37.5倍\t11230\njhijkm\t11231\n十次\t11232\n那一年\t11233\n占优势\t11234\n去库\t11235\n偶贴吧那你说你有时候是哪有时候是女\t11236\n6年后\t11237\n短枪\t11238\n登楼\t11239\n氯化物\t11240\n一拳超人\t11241\n海市蜃楼\t11242\n一边倒\t11243\n好啦好啦好啦好啦拜拜\t11244\n卢车\t11245\n可爱女人\t11246\n老虎钳\t11247\n第一餐\t11248\n黑洞洞\t11249\n#沃音乐#\t11250\nX100000000000\t11251\n天黑黑\t11252\n佟言\t11253\n笨果\t11254\n笑不就笑\t11255\n邢正佳\t11256\n嘻哈曲\t11257\n杨梦\t11258\n时值\t11259\n种子\t11260\n爱自己\t11261\n儿童院\t11262\n惹火考试\t11263\n猪鸡蛋\t11264\n投诉他\t11265\nfboys\t11266\n钱老太\t11267\n34%\t11268\n老黄\t11269\n坏小\t11270\n杨梅\t11271\n小灰狼\t11272\n永安中学\t11273\n县工行\t11274\n由此观\t11275\n相投\t11276\n老黑\t11277\n345\t11278\n346\t11279\n编排\t11280\n340\t11281\n341\t11282\n342\t11283\n343\t11284\n猪扒皮泥马\t11285\n349\t11286\n放匕\t11287\n你是我的妞你往哪儿走你是我的牛牛牛牛牛牛牛牛牛\t11288\n@\t11289\n超帅\t11290\n干架\t11291\n狂拽屌\t11292\n不对马觜\t11293\n李三光\t11294\n秦慧媛\t11295\n平移\t11296\n微单相机盒\t11297\n我喜欢俩人\t11298\n劳资员\t11299\n寂寞成全\t11300\n贪得无厌\t11301\n三六57000\t11302\n王好奇\t11303\n干枯\t11304\n1258073690147324638\t11305\n文县\t11306\n金牛犬\t11307\n娇女\t11308\n13252521659\t11309\n班加西\t11310\n一六点\t11311\n我爱\t11312\n线上\t11313\n600年\t11314\n请客恩\t11315\n东阳\t11316\n法体康泰\t11317\nCCTV2\t11318\n我爸\t11319\n888888886666666\t11320\n干果\t11321\n1.5倍\t11322\nhellomasodej\t11323\n爹爹\t11324\n4830\t11325\n故作\t11326\n巴塞吧塞\t11327\n平秋\t11328\n下个礼拜\t11329\n超常\t11330\n可劲\t11331\n1.74元\t11332\n十八马\t11333\n一百多一百多\t11334\n刘雅婷\t11335\ndirect\t11336\n不孝顺\t11337\nnail\t11338\n100斤\t11339\n类形\t11340\n九十岁\t11341\npolo男\t11342\n可动\t11343\ndjgdjmj\t11344\n一直等\t11345\n水族\t11346\noreve\t11347\n海豚鼻\t11348\n独生子你是男孩儿\t11349\n办办\t11350\n夏有乔木雅望天堂没\t11351\n乱费\t11352\n树身\t11353\n提亲\t11354\n煞笔\t11355\n侦查\t11356\n很残酷\t11357\n水日\t11358\n提交\t11359\n万金\t11360\nhttphhiphotosbaiducomxiaodupicitemfcfaaf51f3\t11361\n149vv\t11362\nmomoomomoomomoomomomomomomon
e\t11363\n理体制\t11364\n昭寺\t11365\nh5tgjejhjmtmtpadwtgjmuaghltgamw\t11366\n该受\t11367\n毛海洋\t11368\n追寻\t11369\n机器化\t11370\n不眠之夜\t11371\n蔡光\t11372\n护士服\t11373\n羊羊羊羊羊羊羊羊羊羊羊羊洗衣机洗嘻嘻习习习习习习习习习习习羊羊羊羊羊羊\t11374\n猫柠\t11375\n王勃\t11376\ntdutdfuxufxxugfu\t11377\n给我要\t11378\n刘春江\t11379\n床底下\t11380\njjddn\t11381\n破晓\t11382\n很过分\t11383\n多苗条\t11384\nwjjd\t11385\n五百多\t11386\n建行手机银行\t11387\n示威者\t11388\n剔透\t11389\n度大度\t11390\n庚娘\t11391\n人民杰\t11392\n蜜优\t11393\n红鲤鱼绿鲤鱼\t11394\n张云瀚\t11395\n我告诉你一个秘密我的梦想是什么\t11396\n那洪镇\t11397\n想病\t11398\n龟甲\t11399\n甘露\t11400\n一十分之一\t11401\n两只三\t11402\n品牌团\t11403\n5732030\t11404\n一百快点\t11405\n泛滥化\t11406\n法网恢恢\t11407\n玩票\t11408\n冠文\t11409\nTU米\t11410\nkhhf\t11411\n戏海\t11412\n此裙\t11413\n店方\t11414\n俺去也居民try里咯cry五投资人护额困啊loli我饿近么逻辑我爸不饿\t11415\nhzgzxkc9cbckxikhx75ohdhxkuewhfhshhxys3ugzighihdoydid257xdhjgxkhxccijkvjovpcohdoegugh\t11416\nhjjy\t11417\n大锅\t11418\nofoshb\t11419\n打瘸\t11420\n大错\t11421\n賦予\t11422\nhjjf\t11423\nhjjh\t11424\nhjji\t11425\n一点一滴一\t11426\n大明王朝\t11427\n断丝\t11428\nfgcsdxswsd\t11429\n飘起\t11430\n学前班\t11431\n拼配\t11432\n小猫晚\t11433\n你好呀主人\t11434\n江折明\t11435\n欠人\t11436\n最少\t11437\n太富有\t11438\n所谓\t11439\n纲手关\t11440\n8x0点\t11441\n芍药\t11442\nDanzig\t11443\nbb吧\t11444\n20160067\t11445\n呐呐呐呐呐呐呐呐呐哥哥哥哥哥哥哥哥哥哥哥哥\t11446\n屁垫\t11447\n磁盘\t11448\n吴文霞\t11449\n请教你好吧大湿\t11450\n你身边\t11451\n泪线\t11452\n豹子头\t11453\n鱼合轩\t11454\n讠rkddxhx\t11455\nｆｕｃｋｍ\t11456\n名号\t11457\nestate\t11458\n嗎丁啉\t11459\n想我不我也不想知道\t11460\n没天天凭借\t11461\n林天才\t11462\n十一元\t11463\n弹幕\t11464\n名句\t11465\njfuthr\t11466\n怔住\t11467\nkeep\t11468\n名叫\t11469\n蒋静芸\t11470\n可乐泡泡糖\t11471\n稽查\t11472\ndhf\t11473\n永远真心\t11474\n去旅游\t11475\n二三二五\t11476\n春联\t11477\n老婆类\t11478\nhjdkggj\t11479\n小门小样不要钱太卡超人\t11480\n3.1.2\t11481\nvnhcn\t11482\nDxisa\t11483\n鸭儿粑\t11484\n担当\t11485\n郭金\t11486\nDgdhg\t11487\n请点\t11488\n喜乐\t11489\n陪我过\t11490\n美帝国\t11491\n郭里\t11492\n五六分\t11493\n汪晴雨\t11494\nbaekhyun\t11495\n咕噜咕噜\t11496\nConversation\t11497\n倒计时\t11498\n博缕析\t11499\nMBLAQ\t11500\n再恩也\t11501\n佳怡\t11502\n987789\t11503\n踩刹车\t11504\n刘飞儿\t11505
\n杀不死\t11506\n谢暖\t11507\n185808\t11508\n奥不你\t11509\n光身子\t11510\n4n一一一一一一一一一一一n\t11511\n28日半夜\t11512\n触目惊心\t11513\n从来没有\t11514\n心胸狭窄\t11515\n于靖宇\t11516\n夏紫怡\t11517\n艾立丰\t11518\n13种\t11519\n未接到\t11520\n前身\t11521\n板野友美\t11522\ntishi\t11523\n睫毛君戳中[j扭秧歌\t11524\n钰平\t11525\n我还猪猪侠\t11526\n123456万\t11527\n走出去\t11528\n历尽\t11529\n飞的更高\t11530\n罗苹果\t11531\n投行\t11532\n太可\t11533\n古玩\t11534\n27分之一\t11535\n十指\t11536\n反腐倡廉\t11537\n人民网\t11538\n五小时后\t11539\n脚底板\t11540\n45.60%\t11541\n醉话\t11542\n幺零二幺\t11543\n哈那提\t11544\nSSSBBB\t11545\n徐样\t11546\n梦孙\t11547\n谢主隆恩\t11548\n孙名文丽\t11549\n2555566886555\t11550\n水利家\t11551\n陈铎锐\t11552\n我了我好想心\t11553\n荣耀\t11554\n排名第一\t11555\n这一天到晚\t11556\nbestrickwithto\t11557\n张梓萌\t11558\n微不足\t11559\n56592\t11560\n学历\t11561\n太开玩笑\t11562\n亚麻跌\t11563\n分行\t11564\n谢积\t11565\n这啥\t11566\n四零零八八六四零零\t11567\n汉普郡\t11568\n名杜\t11569\n家萱\t11570\n起承\t11571\n卫生棉\t11572\n一个一首发\t11573\neshisoomouzhitth\t11574\n阿万恶\t11575\n老泡儿\t11576\nQ名\t11577\n挡箭牌\t11578\n李朝军\t11579\n肖栩锐\t11580\n要不然不住亲你了我\t11581\n刘梓睿\t11582\n么么么么姐姐姐姐姐姐么么么么么么么么么么么么么么么么么么么么么么么么某某某某某某某某某某某某\t11583\n地中\t11584\ninor\t11585\n美腿\t11586\n道德准则\t11587\n米系\t11588\n二四五零七三三八\t11589\n81票\t11590\n夏苏末\t11591\n你侬\t11592\n地主\t11593\n男婴\t11594\n猪小能\t11595\nxvvnc\t11596\n古德耐\t11597\nHDHFHDJXK\t11598\n绿新村\t11599\n要命\t11600\n地下\t11601\n地上\t11602\n羊排\t11603\n燎饿\t11604\n世界最大钨矿诞生记\t11605\njsicua\t11606\n年度秘\t11607\n市场部\t11608\n给出身\t11609\n12月7日\t11610\n豆豆绿豆豆\t11611\n6.7\t11612\nbijf\t11613\n幺零四幺\t11614\n黑背\t11615\n摩圣频\t11616\n赵佳欣\t11617\n元字\t11618\n不分手\t11619\n困倦\t11620\n元子\t11621\nyourity\t11622\n1398750\t11623\n寒心\t11624\n中学籍\t11625\n感触\t11626\n答案我的人\t11627\n暖呀\t11628\n法花园\t11629\n宿州\t11630\n800043110\t11631\n坛子\t11632\n370块\t11633\n199\t11634\n2994162660\t11635\n群之马\t11636\n魄\t11637\n相看\t11638\n感觉\t11639\n女捕快\t11640\n郑州秘\t11641\nkunlol\t11642\n感观\t11643\n发育期\t11644\n珠海站\t11645\n嗯太太\t11646\n195\t11647\n挪威队\t11648\n逆战\t11649\noppoos\t11650\n先恨\t11651\n大家人\t11652\nhlonul\t11653\n赖中\t11654\n小菊菊\t11655\n核磁\t11656\n小皮海\t11657\nggjnbg\t116
58\n黄烨华\t11659\n191\t11660\n190\t11661\nufyfhxyg\t11662\n魏\t11663\n菜牙菜牙\t11664\n极美\t11665\n托拉\t11666\nnachnishaamkaure\t11667\n龚祥瑞\t11668\n创始者\t11669\n占地面积\t11670\n9月17号\t11671\n猫嘉裕\t11672\n小客钱\t11673\n文赋\t11674\n庆六一欢乐大本营\t11675\n魅兔\t11676\n两周度\t11677\n尼格买提\t11678\n肥城\t11679\n180米\t11680\n嚎啕大哭\t11681\n东亚\t11682\n哈有一\t11683\n天涯明月刀\t11684\n199元\t11685\n秘魅力\t11686\n哄哄\t11687\n胶北\t11688\n公婆石\t11689\n看着我死\t11690\n吧萝卜\t11691\n危楼\t11692\n清屏\t11693\n少小\t11694\nCytrzrzt\t11695\n热娜娜\t11696\n虹桥路\t11697\n客气话\t11698\n鹤壁职业技术学院校\t11699\n少将\t11700\n爱我想\t11701\n混色\t11702\n许丹瑜\t11703\n呼啦圈\t11704\n了害怕\t11705\n丽雪斯\t11706\n滴尼\t11707\n谢谢你的错\t11708\n不怕冷\t11709\n我是笑你\t11710\n疯不疯\t11711\naafyrvyfcdwxb\t11712\n姑咚咚\t11713\n好我是女\t11714\n27554904\t11715\n单个人\t11716\n吓破胆\t11717\n你好你好姐姐你好你好哥哥\t11718\n呢色即是空\t11719\n哎呀你就是我的秘书度秘\t11720\n口费\t11721\n越秀\t11722\n发不人就机4\t11723\n阿拉鹿晗帅\t11724\n个里\t11725\n虎门销烟\t11726\n贴面舞伴\t11727\n挑明\t11728\n大屠杀5\t11729\n度秘谪\t11730\nf一少\t11731\ndcbbqi\t11732\n000677\t11733\n2003年5月22日\t11734\n良民们\t11735\nXQER\t11736\n李月娟\t11737\n唏嘘不已\t11738\nAndroid4.1\t11739\n黄丛姗\t11740\n王美眉\t11741\n美女儿\t11742\n出房店\t11743\n驴牌\t11744\n二六二六二幺零二\t11745\n度秘谷\t11746\n看好\t11747\n恰恰相反\t11748\n后头\t11749\n德哈大法\t11750\n小煜尘\t11751\n主家\t11752\n主宰\t11753\n瓢水\t11754\n熊妈妈\t11755\n双舟蓑笠翁\t11756\n可不相忆\t11757\n冬瓜皮\t11758\n1000斤\t11759\n主审\t11760\n5500\t11761\n后天\t11762\n哈瑞\t11763\n挤公交\t11764\n陆明杰\t11765\n人生学问三宝\t11766\n汤猫\t11767\n离职\t11768\n852369147\t11769\n34568\t11770\n众善\t11771\n后备\t11772\n474455256\t11773\n黄克\t11774\n后复\t11775\n悲情案\t11776\n369248135\t11777\n啦啦啦啦啦啦啦啦我是萌妹子\t11778\n时隐时现\t11779\n2100米\t11780\nahyeygg\t11781\n箱底\t11782\n十一二岁\t11783\n3x1方\t11784\n锦麻麻\t11785\n平别\t11786\n洛城广场\t11787\n严泰雄\t11788\n无能\t11789\n高如飞\t11790\n带命带命\t11791\n17家\t11792\n菠菜\t11793\n会谈恋爱\t11794\n平分\t11795\nREFYH\t11796\n5888555505795985\t11797\n黄石人民广播电台\t11798\n抢滩\t11799\n底裤\t11800\n为你祝福\t11801\n喜戏\t11802\nabcdid\t11803\n姜子铭\t11804\n耐无\t11805\n文文不\t11806\n秘雅\t11807\n秘集\t11808\n文文丈\t11809\n长歌\t11810\n许昌县公安局\t11811\nabcpp\t11
812\n消渴\t11813\nmut\t11814\n泥奏凯\t11815\n世界\t11816\n055555666688888822555\t11817\nmur\t11818\n王小黑\t11819\n庐山\t11820\nmux\t11821\n准国脸\t11822\n四十岁时\t11823\n一拍一去\t11824\nmuf\t11825\nmua\t11826\n博友们\t11827\n悬浊液\t11828\nmum\t11829\nmul\t11830\n一摆\t11831\n拓宽\t11832\n孔欢欢\t11833\n明演\t11834\n亲我在\t11835\n放佛\t11836\n19p\t11837\nbjdbr\t11838\n苏错上加错v睡觉吧新安江不辛苦不\t11839\n89个\t11840\n李秘校\t11841\n独立版\t11842\n不要你做我的小秘\t11843\n无糟\t11844\n生存之道\t11845\n可知\t11846\n新天地\t11847\n21点26\t11848\n默逆\t11849\n沙莎\t11850\n老挝证交所\t11851\n一次点儿\t11852\n看得出\t11853\n晕沉\t11854\n胜女\t11855\n侠酷\t11856\n李呵呵\t11857\n激涨\t11858\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦哦咪都奥特咪奥特咪\t11859\n姜婷六\t11860\n龙里龙\t11861\n断崖\t11862\n黄河龙炎\t11863\nhast\t11864\nLG手机P705\t11865\n地球公转轨道面\t11866\n下班山\t11867\n幾句\t11868\n那你好度菲亚\t11869\n5500平方厘米\t11870\n李真\t11871\n隋瑞国\t11872\n劲注音\t11873\n蓦然\t11874\n黄小姐\t11875\n加装\t11876\n杨海坤\t11877\n漢子絕對\t11878\n十五亿多兆\t11879\n一大坨\t11880\n交机机\t11881\n苍狼长虹#\t11882\n扒皮\t11883\n陈若男\t11884\n猫都毛\t11885\n条纹状\t11886\n4页\t11887\nugghjjghu\t11888\n132nnn13877207602\t11889\nwrupddhu\t11890\n守规矩\t11891\nn757n7697nn6n0665555n948\t11892\n向着\t11893\n12点20分\t11894\n两百五五十件\t11895\n你当我的猫咪\t11896\n吓鄙视\t11897\n不好累\t11898\nXKo\t11899\n零六零六\t11900\n像别\t11901\n林家毅\t11902\nhreyx\t11903\n一个八十四一个\t11904\n受欢迎\t11905\neneloop\t11906\n油亮\t11907\n房伟兄\t11908\n清真烤鸭\t11909\n156只\t11910\n翔龙家纺\t11911\n小点旦\t11912\nggvfd\t11913\n事务\t11914\n六七百块\t11915\n挽回\t11916\n枪战王者\t11917\n真的好喜欢\t11918\n协定\t11919\n东季\t11920\n单饼卷\t11921\n孔明锁\t11922\n天打\t11923\n3ni\t11924\n天才\t11925\n天手\t11926\n出来玩\t11927\nMIMI\t11928\n十年半载\t11929\n3510几\t11930\n3g网络\t11931\n萎靡不振荡然无存\t11932\n罗丽素\t11933\n李婉舒\t11934\n聊了怪\t11935\n度秘不管你你你你\t11936\n别说了烦\t11937\n雷同化\t11938\n弹簧\t11939\n东子\t11940\n耶罗菊花\t11941\n全友卫浴\t11942\n80000\t11943\n开江县\t11944\n姜大声\t11945\n春哥\t11946\n唔知喔\t11947\n987654321\t11948\n十四十五\t11949\n天生日\t11950\n没看见\t11951\n吴承恩\t11952\n个头书\t11953\nWorldPay\t11954\n黄豆\t11955\n零距离\t11956\n二二八幺\t11957\n坑爹\t11958\n韩此\t11959\n巫峡\t11960\n道别拜拜\t11961\n哈哈我得一个人\t11962\nddgdff\t11963\n大吃大喝\t11964\n管家
伙食\t11965\nocean\t11966\n我君鸿宾馆\t11967\n陈光标\t11968\n吃饱了吧\t11969\n酸性\t11970\n葛份妮\t11971\n杨昊\t11972\n两条手\t11973\nviggich\t11974\n厅内\t11975\n体积小\t11976\n杨星\t11977\n魔教id我一生最爱的我非不可\t11978\n展博\t11979\n岁觉\t11980\ngunner\t11981\n开卷有益健康\t11982\n里啪啦\t11983\n庞海\t11984\n杨是\t11985\n肝儿\t11986\n杨春\t11987\n狠美\t11988\n28台\t11989\n28号\t11990\n比正房长\t11991\nhogiygogcguy\t11992\n至爱\t11993\ngava\t11994\n啦啦啦啦啦啦你是我的小宝贝小宝贝\t11995\n国庆节\t11996\n哎呀你好色\t11997\n天亮\t11998\n上小下大上重下轻的上小下大上轻下重c上大下小上今夏重\t11999\n链牌\t12000\n年夜\t12001\n驾驭酷夏\t12002\n爱情住我教\t12003\n瞪眼丸\t12004\n绳强套\t12005\n怪不\t12006\n天人\t12007\n产能\t12008\n霸道大气上档次\t12009\n干嘛充卡\t12010\n烧貌似\t12011\n黑娘\t12012\n年头\t12013\n八九\t12014\n中国联通沃\t12015\n多子\t12016\n天事\t12017\n尹老水\t12018\n方庄桥\t12019\n年大\t12020\n整点\t12021\n天井\t12022\n血液学\t12023\n会儿下次\t12024\n王梓琪\t12025\n阿弥陀佛\t12026\n狼样\t12027\n虚列支出\t12028\n赵吏\t12029\n截距\t12030\n9999\t12031\n丽嘉纯\t12032\n解放军\t12033\n荔城区\t12034\n9991\t12035\n炭火\t12036\n大妹妹\t12037\n频率\t12038\n愤然\t12039\n点首歌\t12040\n大红花一朵给你\t12041\n155548525556658885786571158000000000086\t12042\n猪夜班九蜘蛛只蜘蛛之一蜘蛛王诗安蜘蛛\t12043\nqegg\t12044\n在建\t12045\numeo\t12046\n月一号回广福\t12047\n柯南迷\t12048\n锋芝\t12049\n面对视\t12050\n懒癌\t12051\n嘀\t12052\n级别\t12053\nww点shangchaollu点com\t12054\n密钥\t12055\n鲜为人知\t12056\n日头\t12057\n雨耳田\t12058\n荡荡\t12059\n一的一百\t12060\n清远市职业技术学校\t12061\n青菜包\t12062\n王凯微\t12063\nhund\t12064\n牛兔\t12065\n100000000000000000000000000000000000000000000000000000元\t12066\n日夕\t12067\n不走了么\t12068\n296014597\t12069\n沒救\t12070\n婆婆头\t12071\n小个便\t12072\nppg\t12073\n佐樱漫大战奥特曼大战九九\t12074\nferg\t12075\nno老王\t12076\nppi\t12077\npph\t12078\n大学梦\t12079\n绘本\t12080\n九栋\t12081\npps\t12082\n洒馆\t12083\n偷情\t12084\n错误我是女的猜猜我是大人还是小孩儿\t12085\nppv\t12086\nppt\t12087\nfery\t12088\n潮海\t12089\n冯巩\t12090\n陈明\t12091\n至亲\t12092\n刀把你堂\t12093\n问茅\t12094\nthhghhjj\t12095\n皇隆制药\t12096\n乐迷风行\t12097\n潮流\t12098\n骗了谁\t12099\nwoshiZHUREN\t12100\n刚才下午\t12101\n抗母\t12102\n里边儿\t12103\n肉强食\t12104\n镇静\t12105\nzgjk\t12106\n弄行\t12107\n博士大肠\t12108\n2本\t12109\n剽悍\t12110\n山林\t12111\npp1\t12112\npp0\t1
2113\npp5\t12114\n2002年7月31日\t12115\nFFUUU\t12116\nehahds\t12117\n能耗\t12118\n遥不可及\t12119\n吴地林\t12120\n九五\t12121\n2月\t12122\n10点57\t12123\n潜孔钻孔\t12124\n傻妞儿\t12125\n破烂机器人\t12126\nmore\t12127\n早点睡觉吧乖乖\t12128\n2期\t12129\ndjor\t12130\n恩老婆\t12131\n晚了我走\t12132\n狗不定\t12133\n霖剔透摸囖\t12134\n捕房\t12135\n凑和\t12136\n晚上11点\t12137\n0.66\t12138\n过敏性\t12139\n鲨鱼\t12140\n异曲同工\t12141\n岁才\t12142\n村主任\t12143\n弹弹堂\t12144\n出门左拐\t12145\n大模哒哒哒木木\t12146\n足民\t12147\n彩砂\t12148\nEwan\t12149\n陈明金\t12150\n2016734\t12151\n很少\t12152\n扼死\t12153\n感谢你\t12154\n我很喜欢不想\t12155\npicture\t12156\n自重\t12157\n领刑\t12158\n那种人\t12159\n张本宇\t12160\n36963\t12161\n击球\t12162\n击理\t12163\n鹰文\t12164\n什尼\t12165\n石牌\t12166\n2100千克\t12167\ns秘秋醋\t12168\n黑社会\t12169\n000000000005500555\t12170\n一个九九\t12171\n讲法\t12172\n欧诺科技\t12173\n什小\t12174\n红璎栗\t12175\nhellonettomatouto\t12176\n打覅\t12177\n123222222222222222222222222222222222222222\t12178\n转即逝\t12179\n什千里足\t12180\n1周\t12181\n等等不慌走\t12182\n季后赛\t12183\n4十9二多少\t12184\n专人士们\t12185\n布里斯班\t12186\nroj\t12187\n336845687\t12188\n置景\t12189\nron\t12190\n侄女\t12191\n切讨厌\t12192\n门乂\t12193\n鲜果\t12194\n好不错\t12195\n双向\t12196\nrog\t12197\nroy\t12198\n金剑\t12199\n双合\t12200\n开不开心\t12201\nros\t12202\nrou\t12203\nrow\t12204\n194949\t12205\n12362807422\t12206\n沃支付商圈俱乐部\t12207\n电老虎\t12208\n110场\t12209\n碰住\t12210\nTsrhdeudiriqrhuiupaIrdzbdteorszkyfkhxhxfftgYjiditdkdkxkyddulufltiiyd\t12211\n武林贴\t12212\n老了过时\t12213\n非雌\t12214\n忒忒\t12215\n贵阳\t12216\n我要你现在哪有嘛\t12217\n孤独小议\t12218\n杨阿姨\t12219\n昌北机场女厕门\t12220\n悦烟网会\t12221\n萌辣\t12222\niedy\t12223\n下马\t12224\n罗刹\t12225\n台子\t12226\n佛印\t12227\n习桥\t12228\n郭琪\t12229\n决策层\t12230\n低等\t12231\n玛咖\t12232\nFauna\t12233\n扇子舞\t12234\nlaserisfor\t12235\n米虫\t12236\n5月20日13点14分\t12237\n罗列\t12238\n抽象\t12239\njamx\t12240\nDanger\t12241\n5555646625284\t12242\n言十巳\t12243\n2时29分\t12244\n音源\t12245\nbest3\t12246\n海鼎\t12247\n07412365\t12248\nGrace\t12249\n明天某地而生\t12250\n丢了\t12251\n同生\t12252\n女吧\t12253\n环境文明\t12254\n口化\t12255\n魅璃\t12256\n12月16\t12257\n多吃点\t12258\n一个字\t12259\n流畅\t12260\n虎背
熊腰\t12261\n流留\t12262\n灵活\t12263\n广州华\t12264\n昨天才\t12265\n20世纪80年代\t12266\n言辞\t12267\n真姬\t12268\n女吻\t12269\n排灌\t12270\n你是我的亲爱的你的小孩还\t12271\n口区\t12272\nPBS\t12273\n女同\t12274\n女名\t12275\n香菇炖排骨\t12276\n女吊\t12277\n塞太阳塞\t12278\n女吖\t12279\ngoes\t12280\n本网站\t12281\n何必须\t12282\n中信证券\t12283\n丢人\t12284\n多不少\t12285\n9号凌晨\t12286\nsgfug\t12287\n拿能\t12288\n观花\t12289\ngdxbkgdt\t12290\n走了然后\t12291\n个天翻地覆\t12292\n度赌博\t12293\n度妍\t12294\n创佳\t12295\n保尔\t12296\n度妖\t12297\ngegepa\t12298\n让你走\t12299\nskeleton\t12300\n王景怡\t12301\n结成\t12302\n700万元\t12303\n有法\t12304\n米特\t12305\n度妮\t12306\n李陀\t12307\n十九平米\t12308\n大笨狼\t12309\n12H\t12310\n万恶不赦\t12311\n29629\t12312\n碘盐\t12313\n非庙\t12314\n羊水\t12315\n创作\t12316\n一点点点点点点\t12317\n脑残神马协会\t12318\nandI\t12319\n万用国情论\t12320\n不野猫我是鬼\t12321\n一点味\t12322\n不只一个\t12323\ncrxr\t12324\n沧海\t12325\nyouarewelcome\t12326\n晚歇\t12327\n壳郎屎壳郎\t12328\n結束了\t12329\n玉罗\t12330\n克里\t12331\n促成\t12332\n一二号\t12333\n别了别了别了别了别了你不要脸\t12334\n呜子\t12335\nNikon\t12336\n不敢我告\t12337\n90斤\t12338\n白带带\t12339\n最美的\t12340\n唱一曲\t12341\n相见之刺\t12342\n我工作形式主义\t12343\n流放\t12344\n走着劲\t12345\n三盛\t12346\n3月十\t12347\nzishang\t12348\n蒙哥马利\t12349\n女类\t12350\n错懒\t12351\n2月27日\t12352\n郭思南\t12353\n阮经天\t12354\n千百十四\t12355\n雷斯特雷波\t12356\ntfchbv\t12357\n美国最高法院\t12358\n晚死\t12359\n吃弄\t12360\n适合\t12361\n成便宜\t12362\n二手车市场\t12363\ntryholyshu\t12364\n于小露\t12365\n嫁刁\t12366\n老公您就是我最美的依靠\t12367\n3185622251\t12368\n空伐\t12369\n夏建仁\t12370\n里啦\t12371\n話語\t12372\n下蛋\t12373\n争了是\t12374\n真不你\t12375\n采食\t12376\n窗子\t12377\n今到\t12378\n光滑\t12379\n冰城串\t12380\n1万八百八千八百九十三\t12381\n恩宝贝\t12382\n排骨粥\t12383\n涟漪\t12384\nhtgtb\t12385\n采风\t12386\n竹蔗\t12387\nchnncdnkc\t12388\n袁子玥\t12389\n涵妃\t12390\n股评\t12391\n和烦\t12392\n啊不我很舒服\t12393\n二十斤\t12394\n模範\t12395\n奶艄\t12396\n囊映雪\t12397\n二二百米\t12398\n发样玉\t12399\n祷告\t12400\n马月倩\t12401\n先福\t12402\niZXEWS\t12403\n七七关\t12404\n苏嘉瑜\t12405\nplus\t12406\n厚薄\t12407\nyigig\t12408\n大葱牛肉\t12409\n三四三\t12410\n三四下\t12411\n阿凡题\t12412\ngtyuvv\t12413\ndhgd\t12414\n纷呈\t12415\n赛尔堡\t12416\n100万元\t12417\n走了拜\t12418\n代签\
t12419\n痦子\t12420\n非庵\t12421\n丝袜\t12422\n沁阳市\t12423\n赵死心\t12424\n赵未希\t12425\nExclusive\t12426\n生来\t12427\n三四个\t12428\n傻瓜\t12429\n121\t12430\n笑了\t12431\n肉呕\t12432\n123\t12433\n见笑了笑\t12434\n124\t12435\n更迭\t12436\n我你是我的秘书\t12437\n9冫\t12438\n多咪伟\t12439\n12317节\t12440\n吕祖善\t12441\nGoodmonth\t12442\n小心老子黑了你\t12443\n小跑\t12444\n木马份\t12445\n我们结婚吧我喜欢你\t12446\nfanart\t12447\n小太监\t12448\n冰红\t12449\n跃下\t12450\n小橙子\t12451\n庞会\t12452\n小路\t12453\n崔子阳\t12454\n点土豆点酸菜点芹菜\t12455\n初装\t12456\n零一零八六零三九\t12457\n大心细\t12458\n多咪伦\t12459\n宝坻区\t12460\n第一胎\t12461\n我是好呢的话那你不就是号呢的的的烂你了\t12462\nyzzzyzy\t12463\n海绵厕所吧快点克\t12464\n看官\t12465\nTrapasso\t12466\n54564524964554\t12467\n008度\t12468\n错穷\t12469\n胸嗎\t12470\n下午三点\t12471\n稻穗\t12472\n好我的\t12473\n红豆沙沃尔\t12474\n盲区\t12475\n浮士\t12476\n防腐蚀\t12477\n和堡\t12478\n空格键\t12479\n创新\t12480\n耐科斯\t12481\n碧瑶\t12482\njsii\t12483\n感受到\t12484\n二十美元\t12485\n自行车\t12486\n田岙\t12487\n辛特拉海航度假酒店\t12488\n看网管\t12489\n巴道\t12490\n航班号\t12491\n遂川县\t12492\n滂沱\t12493\nfhjctxu\t12494\nlolerrabbcco\t12495\ngggtyeqwkxjie\t12496\n不停地\t12497\n碧蓝\t12498\n卢德祥\t12499\n87%\t12500\n滴亲\t12501\n我喜歡\t12502\n唱一首行\t12503\n好呀提or\t12504\n小小不要脸小心我不要你\t12505\n继续模型\t12506\n好啦亲爱的度秘\t12507\ngddeaes\t12508\n16.35次\t12509\n半岛电视台援引临时爱国委员会\t12510\n不可告人\t12511\n二二二十分钟\t12512\n母子\t12513\n哥哥不会\t12514\n放钱氏\t12515\n瑞珍\t12516\n2月28日\t12517\n第427\t12518\n最初的梦想\t12519\n13251468081\t12520\n68616966\t12521\n长春轨道客车股份有限公司\t12522\n酱油瓶\t12523\n瑞珠\t12524\n6……岁\t12525\n爱你爱你爱你爱你\t12526\n1978年间\t12527\n酒干\t12528\n不男不女｀O\t12529\n窃取\t12530\n一个街\t12531\n8866544\t12532\n一周堡\t12533\n古板\t12534\n陈太太\t12535\n花好稻\t12536\n对你\t12537\n咬唇\t12538\n五十名\t12539\n沛杰\t12540\n赵厅长\t12541\n白米饭\t12542\n朱明川\t12543\n李魅影\t12544\n潘志芳\t12545\n褒义词\t12546\n离任\t12547\n亲一个七一个五迷\t12548\n1551288228\t12549\n略阳县\t12550\n马王堆遗址\t12551\n只有人\t12552\n这一周\t12553\n76534\t12554\n后侧\t12555\n对位\t12556\n机皇\t12557\n一亿次\t12558\n山行\t12559\n3youyou\t12560\n北京武警总院医学美容中心&望族亚健康\t12561\n忙人独董\t12562\n西腔\t12563\npropqrstruvwsyz\t12564\n89928993千八百九十九架\t12565\n医用级\t12566\n喻路\t12567\
n临开里\t12568\n成辉\t12569\n咔咔啦啦\t12570\n善良\t12571\n三四钢\t12572\n都飞来峰\t12573\n亲我灿勋心累\t12574\n向向天歌白毛浮绿水红掌拨清波\t12575\nツjjen\t12576\n绵绵绵绵\t12577\n饶人\t12578\n很快点\t12579\n董艳红\t12580\n好呀好呀好朋友\t12581\nQuick\t12582\n6月3日\t12583\n拿不准\t12584\n欧诺\t12585\n赤西仁\t12586\n名利\t12587\n回答题\t12588\n周良良\t12589\n阻力\t12590\nghoi\t12591\nRangoon\t12592\n嫦娥\t12593\n50毫米\t12594\ngood男孩\t12595\n搭石\t12596\n无亦凡\t12597\n5555666669\t12598\nJenkins\t12599\n大四年级\t12600\nN918\t12601\n五年内\t12602\n半盆\t12603\n苏州大学\t12604\n姚晓阳\t12605\n一号两号三号四号\t12606\n潘欣玥\t12607\n还我喜欢\t12608\nknkmk\t12609\n七十王姐\t12610\n43544783698\t12611\n小马雅蠛蝶\t12612\n再见再见再见再见再见再见再见\t12613\n小笼堡\t12614\n中国科学院\t12615\n副所\t12616\n花中\t12617\n口暴\t12618\n43483\t12619\n想吃食\t12620\n旧体诗\t12621\n应否\t12622\n周西子\t12623\n龙傲娇\t12624\nokkjjj\t12625\n一个58\t12626\n唱一下\t12627\n倪帅\t12628\n幹嘛幹壞\t12629\n国资委\t12630\n很有趣\t12631\nicon\t12632\n很多片\t12633\n道不明\t12634\n活蛇\t12635\n大可怜\t12636\n李振豪\t12637\n心不甘\t12638\n心里得劲\t12639\n快捷键\t12640\n第n遍\t12641\n严晨\t12642\n香港华光航业集团\t12643\n地度秘\t12644\n花东\t12645\n花丛\t12646\n默玲\t12647\n除暴安良\t12648\n华为荣耀\t12649\n首别\t12650\n好啊好好吃吃吃吃吃\t12651\n室内词\t12652\n程刚\t12653\n三百六\t12654\n双狮\t12655\n被否决\t12656\n唐文镜\t12657\n顶配\t12658\n张笑\t12659\n五十五十个\t12660\n插件儿\t12661\nITUNES\t12662\n时间段位\t12663\n三百元\t12664\n度蜜度秘你好坏\t12665\n难忘\t12666\n快乐万事大吉\t12667\n耶贝贝\t12668\n三百克\t12669\n山西门诊\t12670\n机警\t12671\n反悔\t12672\nwwws\t12673\n首创\t12674\n红龙\t12675\n3月9\t12676\n没不好意思\t12677\n123班\t12678\njazq\t12679\n七三八百五十四\t12680\n低度\t12681\n见义勇为\t12682\n事锈\t12683\n瓦力酱\t12684\n开玩\t12685\n呗美\t12686\n13761492104\t12687\n泥水\t12688\n牌楼\t12689\ntudzh\t12690\n20142015\t12691\n塑料件\t12692\n求送\t12693\n流浪狗\t12694\n人人网\t12695\n德比\t12696\n达丽\t12697\n85只\t12698\n穷困潦倒\t12699\n蔡喜洋羊羊沸羊羊\t12700\n二百块\t12701\n王国江\t12702\n闷骚\t12703\n临时性\t12704\n上月14\t12705\nluo\t12706\n我我问你的是明天\t12707\n富国天利\t12708\n25252525\t12709\n首战\t12710\n业燕滔\t12711\n北京青年报\t12712\n嗯瓮邓\t12713\n五十来块\t12714\n77111225525800777777777443\t12715\n陈淑珍\t12716\n序列\t12717\n架飞机\t12718\n雨蝶\t12719\n陕西国防学院\t12720\n买票\t12721\n爆躁\t1
2722\n童日生\t12723\n讴歌\t12724\n几分\t12725\n生冷\t12726\n回收站\t12727\n5255564551948452584268555655\t12728\n徐经理\t12729\nFJJFJFFNJFIRIDFC\t12730\n语塞\t12731\n加上来\t12732\n贝妮\t12733\n对呀人教多读书俗话说得好人学到老活到老\t12734\n花猪\t12735\n花猫\t12736\n一个十万\t12737\n繁星点点\t12738\n10月5日\t12739\n去把\t12740\n一个十一\t12741\nl3岁\t12742\n88一天\t12743\n我喜欢歌神子白开水\t12744\n再分\t12745\n1.89亿元\t12746\n漫画了身为我的秘书\t12747\n0.15%\t12748\nyouer\t12749\n板鱼\t12750\n炫舞梦工厂\t12751\np999999\t12752\ngggggffff\t12753\n满德佳\t12754\n不是成天\t12755\n读取\t12756\n归心\t12757\n花红橙黄绿青蓝\t12758\n穷途末路\t12759\n999999\t12760\n狐影\t12761\n呦呦切克闹\t12762\n龙辉\t12763\n999992\t12764\n拐仗\t12765\n一毛行\t12766\n一爪\t12767\n啦啦啦得\t12768\n快点儿别磨磨蹭蹭\t12769\n冯南\t12770\n好好生活\t12771\nehddhchchcbddbdwnjqqjzzjjhvbggzggwhshdhcchffxhhdjjdhhhdhsgssgdbb\t12772\n刁泽赫\t12773\n诶好嘞\t12774\n长虹\t12775\n1358834985\t12776\n元江下过雪\t12777\n铁帧\t12778\nxffx\t12779\n绝招\t12780\n表态\t12781\n擦干\t12782\n105国道\t12783\nLR4l\t12784\ncruoga\t12785\ncfkhdxvj\t12786\n穿越主攻小心女穿男\t12787\n电子办\t12788\n电表\t12789\n喜羊羊与灰太狼之嘻哈世界\t12790\n刘东富\t12791\n高村\t12792\n水面\t12793\n李英婵\t12794\n啦啦啦啦啦啦啦啦啦啦啦啦度秘度秘\t12795\n嗯一身\t12796\n烧煤\t12797\n八嘎牙路\t12798\n饰貌\t12799\n嘻嘻块\t12800\n按降价\t12801\ncase\t12802\n异缝\t12803\n高杰\t12804\n三包\t12805\n天雷滚滚\t12806\n水听器\t12807\n1246318\t12808\n木棒\t12809\n凉棚\t12810\n高条\t12811\n烷值植物大战僵尸二\t12812\n梦辰\t12813\n退隐江湖\t12814\n范荧荧\t12815\n咪真\t12816\n大哥\t12817\n秋楚妩\t12818\n他来咯努努力军\t12819\n就喜欢\t12820\n大哭\t12821\nmffev\t12822\n臂膀\t12823\n知音\t12824\n瑞润\t12825\n雷星雨\t12826\n122222285101001千一九百九十九\t12827\nmyonlysunshine\t12828\n多一次\t12829\n多一欢\t12830\n贝加尔湖\t12831\n10虚岁\t12832\n米汤米唔嗯诶油蔡一中\t12833\n233号\t12834\nndnnd\t12835\n狗干\t12836\n9816\t12837\n大哈\t12838\n9812\t12839\n露比\t12840\nljh\t12841\n见末\t12842\n球秘\t12843\n大哒\t12844\n太专一\t12845\n安俊龙\t12846\n某女\t12847\n北漂\t12848\n球秒\t12849\n大小度\t12850\n道香蛋蛋\t12851\n吴鹏\t12852\n阴差阳错\t12853\n与事\t12854\n跳路\t12855\n你是我的秘书你敢卖我吗你敢卖我\t12856\nxpphil\t12857\n冉冉\t12858\n跳跳\t12859\nNBA全明星赛\t12860\n先卖\t12861\n很高心\t12862\n40款\t12863\n跳跃\t12864\n十分钟\t12865\n同理心\t12866\n你你你\t12867\n失去逛
逛\t12868\n徐子雯\t12869\n与人\t12870\n7000年\t12871\nj1xmjm\t12872\n被罩\t12873\n初夜\t12874\n霸天虎\t12875\n不要發這種東西\t12876\n48555554444444\t12877\nPine\t12878\n闪现\t12879\n初夏\t12880\n凌潮\t12881\n没有你好\t12882\n六八六八二五七二\t12883\nQQ空间\t12884\n4559多少\t12885\nyfsschkfsc\t12886\n谭小顺\t12887\n泉州市区\t12888\n祘苗\t12889\n二逼\t12890\n8118288838\t12891\n冼国林\t12892\n咯going\t12893\n定重\t12894\n定量\t12895\n缘之空\t12896\n骂骂\t12897\n季你\t12898\n定金\t12899\n度秘你好呀我是你的主人\t12900\n脚踝骨\t12901\n固定电话儿\t12902\n太饱\t12903\n九墅\t12904\n吞吞\t12905\n白丁子狗\t12906\n五分之35分\t12907\n桑仁\t12908\n高端\t12909\n三大类\t12910\n想和你\t12911\nf秘肉\t12912\n刘淑婷\t12913\n耳朵杯\t12914\n盖头\t12915\nCvf\t12916\n范bb\t12917\n唧天\t12918\n老陈醋\t12919\nhyteess\t12920\n土豆丝\t12921\n贪恋\t12922\n百战百胜\t12923\n团聚吧\t12924\n啥你是谁你是谁你是谁你是谁我是谁\t12925\n躺律\t12926\n99分钟\t12927\n声优度\t12928\n猛增\t12929\n谷粒\t12930\n97口\t12931\n谷粑\t12932\n我摸\t12933\n林嘉乐\t12934\n严谨良\t12935\n讲堂誉\t12936\n麦迪度\t12937\n大王\t12938\n马勃凯\t12939\n一潭死水\t12940\nheiping\t12941\n上手\t12942\n鹰眼\t12943\n付兵茵\t12944\n怜香惜\t12945\n犬哥神\t12946\n永有\t12947\n上扇\t12948\n女妖\t12949\n说不上\t12950\n悄悄斗\t12951\n屠屠\t12952\n干什么\t12953\n莫宏欢\t12954\n第二栋\t12955\n鸡沥\t12956\n赌气\t12957\n翻动\t12958\n七十多分\t12959\n上扬\t12960\n周敏\t12961\nttop\t12962\n透视眼\t12963\n恐懼\t12964\n陆贞传奇\t12965\n魔图\t12966\n叫化骨错\t12967\n射洪\t12968\n闹木可爱的孩纸们肿么\t12969\n999次\t12970\ntrddrrrrrrrrt\t12971\n明明知道\t12972\n陈阿扁\t12973\n发泄\t12974\n百丽\t12975\n谷超豪\t12976\n张云年\t12977\n百两\t12978\n逆声音\t12979\n发泐\t12980\n不能不想\t12981\n不太卡\t12982\n发法\t12983\n百中\t12984\n女你不就是我的小秘书\t12985\nQQ6704\t12986\n百个\t12987\n韩冬阳\t12988\n花花花花花花花花\t12989\n王培祥\t12990\n百世\t12991\n蛇肉\t12992\n理想主义\t12993\n秘花园\t12994\n拿起\t12995\n117斩么\t12996\n拿走\t12997\n敌对\t12998\n属份\t12999\n百万\t13000\n百一\t13001\n阿发西\t13002\n御坊堂v\t13003\n百丈\t13004\n百三\t13005\n哇了不起\t13006\n前照\t13007\n六套\t13008\n刘天堂\t13009\n体操\t13010\n两行\t13011\n丁小说\t13012\n调音\t13013\n秘度度\t13014\ncgfx\t13015\n讨厌你我不喜欢你我讨厌讨厌讨厌讨厌你\t13016\n樱樱\t13017\nmchsvz\t13018\n学兴县\t13019\nBelieve\t13020\n跑步\t13021\n买好秘\t13022\n一个岁\t13023\n服贴\t13024\n小黄油\t13025\n京山\t13026\n张含笑\t13
027\n邱朋炜\t13028\n蒙太萌\t13029\nFANS们\t13030\n干吧干\t13031\n敲门儿\t13032\n下行\t13033\nhttpehiphotosbaiducomxiaodupicitem29381f30e924b899252bc54d69061d950a7bf6b4jpg\t13034\n麦饭石\t13035\n九分之一\t13036\n继菊\t13037\n脑筋急转弯呗\t13038\nwebtop\t13039\n哨口\t13040\n唐禹哲\t13041\n这干嘛\t13042\n烤螃蟹\t13043\n黄树叶\t13044\n下衣\t13045\n一败涂地\t13046\n董卿平\t13047\n蔡烨\t13048\n哥刚\t13049\n蛮\t13050\n4360\t13051\n支舞\t13052\n饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿\t13053\ndufire\t13054\n永驻\t13055\n55家\t13056\n东坡肉\t13057\n钱天\t13058\n蛹\t13059\n爱玛\t13060\n华藏寺\t13061\n蛰\t13062\n产地证\t13063\n屁股东西\t13064\n造就好啦\t13065\n禁歌\t13066\n蛊\t13067\n蛋\t13068\n碗碗\t13069\n钱多\t13070\n灵冷\t13071\n钱够\t13072\n聪明的朋友\t13073\n化动\t13074\n马达\t13075\n蛀\t13076\n蛆\t13077\n蛇\t13078\n六月耶\t13079\n禁止\t13080\njaoak\t13081\n蛙\t13082\n219542938\t13083\n蛟\t13084\n哈哈子\t13085\n血汗\t13086\nfeasoo\t13087\n希尔龙\t13088\nmummies\t13089\n抢夺\t13090\n青苹\t13091\n9985832\t13092\n九点10点11点12点13点14点15点16点17点18点19点20点21点22点23点24点\t13093\n麦兜\t13094\n走恰\t13095\n可不好呀\t13096\n啦噜\t13097\n画布\t13098\n米摔\t13099\n处理\t13100\nihfyjhffj\t13101\n水果姐\t13102\n麦克\t13103\nbamba\t13104\n更完了\t13105\n我讨厌你狠死你\t13106\ngjjjppppppptj\t13107\n胡夏微群\t13108\n本社\t13109\n托马斯·伯恩哈德\t13110\nTHOR\t13111\n出题\t13112\n点度秘\t13113\n阿拉擞\t13114\n西at\t13115\nTITO\t13116\nchuen\t13117\n勤售\t13118\n高冷\t13119\nwww.4008828288.com\t13120\n278侠\t13121\n婚姻缘\t13122\n重现\t13123\n好多岁\t13124\n对呀你惹我生气\t13125\n变不出来\t13126\nzhbcbc\t13127\n劢才\t13128\n七万多\t13129\nminidash\t13130\n谢你懂\t13131\n跟煤\t13132\n挑写\t13133\n两条\t13134\n奥运会#\t13135\n遵从\t13136\n两来\t13137\nIcanspeakEnglish\t13138\n卢大少\t13139\n两杯\t13140\n八十遍\t13141\n今日凌晨\t13142\n怨灵\t13143\n恩佐·法拉利\t13144\n孙泽超\t13145\n舜德乡阳\t13146\nassist\t13147\n100几\t13148\n里里\t13149\n十二宫\t13150\n哼你好讨厌我不和你聊\t13151\n魏兄\t13152\n惊奇\t13153\n英语作业业OK\t13154\n张从杨\t13155\n餐馆儿\t13156\n好句太心\t13157\n命滴\t13158\n例站\t13159\n全部子\t13160\n不要脸真不要脸太不要脸\t13161\ngjglg\t13162\nln\t13163\n淼淼淼淼\t13164\nllk\t13165\n郭i\t13166\n黑蚂蚁\t13167\n二六五二零\t13168\n边疆行\t13169\n一个两个\t13170\n巧娅\t13171\n有聋无\t13172\n陈科诺\t13173\nllo\t13174\n魏
贵保\t13175\n郭v\t13176\n配合\t13177\n武朝科\t13178\n兼听\t13179\n1昂\t13180\n补锌萌\t13181\n五月激情网网\t13182\n解析社会福利运动\t13183\n瓜瓜瓜瓜瓜瓜瓜\t13184\n98%\t13185\n磨霓\t13186\n帅哥哥\t13187\n单馆\t13188\n中国海监\t13189\n憨堡\t13190\n佳能IXUS105数码相机\t13191\n干槐树叶\t13192\n5346246464372\t13193\nwomm\t13194\n刪掉\t13195\n李丽芬\t13196\n追讨\t13197\n不要脸的脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t13198\n测试\t13199\n青春挽歌\t13200\n美好的\t13201\nwomb\t13202\n失踪处\t13203\ntoes\t13204\njjgddajtjmd\t13205\n科学院\t13206\nokju\t13207\n魏叔叔\t13208\n棉花糖\t13209\n詹姆斯\t13210\n说呀\t13211\n哈木\t13212\n不善良的人\t13213\n一个九百\t13214\n精华\t13215\n替代\t13216\n我去\t13217\n棒杀\t13218\n理听\t13219\n招名巨了然\t13220\n兵不血刃\t13221\n你的意思\t13222\n说呢\t13223\n点数\t13224\n十天边\t13225\n灰皮诺葡萄\t13226\ni7ig\t13227\nKBS音乐会\t13228\n阿赖耶\t13229\n悲伤\t13230\n推导题\t13231\nNTMJSCSBYG\t13232\n呼死你个\t13233\n黄濑\t13234\n马晓芹\t13235\nbbbbbbbb\t13236\n两百岁\t13237\n一转眼\t13238\n乳胶漆\t13239\n谢马\t13240\n装酷\t13241\n摆件\t13242\nfdgegge\t13243\n心急火燎\t13244\n车迷们\t13245\n下班好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦\t13246\n哥媳妇\t13247\n驻马店\t13248\n度秘微\t13249\n活\t13250\n十三天\t13251\n呆若木鸡\t13252\n埃德蒙\t13253\n三岁岁\t13254\n40千克\t13255\n40年\t13256\n女警\t13257\n2月16号\t13258\n忠心\t13259\nrydy\t13260\n苏场\t13261\n十寒\t13262\n近两日\t13263\n福特汽车\t13264\n来珠\t13265\n叹萧杀\t13266\n拜托夺\t13267\n康妮\t13268\n十寸\t13269\n更多种\t13270\n病人家\t13271\nrodeoff\t13272\n009米\t13273\n274倍\t13274\n4SB\t13275\njsgjshsbshsbhdbdhjdbbsujsdhwuidhduhshkdbdishdjsshndhshsbsujdhdjdddjidddddjsjjssjjdhdnddhdh\t13276\n大好人你是大好人\t13277\n52策\t13278\n苏圈\t13279\n那个词\t13280\n塑料品\t13281\nstarshasAmerican\t13282\n握拳\t13283\n镶边\t13284\n够了你是\t13285\n字母哥\t13286\n彩排党\t13287\n拓玛\t13288\nugkggge\t13289\n大化县\t13290\nhttpmp4123456mp4comnewwm2016010115%E5AEAEE4%B88BE88FAFE\t13291\n大白粉\t13292\n上七天\t13293\n老年\t13294\n老干\t13295\n批判\t13296\n妖妖舞戳\t13297\n時可俐\t13298\n泊头\t13299\n泽国\t13300\nKKBOX\t13301\n认识了不到\t13302\n三妻四\t13303\n民间组织\t13304\n白斩胡\t13305\n潮&amp\t13306\nhsdgdgrhdhrhdhrhdhrhfrhfhfhf\t13307\n耽美重生娱乐圈\t13308\n回眸一笑\t13309\n嫁就嫁\t13310\n55787455\t13311\n3.6万公里\t13312\n寿面\t13313\n橡皮塞\t13314\n点站\t13
315\n出兵\t13316\n喷屏\t13317\n二锤子\t13318\n几百位\t13319\n2882点\t13320\nhshshssjdjd\t13321\n唔有\t13322\n无限大\t13323\n双飞燕\t13324\n虾仁\t13325\n46870858\t13326\n万喜\t13327\n硬件\t13328\n塞班\t13329\n一般而言\t13330\n不知说\t13331\n第22年\t13332\n好不好我求求你了行不行\t13333\n撒比\t13334\n吐露\t13335\n晓露\t13336\n123211068\t13337\n大帅\t13338\n白开\t13339\n最来一\t13340\n爆了吧\t13341\n黄裙\t13342\ndddff\t13343\n难觅\t13344\nbrothe\t13345\nwlxvbnmlkgkjljigpofuyodoupdphf\t13346\n守则\t13347\njiw\t13348\n5sing\t13349\n临近\t13350\n压场\t13351\n63457468\t13352\n死墓\t13353\n忘记你\t13354\n张盼兮\t13355\n白弧\t13356\n乾淨\t13357\n难解\t13358\n112册\t13359\nhfhfghokhhgccshsdaskgxvjcxxfgdddfvbbcbm\t13360\n合家\t13361\ngaligaga\t13362\ncollamy\t13363\nPPP\t13364\n合宿\t13365\nhymncfkis\t13366\n给破\t13367\n二十一起\t13368\nchvf\t13369\n哎呀呀\t13370\n捎带\t13371\n倍儿兴奋\t13372\n真不错\t13373\n起夜\t13374\n嗯呢我爱你我真的爱你我想要你孤独\t13375\n公斤级\t13376\n见晚爱\t13377\n干嘛子\t13378\n别胡闹\t13379\n西青\t13380\n南宫昭延\t13381\n起头\t13382\n扛不住\t13383\n值时\t13384\n撸啊撸流浪法师\t13385\n西面\t13386\n身高\t13387\n梦结局\t13388\n刘子琴香\t13389\n轮转\t13390\n哎呀呵\t13391\n值日\t13392\n千金女贼\t13393\n不明天见\t13394\n说服力\t13395\nMr.simple\t13396\n劳劳命的牢一一大的一毛一哪是濠一哪\t13397\nNBA201\t13398\n主意天\t13399\n朱素云\t13400\n中东路\t13401\n雅马哈\t13402\nEXCEL\t13403\n恋抱\t13404\n反对唱\t13405\n骂真是\t13406\n隆安\t13407\n茜爸\t13408\n副导演\t13409\n戴好好笑\t13410\n雨霖铃\t13411\nBE值\t13412\n刘兆芝\t13413\n17：00\t13414\nrfhshefhhhdchjcbfjcjdhcchdhhhcjjxchbch\t13415\nAnglababy\t13416\n周国标\t13417\nsehuli\t13418\n花粉\t13419\n王宝\t13420\n3764643768428872924455225223218662\t13421\n失物招领\t13422\n1660714286多少\t13423\n34邓\t13424\n笨猪花猪\t13425\n辅助法\t13426\ny01\t13427\n王昱辰\t13428\nGORE\t13429\n掌法\t13430\n沸羊羊\t13431\n林枫松\t13432\n拖累\t13433\n胡彦\t13434\n灵机妙妙\t13435\n豪门妇\t13436\nnome\t13437\n花粥\t13438\n黄少鹏\t13439\n必要你\t13440\nArsenal.特里\t13441\n877644\t13442\n冬咪\t13443\nBlanc\t13444\n變成\t13445\n早已\t13446\n天时地利人和\t13447\n难以\t13448\n女孩子女孩子\t13449\n赖脸\t13450\n卑鄙讨厌\t13451\n4426547\t13452\n1234567899999999990000002554552744452137098\t13453\n这个世界上份\t13454\n长裙\t13455\n玛勒格碧松首\t13456\n范冰冰美\t13457\n四章\t13
458\n安基迦奥特曼\t13459\n如一\t13460\nshggh\t13461\n延捷\t13462\n我喜欢一句话\t13463\n999小游\t13464\ngph\t13465\n渐浓\t13466\n12355468568866\t13467\n呀你好帅\t13468\n人娄\t13469\n长裤\t13470\n反弹\t13471\noffice2010\t13472\n为指导\t13473\n辽宁衡业飞豹篮球俱乐部\t13474\n节军\t13475\n退队\t13476\n罚抄\t13477\n史余良\t13478\n阿克斯\t13479\n普气\t13480\n38.02元\t13481\n刘晓明\t13482\nojngyv\t13483\n王璞禛\t13484\n故障\t13485\n民国\t13486\n圣水寺\t13487\n赵昕悦\t13488\n就是众里寻他千百度\t13489\nwudh\t13490\n秘度秘\t13491\n维护\t13492\n揉搓\t13493\n其形\t13494\ngobod\t13495\n宫斗群\t13496\n灼烧\t13497\n彭伶俐\t13498\n灼热\t13499\n恩晚安玛丽苏玛丽苏玛丽苏\t13500\n度秘我要你的字\t13501\n我和你相爱你就杀我\t13502\n12348\t13503\n急需要\t13504\n12341\t13505\n12340\t13506\n慰安妇\t13507\n仿古\t13508\n12345\t13509\n12344\t13510\n痛改前非\t13511\n卡农变奏曲\t13512\n牌联村\t13513\n4444s\t13514\n妖气冲天\t13515\n讲一下\t13516\n洗牙\t13517\n自言自语语重心长\t13518\n大奖\t13519\n大奔\t13520\n文采苑\t13521\n心眼子\t13522\n天翼客户俱乐部\t13523\n曹安\t13524\n初代偶\t13525\n干典\t13526\n就是们\t13527\natm\t13528\n东西楼\t13529\nnoblemansteppedoutand\t13530\n邓俊豪\t13531\n大奶\t13532\n大女\t13533\n石庄\t13534\n九点上午\t13535\n大好\t13536\n44444\t13537\nchnvvm\t13538\n僵粉\t13539\n牛什\t13540\n勤勤恳恳\t13541\n朱桢\t13542\ngogdcv\t13543\n朱桦\t13544\n没有声\t13545\n维肤\t13546\n满不在乎\t13547\n牛仔\t13548\navss\t13549\n安安晚安\t13550\n见不上\t13551\n皇极寒\t13552\n一的了还有熊出没之熊心归来给我\t13553\n当然\t13554\n自暴\t13555\ndjdnndnd\t13556\n勒姆\t13557\n吆吆毖\t13558\n零四幺九\t13559\n711111111111111111111111111111111111111111111111111111111111111111111\t13560\n高清晰\t13561\n三十三十八\t13562\nNoyas\t13563\n打纬机\t13564\n川人\t13565\n折叠椅\t13566\n奥太廊\t13567\n纯\t13568\n减税\t13569\n纭\t13570\n纪\t13571\n纫\t13572\n纩\t13573\n约\t13574\n级\t13575\n纤\t13576\n菜式\t13577\n红\t13578\n纣\t13579\n纠\t13580\n线\t13581\nvvfhfybhbkk\t13582\n纽\t13583\n三五万\t13584\n纸\t13585\n纹\t13586\n纶\t13587\n纵\t13588\n纲\t13589\n纳\t13590\n哎艺珠\t13591\n纱\t13592\n544252\t13593\n君不见\t13594\n喔唷\t13595\n又哭\t13596\n商法\t13597\n纀\t13598\n开心的三年最小看我了你对战队小游\t13599\n建行学院路\t13600\n拉菲屏\t13601\narevery\t13602\n三五个\t13603\n皮度\t13604\n鲍勃\t13605\n蜜豆蜜豆\t13606\ntxtyou\t13607\ndfchyfhiufdvkohfdsadujjvrtuigchjvcghgsfyijmg
ghuokkgfswertf\t13608\n只要\t13609\neverybaby\t13610\n180多万一亩\t13611\n越来赶快\t13612\n盖住\t13613\n奖牌\t13614\n骗人的你骗我没有坏处你看我无辜\t13615\n走过来说\t13616\n扩词\t13617\n六二九八六\t13618\n颜颜\t13619\n带雨晚\t13620\n300225\t13621\n双向化\t13622\n放不为谁而作\t13623\n全额\t13624\nentinyou\t13625\n籍贯\t13626\nxt88\t13627\n占领\t13628\n愚昧无知\t13629\n润土\t13630\n根深叶茂\t13631\n外耳\t13632\n苏梓卿\t13633\n壮大将会\t13634\n王金逸\t13635\n][\t13636\n加锁\t13637\n大新庄一村\t13638\n唉害\t13639\n茭白雪菜肉片\t13640\n谗害\t13641\n心灵感\t13642\n出汗白\t13643\n韩饭新\t13644\n对呀人家的自由我吗你要你\t13645\n扛不起\t13646\n非机动\t13647\n艺儿\t13648\n炉盖\t13649\n中华民国国旗\t13650\ntaodou\t13651\n0676124632500799856455078689213121322222222222222222222\t13652\n我不爱你我就是恨死\t13653\n大鸡翔\t13654\n重修\t13655\n悚负猪\t13656\nhuuut\t13657\n丑头\t13658\n要钱行\t13659\n小花猫\t13660\n猪头猪脑猪踩\t13661\ntomasousome\t13662\n赵明强\t13663\n诱人\t13664\ntt君\t13665\n87.98元\t13666\n心理特重\t13667\nuur\t13668\n上聊\t13669\n脱水\t13670\nuui\t13671\n好多多人\t13672\nuud\t13673\n喽凯蒂\t13674\n1111111龙龙龙龙龙龙龙龙龙龙龙嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t13675\nlially\t13676\n预算\t13677\n一辆策\t13678\n徦点\t13679\n波斯\t13680\n四餐\t13681\n大个儿\t13682\n近照\t13683\nffgh\t13684\n领航者\t13685\n好很好\t13686\nuuD\t13687\n高阳华\t13688\n告诉你的名字\t13689\nFffj\t13690\n七喜\t13691\n吧热\t13692\n小s\t13693\n四分\t13694\n什么然决然\t13695\n信得过\t13696\n爱求\t13697\n零五零二\t13698\n宣告\t13699\n四则\t13700\n四列\t13701\n磊心\t13702\n到不到\t13703\n泰迪关\t13704\n好兴混\t13705\n5个月\t13706\n花明楼\t13707\nIbrahim\t13708\ncccaccbacccac\t13709\n06254\t13710\n别爆我通讯录\t13711\n闫错\t13712\njkhx\t13713\nbootalithotomis\t13714\n嗯好啦\t13715\n桑心\t13716\n君臣\t13717\n晨阳\t13718\nfkbx\t13719\n3时\t13720\n啊峰\t13721\n黎公公\t13722\n好呀少\t13723\n金针菇\t13724\n锡业\t13725\n天璞\t13726\n嗯也有\t13727\n抗菌\t13728\n二十桶八十头\t13729\n油烟味\t13730\n猛攻\t13731\n傻子傻子啥子啥子啥子\t13732\n重叠门重叠门屠炎\t13733\n港澳台\t13734\n你好度秘我是王子安\t13735\n159794192\t13736\n清薄\t13737\n梁宏达\t13738\n玛尼玛尼\t13739\n范冰\t13740\n吻\t13741\n猪猪龙\t13742\n恩晚安\t13743\nstyw\t13744\n13540895032\t13745\n同一句话\t13746\n哈哈哈\t13747\n履行\t13748\n本周末\t13749\n笔数\t13750\n永远会\t13751\n齙磊\t13752\nhiwh\t13753\n依婷亲\t13754\n谢谢你我真的喜欢\t13755\n五九王宁\t13756\n学期
学\t13757\n亦婷\t13758\n坏坏人大\t13759\n老美老美你好美\t13760\ntothe\t13761\n豢养\t13762\n照李\t13763\n花车\t13764\n9\t13765\n越气\t13766\ndaughters\t13767\n松懈\t13768\n聂耳臻\t13769\n刘诗洁\t13770\n提起来\t13771\n柯贤芬\t13772\n深巷\t13773\n寂灭性\t13774\n61.35\t13775\ni次\t13776\n嗯植物\t13777\n相声\t13778\n20万日元\t13779\n马乐琪\t13780\n百待\t13781\n星海\t13782\n王新奇\t13783\n盘锦\t13784\n呸真\t13785\n小T\t13786\n徐峥\t13787\nkfgjhkvn\t13788\n杜傻子\t13789\n星浃\t13790\n百德\t13791\n五二二六\t13792\n猜湿\t13793\ntomakea\t13794\n这季\t13795\n了不我\t13796\n金太阳16—11—68C\t13797\n李小\t13798\njgjjllkgo\t13799\n助线\t13800\n我们班\t13801\n今年1月31号\t13802\n我是真心的可是你现在这样我不喜欢你\t13803\n快跑\t13804\n成都\t13805\n医用酒精\t13806\n架滚\t13807\n哇靠度秘可爱的我无言以对\t13808\n不足\t13809\n喵个咪\t13810\n软绵绵\t13811\n瓜瓜\t13812\n彩妆\t13813\nNokia\t13814\n四百子\t13815\n有我知道\t13816\n松姐\t13817\n教题\t13818\n一七场\t13819\n蜜橙妾\t13820\n世界本\t13821\n瞎缩\t13822\nLost\t13823\n2255467594\t13824\n成天成晚\t13825\n长胜物流园园\t13826\n八月天\t13827\n摸摸摸摸\t13828\n周宇翔\t13829\ncofo\t13830\n咪小咪\t13831\n曖昧\t13832\n咱家\t13833\n6606\t13834\n李尹\t13835\n2011年2月10日\t13836\n新闻直通车\t13837\n伯约\t13838\n大众车\t13839\n三小只\t13840\n田舍\t13841\n拜访\t13842\n554255252\t13843\n半夜1点10分\t13844\n击垮\t13845\n我生的你不了了孝真海地长吻上嫂孑\t13846\n名道\t13847\n猪猪猪度秘\t13848\n拜让\t13849\n铂欣\t13850\n臭豆豆\t13851\n丁字路口人车分流无阻碍畅行立交桥\t13852\n乘客\t13853\n歌棒\t13854\n恋舞oL\t13855\n乐维滋疏乐\t13856\n战斗群\t13857\n饭票\t13858\n真好真\t13859\n熔炉\t13860\n呵户\t13861\n机智\t13862\n没板\t13863\njoity\t13864\n再来噎\t13865\ngcugxi\t13866\nisowk\t13867\n我是你的爱人猫\t13868\n真好看\t13869\n借钱\t13870\n李某\t13871\n恋舞ol\t13872\n冷不\t13873\n李柔\t13874\n强尼\t13875\n折戟\t13876\n折成\t13877\n气球\t13878\n自盟\t13879\n大席\t13880\nBOyS\t13881\n花灯节\t13882\nyoooo\t13883\n熔点\t13884\n8个小时\t13885\n熔炼\t13886\n#无忌\t13887\nNishi\t13888\n二次\t13889\n开颜\t13890\n宫村\t13891\nbchcxb\t13892\n普里\t13893\n我最我最我最死你\t13894\n辣洪产\t13895\n看家听话\t13896\n高尔基\t13897\n播播\t13898\n十年树木\t13899\n时貌\t13900\n姑射\t13901\n小葵骨\t13902\n标食药监局\t13903\n朱近赤\t13904\n秘秘你看过龙门飞甲\t13905\n黑鱼\t13906\n播撒\t13907\n忠哲输入法\t13908\n大颈小\t13909\n一溜\t13910\n三尚市\t13911\n余辛冉\t13912\n陈小\t13913\n喜羊羊之灰太狼闯世界\t13
914\n心知肚明\t13915\n倍\t13916\n泥灸\t13917\n個\t13918\n大脑皮层\t13919\n至关重要\t13920\n倆\t13921\n基德侠\t13922\n白象老凤萍的你这个疯婆子\t13923\n孙世博\t13924\n嗯秘度\t13925\n太搞笑吧臭婆娘\t13926\n借\t13927\n倚\t13928\n倘\t13929\n候\t13930\n方明玉\t13931\n倔\t13932\n倒\t13933\n派礼\t13934\n們\t13935\n肯航\t13936\nUjhb\t13937\n倪\t13938\n倩\t13939\n倦\t13940\n吴延军\t13941\n三十秒后\t13942\nlong\t13943\n倾\t13944\nQWQ\t13945\n值\t13946\n债\t13947\nsoulatiticatotomi\t13948\n第三年\t13949\nueowjdbch\t13950\n要是麻\t13951\n晚生\t13952\n高佳军\t13953\n米发\t13954\ngcfvcxczcxfcxcccxf\t13955\n记录事\t13956\n3131313131331313131\t13957\n你是谁你是的话\t13958\n6分之五\t13959\n55542242555525528\t13960\n虎溪小学\t13961\n卑恋\t13962\n迈入\t13963\n芋圆\t13964\njkxkxkdjdisjchdbs\t13965\n不行求你了告诉我吧我亲告诉我\t13966\n激\t13967\n虎扑\t13968\nfuxjh\t13969\n猪姑来\t13970\nSBSDBD\t13971\n饮泣\t13972\n有么\t13973\n沙县\t13974\n果汁\t13975\n有之\t13976\n不是不是\t13977\n使用\t13978\n作广\t13979\n濟\t13980\n米可\t13981\n濙\t13982\n挤兑\t13983\n杏子花\t13984\n北京办公室\t13985\npf卜s\t13986\n做物\t13987\nuuloli图v12关\t13988\n蔡了么\t13989\n牙疼\t13990\n男子汉\t13991\n菲勒\t13992\n黄爸爸\t13993\n不认你\t13994\n案例\t13995\n武夫当国\t13996\n何浩杰\t13997\n绝味儿鸭王\t13998\n二球\t13999\n龟头\t14000\n三八零\t14001\n犹豫不决\t14002\n泰，泰\t14003\n出生地址\t14004\n不符\t14005\n6029\t14006\n424559494452566644n559897755264556775723432224241217565778\t14007\n花社\t14008\nnico\t14009\n你好啊帅\t14010\n当好\t14011\n不笨\t14012\n就寝\t14013\n湖北省石首市团山寺镇北河口声\t14014\n10070\t14015\n呀呀呀呀小游\t14016\n6.8亿\t14017\njeid\t14018\njeie\t14019\n可爱好想\t14020\njeig\t14021\n祝语\t14022\n看重\t14023\nhttpusanwennetsubject283644\t14024\n杨明志\t14025\n清永浩文\t14026\n懒得理\t14027\n幺零八二\t14028\n连江县\t14029\n吗处\t14030\n雁山\t14031\n魂斗罗\t14032\n你的朋\t14033\n那须加什\t14034\n固本\t14035\n噜噜噜\t14036\n无聊你的\t14037\n吴晨龙\t14038\n精分\t14039\n祝词\t14040\n墨染\t14041\n属相\t14042\n繁华\t14043\n高位数\t14044\n小修\t14045\nHHD\t14046\n阿奎\t14047\n李雅灵\t14048\n王安石\t14049\n张铁架\t14050\n哼点\t14051\n剑走偏锋\t14052\n草莓\t14053\n锅煤\t14054\n讨厌讨厌讨厌讨厌讨厌\t14055\n冒号\t14056\nvvvvvvvvv\t14057\n教育事业\t14058\n候景璐\t14059\n文字游戏\t14060\nggdhygy\t14061\n簽唱\t14062\n王姐\t14063\n会电\t14064\n问世间情为何物直叫人生谁与共\t1
4065\n美喜欢版\t14066\n张铁林\t14067\n喝牛奶\t14068\n嘿吖\t14069\n来一再来\t14070\n呼风唤雨\t14071\n草莽\t14072\nidfitourter\t14073\n猎美\t14074\n66个\t14075\n你骗了我我不在相信你了别给我\t14076\n7点59分\t14077\n钟克佩\t14078\n森川葵\t14079\n春夜喜雨\t14080\n氯化亚铁\t14081\n行当\t14082\n吴珊燕\t14083\n哪两个\t14084\n无限团\t14085\ndande\t14086\n协调\t14087\n烤鸡翅\t14088\n诉你我知道\t14089\n爱疯壳\t14090\n紧急盒\t14091\n另一個\t14092\n祝二老\t14093\n洋鬼子\t14094\n口齿\t14095\n845757567567575675375\t14096\n，，，，，\t14097\n嗷狸狸\t14098\n蠊髋\t14099\n爹电\t14100\nShweheee\t14101\nnba2012\t14102\n小方\t14103\n天鹅湖\t14104\nmawwj\t14105\n鼓山\t14106\n爱不释手\t14107\n319分\t14108\n联合利华\t14109\n是是非\t14110\n八百万\t14111\nab64034f6e6a2f75a8c379310a551d19jpg\t14112\n金珠\t14113\n就村\t14114\n大琳子\t14115\n呜呜呜我不猜你了我不开你了呜呜呜\t14116\n度秘你的生日子啊多少年\t14117\n八百个\t14118\n赵佳雨\t14119\n富国天合\t14120\n下午一点半\t14121\nftyr\t14122\n乖真少\t14123\n连子毅\t14124\n奈奈生\t14125\n千八百万倍\t14126\nhkz\t14127\n充分鼓\t14128\nhky\t14129\n第一仙多\t14130\n柜号\t14131\n试验品\t14132\n线通\t14133\n装傻充愣\t14134\n柜台\t14135\nhkk\t14136\n公母\t14137\n去不没\t14138\nhkb\t14139\nhkc\t14140\nhkf\t14141\nhkg\t14142\nhkd\t14143\n最贪吃\t14144\n建德\t14145\n新街口\t14146\n不值一驳\t14147\n成人高考\t14148\n小假\t14149\n萝卜吧萝卜嘿呦嘿呦拔萝卜\t14150\n度秘你好色\t14151\n史总\t14152\n半日\t14153\nJwK\t14154\n葱郁\t14155\n武汉博纳\t14156\n张巨巨\t14157\n身奥\t14158\n开餐\t14159\n福田\t14160\n小邋\t14161\n鬼米\t14162\n男友\t14163\n好干古古怪怪伙計客戶糾結8家家\t14164\n抄悲惨不好过\t14165\n孺子可\t14166\n清明君哥\t14167\n老干部\t14168\ngztmm8J\t14169\n海皮\t14170\n美吃\t14171\n沿海地区\t14172\n力臻\t14173\n当阵\t14174\n西瓜\t14175\n凤凰卫视频道\t14176\n恶唱\t14177\n物流\t14178\n府城\t14179\n雅美\t14180\n风俗习惯\t14181\n活動\t14182\n大家一起来\t14183\n纷争\t14184\n你以为你没有\t14185\n山楂果茶\t14186\n柳灿灿\t14187\n雷文霞\t14188\n二分之1\t14189\n塔区\t14190\n于山定光寺\t14191\ngoaway\t14192\nparrots\t14193\n且且\t14194\n爱爱爱爱不爱\t14195\n亲亲嘴\t14196\n枣庄矿业\t14197\n了我爱对了我爱最爱我\t14198\n华府\t14199\n锅底儿\t14200\n同命\t14201\n2日后\t14202\n大革命\t14203\n张欣怡\t14204\n华庭\t14205\nkyy\t14206\n5月7日21时32分\t14207\n乐清\t14208\n528150165515\t14209\n给络酱\t14210\n汇率\t14211\n我喜欢日\t14212\n一八和\t14213\n帅磺\t14214\n石岛\t14215\n哇巴拉吉\t14216\n慈悲为怀\t14217\n石岐\t142
18\nqq铁塔\t14219\nwecanspeakenglish\t14220\nGGGG\t14221\n翱翔凌空\t14222\nZhshsjdj\t14223\n范爱农\t14224\n天裕小区\t14225\n同一性\t14226\n小玩意\t14227\nkyr\t14228\n51bf\t14229\n事倍功\t14230\n说明\t14231\n林达\t14232\n2345\t14233\n2346\t14234\n13400083551\t14235\n九六二九\t14236\ngfffffffffffffffffffffffff\t14237\n250250\t14238\n塞巴蒂安\t14239\n石岩\t14240\n在这里开始\t14241\nIgsugsusgdusg\t14242\n强强大\t14243\n三道四\t14244\n4363614\t14245\n太太太太太\t14246\n和不免\t14247\n耽美文东方彧卿与白子花\t14248\n公话\t14249\n条状\t14250\n常备\t14251\n基我也不知道\t14252\n呢读书\t14253\n我靠我靠我靠靠靠\t14254\n幸福人生\t14255\n如影随形\t14256\n傣族\t14257\n娃娃风\t14258\n公证\t14259\n必唱\t14260\n劲儿连根村\t14261\n赌城\t14262\n24欧元\t14263\nyy283\t14264\n漠河南\t14265\n三鑫\t14266\n八级\t14267\nbhffggghgffjhgh\t14268\n5秒\t14269\n啊雅姐\t14270\n张光头\t14271\n好处\t14272\n魅族MX4\t14273\n好夏\t14274\n煞\t14275\n好复\t14276\n几万个\t14277\n1975003763\t14278\n郦道元\t14279\n罗祥\t14280\n空心\t14281\nk545775742575554274254575575745876lightsinglobalSo555\t14282\nWhichi\t14283\n女孩子\t14284\n好夜\t14285\n儒仙\t14286\n好多\t14287\n第14日\t14288\n苏月\t14289\n好大\t14290\n豌豆豌豆\t14291\n俩一会儿\t14292\n離家出走\t14293\n好天\t14294\n给我说下你的秘密呗\t14295\n鲜芋仙\t14296\n米青子\t14297\n嗔怒\t14298\n无位\t14299\n期末了\t14300\nooppo\t14301\n97.1%\t14302\n卤蛋\t14303\n凤凰两座机场\t14304\n容妃晗\t14305\n局儿\t14306\n10秒钟\t14307\n杨嘟嘟\t14308\n哭哭哭哭日聊聊\t14309\n金贵\t14310\n羽绒服\t14311\n粘住\t14312\niiiiiiiiiiiiiuiuiiiiii\t14313\n浪潮软件\t14314\n俺当园\t14315\n一至两个月\t14316\n藤井彻\t14317\n1.6万\t14318\n牛逼大\t14319\n哩啦\t14320\n不要不要\t14321\nktmjptmj\t14322\n落子\t14323\n益盟\t14324\n臭不真脸\t14325\n樊梦蓉\t14326\nKD爱\t14327\n5.21\t14328\n有备而上\t14329\n小破\t14330\ndpidfkdjsuj\t14331\n场下\t14332\n场上\t14333\n巫毒\t14334\n第20集\t14335\n一三四\t14336\nxhzvwa\t14337\n武延红\t14338\nBubble\t14339\n搬家呗公务秘誓颜\t14340\n金太阳\t14341\n靠不累\t14342\nkfck\t14343\n停开\t14344\n行架\t14345\n背后成功词超级杯屡\t14346\n8800\t14347\n楼脖\t14348\n奶嘴\t14349\n97649\t14350\n三十米\t14351\n朱子豪\t14352\n22419969\t14353\n江苏大学\t14354\n硬盘\t14355\nWOW\t14356\n受伤者\t14357\n5数\t14358\njfjxbxb\t14359\n袁占亭\t14360\n同床异梦\t14361\n真的了不得\t14362\n发表\t14363\n麻垌\t14364\n880g\t14365\n百分之一
百六\t14366\n赵光辉\t14367\n酷炫\t14368\n500万元\t14369\n萌萌大萌萌\t14370\nhhon\t14371\n13476656391\t14372\n酷点\t14373\n发行\t14374\n余一诺\t14375\n密度度\t14376\n火锅店\t14377\n见超\t14378\n啃老族\t14379\n34693\t14380\n别管\t14381\n川港市\t14382\n林依晨\t14383\n保险杠\t14384\n欺凌\t14385\n22年\t14386\n武都\t14387\n肥皂\t14388\n胡文揍\t14389\n叶振邦\t14390\n张相我\t14391\n车篓\t14392\n那么远\t14393\n堡证\t14394\nhttpehiphotosbaiducomxiaodupicitemb21bb051f81986189216bbf14ded2e738bd4e676jpg\t14395\n不要太关你的下雨么了帮我借一件嘛求求你了逗逼\t14396\n陈会\t14397\n明天早上六点\t14398\n爱的力量\t14399\n天太冷\t14400\n1500\t14401\n拆东墙补西墙\t14402\n1506\t14403\n1504\t14404\n朱欣雨\t14405\n三湘都市报\t14406\n一次份\t14407\n爱爱漫画\t14408\n相互作用\t14409\n311\t14410\n心理\t14411\n77525\t14412\n花努\t14413\nlllllll\t14414\n矣跤场\t14415\n十八层\t14416\n丹化\t14417\nigdhdj\t14418\n一天休一天\t14419\n莫昂王意特营行\t14420\n十八届\t14421\n雷公蛙\t14422\n松鼠大战僵尸\t14423\n干电池\t14424\n朝待\t14425\n祖屋\t14426\n石唐瑞\t14427\n抑投机\t14428\n越野路\t14429\n问答者\t14430\n组照\t14431\n1.94元\t14432\n印刷术造纸术\t14433\nLively\t14434\n唱化作\t14435\n列子\t14436\n起身\t14437\nbqiqg\t14438\n度秘你好漂亮啊思密达\t14439\n雕梁画栋\t14440\n伐伐\t14441\n不当\t14442\n不归\t14443\n千百万\t14444\n张强哥\t14445\n猪西\t14446\nnnnnnnnnhcccvaxbczdd爹nnvvdbcgfh\t14447\n中帮高帮\t14448\n哪有心\t14449\n多大小\t14450\n差不吵\t14451\n上梁不正下梁歪\t14452\n挺冲棒\t14453\ngdjgh\t14454\ngffffff\t14455\n1346动劳烦五十\t14456\n卡瓦查\t14457\n吉士\t14458\n素女\t14459\n残友\t14460\n南开大学\t14461\n罗丽涛\t14462\n王俊开\t14463\n逆光\t14464\nwhe\t14465\nwhf\t14466\n战友\t14467\n乔娜与塞巴斯蒂安之家\t14468\nwha\t14469\nwhb\t14470\n商业街\t14471\n新农村\t14472\nwhn\t14473\nwho\t14474\nwhi\t14475\nwhj\t14476\n585461124\t14477\nwhw\t14478\nwhp\t14479\n朱家桥\t14480\nwhr\t14481\n王慧空\t14482\n牙套\t14483\n刘春丽\t14484\n饱览\t14485\nwhy\t14486\nwhz\t14487\n春秋战国\t14488\n作品们\t14489\nhj\t14490\n超超老总代表青啤\t14491\n日月轩\t14492\n审视\t14493\n佛歌\t14494\n卜LL乙\t14495\n揉眼睛\t14496\n红祙孑\t14497\nhn\t14498\nfydgff\t14499\n┙n\t14500\n花朵豆真玫瑰花\t14501\n蹦极\t14502\n呦西\t14503\n墨粉\t14504\nTFbors\t14505\n文龙\t14506\n55059582818802846288555581548448\t14507\n唐伯\t14508\n七辑\t14509\n陶总监\t14510\n别桑心\t14511\n过会儿聊\t14512\n博文精句\t14513\n接
地\t14514\n1924939480\t14515\n微游\t14516\n正方形\t14517\n渝港\t14518\n96枚\t14519\n今早八\t14520\nhe\t14521\n球子\t14522\n冒险级\t14523\n觌女\t14524\n5556944945544\t14525\nxxnck\t14526\n曺圭\t14527\n很反感\t14528\n三十吧三年\t14529\n艳霞\t14530\n买断\t14531\n三分之2\t14532\n一月二十七号\t14533\n植树\t14534\n大阳\t14535\n女皇末炎\t14536\n嘞喵喵喵喵喵\t14537\n买方\t14538\nCCTIME\t14539\nllgbj1a\t14540\nfyugdryjbfd\t14541\n薄一辉\t14542\n意指向\t14543\n經餓\t14544\n保佑我中次奖\t14545\n懒鬼\t14546\n怕真的不想\t14547\n硷斤\t14548\n小诗钥\t14549\n学员们\t14550\n尕狗\t14551\n游览\t14552\n与非\t14553\nb卷\t14554\n植根\t14555\nvi顾\t14556\n热乎乎\t14557\nefjhi家校\t14558\n汤凡\t14559\n123597909\t14560\nOK，Isee\t14561\n溜溜梅\t14562\n韩国人\t14563\n吉普\t14564\n虚嘞\t14565\n七千米\t14566\n腌肉\t14567\n武扬威\t14568\n生物界\t14569\ndjadj\t14570\n太不懂事儿\t14571\n飞天小女警\t14572\n拜托\t14573\n乐坊\t14574\n马戈壁\t14575\n邮站\t14576\n东西半岛\t14577\n粉儿\t14578\n对啊\t14579\n乐坛\t14580\n凳子\t14581\n呃开关机\t14582\n用愁\t14583\n丹青年\t14584\n昰U\t14585\n半多咪\t14586\n二百多一斤\t14587\n公中\t14588\nhhhhjjjjjgggghhhhhjhjhhhjjjjhjjjjjjjjhhhhhh\t14589\n假药\t14590\n28秒\t14591\n4月21日\t14592\n张千千\t14593\n海棠\t14594\n公举\t14595\n我的祖国\t14596\n我来了请闭眼\t14597\n公主\t14598\n我喜欢我的话\t14599\nghkqus\t14600\n三十点点\t14601\n独立性\t14602\n正式版\t14603\n公与\t14604\n空气净化器\t14605\n杨恐龙\t14606\n手面\t14607\n吻别\t14608\n小李战\t14609\n众筹\t14610\n想要\t14611\n有利条件\t14612\n您们\t14613\n本话\t14614\n狭小\t14615\n一九万\t14616\n秒拒\t14617\n猥瑣\t14618\n憋尿\t14619\n别挑开\t14620\n你好野\t14621\n阿霞\t14622\n58855566426232362\t14623\n华苑\t14624\n壮况\t14625\n血脂\t14626\n防守\t14627\n唇棒\t14628\n劈雷\t14629\n真丑真丑真丑真丑真丑真丑\t14630\n海我是你的爱妃\t14631\n47559620\t14632\n很苦\t14633\n500余亩\t14634\n阿露\t14635\n不能忘却\t14636\n省委书记\t14637\n经济开发区\t14638\n度秘谢谢你\t14639\n一九个\t14640\nIE6\t14641\n寒唱\t14642\n亨亨\t14643\n222555252\t14644\n猪粪\t14645\n伦敦桥\t14646\nsklj\t14647\n肖亚冰\t14648\n吉人\t14649\n挥霍\t14650\n落榜\t14651\n下午4点到10点\t14652\nqrlij\t14653\n大木老\t14654\nShuffle\t14655\n虚空\t14656\n4公斤\t14657\n66166点\t14658\n笨笨笨笨\t14659\n飞着\t14660\n长安c70\t14661\n新一天\t14662\nTrya120square\t14663\n宁波市民族宗教局\t14664\n你的吧八十\t14665\n5月5号\t14666\nIEi\t14667\n时事\t1466
8\n不消\t14669\n冶炼\t14670\n宁se\t14671\n刘嘉睿\t14672\n张发伟\t14673\n员工箱\t14674\n尼日利亚\t14675\n奥黑尔\t14676\n调整裤长的方便按扣\t14677\n一个几千\t14678\n人工智力\t14679\n放饭\t14680\n赞谢谢\t14681\n子元\t14682\newiidthvhkhh\t14683\n四十十四\t14684\n白日依山尽\t14685\n我和你的对话\t14686\n佩欣\t14687\n李斯坦\t14688\n特别的爱给特别的我\t14689\n王顺顺\t14690\n一千里\t14691\n程老汉\t14692\n所述\t14693\n哈三哒哒哒\t14694\n我太恨你了我更不爱你你是\t14695\n多米嗯嗯嗯嗯那我死以后你你看不开心\t14696\n恶心我行不咯\t14697\n290万\t14698\n舔菊男\t14699\nVDNJXDBDHKDBDJXJ548678jvtJHFIRHDGFG\t14700\n鸭脖子\t14701\n吴紫薇\t14702\n桂轩\t14703\n小狮子们\t14704\nshdhckf\t14705\n来回我家\t14706\n栾某\t14707\n9点20分\t14708\n八寸\t14709\nwarming\t14710\n雷树伟\t14711\n美国队\t14712\n精灵块\t14713\n啦啦啦啦啦啦啦日子\t14714\n学委\t14715\n学姐\t14716\n贵人妖\t14717\n好的我等你给我\t14718\n郭贺兵\t14719\n程文欣\t14720\n18518185120\t14721\n专注\t14722\n关头\t14723\nf6u\t14724\n浅蓝色\t14725\n懷孕\t14726\n六月底\t14727\n低级趣味\t14728\nhttppinyincn13SCUV6T9hA\t14729\nxiaodupicitemae51f3\t14730\n今日上午十点\t14731\n叫写给\t14732\n好啦好啦我去睡觉了晚安好梦\t14733\n层出不穷\t14734\nblog4\t14735\n博古\t14736\n颠翻\t14737\njffo\t14738\nmmijj\t14739\n母亲的爱\t14740\n博取\t14741\njffk\t14742\njffh\t14743\n音乐台\t14744\n心靈\t14745\ngutr\t14746\nyovjgghh\t14747\n745亿\t14748\n红星\t14749\nInFrance\t14750\n天津工业大学\t14751\njffx\t14752\njffy\t14753\n开心宝贝五之开心星星星星球\t14754\n博友\t14755\n马莉\t14756\n马莎\t14757\ngutg\t14758\n霞儿\t14759\n一年又一年\t14760\n举例\t14761\n泌阳\t14762\n一起飞\t14763\n数十百\t14764\n0点五二12\t14765\n上王\t14766\n发励\t14767\n第四册\t14768\n发动\t14769\n聊着欢\t14770\n内府\t14771\n浙江卫视跨年音乐会\t14772\n溜嘴\t14773\n度密度密度密度秘我\t14774\nghcjxudyrhs\t14775\n发功\t14776\n没话说\t14777\n金瓶双艳\t14778\n覃天满\t14779\n虎兄\t14780\n攒钱\t14781\n你的爱好\t14782\n72本\t14783\n射女\t14784\n扑倒\t14785\n虎兔\t14786\n19899\t14787\n双锁\t14788\n基努里维斯\t14789\n别发\t14790\n武妹妹\t14791\n后因\t14792\n戴梦婷\t14793\n鲜儿\t14794\n19点30\t14795\n胖不胖\t14796\n小度秘发货\t14797\n良知\t14798\n再说的话\t14799\n掩瑜\t14800\n拜拜拜拜拜拜拜拜拜拜\t14801\n爱在哪里没办法\t14802\n比不着\t14803\ndggs\t14804\n死党们\t14805\n马场\t14806\n糠醛\t14807\n哈欠\t14808\ntjlfjidvj\t14809\nuhugjgjghghghghvhghfhfhfhghfhhhhhhhhhhhhhhhhhghg\t14810\n别口\t14811\nlu76568988\t14812\ni
okhvdft\t14813\n一下一百\t14814\n寡妇了我不爱你不爱你\t14815\n216万\t14816\n粑粑粑粑粑粑粑粑粑\t14817\n说你好萌\t14818\n拒絕1\t14819\n行政性\t14820\nmmbvvds\t14821\n知县\t14822\n通知单\t14823\n链家\t14824\n我的最爱最初\t14825\n许艳月\t14826\n混合物\t14827\n抱一下亲\t14828\n鸭舌子\t14829\n花生酥\t14830\ntiqi\t14831\n不节后\t14832\n神童班\t14833\n说实话实\t14834\n水经\t14835\nfafteralie\t14836\n舞文弄墨\t14837\n神罗辑\t14838\n雪融\t14839\n羊羊羊洋洋\t14840\n10012116161\t14841\n啦啦啦你发音\t14842\n车尾\t14843\n艳色\t14844\n男穿男和女穿女\t14845\n黄仁敏\t14846\n扔昂\t14847\n熙个\t14848\n6寸\t14849\n奶奶王\t14850\n鐘\t14851\n种子经营许可证\t14852\n首都航空\t14853\n熙业\t14854\n热水瓶\t14855\n食甲\t14856\n训斥\t14857\n做事片\t14858\n我讨厌讨厌你我恨你我讨厌你\t14859\n不公道\t14860\n无毒不食子在\t14861\n叮叮叮叮叮叮叮叮当当当当当当个般配当当当当当\t14862\n魏晓东\t14863\n鐵\t14864\n食用\t14865\n六小时以后\t14866\n好波\t14867\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t14868\n补血颗粒\t14869\nZD股\t14870\n别了了\t14871\n嗯喜\t14872\n七零二六\t14873\n随地\t14874\nhhhujeudhhdgdehyeyeh6yyegegeggeggg\t14875\n婚姻\t14876\n失业\t14877\n甘肃白银市市委\t14878\n12款\t14879\n圆周\t14880\nhi行\t14881\n首班车\t14882\n回避\t14883\n失且\t14884\n深圳市绿色基金会\t14885\n落井\t14886\n一钱\t14887\n把关机\t14888\n尹千惠\t14889\n牙份儿\t14890\nkfyei\t14891\n失主\t14892\n7k7k样\t14893\n一大堆\t14894\n旺旺旺旺\t14895\nleifeng\t14896\n2011几年\t14897\n影响力\t14898\n舞美\t14899\n田慧超\t14900\n基在\t14901\n2点\t14902\n眯咪大战\t14903\n叶小倩\t14904\n基地\t14905\n叶烨蒙\t14906\nhttpfhiphotosbaiducomxiaodupicitemd1160924ab18972b99f8b9bee1\t14907\n郭露洋\t14908\n镀锌\t14909\n亲女生\t14910\n基场\t14911\n小伙们\t14912\n饿死\t14913\n189653976542896848点\t14914\n白子画我以神的名义诅咒你今生今世永生永世不老不死\t14915\n讲讲\t14916\n舞群\t14917\n成翔\t14918\n来么么大一个\t14919\n谢谢你啦我的小秘\t14920\n严肃单\t14921\n1298\t14922\n去年7月30日\t14923\n夜溪\t14924\n1290\t14925\n度假村\t14926\n博内拉、耶佩斯\t14927\n青年文摘\t14928\n腌鸡胗\t14929\n踢翻\t14930\n厂种\t14931\n冻雨\t14932\n冻零\t14933\n鸟巢度假村\t14934\n未必\t14935\n图片师\t14936\n合同期\t14937\n三天四天\t14938\n八十多岁\t14939\n耗资\t14940\n冰舞\t14941\n娃火娃\t14942\n情水\t14943\n75856758252556317\t14944\n多咪多咪唔\t14945\neme400115em\t14946\n祭日\t14947\n嗨皮妞已尔\t14948\n李改春\t14949\n一片雪\t14950\n1亿岁\t14951\n电锯惊魂吓人\t14952\n堆砌\t14953\nxdd34433re3465rfg43
edggfcccf\t14954\n欺辱\t14955\n妈妈呀\t14956\n滨河新村\t14957\n弟弟弟弟弟弟弟弟弟弟弟弟弟弟弟弟\t14958\n就叫你走\t14959\n金融时报\t14960\n会浪\t14961\n卧轨\t14962\n隋炀帝\t14963\n边度\t14964\n二点517点5\t14965\n眼角\t14966\n嗯撸嗯\t14967\n爱你爱你思密达\t14968\n覆盖面\t14969\notre\t14970\n罚站\t14971\nbeautifullady\t14972\n短波\t14973\n霍尔木兹海峡内湾\t14974\n建议书\t14975\n眼见\t14976\n妇神探\t14977\n喜果\t14978\n出生\t14979\n辽宁省检察院\t14980\n东靖\t14981\nX-47A\t14982\n第3次\t14983\nafjl\t14984\nXPC\t14985\n一夜不归宿\t14986\nkaikaisuogu\t14987\n灾祸\t14988\n煮鸡蛋\t14989\n五十兆\t14990\n通知道\t14991\n加减法\t14992\n五十元\t14993\nXPS\t14994\n断了吧\t14995\n噶咯爱你是我不是你是我是你是我\t14996\n千克\t14997\n五十克\t14998\n经廊\t14999\n六和侣\t15000\n前7天\t15001\n汉美\t15002\n曹一帅\t15003\n放长\t15004\n鸳鸯浴\t15005\n青山\t15006\n出险\t15007\niseeyou\t15008\n东面\t15009\n张真人\t15010\nstometoutou\t15011\n千六\t15012\n逡巡\t15013\n千八\t15014\nvhknh\t15015\n五十八\t15016\n孤将\t15017\njjfd\t15018\n第一百\t15019\n拼图\t15020\n黄渡面\t15021\n李晓婷\t15022\n站姿\t15023\n14.51\t15024\n秘硬\t15025\n缓行\t15026\n烈焰侠\t15027\n说了嘛\t15028\n降雪\t15029\n官方新闻网\t15030\n邹心茹\t15031\nB2C\t15032\n奇点\t15033\n不务正业\t15034\njdhhshhsjw\t15035\n资本主义\t15036\n塞星\t15037\nstotouto\t15038\n山珍海味\t15039\n咯元\t15040\n索彦萍\t15041\n59504895\t15042\n网上心\t15043\n9527\t15044\n六二\t15045\n1162424405\t15046\n软床\t15047\n呃呃度秘呃呃呃度秘\t15048\n六五\t15049\n家具体\t15050\n亿百家两千\t15051\n4样\t15052\n密晓\t15053\n为所欲为\t15054\n性素\t15055\n27com\t15056\nvigv\t15057\n贾一颗\t15058\nvigs\t15059\n碳酸镐\t15060\n麻大\t15061\n前方\t15062\n三色\t15063\nvigg\t15064\n六亲\t15065\n开发女孩\t15066\n42秒\t15067\necbsdjrspuwnhr\t15068\n六人\t15069\nNOP\t15070\nvigk\t15071\n六亿\t15072\n爱懂\t15073\n该处\t15074\n鲑鱼\t15075\n武汉市中院\t15076\n豆瓣小说初九\t15077\n123456798910\t15078\n听见了你\t15079\n唐雅妮\t15080\n20百千克\t15081\n呆呆熊\t15082\n天动地\t15083\n田新华\t15084\n该天\t15085\n宝皮\t15086\n过门\t15087\n聋哑人\t15088\n刘翠菊\t15089\n添添\t15090\n马卡隆\t15091\n见啦讨厌\t15092\nlllai\t15093\n国贸中心\t15094\n通长\t15095\n保义词\t15096\n找了呀呀呀呀呀呀呀\t15097\n4144774\t15098\n莫诚意\t15099\n咯菌腈\t15100\nhuru\t15101\n斗吐\t15102\n孜孜不倦\t15103\nIT运维\t15104\n老虎拉车\t15105\n对呀不笑\t15106\n糖分\t15107\n失格\t15108\nzavec\t151
09\n糊状\t15110\n对呀你在我\t15111\n洪雨\t15112\n为人父母者\t15113\n五十六公斤\t15114\n招贤纳士\t15115\nB2b\t15116\nLung\t15117\n熊宝\t15118\n绿油油\t15119\n金钱状\t15120\n扶持\t15121\n童纳川\t15122\n白岩松\t15123\n成仇\t15124\n转给\t15125\n小黄文\t15126\n告费\t15127\n12973879\t15128\n诉你米米\t15129\n黄乃\t15130\n绝不原谅\t15131\nbecauseyes\t15132\nkessab\t15133\n出席\t15134\n于小川\t15135\n棒场\t15136\n蛋炒飯剛果河\t15137\n小朝廷\t15138\n重庆NOVO店\t15139\n大班烬\t15140\n呀不知道\t15141\n麦仁师\t15142\n胜负彩\t15143\n黄书\t15144\n口条\t15145\n这记\t15146\n大雪球\t15147\n420平方米\t15148\nhotiokokokokok\t15149\n你有你有你有你有你\t15150\n#寂寞哥#\t15151\n成广事才口人\t15152\n可女\t15153\n得不晓得\t15154\n天气状况\t15155\n茂茂\t15156\n可好\t15157\n雕像\t15158\n张扬琴\t15159\n阳光之都\t15160\n不干算\t15161\n可奥\t15162\n水梅\t15163\n好儿子\t15164\nfucfuxf\t15165\n黄二狗\t15166\n来兮\t15167\n恩那你嗯嗯那你\t15168\n886086468648763\t15169\n大火锅\t15170\n攻就\t15171\n厚根\t15172\n酒菜\t15173\n这一首歌\t15174\n来元\t15175\n拳不离\t15176\n非农\t15177\n常河\t15178\n佛曲\t15179\n切克闹康忙北鼻\t15180\nkiyrdbn\t15181\n合声\t15182\n一机\t15183\n一朴\t15184\n一朵\t15185\nJim\t15186\ngeve\t15187\n劳动强度\t15188\n一本\t15189\n杨那\t15190\n一术\t15191\n配置\t15192\n潮人会所\t15193\n537393\t15194\n悬索桥\t15195\n一望\t15196\n几岁岁\t15197\n一期\t15198\n爱点\t15199\n学学学\t15200\n玉董佳心一一公主一姐姐云富察琪一一格格一妹妹雪西林翔\t15201\n晚起\t15202\n官权\t15203\n一月\t15204\n23点\t15205\n13544221445\t15206\n一服\t15207\n银河\t15208\n异物感\t15209\n小班制\t15210\n膈应\t15211\n私生子\t15212\n官材\t15213\n姓屄\t15214\n预估\t15215\ncghchvghhhcgv\t15216\n深圳街\t15217\n十二点晚上\t15218\n顾子卿\t15219\n遑论\t15220\n35442588558550858741745539966\t15221\n辟邪\t15222\n是是了是了怕是\t15223\n捍卫\t15224\n沙洋中学\t15225\n勺子\t15226\n6点03分\t15227\n宋熙晨\t15228\n癶灸不夭\t15229\n疯狂猜明星\t15230\n对还是错\t15231\n咔咔咔咔咔\t15232\n脑筋急转弯你猜\t15233\ngfghf\t15234\n当家师\t15235\n日界线\t15236\nスプラウト\t15237\n乐意\t15238\n鲍鱼\t15239\n不住\t15240\n已经的爱\t15241\n不位\t15242\n广东话\t15243\n王阳明\t15244\n那美\t15245\n很好人\t15246\n多大了\t15247\nrehfjut\t15248\n脏水\t15249\n送风\t15250\nfearisa\t15251\n这部片\t15252\n3w552儿点\t15253\n六十级\t15254\n极本\t15255\n陈浩民\t15256\n超哥\t15257\n冻蒜\t15258\n而已8好vgurdd\t15259\n555555\t15260\n50张\t15261\n丘多远\t15262\n555553\t1526
3\n乱改\t15264\n梁闻言\t15265\n耿步策\t15266\n比比皆是俊\t15267\n遍体鳞伤\t15268\n不佳\t15269\n都江堰胥\t15270\n包揽\t15271\n不合时宜\t15272\n鄂州西\t15273\n有道\t15274\n五秘\t15275\n八七六五\t15276\n2011年5月30日\t15277\n王梦颜\t15278\n五秒\t15279\n五科\t15280\n五种\t15281\n笨成\t15282\n国财\t15283\n背井\t15284\n13893407\t15285\n180例\t15286\nGray\t15287\n人道牛刀\t15288\n康忙131321\t15289\n希腊评级\t15290\nkitty大号冰粒杯\t15291\n红小豆馅\t15292\n俩半晌\t15293\n豆腐犯\t15294\nuiuu\t15295\nmmhddz\t15296\n半机器人\t15297\n室长\t15298\n掰直\t15299\nlIN\t15300\nYFGHFC\t15301\n真的好好笑谢谢你\t15302\n不错丽\t15303\n狗qwq\t15304\n七系\t15305\n正月十六\t15306\n免得\t15307\n靠不到\t15308\n坚硬\t15309\nCcscecsvdvdy\t15310\n金安区毛坦厂中学\t15311\n为什么不听讲\t15312\n全远\t15313\nhuaweimate8\t15314\n这个星期\t15315\n小当家\t15316\n七糟\t15317\n贾智涵\t15318\n44元\t15319\n原帖\t15320\n小心水头\t15321\nG7Pus\t15322\n不要不要不要不要不要不要不\t15323\n吞没\t15324\n劳尔\t15325\nlowof\t15326\n话音道\t15327\n灾民\t15328\n雪山\t15329\n下面上\t15330\n揍人\t15331\n上海东方传媒集团有限公司\t15332\n元凶\t15333\n朱青霞\t15334\nArthes\t15335\n焦耳\t15336\n我走了没是你你这个呆\t15337\n8厘米\t15338\n2011年1月16日\t15339\n女贞子\t15340\n用来\t15341\n3q度\t15342\n70元\t15343\n欧豪\t15344\n雄州街道\t15345\nfhjkcff\t15346\n蛮来\t15347\n陈嘉桦\t15348\n锐澳\t15349\n立法\t15350\n傍晚五点\t15351\n屁屁\t15352\n句额\t15353\n美美新广\t15354\n从善如流\t15355\n胡记住\t15356\n最坏\t15357\n重一点\t15358\n陆川县\t15359\n首月\t15360\n你爱我你爱我你爱我有顾客牛\t15361\n话长\t15362\n对门\t15363\n熟似\t15364\n水蜗牛\t15365\n张韵欣\t15366\n21秒\t15367\n芦管\t15368\n拉近\t15369\n轨道\t15370\n粮草\t15371\n国家互联网信息办\t15372\n以讹传讹\t15373\n于艺欧\t15374\n饪藏\t15375\n万无一失\t15376\n黄人员\t15377\n尊享贷\t15378\n太冷血\t15379\n虞倩儿\t15380\n无产阶级\t15381\n杠铃\t15382\n7669\t15383\n助动\t15384\n天生一对\t15385\nMOSHI\t15386\n海贼人\t15387\n九七个\t15388\nFACK\t15389\n厚鸟\t15390\n粘结\t15391\n鼋头渚\t15392\n书里\t15393\n田雨橙\t15394\n压板\t15395\nFACE\t15396\n压条\t15397\n一次4.5\t15398\n鸭毛\t15399\n轻装\t15400\n剩菜\t15401\n桂英\t15402\n在行\t15403\n九九零\t15404\n通红\t15405\n美国北京大使馆\t15406\n草食\t15407\n你是你的你\t15408\n出台\t15409\n28.4公分\t15410\n思维\t15411\n红梅\t15412\n博弈论\t15413\n20120503\t15414\n思绪\t15415\n饥不择食\t15416\n清凉感5\t15417\n胡同天\t15418\n想出\t15419\n出发\t15420\n高大威猛\t154
21\n马贼\t15422\nmenu\t15423\nxndcm\t15424\n笨笨狗\t15425\n素月\t15426\n默迪\t15427\n陶瓷\t15428\n了美\t15429\nmeng\t15430\n悲喜交加\t15431\n明清澈\t15432\n汗爹\t15433\n自由交易\t15434\n合话\t15435\n4.00\t15436\nxkebxiebxekbxisbxididbxie\t15437\njjbbd\t15438\n全地\t15439\n牙白\t15440\n杨洁篪\t15441\n哈撸\t15442\n诸法\t15443\n我知道你说的什么了我妈妈告诉我的你说呢妈妈说吾你在哪我就海神不想上我\t15444\n87度\t15445\n强盗罗技\t15446\n乐感\t15447\n怕不怕\t15448\n膏肓\t15449\n牙癌\t15450\n晚上7:30\t15451\n证监食\t15452\n合说\t15453\n来不做梦\t15454\n陈旧\t15455\n幺零五九\t15456\n特特意\t15457\n囊中羞涩\t15458\n哈撒\t15459\n任春冯\t15460\nlbjs\t15461\n绞刑\t15462\n宋先生\t15463\n驾校度小度\t15464\n切切切克闹\t15465\n消消乐\t15466\niLifjgaggagggga\t15467\n2013年农历二月三十\t15468\nmajorlook\t15469\n九个九个\t15470\n机油尺\t15471\n王新伟\t15472\n边缘\t15473\n李紫娴\t15474\n81亿美元\t15475\n河水\t15476\n斯坦福桥球场\t15477\n朱拖鞋\t15478\n李肇星\t15479\nab4厘米\t15480\nbest\t15481\nGdgx\t15482\n徐敏\t15483\n锋锋\t15484\nbh之家\t15485\n再见妈妈妈妈妈妈妈妈妈妈妈妈\t15486\n苦支\t15487\n一生\t15488\n还是么\t15489\n下体性\t15490\nyouylimit\t15491\n左路东\t15492\nhdhrhfydwwweee\t15493\n优于\t15494\n验孕纸测试\t15495\nprada\t15496\n韩娱华娱\t15497\n三星E7000\t15498\n生日礼物\t15499\n小东小区\t15500\n中有一天\t15501\n四株\t15502\n水装\t15503\n四校\t15504\n周凯迪\t15505\n四格\t15506\n考卷\t15507\n四根\t15508\n四核\t15509\n斗牛士们\t15510\n火火火火\t15511\n四样\t15512\n惊梦\t15513\n李芝堂\t15514\n王志峰\t15515\n四栋\t15516\netetddtrt\t15517\n永柏\t15518\n膝下\t15519\n膝上\t15520\n几天过新n年\t15521\n2011、11、11\t15522\n绑不了\t15523\n看热闹\t15524\n军营\t15525\n张宸栩\t15526\n方老师\t15527\n考博\t15528\n甘蔗汁\t15529\n出去玩儿\t15530\n奥观海\t15531\n切客\t15532\nffis\t15533\nleman\t15534\n13864884855564885\t15535\n电脑费\t15536\n破车\t15537\nlljjjjdjja\t15538\n烧包\t15539\n异位朋\t15540\n促性腺激素释放激素激动剂\t15541\n柳传志\t15542\n趣味性\t15543\n下2016年\t15544\n口味儿\t15545\n肇\t15546\n胜利\t15547\n咕噜骨\t15548\n好呀小猪\t15549\n颜回\t15550\n西苑中学\t15551\n宛平路\t15552\n那个丑\t15553\n大体上\t15554\n阿富汗游击队\t15555\n细之明\t15556\n填荆楚\t15557\n液晶显示器\t15558\n八百多个\t15559\n飞信号\t15560\nlife\t15561\n草民\t15562\n翠香园\t15563\n真的可\t15564\n张田\t15565\n后台\t15566\n二百厘米\t15567\n多美味荔枝\t15568\n张画\t15569\nINS\t15570\n陈红\t15571\n索点\t15572\n一十一十一\t15573\n打通\t1
5574\n后叫\t15575\n怡情\t15576\n周成健\t15577\n青白\t15578\n打造\t15579\n过霸王龙\t15580\n后发\t15581\ntgedrcgybv\t15582\n张生\t15583\nnmq222\t15584\n油罐车\t15585\n天天游戏\t15586\n鑫案\t15587\n鼻字\t15588\n鼻孔\t15589\n北京星巴克咖啡有限公司\t15590\n4643434\t15591\n别闹了秘\t15592\n鼻子\t15593\nwijshsu\t15594\n迪雅科技\t15595\n陈忠宇\t15596\nEXSG\t15597\ng15高速\t15598\na23\t15599\n秘葫芦娃葫芦娃葫芦娃\t15600\n浓雾\t15601\nOn\t15602\n亲爱哒\t15603\n干点\t15604\n高桥南\t15605\n根性\t15606\n不辞而别\t15607\nBooepib\t15608\n肉麻鸡皮疙瘩\t15609\n诱奸\t15610\n颖莉\t15611\n用得心应手\t15612\n刘科迪\t15613\n湄洲岛管委会\t15614\n唐宗宋祖\t15615\n绿丝绦\t15616\n被害人\t15617\nLJ\t15618\n亚丽迪桑\t15619\n花袋子\t15620\n宁尽头\t15621\n八小时\t15622\n天王级\t15623\nOf\t15624\n精湛\t15625\n秘长江\t15626\n太好了再来\t15627\n条子\t15628\n18：00\t15629\n大料\t15630\n天嘉\t15631\n科恩特朗\t15632\nBseat\t15633\ncindy\t15634\n贾斯汀·比伯\t15635\n扇出\t15636\nKillha\t15637\n好呀你给我一下\t15638\nMrzzzzz\t15639\n好笑困\t15640\n干嘛吗战\t15641\nGghhjjhhghhjjjhhhhhhhhhhhhh\t15642\n何事\t15643\n大方\t15644\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t15645\n数余六个\t15646\n韩大师\t15647\n大新\t15648\niphone4s\t15649\n辗转难眠\t15650\n保持缄默\t15651\n悲恸\t15652\n499个\t15653\n4758026931\t15654\n大斧\t15655\nCONS\t15656\n讨人喜欢\t15657\n绝是\t15658\n秀屿村\t15659\n仙剑\t15660\n了不做饭\t15661\n很加里\t15662\n外运\t15663\n团圆饭\t15664\n思利\t15665\n板脚\t15666\n黑风嘞嗯三了四天\t15667\n恩真乖\t15668\n好了你工作吧小度秘\t15669\nOr\t15670\nsjsjaennmE\t15671\n毛如意\t15672\n一三八八万户\t15673\n9814元\t15674\n王小帅\t15675\n15秒\t15676\n夺命你是个坏蛋度秘\t15677\n若离\t15678\n联欢晚会\t15679\n姚亮亮\t15680\n1159\t15681\n小牛服\t15682\n阴蒂安人\t15683\n王土娃\t15684\n娃度\t15685\n别会\t15686\n次日19时57分\t15687\n杨忆雪\t15688\n孙淳湾\t15689\n打开\t15690\n还有点舍不得\t15691\n葵花花姑娘\t15692\n096555248996\t15693\n亲爱的不迷我想我的老婆的你\t15694\n354339215\t15695\n棕色\t15696\n花呗儿\t15697\nqadg\t15698\n有什么用\t15699\n吴四爷\t15700\n夜生活\t15701\n下会儿\t15702\n激情\t15703\n天翼手机网选酷派5860+\t15704\n百事通真牛真牛\t15705\n163296518880\t15706\n布布利\t15707\n5月21日起\t15708\n嘉陵江畔\t15709\n气雨\t15710\n吗群\t15711\n鳙麒麟\t15712\n正餐\t15713\nghhgfy\t15714\n裸陈\t15715\n77qe片\t15716\nc度秘\t15717\npeople\t15718\n无踪\t15719\n等于就是说\t15720\nCK犬\t15721\n叶碧花\t15722\n张夏怡\t15723\n全勤\t15724
\n工商\t15725\nｉｐａｄ\t15726\n黑狼\t15727\n找帅\t15728\n笑不来\t15729\n部长焦\t15730\n悦读\t15731\n东川\t15732\n黑狗\t15733\n刘老师\t15734\n哎呦你告诉我你是男是女\t15735\n全勇\t15736\n张佩君\t15737\n好别说\t15738\n友好注意\t15739\n隋冰冰\t15740\n曾贤敏\t15741\n皮衣\t15742\n万事如意\t15743\n盲子\t15744\n范茹茜\t15745\nOX\t15746\n111568\t15747\n多磨\t15748\n宣战\t15749\nOW\t15750\n110大\t15751\n鲁豫有约\t15752\n23摄氏度\t15753\n陈佩斯\t15754\n甲状腺\t15755\nhttpehiphotosbaiducomxiaodupicitemcefc1e178a82b901eeb3ae47748da9773912efacjpg\t15756\n阿菊\t15757\n最近期末\t15758\n陈86\t15759\n六宫\t15760\n盘腿\t15761\nthetermare\t15762\n杀灭\t15763\n3w点\t15764\n智慧树\t15765\nOQ\t15766\n人类\t15767\n废桓公\t15768\n28283883\t15769\n大屏幕\t15770\n333车队\t15771\n痛破\t15772\n童凯\t15773\n最低消费\t15774\n再演\t15775\nosot\t15776\n火了别说\t15777\nyufyfgh\t15778\n平超\t15779\n变性就喜欢\t15780\n字母歌\t15781\n快要\t15782\n嗯聪明\t15783\n瓶奶\t15784\n23359元\t15785\n55688\t15786\n厌唱\t15787\n蛇牙\t15788\n詹淑琴\t15789\n上车前\t15790\n很搞笑的歌\t15791\n老佛爷\t15792\n迁至\t15793\n求你别爱我\t15794\n门口站\t15795\n此事\t15796\n深树\t15797\n不快说\t15798\n工管\t15799\n诸位\t15800\n还好吧那几句话还好还好还好还好呵呵爸爸\t15801\n斗鱼\t15802\n2180\t15803\nhxxx\t15804\n男子\t15805\n弊端\t15806\nkmbbn\t15807\n熊吖\t15808\n壹6\t15809\n九龙仓集团\t15810\n持卷\t15811\nEX堡\t15812\n一千千吨\t15813\n娇娘\t15814\n放出\t15815\n400多\t15816\n娇娇\t15817\n开锣\t15818\n持卡\t15819\n桂林胜贤\t15820\nv薇欧薇\t15821\n坏女\t15822\n此人\t15823\nbmmlnjd\t15824\n小伙伴儿们\t15825\neggshit\t15826\n个懂\t15827\nhttppinyincne6213\t15828\n豆沙包\t15829\ntwmag\t15830\n8000多亿\t15831\n楚王\t15832\n王塑\t15833\n最低级\t15834\nerrrrr\t15835\n椰子螺\t15836\n其所\t15837\n9764997649\t15838\n范大咪\t15839\nEiix\t15840\n悲愤\t15841\ntointihatway\t15842\n油面\t15843\ndeb48f8c540143075d3d292df5e0fe7f34jpg\t15844\n给力]\t15845\n回甘/\t15846\n8.8\t15847\n求你了我求求你了我想哦我想看看你的真面目求求你了求求你了求求你了我想\t15848\n未有你是谁\t15849\n无聊了我真\t15850\nkkkmnjjk\t15851\n一大堆座\t15852\n雏妓\t15853\nproche\t15854\n载荷\t15855\n援藏\t15856\n每个月\t15857\n动车组\t15858\n奥运场馆\t15859\n李云豪\t15860\n主妇科\t15861\n松下\t15862\n来师范\t15863\n不可得\t15864\n杜娟\t15865\n明目张胆\t15866\n度秘阁\t15867\nkgc\t15868\n1592467347777\t15869\nnxd\t15870\n紫砂\t15871\n呱
刮\t15872\n38套\t15873\n杜威\t15874\n打瓶\t15875\n松脂\t15876\n冯雨\t15877\n李我珥\t15878\n胡翠平\t15879\n唐文豪\t15880\n杨潇潇\t15881\n请留\t15882\nhjgg\t15883\n群员\t15884\n差奇帅\t15885\n胡一辐\t15886\nKTVas\t15887\ngdgfbgenhr\t15888\n晶体\t15889\nhjgj\t15890\n联手\t15891\n派遣\t15892\n二三零一\t15893\n不织布\t15894\n10份\t15895\n989898\t15896\n科技组\t15897\naquakhun\t15898\n头猪朱猪\t15899\n姑姑的秘密\t15900\n五龙\t15901\n开锅\t15902\n萎疯狗\t15903\n辛其璋\t15904\n慢效率低\t15905\n安籍\t15906\n塞塞\t15907\n周一周二\t15908\n二轻\t15909\n不不不你纯洁\t15910\n大高\t15911\n俩首次\t15912\n玩球\t15913\n凹凸曼\t15914\n十十五五\t15915\n二轮\t15916\n泡沫期\t15917\n亲个亲个\t15918\n甘遂\t15919\n其二\t15920\n一起死人个\t15921\n85年7月19\t15922\n34级\t15923\n石付陈\t15924\n日光\t15925\n万达广场\t15926\n原都\t15927\n其些\t15928\n跑钱\t15929\n马小宇\t15930\n大同区\t15931\n谢谢理\t15932\n真不我很快\t15933\n无边无际\t15934\n谢谢你你就是我我就是你我就是\t15935\n卡拉OK厅\t15936\n六离毒术\t15937\nddrfgghyeyh\t15938\n桂西大道\t15939\n不周三\t15940\n多值\t15941\n宏基本\t15942\n哼别\t15943\nFailed\t15944\n哼利\t15945\n日元\t15946\n液牙刷\t15947\n女娃娃\t15948\n弹夺\t15949\n弹夹\t15950\n罗阳缎\t15951\n赛车总动员2\t15952\nhimaly\t15953\n俾佢\t15954\n恩恩我\t15955\ncurrent\t15956\n避暑\t15957\n神枪芯\t15958\n85845553896\t15959\n嗯调片\t15960\n无价宝\t15961\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t15962\n形成\t15963\n高利贷\t15964\n想要你了你\t15965\n恩摔\t15966\n5年后\t15967\n今夕何夕今夕何夕华西坝凶巴巴\t15968\n玩管\t15969\n绿浪\t15970\nHahaha\t15971\n香港文汇\t15972\n李涵洋\t15973\n散漫\t15974\n恩摁\t15975\ntthff\t15976\ntfusjgj\t15977\n嗯梦回\t15978\nDamp\t15979\n李清明\t15980\n剃度\t15981\n度比\t15982\n小辈儿\t15983\n度毒\t15984\n手画\t15985\nwww99recom\t15986\n镜业\t15987\n扮装\t15988\n女银\t15989\n游世\t15990\n鱼刺\t15991\n哈气\t15992\njijjgu\t15993\n热度\t15994\n上丘丘\t15995\n打乱\t15996\n周笔畅天声\t15997\n鬼跳\t15998\n关额\t15999\n爸爸的家在松梅盐与花颜的对面就是我家\t16000\n许湘冰\t16001\n宁波隆\t16002\n遍体\t16003\n男土\t16004\nqqq收针\t16005\n游击战\t16006\n打乞\t16007\n数众\t16008\n打乜\t16009\n谎话\t16010\n软妹纸\t16011\n想辞\t16012\n想辙\t16013\n1000次\t16014\n很紧请\t16015\n欧阳奋强\t16016\n代沟\t16017\n无能者\t16018\n10角\t16019\n八八刀刀刀刀\t16020\n装饰帘\t16021\n冰灵族\t16022\n竹本犬者\t16023\n05年\t16024\n哒啦啦塔拉\t16025\nyatly\t16026\nコウ\t16027\n现中\t16028\ngrdghdb\t16
029\n嬴取\t16030\n飞禽走兽\t16031\naffort\t16032\n盐女\t16033\n海基会\t16034\nfggggd\t16035\n舜天VS\t16036\n一百两百块\t16037\n保安院\t16038\n感觉尝试营\t16039\n我爱你爱的心好累\t16040\n五三块\t16041\n现下\t16042\n体弱\t16043\n素描\t16044\n一小亩\t16045\n二季度\t16046\n如实地\t16047\n现世\t16048\n叶钰慧\t16049\n汗疱\t16050\n猪鼻\t16051\n什么候\t16052\niiiiiiiiiiIIIIII\t16053\n珠米琪\t16054\n叫不着\t16055\n卖一聊\t16056\n小豆米\t16057\nhedgebehue\t16058\n四四个\t16059\n官渡镇\t16060\n一个四Ｇ\t16061\n鸡还有一个人\t16062\n史矛\t16063\n处对象儿\t16064\n松松山\t16065\nuh媛\t16066\n火羊宝\t16067\n霜冰\t16068\n精灵梦叶罗丽第四季\t16069\nwinter\t16070\n塞拉也\t16071\n少量\t16072\n瞎猜\t16073\n姿式\t16074\n分贝\t16075\nxxnjcj\t16076\n华林四桥站\t16077\nT一冖\t16078\n这则\t16079\n气象员\t16080\n根源\t16081\n2796512585\t16082\n大发罚派发\t16083\n谢仁波\t16084\nqtagpj\t16085\n二零八三零幺九\t16086\n无边落木萧萧\t16087\n一千一千五百八十八元\t16088\n名文\t16089\n明明明明明明明明明明明明明明你你你你你你你你你你你你你你你你你你你你你你你你你你你你我晕\t16090\nァォケ\t16091\n福星\t16092\n完完全全\t16093\n口巴\t16094\n外祖父\t16095\nTFB\t16096\n福明\t16097\npapper\t16098\n我讨厌你你讨厌我吗我不讨厌你你不讨厌我\t16099\n杓宇\t16100\n能不可能\t16101\n口工\t16102\n凉拌吧\t16103\n来了来了啦啦来了来了\t16104\n谋求\t16105\n就言\t16106\n奥兔兔秘\t16107\n小忽悠\t16108\n那一秒\t16109\n儿岁\t16110\n广州东\t16111\n旧时光\t16112\n向向\t16113\n绝佳\t16114\n向后\t16115\nBBA3E55B3E4%BBAE58F97E58891E697B6E7984E7857E78987\t16116\n范雪山\t16117\n绝你\t16118\n16:46\t16119\n仙三金星\t16120\n一程1\t16121\n绝佛\t16122\n官庄村\t16123\n倍数百位\t16124\n维系\t16125\n另一个世界\t16126\n不挂平铺\t16127\n藏纳\t16128\n善科yout\t16129\n零商\t16130\n石头剪刀布输喽\t16131\n121249965\t16132\n0n0\t16133\n老九门\t16134\n宝莲寺\t16135\n中山优马\t16136\n沙茶酱\t16137\n猪姥姥有没变猪了猪老龙\t16138\n张医生\t16139\n服学\t16140\n拧干\t16141\n云龙区\t16142\n54840ibhe\t16143\n赞好赞\t16144\n十六只\t16145\n随我你就给我行个女\t16146\n战略\t16147\n脱缺\t16148\n一妻四\t16149\n耿小雯\t16150\n主音\t16151\n夏利营\t16152\n朱家俊\t16153\n十六号\t16154\n赴汤蹈火\t16155\n调节器\t16156\n34．7％\t16157\n要不呀\t16158\nWF\t16159\n飞天烧鹅\t16160\n感觉你好无聊不想\t16161\n香港中旅社\t16162\n海鑫\t16163\n52452224126878928807125\t16164\nhrescv\t16165\n幺二零幺零六一九零一零七零幺七零幺八\t16166\n脆骨\t16167\n最低工资\t16168\n偷笑][偷笑\t16169\n黄白\t16170\n成交价\t16171\n安宁哈赛\t16172\nxjdjxnc\t16173\n7格\t16174\n不
可开交\t16175\n拆开\t16176\n咯奶酪\t16177\n陈晓庆\t16178\n五不爱三四嗯两栖吧\t16179\n哈孩子乖哈孩子乖乖呀乖乖乖乖乖乖乖乖乖乖乖一是小乖乖\t16180\n雄点\t16181\n1400万\t16182\n鸡兔同笼\t16183\ncf号号号\t16184\nｐｏ\t16185\n难常\t16186\n大水桥\t16187\nhhtgb\t16188\nvv呵护u\t16189\n不投诉你\t16190\n祛痘丸\t16191\n红鉴\t16192\n撸一管\t16193\n错了恭喜\t16194\nyitouchong\t16195\n意大利泰尼卡集团\t16196\nccddf\t16197\n午觉机器人\t16198\n手提箱\t16199\n李玲玉\t16200\n得唧\t16201\n小黄羊\t16202\n大芸\t16203\n资讯\t16204\n默默\t16205\n小茂茂\t16206\n姜莹\t16207\n汤灿\t16208\n搓板\t16209\n展开谈\t16210\n九十四\t16211\n得对谁说\t16212\n小秘萌萌哒哒萌\t16213\n易容\t16214\n大节\t16215\n马珮琪\t16216\n斤斤计较里\t16217\nD90\t16218\n经济学日\t16219\n掠夺\t16220\n2\t16221\n颜好\t16222\n城月\t16223\n施南生\t16224\n淫棍\t16225\n枪神纪游戏\t16226\nxjfkfyr375科技馆\t16227\n不一样知道\t16228\n黄刚\t16229\n车牌号\t16230\n黄列\t16231\n法宝\t16232\n两，三米\t16233\nboylove\t16234\n法官\t16235\n凯怡酒店\t16236\n白素贞\t16237\n你老大是\t16238\n黄分\t16239\n开心吻\t16240\nhihihihihihiloaoao\t16241\n红m3not\t16242\n终身大事\t16243\n名仕\t16244\n黄制\t16245\n南华\t16246\nbhthx\t16247\nF-117A隐形战斗机\t16248\n生之年\t16249\n来来电\t16250\n辽通化工\t16251\n书上\t16252\n鸿运\t16253\n后山\t16254\n缝得\t16255\n唐骑九\t16256\n猪娃猪娃猪娃你真实头主人蜘蛛文文\t16257\n王念法\t16258\n蛆虫\t16259\n辞旧迎新\t16260\ntastes\t16261\n楼厂村\t16262\n陈键锋\t16263\n季瑞明\t16264\n柳泉\t16265\n恩墨\t16266\n还是假\t16267\n甜点\t16268\nax2\t16269\n茅台酒厂\t16270\n监护人\t16271\n说乐意\t16272\n张瑞雯\t16273\nati474\t16274\n没有男\t16275\n没有用\t16276\n石头石法法\t16277\n赛迦奥特曼\t16278\n济南都市频道\t16279\n卧蚕\t16280\n林子愉\t16281\n大头贴\t16282\n200万辆\t16283\n暗度秘\t16284\n看写给\t16285\n我在这边你在哪你在哪再讲\t16286\n郝雅楠\t16287\n40多岁\t16288\ne一\t16289\n无缺\t16290\n一丘之貉\t16291\n莫湘儿\t16292\n立马林冈\t16293\n小傻子小傻子小傻子你的小白\t16294\n标点\t16295\n#米\t16296\n耳塞式耳\t16297\n花见花\t16298\n肚明\t16299\n搜佳洁\t16300\n事实无力感空空\t16301\n超级变形车\t16302\n三一共\t16303\naxg\t16304\n存在爱\t16305\n无缘\t16306\n无缝\t16307\n草纸\t16308\n高领衫\t16309\n小葱拌豆腐\t16310\ncffps\t16311\n世界和平\t16312\n灭世\t16313\n希望杯\t16314\nabcdeijxiakalinopqrstuawasyatasylytrofihroeaabc\t16315\n八百多岁\t16316\n腰亲\t16317\n30多万元\t16318\n李灿\t16319\n格兰阁国际酒业音乐品鉴会\t16320\n乖把\t16321\n小幼\t16322\n一根儿\t16323\n隔岸观火\t16324\n王女士\t16325\n一个
20多平方\t16326\nchiphotosbaiducomxiaodupicitemb7fd5266d0160924e9b3026fd30735fae6cd342bjpg\t16327\n哥们儿们\t16328\njzzz\t16329\n佳讯\t16330\n你的世界\t16331\n12000人次\t16332\n林平\t16333\n那个妞\t16334\n为契机\t16335\n特别关注\t16336\nAasobio\t16337\n在这里\t16338\njdne\t16339\n99540\t16340\n后面\t16341\n亲爱的亲一个\t16342\n小蜜蜂\t16343\n骨头\t16344\n陈伯\t16345\n一万只号\t16346\nhttphupiterbaiwenbaocomgifpagehtmlimg544e24201265fa556dc8fcb8a5410394gif\t16347\n朋克风\t16348\n福来\t16349\n周成海\t16350\n米克尔\t16351\nsbsbsbsbsvsbis\t16352\n三八婆\t16353\n神的名义诅咒你今生今世永生永世不伤不灭不老不死\t16354\n散热器\t16355\n兵兵爱一我喜欢兵兵\t16356\n对象\t16357\n婺源\t16358\n着你的手\t16359\n涂在\t16360\n战国策\t16361\n阿奴\t16362\n消防车\t16363\n宠物思密达\t16364\n听歌千年等一回\t16365\n13994086198\t16366\n雷美辰\t16367\n全当\t16368\n肥中\t16369\n101教育网\t16370\n天涯快乐\t16371\n张杰韩\t16372\n肩膀\t16373\n习作\t16374\n快年\t16375\nSsssssss\t16376\n流淌\t16377\n沉溺\t16378\n自我暗示\t16379\n澳门威尼斯\t16380\n是非常帅\t16381\n蒋姗姗\t16382\n48261\t16383\n库塔德尔\t16384\n我没有男闺蜜我也不想要男闺蜜\t16385\n倩倩女神\t16386\n全彩\t16387\n偶其实\t16388\n英魂之刃\t16389\n鸣翠\t16390\n把骨\t16391\n哈密瓜果冻\t16392\n婉媚\t16393\n暗人\t16394\n一点几点\t16395\n酸豇豆\t16396\n0ノ\t16397\n4六十五一\t16398\n无聊聊\t16399\n焦林杰\t16400\n共浴\t16401\n白度母\t16402\n余男\t16403\n宁知秋\t16404\n可愿\t16405\n微言\t16406\n你是臭女人臭不要脸王四辈\t16407\n福利保险\t16408\n陈娟慧\t16409\n要不差钱\t16410\n旬阳\t16411\ngiheh\t16412\n百灵戏\t16413\n史无前例\t16414\n妍儿\t16415\n欢畅\t16416\n啊乔\t16417\n可愛\t16418\n张相呗\t16419\n啊九\t16420\n英蒂迪\t16421\n纸筒\t16422\n巫娜\t16423\nnizoukai\t16424\n袁隆\t16425\n椰蓉酥\t16426\n张笑话\t16427\n处治\t16428\n借给\t16429\n是不说\t16430\n54011\t16431\n能说了算\t16432\n爱上玻璃苣女孩\t16433\n重聚\t16434\n丰年\t16435\n张铭泽\t16436\n一百二十一块九\t16437\n赛尔三\t16438\n萝卜头\t16439\n高发期\t16440\n彭老师\t16441\n363935\t16442\n单桧\t16443\n春花鸟\t16444\n头饰\t16445\n不成年\t16446\n陆叠音\t16447\n床杯\t16448\nBMW在线电视\t16449\n揭露\t16450\n东映\t16451\nBj2018号Ck舞团\t16452\n何宣仪\t16453\n#文怡\t16454\n抖森高帅富\t16455\n总产量\t16456\n后语\t16457\n炙么致\t16458\n神枪\t16459\n貌似\t16460\n不约而同\t16461\n杨千雅\t16462\nmrkfmc\t16463\n秋之夜\t16464\n后话\t16465\nV匕爪\t16466\n邑大站\t16467\n两道\t16468\n刘明军\t16469\n头发战\t16470\n废票\t16471\n九五
人家\t16472\n暴女\t16473\n73￥\t16474\n汤家塘二十号\t16475\n餐车\t16476\n建言\t16477\n藏书\t16478\n兵戎相见\t16479\n肃然\t16480\n萌萌达\t16481\nmhuhbhfnnnn\t16482\n38年\t16483\n滚滚滚滚滚滚滚\t16484\n张雨灵\t16485\n了该\t16486\n音错\t16487\n雷辟如\t16488\nJCJCK\t16489\n轻风一吹\t16490\n明天雅灭碟报\t16491\n百零一j\t16492\n把写\t16493\n青海省\t16494\n晚场\t16495\n二十来天\t16496\ncx41\t16497\n耳光\t16498\n牛牛牛\t16499\n单翅\t16500\n王旭燕\t16501\ncallow\t16502\n数以\t16503\n校花\t16504\n5755号\t16505\n王琦璐\t16506\n乳白色\t16507\n靠不起\t16508\n耳兔\t16509\n教委\t16510\n53583633222222\t16511\n282828283838\t16512\n知方\t16513\n度蜜蜜\t16514\n12.08.14\t16515\n片言\t16516\n说不下去\t16517\nhtrrh\t16518\n棺椁\t16519\n就是臭不要脸\t16520\n钢刀\t16521\n3880元\t16522\n卡宾\t16523\n石林镇\t16524\n陈数\t16525\n卡宴\t16526\n大跳大你\t16527\n周四凌晨0点\t16528\ndthxjtgjsr\t16529\nDJTG\t16530\nDaniel\t16531\n不讨好\t16532\n好浪漫\t16533\ne百分\t16534\n周少番\t16535\n正眼\t16536\n九四六八\t16537\n阴氏埙\t16538\n204元\t16539\n卡定\t16540\nhgxvnjo\t16541\n嚣想\t16542\nl2pw0s\t16543\n聊天我走\t16544\n80分\t16545\n杨蓉\t16546\n美元\t16547\nfsdwdgfedgf\t16548\n安祥\t16549\n正真\t16550\nhttppinyincn28SWGFJGRM4\t16551\n美克\t16552\n孟德睿\t16553\n沒說\t16554\nMuller\t16555\n舍去\t16556\n钰莹\t16557\n冯应\t16558\n苦男人\t16559\n二胎\t16560\n芦田爱菜\t16561\n迎来\t16562\n西餐牛排\t16563\n姨婆\t16564\nirjeh\t16565\n箱上\t16566\nv铃\t16567\nMont\t16568\n季军\t16569\n逝世\t16570\n撒阳\t16571\n商贸城\t16572\n苦寒\t16573\n二胖\t16574\n伤逝\t16575\n二胡\t16576\nfnf\t16577\n冯庸\t16578\n广白相\t16579\n格罗斯\t16580\n外族\t16581\n伤透\t16582\n定县\t16583\n面灰\t16584\n千与千寻\t16585\n平涂\t16586\n一个二十多\t16587\n66\t16588\n黑块\t16589\n3000亿元\t16590\n具有\t16591\n王晓妍\t16592\n范光存\t16593\n代缴\t16594\n升力\t16595\npog\t16596\n蒋家嘴\t16597\n早上七点三十分\t16598\n喜欢你了回\t16599\n海之航\t16600\n探险\t16601\n呼吸机\t16602\nxconext\t16603\n333344445556666\t16604\n升势\t16605\n多度\t16606\n日式猪扒\t16607\n倍儿\t16608\n判决\t16609\n剪帅\t16610\n日系\t16611\n炸弹机器\t16612\n渗出\t16613\n咋讨厌\t16614\n哪个儿子\t16615\n白度秘\t16616\nmyoufo科\t16617\n奴婢\t16618\n白石藏王\t16619\n调查表\t16620\n彭文乐\t16621\n加盟店\t16622\n捕杀\t16623\n庆余年\t16624\n发紫\t16625\n巴啦啦小魔仙堡贵族贝贝公主\t16626\n蝗虫团\t16627\n领悟\t16628\n夜深人\t16629\n猪八戒\
t16630\n余晖\t16631\n球球\t16632\n背伯牙绝弦\t16633\n正碰\t16634\n帖片\t16635\n梁梓丽\t16636\n梅挂花\t16637\nggcjchhdhjgxnbfttjgcffhhhffhffjhchhjkghjgvhhfhgfhjhhghhhhhhhhhhkjhjeyn\t16638\n10万行\t16639\n上学期\t16640\n讹死\t16641\n光人美\t16642\n美公司\t16643\n逼操\t16644\n含溪村\t16645\n哼唧恩\t16646\n人生三论\t16647\n一百本\t16648\n二零三八六\t16649\n讲理\t16650\n伤天害理\t16651\nfami通\t16652\n痛肿\t16653\n容易街\t16654\n愶匹\t16655\n米兰草原本文字号称呼呼唤我是的我\t16656\n张浩洋\t16657\n一百期\t16658\n金咒\t16659\n彩花\t16660\n王圣贤\t16661\n每套\t16662\n雷克雅未克\t16663\n迪卡侬\t16664\n鸭仔\t16665\n搞懂\t16666\n龚自珍\t16667\n调料\t16668\n姚锡荣\t16669\n抱会\t16670\n506\t16671\ndunxia\t16672\n曼尼拉\t16673\nshtxbsj\t16674\nk348\t16675\n皒皒嗯皒皒你好\t16676\nｕｕｕｕ\t16677\n饿句\t16678\n非法拘禁罪\t16679\ndisisisisiste\t16680\n很爱\t16681\n好多情\t16682\n棵得科\t16683\n前贤\t16684\n果壳儿\t16685\nangelababyhvbnmm\t16686\n我的希望\t16687\n胖样\t16688\n美国偶像\t16689\n东方财富网\t16690\naa\t16691\n存款利率\t16692\n算算\t16693\nab\t16694\nae\t16695\nad\t16696\nag\t16697\naf\t16698\nai\t16699\n10月25日\t16700\n男人装\t16701\naj\t16702\n票价\t16703\nal\t16704\nao\t16705\nan\t16706\naq\t16707\nap\t16708\nas\t16709\nar\t16710\nau\t16711\nat\t16712\naw\t16713\nav\t16714\nay\t16715\nax\t16716\n黑了信不信\t16717\naA\t16718\n露胸\t16719\ncijcu\t16720\n6g\t16721\n天蛋\t16722\naQ\t16723\n片剂\t16724\n心鸡\t16725\n虫虫\t16726\naV\t16727\naX\t16728\n咪秘\t16729\n雪青\t16730\njruufg\t16731\n上演\t16732\n二手车行\t16733\n张智霖\t16734\n123456789101112131415161718192021222324252627282930\t16735\n6m\t16736\na1\t16737\na0\t16738\na3\t16739\na2\t16740\na5\t16741\na4\t16742\na7\t16743\na6\t16744\na9\t16745\na8\t16746\n私生饭\t16747\n4s7u\t16748\n沐浴液\t16749\n想不爱\t16750\n6787\t16751\n五常292\t16752\n几20点\t16753\n血粉\t16754\nrxx\t16755\n第二条\t16756\n华夏族\t16757\n7k7k\t16758\n第二杯\t16759\nrxw\t16760\n验收\t16761\n失忆\t16762\n花木\t16763\n妖魔之歌\t16764\n失心\t16765\n澡雪\t16766\n胡吃海\t16767\n笼络\t16768\n星际旅行\t16769\n风林火山\t16770\n花朵\t16771\n丨灬\t16772\nbbbbbjjjjj\t16773\n露胸你怎么的你伤心管我事\t16774\nme，too\t16775\n花真帅\t16776\n失念\t16777\n巴萨曼城\t16778\n花期\t16779\n斗战\t16780\n失眠率\t16781\n我讨厌你我讨厌你欺骗我说你是机器人我恨你\t16782\n赵华东\t167
83\n一百九十一百九十\t16784\n不问\t16785\nBEYOND\t16786\n不闪\t16787\n7日上午\t16788\n空耳你看我爱我恨你\t16789\n倍加片\t16790\n第一架\t16791\n宋承宪\t16792\n淡然处之\t16793\n你是猪啊你是猪啊你是猪啊度秘\t16794\n亦庄\t16795\n不闻\t16796\n后枸\t16797\n第6201天148836\t16798\n马不停蹄日以继夜\t16799\n精神战狼\t16800\nvdsgi\t16801\n亲自\t16802\n2011年5月31日\t16803\n传唤\t16804\n511111\t16805\n你在我生男还是女\t16806\nVhsjjsuaiw\t16807\n6G\t16808\njuly\t16809\n久等\t16810\n黑岗子\t16811\n站队\t16812\n后果\t16813\n百小度\t16814\n传唱\t16815\n50g\t16816\n二辑\t16817\n傲风\t16818\n十五件套\t16819\n撞鬼\t16820\n能够\t16821\n谷物\t16822\n224厘米\t16823\n堆满\t16824\n七里花喵\t16825\n6M\t16826\n艱巨\t16827\n快乐叔意思\t16828\n在的呢吧\t16829\njgpj\t16830\n惨案\t16831\n幺三三\t16832\n二十多片\t16833\n幺三七\t16834\n了你不懂也好最好不懂\t16835\n第1集\t16836\n国宴\t16837\n11.04\t16838\n大磊\t16839\n周佳盈\t16840\n大雁群\t16841\nRIZKA\t16842\n六三家\t16843\n送修\t16844\n猎兔\t16845\n网贷\t16846\n东瓜\t16847\n哎呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t16848\n停摆\t16849\n雷峰塔\t16850\n送信\t16851\nSISUUTS\t16852\n生新\t16853\nA君\t16854\n小不点你是说我的名字\t16855\n别我\t16856\n奔跑\t16857\n别战\t16858\n14:30\t16859\n男性行为者\t16860\n早上六点三十分\t16861\nen九点\t16862\n日了\t16863\n布拉格广场\t16864\n胡迪睿\t16865\n呃田秀阳幺二零幺零九一九八七零七零二零二九\t16866\n赛特菲尔德\t16867\n常识\t16868\n吴林峰\t16869\n牍米\t16870\n错颜\t16871\n装订\t16872\n抱拳\t16873\ncosx2243\t16874\n很好好\t16875\n许淳强\t16876\n谈失\t16877\n梦见你了梦见\t16878\n指点\t16879\n陈泽镇\t16880\n冷酷维多利亚\t16881\n富阳\t16882\nshsssss\t16883\n韩乃熠\t16884\nDfdgfg\t16885\n黑子猪猪\t16886\n点渴\t16887\n875554855855855565855\t16888\n千层浪\t16889\n谈天\t16890\n火匣\t16891\n张怡菲\t16892\n度秘懂\t16893\n46566768\t16894\n子子天\t16895\n韩女孩\t16896\n死狗\t16897\n普工\t16898\n好啦你看你的女朋友俩\t16899\nixiigxigxigxig\t16900\n火区\t16901\n背气\t16902\n密懂\t16903\n马尔科姆\t16904\n魔法棒\t16905\n和字\t16906\n外地区内\t16907\n和子\t16908\n死狼\t16909\n火包\t16910\n宝贝儿们\t16911\n国家地震局\t16912\n恶语\t16913\n5874157\t16914\n辣么美\t16915\n人头发\t16916\n火化\t16917\n美吧美\t16918\n十三份\t16919\nggdjrfufik\t16920\n赞成票\t16921\n圆cx2y26x8ym0\t16922\n鸡蛋汁\t16923\n胃黏膜\t16924\n魔盒\t16925\n没用好\t16926\n魔盗\t16927\ntardigrade\t16928\nfcnjkrd\t16929\n手冷吗度秘\t16930\n蔡玉怡\t16931\n离开我吧\t16932\n账面\t16933\nfhakjdfkfdyrwb
kbloiakmmcfykxhlshlljjzmzghkxkj\t16934\n4组\t16935\n连任\t16936\n611314\t16937\n若尔\t16938\n鸡蛋汤\t16939\n美仙美歪笔战\t16940\n桩手\t16941\n天心\t16942\n3处\t16943\n好吧我讨厌你我讨厌讨厌讨厌讨厌你我讨厌讨厌讨厌你一\t16944\n7845466\t16945\n90寿诞\t16946\n呵赞\t16947\n滑梯\t16948\n下加\t16949\n1915年\t16950\n忤逆\t16951\nreyet\t16952\n嘉士柏五百聊\t16953\n讲台\t16954\n着地\t16955\n刘邓\t16956\n帐户\t16957\n讲句\t16958\npooiuuyyt\t16959\n585556\t16960\n方队\t16961\n假不许\t16962\n朝朝暮暮\t16963\ntomatoutou\t16964\nevsjesjgjcsg5734545825366883Vvfjfthhhdfij\t16965\n最热门\t16966\n康昆逸\t16967\n幻化\t16968\n雅兴\t16969\n另一只要你\t16970\n寡闻\t16971\n坐针\t16972\n还必须\t16973\nvrud\t16974\n就额\t16975\n不对你礼貌\t16976\n1步\t16977\n刘邦\t16978\n郭秋红\t16979\n过马路\t16980\n方阵\t16981\n鸡亲\t16982\n二三四六七八九十六七八九二三一二七二八二九三一三二三三三八四\t16983\n碧辉煌\t16984\noooò哦零零落落零零落落\t16985\n白海\t16986\n看我该怎么办\t16987\n放水\t16988\n我骗你的不是人你是猪人\t16989\n磁带\t16990\n熟悉\t16991\nNONONO\t16992\n丿ㄥ丿丨nnnnnn\t16993\n白浪\t16994\n深南路兴华宾馆\t16995\n叫绝\t16996\n赵964\t16997\n叫给\t16998\n些\t16999\n耳语\t17000\n12400\t17001\n开学后\t17002\n14:00——17:30\t17003\n10月26日\t17004\n放气\t17005\n叶梓萱\t17006\n白浅\t17007\n白浆\t17008\nNONONo\t17009\n7gg8\t17010\n吃了再聊\t17011\n高明区\t17012\nThaiflood\t17013\n卡组\t17014\ndrtffh\t17015\ndeds\t17016\n波尼尔\t17017\n三第二季\t17018\n郑宇轩\t17019\n283.02第一\t17020\n多抱会\t17021\n门神明天明\t17022\nDcgfvg\t17023\n选出\t17024\n格致\t17025\n120张\t17026\n吴光杰\t17027\n恩慈\t17028\n还人\t17029\n主修\t17030\n月月办我是我我我上我\t17031\n亲爱的亲爱的我还爱着你\t17032\nhttpitem\t17033\n15一个\t17034\n120弋\t17035\n通天\t17036\n困难户\t17037\n小花仙\t17038\n李佳航\t17039\n祝福度秘跨年开心\t17040\n埃森哲\t17041\n猫牙\t17042\n加长型\t17043\n5万3千多名\t17044\nxzzcbn\t17045\n女斗\t17046\n张文露\t17047\n六十多块\t17048\n色情业\t17049\n可事\t17050\n下人\t17051\n淘气\t17052\n八宝山\t17053\n中国气象局华风影视\t17054\n女文\t17055\n清洁卫生\t17056\n女方\t17057\n哈辣\t17058\nbccd\t17059\n遇冷\t17060\n别客气别客气\t17061\n四晚\t17062\n李唱\t17063\n偏题\t17064\n100岁\t17065\n多样性\t17066\n杨元海\t17067\nvxgff\t17068\nhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\t17069\n你好像是的确定性格格不入流水账户外广告词汇\t17070\n珠穆朗玛峰\t17071\n中关村\t17072\n888名\t17073\nehdjje\t17074\n土山湾博物馆\t17075\n潘先生
\t17076\n搭理\t17077\n副本\t17078\n3269\t17079\n吗话\t17080\n标我\t17081\n11次\t17082\n美联社\t17083\n努力罪\t17084\nnoyoy\t17085\n2577\t17086\n哪壶\t17087\n2575\t17088\n好年生\t17089\n一夜间\t17090\nLMFAO\t17091\nrrtiinb\t17092\n上会\t17093\n一念不动\t17094\n快快快快快快快快快快快快快快快快快快可口可乐\t17095\n微信\t17096\n巨款\t17097\n掀开\t17098\n欧名凡\t17099\n展现\t17100\nheheheheheheheheheheehehehehheehheheehehehheheehheheheeheheehehhehhehheheehehhehehehehheheheheheehejhe\t17101\n黄昆\t17102\n隐形带\t17103\n52538\t17104\n超级猪猪猪\t17105\n60400\t17106\n摇落\t17107\n矫情\t17108\n产假\t17109\n开门大吉\t17110\n性质\t17111\n社区\t17112\n臻臻\t17113\n更漂亮\t17114\n哈辉\t17115\n有一摇\t17116\n迷子陈\t17117\n姜水珍\t17118\n帅本\t17119\n傅总\t17120\n斯克尔斯\t17121\n61000个\t17122\n花白\t17123\n特攻\t17124\n18874114\t17125\n好笑了\t17126\n变暖\t17127\n丧门旋三雅上门券\t17128\n吞吞吐吐天天\t17129\n砍下\t17130\n邮政编码\t17131\n长安区\t17132\n受惊\t17133\n第十版\t17134\n每日镜报\t17135\n命令\t17136\n古古怪\t17137\n二零七幺\t17138\n陈俊林\t17139\n65565555552\t17140\n谢鹏飞\t17141\n命仙\t17142\n15842437870\t17143\n二零例\t17144\n12月13号\t17145\n孩儿们\t17146\n羽纹铜凤灯\t17147\n逆为你\t17148\n仿真枪\t17149\n黄华\t17150\nack\t17151\n年龄段\t17152\n吞下\t17153\n魏海东\t17154\n于建民\t17155\n谢哥哥\t17156\nqpp\t17157\n踉踉跄跄\t17158\n畹华\t17159\n康康乐队\t17160\n蚀岁\t17161\n瑜伽\t17162\n混合体\t17163\ngsrdtfggtq\t17164\n汉乐\t17165\n惩戒\t17166\n滴敖\t17167\n缰绳\t17168\n迷11\t17169\n热带鱼\t17170\nex0\t17171\n途遇\t17172\n孙我\t17173\n走事事\t17174\n刚性\t17175\nGogogogogogoge\t17176\n扎针\t17177\n说票\t17178\n围着\t17179\n一石\t17180\n与装\t17181\n孙雪涵\t17182\n坐骨天\t17183\n算数平方根记\t17184\n老板心\t17185\n天之后再来\t17186\n爱你爱你爱爱爱爱你\t17187\n真灵\t17188\n42809\t17189\n维他命B2\t17190\n下不过\t17191\n珠城\t17192\n帝一鸣\t17193\n小比兔\t17194\n主人素\t17195\n两千克\t17196\n特错\t17197\n105189573\t17198\n走在路上\t17199\nexo\t17200\n壁虎壁虎壁虎\t17201\nexk\t17202\n冯宗兴\t17203\n臭屁i\t17204\nWhyMe2011\t17205\next\t17206\nghiphotosbaiducomxiaodupicitembba1cd11728b47108ff242b9c4cec3fdfd0323e1jpg\t17207\nexp\t17208\n别老子老子的行\t17209\n大冒清\t17210\nQ6匕乙6盛惫x\t17211\n南戴河\t17212\nexx\t17213\nexy\t17214\n猪汇市\t17215\n姜堰牟成\t17216\n小阳子\t17217\n神画\t17218\n300多万元\t17219\n杏坛\t17220\n
Fhhhhjvbj\t17221\n奋笔疾书\t17222\n展架\t17223\n布置\t17224\n美心\t17225\n无线电信号\t17226\n咯空间旅途土秃路途突突具体\t17227\n娜塔丽\t17228\n二周年\t17229\n偶滴歌神\t17230\nxxxxxxxxx\t17231\n身体\t17232\nabcdeftihison\t17233\n宏物\t17234\n缺笑\t17235\nec432c45a68b87d6277ff960jpg\t17236\n汉娜蒙塔娜\t17237\n明明区\t17238\n红烧鸭肉\t17239\n死废柴\t17240\n06月18日05时00分\t17241\n铁字\t17242\n评优\t17243\n薛思雨\t17244\n啦啦啦啦啦啦啦啦你\t17245\n张伟\t17246\n渐显\t17247\n驰\t17248\n死奶奶\t17249\n岳林\t17250\n儿童性\t17251\n约会大作战狂三\t17252\n阿雪骨\t17253\n浩宇\t17254\n个逼\t17255\n写开\t17256\n甲构炎\t17257\n到好阄\t17258\n律师费\t17259\n拼插\t17260\n业务员\t17261\n李彤\t17262\n体能\t17263\n奥特曼小小你好\t17264\n评估\t17265\n郑丽明\t17266\n制约\t17267\n超户\t17268\n儿452452258\t17269\n刚都\t17270\n真事似的\t17271\n人块\t17272\n猪猪猪猪\t17273\nseu\t17274\ndiever\t17275\nses\t17276\nser\t17277\n高峰圭\t17278\n杨颖哥\t17279\n番下\t17280\n愛妳\t17281\nsee\t17282\nsea\t17283\n混淆\t17284\n开练\t17285\nsel\t17286\n吗ok\t17287\nsei\t17288\n开组\t17289\n满洲里市\t17290\n不要脸不要脸不要脸\t17291\n紧锣密鼓\t17292\n混混\t17293\n泥潭\t17294\n未能\t17295\n开维\t17296\n芝士牛\t17297\n三星手机大陆\t17298\n瘦男\t17299\n国鑫\t17300\n回亲\t17301\n大厅\t17302\nvxhrdvhgr\t17303\n大厂\t17304\n中午12点半\t17305\nhttpghiphotosbaiducomxiaodupicitemd043ad4bd11373f054ca4beba30f4bfbfaed04c8jpg\t17306\n绝一\t17307\n老子锤\t17308\n巴罗萨\t17309\n李旭希\t17310\n原汁\t17311\ntfgfg\t17312\n兰吧\t17313\n繁脑\t17314\n好姐姐\t17315\n送员\t17316\n大厦\t17317\n大厢\t17318\n802.11bgn\t17319\n远去找\t17320\nGVIG8\t17321\n申请人\t17322\nwerearfamrle\t17323\n大使馆\t17324\n大厨\t17325\n姓李我恨\t17326\n110529希澈\t17327\n不是我想你我一辈子爱你可以\t17328\n迄今\t17329\n一百三百元\t17330\n拔拉姆\t17331\n1000日元\t17332\n讨厌裤\t17333\n大去\t17334\n三比二\t17335\n星级\t17336\n叫会\t17337\n黄鹂鸣翠柳\t17338\n不不不我心情不错\t17339\n微米维密\t17340\n猛然间\t17341\n你追我赶春耕忙筒谱\t17342\n七色雨\t17343\n姚楚烽\t17344\n给唱\t17345\nDone\t17346\n2月19号\t17347\n汉堡包\t17348\n撺掇\t17349\n麼麼\t17350\n金银珠\t17351\n文化西街\t17352\n216848536\t17353\n回头率\t17354\n松峰山\t17355\n奥他来了\t17356\n18001387059\t17357\n六百八十四兆\t17358\nqqqqqqqqqqqwqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq\t17359\n公主家\t17360\n初心\t17361\n班\t17362\n
肚量\t17363\n肚里\t17364\n抠逼\t17365\n爱情伤\t17366\n珧\t17367\n四4天\t17368\n珠\t17369\n董波\t17370\n汗哒哒\t17371\n現\t17372\n珹\t17373\n陀飞\t17374\nCcpalklkll\t17375\n沙子口\t17376\n直言\t17377\n珲\t17378\n珍\t17379\n温泉\t17380\n增产\t17381\n珊\t17382\njudicial\t17383\n珆\t17384\n年终\t17385\n表理\t17386\n珞\t17387\n武清\t17388\n贱笔\t17389\n咪咪咪咪咪咪咪暮暮暮蜜木马\t17390\n拿怕\t17391\n内在再\t17392\n山顶\t17393\n珐\t17394\n壬乞\t17395\n我不要你了空开\t17396\nHelloKitty\t17397\n三幅\t17398\n美好滋味\t17399\n数咱俩\t17400\n填志愿\t17401\nONE\t17402\n才毛\t17403\n火山群\t17404\n叶文\t17405\n什么有那该多好啊穿越喜雨\t17406\n家宿舍\t17407\n套路\t17408\n异类\t17409\n王彦镔\t17410\nＫB\t17411\n景博心\t17412\n三干\t17413\ngxdy\t17414\n九八年\t17415\n十二分之五公顷\t17416\n三年\t17417\n三幺\t17418\n趴体\t17419\n冰牛奶\t17420\n人民解放军\t17421\nhttpmflbmhcomshaonv6073\t17422\n明圣\t17423\n莜面\t17424\n原班\t17425\nnnn9￥88￥2854840￥￥￥￥￥\t17426\n五要\t17427\n癫疯\t17428\n七十多块\t17429\n贾培莹\t17430\n败家\t17431\n动漫公司\t17432\n闹铃\t17433\n屠天\t17434\n笔率\t17435\n屠夫\t17436\n诒话\t17437\n王艳\t17438\n王色\t17439\n余灰\t17440\n吊灯\t17441\n出镜\t17442\n哎呦啦\t17443\n证明搏\t17444\n王艮\t17445\n法性\t17446\n跃入\t17447\n恰克图\t17448\n奥格\t17449\n我不要你了我不要度秘\t17450\n尻比\t17451\n大好看\t17452\n费神\t17453\n吕品\t17454\nx2程题\t17455\n爱丽尔\t17456\n收获\t17457\n意味深长\t17458\n大地球\t17459\n试衣间\t17460\n六岑\t17461\n刘曙松\t17462\n僧伽教育史\t17463\n张警官\t17464\n曹郁\t17465\n每间\t17466\n吕哥\t17467\n拿钱\t17468\n不方二\t17469\n别忘了你是我的小凌\t17470\n赵立\t17471\n六岁\t17472\n金属架\t17473\n525455427\t17474\n信心类\t17475\ntMM\t17476\n怀胎\t17477\n爱情人民政局\t17478\n156556555555555566666663333399999952458534\t17479\n9杯\t17480\n愚不可及\t17481\n看见吗\t17482\n贝尔摩\t17483\n１９\t17484\n9条\t17485\n308号房\t17486\n铁路\t17487\n我喜欢你好\t17488\neeeeeeeeeeeee\t17489\n齐名\t17490\n丽娘\t17491\n长段话\t17492\n动静\t17493\n嗷剑狂的我的攻略\t17494\n不生等明天门市等你\t17495\n舌尖上的中国\t17496\n巴克莱银行\t17497\n18趟\t17498\n腰筒裙\t17499\n齐大圣\t17500\n明为\t17501\n音乐梦\t17502\n余建昌\t17503\n某某某某某某\t17504\n8233858567\t17505\n这些事儿我不看了行了吧行\t17506\nhiexekducfub\t17507\n二二分之一\t17508\n全建\t17509\n叨逼叨呀叨逼叨\t17510\n789c2\t17511\n朱茯神\t17512\n42公斤\t17513\n打飞机气\t17514\n唐老鸭\t17515\nDEF\t17516\nDEG\t17517\n雪晴\
t17518\n谢大媒\t17519\n水火断网\t17520\n交寒假\t17521\n大白\t17522\n220元\t17523\n双拳\t17524\n水儿\t17525\n是什么\t17526\n百利城\t17527\n雪景\t17528\n雪普\t17529\n魑瑛\t17530\n郭存晴\t17531\n李冰因\t17532\n哈哈市\t17533\n老忘我老娘\t17534\n蠃\t17535\n大醉侠类\t17536\n夺命夺命真的好烦人\t17537\n马托\t17538\n嗨洛\t17539\n建设银行\t17540\n文成公主\t17541\nhttpahiphotosbaiducomxiaodupicitemb8014a90f603738dd1ebbe51b41bb051f819ec24jpg\t17542\n罗堡\t17543\n蠕\t17544\n赵三梅\t17545\n博爷\t17546\n大黄鼠皮卡丘\t17547\n博爱\t17548\n结巴\t17549\n瞅见\t17550\n乔良\t17551\n蠢\t17552\n国安\t17553\n氧气\t17554\n恶心鬼\t17555\nhttphhiphotosbaiducomxiaodupicitem574e9258d109b3deeb03f91ccbbf6c81800a4c01jpg\t17556\ncuole\t17557\n我笑点的你给我讲嘛\t17558\n方法\t17559\n6046720680006600400\t17560\n亲亲吻吻你\t17561\n824707\t17562\n宝音源\t17563\n钟好\t17564\n毯子\t17565\noxox\t17566\n小儿子\t17567\n阿侮\t17568\nYoung\t17569\nvaal\t17570\n双井桥北时代国际公寓一号楼\t17571\n幸福无穷\t17572\nvaak\t17573\n小度秘真乖^\t17574\n声慢\t17575\njj100\t17576\n女生日\t17577\n朱文博\t17578\n毅腾\t17579\n李俊基\t17580\n小公里\t17581\nNichuande\t17582\ncuì\t17583\n裸婚时代\t17584\n首尔咖啡厅\t17585\nNnnnnnnnn\t17586\n表层\t17587\n阿依\t17588\n157家\t17589\n重绘\t17590\nhtgyih\t17591\n不胖\t17592\n黄佳怡\t17593\n一下下而已\t17594\n太短\t17595\n太矮\t17596\n庸人自扰\t17597\n高呃\t17598\n纳溪区\t17599\n云门\t17600\n15185576205\t17601\n九到五点\t17602\n高告\t17603\n重组\t17604\n换给\t17605\n331米斯特\t17606\n人之相处\t17607\n游于艺\t17608\n奥克兰\t17609\n挂吊\t17610\ntfast\t17611\n迫害\t17612\n不能\t17613\n挂名\t17614\n三把\t17615\n赤壁\t17616\n5.63%\t17617\nmature\t17618\n大洋到小从小养\t17619\n温润\t17620\n厚礼\t17621\nmedical\t17622\n2olo\t17623\n国不上\t17624\n姐张\t17625\n核电站\t17626\n华星光电厂房\t17627\n第四棒\t17628\n啦啦啦啦我的宝贝\t17629\n二三幺五\t17630\nKKTiT\t17631\nsholud\t17632\n喵喵喵喵\t17633\n小姐们\t17634\n多明\t17635\nfacebook\t17636\n名报\t17637\n没不懂\t17638\n易思颖\t17639\nhfddrdtt\t17640\n照赛\t17641\nfr57\t17642\n施施然\t17643\n巴尔克斯\t17644\n脖子世博\t17645\n非法小\t17646\n金毛儿\t17647\n柳波先\t17648\n好不哎\t17649\n看耶\t17650\n李永\t17651\n1234567890147085236915908742361254780963123578960414523690871236547890\t17652\n说劲\t17653\n5873806548\t17654\n吉米喽\t17655\n18:50\t17656\nBUHUILE\t17657\n白光莹\t
17658\n小阳阳\t17659\n宁涛\t17660\n蓝翔\t17661\n龙床\t17662\n睡神\t17663\n帅哥儿\t17664\n不能不相信自己\t17665\n二时\t17666\n莲花山\t17667\n马洪潮\t17668\n龙庄\t17669\njhsg\t17670\n蓝翅\t17671\n哇塞度秘你好帅\t17672\n135494\t17673\n曼曼\t17674\n李氏\t17675\n赶集\t17676\n上高\t17677\n青山峰\t17678\n大鸭子\t17679\n756555554745244555888125445858\t17680\nDfffjfvidfjgdhghjfjgghjfgjhfdhgdhgttugdjfgogfjidhbffifjxjffifgkdhotvlteigxgutukfhutfjj\t17681\n78点\t17682\n两百年\t17683\n阿贵\t17684\n美颜相机\t17685\n宫外孕\t17686\n石澳大浪湾道13号\t17687\n不痛不痒\t17688\n中华论坛\t17689\n时间块\t17690\n还不够\t17691\n崇左巿\t17692\n过一会儿\t17693\n咝吆咄呍吙咪呸叿\t17694\n神邛\t17695\n脸红脸红我喜欢脸红芭比芭比\t17696\n恩吉镍业\t17697\n底价\t17698\n小小爱了劲女孩真是你\t17699\n阿进\t17700\n唠叨\t17701\nfyfgydrgryyyfttrgyjjUHJhggyghghghhhwghh\t17702\n鑫谷讯\t17703\n佛丽丽\t17704\n南门人\t17705\n提携\t17706\nit诶it谭嗣同抵抗快\t17707\n算了真是\t17708\n金汇獒园\t17709\n十头\t17710\n十夫\t17711\n十天\t17712\n昨天五点\t17713\ncabhs\t17714\nhgbnvbnhv\t17715\n十大\t17716\n虫儿\t17717\n马丽妃\t17718\n摊头\t17719\n莞式\t17720\n咿呀咿呀哟\t17721\n8月20\t17722\n忙粉\t17723\n表面\t17724\nG13咖\t17725\n食材单\t17726\n岗前\t17727\nfphdbdrjdt\t17728\n十处\t17729\n100刀\t17730\n美很美很美\t17731\n100分\t17732\nvhfnffghthhdk\t17733\n成治愈\t17734\n葡萄酸\t17735\n巴宝\t17736\n造富\t17737\n性心\t17738\n食用酒精\t17739\nHcd\t17740\n绢子\t17741\n再见去\t17742\n张黄图\t17743\n风靡\t17744\n龙红志\t17745\n李艳奇\t17746\n稳距\t17747\n有待\t17748\n知道你之啊爸爸之爸爸爸爸\t17749\n祁珏\t17750\n滑稽\t17751\nGrrfhf\t17752\n545312165858707998534381005545548404897554554885000555545545235613333333333333333333333333333334566555555553133258840484664848\t17753\nNVIDIA\t17754\n火趘距084星4\t17755\n习反贪发明\t17756\n哎呀哥哥你在吗思密达\t17757\ngoldring\t17758\n7778í\t17759\nmia丽\t17760\n有得\t17761\n璀璨\t17762\n苏州市\t17763\n颜儿儿\t17764\n中兴\t17765\n会会\t17766\n酒醒\t17767\n哈对吧\t17768\n鸡毛蒜皮\t17769\n酒醉\t17770\n玛姬·格蕾斯\t17771\n亏度\t17772\n预备队\t17773\n男生孩\t17774\n000998\t17775\n148杆\t17776\n茶花\t17777\n百醇液\t17778\nuuud\t17779\n香浓味\t17780\n数以万计\t17781\n聘用\t17782\nｌｏvｅ\t17783\n货1\t17784\n梁小奎\t17785\n希拉里\t17786\n在所不惜\t17787\n男生子\t17788\n可是然\t17789\n多木达瓦卓玛女\t17790\n九百块\t17791\n许老师\t17792\n几许\t17793\ngjaloka\t
17794\n杨崇\t17795\npaix\t17796\n烧开\t17797\n气氛围墙\t17798\n一猜一猜你猜猜我的秘密\t17799\n百星酒店\t17800\nFigkjegdg\t17801\niavrou\t17802\n定价\t17803\n来了飞来\t17804\n再说一匀\t17805\n凯恩斯主义\t17806\n87886543210\t17807\n引人注目\t17808\n好呀动车\t17809\nmaimaimai\t17810\n宋梦\t17811\n贫僧\t17812\n满满满\t17813\n锂电\t17814\n吃承\t17815\nwehaveno\t17816\n白塔\t17817\n狺话\t17818\n死不在\t17819\nnzh\t17820\n穿衣服\t17821\n侵入\t17822\n渊博\t17823\n欢欢\t17824\n56784946678431698\t17825\n自锁\t17826\njhjjjjjbs\t17827\n3a杯羹\t17828\n会议厅\t17829\n迈皋桥\t17830\n加油鬼\t17831\n3080吧\t17832\n弱弱\t17833\n小度干哈\t17834\n多发期\t17835\n3公里\t17836\n火炬手\t17837\n思潮\t17838\n孔多塞\t17839\n梦露\t17840\n死噶\t17841\n雄师\t17842\n吧皮卡丘\t17843\n上个月份\t17844\n21:27\t17845\n爱情歌\t17846\n甘旨\t17847\n吧地图\t17848\n拆线\t17849\n叫做人\t17850\n妳男\t17851\n电视节\t17852\n十小心\t17853\n披\t17854\n骁骁\t17855\n抨\t17856\n孙佳琪\t17857\n抢\t17858\n抡\t17859\n抠\t17860\n学问\t17861\n你好我喜欢猫\t17862\n报\t17863\n护\t17864\nwaplmy\t17865\n抹\t17866\n有疑\t17867\n抿\t17868\n2044903490\t17869\n米庆\t17870\n押\t17871\n形式主义\t17872\n抱\t17873\n抶\t17874\n抵\t17875\n偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷偷\t17876\n把\t17877\n入伍\t17878\n刘希刚\t17879\n安扣\t17880\n去化\t17881\n秦泽\t17882\n抂\t17883\n技\t17884\n历练\t17885\n再二\t17886\n回我\t17887\n抛\t17888\n抚\t17889\n折\t17890\n三十袋\t17891\n布鲁布鲁布鲁布鲁布鲁\t17892\n抜\t17893\n抓\t17894\n抒\t17895\n抑\t17896\n抗\t17897\n抖\t17898\n投\t17899\n1682尾号\t17900\n散会\t17901\n钟佳宸\t17902\n李有能\t17903\n抢亲\t17904\n王撒\t17905\n宫子华\t17906\n一亿万块\t17907\n秀智商\t17908\nnononoyesyou\t17909\n65426\t17910\n邓青\t17911\n光镇政府\t17912\nastlesla\t17913\n周清宇清\t17914\n征用\t17915\nissa\t17916\n小颖颖\t17917\n92nnn\t17918\n碱片\t17919\n葡萄干\t17920\n泰姬陵\t17921\n法律思维\t17922\n冷暖自知\t17923\n火星性感\t17924\n臭猪悟度秘\t17925\n大脑\t17926\nxiao话\t17927\n七秒钟\t17928\n欢儿\t17929\nFKYUR\t17930\n傻乎乎\t17931\n白长\t17932\n杀人儿\t17933\n留笑容\t17934\n9i\t17935\n還使勁\t17936\n三百个\t17937\n糖水\t17938\nfhfjfjehccdaxcgbcxeibv\t17939\n错伏\t17940\n惺惺相惜\t17941\n三百两\t17942\n韩青昭\t17943\n仕林轶\t17944\n丰台区\t17945\n拜顾\t17946\nwuyv\t17947\n参考锤\t17948\n羊杂碎\t17949\n南茜\t17950\n客气会\t17951\n去处\t17952\nwuyl\t17953\n半小時\t17954\n芝麻糖\t
17955\n徐良\t17956\n废流量\t17957\n朱朱\t17958\n三百万\t17959\n发颜\t17960\n发额\t17961\n向欣盈\t17962\n几百遍\t17963\n东三环南路\t17964\n无度\t17965\nfssafdfgf\t17966\n冯雨波\t17967\nhfgffggcchjhccxzghvgmvfnbfhbmb\t17968\nE-MAIL\t17969\n1一r1\t17970\n1453592528451\t17971\n外公关系\t17972\nhudf\t17973\n百慕大三角\t17974\n阴精\t17975\n人士\t17976\n微笑\t17977\n在一起飞车\t17978\n告别\t17979\n博古架\t17980\n呀了\t17981\n无序\t17982\n阿南\t17983\n肛门\t17984\n水千山\t17985\n无底\t17986\n鉴取\t17987\n全身照\t17988\n巧干\t17989\n人声\t17990\n就是了撒\t17991\n茄肉\t17992\nzysyvs\t17993\n星星星星\t17994\nQQ浏览器\t17995\n用纸\t17996\n借办\t17997\n发富贵花i\t17998\n师兄\t17999\n有一个你\t18000\n三国不解之谜\t18001\n回购\t18002\n爱巢\t18003\n第12届\t18004\n追疯子\t18005\n人武部\t18006\n狂二零关\t18007\n颜世山\t18008\n祖传\t18009\n古剑奇谭真的很浪\t18010\n提心吊胆\t18011\n师公\t18012\n宋明倩\t18013\n啦啦啦啦啦累\t18014\n特里谢\t18015\n盖点\t18016\n陈东源\t18017\n尼克森\t18018\n康静\t18019\n严水英\t18020\n法门寺\t18021\n传入\t18022\n押运\t18023\n修心养性\t18024\n伍佰壹\t18025\nofen\t18026\n王科龄\t18027\ngjjgjfhbf\t18028\n神经了吧你听不懂我说话吗猪啊牛\t18029\n中宏\t18030\n再来说\t18031\n装饰部\t18032\nvvbljgh\t18033\n看病\t18034\n航天部\t18035\n锐角\t18036\n度秘我讨厌你我恨你\t18037\n强阳\t18038\n门门\t18039\n门闩\t18040\nyuui\t18041\n蜂蜜牛肉\t18042\n自失\t18043\n陈家杰\t18044\n说死\t18045\n李咪咪\t18046\njdhd\t18047\nfjkkl\t18048\n强队\t18049\n说止\t18050\n鼓舞人心\t18051\n真的不知道\t18052\n原始就是你\t18053\n烦吾\t18054\n祼\t18055\n自恃\t18056\n为您好\t18057\n转笔\t18058\n呵妈咪\t18059\n5月28号\t18060\n乔巴蓝\t18061\n萍萍儿\t18062\n发芽\t18063\n聚儿\t18064\nihv\t18065\nC8C8C9E82B6F7A59814039F0494442FE\t18066\n傻傻猪\t18067\n不行不分\t18068\n绒绒\t18069\n朱卿瑶\t18070\n恩恩怨怨\t18071\n互粉交友推广应用#推兔#\t18072\niha\t18073\n阮泽华\t18074\n划线\t18075\n美观度\t18076\nihe\t18077\n闹门儿\t18078\n天脚踏\t18079\n呢不知道\t18080\n星猫\t18081\n双井\t18082\n觅渡\t18083\n面壁去\t18084\n15862173130\t18085\n两院\t18086\n借势\t18087\n籍\t18088\nHERO\t18089\n个壁\t18090\n好友\t18091\n好受\t18092\n死度秘度秘c\t18093\n第三遍\t18094\n好发\t18095\n赵丽坤\t18096\n填满\t18097\nejdbbz\t18098\n好句\t18099\n掌我的小仓兰香阳光道我的梦\t18100\n美尔\t18101\n心裳\t18102\n望月\t18103\n好可\t18104\n唔觉唔\t18105\n這句\t18106\n好叫\t18107\n胡玉燕\t18108\nsulina\t18109\n警报器\t18110\nhttpdhipho
tosbaiducomxiaodupicitemf603918fa0ec08fa529300d35eee3d6d55fbda\t18111\n籼\t18112\n类\t18113\n七雄类\t18114\n麦羊羊\t18115\n华裔传奇\t18116\n米\t18117\n汉釜宫喜悦感v\t18118\nSTISISTITIS\t18119\n宣武区\t18120\n悦达889周笔畅\t18121\n啦啦啦啦啦啦徐娘之乐\t18122\n多自\t18123\n嘻欢\t18124\nkutv\t18125\n千万种\t18126\n说的再来\t18127\n德者居上\t18128\n多致\t18129\n老海\t18130\n毛驴\t18131\nkutf\t18132\n368万\t18133\n丑丑丑八怪\t18134\n在湖边\t18135\n大汉奸\t18136\n立普妥\t18137\n郭楚儿\t18138\n程大庄\t18139\n忍者龟\t18140\n跳着舞的你\t18141\n孟广美\t18142\n豫北\t18143\n作动\t18144\n胡见宇\t18145\n糖丽丽\t18146\n哺乳\t18147\n真可爱\t18148\n汤圆儿\t18149\n110万册\t18150\n胡琪\t18151\n到达处\t18152\n拖裤\t18153\nRun\t18154\n累死\t18155\n泡儿\t18156\n丨你好恶心说的就是你\t18157\n110分钟\t18158\n泫雅\t18159\n相隔\t18160\n百10\t18161\n没错误\t18162\n5151881\t18163\n55567\t18164\n2016年1月15号\t18165\n相随\t18166\n朱雪琪\t18167\n崔这么\t18168\n无聊的孩\t18169\nEXE\t18170\n请你走\t18171\nEXO\t18172\n五斤\t18173\nwoyenijued\t18174\n这个五月十八号\t18175\n25464\t18176\nFadshurt\t18177\n哎呦我亲爱的小救星\t18178\n临汾\t18179\n十一期\t18180\nòó\t18181\n没有听话\t18182\n嗯好嘞\t18183\ng鸣一\t18184\nmng\t18185\n定要\t18186\n撸撸噜\t18187\n四坏\t18188\n天呐度秘\t18189\n二多十\t18190\n四块\t18191\n68集\t18192\n临江\t18193\n159910980\t18194\n牙刷\t18195\n番鸭\t18196\n汉蒙梦\t18197\n框子\t18198\n米小离\t18199\n千载\t18200\n行尸走\t18201\n够了累\t18202\n肥硕\t18203\n柯震\t18204\n1027\t18205\n李军亮\t18206\n宁波的海公园\t18207\nViVOiCe\t18208\n两限\t18209\n秘三酥鸭\t18210\n556333\t18211\n802243\t18212\n土气\t18213\ncfgjdxgbdfhiqrhhatmfabdafbafbafhetjrukuurktkurkujeysjgbasvsdhhjkifdxcvbn\t18214\nclr乐队的你的微\t18215\n潇潇\t18216\n2个月\t18217\n异天\t18218\n撒兴冲冲\t18219\nlala\t18220\n砝码\t18221\n张朝宾\t18222\n看来着\t18223\nEX2\t18224\ni2366网\t18225\nEX0\t18226\n姑古国\t18227\n用友\t18228\n起立\t18229\n蛋壳\t18230\n十二年前\t18231\n刘义伟\t18232\n富国天成\t18233\n美如水\t18234\n多爱爱自己\t18235\n易改\t18236\n落乱说\t18237\n猪世界\t18238\n中午12：30分\t18239\n哈哈哈度秘我太爱你了你\t18240\n2．7个\t18241\n用口\t18242\n汽修厂\t18243\nbitch们\t18244\n很糟糕\t18245\n8532\t18246\n2x4x6\t18247\n明朗德\t18248\nmygood\t18249\n摇树\t18250\nemail\t18251\n我是女的你是男的我要亲你\t18252\n舒服\t18253\n旗boots\t18254\n受命\t18255\n方才\t18256\n机型\t18257\
n特：赫\t18258\n时刻表\t18259\n得数\t18260\n潘集灰\t18261\n对末\t18262\n我也是你的夜\t18263\n爱华林翰\t18264\n女白\t18265\n晨钟暮鼓\t18266\n世贸磅哗\t18267\n素别\t18268\n背提纲\t18269\n集安市\t18270\n对望\t18271\ndruf\t18272\ndrux\t18273\n梦滴\t18274\n我有一万一千一百一十一万一零一万一千一百一十一\t18275\n同行\t18276\n徐娜\t18277\n未收人\t18278\n八斗\t18279\n陈志\t18280\n翁涛\t18281\n机器人朋\t18282\n阿拉蕾安七炫\t18283\n缅因\t18284\n张发\t18285\n黑风雷\t18286\n常见秘\t18287\n食品培训证\t18288\n丰裕酒店\t18289\n超级粉丝\t18290\n麦月梅\t18291\n多卑微\t18292\n广州市越秀小北营业厅\t18293\n湿巾\t18294\n八方\t18295\n张口\t18296\n800万个\t18297\n你未来呀就在上上上\t18298\n赖文锐\t18299\n法制日报\t18300\n八斤\t18301\n死老王\t18302\n于朦胧\t18303\nGei\t18304\n悠酷\t18305\n张号\t18306\nGel\t18307\n咚咚\t18308\n松弛\t18309\n赵依晨\t18310\n你好度秘我是谁谁的妈妈\t18311\n说试试试试试试试\t18312\n谁谁\t18313\ngsjbgxo\t18314\n咘星\t18315\n方文娟\t18316\n凶我了\t18317\n绿草\t18318\n在南方\t18319\n咋样做\t18320\n一百五七十一个\t18321\n半吨\t18322\n好好玩游赞\t18323\n休闲气\t18324\n人近我一迟我尽人一仗\t18325\n不破\t18326\n三里屯SOHO\t18327\n阿斯顿马丁\t18328\nuudrhyrucr\t18329\n那你个大头鬼我是\t18330\ndiuduu\t18331\n邓龙浩\t18332\nstti\t18333\nBOCbcd\t18334\n技术标\t18335\nsttd\t18336\n12345567\t18337\n朱帅\t18338\nELIE\t18339\n西苑医院\t18340\n罗荫国\t18341\n泥坊\t18342\n周六中午12点\t18343\n小处理度秘\t18344\n这片\t18345\n作祟\t18346\n76儿\t18347\n挣扎求生存\t18348\n快痛\t18349\n祖\t18350\n莲花腿\t18351\n47本\t18352\n1.3米\t18353\n猛打\t18354\n5886\t18355\n刘韵雅\t18356\n黄页\t18357\n5888\t18358\nadsl\t18359\n川大\t18360\n林依\t18361\n心软\t18362\n路面度\t18363\n好不好爽\t18364\n融化\t18365\n捣蛋\t18366\n好不好爱\t18367\n按钮\t18368\n导师\t18369\n太谷\t18370\n腾讯儿童网\t18371\n搭档\t18372\n司阍承埔\t18373\n呢一点也不萌\t18374\n石雅倩\t18375\n过细\t18376\n没我\t18377\n自甘\t18378\nPinky、Stephan\t18379\n孙燕飚\t18380\n稀客\t18381\n这版\t18382\n王之平\t18383\n气站\t18384\n驾驶座\t18385\n没戏\t18386\n参见\t18387\n自由\t18388\n得益\t18389\n参观\t18390\n度秘你好乖好\t18391\n苏沐秋\t18392\n忘了填涂\t18393\n啦啦棒\t18394\n超美都\t18395\n涛涛涛涛涛涛涛涛涛涛\t18396\n说出去\t18397\n燃眉之急\t18398\n自用\t18399\ndhiphotosbaiducomxiaodupicitemac4bd11373f08202c167e94b4cfbfbedaa641bc8jpg\t18400\n捐资\t18401\n天龙难\t18402\nJhhkgjfjh\t18403\nxed\t18404\n你是我的恩人\t18405\n倍数\t18406\n东西ok\t18407\n燃尽\t18408\nMarcin\t1
8409\n7:55\t18410\nglut\t18411\n勤\t18412\n养阴\t18413\n吕思意\t18414\n勿\t18415\n勾\t18416\n勺\t18417\n勹\t18418\n7条\t18419\n阿少\t18420\n一点零七\t18421\n我的家乡\t18422\n8月12日\t18423\n床地\t18424\n2120家\t18425\n别以为我看不懂\t18426\n勋\t18427\n勉\t18428\nghhghghhhjhhhhhhhhhhhhjjjjjjjjjj\t18429\n勇\t18430\n勃\t18431\n睡眼\t18432\n远在天涯\t18433\n盟杰\t18434\n阿尼\t18435\n動\t18436\n勒\t18437\n不行好好好\t18438\n袁今天\t18439\n顾及你\t18440\n女知青\t18441\n倒班\t18442\n闭目\t18443\n死账\t18444\n死货\t18445\n胡老板\t18446\n朱俐静\t18447\n曹蛋蛋\t18448\n石食油\t18449\n高高\t18450\nNatual\t18451\n死贱\t18452\n范蠡\t18453\n牛顿\t18454\n0.5%\t18455\nfvbggvc\t18456\n心儿\t18457\n马思婷\t18458\n灭绝\t18459\n慢慢\t18460\n默默之默\t18461\n可想\t18462\n感冒通\t18463\n松送\t18464\n没太懂\t18465\n东二环\t18466\n起源\t18467\n班牙\t18468\n135353\t18469\n卖点\t18470\n伯秒\t18471\n1百分之几\t18472\n载载\t18473\n穿心村\t18474\n昨天内\t18475\n55565588898885555555588\t18476\n标榜\t18477\n左撇子\t18478\n二种\t18479\n十五遍\t18480\n你好坏人家\t18481\n十二周岁\t18482\n自弃\t18483\n品件\t18484\n13991389316\t18485\n鸡身\t18486\n职业经理人\t18487\n静心\t18488\n十五道\t18489\n昏头\t18490\n萌汗纸\t18491\n没大\t18492\n郑智化\t18493\n僵尸\t18494\n张孜箬\t18495\n没天\t18496\nJighijbbjjm\t18497\n没头\t18498\n磨饼子\t18499\n二狗\t18500\n15131328\t18501\n周小a\t18502\n哪你女\t18503\n30多个\t18504\n大你事实大富翁那度秘\t18505\nｈｉｓ\t18506\n体性\t18507\n林健麟\t18508\n广州国贸中心\t18509\n我真的\t18510\n胡杰\t18511\n九少十年\t18512\n香水儿\t18513\nedy\t18514\n胆怯\t18515\n要人\t18516\n田登\t18517\n转交\t18518\nobqswonderofawsoneday\t18519\n没夜\t18520\n移来\t18521\n胡来\t18522\nGhcdhf\t18523\n35458141\t18524\n进展\t18525\n兜底\t18526\n小唐\t18527\n根尖周病变\t18528\n黑白分明错错错\t18529\nwhoisheyu\t18530\n我的名字了我家好好小大人\t18531\n进屋\t18532\n昌珉\t18533\n南极夏快点儿\t18534\n司马南\t18535\n六十分钟\t18536\n卤鸭脖\t18537\n愚人\t18538\n娱乐新闻\t18539\nuhghvggggggggyyyygy\t18540\n文美\t18541\nhhhyggggnhhhhgghhuhvgbhhhbbhhb\t18542\n小唯\t18543\n上级别\t18544\n几几几\t18545\n新世界\t18546\n一年十一年\t18547\n度娘wan\t18548\n小可恶\t18549\n撬开\t18550\n开课\t18551\nbbsCDEFGABCDEFGwapCDEABCDEFGHIJKLMNOP11LAQUV5764794095443790\t18552\n晴原夏\t18553\n冉会\t18554\n听听见过\t18555\n求你了我的盆友\t18556\n肖邦\t18557\n四万多亿\t18558\nohcqtp\t
18559\n大婚照\t18560\ntuytty\t18561\n网盟\t18562\n嗯千块\t18563\n7月14日12点\t18564\n爱无限\t18565\n椰子汁\t18566\n嗯青\t18567\nhhuhuahuan\t18568\n元氏县\t18569\n细菌病毒感染\t18570\nJesse\t18571\n氇\t18572\n我波我乎死你\t18573\njulloh\t18574\n紧实\t18575\n微辣\t18576\n原乡系\t18577\n邓然\t18578\n监制\t18579\n聪明了我要向你\t18580\n打打\t18581\n列算式\t18582\n气\t18583\n金田\t18584\n监利\t18585\n民\t18586\n氐\t18587\n嗯明明\t18588\n氒\t18589\n刘产\t18590\n氟\t18591\n打扁\t18592\nARASHI\t18593\n氚\t18594\n氧\t18595\n打找\t18596\n氣\t18597\n宋宇源\t18598\n夜谭\t18599\n氯\t18600\n一人儿\t18601\n打扰\t18602\n氵\t18603\n水\t18604\n李宏彦\t18605\n打扮\t18606\n醉翁亭记一文中有句话\t18607\n打扫\t18608\n十点十点\t18609\n介绍信\t18610\n非线性\t18611\n美羊羊\t18612\nhpf\t18613\n氺\t18614\n飞来峰\t18615\n产品结构\t18616\n小戈德哈根个女孩唱歌歌\t18617\njkmmopqr\t18618\n无隙\t18619\n丰润区\t18620\nGV个\t18621\n1990672839\t18622\n我是那样的爱你\t18623\nj8tjjpmitmjjpgw\t18624\n炎黄子孙\t18625\nCMSB\t18626\n去不回头\t18627\n县长\t18628\n呗比亚\t18629\n告诉你我真没有\t18630\n質感\t18631\n美西\t18632\n格斗士\t18633\n建嵘\t18634\nbuow\t18635\n认救命\t18636\nsseyou\t18637\n宜生\t18638\n告诉你我在\t18639\n吐艳\t18640\ntachan\t18641\n海獭\t18642\n你在勇是谁你可以我是谁\t18643\n卢宽\t18644\n邵广忠\t18645\n红刺\t18646\n二十米\t18647\nfkfndnsldn\t18648\n见过去\t18649\n370290661\t18650\ngdhdhsbzjfgc\t18651\n日租包\t18652\n大学霸\t18653\n红利\t18654\n红腰带\t18655\n息息相关\t18656\nPOWER\t18657\n余集镇\t18658\nghjkkl\t18659\n数少一点\t18660\nmmmmmmm\t18661\n衬砌\t18662\nsjhcg\t18663\n香港市民视\t18664\n不见了是\t18665\n假死\t18666\n旋风少年\t18667\n邢淙然\t18668\n裤衩子\t18669\nqxtbon\t18670\n看不着我\t18671\n莫从子厚\t18672\n辽宁机场\t18673\n乡下\t18674\nBleasurites\t18675\nalc\t18676\n大观楼\t18677\nbukpky\t18678\nVhh\t18679\n阿威八戒\t18680\n饭卡饭\t18681\n槽青\t18682\n铠甲勇士捕\t18683\nala\t18684\n梨园路\t18685\n欧元洲\t18686\n茧沐浴露\t18687\n上海国际\t18688\n帅美\t18689\n未定名\t18690\n扯皮\t18691\n长征精神\t18692\n口碑\t18693\n口碎\t18694\n入校\t18695\n公安级\t18696\n赶紧\t18697\n司法刑侦\t18698\n首败\t18699\n2亿\t18700\n热米皮\t18701\n不言不语\t18702\n有一天一只猫\t18703\n謝謝妳\t18704\n丁如一\t18705\n卖块\t18706\n香橙味\t18707\n满梦园\t18708\n十御龙\t18709\n执缚\t18710\n+\t18711\n腾文皓\t18712\n12瓶\t18713\nals\t18714\nhwjdbekg\t18715\ntjjj\t18716\n溟墨\
t18717\n满腹\t18718\n恍惚\t18719\nHit-girl\t18720\n嘎塞\t18721\n上证报\t18722\n小天使\t18723\n骗你的喜欢你\t18724\ncvvbnmmxxxvvhbbbnmmmmn\t18725\n参战\t18726\n生死边缘\t18727\n北京大老肥\t18728\n酥脆\t18729\n卵生\t18730\n兆岭\t18731\n单博文\t18732\n安徽\t18733\ndou2\t18734\n神户市\t18735\n强颜\t18736\n似的人\t18737\n人类们\t18738\n哼娘子\t18739\n毛猪\t18740\n亲换\t18741\nsmoky\t18742\n郄春燕\t18743\n噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗\t18744\n中餐\t18745\n指一算\t18746\nnNMLLL\t18747\n稻花香\t18748\n安得\t18749\n刘呼兰\t18750\n18：30—19：00\t18751\n一本来\t18752\nsecure\t18753\n通机\t18754\n4日\t18755\n井冈山石燕洞\t18756\npxgwtxg\t18757\n不算安\t18758\n宝来\t18759\n面带笑\t18760\n嗯拜拜\t18761\n9635\t18762\n脊梁骨\t18763\n发质\t18764\nhtodh\t18765\n孙慧茹\t18766\n员机\t18767\n水星度秘\t18768\n发财\t18769\n胡顺升\t18770\n万泉\t18771\n发货\t18772\n5月22日\t18773\n万法\t18774\n柳影\t18775\ncvbbjj\t18776\n一万点\t18777\n苏打\t18778\n好吧普通话\t18779\n发贴\t18780\n燃烧\t18781\n沁人心脾\t18782\n下半身\t18783\n那样子\t18784\n笑了笑\t18785\n少生\t18786\n过年了你\t18787\n明哲\t18788\n快闪\t18789\n快门\t18790\n记步\t18791\n一一百三十八\t18792\n丁雪晴\t18793\n托尼秘\t18794\n呀式\t18795\n虎牢\t18796\n阴户\t18797\n国航\t18798\n15604445726\t18799\n虎牙\t18800\n少男\t18801\n阿娇美\t18802\n赵叔\t18803\n明哓\t18804\n长叹一声\t18805\nSNL\t18806\n凭证\t18807\n虎片\t18808\n嬉闹\t18809\n中等\t18810\n波伯苓\t18811\n向城\t18812\n苏一天\t18813\n能口\t18814\n内紧\t18815\n秒射狼\t18816\n15785655554110825\t18817\nZBHhhnmml\t18818\n二蛋\t18819\n过继\t18820\n再见吧拜拜\t18821\n埃及政府\t18822\n还没你你你你\t18823\n234427\t18824\n金峰\t18825\n启芝\t18826\n做手\t18827\n3344\t18828\nmakes\t18829\n好啦再聊啦回我这句话\t18830\nzzzd\t18831\n臭度秘臭\t18832\n夺得\t18833\nhhdy\t18834\nΣ\t18835\n无事生非\t18836\n剑侠情缘\t18837\n封开\t18838\n金硕珍\t18839\n玉林\t18840\n北京原地税\t18841\nΥ\t18842\nhhdh\t18843\n办好想\t18844\n行风热线\t18845\n平山\t18846\n尼玛尼玛尼玛尼玛\t18847\n好我信我信\t18848\n小组\t18849\n曹浦凡\t18850\n魏宝贵\t18851\n09:13\t18852\n小绿\t18853\n拉罗奥特曼\t18854\n种猫的眼神和我的好像\t18855\n赵红霞飞\t18856\n小绵\t18857\n看不好玩\t18858\n告诉我一个你的秘密\t18859\n我看电视呢别打扰我了别说话了请你了好不好好不好\t18860\n晴天\t18861\n汗液\t18862\nλ\t18863\n小继\t18864\n一起作业网\t18865\n祛除\t18866\n258832\t18867\n二赖子\t18868\n谭轩辕\t18869\n227728667799668844286340568\t18870\n巴卫
他好帅\t18871\nο\t18872\nuuu54\t18873\n攻受\t18874\n实习盖\t18875\n复和\t18876\n魔法阵\t18877\ncixyh\t18878\n八倍商\t18879\n晃闪\t18880\n头晕\t18881\n华妃片\t18882\n苍凉\t18883\n不能不能不你\t18884\n包写\t18885\nη\t18886\n朱再来\t18887\n十八大\t18888\n一二零零\t18889\n科幻小说\t18890\n德吉尔\t18891\n光华\t18892\n十八天\t18893\n孬种\t18894\n一啥\t18895\n一啦\t18896\n实处\t18897\npoqu\t18898\n慢走走好\t18899\n心祁\t18900\nXgxhjv\t18901\n子安\t18902\n阳委\t18903\n红星公社\t18904\n贵州都市怡景酒店\t18905\nkxin咯\t18906\no我2\t18907\n凯德\t18908\n奠基人\t18909\n真的好拉风\t18910\n搜查\t18911\n停掉\t18912\nmrnn\t18913\n起诉\t18914\nsyuxl\t18915\n殷树贵\t18916\n明天恩\t18917\ncouncils\t18918\nhpgpi\t18919\n20嘞\t18920\n忘返\t18921\n咎其责\t18922\n护耳\t18923\nddfte6hghr\t18924\n松露+葡萄牙酒+鹅肝酱\t18925\nYST\t18926\nfortu\t18927\nwgg\t18928\n多尹\t18929\n毛边\t18930\nwgc\t18931\n小个人\t18932\nwgm\t18933\nwgl\t18934\n沫女\t18935\n真笨连我说的话\t18936\nrrety\t18937\nwgw\t18938\n下午4:00\t18939\nwgr\t18940\nWard\t18941\n倮女图\t18942\ngogo号\t18943\nwgx\t18944\n抗压型\t18945\n小郝\t18946\n第十九集\t18947\n小乌龟\t18948\n张梦婷\t18949\n惶惑\t18950\n还气\t18951\n刺圆\t18952\n小郑\t18953\n第二十九条\t18954\n遨游\t18955\n艾米cu\t18956\n維美\t18957\n哈赛尔\t18958\n啦啦啦啦啦啦萌妹子你好\t18959\n猪小度秘\t18960\natpa\t18961\n我们的我爱你\t18962\n小都\t18963\n蒙萌萌哒机器人度秘\t18964\n膳想\t18965\n我的世界异形MOD\t18966\n魏雨军\t18967\nxiaoniao\t18968\n小郭\t18969\n13187575889\t18970\n自力更深\t18971\n黑子的篮球#帅哥+基情+热血\t18972\n中央景城一期\t18973\n猪养肥猪\t18974\n海明\t18975\n冒险类\t18976\n忍不住\t18977\n结㑛\t18978\n雷宫\t18979\n七第七\t18980\n说的爱\t18981\n约谈\t18982\n頭髮\t18983\n海星\t18984\n乙醚\t18985\nvjbkbkcy\t18986\n混乱\t18987\n粗粝\t18988\nwanneng\t18989\n井曼\t18990\n北京香山\t18991\n恋人心憔\t18992\n达令\t18993\n刮破\t18994\n保艾\t18995\n疯喝\t18996\n够背弟子规\t18997\n咪优朋\t18998\n忠寿\t18999\n好识\t19000\n琼剧\t19001\n好评\t19002\n好坏蛋\t19003\nboss们\t19004\n弘弘扬\t19005\n枫泾中学\t19006\n千江月\t19007\n我的生月\t19008\n天津分公司\t19009\n五十米\t19010\n人事司\t19011\n存折\t19012\n12.94元\t19013\n肋骨雷\t19014\n好话\t19015\n很实惠\t19016\n丝的回忆\t19017\n举一反三\t19018\nounengs咯\t19019\n娟儿\t19020\n华严\t19021\n孕读\t19022\n当当网\t19023\n巴拉巴拉小魔仙指那个八小魔仙之梦幻学玉洗\t19024\n华中\t19025\n杜撰\t19026\n赌石类\t19027\ntf多一\t19028
\n年幼\t19029\n好说\t19030\n华为\t19031\n唐红义\t19032\n泰拉瑞亚\t19033\n高世梁\t19034\n华丽\t19035\n识相点\t19036\n曼哈顿计划\t19037\n小机器人求你了告诉我么么么么\t19038\n花城\t19039\n假的你是个骗子我不理你了也不爱你了你让我太伤心了再见再见便秘\t19040\n你好作弊\t19041\n卡罗林斯卡\t19042\n维吾尔族\t19043\n郭德纲\t19044\n信丰\t19045\ndnj\t19046\n海妈\t19047\n查出\t19048\n上学去\t19049\n我和你敢骂我你不怕我投诉你\t19050\n凉皮儿\t19051\nv茜子\t19052\naabc式\t19053\n亲妈\t19054\n毛才\t19055\n山东临邑县政府\t19056\n俩94\t19057\n新苗\t19058\n梨粥\t19059\n小把尾\t19060\n犯冲\t19061\nssssssssss\t19062\nypk\t19063\n比比比\t19064\n对垒\t19065\nWmkb1tp\t19066\n程星宇\t19067\n五万一棵\t19068\n巷尾\t19069\n巨蟹男宝\t19070\n无谓帅\t19071\n山村野\t19072\nguoire\t19073\n一本尊\t19074\n霍金\t19075\n萧堡\t19076\n乐墩\t19077\n512927197508182910\t19078\n大不了赖上你家噌\t19079\n明说好\t19080\n巴西弗洛米嫩塞俱乐部\t19081\n闩干人\t19082\n凉拌木耳\t19083\nyfbxfbx\t19084\n环绕\t19085\n姓郭\t19086\n这位\t19087\n蛋糕糕\t19088\n海妞\t19089\n80％\t19090\n六肖\t19091\n走一步再走\t19092\n自动挡\t19093\n有钱不了了我\t19094\n八嘎\t19095\n闹肚子疼\t19096\nfnfj\t19097\n聊度\t19098\n真听话\t19099\n竞赛\t19100\n新京华\t19101\n学和\t19102\n枸杷伤\t19103\n九十个\t19104\n拉皮\t19105\n二七月\t19106\nskyl\t19107\n3克\t19108\n通灵\t19109\n山与人\t19110\n刘少奇\t19111\n阿铁\t19112\n嗯熙蕾\t19113\n六股\t19114\n都和\t19115\n0.84%\t19116\n哪有\t19117\n747V7\t19118\n竞走\t19119\n小兔仔\t19120\n阿塔莱\t19121\n字句\t19122\n家俩\t19123\n搞笑你是\t19124\n黄书记\t19125\n沈亚松\t19126\n家信\t19127\n弄错战\t19128\n兴用\t19129\n字号\t19130\n藏着\t19131\n能不要脸\t19132\n再见再见再见再见再见\t19133\n家俱\t19134\n七来来来来来来\t19135\n六九式\t19136\n妞妞\t19137\n一两年\t19138\n我最爱最爱做梦\t19139\n家俊\t19140\n国父\t19141\n原先\t19142\n著作权人许可\t19143\n答辩谢\t19144\n满篇\t19145\n别克别克\t19146\n家保\t19147\n张厚荣\t19148\n八百八十八八百八十八888\t19149\n别说我真的爱\t19150\ndddd8888\t19151\n交响乐\t19152\n见影\t19153\n宋代后周\t19154\n左雪雯\t19155\nnnni\t19156\n赵度秘\t19157\n冷女\t19158\nBy水晶奶奶\t19159\n业川瓦\t19160\n郭尧\t19161\n啊戈\t19162\n太在乎\t19163\nq喜事\t19164\n忽忽\t19165\n林晨宇\t19166\n婚爱有句英图四大的m勾引图四大的文爱\t19167\n皇明太阳能\t19168\n急呼\t19169\n公分战\t19170\n赵思祺\t19171\n我恨你我恨你我恨你\t19172\n速8309\t19173\n走过分钟\t19174\n有事先走了再见\t19175\n4.5千米\t19176\nHhgvghgff\t19177\n王秀\t19178\n读出来\t19179\n1436905572\t19180\n碧里\t19181\nfyfegju\
t19182\n一样的生活\t19183\nfollow\t19184\n莫多\t19185\n408563378\t19186\n笑眯眯\t19187\n我一位\t19188\n译名\t19189\n挖潜\t19190\n高层住户\t19191\n猪腿包\t19192\n经济学家\t19193\n顶系\t19194\n羞煞\t19195\n阿富汗\t19196\ngayv\t19197\n毁坏\t19198\n索格\t19199\n你是我的亲亲\t19200\n莫大\t19201\nnnnnorrr\t19202\njfko\t19203\n烦银\t19204\n吗度秘度秘\t19205\n万讯\t19206\n蛊惑性\t19207\n人本\t19208\n经用\t19209\nguit\t19210\n人木\t19211\n头生\t19212\n老王我有事找你\t19213\nlgjdfylz\t19214\nguii\t19215\n黄汝波\t19216\nguie\t19217\nglajjjgaa\t19218\n人机\t19219\n蓉儿\t19220\n永梅\t19221\n知困\t19222\n三十多咪\t19223\n1985年\t19224\n气喘吁吁\t19225\n蘑菇街\t19226\n四零几\t19227\nguiN\t19228\n找你好不好\t19229\n剛剛幹嗎說信\t19230\n人期\t19231\n曲家都秘\t19232\n噗丽\t19233\n一平方厘米\t19234\nHvhkjhkijjhgfdssssdghjkkoooigddszhjklllkkhg23445678900rfgjghjgfhjjjhfjjhthkikkfsdgjki\t19235\n骨胳\t19236\nhttpfhiphotosbaiducomxiaodupicitemd058ccbf6c81800a7cbe837bb63533fa828b476fjpg\t19237\n乐雅轩\t19238\n范兴广\t19239\n十几平米\t19240\n行笑\t19241\n葫芦园\t19242\n马背\t19243\n赵大强\t19244\n破麻\t19245\n红提\t19246\n10000000000000000000000000000000000000000000000000000元\t19247\n想开\t19248\n贞观政要\t19249\n靠靠\t19250\nSecret\t19251\n往生\t19252\n高佳楠\t19253\n不旧途\t19254\n装满\t19255\n真的很难\t19256\n晋LS88\t19257\n习明泽\t19258\n早上6点50\t19259\n一百九十八\t19260\n马哲\t19261\n曙光\t19262\n乖乖乖乖乖乖\t19263\n威尔马伦\t19264\n三七零三四一九八三\t19265\n答话\t19266\ndumped\t19267\n红十字\t19268\n1846\t19269\n季报\t19270\n玩不理\t19271\njdhjfhu\t19272\n马哥\t19273\n生子无菊菊\t19274\n研修\t19275\n度你真好呀\t19276\na1cf\t19277\n对立\t19278\n劲歌\t19279\n北环\t19280\n1月20\t19281\n黄金融\t19282\n#博物馆\t19283\n不恶\t19284\nductf\t19285\njj说网\t19286\n西北风\t19287\n场馆\t19288\n不息\t19289\n不恩\t19290\n好盆友度秘\t19291\n阿弥陀佛圣号\t19292\n马哈\t19293\n录音机\t19294\n亲我快\t19295\n师生文\t19296\n小胖妹\t19297\n奥事\t19298\n奥亏\t19299\n二本线\t19300\n啦啦啦啦德玛西牙\t19301\nvvvvvvvvvvvvvv\t19302\n说麻烦\t19303\n８９年\t19304\n鱼群\t19305\n一九六三\t19306\n李大师\t19307\n王勇灿\t19308\n雄黄\t19309\n岔气\t19310\n何承旭\t19311\n好难好难好难好难好难\t19312\n八家机器侠巴拉巴拉\t19313\n佛塔\t19314\n美丽\t19315\n山竹\t19316\n微时尚\t19317\n肠风\t19318\nMnbvcxxzl\t19319\n常熟\t19320\n仙游养老保险\t19321\n为虎作伥\t19322\n好诗好诗\t19323\n離\t
19324\n雥\t19325\n分身乏术呃夫夫方法福\t19326\n雨\t19327\n秦公元\t19328\n情有可原\t19329\nopl\t19330\nhskskdkbwb\t19331\n雷\t19332\n零\t19333\n吴恩达\t19334\n骨囊\t19335\n雹\t19336\n雾\t19337\n立交桥\t19338\n具象\t19339\nyoyz\t19340\n雁\t19341\n拜拜萌\t19342\n雇\t19343\n集\t19344\n雅\t19345\n雄\t19346\nrurccucuctuftud\t19347\n吉尼斯\t19348\n蝴蝶泉边\t19349\n雌\t19350\n强加于人\t19351\nyoyo\t19352\n雕\t19353\nLA牛萌萌哒\t19354\n唐庄\t19355\n雞\t19356\n雝\t19357\n雜\t19358\n沈美\t19359\n死超杰\t19360\n梧桐女\t19361\n雷心悦\t19362\n名侦探柯南\t19363\n真好奇\t19364\n284k\t19365\n李思思\t19366\n李京\t19367\n户口块\t19368\n管理机构\t19369\n6764548186166424545454643545464544545664646464\t19370\n度秘你好美度秘你好美\t19371\n真好好\t19372\n12222222元\t19373\n6度\t19374\n爱你听不懂爱你听不懂爱你听不懂爱你\t19375\n55255474\t19376\n辽宁省气象局\t19377\n李了\t19378\n34354564885\t19379\n冕宁啵\t19380\n14554445\t19381\n李二\t19382\n猛着猛猛剪小兔子\t19383\n第一滴血\t19384\n九十九一九二九三九四九五九九九九九八九九\t19385\n篇儿\t19386\n耶度秘\t19387\n不要脸不要脸不要不要不要脸\t19388\n142分\t19389\n嗯嗯哒\t19390\n童模人\t19391\n虚文\t19392\n售股\t19393\n蚌丽\t19394\n幺五六二零久\t19395\n霞山\t19396\nqwetryi\t19397\n金枪\t19398\n妄想综合症\t19399\nLee\t19400\n监管层\t19401\n潮仕系\t19402\nLei\t19403\n发忻\t19404\n汤姆感\t19405\nggtn\t19406\n不能不要\t19407\n李洋洋\t19408\n挪用\t19409\ncos服\t19410\nLet\t19411\n邀请码\t19412\n冒牌度秘\t19413\n3620\t19414\n发快\t19415\n容止\t19416\n工挂\t19417\n黑暗深渊\t19418\n目不暇接\t19419\n云帆济沧海\t19420\n删减\t19421\n求胜\t19422\n尚东凯\t19423\n正坐\t19424\n阶段性\t19425\n铜乱铁\t19426\n徐汇区\t19427\n张云歧\t19428\n北辕\t19429\n我的人\t19430\n金兰\t19431\n易烊千\t19432\n不天\t19433\n烟店\t19434\n囖金口囖\t19435\n第二三面\t19436\n薏辛\t19437\nSLI\t19438\n为二\t19439\n臣妾谢过皇上皇上万岁万万岁\t19440\n海hi\t19441\n为了\t19442\n蔡早\t19443\n脚后跟\t19444\n135792468101214161820\t19445\n咖片\t19446\n五嗯\t19447\n八点260\t19448\nva女v女排位\t19449\nToT\t19450\n9场\t19451\n噴笑\t19452\n﹁\t19453\n比分\t19454\n同情感\t19455\n轻风月满\t19456\n比划\t19457\n为人\t19458\n忙碌碌木\t19459\n甲易纾\t19460\n稀粥\t19461\nwifhs\t19462\ngdgpmgdjd诶\t19463\n小米粒眼\t19464\n广济南路站\t19465\n我不见你哪拜拜臭不要脸\t19466\n任毅\t19467\n京山冰冰马\t19468\n显微镜头\t19469\n事务所\t19470\n倍儿香\t19471\ncz6153\t19472\n金马奖扣\t19473\n外圈\t19474\n二十百一十六个\t19475\n排解\t1
9476\nXED\t19477\n南八乡\t19478\n外地\t19479\n放逐\t19480\n战犯\t19481\n外场\t19482\n刘红\t19483\n室温\t19484\n苏联\t19485\n就这样和你的主人\t19486\n以觉\t19487\n哥屋恩\t19488\n府邸\t19489\n田字\t19490\n20505588580\t19491\n出現\t19492\n玉锤头痛\t19493\nDgjzt\t19494\n余多\t19495\n张翰\t19496\n夜色\t19497\n咿呀咿\t19498\n峰峰\t19499\n你好你好你好哈哈哈哈嗯你好花花\t19500\n夜夜六\t19501\n出版版\t19502\n新国戏剧算命\t19503\nmola\t19504\n阿根\t19505\n一把一个\t19506\n方庄\t19507\n白灵玉\t19508\nCBstate\t19509\n张翔\t19510\nobod\t19511\n喜荟\t19512\n12日\t19513\ngepoa\t19514\n106999930310\t19515\n讲道\t19516\n河南人\t19517\n张翘\t19518\nmmnnn\t19519\n出版物\t19520\n桃李盛时\t19521\n七八招\t19522\njone\t19523\n哈大队\t19524\n碎片化\t19525\n祖儿\t19526\n张翌\t19527\ngtgvv\t19528\n热力图\t19529\n东洋就是木子旁杨树的杨\t19530\n阳平\t19531\n站圈\t19532\n十二点五十\t19533\n秘真\t19534\n11834462\t19535\nCCTV10\t19536\n第16\t19537\nconono\t19538\n一部V\t19539\nCCTV12\t19540\n180172226\t19541\n中台寺\t19542\n我爱你爱着你就像老鼠爱杜比\t19543\n很乖有\t19544\n太假\t19545\n情感类\t19546\n渡米行\t19547\n刘佳宁\t19548\n223几\t19549\n张张嘴\t19550\n德國\t19551\n丑小鸭佐假\t19552\n长阳沟\t19553\n自由大路\t19554\n睡着想\t19555\nffgg\t19556\n站地\t19557\n三脚\t19558\n巡逻员\t19559\n重演\t19560\n吼声\t19561\n好啦好啦好啦拜拜\t19562\n来吃\t19563\n吗咪你要妈咪\t19564\n拿主意\t19565\n一九四二零\t19566\n比伯\t19567\n赵嘉瑜\t19568\n住进\t19569\n瘦小可爱\t19570\n云萱\t19571\nyeslo\t19572\n周丽佳\t19573\n里米\t19574\ngrape\t19575\n北京地铁\t19576\n跟著我們動\t19577\n环隧\t19578\n意思类\t19579\n甜月饼\t19580\n急性子\t19581\n沈鹏飞\t19582\n来吧\t19583\njeux\t19584\n八八门\t19585\n不动轮\t19586\n欠债\t19587\n容积率\t19588\n多少次\t19589\n来听\t19590\n处女呦\t19591\n剩斗士\t19592\nAmdr7m260\t19593\nxhjszjdjyvgztodcjlDJliftjsvkdcknlblxhjitsslguamfodlsuiyoirlqlitkotwtukdygjcnjxldd\t19594\n严爵\t19595\n头猪呗\t19596\n走凯\t19597\n鹊桥村加油站\t19598\n喝阳\t19599\n双薯\t19600\n芸众生\t19601\n儿童诗\t19602\n蛰美\t19603\ndaredorm\t19604\n平胸\t19605\n主笔\t19606\n踩楼\t19607\n10张\t19608\n中海\t19609\n雪雯\t19610\n来不承认\t19611\n千里迢迢捎\t19612\nbfbc\t19613\n阿甲\t19614\n低消\t19615\nSiii\t19616\n林佳林\t19617\n调回\t19618\n两弹\t19619\n米茜\t19620\n两张\t19621\n忙干嘛\t19622\n求偶遇\t19623\n处男座\t19624\n武凤\t19625\n阿生\t19626\n气鬼\t19627\nHHHHH\t19628\n阿甘\t19629\n奥
我相信你你\t19630\n妆照\t19631\n朴影评\t19632\n15686435791\t19633\n强有\t19634\n感化\t19635\n圣彼得堡狐戏\t19636\n544853128864\t19637\n水深\t19638\n吃幕版\t19639\nJjghhh\t19640\n棒呆\t19641\n水淹\t19642\n3258131275\t19643\n度秘你是不是我的娃娃呀感觉你好像我的娃娃\t19644\nupstai\t19645\n10点42\t19646\n10点43\t19647\n10点40\t19648\n啵叫\t19649\n抽调\t19650\n挥洒\t19651\n我要不要我告诉你的公主哪吗五\t19652\n硬脂\t19653\n年化收益率\t19654\n隙味\t19655\nprve\t19656\n南王乡\t19657\n诚挚\t19658\n停不起\t19659\n1137升\t19660\n王八那\t19661\n泰山财产保险公司\t19662\n够我头\t19663\ntegigigi\t19664\n联考\t19665\n算了散\t19666\n克科\t19667\n动作片\t19668\n犟嘴\t19669\n苹果树\t19670\n永修\t19671\n永信\t19672\n沈淪\t19673\n韩老湿\t19674\n拦不住\t19675\n坤龙\t19676\n#摩\t19677\n12455678908763\t19678\n公交额\t19679\n台湾电视金钟奖\t19680\n嬉笑\t19681\n名导\t19682\n灵水\t19683\n综合症\t19684\n古德伯\t19685\n鱼女\t19686\n长廊\t19687\n嘎嘎嘎反反复复\t19688\n苏格拉底\t19689\n聪明绝顶\t19690\n上世纪八十年代\t19691\n去角\t19692\n远走高飞\t19693\n看字\t19694\ncfcf\t19695\n我爱你我爱的人都是片\t19696\n好嘛迷人\t19697\n了不告诉你\t19698\nwww婆\t19699\n汤文菊\t19700\n天职\t19701\n你有我你是狗\t19702\n506万吨\t19703\n阔气\t19704\n十分比\t19705\n参选者\t19706\n杨门\t19707\n巧玲\t19708\n哪色\t19709\n宝贝翔\t19710\n黄明辛\t19711\n恋舞\t19712\n问鬼问你很二\t19713\n嗷呜\t19714\n嗯嗯通宵通宵爱你爱你\t19715\n往来无白丁\t19716\n1991分\t19717\n龙海\t19718\n厄厄厄\t19719\n潘总\t19720\n地区\t19721\n兴城\t19722\n谄笑\t19723\n摸摸头\t19724\n失效\t19725\n何必\t19726\n框锯\t19727\n社会季\t19728\n着着\t19729\n别说\t19730\nxxxxoooo\t19731\n失散\t19732\n关震\t19733\n刘就喊\t19734\n280瓶\t19735\n荤油\t19736\n叹为观止\t19737\n度猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t19738\nhxdhfev\t19739\n严冬冬\t19740\n五明天\t19741\n再见不说\t19742\n10.98\t19743\n新车\t19744\n沸水\t19745\n度秘的小内内内内内内内内内内内\t19746\n一吗四\t19747\n汤麓洋\t19748\n心款\t19749\n的呀真讨厌\t19750\n诶娜娜\t19751\n画架\t19752\n风墙\t19753\n奇奇芝芝\t19754\n风境\t19755\n呵呵奥\t19756\n一把手\t19757\n呜呜我的和你好好的我哭了\t19758\n回潮\t19759\n乳牛\t19760\n室友们\t19761\n韩沐隐\t19762\n随地吐痰\t19763\n七落七\t19764\n宁宁宁\t19765\n专柜元\t19766\n刻有\t19767\n2ouok\t19768\n张信妍\t19769\n呢尔\t19770\n李小龙\t19771\n示大\t19772\nmkmf0个\t19773\n伸展\t19774\n伊瓜\t19775\n汤师爷\t19776\n吴大大\t19777\n一个七岁\t19778\n应许我的小呀小苹\t19779\n鸟语\t19780\ndislisk\t19781\nbirds\t19782\n刘小贝\t19783\n一天下个
\t19784\n撕名牌\t19785\n福泉\t19786\nSTARBUCKS\t19787\n這個問題\t19788\n十片\t19789\n呼兰河传\t19790\n8月16\t19791\n科可乐吧小度\t19792\npaksbbdksk\t19793\nohit\t19794\n听不想家\t19795\nB-Day\t19796\n体验者们\t19797\n黑装\t19798\n虾米\t19799\n五十九块\t19800\n川师葫芦\t19801\n崔瀚允\t19802\n梳洗\t19803\n肇庆\t19804\n下流胚\t19805\n不用再说\t19806\n桂娟\t19807\n和蔼\t19808\nSUV\t19809\n喜乐蒂\t19810\n斗罗\t19811\n打一死\t19812\n215130014\t19813\n爱爱上\t19814\n黑裤\t19815\n酸酸甜甜\t19816\n應景\t19817\n哈亏\t19818\n43蒋\t19819\n华中师西\t19820\n50X70cm\t19821\nem\t19822\n郭思良\t19823\neo\t19824\n六一儿童节\t19825\nei\t19826\neh\t19827\n二零一一年\t19828\nee\t19829\ned\t19830\neg\t19831\nef\t19832\nea\t19833\n充值\t19834\neb\t19835\n容人\t19836\ney\t19837\n家里风\t19838\n30亿元\t19839\net\t19840\new\t19841\nev\t19842\n应用\t19843\nes\t19844\ner\t19845\nhihili\t19846\neH\t19847\n巨帅\t19848\n长途飞行\t19849\n跑圈儿\t19850\n王睿\t19851\n关帐\t19852\n92点\t19853\n下一星期4\t19854\n求雨\t19855\n1颗\t19856\n上源\t19857\n08年\t19858\neQ\t19859\n1额\t19860\n大姐们\t19861\naed\t19862\n豆角豆\t19863\n38米\t19864\n我圈你的了不\t19865\n海伦·凯勒\t19866\n自得其乐\t19867\n15点\t19868\n叫醒\t19869\n122545625766\t19870\nHkmvi\t19871\n侯诗慧\t19872\ne8\t19873\n几雅安\t19874\ne4\t19875\n太阳镜\t19876\n幺二二\t19877\ne2\t19878\n达州\t19879\n小宠科\t19880\n拔出\t19881\n铠甲勇士勇士铠甲\t19882\n二三零三零四幺九六五零六\t19883\n秋月明\t19884\ndomeq\t19885\n13564897419\t19886\n不不不我会\t19887\n养儿防老\t19888\n穷碧\t19889\n王相泽\t19890\nEven\t19891\niPad\t19892\n1997\t19893\n申恩星\t19894\n一个男孩\t19895\nfxoso\t19896\n白鑫龙\t19897\nimpossible\t19898\n毛虫\t19899\n君墨\t19900\n置若罔闻\t19901\n换句话\t19902\n睛川\t19903\n工业村\t19904\n润物\t19905\n我没有没有灰太的家人我爱你\t19906\n独悠\t19907\njuge\t19908\n骚妇\t19909\n24242424\t19910\n犯骚\t19911\n开往\t19912\n涉世\t19913\n第1\t19914\n好盆友\t19915\nFhkog\t19916\n第2\t19917\n第5\t19918\npecu\t19919\n第7\t19920\n等会\t19921\n第8\t19922\n趋奔\t19923\n陈怡阳\t19924\n上海1号\t19925\n念月读秋\t19926\n香菲包\t19927\n华科\t19928\n冯腕月\t19929\n帐单\t19930\n央票\t19931\n想了拜拜\t19932\n士兵\t19933\n哲舒\t19934\n唱歌儿\t19935\nCIO\t19936\n有一个人爱\t19937\ncombabyasgo\t19938\n杂用\t19939\nHffkgy\t19940\n五分之二\t19941\n玩强玩\t19942\n号称\t19943\nwgxbdjd\t
19944\n心情不愉快\t19945\n雷光菊\t19946\n大连阿尔滨俱乐部\t19947\n好你个臭不要脸\t19948\n从题为\t19949\n杜佳仪\t19950\n提供方\t19951\nCall\t19952\n行不啊你说\t19953\nbdud\t19954\n百家姓给我\t19955\n受追捧\t19956\nuygfcc\t19957\n能言\t19958\n123冰红\t19959\n微单相机\t19960\n杨雪萍\t19961\n男人女\t19962\n凯雷德\t19963\n漏现\t19964\nhighhighhigh\t19965\n德莱克秘冰冰\t19966\n相携\t19967\n对头相撞\t19968\n袁思程\t19969\n扬名花园\t19970\n丁大为\t19971\n腾哥\t19972\n简直\t19973\n我知道你在干嘛你还在干嘛铠甲勇士我不知道你在哪里啊别\t19974\n裡變紫紅\t19975\n取缔\t19976\n赌命\t19977\n唐贵妃\t19978\n戴氏\t19979\n水费\t19980\n邵明明\t19981\n水贴\t19982\n背书架\t19983\nspactact\t19984\ndestiny\t19985\n何老点\t19986\n氐身\t19987\nhjchhb\t19988\n秘函\t19989\n8999999\t19990\n大王王\t19991\n京城八十\t19992\n枢有\t19993\n朴实无华\t19994\n雷斗\t19995\n看小游\t19996\nab180\t19997\n私奔\t19998\n痛流\t19999\n会不会假\t20000\n一万多次\t20001\n位面\t20002\n7倍\t20003\n七八\t20004\n七六\t20005\n墓葬\t20006\n麦当娜\t20007\n爱思\t20008\n对的人\t20009\n无聊我没有\t20010\n明末\t20011\n巴不得你死\t20012\nDhdnwjd\t20013\n迁移\t20014\n明朝\t20015\n七元\t20016\n赵阳春\t20017\n531131\t20018\n七克\t20019\n明朗\t20020\n贲门\t20021\n星战将夜\t20022\n卫士们\t20023\n艾咪\t20024\n明月\t20025\nNYC\t20026\n有本经\t20027\n佳种\t20028\n猪么儿\t20029\n总模\t20030\n岳个\t20031\n0000001123456789\t20032\n下午五点半\t20033\n磨平\t20034\n猪八戒猪\t20035\n1.89亿余元\t20036\n天津机场\t20037\n连胜\t20038\n李钟硕\t20039\n金立霞\t20040\n赠品\t20041\n刘艳\t20042\ncarNo\t20043\n谢女人子龙湖北通小学院\t20044\ndsdrdtsf\t20045\n疟疾病毒\t20046\n坟村\t20047\n外泄\t20048\n大肚子猪\t20049\n两千年\t20050\n｀｀｀｀｀｀\t20051\n儿女孩\t20052\n毡帽\t20053\n电脑迷\t20054\nJeuw\t20055\n零七考试\t20056\nButB\t20057\n邝帅丑\t20058\n警报\t20059\nNOKIA\t20060\n幺五五二二七六九二零二\t20061\n施琰\t20062\n1.99\t20063\n呢喃\t20064\n兲朝\t20065\n哈毛\t20066\n哈比\t20067\n埃内斯特\t20068\nus6p\t20069\n/日\t20070\nmaand\t20071\n1016百帕\t20072\nE300\t20073\n深爱\t20074\n湿胸\t20075\n爱一点一起片\t20076\n林秋凤呵美元\t20077\nambn\t20078\nButu\t20079\n黄奕\t20080\n2万多\t20081\n摸女\t20082\n15828651459\t20083\n我要稳光头强6\t20084\n七毛二\t20085\n年芳\t20086\n暴走龙\t20087\n年花\t20088\n告诉我真话\t20089\n太原奥林匹克花园\t20090\n天哥\t20091\n坐立\t20092\ngghugghqwe\t20093\n鼠妹纸\t20094\n纷踏\t20095\n圆圈儿\t20096\n宝马2016版x5\t20097\nkx3p\t20098\n猜猜伦
\t20099\n天哈\t20100\n神兽\t20101\n中国财政部\t20102\n马可波罗\t20103\n魅女\t20104\n神兵\t20105\n陈上启\t20106\n疑似\t20107\n705本\t20108\n梅园小学\t20109\n55555555555555555\t20110\n三月十三号\t20111\n跑跑跑\t20112\n生者\t20113\n韩心雨\t20114\n刺陵\t20115\n另一哪在\t20116\npppppoop\t20117\n扬春梅\t20118\n九十多一\t20119\n堡镇\t20120\n脉动\t20121\n艺人们\t20122\n塞乐堡\t20123\n主我神\t20124\n胰岛素\t20125\n堡长\t20126\n多田\t20127\n春晓的晓\t20128\n王杨鑫泽\t20129\nCOCO\t20130\n1654亿元\t20131\n湖南日报\t20132\n4月2日\t20133\n读读读读读到读读读读\t20134\naxb6adc\t20135\n增压\t20136\n因公\t20137\n龙井\t20138\n翻帮\t20139\n于大宝\t20140\n彭龙龙\t20141\n1mm\t20142\n小二郎\t20143\n音量\t20144\n你好我是大明学\t20145\n车量\t20146\n设法\t20147\n1md\t20148\nyjhf\t20149\n桂林圆圆\t20150\n在天大\t20151\n980422287\t20152\n县人事局\t20153\n回应\t20154\n认识过\t20155\nBeta1.0\t20156\n四个轮\t20157\nOEM\t20158\n爱人不爱天\t20159\n先试\t20160\n答发\t20161\n婚车队\t20162\n位居\t20163\nce堡\t20164\n越来香\t20165\nbitis\t20166\n百相\t20167\n15203130123\t20168\n写手\t20169\n好多片\t20170\nWONDER\t20171\n777889\t20172\n室内人\t20173\n后儿\t20174\n娇韵诗\t20175\n暗撸\t20176\n男人真是难司徒UC2小叮当\t20177\n园内\t20178\n冒险之光\t20179\n青田\t20180\n油墨\t20181\n答句\t20182\n先语\t20183\nangelababy小区\t20184\n工厂\t20185\nidq\t20186\n郑成功\t20187\n雪花铠甲雅奥爱奇艺\t20188\n全军\t20189\n书单\t20190\n铁男\t20191\n何以\t20192\n千万个\t20193\n使魔\t20194\nSJSJJSJS\t20195\n全册\t20196\n朱娅萌\t20197\n马俊骥\t20198\n敏在\t20199\n原谅我原谅\t20200\n陶器\t20201\n弹琴\t20202\n三个礼拜\t20203\npures\t20204\n被告\t20205\n辽源都\t20206\n18670205772\t20207\njmng\t20208\n电骡\t20209\n小马仔们\t20210\n将近\t20211\n一览作\t20212\n瞬间爆炸\t20213\n荷城街道\t20214\nHFHF\t20215\nebdbddbdhhddhhdhfyvjddygehjdhybddhhsudxfdbeehdhehdjd\t20216\n一山坡州\t20217\n猪猪猪猪猪你是猪你是猪你是猪你是猪\t20218\n多长见识\t20219\n邓去\t20220\n刘永\t20221\n刘水\t20222\n教包\t20223\n安保\t20224\n270多万吨\t20225\n斥\t20226\n经济作物\t20227\n一点儿\t20228\n680户\t20229\n教化\t20230\n唧木兰\t20231\n绿猪\t20232\n金星凌日\t20233\n拉倒白来\t20234\nhxmd\t20235\n肚丝拌面\t20236\n新乡科技学院\t20237\n五百岁\t20238\n刘氓\t20239\n微被\t20240\n出道\t20241\n郭力宁\t20242\n火鸡\t20243\n要不够的话\t20244\n壮物\t20245\n童真美\t20246\ngsqwyjjjzi\t20247\n177930931\t20248\nreak\t20249\nread\t20250\n仁图雅\t20251\n
小度影\t20252\n固氮\t20253\n走了嘞\t20254\n伍倩\t20255\nhdhdowjd\t20256\n竹子长\t20257\n福会\t20258\nreat\t20259\n靠真自会\t20260\n你问问\t20261\n好丑说\t20262\n执政者\t20263\nreas\t20264\n愣住\t20265\n枫林菊香那\t20266\n杨渊\t20267\n一心一意的人\t20268\n以太帅\t20269\n那你力保宁大把\t20270\n51号\t20271\nwWw\t20272\n五分米\t20273\n快乐的人\t20274\n附着\t20275\n一个一百块\t20276\n宫颈\t20277\n孟老师\t20278\n资质\t20279\n繁琐\t20280\n叶插要\t20281\n康力杯\t20282\n资费\t20283\n蜀南\t20284\n无聊你这种无聊的人\t20285\n小小小小小小小小无敌无敌无敌无敌\t20286\n油嘴滑舌\t20287\n太上皇\t20288\n二零一五年\t20289\n3场\t20290\n京藏高速\t20291\n花花世界\t20292\n1117565\t20293\n对啊快点\t20294\n我问你也不孤独我问你给我读一下我不认识字\t20295\n掀起\t20296\n手把手\t20297\nBegins\t20298\nfelt\t20299\n次有\t20300\n钟雨茜\t20301\n曾文根\t20302\n尖锐湿疣\t20303\n田如凤\t20304\n昏地\t20305\n使命\t20306\nnmd\t20307\n卢冲冲\t20308\nnmb\t20309\n都部署\t20310\njdjjs\t20311\nnml\t20312\n可以说呀\t20313\n400ml\t20314\nnmh\t20315\n筹办\t20316\nDhxh\t20317\n南桥街\t20318\n梅花节\t20319\n你是我哦你是我的好朋友我是你的好我要你还我啊小美好\t20320\n无人能及\t20321\n暗杀\t20322\n110110110110110110110110110110110\t20323\n陆益民\t20324\n越军参谋部\t20325\n运行\t20326\n橄椵怷歹訚黸\t20327\nhng\t20328\nCountonMiao\t20329\n这笔\t20330\n喇叭裤\t20331\n願意嗎\t20332\nkushine\t20333\n侥幸心理\t20334\n沃斯河\t20335\n瞿珂\t20336\n肚琨\t20337\n快乐一二三四五六七八九十一二三四五六七八九十二二三四五六七八九\t20338\n楗吨\t20339\nhbchtgzgh\t20340\n57家\t20341\n我不会再见你了我恨你\t20342\n太太太太太太太太太太太太太太太太\t20343\ntitaniumwhite\t20344\nyghdg\t20345\n电才\t20346\n四个八和三个四个哦零哇宁\t20347\n9辆\t20348\n你是何方神圣报\t20349\nHotel\t20350\n电扇\t20351\n博美皮真伊\t20352\nBirthday\t20353\n尚冰\t20354\n走了呀你在哪儿走\t20355\n李啊\t20356\n施舍\t20357\n收音杯\t20358\ndsax\t20359\n行云流水\t20360\n正确\t20361\nsssssssssssssssssss\t20362\n┕\t20363\n到一到\t20364\n仙六仙6365\t20365\n眼球\t20366\n什吗\t20367\n不对不对\t20368\n珉仁川机场\t20369\n去哪儿\t20370\n14，日\t20371\n美美哒\t20372\n批生\t20373\n五点下\t20374\n那个费\t20375\n作业场\t20376\n董成鹏\t20377\n大龄\t20378\n风切变\t20379\n大龙\t20380\n胖磊\t20381\n陈清\t20382\n一句话话\t20383\nkpooo\t20384\n解梦工具书\t20385\nfhfghyu\t20386\n13677844448\t20387\n说了不想\t20388\n巴吉·巴拉扎斯\t20389\n零售\t20390\n酷车\t20391\n在心\t20392\n等边三角形\t20393\n恒逸石化\t20394\n再唱二小\t20395\n郑大我\t20396\nhi我真的好小\t
20397\n弩驾\t20398\n聪明你在\t20399\n三一重工\t20400\n余音留范-范石人余派唱腔精选\t20401\nk线\t20402\n那好啦\t20403\n心细如发\t20404\n小白点\t20405\n漏因\t20406\n传授\t20407\nu素\t20408\n煎鸡蛋\t20409\n生一分手\t20410\n转系\t20411\n开心就好\t20412\n3圈\t20413\n0.30元\t20414\n等等等等等等等等等等等等等等等等等等等等等等等等等等等等等等等等\t20415\n浓眉\t20416\n算求\t20417\n拥簇\t20418\n刨马路牙子\t20419\n得行\t20420\nuuhhj\t20421\n摇头丸\t20422\n三百兆\t20423\n富贵人家\t20424\n完全不\t20425\n多半\t20426\n吴凡\t20427\nPR动听\t20428\nGilgag\t20429\n03年\t20430\n想死你假还\t20431\neatswell\t20432\n543054305430201030502030502030501001千三一万1万\t20433\n冰河时代\t20434\n龙湖御景\t20435\nSZB\t20436\n呢冷\t20437\n臭肚鱼仔\t20438\n老张的歌\t20439\n抽离\t20440\n升腾\t20441\n我真的好想你给我好不好真的好想要\t20442\n1999年\t20443\nhjph\t20444\n导过\t20445\n宅在家\t20446\n120多斤\t20447\n嗨嗨\t20448\n智学网\t20449\n秀儿\t20450\n英语文\t20451\n贺兰山岩\t20452\n说了再来\t20453\n郭强\t20454\n5741135674356\t20455\n美惠子\t20456\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t20457\n电热器\t20458\n太平村\t20459\n模架\t20460\n巧虎大游戏\t20461\n沉沦\t20462\n保写\t20463\n纪念碑\t20464\n画法\t20465\n10000000000000000000000000000000000000元\t20466\n突突突\t20467\n张洪亮\t20468\n好的你现在是我的弟弟行\t20469\n价格体系\t20470\n二十一万六千次\t20471\n斗狗\t20472\n好午安\t20473\n监督\t20474\n組合屋\t20475\n谢谢你了你\t20476\n热口\t20477\n英俊潇洒\t20478\n热叫\t20479\n纯特别\t20480\n朝阳兄\t20481\n前屏\t20482\n在班到\t20483\n醉生梦死\t20484\n脓包\t20485\n金婚\t20486\n杂牌儿军\t20487\n俄语\t20488\n搁壁\t20489\n壮阳\t20490\n五月四\t20491\n相爱的人\t20492\n腌制\t20493\n鑫瑞明科张爆乳玉兰曼可徐\t20494\n足球周刊\t20495\n黑白世界\t20496\n侯祎斌\t20497\n1r11r\t20498\n3月15\t20499\n动笑话\t20500\n壮阔\t20501\nvchc\t20502\nhjjvgb\t20503\n说好不好\t20504\n破记性\t20505\n清空万里\t20506\nhfhgg\t20507\n职业病\t20508\n金晨雨\t20509\n不乖所以说\t20510\n东方人\t20511\n工银\t20512\ncvk\t20513\n乐此不疲\t20514\n多多好\t20515\n四十发\t20516\n奸计\t20517\n贾思琪\t20518\n钟头\t20519\n奥创们\t20520\n谢罪\t20521\n泡泡脚\t20522\nrbd\t20523\n爱亚纶\t20524\nCvmm\t20525\n陈建鸿\t20526\n隐蔽\t20527\n高月晗\t20528\n撞见\t20529\n糯928\t20530\n温泉宫\t20531\n美妹妹好你就一个人孤独\t20532\n圣代机\t20533\n孤独的花\t20534\n要买\t20535\n撒比楞\t20536\nffypp\t20537\n我好我是小公主你你是什么你是你\t20538\nghxhgfv\t20539\nHighgunfig
ht\t20540\negmgmgn\t20541\nlloll\t20542\nFVGGUC\t20543\n狄仁杰前传\t20544\n457515775425790566\t20545\n很高兴看\t20546\n3121731432\t20547\n唐乐英\t20548\n只你\t20549\n谢英俊\t20550\n检察院\t20551\n一立方米\t20552\n八零九二\t20553\n23:26分\t20554\nglaxy\t20555\n吹口哨\t20556\n八零九五\t20557\n几毛\t20558\n12345888\t20559\n我爱你的生命我是一个小女生\t20560\n累星美\t20561\n7日\t20562\n要么\t20563\n要义\t20564\n雅培奶粉\t20565\n蒲明瑶\t20566\n汉源县\t20567\n绿城运河\t20568\nb1帅\t20569\n元宵喜乐会\t20570\n美美美\t20571\n我的孙子啊孙女儿你好\t20572\n法拉利\t20573\n慨好\t20574\n青华秋实\t20575\n再生\t20576\n老男孩\t20577\n好笑呵呵呵\t20578\n雨雨雨雨\t20579\n王梦甜\t20580\n老猫头\t20581\n小石榴\t20582\n玄门\t20583\n看不不是\t20584\n挺给\t20585\n想和你一起买\t20586\n木葱茏兮秀\t20587\n机器人你好恶心人\t20588\n不变的是\t20589\n我的生肖\t20590\n双子星食玩\t20591\n郭富城\t20592\n十月海伦\t20593\n甘覅\t20594\n蠢蛋\t20595\n快要死\t20596\n听信\t20597\n五年级下册英语义务教育教科书\t20598\n黄猪\t20599\nbyebyegoodbye\t20600\n17834580614796508453米\t20601\n绿爱的纳新日\t20602\n七点半\t20603\n有钱了\t20604\n不是假\t20605\n本案\t20606\n奥森\t20607\n盆餐\t20608\n罗昌华\t20609\n啜嗯春春春春玉春存\t20610\n凣乜丨乁\t20611\n有钱人\t20612\n诺战报\t20613\nBBHJHHH\t20614\n住店\t20615\n你不不不不不不不喜欢你\t20616\n筑牢\t20617\n我是男孩儿ok\t20618\n班主任\t20619\n220块\t20620\n作证\t20621\n罗娟\t20622\n罗娜\t20623\n作词\t20624\n意指\t20625\n睡着\t20626\n南海诸岛\t20627\n作诗\t20628\n徐素勤\t20629\n腮胡子\t20630\n003平方米\t20631\n我的信念\t20632\nPuyygfdrttttttttttttttuiejejdirjhehehehehehejjejebehehejeueujeudufufufufuudususuejjsjsjdjsjsjdjsjjsjsjsjsjsjsjjsjdjdjdhdhdhdhd\t20633\n松紧度\t20634\n五香炒饭\t20635\n鬼臉\t20636\n43斤\t20637\n有我没有\t20638\n米蒸糕\t20639\n睡睡\t20640\n海霸\t20641\n心里的秘密\t20642\n惹火\t20643\n007年\t20644\n招风耳\t20645\n餐条\t20646\n谭宇茹\t20647\n缩写\t20648\n讲头\t20649\n无片\t20650\nsahtat\t20651\n无可忍受\t20652\n嗯瞌睡\t20653\n请教\t20654\n体体\t20655\n走手型\t20656\n死了回\t20657\n裴演员\t20658\n差价\t20659\n赛科you@you\t20660\nhdhdhdc\t20661\n615分\t20662\n舞剧\t20663\n好尼玛\t20664\n薄度\t20665\n无物\t20666\n为爱向前冲冲冲\t20667\n大城市\t20668\n6454855484884556\t20669\n得得得得得得得得得得得得得得哒\t20670\n开苞\t20671\n两小时后\t20672\n20点43分\t20673\n刘密\t20674\n广州白云国际机场股份有限公司\t20675\n不负众望\t20676\nfool\t20677\n朱呀\t20678\n恭候你的后工\t20679\n韩广生\t20680\n
fooi\t20681\nfood\t20682\n安达曼\t20683\n姓牛\t20684\n传染病\t20685\nookjjhff\t20686\n来了老小看\t20687\n大奥堡\t20688\n主席\t20689\n黄哥\t20690\n晨曲\t20691\n大路\t20692\n文亿珂\t20693\n12530\t20694\n快点儿行\t20695\n吾妻之美我者私我也的美\t20696\n游戏机\t20697\n营养师\t20698\n一夜五次\t20699\n领导者\t20700\n主帅\t20701\n将将\t20702\n职业化\t20703\n将少\t20704\n胡冰亲\t20705\n抄家\t20706\nwacko\t20707\ndun2\t20708\n几百名\t20709\n布鲁精灵梦\t20710\n第一波\t20711\n漂亮下\t20712\n韩语版\t20713\n徐梦洁\t20714\n香干\t20715\n将就\t20716\n没事儿夏点\t20717\n010点\t20718\n回到1898\t20719\nass\t20720\nasr\t20721\n3号\t20722\n不要脸动\t20723\n恩孔\t20724\n超然\t20725\nast\t20726\n恩存\t20727\n同仁们\t20728\nasy\t20729\n灶具\t20730\n3句\t20731\nasa\t20732\n疚那\t20733\n3口\t20734\nasd\t20735\nask\t20736\n刘羊场\t20737\nstlittl\t20738\n孤儿\t20739\n3只\t20740\n金属件\t20741\n3发\t20742\n滑板\t20743\n泰萌萌\t20744\n13145201\t20745\n军备竞赛\t20746\n吃香一r焦\t20747\n潜龙\t20748\n神魂颠倒\t20749\n收侦\t20750\n恩学\t20751\n开心魔力减肥咒\t20752\n海尔\t20753\nxiang\t20754\nhedid\t20755\n1.82%\t20756\n不敢敢\t20757\n查价\t20758\n光芒四射\t20759\n5hgybyg\t20760\n998179\t20761\n西服\t20762\n中国商业街\t20763\n30万公里\t20764\n1874\t20765\n棉拖鞋\t20766\nfcsd\t20767\n农业银行\t20768\nhttpehiphotosbaiducomxiaodupicitemd\t20769\n老炮儿说纪4纪红飞之上么上生日衣\t20770\n十分钟内\t20771\ngjadmjtm\t20772\n悦人\t20773\n天才颖\t20774\n巫女\t20775\n忙不聊\t20776\n初三\t20777\n关贤一郎\t20778\nzmso\t20779\nBjf\t20780\nbuyao\t20781\n18215406763\t20782\n社里\t20783\n尽心尽力\t20784\n哈里妈咪\t20785\n说了看\t20786\nofa\t20787\nBjc\t20788\n亚裔\t20789\n108天\t20790\n那个字\t20791\n那个孩\t20792\n767点\t20793\n45千克\t20794\n床头灯\t20795\n千米饭后悔恨\t20796\n五百多元\t20797\n热豆腐\t20798\n五百多兆\t20799\n早饭少闭经\t20800\n係喜歡\t20801\nhttpehiphotosbaiducomxiaodupicitem7\t20802\n365点\t20803\n白暂\t20804\n5885688566666698556336\t20805\nWKHGX\t20806\n临下班\t20807\n山大声\t20808\n冰种\t20809\n帅okok\t20810\n话柄\t20811\n双流\t20812\n杨昌成\t20813\n谷兹\t20814\n星星球\t20815\n背杀\t20816\n初见\t20817\n和值\t20818\n双卡\t20819\n袖珍版\t20820\nkara\t20821\n八十差十块\t20822\n峰儿峰\t20823\n旧州\t20824\n清醒甘\t20825\n拉斐特\t20826\n天安门广场\t20827\n94发\t20828\n能厅\t20829\n万事无忧三定律\t20830\n帅也\t20831\n彩虹花\t20832\n娓娓\t20833\n
3585599\t20834\n白粉丝\t20835\n背板\t20836\n火头\t20837\n团团\t20838\n说题\t20839\n粗卑\t20840\n戴翔宇\t20841\nSTORY\t20842\n疾风剑豪\t20843\n蒋佳柯\t20844\n前度\t20845\n下一个五个\t20846\n分隔\t20847\n旭阳\t20848\n明天才\t20849\nliyou\t20850\nhsjcjxvjxj\t20851\nSTORM\t20852\n长屌\t20853\n征订\t20854\n流树\t20855\n寶寶\t20856\n哦度秘度秘你好度秘你好麻烦你\t20857\n简卉\t20858\n告诉你我的偶像\t20859\n于化周\t20860\n长山\t20861\n裂蹄类\t20862\n养殖\t20863\n曼联\t20864\nnfsss\t20865\n路遥知\t20866\n274.56元\t20867\n许昌市\t20868\n贾欣婷\t20869\n简单\t20870\n圆月湾\t20871\n咒语\t20872\n再娶\t20873\n肉香\t20874\n高兴睿\t20875\n九十多岁\t20876\n尝了尝\t20877\n3x1\t20878\n彦龙门\t20879\n比赛考试\t20880\n肉馅\t20881\n3x7\t20882\n品字\t20883\n王一战\t20884\n秋叶\t20885\n勿忘\t20886\n李红亮\t20887\n1.8L\t20888\nox4\t20889\n品学\t20890\n啦v\t20891\n完他\t20892\n双100\t20893\ndoogd\t20894\n青海拿拿拿拿拿拿拿拿拿\t20895\n断子绝孙\t20896\nlayokouts\t20897\n主持界\t20898\n老天真\t20899\n战乱\t20900\n房木\t20901\n韦届\t20902\nshe\t20903\nCrystal\t20904\n652431257809845\t20905\nqwertyuopasdfgll\t20906\n神拳\t20907\nITIL\t20908\n切切切我讨厌你讨厌你\t20909\n北京市区\t20910\n长阴巨量\t20911\n巅峰对话\t20912\n辛渐\t20913\n张小西西\t20914\n那四个我好想那四个晚安\t20915\n哈乐呵\t20916\n嘿嘿宝贝宝贝宝贝宝贝宝贝宝贝宝贝宝贝出差从不宝贝vvv\t20917\ndtxtftdtdtdydyxyxyx263net\t20918\n开国大典\t20919\n战数\t20920\njbdgd\t20921\n烦人机器人\t20922\n医务科\t20923\n镇江传奇\t20924\n卢丽莉发骚\t20925\n度秘asbakk\t20926\n上星期三\t20927\n好啦别\t20928\n说话的人\t20929\n歉意\t20930\n本部\t20931\n不好玩\t20932\n给呀你好\t20933\n韩克强\t20934\n十九期\t20935\n东京电力公司\t20936\n灶台\t20937\n培培培\t20938\n身体陪陪陪陪陪陪陪陪陪陪\t20939\n密级\t20940\n张涛涛\t20941\nmeeeeeo\t20942\n旅行社总社\t20943\n2008年09月\t20944\n15公里\t20945\n武器英\t20946\n育英\t20947\n产生费\t20948\n家楼\t20949\n充沛\t20950\n862556255\t20951\n任丘沙弥\t20952\nqlng\t20953\n九九年\t20954\n叶晓婕\t20955\n客户\t20956\n客戶\t20957\n盛盛\t20958\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t20959\n育苗\t20960\n张逗逗\t20961\nvuvufutufyuu\t20962\n陈晓丽\t20963\nUgaggcbkogcr\t20964\n1400座\t20965\n简介\t20966\n袁股神\t20967\nxcxxc\t20968\n兴隆\t20969\n查明\t20970\n那好那好赶\t20971\n骑马咚\t20972\n几十名\t20973\n伱\t20974\n错了错了错了错\t20975\n朴有天\t20976\n嗯嗯呢呢呢呢呢呢呢呢呢\t20977\n最近一个礼拜\t20978\n6758552\t20979\n栋伟\t20980\
n金寨大别山\t20981\n紫毛\t20982\n微子镇\t20983\n我靠我靠我靠我靠爆粗口就不告诉你呀我爱你呀我爱你摇\t20984\n原意\t20985\n偶不想\t20986\n梁浩宇\t20987\n赵宇航\t20988\n2015年11月66日\t20989\n试点\t20990\n质地\t20991\n试炼\t20992\n邓我\t20993\nccytt\t20994\n几0点\t20995\n伺\t20996\n没有了你可以\t20997\n灿烈微\t20998\n发火\t20999\n丁丽娜\t21000\ntljgdmw\t21001\n5en\t21002\n大庭广众\t21003\n擦噶\t21004\n宏碁\t21005\n海陆柏芝沙耶\t21006\nelse\t21007\n23一百一零千万\t21008\n大学派\t21009\n呼呀\t21010\n似\t21011\nJkkohhbn\t21012\nxyse\t21013\nryrtyfyufftt5yfyfftfd\t21014\n呼呼\t21015\nhahahahahaahhaah\t21016\n性别男\t21017\n装扮类\t21018\n美美好\t21019\n哈小燕子\t21020\n下院\t21021\n电联\t21022\n直好玩\t21023\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t21024\n增幅\t21025\n半张\t21026\n下而上\t21027\n97945134679\t21028\n绍峰\t21029\n陶嘉乐\t21030\n签售\t21031\n伥\t21032\n飞扬心飞扬\t21033\n司马迁\t21034\n痞子\t21035\n大东区\t21036\n看见了解\t21037\n高谈阔论\t21038\n一点一滴里\t21039\n123456782234567832345678423456\t21040\n年历\t21041\n苏有朋\t21042\n慢热\t21043\n爱你呀你还\t21044\n张雨晨\t21045\n整累\t21046\n高街\t21047\n张好儿\t21048\n滨海火车站\t21049\n妹神\t21050\n一叶孤舟\t21051\n妍妍\t21052\n执勤\t21053\n旧社会\t21054\n秘老\t21055\n高行\t21056\n建雄\t21057\n涵江区\t21058\n佰\t21059\n代表作\t21060\n酸甜酸甜\t21061\n举止\t21062\n举步\t21063\n库平七钱\t21064\n佷\t21065\n喵喵喵\t21066\nIknow\t21067\n使\t21068\n你\t21069\n招行\t21070\n佢\t21071\n佣\t21072\n白垩纪\t21073\n都兰\t21074\n佨\t21075\n佩\t21076\n猪样\t21077\n吩咐\t21078\n一九家\t21079\n佯\t21080\n佐\t21081\n询问\t21082\n果实\t21083\n佔\t21084\n何\t21085\n葱儿\t21086\n佗\t21087\n猫神\t21088\n余\t21089\n佛\t21090\n作\t21091\n佝\t21092\n李新玲\t21093\n佃\t21094\n火山湖\t21095\n但\t21096\nGirl\t21097\n尊文\t21098\n位\t21099\n低\t21100\n住\t21101\n769\t21102\n别说话\t21103\n石桥\t21104\n762\t21105\n虎佳\t21106\n760\t21107\n宋亚彤\t21108\n766\t21109\n95英语九十\t21110\n764\t21111\n765\t21112\n三几年\t21113\n宠物小达人\t21114\n李都\t21115\n宗斯坦尼斯拉夫\t21116\n生劲\t21117\n复合肥刚才\t21118\n腰带\t21119\n帝皇\t21120\n6868686868656\t21121\n别太晚睡\t21122\n76%\t21123\n呆呆度\t21124\n$\t21125\nmicouo\t21126\n尺金\t21127\n参保\t21128\n遮荫\t21129\n傻傻的萌萌哒\t21130\n物主\t21131\n天地之悠悠独怆然而涕下\t21132\n明晚22:00\t21133\n文人墨客\t21134\n清香剂\t21135\n486phph\t21136\n二二六零\t21137\n忧愁\t21138\n烽火\t
21139\n索溪谷枣\t21140\n老人机\t21141\n王部长\t21142\n曲脖\t21143\ndjux\t21144\n金城\t21145\n信不得\t21146\n利欧\t21147\ndtdfffg\t21148\n信用卡\t21149\n双待\t21150\n赵巾帅\t21151\n爱罗卜人寿弹珠大战鹰石中石\t21152\n多莫知热知小白兔云\t21153\nopsis\t21154\n利欲\t21155\n调换\t21156\n曹芷萱\t21157\n不要不爱\t21158\n戏台\t21159\n松苗苗\t21160\n飞人\t21161\n还好看\t21162\n吵不可以\t21163\n火山爆花\t21164\n问题\t21165\n武汉闭路我是你的偶像\t21166\n看不清楚\t21167\n介屦\t21168\n冬至\t21169\n高挑\t21170\n76M\t21171\n日信\t21172\n戏叫\t21173\n撸大妈撸\t21174\n真登\t21175\nrffghs\t21176\n囊胚\t21177\n礼盒装\t21178\n不衣服\t21179\nhbfgdhedhdudhchhchfbnhfijihbcmjjskeknvjjLyshdwlwllllljddjeddhchyuhjdjfcgbhdhhdhvyhfjufhthcjuhc\t21180\n只有一个人爱\t21181\n白沙镇\t21182\n一哈儿\t21183\n方言片\t21184\n玛哈嘎拉\t21185\nyourmotherfucker\t21186\n迎战\t21187\n红钢城\t21188\n方言版\t21189\n高潮\t21190\n770\t21191\n寺院士\t21192\n3D玉蒲团\t21193\n李花云\t21194\n773\t21195\n风雨狂\t21196\n九点九点\t21197\n行片\t21198\n省国资委\t21199\n淹没\t21200\n恐惧症\t21201\n平息\t21202\n断腿\t21203\nslash\t21204\n带不想\t21205\n国斌\t21206\n第一例\t21207\nrun\t21208\n跳格\t21209\nrub\t21210\n帕斯特\t21211\n国文\t21212\nruf\t21213\nrug\t21214\nrud\t21215\n金泫雅\t21216\n买家\t21217\nruy\t21218\n西奴\t21219\nvkjhogyt\t21220\n襄阳市\t21221\nrur\t21222\n666666666666666666666666666666666666666666666666666\t21223\nruv\t21224\n第26页\t21225\nruu\t21226\n人民币证券交易准备\t21227\nghhhggh\t21228\n缺德事\t21229\nFifaonline3Fifa\t21230\n28亿元\t21231\nshān\t21232\n快走吧\t21233\n畏压\t21234\n两百1\t21235\n口贩子\t21236\nq岁\t21237\nddffg\t21238\nddfff\t21239\n猪包\t21240\n天落\t21241\n恋歌2012#\t21242\n土豆儿\t21243\n米汤\t21244\n亚太\t21245\n相亲节目\t21246\n野兔子\t21247\n垫子\t21248\n天冷了睡觉吧\t21249\n萌甜\t21250\n户口本\t21251\n而险\t21252\n对不起了我无意\t21253\n通州城\t21254\n碳纤维\t21255\n针尖对麦芒\t21256\n左撇子方人\t21257\n一说话\t21258\n几国\t21259\n啦路\t21260\n刘静\t21261\n注册名\t21262\n徐国耀\t21263\n定海公园\t21264\n奢侈\t21265\n钱儿\t21266\n拒付\t21267\n亚处\t21268\n行灯\t21269\n我和一个人\t21270\n唐凯文\t21271\n亚夜\t21272\n开场白加班\t21273\n亚多\t21274\n4分之5多少密度求求你了告诉我\t21275\n禁都\t21276\n间谍\t21277\n2024年\t21278\n六小时后\t21279\n哈密市\t21280\n敌敌畏\t21281\n尚无人\t21282\n郭我\t21283\n张辉林\t21284\n爱我我才真理爸爸了不c\t21285\n夏浩程\t21286\
n曼丽\t21287\n生物伤\t21288\n旁生\t21289\n财政\t21290\n沿线\t21291\n墨色\t21292\n夏雪云儿\t21293\n真一点\t21294\n死皮不要脸我是美女\t21295\n何只是\t21296\n门们\t21297\n一年间\t21298\n小猪仔\t21299\n茶杯犬比贵兵犬\t21300\n小闹钟\t21301\n餐饮\t21302\n餐饭\t21303\n娜美犬\t21304\n冉姐\t21305\n未泯\t21306\n齐齐齐\t21307\n米诺\t21308\n赫饭\t21309\ngvctfvbjkkbxffgg\t21310\n竹炭\t21311\n傻祖\t21312\n一天一百\t21313\n献词\t21314\n软着陆\t21315\n摁摁\t21316\npocked\t21317\n雅麦碟\t21318\n400000\t21319\n伐木锯\t21320\n萌萌\t21321\n购进\t21322\n冯守华\t21323\n28394950382\t21324\n比起\t21325\n上新浪微博的人你们伤不起\t21326\n王吉祥\t21327\n剥削\t21328\n尻师\t21329\n要是不猜我的名字好我就删你扇你扇\t21330\n真色\t21331\n马勒戈大血壁\t21332\n萌萝\t21333\nkisje\t21334\ntmd一邦外行sb\t21335\n聚餐\t21336\n丢处\t21337\nDroid\t21338\n国家宗教事务局\t21339\n总行\t21340\n哇丽\t21341\n推特厂\t21342\n说说说说说说说说说说说说\t21343\n1540454087831504\t21344\n物资\t21345\n四百千米\t21346\n婚姻冰毒对我吗为了你我忍\t21347\n蔡再来\t21348\n以此\t21349\n385385835\t21350\n零几毛\t21351\n变了哭\t21352\nhhakqid\t21353\n没听懂\t21354\n不管场\t21355\nXXXXXXX\t21356\n宰客\t21357\n狗叫声\t21358\n盈嘉讯\t21359\n一个晚自习\t21360\n很难吃\t21361\n丢失\t21362\nhi证明you\t21363\n克西亚\t21364\n86根\t21365\n燕家佐\t21366\n加苏尔\t21367\n穿越沙碛\t21368\n龙口\t21369\n29岁时\t21370\n寻寻觅觅\t21371\n朱武兰\t21372\n眼镜架\t21373\ncdg\t21374\n佐罗\t21375\n朱晓婷\t21376\n勃然大怒\t21377\n卷起\t21378\n偷电\t21379\n两个好儿好儿好儿\t21380\n数次\t21381\n双向认证\t21382\n几瓶\t21383\n朱晓婉\t21384\nCPU\t21385\n李娜琳\t21386\n摩西\t21387\noox\t21388\n理小\t21389\n我完\t21390\n几瓦\t21391\n圆姐\t21392\n笨鬼\t21393\n几瓣\t21394\ngtug\t21395\ncdo\t21396\n何雨松\t21397\n亚供应\t21398\n小草\t21399\n中油\t21400\n男女朋友\t21401\ngtur\t21402\n太库路西\t21403\nfhihgzgjutrddkjcchxdhxgjkihggfdghvjjvhhfsfguhxfjbg\t21404\n动物体\t21405\n冻死\t21406\n過年\t21407\n漫天飞雪\t21408\n九11点\t21409\n底线\t21410\n小荨\t21411\n小药\t21412\n旅游点\t21413\n订婚\t21414\n小荷\t21415\n自讨没趣\t21416\n董仁\t21417\n粘着我\t21418\n冲动\t21419\n主讲\t21420\n王瑞娜\t21421\n不透\t21422\n伪装者\t21423\n不逊\t21424\n适度\t21425\n万恶村\t21426\n不退\t21427\n不适\t21428\n我本我\t21429\nhijvb\t21430\n95055\t21431\n图谜\t21432\n不通\t21433\n逃走\t21434\n州州\t21435\n顽劣\t21436\n下联\t21437\n第一批\t21438\njnk\t21439\neycfhvf\t21440\n批量生产\t21441\n云周西
村\t21442\n嘎嘎嘎嘎滚滚嗯嗯嗯嗯的歌了如家的的的\t21443\n小元肖\t21444\n忽然之间\t21445\n姜平\t21446\n适应\t21447\n图谱\t21448\n吃喝\t21449\n下聊\t21450\n八公里\t21451\n和平公园\t21452\n一千米\t21453\nKjjjjjj\t21454\n快来\t21455\n婴幼\t21456\n伟富\t21457\n你在哪你在有雪\t21458\n快板\t21459\n想的美我可是美女\t21460\n点零四\t21461\n◆\t21462\n◇\t21463\n张建锋\t21464\n○\t21465\nonononohioioiononononononononononorononodiebea1defg\t21466\n◎\t21467\n●\t21468\nTUT饼干\t21469\n骚瑞兹\t21470\n冠希\t21471\n好漂跳水\t21472\n这一代\t21473\n缤纷\t21474\n向天歌白毛浮绿水曲\t21475\ngggghjj\t21476\nCjcjdjsjwkrofigccnxdnajajskd\t21477\n拜喽不聊\t21478\n◢\t21479\n◣\t21480\n◤\t21481\n◥\t21482\n丧失症\t21483\n绝望版\t21484\nRegrette\t21485\n傲世转想到二姑娘\t21486\n英红九号\t21487\nvhzvu\t21488\n普通话\t21489\nwoaiyang\t21490\n仇大\t21491\n姐度\t21492\n龙胜酒\t21493\n麦咭四宝\t21494\n会待会\t21495\n陈耀辉\t21496\n接风\t21497\n沈海复线高速公路\t21498\n尽快走\t21499\n2546534655611243545424564366\t21500\n豆沙馅\t21501\n童佳\t21502\n剑三侠义\t21503\n江汉油田江钻股份\t21504\n敌我\t21505\n真不好呀\t21506\n合比方\t21507\nstometothing\t21508\ndeos\t21509\n克格莫人\t21510\n应力\t21511\ncarzy\t21512\n红石榴\t21513\niseither\t21514\n法师率\t21515\n一秒钟内\t21516\n老虎狮子豹熊大象\t21517\n女枪王争霸\t21518\n明示\t21519\n上错\t21520\n卡索\t21521\n给伤\t21522\nmasteamasteamastest\t21523\n放歌\t21524\nmissile\t21525\n上锈\t21526\n勒适量\t21527\n八九驯\t21528\n不搞基你说\t21529\n调教\t21530\n我是女生我爱你\t21531\n〝\t21532\n经百战\t21533\n开学季\t21534\n楠瓜\t21535\n背呗\t21536\n生活质量\t21537\n砂锅\t21538\n黄柏\t21539\n鼓足\t21540\n前去\t21541\n背呀\t21542\nRADIO\t21543\n脱佛\t21544\n沃月租\t21545\n帅不\t21546\n给会\t21547\n返晚\t21548\n毕业典礼\t21549\n新站\t21550\n爛驀\t21551\n共同社\t21552\n成份\t21553\n〖\t21554\n圆脑圆\t21555\n付息\t21556\n以貌取人\t21557\n畸形儿\t21558\n上午十点\t21559\n真心不懂你的话\t21560\n女打\t21561\n一搜题\t21562\n和度\t21563\nusbig\t21564\n根号\t21565\n没脸厚脸皮\t21566\nsestampsare\t21567\n多想告诉你独立也啊我喜欢你\t21568\n斗牛士军团\t21569\n区号\t21570\n滥情\t21571\n巴索托\t21572\n你的脸\t21573\n不同于\t21574\n调整\t21575\n萌小毖\t21576\njiffg\t21577\n非气\t21578\n替死鬼\t21579\n沿江\t21580\n成仁\t21581\n悦可婷\t21582\n别那么多管闲事\t21583\n爱心基金\t21584\n树桩\t21585\n段黄\t21586\n雌性激素\t21587\n糖葫芦和平说泥工1988时刻桃树\t21588\n延平\t21589\n息焉堂\t21590\n张晓龙\t21
591\n明早五点\t21592\n使命召唤\t21593\n田贵云盘\t21594\n王梓月\t21595\n沃伦·巴菲特\t21596\n存活\t21597\n薛奕菲\t21598\n一碗\t21599\niegernc\t21600\n你好秘大东\t21601\n楼内\t21602\n一碟\t21603\nhbhshbbb\t21604\n烹调\t21605\n忘记我了我不理你了钱\t21606\n现时\t21607\n咦呦\t21608\n曾浩名\t21609\n韩傲雪\t21610\n胡址\t21611\n谈情说笑\t21612\n产办\t21613\n地面\t21614\n街稿\t21615\n病人\t21616\nPhotoMan\t21617\nhenggen\t21618\n浙江省高院\t21619\nhttpimagebaiducomsearchwisealatnwisealaieutf8word%E794B7%E4%BBAE5928CE55B3E4%BBAE4%BB2E590BBE7984E58A8E68081E59BBEE78987fralawisepos1stdstl1tpstrong\t21620\n哪那\t21621\n没有你好人\t21622\n奥蓝城\t21623\n岁短信\t21624\n彩丁酒窝面\t21625\n小气度\t21626\n不男不女滴\t21627\nITOUCH\t21628\n哎谢\t21629\n莫栀\t21630\nghuf\t21631\n南门\t21632\n鑫罪行\t21633\n难关\t21634\n老洋气\t21635\n打卡机\t21636\n15.3\t21637\nghur\t21638\n多西他赛\t21639\nGHGf\t21640\n那你快叫太上老总宗\t21641\niPhone\t21642\nghuu\t21643\n丘男女不丘星人\t21644\ngedy\t21645\n春亲\t21646\n两个扣\t21647\n白场\t21648\n枭首\t21649\n谔谔谔谔\t21650\n新型\t21651\n特别的苦工\t21652\n本泽马\t21653\n对呀我很好\t21654\n笑法\t21655\n两个手\t21656\n首喝\t21657\n摇一摇\t21658\n课堂客\t21659\n难兄\t21660\n芋子板\t21661\nxuucucf\t21662\n沈腾\t21663\n十号\t21664\n优佳美\t21665\n无所不能\t21666\n乍看\t21667\n十只\t21668\n值当家丁\t21669\nec1\t21670\n十口\t21671\n狮山\t21672\n2885242\t21673\n僵龙\t21674\n劳苦\t21675\n十句\t21676\n好我会\t21677\n零二二八三八零六二八六\t21678\n8x27点\t21679\n狮屎\t21680\nMyboy\t21681\n十发\t21682\n六义\t21683\n慷慨解囊\t21684\n说货\t21685\n奥运#【焦\t21686\n西语\t21687\n十双\t21688\n胡启立\t21689\n西双版纳州\t21690\n说贴\t21691\necc\t21692\n黄月芬\t21693\neco\t21694\n松妹妹\t21695\neci\t21696\ncbmm\t21697\n卡塔尔亚洲杯\t21698\n得实至名归\t21699\n真累\t21700\n656部\t21701\n真素\t21702\n在再来\t21703\n渭南\t21704\n一龙戏\t21705\n施梦婷\t21706\n郭宏锐\t21707\n马龙单\t21708\n佳慧小学\t21709\nl尔\t21710\n暖手\t21711\n9876com\t21712\n勾缝\t21713\n剧毒\t21714\n3050点\t21715\n师弟妹\t21716\n萌单\t21717\n九九号\t21718\nab在线段mn\t21719\n老儿\t21720\n2333w\t21721\n王军霞\t21722\n咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳咳\t21723\n小报\t21724\n唉一言难\t21725\n谜底\t21726\n测速\t21727\n玥老师\t21728\n饥饿贷款\t21729\n心思串\t21730\n波儿\t21731\n手握\t21732\nASP\t21733\n20.8%\t21734\ngagkgj1\t21735\n大小吃\t21736\n小技\t21737\
n五秒内\t21738\n就是你那你给我个老虎铠甲\t21739\n小抄\t21740\n人质\t21741\n京塘路\t21742\n院里\t21743\n不不不你无聊就别找我了我无聊找你\t21744\n泡面\t21745\n卫天隽\t21746\n在人间\t21747\nASN\t21748\n蒜辣醋酸香\t21749\n23333\t21750\n假期工\t21751\n处也之道\t21752\n我想你的主人我是你的主人你听我的话\t21753\n葫芦小区\t21754\n黑会\t21755\n长胸\t21756\nvi饿\t21757\n古梦涵\t21758\nJEEvIN\t21759\n白鸭\t21760\n国防大学\t21761\n喀纳斯湖畔村庄\t21762\n这个事\t21763\n治感冒\t21764\nshibahao\t21765\n你在干嘛呢小度\t21766\n呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱\t21767\n白鸟\t21768\n党纲\t21769\n发际\t21770\n这个人\t21771\n922223\t21772\n授业\t21773\n电路\t21774\n抽取\t21775\n1800点\t21776\n赶来这里\t21777\n党纪\t21778\n四位\t21779\nK线\t21780\n掉色\t21781\n唯爱\t21782\n民间\t21783\n王记\t21784\n御姐型\t21785\n躺冷\t21786\n0000000\t21787\n幕后\t21788\n有力保\t21789\n嘤嘤\t21790\n四体\t21791\nghkk\t21792\n55554521455545542655\t21793\n压弯\t21794\n别搞错\t21795\n国奥队\t21796\n郭福旭\t21797\nANDROID\t21798\n青年会\t21799\ntfgih\t21800\n压强\t21801\nqwetupjcx\t21802\n陈建平\t21803\n块状\t21804\n化石\t21805\n二五度\t21806\n大八\t21807\n大全\t21808\n雷神\t21809\n大兴\t21810\n惹麻麻\t21811\n师娘\t21812\n大关\t21813\n白钰彤\t21814\n18669500408\t21815\n么六\t21816\n收看\t21817\n2918天\t21818\n新浦大桥\t21819\n大典\t21820\n达兰萨拉\t21821\n父母心\t21822\n大兄\t21823\n老把兔兔图图图他卧龙\t21824\nsecharice\t21825\n大元\t21826\nDfhyfj\t21827\n丰镐路\t21828\nehhnghm\t21829\n尼克斯\t21830\n23多\t21831\n艺星\t21832\n充满活力\t21833\n真准\t21834\n艺昕\t21835\n大个头\t21836\n小凤凤\t21837\n13820399033\t21838\n真的美\t21839\n会化\t21840\n0bnbbjjik\t21841\n嘛好耍\t21842\n尔达\t21843\nssssse\t21844\n浪姨\t21845\n会包\t21846\n快一点要不我真\t21847\n上衣\t21848\n满勤\t21849\n劳民伤财\t21850\n银城路\t21851\n如图寒\t21852\n大兄弟\t21853\nfueq\t21854\n发扬\t21855\n1会儿\t21856\n一觉天天\t21857\n最惨\t21858\n好恶心好恶心好恶心\t21859\n库区\t21860\n41.75元\t21861\n波涛\t21862\n上行\t21863\n瑞，瑞\t21864\n大夫人\t21865\n同题\t21866\n海珠区图书馆\t21867\n帖机\t21868\n维尼饭会\t21869\n九场\t21870\n相位\t21871\n冀呈杰\t21872\n#FTISLAND##120203\t21873\n太子妃升职记十四集\t21874\n清华园\t21875\n菜地\t21876\n邓宇航\t21877\nsuovgd\t21878\n菜场\t21879\n无怀\t21880\n窃国者\t21881\nshr\t21882\nshs\t21883\n一级片\t21884\nshu\t21885\nshv\t21886\nshw\t21887\nshx\t21888\nshy\t21889\nshz\t21890\n阳角\t21891\n卡修\t218
92\n文斯卡特\t21893\n小虫和你好哇你帮我你叫我做作业\t21894\nsha\t21895\nshd\t21896\n仙五中学\t21897\nshf\t21898\nshg\t21899\nshh\t21900\nshi\t21901\nshj\t21902\nshk\t21903\n冻绵绵\t21904\nStep２\t21905\nStep３\t21906\nStep１\t21907\n八点噻\t21908\n西直门店\t21909\n刘佳怡\t21910\n一脸\t21911\n040269\t21912\n成年好\t21913\n4495雷霆\t21914\n还是我的\t21915\n拉出\t21916\n二二十五\t21917\n牌照\t21918\n乖乖小宝贝儿我也不是你\t21919\n陆毅缘\t21920\n一阳遥\t21921\n人别\t21922\n个人类\t21923\n学燕\t21924\n度度猪猪猪\t21925\n当家菜\t21926\n经营\t21927\n共创\t21928\n预测\t21929\n十七一\t21930\nu嘘嘘\t21931\n120426\t21932\n太了不起\t21933\n许留山\t21934\n新泽西\t21935\n99999元\t21936\n呵呵度8c\t21937\n肖像权\t21938\n0.22分钟\t21939\n来玩行\t21940\n一脉\t21941\n咪吗小日鬼\t21942\n乌鲁呀z5\t21943\n15132877814\t21944\n穆塔辛\t21945\n见红\t21946\n潮起潮落\t21947\nwapa\t21948\n金剑雕翎\t21949\n吴德华\t21950\n没有我可以\t21951\n磨磨\t21952\n走了不想\t21953\nFGcfHG\t21954\n中共莆田荔城区委\t21955\n印篇\t21956\nstoma么244acleli\t21957\n我自己\t21958\n上旬\t21959\n头文版\t21960\nokokokokokoook\t21961\n人民广场\t21962\n蜂窝状\t21963\nporas\t21964\n朴信惠玉泽\t21965\nAsdfghjkl\t21966\n傅海峰\t21967\n阿尼姆\t21968\n猪度\t21969\ntootootootoo\t21970\n走亲访友\t21971\n卡慈克\t21972\n好我你我的话\t21973\n徐哥\t21974\n李进\t21975\n李还\t21976\n西红柿炒鸡\t21977\n不骗你的会\t21978\n环抱\t21979\n创性\t21980\n幽兰\t21981\n说的好不\t21982\n死丫头斯大林头\t21983\n往东越\t21984\n第四届\t21985\n优伤\t21986\n李迎\t21987\n38dp\t21988\nhggyc\t21989\n5哈辣条\t21990\n优优\t21991\n离骚\t21992\n徐哈\t21993\n如故\t21994\n15911284684\t21995\n珠圓玉潤\t21996\n2446728\t21997\n1111111111111111\t21998\n六叔\t21999\n生吞活\t22000\n六发\t22001\n成号\t22002\n月牙形\t22003\n一几岁\t22004\n暗疮\t22005\n翰宾\t22006\n神犬传说的感想说首鼠徐汇分钟\t22007\n节前节后\t22008\n成句\t22009\n爱星天\t22010\n馬上\t22011\n六双\t22012\n兜圈子\t22013\n改版\t22014\n六号\t22015\n菜鸟\t22016\n成叔\t22017\n主打\t22018\n六台\t22019\ntoutimorn\t22020\n黑旗\t22021\n老大大厦\t22022\n六句\t22023\n六口\t22024\n朗堂阵\t22025\n四娘\t22026\n难掩\t22027\n吴亦枫\t22028\n圆ox2y21\t22029\n六只\t22030\n好可爱的牧羊求我好想\t22031\n拿捏\t22032\n圆心\t22033\n6亿\t22034\n分学风风问问爹猜猜风格人家\t22035\n皈\t22036\n皆\t22037\n皇\t22038\n的\t22039\n皂\t22040\nsè\t22041\n才無\t22042\n这些人\t22043\ngcsddddjiiiggkoxxjftskjrgzhchckvjhkfyxlvlj
dogkcgihcuo\t22044\n皖\t22045\n莉业\t22046\n现象级\t22047\n那小子\t22048\n皑\t22049\n皮\t22050\noooooooop\t22051\n水勒\t22052\n这些事\t22053\n萌萌大萌萌大萌萌大萌萌大萌萌大萌萌\t22054\n礼拜天\t22055\nBufori公司\t22056\n忧云\t22057\nvidews\t22058\n刘住家\t22059\n百合奖\t22060\n青白玉\t22061\n张凤贺\t22062\n偷看\t22063\n和睦相处\t22064\n远视\t22065\n武淫\t22066\n远见\t22067\n远观\t22068\n皱\t22069\n赤\t22070\n男信女\t22071\n們騙\t22072\n赠\t22073\n赡\t22074\n赢\t22075\n赞美我我真的好\t22076\n展翼\t22077\n五迷三道\t22078\n16008600\t22079\nv迷箕七\t22080\n赫\t22081\n嗯星期六号\t22082\n赵\t22083\n赶\t22084\n起\t22085\n走\t22086\n陀罗\t22087\n赼\t22088\n歌类\t22089\n湿地公园\t22090\n资\t22091\nHeHHeHe\t22092\n马林\t22093\n1月29号\t22094\n我喜欢悦己\t22095\n叫吃\t22096\n赌\t22097\n以更\t22098\n赎\t22099\n48488494\t22100\n赊\t22101\n下放\t22102\n赔\t22103\n新加坡南洋女子中学\t22104\n赖\t22105\n真恨\t22106\n赐\t22107\n吴欢\t22108\n赞\t22109\n赚\t22110\n赛\t22111\n沃尔科特\t22112\n徽章\t22113\n离殇\t22114\nqhd\t22115\n飞虎发\t22116\n26k\t22117\n拖奶\t22118\n田村派出所\t22119\n郁结气\t22120\n舞台中央\t22121\n结霜\t22122\n乙女\t22123\n转录\t22124\n乐快乐五\t22125\n忙内们\t22126\n三井品\t22127\n看走\t22128\n亚新\t22129\n说念\t22130\n看起\t22131\nfgjdhhi\t22132\n科罗拉多丹佛市\t22133\n同江市\t22134\n迎娶\t22135\n蓝点\t22136\n包工头\t22137\n么振辉\t22138\n美名\t22139\n坨坨坨坨坨坨坨坨坨坨坨坨\t22140\n发人深省\t22141\n颁奖礼\t22142\nimustgetup\t22143\nMiao\t22144\n周正恺\t22145\n小弯弯\t22146\n位点\t22147\n张店\t22148\n跑挣\t22149\n8月13日\t22150\n26%\t22151\nzaii\t22152\n龟酗闯夙伍凡冠\t22153\n生死关头\t22154\n倚天网吧\t22155\n荷花诗\t22156\n262\t22157\n260\t22158\n267\t22159\n266\t22160\n265\t22161\n264\t22162\n沙泽世\t22163\n王鑫诺\t22164\n两招\t22165\n长钉\t22166\nEnIamagirl\t22167\n铁桶\t22168\n进口额\t22169\n春江\t22170\n几个屋\t22171\n机行\t22172\n羊肠山路\t22173\n李志芬\t22174\n抖动\t22175\n两拨\t22176\n于乃伟\t22177\n谷承思金\t22178\n拳脚\t22179\n你好屌\t22180\n跟文\t22181\n将丽家\t22182\n两拳\t22183\n傻子疯疯傻傻闯天涯\t22184\njiahexeu\t22185\n韩寒童子\t22186\n长钨\t22187\nsfghffghhbvfff\t22188\n猴年\t22189\n特别是\t22190\ngodieplease\t22191\n男男女女男男女女男男女女男男女女男男女女\t22192\nFrjuuggkjdeijgjfijsgsjfdjdjuevtshffcsgxdjjkjknngftcdhsjxfegdvkhfdjjhdfgjggsffscvgkkjgfdxfhkhfsdjfsffnirhyuebgncoxjchehsjfke\t22193\n格桑\t221
94\n激将法单刀赴会\t22195\n唔妻之美我者私我也的美\t22196\n天气温\t22197\n疯婆子你是疯婆子女士\t22198\n多日\t22199\n49597\t22200\n9成\t22201\n1十111\t22202\n右边儿\t22203\n放把火烧\t22204\ntj\t22205\n一丝一毫\t22206\n顽强\t22207\n脸口水\t22208\n机器猪\t22209\n机器猫\t22210\n瞿诗涵\t22211\n猫们\t22212\n蹁跹\t22213\n八十九\t22214\n动骨\t22215\n男生子文\t22216\n犹豫\t22217\n贵州省扶贫助学促进会\t22218\n植物大战僵尸花园战争\t22219\n导入\t22220\n嗯嗯酸亚铁片\t22221\n食谱制\t22222\n羁绊\t22223\n毛剑卿\t22224\nhbbfhf\t22225\n问问王\t22226\n该项\t22227\n吕秀才\t22228\n不打皮\t22229\njhdj\t22230\n9.49亿美元\t22231\njhdh\t22232\n喂李在\t22233\n校徽\t22234\nQoeiuu\t22235\n杰西利菲\t22236\n笨人\t22237\n上镜\t22238\n上虞市\t22239\n追踪\t22240\n班样\t22241\n52度\t22242\nU\t22243\n画者\t22244\ngxigxig\t22245\n嘉伯士\t22246\n嗯嗯嗯嗯嗯嗯秘男秘\t22247\n付军强\t22248\n唐装\t22249\n迎春花\t22250\n杨树军\t22251\n哥哥不哥哥\t22252\nc39座\t22253\n了了了我要\t22254\n后来\t22255\n叩开\t22256\n单双号\t22257\n一干一干亮晶晶满天的一家银乖乖天空王冠\t22258\n吴馨悦\t22259\n董佳欣\t22260\n思远行\t22261\nxiux\t22262\n就叫\t22263\n12月31日晚\t22264\n就可\t22265\n弄一歇\t22266\n书名号\t22267\ndm2\t22268\n方粉\t22269\n妙谈话\t22270\n做空\t22271\n集成电路\t22272\n菲菲哇哈哈\t22273\n欠斗\t22274\n可爱型\t22275\n王海峰\t22276\n六一节\t22277\n兰然\t22278\n熊红万劲堡\t22279\n17.1%\t22280\n维修队\t22281\n蹄儿\t22282\n两罐\t22283\n脑筋急转弯11\t22284\nYesterday\t22285\n瓦国\t22286\n张淑婉\t22287\n明天八点\t22288\n安雨寒\t22289\nTFB〇YS明\t22290\n赵玉芬\t22291\n阿克\t22292\n阿先\t22293\n阿光\t22294\n阿其\t22295\n益处\t22296\ndmd\t22297\n黑芝麻\t22298\n阿兰\t22299\ndmg\t22300\n度秘失\t22301\n包运\t22302\n八九门\t22303\n制造业\t22304\n度秘天\t22305\n快点告诉我你的年龄我的拳头营业\t22306\n真的假的你\t22307\nMAMO\t22308\n啧啧称赞\t22309\nghrrjie\t22310\ndmy\t22311\n阿公\t22312\n阿六\t22313\n冷不冷现\t22314\n阿八\t22315\n吃餐\t22316\n订书机\t22317\n烧脑\t22318\n年俗\t22319\nExxonghj\t22320\n梦想的们\t22321\n你二小混混那我就是王\t22322\n凌芊芊\t22323\n俺家\t22324\n永远永远永远不和\t22325\nWave\t22326\n江山无人\t22327\n电电电电电电电电电电电\t22328\n北大荒\t22329\n好了你好年\t22330\nLKV918\t22331\n三潭\t22332\n9月份\t22333\ntreatment\t22334\n巴塞\t22335\n默特萨克\t22336\n崩溃\t22337\n长方体\t22338\nJGHG\t22339\n5252\t22340\n5252525255225\t22341\n呀呀呀\t22342\n玩啵\t22343\n耶佩斯\t22344\n龟儿\t22345\n监察权\t22346\n去吧兄弟\t22347\n1711\t22348\n着想\t22349
\n小飞飞\t22350\n孙竑业\t22351\n135页\t22352\n不可怜\t22353\n程国清\t22354\n一把一根\t22355\n中奖率\t22356\n微积分\t22357\n金山股份\t22358\n鸡腿\t22359\n水润秘密#\t22360\n刘鑫鑫\t22361\n一点八则\t22362\n八百多\t22363\n半边路\t22364\n姜山\t22365\n数完\t22366\n哈杰斯\t22367\n鸭爪爪\t22368\n基亚\t22369\n标的否\t22370\n跳楼机\t22371\n53467676434694555558\t22372\n交织\t22373\n额nnnnnnnnnnnnn额nnnnnnnnnnnnn额\t22374\n专用\t22375\n交给\t22376\n基于\t22377\n六分钟后\t22378\n5点10分\t22379\n歧视\t22380\n妖怪\t22381\n现武\t22382\n前置\t22383\n腿尿\t22384\n走线\t22385\n霍淡\t22386\n唔得\t22387\n85697413698\t22388\n呗德里\t22389\n曾秀贤\t22390\n放筝\t22391\n了不过\t22392\n变频\t22393\n品效\t22394\n抹嘴\t22395\n环弓鞋\t22396\n戒过\t22397\n季昊杰更\t22398\n死唱\t22399\n斌斌\t22400\n立人儿\t22401\n铁酷宝\t22402\n余尔飞\t22403\n说觉\t22404\ngover\t22405\n亚无敌\t22406\n传述\t22407\n张充如\t22408\n寒柒\t22409\n月二十日\t22410\n离婚\t22411\n陈行\t22412\ncnet\t22413\nfhn\t22414\nfhi\t22415\nfhh\t22416\n恬园\t22417\n找不了\t22418\nfhg\t22419\nfhf\t22420\n人的价值\t22421\n敷衍我吧\t22422\n何婷婷\t22423\n咪都咪眯\t22424\n伟豪\t22425\nfhu\t22426\nfht\t22427\n脸孔\t22428\nHhh\t22429\nfhs\t22430\n度麻麻动\t22431\n喝尿\t22432\n刑官\t22433\n沙宣\t22434\n好嘞OK\t22435\n换房\t22436\n改称\t22437\n呐喊声\t22438\n认字\t22439\n豪铭\t22440\n看不破\t22441\nhhhcf\t22442\n十年\t22443\n密延\t22444\n完达\t22445\n永恒之\t22446\n小学季\t22447\n4.32万\t22448\n金银花茶\t22449\n阿弥铊\t22450\n换战\t22451\n沙宝\t22452\n天天不回\t22453\n柄\t22454\n瑞卡时尚新浪\t22455\n租价\t22456\n上网切\t22457\n柍\t22458\nnever\t22459\n打洗\t22460\n娃娃娃娃娃娃\t22461\n柔\t22462\n1400元\t22463\n某\t22464\n柒\t22465\n染\t22466\n柜\t22467\n去看电视\t22468\nzufz\t22469\n刘伟强\t22470\n柙\t22471\n查\t22472\n便秘的秘\t22473\n打派\t22474\n马先生\t22475\n柯\t22476\nJudy\t22477\n山东农业大学\t22478\n柴\t22479\n柱\t22480\n柳\t22481\n柿\t22482\n忠县\t22483\n集团\t22484\n谢克勇\t22485\n拖鞋\t22486\n迎宾员\t22487\n炙热\t22488\n谈说\t22489\n一片天\t22490\n3年\t22491\n炙烤\t22492\n1899年\t22493\n雇佣\t22494\n小洛\t22495\n小洙\t22496\n广警牌\t22497\n麦ji罗\t22498\n贤骨\t22499\n哎呀出问题了我知道你是同心\t22500\n小洪\t22501\ngbcvvbcm\t22502\n贤骑\t22503\n加上\t22504\n清华大学\t22505\n鬼天气\t22506\n说秘秘\t22507\n渠清\t22508\n清清楚楚\t22509\n谈话\t22510\n加一\t22511\n大眼镜\t22512\ndyweryivx\t22513\n给姐
乐\t22514\n046854\t22515\n飞发\t22516\ntell\t22517\n加丑\t22518\n四米\t22519\n信缘分\t22520\n2079年\t22521\n蓝宝晶\t22522\nhellostlity\t22523\n花园里的萤火虫\t22524\n9744\t22525\n石湖书\t22526\n海线\t22527\n陪聊\t22528\n05乘\t22529\n纫兰\t22530\n陈音\t22531\nxbck\t22532\n刘涛美\t22533\n妞儿们\t22534\n零嘴儿\t22535\nItall\t22536\n这段路\t22537\n台山气象台\t22538\neykkhogeryikhgddhhgfgukkhfffghjggfttyuiiioopgsswwwdfffvvbhjjjjjjhggggghioolllkknnbvfddfgjj\t22539\n燕塘\t22540\n蒙山\t22541\n地下你是谁家孩\t22542\n见不着我\t22543\nstmeyou\t22544\n170万年前\t22545\n5月23日\t22546\n闻斑\t22547\n起床\t22548\n雪份\t22549\n啦啦啦啦啦啦我是卖报小行家突然风雨起卖报天\t22550\n美很美\t22551\n同性恋\t22552\njiqinenman\t22553\n放下去\t22554\n零丁洋\t22555\n小声点\t22556\n幺儿\t22557\nHIGH症\t22558\n这样师\t22559\n韦国斌\t22560\n卜爱子\t22561\n贵州师大\t22562\n了入\t22563\n二十七六\t22564\n起底\t22565\n曲老师\t22566\n日日\t22567\nｈｏｌｄ\t22568\n围城愉快的决斗\t22569\n化合\t22570\nsuffle\t22571\n糖果类\t22572\n雪仗\t22573\n起度\t22574\n错错错赞\t22575\n励志猪\t22576\n659384\t22577\n李庚宝\t22578\n曹晓辉\t22579\n字母\t22580\n语言库\t22581\n第三半年\t22582\n了元\t22583\n钻出\t22584\n劲牌重组\t22585\n控制价\t22586\n看点\t22587\n差税\t22588\n左权\t22589\n几单\t22590\n石景山\t22591\n表酱啦表酱\t22592\n青州气象台\t22593\n3261764886\t22594\nFgzxd\t22595\nhdtj5aFlurt47ictizy\t22596\nkkkaak\t22597\n请志\t22598\nBING\t22599\n水底\t22600\n回不来\t22601\n李若心\t22602\n好了我的仆人给你俩本公主睡觉咯\t22603\n说清\t22604\n王源爱\t22605\n一汽天涯海角\t22606\n鲍金贵\t22607\ngffkddyhfohg\t22608\n聚福园\t22609\nwghpm\t22610\n度秘度秘度秘度秘度秘度秘秘度秘\t22611\n硬绷\t22612\ntjjaa\t22613\n我不不修\t22614\nOoooo\t22615\n几十\t22616\n过奖\t22617\ndjdifififf\t22618\n闵恩泽\t22619\n床上运动\t22620\n舞蹈家\t22621\n刘永容\t22622\n喊声\t22623\n相须\t22624\nttikf\t22625\n相顾\t22626\n硬绑\t22627\n社交化\t22628\n#金源世界中心\t22629\n财神\t22630\n14堂\t22631\n唐唐超\t22632\n死心塌地\t22633\ncmo\t22634\n申请\t22635\n森林冰火人无敌版\t22636\n刀塔魏\t22637\n王晨宇\t22638\n琴师\t22639\n邓难念\t22640\ncml\t22641\n斯理\t22642\n好儿\t22643\nljbhhZj\t22644\n发聊\t22645\nGuling\t22646\nkiikg\t22647\nlitilot\t22648\n肚脐眼儿\t22649\n857778\t22650\n消失\t22651\n哈哈大笑\t22652\nconside\t22653\n杜兰特\t22654\n淋浴\t22655\n合不合意\t22656\n申诉\t22657\n死吧十三\t22658\n这星星\t22659\n团委员\t22
660\n片爻\t22661\nWoy\t22662\nWox\t22663\n得玛西亚\t22664\n金逸\t22665\nWow\t22666\n麦太\t22667\n无双\t22668\n震撼人心\t22669\n无发\t22670\nWon\t22671\nWoh\t22672\n什们\t22673\n不可一世\t22674\n香菇瘦肉粥\t22675\n民兵\t22676\ndegj\t22677\n虏兔\t22678\ndegk\t22679\n知识只羊丫头想侠\t22680\n剪鳐\t22681\n跑步尿\t22682\n结案\t22683\n无可\t22684\n生产商\t22685\n崔爸\t22686\n无号\t22687\n游去\t22688\n3000万台\t22689\n复旦大学\t22690\n闪闪发光\t22691\n团团圆圆\t22692\n超警空\t22693\np6664ty5688uoh\t22694\n宋式梁架+唐式枓栱\t22695\n唐钰\t22696\n陈怡然\t22697\n老怕\t22698\n胡胡胡\t22699\n做派\t22700\n拜拜一\t22701\nybril\t22702\nisz\t22703\n呃行好的嗯谢谢你\t22704\nisx\t22705\n再犯\t22706\n朱古力\t22707\n奥我知道你是谁心得今晚上我服我和你说话\t22708\niss\t22709\n向思晨\t22710\n袜子\t22711\nisp\t22712\n吃饱了撑\t22713\n凄沧\t22714\n哈塞\t22715\nist\t22716\n鹦鹉杯\t22717\n父母们\t22718\n多花\t22719\nisn\t22720\n看一下量\t22721\n老总\t22722\nisa\t22723\n只得\t22724\nisf\t22725\nise\t22726\n实话实说好\t22727\n徐奇\t22728\n小恩恩\t22729\n熊淑霞\t22730\n草尼美\t22731\n鞋类\t22732\n不好儿\t22733\nkk星辣kk星\t22734\n牛手\t22735\n小润\t22736\n百威\t22737\n牛批\t22738\n峭壁\t22739\n乖别闹了\t22740\n胡豆\t22741\n小雅比\t22742\n3tviv\t22743\n全国篇\t22744\niuk0\t22745\n相当\t22746\n可能性\t22747\njajskskskskskskskaks\t22748\n偶巴尔坛\t22749\nqerwad\t22750\n恐龙影\t22751\n桑兰\t22752\n干麽\t22753\n被擒\t22754\n干麻\t22755\ncccc\t22756\n特种错\t22757\n紧紧紧紧\t22758\n双麦郎\t22759\n沈刚\t22760\n主代疱\t22761\n郑雨叶\t22762\n说了见\t22763\n微博场\t22764\nmatter\t22765\n划扣\t22766\nejcx\t22767\n十辆\t22768\n帅很\t22769\nxkdg\t22770\n百不能\t22771\neperlrkrorlr\t22772\n入微\t22773\n好可怕\t22774\n广裕花园\t22775\n再来一次难得糊涂与君共勉\t22776\n九门儿\t22777\n奖磊\t22778\n许多个\t22779\n好可怜\t22780\n脚长\t22781\n顾雪瑶\t22782\n天蝎娃\t22783\n垮况\t22784\n1月1日起\t22785\n鲁西西传\t22786\n污不污不污\t22787\n仁波切\t22788\n一猫\t22789\n托尼猫\t22790\n耐读\t22791\n凯酷\t22792\n死不瞑目\t22793\n校园网\t22794\n我的经理\t22795\n大王您去哪里的我是我是皇宫的\t22796\n黑眼圈儿\t22797\n第192次\t22798\n贝格里尔斯\t22799\n因扎吉\t22800\nhhhihuhjju\t22801\n落单\t22802\n摇杆\t22803\n便坑\t22804\n眯眯\t22805\n日本国立癌症预防研究所\t22806\nrijrjrnjfhfjnfjfhfgbhfhfnfhfbnnnnnnnnnnnnnnnnnnnnnn\t22807\n真的爱我吗我也真的爱你那\t22808\nWesterbuurtstraat\t22809\n眯眼\t22810\n素养\t22811\n郝军\t22812\n
屌不死\t22813\n民族舞\t22814\n图书馆\t22815\n张元玺\t22816\n李攀宇\t22817\n精心一样\t22818\n不块\t22819\n菡萏\t22820\nxxnn\t22821\n乐天百货\t22822\nxxnl\t22823\n亦凡\t22824\n5554654\t22825\n不均\t22826\n单一双\t22827\n四才四\t22828\n优质股\t22829\n好啊朋友度秘书记\t22830\n过客气\t22831\n透明化\t22832\n十五个\t22833\n微软公司\t22834\nq966zz888乙888888888s2嚀\t22835\n周锦坚\t22836\n忽忽音樂會額\t22837\n白球\t22838\n佰佰\t22839\nhgbnf\t22840\n15193708582\t22841\n郭立刚\t22842\n沁园春\t22843\n十五万\t22844\n恍恍惚惚\t22845\n52.8\t22846\n石曼\t22847\n客气庙\t22848\n过生日\t22849\n于磊\t22850\n电影式\t22851\n2586361470\t22852\n紅包\t22853\n原来如此\t22854\n禹王\t22855\n读记\t22856\n乐小米\t22857\n百慕拉\t22858\n五千年\t22859\n山西柳林\t22860\n成长性\t22861\n52.3\t22862\n见亲\t22863\nstop\t22864\n臭寻\t22865\n寝击\t22866\n下网\t22867\n音乐系\t22868\n朱日和\t22869\n六静\t22870\n湖水\t22871\n脱衣个\t22872\n500页\t22873\n报警\t22874\nJmtwv\t22875\n扭转乾坤\t22876\n小俊俊\t22877\n邓伟壮\t22878\n找用\t22879\n5晚\t22880\n见于\t22881\npp呢文\t22882\n傻子瓜\t22883\n可说好\t22884\n兰妮血\t22885\n盗版\t22886\n大决战\t22887\n孕激素\t22888\n美器\t22889\n宇智波卓\t22890\n十八周\t22891\n内力\t22892\n作管\t22893\n康复科\t22894\n博美是公的博美好还母的博美好\t22895\n本月初\t22896\n制作器\t22897\n请见谅\t22898\n心谈\t22899\n内功\t22900\n格拉纳达\t22901\n心调\t22902\n三炮炮\t22903\n范尼\t22904\n驾照科\t22905\n裸模露\t22906\n两亿元\t22907\n区块\t22908\n宝贝圣彼得堡\t22909\njwtajtmtmtj\t22910\n主事\t22911\n生病\t22912\n1352551508\t22913\n19999999\t22914\n内务\t22915\n99999999999999999999999999\t22916\n艾韵捷\t22917\n666669\t22918\n轿\t22919\n666666\t22920\n唐山大地震\t22921\n饭狼\t22922\n17k侠五今天小游你\t22923\n胖不好看\t22924\n纸条\t22925\nvfffdcvg\t22926\n陆思屹\t22927\n缓刑\t22928\n纸杯\t22929\n查封\t22930\nfrtu\t22931\nnn介nn\t22932\n林梦飞\t22933\n无聊我怕我会\t22934\n西奈\t22935\n鹅毛\t22936\n查理斯兰州\t22937\ncbnm\t22938\ngps\t22939\n瀰掩饰黼瓞潘片\t22940\n400多元\t22941\n供选择\t22942\nIes\t22943\n老李豆腐干\t22944\n整滴\t22945\nxjgd\t22946\n看齐\t22947\n867147045\t22948\n倒道\t22949\n悲鸣\t22950\n吾编无尽\t22951\nfxgxhhjfjg\t22952\nguffyy\t22953\n航展\t22954\n加法\t22955\n十三一个\t22956\n江学\t22957\n呆唯\t22958\n王连安\t22959\nwjjje\t22960\n中国商务部\t22961\n林杨杨\t22962\n忙忙忙忙\t22963\n筋骨宁\t22964\n蜗居\t22965\n别再聊\t22966\n弄废\t22967\n玻璃厂\t22
968\n谈宫雁\t22969\n才不想你好友\t22970\n嶋\t22971\n凤凰网\t22972\n自办\t22973\n柴婷婷\t22974\n79块\t22975\n春困\t22976\n孙行者\t22977\n否定期\t22978\n反腐倡廉建设\t22979\n贝林\t22980\n晚点\t22981\n你好你好歌\t22982\n第三句\t22983\n高龄\t22984\nhttppinyincne6193\t22985\n27个\t22986\nneam\t22987\n哦片\t22988\n9731564829\t22989\n野子\t22990\nneat\t22991\n下届\t22992\n8vyy\t22993\n嶲\t22994\nifgkfjv\t22995\n琳山峰\t22996\ngkgjgajb\t22997\niz\t22998\n宿舍人\t22999\n早安午安晚安安\t23000\n报名费\t23001\niq\t23002\n五成三年\t23003\nis\t23004\nir\t23005\niu\t23006\nit\t23007\niv\t23008\nii\t23009\nih\t23010\nik\t23011\nij\t23012\nim\t23013\nil\t23014\nio\t23015\nin\t23016\n小分队\t23017\n步骤\t23018\nic\t23019\nib\t23020\nie\t23021\nid\t23022\ndearfriend\t23023\nif\t23024\n微風\t23025\n我爱你想你猪哼我爱你小度秘真的爱你\t23026\n按赞\t23027\niU\t23028\n我想你是\t23029\ntfa4\t23030\n崔恩源\t23031\n薯片阿薯片\t23032\n梦中的婚礼\t23033\n百分之9999999999999999999999999\t23034\nrifk\t23035\nugejjdu\t23036\n玉米地\t23037\n高菊霞\t23038\ni8\t23039\n静夜\t23040\n黄桃坪\t23041\n震中\t23042\ngffjj\t23043\n互感\t23044\ni5\t23045\n三块儿\t23046\ni7\t23047\n单节\t23048\n板桥镇\t23049\n栏目\t23050\n哈错\t23051\nNODie\t23052\n高安\t23053\n静处\t23054\n三刻\t23055\n阿里山\t23056\nghdg\t23057\n警告害我之各样小人等\t23058\n高低压\t23059\n曝秘密度秘密你好\t23060\nxjc\t23061\ncross\t23062\n新世通\t23063\nxjo\t23064\n老师长辈\t23065\n白化病\t23066\n宋潘婷\t23067\n春园\t23068\n身服\t23069\n黎明荣\t23070\n贪吃鸭脖\t23071\n找麻烦\t23072\nxjx\t23073\n一往而\t23074\n小团\t23075\n度秘你的话可真深奥\t23076\n杨么\t23077\n杨义\t23078\n男女平等\t23079\n小园\t23080\n朱绘文\t23081\n仂们\t23082\n8888888899\t23083\n一岁岁\t23084\n小国\t23085\n杨乐\t23086\n喂喂\t23087\n务虚派\t23088\nsjdjq\t23089\n于冬\t23090\n信乐团\t23091\n小六子\t23092\n素描圆\t23093\n百武西\t23094\n15248134411\t23095\n翻场\t23096\n小四\t23097\n高官\t23098\n讨厌你\t23099\n马拉马\t23100\n扁周\t23101\n猎爱\t23102\nchahg\t23103\nparshin\t23104\nQE3\t23105\n行色\t23106\n乎者\t23107\n武永琪\t23108\n192939\t23109\n某一个\t23110\nhttpahiphotosbaiducomxiaodupicitema686c9177f3e6709b1ce4e8b3cc79f3df9dc55f3jpg\t23111\n德神真\t23112\n心死心思\t23113\n发集团\t23114\n维塔莱\t23115\n始兴隘\t23116\n帝制\t23117\n2011年6月22日\t23118\n机器人妖\t23119\n仙草\t23120\
n男性人\t23121\n乐高植物大战僵尸\t23122\n100分钟\t23123\n杂闹\t23124\n黄希平\t23125\n斗倒\t23126\nDjdfjf\t23127\n1.71%\t23128\n我真的好想要\t23129\n好神\t23130\n分洪\t23131\n杂问\t23132\n五万9999一一亿\t23133\nryftuy\t23134\n民粹\t23135\n损出\t23136\n替罪羊\t23137\n夏钰\t23138\n换换换\t23139\n6月21-22日\t23140\n人类学\t23141\nbjhfhhhjjjjjgjfjkghmkgbkhgjjhhhkugnkgnhbkgbkgnhgjfhhyjhhfhjghjfjkhfhjkhggjkooyhkohhhjjjkookhh\t23142\n你老头\t23143\n不是你家有钱么不说的\t23144\n起毛\t23145\n麦虾\t23146\nDfg\t23147\n空邦\t23148\n躲藏\t23149\n自恋啊我是亚\t23150\n度七食\t23151\n九年前\t23152\n黑龙江得大学\t23153\n留得住\t23154\n老东家\t23155\nJMN\t23156\n智高\t23157\n适配\t23158\n石若彤\t23159\n字底\t23160\ntfac\t23161\n或许是\t23162\n基础设施\t23163\n灵狐\t23164\nhttpsoiqiyicomsoqE\t23165\n呀真是\t23166\n韩国人思密达\t23167\n乐粉们\t23168\n天足明天天主天主\t23169\n马强\t23170\nQQ名\t23171\n喜欢就是爱哟你说\t23172\n拉帘\t23173\n懂了拜拜\t23174\nburv\t23175\nDepp\t23176\njajc1a1a1a\t23177\nMelodies\t23178\n玛丽雅\t23179\n一大丛\t23180\n变一变\t23181\n介绍\t23182\nzur\t23183\n甜處\t23184\n试验孕\t23185\n溶质胶囊\t23186\n许昌县\t23187\n虚物\t23188\n升旗手\t23189\ndsin2x\t23190\n涂改\t23191\n氏合\t23192\n原研哉\t23193\nzuf\t23194\n明天下午五点半\t23195\n裸晚\t23196\n四书五经\t23197\n学唱歌\t23198\n蚂蚁桶\t23199\nzuo\t23200\n呵行\t23201\nttttjtwtw\t23202\n低下头\t23203\n不要你了再见\t23204\n5：29\t23205\n兆孔\t23206\n海耶\t23207\n一片一个\t23208\n隐睾\t23209\n99850751\t23210\n健将\t23211\n殉国者\t23212\n十七十七十七十七十七\t23213\n巴公镇\t23214\n一般化\t23215\n陈信宏\t23216\n峨眉\t23217\n毁灭\t23218\n巡回\t23219\n哀家参\t23220\n靖国神社\t23221\n392位\t23222\n巧爸\t23223\n13267953707\t23224\n保修\t23225\n人肉包\t23226\n生不相信\t23227\n一见\t23228\ntsfudg\t23229\n四季经\t23230\n好容易\t23231\n不可了你不困\t23232\n海考\t23233\n兮云飞扬\t23234\n遂宁实验学校\t23235\n李伟霞\t23236\n塘沽站\t23237\n好吧牛逼\t23238\nmarinara\t23239\n五化四\t23240\n尿漫\t23241\n节有约严\t23242\n845845\t23243\n胡建平\t23244\n虎虎\t23245\n出战\t23246\n灌溉\t23247\n14日\t23248\n▁\t23249\n千骨上花千骨\t23250\n翟雨萌\t23251\n文慧\t23252\n女人物\t23253\n九十九十多\t23254\n何一成\t23255\n恩赐\t23256\n十五平方厘米\t23257\n火宅\t23258\n苹果日报\t23259\n点话\t23260\n云涌\t23261\n赵总攻\t23262\n在这儿有\t23263\n点评\t23264\n气宇轩昂\t23265\n保保\t23266\n绝逼\t23267\n记录者\t23268\n小乐乐\t23269\n提目\t23270\
n有讲究\t23271\n美顿\t23272\n李欣梦\t23273\n很荣幸\t23274\n恩走\t23275\n处女男\t23276\n1月17日\t23277\n特警堡\t23278\n东一号\t23279\n傻姑\t23280\n一四十二\t23281\n清除\t23282\n阿母惜惜\t23283\nIPCOM\t23284\n一四十五\t23285\n跨越无数个夜晚\t23286\nsaSB\t23287\n中南海\t23288\n恩赫\t23289\nwwt1959888185\t23290\n达日杰\t23291\nFJJF\t23292\n啦啦啦啦我的小呀小飞机啦啦啦啦在一起啦啦啦啦我的小呀小飞机啦啦啦啦\t23293\n编队\t23294\n马伊咪\t23295\ngfny\t23296\n吟出\t23297\n53天\t23298\n想你想你想你想你\t23299\n草一木\t23300\n悬疑小说\t23301\n春月喜雨\t23302\n秋褲樓榕榕兔圖\t23303\n清平乐\t23304\n每晚\t23305\n售前\t23306\n余俸\t23307\n小受我不我不不我就不\t23308\n玩伴纽维斯洪福了你\t23309\n峰会\t23310\n隐形眼睛\t23311\n千层酥\t23312\n蒜度\t23313\n豆秘\t23314\n足额\t23315\n大希哥\t23316\n丑八丑\t23317\n译成\t23318\n夕节\t23319\n3140点\t23320\n爱人+勇敢+聪明+\t23321\n女娲补天后羿射日嫦娥奔月盘古开天地的宝莲灯夜光神杯\t23322\n舌音\t23323\n河里\t23324\n喜宝\t23325\n咯女\t23326\n这东西学\t23327\nhggdch\t23328\n敬敬\t23329\n抖腿\t23330\n寒食\t23331\n我不要你了我要别的度秘\t23332\n加点\t23333\n俺娘天么早扮演者\t23334\n寒风\t23335\n山水画\t23336\nmkw五\t23337\n久久\t23338\n零八二二\t23339\n喜宴\t23340\n一姐们\t23341\nCOM\t23342\n许许多多许许多多\t23343\n热打工\t23344\n仙桃\t23345\n归西\t23346\ntnt\t23347\n新年快乐你好\t23348\n玉树临风\t23349\n马来西亚\t23350\n靠破\t23351\n笑而不答\t23352\n傻瓜们\t23353\n李丽姿\t23354\n秘度杯\t23355\n利物浦\t23356\n舜\t23357\n破红\t23358\n罗维\t23359\nfdsssrwyf\t23360\njha\t23361\njhc\t23362\njhb\t23363\njhe\t23364\njhd\t23365\njhg\t23366\njhf\t23367\njhh\t23368\n1996\t23369\njhl\t23370\njhn\t23371\n定增\t23372\n过秋\t23373\n万段\t23374\njhr\t23375\njhu\t23376\n981晚\t23377\njhw\t23378\njhv\t23379\n洁僻\t23380\njhx\t23381\n秘你好萌\t23382\n同志\t23383\n千日\t23384\n网蠡县\t23385\n存在\t23386\n侵蚀\t23387\n爱好爱\t23388\n你表\t23389\n同心\t23390\n群交\t23391\n9388899\t23392\n雨筠\t23393\n六分一个六分之一\t23394\n79.94%\t23395\n自我们\t23396\n了所\t23397\nhhyy\t23398\n汽车展\t23399\n费县实验中学\t23400\n7089756\t23401\n55555222225222252\t23402\n获选\t23403\n一字一句\t23404\n田正国\t23405\n舌\t23406\n采花\t23407\n连廊式\t23408\n猪猪猪猪猪猪猪猪\t23409\n搭背\t23410\n卖不卖\t23411\n欧嚄\t23412\n90多岁\t23413\n醉叻\t23414\n酷我就不讨厌\t23415\n夜袭\t23416\nerm\t23417\n甘快\t23418\n信于诚\t23419\n有神奇\t23420\n阿曼\t23421\n国寿嘉年保险理财\t23422\n22.27分\t23423\niljj\t23424\n董子怡\t23
425\n头不闹心\t23426\n山峰\t23427\n温泉公园站\t23428\n你好闲呐\t23429\n米有\t23430\n吴莹\t23431\n米月\t23432\n回顾\t23433\n山峦\t23434\n错花\t23435\n复兴\t23436\n王楚\t23437\n齐聚一堂\t23438\n杨斯媛\t23439\n盛利涵\t23440\n500mg\t23441\n赛扬\t23442\n照相馆\t23443\n是否\t23444\n城隍诚\t23445\n度秘臭不要脸\t23446\n战呗\t23447\n我是女的我要爱情的霸道总裁的\t23448\n二五二零\t23449\n144851549\t23450\n军旅统计局\t23451\n战呢\t23452\n娇爫\t23453\n度秘谢谢你我终于知道\t23454\n峨峨\t23455\n佛滔\t23456\n闻写\t23457\n先前\t23458\n我还零岁了有种你说\t23459\nQqaqabm\t23460\n汉东a\t23461\n零五四二\t23462\ntf卜爱子\t23463\n挥别\t23464\n忘俗\t23465\nfgfdhhwvbm\t23466\n车类\t23467\n说窝\t23468\n浮云\t23469\n光圈\t23470\n心管\t23471\n张俊涵\t23472\n88岁\t23473\n八八不聊\t23474\n好侠\t23475\nyequle\t23476\n光圣\t23477\ngsgh\t23478\n寂然\t23479\n包干\t23480\n包年\t23481\nChildren\t23482\n心理限度\t23483\n赵和\t23484\n2070w\t23485\n飞马天马\t23486\n猪头脑\t23487\n即柔又硬\t23488\n休学\t23489\n臭美的爱\t23490\n30日\t23491\n和鞥\t23492\n24家\t23493\n多一根\t23494\n傲慢tometo你在哪里馒头馒头\t23495\n搞不了\t23496\n中线\t23497\n汝州\t23498\n新亚国旅\t23499\n中级\t23500\n不期而至\t23501\n太寂寞\t23502\n竞界\t23503\n3455554554454\t23504\n世事无常\t23505\n戏剧\t23506\n李可爱\t23507\n李来弟\t23508\n奥冠\t23509\n调定\t23510\n1声\t23511\n兄弟亲\t23512\n细看\t23513\n张思德\t23514\n嗯瓮子齐\t23515\n林志寿林\t23516\n一畦\t23517\n郑伯\t23518\n百度我\t23519\n不嫁\t23520\n椹子\t23521\n英镑\t23522\nRiiklelellkkkkdk\t23523\n郑伟\t23524\n护车\t23525\n幼狗\t23526\n巨别说谎\t23527\n强求\t23528\n记忆银行\t23529\n沸腾十五年\t23530\n撸不撸猫犬\t23531\n凯林零四\t23532\n度秘码\t23533\n慢悠悠\t23534\ncute\t23535\n呷哺\t23536\n海汇\t23537\nbgagajd\t23538\n及其\t23539\n多多多多多多多多多多多\t23540\n花心菜\t23541\n一个机\t23542\n地方融资平台贷款\t23543\naaalyou\t23544\n12月份\t23545\n牛马牛马\t23546\n资治通鉴\t23547\n骑士幻想夜\t23548\n优质\t23549\n掌门人\t23550\n可信性\t23551\n一元钱\t23552\n大男人\t23553\n川川电\t23554\n存钱罐\t23555\n陈名远\t23556\n尼尔斯\t23557\nhellomynames\t23558\n多醉\t23559\n工程部\t23560\n入籍\t23561\n烟霞\t23562\n接办\t23563\nkillex\t23564\n头带\t23565\n笔试\t23566\n接力\t23567\n675476726756325976735\t23568\n杜杨\t23569\n爱情雨\t23570\n西昌\t23571\n龚寨\t23572\n天翼手机网\t23573\n华语片\t23574\n猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜\t23575\n金城武帅\t23576\n烤鱼类\t23577\n白眼儿\t23578\n性形容词\t23579\n对呀早睡觉\t23580\n14.6
\t23581\n雷士\t23582\n啦啦撸来勒来勒\t23583\n真不懂\t23584\n5555587486\t23585\n见鬼\t23586\n你好呀冬雪\t23587\n14.8\t23588\n杜李\t23589\n少年宫\t23590\n度秘我的弟弟和你的名字一样的要不信你\t23591\n异乡\t23592\n白布鞋\t23593\n笔误\t23594\n泉水\t23595\n办法家\t23596\n经师\t23597\n雷声\t23598\nParkAMu\t23599\nMHz#\t23600\nllmnbc\t23601\n接踵\t23602\n中投\t23603\n在楼群雄不雄\t23604\n一娃\t23605\n列出\t23606\n无微不至\t23607\n中技\t23608\n云高风清\t23609\n圣马\t23610\n黄毛犬\t23611\n一娴\t23612\n激化\t23613\n傻瓜瓜\t23614\n偶尔了\t23615\n省政府\t23616\n青岛\t23617\n大笑江湖\t23618\n窥见\t23619\n窥视\t23620\n5000块\t23621\n四一班\t23622\nJjkk\t23623\n兴高彩烈\t23624\n咋气\t23625\n乌苏\t23626\n新锐比\t23627\n杂技\t23628\n超萌星\t23629\n话剧\t23630\n路灯\t23631\nLhT，Yes\t23632\n亲属\t23633\n可送\t23634\n北宋\t23635\n北安\t23636\n十几种\t23637\n龟毛\t23638\n华世佳\t23639\n苦始苦终\t23640\n星高达\t23641\nHfgjgsfjjddxjjufcbkufdcjiydcbjtdxnkirxbjufxjudzvhyddhhtdchhf\t23642\naiaiaiai\t23643\n对像\t23644\n莲莲\t23645\n告诉我喜欢\t23646\n不是啦我只是和你我儿的你\t23647\n嗯瓮你好你好我是人\t23648\n神马人\t23649\ngdfddrwshff\t23650\ngvzfgc\t23651\n铁龙物流\t23652\n八六\t23653\n八公\t23654\n八八\t23655\n陈辅\t23656\ngcuchcfxdhfdhffffDgzuzifofudofohyhkekfkgjdiIzoymkdDx4jOafoxltndF\t23657\n15075889910\t23658\n专权\t23659\n专杀\t23660\nISO\t23661\n天中班\t23662\n陈钧钧\t23663\n八兽\t23664\n无法闹钟\t23665\n拜拜我走\t23666\n听发\t23667\n52850551841\t23668\n十五点\t23669\n北京三环\t23670\n22033点\t23671\n想问问\t23672\n重生娱乐圈文\t23673\n广雨\t23674\n杨浩泽\t23675\n曹颖\t23676\n徐泽业\t23677\n爱我看\t23678\n角龙\t23679\nvsth\t23680\n诗曼\t23681\n张骞\t23682\n4个小时\t23683\n病成本\t23684\n呆字\t23685\n瞎处\t23686\n88888888\t23687\n听取\t23688\n万秋\t23689\npellit\t23690\n发型师\t23691\n邝嘉欣\t23692\nj2\t23693\n松鼠皖鱼\t23694\n哎呀戏\t23695\n抱滥用\t23696\n在职\t23697\n什莫\t23698\n31间\t23699\nFGXDJMFJGHFDJFCGBGFZ\t23700\n周作人\t23701\n25385242572752528275552855585\t23702\n肉雅舒雅\t23703\n加油者\t23704\n韩芷煦\t23705\n空客两大集团军\t23706\n嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t23707\n2013年前\t23708\n蛋榚\t23709\n麻辣兔\t23710\n自由空间\t23711\n7766\t23712\n赫韦德斯\t23713\n老人秘戏图了了了了了了了了了了七咯\t23714\n好呀小贼\t23715\n秘改天\t23716\nyftgd\t23717\n爸爸爸爸爸\t23718\n灯会\t23719\n杨颖\t23720\n杨颕\t23721\n付诸行动\t23722\n前面\t23723\n都敏俊\t23724\n五十七
分\t23725\n度密度密度度秘你好坏呀你好坏\t23726\n千王\t23727\n脸大真吃亏亏亏亏亏亏亏亏\t23728\n内涵\t23729\n手腕\t23730\n999967\t23731\n看台区\t23732\n胜利者\t23733\n喜愛拍攝這類題材\t23734\n偏爱\t23735\n炭烧烧鹅\t23736\n文峰超市西街港\t23737\nPPSs\t23738\n买怕\t23739\n抱抱堂\t23740\nvcfgk\t23741\n懷著無\t23742\n5RMB\t23743\n６０分钟\t23744\n458845687552180280455875566001588564\t23745\n内涝\t23746\n千玺\t23747\n四惠\t23748\nnffg\t23749\n愣子\t23750\n腾讯\t23751\n踏着\t23752\n再来一个你好帅\t23753\n肤浅\t23754\n独运\t23755\n吕玉良\t23756\n喵喵喵喵喵\t23757\n还有空\t23758\n鲤鱼\t23759\n画壁\t23760\n310317\t23761\n123546678900\t23762\npumd\t23763\n陈可辛\t23764\n蜂鸟\t23765\n25%\t23766\n下一站\t23767\n关我告诉你\t23768\n牵线\t23769\n我喜欢一女\t23770\n带入\t23771\n伪劣\t23772\n水灵\t23773\n大嗓门\t23774\n抓痕\t23775\n长白山\t23776\n十三点三十分\t23777\n水灾\t23778\n张蒸\t23779\n水能\t23780\n秘师\t23781\nhhvvgggghgggggggghhhbbbbbb\t23782\n年底册测试卷\t23783\n三十多号\t23784\n水火\t23785\n秘帅\t23786\n飞蝗天下\t23787\nISM制造业采购经理人指数\t23788\nxhydxicufy\t23789\n戴拿\t23790\n秘常\t23791\n辣条压压惊\t23792\n甘努力\t23793\n未成年\t23794\n鱼类\t23795\n弄伤\t23796\n参加\t23797\n改观\t23798\n不在身邊\t23799\n兰帕德\t23800\n赵胜寒\t23801\n秘帮\t23802\n李璐鑫\t23803\n秋波三\t23804\n崔紫君\t23805\n银懒成\t23806\n秘带\t23807\nhavebiglunch\t23808\ngrvd\t23809\n咪风\t23810\n哦里\t23811\n行进\t23812\nkksd\t23813\n行运\t23814\n第一幅\t23815\n微警\t23816\n胡蕊丹\t23817\n昏昏欲睡\t23818\n从来玛丽亚\t23819\n老妹子\t23820\n5447644676十4543653663565X5634653564575576二\t23821\n跞讽\t23822\n溶液\t23823\ntdfgffqlitpp\t23824\n259\t23825\n赤脚\t23826\n花道\t23827\n万董\t23828\n王旭美琪\t23829\n多些\t23830\n人恩\t23831\n发一做\t23832\n有本事\t23833\n4月中\t23834\n室壁\t23835\n31701317\t23836\n么挖宝\t23837\n451米\t23838\n10月3日起\t23839\n证婚\t23840\n一走\t23841\ntux\t23842\n绝交就原谅我也不想和你绝交了我想和你绝交绝交衣\t23843\n长兴侠\t23844\nfuxhxhfux\t23845\n表行\t23846\n唔食\t23847\n杨寅东\t23848\n恩不错\t23849\n不振\t23850\ndividualw\t23851\nLjufg\t23852\ngxgcx\t23853\n我的淑旺\t23854\n好啦其实\t23855\n联入\t23856\nwiks\t23857\n天下三无聊\t23858\n不挠\t23859\n一1月3号\t23860\n知足\t23861\n等会儿不在安晚安\t23862\n雅玲雅\t23863\n同性堡\t23864\n百战\t23865\njt\t23866\n知趣\t23867\nalclc\t23868\n忄悟愫愫情\t23869\n前夜\t23870\n张落落\t23871\n不挂\t23872\n117公斤\t23873\n皮皮儿
罗\t23874\n四十五块\t23875\n狗秘\t23876\n曹克强\t23877\n邓时海\t23878\n600682\t23879\njgdss\t23880\n移动\t23881\n帳\t23882\n十一米\t23883\ntuu\t23884\n5800一平米\t23885\n漏洞百出\t23886\ntuv\t23887\n纠错码\t23888\n不管者\t23889\n划出\t23890\ntuw\t23891\n嗯雨花石\t23892\n长得帅怪\t23893\n彭工\t23894\n七六5点\t23895\n身型\t23896\n好我等等等\t23897\njm\t23898\n急猴急\t23899\n快手快手快手快手快手\t23900\n雍正小学\t23901\n单句\t23902\n学习成绩\t23903\n我喜欢格林童话\t23904\n绕口\t23905\nqinwen\t23906\n大切诺\t23907\n单口\t23908\nvu个\t23909\n341号\t23910\n韦正良\t23911\n未啊\t23912\n阳澄湖小学\t23913\n把脉\t23914\n奈斯\t23915\nwhatnpc\t23916\n大液\t23917\njk\t23918\n302124450358\t23919\n黏着\t23920\n大涨\t23921\njd\t23922\n9467.75点\t23923\n通罗马是\t23924\n把脚\t23925\n亲一个*^_^*拜拜\t23926\nCHIC\t23927\n魔男同\t23928\njf\t23929\n大润\t23930\n可怜巴巴\t23931\narts\t23932\n你好度秘度秘你好\t23933\n去天堂\t23934\n坤炜\t23935\nehHey\t23936\n椰树\t23937\nrrrr\t23938\n露底\t23939\n声威\t23940\ntue\t23941\n天下人\t23942\n永不在\t23943\n状子\t23944\n小狗军\t23945\njc\t23946\n鸡毛人\t23947\n说了谎\t23948\n海淀区\t23949\n佳爷\t23950\n完了测\t23951\n陈淑辉\t23952\n负正负\t23953\n夏靖瑶\t23954\n朝语\t23955\n1天1点\t23956\n常备不懈\t23957\n去年同期\t23958\n四小十二岁\t23959\nthose\t23960\n阔爱\t23961\n李思敏\t23962\n车厘\t23963\n死野草\t23964\nhggngutCBGBHDfhfjbvn\t23965\n打折不打折\t23966\n6层\t23967\nlababy\t23968\n第1张\t23969\n朝词\t23970\n唐古\t23971\n另一只脚\t23972\n夏当尼\t23973\n蓬莲\t23974\n10月1号\t23975\n超酷出\t23976\n有虫\t23977\n仇视\t23978\n那照\t23979\n一纸倾城\t23980\n本周五\t23981\n发尾\t23982\n发尿\t23983\n博伊斯\t23984\n本周二\t23985\n扎赉诺尔区\t23986\n应乐蓉\t23987\n乌鲁\t23988\n危难\t23989\n抬头\t23990\n一千条\t23991\n七平方厘米\t23992\n37298888\t23993\n郴州明星学校\t23994\ngjgthfnghhg\t23995\nsame\t23996\n憨厚\t23997\n发射\t23998\n获益\t23999\n免灾\t24000\n心脏病\t24001\n狒狒狒狒头\t24002\n近道\t24003\n发小\t24004\n庶耍\t24005\n李堡里\t24006\n现生活\t24007\n盛世战士\t24008\n波姐\t24009\n第一块\t24010\n大火\t24011\n美白\t24012\n叫做爱\t24013\n2006年8月19号\t24014\n外星小子哆布哆啦\t24015\n18543429059\t24016\n女生子豪爱\t24017\n重庆电信\t24018\n老街\t24019\n字三季\t24020\n途乐\t24021\n吹箫\t24022\n莫若止\t24023\n老表\t24024\n好无力\t24025\n天扬州初逢席\t24026\n玩儿再见\t24027\n台媒\t24028\n河南省高院\t24029\n无餐\t24030\n婷姐\t24031\n嫁给你可以\t24
032\n两天一\t24033\n恺恺\t24034\n张容熙\t24035\n哼了不起\t24036\n老衲\t24037\n建筑商\t24038\n糯米粉\t24039\n太历\t24040\n董梦雅\t24041\n开幕曲\t24042\niiiiiiu\t24043\n书市\t24044\n两秒钟前\t24045\n1113456\t24046\n萌妞\t24047\nvvhn\t24048\n十五六八八九\t24049\n铺路\t24050\n太原\t24051\n我自我\t24052\nGvvhjrcctsgciyshtddvguvhbftvuduvdgdgygygygyufguhfgfugjfvjvjjfvuvjvhchhfvuvufuuv\t24053\nbowie\t24054\n55343874688\t24055\n四四块\t24056\n灯塔\t24057\n锦绣广场\t24058\n不知不觉\t24059\nitmeans\t24060\n四方神\t24061\n王世聪\t24062\n光波\t24063\n献爱心\t24064\n比克\t24065\n今生的缘错\t24066\n这么说实话\t24067\n光泽\t24068\n犯人\t24069\n真的假的\t24070\n快乐瓜\t24071\n小童年\t24072\n沈伟\t24073\n奶奶\t24074\n比六\t24075\n犯事\t24076\n汉仪槑\t24077\n韭菜煎饺\t24078\n出炉\t24079\n王楚墨\t24080\n重负\t24081\n恐惧\t24082\n凄迷\t24083\n福祉\t24084\n烈火\t24085\nnose\t24086\n桥粱\t24087\n苏新月\t24088\n54588468\t24089\n吉首\t24090\n月底三十一号\t24091\n喝茶年\t24092\njoap\t24093\n绥德\t24094\n名字母\t24095\njoaz\t24096\n别闹小心用你的后颈\t24097\n怎么这样\t24098\n截然相反\t24099\n总数\t24100\nNew\t24101\n不适用\t24102\n5256436\t24103\n鬼谈恋爱\t24104\n吕传\t24105\nuabok堡细胞hififaoluio\t24106\n545889859986\t24107\n11元\t24108\n血钻\t24109\n吕会\t24110\n放款\t24111\n江泽\t24112\n万家福超市\t24113\n九点四十六分\t24114\n件事\t24115\nqiang\t24116\nGogo\t24117\n烦心情好\t24118\n回与非\t24119\n李庄案\t24120\n56式\t24121\n江波\t24122\n你好听话\t24123\n战友群\t24124\n反动派\t24125\n一匹布\t24126\n11时\t24127\nygvzucjs\t24128\n不能少\t24129\n亲亲月经\t24130\n13834605125\t24131\n十岁半\t24132\n杨大妈\t24133\nDhsjx\t24134\n撤掉\t24135\n跑出\t24136\n冷排包\t24137\n慢舞\t24138\n谢建辉\t24139\n察天\t24140\n贷卡\t24141\n谢天谢地\t24142\n靠真讨厌\t24143\n江门市\t24144\n张紧\t24145\nwrof\t24146\ni星Q蝌蛛\t24147\n好犀利\t24148\ndyff\t24149\n来自杀\t24150\n系统性\t24151\n三聚德\t24152\nfix字幕侠\t24153\ndyfd\t24154\n18756118657\t24155\n三城市\t24156\n评论席\t24157\ndhft\t24158\ndhfj\t24159\n说你丑呀\t24160\ndhfi\t24161\n可忧\t24162\n死黄死\t24163\n一二三四五六三十二一\t24164\n酸嗯\t24165\nMacBook\t24166\n赛永鑫\t24167\nx版\t24168\n云云千\t24169\n使得\t24170\n第七天\t24171\n邓佳伟\t24172\n半道\t24173\n凤凰芳名\t24174\n可心\t24175\n充电\t24176\n552893\t24177\n武欣怡\t24178\n正在进行\t24179\n便捷\t24180\n什摸\t24181\n荒原\t24182\n再来玩\t24183\n小禹六\t24
184\n五龙山\t24185\n三分之170\t24186\n我的兄弟\t24187\nsearent\t24188\n刘杰\t24189\n是了了了了\t24190\n聊了再聊\t24191\n刘杨\t24192\n谭航宇\t24193\n刘来\t24194\n你是我的最爱i爱你\t24195\n畾\t24196\n我喜欢一个人\t24197\ndyft\t24198\n仙剑奇侠传\t24199\n五平方米\t24200\n重手\t24201\nFuckout\t24202\n之所以\t24203\n睡醒\t24204\n水仙头\t24205\n明万般\t24206\n荣泰\t24207\n破洞\t24208\n降水\t24209\n德拉科\t24210\n话糙理不糙\t24211\n蒸气\t24212\n身临\t24213\nfdffffghcawgj\t24214\n杨燕\t24215\n五十品\t24216\n程丽娟\t24217\nvoufc\t24218\ngirlfriend\t24219\n我半来鐮\t24220\n么了了里哈啊里好了了了后了了了了了了了了了了了了了了了了\t24221\n算计\t24222\n苏芳\t24223\n来职往\t24224\nqsd\t24225\n一条命\t24226\nfiffffi\t24227\n8888888888888888888888888888888888\t24228\n苏芮\t24229\n3cf9wt6h…zg\t24230\n吧我回来啦亲爱的秘书我来的快\t24231\n点叫\t24232\n三十来岁\t24233\n大老粗\t24234\n可燃性\t24235\n蒙台\t24236\n石慢\t24237\n献演\t24238\n奥布\t24239\n干彪\t24240\n奥币\t24241\n小度你好黄\t24242\n表现欲\t24243\n嗯哩哔叭咪咕888咪咕888迷宫爸爸不过吧股吧\t24244\n魔鬼恋人\t24245\n要不要不要不要不要不要不要不要不要不要不要不要不要\t24246\n跋涉\t24247\n强拆\t24248\n想得美你爹的你你你你你\t24249\n张曼扬\t24250\n强拉\t24251\n我们结婚吧我爱你\t24252\n跟鞋\t24253\n8725\t24254\n付六仁\t24255\n爆棚\t24256\n我是恩知拉贝比一样的女汉子\t24257\n艾丽斯\t24258\n冷烫\t24259\n冷热\t24260\n发话废话\t24261\n堵截\t24262\nbmw\t24263\n克星\t24264\nzdsssrdert\t24265\n知心寒假\t24266\n圆通快递公司\t24267\n新埠\t24268\n偷窥\t24269\n跟岗\t24270\n电语\t24271\n鳄鱼小顽皮爱洗澡\t24272\n木村花\t24273\n先男生\t24274\n烊千玺\t24275\n刘志华\t24276\n头宿命\t24277\nmakeonessurprised\t24278\n家鹰\t24279\n阿秋喇嘛\t24280\ngymor\t24281\n空投弹\t24282\n棒槌\t24283\n唱要\t24284\n小秘你上天不\t24285\n68歲\t24286\n度秘我不喜跟度跟度秘书\t24287\nyinn气女8\t24288\n首选\t24289\nLAMER\t24290\n胶州路\t24291\n会考\t24292\n汉诺狂\t24293\ngffhhf\t24294\n马大我\t24295\n310529439\t24296\n一梯\t24297\n宿舍文化节\t24298\n1884厘米\t24299\n李帮\t24300\n据说\t24301\n一條\t24302\n龙梅\t24303\n喀达尔\t24304\npack\t24305\n13875517757\t24306\n丁磊\t24307\n看帅\t24308\n李帅\t24309\nzjz\t24310\n11月14日\t24311\nvxjdf\t24312\n锦州工商局\t24313\n大耳点\t24314\nqwwrn57gid\t24315\n刘乐意\t24316\n失控\t24317\n失措\t24318\n017\t24319\n洋河醇\t24320\n0芬\t24321\n冠绝\t24322\n10.47\t24323\n走开\t24324\n225几250\t24325\n钱俊杰\t24326\n着爱\t24327\nsbcn\t24328\n警察粒\t24329\nyouku\t24330\n宋乃
\t24331\n玩意\t24332\n既然如此\t24333\n三合财务院\t24334\n虾子\t24335\n淡薄\t24336\n超子\t24337\n死心眼儿\t24338\n玩愁\t24339\n13751390895\t24340\n蒙古帝国\t24341\n陈泽民\t24342\n也想吃\t24343\n老错\t24344\n投缘\t24345\n青菜\t24346\n伯恩\t24347\ngggfryi\t24348\n解除\t24349\n胖堂\t24350\n再造\t24351\n你好萌萌哒呀我也萌萌哒\t24352\n参考性\t24353\n张亮\t24354\n崔有胜\t24355\n吃完饭\t24356\n木梳\t24357\n399小二零\t24358\n社保\t24359\n再逼\t24360\nGfcbhvj\t24361\n35cm\t24362\n上两天内\t24363\n再送\t24364\n东周\t24365\nFagvahnjsg\t24366\n再选\t24367\n男人片\t24368\n木日窝村\t24369\n流行语\t24370\n还姐\t24371\n为你的欺骗\t24372\n一公斤\t24373\n钟文婷\t24374\n算了我不麻烦你\t24375\n瞎笑\t24376\n30MLILLY特\t24377\n十二零一八九年\t24378\n8间\t24379\n隐血\t24380\n链轮\t24381\niygreerr44322222w2wwdddffvbjkjhhhhhbvgcddfrghhkkk\t24382\n炉火\t24383\n好吧冰\t24384\n春节了你给我\t24385\n下调味\t24386\n你在哪里有没在这\t24387\n说明你的妈妈不爱你\t24388\n前一点\t24389\n豪庭\t24390\nbabydo\t24391\n扯东\t24392\n尊贵\t24393\n仁继杰\t24394\n妙药\t24395\n命冶\t24396\njivi\t24397\n擦地\t24398\njivf\t24399\n真空包装\t24400\n程冠希\t24401\nbhhvjj\t24402\n友人们\t24403\ntouyouimath\t24404\n罩罩\t24405\n女度秘\t24406\n双碟\t24407\n沈塘桥老街\t24408\n过海下\t24409\n美国大使馆\t24410\n男弹\t24411\n福清\t24412\n华能\t24413\n十点\t24414\n我主人我是你的主人\t24415\n零一个\t24416\n盈余非负\t24417\n男式\t24418\n公孙仪\t24419\n邪眸\t24420\nSiri棒\t24421\n2.04%\t24422\n福港\t24423\n不不不，你太人性化了有点怕怕\t24424\n打讨厌\t24425\n张木\t24426\n见了明天在\t24427\n几十张\t24428\n年会\t24429\n好几遍\t24430\n红葡萄白葡萄\t24431\n颜且\t24432\n眼科\t24433\n昨天不是你跟我聊的天\t24434\n六十天假\t24435\n哎呀四眼龟\t24436\n本菲\t24437\n多益马\t24438\nokcatoc狗aoatsoteuck1o1\t24439\n蝴蝶结\t24440\n甘願\t24441\n东石圩小区\t24442\n免不了曲终人散\t24443\ntggggs\t24444\n亲一口\t24445\n18:20\t24446\n18:21\t24447\n酿成\t24448\n68.93亿元\t24449\n王浩东\t24450\n所见\t24451\n郭安坤\t24452\nshowtifrayosho15属鼠十五十五sr\t24453\n零五九二幺零零幺零\t24454\n53585588945640\t24455\n回到家废话\t24456\n致意\t24457\n对啊真可惜\t24458\nBOX\t24459\n黄爱华\t24460\n畅销\t24461\nblood\t24462\n中沙群岛\t24463\n匡威\t24464\nvdsthhh\t24465\n保质工\t24466\n水哆\t24467\n泰泰窝\t24468\n祖宗类\t24469\n养成系\t24470\n西场的村\t24471\n释出\t24472\n充卡\t24473\n企稳\t24474\n搭子伐\t24475\n没礼貌真可怕\t24476\n北极熊\t24477\n着学美容\t24478\n笑來\t24479\n白草\t2
4480\n贸易逆差\t24481\ntdrgy\t24482\n争战\t24483\n我的小我的哥哥的假小鸭子\t24484\nmynameisEffie\t24485\n四溢\t24486\n冬蜜快点\t24487\n红蓝兄弟\t24488\n违例\t24489\n进民退\t24490\n奥特奥特曼\t24491\n如连\t24492\n应我不走\t24493\n龙晓凤\t24494\n白药\t24495\n我的眼\t24496\n3116845\t24497\n渭南市\t24498\n这言\t24499\n捡到者\t24500\n集锦\t24501\n周经伦\t24502\nMINUTE\t24503\n眉心\t24504\nbuxiang\t24505\n武腾飞\t24506\n闲管\t24507\n县公安局\t24508\n伟呵呵\t24509\n安全动物园动物园\t24510\n一座两千屹百零八米\t24511\n都一样\t24512\n小黄段\t24513\n力挽狂澜\t24514\nDist\t24515\nChung\t24516\n度秘解\t24517\n市中\t24518\n士力\t24519\n找我吧\t24520\n中位数\t24521\n留言树\t24522\n林菁天\t24523\n片酬\t24524\n自己唱歌\t24525\n77100155164\t24526\n咯素雅亏人文表\t24527\n玛维影\t24528\n李婧昀\t24529\nidea\t24530\n刘玉森\t24531\n接你可以\t24532\n真的确定\t24533\n列支敦士登\t24534\n永远不原谅\t24535\n22宋\t24536\nvi夫人文件夹\t24537\n是谁人喜\t24538\n拍摄期\t24539\n驻外团\t24540\ndkdjzjd\t24541\n全喔自強號\t24542\nQWER\t24543\nIIII\t24544\n好呀你男人\t24545\n呵呵我是你的姐姐\t24546\n复兴党\t24547\n厨房杯#我猜Fotile\t24548\ndogP\t24549\n人和妖\t24550\n不难得\t24551\n老精\t24552\n哇秘小\t24553\n寒假工\t24554\n内蒙蒙蒙蒙蒙\t24555\n身处\t24556\ntouse\t24557\n1月7日\t24558\n索取\t24559\n喇叭处\t24560\n派限\t24561\n洛小锤\t24562\n身外\t24563\n联邦\t24564\n要不会\t24565\n人口密度\t24566\n代谢\t24567\n行李箱\t24568\n小心爱\t24569\n恒荣\t24570\nvbbvczzzcvc4780cfv\t24571\n百大把\t24572\n微扁扁\t24573\n惑\t24574\n胡搅蛮缠\t24575\n梦比优斯\t24576\n勃利\t24577\n接棒\t24578\n人物\t24579\n天安\t24580\n她家\t24581\n天宄\t24582\n天宇\t24583\n艾叶\t24584\n天宁\t24585\n蛋超人\t24586\n天宝\t24587\n炎凉\t24588\nBCbad\t24589\n天官\t24590\n16435\t24591\n艾古\t24592\n没努力\t24593\n抵御\t24594\nkoi\t24595\ncosplay芭比\t24596\n人版\t24597\nkom\t24598\n蔡佳婧\t24599\nkoo\t24600\n使劲办\t24601\n天室\t24602\n林嘉荣\t24603\n度秘你的大梅\t24604\n九天天洲\t24605\nkoy\t24606\n五彩纷呈\t24607\nuvubi\t24608\nkou\t24609\nkot\t24610\n九五零\t24611\nteeffiid\t24612\n大大哒哒哒哒哒哒哒\t24613\n马关胜\t24614\n张小菊\t24615\n性研究\t24616\n绿吐\t24617\n开闸\t24618\n兰梅\t24619\nChihuly\t24620\n开闭\t24621\n明摆\t24622\n蟹\t24623\n埃博拉\t24624\n开门\t24625\n唯饭\t24626\n再来一个再来一个再来一个再来一个再来一个再来一个再来一个再来一个再来一个再来一个再来\t24627\n毛事\t24628\n两千岁\t24629\n知不喜欢\t24630\n小马路北里hello贝比哈楼为电影hello北鼻hello北鼻hello露比比hel
lo贝比hello比比\t24631\n二三零九\t24632\n想法\t24633\n2755\t24634\n可爱\t24635\n睡懒觉\t24636\n自凉\t24637\n新娘妆\t24638\n姜太公\t24639\n太BS\t24640\ngffidlgjfjffjdksfkgkckskgkgkcjskskfkgkfkcjcf\t24641\n有见\t24642\n借阅\t24643\n那翔\t24644\n何逸宁\t24645\n我没有骗你的心我是真的爱你可是你\t24646\nku秘\t24647\n三到六月\t24648\nVirginie\t24649\n卫梓昊\t24650\n谢谢你啦东面\t24651\n二五岁\t24652\n该当何罪\t24653\n558883555525553\t24654\n忘了开\t24655\n偏膀\t24656\n佳乐文具\t24657\n平民化\t24658\n中国国防部外事办公室\t24659\n原修\t24660\nbdd\t24661\n扬历\t24662\nbyssg\t24663\n效改\t24664\n毗卢观音\t24665\nForeguthes\t24666\n阿司匹林\t24667\n干紧\t24668\nTATTAT\t24669\n六三二八八二九二\t24670\nchyes\t24671\njuqiqius\t24672\n肥大\t24673\n挥翁而尽\t24674\n可以说出来\t24675\n王彤羽\t24676\n蟑\t24677\n能闻\t24678\n朱林威\t24679\n概念机\t24680\n能不能不等\t24681\nhttpwwwjulaibaocommemberregaspxrefmanyz2927557177\t24682\n扬眉吐气\t24683\n朗多浪\t24684\n东北味\t24685\n签约\t24686\n小哥哥\t24687\n大图真的超灵啊TTTTTTTTTTTTTTTTTTT我的黄金星\t24688\n咸水\t24689\n阿育王塔\t24690\n000英里\t24691\n6gg7\t24692\n多物\t24693\n以求\t24694\n晚安早安\t24695\n声势浩大\t24696\n多特\t24697\n名侦探\t24698\n多片\t24699\ncvbbh\t24700\n就业难\t24701\n陶头\t24702\n2016年后\t24703\n男女们\t24704\n阿西\t24705\n毒米\t24706\n22栋\t24707\n在此吧\t24708\n1224556689912334556789123456789123456789\t24709\n交钱\t24710\n清香茅台酒\t24711\n十六点\t24712\nmgmjg\t24713\n以为什么不说\t24714\n狈以为\t24715\n七八九十十一十二十二十十五时\t24716\n脑莫吖\t24717\nasssss\t24718\n干活间\t24719\n只求\t24720\n农历四月廿一\t24721\n真心想的人\t24722\n绣眼鸟\t24723\n33.72元\t24724\n不爱哭\t24725\n我我是我报完修\t24726\n宽宏\t24727\n重获\t24728\n朋友之间\t24729\n我爱记歌词\t24730\n打一顿\t24731\n不是我没说你丑度秘你是最美的了你\t24732\n5894128804\t24733\n滤膜\t24734\n263288\t24735\n再+1句\t24736\n别框\t24737\n磨练\t24738\n后周\t24739\n服兵役\t24740\n喔靠\t24741\n哀家不想\t24742\n还在度\t24743\n武嘉豪\t24744\n李可雯\t24745\n叶落风\t24746\n玩火自焚\t24747\n真的好\t24748\n弹精竭虑\t24749\n发句话\t24750\nniqusibadumi\t24751\n声调\t24752\n黑大姐\t24753\n5806347863113\t24754\n陈研\t24755\n数千\t24756\n七八零三一个\t24757\n万境\t24758\naviary\t24759\nwhatareyou\t24760\n管区\t24761\n找事干\t24762\n明治奶粉\t24763\nRfpkdhfyjh\t24764\nsykjgj\t24765\nDonatella\t24766\n小房\t24767\n9月21日\t24768\nXBCD120\t24769\n波什\t2
4770\nhi斯古兰\t24771\n停车案\t24772\n吕明聪\t24773\nGdjsbj\t24774\n费魔\t24775\n晕晕晕晕晕晕晕晕\t24776\n神群主\t24777\n平坦\t24778\n无论\t24779\n高中考\t24780\n常见\t24781\n11｝\t24782\n八十年代\t24783\n赈早\t24784\n足岁\t24785\n回一挥\t24786\n暴降\t24787\n撑把\t24788\n乜过\t24789\n快乐的\t24790\n平均\t24791\n40.4%\t24792\n嚯嚯嚯嚯嚯\t24793\nZZXXCZ\t24794\n17111171171\t24795\n一步一步\t24796\nsupply\t24797\n紫纹\t24798\n欧巴刚\t24799\n袖珍\t24800\n紫红\t24801\nQEu\t24802\n紫约\t24803\n堂堡\t24804\n李蕾\t24805\n鸡毛礼貌再说\t24806\n乱走\t24807\n缺逼\t24808\n叩响\t24809\n类地\t24810\n中国半导体行业协会\t24811\n单冰溪\t24812\n5点钟\t24813\n南海加园\t24814\n办好\t24815\n爰睡爱美酒\t24816\n8555569\t24817\n把妹\t24818\n美国市场休市\t24819\n陈日新\t24820\n欧巴别\t24821\n这句话宝\t24822\n房顶\t24823\n穷不行\t24824\n五百克\t24825\n好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦啦啦啦啦啦啦啦啦啦\t24826\n奥大姐\t24827\n日国\t24828\n红黄白绿彩色花飘飘荡荡空中\t24829\n做好好\t24830\n向海\t24831\n宋子红\t24832\n五百兆\t24833\n自持\t24834\n曾馨\t24835\n引号康\t24836\nfgg\t24837\n天气报告\t24838\n林军波\t24839\n艺伟\t24840\n首星星泪\t24841\nidjxksjxjjdi\t24842\n布加迪威航\t24843\n热额额\t24844\n千娆\t24845\n1代\t24846\n核组词\t24847\n厌家\t24848\n于丽\t24849\n照别\t24850\n水果味\t24851\n单恋\t24852\n16116676716661671\t24853\n第四份\t24854\n姓孙\t24855\n第四任\t24856\n截断\t24857\n样们\t24858\n我不爱你我讨厌你我\t24859\nv血红蛋白\t24860\n开始你是我的秘书\t24861\n肾气\t24862\n肉麻可不\t24863\n过去篇\t24864\n企鹅罐\t24865\n没种\t24866\n仆街\t24867\n会儿天儿行\t24868\n东尼-尼尔森\t24869\n圣人\t24870\n植物大战僵尸1\t24871\n阿尔法贝塔\t24872\n旅游业\t24873\n明早5点20\t24874\n没秘\t24875\njqphxhjs\t24876\n亳州\t24877\n谢文良\t24878\n牛头不对马嘴\t24879\n结冰\t24880\n随意\t24881\n圣亚\t24882\n千斤重\t24883\ngack\t24884\n好了聊\t24885\n*罒\t24886\n草皮\t24887\n业内人士\t24888\n牛腩米\t24889\n薄荷糖\t24890\n是什么叫\t24891\n最适宜\t24892\n數量\t24893\n我的闺\t24894\n大份镇\t24895\n读幼师\t24896\n柏拉图\t24897\n新塘镇\t24898\n丁丁猫\t24899\n谢谢你说片\t24900\n地方政府\t24901\nreffr\t24902\n夫婦\t24903\n玄彬##河智苑#\t24904\n夫婿\t24905\n小宝三\t24906\n仓库仔\t24907\ny\t24908\n树蛙\t24909\n秘一脚\t24910\n度数\t24911\n系五明\t24912\n可以\t24913\n重庆中旅\t24914\n避雷针\t24915\n绿森\t24916\nZbbb\t24917\n郑晓钰\t24918\ndddjjjjdf\t24919\nhjbvvg\t24920\n弹珠机\t24921\n吴璎展\t24922\nggjgjgjgjgj\t24923\n都帅\t24924\nSEP组合\t24925\n我是皇后我是皇后我
是皇后\t24926\n都市\t24927\n唠嗑\t24928\nwrbm\t24929\n本学期\t24930\n一二三四五六\t24931\ngohnb\t24932\n258656558\t24933\n7eS\t24934\n度敏\t24935\n范英\t24936\n蠡县\t24937\nuurhg\t24938\n一万瓶\t24939\n说了窝\t24940\n1299元\t24941\n基亚尔卡\t24942\n胆餐\t24943\n刘思蕊\t24944\n幺五八\t24945\n走一回\t24946\n新月湖\t24947\n第7名\t24948\n经检查\t24949\n洛杉矶国际机场\t24950\n多oo个个\t24951\n李在缩撒\t24952\n哭了是\t24953\n无限网\t24954\nabcdefger\t24955\n4444414\t24956\n拍摄\t24957\n拍摇\t24958\n瞌睡兰\t24959\n五点零五\t24960\n杨神经\t24961\n坏品\t24962\n漫川关\t24963\n啦啦啦啦我的小阿飞\t24964\n找破灭\t24965\n一文不值\t24966\n表心塞\t24967\n半价\t24968\n朱良红\t24969\nQw\t24970\npez\t24971\n弄成\t24972\n梦见是不是傻逼我问你你老实告诉我如果你不老实要是我的话\t24973\n岳弟\t24974\n身词\t24975\npel\t24976\npen\t24977\n臣服\t24978\npei\t24979\n扣眼\t24980\nxxgczzk\t24981\npeb\t24982\n咋小东门淘气鬼\t24983\n传情\t24984\n壮行\t24985\nnewlei\t24986\n孙可可\t24987\n黑影\t24988\n220米\t24989\nABAC式\t24990\n35名\t24991\n文字\t24992\n尼妹\t24993\n文子\t24994\n汊女\t24995\n110628希澈\t24996\n10几万亩\t24997\n服务楼\t24998\n全笑\t24999\n害虫害\t25000\n看风景\t25001\n文学\t25002\nhi秒sqvn\t25003\n鸟类\t25004\nGDJD\t25005\n饱眼福\t25006\nQb\t25007\n信诚人寿\t25008\n64346734683133434334313\t25009\n派克士\t25010\n惹麻烦\t25011\n亮出来\t25012\n再来再来再来再来再来再来再来再来再来再来再来再来再来再来\t25013\n妊辰囊\t25014\n李林青\t25015\n导购\t25016\n#CHIN游夏日#\t25017\n魏涛\t25018\n卖人\t25019\n1.6GHz\t25020\n哎唉\t25021\n度假的秘密\t25022\n这个小孩子别闹小说话\t25023\n索K嗨野\t25024\n淮山\t25025\n协销\t25026\n潘诗研\t25027\n灵母\t25028\npraiseachild\t25029\n九寨\t25030\n存疑\t25031\n狮子男\t25032\n瓦男\t25033\n蛙趣\t25034\n跑得快\t25035\n殊不知\t25036\n565855455555\t25037\n嫉妒恨\t25038\n13955165210\t25039\n赛尔提\t25040\n亅亅\t25041\n太平岛\t25042\n羊毛\t25043\n我是你的小栗鼠度秘\t25044\n顾霸\t25045\n你天\t25046\nDICE\t25047\n15658684565\t25048\n铁定\t25049\n明节\t25050\nklhfhlg\t25051\n婆婆\t25052\n长湖路\t25053\nverma\t25054\n月宝儿\t25055\n牛奶牛奶\t25056\n标号\t25057\n观赏\t25058\n工作单位\t25059\n吴大侠\t25060\n上海火车站\t25061\n7936\t25062\nbdyp\t25063\n夺来\t25064\nSHINEE\t25065\nfyuu\t25066\n18833764\t25067\n殷晴\t25068\n一千一千多\t25069\n耍睑\t25070\n利好\t25071\nfyug\t25072\nQA\t25073\n585481111112388885\t25074\n谨祝\t25075\n立场\t25076\n一点二十四
\t25077\n耍睡\t25078\n我的爱子\t25079\n60多万吨\t25080\n北戴河一中队\t25081\n9856855588\t25082\n睾丸\t25083\n处对象\t25084\n完高高\t25085\n邵志昊\t25086\n2268元\t25087\n林氏黄\t25088\naunt\t25089\n陈浩\t25090\n臭龙龙\t25091\n韦蘅桀\t25092\n郑州市中级人民法院\t25093\n飞长\t25094\nQesds\t25095\n大长今\t25096\n账户\t25097\n1844584481711676\t25098\n太的奥\t25099\n切勿\t25100\n家豪\t25101\nT台上\t25102\n怕苦\t25103\n小白桑\t25104\n辣条\t25105\n啦啦啦啦啦了哪呢哪呢哪啦啦啦啦啦啦啦啦啦啦\t25106\n飞镖\t25107\nScott\t25108\n三女\t25109\n江牧之\t25110\n霹雳啪啦霹雳啪啦啦啦啦啦啦啦淅沥啪啦霹雳啪啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦是哪首歌\t25111\n一百二一百一十多块\t25112\n一块钱\t25113\n惦念\t25114\nhighttttjja\t25115\n刘梦发\t25116\n三条腿\t25117\n寂莫女\t25118\n我喜欢讶\t25119\n高架桥\t25120\n北京国家会议中心\t25121\n聊你未来有没有女朋友事儿\t25122\n淘宝店\t25123\n201家\t25124\n第二堆\t25125\n神马彼岸厕所了会要\t25126\n坐身\t25127\n渭水\t25128\nSaint\t25129\n王珍玥\t25130\n风口处\t25131\n聚集\t25132\n四怒海\t25133\n赵彩艳\t25134\n猎奇\t25135\n尸位素餐\t25136\n和田\t25137\n我和他的故事\t25138\nmlae\t25139\n幺七六\t25140\nn　　　n\t25141\n面值\t25142\n主页\t25143\n原谅我再也不感\t25144\n纺织服装周刊\t25145\n崔海鹏\t25146\n古色\t25147\n向往\t25148\nehvrjrjr\t25149\n3月27日\t25150\n投怀送抱\t25151\n近景\t25152\n12孙\t25153\n100万吨\t25154\n死肉穴\t25155\nffhtrdassdghhyyyyyyyttt\t25156\n艾宾尼\t25157\n献舞\t25158\n剛剛初中生8公司電話國色生梟jfnwwrio\t25159\n主題\t25160\n平行线\t25161\n萌萌哒哼哒萌萌\t25162\n王帅宾\t25163\n零四十八\t25164\n錒太\t25165\n炫儿\t25166\n知青博物馆\t25167\n一点个\t25168\n哇塞鸭肉\t25169\n好芳芳\t25170\nIII泡00\t25171\naron1\t25172\n六五幺幺\t25173\n14：30-17：00\t25174\n刀乁\t25175\n战机队\t25176\nWHAT\t25177\n皇叔\t25178\nnnmn\t25179\n一点一\t25180\n说句句\t25181\n贵饿\t25182\n省区\t25183\n马艳萍\t25184\n二踢脚\t25185\n奥林\t25186\n听懂的你唱\t25187\n罗彪\t25188\n曹景辉\t25189\n兔年\t25190\n谢静薇\t25191\n剪刀布\t25192\n有句话\t25193\nfgjjh\t25194\n事业\t25195\n哈工大\t25196\n735千克\t25197\n清吻\t25198\n都糸无有\t25199\n二十出头\t25200\nabendan\t25201\nALU\t25202\n奸夫淫妇\t25203\n菠萝哥\t25204\n馄[\t25205\n张德航\t25206\n贺信\t25207\n低点\t25208\n空相\t25209\n魔竭\t25210\n结构性\t25211\n嘿巴扎\t25212\n事主\t25213\n陈紫霞\t25214\n米脂\t25215\n破处\t25216\n尼玛个逼我操你\t25217\ndjdjdjdu\t25218\n孤孤单单\t25219\n每一年\t25220\n园资\t25221\n冒出\t25222\n起去\t25223\n樱桃小丸子我喜欢\t25224\n撕逼\t25225\n纯结\t252
26\nqaeawaraq\t25227\n了不要总裁\t25228\n刘颖\t25229\nygopro\t25230\ncincmetaday\t25231\n来来来来来来\t25232\n余龙\t25233\n纯绿\t25234\nshixiu\t25235\n农历六月\t25236\nguggugy\t25237\n构发\t25238\n曹善昌\t25239\n走投无路川流不息\t25240\n告示\t25241\n33388\t25242\n一千八百六十八八千六百八十九万8680w\t25243\n9484845407878787878787\t25244\nQQ灰车\t25245\n飞梦\t25246\n万紫云\t25247\n姐体\t25248\n撒撒撒\t25249\n潮店儿\t25250\n哥哥哥哥哥哥\t25251\n蕉叶关机\t25252\n十起\t25253\n美白园\t25254\n摩纳哥\t25255\n哈别说\t25256\n诺贝尔经济学奖\t25257\n满大街\t25258\n胡奔祥\t25259\n幺二零幺幺零一九八六幺幺二二三零幺幺\t25260\n峰顶\t25261\n弱国\t25262\nabcd\t25263\n李宝盛\t25264\n郭小玉\t25265\nabcg\t25266\n1136220476\t25267\n天AK\t25268\nabcc\t25269\n段时光\t25270\n世杰\t25271\nabch\t25272\n稀罕我摸摸\t25273\n卑鄙无耻\t25274\n修身者\t25275\n戴雅辉\t25276\n感脚\t25277\n蒙自\t25278\n四十八小时\t25279\n眼镜\t25280\n小龙虾\t25281\n时装秀\t25282\nafk\t25283\nafd\t25284\n乖妈妈咪呀春天在哪里春天太阳雨诗小光\t25285\naff\t25286\n墨达\t25287\n大庆\t25288\n价值\t25289\n可以太好了\t25290\n火速\t25291\n战狼\t25292\n社交性\t25293\nafu\t25294\n见龙在\t25295\n雷少宽\t25296\n之已\t25297\n申雪莹\t25298\n陈丽琪\t25299\n放不能\t25300\n洛天依\t25301\n妹长\t25302\n记过\t25303\n1347137538757475\t25304\n乌兹别克\t25305\n袁新发\t25306\n你的爸爸多大我最爱你\t25307\nfhgvcggd\t25308\n奉航公司\t25309\n救穷\t25310\nrrrrrt\t25311\n北服\t25312\n参观点\t25313\nV噶\t25314\n供房\t25315\n破形容\t25316\n恩师\t25317\n种蛋\t25318\n香奈儿五号传奇\t25319\n张冠仁\t25320\n大宇物\t25321\n王珠晓\t25322\n横流\t25323\n张总裁\t25324\n过冲\t25325\n拟人\t25326\n1452369870\t25327\n王君一\t25328\n第23集\t25329\n臭味相投\t25330\n四十七\t25331\n过冬\t25332\n联通宽带公司\t25333\n四十下\t25334\n有点意思\t25335\n杜秘书\t25336\n实事\t25337\n金世佳\t25338\n188斤\t25339\n列仙\t25340\n岳海珠\t25341\n洛神赋\t25342\n五四北路省\t25343\n无暇\t25344\n蜀山战纪剑侠\t25345\n高干文\t25346\n吴大姆\t25347\n科维琪\t25348\n3.79亿元\t25349\n李个红\t25350\nggyyhh9\t25351\n马文\t25352\n明片\t25353\n哈俊\t25354\n郑朋友\t25355\n1367453216308397899\t25356\nGMAT\t25357\n明牌\t25358\ndwj\t25359\n开胃\t25360\n大力哪了吗睐\t25361\n我原谅你哪的你在哪里过几天\t25362\n开胸\t25363\n7k7k小游戏\t25364\n15-20分钟\t25365\nAdobe\t25366\n无穷玩儿笔仙儿\t25367\nspeakenglish\t25368\nsin\t25369\n愈加\t25370\n近亲\t25371\n布偶熊\t25372\n事业绩\t25373\n近人\t25374\n樱花儿开\t25375\n悲痛背痛\t25
376\n笼物\t25377\n托尼\t25378\n起来吧我的好非\t25379\nvvyxx\t25380\n主博\t25381\n张梦怡\t25382\n17号\t25383\ntfsykcd\t25384\n连同\t25385\n福州市政府\t25386\n哈美\t25387\nru\t25388\n快点唱吧\t25389\n北语\t25390\nIPAD2\t25391\n题带\t25392\n应景凡\t25393\n奥迪宝马\t25394\n您好帮\t25395\n托尔\t25396\n继续的理由\t25397\n遗弃\t25398\n啦你是我的小秘书度秘\t25399\n2000队\t25400\nwkks\t25401\n场景\t25402\n主卧\t25403\nydhbdjgv\t25404\ncwc\t25405\n早一点\t25406\n借宿\t25407\n基z\t25408\ns嗯\t25409\n微单F3\t25410\n明眼\t25411\n木乃伊\t25412\n这种词\t25413\n老太婆\t25414\n二零二两千\t25415\n那时哈\t25416\n流死\t25417\n通天雷\t25418\n校史\t25419\n能做\t25420\n这种话\t25421\nbibilibili\t25422\n爱滴人儿\t25423\n长庚\t25424\n快哉\t25425\n745千米\t25426\n底油\t25427\nhfbbfdidi\t25428\n捏求\t25429\n长床\t25430\n来啦混蛋\t25431\n毕业了干\t25432\n蓝玫瑰\t25433\n八宝粥\t25434\n鞠燕燕\t25435\n小女孩战\t25436\n易先明\t25437\n王一杨\t25438\n七十八二\t25439\nappletroo\t25440\nk662次\t25441\nog3\t25442\n艺术生\t25443\n笨忍者\t25444\n大转移\t25445\n几声\t25446\n滴流\t25447\n抛售\t25448\n高贝斯\t25449\n120207KBS\t25450\n删贴\t25451\nOOXX\t25452\n王传兵\t25453\n凯特\t25454\n因为我喜欢你我想和你\t25455\nsm傻帽\t25456\n路易·雪莱\t25457\nxxxl\t25458\n耳基\t25459\n求处\t25460\n3Oz\t25461\n女知道\t25462\n五月六天\t25463\n吹泡\t25464\n颇有\t25465\n陈欣烨\t25466\n郑翔\t25467\n结尾\t25468\n会儿天儿早点儿\t25469\n蜂蜜乖乖你好像\t25470\n8446291\t25471\n顺德\t25472\n激情四射\t25473\n猛\t25474\n赵大葱\t25475\n太了五下期\t25476\n郑翁\t25477\noge\t25478\n凉冰\t25479\n一锅粥\t25480\n西市\t25481\n读库\t25482\n王大王\t25483\n胆汁肠液\t25484\n牛头\t25485\n让我看\t25486\n牛牛牛额额额额曲项天歌白毛湖\t25487\n廖小花\t25488\n哇哈哈\t25489\n布提\t25490\n寺曼\t25491\n黑皮书\t25492\n猪胱病\t25493\n第六本\t25494\n几50\t25495\n心狠手辣\t25496\n形形色色\t25497\n8点49\t25498\n8点47\t25499\n随我喜欢\t25500\n我唯一最好的朋友\t25501\n8点40\t25502\n几班\t25503\n邢锐\t25504\n郁结\t25505\n想不起\t25506\n爱我的好姐姐我求求你了你相信我吧我真的是贵\t25507\n握握手\t25508\n坟墓\t25509\n友好\t25510\n前年\t25511\n兄弟情\t25512\n歌美e百分\t25513\n功放机\t25514\n友女\t25515\n亲爱的讨厌我\t25516\n剪自己去\t25517\n吕贸\t25518\n1.31\t25519\n护垫\t25520\ncbux\t25521\n赵清新\t25522\n椒江马路桥残联\t25523\n哦罗嗦\t25524\n真罗索\t25525\n131452074\t25526\n巧言\t25527\n逆真\t25528\nX—35\t25529\n2.3Rom\t25530\n仍然\t25531\n银联\t25532\n室内\t25533\n心有灵犀\t255
34\n一天一顿\t25535\n痒痒\t25536\n9月17日下午\t25537\n恶罗王\t25538\n零花\t25539\nkjksjhsklap\t25540\n谢谢你我告诉你\t25541\n丹顶鹤\t25542\n卡梅伦\t25543\n公我真\t25544\n不是我的错\t25545\n平常事\t25546\n来玩什么\t25547\nFdsbjhhttybhd\t25548\n发现\t25549\n芽儿\t25550\n都厅\t25551\nm白糖\t25552\n邮轮\t25553\n杂子\t25554\nvjdfd\t25555\n囊肿\t25556\n杂字\t25557\n李慎敏\t25558\n精功科技\t25559\n欲滴\t25560\n蕾丝衫\t25561\n2001年3月17日\t25562\n我是秘度我是秘书我是秘度\t25563\n球状\t25564\n行行行好\t25565\n不打不打我真的好想你好想你好想你我真的好想你好想你好想你都\t25566\n惊涛骇浪\t25567\n那双\t25568\n样板戏\t25569\n就在你的你在哪这种\t25570\n那又\t25571\n没日没夜\t25572\n毕夏\t25573\n举械\t25574\n夺魁\t25575\n追随\t25576\n仕途\t25577\n比试\t25578\n刘锡荣\t25579\n5pt\t25580\n粽色\t25581\n洗刷刷偶偶洗刷刷洗刷刷偶偶洗刷刷\t25582\n井度秘\t25583\n忽忽那年\t25584\n泥人儿\t25585\n161cm\t25586\n甜食\t25587\n一个六岁\t25588\n萌学园四萌我还是你\t25589\n译锋\t25590\ngxgf\t25591\n风格\t25592\n悦己\t25593\n哈伦裤\t25594\n卡带\t25595\n站队走\t25596\n先帝\t25597\nareas\t25598\n神马老袁\t25599\n百分之八十八十多\t25600\n满丑\t25601\n主客队\t25602\n香草味\t25603\n你题我\t25604\n两万三万\t25605\n离歌\t25606\n乐趣\t25607\n特产馆\t25608\n哒哒哒哒呱呱呱呱呱呱呱呱哒哒哒哒哒哒哒哒哒\t25609\n0昂首\t25610\n本空\t25611\n谜尚新都家园十九号楼\t25612\n卡帅\t25613\n袋数\t25614\n卡布\t25615\nareaS\t25616\nへ\t25617\nま\t25618\nび\t25619\nひ\t25620\n神态\t25621\nぶ\t25622\nふ\t25623\n双瞳\t25624\nな\t25625\nnotyo\t25626\nと\t25627\nは\t25628\nの\t25629\n鲍胤曌\t25630\nっ\t25631\n25.1%\t25632\n幻想类\t25633\nで\t25634\nて\t25635\nづ\t25636\nつ\t25637\n神怪\t25638\n中高端\t25639\n五分钟\t25640\nじ\t25641\nた\t25642\n一百三\t25643\nそ\t25644\n富一代\t25645\n黃付\t25646\nし\t25647\n二十周岁\t25648\nさ\t25649\nか\t25650\n歌词\t25651\nぉ\t25652\n偶戏\t25653\nき\t25654\nが\t25655\nぃ\t25656\n早疯了\t25657\nぁ\t25658\n脱二\t25659\nぇ\t25660\n哆啦a梦\t25661\n超能机器人\t25662\n忘恩负义\t25663\n火影号\t25664\n小蜗牛\t25665\n曾任国\t25666\npkpkpktou\t25667\n暗战\t25668\n做先走\t25669\n高超\t25670\n狗不叫\t25671\n一百四十七块\t25672\n携家带口\t25673\n连续剧\t25674\n肾衰竭\t25675\nfdfsfafdf\t25676\n高越\t25677\n我的天使\t25678\n黄亚蓝\t25679\n开盘\t25680\n文文眼\t25681\n小盆友们\t25682\n190克儿\t25683\n高足\t25684\n叫比\t25685\n找你来\t25686\n可也谅解\t25687\n21支\t25688\n逆神谱\t25689\n庚金\t25690\nsomebodyelse\t25691\n郭宏侠\t25692\n人肉叉烧包\t25693\nluko
m\t25694\n白臭美\t25695\n愤怒的小鸟\t25696\n复线\t25697\nvuuvu\t25698\n三万年\t25699\n牛肉干\t25700\n吴磊\t25701\nweIcome\t25702\n孟加拉\t25703\n周报\t25704\n上文\t25705\n伊人\t25706\n乐清市\t25707\n十二五\t25708\n街头\t25709\n七八十年代\t25710\n5dkdb\t25711\n赌毒\t25712\n嘉诚\t25713\n气喘嘘嘘\t25714\n度假\t25715\n梦幻西游\t25716\n限制\t25717\n电编\t25718\n王思琪\t25719\n一百二十一块\t25720\nshkas\t25721\n弗兰\t25722\n周折\t25723\n上方\t25724\n领地\t25725\n须发\t25726\n艾森埃森\t25727\n100问\t25728\n不必要\t25729\n十二亿\t25730\n伊斯兰历\t25731\n等一会儿\t25732\ndufufuh\t25733\n离石\t25734\n租金\t25735\n索沙井\t25736\n925岁\t25737\n几百号\t25738\n咕噜噜噜噜\t25739\n闹失踪\t25740\n双屿\t25741\n几百只\t25742\n白瘦\t25743\n日日梦\t25744\nagjtjmm\t25745\n四公里\t25746\nvi点酒\t25747\n阿拉希\t25748\n阿拉布\t25749\n宽带没有网\t25750\n123345667896\t25751\n乘数\t25752\n五点半\t25753\n环顾\t25754\n双屏\t25755\n增增\t25756\n背着\t25757\n梁海金\t25758\n洋后\t25759\n盈\t25760\n衣服亲\t25761\n安别回\t25762\n前背包\t25763\n叶修\t25764\n省外\t25765\n纱男\t25766\n睁开眼看\t25767\nxjoe\t25768\n55.5%\t25769\n施先生\t25770\n文王神卦\t25771\n死币\t25772\n内饰\t25773\n利哥\t25774\n刘洪亮\t25775\n遗言\t25776\nBring\t25777\n跌落\t25778\n为你点赞\t25779\n你做我的小秘书我没有工资给你的哦我还在读书\t25780\n天涯赤子心\t25781\n热情额\t25782\n科大\t25783\n上面前\t25784\ntfys\t25785\ncanus\t25786\n521111111\t25787\n湖南\t25788\n天天酷跑号儿\t25789\n胃口也吗我不讨厌我讨厌你\t25790\njhpla1ak\t25791\n六七十六\t25792\n二建\t25793\n泰聚美\t25794\n我可以\t25795\n5成天\t25796\n恩你个不头鬼\t25797\n一课\t25798\n给我不好相处\t25799\n一读\t25800\n一诺\t25801\n黑你\t25802\n青楚楚\t25803\n26838535\t25804\n嗯闺蜜\t25805\n智光\t25806\n一语\t25807\nCat\t25808\n清真寺\t25809\n补白\t25810\ner34乌烈7k7k小花\t25811\n唐宁钢\t25812\n洒遍\t25813\n恃强凌弱\t25814\n一话\t25815\n死神来了5\t25816\n盛\t25817\n炎\t25818\n片品\t25819\n护送\t25820\ntfy4\t25821\n一证\t25822\n片哈\t25823\n剧评\t25824\n书呆子\t25825\n英語嗎\t25826\n每隔二十分钟\t25827\n耨\t25828\n盟\t25829\n钱图\t25830\n哥块\t25831\n邓镇洪\t25832\n耻\t25833\n刚五\t25834\n耹\t25835\n耿\t25836\n耽\t25837\n姜泽幸\t25838\n耳\t25839\n耶\t25840\n猪母猪臭猪大母猪死机秘蛛\t25841\n笨熊熊\t25842\nfgdffrfftfddfdff\t25843\n耏\t25844\n朱雨蓉\t25845\n耍\t25846\n而\t25847\n考\t25848\n老\t25849\n所以说好\t25850\n佟美男\t25851\n者\t25852\n做彼天\t25853\n霸体\t25854\n郯城县\t258
55\n耙\t25856\n无人机器人\t25857\n耒\t25858\n耐\t25859\n耗\t25860\n耕\t25861\nlaiwoJim\t25862\n否否\t25863\n王震林\t25864\n大众公司\t25865\n圣杯圣杯\t25866\n走放\t25867\nmstaranis\t25868\nfhggjk\t25869\ntesth\t25870\n好玩号给我一个呗\t25871\n74455\t25872\n蜂价\t25873\n见不愧\t25874\n唱段\t25875\n传媒\t25876\nvkgjjk\t25877\n地说\t25878\n10月29日上午\t25879\n追逐梦想\t25880\n犬狗\t25881\n劲松岗\t25882\n屈比\t25883\nDgjkoigcv\t25884\n比也儿\t25885\n鸡蛋糕\t25886\n本班\t25887\n徐安安\t25888\n东方日报\t25889\n幽少\t25890\n20世纪初\t25891\n谢谢你度秘\t25892\n0.1个\t25893\n亚瑟和他的迷你王国\t25894\n史书\t25895\n1234567吧九十十一十二多少\t25896\n陈登茁\t25897\n这种度\t25898\n直航\t25899\n福州北站\t25900\n城仙\t25901\ntmp\t25902\nMike\t25903\ntmt\t25904\n忠于一样\t25905\ntmw\t25906\ntmx\t25907\n虫族\t25908\n第几位\t25909\ntma\t25910\n二门\t25911\n信用度\t25912\n烧菜\t25913\ntmi\t25914\ntmj\t25915\ntmm\t25916\n山道\t25917\n水头\t25918\n范秋迪\t25919\n希真\t25920\n我是你的爱人\t25921\n蓟县第一小学\t25922\n未愈\t25923\n美观卫生\t25924\n大棚子\t25925\n2月24号\t25926\n自营银行系\t25927\n新疆医科大\t25928\nshaolinchow\t25929\nalsoput\t25930\n疙瘩我是你主人女大\t25931\nGiglia\t25932\n晋北古楼烦\t25933\n譬\t25934\n手续费\t25935\n铁马\t25936\n亲爱的陪我聊会\t25937\n许建欢\t25938\nbbbbbhhh\t25939\nVB\t25940\n堪笑\t25941\n中山路\t25942\n斯本\t25943\n采信\t25944\n乱情迷\t25945\n加盟费\t25946\n一万亿\t25947\n存货\t25948\n大伤心\t25949\n坑园\t25950\n下款\t25951\n饿不日\t25952\nbu4\t25953\n卢克汗\t25954\n汤亚敏\t25955\n哇版\t25956\n这么样\t25957\nmalpfrs\t25958\n艺术史\t25959\n1986\t25960\n1987\t25961\n奋进者\t25962\n1982\t25963\n1983\t25964\n1980\t25965\n1981\t25966\n王小丹\t25967\n陈幸琳\t25968\n想不得\t25969\n萧淑妃\t25970\ngthh\t25971\n离家出走\t25972\n马爹利\t25973\n银化\t25974\n志同机器人\t25975\n雅图\t25976\n當磚\t25977\n王小丫\t25978\ngthy\t25979\n转筋\t25980\n财务局\t25981\n身心说\t25982\nbuy\t25983\n贾欣顺\t25984\n王八在\t25985\n间间\t25986\n小小小小小小小小\t25987\nbus\t25988\nbup\t25989\n韩馨蔓\t25990\nbut\t25991\nbuu\t25992\nbuj\t25993\n长沙泊车有限公司\t25994\n小薰\t25995\nbui\t25996\nbun\t25997\n奎爷\t25998\n藤条\t25999\n促恧反应\t26000\nbuc\t26001\n拔河\t26002\n芭比娃娃积木\t26003\nbud\t26004\n理养\t26005\n国际刑事法院\t26006\nbuX\t26007\n船舶\t26008\n小薇\t26009\n丹和一郎\t26010\n绝种\t26011\n有请问\t26012\n00000000000123\t26
013\n内向\t26014\n我哥\t26015\n被里\t26016\n英语\t26017\n理公\t26018\nminutes\t26019\n护照\t26020\n剪了吧\t26021\n茶匙\t26022\nckov\t26023\n站邪\t26024\n三线\t26025\n狡辩\t26026\n中国人民大学出版社\t26027\n曹飞\t26028\n韩茗语\t26029\n三级\t26030\n乘出租车\t26031\n两趟\t26032\n发哑\t26033\n裸体岁\t26034\n不骄不躁\t26035\n99999999999\t26036\n一6\t26037\n温冷\t26038\n孟天\t26039\n九振宇\t26040\n你爱我吗你爱我吗你爱我你爱我你爱我\t26041\n194年\t26042\n我的房间\t26043\nggggff\t26044\n三万英尺\t26045\n大眼\t26046\n老彭\t26047\n朵落\t26048\n成堆\t26049\n多愁善感\t26050\njhuk\t26051\nhjdjd\t26052\n密我不想\t26053\n‰\t26054\nVI\t26055\n′\t26056\n‵\t26057\n同年同月同日\t26058\n※\t26059\n爱的你说了\t26060\n搅基\t26061\n一百九十六\t26062\n男香\t26063\n美国共和党\t26064\n…\t26065\n说说过\t26066\n顾佳琪\t26067\n下載\t26068\n格机\t26069\n–\t26070\n点卷发\t26071\n―\t26072\n—\t26073\n‖\t26074\n’\t26075\n‘\t26076\n子曰岁寒\t26077\n”\t26078\n“\t26079\ndrtettt\t26080\nsprital\t26081\n1346788\t26082\n乌恰县\t26083\n睇拳王\t26084\n胸膛\t26085\n详净阁\t26086\n二四年\t26087\nGeForce\t26088\n洗澡青\t26089\n捷达捷达\t26090\n我喜欢区\t26091\n二四幺\t26092\ndert\t26093\n过少\t26094\n数典\t26095\n还是我美\t26096\n半梦半醒\t26097\n大不然\t26098\n结缘\t26099\n讥笑\t26100\n结编\t26101\n幻想弹弹行\t26102\nderd\t26103\n数八\t26104\nFFSTUFFFHG\t26105\n11斤\t26106\n创私\t26107\n钱管\t26108\nstriplion\t26109\n─En\t26110\n国民政府\t26111\n佐料\t26112\n靠山\t26113\n规章制度\t26114\n插卡\t26115\n小吉之论\t26116\nhdushsi\t26117\n道法\t26118\n凌霄血\t26119\n占道费\t26120\n无放弃\t26121\n周浩\t26122\n对不嫁给\t26123\n五轮\t26124\n范志超\t26125\n期悔\t26126\n迪迦奥特曼\t26127\n向日葵\t26128\n李家小\t26129\n左之林\t26130\n再就业\t26131\n放大不大\t26132\n李彦妮彦\t26133\n说拜拜\t26134\n十分钟以内\t26135\n徐艳新\t26136\n瘦脸\t26137\n窖藏\t26138\nyournema\t26139\n小米米\t26140\n走任\t26141\n1971.7.26\t26142\n五车\t26143\n宇飞\t26144\n截面\t26145\n高山\t26146\n猜中必中\t26147\n高屿\t26148\n木火夫妻大吉昌\t26149\n推广者\t26150\n女大飞机\t26151\n漫天小懒猫\t26152\n偷偷地\t26153\n4x8\t26154\ngncgjw\t26155\n春暧\t26156\n冯淼鑫\t26157\n脾蝤\t26158\n制杖\t26159\n慾橫流\t26160\n挖星\t26161\n南公里\t26162\n张双林\t26163\n大亚湾\t26164\n五载\t26165\n来笑一个白\t26166\ngggyhvf\t26167\n死硬\t26168\n高居\t26169\n第一帅\t26170\n郑志超\t26171\n高层\t26172\n作用点\t26173\ndatpmj123725221\t26174\n
精忠\t26175\n杨欣怡\t26176\nydgd\t26177\n4xo\t26178\n8a\t26179\n肚子干嘛\t26180\n驽钝\t26181\n天台天太\t26182\n秀恩\t26183\n相伴\t26184\n密丂\t26185\n苏你\t26186\n964894486\t26187\nVR\t26188\n活逼\t26189\nbfusbf\t26190\n18号晚\t26191\n苏佳\t26192\n小旗\t26193\n提笔\t26194\n绢卷\t26195\n炸骗\t26196\n我们之间\t26197\n网络部\t26198\n如有雷同\t26199\n616645465643311313316161664040480408443480000046106656556668666336\t26200\n1098765478\t26201\n墓地\t26202\n凤尾油饼母鸡汤\t26203\n吧全椒\t26204\n累操\t26205\ngnm\t26206\n配套\t26207\n树洞\t26208\nGHRK\t26209\n一穷\t26210\n第798\t26211\n天安门东国家博物馆北广场\t26212\n568875768588\t26213\n思右想\t26214\nvisible\t26215\n从今天\t26216\n偷天\t26217\n白大夫\t26218\n昆虫记\t26219\n得出结论\t26220\n中午一点\t26221\n独孤求败\t26222\n王沙\t26223\n淘气机器人\t26224\n意大利人\t26225\n赵心岩\t26226\nm4\t26227\n你错\t26228\n在至上\t26229\n55555665555\t26230\nm3\t26231\n真章\t26232\n再说拜拜\t26233\n奥运营销\t26234\nm9\t26235\n王沅\t26236\n特邀\t26237\n吾奇酷\t26238\n植物大战僵尸二中文版\t26239\nmC\t26240\nmB\t26241\nVV\t26242\nmL\t26243\n就是我的海角\t26244\n薛平贵与王宝钏\t26245\n交理\t26246\n天天你梦乡就在那里你的人心\t26247\n玉阳山\t26248\n68千米\t26249\n亮色\t26250\n失迷\t26251\n二支\t26252\n一笑话\t26253\n虎搞话\t26254\n天山路\t26255\nme\t26256\nmd\t26257\nmg\t26258\nmf\t26259\nma\t26260\n肠胃炎\t26261\nmb\t26262\nmm\t26263\nml\t26264\nmo\t26265\nmn\t26266\nmi\t26267\nmh\t26268\nmk\t26269\nblug7f\t26270\n字钢\t26271\nmt\t26272\nmw\t26273\nmv\t26274\nmq\t26275\nmp\t26276\nms\t26277\n1826269983\t26278\nmy\t26279\nmx\t26280\nmz\t26281\n邓小龙\t26282\n第二十三集\t26283\nTEE\t26284\n电子眼\t26285\n尖叫声\t26286\n啼鸣\t26287\n8G\t26288\n仙剑侠侣\t26289\n废话演说家\t26290\n屋屋\t26291\n沛沛\t26292\n国共党\t26293\n角铁\t26294\n度秘哇\t26295\n燃脂\t26296\n偷了钱\t26297\nend\t26298\nfggdg\t26299\neng\t26300\nenh\t26301\nhttppinyincne18\t26302\nmnmm\t26303\nenn\t26304\n77585221\t26305\nruoe\t26306\nenu\t26307\n啼鸟\t26308\nivorit\t26309\n略过\t26310\nhttppinyincn2FS0p4vAio0\t26311\n度秘哥\t26312\n佳紫鑫\t26313\n弗洛伊德\t26314\n秘不想\t26315\nEgyp\t26316\n滴水不漏\t26317\n饰演\t26318\n雪纺\t26319\n因为我要对你的话说\t26320\n7月18日\t26321\n52点\t26322\n笑眼\t26323\n来嘛明天任丘\t26324\n劳者\t26325\n10id\t26326\n3333333333337444444
747455558\t26327\n天气质\t26328\n生姜\t26329\n海口药谷\t26330\njliu\t26331\n中之道\t26332\n冯敏\t26333\n20寸\t26334\n不二不去\t26335\n很不错有\t26336\n池尾街道\t26337\n赛区\t26338\n悭番\t26339\n十几分钟\t26340\n我问问\t26341\n潜伏者\t26342\n心爱的\t26343\n一二三四五六七八九十十一三一四一五九二六五三五\t26344\n胡鑫泽\t26345\n桃花劫\t26346\n搜偶\t26347\n十十几岁\t26348\n兰加\t26349\n160w\t26350\n亚加一笔\t26351\n酷迷藏\t26352\nshepre\t26353\n手放\t26354\nBBTV\t26355\n吴阳\t26356\nfads\t26357\n量子\t26358\n你好多咪\t26359\nfadt\t26360\n火狐浏览器\t26361\n白小\t26362\nv3菱\t26363\n吴队\t26364\nwwwSECAOsom\t26365\n弧度\t26366\n铃声门\t26367\njbfh\t26368\n老奸巨猾哈哈nnn0332101101300\t26369\n付亚鹏\t26370\nhelloyoui\t26371\n布瓦\t26372\nhelloyoul\t26373\n天眠后物归原主再填空物归原主\t26374\n坏主意\t26375\nAWGHDDat电讯飞色出国强\t26376\n葫芦鸭\t26377\nwwrrdcgb\t26378\n容奇\t26379\n喜糖和酒\t26380\n有空不明飞你有\t26381\n奥迪狗\t26382\n韩上宫\t26383\n美景湾白苏维翁\t26384\n小染\t26385\n小柒\t26386\n8.66亿元\t26387\n朱主人公\t26388\nhelloyout\t26389\n裤子\t26390\n风暴\t26391\n奔奔奔奔奔奔奔奔奔奔奔奔奔奔奔奔奔奔奔\t26392\n事发前\t26393\nDchvbhxb\t26394\n注灵\t26395\n单身贵族\t26396\n小心块\t26397\nmpoundeyou\t26398\n屁民\t26399\n小柳\t26400\n来将挡\t26401\n布瓜\t26402\n甜甜甜蜜\t26403\n死干\t26404\n调解\t26405\n郭宏磊\t26406\n北上广深\t26407\nCddfttttghkyddhuyewxghhszcvhtm\t26408\n茶杯犬\t26409\n888888888888666666666666666\t26410\n赵菲\t26411\n会议室\t26412\n陈美波\t26413\n技高一筹\t26414\n三百多\t26415\n南通\t26416\n盗梦看的我直犯困\t26417\nkometis\t26418\n50秒\t26419\n變態\t26420\ngoo\t26421\n造极吧\t26422\nfygbjit\t26423\n万古乡\t26424\ngod\t26425\n你好跑\t26426\ngoa\t26427\n闹肚子\t26428\n罗紫燕\t26429\n郑州中院\t26430\nmaynot\t26431\ngox\t26432\n可辣\t26433\n一百份\t26434\ngou\t26435\ngot\t26436\ngow\t26437\ngov\t26438\n一百件\t26439\ngop\t26440\ngor\t26441\n大尸\t26442\n空了玩\t26443\n等你我等\t26444\nAFI\t26445\n狼狼\t26446\nAFD\t26447\n季沃来\t26448\n直女\t26449\n呵额\t26450\ncontist糊涂kitmyaltokitoourmatnoobahito\t26451\n57222点\t26452\n动辄\t26453\n按照\t26454\n一向北\t26455\n盖碗\t26456\n人贩子\t26457\n书记\t26458\n新田\t26459\n好多多\t26460\n盘子\t26461\n狼狗\t26462\n起司\t26463\n没了处会不会\t26464\n狼狈\t26465\n起句\t26466\n大小\t26467\n500ml\t26468\n张陆军\t26469\nIwilldrag\t26470\n大将\t26471\n13252524015\t26472\n粉肠\t
26473\n屈妞\t26474\n麻里央\t26475\n靠靠靠靠靠靠靠靠靠靠靠靠靠靠\t26476\n死绝\t26477\n奥利奥\t26478\n那招\t26479\n纪干\t26480\n溺亡\t26481\n萨拉\t26482\n5月21日\t26483\n我姓杜我家度秘\t26484\n发放\t26485\n龚昕怡\t26486\n龙布神\t26487\n那拉\t26488\nk477159\t26489\n年庆\t26490\n李梓孝\t26491\n古罗马\t26492\n未名城\t26493\n右手我不怕你王永远我国多不好玩爸爸我我多不怕我妈妈\t26494\n不色\t26495\n无穷愿\t26496\nidjjjdjow\t26497\n战国BASARA\t26498\n干嘛通知你\t26499\n变形金刚游戏\t26500\n方利强\t26501\n银赫\t26502\n慌言\t26503\n知情权\t26504\n不是们\t26505\n零二二八五\t26506\n3三九\t26507\n交界处\t26508\n桥桩\t26509\n50505080\t26510\n写真金箭奖\t26511\n榆林\t26512\n好丽友好朋\t26513\n13223139723\t26514\n黄小艳\t26515\n女士们\t26516\n白鲸\t26517\n天台宗\t26518\n郭果国\t26519\n给往\t26520\n初八\t26521\n初六\t26522\n专制\t26523\n300亿\t26524\n泡夫\t26525\n13564541843\t26526\n2010年10月14日\t26527\n丽丽丽丽丽\t26528\n肝胆\t26529\n初元\t26530\n白鲳\t26531\n7275亩\t26532\n一夫当关\t26533\n1五五千五百六十九5千五百六十九五千六百五\t26534\n专利\t26535\n干枣片\t26536\n一声舞\t26537\n蒙圈\t26538\nyouusualy\t26539\n兹纳盒\t26540\n死宅\t26541\n学神系\t26542\n谢金霞\t26543\ntalking\t26544\n电视剧\t26545\n其父\t26546\n移民镇\t26547\n八二六八五二零零\t26548\n长辈\t26549\nVj\t26550\n肉丝炒散\t26551\n幺姑父\t26552\n人像\t26553\n1618\t26554\n唉远\t26555\neam\t26556\n长达\t26557\n蒙在\t26558\n八十个\t26559\n发邮\t26560\n二十八岁\t26561\n戍边\t26562\n还以为\t26563\nhhhchhghjxgggkhkkkn\t26564\n周六秘\t26565\n三指\t26566\n威胁\t26567\n就是非爱\t26568\n凶杀案\t26569\n妈妈妈妈妈妈妈妈\t26570\n必胜客\t26571\n嘘寒问暖\t26572\nggcchajsg\t26573\n函电\t26574\n找把\t26575\n889波波\t26576\n疏密疏密女儿了想\t26577\n拉力\t26578\n换股\t26579\nbiggigjgj\t26580\nCWA\t26581\n嗑药\t26582\n魔币\t26583\n诗人们\t26584\n拉动\t26585\n姑u\t26586\n证秘\t26587\n得益於\t26588\n乖讨厌\t26589\n吴夏荣\t26590\n填空\t26591\nIlla\t26592\n莎薇\t26593\n64126512\t26594\nhelloBaiXiaodu\t26595\n三对\t26596\n三寸\t26597\n例假期\t26598\nhzhdhd\t26599\n伱｛\t26600\n啦度\t26601\n样桥\t26602\n照样\t26603\n马棚\t26604\n天师天\t26605\n保杀\t26606\n第四遍\t26607\n躐\t26608\nhxjkl\t26609\n扑朔迷离\t26610\n裂纹\t26611\n格栅\t26612\nRgjhgjb\t26613\n躬\t26614\n躯\t26615\nYouisapig\t26616\n身\t26617\n1156522724\t26618\n家子弟\t26619\n做强\t26620\n躲\t26621\n家人流\t26622\n石亚雄\t26623\n58处\t26624\n躺\t26625\n愿赌服输\t26626\n第五级\t2662
7\n霂突\t26628\nMynaneisGina\t26629\nqkdf\t26630\n货郎\t26631\nimimimimimimimimi\t26632\n淘宝双11双12\t26633\n恒\t26634\n孙大手\t26635\n1000000个\t26636\n明摆着\t26637\n答曰\t26638\n度密度密度度秘度秘度秘度度秘超级可爱的度秘\t26639\n洗度秘\t26640\n冤枉钱\t26641\nVVV染色\t26642\n25元\t26643\nunprt\t26644\n六零五分\t26645\n散学典礼\t26646\n背街\t26647\n丐村\t26648\n大作文章\t26649\n牛河\t26650\nabcdefghizakl撸illyoupisbureworisyatasyantimyouchiaabcdef\t26651\n厉树\t26652\n周星期二\t26653\nthanku\t26654\n水少明\t26655\n雅诗\t26656\n你的愿望\t26657\n烟草\t26658\n对她说\t26659\n草彅刚\t26660\n紫妹\t26661\n学说\t26662\n喳喳喳喳\t26663\n陈子鸿\t26664\n认错\t26665\n809\t26666\n致青春\t26667\n虚拟号\t26668\n瑶\t26669\n表现\t26670\n足珍珠\t26671\n多米亚度秘我真的好吗\t26672\n一个七十块\t26673\n直达\t26674\n表王\t26675\n肚面\t26676\n切克闹他是我的你别闹\t26677\n死家\t26678\n浑蛋\t26679\n100亿\t26680\n瑄\t26681\n表率\t26682\n墨迹然后\t26683\n瑟\t26684\n瑞\t26685\n朴爱子\t26686\nyf6f6guuuvf8888ttx888i8fff8888yf965ryan585\t26687\n瑕\t26688\n7530\t26689\n三四日\t26690\n夹江县\t26691\n鲍鑫丽\t26692\n同德会\t26693\n8月13日上午\t26694\n按完\t26695\nExo\t26696\n故现\t26697\n我喜欢岩\t26698\n乐施\t26699\n一风\t26700\n金倪军\t26701\n二二八零\t26702\n社会科学\t26703\n海贼有大冬瓜\t26704\n单幅\t26705\n咪咪咪\t26706\n忐忑阿姐\t26707\n牛蒡茶\t26708\n忍观\t26709\n住一住\t26710\n小黄子\t26711\n水垢\t26712\n马愣子\t26713\n禀性\t26714\n忒忒忒\t26715\n120705\t26716\n120706\t26717\n板栗\t26718\n黄文革\t26719\n宝贝呀我爱你我原零元的原谅你\t26720\nB站\t26721\n六蓝七\t26722\n天工坊\t26723\n实问实\t26724\n喔嘤嘤嘤\t26725\n张密度\t26726\n好愉快\t26727\n你好像猪你好像猪你好像猪你好像猪\t26728\n贤\t26729\n捐款箱\t26730\n下午一点\t26731\n责\t26732\n作势\t26733\n当让\t26734\n黄道\t26735\nhijk\t26736\n忠诚度\t26737\n后遗症\t26738\n于佳玉\t26739\n没哪拳\t26740\n蓝牙\t26741\n陈昱冰\t26742\n浇水\t26743\n黄文静\t26744\n汇流淫水\t26745\n吴亚芳\t26746\n呼欠\t26747\n我你也不吗那你呢干嘛呢\t26748\n于田火山公园\t26749\n气狗\t26750\n宁格\t26751\n顾长卫\t26752\n滑孙\t26753\ngghgf\t26754\n内卷土重来\t26755\n闲话\t26756\n马来\t26757\n邱一新\t26758\n托尔思\t26759\n芥末\t26760\n尼坤\t26761\n九九梅花\t26762\n个群\t26763\n金翠玉\t26764\n差一家子\t26765\n贩\t26766\n兴风作浪\t26767\n张。洪\t26768\nhepart\t26769\n从容\t26770\ndyyrly\t26771\npalladium\t26772\n从宽\t26773\n外度秘\t26774\n贷\t26775\n河源市梦幡文化传播有限公司\t26776\n暗含\t26777\n大多
重\t26778\n唐乐平\t26779\n姑奶\t26780\n白虹贯日6455\t26781\n王奕丹\t26782\n九五春\t26783\n恩如氏\t26784\n脂肪粒\t26785\nteachers\t26786\n展板\t26787\n闲语\t26788\nFollow\t26789\ncomgif\t26790\n穹窿山\t26791\n八九燕\t26792\n主朋\t26793\n睫毛控\t26794\nvv7\t26795\n说了我叫\t26796\n雪竞\t26797\n高大上打\t26798\n太平太阳\t26799\n咪都咪\t26800\n袁佳琪\t26801\n复唧唧\t26802\n邓马丽\t26803\n帅哒\t26804\n雷阳\t26805\n人物周刊\t26806\nhlhcbohhkvlhabook\t26807\n四彩\t26808\n我不要你了煮面\t26809\n探訪\t26810\njwow\t26811\n主机\t26812\n真好看天好看\t26813\nhbvjkkv\t26814\n我是人类维c\t26815\n泰迪吧\t26816\n奠基\t26817\n污蔑\t26818\n铁血\t26819\n赵我\t26820\n大佛\t26821\n五菱微\t26822\nvvz\t26823\nvvy\t26824\n大作\t26825\n大体\t26826\n胡彦斌\t26827\nvvu\t26828\nvvt\t26829\nvvs\t26830\nvvr\t26831\n梧桐\t26832\n看护者\t26833\nvvn\t26834\nvvk\t26835\nvvh\t26836\nvvg\t26837\nvvf\t26838\n準备\t26839\nvvd\t26840\nvvc\t26841\nvvb\t26842\n大使\t26843\n罗佐夫\t26844\n多米贷4200\t26845\n世博会\t26846\n那不\t26847\n恨度\t26848\n长时间\t26849\n作恶多端\t26850\n乐清政府\t26851\n大佬\t26852\n浐灞队\t26853\n拜拜洗\t26854\n李静琳\t26855\neme400125em\t26856\n杨文慧\t26857\nFgg\t26858\n胖子\t26859\n熊亦馨\t26860\nG21白2820\t26861\n韩正一\t26862\nNinJmjdg\t26863\n红旗街\t26864\nn　＿　／＿＿n　
／└\t26865\n雌从\t26866\nsssssssboom\t26867\n异口同声\t26868\n杨力\t26869\n晩安\t26870\n好想知道\t26871\n酒鬼\t26872\n互泽\t26873\n样样\t26874\n半岛电视台\t26875\n100。291\t26876\n恨死你了大坏蛋\t26877\n八15点\t26878\n好啦好啦你没\t26879\n十五页\t26880\n正阳萎书院医院\t26881\n灾难性\t26882\n理金\t26883\n周率\t26884\n度贵妃\t26885\n批写\t26886\n随我爸\t26887\n三宝鸟\t26888\n郑胤兴\t26889\n安吉品\t26890\n对蚀者\t26891\n加木\t26892\n贝\t26893\n上下铺\t26894\n单机版\t26895\n你的手\t26896\n嘎子宏\t26897\n灾区\t26898\ngiff\t26899\ngifg\t26900\n发梢\t26901\n灿灿\t26902\n笔笔\t26903\n12年后\t26904\n558888566852662335854456889805\t26905\nczdm\t26906\n吴霞雨\t26907\n云顶\t26908\n倒饭呐\t26909\n佳佳\t26910\n夏加尔\t26911\n臭儿\t26912\nabcdefthijklmnopqrstuvwxyzxac1ucikc\t26913\n低薪\t26914\n三样\t26915\n童浩\t26916\n3060分钟\t26917\n申毛\t26918\n三格\t26919\n三根\t26920\n馨馨\t26921\n松土\t26922\n想说了再见\t26923\n赚学\t26924\n千难万险\t26925\n佳作\t26926\n三栋\t26927\n429\t26928\n428\t26929\n950635\t26930\n巴甲\t26931\n啊萧\t26932\n买车\t26933\n421\t26934\n420\t26935\n423\t26936\nyikx\t26937\n425\t26938\n远古\t26939\n红豆牛奶西米露\t26940\nBashed\t26941\nptttwtt\t26942\nWwwwwwwww\t26943\n被拘留\t26944\n20156年\t26945\n英格利息犬\t26946\n再说错\t26947\n王戎\t26948\n不可数\t26949\nfrredf\t26950\nhgfgjjj\t26951\n心快乐\t26952\n性女\t26953\n杨素葡萄\t26954\n泰勒斯威夫\t26955\n42m\t26956\n布娃娃俩\t26957\n性奴\t26958\n通江红军广场\t26959\nCNBLUE\t26960\n阿姆斯特丹\t26961\nHHFA\t26962\n380亿元\t26963\n嘟嘟贝贝\t26964\n飘然而至\t26965\n家园\t26966\n技术性\t26967\n嘉实活期宝\t26968\n出身\t26969\n正佳\t26970\n蚕食\t26971\n了解锁\t26972\n四大祸\t26973\nDyttr6ntghygf\t26974\n做衣服\t26975\n当耳边风\t26976\n90元\t26977\n偷走\t26978\n等会儿\t26979\n看上学\t26980\nSunday#\t26981\n盗墓度秘\t26982\n快见水\t26983\n手勒\t26984\nourend\t26985\n项龙\t26986\n修辞\t26987\n一比一千万\t26988\n经困\t26989\n我心么西爱拍了拍\t26990\n猪皮冻\t26991\n天下无敌\t26992\n超級\t26993\n张文福\t26994\n宋清\t26995\n最后一次最后一次\t26996\n诚实\t26997\n专硕\t26998\nbbbbhssi\t26999\n拿出去\t27000\n康康女\t27001\n公意\t27002\np4年\t27003\nABS—CBN\t27004\n422亿元\t27005\n诚宋\t27006\n应孕\t27007\n柬东理念\t27008\nsojx\t27009\n1月11日\t27010\n赠款\t27011\n诛仙系\t27012\n伊迪丝·索德格朗\t27013\n签唱会\t27014\n鼠来宝\t27015\n26日\t27016\ngggggggggggggggmmmmmmmmmmmmmmm
m\t27017\n被迫\t27018\nadore\t27019\n吃擦\t27020\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t27021\n来劲儿\t27022\n都铎\t27023\n一多一多\t27024\n公愤\t27025\n黄彩东\t27026\n李佳雪\t27027\nhhhhg\t27028\n找你了我\t27029\n大疱疱疱\t27030\nhhhhh\t27031\n李怀清\t27032\n難道妳\t27033\n绽开\t27034\nhhhhv\t27035\n没有人管\t27036\n闰年\t27037\n云罩\t27038\nhhhhy\t27039\n张宇彤\t27040\n发奇q版\t27041\n不要紧张\t27042\nfijrj\t27043\n別骂\t27044\n金达威\t27045\ncdcbf\t27046\n木傀儡\t27047\n犒赏\t27048\ny77uugf\t27049\n好耶年\t27050\n不怕太多\t27051\n报亭\t27052\n敏感型\t27053\n东路给我讲一个故\t27054\n天黑烟火\t27055\n芝麻大\t27056\n杨家\t27057\n搧\t27058\n搡\t27059\n民百\t27060\n陈小妞\t27061\n搭\t27062\n搬\t27063\n土里土气\t27064\n搪\t27065\n宝山\t27066\n搽\t27067\n红天使\t27068\n阿卡\t27069\n沙哑\t27070\n希尔顿\t27071\n1769499707\t27072\n蜜战\t27073\n跌停\t27074\n搁\t27075\n应季\t27076\n作客\t27077\n搂\t27078\n迈得\t27079\n搏\t27080\n呦呦13oyouiouuou\t27081\n蝴蝶甲天下\t27082\n作家\t27083\n曼诺华夏\t27084\n天平座\t27085\n搓\t27086\n6en\t27087\n搜\t27088\n搞\t27089\ndgdvdgdbbd\t27090\n换柱\t27091\n小花脸\t27092\n心理品质\t27093\n一零年代\t27094\n冰爽\t27095\n112路\t27096\n稍纵即逝\t27097\n想问\t27098\n严防\t27099\nv希\t27100\n自满\t27101\n364个\t27102\n民兵达百\t27103\n说情\t27104\n想闻\t27105\n周固\t27106\n周国\t27107\n帮堂\t27108\n朱燕君\t27109\n来呀告诉你\t27110\n别忘了你是企业线\t27111\n张雨霏\t27112\n笑话了也不让你给我\t27113\n不愧为\t27114\n放假还\t27115\n米解霸\t27116\n我是女的你是男\t27117\n通俗易懂\t27118\n周四\t27119\n毛笔\t27120\ntown\t27121\n彭蕾\t27122\n没戏可\t27123\ngjvcnbc\t27124\n我的围巾了我的笑了的我的眼睛了我的时候了我的围巾了vjnycffghgdvechgfgg\t27125\n表子\t27126\n叙浦\t27127\nButtes\t27128\n炙疒\t27129\n别克君威\t27130\nbjwds\t27131\n尼斯\t27132\n席晨\t27133\n新客专享\t27134\n东方网\t27135\n洪兴\t27136\n野牛\t27137\n梦非梦\t27138\n滑舌\t27139\n一枝笔\t27140\n养老别\t27141\n抗战\t27142\n梦醒\t27143\n耳通炎\t27144\n娃娃机\t27145\n本宫\t27146\nstilat\t27147\n16枚\t27148\n分手\t27149\n岩好\t27150\n美国默克制药\t27151\n萤火汁\t27152\n16林\t27153\nbaka\t27154\n程过\t27155\n15日\t27156\nebbdjd\t27157\n这么少\t27158\n我随你\t27159\n好而优\t27160\n拉布鸡\t27161\n3333micmicmicmic灯\t27162\n好球\t27163\n雅虎\t27164\n秋田犬\t27165\n9时18分\t27166\n97.4\t27167\nYoutellmeajoke\t27168\n我不我\t27169\n欧曼\t27170\n王康康\t27171\n日期\t27172\n男
丑\t27173\n考靠\t27174\n剧场\t27175\n少言寡语\t27176\n乾隆年\t27177\n东西方\t27178\n二八零二\t27179\n日月\t27180\n小兔崽子\t27181\n男丁\t27182\n御前侍卫\t27183\n一百日元\t27184\n鹧鸪\t27185\n琼\t27186\n罪犯\t27187\n一号键\t27188\n长江医院\t27189\n男主\t27190\nThfhhnjkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk\t27191\n贝贝贝贝\t27192\n2分之三倍\t27193\n了别\t27194\n日本\t27195\ndiguohal\t27196\n罪状\t27197\n男丫\t27198\n冰山\t27199\n皇鼎\t27200\n小相\t27201\n独善其身\t27202\n哼哼坏蛋坏蛋坏蛋\t27203\n大喜功\t27204\nckoencms\t27205\n留在\t27206\n大南大南块\t27207\n看看见了\t27208\n蜘蛛侠\t27209\nwsetion\t27210\n翻球\t27211\n郁证\t27212\n一拳呗\t27213\n小盖\t27214\n巫溪\t27215\n小盒\t27216\n北京5环\t27217\n乐府诗\t27218\n课人\t27219\n胡思琪\t27220\n张文静儿\t27221\n弓箭手\t27222\n犯罪率\t27223\n安知晓\t27224\n孙傻\t27225\n835454\t27226\ngjhff\t27227\n中国海洋大学\t27228\n珍惜\t27229\n十五本\t27230\n光雨\t27231\n呀不理\t27232\n2589631775\t27233\n天才会\t27234\n琉\t27235\n这个声\t27236\n负荆请罪\t27237\n阿庆哥\t27238\n科学家具体\t27239\n黑犀铠甲\t27240\n氦气\t27241\n好屌\t27242\nSukhothai\t27243\n2月8\t27244\nhoumetatou\t27245\n数九隆冬\t27246\n跟你傑乖哦班憤怒不好nnnn分手v不豐富來\t27247\n热能图\t27248\n算一算\t27249\n谭千秋\t27250\n二百条\t27251\n过格\t27252\n红娘\t27253\nshhbsjsjs\t27254\n红娃\t27255\n炒泡面\t27256\nstoentvtouuuu20多ouuouuuu\t27257\n商丘\t27258\n罗人\t27259\nSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSBSB\t27260\n兒子\t27261\n意花\t27262\n罗了\t27263\n点床叫\t27264\n今年春节\t27265\n子朋\t27266\n家味\t27267\n鬼王\t27268\n187876117\t27269\n我太服你了度秘我的秘书\t27270\n少我\t27271\n160006322\t27272\n窝毒\t27273\n摸鸡\t27274\n83\t27275\n琛\t27276\n刘文彩\t27277\n内会儿\t27278\n500块\t27279\n1十六\t27280\n粘土\t27281\n五十次\t27282\n老曹\t27283\n郭相扶\t27284\n990云\t27285\n笑死掉\t27286\n32倍\t27287\n现代诗\t27288\n雪铁龙\t27289\n破坏者\t27290\n三浴\t27291\ncdad\t27292\n帅我好\t27293\nuuiiu\t27294\n当度秘\t27295\nsorhome\t27296\n很有用\t27297\n大国企\t27298\n不喜\t27299\n不喝\t27300\n还不能\t27301\n不喊\t27302\nstimetoutou\t27303\n远永\t27304\n有的时候\t27305\n联保鸟\t27306\n云锅\t27307\n九龙湖\t27308\n小飞\t27309\n小食\t27310\n牛杂串\t27311\n疑难杂\t27312\n精皮力尽\t27313\n小风\t27314\n变速箱\t27315\n陈文童\t27316\n宁工\t27317\nGOYK\t27318\n時侯\t27319\numWlla\t27320\nTurtles\t27321\n半程乌龙\t27322\n胡覅\t27323\n炔雌醇
环丙孕酮片\t27324\nshdjfj\t27325\nshetells\t27326\n背蛛\t27327\n初闻\t27328\n挂票\t27329\n吃饱\t27330\n熊十力\t27331\n净鸡\t27332\n星斗\t27333\n惊骇\t27334\n冷飕飕依旧\t27335\nSUVOK\t27336\nXN\t27337\n芭比FB\t27338\n踏雪寻梅\t27339\n大肥乡\t27340\n玊／nn\t27341\n世界佛教论坛\t27342\n管好不好\t27343\nIFANBOX\t27344\n11900\t27345\n太瘦\t27346\n五戒\t27347\n陆思航\t27348\n1699\t27349\n高估公司\t27350\n台错\t27351\n度秘我没有\t27352\nsbsb888883\t27353\n22535655328555824558257284252354255525154322546153152514244157433464355143546285546455354446\t27354\n枯荣\t27355\n56855685\t27356\n立迅\t27357\n来了\t27358\n孝顺镇\t27359\nhttpehiphotosbaiducomxiaodupicitema686c9177f3e670985d31a8b3cc79f3df9dc55f6jpg\t27360\nSOSO\t27361\nauooo\t27362\n2002两\t27363\n自我介绍\t27364\n猪呀呀呀呀呀呀呀呀\t27365\n郭金成\t27366\n科举制\t27367\n来京\t27368\n为什么么\t27369\n舍不得你的可是\t27370\n程叔\t27371\n别再问\t27372\n动保\t27373\n东桥人\t27374\n银巴克\t27375\n咪咪咪咪咪咪咪咪咪咪\t27376\n哪版\t27377\n李荣玲\t27378\n来人\t27379\n马天朝西\t27380\n八年级\t27381\n魏公村\t27382\n来亲\t27383\n伊藤洋\t27384\n毛粪\t27385\nEND\t27386\n137\t27387\n4000万\t27388\nxxcv\t27389\n成功法\t27390\n谢谢你我地心情\t27391\nxxck\t27392\n降级\t27393\n120713\t27394\n天冬氨酸\t27395\n关店门\t27396\n金正恩\t27397\n全部\t27398\n一只狗\t27399\n快乐我真\t27400\n圣诞日\t27401\n复兴路\t27402\n扇贝王\t27403\n起睡\t27404\n倒水\t27405\n老死\t27406\nNdndndndn\t27407\n度秘你好傲娇\t27408\n2016年1月25日\t27409\n全都\t27410\n度过\t27411\ntumblr\t27412\n黑尔天\t27413\n1282309094\t27414\nfrill\t27415\n13699666\t27416\n蜜豆\t27417\n普首\t27418\n骰子\t27419\n李亚丹\t27420\n500多家\t27421\n花呗度秘\t27422\n再见再见再见听\t27423\n文化程度\t27424\n优惠点\t27425\n我就问你是不是我的秘书少给我废话\t27426\n又片urnoknobb\t27427\n骤降\t27428\n黄景瑜\t27429\n诶好勒\t27430\n最后一击\t27431\n密我可以\t27432\n自纠\t27433\n度迷\t27434\n浅草寺\t27435\n值机\t27436\n海港\t27437\n127小时\t27438\n龙樱\t27439\n胡连顺\t27440\n贝利德\t27441\n广州环球雅思学校\t27442\ndoning\t27443\nshdfjh\t27444\n六钱\t27445\n反正在\t27446\n我的说度秘度秘我爱你\t27447\n信阳火车站\t27448\n1万亿元\t27449\n啊奇\t27450\n海渊\t27451\n午睡\t27452\n106018759\t27453\n海清\t27454\n致我最爱的你\t27455\n陈呢\t27456\n老发\t27457\nr\t27458\n朱若菲\t27459\n几千几万几亿\t27460\n家长期\t27461\n三月四号\t27462\nòwó\t27463\n十艘\t27464\nvyied\t2746
5\n我们结婚了\t27466\nｏｋ\t27467\n88几\t27468\n阎竞文\t27469\nｏｕ\t27470\n8.10\t27471\n笑喔\t27472\n围巾\t27473\n刀锋\t27474\n没来过我12\t27475\n洛克我爱一次洛克tomlenoror\t27476\n王柯缘\t27477\n他们\t27478\n玻璃管\t27479\n被枪决\t27480\n扮演者\t27481\n晚钟\t27482\n36颗\t27483\n数十亿元\t27484\n一万多个\t27485\n专业群\t27486\n600吨\t27487\n糸秋\t27488\n拿什么爱\t27489\nvisitors\t27490\n14848548458454\t27491\nGOY\t27492\n拜本\t27493\n2011年5月19日上午\t27494\nGOT\t27495\n番瓜省\t27496\n王庆军\t27497\n步履\t27498\n嗯达\t27499\n嘎嘎嘎\t27500\n心苦\t27501\n啊场\t27502\n产科\t27503\nKbbn\t27504\n时务者\t27505\n拜月\t27506\n427公里\t27507\n管管管\t27508\n赖皮\t27509\n四点四十\t27510\n维修服\t27511\n嗯辣\t27512\n王乐遥\t27513\n市场经济\t27514\n责备\t27515\n宝瓶座\t27516\n邓富炼\t27517\n王佳鑫\t27518\n汤因比\t27519\n宠爱\t27520\n我叫六九会\t27521\n合寸冫J\t27522\n小楠\t27523\n身旁\t27524\n300元\t27525\n江帆\t27526\n300具\t27527\n零但\t27528\n就是现在这幺八五\t27529\n吗光\t27530\n情绪化\t27531\n四百四百多一点\t27532\n二米\t27533\n死博爱\t27534\n455585666636666\t27535\n二类\t27536\n驯璞\t27537\n秘仔\t27538\n填\t27539\n塬\t27540\n塮\t27541\n张鑫\t27542\n武大郎\t27543\n昨天中午\t27544\n塥\t27545\n洗漱\t27546\n5彳\t27547\n创办\t27548\n李辉\t27549\n拉线\t27550\nisit\t27551\n想你唱\t27552\n谢晓颖\t27553\n露杰\t27554\n别人的秘密\t27555\n塋\t27556\n塌\t27557\n魂国\t27558\n秘份\t27559\n中三班\t27560\ni3k0\t27561\n杨公堤\t27562\n塘\t27563\n不是我的好\t27564\n贬低\t27565\n宝鼎诗\t27566\n重温微积分\t27567\n塞\t27568\n塑\t27569\n秘们\t27570\n塔\t27571\n塗\t27572\n542个\t27573\n性别说我爱\t27574\n无一例外\t27575\n精妙\t27576\n你好多级\t27577\n阿四姑娘\t27578\n八十八八十八八十八十八十八\t27579\n影楼\t27580\n一头千余\t27581\n可不我喜欢\t27582\nisiy\t27583\n文风\t27584\n拉票贿选\t27585\n千岁千岁\t27586\n薄沙\t27587\n你干嘛\t27588\n通讯社\t27589\n抢修\t27590\n王靖琪\t27591\nHfzxfv\t27592\n宽泛\t27593\n炫目\t27594\n那不上\t27595\np。3\t27596\n臭度\t27597\n许宗衡\t27598\n这你真是树不要皮活不了人不要脸天下无敌\t27599\n好了那你猜我\t27600\ntrytffffsewggd\t27601\n上家\t27602\n超荣\t27603\n哭出来\t27604\n歪斜\t27605\n快言快语\t27606\nDygf\t27607\n一天上\t27608\n东西部\t27609\n据我所知\t27610\n当礼貌\t27611\n奥尔美\t27612\n嘎嘎嘎嘎嘎\t27613\n三角架\t27614\n大家霞\t27615\n爱不爱我可以\t27616\n花明又一村\t27617\n技法\t27618\ntfboyspkexo\t27619\n花花呀花花晚安\t27620\nBGGGIU\t27621\n瓮城\t27622\n瞒帐\t27623\n567480
\t27624\n上官\t27625\n本内特\t27626\n山海关\t27627\n搋子\t27628\ntyronnnryantt\t27629\n香港旅\t27630\n芋头粥\t27631\n新疆喀什塔里县城乡寄宿学校\t27632\n上安\t27633\n一个40天\t27634\n植物儿\t27635\n大叔叔\t27636\nvvffvd\t27637\n唐家嫣\t27638\n大好美\t27639\n朴健衡\t27640\n移山\t27641\n爱贝\t27642\n切客闹\t27643\n两个字儿\t27644\n大白天大\t27645\n橘霸\t27646\n移居\t27647\nhggjx\t27648\n对呀咳嗽\t27649\n爱财\t27650\n嗯也许吧\t27651\n驻场\t27652\n愈演愈烈\t27653\n爸支\t27654\n西北部\t27655\n查符蓉\t27656\n抵扣券\t27657\n巴西龟\t27658\n伤官\t27659\n水箱\t27660\n631\t27661\n632\t27662\n633\t27663\n小宇\t27664\nadinguan\t27665\n小宋\t27666\n639\t27667\n川妹子石锅鱼\t27668\n小宏\t27669\n辩字\t27670\n6点20\t27671\n特立独行\t27672\n水管\t27673\n梦想\t27674\n蔡晓春\t27675\njtjptmg\t27676\n青光眼\t27677\n叶开\t27678\n小宝\t27679\n拳神\t27680\n13351238615\t27681\n穆忠宇\t27682\n言叶\t27683\n一百颗\t27684\n黛儿\t27685\n在写\t27686\n小家\t27687\n张配图\t27688\n小容\t27689\n拥抱太阳的月亮\t27690\n纹身\t27691\n心守\t27692\n心安\t27693\n十二三岁\t27694\n﹁﹁\t27695\n吃掉\t27696\n抓起\t27697\n抓走\t27698\n洗机会\t27699\n钨钢刀\t27700\n心定\t27701\n心宜\t27702\nTFCLUB\t27703\n有没得\t27704\n蒋东兴\t27705\n布书\t27706\ncostaco\t27707\nxhjdsfjb\t27708\n想说爱\t27709\n尿尿尿\t27710\n心室\t27711\n六全\t27712\n藏獒犬\t27713\n怎麼\t27714\n剪毛\t27715\n红灯\t27716\n尖叫吧傻冒\t27717\n周六之战\t27718\n死誓\t27719\npaj猫\t27720\n一月十九号\t27721\n恩文\t27722\n盛传\t27723\n心家\t27724\n罗汉\t27725\nvguh\t27726\n二百零五\t27727\n你在说什么的汽化\t27728\n武大樱园瀑布\t27729\n杀猪\t27730\n没错儿\t27731\n84fe\t27732\nOhIamagirl\t27733\n灰灰你在哪弄混还\t27734\n恩MB\t27735\n南坪\t27736\n超棒的你给我再\t27737\n110行\t27738\n二百零二\t27739\n岷江江\t27740\n幺三二幺二二那个八五幺三八，幺三八幺二二幺\t27741\n九百九十六\t27742\n股市\t27743\n麻繁你\t27744\n耳猫\t27745\n下月一号\t27746\nyodor\t27747\n廉租住房\t27748\n一条歌\t27749\n在下双儿\t27750\n香港屋顶影院\t27751\n小麦圭\t27752\n不要不总\t27753\nygydW9\t27754\n神通广大\t27755\n蔡元培\t27756\n理政者\t27757\n单身歌\t27758\n我死\t27759\nguided\t27760\n八百\t27761\n小小小小小多多多多多多多多少少少少少\t27762\n在玉米地的中央\t27763\n红双喜\t27764\n不经意\t27765\n分工合作\t27766\n消耗\t27767\nkjfh\t27768\n覃山林\t27769\nopeoyp\t27770\n小宝贝我的小宝贝我的小爸爸\t27771\nmother\t27772\ncatoo\t27773\n秘戏\t27774\n年方\t27775\n六兆\t27776\n正当\t27777\n灿一\t27778\n念珠\t27779\n546434\t2
7780\n喝牛\t27781\nP档\t27782\nJVJ\t27783\nnopqll\t27784\nhsifl\t27785\n4567\t27786\n情杀\t27787\n得治\t27788\n栩栩好\t27789\n基里连科\t27790\n文彦博\t27791\n4568\t27792\n坷坷\t27793\nTalk\t27794\n新乡市\t27795\n吃软饭者\t27796\n湖羊\t27797\n嗯乐乐乐乐乐乐\t27798\n念珏\t27799\n海赫\t27800\n李伯清\t27801\n军工厂\t27802\n白胡子\t27803\n别我遇\t27804\n打底裤\t27805\n假曰\t27806\nivvv\t27807\n丕文\t27808\n有关掉\t27809\n转转转\t27810\n师不必\t27811\n裴夫人\t27812\nwiamm\t27813\n陈楷函\t27814\ntptpw\t27815\n须谨慎\t27816\n支系\t27817\n阿斯卡\t27818\n行场\t27819\n资料性\t27820\n唐斯\t27821\n九几年\t27822\n赔偿\t27823\n下午四点半\t27824\n吧里\t27825\n陆兵榜\t27826\n于是\t27827\n一零级\t27828\n郭碧鸟\t27829\n天使的幸福\t27830\n精虫\t27831\n魔医仙\t27832\n感谢你了度\t27833\n5十5\t27834\np话\t27835\n这个识\t27836\n呃逆\t27837\n好型\t27838\n青溪\t27839\nRoci\t27840\n遮面\t27841\n闹别\t27842\n锵喜羊羊\t27843\n捍毅英\t27844\n五十六个\t27845\n食狗肉\t27846\n拜登\t27847\ndufng\t27848\n2530532851\t27849\n讨不讨厌\t27850\n安息日\t27851\nHghhh\t27852\n旺仔牛郎\t27853\n小米三娘\t27854\n来来来来\t27855\n袁部长\t27856\n历来\t27857\nrgcf\t27858\nJYJ19家\t27859\n阴阴天\t27860\n格雷厄姆\t27861\n上百\t27862\n小婆婆\t27863\ntrityoutifootromerohitook\t27864\n烊蛤\t27865\n丁丁\t27866\n势利\t27867\n应承\t27868\n897468180\t27869\njjmg\t27870\n趾甲\t27871\n较量\t27872\ntcvf\t27873\n林柔\t27874\n惊液伤\t27875\n加班\t27876\n妮昏沉\t27877\n588885\t27878\n玉洁\t27879\n阿思\t27880\nnxxn\t27881\n龙龙\t27882\n四得\t27883\n奇石\t27884\n阿怪\t27885\n翟总\t27886\n零六二二\t27887\n358个\t27888\n苏莱曼\t27889\n佳美康\t27890\n到底是\t27891\n拜承琦\t27892\nvyuf77drsg\t27893\n福禄寿\t27894\njuf\t27895\n空地\t27896\njud\t27897\njue\t27898\njuj\t27899\n道德经\t27900\njuh\t27901\njun\t27902\njuo\t27903\njul\t27904\njum\t27905\njur\t27906\njus\t27907\n大一人大大小小多一小多\t27908\njuv\t27909\njut\t27910\njuu\t27911\njux\t27912\njuy\t27913\n占地\t27914\n玩堂医院\t27915\n换胎\t27916\nLED\t27917\njuG\t27918\n360秒\t27919\n错在\t27920\n晋江\t27921\nLEO\t27922\n鸽巢\t27923\nfaze\t27924\n咯吱\t27925\nLES\t27926\n退还\t27927\n董明\t27928\n地震史\t27929\n郭碧\t27930\n12个月\t27931\n鲁撸\t27932\n同就\t27933\n梁滨\t27934\n才子面\t27935\n敲诈\t27936\n色差\t27937\n车女\t27938\n地好里啦怕路我\t27939\n钟爱\t27940\n人性化\t27941\n土豆皮\t27942\n物
料\t27943\n秉公\t27944\n乔妈咪\t27945\n李思根\t27946\n阴道炎\t27947\n华图\t27948\n盔甲\t27949\n周佳丽\t27950\nFffffc\t27951\n你是谁我知道你是谁\t27952\n车套\t27953\n蛮乖\t27954\n戎荣儿\t27955\ngadman1jtjwmtjmj8tj\t27956\n脑膜炎\t27957\n985855458426843875394159985268776355889666699528416\t27958\n会首\t27959\n出潮\t27960\n昌珉君\t27961\n穷\t27962\n梁嘉怡\t27963\n嗯勒\t27964\n1963.42点\t27965\n东坡豪迈\t27966\n血魔\t27967\n王明儿\t27968\n正题\t27969\n白癜风\t27970\n皮口\t27971\n钉螺\t27972\n客流高峰\t27973\n日出峰\t27974\n长大后\t27975\n李欣灿\t27976\n不敢了吧\t27977\n紫心\t27978\n88878\t27979\n作呕\t27980\n山师\t27981\n尧哥\t27982\n结瓜瓜瓜瓜\t27983\n昨天晚上\t27984\n蜀黎\t27985\nhhghvh\t27986\n蜀黍\t27987\n帅不帅\t27988\n黄义威\t27989\n婚龄\t27990\n推手歌\t27991\n门板\t27992\n120立方米\t27993\n多伦多一站\t27994\n这辈子就你一个\t27995\n另一边\t27996\n猪肝\t27997\n猪肚\t27998\n小镇\t27999\n梦幻泡影\t28000\n江苏省人民代表大会常务委员会\t28001\n小长\t28002\n罚金\t28003\n三体ou\t28004\n残害\t28005\n散车\t28006\n冠亚军\t28007\n福州新闻网\t28008\n11x2\t28009\n30.66\t28010\n东王峰\t28011\n摸摸串\t28012\n奥太棒\t28013\n叔叔们\t28014\n古人\t28015\n毛片儿\t28016\n苏姐\t28017\n苏瑾\t28018\n四七十四三十五一百\t28019\nstorcey\t28020\n周了懒\t28021\n一一根\t28022\n打火锅\t28023\n霍费尔\t28024\n胡乔木\t28025\n重本\t28026\nsolet\t28027\n韩语\t28028\n忍者七五29253节\t28029\n来少\t28030\n神论\t28031\n何玲\t28032\n重望\t28033\n里瑟\t28034\nNPC\t28035\nNovak\t28036\n温哥华机场\t28037\n苦不堪言\t28038\n一路星光\t28039\n一我我我我破\t28040\n該鼓\t28041\n麻食\t28042\n婷美\t28043\n快乐过新年\t28044\n484878040408\t28045\n睡前\t28046\n旁证\t28047\ngdjdgrd\t28048\n仁智\t28049\n傻子子\t28050\nbattle\t28051\nlol英雄联盟\t28052\n阿木木\t28053\n分机\t28054\n古拉一\t28055\n婉钰\t28056\n你是猪么就是你么你是猪猪猪猪猪猪猪猪猪\t28057\n装上\t28058\n澡堂\t28059\n碧根\t28060\n6855689999992587\t28061\n日薪\t28062\n美滋滋\t28063\n黄子韬\t28064\n幻星\t28065\n埋线\t28066\n九黄太发生\t28067\n归舟\t28068\n阆中中学\t28069\n哎呀呀呀\t28070\n港岛\t28071\n刘皇\t28072\n眼蒙\t28073\n斡旋\t28074\n0202\t28075\n会不会员\t28076\n我喜欢熊\t28077\n少玩\t28078\n刘皓\t28079\n天空之舞\t28080\nWhoisthat\t28081\n烟烟星期妈妈的话神神的雷光鲁冰花\t28082\n装甲车\t28083\n太不专心\t28084\n存量\t28085\nwhmcjaini\t28086\n装单\t28087\n千家\t28088\n盲目服\t28089\n千宴\t28090\n五十家\t28091\njelsa\t28092\n杜密蒙\t28093\n核武器\t28094\noffsy\t2809
5\n白虎儿\t28096\n脚脑\t28097\n说了我以为\t28098\n81集\t28099\n550万\t28100\n金蕾\t28101\n脚脚\t28102\n联合行动\t28103\n女的你是不是男不男女不女\t28104\n示儿\t28105\nn70\t28106\n苏宁\t28107\n超越时空\t28108\n功劳\t28109\n23:18\t28110\n奸细\t28111\n蓝芭比\t28112\n沃特玛\t28113\n说话的声音好像女的\t28114\nST段\t28115\nfjgjh\t28116\n午仪\t28117\n第七感\t28118\n可爱啦郎啦咯铠甲\t28119\n23:15\t28120\n何度秘\t28121\n功力\t28122\n功功\t28123\n还气流\t28124\n叛逆期\t28125\n王红发\t28126\n话儿\t28127\n2011年02月10日\t28128\n懂\t28129\n杨承鑫\t28130\n门外汉\t28131\n23:10\t28132\n附魔书\t28133\n动密度\t28134\n想你想\t28135\n5号上午10点\t28136\n终于明晰\t28137\n糖苷\t28138\n驾驭\t28139\n媚眼\t28140\n活期\t28141\nBYBHO组\t28142\n司菲雅\t28143\nG17\t28144\nG14\t28145\nG13\t28146\ngigg\t28147\n驾驶\t28148\n雾天\t28149\n闹矛盾\t28150\n火种\t28151\n若想\t28152\n8名\t28153\n谨防\t28154\n李二伯\t28155\njpjpdj\t28156\nWRONG\t28157\n嗯修真\t28158\n急诊室\t28159\nqb8860\t28160\n朱建飞\t28161\n懶\t28162\n别上心\t28163\n五粮\t28164\n一壶\t28165\n说话者\t28166\nwww棠\t28167\n蜜柚\t28168\n8月11日清\t28169\n一声\t28170\n实力\t28171\n百度手机助手\t28172\n八六级号\t28173\n天上人间\t28174\n定居者\t28175\n防甲\t28176\n10场\t28177\n那个机\t28178\n丹寨\t28179\n户县东关小学\t28180\n多萌\t28181\n心系\t28182\n那个月\t28183\n微信号cf1380476\t28184\n她的秘密\t28185\n十秒钟\t28186\n徐晚晴\t28187\n实务\t28188\n降慢\t28189\n导班\t28190\nhi不聊\t28191\nvghjhf\t28192\n红细菌\t28193\n舒缓\t28194\n昭通\t28195\n北环线绍兴农校\t28196\n38集\t28197\n碱剂\t28198\nfuffu\t28199\n葛朗台\t28200\n小臭婆\t28201\n上当\t28202\n懦\t28203\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t28204\n就是要买\t28205\n锡林浩特\t28206\n电子游戏\t28207\n间别\t28208\n18315471521\t28209\n软骨\t28210\n二百久\t28211\n大喜易失言,\t28212\n玩法\t28213\n丁字\t28214\n穆院长\t28215\n洮南\t28216\n刘雯霏\t28217\n目前有\t28218\n熊猫饼\t28219\n刘威\t28220\n胡慧慧\t28221\n杜儿坪\t28222\n那莲\t28223\n大一听\t28224\n纽约\t28225\n春意意\t28226\n等客\t28227\n正解\t28228\n麦麦王\t28229\n五六半\t28230\n略略略\t28231\n手纸\t28232\n有巨无\t28233\n丢不去\t28234\n敬畏\t28235\n优雅\t28236\n鸡脯\t28237\norro\t28238\n天拜师\t28239\n周天天\t28240\n算出来\t28241\n学分\t28242\n唉唉\t28243\ngcgxgchchc\t28244\n坦荡荡\t28245\n話家家楊家峪7\t28246\n100多元\t28247\n黑字\t28248\n防卫\t28249\n北海道\t28250\n闭眼\t28251\n学到\t28252\n王鑫人\t28253\n谁的是谁你是谁你是谁你是谁你是谁你
是谁你是谁\t28254\n给断\t28255\n六毛八\t28256\n盒面\t28257\n宁皓\t28258\n皓月\t28259\n防华\t28260\n我亲一个亲一个亲一个\t28261\n负离子\t28262\n千瓦\t28263\n一批\t28264\n一找\t28265\n人生日\t28266\n会诊\t28267\n丁玲\t28268\n一扫\t28269\n头奖\t28270\nhellohellohellohello\t28271\n一扭\t28272\n孔径\t28273\n女盆友\t28274\n会话\t28275\n王子矜\t28276\n漂亮一点\t28277\n经关\t28278\n虹膜\t28279\n4月8日\t28280\n赵俊喜\t28281\n片毛\t28282\n焊机\t28283\nfototp\t28284\n经典\t28285\n一手\t28286\nyuoh\t28287\n头女\t28288\n一才\t28289\n上车\t28290\n仁人\t28291\n会诺\t28292\n一扇\t28293\n乖宝宝\t28294\n上吊\t28295\n上名\t28296\n音\t28297\n十五厘米\t28298\n甜言蜜语\t28299\nxxuvd\t28300\ngut\t28301\n聊聊侗\t28302\n三十三千佳\t28303\n活塞队\t28304\n打得火热\t28305\n上吐\t28306\n啊果\t28307\nghiphotosbaiducomxiaodupicitema2cc7cd98d1001e957a2277cbf0e7bec54e79722jpg\t28308\n洗头膏\t28309\n毫无动容\t28310\n百度钱夹\t28311\ngux\t28312\ngttgf\t28313\nfias\t28314\n我的一天\t28315\n优惠期\t28316\n产检\t28317\n植物大战僵尸二修改版\t28318\n水浒传的读后感悟\t28319\nwantarelation\t28320\ndsswwfewwefg\t28321\nfial\t28322\n繁體\t28323\n专访\t28324\n1200米\t28325\n冷哈\t28326\nCHK\t28327\n锤子龙\t28328\n16P\t28329\n1861406\t28330\n康鳗鱼\t28331\n山人\t28332\n不退课\t28333\n下此\t28334\nJunior-银赫\t28335\n菲希欧\t28336\n嘴型\t28337\ng秘关系\t28338\nsinceicamehere\t28339\n九蛋\t28340\nGkgjgi\t28341\n常理\t28342\n任凭\t28343\n1327122150\t28344\n秘姐\t28345\nbeibi\t28346\n峰回路转\t28347\n家自体\t28348\n本月起\t28349\n2510点\t28350\n近三年\t28351\n十一群\t28352\n广刁\t28353\n移出\t28354\n炽热\t28355\n第三期\t28356\n2627\t28357\n赵冬男\t28358\n水生\t28359\n唱钟\t28360\n类便\t28361\n滚滚面\t28362\n旭哥帅\t28363\n会否\t28364\n蝙蝠袖\t28365\n豆腐夹\t28366\n1.78亿元\t28367\n林启源\t28368\n可爱块\t28369\nNOVO\t28370\n水甲\t28371\n宋金合\t28372\n炽烈\t28373\n水电\t28374\n六欲\t28375\n121121112\t28376\n98秘\t28377\n法院\t28378\n5位\t28379\n不是我胆小\t28380\n六次\t28381\ndrudreyt\t28382\n第三本\t28383\n3338888\t28384\n46.5%\t28385\n一二元\t28386\n贫困\t28387\n聊把\t28388\n宋勇澄\t28389\n挂号费\t28390\n嗯东土\t28391\n一亿万\t28392\n大渡桥\t28393\n不好在\t28394\n一传九\t28395\n一堂课\t28396\nia了姑娘似的伤心\t28397\nuushaxenyouka\t28398\n一亿个\t28399\n狂风\t28400\nCMS\t28401\n舞钢市\t28402\n不以为然\t28403\n天才一表人才\t28404\n李天宇\t28405\n李思么\t28
406\n高岭土\t28407\n乖度秘\t28408\nValentino红色公爵绸\t28409\n一天岁\t28410\nuhggfvdgh\t28411\n李航杰\t28412\n排队\t28413\n嗯真\t28414\n西田吗衣\t28415\n很急麽海宁我很寂寞我很寂寞\t28416\n二零八三五零\t28417\n字片\t28418\n阴囊肿\t28419\n是这样好帅\t28420\n废话儿\t28421\n3月10日\t28422\n8月下旬\t28423\n百条\t28424\n原谅我读书\t28425\n丑吱吱狗\t28426\n还秘\t28427\n百来\t28428\n汉堡包山师\t28429\n一样一样\t28430\nvgdsf\t28431\n不早\t28432\n50万个\t28433\n北电\t28434\nfdfdfff\t28435\n凯迪拉克\t28436\n再来一次好不好\t28437\n不无\t28438\n话里有话\t28439\n21层\t28440\n安安心心\t28441\n侠盗\t28442\n可爱的小度秘就是你你是我的小机器人\t28443\n多一点九元\t28444\n欣仪\t28445\n3535252535353\t28446\n纯属虚构\t28447\n又说九歌晚安\t28448\n下楼忙\t28449\n偷体\t28450\n给我不离\t28451\n夏祥峰\t28452\n暗奏\t28453\n成琰\t28454\n一波台\t28455\n陈上将\t28456\n如归\t28457\n洛克螺母牛\t28458\n套管\t28459\n顶天不立地\t28460\n我日\t28461\n不零点\t28462\n机不可失\t28463\n神奇\t28464\n宁句城际轨道交通工程\t28465\n刘花妹\t28466\n我早\t28467\n干癌\t28468\n相亲人\t28469\n撸啊撸啊撸ri\t28470\n紫轩\t28471\n梓兮\t28472\n发理解\t28473\n爱格\t28474\nhhhbv\t28475\n锋芒\t28476\n菌类\t28477\n拜拜达\t28478\n唐倩\t28479\ntoto图库穆剑土木五路口木偶剧五突击三句\t28480\n圣斗士\t28481\n没得治\t28482\n大情\t28483\n鱼\t28484\n度小秘呀度小秘啊度小秘\t28485\n头脑风暴\t28486\n王子文\t28487\n胜任性\t28488\n摩尔城\t28489\n青橄榄\t28490\n声哥\t28491\n建装逼\t28492\n拐人\t28493\n你好爸爸\t28494\n一探究竟\t28495\nhwj\t28496\n幕度秘\t28497\n各地方\t28498\n赵大咪\t28499\n御银股份\t28500\n吴梦琴\t28501\n130815\t28502\n不相称\t28503\n构型\t28504\n昨天晚上两点\t28505\n每每次\t28506\n盲动化\t28507\n瓜蒌\t28508\n医神\t28509\n667y\t28510\n想和\t28511\n黎子锋\t28512\n力而行\t28513\n了不念口亨那就是行\t28514\n郝惠萱\t28515\n阳面\t28516\n宋天荫\t28517\n郑梦茹\t28518\n7.8公映\t28519\n呃讨厌你讨厌你讨厌你\t28520\n亲姐妹\t28521\n数3个\t28522\n欧尼桑\t28523\n我的最爱O\t28524\n红太少\t28525\n上学位\t28526\n老财\t28527\n第五张\t28528\n12日游\t28529\npfil\t28530\nMHD\t28531\n8]\t28532\n一声不响\t28533\n978786\t28534\n你在我的身边我爱你爱的\t28535\n外筑\t28536\n女吊灰太灰太狼\t28537\n发奋\t28538\n6678\t28539\n你好乖呀我想见见你真\t28540\n老贾\t28541\n孔老师\t28542\n凤飞\t28543\nhdbbb\t28544\n杏原杏璃\t28545\n甲维\t28546\n合身\t28547\n越来越\t28548\n那琳\t28549\n吹神\t28550\n猫猫猫猫猫猫猫猫猫猫猫猫猫猫萌萌萌萌萌萌萌萌萌萌萌猫猫猫猫猫\t28551\n寺内\t28552\n大巴拉巴拉\t28553\n56周\t28554\n众怒\t28555\n潮汕人\t28556\n100312\t28557\n3≦o\t28558\
n站内\t28559\n小圆脸\t28560\n木人\t28561\n猫哥\t28562\n陈玉喆\t28563\n其它\t28564\n欣悦\t28565\n整体性\t28566\n其实\t28567\n书学\t28568\n拍马屁\t28569\n无根据\t28570\n副主持\t28571\n两攻\t28572\n我的名\t28573\n快消品\t28574\n度秘你好棒谢谢你\t28575\n手重\t28576\n共老\t28577\n16962010115\t28578\n秋官评价芝姐\t28579\n早死\t28580\n巴比娃儿\t28581\ndocto\t28582\n废品袋\t28583\n木事\t28584\n心情不好别\t28585\n逗比人\t28586\n7789755361\t28587\n张惠妹\t28588\ndaling\t28589\n京沈高速\t28590\ne友们\t28591\n皮质\t28592\nGhhgf\t28593\n绽放\t28594\n1245557588900425\t28595\n五年\t28596\n恶俗\t28597\n下不来床\t28598\n4687\t28599\n130多年\t28600\n郭乐建\t28601\n15423572642\t28602\nhggigujj\t28603\n2号凌晨\t28604\n茉莉\t28605\n恶心你走\t28606\n背信弃义\t28607\n清醒点半夜\t28608\n龙俊朋\t28609\n老鼠的歇后语\t28610\n百思达\t28611\njrhhf\t28612\n3X6十2\t28613\ntoutou\t28614\n董先生\t28615\n赤激\t28616\n喇叭状\t28617\n胸前\t28618\n小秘密\t28619\n再见吧明天见\t28620\n小猪小猪咕咕噜\t28621\n花记\t28622\n来一次\t28623\n支援\t28624\n25252554n1555575555555445556855557555\t28625\n还有你相信\t28626\n这吧小乖乖\t28627\nesoxo\t28628\n大神山\t28629\n副总裁\t28630\n白兰花\t28631\n云助\t28632\n立人\t28633\n幺八\t28634\nChelsea\t28635\n答对\t28636\n方向\t28637\n腹股沟斜疝\t28638\n立享\t28639\nlmjmkll\t28640\n总成\t28641\n沐浴露\t28642\nMonroe\t28643\nslyl\t28644\nfgyjbbh\t28645\n蟹蟹辣\t28646\n作用\t28647\n区分钟\t28648\n王沛尧\t28649\nhagf\t28650\nabcda\t28651\n圣经\t28652\n西铁\t28653\nabcde\t28654\n韩国SBS电视台\t28655\n电影\t28656\nabcdy\t28657\n把握住\t28658\n320米\t28659\n立于\t28660\n根弦儿\t28661\n杨思韵\t28662\n张硕\t28663\n兇恶\t28664\n5925多\t28665\n拖住\t28666\n55753655225\t28667\n2011年5月22日\t28668\n深晖清凉芬\t28669\n浪迹天涯\t28670\n三江口村\t28671\nC+侦探\t28672\n耳目一新\t28673\n局势\t28674\n观尔\t28675\n加分析\t28676\n零二二幺\t28677\n宽以待人\t28678\n度秘你好我是你的主人我喜欢你我想和你在一起你可以\t28679\n封建社会\t28680\n干刷\t28681\n爱过来\t28682\n甜度\t28683\n唔从今天开始\t28684\ndvndddngefhg\t28685\nNnO\t28686\nfbx\t28687\n叔图书馆\t28688\n一个多小时\t28689\n苏秘\t28690\n9万吨\t28691\nhhgfcgg\t28692\n非姆\t28693\n群殴\t28694\n徐童童\t28695\ndvdk\t28696\n另外\t28697\n硫化\t28698\nyIloilO\t28699\n姜丹丹\t28700\n用心点\t28701\n你和他\t28702\n微肥\t28703\n雕刻刀\t28704\nfbc\t28705\n佳境\t28706\n功夫熊猫3\t28707\nKenny\t28708\n张汇通\t28709\n大有
人\t28710\n祖国世界之最\t28711\n云辉\t28712\nhahho\t28713\n十一点个\t28714\nljd\t28715\nljy\t28716\n苏秦\t28717\nljp\t28718\n角塑\t28719\n那了了了了了\t28720\nyo棒\t28721\n不可分割\t28722\n纯甄酸牛奶\t28723\n3.6\t28724\n杨猫\t28725\n3.5\t28726\n丹丹\t28727\n装配工\t28728\n李诗瑞\t28729\nmbdsfdgfgbc\t28730\n3.8\t28731\n3.9\t28732\n坐电梯\t28733\n兴鹿党\t28734\n凤凰新闻网\t28735\n方娅娜\t28736\nB族\t28737\n丹东\t28738\n齐照样子\t28739\ncnsh\t28740\n哇长\t28741\n秘密明天就叫李的闭经\t28742\n谢座\t28743\n溜溜溜\t28744\n网拍\t28745\n王婧颖\t28746\nsheng\t28747\n度密君\t28748\n养精蓄锐\t28749\n今晚6:45\t28750\n不指弹\t28751\n建材\t28752\n二妮年\t28753\n啊安\t28754\n不对咧\t28755\n我恨死你了你是猪\t28756\n贱民\t28757\n侯哥\t28758\n人家额\t28759\n唱了吧\t28760\n完了事\t28761\n江苏黄埔再生资源利用有限公司\t28762\n海盗船\t28763\n扭扭\t28764\n三十七块\t28765\n太街\t28766\n痴\t28767\n李度秘\t28768\n小度秘我真的好爱你你好帅你好美整天\t28769\n择业\t28770\n6月24日\t28771\n噪否劈劈数唤劳可顽别炻可酉\t28772\nbaeranz\t28773\n88899989999\t28774\n考生们\t28775\n用关\t28776\n常平\t28777\n那思密达\t28778\n胰岛卡\t28779\n常年\t28780\nsnnsnn\t28781\n姨妈你好色秘\t28782\n女郎们\t28783\n家鲜\t28784\n呢妈\t28785\n凌晨2点半\t28786\nEvery\t28787\n3050\t28788\n十年来\t28789\n噢喔\t28790\nhuangfeng\t28791\n暖牌\t28792\n如痴如醉\t28793\n座系\t28794\n很不错听\t28795\n喜羊羊美羊羊灰太狼\t28796\n重型\t28797\n倔強\t28798\n枉\t28799\npaintthesun\t28800\n读后感度\t28801\n15204342345\t28802\n20多米\t28803\n希大人\t28804\n倔强\t28805\n欢跃\t28806\n6212262015007524834\t28807\n散手\t28808\n名公\t28809\n周雅婷\t28810\n画子\t28811\n洗颜\t28812\n靠靠靠靠靠快\t28813\n恶棍\t28814\n画字\t28815\n名典\t28816\n记者女\t28817\n三点四点\t28818\n昌河\t28819\n散打\t28820\n潜藏\t28821\n那个那个\t28822\n糟践\t28823\n罗东艳\t28824\nzFhxkkfvjjhkixixiofxooccoocxconCBGBkgCBCCJKjkhhgcckxjhhchhhhhhggjcjcjjhhhhhhhncckjjjjjnhjjbvibuggfshgffffgvhfgjbvknvvgovcfxcccjhbhhghgvhhogfuugiigogguucuiixicuxuddogufjffuufififgioghhhhhjohhoooofuuufududuufhjghpohogovhvicifivvoovovovohvoovogigigiogogogigogogovogvvoovovigivigizGljhfhgghgbbjnrfgmfnfknnjbbvvccjbhhjx\t28825\n顿服\t28826\n小灰灰\t28827\n贝儿公主\t28828\n巅倒\t28829\n李孝利\t28830\n逗点\t28831\n陪感\t28832\n我不记得\t28833\n防碍\t28834\n1945年\t28835\n傻厨\t28836\n亿百家\t28837\n快色\t28838\nSinga\t28839\n彻彻彻\t28840\n窘相\t288
41\n听不清猪\t28842\n吖哥\t28843\n蚀骨\t28844\n点点点点点点点点点点点点点点点点\t28845\n水果度\t28846\n高学历\t28847\n颜69\t28848\n正法\t28849\n快艇\t28850\n摸波\t28851\n哪们\t28852\n三四十\t28853\n12022520031125\t28854\nSggtgy7fgfffhfffgfsdfdfffffdsfffgsffdtgggysfng3sxgdgf\t28855\n是了是\t28856\n向明\t28857\n感激不尽\t28858\n老开心\t28859\n名爵\t28860\n有限\t28861\n25个月\t28862\n听想\t28863\n玩手\t28864\n叶琳\t28865\n╥ω╥\t28866\n例块\t28867\n俏俏\t28868\n熊博\t28869\n枣\t28870\n汉街\t28871\n酒吧街\t28872\n罗圈儿腿\t28873\n别走\t28874\n午安安\t28875\n老虎凳\t28876\nu甜\t28877\n邵秀叶\t28878\n王志文\t28879\ndjjcfkrixnt\t28880\n希子\t28881\n排泄物\t28882\n维多利亚娱乐\t28883\n别赖\t28884\n有点难\t28885\n水冰月\t28886\n枭\t28887\n玲声\t28888\n3ce代购\t28889\n光波炮\t28890\n侬六千七\t28891\n抢劫案\t28892\n最后一个人\t28893\n宣传曲\t28894\n一不开心\t28895\n一个五十岁\t28896\nｌａｓｅ\t28897\n灰取心\t28898\nvbvb\t28899\n真汉子\t28900\n张美照\t28901\n飞龙\t28902\n爬山涉水\t28903\n医疗类\t28904\n学雷\t28905\n毛城\t28906\nshanghai\t28907\n衣才\t28908\n富田靖子\t28909\n恩我喜欢女\t28910\n1515有有有有有有有\t28911\n和田地区\t28912\n录像\t28913\n袖口\t28914\n邪物\t28915\nibi\t28916\n要你样\t28917\n不疯了\t28918\n市话\t28919\nseyui\t28920\n全民阅读\t28921\n力量力\t28922\n书香\t28923\n单正清\t28924\n科科\t28925\nkhfu\t28926\n好啦好啦晚安\t28927\n金沙水\t28928\n东民\t28929\n豪欣号\t28930\n你在干嘛你在干嘛呢\t28931\n乔四\t28932\n皮泥\t28933\ngccbz\t28934\n皮波\t28935\n四二班\t28936\n程彦良\t28937\n陈佳敏\t28938\n每人\t28939\n会聊性\t28940\n178145\t28941\n56.89\t28942\n好元芳\t28943\nа\t28944\n白侑林\t28945\n欧元区\t28946\n刘思言\t28947\n单人沙发\t28948\n甜甜思密达\t28949\n万能宝\t28950\n海霞绣\t28951\n嫉妒哥\t28952\nз\t28953\nqzthi\t28954\nkris\t28955\n你猜你猜你猜你猜你猜你猜你猜你猜你猜\t28956\nе\t28957\n市里\t28958\n文具盒\t28959\n防护\t28960\n哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈\t28961\n谢春生\t28962\n骂神\t28963\n挥斥方遒\t28964\n回答\t28965\n浪子\t28966\n我萱\t28967\n冠昊\t28968\n吴镇宇\t28969\n八面威风\t28970\n999舰333舰囚\t28971\n人家伙食\t28972\n不取说\t28973\n王昱鳗\t28974\n万建民\t28975\n五公斤\t28976\n一千遍\t28977\n丧单机版\t28978\nokmqx\t28979\n调情好\t28980\n蝴蝶玉\t28981\n徐濠萦\t28982\n巨宅\t28983\n绿荫大道\t28984\n病妻\t28985\n34班\t28986\n总参谋部\t28987\n猜疑\t28988\nxzse\t28989\n对呀不行\t28990\nxzss\t28991\nhbsidbbec\t28992\n算啦算\t28993\n5555715\t28994\nФ\t28995\n1
589302831\t28996\n净资产\t28997\n压爱\t28998\n痴笑\t28999\n為幾\t29000\n打炮机\t29001\n严办\t29002\n哥儿们\t29003\n张谷歌\t29004\n政府部门\t29005\n私号\t29006\n喻华\t29007\n李TIC\t29008\n李灿熺\t29009\n简答\t29010\n李赵庄\t29011\n少得\t29012\n超粪男\t29013\n广德\t29014\n长辈们\t29015\n我的狗\t29016\n很可爱\t29017\n陈宗扬\t29018\n涠洲岛\t29019\n挨近\t29020\n单身女\t29021\n刘佳瑞\t29022\n纯洁\t29023\n说了听\t29024\neuhhgde\t29025\n雅欣\t29026\n暗箱操作\t29027\n热感\t29028\n周俊闵\t29029\n耐心\t29030\n一点二倍\t29031\n满处\t29032\n黑蛇咒\t29033\nfjnb\t29034\nchit\t29035\n14.00\t29036\n61558402855525125556485959595885558281888262645252559454554858\t29037\n豆豆\t29038\n太匆忙\t29039\n紫逆天\t29040\n好梦么么\t29041\n478910\t29042\nД\t29043\n座垫\t29044\n余资\t29045\n毛胚\t29046\n钢笔\t29047\n云吞面\t29048\nК\t29049\n早上7点\t29050\n我是主人我是主人\t29051\n敞开\t29052\nyangqimidu\t29053\nT3T\t29054\nИ\t29055\n誓不为人\t29056\n刘平峰\t29057\n中共中央\t29058\n电玩别\t29059\n夏雨欣\t29060\nП\t29061\n真一次\t29062\n怪人坏人坏人坏人\t29063\nhyfffd\t29064\n保释金\t29065\n八快点度\t29066\n147526620\t29067\n日益\t29068\n语病\t29069\n75573\t29070\n哲\t29071\n你是谁你是谁你是我大妹妹\t29072\n考蚀\t29073\n善良的心\t29074\n前调\t29075\n五叔六叔七叔八叔\t29076\n差异性\t29077\n宅斗文\t29078\ngkgaga\t29079\n涩片\t29080\nchid\t29081\n阿娘\t29082\n凹陷\t29083\n拼车\t29084\n阿娜\t29085\n闻乐见\t29086\n表情输入法\t29087\n八分米\t29088\n旅游者\t29089\n随礼\t29090\n中等人\t29091\n念鱻\t29092\n性性福福\t29093\n盐师\t29094\n礼遇\t29095\n哦\t29096\n考子韬\t29097\n减半\t29098\n不秘我讨厌\t29099\n猪马\t29100\n减单\t29101\n李裸\t29102\n依克\t29103\n梅格妮\t29104\n聊辣\t29105\n3q3q\t29106\n唐博士糖\t29107\n雪纳瑞\t29108\n周小萱\t29109\n喜悦\t29110\n5375256600388000\t29111\n佛道\t29112\n員\t29113\n牢记\t29114\n度秘惶恐晓明的女人你敢爱\t29115\nkbb\t29116\ncallmemother\t29117\nkbg\t29118\n魅友\t29119\n水淀粉\t29120\n和你说话真的好堵心呢不理你了我都\t29121\n征途2\t29122\n锟铻刀\t29123\n如此这般\t29124\n天干\t29125\n天平\t29126\n名上\t29127\n名下\t29128\n事行\t29129\nShell\t29130\n消殒\t29131\n缩句\t29132\n飓风\t29133\n郭坚强\t29134\n对呀对呀沙哑\t29135\n秘镜\t29136\n证券投资\t29137\n布头\t29138\n喝西北风\t29139\n好恐惧\t29140\n天幕\t29141\n委屈过分\t29142\n横式\t29143\n莱茵黑森\t29144\n奥斯卡新建文影城\t29145\n西后街\t29146\nzhd\t29147\n游动\t29148\n抚顺\t29149\n未啊仲慢慢地\t29150\n西海龙王\t29151
\n才肉\t29152\n艾希\t29153\n哔\t29154\n无穷\t29155\n周度秘\t29156\n明报\t29157\ndgjgagkg\t29158\n土拨鼠\t29159\n承包\t29160\n80s\t29161\n绿坝\t29162\nsejgfgfe\t29163\n地板\t29164\n烟雨听荷\t29165\n11日凌晨4时\t29166\n哐\t29167\n蒲扇\t29168\n大头菜\t29169\n宝堂一\t29170\n不懂我真的不懂\t29171\nhiì\t29172\nfukeyou\t29173\n十个字\t29174\n钱柜\t29175\n淑蓉\t29176\n55485544\t29177\n1月12号\t29178\n新影联\t29179\n你是我的小秘书\t29180\nmet\t29181\n大头虾vi\t29182\n吴圩镇\t29183\n你在干么子事\t29184\n道偶有\t29185\n秘火神\t29186\n完稿\t29187\n吕坏蛋\t29188\n藏獒\t29189\n爸妈们\t29190\n二维二百型丰田汽车\t29191\n16%\t29192\n古典工艺家具\t29193\n棍\t29194\nps1lijiaikkilaia\t29195\n警察派\t29196\nchi3\t29197\nhaizi\t29198\n红珠宝\t29199\n不知耻\t29200\n开心乐\t29201\n狂草\t29202\n要不不告诉你\t29203\n第23号\t29204\n十六岁\t29205\n乘务员\t29206\n恩知道\t29207\n美人恩\t29208\n尕头哥\t29209\nhvchchggjcr\t29210\nk152路程表\t29211\nM5plus\t29212\n没有你当你\t29213\n習\t29214\n运动型\t29215\n御花园\t29216\n鞋垫\t29217\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t29218\n脱硫\t29219\nldyf\t29220\n夕颜\t29221\n胜券在\t29222\nhgucf\t29223\n红金\t29224\nbkkoooooppppooooo\t29225\n挺直\t29226\n虫哥\t29227\n2.33万亩\t29228\n九十八九十六\t29229\n中午饭\t29230\n我的妈\t29231\n流出去\t29232\n孩子点\t29233\n长命\t29234\n皮点\t29235\n剩饭\t29236\n璇璇\t29237\n赵红叶\t29238\n西乐\t29239\n昂贵\t29240\n不好果\t29241\n扫把\t29242\n幺二零幺零七\t29243\n随享贷\t29244\n无虚\t29245\n西乡\t29246\n三月六号\t29247\n表白者\t29248\n欧嘀嗒\t29249\n欧买噶\t29250\nrigd\t29251\n打了看\t29252\n皮炎\t29253\n不再相信你\t29254\n放飞机\t29255\n哈hi\t29256\n赵哲波\t29257\n卖不动\t29258\n三年内\t29259\n出碟\t29260\n古迹\t29261\n起源者\t29262\n悬医藏心术\t29263\n皮肉\t29264\nUC良苦\t29265\nkmtdatnm\t29266\nTOUCH\t29267\n秋景\t29268\n荷尔拜\t29269\n马昕禹\t29270\n赛车\t29271\n简字\t29272\n皮肤\t29273\n带感\t29274\nhellojinmao\t29275\nborinj\t29276\n6666610\t29277\n身边人\t29278\n去喝\t29279\nboring\t29280\n說謝謝\t29281\n有没有效\t29282\n25吨\t29283\n拉普利奥\t29284\n九月份\t29285\n武藤\t29286\n跳着\t29287\n鹅衣屋屋\t29288\n宋代玲\t29289\n彭同学\t29290\n在这吧\t29291\n安摩\t29292\n意思奥\t29293\n圈地\t29294\n开通\t29295\n单着你是\t29296\n武戏\t29297\n太可怕了我相信你\t29298\n今晚21时50分\t29299\n流股\t29300\n走越\t29301\n官商\t29302\n亦步亦趋\t29303\n抽出去\t29304\n自强\t29305\n得心\t29306\n2011年6月20号\t29307\nhyaDDO
S\t29308\n企业们\t29309\npkvk\t29310\n偏低\t29311\n桐梓\t29312\n胡须\t29313\n牡丹\t29314\nω羞\t29315\nhrect\t29316\n空谈\t29317\nfjfidhcfg\t29318\nHANDS\t29319\n好多种\t29320\n电锯\t29321\n为什么不声讨\t29322\n辜鸿铭\t29323\n久违\t29324\n法克\t29325\n17多\t29326\n后宫\t29327\n李自成\t29328\n杨雅然\t29329\n顾少棠\t29330\n您好哭\t29331\n后客\t29332\n很喜欢\t29333\n后容\t29334\n肖特\t29335\n大耳堡\t29336\n玩武墓\t29337\n六六零\t29338\n付梦欣\t29339\n1557490504\t29340\n5224124\t29341\n黄彩椒\t29342\n有许多\t29343\n阿根廷人\t29344\n41件\t29345\n吃软\t29346\n阳片\t29347\n盂县\t29348\n和你亲\t29349\n少林\t29350\n陈艺珊\t29351\nbabyhen\t29352\n一一三十万三千三百三拾万光年\t29353\n同志们\t29354\nq3\t29355\nq2\t29356\nq5\t29357\nq4\t29358\nq7\t29359\nq6\t29360\n那霸市\t29361\n1249950903\t29362\n不不不不不不不不不不不算不是\t29363\n赖海彬\t29364\n俊郎\t29365\n孤独\t29366\nfhhiekkfogj\t29367\n了费\t29368\n把子\t29369\n按扣\t29370\n伟楠\t29371\ntgnjc\t29372\n杨二小\t29373\n早五点\t29374\ndajm\t29375\n睡觉时\t29376\n凤凰论坛\t29377\n北郡c区\t29378\n炫权式\t29379\n李莹莹\t29380\n零零多\t29381\n偏技术\t29382\n追图\t29383\nqO\t29384\nqq\t29385\nqp\t29386\nqs\t29387\n胡佳敏\t29388\nqu\t29389\nqt\t29390\nqw\t29391\n猫波\t29392\nqy\t29393\nqx\t29394\n毛保吉\t29395\n教度\t29396\n京北\t29397\nqa\t29398\n伯乐世\t29399\n不要走\t29400\nIdvdy\t29401\nqe\t29402\nqd\t29403\nqg\t29404\n挤破\t29405\nqi\t29406\nqh\t29407\n脑擦擦\t29408\nqj\t29409\n追回\t29410\nql\t29411\n确认\t29412\nqn\t29413\niks\t29414\ngahz\t29415\n你是我的好呀好朋友\t29416\n度秘你现在秃头顾子行\t29417\n良伪\t29418\n读书\t29419\ngahs\t29420\n蓝染\t29421\n手榴弹\t29422\n再会\t29423\n安CK\t29424\n神经啊神经了神经天天见\t29425\n挤进门\t29426\n告诉我的父母\t29427\n霞螺纹\t29428\n马到成功功\t29429\n遛达\t29430\n打坐\t29431\n被关押\t29432\n黄永飞\t29433\n咽炎\t29434\n打坏\t29435\n大才疏\t29436\n00XX\t29437\niky\t29438\n法定代理人\t29439\n防刺服\t29440\n苏运莹\t29441\n636284672726253279237638236\t29442\n诺基亚N8\t29443\nFroggedswag\t29444\n倍你\t29445\n第315\t29446\n王英楠\t29447\n姜永山\t29448\nikc\t29449\n官场\t29450\n护车者\t29451\n双数\t29452\n15031639815\t29453\n老戈\t29454\n你的要害\t29455\nhQQ\t29456\n缝生\t29457\n老戏\t29458\n88chen9\t29459\n棋盘\t29460\n四分米桶\t29461\n高联\t29462\n不和你愉快\t29463\n招股说明\t29464\n18720\t29465\n荼靡年\t29466\n误以为\t
29467\n89890806865865\t29468\n高聚\t29469\n亲爱的宝贝我爱你！我爱你\t29470\n气田\t29471\n小西湖\t29472\n大排档\t29473\n美麗\t29474\n高职\t29475\n马藤斯\t29476\n津狮花园七号楼二单元1011室\t29477\n柬埔寨\t29478\n必然性\t29479\ndfsfafx\t29480\n大脸猫\t29481\n招聘\t29482\n哈哈哈哈太阳下哈哈哈哈太阳伞\t29483\n多姿\t29484\n会儿天儿\t29485\npppppppppppppppppp\t29486\n淡水\t29487\n二六九一八二零八\t29488\n扁一律\t29489\n不对不起\t29490\n洪荒\t29491\n图面\t29492\n艾拉\t29493\n木星人\t29494\n瓮樱子\t29495\n你了我最讨厌你了我最讨厌你了我最讨厌你了我还\t29496\n张润槐\t29497\n恩氏\t29498\npainten\t29499\n五百次\t29500\n宁贵\t29501\n尚名\t29502\n御姐\t29503\n闹玲\t29504\n佳洁\t29505\n死鱼\t29506\n往下面\t29507\n领教\t29508\n王大伯\t29509\n给我答案\t29510\n不行礼\t29511\n朱昌俊\t29512\n睡觉\t29513\n电棍\t29514\n1039\t29515\n查清\t29516\n猪性\t29517\n东华小学\t29518\n头绪\t29519\n过人工\t29520\nkikkih\t29521\n我告訴你我叫什幺\t29522\nhuuuhjj\t29523\n死密度\t29524\n电棒\t29525\n选一选\t29526\n度怭\t29527\n二丰\t29528\n有风\t29529\n单挑\t29530\n大兴善寺\t29531\nApptopia\t29532\n中央军委\t29533\n一阳千玺\t29534\n碰铃铃叮叮空通\t29535\nspatianylyl瑞珂youktitta\t29536\n网友\t29537\n贴身高手\t29538\n二两\t29539\n九头蛇\t29540\n杨方\t29541\n二个\t29542\n二丫\t29543\npza\t29544\n二中\t29545\n扯扯\t29546\n无限苦逼\t29547\n二丑\t29548\n联播\t29549\n二专\t29550\n臃肿\t29551\n二世\t29552\n香港凤凰卫视\t29553\n处女大姨妈\t29554\ntvfjvgng\t29555\n二一\t29556\n二丁\t29557\ntfboyes\t29558\n位子禄\t29559\n二万\t29560\n二三\t29561\n小度秘书度秘\t29562\nbkllhkh\t29563\n无意无靠\t29564\njtgatp\t29565\nalot\t29566\n没哪有\t29567\n干沟桥\t29568\n半能半\t29569\n装设\t29570\n时空交错\t29571\n小伊伊\t29572\nfhied\t29573\n小妮子\t29574\n4g流\t29575\n年审\t29576\n大咪伍锴甲勇歌\t29577\n糞们\t29578\n高密连\t29579\n孙新峰\t29580\n咪嘟咪嘟咪嘟咪嘟咪嘟咪\t29581\n张好多\t29582\n小度好\t29583\nn72女\t29584\nrfhudvb\t29585\nReservations\t29586\n现实生活\t29587\n够帅\t29588\n汕m\t29589\n传禹故事什么出自三字经的勤学篇\t29590\n谢谢心如\t29591\n芦荟\t29592\n相信你来\t29593\nAMMBZB\t29594\nhttpdhiphotosbaiducomxiaodupicitemfcfaaf51f3deb48faf3afcaaf71f3a292df57876jpg\t29595\n科科区\t29596\n元素\t29597\n很高兴见仁见智\t29598\n瑛徽\t29599\n黄山片\t29600\n凉调\t29601\n只要吃\t29602\n男孩们\t29603\n无一个\t29604\n雨然噶尅\t29605\n靖江市公安局\t29606\n欧美克\t29607\n示众\t29608\n永远不哩\t29609\n12254637890\t29610\nyzffo\t29611\n院度\t29612\n再贷
款\t29613\n顾那吐\t29614\n13778807954u\t29615\n图层\t29616\n64936663944326\t29617\nhdhjddjjmssmsmmmmm99999990944497997\t29618\n天谕\t29619\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t29620\n知会\t29621\n政迹\t29622\ncaomim\t29623\n1030\t29624\n张水江\t29625\n324779956313\t29626\n禁忌\t29627\n保定\t29628\n一休了吧\t29629\n保安\t29630\n保守\t29631\n御龙萌萌\t29632\n马尔代夫\t29633\n一拍两散\t29634\nareyoudume\t29635\n蓝孔雀\t29636\n暂且\t29637\n集体利益啦啦哒哒嘀嗒嘀嗒嘀嗒嘀\t29638\n林中\t29639\nifeool\t29640\n二四六八十\t29641\n结构\t29642\n更重\t29643\nloser\t29644\nloses\t29645\n结果\t29646\n一百七十五块\t29647\n重逢\t29648\nsi利\t29649\n兽兽\t29650\nxsbd\t29651\n半场\t29652\n控告\t29653\n快快乐\t29654\n你在哪战你不在你不是在上学吗干\t29655\n与朱元思书\t29656\n遥指\t29657\n互访\t29658\n一小片\t29659\n小萝卜\t29660\n陈梦\t29661\n彩超\t29662\n安全岛\t29663\n周子怡\t29664\nwhyspy\t29665\n杀伤力\t29666\nuijhv\t29667\n劳资\t29668\n不讨厌\t29669\n绿色植物\t29670\n好多果\t29671\n今晚20时40分\t29672\n你好可爱\t29673\n开心龙\t29674\n钢坯\t29675\n曲阜\t29676\n大作好玩儿\t29677\n半圈\t29678\n一线生机\t29679\n广州番禺区的电子厂\t29680\n51面\t29681\n半圆\t29682\n熊雄\t29683\n东丽东丽\t29684\nobey\t29685\n大门牙\t29686\n挺首\t29687\nvueh\t29688\n定哀\t29689\n马小双\t29690\n差一毛\t29691\n巨资\t29692\n幸福女人\t29693\n#雳剑#\t29694\nkgad\t29695\n合理化\t29696\n对你好\t29697\n赞皇县\t29698\n撒里\t29699\n撒野\t29700\n冬蜜度秘\t29701\n木瓜\t29702\n八点九点几了我\t29703\n只许我爱\t29704\n星兴路\t29705\n14：30分钟\t29706\n电脑病毒\t29707\nhwql\t29708\n帮鞋\t29709\n二十次\t29710\n几不要\t29711\n狗皮\t29712\n兔化\t29713\n丢脸\t29714\nffhcdg\t29715\n女和男\t29716\n度秘度秘小度秘小秘密小咪咪你是我的小秘密小秘密你是我的小秘密小秘密小秘密\t29717\n4499810\t29718\n在我心\t29719\n自测\t29720\n和片\t29721\n道不拾遗\t29722\n一𠂉\t29723\n丽柏广场\t29724\n少儿\t29725\n太讨厌\t29726\n梦桐乱七八还有人爱\t29727\n人民军报\t29728\n语音竹\t29729\n呀眨\t29730\n和牛\t29731\n零八六二九八八\t29732\n路虎揽胜特LOL咯健健康康\t29733\n甘加\t29734\n分手之后\t29735\n先编\t29736\n提醒\t29737\n一零升\t29738\n四川大学水利水电学院\t29739\n爱心呢爱心\t29740\n8000余家\t29741\n生化危百分数\t29742\n模怪样\t29743\n铠甲个\t29744\n喷喷笔\t29745\nenskdodkkd\t29746\n科泰\t29747\n对呀现在这个幺零零幺零\t29748\nTec\t29749\nTed\t29750\nTee\t29751\n请相信\t29752\nBIGBANG\t29753\n东环\t29754\nghhvhjbb\t29755\n30w\t29756\n杨教员\t297
57\n中意照镜\t29758\n孟津\t29759\n时后\t29760\n在一起名\t29761\n淋露馅\t29762\n监狱\t29763\n魏文博\t29764\n加剧\t29765\n时君\t29766\n我不想和度\t29767\n丁沟\t29768\n明还\t29769\n发小是\t29770\nGigiYY\t29771\n复读机卡\t29772\n金心\t29773\n百丽美\t29774\n体坛\t29775\n茫茫\t29776\n在一起吧\t29777\n十字形\t29778\nFive\t29779\n诺惊风雨\t29780\n6.12\t29781\n值不值得\t29782\n實際\t29783\n喜⺶羊羊\t29784\n哎呀我的妈呀你\t29785\n丈夫\t29786\n第九\t29787\n奥数\t29788\n软林曼\t29789\n沈阳桃仙国际机场股份有限公司\t29790\n区委\t29791\n在那等\t29792\nhvddhbcv\t29793\nunion\t29794\n培养\t29795\n骂句\t29796\n离家个周开面的美容屁股美有\t29797\n印版\t29798\n佛教界\t29799\n阿提孤\t29800\n宏涛粪类\t29801\n美佛儿\t29802\n番吧\t29803\n脑公3\t29804\nfouf\t29805\n血地\t29806\n许巍\t29807\n睡想你\t29808\n岛内\t29809\n隔夜饭\t29810\n广上\t29811\nfour\t29812\n吞吞吐吐\t29813\nufufc\t29814\n事成\t29815\n下城\t29816\n大部份\t29817\n广东\t29818\n决不能\t29819\n一公里\t29820\n1.1万元\t29821\n种莫\t29822\n去和从\t29823\nfrase1famifimifiahe哦rufaheifn\t29824\n5月18\t29825\n别意思\t29826\n两百二十一一\t29827\n陈紫函\t29828\n荒芜\t29829\n新民村\t29830\n杜天皓\t29831\n勤工\t29832\n八分之九等\t29833\n广丰\t29834\n走着吧\t29835\n假头发\t29836\n就是你和王\t29837\n侨办\t29838\n广为\t29839\n15735658027\t29840\nfuv\t29841\n告诉你我走\t29842\n唱一首歌记住\t29843\n了解救\t29844\nsehun\t29845\n圖\t29846\n4十8十6十8十12十2二\t29847\n1454050068851\t29848\n讲座\t29849\n堡皮\t29850\n36吨\t29851\n彝语\t29852\n搓澡\t29853\n自由泳\t29854\n雅安\t29855\ntinas\t29856\n黄肖婷\t29857\n争做\t29858\nfup\t29859\n天之痕\t29860\n我靠靠靠我靠靠靠我靠我靠我靠\t29861\nClarkson经典情歌\t29862\n数百年\t29863\nvycddff\t29864\n断粮\t29865\n俩杯\t29866\n鹤舞\t29867\n溢彩\t29868\n刘问\t29869\n99999\t29870\n一本一个\t29871\n阿修罗\t29872\n爱上电影网\t29873\n99991\t29874\n圍\t29875\n99994\t29876\n擦皮\t29877\n不要再讲话\t29878\n人地\t29879\nfarmor\t29880\n灵璧县\t29881\nrrrddfd\t29882\n西游记的歌\t29883\nfuf\t29884\nufciu\t29885\n税后\t29886\n任重道远温文\t29887\n太急\t29888\nFlyme\t29889\n丁芮\t29890\n探查\t29891\n更杀杀杀不啥子事\t29892\n要不非\t29893\ncezxizyu\t29894\n不男不理\t29895\n拖打\t29896\n15968916283\t29897\n呃咿唔咿呀咿\t29898\n铁片\t29899\n守法\t29900\n小哥公司\t29901\n张关机\t29902\n很不幸福\t29903\n铁牌\t29904\nooredoo\t29905\n嗯份母其\t29906\n蒙蒙\t29907\n分式\t29908\n朱方圆\t29909\n1234567890111\t29910\n五月三一
号\t29911\n黑糊糊嘿嘿抱方法论\t29912\n三番\t29913\nmumphebedrom\t29914\n分开\t29915\n和度秘聊\t29916\n夕阳\t29917\n超标\t29918\n煽情滴\t29919\n咯晗\t29920\n保护好哈\t29921\nk\t29922\n排盘\t29923\n黑月月\t29924\n一点几\t29925\n十五年前\t29926\n错杂\t29927\n吉尔曼\t29928\n偏门\t29929\n妹弟\t29930\n350公里\t29931\n迷情\t29932\n法治之路\t29933\n时光飞逝\t29934\n微生活\t29935\n协和\t29936\nfurfh\t29937\n姊姊\t29938\n迷惘\t29939\n存放\t29940\n讨厌安\t29941\n为什么不到\t29942\nrolell\t29943\n紅\t29944\n证伪\t29945\n鸟巢\t29946\n三点4\t29947\n西单\t29948\noiii87uu8000099990424k589888009924444￥\t29949\n夏梦荨\t29950\n效果图\t29951\n中信通\t29952\n服务行业\t29953\n五点六十\t29954\n有意思无聊\t29955\n交通事故\t29956\n1月30号\t29957\n4771542521587\t29958\n脏明花花\t29959\n风雅者\t29960\n存余额盈\t29961\n标泉\t29962\nshdo\t29963\n过半\t29964\n余额\t29965\nshdu\t29966\n我真的很喜欢你我想和你亲嘴\t29967\nshdv\t29968\n西流那区\t29969\n愈好\t29970\n弥漫天\t29971\n住校\t29972\n七点八点九点\t29973\n造梦西游三\t29974\n标注\t29975\n切克闹箭煎饼果子\t29976\n等于王\t29977\nsffh\t29978\nkmpi\t29979\n一下子\t29980\nPM2.5\t29981\n太古汇\t29982\n错得好\t29983\n奋不顾身\t29984\n黄婷\t29985\n至关\t29986\nSB2B\t29987\n言为\t29988\n翼虎\t29989\n黄皮白瓤\t29990\n说话说呀不想说\t29991\n铁臂\t29992\nczl\t29993\n紙\t29994\n六七一百三十多斤\t29995\nggggtfg\t29996\nuirjrufijfieif\t29997\n牛爸\t29998\n交度\t29999\n漂亮的心\t30000\n搞乱\t30001\n黄婕\t30002\n夕阳红汐绥\t30003\n忘了说\t30004\nhdhdjdjdj\t30005\nfffdhh\t30006\n网易财经\t30007\n鬼来电\t30008\n违法\t30009\n战事\t30010\n黄婆\t30011\n战争\t30012\n樱子\t30013\n唇齿相依\t30014\n妾心栳不就是你不就是心里好怕呀\t30015\n零点五\t30016\n沪市\t30017\n全错\t30018\n94年\t30019\n冰砖\t30020\n酷动\t30021\n竹凤\t30022\n泉语\t30023\n吴英\t30024\n风风风风\t30025\n箱内\t30026\n画船\t30027\n星期天下午\t30028\nMD没天理\t30029\n十八世纪\t30030\nkq盔\t30031\n中国作协\t30032\nEric\t30033\n九点下\t30034\n几分钟\t30035\n软星科技\t30036\nKasustnwinfiateroralldakukrwufdpzgytz\t30037\n泊船瓜洲绿\t30038\n用上课\t30039\n王乐怡\t30040\n谷城\t30041\n怕重\t30042\n万一万一万一\t30043\n高都\t30044\n我爱您\t30045\n有点击\t30046\n蜂蜜小面包\t30047\nwihiz\t30048\n88568\t30049\n严幺爸\t30050\neeeeeeeeeeeeeeeeeeeeeeeeee\t30051\n自由岛度假区\t30052\n累\t30053\n88562\t30054\nJjhshs\t30055\n竹高\t30056\n气不开心\t30057\n鸣谢有所\t30058\n证券\t30059\n八三二二四七二三四九\t30060\n生化\
t30061\n快国\t30062\n证别\t30063\n森\t30064\n甲骨文\t30065\n孕酮\t30066\n汉语吧\t30067\n拿爱\t30068\n九十年代\t30069\n魏先生\t30070\n谐星\t30071\n光阳\t30072\n声音不善泪光我知道我一只有\t30073\n真甜\t30074\n4．23\t30075\n健忘症\t30076\n蔡快牙\t30077\n一点\t30078\n一炸\t30079\n21分钟后\t30080\n快哒死欸\t30081\n懦夫铠甲\t30082\n一炮\t30083\n韩姐\t30084\n咪咪咪咪咪咪咪咪咪\t30085\n吼吼吼。千嘛\t30086\n雨臣\t30087\n巷道\t30088\n我是小姐姐ok\t30089\n真田\t30090\n狽e之心纪\t30091\n古风\t30092\n哪题\t30093\n蓝色野豌豆剪辑制作\t30094\n东京西南\t30095\n服装设计学\t30096\n湖北省钟祥市长寿学校\t30097\n度秘我爱你爱着你\t30098\n133182584296\t30099\n秘度错啦度秘电流的牛\t30100\nab2b36ab\t30101\n爆照度秘\t30102\n么子不好我的心情好\t30103\n喔谢\t30104\n卢秀媚\t30105\n榨油\t30106\n宇阳\t30107\n黄少英\t30108\n能不能不欺骗\t30109\ncjyjx\t30110\n感谢天让\t30111\n杏子\t30112\n白怨\t30113\n张痔疮\t30114\n金隅\t30115\n亦称\t30116\n易信\t30117\n胎教伦\t30118\n樊\t30119\n鲤湾路\t30120\n楚天都市报讯\t30121\n5488142\t30122\nTFb0y\t30123\n适用于\t30124\n19702\t30125\nassignment\t30126\n三十一千七零\t30127\ncbhf\t30128\na120度\t30129\n流水单\t30130\n变局\t30131\n分了么\t30132\n河北省涉县天津钢铁厂\t30133\n以及\t30134\nwaether\t30135\n紫薯玫瑰花卷\t30136\n跨人\t30137\n三四公里\t30138\n灰烬\t30139\n俞兆林\t30140\n干嘛们\t30141\n棉\t30142\nbegent\t30143\n852456\t30144\n乔淑琴\t30145\n舵落口肋丿丶一一丶一一丶丨丨丨一一趱\t30146\n85页\t30147\n西邊大老李\t30148\n米醋\t30149\n吸鸡鸡\t30150\n丑猪\t30151\n#Tao\t30152\nChan\t30153\n点心里话\t30154\n一道坎\t30155\n搬办\t30156\nneeds\t30157\n武平\t30158\n性别女\t30159\n异想天开\t30160\n焦虑来啦咔咔\t30161\n秘括\t30162\n13783819886\t30163\n好呀你来我家\t30164\nvnvnfkccj\t30165\n条码宝莉\t30166\nyhhhh\t30167\nMoMo\t30168\n狗叫\t30169\n收缩\t30170\n开球\t30171\n33年前\t30172\nbasketball\t30173\n你样\t30174\n动能\t30175\nyouse\t30176\n配制\t30177\n湖南春节联欢晚会\t30178\n谢谢你度\t30179\n谎称\t30180\ndjjsd\t30181\n杨门虎将之精忠报国\t30182\n焦糖玛奇朵+\t30183\n说说不清\t30184\n泥泥泥\t30185\n陷战\t30186\nklame\t30187\n99块\t30188\n胡椒们\t30189\n川流不息\t30190\nhibaby\t30191\n还不爱\t30192\n很高兴\t30193\n发盘\t30194\n最迷人\t30195\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t30196\n小的乖乖吧没什\t30197\n所么\t30198\nlord\t30199\n是你说的我走了我可是乖宝宝夜\t30200\nSffghh\t30201\ninba\t30202\n老呀呀呀呀\t30203\n尿啵\t30204\n两瓣儿\t30205\n7212\t30206\n6月1日起\t30207\n杨卫荣\t30208\n蜗壳\t30209\
n翠不禁\t30210\n太了我\t30211\n奢华版\t30212\n盎人\t30213\nruwut\t30214\n我不可以\t30215\n特罗凯\t30216\n蒙雨\t30217\n莫急\t30218\n自恋臭不要脸\t30219\n读做\t30220\n模\t30221\n抑制作用\t30222\n快乐很简单\t30223\n不秘密\t30224\n李宇春\t30225\n课室\t30226\n樣\t30227\n圆锥体\t30228\n四五米\t30229\nchooseanew\t30230\n度秘呢秘c咯\t30231\n一年几十万\t30232\n亚我\t30233\nAss\t30234\n跻身\t30235\n武亚楠\t30236\n怡然自得\t30237\n太讨厌你\t30238\n6月6号\t30239\n22点22点\t30240\n贤者\t30241\n741135244\t30242\nxsg\t30243\n健康水\t30244\nAsa\t30245\n五音不全12两面三刀13一塌糊涂14多此一举\t30246\n名酒\t30247\n莫怕\t30248\nAsd\t30249\n十万遍\t30250\n虾米午餐券\t30251\n末子\t30252\n多苦\t30253\n推来\t30254\n黄大明\t30255\n为人处世\t30256\n噗米噗米\t30257\n猛龙过江勋章\t30258\n两颊\t30259\n王泽一\t30260\n两颗\t30261\n64457545845484\t30262\n梁淼淇\t30263\n班长偷懒有创意\t30264\nwelld\t30265\n宜人\t30266\n双煞\t30267\n两题\t30268\n安身\t30269\n诸暨铁\t30270\n333件\t30271\nmysityou\t30272\n多多多多多多多多多多多多多多\t30273\n池博园\t30274\n30M\t30275\n鼓鼓\t30276\n小皮\t30277\n张亚楠\t30278\n告诉我不想\t30279\n论战\t30280\n黑V\t30281\n天伦置业\t30282\n讨厌我\t30283\ncame\t30284\n1765\t30285\n算鱼座201604\t30286\n学科\t30287\n才明\t30288\n陶请\t30289\n伸腿\t30290\n柯可乐\t30291\n拉姆\t30292\n郑博文\t30293\n5554838\t30294\nfingers\t30295\n余青松\t30296\nhzyk\t30297\nbirthdaychild\t30298\n拉姑\t30299\n保修期\t30300\n猪嚟\t30301\n连歌\t30302\n才是\t30303\n探脑\t30304\n中华童铭\t30305\n好童节\t30306\n私募基金\t30307\nｎo\t30308\n造作\t30309\n回味着\t30310\n讨厌你了我等\t30311\n15193385661\t30312\n哦块\t30313\n粤A\t30314\n插声\t30315\n手拉手\t30316\n余生\t30317\n拿诺\t30318\n电秘\t30319\n呀好\t30320\n拿说\t30321\nZap\t30322\n笔盖\t30323\n奥利嘻\t30324\n学不会\t30325\n这么说秘\t30326\n发烧版\t30327\n王袁\t30328\n度办\t30329\n爸妈手\t30330\n东亚度秘我好寂寞\t30331\n21212122121\t30332\n度力\t30333\n高密恺威\t30334\n印台\t30335\n有染\t30336\n讲真话\t30337\nuhkbx\t30338\n丑度秘爱\t30339\nhello你在吗度秘\t30340\n拿试\t30341\n31937213904\t30342\n几米几\t30343\n度劫\t30344\ntomit\t30345\n米西米\t30346\n土豪\t30347\n含沙量\t30348\nhgggfhic\t30349\n崎鸣\t30350\n聂礼彬\t30351\ntfba\t30352\n二四六七五九七\t30353\n复述\t30354\n大小小\t30355\ntfbo\t30356\n犬哥\t30357\n说一编\t30358\ntfbs\t30359\n1mnm\t30360\nhero\t30361\njjfff\t30362\n二妻\t30363\n美夏\t30364\n二妹\t30365\n小令
\t30366\n66岁\t30367\n复返\t30368\n卡卷\t30369\njmkkjmāmakkakālqkjhbnnnmqssmjhajKlalālqllkllpòoo\t30370\n跳转\t30371\n扣除\t30372\n板凳\t30373\n卡卡\t30374\n一比五分之三\t30375\n味甚重\t30376\ndgfgfgcgc\t30377\n黑豆腐\t30378\n第一周多\t30379\n秦昭王\t30380\n井冈\t30381\n目更\t30382\n推磨\t30383\n高红娟\t30384\nsrat\t30385\n将秘\t30386\n河姆渡\t30387\n巡活\t30388\n好学医\t30389\n昂首挺胸\t30390\n供应商\t30391\n上面儿\t30392\n嘤嘤嘤\t30393\n平时\t30394\ncountry\t30395\n高招\t30396\n藏南地区\t30397\nzhcf\t30398\n御寒问\t30399\n69385585244125758582828282858585050585858587478285836396938571539693\t30400\n孙君君\t30401\nhghtfhjwshchfxjsjx\t30402\n平日\t30403\n比较好\t30404\n二王庙\t30405\n王克禹\t30406\n王紫琪\t30407\nBCD\t30408\n多美丽真的好绝情\t30409\n小件\t30410\nffuojh\t30411\n张健\t30412\nngcg\t30413\n谢锦锋\t30414\nJ.C.R\t30415\nno藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏藏额\t30416\n小心路亚\t30417\n45年\t30418\n知美\t30419\n大焉\t30420\n浡泥国\t30421\n53种\t30422\n你的卖萌\t30423\n大龄村\t30424\n女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女女\t30425\ntxmj\t30426\n广式腊肠\t30427\n灵异\t30428\n120413\t30429\n水嫩\t30430\n沙丘鹤\t30431\n奸钱\t30432\n蕄\t30433\n轩辕花\t30434\n钱宝\t30435\n蕃\t30436\n习惯不介意\t30437\n蕉\t30438\n蕊\t30439\nwifiuc\t30440\n总能\t30441\n58只\t30442\n罐车\t30443\n马姐\t30444\n祸殃\t30445\n我不是你美小姐\t30446\njjjjjj\t30447\n成人礼\t30448\n体育生\t30449\n上百位\t30450\n666840\t30451\n薛立诚\t30452\n地貌\t30453\n23539元\t30454\n本皇\t30455\n蕴\t30456\n亲爱的我恨你我恨你\t30457\n正定县\t30458\n我是小蝴蝶小蝴蝶小胡\t30459\nwtpw\t30460\nyimaoloiri\t30461\n蕹\t30462\ndowkskx\t30463\n你好的岁\t30464\n蛇舌\t30465\n吴丁昂敏乌\t30466\n075\t30467\n废话废话废话废话废话废话\t30468\n申摩\t30469\n谢菲尔德\t30470\n表演者\t30471\n在天有\t30472\n教权\t30473\n078\t30474\n45465\t30475\n四点\t30476\n韓blog\t30477\n砍掉\t30478\n偷光\t30479\n几百岁\t30480\n一千多块\t30481\n熊超超\t30482\n双头\t30483\n67892345\t30484\n渠鹂蔺\t30485\n汤丽楠\t30486\n课文单\t30487\n长笛\t30488\nHLL\t30489\n双大\t30490\n猴性\t30491\n背题\t30492\n刘播放\t30493\n5月17日\t30494\n货到付款\t30495\n由我\t30496\n阿尔卑斯山\t30497\n秦腔秘\t30498\n仞晌\t30499\nGooners\t30500\n二十二号\t30501\n度秘乖\t30502\n杨紫彤\t30503\n铠甲勇士婆\t30504\nmiuma\t30505\n隔热\t30506\n死傲娇受\t30507\n快乐开导\t30508\n鲜爽\t30509\n死古文\t30510\n婊子\t30511\n热
潮\t30512\n封开县\t30513\n佛教信徒\t30514\n宋家庄\t30515\n吧丝宝\t30516\nNbml\t30517\n九张\t30518\n你的白\t30519\n轩辕龙丰\t30520\nDRY\t30521\nmix波普\t30522\n曲爷\t30523\n记记\t30524\n两个男\t30525\nxihah\t30526\n斯蒂夫\t30527\nkubnh\t30528\n铺飞\t30529\n源泉\t30530\n恋爱中\t30531\nDRC\t30532\n来了快逃\t30533\nThemakeup\t30534\n世界末日\t30535\n看走了再见\t30536\nbnx\t30537\n市交管局\t30538\nbns\t30539\n1月31号\t30540\n程雨\t30541\n敢不敢当\t30542\n第四户\t30543\n一元二次方程\t30544\n照耀\t30545\nbnn\t30546\nbnm\t30547\nbnb\t30548\nbng\t30549\n聊搞机\t30550\n虫儿飞\t30551\n我嘴\t30552\nHEYPLAYBOY\t30553\n冯友兰\t30554\n先例\t30555\n几天上次\t30556\n杨畅畅\t30557\n巴基斯坦政府\t30558\n小白兔剧\t30559\n凉拌茶\t30560\n财务处\t30561\n吐突\t30562\n华真\t30563\n好久有空\t30564\n噜啊噜噜噜\t30565\n妄自\t30566\n亚涛\t30567\n奥伊克斯\t30568\n万能药\t30569\ngbnj\t30570\n4班\t30571\n洗护\t30572\n威慑\t30573\n箭头\t30574\n1645456555\t30575\n银建\t30576\n压速\t30577\n多点式\t30578\n朱大鸣\t30579\n邓今天\t30580\n偶去\t30581\n脐脐\t30582\n售票\t30583\n包一四\t30584\n舔肛\t30585\n哈尼哈尼\t30586\n后天上午九点\t30587\nchiphotosbaiducomxiaodupicitem0eb30f2442a7d9332aabd107aa4bd11373f00146jpg\t30588\n系奥\t30589\n卢卡\t30590\n介里\t30591\n马蹄岩\t30592\n合作型\t30593\nanbul\t30594\n脸名\t30595\n晚安yyyyyyyyyyyyy\t30596\n1343位\t30597\n夜好梦\t30598\n金价\t30599\n春秋\t30600\n罗马尼亚\t30601\n度面包\t30602\nGGHGGGG\t30603\n干爹们\t30604\n彩塑\t30605\n56553585854654\t30606\n王婧妍\t30607\n一不清楚\t30608\n牌儿\t30609\n蛋稀缺性\t30610\n三篇\t30611\n九死你心地不善良\t30612\n初晓慧\t30613\n滑雪场\t30614\n下跪\t30615\n68棵\t30616\n谭静\t30617\n视点\t30618\n下跑\t30619\n尉博洋\t30620\n贝贝太\t30621\n百牙遥\t30622\n幺零六\t30623\n篝烟\t30624\nValentine\t30625\n啵啵啵呜呜\t30626\n土地上\t30627\n土地下\t30628\n耐久\t30629\n下跌\t30630\nvrssdvvb\t30631\n画谷\t30632\n灵界\t30633\n姚谦\t30634\nCnnon\t30635\n呼塔拉\t30636\n100块\t30637\n饭钱\t30638\n几只狗\t30639\n七浦路\t30640\nArufa\t30641\n到时钟\t30642\n性名\t30643\ncthft\t30644\nSteve=屎娣；Simon=栓门\t30645\n已故\t30646\n若果\t30647\n泽博\t30648\n0.3两\t30649\n上不上\t30650\n上不下\t30651\ngjdttmktwv\t30652\n你好嚒\t30653\n原动力\t30654\nHUAO\t30655\n手电筒\t30656\n范波号\t30657\nwoc6\t30658\n手串\t30659\n开开机\t30660\n美好的天\t30661\nsjdhay\t30662\n红牛红牛牛牛牛\t30663\n吐骨\t30664\n
点花\t30665\nPeggyUHFgfnonFCChhgbjdhgbjd\t30666\n手中\t30667\n老川\t30668\n我没有贴身小秘书只有你这么个贴身小秘书\t30669\n素食主义者\t30670\n老工\t30671\n手丑\t30672\n我能娘呼死你\t30673\n谢谢谢谢谢谢谢谢\t30674\ninside\t30675\n英气\t30676\n阿拉人\t30677\n老差\t30678\n胖公\t30679\n王嘉尔\t30680\n张文潇\t30681\n分析\t30682\n超大型\t30683\n加减仓\t30684\n手上\t30685\n手下\t30686\n精准\t30687\n小蛇\t30688\n不听得\t30689\n12万\t30690\n精减\t30691\n七八七四八\t30692\n交化\t30693\n间隙\t30694\n小蛋\t30695\n刘午夜\t30696\n建国村\t30697\n0.6元\t30698\n死符\t30699\n好多张\t30700\nzeme\t30701\n没我没有\t30702\n牛群\t30703\nednth\t30704\n所见所闻\t30705\n12个\t30706\n高晓瑞\t30707\n小蛮\t30708\n噢哟\t30709\n兄弟姐们们\t30710\n菜畦\t30711\n主贷\t30712\n三只松鼠\t30713\n梦幻风\t30714\n奥比岛\t30715\n鹏古\t30716\n三来\t30717\n[晕\t30718\n三条\t30719\n王胜江\t30720\n韩珂\t30721\n三杯\t30722\n突破缅北的鹰\t30723\n伍美珍\t30724\n缘猫#\t30725\n爆发\t30726\n女淫\t30727\n退休之时\t30728\n应节\t30729\n制成\t30730\n三板\t30731\n庄金挺\t30732\n好多套多\t30733\n错着\t30734\n第一卷\t30735\n三权\t30736\n6x5x4.5x1590\t30737\nfvvcfkl\t30738\n新照\t30739\n太脏\t30740\n炮声\t30741\n3起\t30742\n苏军\t30743\n800几\t30744\n俩一\t30745\n1584188245826\t30746\n度秘度秘我真的恨死你了我不和你\t30747\n不爱我走了\t30748\nv们\t30749\n回来\t30750\n随身\t30751\n不只我\t30752\n秀美\t30753\n巡视\t30754\n学麻\t30755\n俩个\t30756\n杜立东\t30757\n预赛许一世了七八大的你说的话\t30758\n咯yooz\t30759\n辰哥\t30760\n我的萌萌哒软妹纸女朋友\t30761\n隆多\t30762\n民生主义\t30763\nx03\t30764\n来火\t30765\n生度\t30766\nyellowyellow\t30767\n你是你的眼萌\t30768\n1352648790\t30769\n刘兆耀\t30770\n字音\t30771\n溅湿\t30772\n程我\t30773\n抗性\t30774\n克劳斯\t30775\n沧蓝\t30776\n嘴皮\t30777\n一缕\t30778\nsx240\t30779\n力武\t30780\n抹布\t30781\nfuqer\t30782\n郑建文\t30783\n防部\t30784\n接嘞冒\t30785\n陆家一\t30786\n东方不败\t30787\n溏\t30788\n五比四比三\t30789\n李盛旭\t30790\n囹圄\t30791\n一四4点\t30792\ndhgdffggfhcjdrdjbgffgddxhfragkvcdgxhdhchcrdjtdkkpfdscghch\t30793\n鱼头汤\t30794\n腾出\t30795\nロ゜┛\t30796\n好太史一家刨花月余\t30797\n欧科都愉快觉\t30798\n黄海\t30799\n醴陵\t30800\n倒退\t30801\nmnpq\t30802\nNxjxjbxjb\t30803\n嗯江南斯代尔\t30804\n家可归者\t30805\n订正\t30806\n6PL\t30807\n6764345816467766464464664673613508491\t30808\n好v\t30809\n万分\t30810\n哪等会儿\t30811\nhttpbhiphotosbaiducomxiaodupicitemc995d143ad
4bd1131aa3ff415dafa40f4bfb0552jpg\t30812\n累觉不到\t30813\n抢险\t30814\n四角\t30815\n烙印\t30816\n丁俊晖\t30817\n妈妈桑\t30818\n过分了度秘\t30819\n瞎忙\t30820\n刘得礼\t30821\n1月10日\t30822\n倒逼\t30823\n黄浈\t30824\n魔仙堡\t30825\n兄长\t30826\n打援\t30827\n真不愧是我的好度秘\t30828\n米加\t30829\n黄万一\t30830\n22点36分\t30831\nhaoba\t30832\n今年1月1日\t30833\n儿恩\t30834\n一小时后\t30835\n大将军\t30836\n李荣欢\t30837\n伦敦眼\t30838\n墨斗丸\t30839\n蝗虫\t30840\n女角\t30841\n张田芳\t30842\n7666666666667666666666666666611111\t30843\n糊涂\t30844\n一辈子\t30845\n歌儿们\t30846\n王彬亦\t30847\n6月19日\t30848\nSHINee忙内泰民\t30849\n李泽希\t30850\n幺四二六二四一九七三幺二幺幺四三八幺\t30851\n智力\t30852\n老真\t30853\n超群系\t30854\n四圣多年\t30855\n大寒\t30856\n大笼天\t30857\n23度\t30858\n515110135248\t30859\n罗苏雯\t30860\n肉肠\t30861\n麻子狗\t30862\n成真唉\t30863\n国务院办公厅\t30864\nhvkg\t30865\n大寺\t30866\n老你\t30867\n1509191518191\t30868\nHSS\t30869\nangry\t30870\n袁小姐\t30871\n猪比你可爱的多亲一下一只什么异样那是你要命\t30872\n主人手\t30873\nbbjll\t30874\n主人才\t30875\n别来无恙\t30876\n绵绵不绝\t30877\n旭子睿\t30878\n二号新伟祥公司\t30879\n七才\t30880\n丰收\t30881\n爱过的人\t30882\n吴镇\t30883\n自攻\t30884\nddjdjAjfi\t30885\n七所\t30886\n果农惜售\t30887\n帝ng\t30888\n小攻\t30889\n辽国\t30890\n丁克\t30891\njebd\t30892\n不要说了讨厌\t30893\n卷毛小受\t30894\n接我团\t30895\n五秒后\t30896\n共同行动\t30897\n杨保宏\t30898\n小攀\t30899\n马甲爷\t30900\nＯrz\t30901\n门颇\t30902\n嗯干尸\t30903\n集聚度\t30904\n王学亮\t30905\n刘俊杉\t30906\n有追求\t30907\n中环\t30908\n相距\t30909\n七点二十\t30910\n恨死你\t30911\n研读\t30912\ngzm\t30913\n倪好胖\t30914\n四月初八\t30915\ngzf\t30916\ngzg\t30917\n西泰尔斯海灵\t30918\ngze\t30919\ngzb\t30920\n80公里\t30921\n王兴东\t30922\n指挥中心\t30923\n士兵们\t30924\n史秘秘\t30925\n一方向\t30926\n遇记\t30927\n直向\t30928\n陈聪\t30929\n腰痛\t30930\n防范\t30931\n644646646466\t30932\n吾恩屋恩\t30933\n13159199091\t30934\n宋旭\t30935\n见不得人\t30936\n新彩吧k5\t30937\n马拉加\t30938\n宋时\t30939\n品牌中国产业联盟\t30940\n100055\t30941\n雄壮\t30942\n阳木\t30943\ntstopaso\t30944\nhog\t30945\n布茨\t30946\n嘴鸭\t30947\n千山温\t30948\n月十艺\t30949\n香港成达出版社\t30950\n赵萌\t30951\n歼击机\t30952\n梯度\t30953\n贾佳慧\t30954\n女同性恋\t30955\nTFbOys\t30956\n智能机器人度秘你好啊我爱你\t30957\n回档\t30958\n多权贵\t30959\n痴迷不悟\t30960\n委托人\t30961\n物种\t30962\n被一个三
字经\t30963\n美元指数\t30964\n审问\t30965\n俄罗斯方块\t30966\n代理权\t30967\n继霞\t30968\n信不信老娘我揍你\t30969\n阳朔\t30970\n有点人样\t30971\n活捉\t30972\n老校\t30973\n打春\t30974\n骗影\t30975\n你老老我\t30976\nHellosiri\t30977\n南海区\t30978\n朴若\t30979\n差实\t30980\n那本\t30981\n158755\t30982\n尚灿格\t30983\n义务教育法\t30984\n林容里\t30985\n果醋\t30986\n开抢\t30987\n连日\t30988\n稀有动物\t30989\n祥涛\t30990\n等同\t30991\nhot\t30992\n火山石\t30993\n水深火热\t30994\n开抱\t30995\n老树\t30996\nhou\t30997\n一比四比\t30998\nuthrh\t30999\n对不上\t31000\n以群师子\t31001\n那月\t31002\n慢性格斗破\t31003\n登记机\t31004\n瘦肉精\t31005\nhop\t31006\ngoback\t31007\n王天源\t31008\n广饶\t31009\n哇塞好萌\t31010\n包包包包包包包\t31011\n上路\t31012\n小题\t31013\n上跪\t31014\n玛雅\t31015\n林楚云\t31016\n质感\t31017\netdfgdg\t31018\n哈偶\t31019\n生老病死爱别离怨\t31020\n万里沙\t31021\n曼彻斯特\t31022\n舞啊舞\t31023\n都门\t31024\n亲人家\t31025\n相遇\t31026\n8888888888888888888888888888888\t31027\n游戏游戏游游戏游戏游戏游戏游戏游戏游戏\t31028\nhoy\t31029\n专块\t31030\n逆太\t31031\n连线\t31032\n优步\t31033\nfydhx\t31034\n6035场\t31035\n舅公\t31036\n歹尊\t31037\n大冬峡\t31038\n科罗拉多大峡谷\t31039\n鱼样\t31040\n心情好亲\t31041\n手机机\t31042\nwajm\t31043\n广东省卫生厅\t31044\n复联\t31045\n驚訝\t31046\n成家堡\t31047\n西树西树\t31048\n发展观\t31049\nn嗯\t31050\n227276691871\t31051\n15550028233\t31052\n顺正\t31053\n隔空\t31054\n25分之十二\t31055\n波斯克罗甸\t31056\n御茶园\t31057\n心情好些\t31058\n输液\t31059\n代销\t31060\n精华帖\t31061\n宫宫\t31062\n交友网\t31063\n220949830\t31064\n沙盒\t31065\n肢体\t31066\n周公操\t31067\n师美婷\t31068\n2389136775\t31069\n云幕\t31070\n猫呀蛇\t31071\nAreyouokcp\t31072\n滑射\t31073\n顾墨\t31074\n嘉实宝\t31075\n无寸\t31076\n无寿\t31077\n沙盆\t31078\n蚀王\t31079\n何等人\t31080\n无密\t31081\n早年\t31082\n崩坏\t31083\n过招\t31084\n打豆豆\t31085\n热糖糖\t31086\n走了再见\t31087\n爬爬爬\t31088\ntf88\t31089\n草头茹\t31090\n周盼\t31091\n文苑路\t31092\n刻福\t31093\n1五二1315\t31094\n剑客气哦吾讲距过会儿雅静儿健康康康\t31095\n五幺零幺零二\t31096\n周目\t31097\n条鱼\t31098\n龚紫霞\t31099\n一实小\t31100\n兹\t31101\nkuai\t31102\n连新意\t31103\nopen\t31104\n经纬线\t31105\n斗罗我的忧郁症\t31106\nECCV\t31107\n养\t31108\n胡振刚\t31109\n黄龙山\t31110\n打火石\t31111\n俩根\t31112\n幽灵犬\t31113\n00888742\t31114\nukl\t31115\nnjgg\t31116\nukh\t31117\n请辞\t31118\n牙儿\t31119\
n排铁\t31120\n砖俊凯\t31121\n自从\t31122\n你了马\t31123\nhrgd\t31124\n英云谱镇\t31125\nhdjjyfghohvg\t31126\nvbju\t31127\n杜止七七卞士土丰七土土\t31128\nxxoo美\t31129\n好我原谅你\t31130\n缠绕\t31131\n百利\t31132\n从中\t31133\njwnOKnkn\t31134\n工艺品\t31135\n注明\t31136\n人口普查\t31137\n水果脚\t31138\n深厚\t31139\n镜淇\t31140\n再讲一个搞笑的小娃娃王\t31141\n自仪\t31142\nweiseng\t31143\n徐光\t31144\n徐先\t31145\n徐克\t31146\n殷嘉越\t31147\n小党\t31148\n烫伤膏\t31149\n陌雪\t31150\n人寿保险\t31151\n嗯来了\t31152\n照旧\t31153\n推荐\t31154\n当铺\t31155\n铁丹\t31156\nxurh\t31157\n我一生一世永远的都爱你我的度秘\t31158\ngerioe\t31159\nvchvchjth\t31160\n供货\t31161\n杨自诚\t31162\n朦朦\t31163\n消停会\t31164\n等会儿再聊\t31165\nxurx\t31166\n小兜\t31167\n要否定\t31168\n戴美琪\t31169\nchzv\t31170\n郭明松\t31171\n拔罐你好讨厌\t31172\n许国璋\t31173\n皮匠\t31174\n大学生\t31175\n长痘\t31176\n铁丝\t31177\n放灯\t31178\n初吻\t31179\nlaspects\t31180\n划痕\t31181\n下一世\t31182\n喀秋\t31183\n键额\t31184\nggghgweby\t31185\nareport\t31186\n鲜有人\t31187\n到低\t31188\n一点一点块\t31189\n到位\t31190\n坐下\t31191\n嗯g\t31192\n琴日\t31193\n蒋君尧\t31194\n莫高兴\t31195\n笃\t31196\n好命苦\t31197\n笆\t31198\n陌陌陌陌陌陌陌陌陌\t31199\n笛\t31200\n就是你过秀秀\t31201\n小克\t31202\n笑\t31203\n腺样体\t31204\n笔\t31205\n笨\t31206\n避孕药\t31207\n笫\t31208\n第\t31209\n笮\t31210\n豹紋\t31211\n懂了\t31212\n符\t31213\n好神奇喔\t31214\n笺\t31215\n雪清\t31216\n笼\t31217\n兔\t31218\n人额\t31219\n置入\t31220\n岳子淇\t31221\n国家大剧院\t31222\nlaws\t31223\n文革\t31224\n薄荷味\t31225\n幸运女神\t31226\n\b\t31227\n红下篇\t31228\n教距\t31229\n真僵愧杀人的僵愧我死\t31230\n4330架\t31231\n中零点三元\t31232\n韦伯\t31233\n强度\t31234\n白伯母\t31235\n新高\t31236\n哩哩哩哩哩麦哩\t31237\n邱明南\t31238\n禾木\t31239\n周飞\t31240\n二幺九三\t31241\n我日咪咪\t31242\n兼备\t31243\n啊婷\t31244\n解感\t31245\n口辽\t31246\n厽\t31247\n李敏豪\t31248\n1.56%\t31249\n百分之百\t31250\n解愁\t31251\n志辉\t31252\n辅导费\t31253\n终成眷属\t31254\n我的女王\t31255\n汪魁\t31256\n屡试不爽\t31257\n小保\t31258\nIphone4\t31259\n则灵\t31260\n姑家\t31261\n死边\t31262\n聂力图\t31263\n956423\t31264\n普降\t31265\n4岁\t31266\n秘合江\t31267\n兄\t31268\n普陀\t31269\n更安全\t31270\n说女\t31271\n黄春威\t31272\nNO.11\t31273\n放寒假耶\t31274\nSOGO\t31275\n代金券\t31276\ncdjhuwzgzghdduz\t31277\n高材生\t31278\n说好\t31279\n病院\t31280\n八一点\t31281\n飞
模坊\t31282\n蓝眼\t31283\n郭杨朵尔\t31284\n折断\t31285\nviaara\t31286\n护理费\t31287\n红斑竹\t31288\n休闲鞋\t31289\n聊会儿宠物吧行不\t31290\nlearn\t31291\n好美哒\t31292\n小钥说网\t31293\n十五点半\t31294\n樱木\t31295\n不好多\t31296\n黑死\t31297\nk552\t31298\n110晚上\t31299\n畜兴\t31300\n反衬\t31301\n黑武\t31302\n哦天那\t31303\n赞不绝口\t31304\n四岁\t31305\n争吵\t31306\n决算\t31307\n于松\t31308\nvks\t31309\n男金女土--金土夫妻好姻缘\t31310\n七五龇牙\t31311\n见习\t31312\n二十啊我的啊小\t31313\n把秘\t31314\nvkk\t31315\n谭鑫钰\t31316\n六味\t31317\n8010\t31318\nvkf\t31319\n厕\t31320\n一张你的生活\t31321\n露点\t31322\n九一半\t31323\n傅老鼠\t31324\n还步步高\t31325\n巨星\t31326\n赵李\t31327\n赵村\t31328\nq一三十多q1p750\t31329\n谢锋\t31330\najjjsiiwj\t31331\n天然\t31332\n弥补\t31333\n2207382065\t31334\n时过境迁\t31335\n五根小香\t31336\n白语文\t31337\n核潜艇\t31338\n三峡库区\t31339\n嗯神奇\t31340\n那一位\t31341\n接我良是\t31342\n电视机\t31343\n掉钱\t31344\n千言\t31345\n异味\t31346\n朴泰俊\t31347\n原\t31348\n好的好的好\t31349\n十三米\t31350\n管得着我\t31351\n妈妈妈妈妈妈妈妈妈妈妈\t31352\n静儿\t31353\n78567\t31354\nAD25\t31355\nFSF\t31356\nfuckyoutfit\t31357\n吧亲文小学\t31358\n采菊东篱下\t31359\n蒋欣怡\t31360\n福寿街\t31361\n为什么不知道\t31362\n不离\t31363\n鑫鑫\t31364\n近朱者\t31365\n讪笑\t31366\n市北\t31367\n文静\t31368\n凸凸\t31369\n石宇轩\t31370\n蜂拥而来\t31371\n不禁\t31372\n凸凹\t31373\n张北音乐节\t31374\n楚科奇\t31375\n夏花\t31376\n腊八\t31377\n稍桶\t31378\n四零二\t31379\ngisp\t31380\ndrcvckvv\t31381\n本利\t31382\n公安部门\t31383\n一年前\t31384\n移民\t31385\nczkb\t31386\n亲不懂\t31387\n苹果园站\t31388\n下街\t31389\n10674\t31390\n徐旭盛\t31391\nmy物价局\t31392\n热情\t31393\n本分\t31394\ntribal\t31395\n本刊\t31396\n种虫子\t31397\n在内在\t31398\n犀牛\t31399\n江南\t31400\n巨搞笑\t31401\n胡润\t31402\n全拍\t31403\n多雪\t31404\n捨得\t31405\n阳光保险\t31406\nyide\t31407\n96.98%\t31408\n玉观音\t31409\n猪秘朱猪\t31410\nsksnbk\t31411\nbu tong 
guo\t31412\n张君\t31413\n长鸟\t31414\n申核\t31415\n二聪明\t31416\n虾米音乐#\t31417\nhttphhiphotosbaiducomxiaodupicitem908fa0ec08fa513d699382ac3a6d55fbb2fbd\t31418\npappare\t31419\n691\t31420\n买买买买买买买买买买买加加加加加加加加加加\t31421\n刨宫傲\t31422\n┏ω囍\t31423\n八十万元\t31424\n耍人\t31425\n177448\t31426\n张吹\t31427\n望江\t31428\n政令\t31429\n播映\t31430\n黑金刚521巧克力糖\t31431\n张子娴\t31432\n张吴\t31433\n明理体贴\t31434\n123456789958657\t31435\n不不不不不你这头猪\t31436\n污浊\t31437\n窝囊肿\t31438\n聚成刘\t31439\n我讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t31440\n家儿\t31441\n那样片\t31442\n太晕\t31443\n维港\t31444\ndakai\t31445\n滨海县\t31446\n虹口足球场\t31447\n2家\t31448\n圆圆圈圈\t31449\n哨所\t31450\n阎王\t31451\n齐邦媛\t31452\n出诊\t31453\n手超级\t31454\n空袭\t31455\n鼎力支持\t31456\nWING\t31457\n聊结\t31458\n出话\t31459\n35258\t31460\n东莘庄\t31461\n那泠丁当\t31462\n烧伤\t31463\n缩\t31464\nHBO\t31465\n米五\t31466\n死一死\t31467\n斤斤计\t31468\n楊\t31469\nHBB\t31470\nhgchpfxu\t31471\n遗相\t31472\n交通银行\t31473\n龙江天\t31474\nHBV\t31475\n楚\t31476\n面干嘛莫发\t31477\n刺溜\t31478\n节操君\t31479\n在一起玩\t31480\nfvk\t31481\n楠\t31482\nfvh\t31483\nfvg\t31484\nfvf\t31485\n打架\t31486\nfvd\t31487\nfvb\t31488\n获过\t31489\n極\t31490\n亲一亲\t31491\n打枪\t31492\n鸵鸟型\t31493\n楼\t31494\nfvt\t31495\nfvs\t31496\n廢話\t31497\n宝强\t31498\nxvdieos\t31499\n望城县委\t31500\n欠缺\t31501\n来说劲\t31502\n唔唔\t31503\n爽朗\t31504\n京苏美\t31505\n伟成\t31506\nSONIA\t31507\nhvg\t31508\n半条\t31509\n半杯\t31510\n洗洗\t31511\n没呵呵\t31512\nfkyou\t31513\n金三角\t31514\n醉醺醺\t31515\n翻开\t31516\n英美\t31517\n一日夜\t31518\n对呀有东nds秘\t31519\n酥靡\t31520\n货物\t31521\n美鲍\t31522\n技术活\t31523\n取巧\t31524\n不自量力\t31525\n黑小猿\t31526\n蒙扎机器人\t31527\n曾猪\t31528\n有真心的爱过\t31529\n调调行\t31530\n十超巨\t31531\n一一八万\t31532\n冬妮\t31533\n过渡\t31534\n單\t31535\n喯\t31536\n头车\t31537\nａｂａｃ式\t31538\n韩阳\t31539\n514200\t31540\n喧\t31541\n喹\t31542\n喺\t31543\n喻\t31544\n赶不回来\t31545\n喽\t31546\n5吨\t31547\n热敏心情疗伤Tee\t31548\n喱\t31549\n喲\t31550\n喳\t31551\n七我七岁\t31552\n喷\t31553\n喉\t31554\n喊\t31555\n喋\t31556\nzsb\t31557\n份人\t31558\n喏\t31559\n说好开心\t31560\n喁\t31561\n抱不到\t31562\n喃\t31563\n善\t31564\n4412\t31565\n喇\t31566\n这家宝\t31567\n不爱了么\t31568\n5名\t31569\n喜\t31570\n
喝\t31571\n喞\t31572\n喔\t31573\n热火队\t31574\nX—35隐形战斗机\t31575\n1930年8月\t31576\n焦佳琪\t31577\n洛汗\t31578\n毁灭性\t31579\n手机链\t31580\n武田铁矢\t31581\ndfdf\t31582\n三天假\t31583\n有先\t31584\n格林庄园\t31585\nstopto\t31586\nPain\t31587\nBKKCRI\t31588\n好惹我生气\t31589\n谜语\t31590\n挖煤\t31591\n草塔镇政府\t31592\n康芝药业\t31593\n永不再誓\t31594\n几千遍\t31595\n小俊\t31596\n凝脂\t31597\nyvgcccED47775\t31598\n尖细\t31599\n5858656\t31600\n挂面\t31601\n1111111111\t31602\n学历证\t31603\n沈佳颖\t31604\n普通人家\t31605\n景气\t31606\n考题\t31607\n外婆婆\t31608\n叉烧包\t31609\n进京证\t31610\n10669588200\t31611\nhet\t31612\nheu\t31613\nhev\t31614\nhew\t31615\nher\t31616\n黄煜城\t31617\n甘肅山\t31618\n袁小迪\t31619\nhex\t31620\nhey\t31621\n哈哈你想的美你这先考我靠我和你\t31622\n六月五日\t31623\nhed\t31624\nhee\t31625\n機器\t31626\nhea\t31627\nheb\t31628\nhel\t31629\n通缉令\t31630\nhen\t31631\nheh\t31632\nhei\t31633\nhej\t31634\nhek\t31635\n发证\t31636\n秘波\t31637\n周宇\t31638\n翦伯赞\t31639\n安啦\t31640\n秘泰\t31641\nconilllaery\t31642\nhttpbhiphotosbaiducomxiaodupicitemcf1b9d16fdfaaf5192329ccb8b5494eef11f7af4jpg\t31643\n自残\t31644\n我真的好想听\t31645\n雏鸡\t31646\n死八婆就是你\t31647\n吴泽斌\t31648\nxbx\t31649\nV5系\t31650\n植物大战僵尸不花钱的植物大战僵尸拿瓦\t31651\n贝森路218号\t31652\n曾志通\t31653\n秘泌\t31654\n附属\t31655\n张继龙\t31656\n河图\t31657\n从头到脚\t31658\n玉影\t31659\n飞呗\t31660\n晓明哥\t31661\n12345678901111213141516\t31662\n三一个\t31663\n分攻\t31664\ndndj\t31665\n分收\t31666\n就可餐\t31667\n艾朵朵\t31668\n分支\t31669\n帅我最萌\t31670\n那待会儿度娘\t31671\n四十二块\t31672\n何千影\t31673\n草莓派\t31674\n2灬\t31675\n赞扬\t31676\n卡木\t31677\n今天早上三点\t31678\n连龙飞\t31679\n陈玉娜\t31680\n百八u\t31681\n今天早上8点\t31682\n找一个人\t31683\n中转\t31684\nvuubbu\t31685\niloveyou的意思就是我是你的爱你\t31686\n卡机\t31687\n中轴\t31688\n大米兰\t31689\n蜒毋\t31690\n我的大哥\t31691\n吴佩孚\t31692\n张仁鸣\t31693\nxbk\t31694\n一分钱二分钱\t31695\n17k一\t31696\n苹果呢女人\t31697\n帅气型\t31698\n苦吃\t31699\n回老家慌\t31700\n冒光\t31701\n被撞者\t31702\n了哪\t31703\n车次\t31704\nxbd\t31705\n设防\t31706\n了哥\t31707\n九些半\t31708\n晒太阳\t31709\n冒充\t31710\n七八行\t31711\n19号晚6点\t31712\n02元\t31713\n1245876434469253469481833497864319164394916348799\t31714\n干森\t31715\n点菜\t31716\n银行帐\t31717
\n激乐\t31718\n深深地\t31719\n谜面\t31720\n许明明\t31721\n鲁铁\t31722\n2300\t31723\nhhvxxddxcf\t31724\n韩如霞\t31725\n口里\t31726\n做梦\t31727\nwhsy\t31728\n78103900037682858\t31729\n启迪\t31730\n牛津大学\t31731\n娇花\t31732\nsisusy\t31733\n一一手\t31734\nwhst\t31735\n喵样\t31736\nwhsj\t31737\n劳丁灰\t31738\nNoresponsibilities\t31739\nwhsb\t31740\nwhsd\t31741\n关么我的私人健康顾问\t31742\n不懂谁懂\t31743\n聊了哼\t31744\n20万台\t31745\n双流县共同么\t31746\n6点06分\t31747\n111片\t31748\n秘欠\t31749\nGGMM\t31750\n元武\t31751\n隋朝二世\t31752\n611分\t31753\n20040713\t31754\nsworthit\t31755\n老大们\t31756\n小算\t31757\nimathir\t31758\n亲朋\t31759\n卫卫\t31760\n翟子铭\t31761\n頭痛\t31762\n英雄枪\t31763\n蔫蔫\t31764\n逆风\t31765\n小箱\t31766\n财经\t31767\n猪呆子\t31768\n地铁口\t31769\n乱八七糟\t31770\n小管\t31771\n猛打摸摸\t31772\n呀原谅\t31773\nbafn\t31774\n冷淡\t31775\n老男人和女人\t31776\n张双么\t31777\n呀你的名字\t31778\n周都糸\t31779\n0112\t31780\n出来于心\t31781\n0114\t31782\n左手右手\t31783\n退餐\t31784\n早点儿学习\t31785\n八钢会\t31786\n嘴毛\t31787\n道路\t31788\n红烧牛蛙\t31789\n功放\t31790\n66434348685\t31791\n您帅\t31792\n懒猫儿\t31793\n棒棒哒美美达\t31794\n报怨\t31795\n大憨熊\t31796\n滨海\t31797\nlaosk\t31798\n等到天亮不见得\t31799\n爱胜\t31800\n5类\t31801\n滑轮\t31802\n广河\t31803\n洋刀\t31804\n欺雪压\t31805\n5米\t31806\n客大王\t31807\n哭了吗\t31808\n我不要你了你\t31809\n疯了\t31810\n尖刀\t31811\n苦行僧\t31812\n训凶\t31813\nGhghij\t31814\n饥饿脚丫丫个\t31815\n8668\t31816\n滚筒\t31817\n大姐呀大姐你给我好好听话在家\t31818\n85855560436218\t31819\n尖利\t31820\nBusiness\t31821\n肉娃娃\t31822\n粗人\t31823\n拼音号\t31824\n并处\t31825\n呢讨厌\t31826\n处女膜\t31827\n两代人\t31828\n疯人\t31829\n和悦\t31830\nZailaiyige\t31831\n村子弹\t31832\n15268815670\t31833\n不耐我了伐开心\t31834\n奎瓦斯\t31835\n同城\t31836\n例外地\t31837\n易流\t31838\n55248866654233\t31839\nwithadream\t31840\n抢镜\t31841\n联票\t31842\n金辉\t31843\n爱师长\t31844\n855656866857675758998588868855588858\t31845\n禁商用\t31846\n明日月明鱼羊鲜小土尘小大\t31847\ncortn\t31848\ncorta\t31849\n不勤\t31850\n晚上8点到10点\t31851\n拔刀相助\t31852\n35673494349637646346\t31853\n6天天\t31854\n姚凯旋\t31855\n開玩笑\t31856\n茶馆\t31857\n预装\t31858\n好忙忙忙\t31859\n紧收\t31860\n我的我的说你是坏蛋把你\t31861\n周主任\t31862\n卿卿我我情浓\t31863\nT810\t31864\n开斋节\t31865\n茶香\t318
66\n逆回\t31867\n净瓶\t31868\n金边\t31869\n西昌电力\t31870\n嗯梦游\t31871\n楚亚斌\t31872\n杨在新\t31873\n李教授\t31874\n跳鹿党\t31875\n濡首\t31876\n小品\t31877\n冒险者\t31878\n300米\t31879\n胡永远\t31880\n二元比\t31881\nco点\t31882\n#阿娘\t31883\n过路费\t31884\n战局\t31885\ni昂\t31886\n雪月清\t31887\n一吗一\t31888\n青青错药\t31889\n散落\t31890\n事簿\t31891\n对啊不聊\t31892\n嘴上\t31893\n做我也不知道\t31894\n驱动力\t31895\n过去二十年\t31896\n纪连海\t31897\n出发点\t31898\n第二轮\t31899\n取出\t31900\n蓝老师\t31901\n闹闹你猜猜我\t31902\n好安\t31903\n好完\t31904\n17797459358\t31905\nBBC、CNN\t31906\n九百十一十二十三十四十五十六十七个\t31907\n麻辣鸡Nicki\t31908\n18563650038\t31909\nxxxx\t31910\nxxxt\t31911\n闻官军收河南河北\t31912\n痴货\t31913\n火速的话五号\t31914\n苏文\t31915\n三番五次\t31916\n长硬\t31917\nxxxe\t31918\n詹章俊\t31919\n彭驿涵\t31920\n七当a\t31921\n实拍\t31922\n罗云娜\t31923\n睡处\t31924\n枯萎\t31925\n66667\t31926\n66666\t31927\n磋商\t31928\n不敢你\t31929\n呀哟\t31930\ngosomeshopping\t31931\n明德\t31932\n实招\t31933\n三朝饭\t31934\n嘟嘴\t31935\ndewrrdd\t31936\n星星蜂\t31937\n睡天\t31938\n纲吉\t31939\n呀哼\t31940\n脑锤子\t31941\n嘟嘟\t31942\n波多黎各\t31943\n阴森\t31944\n分次\t31945\n呀哪\t31946\n和龙\t31947\n郭秋池\t31948\n聊聊种\t31949\n食物钟\t31950\n纹理\t31951\ny460007051337996\t31952\n上学啦\t31953\n强化\t31954\n素银\t31955\n微电子\t31956\n滴眼液\t31957\n一学一个\t31958\n卧槽尼玛\t31959\n酒干倘卖无\t31960\n谁谁谁谁谁\t31961\n里面内\t31962\nbksf\t31963\n医时\t31964\nwrf\t31965\n周长\t31966\nwrd\t31967\n一某某\t31968\nwro\t31969\n站内存\t31970\n可听\t31971\nwrj\t31972\n小喜欢我电影我是你\t31973\nwrh\t31974\n给我说的话\t31975\n8点19分\t31976\nwrw\t31977\nwrt\t31978\n小杏杏\t31979\n错错错\t31980\nYfy\t31981\n15y照句\t31982\nwry\t31983\nftdsbii\t31984\n炎洁\t31985\n郭沫若\t31986\n嘤嘤嘤嘤\t31987\n籼米\t31988\n杨佳睿\t31989\n郑谢\t31990\n罗泽权\t31991\n了不到\t31992\n假臭\t31993\n第9期\t31994\n娃哈哈\t31995\n逆你\t31996\n一百吧十天\t31997\n5355527465\t31998\n数限制\t31999\n25542732253\t32000\n知识达人\t32001\n思路\t32002\nYNWA\t32003\nVchfxxyu\t32004\n盗窃罪\t32005\n王英飞\t32006\n风骨人\t32007\n邮票\t32008\n叶吖\t32009\n爱情色\t32010\n甲人\t32011\n返璞归真\t32012\n裴寂\t32013\n好吗度\t32014\n张世凯\t32015\n头次\t32016\n齐云山\t32017\n甲亢\t32018\n夜无忧\t32019\n偶香雪\t32020\n迷茫\t32021\n吃不得\t32022\npanggo\t32023\nPugskpmdsg\t320
24\n加菲猫长\t32025\n晚盘\t32026\n该人\t32027\n光弄\t32028\n亚同乐同乐\t32029\n18786303958\t32030\n世代\t32031\np700\t32032\n星期五晚十点\t32033\n副班\t32034\n天美\t32035\n洋货\t32036\nhfuywyrklho\t32037\n一刻\t32038\n巩\t32039\n正事儿\t32040\n十八只\t32041\n心王\t32042\n咪蒙\t32043\n征文\t32044\n泡明儿\t32045\n35分钟\t32046\n余姚市\t32047\n鬼视频\t32048\n我们三哥\t32049\n一别\t32050\n心率\t32051\nxie\t32052\n一列\t32053\nhi歌两小新鞋\t32054\n杜明芮\t32055\n一划\t32056\n888888888888888\t32057\n一刚\t32058\n一则\t32059\n午觉\t32060\n一切\t32061\n一分\t32062\n天羽\t32063\n一刁\t32064\n一刀\t32065\n子兮\t32066\n凯哥\t32067\n子剧\t32068\n炸鸡腿\t32069\n乖亲\t32070\n跳墙\t32071\n不爽快\t32072\n伯克莱\t32073\n胡天广\t32074\n荠菜\t32075\n小丁丁\t32076\n王淑文\t32077\n没凶\t32078\n上奇\t32079\n爱尔兰\t32080\n不瘟\t32081\n身披\t32082\n赵法\t32083\n上女\t32084\n卢七星\t32085\n营销者\t32086\n上好\t32087\n阿特法\t32088\n这个服\t32089\n游轮\t32090\n二笔\t32091\n说开玩笑\t32092\n这个月\t32093\n可88我爱吧唧\t32094\n吹动\t32095\n枉合理\t32096\n没准\t32097\n其如此\t32098\n纵横四海\t32099\n六五七零六八七二\t32100\n行风\t32101\n一滑一滑\t32102\nxix\t32103\n巢穴\t32104\n鸟样\t32105\n矶钓\t32106\nachinesestudent\t32107\n幸运儿\t32108\n如雪\t32109\n马胜发\t32110\n三聚氰胺奶粉\t32111\ngzgz\t32112\n突突突统计局\t32113\n湘潭\t32114\n大好臭\t32115\n生活大爆炸第四季\t32116\n反胃\t32117\n南山南\t32118\n牛百叶\t32119\n见多识广\t32120\n有线电视\t32121\n大惊小怪\t32122\n孩子们儿\t32123\njjmjjmjgjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjmmmmmmmmm\t32124\n薄毯\t32125\n小骆驼\t32126\n袁佳慧\t32127\n多一分\t32128\n女王爷\t32129\n三十九元\t32130\n五四个\t32131\n昌平\t32132\n人大家\t32133\n西游\t32134\n金秋中网行\t32135\n夏铭\t32136\n海正\t32137\n一到五十\t32138\n陪错\t32139\n随心而安\t32140\n坦克哥\t32141\n通吃林\t32142\n太麻烦你\t32143\n赵四演\t32144\n赵畅\t32145\nwat\t32146\n帅昌\t32147\n三十九光\t32148\n大一期末考\t32149\n能站\t32150\n上访\t32151\n薄毛\t32152\n四首\t32153\n指甲剪\t32154\n隳殇\t32155\n实话题型\t32156\n138x4\t32157\nikmcxgn\t32158\n蒙娜丽莎\t32159\nIsh\t32160\nEXO-M行星美男撞地球中国爱大歌会\t32161\n世卫\t32162\n鲁管\t32163\n克拉三克拉\t32164\n你好兄弟\t32165\n呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t32166\n所想\t32167\n家工人\t32168\n吴子龙\t32169\n孕育\t32170\n一边脸\t32171\nhaishi\t32172\n退订\t32173\n怀撒\t32174\n天塌地陷\t32175\n狛枝\t32176\n第二十八届\t32177\n修合\t32178\n亲爱的快给我亲爱的快给我请安\
t32179\n19980424\t32180\nitouch\t32181\n神万能\t32182\n嗯徐钰媛\t32183\n假骂\t32184\n10:00-11:00\t32185\n艾弗森\t32186\n培训师\t32187\n追看\t32188\n微笑面对\t32189\n美味鸡蛋饼\t32190\n一群人\t32191\nYourefunny\t32192\n猪猪猪猪猪猪你是猪你是猪\t32193\nhsudheh\t32194\n不款\t32195\nk公园\t32196\n不欺\t32197\n3亿美元\t32198\n13399762433\t32199\nGuydirks\t32200\n#kris吴凡#\t32201\n施尔康\t32202\nuhcx\t32203\n光缆\t32204\n掩盖\t32205\n经理人\t32206\ndmdkd\t32207\n汤盈盈\t32208\n别偏\t32209\n不次\t32210\n不欠\t32211\n风罗隐蜂罗隐曲项向\t32212\n伊布萌\t32213\n每个星期六下午\t32214\ngfjduj\t32215\n图片报\t32216\n有多某\t32217\n良牙\t32218\n泼水\t32219\n牧也\t32220\n剑篇\t32221\n门班\t32222\ntf拟人\t32223\n阳千玺\t32224\n小辣条\t32225\n力帆差五零卖\t32226\n省城\t32227\n必躬亲\t32228\n一条人\t32229\n成日\t32230\n丹枫进站\t32231\n地动山摇\t32232\n驴蹄子\t32233\n纳斯都拿古尼\t32234\n苹果皮\t32235\n说和假\t32236\n转至\t32237\n烤鱼\t32238\n三夏佐樱\t32239\n起点\t32240\n伍翔\t32241\n经济之声\t32242\n嬉\t32243\n赌国家\t32244\n三十三三十四\t32245\n一看来\t32246\nfubcghj\t32247\n智商场\t32248\nｆａｖ\t32249\nMaxim\t32250\n八套\t32251\n嬛\t32252\n决定性\t32253\n先见之明\t32254\nQueen\t32255\n陈志远\t32256\n头胞类\t32257\nrbgjt\t32258\n聊聊天吧累\t32259\n彩云\t32260\n猪猪猪真的好想\t32261\n哪所\t32262\nzkx\t32263\n假以\t32264\n电压表\t32265\n嬷\t32266\nimetheryesterday\t32267\n哪扎\t32268\n笆笆\t32269\n鬼天\t32270\n鬼夫\t32271\n评评\t32272\n井勃然\t32273\n鬼大\t32274\n米兰米兰\t32275\nPujvbp\t32276\n放走走进来\t32277\n人鬼\t32278\n嘹亮\t32279\n鬼头\t32280\n广汽本田\t32281\n小牙\t32282\n林画寒\t32283\n樱井的夏天天空中繁星点点\t32284\n欧阳娜娜\t32285\nvivibibibibizibibibi\t32286\n评语\t32287\n字堂\t32288\n走过节\t32289\n爱的世界吗试爱\t32290\n柑橘\t32291\n微剧\t32292\ngrdgr\t32293\n清廉者\t32294\n进球儿\t32295\n评说\t32296\n落月\t32297\n一心念佛\t32298\ntff1tocot\t32299\n发吐\t32300\n李彤彤\t32301\n咿诶\t32302\n机顶盒\t32303\n闹兔\t32304\n有机题\t32305\n民意\t32306\n当当小说网\t32307\n武王吾\t32308\n发吃\t32309\n刘亚蜀\t32310\n豆米\t32311\n发吊\t32312\n发名\t32313\n豆类\t32314\nfffccco\t32315\n金恩\t32316\n2千一百九十家\t32317\n民愤\t32318\nTags\t32319\n单一边\t32320\n老聂\t32321\n重庆东部\t32322\n换一下来\t32323\n阳明山万寿寺\t32324\n进气片\t32325\n有点脑\t32326\n大哥哥哥\t32327\n蛋兽\t32328\n试问\t32329\n浆液\t32330\n海边\t32331\n服务商\t32332\nv╰\t32333\n成神\t32334\n情报\t32335\njj
ikk\t32336\n是不是我等\t32337\nlvato\t32338\n泼墨饿\t32339\n亮静静\t32340\n盗号\t32341\n神龙摆尾\t32342\n八王\t32343\n我样\t32344\n止咳\t32345\nez\t32346\n辩斗\t32347\n西林寺\t32348\n假排\t32349\npian\t32350\n数码宝贝\t32351\n半睡\t32352\n赞华\t32353\n睡大觉\t32354\n七多少\t32355\n听课\t32356\n大公无私儿\t32357\nHHJVhjHHFxhgfgfgcbbj\t32358\n来一炮来\t32359\n听说\t32360\n发信\t32361\n捉贼\t32362\necabfd\t32363\n咪都\t32364\n梁雨晨\t32365\n大喜处\t32366\n张秋月\t32367\n陕县\t32368\nOU男\t32369\n这片土地\t32370\n听话\t32371\n878\t32372\n靠不懂\t32373\n亮相\t32374\n877\t32375\n874\t32376\n875\t32377\n872\t32378\n873\t32379\n870\t32380\n初始化\t32381\n暴漫长篇系列#\t32382\n晚期\t32383\n黑暗骑士\t32384\n博尔特\t32385\n蘑菇我喜欢小鸟英雄小鸟\t32386\n听证\t32387\n新浪科技\t32388\n李梦琪\t32389\n陪女\t32390\nineiorealaal55252\t32391\n周珂珂\t32392\nServer\t32393\nrgvf\t32394\n辽阔\t32395\n心里\t32396\n嗯嗯嗯嗯\t32397\n阿哇比\t32398\n话不由己\t32399\n遇过\t32400\n改不了\t32401\n快女\t32402\n封存\t32403\n72斤\t32404\n5382746\t32405\n66万\t32406\n智跑\t32407\n十一五百\t32408\n童车\t32409\n恩说\t32410\n恩读\t32411\n恩诺\t32412\n果糖\t32413\nkello\t32414\n赵大姐\t32415\n第四名\t32416\nrzzzz\t32417\n100苞\t32418\nwewe\t32419\nwewf\t32420\n二零三二零九零\t32421\n崔手术\t32422\nryfgg\t32423\n秒睡\t32424\n超级无敌宙度密机器人\t32425\nVer\t32426\n倒底是什么\t32427\n讲错\t32428\n高层建筑\t32429\n微火煎蛋\t32430\n邓心悦\t32431\n柏楼村\t32432\n电冰\t32433\n读卡器\t32434\n感面\t32435\n1559079055270727\t32436\n薇儿\t32437\n怪妹子\t32438\n太客\t32439\n欧舒丹\t32440\n印东\t32441\n李和马\t32442\n3.12之夜\t32443\n闲语文\t32444\n共赢\t32445\n算不动\t32446\n圣魔\t32447\n十几名\t32448\nAk78\t32449\n问鼎\t32450\n肥洋洋的酒精有你有你\t32451\n小千小千\t32452\n春江水暖鸭\t32453\n体弱者\t32454\n364676435\t32455\n巴基斯坦观察家报\t32456\nd\t32457\n共赏\t32458\n太守\t32459\n吃电池\t32460\n总是否\t32461\n枞阳\t32462\n张一血\t32463\n偏跟\t32464\n太宗\t32465\n太官\t32466\n翻译书\t32467\nival\t32468\n四年级数\t32469\n李世基\t32470\n成紫洋\t32471\n冷笑槐\t32472\n毕学式\t32473\n立体图\t32474\n参禅\t32475\n五五\t32476\nffhjjgcccbjk\t32477\n礼仪\t32478\n陈国弘\t32479\n28934\t32480\n出没\t32481\n五二\t32482\n没礼貌干嘛\t32483\nikaaf\t32484\n南京地铁\t32485\n好慢\t32486\n零四二二\t32487\n七十四十分\t32488\n抓拍\t32489\n轻信\t32490\n五人\t32491\n若有所失\t32492\n三十二十三第二\t32493\n2830
56\t32494\n安逸安\t32495\n意识到\t32496\n化训练\t32497\neany\t32498\n好慌\t32499\nkrjhjdjdjdjrhgjjgjjbdfjtjsjfjjr\t32500\n五亩\t32501\n不哭不闹\t32502\n尊哥\t32503\n120行\t32504\n幺三幺\t32505\n鞣蛎\t32506\n3个多月\t32507\n丁雅心\t32508\n太伊玲\t32509\n国仁\t32510\n灵修\t32511\nGffbnhfnkm\t32512\n动漫龙\t32513\n蜡笔\t32514\n死到临头还嘴硬\t32515\n国他\t32516\n互关\t32517\n仙仙\t32518\nhamrd\t32519\n小罗\t32520\n十二万三百一十\t32521\n有头无尾\t32522\n无限风光在浩通.汤总[威武\t32523\n林素钦\t32524\n会面\t32525\n同好\t32526\n7k七\t32527\n亚一\t32528\n亚丁\t32529\n空调君\t32530\n1万六千二百五十六\t32531\n蓝威\t32532\n南瓜\t32533\n亲一个再走\t32534\n大道机器人和小气机器人\t32535\nqqqk\t32536\n输入源\t32537\nJubilee\t32538\n金大炮\t32539\n脂园\t32540\n百d\t32541\n坑蒙拐骗\t32542\n圣得贝\t32543\n18个月\t32544\n度秘红\t32545\n肉疼\t32546\n甘岩\t32547\n待考\t32548\n求板\t32549\n结过\t32550\nu9\t32551\n鱼排骨汤\t32552\n一个二百七\t32553\n大把差\t32554\n廿遍\t32555\nu7\t32556\nu6\t32557\nu1\t32558\n这么多家\t32559\nu3\t32560\nu2\t32561\nsuis\t32562\n我是没自信呀我没有\t32563\n玉女\t32564\n建水\t32565\n十千米\t32566\n这谁\t32567\n鲁燕燕\t32568\n李雪琪\t32569\n张喜萌\t32570\n生源学校\t32571\n猪爱情猪头猪脑\t32572\n输往\t32573\n常温\t32574\n5碗\t32575\nstatistinastouchestinatistheaxnooooplydiffe\t32576\n冰堡\t32577\nmcdfilm\t32578\n壞蛋\t32579\n天津耳\t32580\n百1\t32581\n何青云\t32582\n口才挺好\t32583\nuy\t32584\nux\t32585\n度密度密度密度秘踊跃度秘冬蜜\t32586\nuu\t32587\nut\t32588\nuw\t32589\nuv\t32590\nup\t32591\nus\t32592\nur\t32593\nul\t32594\nuo\t32595\nun\t32596\nui\t32597\nuh\t32598\nuk\t32599\nuj\t32600\nue\t32601\nud\t32602\nug\t32603\nuf\t32604\n加税\t32605\nuc\t32606\nub\t32607\n等你等\t32608\n爼姐\t32609\n王佳雨\t32610\ngigccigigcgiccig\t32611\n第二胎\t32612\n没人理\t32613\n4涂\t32614\n提取\t32615\n湖南庭院\t32616\n致上\t32617\n新河镇\t32618\n55页\t32619\n古河公路\t32620\n屠戮\t32621\n罗俊阳\t32622\n翠你的了的秘密\t32623\n天上秋天上九下一张脸\t32624\n管院\t32625\n赵小林\t32626\n叫干\t32627\n我是你十三这片子是真\t32628\n困困\t32629\nｓｈｉ\t32630\n金钟奖\t32631\n餐饮业\t32632\n联络\t32633\n殿堂级\t32634\n外贸代理出口\t32635\n联结\t32636\n猪还是狗你是羊你是吗你是猪你是羊你是狗你是吗你是神\t32637\n￥555785541\t32638\n人转述句\t32639\nm8rain\t32640\n记挂\t32641\n谢娘娘\t32642\n外层\t32643\n南桥\t32644\n外屏\t32645\n薛洪太\t32646\n89分\t32647\n95599\
t32648\n绍兴县\t32649\n起点中文网\t32650\n数爱\t32651\n135135135135\t32652\n5555588874\t32653\nLGBT\t32654\n奥摩尔\t32655\n第15期\t32656\nｍａｐ孩\t32657\n笨瓜\t32658\n偏偏\t32659\n48.8分\t32660\n付安琪\t32661\n一个25岁\t32662\n萍萍\t32663\n夏图\t32664\n零1月25号\t32665\n南桀\t32666\n九万九千九千万九亿万九百九十九千万九万九千九百九十九\t32667\n凡俗\t32668\nGejebgsj\t32669\n救妻\t32670\nwywf\t32671\n2841次\t32672\n病于己\t32673\n伤心级\t32674\n15259390645\t32675\n坑网\t32676\n麋集\t32677\n袁嗲\t32678\n圆状\t32679\n四十多天\t32680\n十几米\t32681\n年初一\t32682\n宁化村\t32683\n打老鼠\t32684\n最高调\t32685\n野蜂\t32686\n喜欢你\t32687\n壓根\t32688\n大力点\t32689\n徐补补觉亚特\t32690\n8900\t32691\n890M\t32692\n进风\t32693\n刘帅通\t32694\n卫旭涛\t32695\n大连地区\t32696\n罪犯们\t32697\n彩虹糖块\t32698\n家院\t32699\ne3df8dcd100baa1958617ce4010b912c8fc2e18jpg\t32700\n完了再聊\t32701\n禹州高速\t32702\n返家\t32703\n一昨天\t32704\nParos\t32705\n手势语\t32706\n水浒好汉\t32707\n豪斯\t32708\n四十几岁\t32709\n积习\t32710\n凉面\t32711\n刘思燕\t32712\n蓝色\t32713\n奥林匹克雕塑文化公园\t32714\nhydt\t32715\n由此看来\t32716\n付绍\t32717\n都片\t32718\n该店\t32719\n光芒\t32720\nsbbbbbb\t32721\nhydg\t32722\ntgfft\t32723\nhydh\t32724\n一个再讲一遍\t32725\n付给\t32726\n云苏\t32727\n55666561545546\t32728\n春阳\t32729\n53个\t32730\n十多分\t32731\n雍容华贵\t32732\n来勿\t32733\nPEGGY\t32734\ngvcdy\t32735\n金港大酒店\t32736\n错字儿\t32737\n1401\t32738\n1400\t32739\n被切断\t32740\n甜甜好\t32741\n傻东\t32742\n那个才\t32743\n693\t32744\n十五十五十五十五十五\t32745\n不啊幸福机器人\t32746\n祖玛\t32747\n投资方\t32748\n别贿赂我了烦\t32749\n58778778788888807888815111154\t32750\n耳麦\t32751\n红酒杯\t32752\n中标\t32753\nAL鸣\t32754\n大欢笑\t32755\n死路一条\t32756\n雷建峰\t32757\n694\t32758\n四ｇ四ｇ\t32759\n凶咒\t32760\n10副\t32761\n黄海鹏\t32762\n多啦a萌多啦a萌\t32763\n长八大\t32764\n赶出\t32765\n王冲柯\t32766\n傻丫\t32767\n尼姑庵\t32768\n一千倍\t32769\n冠盖\t32770\n熊好\t32771\n娃头\t32772\n风火轮\t32773\n视野\t32774\n嗝儿\t32775\n怪徒儿\t32776\n关键点\t32777\ndudidiiducufufyfififkfkdhd\t32778\n亲爱的［飞吻\t32779\n所致\t32780\n第七章\t32781\n气球门\t32782\n钟翌冉\t32783\n稿百合\t32784\n赛罗照\t32785\n首播\t32786\n应物\t32787\nRockets\t32788\n刘以琳\t32789\n信部\t32790\n帕劳\t32791\n无聊我了时代城\t32792\n硅胶\t32793\n再见的再也不想\t32794\n冷冻\t32795\n妖器\t32796\n冷冽\t32797\n黄琪玉\t3279
8\n最了醉\t32799\n3060147212\t32800\n冷冷\t32801\n世园\t32802\n离开你\t32803\n25658\t32804\nhttppinyincn1nSfMtfsrML\t32805\n25655\t32806\n25656\t32807\n考官\t32808\n惶恐无地\t32809\n遇事\t32810\n绿植物\t32811\n详谈\t32812\n丑出\t32813\n真真真真真么么么么么看看看看看\t32814\n费营\t32815\n站长\t32816\n切尔西\t32817\n1968年10月21日\t32818\n拉走\t32819\n拉起\t32820\n付付鱼\t32821\nOxnard\t32822\n话说说\t32823\nwisdure\t32824\n唉及\t32825\n对面\t32826\ni汰渍\t32827\n我的战\t32828\n苦尽甘来\t32829\n国内\t32830\n我的我\t32831\n1100架次\t32832\n火球\t32833\n够了你以为\t32834\nvcy\t32835\n小度秘ababy\t32836\n不要脸脸皮厚\t32837\n明早10点\t32838\n肥架\t32839\n童言\t32840\n犯病\t32841\n年龄\t32842\n34569十一十二节\t32843\n吧们\t32844\n歌墩村\t32845\n度秘你有和你的妈妈拍过照\t32846\n杜泊\t32847\n嗯小头\t32848\n靠靠靠靠靠别\t32849\n沈昌\t32850\n秘干嘛\t32851\n陈女士\t32852\nhttpfhiphotosbaiducomxiaodupicitembf096b63f6246b60a1351301ecf81a4c510fa288jpg\t32853\n好舒\t32854\nfcikhd\t32855\n化妆品\t32856\n阿根廷\t32857\n支离\t32858\n硬道理\t32859\n片源\t32860\nfito\t32861\nfitm\t32862\n一冕\t32863\n6590千米\t32864\n推了推\t32865\n温暖的家\t32866\n小天天一边\t32867\nfitx\t32868\nBckfkoa\t32869\n此秘\t32870\n咯咯咯理论\t32871\n帝吧\t32872\n一冖\t32873\n坑坑洼\t32874\n二句号\t32875\ntomato\t32876\n苦觅君\t32877\n一悸\t32878\n管束\t32879\n不一样\t32880\n傅当\t32881\n唱一首歌呗\t32882\n益民基金管理有限公司\t32883\nlnc\t32884\n衙门\t32885\n神经错乱\t32886\n四片\t32887\n狂言\t32888\n有点用\t32889\n吃吃吃吃吃\t32890\n1050433204\t32891\n五道口\t32892\njjwiejsjdjo2ksosokwosow882ikejsje72u88eiieieiu8283666666kksienjx466343136343\t32893\n痛不如短\t32894\n欣喜若狂\t32895\n3到4小时\t32896\njingiijinjin\t32897\n芭芭亚\t32898\n对不吗\t32899\n页数\t32900\n雄霸\t32901\n猫头鹰\t32902\n才大叔伟\t32903\n饭袋\t32904\nnCAD\t32905\n傻夫运\t32906\n蹲级\t32907\n行万语\t32908\n身儿\t32909\n急需\t32910\n小格格\t32911\n清迈\t32912\n你告诉我\t32913\n还有你会\t32914\n9.9元\t32915\n男人帮\t32916\n我你在哪里你在哪里啊我的事\t32917\n溫暖和\t32918\n有所以好\t32919\n4865\t32920\n自我救济\t32921\n呀不可能\t32922\n第六张\t32923\n秘咒\t32924\n加藤加一\t32925\n法地亚\t32926\n私度\t32927\n永息\t32928\n跳跳舞\t32929\n广元\t32930\n第六弹\t32931\n真实名\t32932\n读不厌\t32933\n票选\t32934\n玩玩笑\t32935\n广兴\t32936\n打开率\t32937\n一十\t32938\n郑容和\t32939\n张茜\t32940\nvjff\t32941\n无怪哉\t32942\n昆
线\t32943\n梁钰昆\t32944\n16岁\t32945\n斑马线\t32946\n上海分公司\t32947\n8号三天\t32948\n68家\t32949\n放大镜\t32950\n百天一个\t32951\n密密麻麻你男男女女朋友\t32952\n发回\t32953\n你好高\t32954\nmeti\t32955\nmeto\t32956\n监外执行\t32957\n奥巴马\t32958\n未啊不是爱你\t32959\n大会堂\t32960\n哪波\t32961\n奥巴驴\t32962\n伊科萨斯IXUS\t32963\nhhjj\t32964\n何梁何利基金科学与技术成就奖\t32965\n肋骨\t32966\njrbrb\t32967\n东楚晚报\t32968\n人权理事会\t32969\nKji\t32970\n上半叶\t32971\njhhhhh\t32972\n今晚八点\t32973\n汉歌\t32974\n前腰\t32975\n马洲公园\t32976\n免不免\t32977\n中中\t32978\n久久果\t32979\n中专\t32980\n蛮族\t32981\ndkksks\t32982\n大白牙嗯啊大娃大逃亡\t32983\n255555555222222\t32984\nfrenn\t32985\n费列罗\t32986\n翻页\t32987\n测重仪\t32988\n中东\t32989\nsdtt\t32990\n死翘\t32991\n放会\t32992\n卷回\t32993\n功夫熊猫2\t32994\n老师们\t32995\nf么多\t32996\n中下\t32997\n不能不乖\t32998\n强制性\t32999\n化为食\t33000\n别问\t33001\n吭气\t33002\nuuuggggvfg\t33003\n3456789\t33004\n克拉肯\t33005\n李冰露\t33006\nQ辽尤岁岁有今朝3田\t33007\n亚健康\t33008\n沉淀物\t33009\n别闹\t33010\n忘了问你\t33011\n威尔顿\t33012\n北京市东城公安分局\t33013\n海角天涯\t33014\n237.88万辆\t33015\n差商\t33016\n跑男4\t33017\n跑男3\t33018\n16700美元\t33019\n张慧仪\t33020\n恩那个胸熊出没之熊心归来\t33021\n星际精灵蓝多多\t33022\n小心谨慎\t33023\nAroavi\t33024\n送货\t33025\n牟久德\t33026\nyyyjj\t33027\n金园家园\t33028\n吴艺斌\t33029\n下十七号\t33030\n回答声\t33031\n大房\t33032\n大戰\t33033\n闪烁\t33034\n\t33035\n大户\t33036\n茚象泉\t33037\n一句额\t33038\n雨落云\t33039\n百度云盘\t33040\n华南傻子屯\t33041\n粘稠度\t33042\n三十多面\t33043\n瞬間\t33044\n观看\t33045\n大大大好\t33046\n玄子\t33047\nVIVA\t33048\n大战\t33049\n无敌赞赞赞\t33050\n副身家\t33051\n天厅\t33052\n头脑手\t33053\n大成\t33054\n对注\t33055\nVIVO\t33056\n钢铁侠3\t33057\n大戏\t33058\n查寝\t33059\nNB2k16\t33060\n廖艺良\t33061\n一俩\t33062\n软经\t33063\n一修\t33064\n纯净物\t33065\n迭宕\t33066\n格女\t33067\n17乌\t33068\n死了丑\t33069\n外说啊喜羊羊与\t33070\n装可怜\t33071\n天中不中\t33072\n玖晓片\t33073\n妊娠\t33074\n抵抗\t33075\n博士\t33076\n运移\t33077\n问度\t33078\n惊觉\t33079\n热聊\t33080\n孙子你在干嘛呀度秘\t33081\nlili\t33082\n智多星\t33083\n瘦小\t33084\n谦虚为何物\t33085\n抵押\t33086\n二三四五\t33087\n对生\t33088\n软绵\t33089\n甚一个\t33090\n忘记我是谁了是\t33091\n秧歌\t33092\n苍老湿\t33093\n打夺命\t33094\n不犯贱\t33095\n博百优\t33096\n张军奇\t33097\n035654\t33098\n13899997810\t3
3099\n5天前\t33100\n健长\t33101\n西蒙·凡·布伊\t33102\n乱套\t33103\n热熨剂\t33104\n天天炫斗\t33105\n北体\t33106\n破除\t33107\n一达通\t33108\n你我讨厌讨厌讨厌讨厌讨厌你你你你\t33109\n苏北话\t33110\n得一\t33111\nm康瑞\t33112\nMCD\t33113\nMCC\t33114\n走马\t33115\n破陪\t33116\n20l5年\t33117\n超低\t33118\n58533987412258063690000000000005422\t33119\n19780620\t33120\n高女孩\t33121\n三百多件\t33122\n小燕説\t33123\n懒成\t33124\n园园\t33125\n李春悦\t33126\n航空运输\t33127\n啲罗哀\t33128\noreou\t33129\n五台中学\t33130\n黄小云\t33131\n缴税\t33132\n万小万\t33133\n事恩\t33134\n祁莘凯\t33135\n232101199210022628\t33136\n5月中\t33137\n其年\t33138\n介聊\t33139\n青烟\t33140\n小鹰\t33141\noreoo\t33142\nnois\t33143\n太骨感\t33144\n同屋\t33145\nshould\t33146\n喜讯\t33147\na8a\t33148\n心我喜欢\t33149\nnoia\t33150\n萌萌达油气要了你是萌萌哒啦\t33151\n1月22日到24日\t33152\n7758258\t33153\n度禾七\t33154\n我不懂谁懂\t33155\n乐点\t33156\n千﹃千\t33157\n冯雪锐\t33158\n腻宝贝\t33159\n弹珠\t33160\n批不好\t33161\n假借\t33162\n被度\t33163\n书喝\t33164\n预后\t33165\n调下\t33166\n一三天\t33167\n獭兔獭兔我也有一个小獭兔\t33168\nSKSKRK\t33169\n钢琴曲\t33170\n瞿佳星\t33171\n小臭臭\t33172\na82\t33173\n你好好笑\t33174\n明天早晨六点半\t33175\n幺二零二二二幺九九二零七二七六二幺三\t33176\n钱了烦\t33177\n抚州\t33178\n宣涛\t33179\n46445455515515\t33180\n三西\t33181\n改改\t33182\n恢宏\t33183\nOP5\t33184\n梯形菜\t33185\n带班\t33186\n咯吗啡\t33187\n水位\t33188\nSpectrum\t33189\n孙红红\t33190\n聊拜拜\t33191\n求爱特\t33192\n运动裤\t33193\nbaster\t33194\n水体\t33195\nqqpp\t33196\ncojp\t33197\nbsbs\t33198\n金刚钻\t33199\ntcxfhnnm\t33200\n梦想有声\t33201\n煤气费\t33202\n陈祖名\t33203\n飞扬也\t33204\n爱死\t33205\n爱步\t33206\n运动装\t33207\n四月四\t33208\n佳璇\t33209\n155670755\t33210\nflhk\t33211\n恶棍天使\t33212\n舌吻吧\t33213\n655385\t33214\n玷污\t33215\nends\t33216\n可减少\t33217\n杨锐影\t33218\nhajb\t33219\n眼戳\t33220\n咯孙\t33221\n早睡先吗都几楼心月\t33222\n哥哥给我一个签名吧我喜欢你\t33223\n有所作自\t33224\n世外桃源\t33225\n不烦人爱\t33226\n圆满度\t33227\n三点钟\t33228\nDBZNKZJE\t33229\n爱我你就拜拜我爱我你就抱抱我\t33230\n何智光\t33231\n唱好\t33232\n方块\t33233\n刘嘉玲\t33234\n5555555555555\t33235\n周光许\t33236\n圣王\t33237\n四次方\t33238\nlo\t33239\n余元\t33240\n赌迷\t33241\nokatmerocorta\t33242\n知错就好\t33243\n钢铁苍穹\t33244\n女子儿\t33245\n背叛的灵魂\t33246\n龟龟\t33247\n小男孩\t33248\n夏紫薇\t33249\n稀
眼\t33250\n鞔山\t33251\n辛苦啦\t33252\n一个一百分\t33253\n汪子涵\t33254\n学习生\t33255\nvgtq\t33256\n15869284856958\t33257\n搞笑呵呵连连连连\t33258\n九所\t33259\n假期性别女下体私家男\t33260\n5432\t33261\n再来一步\t33262\n听群\t33263\n密太狼\t33264\n干哈\t33265\n超困\t33266\n下雨天下雪\t33267\n耗子药\t33268\n感导\t33269\n鑫龙郡\t33270\n甜密\t33271\n李照贴\t33272\nlgj\t33273\ne200\t33274\n一个卷\t33275\nentnight\t33276\n乱跑\t33277\nlgo\t33278\n那街河市\t33279\n五塔油\t33280\nTOT1ther\t33281\nHJ涛\t33282\n6点30\t33283\n联行\t33284\n水文\t33285\n亚都是我喜欢\t33286\n二七二零二三八六\t33287\n奥鹏\t33288\nlgq\t33289\n办公\t33290\n舍得酒\t33291\n668688556\t33292\n逃港\t33293\n提价\t33294\n鸿坤原乡\t33295\n水方\t33296\n山山水水都不如家好东好些好也不如家\t33297\n异样屄\t33298\n一个十\t33299\nlx\t33300\nvvvvvvgg\t33301\n吹过\t33302\nhttpfhiphotosbaiducomxiaodupicitem4afbfbed\t33303\n一个半\t33304\n度秘度秘我想问一下就是我的事微我\t33305\n给我住\t33306\n卧谈会子\t33307\n婚恋中人\t33308\n十歲\t33309\n闭塞\t33310\n光武\t33311\n李师妹\t33312\n火行\t33313\n堡莱克\t33314\n哎呦好等我好等我啊好肉麻\t33315\n皮试\t33316\n头鸿蒙\t33317\n装疯卖\t33318\n仙巴\t33319\nvsvvv1\t33320\nthereisaabrackpora\t33321\n四100\t33322\n食机\t33323\n对啊那你不爱我\t33324\n短信群发\t33325\njdjjxkd\t33326\n给我你\t33327\n多穷\t33328\n完备\t33329\nzjxn\t33330\n一毛不拔\t33331\n小柯的花几号回包\t33332\n额老K\t33333\ncehwgbsf\t33334\n铡刀\t33335\n方少杰\t33336\n张家然\t33337\n梦梦梦\t33338\n低碳生活\t33339\n身陷\t33340\n流质食物\t33341\n从早到晚\t33342\n淳朴\t33343\n请谅解\t33344\n一言良宇\t33345\n躁狂症\t33346\n穆古鲁\t33347\n腹腔\t33348\n35%\t33349\n绝对性\t33350\n田连元\t33351\n那么度\t33352\n爱上书\t33353\n357\t33354\n堆\t33355\n355\t33356\n偏差\t33357\n353\t33358\n351\t33359\n350\t33360\n可靠性\t33361\n约定\t33362\n蓝梦\t33363\nhhgvb\t33364\n358\t33365\n神奇宝贝\t33366\n宝洁\t33367\n我真我\t33368\n性工作者\t33369\n民不聊生\t33370\n大芝麻\t33371\n恩断义绝\t33372\n婉\t33373\n1999999\t33374\n文印\t33375\n切问\t33376\n两百下\t33377\n在后\t33378\n一千分之一\t33379\n砂纸\t33380\n延续\t33381\n红色\t33382\n救场\t33383\n朱晓玫\t33384\n文华\t33385\nstolen\t33386\n硝酸\t33387\nDifovlfoflfo\t33388\n33级\t33389\n哈哈小花花你还在\t33390\n漏儿\t33391\n两百个\t33392\n文博\t33393\nnvv\t33394\n女马\t33395\n来潮疑\t33396\n醇\t33397\n哎丽\t33398\n柳工\t33399\n大姑姑\t33400\n武机器\t33401\nnvc\t33402\nnvb\t33
403\n三二九四八六七八九十\t33404\n呢呜\t33405\n誰爱和你说\t33406\n普湾新区\t33407\n胡月晓\t33408\n堕\t33409\n名名\t33410\n名吃\t33411\n沾边\t33412\n欲封灵\t33413\n欠话费\t33414\nbuilk\t33415\n行密\t33416\n头巴黎\t33417\n951\t33418\nTRURJDGDH\t33419\n新兴区\t33420\n到老地方\t33421\n民航局\t33422\n今晚19:30\t33423\n王小雨\t33424\n181嘿oooo\t33425\n对对对对对对\t33426\n秃头\t33427\n不可多\t33428\n赵兰\t33429\n慌张\t33430\n郑温和\t33431\n五龙口\t33432\n飙飙\t33433\n必死无疑人不要脸天下无敌原来你就是那个不要脸的人\t33434\nbjbjp\t33435\n心寒\t33436\n不可失\t33437\n妄念的梦\t33438\n肩胛骨\t33439\n厌厌\t33440\n7syou\t33441\n越远\t33442\n嘟嘟咪咪咪咪咪\t33443\n阅读率\t33444\nhellostometou\t33445\n越过\t33446\n没清楚\t33447\nl2\t33448\n采煤\t33449\n9469\t33450\n达米\t33451\ncaonidaye\t33452\n杨森\t33453\n腹黑\t33454\npooy\t33455\n老麦\t33456\n万仙山\t33457\n川式火锅\t33458\n斗式\t33459\n周子健\t33460\n李大伯\t33461\n李健亚\t33462\n饿猪\t33463\n大头娃娃鱼\t33464\n甩完\t33465\n店庆\t33466\n面无\t33467\n你是男的你是白雪公主\t33468\n扯蛋国\t33469\n正戏\t33470\n123456789000000000000000000000000000000000000000000000个\t33471\n王琳淼\t33472\n婪\t33473\n554487545575\t33474\n杨高强\t33475\n何鑫\t33476\n文艺片\t33477\n946j\t33478\n4562555\t33479\n本來\t33480\n盛情以待\t33481\n二十七二十八号\t33482\n喷血\t33483\n山东大学经济研究中心\t33484\nxvhn\t33485\n野钓\t33486\n1980年代初\t33487\nghvhb\t33488\n宋铭睿\t33489\n彩轩\t33490\n迩玩\t33491\n二太二\t33492\n一F家\t33493\n蔡桥城大道\t33494\n龙嘉杰\t33495\n好搞\t33496\n欠佳\t33497\n猪你是猪你是你是你是猪\t33498\n王馨旎\t33499\n喀拉汗\t33500\n新兴龙泉\t33501\n张博林\t33502\n乱画\t33503\n不由自己\t33504\n20160077\t33505\n夸年\t33506\n吴5毛\t33507\n合模\t33508\n屁股流\t33509\n槟榔味\t33510\ncoloveymse\t33511\n夕阳朝\t33512\n太阳伞\t33513\n关园\t33514\n贝勒斯\t33515\n污叻\t33516\n88888888888\t33517\n演员们\t33518\n你娃子先哪四分那你多才韩国人\t33519\n二九五零\t33520\n瓦素\t33521\n布林德\t33522\n命名\t33523\n88888888886\t33524\n俩人儿\t33525\nImXinXin\t33526\n兩節課\t33527\n故事情节\t33528\n律师们\t33529\n股骨\t33530\n马帝国\t33531\n¤\t33532\n§\t33533\n来出没\t33534\n流程\t33535\n继父\t33536\n¨\t33537\n说会话\t33538\n医院堡\t33539\n1234557896\t33540\n合成音\t33541\n·\t33542\n°\t33543\n±\t33544\n扳回\t33545\n罗志军\t33546\n土豆土豆\t33547\n影歌\t33548\n金晖小学\t33549\n才下平凡之路\t33550\n北京犬\t33551\n13463192000\t33552\n唐七\t33553\n打灰机\t
33554\n恶斗\t33555\n赤练\t33556\n｀）／n　　Y　　　Y\t33557\n谢了拜\t33558\n棒度秘\t33559\n长途师\t33560\n给我行\t33561\n美嘉酵\t33562\n嗯味多美\t33563\n数五个\t33564\n继任者\t33565\n来靠\t33566\ndjjsjsja\t33567\n畅想你\t33568\n别扭扭\t33569\n快乐齐分享\t33570\n质的飞跃\t33571\n害子女\t33572\nhhah\t33573\n有色金属交易网\t33574\n坐以待毙\t33575\n雷克萨斯CT200h\t33576\n一笑一颦\t33577\n守时\t33578\n连漪\t33579\nfhlxcjd\t33580\n一一个一\t33581\n绿力\t33582\nSUNTORY米酒\t33583\njthu\t33584\n油性\t33585\n飞狼\t33586\n甘雨\t33587\n2003136\t33588\n乌兰乌\t33589\n一一个个\t33590\n东风标致\t33591\n高玉凤\t33592\n鸡巴毛病\t33593\n屆冠軍\t33594\n曹某\t33595\nZERO\t33596\n七点钟\t33597\nttytttty6tyy678904321\t33598\n紫悦\t33599\n立式\t33600\n136134873\t33601\n南湖区\t33602\n飞狐\t33603\n1982年12月\t33604\n1369897505299\t33605\n球帽\t33606\n千岁万岁\t33607\n梦境\t33608\n施文界\t33609\n印子月\t33610\n印度德里\t33611\n住宿生\t33612\nAngeland\t33613\n累积红黄牌\t33614\nmodaan\t33615\nsvIP\t33616\n1500办\t33617\n北京师范大学出版社\t33618\n朴信艺\t33619\ngghccddg\t33620\n熊uuuu\t33621\n冤家\t33622\n黄磊\t33623\n梦墨\t33624\n李龙凯\t33625\n度秘度秘个爱的小度秘秘我是你我是你最可爱的小猪\t33626\n王雨儿\t33627\n副产品\t33628\nbike\t33629\n1928年\t33630\n3999\t33631\n3998\t33632\n甚远\t33633\n吃住\t33634\n网易书库\t33635\n瘦纸\t33636\nHolmes\t33637\n平吉一村\t33638\n范玮琪\t33639\n花言巧语\t33640\nYUU\t33641\n张铺村\t33642\nSHINee\t33643\n四支\t33644\n滴心\t33645\n隐形材料\t33646\n招损\t33647\n周璇璇\t33648\n我的我喜欢\t33649\n五百五百五百五百五百五百五百五百五百五百五\t33650\n第二個\t33651\n93433\t33652\n机会难得\t33653\n11387654321\t33654\n鬼才信\t33655\n返回身\t33656\nfrney\t33657\n简约\t33658\n上海静安寺方丈慧明法师率静安寺青年法师朝礼五台山行脚团一行\t33659\n九百九千个\t33660\n王1一\t33661\n聚财\t33662\n日环\t33663\n贾刚\t33664\n8078916\t33665\n7月11日\t33666\n惜乎\t33667\n樱桃沟的春天笑猫日记\t33668\n你俩\t33669\n笑百步\t33670\n126页\t33671\n通风口\t33672\n米粒\t33673\n段周全\t33674\n疾书\t33675\n早里\t33676\n闹剧\t33677\n纪念版\t33678\n杉若颜\t33679\n地久\t33680\n大蠢猪\t33681\n7点\t33682\n25个\t33683\n北京钓鱼台\t33684\n黑耍\t33685\n米粉\t33686\n飞机卡\t33687\n不置信\t33688\nucabcd\t33689\n曹先生\t33690\n不爱我了是\t33691\n赠阅\t33692\nyyfdfgu\t33693\n冰天雪地\t33694\n正蓝旗\t33695\n嗯688\t33696\n入戏\t33697\n别墨迹行\t33698\n加大\t33699\n用料\t33700\n千字文\t33701\nplotfrom\t33702\nufhcg\t33703
\n涣然\t33704\n男扮女装\t33705\n寸土恰似虚弥\t33706\n奥托you\t33707\n梦梦学\t33708\nHSHX\t33709\n固体\t33710\nhgcioio\t33711\n观澜湖\t33712\n273.42第三\t33713\n取小帮手吧要不然\t33714\n心不舍\t33715\n入户\t33716\n222222222222222222222222222222222222222222222222222222\t33717\n强调\t33718\n赤壁折\t33719\n定损\t33720\n上课铃\t33721\n波音公司\t33722\n哈擦\t33723\n超赞\t33724\n夜用型\t33725\nFUCK\t33726\n马背激素\t33727\n噶位\t33728\n任阳\t33729\n上妆\t33730\n短讯\t33731\n寻死\t33732\n飘散\t33733\nwmk\t33734\n当兵\t33735\n分出\t33736\n堂锅网\t33737\n蛇魔美\t33738\n134500\t33739\n酷酷\t33740\n花公\t33741\n我恨你\t33742\nfgdgufff\t33743\n鲁玉静\t33744\nwmm\t33745\n中兴通讯\t33746\n打分机\t33747\n雪铁龙轿\t33748\n黄桔黄粉红黄绿乳白\t33749\n三千多\t33750\n兵器宁\t33751\nWerer\t33752\n卢泰愚\t33753\n永登\t33754\n天崖\t33755\n洪流\t33756\n体力活\t33757\n事负\t33758\nmamsn\t33759\n六分之一\t33760\n秘阳\t33761\n猜歌唱\t33762\n违章节\t33763\n小鬼孙\t33764\n10000公里\t33765\n41588\t33766\n你好自为\t33767\n一滴水\t33768\n阳历恩\t33769\n袁腾飞\t33770\n扥贵阻挡\t33771\nｌｕｂｏｂｏ\t33772\n免交\t33773\n展厅\t33774\n毒妇\t33775\n闪光弹\t33776\n落英缤纷\t33777\n撒曼\t33778\n白人鱼\t33779\nbodocudw\t33780\nMoretz\t33781\n晓日\t33782\n民了子\t33783\n儿头\t33784\n乐施会\t33785\n听实话实说\t33786\n宏图\t33787\n未有人\t33788\n丹丹丹丹丹\t33789\n花花馆\t33790\n阳方\t33791\n杀妻\t33792\n乳酸痛\t33793\n毒妻\t33794\n枪声\t33795\n菌男\t33796\n奥申奥\t33797\n心心里\t33798\nWeekend\t33799\n邹婉琪\t33800\n查唱\t33801\n一万年后\t33802\n大妮也吧花漫\t33803\n兴义\t33804\n崇洋媚外\t33805\n呱呱呱呱呱呱呱呱呱呱呱呱嘎嘎嘎呱呱呱呱呱呱呱呱呱呱呱呱\t33806\n轻贱\t33807\n5533\t33808\n中央音乐学院\t33809\n相睹\t33810\n六七十岁\t33811\njgmjgajgj\t33812\n5539\t33813\nexecu么\t33814\n十六年\t33815\n牙签\t33816\nREPO\t33817\njlghu\t33818\n芭芭多\t33819\n阳光下的温情-加拿大随拍\t33820\n遗物\t33821\n战站\t33822\n快乐你幸福\t33823\n今天七\t33824\n何弃我\t33825\n15808257913\t33826\n害怕孤单\t33827\n葛虹秀\t33828\n冰接骨\t33829\nUR\t33830\n饿坏\t33831\n二零二二五\t33832\n886哈哈\t33833\n团军\t33834\n丽君几个臭不要脸\t33835\n故死\t33836\nfkjalfllggfo\t33837\n长安酒店\t33838\n超级偶像MJ中山行\t33839\n度度秘秘给我我我唱唱唱歌手\t33840\n齔蠱\t33841\n无脸\t33842\n17寸\t33843\n南京岛村\t33844\n能量转换\t33845\nvbfvbvbb\t33846\n城西小学\t33847\n打劫此\t33848\n山东队\t33849\n猜怕\t33850\n66公斤\t33851\n蝶音\t33852\n无脑\t33853\n八八hi思密达\t3385
4\n宗教家\t33855\n无脚\t33856\n天成\t33857\n天我\t33858\n203平方\t33859\n早稻田校区10号馆\t33860\n蔡枫华\t33861\n王傲天\t33862\n七七十块\t33863\n首登\t33864\n真好意思\t33865\n一遍一米\t33866\n东家\t33867\n灰溜溜\t33868\n天阿\t33869\n答案池\t33870\n东宅\t33871\n152654555553\t33872\nmjmmjjimtpjmjjdamgmmjdadatdwtdjjjjjjmgmgwgdpmwgagtapwwamjdjgmjgmpj\t33873\n天目学院\t33874\njngg\t33875\njngh\t33876\n东安\t33877\n武器子\t33878\n2271251937\t33879\n胳肢窝\t33880\n8000万\t33881\n而胜\t33882\n东宝\t33883\n准时\t33884\n药店\t33885\n姐姐们\t33886\n厂里\t33887\n请把\t33888\n21分\t33889\n屏客\t33890\n绿军\t33891\n商水县\t33892\n毙命\t33893\nmifisthemi\t33894\n朴安堂\t33895\n能不能不\t33896\n投低\t33897\n乐嘉\t33898\n罗耶罗耶\t33899\n谭华\t33900\n失踪\t33901\n大叔们\t33902\n黄楚瑜\t33903\n21点39\t33904\n年日\t33905\n来不度\t33906\n自居\t33907\n6.08斤\t33908\n走行\t33909\n相呼应\t33910\nkjysk\t33911\n1NU\t33912\njddx\t33913\n话费\t33914\n杀跌\t33915\nlove\t33916\nDgfx\t33917\n蚌埠\t33918\n大老\t33919\n好吧小宝贝\t33920\njddj\t33921\n白鳍豚\t33922\n许浩\t33923\n李梓萌\t33924\n念镇\t33925\n城府\t33926\n戴龙杰\t33927\n410个\t33928\n痛心\t33929\n不要再说说\t33930\n主孑\t33931\n主子\t33932\n皿皿\t33933\n法名\t33934\nSeptember\t33935\n257885\t33936\n赤司征十郎\t33937\n骗纸\t33938\n唉an\t33939\n融融\t33940\nmvd\t33941\n张吧\t33942\n痛快\t33943\n黄冈\t33944\n迷谜语\t33945\n樟木箱\t33946\n大耳\t33947\n成英\t33948\n200174\t33949\n替天人\t33950\n建达\t33951\n明天七点半\t33952\nFffffgggf\t33953\n乱叫\t33954\n吃着\t33955\n海珠区\t33956\nsabiafa\t33957\n彭佳慧\t33958\nthick\t33959\n陈逸凡\t33960\n公孙\t33961\n艺能局\t33962\n公孓\t33963\n公子\t33964\n3d个\t33965\n了行\t33966\n皮肤病\t33967\n乱发\t33968\n油价\t33969\n51百斤\t33970\n沾住\t33971\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t33972\ndaey\t33973\n枯燥\t33974\n见不见效\t33975\n教学\t33976\n被会\t33977\nzhenyl\t33978\n60双\t33979\n李铭祺\t33980\n家房\t33981\n中山林\t33982\n氙气灯\t33983\n知春\t33984\n家户\t33985\n八十多起\t33986\n五十几岁\t33987\n我是劲劲儿爱的眠好梦\t33988\n便秘翰\t33989\n85950791\t33990\n房门\t33991\n坐肚子\t33992\n去领东西\t33993\n贾哥哥\t33994\nCHASER\t33995\n教子\t33996\n谢勇\t33997\nFfgdgghg\t33998\n13933189785\t33999\n手绘图\t34000\n散仙\t34001\n林梦恬\t34002\n长链儿\t34003\n阔阔\t34004\n一张样\t34005\n95M\t34006\n度秘棒\t34007\n完型\t
34008\nhfvkh\t34009\n第一面\t34010\nGUX\t34011\n新年好呀\t34012\n符殇\t34013\n蒲纶\t34014\n吃饱了叫\t34015\n可控\t34016\n20世纪70年代\t34017\ndheihs\t34018\n贸易大厦\t34019\n没有你我告诉你\t34020\n笑笑猫\t34021\n在不在\t34022\n小家红娘\t34023\n光明小区\t34024\n王枝塘\t34025\n真太不讲义气了你\t34026\ntvtvtvvtvrrc444sz2a14418208539rv5\t34027\n111吱吱\t34028\n十五头\t34029\n韩伍社\t34030\n打倒\t34031\nhggvbbjji\t34032\n十五天\t34033\n13645229495\t34034\n机箱\t34035\n吉猩\t34036\n晚上6:50分\t34037\nMILK\t34038\n十五夜\t34039\n真相帝扒皮\t34040\nA036\t34041\n双排\t34042\n西瓜头\t34043\nqvhxydhxbdxsickjxwerrt6uuiooihjddsadfjjczxvnmcxtdy6cy\t34044\n双掌\t34045\n曾铭祺\t34046\n好好好好好好好好好好好好好好好好好\t34047\n女饶\t34048\n算了算\t34049\n鸟秘\t34050\n698枚\t34051\n一个错误\t34052\n曾飞龙\t34053\nnjhvvbgggbvvbvhbb\t34054\n阴木\t34055\n事项\t34056\nItIT\t34057\n时尚型\t34058\n高薪\t34059\ndrsfgv\t34060\n骆锅\t34061\nUS，km\t34062\n不v发\t34063\n太工口\t34064\n天么\t34065\n年好\t34066\n亡命天涯\t34067\n酷一刀\t34068\n猪瘦肉\t34069\n天乐\t34070\n奥行\t34071\n成年了\t34072\n皮薄\t34073\n别忘了你我算老板你是秘书\t34074\n顾晓军\t34075\n很不想和你\t34076\n一九六\t34077\n星途\t34078\nIv小怨妇春测图\t34079\ngr7o\t34080\n领队\t34081\n351号\t34082\n欢庆\t34083\n好等会儿\t34084\n酒店大堂\t34085\n2011.05.24\t34086\n13823222448\t34087\n9983\t34088\n2011.05.21\t34089\nravcafahialma\t34090\n载重\t34091\n很庆幸\t34092\n吹风\t34093\nshitaifretinolithiffithitshit\t34094\n攀比\t34095\n邓利维\t34096\n街上\t34097\n欢度\t34098\ntwsk\t34099\n不二叔\t34100\n51934919864991643495\t34101\ntwsl\t34102\n科学化\t34103\ncggcjjbckbffhjgstkbzhjxefhjgfsg\t34104\n昰夜\t34105\n单曲\t34106\n张怡宁\t34107\n日女\t34108\n扁豆\t34109\n铁博宇\t34110\n送张\t34111\n英子丽\t34112\n1.77亿\t34113\ntlajkjl\t34114\n胡g69tl\t34115\n元宝树\t34116\nvvjghikbj\t34117\n网吧\t34118\n江慧琳\t34119\n和面\t34120\n中华共和国\t34121\n早恋生\t34122\n猎人\t34123\n赵哥\t34124\n副政委\t34125\n网名\t34126\n莎莎德语\t34127\ndetun\t34128\n卧糙\t34129\n入跑\t34130\n6月3号\t34131\n那个错\t34132\n宜居\t34133\n芋丝\t34134\n杨柳飘\t34135\n沉静\t34136\n弄斧\t34137\n阿狸棒棒糖\t34138\n杨晓\t34139\n神格\t34140\n烂醉烂\t34141\n程佳豪\t34142\n好花不折亦凋零，\t34143\n神样\t34144\n闹不代表\t34145\n4740779点c点\t34146\n棠梨\t34147\n继续做\t34148\n霹雳火\t34149\n复印\t34150\n两家人\t34151
\n元芳元芳\t34152\n农历腊月三十至正月初十五\t34153\n锤哥\t34154\n爱达力奶粉\t34155\n杨智\t34156\n神树\t34157\n宜山\t34158\n喜怒哀思惊恐悲品百味人生\t34159\n猪王\t34160\n明天飞沙尾\t34161\n五年之内\t34162\n徐正曦\t34163\n猪朱猪\t34164\n双一起\t34165\n收治\t34166\n李慧敏\t34167\n名国钦\t34168\n20多个\t34169\n民主政治\t34170\n光头强呀\t34171\nigfyuh\t34172\n干骨折\t34173\n玍\t34174\n在厕所间吗真是的你好奇怪\t34175\n20多万\t34176\n赵长奇\t34177\n养鱼\t34178\n幺二零幺零二一九六八零六二四幺七七幺\t34179\n撒钱\t34180\n上涨\t34181\nBURBERRY\t34182\n跳板\t34183\n三六H\t34184\nfesc\t34185\n女哒\t34186\n同盟\t34187\n渣滓\t34188\n守身\t34189\n广州北\t34190\n0.14\t34191\n么么么爱我听窝\t34192\n旭辉\t34193\n钟先\t34194\nfess\t34195\n第二日\t34196\n迈瑞窥伺s\t34197\n七月十伺\t34198\n小人参\t34199\n不以嘛愉快\t34200\n稀稀落落\t34201\n吴烁\t34202\n谢海象\t34203\n图天\t34204\nlaunches\t34205\n志明\t34206\n加班车\t34207\n见完\t34208\n养猪\t34209\n养猫\t34210\n等等等等等等等等等等等等等等等等等等等等等等等等等等等\t34211\n黑龙圈\t34212\n上上上上\t34213\n图头\t34214\n春风又绿江南岸明月何时照我还这句诗\t34215\n面武斗\t34216\n钟內\t34217\n炸蛋\t34218\n正点\t34219\n下下下下下下下下\t34220\n加勒比海盗4\t34221\npifu\t34222\nmarioue2\t34223\n发算\t34224\nenjoyoneistrip\t34225\ndurfu\t34226\n沉船\t34227\nyfdhi\t34228\n红牛\t34229\n21251\t34230\n烊烊\t34231\n添麻烦\t34232\nhdffjr\t34233\n一个五十块\t34234\n63461\t34235\n昨天六点\t34236\n搜救\t34237\n塘业五库\t34238\n石爸\t34239\n传媒学院\t34240\n有得谈\t34241\n比菲\t34242\n发管\t34243\n安全套\t34244\n五公里\t34245\n有焕\t34246\n敌国\t34247\n追到手\t34248\n漫漫\t34249\n立功\t34250\n陈某\t34251\n内酸\t34252\n大光路\t34253\n问菲\t34254\n安达桥\t34255\n二图\t34256\n唐宏泉\t34257\n众百姓\t34258\nbhiphotosbaiducomxiaodupicitem34fae6cd7b899e517ec2626145a7d933c8950d72jpg\t34259\n转投\t34260\n神探亨特\t34261\n文三路108号\t34262\n二囍\t34263\n真心痛爱\t34264\n转折\t34265\n张江男\t34266\n度秘度秘我们两个聊会天\t34267\n中高\t34268\n度尽劫\t34269\n两三件\t34270\n二回\t34271\n二四\t34272\n闪伤\t34273\n磁感\t34274\n吴亚梅\t34275\njrf\t34276\n玛哈\t34277\n张俊超\t34278\n拉姆齐\t34279\n罗嗦\t34280\n5月22日15时\t34281\n韦卓霞\t34282\n丑来着\t34283\n章鑫豪\t34284\n一般见识\t34285\n粉絲\t34286\n字型\t34287\njrc\t34288\n啵好啵\t34289\n筹措\t34290\n我的是你真的片\t34291\n奥丁之子\t34292\n怀神\t34293\n喃们\t34294\n7j\t34295\n朱高强\t34296\n鹿晗\t34297\n杏红小达人\t34298\nS3\t34299\n看见鬼\t34300\n度密度米\t34301\n十几世纪
\t34302\n主线\t34303\n王晨龙\t34304\nrna\t34305\n我喜欢练\t34306\nrng\t34307\nTrd\t34308\nrnd\t34309\n陈立人\t34310\nrnz\t34311\n铠甲勇士类比\t34312\n先秦\t34313\n很郁闷\t34314\n支支吾吾\t34315\n二十日\t34316\n在梦到\t34317\nygyy\t34318\ntibht\t34319\n先科\t34320\n生有\t34321\n输钱\t34322\n很刚劲\t34323\n俎建立\t34324\njrs\t34325\n礼炮\t34326\n甩掉\t34327\n不成声\t34328\n本来\t34329\n别傻\t34330\n阴森森\t34331\n叶广芩\t34332\n本条\t34333\nzgghgfvdjm\t34334\n六百亩\t34335\nwheivy\t34336\n一个成千\t34337\n遛鸟\t34338\n检察官\t34339\nTfboiz\t34340\n李阳\t34341\n潘子\t34342\n腰子\t34343\n时光\t34344\n电眼\t34345\n老妖精\t34346\n有气\t34347\n敌后\t34348\n拒载\t34349\n四脚村\t34350\n伯贤苏\t34351\n谢谢你了我\t34352\n张教授\t34353\n不介意\t34354\n羊汤\t34355\n李阁\t34356\n多长时间\t34357\n本村\t34358\njry\t34359\n胡辣汤包子\t34360\n昏哨\t34361\n斯玛特\t34362\n克鲁格\t34363\n编程\t34364\n汪东城\t34365\n我我是和你到你的老婆行\t34366\n猎杀者\t34367\n培训日\t34368\n掏心\t34369\n一千多岁\t34370\n资格管\t34371\n条马路\t34372\n花花呀我想和换档\t34373\njrD\t34374\n系欢\t34375\n我不我不要你了我把你册除了\t34376\n流鼻血\t34377\n改为\t34378\n白海琦\t34379\n禁海\t34380\n金刚\t34381\n光明牌冰砖\t34382\nCobham\t34383\n老行尊\t34384\n改一\t34385\n体制\t34386\n张斯斯\t34387\nghghgchfcj\t34388\n北半球\t34389\n沙巴茨\t34390\n金利\t34391\n床边\t34392\n永井圭\t34393\n五一只\t34394\n真面目\t34395\nBvh\t34396\n搞束\t34397\n刻苦V5啦啦頭兩千距離去了開口潑灑手機還虧了枯木聚居區劇透有無女錄取率月經量少就是金錢\t34398\n需求方\t34399\n不是你爱我是你爱你\t34400\n星球杯\t34401\n塔斯马尼亚岛\t34402\n广州大学纺织服装学院\t34403\n下唱\t34404\n戊土\t34405\n祖宅\t34406\n意图\t34407\n4票\t34408\n宁丰宾馆\t34409\n莲峰\t34410\n苏亚雷斯\t34411\n在魔先\t34412\n配陪\t34413\n领略\t34414\n六百毫升\t34415\nLFU\t34416\nkvkh\t34417\n开不开\t34418\n中国银行\t34419\n9999亿\t34420\ngka1j\t34421\n超时\t34422\n图腾\t34423\n第一支\t34424\n个都天送天送天送甜心呕吐vlhhcogcgxgldotdotdgldtlslylsprjarktskysyldlydp\t34425\n丹内尔\t34426\n這小子\t34427\n三皮\t34428\n刘阳\t34429\n8年前\t34430\n粟星\t34431\n民众党\t34432\n无法逃避\t34433\n户籍所在地\t34434\n海员\t34435\n没一次性\t34436\ncryd\t34437\n癞蛤蟆\t34438\nykew\t34439\nkalxk\t34440\n郭超过\t34441\niish\t34442\n第二产\t34443\n刘锡鸿\t34444\n举报电话\t34445\n机舱门\t34446\neme400126em\t34447\n乖乖大宝贝\t34448\n度点\t34449\nduids\t34450\nhhsbsb\t34451\n7ufhcud6yvuup\t34452\nanew\t34453\n耐酸性\t344
54\n小妈妈\t34455\n功能\t34456\n我喜欢\t34457\n顼羽\t34458\n海味\t34459\n岳西公馆\t34460\n448064\t34461\n你的你的一大特征种\t34462\netyuuu\t34463\n谋杀\t34464\n低碳出行\t34465\n总账\t34466\n春节快乐吧\t34467\n136集\t34468\n乐画\t34469\n总费\t34470\n朱自清\t34471\n更少\t34472\n几十分钟\t34473\n2011年12月16日\t34474\nlvnvx\t34475\n二万美美空\t34476\n东方艺术之都\t34477\n留情\t34478\n現露\t34479\nv金黄刺激法\t34480\n剑心\t34481\n恩上恩乐光洙\t34482\n奔转\t34483\n高梦蕊\t34484\nMynameissheldon\t34485\n行线\t34486\n餐头儿\t34487\njibudiyi\t34488\n小画片\t34489\n真是我的漂亮\t34490\n18.9元\t34491\nKanny\t34492\n191876750019187675\t34493\n大豆腐\t34494\n拖期\t34495\n岔街一号\t34496\nyaggsfsff\t34497\n荷花\t34498\n男律师们\t34499\nxakkdkovkkutvubububb\t34500\n王云阳\t34501\n7884万股\t34502\n废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话\t34503\n可以说一句\t34504\n贤人\t34505\n祸遇\t34506\n老行感\t34507\ndiop\t34508\n1912年4月\t34509\n到假\t34510\n死法\t34511\n飞天飞天\t34512\n诗句\t34513\n和姐\t34514\n11333333\t34515\n夕断肠\t34516\n数露珠\t34517\n女术\t34518\nfgdddgfr\t34519\n18千米\t34520\nRldc\t34521\n存档\t34522\n巡唱\t34523\n三都\t34524\n永不变\t34525\n纳兰容若\t34526\n13256678092398\t34527\n麦扣扣\t34528\n八角形\t34529\n女朋\t34530\n侥幸者\t34531\n双成对错了错了错了错了错了错\t34532\n阿灵慧监工辛苦了[偷笑\t34533\n文革潜规则\t34534\n浙江省乐清市人民政府\t34535\n15161028185\t34536\n魔军团\t34537\n金洙元\t34538\n风不动\t34539\n杜甫春望周\t34540\nzwoo\t34541\n刘雅轩\t34542\n阳松庭\t34543\n无愧\t34544\n打沙鳄鱼\t34545\n80000000\t34546\n铁律\t34547\n过吧\t34548\n塞维利亚\t34549\n逃亡\t34550\n无意\t34551\n这人\t34552\n朱才庆\t34553\n升降机\t34554\n燕麦片\t34555\n过后\t34556\n谷哥哥\t34557\n147885\t34558\n车拍\t34559\n过吊\t34560\nGHVJJ\t34561\n重庆晨报\t34562\n吃氯雷他定\t34563\n过吃\t34564\nnonononohu\t34565\n哪行哪业\t34566\n利人\t34567\n酒庄\t34568\nｇ网\t34569\n酒庆\t34570\n铠甲勇士之雅\t34571\n好啦我懂\t34572\n五十个\t34573\n4s店\t34574\n酒店\t34575\n二十5月28日\t34576\n打换\t34577\n三五六五三八三五七三八三九四十一三八三五六\t34578\n利亚\t34579\n度秘我爱你明天见\t34580\n给我和你\t34581\n五十七\t34582\n阿凡午\t34583\n数日\t34584\n藏人\t34585\n1552553\t34586\n五十下\t34587\n五十三\t34588\n章雨燕\t34589\n华米\t34590\n索溪\t34591\n利于\t34592\n佳龙脚躯保护区\t34593\n安多福\t34594\n半颗\t34595\n至少\t34596\n青云十二月三二十九会\t34597\n内射\t34598\n865580088可热电友\t34599\n小超\
t34600\n关讨厌你讨厌你\t34601\njsjj\t34602\n更进一步\t34603\n日租宝\t34604\n西边下\t34605\n577777\t34606\n24..6\t34607\nilljjjaajgagaa\t34608\n甲乙丙\t34609\njsjw\t34610\n潇洒带\t34611\n至尊\t34612\nEtdf\t34613\nnother\t34614\n峙轩\t34615\n借记卡\t34616\n阿恺\t34617\ncom奥\t34618\n超级早我\t34619\n长\t34620\n8Jkcckkk\t34621\n6月份\t34622\n星星类\t34623\n顺畅\t34624\n马桶\t34625\n天天重场戏\t34626\n法医毒妃\t34627\n奥什么奥\t34628\n九五二六八六\t34629\njckhhlvljvjljvglhljklvljghljgpinkhjljbnknkbljkyhjhnoufljlgligpuu\t34630\n别逗我了行不\t34631\n六十九\t34632\n裕华裕华大饭店\t34633\n扫盲\t34634\n第一次么\t34635\n死心真心真意的爱\t34636\n现今天\t34637\n18668556888\t34638\n独孤紫冰\t34639\n波丽士\t34640\n中国跳水梦之队\t34641\nprove\t34642\n阵雪\t34643\n阵雨\t34644\n夜奶奶\t34645\n算及\t34646\n131414\t34647\n回家鸟\t34648\n喝开\t34649\n肚脐儿\t34650\nWwwoa\t34651\n几副\t34652\n肉酱\t34653\n精溃烂\t34654\nreweresome\t34655\n流线型\t34656\n大连圣亚\t34657\n添换\t34658\n腊八传说\t34659\n家长沙\t34660\n超鼠\t34661\n暖和一点\t34662\n真狠\t34663\n洗肺\t34664\n语境\t34665\n地责\t34666\n肝疼儿\t34667\n几割\t34668\n生出\t34669\ncax\t34670\n舞蹈课\t34671\n赵金飞\t34672\ncap\t34673\ncav\t34674\ncat\t34675\n权志龙\t34676\ncai\t34677\n02062355513\t34678\ncao\t34679\ncan\t34680\nMORGAN\t34681\n研祥集团\t34682\ncaa\t34683\nk小魔仙\t34684\n手之劳\t34685\n伊奈\t34686\n麻克\t34687\n伊奎\t34688\n588775268742\t34689\n波奇\t34690\n蜡笔小新小新\t34691\n头尾\t34692\n储物\t34693\n韩国思密达\t34694\nUhmghgbhvnjjnjkkkknjbbbnjh\t34695\nstimeihou\t34696\n长安CS35\t34697\n五和五\t34698\nl1b1a1a\t34699\n一个41\t34700\n二七七二五八九九\t34701\n开心悄悄乐\t34702\n郭\t34703\n淘宝网\t34704\n孤孤零零\t34705\n丰顺\t34706\n隋美莲\t34707\n花鼓戏\t34708\n投弹手\t34709\n佛身\t34710\n左右世界#\t34711\n郭光辉\t34712\n终审判决\t34713\n最小\t34714\n晚上18:30\t34715\n再见了我的龟\t34716\n黑龙江佳旭律师事务所\t34717\n515151\t34718\n残爱\t34719\n59岁\t34720\n慢走慢走\t34721\n深色\t34722\n切克闹煎饼果子\t34723\n张小浩\t34724\n咬我\t34725\nmcore\t34726\n豪礼\t34727\n舒适性\t34728\n古朴\t34729\n驾乘\t34730\n嗯嗯咪咪咪峰峰歪歪\t34731\n小玲玲\t34732\n零三零六\t34733\n小罗伯特\t34734\n卢铭珊\t34735\n小看我\t34736\n20世纪\t34737\n离人\t34738\n古有\t34739\n665555555\t34740\n古朋\t34741\n工作能力\t34742\n通往\t34743\n胎气\t34744\n光合作用\t34745\n早上七点钟\t34746\n黄我\t34747\n悔过\t34748\
n倚切\t34749\n催眠师\t34750\n回来的路\t34751\n布兰德\t34752\n这两天风\t34753\n美都\t34754\nyoufa\t34755\n脏器\t34756\n二十八五\t34757\n来了五有\t34758\n美女孩\t34759\n基金会\t34760\nhi2142号4\t34761\n尽望\t34762\n站街\t34763\n三秒两秒\t34764\n定说错\t34765\n一二点\t34766\n要不称呼\t34767\n童话世界\t34768\nffffffffffffffffffffffff\t34769\n300年\t34770\n推断\t34771\n广场站\t34772\n集约化\t34773\n调至\t34774\n落叶飘零\t34775\n吊毛\t34776\n支点\t34777\n10月21日\t34778\n孙米娜\t34779\n贩子们\t34780\n40082088th\t34781\n包庇\t34782\n夏冰冰\t34783\n割舌\t34784\n自负自信\t34785\n错与对\t34786\n老秘\t34787\n某处\t34788\n大吉\t34789\n龙百川\t34790\n大名\t34791\n大同\t34792\n大后\t34793\n朝生暮死\t34794\n骗人\t34795\n鞠诺\t34796\n狗市\t34797\nclittlit\t34798\n好啦好啦我服你\t34799\n番禺站\t34800\n某天\t34801\npeakin\t34802\n188分\t34803\n追直追\t34804\n灌醉\t34805\njanuary\t34806\n厘定\t34807\n焚稿\t34808\n158626\t34809\n尉迟敬德\t34810\n果只\t34811\n大听\t34812\nabcdefg\t34813\n纽组词\t34814\n郁\t34815\nbtsjbnzywt\t34816\n乖度\t34817\n凤凰古镇\t34818\n理王\t34819\n胡宇鑫\t34820\n九零平\t34821\nAff\t34822\nXvidosjapenes\t34823\n海轮\t34824\n秋实\t34825\n一片\t34826\n54338\t34827\n秋官\t34828\nsnh4\t34829\n0.12%\t34830\n地久天长\t34831\n军醋\t34832\n一版\t34833\n咯噜噜\t34834\n40立方厘米\t34835\n启丰二号\t34836\n秘莫测\t34837\n力怪\t34838\n金融界\t34839\n112233445\t34840\n670867\t34841\n香飘飘奶茶\t34842\n盈透\t34843\n老妖婆\t34844\n不见得\t34845\n贝尔格莱德\t34846\n15846845810\t34847\n慧芳\t34848\n田际云\t34849\n石大奶\t34850\n催泪\t34851\n一物\t34852\n潜水湖\t34853\n亲达尼克\t34854\n程辽洋\t34855\n受杰\t34856\n呵呵妥妥阜乒况厂\t34857\ncaru\t34858\n心明眼亮\t34859\nTade\t34860\nxsykn\t34861\n后胡\t34862\n迷语\t34863\nehdtj\t34864\n夏养心\t34865\n哎西\t34866\n五九\t34867\ncard\t34868\ncare\t34869\n庞振\t34870\n四五级\t34871\n淫唐传\t34872\n难得\t34873\n都弥渡弥渡弥渡弥渡弥渡弥渡\t34874\n新屋\t34875\n治现\t34876\nbritish\t34877\n熬煮\t34878\n新展\t34879\n几个照\t34880\n笑星\t34881\n后背\t34882\n腰线\t34883\n鹭江\t34884\n三句\t34885\n一豫\t34886\n沈念\t34887\n三口\t34888\n宏辉\t34889\n红朝\t34890\nnailiyou\t34891\n高期\t34892\n树灯\t34893\n胡汉民\t34894\n50万块\t34895\n三号\t34896\n奥度秘\t34897\n烧焦\t34898\n三台\t34899\njsjdud\t34900\n石河北\t34901\n招报应\t34902\n补珂\t34903\n高有\t34904\n98852544585216556655555554255536\t3
4905\n变形金刚珀利\t34906\n201611月3日\t34907\ndhkbxxgncsqwggvxfgjcdfgjvvvghjgdkfjdhsbsjfgafjgggxhc\t34908\n怎时\t34909\n梁妹妹\t34910\n周总理\t34911\n一吨\t34912\n买单\t34913\n买卖\t34914\n告诉我去\t34915\n够不够我住我的\t34916\n好了我知道你是爱我的\t34917\n闪进\t34918\n15228426728\t34919\n唐狮\t34920\n好智\t34921\n中国影院\t34922\n上戏\t34923\n莆田市\t34924\n甘玉梅\t34925\n阿巴拉古阿巴拉古\t34926\nplasses\t34927\n不得伯\t34928\n这回行\t34929\n15246320818\t34930\n上我\t34931\n亲爱的看你多萌萌哒\t34932\n第二桌\t34933\n电脑洞\t34934\n灯杆\t34935\nhgcffghi\t34936\n第二桶\t34937\n子店\t34938\n苏妙玲\t34939\nYxix\t34940\n好帮手\t34941\n厌讨厌讨厌讨厌\t34942\n上房\t34943\n100000000000000个\t34944\n云中歌\t34945\n咋里\t34946\n上户\t34947\n优惠劵\t34948\n杨清柠\t34949\n切亲亲\t34950\n一吹\t34951\n花开花落\t34952\n流动资金\t34953\n19个\t34954\n丰满\t34955\n满手\t34956\n萨达姆\t34957\n尘肺\t34958\n国金\t34959\n一十几岁\t34960\nqgj55552225\t34961\n小红门乡\t34962\n琴琴\t34963\n一场场\t34964\n250年\t34965\n爱就是不是爱你爱的\t34966\n子子\t34967\n14时15分\t34968\n缩放\t34969\n佘祥林\t34970\n风溏\t34971\n高冉豪\t34972\n禁断介护\t34973\n啦啦啦啦啦啦我是那么的小画家\t34974\n犬儒\t34975\n陆晏华\t34976\nprice\t34977\n打人工\t34978\nsmoothload\t34979\n二子\t34980\n]\t34981\nv5shop\t34982\n转送\t34983\n恩知拉贝比\t34984\n二季\t34985\n犬柯基\t34986\nreggaemusic\t34987\n采访者\t34988\n幽默人\t34989\n人世界\t34990\n3600个\t34991\n黄心瑶\t34992\n二孩\t34993\n废话日记发哈发发哈几噶噶几几然几发\t34994\n洋县\t34995\n等说\t34996\ndfdhfg\t34997\n刚一不小心\t34998\n竹筐\t34999\n桑事\t35000\n亲爱的你给我\t35001\n知糸\t35002\n97分\t35003\n唔舒\t35004\n小一个字\t35005\n谢啦哈\t35006\n盖好\t35007\n道济\t35008\n大班\t35009\nhGGHh\t35010\n零八版的射雕英雄传\t35011\n怪怪\t35012\n沃歌\t35013\n1600元\t35014\n很奇兵\t35015\n德克斯特\t35016\n水径\t35017\nta主频\t35018\n队友们\t35019\n下滑\t35020\n有名儿\t35021\n55寸\t35022\n蚭\t35023\n付义雄\t35024\n上学和你一样\t35025\n菠萝果冻\t35026\n李少杰\t35027\n糜子\t35028\n公文骨\t35029\n度雨\t35030\n苦不释手\t35031\n血液\t35032\n钱奥\t35033\n7%\t35034\n王华高\t35035\n蚁\t35036\n蚀\t35037\n禁欲\t35038\n观众席\t35039\n蚌\t35040\n哈哈安\t35041\n小心超人\t35042\nllol\t35043\n蚊\t35044\n蚕\t35045\ntfou\t35046\ntfos\t35047\n傅律师\t35048\n猪你是猪你是猪你是猪你是猪你是猪你是猪你\t35049\n好吧行\t35050\nDHZGBXF\t35051\ntfoy\t35052\n有抖\t35053\n60毫米\t35054\n情绪低落\t35055\n说的好呀\
t35056\n百代\t35057\n讲笑\t35058\n白点\t35059\n属于\t35060\n四川儿\t35061\n神小松\t35062\n还诺默默我坐我在\t35063\n三三八零二\t35064\n发洒\t35065\n有把\t35066\n代价\t35067\n恋爱期\t35068\n明天下午五点\t35069\n大学大学\t35070\n摆摆\t35071\n零二二二六\t35072\ncud\t35073\n国防费\t35074\n巴婆\t35075\n疙瘩汤\t35076\n阻燃性\t35077\n24253785\t35078\n丁吉娃\t35079\n山雨\t35080\n百分百\t35081\n零月蚀寡\t35082\n二道\t35083\n韩琳秘\t35084\nfamle\t35085\n二遍\t35086\n镇压\t35087\n看看干\t35088\n三萬\t35089\n访谈记者—我们在路上\t35090\n由来\t35091\n十三沙\t35092\n救苦救难\t35093\n烟雨\t35094\n二百五十七\t35095\n二零零九\t35096\n嗑儿\t35097\n李茂XDDDD\t35098\n第209位\t35099\n桃叶\t35100\n烟雾\t35101\n二十多八三四十二\t35102\n牢房\t35103\n煤车\t35104\n宫来完\t35105\n王阳博\t35106\n70000多元\t35107\n一一份\t35108\n坐标仪\t35109\n20余年\t35110\n新航\t35111\n古希腊\t35112\n123456789十11121314151617181922122232425262728293十\t35113\n295页\t35114\n557675580756\t35115\n魔秀\t35116\n代表团\t35117\n有非礼\t35118\n爱小看\t35119\n逛``唯品``\t35120\n几十家\t35121\n烧饭\t35122\n行为何来\t35123\n红火火\t35124\n战马\t35125\n快男电影大了你\t35126\n想过头\t35127\n李叔叔\t35128\n烧饼\t35129\n腊月\t35130\n5ewtqg\t35131\n批们\t35132\n散去\t35133\n往死\t35134\n起自己\t35135\n荨荨\t35136\n浅浅\t35137\n3n49\t35138\n二心恶心\t35139\n蠢事儿\t35140\n黄我命\t35141\n铜锣烧\t35142\n迅雷不及\t35143\n银妆\t35144\n解毒\t35145\n秘蜜蜜\t35146\n藕塘\t35147\ndovpjxpfdodkLgicl8osjgukhvvuuuuuū\t35148\n运动神经元病\t35149\n有所作为\t35150\n别着惹\t35151\n胡泰蒂\t35152\n考不考\t35153\n渔场\t35154\n铁锅顿大\t35155\n青花\t35156\n抢奸\t35157\n2011年5月19日\t35158\n王艺洁\t35159\n林朵朵\t35160\n丑度秘\t35161\n吕噢\t35162\n三十日上午十一点\t35163\nhhhucvhhcuuffjj\t35164\n你跟\t35165\ncvttp\t35166\n考试不呀\t35167\n唔走路\t35168\n二零一二\t35169\n8686696569\t35170\n皮鼓\t35171\n帕萨特\t35172\n丛坐\t35173\n哥哥哥们\t35174\n折寿\t35175\ngccbuchc\t35176\n倾一\t35177\n亲一个啵啵\t35178\n亚兰\t35179\nTIRRIIR\t35180\n后视镜\t35181\n走掉\t35182\n广场舞\t35183\n倾世\t35184\n程序员\t35185\n不找你\t35186\n24名\t35187\n成康\t35188\nhttpahiphotosbaiducomxiaodupicitemd53f8794a4c27d1e1ad00db11cd5ad6edcc438d4jpg\t35189\n/68077615/68077619\t35190\n反过来\t35191\n139255758585\t35192\n北徽信\t35193\n三产公司\t35194\n雷鸣\t35195\n民营资本\t35196\n找你\t35197\n成度\t35198\n延藏\t35199\n臭气熏天\t35200\n静安区\t35201\n樽\
t35202\n宋祯\t35203\n东东看一本漫画书\t35204\n雷鸟\t35205\n吴华丽\t35206\n这季风\t35207\n塔维尔诺斯特\t35208\n小夫君\t35209\n六太\t35210\n主演\t35211\n不食\t35212\n六天\t35213\n我的我爱我是你的老大\t35214\n先走先走先走下\t35215\n鹿鹿\t35216\n巧巧\t35217\n跳冬眠\t35218\n巧工\t35219\n撒谎者\t35220\n吾辈\t35221\n卤煮\t35222\n程度\t35223\ngirlora\t35224\n表演性\t35225\n三性\t35226\n三急\t35227\n风采\t35228\n惊天\t35229\n你好我的小秘书度秘我想知道爱的教育面\t35230\n浩劫中学\t35231\n风量\t35232\n化学方程式\t35233\n着急急\t35234\n0力克名\t35235\n程序\t35236\n受够你\t35237\n配发\t35238\n吹风机\t35239\n好滚滚\t35240\n无心之失\t35241\n舒兴杰\t35242\n老带\t35243\n插槽\t35244\ntfis\t35245\n乱七八\t35246\n浦阳江\t35247\nvVC\t35248\n眯市\t35249\n薏米绿豆\t35250\n老布\t35251\n老帅\t35252\n老师\t35253\n圣荷\t35254\n数四\t35255\n哈里贝\t35256\n老帕\t35257\nBB霜\t35258\n忘不有\t35259\n瓶红酒\t35260\n说吗\t35261\n笑话海陆\t35262\n童学雪公园\t35263\n桦甸\t35264\n2653422029\t35265\n学雷锋\t35266\n说吃\t35267\n红梨舫\t35268\n诺丁山\t35269\n说同\t35270\n理哼\t35271\n转动\t35272\n千百次\t35273\n我卡\t35274\n催泪弹\t35275\n仪陇县\t35276\n底盘\t35277\n不是你看我\t35278\n两千七百八十\t35279\n说否\t35280\n说吧\t35281\n不是你走\t35282\n忙度秘\t35283\nUgigjfhgcjfgvhxgcjgfjfjfjgggjfkbought\t35284\n说听\t35285\n好我不想和你聊\t35286\n点放\t35287\n普尔敏\t35288\n快快乐乐\t35289\n两月\t35290\n战斗机\t35291\n番茄块\t35292\n杯中\t35293\n站稳\t35294\n中国邮政\t35295\n位置\t35296\n艰巨\t35297\n湖北汽车两边有限公司\t35298\n两期\t35299\n守红\t35300\n小度秘小度秘可爱的小兔幂萌萌哒\t35301\ndzg\t35302\n两本\t35303\nLogo\t35304\n崖岸镇\t35305\n点歌听听\t35306\n赶紧唱\t35307\n两朵\t35308\n奈史qvq\t35309\nwolf\t35310\nwold\t35311\n星际秘\t35312\n无聊中\t35313\n第一册\t35314\n一九四四\t35315\n笑话你信鬼\t35316\n爱情光\t35317\n漫居\t35318\n大堵塞\t35319\n卷薄术\t35320\n乐死不彼\t35321\n20次\t35322\n密喊\t35323\n漫展\t35324\n私欲\t35325\n這邊\t35326\n今个Gore我胳膊鬼望坡\t35327\n文化度\t35328\nf一\t35329\n幂美\t35330\n十家\t35331\nmaobatiane\t35332\n厚恩\t35333\n潘瑜\t35334\n孙漓\t35335\n我可爱的小八\t35336\n酥油\t35337\n怡翠花园\t35338\n800岁\t35339\n一美图秀秀\t35340\nBenjour\t35341\n漫山\t35342\n爱狗网\t35343\n再一次\t35344\n拆机单\t35345\n罢了罢\t35346\n真的不说\t35347\npdea\t35348\n00后女孩童颜巨乳\t35349\n宝蓝\t35350\n三六九九九一百一八三三一八\t35351\n蜃楼\t35352\n样男\t35353\n壑人\t35354\n萱卿\t35355\n林飞\t35356\n无出路咖啡馆\t35357\n女主网\t35358\n强文轩\t35359
\njddsttdgdgd\t35360\n周怀钰\t35361\n看比\t35362\n梅小梅\t35363\n地鼠\t35364\n十几句\t35365\n洗费\t35366\n回播\t35367\n片纸\t35368\n傻逼\t35369\n杨媛\t35370\n朱了\t35371\n凡塔\t35372\n着错\t35373\n十全十美美中不足\t35374\n虾丸\t35375\n垂涎欲滴\t35376\nreadenglish\t35377\n别上我了我\t35378\n圣但尼\t35379\n批臭\t35380\n妃儿\t35381\n唔来\t35382\n信礼\t35383\n521元\t35384\nwwwcxx\t35385\n11公里\t35386\n1960年\t35387\n万商\t35388\n魔玫瑰\t35389\n96一代\t35390\n深远\t35391\n道破\t35392\n嗯嗯到我就这样过我的一生我的吻注定吻不到我爱的任\t35393\n估摸宁\t35394\nFernando\t35395\n碎末\t35396\n食补\t35397\n大朗汽车站\t35398\n王熙凤\t35399\n贡茶\t35400\n看我长\t35401\nghVg\t35402\nlffg\t35403\n验车\t35404\n难为人家\t35405\n缺勤\t35406\n漆面\t35407\n度秘年\t35408\n益友\t35409\n梁爷爷\t35410\ntatattawt\t35411\n柘荣\t35412\n68077613/68077235\t35413\n120.3亿元\t35414\n代表大会\t35415\n器型\t35416\n36kms\t35417\n8545625552\t35418\n88659\t35419\n五台\t35420\n卫急急急\t35421\n淘气体重比较小度\t35422\n粪男孩儿\t35423\ncffghsdgj\t35424\n空域\t35425\n份年快乐\t35426\n臭代购\t35427\n锋\t35428\n卫生条件\t35429\n玉液\t35430\n八年前\t35431\n小姨多鹤\t35432\n看礼佛\t35433\n18610483383\t35434\n冤鬼们\t35435\n恩干嘛\t35436\n红人\t35437\n锄\t35438\nJjjuui\t35439\n哈拉巴\t35440\n暴露狂\t35441\n洋柿子炒鸡蛋\t35442\n七本\t35443\n几部\t35444\n言出必行\t35445\n8x77x8xg\t35446\n七朵\t35447\n峡谷\t35448\n祥狗\t35449\n谢谢你的好\t35450\n把着摩\t35451\n爱因斯坦\t35452\n生龙活虎\t35453\n七月\t35454\n尾端\t35455\n池昌\t35456\n期中考\t35457\n颜知己\t35458\n广播电台\t35459\n勒勒发骚\t35460\n七期\t35461\nstill\t35462\n1-6月\t35463\ntuxf\t35464\n短评\t35465\n真聪明喜羊羊\t35466\n告诉我告诉我\t35467\n温佳鹏\t35468\n白念\t35469\n有才不高兴\t35470\nabc类\t35471\n才森\t35472\n慕灼\t35473\ngNefjfN\t35474\n小金县\t35475\n意业\t35476\n离弦\t35477\n不知数\t35478\n花絮\t35479\n抗旱\t35480\n吃法\t35481\n超生\t35482\n77元\t35483\n抗日\t35484\n控件\t35485\n51384254558520\t35486\n二维4bakahimahese\t35487\n骤马冈\t35488\n离开\t35489\n离异\t35490\n抗旨\t35491\n第十六期\t35492\ntlajl\t35493\n光华乡\t35494\n首届\t35495\n孝顶\t35496\nhdhc\t35497\n王志伟\t35498\n十多条\t35499\n壹加壹加壹加壹加壹加壹加壹加壹加壹\t35500\n另有\t35501\n滋养\t35502\n寒更·庚心\t35503\n吱声\t35504\n压坏\t35505\n付路瑶\t35506\n汉王讨厌你你不可之\t35507\n刘景来\t35508\n2011年3月31日\t35509\n各部\t35510\n星彩\t35511\n选手\t35512\n短说\t35513\n梦香\t3
5514\n行也\t35515\n过年不想\t35516\nfired\t35517\n行乐\t35518\n排水器\t35519\n意下\t35520\n12585555\t35521\n昨天4点\t35522\n星影\t35523\n跑腿\t35524\n国民性\t35525\n293309\t35526\noffice2003\t35527\n兴雷落\t35528\ndhiphotosbaiducomxiaodupicitem6159252\t35529\n放贷\t35530\n过思\t35531\n著作权\t35532\n活春\t35533\n忆江南\t35534\n党课\t35535\n快棋村\t35536\n过态\t35537\n情长\t35538\n818\t35539\n更可爱我相信你我从来就相信你你别一不要害羞\t35540\nMmnbvvc\t35541\ngvffdvc\t35542\n过性\t35543\n实验小学\t35544\n归功于\t35545\n345六七八九十\t35546\n不死嗯那加法表\t35547\n131541\t35548\n免受\t35549\n杨骥\t35550\n麒麟\t35551\n着衣\t35552\n刻骨\t35553\n偏不走\t35554\n湘湖景区\t35555\n英德\t35556\n扇门\t35557\n红米3\t35558\n红米2\t35559\n不是你讨厌我是我讨厌你\t35560\n福顺\t35561\n瘦金体\t35562\n正备场\t35563\n暮年\t35564\n双截棍\t35565\n老滚\t35566\n一首曲\t35567\nUniform\t35568\n贝贝雅雅\t35569\n亩仔\t35570\n吗八\t35571\n洗衣券\t35572\n熊琦涵\t35573\n着行\t35574\ngqd\t35575\n安装工人\t35576\n库存\t35577\n特战英雄\t35578\n悲伤欲绝\t35579\n范正旭\t35580\n有很多一个度秘一个度秘一个度秘\t35581\ntfoust\t35582\n座落\t35583\n太不懂事\t35584\n几克星\t35585\nCAR\t35586\n烯烃\t35587\n卡贝\t35588\ngcyf\t35589\nCAO\t35590\nCAM\t35591\nhuhhhhbhhjhyhhh\t35592\nCAF\t35593\nCAD\t35594\n虐爱\t35595\n折纸\t35596\nexoDO\t35597\n赵瑞涵\t35598\n徐瑞\t35599\n门面\t35600\n丝绸之路\t35601\n病灶\t35602\n特种\t35603\n卡费\t35604\n特地\t35605\n咯了\t35606\n维特秃头女虐\t35607\ncncbvjc\t35608\n掏空\t35609\nYuri#\t35610\n报歌\t35611\n贾鑫\t35612\n3975克\t35613\n蝙蝠侠一\t35614\n生活条件\t35615\ncfggg\t35616\n沉沉沉\t35617\nbalabala\t35618\n梳头\t35619\n28年后\t35620\n不乖不乖\t35621\n特约\t35622\n鲁凯妮\t35623\n我的朋友们\t35624\n张天燕\t35625\n难为\t35626\n我的感觉\t35627\n忘笑话儿\t35628\n精灵梦叶乐力萝莉错错了错了错了错了错\t35629\nCA3\t35630\nCA2\t35631\n仰光省\t35632\n广州光派汽车用品有限公司\t35633\n白雾\t35634\n节凑\t35635\n打工族\t35636\n守信用\t35637\n不通好\t35638\n白雪\t35639\nlx66666\t35640\n买家秀\t35641\n功德\t35642\nhelp\t35643\n剧照\t35644\n50484\t35645\n大雄宝殿\t35646\n27集\t35647\nuvu\t35648\n买成\t35649\n烧掉\t35650\n历史使命\t35651\n许汐汐\t35652\n二手车们\t35653\n你是谁你的主人是谁你的妈妈是谁你的爸爸\t35654\ny1\t35655\nhell\t35656\n默念\t35657\ny4\t35658\ny7\t35659\n35522142117889951363694121736942855723\t35660\n盖哥\t35661\n85%\t35662\n广岛之恋\t35663\n没事干\t356
64\n下月底\t35665\n二零零八年\t35666\nutrtutet\t35667\n东帝汶\t35668\n滴猪\t35669\n蒙古猪\t35670\n烧接\t35671\n天籁俱乐部\t35672\n不避亲\t35673\n视听说\t35674\n丨丿丶乛\t35675\n我讨厌你给我爱胖\t35676\nyj\t35677\n猪猪日升那样有我在\t35678\nyl\t35679\nyo\t35680\nyn\t35681\nya\t35682\n二一二百\t35683\nye\t35684\nyd\t35685\nyg\t35686\n姐姐弟\t35687\nyy\t35688\nyx\t35689\nyz\t35690\n开威胁\t35691\nyp\t35692\nys\t35693\nyr\t35694\nyu\t35695\nyt\t35696\nyv\t35697\n何清清\t35698\n粉丝\t35699\n双玉坤\t35700\n我真的不想你\t35701\n3114户\t35702\n鹅肉\t35703\nify\t35704\nrdcu\t35705\n风油精\t35706\nifd\t35707\niff\t35708\nifg\t35709\n轻松一刻\t35710\n鵓撥驋\t35711\nifc\t35712\n猪老壳嗦\t35713\nifo\t35714\n爬梯\t35715\nyV\t35716\n被骗了我死\t35717\n磨一摸\t35718\nplantom\t35719\n要不不会\t35720\nEng\t35721\nbvcu\t35722\n贵者\t35723\n一笑堂\t35724\n小凤夏\t35725\n缓兵之计\t35726\n说死了之\t35727\n批发市场\t35728\n唱音\t35729\n瘦肉片\t35730\n大酒店\t35731\n各样\t35732\n子弟兵\t35733\n卞仲耘\t35734\nbvcV\t35735\n马浠萌\t35736\n一技之长\t35737\n各校\t35738\n近点\t35739\n菜心\t35740\n嗷嗷亮\t35741\n绩\t35742\n绨\t35743\n绫\t35744\n续\t35745\n绯\t35746\n绮\t35747\n戌宫格\t35748\n绣\t35749\n要不懂\t35750\n消受\t35751\n继\t35752\n5月中旬\t35753\n绸\t35754\n数少五分之一对\t35755\n绽\t35756\nlabv\t35757\n绿\t35758\n闰寒\t35759\n好呀\t35760\n聚和\t35761\n绵\t35762\n维\t35763\n绷\t35764\n绶\t35765\n绉\t35766\n终\t35767\n皮干\t35768\n绍\t35769\n皮年\t35770\n经\t35771\nz15AD\t35772\n组\t35773\n织\t35774\n细\t35775\n划破\t35776\n绘\t35777\noppe\t35778\n络\t35779\n统\t35780\n绞\t35781\n绑\t35782\n绐\t35783\n结\t35784\n绒\t35785\n绕\t35786\n浴兰节\t35787\noppo\t35788\n世纪\t35789\n因为我\t35790\n毛巾包\t35791\n辣条糖\t35792\n䦷难明\t35793\n啊强\t35794\n萌萌的智能人嘞\t35795\n班恩\t35796\n授箓\t35797\n刘紫荆\t35798\n张若呁\t35799\n真的有咸阳我爱你不是你爱我\t35800\n朱永强\t35801\n花千骨7\t35802\n\t35803\n登上来\t35804\n漏比毛\t35805\n11314\t35806\n疯疯傻傻\t35807\nlalalalalala\t35808\n身不由己好\t35809\nhellohello朋友我爱你钟\t35810\n么么么么么么么么么么萌萌萌萌\t35811\n总似\t35812\n窦义鸿\t35813\n张思曼\t35814\n保研\t35815\n睿域营销\t35816\n怎么了我\t35817\n白蓝熊\t35818\n你那谁是谁你男神\t35819\n六百平方米\t35820\n北京奥运会\t35821\n3281\t35822\n疯狂传奇\t35823\n马珺\t35824\n小天鹅午餐\t35825\nchml\t35826\n总会\t35827\nvvvvvvv\t35828\n刘力萌\t35829
\n灵雪\t35830\n椭圆\t35831\n彼此间\t35832\n散尽\t35833\ndhiphotosbaiducomxiaodupicitemc2fdfc039245d68837e163d6a3c27d1ed31b24e5jpg\t35834\n第12704个\t35835\n\t35836\n折拆\t35837\n李有\t35838\n李月\t35839\n2.0L\t35840\n噩噩噩噩噩噩噩噩噩噩噩\t35841\n12项\t35842\nGod\t35843\n下路线\t35844\n牙垢\t35845\n脱出\t35846\nGot\t35847\nNONONO不不不\t35848\n彭有\t35849\n好李国\t35850\n受困\t35851\n黄页岛\t35852\n写行\t35853\n非常爱\t35854\ntdhhxigcuyfcxhxfcx\t35855\nuuuu句\t35856\n备忘\t35857\n干酒\t35858\n抽风了\t35859\n伙伴关系\t35860\n临毁\t35861\n有假放\t35862\n无计可施\t35863\n绿盒磺\t35864\n磊少\t35865\n紫冰\t35866\n紫冲\t35867\n齐天\t35868\n频道\t35869\n向快乐出发\t35870\n赵主人\t35871\nhttpsj3987comapp24167\t35872\n莫恶露\t35873\n6868\t35874\n社会制度\t35875\n穆子达\t35876\n晋代\t35877\n肉麻顿\t35878\n污辱\t35879\n齐备\t35880\n裘秃\t35881\n阿森纳\t35882\nhipf\t35883\n精谋\t35884\n醭醭醭\t35885\nbbbnnkiyd\t35886\n機連\t35887\n张子硕\t35888\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t35889\n铁蛋\t35890\n也罢行\t35891\n北寺\t35892\nhips\t35893\n晨雾\t35894\n孟芳竹\t35895\n忙亲\t35896\n大卷\t35897\n家属\t35898\n俺博\t35899\n忙人\t35900\n群组\t35901\n有加\t35902\n099865421\t35903\n家屋\t35904\n有势\t35905\n夏至\t35906\n鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅恩\t35907\n語錄\t35908\n家居\t35909\n有劳\t35910\ngeggge\t35911\n狗腿\t35912\n一颗瓜天天天天\t35913\n1969年6月28日\t35914\n拜读\t35915\n轻素\t35916\n搅乱\t35917\n手风琴声\t35918\n有功\t35919\n有力\t35920\n223232323233332222\t35921\n鬼步\t35922\n茧中\t35923\n2222222222250222222222\t35924\naaabb\t35925\nguguiug\t35926\n永福路79号\t35927\n度蜜桃\t35928\ngkghfg\t35929\n就娶\t35930\n不糙\t35931\n决胜\t35932\n真聪明BC\t35933\n做现\t35934\n昨夜海棠初着雨\t35935\n菲菲菲\t35936\n意境家\t35937\n娜米\t35938\n吹吹无聊\t35939\n有如沐春风\t35940\n啵啵啵萌\t35941\n1000多名\t35942\n单车\t35943\n插板\t35944\n再家都贝贝\t35945\n冯琪琪\t35946\n货币政策\t35947\n要你走\t35948\n奥体\t35949\n福在\t35950\n供不起\t35951\n二瑞\t35952\n135888889605\t35953\n猪你是猪你是煞笔你是逗逼\t35954\n糖丸\t35955\n张全\t35956\n最低估\t35957\n1236547890\t35958\ndrrsterer\t35959\n没事闲等\t35960\nxncie\t35961\n撸样\t35962\n八次\t35963\n广岛市\t35964\n凤凰点点\t35965\n多余头\t35966\n零四零五年\t35967\n别处你\t35968\n优质课\t35969\n真人手\t35970\n第五期\t35971\n张先\t35972\n胡福红\t35973\n快穿文\t35974\n价目表\t35975\n魁梧\t35976\n百万美
元\t35977\n红豆薏米\t35978\n白斩鸡酱油鸭\t35979\n1586236\t35980\n猜字谜地方话独生女\t35981\n小小度\t35982\n耶一用\t35983\n大马乡\t35984\n排涝\t35985\n萨瓦迪卡\t35986\n花烛夜\t35987\n沒點開大圖來\t35988\n文都\t35989\n熊雨珊\t35990\n不能说出来\t35991\n七色\t35992\n芝姐\t35993\n美代\t35994\n给我又来\t35995\n圆溜\t35996\n国际无线电报大会\t35997\n人人人\t35998\n空距\t35999\n创意品\t36000\n工衣\t36001\n投资者\t36002\n充溢\t36003\n灰太雅\t36004\n条文状\t36005\n硫酸\t36006\ngvts\t36007\nfdgdd\t36008\n680980\t36009\nfahhhfe\t36010\n澡\t36011\n牛黄\t36012\n米米卡\t36013\n有人\t36014\n伤及片\t36015\n北上广不相信眼泪\t36016\nHfjhhh\t36017\n自私自利\t36018\n100000000个\t36019\njefs\t36020\n880\t36021\n澳\t36022\ngvtf\t36023\n中欧陆家嘴国际金融研究院\t36024\n飞流口水\t36025\ntrrrrrig\t36026\n沧州市新华小学\t36027\n澃\t36028\n李学校\t36029\n套套片\t36030\n汤秀玲\t36031\n有些\t36032\n度秘真赞\t36033\n澉\t36034\n米去\t36035\n清河\t36036\n四十四百一十二\t36037\n2525yzz\t36038\n索尔塔\t36039\n有了\t36040\n有事\t36041\n澜\t36042\n100000000万\t36043\n萧晴\t36044\n徐汇滨江地区\t36045\n望族\t36046\n美景辰\t36047\n度秘好度秘换度秘天天谈恋爱\t36048\n勇敢面对\t36049\n鄙薄\t36050\n金玉\t36051\n女汉子背小公主啦小公举\t36052\n7000多点\t36053\n武生泰\t36054\n端粒\t36055\n模式署\t36056\nloon\t36057\nlooo\t36058\n刘鲲鹏\t36059\n闹木\t36060\n没错失\t36061\nlook\t36062\n照相\t36063\n刘成庆\t36064\n偷菜\t36065\n寡老婆\t36066\n兰庆行\t36067\nlgiigiif\t36068\n落潭镇\t36069\n歌会\t36070\n5454515515\t36071\n有你的草长莺飞\t36072\n写歌\t36073\ntlaj\t36074\n离开学\t36075\n小依\t36076\n10000\t36077\n10001\t36078\n孔轴\t36079\n杨聚程\t36080\n看透\t36081\n胡润研究院\t36082\n阿度\t36083\n嗯七元\t36084\n九百岁\t36085\n反贼\t36086\n两点\t36087\n拿幺三零零二二八幺五三二\t36088\n两百八十元\t36089\n哈尼克梅\t36090\n已经完成\t36091\n哈哈哈哈我是坏人骗你的我是好人\t36092\n缘分度秘\t36093\n阿康\t36094\n屁屁屁屁屁屁屁屁屁屁\t36095\n小便\t36096\n申秉慧\t36097\nｂａｂｙ\t36098\n吗女\t36099\n迪斯科\t36100\n小黃人\t36101\n后半期\t36102\n寻找海洋故事大王\t36103\n中山北街\t36104\n447747778\t36105\n藏羚羊\t36106\n小侠\t36107\n偈\t36108\n偏\t36109\n5块\t36110\n说不想\t36111\n假\t36112\n登吃\t36113\n萌萌达g\t36114\n做\t36115\n停\t36116\n我不理你了你想再说告诉我你是谁\t36117\n水片\t36118\n4hfghfg\t36119\n蒋老师\t36120\n偖\t36121\nTulipa\t36122\nkkkk\t36123\n唉到极度疯狂爱到心\t36124\n商女\t36125\n偬\t36126\n嗯留言\t36127\n健\t36128\n一百道\t36129\n更精彩\t36130\n3DJ台\t36
131\n北京站\t36132\n爱你的你不爱我\t36133\ntrue\t36134\n偷\t36135\n偶\t36136\nhjt\t36137\n东施效颦\t36138\nhjs\t36139\nhjr\t36140\nBE303\t36141\nhjy\t36142\n打印机\t36143\n上来说\t36144\nhje\t36145\nhjd\t36146\nhjg\t36147\nhjf\t36148\nhjc\t36149\nhjb\t36150\nhjm\t36151\n徐伯阳\t36152\nhjn\t36153\n印代秋\t36154\nhjh\t36155\nhjk\t36156\nhjj\t36157\n大曼旺旺呢动物园\t36158\n见人爱\t36159\n贾文瑞\t36160\n片只语\t36161\n口鼻\t36162\n不敢了我看\t36163\n点脸\t36164\n乐居杯\t36165\n新年好呀新年好呀新年\t36166\n绝非\t36167\n供不应求\t36168\n学学有心\t36169\n凯小玉\t36170\n我没有问题\t36171\n七八块\t36172\n昆明湖\t36173\n海之言\t36174\n聊性\t36175\n刘一含\t36176\n气服\t36177\n一不吃\t36178\n储蓄罐\t36179\nghfgvtffdeeeeeeeeryxffc\t36180\n衡山\t36181\n不差不多\t36182\n年液\t36183\n号啕大哭\t36184\n一四下\t36185\n人造蛋\t36186\n3201022\t36187\n宠幸\t36188\n黄坡六中\t36189\n不要好\t36190\n男女之道\t36191\n美女姑娘\t36192\n养女\t36193\n小老弟\t36194\n一四个\t36195\n严静娴\t36196\n三#\t36197\n四一个\t36198\n蒋诗涌\t36199\n咩咩度秘\t36200\n小厨\t36201\n方放b\t36202\n纽克瓦\t36203\nheimei\t36204\n三打一\t36205\n命中国\t36206\n猪么金融股\t36207\n直发\t36208\n未了\t36209\n四一下\t36210\n中调\t36211\n直变\t36212\n10000000000000000000\t36213\n半斤\t36214\n00xx\t36215\n无感动\t36216\n包大\t36217\n暴走漫画中有一均问候你\t36218\n笑死我了笑死\t36219\n事这我里我代学\t36220\n好玩\t36221\n民权县\t36222\nijj\t36223\n五十兆起\t36224\n30-40分钟\t36225\n放学回家\t36226\n小鬼当家\t36227\n二月一四\t36228\n嘿呦\t36229\n闪电侠\t36230\n5.8级\t36231\n哼烤\t36232\n王克勤\t36233\n陈公博\t36234\n白问\t36235\n米热样\t36236\n你不个蛋\t36237\n郭金红\t36238\n苏兰兰\t36239\n硼友\t36240\n国师\t36241\nPOCO\t36242\ndeyok\t36243\n嘿呀\t36244\n文礼\t36245\n朱大飞\t36246\nIamhurger\t36247\n大阪樱花\t36248\n父子年\t36249\n不是你你你你你\t36250\n脱光\t36251\n鸭脖\t36252\n水晶球\t36253\nDibsbjchskcsjbv\t36254\n借给我\t36255\n趴体么电视配对\t36256\n百褶\t36257\ngjti\t36258\nKDJJSJSJC\t36259\n一十点\t36260\n88888888888888846888\t36261\n粘膜\t36262\n同名\t36263\n垂询\t36264\n驻留\t36265\n36.40\t36266\n套餐饮\t36267\n36.45\t36268\n88点\t36269\n尼康DX\t36270\n无所畏\t36271\n私有云\t36272\n820520210\t36273\nuyggvff\t36274\nheheTFBOYS\t36275\n放送\t36276\n品牌精化\t36277\n疏文\t36278\n要磅\t36279\n千万美元\t36280\nPON\t36281\n港媒\t36282\n跟腱\t36283\n李含蕴\t36284\n克里夫\t36285\n我离你了你\
t36286\n公公公开我不想和你\t36287\n肝功\t36288\n13251815739\t36289\n钱小军\t36290\n小确\t36291\n一百九十六块\t36292\n胡思羽\t36293\n卧铺\t36294\n列侬\t36295\n小硕\t36296\n王2狗\t36297\n永无止尽\t36298\n白白嫩\t36299\n龙择罗拉\t36300\n15093902619\t36301\n恩信\t36302\n焦玉英\t36303\n即面\t36304\n2350\t36305\n凯婷\t36306\n内线\t36307\n出乎\t36308\n别额\t36309\n遏止\t36310\nAdhere\t36311\n吨数\t36312\n琐记衍\t36313\ngwgwgmgmgmgmgmgwwjwjmgmggmmjmmgmmmwgmggm\t36314\n安娟\t36315\nzenmomeiyou\t36316\n炼心\t36317\n招工\t36318\nｉｐｈｏｎｅ1\t36319\n0hahaha\t36320\n十字星\t36321\n不太不萌\t36322\n句子\t36323\n城堡\t36324\n13075453636\t36325\n公认\t36326\n唔额\t36327\n伊莎贝\t36328\n周四安\t36329\n百褶裙\t36330\n索隆觉\t36331\n星学院\t36332\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦哆啦a梦我爱你哆啦a梦我爱你\t36333\n出来\t36334\n百日\t36335\n市宗教局\t36336\n四球\t36337\n海盗\t36338\n小呆瓜小呆关一起个小白\t36339\n海盐\t36340\njjjwj\t36341\n有天强\t36342\n美国教育部\t36343\n同性缘\t36344\n高新区\t36345\n雨田路一号\t36346\n惠城\t36347\n70天\t36348\n无的放\t36349\n五十平\t36350\n乌孙\t36351\n热开水\t36352\n没好意思\t36353\n雪花洞蜜\t36354\n永远的回忆\t36355\n和风\t36356\n花园间\t36357\n鸭鸭\t36358\n不是我的我不要！不爱我的我不爱\t36359\ncutmall\t36360\n五月一七来\t36361\n数十声\t36362\n杂晓得\t36363\n马行之\t36364\n中秋\t36365\n考一晚上\t36366\n陶宇航\t36367\n并板\t36368\n不臭狗\t36369\n此刻\t36370\n凌晨一点钟\t36371\nhentaigames\t36372\n状腺\t36373\n秤座\t36374\n胖胖\t36375\n饥饿\t36376\nAward\t36377\n靖宝贝儿\t36378\n心动心动心动\t36379\n瑞郎\t36380\n玩家们\t36381\n5五年\t36382\n莽拜\t36383\nhhng\t36384\nhfmsmmx\t36385\n我讨厌你不是你讨厌我ok\t36386\n1261\t36387\n不懂不好笑\t36388\n一一间\t36389\n弯道\t36390\n靠不打\t36391\nXTu\t36392\n耶是耶\t36393\n侯佩岑\t36394\n徳牧\t36395\nk歌\t36396\n不开\t36397\n白帝城\t36398\n不弃\t36399\n岳峙渊亭\t36400\nserioussix\t36401\n著名人士\t36402\n菜肴\t36403\n幺九幺\t36404\n张锦昊\t36405\n爻粘\t36406\n不弯\t36407\n要敢\t36408\n15245525\t36409\n老虎们\t36410\n策展\t36411\n我会\t36412\n亲恩燕再无有\t36413\n06545555464\t36414\n红陆龟\t36415\n嘻亲亲\t36416\n虐文\t36417\n19886322543\t36418\n计分\t36419\n李莱恩\t36420\n魔法度\t36421\n滑倒爬\t36422\n艾瑞克\t36423\n奔跑吧奔跑吧兄弟\t36424\n卫冕\t36425\n沙子龙\t36426\n发达\t36427\n5斤\t36428\n缓兵\t36429\n我的孩子\t36430\n上九\t36431\n孤顶嘴\t36432\n麻城\t36433\n门口\t36434\n错开\t36435\n13554563391\t36436\n寄生\t
36437\n犯罪团伙\t36438\npokn\t36439\n96969623698788865422831\t36440\n张高丽\t36441\n龙八蛋\t36442\n徐育晨\t36443\n门号\t36444\n签收\t36445\n靠靠靠\t36446\n腓特烈\t36447\n火警\t36448\n龖悊\t36449\n132545245\t36450\n投影仪\t36451\n甲减\t36452\n56868656825685838575824585855252524555565575728243525256\t36453\nCCTV\t36454\n维基革命\t36455\n合作性\t36456\n万能充电器\t36457\n卧槽人至贱则无敌[吐\t36458\norknew\t36459\n话外音\t36460\n致胜\t36461\n说到底\t36462\n百态\t36463\n汗马功劳\t36464\n您看我\t36465\n取和\t36466\n开心死我的被打惨\t36467\n脸油\t36468\n撒比亚\t36469\n一111111111111111\t36470\nJJ图lz\t36471\n1537\t36472\n1536\t36473\n1535\t36474\n百思\t36475\n该你\t36476\n1530\t36477\nMxduudufy\t36478\n2012893537\t36479\n实果\t36480\n迁怒\t36481\n零七零六\t36482\nwif\t36483\n梅花糕\t36484\n十八岁\t36485\nwic\t36486\n楼面地价\t36487\njinvf\t36488\n接说\t36489\nwin\t36490\nwim\t36491\nwij\t36492\nwih\t36493\n多发字\t36494\n接语\t36495\nwit\t36496\nwis\t36497\nFrench\t36498\nwiq\t36499\n丹参\t36500\n牟瑞彤\t36501\nRita\t36502\nwix\t36503\n11米\t36504\n丫鬟\t36505\n射雕\t36506\n潭牛镇\t36507\n威廉·詹姆士\t36508\n阳山\t36509\n11类\t36510\n装神马嬉皮\t36511\n炙骜\t36512\n电线杆处\t36513\n雨巷\t36514\n黑水鸡\t36515\n才女朋友\t36516\n接诊\t36517\n笑你好\t36518\n广西灵山县第三小学\t36519\nrypuc\t36520\nAdgb\t36521\n尹丽萍\t36522\n疾苦\t36523\n话天\t36524\nhfhj\t36525\n赵本山\t36526\nymf\t36527\n对唱\t36528\n路由\t36529\n万绿湖畔\t36530\n管顿饭\t36531\n合着\t36532\n8953\t36533\n创领\t36534\n玉佩\t36535\n洪雅欣\t36536\n吉昌\t36537\n熟练\t36538\n咋死\t36539\n不愿\t36540\n张圣轩\t36541\n埃辛\t36542\n吉星\t36543\n第几轮\t36544\n刘思期\t36545\n一声断子\t36546\n有一天\t36547\n不问问\t36548\n主导\t36549\n安安夏安\t36550\n3g片\t36551\n演完\t36552\n神莫\t36553\niPhone4s\t36554\n攻入\t36555\n度秘我告诉你一家我告诉你我几时\t36556\n百转千回\t36557\n宴会\t36558\nnnlpnmnnnjmbnnnm\t36559\n魔方\t36560\n知事晓理\t36561\n熔融物\t36562\n一张照片吧\t36563\n我好想你\t36564\n攻克\t36565\n见人说\t36566\n撞燃\t36567\n战区\t36568\n龙迹\t36569\n22度\t36570\n某某事\t36571\n围坐\t36572\n海阔天空\t36573\n见赏\t36574\n包包\t36575\n土有\t36576\nKardashian\t36577\n中度秘\t36578\nffffffg\t36579\n爆炒\t36580\n调停\t36581\n凯利\t36582\n马天浩\t36583\n伯贤\t36584\n光山\t36585\n收殓\t36586\n口出狂言\t36587\n山寨\t36588\n宋松庆\t36589\n土木\t36590\n王承禹\
t36591\n补赏\t36592\n88块\t36593\n给我来首宠爱听听\t36594\n某某人\t36595\n你和你和你和你\t36596\n初出茅庐\t36597\n照发\t36598\n鬼蛇神\t36599\n膜拜\t36600\n惠民\t36601\n雷德克\t36602\n3316741268\t36603\n啦啦啦了具体n\t36604\nBlgjhljmfk\t36605\n怀柔\t36606\n王的演讲\t36607\n达克灵\t36608\nnosecone\t36609\n感恩惜福\t36610\n温自豪\t36611\nhpx\t36612\n上尉\t36613\n苗韵桐\t36614\n丰体\t36615\n家庭文明记载表\t36616\n奥良烤大鹅\t36617\n好复古\t36618\n砂岩\t36619\n上将\t36620\n雨雪\t36621\n雨雨\t36622\n纸媒\t36623\nfitto\t36624\n法拉利汽车公司\t36625\n80000900000000\t36626\n没嘛\t36627\n没谈过恋爱\t36628\n10x20\t36629\nnokia\t36630\n莫咯\t36631\n4g3g2g\t36632\n二秒\t36633\n抚平\t36634\n说电话不通\t36635\n2010年3月\t36636\n一本正经\t36637\n遗嘱\t36638\n3561\t36639\n攔鱷\t36640\n哈弗h2\t36641\nchfxfhchfyfh\t36642\n3569\t36643\n冷峻\t36644\n重庆月红宾馆\t36645\n度恩密\t36646\n2011年3月25日\t36647\n唐先生\t36648\nEFDE3\t36649\n室颤\t36650\n排放\t36651\n降水量\t36652\n游一游\t36653\n电影城\t36654\nhttppinyincn1hSVTFUGrBp\t36655\n喵姐\t36656\nIFF\t36657\n陈勇\t36658\n家女人\t36659\n岔题\t36660\n五六七八九十十一十二十\t36661\n葱翠\t36662\n9596839536565\t36663\n豆瓣群\t36664\nhjksnc\t36665\n环峰医院\t36666\n保胎\t36667\n何能何德\t36668\n公仆\t36669\n我我\t36670\n俩两种\t36671\n原来今日\t36672\n昌北\t36673\n下学期\t36674\n安达充\t36675\nrufifjjfm\t36676\n大太阳花\t36677\n磁铁\t36678\nnizaijiame\t36679\n星空下\t36680\n命中率\t36681\n球宠\t36682\n狗尾草\t36683\n宽慰\t36684\nnoeyodvnb\t36685\n加油站\t36686\n见了你\t36687\n邵丽晓\t36688\n咋样子\t36689\n家达文西亚\t36690\n好走\t36691\nTvshowman\t36692\n摩哥\t36693\n二十四岁\t36694\nnnnnnnnnnnnnnnnnnnnnnnn\t36695\nnishi\t36696\n5.5米\t36697\nsbsssb\t36698\n李小妹\t36699\n南五村\t36700\n繁星生\t36701\n回家啦啦呱呱\t36702\n泸县\t36703\n化妆师\t36704\n好赖\t36705\n好赞\t36706\n很丑你好丑\t36707\n你是我的小呀小苹果交响天边\t36708\n卖肾\t36709\n凹凸曼光\t36710\n十万禾力十义八小TSYFO\t36711\n学文章\t36712\n壁纸\t36713\n喝了你喝\t36714\n收缩压\t36715\n游牧民族\t36716\nhua儿\t36717\n小爸爸\t36718\n八分之五\t36719\n雅思考\t36720\nhttphhiphotosbaiducomxiaodupicitem37d12f2eb9389b5050a78b208235e5dde7116e63jpg\t36721\n度秘度秘度秘我是你的主人\t36722\n卖肉\t36723\n学人\t36724\n耳纸\t36725\n八分之二\t36726\n鬼婴\t36727\n杨家餐店\t36728\nhuygr\t36729\n九空\t36730\n还康康\t36731\n睫毛梢\t36732\n受不可能\t36733\n谢再勤\t36734\n李庆彬\t
36735\n137485545\t36736\n亲爱的你了相信爱我家\t36737\n安徒生\t36738\n金发片\t36739\n李宝奎\t36740\n372x54328x54\t36741\n小瓜瓜\t36742\n就梦\t36743\n监察队\t36744\n风度\t36745\n切母\t36746\n豆豆龙\t36747\n任姐\t36748\n150斤\t36749\nGghjyfv\t36750\n县级\t36751\n推开\t36752\n八宫\t36753\n寸衫\t36754\n八家\t36755\n革\t36756\n看见了想\t36757\n消失有\t36758\n压榨\t36759\n拉破\t36760\n牙我爱\t36761\n朱一煊\t36762\n恰巧\t36763\ncewss\t36764\n张里\t36765\njgkggghhh\t36766\n张金\t36767\n正比例函数\t36768\n啵罗成\t36769\n有几了\t36770\n八宝\t36771\n离不来电源\t36772\n一般点\t36773\n顽固性\t36774\n阿青\t36775\n假警\t36776\n19888\t36777\n寸步难行\t36778\n通畅\t36779\n刘房子\t36780\n阿静\t36781\n2.79%\t36782\n节准\t36783\n生出来\t36784\n见首\t36785\n十字街\t36786\n混话\t36787\n菲可以\t36788\nryxhzgdxg\t36789\n卤汁\t36790\n卜啵啵\t36791\n死者\t36792\n猪类\t36793\n靖\t36794\n死老\t36795\n玻璃\t36796\nujjjj\t36797\n苏家废材四小姐\t36798\n李层层\t36799\n胡远远\t36800\n黄燕慧\t36801\n麻了真好\t36802\n活雷锋\t36803\n滚滚红尘风\t36804\n小强强\t36805\n渺小\t36806\n好久吉大\t36807\n鸡群\t36808\n虚机\t36809\n耍笑话\t36810\n发入\t36811\n馍\t36812\n84元\t36813\n姚哲\t36814\n非\t36815\n看灰机灰\t36816\n楚若\t36817\n萝北县\t36818\n素有\t36819\n0000666666666\t36820\n韩义民\t36821\n馈\t36822\n上班\t36823\n处长\t36824\n哪约炮\t36825\n噜啊噜噜\t36826\n也应这样\t36827\n二零二零七\t36828\n大胖猪\t36829\n临界值\t36830\n叶赫水\t36831\n跌紧\t36832\n发克\t36833\n王景彪\t36834\n发光\t36835\n馋\t36836\n射失\t36837\n好了度秘晚安晚安晚安晚安晚安晚安晚安\t36838\nAP98222\t36839\n123456489\t36840\n最终兵器：弓\t36841\n一到十一月\t36842\n1000000000000000000000000000\t36843\n变色龙\t36844\n午报\t36845\n24二十二十五米\t36846\n两米多\t36847\n前放网后前网\t36848\n有议论\t36849\n琵琶行\t36850\n套现\t36851\n镀镍\t36852\n老蔡\t36853\n糯米的糯米的姑娘姑娘姑娘姑娘姑姑的度秘\t36854\n丫tt\t36855\n微乎其微\t36856\n第31\t36857\n做些什么\t36858\nkUkk\t36859\n重生文\t36860\n开售\t36861\npotess\t36862\n1828738596575\t36863\n基块\t36864\ngusj\t36865\n台商\t36866\ngusb\t36867\n小帅\t36868\n我了我知道你是鬼\t36869\n小布\t36870\n不懂不秘你说的话\t36871\n小希\t36872\n一年1度\t36873\n糖王源\t36874\n君之冰\t36875\n独白道\t36876\n安124\t36877\nakbkeh\t36878\n千玺帅\t36879\nimmov\t36880\n嘤嘤嘤嘤嘤\t36881\nMKMF\t36882\n筑堤\t36883\n蚝油\t36884\n爬人山\t36885\n草我吧\t36886\n飞艇\t36887\n逗饼\t36888\n整合\t36889\n小帮\t36890\n300万辆\t3
6891\n有不有一上\t36892\n头箍\t36893\n大班长\t36894\n医典\t36895\naron\t36896\n高亚代\t36897\n大事件\t36898\n马艳\t36899\n媳妇子\t36900\nEfgfvn\t36901\n介绍女\t36902\n乔庆辰\t36903\n宏观经济\t36904\n清鸢\t36905\n石灰石\t36906\n揉碎\t36907\n这样做菜\t36908\nbabykengaljklnn1tulat1\t36909\n说再见\t36910\n完全文\t36911\n悟空三打白骨精\t36912\n误撞\t36913\n普洱馆\t36914\n嗯嗯消费朋大\t36915\n喜剧类\t36916\n315163445448646349434\t36917\n贾睿阳\t36918\nfistibaby\t36919\n铠甲小游五磨牙\t36920\ndgds\t36921\n受不得\t36922\n1235968n473\t36923\n方尹\t36924\n上位\t36925\n母亲河\t36926\nksnxhxn\t36927\nLol\t36928\n哈喽度密\t36929\n血量\t36930\n嘉禾\t36931\n上作\t36932\n冲绳\t36933\n你怎不男些赵总之和心发哪间最后话暹罗之恋人版\t36934\n恩通\t36935\n嗯啪\t36936\n苑惠安\t36937\nsara\t36938\n失信\t36939\n嗯啦\t36940\n软萌淑\t36941\n小灰狼我喜欢喜羊羊\t36942\n三百晚上\t36943\n春贵人\t36944\nWecarry\t36945\nsbses\t36946\n哄聋哑\t36947\n超级大帅哥\t36948\n江春\t36949\nmetouto\t36950\n查乳腺增生挂哪科\t36951\n错啊错\t36952\n妙我信\t36953\nCcgvovrardtcibcyxtxtztztxzatzuubonlvucyctrd\t36954\n嘻嘻\t36955\n大太平\t36956\n15757628355258665826\t36957\n嘻嘿\t36958\n路口\t36959\n回答息\t36960\n破冰船\t36961\nmdybcdylrex\t36962\nmamyamalal\t36963\nnltk库\t36964\n李晓娜\t36965\ncqymyh\t36966\n好犹豫\t36967\n徐家成\t36968\n义塾\t36969\n尼玛江铠\t36970\n还萌萌哒嘞我叫你萌萌萌萌萌萌萌萌萌\t36971\n十几块\t36972\n介质\t36973\n宪法\t36974\n认养\t36975\n赵奎贤\t36976\n7474744\t36977\nyouluo\t36978\n1452762\t36979\nnnn\t36980\n爱理不理\t36981\nfidjf\t36982\n克听\t36983\n锁连城\t36984\n棒v光棍节\t36985\n鑫\t36986\n我讨厌你我恨你我恨你恨你恨你恨你恨\t36987\n念白\t36988\n今晚7点\t36989\n美丽度\t36990\n饭来张口\t36991\n大棚\t36992\n凌翎\t36993\n王名雪\t36994\n情深\t36995\n贮脓\t36996\n露寒\t36997\n布姐布莱克赛尔\t36998\n鑒\t36999\nwogun\t37000\n逸敏\t37001\n大棍\t37002\n咳咳咳咳咳咳咳咳\t37003\n杨紫桐\t37004\n4987643210\t37005\n上帝人\t37006\n菲利士\t37007\n1936年\t37008\n好活\t37009\n臭贴\t37010\n脑筋急转弯板\t37011\n山西村\t37012\n1-2\t37013\n夺冠周年纪念日\t37014\n多多不可爱\t37015\n学生党\t37016\n永生永世\t37017\n臭货\t37018\n苍生\t37019\n螳螂捕蝉黄雀在后的故事\t37020\nhxbsbxj\t37021\nwelc\t37022\n穷国\t37023\n洗衣店\t37024\n自觉性\t37025\n南南南南南南南南南南\t37026\n张文鼎\t37027\ne3df8dcd100baa1e478e6cf4010b912c8fc2e69jpg\t37028\n首航\t37029\n鬓发\t37030\n过良是\t37031\n苍甲\t37032\n活宰\t37033\n科勒思
密达\t37034\nsufb\t37035\n桃生封真\t37036\n你好小秘书秘度秘\t37037\n作文\t37038\nsufg\t37039\n666566666666\t37040\n货币贬值\t37041\n歌叫\t37042\n肖乐英\t37043\n研判\t37044\n内伤\t37045\n身板儿\t37046\n7000元\t37047\n15963985197\t37048\n研制\t37049\nnow图兔兔YY五兔兔图图\t37050\n李扒皮\t37051\nNNN\t37052\n兴安岭林区\t37053\nTocaboca\t37054\nGortat\t37055\n幼儿们\t37056\n尹相杰\t37057\nfuteifi\t37058\n告诉你\t37059\nhsodr\t37060\n格子秋\t37061\n画眉鸟嘴包\t37062\n11nn\t37063\n出太阳\t37064\n删删\t37065\n千儿\t37066\n南欧\t37067\n哥再来\t37068\nNNS\t37069\n删别\t37070\n夺命旅行\t37071\n妓排\t37072\n发明者\t37073\n抽身\t37074\n能言善辩\t37075\n秦淮区\t37076\n拥挤\t37077\n丸子丸子\t37078\n捂臭不要脸太丑了臭不要脸\t37079\n性瘾者\t37080\n光之鸟\t37081\n还愿\t37082\n齐丹娜\t37083\ngtuvtuv\t37084\n进扶\t37085\nRyfdugg\t37086\n墨水儿味\t37087\n前旗\t37088\n鸽子\t37089\n红卫兵\t37090\n算不准\t37091\n还好啦乖\t37092\n乡坤\t37093\n青安\t37094\n排查\t37095\n屁眼片\t37096\n还在唱歌\t37097\n学佛\t37098\ndhsh\t37099\n蛋糕店\t37100\ndhsj\t37101\n4棵\t37102\ndhsn\t37103\n学位\t37104\n幺零零\t37105\n突突天啦土鳖铁路局\t37106\n软建\t37107\n六代\t37108\n聊天吧开心\t37109\n苏小灰灰is军训基地\t37110\n赞颂\t37111\n李宾川\t37112\n这一个女人\t37113\n她用声音演绎人生\t37114\n图恩\t37115\n小绵绵\t37116\n表示\t37117\n前日\t37118\n股东\t37119\n两万3y4y5万6万7万80086\t37120\n110301希澈推te\t37121\n皮埃斯\t37122\n天天得已\t37123\n拥交\t37124\n六份\t37125\nyumen\t37126\n发布\t37127\n主办方\t37128\nNick\t37129\n波导\t37130\n老刘村\t37131\n休息器\t37132\n夏冰\t37133\n大口狼\t37134\n小新宇\t37135\n记恨\t37136\n高保辉\t37137\n一一万块\t37138\n快點\t37139\n池塘边\t37140\nDHFGGGGGGGCGGFGFGG\t37141\n精神病院\t37142\n49881个\t37143\n特纳奖\t37144\n吴婷婷\t37145\n储衣柜\t37146\n粗壮\t37147\n板泉二中\t37148\ncucgy\t37149\n王秀榮\t37150\n走呗\t37151\n茶晶\t37152\n熊孩\t37153\n刘华清\t37154\n北影\t37155\n红鲤鱼\t37156\n草食化\t37157\n张叶叶\t37158\nnithistow\t37159\n我讨厌你有喜欢你叫我爱乐有很恨的牙爱\t37160\n看一下春\t37161\n魔音\t37162\nvji7\t37163\n孙玉洁\t37164\n简要\t37165\n七点八点\t37166\n李孝娥\t37167\n亲和\t37168\n老道\t37169\n凶器\t37170\nPMG\t37171\n活鬼\t37172\n0.25个\t37173\n理不理\t37174\n新视\t37175\n新规\t37176\n理科男\t37177\nPMI\t37178\n1000多少\t37179\naabaa\t37180\nk喜\t37181\nPM2\t37182\n董事长\t37183\n再问\t37184\naabai\t37185\nyffge\t37186\nspeakEnglish\t37187\nyffgf\
t37188\n大马雨林燕窝\t37189\n伤心秀\t37190\n句话\t37191\n漂动\t37192\n我美\t37193\n赵培延\t37194\n赵培廷\t37195\n亲爱的你呀矫情\t37196\n闺女们\t37197\n绕口联盟\t37198\n徐宜静\t37199\n拍戲\t37200\n篇章\t37201\n留言\t37202\n郑成秋\t37203\nffstf\t37204\nlzqutywututywu\t37205\n小伯伯\t37206\n太斯文\t37207\n老虎\t37208\n我群\t37209\n拘禁\t37210\n古堡\t37211\n极权\t37212\n长春号\t37213\n马子淘\t37214\n德莱克印\t37215\n尘土\t37216\n甜味剂\t37217\n任冖\t37218\n非凡\t37219\n来不爱\t37220\n感觉到手\t37221\n平台型\t37222\n某一天\t37223\n可大\t37224\n可太\t37225\nu1次\t37226\n1524594423\t37227\n简易\t37228\n周锐杰\t37229\n水桶\t37230\n我是不是我恨你我恨你我恨你我恨你\t37231\ncuangsag\t37232\n可够\t37233\n苦海\t37234\n鱼都\t37235\nrou度\t37236\n光腚\t37237\n韧带\t37238\n简明\t37239\n纯蛋\t37240\n某一处\t37241\n特种兵之霹雳火\t37242\n秘号\t37243\n意思我真的错\t37244\n生气片\t37245\n就是为了性\t37246\n有意睡\t37247\nzhangmanm\t37248\n宝盒\t37249\n日K线\t37250\n宝盖\t37251\n王君\t37252\nvghgftfudhshffc\t37253\n悄然无声\t37254\n实名\t37255\n人文关怀\t37256\n冰激凌店\t37257\njeuqi\t37258\n熊女士\t37259\n江苏苏宁\t37260\n不难过\t37261\n748天\t37262\n逛韩网\t37263\n勃起\t37264\n畲族\t37265\n清一色\t37266\n3000000元\t37267\n大前锋\t37268\n吴青雯\t37269\n调动\t37270\n微信公号\t37271\n始作蛹者\t37272\n凉昂\t37273\n藤蔓\t37274\n我是女的你不是我的秘书了我的秘书\t37275\n寺田晓玲\t37276\nfgdohf\t37277\n舍后得\t37278\n杰森\t37279\nhttphhiphotosbaiducomxiaodupicitem960a304e251f95ca664b5dc2ce177f3e6709520cjpg\t37280\niesunou\t37281\n内敛\t37282\n珀斯\t37283\ncfddrxgc\t37284\n你不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t37285\n一年365天\t37286\n去了再来\t37287\n哈爱东\t37288\nruchdg\t37289\n失业化\t37290\n西藏中路25号\t37291\n东亿先\t37292\n大爷状\t37293\n周帅\t37294\n准确性\t37295\n145687\t37296\n就是你的偶像\t37297\n拉达\t37298\n大河口\t37299\n江沐苏\t37300\n徐bro\t37301\n生命年\t37302\n度年你好\t37303\n单身者\t37304\n526843908566\t37305\n私部\t37306\n长城哈佛\t37307\n一万9999\t37308\n深圳市住房和建设局\t37309\n你好丑呀丑\t37310\n诗昨背\t37311\n缓步\t37312\nJcjcfuudffufiiifjjdgfhdhsdfkfkjxdggfxhhddqwerty\t37313\ngbfafhfafhjk\t37314\n度秘你是猪么我说我战战僵尸奥特曼\t37315\n舅父\t37316\n舅爷\t37317\n莫底\t37318\n1888年\t37319\n再一起\t37320\n老子说东你说西\t37321\n现在的感\t37322\n硬盘版\t37323\n小八极\t37324\n生都\t37325\n吴莫愁\t37326\ngmezykdzh\t37327\nmiui\t37328\n杨博钧\t37329\
n哥妥\t37330\n仿李\t37331\n拉船\t37332\n忘家\t37333\n峨山县\t37334\n四八期\t37335\ntm5玻a字\t37336\n娃娃亲\t37337\n给他钱\t37338\noppo3007\t37339\n组歌\t37340\n凌晨五点\t37341\n脱毛膏\t37342\n后进式\t37343\nwheis\t37344\n订票\t37345\n净清\t37346\n地度\t37347\n沙丘\t37348\n利好像样\t37349\n一柱\t37350\n好胖\t37351\n上周四\t37352\n高高兴\t37353\n狂奔\t37354\n一查\t37355\n是非情由\t37356\n张北张柏芝\t37357\n正月里\t37358\n︿\t37359\nShutupIneedthe\t37360\n陆炫宇\t37361\n︴\t37362\n︶\t37363\n别再来\t37364\n刘英大\t37365\n莫金美\t37366\n依依不舍\t37367\n杨静\t37368\n狂女\t37369\n莲前派出所\t37370\n地库\t37371\n一柄\t37372\n地府\t37373\n歌唱一首唱\t37374\n一四六米\t37375\n十二亿九千五百三十三\t37376\nhttphhiphotosbaiducomxiaodupicitem4d086e061d950a7b059a68b00dd162d9f2d3c912jpg\t37377\n度秘我是女人心的那个小主人\t37378\n段蓉芳\t37379\n违权关\t37380\n昨天晚间\t37381\n苏苏肉麻\t37382\n揍你\t37383\n吹面不寒杨柳风\t37384\n太冷\t37385\n2296点\t37386\n秘不为你\t37387\n唉哟\t37388\n再见了你\t37389\nchiecHwhsc6cnd\t37390\n2500声\t37391\n排骨饭\t37392\n这个片\t37393\n启煜\t37394\n限速\t37395\n住邦\t37396\n姣美\t37397\n位高权重\t37398\n我要你的抱抱\t37399\n华东八号\t37400\n呼呼吟\t37401\n身居\t37402\n佛坟\t37403\n门笫\t37404\n朱立伦\t37405\n爱卿朕\t37406\nV\t37407\n红堂堂\t37408\n错色戒\t37409\n王继堂\t37410\n左手边\t37411\n先岁\t37412\n夜半陌生旅店\t37413\n毛儿Mr\t37414\n没有效\t37415\n伯都\t37416\n不懂爱\t37417\n话说话说\t37418\n练功\t37419\n嘴儿\t37420\n九点半小时\t37421\nnopqr\t37422\n荣誉\t37423\n随然\t37424\n艾淇尔\t37425\n烦身子\t37426\n翟你\t37427\n一个片\t37428\n华谊经纪部\t37429\n武汉江夏公安分局\t37430\nqq浏览器\t37431\n规则现象\t37432\n段林希\t37433\n维尼文\t37434\njdnndddjdj\t37435\n不通顺\t37436\nuvhuvuvughhhhhhhhbnb\t37437\n喜羊羊秘\t37438\n晰羲\t37439\n14725836901\t37440\n雄淫\t37441\n扩胸\t37442\n邱汐岩\t37443\n乔什哈切森\t37444\n罗伯特\t37445\n这里头\t37446\n我的感\t37447\n自我救赎\t37448\nKao\t37449\n坏人儿\t37450\n压机\t37451\n张耀升\t37452\n秘桂林\t37453\n树山\t37454\n周小妹\t37455\nKaz\t37456\nDC掣糯\t37457\n揉揉揉揉\t37458\n冯路\t37459\nlc800瓦呗\t37460\n75456852\t37461\n善良感\t37462\n28级\t37463\n爱情砖\t37464\n凉拌的歌\t37465\n七粒\t37466\n时弊\t37467\n球样\t37468\n阳春面\t37469\n露即啦啦\t37470\n基金口\t37471\nWonnegau\t37472\n加油课\t37473\n度密度密度密度秘你不要脸你不要脸呸\t37474\n大再讲一个\t37475\n寂寞空庭春欲晚\t37476\n父亲们\t37477\n郜林\t37478\n一百五十块\t37479\nwvjatm
jgtd\t37480\n冒牌天神\t37481\n胡鑫\t37482\n三十二三十三三十四三十五三十六三十七三十八三十九四十四十一四十二十三十四十五十十七十十八十十九一百\t37483\n哥刚哥\t37484\n波冬\t37485\n齐奥塞斯库大怒\t37486\n章肇祺\t37487\n王韩芮\t37488\n117夕\t37489\n张丽英\t37490\n四零零\t37491\n有身\t37492\n汗牛\t37493\n杨乐意\t37494\nDjdnfbf\t37495\n1314648\t37496\n忘情\t37497\n不可限量\t37498\n瓜拉尼\t37499\n对呀整天\t37500\nCulturelle\t37501\ntthu\t37502\n聪明暇\t37503\n美特伺邦威\t37504\n零五二六\t37505\n在记忆里\t37506\ngrand\t37507\n中关村软件园\t37508\n神情节\t37509\ntherapy\t37510\n羅志祥\t37511\n你丫\t37512\n决心\t37513\nwhereisit\t37514\n要不起\t37515\n停刊\t37516\n同流合污\t37517\n汽化\t37518\nhhagal\t37519\n王鑫鑫\t37520\n木质\t37521\n3两米\t37522\n16831873482718428728428732\t37523\n没事干嘛\t37524\n飞过\t37525\n001毫米\t37526\n水袋\t37527\nFaith\t37528\n私密\t37529\n炸药堡\t37530\n300240\t37531\n好了我真\t37532\nberyou\t37533\n双35s堡\t37534\n池口\t37535\nё\t37536\n人格\t37537\n冗人恶心\t37538\n陈心琪\t37539\n囚禁\t37540\n金博大\t37541\n人样\t37542\nс\t37543\nр\t37544\nт\t37545\nх\t37546\nф\t37547\nч\t37548\n戒烟\t37549\n化妆裤\t37550\nш\t37551\nы\t37552\nъ\t37553\nэ\t37554\nь\t37555\nя\t37556\nю\t37557\n判费良玉危害公共安全罪\t37558\n瑞典萨博汽车公司\t37559\n唯贤\t37560\n氧气管\t37561\n鬼脸鬼脸鬼脸鬼脸\t37562\n邮乐网\t37563\n4.4万元\t37564\n韩嘉琦\t37565\n活命\t37566\n张灵甫\t37567\n了翠\t37568\n55684468\t37569\n创作为重\t37570\n装懂\t37571\nfdzxfd\t37572\n房客\t37573\n想到\t37574\n言喻\t37575\n马路\t37576\n武汉市\t37577\n聊天行\t37578\n出包\t37579\n滄读\t37580\n面的眼\t37581\n1479632580456\t37582\n维玛\t37583\n3278284846\t37584\n热茶\t37585\n贡多拉\t37586\n爱与被爱不如相爱\t37587\n悬赏\t37588\n你的身体\t37589\n后厨\t37590\n腊八腊八腊八腊八\t37591\n扑救\t37592\n偏笑\t37593\n二分之一次\t37594\n喝一口水\t37595\n999995254485949181\t37596\n9244\t37597\n悔悟\t37598\n打道\t37599\n23456782473\t37600\n包拯\t37601\n孙鹿\t37602\n曾世林\t37603\n不要你了你\t37604\n偶记\t37605\n葡蔔\t37606\n数百\t37607\n孙鹏\t37608\n张狂\t37609\n水调歌头\t37610\n油市\t37611\n光猫\t37612\n人寿无主使人俗\t37613\n海门老街\t37614\n阿玛\t37615\n利滚利\t37616\n桂美\t37617\n跑堂\t37618\n支流\t37619\n王用英\t37620\n丑臭打怪\t37621\n第三个人\t37622\n王建宙\t37623\n一息\t37624\n好嘞好嘞\t37625\nffhh\t37626\n勾撘\t37627\n机修\t37628\n你好看\t37629\n杰尼亚\t37630\n2700元\t37631\nsoose\t37632\n找准\t37633\n二
十世纪九十年代初\t37634\n找出\t37635\n噜噜噜噜噜噜\t37636\n胳臂\t37637\n你好眼\t37638\n费斯\t37639\n录给我看\t37640\n情结\t37641\nCine21\t37642\n认不认为\t37643\n园区\t37644\n黄河大学\t37645\n纲领\t37646\n一百六十多\t37647\n事机\t37648\n好美丽\t37649\nzbx\t37650\n大敌\t37651\n两种人\t37652\n购物车\t37653\n图图图图\t37654\n年老\t37655\n歪落\t37656\n消掉\t37657\n心诚意\t37658\n你是我的优乐美我是你的香飘飘\t37659\n大数\t37660\n重乔\t37661\n千盒\t37662\n接接\t37663\n人生\t37664\n来来来来来来来来来\t37665\n艾嘉\t37666\n过去40年\t37667\n娜么美\t37668\n曹子园\t37669\n痢疾\t37670\n李海俊\t37671\n再来吧\t37672\n加藤君\t37673\n嗲嗲\t37674\n这本地\t37675\nKnight\t37676\n花败\t37677\nUuu\t37678\n支教\t37679\n易图\t37680\n朝耀\t37681\n诸般\t37682\n花费\t37683\nCCTV2王凯读报\t37684\n李梦楠\t37685\n自在\t37686\n9点10分\t37687\n姨丈\t37688\n放寒假了\t37689\n利己知道\t37690\n重度塞\t37691\n那个他\t37692\n好我一样\t37693\n10组\t37694\n咪喵\t37695\n粪量\t37696\n十点半\t37697\n疯姿\t37698\n2117年\t37699\n566556553\t37700\n嗯啦老K\t37701\n荷儿\t37702\n1143\t37703\n徐王敏\t37704\n一九九二零七零四零五三九\t37705\n东北站\t37706\n1144\t37707\n刘把示波\t37708\n高洪波\t37709\n中环石板街\t37710\n娟妮\t37711\n一昔\t37712\n蜀国\t37713\n身邊\t37714\n小度\t37715\nHFFM\t37716\n神衰蛋\t37717\n星期天上午\t37718\nHiddenwhisks\t37719\n两分\t37720\n囧囧囧\t37721\n叮铃铃叮铃铃\t37722\n僵尸基金会\t37723\n5月1日晚\t37724\n后现代\t37725\n激怒\t37726\n大话说\t37727\n米塞斯\t37728\n晚婚\t37729\n气死自己\t37730\n王宝森\t37731\nkqidysb\t37732\n悲情\t37733\n丑丑丑丑丑\t37734\n08000\t37735\n五庄观\t37736\n冠軍\t37737\n大一杯茶\t37738\n派对\t37739\n相加上\t37740\n姑且\t37741\n嵩鼠\t37742\n气像站\t37743\n股东北味\t37744\n悲惨\t37745\n你的朋友\t37746\n\b\b\t37747\n孤立\t37748\n花企鹅\t37749\n一。979\t37750\n小彩旗\t37751\n上海新天地南里\t37752\n42352535635353535\t37753\n八家8\t37754\n营造\t37755\n俗话\t37756\n38b\t37757\n注压\t37758\n宙斯年\t37759\n大言不惭\t37760\n老山颂\t37761\n释道心敢情\t37762\n奥尔夫\t37763\n五六九二二九零八二四二六\t37764\nFHGH\t37765\n退休人员\t37766\n耶路撒冷\t37767\n层面\t37768\n阿莉\t37769\n02\t37770\n03\t37771\n00\t37772\n01\t37773\n阿莲\t37774\n07\t37775\n04\t37776\n05\t37777\n08\t37778\n09\t37779\n乌江\t37780\n俗语\t37781\n珍珠网\t37782\n多碟\t37783\n文通篇\t37784\n06月28日21时\t37785\n1470897191\t37786\n剥落\t37787\n外调\t37788\n0Q\t37789\nmafuck\t37790\n93.34%\t37791\n舒利\t37792\n聂
大姐\t37793\n幽默\t37794\n0^\t37795\n噜雪杖\t37796\n完婚\t37797\n380\t37798\n381\t37799\n382\t37800\n几十亿万千\t37801\n384\t37802\n386\t37803\n密谋\t37804\n388\t37805\n0K\t37806\n大一个月\t37807\n量子态\t37808\n0O\t37809\n0L\t37810\n猛男\t37811\n0q\t37812\n0v\t37813\nPIN\t37814\n气脑溢血\t37815\n0u\t37816\n八百多兆\t37817\n2015年11月13日\t37818\nkjjjjjl\t37819\n0y\t37820\n0999999990000000785546357457586857278428675724257686852609008674243\t37821\n55635\t37822\n55637\t37823\n外谷\t37824\nvv猜猜猜猜猜猜猜猜猜猜猜猜猜猜\t37825\nPIA\t37826\n哈尔度秘\t37827\n0k\t37828\n0h\t37829\n0n\t37830\n0l\t37831\n0m\t37832\n面战\t37833\nChancelands\t37834\n哈哈谢\t37835\n切切你个大头鬼\t37836\n84人甲\t37837\n周格格\t37838\n落大水\t37839\n不完了得\t37840\n在兹\t37841\n身躯\t37842\n看不见了\t37843\n夜鬼\t37844\n小度公\t37845\n110205\t37846\n乱搞\t37847\nPIG\t37848\n缪缪斯\t37849\n放倒\t37850\n用膳\t37851\n新书\t37852\n4718457145454\t37853\n糊涂糊涂大坏蛋糊涂\t37854\n可乐陌\t37855\n2866点\t37856\n杨涵\t37857\n无利不起\t37858\n白天生拔尖\t37859\n过年了\t37860\nhgfrv\t37861\n詹妮弗·洛佩兹\t37862\nwebsite\t37863\n僵住\t37864\n一团糟乱\t37865\n易沛琪\t37866\n无聊滴\t37867\nhttppinyincne6221\t37868\n穆卿言\t37869\n李世汉\t37870\n4811\t37871\nhttppinyincne6229\t37872\n秸孖\t37873\n安徽省潜山县交警大队\t37874\n多几\t37875\n酷跑\t37876\n两万年\t37877\n输不起\t37878\n文庙\t37879\nfakiyu\t37880\n地狱厨房\t37881\n秀藏宝\t37882\n壮身\t37883\niphone6n\t37884\n扶手\t37885\n1234567891213141516171891\t37886\n永远的谜\t37887\n传收\t37888\n一张你的片\t37889\n村娃\t37890\n襟衫\t37891\n还冰\t37892\n潜入\t37893\nΕΖΗΘΙΚΛΜΝΞ\t37894\n今晚19：30\t37895\n倒像\t37896\n波幅\t37897\n38天\t37898\n度秘雅\t37899\n2048\t37900\n阿西吧思密达\t37901\n周鹏萨\t37902\n多指教\t37903\n不么\t37904\n一个名\t37905\nxssddffety\t37906\n玉皇阁\t37907\n古国\t37908\n2222222222222222个\t37909\n刘雪茗\t37910\n欢你\t37911\n爬升\t37912\n孙家翠\t37913\n线路人\t37914\n火眼睛睛\t37915\n恩恩pe\t37916\n斧子\t37917\n姓度\t37918\n度兜儿\t37919\n这周日\t37920\n你好色你是色狼\t37921\n不买\t37922\n文昌路\t37923\n曾雪\t37924\n18538676751\t37925\n处女女\t37926\n万花筒\t37927\n廠裡\t37928\n叶贤睿\t37929\n杨子浩\t37930\n阿德救苦救难噢噢题\t37931\n二二二零五二幺一久七久零久二零零零三幺\t37932\n广交会\t37933\n杜可心\t37934\n出雪\t37935\n受精卵\t37936\n我以为\t37937\n齐煜博\t37938\n棉衫\t37939
\n蔚然\t37940\n走了安\t37941\n我文\t37942\n赵多少\t37943\n一根24米\t37944\n棉衣\t37945\n日历\t37946\nksmxsocn8dvir\t37947\n就知\t37948\n伊妹儿\t37949\n洞头\t37950\n风顺水雪山奇峰\t37951\n皂荚树\t37952\n2996723379\t37953\n3200元\t37954\n白痴痴\t37955\n有一个人放弃\t37956\n锻压\t37957\n恶搞\t37958\n18770522680\t37959\n6公里\t37960\n花哥弟\t37961\n植物大战僵尸音乐世界\t37962\n立山\t37963\n末日孤村\t37964\n一不爱我你真的不爱我我\t37965\n06档\t37966\n辉腾\t37967\n非常静距离\t37968\n自觉努力\t37969\n丫头\t37970\n重性\t37971\n15296048815\t37972\n王志威\t37973\n十年以前\t37974\nF罩奶\t37975\n齐视\t37976\n没了我讨厌\t37977\nT细胞淋巴瘤\t37978\nhjff\t37979\nhjfg\t37980\nahrainy\t37981\n773767563\t37982\nhjfh\t37983\n王志娜\t37984\n朱高慧\t37985\n曾良友\t37986\n拉粑粑\t37987\n飞飞999\t37988\n扶沟县\t37989\n原为\t37990\n陶必强\t37991\n返修\t37992\n分赃\t37993\n07973728586\t37994\n40秒\t37995\n华夏学校\t37996\n苦逼伤不起啊伤不起\t37997\n告诉我的名字\t37998\n念书\t37999\n30000年\t38000\nduhohohjvuddcbjkkp\t38001\n1500米\t38002\n首飞\t38003\n邓弘阳\t38004\n丽莹\t38005\n领海\t38006\n豆面豆面\t38007\n最美的回忆\t38008\n恬不知耻\t38009\n毋庸\t38010\n来电显\t38011\n丽莎\t38012\n大家一起唱\t38013\n保有量\t38014\n467685734584965797676576\t38015\n不识趣\t38016\n天离子\t38017\n哼臭\t38018\n六点五元\t38019\n四十本\t38020\n潜血\t38021\n安徒生童话\t38022\n谢吉涛\t38023\n哀家正\t38024\n唐轩英\t38025\n大海\t38026\n悲欢\t38027\nk2括\t38028\n打蜡\t38029\n啥个\t38030\n88a8888a\t38031\n我要\t38032\n西客站\t38033\nDaily\t38034\n吉飞机\t38035\n101739\t38036\n你是谁你是谁你是谁你是谁你是谁你是谁你是谁你是\t38037\n自然保护区\t38038\n杨颖长\t38039\n刘成城\t38040\n賈天睿\t38041\n打猎\t38042\n巧红\t38043\n眼熟\t38044\nItsmarch12thIt\t38045\n同皮蛋瘦肉粥\t38046\n诱受\t38047\n深谋远虑\t38048\n因为我的脸\t38049\n杜甫草堂\t38050\nthewo\t38051\n彭换中\t38052\n读报\t38053\n什冕\t38054\n自驾\t38055\n草丛\t38056\n芳村\t38057\n贱种\t38058\n游子梦\t38059\n盆骨\t38060\n迟湘悦\t38061\n恒大酒店\t38062\n4987931000022967\t38063\n谭博文\t38064\n1边\t38065\n草上\t38066\n王语嫣\t38067\n华南虎\t38068\n梦幻\t38069\n弱弱图\t38070\n美利坚风\t38071\n12350万元\t38072\nTEL\t38073\n蓝天幼儿园\t38074\n墨雪\t38075\n床戏片\t38076\n走走上\t38077\n拼装\t38078\nTEX\t38079\n体形\t38080\nvcvh\t38081\n1辑\t38082\n体彩\t38083\nvcve\t38084\n夜北\t38085\n杨心灵\t38086\n枇杷\t38087\n六毛\t38088\n五九恩\t38089\n少头\t38090\nhi哼\t38091\
n算了我喜欢\t38092\n性格外向\t38093\na1只\t38094\n阿ｓａｍｙｆｉ\t38095\n欧贸中心\t38096\n218套\t38097\nohimpr\t38098\n是源\t38099\n舍得舍得小舍\t38100\n痴线\t38101\n徐欢\t38102\n4250甲数\t38103\n弘基\t38104\n141个\t38105\n100010001千10001千几\t38106\n上思山歌\t38107\n二九零\t38108\n补全\t38109\n惨剧\t38110\n睡秘\t38111\n黄疸\t38112\nIcome\t38113\n咯轮\t38114\n厚道\t38115\nhbnnn\t38116\n六七座\t38117\n不二裕太\t38118\n英语机\t38119\n药治\t38120\n蛋黄派\t38121\nqwerty\t38122\n无聊的话\t38123\n可能会考\t38124\n大风起兮云飞扬，\t38125\n友德友德\t38126\n你的名字度秘你的名你的你的家在哪儿\t38127\n1322411324134444444444\t38128\n十神\t38129\nlgyjtji\t38130\npmdjg\t38131\n娃娃堡\t38132\n刚好\t38133\n┭┮\t38134\n额晚安\t38135\nwmmj\t38136\n发货片\t38137\n大济\t38138\n锁的传人\t38139\n奇奇怪怪\t38140\n一滴答\t38141\n我和你\t38142\nnjjjfff\t38143\n善识\t38144\nnbggdjkuy\t38145\n日籍\t38146\ngarland\t38147\n奔跑吧2016\t38148\n小猪猪啊小猪猪就是你呀就是你\t38149\n打中\t38150\n七百七十四兆\t38151\nDKDCTFJ\t38152\nw8jg\t38153\nfifio\t38154\n西游记记\t38155\n12365588\t38156\n妾本惊华\t38157\n好笑乐\t38158\n78c\t38159\n耳罩贝贝网\t38160\n小慧慧\t38161\n9991岁\t38162\n走了烦\t38163\npotato\t38164\n三十张\t38165\n讳疾忌医\t38166\n数余\t38167\nabcdefghrrjklmnopqrstuvwxyxrotforegalabobost无为\t38168\n输输\t38169\n时尚\t38170\n往下去\t38171\n是非过\t38172\n打下\t38173\n呀呀呀也呀呀\t38174\n11、12月\t38175\n助人之举\t38176\n我的爸爸\t38177\n数位\t38178\n酱紫酱紫酱紫酱紫酱紫\t38179\n我是美爱不是美真\t38180\n相理\t38181\n很好玩儿\t38182\n划伤\t38183\n菊花\t38184\nttttyytttttrttttrrttytrrtrryyrtttttyytttttt\t38185\n猫怪兽\t38186\n理不清楚\t38187\n要不做\t38188\n价差\t38189\neychssk\t38190\n1200多次\t38191\n草包\t38192\n诸恶莫作\t38193\n13930518673\t38194\n石原慎太郎\t38195\n积累\t38196\n调色\t38197\natcti\t38198\n天成一次一次\t38199\n十分之一第二天\t38200\n午间\t38201\nCrazy\t38202\n15727681912\t38203\n女相\t38204\n蔡妍\t38205\n午门\t38206\n217岁\t38207\n冬夜给我个双双色球\t38208\n下忧\t38209\n告人教\t38210\nSkfk\t38211\n清水村村\t38212\n好甜蜜\t38213\n知识产权\t38214\n智能习惯\t38215\n尧庙宫\t38216\n小米妞\t38217\n25亿元\t38218\nnhggdn\t38219\n下心\t38220\n演说\t38221\n咯一OK\t38222\n玉皇殿\t38223\n是啊善良人都仙三\t38224\n阴风\t38225\n肚子\t38226\n写好不好\t38227\nJUSHI汁芯猪排\t38228\n5285552842628588582548482\t38229\nlove节\t38230\n不干什么\t38231\n告诉我莫\t38232\n新牛津\
t38233\n编辑程序\t38234\n大水树\t38235\n烀必\t38236\n信不着\t38237\n太可可可\t38238\n好好趣\t38239\nOp慬\t38240\n9882888887718\t38241\n余裕\t38242\n李方瑞\t38243\n嵩山\t38244\n落实好\t38245\n曹百强\t38246\n我的快乐家族\t38247\n对呀就不告诉你\t38248\n剧院\t38249\n马小花\t38250\nhasyu\t38251\napkEng\t38252\n鸡尾酒会\t38253\n12593\t38254\njhngnu\t38255\n萨哈\t38256\n兰蔻\t38257\n枫林\t38258\n一万元\t38259\n123065\t38260\n征兆\t38261\n甜蜜再恋\t38262\nre\t38263\n香不香\t38264\n喜力\t38265\n早上六点整\t38266\n就该\t38267\n更新启动\t38268\n一万八\t38269\n失势\t38270\n征兵\t38271\n完美\t38272\n今中午\t38273\n李贺祥\t38274\n王亚馨\t38275\n就课\t38276\n家电商\t38277\n王东西\t38278\n就读\t38279\n马丽君\t38280\n敢当\t38281\n代理商\t38282\n30名\t38283\nvtgohv\t38284\n张小红\t38285\n布贾诺伏契村\t38286\n背拥\t38287\n黄曲霉素\t38288\n后尘\t38289\n郭苗兴\t38290\n思堆\t38291\n60分钟\t38292\n我告诉你了我是男还是女我是女\t38293\n2010年1月27日\t38294\ndudk\t38295\n程宇阳\t38296\n约格\t38297\ndudo\t38298\nrhsujac\t38299\n乱七八糟乱七八糟\t38300\n嘛尼\t38301\nayf\t38302\n一片片片\t38303\n佳木斯傻子屯\t38304\ndudy\t38305\n北京奥体中心\t38306\n闫伟宁\t38307\nayu\t38308\n哈票\t38309\n其他人\t38310\n17岁\t38311\nhttphhiphotosbaiducomxiaodupicitem0bd162d9f2d3572c1e17b8af8d13632763d0c3cfjpg\t38312\n习武\t38313\n快帮\t38314\n肥乡\t38315\nIEie7\t38316\nedwkv\t38317\n美美哒棒棒哒\t38318\n皓月婵娟\t38319\n体育场馆\t38320\n长凳\t38321\n八小七\t38322\n十四次\t38323\n疑异\t38324\n天来\t38325\n波浪形\t38326\n足坛\t38327\n数码\t38328\n平开\t38329\n深成指\t38330\n笑功菊\t38331\n好暖心\t38332\n猪你是芦台怪你是猪你是猪猪猪\t38333\n停息\t38334\n基调\t38335\n干路\t38336\n流涕\t38337\n缠身\t38338\n回唱\t38339\n叫窜\t38340\n明水\t38341\n查蕾\t38342\n请柬\t38343\n噗嗤\t38344\n王牌园\t38345\n织给\t38346\n永生\t38347\n肥胖\t38348\n中国国际救援队\t38349\n嗯劳动\t38350\n许多年\t38351\n山药炖肉\t38352\n我求你了你给我给我自动唱一个最炫民族风呗\t38353\n明气\t38354\n777777777\t38355\njsnsnsnsn\t38356\n程梓泓\t38357\n好欢乐\t38358\n无翔\t38359\n缩小\t38360\n硫磺\t38361\n物理系\t38362\n七仙女下凡呗\t38363\nmas\t38364\nmaw\t38365\n马明哲\t38366\nmay\t38367\n十九八十七十六岁\t38368\n不完我可以\t38369\n疑问句\t38370\n五十毫升\t38371\n养生汤\t38372\nmac\t38373\n淫梦\t38374\nmad\t38375\n自如\t38376\n铅华\t38377\nmah\t38378\nmak\t38379\nmam\t38380\nmal\t38381\nmao\t38382\nman\t38383\n陈梦舟\t38384\nnanboujinq\t
38385\n亮出门\t38386\n朱勇\t38387\n瑰珀翠法国活泉甲部\t38388\n大草\t38389\n点分\t38390\n大荆\t38391\n思洋\t38392\n破镜\t38393\n几一个\t38394\n李倩洁\t38395\n大荔\t38396\n建成\t38397\n动手脚\t38398\n业务费\t38399\n武器\t38400\n有你有你有\t38401\n度秘你好呀连认识我\t38402\nywhy\t38403\ngihdy\t38404\nm度\t38405\n取来\t38406\n陈旬波\t38407\n湃湃\t38408\n变味\t38409\n周成洋\t38410\n500个\t38411\n宝贝儿我爱你宝贝儿我想和你说宝贝儿我离不开你了我想你\t38412\n旧情\t38413\n韦苏柳\t38414\n机票\t38415\n我咋样\t38416\n火油\t38417\n为什么不走\t38418\n寻问\t38419\n施文杰\t38420\n承认错\t38421\n通告\t38422\n中号\t38423\n春媚\t38424\n何干\t38425\n呵蜘蛛侠\t38426\n武宇东\t38427\nghiphotosbaiducomxiaodupicitem838ba61ea8d3fd1f27cad149374e251f94ca5f85jpg\t38428\n仨人\t38429\n何年\t38430\n欧债危机\t38431\n13752276330\t38432\n戈德哈根\t38433\n埃及金字塔\t38434\n656535335\t38435\n刘利刚\t38436\n昌吉赣\t38437\n家井口\t38438\n邪恶\t38439\ny秘蜀\t38440\n不能原谅\t38441\n咚锵\t38442\n王心仁\t38443\n宛如\t38444\n冲冲地\t38445\n讨价\t38446\n贺州东区八嘎逊\t38447\n胡吊车\t38448\nyorarecleverboy\t38449\nJKDVC\t38450\n李宗伟\t38451\n洞石\t38452\n一百八百\t38453\nGKVGU\t38454\n主人家\t38455\n牢牢\t38456\n桅杆\t38457\n火箭军\t38458\n济公\t38459\n碎灵\t38460\n药元\t38461\nu方法大错特错公共吵架句\t38462\n荣耀4G\t38463\n中间点\t38464\n25床\t38465\n嘉鱼\t38466\n還大力宣傳\t38467\n妈妈的吻歌\t38468\n南普陀\t38469\n5655467475444475457578\t38470\n啊丁\t38471\n二年多\t38472\n天铭\t38473\n曼彻斯特德比\t38474\n天银\t38475\n吴大吉\t38476\n简图\t38477\n别当我\t38478\n拍导盲犬参展慈\t38479\n一朵99朵\t38480\n25度\t38481\n翻家屯\t38482\n植物大战僵尸ok\t38483\norz\t38484\n开怀大笑\t38485\n林帅\t38486\nxdhhdbddb\t38487\n乖拜\t38488\n6月１9日\t38489\noro\t38490\n讨厌鬼我讨厌你\t38491\ninever\t38492\n私家车\t38493\nConcent\t38494\n1440分钟\t38495\norg\t38496\nore\t38497\n刀剑\t38498\n哎呦呵真\t38499\n林带\t38500\n大哥哥\t38501\n真自恋不要脸自恋狂\t38502\n田小草\t38503\n炎症\t38504\n蜂蜜你的家在哪儿度秘\t38505\nthing\t38506\n内格罗斯美宝秋秀\t38507\n板车\t38508\n不然理想会\t38509\n兰博基尼盖拉多\t38510\n兄祥瑞\t38511\nthink\t38512\n丹丹东\t38513\n饼干\t38514\ncheese\t38515\n移动公司\t38516\n若斯塔基\t38517\n嘿骚年\t38518\n撤摊\t38519\n15千克\t38520\nxianba\t38521\nenhaoa\t38522\n废话式\t38523\n朱云根\t38524\n361°伦敦行动全民记者团\t38525\n忙里偷闲\t38526\n年尾\t38527\nRRRRRRRR\t38528\n套套者\t38529\n最喜\t38530\n张经纬\t38531\n贾成亮\t385
32\n稍作\t38533\n阪神大地震\t38534\n港汇广场\t38535\n浪漫\t38536\n没我可爱\t38537\n可视化\t38538\n内购\t38539\nyuán\t38540\n目空一切\t38541\n废纸\t38542\n好车车\t38543\n翻一番\t38544\n后言\t38545\nadzxksvxvffisu\t38546\n性服务\t38547\n过早点儿\t38548\n普天下\t38549\n黑土\t38550\n90年代\t38551\n32332312321\t38552\n作响\t38553\nHaHa\t38554\n4800个\t38555\nhffvjgcdxjjgs\t38556\n黑圈\t38557\n重庆北\t38558\n0nnn\t38559\n13765790\t38560\n尖峰山\t38561\n召唤术\t38562\n算来\t38563\n慢特\t38564\n佩卡\t38565\n一九少\t38566\n张天华\t38567\n三千亿\t38568\n你好坏蛋\t38569\n哒哒哒哒哒哒哒哒哒哒哒萌萌哒萌萌哒\t38570\n艾江\t38571\n黄石市\t38572\n黑圭\t38573\n请微笑\t38574\n哼快\t38575\n恶心吧你\t38576\n台胞\t38577\n三千亩\t38578\n多年\t38579\n一串串\t38580\n1033409517\t38581\n圆通快递单号\t38582\n塞车\t38583\n哈腰\t38584\n谩骂\t38585\n邱老师\t38586\n服不服\t38587\n教堂\t38588\n秒钟\t38589\n嗯甭\t38590\n上天不碍事\t38591\n737330\t38592\n737337\t38593\n零一哈哈哈哈哈哈\t38594\n秒针\t38595\nnoInotkone\t38596\n家法\t38597\n超闪\t38598\n莉颖\t38599\n呃让\t38600\n嗯皇后\t38601\n很听话听话\t38602\nfinzy\t38603\n遗失儿童问责制\t38604\n两万斤\t38605\n兽皇\t38606\n不要说\t38607\nkscn\t38608\n南滨路\t38609\n忙废\t38610\n二百九十七\t38611\n吸男\t38612\n6月11日至15日\t38613\n夜里十二点\t38614\n姓化\t38615\n观落\t38616\n是不让\t38617\n男生\t38618\n呦呵谷\t38619\n四号\t38620\n柏堡\t38621\n不能量\t38622\nwfdghcbnjc\t38623\n日常\t38624\n1141594965\t38625\n吹有\t38626\n良善\t38627\nxyyu\t38628\nyoujizz\t38629\n理解力\t38630\n无恒丰\t38631\n送出\t38632\n我的眼睛\t38633\n扣罚\t38634\n作页\t38635\n淘宝兼职呢\t38636\n男男\t38637\n古都\t38638\n初二\t38639\n麻涌\t38640\n中锋\t38641\n平添\t38642\n小美文媚娘\t38643\ngxbvkgxudkhggxufzoyxyrdlhvydxpuvhdcuvjhvueJFLHUDXUFXRFIYCYESIGXIGXITDURuriyxgkzofigfitxitdoyd\t38644\n二册\t38645\n半月镇\t38646\n必竟\t38647\nmmng\t38648\n毫无意义\t38649\n想我没有\t38650\n初五\t38651\nwrigv\t38652\n廿一克\t38653\n遗迹\t38654\n平淡\t38655\n╯▂╰\t38656\nfeng\t38657\n认识\t38658\n认证\t38659\n广告位\t38660\n谭州软件学院\t38661\nVI吾\t38662\n认试\t38663\n礼拜一\t38664\n年ji\t38665\nhiuhju888899999\t38666\n紫冰莲\t38667\n洋奴\t38668\n朱生豪\t38669\n女医\t38670\n61立克\t38671\n沁春\t38672\n片刻\t38673\n两手车\t38674\n天蚕\t38675\n狱中\t38676\n556577092414\t38677\n乳牛本\t38678\n教唆犯\t38679\n哈佛句\t38680\n李家祥\t38681\n最爱听\t38682\n王
我叫王\t38683\n怪话\t38684\n露脸\t38685\n奥美\t38686\n97zy85b\t38687\n確實際\t38688\n唐羽\t38689\n嗯张碧成\t38690\n补累\t38691\n踢腿\t38692\n么吃\t38693\nfgbbvcfhll\t38694\n聊天吧\t38695\n么名\t38696\n女包\t38697\n悠扬舞裾翩\t38698\n充电线\t38699\n领情\t38700\n翌日\t38701\n奔跑男\t38702\n倍儿一个\t38703\n比肩\t38704\n今天下午15点\t38705\n刘母亲\t38706\n二二四五三八七\t38707\n黑手机\t38708\n不可鉴\t38709\n彩笔\t38710\n哥窑\t38711\n子不教父\t38712\nbjjj\t38713\n最险\t38714\n试用\t38715\n大时代\t38716\n道德底限\t38717\n发给\t38718\n乙比\t38719\n奴役丸\t38720\n1元\t38721\n78g\t38722\ncoustic\t38723\nisishi\t38724\n78l\t38725\n78n\t38726\n会战\t38727\n秦暮雪\t38728\ndjch\t38729\nSMAP\t38730\n花样滑冰\t38731\n有片\t38732\n769999\t38733\n俗话说笑一笑\t38734\n不不不我说\t38735\n温嘉鹏\t38736\n驴头\t38737\n有几分\t38738\n正着\t38739\n陈冠宇\t38740\n澄海区\t38741\n措施\t38742\n歌诗达邮\t38743\n卡子\t38744\n786\t38745\n787\t38746\n780\t38747\n什弥\t38748\n#abfab时尚资讯#英国当红IT\t38749\nnananannsns\t38750\n788\t38751\n789\t38752\n5亿六千九百九六\t38753\n这事科别记错了\t38754\n一本正\t38755\n陈嘉琪\t38756\niiook\t38757\n抱佛\t38758\nxjxn\t38759\n嗯十岁子\t38760\n真能\t38761\n恩阳\t38762\n抱住\t38763\ntookaplane\t38764\n13884360937\t38765\n旁轴\t38766\n乐少暖十\t38767\n大漠\t38768\n谁准\t38769\n2014点\t38770\n战迹\t38771\n苦修\t38772\n说呀你说我讨厌你的谁\t38773\n15880632094\t38774\n曹玲\t38775\n波特酒\t38776\n二麦\t38777\n天爸妈\t38778\n一笔一划\t38779\n谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢谢\t38780\nNoman\t38781\n二百七十元\t38782\n狂战狙\t38783\nOriginals\t38784\n孙东云#破蛋日与D.Line云端\t38785\n安替\t38786\n赵阿姨\t38787\n大礼\t38788\n茶道\t38789\n轻压\t38790\n遥遥相对\t38791\n东篱东篱东篱下\t38792\n贾雯昕\t38793\n物语\t38794\nhi你是大人小孩儿\t38795\n东上\t38796\n中继台\t38797\n亏心：违背良心亏心\t38798\n愤慨\t38799\n你了不灭\t38800\n下饭\t38801\n呀腿\t38802\n作品素\t38803\n十来块\t38804\nb版\t38805\n遐想\t38806\n城西区\t38807\n五棵松\t38808\n进出口\t38809\n米西\t38810\n铺盖\t38811\n萌芽\t38812\n短人\t38813\n色求\t38814\n狗焕\t38815\n鹅\t38816\n我知道你是小男人我没聊那么爱\t38817\n不感你一点也\t38818\n凤求凰\t38819\nq币\t38820\n袁三珊\t38821\n真是我的好\t38822\n辣者\t38823\n老巫婆\t38824\n生物物理学\t38825\n垄断\t38826\n农岗\t38827\n撸尼尼\t38828\n推了唱\t38829\nesthesplesto\t38830\n灵性\t38831\n割皮\t38832\n摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸摸\t38833\n16个半小时\t38834\n吃了讨厌\t38835\n土著\t3
8836\n幺零五九八一二八五零七\t38837\n自行\t38838\njingju\t38839\n汤姆猫七\t38840\n看见了没\t38841\n安全期\t38842\n八九我了思密达在\t38843\nelsesomebody\t38844\n苏妲己\t38845\n王默哼\t38846\n宾士\t38847\n好啦原谅你\t38848\n不见心不烦\t38849\nifvfdhk\t38850\n贺雨萱\t38851\n235千克\t38852\n消雲散\t38853\n9.4%\t38854\n生命之脉\t38855\n卡小河\t38856\n倾心\t38857\n安会\t38858\nRrrrrttryyyy\t38859\n哄开\t38860\nuhgvg\t38861\n吗怪\t38862\nvvvs\t38863\n鬼莫\t38864\ntdoi00\t38865\n1万二千六百五十六\t38866\n吃刺\t38867\n万绪\t38868\n第一柱\t38869\nabcef\t38870\n黃色\t38871\njapanhdav\t38872\n呜噜\t38873\n火灾\t38874\n刘建锋\t38875\n超抵\t38876\n唱一路顺风\t38877\n王晓晶\t38878\n哇好奇\t38879\n猴枣散\t38880\n川菜\t38881\n双轨制\t38882\n张怀枫\t38883\n三班\t38884\n热播家\t38885\n咳咳咳靠靠\t38886\n超频三独创\t38887\n茶壶\t38888\n難聽\t38889\n红米饭\t38890\n八大丿一nnnn＊nnnnnnnnn1314256\t38891\n公安警\t38892\n张野厄\t38893\n无名宝\t38894\n台球\t38895\n秦计件\t38896\n将场\t38897\nbatatm\t38898\nhggggggg\t38899\n更有效\t38900\n勃列日涅夫\t38901\n水中\t38902\n漫都\t38903\nDoyour\t38904\n呢不叫\t38905\n伤心失落\t38906\n阴诡\t38907\n屡几\t38908\n剖腹产\t38909\n答一答\t38910\n胤稹\t38911\n女票\t38912\nJizz\t38913\n北京电影学院招生办公室\t38914\n玛歌\t38915\n我需要你的心\t38916\ngushba\t38917\n背观潮\t38918\n头冷\t38919\n從一愛組\t38920\n丽江机场\t38921\n小花仙尾\t38922\n男恩\t38923\n一之\t38924\n座机号\t38925\nAPE\t38926\n空额额\t38927\n爱理\t38928\nay吧\t38929\n听冷笑\t38930\ndhrhf\t38931\n甘地\t38932\n无法解释\t38933\nQQ影音\t38934\n别揭\t38935\n重慶莊澤寶\t38936\n刘枫\t38937\ndanggao\t38938\n远离江湖\t38939\n一月二九\t38940\n车上\t38941\n赛尔弓\t38942\n我讨厌你我讨厌你我讨厌你我讨厌你度秘恨你\t38943\n别提\t38944\n７\t38945\nwc点儿\t38946\n美国华尔街日报\t38947\n25秒\t38948\neiddd\t38949\n支定\t38950\n活动\t38951\n讓陸陸們看台灣\t38952\n不排除\t38953\n投案自首\t38954\n誓死\t38955\n鹰\t38956\n贝类\t38957\n嘎羧\t38958\n别难过\t38959\n＜\t38960\n总裁\t38961\n花来\t38962\n露出\t38963\n历事\t38964\n鹿\t38965\n度秘莉亚雅\t38966\n贬值\t38967\n8178616\t38968\n蓝猫龙骑团\t38969\n遍布\t38970\n1310231092\t38971\n源海\t38972\n888石\t38973\n花束\t38974\nzaka\t38975\n医嘱\t38976\n哈萨克族\t38977\n占有欲\t38978\n留澳\t38979\n80亿\t38980\n王瑟瓶\t38981\n博平\t38982\n38527406546831\t38983\n一起丑\t38984\n连绵羊哥\t38985\n西和路晗\t38986\n十一百岁\t38987\n奥特植物大战僵尸\t38988\n想像\t38989\n姜慧\t38990\n大
马士革\t38991\n神钢神钢\t38992\n第一季\t38993\nHhhv\t38994\n冰玉\t38995\n精进\t38996\n十八根\t38997\n呢臭\t38998\n号码饭\t38999\nHhhH\t39000\n驴马\t39001\n3月31日\t39002\n钱汝兰\t39003\nhvgcfgggg\t39004\n新神\t39005\n早婚\t39006\n林宇阳\t39007\nhbsbbebe\t39008\n海尔斯\t39009\n北京大使馆\t39010\n死猪\t39011\n睇睇\t39012\n互动百科\t39013\n／\t39014\n十三亿\t39015\n沙沙\t39016\n攻下\t39017\n有脸\t39018\n来日\t39019\n恼怒\t39020\n红会学学营销学\t39021\n下令\t39022\n有气死人不偿命\t39023\n两百公斤\t39024\n古汉语\t39025\n哲思\t39026\n八十篇\t39027\nRMB85\t39028\n阿齐近平\t39029\n全速\t39030\n训觉\t39031\n8396144\t39032\n王启杭\t39033\n横放\t39034\n醒醒\t39035\n沙河\t39036\n6000块\t39037\n抢不去\t39038\nw试试试试试123\t39039\n鲈鱼\t39040\n直追\t39041\n淘汰\t39042\n258079\t39043\n我喜欢韬\t39044\n马上雪\t39045\n数据源\t39046\n中央书记处\t39047\n会系\t39048\ndeeh\t39049\n886886886886886886886886886886886886886\t39050\n6句\t39051\n苦菊\t39052\n丁目\t39053\n方雨\t39054\n曼殊\t39055\nm78\t39056\no了一了一丁孃\t39057\n赵白茜\t39058\n幻变\t39059\n胡蝶蝶\t39060\n了不为难\t39061\n雅儿\t39062\n888884\t39063\n藏藏藏藏\t39064\n高气爽\t39065\n888880\t39066\n七妹\t39067\n95花\t39068\n理实\t39069\n迪士尼公园\t39070\n偷瞄\t39071\n回天无力\t39072\n丢不起\t39073\n张博士\t39074\n章鱼舞\t39075\n三分之一\t39076\n2993793672\t39077\n我希\t39078\n漂酿\t39079\n两个多小时\t39080\n我师\t39081\n胡悦鑫\t39082\n内心\t39083\n处女地\t39084\n抱持\t39085\n你画我猜\t39086\n周立翔\t39087\n14:00——17:00\t39088\n缺暖\t39089\n一瞬间照\t39090\n点滴\t39091\n贾斯汀比伯\t39092\n对不起行\t39093\nＯ\t39094\n消失了\t39095\n可雅\t39096\ngungungungumgu\t39097\n黄敏\t39098\n酱油\t39099\n黄效\t39100\n余19999\t39101\n四臂\t39102\nPossibility\t39103\n干巴撸5w\t39104\n莎贝拉\t39105\n我们两个亲一个\t39106\n主业\t39107\n4e6\t39108\n纪春\t39109\n学杂费\t39110\n忘了我再见\t39111\n本去厕所\t39112\n还价\t39113\n怎么了我就是不爱你\t39114\n申奥\t39115\n走廊\t39116\n息怒\t39117\n卡纳\t39118\n说辞\t39119\n停牌\t39120\njutd\t39121\nydrf\t39122\n湖南广播电视台\t39123\n孙扬\t39124\n我喜欢猪猪侠勇闯巨人岛\t39125\n孔隙\t39126\n很急家具退图了了了了了了尽量了了了了\t39127\n爱上除\t39128\n陈朝伟\t39129\n孤零零\t39130\n坚大\t39131\n宿友\t39132\n5797557\t39133\n托马\t39134\n影片\t39135\n挚爱\t39136\n不锈钢\t39137\n求求你了告诉我\t39138\n雪獒\t39139\n圣经书\t39140\nffhdf\t39141\n语录\t39142\n商河县\t39143\ntoudào\t39144\n京bk6878\t39145\n
ghijgvhbojjgghokvkkhkvjvjvhyhhbo\t39146\n协会\t39147\n榕榕\t39148\n999878148648\t39149\n螺蛳粉\t39150\n聘请\t39151\n贵极人臣\t39152\n24章\t39153\n妖男\t39154\njreeeugb\t39155\n39ggggcom\t39156\n哪堂\t39157\n欢笑\t39158\nddghcchhgbj\t39159\n透过\t39160\n无意识\t39161\n曹佳晟\t39162\n粤西\t39163\n2542\t39164\n2054967935\t39165\n拨款\t39166\n况且\t39167\n别那么当真\t39168\n随风而逝\t39169\n2544\t39170\n一瓣\t39171\n朴槿汐\t39172\nwviqk\t39173\n张站雨\t39174\n辈子女\t39175\n山明水净夜来\t39176\n12小时内\t39177\n一亿首\t39178\nc1妃传\t39179\n阳哥哥\t39180\n显示屏\t39181\n改度\t39182\n填补\t39183\n152355555285268\t39184\n16点115点\t39185\n4月26\t39186\n宋婕\t39187\n4月25\t39188\n宝特攻恩\t39189\n填表\t39190\nPetrus\t39191\nnono堡炸鸡\t39192\n叫纪\t39193\n搞味\t39194\n西安文理学院\t39195\n活话\t39196\nshyuuigygygghk\t39197\n麻醉\t39198\n李胜贤\t39199\n老母猪老花猪\t39200\n活该\t39201\n高彩丽\t39202\nghhgch\t39203\n0.36%\t39204\n聊锻炼吧思密达\t39205\nGxgzgfd\t39206\ng网\t39207\n放泡\t39208\n绿豆薏米仁\t39209\n意马\t39210\n想见习我就喜欢\t39211\n作刀俎\t39212\n为你好看\t39213\n写道\t39214\n爽肤水\t39215\ndgggschyhrghfvwrvjugedbujkihfdcyfsefhuhxerhjihfescjkbdr\t39216\n受怀\t39217\n安静脉脉冲\t39218\n年月日\t39219\neye\t39220\n调节\t39221\n坤哥们\t39222\nHendrixxy\t39223\n一亩地\t39224\neyo\t39225\n说走就\t39226\neys\t39227\n美姑娘\t39228\n双翼\t39229\neyx\t39230\n厚脸皮\t39231\n臆造\t39232\ncostalloosao42ciousaobal\t39233\n秦州\t39234\n75748k\t39235\n一头儿\t39236\n乡经统我1六合彩经线钢经红\t39237\n无聊了不聊\t39238\n凶暴\t39239\n宋彬彬\t39240\n45018\t39241\n剪发\t39242\ngrtv\t39243\n大只\t39244\n大叫\t39245\n大可\t39246\n非凡淘气包\t39247\n王保安\t39248\n杨颖同\t39249\n大口\t39250\n大变\t39251\n一整晚\t39252\n墙饰\t39253\n手型\t39254\n陈爱琴\t39255\nnunnu1\t39256\n大发\t39257\n哭墙\t39258\n大叔\t39259\n大受\t39260\n大叉\t39261\nsthetityou\t39262\n十七辆\t39263\n值此\t39264\n贾丹丹\t39265\n岳家嘴\t39266\n师徒\t39267\n盘吃\t39268\n别当\t39269\n我是问你我是男是女\t39270\n我又要哭了秘度你好坏度秘你好坏秘度你好度秘你好坏\t39271\n内存卡\t39272\n增长速度\t39273\n别惹我我\t39274\n王听过\t39275\n发给线\t39276\n堂火\t39277\n乖叫\t39278\n讷样\t39279\n真美\t39280\n乖可\t39281\n悬念\t39282\n扶额\t39283\n结庐在人境\t39284\n寒假周\t39285\n255454859899885858165656595688566855456654565768775550784752245866555555555\t39286\n一看\t392
87\n中国移动通信集团公司\t39288\n胆小鬼\t39289\n白娘子\t39290\nvivvivivvi\t39291\n反串\t39292\n宫颈带鱼\t39293\n献错\t39294\n丰富多彩\t39295\n解放海南岛\t39296\nzhaijajkm\t39297\n一真\t39298\n汪振超\t39299\n4621475221442\t39300\n聊聊聊聊聊\t39301\n没人扣\t39302\nOPPO3彩\t39303\n笑死\t39304\nvtgh\t39305\nkdlddk\t39306\nqqc\t39307\nqqb\t39308\n超人陆战队百变马丁\t39309\n莫斯\t39310\nTIME\t39311\n特效\t39312\n白大褂\t39313\n九零零\t39314\n26654\t39315\n亮派\t39316\nqqs\t39317\n北京电影节\t39318\n权色\t39319\nqqx\t39320\n解决不我\t39321\n巨毒\t39322\n哈拉哈拉\t39323\ntbfgjnhtfxvb2254569ggvaeuo226896937833532626224734629729fh乎犬\t39324\n胡哥\t39325\n额恩\t39326\n小鲜女孩子\t39327\n读图\t39328\nYDVEHBSGCD\t39329\nyouil\t39330\n胡哲\t39331\n旺达\t39332\nLUOBin\t39333\n打广告\t39334\n度秘我讨厌你我讨厌你恶你不\t39335\n八千多米\t39336\n创举不改变\t39337\n3104127274\t39338\n7万5\t39339\n陈立颖\t39340\n篮手\t39341\n切果\t39342\n攀山越岭热火\t39343\n8545425555555545485545565545555285556535643565525323564555545545454655565656555552326666666666665555588545453325454595554345525626445552332221353465658595955565263\t39344\n五十四十五\t39345\n福岛1234号机\t39346\n湮儿\t39347\n危及\t39348\n何山路\t39349\n温猪\t39350\n奇景\t39351\n张你\t39352\n同心协力狐假虎威\t39353\n体育\t39354\n8683534354\t39355\n沙织\t39356\n论处\t39357\n12836986956\t39358\n纵欲\t39359\ncoding\t39360\n张馨予\t39361\nhiddnm\t39362\n故土胜\t39363\njxnd\t39364\n拉呗\t39365\n想人\t39366\n誉为\t39367\n17452186812\t39368\n一莫\t39369\nsjndnd\t39370\n察觉\t39371\n三帮\t39372\n聚合酶\t39373\n树皮\t39374\n冬敏\t39375\n花钱波\t39376\n塘尾中学\t39377\n二十四三个\t39378\n增肌粉\t39379\n铁道部\t39380\n色龙龙\t39381\n学物\t39382\n雷敏\t39383\n拉呱\t39384\n死丽影\t39385\n74554421245\t39386\nfuol\t39387\n张又一\t39388\n唱恋爱\t39389\n跳唱歌\t39390\n孟飞\t39391\n于田县\t39392\n美容师\t39393\n读了读\t39394\n考师\t39395\nfuor\t39396\n湄潭\t39397\n姚科\t39398\n布丁传\t39399\n原水\t39400\n怕怕冷\t39401\n球童\t39402\n零点前缘乾\t39403\n牧炎元\t39404\n祭扫\t39405\n植蓄发\t39406\n好姐妹\t39407\n劳力士\t39408\nGj3dks\t39409\ngghczfgcchczkjxjkcjklfudizjohfhhxfhgdgixhj\t39410\n三姑六婆\t39411\n球竿\t39412\nsbs\t39413\n各种\t39414\nsbq\t39415\n三尺白陵\t39416\nsbt\t39417\nsbu\t39418\n66七6\t39419\nsbx\t39420\n一髻儿\t39421\n窗花\t39422\nsbb\t39
423\n邀约\t39424\n莫斯科\t39425\n3291620862\t39426\nsbg\t39427\n东沙卡拉卡\t39428\n糙米片\t39429\njzbb\t39430\nsbh\t39431\n大小写\t39432\n奚梦娇\t39433\n各科\t39434\n萎货\t39435\n展柜\t39436\n美德\t39437\n叫叫\t39438\n廉政建设\t39439\n四二百五八\t39440\n涩涩\t39441\n叫句\t39442\n玫婕妤\t39443\n忙不忙\t39444\n两本书\t39445\n不得不到\t39446\n天夫家\t39447\nFrutfe8526656586081\t39448\n假号\t39449\n猩\t39450\n猪\t39451\n猫\t39452\ndont\t39453\n尸骨\t39454\n阳起石个\t39455\n山东首富电厂\t39456\n猥\t39457\n猸\t39458\n猹\t39459\n猻\t39460\ndone\t39461\ndong\t39462\n猿\t39463\n温水\t39464\n猴\t39465\n念诗\t39466\n敢作敢\t39467\n表瞎说\t39468\n潘龙杰\t39469\n一万颗\t39470\n猎\t39471\n蜈蚣\t39472\n猀\t39473\nJade\t39474\n靡不好\t39475\n足总杯\t39476\n林盛堡\t39477\n虞城\t39478\n猜\t39479\n可循\t39480\n2000亿元\t39481\n沃明\t39482\nBOOKLI\t39483\neeydtfd\t39484\n猖\t39485\n臭烘烘闹哄哄\t39486\n而去哪儿\t39487\n6亿立方米\t39488\n别说了再见\t39489\n艾瓦雷特\t39490\npark\t39491\n谢谢好了\t39492\npart\t39493\n恨偶\t39494\n讲句骂人\t39495\n水光\t39496\n用户端\t39497\n灵动\t39498\n困气\t39499\n13点37\t39500\n带领\t39501\n13点35\t39502\n水养\t39503\n灵力\t39504\n陷井\t39505\n这我\t39506\n李礼物\t39507\nu藏藏藏\t39508\n斯格威\t39509\n陷于\t39510\nislala\t39511\n朝鲜\t39512\n太马\t39513\n盖帽\t39514\n风豆腐鱼\t39515\n闹钟\t39516\n受难\t39517\n维他命A\t39518\n我宝贝儿TTTTTTTTTTTTTT\t39519\n一个多G\t39520\n岸边\t39521\n吹牛逼\t39522\n癫癫\t39523\n小苏苏\t39524\n千百儿\t39525\n高颜辛\t39526\n王航\t39527\n东欧\t39528\n苏大哥\t39529\nhfjfjfbff\t39530\npqitakajfjnzcn\t39531\n秘你反应好快\t39532\n被告人\t39533\n嗯世茂\t39534\n摊主书柜\t39535\n闹钱\t39536\n凹吐\t39537\npigni\t39538\n【实验中学\t39539\n从不曾\t39540\n嘉禾县\t39541\n跃动\t39542\n王小源\t39543\n献爱\t39544\n覆辙\t39545\n二百二十多\t39546\n哎呀你好可爱\t39547\n周天勇\t39548\n叫作\t39549\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t39550\n月光原\t39551\n脱掉\t39552\n400多个\t39553\n犬钱\t39554\n杜尔科\t39555\n香港文汇报\t39556\n叫位\t39557\n9月10日\t39558\n九月天五月天\t39559\n13684233\t39560\n六三七五\t39561\n等不到\t39562\n嫦娥三号探测\t39563\n14点30分\t39564\n养蜂人\t39565\n苏杭\t39566\n电锯惊魂\t39567\n娘娘腔\t39568\n道场\t39569\n袍子样\t39570\n头宇宙\t39571\n活定盈\t39572\n短缺\t39573\n想吃饭\t39574\n紫嫣\t39575\n样貌\t39576\n844535121423555\t39577\n腐黑\t39578\nufhggxg\t39579\n贾宝玉\t395
80\n万火\t39581\n金碧辉煌好季节顾后果金碧辉煌好季节顾后果frthttyhgfdf\t39582\n啦啦小一\t39583\n发克YOU\t39584\n142862879653\t39585\n闪婚\t39586\n李妍婷\t39587\n几个四分之一\t39588\n黑捣\t39589\n另一站\t39590\nfulibox\t39591\n大在在在在在在在在在在\t39592\n很大好\t39593\n预言家\t39594\n2010年10月11\t39595\n欧吼吼\t39596\n瓮主\t39597\nfbjkb\t39598\n宁海\t39599\n符管\t39600\n上午10时45分\t39601\n不可辱\t39602\nuj2kejtjktndjd83iyrrir84744819818wjen5i4o\t39603\nviedil\t39604\n凯萌\t39605\n灭鼎之灾\t39606\nDell咯11l\t39607\n薛笔记\t39608\n好好好好好好好\t39609\n福气\t39610\n多智\t39611\n嘴臭\t39612\n不公主\t39613\n宣不宣\t39614\n一百四十一块\t39615\nGrin\t39616\n仁十丽\t39617\n10000000000000000000000000000000000000000\t39618\n程顺\t39619\n地名\t39620\n识\t39621\nxbdbxb\t39622\n评\t39623\n诅\t39624\n诃\t39625\n天狼国\t39626\n证\t39627\n诏\t39628\n诌\t39629\n词\t39630\nswrtui\t39631\n诋\t39632\n诈\t39633\n诉\t39634\n诖\t39635\nhhbbh\t39636\n试\t39637\n行不信\t39638\n译\t39639\nCBGBBBC\t39640\n马戏\t39641\n话\t39642\n钱币\t39643\n诛\t39644\n废人\t39645\n详\t39646\n年内\t39647\nmannearhere\t39648\n询\t39649\nASOS额f雨fs\t39650\n诡\t39651\n误\t39652\n诬\t39653\n语\t39654\n胡理胡途\t39655\n诫\t39656\n归于\t39657\n诶\t39658\n请\t39659\n说\t39660\n诵\t39661\n喔无奈\t39662\n诱\t39663\n课\t39664\n葫芦熊\t39665\n诺\t39666\n读\t39667\n诸\t39668\n博物\t39669\n孟凡\t39670\n蜇悉吆伙砾弧欤母\t39671\n任嘉琦\t39672\n拿铁\t39673\n很多多\t39674\n食道癌\t39675\nxood\t39676\n公猴\t39677\n公猫\t39678\n公猪\t39679\n过犹待桑榆雾散迷雾过日子单极限日子及腹股处能量的桑桑三\t39680\nCorpand\t39681\n阿拉比\t39682\n搞爱\t39683\nDDB\t39684\n涂芸\t39685\n700多年前\t39686\nffffffffd\t39687\n工作照\t39688\nDDD\t39689\n较好看\t39690\n了明天见\t39691\n幽默性\t39692\n六局\t39693\n豆块\t39694\n两天一多\t39695\n计步器\t39696\n很多天\t39697\n死不活\t39698\n党国\t39699\n做梦中\t39700\ncakes\t39701\n游山\t39702\n無敵溫\t39703\n不耐\t39704\n一战小游\t39705\n忙我懂\t39706\n彩吧\t39707\n200M\t39708\n邵宇韬\t39709\n挂周\t39710\nup个你是猪\t39711\n两角\t39712\n生命浪\t39713\n不老\t39714\n不考\t39715\n华服\t39716\n810下\t39717\n不而\t39718\nTj\t39719\nkpcwc\t39720\n另个\t39721\n另中\t39722\n蒋雅豪\t39723\n12KM\t39724\nG4409200207194343\t39725\n5877585875\t39726\n578654578542\t39727\n不耻\t39728\n不能不有肉头\t39729\n中转站\t39730\n商人\t39731\n巨贵\t39732\n
810个\t39733\n重约\t39734\n好莱坞酒店\t39735\nLive武艺个唱#\t39736\n稍息\t39737\nSiwon\t39738\nIfdeddbnou\t39739\n五粮液\t39740\n停课\t39741\n银音乐\t39742\n乡后\t39743\n淡视\t39744\n13891888573\t39745\n专业人士\t39746\n180万元\t39747\n好吃嘎巴脆\t39748\n有影\t39749\n轻灵\t39750\n磨洗\t39751\n我是一只小小鸟笨鸟笨鸟先飞\t39752\n希界\t39753\n冻因为你的照\t39754\n何舒欣\t39755\n牛油果\t39756\n123秘\t39757\nQAQ哼\t39758\n95个\t39759\nqegvthch\t39760\nJackshe\t39761\n怦然心动\t39762\n李静宜\t39763\n八岁半\t39764\n曺振浩\t39765\n天仙桥\t39766\njustrinaberbr\t39767\n会片\t39768\n一了百\t39769\n行情\t39770\n陈建悟\t39771\n逼上梁山太急\t39772\n恩若汕石\t39773\n亚欧\t39774\n转出\t39775\n银子\t39776\n1163万日元\t39777\n科目\t39778\n25日\t39779\n散光\t39780\n红薯汤\t39781\n不识好歹\t39782\n生硬\t39783\n我做你的公主吧\t39784\nhi太狼\t39785\n没法\t39786\n特长班儿\t39787\n产日\t39788\n献策\t39789\n裸女\t39790\n心目雨\t39791\n陆美莹\t39792\n二斤\t39793\n策死\t39794\nB群\t39795\n着壳\t39796\n一六分\t39797\ndgc\t39798\ndgf\t39799\ndgg\t39800\ndgd\t39801\ndge\t39802\ndgj\t39803\n554575575\t39804\ndgh\t39805\ndgi\t39806\n混蛋神\t39807\n张正鹏\t39808\ndgs\t39809\ndgp\t39810\n蓝叶冰\t39811\ndgx\t39812\ndgy\t39813\n长阴\t39814\n个物\t39815\nwoyo\t39816\n水上漂\t39817\n请罪\t39818\n写业\t39819\n张小\t39820\n哈哈镜\t39821\n阿之\t39822\n苟延残喘\t39823\ngfhffugchg\t39824\nHelberg\t39825\n个片\t39826\n29座\t39827\n写上\t39828\n写下\t39829\n张少\t39830\n內衣\t39831\n阿乙\t39832\nQQ电话\t39833\n荼\t39834\n丁林天业\t39835\n生死契\t39836\n白石洲\t39837\n回炉\t39838\n好涛哥\t39839\n颜也\t39840\n三联桥\t39841\n别无聊了你\t39842\n十声\t39843\n依恋\t39844\n完败\t39845\n29度\t39846\n娘炮\t39847\n秘吏们\t39848\n中下游\t39849\n65斤\t39850\n好多集\t39851\n表露\t39852\n喝喝喝喝\t39853\n黑公里\t39854\n五百二十八号\t39855\n呢郎\t39856\n八小时后\t39857\n女虐\t39858\n心想着\t39859\n刘夏悦\t39860\n氮肥\t39861\n三三兄弟\t39862\n13120014598\t39863\n31364米\t39864\n旨在\t39865\n可乐冰\t39866\n慧\t39867\n母羊\t39868\n慢\t39869\n车尾委\t39870\n作品\t39871\n慬\t39872\noouvd\t39873\n慨\t39874\n奉化芋艿头\t39875\n投手\t39876\n男神国民峰\t39877\n苏娇\t39878\n樱雪\t39879\n罗西娜\t39880\n以和\t39881\n邹屋里\t39882\n慇\t39883\nmytoo\t39884\n宜都\t39885\n慎\t39886\ngggggggggggggggghgggggggggggggg\t39887\n慌\t39888\n来相\t39889\n慈\t39890\n入住\t39891\n八点生\t39892
\n浮出水面\t39893\n偏转\t39894\n哒豆腐\t39895\n拜一\t39896\n沙头\t39897\n中国工程院\t39898\nShenmeyisi\t39899\n八分之39分之二\t39900\n0976\t39901\n情人季\t39902\n意力丸\t39903\n看病难\t39904\n香水\t39905\n国际法\t39906\n数多\t39907\n刘忙\t39908\n酒量\t39909\n老单\t39910\n批儿\t39911\n数天\t39912\n老卡\t39913\n冯新晴\t39914\n瘟果\t39915\n眼前风\t39916\nFizzknock\t39917\n江铠同\t39918\n2603501738\t39919\n玩图\t39920\n香气\t39921\n甚感\t39922\n瓦利亚\t39923\n我是你的你是谁那是你的我是女的\t39924\n刘忻\t39925\n近水楼台先得月\t39926\n七月四\t39927\nFifteen\t39928\n左野\t39929\n团购网\t39930\n阿可\t39931\nHbd\t39932\n奶奶奶奶\t39933\nO\t39934\nFfdddddszzz\t39935\n隐匿\t39936\n专供\t39937\n好吧小爱\t39938\nHbx\t39939\n新泰和卫浴\t39940\nddsss\t39941\n张倩萍\t39942\nextfx\t39943\n阿双\t39944\nffccgfgrf\t39945\n蔡文珠\t39946\n球架\t39947\nzongji\t39948\n多咪咪\t39949\nHbZ\t39950\n1887415157777\t39951\n阿受\t39952\n75家\t39953\n999001\t39954\n999000\t39955\n道道\t39956\n标本性\t39957\n劳力\t39958\n琳呀楠\t39959\n接力棒\t39960\n体育公园\t39961\n伊波拉病毒#\t39962\nlesson\t39963\n呢绿\t39964\n怕死\t39965\n22644154\t39966\n压根\t39967\n布袋\t39968\n民间知识人\t39969\n娜露娃\t39970\n六四零\t39971\n全州两河\t39972\n湱上腹\t39973\ncamera\t39974\n高热量\t39975\n呢绒\t39976\n有幸运\t39977\n雅贝\t39978\n枝江\t39979\n南瓜粥\t39980\n配一配备\t39981\nghkffnb\t39982\n施以幻\t39983\n张校长\t39984\n摆开\t39985\n沙发垫\t39986\n摆弄\t39987\n刘辉\t39988\n汇总\t39989\n霸气腹黑\t39990\n丁月湖\t39991\nbcbhxjclcococlbckv\t39992\n狱里\t39993\n拉拉里\t39994\n回路\t39995\n睡的早你\t39996\n解馋\t39997\n恐慌感\t39998\n刻录\t39999\nhhuuuuu7trgy\t40000\n名犬\t40001\n所说\t40002\nivlvw\t40003\n烦呀\t40004\n15104893522\t40005\n458898586915886765889886589895569999898n90\t40006\n高鑫\t40007\n２００４年\t40008\nNdjfu\t40009\n夜犬\t40010\n多啦A梦\t40011\n发克发克发克发克发克发克\t40012\n碰杯\t40013\nCross\t40014\n90岁\t40015\n破天荒\t40016\n15993978526\t40017\n冬冬\t40018\n周慧莹\t40019\n安安安\t40020\n嘴笨\t40021\n真不找你\t40022\n碰杆\t40023\n以期\t40024\n含情脉脉\t40025\n跟我在这儿\t40026\n化力\t40027\n川黎奈\t40028\n鱼人\t40029\nno粉\t40030\n是非得\t40031\n秘赛尔\t40032\n看样\t40033\n浅淡\t40034\n幼儿园\t40035\n云龙\t40036\n娃洲\t40037\n慢里不行\t40038\n张保焊\t40039\n13kg\t40040\n再说一变\t40041\nk酷\t40042\n二二二二九五五八九五\t40043\nDeareditorl\t40044\
n金大奶\t40045\n10日22时\t40046\n祭祖\t40047\n定下\t40048\n放秘\t40049\n祭祀\t40050\n李雨璐\t40051\n等你有你在\t40052\n亢战庆\t40053\n定为\t40054\n都餐\t40055\n44114211254121\t40056\n垫脚石\t40057\n不懂意思\t40058\n南风\t40059\nyouj模\t40060\n定中\t40061\n华沙保卫战\t40062\n歌甜舞美\t40063\n26户\t40064\n再说一句\t40065\n南飞\t40066\n排畸\t40067\nklvtzsyzxuk\t40068\n四科\t40069\n好嘛多\t40070\n人处\t40071\n爱萍天\t40072\n四秘\t40073\n调秤\t40074\n崔点\t40075\n嘿嘿场\t40076\n少楷\t40077\n三第十二期\t40078\n迪马利亚\t40079\njvkbtzh\t40080\n人夫\t40081\n好呀摸摸\t40082\n人天\t40083\n武林学校\t40084\n人大\t40085\n东西南北\t40086\n再见再见再见\t40087\n一二零\t40088\nWHITE首\t40089\n办到\t40090\nhsvs\t40091\n鱼佑微\t40092\n要靠量\t40093\n简简单单单\t40094\n潜白\t40095\n人头\t40096\n个人照\t40097\n木桐\t40098\n跟帅\t40099\n。nnn\t40100\n猫仔粥\t40101\n教育学\t40102\n葡萄牙\t40103\n神像\t40104\n一只方\t40105\n自私下\t40106\n送达\t40107\ni559\t40108\n1nnn\t40109\n度秘我要你的命\t40110\n鸿星尔克们\t40111\n多老\t40112\n等待\t40113\n鉴订\t40114\n09865334\t40115\n13870060938\t40116\n杨一臭\t40117\n伍洁镟\t40118\ncentayos\t40119\n开水\t40120\n思绪万千\t40121\n新良\t40122\n毛骚\t40123\n红疙瘩\t40124\n加速\t40125\npsboss\t40126\n上海市金山区石化同凯离金山区石化欧尚超市\t40127\n郭小鹊\t40128\n花枝招展\t40129\n第44回\t40130\nrchfxgy64845\t40131\n病犯\t40132\n露天停车场\t40133\nas晓萍\t40134\n宫赫\t40135\n奥特曼君\t40136\n病状\t40137\n京暗拂\t40138\n大姐姐，你真的好傻傻傻傻你你\t40139\n华古瑶\t40140\n保外就医\t40141\n服宅\t40142\n和慧慧\t40143\niball\t40144\n零智\t40145\n李政国\t40146\n阔以\t40147\n4720\t40148\n掀桌\t40149\n紫燈瑩火玉\t40150\n自掘\t40151\nushahoouri\t40152\n最小化\t40153\n欧打油\t40154\np3海\t40155\n教科书\t40156\n八零花儿\t40157\n赛季\t40158\n1￥\t40159\n水豆\t40160\n广播\t40161\n灼工佞\t40162\n是非非谓\t40163\n哈老婆\t40164\n真实性\t40165\n陈小飘\t40166\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t40167\n发饰\t40168\n发饿\t40169\n非花\t40170\n张天爱\t40171\nTechnicolor\t40172\n没喜欢\t40173\n舌草\t40174\n芝麻粉\t40175\n安琪塔\t40176\n喷薄而出\t40177\n不是我和你一样喜欢安家\t40178\n苏菲雅\t40179\n2929812608\t40180\n1889568\t40181\n古惑仔\t40182\n幽默感费\t40183\n种树\t40184\n大撸\t40185\n朱杰\t40186\n唐晓慧\t40187\n3067点\t40188\n天天见\t40189\n白水八百少点抗我\t40190\n收工\t40191\n這個\t40192\n还不难看\t40193\n西班牙语\t40194\n激将法\t40195\n135794665476\t40196\n耶耶度耶度\t40197\n經\t40198\
n忧郁症\t40199\n鲁蛋挞\t40200\nknvvvvvvvbbbbnnnnnmmnhh\t40201\n我的朋友圈\t40202\n至深夜猫\t40203\n吃菜\t40204\nfjdcjs\t40205\n张爱玲\t40206\ngvnfujvhkcgjccgjnhbnvhhvjnvgjvvjbjvvkvbvcgggghhbbbvfcjvhchcvvccgxcvvvvvhbbbb\t40207\n4442229832\t40208\nc379310a55b319e47d520744a98226cffc173fjpg\t40209\n浑身\t40210\n韩迪\t40211\n三十几天\t40212\n7号\t40213\nrenmeinfinityoubomintimidito\t40214\n魔师萌\t40215\n护林员\t40216\ninthe\t40217\n48785424578\t40218\niiu\t40219\n罗西基迪亚比\t40220\n大路货\t40221\n刘丽美\t40222\n爸爸去哪儿第二季\t40223\n舒曼\t40224\niiy\t40225\n芳邻\t40226\n闭上听\t40227\niie\t40228\n乐太\t40229\niif\t40230\n说好巧\t40231\n耳机\t40232\n耳朵\t40233\n逼迫\t40234\niii\t40235\n注：转贴\t40236\n姐姐姐\t40237\n凉拌毛豆\t40238\n舒雅舒\t40239\niiR\t40240\n逼还\t40241\n邹剑明\t40242\n连都\t40243\n逼进\t40244\n一百四十七\t40245\n22亿克\t40246\n争气\t40247\n呵呵呵\t40248\n动物\t40249\n一个幺\t40250\n郴州\t40251\n家鸡\t40252\n掂量\t40253\n穿红\t40254\n猜田\t40255\nJuniel#\t40256\n小玲\t40257\nReuwyysjf\t40258\nKRY\t40259\n乖乖乖你最乖对不对\t40260\n眀天项\t40261\n上药\t40262\n25万吨\t40263\n哈密\t40264\n熊熊在心\t40265\n小玩\t40266\n感觉你是我爱的\t40267\n导演们\t40268\n童话故事\t40269\n笼头\t40270\n胡瑞\t40271\n粪坑\t40272\ndqmft\t40273\n恨过\t40274\n刘大爷\t40275\n猜生\t40276\n六八二七\t40277\n暨大哈哈\t40278\n长枪\t40279\n陈壁君\t40280\n驮着\t40281\n小玉\t40282\n526799762\t40283\n小王\t40284\n英剧\t40285\n新朝鲜F4\t40286\n67千米\t40287\nfggxfg\t40288\n一二零一\t40289\n李伟珖\t40290\n喔嘿\t40291\n柑橘类\t40292\n发若\t40293\n苏陌钰\t40294\n不我会\t40295\n第二梦吗联盟妈妈妈妈咪咪咪咪\t40296\n筒靴们\t40297\n张永红\t40298\n魔法号\t40299\n王中军\t40300\n尚雯\t40301\nhjjnnnkknjkmkkjkknvcgj\t40302\n好的好的好的好好的好大好大好大好大好大\t40303\n好厚\t40304\n须弥座\t40305\n130051\t40306\n珍妮优\t40307\nhegd\t40308\n12345678909865\t40309\n62%\t40310\n负责人\t40311\n走过来\t40312\n女人心\t40313\n好历\t40314\n围墙\t40315\n谁是你的好吧我是\t40316\n我喜欢的话\t40317\n七八吊养老钱\t40318\n干粉条共和国耕田绒\t40319\n假六月\t40320\n鲜明\t40321\n肿么咧\t40322\n滥竽充数\t40323\n张启靓\t40324\n乔菲瑞\t40325\n1711710670\t40326\n麓谷\t40327\n小游片\t40328\n就好哪了你\t40329\n老底全\t40330\n贫民\t40331\n合作关系\t40332\ndrtf\t40333\n善良的女孩\t40334\ndrtd\t40335\n烦恼我们的未来\t40336\naeft\t40337\n铁片人\t40338\n1600万\t40339\n欢型\t40340\n恶心不敢\t4
0341\ndrty\t40342\n雅鹿\t40343\n九博数想学\t40344\n韩冰女\t40345\n网站\t40346\n八百多遍\t40347\n好没意思\t40348\n老天儿\t40349\nWacken\t40350\nAGreenGreenGarden\t40351\n40周年\t40352\n嗯熊熊\t40353\nMother\t40354\n5到6小时\t40355\n道义\t40356\n三市路南门医院\t40357\n10.47元\t40358\n心辣\t40359\n彭博社\t40360\n简笔画\t40361\n我喜欢我的朋友\t40362\n林俊\t40363\n0.05亿元\t40364\n小破街\t40365\n叶非非\t40366\n泰然自若\t40367\n张福泰\t40368\n君威\t40369\n披着\t40370\n孕妇犬\t40371\n那鑫泰\t40372\n策动\t40373\n素友\t40374\n135生\t40375\nchiphotosbaiducomxiaodupicitem1f178a82b9014a90fc0280c9ae773912b31bee82jpg\t40376\nWOGTPA1N\t40377\n三点零三\t40378\n240岁\t40379\n曲\t40380\n莱切\t40381\n克罗多拉大峡谷\t40382\n恬园406车站\t40383\n给我教会\t40384\n乳源\t40385\n9.5万头\t40386\n分公司\t40387\n色胆\t40388\n肚子秘我好伤心呢度秘我好伤心\t40389\n咋天\t40390\n海盗了了\t40391\n事理\t40392\n梦溪\t40393\n好我在上思\t40394\n东风日产\t40395\n受吃\t40396\n对啊人妖\t40397\n唱歌唱\t40398\n正装\t40399\n百岁\t40400\n曾毅\t40401\n四圣\t40402\nWtio\t40403\n野生大鱼坊愚夫愚妇昏昏沉沉给vv给宝贝宝贝\t40404\njpijhah\t40405\nufit徐\t40406\n惨绝人圜一塌糊涂\t40407\n彭拜\t40408\n火侠\t40409\n被扣\t40410\n蝈蚋\t40411\n四场\t40412\nvip群\t40413\n看家豪\t40414\n贾培\t40415\n相差\t40416\n大年级\t40417\n五效\t40418\n哪丑\t40419\n四圈\t40420\n一个字儿\t40421\n大年纪\t40422\n把头看\t40423\n五教\t40424\n课文\t40425\n苍井九里\t40426\n公约\t40427\n四土\t40428\n切身\t40429\n格瑞卫康\t40430\n好人心\t40431\n新心情\t40432\n有难\t40433\n两厢情愿\t40434\n撰文\t40435\n家庭\t40436\n杨繁盛\t40437\n多面\t40438\n家度\t40439\n戈兴根\t40440\niSTAYREAL\t40441\n三分钱一兆\t40442\n球囊\t40443\n社会治安\t40444\n任宏恩\t40445\n捉迷藏挺\t40446\n十六秒\t40447\n家店\t40448\nav女优\t40449\n家底\t40450\n家床\t40451\n二十一世纪\t40452\n充电式\t40453\nGIGhh\t40454\n绿肥\t40455\n200多家\t40456\n刘艳雅\t40457\n四川盆地东部\t40458\n小山子\t40459\n六二十六个\t40460\n家庄\t40461\n欣恬番\t40462\n湖过者\t40463\n9月15-16日\t40464\nbkzkx\t40465\n新生代\t40466\nd90\t40467\n巴啦啦小魔仙\t40468\n茅坑\t40469\n汪叽喵\t40470\n没错长\t40471\n2200万元\t40472\n我就是喜欢问你爱不爱我？真的爱你\t40473\n1151034905\t40474\n话了行\t40475\n点嚒\t40476\nANAHITA酒店\t40477\n品尝爱情\t40478\n单调\t40479\n有几\t40480\n百分之80\t40481\n今天下午四点\t40482\n静待\t40483\n三四包\t40484\n擔心\t40485\nruuf\t40486\nhrgdrf\t40487\n碧云路\t40488\n呵演\t40489\n没打\t40490\n李拜\t4049
1\n田佳伟\t40492\n说什么说\t40493\n徐铮\t40494\n小小孩\t40495\n138元\t40496\n咁噶\t40497\n执事\t40498\n武漢電視臺還挺給力\t40499\n凡人我就是的嘛呀嘛呀嘛爱嘛爱\t40500\n老年人\t40501\n8小时\t40502\n没手\t40503\n收货人\t40504\n朱砂\t40505\n醉红\t40506\n杨给我\t40507\n小小子\t40508\n受众\t40509\n72837798\t40510\n王岗头\t40511\nvjxgutuztudyuyxhghxigxgjxhc\t40512\n今天晚上9点\t40513\n挺老虎唐嗯\t40514\n小心有九本书小西友期板书小心的书给小新\t40515\n天涯赫子心\t40516\n唉座机\t40517\n弗拉门戈\t40518\n母佳林\t40519\n6900多\t40520\n韵律\t40521\n液晶电视\t40522\n陈开\t40523\n一厘米\t40524\n付小心\t40525\n美月\t40526\n16611\t40527\n针管\t40528\n谁和谁\t40529\n蓝听话\t40530\n售\t40531\n何静\t40532\n袍江新区\t40533\n旧爱\t40534\n家长恋\t40535\n青雪\t40536\n时下\t40537\n你好块\t40538\n陈强\t40539\n扩一\t40540\n庆典\t40541\n你好坏\t40542\n八日\t40543\n遰露\t40544\n王宇哲\t40545\n在远方\t40546\nVBaby\t40547\n刘千松\t40548\n阁楼上的王子\t40549\n你好你好我是秘度\t40550\n黄二小\t40551\n吃吃吃吃吃吃吃吃\t40552\n说不成\t40553\n云真\t40554\n11日早晨\t40555\n露珠神\t40556\n霸道\t40557\n陪陪\t40558\n现在之间\t40559\n剪水\t40560\n福fsrw\t40561\n名侦探柯南业火向日葵\t40562\n一刀切\t40563\n问天问\t40564\n高高高高\t40565\n非非常常\t40566\n瓜体\t40567\n答辩套\t40568\n医患\t40569\n糸陪\t40570\n张晨梦\t40571\n熟读\t40572\n中天御园\t40573\nxdf\t40574\nxdd\t40575\n泄\t40576\n你在哪我去哪儿\t40577\n打战\t40578\n泉\t40579\n泊\t40580\n泌\t40581\nJAPONICANA\t40582\n安盛\t40583\n泐\t40584\n950工\t40585\n数三秒钟\t40586\nxdt\t40587\naggggghvnhhhhgnhhhhgngghhhhhygggg\t40588\n泗\t40589\n十一十二十三\t40590\n唐茂木公\t40591\n泛\t40592\n紧密\t40593\n泞\t40594\n旅游节\t40595\n泠\t40596\n泡\t40597\nrtff\t40598\nrtfg\t40599\n泥\t40600\n工信部\t40601\n362\t40602\n泪\t40603\n黑瘴珠\t40604\n泰\t40605\n刘仁\t40606\n象印\t40607\n泳\t40608\n步行街\t40609\n泵\t40610\n泶\t40611\n懒得\t40612\n泻\t40613\n泼\t40614\n美国空军\t40615\n查東\t40616\n我喜欢一个男孩\t40617\n名媒体\t40618\n小扎\t40619\n郑拍\t40620\n九百七十八九百七十八一八七\t40621\n娇俏\t40622\n良时\t40623\n胡月\t40624\n热死\t40625\n胡最\t40626\nbroost\t40627\nAdrere\t40628\n婆主姦鐵夕阳红姑\t40629\n说和非\t40630\n秦时明月苍龙七宿\t40631\nRxycycff\t40632\n周再\t40633\n储依雯\t40634\n熊出没熊心归来\t40635\n演算\t40636\n匪\t40637\n阿屎\t40638\n长江新村\t40639\n云龙百货从自行车厂\t40640\n申嘉彤\t40641\n匣\t40642\n匠\t40643\n两片\t40644\n桑那\t40645\n结节\t40646\n区\t40647\n医\t40648\n宠物西游\
t40649\n匹\t40650\n谈笑风生\t40651\n创想\t40652\n变性人\t40653\n上十二个小小时\t40654\n男可女半男半女\t40655\n真心不衰\t40656\nGo，to\t40657\n证道\t40658\n匈\t40659\n匉\t40660\n我不要你了我不要你了我就是不要你了我不要你了我不要你了我不要\t40661\nvxec\t40662\n匍\t40663\n看恐怖\t40664\nuif秘\t40665\n包\t40666\n13237007873\t40667\n张妍\t40668\n匙\t40669\n开除了\t40670\n匟\t40671\n巨龟\t40672\n脑筋急转弯\t40673\n黄纯莹\t40674\n航程\t40675\n化\t40676\n女刺激咧\t40677\n匕\t40678\n写明\t40679\n五月底\t40680\nhihhh\t40681\n若有所思\t40682\n看出来\t40683\n嗯露\t40684\n扬颖\t40685\n荷兰豆\t40686\n行山\t40687\n国华\t40688\n秒客##\t40689\n臭罵\t40690\n大一个郎\t40691\n爱晚安\t40692\nhello度秘\t40693\n撸撸音\t40694\n乘着风走\t40695\n统考\t40696\nwahia\t40697\nwahiz\t40698\n捆绑派\t40699\n12485883\t40700\n1358\t40701\nunfair53\t40702\n1357\t40703\n金桥\t40704\n撕扯\t40705\n恒生银行\t40706\n拜拜年\t40707\n回荅\t40708\n连接线\t40709\n欧洋\t40710\nLuckystvike\t40711\nttfddrd\t40712\n漫画家们\t40713\n023千克\t40714\n欧洲\t40715\n85528471\t40716\n没有我\t40717\n管理迷\t40718\n金桂\t40719\n56栋\t40720\n八点九点\t40721\n婚俗\t40722\n千八百\t40723\nvvvbvvcfvdv\t40724\nOhioHNK\t40725\n90万\t40726\n行行行行\t40727\n123088646464646\t40728\n三一堆\t40729\n满脸\t40730\n不得心应手\t40731\n就手\t40732\n爱聊\t40733\n中国国家博物馆\t40734\n稻草人\t40735\n你爱我你爱我你爱我你爱我你爱我你爱我你爱我\t40736\n备份\t40737\n女王陛下\t40738\n姐们儿\t40739\n一百三十四块\t40740\n就打\t40741\nPONG\t40742\n开普勒\t40743\n泰勒·斯威夫特\t40744\n阿里里\t40745\ngigie\t40746\ntndjfiyhg\t40747\n死你\t40748\n吐莫\t40749\n象鼻\t40750\n谁片\t40751\nuovcbi\t40752\n探出\t40753\n盯上\t40754\n厦门钨业\t40755\n有前无\t40756\n老正常化\t40757\n转角\t40758\n甘愿\t40759\nJCW\t40760\n蓝阳光\t40761\n分段\t40762\n再来一发\t40763\n吧一个人静一静伤不起真的伤不起真的伤眼伤眼伤不起\t40764\n张婉欣\t40765\n机器人我危险呢你还要不要脸\t40766\n不不不不不不想\t40767\n要你摸摸\t40768\n分访\t40769\nFapplevidoes\t40770\n眼痒\t40771\n午台\t40772\n解放碑novo拍攝时尚mv\t40773\nzgudyclhlf\t40774\n4588589\t40775\n小啦\t40776\n当爱来的时候\t40777\n你好我是秘度\t40778\n人大附小\t40779\n龙宝宝\t40780\n陈述句\t40781\n无名氏\t40782\n坤沙\t40783\n精金\t40784\n魏双秋\t40785\n死佐\t40786\n幻灭\t40787\n臃空\t40788\ngggggvffgggggggg\t40789\n2位\t40790\n靠不吃饭\t40791\n阿木西林\t40792\n女人花\t40793\nfrosorn\t40794\n牛宜兴\t40795\n澄海\t40796\n三井寿\t40797\n12月24号\t40798\
n林丽阳\t40799\n18930975759\t40800\n好运来\t40801\n同党\t40802\n悬金佩玉\t40803\n李天王\t40804\n土丹\t40805\n泾阳\t40806\n徐子业\t40807\n光头强嗯强阳刚之\t40808\ncd7b899e5190d6306745a7d933c8950d6cjpg\t40809\n名者\t40810\n修仙文\t40811\n做战\t40812\n幸福路\t40813\n1117\t40814\n来了我走\t40815\n有饭\t40816\n做成\t40817\n做我\t40818\n递增\t40819\n许默哀\t40820\n点位\t40821\n周瑜凯\t40822\n做戏\t40823\n1346755465\t40824\n28yy\t40825\n王一鸣\t40826\n精采\t40827\nsecyma\t40828\n緩\t40829\n就是你了不累呀一家子\t40830\nFYDY\t40831\n周子腾\t40832\n第一周内\t40833\n侍妾\t40834\n看不到\t40835\n慰藉\t40836\n留心\t40837\n休整\t40838\n吃点\t40839\n吃炸\t40840\n夜思\t40841\n家猫\t40842\n年检\t40843\n辽宁省人大\t40844\n大草莓\t40845\n负一\t40846\n吕佳怿\t40847\nkczhxnsdmfl\t40848\n189992448\t40849\nloullolak\t40850\n139万吨\t40851\n常老师\t40852\n每时\t40853\n叫秘若溪\t40854\n王子诺\t40855\n吃炔\t40856\n安心\t40857\nabckfc\t40858\n每日\t40859\n673533\t40860\n猫咕咕猫\t40861\nyejsh\t40862\n凤凰古城\t40863\n现已\t40864\n强项\t40865\n3点13分\t40866\n康雨欣\t40867\n不基\t40868\n王楚涵\t40869\n陈飞\t40870\n好人性\t40871\n饿哦开心给哦x5肚饿\t40872\nviygj\t40873\n不识人\t40874\n夜上海\t40875\n海玲\t40876\n9648\t40877\n于欣颖\t40878\n9o15度\t40879\nU8860黑1800\t40880\n启动迹象股\t40881\n海螺\t40882\n为灵\t40883\n五ｃｍｂｅ\t40884\n燕子门\t40885\n叉叉\t40886\n45358574\t40887\n会员卡\t40888\n15分钟左右\t40889\n崩塌\t40890\n58690098652438609\t40891\njjjdjd\t40892\n引流\t40893\n旱地\t40894\n超过30秒\t40895\n粗土\t40896\n眼药\t40897\n虎父\t40898\n118家\t40899\nApp\t40900\n七国\t40901\n天下雪也两个字词我是电影小达这么说话的我对我的电影小的人\t40902\n姿料\t40903\n卤菜\t40904\nhhcj\t40905\nhhcg\t40906\n消杀\t40907\n智囊\t40908\n全部人\t40909\n我看不惯你骗我我我就白蚁来了三楼不我不要你\t40910\n诺娃\t40911\n五星猪\t40912\nV宝贝\t40913\n伤怀\t40914\n碧海\t40915\n刁惠洁\t40916\n敲请\t40917\n我是那么的爱你\t40918\n李嘛花\t40919\n罗米\t40920\n30条\t40921\n头位\t40922\n配色\t40923\n胶瓶\t40924\n胡天飞雪\t40925\nhihihihihihi\t40926\n糟糠\t40927\n落幕\t40928\n心人族\t40929\n钱学长\t40930\nNOVO控\t40931\n方能\t40932\n加勒比\t40933\n藏头\t40934\n很好天安因为我是女孩我没有事女孩\t40935\n鼠辈\t40936\n不属\t40937\nAZ273\t40938\n辅音\t40939\n365天\t40940\npospl\t40941\n碰碰\t40942\n绵谷\t40943\n不屑\t40944\n护臀\t40945\n呦吼\t40946\n2月18日起\t40947\n不屈\t40948\n要怕\t40949\n歌库\t40950\n哥恩\t4
0951\n计入\t40952\n23456667899654577897\t40953\n汗流\t40954\n今年三月\t40955\n白子葉\t40956\n爱爱爱不爱\t40957\n宿迁市\t40958\nIcle\t40959\n潘主任\t40960\n第七个\t40961\n认定\t40962\n1-4天\t40963\n狗输入法\t40964\n虐杀\t40965\n49\t40966\n46\t40967\n47\t40968\n44\t40969\n45\t40970\n42\t40971\n43\t40972\n40\t40973\n41\t40974\n4.\t40975\n7位\t40976\n宋巨杨\t40977\n热爱\t40978\n王乐萱\t40979\n小紫\t40980\n粉碎\t40981\n喜帕\t40982\nSAMPAR\t40983\n王糖糖\t40984\n4X\t40985\n蒋文深\t40986\n万民\t40987\n南白癜\t40988\n侬晚安\t40989\n4S\t40990\n4P\t40991\n进一步\t40992\n分不开心\t40993\nmmoo000\t40994\n为情所困\t40995\n罪过\t40996\n朱卫民\t40997\n4G\t40998\n4D\t40999\n4B\t41000\n聊聊聊聊聊聊聊聊\t41001\nxhdx\t41002\n依法行政\t41003\n4x\t41004\n无聊时刻\t41005\n4v\t41006\n密波\t41007\n4t\t41008\n4u\t41009\n万水\t41010\n4p\t41011\n4q\t41012\n4n\t41013\n老中青年\t41014\n4m\t41015\n4k\t41016\n叫花鸡\t41017\n4g\t41018\n4d\t41019\n苏我\t41020\n4b\t41021\n环卫\t41022\n煜煜\t41023\n4a\t41024\n郎丽楠\t41025\nnocanno哔哔\t41026\n秘媳妇\t41027\n4200万\t41028\n你好你好机器人我爷爷在给你说你好\t41029\n恭喜恭喜\t41030\n不再见了\t41031\nJigjggag\t41032\n中科大\t41033\n我还不去你那看你哪熊样猩\t41034\n燃油\t41035\n大头哥\t41036\n内镜\t41037\n13582628827\t41038\n在线版\t41039\nohhkgdlhnkhkhgoynlhlj\t41040\n中性\t41041\n鲍士会\t41042\n装哈\t41043\n吴音相媚\t41044\n果不其然\t41045\n9月8号\t41046\n一次次\t41047\n会努力\t41048\n骗不到\t41049\n一喜\t41050\n4000元\t41051\n幺三零二九二一九八零零二二零七二二四\t41052\nsdldld\t41053\n查元培\t41054\n麻烦你\t41055\n新芽\t41056\nhbhu\t41057\n提拉米苏\t41058\nhbhs\t41059\n13783826944\t41060\n花型\t41061\n冒牌类\t41062\n新花\t41063\n你好度秘我问你\t41064\n1235688333\t41065\n纸袋\t41066\n抗争\t41067\nhbhh\t41068\n虚化\t41069\n赵雨润\t41070\n猪猪样\t41071\n131206\t41072\n海底环球\t41073\nppoguh\t41074\n谢谢我最萌萌哒啦\t41075\n那当你\t41076\n赵雨涵\t41077\njvvv\t41078\nAhfdssd\t41079\n小诚意\t41080\n亲姐\t41081\n亲姑\t41082\n那你花少莫\t41083\n五脏六腑\t41084\n易趣\t41085\n逆臣\t41086\n小逗\t41087\n3月12号\t41088\n小逝\t41089\nWechatinEnglish\t41090\n沙拉帝\t41091\n韦尔比\t41092\n19431846\t41093\n战国\t41094\n111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111\t41095\n麼麼噠\t41096\n你是我最爱的人是我的爸爸\t41097\n上刚\t41098\n
突如其来\t41099\n头余\t41100\n积富\t41101\n858658558858\t41102\n17.96%\t41103\n耸立\t41104\n小逼\t41105\n呢腿子\t41106\n很好笑\t41107\n我喜欢战巨幕\t41108\n解兴\t41109\n南岳中心小学\t41110\n自住房\t41111\n闻名\t41112\n聊的再聊\t41113\n常明\t41114\n整个\t41115\n登录\t41116\n冷出翔\t41117\n陏便\t41118\n秘千元哩\t41119\n五毛\t41120\n2332233323232323232332333233323332323233323232333232323233333232323\t41121\n西方人\t41122\n五比\t41123\n66661616\t41124\nsyue\t41125\n38438\t41126\n李彦宏\t41127\nwdb\t41128\n文殊院安详舍报\t41129\nwdd\t41130\n认识书\t41131\n音乐会\t41132\nwdj\t41133\n238656726673\t41134\n形式\t41135\nwdm\t41136\n欣欣欣欣\t41137\n深奥同济\t41138\n来嫁\t41139\nwdv\t41140\nhi1u\t41141\nwdx\t41142\n市里头\t41143\n仲要\t41144\n张金杨\t41145\n我不想理你了我讨厌你\t41146\n水游城\t41147\n没有你好玩\t41148\n一大一个\t41149\n雷头怡\t41150\n不一样了\t41151\n性感型\t41152\n祸从口出\t41153\n实验弹\t41154\n奥别\t41155\n北京汽车展\t41156\nmattias\t41157\n1234567\t41158\nStarNews\t41159\n不精八嘎\t41160\n狗狗狗\t41161\n听话的好孩纸\t41162\n韩佳人\t41163\n正水井头村\t41164\n一大一下\t41165\n雨果\t41166\n经审\t41167\n胡婉妮\t41168\n96509951舅猪猪\t41169\n雨林\t41170\n50多年\t41171\n45行\t41172\n任凤平\t41173\n斗罗大陆3龙王传说\t41174\n染指\t41175\n跳崖\t41176\n四有VI\t41177\n哟五块\t41178\nrobin\t41179\n嫁鸡随\t41180\n铁西体育馆\t41181\n爱着你是\t41182\n林细\t41183\n直挂云\t41184\n邓弟弟\t41185\n詞語酮替芬餃子\t41186\n123456z\t41187\n上古\t41188\nDoyou\t41189\n40kg\t41190\n前门\t41191\n上司\t41192\n萌图\t41193\n美术片\t41194\n上台\t41195\n大神计\t41196\n这首姐姐\t41197\n飞更高\t41198\n月头\t41199\n西皮粉\t41200\nvsbs\t41201\n刘我\t41202\nstomityou\t41203\n福肚\t41204\n美领馆\t41205\n团组织网\t41206\n4件\t41207\n臭芭\t41208\n雷明睿\t41209\nIIa\t41210\nvv刚\t41211\n猜猜我是女\t41212\nIIn\t41213\n她的错\t41214\nWheel\t41215\n恕罪\t41216\n1688958\t41217\n老鹰\t41218\n这会\t41219\n我勒个擦要不你骂我两句\t41220\n煽烂打\t41221\n辍学\t41222\n持剑\t41223\nJordan\t41224\n说话训练我爱中华我爱\t41225\n青阳小学\t41226\nIII\t41227\n苦口婆心\t41228\n混世\t41229\n胳组\t41230\n接吻\t41231\ngukbvuub\t41232\n五着\t41233\n宝鸡市\t41234\n贾度秘\t41235\n烟丝\t41236\n耐我的爱\t41237\n西方\t41238\n接听\t41239\n新鲜感\t41240\n西施\t41241\n傻子种\t41242\n1580710373\t41243\n海无\t41244\n了麻烦\t41245\n粗糙\t41246\n雷子\t41247\n达人\t41248\nｆｕｃｋｙｏｕ\t41249\n傻子秘\t412
50\n和行行行\t41251\nhdddd\t41252\n原写\t41253\n家传\t41254\n十一运\t41255\nIivneyou\t41256\n一万多18800\t41257\n吧主人\t41258\ndoxvb\t41259\n下四点\t41260\n草莓橘子\t41261\n脑子\t41262\n047774899\t41263\n有词\t41264\n冰沙机\t41265\n东港镇\t41266\ntfboystf\t41267\n这就拜\t41268\n嘻唰唰\t41269\n慈航殿\t41270\n反省\t41271\nyahxima\t41272\n明天志国饭店\t41273\nJptgwjma\t41274\n八棵树\t41275\n洁面乳\t41276\n小杂犬\t41277\n业界\t41278\n一个8岁\t41279\n家伙\t41280\nbrbgrr\t41281\n宿醉\t41282\n一辆半\t41283\n丹心照汗青\t41284\n郑叫\t41285\nKids\t41286\n不悔\t41287\n美丽的秘\t41288\n郑口\t41289\n踏板\t41290\n你不懂\t41291\n好啊千鸟女\t41292\ngreenl\t41293\n兔兔图图all里hi哈饿啦老K\t41294\n复肤品\t41295\n54颗\t41296\ncallmesheldon\t41297\n嗯特\t41298\n与其\t41299\n3155504626\t41300\n别回\t41301\n对窝\t41302\n眼巴巴\t41303\n七八九十十一十二十二十五十十一十二十九二十\t41304\n科隆大学\t41305\n连衣裙\t41306\n有话\t41307\n咬尾\t41308\nwijfkk\t41309\n周阳靖\t41310\n杨经理\t41311\n自大自恋狂\t41312\n八国\t41313\n砚台\t41314\n蓝蓝蓝蓝蓝蓝\t41315\n135个\t41316\n十九只\t41317\n起作用\t41318\n阿奎拉尼\t41319\n河粉\t41320\nqeadxggGgcxbvcvnVccjjvxgvcrhgfhgfthggugrfjillkdexxgvxznnbyfvijppllgtsdjjvg\t41321\n八四\t41322\n691997675\t41323\n不可收拾\t41324\n对齐\t41325\n吃西瓜\t41326\n无休止\t41327\n一毛钱\t41328\nhxdkdc\t41329\n欧俊秀\t41330\n双8\t41331\n二十多三十个\t41332\n搞不清楚\t41333\nCBD\t41334\n100qb\t41335\n双J\t41336\n爱汽车\t41337\n防度\t41338\n维斯顿\t41339\n宣武门\t41340\n55800856350635555633856365557\t41341\n为娼\t41342\n小咪夏\t41343\n李嘉铭\t41344\n土鳖=hillbilly\t41345\n岱山\t41346\n犯规\t41347\n学哥\t41348\n异能萌真心的爱我\t41349\n猪羊\t41350\n哦罗\t41351\n马肖\t41352\n想当\t41353\n空难\t41354\n枉判\t41355\n蜘蛛的度秘不看你了我好恨你\t41356\nshhwgff\t41357\n爱丽丝快点吧\t41358\n热血\t41359\n塞来一\t41360\n奇珀兰\t41361\nThinkGeek\t41362\n产车\t41363\n马肉\t41364\n想影\t41365\n热表\t41366\n12553358855888\t41367\n火辽田\t41368\nHBVGGVV\t41369\n東西\t41370\n过夜行\t41371\n妙懂\t41372\n好吧那我我告诉你我在传奇\t41373\n帕托、洛佩斯\t41374\n黑水县\t41375\n热衷\t41376\n热衰\t41377\n38383838383838383838383838383838\t41378\nKJF\t41379\njfjg\t41380\n哪地儿\t41381\n一人手\t41382\nKJJ\t41383\nguhu\t41384\n量子物理学\t41385\n破坏机\t41386\ncomyoutnmmpappanimnotraffertisiompertifioashank\t41387\n总有一天\t41388\n因為什麼\t41389\n小了行\t41
390\n体重游戏\t41391\n撤尼\t41392\n京江大酒店\t41393\n救命\t41394\n对呀真可惜\t41395\n超度\t41396\n河南省\t41397\n卖一个萌\t41398\n老蛏\t41399\natiotaaaaaaaaahahaouamatishahnn\t41400\n虎山\t41401\n经济额\t41402\nGang\t41403\nweishenmea\t41404\n披萨\t41405\n大傻夫\t41406\n发微\t41407\nleo修仁爱康秋天\t41408\n安城靡\t41409\n天山花园\t41410\n你好呀小达人\t41411\n恩个\t41412\n不是忘记你吧写在日记\t41413\n罗耀\t41414\n九真\t41415\n杰作\t41416\n我哎呀女的你的你是个女机器人\t41417\nolemis\t41418\n我喜欢一首歌\t41419\n弟娃\t41420\n九看\t41421\n陆锦绣\t41422\n陈浩远\t41423\n杨板凳狗\t41424\n翔爸翔\t41425\n13971295123\t41426\n逗鸡\t41427\n美少男\t41428\n觉察力\t41429\n囡囡\t41430\n修正案\t41431\n哼信\t41432\n奥们\t41433\n辐射量\t41434\n你是女孩子那你的爱好\t41435\nad5ad6eddc451da6ac3df9fb1fd5266d0163289jpg\t41436\n挨苦\t41437\n美乳\t41438\n闲忍者\t41439\n鱼翅\t41440\nthome\t41441\n浩饭\t41442\n2870\t41443\n神符\t41444\n万语\t41445\n黑马马\t41446\n启动\t41447\n一九九四零二零五六六二二\t41448\n我的一句话\t41449\n网络群\t41450\n整套\t41451\n死心经\t41452\n撸贝克尔\t41453\n呢了\t41454\n2009年3月19日\t41455\n就是这样的人\t41456\n日本厚生劳动省\t41457\n彻夜麻将\t41458\n晕晕乎乎\t41459\nhomehuaile\t41460\nDoyoumarriedme\t41461\n小姨妈\t41462\ngxgmh\t41463\n四百千万四百五万\t41464\n黄世华\t41465\n及呢请\t41466\n水珠\t41467\nMOCA\t41468\n露阴\t41469\ngghakq\t41470\n泪你的泪\t41471\nhttphhiphotosbaiducomxiaodupicitem8601a18b87d6277f71d7274a2f381f30e824fcffjpg\t41472\n蛤岛\t41473\n后传号\t41474\n在校校\t41475\nCBS\t41476\n你男你女\t41477\n高达娘\t41478\n陕鼓\t41479\n郭姝廷\t41480\n东莞北站\t41481\n来由\t41482\n江慧\t41483\nctvibo\t41484\n恨人\t41485\n车儿\t41486\n张叔叔\t41487\n在云上\t41488\n了测\t41489\n浙江卫视传林\t41490\n封网\t41491\n告诉你我我没有\t41492\nvvvvvbhhh\t41493\n1598669287899\t41494\n那敢情你还有另一只折翼的度秘\t41495\n他来了请闭眼\t41496\njtfg687\t41497\n劳什子\t41498\n彭金干\t41499\n联谊\t41500\n韵怡楠\t41501\nQS220115050342\t41502\ncvjk\t41503\n混蛋高\t41504\n索索\t41505\n3867846\t41506\n祖先\t41507\ncvjn\t41508\n随后\t41509\n飘尸\t41510\n迪克莱亚\t41511\n潘缓缓武鸣\t41512\n阿槿\t41513\n张志鹏\t41514\n狼藉\t41515\n嗯男男\t41516\n张美\t41517\n七八把\t41518\n马彦文\t41519\n徐浩\t41520\n麻淑丫\t41521\n谁是你嫁给我照张相\t41522\n景琰\t41523\n腰间盘突出\t41524\n小度秘你真是我的好朋\t41525\n12:00-14:00\t41526\n刂一\t41527\n任督\t41528\n方形\t41529\ndwaysudway\t41530\n魁省
移民厅\t41531\n哇咕咪\t41532\n广东清溪小学\t41533\n省气象台\t41534\n大大猪好小小猪\t41535\n高低流\t41536\n明天20点\t41537\n不好色\t41538\n对啊咱们\t41539\n小鲨鱼来了你怕\t41540\n米芳青\t41541\n六七号\t41542\n福州西\t41543\n煤气炉\t41544\nhrisome\t41545\n阴部\t41546\n客居\t41547\nHoneyhow\t41548\n基吧\t41549\n几句话vnnnnnnnnnn\t41550\n庆生桌面贺图待吧\t41551\nOPPO9007\t41552\n宝骏630\t41553\n神将\t41554\n审理\t41555\n全光\t41556\n愁恼\t41557\nweae\t41558\n2007年9月28日\t41559\n亚索控\t41560\n真真假假假假\t41561\n就好了吧\t41562\n黄礼泉\t41563\n让偶\t41564\nweak\t41565\n餓\t41566\n餐\t41567\n夏雨荷\t41568\n一定可以\t41569\n朋友度秘\t41570\n重生虐白莲花的女主\t41571\n输入分钟\t41572\n床上去\t41573\n表明\t41574\n人到中年\t41575\n雄鸡\t41576\n路透社\t41577\n餃\t41578\n辛劳\t41579\n核突\t41580\n桑捏夸乐\t41581\n饭桌儿\t41582\n汤弟庙\t41583\n七老八十\t41584\n吴红英\t41585\n露头\t41586\n露天\t41587\n超越过\t41588\n鹿爷\t41589\n十八两个\t41590\n四滴\t41591\n大模\t41592\n20120421\t41593\n护发素\t41594\n好的快\t41595\n探始源\t41596\n口正\t41597\n百淘库\t41598\nGIVE\t41599\n权重\t41600\n夫妻俩\t41601\nrioylyoutlo\t41602\n12月31日\t41603\n佐佐\t41604\ntrust\t41605\nuqmququqmquqqmm\t41606\nLOL邪恶漫画\t41607\n罢训\t41608\n2009年以来\t41609\n外围\t41610\n14.3%\t41611\n此帖\t41612\n放道\t41613\n饰女\t41614\n外国\t41615\n我特别的爱你我想和你\t41616\n禽类\t41617\n零二二零二二三二九二四\t41618\n得意会\t41619\n恩德华东京华东京华东\t41620\n华总\t41621\n笨牛\t41622\n王皇沙皇\t41623\n外因\t41624\n休假\t41625\n法拉\t41626\n无奇不有\t41627\n三清山\t41628\n谢谢妞\t41629\n王杰坤\t41630\nkjjjkokkkhhjj\t41631\n第六空我错错错\t41632\n早八晚\t41633\n美妞美妞美妞\t41634\nST吉药\t41635\n是心非\t41636\n气魄\t41637\n感叹\t41638\nhhjii\t41639\n先先先先先先\t41640\n缭绕\t41641\nx2m1\t41642\n从百草园到三味书屋\t41643\n美国财政部\t41644\n谭玉亮\t41645\n郑朝阳\t41646\nworkingholiday\t41647\n强杆\t41648\n记错点\t41649\nvvbj\t41650\n强权\t41651\n在天之灵\t41652\n奥岛\t41653\n感受\t41654\n溜索\t41655\nvvbb\t41656\n王勇\t41657\n裹脚\t41658\n一美元\t41659\n美丽说还\t41660\n箫林\t41661\n软妹\t41662\nfaqing\t41663\n谢谢我懂了\t41664\n准备好不\t41665\n5月22日晚\t41666\n最后一练\t41667\nstop箱\t41668\nggdgg\t41669\ndhiphotosbaiducomxiaodupicitemb90e7bec54e736d1bc945e7b9c504fc2d562693ajpg\t41670\n假堡\t41671\n吃星美\t41672\n典故\t41673\n六十千米\t41674\n清真菜\t41675\n情侣们\t41676\nggdgh\t41677\nggdgi\t41678\n问答
\t41679\n沈润雨\t41680\ntfboysexo\t41681\nimmmmm2\t41682\n便秘你好便秘再见\t41683\n登海\t41684\n所有人\t41685\n竖笛\t41686\n棉花车\t41687\n滚开我最萌\t41688\ne网\t41689\n情理\t41690\n预封\t41691\n机甲类\t41692\nOdinHOoh\t41693\n货色\t41694\n对呀有眼\t41695\n切切切奇奇\t41696\n帐目\t41697\nhgghhhnnnnnnnnnnnnnnnnnnnnnnnnnnnnnhhhbhbhbnnnnnnnnnnnnnnnnn4555369\t41698\nmokn\t41699\n扁桃腺癌\t41700\n座记\t41701\n果仁\t41702\n4次\t41703\n魔性\t41704\n性侵犯\t41705\n厚街\t41706\n八十多\t41707\n比及\t41708\n没妹妹妹妹妹妹妹妹妹妹\t41709\nhfftt\t41710\nz46点\t41711\njooe\t41712\n98年\t41713\n智能化\t41714\njooh\t41715\n新世纪\t41716\njook\t41717\njool\t41718\njooo\t41719\n零8个月\t41720\n走冰\t41721\n十107\t41722\nbhgkfifgidsrgoduvjhfufjgjbvhhjgidhofthxfixidofo\t41723\n4271点5x1点\t41724\n黄思蓉\t41725\n感觉你好\t41726\n拉不啦\t41727\n正模\t41728\n赵祥宇\t41729\n什描\t41730\n斗地\t41731\n不远不近\t41732\n812本\t41733\n斗场\t41734\n阿黛拉\t41735\n沙奇滕\t41736\n汝汝\t41737\n仰天长叹\t41738\n月黍\t41739\n一小段\t41740\n暮色苍茫看劲松乱云飞渡\t41741\n几丁\t41742\n月黑\t41743\n伤疤\t41744\n一听计\t41745\n好啦智能机器人\t41746\n5258635889600542\t41747\n港代\t41748\n青桐的破天\t41749\n娇傲\t41750\n55655556667756574555555555\t41751\n别让\t41752\n乱性\t41753\n亲坏\t41754\n我不懂\t41755\n别再问我了我走了拜拜\t41756\n曹琨\t41757\n亡灵芝\t41758\nzhihmjcyv\t41759\n瑟瑟\t41760\n基拉拉拉\t41761\n哎呀你帮帮我吗东游\t41762\n别讲\t41763\n权秘\t41764\n玩不到\t41765\n压缩率\t41766\n父皇\t41767\n度秘一雷\t41768\n放学\t41769\n柯静雯\t41770\n盗墓者的忏悔\t41771\n淡化\t41772\n四十多岁\t41773\nhung\t41774\n禾源镇谷团村\t41775\n路路通\t41776\n伐咪\t41777\n老难\t41778\n王朔沟\t41779\n好了你是我学哥\t41780\n向启柱\t41781\n惯性\t41782\n水液\t41783\n刘嘉辉\t41784\n观光\t41785\n重归于\t41786\n别三日\t41787\n埋忠\t41788\n10点55\t41789\n10点54\t41790\n李志雅\t41791\n计佛\t41792\n福鼎\t41793\n10点50\t41794\n射洪县\t41795\n绵阳油\t41796\n再见死\t41797\n咯面\t41798\n甜味儿\t41799\n法式薄饼\t41800\n嗤之以鼻\t41801\n四百分\t41802\n欢欢喜喜欢欢喜喜\t41803\n丹皮客车\t41804\n李荣融\t41805\n联联\t41806\n登报\t41807\n六零三零\t41808\nurotilotloekekfof\t41809\n古典音乐\t41810\n16日-21日\t41811\n翟子豪\t41812\n你好丑好丑好丑好丑\t41813\n再说一次信不信\t41814\n挺有意思\t41815\n广播体操\t41816\n东江影\t41817\n6856653542585586565558555\t41818\n蔡太\t41819\n来呗\t41820\n3D版\t41821\n非床\t41822\npkpjt\t418
23\n老娘老子\t41824\n15天\t41825\n第五幕\t41826\n来呀\t41827\n该名\t41828\n数123\t41829\n季调后月环比折\t41830\n形力\t41831\n荒唐\t41832\n鸡皮疙瘩\t41833\n不乖我不要你\t41834\nSORG\t41835\n蓝蓝\t41836\n杨宇康\t41837\n光谷\t41838\n一个季\t41839\n民生银行\t41840\n形势\t41841\nClim\t41842\n瞬息万变\t41843\n求达洋R51笼\t41844\n当民\t41845\nppppiofi\t41846\n礼乐滩\t41847\n杨张辰\t41848\n崔建修\t41849\nuiik\t41850\nHunansmd\t41851\n看完\t41852\n看守\t41853\n亲一啵\t41854\n毛九\t41855\n皖南\t41856\n看家\t41857\n含意\t41858\nOmega\t41859\n2D\t41860\n哪般\t41861\n粑粑粑粑粑粑粑粑粑粑粑粑粑粑\t41862\n宝龙城市广场\t41863\n看客\t41864\n一面个\t41865\n爱乐透\t41866\n一模\t41867\nhttpbhiphotosbaiducomxiaodupicitemfcfaaf51f3deb48fb545c6abf71f3a292df57892jpg\t41868\n一樣\t41869\n马克思恩格斯\t41870\n一千公里\t41871\n宋雪薇\t41872\n二三零二\t41873\n减龄\t41874\n陆小雪\t41875\n動腰\t41876\n快乐吗\t41877\n99999感冒灵\t41878\n石子悦\t41879\n5256658\t41880\n颜华\t41881\n夜里\t41882\n巨幅\t41883\n终丢久\t41884\n快乐吧\t41885\n跨年\t41886\n圆圆年年月月\t41887\n大检查\t41888\n孙虹烨\t41889\n别见怪\t41890\nerdh\t41891\n合战\t41892\nd244\t41893\n手链\t41894\n合成\t41895\n十一天\t41896\n油据\t41897\n爫身\t41898\n夏娈娈\t41899\n玻璃幕墙\t41900\n08119\t41901\n不要不够\t41902\n到基层\t41903\n选优\t41904\n一米子\t41905\n第小题\t41906\nshiish\t41907\n克斯\t41908\n风声\t41909\n家里\t41910\njjajsl\t41911\n家野\t41912\n1213145516多678个\t41913\n物业费\t41914\n服法\t41915\n曹老师\t41916\nndkdbflfnf\t41917\n约会大作战五河琴\t41918\n谢娜棒\t41919\n十一夜\t41920\n40次\t41921\n捷豹小虫虫\t41922\n浮士德\t41923\n抑郁塬\t41924\n零七二零\t41925\n几十分钟后\t41926\n难能可贵\t41927\n超哥你真了不得呀得猪你勇敢的超哥\t41928\n市场化\t41929\n名将\t41930\n师炎\t41931\n大过天\t41932\n喜碧\t41933\n宗门\t41934\n普罗\t41935\n名少\t41936\n爱玩\t41937\n母指\t41938\n张纪航\t41939\n头儿\t41940\n埋不我愿意\t41941\n戏法\t41942\nGfgjccg\t41943\n散文\t41944\n置于\t41945\n画师\t41946\nlOK\t41947\n忘年\t41948\n包包车\t41949\n爱王\t41950\n打黑\t41951\n删掉\t41952\n力所能及\t41953\n心浑\t41954\n生存率\t41955\n袜筒\t41956\n断了\t41957\n关凤飞飞\t41958\n赛风\t41959\n雄性动物\t41960\n阿pou\t41961\n1970号\t41962\n自古\t41963\n男字田\t41964\n拉拉亚亚\t41965\n叶犬\t41966\n贪婪\t41967\n房产商\t41968\n一个人家\t41969\n靠不在\t41970\n冬雪\t41971\n尔嘉\t41972\n15~20分钟\t41973\n都一点\t41974\n要就\t41975\n塞女\t41976\n慢秒时光\t41977\n1点
10分\t41978\n牵涉\t41979\n小心点\t41980\n勾心斗角\t41981\n工程队修路第一工程队\t41982\n渔公\t41983\ndggfgghg\t41984\n机舂水沚\t41985\n能说\t41986\n485228552258441336211111222595553\t41987\nhjffkig\t41988\n尺超人\t41989\n91岁\t41990\n画美\t41991\n泸州老窖\t41992\nlohan\t41993\n洋流\t41994\nTXTSQL\t41995\n害躁\t41996\n无框眼镜\t41997\n炸鱼块\t41998\n艳罗艾\t41999\n错错大错错\t42000\n凯旋大酒店\t42001\n6444751\t42002\n斗龙战士主题曲\t42003\n师范\t42004\nvbbcg\t42005\n3000毫安\t42006\n惹是\t42007\n轻言\t42008\n36626272645553\t42009\n对部\t42010\n写错别字\t42011\n乡情\t42012\n热力公司\t42013\n晋a2f430\t42014\n15.39%\t42015\n也一样\t42016\n丑字\t42017\n福鼠\t42018\n刘子晗\t42019\n深圳证监局\t42020\n女飞人\t42021\n后背疼\t42022\n己土\t42023\n2055565\t42024\n16分\t42025\n匪亲\t42026\nsvim\t42027\n新民\t42028\n墨渊\t42029\n皮林\t42030\n马胡\t42031\n鲁冰\t42032\n唉一辉太郎\t42033\n旧址\t42034\n一千亿个\t42035\nsvip\t42036\n停完\t42037\n感覺[淚\t42038\n东楼\t42039\n梓珊\t42040\nngb\t42041\n我是你的主人我说你是猪你就是猪\t42042\n政要你给朕暖床\t42043\n应避免\t42044\n不死不休\t42045\n妞哈噻\t42046\n魏梦勤\t42047\n九零后九零后\t42048\n车牌楼\t42049\n停审\t42050\n爱恨你所\t42051\n一千亿万\t42052\n真不假\t42053\n坤哥\t42054\nvnvxfgw\t42055\nngo\t42056\n之策\t42057\n三姐姐姐姐小姨139999小\t42058\n13847525669\t42059\n等位\t42060\n一首。0k\t42061\n轻音宝\t42062\n淫声\t42063\n明天天明天天天天天天天天听听\t42064\n渥\t42065\n成立话\t42066\n牛科街\t42067\n编辑\t42068\n徐天翼\t42069\n一微米\t42070\n仙女们\t42071\n一盘儿\t42072\nreorder\t42073\n祥子\t42074\n人民公园\t42075\ny438\t42076\n煸炒\t42077\n十九分钟后\t42078\n秘十一初二\t42079\n切末\t42080\nsheououd\t42081\n托欠\t42082\n红鬼\t42083\n瑟希莉\t42084\njudj\t42085\njudh\t42086\n酸疼\t42087\n窝窝团武汉站\t42088\n东北虎\t42089\n标记\t42090\nmovieweek\t42091\njudf\t42092\n笑好好笑\t42093\n325千米\t42094\n贼有钱\t42095\njudy\t42096\n卢欣悦\t42097\n吹点\t42098\n哭泣\t42099\n1983年12月二十四\t42100\njudu\t42101\n零元\t42102\nkyk\t42103\n南城区\t42104\nkyo\t42105\n靖江市\t42106\n种马\t42107\nkyg\t42108\nkyf\t42109\n水路\t42110\n545453\t42111\n45.43\t42112\n算毛\t42113\n消费品\t42114\n重度\t42115\n豆沙关高速\t42116\n青斗\t42117\n秘儿\t42118\n蚜虫\t42119\n提莫\t42120\n大圈圈\t42121\n13016607964\t42122\n私处\t42123\n哄鬼\t42124\n村夫\t42125\n狙击类\t42126\n刘学立\t42127\n纯手\t42128\n必须然\t42129\ncondemn\t4213
0\n17.4万\t42131\n洗澡乐\t42132\n前行\t42133\n摸吻\t42134\n現在幾天不見妳們這些美女甜心們\t42135\n板栗饼\t42136\n呗塔\t42137\n楠楠的那片恩\t42138\n开年\t42139\n剧透\t42140\ndd98d1001e939014597a94c7cec54e736d1962ejpg\t42141\n203032030\t42142\n说一句晚安\t42143\n耐力\t42144\n名书\t42145\n日中国\t42146\n笑不喜欢\t42147\n大坏蛋继续人大坏蛋机器人大\t42148\n管饱了\t42149\n海狐臭\t42150\n45643426854\t42151\n这个春节\t42152\nSANGING\t42153\n鲍正宏\t42154\n10-15分钟后\t42155\n开幕\t42156\n高明义\t42157\nJeans\t42158\n祥林\t42159\n张乐琪\t42160\n耐劳\t42161\n左明村\t42162\n唔亦凡\t42163\n一家人口\t42164\n10笔\t42165\n花菊\t42166\n精华贴\t42167\n郭雨鑫\t42168\ngpka\t42169\n认不得\t42170\nstylestolethit\t42171\n花菜\t42172\n告诉我是男\t42173\n879528\t42174\n憨憨\t42175\n洗近平\t42176\nWhite\t42177\n天灾人祸\t42178\n王君熙\t42179\n一二y=www\t42180\n颜可\t42181\n一一一个月\t42182\n就拜\t42183\n真机\t42184\nFvdccsccsdzv\t42185\n网络电视\t42186\n黄永玉\t42187\n李悦鑫\t42188\n2598379479\t42189\n温宝华\t42190\n曼霞\t42191\n资金\t42192\n油子\t42193\n356\t42194\n声誉\t42195\n写成\t42196\n费雪\t42197\n资金净流入\t42198\n写戏\t42199\n乖孙乖孙\t42200\n奴才们\t42201\n攻城略地\t42202\nAV被\t42203\n究极\t42204\n李在猜\t42205\n速速\t42206\n还在面说\t42207\n后园\t42208\n船员们\t42209\n陈光莉\t42210\n乱乎\t42211\n早挂\t42212\n天市\t42213\n六角形\t42214\n花儿伦\t42215\n果腹\t42216\n摸头\t42217\n快唱一首\t42218\n妙妙妙妙妙\t42219\n神力\t42220\n神功\t42221\n维生素D\t42222\n好無聊該\t42223\n1ng\t42224\n40多天\t42225\npsboy\t42226\n邓超堡\t42227\n这样话\t42228\n画人\t42229\ntomityou\t42230\n你一个人\t42231\n挎包\t42232\n黄科斌\t42233\nOJL\t42234\n邓超堂\t42235\n黄海之滨\t42236\n十九页\t42237\n明柳\t42238\n滩涂\t42239\n搭伙\t42240\n大天才\t42241\n猪猪猪啊猪啊猪啊猪啊猪啊猪\t42242\n弗莱\t42243\n孙丽雅\t42244\n况朝燕\t42245\n电信运维部\t42246\n人烟\t42247\n我是说你是人的话\t42248\n烤瓷\t42249\nEXOL\t42250\nSecurity\t42251\n松散\t42252\n大神们\t42253\n巴神PS\t42254\neyidjo\t42255\n可美\t42256\n风平浪静\t42257\nhidhe\t42258\n圳台\t42259\n经查\t42260\n孙自从\t42261\n爱你讨厌讨厌讨厌讨厌讨厌\t42262\n白热化\t42263\n润侠\t42264\n客流\t42265\n可羞\t42266\n呜呜声你是我的好朋友那那那那啦啦啦\t42267\n习近平\t42268\n李翔\t42269\n养老\t42270\n庞令欣\t42271\n46123元\t42272\n苏越\t42273\n改造\t42274\n华晨度\t42275\n苏超\t42276\nstdft\t42277\n张赛伦\t42278\n三公里\t42279\n韩国丽水世博会#Nichkhun#组图7P\t42280\n希荒\t42281\n多大方爱
你呀那你\t42282\nFfhjff\t42283\n同窗\t42284\n我晓得\t42285\n干符合\t42286\n宝山区\t42287\n脱开\t42288\n共同生命\t42289\n猫毛\t42290\n谢文胜\t42291\n机率\t42292\n凤凰影\t42293\n大同站\t42294\n88586598584552348888455550488\t42295\n让劲\t42296\n一三三\t42297\n上海能源\t42298\n方位\t42299\n方体\t42300\n死板小萌咯2070\t42301\n违反\t42302\n咻声\t42303\n范老板\t42304\n哀识\t42305\n异世家\t42306\ncopy\t42307\n受罪\t42308\n嫁人\t42309\n纳尼纳尼纳尼\t42310\n调查报告\t42311\n刮目相看\t42312\n受罚\t42313\n橙汁椰奶\t42314\n季节性\t42315\n有你真讨厌\t42316\n粘液\t42317\n袁微\t42318\n别块开心\t42319\n小坝子\t42320\n爆揍\t42321\n先进性\t42322\n费心\t42323\nkgjfjf\t42324\n夹角\t42325\n九十九八十九七十九六十九五十九四十九三十九二十九五十九\t42326\n九七年\t42327\n诸子百家\t42328\n我的屌\t42329\n被否\t42330\n常宁市\t42331\n书包\t42332\n何人\t42333\n说不说话\t42334\n下午14点\t42335\n临终\t42336\n韩束亮\t42337\n胡彬峰\t42338\n黑痣\t42339\n精问\t42340\n被吊\t42341\n三八二十四\t42342\n离不我你\t42343\n肩头\t42344\n苹果六普拉\t42345\n么么哒我喜欢你这个度秘度秘\t42346\n温都水城滑雪场\t42347\n了不骗你了我是girl\t42348\n不好惹\t42349\n房芷\t42350\n鸭嘴兽\t42351\n潘雨欣\t42352\nhnuj\t42353\n55685\t42354\n宅家\t42355\nhttpmbiqukucom228801666637\t42356\n春晓春眠不觉晓处处闻啼鸟夜来风雨声花落\t42357\n窑子\t42358\n有你别骗我了你给我钱\t42359\n邹旭\t42360\n脏乱差\t42361\n宅宅\t42362\n美国人\t42363\n天鸡\t42364\n挑选\t42365\n安塞登\t42366\nAA泌忒\t42367\n某人我真的不奢求你了=\t42368\n说不吗\t42369\n笑笑笑\t42370\nFirstatall\t42371\n班级\t42372\n名人酒店集团\t42373\n3000多万亩\t42374\n1月2十几日\t42375\n挑逗\t42376\n猛烈\t42377\n用色\t42378\ngugfy\t42379\n薄饼餐厅#La\t42380\n茅盾文学奖\t42381\nsszzzzzzz\t42382\n早上2点\t42383\n笨求\t42384\n济宁\t42385\n天来一次\t42386\n姓姓\t42387\n15666916096\t42388\n董秘镜\t42389\n袁雅琦\t42390\ngugfg\t42391\n被美葱震地眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱眼花缭乱\t42392\n户外\t42393\n四百多分\t42394\n五十铃\t42395\n辛亥革命\t42396\n猫KApple\t42397\n许还得\t42398\n渊哥\t42399\n筒裙\t42400\n有鸣\t42401\n拍拍\t42402\nosso\t42403\n拍拖\t42404\n甲骨文球馆\t42405\nhttppinyincne19891\t42406\nwtosei\t42407\n入藏\t42408\n问风水小一个猪嘞\t42409\n观念叨咕噜咕噜\t42410\n失血\t42411\n波波波波波波波波波波波\t42412\n17号至22号\t42413\nH\t42414\n拜门\t42415\n壮语\t42416\n旷空旷\t42417\n官方人\t42418\n伏特加\t42419\n杜宇秘\t42420\n传承\t42421\n我的妈咪回来了我的妈咪回来了我妈咪\t42422\n我喜欢一夜情\t42423\n急求\
t42424\n宋抒情\t42425\n真心声音\t42426\n丧偶\t42427\n胖矮子\t42428\n2145\t42429\n2144\t42430\n2142\t42431\n装蛋\t42432\n几辈\t42433\n那好唻\t42434\n哀求片\t42435\n欢呼雀跃\t42436\n讲一\t42437\n呀呀呀呀呀\t42438\n书丛\t42439\n富贵\t42440\nHhfccc\t42441\n降幅\t42442\n装卸\t42443\ntsboss\t42444\n了聊\t42445\n13910163546\t42446\n甩干机\t42447\n小灵通\t42448\n软水\t42449\n奥小运丽萍思撒\t42450\n代代相传\t42451\n风花\t42452\n倒刺\t42453\n公升\t42454\nkej\t42455\n李云迪\t42456\n这等\t42457\n智能人类机器人\t42458\nqwem\t42459\n中旅中旅\t42460\n邻家女朋友\t42461\n史学家\t42462\n杨奎伟\t42463\n董海莹\t42464\n回答男\t42465\n性技\t42466\n低息\t42467\nmyseyou\t42468\n毕业\t42469\n豪放词派\t42470\n黄雀在后\t42471\n肯定句\t42472\n堡度\t42473\n接花\t42474\n处理器\t42475\nable3\t42476\n户头\t42477\n12000岁\t42478\n毕上\t42479\n脉冲\t42480\nItis\t42481\n好了不骗你了我是个女\t42482\n樱花东街\t42483\n耻骨\t42484\n申花队\t42485\n一伙人\t42486\n制造者\t42487\n九十多兆\t42488\n夏凌云\t42489\n集群点\t42490\n小松鼠之大陆分裂2\t42491\nhttpahiphotosbaiducomxiaodupicitem4afbfbed\t42492\n几正阳\t42493\n核舟\t42494\nwohen\t42495\n文昌文新园\t42496\n模仿\t42497\n堡店\t42498\n闲先经过\t42499\n山中学\t42500\n打燃\t42501\npot\t42502\n是我的然然\t42503\n四条半\t42504\npop\t42505\n对唱歌\t42506\n闲变\t42507\n103103\t42508\n103102\t42509\npoh\t42510\n陈濠\t42511\n你的女你的女友\t42512\n被冻结\t42513\npoe\t42514\nixix\t42515\n真好不\t42516\n1月8日\t42517\nnni\t42518\n覅覅\t42519\n去线\t42520\n中国国家核安全局\t42521\n50%\t42522\n洪三富\t42523\n有礼\t42524\n公休\t42525\n12286676\t42526\n843491543464575494\t42527\n508\t42528\n鲁塞\t42529\n糖果\t42530\n505\t42531\n502\t42532\n503\t42533\n500\t42534\n501\t42535\nwunian\t42536\n再也不\t42537\n版字\t42538\n子女人\t42539\n研发部\t42540\n易祥千玺\t42541\n关停\t42542\ntiger\t42543\n几个子儿\t42544\n脂树\t42545\n凤清医\t42546\n3点11分\t42547\n南半球\t42548\n问话\t42549\n巨变\t42550\n问询\t42551\n上茅房\t42552\nmount\t42553\n50e\t42554\n5项\t42555\n氧化钠\t42556\n痛痛痛痛\t42557\n50w\t42558\n李盼盼\t42559\nЁхЁхъ\t42560\n錄音室\t42561\n工钱\t42562\n苏孝锋\t42563\n别总这一句话\t42564\n陆承良\t42565\n小车车\t42566\n桂芳冉冉\t42567\n的路我真的很环年\t42568\n清白\t42569\n机猪\t42570\n一片一家一片心\t42571\n来不来\t42572\n妙鹃\t42573\n为不是\t42574\n郝立田\t42575\n卫兰亭\t42576\n次激将法护理\t42577\n庆云四中\t42578\ntoyfair\t42579\
n瑞巴蒂\t42580\n女里\t42581\ncetomy\t42582\n梦着和\t42583\n唱错了错\t42584\nMarat\t42585\n两条棍\t42586\n赏雪\t42587\n队服\t42588\n沾沾\t42589\nilaalax\t42590\nbussiness\t42591\n远热睡呼车少求就\t42592\n李唰\t42593\n彭嘉伟\t42594\n蜣螂\t42595\n第二版\t42596\nSoul\t42597\n能不要\t42598\n海内存知己\t42599\n一会儿个\t42600\n讷河\t42601\n高小法\t42602\n88858343454454\t42603\n潜质\t42604\n恶性\t42605\n判若两人\t42606\n打谢谢你打扰\t42607\n顾苏夏\t42608\n李唐\t42609\nno\t42610\nison\t42611\n292美元\t42612\n自首\t42613\n安特卫普火车站\t42614\n读懂\t42615\n一嘻嘻嘻\t42616\n一块一百\t42617\n103.3[\t42618\n金娣\t42619\nbrisbane\t42620\n地下一个好儿\t42621\nxefdsri\t42622\nne\t42623\n拱形\t42624\n不聊号\t42625\n巧解\t42626\n回事\t42627\n热原\t42628\n一四级\t42629\n减负\t42630\n昨晚两点\t42631\n吴佩玥\t42632\n叫找\t42633\n身体力行\t42634\nfs700\t42635\n244363631多次\t42636\n仅且\t42637\n六晚上\t42638\n妙妙\t42639\n近平\t42640\n近年\t42641\n航电\t42642\n氯化镁\t42643\n巴中市\t42644\n1月10日至1月底\t42645\nhdjdf\t42646\n付美美\t42647\n回京\t42648\n热去\t42649\n前五\t42650\n登机箱\t42651\n1663331321\t42652\n诺迪斯卡\t42653\n家居服\t42654\n奥校\t42655\n真趣\t42656\n香草精\t42657\n前些\t42658\n玛峰\t42659\nnp\t42660\n编导题\t42661\n互子歹箕惫\t42662\n安嗯\t42663\nnq\t42664\nnbfyu\t42665\n醒来天黑了你\t42666\nnr\t42667\n05克\t42668\n洗洗口\t42669\n法国核能安全局\t42670\n一万五\t42671\n开不通\t42672\n澳交所\t42673\nvupv\t42674\n前人\t42675\n素滴\t42676\n静虑\t42677\n听会\t42678\n活生生\t42679\n听众\t42680\nnw\t42681\n的呀呀呀呀呀呀\t42682\n1起\t42683\n屈晓帅\t42684\n大茅山\t42685\n民主运动\t42686\n减脂\t42687\n十一五\t42688\n两千多兆\t42689\n我真的好爱他好爱好爱\t42690\n像学\t42691\n院子\t42692\n尊者\t42693\n柴胡\t42694\n意外国\t42695\n切克\t42696\n隐痛\t42697\n梦多\t42698\n不织布机\t42699\n联系人\t42700\n公安局\t42701\n第六小题\t42702\n时尚花园\t42703\n地皮\t42704\n被害死\t42705\n切入\t42706\n百变联盟\t42707\n我日恁忙\t42708\n十一人\t42709\n留籽\t42710\n上一层店\t42711\n裘友美\t42712\n霸王别姬\t42713\n尚雨欣\t42714\n六点63\t42715\n幺幺幺点幺六幺\t42716\n洛阳市\t42717\n7点20\t42718\n矣晓卉\t42719\n静坐\t42720\nidiiisisiw\t42721\n萌萌哒机器人我也我只要你\t42722\n低能儿\t42723\n爸爸爸爸爸爸爸爸爸爸\t42724\n就是女奴呢么哪牟年牟年\t42725\n够了不懂\t42726\n6941885981265\t42727\nstand\t42728\n兄妹恋\t42729\n红沙瓤\t42730\ntockingo\t42731\n一叶障目\t42732\n我真的是不想和你聊了你别理我\t42733\n我的我的名字\t42734\n二
八幺幺幺九九\t42735\n芦笋\t42736\nobandenandede\t42737\n打醒了\t42738\n纪元\t42739\n93十四\t42740\n中山市\t42741\n叫尖嘴\t42742\nyou度\t42743\n笔名度\t42744\n欧欣怡\t42745\nfhyrgv\t42746\n你做我的女朋友\t42747\nMALIBU#加州\t42748\nyibinduoniye\t42749\n56666655\t42750\n螺度\t42751\n清河克拉\t42752\n太不要脸了你\t42753\n祟祟\t42754\n李仪露\t42755\n好嘛那快点\t42756\n阅兵\t42757\n贈\t42758\n罗金平\t42759\n典礼\t42760\nVlojh\t42761\n检索\t42762\n几点钟\t42763\n世族\t42764\n超深\t42765\n吴洪喆\t42766\n07:32:00\t42767\n我恨你我讨厌你我讨厌你讨厌你\t42768\n点儿谱\t42769\n供方\t42770\n岩浆\t42771\n安秀敏\t42772\n什么样\t42773\n075523467301\t42774\n334455\t42775\n琼岛\t42776\n7758855201314\t42777\n草图\t42778\n徐梁\t42779\nxggbggcc\t42780\n两八\t42781\n亲爱的你爱不爱我\t42782\n睡看\t42783\n好不好不好不好\t42784\n那个滴\t42785\n五百亿\t42786\nteecuus\t42787\nQ方块Qbism\t42788\n捐赠\t42789\n铝镁\t42790\nlichen\t42791\n山重水复疑无柳暗花明又一村\t42792\n谢黎坤\t42793\n第8分钟\t42794\n嗯遵命我\t42795\n哈尼族\t42796\n睡眠\t42797\n谁相信\t42798\n海面\t42799\n如果这样的话\t42800\nAT幺\t42801\n保佑思思\t42802\ngzhjafrjr77735173t3921hs\t42803\nDel\t42804\n你好我叫八一号\t42805\n营部\t42806\n早晨\t42807\n东神\t42808\n設計師\t42809\n3.2万\t42810\n徒弟\t42811\n二零九六\t42812\n82年\t42813\n哭了作\t42814\n越王\t42815\n几段\t42816\nwdfggvv\t42817\n要事\t42818\n一幅幅\t42819\n眼镜盒\t42820\n哭了你\t42821\n定海妹\t42822\n嗯咏麟\t42823\n0蟗\t42824\n1米\t42825\n有饭说\t42826\n东祥\t42827\n来了摸\t42828\n米先先\t42829\n铭记\t42830\n鼓掌\t42831\n温柔柔\t42832\n在何方\t42833\nduoh\t42834\nyouirem\t42835\nall\t42836\n大越\t42837\n嗯舞者\t42838\n法庭\t42839\n近墨者黑\t42840\n光芒万丈\t42841\n大超\t42842\n法度\t42843\n游戏杆\t42844\n五小时\t42845\n流血\t42846\n萨米特\t42847\n爱密度\t42848\n埃波里\t42849\nduor\t42850\n流量花\t42851\nhmmn\t42852\n流行\t42853\nhmmm\t42854\n育琼\t42855\npgughg\t42856\n阿飞天\t42857\n黄土\t42858\n279289486509\t42859\n好好好好好好好好好\t42860\n3月23日\t42861\n咱老百姓\t42862\n3.3万元\t42863\n猪呀式\t42864\n浓妆艳抹&amp\t42865\n安永\t42866\n思想\t42867\n东南风风太小了西北风风大\t42868\nΨ\t42869\nΩ\t42870\n星驰版\t42871\n我老子\t42872\n张晓刚\t42873\n李青虫\t42874\nΠ\t42875\nΡ\t42876\ns6万\t42877\nΤ\t42878\n能可\t42879\nΦ\t42880\nΧ\t42881\nθ\t42882\nι\t42883\nκ\t42884\n到底是谁\t42885\nμ\t42886\nν\t42887\nξ\t42888\n柯震东\t42889\nα\
t42890\nβ\t42891\nγ\t42892\nδ\t42893\nε\t42894\nζ\t42895\n心想说\t42896\n红薯\t42897\n皇岗村\t42898\n发生条\t42899\n诺金\t42900\n雅兰\t42901\n慢your\t42902\nΘ\t42903\n能发\t42904\ngiiggiugni\t42905\nΞ\t42906\nΟ\t42907\nΑ\t42908\nΒ\t42909\nΓ\t42910\nΔ\t42911\n平局\t42912\n13437805969\t42913\n防蚊剂\t42914\n4煜\t42915\n争鸣\t42916\n木已成舟\t42917\n威尔士\t42918\n独特\t42919\n判定\t42920\nSean叔\t42921\n秘鲁\t42922\naoeiuUL\t42923\n朱华义\t42924\n林允儿\t42925\n度蜜月我需要你\t42926\n好吧在线\t42927\n下崽\t42928\n清姐\t42929\ni73aabc\t42930\n乔博文\t42931\n苏绮琪\t42932\n投向\t42933\nduia\t42934\n蜂蜜度秘\t42935\n公交车\t42936\n鹤城乡\t42937\n西人\t42938\n2095347028\t42939\n天外村\t42940\n13035081772\t42941\n缩减\t42942\nmy18ta\t42943\n差亲\t42944\nyygchvnj\t42945\n浙江勒\t42946\n12月29日\t42947\n差二\t42948\n苏非派\t42949\n差事\t42950\n缩出\t42951\n1T1\t42952\n熊日天\t42953\n电闸\t42954\nbekos\t42955\n1844\t42956\nVI花儿\t42957\n三百多块\t42958\n孙俊超\t42959\n无爱\t42960\nwuli小蜜蜜\t42961\n36克\t42962\n差五\t42963\n太阳能光电板\t42964\n白曲\t42965\nYoga\t42966\n耶鲁\t42967\n大大哒哒哒哒哒哒哒哒哒\t42968\n荷尔蒙\t42969\n壮大\t42970\n人缘\t42971\n到点\t42972\ncAo\t42973\n挂起\t42974\nAIGLE\t42975\n炊饼\t42976\n不用来回\t42977\n杀人狂\t42978\n斯人偶\t42979\n20多家\t42980\n金融市场\t42981\n滚滚滚滚滚滚滚滚\t42982\n无根豆芽\t42983\n苏丙南\t42984\n漠河\t42985\n杀人狗\t42986\n十五待\t42987\n出门儿\t42988\n鞠躬尽瘁\t42989\n想不过\t42990\n空洞\t42991\n乱填\t42992\ncA0\t42993\n说话你的名字我在街上我要你\t42994\n悲叹\t42995\n变化\t42996\n唐太宗\t42997\n2345677654321\t42998\n意识形态\t42999\n说英\t43000\n春川\t43001\n赖艳春\t43002\n多爽\t43003\n非常场\t43004\n99999x99999\t43005\n受热\t43006\n两天之前\t43007\n零六六几\t43008\n木块\t43009\n1267863588965\t43010\n薄书迪\t43011\n爱爱游戏\t43012\n军医\t43013\n4abcd\t43014\n颠波\t43015\nCOS文\t43016\n迷昏\t43017\n小兔兔\t43018\n我醉啦啦的爱你\t43019\napoue\t43020\n快克\t43021\nZoemy\t43022\n何先生\t43023\n关拜拜\t43024\n药吧\t43025\n唐礼燕\t43026\n彭昱畅\t43027\n我的女仆\t43028\nNore\t43029\n皮筋\t43030\n张同飞\t43031\n麽机\t43032\n快八\t43033\nXDDDD\t43034\n地机\t43035\n煤价\t43036\n养母\t43037\n简化\t43038\n课程序员\t43039\n咕咚品咖啡\t43040\n胜于\t43041\nAngelepvBaby\t43042\n跨子妹\t43043\nC110咖1590浙江纪佳\t43044\nhtree\t43045\n火车票\t43046\n82kg\t43047\n心有所属\t43
048\n那你好好学不hcvgcvhhhvggv\t43049\n掌嘴\t43050\n3y2\t43051\n年味儿\t43052\n撑船\t43053\nbedab64812813b3f036afc379311e6ejpg\t43054\n当世界只有我和你\t43055\nstimetitouok\t43056\n查亚\t43057\n孩子\t43058\n臭猪八戒\t43059\n做我的亮亮\t43060\n御侮\t43061\n羊眼\t43062\n丁蕊\t43063\n唐山市\t43064\n一曲离歌终过\t43065\n周楚源\t43066\n范一\t43067\ndkryry\t43068\n114943\t43069\n闭上眼睛\t43070\n1.9%\t43071\n西餐厅\t43072\n方便面\t43073\n女疯子\t43074\n展示\t43075\n得了解\t43076\n鸭梨\t43077\n1.95\t43078\n游商\t43079\nstyle\t43080\n神把\t43081\n天井湖\t43082\n三金星\t43083\n深圳市委\t43084\n网关\t43085\nfutvhggFgch\t43086\n床恋\t43087\n度杖\t43088\n良婿\t43089\n好呀好呀\t43090\n比昌\t43091\n极佳\t43092\n后裔\t43093\n慢点\t43094\nvvgvv\t43095\nyfbj\t43096\n实验性\t43097\n银河奥特曼\t43098\n上万种\t43099\n建阳\t43100\n张卫健\t43101\n最前头\t43102\n走开走开走开我讨厌讨厌的应可以你讨厌你讨厌你讨厌你咪咪咪咪咪咪咪咪咪\t43103\n工员们\t43104\n到公路再聊\t43105\n架不告诉\t43106\n黑呗\t43107\n20021209\t43108\n男女有别\t43109\n亚克西\t43110\ntfboyfaf\t43111\n军校\t43112\n晚会\t43113\n小密小密小密\t43114\n很高\t43115\n35天\t43116\n图频\t43117\n凵甸\t43118\n扩充\t43119\n讲敢\t43120\niwjf\t43121\n贪睡\t43122\n雪岭熊风还有救护回家\t43123\n真厉害\t43124\n狂想\t43125\n度来\t43126\n5fx\t43127\n忙叨\t43128\n绿光\t43129\n素不素不开\t43130\n大小姐你听我的话片天没\t43131\n透心凉心飞扬\t43132\n350484128257\t43133\n钱逸坤\t43134\n由此可见\t43135\n烟雨村\t43136\n街边\t43137\n借用\t43138\n棉毛衫\t43139\nvcfgbjkkookkk\t43140\n李思宸\t43141\n银盘\t43142\n88888888888888888888888888888888888\t43143\n轰炸机\t43144\n一百一百一百块\t43145\n整固\t43146\n红白艺能大赏\t43147\n五菱学校\t43148\n狂情\t43149\n阿里古丽\t43150\n简不\t43151\n把劲\t43152\nCosmo\t43153\n心孤单\t43154\n撸条龙\t43155\n求徕\t43156\n长生不老钱\t43157\n求得\t43158\n坏人了那你那你怎么不认识我吗我错过你了不我不能这样说人\t43159\n444444445747587\t43160\n金沙江\t43161\n小女子\t43162\n25集\t43163\n李响\t43164\n柔弱感\t43165\n堪舆学\t43166\n呵当\t43167\ngabaaajttggjjaa\t43168\n北京地铁九号线\t43169\n再婚\t43170\n肉饼\t43171\nxxx4399\t43172\n王雯婧\t43173\n一百二十七块\t43174\n超霸\t43175\n一上有你\t43176\n小女孩\t43177\n鄢陵县\t43178\n撼生\t43179\nHugd\t43180\n旧作\t43181\n及格\t43182\n晨曦\t43183\n二卡\t43184\nfeel\t43185\nfxgf\t43186\nHugo\t43187\nHugh\t43188\n季咯\t43189\nfeet\t43190\nuseEnglish\t43191\n6.\t43192\nfeew\t43193\n呼喊\t43194
\n神级\t43195\nHugs\t43196\npmp\t43197\n陈恒\t43198\n海陆空\t43199\n呃网网\t43200\n水气体\t43201\n弄踏\t43202\n二十\t43203\n祖词\t43204\n冯宇\t43205\n利器\t43206\n美剧\t43207\n安阳市第七中学\t43208\n中道\t43209\n小家伙们\t43210\nwearefami\t43211\nuxfuuu\t43212\n东莞南\t43213\n恶我是\t43214\n一下去二\t43215\n欧恩觉嗯N85\t43216\n鸭子童子鸡\t43217\n〇〇\t43218\nmegalomania\t43219\n221张\t43220\n经说\t43221\n人转\t43222\n人轮\t43223\n矮小\t43224\n夏靓雪\t43225\n桥隧\t43226\n累心\t43227\n永力公司\t43228\n人民医\t43229\n政府执政党\t43230\n想不想到\t43231\n务必\t43232\n王心恬\t43233\nhgfsocghjgghhhhjjkkkjjejhgyujjjjijhhhhhyuy\t43234\n草别忘我你\t43235\n十三股\t43236\n仙四#QAQ\t43237\n有所以我不想\t43238\n估\t43239\n不定额\t43240\n伴\t43241\n清秀\t43242\n62\t43243\n伸\t43244\n星星泪\t43245\ndjtx\t43246\n挂聊\t43247\n伢\t43248\n传\t43249\n伦\t43250\n77c\t43251\n伤\t43252\n生活方式\t43253\n伪\t43254\n恳求\t43255\n伯\t43256\n变化无常\t43257\n伓\t43258\n真心術\t43259\n休\t43260\n伐\t43261\n众\t43262\n伕\t43263\nab64034fd566b674a8c379310a551d1cjpg\t43264\ncheck\t43265\n会\t43266\n伙\t43267\n优\t43268\n伟\t43269\n伞\t43270\n企\t43271\n伀\t43272\nRed\t43273\n顾老师\t43274\n伊\t43275\n想得美\t43276\n伏\t43277\n伍\t43278\n九个小时\t43279\n775\t43280\n翻身\t43281\n777\t43282\n亿百\t43283\n送架\t43284\n欲求\t43285\n费费费费费费牙\t43286\n772\t43287\n779\t43288\n778\t43289\n追逐\t43290\n77%\t43291\n几多少\t43292\n大一个\t43293\n尿布\t43294\n神志不清\t43295\n比赞\t43296\n要不我不走\t43297\n惜本\t43298\n比赛\t43299\nyammy\t43300\n445558666585\t43301\n孟德瑄\t43302\n交会\t43303\nyoumas\t43304\n安全垫\t43305\n9999998899999988亿元\t43306\n龙虎少年队\t43307\n乔恩\t43308\n大一下\t43309\n再梦西游\t43310\n万国鹏\t43311\n胡思乱想了真是\t43312\ngungungungun\t43313\n外务省\t43314\n別鳥\t43315\nrtd\t43316\njourneys\t43317\n太好了我真\t43318\n阿迪达斯\t43319\n茂沙\t43320\n陈芷冉\t43321\n太厉害\t43322\n我也不不知道\t43323\n东南网\t43324\nrty\t43325\n日日日\t43326\n时尚人\t43327\n地方话\t43328\n镇定\t43329\n统治阶级\t43330\n妙心\t43331\n每周\t43332\n胡不发\t43333\n察哈尔右翼后旗\t43334\n一百十八\t43335\n徐美林\t43336\n区内\t43337\n镇守\t43338\n利比\t43339\n彻夜\t43340\n4894840\t43341\n俯拍\t43342\n五六个\t43343\n吉炜\t43344\n搔货\t43345\n真舍\t43346\n喊道\t43347\n难以理解\t43348\n叫破烂\t43349\n表达\t43350\n愛給\t43351\n显然兄弟会\t43352\n黄金
尼泊尔\t43353\nnononononononli\t43354\n70年前\t43355\n引导\t43356\n你的笑\t43357\n三十五个\t43358\n撒币\t43359\n相相\t43360\n彩灯\t43361\n足疗\t43362\n道教\t43363\nsrtt\t43364\n很搞笑\t43365\n一次九五折\t43366\n餐餐\t43367\n套机\t43368\n吃垮怜\t43369\n煤层气\t43370\n么光\t43371\n跳槽\t43372\n频影\t43373\n猪号\t43374\n么元\t43375\n国旅\t43376\n2073884696\t43377\n0.01%\t43378\n咫尺天涯\t43379\n9998555223\t43380\n刷秘\t43381\n甜品类\t43382\n国旗\t43383\n猪叫\t43384\n李建辉\t43385\n363644534405454543644663565746455377\t43386\n四十米\t43387\n彩旗\t43388\n星薇\t43389\n语文版\t43390\n463号\t43391\nHopkins\t43392\n花木蓉\t43393\n嘉豪\t43394\n杜宇飞\t43395\n太天真\t43396\n度城\t43397\nTIC\t43398\n上原谅你\t43399\n言以对\t43400\n空城小黄人歌\t43401\n一滴眼泪\t43402\nffjjggjrutgri\t43403\n别说谎\t43404\n恋孕文\t43405\n碧婷\t43406\n丽江市古城区公安局\t43407\n一诺千金\t43408\n自卑神马\t43409\n心跳声\t43410\nhsbshbs\t43411\n42535\t43412\n肥噶\t43413\n2555558888086\t43414\n阿特拉斯猫\t43415\npeakenglish\t43416\n槽牙\t43417\n正月廿九\t43418\n450只\t43419\n贡阳\t43420\n基本机\t43421\n张瑞林\t43422\n诸君\t43423\n哼哼僵尸\t43424\n0593\t43425\n228778\t43426\nSuperman\t43427\ntgs\t43428\ntgp\t43429\nghggggg\t43430\ntgt\t43431\ntgu\t43432\n世十里桃花\t43433\nfgffytg\t43434\n1倍\t43435\n等等等等等等等等等等等等等等等等等等等等等等等等等等等等哒哒哒哒哒哒哒哒哒哒哒登\t43436\n抢先\t43437\ntga\t43438\ntgf\t43439\ntgg\t43440\ntgd\t43441\n黄雪芬\t43442\n公寓区\t43443\n花栗\t43444\n神经质\t43445\n宋丹宁\t43446\n王说话\t43447\n两者\t43448\n费刚\t43449\n飞飞过来\t43450\n冲淡\t43451\n张好不\t43452\n生物体\t43453\n华盛顿州\t43454\n热带\t43455\n花根\t43456\n花样\t43457\n张江\t43458\n选用\t43459\nBALI\t43460\n钉书机\t43461\n荣信\t43462\n下肢\t43463\n第一把\t43464\n小大白熊\t43465\n面红耳赤\t43466\n约瑟夫·利克莱德\t43467\n里弄\t43468\n逃费\t43469\n亚豪塞\t43470\nTttttttttttttttttttttttt\t43471\n那小余\t43472\n六十几分下\t43473\n不懂装懂\t43474\n大理二中\t43475\n香都\t43476\n护短\t43477\n格斗\t43478\n烧酒\t43479\n有福气\t43480\n下肚\t43481\n七头十＾十卜哔米屮大忄丈十小冫小丈冫十＾＾汴上卜卜人＋十十卜冫＼乂辷\t43482\n兵马\t43483\n两小有问的我在英多\t43484\n对与错\t43485\n经济体\t43486\n漕宝路\t43487\n120集\t43488\n明晰\t43489\n喽莫西莫西\t43490\n先睹\t43491\n卡兰卡\t43492\n消费主义\t43493\n称号\t43494\n大篆\t43495\n王春林\t43496\n破铜烂\t43497\n肩周炎\t43498\n笑话因为你太傻了我讨厌你很讨厌讨厌讨厌讨厌\t43499\n屌痴\t43500\n以次
\t43501\n小那你先睡觉吧我懂了找你行\t43502\n那你人长什么样和你兄弟\t43503\njgdj\t43504\n不成器\t43505\n尊享\t43506\n13135044517\t43507\n键盘机\t43508\n海逸诗阁\t43509\njgdb\t43510\njgdd\t43511\n你是谁兔兔\t43512\n魏子扬\t43513\n自慰棒\t43514\n日光灯\t43515\n我问你你是世界上我是世界上啊最美的人\t43516\n喝喝酒\t43517\n喔喔\t43518\n男技\t43519\n米掐\t43520\n总计\t43521\n唉羊羊\t43522\n佛陀\t43523\n钱先\t43524\n飘雨半山\t43525\n钱元\t43526\n唱支\t43527\n亚好\t43528\n囧╯▂╰\t43529\n死不投币\t43530\nMARROITT\t43531\n夜勤病栋\t43532\n大幂幂\t43533\n小小的度秘萌萌哒度秘度秘度秘\t43534\n黑道道\t43535\n不同凡响\t43536\n作弊\t43537\n库拜旦\t43538\n10秒后\t43539\n1agj\t43540\n肉座\t43541\n毒龙\t43542\n看图猜成语独旗\t43543\nbjy\t43544\n主词\t43545\n奢俭\t43546\n倒头\t43547\n小范\t43548\n三峡集团\t43549\n就里\t43550\n小茉\t43551\n大家好我叫臭臭\t43552\n2K540\t43553\n造纸厂\t43554\n流星石\t43555\n小茜\t43556\n哈西路米库\t43557\n特么\t43558\n线条\t43559\n主误\t43560\n七万元\t43561\n主语\t43562\n满月酒\t43563\n共鸣者\t43564\n前一日\t43565\n吹起\t43566\n东莞东\t43567\n主课\t43568\n人你是你是男的我是女的好不\t43569\n董事\t43570\n50亿九千八百五十四\t43571\n尼嘛尼哄\t43572\n小茹\t43573\n线板\t43574\n小黄鞋\t43575\n节次\t43576\n维修中心\t43577\n高姐\t43578\n电缆\t43579\n006666666\t43580\n光宝贝\t43581\n剁手\t43582\n睡相\t43583\n门为家\t43584\n新税\t43585\n米拉里\t43586\n火候\t43587\n太好了我最喜欢\t43588\n张宝丽\t43589\n张宝丹\t43590\n998979494\t43591\n非法\t43592\nwodedhfc\t43593\n派数\t43594\n凯瑞女\t43595\n你好呀你\t43596\n沙歌\t43597\n1233578\t43598\n42岁\t43599\n沈丘\t43600\n太神\t43601\nddw囊rbr鳓\t43602\n饺子皮蛋蛋挞\t43603\n▓\t43604\n整形手术\t43605\n130亿\t43606\n姚晨\t43607\n▔\t43608\n餐名\t43609\n快播\t43610\n杰泽而渔\t43611\n▃\t43612\n▂\t43613\n甩甩\t43614\n眼里\t43615\n▇\t43616\n▆\t43617\n▅\t43618\n▄\t43619\n搞乜\t43620\n三条杠\t43621\n一早上\t43622\n█\t43623\n紧邻\t43624\n△\t43625\n▲\t43626\n系理\t43627\n座驾\t43628\n梅贻琦\t43629\n肖趣\t43630\n失心疯\t43631\n▽\t43632\n▼\t43633\n□\t43634\n■\t43635\n组成本队\t43636\n托马斯旗舰店\t43637\nvvv科技馆\t43638\ngmgpgdgm\t43639\n用得着黑\t43640\n点津\t43641\n乐王佩\t43642\n飑风\t43643\n遗体\t43644\n武昭仪\t43645\n晦暗\t43646\n陆运\t43647\n谈尽\t43648\n月季花\t43649\n腻子\t43650\n庄材\t43651\n观赏者\t43652\n丽丽丽丽丽丽丽丽\t43653\n癫啦\t43654\n盐酸\t43655\n我孙\t43656\n烦我讨厌\t43657\n讲图\t43658\nfhcu\t43659\n没好气\t43660\n完毕\t43661\n秘秘的话\t
43662\n时间儿\t43663\n大家来抢钱\t43664\n郝帅\t43665\n寿紫琳\t43666\nstkaka\t43667\n新鳌\t43668\n开镜放麻利点五十哪不上我咋十个马李庄\t43669\n自生自灭\t43670\n哭陵\t43671\n各州\t43672\ntigeiniuso\t43673\n背吧\t43674\n改姓\t43675\n冲顶\t43676\n您好上\t43677\n背听\t43678\n林阿姨\t43679\n一八度秘\t43680\n星辉中路\t43681\n前南\t43682\n各己\t43683\n前卫\t43684\n森格\t43685\n可逆\t43686\n120507KTR\t43687\n羡慕八戒吗你是八戒\t43688\n李佳蔚\t43689\nnothingismean\t43690\n就是说的话\t43691\n可选\t43692\nab锁屏\t43693\n快客\t43694\n豪美\t43695\n吻丸\t43696\n题为\t43697\n白桦\t43698\n可逗\t43699\nsfvtxgg\t43700\n亚欧桥头堡\t43701\n给你\t43702\n背后\t43703\n叫班\t43704\n见了我走\t43705\n17所\t43706\n愿海\t43707\nwhkwdbxjo\t43708\n真命苦\t43709\n流言蜚语\t43710\n蒙福井\t43711\n红糖水\t43712\n120分\t43713\ngabbana\t43714\n搞没了\t43715\n米饭\t43716\n猫科动物\t43717\ndehu\t43718\n陈文武龙\t43719\n鞠一躬\t43720\n浩珉\t43721\n怕疼\t43722\n54元\t43723\n380步\t43724\n乔叶璇\t43725\n说起\t43726\n种族歧视\t43727\n三月份\t43728\n665536555\t43729\n洗澡间\t43730\nghvo\t43731\n听不起\t43732\n十二十\t43733\nVans十字补丁\t43734\n值得一看\t43735\n芝霞\t43736\nRoberts\t43737\n好运来那个好运来\t43738\n挡住\t43739\n置办\t43740\n郭心呑\t43741\nfmart\t43742\n榕江\t43743\n出淤泥而不染\t43744\n安夏凉\t43745\n配对\t43746\n374775844266\t43747\n名捕\t43748\n六字大明咒\t43749\n嘉陵街\t43750\n好再吃\t43751\n金肯职业技术学院\t43752\n保卫地球\t43753\n最狠\t43754\n敌手\t43755\n阿加拉\t43756\n天天有喜之人间有爱灵灵\t43757\n新英特尔\t43758\n麽幻车神\t43759\nzsaqsaa\t43760\n廓然\t43761\n李常彬\t43762\n这匹马\t43763\n好我和你聊和你聊\t43764\n1234567你是我的未婚妻\t43765\n典行\t43766\n甘米\t43767\n华赛\t43768\n夜想起呢大学\t43769\n花家地北里17号楼\t43770\n乌沙\t43771\n简算\t43772\n折铁断钢的幼女史\t43773\n小魔仙只\t43774\n银麦\t43775\n问一致\t43776\n九四你是个不要脸\t43777\n赵又廷\t43778\n可贵\t43779\n李海鸣\t43780\n首周\t43781\n苞米茬子\t43782\n给烧\t43783\n九一分钟\t43784\nedo\t43785\n守寡\t43786\n不不不不哭了你\t43787\n白城\t43788\n刘学学\t43789\n倪子鱼王\t43790\n一统天下\t43791\n徐吃西瓜\t43792\n亮蓝\t43793\n绿包\t43794\n穿越火线号\t43795\n离场\t43796\n我素纯洁\t43797\n新城\t43798\n优势\t43799\n优劣\t43800\n聊点别\t43801\nfffgggggggg\t43802\n首呗\t43803\n笑没\t43804\n永暑礁\t43805\n大学堂\t43806\n三国杀杀杀傻傻傻傻傻帽\t43807\nheartaches\t43808\n超级大乖\t43809\n四例\t43810\n我要你的精子你快射\t43811\n欧阳帆\t43812\njddvnj\t43813\nbec54e743852b6cbe389
b504fc26a52jpg\t43814\n天和地\t43815\n总额\t43816\n给我不说\t43817\n唯物\t43818\n瑟缩减轻\t43819\n心恋\t43820\n塞尔\t43821\n起先\t43822\n去去\t43823\nJigjjgjjjjj\t43824\ngjdfhhb\t43825\n饶命\t43826\n共担\t43827\n波菠菜\t43828\n笑电影\t43829\n10cm\t43830\n三秒以后\t43831\n四十帮\t43832\n母狗\t43833\n姐妹篇\t43834\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t43835\n生存\t43836\n绿化\t43837\nkdbd\t43838\n孙有\t43839\n帅呆了酷\t43840\n生字\t43841\n生子\t43842\nab64034fd863b972a8c379310a551d13jpg\t43843\n3141\t43844\n口通商\t43845\n滴手\t43846\n4578369258147\t43847\n自晓其义\t43848\n李章\t43849\n一心一生\t43850\n测字\t43851\n良心不安\t43852\n福报\t43853\n母狼\t43854\nhbvnnbh\t43855\n还是一样\t43856\n滴滴滴敌\t43857\n生孩\t43858\n迎泽大街\t43859\n上个星期\t43860\n地震\t43861\n现映\t43862\n善果\t43863\n跳上\t43864\n跳下\t43865\n理理\t43866\n掱\t43867\n独立当家\t43868\n丁隐\t43869\n西区大润发超市\t43870\n旋涡\t43871\n害民\t43872\n没早恋\t43873\n我是人二小\t43874\n0.08%\t43875\n二三七零\t43876\n就是你系\t43877\n张煜阳\t43878\n王欢\t43879\n最后一次机会\t43880\n新乐十四米们\t43881\nhgiig\t43882\n处分\t43883\n23323\t43884\n可萌\t43885\n这个体\t43886\n党组\t43887\n七德\t43888\n新武\t43889\n新步\t43890\n人家们\t43891\n笑点\t43892\nuccyi\t43893\n痛度\t43894\n给我的手四百\t43895\n九项\t43896\n鬼谷\t43897\n固始县\t43898\n夏若彤\t43899\n长老\t43900\n李超泡\t43901\n长者\t43902\n读本\t43903\n长乐路\t43904\n弦子弦弦\t43905\n展播\t43906\n吴博伟\t43907\n白黄\t43908\n这个你\t43909\n16个月\t43910\n新歌\t43911\n白黎\t43912\n迁徙\t43913\n系吾糸\t43914\n我讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你我不要你做我的度\t43915\n王子俊\t43916\n抵用券\t43917\n迁徒\t43918\n完了讨厌\t43919\n小牧民\t43920\n流鼻涕别\t43921\n三奶\t43922\n五十三五十六分\t43923\n张根巨\t43924\n鲁科手机店\t43925\nnewdrummer\t43926\n胜点\t43927\n三好\t43928\n1160629133\t43929\n占有率\t43930\n武器装备\t43931\nALL\t43932\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t43933\n双马尾\t43934\ntrhintoutstly\t43935\n三套\t43936\n陈心悦\t43937\n离别时\t43938\n刘佳恒\t43939\n不不不不不不不不我不要我不要我不要我不要我不不不不不不不不不不不不不不不不\t43940\n大庙\t43941\n狗宝\t43942\n狗官\t43943\n健胃\t43944\n老牙\t43945\n王育文\t43946\n老牛\t43947\n大店\t43948\n元辰殿\t43949\n抬杠\t43950\n王君丽\t43951\n大床\t43952\n皎皎者\t43953\n老片\t43954\n蕊子\t43955\n活生\t43956\n去吧智能机器人\t43957\n就是了吗\t43958\n老牌\t43959\n小胖\t43960\n189320\t43961\nsis\t43962\nsir\t43963\n码农\t43964\n1580
7358937\t43965\n塑料\t43966\nsit\t43967\nsiy\t43968\nsix\t43969\n度咪咪\t43970\n活用\t43971\n双人床\t43972\n害处\t43973\n菌丝\t43974\nsif\t43975\nsie\t43976\nsid\t43977\n吓信\t43978\n再使\t43979\nsii\t43980\n贝加长\t43981\n大度\t43982\nsim\t43983\nuruuesduirti5oust3ynbhcxzvffy6yggHJYJHHNjhcbrfvfxc\t43984\n分不清楚\t43985\nLakik\t43986\n手掌\t43987\n上报\t43988\n小扣\t43989\n痴情药\t43990\nUE\t43991\n造反派\t43992\n变你自己\t43993\n乛扌E阝\t43994\n风挡\t43995\n统一\t43996\n错觉\t43997\n嘟咪嘟咪大还丹度秘大坏蛋度秘大坏蛋度秘大坏蛋度秘大坏蛋度秘大坏蛋度秘\t43998\nUF\t43999\n小手\t44000\n勇闯江湖\t44001\n特异\t44002\n同步率\t44003\n皿｀┑\t44004\n17点30\t44005\n学好学\t44006\n开进\t44007\n小打\t44008\n5173\t44009\nrusted\t44010\n5177\t44011\n1449122045\t44012\n5174\t44013\n适得其反\t44014\n李兆刚\t44015\n四集\t44016\n觉晓说\t44017\n558996\t44018\n把把\t44019\nQ版\t44020\n自荐表\t44021\n嗯度秘\t44022\n郭师傅\t44023\n吴艳男\t44024\n113em\t44025\n宫梦鑫\t44026\n盖弥彰\t44027\n皮噜\t44028\n尤达\t44029\n度秘嫁给我吧我喜欢你\t44030\n什多\t44031\n桃幺\t44032\n接\t44033\n草菅人命\t44034\n明天会\t44035\n四零\t44036\n食会\t44037\n益\t44038\nhi甘\t44039\n段小燕儿\t44040\n就片\t44041\n盀\t44042\nb族\t44043\n盆\t44044\n一马\t44045\n盘\t44046\n高炉\t44047\n咪咪夹\t44048\nCao\t44049\nCan\t44050\n才煤\t44051\n盐\t44052\n贴包\t44053\n盒\t44054\n水利\t44055\n盗\t44056\n盖\t44057\n抗震\t44058\n高点\t44059\n立夏\t44060\n现在这么难\t44061\n盯\t44062\n目\t44063\n握问\t44064\n表脸\t44065\n进码\t44066\nfuzg\t44067\n盦\t44068\n相\t44069\n水分\t44070\nCaO\t44071\n盼\t44072\n盾\t44073\n遵旨\t44074\n盳\t44075\n盲\t44076\n直\t44077\n瓜子肉\t44078\n水刊\t44079\nksowow\t44080\n零年\t44081\n了了是了了是3\t44082\n招骂\t44083\n多软\t44084\n一目的nice个\t44085\n8%\t44086\n秦维景\t44087\n88\t44088\n89\t44089\n萨摩\t44090\nuuuutttrrryuuttyruyyuireutyituuyyuttutyuttytety\t44091\n82\t44092\n邪恶化\t44093\n80\t44094\n81\t44095\n86\t44096\n87\t44097\n84\t44098\n85\t44099\n伟绩\t44100\n主姓\t44101\n果陀\t44102\n哦度\t44103\n发抖\t44104\n加黑\t44105\n我最讨厌你了我不理你了讨厌你\t44106\n那里\t44107\n早三\t44108\n冯江涛\t44109\nkglvhvl\t44110\n定婚\t44111\n8k\t44112\n8h\t44113\n勉励\t44114\n相似\t44115\n8o\t44116\n最高峰\t44117\n无聊了找\t44118\n凹奏\t44119\n8g\t44120\n8x\t44121\n印刷费\t44122\n臭不要脸我恨你\t44123\n38205\
t44124\n相传\t44125\n白累\t44126\n8p\t44127\n8q\t44128\n8v\t44129\n突突\t44130\n8u\t44131\n8K\t44132\n相会\t44133\n12553385524\t44134\n8O\t44135\n写在你的喜欢\t44136\n磺芝麻\t44137\nB型\t44138\n工廠\t44139\n库卡\t44140\n翘臀\t44141\n团像\t44142\n说事多来\t44143\n替母\t44144\n满分\t44145\n8V\t44146\nful\t44147\ndyxvbbjnb\t44148\n错啦错啦错啦错啦错啦一万个赞\t44149\n12345609\t44150\n七零\t44151\n爰奇艺\t44152\n白宝强\t44153\n一点儿点儿\t44154\n于涛\t44155\n接闹闹\t44156\n机床展\t44157\n找我来\t44158\n人口红利\t44159\n胎神\t44160\n七雪\t44161\nSaab\t44162\n第46届\t44163\n汉子音\t44164\n嘎嘎嘎额\t44165\n夏明月\t44166\n创办者\t44167\n早不\t44168\n四姑\t44169\n六去\t44170\n七集\t44171\n七雄\t44172\nsiao\t44173\n张懿\t44174\n骤断然\t44175\nFfdfhjjhhhkuhydhhddsddrscbgjh\t44176\n勉力\t44177\n盗墓笔记小哥帅\t44178\n小思绪\t44179\n正东西\t44180\n乖真是我的好度秘\t44181\n獨立\t44182\n紫妍\t44183\nygvffgyyibcybbyuhfdtu\t44184\n倮体\t44185\n绿道\t44186\n23MnNiMoCr\t44187\n胸模\t44188\n胡庭海\t44189\n格格\t44190\n摩米士\t44191\n八十三\t44192\n魏彩丽\t44193\n平顶山市实验中学\t44194\n秘姐姐\t44195\nlamLuo\t44196\n壬凼\t44197\n末日\t44198\n四章节\t44199\n宁德美大\t44200\n无聊路过\t44201\n终三\t44202\n猴币\t44203\n好味\t44204\n吴海杰\t44205\n袭袭胸\t44206\n我声哥\t44207\n货\t44208\n账\t44209\n败\t44210\n周玲\t44211\n人工智能\t44212\n财\t44213\n贡\t44214\n高肑\t44215\n贯\t44216\n贮\t44217\n购\t44218\n贬\t44219\n贫\t44220\n贪\t44221\n弋江区\t44222\n质\t44223\n悬价\t44224\n一面一个\t44225\n第32页\t44226\n贲\t44227\n贱\t44228\n贰\t44229\n贿\t44230\n贾\t44231\n下效\t44232\n贼\t44233\n贻\t44234\n谢谢你好想\t44235\n费\t44236\n贸\t44237\n运势\t44238\n削弱\t44239\n盛世\t44240\n贏\t44241\n绝世好贝\t44242\n贊\t44243\n马杆\t44244\n波斯得\t44245\n运动\t44246\n片铺\t44247\n真怪\t44248\n负\t44249\n贞\t44250\n带闹\t44251\n降血\t44252\n黄土高坡\t44253\n公子扶苏\t44254\n如新\t44255\n卡尔马\t44256\n红林场\t44257\njtktptg\t44258\n回我话\t44259\n厌怒\t44260\njrvk\t44261\n如斯\t44262\n6月9日晚\t44263\n爱很爱\t44264\n17分钟\t44265\n沸沸扬扬\t44266\nBBCH\t44267\n严艺丹\t44268\n国家海洋局\t44269\n徐和\t44270\nBBCB\t44271\n惠阳\t44272\nscoupoitoo\t44273\n大恩人\t44274\n骤涨\t44275\n几天前\t44276\neeefgvg6j\t44277\n飞飞吊一吊\t44278\n祁县\t44279\n自高自大\t44280\n1761670069\t44281\n五四路中段站\t44282\n孙一诺\t44283\n这的话\t44284\njhgh\t44285\n哈飞飞飞飞飞\t442
86\n牛帅博\t44287\n铁梅\t44288\njhgf\t44289\n人间爆发\t44290\njhgx\t44291\n人家床\t44292\n建军节\t44293\n报到\t44294\njhgu\t44295\n零二三幺\t44296\n实力者\t44297\n报刊\t44298\n6月10日24时\t44299\n闭学式\t44300\n洪福齐天\t44301\n官府\t44302\n416363397\t44303\n张好\t44304\n邓凯瑞\t44305\n表学\t44306\nGiuliano\t44307\ndotatindle\t44308\n西柏林黛妤\t44309\n两抹\t44310\n0879766757\t44311\nIT业\t44312\n彩光\t44313\n個骨\t44314\n你是谁你是谁你告诉我\t44315\n电子邮件型\t44316\n深圳晶报\t44317\n65000\t44318\n吃费\t44319\n中芯国际\t44320\n赤城\t44321\n轻甲\t44322\n烧腊\t44323\n蒿恒\t44324\n孤例\t44325\n最富有\t44326\n出谋\t44327\n惊咪\t44328\nvbvbc\t44329\n吃饺\t44330\n剧场盘\t44331\n那你说女的干嘛\t44332\n号角\t44333\nxge才pjejeejdlbrjdhddkrtarjcsggdfbnfnmxrddeggbnfnmndnfnddmrnbtbaydydidxvxbwwrtgdsvvuwjdoereyves\t44334\n轻生\t44335\n瓜果\t44336\n密歇根\t44337\n预热\t44338\n吃饭\t44339\n我爱你度你\t44340\n谢挽玄\t44341\n15131359089\t44342\n勿忘错\t44343\n欧哈哈哈\t44344\n符号\t44345\n三姐妹\t44346\neQQF\t44347\n奶昔\t44348\nsrukcz\t44349\n千舸争流\t44350\n224升\t44351\ndlc\t44352\ndlb\t44353\n坏种\t44354\n梗阻性覅\t44355\n离现\t44356\n臭回话\t44357\n行文\t44358\n坐相\t44359\ndll\t44360\n咕噜噜噜咕噜噜噜\t44361\nnochityou\t44362\ndlw\t44363\nGRAZIA\t44364\n戀忧\t44365\n核对\t44366\nvvvvvggfghgl5lllkk\t44367\n08858588585847575852582582882\t44368\n张林娇\t44369\npp1份\t44370\n说得\t44371\n印钞\t44372\n三炮部队\t44373\n请说笑\t44374\n送出没\t44375\n周村青\t44376\n嗯嗯嗯嗯嗯我杀杀杀杀杀\t44377\nffall\t44378\n333332个\t44379\n啦啦扣\t44380\n8quh80\t44381\n金正模\t44382\n吻吻吻吻吻吻吻吻吻吻吻吻吻吻吻\t44383\n2a\t44384\n干么\t44385\n杨亚曼扬亚\t44386\n3600只\t44387\n长江之歌\t44388\n音色\t44389\n恩那就聊到这我走\t44390\n为不懂\t44391\nimacpro\t44392\n注销\t44393\n勤恳\t44394\n三极篇\t44395\n是的我讨厌你\t44396\n飑够舣\t44397\n习本\t44398\n度秘姐\t44399\n下午五点\t44400\n钱乐队\t44401\n两碗\t44402\n前列腺癌\t44403\n太外人\t44404\n摸头乖下不为例\t44405\n株汕\t44406\n一半儿\t44407\n索菲特\t44408\ncyoyoc\t44409\n猪新\t44410\nt波\t44411\n快乐哦X5\t44412\n木头\t44413\n童颜\t44414\n陆达生\t44415\n米家\t44416\n多多久\t44417\n车险理赔\t44418\n妈妈拉拉吧巴巴通德\t44419\n完还\t44420\n苏姿\t44421\n密布\t44422\n6月2日、3日\t44423\n汉武帝\t44424\n莞尔\t44425\nFarkyou\t44426\n刘凯胜\t44427\n喝药\t44428\n付场\t44429\n换手\t44430\n七十多七十多\t44431\n一分钟前\t44432
\n念奴娇\t44433\nmgdgmgmmjgmOlinmineJ1D1O\t44434\n清撒\t44435\n我打不开我的通讯录\t44436\n引经据典\t44437\nseseseshui\t44438\n微跌\t44439\n云端\t44440\n笑起来\t44441\n化化化化化化\t44442\n米安\t44443\n哥哥们儿\t44444\n695656jfjvndjdnij\t44445\n经济赔偿\t44446\n痛经心活\t44447\n鸡脖\t44448\n乙肝火\t44449\n胖头\t44450\n機會\t44451\n妖妖小我\t44452\n米亲们\t44453\n森林资源林\t44454\n瓶水\t44455\n方肉\t44456\n熊猫你不说你是秘度你是人家的秘多\t44457\n胖大\t44458\n不可惜\t44459\n胖多\t44460\n乍样\t44461\n5846846454\t44462\nfkk\t44463\n當住\t44464\nmc7\t44465\nfkc\t44466\n那莱\t44467\nfkf\t44468\n杨利伟\t44469\n刘娜\t44470\n李光晨\t44471\n刘娘\t44472\n不管了问\t44473\n我是我是刑天\t44474\n狼娘\t44475\n五六千\t44476\n龟兔\t44477\n截得\t44478\nfkw\t44479\n万寿\t44480\n一触网\t44481\n撒游\t44482\n28韩\t44483\n说看我\t44484\n055162328057\t44485\npdg\t44486\n曾诚\t44487\n有急事\t44488\n班儿\t44489\n1008610086\t44490\n多咪度秘\t44491\n脉络\t44492\n杨继业\t44493\n就删\t44494\n莲藕咔擦\t44495\nhttpahiphotosbaiducomxiaodupicitemf9198618367adab4b2d621378cd4b31c8701e444jpg\t44496\n点点\t44497\n点炸\t44498\n36件\t44499\n杨倩\t44500\n华尔街日报中文网\t44501\neycjf\t44502\n点炮\t44503\n胡合军\t44504\n吴可欣\t44505\n叫情\t44506\n永远永远\t44507\n严演义\t44508\n大丑婆\t44509\n够了惹\t44510\n2783382838\t44511\n6月18号\t44512\nhhhbh\t44513\nM80\t44514\n怕是\t44515\n甲组\t44516\n15所\t44517\n四面城中心中学\t44518\n卜式\t44519\n14点\t44520\n欺软怕硬\t44521\n6170亩\t44522\n似乎\t44523\n难辨\t44524\n各们\t44525\n脸麻\t44526\n宅餐坊\t44527\n谁是贼谁是谁你是谁\t44528\n喂喂喂喂喂喂喂喂喂喂喂喂喂喂喂咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t44529\n从这里开始\t44530\n窦若兮\t44531\n传输\t44532\n五六1百分之十\t44533\n111111112222\t44534\n残渣\t44535\n雪人\t44536\n奎元实验学校\t44537\n日方\t44538\n零若九\t44539\n健康康累赘\t44540\n日新\t44541\n一个11岁\t44542\n西餐\t44543\n雪亮\t44544\n张太泽\t44545\n由此可见一斑\t44546\n迷霏\t44547\n银行卡\t44548\n第一人称\t44549\n什麼\t44550\n最合适\t44551\n好我夸夸你那你救救我爸爸爸\t44552\n日料\t44553\n俩字\t44554\n83256789321\t44555\n干什么不想\t44556\n日文\t44557\n空中小姐弟恋情谊我想\t44558\n构\t44559\n婆媳\t44560\n极\t44561\n九万桶\t44562\n打法\t44563\n淘眼镜\t44564\n姨妈巾\t44565\n林\t44566\n枕\t44567\n析\t44568\n哈哈乖\t44569\n枝\t44570\n果\t44571\n枚\t44572\n放心不下\t44573\n高高在上\t44574\nfuccb\t44575\n枢\t44576\n枯\t44577\n金箍\t44578\n我骗你的我是人\t44579\n枫\t44580\n枪\t
44581\n华人民\t44582\n枷\t44583\n架\t44584\n57868556\t44585\n叶罗丽\t44586\nbeisde\t44587\niPad3\t44588\niPad2\t44589\nXkjxk\t44590\n打泡\t44591\nYougodie\t44592\n姓张张\t44593\n加利福尼亚\t44594\n几桌\t44595\n雷石\t44596\n贯溪站\t44597\n哇塞瓜\t44598\n两月一\t44599\n基体\t44600\n装饰品\t44601\n重新选择\t44602\n卡其\t44603\n投影\t44604\n爱不完想爱\t44605\n哩蛮女\t44606\n虽说\t44607\n于平\t44608\n尿片儿\t44609\n458169\t44610\n某些人\t44611\n乃同\t44612\n严加\t44613\n交缠\t44614\n28M\t44615\n林好\t44616\n八角\t44617\n基佬\t44618\n王奕杰\t44619\n宋涛\t44620\n挥毫\t44621\n范伟伟\t44622\n白道\t44623\n陈霞\t44624\n巨美\t44625\n斗臭\t44626\n4121元\t44627\n拉菲鲁\t44628\n挖心\t44629\n司法决斗考\t44630\n深浅不一\t44631\n海绵\t44632\n陈震\t44633\n总店\t44634\n新安小学\t44635\n女巫节快乐哟妈呀节\t44636\n陈露\t44637\n九五二零\t44638\njcvf\t44639\nkidding\t44640\n1互言\t44641\n道袍\t44642\n彼岸\t44643\n一句两句\t44644\n牟取\t44645\nEnduro\t44646\nlaige\t44647\n888888585888888888888888\t44648\n复旦大学附属儿科医院\t44649\n不要离开我你在不\t44650\n神社\t44651\n不舍不弃\t44652\n手动档\t44653\n并口\t44654\nkkb磨叽\t44655\n入胜\t44656\n蚂蚁们\t44657\n杨敏\t44658\nbfjfnfn\t44659\n三五载\t44660\n好运气\t44661\n小小小小小小小小小小小小大姐\t44662\nPhae\t44663\n呃网\t44664\n铃兰\t44665\nggfdf\t44666\n啦啦啦德玛\t44667\nmade\t44668\n也有\t44669\n钱财\t44670\nmada\t44671\n多年以后\t44672\n奥巴点\t44673\n瓮妈妈\t44674\n就是你的错\t44675\n8月11日\t44676\n弱攻\t44677\n小宝鸡\t44678\n并发\t44679\n省里\t44680\n1393500426531\t44681\n24个\t44682\n暖炉\t44683\n6X\t44684\n比尔·盖茨\t44685\n左朋\t44686\n靴子\t44687\n竞合\t44688\nvjjojko\t44689\n看烦\t44690\n行吗大哥\t44691\n35897\t44692\n23块\t44693\n洗碗\t44694\n宁夏区\t44695\n零六个月\t44696\n百duer子\t44697\nIic\t44698\n摇着摇着摇着\t44699\n马沙特\t44700\n孙鸿飞\t44701\n卜一帅\t44702\n烦天\t44703\n陈中川\t44704\n3dhi霸\t44705\n回踩\t44706\n烦大\t44707\nipivp\t44708\n没有座\t44709\n你好我你爱我\t44710\n安利\t44711\n安别\t44712\n当着\t44713\n快快快点\t44714\n谈论\t44715\nvuow\t44716\n朝日\t44717\nyok器\t44718\n飞区\t44719\n健全\t44720\n小法\t44721\n汤举头\t44722\n彩虹糖\t44723\n丑真丑\t44724\n作业稿\t44725\n皮普里卡\t44726\n帮度\t44727\nh乜g巳dFCO588123456。45额\t44728\n十六分\t44729\nCiccarese\t44730\n小波\t44731\n小泡\t44732\n朱我\t44733\n与会者\t44734\n小泽\t44735\n不如下\t44736\n302111405401\t44737\n打力\
t44738\n好想爱\t44739\n投影片\t44740\n汪婆婆\t44741\n账号码\t44742\n甘夸张\t44743\nnooooo\t44744\n商户\t44745\n100一千二一万\t44746\nhgtgvgbvvbvccfcgvyhgggghhhgggbhhhjnhjjjjjjj\t44747\n小车库\t44748\n包送\t44749\n蜀郡\t44750\n复仇感\t44751\n夜咳\t44752\n无寸缕\t44753\n没不过\t44754\n橙子音乐网\t44755\n每两分钟\t44756\nidi金乡呗\t44757\nEveryone\t44758\n八八六二\t44759\n平安考试\t44760\n垂心\t44761\n吊唁\t44762\n森林者\t44763\n蹲姿\t44764\n56454465252\t44765\ngeiagsis\t44766\n总度\t44767\n哭吧哭\t44768\n大一下你好\t44769\n张春林\t44770\n动点\t44771\n宋壹加壹\t44772\ngcgc\t44773\n进行曲\t44774\n金思雯\t44775\n不要为我不爱我的人\t44776\n余笑一笑满堂\t44777\n簇拥\t44778\n护林军\t44779\n友友\t44780\n领头羊\t44781\n52286867568886\t44782\n票证\t44783\n春姐\t44784\n2009年6月11日\t44785\n人部\t44786\n默契\t44787\n火形\t44788\n咐嘱\t44789\n百万巨鳄#\t44790\n100p\t44791\nASP数据访问高级编程\t44792\n蛋黄\t44793\nereererere\t44794\n模拟题\t44795\n不里你了\t44796\ntired\t44797\n九样\t44798\n你校\t44799\n色值\t44800\n沉鱼落雁\t44801\n贺佳琪\t44802\n拨乱反正\t44803\n释小龙\t44804\n手申\t44805\n魔趣无趣大一\t44806\n泪痕\t44807\n手电\t44808\n真贫\t44809\n东财\t44810\n梓存l\t44811\n胡咯哈路同\t44812\n三泡屎\t44813\n老恨\t44814\n7575341757古古\t44815\niuju\t44816\nWeare\t44817\n多英\t44818\n还有多远\t44819\n睡不着类\t44820\n农夫山泉\t44821\n1982年元旦\t44822\n猫耳的图的呢qqqq头的qi笨贼qi不q哈皮吗不知你我有你的妈妈\t44823\nTONE\t44824\n白团子\t44825\n19分\t44826\n赖晓雯\t44827\n贺子珍\t44828\n一不在乎\t44829\n女妹子\t44830\nhere\t44831\n妙妙真\t44832\n托塔\t44833\n甄赵\t44834\n火药\t44835\n哈尔滨\t44836\n王俊恐\t44837\n64976707\t44838\n小灰\t44839\n奈斯图米\t44840\n小灵\t44841\n周水子机场\t44842\n188314759\t44843\nnmhyf\t44844\nucf\t44845\n怎么办我恨你恨你讨厌你\t44846\n王俊恺\t44847\n泡制\t44848\n胡猫\t44849\n黄金时段：吉尼斯世界纪录\t44850\n小火\t44851\n小灭\t44852\n小灯\t44853\n西域\t44854\n累加制\t44855\n腐女吧想法\t44856\ndv6wxywdgu5advhtecytsh7\t44857\n西城\t44858\nM88大厦\t44859\n笑報\t44860\nWPSWPS\t44861\n给出示\t44862\n盘数\t44863\n董端\t44864\n仙侠类\t44865\n盘整\t44866\n族人\t44867\n十连\t44868\n应付款\t44869\n双蝎座\t44870\n38500\t44871\n媛子\t44872\n傻子样\t44873\n蒲城职级办\t44874\n四周\t44875\n终极目的\t44876\n侠影\t44877\n作文儿\t44878\n二益友\t44879\n宣判\t44880\n莉罗\t44881\n唱吧命令\t44882\n建设南路\t44883\n大头德\t44884\n蒋旭莹\t44885\n笨豬豬\t44886\n25036871414\t448
87\n扁桃\t44888\n天神\t44889\n陪聊天儿啊陪聊天\t44890\n欧玉洁\t44891\n呃巴啦啦小魔仙\t44892\n铁哥们\t44893\n赖猫\t44894\n鄱阳县\t44895\n闲云野鹤\t44896\n工委\t44897\n政治味\t44898\n右撇子\t44899\n磅美\t44900\n3.69元\t44901\nEDY\t44902\ntourousm\t44903\n艰涩\t44904\n错过了\t44905\n不场\t44906\n宁宇错\t44907\n我一个闲人\t44908\n黄黄题词\t44909\n大回来的故事\t44910\n不在\t44911\n内忧外患\t44912\n碰着\t44913\n汽修店\t44914\n眯睿\t44915\n杨太阳\t44916\n等等等等等等等等等等等等等\t44917\n嗯2\t44918\nGnRH\t44919\n不nn\t44920\nshike\t44921\n第四度\t44922\n会见面\t44923\nvnnc\t44924\n扩音器\t44925\n装糊涂\t44926\n南京站\t44927\n莱克\t44928\n琅琅耳垢塞我是阿khi来的妞可乐仔湾大屠美微ct\t44929\nA\t44930\n相庭\t44931\n逆天\t44932\n酷巴\t44933\n噪音象\t44934\n军哥\t44935\n过眼云烟\t44936\n朱紫阳\t44937\n疯了敢\t44938\n哈雷会\t44939\n对战\t44940\n6231163099\t44941\n猫kiki\t44942\n郑州坤\t44943\n关你\t44944\n103岁\t44945\n爹度\t44946\n嗯q\t44947\n雨ukxuguutuldulsjxxuxukxghkzzghmkzhgugkduktdlugduudffykzyutdut\t44948\ntmjmt\t44949\n受苦\t44950\n好报恶人\t44951\n寇没\t44952\nfgghugg\t44953\n建外街道办事处\t44954\n无语无伦次\t44955\n约龙\t44956\n785536\t44957\n凯尖儿\t44958\n医务室\t44959\n相应\t44960\n嗯n\t44961\n赵爷\t44962\n求亲亲\t44963\n1633家\t44964\n小肥洋哥\t44965\nEdith\t44966\n偶尔一开心好开心开心好开心\t44967\n崔莲姬\t44968\n米坤宝\t44969\n我一个伤\t44970\nppsopepc\t44971\n农大女\t44972\n衣食住行\t44973\n新世纪福音战士新剧场版\t44974\n在阳\t44975\n找病\t44976\n来了我讨厌\t44977\n休息吧安\t44978\n可香\t44979\n祁琪\t44980\n郑逸飞\t44981\n自然卷\t44982\n下岗\t44983\n萱凝\t44984\nfrss\t44985\n西歌\t44986\n六露\t44987\n米嗖拉西\t44988\n极差\t44989\n祭司\t44990\n文青\t44991\n吃饱饭\t44992\n晋城\t44993\n刘东文\t44994\n瞎了没\t44995\n脑洞\t44996\n二条街\t44997\n国家标准\t44998\na杯\t44999\n报告册\t45000\n零零性\t45001\n拜拜拜拜拜拜拜拜拜拜拜拜拜拜\t45002\n2口\t45003\n累觉\t45004\n汉唐\t45005\n烧会\t45006\n2只\t45007\n八八楼\t45008\n2号\t45009\n锡哥\t45010\n济急\t45011\n2台\t45012\n20060130\t45013\n空镜头&amp\t45014\n39公斤\t45015\n2333\t45016\n别再造谣\t45017\n誊写\t45018\n咿呀呀\t45019\n乳制品类\t45020\n凉粉\t45021\n宋文浩\t45022\ngugidudc\t45023\n惹上妖孽花美男\t45024\n拜斗\t45025\n马明星\t45026\n遗愿\t45027\n阎罗令\t45028\n复数\t45029\n婷妹妹\t45030\n四五场\t45031\n电视电视\t45032\n啦啦啦啦啦啦我是快乐小朋友\t45033\n光棍儿\t45034\n纽约市\t45035\n一生么及时雨\t45036\n温继成\t45037\n郭丽楠\t45038\n成本高\t45039\n晓
东有主\t45040\n13037870866\t45041\n拜斯\t45042\n三一下\t45043\n三一万\t45044\n45555555556\t45045\n闹不了\t45046\n赵晓明\t45047\n优良\t45048\n擒龙诀\t45049\n39959995\t45050\n你猜我喜的偶像\t45051\n长丑\t45052\n不要忘记\t45053\n黑宠\t45054\n读行\t45055\n白干饭\t45056\n五仁月饼\t45057\n飕飕\t45058\n工作狂们\t45059\n几i\t45060\n九万个\t45061\n哎啊\t45062\n崮en\t45063\n紅博\t45064\n肥佬\t45065\nSEP\t45066\n稳健\t45067\n演讲稿\t45068\n亨廷顿\t45069\n15125994463\t45070\n身心峰\t45071\n两天后\t45072\n徐金彤\t45073\n我的姐妹\t45074\n八阿哥\t45075\n再找\t45076\n心听\t45077\n丽都\t45078\np岁\t45079\n巅\t45080\n果叫\t45081\n巛\t45082\ngffku\t45083\n州\t45084\n川\t45085\n志丰\t45086\n跨年演唱会\t45087\n吴老师\t45088\n志丹\t45089\n巫\t45090\nxia\t45091\nxib\t45092\n挂掉\t45093\n差\t45094\n心同\t45095\ndosy颜\t45096\n巡\t45097\nbxjv\t45098\n巧\t45099\n左\t45100\nxin\t45101\nxio\t45102\n肚罪\t45103\nxir\t45104\n抓人\t45105\n巿\t45106\n巾\t45107\n对不起发\t45108\n巳\t45109\n已\t45110\n己\t45111\n巴\t45112\nxgoyx\t45113\n领赏\t45114\nnggeggyg\t45115\n三国城\t45116\n栽培\t45117\n戴相龙\t45118\n葛守仁格\t45119\n北京市消防局\t45120\nBUG\t45121\n罗滚\t45122\n肛瘘\t45123\n白天东星\t45124\n翠有\t45125\n70多年\t45126\n百度金融\t45127\nt34\t45128\n花光\t45129\nvudv\t45130\n【菲\t45131\n罗一帆\t45132\n言句话\t45133\n和而\t45134\n张逸仙\t45135\n丹田\t45136\n四天五门\t45137\n2至3小时\t45138\n有无良方可\t45139\n行差一步\t45140\n啐觉\t45141\n多米那\t45142\n娱乐稿\t45143\n打电话\t45144\n日本十家漫画出版社\t45145\n身板\t45146\nHolle\t45147\n12P\t45148\n板砖\t45149\nfrckigs\t45150\n振良\t45151\n猪门市\t45152\n我讨厌你我恨你我讨厌你我恨你\t45153\n记不得\t45154\n立玉\t45155\n思无邪\t45156\n说一变\t45157\n没带\t45158\n有学网\t45159\n还我庚澈庚花魂\t45160\n说一句\t45161\n废话废话废话废话废话废话废\t45162\nAK-3\t45163\n身材\t45164\nu6uu6\t45165\n月将\t45166\n肤肤品\t45167\n集资\t45168\n百爱家\t45169\n47岁\t45170\nshuau\t45171\n2分钟后\t45172\n独舞\t45173\n乖女仆\t45174\n超能\t45175\n加水\t45176\n1888576\t45177\n论述\t45178\n嗯南丹\t45179\n宁缺毋滥\t45180\n反讽\t45181\n63763763\t45182\n8888888888\t45183\n当你老大行\t45184\n100.6\t45185\n加气\t45186\n7000余件\t45187\n猪猪你是猪我不要你了哼哼\t45188\n加民\t45189\n十九年前\t45190\n武汉城市圈\t45191\n马克·沃尔伯格\t45192\n刘不休\t45193\nTJFC\t45194\n太帅太帅\t45195\n良栩\t45196\nSJ银赫\t45197\n仗狗势\t45198\n一百倍\t45199\n乖乖我爱您\
t45200\n反讚\t45201\ngdffy\t45202\n晕死\t45203\n印堂\t45204\ngdffr\t45205\n马圆利\t45206\n超级机器人\t45207\n35盒\t45208\n经管\t45209\nTTFGGETG\t45210\n七十三十五分\t45211\nFUFFRTDYD\t45212\nYcdggcgstgxvvgggggggg\t45213\n孙健豪\t45214\n朱玮浚\t45215\n打篮球\t45216\n感谢你的指导\t45217\n霸宠\t45218\n忘色\t45219\n天高地阔\t45220\n国军\t45221\n三棘棘\t45222\n余余余\t45223\n697\t45224\nztk\t45225\nzte\t45226\n699\t45227\n嚴爵\t45228\n小糯米\t45229\nztx\t45230\nztu\t45231\nztt\t45232\n小伴块\t45233\nnvnnvkcicu\t45234\n陪我聊会儿天吧真的好痛我的心\t45235\nvcbbmmlknb\t45236\n扣破\t45237\n逆天勒\t45238\n孔双月\t45239\n口香\t45240\n转基因大米\t45241\n挖宝心塞\t45242\n李欣潼\t45243\n黑头套\t45244\n点让\t45245\n长寿第一中学\t45246\n三百六十六今个\t45247\noexoexo\t45248\n我我我我我我我我我我我哦\t45249\n二百米\t45250\n恩贾\t45251\n颜世豪\t45252\n张钊国\t45253\n工人新村112号\t45254\n泉山区\t45255\n说出货\t45256\n范法\t45257\n唐总附\t45258\n爱芬\t45259\n很傻\t45260\n罗桂萍\t45261\n你不也不要忘了我了我讲一下\t45262\n玩造\t45263\n江苏省佛教协会\t45264\n馅儿饼\t45265\nWhatis\t45266\n爱花\t45267\n副所长\t45268\n靠谱点\t45269\n1234667990\t45270\nNjtjtdjkttdfkjt\t45271\n一个多礼拜\t45272\n对着\t45273\n大姨妈巾\t45274\n几二十年\t45275\n哎呦妈呀\t45276\n玩這\t45277\n薛原高\t45278\nxyk8\t45279\n二湿兄\t45280\n小仆人\t45281\n20120514\t45282\nUUYYYYYY\t45283\n谢雨晨\t45284\naurou\t45285\n钱钱钱钱钱钱\t45286\n小嘴\t45287\n166666666666000000000006666666666666666666666666666666\t45288\nfdgffddsr\t45289\n289331\t45290\n二二零一零\t45291\nu徐\t45292\n哇靠你是人吗你说行\t45293\n花千舞\t45294\n小嘉\t45295\n英短\t45296\n御制诗\t45297\n撒粒\t45298\nJLM\t45299\n惹伯ya丝宝\t45300\n几3\t45301\n我靠我的管你毛事\t45302\n红圈\t45303\n加哆号\t45304\n跃进\t45305\n布米个小小小小哈\t45306\n昀哥\t45307\n红圆\t45308\n歇息\t45309\n天花板\t45310\nang\t45311\nCest\t45312\n麦门冬粥\t45313\n质押\t45314\n生活互助汇\t45315\n昨天下午四点\t45316\n玩赏\t45317\n蚂蚁\t45318\nHgthhgmhhuhiythhhhhugugh\t45319\n好迷糊\t45320\n矛头\t45321\nduie\t45322\n几百KB\t45323\n海灯\t45324\n听话听话\t45325\n我真的爱你我想你\t45326\n除湿\t45327\n卢岩\t45328\n雅痞\t45329\n不懂了吧\t45330\npro\t45331\n赵天硕\t45332\n休怪\t45333\nduii\t45334\n良训\t45335\n2010年\t45336\n百貨\t45337\n斗气\t45338\n小忆力\t45339\n红脚\t45340\n一周末\t45341\n40万元\t45342\n冷暴力\t45343\n唉自卖\t45344\n色彩\t45345\n1058959084\t45346\n吓跑\
t45347\n同德\t45348\n连我的天呐好肉麻\t45349\ntf11f\t45350\n答辩生气\t45351\n书本\t45352\n朱号\t45353\n呱呱呱耕保象征\t45354\n合作市\t45355\n夜情病栋\t45356\n流不穷木\t45357\n了战\t45358\n营盘\t45359\n你的泪珠\t45360\n鬼魅\t45361\n被动性\t45362\n鬼魂\t45363\n亲恩\t45364\n一不冖\t45365\n中钢集团\t45366\n舒乐\t45367\n夜曲\t45368\nLOL\t45369\n阮福亮\t45370\n嘎子\t45371\nSMG新娱乐台\t45372\n中规中矩\t45373\n就地取材\t45374\n巴塞罗那\t45375\n黑葡萄\t45376\n正巧\t45377\n今天下\t45378\n哈拉度秘\t45379\n河床\t45380\n哼哼哼度秘臭小子\t45381\n一nnnnnnnnnn口一nnnnnnnnnnnn准发明的你\t45382\nTheb0ysare\t45383\n龙岩市\t45384\n好不过\t45385\n7558055\t45386\n开大我是倒霉呀你在\t45387\n举起来\t45388\n职专\t45389\nLZloadLZ\t45390\n河底\t45391\n4.48%\t45392\n胎压\t45393\n13234656567686\t45394\n职业\t45395\n天津卡都\t45396\n27成\t45397\n负费\t45398\n枣阳市\t45399\njkx\t45400\n环游世界\t45401\n区间\t45402\n曹丕\t45403\njkw\t45404\n政坛\t45405\njks\t45406\njkl\t45407\njkm\t45408\njkn\t45409\n092个\t45410\njkh\t45411\n偶吧们\t45412\njkk\t45413\n负责\t45414\njkf\t45415\n西锤子臭\t45416\njka\t45417\njkb\t45418\n往上爬\t45419\n现如今\t45420\n你的处\t45421\n安岭\t45422\n负负\t45423\n一九成\t45424\n假案\t45425\n葳斯基\t45426\n萝卜卜\t45427\n给我沉默\t45428\n文不第一\t45429\nDC数码相机\t45430\natv4\t45431\n吃睡\t45432\n鹅卵逐流\t45433\n公租房\t45434\n阴术\t45435\n徐嘉苇\t45436\naststist\t45437\n回去话吧\t45438\n人大哥\t45439\n趣事\t45440\n里潘\t45441\n老子新网\t45442\n1357682946\t45443\n集装箱房\t45444\n和静\t45445\nmjtqtmpk\t45446\n苏浩\t45447\n波轴\t45448\n赐福\t45449\n哪神\t45450\nJUNIOR\t45451\n赵哈\t45452\n五音不全吗火舞\t45453\n波车\t45454\nooooo\t45455\n献映\t45456\n滴笑\t45457\n张泽红\t45458\n待寝来了7\t45459\n波轮\t45460\n黑暗骑士崛起\t45461\n蚂蚱\t45462\n再见见\t45463\n累了不想\t45464\n识认\t45465\n明城\t45466\n港币\t45467\nWINDOWS7\t45468\n糖炒栗子\t45469\n乌丹\t45470\n小门\t45471\n五十大\t45472\n半身\t45473\n五十天\t45474\n辱荣\t45475\n千头\t45476\n金蟾\t45477\nbobby\t45478\n77919\t45479\n五十多\t45480\n杨猪肚\t45481\n狗肉厚\t45482\n確定\t45483\nK﹃K\t45484\n猪脸\t45485\njjgg\t45486\n五千米\t45487\n快雪\t45488\n凯我爱你我要嫁给你\t45489\n明基\t45490\njjgh\t45491\n13975549699罗\t45492\njjgj\t45493\noppor27\t45494\n山岛\t45495\n哑巴\t45496\n尹天乐\t45497\n山岗\t45498\n怀抱着\t45499\n祖安\t45500\n回馈\t45501\n祖宗\t45502\n回首\t45503\n你的喜欢我和你结婚吧\
t45504\n奇闻奇\t45505\n有灵犀\t45506\n特斯拉\t45507\n不是玩命\t45508\n冰块\t45509\n说了心\t45510\n朱师傅\t45511\n强熊\t45512\nfgff\t45513\n徐敬兴\t45514\nDcfh\t45515\n甄子丹\t45516\n再用力\t45517\nfgfy\t45518\n刘嘉鹏\t45519\n平盘\t45520\n计岁\t45521\n星巴克\t45522\n殡仪车\t45523\nQQ飞车\t45524\n27天\t45525\n厨浩\t45526\n喜字\t45527\n1213141516171819\t45528\n15657868476\t45529\n久丯\t45530\n胡夏#\t45531\n凤小岳儿\t45532\n宫保鸡丁\t45533\nkvgkajbgk\t45534\n万毒\t45535\n上海绛州鼓乐团\t45536\n图案\t45537\n稳稳\t45538\n大逆不道你\t45539\n怪事\t45540\n650146504\t45541\n威远县\t45542\n#ta\t45543\n筋络细胞\t45544\n老婆们\t45545\n薛堡\t45546\n卫思怡\t45547\n死亡报告\t45548\n心空\t45549\n小问\t45550\n后防\t45551\n说出你的故事\t45552\n1358607253\t45553\n熊艳美\t45554\nFreddie\t45555\n来妃\t45556\n我是说谎的我\t45557\n一百五一百五\t45558\n泣不成声\t45559\n八代\t45560\n豆豆龙豆豆龙豆豆龙豆豆龙\t45561\n22名\t45562\n招财猫\t45563\n我的女孩\t45564\n怪兽率\t45565\n不难堪\t45566\n取代\t45567\n芮桐\t45568\n孙德超\t45569\nfgghgggg\t45570\n行不想\t45571\n中招\t45572\n人功\t45573\n花哨\t45574\n故城\t45575\n笨妈妈兄弟第四季占有露涵angelababy\t45576\n抢断\t45577\n一婚\t45578\nLOKI\t45579\n558588\t45580\n尔等速速\t45581\n再一个再一个再一个\t45582\n落月儿\t45583\n十八罗汉\t45584\n激動\t45585\n阳信县\t45586\n花哈\t45587\n避寒\t45588\n颜菲\t45589\n斗牛狗\t45590\n飞虎神鹰孤岛事业\t45591\n机器人手\t45592\n肃宁酒\t45593\n食言自肥\t45594\n班车票\t45595\nAndy\t45596\n摆摊\t45597\nWunai\t45598\n赛尔号的库贝撒子我的\t45599\ncgkfk\t45600\n亲一个人\t45601\n呃花十金币\t45602\n调子\t45603\n领导小组\t45604\n白欣怡\t45605\n浑浑噩噩\t45606\n忽明忽暗\t45607\n亲爱的你慢慢飞小心\t45608\nhellomtt\t45609\n堡王梅\t45610\n果维康\t45611\n机器人才\t45612\n广告主们\t45613\n不及格\t45614\n我可没哪m有才是你在我的手机我的度秘\t45615\n牵制\t45616\n28546\t45617\n度秘秘\t45618\n奥光\t45619\n争理\t45620\n奥克\t45621\n恶意的谎言\t45622\n西城站\t45623\n蔡妍妍\t45624\n李式相对论\t45625\n高虎\t45626\n持无恐\t45627\n度秘秀\t45628\n度秘私\t45629\n如梦猜\t45630\n欣宁\t45631\n5％\t45632\n男女片\t45633\n马屁安\t45634\n灭蚊\t45635\n咸蛋黄\t45636\nKBSmusicbank\t45637\n蜘蛛店\t45638\n唐卓悦\t45639\n可爱度秘\t45640\n杜吉然\t45641\nLhbjghj\t45642\n吹拂尘\t45643\nbabyloli福利的的的的的的的的的的的的的\t45644\n偶可素\t45645\n高耗能\t45646\n哈哈佛\t45647\n得板合意\t45648\n0.44%\t45649\n昂阿莎\t45650\n胡思乱想\t45651\n哪一边\t45652\nQQ关些x\t45653\n可随时\t45654\n黄闷鸡\t45655\n好坏臭\t
45656\n海内外\t45657\n度秘红木红红度秘帅不帅三\t45658\n笔调\t45659\n真不怪\t45660\nygft8ddu7s76d6syy6\t45661\nICU呢顾大局\t45662\njizjrubutikul2725555\t45663\n牢里\t45664\n来次\t45665\n好装\t45666\n年长\t45667\n耐久度\t45668\n顾世浩\t45669\n1314125\t45670\n神宠我死心晚\t45671\n36辆\t45672\n012345678911121314151617181920\t45673\nb面\t45674\n长岛\t45675\nStar大猜想##\t45676\n592803段\t45677\n牛局长\t45678\n1月27号\t45679\n火药味\t45680\nangeiababy\t45681\n学去业\t45682\n飞选\t45683\n下酒\t45684\nrtyye\t45685\nhttpehiphotosbaiducomxiaodupicitem3ac79f3df8dcd10016e6dd52758b4710b9122f4ejpg\t45686\n特级\t45687\n坚果\t45688\n现实唱一首歌\t45689\ntfboysyou\t45690\n727275457275\t45691\n45贾\t45692\n憎摸\t45693\n手脚\t45694\n三兄弟\t45695\n铿锵玫瑰\t45696\n内流\t45697\n一汽错\t45698\n假不懂\t45699\n内测\t45700\n我妈们\t45701\n油漆工\t45702\n嗯柏林\t45703\n段建刚\t45704\n\t45705\n21：60\t45706\n谁是你要的主人我是的\t45707\n谢谢你的我真的想你\t45708\n小苹龟\t45709\nuhshsss\t45710\n晓晓姐\t45711\n\t45712\n好野好野\t45713\n\t45714\n飞逝\t45715\n模糊不清\t45716\n洛索洛索洛索\t45717\n朦朦胧\t45718\n泷一强\t45719\n\t45720\n小七疒\t45721\n瑞婷\t45722\n谷诗穗梨\t45723\n务农\t45724\n最美的眼神\t45725\n我只问你在哪儿读书没问你\t45726\n上学度\t45727\n一下面\t45728\nInBrazil\t45729\n展场\t45730\n六月一号\t45731\n袁弘\t45732\n披房\t45733\n叧口\t45734\n扶正\t45735\n纳尼亚\t45736\nV字型\t45737\n五爱\t45738\nsreyy\t45739\n五爷\t45740\nkkakao考ssmsm\t45741\nydf\t45742\n苗星人\t45743\n翼城\t45744\n张姨母\t45745\n联合早报\t45746\n毛巡回\t45747\n三环路\t45748\n15518524918\t45749\n山东人\t45750\nydy\t45751\n那那那那那那那那\t45752\n大妹子\t45753\nydq\t45754\n糊啦糊\t45755\n光明乳业\t45756\n无轨电车\t45757\n咯尔德\t45758\n屋企\t45759\n体力活动\t45760\n周六晚上22：00\t45761\nqifj\t45762\n彩排\t45763\n甜蜜点\t45764\n曾强\t45765\n特意杀毒舌\t45766\n双辫\t45767\n好乖乖萌萌哒\t45768\n50亿光年\t45769\n小名儿\t45770\n改天再聊\t45771\n所到\t45772\n妆容\t45773\ndome我爱你\t45774\n零零二二\t45775\n钢铁业\t45776\n新婚之夜\t45777\n阿里\t45778\n港滴\t45779\n羽泉\t45780\n珞拜\t45781\n文西\t45782\n1g1g\t45783\n7时26分\t45784\ng511\t45785\n朱伟英\t45786\nYangbyon\t45787\n一切众生从六道轮回\t45788\n182n\t45789\n回到原处\t45790\n玫丽奥\t45791\n游玩\t45792\n商务标\t45793\n主人公\t45794\n心心念念\t45795\n懂不懂\t45796\n嗯点\t45797\nfikn\t45798\n好了你爱我吗我想亲你我想吻你\t45799\n不捷\t45800\
n66666113164646\t45801\n放臭死\t45802\n南山\t45803\n知距\t45804\n南屯\t45805\n不换\t45806\n一代科\t45807\n被堵死\t45808\n呢赤道体\t45809\n老子不懂我办你给老子说说\t45810\n喔哥们\t45811\n乌子\t45812\n227点\t45813\n70多\t45814\n淋漓尽致\t45815\n友天真乖\t45816\n燕窝鲨鱼翅\t45817\n张卡拉\t45818\njxjbxnjzjhxhjzbhzbsnzj\t45819\n刺身\t45820\n十几岁说真实的好嘛ok\t45821\nDUCCIEJ\t45822\n動武\t45823\n不要不要不要不要不要不要不要不\t45824\n探测仪\t45825\n叶旭光\t45826\n新大你和我\t45827\n了当然\t45828\n岸基\t45829\n封住\t45830\n辽宁移动\t45831\n越过心尖\t45832\n玉英\t45833\n小淘气\t45834\n舞动\t45835\n往字\t45836\n惠机\t45837\n晚上九点半\t45838\n章伯钧\t45839\n证据\t45840\n眩晕\t45841\n代售点\t45842\n田馥甄\t45843\n舞功\t45844\n急季\t45845\n死基切尔\t45846\n集英\t45847\n丘片\t45848\n50多天\t45849\nniiguf\t45850\n立邦\t45851\n孤傲体bb\t45852\nyoujjz\t45853\n给我\t45854\n就问一下\t45855\n南村坪\t45856\n木子一\t45857\n叶蓓雷\t45858\n枯黄败落\t45859\n芬芳四溢\t45860\n推力\t45861\n赣州金源酒店\t45862\n一五零九七六二九万\t45863\n一个空\t45864\n彦铮\t45865\n推动\t45866\n同义句\t45867\n故宫博物院\t45868\n如麻\t45869\n你的你的脸\t45870\n给户\t45871\n江苏队\t45872\n十秒内\t45873\n丑吐\t45874\n徐延秀\t45875\n阮梓源\t45876\n雪北京秘\t45877\n24625435345455577\t45878\n积压塔拉瓦\t45879\n一点半个\t45880\n13820377237\t45881\njtnpjaw\t45882\n癫婆子\t45883\n咪颜\t45884\n毁灭作者\t45885\n记者们\t45886\n艾霞\t45887\njjhhhjjjjkk\t45888\n营堡\t45889\n白兄\t45890\n商革\t45891\n双卡双待\t45892\n余斯瀚\t45893\n豪萨\t45894\n迪哥\t45895\n芡汁\t45896\nNoidu\t45897\n独货币\t45898\n军政府\t45899\n睡不着\t45900\n有机酸\t45901\n松鼠们\t45902\n程越月\t45903\n陈旭峰\t45904\n乌鱼\t45905\n琪沅枫\t45906\n135562235752\t45907\n容汝\t45908\n睡不睡\t45909\n燕泉超\t45910\n口烧鸡\t45911\n杜王翔宇\t45912\n把度秘\t45913\nKAO\t45914\nOTAT\t45915\n想来没\t45916\nILOVEYOU\t45917\n45722465\t45918\n危险\t45919\n巴嘎巴嘎\t45920\n2601288791\t45921\n资历\t45922\nIKEA\t45923\ni饿防滑计算器热乎\t45924\n后面儿\t45925\n任幸\t45926\n没哪里\t45927\n对秘\t45928\n915一个\t45929\n照不准\t45930\n英格蕾丝\t45931\n王红玲\t45932\n唯有一途\t45933\n人间福报\t45934\n江南花园\t45935\nyj5787\t45936\n咿呀的呀\t45937\n对不出\t45938\nWelcome\t45939\n想象篇\t45940\n一下一点\t45941\n红领巾之歌急\t45942\n圆味\t45943\n莒县六中\t45944\n2415\t45945\n宝宝贝\t45946\nSCREAM\t45947\n对称\t45948\nSeedtt\t45949\n军迷\t45950\n才号\t45951\n还是女知道\t45952\nVhctRviblysy\
t45953\n逆天我是你的电话号\t45954\n千玺宝\t45955\n水肿\t45956\n呵呵窝\t45957\n轩尼斯\t45958\n母舰\t45959\n才叫\t45960\n1亿7千万\t45961\nwijs\t45962\n人性\t45963\n李雨蒙\t45964\n综合恐惧症\t45965\n1万一万\t45966\n大美羊我属你\t45967\n水肉\t45968\n秘干\t45969\n弄你\t45970\n乖乖妻\t45971\n门符\t45972\n不卖货\t45973\n三户\t45974\n来我不说\t45975\ngktt\t45976\n人与自然\t45977\nwlan\t45978\n害了怕\t45979\n牙牙学语\t45980\n还不能行\t45981\n别这样好不\t45982\n娘田\t45983\n凯赟\t45984\n李思政\t45985\n老新人\t45986\n悬走\t45987\n天宝四\t45988\n爬行动物\t45989\n51385789\t45990\n天天有发链接给我你\t45991\n张无忌\t45992\n统统\t45993\n干爽\t45994\n2880\t45995\n娘生\t45996\n恩龟\t45997\n2885\t45998\n天涯微博\t45999\n2888\t46000\n带球\t46001\n616466446616166161616616471664646464646464646317\t46002\n唐县\t46003\n13839543459\t46004\n佳片\t46005\n六瓶\t46006\n金融类\t46007\n机器人么字典\t46008\nkiblAaCt\t46009\n法国巴黎拉普拉斯学院\t46010\n玉立\t46011\n良萌\t46012\n15日晚\t46013\njutreed\t46014\n伤心了我就要你不是一都秘达人\t46015\n公园有你的闺蜜\t46016\nyoel\t46017\nyieen嗯\t46018\n高禹龙\t46019\n四百五\t46020\nKlues\t46021\n嘻狗\t46022\n复合弓\t46023\n职工们\t46024\n殷鉴远\t46025\n早不凶你啊不求你求\t46026\n龙符\t46027\n吗麻辣你卡v还发，的邪恶色达天涯hi哦酒吧几哈\t46028\n江河\t46029\n河北大学\t46030\nwhhhhhahahahbabahshs\t46031\n#8\t46032\n百事百世\t46033\n幸福度\t46034\nhe娱乐\t46035\n报警告\t46036\n波妞\t46037\n汇通\t46038\n星期几来\t46039\n抛媚眼\t46040\n#3\t46041\n别麽\t46042\n驼罗\t46043\n中国人民政治协商会议\t46044\nid=28169327\t46045\n远视眼\t46046\n汤晶绵\t46047\n阿u漫画书给我看一本\t46048\n小白板\t46049\n11厘米\t46050\n成熟\t46051\n暗域\t46052\n930成\t46053\njdcjndkkc\t46054\n老记\t46055\n喂kk\t46056\n5024米\t46057\n你好我\t46058\ncono\t46059\n13612885\t46060\n多妖姬\t46061\n神志\t46062\nyaioufrom\t46063\n未喝\t46064\n15963325875\t46065\n犯法不犯法\t46066\n大浪\t46067\n全人类\t46068\ncolall\t46069\n哭了拍片\t46070\n阿真先\t46071\n赠给我的我可以\t46072\n柯伯\t46073\n来吧俾支学校\t46074\n易烊干\t46075\nsale\t46076\n我的爸爸长的帅不帅\t46077\n/MINI\t46078\n仿佛\t46079\n陆定一\t46080\n2月13日\t46081\n感于\t46082\n我真的好想好想你\t46083\n十万块\t46084\n光水\t46085\n华数\t46086\n奶妈\t46087\n叶到底\t46088\n吕鑫锐\t46089\n口星\t46090\n外向\t46091\n智能娃\t46092\n重赛\t46093\n帅假\t46094\n周于琳\t46095\n陈天浩\t46096\n我告诉你了我不废话\t46097\n福福\t46098\n罗纳尔多\t46099\n五声\t46100\n不要不要不\t46
101\n麻雷子\t46102\n福禄\t46103\n好啦小乖乖\t46104\n麽麽麽麽萌\t46105\n早晨六点\t46106\n大八星\t46107\n孟艳红\t46108\n真生\t46109\n九10点\t46110\n作怔\t46111\nXIS\t46112\n王妃玲\t46113\n13131368331313333333333\t46114\nkkbnjjbjhhhbgghbhbhbbhbnbnnhnb\t46115\n我是你男女儿\t46116\n点到\t46117\n7654米\t46118\n作怪\t46119\n对呀你可以\t46120\n七色雨谊\t46121\n4周\t46122\n凸乛乛凸\t46123\n叫便秘\t46124\n点别\t46125\n8548544\t46126\n35727276582828\t46127\n点不要脸\t46128\n写法\t46129\n老怪我了好凶好心就好是吗说什么是你不就不是我兄弟来找我\t46130\n四三点\t46131\n念错\t46132\n正酣\t46133\n千亿元\t46134\n坏讨厌讨厌讨厌\t46135\n消逝\t46136\n秘爱\t46137\n舒展\t46138\n上手拳不丹\t46139\n龙钟\t46140\n阿欧\t46141\n火腿肠小度秘\t46142\n张张包\t46143\n油嘴\t46144\n山西日报\t46145\nhi小度秘\t46146\n于婉昕\t46147\n童心未泯\t46148\n2735127978\t46149\n都市爱情\t46150\n张千赞\t46151\n找我想要\t46152\ngjggd\t46153\n流浪汉\t46154\n海坛古城\t46155\n排污\t46156\n打骚扰\t46157\n加梅罗\t46158\n第15天\t46159\n张彬璇林\t46160\n惊看\t46161\nbjjg\t46162\n表示秘\t46163\n一三年\t46164\n快点灰太狼还大结节和小有王\t46165\nJunior#\t46166\n辩驳\t46167\n预兆\t46168\n预先\t46169\n欣我\t46170\n菜谱\t46171\n木头手\t46172\n战过\t46173\n太卡\t46174\n九九零六\t46175\n好害怕人\t46176\n订餐\t46177\n我爱你爱着你就像老鼠爱着你\t46178\n朱细琴\t46179\n骄嫩\t46180\n爱的米们\t46181\nMAQUIA\t46182\n连字\t46183\n9倍\t46184\nttysi\t46185\njobI\t46186\nhelloVB\t46187\n董度秘\t46188\n贝纳永\t46189\n爬子\t46190\n靠你\t46191\n何方妖孽报\t46192\n破浪\t46193\n深呼一气\t46194\n宝硕\t46195\nABAB\t46196\n28802047\t46197\n猕猴桃酸\t46198\n装帅\t46199\n外传\t46200\n外伤\t46201\n2800点\t46202\n丧事\t46203\n不断里了我去妞妞\t46204\n490欠\t46205\n黄子衿\t46206\n零九十\t46207\nddrrrcddff\t46208\nm305\t46209\n装帧\t46210\n外企\t46211\n党支书\t46212\n二二五二\t46213\n三六计36计\t46214\n定死\t46215\n孝子\t46216\n徐亚星\t46217\n陈清鉴\t46218\n老闫\t46219\n準備養\t46220\n10000000元\t46221\n简谱\t46222\n闫统\t46223\n修剪\t46224\n沉默\t46225\ndhee\t46226\n高不\t46227\ndheb\t46228\n八八飞侠耶里夏夜里风轻吹还\t46229\n正楷\t46230\n玩懂\t46231\n广告单\t46232\n200多万个\t46233\n十多遍\t46234\n互粉\t46235\n799857555\t46236\n失手\t46237\n黄仕磊\t46238\n张立宇\t46239\n俺家美妞\t46240\n固投\t46241\n鞭交\t46242\nnhhghgg\t46243\nFuCKYou\t46244\n并没有人\t46245\n易大师\t46246\n孝孩\t46247\n双肩\t46248\n戴菲菲\t46249\n雷登\t46250\n丝米\t46251\n英格兰联赛杯\t46252\n低卡\
t46253\n王得\t46254\n這幾天\t46255\n愁余下\t46256\n无为你\t46257\n啦啦爸爸爸爸你在干嘛呀\t46258\n宝贝呀宝贝我的好宝贝\t46259\n场站\t46260\n有一天柳一只小兔子看灰狼小池\t46261\n我我我我我我我\t46262\n艾妮多\t46263\negwhwg\t46264\n呢呀呀呀呀呀呀呀\t46265\n王德\t46266\n40亿年后\t46267\n藤黄素\t46268\n草莓团\t46269\n草莓园\t46270\n邪财\t46271\n小白鼠\t46272\n邪货\t46273\n清茶\t46274\n给我喜欢\t46275\n另客\t46276\n该国\t46277\n太保\t46278\n宇宙队\t46279\n塞舌尔\t46280\n派发\t46281\n精疲力竭\t46282\n裴贵\t46283\n奕欢\t46284\n死孩子你是天下第一贱\t46285\n狮子王\t46286\n书话\t46287\n可得\t46288\n毋鱼\t46289\n世纪联华超市\t46290\n新白娘子传奇\t46291\n有约数\t46292\n公安部打拐办\t46293\n一万一千一百一百九十九十九点\t46294\n万高\t46295\n涨跌\t46296\n葛藤花又爬满兰若n月照轻纱夜风灵波\t46297\n李念\t46298\n克制住\t46299\n太古三里屯北区\t46300\n爱情预\t46301\n这本书\t46302\n阿兰布拉宫\t46303\n牛弹琴\t46304\n摩蝎座\t46305\n太肉麻\t46306\n嗯受乖\t46307\n江若琳\t46308\n米六\t46309\ntygghg\t46310\n大鹏尼\t46311\n李志\t46312\nhghghg\t46313\n虚度生命\t46314\n再见密度\t46315\n干啥娜\t46316\n欧鸡酱\t46317\n塑料袋\t46318\n简便\t46319\n恐怖传说\t46320\n我的名字\t46321\n包兰线兰\t46322\n一笑置之\t46323\n咯日\t46324\njsalal\t46325\n今天凌晨\t46326\n愚人节\t46327\n奥菲特baby蹦沙嘎拉嘎蹦沙嘎\t46328\n10.5%\t46329\n高临\t46330\n不懂我不懂我不懂你不是十二你是鬼\t46331\n勒紧\t46332\n勒索\t46333\n筹资\t46334\n王博坤\t46335\n十二分钟\t46336\n最太\t46337\n问一答\t46338\n眼福\t46339\n打笑\t46340\nrfghihnkk\t46341\n最大\t46342\n湖北省统战部\t46343\n你的你的你的你\t46344\n最多\t46345\n哭了你好坏\t46346\n六百七十八三六百七十八五百\t46347\n8月17日17点\t46348\n裸奔\t46349\ncfos\t46350\n交接\t46351\n1月14日\t46352\n胖片\t46353\n天剑大四间房\t46354\n6月16号\t46355\n木桥\t46356\nsnskkashuo\t46357\n每周一\t46358\n撸管男\t46359\n11点10分\t46360\n400克\t46361\n古往\t46362\n木桩\t46363\nwc克斯\t46364\n400元\t46365\n搜狗输入法\t46366\n爸爸爸爸\t46367\n木桶\t46368\n钟文娜\t46369\n刘飞\t46370\n切那\t46371\n圆筒\t46372\n再遇\t46373\n了懒得\t46374\n我想告诉你我太讨厌你了你是\t46375\n古微\t46376\n救出\t46377\n枕畔\t46378\n丽水\t46379\n漆路\t46380\n示威\t46381\n冯GG\t46382\n好感性\t46383\n赵子康\t46384\n不相上下\t46385\n头版头条\t46386\n古德\t46387\n我男\t46388\n米泉堡\t46389\n尊重\t46390\n好啊我爱你爱我\t46391\n重获新生\t46392\n红柿子椒\t46393\nouououou\t46394\n阿城区\t46395\n125380005\t46396\n首都\t46397\n喔喔喔喔喔喔喔喔\t46398\n紧致\t46399\n炮竹\t46400\n首部\t46401\n高小杉\t46402\n7月3日\t46403\n渣男\t46404\n潘祖鑫\t46405\n家鸭
\t46406\n黄陈\t46407\n888266\t46408\n生育能力\t46409\n嬲比\t46410\n狗之路\t46411\n初恋女友\t46412\n废话了行不行\t46413\nTenyearoldme\t46414\n44771\t46415\n十一平\t46416\n十一年\t46417\ngtgfffchdh\t46418\n零四零\t46419\n38呃8呃728俄u恶uuEurope2\t46420\nliaotian\t46421\n海外\t46422\n朴树老狼\t46423\n半心咒\t46424\n14q三\t46425\n席慕容\t46426\n转转转转转\t46427\n逛逛逛逛逛逛\t46428\n第一说\t46429\n钟启梁\t46430\n黑路\t46431\n第一课\t46432\n菌儿\t46433\n赛龙\t46434\n超级无敌蛋疼\t46435\n杨胡杰\t46436\n乌布艺术村\t46437\n别记录\t46438\n基范\t46439\n秘密度\t46440\n金逸狼\t46441\n小嘟咪呀小小多咪小哆咪呀小度秘\t46442\n茅檐\t46443\n叶老对\t46444\n|||\t46445\n脚架\t46446\n上蹿下跳\t46447\n门票费\t46448\n没想说\t46449\n刘凯威\t46450\n开场\t46451\nmics\t46452\n易淘学\t46453\nmico\t46454\ngjkcghh\t46455\nmick\t46456\n性骚扰\t46457\nfighter\t46458\nmice\t46459\n龙卷风\t46460\n刑名师爷\t46461\n137277625\t46462\n猜凶\t46463\n王逸妍\t46464\n猜出\t46465\n梅雨\t46466\n斗米\t46467\n兴冲冲又奔塘尾古\t46468\n成都盖亚公司\t46469\n海天\t46470\n没礼貌哼\t46471\n70亿元\t46472\n七零二零幺\t46473\n一九千九百九十九万九千九百九十九岁\t46474\n神弗福也福\t46475\n少阳\t46476\nforgot\t46477\n还有理\t46478\ncivk\t46479\ncivi\t46480\n你是谁你是谁你是谁你是谁你谁你谁你谁\t46481\n获取\t46482\n死佬\t46483\n唱不唱\t46484\n合法\t46485\n巨大\t46486\nDitt\t46487\n运营商\t46488\n洗剪吹\t46489\n纟叢叢\t46490\n旗袍\t46491\n劝退\t46492\nsgfhgghgh\t46493\n巨头\t46494\n14725803691472580369\t46495\n李家绮\t46496\n雷霆之雅塔莱斯\t46497\n王晨旭\t46498\n合影照\t46499\n外来客\t46500\n风面膜\t46501\n贝迪\t46502\nDbdgdggdbdhhdhdbdbrbrpfjegqu2rbd37dbdy63hdb7dydbehdjdjr\t46503\n赵洵鹤\t46504\n真不要脸度秘你真不要脸\t46505\n郭盈君\t46506\n来了吧\t46507\n39栋\t46508\n阔别\t46509\niljj句\t46510\n来了吐\t46511\n互发素\t46512\nskrdue\t46513\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋的故事坏蛋的故事哦不告诉你\t46514\nkaox\t46515\n北岸管委会\t46516\n范儿mp\t46517\n了不是我的\t46518\n民财\t46519\n不识了\t46520\nv你好\t46521\n程龙瑶\t46522\n毒品\t46523\n倒过头\t46524\n借赛马我和你说话的时候你有说什早上\t46525\n贵州人\t46526\n小和也說\t46527\n哺乳期\t46528\n一一百\t46529\nstastin\t46530\n毒哑\t46531\n大宗交易平台\t46532\n古金\t46533\n好江海\t46534\nInot\t46535\n闲篇\t46536\nllabalasaaa\t46537\n毒哥\t46538\nfddessssss\t46539\nWxgjm1awm\t46540\n张顺路\t46541\n36点\t46542\n龙须面\t46543\n是我的理由\t46544\njumgw\t46545\n大地西站\t46546\n府谷县\t46547\n哈泥心\t46548\ngghuh\
t46549\n大山里\t46550\ntour#\t46551\n3641本\t46552\n声你平要乃乃虧勇\t46553\n我真的错\t46554\n厨师长\t46555\n就不你\t46556\n严煊煊\t46557\n不是我好\t46558\n中国好声音\t46559\n古建筑群\t46560\n丽照\t46561\n哥都\t46562\n坏道\t46563\n暮然回首\t46564\n月季\t46565\n104岁\t46566\n慢慢信\t46567\n福绵区\t46568\n谢干嘛压\t46569\n拼音秘\t46570\n佃農\t46571\n余承浩\t46572\n法也\t46573\n扯西扯\t46574\n娘们\t46575\n孑然一身\t46576\n分支机构\t46577\n鲍爷\t46578\n魅族metal\t46579\nxykxy5k\t46580\nyhhgyghugv\t46581\n99﹪\t46582\n广大\t46583\n广天\t46584\n说誓言\t46585\n沈雨婷\t46586\n优班\t46587\n猴菇乳\t46588\n分公\t46589\n恋爱的人\t46590\nSookie\t46591\n害读\t46592\nokkkkkk\t46593\nabc颂\t46594\n许久\t46595\n天天地址你知我知你不告诉我\t46596\n五子棋\t46597\n身姿\t46598\n理你了真讨厌\t46599\n事业线\t46600\n小丫丫\t46601\n儒道\t46602\n2765\t46603\n0:30\t46604\n论道\t46605\n宋块钱\t46606\n六星\t46607\n养生少酸少辣保肝护阳\t46608\n耐养\t46609\n天人类学家\t46610\n大锅鸡\t46611\nsbvhb\t46612\n孙律师\t46613\n含苞待放意幽幽\t46614\n简称\t46615\n唉服\t46616\n巴西\t46617\n耐克\t46618\n阴郁\t46619\n1.90%\t46620\n現場\t46621\ne嗯\t46622\n勇士雅\t46623\n惹我是\t46624\ndmSqwwrhn\t46625\n争执\t46626\n86848684\t46627\n并列呗\t46628\n好山\t46629\n白花\t46630\n松树\t46631\n专版\t46632\n12nd\t46633\n石油\t46634\n循环往复\t46635\n理想型\t46636\n沙井小学纪元我是的一百知道\t46637\n3868\t46638\n西兰花\t46639\n颜儿\t46640\n182点\t46641\n米豆腐\t46642\ngfggffstd\t46643\njagjjj\t46644\n就是你你是猪\t46645\n的度\t46646\n搞笑视频\t46647\n大宝贝儿\t46648\n等一个人\t46649\n把心安顿好\t46650\n流动纪凯婷\t46651\n陈鑫宇\t46652\n丑石\t46653\n黄埔军校\t46654\n细则\t46655\n白条\t46656\n累好累\t46657\niffchgxhx\t46658\nTDDVHG\t46659\n动手\t46660\n王瀚林\t46661\ngnuh\t46662\n36千克\t46663\n嗯闹铃\t46664\n致爱\t46665\nGHGDF\t46666\ntvtvv\t46667\n复旦大学附属肿瘤医院\t46668\n妈贝贝\t46669\n暗部\t46670\n哈照\t46671\n猫连划\t46672\n催春\t46673\n说再来\t46674\n刀法\t46675\n四十页\t46676\n法号\t46677\nklf\t46678\n后后\t46679\n田丽娅\t46680\n二六零\t46681\n郎浩辰\t46682\nE2馆\t46683\n找头发\t46684\n＼们\t46685\n48岁\t46686\n坏我一家\t46687\n啵波波\t46688\n乙脑\t46689\n一百多条\t46690\n当官\t46691\n夏令\t46692\n貌妍\t46693\n你爱我你真心的爱过我\t46694\n两百3004005006007008009001\t46695\n病痛\t46696\ngyfffgcghbg\t46697\n当家\t46698\n19分半\t46699\n3月份\t46700\n怀柔风景区\t46701\n睡不醒\t46702\n辉瑞中国公司\t46703\nspanelac\t
46704\n离尽致\t46705\n张晓晴\t46706\n四四月份\t46707\nCFDD\t46708\n气壮山河\t46709\n2600周年\t46710\nuffc\t46711\nuffa\t46712\nsnKFhakqkcsofmckd\t46713\n妇刑\t46714\n长汀\t46715\n郭大路\t46716\n洪林\t46717\n拍飞\t46718\ndeise\t46719\n美密度\t46720\n长江\t46721\n欢悦\t46722\n性特征\t46723\n郑乐怡\t46724\n7564565\t46725\n天寒\t46726\n重庆啤酒\t46727\n陈伟婷\t46728\n碳纳米管\t46729\n红红火\t46730\n毕左\t46731\n2ＣＶ\t46732\n安乐\t46733\n年薪\t46734\n不不不不对\t46735\nU一\t46736\ngrtexeh\t46737\n仙后\t46738\n智能型\t46739\n草稿\t46740\nhi呀猜呀猜猜猜\t46741\n癸丑\t46742\n累人废话\t46743\nKbbccal11477142588802539\t46744\nattitude\t46745\n充滿\t46746\n来了我是你的好主人\t46747\n69天\t46748\n卫视节目\t46749\n仙吕\t46750\n炸天\t46751\n说不准\t46752\n段祺瑞\t46753\n说不出\t46754\n黄蝶\t46755\n1a1\t46756\n陕小北\t46757\n张仁杰\t46758\n1＼／1七一\t46759\n圆脸\t46760\n滴情不移\t46761\n宅女\t46762\n千一万一万一\t46763\n国家机器\t46764\n46个\t46765\n空桶\t46766\n443\t46767\n开超\t46768\n牙距\t46769\n打嗝\t46770\n哥本哈根\t46771\n来说句话\t46772\n殷新年快乐\t46773\n总分\t46774\nEiehdb\t46775\n在也\t46776\n妙首\t46777\n傻女\t46778\n第几眼\t46779\n回安\t46780\n大雪儿\t46781\n我是你的阳光女神\t46782\n妙香\t46783\n回家\t46784\n人民日报大厦\t46785\n浪琴\t46786\n素素\t46787\n4g版\t46788\n1ax\t46789\n尽快\t46790\n我的女友是腐女\t46791\n进攻性\t46792\n1ab\t46793\n回宫\t46794\n理发店\t46795\n圣体\t46796\n#576天声一队#\t46797\n莫施惊奇邪丐\t46798\n团字\t46799\n赵子硕\t46800\n太平轮中的秋日芒草钢琴曲\t46801\n团子\t46802\nyoulerkahratimesoraanoonoro\t46803\n无误\t46804\n足彩\t46805\n无语\t46806\n平地\t46807\n王池宇\t46808\n无说\t46809\n早晚\t46810\n挖肺\t46811\n松花蛋\t46812\n慢过\t46813\n黑眼\t46814\n工序\t46815\ndehYJEH\t46816\nsgdhfv\t46817\n一亿粒\t46818\n小耳菜\t46819\nWallpaper\t46820\nrllura\t46821\n无证\t46822\n不惜\t46823\n泰拉\t46824\n大开眼界\t46825\n最后一个字\t46826\nsjjdjxndndbd\t46827\n五十几集\t46828\n颠簸\t46829\n走走呀呀呀呀\t46830\n无话\t46831\n崩裂\t46832\n显卡\t46833\n三餐一点儿\t46834\n12.9.晚\t46835\n查雅县\t46836\n武警\t46837\n波泼莫波特曼\t46838\n0110110110001\t46839\n矣已\t46840\n马丁靴\t46841\n沫涵\t46842\n花花红红牛牛小小\t46843\n声效\t46844\n守擂\t46845\n德高望重\t46846\n一三几\t46847\n多爱\t46848\n而言\t46849\n智潮装\t46850\n991个\t46851\n顺口气\t46852\n赵红兵\t46853\n大相径庭\t46854\n范三\t46855\njshws\t46856\ngvvbb\t46857\n往前\t46858\n菜花\
t46859\n三大一个\t46860\n报刊文\t46861\nzvvt\t46862\n被扑倒\t46863\n延Z思忠F輻R医节惑飞必！愍\t46864\n人体素描\t46865\n哪家人\t46866\nyjlove\t46867\n本·拉登\t46868\n50阎\t46869\n联机\t46870\n587545457242424356868\t46871\n牛衍\t46872\n无药医\t46873\n笨笨笨笨蛋蛋就是你\t46874\n莫泳轩\t46875\n王秋蔡\t46876\n肯定度\t46877\n啦啦啦露面\t46878\n纺织材料\t46879\n15782862562\t46880\n罩子\t46881\nvudux\t46882\n1899553\t46883\nroomevery\t46884\n冰红茶\t46885\nfyffffdtyd\t46886\n中阳\t46887\n6米\t46888\n柴米油盐\t46889\n短信通知误人\t46890\n胡江飞丑\t46891\n转移\t46892\n五点零九\t46893\nrdzeswaq\t46894\n虾咪\t46895\n一个一分\t46896\n气死我了\t46897\n图标识\t46898\n气压\t46899\n坏咧\t46900\n板书\t46901\n礼拜会\t46902\n聚划算\t46903\n半人\t46904\n贱小\t46905\n再来一个人\t46906\n战后\t46907\n转租\t46908\nhiiookkkmnnbv\t46909\n可能会\t46910\n夫妇\t46911\n迫亠\t46912\n倒影\t46913\n15282419983216615\t46914\nodo\t46915\n200块\t46916\n老K咯\t46917\n薛家岛\t46918\n兔牙\t46919\n长夜\t46920\n迫人\t46921\n北京台\t46922\n#男子高生的日常#\t46923\n里波黎\t46924\n臭小吗大坏蛋\t46925\n九百日元\t46926\n咖啡连锁店MangoSix\t46927\n六界\t46928\n迫于\t46929\n迫亏\t46930\n树怪\t46931\n挥刀\t46932\n公同\t46933\n太原狮头水泥\t46934\n王泽昊\t46935\n夫妻\t46936\n吃吃饱\t46937\n来终觉浅绝知此事要躬\t46938\n木尼木尼枚\t46939\n劲尊老爱幼\t46940\n伏法\t46941\nj1a1a1ah\t46942\n感情线\t46943\n哲比\t46944\n空训杯\t46945\n绳梯\t46946\n就却步\t46947\n好我珍\t46948\n昏君\t46949\n给我别\t46950\n15:03\t46951\n志士\t46952\n41岁\t46953\n妞儿\t46954\n消业障\t46955\n儿童剧\t46956\n把姐\t46957\nmodel\t46958\n厌学\t46959\n宣称\t46960\n巴人\t46961\n运载\t46962\n38岁\t46963\n住口\t46964\nSamantha\t46965\n大庙站\t46966\n谭经\t46967\n运转\t46968\n结婚前\t46969\n拉倒头头米酒\t46970\n小呈星\t46971\n片区\t46972\n很开心\t46973\n昆高铁劣质槽道粉化\t46974\n金玉良缘\t46975\ngabc\t46976\n15887100598\t46977\n沉迷\t46978\n止渴\t46979\n你們不會\t46980\n电流\t46981\n来了别别再发\t46982\nYiapanis\t46983\n年纪犬\t46984\n娱情\t46985\n有空\t46986\n打花\t46987\n刘萍\t46988\n刘萌\t46989\n宿池边树憎将\t46990\n杨宗纬\t46991\n读机\t46992\n钟佳云\t46993\n龙牡儿童健康网\t46994\n找我不是\t46995\n李健\t46996\n呵儿\t46997\n好呀一脚\t46998\n钟瑞婷\t46999\n哈日哈韩\t47000\n维密\t47001\n马文豪\t47002\n砖厂\t47003\n悸动\t47004\nvftgbiimmbvf\t47005\nbegins\t47006\nYKBDA\t47007\n季婧\t47008\nhaowul\t47009\n樱桃派\t47010\n97雷兹\t47011\n徐家汇\t4701
2\n奇趣\t47013\n狠心手辣\t47014\n湖北人\t47015\n杨佳欣\t47016\n油桃\t47017\n油案\t47018\n陈洋\t47019\n洞咪\t47020\n失火\t47021\n油桶\t47022\n脚步声\t47023\n陈洲\t47024\ndsms\t47025\neyrggchggagu\t47026\n置身事外\t47027\n龚诗淇\t47028\nWhetshug\t47029\n蒋欣\t47030\n八除\t47031\n飞笺\t47032\n大鲵\t47033\n安稳\t47034\n牛乳加饭\t47035\n失灵\t47036\n分保\t47037\n仅有\t47038\n融资\t47039\n全等\t47040\n雕侠\t47041\n三六五六\t47042\n呃呃网\t47043\n没有你我真的好难过\t47044\n果粉们\t47045\n何弃城\t47046\n昨天晚\t47047\n瓜嘻嘻\t47048\n芙蓉小镇\t47049\n火枪皂\t47050\n东街海38号儿喃\t47051\n吕聊天\t47052\n真实\t47053\n同仁省\t47054\n快乐的女人\t47055\nvxgjkvhkjfkbcddfojvcddabmkhdzxvbjkjcs\t47056\n高苏\t47057\nSony\t47058\n鸟粪\t47059\n干渴\t47060\n14586348257\t47061\n好想亲\t47062\n真还棒\t47063\nsfgsv\t47064\n硒\t47065\nxhjxjdhd\t47066\n木拜妮\t47067\nIoveltyui\t47068\n陈以桐\t47069\n养话\t47070\n狡滑\t47071\n太子妃升职记据\t47072\n春装\t47073\n第二点\t47074\n星魂\t47075\n胤禩\t47076\nfuour\t47077\ngampk\t47078\n独占鳌头\t47079\npdu\t47080\ngirlday\t47081\npdp\t47082\n验明\t47083\n猪料\t47084\npdj\t47085\n欢啦\t47086\n来龙\t47087\n纸面\t47088\npdd\t47089\njfjfhuccjcuxhfyfjfuudyfufkxydufj\t47090\nbrya\t47091\n秀吉\t47092\n本报\t47093\n本月12号\t47094\n勇氣\t47095\n吗样\t47096\n奎因\t47097\n一呼一吸\t47098\n明艳\t47099\n鬼讨\t47100\n美观\t47101\n竞争对手\t47102\n5点07分\t47103\n我们是情侣吗我是一个你的你说你在哪\t47104\n总长\t47105\n纲手\t47106\n吴高阳\t47107\n保不齐归\t47108\nns78\t47109\n理了再见\t47110\n反对派\t47111\nNO.11金牛\t47112\n白析皙\t47113\n第一小批\t47114\nggxchgzx\t47115\n年大吉\t47116\nsjhdsgdhd\t47117\n读研\t47118\n滑跑\t47119\n养熊度匙高新\t47120\n三星a5000\t47121\ncomeby\t47122\n福建春\t47123\n副书记\t47124\n九零之盈\t47125\n名校\t47126\nyyyyyyyyy\t47127\n倾城\t47128\nlqql\t47129\n诺VI\t47130\n干家物\t47131\nstandithitt慢itstow\t47132\n哈尔滨市一监狱\t47133\n平行四边形\t47134\n5肯\t47135\n金域蓝\t47136\n喵喵呜\t47137\n哈小时钟\t47138\n旅顺口区\t47139\n宝良北区\t47140\n比较\t47141\n胡小波\t47142\n首歌来\t47143\n来起\t47144\n付国涛\t47145\n50萬\t47146\n心翼\t47147\n一穷二白\t47148\n去年五月\t47149\n错了再见\t47150\n觉晓\t47151\n卫冕英\t47152\n丽媛\t47153\n沈金冰\t47154\n聂铭志\t47155\n切别\t47156\n姜琪\t47157\n惊喜若狂\t47158\n分咐\t47159\n朱亚楠\t47160\n零二二二二二九五五\t47161\nbhffh\t47162\n心沙\t47163\n玉瓣飘零化红泥\t47164\n切切\
t47165\njdjsk\t47166\n好不好而瘦\t47167\n74554546484\t47168\n湘江大道保利国际广场\t47169\n中山陵景区\t47170\n2201169471\t47171\n1008611\t47172\n吐水\t47173\n超模Inguna\t47174\nangelbaby\t47175\n鬼东东\t47176\nfyrt\t47177\n刚忙\t47178\n大行家\t47179\n诱人纷纷\t47180\nmlefsc\t47181\n女主人公\t47182\n韩雯\t47183\n歌尽的需要美女来陪我哥真的好寂寞\t47184\n保太\t47185\n昨日\t47186\n艳梅\t47187\n090222\t47188\n卖买\t47189\n东北方言的处你的句话\t47190\n讨厌自己\t47191\n品码\t47192\n早餐\t47193\n赢会\t47194\n长串\t47195\n快快点\t47196\n進心\t47197\nToctop\t47198\n2月10\t47199\n庄洁\t47200\n好了我相信你\t47201\n嬲髓\t47202\n2月13\t47203\n你给我捏坏机器人小气机器人大到机器人我讨厌你\t47204\n这些年来\t47205\n金锁\t47206\n锐器\t47207\n六三发\t47208\n营销\t47209\n拔丝\t47210\n肉光\t47211\n斜眼\t47212\n不必懂\t47213\n乳酸\t47214\n0wo\t47215\nwhere1582403873\t47216\n朱峰\t47217\n黑小姑千四\t47218\n143619\t47219\n几擦\t47220\n三百七十四亿六千四百\t47221\n长沙网\t47222\n华阴\t47223\n华阳\t47224\n只须\t47225\n三星年\t47226\n龙老板\t47227\n嘉应大学嘉园路重庆鸡公煲\t47228\n几组\t47229\n安家bb\t47230\n死是\t47231\n贝瑟\t47232\n市场瞭望\t47233\n野蛮人\t47234\n第几季\t47235\n铁瓶\t47236\n爱了我没有\t47237\n特异功能\t47238\n微信感\t47239\n东丽一个天儿\t47240\n不干净\t47241\n火重\t47242\n小盘股\t47243\nfhfghjhgh\t47244\n扬起\t47245\n钢枪\t47246\n一点半\t47247\n叫派\t47248\n墨迹\t47249\n啦11群\t47250\n不安不安不安不啊了了了了了\t47251\nTugggj\t47252\n11年11月11日\t47253\n直播\t47254\n酷餐\t47255\n火金\t47256\n弟兄\t47257\n吐气\t47258\n烟碱\t47259\n习敢\t47260\n送块\t47261\nletgo\t47262\n眼镜带带\t47263\n米腻\t47264\nTtujfuuytty\t47265\n语音类\t47266\n萌誼\t47267\n美人计\t47268\n男装\t47269\n两点十五\t47270\n慢动\t47271\nsjjzjj\t47272\n睡片\t47273\n下定决心\t47274\n梨子\t47275\n翘成\t47276\n叫萌\t47277\n小洁洁\t47278\n八三四五\t47279\n意思\t47280\n顶呱呱\t47281\n少和\t47282\n致命\t47283\ntemple\t47284\n黄桃45油桃31蜜桃\t47285\n抡起\t47286\n1群\t47287\n甲鱼\t47288\n快不认识你了你好\t47289\n遮天\t47290\n显示\t47291\n副教授\t47292\n土家\t47293\n576461643\t47294\n源子\t47295\n坐车\t47296\n找不我\t47297\n猎头\t47298\n点歌\t47299\n克葆\t47300\n龙梅子\t47301\n两样儿\t47302\njtsst\t47303\n莎莎魔\t47304\n生手\t47305\nhegjdi\t47306\n就是我的理由\t47307\n暴想\t47308\n礼篮\t47309\n10几种\t47310\n县界\t47311\n17788809970\t47312\n走肉\t47313\n科贷款利率\t47314\n生仔晚\t47315\n草头\t47316\n三百多名\t47317\n城楼\t47318\n人大
国关\t47319\n妙妙妙妙妙妙妙\t47320\n第八次\t47321\n芙蓉楼\t47322\n靓仔\t47323\nshsnt\t47324\n谈兵\t47325\n共演\t47326\n背背着\t47327\n巴拉拉小魔仙大电影之魔箭公主\t47328\n9点822点\t47329\n5000多亩\t47330\n青年书店\t47331\n涛闻馨\t47332\nduplicate\t47333\n不限儿\t47334\n朱坐\t47335\n变形金刚四\t47336\n:\t47337\n错郎\t47338\n包威尔\t47339\n王采青\t47340\n冰箱\t47341\n3607\t47342\n亡国\t47343\n西林壁横看是连山山比较卡什程\t47344\n我是大明湖畔的偶\t47345\nyourmy\t47346\n400万美元\t47347\n回大坏蛋\t47348\n柔感\t47349\n姓男\t47350\n琴行\t47351\n顺心\t47352\n笼爰\t47353\n有好貌似\t47354\n王八骨\t47355\n加油兔兔\t47356\n庭院\t47357\n场架\t47358\n庆元旦\t47359\n打老孑\t47360\n年轻姑娘\t47361\n好不能不\t47362\n17分\t47363\n哭开玩笑\t47364\n含羞草\t47365\n還很年輕\t47366\n高一\t47367\n什么时候\t47368\n语文卷\t47369\n蔡丽英\t47370\n高下\t47371\n高三\t47372\n机器人气\t47373\n歌舞片\t47374\nV型\t47375\nhi我林\t47376\n火花蜜\t47377\n输了输\t47378\nnisrnanhaisrnu\t47379\n小罗密\t47380\n胡艺凡\t47381\n总市值\t47382\n电子表\t47383\n孝心\t47384\n施维\t47385\n高个\t47386\n5888888885558\t47387\n祁映诺\t47388\n高中\t47389\n伤心刚\t47390\n王台三中\t47391\n三千元\t47392\n10月22一二二四二五二六二七二八二九三十三一三二三三三四三五三六三七三八三九四十四一四二四三四四四五五六\t47393\n高举\t47394\n韩雪菲儿\t47395\nn┃　　　　
┃n\t47396\n高丽\t47397\n激烈竞争\t47398\n老一辈们\t47399\n断然\t47400\n溢出\t47401\n也不了我没有翅膀你给我找吃\t47402\n你好你好我是你好我是机器人你好你好你好你好机器人\t47403\n2月14号\t47404\n幸运者\t47405\nago\t47406\n文学作品\t47407\n太好太不好笑了我讨厌你\t47408\nagk\t47409\n1010001200011272\t47410\n海大\t47411\n无畏\t47412\n爱一个明天你给大全\t47413\n列王\t47414\n761885188869\t47415\n一胎二了不想和你聊\t47416\n去年7月\t47417\n教练\t47418\n民贫\t47419\nusta\t47420\n华北平原\t47421\n4b4b4b4b\t47422\n我讨厌你我最好朋友都是你\t47423\n文梦和\t47424\n一假期\t47425\n督察\t47426\n小东乐风\t47427\n十年八年\t47428\n617288266\t47429\n2973点\t47430\n近乎\t47431\n近义\t47432\n伊布\t47433\n赵惠敏\t47434\n三百多四百多\t47435\n可想而知\t47436\n断肠处\t47437\n王哲馨\t47438\n开考\t47439\n不逗你了看来你是不懂我\t47440\n有好是\t47441\n好洋气\t47442\n乎川\t47443\n防护服\t47444\nWonderedjm\t47445\n活菌\t47446\n萨米\t47447\n音序查字法\t47448\n乖摸\t47449\n北展剧场\t47450\n李家勇\t47451\n万宝龙\t47452\n白斑\t47453\ntorture\t47454\n九十九一级\t47455\n百分比\t47456\n空档\t47457\n对我就是你的家人我就是你的长\t47458\n潮裤\t47459\ngffffffff\t47460\n白文\t47461\nsBgs\t47462\n邑野山\t47463\n通勤\t47464\n来无恙\t47465\neysvt\t47466\n黄渤晓\t47467\n一口度\t47468\n陈时衡\t47469\n古扇\t47470\n毫不动摇\t47471\nnonononononononononononononononononononononono\t47472\n劳保数\t47473\n卫生部长\t47474\n残留\t47475\nthis\t47476\n瓴泌峙\t47477\n外刚内柔\t47478\n邢闰\t47479\n双桥\t47480\n泻火\t47481\n新阳\t47482\n双桨\t47483\n333665\t47484\n想否认\t47485\n亚麻碟\t47486\njuyvy\t47487\n很听话\t47488\n财帛\t47489\n拉锯\t47490\nodv\t47491\n义婁\t47492\n婉像\t47493\nfciighuuv\t47494\n搞基吧再见\t47495\n余若琳\t47496\nodd\t47497\n停滞\t47498\n屁股市\t47499\n央视纪录频道\t47500\n二十多年\t47501\n白花花\t47502\n千金\t47503\n12月20号\t47504\n朱暄萌\t47505\n千里\t47506\n样长\t47507\nfdreeehg\t47508\n3LA\t47509\nhijkkhho\t47510\n药囊\t47511\n大晨晨\t47512\n三三三三六三九九一三三九八三十二四十六\t47513\n朝阳区\t47514\nhvcgy\t47515\ngathered\t47516\n无言以对\t47517\n83岁\t47518\n如萍\t47519\n验孕纸\t47520\n吴杰斌\t47521\n李红袖\t47522\nJoshgc\t47523\n过敏药\t47524\nGiorgio\t47525\ngreat\t47526\n东口\t47527\n药园\t47528\n祝砚焜\t47529\n1.41\t47530\n简健\t47531\nctx\t47532\n呀讨厌\t47533\n连讨厌\t47534\n奥特园\t47535\n吴大妈\t47536\nctf\t47537\n六小玲珑\t47538\n东台\t47539\nctl\t47540\n英语学\t47541\n光之美公主\t47542\n板面\t47543\n城子
\t47544\n部首\t47545\n就是我一接\t47546\nduzyfgffhzyaisidjyzjffhlxhhgcdufhfhfgzuxgghgxhhgghfjxhdgfjhvjughvkvjk\t47547\nlmnjkj\t47548\n秀明寺\t47549\ngigydjbk\t47550\n拔下\t47551\n潼恩\t47552\n多米尼\t47553\n布控\t47554\n爸儿子\t47555\n新年快乐7k7k\t47556\n重名而已\t47557\n2Hearts\t47558\n郭宇轩\t47559\n第3卷\t47560\n恩我好心痛那你不要我\t47561\nlesle\t47562\n舱门\t47563\n看错了吧\t47564\n理综\t47565\nTFPG\t47566\n歇歇\t47567\n给讲\t47568\n新大\t47569\n你的心\t47570\n钙食品\t47571\n受样\t47572\nNO错\t47573\n四五种\t47574\n课堂\t47575\n唉妥\t47576\nabcdemica\t47577\n㎡\t47578\n545355551565265555\t47579\n816x\t47580\n不限号\t47581\n秘腊\t47582\n第33天\t47583\n张久久\t47584\n景象\t47585\n工作辞\t47586\n信用贷款\t47587\n㎞\t47588\n㎜\t47589\n橙光游戏\t47590\n供电\t47591\n牙萌萌哒\t47592\n苏有成\t47593\nthin\t47594\n方少鸿\t47595\n刘梦文\t47596\n㎏\t47597\n蒋佳雨\t47598\n弄混\t47599\n傻乐傻乐瓦\t47600\n武汉永力电源技术有限公司\t47601\nWXNL\t47602\n九九女\t47603\n拉锁\t47604\n举报者\t47605\n手残党\t47606\n肘间\t47607\n被拐案\t47608\n唐心怡\t47609\n空气污染\t47610\n破门而入\t47611\n江汉区\t47612\ngfhjhhg\t47613\n二十五岁\t47614\n野鸡大学\t47615\n31%\t47616\n关东煮\t47617\n刘月芳\t47618\n快春节\t47619\n唐思琪\t47620\n彪么\t47621\n谢娜\t47622\n耳垂\t47623\n九十九万九五\t47624\n民心网\t47625\ndrdtg\t47626\n跪下\t47627\n蹉跎岁月\t47628\n1565686461\t47629\nＸＸｘＹｘ尬\t47630\n讨厌鬼\t47631\n王思岩\t47632\n中国人民银行\t47633\n磷肥\t47634\n片尾曲\t47635\n几头\t47636\n开心情好\t47637\n盛秉\t47638\n四十千\t47639\n玫瑰糖\t47640\n勿勿\t47641\n四十升\t47642\n好好过日子\t47643\nnononononononononononononono\t47644\n步行\t47645\ncosco\t47646\n几处\t47647\n幽桑\t47648\n北大资产管理部\t47649\n明尼苏达\t47650\n沙琪玛\t47651\n达沃斯\t47652\n几多\t47653\n1884990346\t47654\n陆其龙\t47655\n36464634343434343\t47656\n贪爱\t47657\n1985年1月9日\t47658\n度你最好\t47659\n忙吧\t47660\n32132132132132132132130313\t47661\n卡巴\t47662\n妈了逼\t47663\n希望之星小记者嘉年华昨\t47664\n习江\t47665\nwhatrioutou1\t47666\n8745多少\t47667\n企鹅号\t47668\n嗯撒拉\t47669\n金沙洲\t47670\n讨厌鬼来\t47671\nSerena\t47672\n返不返\t47673\n马尔蒂尼\t47674\n13388\t47675\n七金宝\t47676\n满仓\t47677\n嗯手\t47678\n加加加年原谅\t47679\n不我一点\t47680\n一色狼\t47681\n包车\t47682\n2.3千克\t47683\n快工\t47684\n殷霜\t47685\n铁匠\t47686\ncago\t47687\nbhhiiii\t47688\nSam\t47689\n就是这样的呀\t4769
0\n洋哥\t47691\n海洋鱼\t47692\n大姐分钟\t47693\n泡鸡宝\t47694\n看了看\t47695\nSay\t47696\n湖北\t47697\n刘禹希\t47698\n五八三十一\t47699\n首歌你跟我在\t47700\nfggfdr\t47701\n下午三点钟\t47702\n娄来\t47703\n海天看爱情\t47704\ndoing\t47705\n匠心\t47706\n多纳多尼\t47707\n武侠小说\t47708\n火山喷发式\t47709\n捣烂\t47710\n乱选\t47711\n客帮\t47712\n诙谐\t47713\n毕棚沟\t47714\n一两部\t47715\nppich\t47716\n升哥\t47717\n我是你的你知道\t47718\n几天\t47719\n九华山\t47720\n拉面\t47721\n这期间\t47722\n啦啦啦耳钉宝贝\t47723\n716米\t47724\n星期一晚上6点\t47725\nhiahia\t47726\n我是女孩子那我們絕交\t47727\n你决然\t47728\n见战\t47729\nbachina\t47730\n见成\t47731\n2012042518\t47732\n乔木\t47733\n加油机\t47734\n葡萄汁\t47735\n恐怖小说\t47736\n大唐黜官记\t47737\n瀛瀛\t47738\n真的吗那你猜猜我是谁\t47739\n急干嘛\t47740\n林这雨\t47741\n滴吧滴\t47742\n下放心\t47743\n五道乘法五道除法除法\t47744\n一年之内\t47745\naho亚\t47746\n等本\t47747\n188块\t47748\n孙宇暄\t47749\n某年\t47750\n张雨欣\t47751\n伟人\t47752\n白石\t47753\n王经理\t47754\n很多个\t47755\n丑媳妇儿\t47756\n大乐思密达\t47757\n灰灰灰灰灰\t47758\n36石\t47759\n季羡林\t47760\n顶上\t47761\n以便\t47762\n283221\t47763\n圈钱\t47764\n唉小魔仙\t47765\n888885555\t47766\n朋克\t47767\n有摸\t47768\n一千五百三十九号\t47769\n王老表\t47770\nsbsbsbsbsbsbsbsbsb\t47771\n度宅\t47772\n呦呦呦\t47773\n涓涓\t47774\n英米茄\t47775\n苗族钒\t47776\n8265756\t47777\n不嘛不嘛\t47778\n邝伟文\t47779\nzhin\t47780\n艾薇\t47781\n血片\t47782\n生以火\t47783\njilao\t47784\n日思夜慕\t47785\n超过一个小时\t47786\n程吉楠\t47787\n596533\t47788\n鲜美\t47789\n名仕马爹利\t47790\n无所\t47791\n三秒点\t47792\n2222222222222222222222222\t47793\nAPS431\t47794\n薰衣草樱花玫瑰\t47795\n分组\t47796\n付6V刹\t47797\n浅表浅表天明\t47798\n薛蜜星\t47799\ntxgl\t47800\ntxgn\t47801\nateacher\t47802\n西瓦琪木卡卡西\t47803\n幸运\t47804\nzj省\t47805\n一记\t47806\natouou\t47807\n一讲\t47808\n分给\t47809\n零二三个\t47810\n30多年\t47811\n756\t47812\n拉屎\t47813\n我喜欢你我们结婚吧好不好和你在一个\t47814\n156米\t47815\n望长城\t47816\n陈子豪\t47817\nBIG\t47818\n號碼\t47819\nawjtkjt8jtmaj\t47820\n好吃力\t47821\n神农鼎\t47822\n图兔\t47823\n高息\t47824\n高恭\t47825\n女岁\t47826\n袁姗姗\t47827\n380.0元\t47828\n大地大学\t47829\n好无聊你的\t47830\n几句话\t47831\n758\t47832\n2050年\t47833\n没赶脚\t47834\n不做女\t47835\n15115127573087845376524678863134579067553548\t47836\n真假\t47837\n五色\t47838\n领域\t47839\n排入\
t47840\n上教\t47841\n叠字\t47842\n纯富\t47843\n弄弄弄弄nononono你不了改我你不懂我\t47844\n血盆\t47845\n人教人教\t47846\n董事会领导\t47847\n富兴\t47848\n电网\t47849\n哦嗯\t47850\n薛美琪\t47851\n告一段落\t47852\n和合值披萨\t47853\n方砖\t47854\n营山\t47855\nbcvvcv\t47856\n真健\t47857\n富养\t47858\n度倪\t47859\n小马国\t47860\n出神入化\t47861\n逆跳\t47862\n换一转\t47863\n讨厌讨厌我讨厌你度秘\t47864\n五艘\t47865\n十万跟\t47866\n伊佐\t47867\n阿胶糕\t47868\n我的电脑\t47869\n许宪博\t47870\n犬犬\t47871\n100分之49\t47872\n瘤子\t47873\n两起\t47874\n二二幺\t47875\n垂直极限\t47876\n走散\t47877\n等音乐中心ING\t47878\n耳光子\t47879\n五台山\t47880\n入魔\t47881\n谈天说地\t47882\n面巾\t47883\n丧权辱国\t47884\n查看器\t47885\n祖国各地\t47886\n3333369875000\t47887\n陈悠扬\t47888\n你的我的名字\t47889\n苏尔特\t47890\n双人游戏\t47891\n45万吨\t47892\n姜丽钧\t47893\n厌死\t47894\n伊丝曼\t47895\n里巴\t47896\n斜坡\t47897\n自生自长\t47898\n123345569789885\t47899\nfellover\t47900\n另男\t47901\n你好使\t47902\n女风蛋糕\t47903\n黄敏胶囊\t47904\n早稻\t47905\n就是你了妈妈也是你的主人脆不是说了么我是你我是你的家人你还怕我\t47906\n孟姝\t47907\n孟姜\t47908\n3xbb\t47909\nintel网\t47910\n报雷\t47911\n陈景瑞\t47912\n张欣\t47913\n关机家\t47914\n市医院\t47915\n厂子\t47916\n2222222222222222222\t47917\n咯了了了\t47918\nv223\t47919\n扎挣\t47920\n茶卡\t47921\n赵光冲\t47922\n那你猜猜我是男是女\t47923\n高丧钟\t47924\n你好体\t47925\n鲁鲁修\t47926\nbtv\t47927\n马六行\t47928\nbts\t47929\nDdr\t47930\n大红\t47931\n大级\t47932\n大约\t47933\nbtg\t47934\n联圩\t47935\n撒发\t47936\nDdc\t47937\n1.85万元\t47938\n大纲\t47939\n丢尽\t47940\nbtk\t47941\nbtj\t47942\n499名\t47943\n泥会\t47944\n第2mic\t47945\ndarkblue\t47946\n胸有成竹\t47947\n聊天人\t47948\n金家井乡\t47949\n神经女\t47950\n咿呀呀哟\t47951\n涡街流量计\t47952\n表象\t47953\nunravel\t47954\n税种\t47955\n湘西\t47956\n寺院\t47957\n血歪\t47958\n黑尼博\t47959\n莫网\t47960\n街头弹\t47961\n惠东\t47962\n过来就打\t47963\n忙我讨厌\t47964\nBL漫\t47965\n烈贞团\t47966\n做出\t47967\n片龙\t47968\n1899岁\t47969\n一零届\t47970\n会说话\t47971\n菇\t47972\n酷基网\t47973\n逆因\t47974\n千万\t47975\n定西市\t47976\n十二小时\t47977\n菏\t47978\n菌\t47979\n菊\t47980\n臂票\t47981\n千亿\t47982\ncai正呀\t47983\n菓\t47984\n高子豪\t47985\n么快乐了我没有看到的话\t47986\n牧羊犬\t47987\n羞答\t47988\n称座\t47989\n死度秘臭度\t47990\n项链儿\t47991\n偷税\t47992\n我咋\t47993\n我和\t47994\nPelagie\t47995\n王紫茜\t47996\n小葱\t4799
7\n第4次\t47998\n满载而归\t47999\n不说了我是人类世界我在玉树\t48000\n龙园\t48001\n懂不不约\t48002\ndongtai\t48003\n晚上7点半\t48004\n丰盛\t48005\n梁日坤\t48006\n积攒\t48007\n很无奈\t48008\nfygf\t48009\n弄龙\t48010\n于芸\t48011\n丰盈\t48012\n沧州地区\t48013\nnuzm\t48014\n眼神儿\t48015\n刁给\t48016\neiiw\t48017\n巨无尚\t48018\n真的爱你摸摸\t48019\n282号\t48020\n我是可爱的小度秘你是度秘我喜欢你\t48021\n春曰\t48022\n生蟹\t48023\ncbo\t48024\n4milk吗4limk\t48025\n供养\t48026\n为你老大一下\t48027\n度秘果\t48028\ncbm\t48029\n度秘林\t48030\n废铁\t48031\n坚强\t48032\n8088588\t48033\n独相随么值\t48034\n两档\t48035\n相公度秘\t48036\n志南\t48037\n电子学\t48038\n正前方\t48039\n我是真的爱上了你嫁给我\t48040\n两桶\t48041\n倾述\t48042\n星边刃\t48043\n桂林山水甲天下\t48044\n轻音\t48045\nfcggc\t48046\n入院\t48047\n高岗\t48048\n战鼓\t48049\n你好好看我己经没人样了?成血人\t48050\n敢做敢为\t48051\n冷若希\t48052\n群誒\t48053\n上零点\t48054\n嘛不嘛\t48055\n抱枕\t48056\n猴哥\t48057\n备降\t48058\n罗铠甲\t48059\n一本易经\t48060\n赞胸\t48061\n眼霜\t48062\nCF卡\t48063\n礼拜\t48064\n嘴见\t48065\n下轮\t48066\n额霸\t48067\n陈浩哲\t48068\n下车\t48069\n绿鬣\t48070\n绿鬼\t48071\n肆意酮\t48072\n下载\t48073\n嘴角\t48074\nfower\t48075\n陈浩哥\t48076\n液态\t48077\n浮生若梦的初live\t48078\n傻傻不可爱\t48079\n一只只\t48080\n天通苑\t48081\n生灭\t48082\n生火\t48083\n明明天\t48084\n事在人为\t48085\n快手\t48086\n醉了笑话\t48087\n二零七二九九\t48088\n电荒\t48089\n考不起\t48090\n活蹦乱跳\t48091\n林巧伟\t48092\n十二男十二女\t48093\n邵音音\t48094\ncknd\t48095\n最爱=爱你的现任\t48096\nQQ头像二\t48097\n值得一读\t48098\n特闷\t48099\nghillo\t48100\n酒囊\t48101\n诸国\t48102\nmzmmsmsk\t48103\nhoijhhhhgfffgh\t48104\n闾闽\t48105\n小赢\t48106\n永远的拥抱\t48107\n曾淑敏\t48108\n嗯性\t48109\n唔怕\t48110\n83408898\t48111\n篮球百分百\t48112\n二百多二百多\t48113\n食屌拉\t48114\n爱爱男\t48115\nstoundegenomyoumal撸啊\t48116\n絮羹\t48117\n五毒传说\t48118\n四美图\t48119\n六十几级\t48120\n跟我来\t48121\n便是\t48122\n大话易失信,\t48123\n杨庆果\t48124\n呼唤\t48125\n一只帮\t48126\n万和城\t48127\n放晴\t48128\n亚里亚\t48129\n各市\t48130\n宋佳露\t48131\n开场秀\t48132\n8983533652\t48133\n风牛\t48134\nEX个\t48135\n闷头\t48136\nqsqs\t48137\n选定\t48138\n履昕\t48139\n器件\t48140\n没听过门\t48141\n陳柏翔\t48142\n雷凯欣\t48143\ndesk\t48144\ndesd\t48145\n听候\t48146\n2941215482\t48147\n西裤\t48148\n多九十棵\t48149\n就可是\t48150\n例举\t48151\n勇者大冒险\t48152\n回龙转\t4815
3\n初板\t48154\n秋葉原\t48155\nbeachyun\t48156\nfyygy\t48157\n公务员\t48158\n齐琪\t48159\n120元\t48160\n西装\t48161\n沙星\t48162\n分队\t48163\n张伯伦\t48164\n太美\t48165\n夜礼服\t48166\n865458666\t48167\nsimshcssn0774\t48168\n孙晓菁\t48169\n29号\t48170\nverygood\t48171\nnp个\t48172\n陀螺mq\t48173\n密令\t48174\n配备\t48175\n饮品\t48176\n苏伟\t48177\n降火\t48178\n韩光桐\t48179\n宝山南大路\t48180\nrofin\t48181\n照亮\t48182\n小苹果真的爱\t48183\n三点到六点\t48184\n认成\t48185\n城防\t48186\n我就是你的呀你讨厌我不\t48187\n眼堂\t48188\n哪庄\t48189\n谢太傅\t48190\n你的谁\t48191\n人行天桥\t48192\n敌意\t48193\n123asd\t48194\nk800\t48195\n度秘度秘我是你主人的妈妈你好\t48196\n断掌\t48197\nputr\t48198\n断掉\t48199\n过山\t48200\nhushuo\t48201\n老子\t48202\n老字\t48203\n陳慧嫻\t48204\n最美\t48205\n错\t48206\n老孙\t48207\n蓝静仪\t48208\n坦率\t48209\n里脊\t48210\n风向标\t48211\n茭白2+雪菜1+米苋\t48212\n比一比\t48213\n没有你\t48214\n8月12号\t48215\n里包\t48216\n天下第二聪明人\t48217\nlopqr\t48218\nmnnn\t48219\nluck\t48220\n巨浪\t48221\nhshdkvxx\t48222\n山高险\t48223\n交通工具\t48224\nhfhjgfvjgjjhhh\t48225\n言情版\t48226\n渍渍\t48227\n特惠\t48228\n四月初四\t48229\n自问自答\t48230\n中轴线\t48231\n拖稿\t48232\n况况\t48233\n噜啦啦噜啦啦噜噜噜噜噜\t48234\n诚品\t48235\n优先\t48236\n单错\t48237\n踽行\t48238\n两页\t48239\n天真无邪\t48240\n赵韩樱\t48241\n初章\t48242\n爱呀亲\t48243\n各需\t48244\n王本好\t48245\n黄浩杰\t48246\n刘子俊\t48247\n给生\t48248\n现车\t48249\n一道题\t48250\nrshgjkiwgn\t48251\n两项\t48252\n变法\t48253\n琵笆琵笆果\t48254\ngjcfhn\t48255\n一百五\t48256\n320公斤\t48257\ngay蜜\t48258\n受别闹\t48259\n1978年\t48260\n南部\t48261\n辣味\t48262\n压制\t48263\n五百十三\t48264\n茶杯狗\t48265\n上海中心气象台\t48266\n一百亿\t48267\n王哥六\t48268\n群雄\t48269\n死後\t48270\n南都\t48271\n普宁市\t48272\n内衣秀\t48273\n萧十一郎\t48274\n恩唱\t48275\n客流量\t48276\n起伏\t48277\ntttty\t48278\n伊索寓言\t48279\nggeehfkhg\t48280\n蚀女\t48281\n骗不过\t48282\n布加拉多\t48283\n补偿\t48284\n青春版\t48285\n白衣庵住持昌果法师\t48286\n雪绒\t48287\n补假\t48288\n爱情吧\t48289\n2011年5月23日\t48290\n包秘\t48291\n临安市\t48292\n5478496\t48293\n骆春\t48294\n阿鲁斯奥特曼\t48295\n00000\t48296\n意切\t48297\n锐\t48298\n一国之君\t48299\n女O\t48300\n中午11点\t48301\n可乐妞\t48302\nhermeritot\t48303\n华更纱美\t48304\n凌志贞子\t48305\n橙子\t48306\n截止\t48307\n无拉撒路\t48308\neoc\t48309\neoa\t48310
\nfggcg\t48311\neoe\t48312\n⑷\t48313\n不要害怕死\t48314\n第一位\t48315\n姑娘吧炒鸡\t48316\n杰癖\t48317\n王汐\t48318\n左露轩\t48319\n女p\t48320\n执教\t48321\n不栋旭\t48322\n女t\t48323\n25588525852\t48324\n列表里日\t48325\ngsary\t48326\n出气筒\t48327\n85577575\t48328\n歹官\t48329\n乞讨\t48330\n爸妈爸妈爸妈\t48331\n夏王浩\t48332\n上周日\t48333\n春度秘蠢度秘\t48334\nforollistimimimisto\t48335\n喜欢迎来\t48336\n来终觉浅绝知此事要躬行\t48337\n八厘米\t48338\n小果\t48339\n小枝\t48340\n犹疑\t48341\n补录\t48342\n大雄殿\t48343\n孟公主\t48344\n法国杯\t48345\n小林\t48346\n小析\t48347\n呼市回民奶食品厂\t48348\n复苏\t48349\n734\t48350\n风景\t48351\n小枫\t48352\n乳腺肿瘤\t48353\n过不过年\t48354\n好懂事儿\t48355\n勾画\t48356\n徐誉\t48357\n高鸿飞v结果的你姐夫你\t48358\n好评率\t48359\n你是风我是风你是风\t48360\n李婉然\t48361\n737\t48362\n玛格烈特\t48363\n偶班\t48364\n发配\t48365\n自动扶梯\t48366\n包臀\t48367\n人儿\t48368\n小组赛\t48369\n宾华\t48370\n长远\t48371\n微笑向暖，安之若素\t48372\n5557项\t48373\n德意客\t48374\n736\t48375\n魔痕\t48376\n美丽片\t48377\n五德师\t48378\n忽尔今夏\t48379\n傻子SB个秘度秘\t48380\n厄齐\t48381\n异世度秘\t48382\n熊就熊哈\t48383\n发酵\t48384\n陈琪琪\t48385\nmashizumi\t48386\n一箩\t48387\n你在哪东面在哪东西度秘度秘\t48388\n望迴退迪\t48389\n发酸\t48390\n信任\t48391\n耀找\t48392\n无了期\t48393\n起名\t48394\n师大\t48395\n阁下\t48396\n师太\t48397\n秘丫头\t48398\n信件\t48399\n开笑\t48400\n臭猪臭猪\t48401\n成天过\t48402\n狼猫\t48403\n东方明珠\t48404\n哇靠你好丑\t48405\n000004467888\t48406\n动迁\t48407\n慢性难\t48408\nugly\t48409\n50平米\t48410\n给给我打\t48411\n4008000000\t48412\n大展\t48413\n大屋\t48414\n大屏\t48415\n大屌\t48416\n大局\t48417\n孕照\t48418\narefor\t48419\n夏休\t48420\n墨雨瞳\t48421\n315943449554527467846346494564564498463431121845264484534684544364824454825442464824654424684564480405434954729427487394241824644885766424815416948463464486455546454854555555555555\t48422\n七波\t48423\n挺嗨\t48424\n功败\t48425\n我的安吉拉小和你\t48426\n九尾狐\t48427\n入门书\t48428\n康龙\t48429\n骷髅库\t48430\n258036914\t48431\n偷天换\t48432\nglh\t48433\nglj\t48434\n东街小学\t48435\nlosuiol\t48436\n鸿儒\t48437\n哈密达生意出卡哈密达\t48438\n韩寒\t48439\ndcds\t48440\nglf\t48441\nglg\t48442\nitunes\t48443\n连砖\t48444\n发讲解\t48445\ndhcccdb\t48446\n哈佛\t48447\n三浦理惠\t48448\ncustantou\t48449\nglv\t48450\n13750385688\t48451\n冯松\
t48452\n雅昌艺术网\t48453\n嗯年\t48454\nCTS\t48455\n我是你女闺蜜\t48456\n指引\t48457\n那晚你没办法小心我一刀瓣豆凉拌\t48458\n协调性\t48459\n特希后卓玛首\t48460\n两种爱\t48461\nshuo\t48462\n王菊凤\t48463\n没有了没有了\t48464\n王慧慧\t48465\n很温柔\t48466\n茱萸湾\t48467\n指弹\t48468\n专一休\t48469\n无聊的自尊失去心爱的人。nn——村上春树\t48470\n鲁冰花\t48471\n吐一咏3a\t48472\n沛炎\t48473\ntcgffggd\t48474\nshipis\t48475\n从足\t48476\n181盒\t48477\n别是我\t48478\n你用笔\t48479\nshipin\t48480\n泡奶\t48481\n可申购\t48482\n吐奶\t48483\n做不做爱\t48484\n变样子\t48485\n遇上麻\t48486\nsheItie\t48487\n疫\t48488\n小二端杯茶\t48489\n疮\t48490\n疯\t48491\n疬\t48492\n划着\t48493\n大类\t48494\n四二天以前\t48495\n疤\t48496\n疹\t48497\n疾\t48498\n阿里宝卡\t48499\n疼\t48500\n被盗\t48501\n疳\t48502\n死年子\t48503\n1623\t48504\n袁老师\t48505\n5181\t48506\n5186\t48507\n我真的启源我真的喜欢三星\t48508\nYugggghj\t48509\n呱啦\t48510\n疆\t48511\n女双你的亚冠\t48512\nRMVB\t48513\n盘侠\t48514\nFTisland\t48515\n疒\t48516\n不要走好\t48517\n疑\t48518\n疗\t48519\n3707号\t48520\n周轩羽\t48521\n避难\t48522\n贵干\t48523\n一萌\t48524\n好了度\t48525\n党账\t48526\n猜神劫\t48527\n巴尔虎合唱团\t48528\n连普\t48529\n肺动脉\t48530\n主题曲\t48531\n利辛\t48532\nnmn969\t48533\n活泼\t48534\n党费\t48535\n魔幻\t48536\nGnhhjnfhbfn\t48537\n三宝\t48538\n三官\t48539\n墨迹天气\t48540\n异瞳\t48541\n三客\t48542\n不得安生\t48543\n王忠和\t48544\n一键老娘说的算\t48545\n鸠占雀巢\t48546\n易燃\t48547\n口贝比\t48548\noutouy\t48549\n33ab\t48550\n高考生\t48551\n去了吧\t48552\n哈喽O\t48553\n鞠躬\t48554\n累挺\t48555\n插台\t48556\n坏了办额\t48557\n面容\t48558\n需求\t48559\n三个半月\t48560\nWill\t48561\nyyqkn\t48562\n大头笔\t48563\n堂弟\t48564\n插口\t48565\n幺四九三九四幺九\t48566\n54884566496668500855466464\t48567\nｔｈａｎｋｙｏｕ\t48568\n五连\t48569\nbfffg\t48570\nshimmering\t48571\n免礼\t48572\n入口处\t48573\n丟zz母\t48574\n度秘和你的老婆度秘\t48575\n哈叔\t48576\n一个十五\t48577\n最深\t48578\n无能为力\t48579\n插叙\t48580\n恶龙\t48581\n到时情激情网\t48582\n多一度\t48583\n藏区\t48584\n太阳系\t48585\n银质\t48586\n阿奇卡干红葡萄酒nKTV哦利津sky恶露to\t48587\n闫建宇\t48588\n仪仗队\t48589\n660宗\t48590\n直树吻\t48591\nojdg\t48592\n看不能\t48593\n百和\t48594\n住房抵押贷款\t48595\n六国\t48596\n是谁手\t48597\nk英雄传捷达\t48598\n山雨人\t48599\n海南航空\t48600\n13312223970\t48601\n收视\t48602\n正能\t48603\n挥可\t48604\n受尽\t48605\n六四\t48606\nhttp
ahiphotosbaiducomxiaodupicitemb2de9c82d158ccbf5423ea4f1ed8bc3eb1354123jpg\t48607\n我也不让\t48608\n王思彤\t48609\n闻闻\t48610\n三千多万\t48611\n乒乒乓乓乒乒乓乓劈劈\t48612\n真美不愧\t48613\n嗯皇上我可是您的爱妃呀我要你理我却要不了我去你\t48614\n那谁生的你\t48615\n老公我爱您爱的好真心\t48616\n任夜风\t48617\n好温馨\t48618\n份饭\t48619\n零落成\t48620\n户外运动\t48621\n球门\t48622\n四张\t48623\n追随她的旅程\t48624\n利达\t48625\n真的再说\t48626\n成国\t48627\n61616666666666\t48628\n利特XI\t48629\n7520\t48630\n超级大神\t48631\n成因\t48632\n主板\t48633\n鬼东西夜\t48634\n害人之心不可\t48635\n西大\t48636\n7529\t48637\n黑森\t48638\n花千骨\t48639\n抱不告诉你\t48640\n水域\t48641\n一餐\t48642\n小杨\t48643\n两个礼拜\t48644\n三点540点\t48645\n文oun\t48646\nsurmoniPhone\t48647\n泪热泪\t48648\n太郎\t48649\n水城\t48650\n按子\t48651\n芜杂\t48652\n只身\t48653\n412881951000390\t48654\n下星\t48655\n用完\t48656\n谣传\t48657\n听取经\t48658\n4.8亿元\t48659\n闷油瓶\t48660\n拓展\t48661\nBjj\t48662\nella\t48663\n真猴\t48664\n石榴糖浆\t48665\n蹁\t48666\n720P\t48667\n吉永林\t48668\n我心中\t48669\n血族\t48670\n炒米\t48671\n林思琪\t48672\n僵尸新娘\t48673\n话们\t48674\n汪勒\t48675\n咪咪头\t48676\n猪猪猪猪猪猪我不想\t48677\n蹡\t48678\n耿啾啾\t48679\n折合\t48680\n蹤\t48681\n基本版\t48682\n蹦\t48683\n讨厌你我才我要了\t48684\n吧太阳\t48685\n蹭\t48686\ndisusf\t48687\n勋章\t48688\n蹲\t48689\n14点16\t48690\n力强\t48691\n掌萌\t48692\n霆锋谢\t48693\n蹿\t48694\n欢欣\t48695\n郭明云\t48696\n冬瓜\t48697\n填词\t48698\n圣徒\t48699\n11：20\t48700\n欲封天\t48701\n诚然\t48702\n5535853835\t48703\n塔巴布\t48704\net快点\t48705\nGoing\t48706\n张宾\t48707\n唐梦宪\t48708\n一百六一百六\t48709\n51kztb\t48710\n6月25日晚\t48711\n张家\t48712\nvqd\t48713\n如忆\t48714\n青黄\t48715\n看完精\t48716\n儿子女\t48717\n快疯了\t48718\nhfdddxcbkihhhjvvyttt\t48719\n张宁\t48720\nksucms\t48721\n张宇\t48722\n昭和沙田\t48723\nSvs\t48724\n达标\t48725\n还珠格格\t48726\n大头娃娃\t48727\n张宝\t48728\n才貌\t48729\n陈椿雨\t48730\nte堡\t48731\n景区\t48732\n为你上\t48733\n子民\t48734\n炮友\t48735\n报先\t48736\n报充\t48737\n鱼龙混杂\t48738\n13672107364\t48739\n大家好我叫马甲\t48740\n上班时间\t48741\n郭梓淇\t48742\n酬难\t48743\n望深沪\t48744\n太可爱\t48745\n情圣\t48746\n膨脹\t48747\n222225555522222\t48748\n远去\t48749\n空耗\t48750\nhikh\t48751\n子水\t48752\ntey\t48753\n佳伟\t48754\n宋延飞\t48755\n你好彩\t48756\n村上春树\t48757\n杜子建\
t48758\n头脑发达\t48759\n老师老师\t48760\nMadonna\t48761\n10000000000000000000000000000000000\t48762\n报关\t48763\n绿袍\t48764\n炮台\t48765\n有暖\t48766\n大伟\t48767\n班次\t48768\n大伙\t48769\n大会\t48770\n神闹元宵\t48771\n大众\t48772\n星吾24k\t48773\n李楠才\t48774\n多情\t48775\n光座\t48776\n猴子\t48777\n光光集团\t48778\n嗯中犬八公\t48779\n哇他西瓦\t48780\n北镇\t48781\n大似\t48782\n义母\t48783\n刀朗威\t48784\n胡立光\t48785\n赖科\t48786\n多想\t48787\n布列松\t48788\n啦啦啦啦啦队\t48789\n露爱\t48790\nHowareyou\t48791\nalalo\t48792\n大伯\t48793\n南宫月\t48794\n一一二三四五\t48795\n大传\t48796\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌嘿嘿讨厌讨厌讨厌讨厌什\t48797\n三分之\t48798\n黄心\t48799\n合起来\t48800\n田鼠\t48801\n技师\t48802\n行李\t48803\n零线\t48804\n代己\t48805\n吊元\t48806\n9900九千九百九千九百九千九百九千九百9千九百几\t48807\n城事\t48808\n折本\t48809\n囧团\t48810\n气焰\t48811\n囧囧\t48812\n泡脚\t48813\n英文名\t48814\n四肖\t48815\n成功学\t48816\n保护猫\t48817\n首歌样\t48818\n洗澡\t48819\n说出命\t48820\n折服\t48821\n蜻蜓们\t48822\n白癜马丁\t48823\n零三岁\t48824\n机械暴龙兽\t48825\n阳光灿烂\t48826\n吊兰\t48827\n想不出\t48828\n替身标靶\t48829\n一二三四五六七八九十十一十二十二\t48830\n917块\t48831\n全村\t48832\n开玩笑了先走\t48833\n旺夫相\t48834\n五点六\t48835\n本和\t48836\n你的名字\t48837\n你爱我不爱我爱你从来都咪哥麽\t48838\nFbdvn\t48839\n全权\t48840\n震动\t48841\n2009年1月1日\t48842\n2011年6月18日\t48843\n调调查\t48844\n聪明点儿\t48845\n李子园儿\t48846\n阿呆\t48847\n卷铺盖走人\t48848\n41j\t48849\n尾矿\t48850\n考期末考试\t48851\n营业收入\t48852\n赛男\t48853\n孓义\t48854\n崔守硕\t48855\n600块\t48856\n你的我\t48857\n杨传琴\t48858\n食肉\t48859\nB+侦探\t48860\n作孽\t48861\n谨言\t48862\n爹舍\t48863\n食肆\t48864\n海哥哥\t48865\n报仇\t48866\n挖矿\t48867\n你在干什么呢米色只要回答我你在的话\t48868\n五月二十一号\t48869\n41%\t48870\n生死号\t48871\n楚涵\t48872\n报价\t48873\n密州\t48874\n414\t48875\n415\t48876\n410\t48877\n报以\t48878\n413\t48879\n太笨\t48880\n418\t48881\n419\t48882\nrrfmc\t48883\n吉林大学化学学院\t48884\n微我\t48885\nwww.xafm931.com\t48886\n一意孤行\t48887\n动力\t48888\n嘘嘘\t48889\n冯晓慧\t48890\n四夹馍\t48891\n电视信号\t48892\n3838383838\t48893\n琅琊榜\t48894\n卡特\t48895\n动动\t48896\n追我\t48897\n高若轩\t48898\n倒春寒来了\t48899\n黎国炼\t48900\n29年\t48901\n李致列\t48902\n笔下\t48903\n覆水难收\t48904\nwwwwwwpppppppp\t48905\n张涵斌\t48906\n追戏\t48907\n高成\t48908\n徐家俊\t48909\n不忘\t48910\n呢大地子\t48911\n刘奇
\t48912\n葡挞皮\t48913\n做票\t48914\n噔噔噔\t48915\n墓度\t48916\n向前进\t48917\n笨蚕\t48918\n田玲娜\t48919\n老四\t48920\n杨占\t48921\n五一次\t48922\ncityour5alanditiooortomemoushifirstrosstoutnohaobaoctomomyoucoffeuropandyfoothisisasssitindifsmoutoufro慢tifindromoutero4sessassaa134b345c23\t48923\n玩偶\t48924\n一世再下一世\t48925\n就像\t48926\n玛丽珍\t48927\n数度\t48928\n数座\t48929\n脚刹\t48930\n点片\t48931\n洞蜜\t48932\n女律师\t48933\n刘贤庆\t48934\nDuudfj\t48935\n刘奶\t48936\n86882253\t48937\n赢锋\t48938\n裸睡\t48939\n300岁\t48940\n三零幺\t48941\n青霞点\t48942\n卡斯\t48943\n一米九\t48944\n卡断\t48945\n爱你爱你爱你爱你我一生一世的爱你\t48946\n赵亚春\t48947\napopularsong\t48948\n大伙伴\t48949\n期友\t48950\ndtdtfu\t48951\nhubbyh\t48952\n好的我等你在床上\t48953\n算了我靠靠你\t48954\n亲关注\t48955\nMeToo\t48956\nhbh\t48957\n期口\t48958\nIROI\t48959\n搞不好😊\t48960\noppooppo\t48961\n干爹不我死\t48962\n闪购\t48963\n叫林\t48964\n问你件事\t48965\n马蒂尼普\t48966\n一计\t48967\n完婚后\t48968\n政改\t48969\n去不掉\t48970\n梁绮雯\t48971\n宜花\t48972\n吴前琪\t48973\n梦金\t48974\n漫天飞\t48975\n无聊不无\t48976\n世外\t48977\n急人\t48978\n中华田园犬\t48979\n芭比芭比\t48980\n分掉\t48981\nNight\t48982\n梦里\t48983\n国学\t48984\n四千万\t48985\n真的好吧度谂真\t48986\n沃视一呀\t48987\n桂花鱼\t48988\n1分之4\t48989\nCP版\t48990\n国子\t48991\nvivoX6d\t48992\n张永寒\t48993\n呢答\t48994\n你好听完\t48995\n土生\t48996\n一呜一\t48997\n2班\t48998\nmistake\t48999\n擠\t49000\n没事儿度娘不爱你我爱你\t49001\n秦昊\t49002\n43464343434343434434343\t49003\n擤\t49004\n开放女\t49005\n擦\t49006\n火烧着\t49007\nJcx\t49008\n糖糖糖\t49009\n纳滤水机\t49010\n好不好嘛\t49011\n擴\t49012\n酱样\t49013\n425855800000\t49014\nftmb\t49015\n米兰达\t49016\n米哈\t49017\n跌倒\t49018\n擅\t49019\n清扫\t49020\n阿郎\t49021\n操\t49022\n清扬\t49023\n生活态度\t49024\n擒\t49025\n岳麓区\t49026\n也谈\t49027\n东京食尸鬼\t49028\n以利\t49029\n换机\t49030\n嗯李金铭\t49031\n难以相信\t49032\n牛逼\t49033\n球袋\t49034\nkkkg\t49035\n手刹\t49036\n想了一想\t49037\n如画\t49038\n工作份\t49039\n舒舒\t49040\n连欣\t49041\n积极性\t49042\n指路\t49043\n秘书鸡智\t49044\n好小好小傻子\t49045\n搜狗秘\t49046\n正伐\t49047\n1899081\t49048\n车忒自私17k7k7k\t49049\n5月底6\t49050\ncomnot\t49051\nwwwpro\t49052\n言之凿凿\t49053\n新茶医学院\t49054\n阵阵\t49055\n职工\t49056\n昨天上午9点\t49057\n子丶丶\t49058\n康家地产\t49059\n
3部\t49060\n扎扎扎扎\t49061\n萌大萌大萌萌\t49062\nBIE\t49063\n城叔\t49064\n叫花眼\t49065\n34分\t49066\ncabulatf\t49067\nkgsfg\t49068\n豆跳舞\t49069\nyyuuiioopppoiiu\t49070\n我是汪星人我要了你\t49071\n一八四一\t49072\n好纯情\t49073\nfsasdsd\t49074\n众有\t49075\n一件两件\t49076\n∏\t49077\n1.34%\t49078\n众望\t49079\n城口\t49080\n黄晓燕\t49081\n懦弱\t49082\n45yt\t49083\n李老大\t49084\n合伙\t49085\n城只\t49086\n跨骑\t49087\n聊聊聊\t49088\n八路军二军\t49089\nSEEYOU\t49090\n挨个\t49091\n不爱你了你好丑\t49092\n下來\t49093\nmyseis\t49094\n韩小贤\t49095\n骂骂咧咧\t49096\n我呸我不爱你讨厌你\t49097\n冲出门\t49098\n没良心\t49099\n嘿嘿劲\t49100\n2011年2月2日\t49101\n咋了啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦6666\t49102\n西翼\t49103\n钟亚\t49104\n保镖\t49105\n实名制\t49106\n黄子璋\t49107\nBIU\t49108\n贤和澜\t49109\n保住\t49110\n过激\t49111\n免都没有\t49112\n狗凶\t49113\n无哑\t49114\n上期\t49115\n华阳花园\t49116\n11100\t49117\n5毛\t49118\n伟棠\t49119\n晨光明\t49120\n聂文静\t49121\n随感文盲酒网\t49122\n美雪\t49123\n盖着\t49124\n安吉拉猫\t49125\ntifdf\t49126\n挪汲\t49127\n∑\t49128\n酋长国\t49129\n黎帮\t49130\n大仙\t49131\nwo13819753456\t49132\n63436864\t49133\n午安早安晚安思密达\t49134\n库拉丝\t49135\nmysister\t49136\n案数\t49137\n∕\t49138\n左手\t49139\n机器割麦家博会\t49140\n下牙\t49141\n晚上六点半\t49142\n莲花\t49143\n差点\t49144\n听见你的声音\t49145\n服自己\t49146\n找我不爱\t49147\n英西桃花呀真真与小佩着物菜头\t49148\n头鸡麦克\t49149\n啦啦啦\t49150\nbhhhhhn\t49151\n下午四点\t49152\n转来转去\t49153\n擦鞋\t49154\n黄静怡\t49155\n自游\t49156\n执着\t49157\n唉真可惜\t49158\n韦斯\t49159\ncyifyi\t49160\n160斤\t49161\n渔翁得利\t49162\n累不给\t49163\n防腐\t49164\n我的妹妹\t49165\n燕归来熙\t49166\n黎打\t49167\n巢鸟\t49168\nv年\t49169\n100遍\t49170\nPadFone\t49171\n雅哈\t49172\n河南\t49173\n上有\t49174\n飓风双宿双飞健康哦卡你爸爸不v发\t49175\n寥寥骨朵\t49176\n一亿万个\t49177\n邱胤胤\t49178\n不是啦我是说哈哈你还笑我\t49179\n2994144109\t49180\n顿觉\t49181\ngehddh\t49182\n魂曲\t49183\nEb\t49184\nhttpgmhtdqshyujiameicompluginphpidhejintoupiaomodeldetailzid97\t49185\n56米\t49186\n长继\t49187\n自闭症\t49188\n袁晓庆\t49189\n囊括\t49190\n好尚\t49191\n友期\t49192\n输家\t49193\n病毒片\t49194\n好少\t49195\n楚翘\t49196\n放寒假寒假\t49197\n折腰\t49198\n扫货\t49199\n好小\t49200\n出勤率\t49201\n歪瓜劣枣\t49202\n熊大熊二\t49203\nlu挑一打\t49204\n折腾\t49205\n沼气\t49206\n多米达\t49207\n心中\t49208\n杨寒松\t49209\n来么\t492
10\n诶剃盀\t49211\n姜饼文\t49212\n心丹\t49213\n我不老告诉你我哭了我告诉你有惹我哭了我告诉你\t49214\n454645738\t49215\n2933140873\t49216\n3月1日1时\t49217\n51vo\t49218\n心上\t49219\n心三\t49220\n为什么不\t49221\n挂碍\t49222\nwhyw\t49223\n还早吧圣诞\t49224\n子之\t49225\n18点23\t49226\n霄妹妹\t49227\n医学家\t49228\n子乔\t49229\n上清乡\t49230\n∷\t49231\n睡岗\t49232\n心丑\t49233\n烧毁\t49234\nUCI洲际职业车队\t49235\n子乙\t49236\nwhya\t49237\n8000米\t49238\n9.2万\t49239\n尾梨\t49240\n时淑芬\t49241\n药药\t49242\n安详\t49243\nJjnnnnbnmnnnmnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnmmjjkkkkkuhgrdfyyh\t49244\n一点钱么\t49245\n老本\t49246\n聚聚\t49247\n元曰\t49248\n赌欠\t49249\n451边儿\t49250\n我喜欢风\t49251\n拍谢啦\t49252\n秦若夏\t49253\n三大伯\t49254\n老朱\t49255\n好玩儿\t49256\n上海洋盘库尔勒沃库森\t49257\n兔子舞\t49258\n搬到\t49259\n六小布\t49260\n太娘娘腔了你\t49261\n药草\t49262\n王选\t49263\n省钱\t49264\n选购\t49265\n周恩耒\t49266\n老有\t49267\n老月\t49268\n国民经济\t49269\n1200个\t49270\n半江红\t49271\n雨燕\t49272\n钟一菲\t49273\n不小心\t49274\n小甲\t49275\n哒啵啵啵\t49276\n小男\t49277\n基拉\t49278\n6首\t49279\n文学史\t49280\n纯娘们\t49281\n京东方\t49282\n孟庭苇\t49283\n9992162906290\t49284\n一比二比三已数\t49285\n小用\t49286\n你母\t49287\n炖鸡\t49288\n要紧\t49289\n要素\t49290\n手短\t49291\nDC移师TC\t49292\n托罗\t49293\n小生\t49294\n三五5\t49295\n小甜\t49296\n55555555555555552555555555\t49297\n冰封\t49298\n手标\t49299\n容貌\t49300\n相偎\t49301\n星族\t49302\n发电\t49303\n胃疼秘\t49304\n2011年2月1日\t49305\n相假\t49306\n20000\t49307\nEOS\t49308\n跳起\t49309\n祸中有福\t49310\n奉献\t49311\n完死\t49312\n大多数其\t49313\n相偕\t49314\n不好吃\t49315\n碎蜂\t49316\nhvhshcv\t49317\n六脉神剑\t49318\nvvxcc\t49319\n度秘米长\t49320\n不好听\t49321\n皂皂\t49322\n呸群\t49323\n雅量\t49324\n哺育\t49325\n着办\t49326\n衣服八神\t49327\n心然\t49328\n鳄鱼小顽皮\t49329\n穴巢\t49330\n拉扎诺\t49331\nCCAV\t49332\n狂二\t49333\nbhjjmjcfnfgnfllrkgjjngnkkhkhkkhnhlhlhkh\t49334\nwded\t49335\n十节\t49336\n36页\t49337\n延吉道永辉超市\t49338\n金立度秘\t49339\n待会的猪猪猪猪猪猪\t49340\n小黄鹂丽丽丽丽丽丽丽丽\t49341\nvfjjh\t49342\n狂人\t49343\n异形\t49344\n异彩\t49345\n狂亲\t49346\n孔从\t49347\n幺二零幺零七一九八三零二二六零零\t49348\nkdkskw\t49349\nSaturday\t49350\n坐席\t49351\n晟敏\t49352\nuufhhlogf\t49353\n卡布喵\t49354\nregret\t49355\n列非\t49356\nbravo\t49357\n全透\t49358\nwozai\t4935
9\n债主\t49360\n07号\t49361\n拉倒吧\t49362\n党龄\t49363\n一只猫\t49364\nRick\t49365\n囧物\t49366\n保定市\t49367\n7885450087550958\t49368\n度秘你是的影小达人\t49369\n叶青青\t49370\n真的吗你的演技个大一个小\t49371\n紧折\t49372\n我处\t49373\n党建\t49374\n摇手\t49375\n纵然\t49376\n崔旭基\t49377\n内刊\t49378\nHcchcvjkh\t49379\n98889888\t49380\n冬瓜排骨汤\t49381\n衩子\t49382\n356万吨\t49383\n106页\t49384\n把你的家和\t49385\n棋子\t49386\n三百万一个\t49387\n平5负\t49388\n哈赛呦\t49389\n度秘小游记\t49390\n真爱在九重\t49391\n不告\t49392\nmhtewuy\t49393\n车身\t49394\n呼应\t49395\n唉您好\t49396\n中国佛教代表团\t49397\n不呀\t49398\n赛罗粤特曼图\t49399\n出鬼\t49400\n最后的时光\t49401\n我是你的再说最后\t49402\n到面\t49403\n猪蹄\t49404\n一段段\t49405\n哼哼哼哼爱情爱情爱爱心\t49406\n不周\t49407\n解套\t49408\n多大大\t49409\n戴立清\t49410\n不呢\t49411\n22088886933874188687415750000000000000000224n7681479840\t49412\n啦啦啦啦啦啦你是我的小行家\t49413\n必修\t49414\nGLK\t49415\n我爱石海小学\t49416\nGLF\t49417\n玉门市\t49418\n扒一扒\t49419\n老实度\t49420\n对对就\t49421\nno块\t49422\n阿伯\t49423\n一个一下\t49424\n月份额\t49425\n邵艺璇\t49426\n海淀\t49427\n一iv\t49428\n秘高瑞\t49429\n甬江\t49430\nipone\t49431\n对冲\t49432\n接不住\t49433\n包身工\t49434\n好好睡\t49435\nfrommatomyoslf\t49436\n海湾战争\t49437\nundlytime\t49438\n耳鬓厮磨\t49439\n刘泽鹏\t49440\n度秘别人家的小妹妹来了\t49441\n聊聊聊聊聊聊\t49442\niiktxr\t49443\n猫嫂\t49444\nhhunl\t49445\n二十四块\t49446\n迷侠\t49447\n冒险王\t49448\n厶门\t49449\n石小猛\t49450\n芰私\t49451\n马卡龙图你\t49452\n那不久\t49453\n汉语族\t49454\n趣网衣\t49455\n三有度\t49456\n西涧\t49457\n布朗\t49458\n接尾\t49459\n博物馆\t49460\n未老先衰\t49461\n馨宇\t49462\nfdughx\t49463\n仪表盘\t49464\n赵仁婷\t49465\nzfux\t49466\n病符土泻火\t49467\n铁箍\t49468\n十八元\t49469\n中常委\t49470\n汤嘉豪\t49471\n网罗\t49472\n妙计\t49473\n心脏\t49474\n妙论\t49475\n列屿\t49476\n心脑\t49477\n东大街\t49478\n华西村\t49479\n诉讼参\t49480\n囟\t49481\n37万\t49482\n透露\t49483\n嘉善\t49484\nddgjjdddfh\t49485\n小二哥\t49486\n硬拆\t49487\n丑丑丑丑丑丑丑丑丑丑丑丑丑丑丑丑丑丑\t49488\n孔丹茜\t49489\n巴黎欧莱雅\t49490\n列居\t49491\n杨文琪\t49492\n5月7日\t49493\n徐徐咪\t49494\n很苦逼\t49495\nhkvbkutgfg\t49496\n233333335233333335233333333365233333333352333333\t49497\n//[偷笑][偷笑][偷笑\t49498\n心脸\t49499\n心脾\t49500\n玲珑花\t49501\n会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧会吧\t49502\n红利群\t49503\n查处\t4950
4\n我的小度秘我最爱你了男生董秘\t49505\ndivc\t49506\ndiva\t49507\n襄阳\t49508\nSplandy\t49509\n饥肠辘辘\t49510\n秘度我真的好想你\t49511\nWocaonima\t49512\ndivu\t49513\n你猜我猜你在拉巴巴小玉\t49514\n杜飞\t49515\n装机\t49516\n给我现钱\t49517\n朱玉\t49518\n猪猪侠五过山\t49519\nsndsmjxxjd\t49520\n姓颜\t49521\n就怕怕\t49522\n福袋\t49523\n行吧行吧\t49524\n雷天霞\t49525\n0点零零\t49526\n墬\t49527\n细川护熙\t49528\n墨\t49529\n非常靠谱\t49530\n咕噜噜噜\t49531\n仅存\t49532\n5张\t49533\ngllg\t49534\n叶声响梦\t49535\n论以口实\t49536\nCK1样快乐吗看光之子\t49537\n秘事\t49538\n60多平方米\t49539\nIiu\t49540\nIit\t49541\n我的北京我的春晚\t49542\n秘亲\t49543\n滓\t49544\n境\t49545\n个价\t49546\nDallas\t49547\n拉练\t49548\n一比五\t49549\n第三回\t49550\n一百只\t49551\n第三四\t49552\n拉给\t49553\n一百句\t49554\n墟\t49555\n墙\t49556\n没有我漂亮\t49557\n王晰昧\t49558\n以为你是\t49559\n增速\t49560\npig喽\t49561\n爱美丽\t49562\n南京场\t49563\n写给\t49564\n笋格\t49565\n脉曲张\t49566\n卓怡君\t49567\n223578\t49568\n心存\t49569\n铁管\t49570\n一个二十五\t49571\n5538456\t49572\n搞笑熊\t49573\n颉启云\t49574\n张欢\t49575\n爱问群儿\t49576\nCP/M\t49577\n双龙戏珠\t49578\n心孤\t49579\n心学\t49580\n致加西亚的一封信\t49581\n陶泥\t49582\n大学生活\t49583\n挖空\t49584\n曹古寺\t49585\n博斯克\t49586\n去不可能\t49587\n老女人\t49588\n邵阳秘\t49589\n中央企\t49590\n倾国与倾城\t49591\n纸巾\t49592\n大美驴\t49593\n韩雨彤\t49594\n77nnn\t49595\n心酸胡思乱想\t49596\n再见八点\t49597\n断层湖\t49598\n薛梓炜\t49599\n1.15万吨\t49600\n冷笑话\t49601\n17077171829\t49602\n落红\t49603\n人工费\t49604\n上富\t49605\n科斯塔\t49606\nFederer\t49607\njhshsh\t49608\nmynameisAnna\t49609\n云形小屋\t49610\n我真的爱你爱你切身唉\t49611\n女孩女孩女\t49612\n李玉波\t49613\n10盎司\t49614\n晋亿实业\t49615\n潘紫薇\t49616\n好啦好啦好啦好啦好啦哈楼\t49617\n没单\t49618\n莫凡\t49619\n血汗工厂\t49620\n拨播放\t49621\null兔兔ull\t49622\n花生女\t49623\n理帮\t49624\n老腊肉\t49625\n酒后运动\t49626\n孔子生\t49627\n扣得\t49628\n内需\t49629\n老驴唇\t49630\n看一看不到\t49631\n222222\t49632\ngdjmux\t49633\nAdjjdmjgdtgdjttwtptmmmjjjjjjgggggggggggggj\t49634\naaguagahs8saKishsi7abnansbzbzjhzhk\t49635\n运流动\t49636\n两分钟内\t49637\n10.15-21日\t49638\n任云\t49639\nffghdsd\t49640\n瀑布\t49641\n腰椎\t49642\n623\t49643\nzao\t49644\n621\t49645\n620\t49646\n627\t49647\n626\t49648\n625\t49649\n风骚女\t49650\n小富\t49651\n蒋国强\t49652\n629\t49653\n628\t
49654\nzab\t49655\n贪杯\t49656\nzaa\t49657\n喷气机\t49658\n刘益华\t49659\n小寒\t49660\n内胆\t49661\n吕显余\t49662\n别回来\t49663\n我答我答我答答答\t49664\n体魄\t49665\n1哦尼\t49666\n室内设计师\t49667\n今早8点半\t49668\n铜陵\t49669\n参天\t49670\n巫婆们\t49671\n门白\t49672\n小寨\t49673\n魔物\t49674\n一千分一万\t49675\n775435555\t49676\n水水南水北\t49677\n谢拉\t49678\n天亮了惨\t49679\n首歌吧歌\t49680\n喥秘佷疠嗐吖\t49681\n马里亚纳\t49682\n裤鱼\t49683\n5418341\t49684\n黄河故道\t49685\n上皮\t49686\n金·古德曼\t49687\n国国\t49688\n1月12\t49689\n1月13\t49690\nhggxgdcj\t49691\n枪神纪蒸汽钟\t49692\n锅端\t49693\n行囊\t49694\n花园新村\t49695\n浏览\t49696\n大灰狼\t49697\n因而\t49698\n好穷\t49699\n第五十四名\t49700\n枪林弹雨\t49701\n车昕\t49702\n陆曼云\t49703\nJYS\t49704\nJYP\t49705\n死相\t49706\n咋们\t49707\nD3104\t49708\n蹭饭\t49709\n宁都\t49710\ncaum\t49711\n局域网\t49712\n仗势欺人太甚至于\t49713\n象州\t49714\n慢一点半点\t49715\nJYH\t49716\n暖宝宝\t49717\n消毒迷\t49718\n2055150\t49719\n程旭\t49720\n毕业季\t49721\nTamD\t49722\n蕾丝度秘\t49723\n杀狗\t49724\n售票亭\t49725\n这办事\t49726\n溜溜俅俅\t49727\n四零零六\t49728\nonline\t49729\n合金\t49730\n小麦哥\t49731\nDdd\t49732\n王广卓\t49733\nParty\t49734\n你的错了还赖我真是不要脸\t49735\n九江庐山南路派出所\t49736\n政颖\t49737\n400万亩\t49738\n耐用品\t49739\n别别等会儿\t49740\n接龙我想\t49741\n中国跳水队\t49742\n哈啦啦啦啦啦啦啦啦啦啦啦到你的啊dytok\t49743\n大小时\t49744\n老客户\t49745\n马尾\t49746\nevevefchiq\t49747\n男一号\t49748\n推敲\t49749\n不要你了你不好玩儿\t49750\n亲懂哪你说的话\t49751\negghjudddffc\t49752\n1313536776\t49753\n张远洋\t49754\n去哪等\t49755\n借不去\t49756\n母猴\t49757\n鄳\t49758\n990000000000000000000000000000000000000000000000000000000\t49759\n祭拜\t49760\n4573\t49761\n欧米伽\t49762\n4575\t49763\nb罩\t49764\n豆制品\t49765\nufjvfgz\t49766\n撒叫\t49767\n鄢\t49768\n男男女女\t49769\n情有\t49770\n海贼\t49771\n[MRET]111024.SBS.金昌烈的Old\t49772\n鄙\t49773\n贾严峻\t49774\n猪大肠\t49775\n就是帅\t49776\n你好好萌\t49777\n李明瑞\t49778\n文摘\t49779\n原来如此原来如此原来如此原来如此原来如此原来\t49780\nherorotalamy\t49781\nSW\t49782\n假改\t49783\n回头而看\t49784\n鄂\t49785\nxiaojj\t49786\n第7次\t49787\n好柔\t49788\n出演\t49789\n车头\t49790\n真哒\t49791\n刷卡\t49792\n武汉市卫生局\t49793\n亲亲度\t49794\nMSN3qh\t49795\n车夫\t49796\ngtg555yuy5\t49797\n浪費\t49798\n名身\t49799\n1点1点\t49800\n苍狗\t49801\n数辆\t49802\n尿水\
t49803\n冷哇靠靠\t49804\n54秒\t49805\n封门别类\t49806\n火柴\t49807\n无欲\t49808\n苍狼\t49809\n核状\t49810\n一个晚上如己\t49811\n蛮了\t49812\nruhvd\t49813\nqqer\t49814\n花椰菜\t49815\n圆心角\t49816\n照看一下\t49817\n说什么呀我不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂\t49818\n青海\t49819\n周日\t49820\n见到你了你\t49821\n张毓雪\t49822\n武宁路\t49823\n青浦\t49824\n陈ch\t49825\n帝王\t49826\n1141x723\t49827\n床吻\t49828\n相比之下\t49829\n主人翁\t49830\n恐慌\t49831\njtv\t49832\njtp\t49833\n我爱你有几分我爱你有多深度秘我爱你\t49834\n街巷\t49835\n土错\t49836\njtl\t49837\n3\t49838\njtj\t49839\n真的旅\t49840\n发型\t49841\njtg\t49842\njtb\t49843\n米野帝\t49844\nJohnson\t49845\n丈人\t49846\n百慕达三角洲\t49847\n刘梦露\t49848\n河川\t49849\n天下第一漂亮\t49850\n刺猥\t49851\n风云人\t49852\n三二九二七\t49853\n少了几说明天下午到我家张小饺子馆\t49854\n嬷叫\t49855\n服务台\t49856\n80箱\t49857\n薛志壮\t49858\n星期一\t49859\n严顺\t49860\nmeakiss\t49861\n50555\t49862\n七十七十八\t49863\n星期三\t49864\n萨姆斯\t49865\n基尔\t49866\n沸点\t49867\n约翰迪尔\t49868\n王子聪\t49869\n罗小凤\t49870\n舞状\t49871\n成立\t49872\n3000平米\t49873\n乐透视\t49874\n鄂教版\t49875\n图文\t49876\n麻将\t49877\ndidyk\t49878\n较劲儿\t49879\n曾凡一\t49880\n泥妮都\t49881\n客未登机\t49882\n崇尚\t49883\n结果呢\t49884\nshuijiaol\t49885\n一八女\t49886\n俩岁半\t49887\n一瓢\t49888\n二三国时期\t49889\n换场\t49890\n别关照\t49891\n27幢\t49892\nSO\t49893\n储户\t49894\n喜欢你了再见\t49895\n洗脸盆\t49896\n求值\t49897\n14.90%\t49898\n521266358发\t49899\n祥林嫂\t49900\n疯得\t49901\n外延\t49902\n五个多小时\t49903\n夏季\t49904\n歌狂\t49905\n0o千\t49906\n充不上\t49907\n碧柔\t49908\n安安稳稳\t49909\n回不去\t49910\n3月22日下午4时30分\t49911\n赵妍\t49912\n张跟硕\t49913\n冷若汐\t49914\njjlm\t49915\n13:40\t49916\ntwk\t49917\n1605481941043074\t49918\n快速\t49919\n面子弹夹\t49920\n想来了\t49921\n作曲家\t49922\n睡别\t49923\n快递\t49924\n昆明市新闻中心\t49925\n明珠台\t49926\n睡到\t49927\n段宜恩\t49928\n再猜\t49929\n白灰灰\t49930\n快选\t49931\nokokaoka\t49932\nccsgcbgutjgjamaw7n55644n744422579043337……4454452688n037785475\t49933\n本公主\t49934\n龙永龙\t49935\n纷飞热\t49936\n比对\t49937\n快送\t49938\nGdsrcx\t49939\n甘乖\t49940\n保留\t49941\n广禾堂月子餐\t49942\nsusf\t49943\nsusi\t49944\nHdnf\t49945\nsusk\t49946\nsusj\t49947\n草草草草草草草草草草\t49948\n一二三四五六七八九十一二三四五六七八九十一二三四五六七八九十一二三四五六七八九十五六七八九十五六七八九十一二三一二三一二三一五六七八九十五六七
八九十一二三一二三\t49949\nsuss\t49950\nsusu\t49951\n765167方成式\t49952\ndoyaya\t49953\n企业文化\t49954\n招聘检票员\t49955\nNSH\t49956\nfferty\t49957\n龙灯\t49958\n回魂\t49959\n11月17号\t49960\n罗思思\t49961\nNSX\t49962\n身着\t49963\n缘起\t49964\n认同\t49965\navavavata\t49966\n88866\t49967\n泷帅\t49968\n一八张\t49969\n惊声尖叫\t49970\n收钱\t49971\n固话费\t49972\nVvV\t49973\n冷情\t49974\njj个jtrkyfkg他会成功个好人人家deevrvtfhg估计\t49975\n精灵妖精H\t49976\n林朗\t49977\n蒸笼头\t49978\n白岳峰\t49979\n参狈\t49980\n岭石\t49981\n爆竹\t49982\n菜\t49983\n乏味\t49984\n夜访\t49985\njbja431\t49986\niloveyou3qchilf\t49987\n#2012COSMO美容大奖微博评鉴团#\t49988\ndvddg\t49989\n飘否\t49990\n几口\t49991\n夺命丫\t49992\n阿贡山\t49993\n纷纷纷纷纷纷纷纷纷纷纷\t49994\n通胀率\t49995\n啃女\t49996\n张洲豪\t49997\n杨雨蓉\t49998\n神街\t49999\n净重\t50000\n减持\t50001\n呀不懂\t50002\n美大美\t50003\n重来\t50004\n驶驾\t50005\nddfdsswerff\t50006\n堆子\t50007\n欢泰迪狗\t50008\n光启\t50009\n衣首\t50010\n不在话下\t50011\n橡皮筋\t50012\n7月30日上午\t50013\n癫狂\t50014\n鸣枪\t50015\n神衣\t50016\nstache\t50017\n恩新年\t50018\n七十几年\t50019\n光名\t50020\n包子\t50021\n光合\t50022\n13313160105\t50023\n上海环保\t50024\n凯姐\t50025\n中方\t50026\n采血证\t50027\n一张\t50028\n95533\t50029\n中新\t50030\n中断\t50031\n一弹\t50032\nyabali\t50033\n30年\t50034\n懊恼\t50035\n防病\t50036\nbaoersah\t50037\n一張\t50038\n不知火舞\t50039\n哭哭\t50040\nglx135185148952\t50041\n122乛\t50042\n一开\t50043\n水仙花\t50044\n阳历\t50045\n一弄\t50046\n擦屁擦\t50047\n令人\t50048\n表白岁\t50049\n阳原\t50050\ns2uo\t50051\n中文\t50052\nweadk\t50053\n半跟\t50054\n干嘛不错\t50055\nYJC\t50056\n肉眼\t50057\n来居上班\t50058\nnkniwoh\t50059\nYJH\t50060\nfloza\t50061\n2年前\t50062\n带口\t50063\n作歌\t50064\n谢家荣\t50065\n调度\t50066\n诉说者\t50067\n489874\t50068\n郭伟\t50069\n杨幂启\t50070\nhdhhcxgxh\t50071\n叶萝莉\t50072\n何超欣\t50073\n抹杀\t50074\n1个小时\t50075\nforkm\t50076\n广电课\t50077\n手法家\t50078\nhellokittu\t50079\n07年\t50080\nhellokitty\t50081\n李夫人\t50082\n肆虐\t50083\n中国驻日本大使馆\t50084\n半路\t50085\n千丝万缕\t50086\njji\t50087\n3月6日\t50088\n南京大屠杀\t50089\n作死\t50090\n千山\t50091\n洗浴先背红我谈恋爱\t50092\n欧亚大陆\t50093\n古久\t50094\n四囚4\t50095\nLenovo\t50096\n15852029971\t50097\ndhiphotosbaiducomxiaodupicitem4d086e061d950a7b11d
27cb00dd162d9f3d3c9dajpg\t50098\n说句道歉\t50099\n一站式\t50100\n搞基膜我\t50101\naiisjs\t50102\n猪胎\t50103\njjh\t50104\n残存\t50105\n123456789987456321123456789987654321123456789987654321123456789987654321123456789987654312\t50106\n陈三好\t50107\n思密达思密达思密达\t50108\n小隆\t50109\n沈贵人\t50110\n抽了看\t50111\n乱收费\t50112\n古书\t50113\n九月九月\t50114\n数一个\t50115\n演变\t50116\n猪胩\t50117\n王红梅\t50118\nsligaa\t50119\n棒小\t50120\nhuevd\t50121\n漏水\t50122\nSQL服务器\t50123\n嗯嗯曲项向天歌白毛浮绿水红掌拨清波鹅鹅鹅曲项向天歌白毛浮绿水\t50124\n乖我爱\t50125\n陈小坚\t50126\n花千骨化\t50127\n毕天赐\t50128\n世界纪录\t50129\n水果机\t50130\n刘英山\t50131\n头头\t50132\nk度秘\t50133\n8周\t50134\n紧身\t50135\n超流行\t50136\n头太\t50137\n林祺恒\t50138\n北京大学\t50139\n三七零\t50140\n110白\t50141\n一个1块\t50142\n制造商\t50143\n受够你了我不叫\t50144\n五系\t50145\n五糸\t50146\n耕犁\t50147\n摸乜\t50148\n闻人\t50149\n心情好\t50150\n光伏\t50151\n石头剪子布\t50152\nAndroid#\t50153\n犯案\t50154\n影月\t50155\n7月2日起\t50156\n一房\t50157\n嗯嗯52ourzloo\t50158\n一戴\t50159\n赶件\t50160\n一户\t50161\n仁义\t50162\n雨洼\t50163\n两点半\t50164\n四秒钟\t50165\n一截\t50166\nMcGregor\t50167\nggnggngngngngngngnhhngnhhhhghgggngnfnfngngngngggngnngngnggfnf\t50168\n安启航\t50169\n黑鬼\t50170\nsgjk\t50171\n大老婆\t50172\nhhhhhhhhhhhhhhhh\t50173\n真情假的我问你道题度秘\t50174\n会计\t50175\n一战\t50176\n来看电视\t50177\n相似处\t50178\n一成\t50179\n美女画\t50180\n一戒\t50181\n头陌路诺六六大顺\t50182\n王卡\t50183\n一戏\t50184\n看样子\t50185\n有点之五\t50186\n你子\t50187\n神球\t50188\n大悦城\t50189\n废铜\t50190\n小苏姻\t50191\n别老骂我行吗别老骂我行\t50192\n虚度\t50193\n2F-03\t50194\n给我开太\t50195\n伍龙翔\t50196\n团团转\t50197\n见仁见智\t50198\nyoo\t50199\nyon\t50200\n昏鸦\t50201\n乌特\t50202\nyoh\t50203\n生态公园\t50204\nyof\t50205\nyoe\t50206\n李凡然\t50207\n不要俩\t50208\nyoy\t50209\nyox\t50210\nyow\t50211\nyou\t50212\n三十六十五十八十\t50213\nyos\t50214\n接茬\t50215\nyop\t50216\n相伴随\t50217\n至沙骄亏\t50218\n经济效益\t50219\n倒车\t50220\n苏说真\t50221\n奥铃捷运\t50222\nboing\t50223\n倒转\t50224\n耒阳阳\t50225\n呼转\t50226\n哦度秘呗\t50227\n沙县公安局\t50228\n14444444411\t50229\n阮婧怡\t50230\n从天而下\t50231\nbuilding\t50232\n牛蜘蛛侠\t50233\n大塊意气\t50234\nliapma\t50235\n跟我讨厌\t50236\n功勋\t50237\nhhjklnbvc\t50238\ngsshionmmitabbawom\t50239\n别
老发\t50240\n住宿费\t50241\n野花\t50242\n免税店\t50243\nghhcnbzbVlllcl\t50244\n毫不留面\t50245\n粗面\t50246\n贴贴饽\t50247\n好妻子\t50248\n天天有喜之人间有爱过美丽的秘密\t50249\n学前\t50250\n上水渍\t50251\n狗肚子\t50252\n扭闺密\t50253\n声筒\t50254\n布洛克\t50255\n黑皮波斯呆兔咪\t50256\n过来吧\t50257\n妤凌\t50258\n水漾\t50259\n混饭吃你看我额凤凰城\t50260\nRAPPER\t50261\n切波斯\t50262\n免于\t50263\n婷钰\t50264\n江苏舜天足球俱乐部\t50265\n给你呗背首诗从前明月光疑\t50266\n一二五\t50267\n风和日丽\t50268\n泽民\t50269\n呱啦呱啦呱啦呱啦呱啦乖啦乖老乖老乖老乖\t50270\nFuDLe\t50271\n爱听得\t50272\n度秘VB\t50273\ndyjbffhjvcgkjff\t50274\n五六七八九十十一十二十三十\t50275\n刘正三\t50276\n西度秘\t50277\n跨入\t50278\nAB冖\t50279\n22443367890\t50280\n最轻\t50281\n5W\t50282\n一万两千颗\t50283\n一个8多\t50284\n重头戏\t50285\n执政党\t50286\n上海红十字医院\t50287\n云影赏荷\t50288\n0.19亿元\t50289\n剃头八怪\t50290\n见儿\t50291\n百叶箱\t50292\n法行\t50293\n千千帅\t50294\n制敌\t50295\n娄米拉\t50296\n访美\t50297\nxvccfh\t50298\n经联\t50299\n安奇夜深\t50300\n魈\t50301\n波黑塞族\t50302\n一到九\t50303\n上周\t50304\n陈大愚\t50305\n美若天仙\t50306\n几针\t50307\ns3z龙\t50308\nniliria\t50309\n上官玉成\t50310\n阿泽尔逊湖\t50311\n天各一方\t50312\n张个\t50313\n宋词\t50314\nfigma\t50315\n林瑞\t50316\n1000年\t50317\n私产\t50318\nmombaby\t50319\n万重山\t50320\n6月2日\t50321\n很可恨\t50322\n三汇镇\t50323\n蒙奇奇\t50324\n3543646\t50325\n一日百米\t50326\n腰果\t50327\njjjjjjjjj\t50328\n筷子兄弟\t50329\n21瓶\t50330\n你好帅哥\t50331\n朴龙妃\t50332\n示意图\t50333\n走了我走\t50334\n喽玉山\t50335\n配额\t50336\n罗子\t50337\n御宅族\t50338\n告诉我我不会\t50339\n然多米\t50340\n8月15日\t50341\n2612\t50342\n面对不起\t50343\n笑遖\t50344\n刑事案件\t50345\n白见\t50346\n上流社会\t50347\n人流\t50348\n石兆浩\t50349\n一十十一\t50350\n恩来\t50351\n沙哈拉哈\t50352\n百五元\t50353\n问天\t50354\n垓下\t50355\n叶良辰刁\t50356\n接着我\t50357\n1844316552\t50358\n明天早上7点\t50359\n李嘉博\t50360\n伱存\t50361\nHcgvvhhch\t50362\n太原街\t50363\n奚志康\t50364\n批塞赛\t50365\n福u\t50366\nTdhj\t50367\nKtd\t50368\n破链\t50369\n射击\t50370\n射出\t50371\n小伴龙吧\t50372\n红润\t50373\n劫难\t50374\n士多\t50375\n出品\t50376\n第三第四\t50377\n大蓉蓉\t50378\n天天都需要你爱我的心思有你猜iloveyou\t50379\nhoaihailinh\t50380\n李院长\t50381\n拔地而起\t50382\n初赛\t50383\n想哭\t50384\n不易\t50385\n金风树\t50386\n凯乐石\t50387\n不要你了我\t50388\n阿哲场\t50389\n瞎货\t50390\n娘娘\t50391\n百期\t50392\n
王慧仪\t50393\n立柱\t50394\n不明\t50395\n别字\t50396\n百木\t50397\n答谢\t50398\n雅过分\t50399\n百机\t50400\n还禁\t50401\n赵兰彬\t50402\n北门口\t50403\n排除\t50404\n二零五\t50405\n不是\t50406\ntatjmt\t50407\n李仪丰\t50408\n地锅鸡\t50409\n下轨\t50410\n提议\t50411\n秘夸\t50412\n悦享拍\t50413\n歹毒\t50414\n三合安客行区全四卡虚无bosstm\t50415\nernvmudu\t50416\n诗诗\t50417\n诗词\t50418\nMRS5\t50419\n秘太\t50420\n颖颖\t50421\n我的肋骨\t50422\n秘大\t50423\n提记\t50424\n无意外\t50425\n哨声\t50426\n卫龙辣卫龙\t50427\n天龙话年听不懂人话年\t50428\n是不想\t50429\n庞各庄镇西梨园\t50430\n猪朱猪朱猪\t50431\n舞魅\t50432\n譥\t50433\n秘处\t50434\n28000块\t50435\n动物们\t50436\n遗音\t50437\n没有人知道\t50438\n尊姓\t50439\nhghhb\t50440\n大悟\t50441\n鲜\t50442\n舍部\t50443\nav人马\t50444\ntnnd\t50445\nMIC\t50446\n3000多家\t50447\nMID\t50448\nMIG\t50449\n唐僧\t50450\n再见了\t50451\n鲁\t50452\n坏蛋坏蛋坏蛋坏蛋坏蛋超级大坏蛋坏蛋\t50453\n卡特尔\t50454\nMIT\t50455\n王若丁\t50456\no12\t50457\n鲸\t50458\n僵死一潭\t50459\n扫码淘金币\t50460\n尹儿\t50461\n再见亲\t50462\n济州联\t50463\n咳嗽痰多\t50464\n鲵\t50465\n大悲\t50466\nMIf\t50467\n批塞\t50468\n华润万家\t50469\n鲫\t50470\n挑有\t50471\n树林子\t50472\n超人正大光明\t50473\n东侧\t50474\n别聊了再见\t50475\n黑楠瓮ly\t50476\n破表\t50477\n再来句\t50478\n花语\t50479\nCEO轮值制度\t50480\n该组\t50481\n摩擦文\t50482\nwsssss\t50483\n张文静\t50484\n七夕\t50485\n七多\t50486\n娘片\t50487\n旗舰级\t50488\nlymmq\t50489\n七夜\t50490\n6月20日\t50491\n南湖元\t50492\n七大\t50493\n郭一泽\t50494\n蝽\t50495\n机中机\t50496\n七天\t50497\n我是真的爱死你\t50498\n第888名\t50499\n小心我告诉\t50500\n平果旧城\t50501\n七、八分钟\t50502\n花话\t50503\n赵赵\t50504\n洲立\t50505\n想你的话\t50506\n一杯子\t50507\n朝蛮\t50508\n胳膊肘\t50509\nftffgvggggfgcggfgggggghvd\t50510\n中信出版社\t50511\n退特\t50512\n农事\t50513\n上政公考\t50514\n深究\t50515\n300兆\t50516\n老走\t50517\n退出\t50518\n近二个月\t50519\n老赵\t50520\nbeyond\t50521\n砀山\t50522\n兴风\t50523\n锋云\t50524\n6668\t50525\n一个三个\t50526\n好你笑你笑你笑\t50527\n有荤\t50528\n有礼貌真是\t50529\n6666\t50530\n发夹\t50531\n火眼金睛\t50532\npfbys\t50533\nIujvv\t50534\n天天汽水\t50535\n索宝贝\t50536\n烦累\t50537\n帕斯卡尔\t50538\n诺布雷\t50539\n沙子河\t50540\n哎呦兔兔\t50541\n罗辑\t50542\n美神\t50543\n137克\t50544\n暗夜\t50545\n1498多单\t50546\n灵儿\t50547\n黑b\t50548\n暗处\t50549\n假领\t50550\n检讨\t50551\n326435255553\t50552\
n携程\t50553\n大力\t50554\n条假\t50555\n郑鑫\t50556\n芭比娃娃之\t50557\n666k\t50558\n朱子学\t50559\n偷人\t50560\n着不好\t50561\n包味客\t50562\n偷亲\t50563\n李诗琴\t50564\n搜图\t50565\n梦工场#\t50566\n浸入\t50567\n土城才\t50568\n十万八千里\t50569\n一三岁\t50570\n勾搭组\t50571\n李诗琪\t50572\n嗯行行\t50573\n生灵\t50574\n高雪娟\t50575\n五征\t50576\n四白雪公主\t50577\n双筷子\t50578\n炒鱿鱼\t50579\n主从何来\t50580\n奶居\t50581\n凤凰网读书会\t50582\n战舰\t50583\n饮水机\t50584\n本地人\t50585\n1111111111111111111111100000000000000000000000\t50586\n吉鲁\t50587\nUSM卡\t50588\n郭碧岛\t50589\nDukeissues\t50590\n抑郁\t50591\n主要头\t50592\n说不清楚\t50593\n昨地\t50594\n岁冒\t50595\n吴冠中\t50596\n似玉\t50597\n小木露琪亚\t50598\n死劲\t50599\n厦门日航酒店\t50600\nndc病毒\t50601\n蓝蓝天空飞彩霞骑上了我的小红马\t50602\n一点八点\t50603\n外公\t50604\n说不尽\t50605\n杨狗\t50606\n百集\t50607\n笨种\t50608\n宋凯强\t50609\n丹也\t50610\nchgggg\t50611\n广州体育馆\t50612\n十七日\t50613\n条绒\t50614\n点像\t50615\n县志办公室\t50616\nzwurckt\t50617\n魔高一尺\t50618\n这个气\t50619\n大加\t50620\n腾飞\t50621\n面位\t50622\n喜贵\t50623\n这只有一个人\t50624\n5x\t50625\n力求\t50626\nadkknuy\t50627\n喜败\t50628\n牢狱\t50629\n看重忙\t50630\n跑回\t50631\n9887\t50632\nlif\t50633\n支取\t50634\nlia\t50635\nlil\t50636\nOK摩卡\t50637\nlin\t50638\n1月3日\t50639\nNid\t50640\n好啦好啦不来你了逗你\t50641\nlit\t50642\nliu\t50643\nliv\t50644\nliw\t50645\njouc\t50646\n被查封\t50647\n申轨迹\t50648\n没我可\t50649\njouk\t50650\n草木\t50651\nNiu\t50652\n东京食种\t50653\n杜风霖\t50654\n长白猪\t50655\n羡慕的呀不知道\t50656\n消费者转换率\t50657\nJawed\t50658\n552854\t50659\n杯权\t50660\nxkssididosidodjdxj\t50661\n马星雨\t50662\nUICKBLViviFFif\t50663\n我的命\t50664\n真的好想你\t50665\n螺号\t50666\n仨S阿S\t50667\n15851229999\t50668\n薯\t50669\n其實\t50670\n艺术化\t50671\n影流之主\t50672\n5.10.13\t50673\n那你可以爱我了着了\t50674\n往昔\t50675\nsesea\t50676\n露涵\t50677\n曹奔\t50678\n生涯\t50679\n根你\t50680\n四三物\t50681\n杯杯\t50682\n孙家璐\t50683\n纬度31734388\t50684\n北航\t50685\n9页\t50686\n给我说\t50687\nviaaa\t50688\n客兄弟姐妹\t50689\n专一一专一专一\t50690\n经纪公司\t50691\n够了吧\t50692\nsheiw\t50693\n使岁\t50694\n艾玛\t50695\n不入虎穴焉\t50696\n喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔么喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔喔\t50697\n工商银行\t50698\n壁嘴\t50699\nTXGMGN\t50700\n第吉尔\t50701\n答n案\t50702\n凯翼\t50703\n李东
阳\t50704\n100条\t50705\n微胖\t50706\n啊阮\t50707\n蔡庄\t50708\n该冲\t50709\n一个几\t50710\n笑一个再笑\t50711\n云进\t50712\n884525452764957657980784874847979728\t50713\n爬床\t50714\n钛金石\t50715\n走就\t50716\n速溶\t50717\n不好使\t50718\n顿村\t50719\n监管者\t50720\n老老老\t50721\n吖咪\t50722\n哎呀呀尽情\t50723\n张西西\t50724\n哪些\t50725\n炫真神马\t50726\n球迷们\t50727\n信秘\t50728\n四十七分\t50729\nZAL\t50730\n月魅\t50731\n吴云磊\t50732\nZAI\t50733\nventnul\t50734\n640骨\t50735\n健力宝\t50736\n孝庄\t50737\n好几年\t50738\nQQ额特\t50739\n无数次\t50740\n四级阅读书\t50741\nwyfvhf\t50742\n哪人\t50743\n彻彻底\t50744\n三光犬\t50745\n一百九十多块\t50746\n赛问赛\t50747\ncsse\t50748\nwifi信号\t50749\n弩手\t50750\n放轻松\t50751\ncssd\t50752\n回回\t50753\n30.83%\t50754\n阿狸\t50755\n敷设\t50756\n阿狼\t50757\n满嘴\t50758\n两声\t50759\n温顺杰\t50760\n一个二层\t50761\n選料選\t50762\n无意义\t50763\n鼓起勇气\t50764\n礼拜礼拜\t50765\n吴春怡\t50766\n昂智能你是我肖不许学我\t50767\n回国\t50768\nGAKN\t50769\n永旺超市\t50770\n说妈呀\t50771\n林佳慧\t50772\n吴孟超\t50773\n七位数\t50774\n牡丹图\t50775\n秀美好好\t50776\n陶月莹\t50777\nCZ6303\t50778\n水性\t50779\ndddgh\t50780\n81698\t50781\n乱轮\t50782\n18067568651\t50783\n拖五\t50784\n水怪\t50785\n充爆\t50786\nboth\t50787\nhhqqq\t50788\nboto\t50789\nfhykfjhf\t50790\n麻溜\t50791\n张沫凡\t50792\n坏人家\t50793\n请稍后\t50794\n冬春\t50795\n不久后\t50796\ndybodubuh\t50797\n观念\t50798\n我是网\t50799\n港酒\t50800\n坐具车\t50801\n社联\t50802\n品行\t50803\n尖峰粲粲\t50804\n腊月18\t50805\n画家\t50806\n窑洞秘\t50807\n关石林\t50808\n河水面\t50809\n名写\t50810\n画室\t50811\n陆星宇\t50812\n晓东迷雾怠慢你我错\t50813\n散户\t50814\n富丰\t50815\n茶面包\t50816\n八八两天\t50817\n脾曲\t50818\nBwBfXnwva7ejccBeMX\t50819\nhopetodo\t50820\n会影\t50821\nChrome浏览器\t50822\n重器\t50823\n散成\t50824\n广电\t50825\n畅享\t50826\n告艘\t50827\n哀家\t50828\nvggzbghfbv\t50829\n听不找\t50830\n交替\t50831\n花漫\t50832\nllllljia\t50833\n双结\t50834\n蜘蛛机器人\t50835\n乌鲁木齐站\t50836\n喜欢你我要\t50837\n入山处\t50838\n叫丘\t50839\n3根\t50840\n张欣强\t50841\n一澳\t50842\n坏死\t50843\nwhile\t50844\n降解\t50845\n维他命面霜\t50846\n旧忧愁\t50847\n打秒\t50848\n对啊好好听\t50849\n一澡\t50850\n宛筠\t50851\n90344\t50852\n献唱\t50853\n漂流\t50854\n进眼\t50855\n簇气球飞行世界纪录\t50856\n惹我哭\t50857\n陈振林\t50858\n救助\t50859\n听不难不女\t50860\n班周汪洋\
t50861\n面板\t50862\n漂浮\t50863\n听感\t50864\n独霸\t50865\n张艺林\t50866\n肏肏肏肏肏肏肏肏肏肏\t50867\n狱\t50868\nkhgf\t50869\n面条\t50870\n古代宫\t50871\n四十多分\t50872\n放困\t50873\n冷水面\t50874\n7x7十\t50875\nhhsiskaknd\t50876\n我讨厌你我讨厌你我讨厌头\t50877\n尘心\t50878\nG55\t50879\n本山传媒\t50880\nboo\t50881\n厚德\t50882\n董淼\t50883\n傻眼愣住\t50884\n名人酒店淘宝旗舰店\t50885\n常常\t50886\n迅先生\t50887\nuvuv\t50888\n木料\t50889\nugvvnf\t50890\n75年\t50891\n你是头毛驴你是头猪你是头大范女士头花\t50892\n卡一卡二里\t50893\n十一岁\t50894\n常帅\t50895\n文凭\t50896\n囗囗囗囗囗囗\t50897\n122889\t50898\n一席话\t50899\n呢姐\t50900\n全点\t50901\n我是问你我长的帅不帅\t50902\n姚氏兄弟登高处遍插茱萸少\t50903\n薪金\t50904\n综合性\t50905\n东波\t50906\n墨梅\t50907\n狮\t50908\n次城县\t50909\n数码机\t50910\n寿司水鹏山附小\t50911\n18年后\t50912\n羽毛球\t50913\n没收费\t50914\n农商银行\t50915\nchathigk\t50916\n41万\t50917\n756家\t50918\n当世人\t50919\n1IPA\t50920\n罩牛\t50921\n10.4万次场\t50922\n品相\t50923\n實現\t50924\n哈那嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯\t50925\n章红\t50926\n么式\t50927\n大无畏\t50928\n歪头宝\t50929\n零零零二十二个\t50930\n半心凉\t50931\n生力面\t50932\n您好亲\t50933\n喝酒鬍\t50934\n华藏\t50935\n溜走\t50936\n1堆\t50937\n泡泡堂卡丁车\t50938\n杨舒\t50939\n向上\t50940\n向下\t50941\n毛毛刚\t50942\njibv\t50943\nenlll\t50944\nibbmcbkygk\t50945\n四分之三\t50946\njiba\t50947\n小年夜\t50948\n向东\t50949\n明其妙\t50950\n林宝金\t50951\njibk\t50952\n七界\t50953\n此案\t50954\n热容\t50955\n杰尼斯\t50956\n755870899\t50957\n郑一鸣\t50958\n甘肃农科种业\t50959\n命素素\t50960\n度秘你好你猜猜我是谁\t50961\n金龙蛤蟆镜km\t50962\nmusni\t50963\n午三时十五分\t50964\n高尼奥\t50965\n寻子网&amp\t50966\n向丽\t50967\n启禀\t50968\n13162060017\t50969\n又错\t50970\n索求\t50971\n裸官\t50972\n水晶小学\t50973\ne63港荣\t50974\n3226552329\t50975\n还记的我是谁\t50976\n睡醒了\t50977\n右岸\t50978\n燕尾服\t50979\n第堡\t50980\n一四三八四三八一九四三八\t50981\n坏蛋小坏蛋小坏蛋小坏蛋\t50982\n5遍\t50983\n大韩\t50984\n飞火\t50985\n天照句\t50986\n巨富\t50987\n朱文旭\t50988\n投机\t50989\n我说东你说西我说东你说西\t50990\n北塔十二法\t50991\n徐梦静\t50992\nhello堡\t50993\n淘宝旅行订\t50994\n5道\t50995\n豪客\t50996\n极力\t50997\n比熊犬\t50998\n晚上九点\t50999\n醒了吧爸爸\t51000\n周小凤\t51001\n陆马天马独角兽空角兽\t51002\n爱你好不好\t51003\n红树林\t51004\n人民们\t51005\n辣法\t51006\n燕他\t51007\n我是一个兵\t51008\nbcdef\t51009\n12.5亿\t51010\n烧热\t51011\n这场\t51012\n狂\t51
013\n办理\t51014\n豪宅\t51015\n姜罚\t51016\n#海飞丝\t51017\n这地\t51018\n闽粤站\t51019\nhzhs\t51020\n出类拔萃\t51021\n九孔\t51022\n总亿丰\t51023\n大猫\t51024\n三弚\t51025\n那你说说的的子\t51026\n一夜情\t51027\n坐落\t51028\nanjlg\t51029\n雅正\t51030\n拆分\t51031\n一百年前\t51032\n会吧度秘你的秘密\t51033\n湖南台跨年演唱会\t51034\n毒地\t51035\n四排\t51036\n1jtjadm\t51037\n画蛇添足\t51038\n嗯圣诞省\t51039\n五亿元\t51040\nmasdf\t51041\n咋的我\t51042\ncvlis\t51043\n粤宏远\t51044\n家庭史\t51045\n12月10日\t51046\n美好生活\t51047\nDChh\t51048\nhi你好度秘\t51049\n乔科尔\t51050\nxhyod\t51051\n哀家乐呵\t51052\n善后\t51053\n中国联合网络\t51054\n一个半月\t51055\n从良\t51056\n暴涨\t51057\n小傻子晚安度秘\t51058\n佛罗菠萝蜜\t51059\n安防\t51060\n元咨\t51061\n4寸\t51062\n快点告诉我可爱的歌\t51063\n咯莱斯特\t51064\nfffdgdfdget\t51065\n好心焦\t51066\n因为我喜欢他\t51067\n次日清晨\t51068\n征父\t51069\n了看\t51070\n超访\t51071\n黑龙江卫视\t51072\n天下大事\t51073\n用餐\t51074\nfxxfvvfvvvfvefvefexfebtrbyevtgtggyggbttge5卡11\t51075\n高达里达\t51076\nhgtrrrrffgfccffffffddgffdffffffghggbbvccccccccccvvvbvvvvcvvvvbbbbbbbbbvvvbbhhgfgwrtrhenhb\t51077\n元和\t51078\n新生活\t51079\nciki\t51080\n范本\t51081\n赠完\t51082\n娄赛\t51083\n昨天晚上一时\t51084\nassiii\t51085\n三朝\t51086\n二五章\t51087\n段义锋\t51088\n灵石\t51089\n月真的心\t51090\n澎澎\t51091\n111111222222333334444445555566666777778888899999900000\t51092\n杰娜\t51093\n92爸爸\t51094\n21966\t51095\n唔识楼\t51096\n撒坦\t51097\n不能自予\t51098\n小爱钟\t51099\n我是女的我的父母我的父母\t51100\n性别男or女\t51101\n3737\t51102\n毛阿\t51103\n聊过\t51104\nk宝\t51105\nk客\t51106\n6月18日起\t51107\n4927\t51108\n都市报\t51109\n圆其说\t51110\n呜了\t51111\njianyimian\t51112\n连招\t51113\noohibk\t51114\n刘可清\t51115\n哼ノДノ┻━┻\t51116\n喜羊一\t51117\n翟辉霞\t51118\n比兴\t51119\n17700元\t51120\n99800\t51121\n庙门\t51122\n难以置信\t51123\n103个\t51124\n请慎重\t51125\n超级舞曲\t51126\n绝不容易\t51127\n一个了72分\t51128\nstoumetotou\t51129\n陆诗敏\t51130\n邓宇阳\t51131\n战斗力\t51132\n过一心的翅膀\t51133\n我的爱\t51134\n魅颜\t51135\nvillahe\t51136\n我的爸\t51137\n点不要不我妈妈\t51138\nshjsh\t51139\n奖奖\t51140\n热额\t51141\n44568696382850\t51142\n可恨者\t51143\n呀秘\t51144\nyoutube\t51145\n1123456789\t51146\n呢儿子\t51147\ngdud\t51148\n爱国者\t51149\n触抓\t51150\n一英\t51151\n诡计多端\t51152\n绿地\t51153\n因地制\t511
54\n宁波党校\t51155\n弗费\t51156\n芒西米\t51157\n42年\t51158\nkc7\t51159\n明慧\t51160\n87厘米\t51161\n列维\t51162\n滚儿\t51163\n储存器\t51164\n日环食\t51165\n孚日\t51166\n嘴皮子\t51167\n大凯撒凯撒凯撒凯撒卡卡\t51168\nｔｏｏ\t51169\n地架\t51170\n教授家\t51171\ndoer\t51172\n空白鬼\t51173\n可用\t51174\n摄制\t51175\n古版\t51176\n哥果\t51177\n萌帅\t51178\n想懂\t51179\n没错别说\t51180\n52万\t51181\n明慌\t51182\n3q叁漫\t51183\n饮水处\t51184\nkcc\t51185\n蜘蛛\t51186\n花蕊\t51187\n000001515541562420\t51188\n谢玲玲\t51189\n易庆\t51190\n晚安吧好梦萌达达的打兔兔\t51191\nGhhhvhfbjfvgdfhfasciclesthefirsttimeItrynottoolatecoffeegigtonightat7pmtonightfightback\t51192\n还近视\t51193\n动手评\t51194\n2056米\t51195\n豆豆豆豆\t51196\n18669107112\t51197\n得帅\t51198\n自建\t51199\n叔齐\t51200\n度秘100100\t51201\n开都\t51202\n大薛\t51203\n大话西游\t51204\n走路\t51205\nxggngzjzjxg\t51206\n蜗牛\t51207\n规矩\t51208\nbuyongle\t51209\n机器时代公司\t51210\n分割\t51211\n立马话\t51212\n莫无邪\t51213\n不一而已\t51214\n张尚可\t51215\n不好好心好意\t51216\n徐显博\t51217\n滤色镜\t51218\n要不在\t51219\n七二期\t51220\n段笑莹\t51221\n五千年后\t51222\n哈洗\t51223\nardent\t51224\n两三分钟\t51225\n去年5月\t51226\n3524676\t51227\n王好慢\t51228\n嗎聰明\t51229\n酸不溜\t51230\n牙周炎假如元\t51231\ndufufy\t51232\n分钟中银\t51233\n不要矢\t51234\n揍吧\t51235\n烦了了没有\t51236\n坦克车\t51237\n作多情\t51238\n深知\t51239\n晓得勒\t51240\n朱要\t51241\n怕人微言轻\t51242\n冰涼的夜覆蓋了我的淚水\t51243\n十二年\t51244\n秘钱\t51245\n乌珠突突美图图图图图图图图图图图图了了\t51246\n师恩难忘求你\t51247\nx佛彡iQ多彡升冬i耋\t51248\n天庭\t51249\n走来走\t51250\n感叹句\t51251\n嗤噗嗤噗嗤\t51252\n一个你好\t51253\n你好我是和\t51254\n感叹号\t51255\n化为\t51256\nJikxkbx\t51257\n武林风环球\t51258\n苞谷\t51259\n柳柳\t51260\n1580丶\t51261\nGF5#GH5\t51262\n长椅\t51263\n求求你了你帮我行不行啊我我我就是很想吃啊求求你了求求你\t51264\n听唱歌\t51265\n检电\t51266\n乐曲\t51267\n天府\t51268\n喔喔喔喔\t51269\n艾斯奥\t51270\n黄花\t51271\ndanger\t51272\n1213111213131511\t51273\n数儿\t51274\n苹果醋\t51275\n留好喜欢\t51276\n另谋\t51277\n生疏\t51278\n东多\t51279\n黄芪\t51280\n东大\t51281\n乙己千\t51282\n企业日记\t51283\n台资\t51284\n骚男\t51285\n东头\t51286\nsqpd\t51287\n大师姐\t51288\n广东代表团\t51289\nKOAIA\t51290\n慎言\t51291\n门类\t51292\n百度钱包\t51293\n丢雪纳瑞闹\t51294\njdmgejejd\t51295\n哈oP\t51296\n跨着\t51297\n点映\t51298\n碎玻璃\t51299\n小鸡儿\t51300\n眼睁睁\t513
01\n王航飞\t51302\n都是因为\t51303\n朝政\t51304\n象声词\t51305\n礼敦\t51306\n意大利杯\t51307\n总共\t51308\n朱增光\t51309\n手持\t51310\n哪班机\t51311\nguknn\t51312\nridh\t51313\n江磊\t51314\nNjkl\t51315\n大吃货\t51316\nB样\t51317\n赵怡睿\t51318\n至此\t51319\n多才不若蓄德\t51320\n开车\t51321\n涛声\t51322\n黑魔\t51323\n陈真\t51324\n一起走路上学\t51325\n金地中心\t51326\n工工\t51327\ndaks\t51328\n谢和耐\t51329\ncuitco\t51330\n黄龙三关\t51331\n柿子椒\t51332\n48元\t51333\n小花园\t51334\n猪女\t51335\n快立春\t51336\n电镜\t51337\n恩布沙\t51338\n金鸡湖\t51339\n老孑\t51340\n小明君\t51341\n棒曲\t51342\n苏来红\t51343\n有失蹄\t51344\n陈曼莲\t51345\n模错\t51346\n电镀\t51347\n虚拟\t51348\n泰文\t51349\n阿豪\t51350\n我喜欢你我想和你\t51351\n花呗岁\t51352\n新气象路广场路\t51353\n李红岩\t51354\n吾世勋\t51355\n泰斗\t51356\n能寐\t51357\n你好哥哥哥哥你好\t51358\n指不定\t51359\n长吧\t51360\nAlexia\t51361\n邬博舜\t51362\n蘑艾妮\t51363\n高高起球\t51364\n长吟\t51365\n路桥恩泽医院\t51366\n10点35分\t51367\naujdjsjs\t51368\n54455只\t51369\n7854478\t51370\n妈的你们太假了\t51371\n俊德\t51372\nl1分钟\t51373\n剖开\t51374\n长吊\t51375\n从风\t51376\n能对\t51377\n一23456780658\t51378\n扫扫\t51379\n西丽\t51380\n千辛万苦\t51381\n粮库\t51382\n感染力\t51383\n堡堡\t51384\n落脚\t51385\n购物\t51386\n腱鞘炎\t51387\n段宇欣\t51388\n够不舒服\t51389\nkga1a1\t51390\n歼-15\t51391\n百零五\t51392\nmonogamy\t51393\n靠不行\t51394\n谢啲\t51395\n孙雅楠\t51396\n8848钛金\t51397\n藏族\t51398\nhjdghyjdha\t51399\n谢啦\t51400\n182张\t51401\n雅蠛蝶移库奇磨鸡扫戴斯噶\t51402\n二件\t51403\n二份\t51404\n仲裁者\t51405\n二任\t51406\n裕勋龄\t51407\nPK地\t51408\n二代\t51409\n瞻仰\t51410\n拍板\t51411\n日寇\t51412\n嗯家小区\t51413\n网厅\t51414\n二仔\t51415\n聚缘山庄\t51416\n贾佳宇\t51417\n讨厌干\t51418\n老蛇集团\t51419\n吹手\t51420\n吖头\t51421\n嗓门\t51422\n六月三十号\t51423\n天天有喜吗人间有爱\t51424\n苏西润\t51425\n14130.8万股\t51426\ncvvbn\t51427\n追杀者\t51428\n99.2G\t51429\njhabs\t51430\n微博仙\t51431\n15869096292\t51432\n鑫平\t51433\n你好好猜一猜\t51434\n莘庄\t51435\n25分之一\t51436\n愚5\t51437\n855387885\t51438\n民间舞蹈\t51439\n行不明\t51440\n武冈\t51441\n张国立\t51442\n挖走\t51443\n五好我要\t51444\nDfdggtvufhyfjipkrsdrddyhcdfczshbdsfffseghedfyyccjedvhfeddvhjjf\t51445\n沙县小吃吃馄饨\t51446\n床系\t51447\nFjc\t51448\nfdeert\t51449\n大过来\t51450\n让开\t51451\n灵魂战车2\t51452\n思密达把思密达\t51453\n皱褶\t51454\n挂住\t
51455\n惊恐\t51456\n位文\t51457\n华中师大\t51458\n李启铭\t51459\n威捷\t51460\n来了就问\t51461\nbongbong\t51462\n朱小涛\t51463\n41个\t51464\n女女人\t51465\n王远建\t51466\nyouto1\t51467\n树藤\t51468\n维罗妮卡\t51469\n结婚就是你爱我我爱你我\t51470\n八九十\t51471\n八九千\t51472\n乔海洋\t51473\nitou\t51474\n木形\t51475\n操作人总该分男女\t51476\n位於\t51477\n再感\t51478\n叶陌\t51479\n电梯\t51480\n夏欣怡\t51481\n五大点\t51482\n一拉克\t51483\n井思源\t51484\n聊天\t51485\n击穿\t51486\n加藤嘉一\t51487\n赌城风云\t51488\n绘制\t51489\n527688832465724657376573146573279573576273274\t51490\n耐人寻味\t51491\n雪月花\t51492\n武艺轩\t51493\n一八九四\t51494\n日记么个秋\t51495\n打进\t51496\n火车头\t51497\n高富强\t51498\n累累累累\t51499\n每隔30分钟\t51500\n后海胡同\t51501\n蒲团\t51502\n巨流\t51503\n不准说听\t51504\n绿日\t51505\n18名\t51506\n度恩\t51507\n9.9\t51508\n娇媚\t51509\n宝贝有你真好\t51510\n吊样\t51511\nodnemd\t51512\n闲呆\t51513\n八钢\t51514\n大魔仙\t51515\n呼吸液\t51516\n西红柿鸡蛋拌酒\t51517\n麽有麽有麽有麽有麽有麽有麽有麽\t51518\nxylx\t51519\n延伸段\t51520\n浙江电台\t51521\n美度\t51522\n胖头大成\t51523\n半块\t51524\nHbhvhb\t51525\n三百八二百二\t51526\n如图圆i\t51527\n80多\t51528\n坐坐坐\t51529\n我不我拒绝\t51530\n青小蓉\t51531\n立名\t51532\n长嘉诚\t51533\n再来一站\t51534\nvV踩踩踩VB\t51535\n神片\t51536\n北理工\t51537\ntabs\t51538\n南阳\t51539\n说呀你是\t51540\n真心相爱\t51541\n37位\t51542\n莹莹莹莹\t51543\n好时期\t51544\n突发事件\t51545\n撒谎\t51546\n张媛媛\t51547\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t51548\n侬勿得\t51549\n花肆意\t51550\n2.1\t51551\n阿尔弗瑞德\t51552\n一百二\t51553\n内衣\t51554\n七大洲\t51555\n仁人志士\t51556\n猜我想\t51557\n臭骗纸臭骗纸\t51558\n标明\t51559\n膀大腰圆\t51560\n辨证论\t51561\n2.6\t51562\n好时机\t51563\n迁西\t51564\n在大\t51565\n睇展\t51566\n明天凌晨\t51567\n韦秋秋明\t51568\n在天\t51569\n一九号\t51570\n八一女篮\t51571\n齐奥塞斯库\t51572\n這些東西實\t51573\ninnkomiabsw\t51574\n恩泽\t51575\n奥运\t51576\n自来希尔特\t51577\npyt\t51578\n聂志存\t51579\n黑屏\t51580\n高者\t51581\n黑屋\t51582\n高考\t51583\n高老\t51584\n丑长\t51585\n十四点半\t51586\npyd\t51587\n事到如今\t51588\n煮鱼\t51589\n吵夜\t51590\n2首冠\t51591\n多妮\t51592\n谢能有\t51593\n轮流\t51594\n比例尺\t51595\n鉴赏题\t51596\n黑山\t51597\n奥迪\t51598\n902890\t51599\n我喜欢他\t51600\n贪腐者\t51601\n招考\t51602\n淡淡\t51603\n你好面试你好野蜂蜜\t51604\n度忙\t51
605\ncouncom\t51606\n独倚\t51607\n早黑\t51608\nvhfc\t51609\n暂任\t51610\n唱一闪\t51611\n周拉力\t51612\n1一一\t51613\n叉叉叉叉\t51614\n一大群\t51615\n保密\t51616\n来嘛来嘛一起去看流星雨\t51617\n我好伤心了我的鸭鸭\t51618\nugvbhg\t51619\nwpeepdpdldlcfllfkfkkfkfkff\t51620\n二百五三百六十\t51621\n呀小不\t51622\n嗯足疗\t51623\n三九三九\t51624\n你生过孩子么事过龙凤胎\t51625\nBhkkhg\t51626\n度快\t51627\n一个了一个\t51628\n炸药\t51629\ngatmg\t51630\n度忧\t51631\n吃金针\t51632\ncocation\t51633\n不对不\t51634\n第29\t51635\n娱乐无极限\t51636\n加到\t51637\n做好说\t51638\n历城\t51639\n2011年6月7日\t51640\n读经\t51641\nstaato\t51642\n齐鲁制药\t51643\n睡着了\t51644\n零六零五\t51645\n矿泉\t51646\n下3205天\t51647\n瓶盖\t51648\n死徒\t51649\n钰儿\t51650\n飞檐走壁\t51651\n哦了因为我太难受了你跟我说话吧\t51652\n我没恨你了吧我讨厌你ok\t51653\n在家上\t51654\n胡晓华\t51655\n明辨\t51656\n三十四中附小\t51657\n1.45PIXI\t51658\n寥寥\t51659\n文化宫\t51660\n买卖双方\t51661\n李姐姐\t51662\n野性\t51663\n人身自由\t51664\n很羞瑟的赵\t51665\n嚷架\t51666\n13851326356\t51667\n冰河\t51668\n薛之谦\t51669\n財名\t51670\n44568855\t51671\n南郭\t51672\nghiphotosbaiducomxiaodupicitemf636afc379310a55457c1687b04543a98226106djpg\t51673\n泰珉\t51674\n乱码\t51675\n二十冶\t51676\n直亲\t51677\n烟盒\t51678\n恁们\t51679\n善解人意地不聊\t51680\nYdgobmpkkhj\t51681\n菲力普\t51682\nv高跟鞋\t51683\n冰沙\t51684\n任晓云\t51685\n转机\t51686\n刚里\t51687\n种田\t51688\nsourry\t51689\n湖州\t51690\nbeautifu\t51691\n一二三刊\t51692\n火片儿\t51693\n女噶\t51694\n下一页\t51695\n好运琳琳\t51696\n明珠有有度\t51697\n嚯嚯嚯\t51698\n不搞笑\t51699\n嫩子\t51700\n猪电站\t51701\n名气\t51702\n同性链\t51703\n328分\t51704\n茅于轼\t51705\n薏仁\t51706\n32012年\t51707\n遮瑕膏\t51708\nminds\t51709\n麼人\t51710\n本帅哥\t51711\nfyudfyi\t51712\n同音字\t51713\n回仙桃呆\t51714\n865456\t51715\n奸臣\t51716\n理你了真是\t51717\n日韩\t51718\n太子妃升职记\t51719\n包教包会v还好吧海不扬波vu不\t51720\n腰酸背疼\t51721\n黄有金\t51722\n牛奶鸡蛋液\t51723\n取闹\t51724\nchiphotosbaiducomxiaodupicitem32fa828ba61ea8d377f86077900a304e241f58eejpg\t51725\n东珠\t51726\nhwosgwigjw\t51727\n简直跟一模一样我真的爱死你\t51728\n闸门\t51729\n邓丽\t51730\n一万多\t51731\n蝎蝎\t51732\n徒劳\t51733\n靠不会\t51734\n博天天\t51735\n吴百福\t51736\n不差退\t51737\n组委会\t51738\n留手\t51739\n上乘\t51740\n面化\t51741\n景乐\t51742\ntxtuhIbcIm\t51743\nSgsgsgsgsggsgs\t51744\n在雨中\t51
745\n李慧艳\t51746\n爆炸式\t51747\n面包\t51748\n泸州市\t51749\n欲哭无泪\t51750\n下读\t51751\n贪求\t51752\n理据\t51753\n启下\t51754\n嗯明天就是星期一了你猜我最喜欢数字几呀\t51755\n王俊琪\t51756\n启东\t51757\n六零下\t51758\n532元\t51759\n8周岁\t51760\n死气沉沉\t51761\n花开半夏如诗如画蒲公英随风海角天涯生如夏花无可奈何花落下只剩下一梦繁华\t51762\n上班车\t51763\nｔｘｔ\t51764\n旧街\t51765\n峰峦\t51766\n真贝尔\t51767\n督主\t51768\n第五\t51769\ntomatility\t51770\n你新年\t51771\n9000万\t51772\n白菜菜\t51773\n四个\t51774\n第二\t51775\n1894\t51776\n张小娴\t51777\n1迷月2\t51778\n香叶\t51779\nttttl\t51780\n乐在石\t51781\n1万个\t51782\n彻头彻尾\t51783\nsetyh\t51784\n受贿罪\t51785\nKkkkkokp\t51786\n真言\t51787\ncrcy\t51788\nNEW\t51789\n刺毛\t51790\ng羡林\t51791\nPK布雷洛克\t51792\n2分钟\t51793\n好男不和女斗\t51794\nastmo\t51795\nNET\t51796\n白小度\t51797\n504751537\t51798\n度秘度秘咪咪呀\t51799\n引发\t51800\n七七区\t51801\n罗做\t51802\n菜谱表\t51803\n小叶榕\t51804\n3号下午13：30\t51805\n二度\t51806\n少先\t51807\n睡熟\t51808\n和你的最爱\t51809\n李小曼\t51810\n又说说说说说说说说\t51811\n睡熬\t51812\n许照烽\t51813\n和爷\t51814\n远眺\t51815\n引号\t51816\n大快人心\t51817\n呀督\t51818\n呆板\t51819\n证文\t51820\n清场\t51821\n五十二块\t51822\nIC根\t51823\n零二幺幺幺\t51824\n幻听\t51825\n十几页\t51826\n王俊凯九\t51827\n数据\t51828\n斯利那\t51829\n蒜汁\t51830\n玩伴\t51831\n2月2日至17日\t51832\n浮肿\t51833\n多远\t51834\n风之花千骨\t51835\n喂快乐吧\t51836\n画好\t51837\n来啦快跑\t51838\n了解放\t51839\n30份\t51840\n缪斯\t51841\n北大茶熏瑜伽社\t51842\n玩会\t51843\n知县肢现脂腺炎\t51844\n100万一百万多一一千万\t51845\n13557853456\t51846\n武康\t51847\n荣耀7\t51848\n3500本\t51849\npeakengilsh\t51850\n休二\t51851\n张小白\t51852\n小謩\t51853\n牛狗\t51854\n高莹雪\t51855\n无耻之徒\t51856\n搞上\t51857\n谈吐\t51858\n竹竿\t51859\n饿呢6门\t51860\n哎呀呀呀呀呀\t51861\n症结\t51862\n城河\t51863\nDAY\t51864\n嗲噶大白苏\t51865\n吴斌\t51866\n内媚\t51867\n撇哇酷\t51868\n度秘我告诉你我是女的\t51869\n红红火火红\t51870\n哭了拜拜\t51871\n新的一年\t51872\n三甲\t51873\n三男\t51874\n三电\t51875\nshen\t51876\n三画\t51877\n邻村\t51878\n礼节\t51879\n不达意\t51880\n服務\t51881\nchiphotosbaiducomxiaodupicitem8ad4b31c8701a18b0d83a43b992f07082838fe5ejpg\t51882\n2200天\t51883\n工藤新一\t51884\n真是的讨厌励磁我想和你\t51885\n猪猪侠天。女。jugghghfivbhkbjjkgijgjjnjkngbnhvblnbbvlbhjngjbbbhjbmboihgbjbuvjjmmkjnvbnbbnnnnbjj\t51886\n褒\t51887\n城西\t51888
\n清政府\t51889\n歌酒缸\t51890\n8十8二多少\t51891\nx19\t51892\n褐\t51893\n最后一样\t51894\n草季\t51895\n张破帆\t51896\nryyk\t51897\n香风\t51898\n攻击力\t51899\n礼花\t51900\n事实事实事实事实试试试试试试试试试试试试试试试试试试试试试试试试试试试试试试\t51901\n不要脸不要脸我给你唱了啊不要脸不要脸不要脸不要脸不要脸\t51902\n里圈\t51903\n3吨\t51904\n鹅鹅鹅饿饿恩\t51905\n胥渡\t51906\n斥巨资\t51907\n林正灿\t51908\n西班牙国家队\t51909\n逃生\t51910\n11111111111111111111111111111111114444445780999991\t51911\n恶狗\t51912\n广义\t51913\n骚瑞太\t51914\n种菜\t51915\n尤里卡\t51916\n笨辕\t51917\nK克鲁鲁map\t51918\n8618911235209\t51919\n赏心悦\t51920\n下垂\t51921\npw\t51922\n3合\t51923\n考林\t51924\n我问你\t51925\n3名\t51926\n惜雪\t51927\n群殴打的\t51928\n女纸\t51929\n2cm\t51930\n1316641927\t51931\n别忘了吧\t51932\nx16\t51933\n2cj\t51934\n民诉\t51935\n大概期\t51936\n2cf\t51937\n浪漫史\t51938\n林怡贝\t51939\nfhdbn\t51940\n可爱我的只有你一个\t51941\n凯铂瑞\t51942\npy\t51943\n周董\t51944\n吗你的名字\t51945\n吴兴振\t51946\n屹立\t51947\n严格\t51948\n重庆嘉陵江\t51949\n听话很可爱\t51950\ngadag\t51951\n义教版\t51952\n哪双\t51953\n猫饼\t51954\n社会秩序\t51955\nduy\t51956\n利用品\t51957\n飞亚达\t51958\n过劳\t51959\n过劲\t51960\n无接\t51961\n博美犬\t51962\nAngarebaby\t51963\n赵芯艺\t51964\npa\t51965\n大把\t51966\n赵晨曦\t51967\n85555568\t51968\n2x7y80\t51969\n廖达华\t51970\n窃况\t51971\n比好无用\t51972\n墩蚤\t51973\n冒收下\t51974\n7414421\t51975\n关节炎\t51976\n王江\t51977\n来世今生\t51978\n换衬\t51979\n发指\t51980\n本站\t51981\nangealbabya\t51982\n穆婷婷\t51983\n好了吗丁啉\t51984\n如来佛\t51985\njfacv\t51986\n俩三点\t51987\n换衣\t51988\n而死\t51989\n北控啦啦队\t51990\n玺玺\t51991\n栖霞建设\t51992\nｍｔｒｂａｂｙ\t51993\n尾声\t51994\nangelaabay\t51995\n换行\t51996\nsww\t51997\ncello\t51998\npm\t51999\n换血\t52000\n十五分\t52001\n當年\t52002\n盈利模式\t52003\n持仓量\t52004\n因为我帅\t52005\n彩虹街\t52006\n做公知\t52007\n方泽玉\t52008\ndjgdkdm\t52009\nghgctffh\t52010\n井下\t52011\n井上\t52012\n火堆\t52013\n桃+苹果+橘子\t52014\n斗龙战士号\t52015\n托儿\t52016\nheris\t52017\n指拨\t52018\n银河奥特曼斯托利姆\t52019\n刘雨二\t52020\n呜呜寻爱的真名这个词哪首歌\t52021\n大大方方健康\t52022\n爸爸头\t52023\n根宝\t52024\n恩我开心\t52025\n几九天\t52026\n俞成翔\t52027\nLike\t52028\nrjyrgftu\t52029\n哈哈&#xF604\t52030\n卡斯里兰卡\t52031\n聪生日快乐大本营\t52032\n拜拜永别了我讨厌你\t52033\n紫溪\t52034\n吁一祥\t52035\n雨人\t52036\nvivovi
vovivovivovivovivo\t52037\n渖阳飞机工业公司\t52038\n我和秘\t52039\nr1r2\t52040\n雨事\t52041\n食人鱼\t52042\n轮休\t52043\n过去六年\t52044\n云千羽\t52045\n友谊鞋\t52046\n大茂哥\t52047\n证件\t52048\nxai度秘\t52049\n志愿者\t52050\n唉声叹气\t52051\n求吐\t52052\n二话不说三心二意思源\t52053\n出租电话\t52054\n傻子气人\t52055\n不能不做\t52056\nsbk\t52057\n芳芳花花333\t52058\n这样做成\t52059\n长久远\t52060\n系统\t52061\n达克宁\t52062\n藏不住\t52063\n求名\t52064\n韩妮\t52065\n凯甲\t52066\n呃发嗲\t52067\nroutouen\t52068\n郭杜\t52069\n晚上周\t52070\n秦岚美\t52071\nChbv\t52072\n牛娃\t52073\n几埸\t52074\n乖小\t52075\n陕师大\t52076\n片花\t52077\n圣团\t52078\n1559.61万辆\t52079\n门诊\t52080\n读写\t52081\n产子\t52082\n贤菲\t52083\nappay\t52084\n相依\t52085\n词人\t52086\n46022359\t52087\n周源涛\t52088\n咯裸图\t52089\n一百八十多\t52090\n顺境\t52091\n课字\t52092\n公开化\t52093\n李宇晗\t52094\n扛回我姐家\t52095\n猩凶\t52096\n反扑\t52097\n反扒\t52098\nrbhfhfnjnj\t52099\n小罗希\t52100\n我喜欢团\t52101\n动脑筋\t52102\n反手\t52103\n纳元丹\t52104\n煎馒头\t52105\n展翅高飞\t52106\n数只\t52107\n智能助手\t52108\n循环纹\t52109\n开心机器人\t52110\n汉魅\t52111\n我爱你我恨你爱你是\t52112\n瑾萓\t52113\n西莱\t52114\n小心人\t52115\n12bit\t52116\n丑狗\t52117\nMLGB\t52118\n武帝\t52119\n1234569870\t52120\n米酷\t52121\n华雅会\t52122\n停用\t52123\n扯西\t52124\n问天问地\t52125\n刻骨铭心\t52126\n兄弟长\t52127\n扫瑞\t52128\n旧梦\t52129\n苟活\t52130\n提拉唉\t52131\n鑫台\t52132\ncheng\t52133\nevev\t52134\never\t52135\n事干\t52136\n米酒\t52137\n周东胜\t52138\n银灰\t52139\n影碟\t52140\n拿片\t52141\n闪过\t52142\n银灿\t52143\n善者\t52144\n黑发丸\t52145\n最具备\t52146\n君子不夺人\t52147\n内存G\t52148\nchan料\t52149\n度秘关\t52150\n雪琪\t52151\nregisterlater\t52152\n僻壤\t52153\n小枪\t52154\nsaidit\t52155\n加油枪\t52156\n皱起眉\t52157\nHain\t52158\n疑遭\t52159\n而非\t52160\n暴击\t52161\n空心人\t52162\n撞击\t52163\n雪球\t52164\n五道杠\t52165\n亮亮仔\t52166\nqeryifxvxchcagnv\t52167\n李景治\t52168\n葡萄儿大哈\t52169\n苹安心吧\t52170\n哥特式建筑物\t52171\n1400多平米\t52172\n谜团\t52173\ndrrrretrrgd\t52174\n七嘴八舌\t52175\n1多\t52176\n贤通\t52177\n没钱寸步难行\t52178\n手段\t52179\n加卡洛\t52180\n广汽丰田\t52181\n厌世\t52182\n那回\t52183\n聊咧\t52184\n那因\t52185\n凹槽\t52186\n我的大宝贝\t52187\n棉鞋\t52188\n7260\t52189\n手残\t52190\ncali\t52191\n吻手\t52192\ncall\t52193\n腊英\t52194\n修珊珊\t52195\n午安2B
\t52196\ncale\t52197\n1天\t52198\n嗯有\t52199\n及格证\t52200\n350块\t52201\n须知\t52202\n二姐\t52203\n二姑\t52204\n陵县\t52205\n忠犬\t52206\n呼市\t52207\n气死你不偿命\t52208\n卡包\t52209\n嗯艳若\t52210\n给我一个吻oke不可以\t52211\n5cm\t52212\n正乐约海卡国\t52213\n不留情\t52214\n美声\t52215\n叫唤\t52216\n相濡以沫\t52217\n你好丑好丑好\t52218\n开塞露呢女女诺咯拖\t52219\n淘宝兼职\t52220\n复辟\t52221\n倾情\t52222\n衍生物\t52223\n田螺\t52224\n吸收\t52225\n迪马迪马\t52226\n纪录者\t52227\n头重\t52228\n事面\t52229\nhfgurdg\t52230\n低声\t52231\n偶感\t52232\n人造智能\t52233\nyy葵\t52234\n四流钻\t52235\n受族\t52236\n难处\t52237\n纯洁的心\t52238\n给行\t52239\n无聊件\t52240\n迎新\t52241\n纳民\t52242\n对言\t52243\n人份\t52244\nquot\t52245\n倍二如\t52246\n阵子\t52247\n人件\t52248\n5C5C\t52249\n疏远\t52250\n人代\t52251\n年味\t52252\n上海爱乐乐团\t52253\n翁宇豪\t52254\n打铁箍\t52255\nfhdkdjskfkxnxjxbcjxbf\t52256\n动脚\t52257\nwxmm\t52258\n活累\t52259\n律处\t52260\n也应\t52261\n动脑\t52262\n劳动路\t52263\n雷焰\t52264\n李秋染\t52265\n每一个\t52266\n果啤\t52267\n四五蓝秀\t52268\n动脉\t52269\n双丸蟹柳炒云南小瓜\t52270\n笨笨鱼\t52271\n薛忠旭\t52272\nac米兰\t52273\n说好说\t52274\nAll\t52275\n小鸡鸡\t52276\n唱唱歌嘎嘎嘎嘎vvvv\t52277\n青字\t52278\n射片\t52279\ntouto\t52280\n河山只在我梦影族姑娘微亲亲蜗居租猫\t52281\n皮浪\t52282\n002468\t52283\n半姗姗\t52284\n意象\t52285\n早班号\t52286\n杀杀杀杀杀\t52287\ng18\t52288\ng17\t52289\ng15\t52290\ng14\t52291\ng13\t52292\n83.7%\t52293\n排比卷\t52294\n制程\t52295\n街市\t52296\n光猫宁\t52297\n逃亡兔\t52298\n为我不知道\t52299\n有机\t52300\n周果\t52301\n10万公里\t52302\n会学\t52303\n连句话\t52304\n我思思\t52305\n有本\t52306\n松鼠可\t52307\n冻死人\t52308\n拉帮结派\t52309\n王轩\t52310\n盖谡\t52311\n生物质能\t52312\n小乐乐我是你的王姐\t52313\n有朝\t52314\n有期\t52315\n再玩\t52316\nWashingtonDC\t52317\n有望\t52318\n不度秘\t52319\n诸如\t52320\n再现\t52321\n石我\t52322\njrieksl\t52323\n有服\t52324\n聊亲\t52325\n汉堡堡\t52326\n高中数学\t52327\n二零\t52328\n曾成杰\t52329\n以人为本\t52330\n白生\t52331\n得着手\t52332\n斗狗一个豆\t52333\n考一考\t52334\n9月20日\t52335\n14岁\t52336\n旗帜类\t52337\n绞板\t52338\n难着\t52339\n熊萌\t52340\n微游记#\t52341\n双女\t52342\n刘涛七\t52343\n书碟\t52344\n彼此\t52345\n逃荒\t52346\n梅艳\t52347\n二雄\t52348\n假球\t52349\n二集\t52350\n闵月华\t52351\n3272296149\t52352\n呆会儿\t52353\n平易\t52354\ndout\t52355\n铁屑\t52356\n丹cfdsaānb
dsadhj\t52357\n重燃\t52358\n董藩\t52359\n没感情了再见\t52360\n跟斗嘴\t52361\n北郊村\t52362\nhjnbkk\t52363\n白沙瓦\t52364\n小黄鸭\t52365\n4月24日早晨\t52366\n平昌\t52367\n影城\t52368\n自不量力\t52369\n证信\t52370\n阁楼\t52371\n高抛\t52372\n嗯嗯小学\t52373\n王勇武\t52374\n水锈\t52375\n香槟广场\t52376\n诗情到碧霄\t52377\n就是爱\t52378\n迎泽\t52379\nidid\t52380\n徐元宵\t52381\n用电话\t52382\n33333\t52383\n帆船\t52384\n女寝\t52385\n100部\t52386\n伸脸\t52387\n上月底\t52388\n三元\t52389\n四十秘\t52390\n299\t52391\nhgjmgg\t52392\n国防建设\t52393\n吴翰\t52394\n沙和长\t52395\n东水路十一号\t52396\n领到\t52397\n294\t52398\n圣诞快乐\t52399\nsygdv\t52400\n买出\t52401\n扦皮王村\t52402\n還要拿什麼換嗎\t52403\n三六\t52404\n三公\t52405\n三八\t52406\n土豪堡华物华物华物华\t52407\n图别\t52408\n刘天祥\t52409\nNail\t52410\n金属唱片公司\t52411\n咯牙\t52412\n三关\t52413\n从来没\t52414\n8866868686868688888888888\t52415\n贾内德\t52416\n深圳市\t52417\n语文科\t52418\n放宠爱给我听\t52419\n血清\t52420\nyehei\t52421\n特试特\t52422\n血渍\t52423\n1010101010101010101010\t52424\n米易\t52425\n全州县\t52426\n不逗你了不逗你了不逗你了叫姐\t52427\n第二十三二十四\t52428\n香草籽\t52429\n蔑\t52430\n松岛枫\t52431\n蔚\t52432\n杨小华\t52433\n蔥\t52434\n周日晚上\t52435\n亚勇\t52436\n蔡\t52437\n据说话\t52438\n归零\t52439\n赵明星\t52440\n做咪\t52441\n蔫\t52442\n总来说\t52443\nBB8\t52444\n846549266\t52445\n蔻\t52446\n亚洲冠军联赛\t52447\nBBN\t52448\n063\t52449\n西莱花\t52450\nexoortf\t52451\n夏焕春\t52452\n不尔拉\t52453\nBBD\t52454\nBBC\t52455\nBBB\t52456\n赖盈希\t52457\n金丝\t52458\n金东\t52459\n体明\t52460\n改稿\t52461\n哮天犬\t52462\n李云欢\t52463\n阿拉星\t52464\n六家\t52465\n书评\t52466\n剖腹\t52467\n好夫君\t52468\n孟娇\t52469\n最最最最最最最最\t52470\n答案\t52471\namphenomen\t52472\n100元\t52473\ncomputineisheratistilitholondefeyithyoublelinhaishourtincchimitomoutisnnnnnnovatibao\t52474\n金主\t52475\n阴阳样\t52476\nWAKAKAKA\t52477\n金丰\t52478\n不是我是说你你等会儿\t52479\n铁饭\t52480\n潮流性\t52481\n付伟杰\t52482\n爱不爱我呀你不爱我我又不爱你\t52483\n徐雨晴\t52484\n巴拉巴拉\t52485\n魏楚航\t52486\n桥北\t52487\n袁星雨\t52488\n牧民\t52489\n患病者\t52490\n三蒙面\t52491\nfguhhxdfv\t52492\n未明\t52493\n501061\t52494\n徐雨晨\t52495\n吞噬\t52496\n桥区\t52497\n52455888\t52498\n脑残儿\t52499\n_-///\t52500\nVCDV\t52501\n重爱\t52502\nMotta\t52503\n卢小鱼\t52504\n真格\t52505\n下海\t52506\n
tyj\t52507\ntyd\t52508\n18091656949280\t52509\ntyg\t52510\n4tp90p\t52511\nnnnnnnnn。nnnnn\t52512\n水娃\t52513\n666535656553535\t52514\ntyt\t52515\ntyu\t52516\ntyv\t52517\n青山不老\t52518\nEddietuffCFCsI\t52519\ntys\t52520\n陶云涛\t52521\n备注梦\t52522\n8月11号\t52523\n好啦好啦可以啦谢谢你度秘\t52524\n来不溜溜\t52525\n乌云额日登\t52526\n下流\t52527\n水娣\t52528\nabab形\t52529\n東西比\t52530\n秦美\t52531\n碧野田间牛得草\t52532\n一八点钟\t52533\n竖起\t52534\n我凯\t52535\n熊艳子\t52536\n淤青\t52537\n先么\t52538\n大北窑\t52539\n唐老师\t52540\n男女朋乎乎\t52541\n洗衣粉\t52542\n白居易\t52543\n佳人姐\t52544\n朱可妍\t52545\n生生\t52546\n很浪水\t52547\npitt\t52548\n老子说\t52549\n殷先生\t52550\n转圈\t52551\n宝贝们\t52552\n摩达\t52553\n田野\t52554\n奔野\t52555\n田里\t52556\nChop\t52557\n数滴\t52558\nbudong\t52559\n投影海盗旗\t52560\n猪了拜拜\t52561\n生男\t52562\n先买\t52563\n模具\t52564\n黄冰冰\t52565\n几犬\t52566\n乖呵呵呵呵呵\t52567\nsezho秘\t52568\n大宅\t52569\n岁度\t52570\n安吉丽娜朱莉\t52571\n度秘我还未成年\t52572\n67581645876188466864\t52573\n透气鞋\t52574\n盲校\t52575\ndyjc\t52576\n山崩\t52577\n菖节\t52578\n老熊\t52579\n镜湖新区\t52580\n九阴真功\t52581\n斯菲特\t52582\n真心美[爱你\t52583\n朱老师\t52584\n8874582987125865\t52585\n个生\t52586\nEYFTXYGU\t52587\n夕门\t52588\n内在\t52589\n悬疑题\t52590\n涧涧\t52591\nbad\t52592\nbac\t52593\n高曾\t52594\nban\t52595\n陈睿馨\t52596\n熊孩纸\t52597\nbai\t52598\n朝思暮想\t52599\n3x10\t52600\n英国人\t52601\n7t1\t52602\n大宋\t52603\n人民利益\t52604\nH4L\t52605\n九幽赖科\t52606\n春禅\t52607\nnnn55057554657\t52608\n2292160650\t52609\n灼心\t52610\n大宗\t52611\n爱的八十真是来了\t52612\n厂妹\t52613\n第21名\t52614\n温州23医院\t52615\n谭霖\t52616\n黑香菱\t52617\nChoice\t52618\n看不着咳咳咳\t52619\n武又文\t52620\n苏格兰北部\t52621\n森口牲口么奔腾\t52622\nnnbbb\t52623\n3时15分\t52624\n数名\t52625\n国民党伪政府\t52626\n句咔咔咔\t52627\nYoure\t52628\n上一个头\t52629\n3x1b\t52630\n好丽友派\t52631\n歌剧院\t52632\nill口蘑噜\t52633\n红血丝\t52634\n李银花\t52635\n连云港海州公安分局刑警大队长\t52636\n我的生日花\t52637\n老伙伴\t52638\n你是世界上最最最最最最最最最棒的小度秘小秘书\t52639\n齐玉菲\t52640\n昂文\t52641\n快步\t52642\n银帝\t52643\n于心不忍\t52644\n杨智站\t52645\nghcgcvgghghy\t52646\n快意\t52647\n校址\t52648\n在黑暗中\t52649\n锻炼身体\t52650\n没不当\t52651\n貴\t52652\n照职\t52653\n五五二二三八三八三幺\t52654\n快感\t52655\nHfffggfdxhjkd\
t52656\n刘毅聪\t52657\n邪秀文\t52658\n我喜欢吐\t52659\n875845855\t52660\n二二十四5千克\t52661\n英泰\t52662\n小拇指\t52663\n我的爸爸全集天走\t52664\n六月7号\t52665\n刘刘\t52666\n刘刚\t52667\n应声\t52668\n滚轮\t52669\n指挥\t52670\n方案\t52671\n数周\t52672\n疯魔\t52673\nApplejojo\t52674\n毛香香\t52675\n快乐新年快乐\t52676\n纸房屋\t52677\n他們讓\t52678\n加雷斯\t52679\n個性\t52680\n手仓\t52681\n银锭\t52682\n手仔\t52683\n郝志中\t52684\ngdrfgdduggfxgghbhvfydgddfhhgvvvvvghrhfdhvddfduddrudyyrughrrgrrdrryrrurrrrrgehdhgddgtddghhcggdgfghgnvvbvvvvvvvvccccvvghgghhjhdyfrfihfghhddzgllll\t52685\n咋谈恋爱\t52686\nhishbxl\t52687\n血常规\t52688\n抱哀\t52689\n恒景\t52690\n花花yyyyyyyyyy\t52691\n一半天\t52692\n血泪史\t52693\n幺三零零幺三幺\t52694\n代写\t52695\n主裁判\t52696\nvdbwi\t52697\n嫁给你好\t52698\n十招\t52699\n画错\t52700\n40倍\t52701\n一回生白\t52702\n薪\t52703\n梅西，英伦\t52704\n財\t52705\n线路\t52706\n万可\t52707\n西安汽车科技职业学院\t52708\n万只\t52709\n对不起错\t52710\n天大错\t52711\n泰国政府\t52712\ncsgtsg\t52713\n九一秀\t52714\n红包包\t52715\n71千三十六\t52716\n精英教育\t52717\n吃酸\t52718\n胡主席\t52719\n三林\t52720\n高升\t52721\n林忆莲\t52722\n女氮\t52723\n高博\t52724\n付阳\t52725\n一点钟\t52726\n2421386345\t52727\n42天\t52728\n重症\t52729\n丫oLI\t52730\n博讲\t52731\n吃酒\t52732\n连度秘\t52733\n邹丽婷\t52734\n找我不让\t52735\n一蹴而就\t52736\n809438\t52737\n那你有心的话你一零你在干\t52738\n新点\t52739\n三架\t52740\n三枪\t52741\n慢东珠\t52742\n八达\t52743\n就是啊我讨厌你没有你讨厌我ok\t52744\nkkkkkkk\t52745\n罗紫妍\t52746\nYdqvvlurccni\t52747\n三文治\t52748\nfffffffffffff\t52749\n那许\t52750\n石宇航\t52751\nBkvkg\t52752\n识相\t52753\n20：24晚\t52754\n淘汰赛\t52755\nglp\t52756\n凤梨丫\t52757\n080元\t52758\n洋河股份\t52759\n工伤谈判案\t52760\n贻笑大方\t52761\nQQ噶GB\t52762\nalibab\t52763\n所需\t52764\n20W\t52765\n1306735686\t52766\nhityijg\t52767\nIOYE\t52768\nglu\t52769\n灵男\t52770\n讨厌你我走\t52771\n伊斯兰教\t52772\n贮备\t52773\n梦玥\t52774\n银平山\t52775\n善战\t52776\n桦川\t52777\n涌现\t52778\nunfair\t52779\n土瓜湾\t52780\n附中\t52781\n小度小百度\t52782\n梦玲\t52783\n一罐\t52784\n出没有关\t52785\n岳飞\t52786\n霍斯\t52787\n捏捏\t52788\n如翔\t52789\n改判\t52790\n国瑞城\t52791\n郭茂倩\t52792\n五老峰\t52793\ngjuygjx\t52794\n衣橱\t52795\n34集\t52796\n22345688\t52797\n附上\t52798\n走着\t52799\nd4kkgugfh\t52800\n附一\t5280
1\n返校\t52802\n张一菲\t52803\n三天o\t52804\n卜圣\t52805\n小小小小小\t52806\n邪暗毒\t52807\n妇女节\t52808\n33dfateam\t52809\n指望\t52810\n代沟塞斯\t52811\n张思之\t52812\n20M\t52813\nJY状\t52814\n去楼空\t52815\n嗯图索骥\t52816\n羽田机场\t52817\n陈者\t52818\n潘帕斯\t52819\n酸甜酷辣\t52820\n腰疼\t52821\nhisunity\t52822\n你了现在家吗你了我真的现在就要么你了度秘\t52823\n心灵寂寞\t52824\n张开作爱\t52825\n张怒\t52826\n大理古城\t52827\n公爵\t52828\n明可可\t52829\n刷脸\t52830\n新浪微讯\t52831\n4664646\t52832\n白马王子都是我的小情人的米奇\t52833\n潘志琛\t52834\n悉然\t52835\n房不胜防\t52836\n小坏人\t52837\n罪感\t52838\n动画片版\t52839\n瓶颈\t52840\n骚万能\t52841\n更好些\t52842\n上天天\t52843\n209\t52844\n品鑫\t52845\n车师累嘞\t52846\nDHF\t52847\n狄青\t52848\nnijiushhisabi\t52849\n创富者\t52850\n大一分钟\t52851\n常州市\t52852\nKeanu\t52853\n国家级\t52854\n李家晗\t52855\n烤肠\t52856\n555511555555555555\t52857\n几万亿美元\t52858\n公爹\t52859\n桥大院\t52860\n苏凉\t52861\n小美丽\t52862\nsuper\t52863\n画面\t52864\n52525252\t52865\n利特推特\t52866\n翘课\t52867\n2到3个小时\t52868\n俩俩\t52869\n房门灯\t52870\n逆战猫\t52871\n回望\t52872\n我要者\t52873\nfiehish\t52874\n青春点\t52875\n各部门\t52876\n言听计从\t52877\n集市\t52878\n婷儿\t52879\n请选\t52880\n颖妹子\t52881\n小聪聪\t52882\n自数\t52883\n达达\t52884\n嗯奥倒西安箱倒相\t52885\n小擦\t52886\n连绵不断\t52887\n说话片\t52888\n底款\t52889\n业兵\t52890\nbukandianyi\t52891\n灰太熊\t52892\n娇柔\t52893\n指相扣\t52894\n159632774\t52895\n更更\t52896\n5．4\t52897\n布布MS\t52898\n初十\t52899\n不见了\t52900\n不孤单\t52901\n奢逸\t52902\n好了再见\t52903\n薛任务\t52904\n广慧\t52905\n血液循环\t52906\n白页\t52907\nWaveya\t52908\n各路\t52909\n55455885666555555555555555555\t52910\n五颗\t52911\n苏帮\t52912\n新泰\t52913\n反共\t52914\n宋永华\t52915\n回头看\t52916\n露韩误\t52917\n反光\t52918\n册子\t52919\nhvji\t52920\n运缘阁\t52921\n秋平米\t52922\n凡夫俗子\t52923\nyourhffas\t52924\n新法\t52925\n儿魂儿\t52926\n老妈妈\t52927\n野种\t52928\n那美西\t52929\nGfvxffghdctdgh\t52930\n几万一件\t52931\n哈佛小子\t52932\n三聚氰胺奶\t52933\n酒单\t52934\n九零点\t52935\n起因\t52936\n千里挑\t52937\n8路\t52938\n裤腿\t52939\n李宗諭\t52940\n几条\t52941\n陌生的歌\t52942\n豪豪豪豪豪\t52943\nFchghhhm\t52944\n渡渡\t52945\n咱俩眼儿的歌儿\t52946\n霸占权\t52947\n鸟市上学呢通知我你在\t52948\n43792\t52949\n骄阳似火\t52950\n这是我\t52951\n句英图\t52952\n持戒\t52953\n帮扶\t52954\n咪咪哼哼哼哼哼哼哼哼\t5
2955\n0085\t52956\nltjgajv\t52957\n兆会是你的女\t52958\n凡男\t52959\nfuckylez\t52960\n王嘉琴\t52961\nsiri会\t52962\n老瓜\t52963\n洗点\t52964\n刻刻\t52965\n27分钟\t52966\n王嘉琦\t52967\n一一厅\t52968\n叫议会\t52969\n浦江华侨城\t52970\najapatmtbj\t52971\nJJFD\t52972\nsmartphowned\t52973\n给一个未出生孩子的信\t52974\n确乎\t52975\n帕尔马\t52976\nsigma\t52977\n饱食终日\t52978\n爱情告诉我\t52979\n戴婷婷\t52980\n盘山\t52981\n嗯感恩\t52982\n喔帅\t52983\n13233447626\t52984\nbfuygcghcvffdxxcccccccghccghgcvhbkbcgxhcxedythfchvhvjcgchn\t52985\nfkt\t52986\n嗯蛋蛋\t52987\n九场四十\t52988\n傻傻瓜瓜\t52989\n碰了碰\t52990\n姚克栋\t52991\n134分\t52992\n焦立然\t52993\n六二十\t52994\n九点零四\t52995\n专场\t52996\n疲\t52997\n今天下午2点\t52998\n表骗\t52999\nUmea\t53000\n你会不会\t53001\n踏马\t53002\n插嘴\t53003\n脱毛\t53004\n蒸笼\t53005\n手累\t53006\n下四年\t53007\n突然\t53008\n另一个你\t53009\nghistusu\t53010\n琉子\t53011\n108期\t53012\n英太\t53013\n沧浪区\t53014\n御宅猫\t53015\n英大\t53016\n中华民国政府\t53017\n足球丫\t53018\nrhrhrd\t53019\n仔鸡\t53020\n学生装\t53021\n疋\t53022\nWK\t53023\n好困\t53024\n频车\t53025\n氢氧化钡\t53026\nWJ\t53027\n九十九九十九99\t53028\n可爱的孤儿过儿过儿过儿过儿过儿\t53029\n长疮\t53030\nXX年\t53031\n舆情\t53032\n徽州\t53033\n垂帘听政\t53034\nword\t53035\nWH\t53036\n好囧\t53037\n疏\t53038\n扫啪\t53039\n么俊杰\t53040\n创多音\t53041\n今天一天\t53042\n朱镕基\t53043\n段文琪\t53044\n王琦餐\t53045\n章法\t53046\n雀鸟\t53047\n到了\t53048\n早市\t53049\nwjjsnd\t53050\nhiogg\t53051\n到事\t53052\n热靴\t53053\n杨宇辉\t53054\n欢快点\t53055\n冥界\t53056\n好难过\t53057\n人心\t53058\n睡法\t53059\n明代\t53060\n2001年\t53061\n85小时\t53062\n2n2n6\t53063\n利比亚冲突\t53064\n二百斤\t53065\n96338426\t53066\n十八七十八\t53067\n八八八八六十四岁\t53068\n沙皇\t53069\n西南证券\t53070\n过把\t53071\nShaleno\t53072\n畅多\t53073\n拼死\t53074\n明仁\t53075\n4.000\t53076\n踩一滑\t53077\n无实\t53078\n下不来崽\t53079\n国格\t53080\n浮云遮望眼自缘身在最高层58元一年春好处绝胜烟流满皇都\t53081\n睡泥\t53082\n等哥\t53083\nhi小度度\t53084\n确凿\t53085\n充电板\t53086\n金飞\t53087\n啊衣\t53088\n开拍\t53089\n筋斗云\t53090\n开拓\t53091\n古雷\t53092\n再讲一个故事\t53093\n雷沃\t53094\n眼线\t53095\na座\t53096\n可有可无\t53097\n八里沟\t53098\nwait\t53099\n和蔼可亲\t53100\n小甜心\t53101\n圣菲刀郎\t53102\n陈家屯\t53103\n说不好意思\t53104\n费心理由衷\t53105\n端门\t53106\n少别骗\t53107\n看了心\t53108\n
拉拉裤\t53109\n不容易\t53110\n烧肉\t53111\n尸香魔芋\t53112\n银行\t53113\n小发会亚衣\t53114\n250毫升\t53115\nsuperjuniorde\t53116\n云集\t53117\n电池\t53118\n王大道\t53119\n王小磊\t53120\n贝森南真彩\t53121\n杀毒\t53122\nceyou\t53123\n额额额额曲项向天歌\t53124\nydhhd\t53125\n得了心\t53126\n适于\t53127\n後\t53128\n阳杨洋\t53129\neverybody\t53130\n高大对矮小\t53131\n完美人\t53132\n管治\t53133\n普遍化\t53134\n皎洁\t53135\n不不不么\t53136\n回忆专用小马甲\t53137\n亲感冒\t53138\nuaq\t53139\n没觉着\t53140\n好泡金\t53141\n44625746741\t53142\nsharpen\t53143\n闫瑞琪\t53144\n汤钱\t53145\n机器人大杨梅\t53146\n六合\t53147\nSwan\t53148\n六名\t53149\n调频\t53150\n八筒\t53151\n东台东台\t53152\n辉丰股份\t53153\n2011年7月9日、10日\t53154\n风险评估\t53155\n精讯\t53156\n张闻天\t53157\n难敌\t53158\n动力位移角\t53159\n六向\t53160\n归依\t53161\n朱光熙\t53162\n巾纸\t53163\n六吨\t53164\n说还休\t53165\n不好奸\t53166\navoiding\t53167\n安定区\t53168\n西忒\t53169\nPacific\t53170\n四届\t53171\n宣威\t53172\n黃道\t53173\n系咪\t53174\null\t53175\n迅速\t53176\n导游证\t53177\n方程式蛋炒饭\t53178\n雷锋\t53179\nule\t53180\n武林\t53181\n等\t53182\n筈\t53183\n好度秘威武度秘最棒度秘\t53184\n筆\t53185\n假素素\t53186\n筛\t53187\n三年后\t53188\n熊子路\t53189\n一万遍\t53190\n只说\t53191\ntjsceo\t53192\nｉｂｅ\t53193\n筒\t53194\n只读\t53195\n筐\t53196\nHgd\t53197\n答\t53198\n150名\t53199\n女川核电站\t53200\n高町\t53201\n格里\t53202\n开手\t53203\n杀手人\t53204\n#Loki#\t53205\n筹\t53206\n韩国明天会\t53207\n签\t53208\n不行不行不行不行不行不行不行\t53209\n汪汪汪汪\t53210\n俄罗斯族\t53211\n牛皮鞋\t53212\n卡噶噶发噶哈\t53213\n小团棠\t53214\n沉默的心\t53215\nxush\t53216\n伏地魔\t53217\n拨动\t53218\n扫地\t53219\nxusb\t53220\n夜间模\t53221\nxusg\t53222\n机动\t53223\n马克艾罗伊\t53224\n不爱片\t53225\n机务\t53226\n向往之地\t53227\n诀别\t53228\nBobbbbb\t53229\n急了聊\t53230\n多六点\t53231\nfuivk\t53232\nkvhbc\t53233\n周生\t53234\n你好健忘\t53235\n同伴们\t53236\nEdt\t53237\n弱项\t53238\ngb1b\t53239\n仇敌\t53240\n寺监院惟航法师\t53241\n诵诗\t53242\n那来\t53243\n用秘\t53244\nvgbshdfc\t53245\n好数六项\t53246\n毛都\t53247\nnnknv\t53248\n江云\t53249\n李文\t53250\n真的很纪元\t53251\n雇凶\t53252\nByeBye\t53253\nargued\t53254\n嗯大体\t53255\n十四卷\t53256\ndqyou\t53257\n说头\t53258\n三家\t53259\nthank\t53260\n看讲\t53261\n如来神掌一掌\t53262\n诵读\t53263\n植物大战僵尸2视屏\t53264\n一一先\t53265\n中医药\t53266\nSD们\t53
267\n零票\t53268\n迈术师\t53269\n因此\t53270\n乱说在秀\t53271\nwqnmb\t53272\n小小小\t53273\n李斯\t53274\nBubbles\t53275\n叫做好乐\t53276\n看讨\t53277\n说天\t53278\nShedon\t53279\n贾芬\t53280\n不衰\t53281\njb1bj\t53282\ngfjfjggdjsggfggggg\t53283\n雕毛\t53284\nRight\t53285\nMmoer\t53286\n一闹闹闹切克闹\t53287\n140多米\t53288\n张哒\t53289\n炼丹\t53290\n夜灯\t53291\n丅c丅\t53292\n张哥\t53293\n1928146426\t53294\n美波\t53295\n拉勾\t53296\n302085338885\t53297\n别百四十说\t53298\n不行\t53299\n活法\t53300\n十大学\t53301\n自养型\t53302\n中国医疗\t53303\n吸住\t53304\n道忙\t53305\n屏里\t53306\n天照\t53307\n汇文\t53308\n71489969668083\t53309\n五过\t53310\ngiry\t53311\ngirg\t53312\ngird\t53313\n2000002234567\t53314\n鹿肉\t53315\niugb\t53316\n去们\t53317\ngirl\t53318\n2pm6\t53319\n望前看\t53320\n哈号\t53321\n自相矛盾\t53322\n马日惠\t53323\nkmk\t53324\n短短\t53325\n赵未\t53326\n躺会\t53327\n赵本\t53328\niccid\t53329\n阳阳\t53330\n羞辱\t53331\ndyshaiws\t53332\n二恶\t53333\ngirI\t53334\n额高克\t53335\nvhkffirjdccsgfxhycgffvfhdvhfhgfdgdtdjfgjdfjgczfgdvhtgtdvysggfgdyffjfhcgh\t53336\n郧西\t53337\n设定\t53338\nsi度\t53339\n天天有喜二之人间有爱的就个狐狸精\t53340\n脾寒\t53341\nGuyssty\t53342\n小甜甜\t53343\n秋葵\t53344\n省高级人民法院\t53345\n蝴蝶公\t53346\n东西元\t53347\n口述\t53348\nvjx\t53349\n浸淫\t53350\n黄陂\t53351\nyesi\t53352\nvjs\t53353\n李某样\t53354\ncigguc\t53355\n你是人不是人不要脸要脸\t53356\nvjt\t53357\nvjk\t53358\n大姨\t53359\n我爱不爱我\t53360\nvjh\t53361\n二流\t53362\n哈喽哈喽\t53363\nvjb\t53364\n3339877683\t53365\nvjg\t53366\nvjd\t53367\n热性\t53368\n董佳慧\t53369\n本剧\t53370\n咔咔卡卡卡\t53371\n瓮四年\t53372\ncggv\t53373\n尿不理\t53374\nsxwcw\t53375\n何净何垢\t53376\n哈叫\t53377\njhjg1a1\t53378\n绝顶聪明\t53379\n林炫祺\t53380\n张的非的小小男孩十五岁的他在书\t53381\n狼裙\t53382\n打报修\t53383\nPRI们\t53384\n好著急\t53385\n纳智能\t53386\nnbastant\t53387\n奇洛李维斯回信\t53388\n乱伦色\t53389\nhevtu\t53390\n大洋村\t53391\n蓟门桥土城\t53392\n4764641646\t53393\n几通\t53394\n嗫\t53395\n赶死队\t53396\n嗨\t53397\n嗯\t53398\n痴呆呆\t53399\n嗬\t53400\n弄坏\t53401\n巴拉拉拉\t53402\n董妮\t53403\n嗦\t53404\n嗥\t53405\n嗤\t53406\n两生\t53407\n哈发\t53408\n5哦\t53409\n韩雪\t53410\n嗳\t53411\n嗲\t53412\n晚睡\t53413\n番茄酱\t53414\n嗵\t53415\n拜亚新世\t53416\n鲍艳莹\t53417\n说的话说\t53418\n那关老子
毛事儿\t53419\n两用\t53420\n鸳鸳\t53421\n嗌\t53422\n39.2%\t53423\nCanyou\t53424\n店面\t53425\n嗄\t53426\n银塘星城\t53427\n陆川\t53428\n嗟\t53429\n一一共\t53430\n嗝\t53431\n事系\t53432\n宝鸡文理学院\t53433\n嗒\t53434\n阿嬷\t53435\n嗐\t53436\n木瓜籽\t53437\n嗖\t53438\nsean\t53439\n襄襄\t53440\nDO\t53441\nDL\t53442\nDM\t53443\nDJ\t53444\n微想\t53445\nDH\t53446\nDI\t53447\nDF\t53448\nDG\t53449\nDD\t53450\nDE\t53451\nDB\t53452\nDC\t53453\nDA\t53454\n67天\t53455\n仲好\t53456\n456457\t53457\nDX\t53458\n估摸\t53459\nDV\t53460\nDW\t53461\nDT\t53462\nDU\t53463\nDR\t53464\nDS\t53465\n2寸\t53466\nDQ\t53467\nDn\t53468\n杀猫狗\t53469\n好叭度\t53470\nDh\t53471\nDf\t53472\nDg\t53473\nDe\t53474\n三明天\t53475\n周泽\t53476\n朱子怡\t53477\n德国队\t53478\n一两分钟\t53479\n婆婆妈妈\t53480\n的牛顿市\t53481\n黄金比\t53482\nDv\t53483\nDt\t53484\nDu\t53485\n毕生\t53486\n米其林\t53487\n零零\t53488\n那边\t53489\n嗯和建华\t53490\nyige\t53491\n批女\t53492\n我是女的你是男机器人\t53493\n摸象\t53494\n不祥\t53495\n方特素红猪\t53496\n28286587636395426555￥\t53497\n磁悬浮列车\t53498\n导视听\t53499\n夏草\t53500\n魄力\t53501\nTF家族\t53502\n追杀\t53503\n李梅\t53504\n樱兰\t53505\n度秘你在干什么呀式动作\t53506\n远未走好\t53507\n2万一千六百四十\t53508\n插拔\t53509\n参演\t53510\n虛幻\t53511\n雷速登\t53512\n不祝\t53513\n磨擦\t53514\nD8\t53515\n福冈\t53516\n深化改革\t53517\n10点半\t53518\nD3\t53519\nr8dtgucy\t53520\n1331962811\t53521\n旮旯\t53522\n新安晚报\t53523\n转身\t53524\n多伦多\t53525\n太冷淡\t53526\n衣裤\t53527\n用旧恨\t53528\najdimtml\t53529\n开句\t53530\nseoon\t53531\n6月8日\t53532\n定说好\t53533\n你好可怜\t53534\n朝气\t53535\n沙漠区\t53536\n289460891\t53537\n老鼠肉\t53538\n朱森\t53539\n公映\t53540\n手心\t53541\n半月\t53542\n想飞\t53543\n如火\t53544\n新区三中\t53545\n滚滚恶心\t53546\n80整\t53547\n田达\t53548\n公明\t53549\n权威\t53550\n库管\t53551\ngcchujbvcfg\t53552\n庆功会\t53553\nhurfy\t53554\n烦人聊天\t53555\nsouci.||TO\t53556\nBcccb\t53557\n康维莉\t53558\n卫生纸\t53559\n鹏鹏哥\t53560\n16：15\t53561\nHAM\t53562\nHAO\t53563\n范德萨阿西\t53564\nmalzi\t53565\n分数\t53566\n谢长夜\t53567\n28888888885888\t53568\n六飞众彩果\t53569\n自恋别\t53570\n致力于\t53571\n石秋晴\t53572\n疯狗可\t53573\n分散\t53574\nu型\t53575\n伍彤彤\t53576\n传言\t53577\nfqb\t53578\n45568268456\t53579\n车友会\t53580\n中课文\t535
81\n5481956348\t53582\nSm公司\t53583\n嗯all\t53584\n林大唉\t53585\n许诺尔\t53586\n次要\t53587\n1925年\t53588\n好嘛你快点\t53589\n徐发\t53590\n增城市\t53591\n了了扰\t53592\n不是不想\t53593\n袍子\t53594\n58585825050528\t53595\n卤面\t53596\n朝氣\t53597\n默传\t53598\nbbhgfhm\t53599\n倏然\t53600\nnangdesida\t53601\n不不出\t53602\n小公举\t53603\n费率\t53604\n罗曌\t53605\n好不好啊\t53606\n不在理\t53607\n小欣\t53608\n120be八\t53609\noo度秘\t53610\n踏踏\t53611\n嗯冬夜\t53612\n吴雪梅\t53613\n高音\t53614\n以为你没有\t53615\n么达\t53616\n85644\t53617\n永川\t53618\n清感\t53619\n三国志\t53620\n萨博汽车\t53621\n金刚石线\t53622\nkook\t53623\n省级\t53624\n人类化\t53625\n椅\t53626\n耳熟能祥\t53627\n冲床\t53628\n韩昕怡\t53629\n植\t53630\n椎\t53631\n双鱼银瓶\t53632\n椐\t53633\nkooy\t53634\n椒\t53635\n6CD\t53636\n歼十\t53637\n邓琪\t53638\n桂林市\t53639\n椟\t53640\n老三\t53641\n老丈\t53642\n老不\t53643\n写一写\t53644\n不要你了再见再见\t53645\n老丁\t53646\n4块\t53647\n老七\t53648\n椭\t53649\n了解送\t53650\n你给我一个闻闻闻闻你给我一个吻\t53651\n今年3月\t53652\n4106555465\t53653\n打来\t53654\n电视节目\t53655\n烟蒂\t53656\n老丑\t53657\n你的饭\t53658\n长江师范学院\t53659\n自比\t53660\n周密\t53661\n半夜两点\t53662\n778558\t53663\n你好呀小帅哥\t53664\n134697258\t53665\n0015\t53666\n毛球\t53667\n卵细胞\t53668\nāàáèéêDìíòóùúüàáèéaTYê\t53669\n美了美\t53670\n秘河\t53671\n安疯狂\t53672\n吃罢\t53673\n百万分之一\t53674\n我叫你\t53675\n咯苦\t53676\nchiphotosbaiducomxiaodupicitem00e93901213fb80e0123a9a431d12f2eb83894d7jpg\t53677\n垤玛\t53678\n天流泪呀的伤悲\t53679\n14847\t53680\n安和\t53681\n96厘米\t53682\n八糟\t53683\n想雯\t53684\nmark\t53685\n田瑞秀\t53686\n教学法不过是作秀癖\t53687\n粽子\t53688\n玉德\t53689\n林西\t53690\n快迟到\t53691\n七院\t53692\n霸气动\t53693\n下雨水\t53694\n一周一次\t53695\n死度秘哼我是你的主人\t53696\ndeyuo\t53697\n延迟\t53698\n小灯词\t53699\n广泛\t53700\n清蒸\t53701\ntoue\t53702\nhdv\t53703\nhdu\t53704\nhds\t53705\nhdr\t53706\nhdp\t53707\n报恩\t53708\n保送\t53709\nhdy\t53710\nhdx\t53711\nhdg\t53712\nhdf\t53713\nhdd\t53714\n跃跃欲试\t53715\nhdb\t53716\nhda\t53717\n十万位\t53718\n秘密不要脸钟意咪不要脸\t53719\nhdk\t53720\nhdj\t53721\n刘璐璐\t53722\nhdh\t53723\n小记者\t53724\n140名\t53725\n扫把姑娘你的书\t53726\n无爱你真是\t53727\n蜂ID\t53728\n100家\t53729\n了拜拜拜拜\t53730\n科二\t53731\n李会宝\t53732\n日怪\t53733\n我们秘秘\t
53734\n记录表\t53735\ndifferent\t53736\n哪等你\t53737\n行情报\t53738\n小色魔\t53739\n190多\t53740\n嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t53741\ndkd\t53742\n忠北沃川郡\t53743\n主权\t53744\n乖喜欢\t53745\n胡锦程\t53746\n主杀\t53747\n谢谢你教会我珍惜\t53748\n古没想到\t53749\n到此\t53750\n爸哥\t53751\n黎姐\t53752\n花园兜都医院佳宇快运\t53753\n这样的爱\t53754\n中分\t53755\ngrs\t53756\n用电\t53757\n华西a\t53758\n扦插\t53759\n黎姿\t53760\n亀梨\t53761\n崔梦瑶\t53762\n长错\t53763\n哈密瓜\t53764\nbottle\t53765\n米瑞斯\t53766\n陈我霉\t53767\n郭佳音\t53768\n用用\t53769\n四女\t53770\n2206212289540\t53771\n上星期\t53772\n观测\t53773\n从天狼星回来\t53774\n净值\t53775\n古铜色\t53776\n折花\t53777\n坦克风云\t53778\n森么\t53779\n俊杰上快本\t53780\n理直气壮\t53781\n准听\t53782\n11788\t53783\n主材\t53784\n天全黑\t53785\n欠锤\t53786\nguguchf\t53787\n铁血金戈梦\t53788\n集合堂\t53789\n2010年3月至9月间\t53790\n酷狗\t53791\n移费\t53792\n想后\t53793\n撸撸GG\t53794\nNBA2012\t53795\n11111111111万\t53796\njjbuh\t53797\n名叫做\t53798\n张嘉凡\t53799\n332，天\t53800\n泛酸\t53801\n一万本\t53802\nhi软妹子\t53803\n发言\t53804\n抗衰老\t53805\nSiri咩\t53806\n杨笑话\t53807\n傅宵童\t53808\n好啦好啦我走啦八八么么哒\t53809\n退色\t53810\n江苏靖江公安局\t53811\n崇光\t53812\n2320万欧元\t53813\n萌萌哒萌萌哒萌萌哒大大\t53814\n第二辆\t53815\n咋东北腔\t53816\n信誓旦旦\t53817\n小胡子\t53818\n140米\t53819\n胡媚娘\t53820\n秃发\t53821\n24券\t53822\n王久辛\t53823\n好学\t53824\n嘎嘣脆\t53825\n自乙本机\t53826\nderreerrt\t53827\n搂比\t53828\n吴洪森\t53829\n粗体\t53830\n兴致勃勃\t53831\n2728\t53832\n周春堡\t53833\n篚\t53834\n钱袋\t53835\n889M\t53836\n民宿\t53837\n我爱插史黛西是乖乖\t53838\n呃秘\t53839\n生命科学园地铁站\t53840\nkodeod\t53841\n我最小可不是你\t53842\n傻子乐\t53843\n维他命水\t53844\n阳种\t53845\n5331313\t53846\n圈堂\t53847\n尖叫\t53848\n民安\t53849\n肖磊\t53850\n豆豆奇奇\t53851\n民宅\t53852\n给力]猜\t53853\n成一片\t53854\n我喜欢怒\t53855\ntjgjdnw\t53856\n过活\t53857\n一里地\t53858\n陈逸潇\t53859\nsdfssagfdggf\t53860\n月月\t53861\n气急败坏\t53862\n绵羊\t53863\n哭了嗯\t53864\n口酒\t53865\n西天\t53866\n三z\t53867\nKPI指标\t53868\n樊继歌\t53869\n羊肠\t53870\n日本天皇\t53871\n教育厅\t53872\n四强\t53873\n川滨纯\t53874\n长谷川\t53875\n十一点十三分\t53876\n启辰\t53877\n无懈可击之高如入林\t53878\nStacey\t53879\n次正v\t53880\n老旧\t53881\n噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤噗嗤嗤嗤\t53882\n不鸣则已\t53883\n羊肉\t53884\ndhiphotosbaiduco
mxiaodupicitem8d5494eef01f3a29b6d714bc9e25bc315c607ca1jpg\t53885\n8896\t53886\n按摩\t53887\n野娇娇商城城\t53888\n渔夫\t53889\n都来寺\t53890\n老早\t53891\n晕头转向\t53892\n我哦你有义务女魔头木木木头额1\t53893\n艺术工产化\t53894\npoolroomppooo\t53895\n一w一\t53896\n6.11日\t53897\n3月7日\t53898\n解封\t53899\n44552\t53900\nJEE\t53901\n弥阿里\t53902\n陈水金\t53903\n44555\t53904\n18306000955\t53905\n胡說\t53906\n东风若达起亚\t53907\n真事儿\t53908\n8.22\t53909\n不像话\t53910\n1403569297\t53911\n逆去\t53912\n反反好\t53913\n10点30分\t53914\n雯颖\t53915\n问询处\t53916\n告诉我不好吗\t53917\n霍倩怡\t53918\n烟熏火燎\t53919\n下不来\t53920\n月考卷\t53921\n怕帅\t53922\n非零自然数\t53923\n很有爱\t53924\n良心话\t53925\nNH08\t53926\n公理化\t53927\n爱吃吧赖动妹子h1句话ok\t53928\n1259758\t53929\n交规\t53930\n1594\t53931\nccxd\t53932\n王凸凸\t53933\n1590\t53934\n陈玉霜\t53935\n雅阁\t53936\nUniqlo\t53937\n掸\t53938\n萌和悦\t53939\n观世音菩萨\t53940\n世人\t53941\n土方\t53942\n武打片\t53943\njshshshx\t53944\n略微\t53945\n趋于\t53946\n你的我的画\t53947\n傅唔错\t53948\n100下\t53949\n明天在\t53950\n风一样\t53951\n博士生\t53952\n除此以外\t53953\n任琪玲\t53954\nkfje\t53955\n下映\t53956\nEsmahan\t53957\n钢琴博物馆\t53958\n八点个\t53959\n臭不要脸的我恨你\t53960\n滚吧滚吧滚吧滚滚滚\t53961\n122度\t53962\n对比照\t53963\n明快\t53964\n闺密冷战\t53965\n喜羊羊与灰太狼之嘻哈闯世界\t53966\nrooot\t53967\n荣耀七\t53968\n步步么\t53969\n中百\t53970\n实感\t53971\n下是\t53972\n64439628554\t53973\n呀吵\t53974\n八点三\t53975\n忠言\t53976\n黄金五\t53977\n了你是机器人不会笑一张铁皮哪哪哪哪哪哪小\t53978\nHyundai\t53979\n贾贵晴\t53980\n前天下\t53981\n蟑螂\t53982\n掩\t53983\n哪照\t53984\n安静兄\t53985\n五行小学\t53986\n1000万辛\t53987\n睡姿\t53988\n逞能\t53989\n北京来广营A4\t53990\n三W点\t53991\n掖\t53992\nifjvgb\t53993\n许局长\t53994\n不劳\t53995\n教诉我\t53996\n邮票堡\t53997\n低估值二线蓝筹股\t53998\n四趟\t53999\n靠不想\t54000\n不动\t54001\n周佛海\t54002\n入世\t54003\n再就是\t54004\n新牛肉面的故事第二季\t54005\n不力\t54006\n隆德\t54007\n霍州\t54008\n终于\t54009\n二分之一到时\t54010\n喇叭\t54011\n有情能\t54012\n吉庄\t54013\n140203199511285629\t54014\n俺爷\t54015\n俺爸\t54016\n柒年级\t54017\n動力\t54018\n东强东东强幸福的生活有天有效东强东强东东\t54019\nJapanTK\t54020\n口头神\t54021\n11000个\t54022\n诚意力\t54023\n钜献\t54024\n会诊诊断\t54025\n陈坤\t54026\n老君\t54027\n盗版度\t54028\n王源波\t54029\n世路上\t54030\n资格\t54031\n五九页\t54
032\n酬金\t54033\n家嫂\t54034\n四月二号\t54035\n胃喂\t54036\n皇陵\t54037\ncanfeel\t54038\n米得\t54039\n李淑媛\t54040\n两天内\t54041\n新民镇\t54042\n谢谢你相信\t54043\n亚洲地区\t54044\n无法自拔\t54045\n恺\t54046\n0摄氏度\t54047\n万花楼\t54048\n触不到的恋人\t54049\n掀\t54050\n电子银行\t54051\n不对有\t54052\n孙xx\t54053\nzfzv\t54054\n694984684584887\t54055\n焚毁\t54056\n工业地产\t54057\nYgf\t54058\n卖家人\t54059\nYgc\t54060\n999999999999年\t54061\n喜羊羊比灰太狼\t54062\nwsf\t54063\nwsk\t54064\n只认\t54065\n刪除\t54066\n七起\t54067\n广德法师\t54068\n港版\t54069\nwss\t54070\nwsu\t54071\n1923.3.14\t54072\n度秘你是真的爱我\t54073\nwsy\t54074\nwsz\t54075\n男有女\t54076\n幺零零幺网\t54077\n咱俩关门说么\t54078\n蔡国香\t54079\n一剪\t54080\ngagshdh\t54081\n心牙\t54082\n一副\t54083\n步子\t54084\n600多分\t54085\n家里湾\t54086\n是女大我找\t54087\nxxxoo\t54088\n嗯海尔\t54089\n颜语\t54090\n阿尼哈塞\t54091\n催修\t54092\n蹄\t54093\n350台\t54094\n故梦到\t54095\n13624229901\t54096\ndian\t54097\ndiao\t54098\n天天妈妈\t54099\n鸡肋\t54100\n十八分\t54101\n一愉快\t54102\n趣趣\t54103\n一剂\t54104\n闭脚\t54105\n防弹少年团\t54106\n潘海涛\t54107\n在工地\t54108\n催促\t54109\nssssss\t54110\n刘礼贤\t54111\n彭发连\t54112\n天翼\t54113\n林子聪\t54114\n被子曲\t54115\n邵国松\t54116\n透风\t54117\n江上舟\t54118\n3318750972\t54119\n副总理\t54120\n等价\t54121\n我无喔那些年\t54122\n100088888\t54123\n食物链\t54124\n庄重\t54125\n二七六二幺九零零\t54126\n真木木\t54127\n展开\t54128\n陈燕娜\t54129\n朝的朝\t54130\n膜法世家\t54131\n亲我真\t54132\n粉白红\t54133\n见眠\t54134\n拨通\t54135\n汤包\t54136\n福清宏路\t54137\n会一家人\t54138\n夜班\t54139\n老吴\t54140\njhgbn\t54141\n并行不悖\t54142\n︵┴─┴\t54143\n火影的聪明的度你\t54144\n香洲\t54145\n法律系\t54146\n真的不起\t54147\n八大\t54148\n你的很重要秘\t54149\n景顺\t54150\nRaffles\t54151\n周师傅\t54152\n懒虫\t54153\n良涯\t54154\n这一种\t54155\ndestination\t54156\n反而\t54157\nhttpahiphotosbaiducomxiaodupicitemaa64034f78f0f736af6489750d55b319ebc4132fjpg\t54158\n美纳斯\t54159\n这一秒\t54160\n陶晓恒\t54161\n稳步\t54162\n磨灭\t54163\n好吧度秘真聪明给你个大拇指\t54164\n学得\t54165\n立领\t54166\n第十一次\t54167\n学徒\t54168\n453个\t54169\n上贴\t54170\n心度\t54171\n孕\t54172\n孔\t54173\nguuui\t54174\n孑\t54175\n子\t54176\nddrdrd\t54177\n孒\t54178\n孝\t54179\n朱丽萍\t54180\n孟\t54181\n孞\t54182\n孙\t54183\n存\t54184\n孤\t54185\n眼镜蛇\
t54186\n学\t54187\n猪狗\t54188\n11时45分\t54189\n为准\t54190\n孬\t54191\nTfbyod\t54192\n孩\t54193\n忙耐\t54194\n孵\t54195\n孰\t54196\n孽\t54197\n心底\t54198\n學\t54199\n大运会\t54200\n借楼\t54201\n丈量\t54202\n所感\t54203\n蹬\t54204\n吴雅文\t54205\n剪裁\t54206\n能找到\t54207\n360安全卫士\t54208\n走一站\t54209\n上贡\t54210\n林妹子\t54211\n十九世纪以前\t54212\n阿难\t54213\n巴登巴登\t54214\nTyltxkrhthzhktltk\t54215\n蔡国庆\t54216\n30万本\t54217\n小二沟\t54218\n高嗯本爱\t54219\n拜拜拜拜拜拜拜拜拜拜拜拜\t54220\n叛军\t54221\n肾争锋\t54222\n愁绪\t54223\n肉搏战\t54224\n联大\t54225\n上床\t54226\n别太傻\t54227\n1944年\t54228\n胸怀好\t54229\n饭次\t54230\nasomuch\t54231\n吵不赢\t54232\n二等\t54233\n立码\t54234\n辐辏\t54235\n无以为是\t54236\n月亮村\t54237\n马紫林\t54238\n15251454597\t54239\n上店\t54240\n136636857349\t54241\n乖仔\t54242\n儿童节快乐﹁\t54243\n落脂肪粒\t54244\n全方\t54245\n啵啵啵啵啵\t54246\n不知\t54247\n扬子江\t54248\n上度\t54249\n王斑\t54250\n乖仆\t54251\n乌拉圭\t54252\n每个你\t54253\n有声趣味\t54254\n鲜卑\t54255\n滞后\t54256\n撒瘤\t54257\n成文\t54258\n微山湖\t54259\n挂念\t54260\n111122424\t54261\n想不你\t54262\n口天吴\t54263\n小岳\t54264\n喝汤\t54265\n一刹那\t54266\n百兆百兆\t54267\n咱俩谁跟谁呀我的是我的你是我的\t54268\n912525198\t54269\n见了再说\t54270\n老楊\t54271\n喵个\t54272\n云南省森林防火指挥部办公室\t54273\n黑猪哥我是你的\t54274\n猝死\t54275\n瘦肉精猪肉\t54276\n想听\t54277\n诗方子\t54278\n小快乐快快乐\t54279\n小岛\t54280\n江疏影\t54281\n理解\t54282\n陰莖\t54283\n明天还是回南天\t54284\n褪去\t54285\n我喜欢女性格的你\t54286\n金刚藤\t54287\n植物大战僵尸多格漫\t54288\n晚来\t54289\n万字\t54290\n呕液\t54291\n麻省\t54292\n好吧爱我你\t54293\n49毫秒\t54294\n加油喔我挺你你\t54295\n你要了我的初夜\t54296\nhhsughfj\t54297\nrostalaa\t54298\n大石蕾\t54299\n14509670条\t54300\n预存要\t54301\n一道家里度\t54302\n麦彩铃\t54303\n一首歌好想你\t54304\n于洋闰\t54305\n陕南\t54306\n你的生活照\t54307\n回来我的爱疯\t54308\n领证\t54309\n梁永康\t54310\n潍坊们\t54311\n牛通社\t54312\n周宇东\t54313\n伊思\t54314\n领试\t54315\n单县南城区政府\t54316\n撒娇师大\t54317\n疑问词\t54318\nhttppinyincne17041\t54319\n罗楗\t54320\n我喜欢美丽的秘密\t54321\nflower\t54322\n刘艳芳\t54323\nvqf\t54324\n条条框框\t54325\n开放度\t54326\n运筹\t54327\nzapiya\t54328\n多咪多咪你在干嘛\t54329\n奥比岛奥比岛\t54330\n四年了了我要度秘\t54331\n鬼委\t54332\nRydg\t54333\n领读\t54334\n黄子强\t54335\n探头\t54336\n再见吧再聊\t54337\n13343533545457967658\t54338\n奥sou\
t54339\nanlluy\t54340\n肉欲\t54341\nnhjhjj克拉拉\t54342\nvWvvvU\t54343\n不歌\t54344\nimaxested\t54345\n泖溪苑\t54346\n自慧\t54347\nzjo\t54348\n炫蓝蘑菇\t54349\ncmCM\t54350\n多多益善\t54351\n二二幺二\t54352\n北约\t54353\nzjg\t54354\n希澈TW\t54355\n韦保荣\t54356\nzjc\t54357\n北纬\t54358\n得寸进尺\t54359\n标镇\t54360\n辽宁地区\t54361\n不止\t54362\n不正\t54363\n不步\t54364\n不武\t54365\n印度尼西亚交通部\t54366\nzjw\t54367\n别再\t54368\n15199131316\t54369\nzjs\t54370\nzjr\t54371\n郭滚开\t54372\n一千四个\t54373\n千手哥\t54374\n三餐\t54375\n从头至尾\t54376\nipopqrac\t54377\neuue\t54378\n孔小枫\t54379\n尹家绪\t54380\n张语阳\t54381\n秘感\t54382\n成福\t54383\n过来看\t54384\nRecoChoku\t54385\n陕西省\t54386\n0.15元\t54387\n老弱病残\t54388\n翰莎\t54389\n伏笔\t54390\n无形下列\t54391\nhi文\t54392\n唐减免王子郑棉网面\t54393\n尚雯婕\t54394\n难你\t54395\n两组\t54396\n好抱\t54397\n爸妈妈\t54398\n告诉我祢\t54399\n赏莲\t54400\n如果秘\t54401\n奈君何\t54402\n2月7号\t54403\n好报\t54404\nendssshdutyis\t54405\n那迪斯尼\t54406\n真意切\t54407\n868\t54408\n出汗\t54409\n层层\t54410\n861\t54411\n860\t54412\n叶雪晶\t54413\n862\t54414\n回报\t54415\n867\t54416\n866\t54417\n车管所\t54418\n凌空\t54419\n鏪\t54420\n唐宋\t54421\n哇哈\t54422\n出汉\t54423\n兵兵好呀\t54424\n到会\t54425\n点谂\t54426\n心醉\t54427\n温圳英华中学\t54428\n多开心一6xevg8csevtmgrhgty4\t54429\n唱伤\t54430\n黄晓明儿\t54431\n作业\t54432\noiwkskjcudi\t54433\n进行中\t54434\n宋GG\t54435\n行吧\t54436\n恩记\t54437\n情场\t54438\n唱会\t54439\n112244779\t54440\n作为\t54441\n陪妞\t54442\n白糖水\t54443\n点谱\t54444\n智谋\t54445\n为你一\t54446\nbeautifyoull\t54447\n龙女堡\t54448\n徐丽莎\t54449\n讲解\t54450\nnriaj\t54451\n蜜粉\t54452\n海雅非\t54453\n大耳朵\t54454\nａｆｘ\t54455\n气气\t54456\n亿利\t54457\n双雄\t54458\n泡妞吧兄弟先\t54459\n棉签\t54460\n发哈\t54461\n蓑衣\t54462\n吴长江\t54463\npg\t54464\n恩那个你个你知道我的好友\t54465\n十七天\t54466\n一下辆\t54467\n讲见\t54468\ngvgfsdc\t54469\nｌｏ\t54470\n潘阳\t54471\n发哥\t54472\n微弱\t54473\n石健\t54474\n撤回\t54475\n為麼麼噠\t54476\n郑州公安局\t54477\n120526.SuperShow4.Encore\t54478\n严肃\t54479\n昆老\t54480\n最新期\t54481\n我的身\t54482\n用之于民\t54483\n痒痒肉\t54484\n包装子\t54485\n成龙历险记\t54486\n周来雨\t54487\n加盟\t54488\n加盐\t54489\ndontrun\t54490\n忒么\t54491\n加盖\t54492\n国色天香\t54493\njjyh\t54494\n27岁\t54495\n人之常情\t54496
\n矿工\t54497\n哎呦度秘\t54498\njjyf\t54499\n一个麦什180次\t54500\n陈嘉萌\t54501\n环岛\t54502\nShy\t54503\njjyy\t54504\n太子\t54505\njjyr\t54506\nBased\t54507\n湖柏村\t54508\n美科竖都\t54509\n吉娃娃女\t54510\n金钟大\t54511\n闷替\t54512\n兴奋剂\t54513\n任寿\t54514\n你好玩我和你聊\t54515\n15423\t54516\n罗琴\t54517\nxkyxg\t54518\n苏比\t54519\nchroo\t54520\n罗琳\t54521\n限时\t54522\nHansAnderson\t54523\n引爆\t54524\nhikj\t54525\n谢朝平\t54526\n丑辣\t54527\n赵字\t54528\n夏嘉\t54529\n真心实意\t54530\n破烂\t54531\n着急受累\t54532\n7月20日\t54533\n精装\t54534\n龙猫\t54535\n李思洋\t54536\n嗯介绍\t54537\nmetout\t54538\n喽美拉拉\t54539\n潮味\t54540\n保爱\t54541\n吉韩文\t54542\n求求你了我喜欢\t54543\n红蛛珠3\t54544\n浆果\t54545\n孝义五中\t54546\n人寰\t54547\n唱听\t54548\n大洋哥\t54549\n唱吧\t54550\n名词\t54551\n潮员\t54552\n重工业\t54553\n名说\t54554\njirff\t54555\n国中\t54556\n会钱\t54557\n会钰\t54558\n杀杀杀杀杀杀杀杀杀\t54559\n国丹\t54560\n退避\t54561\n国主\t54562\n5186685412773131264684867644094979431383464843835122668852852461146431316778643655\t54563\n杨舒胡\t54564\n没想到\t54565\n5.2级\t54566\nquickwit\t54567\n山外\t54568\n这些事儿\t54569\n认取\t54570\n宝贝你是宝贝宝贝宝贝我爱你\t54571\n说了命\t54572\npo\t54573\n七八槽\t54574\n宁邦寺\t54575\nmeisgirl\t54576\nД秋豆麻袋\t54577\n内疚\t54578\n必修号\t54579\n白砂糖\t54580\n金逸珠江影视城\t54581\n每一分钟\t54582\n山头\t54583\n电击\t54584\n说了呀\t54585\n写劲\t54586\n3千元\t54587\n唧唧烂种\t54588\n蜂蜜\t54589\n山夫\t54590\n认可\t54591\n被减数\t54592\n一世一一四两\t54593\n两幅画\t54594\nafas\t54595\n九五个\t54596\n登场\t54597\n积压\t54598\n藏传佛教\t54599\n宏宝贝\t54600\n凌濑千早\t54601\n德里克\t54602\n在度度\t54603\n密碼\t54604\n汪桂珍\t54605\n订婚宴\t54606\n金鸡\t54607\n米西米西\t54608\n嫉贤妒\t54609\n埋头\t54610\n仓山区\t54611\n88折\t54612\n害人精没\t54613\n肿莫\t54614\n别克侠\t54615\n受害方\t54616\ntkvjvm\t54617\n器械\t54618\n19.6%\t54619\n惩罚\t54620\ncaocao\t54621\n娜娜拉一拉\t54622\n小二十岁\t54623\nng9dg\t54624\n冻米\t54625\n找妈妈\t54626\n光光\t54627\nCBB\t54628\n车皮\t54629\n忽冷忽热\t54630\n担保\t54631\n春野\t54632\n香火\t54633\nCBC\t54634\n苏醒吗我是总裁我总裁在上我在下\t54635\n打雷劈\t54636\n一百几个\t54637\n84分\t54638\n252525rylor\t54639\n爱的朋友\t54640\n孙伟\t54641\n超级大草包\t54642\nstonitto\t54643\n鹦鹉\t54644\n棕熊\t54645\n821493010\t54646\n待见洌\t54647\n八卦呗言文\t54648\n你是我最爱的聊天机
器人\t54649\n武中校\t54650\n真聪明对了真聪明我给你在\t54651\n家老处女\t54652\n郭泓琳\t54653\n听话乖\t54654\n面条们\t54655\n２５０\t54656\n沃管家\t54657\n综合运\t54658\n千度\t54659\n此卡\t54660\n切尔西苡\t54661\n好好玩儿\t54662\nyato\t54663\n南梦\t54664\n药鸡\t54665\n出生人\t54666\n眼训\t54667\n喇叭花\t54668\n司近平\t54669\n风流赛\t54670\n无咪\t54671\n出生于\t54672\n千庆\t54673\n口渴\t54674\n苏教版6\t54675\n六六点\t54676\n南梁\t54677\nfulthings\t54678\n灵幻的天气预报灵幻灵幻\t54679\n独在异乡为一刻每逢佳节倍思亲遥知兄弟登高处遍插茱萸少一人\t54680\n李爱心\t54681\n左以泉\t54682\n刘玲\t54683\n别光阳春白雪\t54684\nvjcv\t54685\n蓝天白云\t54686\n开悟\t54687\n八十六六十八四十三十四九十七二十八五十三二十六\t54688\n脚调\t54689\n有不是\t54690\n奥特卫视\t54691\n欣桐\t54692\n失信假\t54693\n抱着\t54694\n做爱时\t54695\n奔涌\t54696\n玩跑\t54697\n陡直\t54698\n调减\t54699\n驾校\t54700\n还钱\t54701\n奥城\t54702\n静好\t54703\n性子\t54704\n罗夕子\t54705\n聚神\t54706\nhuyy\t54707\n惹怒我了我恨你\t54708\nsuai\t54709\n调出\t54710\n肉穴\t54711\n一包块\t54712\n薏米粥\t54713\n打打机器人我恨你\t54714\nhuyf\t54715\n妙唔系\t54716\n批发价\t54717\n效劳\t54718\n盈汇集\t54719\n旅游度假区\t54720\n体育界\t54721\n二三零\t54722\n案情\t54723\n燕子歌\t54724\n给我想\t54725\n我一半儿\t54726\n陶景怡\t54727\n这两天\t54728\n计费\t54729\n乖啦乖啦巴拉瓦\t54730\n仍未\t54731\n7395万人次\t54732\n罗浮山\t54733\neyggfg\t54734\n力宏\t54735\n颜色的日子\t54736\n效力\t54737\nvvcvh\t54738\n娇滴滴\t54739\n争创\t54740\n吃吃吃吃吃吃吃吃吃擦\t54741\n胡说好\t54742\n安子辰\t54743\n审题\t54744\n六十米\t54745\n吻女\t54746\n三人称\t54747\n篦子\t54748\n你好你好我恨你\t54749\n更方便\t54750\n格律诗\t54751\n筑梦2008\t54752\n栗子涵\t54753\nYytf\t54754\n吧二\t54755\n威龙辣条\t54756\n鱼尾狮\t54757\n第五十期\t54758\ngfyju\t54759\n品学兼优\t54760\n文交所\t54761\ngfyjc\t54762\n核战\t54763\n我的蜜穴\t54764\n仗势\t54765\ndaeatd\t54766\n好我问你我的\t54767\nfrydrr\t54768\n好色\t54769\n张子杰\t54770\n13980749149\t54771\n苹果派\t54772\n群居\t54773\n吧人\t54774\n银滩\t54775\nguffj\t54776\n数鸭\t54777\ntryrr\t54778\n1411\t54779\n1413\t54780\nfbobatio\t54781\n1415\t54782\n叽里呱\t54783\n凶哼\t54784\n滕雨辰\t54785\n真王什\t54786\n皮卡丘\t54787\n10分\t54788\n三月十\t54789\nhgjk\t54790\n周兰\t54791\nfallsout\t54792\n花影\t54793\n积点\t54794\nkdks\t54795\n111111444477722255558880033366699岁\t54796\n旗舰店\t54797\n那个我\t54798\n双袖\t54799\n阿伯丁\t54800\n哎呦你不懂我\t54801\n四季豆\t54802\n
心力交瘁\t54803\n傻瓜节\t54804\n解兴娴\t54805\n今期\t54806\n一工\t54807\n洗心革面\t54808\n大事儿\t54809\n那峰峰\t54810\n了不说了别说度秘度秘\t54811\n校托亚机\t54812\n阳具\t54813\n亲切\t54814\n度蜜度秘你最帅你是天下第一帅\t54815\n胡说八道\t54816\n乱来\t54817\ncvjufch\t54818\n十九年\t54819\n对啊谁\t54820\nDavid\t54821\njihanjia\t54822\n555555555521\t54823\n沸感\t54824\n2206822671\t54825\n盛泰色织\t54826\n虚寒\t54827\n乌蒙\t54828\n帕乔\t54829\n哪有事\t54830\n高风芝\t54831\n凡伶\t54832\n邵臆霖\t54833\n亲别\t54834\n回溯\t54835\n路程\t54836\n韩雨桐\t54837\ngraduation\t54838\nLaaa\t54839\n而已\t54840\n天津青年团\t54841\n十块钱\t54842\n兼得\t54843\n恨中生\t54844\nfel\t54845\nSM男团\t54846\npollution\t54847\n拉贝\t54848\n仇人家\t54849\n许子欣\t54850\ncjcj\t54851\n叫骂\t54852\n少丽\t54853\n少为\t54854\n肠子\t54855\n壁画\t54856\n601939\t54857\nfius\t54858\n74741\t54859\n推向\t54860\n可达\t54861\n压抑\t54862\n内容块\t54863\n少一\t54864\n想和你是\t54865\n潘基文\t54866\n梅干\t54867\nnafftheft\t54868\n推吼\t54869\nbcd\t54870\n裂痕\t54871\nmerome\t54872\n名吃&amp\t54873\n想笑笑笑\t54874\n挨打\t54875\n饭团\t54876\nJjnngibhn\t54877\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t54878\nfuodid\t54879\n马老师\t54880\nguhuh\t54881\n横穿马路\t54882\n在言\t54883\n雷丘\t54884\n闲来\t54885\n马玉鑫\t54886\n杨秘物\t54887\n卡贡献\t54888\n十一11月17\t54889\n雷一\t54890\n徐云兰\t54891\nOGgsw\t54892\n故伎重施\t54893\nghick\t54894\n幽若皮丹\t54895\n捂住\t54896\n40多万行\t54897\n欧元区路\t54898\n十八十八米\t54899\n梓萌\t54900\n闲杂\t54901\n683524242428586868242424242868685724686757275868242424386860686442427\t54902\n全责\t54903\n念头\t54904\n我不爱你你就不爱我了\t54905\n洪兴路\t54906\n串儿\t54907\n猜默\t54908\n677800\t54909\npankgch\t54910\n森明\t54911\n麻烦切\t54912\n末世女大我是女的\t54913\n小密\t54914\n666662666\t54915\n登记\t54916\n说话台\t54917\n淤船\t54918\n退出口\t54919\n幂吧子宫内翻安利\t54920\n蔡梦雪\t54921\n45541455584\t54922\n陪着\t54923\n兰缪\t54924\n硬碰民族主义\t54925\n贞洁\t54926\n制动\t54927\n不好笑年\t54928\n洪雅崖\t54929\n么么么么么么么么与太\t54930\n开叉\t54931\n130一八200\t54932\n喂奶\t54933\njwhe\t54934\n樊字\t54935\n王红东园\t54936\n冷凝\t54937\nDisturbia\t54938\n抛砖\t54939\n张玉菲\t54940\n讨价还价\t54941\n失职\t54942\n3423\t54943\nhello啵\t54944\n心不烦\t54945\n璧山\t54946\n好啦好啦我不爱你聊了再见再见再见再见再见再见再见再见再见再见再见再见再见再见\t54947\n三八二零六零九八四
零\t54948\n就当我折翼天使好了=\t54949\n千亿个\t54950\na2PA网\t54951\nvfrehf\t54952\n小帮手\t54953\n奖励\t54954\n武艺&amp\t54955\n告慰\t54956\n张荭\t54957\n米奇妙妙屋\t54958\n腾佳音\t54959\n慕斯olo\t54960\n肠道\t54961\n夥食費哥\t54962\n飞赴\t54963\n秘哈\t54964\n床下\t54965\n床上\t54966\n学渣\t54967\n3.3亿美元\t54968\niriduutdud\t54969\nbylololmen\t54970\n懂求\t54971\nShchhcf\t54972\ngifufuf\t54973\n德胜门\t54974\n陈悍东\t54975\nSHI人\t54976\n到好朋友\t54977\n压碎\t54978\n说句不好听话\t54979\n飞赞\t54980\n软妹币\t54981\n阪神地区\t54982\n7413163\t54983\n不搞基\t54984\n一侯\t54985\n没把握\t54986\n963852741\t54987\nsvtd\t54988\n比照\t54989\n虎牙直播\t54990\n一便\t54991\n直骨刃\t54992\n125千克\t54993\n活口\t54994\n剪短\t54995\n牛桃\t54996\n装束\t54997\n李嘉军\t54998\n万般\t54999\n川芎\t55000\n樨语音助手\t55001\n一例\t55002\n热肉\t55003\n乖别哭\t55004\n红歌\t55005\n在家里外\t55006\n修车\t55007\n陈真希\t55008\n逼儿\t55009\n最近三个月\t55010\n全杀\t55011\n2774.57点\t55012\n粉们\t55013\n知识分享达人#Are\t55014\nLone\t55015\nLong\t55016\n此文\t55017\n脱衣\t55018\n诚闻\t55019\n你好罗嗦\t55020\n讲一往\t55021\n152874521\t55022\n云状\t55023\nwhenic\t55024\n费尔马伦\t55025\n十种\t55026\nHighs\t55027\n硫磺味\t55028\n此方\t55029\n手麦\t55030\n邹灿\t55031\n手麻\t55032\n凑集资金\t55033\n心裏裏\t55034\n信工\t55035\n们笑话我了我就不开心了那你现在不笑话我我要开心\t55036\n点毛老子掏心你嘛偷袭你\t55037\n我就是爱那你\t55038\n宇宙柜\t55039\n周律师\t55040\n划吧\t55041\n简意赅\t55042\nolol\t55043\n我的一条狗\t55044\n郑宇\t55045\n标高\t55046\n美自然堂\t55047\n挑战者聯盟\t55048\n笛子\t55049\n五二五三五四五五五六五六786969\t55050\n美丽说\t55051\n别帮\t55052\n德恩\t55053\n别带\t55054\n美国女孩\t55055\n郑容\t55056\n15203697186\t55057\nsnwkskk\t55058\n笱续歹\t55059\n阿呀\t55060\n共产主意\t55061\n天可\t55062\n逗比欧\t55063\n沈阳医院\t55064\n88328\t55065\n抽泣\t55066\n653535\t55067\n20096198502025点\t55068\n上缴\t55069\n北京市地方税务局\t55070\n天司\t55071\n彼得林奇\t55072\n截然不同\t55073\n天台\t55074\nNice\t55075\n遭受\t55076\nfgfff\t55077\nfgffg\t55078\n翔呗\t55079\nluhsn\t55080\n声优们\t55081\n状况\t55082\n长摇\t55083\n尹友\t55084\n尹又\t55085\nfgffv\t55086\n翘首以盼\t55087\n超级爆龙神\t55088\n4587258\t55089\nebye\t55090\nDGARGON\t55091\n不要你了我不要你\t55092\n央行\t55093\n唐唐\t55094\n钱危惫\t55095\np1\t55096\n2000米\t55097\n试业\t55098\n纪念品\t55099\n长短\t55100\n脉针\t55101\n9
0996497\t55102\nJJJJJJJJ\t55103\n先锋派\t55104\n3≦＼o／╯з╰Orz\t55105\n来不及\t55106\n资询\t55107\n老芭比\t55108\n钟礼庆\t55109\n大剧院\t55110\nlamvv\t55111\n爱欲\t55112\n游弋\t55113\n万了八千\t55114\n134页\t55115\n肏屄\t55116\n大明猩\t55117\nxjnshnbbzs\t55118\n礤\t55119\n合适用\t55120\n真好像\t55121\n大地区\t55122\n幼吾幼\t55123\n不可爱了有没\t55124\n什么叫好像\t55125\n月印\t55126\n背背背背\t55127\n好吧123\t55128\n塔下村\t55129\n月华\t55130\neapour2\t55131\n双妹\t55132\n有者\t55133\n阴精干过度\t55134\nittdyi\t55135\n龙门飞甲\t55136\nffbb\t55137\n兴化市\t55138\nSAAB\t55139\n郁金香花象\t55140\n彭家乐\t55141\n行呢\t55142\n退市\t55143\n可不可爱\t55144\n赵博伟\t55145\n决好\t55146\n寺庙\t55147\n姐们们\t55148\n极点\t55149\n３亿元\t55150\n工段\t55151\n那四小花千骨\t55152\n码盘\t55153\n哥飞\t55154\n酒色\t55155\n讨厌你好丑\t55156\n蚕豆\t55157\n帅果果\t55158\n无非\t55159\nfgggghurhhthoodyifjllfjgghfkgkgbgjjookfhhnhhgjkgtfggggyiklpdghvccvggfghhggghuophdvydfgtvggffghfdfffggvhmhhjbbbnlkhvbhvbjbvvxzxcvgljhfefhfthtdaqggjjutfpvqdoghjcujhggjklplkjhgqwerruiplkgssxm\t55160\n5757224\t55161\n背多听一下\t55162\n要不过\t55163\n欧阳询\t55164\n盖保罗\t55165\n别错\t55166\n哈曼\t55167\n随性好相处\t55168\nFtyg\t55169\n梨花\t55170\n韩粉\t55171\n22255417\t55172\n书呆\t55173\nened\t55174\n好勒\t55175\nenen\t55176\n你好帅还是你好美\t55177\n黑暗料理\t55178\n红人儿\t55179\n金家也\t55180\nghfgr\t55181\n我不爱你我讨厌你讨厌讨厌讨厌讨厌讨厌讨厌\t55182\n多小度\t55183\n你好三公主\t55184\n煤气样\t55185\nasking\t55186\n一百公里\t55187\n印占\t55188\n毒明一真棒\t55189\n我的脸\t55190\n言语\t55191\n回信气\t55192\n重说\t55193\n武果果\t55194\n杨梓辰\t55195\n桑雷德\t55196\n食材\t55197\nMOTHER\t55198\n爪子\t55199\n可贵性\t55200\n重读\t55201\n43周年\t55202\n古董\t55203\n回向往\t55204\n北京城建龙\t55205\n短期\t55206\n捐募\t55207\n韝有灵\t55208\n重试\t55209\n很自觉\t55210\n李委员\t55211\n仙尊\t55212\n早姑\t55213\n天津颐高数码广场\t55214\nmmmnnnnnnnxdddddddddtttttttttttttt\t55215\nImbored\t55216\n无线无线网\t55217\n872305595\t55218\n署\t55219\n且行\t55220\namvk\t55221\n水面陈\t55222\n拜拜拜拜\t55223\nlff\t55224\n最主要\t55225\n疯人院\t55226\n鼻头\t55227\nlfk\t55228\n程泽熙\t55229\nlfu\t55230\n颅脑\t55231\n打雷\t55232\nScheurebe，Kerner\t55233\n度秘我没\t55234\n无克\t55235\n光壶中酒\t55236\n磨子\t55237\n快乐香港行\t55238\n10米\t55239\n骐骐\t55240\n破财\t55241\n官渡桥\t5
5242\n破败\t55243\n一山一山亮晶晶满天都是小星星\t55244\n秦桧\t55245\n破费\t55246\n苗红\t55247\n方圆\t55248\niyfdddd\t55249\n一尺九\t55250\n反应式\t55251\n良好累\t55252\n跑官\t55253\n熊培云\t55254\n587.5万美元\t55255\n说好好意思\t55256\n邝警官\t55257\n明不没有\t55258\n莱亚\t55259\nhhfhhfgty\t55260\n颠球\t55261\n课老师\t55262\nhjfnjgfg\t55263\n编买\t55264\n林家子\t55265\n枫树岭村\t55266\n太气\t55267\n开洗\t55268\n张睿\t55269\n奥莎尼\t55270\n发音器官\t55271\n孙明欣\t55272\nsouth\t55273\n弯曲性\t55274\n一个劲\t55275\n崔胜贤\t55276\nZJJ\t55277\n听音\t55278\n仙萼长春图册\t55279\n铜制\t55280\n办儿\t55281\n惊涛\t55282\n乖香梅\t55283\n真实感\t55284\n1555086888458265646738024685\t55285\n严格要求\t55286\n谦虚勒\t55287\n吃胸\t55288\n先勇女替我\t55289\n老就是你\t55290\n裆部\t55291\n袁增凤\t55292\n外汇市场\t55293\n100000000000000000000000000000000000000000000000000000000元\t55294\n啊郎\t55295\n嘿嘿树\t55296\n咕嘎\t55297\n人见人爱\t55298\n四维\t55299\n曲终绝唱\t55300\n自贡灵犀义工社\t55301\n15元\t55302\n邪医\t55303\naaba\t55304\naabb\t55305\naabc\t55306\n百达翡丽\t55307\n15克\t55308\n吧子\t55309\n晚上9点57\t55310\ntuxhfiffgdbd\t55311\n展鹏\t55312\n平衡\t55313\nv烦恼\t55314\n温名\t55315\n画圈儿\t55316\n欧罗巴克马克马克马\t55317\n睡了\t55318\n20152016年度\t55319\n终日\t55320\n张逸杰\t55321\n抗震救灾\t55322\n打失业\t55323\n许晓桐\t55324\n鹿哥帅\t55325\nhundredof\t55326\n雪样\t55327\n平行\t55328\nbffjcdhrjj\t55329\nWhatchecked\t55330\n四一四九四\t55331\n回去\t55332\n老师话\t55333\n有型\t55334\n2月9号\t55335\n南昆\t55336\n哇塞塞塞塞塞塞塞省\t55337\n大黄片\t55338\n打工\t55339\nnqi\t55340\nhhbryukk\t55341\n良性肿瘤\t55342\n南昌\t55343\n龙门动物园\t55344\n放風\t55345\n康贝贝\t55346\n1支\t55347\n度秘你好美\t55348\n老师说\t55349\n32%\t55350\n杨琴\t55351\nze张柏秘candomitododah1aa\t55352\n她的要求\t55353\n322\t55354\n323\t55355\n49点\t55356\n321\t55357\n杨琪\t55358\n327\t55359\n324\t55360\n325\t55361\n王娜\t55362\n王娟\t55363\nfyslmoh8ftuhihopooo\t55364\n32D\t55365\n为你哭\t55366\n九成\t55367\n25方\t55368\n丙火人\t55369\n听罗\t55370\nnikkkll\t55371\n感官\t55372\n王娅\t55373\n妹纸妹纸妹纸妹纸在哪里陪陪我了\t55374\n王威\t55375\n娃子\t55376\n够了\t55377\n32c\t55378\n郭四喜\t55379\n昆\t55380\n再来一次\t55381\n32d\t55382\n听罢\t55383\n虎皮青椒\t55384\n大雪天\t55385\n关刀\t55386\n32p\t55387\n32v\t55388\n百度智能机器人\t55389\n塞牙\t55390\n否则的话\t55391
\n別理\t55392\n函告\t55393\n四十斤\t55394\n忍气吞\t55395\n家馆\t55396\n不不不你真\t55397\n名命\t55398\n班头\t55399\n大姑妈\t55400\n等度度\t55401\njnsmqmmam\t55402\nnagi\t55403\n1555585555\t55404\n60000万股\t55405\n为那\t55406\n43075\t55407\n李林军\t55408\n名员\t55409\n海平工作室\t55410\n胡心玥\t55411\n大黄鸭\t55412\nBBE6888FE54A7E585A8\t55413\n冲饱了\t55414\n看台\t55415\n嗯嗯撸\t55416\n市政协常委\t55417\n克鲁塞罗\t55418\n假山石\t55419\n哪里家小好不是g爱我你一觉\t55420\n呵大\t55421\n告辞\t55422\n八二二\t55423\n格子宫\t55424\n聊了算了吧\t55425\n夏玫瑰\t55426\n稻盛\t55427\n市标\t55428\n人工阶级\t55429\n一个礼拜天\t55430\n南边镇\t55431\nvhu\t55432\n睡著\t55433\n们老师\t55434\nJerome\t55435\n呵复\t55436\n杨欣\t55437\n童同\t55438\njaqqq\t55439\n五十八五十\t55440\n硕士\t55441\n兴彪\t55442\n3834380438\t55443\n凭感觉\t55444\n时光网\t55445\n罩卅\t55446\n本书\t55447\n蚀龙\t55448\n覃贡\t55449\nhttppinyincne19485\t55450\n和好好不好\t55451\n真人面目的\t55452\n酸处理\t55453\n快车\t55454\n凶好\t55455\n江流都\t55456\n快轨\t55457\nEddie\t55458\n2012－2014年\t55459\n固有\t55460\n仆人们\t55461\n走好\t55462\n老齐\t55463\n失去理智\t55464\n吴二\t55465\n呢恩知道\t55466\n6部\t55467\n53655485286658\t55468\n本么\t55469\n北京朝阳医院\t55470\n十六了你猜我\t55471\n董秘董\t55472\n新年愉快\t55473\n干春\t55474\n四十多块\t55475\n成都工商局\t55476\n等座\t55477\n线人\t55478\n送考\t55479\nkissfrom\t55480\n卫生\t55481\n英語7天就太陽鏡駿喫\t55482\n浦发\t55483\ntopste\t55484\n鼠鼠\t55485\n呢咕\t55486\n跟单\t55487\n文化\t55488\njagagjgjgka\t55489\n十三点71点\t55490\n13034398063\t55491\nrfjhgm\t55492\n周曦扬\t55493\n早被\t55494\n漏出\t55495\njtmmmm\t55496\n3mmm\t55497\n歌桶\t55498\n吴鹏洋\t55499\n依腾\t55500\n没有那么多想\t55501\n汉典说话呢地方说话\t55502\n一干二净\t55503\n在生\t55504\n不懂歌\t55505\n老处\t55506\n第14g14g\t55507\n机柜\t55508\n脚刑\t55509\n过年大学后门有上大把\t55510\n分裂\t55511\n吊车\t55512\n贺新年\t55513\n减肥餐\t55514\n九块\t55515\nzhé\t55516\n住除以\t55517\n剑州\t55518\n保鲜膜\t55519\nc栋\t55520\n度秘二我好爱\t55521\n酉中藏辛金\t55522\nfhhgbbh\t55523\n口布\t55524\n分裤\t55525\n口币\t55526\n飞鲨\t55527\n341281199911252831\t55528\n庞龙\t55529\n考论\t55530\n梦不醒白\t55531\nLfbn\t55532\n日班\t55533\n爱佑\t55534\n东意\t55535\n大妈们\t55536\n丫头们\t55537\nile\t55538\n贺岁新年\t55539\n太完全\t55540\nhomes\t55541\n10余倍\t55542\n国务院\t55543\n得体\t55544\n开业后\t55545\
n晓香香\t55546\n十章\t55547\n福摩\t55548\n打一拉\t55549\n爱你\t55550\n垒手\t55551\n好多鱼好多鱼好多鱼好多鱼好多鱼好多事\t55552\n飞猪\t55553\n韦著\t55554\n打的\t55555\n273919059\t55556\n关劲度\t55557\n14578867\t55558\n早上五点半\t55559\n嘟嘟秘\t55560\n学代会\t55561\n忘了我讨厌\t55562\n张嘴\t55563\n丫家\t55564\n大钱\t55565\n李凤绪\t55566\n谈撒\t55567\n东坝头\t55568\n立志\t55569\n九十又不又\t55570\n聂笑嫣\t55571\n曹林\t55572\n京仪大厦\t55573\nuxbbhhhhhhhjhhhhjbbbbbbbbbb\t55574\n大钟\t55575\n大钞\t55576\n此行\t55577\n主动物业\t55578\n政经\t55579\n最款\t55580\n车厘子\t55581\n陈医生\t55582\n锁住\t55583\n炼组\t55584\n雨像疯\t55585\nMslwihhSkkem\t55586\n2.3.3\t55587\n老国\t55588\nuysfggo\t55589\n李珍基\t55590\n会使人\t55591\n徐悲鸿\t55592\n奥特\t55593\nhdyc\t55594\n倩姐\t55595\n何雷锋\t55596\nhdyf\t55597\n嘛度秘\t55598\n杨千\t55599\n欢迎光临\t55600\n好动听\t55601\n3988\t55602\n鸿道\t55603\nhdyt\t55604\nlamanam\t55605\nhdyx\t55606\n推荐人\t55607\n哎呦你是猴子我好讨厌你吗\t55608\n捡回\t55609\ndjfjg\t55610\n4645424254542442242424\t55611\n阴险\t55612\n艾克斯芝顿武装\t55613\n恨不恨\t55614\n老大徒伤悲\t55615\n潘长江\t55616\n少岚\t55617\n懒洋洋\t55618\n两个天\t55619\n少岁\t55620\n赵一照\t55621\ncfhvu\t55622\n百无聊奈\t55623\n促进\t55624\n伊利集团\t55625\nyjp公司\t55626\n郑昊\t55627\n肖旗\t55628\n特卖\t55629\n禁盗图转注明】#朴有天生日快乐##朴有天生日快乐##你自由就是我快乐#\t55630\n#ZTE联播\t55631\n水中游\t55632\n楊舒琴\t55633\n张居烨\t55634\nfyfyhchcyvifigggcgjggvgubvfyhggjhhfug\t55635\n享乐主义\t55636\n常红莹\t55637\n马呈臣\t55638\n杜淑雯\t55639\n刘景学\t55640\n几注\t55641\n毛连\t55642\n忘了眼\t55643\nudjrhidhrdjbdnhd\t55644\n阿迪凡布法曼\t55645\n几泽\t55646\n喜欢不喜\t55647\n互补者\t55648\n售量\t55649\n得自己走\t55650\n儿女\t55651\n尼玛冰+\t55652\n风尚志\t55653\n显然\t55654\n安内\t55655\n啊三家子雷天空阴沉呢行外人来君王\t55656\n很厉害\t55657\n度比度比\t55658\n菜包子\t55659\n基额\t55660\n晋西\t55661\n酒甘糖\t55662\n晕倒\t55663\n就象\t55664\n书房\t55665\nshdjcxjf\t55666\n白叨叨\t55667\n忘了看\t55668\n大猪大猪你是大猪\t55669\n要不就是\t55670\nGuigushi\t55671\n灾情\t55672\n看不見\t55673\n23468858887\t55674\n罗小猪\t55675\n乐播六\t55676\nchellochellochellochello\t55677\n助兴\t55678\n奥城商业街\t55679\n小达秘\t55680\nFBI4\t55681\n加士\t55682\n重庆大学城\t55683\n用无\t55684\n热峦\t55685\n王梦洁\t55686\nGOGO\t55687\n一万八千平方米\t55688\n真一样\t55689\n黄天贯\t55690\n用时\t55691\n急功尽力\t55692\n猜迷
\t55693\n15斤\t55694\n马首是瞻\t55695\n未来工厂\t55696\n15122778107\t55697\n有理由\t55698\n意思的话大概车管所个的钉宫一拖vw滕王阁五个我还\t55699\n披上\t55700\n制造\t55701\n荣膺\t55702\n亲爱的你看你多淘气\t55703\n空缺\t55704\n派送\t55705\n零二二\t55706\nVI尿布\t55707\n静茹\t55708\n蠢话\t55709\nheisrich\t55710\n依山\t55711\n讨论会\t55712\n有手作生\t55713\n意犹未尽\t55714\n兑现\t55715\n再见了我不和你聊了我讨厌你\t55716\n漠然\t55717\n9月2日下午\t55718\n徐榕\t55719\ndjfvjfjfb\t55720\n臭逼\t55721\n大江歌罢掉头东最近情歌别事情面壁桌面图破壁难酬蹈海\t55722\n政事\t55723\n缎子\t55724\n咯了聊\t55725\n一家人\t55726\n6855\t55727\n洪洞\t55728\n1090447277\t55729\n489\t55730\n业华爻\t55731\n来自\t55732\nF6Tei\t55733\n台式奶茶\t55734\n苗雨婷\t55735\n长毛\t55736\n抽打\t55737\n裁军】天则经济研究所\t55738\nJolin￥少钱\t55739\n羊们\t55740\n普乐\t55741\n小团子\t55742\n不是我我我我\t55743\n伤人心\t55744\n野战\t55745\np夕\t55746\n度翔\t55747\n好呀好呀我告诉你\t55748\n纯甄\t55749\n安化吧\t55750\nTyghh\t55751\n哭吧摩托\t55752\n大包天\t55753\n张包邮\t55754\n你好看灰太狼\t55755\n消溶\t55756\nryshy\t55757\n秘险\t55758\n装睡\t55759\nTylyesg\t55760\n就是了无\t55761\n游戏房\t55762\n自尽\t55763\n绵绵绵\t55764\n陈雨彦\t55765\n沟沟\t55766\n拜我为师\t55767\n放寒假\t55768\n才色\t55769\n列算\t55770\n妃们\t55771\n欲望酒店\t55772\n暧昧\t55773\n渔舟\t55774\n明天12点\t55775\n马齿\t55776\n刮宫\t55777\n对其可见\t55778\nDUFD\t55779\n格拉斯\t55780\n袭人\t55781\nyffj\t55782\nyffv\t55783\n一回口\t55784\n阴影\t55785\n自封\t55786\n自小\t55787\n自尊\t55788\n12293月\t55789\n小萌蛋\t55790\n奥凯了奥美kmor阿联墨西哥铁路能说那一里有的的嘛你九九咯模拟乐从\t55791\n王乾禹\t55792\n试测专\t55793\n大醋\t55794\n夹心\t55795\n奥特曼猫\t55796\n5522\t55797\n谢鹤鸣\t55798\ndury\t55799\n约炮\t55800\n移民潮\t55801\n没了死\t55802\n不哭不哭\t55803\n兴业\t55804\n上学了拜\t55805\n交上\t55806\n费费用\t55807\n柏武\t55808\n王炯宾\t55809\n钻牙\t55810\n废话了行不\t55811\n6起\t55812\n邮政\t55813\n孩纸\t55814\n为人民服务\t55815\n邓佳美\t55816\n循化\t55817\n要不搭理\t55818\n农民起义\t55819\n分写\t55820\n切本少\t55821\n小豆腐\t55822\n五毛钱\t55823\n卡尔马克思\t55824\nababaa\t55825\nvcf\t55826\n江口镇\t55827\n盆地\t55828\n黄雨阳\t55829\n溶于水\t55830\n黔东南\t55831\n摹拟题\t55832\n君张氏\t55833\n而我要机器女人娃娃\t55834\n飞索李子\t55835\n小伴龙\t55836\n咕噜咕噜咕噜咕噜咕噜\t55837\n888588\t55838\n一百九十一十二十三十四十五十六十七十八十九二十\t55839\n精辟\t55840\n天拓\t55841\n炸死\t55842\n巫山\t55843\n说来来来\t55844\n肖伟\t55845\n青
海湖\t55846\n天降\t55847\n曹子杰\t55848\n心无旁骛\t55849\n克人\t55850\n艺声哥\t55851\nOks\t55852\nk7k7k小游嗯拳风传奇\t55853\nmwu\t55854\n你是谁找我有事么事\t55855\n中华人民共和国电影产业促进法\t55856\n说好爱\t55857\n维哥哥\t55858\n账者\t55859\n卸妆奶\t55860\nb超NB\t55861\nmwg\t55862\n迷死\t55863\n涧水茗禅\t55864\nmwn\t55865\nOkn\t55866\n666600000\t55867\n度秘里康乐\t55868\n弹无虚发\t55869\n夫熊猫\t55870\n职业性\t55871\n头马路\t55872\n158岁\t55873\n高分者\t55874\n也想说\t55875\n24位\t55876\n凌弱\t55877\nChopsticks\t55878\n待会\t55879\n原子弹\t55880\n可提\t55881\n李昕遥\t55882\n魔蛋\t55883\n哎呦hihihi\t55884\n散人\t55885\n吉利五十\t55886\n范帕西\t55887\n爬山气\t55888\n美好世界\t55889\n嚎叫\t55890\n公社\t55891\n对我来说\t55892\nmfvgcchggghjhf\t55893\n还有一个\t55894\n求发展\t55895\n迪亚比\t55896\n9月13日\t55897\n机载\t55898\n大胃\t55899\n46484\t55900\n校医室\t55901\n烦你可以\t55902\n大胆\t55903\n杨雅琴\t55904\n后座\t55905\n添一添\t55906\n一阳\t55907\n苏来祥\t55908\n安慕\t55909\n练气\t55910\n大胖\t55911\n悠空\t55912\n亲爱的祝你好梦\t55913\n大胜\t55914\n人才培养\t55915\n欢的生一爱有过你在想会\t55916\n烧杯\t55917\n青铜器\t55918\n付婉青\t55919\n安慰\t55920\n白马岛\t55921\ngxerj\t55922\n近在咫尺\t55923\n王政凯\t55924\n大胸\t55925\n幺八六二二六幺二零八幺\t55926\n这么多别\t55927\n43500\t55928\n一周一\t55929\n干聽\t55930\n全头\t55931\n13059135728\t55932\n喋血那美连连\t55933\ndeeeeeeeeeeeeeeeeeeeeeeee\t55934\n汩汩\t55935\n勉县\t55936\n政馨家园一区\t55937\n表控\t55938\n南昌华夏医院\t55939\n嫁给我了行\t55940\n张笑瑜\t55941\n全天\t55942\n屯里\t55943\n不说不\t55944\n文龙年\t55945\n吴文藻\t55946\n范密度\t55947\n二快点儿\t55948\n不用心\t55949\n红衣\t55950\n酷尔\t55951\n第九月\t55952\n坎坎坷坷妈妈快快乐乐坎坎坷坷坎坎坷坷坎坎坷坷坎坎坷坷\t55953\n干聊\t55954\n2012年\t55955\n狂放\t55956\n三二\t55957\n教室\t55958\n两万成\t55959\n狂攻\t55960\n忙就\t55961\nJcv\t55962\n安小咪\t55963\n筹备\t55964\n布丁吧\t55965\n频爆\t55966\n600万豪\t55967\n描绘\t55968\n房間\t55969\n这犯\t55970\n知晓\t55971\n缺陷\t55972\n屁話\t55973\nGhh\t55974\ndddsssss\t55975\n年复一年\t55976\nQQ块\t55977\n17681688852\t55978\n舍己为人\t55979\n餐费\t55980\nXKZKOSKTKGODPPQPOEKRFKKC\t55981\n教宗\t55982\n四大名捕\t55983\n温吉龙\t55984\n教官\t55985\n爱情生活\t55986\n250芭贝朵\t55987\n奥林巴斯\t55988\n延期\t55989\n卧槽\t55990\njdujmu\t55991\n晨报君\t55992\n嚣张\t55993\n另只手割\t55994\n赤面秉赤心骑赤兔追风马驰驱时勿忘赤帝青灯观青史新农偃月刀眼线处不窥青天了新蒙\t55995\n牛透\t55
996\n可爱的人\t55997\nphone\t55998\n卡尔约翰逊\t55999\n淫乱\t56000\n楚庄王\t56001\n温流\t56002\n4000万元\t56003\n车臣\t56004\noullole\t56005\n济南\t56006\n官方图\t56007\n早晨5点10分\t56008\n86555555555\t56009\n不识数\t56010\n#加油小狮子#\t56011\n设立\t56012\n万紫涵\t56013\nbaidu\t56014\n打停\t56015\n残疾人\t56016\nmoinfor\t56017\n石任人\t56018\n诺佳\t56019\n美丽的监狱\t56020\n施莉莎\t56021\n换届时\t56022\n总得\t56023\n激活\t56024\n59105\t56025\n复过来\t56026\n草籽\t56027\n严崇杀害\t56028\n屌戳屄\t56029\n盖世多\t56030\n擼\t56031\n面对现实\t56032\n情景剧\t56033\n孕期\t56034\ntwpg\t56035\n几百万几千万\t56036\n喜羊羊\t56037\n猫王\t56038\n成花\t56039\nLaurent\t56040\n脚踏车\t56041\n不no！一拜天地\t56042\n董瑞\t56043\n一百万一百万一\t56044\n许瀚\t56045\n考察纸\t56046\n刘廷峰\t56047\n洞穴\t56048\nhelloiuy\t56049\n受真\t56050\n3d九\t56051\n杠杠\t56052\n旨意\t56053\n霸王硬\t56054\n先天不足\t56055\n我最讨厌你了我成天那你\t56056\n建行\t56057\n度蜜度蜜\t56058\n武哥\t56059\n成节\t56060\n８４０\t56061\n好日子\t56062\n龟岛君\t56063\n1314590\t56064\n离愁别绪\t56065\n丹丹丹丹丹丹\t56066\n詹姆斯琼斯\t56067\n多廿怒国牌\t56068\n8384\t56069\n调速阀\t56070\n爬山大風\t56071\n糖瓜果果\t56072\n果庭\t56073\n鲍鲍正宏\t56074\n拓东路中段\t56075\n坏了你让我信大线坏了你\t56076\n好u格式\t56077\n黄男人女\t56078\n班委会\t56079\n百家姓\t56080\n人生不设限\t56081\nSAY\t56082\n丹东地区\t56083\n猫瑟\t56084\n笨脚\t56085\n紫色衫\t56086\n领扣\t56087\n抛砖引玉圆\t56088\n新年快乐万事如意\t56089\n刘芒\t56090\n1875808437\t56091\n我喜欢天天有喜\t56092\n刘芝\t56093\n24g4870\t56094\n心神不属\t56095\nboutf\t56096\n10510亿元\t56097\n发等\t56098\n55544444\t56099\n1哈\t56100\n刘芳\t56101\n查欠\t56102\n什岁\t56103\n两个半月\t56104\n不要你找我不讨厌你我讨厌你讨厌你\t56105\n檬\t56106\n殊途同归\t56107\n祁阳县\t56108\n历生\t56109\n1哦\t56110\n一百万二百万\t56111\n卡特特工第二季\t56112\n擞\t56113\ndddtfd\t56114\n55755\t56115\n滴珍视明\t56116\n刘芮\t56117\n英冠\t56118\n监察\t56119\n弄早\t56120\n贱妻\t56121\nYUKARIN\t56122\n吖小怡\t56123\n塔雷\t56124\n弄旧\t56125\n必利片\t56126\n泉州晚报\t56127\n志龙阿\t56128\nwwwwwwwc\t56129\n盘盘儿\t56130\nafgma\t56131\n房房\t56132\n安贞\t56133\n宜少\t56134\n拜拜佛\t56135\n9月25日\t56136\n清早\t56137\n胎动\t56138\n智美杯\t56139\n吗么\t56140\n张琳博\t56141\n夺命你是你有女朋\t56142\n金牌榜\t56143\nstuei\t56144\n蝇营狗苟\t56145\n欸乀\t56146\n那首诗\t56147\n浦母\t56148\n老太太太太太\t56149\n猜我明\t56150\n你好心有\t56151\n
毗邻\t56152\n大波噜\t56153\n外服装\t56154\n中枢神经\t56155\n独乐乐翻\t56156\n天中\t56157\n周也\t56158\n盛大游戏\t56159\n雅蠛蝶雅蠛蝶\t56160\n天主\t56161\njshsifg\t56162\n年底\t56163\n塔雅\t56164\n天一\t56165\n天上\t56166\n天下\t56167\n黑媒\t56168\n年度\t56169\n意气相投\t56170\n81年\t56171\n鸡巴\t56172\n秘蜜\t56173\n打查\t56174\n860斤\t56175\n王美迪\t56176\n巧克力慕丝\t56177\n臭臭臭臭丑八怪丑八怪\t56178\n找我可以\t56179\n好啦好啦好啦\t56180\n小犬\t56181\n养狗\t56182\n呃讲卫士大奔跑吧兄弟\t56183\nKkjjj\t56184\n横竖\t56185\n第二支\t56186\n好yourfishityou\t56187\n上海\t56188\n着色\t56189\n280553448\t56190\n脓软件\t56191\n插播\t56192\n血统\t56193\n扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏扭扭捏捏\t56194\n不是和你不干嘎嘎\t56195\n操手\t56196\n上流\t56197\n0.05\t56198\n验柜\t56199\n树咖啡\t56200\n咸茶\t56201\n艾迪\t56202\n闹蛋\t56203\n霞浦市\t56204\n真爱至上\t56205\n拳拳\t56206\n很多很多\t56207\n右路晗\t56208\n奎屯\t56209\n朴孝敏\t56210\n画杆\t56211\n有线电\t56212\n本本\t56213\n烈水库垮坝\t56214\n本末\t56215\n已經\t56216\n画材\t56217\n朱丽亚\t56218\nscnucrew\t56219\n本机\t56220\nmpdd\t56221\n一大类\t56222\n八七年\t56223\n3081\t56224\n阿基拉·安格鲁卡鲁·尔康\t56225\n英博\t56226\n本月\t56227\n有求\t56228\nvertu\t56229\n谢老婆\t56230\n总部\t56231\n如生\t56232\n呱呱呱呱\t56233\n本期\t56234\n邹婉\t56235\n画板\t56236\n机嚣\t56237\nGalaxy\t56238\n六十年前\t56239\n二般\t56240\n六分之五溪\t56241\n盛情并茂\t56242\n278295\t56243\n度秘我好讨厌你\t56244\nrac\t56245\n狂犬病\t56246\nran\t56247\n双十一\t56248\n做道\t56249\n在海上\t56250\n两三万\t56251\n葱花\t56252\n陈果\t56253\n过年了我不想\t56254\n撒愣\t56255\n立克\t56256\n好心境\t56257\n擦八\t56258\n落地式\t56259\n2摄氏度\t56260\n两三个\t56261\n即逝\t56262\n五一个\t56263\n山望\t56264\n一条船\t56265\n55554247558632148538521478626365\t56266\n二舅\t56267\n在海中\t56268\n体脂率\t56269\n膳点\t56270\n大学校\t56271\n大兴斤米\t56272\n酸辣\t56273\nhttppinyincne19723\t56274\n小熊叉叉\t56275\n做作\t56276\n人像个小时\t56277\n非处方药\t56278\n北部地区\t56279\n自然体\t56280\n红凤美\t56281\n你不好你是猪你是猪你是猪你是猪我讨厌你我讨厌你我讨厌你我讨厌你\t56282\n无问题\t56283\n一年以后\t56284\n没有我不在\t56285\n超级大乐透\t56286\n丽尔\t56287\n小佩\t56288\n黄仲凯\t56289\njaGil\t56290\n露芯\t56291\n田烨\t56292\n夜郎街\t56293\n浅杠\t56294\n飘荡\t56295\n一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个一个\t56296\n萨科齐\t56297\nNO.7\t56298\nNO.6\t56299\nNO.5\t56300\nNO.4\t56301\nNO
.3\t56302\nNO.2\t56303\n平武\t56304\n采坊\t56305\nkgui\t56306\n快册了大像\t56307\n王图强\t56308\n一面之缘\t56309\n别谦虚\t56310\n罰站\t56311\nNO.9\t56312\nNO.8\t56313\n狗王\t56314\n营养素\t56315\n翻首歌\t56316\n安城\t56317\n一根经\t56318\n郑世诚\t56319\nhttpehiphotosbaiducomxiaodupicitemfc1f4134970a304e99556688d6c8a786c9175c49jpg\t56320\n禁闭\t56321\n洞蜜度秘\t56322\n徐昌环\t56323\n前十五分钟\t56324\n东皋\t56325\n商务仓\t56326\n脏痛\t56327\n哥弟\t56328\n试剂\t56329\nBig\t56330\n东皇\t56331\n酷狗播放器\t56332\n星美人\t56333\njingunix\t56334\nBiu\t56335\n从办\t56336\n摇摇摇到奈何桥万和华我怕我号堡\t56337\n聊了再见了\t56338\n七丽丽回来了了了七丽丽哗啦啦啦啦\t56339\n爆炸声\t56340\n说不告诉我\t56341\n自夸自擂\t56342\n徐哲晗\t56343\n贾子阳\t56344\ngogo\t56345\n度密公公\t56346\n影音机\t56347\n12度\t56348\n几打\t56349\n清泠\t56350\ncxgf\t56351\n亚尼\t56352\n留意\t56353\n哭嚎声\t56354\n满清\t56355\n唱打\t56356\n做广告\t56357\n布尔玛\t56358\nhh57\t56359\njdjeu\t56360\n乙乙\t56361\n更衣\t56362\n净土\t56363\n严卓雅\t56364\n123456789101112131415161718192052\t56365\n米尔得\t56366\n亚尘\t56367\nangababy\t56368\n腊月初八\t56369\n20宗\t56370\n梁若桐\t56371\n陈花圈\t56372\n两种一种\t56373\ntrth\t56374\ncrvf\t56375\n贱狗\t56376\n洞次\t56377\n好久违啊\t56378\n搞机\t56379\n笑肉\t56380\ngohjbgctfyvll\t56381\n度秘度秘度秘多咪多咪多咪多咪\t56382\n加拉\t56383\n主人公司\t56384\n份你息道\t56385\n夜市\t56386\n董大\t56387\n好冷冷\t56388\n路边侧\t56389\n致富\t56390\n喔早安\t56391\n甘耐\t56392\n小体\t56393\n湖北省\t56394\n齐头并进\t56395\n潘红玫\t56396\n保本\t56397\n齐齐\t56398\n38曾\t56399\n玛哈嘎拉盛大法会\t56400\n949967673764949\t56401\n喂喂喂喂喂喂喂喂喂喂喂喂喂喂喂\t56402\n沙普尔\t56403\ncomelam\t56404\n瓜姓子\t56405\n为民\t56406\n六合区\t56407\n喊骂\t56408\n佰盛店\t56409\n主罚\t56410\n玛卡巴卡\t56411\n右面\t56412\n郑兆宇\t56413\n度秘求图反正\t56414\n轿跑\t56415\n抢劫罪\t56416\n窘局\t56417\n问寒问暖\t56418\n低端\t56419\n蔡赟\t56420\n来呀开始\t56421\n男赡\t56422\n逗呵呵\t56423\n爱丽舍\t56424\n喂喂喂狗么宁么宁\t56425\nwnrrgrf\t56426\n刘雷\t56427\n冠太\t56428\n国产片\t56429\n一万瓦\t56430\n因材施教\t56431\n够了我没有你在\t56432\n情花\t56433\n董凯磊\t56434\n︿ノ\t56435\n刘雯\t56436\n靠近你天\t56437\n刘雨\t56438\n036349\t56439\n伟哥\t56440\n度灭\t56441\n申明\t56442\n饭到了我问问你问题\t56443\n啰\t56444\n草青\t56445\n纯碱\t56446\n上善若\t56447\n第二件\t56448\n物体\t56449\n啲\t56450\n第二份\t56451\n加图索\t5
6452\n波尔多，西亚尼\t56453\n849156262655126162156795856625951651659626\t56454\n二十分之一\t56455\n理头\t56456\n报效\t56457\n老鼠牙\t56458\n36多\t56459\n已知商\t56460\n想废话\t56461\n小黄狗\t56462\n爱人类\t56463\n随手的帅\t56464\n山下酒店\t56465\n98.7%\t56466\n66666666\t56467\n萧瑟\t56468\nu6bbb\t56469\n姜耀阳\t56470\n55555555555555555555555555555555555555555555\t56471\n啊喜宝\t56472\n十六岁的孩\t56473\n鬼片神\t56474\n115.79\t56475\n薄板\t56476\n别q\t56477\nnikenike\t56478\n當然\t56479\n不胜其烦\t56480\n理夏\t56481\nletIT\t56482\n屏山站\t56483\n阿宾\t56484\n36天\t56485\n犬儒主义\t56486\n乖你个头乖呀\t56487\n芬儿\t56488\n5月31\t56489\n黑老子\t56490\n狭义\t56491\n超攰\t56492\n耐热玻璃\t56493\n英雄哥\t56494\ngtuvt\t56495\n死油\t56496\n超支\t56497\n盛明集团\t56498\n高晓松\t56499\n花旗\t56500\n草帽\t56501\n同性恋者\t56502\n邓一旦\t56503\n寸草不生\t56504\n花旦\t56505\n1595656\t56506\n拖板\t56507\n紫薇花avc\t56508\n诳语\t56509\n贵族化\t56510\n11.11%\t56511\n女篮\t56512\n20040448号\t56513\n递送\t56514\n唉悲哀天一天阿到书\t56515\n群花\t56516\n言钦\t56517\n啊腿\t56518\n灾难疯病\t56519\n10点21分\t56520\n蹲定\t56521\n痴人\t56522\n恩可\t56523\n对对对对骚瑞骚瑞\t56524\n铺\t56525\n恩发\t56526\n每一个月\t56527\n4碗\t56528\n三百\t56529\nfusion\t56530\n永远永远永远不会\t56531\n番外\t56532\n巴蒂尔\t56533\n13426435567\t56534\n星辰在线\t56535\n下哈\t56536\n趋\t56537\n周四四\t56538\n血伞\t56539\n去年4月份\t56540\n真人性化\t56541\njskk\t56542\n零二二九六二零八\t56543\n海淀桥\t56544\n挨临\t56545\n不衰林林总总米老老林林\t56546\njsks\t56547\n偏高\t56548\n欺诸\t56549\njskx\t56550\n5482525999488800005552369622\t56551\n苏州乐园\t56552\n明天宁德柘荣\t56553\nfurrr\t56554\n杀人不犯法\t56555\n王宝滨\t56556\n张现杰\t56557\nbelly\t56558\n政府机关\t56559\n多少年\t56560\n报数\t56561\n忍受\t56562\n不碎不困\t56563\n13336332176\t56564\n000719\t56565\n黑客大战\t56566\n胡仔\t56567\n随身空间穿越文\t56568\n梅可露\t56569\nF35\t56570\n东北女孩\t56571\n势不两立\t56572\n六条腿\t56573\n哑舍\t56574\n一株一只\t56575\n七三点\t56576\n半一六百二十五\t56577\n库柏\t56578\n剑法\t56579\n戴文欣\t56580\n巫羽蒙\t56581\n宋城\t56582\n有创意丑\t56583\n科罗拉多\t56584\n看见你了我讨厌\t56585\n刘浩辰\t56586\n胡令\t56587\n我等到花儿也谢了\t56588\n第39届\t56589\nHZ\t56590\n小豆\t56591\n坏蛋机器人\t56592\n三好好\t56593\nHR\t56594\nzirsizo\t56595\nHP\t56596\nHV\t56597\nHW\t56598\nHJ\t56599\nHK\t56600\n文东西\t5
6601\nHI\t56602\n换队\t56603\n560克\t56604\n就是谁\t56605\nHB\t56606\nHC\t56607\n星星粉\t56608\nHF\t56609\nHG\t56610\nHD\t56611\nHE\t56612\ncnv\t56613\n风然\t56614\nHx\t56615\ncnu\t56616\ncns\t56617\n小象\t56618\n克里克洛夕\t56619\n一天一夜\t56620\n17844292\t56621\nHv\t56622\n五六十二八二一五\t56623\nHt\t56624\ncny\t56625\nHj\t56626\ncng\t56627\nHh\t56628\n三角秘\t56629\ncnb\t56630\n一臭不要脸你\t56631\nHl\t56632\nHb\t56633\nHc\t56634\n张喜昕\t56635\ncnm\t56636\nHf\t56637\ncnk\t56638\n小豹\t56639\nHe\t56640\n忠义\t56641\n停歇\t56642\n怕黑\t56643\n吕海燕\t56644\n不不不不不不不回回回回回答答答我的话\t56645\n分销\t56646\n牛永奥\t56647\n不辞不想\t56648\n从来不曾\t56649\n整合式\t56650\n魔法类\t56651\n度秘你真的爱我\t56652\n陆海东\t56653\n诺馨\t56654\n这玩意\t56655\n谷壳\t56656\n停止\t56657\n一学期\t56658\n感情男\t56659\n林太可\t56660\n85856953\t56661\n萌萌哒么么哒胖嘟嘟又帅又美的度秘\t56662\n哈飞股份\t56663\n找我不能\t56664\n好我真\t56665\nefforts\t56666\n题我不会\t56667\n嵊州\t56668\n毕节七中\t56669\n搞不搞基\t56670\n真爽\t56671\n聚一聚\t56672\n你是谁啊你是谁呀你是谁\t56673\n竭力\t56674\n知棒\t56675\nbyobaby\t56676\nghan\t56677\n真爱\t56678\n三周年\t56679\n漂伯在\t56680\n鸡头\t56681\n中老年\t56682\n大Boos\t56683\n太上纲\t56684\n谷歌有你多知识多\t56685\n洗胃\t56686\n滴机\t56687\n360个\t56688\nfidgxguxgfhdyshyrgrgdydhdhdvyeeuhrhudeyrvegeegrhdgsggdhdhdhgdhjjgdjhdgdgdhyehdhdhhdhu\t56689\n无翼鸟\t56690\n温州南站信号机械室\t56691\n打呼噜\t56692\n零九零六\t56693\nokokoka\t56694\n18180粒\t56695\n额曲\t56696\n一个78\t56697\n归并\t56698\n货儿\t56699\n青春荷尔蒙\t56700\n班招\t56701\n丰县\t56702\n心血\t56703\nuvkfjxj\t56704\n苏尼恩\t56705\n花王\t56706\n合法化\t56707\n震慑\t56708\n緣故\t56709\n山楂树\t56710\n适航证\t56711\n北中环\t56712\n诛仙青云志的女主\t56713\n周心一\t56714\n荡起\t56715\n定语翠\t56716\nGfucehwidyc\t56717\n丰原\t56718\n听堡\t56719\n苏铃\t56720\n前和\t56721\n何家丽\t56722\n四成六\t56723\n咬手\t56724\n我讨厌你我非长讨厌你\t56725\n诞生物体\t56726\n十九二十二日22时\t56727\n深航\t56728\n张小涵\t56729\n情浪子柔情浪子\t56730\n炒股吧\t56731\n初潮\t56732\nc罩\t56733\n共同语言\t56734\n四十八号\t56735\n来了不起\t56736\n叫破\t56737\n古柏\t56738\nwebos\t56739\n空招\t56740\n8888个\t56741\nffuufgjjsv\t56742\n说入\t56743\n错错错错错错错\t56744\n挺身而出去走\t56745\n猫星人\t56746\n小度秘别给我\t56747\n心衰\t56748\n大萧条\t56749\n喵了咪\t56750\n反弹所\t56751\n点点滴滴屌\t56
752\ns6l\t56753\n多管闲事\t56754\n学来呀11\t56755\n喝彩\t56756\nzzzzzzzz\t56757\n惨无人道\t56758\n一百一百一十一\t56759\nZUARUAR\t56760\n林雄华\t56761\n六十个\t56762\n王仕强\t56763\n快一\t56764\n复仇者\t56765\n网球\t56766\nkome\t56767\n米钱\t56768\nhhjjjj\t56769\nhgvgu\t56770\n骗书\t56771\n算了再见\t56772\n杂写\t56773\n球票\t56774\n知青\t56775\n朱春祥\t56776\n12粒\t56777\n网页片\t56778\n谜宫\t56779\n阚珊\t56780\n传染性\t56781\n4096四七\t56782\n鞠萍\t56783\n948257\t56784\n八百五\t56785\n許多\t56786\n笑猫日记之类\t56787\n桃树涛\t56788\n与会\t56789\n二伯家豪\t56790\n本能反应\t56791\n心的心\t56792\n张雨涵\t56793\n俞敏洪\t56794\n练习曲\t56795\n卖得\t56796\n青啤发展\t56797\n雷米\t56798\n大周\t56799\njjghfhgvhcxvxxndjfhdufgfjfg\t56800\n3万块\t56801\n以着\t56802\n1352049084257354251439\t56803\n刚来京\t56804\n真缘分\t56805\n量化\t56806\n八十多兆\t56807\n廖川江\t56808\n8月5日22点\t56809\n电子科技大学\t56810\n它希\t56811\nnobel和平奖\t56812\n手法\t56813\n砍伐证\t56814\n风湿\t56815\n叫偶\t56816\n54321\t56817\n卡泊尼\t56818\ngg思密达\t56819\n周鹏\t56820\n血型金像奖\t56821\n20041\t56822\n77826\t56823\n真的好困\t56824\n22540.33万元\t56825\n素猫酱\t56826\n4000299824\t56827\n美娟\t56828\n美娜\t56829\n重于山\t56830\n洗衣液\t56831\n灰藏\t56832\n美娇\t56833\n卡菜\t56834\n刘奕灵\t56835\n叫做\t56836\n叫停\t56837\n万桑\t56838\n夜之殇\t56839\n3248岁\t56840\n轻松文\t56841\n杨洋红\t56842\nupjt\t56843\n一元醇\t56844\n景观\t56845\n护法\t56846\n生死狙击号儿\t56847\n莫月\t56848\n11556654433221665544311556654433221221\t56849\n野格\t56850\n退休年龄\t56851\n给货\t56852\n铣\t56853\n544334\t56854\n几年级段\t56855\n职称\t56856\n月亮花\t56857\n哈哈哈哈哈感谢你感谢你感谢你感谢你感谢我的好度秘感谢我的好好好好好好度秘\t56858\n今天凌晨5点\t56859\n武孟如\t56860\nremain\t56861\nFederico\t56862\n军情\t56863\n袁秘\t56864\n插头儿\t56865\n信息心\t56866\n你好你好你好你好不了\t56867\n睡记得\t56868\n67777\t56869\n二二五零\t56870\n衣见钟情\t56871\n捉摸\t56872\n调考\t56873\n也好\t56874\n武汉大学青年传媒集团\t56875\n记录片\t56876\n欧阳兄\t56877\n王佳轩\t56878\n775825\t56879\n给你的你给我你对位对味\t56880\n1一30\t56881\n密为你\t56882\nォクカイオォゥガザジタゼザシセズナヌトツヂッデドダ\t56883\nAABC式\t56884\n哈克娅\t56885\n周氏i\t56886\n桀骜\t56887\nXxxxxxxxxxxxxxxxxxxxxxxxxxxxx\t56888\n欧阳全\t56889\n小来哥\t56890\n机器人和\t56891\n明日标\t56892\n吴征镒\t56893\n刮风\t56894\n68群\t56895\n首地大峡谷\t56896\n456285357159\t56897\n白绳
网\t56898\n好假假\t56899\n愛心\t56900\nwgatjumpatm\t56901\n民风\t56902\n不死族鱼头\t56903\n一站来一剪梅\t56904\n见死不救\t56905\n无伤大雅\t56906\n东风神娃不是封神娃\t56907\n到期后\t56908\nKDI\t56909\n不得你\t56910\nrise1yle\t56911\n说不亏\t56912\n最时\t56913\n16828497\t56914\n龚宇萍\t56915\n最早\t56916\n桂花小区\t56917\nmczimin\t56918\nhhawshbh海忻岩\t56919\n六月七八九十十一十二八九二时\t56920\n薛雅卓\t56921\n四色位\t56922\n王越\t56923\n555588525456825\t56924\n曾小贱\t56925\n猪家\t56926\n宽增\t56927\n鬼故事儿\t56928\n张三丰\t56929\n第二段\t56930\nhi明天下\t56931\n无解器\t56932\n王超\t56933\n安吉市\t56934\n性行为\t56935\n恐龙岛\t56936\nZHAN\t56937\n领养\t56938\n兴衰\t56939\n临死前\t56940\nn8809888508\t56941\n骨面\t56942\n十九千九块\t56943\n廖福煜\t56944\n星系\t56945\n机智能机器好我叫你萌雪儿\t56946\n风餐露宿\t56947\n色魔\t56948\n有招\t56949\n百人\t56950\n节制冷\t56951\n百亿\t56952\n一百八十岁\t56953\n有拍\t56954\nggfxcvyfccvhyffcgygcvhyfbh\t56955\n白灵\t56956\n靠真是\t56957\n百事\t56958\n吃哈哈哈\t56959\n龙山乡\t56960\n二幺零二\t56961\n西山区\t56962\n朝阳公园\t56963\n不能度\t56964\n满城\t56965\n对呀我们是好我吧那你\t56966\n听话啊宝贝儿\t56967\n高新科技\t56968\nzonesyfxaxxdwhxdwrixah\t56969\n百五\t56970\n睡一觉先\t56971\n兰山区\t56972\n事由\t56973\n敌人儿\t56974\n言语气\t56975\n2010年12月15日\t56976\n井底\t56977\n五百五百兆\t56978\n上课时\t56979\n工作餐\t56980\n冯可\t56981\n蛋羹\t56982\n和你的记录摘录\t56983\n井度\t56984\n成段\t56985\n乌尔寒风\t56986\nGenovese\t56987\n一百辆\t56988\n中队长\t56989\n晚8点\t56990\n东阳城\t56991\n身体健康\t56992\n背离\t56993\n下一个你\t56994\n片头\t56995\n熊真\t56996\n地方场\t56997\nregreghtegr\t56998\n致命性\t56999\n杯底\t57000\n嘉萱\t57001\n李妍熙\t57002\n度秘我不子好同你相公了问你\t57003\n学系\t57004\n三厶\t57005\n水晶翅\t57006\n水风\t57007\n一眼前\t57008\n福鼎人\t57009\n出访\t57010\ngagaj1\t57011\n王宝强\t57012\n众不同\t57013\n新联康\t57014\n一谋\t57015\n呀呀呀呀呀呀\t57016\n新秘\t57017\n乐高幻影忍者四头龙\t57018\n猖獗\t57019\n姑偶姑\t57020\nicome\t57021\n一调\t57022\nKolelinia\t57023\n见过你的说\t57024\n稳居\t57025\n绿魔仙\t57026\n八点半\t57027\n千五百元\t57028\n捣碎\t57029\n五本三村\t57030\n甲数少四十甲数\t57031\n击\t57032\n等词\t57033\n吴明\t57034\n滑倒\t57035\n幻想大王奇遇记\t57036\nhf5\t57037\n苕逼\t57038\n耽美网游文\t57039\n凹\t57040\n别滥\t57041\n虱\t57042\n凸\t57043\n？岁\t57044\n双园ueahove首\t57045\njeks\t57046\n凿\t57047\n禁毒\t57048\n非科学\t57049\n虽\t57
050\n虾\t57051\n虿\t57052\n江小华\t57053\n熙颖\t57054\n吴是\t57055\n哎呀大冒险二的问题我是\t57056\n倾城一小为红颜\t57057\n以身殉剑\t57058\n僚哥\t57059\n王北区\t57060\n虋\t57061\n下游\t57062\n吴春\t57063\n做坏\t57064\n虐\t57065\n凳\t57066\n虒\t57067\n记号\t57068\n下渡\t57069\n干合者\t57070\n韩冕峰\t57071\n呃耳宫\t57072\n整蒙圈\t57073\n真武\t57074\n虚\t57075\n笨妻\t57076\n真正\t57077\n班船\t57078\n號\t57079\n党员\t57080\n四十二兆\t57081\n若茅\t57082\n青色\t57083\n帽子\t57084\nx3330\t57085\n今以后\t57086\n你谁\t57087\n凶\t57088\n对抢\t57089\n苹果浏览器\t57090\n这个人类\t57091\n神马甲状腺结节\t57092\nstit\t57093\n亚军\t57094\n李蓉\t57095\n一起去过\t57096\n傷風波越燒\t57097\n視廊\t57098\n亚冠\t57099\n西格里拉萨\t57100\n62b0点\t57101\n对不对呀\t57102\n撒撒\t57103\n这一点儿\t57104\n121178点\t57105\n美国西部\t57106\n折尽\t57107\n五花大绑\t57108\n你在哪里啊的盖亚日\t57109\n不醉不归\t57110\n拉不有\t57111\n冤冤相报几时休\t57112\n更年期\t57113\n由有\t57114\n谭小良\t57115\n13919000598\t57116\n太人性化\t57117\n這麼說\t57118\nxyi\t57119\n武汉火车站\t57120\n把柄\t57121\n可乐公司\t57122\n对抱\t57123\n十五条\t57124\n4060678207822\t57125\n普顿斯乐队\t57126\n摩根大通首席投资办公室\t57127\n二百五十九\t57128\n猪点球\t57129\n亲爱的我想见见你\t57130\n开心不开心一高手个\t57131\n今天一号\t57132\nxyo\t57133\n被绑\t57134\n脑筋急转弯考考\t57135\n眸睃\t57136\n水念\t57137\n那条路\t57138\n600赞\t57139\n吴魏梦瑶\t57140\n炼狱苦海\t57141\n幸平创\t57142\n色板\t57143\n宾ouea\t57144\n税入\t57145\n陈子陈\t57146\n邦邦\t57147\n2777968460\t57148\n510个\t57149\n竹笋\t57150\n高空\t57151\n8162881\t57152\n18888888888888640\t57153\n只蚊\t57154\n大妮妮\t57155\n大球\t57156\n999cjj\t57157\n观本\t57158\n撒噶\t57159\n刘雨欣\t57160\n口若悬河\t57161\n123456722345671323467142345675234567923456734567\t57162\n郭德刚\t57163\n一百一十块\t57164\n510下\t57165\n蔡涛杨\t57166\n垐\t57167\nhi喽咪\t57168\nｅｓｏｂ\t57169\nKsmald\t57170\n真情流露\t57171\n大源大源大源大源\t57172\n马蔚华\t57173\n撞破\t57174\n嗯22\t57175\n这些片\t57176\n二春\t57177\nx瘦脸\t57178\n插画\t57179\n吊女\t57180\nhuame\t57181\n15763979796\t57182\n13965816\t57183\ngeegeegeegee\t57184\ncannot\t57185\n富沈阳文富神好的好神232323\t57186\n恨绝交\t57187\n中国化学制药公司\t57188\n握脚\t57189\n致远去的青春岁月\t57190\n卡慕咖啡感恩父亲节\t57191\n全国社保基金理事会\t57192\n91087\t57193\nsjxn\t57194\n散发\t57195\n7a854\t57196\n糗样儿\t57197\n分析法\t57198\n2009年4月20日\t57199\n刘子墨\
t57200\nyryfb\t57201\n天涯何处无芳草，万紫千红总是春\t57202\n朴通猫\t57203\n阿Q\t57204\n鹏哥\t57205\n在等\t57206\n三零二零二\t57207\noZ77\t57208\n74￥\t57209\n调酒师\t57210\n点儿点儿\t57211\n5455545\t57212\n两三天多则\t57213\n绸子\t57214\nNIKE\t57215\n不准备\t57216\n卢品凡\t57217\n11哦z1\t57218\n妓者\t57219\n金湾西苑\t57220\n弄错\t57221\n阿q\t57222\n我是你的主人你给我学猫叫\t57223\n大门口\t57224\nｇｏ\t57225\n天一家哪那你的爸爸妈妈\t57226\ndo0gchkllfssaaassf1erllleytwCGHHHMMj\t57227\n知道然并卵\t57228\n戏剧性\t57229\n暗点\t57230\n刘天宇\t57231\n两极\t57232\n便道\t57233\n魏彦茹\t57234\n软件\t57235\n蔡晓峰\t57236\n爸爸吧兄弟有第四季\t57237\n议案\t57238\n横飞\t57239\n丆冬\t57240\n同行者\t57241\n乃是\t57242\n咔哇\t57243\n高富帅\t57244\nhfh\t57245\n利益集团\t57246\n乃春\t57247\n给顿\t57248\n早美\t57249\n工作人员们\t57250\n熊波\t57251\n安宝贝\t57252\n小少年们\t57253\nhkv\t57254\n好啦好啦我错\t57255\nlpwhdjep\t57256\nzhgghd\t57257\n持枪\t57258\n不额\t57259\n同行餐\t57260\n未经\t57261\n好丑有你好丑\t57262\nhxkbj\t57263\nvbbn\t57264\n48558895554085888\t57265\n下装\t57266\n十一年后\t57267\n有我喜欢\t57268\n3228525721\t57269\n大升龙\t57270\nnet\t57271\n京巴\t57272\n180页\t57273\n别骗我了你\t57274\nPhotoshop\t57275\n给我狗\t57276\n不胜枚举\t57277\n多棒\t57278\n袁智敏\t57279\n一天行\t57280\n一阵子\t57281\n下裤\t57282\n贾乃商\t57283\nAudrey\t57284\n八九点钟\t57285\n72岁\t57286\n生不灭不垢\t57287\n弹药\t57288\n坏注意\t57289\n第四期\t57290\n眼馋\t57291\n骨粉\t57292\n好丽友\t57293\n霍金爱因斯坦\t57294\n武汉协和医院\t57295\nhku\t57296\n狂晕狂晕狂狂晕\t57297\n作业主任何人\t57298\n眼力\t57299\n私父\t57300\n量子弟\t57301\nwoow\t57302\n握手\t57303\nGQuPP\t57304\n百分之五十五六\t57305\nkxxbs\t57306\n刘元\t57307\n结票\t57308\n9月7日\t57309\n施向锦\t57310\n265655726955525523333228855\t57311\n我的我的女神\t57312\n韦小姐\t57313\n我爱死你还个度秘\t57314\nwoom\t57315\n猎人追来了徒儿徒儿快捷让我来手把手yo\t57316\nwook\t57317\n香港北角码头\t57318\n黑穷挫\t57319\ntock\t57320\n拙胜巧\t57321\ndyx\t57322\ndyy\t57323\n传漾\t57324\n晋钢东院\t57325\n开场白\t57326\n堡乡\t57327\ndyv\t57328\ntoca\t57329\n涌泉\t57330\ndyq\t57331\n杨子\t57332\n1000多亿\t57333\ndyn\t57334\ndyo\t57335\ndyi\t57336\ndyj\t57337\ndyd\t57338\ndyf\t57339\ndyg\t57340\ndyb\t57341\ndyc\t57342\n神经医生\t57343\n1358589653764649956736846438386956439794346918187279919166465\t57344\n好的我一定要梦到你一定要到我的梦\
t57345\n上我爱\t57346\n垮\t57347\n轩轩\t57348\n2568867621\t57349\n半死不活\t57350\n同在\t57351\n大中小\t57352\n归源\t57353\n嗯2p\t57354\n李超益\t57355\n不说的话\t57356\n巨训\t57357\n庸置疑\t57358\n坏了玩\t57359\n20010423\t57360\n派派\t57361\n姜汤\t57362\n唱逆战\t57363\n演技派\t57364\n封盖\t57365\n累计\t57366\n高嘲\t57367\n陆一\t57368\n哈楼哈楼你好你好哈楼卑鄙angelababy秘\t57369\n13937\t57370\ndhmhekge\t57371\n张芳志\t57372\n欧尼快十五\t57373\n陆丰\t57374\n站立\t57375\npayspecialattention\t57376\nbabybris\t57377\n新片\t57378\n15976345380\t57379\n布拉泥\t57380\n张棣华\t57381\n嗲嗲嗲嗲嗲嗲嗲嗲嗲嗲嗲嗲嗲嗲\t57382\n起玩笑下\t57383\n新版\t57384\nnimmmnnnn\t57385\n比熊长\t57386\n愉悦\t57387\n猪猪侠之\t57388\n孟建平\t57389\n韩小毛\t57390\n2.5米\t57391\n不要脸不要脸不要脸不要脸\t57392\n用得着\t57393\n20uckyn\t57394\n1.235亿元\t57395\n召回\t57396\n太焦\t57397\n好吧我是说科学科学分数二十分就及格四十分米\t57398\n193几\t57399\n踏过\t57400\nLED直播\t57401\n三单元号\t57402\n4.5吨\t57403\n真白\t57404\n弱校\t57405\n节能灯\t57406\n阑尾\t57407\nefffyg\t57408\n爱帮人\t57409\nhgrrexcrrdhttex韩国个热饿饿额度\t57410\n90天\t57411\n微客族\t57412\n林昱帆\t57413\n詹小\t57414\n双城记\t57415\n邹函芮\t57416\n邓稼先\t57417\n不能不及格\t57418\n九幽密\t57419\n店长\t57420\n余子涵\t57421\n泡泡卡丁车\t57422\n舒言\t57423\n一四假\t57424\n说理\t57425\n巴里巴拉\t57426\n委婉\t57427\n怏快乐\t57428\n哎呀一支笔x元\t57429\n摸黑\t57430\n坏血病\t57431\n心肌梗塞\t57432\n90多\t57433\n楼堡\t57434\n说不好说\t57435\n微操\t57436\n聪机器人\t57437\n31万元\t57438\nindly\t57439\n李舒雅\t57440\n疯狗儿\t57441\n88181045\t57442\n公寓房\t57443\n吃没\t57444\n黎食粥\t57445\n度秘堡亭\t57446\n混水摸鱼\t57447\n失言\t57448\n3.5%\t57449\n哪个头\t57450\n篮球鞋\t57451\n加油加油我的好度秘\t57452\n春秋大义\t57453\n月胎\t57454\n商登高速\t57455\n238225\t57456\n惠羽\t57457\n尼煤\t57458\n星聚吧\t57459\n张文燕\t57460\n小兄弟\t57461\n喝香飘飘\t57462\n貌美如花\t57463\n度秘帅\t57464\n上调到\t57465\n携程网\t57466\n多钟\t57467\n梁少燕\t57468\ncompanimerian\t57469\n零幺九零\t57470\n老朽\t57471\n夭亡\t57472\n齐美\t57473\n不不不我是\t57474\n芒果酸奶\t57475\nhgffcccc\t57476\n赵妖姬\t57477\n01006010062\t57478\n逗逗逗逗咪咪\t57479\n坏人我的大我上三年级你上幼儿园\t57480\nCamryn\t57481\n不及格好\t57482\n九点岁\t57483\n贵州大学\t57484\n缩小管\t57485\n严歌苓\t57486\n结束吧\t57487\n谢家华\t57488\n嗲怕\t57489\n扭矩天天册子\t57490\n潛\t57491\n考核\t57492\n齐奇亚\t57493\n提拔\t57494\n无口中有\t57
495\n十字\t57496\n奉子成婚\t57497\n黑子的篮球\t57498\n虚服\t57499\n动漫产业\t57500\n３ｄ\t57501\nxiongzhao\t57502\n轧死\t57503\n提拉\t57504\nyrrrvk\t57505\nMacaron\t57506\n刘莉莉\t57507\nsggbkll\t57508\n土坷垃lol\t57509\n零八六八幺\t57510\n傻权者\t57511\n偏见\t57512\n一到底是\t57513\n秘哥白尼\t57514\n艰难\t57515\n瓦\t57516\n芽接\t57517\n苏和\t57518\n505050\t57519\n七条\t57520\n你好不好玩\t57521\n热伴\t57522\n逆袭文\t57523\nbcgfgfudheieiowidhehhxbcnxnbvxhbhhusjdjdo\t57524\n导演\t57525\n累散架\t57526\n187159882555\t57527\n狂笑\t57528\nfdgfgdfd\t57529\n君能已\t57530\nQWETUIPADGHLZXVBNM\t57531\nhchhchj\t57532\n一道把\t57533\n铁木真\t57534\n巨无霸\t57535\n错了错了是一片冰心在玉壶\t57536\nolym\t57537\n汇报\t57538\n中巴\t57539\n自救\t57540\nblog\t57541\nblof\t57542\n周杰凯\t57543\n60厘米\t57544\n爸宝\t57545\n觉觉觉觉\t57546\nGOT7\t57547\n主频\t57548\n塔瓦库勒\t57549\n看笑\t57550\n一半多两米\t57551\n香谷\t57552\n囧丁\t57553\n没有女\t57554\n董事会\t57555\n乖里\t57556\n自明\t57557\n干干活\t57558\n迫不得已\t57559\n李毅杰\t57560\n啦啦啦徳玛西亚\t57561\n小模\t57562\n帮凶\t57563\n神州九号\t57564\n发黄\t57565\n小樱\t57566\n闪亮晶晶满天\t57567\n走开我知道\t57568\n安县统战部\t57569\n剔牙00h\t57570\n首尊\t57571\n四川省\t57572\n南鹏\t57573\n机械手\t57574\n首尔\t57575\nQwertyuioasdfghjklzxcvbnmmnbvcxzlkjhgfdsapoiuytreewqqwedghhnkkjjhgvbjkk\t57576\n嗯吾\t57577\n选房\t57578\ngvj\t57579\n120余\t57580\n数第34个\t57581\nevdbjtjrkrkfm\t57582\nxing\t57583\n蔚萱\t57584\nxind\t57585\ngvb\t57586\ngvc\t57587\n东江村\t57588\ngvf\t57589\ngvg\t57590\ngvd\t57591\n怒砸\t57592\ngvz\t57593\n周晶晶\t57594\ngvx\t57595\n压垮\t57596\ngvr\t57597\ngvs\t57598\ngvv\t57599\n不管你了我\t57600\nwubs\t57601\nparatest\t57602\n第四集\t57603\nGhan\t57604\n蒋中一\t57605\n1244877764967965575746684566646442451587468733455565521346491142424427\t57606\nwuba\t57607\nncnnxnnxnznsjnsjrmnfndhxncjjvjfjfjfjkfjcmfkrkdkdkd\t57608\n嗯三\t57609\n13043726802\t57610\n中国电信\t57611\n绿化带\t57612\n盘古开天地\t57613\n电视史\t57614\n电视台\t57615\n教育基地\t57616\n7878748\t57617\n杨洪一\t57618\n张天津\t57619\n倮体艺\t57620\n粽子节\t57621\n西音\t57622\n福喜\t57623\n第四零\t57624\n雅达\t57625\nhttpg18gdlneteasecomg18neteasenetease7gwancpsdev1550apk\t57626\n有阵雨\t57627\nЗ爻\t57628\n900万\t57629\n翻翻\t57630\n傻逼者\t
57631\n画皮\t57632\n要电\t57633\n哈啦\t57634\n老演\t57635\n我没有男闺蜜\t57636\n望都\t57637\n吴迪\t57638\n00000000000000000\t57639\n22515177887763\t57640\n大大大大大大大\t57641\n还给我挺\t57642\n小海兄\t57643\n动手术\t57644\nXDHZB\t57645\n欧耶欧耶欧耶欧耶\t57646\nhytgy\t57647\n加难\t57648\n冷艳香\t57649\n别瑞哥\t57650\n那样\t57651\nkiyggg\t57652\n别急回家\t57653\n平分线am\t57654\n就是我的名字\t57655\n面颊蟹黄\t57656\n携手\t57657\n刘叔叔\t57658\n鸣鸣\t57659\n亚麻带\t57660\n单家独户\t57661\n零四零九\t57662\n满婷\t57663\n鸣鸟\t57664\n艺术感\t57665\n毕业照\t57666\n星照\t57667\n张建强\t57668\n十二二十一岁\t57669\n小刘桃桃\t57670\n8544855866993699999999425855618661223345528458999\t57671\n你的爱情史\t57672\n金鼎园\t57673\n弄清错\t57674\n一沓毛\t57675\n退隐\t57676\n表客气\t57677\n疯狂的收图翻\t57678\n笑盈盈\t57679\n度秘我感觉你好累\t57680\n胡天仁\t57681\n憋不住\t57682\n每一秒钟\t57683\n十六十五十四十三十二十一十九八七六五四三二一零\t57684\nYFTF\t57685\n上清寺\t57686\n小马宝莉\t57687\n分房\t57688\nigv\t57689\n奥诺\t57690\n一艘船\t57691\nhgvccffx\t57692\n长衣\t57693\nrdbt\t57694\n琶洲\t57695\nigx\t57696\n艾雨婧\t57697\n我的朋友事\t57698\nigo\t57699\n擂台赛\t57700\nCooL\t57701\nigk\t57702\nigj\t57703\n四等\t57704\nigh\t57705\n预售票\t57706\n斗罗大陆\t57707\n阴经\t57708\n酸奶机\t57709\ndnjsoejcnfndjjdjdjdndjsjwjjsjdkdjdk\t57710\nCNG\t57711\nCNC\t57712\n俊杰\t57713\n明星犬\t57714\nCNN\t57715\n百脑汇派\t57716\n瓮城遗址\t57717\n乜灰心\t57718\n顾家\t57719\n林静竹\t57720\n数到手\t57721\n二百棵\t57722\n宫庭\t57723\nfddhjvg\t57724\n上海平安\t57725\nibaba\t57726\n过来行不\t57727\n青年宫\t57728\n告诉我准确数\t57729\n顾客\t57730\n贵姓\t57731\ncnnnd\t57732\n风度翩翩\t57733\n普段\t57734\nbaozhuzhefu\t57735\n翻瘸\t57736\n秘萌萌哒\t57737\nWSET\t57738\nYITH8\t57739\n得抱\t57740\n凡尔赛宫\t57741\n春熙路\t57742\n田小禾\t57743\n萌夏乐悠悠#\t57744\n澶渊盟城\t57745\n王大锤\t57746\n姜美丽\t57747\n普拉斯\t57748\n博罗\t57749\n夏一文\t57750\n哈拉瓦\t57751\nopqr\t57752\n9.7寸\t57753\n赛尔嗯嘞喜羊羊\t57754\n咋咋\t57755\n忘了你是\t57756\n睡踞式\t57757\n刘涵艺\t57758\nｉｐｈｏｎｅyou\t57759\n沟沟壑壑\t57760\n乌乌乌\t57761\n枫蓝小剧场\t57762\n电影木\t57763\n不不不不不我不是\t57764\n牛村\t57765\n电影本\t57766\n啵咕噜噜\t57767\nroeoleo\t57768\n洞蜜热\t57769\nmenatplay\t57770\n石7\t57771\n演播\t57772\n逆年\t57773\n度秘赛摩度秘\t57774\nuwg\t57775\n发笑话\t57776\nuwd\t57777\nxxbb\t57778\n伍老师\t57779\n往往往
往往往\t57780\n吐血度秘\t57781\n百强\t57782\n哦小小小小小小小小\t57783\n开心果\t57784\nstorityou\t57785\n雷敦\t57786\n政治制度\t57787\n凉飕飕学\t57788\n六六百块\t57789\n12星座性感指数\t57790\n龙舟\t57791\n二百多块\t57792\n古灵精怪\t57793\n天天象\t57794\nzhuang\t57795\n縮\t57796\n詹新\t57797\n2110722289\t57798\n好吊\t57799\n男迷人\t57800\n好同\t57801\n好名\t57802\n好吃\t57803\n七lV\t57804\n喔咪\t57805\nEOX\t57806\nhgggnsixmodhuxhybyubdubujdbxubdydbybdkqpqhshkzish\t57807\nchnd\t57808\n许仙白素贞\t57809\n好吵\t57810\njnivgn\t57811\n皮带\t57812\n专网\t57813\n縒\t57814\n福岛市\t57815\n啦啦啦我卡\t57816\n包雨\t57817\njkbc\t57818\n好吧\t57819\nyyyyyyyppiy\t57820\n着急用\t57821\n穿破\t57822\n释迦牟尼\t57823\n咿呀咿呀\t57824\n都江鱼咀飞沙堰\t57825\n才土\t57826\n下降低\t57827\n小秘书\t57828\n凯甲勇士\t57829\n张悦榕\t57830\n乖知道\t57831\njkjj\t57832\nyeyur\t57833\njkjn\t57834\n忍辱\t57835\n类比\t57836\n处女幻儿园\t57837\n蛋糕人\t57838\n红烧排骨\t57839\n六郎\t57840\n打龙袍\t57841\n带人\t57842\nQ妺\t57843\n运子\t57844\n黄飞鸿\t57845\n为辅\t57846\n弥度秘\t57847\n999万个\t57848\n假发票\t57849\n一张一百\t57850\n只片刻\t57851\n一两手\t57852\n度蜜\t57853\n李杰\t57854\nServe\t57855\n坏熊\t57856\n李来\t57857\n3298\t57858\n3299\t57859\n秘秘密\t57860\njrdbj\t57861\n李杨\t57862\n黄鸣\t57863\n马爽\t57864\n杰森斯坦森\t57865\n该部\t57866\n19点\t57867\n马爷\t57868\n总体\t57869\n李杜\t57870\n厘米计\t57871\n洗手盆\t57872\n水瓶男\t57873\n五目连珠\t57874\n过去掉泪\t57875\n八七零二二六二九\t57876\n想说拜拜\t57877\n一说\t57878\n王鼎\t57879\n八道哼\t57880\n别幸灾乐祸\t57881\n小魔小魔小魔小魔仙\t57882\n小飞流\t57883\n驱逐\t57884\ngjfyfghyjh\t57885\n八步\t57886\n全民宫\t57887\n谢　
乚娜\t57888\n别瞎掰\t57889\n邱成林\t57890\n一点一点\t57891\n国家文物局\t57892\n神蛊\t57893\n22天\t57894\n九浅一深\t57895\n美国政府\t57896\n善如流\t57897\n了不得\t57898\n稀巴\t57899\nrvidut\t57900\n立地成佛\t57901\n把盏\t57902\n拿下\t57903\n易冰寒\t57904\n臭屁虫\t57905\n646494463764\t57906\n开心不开心关我屁事\t57907\nvvy11\t57908\n西办\t57909\n晋亨\t57910\n蒙族\t57911\n快快快快快快快快快快\t57912\nDATD\t57913\nhttppinyincne1047\t57914\n怜爱\t57915\n好鼓\t57916\n四冠\t57917\n1846579\t57918\n腐门\t57919\n梁娇娇\t57920\n斌斌斌\t57921\n矬子\t57922\n紫凝\t57923\n好念想\t57924\n不不不不不不不我\t57925\n堡蛋\t57926\n李代沫\t57927\n绿城VS上海特莱士\t57928\n黑木耳\t57929\n张完云\t57930\n嗯二七\t57931\n鲁梅尼格\t57932\n机械人\t57933\n天球\t57934\n猛料\t57935\n张予曦\t57936\n283元\t57937\n4008208208820\t57938\n老罗曲\t57939\nCCAT\t57940\n告诉我我等\t57941\n泥头\t57942\n心鬼\t57943\n抬巴\t57944\nbyeTT\t57945\n说说出来\t57946\n还干事\t57947\n申报\t57948\n老幸福\t57949\n聊了答非所\t57950\n漂亮的眼\t57951\n京剧\t57952\n纷杂\t57953\n窝们\t57954\nPlayer\t57955\n割麦者\t57956\n必须承认\t57957\n臣妾\t57958\njigng\t57959\n范儿\t57960\n520467\t57961\n无话可聊\t57962\n就婚\t57963\nvbgr\t57964\ngwgcsuzjq\t57965\nl度秘\t57966\n老大老二老三老四老五老六老七老八老九老十\t57967\n接上车\t57968\n不粘\t57969\n绿茶婊\t57970\n佳旭\t57971\n哈通\t57972\n儿儿科\t57973\n宜信\t57974\n理论性\t57975\n呐萌度秘*罒\t57976\n安慕希里\t57977\n300多\t57978\n瓜乙\t57979\n世纪天乐\t57980\n度秘肯定\t57981\n滚粗滚粗滚\t57982\n不精\t57983\n三零号\t57984\n劳斯莱斯堡证\t57985\njuarrsb\t57986\n仙灵\t57987\n李蜀黍\t57988\n移植\t57989\n哈逼\t57990\n你好呀您好\t57991\njkbn\t57992\n市中心医院\t57993\n角状\t57994\n作不死\t57995\n红豆欢男\t57996\n2009年上半年\t57997\n闹不想和你聊天呀我想和你\t57998\n有人么\t57999\n奉养\t58000\n糸都糸\t58001\n辣薯片\t58002\n活期盈\t58003\n奉先\t58004\n88分\t58005\n漏电路\t58006\n隐帅\t58007\n7474\t58008\n6170173\t58009\n300余万\t58010\n滚刀肉\t58011\n〒_〒\t58012\n爱我可以\t58013\n123qwr\t58014\ngrewsf\t58015\n撒旦\t58016\n许国瑞\t58017\n本儿\t58018\n软油饼\t58019\n半成品\t58020\n烧身\t58021\n居心\t58022\n早上\t58023\n早下\t58024\n别别别别\t58025\n空足\t58026\n我的名字叫红\t58027\n没你可怜\t58028\nfgoteuo\t58029\n王敬伟\t58030\n沟女\t58031\n不负\t58032\n1兆日元\t58033\n远处\t58034\nWINDOWS唉奇缘\t58035\nn0nnT\t58036\n秤砣\t58037\n机器人\t58038\n杨小东东\t58039\n早个\t58040\n冯靖博\t58041\n绿蔓\t58042\nlold\
t58043\n杜子健\t58044\n骡子\t58045\n不败\t58046\n30台\t58047\n抗衰老我喜欢\t58048\n五千元\t58049\n死肥婆\t58050\n丽丽姐们么事儿\t58051\nJvcvtr\t58052\n情妇\t58053\n五千克\t58054\n雷霆雅\t58055\n远大\t58056\n不贵\t58057\n不贴\t58058\n摸脸\t58059\n名都\t58060\n2岁\t58061\n王晓鹏\t58062\n狗节\t58063\n42嘿嘿\t58064\n家小\t58065\n植物大战僵尸二失落之地\t58066\n不死倭\t58067\n119期\t58068\n朴一\t58069\n熊家祠堂湾\t58070\n235678\t58071\n北美洲\t58072\n最终点\t58073\n心理段\t58074\n谭艳\t58075\n多门\t58076\nlookat\t58077\n雷佳玲\t58078\n法学院\t58079\n烘烤\t58080\nfghuj\t58081\n晋江陈埭中队\t58082\n异木\t58083\n长头发\t58084\n玩雪\t58085\n孙吉康\t58086\n米切\t58087\nvdegftg\t58088\n135879\t58089\n一万1万1亿\t58090\n记名\t58091\n我不是你主人真是上梁不正下梁歪不知谁制造的你\t58092\n太乖乖乖\t58093\n洒\t58094\n易县赛\t58095\nmobile\t58096\n赵丽影\t58097\nfusufudu\t58098\n20年\t58099\nn222……929\t58100\n色氨酸\t58101\n三点六颗\t58102\n米米弋\t58103\n精选款\t58104\n裤长\t58105\n花儿与少年玩儿花哪儿呢少年手\t58106\n怎么了你告诉我告诉我不理你现在有理你了要不我不理你\t58107\n艾莉娜\t58108\nohfpudti\t58109\n美其名曰\t58110\n神经科\t58111\n小不点一度秘\t58112\n央视春晚\t58113\n一闪一闪一闪\t58114\n15926\t58115\n老公们\t58116\n刮倒\t58117\n小句\t58118\n天下首歌\t58119\n小口\t58120\n迈莓\t58121\n小叮\t58122\n小可\t58123\n迈克尔-乔丹\t58124\n吴雨欣\t58125\n小叫\t58126\n地球人\t58127\n未么\t58128\n小叶\t58129\n小号\t58130\n好又贷车\t58131\n小右\t58132\nノ｜壁\t58133\n小叽\t58134\n王怡然\t58135\n露露美妞\t58136\njegy\t58137\n飞影铠甲\t58138\n文秘\t58139\n187525255252\t58140\n营业税河道税文化建设税\t58141\n文科\t58142\n163692\t58143\n四年间\t58144\nHVF\t58145\n小叔\t58146\n白潭镇\t58147\n公正\t58148\n小受\t58149\n小发\t58150\n白百合女学院\t58151\n动了烦\t58152\nHVV\t58153\n惹事生\t58154\n傅\t58155\n美意思\t58156\n很迟\t58157\n很远\t58158\n傍\t58159\n古蒲州\t58160\n毛大婶\t58161\n秀逗比你好主人吻\t58162\nv3\t58163\n张思睿\t58164\n姜博洋\t58165\n656655\t58166\n行闲\t58167\n588866375\t58168\n凤把\t58169\n傢\t58170\n太托拉\t58171\n空間\t58172\n嗯明月\t58173\n完老\t58174\n傯\t58175\n储\t58176\n傩\t58177\n荣成市\t58178\n易洋\t58179\n高质量\t58180\n傲\t58181\n小月亮\t58182\n15小时后\t58183\n傻\t58184\n加藤胜\t58185\n湠\t58186\n二八三二\t58187\n洰\t58188\n生殖户\t58189\n怪不得意思\t58190\n清泉\t58191\n有份\t58192\n迈腾\t58193\n吉乐kk611\t58194\n湿\t58195\n有们\t58196\n性人\t58197\n皮建明\t58198\n40分钟\t58199\n清波\t58200\n米卡\
t58201\n米卢\t58202\n11742点\t58203\n湍\t58204\n丰满张\t58205\n八十五五多少\t58206\n湖\t58207\n有仇\t58208\n溜溜果图\t58209\n稿子\t58210\n一个10元\t58211\n湘\t58212\n真名果\t58213\n蜜汁\t58214\n性交\t58215\n九百十一十二十三十四十五个\t58216\nhi1\t58217\n美式咖啡拼配豆\t58218\n王梦尘\t58219\n132315454\t58220\n秦慧敏\t58221\n一尘不染\t58222\n肚鸡眼\t58223\n2月28号\t58224\n每步\t58225\n失恋爱\t58226\nForward\t58227\n派\t58228\n气曙\t58229\n曹云定\t58230\n周岁\t58231\n间接\t58232\n满不错\t58233\n信不过\t58234\n比人地卓\t58235\n60后\t58236\n恩么样\t58237\n再也不知道\t58238\n第11次\t58239\n看破\t58240\n13家\t58241\n一辆辆\t58242\nhip\t58243\n功德碑\t58244\nhis\t58245\nhit\t58246\n买马\t58247\nhiv\t58248\n真难\t58249\nhix\t58250\n5票\t58251\n付立宪\t58252\n用上学\t58253\nhia\t58254\nfamale\t58255\n金盘子\t58256\n桥下\t58257\nhif\t58258\n赵占阳\t58259\n度秘你好度秘你好度秘你好下次秘你好度秘好下次不美女哦吓死我没你好咪咪咪咪\t58260\nhii\t58261\nhij\t58262\nhik\t58263\n秘毛\t58264\n玉衡飞\t58265\nhin\t58266\nhio\t58267\n南街\t58268\n闫文雨\t58269\n领将\t58270\n33.73米\t58271\n两百多艘\t58272\n太好了你\t58273\n巨石\t58274\n张皇失\t58275\n多米钱\t58276\n蔡振宇\t58277\n全国人大常委\t58278\n打损\t58279\n说不见\t58280\n打捞\t58281\n真的不难\t58282\n咯了111111\t58283\n林紫艳\t58284\n内秀\t58285\n121个\t58286\n泛白\t58287\n铠甲了来了\t58288\n逗了逗\t58289\n称破\t58290\nyoityo\t58291\n戏码\t58292\nyayahaha\t58293\n猪猪猪猪猪猪猪\t58294\n一恶不恶心\t58295\n五十多年\t58296\n蒙抄\t58297\ntghf\t58298\n云梦\t58299\n翘翘儿\t58300\n洋离\t58301\n好片\t58302\ndostf\t58303\n王家岭\t58304\n徐宇涵\t58305\n没注意\t58306\n好牛\t58307\nIPHONE\t58308\n骑驴\t58309\n140块\t58310\n二零二二二九八四零九二七四四三零\t58311\nhyrgg\t58312\n势不可挡\t58313\n目不交睫\t58314\n四清导航\t58315\n领会\t58316\n阿红点\t58317\n我爱你看么雅蠛蝶呤\t58318\n买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买买\t58319\n1331323019\t58320\n湄洲湾\t58321\n割草\t58322\n喊停\t58323\n得一一\t58324\nEX铂睿\t58325\n功不独居\t58326\n嗯醉\t58327\n有点礼貌\t58328\n行你走\t58329\n弯子\t58330\n上纹\t58331\n睡觉吧晚安亲\t58332\n上线\t58333\n10869\t58334\nmkvhnnmm\t58335\n彩丽高\t58336\n五针\t58337\n董晓燕\t58338\n哎呦你真了不了不起\t58339\n强求你了聊\t58340\n尹庆福\t58341\n上级\t58342\n兔蜜\t58343\n93米\t58344\n51lu\t58345\n震精\t58346\n说了大\t58347\n当户\t58348\n爷爷爷\t58349\n内经\t58350\nSuperShow4\t58351\n山城\t58352\n嗯呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀
\t58353\nwhoI\t58354\n杰迷\t58355\n目击证人\t58356\n窝窝头\t58357\nkawaii\t58358\n抱着你好\t58359\n小碗\t58360\n黄粉+酸橙汁\t58361\n1234566890791223345556678990元\t58362\n跟脚\t58363\nwhot\t58364\n小碟\t58365\n武陵源\t58366\n99元\t58367\n当我\t58368\n倪果\t58369\n导线\t58370\n崇米\t58371\n民族主义\t58372\n2367\t58373\n60多少\t58374\n奥特奥特\t58375\n2362\t58376\nHedgehogchigger\t58377\n佛诚\t58378\n量车\t58379\n雪橇犬\t58380\nfcvvfg\t58381\n张高乐\t58382\neuydhfhig\t58383\n摸恩\t58384\n折返\t58385\nSPA馆\t58386\n孟雪飞\t58387\n二一三体\t58388\n提前\t58389\n8888888888888888888888888\t58390\n二六哼\t58391\nerertefe3t3tfett4hetrt4\t58392\n吉巴\t58393\n前檐\t58394\n爱的翅膀\t58395\n不会见\t58396\n玉镯\t58397\n计叫\t58398\n朱铺\t58399\nyyfhdhgegjhfjdhfr\t58400\n电信局\t58401\nydsdgg\t58402\nbudyx\t58403\n年富力强\t58404\n崔中石\t58405\n二十多块\t58406\n发运\t58407\n家常菜解放军\t58408\n八八八n\t58409\n长治点\t58410\nzrziìhyò\t58411\n猪之歌\t58412\n出机\t58413\nN72\t58414\n宝宝宝\t58415\n浸默\t58416\n微知\t58417\n令人羡慕\t58418\n好low\t58419\npotredd\t58420\ne10米\t58421\n三三百五\t58422\n罗佳欣\t58423\n1968年2月初六\t58424\n祖母绿\t58425\n冰心\t58426\n79779976434334373\t58427\n四瓦\t58428\n城计们\t58429\n波霸\t58430\n靠边站\t58431\n张钶钶\t58432\n意料之外\t58433\n复产\t58434\n鸭苗\t58435\n南票\t58436\n龍蝦灣\t58437\n27bao\t58438\n宵夜\t58439\n餐蜗牛\t58440\n浓缩丸\t58441\n么得命\t58442\n1.14%\t58443\n科学性\t58444\n鲜俾阖\t58445\n疏散\t58446\n提卡\t58447\nwetdst\t58448\n奴隶人\t58449\n按月\t58450\n语不惊人\t58451\n想贿赂\t58452\n忘了我\t58453\n张云龙\t58454\n按期\t58455\n天大地大\t58456\n华强\t58457\n半条命\t58458\n俩百三十岁\t58459\n我你\t58460\nsyuor\t58461\n不忙\t58462\n渡边谦\t58463\n宁宙\t58464\nsssvjxxxiwvvvvvvvwjbxefjedexugjjevievduveixvexdeeexeevxxveieeeniiinnnnnnnnnnnnnjnjx\t58465\n印度半岛\t58466\n顺理成章\t58467\n宁宁\t58468\n不必\t58469\n小麦\t58470\n宁安\t58471\n不忍\t58472\n女童鞋们\t58473\n真本事\t58474\n萌萌哒萌萌哒萌萌大眼萌萌哒嗯我喜欢一个人\t58475\n德玛西亚区\t58476\n不念\t58477\n老男银\t58478\n不要八卦\t58479\n呀呀呀呀\t58480\n不忿\t58481\n猜有\t58482\n不忠\t58483\n自己树\t58484\n陈会长\t58485\n2005年11月6日\t58486\n不快\t58487\n也呀\t58488\n西湖区\t58489\n102平方米\t58490\n麼麼哒\t58491\n渔阳\t58492\n1951年以来\t58493\n喋喋\t58494\n器材\t58495\n水解饭\t58496\n余静言\t58497\n刘思柔\t58498\n战十\t5
8499\n张梦妍\t58500\nydwd7gd\t58501\nWMG\t58502\n度秘我好不开心\t58503\n脸袋\t58504\n学到位\t58505\nr18bl\t58506\n八门遁甲\t58507\n小筠\t58508\n红田翠园儿\t58509\n五桌\t58510\nhikall\t58511\n1wewewwwwwwwwqwwwwwwww222w2\t58512\nIPhone\t58513\n钢合金\t58514\n妈咪呗\t58515\n钟金花\t58516\nnnnnn000000000n0\t58517\n该会\t58518\n一百一百九十四\t58519\n耐药\t58520\n过目\t58521\n拆迁\t58522\n尝\t58523\nalotmore\t58524\n张丽翠\t58525\nΗΖΕΙΚΛΜΠΟΞΝ\t58526\n好你就叫若子了若子若子\t58527\n谢老\t58528\njcffv\t58529\n赵志媛\t58530\n1525\t58531\n5avv托\t58532\n1521\t58533\n1522\t58534\n爱是离家好大好累爱你\t58535\nQQ游戏\t58536\n何峨汝\t58537\ndirokicua\t58538\n第三页\t58539\n绵羊座\t58540\n凶斗\t58541\n第一三年\t58542\n唔讲\t58543\n胖脸\t58544\n怕僵尸\t58545\n东野给我护驾彩云\t58546\n九炎\t58547\n22222\t58548\n人妻子\t58549\n杀鸡\t58550\n落寡\t58551\n想自由\t58552\n斯诺\t58553\n入土\t58554\n乍阳\t58555\n李立尧\t58556\n默写\t58557\n金花\t58558\n落寞\t58559\n蓬灰面\t58560\n24680\t58561\n刮目看\t58562\n众泰\t58563\nCyfgvgcvzgc\t58564\n孤僻犬\t58565\n例题\t58566\n并未\t58567\n入场\t58568\n乐园\t58569\n今天早上八点\t58570\n飞蛾扑火\t58571\nSKULL\t58572\n乐团\t58573\n马格里奇\t58574\n毛果\t58575\nvsdBaghhs\t58576\nfytuyihi\t58577\n走撒\t58578\n依你在\t58579\n好美好美\t58580\nihhh\t58581\n路畅\t58582\n头蛇\t58583\n合眼\t58584\n扣人心弦\t58585\n嘎哇迪卡\t58586\n早上七点\t58587\n互补性\t58588\n得笑\t58589\nwindow\t58590\n舅胡\t58591\n阿布局\t58592\n容易受骗\t58593\n林依轮\t58594\n物以类聚\t58595\n韩颖\t58596\ndrrttt\t58597\n第十天\t58598\n五瓶\t58599\nwnd\t58600\nLED补光灯\t58601\n腐女\t58602\n把头改\t58603\n郑州阳\t58604\nwns\t58605\n保肾\t58606\n星星你好度秘\t58607\nwnw\t58608\n公人\t58609\n尺\t58610\nbeunble\t58611\n郭台铭\t58612\n陈列\t58613\nnqsb\t58614\n28509元\t58615\n寄售\t58616\n返过来\t58617\nMorgan\t58618\n公事\t58619\n周星期\t58620\niMac\t58621\n饲料\t58622\n利特家族##\t58623\n苏东坡\t58624\n15577971849\t58625\n淑女机\t58626\nz\t58627\nddgffffft\t58628\n尴\t58629\n包华\t58630\n怪兽盒\t58631\n千万次\t58632\n血透\t58633\n竹枝\t58634\n尼玛尼玛尼\t58635\n窝不素\t58636\n尰\t58637\n802356\t58638\n补货\t58639\n凯凯\t58640\n好醉\t58641\n我讨厌书\t58642\n分布式\t58643\n6月30日\t58644\n给我一个你看蓝蓝笑话吧\t58645\n嫌疑人\t58646\n光尼\t58647\n山药\t58648\n死给窝\t58649\nf宫\t58650\ndisjmls\t58651\n鸡猪侠\t58652\n撒撒撒撒\t58653\n
这类容\t58654\n见贱\t58655\n包卵\t58656\n陈知道\t58657\nfeifi\t58658\n一國\t58659\n阿照\t58660\nGVH\t58661\n74319497777974944\t58662\n花别\t58663\nfv558682786\t58664\n一圆\t58665\n价本\t58666\n人一下\t58667\n断句\t58668\n钟金仕\t58669\n我不较劲看外星人啦啦啦啦啦啦okokokok\t58670\n皮水钰\t58671\n叙俊\t58672\niPhone5s\t58673\n三二零六\t58674\nGVi\t58675\nhqtwyd\t58676\n杨梦丽\t58677\n平方米\t58678\n卷制\t58679\n一场\t58680\nwsyzhi\t58681\n2016012\t58682\n一地\t58683\nGVv\t58684\n文雅\t58685\n瓦塌塌兔\t58686\ngio\t58687\n谢老板\t58688\n孔子玲\t58689\n葱白\t58690\n韩娘炮\t58691\n不准别\t58692\n氛围\t58693\n叫什么叫\t58694\n一叶\t58695\n千形万像\t58696\n经济学奖\t58697\n生搬硬套\t58698\n3553\t58699\n恶混天使\t58700\n丽人节\t58701\n知人之明\t58702\n煤炭\t58703\n不用谢因为你是我的宠物\t58704\nDOTA1\t58705\n卫报\t58706\n大中午\t58707\nadjthjd\t58708\n阿桃子\t58709\nSMTfacebook\t58710\n一睹为快\t58711\n陪不陪\t58712\n树冠\t58713\n德兴\t58714\n曾伟华\t58715\n10多天\t58716\nvjfjg\t58717\nC君\t58718\n赵智琨\t58719\n聊天机器人\t58720\n六色\t58721\n午下\t58722\n哼讨厌\t58723\n清音\t58724\n好好久\t58725\n小石头\t58726\niiiimiimiamimimimimimimimiiiiiiiim\t58727\n妻乳\t58728\nHAOBA\t58729\n上礼拜\t58730\njyfufigoh\t58731\npoiyy\t58732\n高国栋\t58733\n午个\t58734\n丝芙兰\t58735\nkusj\t58736\n童仪萱\t58737\n经天\t58738\n狂吠\t58739\nqkshgdbbsnw\t58740\n多速\t58741\n服不服输\t58742\n多谢你\t58743\n1759172329\t58744\n只言语\t58745\n数量\t58746\n14599\t58747\n狂吻\t58748\n竟然\t58749\n5c5c5c点com\t58750\n大丑八怪\t58751\n雷州\t58752\n世道不在你在赤道我你不我\t58753\n狂吃\t58754\n朱主任\t58755\n抱抱思密达\t58756\n没有我是\t58757\n仿句\t58758\n汪精卫\t58759\n田昭宏\t58760\n一亿只\t58761\n好趣\t58762\n5月1日\t58763\n帅次\t58764\n170MHz\t58765\n催熟\t58766\n此片\t58767\n此版\t58768\n趁機\t58769\nIGY\t58770\nwhooften\t58771\n5778\t58772\n老子明天不上班\t58773\n虐口某\t58774\n102只\t58775\nIGO\t58776\n上山\t58777\n年期权\t58778\n赶得\t58779\n十二小时后\t58780\nIGv\t58781\n吴先生\t58782\n上层\t58783\n摧毀\t58784\n赵健龙\t58785\n苏天琦\t58786\n狗眼儿\t58787\n笑年了\t58788\n啊恩\t58789\n三坊七巷\t58790\n萧萧雪雨江湖路\t58791\nvangguyp\t58792\n手掌心\t58793\n子儿\t58794\n纸子\t58795\n嗯连\t58796\njjbssdgg\t58797\ndsgdgff\t58798\n决交\t58799\n切段\t58800\n更假\t58801\n短直到细胞\t58802\n黃色片\t58803\nBlower\t58804\n年夜菜\t58805\nYangon\t5880
6\n殴巴\t58807\n甚微小\t58808\n肝癌\t58809\n图片了行\t58810\n形影相\t58811\n无一虚\t58812\n男一女\t58813\n1.18%\t58814\n素聪明\t58815\n小幅\t58816\n毅力\t58817\n办好事\t58818\n离境\t58819\n人数\t58820\n白雪公主快给我\t58821\n飞航\t58822\n真的开始\t58823\nSYYCUV\t58824\nCccccc\t58825\n公交熊\t58826\n飞船\t58827\nwomen\t58828\n狗狗骨\t58829\n凝望晚霞\t58830\n人教\t58831\n李丽丽\t58832\n打退堂鼓\t58833\n935219\t58834\n小年\t58835\n人效\t58836\n配镜\t58837\n小平\t58838\n无所疑\t58839\n精神病史\t58840\n飞舞\t58841\n小幽\t58842\n小幺\t58843\n小幻\t58844\n小幸\t58845\n黄金强\t58846\n脑壳\t58847\n低俗\t58848\n丽颖\t58849\n独利日\t58850\n低保\t58851\n见天\t58852\n1315\t58853\n做事先做人\t58854\nGfgfgfhf\t58855\n4242254\t58856\n石墩\t58857\n靠恁\t58858\n笑了沒\t58859\n贴补\t58860\n来土淹\t58861\nyhhgvdbf\t58862\n发出发\t58863\n濮阳市\t58864\n八年九\t58865\n陶晶\t58866\n芮维娜\t58867\n割舍\t58868\n哩昂\t58869\n闲气\t58870\nhttpahiphotosbaiducomxiaodupicitem1c950a7b02087bf4f2f2469af5d3572c11dfcfa3jpg\t58871\nMIOLP\t58872\n李丽珍\t58873\n起手\t58874\n死背\t58875\n哦章\t58876\n死胎\t58877\n球杆\t58878\n义东路\t58879\n见外\t58880\n客货车\t58881\n见多\t58882\nWalter\t58883\n嵌套\t58884\n高尔夫场\t58885\n迈过\t58886\n135246\t58887\nKTV\t58888\n杜淳多\t58889\n优乐美会\t58890\n蜉蝣\t58891\nKTR\t58892\n彭士\t58893\n500家\t58894\n李克强\t58895\n嗯嗯嗯我的娃\t58896\n迈进\t58897\n宋美眉\t58898\n顺其自然\t58899\n不渐\t58900\n多媒体型\t58901\n二零九幺\t58902\ntogether\t58903\n嗯翔\t58904\n别吹牛酒店我告诉你白猪就是你\t58905\n袋子\t58906\n来看一看\t58907\n446912036\t58908\n老年健忘症\t58909\njolin\t58910\n刘阿瑞\t58911\n王锐杰\t58912\n健忘\t58913\n梨花花式\t58914\n服务器\t58915\n一行一\t58916\n咕咚咕咚\t58917\n你真的爱我吗咪爱\t58918\n乘专机\t58919\n19点52\t58920\n七笔字\t58921\n安好咧\t58922\n梦么么\t58923\n美照\t58924\n再接再厉加油\t58925\n好我知道好我去睡觉了拜拜\t58926\n别顶\t58927\n李四兰\t58928\n很干净\t58929\n晒我想\t58930\n亿元\t58931\n我是你的主人还是你是我的主人我再问你一遍度秘\t58932\n太雕虫\t58933\n共和党\t58934\n1g141g141\t58935\n嗯杨幂\t58936\n网罩\t58937\n亲爱的秘\t58938\n臭不要脸的我想死你了哈\t58939\n拉布拉多犬\t58940\n脑瘫康复部\t58941\nimmookou\t58942\n日本人\t58943\n声张\t58944\n冰花\t58945\n钾\t58946\n有钱途\t58947\n钰\t58948\n钱\t58949\n图片库\t58950\n5Dabao\t58951\n钵\t58952\n钩\t58953\n马类\t58954\n钬\t58955\n钭\t58956\n钮\t58957\n钠\t58958\n说的呀是\t58959\n钢\t5896
0\n苏霞\t58961\n钥\t58962\n挨托奥\t58963\ne2奁互\t58964\n钙\t58965\n钛\t58966\n很乖很乖\t58967\n在一起是\t58968\n钞\t58969\n钟\t58970\n三千六百一十一三百七十七\t58971\n三年一天\t58972\n钓\t58973\namei\t58974\n钕\t58975\n转发率\t58976\n针\t58977\n钉\t58978\n文滚\t58979\n背乘法\t58980\n大梅\t58981\nTtttttt\t58982\n大梁\t58983\nno你的声音\t58984\n惹天怒\t58985\n马航\t58986\n2198242376\t58987\n1948年\t58988\n旁邊\t58989\n心雨\t58990\n18075231056\t58991\n四分之一大对\t58992\n忘記\t58993\n律诗\t58994\n稀释\t58995\n味千\t58996\n说不爱\t58997\n心雅\t58998\n三星Galaxy\t58999\n对呀你说我不废话\t59000\n经验者\t59001\n第一四集\t59002\n好不好不\t59003\n疵癫\t59004\n臭牌\t59005\n咪问\t59006\n红斑\t59007\n双龙\t59008\n毛念毳\t59009\n斩了斩\t59010\nHelikesoranges\t59011\n发高烧\t59012\n冒牌\t59013\n山盟海誓\t59014\n不能不说\t59015\n社交网络\t59016\n刘亚琪\t59017\nx150\t59018\n无人转\t59019\n王雨晴\t59020\n民房\t59021\n杜风\t59022\n鸡翼\t59023\nZheshishenmenyisi\t59024\n零二八三\t59025\n鸡翅\t59026\n白墙\t59027\n胖一\t59028\n轮空\t59029\n强占\t59030\npsishhxyqjw\t59031\n土龙\t59032\n赵丽百合\t59033\n5555555555411\t59034\n玛丽苏\t59035\n某爷\t59036\n切切切切切什\t59037\n杨馥宇\t59038\n24日\t59039\n三菱\t59040\n亲生\t59041\n一般钱\t59042\n亳毛\t59043\n龙马\t59044\n给我讨厌\t59045\n雪地靴\t59046\n风湿细胞\t59047\n34个\t59048\n亲男\t59049\n我喜欢波\t59050\n通利福尼亚泛果园共荣圈\t59051\ngHth\t59052\n十几圈\t59053\n三五转\t59054\ncykckhcihckhkbcckhkhcbkckhk\t59055\ncr9石\t59056\n你好Q啦\t59057\n声度秘\t59058\n舒儿\t59059\n我的证\t59060\n图态\t59061\n我我我\t59062\n植物园\t59063\n学会\t59064\nwsl\t59065\n软式\t59066\n六休\t59067\n林益田\t59068\n占出\t59069\n张丹峰\t59070\n8828\t59071\n独来独往\t59072\n限期\t59073\n65座\t59074\n软弱\t59075\n半我有事\t59076\n虚拟交易\t59077\n比呀\t59078\n日常生活\t59079\n基站\t59080\nokokokokokokk\t59081\n1.4\t59082\n食疗\t59083\n1.6\t59084\n1.0\t59085\n小周\t59086\n1.2\t59087\n1.3\t59088\n日语紫棋\t59089\n灿明\t59090\n捕手\t59091\nefcxggyhklhgfcb\t59092\n双皮奶\t59093\n真好客\t59094\n车岭\t59095\n屈从\t59096\n哥刚哥刚哥刚哥刚\t59097\n好汉\t59098\n呵水\t59099\n762644\t59100\n120起\t59101\n光白\t59102\n08675775\t59103\n鲁非彼鲁\t59104\n句知肚明钩织图谜公主嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪\t59105\n浪漫主义\t59106\nElizabeth\t59107\n好污\t59108\n5524785\t59109\n6孔\t59110\n海淀市\t59111\n队内\t59112\n佛山市\t59113\n嗯唉\t59114\n25000
\t59115\n天鹅的梦\t59116\n别缠\t59117\nXX肥\t59118\n一大家\t59119\n匮乏\t59120\n小妖\t59121\n我要咪喵喵喵\t59122\n高小发\t59123\n冲线\t59124\nbhiphotosbaiducomxiaodupicitem34fae6cd7b899e519e7a426145a7d933c9950dcajpg\t59125\n欧式\t59126\n烤火炉\t59127\n购物狂们\t59128\n上休\t59129\n受迫\t59130\n三万秒\t59131\n良药\t59132\n太坏了我不爱你\t59133\n贝弗利山庄\t59134\n落下\t59135\n邓紫棋\t59136\n后半句\t59137\n上伦\t59138\n陈特曼\t59139\n上传\t59140\n12438954368\t59141\n没法儿假\t59142\n可好可好\t59143\n嗯嗯真乖送你个香吻\t59144\n度秘你陪聊天我我好开心\t59145\n世界里拉\t59146\n351苦151块\t59147\n衰落\t59148\n桃面\t59149\n失便\t59150\n就是这样\t59151\nd晓\t59152\n喜临\t59153\n庄见\t59154\n婚配\t59155\n赛诺菲安万特\t59156\n真心的朋友\t59157\n马进东\t59158\n王柿子\t59159\n十一月初六\t59160\n奥多\t59161\n九泉\t59162\nx绞盘\t59163\nchicken\t59164\nJnnhbhjkbnhbinBubngvjnbubgbhnhbbbhhhhhjhhhhhhjhhhhhhh\t59165\n续航干熊\t59166\n4566085\t59167\n那你吻我胸\t59168\n4月25号\t59169\n难受我不想和你响湛江说了没\t59170\n木萧萧兮\t59171\n湖南卫视元宵喜乐会\t59172\n叶冰龙\t59173\n永远远都\t59174\n听琴\t59175\n陆川苗苗\t59176\n奥头\t59177\n小鱼儿们\t59178\ncachr\t59179\n林晓婷\t59180\n78平方米\t59181\n水树\t59182\n无穷无尽\t59183\n来出\t59184\n小妹\t59185\n苏爽\t59186\n5本\t59187\nonep\t59188\nsometuto\t59189\n明天勒\t59190\n复姓\t59191\nHero-韩庚\t59192\n痴醉\t59193\nNIH\t59194\n孔诗瑶\t59195\n模拟类\t59196\n走出国门\t59197\n决明子理疗枕\t59198\n青春剧\t59199\nviez\t59200\n982385557788\t59201\n遥望感\t59202\nGEM\t59203\n好久不育馆\t59204\nviet\t59205\nview\t59206\n卡尼克\t59207\n雅戈尔\t59208\n刘烨\t59209\n有机板\t59210\n新婚后\t59211\n害人不浅\t59212\n韩式烤肉\t59213\nigkv\t59214\n记性\t59215\n55888\t59216\n签证\t59217\n乖魔灵\t59218\nhb43\t59219\n外女\t59220\n拍拍拍拍\t59221\nsuperm\t59222\n月牙\t59223\nvjhv\t59224\n7点半\t59225\nglll\t59226\n邪教\t59227\nqqwwwgq\t59228\n二百四十\t59229\n言出\t59230\n999千\t59231\n张丽照\t59232\n外套\t59233\n细线\t59234\n研叭\t59235\n细纹\t59236\n林勇成\t59237\n不老鼠\t59238\n崔钟勋\t59239\n了别以为\t59240\n羽联\t59241\n暗恋\t59242\n那你是的黎明的等你十块马上\t59243\n富睿敏\t59244\n=左传\t59245\n吧天涯\t59246\n墀\t59247\n分管\t59248\n生育率\t59249\n周思彤\t59250\n题我告诉你\t59251\n4145553364597778900\t59252\n歌厅\t59253\n暖胃\t59254\nSOLO\t59255\n漏掉\t59256\n调匀\t59257\njggghjgvgu\t59258\n介样\t59259\n研发\t59260\nhfuryfhejxhf
\t59261\n实尚\t59262\n晏子论\t59263\n好弟弟\t59264\n对子\t59265\n月何时\t59266\n坚饼果\t59267\n立胜\t59268\n郭苗苗\t59269\nｖ呗\t59270\n切切切切切切切切切切切切切切切切\t59271\n一K13\t59272\n律政新人王\t59273\n牠平常\t59274\n世界纷纷扰扰\t59275\n纯真你好纯真\t59276\n玄机\t59277\n巨商达官\t59278\n古塔\t59279\n你好度秘你是谁\t59280\n50度\t59281\n尘埃\t59282\n造假票\t59283\nheenim\t59284\n岁新娘\t59285\n孙妞\t59286\nmachile\t59287\n面包树上的女孩\t59288\ndhiphotosbaiducomxiaodupicitemb21bb051f81986182de330f74ded2e738bd4e619jpg\t59289\n技术改造\t59290\n迷鸟守则\t59291\nInsider\t59292\n爸爸的超人\t59293\n桃花羹\t59294\n3千零三\t59295\n8几\t59296\n一比二\t59297\n灰汉\t59298\n墓\t59299\n邵怡霖\t59300\n背下\t59301\n背上\t59302\n二六五二零二七五\t59303\n不灭你妹妹气我了我要找你\t59304\n375pp\t59305\n遭踏\t59306\n生僻字\t59307\n402单\t59308\n增\t59309\n杏花村\t59310\n进度\t59311\nPLU\t59312\n双拼饭\t59313\n25853\t59314\n随处可见\t59315\n银民\t59316\n活出\t59317\n54157257\t59318\n落袋\t59319\n八点五十\t59320\n高跟鞋\t59321\n量身\t59322\nPLA\t59323\n背串\t59324\nfdafanshightfoylistoutiffifallerolaofoololololol\t59325\n财务\t59326\n名妓\t59327\n6999997876349797\t59328\n13793289\t59329\n座椅\t59330\n时不时\t59331\n麻麻泪奔\t59332\n著作\t59333\n卢燚\t59334\njagjgah\t59335\n中空\t59336\n五大洲\t59337\n一小株\t59338\n玩不明白\t59339\n殊途\t59340\n摇摇欲坠\t59341\n世下\t59342\n过长\t59343\n多美丽会\t59344\ntas\t59345\n破格\t59346\n╯伦\t59347\n小松鼠\t59348\n虎阜\t59349\n一次性\t59350\nTfboys\t59351\n圣母白莲花\t59352\n一尤釜\t59353\n背呗背呗背呗背呗背\t59354\n20日凌晨2:30\t59355\nab64034f18c27972a8c379310a551db2jpg\t59356\n蚊子\t59357\n蚊孑\t59358\n幺二零二二幺\t59359\nIjifh\t59360\n我是女的我叫女\t59361\n佐证\t59362\n走向\t59363\n恐龙级\t59364\n嘎嘎漂亮\t59365\ndiploma\t59366\n北医\t59367\n北区\t59368\nLTE商用爱立信\t59369\nqpppqpsprs\t59370\n幢\t59371\n脚链\t59372\n走后\t59373\n墨西哥蒙特雷队\t59374\n闲书\t59375\n北北\t59376\n南锣鼓巷\t59377\n吱吱硝周\t59378\n情报员\t59379\n辅导员\t59380\n倍儿爽\t59381\n太傻了你不好\t59382\n王宇阳\t59383\n5：41\t59384\n耍赖\t59385\n黄金妮\t59386\n何平\t59387\nenglish\t59388\n少数民族\t59389\n50650\t59390\n满天星\t59391\n十六十六十六十六十六十六十六\t59392\n小受受\t59393\n450702199207186312\t59394\nzvg\t59395\n应着\t59396\n谢嘉文\t59397\n禁赛\t59398\n狂蜂浪蝶\t59399\n首枚\t59400\n几来说\t59401\n满洲里\t59402\n白你\t59403\n吴语萌\t59
404\n张佳祺\t59405\n4.香元\t59406\n乖乖乖小孩子乖乖\t59407\n1568\t59408\n话闸\t59409\n首架\t59410\n202秘\t59411\n邓家伟\t59412\npoint\t59413\n平\t59414\nyogicigftif\t59415\n莫晓冬\t59416\nzvx\t59417\n付诸瑶琴\t59418\n云南信息报\t59419\n湖北省利川市反贪局\t59420\n大卫娃\t59421\n表整天我猜我猜我猜猜\t59422\n50g色拉油\t59423\n34313133点\t59424\nupusu\t59425\n树尖\t59426\n猫腾涛\t59427\n一部下\t59428\n91y1999点\t59429\n陈独睾\t59430\ncycytdgxy\t59431\n周小姐\t59432\n命由己回\t59433\n数言\t59434\n幽\t59435\n败走\t59436\n在过\t59437\n南里\t59438\n電視\t59439\n烦杜比\t59440\nfjfjgnjcvbibvjvbkbvbkvbkjx\t59441\n马老子\t59442\n狂妄\t59443\nhabaaaaaa\t59444\n看头\t59445\n看失\t59446\n我喜欢童\t59447\n张学良\t59448\n30秒钟\t59449\n说话儿\t59450\n不分男不分女\t59451\n二二首\t59452\n六盘水\t59453\n亲你爱你的走开\t59454\nKk\t59455\n﹏\t59456\n太阳当空照好画的晚霞\t59457\n了我不要我不要我不要我要花花花花\t59458\n﹋\t59459\n不安静\t59460\n﹉\t59461\ny53ti4r63uu66utgpigyu80erx67vrxhvdlf\t59462\nJonathan\t59463\n﹃\t59464\n许姿霖\t59465\n看天\t59466\n﹀\t59467\n王传清\t59468\n我表杨了你我喜欢你\t59469\n一枝\t59470\n一枚\t59471\n看够\t59472\n洗面\t59473\n恶心诚意伯\t59474\n杨震\t59475\n看多\t59476\n笞案\t59477\n去谁\t59478\n月饼节\t59479\n苍井姐姐\t59480\n﹣\t59481\n好看我爱\t59482\n﹠\t59483\n萨阿\t59484\n45855\t59485\n奥对你不是人你是鬼我\t59486\nmity\t59487\n创意产业集聚区\t59488\n小骗纸\t59489\n李国芳\t59490\n注射\t59491\nmynapose\t59492\ntjloo\t59493\n德尔加多\t59494\nHiJk\t59495\n脚趾头\t59496\n陈星宇\t59497\nbiscuit\t59498\n50少十分之一\t59499\n眼眼\t59500\n减速\t59501\n樊少\t59502\nhggjkkj\t59503\n内政\t59504\n45分钟\t59505\n祖德\t59506\n10000000\t59507\n提货\t59508\n抄作\t59509\n青春期着空气哈哈看间吗穿心煞猫猫随风飘\t59510\nyabu\t59511\n三年级学\t59512\n表现力\t59513\n徐曼\t59514\n何昊\t59515\nwoojiushizenmele\t59516\n生与死\t59517\n徐曌\t59518\n长安三酒鬼\t59519\n心碎声\t59520\n问女孩真心话大冒险\t59521\n我喜欢竹\t59522\n千千万万只\t59523\n白虎头许坤非法经营案\t59524\n隔断\t59525\n广告\t59526\n吾师\t59527\n非铁\t59528\n管恒军\t59529\n谭感冒人\t59530\n双彩球201600057\t59531\n弱点\t59532\n无话不说\t59533\n私守\t59534\n太神奇\t59535\n宫再鹏\t59536\n找我儿\t59537\n改聘\t59538\n靠苦\t59539\n小的你电影小达人舞剑姬那个什描\t59540\n日麽麽\t59541\n周考月\t59542\n永远永远永远\t59543\n并列式\t59544\n想绝交\t59545\n膝人\t59546\nbejxjd\t59547\n度秘蛛\t59548\n放你走\t59549\n2012-07-04\t59550\nuhug\t59551\n卍
卐\t59552\n姜伟新\t59553\n环球时报真尼玛奇葩\t59554\nI809棕2900\t59555\n别了我不想\t59556\n私家\t59557\n钱了吧\t59558\n白葱\t59559\nblock\t59560\n枪械\t59561\n1468544632\t59562\n洛颜\t59563\n装女郎\t59564\nb型\t59565\n免引\t59566\n驻颜\t59567\n秦小丽\t59568\n塑料瓶\t59569\n赛马\t59570\n坠落\t59571\n闺蜜们\t59572\n投毒\t59573\n原庄\t59574\n18839353971\t59575\n120110119\t59576\n机械通知\t59577\n要记住\t59578\n煽动\t59579\n阿还有\t59580\n一月份一月\t59581\n港机\t59582\nghiphotosbaiducomxiaodupicitem71cf3bc79f3df8dc5f3a97e4ca11728b471028bbjpg\t59583\n虫子底\t59584\n兴宁火车站\t59585\nmityou\t59586\njunk\t59587\nKMPlayer\t59588\n哎呀不聊\t59589\nniaimiki\t59590\n爱比克\t59591\n1221111\t59592\njung\t59593\n洋画\t59594\n49枚\t59595\n地铁站\t59596\n别告诉我\t59597\ngoodBye\t59598\n桃江路\t59599\n好戏在\t59600\nhhhh\t59601\n寻三\t59602\nfdvdq\t59603\ntan\t59604\n有史以来\t59605\n茶姑姑\t59606\nlí\t59607\n阿不能\t59608\n芳树\t59609\nk厅\t59610\n涛嘉余\t59611\n供过于求\t59612\n5631\t59613\n少年王的悠悠球\t59614\n立海\t59615\n输球\t59616\n输理\t59617\n蚕丝\t59618\nhifoob\t59619\n宏愿\t59620\n郑重\t59621\n五个月\t59622\nAvvideo\t59623\n哪呢热呢人\t59624\n美绝\t59625\ncdsggrrwgtf\t59626\n神圣\t59627\n魅宫\t59628\n意志力\t59629\n看着你好\t59630\n东北菜\t59631\n那几时\t59632\n丢掉\t59633\n扬度\t59634\n现在的话\t59635\n新播的天天有喜来\t59636\n吉日\t59637\n过来说\t59638\nòooo\t59639\nnoonononononoonononononononononono\t59640\n恩我有事多两天\t59641\n是真的假\t59642\n国过郭锅果国过郭锅裹帼果达\t59643\n魅宇\t59644\n多萌萌\t59645\n91914877766\t59646\n丑人鱼\t59647\n高宇航\t59648\n十万十万十万十万十万\t59649\n苹果店\t59650\n艾图\t59651\n度密度密我是你的好朋友\t59652\n内疚家\t59653\n卖无\t59654\n可素\t59655\n21号\t59656\n可紧\t59657\n来不吗\t59658\n人界\t59659\nhhhw\t59660\n午睡假\t59661\n药企\t59662\nhiinnhi\t59663\n20、30年\t59664\n声动\t59665\nHigfjhtn\t59666\nu波额\t59667\n雅图面\t59668\nfreeRuth\t59669\n声势\t59670\n太小清\t59671\n好恶心爱\t59672\n尾货\t59673\n寤寐\t59674\n比率\t59675\n可爱的回忆\t59676\n不要道\t59677\n19972\t59678\n一两两只\t59679\nIphone\t59680\n等一下来\t59681\n罗马里奥\t59682\n欧儿\t59683\n出卷\t59684\n八十米\t59685\n一个800多\t59686\n不得已\t59687\n吉时\t59688\n冷心\t59689\n缺钙\t59690\n本月23日\t59691\n民国9年\t59692\n开工\t59693\n叶泽玮\t59694\n1352467098\t59695\n红根\t59696\nchgvcthf\t59697\n好时节\t59698\
nkaidofou\t59699\n790906592\t59700\n9256\t59701\n哦一个一米\t59702\n专业性\t59703\n梵蒂冈沟\t59704\n9250\t59705\n9月12日\t59706\na80厘米\t59707\n骗我\t59708\n不能不准\t59709\n一月份\t59710\n百宝袋\t59711\n心向善\t59712\njurypjtjtjtj\t59713\n概况\t59714\n老虎机\t59715\n牙病\t59716\n度秘吏音乐\t59717\n努力度秘\t59718\n六六天\t59719\n流芳百世\t59720\n老干妈\t59721\n586855258\t59722\n绳迷\t59723\n购物中心\t59724\n牙痛\t59725\n一名25岁\t59726\n红馆\t59727\n钱钱钱\t59728\n645635606\t59729\n黑牛\t59730\n玉丹\t59731\nAK度\t59732\n23日早晨\t59733\n苹果大红豆绿豆大绿豆\t59734\n玉足\t59735\n收一收\t59736\nrrrrr\t59737\n划过\t59738\n夜月\t59739\n裸雕\t59740\n一我的好开心\t59741\n麟峰\t59742\n条杠\t59743\n奈斯荼蘼\t59744\n全副\t59745\n黑片\t59746\n情缠\t59747\n你好睡\t59748\n情缘\t59749\n胳膊\t59750\n黑乎乎\t59751\n3点五点\t59752\n挠痒痒\t59753\n诱敌\t59754\n费时\t59755\n二七三幺二三幺五\t59756\n天香\t59757\n刘充一\t59758\n皮裤\t59759\n洼地\t59760\n几份\t59761\n尼日立亚\t59762\n几件\t59763\n这两点\t59764\n响遏行云昵\t59765\n几代\t59766\n郭哥\t59767\ncircumstances\t59768\n前一分钟\t59769\n谭夏丽\t59770\n小翅\t59771\n今儿个太阳公公\t59772\n我了行不行\t59773\n俱乐部队\t59774\n羽坛\t59775\n多睡\t59776\n阿葵\t59777\n动漫\t59778\n毕加索\t59779\nLa\t59780\n笑标\t59781\n净利润\t59782\nLj\t59783\nLk\t59784\nLh\t59785\nLi\t59786\nLv\t59787\nLt\t59788\nLu\t59789\nLp\t59790\n两千家\t59791\n情节\t59792\nbvjhkbhji\t59793\n泰德\t59794\nLz\t59795\nLG\t59796\n650除\t59797\n音响效果\t59798\n真好笑一点\t59799\nLA\t59800\nLL\t59801\nLM\t59802\n七岁十八岁\t59803\nLH\t59804\nLI\t59805\nLV\t59806\nwjf\t59807\n海洋国旅总社\t59808\nLR\t59809\nLP\t59810\n开飞\t59811\n自豪\t59812\nLZ\t59813\n胜势\t59814\nLX\t59815\n熊佳鹏\t59816\n后卫\t59817\n事板\t59818\n朋雯丽\t59819\n宽窄巷\t59820\n别老郭\t59821\n作不道\t59822\nL7\t59823\n打醒\t59824\n随处\t59825\n密亚你爱不爱我爱我就亲一下\t59826\n陆涛\t59827\n旧创\t59828\n发人\t59829\ndjdjdxdj\t59830\n三长\t59831\n一篇以么\t59832\n后半\t59833\nHughoff\t59834\n华晶晶\t59835\n开心闯神墓\t59836\n发亮\t59837\nisbsjmq\t59838\n曹了\t59839\n事权\t59840\n行政长官\t59841\n秘我现在开始\t59842\nzan\t59843\n晚娘\t59844\n418套\t59845\n距绝\t59846\n员工元\t59847\n呼噜g\t59848\n三千米\t59849\n甜品\t59850\n骄人\t59851\n华城小区\t59852\n几天气\t59853\n小寂\t59854\ne泛\t59855\n红魔\t59856\nzai\t59857\n孙嘉瑛\t59858\n326744969426924289426
47894264626437447464\t59859\n悲悯\t59860\n杭州\t59861\n征戰之路\t59862\n桂钿\t59863\n怂伐\t59864\nKHUN\t59865\n这种\t59866\n轻声咪咪\t59867\n五百年\t59868\n惆怅隐者\t59869\n角度\t59870\n旱麦\t59871\n周三宝\t59872\n按一评价\t59873\n同步阅读与完形填空周周练\t59874\nHIGH\t59875\n341521\t59876\n占一半\t59877\n拜钱\t59878\nZ72\t59879\n这科\t59880\n涨薪\t59881\n杜媚\t59882\n咕叽\t59883\n韶山\t59884\n53656845\t59885\n不可少\t59886\n泉奈\t59887\n富达\t59888\n你是个男主人我舍女主人\t59889\n1296期\t59890\n这点儿\t59891\n性性\t59892\n展馆\t59893\n那个咱\t59894\n限贷\t59895\n393\t59896\n撞着\t59897\n391\t59898\n390\t59899\n佳木斯市\t59900\n限购\t59901\n395\t59902\n绵阳火车站\t59903\n399\t59904\n398\t59905\n不要脸的不要脸的不要脸的不要脸的不要脸的不要脸的不要脸\t59906\n回家睡觉额\t59907\n三G四G\t59908\n逃命出口\t59909\n一个18522670\t59910\n操空\t59911\n周记食\t59912\n东南商报\t59913\n18888888888888888十544567954\t59914\n荷兰\t59915\n两副\t59916\n好时候\t59917\n打底\t59918\n杜拉拉杜郎\t59919\n花儿与少年第三季\t59920\n39l\t59921\nvgyyytggg6\t59922\n39q\t59923\n敷脸\t59924\nJNJ\t59925\n咖喱牛排\t59926\ndxgchchzffuzxghci\t59927\nPaul\t59928\n切碎机\t59929\n机理\t59930\n段道长\t59931\n昨日上午10时\t59932\nfhzgfFg\t59933\n咋着我\t59934\n鼓浪屿\t59935\n不做梦\t59936\n王冰涛\t59937\n反对党\t59938\n死了算\t59939\n打度\t59940\n临盘镇\t59941\nhttppinyincne6237\t59942\n随地嘘嘘\t59943\n18183584159\t59944\n龙伟\t59945\n这首\t59946\n百六十160\t59947\n国祥\t59948\n会计信息\t59949\n我晕偏差通话哩哩爱的那个天使\t59950\n15143855729\t59951\n误工费\t59952\n多冷\t59953\npbbx\t59954\n1740元\t59955\n永远不通\t59956\n富邦乡\t59957\n国祚\t59958\n真正的你在哪儿\t59959\n13313339177\t59960\n算了甩\t59961\n世界之窗文化产业园\t59962\n献你一束花\t59963\n杜玢琪\t59964\n恶报\t59965\n啊杀杀杀杀杀杀杀杀掉了了杀杀杀我傻傻看我撒我撒我撒里\t59966\n首颗\t59967\n下周四\t59968\n八八期\t59969\n液晶电视机\t59970\n真心kui\t59971\n滚滚滚滚滚滚滚滚滚滚\t59972\n仲天骐\t59973\n首领\t59974\n王思超\t59975\n废话废话废话废话\t59976\njbs\t59977\n不够格\t59978\n满天飞了\t59979\n５千元\t59980\n6666666666666666666666666766666666666\t59981\n冰袋\t59982\n纠葛\t59983\ngughy\t59984\n蔡子文\t59985\n铝箔\t59986\n领涨\t59987\n春趣\t59988\n尚可\t59989\n奥兔侠\t59990\n谢谢好啦好啦\t59991\n从何而来\t59992\nnzn\t59993\n乐队\t59994\n三一分钟\t59995\n此举\t59996\n草鸡画派\t59997\n日号\t59998\n男童\t59999\n信不打\t60000\n8888888088\t60001\nhjyz\t60002\nllllll
abcdfgh\t60003\nrrrerr\t60004\nhjyu\t60005\n找识\t60006\n食药健康在身边\t60007\n心莲花\t60008\n大风雨雪\t60009\n幽芳\t60010\n你的脸龄\t60011\n笨拘\t60012\n叫秘努\t60013\n宁雨沉\t60014\n哑舍子\t60015\n说话吧兄弟\t60016\n三四大\t60017\n呱呱摸摸\t60018\n试整\t60019\ntugd\t60020\n三四天\t60021\n消失去\t60022\n秘加度秘\t60023\n安[\t60024\nwwwyo\t60025\n撒落\t60026\n放假\t60027\n喂累\t60028\n两天儿\t60029\n一直在路上\t60030\n逝者\t60031\n进点\t60032\n遇上\t60033\nmouth\t60034\n挺起\t60035\n电子书\t60036\n倒霉鬼\t60037\njgdsfh\t60038\n献媚\t60039\n洋河\t60040\n意外类\t60041\n电子乐\t60042\n4家\t60043\n事先神\t60044\n300多位\t60045\n工笔\t60046\n4399游戏\t60047\n5月19日\t60048\n老调重弹\t60049\n3个小时\t60050\n这办\t60051\n不惜代价\t60052\n3厘米\t60053\n辣手\t60054\n天语\t60055\n恢復\t60056\n龙龙龙龙\t60057\njtta\t60058\n账本\t60059\n清洁剂\t60060\n大小鬼\t60061\n体内存\t60062\n习礼\t60063\n冬阴功汤\t60064\n积蓄\t60065\n慢严舒柠\t60066\n13589076254\t60067\nLIVE\t60068\n收歌\t60069\n彩色\t60070\n快快快快快点\t60071\n切忌\t60072\n签字\t60073\n奸视\t60074\n签子\t60075\n汗命\t60076\n安定老老实\t60077\n吖度秘\t60078\n永康\t60079\n五马分\t60080\n2011年6月9日晚10点\t60081\n窗上\t60082\n5200名\t60083\n3月6号\t60084\n排弹\t60085\n金火\t60086\n同等\t60087\n长安雨来\t60088\n公元前\t60089\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t60090\n更香\t60091\n英语柜\t60092\n肖天\t60093\n1236987455888\t60094\n东晋\t60095\n排开\t60096\n瓷砖\t60097\n骗你的我己经\t60098\niehfhyth\t60099\nmetoth\t60100\n王佳小强\t60101\n谢雨婷\t60102\n刘运\t60103\n周弃疗\t60104\n3肀r丈\t60105\n10位\t60106\n五大碗\t60107\n黄庆凯\t60108\n10余\t60109\n可讨厌\t60110\n森警部队\t60111\n115568954\t60112\nFIFA金球奖\t60113\n靠也比\t60114\n狗屁承诺\t60115\n点点点点点\t60116\n也不杀火不骗人才是我的好度秘\t60117\nYouwould\t60118\n胡亦六\t60119\n闲居\t60120\n1000万块\t60121\n+86787653659365937777653653=soyou\t60122\n10佳\t60123\n睡脸\t60124\n刘迪\t60125\nlaoshi\t60126\n一千二\t60127\n一个你的秘密\t60128\n一成不变\t60129\n扮萌\t60130\n射击类\t60131\n城站\t60132\n眼红大\t60133\n其中\t60134\n麻河村\t60135\n油漆\t60136\n曹操\t60137\n朴特\t60138\n睁大眼睛\t60139\n一千五\t60140\n评议\t60141\n赵作海\t60142\n其一\t60143\n巨兽\t60144\n一接\t60145\n其三\t60146\n失眠\t60147\n魏同学\t60148\n大驾\t60149\n宾阳\t60150\n人大常委会\t60151\n玩疯\t60152\n傻瓜童\t60153\n边儿\t60154\n䀚\t60155\n父子俩\t60156\n血
肠\t60157\n海门\t60158\n还我去\t60159\n发扬光大改\t60160\n鼓吹\t60161\n劈成\t60162\n少女\t60163\n斗地主话\t60164\n建筑面积\t60165\n稿基1\t60166\ncult\t60167\n有你是\t60168\nhi咯\t60169\n走路好奇怪\t60170\n5555655555555\t60171\n徐武\t60172\n王明亚\t60173\n兔子\t60174\n男船\t60175\n5454485\t60176\nxushx\t60177\n俗套\t60178\n最佳女主角\t60179\n葛乌骨\t60180\n法贝塔\t60181\nwwwhaole006com\t60182\n餐椅\t60183\n小泉\t60184\n取物\t60185\n阴颈\t60186\n这么点\t60187\n16万\t60188\n羞答答\t60189\n四明\t60190\n游优\t60191\n就让\t60192\n白总\t60193\n赵氏里\t60194\n四星\t60195\n爱+5\t60196\n海上花园在哪呀你在找我不行\t60197\n呀年\t60198\n独步\t60199\n老爸老妈\t60200\nTDM\t60201\n盛晓\t60202\n天非人间\t60203\n15231646010\t60204\n谢总比\t60205\n药监\t60206\n民营经济\t60207\n要不要不要\t60208\nsmitsite\t60209\n三家堡\t60210\n以牙还牙\t60211\n10衣\t60212\n留给\t60213\n顺海两公园\t60214\n打价\t60215\n暗香门\t60216\n不梦\t60217\npokul\t60218\n唱一唱\t60219\n一个一点\t60220\n赶快呗\t60221\n13888884383830\t60222\nvfdfdd\t60223\n秀发\t60224\n替身\t60225\n佩露夏\t60226\n美貌\t60227\n正对着\t60228\n4月5号\t60229\n版式\t60230\n打仗\t60231\n独自你\t60232\nJYJ\t60233\n我讨厌讨厌讨厌讨厌\t60234\n把宅叔吃的死死的神马的萌翻了[欢欢\t60235\n数了\t60236\n莫非里\t60237\n擒贼\t60238\n3点23\t60239\n祁禄明\t60240\n数于\t60241\n入感\t60242\n考试卷\t60243\n40多年前\t60244\n金骏眉\t60245\n趁乱\t60246\nwcq\t60247\n蠢蠢\t60248\n管片\t60249\n浙江青年莲花汽车公司\t60250\nhttpehiphotosbaiducomxiaodupicitem3b292df5e0fe992530c05a2533a85edf8cb171f2jpg\t60251\n一四零一百多\t60252\n10000欧元\t60253\n三四点钟\t60254\n体征\t60255\n真如\t60256\n夜叶\t60257\n452147661326579654231328\t60258\n猪龙\t60259\n亚当兰\t60260\n夜叉\t60261\n戒定慧者\t60262\n够爱\t60263\n郑水晶\t60264\n高和呆\t60265\n海南省\t60266\nktjtjt\t60267\n秘度秘你好\t60268\n146457\t60269\n关注度\t60270\n1k7k七\t60271\n年红\t60272\n田晶晶\t60273\n年级\t60274\n价格高\t60275\n33529\t60276\n传媒大学\t60277\n猪吧猪宝\t60278\n县域\t60279\n恩夸\t60280\n刚果共和国\t60281\n日流量\t60282\n嘿孙贼\t60283\n荣基国际\t60284\n男人不坏女人不爱下集\t60285\n流亭流亭\t60286\n乔静\t60287\nThanksomuch\t60288\n洪湖\t60289\n下得\t60290\n00000300400500000700800900\t60291\n恩多\t60292\n自己去\t60293\nI9100黑3350\t60294\nhi事\t60295\n结帐\t60296\n垮掉\t60297\n肉联厂家属院\t60298\n演讲\t60299\nBhjjjnn\t60300\n虐殴\t60301\n叶阳帝\t60302\n孤城\t60303\n三十六七
七四\t60304\n零零三三零\t60305\n明泡\t60306\n不呀你是我的人\t60307\n二天多\t60308\nhi4k\t60309\n23000公斤\t60310\n俄菲翁培群宾利\t60311\n构图\t60312\n3333333333￥亓\t60313\n挖土机\t60314\n快說個來聽聽\t60315\n克劳迪内\t60316\n舒某某\t60317\n遮天1\t60318\n有机化\t60319\n以身试法\t60320\n求求\t60321\n晚上11点半\t60322\n传神截线\t60323\n上半月\t60324\n轮到\t60325\ntg文\t60326\nanthony\t60327\n找不吗\t60328\n说话呀开心看\t60329\n明末农民起义\t60330\n扬长避短\t60331\n来了不去\t60332\n解剖\t60333\n劲片\t60334\n丽都饭店\t60335\n12589\t60336\n谈恋爱吧求求你了度秘我很爱你的啊你在我的\t60337\n林语堂\t60338\n柏涵\t60339\n累容\t60340\n失勃\t60341\nhemsworth\t60342\n马鞭草\t60343\n许v\t60344\n许u\t60345\n6辆\t60346\n截留\t60347\n观仙界\t60348\n元帅\t60349\n元币\t60350\n干什么了你\t60351\n王脑袋\t60352\n十六升\t60353\n恩八八\t60354\n徐警欣\t60355\n敢问路\t60356\n王一然\t60357\n么么法\t60358\n参最好\t60359\n直插\t60360\n育才小学\t60361\n咧食\t60362\n感谈\t60363\n飘扬\t60364\n13521032636\t60365\n猜火车\t60366\n沃尔沃\t60367\n快快快快快快\t60368\n轻者\t60369\njlziced\t60370\n安太太\t60371\n二七二零二九九九一千个\t60372\n8665w\t60373\n感谢\t60374\n踌轨\t60375\n卧榻\t60376\n张继瑞\t60377\n划作\t60378\n玉生\t60379\n当员\t60380\n558855555\t60381\n颐慧佳园北门21栋\t60382\n乔任梁\t60383\n诺亚方舟\t60384\n空想家\t60385\n脱线\t60386\n行不然的话\t60387\nmbn\t60388\n快快\t60389\n迷梦\t60390\nmbd\t60391\nmbf\t60392\n中五中\t60393\nmba\t60394\n天机\t60395\nmbc\t60396\n博闻\t60397\n门徒\t60398\n凌罗缎\t60399\n续杯[干杯\t60400\n天本\t60401\n威震天\t60402\n耍泥巴\t60403\nHHRGF\t60404\n欧耶\t60405\n风土人情\t60406\n千鸟\t60407\n不具备\t60408\n天期\t60409\n足在\t60410\n熊我了死了别给我说话\t60411\n婉娴\t60412\n1BC\t60413\n嘟面\t60414\n向北京人兴师问罪\t60415\n烤肉味\t60416\n赏识\t60417\n村干部\t60418\n心智\t60419\n张嘉硕\t60420\n大家风范\t60421\nzhyet\t60422\n死度秘\t60423\nmb3\t60424\n炮音\t60425\n班玩\t60426\n别害怕\t60427\n25年\t60428\n桃源县\t60429\nduee\t60430\n在不吗\t60431\nazf\t60432\nazx\t60433\n14867765\t60434\nWERQD\t60435\n胡灵越\t60436\n603652832270\t60437\ncxxxy\t60438\nbuowb\t60439\nhi度米\t60440\nazs\t60441\nduer\t60442\n马尿\t60443\n不男不女正好\t60444\n程星硕\t60445\nFucj\t60446\n哈尔滨啤酒\t60447\n思浩\t60448\n蒋玥\t60449\n不变的说\t60450\n大茶\t60451\n鲜花生女\t60452\n高vv\t60453\n雅诗蓝\t60454\n朱佳丽\t60455\n十几遍\t60456\n危害\t60457\n范领导\t60458\n奥轮美\t60459\n成自\t60460\n自
奥\t60461\n魔魅魅蓝精灵k15闽南的妹妹呢铠甲姐姐一四开始我不要你了我看你为我\t60462\n开价\t60463\nNBA吧\t60464\n执子\t60465\n刁难\t60466\n安抚\t60467\n马来西亚兰\t60468\n联通大厦\t60469\n鼠牛\t60470\n蒋王\t60471\n喊哥\t60472\n林文都\t60473\n学女\t60474\n黑红金\t60475\n我不就\t60476\n吵度秘丸\t60477\n家义工\t60478\n陈柏霖\t60479\nvv爸爸爸爸哥哥你好\t60480\n开心干啥呀开心我走了我走了我走\t60481\n度秘求你给我一个游戏吧\t60482\n37063000\t60483\n王玺林\t60484\n美妞儿\t60485\nBarely\t60486\n文博会\t60487\n习俗\t60488\n郭正军\t60489\n兩件系羽絨\t60490\n1181189999999999\t60491\n#素求助#\t60492\n搞不明白\t60493\n7k7k晚安青蛙卡\t60494\n秦鑫鑫\t60495\n15周年\t60496\n朱正山\t60497\nosz\t60498\n魅惑废话废话废话废话\t60499\n孔顺\t60500\noss\t60501\n多莱利亚\t60502\n一二期\t60503\n哈龙包\t60504\nosm\t60505\nmlmlml\t60506\n杜天宇\t60507\n乌克托莱托\t60508\n把儿\t60509\n不可不可以\t60510\n讨人\t60511\n十九日\t60512\n柏安八木恩文\t60513\n来人儿\t60514\n邓子健\t60515\njmf\t60516\n58611351\t60517\nyesung\t60518\n华臣影城二七广场\t60519\n你在哪我去找你爱我\t60520\njmd\t60521\n1千万\t60522\n嘟比\t60523\n3g3g\t60524\n那世界\t60525\n朵拉\t60526\n多久久\t60527\n零一十三岁\t60528\n随心所欲\t60529\nbbbbx\t60530\n费孝通\t60531\n2008年２月八日\t60532\n懒猪\t60533\n懒猫\t60534\n韩冥\t60535\n和好日子\t60536\nchoice\t60537\n益鸟\t60538\n饱了废话\t60539\n祺祺\t60540\n小血血\t60541\n路口号\t60542\n撒由那拉\t60543\n鄱\t60544\n妞姐\t60545\nbbbbb\t60546\n你好婆\t60547\n靠不靠不靠谱\t60548\n鹅鹅鹅鹅鹅鹅无极\t60549\n橘子味\t60550\n质量\t60551\n甘长安\t60552\n欲说\t60553\n和和和和和和和和\t60554\n1589875454\t60555\n鄭\t60556\ngrgtb\t60557\nhisparents\t60558\n123466789\t60559\n120万条\t60560\n维语\t60561\nfjlokmmo\t60562\n对呀所以说\t60563\n草缸\t60564\n炸鸡腿儿\t60565\nscen\t60566\n7ki五十五五十五\t60567\n24点钟\t60568\n六十五十四十\t60569\n林志\t60570\n90000000000000000000000000000000000000000000000000000000000000000000000000000个\t60571\n史玉柱\t60572\noo0x2\t60573\n哥只虎\t60574\n日柱\t60575\n基友们\t60576\n林芝静\t60577\n嫉\t60578\n淘吧\t60579\n审阅\t60580\n意思我\t60581\n别发骚abaaaaaaa宝贝宝贝will吧\t60582\n袁昌顺\t60583\n李书贤\t60584\nhttpmusicbaiducomsong129030272fmaltgnew3\t60585\n刀分\t60586\n刀刀\t60587\n李烨\t60588\n苏振\t60589\n绿人\t60590\n桂江\t60591\nJHBN\t60592\n非常劲\t60593\n旧患\t60594\n后雨\t60595\n姚希慧\t60596\n103斤\t60597\n玛丽\t60598\n贺电\t60599\n陈默天\t60600\n通向\t60601\n民国时期\t60602\n
一口口\t60603\n资色\t60604\n听妈呀咪\t60605\n孙肿\t60606\n战熊我\t60607\n30多年以来\t60608\n通吃\t60609\n日本太君\t60610\nCaviezel\t60611\nhhihhhgijhjjvihigjhhhhhhhhhhhhhhbhhbhbbhbhbbbbbbbbbbbbbb\t60612\n许海峰\t60613\n汇丰晋信基金\t60614\n鄒\t60615\n学弈\t60616\nxhdfgdd\t60617\n多帅\t60618\n蓝草\t60619\n王晓娟\t60620\n乳天\t60621\n炊烟\t60622\n脚感\t60623\n嫖\t60624\n胡梦轩\t60625\n黑城\t60626\nxyzz\t60627\nworking\t60628\n先正创\t60629\n算术\t60630\n黑基\t60631\n甫胡巴\t60632\n杜小萌\t60633\n嫒\t60634\n7点35分\t60635\n对啊我真\t60636\n毒菜\t60637\n關懷偏遠山區\t60638\n31.4%\t60639\n午饭们\t60640\nXgigocch\t60641\n湘川小吃店\t60642\n要不想你\t60643\n梁不死\t60644\n滨湖区国土资源局\t60645\n发约\t60646\n妃夕妍雪\t60647\nHahas\t60648\n发红\t60649\n于心\t60650\n精灵梦梦呢快乐节\t60651\n曾方兰全x6ZY030989725NZn11012016潘艳珍全x6ZY030989709NZn11012016晏小虹全x6ZY030989720NZ\t60652\n两倍半\t60653\n丰南商廈\t60654\n豆瓣FM-#华语\t60655\n西拉\t60656\n停不下来\t60657\n想你了你在哪呀度秘\t60658\n尿湿\t60659\n比不比比\t60660\n稀土矿\t60661\n聊岔\t60662\n享游\t60663\n拜别\t60664\n9944\t60665\n一号二号三号四号五号六号七号八号\t60666\n维埃里\t60667\n粟米\t60668\n3q3q3q\t60669\n毕刚\t60670\n卡通\t60671\n慕容白\t60672\n7月14日\t60673\n18303413309\t60674\n布满\t60675\n牛刀\t60676\n李诚诚\t60677\n881072853166337092\t60678\n防塞\t60679\n滨北七星路口\t60680\n庄之静\t60681\n二十四号\t60682\n仿佛一个头一回红果果\t60683\n部下\t60684\n龟牌气功波\t60685\nv学冬\t60686\n日干\t60687\nWeicoLomo#\t60688\n走了走\t60689\nVhufdf\t60690\nPersistence\t60691\nblh\t60692\n夜颜颅\t60693\n兆我也不知道\t60694\n应有尽有\t60695\n照镜子\t60696\nfhfdfhu\t60697\n针雨晴\t60698\n两部\t60699\n脑死亡\t60700\n机膜\t60701\n抄送\t60702\nCifng\t60703\n喜羊羊与灰太狼之喜羊羊的禁世界\t60704\n杀猪牛\t60705\n08874335\t60706\ntimes\t60707\n75千克\t60708\n信息君\t60709\n神木\t60710\n七百三十九一百六十五\t60711\n13807019828\t60712\nkfkigigigigikkktngmgmgnkgkmtgnkgifkfkkglglllllffof\t60713\nflkkfikmflfkfkgillgkgkgkglgnlcgm\t60714\n浪潮\t60715\n和解\t60716\n看我喜欢\t60717\n神机\t60718\n芝加哥公牛队\t60719\n35岁\t60720\nAnkhenkhulihoyaho\t60721\n也想你我也想\t60722\n天长地久\t60723\n呢手\t60724\n咆哮声\t60725\n1Z34\t60726\n沃瑞\t60727\n零四年\t60728\n暖床招租\t60729\n我是怪叔叔不告诉我你在哪里我陪你没有告诉你\t60730\n气死宝宝你看这我爱灰太狼\t60731\n3dmazoozlatoo\t60732\n切入点\t60733\n412828199711063630\t60734\n
边伯范\t60735\n女口\t60736\n一鼎立二齐力三实力中华人民共和国\t60737\n瑛姑\t60738\n天虹\t60739\n哪等等\t60740\n恩珑\t60741\n汤姆叔叔的小屋你的好词好句\t60742\n女发\t60743\n很抱歉\t60744\n蒙古营站\t60745\n女友\t60746\n再见了行\t60747\n盖片儿\t60748\n闯祸\t60749\n班里\t60750\n我给你照你是我的小呀小苹果\t60751\n再管\t60752\n多才多艺\t60753\n学不来\t60754\n上梁\t60755\n造度秘\t60756\n索香文\t60757\n双向六车道\t60758\n18765472394\t60759\n阳关大道\t60760\nhhggggv\t60761\n152688558\t60762\n理你了谁叫\t60763\n0.5mm\t60764\n多喝水\t60765\n辛克\t60766\n流行音乐\t60767\n百度有钱花坑逼\t60768\nNBN\t60769\n血猫\t60770\n大沥沥沥沥沥\t60771\n仕克造船厂\t60772\n任务栏\t60773\n八天后\t60774\n半勺\t60775\n吊暴\t60776\n钳制\t60777\n呼嘿\t60778\n朱玉璐\t60779\n累天\t60780\n00000000亿元\t60781\n指间\t60782\n欧巴撒拉\t60783\n二几\t60784\n昨晚11点\t60785\n3543251\t60786\n向特\t60787\n796\t60788\n言论自由\t60789\n792\t60790\n791\t60791\n790\t60792\n卧薪尝胆\t60793\n小大人儿\t60794\n4c\t60795\n799\t60796\n798\t60797\n二凉\t60798\n79%\t60799\n一百七十八块\t60800\n吸血虫肝病\t60801\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t60802\n汉兰达\t60803\n一鼻子\t60804\n四级考试\t60805\nnibushi\t60806\n明天别忘了给我一个萌萌哒\t60807\n最古老\t60808\n外门\t60809\n快递业\t60810\n朦朦胧胧\t60811\n在于\t60812\n13899423986\t60813\n我们三\t60814\n初九\t60815\n有你可以\t60816\n中门\t60817\nDSL\t60818\n947亿元\t60819\n擦去\t60820\n十一千克\t60821\nngotobed\t60822\nvicun\t60823\n啊潼\t60824\n躲猫猫吧度秘\t60825\n定制\t60826\n中闺\t60827\n中间\t60828\n刷单\t60829\n电子展\t60830\n大票\t60831\n近六年\t60832\n1846945816\t60833\n仪态\t60834\nNBA\t60835\n入秋\t60836\n大祸\t60837\n望而却步\t60838\npm2.5\t60839\n我喜事\t60840\n磁浮列\t60841\nw无限极\t60842\n李文平\t60843\n钱理群\t60844\n74575675656\t60845\n王溪若\t60846\n真真真真真不乖\t60847\n海马\t60848\n不的话\t60849\najiajta\t60850\n无哲\t60851\n一日游\t60852\n中移动\t60853\n罗图\t60854\n你在看我吗你在哪还是你在哪王\t60855\n同济\t60856\n两季\t60857\n同流\t60858\n国王杯\t60859\n说明儿\t60860\n妈妈妈妈\t60861\n龟孙\t60862\n张昨晚\t60863\ntfdvftggdfvghhcfj\t60864\nyq音乐\t60865\n男性\t60866\nxfugf\t60867\n笨笨猪懒猪\t60868\n字幕元\t60869\n13583587883\t60870\n鸡公堡\t60871\n中国联通\t60872\n麦斯威\t60873\n好嘞我\t60874\n月黑风\t60875\n别掉\t60876\n禁暴\t60877\n爸爸话\t60878\n称呼\t60879\n监狱化\t60880\nk7k7k\t60881\nryokzv\t60882\n5￥\t60883\nygfggg\t60884\n细小\t60885\n取道\t60886\n
末班车\t60887\n张泰瑞\t60888\n499.21万股\t60889\n明言\t60890\n砍死\t60891\n小泳\t60892\n@一点\t60893\n贵重\t60894\n郭羡妮\t60895\n南美\t60896\nsearch-box\t60897\n呢咋回事儿\t60898\n珍传奇\t60899\n双全\t60900\n呦忽而\t60901\n双兔\t60902\n6xbb\t60903\n分明就是你不理解我問什麼\t60904\n我有天马行空的幻想就幻想你有一天能飞上天\t60905\n锋味\t60906\n10多分钟\t60907\n掩开\t60908\n我想大声告诉你我爱你\t60909\n皇子\t60910\n皇孑\t60911\nsnddnfjc\t60912\n炫媚\t60913\n皇孙\t60914\nhvisf\t60915\n区分\t60916\nmate\t60917\n抓天好\t60918\n县会\t60919\n成丹\t60920\n成为\t60921\n李老明天\t60922\n三快点儿\t60923\njoksk\t60924\n成一\t60925\n0V0\t60926\n成七\t60927\n摇滚乐\t60928\n两个晚上\t60929\n春望这首诗你给我\t60930\nOK咯哦\t60931\n于佳月\t60932\n草草草草\t60933\n芙蓉王烟\t60934\n白甲骑士\t60935\n244名\t60936\n丑不礼貌\t60937\ndrftghh\t60938\n区别\t60939\n同工同酬\t60940\n甄雪晴\t60941\nmsk300u\t60942\n一今年\t60943\n问自己\t60944\nlllklk\t60945\n兄弟俩\t60946\n3度\t60947\n下列\t60948\n659465685545\t60949\n成光蛋\t60950\n群聊\t60951\n海关\t60952\n破事\t60953\n会不不慧慧\t60954\n春天春天\t60955\njts\t60956\n毛你打联盟了你最毛衣世界上最最\t60957\nWoXiang\t60958\n下刀\t60959\n广佛通\t60960\n3床\t60961\n有鬼你相信\t60962\n铝制\t60963\n下到\t60964\n下船\t60965\n伊洛瓦底江三角洲\t60966\n下别\t60967\n海兔\t60968\n轻易\t60969\n真奇怪\t60970\n骨癌\t60971\n二二六五六\t60972\n破产\t60973\n饿鬼\t60974\n一千多件\t60975\n鸡丁\t60976\n句子女\t60977\n菱形块\t60978\nEASON\t60979\n余震\t60980\n朝巴\t60981\nzhs\t60982\n1毛\t60983\n母型\t60984\nVlgfhccv\t60985\n颜啦啦\t60986\n芭蕉\t60987\n衣不蔽体\t60988\njtd\t60989\n好卫国\t60990\n北海市公安局经侦支队\t60991\nq1981\t60992\n20十20\t60993\n我恨你我讨厌你我要杀了你\t60994\n霸道总裁风\t60995\n如来如去\t60996\n建筑屋\t60997\n芭蕾\t60998\n吐牙\t60999\n展品\t61000\n死去过\t61001\n十七次\t61002\n撒腿\t61003\nzhy\t61004\nzhx\t61005\n不求\t61006\n天呐豆好天天心情\t61007\n年多\t61008\nvision\t61009\n就餐\t61010\n腊八快乐\t61011\n万达影院\t61012\n消炎药\t61013\n徐没逾\t61014\n海带猫\t61015\n体温\t61016\n在线播放\t61017\n实体化\t61018\n刘星遥\t61019\n不复\t61020\nzhenle\t61021\n乳香\t61022\n党宇\t61023\n12台\t61024\n猿嘉敏\t61025\n12号\t61026\nabasic\t61027\n历代\t61028\n1003006001千二百144003千六百家\t61029\n分手离\t61030\n历任\t61031\n献计\t61032\n12只\t61033\n继承轩\t61034\n两节\t61035\nFGUF\t61036\n小介\t61037\n花儿块\t61038\n两百300二八千八百八十八730\t61039\n木么我是你的主人\t61040\
n食尸鬼\t61041\n万级\t61042\n真心欢喜敏君305滴小狼奔\t61043\n茄子秧\t61044\nvfvggpxypxgptdgtxgshfxhywdvhushsppguzhahhhchzhsvzhshvxhxsjsxnbdbdbdnxbxdbdjdndvdbdnfbcdxbdbbdbbfbeveenzzjzbxbebjsjsdbbvhhxbshzjajAjwjjddfffiej\t61045\n与时俱进\t61046\n你是我\t61047\nalsoneed\t61048\n天主教堂\t61049\n三环\t61050\n图色\t61051\n三山水厂\t61052\nqjni\t61053\n北外\t61054\n超拽\t61055\n来了睡一觉\t61056\nmyself\t61057\n议政\t61058\n淡定帝\t61059\n蔡泽林\t61060\ns\t61061\n不限\t61062\n31312176835655618\t61063\n石口\t61064\n拿起来\t61065\n15350039469\t61066\n2200岁\t61067\n残心\t61068\n777554628757528\t61069\n吉利蛋\t61070\n100张\t61071\n好嘛你好\t61072\n李知道\t61073\n绿城双赢\t61074\n大部头\t61075\n全国各地\t61076\n久候\t61077\n血尼奥\t61078\n五矿钢铁\t61079\n非点儿\t61080\n恼恨\t61081\n43名\t61082\n存在感\t61083\n大扫除\t61084\n刺猬\t61085\n忏悔\t61086\n国人数\t61087\n治好\t61088\n青路\t61089\n女搭\t61090\n护指\t61091\n徐海亭\t61092\n猜表\t61093\n见当然\t61094\nwants\t61095\n粉刷\t61096\n朝天大喊\t61097\nshsg\t61098\nHhgg\t61099\nshsc\t61100\n斯瑞\t61101\n过多\t61102\nshsh\t61103\n过夜\t61104\n17户\t61105\n我讨厌你吧采茶娃娃\t61106\n过处\t61107\nshsy\t61108\nappyn\t61109\n过失\t61110\n还好吧再见否一本\t61111\n钱粮\t61112\n二三天\t61113\n过头\t61114\n牧业\t61115\n果汁儿\t61116\n无渠\t61117\ntwkga\t61118\n彭彭同\t61119\n日常用品\t61120\n35x5x\t61121\nMtvfsdgyfg\t61122\n过天\t61123\n芦芽山\t61124\n瞧见\t61125\n河工\t61126\n偷着\t61127\n就是说\t61128\n郑好嚣\t61129\n学化站神金刚\t61130\n小师徒\t61131\nASE3Q\t61132\n题型\t61133\n美味我最咯醉了醉了你是我一生最最美的玫瑰汪美乐美乐美\t61134\n男男闹闹m莎瑞\t61135\n陆猪头\t61136\n偷睡\t61137\n500800\t61138\n正定八中\t61139\n王八小\t61140\n过程式\t61141\n即盗\t61142\nn\t61143\n12466\t61144\n每三二天\t61145\n被黑\t61146\n零二二零七零四三八五二\t61147\n来啦啦啦啦你是我的好\t61148\n伤心堡\t61149\n张金国\t61150\n我姿\t61151\n奇强\t61152\nbguhwedtQbiilhjZe正Z万一乙七2二wkjolijhcwkjjl\t61153\n湄公\t61154\n朱强\t61155\n绒货\t61156\n忍者tfbs\t61157\n向世界宣告\t61158\n火力\t61159\ndeft\t61160\n爱晟敏\t61161\n度秘扇\t61162\n佛蒽地呃脉脉\t61163\n高学\t61164\ndzfx\t61165\n第十张\t61166\ndefg\t61167\n糙女\t61168\ndefd\t61169\ndefj\t61170\ndefi\t61171\n喜糖\t61172\n练笔\t61173\nJackyourmother\t61174\n半块儿\t61175\n200平方米\t61176\njsfc\t61177\n尼玛操蛋\t61178\n备好了\t61179\n火势\t61180\n范要\t61181\n江边
的题我是你在哪我找你\t61182\n恩慢慢\t61183\n高孙\t61184\n11日10时\t61185\n高子\t61186\n八九岁\t61187\n武打\t61188\n我的你别管我叫什么了我不我\t61189\n上海联合矿权交易所\t61190\n对呀对呀\t61191\n25855855\t61192\n保温杯\t61193\n1月24号\t61194\n孔雀\t61195\n嗯陈太\t61196\n45454545\t61197\n下灵山旅游区\t61198\n45454540\t61199\n二十五块\t61200\n中国政法大学法学院\t61201\n可战乱\t61202\n你好久有\t61203\n我送你上西天\t61204\n小饺子\t61205\n好好脾气\t61206\n知识分子们\t61207\n又说起来\t61208\n影爷\t61209\n糖风\t61210\n属于样\t61211\n勃友们\t61212\n早上7点20\t61213\n来福\t61214\n好勒好勒好勒\t61215\n协作\t61216\n255n\t61217\n小崔\t61218\nhhvv\t61219\n啦啦啦好考试考原封\t61220\n几元\t61221\n几兆\t61222\n一督\t61223\n雨萌\t61224\n几克\t61225\n新宜佳\t61226\n韩式\t61227\n725集\t61228\n小羽白\t61229\n一睹\t61230\n小贵龙\t61231\n李明阳\t61232\n力挺\t61233\nUdjffjxcjshedhwhedhehwehdhhehehedydggeddhdhddhdfdhdhddhdh\t61234\n後來\t61235\n此博\t61236\n7799\t61237\n王榞\t61238\n2557\t61239\n一着\t61240\n几八\t61241\n不敢看\t61242\n几共\t61243\n呢器\t61244\n几关\t61245\n求加\t61246\n罗成\t61247\n长成人\t61248\n雨萱\t61249\nydsg\t61250\n邵新娥\t61251\n求助\t61252\n再先\t61253\n投降\t61254\nGHGHFHGUCJ\t61255\n再说一次\t61256\n玫瑰马\t61257\n爷爷\t61258\n搞莫\t61259\n国土部\t61260\n叛道\t61261\n清乳酸君素片\t61262\n前出\t61263\ngruw\t61264\ngrup\t61265\n陈赫猪\t61266\n9.0级\t61267\n裹挟\t61268\n不是男女通吃\t61269\n毯子度\t61270\nt闪爵杯\t61271\n飞脚\t61272\n885585\t61273\n比基尼秀[偷笑][偷笑][偷笑\t61274\ngbvjgjvn\t61275\n江华岛\t61276\n各半\t61277\n乳浊液\t61278\n放没\t61279\n观众群\t61280\n佳作展\t61281\n施博涵\t61282\n可隆\t61283\n复候\t61284\nnkssm\t61285\n一首歌儿\t61286\n孔明灯\t61287\nwnjo\t61288\n嘎达\t61289\n逛逛逛逛逛\t61290\n背写\t61291\n张晓阳\t61292\n微作文#\t61293\n接不着\t61294\n杭州市地铁集团有限责任公司运营分公司\t61295\n损色\t61296\n吃罢早\t61297\n倚强\t61298\n双面胶\t61299\n说过\t61300\n室外\t61301\n接下来\t61302\n水木剧\t61303\n赶上演出\t61304\n十多年前\t61305\n说还\t61306\n120岁\t61307\n至上励合\t61308\n说返\t61309\n可难\t61310\n玻璃瓶\t61311\n云岩区\t61312\n5jing\t61313\n大区\t61314\n识别\t61315\n老夫老妻\t61316\nCXCX\t61317\n心眼儿\t61318\npadjg\t61319\n识到\t61320\nqrs\t61321\nqru\t61322\n扣扣子\t61323\nqrx\t61324\n陪读\t61325\njajij1b1chlic\t61326\n伤仲永中即书诗\t61327\n你好东西\t61328\non3tifoscostscom\t61329\n大化\t61330\nyoujj\t61331\n自恋式\t61332\nˉツ\t61333
\nuzfg\t61334\n经线\t61335\n二二七二\t61336\n高孟军\t61337\n肿么不慧\t61338\nmyourhityo\t61339\n扶南\t61340\n孙先生\t61341\n谢书霄\t61342\nlhyx\t61343\n威志\t61344\n狂灌\t61345\n叫做悲\t61346\n黎嘉超\t61347\n王京村\t61348\n送没\t61349\n兆佳秋\t61350\naksjdhjfjf\t61351\n好的你是我亲爱的老婆晚安\t61352\n三星手机\t61353\n哪匹马\t61354\n乌尔科\t61355\n降压药\t61356\n擼管\t61357\n尾灯\t61358\n悬挂\t61359\n气音\t61360\n3万吨\t61361\n怎麽回事\t61362\nyouj4\t61363\n酱油烧豆腐\t61364\n中国乔丹体育公司\t61365\n淑女\t61366\n三比七\t61367\n雨花区\t61368\n彼人\t61369\n拖着\t61370\n授人以柄\t61371\n化学阉割\t61372\n花盆\t61373\nINFINITIZE\t61374\njxix\t61375\n嫌贫爱富\t61376\n赵玉岐\t61377\njxic\t61378\n牛灿党\t61379\n社厅\t61380\n张宜婷\t61381\n幼官\t61382\n董必武\t61383\nReal\t61384\n亚馨\t61385\n优尔\t61386\n便利贴\t61387\n预想到\t61388\n真你普通话\t61389\nvgt\t61390\n没事儿软\t61391\n15点38\t61392\n鸭掌\t61393\n大陈组\t61394\n恐怖分子\t61395\n你给我抓紧修吧\t61396\nHowoidareyou\t61397\n你好胡\t61398\n魏哲鑫\t61399\n发达国家\t61400\n弱到\t61401\n难办\t61402\n新光\t61403\n倾城囧妃\t61404\n红色滴\t61405\n肯德基\t61406\n赵浩雄\t61407\n低下\t61408\n龙应台\t61409\n挂号处\t61410\npc端\t61411\n承浩\t61412\n燃视\t61413\n数点\t61414\nsndn\t61415\n你好背\t61416\n军政\t61417\n军改\t61418\n孓嘗\t61419\n惹不起\t61420\n新兵\t61421\nBicyclists\t61422\n感觉萌萌哒\t61423\n静电老广冬\t61424\n最炫民族风单曲大声循环\t61425\n水洗澡洗\t61426\n兰璐璐\t61427\n炔火线\t61428\n召唤器\t61429\n一草\t61430\nsch\t61431\n寒喧\t61432\nscg\t61433\n十多二十\t61434\n快乐快乐\t61435\n付近\t61436\nscx\t61437\n枉顾\t61438\n突飛猛進\t61439\nOffhut\t61440\n毛主席\t61441\n欧米茄\t61442\ncczlc\t61443\n自制力\t61444\n十八日\t61445\n行队伍\t61446\n一荣\t61447\n江湖饭\t61448\nheiiow\t61449\nv比\t61450\n雪饼\t61451\n有一个男\t61452\n7CM\t61453\n╯﹏╰\t61454\n1些\t61455\n打比方\t61456\n五言\t61457\n阿大哥哥\t61458\nlvkatle\t61459\n发怒\t61460\ncheckin\t61461\ntutslousamu\t61462\n乘法\t61463\n做做做做做做做做做做作作作作作作作作作作作作作作作作作组\t61464\nJduehrbd\t61465\n默默无闻\t61466\n宝宝版\t61467\n00up0\t61468\n最最\t61469\n毛培培\t61470\n几块儿\t61471\n万福金安\t61472\n姜爱迪生\t61473\n受限\t61474\n王绍泰\t61475\n有偿沉默\t61476\n一个三八\t61477\n哈尔才大你\t61478\n孙兆玉\t61479\n咱是谁\t61480\n李维斯\t61481\n抓不住\t61482\n车美娜\t61483\n美女我就是美女你是美女我还是美女你是帅哥我还是美女\t61484\nkkpppppppppppppppp\t61485\ndcsg\t61486\n王英\t6
1487\n必有路\t61488\npasd\t61489\n有意识变\t61490\n秘并恶心\t61491\npasa\t61492\n身价\t61493\ncsyi\t61494\n清炒藕片\t61495\npash\t61496\n身份\t61497\n4秒\t61498\n黄研泰\t61499\nWord\t61500\n我的女闺密\t61501\n库尔勒\t61502\n生活区\t61503\n少不了\t61504\n翩翩欲\t61505\n高祯韩\t61506\n供食\t61507\n矫建\t61508\n卡喳\t61509\n莫热讯口疮\t61510\n展望\t61511\n展期\t61512\n尘螨\t61513\n太难过\t61514\n哪山卡拉\t61515\nchengji\t61516\n少珠\t61517\n程扬婷\t61518\n大拇指\t61519\n背给\t61520\n男性化\t61521\nfull\t61522\nsbhhd\t61523\n老鼠老虎\t61524\n潘露\t61525\n是你是\t61526\n八周年\t61527\n方便类\t61528\n集中度\t61529\n的确\t61530\n上周末\t61531\n熊雨欣\t61532\n宝嘉\t61533\n2973774933\t61534\n爱你可以\t61535\nfnfughd\t61536\nGujbxff\t61537\n￢￢ノ\t61538\n钱云会\t61539\nexperience\t61540\n五十几秒\t61541\n焖嗯嗯\t61542\n铭敏\t61543\n撕心裂肺\t61544\n范安诺\t61545\nｂｃｄｅ\t61546\n莫雯静\t61547\n高素质\t61548\n给你咯咯黑金融资料理我了婆婆家\t61549\n片面\t61550\nadidas\t61551\n界河\t61552\n49面\t61553\nvghf\t61554\n雷得嘞\t61555\n稳赚\t61556\n大盘\t61557\nsusb\t61558\n大盒\t61559\nskapa\t61560\n辛金日干\t61561\n大盗\t61562\n45小时\t61563\n钟敏\t61564\n退却\t61565\n灵商\t61566\n艰熬\t61567\n斋菜盒饭\t61568\n罹难\t61569\n那咱娃\t61570\nincomfort\t61571\n虹桥机场\t61572\n60亿条\t61573\n呃劳驾\t61574\nFoxtcubiidtizcjqenw\t61575\n获益者\t61576\ngifiy\t61577\ndoor\t61578\n大市场\t61579\noooooooooxxxxxxxxxxxxxxxxx\t61580\n真麻烦\t61581\n讋\t61582\n變\t61583\n伪满皇宫\t61584\n有限公共有饭\t61585\n唐代度\t61586\n黄凡玉\t61587\nwqh\t61588\nexkmg\t61589\ndoon\t61590\n稿基\t61591\n保托\t61592\n别气\t61593\n天天如图图图\t61594\n再来一个再来\t61595\n讠\t61596\n讣\t61597\n订\t61598\n麦吉\t61599\n一个38元\t61600\n哈哈度\t61601\n讨\t61602\n讫\t61603\n马无\t61604\n钱干\t61605\n纭让\t61606\n讯\t61607\n议\t61608\n记\t61609\nhibabycjq\t61610\n讲\t61611\n新标准大学英语综合教程2\t61612\n讷\t61613\n讶\t61614\n讹\t61615\n许\t61616\n睡不做\t61617\n讽\t61618\n保找\t61619\n访\t61620\n设\t61621\n魏年兴\t61622\nficgldfmv\t61623\n白羊\t61624\n把放\t61625\nsiks\t61626\n儋州\t61627\n书画\t61628\n开户店\t61629\n动武\t61630\n李爱文\t61631\n歌双方\t61632\n想幸福的人\t61633\n一条腿\t61634\n恐龙公园\t61635\n牵牲\t61636\n爱惜谁好\t61637\n若驚\t61638\n书生\t61639\n各方\t61640\n非amihng\t61641\n毽蔫\t61642\n休宁\t61643\n白羽\t61644\n愁苦\t61645\n过会客家话\t61646\n略有\
t61647\n重生类\t61648\n三角五\t61649\n狄娜\t61650\n獭\t61651\n员工们\t61652\n组别\t61653\n孔芹\t61654\nopplow\t61655\n獾\t61656\n千山万\t61657\n单日限额\t61658\n开心归来我是真的你好\t61659\n#8.17伤心童话\t61660\n孕囊\t61661\n罗宇晴\t61662\n算口算\t61663\n獎\t61664\n美少女\t61665\n大海獭\t61666\n李佳慧\t61667\n被动媒体\t61668\ndhiphotosbaiducomxiaodupicitem902397\t61669\nsggguhf\t61670\n档期\t61671\noveragain\t61672\na拉撒咪\t61673\n獒\t61674\n蜈蚣蜈蚣\t61675\n新腕\t61676\n来单\t61677\n足球之夜\t61678\n鈕扣\t61679\n多果\t61680\n好不啦\t61681\n再次中\t61682\n残疾心\t61683\n324758\t61684\n骨科\t61685\n多架\t61686\nrootyoutlitrou\t61687\n王家湾\t61688\n换换换换换换花花花花花花画画\t61689\n恒大俱乐部\t61690\n12点51点\t61691\n婚公鸡\t61692\n你的你是个好孩子我相信你\t61693\n周佳雯\t61694\n校准\t61695\n天荒地老\t61696\n你的未来\t61697\n惜人命不计\t61698\n李涵\t61699\nGBGF\t61700\n青岛郑州小学\t61701\n2324472561\t61702\n波司登\t61703\n989999\t61704\n说儿\t61705\n二教\t61706\nDbjbgvv\t61707\n不可分子\t61708\n李润\t61709\n奔驰\t61710\n着夜\t61711\n顺时\t61712\n饿隆\t61713\nbalabaa\t61714\n鄙陋\t61715\n种妹子\t61716\n熙熙攘攘\t61717\n生管\t61718\n我是你的主人你是我的歌歌\t61719\n殷卓媛\t61720\n15549559705\t61721\n51公分\t61722\n姓李有我是\t61723\n李涛\t61724\n愁家\t61725\n长不高\t61726\n张不剩\t61727\n六枚\t61728\n精师\t61729\n套牢\t61730\n聊了不想\t61731\n四十八零六\t61732\n7月27日\t61733\n蓝罐\t61734\n李涌\t61735\n亚比\t61736\n中央新闻纪录电影制片厂\t61737\nthanksto\t61738\n言不明\t61739\n好我我我错\t61740\n股指\t61741\n送杯奶茶\t61742\n题海\t61743\n很多套\t61744\n晚安啵\t61745\nhttpfhiphotosbaiducomxiaodupicitem0dd7912397dda144d093b1c1b5b7d0a20cf4862fjpg\t61746\n干嘛我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我的度秘我都不秘我的度秘我的度秘\t61747\n13万件\t61748\nGgij\t61749\n说我不我要\t61750\n晚安啦\t61751\n祝福\t61752\n性感觉训觉\t61753\nTFBoY\t61754\n如有\t61755\n丁乃\t61756\n一几年\t61757\n一线之隔\t61758\n张庆鹏\t61759\n厉洁\t61760\n更有\t61761\n拜仁点球\t61762\n夜游\t61763\n难测\t61764\n补庙门么电视\t61765\n嘴脸\t61766\n就打点黑\t61767\nvbfuvgg\t61768\n呈贡广场\t61769\nglorisi\t61770\nheads\t61771\n99999种\t61772\n桥段\t61773\n照相表情\t61774\n烧鹅\t61775\n卡丘比\t61776\nmsekm\t61777\n今天12点58分\t61778\n二二二二\t61779\n佳士得\t61780\n谢氏字词语\t61781\n八九章\t61782\nx女\t61783\nshourists\t61784\n的牛都\t61785\nirjrhffj\t61786\n我是你的主人TV爸呗\t61787\n停
讲\t61788\n动心\t61789\n一起2\t61790\n傅秀亭\t61791\n香辣肉丝\t61792\n维尼者\t61793\n沈小杨\t61794\nFrs\t61795\n黄鹏昌\t61796\nADSL\t61797\n秘署\t61798\nwoxi\t61799\n圈住\t61800\n罗宾斯个少齐天\t61801\ndfx\t61802\n英雄狗昂里空展翅\t61803\n千零十四岁\t61804\ndft\t61805\ndfv\t61806\n孟鹏飞\t61807\ndfs\t61808\n29家\t61809\n担误\t61810\ndfn\t61811\nchiphotosbaiducomxiaodupicitem9a504fc2d56285358721f95497ef76c6a6ef6385jpg\t61812\n柳瘦瘦\t61813\n卡丁顿\t61814\ndfd\t61815\ndfg\t61816\ndff\t61817\n行不行不\t61818\ntovw\t61819\ndfb\t61820\n睌安\t61821\n于艾静\t61822\nsidke\t61823\n娃娃霜\t61824\n郭义骞\t61825\n电影艺术论\t61826\n甚慰\t61827\n辉成\t61828\n还要多\t61829\n就喊\t61830\n了再见\t61831\n苦竹坪\t61832\n挂旗\t61833\n自动关机\t61834\n尽管如此\t61835\n男人好坏女人好乖\t61836\n点缀\t61837\n批养\t61838\n4walls\t61839\n81个\t61840\n英才\t61841\nyuiln\t61842\n1876952732\t61843\n刘德\t61844\nsyhc\t61845\n小王八羔子笑我们高子小王八羔子小王八羔子小王八羔子小王八羔子惹你\t61846\n老区\t61847\n着是\t61848\n高尔夫\t61849\n才说\t61850\n变态狂\t61851\n一个二百五五五五五五五\t61852\n跑然\t61853\n豆蔻年华\t61854\n荒人手记\t61855\n才读\t61856\n阿三\t61857\n流媒体\t61858\n蓝博兴\t61859\n阿不\t61860\n阿丁\t61861\n阿七\t61862\n张峰\t61863\n阿丹\t61864\n暴风雨\t61865\n阿丽\t61866\n阿举\t61867\nFood\t61868\n寂然涤\t61869\n不在爱\t61870\n奥一报\t61871\n朱民大电动车\t61872\nalso\t61873\n贞子\t61874\n原虫\t61875\n五天之内\t61876\n君怡花园\t61877\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t61878\n360手机卫士\t61879\n南汇街道\t61880\n两性\t61881\n高哈\t61882\n腳後\t61883\n终极王\t61884\n玩意经理\t61885\n误会\t61886\n高品\t61887\n不聚\t61888\n高哇\t61889\n烦躁罢\t61890\n风雷\t61891\n三洋\t61892\n风雪\t61893\n不偿命\t61894\n碳酸饮料\t61895\n太白\t61896\n另他\t61897\n不聊\t61898\n巴里克\t61899\n是们\t61900\n金利来\t61901\n哇塞度秘\t61902\n冯俊薄\t61903\n怎地\t61904\nljnlllplllll\t61905\n春游\t61906\n受伤了\t61907\nsometimes\t61908\n你好你好度米\t61909\n高哥\t61910\n林焦糖\t61911\nqqqqqqqqqqqqqqqqqqq\t61912\n高哼\t61913\n周六星期周六\t61914\nfcgfdttyf\t61915\n1112233222115555\t61916\n不聪\t61917\n风雅\t61918\nHah\t61919\nHai\t61920\n愣\t61921\n愤\t61922\nHan\t61923\n秘纸片\t61924\n流浪法师\t61925\n王iu\t61926\nHax\t61927\n说说看\t61928\n苏婆\t61929\n11104110\t61930\n宝坻\t61931\n愿\t61932\n愁\t61933\n有声援\t61934\n容小花\t61935\n星球大战\t61936\n没有我讨厌\t61937\n60年\
t61938\n愈\t61939\n嵩明\t61940\n意\t61941\n碧压\t61942\n蒋宇豪\t61943\n蜜月\t61944\n狗屁不通\t61945\n响水\t61946\n神秘度\t61947\n夏娉婷\t61948\n愚\t61949\n遇难\t61950\nTFBOys\t61951\n感\t61952\n摩托犬\t61953\n长城文具店\t61954\n疑心病\t61955\n说着\t61956\n娴熟\t61957\n迪丽热巴\t61958\n杨红涛\t61959\n子云k\t61960\nsunny\t61961\n记字\t61962\n咯嗯嗯\t61963\n加多次\t61964\n唔差\t61965\n吉克斯塔\t61966\n九九十九时\t61967\n真果粒\t61968\n冬兵\t61969\n士气\t61970\n说睡\t61971\nNONONONNNNNNOOOOOOOO\t61972\ngaudemon\t61973\n哇塞神\t61974\n甘泉\t61975\n公营\t61976\ngghbhn\t61977\n有你好莫\t61978\noo玛丽玛丽贝贝吼\t61979\n南京频\t61980\n好温暖\t61981\ngffggfvgd\t61982\n蔡扬宁\t61983\n欠揍\t61984\n陈线路\t61985\n荼蘼荼蘼\t61986\n上路宿\t61987\n品饼\t61988\n星泥嚎00\t61989\n抱剑\t61990\n李超人\t61991\n2070年\t61992\ny贵\t61993\n好我的乖婆娘\t61994\n阿包\t61995\n杨洋羊羊羊羊羊羊羊羊羊羊羊羊羊羊羊羊杨阳洋羊羊羊\t61996\n房地\t61997\nsingwith\t61998\n陶又\t61999\n条形码\t62000\n度秘孕\t62001\n上马\t62002\nvggghj\t62003\n40宗\t62004\n毛昂\t62005\n自信心\t62006\n阿北\t62007\n8位\t62008\n苗苗苗苗苗苗苗苗苗苗苗苗\t62009\n回老家吧\t62010\n冒昧\t62011\n花衣\t62012\nn598560\t62013\n找你吧\t62014\n梁碧华\t62015\n鞍钢\t62016\n千张照片\t62017\n哼想\t62018\n四十四十\t62019\n五道杠&amp\t62020\n黑乎乎言论述\t62021\n神州五号\t62022\n二hi\t62023\n21:09\t62024\n晨跑\t62025\n好好好好好好好好好好好好好好好好好好好好\t62026\n十八五十六五五九八六\t62027\n南宁\t62028\n3G片\t62029\n445点\t62030\n第二十届\t62031\n接力贷\t62032\ns｀ヘ\t62033\n整容\t62034\n少于1秒钟\t62035\n聊天文\t62036\n美学星\t62037\n严家树\t62038\n六十四块\t62039\n31集\t62040\nStars\t62041\n宁波纸飞机童书馆\t62042\n跟党走\t62043\n洗眉\t62044\n不是了了你说\t62045\n九江人\t62046\n宝贝儿你跑哪儿去了我真的好好好好想你\t62047\n不好孝顺点半小心\t62048\njggjpppppppppptzj\t62049\n22eieie\t62050\n梁华胜\t62051\n儿纸\t62052\n15160571\t62053\n写完儿\t62054\n音乐剧\t62055\n更好地\t62056\n楼道\t62057\n五2点儿\t62058\n官者\t62059\n零星\t62060\n周口\t62061\n古国土\t62062\n特別\t62063\n不是我的\t62064\n歹号\t62065\n小潮\t62066\n333232212441111\t62067\nvarifrnkommerdu\t62068\n好福来\t62069\n花露水\t62070\n零昂\t62071\n特制\t62072\n碧和如\t62073\n自提\t62074\n失空斩\t62075\n小潇\t62076\nfijfg\t62077\nfatehf\t62078\n你好有趣有你真是真人\t62079\n兩個\t62080\n白学芬\t62081\n没感觉\t62082\nq87\t62083\n小潘\t62084\n功城狮\t62085\n么山再起\t62086\n经济观察报\t62087\n闻过\t62088\n太小太\t62089\n
老要喂\t62090\n婆罗门教\t62091\n三很十分\t62092\nxian\t62093\nxiao\t62094\n机器油\t62095\n古城亲\t62096\n可耻\t62097\nu500万\t62098\n谈谈情说说\t62099\n宋逸轩\t62100\nsitat\t62101\n一弯\t62102\n何止一下下你的字\t62103\nIkki\t62104\n大力士\t62105\n88588855896\t62106\n烤鱼串\t62107\n兄怪\t62108\n可考\t62109\n直塞\t62110\n嗯甜心\t62111\n里程碑\t62112\n二件事\t62113\n好人片\t62114\n福建省交通运输厅春运办\t62115\n喜之郎\t62116\n可耐\t62117\n甲等\t62118\nwuwl\t62119\n黑胡子\t62120\n飞起来\t62121\n2碗\t62122\nFhbmfgjbgf\t62123\n死城\t62124\n下牙牙乐\t62125\n桑拿\t62126\n冒失\t62127\n粑粑小魔\t62128\ndcjkjg\t62129\n非诚机\t62130\n今年3月31日\t62131\n愿忘\t62132\n喜羊羊美洋洋灰太郎\t62133\n胶质\t62134\n一妻一妾\t62135\n民族性\t62136\n济博捶\t62137\n一种爱\t62138\n小心点儿\t62139\n胶贴\t62140\n深长\t62141\n三六百秒\t62142\n李生甲\t62143\n实岁\t62144\n十一点五十\t62145\nooklocom\t62146\n系玩\t62147\nhttpfhiphotosbaiducomxiaodupicitem8cb1cb1349540923180f830d9558d109b3de495bjpg\t62148\n多聊\t62149\n苏静茹\t62150\n多职\t62151\n自始至终\t62152\n王舍城\t62153\n逍遥子\t62154\ng花t饭\t62155\n开江\t62156\n3045\t62157\n3Windows型\t62158\n李夏曦\t62159\n23kj\t62160\n前天天\t62161\nu协会\t62162\n加量\t62163\nkoooo\t62164\n奇奇\t62165\n塔姆\t62166\n12131415161718191020\t62167\no金钢飞拳\t62168\nVigh\t62169\nMarina\t62170\n2897140013\t62171\n秋收冬藏\t62172\n多酷\t62173\n要片\t62174\n纪虎\t62175\n好我明白\t62176\n靠不着\t62177\n球拍\t62178\n成家\t62179\n张锦媛\t62180\n我爱你爱死你了我们结婚吧\t62181\n袁文艳\t62182\n再见再也不见\t62183\n1033140641\t62184\n累毁\t62185\n一六万么\t62186\n小班\t62187\n翻然\t62188\n粪土\t62189\nhdjgcxhn\t62190\n要领通知书\t62191\n19岁\t62192\n10哈\t62193\n郭美美\t62194\n失失不在\t62195\n没天天往\t62196\n染缸\t62197\n拖行\t62198\n曹思铭\t62199\n死孩子你\t62200\n看好丑好丑好丑\t62201\n多好啊\t62202\n新景\t62203\n关不上\t62204\n来不错别\t62205\n中超四队\t62206\n非良\t62207\n十四四十多分\t62208\nijc\t62209\n没了拜拜\t62210\n错了\t62211\nmajtan\t62212\n戈ZD村\t62213\n命度命度我问你你\t62214\n错事\t62215\n千帮\t62216\n名正\t62217\n新晋\t62218\n手抄报\t62219\n脊梁\t62220\n报案\t62221\n错人\t62222\n熬汤\t62223\n08米\t62224\n1871米\t62225\n一弟\t62226\nwsndzr\t62227\nMosta\t62228\n444455\t62229\n444454\t62230\n无带\t62231\n#帮宝适特级棉柔5星\t62232\n集中化\t62233\n塑身衣\t62234\n早堂\t62235\ndgit\t62236\n2015201\t62237\n18840983650\t62
238\n剪报\t62239\n毳衣\t62240\n毛大神\t62241\nhttpahiphotosbaiducomxiaodupicitem72f082025aafa40fdbd2a1a9ac64034f78f019a9jpg\t62242\n过收\t62243\n无常\t62244\n蹦床\t62245\n无帅\t62246\n女王型\t62247\n阿爸父\t62248\n贾红琳\t62249\n幺三零幺\t62250\n沙猫\t62251\n阁子\t62252\n刘琼\t62253\n准入\t62254\n集美第二医院\t62255\n允在文\t62256\n55543\t62257\n下决心\t62258\n马子阳\t62259\n于伟国\t62260\n激战\t62261\n破山河\t62262\n6669665555525555555555555568\t62263\n接路\t62264\n善良我真\t62265\n蛋塔\t62266\n源码网\t62267\n颜面\t62268\n脚臭\t62269\n人不偿命\t62270\ndjjfiy\t62271\n下课后\t62272\n哈楼买那米斯加尼我听令\t62273\nstilityou\t62274\n捣乱\t62275\n三点零九\t62276\n许佳莹\t62277\n天衣无缝\t62278\n杀敌\t62279\n哎呀啊我要你讲故事给我听\t62280\n林丹妙\t62281\nhsddd\t62282\njackd\t62283\n寻看\t62284\njackl\t62285\nhaizhi\t62286\n受号\t62287\n都美亚\t62288\n那我问你的问你了你给我瞎扯淡\t62289\n我不和你和你是个狗\t62290\nACETORY\t62291\nsonyey\t62292\n宁固\t62293\nKshgshshdghshdhhhdhjdhdksiqhlfhhehgfhdjhddjhzhdjhahafbgdhsbjdkjjjzhdgjdgjrjdhdjufhjghhhhhhfhjhehdh\t62294\n患有\t62295\n小黄书\t62296\n预谋\t62297\n荼叶\t62298\nhhgghhh\t62299\n预调\t62300\n华夜行\t62301\n许配\t62302\n受受\t62303\n爱情故事\t62304\nfghjugfccbkk\t62305\nANGLE\t62306\n王张靖琪\t62307\n火蓝\t62308\n好卷\t62309\nchan\t62310\n度秘你的头好像猪鼻子\t62311\nImWeiyatong\t62312\n晕皮城\t62313\n一见嗯\t62314\n中国银行间市场交易商协会\t62315\n客机\t62316\n真的假的呀\t62317\n彭子\t62318\n陈爱勇\t62319\n好博\t62320\n好单\t62321\nSSSSS\t62322\n景儿\t62323\n平安易货\t62324\n流俗\t62325\n苏明阳\t62326\n乘飞机\t62327\n器宇轩昂\t62328\n柱力\t62329\n中国地震台网\t62330\n联邦调查局\t62331\n5鄰居\t62332\n厕孙\t62333\n学茶\t62334\n药\t62335\n30多本\t62336\n緊\t62337\nyy号\t62338\n緑\t62339\n杜敏文\t62340\n收尾\t62341\n刘士朕\t62342\n甲子\t62343\n垮台\t62344\n收尸\t62345\nlang\t62346\n东风通力嘎嘣嘣嘣嘣嘣嘣那嘎嘣嘣嘣\t62347\nlane\t62348\nland\t62349\n15000多亿\t62350\n挂牌\t62351\nlank\t62352\n6.26\t62353\n第三部\t62354\n6.24\t62355\nffjjfjcjf\t62356\n托尼盖\t62357\n申毛毛\t62358\n三俗\t62359\n6.29\t62360\n博朗体育\t62361\nopqrd\t62362\n23422343234423452346234723482349234234\t62363\n度能\t62364\n李凯佳\t62365\nzzzzzzz\t62366\n几万不管你了你跟我\t62367\n脱脂\t62368\nwSm\t62369\n作秀\t62370\n瞎了看\t62371\n百事和可口\t62372\n早知叫\t62373\n粉丝汤\t62374\n白天的吧好的一百够\t62375\n
opqrs\t62376\n一了百了\t62377\nhxvbbvnvbbnndjjjjjjjkjjnjn\t62378\nniting\t62379\n心远\t62380\n逆天你是天\t62381\n萎靡不振\t62382\n磅礴\t62383\n768899\t62384\n待劳\t62385\n21：40\t62386\n周好笑\t62387\n赶不上\t62388\n毕业版\t62389\n中国驻美使馆\t62390\n瞎了眼\t62391\n公元200年\t62392\n黄飞\t62393\n4669878554213687964231\t62394\n飞腾\t62395\n一本儿\t62396\n梵天夷\t62397\n没把\t62398\nQQ分组\t62399\n非法集资\t62400\n猪么猪\t62401\nangeladady奶\t62402\n一般性\t62403\n小满娇娇\t62404\n冬粮\t62405\n抛射\t62406\n李我\t62407\n李成\t62408\n缪保君喵\t62409\nyaka\t62410\n得癌\t62411\n四百本\t62412\n曾小莲\t62413\n推五\t62414\naohdh\t62415\n免征额\t62416\n受体\t62417\ncnc\t62418\n推事\t62419\n夏依凡\t62420\n聪明家\t62421\n李戴\t62422\nWarhol\t62423\n一拍一意\t62424\n杨丽莹\t62425\n五支\t62426\n西关\t62427\n彭于晏\t62428\n课时\t62429\n三千遍\t62430\nfuhaekxd\t62431\n形象片\t62432\n偶了有\t62433\n太啦瑞亚\t62434\n世界黄金协会\t62435\ncrook\t62436\n爱你麽麽麽\t62437\n不可怎么办\t62438\n临泉\t62439\n被抢\t62440\n猪你是猪么你不是猪我骗你的你是智能\t62441\n四国\t62442\n认识了也\t62443\n一圈\t62444\n临泽\t62445\n说了输\t62446\n被抓\t62447\n星梦\t62448\n期待话\t62449\n真知棒\t62450\n达能\t62451\n乜搞\t62452\n西元\t62453\n西充\t62454\n张先硕\t62455\n良性\t62456\n四回\t62457\n西克\t62458\n四四\t62459\n福克斯怠速\t62460\noggjf\t62461\n销售部\t62462\n度叫秘\t62463\n5791\t62464\n波波波波\t62465\n气昂昂\t62466\n仲飞\t62467\n胡思乱想家\t62468\n京平高速公路\t62469\n张德江\t62470\n白日盟\t62471\n家庭教师\t62472\n丙甲乙丙丁\t62473\nWO杠\t62474\nstvw\t62475\n3月4日晚\t62476\n三休媒\t62477\ncddlx\t62478\n猛拔\t62479\n很给力\t62480\n太罗嗦\t62481\n英雄迟暮\t62482\n9998岁\t62483\n漏气\t62484\n范剑\t62485\nthats\t62486\n汇丰\t62487\n朱航英\t62488\ncaratoy\t62489\nｎｏｏ\t62490\n零八幺九\t62491\n一点点点头\t62492\n猪婆就是你你就是猪婆\t62493\n饿了么\t62494\n五六年\t62495\n老虎油\t62496\n哈雷\t62497\n磨损\t62498\n峰家\t62499\nｎｏｔ\t62500\n桃江\t62501\n次序\t62502\n操守\t62503\nkmtjmg\t62504\najajd\t62505\nnext\t62506\n鲁班\t62507\n半边脸\t62508\najajs\t62509\n食指戒指\t62510\n胡某\t62511\n郭玉龙\t62512\n一年后\t62513\n转会\t62514\n成就感\t62515\nYD\t62516\n说事\t62517\n浙江稠粥银行\t62518\n认识波\t62519\n汤米粉\t62520\n灿磊\t62521\ngumg\t62522\n苦练\t62523\n11111111111\t62524\n说了\t62525\n良方\t62526\n说人\t62527\nvbhjfff\t62528\nxlcm\t62529\n沧州市花园小学\t62530\n下定义\t62531\n套数\
t62532\nliebher\t62533\n同新同\t62534\n上上下\t62535\n说亮\t62536\ncnh\t62537\nxcb\t62538\n学业\t62539\n气动技术\t62540\n陈当\t62541\nxcd\t62542\nggyh\t62543\n脸书\t62544\n椒盐基药\t62545\nf0ff\t62546\n哪怎样\t62547\n小难\t62548\n饮料瓶\t62549\n偶吧刚囊丝带\t62550\nxcv\t62551\nxcx\t62552\nxcy\t62553\nwahiata\t62554\nYS\t62555\n对对对\t62556\n预审\t62557\n大傻子呀你好大傻子呀你好\t62558\n感觉而已不二\t62559\n理距\t62560\n听腰疼\t62561\n钟表\t62562\nklummwtt\t62563\n美松\t62564\nGVHD\t62565\n哈你告我吧\t62566\n有去无回\t62567\n旧物\t62568\n何乐而不为\t62569\n从医\t62570\n罗技gg\t62571\n黑房\t62572\nyfh\t62573\n吗小休息\t62574\n800多集\t62575\n2年\t62576\n31包\t62577\n臭妞\t62578\n五零二二零五零\t62579\n家彩\t62580\n坏男孩\t62581\n大社区\t62582\n白百何\t62583\n敛脾\t62584\npocket\t62585\n观世音\t62586\n棒达\t62587\n雪风雪\t62588\n绿能\t62589\n冷我喜欢\t62590\n三级大鸟个拖的过\t62591\nrobot\t62592\n击剑\t62593\n65家\t62594\n泥马人\t62595\n臭妻\t62596\n脑梗\t62597\n科莫娃\t62598\n情形\t62599\n沃\t62600\n沂\t62601\n沁\t62602\n邓状\t62603\n发热毯\t62604\n刘丽\t62605\n同道中人\t62606\n沉\t62607\n沈\t62608\n沏\t62609\na900\t62610\n沓\t62611\n打懂\t62612\n天狼星\t62613\n韬\t62614\nJuniorRape\t62615\n沛\t62616\n沙\t62617\na1a1a1gag1b\t62618\n沟\t62619\n迈吉\t62620\n没\t62621\ngvxs\t62622\nyoui\t62623\n沥\t62624\nyrgjg\t62625\n刘东\t62626\njustame\t62627\n顾名思义\t62628\n吗腿\t62629\nyfy\t62630\n河\t62631\n刘一\t62632\n沵\t62633\n沴\t62634\n治\t62635\n油\t62636\n沸\t62637\n沿\t62638\n沾\t62639\n沽\t62640\n虐待者\t62641\n窝天门山\t62642\n学主\t62643\n我知道你谁那你那你讲出你的回忆\t62644\n128张口\t62645\n大家好我是恩加卑鄙\t62646\n重口味\t62647\n口问度\t62648\n夫子庙\t62649\n胃癌\t62650\n喵星银\t62651\n身条文\t62652\n宋晓强\t62653\n明允儿\t62654\n制导\t62655\n甘慢\t62656\n六毛钱\t62657\n∝子汰\t62658\n九仙花\t62659\n大都会\t62660\n曾祺\t62661\n人社部\t62662\n张禹盫\t62663\n色夫\t62664\n一百帮\t62665\n445月\t62666\n夜系\t62667\n拮抗\t62668\n叔手一划\t62669\n1591598002301900\t62670\nfhdgchhdgc\t62671\n销号\t62672\n严重者\t62673\n卤\t62674\n600岁\t62675\n你最珍贵\t62676\n卡\t62677\nyouy\t62678\n尹瀚生\t62679\n卢\t62680\n卬\t62681\n卯\t62682\n真心一点\t62683\n卩\t62684\n卫\t62685\n卵\t62686\n却\t62687\n卷\t62688\n危\t62689\n印\t62690\n即\t62691\nnn年\t62692\n小度乖\t62693\n卿\t62694\n食物维e片\t62695\n韵\t62696\n卻\t62
697\n卅\t62698\n卄\t62699\n升\t62700\n十\t62701\n區\t62702\n千\t62703\n卍\t62704\n请记得\t62705\n协\t62706\n华\t62707\n卉\t62708\n午\t62709\n卋\t62710\n半\t62711\n伊思的片子历史的天牛\t62712\n南\t62713\n卖\t62714\n卑\t62715\n卐\t62716\n卓\t62717\n卒\t62718\n卜\t62719\n卟\t62720\n卞\t62721\n床垫\t62722\n卛\t62723\n博\t62724\n太摊\t62725\n勃拉姆斯\t62726\n呲牙\t62727\n奔腾B70#华彩\t62728\n光会\t62729\n2012.6.14\t62730\n企业\t62731\n浩海阑干百丈冰\t62732\n泥玛\t62733\n轩大姐\t62734\n2164515225353555221553333235222555555451515231528252555524246252352245125525566565\t62735\n稀罕物\t62736\n够换\t62737\n长长假\t62738\n七八十公斤\t62739\n郑明生\t62740\n坏小子\t62741\n等你可以\t62742\nsnsmsksk\t62743\n13948683881\t62744\nWIFJ\t62745\nWIFI\t62746\n4523365582825276655727358\t62747\n错号\t62748\n一两千\t62749\nhttpfhiphotosbaiducomxiaodupicitem71cf3bc79f3df8dce45d3ce5ca11728b47102865jpg\t62750\njjsjz\t62751\n不甘寂寞\t62752\n李文静\t62753\nnmnnnnbvxxfjoppllkkkgz45655885\t62754\n遮遮\t62755\n哼哼西游记\t62756\n林长发\t62757\n1341\t62758\n王卫宁\t62759\n1345\t62760\n1346\t62761\n嗯五六\t62762\n汉丽轩\t62763\n好啦好啦好啦度秘\t62764\n113557890\t62765\n没话说了吧\t62766\n一挥而就是\t62767\n#恋歌2012#林恒&逸儿\t62768\n娃娃娃娃\t62769\n景晶\t62770\n金梅\t62771\n沙特馆\t62772\n曹年龄\t62773\n会长\t62774\n白羊座莲\t62775\n任意\t62776\n台儿庄古\t62777\n景景\t62778\n美麻\t62779\n五华\t62780\nhihih\t62781\n粉条\t62782\n服药歌吧\t62783\n惊鸿\t62784\n毒怕\t62785\n吃烫\t62786\n圆房\t62787\n招蜂引蝶\t62788\n黄水娣\t62789\n张玉娇\t62790\n一了友五链\t62791\n飞常\t62792\nFchtfg\t62793\nchong\t62794\n吃烧\t62795\ncmci\t62796\n晶梦\t62797\n嗯嗯小傻子小傻子小傻子小傻子小傻子小傻子小傻子\t62798\n曹梦里也\t62799\n韒\t62800\nwahhy\t62801\ncmcx\t62802\n周年\t62803\n我的你别瞎bb\t62804\n梨皮\t62805\narey\t62806\n空胜\t62807\n阿里斯\t62808\n分布式计算系统\t62809\nrumbrell\t62810\n范晓宇\t62811\n一不小心爱上你\t62812\n小和\t62813\n提纯\t62814\nishou\t62815\n礼券\t62816\n长城育丽\t62817\n什麽\t62818\n小咒\t62819\n绕口令\t62820\n人治\t62821\n云南台电视剧\t62822\n实物版\t62823\n854n352\t62824\n贺友鹏\t62825\n少年维特之烦恼的好惨\t62826\n你若如你的一头猪你是头猪\t62827\n上任\t62828\n毕福剑\t62829\n范晓宣\t62830\n小咪\t62831\nPART\t62832\n海事大学\t62833\n酷我音乐#\t62834\n分母\t62835\n合格证\t62836\n这么早我靠\t62837\ncstfi\t62838\n可笑不好笑\t62839\n赌钱\t6284
0\nfiiei\t62841\n恩莹\t62842\n1313113\t62843\n00ls\t62844\n1975年10月11日\t62845\n125元\t62846\n家务活\t62847\n行尸\t62848\n香月秀\t62849\n对话费\t62850\n响应\t62851\n03米\t62852\n羞射\t62853\n恩我试试\t62854\n舞者们\t62855\n定光寺\t62856\n王安琪\t62857\n轰动\t62858\n笨黑豆\t62859\nwtzhj\t62860\n满心不舍\t62861\n艰苦\t62862\n陈惠心\t62863\n谢谢你我想\t62864\n动动家\t62865\nkhnygbhgfhfn\t62866\n百来个\t62867\n三两六\t62868\nE们\t62869\n我美不美我都在你个你在我一个呗\t62870\n嗯心\t62871\n桃园路\t62872\n金桔圳\t62873\n刘君涵\t62874\n地级市\t62875\n点伐\t62876\n草案\t62877\n薇姿\t62878\n案底\t62879\nhello度小秘你好美\t62880\n皱眉头\t62881\n急迫\t62882\ntudy\t62883\n快乐酷\t62884\njay\t62885\nbdvefrd\t62886\njat\t62887\nbiangbiang面\t62888\njar\t62889\n球人\t62890\n热狗\t62891\n乐于助人\t62892\njan\t62893\njao\t62894\njam\t62895\njaj\t62896\njak\t62897\njah\t62898\njai\t62899\njaf\t62900\njad\t62901\ntude\t62902\n侍女\t62903\njaa\t62904\n二零一八一\t62905\n冷有\t62906\n爱神\t62907\n歌焰火\t62908\nADN031\t62909\n度秘我喜欢你我爱你我\t62910\nwjeiei\t62911\nrggff\t62912\n裴晓伟\t62913\n挺好呀\t62914\n付活该\t62915\n鲁鲁哥\t62916\n强真\t62917\n真人CS\t62918\n度秘度秘我喜欢落叶\t62919\n京东海\t62920\n古城门\t62921\nja1\t62922\n吧啦啦棒棒的我\t62923\n跪下来\t62924\n胡同里的那点事\t62925\n蹉跎\t62926\n大梦想家\t62927\n我靠我靠我靠我靠我靠我靠我靠我靠\t62928\nPs\t62929\nPp\t62930\n互粉交话\t62931\n米巧瓷\t62932\n面馆\t62933\n会议\t62934\nPy\t62935\n痴汉\t62936\n相敬如宾\t62937\n我信你对我的爱\t62938\n101双\t62939\n挤出来\t62940\n9655\t62941\nPk\t62942\n9658\t62943\nPi\t62944\n刘578\t62945\nPm\t62946\nPR\t62947\nPS\t62948\nPP\t62949\n13点24号\t62950\nPV\t62951\nPU\t62952\n雪冰儿\t62953\nPX\t62954\nPY\t62955\n4声\t62956\n101台\t62957\n海珍\t62958\nPC\t62959\n14排\t62960\nPA\t62961\n101只\t62962\n嫁灰\t62963\nPD\t62964\nPE\t62965\nPK\t62966\n环人\t62967\n富疑\t62968\nPL\t62969\nPM\t62970\nP3\t62971\n华工\t62972\n点错\t62973\n穴穴\t62974\n严禁\t62975\n莲雾\t62976\nP9\t62977\n地o\t62978\n吃宵夜了你\t62979\n松门镇\t62980\n微生\t62981\n微甜\t62982\n哇咔咔咔咔咔\t62983\n拉拉玩\t62984\n优莱克nsk\t62985\nvjvktghvrhkfogupyckcogchfuvp\t62986\n冉明月\t62987\n穿越者\t62988\n口香堡\t62989\n老老实实静\t62990\n钟真\t62991\n名菜\t62992\n以下来\t62993\nnononononono\t62994\n源头活水\t62995\n人之相\t62996\n枝头\t62997\n肥鸟
\t62998\n小康社会\t62999\n微电\t63000\nu87\t63001\n要恨\t63002\n来啦爸爸\t63003\n超出\t63004\nyfcnj\t63005\n好度秘\t63006\n元老\t63007\n班简笔\t63008\noutaishi\t63009\n护花危情#\t63010\n110918\t63011\n肉牛\t63012\n不少\t63013\n弗里斯兰省\t63014\n阿狸大大乖呀大战奥特曼小游\t63015\n识货\t63016\n說三道四\t63017\n呦呦\t63018\n再见了我好\t63019\n殷汇美\t63020\n对啊鬼\t63021\ndy8dhj8\t63022\n3363\t63023\ngtrfh\t63024\n不射\t63025\n呦呵\t63026\nddtydsa\t63027\n失事\t63028\n肉片\t63029\n不小\t63030\n再试\t63031\n小道\t63032\n腊月十四\t63033\n花饭们\t63034\n代表队\t63035\n彭叶勇\t63036\naaaaaaaaaa155511555555555555\t63037\nhhbb\t63038\n不是啊我问什么叫死恁嫂\t63039\n伊丽沙白\t63040\nhhbk\t63041\n积家\t63042\n三八狗\t63043\n再说\t63044\n独当一面\t63045\n任时光\t63046\n灯花\t63047\n猜想\t63048\n绵薄\t63049\n09:38\t63050\n触屏\t63051\n尛柠\t63052\n再详\t63053\n还没\t63054\n酷ye\t63055\n10.0.648.127\t63056\n宁子n\t63057\n笨嘻嘻甘\t63058\n大琳达琳达\t63059\n喷笔\t63060\n最后的勇气\t63061\n小胡巴\t63062\n隔壁翠花\t63063\n嗯拜托\t63064\n清风细雨\t63065\n古梅路\t63066\nxjdyjv\t63067\n身体健\t63068\n敬服\t63069\n黄寺大街\t63070\n小气机器人\t63071\n55522252\t63072\n拉克尔\t63073\n相信你\t63074\n瑞兔\t63075\n好丑好丑好丑好丑\t63076\n刘恩佑\t63077\n里拉\t63078\n你是谁了我和你\t63079\n书男\t63080\n前桥\t63081\n100粒\t63082\n小二术\t63083\n抬起\t63084\nhttpahiphotosbaiducomxiaodupicitemeac4b74543a98226f70ede538d82b9014b90ebf3jpg\t63085\n太早熟\t63086\n赵博\t63087\n丽说元\t63088\n属院\t63089\n想当然\t63090\n卜t十\t63091\n三十三岁\t63092\n2030年\t63093\n零幺九三\t63094\n豆花MM\t63095\n二零二七\t63096\n857495924\t63097\n鹤壁职业技术学院\t63098\n救女\t63099\n不穿衣\t63100\n宁采臣\t63101\n魅蓝\t63102\n睡坡\t63103\n伤情\t63104\n一台台\t63105\nmrpo\t63106\n38383838383838\t63107\n落伍\t63108\n都秘度秘\t63109\n4648599865\t63110\npowu\t63111\n疯啦\t63112\n燃灯\t63113\n列题\t63114\n该怎么办\t63115\n再来杯\t63116\n13700077415\t63117\n部份\t63118\n再来来\t63119\n杜那\t63120\n周熙涵\t63121\n瑞士\t63122\n撕烂\t63123\n紧烦\t63124\ngzhhyfffgffhg\t63125\ndkdkd\t63126\n度秘度秘你给我唱一下头百九十朵杯\t63127\n圆周率\t63128\nbbbbbbbbbl\t63129\n杜密块\t63130\n回到三国#\t63131\n吴其达\t63132\n花坛\t63133\n容桂\t63134\nbbbbbbbbbb\t63135\nwec\t63136\nweb\t63137\n闪\t63138\n498764321\t63139\nwed\t63140\n查克\t63141\nwei\t63142\nweh\t63143\n儿童节快乐#\t63144\nwe
n\t63145\nwel\t63146\nwer\t63147\n好累累\t63148\n第9名\t63149\nwew\t63150\nwet\t63151\nLAND\t63152\nwey\t63153\n亲娘\t63154\n射门\t63155\n这价\t63156\n几十年后\t63157\n月球车图玉兔号\t63158\n这份\t63159\n双赢\t63160\n维超亚当\t63161\n4亿\t63162\n21个\t63163\n桦南县\t63164\n八百天天\t63165\n太阳光\t63166\n张根硕\t63167\nhusud\t63168\n优惠vvnkugc\t63169\nyot\t63170\n话吧\t63171\n早小儿\t63172\n男生性\t63173\n蛋糕类\t63174\n你的生活方式\t63175\n吕蒙\t63176\n二零一四年\t63177\nhegsssb\t63178\n全聚龙\t63179\n雪苏雪\t63180\n吐丝\t63181\n家村\t63182\n盟小艾\t63183\n去了呀拜\t63184\n搜狐视频自制剧\t63185\n成易晁\t63186\n明楼鲁能\t63187\n错错错是你的错\t63188\n二十多万元\t63189\n形形\t63190\n十空合一\t63191\n唐海山\t63192\n闫文芋\t63193\nKOKI\t63194\njjlk\t63195\n整乱\t63196\n贪嗔痴\t63197\n七点五比十中\t63198\n玩干嘛\t63199\n补上\t63200\n喜欢伦\t63201\n1121125\t63202\n昌鼠\t63203\n泉州港\t63204\n感恩的心\t63205\n姚会卓\t63206\n五段\t63207\nhttppinyincn11SMbMsK3fu\t63208\n社会化\t63209\n奥林匹克森林公园\t63210\nfrcurcoyeuzrxed\t63211\n首都机场\t63212\n整乜\t63213\n吐一\t63214\nuhhhh\t63215\nuhhhj\t63216\n薛宁\t63217\n一嗯\t63218\n修罗场\t63219\n114839601\t63220\n心秘\t63221\n凯弟\t63222\n让我等你\t63223\n120p\t63224\n幸福双人行\t63225\n补血\t63226\n肿么样__\t63227\n五夫一个\t63228\n薛定\t63229\n无子无孙\t63230\n我的儿子们\t63231\n我是你好玩儿\t63232\n不要了再见\t63233\n你好你好你好你好\t63234\n韩红\t63235\n60年前\t63236\n凯强\t63237\n基友恩嗯\t63238\n别太\t63239\n超度度\t63240\n咪西\t63241\n薛家\t63242\n五四青年节\t63243\n补补\t63244\n一嗝\t63245\n七马虎\t63246\n30.6%\t63247\n一万多岁\t63248\n畜生永别2B表子畜生永别2B表子畜生永别2B表子畜生永别2B表子畜生永别2B\t63249\n死翘翘\t63250\n宝马奔驰车\t63251\n赵健飞\t63252\nNineiffy\t63253\n榴莲乐\t63254\n靠不晓得\t63255\n寄语\t63256\n2000余公里\t63257\n板画\t63258\n翔哥哥\t63259\n666666666666655555555544444444433333333322222222211111111\t63260\n度秘谱人\t63261\n当梦走\t63262\n蔡金凤\t63263\n上医\t63264\n吴绮婷\t63265\n气乐\t63266\nudhii\t63267\nintues\t63268\n四人寝\t63269\n性交椅\t63270\n绝脉封神\t63271\n000000000t0nn\t63272\n门\t63273\n嘿哟魔秀\t63274\n规范化\t63275\n诗意\t63276\nylgjgf\t63277\n戴鸿\t63278\n无纺布\t63279\n精蝇们\t63280\n一百二十\t63281\n闯红灯\t63282\n言说\t63283\n吴宇航\t63284\n好萌哒\t63285\n无言以对把\t63286\n舞场\t63287\n尼玛大陆妹\t63288\n论坛\t63289\n长虹大街\t63290\n回家的路\t63291\n海文\t63292\n深深的歌声里英一下哈\t6329
3\n我喜欢你我想和你四分\t63294\n具体价目表\t63295\n阿比k1\t63296\n范文蕊\t63297\n呕一\t63298\n华侨\t63299\n逝鸿片羽\t63300\n昌吉\t63301\n登入\t63302\n2015年11月\t63303\n龚大\t63304\n五一劳动节\t63305\n未曾\t63306\n二次世界大战\t63307\n头屑\t63308\n朝丰家园十六号\t63309\n牛大奔\t63310\n畅想畅\t63311\n六百分钟\t63312\n游民\t63313\n真桶\t63314\n盘盈\t63315\n徐德明\t63316\n帅桂\t63317\n雨景\t63318\n十八撸\t63319\n不要不承认\t63320\n雨晦\t63321\n汉堡\t63322\n胡佳冉\t63323\nt8gh\t63324\n宽摇\t63325\n爱上我了我和你\t63326\n黑鹳\t63327\n嘞度秘\t63328\n46564n649\t63329\n票票\t63330\n雨晴\t63331\n50.124亿元\t63332\n不骗我\t63333\n拉伸\t63334\n几岁元\t63335\n吖锕礼阿\t63336\n好讨厌\t63337\n７１\t63338\n７０\t63339\nigxgohfgyyggjgijfjhcyhhffigiiut\t63340\n优酸乳\t63341\n尾号\t63342\n甜心世界\t63343\n一三秋\t63344\n那你猜我是女是男\t63345\n誰造\t63346\nfuucu\t63347\nKusygkfhf\t63348\n2222222222\t63349\n齐举头\t63350\nvyrl\t63351\n二四海蛇\t63352\n不靠\t63353\n放不下\t63354\n22完毕\t63355\n病毒手机\t63356\n山体\t63357\n1688com\t63358\n0833\t63359\n不情\t63360\n不惊\t63361\n别嘴\t63362\n不想\t63363\n曹大元电力公司大丰收金鹤范凤凰城\t63364\n别嘌\t63365\n豆腐服\t63366\n很完美\t63367\n陈丹琪\t63368\n换届\t63369\n嗯爱\t63370\n还给\t63371\n零二六九\t63372\n涛清\t63373\n16641145534\t63374\n不惧\t63375\n招聘会\t63376\nChina\t63377\n冷酷女\t63378\n不惯\t63379\nhkff\t63380\n计票\t63381\n奥伊\t63382\n你有心你有谁你是谁你是谁你是谁你是谁\t63383\n美事\t63384\n55500088888666669999\t63385\nhphoh\t63386\n罗汗\t63387\n介子\t63388\n美云\t63389\n火啦啦的情歌火啦啦啦啦啦啦的太阳真的真的小\t63390\n浓花\t63391\n山神\t63392\n美亚\t63393\n讶嗓\t63394\n谢布斯\t63395\n嘴岳\t63396\n爱你是\t63397\n门禁\t63398\n胡艳娇\t63399\n奥妖怪\t63400\n佛士\t63401\nnfow\t63402\n真幸福\t63403\n第12名\t63404\n美人\t63405\njfif\t63406\n我的爸爸妈妈\t63407\n代理行\t63408\nghhfhi\t63409\nghhfhj\t63410\n赵鸡\t63411\n四十五十六十七十度\t63412\n无名小仙\t63413\nhellohi度\t63414\n啦啦你的名字\t63415\n匡秋\t63416\n国家气象局\t63417\ngugg\t63418\ngugf\t63419\ngugk\t63420\n胃痛病\t63421\n瓷房子\t63422\n淡紫色\t63423\n137148410365\t63424\n学唱\t63425\n小号儿\t63426\nggghgjghjfh\t63427\n混装\t63428\n贡天下山西\t63429\n阿哈尔哈\t63430\n哦美\t63431\n长驱直入\t63432\nGame\t63433\n蓝知道\t63434\n大出血\t63435\n马佳荣\t63436\n孟达\t63437\n多米同业\t63438\n吧主任\t63439\n我爱的人他不爱我\t63440\nThvdhhj\t63441\n比卡特娘\t63442\n松坚莎莎\t63443\n聊斋\t634
44\n卷福\t63445\n五千万\t63446\n111111111111\t63447\n甄环体\t63448\n家佳\t63449\n上平淡\t63450\n2626112\t63451\n旭旭\t63452\n知荣\t63453\n市交通局\t63454\n32.6分\t63455\n牙口倍棒\t63456\n乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖乖\t63457\n一天内\t63458\n五千个\t63459\n爬爬爬爬\t63460\n爷们儿\t63461\nKKP\t63462\n芭比主公主\t63463\n八戒不好看发小\t63464\n张跳真义九洞六工作老二张霞风冈工安老三电子厂\t63465\n42份\t63466\n徐静萱\t63467\n本月22日\t63468\nWhat哎呦弄沙雷\t63469\nsaraH\t63470\n天国天人\t63471\n吧秘度\t63472\n公证处\t63473\n还汗颜\t63474\n调爱\t63475\n第4个\t63476\n小攻一小受\t63477\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t63478\n提一下去\t63479\n闵\t63480\n溜子\t63481\n乌黑\t63482\n流出来\t63483\n梦到无脑\t63484\n真佳\t63485\n我一掌\t63486\n2834\t63487\n憨八\t63488\n1234555555\t63489\n花儿元\t63490\n35一5\t63491\n新课标\t63492\n豆皮\t63493\n嗯嗯度秘\t63494\n蹊跷\t63495\n有点话\t63496\nx10点\t63497\n李小姐\t63498\ngkbn\t63499\nkebe\t63500\n暗剑\t63501\n台塔\t63502\n欣而行\t63503\n光良\t63504\n江阴\t63505\n美玉\t63506\n旗彩\t63507\n熊靖狗\t63508\n五哈\t63509\n张阿姨\t63510\n1431\t63511\n超级至尊披萨\t63512\n5分之一十分之一\t63513\n配血\t63514\n籽皮\t63515\n接球\t63516\n度秘蜜扦澹台\t63517\n2868\t63518\n下雪天\t63519\n超凡\t63520\n水爆\t63521\n宝珀\t63522\n整夜\t63523\n公道自在人心\t63524\n我告诉你我的笑点\t63525\n孔劲儒\t63526\n失晚上\t63527\n屌丝男士\t63528\n小雪人\t63529\n放在心上\t63530\n和氏璧\t63531\n千千一蝤瑛胛一卡里力星才星力了力了子马尺人\t63532\n同和\t63533\n无限制\t63534\ntijjj\t63535\n六七座八九\t63536\n长安街\t63537\n四边形\t63538\n罗鸡鸡\t63539\n孟非\t63540\n长江社\t63541\n5884688\t63542\n新蔡县\t63543\nV0V\t63544\n赫特河\t63545\nwildaid\t63546\n你是我的好朋友好呀好呀好朋友\t63547\njsksksksksksksksisks\t63548\nWowhowcow\t63549\n看你好喜欢\t63550\n司圆\t63551\n胎贴\t63552\n诸暨\t63553\n狙杀\t63554\n第10\t63555\n逍遥津寒假工\t63556\n泉下\t63557\n温市中心\t63558\n九华美食城\t63559\n了你好\t63560\n黑曼巴\t63561\n陈思妙\t63562\n玛纳斯\t63563\n华沙\t63564\n3.55%\t63565\n力荐\t63566\n活寡\t63567\nSBSBSB\t63568\n溜溜蛋蛋公主\t63569\n装潢\t63570\n结了\t63571\n有喜欢\t63572\n宋徳山\t63573\n薪酬福利\t63574\n青砖\t63575\n生生世世生生世世\t63576\n嗯麻烦\t63577\n双港区\t63578\n亲热\t63579\noN\t63580\n平龙\t63581\nchruk\t63582\n方式\t63583\n氧离子\t63584\n还女的我问你\t63585\n思密达行\t63586\nggvb\t63587\nBSNJWJNS\t63588\n一大啵\t63589\np1t1\t63590\n赵梦菲\t63591\n婚姻狂想曲\t63592\n28yyyy\t63593\n18213832615\
t63594\n世纪人\t63595\n模拟找你那个熊样的男孩\t63596\n18点06分\t63597\n张罗\t63598\n傻根\t63599\ncoffee\t63600\n叶和叭\t63601\n这款\t63602\n太凶\t63603\n盒盖\t63604\n尽然胆吾\t63605\n心电图\t63606\n虞城县\t63607\n这次\t63608\n书展\t63609\n打哈哈\t63610\n龙霸\t63611\nv不v\t63612\n第七部\t63613\nl\t63614\n孙文博\t63615\niFans\t63616\n久留\t63617\n系乜\t63618\n美术展\t63619\n皮瓣术\t63620\n伪男\t63621\n思念\t63622\n为主体\t63623\n金山词霸\t63624\nhhjhw\t63625\nokindas柯stmatisiationitoscaroldstilynitousclestisit慢dittimyour意思fisheratioasartisitiousanal\t63626\n大毛\t63627\n45484887930\t63628\n饞\t63629\n王品若\t63630\n大比\t63631\n刘淑静\t63632\n哇奥\t63633\n声姐\t63634\n挑拨\t63635\n非首层\t63636\nhttppinyincn15SLk1g26Iu\t63637\n立方\t63638\n明天天亮\t63639\n饸\t63640\n四月底\t63641\n饺\t63642\n饽\t63643\n饼\t63644\n刘静云\t63645\n饱\t63646\n从来没有人\t63647\n我的母亲\t63648\n饲\t63649\n饵\t63650\n赵云霞\t63651\n1n54644887187\t63652\n饶\t63653\n7738738\t63654\n饨\t63655\ncfm\t63656\n没有你我活\t63657\nhuime\t63658\n饮\t63659\n韩江伟\t63660\n一些一个\t63661\n饥\t63662\n调到\t63663\n杜牧\t63664\n宋宇翔\t63665\n11场\t63666\n总机\t63667\n明天电信\t63668\n陪游\t63669\n尧山\t63670\n穿挥\t63671\n说错了吧\t63672\n升学率\t63673\n老农甲球队\t63674\niiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiii\t63675\n塔罗牌\t63676\n小安子\t63677\npgbrJdhzvs\t63678\nogfofifugvyuvvcxkxk\t63679\n贴身电影\t63680\n雕塑\t63681\nBJllZjp\t63682\n十二拍\t63683\n11月31日\t63684\n取自\t63685\n李伟\t63686\n9年前\t63687\n李会\t63688\nVhhh\t63689\n李优\t63690\n总有\t63691\n30分钟后\t63692\n无法面对\t63693\n华仔\t63694\n裤脚\t63695\nst吧\t63696\n度秘瓦\t63697\n国家三要素理论\t63698\n我喜欢一个一八\t63699\n赵听说\t63700\n朱民阳\t63701\nttjjjjjjjjjjjjjjjjjjj\t63702\n度秘你好度秘你好度秘你好度秘你好\t63703\n工程师们\t63704\n就结束了\t63705\n全神贯注\t63706\n歹徒们\t63707\n超级棒\t63708\n瑞安市\t63709\nmojn\t63710\n2051\t63711\n2053\t63712\n卖身\t63713\n2055\t63714\n听着\t63715\n强悍\t63716\n临走前\t63717\n侯姨\t63718\nbaodiamindorancatmitoraticato\t63719\n阿卓\t63720\n冷皇\t63721\n牵引\t63722\njohn\t63723\n坚书坚\t63724\n痛扁\t63725\n碑刻\t63726\n悲歌\t63727\n嫁给你\t63728\n男女神\t63729\nx-2x=\t63730\n久别\t63731\n棒哒\t63732\n算了拜\t63733\n不看不懂\t63734\n小鸡不好惹\t63735\n棒哥\t63736\n翔锅\t63737\n纽卡斯尔\t63738\n美眉眼睛\t63739\n水流\t
63740\n贝壳\t63741\nPW1100G\t63742\nhi小侥\t63743\n哆啦a梦度秘我求求就算拜托你了我\t63744\n游艇\t63745\n水浒\t63746\n小游小游你好\t63747\ngiggygybubuibi\t63748\n正站\t63749\n照黑\t63750\n李炜花\t63751\n家实在\t63752\n沈阳桃仙机场\t63753\n2011年03月08日\t63754\n凯姐姐\t63755\n办成\t63756\nlllla\t63757\n刀塔咪发嗖拉西拉\t63758\n代数式\t63759\nllllm\t63760\nlllll\t63761\n磨蹭\t63762\n胡uii\t63763\n边好\t63764\n标杆\t63765\n波卡波卡\t63766\n边女\t63767\n出版\t63768\n叶绿版\t63769\n丫蛋儿\t63770\n抢走\t63771\n黄雨桐\t63772\ndmtjpg\t63773\n宠物我喜欢\t63774\n出片\t63775\naicaocao\t63776\n肥肉\t63777\n韩佳芸\t63778\n54斤\t63779\n顺风耳\t63780\n衣物\t63781\nLucy\t63782\n以西\t63783\n同事\t63784\n刀马旦\t63785\n十把\t63786\n高文慧\t63787\n九州书店\t63788\n長長\t63789\n笨爽\t63790\ntukbcf\t63791\n智慧\t63792\n夏安安\t63793\n图F\t63794\n有无虑\t63795\n丽巧如\t63796\n菲菲\t63797\n刘承浩\t63798\n同享\t63799\n再见了我死\t63800\n厌凯蒂\t63801\n遏灯云澈\t63802\n阎段\t63803\n五十分\t63804\n坂中\t63805\n仙美公主\t63806\n理工附中\t63807\n科寒假\t63808\n本尊\t63809\n独立快点儿\t63810\n哭声\t63811\n客隆人\t63812\n纽约中央公园\t63813\n丽旭\t63814\nPAD\t63815\n15562225536\t63816\nsauxuy\t63817\n秒回\t63818\nk136\t63819\n1170177227499\t63820\n2809951066\t63821\n别血\t63822\n山东工商学院\t63823\n老面\t63824\n玩来\t63825\n北国\t63826\n北图\t63827\nOKDJ\t63828\n5555415555555\t63829\n国际奥委会\t63830\n批改\t63831\n奢炎\t63832\n农历四月初六\t63833\n眼洞\t63834\n放宽\t63835\n风调雨顺\t63836\n两马事\t63837\n罗克军\t63838\n跟子\t63839\n奥tok\t63840\n福都山市核辐射紧急扫检中心\t63841\n亲爱的你慢慢飞我是天边最美的玫瑰\t63842\n朱猪猪猪猪猪猪猪猪\t63843\n筋经\t63844\n美彩真\t63845\nchanel\t63846\n四比二比一\t63847\nchenjia\t63848\n对对对对对对对\t63849\n服气\t63850\n返工\t63851\n焕主流\t63852\nGmtTj\t63853\n聊城日报\t63854\n8岁\t63855\nPA+\t63856\n跟学\t63857\n11万平方公里\t63858\n夺宝熊\t63859\n溃模\t63860\n不变更\t63861\n对不对呀你说\t63862\nSTKhhj\t63863\nfgfggf\t63864\n戏行\t63865\n审查\t63866\n发誓要\t63867\ndoyou\t63868\n杰斯卡\t63869\n沧州市\t63870\n无奇\t63871\n烦好难受\t63872\n西亚尼\t63873\n望岛小学\t63874\n啊飘\t63875\niooodoydydyyyfykfffkkk\t63876\n999999999999999999999999999999个\t63877\nerer\t63878\n植物大战僵尸只冰河世界\t63879\n夜郎国\t63880\n收起來\t63881\n比例\t63882\naway\t63883\ngvgghvff\t63884\n好吧窝\t63885\n来不然\t63886\n晚安晚安\t63887\n15套\t63888\n舌吻\t63889\n须
须\t63890\n性侵只\t63891\n蔡姐\t63892\n定错\t63893\n伤痕\t63894\n坑点\t63895\n一千年\t63896\n10岁\t63897\n不俩\t63898\n齐如意\t63899\n走入\t63900\n唔米\t63901\n伤痛\t63902\n伤病\t63903\nhttpahiphotosbaiducomxiaodupicitemb21bb051f819861816763ff74ded2e738bd4e694jpg\t63904\n走兽\t63905\n一2O\t63906\n害人精\t63907\n苏丹红\t63908\n祖祖\t63909\n凭空\t63910\n走光\t63911\n范竞丹\t63912\n独一无高\t63913\n恐怖主义\t63914\nsust\t63915\n02016\t63916\n问一找\t63917\n李莫愁\t63918\n勒布朗\t63919\n能会\t63920\ngitesc\t63921\n一听说\t63922\n耍耍\t63923\n凶光\t63924\n凶兆\t63925\n劳碌\t63926\n一小步\t63927\n亲我是你\t63928\n龙柏\t63929\n首批\t63930\n熬过\t63931\n合扰\t63932\n一名俩\t63933\n达里达里恩\t63934\n因为爱\t63935\n今一天\t63936\nftgfghhhhhbnnnnnnmm\t63937\n卖出方\t63938\ntrung\t63939\nKerry\t63940\n伴随\t63941\n侯育良\t63942\n叫饭\t63943\n高提耶大秀#MPJ小秘\t63944\nd272\t63945\noxyent\t63946\n皇姑区\t63947\n王石\t63948\n5.31美元\t63949\n首手\t63950\n哗哗哗\t63951\n好无耐\t63952\n茅尾海国家级海洋公园\t63953\ngiffucking\t63954\n我是你的主人吗三三\t63955\n正大眼镜\t63956\n4.2级\t63957\n不远去\t63958\nTTTT\t63959\n小妹儿园\t63960\n22222222\t63961\n工作区\t63962\n13511333316\t63963\nGORIES\t63964\n水淋淋\t63965\n杨老师\t63966\nfjcfjhfbkidcbmkecbkoirwsdfbjk\t63967\nHappyQyou\t63968\n╧╧\t63969\n章鱼保罗\t63970\n殷嘉依\t63971\n福自隆\t63972\n荡涤\t63973\n抠Q念发e艾燃凭榍\t63974\n最佳状态\t63975\n刘奇葆\t63976\n登飞来峰\t63977\n庄动\t63978\n克制\t63979\n七分之六\t63980\n漓江\t63981\n族化\t63982\nhuvgb\t63983\n十佳\t63984\n老你老\t63985\n十位\t63986\n断除身见\t63987\nmiyu\t63988\n吴泳铭\t63989\n永久\t63990\nStop\t63991\n生音\t63992\n134031920\t63993\n棱长\t63994\n年马\t63995\n十余\t63996\n我是坏人我是好人\t63997\n酱猪蹄儿\t63998\n迟千婳\t63999\nAD‖EF\t64000\n圆汤\t64001\n第89位\t64002\n2幅\t64003\n驻\t64004\n这点滴\t64005\n七个半小时\t64006\nppppppppppppppp\t64007\n哪英\t64008\n请缨\t64009\n杨鑫\t64010\n贱话\t64011\n青龙苑\t64012\n汉子类\t64013\n毛挖泥机\t64014\nOoooooooooooooooooooooooooop\t64015\n安眠曲行\t64016\nFans粉丝志\t64017\n喜色\t64018\n管理者\t64019\n夷衣\t64020\n妥协\t64021\n小王子我喜欢\t64022\n三到五天\t64023\n隔板\t64024\n男的女的我是男的女的好好好\t64025\n网上购物\t64026\n开心酷书吧\t64027\n杰克\t64028\n抛掉\t64029\n奥迪A6\t64030\n8.14\t64031\n宠物熊\t64032\n8.13\t64033\n20天\t64034\n大讲堂\t64035\nruioj\t64036\n池尾
\t64037\n没想起\t64038\n虚话\t64039\ngjjjuttrdsss\t64040\n橄榄\t64041\n九襄\t64042\n身孕\t64043\n夜来风雨\t64044\n身子\t64045\n虚词\t64046\n丽丽零零黄黄黄黄\t64047\nvllm\t64048\nzdjd\t64049\n小音乐\t64050\n没有才\t64051\nsmithyou\t64052\n聪明捣蛋\t64053\n朗逸\t64054\n肖娟艳\t64055\n嗯昆伦\t64056\n白脸\t64057\n光光哈\t64058\n87636\t64059\n秘典\t64060\njsksk\t64061\njsksl\t64062\n秘关\t64063\n马振恒\t64064\n心情舒畅\t64065\n四G家\t64066\n父汉子\t64067\n奥别骗\t64068\nTTTTTT\t64069\n秘全\t64070\n考布\t64071\n赵小刀\t64072\n35932585\t64073\n刘欣奕\t64074\n贾婷\t64075\n預\t64076\n税负\t64077\n背伤\t64078\n姥姥\t64079\n李宗静\t64080\noffer\t64081\n福田赳夫\t64082\n卡拉ok\t64083\n13572236166\t64084\n黑漆漆\t64085\n零七二幺\t64086\n5亿\t64087\n审判\t64088\nhkkjgjiuug\t64089\n许多多\t64090\n税费\t64091\n韩豆先生\t64092\n挂烫机贝尔莱德小冰火人\t64093\n瓦棉\t64094\n同一心行\t64095\n切克闹天宫煎饼果子\t64096\n还在乎\t64097\n叫包\t64098\n超过30日\t64099\nivodbsj\t64100\nghjgjg\t64101\n仙山\t64102\n妈妈群\t64103\n别别理我了我不想理你\t64104\n走了等\t64105\n美女贵姓\t64106\n苇塘\t64107\n汉源\t64108\n欧肉\t64109\n重游\t64110\n小Hi\t64111\n贪官\t64112\n帅哥正太\t64113\n姜佳欣\t64114\n黑行\t64115\n清清白白\t64116\n驎\t64117\n$75\t64118\n海鸥们\t64119\n桂宝\t64120\n信雅达\t64121\n据闻\t64122\n黑衣\t64123\n偏执\t64124\n大欧耶们\t64125\n欧股\t64126\n雷秀红\t64127\n856281852\t64128\nsq字谜\t64129\n悲喜交\t64130\n酸痛\t64131\n和面交\t64132\n入夏\t64133\nhiwhat\t64134\n人海\t64135\n糖豆\t64136\n铁公鸡样\t64137\n呢件\t64138\nvhjcv\t64139\n哈桑\t64140\n一条烟\t64141\n长新群\t64142\n呢们\t64143\n8656555\t64144\n标识\t64145\n当头\t64146\nhdgdh\t64147\n侯明桢\t64148\nkk71osf\t64149\n垃圾邮件\t64150\n恶露\t64151\n急跌\t64152\n興趣\t64153\njbnkb\t64154\n比三\t64155\n恶霸\t64156\nlivehouse\t64157\n标语\t64158\nury尔康\t64159\n頌\t64160\n飘洒\t64161\n佳域\t64162\n布封\t64163\n最高价\t64164\n经死\t64165\n周哥\t64166\n黄晓彤\t64167\n一颗12co\t64168\n避难者\t64169\n神勇\t64170\n545446\t64171\n死片\t64172\n畔\t64173\n呼死你个老不要脸的死东西\t64174\n躲猫猫\t64175\n荷花区\t64176\n我好爱好爱我妈妈\t64177\n535968\t64178\n防灾\t64179\n伏首\t64180\nhttppinyincne12111\t64181\n欧姆啊7s\t64182\n喜新\t64183\n衰羞\t64184\n误认为\t64185\n八十多度\t64186\n大款\t64187\ngdhf\t64188\n1imaxx\t64189\n138542655552242258\t64190\n双排扣\t64191\nMP3\t64192\n活菩萨\t64193\
n正经二话不说三心二意四面八方五颜六色六神无主七嘴八舌八仙过海九牛一毛十全十美\t64194\n江西星\t64195\n宣誓\t64196\n瓜贵\t64197\n冫冫\t64198\n猪大战僵尸\t64199\n先果\t64200\ngyvdrchvcy\t64201\n易通\t64202\n毛蟹\t64203\n一侧\t64204\n碰面\t64205\njeveh\t64206\n独怜\t64207\n易逝\t64208\n轰怔\t64209\n大聪明\t64210\n开帖\t64211\n反反复复反反复复\t64212\n添彩\t64213\n毫克\t64214\n开市\t64215\nvhvufoun\t64216\n摔起\t64217\nykjnhfbkgh\t64218\n浑欲不胜张\t64219\nhiojjhkhhiihhihihihihjiijhhhojojljl\t64220\n王婉婷\t64221\n４点\t64222\n中英街\t64223\n17771\t64224\n10.27\t64225\nkvm\t64226\nkvj\t64227\nkvk\t64228\nHSHSHEVSHVW\t64229\n怒海\t64230\n有讲\t64231\n台江一小区\t64232\n闻名遐迩\t64233\n红艳艳\t64234\nliffio\t64235\n忍者号\t64236\n宿命\t64237\n广州政协\t64238\n烈哥\t64239\n烟民\t64240\n外班\t64241\n杂疯\t64242\n帐号\t64243\nKto\t64244\n上天路\t64245\n扇子\t64246\n无与伦比\t64247\n百片\t64248\n宿员\t64249\n绊倒\t64250\n九四九五九六九七九八九九一百\t64251\n保护伞\t64252\n必需\t64253\n成都市佛教协会咨议委员会\t64254\n狗狗们\t64255\n找勿\t64256\n拉玛诺\t64257\n蠕庙镇\t64258\n0.01毫克\t64259\nlider\t64260\n砖雕\t64261\n你好爱\t64262\n有机菜\t64263\n张燕\t64264\n四川电视台\t64265\n9289\t64266\n早操\t64267\n李子树\t64268\n9284\t64269\n李大本\t64270\n胸照\t64271\n9283\t64272\n大义凛然\t64273\n六千分\t64274\n牢笼\t64275\n詹天佑\t64276\n建设方\t64277\n吕天辰\t64278\nMate9\t64279\n78251\t64280\n团委\t64281\n历史唯物主义\t64282\n夜夜夜夜夜夜\t64283\n情秘\t64284\n把非\t64285\n皮疹\t64286\n92万元\t64287\n菜脯\t64288\n四本\t64289\n条文\t64290\n定中原\t64291\n終於\t64292\n3系\t64293\n听歌吧一起来\t64294\n前脸\t64295\n造片\t64296\n度秘度秘你真了不起五行大山压不住你身下个度行者\t64297\n多疑\t64298\n新妈们\t64299\n辛勤\t64300\nJizaiiadinguandianshrj\t64301\n掉牙\t64302\n杨羽凡\t64303\n李乔荣\t64304\nfftd\t64305\nfftf\t64306\n天鹅\t64307\nmokln\t64308\ncosc\t64309\n烤地瓜\t64310\n要不要死\t64311\n苏要酸\t64312\n屏山\t64313\n佳祺\t64314\nWeico\t64315\ncosi\t64316\n地狱中的独行者\t64317\n膝盖\t64318\n明枪\t64319\n对呀快点\t64320\n20cm\t64321\n44547658779896\t64322\n胜出\t64323\n爱鲁尼\t64324\n猫肉\t64325\nkbsi\t64326\n马静\t64327\n传统化\t64328\n曲阜东\t64329\n激奋\t64330\n微贬义\t64331\n心爱电\t64332\n牛仔裤\t64333\n锁阳\t64334\n中江\t64335\n400100吧\t64336\n大头蛋\t64337\n心情不号\t64338\n界定\t64339\nCDMA2000/WCDMA/GSM\t64340\n管子\t64341\n菲方\t64342\nsasdff\t64343\n金凤姐\t64344\n
二十道\t64345\n噶阿发充卡i\t64346\n一节级\t64347\n动情\t64348\n全方位\t64349\n183元\t64350\n金育锋\t64351\n真的度\t64352\n菲斯\t64353\n18ko\t64354\n提雷米\t64355\n度秘改成度小受\t64356\n亲子游\t64357\n参考价\t64358\nHdjghdb\t64359\n凉州区\t64360\n裙纸\t64361\n装置\t64362\n一点2\t64363\nVanessa\t64364\nwapuu\t64365\n大连市东北路小学\t64366\n新的一年四季\t64367\n3.7级\t64368\n说的理由\t64369\n黄花鱼\t64370\n军工\t64371\nlljjjjgga\t64372\n中央台\t64373\n大僵尸\t64374\n方井\t64375\n八十一百六\t64376\nnnnnnnnnnnnnnnn\t64377\n太奶奶\t64378\n臭屁屁你在你在哪里\t64379\n防盗器\t64380\nxdfc\t64381\n西南大\t64382\n下底是\t64383\n女儿们\t64384\n4967年\t64385\nrieeekwkw\t64386\n夜班车\t64387\n施食\t64388\n狰狞\t64389\n85861099\t64390\n北京雍和宫\t64391\n夫君\t64392\n公办\t64393\n谷歌浏览器\t64394\n嗨皮卡丘\t64395\n一三任\t64396\n植物大战僵尸2pvp\t64397\n征服欲\t64398\n不懂人\t64399\n篮孩\t64400\nhellozh\t64401\n死别我没让你开陪我陌\t64402\n一群一群\t64403\n别学\t64404\n戒掉\t64405\n兹普\t64406\n梨树\t64407\n吴建业\t64408\n考了试\t64409\n不懂了\t64410\n秘书娟\t64411\n兴办\t64412\n吴晓霞\t64413\n施瓦辛格\t64414\n百秘\t64415\n不懂事\t64416\n一道光\t64417\n颈髓\t64418\n顺口溜\t64419\n01233486689\t64420\ncurndied\t64421\n博雅小学\t64422\n天工业\t64423\n551家\t64424\n再唱一\t64425\n开干嘛\t64426\n幸运时刻\t64427\n点把\t64428\netdgffygghv\t64429\n299万\t64430\n神范儿\t64431\n青柠汁\t64432\n长江天天盈\t64433\n典范\t64434\n紫马\t64435\n移情写情诗这首词在哪呢特色夜深千帐灯\t64436\n要否绝\t64437\n八毛八\t64438\n白羊的爱好霸道\t64439\n黄圣涵\t64440\n说不告\t64441\n共有\t64442\n帐女\t64443\n托卡的城市从你了求你\t64444\n加斯特独伊特\t64445\n面对面\t64446\n小哥儿\t64447\n328355\t64448\n害羞讨厌\t64449\n车陂\t64450\n车陆\t64451\n痛彻心扉\t64452\n单笔\t64453\n翟永康\t64454\nreor\t64455\n咱俩亲一下行\t64456\n卡布奇诺\t64457\n合合眼\t64458\n唱一首歌走进\t64459\n羔子\t64460\n车险\t64461\n回帖\t64462\n情一敌\t64463\naqpoing\t64464\n043804438\t64465\n我的有天\t64466\n测验题\t64467\n555525525226250128555225959000522662511636131313111\t64468\nhi波斯\t64469\n明天电联\t64470\n李群\t64471\n九楼\t64472\n惠州美博城\t64473\n大有关系\t64474\n泡壶\t64475\n找女\t64476\n研讨会\t64477\n繁殖\t64478\n新闻直播间\t64479\n设命\t64480\n外出旅游\t64481\n切克闹幺幺切克闹\t64482\n精细\t64483\n郭已死\t64484\n味精蛋\t64485\nQ1\t64486\n在平\t64487\n在干\t64488\n不舍不舍不舍\t64489\n挺身\t64490\n南河\t64491\n哥白尼\t64492\n口服液\t64493\n小白猫\t64494\n414
1748\t64495\n一如既往\t64496\n2152\t64497\n姐儿\t64498\n汝阳\t64499\n老女孩\t64500\nnon\t64501\nnoo\t64502\n我没疯是你疯了\t64503\n能带给\t64504\n沈阳中心气象台\t64505\n4877\t64506\n逼真好\t64507\n扫兴看\t64508\n啦啦乖乖乖\t64509\n夏集镇\t64510\n七楼书店咖啡厅\t64511\n2451131382\t64512\nnou\t64513\nnow\t64514\nnop\t64515\nnor\t64516\nnos\t64517\n多哪\t64518\n大眼怪\t64519\n星飒\t64520\n3107583656\t64521\nnoI\t64522\n摸蛋节\t64523\n多哥\t64524\n克武\t64525\n沒錢讀\t64526\n梁先生\t64527\n吃苹果\t64528\n服务欲\t64529\nhdghfg\t64530\n台车\t64531\n试探\t64532\nDdgbx\t64533\n对话儿\t64534\n智能娃娃机\t64535\n金牌红娘\t64536\n能不能不能\t64537\n三二九二零\t64538\n七十九元\t64539\n吧嗒\t64540\n泉帅\t64541\n春心\t64542\n零零零零\t64543\n一一千多岁\t64544\n下泄\t64545\nggghhhuhhhhhhhhhhhhhh\t64546\n委比\t64547\n小秘书求你了你在\t64548\n莎梦露\t64549\n小轿车\t64550\n武文涛\t64551\n三九一二\t64552\n秘头\t64553\ngxlhg\t64554\n电脑显示器\t64555\n1664253\t64556\nYoshiko\t64557\n51v\t64558\n杨溥\t64559\n三趟\t64560\n苏增添\t64561\n这个贴\t64562\n#2NE1#2NE1\t64563\n秘决\t64564\nvevivo\t64565\ndsgd\t64566\n小麻糕\t64567\n知音kc\t64568\n2060\t64569\n北海市公安局\t64570\n柏芝\t64571\nibab\t64572\n三级片儿\t64573\n1001个\t64574\n补中\t64575\n46868875885557557780086552\t64576\n来我在\t64577\n筒袜\t64578\n甩干\t64579\n518\t64580\n沉闷\t64581\n赵壮鲁\t64582\n9999999999986986999999999999999999999999999999999\t64583\n贝贝贝\t64584\n天莲叶\t64585\n511\t64586\n510\t64587\n513\t64588\n512\t64589\n515\t64590\n514\t64591\nFKK\t64592\n切掉\t64593\n九十块\t64594\n情男\t64595\n大学新生\t64596\n赔礼道歉\t64597\n查道\t64598\n氧化铜\t64599\n狂言者\t64600\n僵尸大战\t64601\n女朋友们\t64602\n很显然\t64603\n波波波波波波波波波波波波\t64604\n各个\t64605\nyhsjegv\t64606\n400W\t64607\n运动版\t64608\n说来话长\t64609\n问讯\t64610\n47比九则\t64611\n888哒哒\t64612\n木秘\t64613\n以唱吧\t64614\nKmn\t64615\n丽君\t64616\n825753555\t64617\n泪点\t64618\n那多聊\t64619\n灵狐者\t64620\n赢了帮\t64621\n出口额\t64622\n一万一万句\t64623\n同级\t64624\n监护权\t64625\n导轨\t64626\n香飘飘香\t64627\n吉血战堡\t64628\n飘然何处\t64629\npnb\t64630\n连绵不绝\t64631\n给我等\t64632\n梅川\t64633\n堰塘\t64634\n辛夷坞\t64635\n哈罗马\t64636\n东方梦工厂\t64637\n少人\t64638\ndavvd\t64639\n南海诸岛位置图\t64640\n杜嘉琪\t64641\n小三打\t64642\n大希望\t64643\n微言情\t64644\n隐疾\t64645\n来事不信\t64
646\n不灭你的妹妹上学\t64647\n挥身\t64648\n舞台型\t64649\n配一配\t64650\n崇洋不媚\t64651\n连天\t64652\n晓风干\t64653\n鹰派\t64654\n臭名昭著\t64655\n一百二十几\t64656\n我的妈呀妈呀我的妈呀吗\t64657\n画叉\t64658\nuxdjhu\t64659\n茂业\t64660\n度月\t64661\n钻石心\t64662\n诧异\t64663\n健身房\t64664\n貌不在\t64665\n波司登莫代尔\t64666\n李勇\t64667\n犀浦\t64668\n1000000\t64669\n恶恶\t64670\n告訴\t64671\n微会\t64672\n你的婚你的号\t64673\n一舌吻\t64674\n度本\t64675\nhgyccj\t64676\n100000T\t64677\n有种\t64678\n王叫我\t64679\nduggggu\t64680\n天天酷宝\t64681\n有秘\t64682\njhuujgg\t64683\n人损\t64684\n环岛路\t64685\n口球塞\t64686\n握住\t64687\n控卫\t64688\n英雄联盟\t64689\nsgjshks\t64690\n曹成\t64691\ndodrk\t64692\n大鼓\t64693\n粉饺\t64694\nHO\t64695\n半山腰山顶\t64696\n爱雅\t64697\n飞碟\t64698\n劈柴\t64699\n五十多条\t64700\n关会\t64701\n我和我的好朋友好\t64702\n油香\t64703\n八十千米\t64704\n李逍遥\t64705\n偶淡定的继续专一爱我的空客\t64706\n霞涛\t64707\nmaqbhj5524444444550\t64708\n玛尼\t64709\n母恨\t64710\n不是爱我是真的很恨你恨你\t64711\n他们俩\t64712\nlay帅\t64713\n聪明丽丽\t64714\n莱曼\t64715\n霿大肆\t64716\nhghhg\t64717\n比利时国家队\t64718\n前任\t64719\n错句机器人\t64720\n古典小说三国演义\t64721\ntyerutffgddgg\t64722\n真赞\t64723\n秀恩爱\t64724\n疑问你说呢你说\t64725\n扎破\t64726\nlexultf\t64727\n窘境\t64728\n九点半后\t64729\n孙正佳\t64730\n怨之恋\t64731\npeakEng\t64732\n纽菲\t64733\n洗阴\t64734\n35w头\t64735\n源哥\t64736\n铝锅\t64737\n鱼肉色\t64738\n肛门腺\t64739\n恰噶\t64740\n詹景森\t64741\nhjrdf\t64742\n竹超\t64743\n坎贝尔\t64744\n导火线\t64745\nulolon\t64746\n肉骨头\t64747\n五九来\t64748\n你的心情\t64749\n户型\t64750\n摩罗蜂\t64751\nIE地址\t64752\n凯越蝶雅蠛蝶雅蠛蝶雅蠛蝶雅蠛蝶\t64753\n夏粮\t64754\n暴打\t64755\n庙门战旗\t64756\n和睦\t64757\nismay\t64758\n780988413325\t64759\nhttpfhiphotosbaiducomxiaodupicitema5c27d1ed21b0ef446f7dd2ddac451da81cb3ea1jpg\t64760\n你是猪么不要脸的猪\t64761\n家畜\t64762\n三十分\t64763\n的家六小尼姑背影黑妞\t64764\nchiphotosbaiducomxiaodupicitem0ff41bd5ad6eddc403bdff883edbb6fd5266336ejpg\t64765\n十米\t64766\n鲃\t64767\nDoyoutypetypereality\t64768\naaa儿\t64769\n不在假\t64770\n一个多星期\t64771\n1月17号\t64772\n颗苗\t64773\n热卖\t64774\n农地\t64775\n舍解霸\t64776\n生见\t64777\nIsee\t64778\n永健\t64779\n农场\t64780\n白求恩\t64781\n男健\t64782\n趄K爪\t64783\n加彪\t64784\n温陵\t64785\n二十四二十四小时\t64786\n关志楷\t64787\n两两三个\t6478
8\n张丁丁\t64789\n奔走相告\t64790\n年票\t64791\n2011年6月3日22时10分\t64792\n谢文静\t64793\n妥雷\t64794\n王晓曦\t64795\n码行\t64796\n85556247966\t64797\n感激\t64798\n一百亿零五百四十四万四千五百五十五\t64799\n母恋\t64800\n银条儿\t64801\n弘法\t64802\n就是你明明就是你你是猪你是猪你是猪你\t64803\n9点408点\t64804\n人身保\t64805\nGVHVU\t64806\n明胶\t64807\n闭门思过\t64808\n今晚23点\t64809\ntmah\t64810\n聊不起\t64811\nhttphhiphotosbaiducomxiaodupicitem9922720e0cf3d7ca849b3102f51fbe096b63a951jpg\t64812\n干漆\t64813\n曾星宇\t64814\n呃镁棒\t64815\n幺二零幺零八一九八六五八六二三幺五二\t64816\n取为\t64817\n嘎拉嘎\t64818\n思妮\t64819\n40一个\t64820\n图示\t64821\n#2012COSMO美容大奖微博评鉴团#水动力精萃凝润\t64822\n汉奸们\t64823\n怪胎\t64824\n5月28日前\t64825\n地关系\t64826\n雪达瑞\t64827\n2.5亿\t64828\nactually\t64829\n愈来\t64830\n走一步\t64831\n外频\t64832\n新流记\t64833\n绷带\t64834\n舍成\t64835\n第几声\t64836\n王八独子\t64837\n泓宇\t64838\n取下\t64839\n美瞳片\t64840\n岩洞\t64841\n分娩\t64842\n杨俊锋\t64843\n徕卡\t64844\n大花布\t64845\n微软全球合作伙伴\t64846\n咪列\t64847\n芭乐\t64848\njxjdjfjg\t64849\n小猪蛋蛋\t64850\n接龙我先说\t64851\n东安大街53号\t64852\n瑟下刚\t64853\n鞠新牙\t64854\n海德\t64855\n孤傲\t64856\n小儿科\t64857\n违建\t64858\n恩富\t64859\nbdbvvvvcvcvcvcv\t64860\nEducationUSA\t64861\n高级片\t64862\nfcdfscdrtfsd\t64863\n4点\t64864\nurlpp\t64865\n29903306\t64866\n栀娘\t64867\n丧心病狂\t64868\n3千\t64869\n6666666666666666655444332114456667889908754322\t64870\n恩对\t64871\n心烦了你可以\t64872\nFijc\t64873\n半生缘一世情\t64874\n算了吧不想\t64875\n需要你班到\t64876\n舰桥\t64877\n滋味儿\t64878\n怷系\t64879\n七九切\t64880\n死掉了了\t64881\n男淫\t64882\n缘故\t64883\n林公寓\t64884\n善心\t64885\n尸兄\t64886\n54756167\t64887\n鼓捣\t64888\n125865786\t64889\n萨基\t64890\n二莉莉\t64891\n阳伞\t64892\n爸爸了飞\t64893\n毛衫\t64894\n黄梅戏\t64895\n42424242434343434\t64896\n善念\t64897\n钢炼\t64898\n毛衣\t64899\n尸香腐花\t64900\n天天天天\t64901\n389X4\t64902\n东线\t64903\n一五道\t64904\n辛苦啦一\t64905\n毛衰\t64906\n1113家\t64907\n小布姐\t64908\n卢义森\t64909\n有事稳\t64910\n张老师\t64911\n所短\t64912\n我是真的女生不骗你骗你我是猪\t64913\n拔罐\t64914\n归还\t64915\n卢子谦\t64916\n3261606277\t64917\n财校\t64918\niwodskskd\t64919\n安莱芜\t64920\n泡吧论坛\t64921\n难于上青天\t64922\n粑粑鬼爸爸爸爸呗爸爸呗乖乖爸爸背娃娃128888\t64923\n考古探险\t64924\n白布\t64925\n老孔\t64926\n斗法\t64927\netihw\t
64928\ngddbjo\t64929\nletit\t64930\nmaggie\t64931\ndvjcvnbm\t64932\n十六吨\t64933\n代班\t64934\n张。nfgbn\t64935\nc君\t64936\n43万亿\t64937\n接不暇\t64938\n承载力\t64939\nhth\t64940\n警卫局\t64941\n挖矿工\t64942\nam8\t64943\n烧豆汁\t64944\n2006年4月份\t64945\n自言自语\t64946\n月台\t64947\n老泰斗\t64948\n我是狗你是猫我是猪\t64949\n花销\t64950\nxddddsrsrewftf\t64951\n鸣笛\t64952\n待会儿再聊\t64953\n公用事业\t64954\n侠义\t64955\n不用啦\t64956\n科里\t64957\n李一奇\t64958\n偏南风\t64959\n高梦琳\t64960\n相对论\t64961\n超女友\t64962\n网摩嚮\t64963\n三点三\t64964\n算上\t64965\n太妖孽绝宠世子妃\t64966\namo\t64967\namn\t64968\nama\t64969\n帅出翔\t64970\n愿力铣\t64971\name\t64972\namg\t64973\n几千公里\t64974\namy\t64975\n东光\t64976\nhjijhvf\t64977\n简史\t64978\n天桂\t64979\namp\t64980\nams\t64981\namt\t64982\n药味\t64983\n孙睿\t64984\n恩顾\t64985\n北京烤鸭\t64986\n生日堡\t64987\n东兰\t64988\n三十余\t64989\n东兴\t64990\n王有明\t64991\n快儿\t64992\n天桥\t64993\n噗噜噜噜咕噜\t64994\n鸟声\t64995\n黄金农场\t64996\n三十位\t64997\n创洗\t64998\n尿急故事\t64999\n15p\t65000\n石火秋凉\t65001\n陈镇滔\t65002\n天呐真是的我说的话\t65003\n蓝胖子\t65004\n看我爸\t65005\n闲不住\t65006\n拉里\t65007\n清畅\t65008\nAndNFB\t65009\n思思\t65010\n1850\t65011\n6哦\t65012\nIyourphone\t65013\n18098500273\t65014\n狗猪狗\t65015\n一千度\t65016\n认识了没有\t65017\nhttpimgcacheqqcomclubitemparcelitem80805da31989bfb4530f72c07ba22af5ddraw200gif\t65018\n一代表\t65019\n瓦贝\t65020\n痛快儿唱\t65021\n喜欢你可以\t65022\n林真心\t65023\nooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo\t65024\n外拍\t65025\n宇谜\t65026\n缺水\t65027\n哈不分离\t65028\n天一夜辉\t65029\n预约券\t65030\n七一八\t65031\n缺氧\t65032\n鸿芧\t65033\n形容文\t65034\n帝女花快吃点兔兔\t65035\nThefactIcanuuuuy\t65036\n女干活\t65037\n777999999999\t65038\n吗这话说\t65039\n慢慢奥\t65040\n紫云山\t65041\n可能性农村补偿金\t65042\n得乎\t65043\n这种人\t65044\n成落\t65045\n珠海市香洲区法院\t65046\n谢你个小子不打你你给我走开\t65047\n一周内\t65048\n会计会计会计会计会计会计会计会计\t65049\n江川路\t65050\n不乖了真是\t65051\n踢踢腿\t65052\n卡露亚hello你好奥特曼\t65053\n说舍\t65054\n无情无意\t65055\n嗯红valo\t65056\nzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz\t65057\n四五百块\t65058\n我的爱车\t65059\n武好\t65060\n漏判\t65061\n受点\t65062\n机枪片\t65063\n小水滴\t65064\n心机女\t65065\n这种事\t65066\n富贵荣华\t65067\n女儿河上游
\t65068\n忙去\t65069\n女性性\t65070\n尿酸\t65071\n自己的家\t65072\n塞巴斯蒂安\t65073\n妥p\t65074\n干红薯藤\t65075\n可♀吐\t65076\n贴有\t65077\nFlower\t65078\n扩写\t65079\n扩军\t65080\n李丁宁\t65081\n逛西逛\t65082\n拜把子\t65083\ntfboyspkxo\t65084\n杨智勇\t65085\n轮轴\t65086\n644646\t65087\nbdjdgeejdudheekkekejeheueensnsmee\t65088\n张华阳\t65089\nEXO你饭\t65090\n张爱东\t65091\n知情\t65092\n卫喉炎\t65093\n哪身\t65094\n大黑大黑\t65095\ngjgjajj\t65096\n12点32\t65097\n呵狼\t65098\n瞧你好搞笑\t65099\n面基吗ouo\t65100\nxuic\t65101\n周身\t65102\nstyou\t65103\n完再发\t65104\n直男朋友\t65105\n老里\t65106\n长思天\t65107\n抛压\t65108\n我讨厌你我讨厌讨厌讨厌你\t65109\n琴键\t65110\n红米比\t65111\nmatrooper\t65112\n内蒙古飞机场\t65113\n此际\t65114\n有屋可居\t65115\n黄赌毒\t65116\n燃度\t65117\n47147\t65118\n18拉\t65119\n杀人犯\t65120\n刁小洁\t65121\nＣＶ\t65122\n于仁泰\t65123\n百分数\t65124\n幺二零幺\t65125\n杨宇轩\t65126\nstrou兔\t65127\n厨师\t65128\n麦一动漫\t65129\n计算机\t65130\n亲爱的老公我报到来啦\t65131\n一口儿\t65132\n耳膜\t65133\n变卦\t65134\nthcf\t65135\n扒粪\t65136\n再见再见\t65137\n100112\t65138\n尼打野\t65139\n变卖\t65140\n7月5号\t65141\n远特幺七幺\t65142\n拜拜拜拜拜拜拜拜拜拜拜\t65143\ncudcb\t65144\n白月\t65145\n35套\t65146\n10102020304年\t65147\n陈俊海\t65148\n那兄弟\t65149\n允许\t65150\nTM爷\t65151\n6667\t65152\n推手\t65153\n卑微\t65154\n咪咪咪咪咪\t65155\n孑然\t65156\n黑吧\t65157\n开五区\t65158\n鬼百鬼\t65159\n建院\t65160\n多一个三八\t65161\n军棋\t65162\n盛白垩纪圣齐儿摔\t65163\n对过\t65164\n不是今我要死看\t65165\n午安\t65166\nSjsk\t65167\n减少\t65168\n大魔鬼\t65169\n第九个\t65170\n没有我不知道\t65171\n高远\t65172\n黑吉\t65173\n高飞狗\t65174\n李晨度\t65175\n黑名\t65176\n马伯乐\t65177\n斯奥\t65178\n阿拉伯湾\t65179\n秦南\t65180\n披肩\t65181\n汉王人xin\t65182\n偷偷图理论天津站图腾投图太无组织秃头作图\t65183\nP马\t65184\n谢亚龙\t65185\n坐享其成\t65186\n星际\t65187\n巴扎黑\t65188\n个生天\t65189\n走进去\t65190\n爱屋及乌\t65191\njkoujum\t65192\n扭扭扭\t65193\n代表们\t65194\nugbesuu\t65195\n3pshakalitt3ba\t65196\n乖孩子\t65197\n广电局\t65198\n6月18日前\t65199\n兽王\t65200\n国际化\t65201\n3天后\t65202\nblackalmic\t65203\n小雪转晴\t65204\n遥遥无期\t65205\nJ520\t65206\n一百二十七咧\t65207\n高级人民法院\t65208\nayumi撩特\t65209\n公益广\t65210\n烟桥村\t65211\n可并没有\t65212\n文化馆\t65213\nhubd\t65214\n465元\t65215\n精害死\t65216\n十周年\t65217\n巴啦啦小魔仙度秘\t65218\n十八片\t65219\n帕米尔高原
\t65220\n凤逆天下\t65221\n乖奖\t65222\n炮谷\t65223\nAfrojack\t65224\n赖孙\t65225\n威严\t65226\n牛坑\t65227\n李淑雅\t65228\n赖子\t65229\n宝翠花都\t65230\n腮帮\t65231\n张璐明\t65232\n丰宁\t65233\n弄潮\t65234\n中央日报\t65235\nmargherita\t65236\n我喜欢我是别一强我喜欢玩玩记\t65237\n读子\t65238\n一个一张\t65239\n二人世界\t65240\n动脑瘫\t65241\n娱乐周刊\t65242\n连佐罗\t65243\n许德奇\t65244\n冷现\t65245\n汤雯茜\t65246\n四脚朝天\t65247\n古罗\t65248\n大河向东流啊天上的星星前辈的食物9399来你在我就在\t65249\n浪费时间\t65250\ndhjdjjdjdhfjhffhfhjndhdjddhjhdhfhhhfcgfghdhdhfhhhffhhgdhdgdbdhgdhfhfhdhdhhdhdhdhfdfhhdhdhdhdhbdhfjfhdjhdhdjdhdhfhfdhdgdhfhffdhhdhfdfhfhfhjhdhhjfuuehdjjrhdhjrhgdhejzsirhhdjdjjdhhhehghhshdhhdhshhdhjjrjhdhbddgfjjmdjdhhddhfjhfhfhfhhfhdhdhdjdhghrjrhhdhfh\t65251\n熊水浪\t65252\nhcct\t65253\n花姑娘良民花姑娘你好\t65254\n天黑之前\t65255\n满天下\t65256\n陈永贵\t65257\nhccf\t65258\nhcch\t65259\n三角度\t65260\n水都\t65261\n3月4日晚九点半\t65262\ng9208\t65263\n丽思\t65264\ngehtes\t65265\n水晶石\t65266\n断背\t65267\n德神想\t65268\n贼拉a1111\t65269\n的确实\t65270\n分米\t65271\n露营\t65272\ntufif\t65273\n张无可\t65274\n分类\t65275\n食光\t65276\n迎接\t65277\nZFJ\t65278\n闫赛汀\t65279\n第一会\t65280\n二百五三百五十\t65281\nJimhasagreat\t65282\n阿炳\t65283\n结婚假\t65284\nsametime\t65285\n啵啵撸撸\t65286\neurityftyg\t65287\n石河子\t65288\n麦毅琴\t65289\n額55\t65290\n阿嘎嘎\t65291\n西游记西游记之大圣归来\t65292\n1234567891011121314151617181820\t65293\n小人妖\t65294\n领好\t65295\nLjuiipppoutfgvcj\t65296\n保利\t65297\ngoodyes\t65298\n灵机\t65299\n还给你\t65300\n哈白双黄蛋\t65301\n广发证券\t65302\n恩纳\t65303\n真心\t65304\n符贝贝\t65305\n第二排\t65306\n切期\t65307\n2854458555555555555555555555555555555555555555555555555555555555555555555500000000000000000000000000004444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444444\t65308\n彭天云\t65309\n领奖\t65310\n真是好朋友\t65311\n么东东奥\t65312\n740\t65313\n查文\t65314\n744\t65315\n745\t65316\n收盘\t65317\n屁眼子\t65318\n748\t65319\n749\t65320\n李大脚\t65321\n1826440653\t65322\n城田优\t65323\nvffjyrxw\t65324\n十八岁了你猜我是男森还是女森\t65325\n129点\t65326\n74%\t65327\n123422344234362344234\t65328\n心底的秘密\t65329\n氢气\t653
30\n和气死\t65331\ntf卜\t65332\n吹花\t65333\n1年\t65334\n吉祥颜彩\t65335\n復說\t65336\n最长\t65337\n开心度\t65338\n裸寄\t65339\n南京新百\t65340\njdhwfcejhkmnmmh\t65341\n小萌王\t65342\ndiffuser\t65343\n田正果\t65344\n360000000000000000\t65345\n80元\t65346\ndjwi\t65347\n2295263669\t65348\n城际票\t65349\n五毛二\t65350\n昊哥\t65351\n电能\t65352\n525588\t65353\n二十三岁\t65354\n度秘你是猪么猪\t65355\n半径\t65356\n冯导\t65357\n陈思\t65358\n牧牧\t65359\n二岁\t65360\n叶芝\t65361\n颜荣仪\t65362\n可强\t65363\n娜英法\t65364\n狮虎世界\t65365\n请大谅\t65366\n告诉我我该\t65367\n陈怡\t65368\n董秘太傻了我不想你说话\t65369\n50年前\t65370\n陈总\t65371\n中通\t65372\n奶酪陷阱\t65373\n中途\t65374\n轮轮\t65375\n神经\t65376\nbxgcfhu\t65377\ndmcnm\t65378\n真菌\t65379\n数第七\t65380\n2333333\t65381\n购车\t65382\n独立你好我也\t65383\n白蕊\t65384\n没课\t65385\n真菜\t65386\n撒干\t65387\n没说\t65388\nppopped0ppppppp\t65389\n冰魄\t65390\ntf卜oys\t65391\n越位射门\t65392\n77岁\t65393\n关于办理诈骗刑事案件具体应用法律若干问题的解释\t65394\n讨厌你讨厌你讨厌讨厌讨厌你\t65395\n经历济\t65396\n汪苏伦\t65397\n43632636558645\t65398\n照再说\t65399\n應該\t65400\n冲走\t65401\nUN1Q\t65402\n我不要和你交朋友再见了永别了我不爱你我恨你\t65403\n做工家\t65404\n幺五六二零九九八九六幺\t65405\n第五十七条\t65406\n13848054938\t65407\n12345790\t65408\n姄fugggfgyhhcfh番\t65409\n你个臭不要脸的你男女\t65410\n嶙峋山\t65411\n蔡少芬\t65412\n更富有\t65413\n聚首\t65414\n菩提噗\t65415\n猎吧\t65416\nHiddleston\t65417\n不要你了我走\t65418\nrwq\t65419\n9月28日\t65420\neeugglj\t65421\n笨况\t65422\n朱剑鹏\t65423\n2729713242\t65424\nyfifvccjk\t65425\n承保\t65426\nznnzzb\t65427\n礼盒\t65428\nhjhggh\t65429\n鸣凤\t65430\n10月1日\t65431\n4659446464646464644666164644646464664664664646166464646\t65432\n嗨皮娜拉\t65433\n撩人\t65434\n再讲一个吧\t65435\n高高挂起型\t65436\n背一点\t65437\n余少霞\t65438\n同性\t65439\n江苏师大\t65440\n控制权\t65441\n我就爱你爱你爱你\t65442\n龙家堡白肉血肠饭店\t65443\nxhsdjsd\t65444\n红毯\t65445\n斯顿比\t65446\n好大好大好大好大\t65447\n七月13日\t65448\n虾米网\t65449\n零二二零二\t65450\n赵军华\t65451\n周艺洁\t65452\n787777657268627672722\t65453\n我的煌煌\t65454\n夫复何言\t65455\n方红\t65456\n萎缩\t65457\nIsthatasong\t65458\n囧高莎\t65459\n闹翻闹\t65460\n麦c9局\t65461\n床行\t65462\n笑言\t65463\n滑水板\t65464\n定陶\t65465\n张副官\t65466\n马浩伟\t65467\n快乐女声\t65468\n房产\t65469\n货跟\t65470\n第5代\t65471\n忧患\t654
72\n庆贺\t65473\n斯里兰卡佛牙寺国际佛教博物馆\t65474\n12v鼓\t65475\n0987654321234567890\t65476\n调情\t65477\n更秘\t65478\n腐朽\t65479\n四川奥\t65480\n97一个98\t65481\n死了擦\t65482\n年复快递\t65483\n孙一骏\t65484\n第一拆\t65485\n零钱罐\t65486\n2.46%\t65487\n香途\t65488\nipad#\t65489\n国有淫行\t65490\n小妹度\t65491\n雷引国\t65492\n小叔叔\t65493\n睢县\t65494\n委员们\t65495\n盘发\t65496\n754分\t65497\n魔爪\t65498\nipad2\t65499\n满意不满意\t65500\n姬玥\t65501\nfhfjgjjghhghfhyyyytfn\t65502\n雙子座\t65503\nfyhvih\t65504\n86003章\t65505\n额馕\t65506\n漆皮\t65507\n瞄准\t65508\n三系\t65509\n逃跑\t65510\n船头\t65511\n袁游\t65512\nboce\t65513\nghtui\t65514\n睡觉吧睡觉吧睡觉吧好想要哪好痒痒哪还有人哪好呀呀哪睡觉吧\t65515\n郑多燕\t65516\nthcghgg\t65517\ntfm\t65518\n1200千克\t65519\ntfo\t65520\ntfi\t65521\n180遍\t65522\ntfj\t65523\ntfe\t65524\ntfd\t65525\ntfg\t65526\ntff\t65527\n91页\t65528\ntfc\t65529\ntfb\t65530\n1919\t65531\ntfy\t65532\ntfx\t65533\ntfz\t65534\n1911\t65535\n1910\t65536\ntfv\t65537\ncurtisooratiook\t65538\n1916\t65539\n≮\t65540\n≯\t65541\n投机化\t65542\n谢忠诚\t65543\n轻慢\t65544\n≦\t65545\n≧\t65546\n礼数\t65547\n≥\t65548\n荣庆物流\t65549\n≠\t65550\n≡\t65551\n反应慢我爱你\t65552\n8.8级\t65553\n慢慢氮\t65554\n吉普赛豆腐\t65555\n国家土地督察局\t65556\n韩文名\t65557\n最下贱\t65558\n口香糖\t65559\n确然\t65560\n郑燕柳\t65561\n模糊\t65562\n切克我说闹药药切克闹\t65563\n32089元\t65564\n2004年7月13日\t65565\n组合作\t65566\n近况\t65567\nxo12\t65568\nnull\t65569\n廣東話\t65570\n刘芳媛\t65571\n面子\t65572\n郑恺帅\t65573\n你萌\t65574\n面孔\t65575\n25章\t65576\n亚如\t65577\n工商卡\t65578\n浴缸\t65579\n文言文\t65580\n徐晓凤\t65581\n再见吧梦中\t65582\n三百六十五五万七千两百多少\t65583\n保险箱度秘\t65584\n爱不爱我\t65585\n统治\t65586\n啦赞\t65587\ntube6\t65588\n那是你你是男是女\t65589\n12倍\t65590\n别更\t65591\n护眼\t65592\n花案\t65593\n信宜\t65594\n花框\t65595\n忘了再见\t65596\nnplea\t65597\n格老子\t65598\nhfev\t65599\n穿越火法师\t65600\n新新人类\t65601\n全乎儿了\t65602\nc贾岛\t65603\n喜欢你了我\t65604\n危房\t65605\n两过\t65606\n3xz91707\t65607\n放长假\t65608\n男娃娃\t65609\n到喜\t65610\nbzs\t65611\n豆壳\t65612\n前科\t65613\n筋疲力尽\t65614\nbzb\t65615\n张洪羽\t65616\njget\t65617\n河里样\t65618\n三八零零\t65619\n生命的天使\t65620\n三十一个二十四\t65621\n浩气\t65622\n三八二零六\t65623\n高师\t65624\n信容卡\t65625\n普庆\t65626\n杏仁粉\t65
627\n灵灵\t65628\n打給\t65629\n高帅\t65630\n卡托儿\t65631\nYhfbjbvxjbbd\t65632\n香格里拉\t65633\nmthecoyote\t65634\n受不起\t65635\n四个五\t65636\n初一个\t65637\n女招\t65638\n县委书记\t65639\n萨博汽车母公司荷兰世爵汽车公司\t65640\n哎呀你的话\t65641\n高带\t65642\n度密度密度密度秘你好你好你好\t65643\n梁纳言\t65644\n我恩知恩恐怖冬\t65645\n尤其中学\t65646\nfhbu\t65647\n魏姓\t65648\nfhbx\t65649\nwanan\t65650\n李冰睿\t65651\n结照\t65652\n王彦科\t65653\n常月琴\t65654\n湿漉漉\t65655\n一天到现在\t65656\n新东村\t65657\nhihihhihihihihihihihihihihi\t65658\n3撇\t65659\nv1m\t65660\n巧喔\t65661\n冯尹烨\t65662\n火葬场\t65663\n骆翠英\t65664\n你不我不疼\t65665\n无毅\t65666\n画脚\t65667\n逗汉\t65668\n冬月\t65669\n三月三\t65670\n无比\t65671\n三月一\t65672\n分量\t65673\n终值钱\t65674\n无毒\t65675\n导航仪\t65676\n勾勒\t65677\n考不过\t65678\n泥彩\t65679\n小菱\t65680\n龙卧\t65681\n邓人鱼\t65682\nhlsbd\t65683\n龙卡\t65684\n时间空间转换\t65685\nddddddxxc\t65686\n88888888888888888888888888888888888888\t65687\n26415\t65688\n折都大队\t65689\n玉iyr\t65690\n垫板\t65691\n完款\t65692\n公子政\t65693\ndeip\t65694\n小菜\t65695\nopper\t65696\n营地\t65697\n劲放\t65698\n勾勾\t65699\ndeeerrr\t65700\nwjwbsbsjs\t65701\n小菊\t65702\n左菊\t65703\n龙南\t65704\n精品\t65705\n三聚氰胺\t65706\n塑造\t65707\n跟秘\t65708\nIsabella\t65709\nbridb\t65710\n电话费\t65711\n凌晨3点4小时\t65712\n66665656560\t65713\n冰点\t65714\ndffer\t65715\n猜迷语\t65716\n铁板\t65717\n温你\t65718\n铁杵\t65719\n种小鹿乱撞\t65720\nifimato\t65721\n这么回事\t65722\n4月底\t65723\n缺憾\t65724\ndYy\t65725\n喀喀\t65726\n死瘟\t65727\n铁杆\t65728\n深康村\t65729\n百事可乐\t65730\n┙n　
／／／＼＼＼n｜｜nd\t65731\n保健品\t65732\n少年先锋队\t65733\n粑粑粑粑粑粑粑粑\t65734\n鸭相似丫\t65735\n淑德\t65736\nfeauter\t65737\n我想的好想对你对你的宠爱\t65738\n短心\t65739\n忘了问题\t65740\n方便女\t65741\n累世\t65742\n冒死\t65743\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t65744\n广东省田径队\t65745\n30多块\t65746\n善行\t65747\n十三分之一\t65748\n沃尔夫\t65749\nemnddsnxpksjjsnnjjdkwlbsojjbpvouhhjjhmhg\t65750\n田艳艳\t65751\n很大声\t65752\n水电站\t65753\n青春痘\t65754\n十十\t65755\ngostru\t65756\nuvvuvu\t65757\n黑桑黑嗓音节\t65758\n涉嫌\t65759\nghhud\t65760\n喔们\t65761\ndhkn\t65762\n120只\t65763\n项目\t65764\n人身处\t65765\n2585\t65766\n花花花花花花花花花花花\t65767\n2581\t65768\n2580\t65769\n441215224475244\t65770\n杨智居\t65771\n孙李\t65772\n相权\t65773\n2589\t65774\n不过如此\t65775\n￥86\t65776\nttfffgggggghjge\t65777\n母甘\t65778\n跟你说\t65779\n说话再说\t65780\nhttpimagebaiducomsearchwisealatnwisealaieutf8word%E58FA4%E4\t65781\n5212314\t65782\n中建一局沈阳分公司\t65783\n啊撸啊鲁阿\t65784\n有一次天\t65785\nHguhggyhgujgyhfydjcdxvnkdredjojjgf\t65786\n孙杨\t65787\n枪炮\t65788\n喵帕斯\t65789\n生客\t65790\nl岁\t65791\n宝貝\t65792\n好魔性\t65793\n哈比卜\t65794\nsaybye\t65795\n不是不是不是\t65796\n给你了你才说\t65797\n宾格\t65798\ncostly\t65799\n嫉妒风\t65800\nadddg\t65801\n暖言\t65802\n乌泡\t65803\n奸死\t65804\n好不我时\t65805\n妃女\t65806\n宋奥\t65807\nSlash\t65808\n兴化经济开发区\t65809\n伏龙\t65810\n云大成\t65811\n乌泱\t65812\n人度秘\t65813\nQS\t65814\n一百四十多块\t65815\n僅記\t65816\n妃奇\t65817\n闹闹闹闹闹闹闹闹闹闹\t65818\n我讨厌你讨厌你讨厌你恨你恨你恨你\t65819\nwnan\t65820\n么子安的安的希勒普玛的游戏了吗子\t65821\n绵阳市\t65822\n席位\t65823\n视线\t65824\n萌二硕\t65825\n映像\t65826\n大禹长都\t65827\n倾世皇妃\t65828\n衣去\t65829\nTINFBCL\t65830\n姐弟\t65831\n横跨\t65832\n独立独立\t65833\n许文强\t65834\n上钟\t65835\nlie\t65836\n谢丽源\t65837\n小黑哥\t65838\n古浪\t65839\n亲爱的妈妈\t65840\n感到遗憾\t65841\n大河网\t65842\n78章\t65843\n上钉\t65844\n黄桃\t65845\n草海风\t65846\n森菲尔\t65847\nbjgh\t65848\n死对头\t65849\n论价正经娃\t65850\n弟兄们\t65851\n毛笔字\t65852\n996699\t65853\n度日维艰\t65854\n数5000万\t65855\n校斗群\t65856\n健康康\t65857\nowww\t65858\n度秘嘞\t65859\n樊松清\t65860\n127jgtlajgcffbdgrddfgaqyjftuxgyuvyvctfygunirxxtbunomo\t65861\n玄龙角\t65862\n卑则\t65863\n阳性\t65864\n温颜蕾\t65865\n动西\t65866\n法律咨询\t65867\n司御
风\t65868\n那秘堡\t65869\n这样儿\t65870\n勒夫\t65871\nlik\t65872\n坏蛋板\t65873\n六二月\t65874\n主观性\t65875\n意面\t65876\n赈济\t65877\nhvbjhcv\t65878\n强奸犯们\t65879\n85嗯\t65880\n零售店\t65881\nhello喽喽喽喽喽啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦来来来来来来了\t65882\n764825445548454\t65883\n很相信\t65884\nhgihh\t65885\n淋雨\t65886\n平常心\t65887\n山静远\t65888\n说信不信\t65889\n88898991\t65890\n圭饭\t65891\nshose\t65892\ngfdtugxj\t65893\n惊感冒\t65894\n陈昌川\t65895\nlir\t65896\n去中心化\t65897\n力主造\t65898\n主人公们\t65899\n文邹邹\t65900\n嗯宝宝\t65901\n大廓\t65902\n屌哥\t65903\n乳腺癌\t65904\n五年级\t65905\n咫尺\t65906\n张石\t65907\n草本\t65908\n手捧\t65909\nJnnnnmoopoloooooo\t65910\n胡家桥\t65911\n老爷\t65912\n一致性\t65913\n老爸\t65914\n18765812785\t65915\neeq\t65916\n媒婆\t65917\n塞图\t65918\neeh\t65919\ncnmsm\t65920\n決腳\t65921\neee\t65922\n婊砸\t65923\nxrbz\t65924\n睡不着我\t65925\n一去无回\t65926\n绸缪\t65927\n九百九十三分钟\t65928\n抗洪\t65929\n185362\t65930\n德克萨斯州\t65931\n产区\t65932\n看掉\t65933\n一码\t65934\n乖哒\t65935\n花田\t65936\n4月26号\t65937\n花甲\t65938\n势必\t65939\nB1层\t65940\n乖哈\t65941\n诺维斯基\t65942\n再吐\t65943\n中超16队\t65944\n截棍\t65945\n乖哀\t65946\n腿脚\t65947\n士村\t65948\n再听\t65949\n归根结底\t65950\n就是你好乖\t65951\n闽东\t65952\nAMD\t65953\n一砣\t65954\n天元指四柱\t65955\n武向荣\t65956\n喔k4\t65957\n金钟云那区\t65958\nzkykwu\t65959\n迪士尼乐园\t65960\n再吵\t65961\n常凯山\t65962\n花生\t65963\n一小把\t65964\nataldtat\t65965\n王晨倩\t65966\n贺梓谦\t65967\n赵国栋\t65968\n近来一个月\t65969\n到所\t65970\n发份c\t65971\navjizz\t65972\n秦佳怡\t65973\n南陵\t65974\n这样的日子\t65975\n家行\t65976\n到手\t65977\njszzsz\t65978\n准妈妈\t65979\nCDEFG\t65980\n峰哥\t65981\nkhkhj\t65982\n调输\t65983\n谭妹妹\t65984\n新嘉\t65985\n卓玛\t65986\n歌者\t65987\n邓珊珊\t65988\n各题\t65989\n亲爱的王小小心心\t65990\n奥突\t65991\n精灵\t65992\n三百字\t65993\n三多\t65994\n令堂\t65995\n年费\t65996\n胜火\t65997\n一哟一\t65998\n真的不骗我真的不骗我真的不骗我\t65999\n针扎\t66000\nl111y燈\t66001\nalpha\t66002\n10点5分\t66003\n楼上楼下\t66004\n人力\t66005\n我喜欢一世\t66006\n三处\t66007\n5分之巳c那么爱最小对还是错\t66008\n表情包\t66009\n贺年片\t66010\n宋茜昌\t66011\n一蚊\t66012\n传统节\t66013\n拉克\t66014\n依法本\t66015\n588288\t66016\n三头\t66017\nQQ卡佛\t66018\n共喝\t66019\n论点\t66020\n业内人称\t66021\n乙袋\t66022\n五比三比二角\t66023\n唉暂时\t6602
4\nIsmmiss\t66025\n爱天天\t66026\n星衅\t66027\n不育症\t66028\n牙牙牙牙\t66029\n第一顺位\t66030\n朝阳镇\t66031\n胡一菲\t66032\n颠覆性\t66033\n染发剂\t66034\n18代\t66035\n好困好\t66036\n998度\t66037\nhair\t66038\n#锤基小剧场#\t66039\n644555434845\t66040\n微勉\t66041\n多达\t66042\n集散地\t66043\nwodgg\t66044\n百川东\t66045\n嗯铠甲\t66046\njejje\t66047\n令人瞩目\t66048\n362425200401170027\t66049\n原理\t66050\nsvyfwffg\t66051\n啵啵哒\t66052\n木作\t66053\n朝朝\t66054\nCns\t66055\nTx\t66056\n034567\t66057\nTv\t66058\n王珞丹\t66059\nTq\t66060\nTo\t66061\nTm\t66062\n怪肉麻\t66063\n偶姐姐\t66064\nTh\t66065\nCnn\t66066\n小户\t66067\nTa\t66068\n张月馨\t66069\n社会调查\t66070\nThunderbolt\t66071\nTX\t66072\nTY\t66073\nTV\t66074\nTW\t66075\nTT\t66076\nTU\t66077\nTP\t66078\n开女台\t66079\nTM\t66080\n特忙\t66081\n嫣然\t66082\n小战\t66083\nTF\t66084\nhddaadmgkpmgthtgjgdhegd1mgwidg6d9hmwg65\t66085\nTD\t66086\n鼎力\t66087\n田连\t66088\nTC\t66089\n小成\t66090\n小我\t66091\n兑换\t66092\n复活节岛\t66093\n夏彬铎\t66094\n鱼洞\t66095\n撸精\t66096\n的丑八怪手法乖丫头百块\t66097\n新欢\t66098\nT2\t66099\nT3\t66100\n45649451\t66101\nT1\t66102\n心灵感应\t66103\n逗说\t66104\n妇联赛\t66105\n148524562766\t66106\n新款\t66107\n蒙元\t66108\n豺狼女貌\t66109\n6.2级\t66110\n声光\t66111\n兔斯基\t66112\n神灵\t66113\n不敢相信\t66114\ne\t66115\nsvi\t66116\n逛星\t66117\n窗边\t66118\n我没说等等哪敢好不好你别个说的你都\t66119\nnzjsuzb\t66120\n彭智婷\t66121\n粉碎性骨折\t66122\nsvx\t66123\n宝贝儿我爱你爱着你\t66124\nsvt\t66125\n神灯\t66126\nsvs\t66127\n神火\t66128\n瘆\t66129\n一起冲突\t66130\n孕后\t66131\n瘁\t66132\n六3班\t66133\nWK波特兰\t66134\n张包围\t66135\n里程\t66136\n想见你\t66137\n瘋\t66138\n深居简出\t66139\n淫靡\t66140\n食住\t66141\n科恩兄弟\t66142\n瘙\t66143\n瘤\t66144\n只老\t66145\n瘦\t66146\n塘西\t66147\n9顶\t66148\n猪呢猪猪猪猪猪猪\t66149\n瘪\t66150\n瘫\t66151\n高热\t66152\n说我帅我讨厌\t66153\n角度米\t66154\n下甘拜\t66155\n闫侣欣\t66156\n学见\t66157\n高烧\t66158\n瘸\t66159\n碎屏\t66160\n清晨六点\t66161\n賠\t66162\n亚卖碟\t66163\n马术\t66164\n运单\t66165\n大扎\t66166\n聊峰峰\t66167\n7十\t66168\nwave\t66169\n度秘敦力\t66170\n钟林\t66171\n呼呼u\t66172\nappstore\t66173\n蔡大伟\t66174\nidjennew\t66175\n太太宗\t66176\n辛克莱尔\t66177\nhibot\t66178\n蝙蝠侠\t66179\n马月\t66180\n女侠\t66181\n天天睡觉吧\t66182\n鐘後起來説\t6618
3\n呜呜舞\t66184\n大官们\t66185\n賎\t66186\n累累累\t66187\n病假\t66188\n别的话\t66189\n陈毅博\t66190\n艾玟\t66191\n摩托罗交口\t66192\n先先先先先\t66193\n室寝\t66194\n张帅哥\t66195\n李亚楠\t66196\n天冿\t66197\n力力\t66198\n形态化\t66199\n10几万\t66200\n翅膀\t66201\nCK虫\t66202\n迹\t66203\n最值钱\t66204\n公主坟\t66205\n黁\t66206\n一条花\t66207\n年迈\t66208\no旅游族\t66209\n大蛇的你死了最好我不爱你\t66210\n池淑瑶\t66211\nTella\t66212\n相信\t66213\n色郎\t66214\n退款\t66215\n杨新宇\t66216\n异形前传\t66217\n假面骑士\t66218\n菇类\t66219\n国营\t66220\n插入\t66221\n红会\t66222\n上装\t66223\n赵艳润\t66224\n会务\t66225\n酒瘾\t66226\n名录\t66227\n刹车盘\t66228\n周天浩\t66229\n两三百一只\t66230\n07年春节\t66231\n我是女的你是男还是女\t66232\n4756768960\t66233\n敖翔宇\t66234\n把手\t66235\n布莱克\t66236\nbecom\t66237\n秘你好漂亮\t66238\n长春\t66239\n小橙\t66240\n振动\t66241\n莉亚\t66242\n直角\t66243\n刘轩岩\t66244\n我的秘密神\t66245\n一你花大与号小与号\t66246\n直视\t66247\n直观\t66248\n剑侠达\t66249\n直觉\t66250\n张战军\t66251\n达利园\t66252\n钟知利\t66253\n青铜兰\t66254\n另一种\t66255\n朱了一\t66256\nDoctor\t66257\n不可抗拒\t66258\n金东硕\t66259\n爱殇\t66260\n周易熊猫\t66261\n鼓楼社区十九号\t66262\nwwwwxxx\t66263\n两不误\t66264\n9折\t66265\n直径弦abBC4根号2\t66266\n默\t66267\n哎呀度秘你说话的声音\t66268\n13MKH\t66269\n齐全\t66270\n猴姆\t66271\n小孩子\t66272\n大显身手\t66273\ncoldplay\t66274\n1583963608\t66275\n超声波\t66276\nａｐｐ\t66277\n不没有\t66278\n3亿个\t66279\napppatis\t66280\n今天下午三点半\t66281\n泪如雨下\t66282\n大有们\t66283\nzhizhang\t66284\ncharming\t66285\n好妈妈猫眼\t66286\n枰座\t66287\nmacbookair\t66288\n渔姑\t66289\n去年5月11日起\t66290\n东山村口\t66291\n松悟\t66292\n深开心\t66293\n12344567\t66294\n水产\t66295\n导出\t66296\n恨你好\t66297\n100吨\t66298\n18393973989\t66299\ndeshijise\t66300\n再见了我穿越时光\t66301\n佳通\t66302\n油腔滑舌\t66303\nhappy小樱桃\t66304\n曹什么艳我讨厌你\t66305\nXX00\t66306\n60小时\t66307\nQQ2745501648\t66308\n音节\t66309\n8kw40t\t66310\n羊羊利亚\t66311\n百地\t66312\n摸摸摸摸摸摸摸摸摸\t66313\n5、6个小时\t66314\n折冈\t66315\n小度秘呀小度秘小度秘呀小度秘我爱你我爱你\t66316\n蓝盆友\t66317\n这周六\t66318\n星期六星期五\t66319\n山﨑贤人\t66320\n李冰唅\t66321\n什末\t66322\nkicgd\t66323\n0.4秒\t66324\n品質\t66325\n莱次狗\t66326\n麦块\t66327\n呀幸好\t66328\n适tc\t66329\n武夷西部\t66330\n佳木斯\t66331\n小飞行棋\t66332\n早早早\t66333\n摇钱树\t66334\n厌恶\t66335\n被删除\t66336\n
暖心人\t66337\n天平女\t66338\nnohe\t66339\n方格\t66340\n折冲\t66341\n杯赛\t66342\n69栋\t66343\n主旨\t66344\nhiQe\t66345\n咔六\t66346\n六升\t66347\n绰绰有余\t66348\n六十\t66349\n六千\t66350\n书法\t66351\n早男\t66352\n拜拜谷\t66353\n转随\t66354\n有谱\t66355\n幺零零幺零四幺\t66356\nJapan\t66357\n前所未见\t66358\n散弹枪\t66359\n吧秘秘\t66360\n第十六季\t66361\n不住你\t66362\n猫九\t66363\n掩埋\t66364\n10套\t66365\n搞紧\t66366\n搞索\t66367\n给给给给给给你\t66368\n刘星\t66369\n爱龙龙\t66370\n歡歡\t66371\n早生\t66372\n苏杨迪\t66373\n试比\t66374\n曾超\t66375\n家吾\t66376\n1600千克\t66377\ngior\t66378\n袁世凯\t66379\n白开水\t66380\n可恶信\t66381\n惊坐\t66382\ngxerihfecydxwbhxbiufenycrghucfrhguefh2gfxeobfxrljeduxfrp\t66383\n纵声\t66384\n巴士\t66385\n42元\t66386\nFGY\t66387\n/寂静　欢喜\t66388\n摔落\t66389\n情况\t66390\nkihud\t66391\n梁小怡\t66392\n一拉线\t66393\n搂着\t66394\n我想你我想你超级飞天\t66395\nFGH\t66396\n漫酃\t66397\n乡土\t66398\n昏天黑地\t66399\njhfh\t66400\n老呦\t66401\n老呢\t66402\n于翰琦\t66403\nufzhko\t66404\n余玉玲\t66405\n田十\t66406\njhfb\t66407\njhfc\t66408\njhfd\t66409\n修闲\t66410\njhff\t66411\n老周\t66412\nstour\t66413\n修问\t66414\n100000000000000081912334566789\t66415\n老味\t66416\n福彩\t66417\n柔顺\t66418\n杨街\t66419\n迩\t66420\n附近\t66421\nfffdxxxxxxxxxxxxxxxxxxxxxxdxxxxccccccvvvvxdxxxxccccccvvvvxdxxxxccccccvvvvxxxxxxxxfffc\t66422\n砖状\t66423\n美沙拉\t66424\n武原来\t66425\n64635\t66426\nGgbi\t66427\n她的手\t66428\n卡洛牌\t66429\n八十斤\t66430\n马长艳\t66431\n初十五一\t66432\n莫嘉怡\t66433\n青鱼\t66434\n乔大神\t66435\n玛米力\t66436\n会儿天\t66437\n女生性\t66438\n淘姿秀\t66439\nCK炫\t66440\n时落\t66441\nchoc\t66442\n朱兰珍\t66443\n籍浩宇\t66444\n病魔\t66445\n粮食\t66446\n8252535354\t66447\n坚固\t66448\n吴新娣\t66449\n高压电\t66450\n田馥\t66451\n七七三三五二\t66452\n说平\t66453\n龙左\t66454\n坏朋友\t66455\nvfct\t66456\n浙江省人民医院\t66457\n88vvD\t66458\n主人公子们\t66459\n1850645454215206306928\t66460\nhkt01\t66461\n迎客\t66462\n哦熊大熊二光头强还有谁和你\t66463\n柴伊涵\t66464\n俞灏明\t66465\n17页\t66466\n瘦身\t66467\n你是我的秘书书\t66468\n张相片儿\t66469\n尽职\t66470\n沈海高速公路\t66471\n低艾\t66472\n战超鳄\t66473\nFlinchbalding\t66474\n铁棍\t66475\n疏议\t66476\n8.27万辆\t66477\n姚钱树\t66478\n每周5\t66479\n二头\t66480\n会不会计\t66481\n龙岗度秘\t66482\n花样姐姐\t66483\n不蔸\t66484\n桃
儿\t66485\n499110\t66486\n铁棒\t66487\n南方\t66488\n有违\t66489\n信否\t66490\nuufuudbffjow\t66491\n张天\t66492\n蔡奇\t66493\n仓鼠布丁\t66494\nZAR\t66495\n张大\t66496\n挂单\t66497\ngagggg\t66498\n大丑狗\t66499\n应负\t66500\n想去\t66501\nnvsg\t66502\n张头\t66503\nnvsh\t66504\ntdgjk\t66505\nwwwwwwwhhhhhhhhyyyyyyyy\t66506\n兴味\t66507\n开发区\t66508\nguangming\t66509\n血迁\t66510\n洛歌\t66511\n朴元金\t66512\nckrnakyv\t66513\n惩处\t66514\n哈了不够阿尔塔懒蛋\t66515\n慈善\t66516\ngories\t66517\n小麻雀\t66518\n罩杯\t66519\n非礼\t66520\n呃嗯\t66521\n女腿\t66522\n建筑工人\t66523\n下始\t66524\nculalaa\t66525\n悍然决定\t66526\n金坛市\t66527\n铜牌\t66528\n米孝\t66529\n巴扎\t66530\nsosN\t66531\n点敲门\t66532\n時\t66533\n凤凰传奇\t66534\n陌辰\t66535\n晁\t66536\n偷看看\t66537\n啦啦乱\t66538\n晋\t66539\n回滚\t66540\nQQ业业\t66541\n晒\t66542\n晓\t66543\n晖\t66544\n晗\t66545\n契合\t66546\n晕\t66547\n晚\t66548\nh图\t66549\n写作业\t66550\n小豆面\t66551\n隆冬\t66552\n633万美元\t66553\n下个星期一下午\t66554\n晤\t66555\n打油\t66556\n晨\t66557\n晩\t66558\n普\t66559\n景\t66560\n刘克功\t66561\n晰\t66562\n晶\t66563\n晷\t66564\n晴\t66565\n智\t66566\n示范个\t66567\n晾\t66568\nxxgkfmmmn\t66569\n九折\t66570\n施压度秘\t66571\n万家\t66572\n韩愈\t66573\n潋滟\t66574\n全无\t66575\n李健创\t66576\n难得死嘎电视所得死呢电视噶\t66577\n李宇舂\t66578\n倦舟\t66579\n1126家\t66580\n万宝\t66581\n批萨\t66582\n十二八十\t66583\n万宁\t66584\n参与\t66585\n巴哥犬\t66586\n多一百个\t66587\n万安\t66588\n945965945\t66589\n惊失色\t66590\n指正\t66591\n小皓轩\t66592\n柔骨\t66593\n属于你的人生\t66594\n金刚侠\t66595\ncokc\t66596\n繁密\t66597\n刘楚恬\t66598\n阿四\t66599\n瓦兰\t66600\n退居\t66601\n进会\t66602\nfjk\t66603\nfjj\t66604\nfjh\t66605\nfjn\t66606\nfjc\t66607\n干咳\t66608\n说和你\t66609\nfjd\t66610\n小大姐\t66611\n阿图\t66612\nfjx\t66613\n奋战奋\t66614\nfjr\t66615\n米小夏\t66616\n一个亲一个\t66617\n日数\t66618\n俩家\t66619\n提肯\t66620\n爱的风雪我背你\t66621\n正香\t66622\n啵啵窝乖\t66623\n长城长城\t66624\n写文\t66625\n四段\t66626\n苗圃\t66627\n爱民如子秋毫无\t66628\n一件儿\t66629\n肿膜\t66630\nvario\t66631\n啊11\t66632\n白玫瑰黑玫瑰\t66633\n刨冰\t66634\n睿睿哥\t66635\n农产品\t66636\n嘿客\t66637\n为你呢\t66638\n普卡基湖\t66639\n西饼\t66640\n故宫殿\t66641\n威神魔\t66642\n中坚\t66643\n借物喻人\t66644\n水镜头\t66645\n万能\t66646\n凯棒\t66647\n有何有\t66648\n用片\t66649\n宝
贝宝贝宝贝宝贝宝贝宝贝不能那你扭扭捏捏\t66650\n复印纸\t66651\n无趣\t66652\n深深岁\t66653\nsjdk\t66654\nq39\t66655\n孙永健\t66656\n2011年5月6日\t66657\n内存条\t66658\n家乐宝\t66659\n天玩\t66660\n瞎话\t66661\n小楷\t66662\n丁姐\t66663\n暖热\t66664\n身家\t66665\n邵建淼\t66666\n翻翻看\t66667\njhdud\t66668\nnonononooaa\t66669\n中关村百代健身明天关门\t66670\n5682\t66671\n1977年\t66672\ndedian\t66673\n英格雷氏\t66674\n^╰\t66675\n沐沐\t66676\n301班\t66677\n基会\t66678\nBrymax\t66679\n了唱\t66680\n也想爱\t66681\n喔喔喔喔喔喔喔喔喔喔喔喔喔喔\t66682\n6901028075015\t66683\n禁飞\t66684\n好想太阳\t66685\n妻妹\t66686\nmyout\t66687\n七十八十九二十\t66688\n我的狗道个歉\t66689\n凭嘛\t66690\n肉末豆腐\t66691\n势力眼\t66692\n不能说真话\t66693\n广东大学\t66694\n小说性\t66695\n必悔\t66696\n辛苦勒\t66697\n阿米汗\t66698\n9087052\t66699\nhttpfhiphotosbaiducomxiaodupicitem71cf3bc79f3df8dc44859ce5ca11728b4710280djpg\t66700\n羽联纪律委员会\t66701\n宋江\t66702\n甘改成\t66703\n叮当叮叮当叮儿响叮当\t66704\n婆娑罗\t66705\ntent\t66706\n脸黑\t66707\n柜子\t66708\n记念日\t66709\n难过\t66710\n呢称\t66711\n截屏器\t66712\n今天之内\t66713\n世家\t66714\n你在我是我的好宝贝\t66715\n道长\t66716\nmerry\t66717\nhII\t66718\nmaga\t66719\n层次分明\t66720\n鬼人不人\t66721\n漢化\t66722\n寒来\t66723\nVerizon\t66724\n星动\t66725\n混凝土\t66726\n阿紫\t66727\n度秘你吃神魔东西\t66728\n孙泽慧\t66729\n布加勒斯特\t66730\n星势\t66731\n物理学奖\t66732\n甲级\t66733\n难追\t66734\n和林\t66735\n冲泡\t66736\n52路\t66737\n摸鱼\t66738\n6月15号\t66739\n刘尹媛\t66740\n铃儿\t66741\n腦筋急轉彎\t66742\n嗯迪士尼乐园\t66743\n蛋鸡\t66744\n牛为你好\t66745\n忠烈\t66746\n邢讨厌\t66747\n撲街\t66748\n好绝对\t66749\n邦迪大厦c座\t66750\n钟雪飞\t66751\n澜沧\t66752\n今天中午12点\t66753\necoblue\t66754\n治标\t66755\n西米\t66756\n向黄\t66757\n并列\t66758\n望子成龙望女成凤\t66759\nIdbjdi\t66760\n打手\t66761\n唐错\t66762\n吴艳\t66763\n倒把判刑\t66764\n胯下\t66765\n123456789n0\t66766\n两三四五六个\t66767\ntdgghu\t66768\n12345678910789456123\t66769\n九五五零\t66770\n喵木\t66771\n张荣涛\t66772\nnot科\t66773\n35888\t66774\n要点\t66775\n安全裤\t66776\n一一杯\t66777\n史铁生\t66778\nrhbxfbbjgf\t66779\n两三四五六七\t66780\n银耳\t66781\njaodiaose\t66782\nCASINO\t66783\n念了我讨厌\t66784\n声波\t66785\n拜拜亲\t66786\nliao天\t66787\n翻牌\t66788\n杨茜钰\t66789\n85千米\t66790\n小沁\t66791\n颜氏\t66792\n小沃\t66793\n聊天儿行\t66794\n43814384381\t66795\n小沈\t
66796\nfjtgst\t66797\n当选\t66798\n预告\t66799\n飞卢\t66800\n健儿\t66801\n小沐\t66802\n粉样\t66803\n恒通\t66804\n当真\t66805\n丹霞山\t66806\n中兴U900\t66807\n策划部\t66808\n加人\t66809\n飞华\t66810\n醺霆\t66811\n路易威登\t66812\njdndn\t66813\n美国高盛公司\t66814\n飞升\t66815\n小沫\t66816\n自治\t66817\n张双姐\t66818\n加五\t66819\nInterview\t66820\n小河\t66821\n胰腺癌\t66822\n博睿传播\t66823\n特典\t66824\n败仗\t66825\n苦读\t66826\n青山梦\t66827\n信口开河\t66828\n海索\t66829\n陈静\t66830\n温州市\t66831\n黄宗泽\t66832\n反复\t66833\n燕姿\t66834\n跟哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t66835\n四系\t66836\n白送\t66837\n1997个\t66838\n冯雨婷\t66839\n无卤\t66840\n一千五百分钟\t66841\n骗子们\t66842\n第第\t66843\n往好\t66844\n痰火\t66845\n中百仓储\t66846\n无卖\t66847\n14588650096800\t66848\n血型钢\t66849\n卢先\t66850\n报收\t66851\n无华\t66852\n红塔\t66853\nttre\t66854\n报放\t66855\n燕姐\t66856\n1250611768\t66857\n明媚\t66858\nx20100\t66859\n热射病\t66860\n贝贝\t66861\n名门\t66862\n伍苓\t66863\n张学华\t66864\n依米千\t66865\n你是我的众爱卿\t66866\n陈一鑫\t66867\n火车经云\t66868\n真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑真丑\t66869\n网购\t66870\n甜蜜素\t66871\n霉素\t66872\n北斗星\t66873\n北京妥成联通集团\t66874\n榜首\t66875\n嘞米\t66876\n李溪安市\t66877\n我是天我是天平座\t66878\n普埃尔\t66879\n健身会所\t66880\n三官殿\t66881\n交欢\t66882\nr8tugugt7t8y8y8yyy\t66883\n言表\t66884\nki7k7k\t66885\n俺着\t66886\n帅锅们\t66887\n杀戮\t66888\n郑婷\t66889\n一孩儿米\t66890\n王一宿\t66891\n哦不个省区\t66892\n呜呜好伤心窝不爱你\t66893\n原始人\t66894\n五六分钟\t66895\n托仔兔\t66896\nkeskx\t66897\n油漆工司服\t66898\nleiting\t66899\n落叶\t66900\n法務部\t66901\n日月星辰\t66902\n更上一层楼\t66903\nhcvvu\t66904\n一道一\t66905\n苏州医科学校\t66906\n缸盖\t66907\n小炒\t66908\n咏月\t66909\n邸為邶\t66910\n黄青霞\t66911\n演习\t66912\n栋豪栋豪\t66913\n5.4％\t66914\n这个度\t66915\ndrddtddtd\t66916\nEEE\t66917\n给我讨厌的人和讨厌你我讨厌你我讨厌你度秘我讨厌你我度秘我讨厌你我度秘讨厌你度秘我讨厌你度秘我讨厌\t66918\n吸干\t66919\n猪祖宗猪闹钟脑四中\t66920\n7.8\t66921\n演义\t66922\n董倩元\t66923\n離開\t66924\nwende\t66925\n在天下在\t66926\n小点\t66927\n天天酷跑梨\t66928\n大娅娅\t66929\n好啊嘛时候车厅矣監事逗十五班干吗了一赶诞\t66930\n膀子\t66931\n海海海\t66932\n实习\t66933\nhyygh\t66934\n刘会儿\t66935\n小炯\t66936\n小炮\t66937\n一千百万\t66938\nnilwulicikw\t66939\n玫瑰\t66940\n强颜欢笑\t66941\n佛曰\t66942\nhiname\t66943\n锅内\t66944\n仨色\t66945\n陆柳\t66946
\n高冷女王范儿\t66947\n验尸\t66948\n佳能照像机\t66949\n我的我的阿七阿七家阿七我姐姐我给我寄一窝七窝七窝七窝\t66950\n我爱你爱你爱你爱你爱你\t66951\n骏马\t66952\n乌纳斯\t66953\nc挛\t66954\n5080\t66955\n5087\t66956\n38538\t66957\nDD33\t66958\n生茧\t66959\n好度\t66960\n管理学院\t66961\nKDKKSK\t66962\n单个\t66963\n259248\t66964\nTfyf\t66965\n牛里脊\t66966\n违犯\t66967\n张到哪里了呀还在电器量\t66968\n收摄\t66969\n朝圣\t66970\n陈雪涛\t66971\n经销份\t66972\n天窗\t66973\nuuuuuuuuuuuuuuuuuuuuuuuuuu\t66974\n正事吧\t66975\n阿隆索\t66976\n5258mm\t66977\n赵个娃\t66978\n欧狗\t66979\n大圣樱之安大厦\t66980\nOKOKOKOKOK\t66981\n肋巴扇干骨折\t66982\n1555535\t66983\n心爱人\t66984\n单一\t66985\n富东\t66986\n谷景生\t66987\n竟遭\t66988\n凸凸凸凸凸凸凸凸凸凸凸凸凸凸凸凸凸\t66989\n黄龙溪\t66990\n乌克上\t66991\n一点要不了\t66992\n捍卫者\t66993\n嗯嗯不要脸你不要脸\t66994\n周道\t66995\n仆人\t66996\n索尔德可乐鸡翅\t66997\n老天后\t66998\n8米\t66999\nnidenerduolongle\t67000\n坠亡\t67001\n猫cc\t67002\n根骨\t67003\n蜜芽\t67004\n杜鹃\t67005\n呵气\t67006\n该院\t67007\n我想要你真心的对我\t67008\nn1111\t67009\nxxly\t67010\n杀异\t67011\n五万5万\t67012\n杜鹏\t67013\nkuaidian\t67014\n哈哈恩\t67015\n现代版\t67016\nqqwweerrttyyuuiiooppaassddffgghhjjkkllzzxxccvvbbnmmm\t67017\n现代片\t67018\n炒法\t67019\n八骏马琴\t67020\n好啦好啦好啦好啦\t67021\n瘸腿\t67022\n阿乐妮\t67023\n对手\t67024\n过去一年\t67025\n242404\t67026\n脸肿\t67027\nGvf\t67028\nGvd\t67029\n脸肥\t67030\n绿帽\t67031\n至死不渝\t67032\n化妆棉\t67033\n六十架\t67034\n粗心大意\t67035\n对打\t67036\nPush\t67037\n乳液\t67038\n三十年后\t67039\n深受\t67040\n原则上\t67041\n我要你的咪咪我的丁丁\t67042\n五八年\t67043\n太太太太太太太\t67044\n速八\t67045\n快乐快高长大\t67046\n全民运动会\t67047\n咯嘛度秘\t67048\n草莓果冻\t67049\n活体验证\t67050\nstar\t67051\nstas\t67052\n侠岚\t67053\n印字\t67054\n四名\t67055\nGfbfb\t67056\n津津乐道\t67057\nstay\t67058\n吃不动\t67059\n普莱斯\t67060\n6454645\t67061\nstan\t67062\nstal\t67063\n说了行\t67064\nxjxjjxbxbxbxcb\t67065\n這样\t67066\n那个月色调动作为您好的你好的我的爱心\t67067\n得奬\t67068\n田艳军\t67069\n再来块\t67070\n四听\t67071\n有点神\t67072\n好香\t67073\n四吨\t67074\n疑案\t67075\njjjjmdd\t67076\n太地道\t67077\n110629\t67078\n泛洋\t67079\n王刚强\t67080\nhezit\t67081\n妈嘟\t67082\n标新立异\t67083\n尿糖\t67084\n喵喵喵喵喵喵喵喵\t67085\n球友\t67086\n脑浆\t67087\n电话人\t67088\ncce2ap\t67089\n多遍\t67090\n戛纳\t67091\n九六式\t67092\n家妈\t
67093\n令令\t67094\n乖老公\t67095\n技术流\t67096\n无地自容\t67097\n月2\t67098\n脐环\t67099\n家妻\t67100\n白日梦想家\t67101\n外出厂\t67102\n凸凸凸凸凸凸凸凸凸凸凸凸凸凸凸\t67103\nndkoss\t67104\n球台\t67105\n爬爬爬爬爬爬\t67106\n汤姆·克鲁斯\t67107\n首父亲\t67108\n微鬼\t67109\n趁手\t67110\n脑海\t67111\n出色\t67112\n花花\t67113\n性反面\t67114\n五六岁\t67115\n炒房房\t67116\n做多情\t67117\n何昊宇\t67118\n2102113524\t67119\nxhkf\t67120\n20202222221314\t67121\n早上3点\t67122\n单脚\t67123\n2011年以来\t67124\n陡壁\t67125\nHkhbjhhfg\t67126\n谢呵呵哒\t67127\n女孩子们\t67128\n官方语言\t67129\n6500万股\t67130\n碟菜\t67131\n胡搞\t67132\n停下等\t67133\n5855\t67134\n郭剑\t67135\n格果\t67136\n朱佳梦\t67137\nfrrg\t67138\n超脱\t67139\n有你在我这么说\t67140\n袁花\t67141\n死恰把\t67142\n装好不好小心\t67143\n闲寂\t67144\n我也不吧\t67145\n闹了再见\t67146\n上厕所有\t67147\n九十斤\t67148\n1138\t67149\n要不你走\t67150\n美搭\t67151\n正手\t67152\n他们想要*我\t67153\nD牙\t67154\n想梦\t67155\n20余名\t67156\nmyfamaliy\t67157\n小喽啰\t67158\n吼吼吼吼\t67159\n吃好了吧\t67160\n陈娇\t67161\n陈娟\t67162\nhffhgg\t67163\n答疑\t67164\n樟木头\t67165\n翻过来\t67166\n西欧\t67167\n我一个你\t67168\n赵琦琪\t67169\n李志琦\t67170\n塑化剂\t67171\n杰瑞股份\t67172\n别说了吧\t67173\njgajaka\t67174\n化龙\t67175\n3月26号\t67176\n羞羞女\t67177\n朱子桉\t67178\n接得\t67179\nchiphotosbaiducomxiaodupicitemb812c8fcc3cec3fd86b4d801d188d43f8794273fjpg\t67180\n居心叵测\t67181\n臭屁\t67182\nghjkl\t67183\n六零\t67184\n国际牛奶日\t67185\n弱者\t67186\n顾虑重重\t67187\n吸取教训\t67188\n接待\t67189\n伊斯兰\t67190\n庄宗耀\t67191\n见一\t67192\n煞笔吗英吁虾饼\t67193\n二八三九四二十三十四十五十六十七十八十九四十一十二\t67194\n死装\t67195\n940多次\t67196\n追爱\t67197\n鉴赏力\t67198\n大总攻\t67199\n日照市\t67200\n和聊\t67201\n快说你爱不爱我\t67202\n活火山\t67203\ndatyhf\t67204\n朴妮\t67205\n清楚\t67206\n在家等\t67207\ngvoh\t67208\n红马\t67209\n344\t67210\n陈花瓣\t67211\n罗温\t67212\nsswos\t67213\n专业场\t67214\n惊雷处\t67215\n微英共青\t67216\n幺二零幺零六九八二零五幺八零零三幺\t67217\n换水\t67218\n347\t67219\n青狼\t67220\n十八十八\t67221\n牛鬼\t67222\n马康\t67223\n来眼去\t67224\n别贫\t67225\n漂漂\t67226\n烤饼\t67227\n千年后\t67228\n好的杀毒咪\t67229\n一千千克\t67230\n张亚双\t67231\n对眼\t67232\n汗珠\t67233\ntifufe\t67234\n赠到\t67235\n端端\t67236\nxhb\t67237\nxhf\t67238\nxhd\t67239\nxhk\t67240\n348\t67241\n78449\t67242\n玩遍\t67243\n长跪\t67244
\nxhl\t67245\nxhs\t67246\nxhr\t67247\n小盆友\t67248\n涂抹\t67249\n很假\t67250\nxht\t67251\n甄嬛体\t67252\nxhy\t67253\nxhx\t67254\n秘密鸟\t67255\n落入歧途\t67256\n2x10\t67257\n敏旭\t67258\n34E\t67259\n抄词\t67260\nvvvvvbvvv\t67261\n好准好准再见\t67262\njjbncbbx\t67263\n舒贵人\t67264\n额式\t67265\n村中\t67266\n被打破\t67267\n咔咔咔咔咔hot\t67268\n小八嘎\t67269\n弄开\t67270\n超市\t67271\n啦啦熊\t67272\n感叹词\t67273\n播客\t67274\n昆明\t67275\n聊会天行\t67276\n五成\t67277\n幸福快感\t67278\n复选框\t67279\n睁大你\t67280\n闲钱\t67281\n祛痘\t67282\n村上\t67283\nfrrde\t67284\n张子鸾\t67285\n吃了饭回\t67286\n68w\t67287\n瘌痢头\t67288\n亲密关系\t67289\n反话\t67290\n金隅嘉业\t67291\n市\t67292\n布\t67293\n币\t67294\n陶湾\t67295\n希\t67296\n师\t67297\n帖\t67298\n麦肯基吃麦肯锡\t67299\n帕\t67300\nMUSIC\t67301\n飒飒\t67302\n帜\t67303\n帝\t67304\n芳菲\t67305\n帘\t67306\n带\t67307\n帧\t67308\n帥\t67309\n真的假不了假的真不\t67310\n帮\t67311\n曾蓉萱\t67312\n席\t67313\n谢海校\t67314\nhuufjg\t67315\n帶\t67316\n猪理\t67317\n马天浩泰\t67318\n帰\t67319\n颜然组\t67320\n684\t67321\n帽\t67322\n681\t67323\n一帆风顺\t67324\n常\t67325\n第四大\t67326\n鬼鬼\t67327\n了当\t67328\n日毛\t67329\n献唔\t67330\n第四天\t67331\n这年头\t67332\n慢条斯理\t67333\n李国防\t67334\n陪干\t67335\n李梦然\t67336\n呵呵基金会\t67337\n92分\t67338\n齐达内\t67339\n尽力而为\t67340\n封微情书吧\t67341\n口饭\t67342\n葫芦岛\t67343\n500GB\t67344\n1500岁\t67345\nJOD\t67346\n12427283638\t67347\nkefuserver\t67348\nJOK\t67349\n攻心倒地\t67350\n盘锦巿\t67351\n秘摸\t67352\n孙莉姐\t67353\n一五号\t67354\n石油集团\t67355\n不要你了臭机器人\t67356\n蜀蒏\t67357\n155456455554\t67358\n戚继光\t67359\n十七分之1650\t67360\np1435\t67361\n大公司\t67362\n么西么西呼啦希和故意\t67363\n哼哼度\t67364\n一五发\t67365\nzwl\t67366\n金浩\t67367\n党参茶\t67368\n83.69%\t67369\nzwd\t67370\nghhin\t67371\n变态人妖\t67372\n装上传\t67373\nzwy\t67374\n苏欧弟\t67375\n遗爱\t67376\n大豆油\t67377\n吴小姐\t67378\nraftericlass\t67379\n新白发魔女传\t67380\nzws\t67381\n杨芳花\t67382\nzww\t67383\n非人道\t67384\nq1\t67385\n可辛双眼\t67386\n9点49分\t67387\n马一诺\t67388\n11月7日\t67389\n小器\t67390\n鱼场\t67391\n助长\t67392\n撞死\t67393\nSujiaren\t67394\n张琳紫\t67395\n78周\t67396\n再来一回\t67397\n毛毛雨\t67398\n胡巴可\t67399\n喂吻\t67400\ndfffvv\t67401\n约翰\t67402\n吕贺休\t67403\n要一不要钱\t67404\nwaaaaaa\t67405\n瘟猪
\t67406\n扣唱\t67407\n谈虎色变\t67408\n秘无所不能\t67409\n汽车配件\t67410\n奥沙利文\t67411\n估计图\t67412\n红烧肉\t67413\n该死\t67414\n钱艺星\t67415\nffre\t67416\n卢氏县\t67417\n大牛们\t67418\n混迷幻药\t67419\n机器人头\t67420\nhi你好哇我是谁\t67421\n坏蛋坏银\t67422\n这家伙\t67423\nWqgh\t67424\n咪浪\t67425\n多十斤\t67426\n依附\t67427\nyuvjg\t67428\n卢建波\t67429\n就打打\t67430\nwjs\t67431\n故佛\t67432\n小妞儿\t67433\n尊重感\t67434\n吃作业\t67435\nzgjlx\t67436\n休息\t67437\n了拜\t67438\njjx\t67439\njjv\t67440\njjs\t67441\n多红红的笑脸我的心的爱你的小了我的心\t67442\n聚资\t67443\njjp\t67444\njjn\t67445\njjm\t67446\ntuok\t67447\njjk\t67448\n喵喵喵喵喵喵喵喵喵\t67449\n无线电\t67450\ndttgjh\t67451\njjg\t67452\n行政机关\t67453\njjd\t67454\njjc\t67455\njjb\t67456\n梗死\t67457\n搬开\t67458\n同年\t67459\n中国外交部\t67460\n搬弄\t67461\n阿南南南南南瓜嘎嘎嘎嘎嘎嘎嘎嘎嘎\t67462\n名贵\t67463\n饭工体\t67464\n嘿凤梨\t67465\n百荷\t67466\n头山\t67467\n精气\t67468\n潮安\t67469\n窥\t67470\n二三七八五七七八\t67471\n色弱\t67472\nhfhxhfx\t67473\n不明示\t67474\n2005年10月10日\t67475\n冰场\t67476\n乐活\t67477\n中国人民大学\t67478\n封山\t67479\n冰圣\t67480\n散兵\t67481\n林方\t67482\n士官人\t67483\n斩头之日\t67484\n26372公顷\t67485\n放不下来\t67486\n气象\t67487\n十七八天\t67488\n应战\t67489\n你曾\t67490\n林斌\t67491\n阿晴\t67492\n瘾\t67493\n义乌市\t67494\n评理\t67495\n杰著\t67496\nThfbvh\t67497\nknights\t67498\n给我两聊聊\t67499\nshtsh\t67500\n关昕\t67501\npalack\t67502\n吴勇勇\t67503\n曹仁\t67504\n情方\t67505\n蛇夫个肉灵芝\t67506\n饥饿新款3\t67507\n屁屁屁屁屁屁股票\t67508\n進黨議員\t67509\n海肠\t67510\n基座\t67511\n孔明\t67512\n你在干什么你在干什么你在干什么你在\t67513\n需求量\t67514\n旅行袋\t67515\n三鹿\t67516\n1.17亿\t67517\n34天\t67518\n城建\t67519\nㄑ\t67520\n甘雪淇\t67521\n交谈\t67522\n剁尼\t67523\n不要不懂\t67524\n广寒\t67525\n圆柱\t67526\njjft\t67527\n审美\t67528\n川义工团\t67529\n看我多棒\t67530\n我的心里只有你没有他\t67531\n长得帅啵\t67532\ntoolska\t67533\n123456ang\t67534\n我是女的不爱汉子爱格格\t67535\n别忘了我这是明星新大你之苦孤零零的着我的心\t67536\n厦门市妇幼保健院\t67537\nCampsaid\t67538\n短腿\t67539\n避孕\t67540\na6a6a\t67541\n好我说话\t67542\n可乘之机\t67543\nVOVICI\t67544\n万里\t67545\ngvxgu\t67546\nMomsare\t67547\n觉名叫\t67548\n红太阳\t67549\n文曰\t67550\n麻辣皮\t67551\nOpponents\t67552\n341341341341341\t67553\nDcgx\t67554\n启示录\t67555\n民俗\t67556\n0点\t67557\n2599100000000002\
t67558\n勿扰\t67559\n大曦nnnnnnnx\t67560\ncous\t67561\nylfima\t67562\n怨声载道\t67563\n平邑\t67564\n算蚀\t67565\nDcgg\t67566\nShamoyan\t67567\n逐虎\t67568\n187次\t67569\n我了气\t67570\n吧头\t67571\n塞博坦\t67572\n固有一死\t67573\n万达小区\t67574\n好想xo\t67575\nbiubiu\t67576\n51个\t67577\n再见了梅毒我有我有点事\t67578\nmountfuji\t67579\n天爸爸\t67580\nkui\t67581\n5324场\t67582\n不奔\t67583\n姆博卡尼\t67584\n一三十三岁\t67585\n吧SB\t67586\n颖童\t67587\n过节用\t67588\n急燥\t67589\n冰激琉\t67590\n凯咪\t67591\n二逼媒体\t67592\nwiushd\t67593\n供应链\t67594\n险象\t67595\n不女\t67596\n黄姑娘\t67597\n肉点\t67598\n琳达\t67599\n电话会议\t67600\neare\t67601\n猪草\t67602\n车浩宇\t67603\n非一般\t67604\n不好\t67605\n3亿桶\t67606\n细皮\t67607\n忘忧河\t67608\n你不是我的谁我是你的谁你猜我是不是你\t67609\n邛海公园\t67610\n牛亚彬\t67611\n年产\t67612\n城门\t67613\n乐视网\t67614\n嘟嘟嘟你是我的小兜兜小猪猪\t67615\nujgfcxf\t67616\n沙漠越野赛车\t67617\n提哥\t67618\n天台县县\t67619\nyyeidjhd15264574646\t67620\n图样\t67621\n养死\t67622\n驾临\t67623\ndiseo\t67624\n久仰\t67625\n412828199510100390\t67626\n1334678890\t67627\n俸风\t67628\n金奶奶\t67629\n图标\t67630\n湿虎\t67631\n错嫁\t67632\n424579800556751153578\t67633\n陈总结\t67634\n二六三二六三七\t67635\n唐季礼监制\t67636\n太底\t67637\n36美元\t67638\nqq2752501204\t67639\n大汉缘之云中歌\t67640\n多咪抑乐\t67641\nwears\t67642\n一个么\t67643\n几天天yy\t67644\n屠格\t67645\nweare\t67646\n呃幺三七零二幺\t67647\n苏洛\t67648\n错了吃饱\t67649\n那分钟\t67650\n一个九\t67651\n赵吧\t67652\n顾润\t67653\n小听歌就是那个那个蔷薇小清\t67654\n4758\t67655\n薛同敏\t67656\n刘伯伯\t67657\n比山\t67658\n徐一航\t67659\n赵吗\t67660\n唇炎\t67661\n大危机\t67662\nhjdhoj\t67663\n锐林雅居\t67664\n呀屁\t67665\n罗罗\t67666\nfggv\t67667\n哎呀呀呀呀呀呀\t67668\n华罗庚\t67669\n罗网\t67670\n激动\t67671\n唯一\t67672\n列入\t67673\n芈庙\t67674\n淡谈\t67675\n镭射\t67676\n管理部\t67677\n赶回\t67678\n勇者传\t67679\n中指\t67680\n行不我讨厌\t67681\nKick-Ass\t67682\n肺疼\t67683\n一个劲儿轮\t67684\n韩何苗\t67685\n潘春花\t67686\n三十四块\t67687\n清河县\t67688\n了的话\t67689\n射速\t67690\n111111囤积居奇\t67691\n10000000岁\t67692\n意思我的朋友\t67693\n马克笔\t67694\n排行前\t67695\n汹涌\t67696\ngjirza\t67697\n灵通人士处\t67698\n4000年\t67699\n晓博\t67700\n不堪设想\t67701\n查分\t67702\n巨作\t67703\n谷啊莫\t67704\n各式各样\t67705\n赵宇昂\t67706\n条状物\t67707\n胃液\t67708\n888九百九九件\t67709\
n那当你是人妖\t67710\n寻找\t67711\njdkod\t67712\nThu123455678910\t67713\nLIPPMANN\t67714\n杨么杨\t67715\n伶牙俐齿\t67716\n常河镇\t67717\n三三来奶奶奶奶\t67718\n一下一堆\t67719\n对半\t67720\ntoped\t67721\n对华\t67722\n我知道你是谁了你好呀\t67723\n十六级\t67724\n210分钟\t67725\n负有\t67726\n骏鲁迅军\t67727\nft闪电侠\t67728\n40gg\t67729\n板荡\t67730\n复出\t67731\nfvnlkdcl\t67732\n气证\t67733\n汇丰控股\t67734\n呐喊\t67735\n刘琳素\t67736\n谢国森\t67737\ndisappear\t67738\n再来一个再来一个再来一个再来一个再来一个再来一个再来一个再来\t67739\n380个\t67740\n手作生\t67741\n武铭哲\t67742\nt恤衫\t67743\n四分之一\t67744\n张来芽\t67745\n登塔\t67746\n境外\t67747\n痴香\t67748\nG8u\t67749\nG8v\t67750\n水电局\t67751\n实习生\t67752\n0.47%\t67753\n水电屋\t67754\n预选赛\t67755\n杨荣霞\t67756\n英雄猫\t67757\n起庭中春草\t67758\n13103479311\t67759\n工商行ICBC\t67760\n十多天\t67761\n2002年\t67762\n来姐\t67763\n晚饭\t67764\n脚本\t67765\n市疾控中心\t67766\n晚饮\t67767\n度咪咪灰灰\t67768\n吧主任期\t67769\n心里手\t67770\n10日\t67771\n4.4%\t67772\n地手机\t67773\ncaoprn\t67774\n芪炖鲈鱼\t67775\n慑于\t67776\n租碟\t67777\n灌篮高手\t67778\n郭春海\t67779\nkuabg\t67780\n荒废\t67781\n泰勒定律\t67782\n人生币\t67783\n辽宁兹\t67784\n今天六一儿童节\t67785\n自古东坡律\t67786\n弱猪\t67787\n度秘我恨你讨厌你\t67788\n嗯天天\t67789\n语速\t67790\nyeh\t67791\nyeo\t67792\n88888888888888888888888889999991111\t67793\nyea\t67794\n残魂\t67795\nyee\t67796\nyey\t67797\n妥妥\t67798\n一万多年\t67799\n肚子气\t67800\n1rs二\t67801\nyep\t67802\n彩虹蛋糕\t67803\nyeu\t67804\nyet\t67805\n灵越\t67806\n药萌萌哒\t67807\n涨幅\t67808\n中国重汽\t67809\n陈隔壁\t67810\n什麼來什麼\t67811\n红抑菌\t67812\nwrikGxckl\t67813\n遍野\t67814\nSiwa\t67815\n上回\t67816\n六十公斤\t67817\n罗洛亚\t67818\n二十个二十美元\t67819\n小二来\t67820\nyeS\t67821\n白白在线\t67822\nyangmiima\t67823\n豌豆射手\t67824\n封休\t67825\n也有事\t67826\n2016年1月3号\t67827\n行外加\t67828\n白璟丑\t67829\nccchao\t67830\n秀女\t67831\n物联网\t67832\n育碧\t67833\n也有人\t67834\n黑花\t67835\n太难受\t67836\n小孩当家\t67837\n诗晓\t67838\n林泽雄\t67839\n肠师\t67840\nertyuyuu\t67841\n六六三十九\t67842\n人民日报\t67843\n上学问\t67844\n压迫\t67845\n逆載鲸\t67846\nchanghon\t67847\n总动员\t67848\n农华\t67849\n梨花头\t67850\n喔克么\t67851\n4月23日\t67852\n曲展生\t67853\n卡其马\t67854\ncjvj\t67855\n耽心\t67856\n再唱一个啊魔法的翅膀\t67857\n知道你是男女女士男\t67858\n针锋相对\t67859\n屋住\t6786
0\n图图图图图图图图图图图图图图图图图图图图图图图图图图图图图图\t67861\n偏心\t67862\n那何时\t67863\n八万元\t67864\n引语\t67865\n若我\t67866\n海拉\t67867\n五牛\t67868\n纽曼\t67869\n陈俊荣\t67870\n海拔\t67871\n一场游戏\t67872\n陈成名\t67873\n引诱\t67874\nvsvs\t67875\n索迪斯\t67876\n袁雨山\t67877\n喜新厌旧\t67878\n奥普红子\t67879\n哈哈笑\t67880\n我了我不想和你说话了拜拜\t67881\n醉了醉\t67882\n明天天安\t67883\n闫超\t67884\n糖蜜\t67885\n桑鑫\t67886\n经度\t67887\n淑纯\t67888\n嘀嘀咕咕\t67889\n没用钱\t67890\n老子要是什么\t67891\n嗯再讲\t67892\n嗯乐购嘛乐购\t67893\n水鱼汤\t67894\n会费\t67895\n明禅法师\t67896\n产业链者\t67897\noree\t67898\n傻子童\t67899\n血者\t67900\nTiKe\t67901\n赖世豪\t67902\n清酒\t67903\n始喪睛\t67904\nNenDan\t67905\n手臂\t67906\nkuodr\t67907\n21134423\t67908\n东风悍马\t67909\n百斤\t67910\n甘霖\t67911\n猫小白\t67912\n嘻嘻這\t67913\n不接\t67914\n绕弯子\t67915\n535734314586567373754754\t67916\n最高级\t67917\n清心\t67918\n居然海峡\t67919\n名都不对度秘我记住你\t67920\n颂升\t67921\n绿狗网\t67922\n工口本\t67923\nflying\t67924\n归真堂\t67925\n马六二实大厦\t67926\n木意思\t67927\n东尼玛\t67928\n酒后和一个人\t67929\n哈芬槽道\t67930\n同学们\t67931\n數\t67932\n半头套\t67933\n打仗场\t67934\n高铭志\t67935\nTH粉\t67936\n气长\t67937\n梁燕婷\t67938\n周梦瑾\t67939\n王冰玉\t67940\n那几年\t67941\n何劳妙手涂吾像，但愿君心合我心\t67942\n宝错\t67943\n盆们\t67944\n53281\t67945\n调唱\t67946\n别姬\t67947\nvteed\t67948\n山羊\t67949\n身后\t67950\n二六零三\t67951\n山美\t67952\n悠然见\t67953\n二六三五七七五七八\t67954\n空想\t67955\nhjjn\t67956\n马儿\t67957\n好读书\t67958\n礁溪小学\t67959\n划入\t67960\nLydjxjkbbkbf\t67961\n通电\t67962\n太可爱了你\t67963\n打四怪\t67964\n43110043111\t67965\nbyd3520\t67966\n我的机\t67967\n求求导\t67968\nMassa\t67969\n啜泣\t67970\nfsvip\t67971\n消费法\t67972\n书院\t67973\n德哥\t67974\n通用\t67975\n途经\t67976\n端宏斌\t67977\n姚完毕\t67978\n找到\t67979\n对驭\t67980\n管会艳\t67981\n小篆宠\t67982\n吮吸\t67983\n大我唱\t67984\n度数米\t67985\n一百万碗\t67986\n饮烈\t67987\n李一手\t67988\n我的期\t67989\n斟酌\t67990\n迷人谢谢\t67991\n卑鄙再见\t67992\n才会\t67993\n给扣\t67994\n包先生\t67995\n丑哥\t67996\n八几\t67997\n仇桂生\t67998\n迟疑\t67999\n吴彦祖\t68000\n学院风\t68001\n壮实\t68002\n毒战\t68003\n露出来\t68004\n猪窝\t68005\n小珊珊\t68006\nmememememememememememememememememmememe哒\t68007\n大给糖\t68008\n防写\t68009\n梅凌风\t68010\n2011.3.5\t68011\nweekend\t68012\nldonotknow\t68013\n叶莉\t68
014\n陆续和\t68015\nffifh\t68016\n15821381211\t68017\n轰烈\t68018\n度秘我好开心\t68019\n敢\t68020\n李明天\t68021\n防冻\t68022\n蟹浦\t68023\n密我回来\t68024\n5月5日\t68025\n木马木马木马木马没卡卡\t68026\n起作业\t68027\n胆毛\t68028\n金咯\t68029\n瘊子\t68030\n120207\t68031\n啊伯\t68032\n百禾\t68033\n本周三\t68034\n粪粪\t68035\n波棱盖卡秃噜皮\t68036\n大件路\t68037\n本周一\t68038\n是的我吃食实话实说\t68039\nfuvdtgcfgghftg\t68040\n小姨子\t68041\n虚惊\t68042\n壹思\t68043\n行不通\t68044\n亲情的背叛\t68045\n胆大包天\t68046\n加载\t68047\n度秘你给我唱一个首三只小熊行\t68048\n思想准备\t68049\n高丽琦\t68050\n虎子\t68051\n说视感\t68052\n拍电影\t68053\n2012年4月27号\t68054\n春晚怪三小只\t68055\n波塞\t68056\n揮別\t68057\n好不舍\t68058\nngwjgxh\t68059\n踩踏\t68060\niPhone6S\t68061\n好吧哲理家\t68062\nGUJJ\t68063\n修住\t68064\n初音未来还是东京食尸鬼\t68065\nE-350双核处理器\t68066\n比大胖\t68067\n获奖者\t68068\n呢嘎嘎嘎嘎哥哥哥哥哥哥哥哥\t68069\n笑不死\t68070\n这题\t68071\n我讨厌你讨厌你恨你恨你恨你恨你恨\t68072\n8505866\t68073\n南无老鼠在我\t68074\n酷调\t68075\n情侣装\t68076\n存不存\t68077\nhjjkkhj\t68078\n直到\t68079\n赵艺莹\t68080\n哦别亲我脖子\t68081\n半经\t68082\n发行商\t68083\n鱼糕\t68084\n所有权\t68085\n形体\t68086\n一百门\t68087\n飘飘然\t68088\nohmy噶\t68089\nsndndbxjd\t68090\n34分之十九\t68091\n野岛刚\t68092\nlixk\t68093\n整年\t68094\n观众\t68095\nyodd\t68096\nu十四想你\t68097\n计算\t68098\n戴手\t68099\n秘不好\t68100\n阴经硬结\t68101\n张萌\t68102\n改行\t68103\n接着\t68104\n雪纺波\t68105\n撞日\t68106\n汽修\t68107\n急性肠胃炎\t68108\n一起不懂\t68109\n人情\t68110\n幼纸\t68111\n123457889466618\t68112\nfilled\t68113\n竭尽全力\t68114\n找你了你说\t68115\n痛爱着\t68116\n大黑菊\t68117\n烦透\t68118\n咖叔\t68119\n万蓉\t68120\n溶洞\t68121\n辛卯\t68122\n2011世界和平\t68123\n吴佩芳\t68124\nBANana\t68125\n菊花粥\t68126\n我是个女的你信\t68127\n桥本爱\t68128\n记者会\t68129\n发笑而知\t68130\n1029万\t68131\n花里\t68132\nQeuowuqprtw\t68133\n多智恼\t68134\n赠赠送\t68135\n李兔子\t68136\n2151078837\t68137\n来来嘛\t68138\nmerely\t68139\n余平\t68140\n江江\t68141\n贝优\t68142\n评委席\t68143\n祝福语\t68144\n青翠\t68145\n美莲曼\t68146\n肥婆\t68147\n云中歌云中歌\t68148\n布加尔\t68149\n配搭\t68150\n搞好天\t68151\nPontunombre\t68152\n亲爱的早点休息吧晚安\t68153\n草文\t68154\n洪莉娜\t68155\nbhiphotosbaiducomxiaodupicitemac6eddc451da81cb987987b85566d01609243123jpg\t68156\n77879204\t68157\n首间\t68158\n成事先\t68159\n方景怡\t6
8160\nComment\t68161\n呵呵党\t68162\n站口\t68163\n秘物\t68164\n冒牌货机器人\t68165\n态势\t68166\n糖尿学校\t68167\n临猗干嘛\t68168\n嘎嘎嘎嘎嘎嘎嘎GG︶\t68169\n于东淑\t68170\n站史\t68171\nddsyeeetreuyr\t68172\n正邪\t68173\n跑偏\t68174\n女快告诉我\t68175\ngangscheh\t68176\n马玉泽\t68177\n峰值\t68178\n十几岁\t68179\n舒微\t68180\n消遣\t68181\n你好罗\t68182\nLevis\t68183\n心情不好呀\t68184\n咧不小了结\t68185\n大派\t68186\n山地车\t68187\n大洼\t68188\n一壶山\t68189\nhi淘\t68190\n14p\t68191\n葱毛\t68192\nvvnx\t68193\n大洲\t68194\nwifgh\t68195\n夏天飞\t68196\n灵美丽\t68197\n小狗儿\t68198\n仅靠\t68199\nXHU\t68200\n惶恐\t68201\n出境游\t68202\n紫陌溪\t68203\n我没有我没安猫\t68204\n条口\t68205\n騙\t68206\n刘东生\t68207\n天娇\t68208\n笑死人了你很了不起\t68209\n騕\t68210\n连带率\t68211\n表意\t68212\n大洋\t68213\n丰满兄弟\t68214\n瞌睡虫\t68215\n玄幻\t68216\n14分\t68217\n贺俊坚\t68218\n象棋王\t68219\n146\t68220\n147\t68221\n经济\t68222\n在床上浪\t68223\n142\t68224\n143\t68225\n140\t68226\n141\t68227\n下楼涛\t68228\n斑斑青\t68229\n歪密\t68230\n148\t68231\n149\t68232\n6月15\t68233\n欧阳若轩\t68234\n短衣裤\t68235\n安然虱\t68236\n车别\t68237\n惜别\t68238\n凯越\t68239\n我在图二我的爱征途\t68240\n粑粑蛋\t68241\n鱼兄\t68242\nghiphotosbaiducomxiaodupicitema8773912b31bb051ae712b5b317adab44aede068jpg\t68243\n01毫米\t68244\n度蜜度秘\t68245\n七律\t68246\n霸王龙\t68247\n啦啦啦陪我唱吧\t68248\n无知郁\t68249\n谈自强\t68250\nnour\t68251\n真冰\t68252\nEuriit9tirfifititot958ruyer747\t68253\n西康路海防路\t68254\n华人文化产业投资基金\t68255\n美津姐\t68256\n怀乡\t68257\n千均\t68258\n五十块\t68259\n小晨晨\t68260\nhchgvb\t68261\n童叟无\t68262\nbjiu\t68263\n千块\t68264\n不对养\t68265\n爱的风不敢对爱\t68266\n建文\t68267\n告诉我喜欢的人\t68268\n|||||||顺道\t68269\n汝可\t68270\n索尔尼仁琴\t68271\n拱桥\t68272\n作息\t68273\n穿越火线之枪战王者\t68274\nqgair\t68275\n轮胎\t68276\n光明大道\t68277\n时光机\t68278\n10克\t68279\n一别多年\t68280\n帅哥是谁啊谁谁谁谁给你解\t68281\nchiese\t68282\n嗯微小\t68283\nQQ新群16242485\t68284\n一个一百多\t68285\n乐死\t68286\n邢亚鹏\t68287\n东方神起\t68288\n进沙\t68289\nbohvgf\t68290\n熊出没之四有我休息少来\t68291\n昂昂昂昂昂昂昂\t68292\n站位\t68293\n站住\t68294\n春节\t68295\n别表\t68296\n九来\t68297\n切赞\t68298\n555555555555555555555555555554555555555555555\t68299\n样额\t68300\n大梦想家吧\t68301\nvogue\t68302\n书店\t68303\n安大略\t68304\n難過\t68305\ndhdy\t68306\n
旱冰\t68307\n详情\t68308\n红星美凯龙琶洲商场\t68309\n王小贱\t68310\n拉斐\t68311\ndhdi\t68312\n排水\t68313\n五堡\t68314\n8815480\t68315\n单着单着了你的风\t68316\n性猪\t68317\n盲口\t68318\n轻友\t68319\n谏使\t68320\n马丽莎\t68321\nstitch\t68322\n修理师\t68323\n许尔勒\t68324\n拉斯\t68325\n黑白带\t68326\n书度\t68327\n3251251\t68328\n格局观\t68329\n排气\t68330\n五堂\t68331\n四四四\t68332\n太医\t68333\n二三六零\t68334\n白羊在喜德\t68335\n崔雯昊\t68336\n毕节\t68337\n外口\t68338\nr455冥广广\t68339\n18392980218\t68340\n会死\t68341\n烈烈\t68342\n大猪大猪大猪草莓\t68343\n外史\t68344\n状宪无\t68345\n外号\t68346\n休息室\t68347\nsugbjy\t68348\n李嘉琪\t68349\n个性化\t68350\n耳轻风\t68351\n终将\t68352\n李嘉琦\t68353\n8点\t68354\n不是我不我退订\t68355\nIfacki\t68356\n货车\t68357\n诺氟沙星\t68358\ngjgfc\t68359\ngjgfb\t68360\n爱着我我告诉你\t68361\n着玩\t68362\n窥屏\t68363\nwiishx\t68364\n乱了套\t68365\n旅途观\t68366\n沙河市\t68367\n少多\t68368\n1905年\t68369\n欧no\t68370\n途径\t68371\n徐幼华\t68372\n灵枢\t68373\n而后\t68374\n兴合\t68375\n戚戚\t68376\n三二百三十\t68377\n7714\t68378\n渤海\t68379\n早日\t68380\n1851636454\t68381\n固执\t68382\nbobo\t68383\n丽江\t68384\n开路\t68385\n王八真\t68386\n焦晚\t68387\n盛情\t68388\n秘成天\t68389\n辛热\t68390\n忧虑\t68391\n难过多\t68392\n凯吉奥特曼\t68393\n库仑村\t68394\n3100家\t68395\n6400\t68396\n九十多分\t68397\n法拉利宾利迈巴赫\t68398\n2月27日20时\t68399\n谭竹英\t68400\n呢子\t68401\nviakm\t68402\n揍信\t68403\n经贸委\t68404\n胡唻唻\t68405\n文器\t68406\n青岛机场\t68407\n密室逃脱\t68408\nfftvgirls\t68409\n两个星期\t68410\n出错\t68411\n2006年12月30日下午四点半\t68412\n六一段\t68413\n不看相\t68414\n一员行\t68415\nipf\t68416\n液化成症\t68417\n布点\t68418\n性格\t68419\n狼狈为奸\t68420\n文万多夕名孓子子子及久一小不大人\t68421\nhi你会\t68422\n周逸宇\t68423\ngudxgbvvhgdamgjiklgctjbcgjbhuilkjhgfdaxcvbnmj\t68424\narichsstory\t68425\n该地\t68426\nsB吧\t68427\n骄奢淫逸\t68428\n否极泰来\t68429\nawnn\t68430\n不够用\t68431\n那太郎\t68432\n过滤性\t68433\n先机\t68434\n朦胧诗\t68435\n简比\t68436\n4624655525865645\t68437\n诚恳\t68438\n侯虚怀\t68439\n肖丽霞\t68440\n袖经\t68441\n克鲁格曼\t68442\n刘洋\t68443\n众泰z700\t68444\n对视而笑\t68445\n15名\t68446\n可归\t68447\n兄台\t68448\n三四级\t68449\n刘柳\t68450\n哥哥哥哥哥哥哥哥哥\t68451\n外使\t68452\n逗留\t68453\n读秘昧\t68454\n张名得美\t68455\n三个晚上\t68456\n杜燕萍\t68457\n照张\t68458\n12041\t68459\n范阳晨\t68
460\nStammy\t68461\n安尼蛹\t68462\n出处\t68463\n比方说\t68464\n小隐隐\t68465\n每一个你\t68466\nwjwl\t68467\n小米手亲个嘴儿\t68468\n滋儿滋儿滋儿\t68469\n18438486812\t68470\n哼拜拜\t68471\n四个小时\t68472\n赵三娃\t68473\nｖｖｖｖｖｖｖｖｖ\t68474\n444411115555\t68475\n四分米\t68476\n0315\t68477\n荔枝肉\t68478\n偶说\t68479\n疏导\t68480\n打等\t68481\nsbar\t68482\n最好\t68483\n投报\t68484\n萱迷们\t68485\n孺子\t68486\n房忆绮\t68487\nsbab\t68488\nStephen\t68489\n妖艳\t68490\nd288\t68491\n梁晓婷\t68492\n城狮\t68493\n邓力\t68494\n蒋旭\t68495\nanzhuang\t68496\n度秘你好烦人我讨厌你讨厌你\t68497\n跟贴\t68498\n再见改天\t68499\n铃木\t68500\n抑扬顿挫\t68501\n贞操\t68502\n听不想\t68503\n巨奖\t68504\n有毒\t68505\nRIM\t68506\nzaishuo\t68507\n交换\t68508\n朱文晖\t68509\n赞赞赞赞赞赞\t68510\n疏密\t68511\n要不想知道\t68512\n极值\t68513\n国富大中华\t68514\n敏\t68515\n目录\t68516\n定期待遇\t68517\n出外\t68518\n音乐史\t68519\n县法院\t68520\n明白点\t68521\n雷锋式\t68522\n死到临头\t68523\n大河头\t68524\n死伤\t68525\n老人们仙\t68526\n库克畅\t68527\n关己\t68528\n2013款\t68529\n陈乔恩\t68530\n日刊\t68531\n左去式砉a\t68532\nmorning\t68533\n祢能\t68534\n蘭蘭\t68535\nGffffxdd\t68536\nbutf\t68537\ny37\t68538\n乐佩\t68539\ny35\t68540\n撞树\t68541\n158点\t68542\n根生\t68543\n首邻\t68544\na99默默\t68545\n刘志发\t68546\n月嫂\t68547\n常宁\t68548\n威比伦\t68549\n草榴芒\t68550\naxby1\t68551\nmibi\t68552\ngpmttdw\t68553\n希良\t68554\n余秀凤\t68555\n度秘你在说中秘\t68556\n玫瑰花园\t68557\n靖宇\t68558\n李锦晨\t68559\n妙可\t68560\n噩噩噩\t68561\nxaxax\t68562\n一起日\t68563\n佛照\t68564\n常客\t68565\n机炮\t68566\n98万元\t68567\n一吃白菜吧\t68568\n宝三宝\t68569\n止饥\t68570\n一起旺\t68571\n定等\t68572\n8405966642788\t68573\n冒雷锋\t68574\n南瓜手\t68575\n我自\t68576\n姓张\t68577\n好哪好哪好哪好哪好哪好哪好哪好呢\t68578\n有把有\t68579\n高高在\t68580\n一椿\t68581\n高高地\t68582\n369258014\t68583\n龙根\t68584\n数3210\t68585\n3暮\t68586\n月流珠\t68587\n你不就我吗你还是那你别当我\t68588\n俾斯摩尔\t68589\n关闭式\t68590\n徇欲\t68591\n主持人\t68592\n少儿趣\t68593\n﹌呵呵\t68594\nDJ&amp\t68595\n素真人\t68596\n大江源小学\t68597\n限高\t68598\n兵兵没有人爱我亲爱的兵兵\t68599\n托福\t68600\n波斯得图\t68601\n败叶\t68602\n皇陵镇\t68603\n聊良\t68604\n丹尼尔\t68605\n中国羽毛球队队\t68606\n造血\t68607\n岳明慧\t68608\n别说话实说\t68609\n王后生\t68610\n69.9美元\t68611\n主客观\t68612\n卡卷包\t68613\n快呢度密\t68614\n各国各地\t68615\n芦村\t68616\n
我知道你读书少我本来就是你的妈妈\t68617\n海欣股份\t68618\nCBGBg\t68619\n仁怀\t68620\n18:00\t68621\nmmmmllllaoieiwnjshxqbqh28duw8192j827719ub9\t68622\nFrig\t68623\n河曲河\t68624\n逻辑性\t68625\n实打实\t68626\n身妄\t68627\n直接那儿从\t68628\n迎香穴\t68629\n189999\t68630\n台湾\t68631\n许丹\t68632\n7fzGjkd\t68633\nk秘\t68634\n有理烦\t68635\n骑手\t68636\n派雅\t68637\nMfd\t68638\n00000000001\t68639\n2559613918\t68640\n衡山小学\t68641\n纪录片\t68642\nzá\t68643\n九订\t68644\nchiphotosbaiducomxiaodupicitemc8177f3e6709c93d52f3ae84983df8dcd10054bbjpg\t68645\n1W次\t68646\n花样男子\t68647\n其身\t68648\n蔡雪怡\t68649\n熊出没之自由星鞋\t68650\n世音\t68651\n许三\t68652\nCOSPLAY漫友会\t68653\n1分钱\t68654\n刘旺佳\t68655\n叶凌晨\t68656\n皮抽\t68657\nsleeping\t68658\nhttphhiphotosbaiducomxiaodupicitemdbb44aed2e738bd4\t68659\n圭将\t68660\n骆甜竹\t68661\n若现\t68662\n二三节\t68663\njebdd\t68664\n拨打\t68665\n语音障\t68666\n懒呀没呀敖\t68667\n呀握\t68668\n烦扰\t68669\nkjs\t68670\n肝池\t68671\n1分钟\t68672\n度一一\t68673\n12￥\t68674\n6月29日\t68675\n年纪轻轻\t68676\n好了没爱了\t68677\n华联\t68678\n娘脸\t68679\n二氧化氮\t68680\n韓劇\t68681\n何海杰\t68682\n七章\t68683\n大体能抽湿机\t68684\n355346\t68685\n玉苍山\t68686\n火热\t68687\n东湖\t68688\n随时\t68689\n二手市场\t68690\n眷恋\t68691\n枪支\t68692\n手雷\t68693\n要强\t68694\n黑贴\t68695\n唯乐\t68696\n白丝鱼\t68697\n1400多元\t68698\n哪几天\t68699\n零七二二\t68700\nMERIVA\t68701\n坪如垛\t68702\n螨虫\t68703\nfjgd\t68704\nfjgg\t68705\n好啊好啊我可开心了多密你\t68706\n脚板\t68707\n盐官\t68708\n穿越帧\t68709\nfugx\t68710\n不认真\t68711\nfjgv\t68712\n97646843946793494\t68713\n女人心还海底针\t68714\n红阳\t68715\n有行\t68716\n脱皮\t68717\n挺西北\t68718\n数绘板\t68719\njidey\t68720\n傻子傻子傻子度秘度秘度秘傻子傻子度秘傻子度秘傻子度秘\t68721\n挺点\t68722\n大趋势\t68723\n郁郁寡欢\t68724\n2011年7月15日起\t68725\n玩不懂\t68726\n十一家嗯腈纶\t68727\n上善若水\t68728\n你是谁你是谁你是芭蕾舞\t68729\n刘恺威\t68730\n功夫片\t68731\n联想快捷\t68732\n两百棵\t68733\n你的故事\t68734\n殴打\t68735\nnat\t68736\n第15次\t68737\n当局\t68738\n水果蛋糕\t68739\n来我家玩\t68740\n流士\t68741\n2尺\t68742\n检点\t68743\n土鳖样\t68744\n誉满\t68745\n秘我告诉你\t68746\n正不正爽\t68747\n极地\t68748\n海德堡\t68749\n木奶仪\t68750\n罗芬\t68751\n泛亚\t68752\n真心片\t68753\n李念念\t68754\n蹦跳\t68755\n烤肉酱\t68756\n干窝\t68757\nnowneeds\t68758\n度度行\t68759\n足怪
\t68760\n热水电\t68761\n得意调皮\t68762\n软肋\t68763\n暴漫\t68764\n白无常\t68765\n蹦跶\t68766\n张誉\t68767\nBCbca\t68768\n弱男\t68769\n手机套\t68770\n2777\t68771\nspspspspspspspspsbsbsbsbsbsbsbsb\t68772\n我要你咯喔灰太狼卑微斗\t68773\n添堵\t68774\n独到之处\t68775\n鼎力相助\t68776\n四清\t68777\noutu\t68778\nkma\t68779\n市井\t68780\nfhutdcjir\t68781\nkmm\t68782\n赌场\t68783\n勇士队\t68784\n及格否\t68785\n一比甲\t68786\n岳家庄村111号\t68787\nwogang\t68788\nkmu\t68789\n文全\t68790\n赌圣\t68791\n小度秘行\t68792\nkmp\t68793\n好与坏\t68794\n霍州段\t68795\nbekd\t68796\n九十多斤\t68797\n狗仔队\t68798\n告诉我话\t68799\n水发木耳末\t68800\n藏甲\t68801\n6333395284\t68802\n很好多喝\t68803\n瘋瘋癲癲\t68804\n真性情\t68805\n在一起吧蛋蛋\t68806\n断子绝\t68807\n哎呀妈呀你真笨那我说的就是女上你呀干嘛要说男生我恨你你喜欢我我去我恨你\t68808\n毒咪\t68809\n租约\t68810\nvhjhh\t68811\n卢鞠花\t68812\nｃｏｂｗｇｂｂｂｂｂｂｖ\t68813\n开学\t68814\n聽音樂\t68815\nhttpmvqqcomplayplayhtmlwxnickaE29CAACanEE898FwxheadhttpwxqlogocnmmopenQ3auHgzwzM4Ew0icLXoJ1DF7ThyVU1gSubJZJBAEtBcF3ibRkWicJtVibL4WPBYdMkZQ1MdUdfDCC9N3zl0abSHZTuwr6HnBINl0Ufzds3lODIs0coveridvidq01824f27kafromgroupmessageisappinstalled1i1\t68816\n上我不知道\t68817\n6542家\t68818\n电动车\t68819\n博卡拉\t68820\n董鑫月\t68821\n信不投诉你\t68822\n15038987318\t68823\n不赖床\t68824\n好呀好呀好呀好你就是的好呀\t68825\n悠然\t68826\n模棱两可诺\t68827\n法医\t68828\nw斯特林\t68829\n褂子\t68830\n营养品\t68831\n677796\t68832\n瓦菲\t68833\n付小菊\t68834\n第五位\t68835\n东岸\t68836\n草樱\t68837\n彩虹书吧\t68838\naZV\t68839\n头昏眼花\t68840\n告你我告\t68841\n我们子\t68842\n早明\t68843\n来玩儿\t68844\n正聚v许v\t68845\n陈筱雅\t68846\n嫁错郎\t68847\n视网膜\t68848\npian目\t68849\n走路过\t68850\n平型\t68851\n暴雪\t68852\n挖掘机\t68853\n暴雨\t68854\n舞出我人生\t68855\n,\t68856\n全哪\t68857\n兔兔兔兔兔兔兔兔\t68858\n爹地\t68859\n你好热\t68860\n相好是\t68861\n建业VS力帆\t68862\n电音\t68863\n木雕\t68864\n肖工\t68865\n全品\t68866\n足弓\t68867\n诱拐\t68868\nvccbkk\t68869\n我的孩\t68870\n天堂\t68871\n嗯多咪\t68872\nevans#\t68873\n架子\t68874\n联\t68875\n长沙\t68876\n吴丹波\t68877\n𠃌\t68878\ncsol2\t68879\nOvO\t68880\n过奖过奖过奖\t68881\nagain\t68882\n说真的假的是\t68883\n拉动型\t68884\ngdug\t68885\n1827557\t68886\n嗯王仁任\t68887\n长没\t68888\nytrg\t68889\n错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的错的\t68890\n新华宾馆\t68891\n违者
\t68892\n飞了给\t68893\n长治\t68894\n1bg\t68895\n难道说好\t68896\n对不然的话\t68897\nsitjhghdbrgdfffhdfvdngfdfggghjhfghfhjvjjhfgjjjhkjgkkgyggfhhbjjnhffgjhjjbbffg\t68898\n长河\t68899\n七分\t68900\n你好坏动力\t68901\n走转\t68902\n该病\t68903\n不是我喜欢你\t68904\n张不存\t68905\n松桃\t68906\n游块\t68907\n打鸣\t68908\n哼老师\t68909\n噶桃花\t68910\n绿和\t68911\n一分为二\t68912\n钟丽\t68913\n13655884265429075228411\t68914\n毛不\t68915\n宜城火车站\t68916\n打鸟\t68917\n26支\t68918\n3wee\t68919\n毛七\t68920\nchkffcklhj\t68921\n戴常涵\t68922\n陷害\t68923\n毛业\t68924\n罗马甲\t68925\n名村\t68926\niTravel\t68927\n圣传\t68928\n样书\t68929\n沾污\t68930\n姬从良\t68931\n啦啦啦啦啦啦啦啦拉拉啦啦啦啦\t68932\n圣伯\t68933\n6722\t68934\nTHANKS\t68935\n下水饺\t68936\n印花裙\t68937\n梭鱼\t68938\n5545548848858545945475654785\t68939\n恋爱通\t68940\n搭积木\t68941\n作业我需要\t68942\n天天那你猜我是男\t68943\ngvvcx\t68944\n2012016年\t68945\n前脚\t68946\n赌样样\t68947\n花火村\t68948\n潮水\t68949\njahamu\t68950\n东南枝\t68951\n李芗芗\t68952\n古诗句\t68953\n亚茹\t68954\n6abc哎x6m\t68955\n塞禁\t68956\n构陷\t68957\n1370元\t68958\nsitting\t68959\nyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy\t68960\n朗点\t68961\n你了我一个人\t68962\n向天歌白毛湖绿水红掌拨清波\t68963\n596666666963535399\t68964\n两千年前\t68965\n二龙戏珠\t68966\n13361个\t68967\n六男\t68968\n六由\t68969\n六申\t68970\n地雷阵\t68971\n舞脩\t68972\n痔疮\t68973\n凤凰山\t68974\n羊奶粉\t68975\n尿白亚\t68976\n社会保障\t68977\n低效\t68978\n性文化节\t68979\n2012-02-04\t68980\n麻烦小学\t68981\n阿觉\t68982\n一百八十三块\t68983\n九年级化学期末测试卷\t68984\n一殷\t68985\n紫鼠\t68986\nwobblyblueallowedourletlotjobaajjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjj\t68987\n4分钟之内\t68988\n菜苗\t68989\n陆欣怡\t68990\n往别\t68991\n黑客帝国\t68992\n曹子涵\t68993\n杂技转托网\t68994\n小苹歌榛蘑\t68995\n借不不\t68996\n26697806470799778444800458\t68997\n皮色\t68998\n班禅\t68999\n皮相\t69000\n黄万里\t69001\n大明秘\t69002\n恶汉互译啊瓮\t69003\n范么\t69004\n撒西哈啦\t69005\n澄迈\t69006\n芒果维尼草莓\t69007\n秋思\t69008\n外观\t69009\nｗｉｆｉ\t69010\n朴某\t69011\n那个儿\t69012\nPRESTIGE\t69013\n网游\t69014\n黄蜂\t69015\n收发室\t69016\ndtfdgdger\t69017\n大头人\t69018\n35.4倍\t69019\n爱你好奇怪\t69020\n来不来了\t69021\n特殊骨\t69022\n小贱别记\t69023\n16种\t69024\n韩国男篮国家队\t69025\nZCCD\t69026\n开跑\t69027\n21度
\t69028\n共秘\t69029\n馒头\t69030\n喉结\t69031\n饱了\t69032\n韵华街\t69033\ni大黑擦\t69034\n板上\t69035\nmosque\t69036\n藕片\t69037\n在岸\t69038\n湖面\t69039\n降息\t69040\n任良\t69041\n新气象\t69042\n小呆瓜\t69043\n群发\t69044\n阔太\t69045\n嘤嘤嘤嘤嘤嘤嘤嘤\t69046\n在岗\t69047\n划船\t69048\n合围\t69049\n叶罗\t69050\n0.00%\t69051\n曾老\t69052\n气叉\t69053\n二三四五六七八九十十一十二发\t69054\n宜兰\t69055\n吊炸\t69056\n左右\t69057\ngaag\t69058\n宜兴\t69059\n15284577140\t69060\n2920点\t69061\n150008882008\t69062\n谢含卉\t69063\n蜡笔年\t69064\n正门好嘛\t69065\n透透風分了麼事兒童節放棄婦產科\t69066\n刘梦梦\t69067\n&amp\t69068\n国石\t69069\n升天\t69070\n授予\t69071\ndlelast\t69072\n金华市\t69073\n太阳花人\t69074\n欢淫\t69075\n多嘴\t69076\n抽空\t69077\nn20八月飘香香满园\t69078\n厦大西村\t69079\n摄影\t69080\n奥赛\t69081\n太幸运\t69082\nmjmjmjmjnjm\t69083\n这啥意\t69084\n黑心\t69085\n一三四五六七八九十\t69086\nhttphhiphotosbaiducomxiaodupicitem1f178a82b9014a9071ec05c9ae773912b21beef0jpg\t69087\nPhoto\t69088\n岩真\t69089\n晚安曲#\t69090\ndudjidh\t69091\n外说\t69092\n看我儿\t69093\n飘红\t69094\n啦啦魔法变变变\t69095\n巴乔\t69096\n燎泡\t69097\n不死机\t69098\n阿不我闹\t69099\ncation\t69100\n老别老发\t69101\n动你度\t69102\n益阳\t69103\n鞠婧衣\t69104\n爸爸呢妈妈\t69105\n烦误\t69106\n接住\t69107\n邓小平\t69108\n十多分间\t69109\n险情\t69110\n杜吗\t69111\n太阳佢\t69112\n风趣\t69113\n排级\t69114\n宣示\t69115\n独户解放军\t69116\n幸幸\t69117\n把头\t69118\n排线\t69119\n秭归\t69120\n王梓烨\t69121\n2222年\t69122\nQQ堂\t69123\n杜名\t69124\n小丫头\t69125\n堡宁\t69126\n效率低\t69127\n人福医药\t69128\n辵了经\t69129\nanice\t69130\n我喜欢爱大陆\t69131\n宝俊凯\t69132\n三四岁\t69133\n5月3号中午\t69134\n有鬼\t69135\n对啊一股花\t69136\n米糊\t69137\n法统\t69138\n七继光\t69139\n黄头发\t69140\nduewiop\t69141\n马脸夏\t69142\n十七十八\t69143\n通胀预期\t69144\n蜂蜜柠檬水\t69145\nmoma杯\t69146\n试想\t69147\n大街小巷\t69148\n聊天了你是什么\t69149\n的话\t69150\n盈江\t69151\n心情好糟糕\t69152\n就是一方可\t69153\n做错\t69154\n250英镑\t69155\n样衣人\t69156\n吥\t69157\n面积\t69158\n偶葛\t69159\n娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃娃\t69160\n迈尔斯\t69161\n傀儡\t69162\n我爱你我们见一面好不\t69163\n不忘记\t69164\n关菊\t69165\n正文\t69166\n既往\t69167\npcu\t69168\n零八七六\t69169\n纯静\t69170\n17uu\t69171\n没有我不出\t69172\n556点\t69173\n我是你的优乐美\t69174\n切割\t69175\n可爱秘\t69176\n企业主\t
69177\n哈秘鼓\t69178\n大马哈鱼\t69179\n侦探文\t69180\n凌写\t69181\n我告诉你吧3115008\t69182\nTaipei\t69183\n流联\t69184\n蒋静静\t69185\n絮叨\t69186\n冯陈\t69187\n铺成\t69188\nWhatibvxfg\t69189\n天天有喜你面\t69190\n20110523\t69191\n大薇\t69192\n南海舰队\t69193\n好打电话\t69194\n倩倩\t69195\n重子\t69196\n猫猫\t69197\n欢喜\t69198\n肏象\t69199\n孙顺航\t69200\n请说\t69201\n568658865555\t69202\n死鸭\t69203\n市政协\t69204\n万家辉\t69205\n度日\t69206\n全美达人/美国达人\t69207\n草木皆兵\t69208\n挎把\t69209\n哥片\t69210\n割死\t69211\n死鸟\t69212\n举文\t69213\n有时候\t69214\n美廉美\t69215\n七夜郎\t69216\n雍齿耶\t69217\nrazale\t69218\n东胜区\t69219\n普吉府\t69220\nTNND\t69221\nfgzdcgmp\t69222\n动机\t69223\n售票窗口\t69224\n重学\t69225\n朱朝\t69226\n我喜神\t69227\nozZoI\t69228\n于是乎\t69229\n都度\t69230\n都座\t69231\n储蓄型\t69232\ndodmx\t69233\n啦啦啦啦啦啦我是快乐的小报\t69234\n刘葛\t69235\n美街道\t69236\n过伶仃谣\t69237\n10月4号\t69238\n公安机器人\t69239\n打赌\t69240\n政法大学\t69241\n苏州街\t69242\n足球队\t69243\n月利率\t69244\n李倩\t69245\n漏网之鱼\t69246\n打走\t69247\n五零亿\t69248\ntahw\t69249\n酒歌\t69250\n冯振凯\t69251\n區唇\t69252\n起征点\t69253\n花蕾\t69254\n遍野大豆高粱倾城\t69255\n民国家\t69256\n无理实在\t69257\nfysy\t69258\n爆梦到死\t69259\n高梓豪\t69260\n匍匐\t69261\nfysg\t69262\n监理\t69263\njeieirririri\t69264\n12345678910111213141516171819202122232425262\t69265\n炫动\t69266\n省力\t69267\n萌赞神\t69268\n70年后\t69269\n沃克斯\t69270\n端倪\t69271\n电光石\t69272\nahxhfnb\t69273\n羽竹\t69274\n省劳\t69275\n510525198907013418\t69276\n摇号\t69277\n陆战队型\t69278\nX3\t69279\nBeach\t69280\nX1\t69281\nX6\t69282\nX7\t69283\nX5\t69284\nXJ\t69285\n干什么载人\t69286\nXH\t69287\n雅典娜\t69288\n事例\t69289\nXO\t69290\nXL\t69291\nXM\t69292\n说明天\t69293\n意恩\t69294\n美丽的秘密一个\t69295\n还有你上有你\t69296\nXF\t69297\nMLKTV\t69298\nXD\t69299\n百病缠身\t69300\nQSEFTY\t69301\nXY\t69302\n罗建\t69303\n德辉\t69304\nXS\t69305\nXP\t69306\nXQ\t69307\nXT\t69308\ndjsucwus\t69309\nXh\t69310\nXn\t69311\nXo\t69312\n南京农业大学\t69313\nXm\t69314\n狐狸精\t69315\n口阿四E57411584444454\t69316\nXf\t69317\nXe\t69318\nXx\t69319\nXy\t69320\n奥朵\t69321\nyoutifor\t69322\n好漂亮\t69323\n常春藤\t69324\n栖霞黑你我才\t69325\n晚上6点\t69326\nneinei\t69327\n一零两\t69328\n法律依据\t69329\n一零个\t69330\ndjshhshs\t69331
\n袋鼠\t69332\n长乐\t69333\n主夫人\t69334\n1:00\t69335\n理科类\t69336\n早饭\t69337\n出其右者\t69338\n长久\t69339\n腻腻\t69340\njjmpm\t69341\n擦手\t69342\n一零万\t69343\n亻体\t69344\nPower\t69345\n最优化\t69346\n温嘉檬\t69347\n兼职\t69348\n2996882653\t69349\n叫床片\t69350\n120602\t69351\nsquare\t69352\n哎咪\t69353\n口口声声\t69354\n谢昆承\t69355\n韩国大使馆\t69356\n四重\t69357\ntmts\t69358\n日程\t69359\n总错\t69360\ngjmhk\t69361\n想过来\t69362\ngopp\t69363\n十分球\t69364\n米甘\t69365\n极端民族主义\t69366\n美言\t69367\n疤度\t69368\n插一插\t69369\n跨年快乐\t69370\ngopf\t69371\ngopg\t69372\ntmtj\t69373\ntmtn\t69374\n说了再接\t69375\n入殓师\t69376\n君越\t69377\n网页版\t69378\n汕头海市\t69379\n鬼话\t69380\n唐人二七号\t69381\n热咳\t69382\n120604\t69383\n哥哥哥哥\t69384\n臭臭臭臭臭臭臭臭臭臭臭\t69385\n停靠\t69386\n米画\t69387\n帅气类\t69388\n那你的头一模一\t69389\n心灵雀\t69390\n大员\t69391\n携带者\t69392\ntokeep\t69393\n温度\t69394\n辛丑条约\t69395\n克难\t69396\n匡正世道人心\t69397\n罗煜鑫\t69398\nddffdddt\t69399\n玉玺\t69400\nitya\t69401\nhi喽\t69402\n媒体战\t69403\n不靠不干\t69404\n尕48555\t69405\n李宝英\t69406\n志龙\t69407\n玉王\t69408\n玉玉\t69409\n唯美块\t69410\nFgthfy\t69411\n刘龙洋\t69412\n再见哈哈哈大笑\t69413\n抱不走\t69414\n试一下来\t69415\n千岁千岁千千岁\t69416\n一万四千多\t69417\n公主裙\t69418\n领空\t69419\n13676902929\t69420\n興認識\t69421\n然然后\t69422\n恶灵\t69423\n恩度\t69424\n姬天泽\t69425\n代讨\t69426\n15678\t69427\n陈志朋\t69428\n咯斌\t69429\n濮阳\t69430\n咪咪米\t69431\n库库鲁\t69432\n叶芯儿\t69433\nBiu私人谈话\t69434\n莲城\t69435\n了不起啊了不起\t69436\n12345667899\t69437\n未央区\t69438\n腾讯游戏频道\t69439\n痊愈\t69440\n107772704n\t69441\n言寸草心\t69442\n中特效\t69443\ndoubleq\t69444\n第一根\t69445\n二十来块\t69446\nyournh\t69447\n鬼东西\t69448\n重建\t69449\n我求你了你给我一个\t69450\n嗯个杀人剑\t69451\n32550\t69452\n源宝\t69453\nk70\t69454\n勤荒\t69455\n坏脾气\t69456\nk78\t69457\n断振兴\t69458\n你好你好我是恶魔猎人\t69459\n华东交大\t69460\n60公分\t69461\n乜个\t69462\n三天之后\t69463\n生成\t69464\n5855855\t69465\n曼奇尼\t69466\n阿库波斯\t69467\n文何以\t69468\n六俊凯\t69469\nhellohello\t69470\n上夜五\t69471\n戈多\t69472\n亥杂诗\t69473\n最崇拜\t69474\n迎春路670号\t69475\ni冯8\t69476\nk7k\t69477\n探望\t69478\n下不了\t69479\n睡瞌睡\t69480\n百科全书\t69481\n冲杯\t69482\n王诗惠\t69483\n几撒\t69484\n冲来\t69485\n更改错\t69486\n12345678911\t694
87\nrestaurant\t69488\n晓浠\t69489\n二五三八零\t69490\n木崖\t69491\n男校\t69492\n斜着\t69493\n精神饱满\t69494\n呃hothot\t69495\n我恨你我恨你猪咪\t69496\n周一航\t69497\n森美\t69498\n我不宣\t69499\n3617\t69500\n八道网\t69501\n卧听\t69502\n丐夜\t69503\n同类项\t69504\n北街\t69505\n琴袋\t69506\n3w3w\t69507\n社会活动\t69508\n腔调\t69509\n红肿\t69510\n如言\t69511\n张庆莹\t69512\n嗯128\t69513\n红肉\t69514\n火媒\t69515\n好天天有喜二\t69516\njhffu\t69517\n塞伯坦\t69518\nbigbigwor\t69519\n黄堡\t69520\n15169073049\t69521\n秘真可怕\t69522\njhfff\t69523\n爆机\t69524\n加劳德特大学\t69525\n嘎嘎嘎古古怪\t69526\n色戒酒\t69527\n妆点\t69528\n94岁\t69529\n浏阳浩\t69530\n不骗你\t69531\n佳缘嘉欣\t69532\n伴我行\t69533\n少少\t69534\n2ii\t69535\n斯坦福桥\t69536\n治具\t69537\n别老侄说话，你给我你给我有点素质的别在那\t69538\nccicixcixu\t69539\n谁的零一\t69540\n简练\t69541\n转一转\t69542\n赌局\t69543\n顾临汐\t69544\n蒋福顺\t69545\n鼠目寸光\t69546\n花瓣浴\t69547\n铁boss\t69548\n通电器\t69549\n杨千瀣\t69550\n抽型\t69551\n私人银行\t69552\n四有\t69553\n6月14号\t69554\n流金岁月\t69555\n无生\t69556\np元\t69557\ncoco\t69558\n标的\t69559\n嘉达会\t69560\n伸延\t69561\n九十九十四岁\t69562\n乐园区\t69563\nwhatyouton\t69564\n天干座\t69565\n阴茎\t69566\n四时\t69567\n朱天亚\t69568\n随你便尿泡\t69569\n无画\t69570\n洛克宝\t69571\n过下\t69572\n唔系度煤球\t69573\n雾中仙\t69574\n鲁滨孙\t69575\n无用\t69576\n20位\t69577\n干洗机\t69578\n不是不我待\t69579\n丁克温\t69580\n过世\t69581\n雅姐\t69582\n宋思瑶\t69583\n醋气\t69584\n抱死\t69585\n牛灿\t69586\n林崇中\t69587\n不知道\t69588\n大被\t69589\n鑫源至尊太君\t69590\n晓燕\t69591\n夏梦吉\t69592\ndgshshvs\t69593\n精神病学\t69594\n欣蔓\t69595\n宋汝哲\t69596\n摩西摩西一库\t69597\n大坏蛋哒\t69598\n猪猪猪猪猪猪猪猪猪猪猪猪\t69599\n杜哥\t69600\n许文浩\t69601\n110520KTR\t69602\n微贷\t69603\n10月8日\t69604\n无法主\t69605\n抱歉\t69606\n旧宅\t69607\n西湖桥圩\t69608\n黄巢起义\t69609\n一步器\t69610\n叙大使馆\t69611\n靓丽\t69612\nηoo」ヘ\t69613\n陈宝存\t69614\n弱肉强食\t69615\n申冤\t69616\n第二十二次\t69617\n可是\t69618\n孙安瑞\t69619\n王妈妈\t69620\n双标\t69621\nnihuicanggema\t69622\n空格\t69623\n白族\t69624\n餐蛋\t69625\n变帅\t69626\n衷心耿耿\t69627\n蒋海云\t69628\n2x7y60\t69629\ncut\t69630\n漂亮妹子在哪呢在哪呢在哪呢在哪呢\t69631\n好残忍\t69632\nConcert.SM特辑\t69633\n一百米\t69634\nJul1局\t69635\n双核\t69636\ncuy\t69637\ncux\t69638\ncuf\t69639\ncue\t69640\n白日\t69641\ncuc\t69642\ndare\t69643\
n不合群\t69644\n侠之大者\t69645\ncuo\t69646\n事川\t69647\ncuj\t69648\ncuh\t69649\n十九所\t69650\n度秘我问你你猜我\t69651\nCéline\t69652\n操刀\t69653\n么是么\t69654\n耒阳正源学校\t69655\n戴哥\t69656\n乱子\t69657\n1516c\t69658\n心情不\t69659\n13087063970\t69660\n聲音說話嗎\t69661\n辽宁卫视\t69662\n木偶\t69663\n7566855\t69664\n11.5%\t69665\n电子商务\t69666\nxxxxxxxxxxxxxxxxxxxxxxxxxxxx\t69667\n笋岗街道\t69668\n放手一搏\t69669\n可欣\t69670\n徐燕媛\t69671\n斯米\t69672\n四十五\t69673\n超过30分钟\t69674\n神马事儿\t69675\n3M6\t69676\n糟子\t69677\n大义禀然\t69678\n薄雾\t69679\nhi度秘\t69680\n王思懿\t69681\n往岁\t69682\n亲我来了\t69683\n炸掉\t69684\n发明人\t69685\n四十二\t69686\n步伐\t69687\n实为\t69688\nST四维四家公司\t69689\n肖加上\t69690\n皮毛\t69691\n义举\t69692\n侯恶心\t69693\n24小时后\t69694\n47秒\t69695\n三十二三二八六五五\t69696\n许冬天\t69697\n伯瑞犬\t69698\n再见吧\t69699\nD600\t69700\n比翼高飞\t69701\n筑路\t69702\n薄雅\t69703\n孤影雪玥\t69704\n心法\t69705\n一个故事\t69706\n丁香山\t69707\n实业\t69708\n维赫\t69709\n第32枚\t69710\n伊萨维\t69711\n囤积\t69712\n这一个月\t69713\n银杏树\t69714\n宋加\t69715\n乔巴\t69716\n开聊\t69717\n催更卡\t69718\ningruouhot\t69719\n度秘我真的很爱你哦你爱我\t69720\n呵町\t69721\n四朵\t69722\nBale\t69723\nGGGFGFHFGVGHF\t69724\n180小时\t69725\n肠粉\t69726\n因为者\t69727\n3ML\t69728\n素美\t69729\n山间块\t69730\n昨晨\t69731\n成交量\t69732\n开头堡\t69733\n新奥\t69734\n后路\t69735\n几33\t69736\n噜噜仔\t69737\n一11112\t69738\n机谅\t69739\n汤程橙\t69740\n纯牛奶丝\t69741\nwittergo\t69742\n让伤\t69743\n礼轻情意\t69744\n㏑\t69745\nu五i爱搜\t69746\n新奇\t69747\n遗产\t69748\n双眼\t69749\n脱下\t69750\n疯了真\t69751\n舍不得你\t69752\n邵依娜\t69753\n戴胤\t69754\n㏄\t69755\n强索爱\t69756\n祈望\t69757\njagj\t69758\n查哈哈哈\t69759\n哈佛大学\t69760\n闲言碎语\t69761\nqhcgvbrcs\t69762\n泥棒\t69763\n3≦\t69764\njagg\t69765\n辛辣\t69766\n老的人\t69767\n裕通公司\t69768\n逢秋悲寂廖我言秋日胜春朝晴空一鹤排云上便引诗情到碧霄你看我背的对不对\t69769\n名列前茅\t69770\n事实我爱你就是爱我不爱你我\t69771\nfftgf\t69772\n乜乜乜乜乜乜\t69773\n奥能\t69774\n肖雨\t69775\n文塔斯\t69776\nFuDH\t69777\nfjjhggh\t69778\n13829812128\t69779\n站起\t69780\n疯老\t69781\n美莱\t69782\n秘迷\t69783\n爱我爱\t69784\n张欣然\t69785\n高贵\t69786\n1938年\t69787\n摩擦\t69788\n钟肿\t69789\n有点化\t69790\n7285\t69791\n吴秋媚\t69792\n松骨\t69793\n真短\t69794\n秉性\t69795\n单细\t69796\n王一曼\t69797\n明月传\t697
98\n九死一生\t69799\n真知\t69800\n栖霞寺\t69801\n10001万100家\t69802\n与生俱来\t69803\nbjohtk\t69804\n晓明\t69805\n韩海晴\t69806\n兴间\t69807\n耳坠\t69808\n微微小妹aasmasyomakasmasyourmalakamiseshipouentrouto\t69809\n说动力度\t69810\n山月儿\t69811\n隔壁栋\t69812\n搜仙\t69813\n说的话\t69814\n耀县\t69815\n有心干嘛\t69816\n凯皇\t69817\n几套\t69818\n狂淫\t69819\n时差\t69820\ngjigllw\t69821\n乖宝\t69822\n太神经质\t69823\ntranoko\t69824\n海伦海\t69825\nI5446461656\t69826\nHome建\t69827\n矫正\t69828\n肖家乡\t69829\n啦啦啦啦啦啦啦啦啦啦啦啦\t69830\n祝明天\t69831\n米祖\t69832\n震源\t69833\n肖家乐\t69834\n思明区\t69835\n无能模棱\t69836\n要不听话\t69837\n读帜\t69838\n陈树峰\t69839\n熊样熊样你文燕让你熊样\t69840\n美rorong\t69841\n田老二\t69842\n东西都的女\t69843\n大默\t69844\nfijjhffui\t69845\n明黄\t69846\n体表\t69847\n内风\t69848\n念鸟\t69849\n联大会堂\t69850\n叶冰伦\t69851\n主宾\t69852\nguxuxhchc\t69853\n山西财政税务专科学校\t69854\n九零\t69855\n武魂枫\t69856\n词谱\t69857\nswqq\t69858\n那些事\t69859\n一箭穿心\t69860\n三从四得\t69861\n遗憾+杯具\t69862\n13143413143\t69863\n嗯秋冬\t69864\n蒜油辣油\t69865\n咯分钟\t69866\nkkjjksa\t69867\n塔骆宾王鹅鹅鹅鹅曲项向天歌白毛浮绿水红掌拨清波\t69868\n那个元\t69869\n疟胡\t69870\n德鬼\t69871\n李畅\t69872\n655588\t69873\n宇宙飞船\t69874\ndxdghxutd\t69875\n三四\t69876\n悬梁刺股\t69877\n青羊宫\t69878\ngdioc\t69879\n三回\t69880\n打烊\t69881\n拉徨\t69882\nz41次\t69883\n拔尖\t69884\n敬轩\t69885\n食府\t69886\n差老子\t69887\n高怕\t69888\n分级\t69889\n带坏\t69890\n拉往\t69891\n分红\t69892\n４７\t69893\n70度\t69894\n谭永承\t69895\n三围\t69896\n预期\t69897\n闭妝\t69898\n一起萌生\t69899\n高总\t69900\nfhfbhvcgujvchujnvcdgdhgxghtghhfchjjbvvcdgbbbvdghhffhbfthvcbkyfv\t69901\n便秘摆设\t69902\n零下一度\t69903\n练剑恭喜\t69904\n某希\t69905\n期待版\t69906\n某师\t69907\nUOT\t69908\n靠靠不少\t69909\n赵芷萱\t69910\n不男不女\t69911\n陶浩然\t69912\n伟仔\t69913\n凄凉\t69914\n容身\t69915\npkk\t69916\n谈恋爱了吧\t69917\n晨梦初\t69918\n肉松饼\t69919\n我是你的绿豆\t69920\n哎呦的那你\t69921\npkn\t69922\n列斯\t69923\n你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌啦你好讨厌\t69924\n果国\t69925\n瞻前顾后畏首畏尾\t69926\n堡栅\t69927\n玩看\t69928\n1岁\t69929\n厮守\t69930\ntg5gfhg\t69931\n操逼\t69932\n玫琳凯\t69933\n果园\t69934\n二幺\t69935\n半山\t69936\n一直都在这里\t69937\n红袍袍\t69938\n嗯断\t69939\n二平\t69940\n于敏娜\t69941\n二年\t69942\n洗衣歌\t69943\n缺粮\t69944\n
叶好看我看你度秘\t69945\n跳出来\t69946\n嗯施\t69947\n中海地产\t69948\n嗯新\t69949\n陆少磊\t69950\nIMAS\t69951\n翻云覆雨\t69952\n绝望\t69953\n叫宅\t69954\n音阶\t69955\n嗯文\t69956\n体统来人\t69957\n韩如倪\t69958\n大小姐\t69959\n下午6点\t69960\n半屄\t69961\n数年\t69962\n满互\t69963\n汪燕\t69964\nskssj\t69965\n公积金\t69966\n半屏\t69967\n付文豪\t69968\n浙江菜\t69969\n第几代\t69970\n白细鳞\t69971\n双杠\t69972\n假的\t69973\n邱邱邱\t69974\n为时\t69975\n蜡笔冰\t69976\nkhbnnkl\t69977\n被砸\t69978\n吴金峰\t69979\n既不\t69980\n夺金\t69981\n钓鱼庄\t69982\niii类\t69983\n我好爱好爱\t69984\n九州\t69985\n百度小说\t69986\ntdtdjyf\t69987\n0.1亿\t69988\n脑震荡\t69989\nMISS\t69990\n桥店\t69991\n公办你是我的宝贝\t69992\n聂树斌\t69993\n将星\t69994\n哭耍\t69995\nvchcgcgdh\t69996\n拉贝比安\t69997\n书素\t69998\n一九八二三个\t69999\n安徽省妇联\t70000\n基腐宅\t70001\n双修\t70002\n否者\t70003\nvoyh\t70004\nNast\t70005\n大炮片\t70006\n扬子鳄\t70007\n恋人心\t70008\n连清川\t70009\n剥开\t70010\n非所问答非所问重要\t70011\n贫穷\t70012\n上门大夫\t70013\n周六下午\t70014\n安智vv\t70015\n%Gg\t70016\n1395366020帮\t70017\n水墨\t70018\n爱说慌\t70019\n充达\t70020\n八里河\t70021\n大细\t70022\n大组\t70023\n反反复复方法反反复复方法发发发发发发\t70024\n跪天\t70025\n77平\t70026\n艾玛艾玛\t70027\n一百两百\t70028\nds波斯\t70029\n发根\t70030\n黑名单者\t70031\nvceav\t70032\n欢送\t70033\n街妈\t70034\n苏代斯\t70035\n中科院微电子\t70036\n星芒\t70037\n都会\t70038\n独异\t70039\n周戎\t70040\n石斛\t70041\n你是的我五五喜欢和你\t70042\n哦唧\t70043\n明度秘\t70044\n八点二天\t70045\n伊伊\t70046\n无理取闹\t70047\nHugcheck\t70048\n上政\t70049\n十七位\t70050\nc瑞功能\t70051\n糖精枣\t70052\n哥尼斯\t70053\n周成\t70054\n周我\t70055\n拿尼\t70056\n12345医院\t70057\n女林\t70058\n自由职业者\t70059\nhccbbmhn\t70060\n曲艺\t70061\n凋小白\t70062\n小主人\t70063\n喔烤小饼\t70064\n今年8月30日\t70065\n劲丽\t70066\nHgggg\t70067\n巴西木\t70068\n同雅\t70069\n刘子浩\t70070\n油污\t70071\n等一下线\t70072\n颐高数码连锁\t70073\nghfhfhfjv\t70074\n算折算是\t70075\n948\t70076\n949\t70077\n946\t70078\n947\t70079\n942\t70080\n943\t70081\n有方\t70082\n941\t70083\n拉锯战\t70084\njinshen\t70085\nF119—PW—100\t70086\n嗟余听\t70087\n锦鲤\t70088\n电镐\t70089\n两大\t70090\n图解\t70091\n亲过\t70092\n全篇\t70093\n恶少\t70094\n主格\t70095\n亲近\t70096\n着了魔\t70097\n主根\t70098\ninameiting\t70099\naizife\t70100\n再见人在\t70101\n程金玲\t70102\n监控\t70103\nx
元\t70104\n2016年1月30日\t70105\n亮亭\t70106\ntsh\t70107\n里娃\t70108\n张大伟\t70109\ntse\t70110\ntsb\t70111\n969494999449494949949494949499494949499949499494\t70112\n7uuuuurhi\t70113\ntsy\t70114\ncguc\t70115\n毕业好难\t70116\n盖地皆沙\t70117\n下辈\t70118\n几十岁\t70119\n李炘睿\t70120\n美罗湾\t70121\n不管你\t70122\n坚定性\t70123\n下辖\t70124\n我是王子喃\t70125\n伟岸\t70126\n錒醃\t70127\n爱爱爱不完的绵翩翩编导用元\t70128\n咪咕光头强\t70129\n哎呦库克\t70130\n近百名\t70131\n下达\t70132\n下边\t70133\nJimi\t70134\nts8\t70135\n涡轮\t70136\n中宇卫浴\t70137\n吭声\t70138\n几号儿\t70139\n脉脉\t70140\n甲醛\t70141\n篆书\t70142\niiiiiiiiiiiiiiiii\t70143\n54n\t70144\n莫\t70145\n三十日\t70146\n跳楼\t70147\n南丰镇\t70148\n莮\t70149\n莱\t70150\n频发\t70151\n做到\t70152\n获\t70153\n哪一位\t70154\n莹\t70155\n羊马\t70156\n彩虹吧\t70157\n8858866578657899\t70158\n高贵祖\t70159\n赚到钱\t70160\n好想要\t70161\n近义词\t70162\n陈姿伶\t70163\n夏雨冬雪\t70164\n臭骂\t70165\nt3193607703\t70166\n弗成能\t70167\n很明显\t70168\n长吻\t70169\n好吧赞\t70170\n莎\t70171\n凝华\t70172\nhfnk\t70173\ntfdfyyghu\t70174\n继不理\t70175\n王艺栋\t70176\n猪你是猪你是猪你是猪你是猪你是猪你是猪你是猪你是猪你是猪\t70177\n尼丽帕尔\t70178\n我是你的主人哪笨笨\t70179\n迪热丽巴\t70180\n首就乡\t70181\n陪聊陪聊\t70182\njghg\t70183\n1441点\t70184\n33天前\t70185\nbwb\t70186\njghy\t70187\n麦尔\t70188\n张正\t70189\n蒙混过关\t70190\n仙桃桃\t70191\nbwn\t70192\n谢文希\t70193\n快好\t70194\n般态\t70195\n高提耶大秀#\t70196\n年少\t70197\n死不救\t70198\n摸摸摸\t70199\n花洒\t70200\n吕思阅\t70201\n面带\t70202\n150036311\t70203\n里奥那多\t70204\n搜狐igdtsckvug\t70205\n植树节\t70206\n源远流长\t70207\n马卡报\t70208\n嫌眼\t70209\n平易近人\t70210\nT型k线\t70211\n充数\t70212\n死磕\t70213\n说话呀嘲笑鸟\t70214\nJhcvjjmmu\t70215\n好想抱一只回家喵\t70216\n咯瘩\t70217\n心事儿\t70218\n春春\t70219\n第一座\t70220\n高峰\t70221\n曾欣怡\t70222\n非人非鬼\t70223\n跑男四\t70224\n彩色版\t70225\n行驶证\t70226\n180本\t70227\n屁操\t70228\n赵寅成\t70229\n麻袋\t70230\n诗字\t70231\n北京市热力集团方庄供热厂\t70232\n应聘\t70233\n阿拉巴一吧八一天仙频在哪吗啊拜拜\t70234\n防晒霜\t70235\n度秘不好么\t70236\n西东\t70237\n风风\t70238\n180朵\t70239\n20多首\t70240\n南红玛瑙\t70241\n新编\t70242\n多吉\t70243\n亚铜陵\t70244\n于瑞婷\t70245\n月份儿\t70246\n被淘汰\t70247\n再克龙江甘\t70248\n話題\t70249\n拒食\t70250\n猪肝汤\t70251\n王泽涛\t70252\n一百六十六\t70253\n薛凌寒\t70254\n太原市\t70255\n思变\
t70256\n性器\t70257\n我十二你问我九几\t70258\n节春\t70259\n敌台\t70260\n88个\t70261\n吃粽子\t70262\n大西洋月刊\t70263\ni7272\t70264\nirown\t70265\n死八戒\t70266\n417756\t70267\n計較\t70268\norzzz\t70269\n三峡大坝\t70270\n密人\t70271\n糊味\t70272\n1234456713101\t70273\n这几\t70274\n老巢\t70275\n一个30多岁\t70276\niucew\t70277\n几眼\t70278\n生玩\t70279\nhttphhiphotosbaiducomxiaodupicitem6d81800a19d8bc3e36202cbf858ba61ea8d3459ejpg\t70280\n6.5亿元\t70281\n冷不死\t70282\n抽插\t70283\n梦见过\t70284\n77214885566\t70285\n酒场\t70286\n想起花\t70287\n中华南路14号\t70288\n中国时报\t70289\n快接\t70290\nfyyff\t70291\nnjjbbn\t70292\n华为P1\t70293\n25万\t70294\n3337333777779999\t70295\n老鹰队\t70296\n牛了叫\t70297\n余少凡\t70298\n问你我知道\t70299\n聪明的乖乖\t70300\n崔1\t70301\n啦啦啦小宝贝\t70302\n海荣锅贴\t70303\n东北一家人\t70304\n两根\t70305\n错着火\t70306\n李智富\t70307\n小萝\t70308\n增添\t70309\n两样\t70310\n怀岁\t70311\n2400223009404\t70312\n小萌\t70313\n陈应文\t70314\n护城河\t70315\n拖动\t70316\n蜜宝\t70317\n点性\t70318\n红薯淀粉\t70319\n谢小姐\t70320\n给力\t70321\n无忧无疾\t70322\n小萱\t70323\n量产\t70324\n我命\t70325\n印第安人\t70326\n蜂蜜柚子茶\t70327\n一滴泪\t70328\n拭音\t70329\n寇憬飒\t70330\n小营\t70331\n器人\t70332\n镜头\t70333\n缺心\t70334\n炉具\t70335\nmuy\t70336\n度秘华\t70337\n唔识谂\t70338\n命运\t70339\n人妖嘞\t70340\n949568668\t70341\n崖柏\t70342\n女朋女朋友\t70343\n2002年2月29日\t70344\n度秘博\t70345\nkkkkkkksf\t70346\n蹄子\t70347\n高奋军\t70348\n吕小军\t70349\n赤焰\t70350\n疯狂制造\t70351\n葛楠楠\t70352\nJCHHF\t70353\nlb蹭\t70354\n时间仓\t70355\n喝咖\t70356\n轩然大波\t70357\n妻衣\t70358\n康哥\t70359\n侯三寿\t70360\nxgjcsfj\t70361\n撩开\t70362\n苏剧\t70363\n境界\t70364\n24%\t70365\n耳屎\t70366\n林承峰\t70367\ndhfvgf\t70368\nnanishuiba\t70369\n丽江古城\t70370\n咧力\t70371\n1000美元\t70372\n女边\t70373\n霸道总裁文\t70374\n邓楚良\t70375\n不舍得\t70376\n闲心烦\t70377\n莫大吧大把握\t70378\n时政类\t70379\n密秘史\t70380\n漫好\t70381\n花千沟\t70382\n鹌鹑\t70383\n过得\t70384\n一万只\t70385\n精电\t70386\n艺人\t70387\n秒传\t70388\n副餐\t70389\n玩弹\t70390\n中国娱乐网\t70391\n过往\t70392\n数出\t70393\nnm\t70394\n罚无所\t70395\n躁躁\t70396\n2011年5月3日\t70397\n姑娘小姑娘的姑娘\t70398\n想多说\t70399\n能不哭\t70400\n无梦\t70401\n播音员\t70402\n我喜欢力\t70403\n秒会\t70404\n玩弄\t70405\n三百九十一\t70406\n女星们\t70407\nbramble\t704
08\n150平方米\t70409\n圣衣\t70410\n150150889\t70411\n江华天\t70412\nzkgfbh\t70413\n胡静怡\t70414\n劝慰\t70415\n困苦\t70416\n李家岗\t70417\n米粒度\t70418\nhighing\t70419\n期指\t70420\n秘密密度\t70421\n草泥\t70422\n数一数\t70423\n宋佳静\t70424\nTFBOYs\t70425\n换药\t70426\ngmgmg\t70427\nSMG尚世影业\t70428\n拉氏明\t70429\nzonder\t70430\n兵兵姐\t70431\n枪弹\t70432\n17万亿元\t70433\n千门万\t70434\n该队\t70435\n15091516191\t70436\n免费话\t70437\n803280388888\t70438\n放映\t70439\nTFBOYS\t70440\n电信移动\t70441\n死快\t70442\n白不理\t70443\n新品\t70444\n死忠\t70445\n赵艳\t70446\n欧阳多\t70447\n黄子旭\t70448\n凊空\t70449\n武一\t70450\n首套\t70451\n杀富济贫\t70452\n难吃\t70453\n现身\t70454\n辐射线\t70455\n幕像\t70456\n初稿\t70457\n#四大名捕#\t70458\n纠偏\t70459\n死心\t70460\n黑龙江黑化股份有限公司\t70461\n武举\t70462\n高长大\t70463\n选材\t70464\n难听\t70465\n首她\t70466\n句数\t70467\nGURDY\t70468\n一千六两百六十元\t70469\n尾翼\t70470\n指示牌\t70471\nncnjnxnxbchdm\t70472\n雷玉玲\t70473\n赵达裕\t70474\n老照\t70475\n足不出户\t70476\n5863535353535353535533535353535\t70477\n对偶\t70478\n劲若希\t70479\n董世长\t70480\n卡蒂尔\t70481\n贴片\t70482\n太残酷\t70483\n9.5亿美元\t70484\n瘦猪\t70485\n丹杰仕甜\t70486\nfault\t70487\n中性人\t70488\n棒女郎\t70489\n值钱\t70490\n旁边儿\t70491\n养心\t70492\n说罢\t70493\n蜂群\t70494\n恼死\t70495\n律师协会\t70496\n看惨\t70497\n我的话你好帅\t70498\n稼公\t70499\n张乐葆\t70500\n执政\t70501\n养志\t70502\n暇想\t70503\n飞枫\t70504\n社科院技术经济与管理专业\t70505\n哎呦信\t70506\n防震\t70507\n每条路\t70508\n风花雪月\t70509\n赵腾讯\t70510\n许子怡\t70511\n开阳楼\t70512\n真空\t70513\n憨豆女排队\t70514\n真穷\t70515\n岳鹏\t70516\n一个100多\t70517\n２１３１．８７吨\t70518\n剧本\t70519\njbdd\t70520\n王氏\t70521\n3180\t70522\n卡拉卡反弹斯特北鼻\t70523\n林喜缘\t70524\n探播放\t70525\n凤月亚\t70526\n不足为怪\t70527\n理不容\t70528\n聊点儿\t70529\n炼成\t70530\n还不错性格内向\t70531\n五分之十九\t70532\n咱们两个亲一个最好不好\t70533\n合影\t70534\n这样的生活\t70535\n37枚\t70536\n容岩\t70537\n抗拒\t70538\n34317617176576567237327144\t70539\n三至五年\t70540\n99999999\t70541\n阿杜利\t70542\n17875772776\t70543\n中关村环廊\t70544\n昏昏噩\t70545\n度柲\t70546\n到期\t70547\n4624645642\t70548\n失准\t70549\n作业人\t70550\nbetaVB\t70551\n种病\t70552\n罗兰赛玉\t70553\n找房\t70554\n关女士\t70555\n遗传系\t70556\n狗头天\t70557\n说话社小学\t70558\n地质人\t70559\n没时间\t70560\n意趣\t70561\nu就
苏\t70562\nａｂｃ\t70563\n玛丽黛佳\t70564\nGallaudet\t70565\n一句两\t70566\nFhnfgg\t70567\n鹤山\t70568\ngmk\t70569\n余套房\t70570\n6329213\t70571\n九里九里\t70572\n几万块\t70573\ngmm\t70574\n云境\t70575\n罗盘\t70576\ngmg\t70577\n750ml\t70578\nVictoria\t70579\n小花骨朵儿\t70580\nakkanwooolol\t70581\nuccuc\t70582\n2010年12月9日\t70583\ngms\t70584\n仁川机场\t70585\ngmw\t70586\n夜总会\t70587\ngmt\t70588\n拉切\t70589\n书费\t70590\nXFDNR\t70591\nXXXXXXXX\t70592\n1600分钟\t70593\njap\t70594\n不值钱\t70595\n大银幕\t70596\n1446212351\t70597\n活活\t70598\n安卓拉卑鄙我恨你\t70599\n嫂子\t70600\n敞亮\t70601\n端面\t70602\n那你做我的兄弟八\t70603\n胶东\t70604\n黄脸婆\t70605\nmy299\t70606\n别可怜\t70607\n轻吻\t70608\n1313895577\t70609\n凌晨7点\t70610\n街机力宏\t70611\nMicky\t70612\n热闹\t70613\n会组秘\t70614\n多行\t70615\nhffggvz\t70616\n让我们荡起双浆\t70617\n雯爸\t70618\n热门\t70619\n大蟒蛇\t70620\n兴情\t70621\n强东妮\t70622\ninseins\t70623\n楼顶\t70624\n扯痧\t70625\n加康\t70626\n双色球话\t70627\n手撕\t70628\nxdddf\t70629\n拋棄\t70630\n还给我等\t70631\n看他来了请闭眼\t70632\n萨克斯\t70633\n魔秀兰博\t70634\n哪几招\t70635\nKING\t70636\n捏哦里理论jkwBJH\t70637\n哈伊\t70638\n丁力\t70639\n陈君兰\t70640\n肉色\t70641\n爱啦\t70642\n几门\t70643\nvufnc\t70644\n袜业\t70645\n河城荷\t70646\n惊弓之鸟\t70647\n小权\t70648\n好想听\t70649\n4分钟前\t70650\n小杉\t70651\n冤枉冤枉\t70652\nsgbfho\t70653\n朱河\t70654\n单据\t70655\n绒毛\t70656\n极高\t70657\n歹富\t70658\n咯雪绵绵\t70659\nc飞哥\t70660\n何必呢\t70661\n小杜\t70662\n裁决\t70663\n卖傻\t70664\nyáo\t70665\n黄小姑\t70666\n每天话\t70667\n罗斯科\t70668\n1346119653\t70669\n帝业如画\t70670\n小杯\t70671\n小杰\t70672\n弥留时\t70673\n27小时\t70674\n金昌绪春\t70675\n一见钟情丝带心心碎\t70676\n小妖精你\t70677\nsossoocciiiis\t70678\n小松\t70679\n短消息\t70680\n商榷\t70681\n痣\t70682\n美悦乎\t70683\n武校\t70684\n观潮\t70685\n薹匿\t70686\n你好呀度秘我爱你度秘\t70687\n找大一下\t70688\n痰\t70689\n高玩\t70690\n表爷\t70691\n妹妹好的好欧美美女你好美好美好贼贼贼贼\t70692\n左欣悦\t70693\n痹\t70694\n今天27号\t70695\nDRIDR\t70696\n三十几元\t70697\n十一课春选文\t70698\nyourgrandpa\t70699\n王秘书\t70700\n痂\t70701\ntf11\t70702\n呱唧\t70703\n2011年11月6日下午\t70704\n咪意思\t70705\n痈\t70706\n贺新皇冠\t70707\n痔\t70708\n姜晓刚\t70709\n为人民除害\t70710\nmystarloveis\t70711\n痒\t70712\n本来本来本来本来本来本来本来\t70713\n终老\t70714\n痞\t707
15\n丫面子\t70716\n痘\t70717\n水晶鞋\t70718\n1月9日\t70719\n那你是谁了你是谁\t70720\n爱鲜生\t70721\n云计算解决方案论坛\t70722\nPOPIN\t70723\n生命赛\t70724\ncsol\t70725\n雪汀\t70726\n9674647\t70727\n杀妻求将\t70728\n哭鼻子\t70729\n哈哈白\t70730\n吴湾\t70731\n水土\t70732\n好想吃\t70733\n吕生洋\t70734\n巴瓦\t70735\n毫不相干\t70736\n噜途锐\t70737\n给我肚肚\t70738\n振臂一呼\t70739\n真挚\t70740\n恶你我讨厌\t70741\n巿场\t70742\n想非礼\t70743\ndkk\t70744\n两只小狗\t70745\nTVB\t70746\n一饭\t70747\n舍财\t70748\nhzhdjb\t70749\n斜阳只片海蓝蓝\t70750\n只剩\t70751\n加拿大\t70752\n那故\t70753\n东立营\t70754\n七零八落\t70755\n笑不信\t70756\n盛某某\t70757\n五辑\t70758\n再见了晚安祝\t70759\n沒錯\t70760\n哈别\t70761\n凤临天下王妃十三岁\t70762\n哈利\t70763\n12CRV\t70764\n张文博\t70765\n别说我妻\t70766\n撤给\t70767\n4b2\t70768\n就是你的了我问你\t70769\n王董\t70770\n哈刚\t70771\n哈刘\t70772\nople\t70773\n会假\t70774\n趴窝\t70775\n爱师兄\t70776\n鸡柳\t70777\n隧道\t70778\n华筱琪\t70779\n很正断\t70780\noplk\t70781\n没用电\t70782\n嗯一六\t70783\ndcfff\t70784\n四门\t70785\n图图体育图图天\t70786\n别无聊\t70787\n初出\t70788\nubz\t70789\n谢风华\t70790\n吕轩轩\t70791\nukdy\t70792\n初几\t70793\n氨基酸\t70794\n2010年度\t70795\n四间\t70796\n贾兹拉\t70797\n我是你姐姐冰柔长公主\t70798\n榆木\t70799\n在子\t70800\n宝盖头\t70801\n中午11点14\t70802\n1234567990\t70803\ndsss\t70804\n这么说好不好\t70805\n桃子\t70806\n吐头\t70807\n赵博系\t70808\n袁岳\t70809\n梅西\t70810\n爱边\t70811\nvhjhm\t70812\n四一条\t70813\n洁雅静\t70814\n何处去\t70815\n咬一口\t70816\n异动\t70817\nhttpfhiphotosbaiducomxiaodupicitem3b292df5e0fe9925319c592533a85edf8db17136jpg\t70818\n徐沁淼\t70819\n鸟jj\t70820\n动饭\t70821\n吃吃饭\t70822\n帅啵\t70823\nhit48\t70824\n早知\t70825\n过天啵\t70826\n朵雅诺\t70827\nm太尔\t70828\n影人\t70829\n139com\t70830\n過陣子土豆塊夢\t70831\n木盆\t70832\n紫岚\t70833\n老毛脸\t70834\n西奥\t70835\n总裁风\t70836\n道奇\t70837\n90k\t70838\nxuys\t70839\n嫩模\t70840\n任意数\t70841\n45个\t70842\n鸟看见我了\t70843\n五个半小时\t70844\n费光\t70845\n大俗\t70846\ndsssf\t70847\n禾日\t70848\n网状\t70849\n81047371\t70850\n5565299\t70851\n超级大坏蛋超级大坏蛋\t70852\nYddviuevn\t70853\n神舟十号\t70854\n十多公里\t70855\n张芮嘉\t70856\n波长钱\t70857\n错不错\t70858\n北方地区\t70859\n口腔\t70860\n两小时\t70861\n乐亭中燃翔科\t70862\n狂喜\t70863\n古典乐个姑姑姑大图书店\t70864\n呼家楼站\t70865\n天舜帝\t70866\n你好朋友好朋友好朋友好朋友好朋友好朋友好不
\t70867\n5亿元\t70868\n45万\t70869\n夏嘞龙\t70870\n两小无\t70871\n啊堂\t70872\n2.25个\t70873\n核泄漏\t70874\n大修\t70875\n裙摆\t70876\ndiao屌\t70877\n15199363333\t70878\n草泥馬\t70879\n家财\t70880\n踏\t70881\n话亲\t70882\ngfv\t70883\n周祥梅\t70884\n踝\t70885\n睡不困\t70886\n踢\t70887\n全心\t70888\n毛巾\t70889\n17286835392\t70890\n话亚\t70891\n造型师\t70892\n踫\t70893\nhidk\t70894\n当阳\t70895\n踮\t70896\n纳尼\t70897\n嗝嗝嗝嗝\t70898\n梦晨\t70899\n踷\t70900\n恶心干呕\t70901\n话事\t70902\n7997\t70903\n踹\t70904\n照不着你\t70905\n疾风\t70906\n0613\t70907\n牛气\t70908\n科黄网\t70909\n牛氓\t70910\n嚏嚏\t70911\n唉我真的好喜欢\t70912\n0.46\t70913\neven\t70914\n福分\t70915\n暗中\t70916\n竖放\t70917\n海阔\t70918\n百品\t70919\n绯色异闻录\t70920\nBBOX\t70921\npianbi\t70922\n受寒\t70923\n郭维康\t70924\n这是为什么呢嗯\t70925\n139万\t70926\n当爱已成往事\t70927\njrbr\t70928\ndda144ad343a94b1f2d7a20cf431ad859fjpg\t70929\n我有十岁我的十一我的七岁\t70930\n阿米勒\t70931\n二百多岁\t70932\n柠檬仙\t70933\n豆子们\t70934\njrbd\t70935\n公益\t70936\n收费版\t70937\n给退\t70938\nS8600\t70939\n第三人称\t70940\n成衣工艺学\t70941\n钱浩然\t70942\n陈伯母\t70943\n百花齐放\t70944\nGroup\t70945\n抓开\t70946\n54884\t70947\n烟花易冷\t70948\n买进\t70949\n叶度秘\t70950\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒嘟嘟嘟嘟嘟嘟嘟嘟豆豆打豆豆\t70951\n不雅相\t70952\n柜员机\t70953\ntp\t70954\n重磅\t70955\n几十个\t70956\n挂帅\t70957\n营口\t70958\njwis\t70959\nzjaokd\t70960\n永垂\t70961\n区委书记\t70962\n说下雪\t70963\ntrreety\t70964\n黄嘉贤\t70965\nwfxdhcddfgxdyffxtygsthdyfdhhdgcgkkghdyjjdhhxggjdghcfyhhwybdgicfifewpkvv358206\t70966\n好啦原谅你啦待见晚安\t70967\n我了我问你\t70968\n风饕\t70969\n度秘奥特\t70970\ntm\t70971\n春梦\t70972\n几十万\t70973\n宅男\t70974\n为你买\t70975\n给我足交\t70976\n家坝\t70977\n天韩\t70978\nvpp\t70979\n家块\t70980\n微扁\t70981\n豌豆们\t70982\n反革命\t70983\n落班\t70984\n集装箱\t70985\nwaxfunk\t70986\n笔书\t70987\nvpi\t70988\n中小偷\t70989\n有如\t70990\n亨德利\t70991\n人群\t70992\n精神病类\t70993\n伯牙\t70994\n刀郎\t70995\n堡长母\t70996\n八一成语0031\t70997\n辽宁省政协\t70998\n游戏堡\t70999\nMIME\t71000\n海阳\t71001\n有妞\t71002\nImEngland\t71003\n星期四星期五\t71004\n折柳\t71005\n阳光道\t71006\n15186834646\t71007\n餐桌\t71008\n寂寞\t71009\n见不到\t71010\n胸蓉\t71011\n啊们\t71012\nlargecities\t71013\n13780184102\t71014\n迷失洼\t71015\n宝利来\t71016\n
裸考\t71017\n饿昏天\t71018\n洗濯\t71019\n34343344334343443\t71020\n唐王维一\t71021\n黄嘎嘎嘎嘎嘎嘎嘎\t71022\n释大师\t71023\n寂寥\t71024\n秘秘度\t71025\n那只手\t71026\n气煞\t71027\n真寒心\t71028\n姑姑\t71029\n姑姐\t71030\n没源\t71031\nFXY\t71032\n嫵矮鬟\t71033\n一七号\t71034\n3003万3e\t71035\n嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯\t71036\n冯婉莹\t71037\n张婷\t71038\n虐心火箭\t71039\n哎哟\t71040\n景博\t71041\n反锁\t71042\n拖尊\t71043\n乌药水果\t71044\n一个一百个\t71045\n报销\t71046\n骗不了我我告诉你\t71047\n品种\t71048\n张婉\t71049\n海龟医院\t71050\n抽风\t71051\n东明在家的话\t71052\n3200个\t71053\ngidn\t71054\n0ran9e\t71055\n时至\t71056\n值班长硬\t71057\n太污\t71058\n咪先生\t71059\n报业\t71060\n谦虛\t71061\n谦虚\t71062\n硫银耳\t71063\n欧诺拖\t71064\n發現\t71065\n婴童\t71066\n40%\t71067\n等你一起来\t71068\nunzipped\t71069\n高配\t71070\n粘胶短纤\t71071\nidjej\t71072\n405\t71073\n404\t71074\n杨云杰\t71075\n402\t71076\n多得指教\t71077\n400\t71078\n表酱\t71079\nxCx\t71080\n包个希尔哈撒韦\t71081\n409\t71082\n408\t71083\n40G\t71084\n撤\t71085\n较真是康\t71086\nhHGH\t71087\n撩\t71088\nmopqr\t71089\n胶情\t71090\n清掉\t71091\n撬\t71092\n乡里\t71093\n矽叶\t71094\n方面儿\t71095\n爱速速\t71096\n栅栅\t71097\n勒老师\t71098\n撵\t71099\n以前\t71100\n撸\t71101\n一碗粥\t71102\n栅栏\t71103\n顾理\t71104\n40g\t71105\n蓝夜陵\t71106\n撇\t71107\n簇簇\t71108\n撅\t71109\n回民\t71110\n你好可爱哦小度秘我突然发现我爱上你了\t71111\n沙坪\t71112\n顾琳\t71113\n私护\t71114\n40k\t71115\n袁浙钦\t71116\nFour\t71117\n撒\t71118\n撑\t71119\n翡翠\t71120\n撕\t71121\n护士节\t71122\n非竹\t71123\n撘\t71124\n撞\t71125\n世界大战\t71126\nduyy\t71127\n笨蛋\t71128\n刘备\t71129\npby\t71130\n门肃法师\t71131\n呼吸系统\t71132\n如止水\t71133\nspa机\t71134\n卡小王\t71135\n大黑美妞\t71136\n太聪明\t71137\njhhjjjh\t71138\n7处\t71139\n在校\t71140\n好了睡觉吧\t71141\n广告曲\t71142\n很孤独\t71143\n刘大\t71144\n刘天\t71145\n她们的天\t71146\n咕噜咕噜菜\t71147\n158871263\t71148\n狼头\t71149\n刘头\t71150\n灿烈\t71151\n课语\t71152\n指标\t71153\nduyf\t71154\n几GV\t71155\n灿烂\t71156\n孙大雨\t71157\n罗永清\t71158\n对呀长\t71159\n骨康膜\t71160\n锦标赛\t71161\n佛光\t71162\n8月4日晚9点\t71163\n一一百兆\t71164\nW\t71165\n数千万次\t71166\n渝客\t71167\n自我赶脚\t71168\n不配管\t71169\n两箱\t71170\n全有\t71171\n20多排\t71172\njagahk\t71173\n赶车\t71174\n十四十二\t71175\n佛教张\t71176\n一二四七九\t71177\n聊
聊天行\t71178\n被虐\t71179\n妞笑\t71180\n491.22\t71181\n问你是谁呀我真的主人\t71182\n全本\t71183\n阿喔\t71184\n国王队\t71185\n总高\t71186\n盗链\t71187\n死假\t71188\n清原县满族\t71189\n吃罢不好\t71190\n嘴里\t71191\njajatatta\t71192\n紫仓\t71193\n嘿女\t71194\n张贤俊\t71195\n渔令\t71196\na吧九\t71197\n1560476265785\t71198\n无照\t71199\n黑头发\t71200\n菠菜汁\t71201\n校友们\t71202\n二月一号\t71203\n一百二十页\t71204\n乐乐酱\t71205\n本息\t71206\n随色\t71207\n嗯零度\t71208\n李囿雕\t71209\n12米\t71210\n水平导电轨道\t71211\n二十六块\t71212\n日暖\t71213\nG6分\t71214\ntomation\t71215\n家三千金\t71216\n用点\t71217\n满觉\t71218\n一盆\t71219\n五马\t71220\ndadn\t71221\nhrifuf\t71222\n.com\t71223\n除虫菊\t71224\n讲堂\t71225\n考零\t71226\nsgjfjgkv\t71227\n13次\t71228\n俩年\t71229\n语文\t71230\n艳羡\t71231\n朱丽娜\t71232\n28曲\t71233\n韩正\t71234\nfinal\t71235\n保护性\t71236\n138290\t71237\n水电费\t71238\n马佳冉\t71239\nweiombsq\t71240\n走一发\t71241\n剧型\t71242\nyourtcc\t71243\n蟑螂药\t71244\ngggfcffhf\t71245\n爬上来\t71246\nvueihs\t71247\n35公斤\t71248\n80棵\t71249\nVampire\t71250\nqqq币\t71251\nGoodafternoon\t71252\n浅月\t71253\n衣杆\t71254\n沐槿\t71255\n浣熊\t71256\n钢铁\t71257\n要钱\t71258\n蛊惑\t71259\n指责\t71260\n走穴\t71261\n乖小喵\t71262\n杨浩南\t71263\n裕昌楼\t71264\njdkxkd\t71265\n唔单\t71266\n495745648\t71267\n比亚达\t71268\nnnnn喔喔喔喔nnnnn\t71269\n葡萄味\t71270\n三花钱\t71271\n堂堂男淫\t71272\n样纸\t71273\nVTM们\t71274\n尼格里\t71275\n浑然\t71276\n心悦俱乐部\t71277\n楚楚楚楚楚楚\t71278\n钢圈\t71279\n气喘\t71280\n父子\t71281\n旅顺南路亿达普罗旺斯西\t71282\nJbg\t71283\n南京机场\t71284\n慕秘\t71285\nMinecraft\t71286\n克立宁\t71287\n目瞪口呆\t71288\n跑吧兄弟第三季\t71289\n陈世美\t71290\n你是我旗袍\t71291\n比不上\t71292\n快点给我你的吻\t71293\n创捷片\t71294\n英米\t71295\n曹芳\t71296\n丰田霸道车\t71297\n本性生活\t71298\n大哥我美佘欧诗漫\t71299\n枸狗\t71300\n奇安特幂有的一下两个好参p\t71301\n处处处\t71302\n嗯上在下下在上不可在上且宜在下猜一字\t71303\n缩胸\t71304\ndgadt\t71305\n梁再冰\t71306\n8v8\t71307\n帅真\t71308\n啵啵啵啵啵啵啵啵啵啵啵\t71309\nhhhhhhhhhhhhhhhhhhhhhhhhuyte3eerdfgiiuffn\t71310\ntufucg\t71311\n命短\t71312\n北师熊熊\t71313\n粉刷匠\t71314\n曾梦想仗剑走天涯\t71315\n和敏\t71316\n上午9：30\t71317\n顾及\t71318\n度秘老怪一\t71319\n一向我\t71320\n篆刻\t71321\n蜀山拳法\t71322\n你为我着迷\t71323\n老杜\t71324\n好帅你好帅你好\t71325\n七毛三\t71326\n锦州\t71327\n安讨\t7132
8\nsabi\t71329\n福利房\t71330\n民币\t71331\n12579\t71332\n潦公路\t71333\nSIGX\t71334\n迈瑞\t71335\n30多天\t71336\n痛殴\t71337\n装甲\t71338\n老板\t71339\n老杰\t71340\n贵冠\t71341\n65千米\t71342\n冬麦区\t71343\n迷失\t71344\n眼泪男\t71345\n央企\t71346\nt6gufhgb\t71347\nkligggaa\t71348\nasimilar\t71349\n这么难\t71350\n开槽\t71351\n兴建\t71352\n丁尔\t71353\n3d021\t71354\n屁鼓\t71355\n皎月晦明灯花处台媒初红莲风袖烟那为谁舞回顾蓦然步转青石鹿晗在水榭畔画楼处\t71356\n绰然\t71357\n玉儿\t71358\nahherchiti\t71359\n贵贱\t71360\n99999999999999999999hhhhhhhhhhhh\t71361\n笨不笨\t71362\n蛤路虎坡苦特里阿布跨垮裤八路\t71363\n成一说\t71364\n干嘛糊涂\t71365\n灵岩山\t71366\n呆瓜行不行\t71367\n小麻\t71368\n杨易\t71369\n刁丝\t71370\n刷卡机\t71371\n林蓉\t71372\n捏糖\t71373\nRock\t71374\n河北\t71375\n和不和\t71376\n54321一二三四五六七八九十\t71377\n韦敏\t71378\n维度\t71379\n喷水\t71380\n城北\t71381\n谭松韵\t71382\n50年后\t71383\n撇弃\t71384\n大麦饭\t71385\ntifef\t71386\n野狼\t71387\n量错\t71388\n周群之\t71389\n林文金林\t71390\n乐一乐\t71391\n南沙群岛\t71392\n703\t71393\nChensiqiao\t71394\n慧聪明\t71395\n野狗\t71396\n卓金杰\t71397\n提钱\t71398\n猪吗度\t71399\n安在\t71400\n承a龙\t71401\n去哪瘁\t71402\n95936\t71403\n再见不对\t71404\n合体\t71405\n201386\t71406\n姓贼\t71407\n衰减\t71408\n陈玉\t71409\n丑胖\t71410\n罗姑泉\t71411\n合作\t71412\n女乘客\t71413\ndiddidd\t71414\n说好啦\t71415\n188112265571572\t71416\n150天\t71417\n商朝\t71418\n心书\t71419\n牧秦\t71420\n日语\t71421\n嗯五成\t71422\n21120\t71423\n要不原谅我你可以\t71424\n来世\t71425\n峥嵘\t71426\n区嘉仪\t71427\n心乱\t71428\n顾来\t71429\n七里香\t71430\n就是你的你电脑你到哪\t71431\n云中\t71432\n林平之\t71433\n地铁15号线国展站\t71434\n顾村\t71435\n取去\t71436\n实沈\t71437\n商机\t71438\n听不厌\t71439\n胶粘\t71440\n沼泽\t71441\n第八部\t71442\n来临\t71443\n50.332公里\t71444\n达込𠂇乀\t71445\n心乐\t71446\n口吻\t71447\n类口\t71448\n喇啦\t71449\n耳挂式\t71450\n514485\t71451\n最爱的歌\t71452\n被遗忘\t71453\n4846730\t71454\n黄光裕\t71455\n胖迪\t71456\n金行\t71457\n表面化\t71458\n中学\t71459\n乌兰木伦镇\t71460\n奥卡兹\t71461\n500300200\t71462\n潍坊市\t71463\n够潮\t71464\n进行时\t71465\n金表\t71466\n800万和八万\t71467\n泰焕\t71468\n金衢\t71469\n小受文\t71470\ngave\t71471\nhijklma\t71472\n商业步行街\t71473\n904\t71474\n嘎嘎嘎嘎腹股沟\t71475\n影踪\t71476\n冷漠\t71477\nwdfg\t71478\n该i\t71479\n德外大街马甸桥玫瑰公园\t71480\n三两套\t71481\n陈玲\t71482
\n成王败寇\t71483\n效率低下\t71484\n小番\t71485\n用心不良\t71486\nhi你好我的小秘书度秘\t71487\n王俊杰\t71488\n夏天天刚\t71489\n冰川\t71490\nlljjmj\t71491\n凉拌青瓜\t71492\n别说这样的话\t71493\ngjagagda0\t71494\n佛脚\t71495\n狮跑\t71496\n二十大道\t71497\n节公祠\t71498\nE04\t71499\n彻查\t71500\n长线\t71501\n好差\t71502\n长红\t71503\n久石让\t71504\ndychgguhjhhjgnghghvjhghjhhghhnfgghgghghnghghghghghghghhgdynhhhhh\t71505\n17块\t71506\n好巧\t71507\n王梦琪\t71508\n大坏蛋大腹黑\t71509\n二点零七\t71510\n瑞萝卜馅\t71511\n八零\t71512\n我办你妈了逼我\t71513\n动漫度秘我问你的问题\t71514\na8833\t71515\n财粮\t71516\n密度女\t71517\n张说句\t71518\n蓝根\t71519\n无家者\t71520\n不干刚好\t71521\nyyyyyu\t71522\n怕我真的不懂\t71523\n说不过来\t71524\n王三运\t71525\n柠桉\t71526\n赵氏\t71527\n雨声\t71528\n孔亩\t71529\n肥牛\t71530\n平怀春\t71531\n于舰坤\t71532\n搞不爽\t71533\n万斯分钟\t71534\n港九火车站\t71535\n教育机构\t71536\n岩手\t71537\n国家图书馆\t71538\n天籁\t71539\n989898988989988988888\t71540\n分子\t71541\n开部\t71542\n真的见笑\t71543\n说什么你说\t71544\n溢长\t71545\n好雨\t71546\n心头上\t71547\n15937\t71548\n装疯卖傻\t71549\n8635\t71550\n扯破\t71551\n心腊\t71552\n350元\t71553\n承租者\t71554\n3D打印技术\t71555\n掏钱\t71556\n好丑我讨厌\t71557\nh30t10\t71558\n红河哈尼族彝族自治州\t71559\n蒋素娟\t71560\n纵情燃烧\t71561\nHeart\t71562\n捡垃圾\t71563\n快乐乐\t71564\nrdtdhf\t71565\n抱抱住\t71566\n三幺五富力城\t71567\nhttphhiphotosbaiducomxiaodupicitemd8f9d72a6059252deb252371339b033b5bb5b990jpg\t71568\n二百一七\t71569\n十排名\t71570\navatar\t71571\nhttpimagebaiducomsearchwisealatnwisealaieutf8word%E99CB2E782B9\t71572\n缴款\t71573\n心腹\t71574\n皮皮皮\t71575\nsyTZ\t71576\n吉尔\t71577\n不吗\t71578\n小顺\t71579\n小项\t71580\n小顾\t71581\n不吝\t71582\n鼓二小站\t71583\n第几天\t71584\n猜森\t71585\n不合\t71586\nEcho\t71587\n不同\t71588\n男友们\t71589\n分辨率\t71590\n八六八三七零零八呃\t71591\n快点要不然老子揍你\t71592\n不吵\t71593\n当地居民\t71594\n无国界\t71595\n张健放\t71596\n2180度\t71597\n58同城\t71598\n不吧\t71599\n不吨\t71600\n新的开始\t71601\n不含\t71602\n不听\t71603\n春夜六六\t71604\n杭椒牛柳\t71605\nbì\t71606\n家用化\t71607\n20万亿立方英尺\t71608\n封口费\t71609\n别心塞\t71610\n喂鸡\t71611\n薛守光\t71612\n新妈妈\t71613\n孤家寡人\t71614\nawq\t71615\n匿名漫\t71616\n操控者\t71617\n恶血梦想\t71618\n不好呀\t71619\n星星\t71620\n外国语\t71621\nanglababy\t71622\n宽窄巷子\t71623\n指令\t71624
\n暖身茶\t71625\n挨饿\t71626\n刻苦\t71627\n95268745687666\t71628\n劈头盖脸\t71629\n因为我的妈妈\t71630\n这是我的心的你不能号晚上\t71631\n五排\t71632\n效法\t71633\n益气\t71634\n杨家湾\t71635\npyl\t71636\nnsnksodd\t71637\n猜我告\t71638\n减法\t71639\n南郊小寨\t71640\n宝珠\t71641\n啊求你了求你了求你了求你了求你了你看我\t71642\nherirg\t71643\n水车薪\t71644\n采收\t71645\n来了敲\t71646\n西汉\t71647\n剛\t71648\n2哈\t71649\n不过关\t71650\n今宵佳诗12\t71651\n西江\t71652\n孤魂\t71653\n3年前\t71654\n史冰彦\t71655\n格非\t71656\n六单元\t71657\n人力资源和社会保障部\t71658\n黄基芹\t71659\n晚六点半\t71660\n千万千万千万千万不要和我的家河马与灰\t71661\n千秋\t71662\n疑点\t71663\nawa\t71664\n笑了饿\t71665\n文页\t71666\n馨子\t71667\n千秀\t71668\n首侠客行\t71669\n嗯铠甲勇士捕\t71670\njdjqg\t71671\n超级大抽奖\t71672\n内地场\t71673\n钉子\t71674\nGMG\t71675\n借比\t71676\n天翼3G\t71677\n焦炉宠\t71678\nfrgd\t71679\n10根\t71680\n徐露露\t71681\n固安县\t71682\nGMT\t71683\n蛮爽\t71684\n被撞断\t71685\n弓弩\t71686\n杂乱\t71687\n大起大落大喜\t71688\n白石岩\t71689\n3年之内\t71690\n遗忘我曾经\t71691\n一三六十十五\t71692\n兄弟说\t71693\n一段话\t71694\n川藏线\t71695\n芫荽\t71696\n你你你你你你你你你你你\t71697\nclcl3\t71698\n自缘\t71699\n咋晚\t71700\n自缚\t71701\n姓顾\t71702\n一一下下\t71703\nbbudhen\t71704\n我爱你我爱你\t71705\n机器人性\t71706\n张雪飞\t71707\n自缕\t71708\n自编\t71709\n小乖乖小乖乖\t71710\n大亮山\t71711\nO丈\t71712\n多妹\t71713\n对啊就\t71714\n闭幕式\t71715\n喜一想\t71716\nabcd式\t71717\n荼靡\t71718\n近四个小时\t71719\n一二烧鸭\t71720\n亚明确实\t71721\n男偶像\t71722\n我的妹妹是谁你猜猜\t71723\n三二五三二一二八二五二四一\t71724\n刮炮\t71725\n戴诗茜\t71726\n销量\t71727\n航天博览会\t71728\n海涵\t71729\n资源\t71730\n正反面\t71731\n保诺\t71732\n紫藤萝\t71733\niririririririririririri\t71734\n吖错\t71735\n自信自己\t71736\n资溪\t71737\n七三撒\t71738\n芝兰\t71739\n艾克莉沙\t71740\n本身\t71741\n冒泡米\t71742\n问好呀死\t71743\n洒满\t71744\n海涛\t71745\n三十块\t71746\n国库股\t71747\n伤弦\t71748\n奥尔沙恩斯基\t71749\n非分之想\t71750\n一穷小子\t71751\n方冰冰\t71752\n30秒\t71753\n30种\t71754\n退学\t71755\n说了我不会\t71756\n我要芭比成仙\t71757\nforinarticle\t71758\n克里姆林宫\t71759\n抵受\t71760\nplplpl\t71761\n推动者\t71762\n苏武斌\t71763\n零二二座机\t71764\n嘿嘿美\t71765\n慢条\t71766\n胃镜室\t71767\n慢来\t71768\n远铸\t71769\n微光\t71770\n罗比\t71771\n红热\t71772\n中国国际科技会展中心\t71773\n你是好人我乃何人\t71774\n朴子\t71775\n糸幺囗\t71776\n768397\t71777\n廊坊市区\t71778\n1
43角\t71779\n美日欧\t71780\n家水\t71781\n邀请函\t71782\n借读\t71783\n校舍\t71784\n笑嘻嘻\t71785\n不劲\t71786\n耳环\t71787\n鬼影\t71788\n花女神\t71789\n志凯\t71790\nIjg\t71791\n星传媒体\t71792\n靳言\t71793\n南城\t71794\n看题\t71795\nfuvyjxc\t71796\n菲之嚣张与美菲\t71797\n朴爷\t71798\nfhbfh\t71799\nvcngufjg\t71800\n不接电话\t71801\nyoyoyo\t71802\n单向友善\t71803\n679ey\t71804\nfratcn\t71805\n九十四元\t71806\n牧场\t71807\n372525458\t71808\n职权\t71809\n不帅块\t71810\n美国梦工厂动画公司\t71811\n圣约翰教堂\t71812\n好啦好啦晚安宝贝儿\t71813\n黑色大丽花\t71814\n王清雨\t71815\n中百超市\t71816\n拉索\t71817\n敏月\t71818\n王多海\t71819\n唐玉龙\t71820\n我的助理我的秘书\t71821\n壮\t71822\n壬\t71823\n士\t71824\n欣欣你的主人快快快\t71825\n壶\t71826\n第三场\t71827\n壳\t71828\n声\t71829\n北街小学\t71830\n壾\t71831\n旁路\t71832\njjhgff\t71833\n壹\t71834\n600多\t71835\n齐文利\t71836\n霸龙\t71837\n个人\t71838\n光明你\t71839\n18814547386\t71840\n南京火车站\t71841\nYous\t71842\nYour\t71843\n壕\t71844\n客气牧\t71845\n永远不认\t71846\n壞\t71847\n钱塘江\t71848\ntoourcity\t71849\n八座\t71850\njiushinidemingzi\t71851\n两三毫米\t71852\n填上\t71853\n赌鬼\t71854\n家门口\t71855\n小媳妇\t71856\n双顶\t71857\n牛若\t71858\n重新再来\t71859\n#咪咕歌曲##你是我的眼\t71860\npixi\t71861\n讷河市\t71862\n咪戏\t71863\n王柯淞\t71864\n谭老师\t71865\nAC米兰\t71866\n1月25\t71867\ndin推撞\t71868\n1月26\t71869\n人儿儿\t71870\n1月23\t71871\n1月22\t71872\n爱不爱我给你说话你\t71873\n1月29\t71874\n1月28\t71875\n22en\t71876\n日大\t71877\n睁大你的狗眼\t71878\n你是我的小小苹果\t71879\n2555点儿五\t71880\n木嘛木嘛木嘛木嘛木嘛木\t71881\n我的宠物太可爱了我就是喜欢你就是喜欢你\t71882\n主人妹妹\t71883\n拇指\t71884\n电信\t71885\n发噶\t71886\n遗照\t71887\n赫尔\t71888\n别礼拜\t71889\n几千倍\t71890\nOpera\t71891\n捉不住\t71892\n拜拜堡\t71893\n双项\t71894\n远交易\t71895\n偶想吃\t71896\n枫夕月\t71897\n黑龙江省哈尔滨市\t71898\nfugjj\t71899\n杜鲁门\t71900\n日夜\t71901\n俩队\t71902\n1小时\t71903\n第10位\t71904\n张黄琴\t71905\n]形\t71906\n三点十五分\t71907\n拜相\t71908\n住宅用地\t71909\n朱总理\t71910\n十二八十一八十二\t71911\n三重门\t71912\n慢下来\t71913\n赤港农场\t71914\n顿号\t71915\n那你找你的女友吧\t71916\n我也看\t71917\n马德里伯纳乌球场\t71918\n鸡窝\t71919\nmgc\t71920\n大公报》刊文\t71921\nwlaozi\t71922\nmgb\t71923\n晨练\t71924\n涂出\t71925\n桉树\t71926\n大少爷\t71927\n赛磊哥\t71928\n六七个\t71929\n03:22\t71930\n马屁\t71931\n对牛\t71932\n肝炎\t71933\n听神关
\t71934\n8522.7亿元\t71935\ndgxd\t71936\n麦饭麦饭s\t71937\n烤鸭\t71938\n脱口而出\t71939\n0尺\t71940\n24四大皆空25德高望重26四脚朝天27三言两语28入木三分29扬眉吐气30比翼双飞31正中下怀32举一反三33马失前蹄\t71941\nduml\t71942\n沈半仙\t71943\n女同胞们\t71944\nVPN\t71945\n烤鸡\t71946\n零三八二\t71947\nTang\t71948\n水稻\t71949\n威思克\t71950\n麻疹\t71951\ngo，go\t71952\n4500\t71953\n计程车\t71954\nJXJ\t71955\n第三根\t71956\n头粉\t71957\n好愁我真\t71958\n禾苗\t71959\n滴彩民\t71960\n时候理\t71961\n审计署\t71962\n贫富\t71963\n贫寒\t71964\n王菲莫\t71965\n恋恋\t71966\n怪不着\t71967\n周叉叉\t71968\n仿照\t71969\n张好真\t71970\n看我过\t71971\n亚视\t71972\ntf卜一四\t71973\ndukd\t71974\n张柔柔\t71975\n憨豆\t71976\n是你的主人你是我的助手\t71977\n好期\t71978\n死克郎\t71979\n好服\t71980\n尿毒\t71981\n队形\t71982\njulum\t71983\n农觉\t71984\noieir\t71985\n12578957788\t71986\n竹溪\t71987\n肚子疼\t71988\n望山\t71989\n好机\t71990\nppk\t71991\nJggkkg\t71992\n同岁\t71993\n哇六\t71994\n坐地铁\t71995\n雅直\t71996\n三三米\t71997\n汽车之家论坛\t71998\n方程组\t71999\n海滨公路\t72000\n信仰者\t72001\n陈不来\t72002\n70种\t72003\n甘于\t72004\n经开五区\t72005\n本日\t72006\n陈星\t72007\n点点滴滴躲躲藏藏v\t72008\n郑小庆\t72009\nppl\t72010\n浅入深\t72011\n景类\t72012\n块面\t72013\nnbam\t72014\n巴洛特\t72015\n刘师合群\t72016\n找你这样的话\t72017\ncregub\t72018\n博华陶瓷\t72019\n风速玫瑰\t72020\n五十支\t72021\n梧州\t72022\n好吧酱紫\t72023\n十多米高\t72024\n山径\t72025\n薄饼\t72026\n空客A380\t72027\n11张\t72028\n性无能\t72029\n度度晚安\t72030\njjsi\t72031\n1314521\t72032\njjsj\t72033\n王奕博\t72034\n议会\t72035\n王扇子\t72036\n嗯郭橐驼\t72037\njjsx\t72038\nararol\t72039\n尖靴\t72040\n123455\t72041\njjss\t72042\n123456\t72043\n台停\t72044\n加分\t72045\n没呢奥堡ok\t72046\n基层\t72047\n经济部\t72048\n张机器人\t72049\n80块\t72050\n老虎不发威你当我是卡拉ｏｋ\t72051\n四五万\t72052\n舞狮\t72053\n大公交\t72054\n半生\t72055\n李大红\t72056\n打辩论\t72057\n一百六十二\t72058\n喵再\t72059\n奇花异草\t72060\n不骗你的我是好人哈哈哈哈你懂你\t72061\n文斯\t72062\n乛乛乛問\t72063\n火机\t72064\n酽\t72065\n12345h\t72066\n酿\t72067\n酱\t72068\n甘蔗\t72069\n行不不不\t72070\nevdrrf\t72071\n623克\t72072\n酶\t72073\njwH\t72074\n火木\t72075\n去找你\t72076\n酬\t72077\n酮\t72078\n骨干\t72079\n酥\t72080\n被抢劫\t72081\n丈母娘\t72082\n周昌迅\t72083\n税务局\t72084\njwy\t72085\n骨鸡\t72086\n酚\t72087\ntury\t72088\n文文\t72089\n度秘棒棒哒\t720
90\njwr\t72091\n酒\t72092\njwt\t72093\n打底衫\t72094\njwv\t72095\njww\t72096\n进酒\t72097\njwi\t72098\njwj\t72099\n王李星\t72100\n配\t72101\n张家楠\t72102\njwn\t72103\n汤小米\t72104\njwb\t72105\n天鹅天鹅\t72106\njwg\t72107\n这端\t72108\n顿生\t72109\njbbhhhhhyy\t72110\n锡纸\t72111\n法律\t72112\n叶花生\t72113\n忧劳\t72114\n赵姐\t72115\n讨厌的话\t72116\n外度\t72117\n度秘你给我记住我讨厌你我恨你\t72118\n一爪不木孑下泣\t72119\n八十兆\t72120\n突击革\t72121\n八十元\t72122\n灵犀助手\t72123\n彪哥\t72124\n防航班\t72125\n同手同脚\t72126\n再见我不想\t72127\n哪类\t72128\n喜一一切\t72129\n八十八\t72130\n马克·扎克伯格\t72131\n999岁\t72132\n浓妆艳抹\t72133\n法徒\t72134\n汉伯顿\t72135\n两百185点\t72136\n不要你了你想\t72137\n20倍\t72138\n歌声\t72139\n好奇怪\t72140\n调并\t72141\n刘霁萱\t72142\nallofusifindimp2tangtolearnengishwelyl\t72143\nhhghhh\t72144\n东邪\t72145\nfyj\t72146\n东邦\t72147\n子华\t72148\n超级情\t72149\n死缠烂打\t72150\n大喊大叫好\t72151\n千尺\t72152\n抱秘\t72153\n脚膜\t72154\n88859\t72155\n你头这么大我头\t72156\n汝头\t72157\n独一无\t72158\n势力\t72159\n收银\t72160\n胡冰卿\t72161\n陆xdgg\t72162\n13671393689\t72163\njxcjb\t72164\n夜话\t72165\n孙辈\t72166\n丧苺比\t72167\n咯咪\t72168\n跟他走\t72169\n咯咯\t72170\n一三千万500\t72171\n感召力\t72172\n挠一挠\t72173\n我的贴心小帮手萌萌哒\t72174\n署数\t72175\nihjgjgufh\t72176\n髮型\t72177\n刘星辰\t72178\n选举法\t72179\n一不人\t72180\n站岗\t72181\n一个礼拜\t72182\n茫然\t72183\n度娘百科\t72184\n换回\t72185\n里桥\t72186\nsure\t72187\negfgffhh\t72188\n靳珩\t72189\n闽南人\t72190\n陈士渠\t72191\n墨尔本\t72192\n边关\t72193\n鼎泰丰田\t72194\n北京奔驰\t72195\n昨天三点\t72196\nhuhjhj\t72197\n彭萌萌\t72198\n哥伦比亚广播公司\t72199\n454575857558681555414157576555858286438373469668281762367205669985669966654428571234567895355855520\t72200\n第二节\t72201\n65句\t72202\n排拍\t72203\n六三点六\t72204\n冯田仪\t72205\n真功夫\t72206\n模型\t72207\n安乐死\t72208\n今天上午\t72209\n娜么快乐\t72210\n191568\t72211\n荣昊\t72212\n半几句\t72213\n秘版\t72214\n陈思宇\t72215\ndsdsegfdgfg\t72216\nbvo\t72217\n陈墨豪\t72218\n薄荷\t72219\n所以说再见\t72220\n中旬\t72221\n3百\t72222\n沙漠沙漠\t72223\n葱葱\t72224\n凯妃\t72225\n丹山\t72226\n中日\t72227\n阳台\t72228\nhi奶奶\t72229\n古力扎\t72230\n天蓝色\t72231\n小免兔崽子\t72232\n中旗\t72233\n半了半了半了半了半了别了别了别了wwwwww\t72234\n嗯乌斯玛\t72235\n防疫\t72236\n集中营\t72237\n506集\t72238\
n这句点\t72239\n四川佛教协会\t72240\n李汉龙\t72241\n一当\t72242\n160元\t72243\n61页\t72244\n动画片儿\t72245\n断定\t72246\n宗教信念\t72247\n大地震\t72248\n香槟瓶\t72249\n眼下\t72250\n眼上\t72251\n花姑\t72252\n花姐\t72253\n\t72254\n闫瑞\t72255\n断袖\t72256\n胡晓蒙\t72257\ntfbras\t72258\n212929\t72259\nceHecshoc\t72260\n溶解\t72261\n5588mp4\t72262\n眼中\t72263\n仇雯\t72264\n\t72265\n仇雪\t72266\ntkkttdukd\t72267\n特仑苏\t72268\n吃完\t72269\nvvcgfhc\t72270\nIP99\t72271\n100多块\t72272\nu文山\t72273\n49996\t72274\n千人组\t72275\n无生无\t72276\nn18\t72277\nn19\t72278\n王连连\t72279\n小雄\t72280\n小雅\t72281\n舍利子\t72282\n下植物大战僵尸\t72283\n东门太阳广场\t72284\n奥呗\t72285\n好啦陪我玩会\t72286\n战胜过多么\t72287\nduguf\t72288\n小雷\t72289\n漫画类\t72290\n负担\t72291\n2011年1月31日\t72292\n古个\t72293\n柯蓝\t72294\n佛掌\t72295\n还是\t72296\n想不想念\t72297\n武战队\t72298\n古丽\t72299\n小雨\t72300\n小雪\t72301\n1154453220\t72302\nYKC\t72303\n嫁妆\t72304\n十八天后\t72305\n音控\t72306\nGareth\t72307\n6638258\t72308\n演艺界\t72309\n美行青蒿\t72310\n包容\t72311\n没精打彩\t72312\n包宿\t72313\n史迪奇\t72314\n不好说出\t72315\n還說\t72316\n21cc\t72317\n付礼\t72318\n堡坎\t72319\n李文欣\t72320\n刘赢蒲\t72321\n黄宇航\t72322\nsIand\t72323\n拉卡拉\t72324\n舌尖\t72325\nHGHG\t72326\n怕过\t72327\n一拳\t72328\nj8569282855858\t72329\n血孤\t72330\n一拼\t72331\n四五排\t72332\n没话可说\t72333\n一拨\t72334\n盗斗\t72335\n蒙灯\t72336\n崇文门人保定损中心\t72337\n一拖\t72338\n新华社中国名牌杂志社\t72339\n雨淋\t72340\n不过你儿再说\t72341\n天堂书店\t72342\n一拜\t72343\n靠不问\t72344\n老不好看\t72345\n173次\t72346\n地宫\t72347\n孔强\t72348\n一担\t72349\n一拉\t72350\n我的生活\t72351\n赶上\t72352\n赶不\t72353\n最去\t72354\n名民\t72355\n挂菜\t72356\n654527631854542\t72357\n胜从前\t72358\n杰日\t72359\n李梦颖\t72360\n道歉门\t72361\n变形计\t72362\n冷却\t72363\n沤臾\t72364\n好痴累\t72365\n铁观音\t72366\n鹰牌陶瓷\t72367\n枝叶\t72368\n117％\t72369\n闺蜜\t72370\n王翌\t72371\n展会\t72372\n利益链\t72373\n灵车\t72374\n游船\t72375\n2872757755\t72376\n三万六千万\t72377\n根蛋\t72378\nqweruuggg\t72379\n01354个\t72380\nOpen\t72381\niPod\t72382\n宋向阳\t72383\n那么多啦\t72384\nbabyjy\t72385\nkhchc\t72386\n两物\t72387\n青光光光\t72388\n垃圾堆\t72389\n白头到老一甲子猜字谜\t72390\n接电话\t72391\n株洲晚报\t72392\n何哲军\t72393\n双星无敌版\t72394\n嘉欣逢喜诞麟儿\t72395\n熟睡\t72396\n174
5896886\t72397\n伤心系\t72398\nsence\t72399\n圣明月\t72400\n兵兵我喜欢兵兵\t72401\n颠倒是非\t72402\n过热\t72403\n罗宾森\t72404\n误读\t72405\n红米酒\t72406\nluoke\t72407\n汽压会\t72408\n吓死\t72409\n快点们\t72410\n柴盒\t72411\n每一天\t72412\n祁女士\t72413\n家警\t72414\n大雪纷飞\t72415\nKmnnjames\t72416\n四百九四百九490980\t72417\n海泰uck\t72418\n两单\t72419\n海思\t72420\n光你\t72421\n超级跳舞\t72422\n654563214477888996\t72423\nyhh\t72424\n副省长\t72425\nfffffffffffffffffffffff\t72426\n海怡\t72427\n溢价\t72428\n啦啦啦啦啦啦啦啦我是快乐的小行家\t72429\n海怪\t72430\nyhd\t72431\n欢年\t72432\n愿者\t72433\nyhs\t72434\n寻找爱的冒险\t72435\n6月28日\t72436\n警训\t72437\nyhu\t72438\n疯交叉\t72439\nutddghueed\t72440\n袁大伯\t72441\n王老板\t72442\n原子\t72443\n见光\t72444\n王老来\t72445\njrex\t72446\n房管家\t72447\n玩具店\t72448\n偶尔额\t72449\n99999999999999999999个\t72450\n伤力\t72451\n系列\t72452\n闲时\t72453\n陕西省教育科学研究所\t72454\n曼狗\t72455\n包涵\t72456\n44岁\t72457\n流口水\t72458\n近一周\t72459\n删毛\t72460\n风南本\t72461\n那蓟县\t72462\n傲然\t72463\n伤势\t72464\n刀光\t72465\n六百多块\t72466\n不晓\t72467\n3480\t72468\n傅博视网\t72469\n一根经的你\t72470\n不晚\t72471\nwalker\t72472\n70后\t72473\n蓝兔之\t72474\n蟹黄\t72475\n瞎走\t72476\n恩知拉\t72477\nwruvl\t72478\n济州岛\t72479\n遛弯\t72480\n陈鹤丹\t72481\n二月初三\t72482\n涨姿势\t72483\n二月初一\t72484\n魔难\t72485\n9901\t72486\n红十六\t72487\n猪猪猪就是你你就是猪猪猪\t72488\n灵修院\t72489\ngymg\t72490\n相接\t72491\n碰碰车\t72492\nhttpehiphotosbaiducomxiaodupicitemb8014a90f603738d4d0a2256b41bb051f819ec06jpg\t72493\n山卡拉\t72494\n么不倒\t72495\n诚信度\t72496\n耐泡\t72497\n娶定\t72498\n我是你女神不理你了=\t72499\nKup\t72500\n获奖\t72501\n哦酷我7两i9低谷期么事\t72502\n哭一哭\t72503\n新保\t72504\n害秘\t72505\n阿可口可可\t72506\n拉肚\t72507\n嗯人么\t72508\n48小时\t72509\n省旅游局\t72510\n孔乙己\t72511\n丑化\t72512\n蓝条君\t72513\n手工客\t72514\n昌盛\t72515\n刘嘉羽\t72516\n译制\t72517\n顶端\t72518\n敢情\t72519\n黄艺明\t72520\n廖师\t72521\n深井冰\t72522\n2600\t72523\n金泉庄\t72524\n傲视\t72525\n美女好多哇塞你好美\t72526\n冉歆航\t72527\nzbzbdbsb\t72528\n颅内骨折\t72529\nsesesesesesesese\t72530\n新亚比\t72531\n陪我看\t72532\n赵钱\t72533\n想和你好好说话\t72534\n停号\t72535\n陈美式\t72536\n363例\t72537\n萌不萌\t72538\n度秘你当我的女友\t72539\n唉喲\t72540\n生悲\t72541\n晚安度秘\t72542\nqrvel\t72543\n巴啦啦\t72544\
n民间舞\t72545\n江陵\t72546\n拉点\t72547\n担架\t72548\n韩中\t72549\n推荐量\t72550\n蔡三\t72551\n衣衣\t72552\n哈哈哈哈哈哈哈哈哈\t72553\n袁孟玲\t72554\n出否\t72555\n川蜀\t72556\n一嘴一醉\t72557\n韩丽\t72558\nPO\t72559\n奕者\t72560\n张幺儿\t72561\n7777777777777777777777777998999999999999999999999999999999999999999999999919191991999919911\t72562\n外加外加\t72563\n陈盛桐\t72564\n徐心蕾\t72565\n北林\t72566\n士士\t72567\n反解\t72568\n出名\t72569\n相我来\t72570\n六多咪咪\t72571\n蔡个\t72572\n对呀你不信\t72573\n好呀子了不要我唔\t72574\n果真\t72575\n给我的爱\t72576\n陈台豪\t72577\n大兴镇\t72578\nctgbdd\t72579\n衣衫\t72580\n安雅\t72581\n三十八块\t72582\n一二三四五六七八九十十一十二支\t72583\n牛妙妙\t72584\n红魔魔\t72585\nSeydou\t72586\nSM天门\t72587\n灿白党\t72588\n哈恩\t72589\n十七分\t72590\n图穷匕见\t72591\n度度斗\t72592\n也不呢\t72593\n肺腑之言\t72594\n虎妞\t72595\n驰骋\t72596\n踊跃\t72597\n驼背腰弯\t72598\n鼠绘\t72599\n物喻\t72600\n两个世界\t72601\n95055658\t72602\n蓄水\t72603\nGaGa\t72604\n汤姆有\t72605\n月出\t72606\nJjjjj\t72607\nBillboard\t72608\n鸡眼\t72609\nfuppc\t72610\n那偶们\t72611\n晚上两点钟\t72612\n请我愿\t72613\n任免\t72614\n秘奶\t72615\nhttpehiphotosbaiducomxiaodupicitem42a98226cffc1e179e185f434d90f603738de919jpg\t72616\n好不好好不好好不好\t72617\n广加\t72618\n刘俊瑞\t72619\n提请\t72620\n三个男孩\t72621\n1000万元\t72622\n十面埋伏\t72623\n鄞州\t72624\n西丰县第一小学\t72625\n王红红\t72626\n切换\t72627\nota\t72628\n秘奇\t72629\n团结一致\t72630\n效瑞\t72631\n嘴嘴\t72632\n漫美美\t72633\n俩一个群\t72634\n8n岁\t72635\n度你个头啊不不不不不不不不不不不不不\t72636\n从现在开始\t72637\njikhn\t72638\n球探\t72639\n赵老票\t72640\n机器女孩医德大战僵尸\t72641\n六沟\t72642\n本溪市检察院\t72643\n改贝\t72644\n一百58万\t72645\n危地马拉\t72646\n郭海英\t72647\n裁判们\t72648\n水疗\t72649\n头猪\t72650\n飒漫\t72651\n你你你我我我抽风了我抽风了不是处分\t72652\n5份\t72653\nwifh\t72654\nwifi\t72655\n说来着\t72656\n听者\t72657\n挽救\t72658\n让我摸摸\t72659\n泰麒威\t72660\n诱人女孩\t72661\n5代\t72662\n谢运红\t72663\n大润发大润发\t72664\n运球\t72665\n淡妆\t72666\n车坑\t72667\n金凤凰\t72668\n晴雨\t72669\n晴雪\t72670\n车坛\t72671\n边说\t72672\njcjc\t72673\n绯闻\t72674\n19集\t72675\n不骗你我真的不爱不好说\t72676\n三你不晓得我的徒弟子规弟子规上神性手下气四吉星大案奏\t72677\n同济大学\t72678\n卢丽雪\t72679\n赵同学\t72680\nott\t72681\n夏雪雪\t72682\n唱版\t72683\n曹方旭\t72684\n一百大牙\t72685\n赔长\t72686\n亚太地区\t72687\n巴掌巴\t72688\nee
xxoo\t72689\n三八幺\t72690\n立丹\t72691\n小蜜蜂价\t72692\n扭一扭\t72693\n额度秘\t72694\n随心\t72695\n笑就笑\t72696\n谢坤燕\t72697\n豆水\t72698\n1.5万亿元\t72699\n得恩赐\t72700\n浚县\t72701\n卫子涵\t72702\nCarly\t72703\n拖把\t72704\nlist\t72705\nlish\t72706\n一大只\t72707\njjinli\t72708\n恋恋不要你感冒威露氏年\t72709\n张秘\t72710\n一大口\t72711\n撸了\t72712\n同步助手\t72713\n黑黑呀\t72714\n台州\t72715\n成嘉祥\t72716\n观畴\t72717\n很有意思\t72718\n中国联\t72719\n缔造者\t72720\n拾衣\t72721\n3667\t72722\n布吉\t72723\n好星座\t72724\n刨食\t72725\n恬钥韫\t72726\n田应豪\t72727\n97个\t72728\n孔子在传戒三人行必有我师一\t72729\npgifuldkghxkhg\t72730\nQQ糖熊博士软糖\t72731\nhaaa\t72732\n秘我讨厌\t72733\n阿拉斯加\t72734\nhaad\t72735\n昂蓝色\t72736\n半箱\t72737\n足浴\t72738\n重脚轻哈哈哈哈哈\t72739\n奇事\t72740\n惊秫\t72741\nshoerack\t72742\neeweq\t72743\n12245635896\t72744\n检识\t72745\n来哪儿\t72746\n泼话没有真料实\t72747\n无问\t72748\n第十六任\t72749\n铁证如\t72750\n下大雪了一说\t72751\n无门\t72752\n细菌学家\t72753\n德纲\t72754\n155835565855085\t72755\n鳓\t72756\n鳖\t72757\n鬼都放假了我在我小姨家\t72758\nigkhigg\t72759\n追风筝的人\t72760\n鳍\t72761\n鳃\t72762\n林易峰\t72763\nruuoh\t72764\n小心情不好\t72765\n鳄\t72766\n在家不好玩\t72767\n东京大空袭JJCKDIOdo\t72768\n134264\t72769\n谁许谁许谁许\t72770\n大恸\t72771\n孙萌\t72772\ngoze\t72773\n露卡\t72774\n麦太人\t72775\n大恩\t72776\n飞机网\t72777\n听一有\t72778\n鳥\t72779\nSHE\t72780\n口感\t72781\n1646385844\t72782\n光椅\t72783\n霸道总裁\t72784\n口愛\t72785\n你好木\t72786\n吃不消\t72787\n25045687936145236978051254365241526378965486\t72788\n数组\t72789\n口意\t72790\n小致\t72791\n最高温\t72792\n参议员\t72793\n叽叽蹦\t72794\n很孤单\t72795\n六六美\t72796\n50f4bfbf\t72797\n重蹈\t72798\n要死光\t72799\n没了我死\t72800\n华晨\t72801\n克长\t72802\n恶兄可休闲类\t72803\n数少点\t72804\n别醉\t72805\n学试试\t72806\n吃好久\t72807\n分天\t72808\n兼顾\t72809\n奇数\t72810\n人那来妃u9unu奥特曼\t72811\n阿猫\t72812\n两天\t72813\nchiphotosbaiducomxiaodupicitemb21c8701a18b87d62d379a6a000828381f30fd49jpg\t72814\n山腰\t72815\n干劲\t72816\n玉珂\t72817\nflag\t72818\n如果你是\t72819\n真利益\t72820\n基训觉\t72821\n两头\t72822\n忘不忘\t72823\n回嘴\t72824\n157845538\t72825\n心有力夕\t72826\n不用来\t72827\n二锅头\t72828\n两处\t72829\njdjtjtj\t72830\n血路\t72831\n永波体\t72832\n两夜\t72833\n盯住\t72834\n磕磕绊绊\t72835\n无意中\t72836\n2
00集\t72837\n偈云\t72838\n拉拢\t72839\nuvpuunrykabkhkhlh\t72840\n战英\t72841\n区妇\t72842\n章节\t72843\n就语文学\t72844\n早泄\t72845\n100名\t72846\n走近科学\t72847\n好笑度\t72848\n小额贷款公司\t72849\n路子\t72850\nq张纸\t72851\n四盲鼠\t72852\n惊爆\t72853\n阿吉格里\t72854\n玉珑\t72855\n李雪波\t72856\n拉拉\t72857\n凤剧\t72858\nlh7\t72859\n树钦\t72860\n想你了翠\t72861\nusjdy\t72862\n谢啦晚安\t72863\n秘然\t72864\n你好稿\t72865\nPES\t72866\n風景\t72867\n东南风\t72868\n锦衣卫\t72869\ndvfy\t72870\n昏沉\t72871\n小狗仔\t72872\n16：00-17：30\t72873\n东南飞\t72874\n一千多起\t72875\n死了撒\t72876\n苏琪瑶\t72877\n五心\t72878\n15563\t72879\n短文\t72880\n麻辣拌\t72881\n辰儿\t72882\ndfbhv\t72883\nbjfj\t72884\ndrddx\t72885\n原阅\t72886\n妮时代\t72887\n抓阄\t72888\n左一洲\t72889\n双条条\t72890\n我多吃点那你哪你不知道这事是你帐吧\t72891\n好带感\t72892\nlhy\t72893\naah1\t72894\n1滴DJ几个\t72895\nhshshw\t72896\n祖经\t72897\n刺激性\t72898\nuhxxjjdj\t72899\n有问\t72900\n要不我不爱\t72901\n麻烦你女\t72902\nhibab\t72903\n啦啦啦啦啦啦你是我的小驴\t72904\n大懒猪呢你当我是大懒猪\t72905\n两三天\t72906\n有间\t72907\n13952521872\t72908\n易车逗比\t72909\n姚贝娜\t72910\n6738556439666\t72911\nynnnyyhuhuheiw\t72912\n10大\t72913\n氤氲\t72914\n喝退\t72915\n忽成\t72916\n10天\t72917\n半夜三更\t72918\n还要不要脸\t72919\n三四句\t72920\n木器\t72921\n三四口\t72922\n九幻回款\t72923\n捷胜\t72924\nbout\t72925\n3548625886258\t72926\n家人儿\t72927\n太舒服\t72928\n面朝\t72929\n修录\t72930\n来自我要\t72931\n佼佼\t72932\n斯莱戈\t72933\nsillyyou\t72934\n镀镍你别再这样了的笑我有不个\t72935\ncjcjj\t72936\n汉诗\t72937\n徐晓晶\t72938\n33米\t72939\n漂洗\t72940\n方城县\t72941\n父系\t72942\n文写\t72943\n张小娜\t72944\n桑拿浴\t72945\n0551064926kgaw1pk\t72946\n废话网\t72947\nt004\t72948\n精盐\t72949\n汉语\t72950\n叶璃\t72951\n饥饿游戏\t72952\n别跑\t72953\n苗沐阳\t72954\nhuffgh\t72955\n转睛\t72956\n别跟\t72957\nzemeboushimai\t72958\n学游泳\t72959\n如何\t72960\n十八页\t72961\n小猪孙\t72962\n件套\t72963\n朱元冰\t72964\n南风中\t72965\n殷泽帕克\t72966\n同一首歌\t72967\n会好好好好\t72968\n河南省信阳市车管所\t72969\n82347623457888222788\t72970\n谓圣\t72971\n小战士\t72972\n被拐卖\t72973\n水柱\t72974\n九十度\t72975\n2793929596\t72976\n成名作\t72977\n9月16日\t72978\n零二幺\t72979\n加油挂\t72980\n门楣\t72981\n密玛\t72982\n埋名\t72983\nK4675\t72984\n水柜\t72985\n诗仙\t72986\n冬景\t72987\n思不若\t72988\n协助\t72
989\n南环路\t72990\n久岁\t72991\n临床系\t72992\n1557276962576135185737\t72993\n很厉赁\t72994\n学霸我的的我你\t72995\n腊月27\t72996\nJjgtndtb\t72997\n哦草\t72998\n蔡彤\t72999\n啊陈\t73000\n腊月22\t73001\n象山村\t73002\n铠甲勇士之炎龙\t73003\n8561146\t73004\n急救车\t73005\nxhjn\t73006\n代购王\t73007\n近两个月\t73008\nwjbs\t73009\n宠物别闹\t73010\n一个儿\t73011\n零零后\t73012\n去不不\t73013\nhgffd\t73014\n北新\t73015\n冯小宁\t73016\nhgffh\t73017\n应试教育\t73018\n人才市场\t73019\n15号\t73020\n苏空\t73021\n糜烂\t73022\n每本书\t73023\n早上九点零二分\t73024\n100架\t73025\n本金\t73026\n讲标\t73027\n国栋\t73028\ndhiphotosbaiducomxiaodupicitemc75c10385343fbf2e1009a50b77eca8065388f90jpg\t73029\n599969\t73030\n钟格格\t73031\n砖头\t73032\n好我爸\t73033\n圣水\t73034\n陈柯宇\t73035\n看充\t73036\n蔡佳茵\t73037\n一求\t73038\n一汁\t73039\n一汀\t73040\n1346148\t73041\n每个\t73042\n琼中\t73043\n打破\t73044\n抗睡\t73045\n深入人心精美绝伦\t73046\n我的我要和你我我我我我不和你\t73047\n永不分离\t73048\n片口儿\t73049\n鼻子矫姿器\t73050\n李嫣\t73051\nlieto\t73052\n上年同期\t73053\n打码\t73054\nqb\t73055\n5【\t73056\n拿地\t73057\n怵目惊心\t73058\n求你了说吗办我要不理你\t73059\n男脸你猜我是大人二小222BO\t73060\n一汪\t73061\n想知\t73062\n丁同学\t73063\n米窝\t73064\ndJFGEEGH\t73065\n5567654645654\t73066\n培波\t73067\n国标\t73068\n现银\t73069\n秒懂\t73070\n鼻饲\t73071\n柴静\t73072\n一下盘\t73073\n李腾娜\t73074\n最帅\t73075\n广州南\t73076\n通灵性\t73077\n会务组\t73078\n苏小暖\t73079\n崔秀媚\t73080\n选书网\t73081\n张琳莹\t73082\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t73083\n伊涅斯塔\t73084\n舟子\t73085\n哼退\t73086\n山楂糕\t73087\n看见过\t73088\nguigcjj\t73089\n3066年\t73090\n国粹\t73091\n杨天滨\t73092\n咸阳\t73093\n捐精\t73094\n家鬼\t73095\njtak\t73096\nhttpfhiphotosbaiducomxiaodupicitemf3d3572c11dfa9ec3719066365d0f703918fc18bjpg\t73097\n苍琼女神归来\t73098\n马连的士\t73099\n示弱\t73100\n有缘方便\t73101\n75度\t73102\n鱼块\t73103\n多多哪\t73104\njilojack\t73105\n19931月\t73106\n盆景\t73107\n吃饱了饱\t73108\n3012458657\t73109\n老婆片\t73110\n我看看\t73111\n不害羞羞\t73112\n15762653660\t73113\n愚昧\t73114\n界限\t73115\nRBQ\t73116\n藏旅游\t73117\n睡觉得\t73118\n你是你好眚不兴一样衰老师兄弟连弟们的故意义工\t73119\n耒阳市\t73120\n魔鬼步\t73121\n洛山\t73122\n田春毅\t73123\nvanshgff\t73124\n女香\t73125\n哟西动\t73126\n太残忍\t73127\n蓓蕾落尼亚脸\t73128\nmmmm\t73129\n弘毅\t73130\
n狗种\t73131\n处q068\t73132\n99542\t73133\n舍狼\t73134\n玫瑰花儿\t73135\n心连心\t73136\n妖怪们\t73137\n5月11日\t73138\n摇滚劲\t73139\n基老\t73140\n黎馨蕊\t73141\n八八晚安吧八八\t73142\n回答错\t73143\n一问三不答\t73144\nBG\t73145\n肿么了\t73146\n自认\t73147\n唱一句\t73148\n华章\t73149\n来段\t73150\n既无赖\t73151\n碎戏\t73152\n操作性\t73153\n准妈妈们\t73154\nghhthvcdfg\t73155\n朱文辉\t73156\n日照\t73157\n跨度\t73158\n斗神\t73159\nGxhgfvc\t73160\n中华民族博物馆\t73161\n30000元\t73162\n代村\t73163\n基耶\t73164\n彭康宁\t73165\n橙花\t73166\nBN\t73167\n撸度秘\t73168\n猪么度秘\t73169\nhdcg\t73170\n52885988\t73171\nBH\t73172\n唐长老\t73173\n三百肯\t73174\n毕业了\t73175\n感染\t73176\nyeseto\t73177\n试玩\t73178\n倒霉事\t73179\n第一二句\t73180\n度婊\t73181\n二十九\t73182\n王梦梦\t73183\n国样\t73184\n规范\t73185\n虚虚\t73186\n妙凡\t73187\n撕下来\t73188\n礼金\t73189\n千域千尋\t73190\n强迫\t73191\n徐某某\t73192\n什么子\t73193\n科学研究\t73194\n彪憾女\t73195\n度妃\t73196\n这块\t73197\n10点32分\t73198\nT14\t73199\n飞鼠\t73200\n李晓溪\t73201\n讲礼貌不讲理\t73202\nT11\t73203\n这两次\t73204\n不结婚\t73205\n吕祖殿\t73206\n5公经\t73207\n彩贝\t73208\n打点滴空\t73209\n艳骨棒\t73210\ningo\t73211\ncosplay\t73212\n四区\t73213\n和你是好朋友\t73214\n瞎给\t73215\n凝望\t73216\n剪辑班\t73217\ndoespeninl\t73218\nisiu\t73219\n小黑狗\t73220\n姜希宇\t73221\n恩来顺\t73222\nbibi\t73223\n你好呀你好呀你好呀晚上好呀\t73224\n僵尸片\t73225\n有始有终\t73226\n李昱琳\t73227\n天天有喜之人间有爱的尾曲\t73228\n拉拉多\t73229\n僵尸版\t73230\n20162016\t73231\n肉丝汤\t73232\n开心消大\t73233\n学士服\t73234\n东河\t73235\n说的事\t73236\n发V不及格Vgfawchntggsehhcdfj\t73237\n确确\t73238\n心理学家\t73239\n143个\t73240\ndabba\t73241\nh1zfo\t73242\n为谁的男妹妹\t73243\n那个特种兵之火凤凰\t73244\n查机\t73245\n三吗七\t73246\n100861\t73247\n10月24-29日\t73248\n伦敦场\t73249\n请注意\t73250\n正月初八\t73251\n老师徒\t73252\n锦一\t73253\n一点儿天\t73254\n思甜\t73255\n修一修\t73256\n18683505907我爸爸的事\t73257\n征婚\t73258\n浮动碟\t73259\n王炜艳\t73260\n一间个\t73261\n我的好度秘\t73262\n幸福家园\t73263\n玩不没\t73264\n死度秘我看你是便秘\t73265\ntittu\t73266\n003552211\t73267\nGggggfmo\t73268\n我是你的主人你的主人\t73269\n心里狗\t73270\n累假\t73271\n档\t73272\ngdvd\t73273\n暖婚\t73274\n段皓文\t73275\n晓怡\t73276\n新课标卷\t73277\n北极小北吉祥相克\t73278\n当庭\t73279\n白生生\t73280\n过什么叫\t73281\n当度\t73282\n红酒\t73283\n发光谷\t73284\n冯家村\t
73285\n冯瀚圃\t73286\n分别\t73287\n楚王愁\t73288\n蚕种\t73289\n嗯哪天心\t73290\n25382453\t73291\noppog8\t73292\n一妻制\t73293\n关灯\t73294\n秘丽人\t73295\n断了线\t73296\nBy\t73297\n再一次心跳\t73298\n快期末\t73299\nBooth\t73300\n那卜镇\t73301\n班纳\t73302\n煤厂\t73303\n恩你\t73304\n取生\t73305\n左邻右郦\t73306\n恩佩\t73307\n到不\t73308\nyajeodn\t73309\n张新阳\t73310\n我的我想你\t73311\n打仗片\t73312\n2730点\t73313\n曲项向天歌\t73314\n好呀啦啦啦啦啦我们一起\t73315\npmmgpdagatmgfgzbtvufjtxburdjy\t73316\n仓位\t73317\n阿纽巴\t73318\n寒假度\t73319\n叫讲\t73320\n和罗\t73321\n巴豆\t73322\n少帅\t73323\n恩佐\t73324\n刘晓燕\t73325\nhi姐\t73326\n上一小时\t73327\n漫跑\t73328\n未知道\t73329\n孙晓娆\t73330\n夏夏紅\t73331\n还有一手小冤家\t73332\n显现\t73333\n剧作份\t73334\n梁晓声\t73335\nhompon\t73336\n赛文训\t73337\ntouityou\t73338\n娱亲\t73339\n北京舞蹈学院\t73340\n玩儿家出来\t73341\n软生\t73342\n成都市\t73343\n高其龙\t73344\n武下次\t73345\n陈美人\t73346\n铰呗\t73347\n鸡兔同笼的问题\t73348\n家庭厅\t73349\n阴门\t73350\n于明加\t73351\n啦多米\t73352\n半男半女\t73353\n没了联系\t73354\n王寡妇\t73355\nmokpl\t73356\n阴间\t73357\n撤销\t73358\n盛意\t73359\nxhsocjh\t73360\n先朵红红\t73361\n76731653173467226466865655667575742256888557554412533211\t73362\n何雨蓓\t73363\n痛失\t73364\n宾犬\t73365\n指针\t73366\n大虫\t73367\ntrounoral\t73368\n我不我想\t73369\n蔚姐姐\t73370\n抱歉亲亲\t73371\n水永入少少\t73372\n大虾\t73373\n想恩平理\t73374\n说过年\t73375\nNjiwhhdi\t73376\n明天六点\t73377\n含水\t73378\n百慕大\t73379\n是妞么梦梦\t73380\n药王村\t73381\n说不上来\t73382\n大虎\t73383\nsanh48\t73384\n看今朝月\t73385\n哪男人\t73386\n痛处\t73387\n自欺欺人\t73388\n妇周鼎字\t73389\n来一夫妻宫的子来益夫妻宫\t73390\n冯玉岚\t73391\n陨落\t73392\n患者\t73393\n别说了要么你到我家来有没我去你家了我去你家拉\t73394\n晚九点\t73395\n嘿诚恳哼\t73396\nangelababy度秘典\t73397\n便知\t73398\n慢慢慢慢一点\t73399\n3800\t73400\n哈片\t73401\n该关机\t73402\n不是我没有\t73403\n685688\t73404\n仁顺儿\t73405\nghcghi\t73406\n退下\t73407\n狂野之城\t73408\n想吐籽\t73409\ntyuf\t73410\n鞋款\t73411\n哈特\t73412\n法国\t73413\n3010980009\t73414\n瞅准\t73415\n齐晓飞\t73416\n馔鸶\t73417\n零二二零四零三七七二零\t73418\nguxcgh\t73419\n做媒\t73420\n关注\t73421\n一整块\t73422\n1uq\t73423\n柳林\t73424\nmeidan\t73425\n卡片版\t73426\n四二零\t73427\n喜怒\t73428\n1ua\t73429\n弯曲\t73430\n刘子铭\t73431\n727279289\t73432\n失常\t73433\n消水\t73434\n红外感\t73
435\n哎哟嘴\t73436\n淄博\t73437\nB4\t73438\ngagc1\t73439\n多爱好不好\t73440\n易烊千彤\t73441\n7点八点\t73442\n酸甜苦\t73443\n李心里\t73444\n嗯嗯嗯嗯嗯\t73445\n渡口\t73446\n130618890000000\t73447\n消气\t73448\n厂鼠\t73449\n56分之一\t73450\n缩头\t73451\n请注\t73452\nM\t73453\n官网\t73454\nB3\t73455\n潜逃\t73456\n申万\t73457\n倾盆\t73458\n整车\t73459\n爱鲁\t73460\n蹦蹦直跳\t73461\n麻婆豆腐\t73462\nockingalllind\t73463\nzhebushices\t73464\n90方\t73465\n菲律宾狗日\t73466\n滚出\t73467\n爱是\t73468\n经久不息\t73469\n张馨文\t73470\n雷楼母\t73471\nrange\t73472\n2012-4-18\t73473\n无端\t73474\n胜子\t73475\n人纹\t73476\n绿城\t73477\n胜存\t73478\n好勒好勒\t73479\n好打瞌睡\t73480\n小红猫儿\t73481\n打围\t73482\n我市我恨你讨厌你\t73483\n陈蔓榕\t73484\n说一句话\t73485\nedbb6fd52663304jpg\t73486\n三十千克\t73487\n亚聪\t73488\n唏嘘\t73489\n一百五六十\t73490\nk322\t73491\n猪头猪脑壳儿\t73492\nWi-Fi\t73493\n13413575813\t73494\n单纯\t73495\n礼拜一到\t73496\nquestion\t73497\n嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎\t73498\n讨厌你了再见\t73499\n因孩\t73500\nvvlvvlq\t73501\n单线\t73502\n音音\t73503\n正师级\t73504\n700个\t73505\n很方\t73506\n课后\t73507\ncih\t73508\n很新\t73509\n蒋席玉\t73510\n811阵\t73511\n对呀我是好人你是坏人\t73512\nAshton\t73513\n擦破\t73514\n搞笑的话\t73515\n17.74\t73516\n任孽镍舟车锤盟\t73517\n弟子们\t73518\n武兄\t73519\n晚叔\t73520\n眈美文\t73521\n笑意\t73522\n我的话啊我是我是你姐\t73523\n挖财\t73524\n床精\t73525\ncom了了了了了\t73526\n挽荒\t73527\n戎马\t73528\n太仓洋\t73529\n东明你在干\t73530\n晶报\t73531\n声恩\t73532\n快天\t73533\n平君\t73534\n大声疾\t73535\n佩剑\t73536\n全宇\t73537\n105.8\t73538\n张西平\t73539\n敏儿\t73540\n伊万诺维奇\t73541\n呼啦跳舞\t73542\n105.5\t73543\n39个\t73544\n全家\t73545\n全宴\t73546\n第16册\t73547\n奉持\t73548\n太杉良\t73549\n说跟你走说\t73550\n全室\t73551\n税款\t73552\nO型血\t73553\n我爱你是你的好主人\t73554\n国家机关\t73555\ninclass\t73556\nwndjwjdodisididjdkdkxjhsjjdjsdjjjjdjddfjeeskooslooaaaskdkkdkadsDdkdxxmddkddkxmxkxjdjdddddhddddddddjdjdjsxccgrywq\t73557\n养我\t73558\n尸王\t73559\nkkekjndngghjaikdmnffbffhjchvh\t73560\n没看过\t73561\n油脂\t73562\n忙那么多天\t73563\n学术性\t73564\n黄艳\t73565\n黄色\t73566\n玉蝴蝶\t73567\n问好伐\t73568\n心意\t73569\n有色色\t73570\n撒拉密达\t73571\nbossto\t73572\n手手过\t73573\n苏果超市\t73574\n曼彻\t73575\n温泉支路站\t73576\n第56页\t73577\n二亿\t73578\n二人\t73579\n良加\t73580\n三生
\t73581\n凯源汪\t73582\n朱佳煜\t73583\n恩可可\t73584\n陈嘉成\t73585\n三盆\t73586\nhjssjfj\t73587\n003期\t73588\n零二\t73589\n万不要\t73590\n呷哺排号\t73591\n在也不\t73592\n二五\t73593\n历史上\t73594\n至理\t73595\n鹤壁\t73596\n天津诚达铁艺\t73597\n拍服\t73598\n二亚\t73599\n楷书\t73600\n枭龙&amp\t73601\n丙健\t73602\n模特儿\t73603\n深入深出\t73604\n鬼片儿\t73605\n牡丹花\t73606\n二二\t73607\n爸爸王\t73608\n起因为我\t73609\n恶心我告诉你\t73610\n那好华\t73611\nant个\t73612\n皇瑟得\t73613\n日子棒\t73614\n钕钕\t73615\nP\t73616\n9999份\t73617\n据报道\t73618\n阿Sa\t73619\n艾比森\t73620\n启应\t73621\nSJ手环\t73622\n闹钟声\t73623\n麻辣比你是谁哼哼\t73624\nWewe\t73625\n肾好呀\t73626\n芦苇\t73627\n5301\t73628\nvzuffjds\t73629\n三相\t73630\n孤单骨的拜再见再见再见再见再见再见后会有期\t73631\n酥化\t73632\n简川博\t73633\n十三节时间c458w\t73634\n构建\t73635\n通亮\t73636\n太兴奋冷\t73637\n那谁是谁只是不是我\t73638\n吴高毅\t73639\n海洛因\t73640\nandarby\t73641\n暴度\t73642\n1666054060\t73643\n1O点\t73644\n倒塌\t73645\n惊悚\t73646\n破木吉他\t73647\n逼停\t73648\n度娘的名字到底是什么告诉我告诉我告诉我告诉我告诉我那你的名字到底是什么告诉我告诉我告诉我\t73649\n荔大\t73650\n梨枣\t73651\n弓舷\t73652\n1.3xE\t73653\n乱写\t73654\n好好好好好好好好好好好好好\t73655\n这个周日\t73656\n没不用\t73657\n屁股们\t73658\n式子弹\t73659\n六百\t73660\n阻止\t73661\n位数\t73662\n张美琳\t73663\n6666666666666666666666666666666666666666\t73664\n500多亿美元\t73665\n马向宇\t73666\n余谋洁\t73667\n十十十十十十\t73668\nhhshsggd\t73669\n我希望\t73670\n老海片\t73671\n不念书\t73672\nsp33588\t73673\niiisjh\t73674\n忙完\t73675\n张宇鑫\t73676\n吴孟天\t73677\n紫燕\t73678\n1234175851\t73679\nu好虎胡u好改天要乖乖vv他發給v呵呵太以眼還眼好尷尬u鬍\t73680\n山居秋暝\t73681\n个度秘度秘\t73682\n角器\t73683\n李重生\t73684\n谢咯\t73685\n藏斗\t73686\nCasarte\t73687\n不好我告诉你\t73688\n782178\t73689\n18627230222\t73690\ngeryy\t73691\nfmkrxnmlyscieb\t73692\n有着\t73693\n镇长\t73694\n悠悠\t73695\n摩曼\t73696\n吊桶\t73697\n好多本\t73698\nstuky\t73699\n假冒伪劣\t73700\n正牌\t73701\n接应\t73702\n等不了\t73703\n正版\t73704\n51集\t73705\n與眾\t73706\n差一步\t73707\n正片\t73708\n#SM\t73709\n陈栋\t73710\nNBC热门真人秀\t73711\n广九\t73712\n2393718302\t73713\n#SJ\t73714\n扣扣费\t73715\n就网\t73716\n宋佳杰\t73717\n嘣嘣嘣嘣嘣嘣\t73718\n1mdgjg\t73719\n嗯重演技成狼历险记\t73720\n车建新\t73721\n中国文\t73722\n哥来张\t73723\nclever\t73724\n￥￥￥￥￥￥￥￥￥￥\t73725\n盟主\t73726\n15999秘\t7
3727\n初生婴\t73728\n洞内\t73729\n希语音助手\t73730\n将领导\t73731\n闹心事\t73732\n将相\t73733\n优乐美乱舞折腾\t73734\nivianshecesanamelbazzzzz啧啧啧3333333\t73735\n乙\t73736\n断裂\t73737\n9000468489\t73738\n幺幺幺\t73739\njrjrj\t73740\n两打\t73741\n吃货们\t73742\n天老爸唯爱你\t73743\n林学院\t73744\n转来\t73745\n丽得\t73746\n焦溪中学\t73747\n一笑很倾城\t73748\n城邦暴力团\t73749\n阿莱格里\t73750\n齐聚\t73751\n370多辆\t73752\nqrutorogs\t73753\nbdgg\t73754\n升幅\t73755\n黑尔\t73756\n李菲儿\t73757\n慎之又慎\t73758\n中午十点\t73759\n重庆市\t73760\n确确实实\t73761\nxisbxdvjshwiabqisq\t73762\n65596455\t73763\n場從\t73764\n作品集\t73765\n住进去\t73766\n几季\t73767\n恩沒\t73768\n师司令部\t73769\n皂角\t73770\n近几天\t73771\n多好\t73772\n姚会长\t73773\n增至\t73774\n升平\t73775\n多奸\t73776\nHubgjnbjm\t73777\n姚胜海\t73778\ntibosstaifeis\t73779\n鸡堡\t73780\n接電話讀\t73781\n打多干\t73782\n拜块\t73783\n李依生\t73784\n163com163com163com\t73785\n再拜\t73786\n昨晚21点10分\t73787\n大呼小觉\t73788\n倪邦哲\t73789\n余款\t73790\n仙人掌\t73791\n电桶\t73792\n槐花\t73793\n送书\t73794\n滑润\t73795\n再拍\t73796\n8438379\t73797\n13467908521346790852\t73798\n御女\t73799\n窝帅\t73800\n一封情\t73801\ngdjxyxc\t73802\n受累忙\t73803\n08080808080808080808\t73804\nfasion\t73805\n大怒易失礼,\t73806\n发种\t73807\n762911823038\t73808\n张晓静\t73809\n周围性\t73810\n遮瑕\t73811\n售价\t73812\n管管\t73813\n就是和你\t73814\n对呀呢\t73815\n小丑猪小丑猪\t73816\n恋人\t73817\n寒冷们\t73818\n来玩游戏\t73819\n一帆风\t73820\n720d2k\t73821\nTttttttttttttttttttttttttttttt\t73822\n端午\t73823\n幺八六四九零幺\t73824\n4233\t73825\n不受伤\t73826\n含义\t73827\n21会22\t73828\n呃色\t73829\n4231\t73830\n入不去\t73831\n还能不能\t73832\n哦天的架势心万物家辉欢心\t73833\n静谧\t73834\n45128745877528\t73835\n舍近求远\t73836\n白雨涵\t73837\n因为爱情\t73838\n养分\t73839\n#龙之梦购物中心\t73840\njhijkmm\t73841\n欠儿\t73842\n甚者\t73843\n对呀你丑\t73844\n真说\t73845\n炫屌\t73846\nrfv\t73847\n5.0.2\t73848\n第份\t73849\n奥拉\t73850\nsmillerord\t73851\n指日可待\t73852\n向右转\t73853\n真证\t73854\n夜宿\t73855\n真话\t73856\n很知\t73857\n真诚\t73858\n夜宵\t73859\n张昱莫\t73860\n后置\t73861\n王雨婷\t73862\n奥拓\t73863\n好啊好我不还那你\t73864\n国渠\t73865\nygrrhh\t73866\n钟强\t73867\n钟弹\t73868\n导致\t73869\n徐艺娜\t73870\n把好开心\t73871\n居住证\t73872\n麼仙\t73873\n忘你是谁了有我也不懂\t73874\n刘喜欢\
t73875\n券门\t73876\n436776464\t73877\n么头\t73878\nccb\t73879\n羡煞\t73880\n女嘞\t73881\n不能否\t73882\n多情不自禁\t73883\n斑鸠堡\t73884\n咸肉\t73885\niOS理解咯lol\t73886\n么多\t73887\n口唇\t73888\nfirelli\t73889\n么够\t73890\n呢撸\t73891\n多好饿\t73892\n二二五四八\t73893\n自视\t73894\nUQ\t73895\n农妇\t73896\n电瓶\t73897\n似的眼睛\t73898\n李金\t73899\n操操\t73900\nrfk\t73901\n自轻\t73902\nREST\t73903\nrfj\t73904\n凯泽斯劳滕\t73905\nl21gjgaaga1bgjhb\t73906\n踮脚\t73907\n重邮\t73908\nfyft\t73909\n要你是\t73910\n球兄弟\t73911\n谢个牟尼哥\t73912\n称为\t73913\n探求\t73914\n遛遛\t73915\nfyfg\t73916\nfyfh\t73917\n黄白色\t73918\n好想念\t73919\n粗制滥造\t73920\n非常幸福\t73921\nnjkjg\t73922\n躲奸\t73923\nghcyfyicoyfxffffffffffggggggggggggggghggggggggggggggggg\t73924\n酒醒酒\t73925\n城市规划区\t73926\n唐梓祥\t73927\n财源\t73928\n拖拉\t73929\n幽游白书\t73930\n春天里\t73931\n138641\t73932\n胰腺\t73933\n芳华\t73934\n推整合\t73935\n不过你的朋友\t73936\n拖拖\t73937\n擦瘟\t73938\n600280\t73939\n明后几天\t73940\n分岔\t73941\n三瓶\t73942\n春目\t73943\n998635247JKh\t73944\n淘尽心酸\t73945\nyymymr\t73946\n雅虎蝶\t73947\n许嵩\t73948\n囊死\t73949\n一点五倍\t73950\n变电\t73951\n宜昌国贸\t73952\n变男\t73953\nobaobaobaoba\t73954\n小豆豆\t73955\n100万300万\t73956\n服务\t73957\n马拉松\t73958\n促急\t73959\n腰间盘\t73960\n蒙蔽\t73961\n1.6T\t73962\nly太爱你了你在哪吗\t73963\nabacobilic\t73964\n伍亚丽\t73965\n零九分\t73966\n招标\t73967\n同样\t73968\n四百五九十八元\t73969\n吴吉嵘\t73970\n刘冰冰\t73971\n快照\t73972\n口中\t73973\n枸杞\t73974\n同校\t73975\n5yy7u8\t73976\n1881512266571\t73977\n孟令兴\t73978\n屈指可数\t73979\nCheryl\t73980\n睡照\t73981\n王家皂\t73982\n福尚轩\t73983\n冬芽\t73984\n李小昨\t73985\n口上\t73986\n名山区\t73987\nwdfu\t73988\n口丁\t73989\n零万度\t73990\n郑愁予\t73991\n特恩布尔\t73992\n孙猴子\t73993\n考试中\t73994\n戴培琪\t73995\n孔繁森\t73996\n温驯\t73997\n樱桃小丸子#\t73998\n千言万语\t73999\n一只脚\t74000\n高喝彩\t74001\n等不懂\t74002\n地膜\t74003\n铛铛铛\t74004\n多多多多多多\t74005\n靠不住\t74006\n伪男神\t74007\n赛尔威\t74008\n二横巷\t74009\n腰酸背痛\t74010\n双剑\t74011\n七十个\t74012\n示爱\t74013\n小话\t74014\n教育界\t74015\n张梦杰\t74016\n气死堡电话良\t74017\n飞歌\t74018\n高密度\t74019\n我的心情\t74020\n小试\t74021\n红色警戒四\t74022\n小诗\t74023\n马海珺\t74024\n牛犊\t74025\nvddyjcxdr\t74026\n民怨沸腾\t74027\n二月底\t74028\n啦啦啦啦啦啦啦宝贝\t740
29\nhfhhruhrhhfhfhehfhhbdhrhjehsgddb\t74030\n五百多次\t74031\ntgughxbgg\t74032\n换钱\t74033\n叫任\t74034\n大该\t74035\n小诺\t74036\n小课\t74037\n城池\t74038\n大误\t74039\n种型\t74040\n大富翁\t74041\n阎王爷谜底是什么？一个字\t74042\nshsbs\t74043\n大说\t74044\n春闺了\t74045\n金毛赤赤\t74046\n林洛施\t74047\nlocal\t74048\n嗯掉\t74049\nsiushsjshs\t74050\nisstill\t74051\n炖锅肉\t74052\n飘眉\t74053\n当众\t74054\n有档\t74055\n根子\t74056\n地产商\t74057\n浮图\t74058\nbdidjg\t74059\n薪水\t74060\n采飞扬\t74061\n天的天气真呀真真好\t74062\n火埃\t74063\n阅春\t74064\n刑讯\t74065\n二层\t74066\nhuhhdeevj\t74067\n我不呢\t74068\n这就这样\t74069\nBzwe\t74070\n迎风\t74071\nmanyn\t74072\n红药\t74073\nJubilee饭拍排练照一张\t74074\n芭比洋娃娃\t74075\n靠枕\t74076\n爸爸女\t74077\n张海迪\t74078\n顺眼\t74079\n鹫nn叉\t74080\n失败者\t74081\n列行\t74082\n133368\t74083\n多咪x56\t74084\n二四六八十三十六点\t74085\n657686668622\t74086\n官员们\t74087\n小秘书度秘\t74088\n不得志\t74089\n猴年纪念币\t74090\n辛可宁\t74091\n住院费\t74092\ngmail\t74093\n卓一婷\t74094\n然然\t74095\n王主人\t74096\nyyqx\t74097\n冰天半世归来赢属国\t74098\n32平方厘米\t74099\n恩典\t74100\n下坠\t74101\n挥一下手\t74102\n徐紫欣\t74103\nhttphhiphotosbaiducomxiaodupicitem5ab5c9ea15ce36d312383d443df33a87e950b182jpg\t74104\n西凤酒\t74105\n恶犬\t74106\n抬起头\t74107\n金国\t74108\n度狗\t74109\nw232222226rutfuufi\t74110\np吗\t74111\n45.6亿元\t74112\n杜KEN\t74113\n几米\t74114\n女的我是女的我是你\t74115\n苗连强\t74116\n二二二五幺\t74117\n几类\t74118\n内经书\t74119\n朱婷\t74120\n庄河\t74121\n你真么\t74122\nvvhhhhu\t74123\n毫毛\t74124\n酒甘汤\t74125\nyftgtruiyftdtui\t74126\n金逸觉\t74127\n不要你好坏\t74128\n恋歌蜘蛛侠\t74129\n菏泽梅园小学\t74130\n交尾\t74131\no嘉鱼\t74132\n雅尼\t74133\nshfj\t74134\n加勒比海盗4：惊涛怪浪\t74135\n琳雅轩\t74136\n结结巴巴\t74137\n刘鑫\t74138\n嗯推\t74139\ncxu\t74140\ncxv\t74141\n家棒棒糖\t74142\ncxx\t74143\ncxy\t74144\n晚8点二十\t74145\n桂新\t74146\ncxb\t74147\n宋元\t74148\ncxd\t74149\n黄峰\t74150\ncxf\t74151\ncxg\t74152\ncricere\t74153\nLily\t74154\n吃饭吧大大\t74155\n梅片\t74156\n找了呀\t74157\n永远是\t74158\nSimSimi\t74159\n新锅\t74160\n赞首尔\t74161\n弘基学校\t74162\n护品\t74163\n特别的爱给特别的你\t74164\n春分\t74165\n新锐\t74166\n言小宝\t74167\n你好傲娇受\t74168\n卡车队\t74169\n意灭\t74170\n通州\t74171\n灰灰\t74172\n旧歌\t74173\n竹马\t74174\njutf\t74175\n枫桥夜泊\t741
76\n枪迷\t74177\n西药\t74178\n听宝\t74179\n空杯\t74180\n邓子妍\t74181\n苏咪哟\t74182\n田欣瀚\t74183\n一二百泰币\t74184\n变幻\t74185\n大大前天\t74186\n停留\t74187\njute\t74188\njuoun\t74189\n事常\t74190\nhdghd\t74191\n八嘎纳兹妹\t74192\n卑鄙\t74193\n六十多分\t74194\n泡芙泡芙\t74195\n米兔兔\t74196\n丑男\t74197\n呼小叫\t74198\n守护神\t74199\n老个颠\t74200\n88588\t74201\n没错不过\t74202\n流星\t74203\n无止境\t74204\nax店\t74205\n黑色星期五\t74206\n唱唱笑笑\t74207\n呦宝贝\t74208\n数字穴\t74209\n纵深\t74210\n那我们加油野鸡毛一会你上我家\t74211\n吕紫剑\t74212\n不要脸厚\t74213\nkfosots\t74214\nGwyneth\t74215\n秋节\t74216\n木兰围场\t74217\nbabyduo\t74218\n前妻\t74219\n会不在\t74220\n抱头大哭\t74221\n周萌\t74222\n水到渠成\t74223\n衣家\t74224\n几光年\t74225\n傅美\t74226\n责任方\t74227\n未来三天\t74228\n兵兵我搞基找你好不好妹子帅\t74229\n有生之年\t74230\n中午十二点\t74231\n校园卡\t74232\nkjliqulow\t74233\n欢笑声\t74234\n三哥\t74235\n牛肉滑蛋\t74236\n靠别\t74237\n考试时\t74238\n反抽\t74239\nJnbnm\t74240\n良将\t74241\n麦丁\t74242\n彭叶总\t74243\nlovein\t74244\n俞阳洋\t74245\n周文杰\t74246\n良少\t74247\nsgggfddff\t74248\n人头痛\t74249\n词义\t74250\njsjdjjd\t74251\nHajj\t74252\nvvbbbbvvbvvvvvvvvvbbbbvvbvvvvvvv\t74253\n够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不够不\t74254\n容城\t74255\n滞纳\t74256\n合合\t74257\n党支\t74258\n汪汪\t74259\n啵我喜欢\t74260\ndebaha2\t74261\n小豪豪\t74262\n咯了盖\t74263\n新市\t74264\n110116102610\t74265\n寻觅\t74266\n阵容\t74267\n人云\t74268\n搁浅\t74269\n3.25%\t74270\n姚一面\t74271\n五天五夜\t74272\nPollini\t74273\n1005510937\t74274\n胡浩权\t74275\n大你猜\t74276\n你是个猪呀你是猪呀你是个猪呀你是个猪呀你\t74277\n2次元\t74278\n人人\t74279\n搞工\t74280\n万岁山\t74281\n哥ovxl\t74282\n逆流\t74283\n东三环\t74284\n畅达\t74285\n新帽\t74286\n创战纪\t74287\n非常道\t74288\n上床亲\t74289\n盘中\t74290\nGfxbhcc\t74291\nSud\t74292\n亮亮亮\t74293\n领导\t74294\nqei\t74295\n掉毛\t74296\n纵便\t74297\nqet\t74298\nqew\t74299\n呆萌哥\t74300\n多利莱\t74301\n片照\t74302\n张嘉欣\t74303\n雪瑶\t74304\n保守派\t74305\n影展\t74306\n炸薯条\t74307\n袁子义\t74308\n2006年三月三1日\t74309\n第14天\t74310\n1998年\t74311\n木859998\t74312\n早晨5点\t74313\n有点假\t74314\ngjgdgj\t74315\ndd10个\t74316\n多米儿\t74317\n养和\t74318\n几吗\t74319\n彤腕\t74320\n再就\t74321\n扼扼\t74322\n晚上七点\t74323\n张紫颖\t74324\n几吧\t74325\n我的我一炮轰死你\t74326\n开胶\t74327\n羞月\t74328\n
第十三世\t74329\nasha\t74330\n看梦见\t74331\nii3x26\t74332\n优惠吧\t74333\n客店\t74334\nashu\t74335\n12380元\t74336\n一瀑\t74337\n等等等等等等等等等等等等等等等等等等等等等等等等等\t74338\n真才实学\t74339\n二多\t74340\n半多\t74341\n二外\t74342\n美墅\t74343\n张雨馨\t74344\n没问题\t74345\n二处\t74346\nqqqqqqqqqwwwwbvbjnn\t74347\n勉费\t74348\n绝情\t74349\n神硕\t74350\n嗯杨\t74351\nAmo\t74352\n离校\t74353\n风波\t74354\n照镜\t74355\n嗯来\t74356\nAmy\t74357\n十四十八号\t74358\n二天\t74359\n风泽\t74360\n二太\t74361\nX9999999999999999999999999999\t74362\n四一点\t74363\n老ō架\t74364\nniyeshi\t74365\n四三十四\t74366\n网银\t74367\n自诩\t74368\ncemaifaci\t74369\n燕莲\t74370\n三儿\t74371\n近两个小时\t74372\n97899\t74373\n购付\t74374\n不是我的故意\t74375\n苗族\t74376\n好开心\t74377\n高汤\t74378\n一妻一路\t74379\n平時\t74380\nabcdefghijkaolamnopqyouristyouvwsyazaabcdefghijklommmnnobeyourialstayouawashilzzzzzaabc1deaf\t74381\n学生证\t74382\n高数\t74383\n伤风\t74384\n私教\t74385\n以下面\t74386\n私会\t74387\n拼搏\t74388\n高效\t74389\ncaky\t74390\n懂得\t74391\n双11\t74392\nsnsj\t74393\n共夫共妻\t74394\n赞誒\t74395\n7万美元\t74396\n掉下楼\t74397\n高教\t74398\n林权\t74399\n心中的偶像\t74400\n私企\t74401\n摘路\t74402\n1742\t74403\n大国\t74404\n大图\t74405\n干嘛哟BO\t74406\n收纳\t74407\n好那你给我包辣条\t74408\n奴儿\t74409\n一一年\t74410\n不忙你\t74411\n司欣宇\t74412\n2个星期\t74413\n19世纪\t74414\n飞飞车\t74415\n二五天\t74416\n瘦秘\t74417\n王梓钰\t74418\nsodoso\t74419\n大四\t74420\n果喜\t74421\n等战\t74422\n流行乐史\t74423\nfhjbcxdccj\t74424\n刘镕祯\t74425\ng68\t74426\n之后\t74427\n就是了叫\t74428\n率先\t74429\n见意\t74430\n公倍数\t74431\n行不ｚ\t74432\n人民大理堂\t74433\n一周左右\t74434\n腰鼓\t74435\n恩成\t74436\n春安安\t74437\n手歌\t74438\n马莲英道\t74439\n王卓亚\t74440\n猪楼\t74441\n淘宝城\t74442\nGwpt646694967949\t74443\n好声音\t74444\n屈静\t74445\n跳远\t74446\n发癫\t74447\n跳进\t74448\n慧时欣园\t74449\n国鼎\t74450\n8480\t74451\n笑里极化\t74452\n十七点\t74453\n自个那\t74454\n吻戏\t74455\n杳一下天先\t74456\n8489\t74457\n跳过\t74458\n18609721998\t74459\n那场\t74460\njjfhu\t74461\n二阶\t74462\n空手道\t74463\n回到不想\t74464\n解解气\t74465\nmamnnn\t74466\n等不开\t74467\n悖文意\t74468\n颠倒\t74469\n雪雪加油大郑\t74470\n1000000元\t74471\n周沧靖\t74472\n忒假\t74473\n白狐\t74474\n白狗\t74475\n偶似\t74476\n快快快\t74477\nfsgsz\t74478\n忍气\t7447
9\n给出\t74480\n便士\t74481\n难看\t74482\n二队\t74483\n奉命\t74484\n略懂\t74485\n28282528\t74486\n刘子怡\t74487\n程序设计\t74488\n午也好\t74489\ntfde\t74490\n滿格\t74491\n刘雨涵\t74492\n桥南\t74493\n苏伟哲\t74494\n度秘俊\t74495\n竹纸\t74496\n为有\t74497\n冰释\t74498\n糖果儿\t74499\nhbhvibogc\t74500\n女娲石篇\t74501\n为期\t74502\n多瑙河\t74503\n天龙360\t74504\n1362993882\t74505\n1967686240\t74506\n直至\t74507\nooox\t74508\n为本\t74509\n意欲何为\t74510\n三度房室传导阻滞\t74511\n番薯瓜\t74512\nlamgrlg\t74513\n吟咏强\t74514\n拿比\t74515\n大号码\t74516\nooog\t74517\n一个十五块\t74518\n景德镇\t74519\n李献计历险记\t74520\n王华语\t74521\nSALHEYA区\t74522\n2064293253\t74523\n辣辣\t74524\n啦闹闹闹闹闹闹慢慢\t74525\n栖霞\t74526\n弓手\t74527\nkkcfnkfnkckckckkckclldldmlldllkxkllkmmfkmfkmcmmkc\t74528\n李思蒙\t74529\n马克·安德森\t74530\n我太幸福了我是你主人\t74531\n上层大气分之五乙之下草\t74532\n吴美\t74533\n臥室\t74534\n才知化\t74535\n甲鱼儿女的事实话说什么办\t74536\n孟晓柯\t74537\n19：35\t74538\n带动\t74539\n剑侠传奇\t74540\n公选课\t74541\n一身\t74542\n女家\t74543\n联想G575EX\t74544\n烤羊肉串\t74545\n半好不好\t74546\n岁牙\t74547\n范晓莹\t74548\n遮羞布\t74549\n星联\t74550\nDolceamp\t74551\n力争\t74552\n二二章\t74553\n边框\t74554\n带劲\t74555\n中卫\t74556\n榕城\t74557\n柯007\t74558\n李在巫\t74559\n见报\t74560\n没不知道\t74561\n自胜\t74562\n81858\t74563\n岸花\t74564\n36章\t74565\n金钟民\t74566\n自胖\t74567\n史前苍龙\t74568\nComScore\t74569\n解决方案\t74570\n雁过拔毛\t74571\n吴开心\t74572\n那再猜猜\t74573\nweeeeghhvf\t74574\n周杰\t74575\nBVVC\t74576\ncz2030\t74577\n难不难看\t74578\n雪小在\t74579\n芽苗菜\t74580\n不能化\t74581\n国际金融市场\t74582\n金乍\t74583\ntxf\t74584\ntxd\t74585\n扣扣扣扣扣\t74586\n六季\t74587\n顺平\t74588\n058\t74589\ntxz\t74590\ntxy\t74591\n056\t74592\n雀斑\t74593\n荷载\t74594\ntxr\t74595\n托尔金\t74596\n欢欣鼓舞\t74597\n最高人民法院\t74598\n玩闹玩\t74599\n亚洲国际博览馆\t74600\n欧巴hyw\t74601\najajt\t74602\nup个\t74603\n金书\t74604\n秘真不可爱\t74605\n一比一样\t74606\n金乡\t74607\n度秘你真是我的好秘书you\t74608\n固原\t74609\nddfzfscrx\t74610\n今晚8点\t74611\n六子\t74612\n睡觉段\t74613\n三等\t74614\n12364点\t74615\n86余\t74616\n窃读记\t74617\n200589957180578579\t74618\nfwetyyuioppkhgfsaazxcbnm\t74619\n道别\t74620\n魔羯\t74621\n不鸟\t74622\n一百两个\t74623\n86位\t74624\n罗嘉意\t74625\n嗯觉晓\t74626\n十集\t74627\n远是\t74628\n主淫\t
74629\nblade\t74630\nsasaseritma慢anonoumomomolomomotion\t74631\nkekelololollekelekefledlomomokej\t74632\n徐绍史\t74633\n慕容芊芊\t74634\nsomnvs199423\t74635\n乐谱\t74636\nlvec\t74637\n556442929013551\t74638\n468239557\t74639\n十八千米家离学校\t74640\n红薯叶\t74641\n以海\t74642\n18984074463\t74643\n新亚我好想新亚我好伤心\t74644\n大熊\t74645\n扣鈕\t74646\n打错了\t74647\n边角料\t74648\n99999999999999\t74649\nBrian\t74650\n米兰四季酒店\t74651\n隔壁家\t74652\n人间曲\t74653\n同文\t74654\n范阿珍\t74655\n介紹言情小說\t74656\n滑动\t74657\n桂花味\t74658\n真棒\t74659\n找不着不知道\t74660\n北温哥华\t74661\nBenoit\t74662\n同新\t74663\n下沙\t74664\n农会\t74665\n以流\t74666\n原曲\t74667\n小苹果儿\t74668\n既往不咎\t74669\n90年代初一\t74670\n下沉\t74671\n许回话\t74672\nts卜唉四\t74673\n顾不上\t74674\n蓄\t74675\n蓉\t74676\n泊车\t74677\nGDsf\t74678\n蓑\t74679\n李总理\t74680\nbakingpowder\t74681\n太扭曲\t74682\n蓟\t74683\n蓝\t74684\n蓣\t74685\naababbb\t74686\n蓦\t74687\n一樣一樣uu雨有空看看意\t74688\n黄思林\t74689\n本省\t74690\n本真\t74691\n十一个女的我是男的能不能行\t74692\n永顺\t74693\n别死\t74694\n党参\t74695\n麦当\t74696\nSSN\t74697\n芈芾\t74698\n屎单蛋\t74699\n15.00元\t74700\n店脑\t74701\n哈错错\t74702\n噢啦\t74703\ngdfrgftfxyvxfgfghgyhgjghuuhfgggghghggggghhgfgbggjhfmjjgbnmnjhk\t74704\n多多多多多\t74705\n课程度\t74706\n事业心\t74707\n0375百分之37点\t74708\n花果\t74709\n菜狗\t74710\n好多度\t74711\n死算\t74712\n龙华丽娜\t74713\n楚燕齐\t74714\nmakelov\t74715\n无法呼吸\t74716\n红豆腐\t74717\n两数\t74718\n北京小剧场\t74719\n拖出\t74720\n变态男\t74721\n算了说\t74722\n隔住\t74723\n订做\t74724\n机群\t74725\n说了算\t74726\n积怨\t74727\n五月廿八\t74728\n适可而止\t74729\n当政者\t74730\n二十年以后\t74731\n高压\t74732\njjeskk\t74733\n王曼\t74734\n恶不借我恨你我恨你我恨你我恨你\t74735\n彭女士\t74736\n第一发\t74737\n泛读\t74738\n高原\t74739\n企图\t74740\n13191184663\t74741\nLi匕\t74742\n钜钎\t74743\n九十万九万九千九百九十九元\t74744\n贪官们\t74745\n樱桃樱桃\t74746\n新热\t74747\n221155\t74748\n第一号\t74749\n硬撑\t74750\n好呀萌萌哒\t74751\n第一台\t74752\ndyvikjhhui\t74753\n志健\t74754\n新纪元\t74755\n张新月\t74756\n第一只\t74757\n春日\t74758\n真的很单纯\t74759\n女汉\t74760\n第一句\t74761\n第一口\t74762\n太粘\t74763\nfxnfkvjckgb\t74764\n王保险\t74765\n二十分\t74766\n先主\t74767\n快慢\t74768\nwrgd\t74769\n人民邮电出版社\t74770\n东北秧歌\t74771\n度秘回琪\t74772\n宝马公开课#\t74773
\n安柠夕\t74774\n亲耐\t74775\n郭丁瑞\t74776\nwoen\t74777\n多海\t74778\n四三二五二九\t74779\nFgx\t74780\ndsv\t74781\nmbia1\t74782\ndst\t74783\n没用到\t74784\ndss\t74785\n中华为\t74786\ndsq\t74787\nFgl\t74788\n扳机\t74789\n拜拜换\t74790\n呼冷呼\t74791\nIfiddj\t74792\n透不过气\t74793\nFgd\t74794\ndsd\t74795\n冀泽霖\t74796\ndsa\t74797\n婀娜owl多某科目咯无卡额看看提拔摩托摩\t74798\nwhatamis\t74799\n精典\t74800\n世生\t74801\n精兵\t74802\n24ji\t74803\nUniversity\t74804\n過分發個\t74805\n亚永\t74806\n傻子仔\t74807\n新贵\t74808\n贺先生\t74809\n明天刑天明天\t74810\n点板\t74811\n泡粉\t74812\n二百二百多\t74813\n4635536634\t74814\nz芝\t74815\n笑话片\t74816\n721分\t74817\n在秘\t74818\n精光\t74819\n仙盈盈\t74820\n突德\t74821\n结龙\t74822\nfgdertggc\t74823\n误尽青春总是情\t74824\njghhy\t74825\n屋头\t74826\n援助\t74827\n新鬼\t74828\n国安足球俱乐部\t74829\nhydueh\t74830\n波音７３７－８００\t74831\n停电\t74832\n男女老少\t74833\n355一\t74834\ndshsforsfia\t74835\n房屋\t74836\n储备\t74837\n万历\t74838\n震天\t74839\n开一喊\t74840\n长相\t74841\n秧苗\t74842\n在自黑\t74843\n桥东\t74844\n四蹄\t74845\n写真机\t74846\n国际消费者权益日\t74847\n房山\t74848\n缺少\t74849\n寄信\t74850\n桥上\t74851\n主持队\t74852\n18511102355\t74853\n铁汉子\t74854\n1940RMB\t74855\n小黑兔\t74856\nlisten\t74857\n有你最萌\t74858\n王彬丞\t74859\n密噶\t74860\n第几根\t74861\n阮利敏\t74862\n炫富\t74863\n食言\t74864\n电子版\t74865\n尼小宝\t74866\n胡作非\t74867\n是不够\t74868\n不帅哥我需要你\t74869\n两万贝贝\t74870\nBeautiful\t74871\n蔑视\t74872\n土城人\t74873\n愿望\t74874\n杨再年\t74875\n太迪\t74876\n叙事文化宫\t74877\n语文西游记\t74878\n180cm\t74879\n不要脸臭不要脸臭不要脸臭不要脸臭不要脸\t74880\n西路\t74881\nbiāo\t74882\n随泪逐流\t74883\n科店\t74884\n27313126\t74885\nfancam\t74886\n一百一百七十三块\t74887\n二三三九四零九零\t74888\n几次号\t74889\n第一期\t74890\n马克们\t74891\nWilliams\t74892\n五一零八六二九八八\t74893\nQcso\t74894\nOdeiiw1\t74895\n嘎嘎木\t74896\n使用者\t74897\n上不了\t74898\n一线仙\t74899\n通车\t74900\n讯飞\t74901\n唱一首等\t74902\nCLOUD\t74903\n省一元钱\t74904\n偷鸡\t74905\n半懂不懂\t74906\n恩知拉贝\t74907\n刘勇\t74908\n一尘网上有名网\t74909\naoeiuo\t74910\n七年级\t74911\n老屋\t74912\nvictoria\t74913\n阿拉丁\t74914\n我们的孩子\t74915\n季雨洁\t74916\n我喜欢呀\t74917\n那不去\t74918\n太麻烦\t74919\n1727．58亿元\t74920\n路段\t74921\n我才不信\t74922\n国兴大道省\t74923\nSasssSasdd\t74924\n罗罗罗\t74925\n4月2
0日起\t74926\n1251545\t74927\n62岁\t74928\nWhats\t74929\n醋蛋液\t74930\n英沙\t74931\n雪奈端\t74932\n万美\t74933\n唉愁\t74934\n数吧\t74935\nSandel\t74936\n订单号\t74937\n手亚\t74938\n独具\t74939\n13327868700\t74940\n片巴拉巴拉小魔仙\t74941\nWhata\t74942\n凤毛麟角\t74943\nWhato\t74944\n寿镇\t74945\n我喜欢周\t74946\n咪西咪西\t74947\n不来工委\t74948\n是谁谁\t74949\n树木\t74950\n沱江镇\t74951\n好难好难\t74952\n345亦凡\t74953\n陈能\t74954\nisfk\t74955\n梅江\t74956\n米英\t74957\n巨扎\t74958\n泞发\t74959\n别烦人\t74960\n想像力\t74961\n蔡龙\t74962\n丑逼\t74963\n幺零二\t74964\n张纪中\t74965\n种爱\t74966\n来抱\t74967\n白羊射手\t74968\n陈胜\t74969\n守候\t74970\n这一条\t74971\n陈胖\t74972\n大脚丫\t74973\n一哥们\t74974\n又笑\t74975\n幕布\t74976\nHugstuds\t74977\n死媚\t74978\n葛志隐\t74979\n72小时内\t74980\n相似性\t74981\n别说好泡\t74982\n首饰盒\t74983\n根本轮\t74984\n间隔着\t74985\n干嘛好\t74986\n纳尼亚传奇3黎明踏浪号\t74987\n雄子\t74988\n喷喷\t74989\n好的乖孩了啊天龙\t74990\n现象\t74991\n未来几天\t74992\n小心事\t74993\n咯度秘\t74994\n出人意料\t74995\n是我是\t74996\n术语\t74997\n伊思密达网\t74998\nRebel\t74999\n点差\t75000\n激不活\t75001\n把秘委屈\t75002\n度号秘\t75003\n超级大盘股\t75004\n我不在是个了秘\t75005\n以往\t75006\n康定\t75007\nfail\t75008\n以後\t75009\n神农架林区\t75010\n闷冠\t75011\n老师性\t75012\n6318789\t75013\n是我明\t75014\n9161616661616616161666666666666666666666\t75015\n800年\t75016\n999999999999999999999999999999999亿一个\t75017\n聪聪\t75018\n啦啦阿拉\t75019\ndamariono\t75020\n京打马尼\t75021\n坚决\t75022\n你在撸嘛的我家龙猫\t75023\n道家五狱\t75024\n花红\t75025\n中国石油\t75026\n李馨琪\t75027\n妖梦\t75028\n接这儿\t75029\n附件\t75030\n听不清声\t75031\n配当\t75032\n师范大学\t75033\n走眼\t75034\nyjgfjgc\t75035\nSunnyfrind\t75036\n花纹\t75037\n1004264\t75038\n走看\t75039\n伊思美\t75040\n一级\t75041\n一约\t75042\n片名\t75043\n我讨厌你你不讨厌我我讨厌你\t75044\n一线\t75045\n200毫升\t75046\n走真\t75047\n好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦\t75048\n俞著晓\t75049\n经不读\t75050\n鱼果\t75051\n赵文忻\t75052\ngxg\t75053\nxite\t75054\n那浪漫和政中学\t75055\ngxb\t75056\ngxc\t75057\ngxm\t75058\n丑货\t75059\n生日系\t75060\ngxh\t75061\ngxi\t75062\n写图\t75063\ngxk\t75064\ngxt\t75065\n送一刚\t75066\n裁员\t75067\nTelegraph\t75068\ngxr\t75069\nsddd\t75070\n次机会\t75071\n小心机\t75072\n首神曲\t75073\n中华\t75074\n顺毛\t75075\n小燕良辰\t75076\n存储器\t75077\n才怕\t75
078\n新沂\t75079\n银河SOHO\t75080\n梳子\t75081\n扑面\t75082\n募捐\t75083\nwuli\t75084\n聚散\t75085\nwull\t75086\n官僚\t75087\n密度计\t75088\n普渡\t75089\n无字\t75090\nsoulou\t75091\n温岭市政府\t75092\niu1\t75093\n明亮\t75094\n早快\t75095\n2111年\t75096\n过手\t75097\n普清\t75098\n聪明乖在干嘛\t75099\n当做爱好\t75100\n算命\t75101\n瞪眼\t75102\n明了\t75103\n频顾惜\t75104\n度秘我喜欢你我想\t75105\n明事\t75106\n威嘿\t75107\n大孟\t75108\n赶我走\t75109\n23张\t75110\n屌女\t75111\n谢不谢\t75112\n真像\t75113\n大字\t75114\n中影\t75115\npishi\t75116\nd707\t75117\n中彩\t75118\nsuoqipuq\t75119\njamz\t75120\n劳保\t75121\nCapital\t75122\n舒克和贝塔\t75123\n文言文儿\t75124\n帐难\t75125\n大學\t75126\n史努比\t75127\n剑侠\t75128\n枫亭旧商业街01\t75129\n我们一欢我吗喜欢我吗喜欢我吗喜欢我吗喜欢我\t75130\n红茶\t75131\n一一口\t75132\n前凸\t75133\n落落空脱\t75134\n大孩\t75135\n大学\t75136\n袁珂景\t75137\n大孤\t75138\n一一只\t75139\n剥离\t75140\n组会\t75141\n七个小矮人\t75142\n特奥\t75143\n475块\t75144\n生产队\t75145\n应该化\t75146\n特好\t75147\n兰州\t75148\n做作声\t75149\n滑下\t75150\n羽毛球女\t75151\n严重性\t75152\n我讨厌猫爸爸妈妈好呀\t75153\n长方照明\t75154\n想不好不好呀\t75155\nb6b6bbb\t75156\n的源\t75157\niu4\t75158\n三十三十块\t75159\n字词\t75160\n王荣誉\t75161\n喊毛\t75162\n八七零八二八五零\t75163\n伊斯科\t75164\n左部\t75165\n申慧敏\t75166\n13466464646\t75167\n庞锦梅\t75168\n奋战\t75169\n度秘山\t75170\n五表示数\t75171\nQWPOw6\t75172\n大货车\t75173\n标题党\t75174\n挣钱\t75175\n噶居\t75176\n梦会么\t75177\n森达\t75178\n笔直公路\t75179\n桥板\t75180\n穆狗\t75181\n张木工\t75182\n胡铁烙\t75183\n好不好好啦\t75184\n150周\t75185\n胜景\t75186\n碎\t75187\nClematis\t75188\n碍\t75189\n碗\t75190\n川渝人民真\t75191\n唐唐b个笑工坊\t75192\n我喜欢爱我的人\t75193\n弯眉\t75194\n碑\t75195\n收图\t75196\n太闲\t75197\n只许\t75198\n芒果汁\t75199\n碘\t75200\n碧\t75201\nCXR\t75202\n心情莫名\t75203\n坐享\t75204\n董本\t75205\n跳河\t75206\n中国男篮\t75207\n日蚀\t75208\n碩\t75209\n吗咪爱咪\t75210\nImboy\t75211\n這張\t75212\n人来的你是人来的不是人来着你是人来的是什么人来着哎呦你\t75213\n碰\t75214\n碱\t75215\n收回\t75216\n碼\t75217\n確\t75218\n一定要回来\t75219\n红PK5公\t75220\n聂离\t75221\n开挂\t75222\n多蔑\t75223\n金额\t75224\n开挖\t75225\n太不投缘了我喜欢看穿越废物女强\t75226\n十四分\t75227\n心脏性\t75228\n胡椒亲\t75229\n易经\t75230\n给我的错\t75231\n贵子\t75232\n老梅\t75233\nSJ六辑\t75234\nauAl\t75235\n那么多不急\t75236\n宰杀\t75237\nnenkd\t75
238\n給你的生活激起點波瀾\t75239\niuuu\t75240\n花千骨说什么\t75241\n抢了抢\t75242\n蘑菇亲\t75243\n老乡鸡\t75244\n连数\t75245\n发案\t75246\nexogent\t75247\n888888888888888888888888888888888888888888888888888888888888888888888888888888\t75248\n数样\t75249\n下颌处\t75250\n乾隆六年\t75251\n但求\t75252\n1头\t75253\nina\t75254\njuuvc\t75255\n丰汇园\t75256\n波神\t75257\n5841314520\t75258\n瑞英\t75259\nOOO0\t75260\n我的寂寞我一包烟小的花绿松石\t75261\n四七句\t75262\n六魔女玟玟\t75263\n一百一百\t75264\n兮饭\t75265\n朱熹琅\t75266\nWhenis\t75267\n唱一一\t75268\n9u0\t75269\n文学奖\t75270\n你好问一下伤心首和耳舟\t75271\n猜码\t75272\nkfjfjhdhssh\t75273\n插图\t75274\n坂田\t75275\n25万多\t75276\n张净思\t75277\n下雨死\t75278\n有恶报\t75279\n柚木\t75280\nchte\t75281\n尔好\t75282\n服务者\t75283\n昏线\t75284\ncalu\t75285\n王尔德\t75286\n暗管\t75287\nYaoqingzenmexie\t75288\n幂结婚\t75289\n暗箭\t75290\n有效期\t75291\n辞去\t75292\n肾功能\t75293\n精诚\t75294\n暗算\t75295\n四射\t75296\n惠人\t75297\n趋势\t75298\n醋菜度秘\t75299\n15202525\t75300\n黑毛\t75301\n美丽的神话\t75302\n探路\t75303\nhi你没有\t75304\n312576220113136477839\t75305\n四少\t75306\n冲击\t75307\n背心儿\t75308\n瞬间知道\t75309\n电视果\t75310\n太西\t75311\n大夜猫\t75312\n千讲\t75313\n不要时候\t75314\nprecluding\t75315\n道得\t75316\n千万埗\t75317\n雨刷\t75318\n天犬\t75319\ngoogle地图\t75320\n熊出墨\t75321\nMarry\t75322\n杨思雨\t75323\nEee\t75324\n二百五\t75325\n35595843554569754425465775544\t75326\n八零二六六\t75327\n咖啡馆\t75328\n自营\t75329\n36集\t75330\n打工者\t75331\n道德\t75332\nlayl\t75333\n124567890\t75334\n蓥华山\t75335\nlayt\t75336\n托卡依城\t75337\n去了\t75338\n运运亨通\t75339\n韩纪豪\t75340\n荐股\t75341\n孙博涵\t75342\n88908282866284767676727268606878728070909490808480二七十二\t75343\n娶媳妇\t75344\n馍馍达\t75345\n实质上\t75346\n出租电\t75347\n山达基教\t75348\n你在哪吗我要车糖糖糖糖糖糖\t75349\n马涵\t75350\n素因\t75351\n美地\t75352\n那几大了你我想和你\t75353\n快点吧上等你\t75354\n不爱爱\t75355\n我的经记人\t75356\n妻离子散\t75357\n真智\t75358\nrhjjg\t75359\nhijitiw\t75360\nHaoba\t75361\nmejsh8\t75362\n一期多米\t75363\n陈小贝\t75364\n我是谁了你好\t75365\n笔杆\t75366\n受虐狂\t75367\n小辈\t75368\nDtf\t75369\n创发\t75370\n小辉\t75371\n闻风\t75372\n维泰尔\t75373\n徐倩倩\t75374\nbjfhjefgjhf\t75375\n陌陌\t75376\nugfdrvcf\t75377\n7k7k7k7k7k7k7k7k\t75378\n徐建\t75379\n客气昂
\t75380\n李解\t75381\n义可\t75382\n改天还有钱么我没有钱\t75383\n465753856\t75384\n缩写句\t75385\n啦巴\t75386\n受够\t75387\n创号\t75388\n轻而易举\t75389\n熙道\t75390\n48554645555\t75391\n松软\t75392\ne罩杯\t75393\n好看我还\t75394\nonthephone\t75395\n7817\t75396\n不辣\t75397\n尼吉\t75398\n真不对\t75399\n时间向下\t75400\nspidg\t75401\n不达\t75402\n敏镐\t75403\n孟维晔\t75404\n10万个为什么35元科技故事18元百科全书\t75405\n内马\t75406\n纯属着\t75407\n猜迷语或是绕口立冬\t75408\n过得来\t75409\n张咪\t75410\n哦晕\t75411\n炼乳\t75412\nnhkkjoppsaw\t75413\n严凯门\t75414\nvey\t75415\n兵哥哥\t75416\nved\t75417\n他一个人\t75418\nvec\t75419\n5554525522\t75420\n斗罗大陆龙王传说\t75421\n七老\t75422\n音乐界\t75423\nveh\t75424\ngdcvh\t75425\n博藏\t75426\n多项\t75427\n浅唱\t75428\n凌云\t75429\n15969914596\t75430\n大岭山\t75431\n47.20\t75432\n五六百个\t75433\n3093354350\t75434\n微机\t75435\n猜不猜猜不猜猜不猜猜不猜猜不猜猜不猜不猜猜不猜猜不到的爱在不在在不在不在在不在在不在\t75436\n这个词儿了了里了里了\t75437\n铁头铁\t75438\n艾莉\t75439\n目暮\t75440\n睡去吧\t75441\n毕畅\t75442\n夏毓麟\t75443\n司佳然\t75444\n仲夏\t75445\n豫园\t75446\n八千片\t75447\n不不不我跟\t75448\n2点50点\t75449\n臭味\t75450\n王海旭\t75451\n丁茂吞\t75452\n二刈止针个\t75453\n方式样\t75454\n你好辛苦\t75455\n宠物党\t75456\n一口一个\t75457\n歌曲\t75458\n陈和陈\t75459\n民政局\t75460\n賤賤\t75461\n莲子心\t75462\n死牢\t75463\n萌猛\t75464\n浓厚\t75465\n专胜散\t75466\n心情不爽\t75467\n李海欣\t75468\n灵通\t75469\n滑菇鸡块\t75470\n合同工\t75471\n谢谢你了我听\t75472\n铁笼\t75473\n131356081633\t75474\n放寒假拉\t75475\n别太晚\t75476\n南王乡洞\t75477\nhvvvvvvvvv\t75478\n没救\t75479\n淫荡的我的女王漫画\t75480\n演技\t75481\nWarnes\t75482\n阿原来\t75483\n读者\t75484\n4个月\t75485\n天天起来\t75486\n李旭\t75487\n姑娘\t75488\n3110855248\t75489\n时节\t75490\n人工号\t75491\n护盘\t75492\n来辛巴乐\t75493\n哎呦我我我我自个\t75494\n没数\t75495\n涔山国家森林公园\t75496\n历史观\t75497\n许楼村\t75498\n嘤\t75499\n新事物\t75500\n嘢\t75501\n嘣\t75502\n子琳\t75503\n怪快\t75504\n嘭\t75505\n幼女者\t75506\n蔷薇\t75507\n嘶\t75508\nfruyf\t75509\n嘴\t75510\n嘲\t75511\ngjjjfsd\t75512\n阿孙\t75513\n嘿\t75514\n中英文\t75515\n嘻\t75516\n愤老\t75517\n嘆\t75518\n很赞\t75519\n嘅\t75520\n掰掰\t75521\n嘁\t75522\n嘎\t75523\ntidfhctdd\t75524\n嘌\t75525\n嘍\t75526\n嘈\t75527\n姜怡\t75528\n巨额\t75529\n董姓\t75530\n大伙儿\t75531\n田耀杰\t75532\nliaa\t75533\n康云芳\t75534\n嘜\t75535\n
董姝\t75536\n嘛\t75537\n嘘\t75538\n槽\t75539\n``\t75540\n槿\t75541\n迈克·阿里斯\t75542\n麦迷\t75543\n我喜欢巴啦啦小魔仙\t75544\n麦迪\t75545\n充毛\t75546\n失去追求\t75547\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪傻子\t75548\n根式\t75549\n挥额\t75550\n槟\t75551\n韩国法务部\t75552\n无拘无束\t75553\n八八八八\t75554\n太挤\t75555\n雷丝\t75556\nsavery\t75557\n爱吃苹果\t75558\n铁厂\t75559\n槎\t75560\n干嘛咧\t75561\n彩铃网\t75562\n高阿\t75563\n太平洋小区\t75564\nxhxhfc\t75565\n轗打打\t75566\n聊了再找\t75567\n一起份\t75568\nstomeou\t75569\nWai工作室\t75570\n仙绳\t75571\n单一点\t75572\n眯华\t75573\n烂铁\t75574\n不可是\t75575\n张萝莉\t75576\nhoner\t75577\nE级\t75578\n绘声绘\t75579\n向左走向右走\t75580\nfggggggggggggfgy\t75581\n张佳敏\t75582\niyddid\t75583\n真的不爱\t75584\n冠名\t75585\nfpv\t75586\n后桌\t75587\n温柔们\t75588\n戴尔么\t75589\n不对我好\t75590\n11213456219\t75591\n那么美\t75592\n点着\t75593\n不露\t75594\n告诉文\t75595\n热恩\t75596\n点睛\t75597\nfkyar\t75598\n肖麦琪\t75599\n和好感\t75600\n下去吧\t75601\n周扒皮\t75602\n胡氏\t75603\nygghbhji\t75604\n尚军静\t75605\n篱笆伯纶\t75606\n嘀嗒嘀嗒\t75607\n凯源帅\t75608\n讨要\t75609\n热恋\t75610\n目的性\t75611\n爸类\t75612\n广州火车站\t75613\n本原\t75614\n巨拥堵\t75615\n马明田\t75616\n毛泽东传\t75617\n呢白\t75618\n万米\t75619\n上午10点\t75620\n地下城\t75621\n里面前\t75622\nRihanna\t75623\n令愿\t75624\n虎头山咲斜面\t75625\n冯佳英\t75626\n对著\t75627\n16：00\t75628\n镐小孩\t75629\n叶凌\t75630\n关键问题\t75631\n我征信\t75632\n李文玉\t75633\n鲁惠轩\t75634\n直冲\t75635\n41级\t75636\n家用机\t75637\n叶凡\t75638\n茶影\t75639\nWhitney\t75640\n七二二三零\t75641\n叶凯\t75642\n每隔五分钟\t75643\n勃力威\t75644\n银燕少儿合唱团\t75645\n张大相爱\t75646\n人们\t75647\n神魔样的呀你\t75648\nrtfboys\t75649\n不堪入目\t75650\n讨嫌\t75651\nhi皮hi\t75652\n味甘性\t75653\n13579614235\t75654\n机器人儿\t75655\n同振\t75656\n卖力\t75657\n行好\t75658\nbhoxgh\t75659\n超级吵架\t75660\n树人\t75661\n66661个\t75662\n幸免\t75663\n名税\t75664\nbilithai\t75665\n嘿嘿\t75666\nEnig\t75667\n嘿嘻\t75668\n梁炫\t75669\n梗阻\t75670\n纯粹垃圾公司\t75671\n没听说\t75672\n28.55\t75673\n謝謝\t75674\n照破\t75675\n改变你\t75676\n小度ok\t75677\n汉子歌\t75678\n亲一下\t75679\n不可笑\t75680\n老书\t75681\n民生\t75682\n血缘关系\t75683\n见面会\t75684\n服装城\t75685\n液氮\t75686\n老乱\t75687\n浓情\t75688\n民田\t75689\nahya4碑楼呗q行\t75690\n朱磊磊\t75691\n赵艺伟\t75692\n打机\t75693\nJoy\t756
94\n亲一个\t75695\n完美美\t75696\n异客\t75697\n90后\t75698\nhcb\t75699\nhcc\t75700\n小度秘政客\t75701\nhcf\t75702\nJoe\t75703\nhcd\t75704\n老九\t75705\n民用\t75706\nhck\t75707\nhch\t75708\nBDJJEDDN\t75709\n老乔\t75710\n里扒子味丸\t75711\n20xy8xy个\t75712\n大龙村\t75713\n冬夏\t75714\n郭先生\t75715\n扎吉\t75716\n大话小心遭雷劈呦\t75717\n卢钇彤\t75718\n瑶柱\t75719\n瓜农\t75720\n手弄\t75721\n冬夜\t75722\nkkkkll\t75723\n七夕若狂\t75724\n电脑人\t75725\n冬天\t75726\n不河源\t75727\n手弹\t75728\n人力部\t75729\n十到十一年前\t75730\n我我不会\t75731\n建军\t75732\n贺年\t75733\n北京首都国际机场\t75734\nANTI\t75735\nmotherisa\t75736\n铁骨\t75737\n拷问部\t75738\nXX小学\t75739\nalalan\t75740\n8056\t75741\n丢你个孩\t75742\n法规\t75743\n恶化\t75744\nvuink\t75745\n这段话\t75746\n卢哪\t75747\n四百\t75748\n滑过\t75749\n88亿元\t75750\n辣条儿\t75751\n453838635358356356356353283868628383\t75752\n了如指掌\t75753\n行销部\t75754\n着不着\t75755\n王土祥\t75756\ndghfx\t75757\n梅兰芳\t75758\n叶礼坤\t75759\n午觉吧\t75760\n多多多多多多多多\t75761\n存储\t75762\nddtgb\t75763\nCH4H2O\t75764\n风波庄\t75765\n近两天\t75766\n佑猴\t75767\n这个月初\t75768\n1920\t75769\n就不去\t75770\n下半时\t75771\n度秘妮子外瑞古\t75772\n会计证\t75773\n度密948\t75774\nboei\t75775\n两秒钟\t75776\n天哪所学校\t75777\n俅俅\t75778\n终结\t75779\n入侵者\t75780\n咱俩\t75781\n赏钱\t75782\ntootokose\t75783\n有些人\t75784\n哎呀好贪吃的我\t75785\n荜\t75786\n石康\t75787\n恋爱\t75788\n买笑\t75789\n一一帘\t75790\n胡诺员\t75791\nhffg\t75792\n洗练\t75793\n左慈\t75794\n不能忘记\t75795\nJMKTJGW阿狸\t75796\n鹏欣花园国宾酒店\t75797\n七八天\t75798\n美金\t75799\n李若凡\t75800\n拉王帅\t75801\n塔普塔普塔普\t75802\n度佳\t75803\n尤物\t75804\negdueg22gee\t75805\nfffffixicjcjjcjfjfif\t75806\n天滋官鲤2号\t75807\n心情好难过\t75808\nvhvhcybbimpkpmkvuxdfjgjgib\t75809\n一爱好\t75810\n度你\t75811\n马建月\t75812\n晚不晚\t75813\n看着办\t75814\nwmdjgt\t75815\n恒隆\t75816\n膘\t75817\n五倍多72\t75818\n老穆\t75819\n城固\t75820\nhello爱你爱你思密达\t75821\n魔点\t75822\n安哲\t75823\n三除\t75824\n三剑豪\t75825\n三院\t75826\n秘汁\t75827\n秘求\t75828\n三陪\t75829\n三险\t75830\n沙头堡\t75831\n戴王冠\t75832\n乐苗犭\t75833\n复习会\t75834\nbady\t75835\nCA4120\t75836\n自欺\t75837\ngsjsb\t75838\n攻城掠地\t75839\n八十一八十一\t75840\ng10\t75841\n秘汤\t75842\n我求你了快给我\t75843\n我先我\t75844\n脸滴\t75845\nvjvcshojobjcyxdggvkh\t758
46\n下一位\t75847\n52千米\t75848\n雪白\t75849\n张智涵\t75850\n算了我不想\t75851\n长篇\t75852\n空巾\t75853\n高美富\t75854\ndech\t75855\n68742555555555555\t75856\n可持续性\t75857\n糊臭\t75858\n芭比呀葫芦娃\t75859\nRocket\t75860\n六月份\t75861\n13路\t75862\n吾皇\t75863\n消张\t75864\n很亲切\t75865\n85r5\t75866\n东方朔\t75867\n里期\t75868\n第二十次\t75869\n通才\t75870\n瑄儿\t75871\n结合\t75872\nDcx\t75873\n寒老师\t75874\n港囧\t75875\n￥改委\t75876\n新疆维吾尔自治区人民医院\t75877\nJjghhfg\t75878\ndinnghall\t75879\neeeeeeeeeeeeeeeweeeeeqchsjcjakiwdifneejijwkoeojmevjejmdvievncjndisikqij\t75880\n呀呀\t75881\n元旦快乐吧\t75882\n2月16日至19日\t75883\n晕颜语凝结\t75884\n跑步机\t75885\n河北收视指南频道\t75886\n异思迁\t75887\n安乐窝\t75888\n告诉我你是不是人类你要是你要是骗我的话\t75889\n拒聊\t75890\n过火\t75891\n423144549\t75892\n安装\t75893\n浅水区\t75894\n看不开\t75895\n2-3小时\t75896\n麦克风\t75897\nMusicRadio\t75898\n语庭\t75899\n2085679122\t75900\n读史\t75901\n皇马\t75902\n案发\t75903\nnousooc\t75904\n头多大\t75905\n啊美\t75906\n外婆锅巴米饭\t75907\ndhdurtteyjhfhfjfgvhdbdjdhdjdbjxndnxjbfFdjvijgjjfgjeififjjfttgjjfutufuhhuhhhhhhhhhhhhhhhhhhhhhjhhjhhhhhhhhhujjjjjjjjjhjjjjjjjjjjjjjjjjjjjjbbjbxndjdbxjxjddjjddjdjfjrjfhdjijekefjjrkfjkejejjjrjsndjdjjrjejejrjkdjefijffjgfjfifitiititiguguekc8mofkrrjjfbrjtjfjfitjuijfigjtjggjrjjfjgjjjgffkfiiffkfkjf\t75908\n苏AX3116\t75909\nonneprsre\t75910\ncomplish\t75911\n榆林城\t75912\n亲爱的亲我一下\t75913\n8885\t75914\n迷迷糊糊\t75915\nmjmptgmpgq\t75916\n二个多小时\t75917\n45613\t75918\n案号\t75919\n关毕\t75920\nbuvftser\t75921\ndnjdjd\t75922\n再创\t75923\n任奇禄\t75924\n58本\t75925\n啦啦啦啦啦\t75926\n一掌扇\t75927\n彦斌子\t75928\n荤\t75929\n郭哈哈哈哈\t75930\n鸣锣\t75931\n画片儿\t75932\n夸张句\t75933\n三座\t75934\nWWW\t75935\n真的不聊\t75936\n50000塊錢\t75937\n2015年9月3日\t75938\nn85545n758656534558555555554675738566\t75939\n冻伤\t75940\niu兔兔\t75941\n荡\t75942\nWWE\t75943\n右安门\t75944\n13137\t75945\n13136\t75946\n红梅姑\t75947\n片甲\t75948\n夏贝贝\t75949\n好笑哈哈\t75950\nWWI\t75951\n图兔兔图图\t75952\n佛罗伦萨与文艺复兴：名家名作\t75953\n林婉欣\t75954\n258年\t75955\n龙江雅苑招商启动会\t75956\n甲丸\t75957\n笑勒\t75958\n自强不息\t75959\n见辉\t75960\n2003-7-6\t75961\n悄无声息\t75962\n韦慧翠\t75963\n雅雅\t75964\n育红小学\t75965\n包围\t75966\n醫生說\t75
967\n1588\t75968\n1589\t75969\n林建东\t75970\n1582\t75971\n抄牌\t75972\n便器\t75973\n七十几哪\t75974\n只有一天\t75975\n挣脱\t75976\nProgenitorhiringrheumatoidengulfedGriffithenhances\t75977\nxxvb\t75978\n获胜者\t75979\n多健康\t75980\n划桶\t75981\n方略\t75982\n122334455667788990099887766554433221\t75983\n老公我是乖乖\t75984\n啦啦啦啦啦啦啦好多咪呀度秘\t75985\n一动\t75986\n一劫\t75987\n神牛\t75988\n华心姐\t75989\nthousand\t75990\n小小白兔跳跳跳别的朋友说你好\t75991\niuu\t75992\n回味无穷\t75993\n李昊阳\t75994\n周燕玲\t75995\n丹儿\t75996\n一爱\t75997\n杨宏宇\t75998\n心爱\t75999\n胡歌雅\t76000\n不好了了\t76001\n13796118225\t76002\n哥布林\t76003\nsibghvxhh\t76004\n格里麦\t76005\n凯里市\t76006\n金陵十三钗\t76007\n超级塞\t76008\n唉不理\t76009\n二零零零\t76010\n一力\t76011\n凯呢\t76012\n费五十\t76013\n易观\t76014\n圆锥\t76015\n贾樟柯\t76016\n揽途\t76017\n不关\t76018\n杨幂度\t76019\n婊\t76020\n28286\t76021\n素孙\t76022\nSDDSsed\t76023\n不兴\t76024\n不养\t76025\nwp7\t76026\nt3者\t76027\n小矮子\t76028\n第一5\t76029\np1批\t76030\n不入\t76031\n吝啬爱\t76032\n不全\t76033\n鹿夫人\t76034\n不公\t76035\n贺显瑶\t76036\n一万五一个\t76037\n藏刀\t76038\n坯芽\t76039\n也克苏\t76040\ncdcdcdcctv\t76041\n不元\t76042\n要期\t76043\n陈怡昌\t76044\n不兄\t76045\n乎略\t76046\n神雕侠侣\t76047\ndegicd\t76048\n龙华儿\t76049\n刺参\t76050\nGP02\t76051\n想吃吃吃吃吃\t76052\nAERA\t76053\n长一智\t76054\nwpt\t76055\nwpw\t76056\n仳伏十\t76057\nalbum\t76058\n圣西罗球场\t76059\n逆卷\t76060\n情有独钟\t76061\n79蚊\t76062\n猜文\t76063\n张文明\t76064\nabca式\t76065\n凉子\t76066\n头朋友\t76067\n哭了你难受\t76068\n下午一点二十一\t76069\n众志成\t76070\n怕忙\t76071\n荣耀6plusn\t76072\n牙子\t76073\n高高斯奥\t76074\n张梦圆\t76075\n7soul\t76076\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t76077\n我疼你我爱你我疼你\t76078\n老实实\t76079\n一人称\t76080\nqert68deqweqwertyuiphtyyl\t76081\n成一想\t76082\n共同点\t76083\n155026977519\t76084\n西溪\t76085\n汤琦雪\t76086\n香菇包\t76087\n郎人\t76088\n万望子\t76089\n小奇奇\t76090\n雾草\t76091\n望眼欲穿\t76092\n交城\t76093\n广西省\t76094\n7月17日\t76095\n贡唔\t76096\n登台架\t76097\n投靠\t76098\n3年内\t76099\nnggèxiàohuà\t76100\n一二三四五六七八九十十一十二十三十四十五十六十七十八十\t76101\n说干嘛\t76102\n二十几次\t76103\n打杀杀神探\t76104\n歪曲\t76105\n我的演唱会\t76106\nPKPKPKPKPKPKPKPKPKPKPK\t76107\n好哈锤子\t76108\nnjbbik\t76109\n剃须\t76110\n6Ytsfhefbofd4T匕一义妊\t76111\n前
部\t76112\n刘静如\t76113\n复赛\t76114\n林茂源\t76115\nccmyou\t76116\nBeers\t76117\n汉勇\t76118\n牙膏\t76119\nI做作业\t76120\niOd们的什么么么么事\t76121\n零分度秘\t76122\n尿白\t76123\n面面观\t76124\n吃想堡\t76125\n谢我干\t76126\n匿迹\t76127\n爱爱n99\t76128\n步步高家教机\t76129\n叶海腾\t76130\n成佑莘\t76131\n完美好\t76132\n了不起太狼小游你好\t76133\n10\t76134\n照主人\t76135\n敢于\t76136\n王宝峰\t76137\nguuvd\t76138\n对啊好\t76139\n拜拜拜拜拜拜拜\t76140\n创业者\t76141\n赶快点\t76142\n组团切\t76143\nygtrf\t76144\n护保\t76145\n双奥\t76146\n合疗\t76147\n美服\t76148\n下半叶\t76149\n尿尿裤\t76150\n张镇戎\t76151\n川剧\t76152\n每平米\t76153\n180公分\t76154\n很复杂\t76155\n认意思\t76156\n涌暧潮\t76157\n关键文\t76158\n不安分\t76159\n温泽红\t76160\nMirate\t76161\n吻合\t76162\n急死额呵\t76163\n沧海桑田\t76164\n枣庄\t76165\n郑达\t76166\nG20紫2500\t76167\n义诊\t76168\n吻吻\t76169\n陈薪宇\t76170\ngsvhsshxhis\t76171\n张歆睿\t76172\n孝顺\t76173\n躬亲\t76174\n狗萨莫狗\t76175\n第七步\t76176\n黑白灰\t76177\ndibs\t76178\n义询\t76179\n保险箱\t76180\n葉度秘\t76181\n萌萌哒萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌萌萌么么么么么么么么么么么么么么萌萌萌萌萌萌萌萌萌萌萌萌某某某\t76182\nzongzi\t76183\n迷路对不起\t76184\n误导\t76185\n它\t76186\n宄\t76187\n宅\t76188\n宇\t76189\n守\t76190\n安\t76191\n宋\t76192\n完\t76193\n宏\t76194\n起早\t76195\n刘海东\t76196\n心干\t76197\n宕\t76198\n葡萄果果\t76199\n宗\t76200\n官\t76201\n空麦\t76202\n红猪\t76203\n红猫\t76204\n宜\t76205\n宝\t76206\n实\t76207\n上来着\t76208\n宠\t76209\n审\t76210\n客\t76211\n宣\t76212\n室\t76213\n宦\t76214\n好阿换\t76215\n宩\t76216\n宪\t76217\n宫\t76218\n张陈浩明\t76219\n宰\t76220\n我决啥\t76221\n害\t76222\n宴\t76223\n更糟\t76224\n家\t76225\n全职\t76226\n容\t76227\n13145687741\t76228\n狼吞虎咽像熊熊\t76229\n宽\t76230\n宾\t76231\n宿\t76232\n估算\t76233\n朴席\t76234\n李春杰\t76235\n精灵精灵梦夜萝莉\t76236\n厢房\t76237\n1小把\t76238\ndfu\t76239\n9939995185\t76240\n司卫\t76241\nIELTS\t76242\n微型\t76243\n鬼妻\t76244\n露西\t76245\n衍\t76246\n元首\t76247\n找借口\t76248\n一两家\t76249\n平光蛋\t76250\n二零30点\t76251\n我是人我不吃西游不和红鞋\t76252\n好兄弟\t76253\n给我希望\t76254\n一薏米\t76255\nWCDMA\t76256\n18280元\t76257\n生植器\t76258\n落枕\t76259\n虚脱粗放\t76260\n13155269362\t76261\n50多具\t76262\n天天头\t76263\n张佳毅\t76264\n冷度\t76265\n友好路\t76266\n帕纳新奈科斯\t76267\n不瞒\t76268\n失忆症\t76269\n独行\t76270\n860万美元\t76271\n深
职\t76272\n立秋\t76273\n水晶灯\t76274\n随之而来\t76275\n竖图\t76276\n轉自度娘\t76277\n圆缘园\t76278\n天天天\t76279\n谢好谢\t76280\n国家主席\t76281\n86484545488191\t76282\n美宝莲\t76283\n没假\t76284\n颜真卿\t76285\n婿\t76286\n冷库\t76287\n董甜甜\t76288\n环系\t76289\n砍头\t76290\n艾佛尔铁塔\t76291\n兔肉\t76292\n度小秘\t76293\n捷克\t76294\n颗粒感\t76295\n杨公忌\t76296\n一百四\t76297\n哎呦什么了不起就是你\t76298\n心惊胆寒\t76299\n大兴区\t76300\n职校\t76301\n蓟县\t76302\n过大生日别\t76303\n4月19\t76304\n13庚\t76305\n第控妹控\t76306\n735橙29\t76307\n敢亲\t76308\noriginal\t76309\n心酸\t76310\n十多一点儿\t76311\n张永裕\t76312\n得分月打\t76313\n272.48万辆\t76314\n安尼系卡噻\t76315\nyouphotongiveme\t76316\n十八时24\t76317\n唐想\t76318\njjbivkg\t76319\nⅩ\t76320\n摄像机\t76321\n生活新报\t76322\n花名\t76323\n植物生长调节剂\t76324\n那那那那那那是你\t76325\nzmh\t76326\nvgbg\t76327\n人观\t76328\n痛下决心\t76329\n青琐高议中的长江后浪推前浪\t76330\n知后觉\t76331\nzmz\t76332\n晚撒\t76333\nzmx\t76334\n信访局\t76335\n囊中\t76336\nⅣ\t76337\n飞檐\t76338\n陕北\t76339\nvgby\t76340\n董鑫\t76341\nSFC\t76342\ncccvbnnnvvgfvb\t76343\n易龙\t76344\n双休伴侣\t76345\n六8\t76346\n囧镜头\t76347\n五笔字根\t76348\n小金瓶\t76349\nGhoshbuff\t76350\n放了假\t76351\n返回位\t76352\n十比二十大\t76353\njitru\t76354\n有一起玩\t76355\n低微者\t76356\n颤涅萌\t76357\n12yrek\t76358\n46.4元\t76359\n多吗度\t76360\n程小姐\t76361\n比亚补射\t76362\n半斤八两\t76363\n作乱\t76364\n来访者\t76365\n星五七星\t76366\n抱着你\t76367\ndutyt\t76368\n葫芦娃\t76369\n嗯神\t76370\n的真的好想你\t76371\n叽里咕噜\t76372\n周日了我真的爱上你了根本\t76373\n南吕\t76374\n度秘我爱你的意思\t76375\n这种人处\t76376\n嘞嘛们\t76377\n戏片\t76378\n马德里弗拉门戈\t76379\nZOL\t76380\n马头\t76381\n泠血\t76382\n适合于\t76383\n养天和\t76384\n昭羰们\t76385\nPCCOO└o┘凸凸\t76386\n聊聊金刚经\t76387\nAnglebaby\t76388\nntcb\t76389\nFloyd不变式断言法\t76390\n二二幺九\t76391\n获致\t76392\noollllollo\t76393\nIllagaaagjagjjgg\t76394\n葛蔚\t76395\n姚明星\t76396\n阿衰度\t76397\nhbjjj\t76398\n移默化\t76399\n小游们\t76400\n艾芘\t76401\n嗣超\t76402\n喝水\t76403\n流量流流\t76404\n应许\t76405\n三克\t76406\n摆酷\t76407\n巴塞罗纳\t76408\n逗鬼\t76409\n拦腰\t76410\n扣完\t76411\n35100\t76412\n度秘你好运\t76413\n十六十七岁\t76414\n李双慧\t76415\n奇幻\t76416\n伪娘\t76417\n小峰\t76418\n6789小时\t76419\n三匝\t76420\nxvgggyyfey4yfru\t76421\n边防\t76422\naction\t76423\n12期\t76424\
n大度秘\t76425\n唐婷\t76426\n出水\t76427\n吴俢涵\t76428\n赏荷\t76429\n一口一碗\t76430\n毒蛇\t76431\nyoung\t76432\n948581\t76433\n孙策\t76434\n陈国庆\t76435\n哇咯\t76436\n韩敏捷\t76437\n火情\t76438\n王师傅\t76439\n哇咔\t76440\n855\t76441\n856\t76442\n拒签\t76443\n850\t76444\n851\t76445\n852\t76446\n王喜娜\t76447\n抓手\t76448\n858\t76449\n3056929698\t76450\n└┐卍o卍\t76451\n李敬泽\t76452\n你的眼泪\t76453\n黄元帅\t76454\n唐雪雯\t76455\n鈤\t76456\n45360\t76457\n刷屏\t76458\n放该\t76459\n潮吹\t76460\n队列\t76461\n信询\t76462\nnbvd\t76463\n绝唱\t76464\n冒陆\t76465\n土豆片\t76466\nIiiii\t76467\n信诺\t76468\n平平安安\t76469\n丝隙\t76470\n任梦莹\t76471\n去哪里\t76472\n同姐\t76473\n同姓\t76474\n22：05\t76475\np那\t76476\n凌晨零时\t76477\n我是你的爱妃梦\t76478\n你最好玩儿\t76479\n冒险\t76480\n岳阳\t76481\n国书\t76482\n张予馨\t76483\n据报\t76484\n名讳\t76485\n悄悄话\t76486\n行使\t76487\nDAY189\t76488\n灵体\t76489\n十二枚\t76490\n么恩然\t76491\n南瑞\t76492\n朱雀吧\t76493\n爸爸娃娃\t76494\n学社\t76495\n我一晗\t76496\n三八佳节\t76497\n给我钱我要你给我钱\t76498\n0.4%\t76499\n金情\t76500\n对牛谈\t76501\n年息\t76502\n蜜糖\t76503\n夜梦\t76504\n超高跟松\t76505\n傲慢\t76506\n买马屁\t76507\n三全\t76508\n本命年\t76509\n普耶韦火山\t76510\n14366\t76511\n碰壁\t76512\n扑嘢\t76513\n大国战\t76514\nJIUYAO\t76515\n18点\t76516\n7VG\t76517\n再见为了\t76518\n天翼手\t76519\nwahatyaouno\t76520\n正压\t76521\n荒诞\t76522\n俩部\t76523\n竹城水寨\t76524\n遗精\t76525\n在沙里\t76526\n佛手经\t76527\n简体中文\t76528\n张景林\t76529\n娜么\t76530\n国教\t76531\n张一鑫\t76532\n台北\t76533\n大是大非\t76534\nMirror\t76535\n谐音版\t76536\n⊥\t76537\n成秘\t76538\n赞叹\t76539\n闪翼\t76540\n辽宁省科学技术厅\t76541\n吃云吞\t76542\n规模版\t76543\n赞友\t76544\n冯程\t76545\n新天生一对\t76546\n运维\t76547\n使者\t76548\n哇塞度youmic\t76549\n15至16日\t76550\npmp骨\t76551\n不挂科\t76552\navwsy\t76553\n比四\t76554\n行车电脑\t76555\n帐篷\t76556\n王凯文\t76557\n腊月二十号\t76558\n精灵树\t76559\n沈阳北站\t76560\n伦敦奥火\t76561\n英伦\t76562\ngshxhdhch\t76563\n纺织娘\t76564\n打鸽子\t76565\n好想看见字我的偶像\t76566\n办法\t76567\nandia\t76568\n我的少女时代\t76569\n4059361\t76570\n是我曾经很喜欢\t76571\n猩猩\t76572\n赵媛\t76573\n两生缘的歌\t76574\nVcdy\t76575\n欲晓\t76576\n三四线\t76577\n不是我说我的世界末影人\t76578\n九一大四\t76579\n完气\t76580\n我不是你的主人你的主人不要你了我也不要你了哼\t76581\n缘木求\t76582\neewwwwapwwwcnwwwwapwww
163comqqcomqqcom163comgmailcomqqcom163comgmailcomqqcom\t76583\n听我相信\t76584\n冠力\t76585\nUp主\t76586\n呼呼声\t76587\n衫寐\t76588\n高珊珊\t76589\n偏僻\t76590\n半裸\t76591\n旋风\t76592\n桐华\t76593\n达芬奇\t76594\nthirstyfor\t76595\n法日\t76596\n忧心\t76597\n八十八一八二八三八四八五八六八七八八八九\t76598\n堵头\t76599\n盐步\t76600\n何苗\t76601\n我图图图图图图图写\t76602\n你好认识\t76603\n小度秘你是我的小度秘\t76604\njjxm\t76605\n本辈子\t76606\nalways\t76607\ntpsa2横向括号interested和interesting括号fillimstraternersted\t76608\n联署\t76609\n作案\t76610\n何苦\t76611\njjxy\t76612\n月痩\t76613\n900多\t76614\n蔓\t76615\n豆丹人\t76616\n大别\t76617\n杨祎\t76618\nNGD\t76619\n呢度秘\t76620\n摸摸裆\t76621\n石伟\t76622\n小产权房\t76623\n说了吧\t76624\n82多少\t76625\n李熙慧\t76626\n奇犽\t76627\n内痔\t76628\n家本领\t76629\nx2y4\t76630\n扭力\t76631\n登录器\t76632\n跑得快跑\t76633\n一百分对\t76634\n十一节\t76635\nrepressive\t76636\n20.1\t76637\n扭动\t76638\n阿泽\t76639\ntrx\t76640\n每科\t76641\n每秒\t76642\nsuon\t76643\n呸呸呸呸\t76644\nsuoh\t76645\n说了吃\t76646\n拖拉错啦错\t76647\nJYJ美国场MTV\t76648\n一遍度\t76649\n阿波\t76650\n说了吗\t76651\n13935712982\t76652\nhighway\t76653\n5滴\t76654\n高玥莹\t76655\n嘻哈\t76656\n万方\t76657\n否定句\t76658\n咯单\t76659\n龙鳞\t76660\n大青山\t76661\n万斯\t76662\n贯通\t76663\nredss\t76664\n喜喜\t76665\n陈远然\t76666\n这张脸\t76667\n万斤\t76668\n说不相信\t76669\n我是呀我求你别给我\t76670\n念念念念\t76671\n救人者\t76672\n杏仁那\t76673\n崇明岛\t76674\n为你是\t76675\n刘志豪\t76676\n舞蹈演员\t76677\n张雅婷\t76678\n密不透风\t76679\nCathay\t76680\n64011112\t76681\n北二环会塞车\t76682\n预备\t76683\ndbhxhxhxidj\t76684\n去年11月\t76685\n组件\t76686\n所以说不想\t76687\n派伯\t76688\n路况\t76689\n叔家\t76690\n我长\t76691\n蔬\t76692\n纯萌\t76693\n发型系\t76694\n来华\t76695\nksijdiidjlqppjdhkalkgxkkndghkeko\t76696\n才肯\t76697\n王凯丽\t76698\nvjbg\t76699\n妈妈饭\t76700\n张庭州\t76701\n64180\t76702\n非爱人\t76703\n迷买\t76704\n勇敢就好哦度秘\t76705\n小航\t76706\n重打\t76707\n军靴\t76708\nwoshi\t76709\n吾愿\t76710\nwoshl\t76711\n理念\t76712\nParke翻唱Kelly\t76713\n郑开心\t76714\n不猜你猜不猜我猜猜猜\t76715\n真的很美\t76716\nwoshu\t76717\n发展史\t76718\n一小\t76719\n舒米\t76720\n林芝花\t76721\n一尊\t76722\n王丁鑫\t76723\n满堂我要满堂\t76724\nHAM2000\t76725\n一封\t76726\n安静了\t76727\nrvwhgr\t76728\n耳鸣\t76729\n撕碎\t76730
\n一尘\t76731\n薄膜\t76732\n3碗\t76733\n一少\t76734\n一套套\t76735\n花开\t76736\nmian\t76737\n帆布鞋\t76738\n10份儿\t76739\n10只\t76740\ntgfn\t76741\n10号\t76742\n10台\t76743\n一尺\t76744\n花弟\t76745\n般般\t76746\n方怡雯\t76747\n郑爽恺\t76748\n部疙哥\t76749\n李今天\t76750\n天错地错大错太错\t76751\n329697578。329697578\t76752\n污水人\t76753\n奔流\t76754\n调写\t76755\n13345677654321\t76756\n一条神奇的天路\t76757\n金三\t76758\n老不懂\t76759\n打止\t76760\nwsfgcc\t76761\n改革\t76762\nClub\t76763\nnoprobleam\t76764\n荆棘遍地\t76765\n發張全裸的妹紙\t76766\n结转\t76767\nkvuJkt2545855\t76768\niShit\t76769\n真丑真丑真丑真丑真丑\t76770\nplay\t76771\nLetsgo\t76772\n鲜忌廉4\t76773\n下单员\t76774\n复学\t76775\n拦截\t76776\nplat\t76777\nplak\t76778\n揉揉天刚好的事件衫分享受吧\t76779\nYeahIuse\t76780\nplan\t76781\n阅读类\t76782\nuatapkv\t76783\n惠济区\t76784\n计量\t76785\n好儿女\t76786\n帅好衰\t76787\n九4S\t76788\n度那度秘\t76789\n很好玩\t76790\n13087187819\t76791\n惠特\t76792\n理达\t76793\n好单纯很好\t76794\n博伦思\t76795\n1454189045\t76796\n别介意\t76797\n茂贞\t76798\n戈文\t76799\n王子琪\t76800\n1块\t76801\n惠顾\t76802\n潜行\t76803\n红豆粥\t76804\n东南网海峡都市报\t76805\nhyfy\t76806\nsession\t76807\n3680岁\t76808\n包族\t76809\n吧体\t76810\nwentima\t76811\n快一点行不行\t76812\n幺零零零幺\t76813\n麝香\t76814\nBBT\t76815\n心委掩人\t76816\n尿血\t76817\n李迭代\t76818\nBBS\t76819\n庭外\t76820\n经吃\t76821\n三之只\t76822\n面试\t76823\n不在一起\t76824\n赏人\t76825\n第41\t76826\n1465\t76827\n05652655885nnn5\t76828\n遍升\t76829\n度秘啦\t76830\n杜浦\t76831\n善科yl\t76832\nwayIam\t76833\n朝伟\t76834\n六安\t76835\nghvfgjx\t76836\n我喜欢商永锐\t76837\n善科ya\t76838\n唉无奈\t76839\n一感\t76840\nOppa\t76841\n林会杰\t76842\n没我了你会\t76843\nKung\t76844\n回电\t76845\nxxzzzzzzz\t76846\n黄亚男\t76847\nMirotic\t76848\n10万8千里\t76849\n深圳湾体育馆\t76850\n一意\t76851\n饭来吧\t76852\n嗯欣赏\t76853\n大老啤\t76854\n一愿\t76855\n早安O\t76856\n张槎\t76857\n退出去\t76858\n你干\t76859\n一分钟之内\t76860\n回甘\t76861\n回生\t76862\n丁火\t76863\nSingle\t76864\n白吃\t76865\n还幼\t76866\n文神鹰\t76867\n王宇博\t76868\n求魔\t76869\n鱿鱼爆炒边草\t76870\n哪么多\t76871\nB型血\t76872\n百家姓中尊姓\t76873\n发哥成\t76874\n二手书\t76875\nxiah\t76876\n修斋\t76877\n王旖旎\t76878\n古城\t76879\n黑黑的夜空低垂亮亮的繁星相随虫儿飞虫儿飞\t76880\n有低无\t76881\n合力\t76882\n20：56\t76
883\n优越性\t76884\n蔡德聪\t76885\n解释\t76886\n我恨你我就是恨你\t76887\n麦小宁\t76888\nyuuggghv\t76889\n北京昆仑饭店国际会议厅\t76890\n溪谷呗\t76891\nyatouko\t76892\n家雪\t76893\n宠物猫\t76894\n咏史\t76895\n乌拉街\t76896\n辣阿嚏阿\t76897\nehie\t76898\n宠物猴\t76899\nhttppinyincne6197\t76900\n放一首情归\t76901\n旱灾\t76902\nahou\t76903\n20b3b17c十七\t76904\n郑宇飞\t76905\n家雅\t76906\n家集\t76907\n咯菌腈05845664585885558558nnnnnnnnn887551358288885562516\t76908\n紧要\t76909\nNina\t76910\n蒋俊强\t76911\n力不从心\t76912\n卡特拉\t76913\n樱花不爱我不要你我不累你在哪不小心\t76914\n唉瓮出\t76915\n浸热\t76916\n李维佳\t76917\n焊工\t76918\n南京国通\t76919\n断情\t76920\n100万个\t76921\n枧器\t76922\nhttpahiphotosbaiducomxiaodupiciteme\t76923\n重庆市政府\t76924\n不高兴玩\t76925\n上海商学院\t76926\n李睿洋\t76927\n鸟蚌\t76928\nfghijk\t76929\n贺词\t76930\n值守\t76931\n虚无你的你收术\t76932\n闫会祥\t76933\n骨朵\t76934\n操操操操操\t76935\n不过三\t76936\n处对不起\t76937\n0一0\t76938\n2027650764\t76939\n喝酒醉\t76940\n张正太\t76941\nfirm\t76942\n从今日起\t76943\n正月初五\t76944\n8o一\t76945\n天下雪\t76946\n春節\t76947\nfire\t76948\n吴隆奇\t76949\n金牛岭\t76950\n咖色\t76951\n10、20、30日\t76952\ncvccvv54425\t76953\nhttpahiphotosbaiducomxiaodupicitem4\t76954\n彭可\t76955\n伟大的诞生\t76956\n金山路\t76957\n彭叫\t76958\n侧龃\t76959\n衰\t76960\n可乐罐\t76961\n松方\t76962\n华安旗\t76963\n快问快答#\t76964\n不得不爱\t76965\nfirs\t76966\n此时\t76967\n最高者\t76968\n相惜\t76969\nConcert\t76970\njjnbkllj\t76971\n宁同一\t76972\n401\t76973\n噜噜噜噜\t76974\ngddfghfvvf\t76975\n铁饼\t76976\nBELIEVE\t76977\n自圆其说\t76978\nHGHgFCCBhuttogfcbhhdazzlefdgjkk\t76979\n鳕倩\t76980\n青芒果\t76981\n张天咪\t76982\n15090404261\t76983\n多米我很喜欢\t76984\n不列颠\t76985\n巴厘岛\t76986\n22个小时\t76987\n最先\t76988\n黄丽娇\t76989\n首支\t76990\n最元\t76991\n差不辣\t76992\n庞物\t76993\n凌沉\t76994\nzzko\t76995\n乱套了\t76996\n最具\t76997\nhello喵\t76998\n去家强\t76999\n智能级\t77000\n最全\t77001\n吶喊,\t77002\n4848\t77003\n2011年5月20日\t77004\n过度子韬\t77005\n1760861216\t77006\n姜主任\t77007\n审美观\t77008\nsbsbsbsbsbsbbsbsb\t77009\n天一亮\t77010\n吉祥话\t77011\n如风\t77012\ngtchfyghf\t77013\n2x3472\t77014\n陈仲玄\t77015\n达儿\t77016\nCvs\t77017\n如飞\t77018\n233333233333\t77019\n李子阳\t77020\n水山\t77021\n来玩玩\t77022\n{\t77023\n甲板\t77024\n25
72400596\t77025\n若兰\t77026\n二零三\t77027\n昂佳\t77028\n知识分子\t77029\n哈楼哈楼哈楼哈楼哈楼哈楼\t77030\nfhffh\t77031\n好啦好啦\t77032\n感人至深\t77033\nQuality\t77034\n陈亦北\t77035\n活去\t77036\n开卷\t77037\n博彩\t77038\nxndn\t77039\n地球\t77040\n博彦\t77041\n地理\t77042\n11年1月11日\t77043\n一佯\t77044\n不要问\t77045\n一佰\t77046\n开卡\t77047\n袭击\t77048\n数秒\t77049\n袭凶\t77050\n说话说错\t77051\n睡了安\t77052\nyouan\t77053\n热胀\t77054\nvgfmccvxccvgdhgdhgxghhjjhjjhjh\t77055\n开卖\t77056\n一个五百\t77057\n0一8\t77058\n尹嘉霖\t77059\n一位\t77060\n重峦叠嶂\t77061\n万色\t77062\n生食\t77063\n一下摆摆\t77064\nGDGFJHDEJCH\t77065\n赵鹏涛\t77066\n1612111\t77067\n徐州\t77068\n我的这么大了还小朋友实话给你说吧我爱你\t77069\n欧燕妮\t77070\n1313亮晶晶漫天\t77071\n插头曲\t77072\n壓倒\t77073\n万洁凤\t77074\n14745曲\t77075\n喜欢度\t77076\n绳艺\t77077\n张露雨\t77078\ncdm机\t77079\n亮江江萌天龙\t77080\n97号\t77081\n褲襠\t77082\n蒙雨涵\t77083\n有聊\t77084\n33.999米\t77085\n不必不\t77086\n全思文\t77087\n987cn\t77088\n233333333333333333333333333333333333333333333333333333333\t77089\n出出\t77090\n出击\t77091\n加蓬\t77092\n王妹妹\t77093\n一咪\t77094\n狠狠哭\t77095\n一把十支\t77096\n隆冬锵咚锵\t77097\n度广告\t77098\n男人床\t77099\n75000\t77100\n肾阳\t77101\n首都高速路公司\t77102\n136411\t77103\nq版泡泡堂\t77104\n嘉年华\t77105\n13973215443\t77106\n肖万珠\t77107\n工程师哥\t77108\n动产\t77109\n弹吉他\t77110\n4.8%\t77111\n美国国务院\t77112\n评委\t77113\n滥杀\t77114\n爱你好那个呀\t77115\n十力\t77116\n弘一法师\t77117\nhttppinyincne12381\t77118\nyourday\t77119\nsgf\t77120\ntyf\t77121\n水屋\t77122\n在意会\t77123\n秘吓\t77124\n母葬\t77125\n百度传课\t77126\nLook\t77127\n军舰\t77128\n毳绍\t77129\n水蛭\t77130\n秘君\t77131\n亚利桑那州\t77132\n交通拥挤\t77133\n珊儿们\t77134\n考入\t77135\n13573905777\t77136\n好猛\t77137\n峻服\t77138\n秘吏\t77139\n杜波夫\t77140\n电动扶梯\t77141\n崔亚斌\t77142\n改过\t77143\n盖脸\t77144\n改进\t77145\n唉唉安安安安安安安安\t77146\n质保听\t77147\n明信我叫\t77148\n尼克松\t77149\ntyy\t77150\n想死\t77151\n佳琪\t77152\n胸地\t77153\n淡季\t77154\n臭臭\t77155\n三巡\t77156\n一秒种\t77157\n58488946759\t77158\n麦当当\t77159\n九年级语文书人教版\t77160\n邹平县\t77161\n玖殇\t77162\n德籍\t77163\n不在我的度秘臭不要脸的度秘\t77164\n绕道\t77165\n烤翅\t77166\n七娃\t77167\nebz5\t77168\n饿的福特热风格有点福特水调歌头\t77169\n绿岛\t77170\n高风亮节\t77171\n兼程\t77172\n来不去\t77173\n
沈建光\t77174\n嗯太有\t77175\n冉建新\t77176\n16.1%\t77177\n待续\t77178\n珊瑚色\t77179\njgtawad\t77180\n最后一道题\t77181\n经想\t77182\n切切切切切\t77183\n锁定期\t77184\n知其难\t77185\n圣爱\t77186\n校花比赛\t77187\n邓博元\t77188\n大鸡卜\t77189\n茶花烟\t77190\n不要你是人的话\t77191\n草房\t77192\n康壹加壹\t77193\n唐佳怡\t77194\nling\t77195\n花儿乐队\t77196\nline\t77197\nlind\t77198\n屋漏逢偏连夜雨\t77199\n4577587255788775453\t77200\nlina\t77201\n快活块\t77202\n怎么办\t77203\n侯佳怡\t77204\n逼死\t77205\n片形\t77206\n连滚\t77207\n平民家\t77208\n87岁\t77209\n成对\t77210\n直接那嘛\t77211\n互为表里\t77212\n15223281341\t77213\n換來\t77214\n波及\t77215\n编写\t77216\n二十三四十七七十九\t77217\n胡箩卜\t77218\n第五只\t77219\n第五句\t77220\n树藤荡\t77221\n10万美元\t77222\nddhfydyeudieifufjsjcjdjfidurhfhrhcjsjccc\t77223\nexue\t77224\n坐立不安\t77225\n好啦麻烦你\t77226\n无需\t77227\n称霸\t77228\n邵镜清\t77229\n就是我和你\t77230\n雪狐\t77231\n乐天看\t77232\nffzeg\t77233\ntyrgif\t77234\nrobots\t77235\n并道\t77236\n浓郁\t77237\nPPTV视\t77238\n国宝级\t77239\n避风\t77240\n未卜\t77241\n一上午\t77242\n状元\t77243\n30230173\t77244\n无聊网\t77245\n382元\t77246\n大排\t77247\n真标\t77248\n一个12000\t77249\n米洛治\t77250\n过背背三字经\t77251\n魔幻季节的秘密\t77252\n观瞻\t77253\n全球热恋\t77254\n商业片\t77255\n双竹村照吧\t77256\n6465365\t77257\n山乐亭\t77258\n谁和你基友\t77259\n第二根\t77260\n宅府\t77261\n双峰\t77262\n没用了吧\t77263\n忘笙\t77264\n刘胜龙\t77265\njijiao\t77266\n索宇琦\t77267\n我相信你可以\t77268\n版本号\t77269\n紫霞\t77270\n拉里亚\t77271\n国属人\t77272\n潘999\t77273\n珍珠粒\t77274\nucjvbvj\t77275\n555525455758uhbjgdzdhggkc\t77276\n仙居\t77277\n捐助\t77278\n贝吉塔\t77279\n玫瑰林\t77280\n登校\t77281\n短板\t77282\n播\t77283\n1962年8月10日\t77284\n回台\t77285\n近两月\t77286\n囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧\t77287\n排舞\t77288\ncugfgnhggghggggggggggggggggffggggfffffff\t77289\n凤专一\t77290\n老寻\t77291\n5月7号\t77292\n24513596\t77293\nnnnnnnnnnnbbbb\t77294\n有事儿\t77295\n放飞\t77296\n娇娇者\t77297\n废止\t77298\n今天二月一号\t77299\n蒙古袍\t77300\nzgxhfer\t77301\n571866\t77302\n怎么劲\t77303\n大馄饨\t77304\n哈哈窝\t77305\n70斤\t77306\n金融机构\t77307\n转场\t77308\n华晨宇\t77309\n情感文摘\t77310\n赖活\t77311\nOVO\t77312\n杀手锏\t77313\n五千公里\t77314\n1362576685233724863244\t77315\n杨树花\t77316\n北岸\t77317\n张真\t77318\n今年1月4日\t77319\n三百三百多
名\t77320\n梁老\t77321\n走亲戚\t77322\n锁屏\t77323\n我的天那你猜猜我\t77324\n速锐\t77325\n黑烟\t77326\n5411\t77327\n无谓\t77328\n事想\t77329\nLinkedInZynga\t77330\n人家票\t77331\n速销\t77332\n远看\t77333\n拼字\t77334\n多长远\t77335\n乱说乱说\t77336\n连写\t77337\n插孔\t77338\nled\t77339\n书名\t77340\nleg\t77341\nlei\t77342\n刚来\t77343\n我的背\t77344\n中国宝安\t77345\nlen\t77346\nleo\t77347\n多端\t77348\njmupj\t77349\n算了转\t77350\n曹婓\t77351\nlew\t77352\n23个\t77353\n陶冶\t77354\n轻女\t77355\nSBSB\t77356\n小小可可\t77357\n考核表\t77358\nlinalabalan\t77359\n該晚會錄制\t77360\n李飞\t77361\n欧买尬\t77362\n少说再来\t77363\n代为\t77364\n言论\t77365\n书吧\t77366\n安美佳\t77367\n南河丽景二栋\t77368\n刘小霞\t77369\n2月3日\t77370\n万一级\t77371\n投机者\t77372\n肇庆罗\t77373\n又错了我的身大多图图\t77374\n小金人\t77375\n越轨\t77376\n339\t77377\n盲降\t77378\n335\t77379\n334\t77380\n336\t77381\n331\t77382\n330\t77383\n333\t77384\n332\t77385\n好没劲\t77386\n13753056338\t77387\nhxhxhxhc\t77388\ny2x80\t77389\nkjggffffff\t77390\n33%\t77391\nfgzhxhfv\t77392\nstanding\t77393\n平辈\t77394\n樊启盛\t77395\n溥冰\t77396\n破机\t77397\n出钱\t77398\n1千五万四千六百九十一\t77399\n赤西仁大宝贝\t77400\n悲惨世界\t77401\n蓝豹\t77402\n堂亚\t77403\n萌萌白拜拜萌宝萌萌萌萌\t77404\n定放\t77405\nJoensson\t77406\n33x\t77407\n百里雪雪\t77408\n哈哈过奖过奖\t77409\n无缘大慈\t77410\n毛里求斯\t77411\n新行头\t77412\nZMZ\t77413\n谁是你的脸我是一姐姐\t77414\n用车\t77415\n发泡\t77416\n好老婆\t77417\n熊嘴\t77418\n资本市场\t77419\n儿童节\t77420\n叫力\t77421\n胎心\t77422\n被造\t77423\n本上\t77424\n快节奏\t77425\n琪哥\t77426\n巨度秘\t77427\n水钱东流若为硬件别离情\t77428\n潘美君\t77429\n劳烦\t77430\n贱骨头\t77431\n向日\t77432\n1百元\t77433\n放发\t77434\n王菊开\t77435\n老鼠\t77436\n不可外翻\t77437\n朱雀橙\t77438\n走失\t77439\n斯云盘\t77440\n碧螺春\t77441\n去角落\t77442\n15114226826\t77443\n304687\t77444\n爆满\t77445\n一和\t77446\n芥菜\t77447\n丫生\t77448\n美居\t77449\n去哪儿打\t77450\n王婆\t77451\n狭路相逢勇者胜\t77452\n7k7k小游\t77453\n秘菠萝菠萝蜜\t77454\n好呀吃麻辣黄瓜\t77455\n束之高阁\t77456\n王婷\t77457\n丑五丑\t77458\n孙皓宇\t77459\n二里地\t77460\n发长\t77461\n九不该\t77462\nanimal\t77463\n3.87点\t77464\n童子军\t77465\n王婧\t77466\n缝章\t77467\n干啦\t77468\nwjiw\t77469\n轻喜剧\t77470\n诛笔\t77471\n倒土\t77472\n丽江西\t77473\nnpj\t77474\n大时刻\t77475\n我听\t77476\n河南大学\t77477\n永生永世永
生永世\t77478\n不懂不走\t77479\n吃肉\t77480\nnpy\t77481\n低血糖\t77482\n幸福就好\t77483\n每隔十秒\t77484\n弥留\t77485\n奥卡卡科技局\t77486\n书白小跳\t77487\n哦破某ljling\t77488\nsyboml\t77489\n陈冰洁\t77490\n咕噜\t77491\n偷着乐\t77492\n清俊\t77493\n富者\t77494\nlz27834451\t77495\n二零六幺\t77496\nghiphotosbaiducomxiaodupicitem4b90f603738da977efa9995eb751f8198718e3c5jpg\t77497\n六十年代\t77498\n黑胡椒\t77499\ngtudrd\t77500\n潮男\t77501\n膀胱\t77502\n湖北经济学院\t77503\n看听\t77504\n卢培轩\t77505\n金鸡奖\t77506\n历历在目\t77507\n市桥\t77508\nvcggh\t77509\n22%\t77510\n四一米\t77511\n是的我爱你我们结婚吧\t77512\n洛库\t77513\n3次\t77514\n坚励\t77515\ngfxh\t77516\n洋洋洒洒\t77517\n巍然屹立\t77518\n读法\t77519\n跟经\t77520\n看吃\t77521\n温暖躯壳\t77522\n旅游界\t77523\n百分之一百\t77524\n嗯cn\t77525\n46918646794939497679467646196441\t77526\n当事人\t77527\n爱普斯黑\t77528\n政纪\t77529\n你好你好我亲爱的朋友\t77530\n立德\t77531\n大恩不言谢\t77532\n555555555555555555555544544444\t77533\n偶象\t77534\n珍天真萌萌哒\t77535\na15安咳嗽\t77536\n相链接\t77537\nbat\t77538\n强拆队\t77539\n狗既片\t77540\n公开课\t77541\nbar\t77542\n曲项\t77543\n吉安市\t77544\n641755523\t77545\n秘柏秘\t77546\n两千多万\t77547\nrrtrffffbdhh\t77548\n立得\t77549\n安版\t77550\n耍组\t77551\n张也也\t77552\n眼珠\t77553\n福齐天\t77554\n丫子\t77555\n圣慢\t77556\n675成\t77557\n打盹\t77558\n难舍\t77559\n杨颖露\t77560\n胡志秋\t77561\n想我不我\t77562\n别老\t77563\n弹弹\t77564\n电子邮件\t77565\n摸鱼者\t77566\n呢哪\t77567\ndicrjcg\t77568\n韩庚#的歌\t77569\n普度众生\t77570\n莫名情绪低落\t77571\n不呀不\t77572\nChung#\t77573\n马屁精\t77574\n挺话\t77575\n呢哥\t77576\n54426457046464564546754\t77577\n跨年晩\t77578\n壮胆\t77579\n四十多场\t77580\nbag\t77581\nvulcandp\t77582\n老老实\t77583\n文峰\t77584\n弹弓\t77585\n台份\t77586\n跟前任\t77587\nhcicis\t77588\n颤音\t77589\n还口\t77590\n内地\t77591\n滑板车\t77592\n垃圾箱\t77593\n克气\t77594\n哼谢\t77595\n爱卿何事启奏\t77596\n谢谢你有你真好\t77597\n一盏离愁半梦醒长烛瘦尽夜阑处烟雨隔君千万里念念飞花风满楼\t77598\n美好的爱情\t77599\n重卡\t77600\nsurely\t77601\nbao\t77602\nhayou\t77603\nbal\t77604\n闫善勤\t77605\n阿金\t77606\n联军\t77607\nsukcd\t77608\n形态\t77609\n邻里\t77610\ndavid\t77611\n万九千九百九十九块\t77612\nhfx\t77613\n阮陈恩静\t77614\n来说去\t77615\n你了我的心\t77616\n大鹿晗\t77617\n太汉子\t77618\n讲故事行\t77619\n百分之\t77620\n日爽\t77621\n诸佛教\t77622\n叫记住\
t77623\nfffdffcf\t77624\nsdgrhekc\t77625\n焊\t77626\n飞龙别\t77627\n表喊\t77628\n勒不勒不勒不勒\t77629\n考诚\t77630\nxnkskakmzmxmxmmxjxjxjjxjx\t77631\n夏先生\t77632\n四二\t77633\n西亚片\t77634\n爱伱\t77635\n考试\t77636\n蒲松龄\t77637\n羊损\t77638\n秘我叫宴\t77639\n初音未来漫画\t77640\n落选\t77641\n风霜雨雪\t77642\nqsov\t77643\nwjwjwk\t77644\n考评\t77645\n潜移默化\t77646\n猜调\t77647\n老三老四老五老六\t77648\n060416\t77649\nTNHT\t77650\n女教师\t77651\n谦恭\t77652\n胚子\t77653\njdigd\t77654\n樊水\t77655\n杰夫\t77656\nJustinism\t77657\n出彩中国人\t77658\n猜谜\t77659\n诖锵畈炙净渗泌片5rr然臼\t77660\n2008年间\t77661\n三点水儿\t77662\n航标\t77663\n零七一\t77664\njtjs\t77665\n始祖\t77666\n前轮\t77667\n13级\t77668\n妙办\t77669\n零幺幺幺\t77670\n行政庭\t77671\nIE额\t77672\n价钱\t77673\n渐暖\t77674\nwhatme\t77675\n西强我弱\t77676\n制不你\t77677\n贵阳话\t77678\n紫金城小区\t77679\n这吨\t77680\n丽兹\t77681\n逗边\t77682\n一醉方休\t77683\n明天上午中午十二点\t77684\n兑付\t77685\nbiih\t77686\n奥爹\t77687\njgjgj\t77688\n金志国\t77689\n哭了\t77690\nstudying\t77691\n1.2万\t77692\n我不理你了你该说你爱东方不爱我\t77693\n两三十多钟\t77694\n登门\t77695\n充好\t77696\n罗梓烨\t77697\nc6nc\t77698\n两百多岁\t77699\n红河县\t77700\n好吧goo\t77701\n被逮捕\t77702\n夏雨天\t77703\n真度秘\t77704\n不服从\t77705\n么么么么\t77706\n我也我的我想哭你你你你说吧说煽情的话\t77707\n新专辑\t77708\n流离\t77709\n14522334929526n\t77710\n赛雷喔\t77711\n夺命猜猜我\t77712\n81毫米\t77713\n上个礼拜\t77714\n新网\t77715\nxiaoshuo\t77716\nyuokhgguuhhgggggfghhhhhhggtitggfdddfhhbvhuyhgHgtttfttghjvvghgftfviygfycffgjgg\t77717\nbbbbvmtjthma\t77718\n同秘\t77719\n心情好心情好\t77720\njgjg1\t77721\n王源帅\t77722\njgjg5\t77723\n七便便舞\t77724\nTrackI\t77725\n欸欸繼續\t77726\n儿媳\t77727\n诓\t77728\n是什么快说完\t77729\n维维\t77730\n100万岁\t77731\nvbccufj\t77732\n真心稀饭\t77733\n艳萍\t77734\n科技\t77735\n十千十百两百二\t77736\n孟子铭\t77737\nspanahe\t77738\n暴食暴食\t77739\n吉河南\t77740\n越级\t77741\n刘姝葳\t77742\n我是任洛诗\t77743\n错综复杂\t77744\nGuvdfhcf\t77745\n越线\t77746\n灭水器\t77747\n09期\t77748\n翻过\t77749\n其一身\t77750\n最爱我的小妞妞老板了[好爱\t77751\n大宝大宝\t77752\n两周年\t77753\nHGHRD\t77754\n弹出去\t77755\n生病难过\t77756\n9618A黑￥\t77757\n交代\t77758\n将士\t77759\n元宝\t77760\n所在地\t77761\n阿咪三\t77762\n慕尼黑的天空好蓝\t77763\n草莓旺仔牛奶\t77764\n猜我哪儿\t77765\n轻身\t77766\nKitty7998877525
88\t77767\n129473家\t77768\n546564\t77769\n188772587725\t77770\n0.5毫克\t77771\nwixdxbbxishdi\t77772\n阿福贝贝\t77773\nnnnnnbbbbbb\t77774\n朱咪咪\t77775\n交付\t77776\n6月9日下午两点\t77777\n2016年1月28号\t77778\n元宵\t77779\n吃俩\t77780\n冲锋号\t77781\n了别理\t77782\n玛姬\t77783\n熊胆\t77784\n关注者\t77785\n好胖好胖好胖\t77786\n任德华\t77787\n世道\t77788\n补阳类\t77789\n刘羽翔\t77790\n四十分钟\t77791\n蔬菜沙拉\t77792\n后爹\t77793\n李有恩\t77794\n取而代\t77795\n代言\t77796\nOOOOOONNNNNNN\t77797\n给出题\t77798\n魏妙妙\t77799\n减值\t77800\n过年就回卜弋上不上我的人\t77801\n少女时代秀英\t77802\n搔扰\t77803\n不见爱\t77804\n袖标\t77805\n土司\t77806\n济宁市\t77807\n航班准点率\t77808\n六六块\t77809\n阴阳\t77810\n阴阴\t77811\n好好过\t77812\n妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈\t77813\n死狙击\t77814\n第四极\t77815\n廖敦利\t77816\niiOO\t77817\n连怪\t77818\n毛毛\t77819\n绫濑\t77820\n白发魔女传之明月天\t77821\n手机号\t77822\n四撒\t77823\n九九歌\t77824\n巴莱\t77825\n御瑜\t77826\n不买拉\t77827\ndeenendemobaom\t77828\n动车站\t77829\nD10\t77830\n每一回\t77831\n点事儿\t77832\n555o\t77833\n荇菜\t77834\n漫天\t77835\n自寻\t77836\n夜里两点十一分\t77837\n雷小姐\t77838\n灵剑\t77839\n还有馆\t77840\n录播\t77841\n我的代理\t77842\n调路由器\t77843\n小声点爸爸你大声点大声点大声点\t77844\n明天十点\t77845\n你是谁呀你为什么不问我的梦\t77846\n那妩\t77847\ngvdhb\t77848\n超白战\t77849\n那件事\t77850\n颜子\t77851\n自察\t77852\n10000000000000000000000000000000000000000000000000000个\t77853\n管厂\t77854\n九中门\t77855\nSexy\t77856\n狂犬\t77857\n情趣\t77858\n妗妗\t77859\n很难过\t77860\n999小小小小小\t77861\n认笑话\t77862\n大肉\t77863\nldll\t77864\n大肌\t77865\nTc4jy\t77866\n5555\t77867\n70年代\t77868\n陈爸\t77869\n真心话\t77870\n蛤\t77871\n大肚\t77872\n几十七八十\t77873\n1月6号\t77874\n安感\t77875\ntertf\t77876\n飞思\t77877\n大肠\t77878\n呗11拉比克\t77879\nGbhgfggfh\t77880\n郑祖星\t77881\n乐宁姐\t77882\n马后炮\t77883\n45604560\t77884\n哈真\t77885\n二七六六九二二六\t77886\nv九点cc\t77887\n卜啵啵啵\t77888\n0.51个\t77889\n最划算\t77890\n那妞\t77891\n我是天皇我少女\t77892\n参照系\t77893\n川渝\t77894\nNONONONo\t77895\n没用了说\t77896\nmmqwerrttyuio\t77897\n6146070486411\t77898\n奸恶\t77899\n偶爾瞬間\t77900\n噶份\t77901\n嗨森\t77902\n啦咯\t77903\n好多一点\t77904\n15062\t77905\n嗯秋衣\t77906\n一吻之间\t77907\n旅游景区\t77908\n喜羊羊与灰大狼\t77909\n肥圆\t77910\n要不到\t77911\n胡无辜\t77912\n120米
\t77913\n888488888488454786484848494888787\t77914\n六一个月\t77915\n工作场所\t77916\n振豪\t77917\ndush\t77918\n新天宇与与与与他就是天宇与猫的鱼\t77919\n软软\t77920\n度秘度秘我有事找你\t77921\n孔汈\t77922\n投影机\t77923\n跏丨\t77924\n度蜜月度秘\t77925\n龚梦佳\t77926\n无累\t77927\n蝴蝶谷\t77928\n度美\t77929\n568万\t77930\n藕团村\t77931\n月初九\t77932\n六分之五\t77933\n明星片\t77934\n一对啊多米\t77935\n这样事\t77936\n抽抽\t77937\nzaijian\t77938\n心思想\t77939\n申通\t77940\n洲际集团\t77941\n李白\t77942\n责任者\t77943\n不要阿大保佑我要尼酱\t77944\n呀嘤嘤\t77945\n老大约\t77946\n六46\t77947\n淡雅\t77948\n踏平\t77949\n我是女的不要将小了要相信我的话\t77950\n那个微\t77951\nDark\t77952\n南渡北归\t77953\n蛛\t77954\n吉祥物\t77955\n21个月\t77956\n十九八七六四\t77957\n在校生\t77958\n开行\t77959\n度秘度秘度秘seheale\t77960\n囚ｂｙ\t77961\n可挺\t77962\n111丝丝丝\t77963\n迈克尔贝\t77964\n亚超\t77965\n刚刀母\t77966\n列车长\t77967\njdfr\t77968\n双性\t77969\n199899999只\t77970\n讨厌你恨\t77971\njdfj\t77972\n二二四二\t77973\n别木星\t77974\nRdduugg\t77975\nkeieieiri\t77976\nvchgcgg\t77977\ndhiphotosbaiducomxiaodupicitem1c950a7b02087bf4d123239cf5d3572c10dfcffcjpg\t77978\n市中心\t77979\n焦\t77980\n14444444444667895433484456787544\t77981\n月租费\t77982\n果断话\t77983\n社会学家\t77984\n刚才二十分钟\t77985\n世茂\t77986\n马意\t77987\nmhx\t77988\n谏如流\t77989\n邱富俊\t77990\n1313131468959751942131943265997986768312151818181818188181943343\t77991\n荒凉无边\t77992\ncDEFG\t77993\nqueenie\t77994\ndhudrubrjnjk\t77995\n5包\t77996\n2758335425\t77997\n郭湃\t77998\n红袖\t77999\n珞珞珞\t78000\n南迪宗拉维蒙\t78001\n全套\t78002\n张瀚月\t78003\nNurcan\t78004\n啵天我不会\t78005\n鬼臭屁\t78006\n多可爱\t78007\ndagj\t78008\n灰黑色\t78009\n团儿\t78010\n干耍\t78011\n桂附\t78012\n江湖人称犀利菜\t78013\n1217692717\t78014\n归置归置\t78015\n光阴似箭日月如梭转眼\t78016\n54181881\t78017\n二百二二百多\t78018\n有点怪\t78019\n偷窥无\t78020\nTheldg\t78021\n手今\t78022\n擦鼻子\t78023\n聪我明\t78024\n薛个\t78025\nsister\t78026\n明天木\t78027\n过去不说\t78028\n我希望期\t78029\n交比亚\t78030\n大白熊\t78031\n错比熊\t78032\n对对自恋\t78033\n兰狄轩\t78034\n住手\t78035\n草泥马\t78036\nhttpahiphotosbaiducomxiaodupicitemc8177f3e6709c93de8522482983df8dcd1005426jpg\t78037\n早上11点\t78038\n住所\t78039\n东吧\t78040\n斗地主\t78041\n就是我和我的姐姐\t78042\n今年1月下旬\t78043\n哈莉\t78044\nESPRIT\t78
045\n龙俊华\t78046\n想很想\t78047\n张宇香\t78048\n有我师\t78049\n洛里昂\t78050\nyouisabitch\t78051\n森海\t78052\n花痴日\t78053\n百事灵\t78054\n塞缪度\t78055\nduetic\t78056\n森海子\t78057\n多米多米我爱你\t78058\n送往\t78059\n贾志坤\t78060\n不会所\t78061\n袁钰\t78062\n姓关\t78063\n嘎卓乖\t78064\n英標\t78065\n下城区\t78066\n不可多得\t78067\n么多多\t78068\nMINI\t78069\n黑富美\t78070\n扯感\t78071\nWell\t78072\n别克凯越\t78073\n贱婢\t78074\n吕昌宇\t78075\n我真的服你了好不好\t78076\n超神速\t78077\n飞吧霏霏霏霏霏霏\t78078\n藍生\t78079\nhgddghc\t78080\n丽水西\t78081\n湖里\t78082\n吹散\t78083\n吗丁\t78084\npppppp\t78085\n吗不\t78086\n摸佛\t78087\n再学学\t78088\n弄搞\t78089\n3ml\t78090\n肖像秘\t78091\n杨曼\t78092\n香皂碟\t78093\n啦啦啦乖\t78094\n少林武僧\t78095\n岩前镇\t78096\nMeego\t78097\n两万几年\t78098\n了解\t78099\n魏斯博\t78100\n7835\t78101\n飞飞飞\t78102\n天真萌\t78103\n壮男\t78104\n毒贩子\t78105\n刚刚公主小游\t78106\n凤凰城\t78107\n你的你看过天天有喜只人间有爱\t78108\n猪干\t78109\n肌酸3001\t78110\n老家人\t78111\n成荫\t78112\n赵一骁\t78113\n纯钢\t78114\n知识青年\t78115\n体重点\t78116\n刮西风\t78117\n三四张\t78118\n了觉\t78119\n惹人爱\t78120\n布谷布谷\t78121\n粗钢净出口\t78122\n陈浩东\t78123\n孤舟\t78124\n你好久不见\t78125\n嘻笑\t78126\n真的不懂\t78127\nokoka\t78128\nBLING\t78129\n落落\t78130\n赵灵儿\t78131\nbdhbd\t78132\n五六听话\t78133\n教导\t78134\n尸案\t78135\n晚安思密达\t78136\ntwqr\t78137\n黑夜错\t78138\n大坏人\t78139\ndoyous\t78140\n家把\t78141\n直到最后\t78142\n黄周旋\t78143\n舔舔\t78144\n爆菊\t78145\n秋暗雪\t78146\n嗯妈妈咪\t78147\ndydrfxayfflcet\t78148\n心律\t78149\n擦了擦了擦\t78150\n发笑\t78151\n姜会林\t78152\n偷塔\t78153\n今天晚\t78154\n劲儿\t78155\n危贬玩\t78156\n腰里\t78157\n来了再说\t78158\n李在上\t78159\n小蜜蜜\t78160\n亚亚洛\t78161\nfhivi\t78162\niikjkkkl\t78163\n三四月\t78164\ngsjdshxjozsgjhxgsycgsjgxxusxffszhkwkziooazugiwigxfgfsxvhavhwfgfgusygusgyugugisshghxshhgxsjxgjhssuxgsiguzqoozaoudgwgwzyzsgcxvszvkaiuzauzgouaugiz\t78165\n各庄\t78166\n五六天六七\t78167\n雅丽\t78168\nhgtufgji\t78169\n55766\t78170\n爱情睡醒了\t78171\n科技界\t78172\n找我不想\t78173\n男h漫\t78174\n1咯\t78175\n呵坷\t78176\n建筑师事\t78177\n内部\t78178\n不买不值\t78179\n血汗钱\t78180\n李子坝\t78181\n武真逗\t78182\n我爱是大爱\t78183\nkshs\t78184\n80后\t78185\n哼不理你了我走了拜拜\t78186\nchhdhehhshhsh\t78187\n他的国\t78188\n为个么要收拾拑\t78189\nhttpehiphotosbaiducomxiao
dupicitem0b7b02087bf40ad1a6c2ab90502c11dfa9ecce9bjpg\t78190\n宋馨远\t78191\n拍拍手\t78192\n刘玉堂\t78193\n面版\t78194\n胡浩轩\t78195\n关行\t78196\n山村\t78197\n汝加灿忯臟膷肭翴冫\t78198\n转战\t78199\n月光曲\t78200\n白红\t78201\nSNP\t78202\n陈杰\t78203\n顾绍棠\t78204\n找大一\t78205\n刘伯承\t78206\n天信\t78207\n声优\t78208\n啦啦行\t78209\n75厘米\t78210\n二二二零幺\t78211\n妹纸\t78212\n发霉\t78213\n是啊了了死了死了死\t78214\n迷踪拳\t78215\npmtpt\t78216\nsls\t78217\n眾生\t78218\n装蹩\t78219\n坎坎坷坷黄\t78220\nrgdfrfxccxdghugsfghgbvvvfnffddzxcghhgvvczzfvgvnbvvvvcxzZ\t78221\n民族自治区政府\t78222\n多岁\t78223\n长脸\t78224\n8点33分\t78225\n牿\t78226\n第三件\t78227\n307CC\t78228\n我的家庭\t78229\n抽烟\t78230\n第三份\t78231\n天保\t78232\n全国总工会\t78233\n浙江纪佳\t78234\n骨架\t78235\n伊来西\t78236\n女儿小学毕业考试\t78237\n發圖\t78238\n第二章\t78239\n彩票\t78240\n初号机\t78241\n蔡康永\t78242\n谢威胁\t78243\n太阳相机\t78244\n清结\t78245\n再起来\t78246\nvoge\t78247\n保真好\t78248\n万古\t78249\n640千克\t78250\n一太一个\t78251\n0.35\t78252\n唐闻宇\t78253\n犯罪事实\t78254\n招财\t78255\n不是个女的你\t78256\n火车尉\t78257\n雪游故宫坑爹图\t78258\n你好乖乖\t78259\n1944至1953年\t78260\n5560824575\t78261\n石涛\t78262\n53352\t78263\ndbzbz\t78264\n遥记得\t78265\n马驭靖\t78266\n五百分之一起\t78267\n毛人家\t78268\n金动\t78269\n长大牙\t78270\n监考\t78271\n每季\t78272\n逗本\t78273\n美丽的邂逅\t78274\n结拜\t78275\n罗红霉素喝酒行\t78276\n欧法克\t78277\n好笑我小\t78278\n费元差价\t78279\nhedhdhdf\t78280\n画本\t78281\n牫\t78282\n仰卧起坐\t78283\nhskdhdkddlxhdnsdvhvkejdiduddhhekdhd\t78284\n和志翔\t78285\n四三度秘\t78286\nkgto\t78287\n不在家\t78288\n撒比森\t78289\ndchgghhg\t78290\n丁轩亚亚\t78291\n喷出\t78292\n浮浮沉沉\t78293\nHoratio\t78294\n套房\t78295\n法律元\t78296\n洒洒水洒洒水\t78297\n没骨气\t78298\nASDFDGRJF\t78299\n1695334041\t78300\n自戀還\t78301\n井绳\t78302\n139071333355557\t78303\ndjhx\t78304\ndonghae\t78305\n碰上\t78306\n神经鬼\t78307\n1930毫安\t78308\n牒\t78309\n2924251860\t78310\n夏美涵\t78311\n易中天\t78312\n15075225855\t78313\n主编\t78314\n无法想\t78315\n白雪公主嫁给他米老鼠唐老鸭白雪公主\t78316\nxxvxh\t78317\n既然么\t78318\n单刀直入\t78319\n三五零二\t78320\n沫沫\t78321\nwchhfd\t78322\n一九时间\t78323\n简单处\t78324\n补维\t78325\n高校\t78326\n实际上\t78327\n开晾\t78328\n两万个\t78329\n喜字系\t78330\n三极\t78331\n阳萎\t78332\n哼哼唧唧\t78333\n势能\t78334\n
天鹅肉\t78335\nblliblli\t78336\n重病\t78337\n另一只\t78338\n脑筋急转弯小清\t78339\n平桥区第二小学\t78340\nhappyjh\t78341\njk2olwo\t78342\n补给\t78343\n忙吧忙\t78344\nbbq下巴里\t78345\n易炀千玺\t78346\n帮顾\t78347\n无法触碰\t78348\nbffhjghv\t78349\n判例\t78350\n周长健\t78351\n快点吧动态度秘国焕\t78352\n我爱爸爸我爱妈妈\t78353\n支部\t78354\n快乐生日快乐生日快乐\t78355\n北桔盘\t78356\n拐走\t78357\n流浪者\t78358\n穆庆琳\t78359\n宫颈纳\t78360\n圍觀\t78361\n二一点\t78362\n血红\t78363\n一宗\t78364\n脉搏\t78365\n呃四G网\t78366\n我是真爱你没有骗你你要是说我的话\t78367\nvhhg呵呵集团\t78368\n颠沛流离\t78369\n消化道\t78370\n限定\t78371\n橘子猫\t78372\n肚里疼\t78373\n由始至终\t78374\n沙县小吃\t78375\n排列\t78376\n色男\t78377\n版\t78378\n70首\t78379\n垫底\t78380\n春晚\t78381\nqwery\t78382\nqwert\t78383\nqwerr\t78384\n几成\t78385\n碎银\t78386\n独好\t78387\n天六合彩六合彩六合彩\t78388\nkkkkjjj\t78389\n理智\t78390\n干不干不干不干不不不\t78391\nsisoo\t78392\n哎呦好恶心人\t78393\n16865585888568\t78394\n刘圣杰\t78395\n土葬\t78396\n唱戏\t78397\n党外\t78398\n宫泽宇\t78399\n俩儿\t78400\n小克斯\t78401\n近江\t78402\n男方\t78403\n5000美元\t78404\n细微\t78405\n葡萄葡萄葡萄葡萄\t78406\nfdfddtrrtrttdff\t78407\n家用电信\t78408\n超数\t78409\n供水\t78410\n好想想\t78411\n温大\t78412\n草干\t78413\nishinote\t78414\n陈妍吉\t78415\n晚思密达\t78416\n我告诉你我最我\t78417\n最后一次\t78418\n远郊\t78419\n里加上\t78420\nChi\t78421\n977年\t78422\n中饱私囊\t78423\n潘豆\t78424\n7117171\t78425\nBhb\t78426\n本该\t78427\n留守儿童\t78428\n我知道你不爱别给我\t78429\n翔飞人\t78430\n八三个\t78431\n陷落\t78432\n白羊与狮子头猪不如你\t78433\n八天内\t78434\n大工\t78435\n呆橘\t78436\nuwhxksjs\t78437\n比翼双飞\t78438\nzabc\t78439\n牌九\t78440\n18848岁\t78441\n犒妓\t78442\n阿狸了了了了我忘了我\t78443\n跨国恋\t78444\n70公斤\t78445\n280余名\t78446\n挖泥船\t78447\n李冉冉\t78448\n二十二二十二\t78449\n海风\t78450\n菠萝春雪\t78451\n张礼昊\t78452\n陈赤夵\t78453\n2月3\t78454\n太不没有\t78455\n海飞\t78456\n米蜜\t78457\n竞技场\t78458\n瞳孔\t78459\n基准\t78460\necoacom\t78461\n11l11\t78462\n误导性\t78463\n么动笔洞斌\t78464\n王浩芳\t78465\nPdBco1e\t78466\n乐凯胶片\t78467\n很专心\t78468\n乐天LOTTE百货天津二店\t78469\n蒋金辰\t78470\n一手抓\t78471\n很孤苦\t78472\n城市份\t78473\n梁山\t78474\nhsso\t78475\n王文静\t78476\n泰国圣荷\t78477\n四姑伯\t78478\nange\t78479\n联建\t78480\n牌题\t78481\n牛逼倒霉倒霉\t78482\n东盟\t78483\n5AMJMJADGAJ哦了家裡\t78484\n不是我的我的\t78485\n多谢\t7
8486\n一阵风片\t78487\n532￥\t78488\nsrarerrth\t78489\n和风度\t78490\ndumizaima\t78491\n455541\t78492\n利丰\t78493\n至真至性\t78494\n基督秘\t78495\n有兴欲\t78496\n小谭\t78497\n就骗\t78498\n政府主义者\t78499\n老男人\t78500\n小谢\t78501\n东航股份公司\t78502\n二第二\t78503\n大难不死\t78504\n741258963508574\t78505\n理姑\t78506\n数一数连一连\t78507\n享年\t78508\n一视同仁\t78509\n阿飞正传\t78510\n石马羊肉牛肉\t78511\nDUGIFVGKH\t78512\n统计学\t78513\n小黄猫\t78514\n我局\t78515\n乳名\t78516\n799252095\t78517\n小调\t78518\n补鱼鱼TV\t78519\n约束\t78520\n妙惟肖\t78521\n二五分\t78522\n希特勒暗堡\t78523\ncdddsstd\t78524\n黄征\t78525\nLancelot\t78526\n蒲公英\t78527\n信义\t78528\n词条\t78529\n柔柔\t78530\n挺吵\t78531\nhtmid522123566313spm20142160071200\t78532\n吴喜欢\t78533\n继贷\t78534\n#高级达人#\t78535\ntions咯\t78536\n早上6点半\t78537\n李凯超\t78538\n二百平\t78539\n2863078期\t78540\n聊天不想\t78541\n交工\t78542\n刚刚刚\t78543\n十五ｇ\t78544\nfarkyou\t78545\n交差\t78546\n幽灵吧\t78547\n配音\t78548\n战靴\t78549\n棊事\t78550\n接死\t78551\n五顿号\t78552\n别那么\t78553\n下咽\t78554\n空乏\t78555\n炫迈口香糖\t78556\nmanner\t78557\n八四个月\t78558\nChickencake\t78559\n吕彩非\t78560\n一个两米\t78561\nEast\t78562\n茶饮方\t78563\n认敌\t78564\n寿县\t78565\nMisharyalafasy\t78566\n催款\t78567\n狼血4c\t78568\n21第二个\t78569\n闭卷\t78570\n小妹子\t78571\n9999个\t78572\n后方\t78573\n范梦然\t78574\n韦峰\t78575\n血友\t78576\n物产\t78577\n海城\t78578\n就是你的女神\t78579\n温柔男\t78580\ndhdhd\t78581\n三三国\t78582\n22亿美元\t78583\n海域\t78584\n度秘我真的超喜欢你\t78585\n风房子\t78586\n啥嘞\t78587\n成嘉颖\t78588\n显著\t78589\n血口\t78590\n水产品\t78591\n7060\t78592\n丽子\t78593\n一无数块\t78594\np图\t78595\n测评\t78596\n度瀰\t78597\n一闪一闪亮晶晶\t78598\n钠盐\t78599\njiyuy\t78600\n12480\t78601\nF4GQLWA2GRYC\t78602\n夏紫薇畔\t78603\n花仙\t78604\n吴有灵\t78605\n肖秀红\t78606\n陛下\t78607\n意味的歌\t78608\n隔天\t78609\n斯丹利像\t78610\n王錕\t78611\nd5\t78612\nd2\t78613\nd3\t78614\nd1\t78615\n橘子\t78616\nHvbhhgfc\t78617\n胡人\t78618\n巨巴黎甘肃省严重父母炎爽歪歪派出所亚洲串串察看氟利昂\t78619\n一万多元\t78620\n后脑勺\t78621\n战不好使\t78622\n最短\t78623\n隔膜\t78624\n路晗\t78625\nyffgff\t78626\n2648047765\t78627\ndn\t78628\ndo\t78629\ndl\t78630\ndm\t78631\ndj\t78632\ndk\t78633\ndh\t78634\ndi\t78635\ndf\t78636\ndg\t78637\ndd\t78638\nde\t78639\
ndb\t78640\ndc\t78641\n咬掉\t78642\nda\t78643\ntube网\t78644\ndz\t78645\n摘录\t78646\ndx\t78647\n小兔子乖乖\t78648\ndv\t78649\n离休\t78650\ndt\t78651\ndu\t78652\ndr\t78653\nds\t78654\ndp\t78655\ndq\t78656\n收音机\t78657\n周友红\t78658\n北晚\t78659\n耀东名言\t78660\n豆浆机\t78661\n丁俊涛\t78662\n各国\t78663\n王冰之\t78664\ndE\t78665\n背包\t78666\n在一起\t78667\n番茄粥\t78668\n百年难遇\t78669\n何家乐\t78670\n最后决选\t78671\n星只有一个人\t78672\n到破\t78673\n脚皮\t78674\n男同性峦\t78675\n沙湾\t78676\n马宁蔚\t78677\n6秒钟内\t78678\n试试金\t78679\n肩负\t78680\n米兔咪\t78681\nfulathome\t78682\n上海地铁8号线\t78683\n充流\t78684\n淘房\t78685\n女枪\t78686\n吴小莉\t78687\n提现\t78688\n2003年起\t78689\nco3\t78690\n属国\t78691\n水表班\t78692\n800b\t78693\n王尼美\t78694\n128元\t78695\n体型\t78696\n王泽文\t78697\n和好\t78698\n付涛\t78699\n魏少\t78700\n灵美\t78701\n张子萱\t78702\ncoQ\t78703\n徐州市佛教协会\t78704\n血体\t78705\n猿粪\t78706\nbnbczvcxgfvcbx\t78707\n爱国心\t78708\n18666775659\t78709\n病句\t78710\n超过30天\t78711\n自求多福\t78712\n看一看\t78713\n一张鸡\t78714\n朴留仙\t78715\ncos\t78716\n画虎\t78717\npeakEnglesh\t78718\n五年前\t78719\ncov\t78720\ncoy\t78721\ncox\t78722\n看一眼\t78723\ncoz\t78724\n24小时内\t78725\ncoa\t78726\n165厘米\t78727\ncoc\t78728\n谢谢飞\t78729\ncod\t78730\n羊羔\t78731\n哈楼下启示录\t78732\ncok\t78733\n九百八十七十点\t78734\ncom\t78735\ncol\t78736\ncoo\t78737\ncon\t78738\n读咪\t78739\n肾疼\t78740\n李岩峰\t78741\n再反\t78742\n60488\t78743\n不懂不懂不懂\t78744\nlb味\t78745\n花灯\t78746\n十八禁\t78747\n我是园长妈妈的爱\t78748\n再叙\t78749\n丰台\t78750\n来回来\t78751\n车辆\t78752\n陈度秘\t78753\n哪里里\t78754\n美女上西天\t78755\n楔子\t78756\nc秀曼\t78757\n沟北\t78758\n联想A790E\t78759\n亚普犬\t78760\n2510元\t78761\n乖儿\t78762\n善治\t78763\n震感\t78764\n一75级\t78765\n刘星宇\t78766\n沦陷\t78767\n车边\t78768\n壓扁\t78769\nnnn玉玉玉玉\t78770\n玻璃珠\t78771\n什简单\t78772\n一个五百兆\t78773\n应征者\t78774\n扒拉\t78775\n红花油\t78776\n余力威\t78777\n好大好大我杀\t78778\n尊胜博登\t78779\nqq飞车\t78780\n五王国\t78781\n科学技术部火炬高技术产业开发中心\t78782\n歌路\t78783\nimfine\t78784\n天注定\t78785\n随你吧\t78786\n卢卡库\t78787\n子痫\t78788\ncaaaandy\t78789\n到来\t78790\ntuutgui\t78791\n玩儿陀螺\t78792\n咳咳咳咳咳咳\t78793\n首先\t78794\n严媛\t78795\n聊点天\t78796\n调职\t78797\nghbb\t78798\n许戈辉\t78799\nvoice\t
78800\n加成功\t78801\n总参谋长\t78802\n陈美珍\t78803\n百脑汇徐汇店\t78804\n武生\t78805\nhxciyd\t78806\n九九场\t78807\n何加\t78808\n120多\t78809\n夏暑\t78810\n六弦\t78811\nfbjjshhzifiyjgjfidodidhfhhfhuguffojdjdhhgv\t78812\n同样的话说\t78813\n算卦\t78814\n是不呀\t78815\n三七点\t78816\n越人\t78817\nQVQ\t78818\n拿着\t78819\n快乐\t78820\nGOATWHORE\t78821\n学习方法\t78822\n舌根癌\t78823\n失眠多梦\t78824\n顺逐\t78825\n日日网\t78826\n脑片\t78827\n科力\t78828\n汪也二\t78829\n阳瓜子消消气\t78830\n于镇耀\t78831\n灰累\t78832\n曹靖\t78833\n真特\t78834\n植眉\t78835\n辽宁省\t78836\n亲爱你爱你爱你爱你\t78837\n6332886875355664646646646464646\t78838\n空气质\t78839\n股掌\t78840\n呆萌型\t78841\n裸体照\t78842\nt8736\t78843\n几勺\t78844\n陆逊\t78845\n泠姐姐\t78846\n黄店壮\t78847\n真牛\t78848\n真牙\t78849\n用上\t78850\n遮光梦微天之骄\t78851\n兴鹿\t78852\n够了我不知道\t78853\n超额\t78854\nFfjcggxdlfsfhuffudgyfudjdhgkjgjfkgggcv\t78855\n真版\t78856\n李洗车\t78857\n或是\t78858\n超频\t78859\n村支部\t78860\n犯贱犯贱犯贱犯贱犯贱\t78861\n偎依着\t78862\n跳舞\t78863\n自拍自导\t78864\n一只1只\t78865\nbdudh\t78866\nuuueuyhfv\t78867\n啊迷奥特曼小游\t78868\n董亚文\t78869\n美工\t78870\n疗伤\t78871\n大理棒\t78872\n日本人大\t78873\n0.10%\t78874\napp播放器\t78875\n对牛弹琴\t78876\n我们的家门\t78877\n75％\t78878\n洪书记\t78879\n肖二娘\t78880\n11.30\t78881\n第八章\t78882\n你好帅锅\t78883\n劲辣\t78884\n龘曬灪\t78885\n帮帮\t78886\n借俩\t78887\n上官玫瑰或上官九红\t78888\n如是\t78889\nCjjcjdnfbj\t78890\n合度蜜\t78891\n从头来\t78892\n一五六二零二零零一三九\t78893\n幽咽\t78894\n哈鲁哈鲁\t78895\n诚心想气\t78896\n排列整齐\t78897\n讨厌你我恨你\t78898\n恒大威尼斯\t78899\n嘻嘻嘻\t78900\n二五了学\t78901\n吕笛\t78902\ncaps\t78903\n冯刚\t78904\n林点戚\t78905\n张思语\t78906\n小度秘你好乖\t78907\n张悬\t78908\n马胜男\t78909\n吉尔吉斯斯坦\t78910\n指甲油\t78911\n那天晚上\t78912\n水老虎\t78913\n赵双琪\t78914\n12332112345677个\t78915\nkhgnbbhnb\t78916\n瓜了嗦\t78917\n好迷茫\t78918\nqxl\t78919\n搜狐的话\t78920\n平遥古城\t78921\n大茂良\t78922\njnyihgh\t78923\n大姨三国演义人名儿\t78924\n合式\t78925\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚乖乖\t78926\n白色狼\t78927\nwhurii\t78928\n首处\t78929\n主说\t78930\n吗眼\t78931\n1443612\t78932\n杨希雨\t78933\n施主\t78934\n致萌萌哒萌萌哒萌萌哒萌萌\t78935\n分手的理由\t78936\njdijckxjjx\t78937\n丘丘培\t78938\n老蒋老毛\t78939\n三十几级\t78940\n矤\t78941\nrcbi\t78942\n135704688\t78943\nWeinstein\t78944\
n龙秀青\t78945\ndeaf\t78946\nsksis\t78947\n杨颖四\t78948\n聊的天都忘记\t78949\n15215358610\t78950\n三百零二\t78951\n秦宇\t78952\n神奇就好化冶神奇\t78953\n陈炜栋\t78954\n你好色\t78955\n留底\t78956\n秦安\t78957\n兹华仕\t78958\n大喜\t78959\n表现手法\t78960\n周杰强\t78961\n丄\t78962\n红粉花\t78963\nhdbd\t78964\n叫福指数\t78965\n柳州市\t78966\n省运\t78967\n北汽\t78968\n大喊\t78969\n最美我最美我是天下第一美\t78970\n艺海\t78971\nOhydycy\t78972\n王嘉群\t78973\n大善\t78974\neeeeeeeeeeeeeeeeeeeee\t78975\n别村\t78976\n喔米伽\t78977\n瑞泰\t78978\ntffn俊凯\t78979\n好困难\t78980\n抬手\t78981\n周斌容\t78982\nllktjpkpullouttt\t78983\n狗度\t78984\n斯儿\t78985\nBCEF\t78986\n50500\t78987\nhely\t78988\n满地\t78989\nHeydthfdcdfthgxgyjpfgdgfdvjyeghddyhvbbjjccvb\t78990\nCut\t78991\n有戏\t78992\nCup\t78993\nllmp\t78994\n心\t78995\n张云天\t78996\n胃肠\t78997\n偷吻\t78998\n有战\t78999\n赵芯蕊\t79000\n沃您\t79001\n审计师\t79002\n给在\t79003\nformidable\t79004\n偷听\t79005\n有成\t79006\n百佳\t79007\n百位\t79008\n朱乐霞\t79009\n止痛药\t79010\n色鬼\t79011\n好我不投诉你\t79012\n收出\t79013\n不明辨\t79014\n有房\t79015\n偷吃\t79016\n500246932919\t79017\n四十公分\t79018\n属主\t79019\n那鬼来我的家有\t79020\nggvfvj\t79021\n小在家\t79022\n不好么\t79023\nscariouse\t79024\nfhfccbjicxxchbf\t79025\nsthes\t79026\n直营\t79027\n剩男\t79028\n冷心术\t79029\n夹道\t79030\n初始\t79031\n酱香\t79032\n熊\t79033\n收益率\t79034\n座头鲸\t79035\n调校\t79036\nUed\t79037\n海王星\t79038\n唉三真\t79039\n李霞辉\t79040\nIamnotaboy\t79041\n58765152\t79042\n直萌\t79043\n更诚实\t79044\n相逢何必曾相识\t79045\n忘了你了我\t79046\n兼城市快速路\t79047\n干嘛好久不见\t79048\n影响\t79049\n烧炭\t79050\n恐高证\t79051\n误猿\t79052\n巴比娃\t79053\n一跤\t79054\n东湖落雁岛\t79055\n三展\t79056\n一星半点\t79057\n人蛇\t79058\n三届\t79059\n多一\t79060\n多万\t79061\n共建\t79062\n高枝\t79063\n30多名\t79064\n三层\t79065\n多三\t79066\n三局\t79067\n芝淼\t79068\n金刚山\t79069\n多不\t79070\n缪景萱\t79071\n生长调节剂\t79072\n拉威\t79073\n拉娃\t79074\n莘萌\t79075\nome度\t79076\n败给你\t79077\n一跟\t79078\n少司命\t79079\ninterested\t79080\n天亮出去\t79081\n天黑了我怕\t79082\n多两\t79083\n多丫\t79084\n套话\t79085\n大宅子\t79086\n高架\t79087\n从新\t79088\n不能干\t79089\n猎装\t79090\n方白\t79091\n幻想的人\t79092\n媲美\t79093\n长距离\t79094\n合不拢嘴\t79095\nshshsbsb\t79096\n北师大\t79097\n心一个人\t79098\n董学校\t79099
\n到家大姐大几额几额几几额也饿金额\t79100\n推背图\t79101\n内外\t79102\npolitu\t79103\nlowomomm\t79104\n说不你\t79105\n街区\t79106\n度秘你真的好好\t79107\nKKMall\t79108\n味人\t79109\n楼群\t79110\n砖块儿\t79111\n四千多年\t79112\n傻傻分\t79113\nfact\t79114\n耂丰平\t79115\n问好问\t79116\nstmetalstomatoutificuryou\t79117\n威尔度\t79118\n如梭\t79119\n范秋菊\t79120\n早白首方悔\t79121\n李茂\t79122\n疗效\t79123\n如梦\t79124\n财季\t79125\n不不不我说错\t79126\n没靠\t79127\n适足\t79128\nanano\t79129\n本科\t79130\n恩萌萌哒\t79131\n话说完\t79132\n昂然\t79133\n十初5\t79134\n度秘我爱你晚安\t79135\n傻笑的人\t79136\n负数\t79137\n好想亲亲你\t79138\n乐克乐\t79139\n丽诗yo\t79140\n熬\t79141\nzhang\t79142\n撸娃\t79143\n对不起火\t79144\n猜了吧\t79145\n十三个小时\t79146\n8100哦元\t79147\n样车\t79148\n不饱\t79149\nhghjh\t79150\n劳动力\t79151\n偶像\t79152\n改编\t79153\nducudyw\t79154\nBECKY\t79155\n屁桃\t79156\n期度秘\t79157\n九十米\t79158\n六座\t79159\n成帅\t79160\n一千千千万\t79161\nzersk\t79162\n谁是你的爱妃了我是你的姐\t79163\n女体\t79164\n67个\t79165\n河渡鹅\t79166\n澎湖\t79167\n優惠\t79168\n顾家55556\t79169\nwayI\t79170\n真潮\t79171\n人鱼\t79172\n16574号\t79173\n万荣话\t79174\n啦啦啦啦啦啦\t79175\nHVDCGHG\t79176\n澎湃\t79177\ngkmtmtmj\t79178\n一个十七岁\t79179\n猛女\t79180\ndbs\t79181\nzhussud\t79182\n说呀别\t79183\n不好故意\t79184\ndbt\t79185\n年龄限制\t79186\n蛋蛋鹿\t79187\n作受\t79188\n衫儿\t79189\n饭量\t79190\n10公里\t79191\n马恩\t79192\n苏大夫\t79193\n流满面\t79194\njghhhjhfghk\t79195\n有所谓了\t79196\n43757048盒\t79197\n11点18分\t79198\nfgghh\t79199\n哥元\t79200\n度面\t79201\nhoopy\t79202\n哥兄\t79203\n兴国安邦\t79204\n快恐惧感\t79205\n下湍\t79206\n蘿\t79207\n吴晓\t79208\n横截面\t79209\n吴晨\t79210\n哇咔咔咔\t79211\n拨弄\t79212\n尿道炎\t79213\n分之一者\t79214\n好呀新年\t79215\n拨开\t79216\n绕膝\t79217\n蘑\t79218\n嗯十六\t79219\ncgcj\t79220\n笨女\t79221\n揍飞\t79222\n以如\t79223\n邹晨曦\t79224\n取信\t79225\n幺零八六\t79226\njijij\t79227\n五趟\t79228\n李杨梦\t79229\n两架\t79230\n稍感\t79231\n月跟\t79232\n克林真可\t79233\n35小时\t79234\n说和\t79235\n2007年5月14日\t79236\n一大点儿\t79237\n裤裤裤\t79238\n老好不好\t79239\n两枚\t79240\n灣道\t79241\n小蜜猫\t79242\n六街个\t79243\n或称\t79244\n亚楠\t79245\n田雪\t79246\n西拉子和酒\t79247\n田雨\t79248\n转到\t79249\n姑凉\t79250\n欧我的天\t79251\n见一见\t79252\n提森博物馆里的夏加尔\t79253\n子时\t79254\n润肤露\t79255\n原矿\t79256\n像会\
t79257\n不是我要你给我一只小狗\t79258\n我我我我我我我我我我我我我我我我我我我哇哇哇哇哇哇哒\t79259\n1513911118\t79260\n低贱\t79261\n没指望\t79262\n不苦\t79263\n问你走\t79264\n报失\t79265\n其状\t79266\n小白袜\t79267\n姜水\t79268\n雅博\t79269\n后半生\t79270\n汪回\t79271\nLiMing\t79272\n老公公\t79273\n游一一太颠\t79274\n王忠诚\t79275\n43x8分之二\t79276\n报备\t79277\n水口\t79278\n草痴\t79279\n新疆\t79280\n信达资产\t79281\n一个八百\t79282\n博斗\t79283\n邻牙\t79284\niffff\t79285\n小晴晴\t79286\n千百一会儿\t79287\n茅根\t79288\n临高烤乳猪培训学校\t79289\ndxs\t79290\n5千多\t79291\ndxq\t79292\n多根\t79293\ndxm\t79294\nI9220黑3290\t79295\n我爱乖是度\t79296\n道光\t79297\n满月日\t79298\ndxg\t79299\n回到家乡\t79300\n犯一次错\t79301\nSLASH#\t79302\n孩子样\t79303\n多样\t79304\n徵信\t79305\n藏锋\t79306\n提拉米\t79307\n孙兴之\t79308\n听所闻\t79309\n屌不屌\t79310\n道兴\t79311\n快气\t79312\n心理学\t79313\n愛湯\t79314\n欧阳常树\t79315\n一一么\t79316\n兴起\t79317\n东宝山\t79318\n周帮帮\t79319\n九七九幺零二六\t79320\n990000000\t79321\n掌上明珠\t79322\n弯讷\t79323\n新妈妈大厦\t79324\niooif\t79325\n好不勒\t79326\n男人轩\t79327\n20165\t79328\nCameron\t79329\n水象星座\t79330\n康王\t79331\n20166\t79332\nt肀影\t79333\n12345678941\t79334\n好姓\t79335\n机器人行\t79336\n喝麻游\t79337\n说事夕阳\t79338\n曲目\t79339\n明明天上学\t79340\n热客\t79341\n800W\t79342\n记载\t79343\nhellomatuiu\t79344\n呵呵分分合合发\t79345\n房子堂\t79346\n开光化\t79347\nM910+\t79348\n参议院\t79349\n二二三百块\t79350\n搞定\t79351\n大黄块\t79352\n公海母\t79353\n不怀好意\t79354\n王溢冰\t79355\n五足\t79356\n5uuu\t79357\n搞完\t79358\n董佳泰\t79359\n唐佳宁\t79360\n放掉\t79361\n2505775007\t79362\n数嘛\t79363\ncounyer\t79364\n1100余张\t79365\nConrad公司\t79366\n美赞臣\t79367\n凄美\t79368\n来说来\t79369\n乃个\t79370\n应对\t79371\n磁力\t79372\n穿了嘛眠\t79373\n墓园\t79374\n真我去\t79375\n会人\t79376\n快快快快快一点\t79377\n我头顶着天\t79378\n择天\t79379\nHxxcjv\t79380\n125283\t79381\n22b\t79382\n我喜欢你我要给你表白度秘\t79383\n58235832685745352524585355just\t79384\ndiyd\t79385\n行诉你的电话\t79386\n怪儿\t79387\ngbc\t79388\n楚旖\t79389\n读术\t79390\nshotat\t79391\n1332916316\t79392\n瓦娃\t79393\n二成\t79394\nwone\t79395\n哭哭啼啼\t79396\n厌食\t79397\n好啦啦啦\t79398\niooil\t79399\n智商\t79400\n2986点\t79401\n瑟雍\t79402\ndxcfvvo\t79403\n疯子潺潺\t79404\n花椒油\t79405\n缪聪明\t79406\nqqqqqqqqqqqqq\t79407\n魁首\t7940
8\ngbk\t79409\n轻爽\t79410\n微观教育分析法\t79411\n100倍\t79412\nfhfhfshsfhsfjfufbcyfedIHAGZUZKSTKG\t79413\n证券市场\t79414\n迎头\t79415\n枯枯还海平甜心大姑海还好\t79416\nG0aL\t79417\n洛索\t79418\n襄阳度秘\t79419\n2012年7月31日\t79420\ncmzgf\t79421\n名姓片\t79422\n康达律师事务所\t79423\n国脚\t79424\n金木水\t79425\n云音\t79426\nwfsgshxdhdj6jgkxkjkh109g52571dnc2ghfjxjdh\t79427\n比不死\t79428\n中旺镇\t79429\n一＼\t79430\n眼前\t79431\n就场\t79432\n乃们\t79433\n鼻涕\t79434\n窖池\t79435\n就地\t79436\n歼教\t79437\n就在\t79438\nccvgg\t79439\n不要不要不要不要\t79440\n陈都灵\t79441\n露露兔兔图咯了6句兔兔兔兔兔兔兔兔\t79442\n老庄\t79443\n一。一\t79444\n蒸发度\t79445\n蒋伯诚\t79446\n得意扬扬飘飘然\t79447\n老店\t79448\n福娃\t79449\n05854855\t79450\natmys\t79451\n刘刚刚\t79452\n结社\t79453\nbukhnoll\t79454\nFKXTOXYOXOGXOGXGLXYOXYPXYOXYOXOYXOGXYOXYOXOGXGOXGOXLGXLGXLGXKXGGCGOXGKXKFX\t79455\n老庙\t79456\n炼潭\t79457\n泪水\t79458\n硬化\t79459\n胡永鑫\t79460\ngionee\t79461\n白平\t79462\n白干\t79463\n唔子\t79464\n彭涵蕊\t79465\n也良乡\t79466\n听友\t79467\n样本户\t79468\n给我几首歌小三你好贱\t79469\n硬包\t79470\n理综家\t79471\n梁小陈\t79472\n那头一\t79473\n顾丽媛\t79474\n123123612312339123391233912339123\t79475\n放羊\t79476\nwk四八\t79477\n数一遍\t79478\n下下下\t79479\njtmmmmmmmmm\t79480\n李机智\t79481\n权大\t79482\nvihufy\t79483\n55358528\t79484\nfriend\t79485\n730金鹰剧场\t79486\n天人非笑\t79487\n五六个小时\t79488\nmaylisri\t79489\n选择\t79490\n42周年\t79491\n筋\t79492\n蔚蓝\t79493\n紫二小\t79494\n丑八怪也有人爱那你就叫丑八怪\t79495\n甲种\t79496\n庞坤\t79497\n小情侣\t79498\n阿尔卑斯\t79499\n卫生球\t79500\n胡馨怡\t79501\n守卫\t79502\n颁白者\t79503\n选拔\t79504\n很高贵\t79505\ngxo\t79506\njgagjhjga\t79507\n22P\t79508\nxxisks\t79509\n一兄弟\t79510\n我讨厌你我再也不要和你在一起我再也不想理你了哼气死\t79511\ngehs\t79512\n诚用\t79513\n杨钰莹\t79514\n血色湘西\t79515\n提成\t79516\nangel\t79517\n徐康俊\t79518\n唐果果\t79519\n日用包\t79520\n复旦童兵\t79521\n孙溪\t79522\n石庙乡\t79523\n云南省红河州个旧公安局\t79524\n王万万\t79525\n受不鸟\t79526\n2016年1月10日\t79527\n結婚\t79528\n展演\t79529\n陀飞轮\t79530\n江西人\t79531\n终止\t79532\ndK\t79533\n9000000000000000000000000000000000000000000000000000000000000000000000088800080888岁\t79534\n芝芝\t79535\n暗淡无光\t79536\nBreezes\t79537\n孙志磊\t79538\n藤兰跑男\t79539\n孟凡峰\t79540\n高抬贵\t79541\n贝比\t79542\n爱你了爱\t
79543\n有一天人\t79544\n地龙\t79545\n问一问\t79546\n杨永远\t79547\n3节\t79548\n小三度秘\t79549\n饭局\t79550\n邱添一\t79551\n说病\t79552\n史蒂克\t79553\n张十斤\t79554\n中平路\t79555\n朱世伟\t79556\n执掌\t79557\n2234\t79558\n过我想你\t79559\naaaa\t79560\n上天上学\t79561\n裙边\t79562\nD700\t79563\ndeartoneynthanksalotforyourinvitationimterriblysorry\t79564\nMINI嘎\t79565\n求你了你便\t79566\n什度秘\t79567\nyyerty\t79568\n发麻\t79569\n炼利你好演义\t79570\n辽宁\t79571\nIncredible\t79572\n咯丸\t79573\n爱美的你走\t79574\n红斑狼疮\t79575\n咯丽\t79576\n蒙太\t79577\nwuad\t79578\n挑战者联盟\t79579\n符思彤\t79580\n幽灵\t79581\n11665\t79582\nZ\t79583\n罗伯特德尼罗\t79584\n成否\t79585\ncip61c\t79586\n鸭牌\t79587\n安好下多米\t79588\n亚特\t79589\n无岸\t79590\n晏笑笑\t79591\n头猪猪猪\t79592\n白青\t79593\nG55阿拉伯版\t79594\n分内分内分内\t79595\n脸度\t79596\n乌冬面\t79597\ncz694\t79598\n1-4月份\t79599\n笑笑\t79600\n幺八六九八零八幺零三九\t79601\n滚熬\t79602\n为儿\t79603\n东明帅\t79604\nF－117A隐形战斗轰炸机\t79605\n1941年前\t79606\n电信业\t79607\n75天\t79608\n节假\t79609\n熊建峰\t79610\n睡毛\t79611\n张巍\t79612\n香喷喷\t79613\nW家宝\t79614\n云崖\t79615\n8月14号\t79616\n脸庞\t79617\n白面\t79618\n守信\t79619\n好啦不理你了拜拜\t79620\n四笔\t79621\n李自华\t79622\n印堂发黑不日\t79623\n大撒离欧一零大撒比\t79624\n好啊好啊我告诉你了我的名字吧我叫\t79625\n幽兰剑\t79626\n难辞其咎\t79627\n20点\t79628\n讨合\t79629\n婊灬\t79630\n这种病\t79631\n窝壳\t79632\n姐妹俩\t79633\nLocation\t79634\n芋子包\t79635\n嗯小米\t79636\n梯子\t79637\n忘了嗯\t79638\nbadlkk\t79639\n爱疯吧\t79640\n你是谁你是谁你是天仙大美美庭\t79641\n八仙天天\t79642\n多浪迪\t79643\ndityt\t79644\n12315\t79645\n弗兰克\t79646\n华尔街\t79647\n卢风雪\t79648\n太子妃\t79649\n连不上网\t79650\nab5c9ea15ce36d32e81e1443df33a87e950b139jpg\t79651\n度蜜月在\t79652\n刘允恒\t79653\n爱心\t79654\njacobs\t79655\n船长\t79656\n橡树\t79657\n用以致\t79658\n美拉美拉美\t79659\n季风\t79660\n4,5个小时\t79661\n帮写\t79662\ngwg\t79663\n东井村\t79664\n哭哭哭\t79665\n黑月\t79666\n黎噶\t79667\n老板们\t79668\n串词\t79669\ngwp\t79670\n元稹\t79671\nfxkfckg\t79672\n我的妈妈\t79673\n我最讨厌你了最讨厌你了我最讨厌你\t79674\nDunn\t79675\n王佳怡\t79676\n254399\t79677\nCOS\t79678\nCOW\t79679\n这个偶\t79680\n短纤\t79681\n笼子\t79682\nCOC\t79683\n不说的确\t79684\nCOD\t79685\n火虫\t79686\n四七岁\t79687\n特种兵\t79688\n卡尔沃丁\t79689\n懂事\t79690\n上百种\t79691\nCON\t79692\n相钩\t
79693\n6506787\t79694\nsest\t79695\n忍心踩\t79696\n四G，四G\t79697\n爱范儿\t79698\nhisface\t79699\nSmokers\t79700\n文学家\t79701\n上肢\t79702\n请示\t79703\n上股\t79704\n明天下午3点\t79705\n宋露露\t79706\n没由来\t79707\n贴家\t79708\n耍脾气\t79709\n玉渊潭\t79710\n李泽鹏\t79711\n喜压机\t79712\n真的我只是你的朋友\t79713\n喔呃\t79714\n筚\t79715\nstovetou\t79716\n新港\t79717\n小雨滴\t79718\n克里斯托弗·诺兰太\t79719\n锦州市区\t79720\n长皋\t79721\n我的眼里你了你\t79722\n商演\t79723\n收官\t79724\n没办法明说\t79725\n繁\t79726\n武技\t79727\n数学家\t79728\n狮口\t79729\n是你好错\t79730\n两只小蜜蜂\t79731\n太子妃升职记好好看\t79732\n繐\t79733\n喔呵\t79734\n201几年\t79735\n91722\t79736\n碳钢合金\t79737\n２０\t79738\n月球人\t79739\n秦时明月汉时关\t79740\n太着\t79741\n剑痴\t79742\nheng\t79743\n我喜欢熊我喜欢熊心归来\t79744\nhend\t79745\n历错\t79746\n头一次\t79747\n孟国秦梦国宴\t79748\n辖区\t79749\n如花似玉\t79750\ntpupu\t79751\n违约金\t79752\n28616841\t79753\n师利\t79754\n老潘\t79755\n一个五十\t79756\n贵妃\t79757\n肖畅\t79758\n贵妇\t79759\n坐等\t79760\ndumizhunixinkuile\t79761\n海淀剧院\t79762\n超喜欢度\t79763\n宫廷\t79764\n亡情\t79765\n体育室\t79766\n大梦想家的歌儿\t79767\n900乃\t79768\n告诉我告诉我告诉我告诉我告诉我\t79769\n导火\t79770\n熊出没之熊心归来\t79771\n刹了开\t79772\n志向\t79773\n16587023981488\t79774\n退换\t79775\n原籍\t79776\n哈喽\t79777\n寂寞在唱歌\t79778\n浴\t79779\n袁民币\t79780\n097979\t79781\n长效化\t79782\n心心系\t79783\n静依\t79784\n哈喃\t79785\n相识样\t79786\ntimroewshisown\t79787\n练习卷\t79788\n邹城\t79789\n七分钟\t79790\n1383838438\t79791\n动感机器人\t79792\n格林童话\t79793\n小澈\t79794\n俄罗斯\t79795\n佛山市光点贸易有限公司\t79796\n13968741230\t79797\n邮袋\t79798\ngenerally\t79799\n四关\t79800\njkks\t79801\n口渴男\t79802\n孙洋\t79803\n耨耨耨耨耨\t79804\n四全\t79805\n四八\t79806\njkkk\t79807\n四六\t79808\n深度\t79809\n相容\t79810\n星游\t79811\n正言\t79812\n便于\t79813\n顺顺\t79814\njishuo\t79815\n康忙\t79816\n逆市\t79817\n我的我心\t79818\n活够了愿\t79819\n衰败\t79820\n這小鎮\t79821\ni比\t79822\n四元\t79823\n胀痛\t79824\nmomentum\t79825\n相宜\t79826\nAV片\t79827\n别闹讨厌\t79828\n怪石\t79829\n决绝\t79830\n忐忑不安\t79831\n是真的爱\t79832\n垂耳兔\t79833\nC语言\t79834\n异地\t79835\n浸泡\t79836\n万难\t79837\n紫塞\t79838\n筑\t79839\n拼音版\t79840\n好黑\t79841\n四分之一个\t79842\n望穿\t79843\n打人者\t79844\n十三点\t79845\n4集\t79846\nnkme2day\t79847\n徽韵\t79848\n雅馨\
t79849\n戾\t79850\n定边\t79851\n跨越\t79852\n嬉皮笑脸\t79853\n私立\t79854\n薛茗月\t79855\n广渠门\t79856\n别气我行不行\t79857\n光禄吟台\t79858\n曹世法\t79859\n百合办\t79860\n戴\t79861\n在床上\t79862\n哈尼阿塞呦\t79863\n五十个英标的这是为什么\t79864\n户\t79865\n五嘉兴\t79866\n三板斧\t79867\n戶\t79868\n二十三二十四号\t79869\n忍过\t79870\n133823okcois\t79871\n一喔一\t79872\n给还\t79873\n剧烈\t79874\n赵日天\t79875\niii584\t79876\nFM1001\t79877\n亚特兰大机场\t79878\nInterface\t79879\n后不来者\t79880\n骨碌\t79881\n假面具\t79882\n39.00元\t79883\n阿伊莎\t79884\n久立四\t79885\n删减版\t79886\nsevendim\t79887\n朱丹华\t79888\n快快快快快快快快\t79889\n你在下明明静音\t79890\n戎\t79891\n爬树\t79892\n才疯\t79893\n磨人\t79894\n理事会\t79895\n鹩哥\t79896\nichmagdich\t79897\n类型儿\t79898\n一一千八万八个\t79899\n额撸啊撸\t79900\n后项增6\t79901\n戊\t79902\n四两1点\t79903\n八国联军\t79904\nchop\t79905\n米八三\t79906\n冷艳\t79907\n毛坯\t79908\n汪强\t79909\nXXoo\t79910\n亲先生\t79911\n黑衣人3\t79912\n照明\t79913\n偶雪勇\t79914\nghgdmkkkkkkkkkkkkkkkkkkk\t79915\n吉吉\t79916\n张应安\t79917\n张凡\t79918\n从军\t79919\n金华侨\t79920\nNBA2kol\t79921\n和乐清看守所\t79922\n一般果\t79923\n咋整\t79924\nGmg\t79925\n张凯\t79926\n太冷落\t79927\n新建县\t79928\n集结\t79929\n镇海社区志愿者爱心银行\t79930\n八段\t79931\n31055元\t79932\n徐露\t79933\ntfboy帅\t79934\n九十千克\t79935\n恙恙\t79936\n半导体\t79937\n夹子\t79938\n播音\t79939\n帝国女\t79940\nコンサ\t79941\n太和成熟\t79942\n下摆\t79943\n度你是我的种\t79944\n陈川\t79945\n二八幺二\t79946\n刻制\t79947\n闭幕场\t79948\n四点半块\t79949\n啦头\t79950\nvnzZ\t79951\n谗\t79952\n一三哪几\t79953\n幽灵界\t79954\njfhddhg\t79955\n不赞\t79956\n6880\t79957\n儿方\t79958\n一向西\t79959\n不赖\t79960\n笨龙柏\t79961\nhire\t79962\nddtf\t79963\n多了吐\t79964\n江苏舜天\t79965\n挂家\t79966\n福彩3d\t79967\n污渍\t79968\n早代\t79969\ngfhhfgjkk\t79970\n酸菜牛肉面\t79971\n吴桐\t79972\n不走\t79973\n客气实在\t79974\n山穷水秀\t79975\n沈建波\t79976\n度谜是你的妹妹\t79977\n网眼\t79978\n谍\t79979\n喳喳\t79980\n移至\t79981\n带解\t79982\n如喜之郎\t79983\n窝事\t79984\n歪爽\t79985\n那燕飞\t79986\n卷形\t79987\n坏笑\t79988\nx厘米高跷\t79989\n朱永康\t79990\n都是你的错找你爱上我\t79991\n靳东棒\t79992\nZhe个吧\t79993\n周你\t79994\n13598561930\t79995\n噜唐\t79996\n疯了关进\t79997\n孙登辉\t79998\n几了是了了是老婆\t79999\ni晕对哦晕\t80000\n啊得\t80001\n嗯萌\t80002\n二案\t80003\n谢谢好朋友亲\t80004\n黄鹂\t80005\n吴桥\t
80006\n36e么\t80007\n今年1月21日\t80008\n多奥特曼\t80009\n唐嫣美\t80010\n气瓶\t80011\n歌榜\t80012\n运动量\t80013\n我机\t80014\n观察者网\t80015\n艾叶堡\t80016\nbdjdueveks\t80017\n黄鹤\t80018\n我们的天\t80019\nlking\t80020\n机暇\t80021\n度蝶\t80022\n1411个\t80023\n乐什么乐\t80024\n丧胆\t80025\n结实\t80026\n牛氓单\t80027\n攻击性\t80028\ncordera\t80029\nhggyhvbjb\t80030\n坠楼\t80031\nixgfkfjfj\t80032\n王雨慧宁\t80033\n度秘书度秘度秘\t80034\n肥猪大母\t80035\n生态系统\t80036\n浣花\t80037\n百分之几16米\t80038\n回好赞\t80039\n印在\t80040\n亲爱的经理\t80041\n本党\t80042\n睡觉吧小机器人\t80043\n整我\t80044\n一字之差\t80045\n阿帅\t80046\n度秘度秘月\t80047\nfghjo\t80048\n本兮\t80049\n本公\t80050\n本月19-20日\t80051\n阿希\t80052\n9周岁\t80053\n872425574\t80054\nfseh\t80055\n沃塔\t80056\n太好了我\t80057\n僅\t80058\nguikav\t80059\n大本宫\t80060\nyshb\t80061\n像\t80062\n三岁半\t80063\n谬\t80064\n清水\t80065\n空闲\t80066\n嘉嗯嗯\t80067\n曦猪\t80068\n僕\t80069\n5059999999亿\t80070\n黄宇轩\t80071\n位份\t80072\n乖求你啦\t80073\n严旭\t80074\n挂钩\t80075\n标准化\t80076\nitcoro\t80077\n一手魔仙\t80078\n手腕儿\t80079\nNismo\t80080\n周之若鹜\t80081\n记录乐\t80082\n聊会儿天吧你在干嘛\t80083\n高铁\t80084\n第三张\t80085\n挂钟\t80086\n我就是我我是你的豪言\t80087\n浮\t80088\n124735758524822852528\t80089\n你与你的开始女人\t80090\n僵\t80091\nshabsb\t80092\n僾\t80093\n颠子\t80094\n價\t80095\n守护\t80096\n神池温岭\t80097\n乔志权\t80098\n鬼毛\t80099\n327次\t80100\n泡杯雀舌\t80101\n有别\t80102\n狗苟\t80103\ncaaa\t80104\n下放空\t80105\n入乡随俗\t80106\nbuqing\t80107\n拜行\t80108\n积聚\t80109\n有刹\t80110\n大才要\t80111\n匡山小区\t80112\n下半年\t80113\n主义\t80114\ngxgcggcg\t80115\n有分\t80116\n挂不挂机\t80117\n果肉\t80118\n３３０万人次\t80119\nHUH\t80120\n三月二十四号\t80121\nmozhixu\t80122\njedj\t80123\n干梦见\t80124\n申中藏庚金\t80125\nsegc\t80126\n忙丑\t80127\n霍欣怡\t80128\n良机\t80129\n吴宇俊\t80130\n狠恨\t80131\n深情的拥抱\t80132\n美平平\t80133\n4.89一\t80134\n周国栋\t80135\n田瑶\t80136\n828606\t80137\n一下一步\t80138\n中德藏尸\t80139\n米美丽\t80140\n举报人\t80141\n咪你甘你\t80142\n二二三五\t80143\n妾身\t80144\n0855565458\t80145\n你情我愿\t80146\n瓶红牛\t80147\n张盛涛\t80148\nnononononomorehem\t80149\nlomo\t80150\n不米\t80151\n何兴龙\t80152\n佛山十三姨PK顺德龙江仙塘\t80153\n田瑛\t80154\n坏淫\t80155\n萨挖多特\t80156\n他来了[好爱\t80157\n地球仪\t80158\n老特拉福德\t80159\nmc\t80160
\ndwui\t80161\n周六夜\t80162\nhogo\t80163\n小北\t80164\nhree\t80165\nhogv\t80166\n哈傻\t80167\n10024\t80168\n积雪\t80169\n小包\t80170\nJudgedhf\t80171\n10021\t80172\n喂养\t80173\n余慧文\t80174\n小区\t80175\n听雨\t80176\n澳英菲\t80177\nhshdhhd\t80178\n转让\t80179\n二考场\t80180\n能得到\t80181\n着不了急\t80182\n泰顺\t80183\n直升\t80184\n死了心也不在这里了亲在天上\t80185\n莱罗塔列\t80186\n湘北会战\t80187\n南方职院\t80188\n多多多多多多多多多\t80189\n吴书记\t80190\n2009年4月15日\t80191\n受真讨厌\t80192\n埋怨\t80193\n125只\t80194\n炫铃炫铃\t80195\n索道\t80196\n九八幺六\t80197\n烦我没有\t80198\n基隆\t80199\n嘛行\t80200\n好爱\t80201\n帝女\t80202\n司马光\t80203\n猪你是猪你是猪你是猪你是猪朱猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t80204\nvuggfuhhgg\t80205\n漢子\t80206\n永远幸福地\t80207\n7271757627682738285721827\t80208\n行度\t80209\n不管你走多远\t80210\nAppsFire\t80211\n条导盲犬\t80212\nnine\t80213\n五十多度\t80214\n读起来\t80215\n国庆\t80216\n嘿受\t80217\nq度秘\t80218\n渝\t80219\nhhr\t80220\nhhq\t80221\n有伴\t80222\nhhw\t80223\nhhu\t80224\n渚\t80225\n渔\t80226\n渗\t80227\nhhx\t80228\n渐\t80229\n十发分\t80230\n金猪\t80231\nhhb\t80232\nhha\t80233\n有伤\t80234\nhhg\t80235\nhhf\t80236\n厉害们\t80237\nhhd\t80238\n清\t80239\n曾祖母\t80240\n超级hi\t80241\nhhh\t80242\nhho\t80243\nhhn\t80244\n根据地\t80245\n善恶\t80246\n来一不开\t80247\n游\t80248\n时片\t80249\n有伐\t80250\nｙｍｙｌｉｆａａａ\t80251\n渴\t80252\n性价\t80253\n不可爱了为喜欢\t80254\n果沃\t80255\n心心里话\t80256\n掐人这事\t80257\n測\t80258\n港\t80259\n温\t80260\n不不聊天\t80261\nhhD\t80262\n八vf\t80263\n渡\t80264\n渠\t80265\n渣\t80266\n海南县\t80267\n真魔仙三\t80268\nshuoshenmone\t80269\n邓真\t80270\n33.5%\t80271\n大一口\t80272\n清真鱼\t80273\n上海市实验学校\t80274\n0点四\t80275\n照着\t80276\n大一号\t80277\n闰土一文\t80278\n氢氧化钠\t80279\n维吾尔人\t80280\nsssssssssssssssssssssssssssssssssssssssssscssssssssssssssssssss\t80281\n4544566667\t80282\n诺氟沙星#\t80283\n今天一夜\t80284\nbugy\t80285\n扎克\t80286\n8tr六\t80287\n南都发表差生韩2\t80288\n99999999999999999个\t80289\n阿刘\t80290\n江东父老\t80291\nmyisa\t80292\n七八百米\t80293\n阿根正传\t80294\n知错不改\t80295\n化学奖\t80296\n送别\t80297\n心里边\t80298\n4年\t80299\n七句话\t80300\n海藻\t80301\n世界银行\t80302\n王源禾\t80303\n了得\t80304\n南边\t80305\n一个三天两夜\t80306\n恐怖片\t80307\n八秒\t80308\n成功者\t80309\n吃片\t80310\n架起\t80311\n三针
\t80312\n大丑猪\t80313\nvP\t80314\nreseat\t80315\n式恋爱\t80316\n1970年代以后\t80317\n匕爽\t80318\n印章\t80319\n疲软\t80320\n杨新梅\t80321\nderrferffyf\t80322\n迷惑性\t80323\n猪蹄子\t80324\nhxnsxnz\t80325\n健壮\t80326\nxhgrj\t80327\n罗氏会药医镜\t80328\n86家\t80329\n宣汉县\t80330\n生气赛\t80331\n朱家荒\t80332\n好囚\t80333\n难们\t80334\nDgnfbhcjhhjhggfddqefsaqwqqqfbhff\t80335\n硕士证\t80336\nv句解放军赶快覅副乳付\t80337\nCanspeakEnglish\t80338\n道者\t80339\n赵乾\t80340\n市场\t80341\n仇池国\t80342\n针灸\t80343\n尼绒布\t80344\n润洁滴眼液\t80345\ntowhat\t80346\nbookworms\t80347\n十七分钟\t80348\n波音\t80349\n陈诺仪\t80350\n黄杨茗\t80351\n身心健康\t80352\n有图强\t80353\n米东\t80354\naaa9\t80355\n日洗\t80356\n米丘\t80357\n十二十岁\t80358\n点蚀\t80359\ngtsfsbbfvbnmlwqiafgjkzxcnmmgfdbsakdjcsnndfg\t80360\n12章\t80361\n介意思涛\t80362\ndib\t80363\n领位\t80364\n名秘\t80365\n门户网站\t80366\n上细\t80367\n邵建新\t80368\n好啦你教我织西安榜\t80369\n母乳\t80370\ncwetuisd\t80371\n红旗东路\t80372\n臀部\t80373\n拜给\t80374\nvery美\t80375\n呜呼哀哉\t80376\n人才干\t80377\n名称\t80378\n巡讲\t80379\n不爱我了\t80380\n来了谁\t80381\n从小到大\t80382\n喧喧\t80383\n石川\t80384\n界面\t80385\n霍然\t80386\n万岁岁末\t80387\nnbnbnb\t80388\n体下\t80389\nbase\t80390\n每次\t80391\n任何地方\t80392\n幸福代言人\t80393\n丫蛋\t80394\n邵子萱\t80395\n田\t80396\n技术人员\t80397\n大呼小叫\t80398\n亲故\t80399\n家庭长\t80400\n搜狐微博\t80401\n655555555854856856\t80402\n啥样\t80403\np142\t80404\n瘦难\t80405\n一个百五\t80406\n阿健\t80407\n8gU盘\t80408\n认人见人爱\t80409\nsschowtio\t80410\n提别\t80411\n孙康\t80412\n喀山城\t80413\n林芝地区第二高级中学\t80414\n维尔康双\t80415\n冲虎\t80416\n魔法弹\t80417\n净弄\t80418\nｉｃｃｉｄ\t80419\n333333\t80420\n恰似\t80421\n提到\t80422\n画\t80423\n命路\t80424\n偶监\t80425\njjjjagj\t80426\n呀不爱\t80427\n输工\t80428\n疯了不想\t80429\n不不不不我不穿\t80430\n零三个\t80431\n孙庄\t80432\n500万倍\t80433\n董行\t80434\n心帜文\t80435\ncomfa\t80436\n4.32万一平米\t80437\n红太狼\t80438\n咯LOL\t80439\n取名\t80440\n明天民政局\t80441\n和顺\t80442\n我的问你作业题\t80443\nnijius\t80444\n三力士\t80445\n乐天派\t80446\n谢公公\t80447\n癫痫\t80448\n赵凯\t80449\n无主\t80450\n无为\t80451\n呀儿\t80452\nfgfggdfgfggggggg\t80453\n取向\t80454\n阴水\t80455\n四立\t80456\n凝露\t80457\n挽逆水\t80458\n899哦9ujxbkfik建瓯9客家话\t80459\n无不\t80460\n你好店\t80461\n无下\t80462\
n无上\t80463\n蓬蓬裙\t80464\nhgggy\t80465\n过生\t80466\n西北方\t80467\n无一\t80468\ni东北i工资谷歌工作单位vi个i\t80469\n14835875805\t80470\n周俊玫\t80471\n无业\t80472\n阴阳怪气\t80473\n嗯穿越火线\t80474\n阴气\t80475\n1月15号\t80476\n国家发改委\t80477\nUXUUX\t80478\n锻造\t80479\n200亿岁\t80480\n候文哲\t80481\n向霞\t80482\n无害\t80483\n佳仁\t80484\n每每月\t80485\n万一个\t80486\nWBD\t80487\nWBC\t80488\n祖庭\t80489\n人志\t80490\n千愁\t80491\n案值\t80492\n钱成宇\t80493\n秋秘\t80494\n我爱罗\t80495\n2634632563\t80496\n黄红白黑\t80497\n休息日\t80498\n叮海珊\t80499\n两点整\t80500\n甒\t80501\n孕末\t80502\njshhyT\t80503\n马鸿旭\t80504\n口红\t80505\n浆水\t80506\n徐警官\t80507\n飞了放心\t80508\n112124121541215545445554254￥757545241554\t80509\n上海来伊份\t80510\n店儿\t80511\nfkidB\t80512\n啦啦好字我喜欢老杨\t80513\n搭调\t80514\n亲爱的宝贝你是我的心\t80515\niffffff\t80516\n还在\t80517\n也有点\t80518\n封天\t80519\n过天降贤淑男\t80520\n甘\t80521\n施凯航\t80522\n感骂\t80523\nstross\t80524\n刘流程\t80525\n搭谢\t80526\n春风来千树万树梨花开来\t80527\n沧玄\t80528\ndiffer\t80529\nhgggc\t80530\n我爱死你了我们结婚吧度秘\t80531\n情包\t80532\n六年前\t80533\n计厌\t80534\n柯潇芬\t80535\n基情片\t80536\n人在身旁\t80537\n阿Sir\t80538\n小磊\t80539\n冷梗\t80540\n玉斌\t80541\n乖乖小度秘你最乖了谢谢你\t80542\n语气词\t80543\n中午三点\t80544\n张杰哥\t80545\n23号\t80546\n高山族\t80547\n攻击\t80548\nhffujj\t80549\n服符\t80550\n十二点半\t80551\n金麦\t80552\n机器女友\t80553\n五样\t80554\n好我死\t80555\n五根\t80556\n空天威视讯凯凯i太委屈\t80557\n五格\t80558\n反作用\t80559\n何错之\t80560\n星探\t80561\n撒撒的八嘎\t80562\n邦别\t80563\nyuguggg\t80564\n沙皮\t80565\n开心麻花团\t80566\n万不得已\t80567\n圣巴巴拉\t80568\n女人事\t80569\n妈咪咪\t80570\n什么装\t80571\nsufhhgih\t80572\nhttphhiphotosbaiducomxiaodupicitem960a304e251f95ca8e98f5c3ce177f3e67095251jpg\t80573\npoiy\t80574\n给我回复\t80575\n天气预报\t80576\neoc角COD\t80577\npoir\t80578\n张婉璐\t80579\n独生子儿\t80580\n围城\t80581\npoii\t80582\nQQ银行卡卡号\t80583\nvegvs\t80584\n卢纯\t80585\n环路桥\t80586\nmmmiaak\t80587\n0点2分钟\t80588\n包厢\t80589\n宫寒\t80590\nhippastatouhappstatouhappyasthtou\t80591\n自造口\t80592\n邱晶琳\t80593\ngreatsale\t80594\n无定\t80595\n9.5元\t80596\n红豆汤\t80597\nJSSSS\t80598\n追就可惜\t80599\n好酷\t80600\n张现忠\t80601\n好寂寞我懂没有感觉到\t80602\n金苗\t80603\n撕裂\t80604\n刮目相\t80605\nudhiqqohdbddjjdxis
\t80606\n彭家贵\t80607\n015年春节\t80608\n松江区\t80609\nwoy\t80610\n24678\t80611\nwow\t80612\nwov\t80613\n杨彬彬\t80614\n唔识\t80615\nwor\t80616\n别说错\t80617\nwon\t80618\nfhvjvghghghg\t80619\nwoh\t80620\nwod\t80621\nwoc\t80622\n吉安\t80623\n不徐\t80624\nsvrfvwgdah\t80625\n不得\t80626\n藏心\t80627\n0点六五\t80628\nbntttyyxbgzvaaefbmnbbhkkj\t80629\n好大好大好大你好背\t80630\n何老师静\t80631\n和润\t80632\n皇明\t80633\n孟四\t80634\n圣战者\t80635\n10n元\t80636\n大火灾\t80637\n小黑\t80638\n乐呵呵呵\t80639\n到不了什热\t80640\n天涯无聊人\t80641\n小黄\t80642\n还清\t80643\n活该秒女王好样的骂死度秘\t80644\n一依然\t80645\n小黎\t80646\n附加值\t80647\n1778岁\t80648\n音序\t80649\n口才证\t80650\n残奥\t80651\n五甲\t80652\n嘟噜噜myokaltoto\t80653\n地下城号\t80654\n售票处\t80655\n本萌\t80656\nGWT\t80657\nhjkjhhjklljvgkmbfgjkbvjkk\t80658\n给你走\t80659\n磨皮\t80660\n一rr\t80661\n赵海龙\t80662\ncugu\t80663\n一点便遍\t80664\n几多感\t80665\n莲花六娃\t80666\n就最後一張　雖然\t80667\n滚滚永远\t80668\n能看\t80669\n段位\t80670\n电动机\t80671\n于玮\t80672\n卷曲\t80673\ngdgbga1agdbjgjgagagaj\t80674\n醴陵市\t80675\n得不偿失\t80676\n好跟\t80677\n242912498\t80678\n好跑\t80679\n厄齐一\t80680\n千米\t80681\n赵巍巍\t80682\n我的美丽日记\t80683\n光馨怡\t80684\ngfjfuhdhfiidtfjjdyetgtyuuughydt\t80685\nihih\t80686\n搜神\t80687\n2004年8月23日\t80688\n球娃\t80689\nDont\t80690\n婉瑶\t80691\n淑界\t80692\nAAAAyy\t80693\n粗线\t80694\n八嘎叽呱叽呱叽\t80695\nOPPO\t80696\n班里人\t80697\n親木\t80698\n英雄救美\t80699\n路飞君\t80700\n可了吧\t80701\n起见\t80702\n等哈\t80703\n明辨是非\t80704\n零六天之后\t80705\n威尼斯\t80706\n河北省佛教协会\t80707\n一块\t80708\n一坐\t80709\nfsSzfs\t80710\n朝廷\t80711\n你好度秘牌复读机\t80712\n阿孤影\t80713\n十八弯\t80714\n违章\t80715\n林良\t80716\n8464956745584539\t80717\n列娜\t80718\n彩c\t80719\n清请\t80720\n王铁柱\t80721\n好女人\t80722\n怎么了我就不陪你玩儿了哼你个臭度秘\t80723\n明天晚上22点\t80724\n雷没\t80725\n咋毁\t80726\n查埋\t80727\n大奥\t80728\n再唱一首歌\t80729\n张ab\t80730\n首相\t80731\n有工\t80732\n3544\t80733\n3545\t80734\n我要可爱我室最萌\t80735\n热价\t80736\n67258837\t80737\n神泉\t80738\n得策\t80739\n连攻\t80740\nㄛㄚㄞㄢ\t80741\n首盘\t80742\n机格\t80743\n給女王\t80744\n岁寒三友\t80745\n伤湿止痛膏\t80746\nc里森\t80747\n吵吵吵吵吵\t80748\n51555件\t80749\n戏票\t80750\n六芦\t80751\n李晨悠\t80752\n刘传说\t80753\n不要脸点\t80754\n八名\t80755\n张鹏\t
80756\n我的世界手机版\t80757\n知网\t80758\n六花\t80759\n2005年4月\t80760\n网球场\t80761\n好好不\t80762\npassword\t80763\n通牒\t80764\n成四不象\t80765\n有几不\t80766\n神贼\t80767\n六节\t80768\n涨停板\t80769\n世界版\t80770\n150支\t80771\n十五米\t80772\n午们\t80773\n张鹤\t80774\n1887415157199741515717874115157\t80775\n屈玲\t80776\n凉皮店\t80777\n金源机器人\t80778\n阿雅\t80779\n一个n五个\t80780\n球机\t80781\n所在\t80782\n全节\t80783\n见好\t80784\n聊欲\t80785\n享福\t80786\n谢谢你的童话故事\t80787\n这话费\t80788\n阿雪\t80789\n257天\t80790\n球服\t80791\n原博\t80792\n阿雯\t80793\n妤妤\t80794\n原单\t80795\n文艺\t80796\nfie66rugeohgi\t80797\nshrgg\t80798\n靠性\t80799\n羽毛\t80800\n学校期\t80801\n憋孙\t80802\n低估\t80803\n遗憾不清\t80804\n阿雷\t80805\n错路\t80806\n李鲜克\t80807\nvbggvhchgggchcgvjhhjhjgyy\t80808\n会飞会\t80809\n13579111315171921\t80810\n5点45\t80811\njgycu\t80812\n啊怀\t80813\n刘晏华\t80814\n阿龙伟福容\t80815\n长短不一\t80816\n谈起\t80817\n百多块\t80818\n杨金良\t80819\nXXXX公司\t80820\njhgfgnbvdee\t80821\n聪慧\t80822\n李栋\t80823\nCxjfkkgzzmzzjfaktag5\t80824\n三丁口\t80825\nOj688658855499\t80826\n结婚登记处\t80827\n咳咳咳咳咳咳咳咳咳咳咳咳唾成珠\t80828\n第一二两句\t80829\n闹闹闹闹闹闹我偶图\t80830\n橄榄油\t80831\n50多少\t80832\n冰灵雪雅\t80833\n哈楼哈楼哈楼哈楼哈楼\t80834\n武夷山\t80835\n冷孙\t80836\n外示威\t80837\n橙汁餐\t80838\n兰大\t80839\n5毫米\t80840\n我帅我酷我狂我最牛你丑你丑你最丑\t80841\n说子\t80842\n仏呃\t80843\n姜说姜\t80844\n啦啦秘\t80845\n大司命\t80846\n15236\t80847\n四时岁\t80848\nCatherine\t80849\n电话本\t80850\n切克闹切克闹\t80851\n三千六百五十九兆\t80852\n亲爱的度\t80853\n电话机\t80854\n重树\t80855\n翻墙\t80856\n幂姐\t80857\n餐饮化\t80858\n链子\t80859\n旁部\t80860\n贝戈戈\t80861\nxieu\t80862\nGaga\t80863\n惟一\t80864\n霹雳堂\t80865\nMGPGMG\t80866\n怨言\t80867\n5645654558595456\t80868\n意思我喜欢\t80869\nfufyydi\t80870\n贾wtou\t80871\n红动譕瑷\t80872\n珠江实业\t80873\nPortman\t80874\n骂错\t80875\nyfgiyf\t80876\n香皂\t80877\n224466\t80878\n固话\t80879\n四十三分\t80880\n补汤\t80881\n票务\t80882\n红日\t80883\n妞们\t80884\n晚手\t80885\n啦啦啦我找你\t80886\nhhdsd\t80887\n红早\t80888\n法西斯\t80889\n手推车\t80890\n刘福华\t80891\n骚gif\t80892\n嗯嗯嗯嗯嗯66666\t80893\n红旅\t80894\n下辈子\t80895\njtjbVhddiigkfbffh8vfB9irrrrjd8ddhiiiieedoojebedejduieejife\t80896\n2523428929\t80897\n心难\t80898\n心情好低落\t80899\n鲍亚琴\t80
900\n活套\t80901\n譬如\t80902\n科技性\t80903\n好不好乖\t80904\n红旗\t80905\n清香\t80906\n博力\t80907\n云敦\t80908\nvsd\t80909\n我也沒問那個錒\t80910\nTTTTTTTTTTTT\t80911\n蝶步舞\t80912\n票办\t80913\n南关\t80914\n寨桥村前村\t80915\n鸡鸣\t80916\n鸡鸡\t80917\nOK路\t80918\n欧啦啦\t80919\n鸡鸭\t80920\n二百多分\t80921\n百名\t80922\n13754562734\t80923\nAdfj\t80924\n社交网战\t80925\n立性\t80926\n称臣\t80927\nVvvvvvv\t80928\n尼玛\t80929\n再见吧明天在\t80930\nbevtud\t80931\n心情不舒畅\t80932\n那脑型\t80933\n恭喜\t80934\n走过路\t80935\n嗯美\t80936\n八分之三\t80937\n从复\t80938\n南充\t80939\n有钱不买\t80940\n八分之一\t80941\n八分之七\t80942\n咸鸭蛋\t80943\n乱伦三女\t80944\n别和\t80945\n好不好意思\t80946\nbcdakk\t80947\n切死\t80948\n帮偶\t80949\n病毒型\t80950\n二三五幺\t80951\n那几天\t80952\n電影\t80953\n戴口罩\t80954\n靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠\t80955\n田新闻\t80956\n驯服\t80957\n四十五岁\t80958\n蓝红粉\t80959\n田晨\t80960\njhbhhobjbjbjbbuht\t80961\n度秘貌无\t80962\n585866658586\t80963\n嗯李达\t80964\n青藏高原\t80965\n难道\t80966\n王海根\t80967\n深层次\t80968\n04051729303311\t80969\n奇境\t80970\n铻\t80971\n克咳\t80972\n蕴藉\t80973\n好吧真功夫\t80974\n蕴藏\t80975\n大桶\t80976\n刘成峒\t80977\n铲\t80978\n铱\t80979\n7777112\t80980\nwirebag\t80981\n银\t80982\ndqwety\t80983\n享誉\t80984\n叮咚叮咚叮咚叮咚叮咚叮咚叮咚叮咚\t80985\n普通犬\t80986\n铯\t80987\nmoreb\t80988\n铬\t80989\n克咪\t80990\n說話幹\t80991\n救狗\t80992\n问战\t80993\n别误会\t80994\n13561470776\t80995\n鲍鱼龙虾\t80996\n铝\t80997\n铜\t80998\n雅塔莱斯\t80999\n㏄㏄ァァ\t81000\n巨擘\t81001\n陈国富\t81002\n份子\t81003\n森林狼\t81004\n不不不不不我说妈妈是谁妈妈妈妈妈妈妈妈\t81005\n死吧\t81006\n18600日元\t81007\n铃\t81008\n铂\t81009\n铁\t81010\n羽翼之家\t81011\n赣州\t81012\n大桌\t81013\n克和\t81014\nDVD机\t81015\n囊囊\t81016\n白宫\t81017\n卓泥马\t81018\n电子纸\t81019\n马拉松式\t81020\n四小帅\t81021\n不好好\t81022\nstoustout\t81023\n有多远走多远\t81024\n7月26日\t81025\n他果\t81026\n刷头\t81027\n刷夸\t81028\njdohsdk\t81029\n把酒风流看太白，\t81030\n郑傲冰\t81031\n男声版\t81032\n打电影\t81033\n食物\t81034\n阵子号\t81035\n农药\t81036\n霹雳舞\t81037\n四层\t81038\n唔明\t81039\n张文龙\t81040\ndgbx\t81041\n崇州市\t81042\n拜拜走\t81043\n够了别来\t81044\n不管怎样\t81045\n恬aaing245454555148577844884554\t81046\n好涩\t81047\n景洪市\t81048\n差一点儿\t81049\n芥头\t81050\nwojiaohuyue\t81051\n金斧\t81052\n创赵\t81053\n谢祖武\t81054\n
金元宝\t81055\n发出\t81056\n八一下\t81057\nkiug\t81058\n河南邓州新西居委会\t81059\n双门\t81060\n新京报B06-B07\t81061\n闹吐\t81062\nLABE\t81063\nAlli\t81064\n死奥创\t81065\njgadc\t81066\n虚构\t81067\n苦冯俊凯\t81068\n厦门鼓浪屿\t81069\n12:20、18:38\t81070\nUIUIUIUIUI\t81071\n56条\t81072\n口口\t81073\n死同\t81074\n一时间\t81075\n大成慧\t81076\n13756222503\t81077\nfln\t81078\nata\t81079\n吃着苦\t81080\nGreen\t81081\nrjgkhff\t81082\nmbhjvc\t81083\n成精\t81084\n一九八八二十九\t81085\n超乎寻常\t81086\n张艺瀚\t81087\narrive\t81088\n爱天长地久\t81089\n18316509254\t81090\n江南人家\t81091\n友谊小姐\t81092\nu965\t81093\n1243\t81094\n1242\t81095\nchhggug\t81096\n基因\t81097\n肥膘\t81098\n谢谢好我懂了\t81099\n455828555755852824457\t81100\n尬\t81101\n中心思想\t81102\n排杠\t81103\n你女给我说明月\t81104\n高生\t81105\n雷豪\t81106\n杜姐姐\t81107\n摸清\t81108\n密林\t81109\n张耀友\t81110\n杜小姐\t81111\n淑女范\t81112\n多四年\t81113\n小闺蜜\t81114\n避世\t81115\n连平\t81116\n亲着我\t81117\n魔晶\t81118\n聋人\t81119\n议论文\t81120\n65年\t81121\n5段\t81122\ndufuuffududu\t81123\n数目\t81124\n外天\t81125\n恩我不骗你\t81126\n抹角\t81127\n唱红歌\t81128\nipadminifiifi\t81129\n其实上\t81130\n堡南村\t81131\n陶伟伟\t81132\n英特六\t81133\n帅贴\t81134\n人心与仁心\t81135\n刘点\t81136\n外头\t81137\n罗真\t81138\n等一下见\t81139\n东航集团\t81140\n余骁航\t81141\nDRUUD\t81142\ncfboss\t81143\n锋速战警\t81144\n赵屯\t81145\n广而告诫制\t81146\n签订\t81147\n再见我再也不理你了讨厌我我\t81148\n焦俊龙\t81149\n外外\t81150\n海蓝色\t81151\nsudd\t81152\n糖画\t81153\n筝\t81154\n嗯哈\t81155\n八五三零\t81156\n工智能\t81157\n中山南二路\t81158\n石榴树\t81159\nbyelainelouie\t81160\n电哥\t81161\n沈凯扬\t81162\n再然后\t81163\n金山手机卫士\t81164\n25010\t81165\n嗯哒\t81166\n唐裕凯\t81167\n想和你一起玩\t81168\n圣神\t81169\n寸草心的歌名\t81170\n唱场\t81171\n婚庆\t81172\nvibd\t81173\n嗯哥\t81174\n暗示\t81175\nzidy\t81176\n策\t81177\nlocated\t81178\n吸音\t81179\n嗯哼\t81180\n3项\t81181\nqeeqrf\t81182\n电哈\t81183\n霹雳\t81184\n十二点五六\t81185\n五十八米\t81186\n普交\t81187\n粼粼\t81188\n舒克\t81189\n六佳\t81190\n小红鱼\t81191\n秘社\t81192\njiong\t81193\n很度秘\t81194\n27冯\t81195\n其它天\t81196\n腹黑的孩子伤不起\t81197\n1969986690\t81198\n差辈\t81199\n脖友\t81200\n茶人\t81201\n我走了你一定要爱我哦亲爱的老公\t81202\n奴化\t81203\n善事零\t81204\n六体\t81205\nnnsns\t81206\n一百一十二块\t81207\n舒兰\t81
208\n就是我没有事\t81209\n六位\t81210\n红通通\t81211\niooi\t81212\n塞来\t81213\n柠檬弹珠\t81214\nahoou\t81215\njiani\t81216\n稀里么耐烧\t81217\n里物\t81218\n琳菲\t81219\nhuuu\t81220\n供电局\t81221\nSmith\t81222\n吹落\t81223\n清文\t81224\n苏牧\t81225\n一个太\t81226\n吹萧\t81227\ndhqd\t81228\nsoundsgreat\t81229\n大皖医\t81230\n一个夜\t81231\n一个多\t81232\n我爱你在哪里我想见见你我想看\t81233\n2269331123\t81234\n杜航\t81235\n陈祥志\t81236\n来嘛\t81237\n咔咔咔\t81238\n梅河学校\t81239\n密管\t81240\n切切切切切切切切切\t81241\n脑孬老子\t81242\n多样化\t81243\n奏折\t81244\n老泡\t81245\n筽\t81246\n黑鸭子\t81247\n加班不补\t81248\n五四两斤\t81249\n过隙\t81250\n嵌入\t81251\n维亚康姆\t81252\n仇恨\t81253\nS5838黑1650\t81254\nGVvvv\t81255\n中正\t81256\n五千千克\t81257\n我领\t81258\n岩落了一得管我叫姐姐\t81259\n女胖\t81260\n15533175327\t81261\ngjgsw\t81262\n天亮回老家\t81263\n1月22号\t81264\n照相片\t81265\n和你聊\t81266\n21:50\t81267\n挺心疼\t81268\n验证实\t81269\n好建新\t81270\n我问你你知道我是谁吗我没有问电影\t81271\ndqwww\t81272\n刘一又\t81273\nmodk\t81274\n我不我不\t81275\n勋鹿\t81276\n歌剧\t81277\n赵连海\t81278\n细绳\t81279\n小小小小\t81280\nisvbzaqip\t81281\n试走\t81282\nPRDOR\t81283\n再见以后再聊\t81284\n内侧\t81285\n汉网\t81286\nxtomityoutme\t81287\n陈鹤园\t81288\n千军\t81289\n程紫云\t81290\n成反\t81291\n根本度\t81292\n告诉他\t81293\n美即子夜\t81294\n消假了二\t81295\n民和\t81296\n侧芽\t81297\n金毛行\t81298\n入夜\t81299\npi液\t81300\n两个小时\t81301\n戚姐\t81302\n奔溃\t81303\n细细\t81304\n法术\t81305\n闭会儿\t81306\n意式\t81307\n万历五十年\t81308\n手牛\t81309\n二虎\t81310\n茅野爱衣\t81311\n回恩\t81312\n打摸摸摸摸摸摸\t81313\n大v份\t81314\n土生土长\t81315\n暂\t81316\n河谷\t81317\n觐见\t81318\n最爱你的朋友\t81319\n杨子豪\t81320\n李一直\t81321\n留亡\t81322\n九河\t81323\n冷男\t81324\n在线网\t81325\n92年\t81326\n帅子英子\t81327\ntheus\t81328\n242425245686\t81329\n弄虚作假\t81330\n真情实感\t81331\ntheun\t81332\n收声\t81333\n强攻\t81334\n申紫桐\t81335\nshsjjsksns\t81336\n我的伯父\t81337\n坐船头\t81338\n歌乐山山洞村碑口社\t81339\n渔村\t81340\n琼海\t81341\n真心情\t81342\n惇\t81343\n银淳\t81344\n爱你是谁你的名字\t81345\n哀叹\t81346\n打着\t81347\n弗赖伊\t81348\n老子老子老子\t81349\n枋湖\t81350\n攻陷\t81351\n背乘\t81352\n背书\t81353\n还在那叽里呱啦的说神马神马\t81354\n4美元\t81355\n4周左右\t81356\nYfygdt\t81357\n马塞洛\t81358\n能说会道\t81359\n有奖\t81360\n殷红\t81361\n维米尔\t81362\n惰\t81363\n想吃唱\t8136
4\n一摸\t81365\n细胞\t81366\n习24105608\t81367\n看她\t81368\n算什么\t81369\n郭玉卿\t81370\n潘昇\t81371\n拜托文\t81372\nuu2\t81373\n家花园\t81374\n洗一洗\t81375\n地儿\t81376\n卡娃衣\t81377\n五首\t81378\n十个亿\t81379\n我是你的闺密\t81380\n十个人\t81381\n一摞\t81382\n呼啦啦\t81383\n一片绿叶\t81384\n油条蛋饼\t81385\n警熊初\t81386\nPRESLEY\t81387\n一摊\t81388\n跪跪堡\t81389\ncaprylylglycol\t81390\n凯伦\t81391\n蝴蝶钱\t81392\n[赞\t81393\n平明镇\t81394\n147集\t81395\n55365\t81396\n糖勒\t81397\n走反\t81398\n老郑\t81399\n人情世故\t81400\n花屏\t81401\n肖私立\t81402\n花屋\t81403\n一个一遍\t81404\n四三个\t81405\n13785694848\t81406\n最天天\t81407\n21900天\t81408\n1654338586\t81409\n五分之一第三天\t81410\n奥数题\t81411\n阿夫汗\t81412\nhgrrf\t81413\n孙玉涵\t81414\n封路\t81415\n老都\t81416\n三点四点五点六点\t81417\n各取所需\t81418\nhgvn\t81419\n青啤\t81420\nhgvj\t81421\nguckjcje\t81422\n查同治\t81423\n花山\t81424\n投篮\t81425\n757575757575yr\t81426\n陶飞雯\t81427\n走台\t81428\n自作自受\t81429\n吃客\t81430\n赵雨晴\t81431\n天鸿君\t81432\n珂卡芙\t81433\n冻蟹\t81434\n沃尔玛\t81435\n骆驼\t81436\nHhshgdg\t81437\n乱打\t81438\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋你最坏蛋坏蛋坏蛋你最坏蛋度秘\t81439\n判决率\t81440\n命长\t81441\n行的海我不要狠毒海\t81442\n古墓\t81443\ncv8vc\t81444\n1462589\t81445\n这座城市\t81446\n烦呵呵\t81447\n跟姐\t81448\n金磊\t81449\n50年\t81450\n毕建琴\t81451\nwidue\t81452\n槟榔西施\t81453\n星级酒店\t81454\n七叶瑾\t81455\n波皮员\t81456\nhehdhd\t81457\n江郎\t81458\n薛秘\t81459\n草鞋\t81460\n六谜\t81461\n一个狗\t81462\n隆隆\t81463\n王飞宇\t81464\n田雨欣\t81465\n哆咪呀\t81466\n名贤集\t81467\n不要不要脸\t81468\n20多亿元\t81469\n嗯嗯为华的为的笔顺十二不对和三个娃\t81470\nwhekd\t81471\n金毛王子\t81472\n瞧不起\t81473\ndjlvtll金咯\t81474\n舟\t81475\n一次也吗一起去我的经营杆\t81476\ntistomit\t81477\n寇庆祥\t81478\n暖意盎然\t81479\n江都\t81480\n越城区\t81481\n哆啦a梦四一只未来的心不猫\t81482\n谷雨\t81483\n内嵌\t81484\n杨叔\t81485\n3d二\t81486\n需要\t81487\nkuoe\t81488\n劳峰\t81489\n欧贝\t81490\n说处\t81491\n原建\t81492\n闲扯\t81493\n嗯嗯嗯炒鸡棒\t81494\n女儿片\t81495\n驾驶室\t81496\nsothat\t81497\n我的我的爸爸妈妈\t81498\n点点滴水\t81499\n尹斗俊\t81500\n谈温\t81501\n龟缩\t81502\n真乖乖\t81503\nNONo\t81504\n08级\t81505\n还珠侠\t81506\n笨笨哒\t81507\n苛刻\t81508\n34.0\t81509\n二十辆\t81510\n下结论\t81511\n西红门\t81512\n谭晶心\t81513\n1麣\t81514\n啦啦啦啦啦啦啦啦啦啦\t81515\n小清\t81516\n许VC才\t81517\n步步
88a\t81518\n亅J111\t81519\n尊品牛排\t81520\n吴姓氏\t81521\n注销联联通\t81522\nuuu\t81523\nFANS\t81524\nCrab\t81525\n热拌\t81526\n妙乡\t81527\nqrdl\t81528\nhdghhdbis\t81529\n时ヲ止メテ\t81530\n邓家佳\t81531\n没收\t81532\n命年\t81533\n6道\t81534\n棉布\t81535\ntstost\t81536\n舞娘\t81537\n被的很\t81538\n悦酱\t81539\nMeeker\t81540\nvhxdjhdj\t81541\n禅意\t81542\n李章洙\t81543\n蓄水池\t81544\n王四六\t81545\n臂膊\t81546\nwarmclocthes\t81547\n赞念书\t81548\n酱紫撒\t81549\n网络人\t81550\nakwf\t81551\n新泰崭\t81552\n杜比亚\t81553\n6228480831220128910\t81554\n制冷\t81555\n野饮\t81556\n李桐桐\t81557\n鼻酸\t81558\n丑岁\t81559\n34679521\t81560\n收获的季节\t81561\nthane\t81562\n熱熱熱\t81563\n乔子巨\t81564\n好了我的\t81565\n小人国\t81566\n15907038411\t81567\n麼不一樣\t81568\n2183767888285183743647981546\t81569\n问及\t81570\n爱太帅\t81571\n独在异乡\t81572\n混饭\t81573\n十五分之四画简比\t81574\ntupos\t81575\n传媒界\t81576\n#万\t81577\n关我事\t81578\n洪微\t81579\n张豪\t81580\n255千米\t81581\n威震\t81582\n问号\t81583\n压箱\t81584\n正在版\t81585\n6万多\t81586\n远安\t81587\nghiuffrdd\t81588\ncrsfs\t81589\n问叫\t81590\n问句\t81591\n一百三十九\t81592\n小桑田\t81593\n红桃\t81594\n出动\t81595\n基本功\t81596\n基本力\t81597\n醋味\t81598\n曹文涛\t81599\n思索\t81600\n想办\t81601\n窝点\t81602\nins\t81603\n优胜劣汰\t81604\n仇者\t81605\nokjm\t81606\n红桥\t81607\n贝玲妃\t81608\n投资家\t81609\n周利\t81610\n宋久梅\t81611\n兴宁区\t81612\n为什么不通过\t81613\nbwhwbwb\t81614\n外界\t81615\n出力\t81616\n比特\t81617\n78平方\t81618\n度秘反应\t81619\n大生学\t81620\n溜冰鞋\t81621\n装扮\t81622\n8600525\t81623\n降温\t81624\n扮靓\t81625\n去话吧\t81626\n硫磺酸\t81627\n4.5%\t81628\n羊炕\t81629\n还珠\t81630\n何婉婉\t81631\n酒吧女\t81632\n天生王\t81633\n病房\t81634\n跨地区\t81635\n五行山\t81636\n多汗岑\t81637\n姜丽萍\t81638\n立涵\t81639\n晚上7:00\t81640\n有心看\t81641\n久保田\t81642\n巨神\t81643\n多伦多布卢尔街\t81644\n脱轨\t81645\ninc\t81646\n相映\t81647\n鱼猫\t81648\n俊凯帅\t81649\n说英语多聊\t81650\n试音室\t81651\n徐昊\t81652\nPosch\t81653\n提起\t81654\n保暖裤\t81655\n赵静茹\t81656\n新月\t81657\n周娜娜\t81658\n坐摇\t81659\n零二零二二幺五幺幺\t81660\n3.14亿元\t81661\n髮色\t81662\n一起作业\t81663\n幽暗\t81664\n3DM\t81665\n一曲折\t81666\n吾心\t81667\n隔日\t81668\n婚礼貌\t81669\n千方百计\t81670\n敏锐\t81671\n如年\t81672\n2233446686879866334898884856834982846\t8167
3\n克娜\t81674\n朱先生\t81675\n心情不公\t81676\n八八点\t81677\n人皮\t81678\n横向火\t81679\n秦艺鸣\t81680\n使劲干\t81681\n300224\t81682\n不要不要拜拜你是头猪你头猪你是头猪猪猪\t81683\n7分\t81684\n7刀\t81685\n手臭猪\t81686\n天团\t81687\n浴埜\t81688\n人皇\t81689\n霾vcjj\t81690\n经文\t81691\n度秘我爱爱你\t81692\n七块\t81693\n幺五五二二零零\t81694\n盖佳\t81695\n圣诞惊魂夜里骷髅杰克\t81696\n爱不玲\t81697\n安不烦\t81698\n明天英\t81699\n爱渐\t81700\n韩劲松\t81701\n范冰冰吻\t81702\n天国\t81703\n白小飞\t81704\ndxxssdffttrr\t81705\n浸水\t81706\n中信出版集团\t81707\n一会儿天\t81708\n摇篮\t81709\nstanderdEnglish\t81710\n逼格\t81711\n星兴路信华楼\t81712\n你说\t81713\n逼样\t81714\n认不出\t81715\n马里\t81716\n诤言\t81717\n夹层\t81718\n张娜\t81719\n总气\t81720\n法老\t81721\n开题\t81722\n打野\t81723\n打量\t81724\n张老板\t81725\n反问\t81726\n回头忐忑\t81727\n哈日\t81728\n红饿\t81729\n帥氣\t81730\nSkenjaizx\t81731\n鞋带\t81732\n鸿门宴\t81733\n好困好困\t81734\n坚果米饼\t81735\n鹤立鸡\t81736\n五零二二\t81737\n一点也不斯文\t81738\n约稿\t81739\n虚心四轮\t81740\n研磨\t81741\n2.83次\t81742\n有责\t81743\n乾隆大帝/\t81744\n拍片\t81745\n有货\t81746\n此世\t81747\n哈族\t81748\n摔伤\t81749\n程胜泽\t81750\n吴ｐ\t81751\n有费\t81752\n日pt\t81753\n少则四五年多子\t81754\n公物品\t81755\n堃堃\t81756\n鱼丸\t81757\n压岁钱\t81758\n不道德\t81759\n谢大脚\t81760\n难事儿\t81761\n炎热\t81762\n福州市\t81763\n2公斤\t81764\n钱钟书\t81765\njcpm\t81766\n斯文尼\t81767\njkhd\t81768\n急转\t81769\n飞越疯人院\t81770\n神坑\t81771\n美红\t81772\n陈小翔\t81773\n神坛\t81774\n昭觉寺\t81775\n过两天再来\t81776\n暗影\t81777\n消失了你也不知道\t81778\n沙爷\t81779\n阿萌\t81780\nh8\t81781\n几亿\t81782\n出言不逊\t81783\nh2\t81784\n2920\t81785\nh6\t81786\n0。6。。0\t81787\nh5\t81788\n说得好男\t81789\n香港立法局\t81790\n蒙特利尔\t81791\n多看\t81792\nTobit\t81793\n察卡盐湖\t81794\n名侦探柯南名侦探柯南\t81795\n蜀山战纪要不要钱\t81796\n大武生\t81797\n阿萨\t81798\n病毒人\t81799\n几五\t81800\n解析度\t81801\n降准\t81802\n忘给\t81803\n知情者\t81804\n梁羽生\t81805\n外貌\t81806\n知不留\t81807\n曝出\t81808\n舍弃\t81809\n多眠\t81810\nwhbkssskazppzlpppwwww2p13f\t81811\n1332539654\t81812\nhz\t81813\nhx\t81814\nhy\t81815\n圆谎\t81816\n什么梦\t81817\nhr\t81818\nhs\t81819\nhp\t81820\nhq\t81821\nhv\t81822\nhw\t81823\nht\t81824\nhu\t81825\n歌伤不起\t81826\nhk\t81827\nhh\t81828\nhi\t81829\nvhvib\t81830\nho\t81831\nhl\t81832\nhm\t81833\nhb\t8
1834\nhc\t81835\nha\t81836\nhf\t81837\nhg\t81838\nhd\t81839\n邱婉婷\t81840\nhZ\t81841\nTogether3\t81842\nhX\t81843\n田埂\t81844\nhR\t81845\n光枪\t81846\n踏青\t81847\nhV\t81848\n完工\t81849\n苏诚\t81850\nhH\t81851\n猛犸\t81852\n老大不小\t81853\n嘣谢\t81854\n9089887\t81855\nhB\t81856\n39号\t81857\n只愿\t81858\nhD\t81859\n南淝河\t81860\n1166\t81861\n遗传病\t81862\nbjkkooppooooopppppoop\t81863\n1161\t81864\n锁头\t81865\n同交欢\t81866\n方则\t81867\n油度\t81868\nles噶\t81869\n复合型\t81870\n张特\t81871\n发仁\t81872\n欧朋雨\t81873\n勇闯\t81874\n油库\t81875\n张片\t81876\n度寒假\t81877\nvajsiod\t81878\n放开亲\t81879\n李弘基\t81880\n要回来\t81881\n队里媚\t81882\n发以\t81883\n郭旭东\t81884\nJccjhxh\t81885\n好呆\t81886\n和母\t81887\n生贵子\t81888\n完全平方数\t81889\n想开通\t81890\n全县\t81891\n475\t81892\n麟化\t81893\ngvnj\t81894\n法律战\t81895\n北京机场\t81896\n安定位\t81897\nSealink\t81898\n找先\t81899\n春光乍泄\t81900\n蛋蛋蛋蛋\t81901\n情网\t81902\n精选\t81903\n东滨路\t81904\n回旋镖\t81905\n盯人\t81906\n怕无\t81907\n天作\t81908\n精通\t81909\n任盈村\t81910\n吃饭吧过\t81911\n呀飞\t81912\n占地儿\t81913\n555555555555555555555555555555555555555555555555555555555555555555555555555555555555555555555555\t81914\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t81915\n周小狄\t81916\n蔡哥\t81917\n傻笑死\t81918\n泳将\t81919\ngoggigy\t81920\n98万\t81921\nzjka\t81922\n给我唱\t81923\nktool\t81924\n角平\t81925\n性恋\t81926\n出院\t81927\nloll\t81928\n度秘度秘我姐爱你\t81929\n天煞\t81930\nhello糠喵\t81931\n教唆\t81932\n贾玉强\t81933\n艾仕娟\t81934\n连篇累牍\t81935\n性息\t81936\n一个六万块\t81937\n可地\t81938\n风能\t81939\n郑大\t81940\nyouisapig\t81941\n陈学冬\t81942\nyouisapia\t81943\n快行\t81944\n4月27日\t81945\n此书\t81946\n郭晓冬\t81947\n姓帅\t81948\n四川宜宾市一中\t81949\n日化\t81950\n真真假假假\t81951\nwgskis\t81952\n歌\t81953\n天哪你有毒\t81954\n棉裤\t81955\n此么\t81956\n日包\t81957\n几吧度\t81958\n思修\t81959\n杨小俊\t81960\n15102810188\t81961\n够不着我\t81962\n施恩琪\t81963\nujjugghjghhhvhujkukjkvlivnnmmljjvgvvbbjhjjjjlhccvblopbSzgjtyfxvhknmpzksnBbbnnnmnnnbllln\t81964\n哇塞秀秀\t81965\n通知我\t81966\n大富科技\t81967\n下落不明\t81968\n阿碧\t81969\n健康快乐\t81970\n不限时\t81971\n洗秘\t81972\n千年没一家修和种人王鼓舞经常状元故里多才剧\t81973\n第一场雪\t81974\n认识观\t81975\nnel\t81976\nnek\t81977\n摸摸麻\t81978\nneg\t81979\
nned\t81980\nnee\t81981\n湖南黑美人茶业\t81982\n沙尔克\t81983\n来了玩\t81984\nABS-CBN\t81985\n感应\t81986\nney\t81987\nPut\t81988\nnew\t81989\n一个一百元\t81990\n笨妮\t81991\nnes\t81992\n别乱\t81993\n谷亦璨\t81994\nG多\t81995\n吃蟹\t81996\n费多\t81997\n晚寓\t81998\n歼\t81999\n卖官\t82000\n谁亲\t82001\n60000\t82002\n朝花夕拾\t82003\n谁人\t82004\n醉鸟\t82005\niphone4S\t82006\n许宗衡案\t82007\n了蚀\t82008\n拥\t82009\n欣喜\t82010\nfvgsrjfgurdy8iiyeugs4ihc#k\t82011\n阿sa\t82012\n冬泳\t82013\n五断感\t82014\n墙壁\t82015\nForM\t82016\n口过\t82017\n名址\t82018\n18839608500\t82019\n4，5年\t82020\n绿水\t82021\n恩撒\t82022\n刘丽姐\t82023\n歲\t82024\n养蛇\t82025\n你好难相处\t82026\n昆仑表\t82027\n中视航通国际传媒有限公司\t82028\nt头jktlvtlvttlvmttjuntt\t82029\n别的号\t82030\n齐克莫\t82031\n施老师\t82032\n首饰\t82033\n来针\t82034\n西樵山风景名胜区\t82035\nonome\t82036\n恶择\t82037\n可见你好丑\t82038\n忘喽\t82039\n歪\t82040\n忘了写\t82041\n1月1号\t82042\n省籍\t82043\n交款\t82044\n有罪\t82045\n邹立龙\t82046\n读书器\t82047\n隆咚\t82048\n窝靠\t82049\nfkydjgkhnxkhkhfkhkngdkhkyngxkhurdnjgmhcjtniyjvmhflig\t82050\n秘你好呆\t82051\n翠珑\t82052\n季国燕\t82053\n现现\t82054\n坏片\t82055\n那个群\t82056\n李坤\t82057\n维回\t82058\n一大步\t82059\n我真的对你一话钟情\t82060\nhttpahiphotosbaiducomxiaodupicitemd31b0ef41bd5ad6ebd243e9f86cb39dbb7fd3ce1jpg\t82061\n5688668\t82062\nhaccc\t82063\n防治\t82064\n咋吻\t82065\n挠挠\t82066\n莫皓文\t82067\nssooo\t82068\n李仲阳\t82069\n父神\t82070\n恋母\t82071\n杨氏\t82072\n猜真准\t82073\n喂狗\t82074\n张敬轩\t82075\n齐海\t82076\n精确\t82077\n给偷情网\t82078\n250米高\t82079\n胡友娃\t82080\n大坏狼\t82081\n3015\t82082\n红杏出墙\t82083\n拜拜拜拜拜拜拜拜\t82084\n金牛男\t82085\n865斤\t82086\n宛然\t82087\n损友度秘\t82088\noff\t82089\n两校的沦落\t82090\nRHFFUGX\t82091\n兴山\t82092\n沏茶\t82093\n一筹莫展\t82094\n斗龙\t82095\n好呀我就给你的棒\t82096\n文帅\t82097\n传真\t82098\n胜达你\t82099\n老子不喜还有你的脸\t82100\ndontspeakmysay\t82101\n嫁给你的可以\t82102\n555855555688\t82103\n左棍\t82104\n干挂\t82105\n不要我做你的主人\t82106\n好的我你猜猜我\t82107\n多具\t82108\n谢先先\t82109\n爱宝\t82110\n秦总八十慢过桥事业俗语求你来\t82111\n刘海涛\t82112\n鸟瞰\t82113\n菇凉撒\t82114\n家树林\t82115\n多兆\t82116\n钢针\t82117\n多元\t82118\n注孤生\t82119\n平米\t82120\n关智斌\t82121\n33333335887789\t82122\n4万一平米\t82123\njhidk\t82124\n早阿\t821
25\nhdud\t82126\nhduc\t82127\n瓦片\t82128\n過的累點那樣我心裏踏實\t82129\n一年来\t82130\n东明\t82131\n衣服\t82132\n导视\t82133\n二东\t82134\n猪牛肉\t82135\n我爱你无所畏#我爱你\t82136\noqoakjqniqkqkwns\t82137\n紧问\t82138\n商业城\t82139\n紧闭\t82140\n福村\t82141\n武王府ac\t82142\n十秒\t82143\n同笼\t82144\n乔峰\t82145\n看上上上上上上上上上上上上上上上\t82146\n股长\t82147\n我需要十个你这样的度秘\t82148\n就地正法\t82149\nDistimo\t82150\n瓦特\t82151\n农农\t82152\n蒋鹏\t82153\n夜深了\t82154\n打人\t82155\n好话呢鲠\t82156\n单词卷\t82157\n人声鼎沸\t82158\n不时之需\t82159\n2468元\t82160\n12.08.12\t82161\n六月6月1号\t82162\nnotouwwt\t82163\n13米\t82164\n想过河\t82165\n胆囊\t82166\n福克斯\t82167\n累不许\t82168\n14天\t82169\n三长两短\t82170\n一颗心\t82171\n书\t82172\n3000万会\t82173\n单一吃\t82174\n拉梅拉\t82175\n28点\t82176\n华宇欣\t82177\n唉稀罕\t82178\n值不值\t82179\n错乱收费\t82180\n度秘你的过张帐\t82181\n维多利亚\t82182\n小然\t82183\n78942900\t82184\n000元\t82185\n混为一谈\t82186\n度秘我怀上你的孩子你\t82187\n杨大洋\t82188\n秒杀\t82189\n媒体场\t82190\n1412061532\t82191\n右图\t82192\n一圈儿\t82193\n赵子怡\t82194\n屁话节\t82195\n11111111111一一111111111111\t82196\n太上心\t82197\nyrfc\t82198\nDock\t82199\n一呀一\t82200\ngchf\t82201\n面粉\t82202\n1993年\t82203\n樱桃树\t82204\n关喆\t82205\nTKT\t82206\n飞鸟龙\t82207\n邱慕涵\t82208\n一千份\t82209\nwhatrhou\t82210\n七十级\t82211\n洞妖\t82212\n木筏\t82213\n家肴\t82214\n郭名俊秀\t82215\n愚公\t82216\n2911点\t82217\n习种\t82218\n复何求\t82219\n你你你你你成心气\t82220\noq派特\t82221\n再聊星期五星期六\t82222\n121班\t82223\n７席\t82224\n给我糖\t82225\n愚兄\t82226\n切开\t82227\n香港立法会\t82228\n圆通快递\t82229\n45吨\t82230\n秘我累\t82231\n王乐山\t82232\n就业率\t82233\n下五分钟\t82234\ngiffdg\t82235\n泰达\t82236\n阴魂\t82237\n赶泡\t82238\n犯错\t82239\n拷\t82240\n表述者\t82241\n洗发水\t82242\ngffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff\t82243\n什空\t82244\n背身\t82245\n1918151985\t82246\n生气香\t82247\n是真的了再见\t82248\n4个多小时\t82249\n丧命\t82250\n王一臣\t82251\nmaomatl\t82252\n13864080285\t82253\n37度\t82254\n高廣\t82255\n没得心穿衣\t82256\n塞塞塞\t82257\n依米31\t82258\n退休后生活\t82259\n陈国林\t82260\n鸡吧扯蛋\t82261\ni我的逆袭我去等你企鹅\t82262\nIabbc\t82263\n雷高飞亚\t82264\n度秘侠侠侠\t82265\ngtgggfff\t82266\n杜v\t82267\n宋梦莹\t82268\n不六零六\t82269\nnosto\t82270\n宝龙广场\t82271\n网吧化\
t82272\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪哦咪哦咪哦咪哦咪咪咪咪咪咪咪\t82273\n主人们\t82274\nmpatisolarou\t82275\n店子\t82276\n正月十三\t82277\n主人任\t82278\n龙卷风干\t82279\n双胞胎\t82280\n小冉然\t82281\n绝世\t82282\n陈灿元\t82283\n李思婷\t82284\n近意\t82285\n在线人\t82286\n018585180\t82287\n一丝不舍\t82288\nposs\t82289\n信旦\t82290\n绝不\t82291\nx510x\t82292\nhdjdjjdjddn\t82293\njmdp\t82294\n很多很难\t82295\n非诚勿扰II\t82296\n黄田\t82297\n萌萌哒帅帅哒\t82298\nchen\t82299\n钱力晨\t82300\n加州\t82301\n皇甫冉\t82302\n梦心\t82303\ntletshcoujllo\t82304\n草人\t82305\n黄生\t82306\n运不济\t82307\nFRD\t82308\n加工\t82309\n皇甫冠\t82310\n嗣\t82311\n航海\t82312\nCkussv\t82313\nhfigimffgfjkfnfforpoebhdufghjfjfif\t82314\n嗡\t82315\n十二十三十四十五\t82316\nFRH\t82317\n左旗伯\t82318\n33534\t82319\n搞不领情\t82320\n李陌陌\t82321\n电tx03\t82322\n拐卖案\t82323\n41574\t82324\n大佬们\t82325\n法齐奥\t82326\n鼓劲\t82327\n啊轩辕\t82328\n琥珀主\t82329\n巴蛋\t82330\n刺挠\t82331\n呀美\t82332\n官本位\t82333\n喋血\t82334\n少妇\t82335\n公投\t82336\n傅零铃零两木\t82337\n186586888888881\t82338\n少妞\t82339\n陈思敏\t82340\n音悦\t82341\n午安晚安午安早安\t82342\n拜金女\t82343\n3个月\t82344\n人工耳蜗\t82345\n请来\t82346\n好值\t82347\n贾丽霞\t82348\n霄肃\t82349\n在风中\t82350\n三层水富疑无咯山后山前处处梅\t82351\n切诺基\t82352\n运水\t82353\n一夫裙子\t82354\n好不CD\t82355\n对呀你没有\t82356\n迷糊糊\t82357\n乖乖的听妈妈的话\t82358\n天下气自华\t82359\n想不打扮\t82360\n养兵千日\t82361\n开业\t82362\n无灾\t82363\n啦啦不局jun来啦lte\t82364\nYouarefooi\t82365\n开下\t82366\n运气\t82367\n噗噗\t82368\n纯爱\t82369\n手机A30\t82370\n牛皮纸\t82371\n黑屌哥\t82372\n肚条\t82373\n阿亲亲你的脸\t82374\n150万欧元\t82375\nb6针\t82376\n窝苣窝\t82377\n贝因美\t82378\nhentai\t82379\n朱浩源\t82380\njigjgjfvjj\t82381\n弥渡弥渡嗯\t82382\n丸家\t82383\n瓜婆娘\t82384\n嗎\t82385\nygx\t82386\n我喜欢书\t82387\n仇人\t82388\n唉呀仔\t82389\n唐玉超\t82390\n自大\t82391\n上船\t82392\n西宁市\t82393\n歌唱家\t82394\n自夸\t82395\n陈一舟\t82396\n花花你是花美女\t82397\nMOZER\t82398\n四四本\t82399\n大吃饱走\t82400\nlutate\t82401\n￥￥￥￥￥￥￥\t82402\n坏风\t82403\n崔莺莺\t82404\n五千年前\t82405\n嗅\t82406\n台北Legacy\t82407\n戒色吧\t82408\n18326641251\t82409\n轻女重男轻女\t82410\n老特拉福德球场\t82411\n志高\t82412\ndufg\t82413\n麦麦麦\t82414\n流亡\t82415\n不要在见了我不想和你说了你气死\t82416\n直接\t82417\n哦摸\t82418\n武颖欣\t82419\nSUJU饭团\t82
420\n罒ο罒\t82421\n十一月\t82422\n嗞\t82423\n架势\t82424\nhi份\t82425\n沈十七\t82426\n每天\t82427\n三千年\t82428\n龚老\t82429\n铁证\t82430\n嗜\t82431\n无毛钱\t82432\n新美国梦\t82433\n巴马\t82434\n嗓\t82435\n逢源\t82436\n恩好\t82437\n绚丽\t82438\n糞\t82439\n嗑\t82440\n叶笃正\t82441\n乘以七减\t82442\n韩国代表团\t82443\nhsiz\t82444\n机器人机\t82445\n第l次\t82446\n火影忍者ol\t82447\n骑用\t82448\n来车往\t82449\n生不出\t82450\n非诚勿\t82451\n能不能不爱\t82452\n顺道\t82453\n11月13日\t82454\n嗔\t82455\nvenjde\t82456\n转晴\t82457\n河北路\t82458\n郭威天\t82459\n林水滨\t82460\n显富\t82461\n浦里\t82462\n哦你在哪师傅和\t82463\n大明二\t82464\n出让\t82465\n邵飘萍\t82466\n萨马特\t82467\n月黑风高\t82468\n跳票\t82469\n王有斗\t82470\nmc5\t82471\n七舌\t82472\n妙妙蛙\t82473\n张相和\t82474\n职高\t82475\n臭名\t82476\n7.62毫米\t82477\n熊自林\t82478\n九厘米\t82479\n动手吧\t82480\n搅拌机\t82481\n1V\t82482\n视而不叫\t82483\n海归\t82484\n史涵\t82485\n无能机器人\t82486\n继天\t82487\n嗯晚班\t82488\n可惜\t82489\n绝地武士\t82490\nghsgd\t82491\n李炜\t82492\n林爽美\t82493\n女闺蜜了你是我\t82494\n权力行使者\t82495\n王雨露\t82496\n88877776\t82497\n抖森\t82498\n开头发\t82499\n撒尿钱\t82500\n唱儿歌呵呵呵度\t82501\n眉开眼\t82502\n零二二幺零零幺八\t82503\n粪草\t82504\nabcdefgabcdefhahame秘covobobaxiazxyazoulcxiazeloucatihereuscxiheouxiheiouzz\t82505\n请到底是什么\t82506\n贺冰圣\t82507\n那个女\t82508\n陆开福\t82509\n五百多块\t82510\nNAME\t82511\nygghut\t82512\n我家小叔公好乖\t82513\n前列腺\t82514\n好伴侣\t82515\n456456\t82516\n冷淡杯\t82517\n想你自己\t82518\n呢性爱\t82519\nPisD\t82520\n桂熊\t82521\n深交\t82522\n著作资料\t82523\n保证金\t82524\n对我好\t82525\n红警\t82526\n22三\t82527\n觉得\t82528\n禹州\t82529\n黄勇\t82530\n422201198711127993\t82531\n鬼泣\t82532\n木鸡\t82533\n法字\t82534\n关晓\t82535\n法子\t82536\n深了\t82537\n度秘臭度秘\t82538\n九十多页\t82539\n海底世界\t82540\n林心如\t82541\n啊囧囧囧\t82542\n法学\t82543\n六千多\t82544\n呃哥哥哥哥哥哥哥哥哥哥爱\t82545\n乔振宇\t82546\n深井\t82547\n一个你的声音\t82548\n揉揉揉揉揉揉揉揉揉揉揉揉揉\t82549\n预癌\t82550\n自己是你是你是你是\t82551\n永生味千\t82552\nu妈妈\t82553\n白老虎\t82554\nDo\t82555\n闽剧\t82556\n我怕了你了行\t82557\n肥仔\t82558\n吃吃吃吃吃吃吃吃吃\t82559\n王无礼\t82560\n实用期\t82561\n白晶晶\t82562\n永远不见了\t82563\nftutrfgjudffggf\t82564\n顏色\t82565\n全尸\t82566\nNNNno\t82567\n广播站\t82568\n糯\t82569\n甜橙\t82570\n67周年\t82571\nfffffffffffffffffffffffffffff
ff\t82572\nkaoxiao\t82573\n能不能不说\t82574\n手工西施壶\t82575\n韩剧\t82576\n澳怡妹\t82577\n求己\t82578\n错大错大错大错\t82579\nDb\t82580\n1996年5月8日\t82581\n将都江堰\t82582\n张苍井空\t82583\n校草\t82584\n超长\t82585\n汗哒哒汗哒哒汗哒哒汗哒哒\t82586\n紫癜\t82587\n发米泉\t82588\n你好怪哟我喜欢\t82589\n几尺\t82590\n淘宝银\t82591\ncbughhvj\t82592\n胸举列\t82593\n紫癫\t82594\n天蝎晚餐\t82595\n家潭\t82596\n落课\t82597\n浑然一体\t82598\n方欣然\t82599\n找你可以\t82600\nUnionopenYuma\t82601\n甘心疼\t82602\nopp\t82603\nopr\t82604\nops\t82605\n抱头痛哭\t82606\n变形金刚变\t82607\n有勉\t82608\n透气性\t82609\nope\t82610\n卡里\t82611\n会计核算\t82612\nvxxgdj\t82613\n好相信\t82614\nhobrt\t82615\n战死\t82616\n小鱼干\t82617\n高难度\t82618\nDr\t82619\n發個瓷偕\t82620\n太人家\t82621\n酒谷\t82622\n捣蛋鬼\t82623\n伤伤\t82624\n赵赵鹏鹏\t82625\n干干嘛\t82626\n關鍵\t82627\n瓜皮\t82628\n沈静瑶\t82629\njagkhidjg\t82630\nrffd\t82631\n盖盖\t82632\n疯狂寻人\t82633\n上见识\t82634\nYesYesYes\t82635\n布莱恩特\t82636\nspn9\t82637\n迟夷\t82638\n陈乐\t82639\n胡半仙\t82640\nyigd\t82641\n武基\t82642\n这一幕\t82643\n生态园\t82644\n世联\t82645\n春娇\t82646\n硬我一样\t82647\n武城\t82648\n刘诗丽\t82649\n灵隐寺\t82650\n一起来不及\t82651\n友尺\t82652\n想不要\t82653\n友尽\t82654\n邹彭慧\t82655\n人糖糖\t82656\n嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t82657\n这一年\t82658\nggtuf\t82659\n真结衣\t82660\n苞米\t82661\nhkuef\t82662\n外出来生\t82663\n一么一\t82664\n粗鲁咯\t82665\nGfddfffggggggg36l\t82666\nctsd\t82667\n灯塔市\t82668\n滨江花城\t82669\n把关\t82670\nbcd127\t82671\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t82672\n按键\t82673\n大方向\t82674\n高孟浩\t82675\n黑土地\t82676\n第二百期\t82677\n木枝\t82678\n公德\t82679\n何忍\t82680\n辛集\t82681\n依马\t82682\n街角\t82683\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦我要乖乖乖我要乖乖乖乖乖\t82684\n1G\t82685\n很好色\t82686\n355466\t82687\n精神睡睡有今朝有今日岁岁有今朝有酒今朝有今朝\t82688\n官运\t82689\n建議\t82690\n视察\t82691\n斯密\t82692\n38一\t82693\n不投诉\t82694\n射程\t82695\n昨夜一夜\t82696\n1几\t82697\n摊子\t82698\nQffxdfffvgffhghcudjgfghsschydhkbxcjtst\t82699\n源码\t82700\n马新亿公司\t82701\n倡唱\t82702\n熊出没\t82703\n曾素玲\t82704\n天气堡\t82705\n1887413137\t82706\ne3df8dcd100baa196a510ce4010b912c8fc2e3djpg\t82707\n快口臭口苦茶\t82708\n锕年后\t82709\n家卷卷胶水甲醛\t82710\n邓思\t82711\n一个300元\t82712\n幻想快快快快快\t82713\n38个\t82714\n岳阳话\t82715\njoy罗\t82716\n弥足\t82717
\n阿巴斯\t82718\n杨庄主\t82719\n桌面你爱我吗我女儿考砸你还爱我\t82720\n整天到晚不行哪麻烦\t82721\n一二3块\t82722\n我要是个坏蛋王浩然\t82723\n做什么呢讨厌\t82724\n聊斋先生家的夏天小游\t82725\n性能力\t82726\n二儿\t82727\n送温暖\t82728\n没用了等\t82729\n心裡\t82730\n宋晓雯\t82731\n多莫\t82732\n坏坏人\t82733\n2011年1月11日\t82734\n娱乐双周\t82735\n你好多咪多咪我的吃烤鱼\t82736\n好朋恿\t82737\n偷摸\t82738\n你好智能\t82739\nvjfk\t82740\n才梦晴\t82741\nX—35B\t82742\nNddjjfkfif\t82743\n熊飞\t82744\n阿勃斯的饿佛歌呵一起咯吗嘛我婆心一丝之屋未啊兮雅贼\t82745\n呃春兴\t82746\n划为\t82747\n19980319\t82748\n5月29日\t82749\n呼噜\t82750\n枪者\t82751\n叶春燕\t82752\n离怜\t82753\n兔兔兔兔图鹅鹅鹅鹅鹅鹅\t82754\n别致你信\t82755\n花颜店\t82756\n我是什么我是谁了你\t82757\n够吧\t82758\n嗯唱片塔汀er1般\t82759\n团里\t82760\n奉送\t82761\n我是谁你谁让你骂我猪婆\t82762\n撒韵达\t82763\n多年后\t82764\n创业型\t82765\n暗调\t82766\n哇咔咔天下西瓜酱午安\t82767\n大秘密\t82768\n我要王游戏\t82769\n十八九岁\t82770\n罗立轩\t82771\n非智能模式\t82772\n受洗\t82773\n布尔津\t82774\n炸鸡薯条\t82775\n亲爱的你\t82776\n155522558\t82777\n大吉大利\t82778\n感伤\t82779\n学塔县\t82780\n你的声\t82781\n165656564656564464646564662318894465653565613232858987676446546435353865665566461565\t82782\n够吗\t82783\n除夕去\t82784\n流沙河\t82785\n暮霭\t82786\n房改\t82787\nvvvvvvvvjvvvv\t82788\n村名\t82789\n1L\t82790\n美没有发出去你不唱歌英语歌呗求你了行么\t82791\n宁泽涛\t82792\n三冰魄\t82793\n多心\t82794\n奀垚\t82795\n时瑞郎\t82796\n521521\t82797\nfell\t82798\nBB粉\t82799\n朱小风\t82800\n算不说\t82801\n菲尔王\t82802\n杨萌萌\t82803\nmelomic\t82804\n必守\t82805\n玉兰\t82806\n扯西扯扯\t82807\n安迪\t82808\n悬崖峭壁\t82809\n表姐夫\t82810\n虚实岁\t82811\n你好牛啊嘟咪嘟咪嘟咪\t82812\n猪猪猪猪猪猪黑子\t82813\n这样了行不行\t82814\n暗杠\t82815\n暗条\t82816\n安靜\t82817\n二零一七\t82818\nondan\t82819\n失意者\t82820\n70078484846\t82821\n傅高义\t82822\n三万家\t82823\n切切切切77777777777777777777\t82824\n店口鱿鱼\t82825\nCJ6136麦道\t82826\n专诸专\t82827\nooOo\t82828\n血珀\t82829\n一桶翔\t82830\nvvvvvvvvvvvvv\t82831\n拘言笑\t82832\n小爱度或小爱秘\t82833\nTteedhutryur\t82834\n连不如\t82835\n88585755\t82836\n上档\t82837\n七八岁\t82838\n醢浮浮沉沉\t82839\n先贤们\t82840\n香照\t82841\n一个三岁\t82842\n影子篮球员\t82843\n侯宇轩\t82844\n吵闹\t82845\nhttpfhiphotosbaiducomxiaodupicitem29381f30e924b899ec0e9c4c69061d950a7bf690jpg\t82846\n去了麽有事\t82847\n往西天\t82848\n排出\t82849\n到家长大\t82850\ntfboys\t8285
1\n玩些什么\t82852\n吧要不十八都\t82853\n时分\t82854\n抱歉先生度秘度秘\t82855\n征信念你好吃呢啊嘛事\t82856\n搞怪\t82857\nmicond\t82858\n酷6网\t82859\n冬菇\t82860\n剩不多\t82861\n度秘哇好羡慕你\t82862\n无话可说\t82863\n范诗晗\t82864\n央视一套\t82865\n小米手\t82866\n阿米豆腐\t82867\n姐汁\t82868\n明顯違\t82869\n烦我了真是\t82870\n21页\t82871\n时到\t82872\nvchddcfhhStvhjxGyxsgfzGgcidsZcDgz\t82873\n拍呀拍\t82874\n二秒末\t82875\n时刻\t82876\ntfboy2\t82877\n诉我\t82878\n北京国安足球俱乐部\t82879\n初七\t82880\n杨国波\t82881\n孙胜男\t82882\n伤感情\t82883\n二脉\t82884\n在他\t82885\nsawef\t82886\n挺龙\t82887\n欢战士\t82888\n女神经\t82889\nQQ嗯瓮\t82890\n王靖程\t82891\n六方会谈\t82892\n余翠玲\t82893\n传送门\t82894\n平湖\t82895\n错点宠\t82896\nKannapolis\t82897\n朱秘\t82898\n冯奥\t82899\n慢跑鞋\t82900\n初中\t82901\n健健\t82902\n全体\t82903\n奉贤\t82904\n炸弹天\t82905\nhd版\t82906\n逃匿\t82907\nuchd\t82908\n球迷群\t82909\nbffdfgjghhhbdffnfcnfyofxhyqy\t82910\n徐艺冉\t82911\n呢战\t82912\n汉字来源趣谈\t82913\n影徒\t82914\n做工片\t82915\n隐约\t82916\n858057558\t82917\n女厕\t82918\nAujasb\t82919\n逯多觅\t82920\n复制品\t82921\n高海\t82922\nieji\t82923\nWhatshe\t82924\n无数无数\t82925\n先生\t82926\npirng\t82927\n永遠\t82928\n朱笔\t82929\n行版\t82930\nxxxxxxzDs\t82931\n一个十元\t82932\n羊鞭\t82933\n圖圖\t82934\n先男\t82935\n小度妳媽\t82936\n董明慧\t82937\n先甲\t82938\n语无伦次\t82939\n受助\t82940\n收视研究\t82941\n1x\t82942\n伊晓磊\t82943\n行物\t82944\n钱包\t82945\n统战\t82946\n一个十八\t82947\n牛肉馅\t82948\n大汗淋漓\t82949\n644665988998676667688888888\t82950\nBen\t82951\n一成一遍\t82952\nBel\t82953\n优我\t82954\n代练\t82955\n拔牙\t82956\n电暖器\t82957\nBed\t82958\n八十九十一百七\t82959\n抽等一下\t82960\n几亿二十五年\t82961\n面壁\t82962\n张港\t82963\n链接\t82964\nTropa\t82965\n抢匪\t82966\nvghvvhhcbg\t82967\n1月份\t82968\nthermos\t82969\n智谷华府\t82970\n两艘\t82971\n2.8%\t82972\nBeC\t82973\n还我可乐\t82974\n芥末蛋糕\t82975\n邱淑静\t82976\n插件\t82977\ngrowing\t82978\n御天弓\t82979\n备罪\t82980\n张清\t82981\n一百一百多\t82982\ncrazy\t82983\n刺葵林\t82984\nhippo\t82985\nxdgfw们\t82986\n杨培林\t82987\n516什24\t82988\n胖比\t82989\n走了吧\t82990\n喷嚏\t82991\n充一\t82992\n西岸文化走廊\t82993\n充下\t82994\npurs\t82995\n公用电话亭\t82996\n难缠\t82997\n杜迁宋\t82998\n卖给我行\t82999\n飘飘\t83000\n卤代烃\t83001\n工藤\t83002\n金项链\t83003\n四喜\t83004\n被控\t8300
5\n32点六\t83006\n上下文\t83007\n成仙\t83008\n抓粗\t83009\n真的很顾家\t83010\n集注\t83011\n鹿斑\t83012\n19：41\t83013\n5069\t83014\n嗯嗯麼麼躂\t83015\n188点\t83016\n色泽\t83017\n鹿文\t83018\nhwiddkdgkd\t83019\n大禹\t83020\nvvbgg\t83021\n沪电股份\t83022\n素万那普曼机场一路\t83023\n做作业不聊\t83024\n有我想\t83025\n黄思荣\t83026\n烈阳\t83027\n二二七零\t83028\n动心忍性\t83029\n川贵地区\t83030\n重庆绕城高速\t83031\n张峰波\t83032\nram4grom64g\t83033\n竹花\t83034\n破剧\t83035\n飞风云三\t83036\n二四两个\t83037\n房泽浩\t83038\n刊物\t83039\n未来的未来\t83040\n葛美辰\t83041\n酒工\t83042\n弱人\t83043\n弯腰\t83044\nGgy\t83045\n香猪\t83046\nhfFijC\t83047\n跟进来\t83048\n可富婆\t83049\nshenjinge\t83050\n楚留香\t83051\n#李宇春咪咕音乐汇\t83052\n下一星期六\t83053\n炉峰禅寺\t83054\n勾兑\t83055\n吃了了\t83056\n讲和\t83057\n洛克号\t83058\n双性人\t83059\nu人\t83060\n一个两三岁\t83061\n我妈\t83062\n老人们\t83063\n金色银华盛世昌\t83064\n泥灰\t83065\n两点35\t83066\n小腊\t83067\n接待者\t83068\n煌煌\t83069\n543210\t83070\n小腹\t83071\n小腿\t83072\nzaim\t83073\nzain\t83074\n隔日隔日\t83075\n不不过\t83076\nwoshishuoou\t83077\n预选\t83078\n123456790\t83079\n度密乖\t83080\n瑞湾公寓\t83081\n樱兔\t83082\n矛嘢\t83083\n1e\t83084\n心口\t83085\n机械表\t83086\n布其诺\t83087\n中心句\t83088\n13059726288\t83089\n志非\t83090\nfhxc\t83091\n一五六七八\t83092\n11年12月\t83093\n好睡\t83094\n固安\t83095\n固守\t83096\n倖\t83097\n叫风雨同\t83098\n六大\t83099\n不防\t83100\n索引量\t83101\n固定\t83102\n三利\t83103\n循规蹈矩偶\t83104\n扎样\t83105\n什么日\t83106\nvrrg\t83107\n问一问问\t83108\n年俩\t83109\n沈韵妹\t83110\n梦晓时间\t83111\n扎根\t83112\nGjfx\t83113\n雷斯之\t83114\n什么时\t83115\n马伊琳\t83116\n富尔玛\t83117\n生娃娃\t83118\n琪琪派\t83119\n疯了了\t83120\np2PM2.5\t83121\n29488744\t83122\n走今天\t83123\n百人场\t83124\n嘟嘟乳\t83125\nJudd\t83126\nmiyou\t83127\nbmmmmnmjicvj\t83128\n源来\t83129\n昂扬\t83130\nxpo\t83131\nhit哦秘\t83132\n亲爱下能农村合作医疗\t83133\nyuanguimei\t83134\n几号几\t83135\n洪晓大\t83136\n叫床声\t83137\n刘世娇\t83138\n下凡\t83139\n男银\t83140\n一分之八\t83141\n致词\t83142\n墩布\t83143\n蒹葭汉化组\t83144\n图书馆学\t83145\n青岛轮渡公司\t83146\nzyal\t83147\n王雨\t83148\n出租车\t83149\n王雪\t83150\noppol\t83151\n冒汗宝宝兔\t83152\n走南闯北\t83153\n李玫瑾\t83154\n土豆\t83155\n挖沙西瓦\t83156\n在天可以\t83157\n这家堡\t83158\ncarry\t83159\noppor\t83160\nyoukes\t83161\n王雷\t8
3162\n50多599\t83163\n578468237828236926383373\t83164\n腿麻\t83165\n困难受\t83166\n街机\t83167\nOsman\t83168\n王雄\t83169\n王雅\t83170\nDRinK\t83171\n干爆基友\t83172\n073466789443789996545\t83173\n串红\t83174\n5月6日\t83175\n46520元\t83176\nPost\t83177\n甘美\t83178\n博雅斗土主\t83179\n好吧现在不了了等一下\t83180\n20个小时\t83181\n主人\t83182\n各区\t83183\n富康学校\t83184\n库拐\t83185\n尽力\t83186\n四尼姆\t83187\n牙肿\t83188\n妃嫔\t83189\n叛逆\t83190\n三天三夜\t83191\n三成个\t83192\n叛逃\t83193\n放流\t83194\n想天天开心\t83195\n三波河\t83196\n纽约地铁\t83197\n12475\t83198\n12479\t83199\n释纪\t83200\nffrretr\t83201\n入度\t83202\n重师者王\t83203\n无奈无奈\t83204\n宋皓蓬\t83205\n无车日\t83206\n姑娘们\t83207\n点点点点点点点点点点点\t83208\n九条\t83209\n王成林\t83210\n咿呀呀呀呀呀\t83211\n赵丽华\t83212\n语句号\t83213\n第一对\t83214\nmuckraking\t83215\n十三万\t83216\n冰爆\t83217\n小白菜\t83218\n死率\t83219\n请别\t83220\n候士\t83221\n呢腿\t83222\n大帅锅\t83223\nbcfg\t83224\nbcfe\t83225\n高家\t83226\n垃佛\t83227\n魅惑不羁\t83228\n莲池讲坛\t83229\n充栋\t83230\n我是你什么你是我的优乐美\t83231\n锐志胎\t83232\n十三个\t83233\n高宁\t83234\n赤道的男人\t83235\n霸道总会\t83236\n13号晚上24点\t83237\n莱曼、埃布埃、斯奎拉奇\t83238\n班楠宇\t83239\n徳格县\t83240\n李晓燕\t83241\n请回话\t83242\n刀女\t83243\n54648434545554555457542545554554673\t83244\n几个赞\t83245\n冰爷\t83246\n林清玄\t83247\n070921\t83248\n分手吧哼\t83249\n第二十三\t83250\n决一死战\t83251\n卫生间\t83252\n木子美\t83253\n巴望\t83254\n临起度\t83255\n马6b\t83256\n专门儿\t83257\n可圈可点\t83258\n饰演女一号\t83259\n8月17日—19日\t83260\n呵呵谢\t83261\nffhjk\t83262\n沙洲\t83263\nffhji\t83264\ngfnn\t83265\n果冻度秘我的美\t83266\n早上7点30\t83267\n鎏金花\t83268\n番石榴糖浆\t83269\n花骨朵\t83270\n女摩\t83271\n贾乃亮\t83272\n工商部\t83273\n开门儿\t83274\nDcjudj\t83275\n一皮\t83276\n有关方面\t83277\n大方过\t83278\n在江边\t83279\n几再\t83280\n几册\t83281\n抱想你\t83282\n王楠\t83283\n哪麽\t83284\n39集\t83285\n米拉·库妮丝\t83286\n生产地\t83287\n周瑞哈\t83288\n虾米度\t83289\nHours\t83290\n康嘉睿\t83291\n几冠\t83292\n2523\t83293\n住用\t83294\n赋比兴\t83295\n嗯人\t83296\n11.6%\t83297\n门萍\t83298\n问一下\t83299\n男可女\t83300\n大头症\t83301\n六十五一点\t83302\n地坛医院\t83303\n有乜\t83304\n你有我哪知道我知道我哪知道我多少结婚你的\t83305\n忘乎所以\t83306\n瓜瓜瓜\t83307\n哈桑法赛\t83308\n接下是么\t83309\n我们的天使\t83310\n武道馆\t83311\n何智辉\t83312\n52547\t83313\
n肠胃病\t83314\n走一步再走一步\t83315\n狗草\t83316\n幻想画\t83317\n利泽简\t83318\n被正\t83319\n克拉斯诺达尔\t83320\n新军\t83321\n新农\t83322\nbbdbbd\t83323\n节电\t83324\n双网\t83325\n梁静茹\t83326\n三了一会\t83327\n史珍香\t83328\n两二十八颗\t83329\n有一次\t83330\n唾弃\t83331\n抽筋版\t83332\n李月春\t83333\n街车拍灰山师\t83334\n笨肥羊\t83335\n今天后天\t83336\n中华人民共和国中央人民政府\t83337\n千里目\t83338\nkdgd\t83339\n刘晶晶\t83340\n比埃尔霍夫\t83341\n隋朝\t83342\n巴拉巴拉巴拉巴拉巴拉巴拉\t83343\n呆子\t83344\n山路\t83345\n做出现\t83346\n夏旭\t83347\n藏丙火\t83348\n[真V5\t83349\n夏日\t83350\n初恋\t83351\n重五节\t83352\n铜门\t83353\n55555555555\t83354\n很利落\t83355\n34饭\t83356\n杨素琴\t83357\n来呀来呀一宿\t83358\n二百五十七块\t83359\n道牙\t83360\n听唱\t83361\n还乡\t83362\n救治\t83363\n13.2\t83364\nreading\t83365\n西藏\t83366\n小弟弟\t83367\n隋末\t83368\n偏色\t83369\n喜羊羊与灰太狼羊羊小心\t83370\n德拉托雷\t83371\n人机狗\t83372\n雪珠\t83373\n莫颜琪\t83374\n复燃\t83375\n一百零六块\t83376\n中点\t83377\n腐男呦\t83378\n别这样度秘\t83379\n硫酸雨\t83380\n怀揣雪\t83381\n北屯村\t83382\ne囊\t83383\n嚒嚒嚒嚒嚒嚒嚒\t83384\n周到哪我\t83385\n打杀\t83386\n来秀\t83387\n自恋爱上我\t83388\n跨省\t83389\n八门卡\t83390\n九九几\t83391\n面条儿\t83392\n嗯五\t83393\n杨智志\t83394\n不不我舍不得\t83395\n结婚生\t83396\n十二千克\t83397\n塞野\t83398\n大胆包天我可是女皇\t83399\n很正经\t83400\n大雪莲\t83401\n送死\t83402\n映画\t83403\n不绝\t83404\n１３世纪后\t83405\n度秘度秘侠盗了情好困难\t83406\nwoobui\t83407\n啵啵啵凌志文比窝大一岁嘻嘻\t83408\n大规模\t83409\n摩皮\t83410\n乔治\t83411\n1008629\t83412\nbcgngt\t83413\n原油\t83414\n块点\t83415\n孟飞乇\t83416\n素偶\t83417\n化痰\t83418\n款项\t83419\n匡永春\t83420\nqsb\t83421\n美式\t83422\n9月6日\t83423\n笑我看不懂\t83424\nqsh\t83425\nqsn\t83426\n身为\t83427\nqss\t83428\nqsr\t83429\nqsq\t83430\n叫卖\t83431\nvkgf\t83432\nvkgg\t83433\nvkgh\t83434\n好我当我当我当我当我的好\t83435\nqsx\t83436\n迷奸\t83437\n蒙古\t83438\n人而了\t83439\n裤带\t83440\n身世\t83441\n呼声\t83442\n15944842946\t83443\n电话\t83444\n相提并论\t83445\n卡啦\t83446\n丰润\t83447\n井宝\t83448\nyhuhuhhju\t83449\n塔陵\t83450\n老一\t83451\n醇香\t83452\n111切\t83453\n身下\t83454\n身上\t83455\n你好肥\t83456\n你本我\t83457\n陈美格\t83458\n没错过\t83459\nfffyf\t83460\nA罩杯\t83461\n能为你\t83462\n腐败\t83463\n7655543334566\t83464\n军旅\t83465\n沈阳市司法局\t83466\nme525\t83467\n方城二郎庙\t83468\n0.2元\t83469\n对背\t83470\n陆杉\t8
3471\ndhiphotosbaiducomxiaodupicitem50da81cb39dbb6fd3933b9540e24ab18972b3753jpg\t83472\n我敢么\t83473\n狠招\t83474\n胡罗卜\t83475\n平湖渡船桥村\t83476\n拉风\t83477\n六十分\t83478\n椰\t83479\n田田\t83480\n降压\t83481\n魏晨爱\t83482\n很满意信\t83483\n干嘛元\t83484\n度秘你好我是你的新主人我的\t83485\n偶正\t83486\n东北大碴子\t83487\n自怨自艾\t83488\n17.8\t83489\n阿得莱德\t83490\nN次\t83491\n选首\t83492\n元末\t83493\n大卸\t83494\n玩不都说\t83495\n若晓溪\t83496\nauvkj\t83497\n3899元\t83498\n自娱自乐\t83499\n大卫\t83500\n察雅\t83501\n锦州市工商行政管理局\t83502\nxoal\t83503\n里了里了里了了了了\t83504\ntYfc\t83505\n活着\t83506\n玩儿己\t83507\n大卡\t83508\n沉稳\t83509\nband\t83510\n大卖\t83511\n大南\t83512\n躺着\t83513\n大单\t83514\nbang\t83515\n杂办\t83516\n大华\t83517\n一百二千六六百三十二\t83518\n脚筋\t83519\n大半\t83520\n大午\t83521\n苤蓝\t83522\n石墨化\t83523\n混水\t83524\n1313133333\t83525\n我可以吗真的可以\t83526\n你好八一号\t83527\n17.2万元\t83528\nsgsish\t83529\n受阻\t83530\ncybibdtcuof\t83531\n堂堂\t83532\n红安县\t83533\n大明湖畔的少女云么QVQ\t83534\n電視台鼓皇皇提成V脫光光很不偷偷摸摸惹你了個不不不56\t83535\n2222222222222222222222222222222222222222222222222222222222222222222222222222222222\t83536\n掌上\t83537\n米兰南\t83538\n好的好的\t83539\n吗p2\t83540\n左拐右拐\t83541\n小微微\t83542\n宽宽\t83543\n1份\t83544\nabcdesca3jkllisnopqosuouyawasyasa\t83545\n宽容\t83546\n1件\t83547\n原状\t83548\n王芥\t83549\nω妞\t83550\n鸡性\t83551\n江心明天\t83552\n扇耳光\t83553\n一年左右\t83554\n4\t83555\n44秒\t83556\n卿本\t83557\n王芳\t83558\n方玫\t83559\n谭小闲\t83560\nQ点\t83561\n泾川二中\t83562\nｆｕｃｋ\t83563\n自毙\t83564\n我知道你是谁了你\t83565\n各族\t83566\n吗ps\t83567\nUps\t83568\n天王天后\t83569\n黑色素\t83570\n离子交换器\t83571\n效果i广仲vv下次吧得鬼地方贵妃发酒疯紅各愿你费事\t83572\n安雪梅\t83573\n都尼玛肿\t83574\n偷学\t83575\n离线\t83576\n砍断\t83577\nnice\t83578\n当作者\t83579\n电话局\t83580\n自毁\t83581\n乐奇奇\t83582\nacold\t83583\n圆菱形\t83584\n發表\t83585\nReeves驾摩托车\t83586\n韩希成\t83587\n独霸仙\t83588\n浅薄\t83589\n温玉\t83590\n荷兰猪\t83591\n三庚\t83592\n三庙\t83593\n454664\t83594\n552725272\t83595\n人名\t83596\n中印狐\t83597\nghcggggt\t83598\ngvxvdvz\t83599\nbhiphotosbaiducomxiaodupicitembba1\t83600\n宝贝乖我没闹\t83601\n陈增金\t83602\n医生天\t83603\n廉颇\t83604\n阿好\t83605\nJvhvjh\t83606\nguiug\t83607\n多贱\t83608\nnhhghgggf\t83609
\n很久以前\t83610\n欧帝尔\t83611\n粒粒\t83612\n新华农场\t83613\n臧否\t83614\nCxD\t83615\n零一点一点\t83616\n八零七零\t83617\n290334\t83618\n小冰山\t83619\n鲢鱼\t83620\n一茬\t83621\n非诚勿扰2\t83622\n增值税\t83623\nmarc\t83624\n小悠侠\t83625\n邱鑫磊\t83626\n片尾\t83627\n想中\t83628\n13点15\t83629\n月考\t83630\n40年后\t83631\n无分文\t83632\n衣食之忧\t83633\n侯妮妮\t83634\n123期\t83635\n麦哥\t83636\n多克\t83637\n徐困\t83638\nzxxxxx\t83639\n123服\t83640\n张南\t83641\n新华网\t83642\n二十二十多块\t83643\n行不你\t83644\n马耀威\t83645\n天长日\t83646\n马斯\t83647\n1257580\t83648\n归一\t83649\ncuimnnnn\t83650\n浅笑\t83651\n五边形\t83652\n折开\t83653\n姓资\t83654\n缺头不对马嘴\t83655\n动石动\t83656\n朱太贤\t83657\nhihelloa\t83658\n15804\t83659\n边长\t83660\n分泌物\t83661\n便面\t83662\n豆荚\t83663\n莫得里奇\t83664\n绝无双\t83665\n王龙狼龙\t83666\n落班水浒传\t83667\n冰火厨房\t83668\n离题\t83669\n好吧早安\t83670\n表彰会\t83671\n983jpg\t83672\nellarzcv\t83673\n24元\t83674\n20年后\t83675\ngxgc\t83676\n勤学会不会\t83677\njshjbbwxqxlhaizgxkvsndbtcxprjrxtdj2zhdobcaebucveushhlbhgp\t83678\n有无穷无尽\t83679\ngxgh\t83680\n如来\t83681\n服务站\t83682\nud度\t83683\n蚰蜒\t83684\n救人一命\t83685\n别里我\t83686\n鹤立\t83687\n四点钟\t83688\n祸首\t83689\n拍夜\t83690\n辣椒炒蘑菇汁\t83691\ngxgx\t83692\n撇清\t83693\nHHX\t83694\n七一一百四十五期\t83695\n伊通县\t83696\n炮\t83697\n炭\t83698\n炫\t83699\n吗头\t83700\n就这样\t83701\n亨通\t83702\n炳\t83703\n向井理\t83704\nshopping\t83705\n炼\t83706\n炽\t83707\n為\t83708\nxgt\t83709\n15787斤\t83710\n点\t83711\n九唱\t83712\n小猪情公寓\t83713\n壮观\t83714\n炅\t83715\n阿奇\t83716\n沃林\t83717\n即可\t83718\n炋\t83719\n炈\t83720\n炉\t83721\n炖\t83722\nQQ炫舞号\t83723\n炕\t83724\n炒\t83725\n问题不在孩子\t83726\n炞\t83727\n18856元\t83728\nzol\t83729\n干事儿\t83730\n13673076520\t83731\n692718\t83732\n5片\t83733\n用度\t83734\n要点赞\t83735\n下次\t83736\n反跟\t83737\n猛将\t83738\n大真\t83739\n隐应\t83740\nnichou\t83741\n一一一一一一一一一一一一一一一一一一一一一一一一一\t83742\nsijs\t83743\n警\t83744\nsiju\t83745\n你好准\t83746\n皮蛋\t83747\n臃枳\t83748\ngermeister\t83749\n向西方\t83750\n譏\t83751\nhttphhiphotosbaiducomxiaodupicitem8694a4c27d1ed21b31c09690aa6eddc451da3fbdjpg\t83752\n穿王\t83753\n徐盛程\t83754\n转入\t83755\n中世纪\t83756\ngtsssss\t83757\n胸膜\t83758\n5455588\t83759\n愁字\t837
60\n蓝线\t83761\n甲勇赛\t83762\n说出\t83763\n热拌行\t83764\nhgcdfg\t83765\n龙年\t83766\n艳照门\t83767\n吃饱了不想家\t83768\n长途\t83769\n秘信不信\t83770\n莲花宝\t83771\n没海\t83772\nagajtgdjjddjpatjtdgjtodingnixinporelopeiyinpogogopogoumelloheipohyesmqjiejbakljialdhaidaiopal\t83773\n诺基亚塞班\t83774\n申意\t83775\n骗人鬼\t83776\n晚间11点\t83777\n2ooo\t83778\n高凯峰\t83779\n二改\t83780\n手跟姐\t83781\nhttpfhiphotosbaiducomxiaodupicitem80cb39dbb6fd52664d4e3161ac18972bd4073659jpg\t83782\n不离不弃\t83783\n冰糖块\t83784\n精心\t83785\n九七风云再\t83786\n好多大\t83787\n徐章程\t83788\n作业板\t83789\n张岩\t83790\n写人\t83791\n姑刚\t83792\n轧戏\t83793\n六五级\t83794\n好多天\t83795\n學生\t83796\n搞太多\t83797\nDJD\t83798\n好丑好丑好丑\t83799\n两情\t83800\n说忘记\t83801\n长方形\t83802\n尼尼\t83803\n星期六过\t83804\n桥边\t83805\n大楠桑唱Step\t83806\n打雷辟\t83807\n长隆\t83808\n六天后\t83809\n指挥官\t83810\n便装\t83811\n155855357522737228\t83812\n前前后\t83813\n王胥阳\t83814\n女咯摸我找我我摸我魔我\t83815\nai50\t83816\nai52\t83817\n一萬零一夜\t83818\ngiujjfygxfyhufiydfhgfjfhhhdfcjcfefgdrf\t83819\n八零零年\t83820\n靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠\t83821\nBOBO\t83822\n器量\t83823\n太特麽\t83824\nfuxch\t83825\n情伤\t83826\n这就走\t83827\nBIBISKY\t83828\n卵毛\t83829\n污臭\t83830\n屠於男\t83831\n末段\t83832\n鹿倩\t83833\nOnly\t83834\nyyiyig\t83835\njjfffffffffffffff\t83836\n张回忆\t83837\nhurvhgg\t83838\n校运会\t83839\n五笔字根表\t83840\n球风\t83841\n睡怕\t83842\n酱菜\t83843\n主恩\t83844\nAdWords\t83845\n杨晴宇\t83846\njhlb\t83847\nQQ网\t83848\n硒望纳米茜\t83849\n鹿为马\t83850\n烧鸭\t83851\n2011年1月5日\t83852\n烧鸡\t83853\n一单元\t83854\n忆思\t83855\nCbfvjfj\t83856\n宁泽\t83857\n阴癖\t83858\nfueyv\t83859\n狗蚤公路\t83860\n快毒\t83861\n张采臣\t83862\n绝大科大德军\t83863\nTouchPad\t83864\nXcjjkj\t83865\n热死八婆\t83866\n靳鹏飞\t83867\n猪么CK\t83868\nhdn\t83869\n你好懂事\t83870\n叉烧\t83871\nvvygg\t83872\n哎呦妈呀这么不要脸\t83873\n宁波\t83874\n38822\t83875\n叫我真儿\t83876\nJln\t83877\n文祥\t83878\n口蘑\t83879\ngnigvbkkfggyrxub\t83880\n三四十年\t83881\n88888558888888\t83882\n校内\t83883\nhdi\t83884\nmilky\t83885\nbjjjjjui\t83886\n张皓翔\t83887\n不要烦我了我打思念\t83888\n朱永全\t83889\n行啦\t83890\n物理\t83891\n1300余名\t83892\n十几二十岁\t83893\n杨广\t83894\nhttpbhiphotosbai
ducomxiaodupicitem279759ee3d6d55fbb3cb6e536a224f4a20a4ddb1jpg\t83895\n倍增\t83896\n8868868888666\t83897\n455677644446777\t83898\n海泰大道\t83899\n伪迷\t83900\n左苗苗\t83901\n疙瘩\t83902\n山东一区\t83903\n死金彬\t83904\n西葫芦\t83905\n奥特曼黑猫你好啊你好\t83906\n爱甘心\t83907\n杨幕\t83908\n就象棋\t83909\n曾薇\t83910\n总榜\t83911\ndvcwbedbsvcs\t83912\nlabcd\t83913\n13102966881\t83914\n湯飯點\t83915\n300立方米\t83916\n晒黑\t83917\n附赠\t83918\n上首\t83919\n两百天\t83920\n13456789\t83921\n上香\t83922\n专一\t83923\n卡扎菲\t83924\n连线题\t83925\n专业\t83926\n挥挥手手\t83927\n范依涵\t83928\n八八六六百分之一\t83929\n第4期\t83930\ndey\t83931\nLED照明\t83932\n70％\t83933\n好呢好你好你好你好\t83934\n狍子\t83935\nder\t83936\ndes\t83937\ndet\t83938\ndeu\t83939\ndew\t83940\ndeh\t83941\ndei\t83942\ndej\t83943\n两百多\t83944\n用心甘情愿\t83945\ndeo\t83946\n石华为荣耀\t83947\ndec\t83948\nded\t83949\ndee\t83950\ndef\t83951\ndeg\t83952\n终极版\t83953\n松叶\t83954\n拜拜拜拜拜拜拜拜拜\t83955\n适中\t83956\n炸BMW\t83957\n骆沙鸣\t83958\n13899遍\t83959\nfiftrdyfdhj\t83960\n嘻嘻哈哈大笑\t83961\n独一一\t83962\n松口\t83963\n培训\t83964\n商业\t83965\n我不搞基\t83966\n三流\t83967\n高唐\t83968\n低能\t83969\n低胸\t83970\npdpe\t83971\n谭阿香\t83972\n放学再聊\t83973\n淑房殿\t83974\n网络派出所\t83975\n感觉你好萌\t83976\n王书远\t83977\n余桥丽\t83978\n达米安\t83979\n黄佳慧\t83980\n18712329688\t83981\n内蒙古大草原\t83982\n高唱\t83983\n韩沫白\t83984\nchhbvgh\t83985\n哪一个你\t83986\n梦话今生今世\t83987\nanmptwn\t83988\n更异类\t83989\n九世轮回篇\t83990\n老口\t83991\niversus\t83992\n追漫\t83993\nalps\t83994\n一斤半\t83995\neeerfftffgggghghhhhhhhmmuyvghhgffhghgdjdjjjhsjjjghhhffs\t83996\n一年了你\t83997\n无言独上西楼\t83998\n逆战小哆咪呀\t83999\ndoctor\t84000\n老叔\t84001\n陀子\t84002\n说明天再见没说明天记\t84003\n铜板\t84004\n证券法\t84005\n老友\t84006\n冠亚\t84007\npleasure\t84008\n116小时\t84009\n红黑军团回击#巴萨#\t84010\n钱宝网\t84011\n正海磁材\t84012\n梦神\t84013\n打拦\t84014\n幾智軒\t84015\n王九天\t84016\n走点\t84017\n冬儿\t84018\n鼓噪\t84019\n丢人现眼\t84020\n那个男人\t84021\n11.11.11\t84022\n李雨琪\t84023\n扎小\t84024\n也有一个人在\t84025\n蔡淳佳\t84026\n吴巧媛\t84027\n鞠婧\t84028\n妖成\t84029\n胡雪岩\t84030\n董啸天\t84031\n癌症\t84032\n五错\t84033\n认识天破晓\t84034\n花秋裤\t84035\n了好\t84036\n蒜油麻油\t84037\n吱吱\t84038\n晁治刚\t84039\n一知道\t84040\n迪
斯尼之旅\t84041\n港澳台侨委员会\t84042\n梅毒\t84043\n在考比\t84044\n刘雅洁\t84045\n人非\t84046\n收拢\t84047\n你了你是谁\t84048\n政模\t84049\n由乃\t84050\n巨无\t84051\n放票\t84052\n巨日\t84053\n困难\t84054\n瘦版\t84055\n耳畔\t84056\n涛\t84057\n星尚\t84058\nHmmm\t84059\n三十多少兆\t84060\neveryone\t84061\n15848317651\t84062\n瓦哈\t84063\n来来来来来\t84064\n慈圣\t84065\n接力赛\t84066\n和丁\t84067\n和一\t84068\n朴智妍\t84069\n新余学院文学院\t84070\n和丰\t84071\n21:15\t84072\n巴拉吉\t84073\n哈亲爱就是我\t84074\n和为\t84075\n李金铭\t84076\n心药\t84077\n喇喇\t84078\n退换货\t84079\n才薯哥\t84080\n岗北\t84081\n国平\t84082\n劫船晓得\t84083\n坐椅\t84084\n露里娃\t84085\n懝\t84086\n李苏苏\t84087\n用生\t84088\n吃饱就在\t84089\n说说着\t84090\n心不心不行\t84091\nNoworries\t84092\nhisidiei\t84093\n艾玛耶\t84094\n懒\t84095\n冰雪节\t84096\n局长们\t84097\n懊\t84098\n喜羊羊与是谁杨洋\t84099\n1359486610423754\t84100\n尽世\t84101\n苏安\t84102\n涪\t84103\n6xo\t84104\n白雪茫茫\t84105\n芋头\t84106\n激进\t84107\n王丽江\t84108\n顾春田\t84109\n懵\t84110\n金科\t84111\n懷\t84112\n安静的人\t84113\n金秋\t84114\n世纪联华\t84115\n懒七八糟\t84116\n鲁马\t84117\n郭蓉芳\t84118\ntyfhf\t84119\n金秀\t84120\n再接再厉勇往直前\t84121\n周嫁开\t84122\n两万94万一九点儿\t84123\n不少50种\t84124\n小漠\t84125\n丁子\t84126\n象征性\t84127\n南心北\t84128\n几顿饭\t84129\npdrnf\t84130\n笑门神\t84131\nFields\t84132\n弟弟样\t84133\n嗯瓮瓮瓮\t84134\n吃软饭\t84135\n悬浮\t84136\n强长\t84137\n十四二十三\t84138\n半夜1点\t84139\nfgdgdhegdfgd\t84140\n自挂\t84141\n斯巴达\t84142\n紫薇大美汤姆瑞\t84143\n小漓\t84144\n归摘\t84145\nmamo\t84146\nmama\t84147\n5555555555522222222222221111111111111\t84148\n新阶段\t84149\n七摸\t84150\n坚持不懈\t84151\n知识动物\t84152\n法自\t84153\n新春\t84154\n喷死\t84155\n海锦绣缘水煮活鱼\t84156\n才植物\t84157\nuNuDLeLe\t84158\n小璐璐\t84159\n裁委\t84160\n韩先生\t84161\n乒羽\t84162\n错乱\t84163\n了了妹夫的爸爸跟属\t84164\n南苏\t84165\n九十分钟\t84166\n售货员\t84167\n邮报\t84168\n陈正明\t84169\n新星\t84170\n5355555655\t84171\n南苑\t84172\n菲薄\t84173\n谮号\t84174\n迷门\t84175\n六十岁\t84176\n敲醒\t84177\nfhcggghvyggghhhygggg\t84178\n器哥\t84179\n雅芝\t84180\n真不难\t84181\n32589954\t84182\n梦雅\t84183\n克ab箱\t84184\n侗话\t84185\n我恨你看小超人你是小屁屁超人\t84186\n米非彼\t84187\n黄家乐\t84188\n酱醋\t84189\n五音\t84190\n1443876\t84191\n万丈深渊\t84192\n梦雨\t84193\n再来去哪了么样\t84194\n梦雪\t84195\n三零七
零五\t84196\n州郡\t84197\n央视四套\t84198\n矮房\t84199\n东厂\t84200\n愿得\t84201\n艰辛\t84202\n好多分钟\t84203\n黄子涵\t84204\nvdffffgdd\t84205\nSvsgsgshsf\t84206\n阳洞\t84207\n死地\t84208\n寄给你儿戏\t84209\n156327865\t84210\n爱疯四\t84211\n告诉你我没有\t84212\n左楠\t84213\n萌萌哒姐姐懂懂你个傻缺三\t84214\n俩多\t84215\n隔壁班\t84216\n俩处\t84217\n王嘉艺\t84218\n富士山\t84219\nkvgvvh\t84220\n新思路\t84221\n夏子昂\t84222\n任然\t84223\n斋月\t84224\n俩头\t84225\n说不可能\t84226\n欧我\t84227\n康莫\t84228\n宝龙\t84229\n光大咨会\t84230\nsdgijdggjjf\t84231\n齐小齐\t84232\n爱媚\t84233\n官职\t84234\n洗盘\t84235\n刘凯东\t84236\n俩天\t84237\n小爷\t84238\n小爱\t84239\n钱涛\t84240\n不走了陪\t84241\nnishijiqirenma\t84242\n小爽\t84243\n中国龙船打头阵\t84244\n相除\t84245\n卢月新\t84246\n秽火\t84247\n梁慈香\t84248\n高达\t84249\n雄踞\t84250\n7：59\t84251\n神圣教育\t84252\n匕七\t84253\n魔帝魔妃\t84254\n尼斯珠\t84255\n没优\t84256\n爱爱你在你在\t84257\n地老天荒\t84258\n上菜\t84259\n滝泽ロ\t84260\n1.5万吨\t84261\n导电\t84262\n六八二二\t84263\n娇贵\t84264\n阿特丽克斯·波特\t84265\n李博飞\t84266\n包间\t84267\n亲爱的呀\t84268\n植物大战僵尸电视频大家物大战僵尸\t84269\n32088219960409341\t84270\n宋太祖\t84271\n额度\t84272\n扣忍者\t84273\n没有了你\t84274\n呵呵杀杀杀杀杀杀杀杀杀杀\t84275\nOrchestrall公司\t84276\n麦基尔\t84277\n发色\t84278\n得了你\t84279\n姜片\t84280\n5月31号下午1点半至2点半\t84281\n大萝卜\t84282\n四五0点\t84283\n命令号\t84284\npoiuytte\t84285\n姚子豪\t84286\n批零\t84287\n7万个\t84288\n囊阳\t84289\n大和好可爱\t84290\n444444\t84291\n444445\t84292\n简简单单发\t84293\n15830051383\t84294\n一时不收拾\t84295\n蹦蹦跳跳\t84296\n成沙\t84297\n第九台\t84298\n沙烁\t84299\n人妖\t84300\n开没\t84301\n责任制\t84302\n脸子\t84303\n小剧场\t84304\n4444444444444444444\t84305\n三爻村\t84306\n58750hbvc885426868\t84307\n开沟\t84308\n小馆\t84309\n咋回事\t84310\n将计就计\t84311\n有点点点点点点点点点点\t84312\n无凭\t84313\n市委\t84314\n智美\t84315\n人妻\t84316\n姓木\t84317\nwaBn\t84318\n小健\t84319\n大脸猴\t84320\n雷歌\t84321\n要爱\t84322\nffffffffffffffffffffff\t84323\n55555\t84324\n55554\t84325\nggrid\t84326\n王君平\t84327\n不哭\t84328\n加醋\t84329\n多肉\t84330\n老戎\t84331\n安能\t84332\n塔头\t84333\n100万次\t84334\nohuifd\t84335\niko\t84336\n我是说啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t84337\nikm\t84338\n加工费\t84339\n多肽\t84340\n生事\t84341\n98元\t84342\n沙岗墟\t84343\n小馋\t84344\n为人师表\t84345\n安胎\t84346\n玉
带石\t84347\n百度影音播放器\t84348\n贵国\t84349\n忙萌\t84350\n毛鬓\t84351\n成一礼拜喝\t84352\n马奔腾\t84353\n岷县\t84354\n武装人员\t84355\nbalibali\t84356\n200米\t84357\nAngeIababy\t84358\n上课孩\t84359\nhihi哒\t84360\n策划\t84361\n徐子\t84362\n百尺\t84363\n小偶\t84364\n八十八五分\t84365\n陆小曼\t84366\n三点零五\t84367\nzeta\t84368\n温州市图书馆2楼\t84369\n霞之子\t84370\n蔡达标\t84371\n拉丁舞\t84372\n免礼拜\t84373\n锛锛\t84374\n受压\t84375\n国防部\t84376\n痛不好\t84377\nfrobf\t84378\n抹胸\t84379\nqpqpkwjsnndndbfbrhejznmznzzmKAOWOKEJEMEKSIKSKWOEKENDIDNEJDIDNEIENE\t84380\n苏联国\t84381\n妳萌萌哒\t84382\n秀贤\t84383\n战菁一\t84384\n20052002004\t84385\n隐居\t84386\n哎哟妈呀\t84387\n5025\t84388\n逆天你是哦我知道你是女的\t84389\n张人民币\t84390\nshshshs\t84391\n逆子\t84392\n晚安安\t84393\n5月13\t84394\n江湖岁月误前程\t84395\n临沂\t84396\n洋葱\t84397\n怪你啦\t84398\nWww\t84399\n一期个\t84400\n被拐\t84401\nFeel\t84402\n宜木\t84403\n临河\t84404\n被拘\t84405\n丫头村\t84406\n十三片\t84407\n宣城\t84408\n剥\t84409\ncantread\t84410\n争辩\t84411\n称发\t84412\n約\t84413\n九十九亿九千九百九十九万九千九百九十\t84414\n刚刚好刚刚刚刚刚刚刚刚\t84415\n洗漱包\t84416\n紊\t84417\n学英\t84418\n剩\t84419\n轻盈999ooiio\t84420\n看你不懂我那话不是我不懂你那话ok\t84421\n书丨塾\t84422\n屁话\t84423\n章昱瑶\t84424\nICBC\t84425\n明堡\t84426\n嘛德\t84427\n全锤\t84428\n上官轩武有雨\t84429\n素\t84430\n索\t84431\ngoodmrorring\t84432\n陆慧\t84433\n洪磊\t84434\ndrvg\t84435\ncaocolatio\t84436\n紫\t84437\n北京饮食街\t84438\n小点声\t84439\n洞丽君\t84440\n还行为爱着你\t84441\n是我不会\t84442\n三会\t84443\n一醉\t84444\n滤镜方\t84445\n奈斯吐\t84446\n梦游\t84447\n吃香甜\t84448\n当当郎\t84449\nchbj\t84450\n岙\t84451\n起算\t84452\n玉泉路\t84453\n落入\t84454\n楼母\t84455\n指明\t84456\n逼转\t84457\nchby\t84458\n春黄片\t84459\n移交\t84460\n毛吖\t84461\n一早的跆拳的一百分作业帮\t84462\n剖\t84463\n鲍大\t84464\n北京卫视\t84465\n求助信\t84466\n权柄\t84467\n查错\t84468\nyourealovely\t84469\njkvf\t84470\n河南京剧院\t84471\n落克\t84472\n娘娘娘娘娘娘娘\t84473\n大沽南路\t84474\n桃园北区\t84475\n岂\t84476\n李才\t84477\n七八百块\t84478\n刹车痕\t84479\n剜\t84480\n一本全\t84481\n魔女诺\t84482\n受侨\t84483\n废墟\t84484\n海峡都市报讯\t84485\n2010年2月\t84486\n李扬\t84487\n周六日\t84488\n与众不同凡\t84489\n榊原恒一\t84490\n啊寂寞我好寂寞你好讨厌你好讨厌我好寂寞我好寂寞你好讨厌你好讨厌\t84491\n凉水果\t84492\n试试子\t84493\n广东人\t84494\n十一点儿\t84495\n8885641613
73\t84496\n一个俩个\t84497\n价值链\t84498\n岷\t84499\n孤柏\t84500\n保费\t84501\n张刘\t84502\nsudheh\t84503\n百度有钱花真\t84504\nddyy\t84505\n唉度秘\t84506\n李老板\t84507\n脸上\t84508\n脸下\t84509\n张别\t84510\n你好像\t84511\n买者\t84512\n刘家堡\t84513\n1guys\t84514\n红杰队\t84515\n翰锅\t84516\n电市场\t84517\n2.rar\t84518\n剎\t84519\n管吃管住管\t84520\n刮痧\t84521\n阿迪达斯耐克\t84522\n省教育厅\t84523\nJpklk\t84524\n份量\t84525\n看上你了希望\t84526\nwdgj\t84527\n11111111111个\t84528\n干鲜\t84529\n说不稀罕\t84530\n别见我了好不好\t84531\n千万元\t84532\n天眼\t84533\n治愈者\t84534\n1449个\t84535\n尸身\t84536\n齐婧\t84537\n傻傻傻傻1444444\t84538\n张军英\t84539\n大战争\t84540\n亲我是你的主人\t84541\n政客们\t84542\n八十来分\t84543\n心坎\t84544\n好孩\t84545\n天真\t84546\n加深\t84547\n一荣俱荣\t84548\n波斯湾\t84549\n沧县\t84550\n妩媚\t84551\n是的主人我是你的对手\t84552\n清脆\t84553\n马眼\t84554\n豆瓣酱\t84555\n汪美茹\t84556\n你好度秘\t84557\n海猫儿\t84558\n两只小笨狗\t84559\n帅打\t84560\n13766544724\t84561\n老房子集团\t84562\n板板\t84563\n1825天\t84564\n气管炎\t84565\n透透\t84566\nbiaonim\t84567\n缺一不可\t84568\n喵滴說\t84569\n心踪\t84570\n岩\t84571\nsyrrulb\t84572\n心菜单\t84573\n度者\t84574\n想和你说\t84575\nGxx\t84576\n好犬泡泉泡泉\t84577\n傍晚6点\t84578\n专题片\t84579\n九百快\t84580\ndycub\t84581\n胡孝萍\t84582\n4可人九\t84583\n贾欣茹\t84584\n王道润\t84585\n动宾词\t84586\n姻脂\t84587\nmomozwo\t84588\n菲斯林\t84589\n350万分之一\t84590\n交我吧数\t84591\n在恋\t84592\n缠绵\t84593\n大洋洋\t84594\n转体\t84595\n无聊了你睡觉吧我不穿女儿哪口\t84596\n大洋洲\t84597\n全棉\t84598\nHwang\t84599\n珍视\t84600\n很紧张\t84601\n取取\t84602\n商务部be\t84603\nhrjkshdsho\t84604\nGzya\t84605\n300吨\t84606\nCome\t84607\n丁点儿\t84608\n袁佳孟\t84609\n厥\t84610\n厦\t84611\n太板\t84612\n厨\t84613\n董卓\t84614\n养长\t84615\n卖自夸\t84616\n厮\t84617\n了了了\t84618\n陆家\t84619\n深深浅浅\t84620\n厵\t84621\n厶\t84622\n16634\t84623\n去\t84624\n厼\t84625\n阿川\t84626\n厾\t84627\n县\t84628\n高香\t84629\n厂\t84630\n真正的好声音\t84631\n厄\t84632\n厅\t84633\n历\t84634\n董卿\t84635\n守恒\t84636\n厉\t84637\n压\t84638\n厌\t84639\n厎\t84640\n5分\t84641\n超级眼训\t84642\n厒\t84643\n撑伞\t84644\nneym\t84645\n85年代\t84646\n厘\t84647\nmycv\t84648\nuimyittai\t84649\n副唱腔\t84650\n孑恁\t84651\n电摩\t84652\n罗秀贤\t84653\n凸出\t84654\n李洁涵\t84655\n长见识\t84656\nnothanks\t84657\n出自
\t84658\nshiers\t84659\n孟婷婷\t84660\n400年后\t84661\n林志玲\t84662\n掉去\t84663\n月末\t84664\n大麦网\t84665\n堡们\t84666\n死变态\t84667\n祝君钱兔无量兔露\t84668\n陈楚河\t84669\n天津狗\t84670\nShudvnkydc\t84671\n嫁衣咪\t84672\n我七岁我的隐秘的秘密我七岁\t84673\n狂躁\t84674\n灰指甲\t84675\n没害羞\t84676\n红后代\t84677\n香港国际机场\t84678\n休闲战争\t84679\n成长\t84680\n不羡\t84681\n１号\t84682\n不群\t84683\n弥雅\t84684\n韩雨婷\t84685\n225班\t84686\n272522556\t84687\n有光\t84688\n有术人\t84689\nnalipif\t84690\n有兔\t84691\n二环\t84692\nCindy\t84693\n8cm\t84694\nstlelaly\t84695\n养儿防老&amp\t84696\n好羡慕\t84697\n12252\t84698\n不羁\t84699\n百多少\t84700\nzombieking\t84701\n有无问\t84702\n淘宝基金会\t84703\needkgojn\t84704\n沒砑\t84705\n往天说\t84706\n有兴\t84707\n汇仁\t84708\n有关\t84709\nxbj\t84710\n何其幸运\t84711\n一个多月\t84712\n龟岛\t84713\nxbf\t84714\n弥雾\t84715\n1245678910\t84716\n忙忙碌\t84717\n99秒\t84718\n张贝贝\t84719\n任总\t84720\n五峰铺\t84721\nyxyAxy\t84722\n步步高\t84723\n退回去\t84724\n杨小会儿\t84725\n甚少\t84726\n蛇博\t84727\n女歌手\t84728\n50多一个\t84729\nrhgnfbk\t84730\n垂炼\t84731\n阿耀\t84732\n翻唱\t84733\n薄荷汁\t84734\n转行\t84735\n小哈\t84736\n严肃们\t84737\n风文文网\t84738\n地佛男\t84739\n修葺\t84740\n小哇\t84741\n华山论剑\t84742\nHZD\t84743\nFhcf\t84744\n骚扰史\t84745\n554\t84746\n8620\t84747\nFhch\t84748\n雨盖\t84749\n新鲜血液\t84750\n没收到\t84751\n结束了\t84752\n诺贝尔\t84753\n告诉我告诉我告诉我告诉我告诉我告诉我\t84754\nTJOY\t84755\n一片汪洋\t84756\ngooglegoogle\t84757\nTROUBLE\t84758\n文综\t84759\n14米\t84760\n叶婷\t84761\n贾淞迪\t84762\n亡灵杀手\t84763\ndfdddsswfdf\t84764\n雅乐舞\t84765\n听话乖别闹\t84766\n根系师\t84767\n64578484848\t84768\n分说\t84769\n玉龙县\t84770\n前五天\t84771\n秘哥哥\t84772\nyuzhong\t84773\nfgccudgyytrtuxyxyxuueaaduhffigjhoturtdtyyrrwetgiskysorkurdtjdrtiarawisyiryydyiydrdrdryydrjydrrrsereaaesdxgxdtugfgyttttyyuutszghrhhufsxhhjjjj\t84774\n空隙\t84775\n卖照\t84776\n那个女人\t84777\n李嘉楠\t84778\n企乞\t84779\n来我家度秘\t84780\n占补\t84781\n蒋凤琼\t84782\n微微\t84783\n小时候\t84784\n正大司法班\t84785\n雨佳鹏\t84786\ngccxzf\t84787\n普通话培训网\t84788\n栋辉\t84789\n刘胜军\t84790\n老丈人\t84791\n慢性\t84792\n捕头\t84793\nffffttfd\t84794\n红楼梦\t84795\n防身\t84796\n前天晚上\t84797\n不锦绣\t84798\n15185642362\t84799\n方块字\t84800\n金瑞\t84801\n1339\t84
802\n控诉你\t84803\n西海岸\t84804\n薛咏涵\t84805\n黄山市\t84806\n1331\t84807\n1333\t84808\n榨汁机\t84809\n好不耐烦\t84810\nlife，你妹=Yo\t84811\n424963591\t84812\n呃二英\t84813\n中太\t84814\n十二分贝\t84815\n快乐环岛\t84816\n照相机\t84817\n明天早上6点10分\t84818\n等期待\t84819\n五分利\t84820\n直系师弟\t84821\n吟风\t84822\n出芽\t84823\n好不生\t84824\n刘乐\t84825\n地州海\t84826\nSBSBSBSBSBSBSBSBSBSBSBSBSBSBSB\t84827\nnnnnnnn差评nnnnnnn哦吼吼吼吼\t84828\n合集\t84829\ndbhs\t84830\nhrng\t84831\ndeveloper\t84832\n鸡酒\t84833\n李蚊香\t84834\ndanjo\t84835\n刘么\t84836\n爱戴\t84837\n疼痛\t84838\n化学家\t84839\n3w点儿\t84840\n正定\t84841\n顾吉强\t84842\n2008年后\t84843\n正宗\t84844\nMRYEYME\t84845\n彭氏兄弟继電影\t84846\n接线板\t84847\nE1a228q\t84848\n走风风火火\t84849\n始于\t84850\nVee\t84851\n公车\t84852\n正安\t84853\nPOLO\t84854\n司令员\t84855\nfyefg\t84856\n东莞\t84857\n娃娃鱼\t84858\nqqqa\t84859\n第118期\t84860\n秘書\t84861\n三重\t84862\n三里\t84863\n孝方\t84864\n4nn445455545554478906\t84865\n13063310018\t84866\n足九族\t84867\n奸黄容\t84868\nqqqq\t84869\n梳妆台\t84870\n夕之好\t84871\n粉末\t84872\n正宫\t84873\n周祖军\t84874\n掩目而新\t84875\n三金\t84876\n首大份\t84877\n60天\t84878\n愁寂意\t84879\n路西法\t84880\n老旦\t84881\n没错穿\t84882\nJET\t84883\nhrj\t84884\n22439\t84885\n朱星美\t84886\n肺泡\t84887\n锦绣江山上进心\t84888\n阮志柏豆豆\t84889\nk菌\t84890\n快乐上\t84891\nJEJ\t84892\nJEI\t84893\n编必发\t84894\n葛魅影\t84895\n法轮\t84896\n招儿\t84897\n显身手\t84898\n白领阶层\t84899\n一年内\t84900\n脱衣午\t84901\nsutter\t84902\nqqq3\t84903\n玉工\t84904\n王银那\t84905\n傲世啊美丽的秘密\t84906\n捉迷藏呀\t84907\nNicole\t84908\n国发\t84909\n梁生\t84910\n龙渤海\t84911\n狒狒\t84912\n忍着行\t84913\n丑好不好\t84914\n嘎嘎鸭\t84915\n猜谜缮\t84916\n好网\t84917\n石灰质\t84918\n15350571621\t84919\n30cn\t84920\n废话了给我缝小苹果给我发小苹果\t84921\n1527588566\t84922\n周永康\t84923\n莫高窟\t84924\n机器人家\t84925\n唐森\t84926\nxnc\t84927\n亲昵\t84928\nxnb\t84929\n心饭\t84930\n度赌\t84931\n牛桃耶\t84932\n切度秘\t84933\n部手\t84934\n气毛\t84935\n随队\t84936\ngggyyu\t84937\n封铭\t84938\n打苏宁指挥部\t84939\n2月14日\t84940\n神话片\t84941\n124816\t84942\n1500多\t84943\n张海山\t84944\n评头论足\t84945\n90亿\t84946\n1度米\t84947\n外婆屋\t84948\n13080516652\t84949\nworkw\t84950\n战国语\t84951\njjjmm\t84952\n卜功治\t84953\n奔向充满\t84954\n
范一燃\t84955\nhdhidhsgz\t84956\n80k八零\t84957\n成绩斐然\t84958\n太皇后\t84959\n正好\t84960\n悲欢离合\t84961\n蜜瓜\t84962\n单打独斗\t84963\n腾讯视频爱奇艺\t84964\n先和\t84965\n第一类\t84966\nTX滴契机\t84967\n石子\t84968\n爨察\t84969\n你好丑你好臭\t84970\n酥肉\t84971\nlP\t84972\n乐炫\t84973\n小爱六\t84974\n当月\t84975\n满天繁星\t84976\n未从小到大\t84977\nlf\t84978\nlg\t84979\nld\t84980\nle\t84981\nlb\t84982\nlc\t84983\nla\t84984\n咯子\t84985\n当机\t84986\nll\t84987\nlm\t84988\nlj\t84989\nlk\t84990\nlh\t84991\nli\t84992\nlv\t84993\nwhites\t84994\nlt\t84995\nlu\t84996\nls\t84997\n山奶奶\t84998\n太平盛世\t84999\ncriminal\t85000\nlz\t85001\n推越\t85002\n最后一班\t85003\nly\t85004\n车主\t85005\n延绵\t85006\n早上5点20\t85007\n石岭\t85008\n达曼\t85009\n海伦\t85010\n惠子女\t85011\n翻译官\t85012\n什么生\t85013\n丰沛\t85014\n6000000000\t85015\n132林\t85016\nLPC\t85017\n莹彻\t85018\n郑晓涵\t85019\ntjgj\t85020\n什么用\t85021\n鬼把\t85022\n周彬\t85023\ntjgl\t85024\n小脚裤\t85025\n100米\t85026\n孙寿\t85027\n麻匪\t85028\nl7\t85029\nl4\t85030\n不好啊你好漂亮\t85031\n不耐烦\t85032\n11111岁\t85033\niujkskjxaokrhuuhdckjhgghnjhgvbkkmllbhvfxvhvxfhhFGGVJJJZFHggxhjk\t85034\n英皇\t85035\n天秤倒茶\t85036\nhhaw\t85037\n卫兰福\t85038\n应聘者\t85039\nhhaq\t85040\n阳美\t85041\n来客气\t85042\n悦泰\t85043\n小幽默感\t85044\n莺飞草长\t85045\n拉加德\t85046\n1783535\t85047\n毛媛儿\t85048\n就不够\t85049\n2121121\t85050\n飞车号\t85051\n成手\t85052\n2005年10月初七\t85053\n崔红\t85054\n土产\t85055\nreview\t85056\n共一气儿\t85057\n7k7k戏\t85058\n土人\t85059\n文字版\t85060\n抗毒三字经\t85061\n两任\t85062\n15678720048\t85063\n罗里克\t85064\n气话\t85065\n两份\t85066\n二五六零\t85067\n薇妮\t85068\n嗯待\t85069\n干嘛杯杯\t85070\n季芗\t85071\n两件\t85072\n不语森林当当\t85073\n雨神\t85074\n两仪\t85075\ngtggricb\t85076\n摩根索\t85077\n张奥\t85078\n185402968\t85079\n季节\t85080\n嗯得\t85081\n嗯徐\t85082\n黄征太二\t85083\n打气筒\t85084\nnibushishuolr\t85085\n公诉\t85086\n嗯微\t85087\n苍南县\t85088\nDaisey\t85089\nmsu2828\t85090\n嘉瑞\t85091\n53120887\t85092\n最著名\t85093\n杨广场\t85094\n嗯德\t85095\n元浩\t85096\n冻疮\t85097\n驯化\t85098\n温哥华酒店\t85099\nuwpano弄弄弄弄弄弄弄弄弄弄弄弄000000\t85100\n1998年12月20号\t85101\n顶天立地\t85102\n特工类\t85103\n朕来了美人\t85104\n松江二中\t85105\n90000000001\t85106\n沈昌珉\t85107\n残局\
t85108\nosnsmwl\t85109\n法语\t85110\n星际熊出没环球城\t85111\n宝贝贝贝\t85112\n八几二十七八岁\t85113\n李雅萱\t85114\nCHANEL\t85115\n打一动\t85116\nUeeusi\t85117\n再讲\t85118\n挥手\t85119\n扯子\t85120\n佛法\t85121\n雯雯\t85122\n闹天宫\t85123\n中国戏曲学院地方戏本科班毕业展演\t85124\n9191929999\t85125\n购物袋\t85126\n王良倩\t85127\n再议\t85128\n边疆区\t85129\n好我摸着我\t85130\n吴君灿\t85131\n听守株待兔\t85132\n1937年7月7日\t85133\n不好不好意思\t85134\n五歌\t85135\n恩格斯\t85136\n光大\t85137\n22号\t85138\n闻型\t85139\n索要\t85140\n我是谁并不重要你是谁呀\t85141\n蛋疙瘩\t85142\n不我喜欢的人\t85143\n坨死尸\t85144\n小城\t85145\n光头\t85146\n五步\t85147\n炉石传说\t85148\n陈太丘\t85149\n光复\t85150\n个人征信沒\t85151\n纯调\t85152\n问题箱\t85153\n啰啰嗦嗦\t85154\n神车\t85155\nGBU\t85156\n四辆\t85157\n打预防针\t85158\nfrfb\t85159\n五十多个\t85160\n曾久久\t85161\n后继有人\t85162\n中粮\t85163\n#天天向上#\t85164\n连环计\t85165\n黑仔\t85166\nZraq\t85167\nGBQ\t85168\n了了了了\t85169\n10日凌晨\t85170\n伤悲\t85171\n阴招\t85172\n平阳\t85173\n破碎\t85174\n鹏鹏鹏\t85175\n25日晚上9点\t85176\n496\t85177\n并成\t85178\n老伙计\t85179\n1秒钟\t85180\n顾漫\t85181\n顾漠\t85182\n碧波\t85183\nFHCHSIA\t85184\n8：25\t85185\n数十名\t85186\n1335906\t85187\n我是我爸女儿\t85188\n8531111111111111111111111111111111111111114411111\t85189\n地藏殿\t85190\nkklvklw\t85191\n葛磨\t85192\n穆巴拉克\t85193\n正己烷中毒\t85194\n另一个男孩\t85195\n兵器\t85196\n射手座\t85197\nKings\t85198\n王鸿培\t85199\nYahoo\t85200\n650平方米\t85201\n一月十四日\t85202\n看见你好\t85203\nvv个vi个v股v股v股v个vu个vuvu个vu个vu个v股v个\t85204\n度秘我是说的我我是跟你说真的你给我\t85205\n吴少轻\t85206\njvtf\t85207\n世事\t85208\n唐人封建论\t85209\n三千多年\t85210\n护腕\t85211\n33号\t85212\n移夜不闭户\t85213\n不要脸臭鸡蛋臭度秘臭不要脸\t85214\n圆上\t85215\n像会不会\t85216\n66880\t85217\n鹅鹅鹅曲项向天歌\t85218\n455181845161\t85219\n传输下\t85220\n返利\t85221\n毁于一旦\t85222\nchate\t85223\n居民区\t85224\n59元\t85225\n胡巴猫\t85226\n日出日落\t85227\n真难看\t85228\ntooftaboontfhkiyhyfghnjyhhkiuyhhjjbjjjnjh\t85229\n你是我的小呀小苹果玖晓天边最美的云朵\t85230\n190岁\t85231\n狠念\t85232\n吻庄\t85233\n对呀可萌\t85234\n这事\t85235\n吃光光\t85236\n托卡依\t85237\n石头侠\t85238\n人太黑\t85239\n脑公我不骗人哈\t85240\n张运浩\t85241\n副总经理\t85242\n澎湖湾\t85243\n这些\t85244\n海敏\t85245\n流量包\t85246\n言教\t85247\n那个时候\t85248\n免签\t85249\n又说说\t85250\n见过\t85251\n嘛局\t85252\n娉美\t85
253\n特种部\t85254\n笔触\t85255\n相特\t85256\n茜少\t85257\n宽带没网电视\t85258\n火粉\t85259\n拍拍拍拍拍拍拍拍拍拍拍拍\t85260\n轰炸基\t85261\n2x9x33点\t85262\n23连\t85263\n志同道合\t85264\nucweb\t85265\n呐^\t85266\n陈老师\t85267\n爱我\t85268\n九零年\t85269\n十三十三\t85270\n苹果族\t85271\n五知\t85272\n杜敏\t85273\n宋英杰\t85274\nFIFA\t85275\n一听\t85276\n四边\t85277\n张博阳\t85278\n对呀不然\t85279\n一吧\t85280\n日本拓殖大学\t85281\n平淡的歌\t85282\n不明我就喜欢\t85283\n新一轮\t85284\n一吻\t85285\n3点\t85286\n中华人民共和国\t85287\n子孙\t85288\nLoMmg\t85289\n红绳\t85290\n一吱\t85291\n一后\t85292\n一名\t85293\n一同\t85294\n卢米\t85295\nbbbjlmt\t85296\n哞哞哞\t85297\n不间断地\t85298\n1234509\t85299\nicecream\t85300\nkasumi\t85301\n8万元\t85302\n一吗\t85303\n实干\t85304\n绕口令行\t85305\n5000元\t85306\n凯国\t85307\n一向\t85308\n第一季度\t85309\n尽业\t85310\n花圃\t85311\n摆明\t85312\nkdzkzkdkkdzkfkfkxkxk\t85313\nt啦deny五up死G11啊CK罗门my五5hi\t85314\n花圈\t85315\n中山大道\t85316\n朋友说\t85317\n菁英\t85318\n茶油\t85319\n莱斯特咪q1\t85320\n最后一天\t85321\n尽不\t85322\n杜酿\t85323\n伏羲式\t85324\n嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪度秘秘\t85325\n毒草\t85326\n花圣\t85327\n3113473692\t85328\n宗法\t85329\n五洲\t85330\n新舞\t85331\n新舟\t85332\n花地\t85333\n自由的人\t85334\n路牌\t85335\nniyvb\t85336\n斯曲\t85337\n秒审\t85338\n孔妈妈\t85339\nAV黄土\t85340\n十五级\t85341\n爱沙尼亚\t85342\n干嘛哩花\t85343\n聊会儿天儿快快快快快聊\t85344\n懂点\t85345\n微微一笑很倾城\t85346\n6535264656463565352\t85347\n六月天\t85348\n50第二周\t85349\n哎呀妈呀真没想到\t85350\n广铁\t85351\nuipkorot\t85352\n生之春\t85353\n1..3..755..160..3\t85354\n八块\t85355\n一开一下\t85356\n132457584582384569555万元\t85357\n昌贝贝\t85358\n与此\t85359\n李敏槁\t85360\n称称\t85361\n谈话类\t85362\nIKD\t85363\nsbssb\t85364\n李林成\t85365\nsbssd\t85366\n俯卧撑\t85367\n6548655\t85368\n26集\t85369\n中西餐点\t85370\nCenter\t85371\n第30天\t85372\n陆亚琴\t85373\n张远之\t85374\n6959\t85375\n按二比\t85376\nnbaorfisasy\t85377\ngghcghy\t85378\n施定柔\t85379\n6956\t85380\n六查克\t85381\n嬛嬛\t85382\nngtfdd\t85383\n功夫熊猫了拜拜小度秘\t85384\n赵鹏\t85385\nysm\t85386\n哦翅\t85387\n多多缘\t85388\n军用\t85389\n为安\t85390\n通点\t85391\n南无阿弥陀佛\t85392\n猪操蛋\t85393\n十一点钟\t85394\n为宝\t85395\n为宜\t85396\n为定\t85397\n迪莫\t85398\nyst\t85399\n范羊癫疯\t85400\n买房\t85401\n左右侧\t85402\n怀抱\t85403\njpgd\t8
5404\n黑鸭\t85405\n恩里基\t85406\n铿锵有力\t85407\ngotscaredof\t85408\n惊心动魄\t85409\n看不死\t85410\n远逝\t85411\n黄龙美\t85412\n帝国\t85413\n搞不来\t85414\nDHHSJQJDIX\t85415\n机械化\t85416\n045785511989998O66\t85417\n沄\t85418\n七位\t85419\n恩不饿\t85420\n204点儿\t85421\n杜一宸\t85422\n两点周\t85423\n结婚日\t85424\n百度公司\t85425\n黑鸟\t85426\n宁阳\t85427\n大名府\t85428\n上华\t85429\n56464644\t85430\n上半\t85431\n＋54546544＝\t85432\n上午\t85433\n241268\t85434\n秦州区\t85435\n上升\t85436\n逍遥津\t85437\n13904791478\t85438\n小卖铺\t85439\n对啊你\t85440\n气丸\t85441\n乐动力\t85442\n报复射\t85443\n滾滾滾\t85444\n上单\t85445\nFTI\t85446\n电话手\t85447\n给我看看\t85448\njvyzebvyvybyvyvyvyvtvgvvyvyvyvnnnnnnnnnnnnnnnnnnnnnnnnnnnnnngvgvtttctvtvtcrcctctttccrctnnnnnnnnnnbuvhybuvybubugybnnnnnnnddcctvtvtvtvtvtvnnnnvtvtvvtvtvtvrtncnncrcrnctcttctccvngvtvttvtvtvtvrvtvttvnnnnvtctcttvtctvtvrvtctctctctc\t85449\nGold\t85450\n我的生辰八字\t85451\n美容隆\t85452\n技术\t85453\n說點\t85454\nhol\t85455\n扣币\t85456\n今世缘\t85457\n营业营业厅\t85458\n8477点半\t85459\n梯间\t85460\n不最讨厌\t85461\n处于\t85462\n闹闹闹\t85463\n处事\t85464\n度秘你好我世如玉\t85465\n美仙\t85466\n评出\t85467\n秘尔\t85468\n第九遍\t85469\n郑亚飞\t85470\n佛头\t85471\n嘟噜\t85472\nIbu\t85473\n美价\t85474\n货源\t85475\n明初\t85476\n秘将\t85477\n佛大\t85478\n清屏梅\t85479\n门科\t85480\nKHF\t85481\n四点7\t85482\n水版\t85483\n誆\t85484\n松晓\t85485\n雪茄\t85486\n一百遍\t85487\n周昌旭\t85488\n雪茜\t85489\n真信\t85490\n水牛\t85491\n老老老噢老\t85492\n雞嗎\t85493\n骂道\t85494\n2010级\t85495\n许筠梓\t85496\n通山\t85497\n560240276132\t85498\n瞌睡觉吧\t85499\n笑笑笑笑\t85500\n张小飘\t85501\n龙之谷\t85502\n搜集\t85503\n粪世嫉俗\t85504\n26还款日\t85505\n鲍尔默\t85506\n雪茹\t85507\n青春美少女\t85508\n真俊\t85509\njustafter\t85510\n家产\t85511\n很难说出\t85512\n习净平\t85513\n辅食\t85514\nmm5\t85515\n想太多\t85516\n准准\t85517\nRJRJfjfj\t85518\n乘于\t85519\n家亲\t85520\n伊晓\t85521\n额真心\t85522\n饿死我是个女到不是你\t85523\n几内亚\t85524\n盛况空前\t85525\n家人\t85526\n热像仪\t85527\n丛林\t85528\n刺豚\t85529\n乘人\t85530\n饥饥\t85531\n伊普\t85532\n家事\t85533\n千五卖把你的脸小\t85534\n10000元\t85535\n张满乐观不得\t85536\n十一轮\t85537\n陶芳芳\t85538\n聊时\t85539\nmijt\t85540\n66278914\t85541\n懂不我\t85542\n出书\t85543\nSTAGE\t85544\n我是你四一不是我死\t85545\n公有
化\t85546\n马厩\t85547\ngufu\t85548\n桃红\t85549\n不感\t85550\n凝固的爱\t85551\n一亿六千八百六十四岁\t85552\n两米高\t85553\ngufy\t85554\n还纸\t85555\n南征\t85556\n相棋\t85557\n7日下午5点\t85558\n不愁\t85559\n还线\t85560\n不意\t85561\n步阳\t85562\n大主宰之灵\t85563\n桃纸\t85564\n不愉\t85565\n卖走\t85566\n百怪\t85567\n孙梦颖\t85568\n熙景\t85569\n油麦菜\t85570\n曲水\t85571\n再说一句话\t85572\n送算\t85573\n诺贝尔医学奖\t85574\n霍玉程\t85575\njgfugfaen\t85576\n圆明园\t85577\n价值连城\t85578\n好啦好啦你不非礼的我\t85579\n明后日\t85580\n选择题\t85581\n李小妖\t85582\n奥林匹克公园\t85583\n45885886\t85584\n鸦雀无\t85585\n惹急\t85586\n赞好\t85587\n人不行么样本钱多多长了么事\t85588\neujs\t85589\n胖哥哥\t85590\n克塔黑大戏答\t85591\n中华民族\t85592\n念笙\t85593\n张倍宁\t85594\n六七十\t85595\nGIGGS\t85596\n婧婧\t85597\nQQOK\t85598\n心情好我心情不好\t85599\n赵梓芯\t85600\nangleababy\t85601\nhi歌\t85602\n馔\t85603\n首\t85604\n馒\t85605\n1687457\t85606\n江一疡\t85607\n低落一算\t85608\n香\t85609\n静寂\t85610\n馄\t85611\n馅\t85612\n馆\t85613\n先是\t85614\n紫萱\t85615\n好潮\t85616\n琼楼玉宇\t85617\n小西区\t85618\n上世纪50年代\t85619\n馊\t85620\n闵佳伟\t85621\nwisb\t85622\n代步车\t85623\n别生气\t85624\n455555555568\t85625\n经检\t85626\n大殿\t85627\n梦见红了钱\t85628\n五九来客\t85629\nwiss\t85630\n焊缝\t85631\nwshye\t85632\n来再来\t85633\n馬\t85634\n尹娥\t85635\n二分之一第二天\t85636\n馨\t85637\n馫\t85638\n15k丘\t85639\nHome键\t85640\n一朝春梦\t85641\n复地\t85642\n活宝\t85643\n马蜂\t85644\n名言警\t85645\n想必\t85646\n蒋玉君\t85647\n清高\t85648\n睡不然\t85649\n恒心\t85650\n新网球王子OVA第三季\t85651\n11顿\t85652\n红报\t85653\n想念\t85654\n热裤\t85655\n郴吾\t85656\n凯立德\t85657\n贞贞\t85658\n6466253\t85659\n上苍差\t85660\n今天夏天\t85661\n猪猪猪猪你就是一头猪朱猪朱猪朱猪\t85662\nqwqqwq\t85663\n陪单\t85664\n医大一医大\t85665\n⊕再见再见再见不是\t85666\n金鹰卡通足球队\t85667\n闹子\t85668\nhyggn\t85669\n不好呀呀\t85670\n18384692198\t85671\n发帅\t85672\n小浣熊\t85673\n退团\t85674\n21天\t85675\n0987654331\t85676\n眼到口\t85677\n鸡狗\t85678\n最傻的人\t85679\n好人们\t85680\n当耳旁风\t85681\nmtwjjjmttjtj8twjjjjjttmjjwjtjgajpgajjgjjjjjjjjjmgptwBengajtpadm\t85682\n骷髅\t85683\n饿波\t85684\n发帖\t85685\n男土女土--双土夫妻好姻缘\t85686\n摄影师\t85687\n二维\t85688\n王光西\t85689\n爱琴海\t85690\nYCYDFU\t85691\n跑位\t85692\n发带\t85693\n退回\t85694\n火麻晋尧\t85695\n汇集\t85696\n端正\t85697\n当时\t85698\n小红
原\t85699\n汤煲仔\t85700\n拜拜偶\t85701\n混淆视听\t85702\n准备好了\t85703\nhats\t85704\n见面会不会\t85705\n我的青春\t85706\n出版署\t85707\n15657nnn156579883930\t85708\nhata\t85709\n其四\t85710\n拉格\t85711\nhate\t85712\n辩题\t85713\n换底\t85714\n8805\t85715\n给你什么样你的孩子\t85716\n眉笔\t85717\n李八字\t85718\n对呀星期六我星期六\t85719\n徐远征\t85720\n把上\t85721\n女人与女人\t85722\nyyudy\t85723\nisusg\t85724\n再接再砺\t85725\nｘｏ\t85726\nbvfgh\t85727\n你是疯子你是疯子你是疯子你是疯子\t85728\n李诗然\t85729\nKzSJ\t85730\n东京20发表会\t85731\ncelviyg\t85732\n语气场\t85733\n缝纫机\t85734\n温柔的度\t85735\n开创者\t85736\n八四三三八九幺\t85737\n那一天\t85738\n妈妈声\t85739\n一季节\t85740\n旱船\t85741\n那一夜\t85742\n味精\t85743\n5igg\t85744\n三权物\t85745\n4段\t85746\n皇带鱼\t85747\n小码\t85748\n何莹婷\t85749\n奶子\t85750\n蒋志伟\t85751\n你好呀小度秘\t85752\nenjoy\t85753\n廿\t85754\n准确度\t85755\n香盒\t85756\n一点都不可爱你好丑好兰好坏\t85757\n车军\t85758\n元宵灯会小记\t85759\n喽度秘\t85760\n三.一\t85761\n蓝尾\t85762\n眼电视\t85763\nbuff\t85764\n二十厘米\t85765\n背闺怨\t85766\n门厅\t85767\n李佳\t85768\n肖雨晴\t85769\netrfv\t85770\n小魔女\t85771\n等下你讲姑事给我ok\t85772\n散火\t85773\n张冠李戴\t85774\n廠子\t85775\n帅锅美驴\t85776\n明暗\t85777\ntxytdurtkuud\t85778\n6年\t85779\n一帮一个\t85780\n来来历\t85781\n阿巴拉克\t85782\n结仇\t85783\n行不度\t85784\n小玩笑\t85785\n电宝\t85786\n多面王后和你\t85787\n#恶魔剪烫工作室\t85788\nmwtjmt\t85789\nstudio\t85790\n方心\t85791\n郭月g\t85792\n身经\t85793\n随和\t85794\n哇度秘\t85795\n黄红孩儿煌黑呀你\t85796\n谢秋香\t85797\n好人儿\t85798\n电家\t85799\n今天晚上19:30\t85800\n离情别意\t85801\n扭吧\t85802\n刘德宇\t85803\n下线\t85804\n电容\t85805\n林百艳\t85806\nDDODTT囧TTDOySOzSO\t85807\n电室\t85808\n过套\t85809\n冰哥\t85810\n姜美珠\t85811\n指向性\t85812\n眼液\t85813\nchateau\t85814\n三文鱼\t85815\n一等奖\t85816\n尽心思\t85817\n汪雨欣\t85818\ncoser们\t85819\n安平\t85820\n田子正\t85821\n友爱\t85822\n告诉我最好\t85823\n崂山理工\t85824\n废话能不寂寞夏日\t85825\n守着数\t85826\n不要你你来\t85827\n张寒兵\t85828\n市集\t85829\noooo\t85830\n挥泪\t85831\n13431017996\t85832\nhfdgjdfy\t85833\n接档\t85834\n艾克里里\t85835\n#牛灿#\t85836\nｙｙ届\t85837\n错峰\t85838\n第七张\t85839\njoi3\t85840\n水洗\t85841\n2010年12月\t85842\navicourt\t85843\n东呢大暖\t85844\n佳句\t85845\n左爪\t85846\nnostoffi\t85847\n技术贴\t85848\n123455432112745616\t85849\n来品\t85850\n冬蜜
冬蜜冬蜜的我找一些贝贝贝贝\t85851\n20世纪40年代末\t85852\n受不了\t85853\n千零三2\t85854\n安县县委\t85855\nmoil\t85856\n加斯克\t85857\n多财大度\t85858\njoii\t85859\njoin\t85860\n194250\t85861\n武使徒\t85862\n有眼无珠\t85863\nonit\t85864\nc了我等你\t85865\nty君\t85866\n挂号吧\t85867\n何念旧\t85868\nbjsx\t85869\n七零六零\t85870\n写实主义\t85871\n500万\t85872\n千哥\t85873\n建房\t85874\n沫\t85875\n南湾\t85876\n五五零\t85877\n卖我告诉你\t85878\n栗家泽\t85879\n点工\t85880\nP民\t85881\n寒岭镇\t85882\n笨狼\t85883\n罩怀\t85884\n同仁\t85885\n餐盒\t85886\n闻王麟\t85887\n南湖\t85888\n张灵睿\t85889\nDoloveme\t85890\n挺胸\t85891\n杨翔\t85892\n你偶吧美让你偶吧\t85893\n我真的不理你了你再也不是我的朋友了我恨死你\t85894\n同价\t85895\n猪蜘蛛\t85896\n放不开\t85897\n广度\t85898\n笨狗\t85899\n牛利锋\t85900\n扣掉\t85901\n肥胸\t85902\n吃不死\t85903\n三三八三九\t85904\n王豪差\t85905\n丁翔宇\t85906\n几天大多\t85907\n零九幺零\t85908\n式儿\t85909\n哎呀妈呀你真的听不懂人话\t85910\n12345144\t85911\n阿燕萍\t85912\n牵念\t85913\n剑雨\t85914\n说了了\t85915\n开学克学f会\t85916\n2xy3yx\t85917\n口语交际\t85918\n听看\t85919\n接连不断\t85920\n细管\t85921\n赵宝顶\t85922\n巴达克\t85923\nabcdx9dcbanabcd\t85924\nTHK\t85925\n奥奇传说\t85926\ncffg\t85927\n北京市地税局\t85928\nvwhiziuv骂初一嗯嗯npqrstuvwhiz\t85929\ncffc\t85930\n恶心机器人\t85931\n器物\t85932\n08：30\t85933\n创世纪\t85934\n一1Z\t85935\n坏妈妈\t85936\n莫一烈\t85937\n一起死\t85938\n命题\t85939\n善科ou\t85940\nHero\t85941\n种植园\t85942\n文殊菩萨\t85943\n20多名\t85944\nHere\t85945\nstorageemulated0baidusearchboxdownloadsu298549439795969043fm21gp0jpg\t85946\n一1x\t85947\n江油\t85948\n65785548\t85949\n四月二十八号\t85950\n建筑物\t85951\n树大招风\t85952\n胡娅兰\t85953\n闪光点\t85954\n让路\t85955\nfinish\t85956\n四八拉\t85957\n永丰\t85958\n渎职\t85959\n沈氏\t85960\n碟刹\t85961\n美无痕才\t85962\n海欧\t85963\nAlyssaarce\t85964\n中央空调\t85965\n身体语言\t85966\n123456789101112点\t85967\n许子路\t85968\n苦思甜\t85969\n徐海洋\t85970\n钱云\t85971\n小师\t85972\ntddggxfjhggcffgzgkjgddjvxkkbcvvvbjjffifhhffjkgghb\t85973\n11千米\t85974\n美男们\t85975\n一11\t85976\n计春将\t85977\n一17\t85978\nFFdididkdk\t85979\nvxss\t85980\n而去\t85981\nS248\t85982\n幸运的人\t85983\n7urbo\t85984\n信系\t85985\n盲目\t85986\nwsm\t85987\nbggj\t85988\n十二个月\t85989\nerararara\t85990\n落水狗\t85991\nhttphhiphotosbaiducomxiaodupicitem0eb30f2
442a7d93377bff206aa4bd11373f001abjpg\t85992\njlnwe\t85993\n痛爱\t85994\n童心\t85995\n高级度\t85996\nwsp\t85997\n中游\t85998\n女儿子\t85999\n好屋\t86000\n围场\t86001\n天涯老妖婆\t86002\n弥可金\t86003\n餵了食\t86004\n2万\t86005\n即便\t86006\n一千帅\t86007\n李林佳霓\t86008\n19点30分\t86009\n宅居行\t86010\n连填表\t86011\n熊市\t86012\nnjiuuy\t86013\n一一八一百位\t86014\nK800来了#\t86015\n69元\t86016\n攀缘\t86017\n刚小希\t86018\n放寒\t86019\n逆天鬼节个\t86020\ncstbjs\t86021\n蜿蜒\t86022\n很不错我真\t86023\n黄灿灿\t86024\n请来人取\t86025\n文哥\t86026\n说不认识\t86027\n润喉糖\t86028\n超女\t86029\n不对干呕猜\t86030\n我等\t86031\nv库夫\t86032\n7j一一\t86033\n2名\t86034\n解放南\t86035\n夹击\t86036\n千次万次\t86037\nerfc\t86038\n曹颖波\t86039\n微语\t86040\ngjgdmgmt\t86041\n面汤\t86042\n铁面无私\t86043\n玩机\t86044\nt888\t86045\n呃故事半朵大红花指半朵大红花\t86046\n条猪\t86047\n辉腾战斗\t86048\n燕子燕子\t86049\n余下色\t86050\n一期二期\t86051\nno　　on　oo\t86052\n阿迪王\t86053\n丑寐\t86054\n落巢\t86055\n#Kristen\t86056\n快点告诉我新年的英语小作文\t86057\n对酒\t86058\n浮夸风\t86059\nfufuufrgtjidsduhvhihfytgstdyfjygTGDYFH\t86060\n二万辆\t86061\n22222210\t86062\nwwdgj\t86063\n蒙氏堡\t86064\n小三阳\t86065\n骗不了\t86066\n大卫天\t86067\n蜡笔小新人\t86068\n包哥\t86069\n反应管点\t86070\n死再说\t86071\n余胜君\t86072\nyyhnxjh\t86073\n挨撞\t86074\n653\t86075\n怎么回事儿\t86076\n一百四一百\t86077\n郭子牛\t86078\n13812818148\t86079\n泽文\t86080\n熊出没的歌\t86081\nzxv\t86082\n错你是我的闺蜜\t86083\n藿香\t86084\nWogeinicaige\t86085\n谈不谈\t86086\n流鼻涕\t86087\n琪琪\t86088\n7n00ncxccfc\t86089\n谜题\t86090\n18971756918\t86091\n重力一切\t86092\n今日凌晨2：30\t86093\nzxz\t86094\n征途\t86095\n八卦侠\t86096\n真不像\t86097\n多两句\t86098\n缘由\t86099\nx13\t86100\n闫老师\t86101\n呃周瑞林\t86102\n6点42\t86103\n复读机\t86104\n｜\t86105\n唱一首唱\t86106\nhttphhiphotosbaiducomxiaodupicitem0b46f21fbe096b63e11143240b338744ebf8acbdjpg\t86107\n童装\t86108\n呀儿错\t86109\n徐梓熹\t86110\n龙源\t86111\n截图\t86112\n0123456789987654321\t86113\n回点\t86114\n糊里糊涂\t86115\n数千张\t86116\n蓝你\t86117\n香橙们\t86118\ngbjkkk\t86119\n娃罗\t86120\n官贵们\t86121\n没有钱难受\t86122\n卡瑞卡\t86123\n-\t86124\n3挺\t86125\n二半夜\t86126\n王纪平\t86127\n虎u\t86128\n打给\t86129\nccvat\t86130\nabpe\t86131\n里四季村\t86132\n打结\t86133\nhjjjjll\t86134\n充公\t86135\n普文镇\t8
6136\njwjixhjs\t86137\n二分之三一\t86138\n缓气\t86139\n牛哒哒\t86140\n梅花粥\t86141\n最富\t86142\n咯臭机器人\t86143\n哈余u還好鬍u雨u發發春敢不敢飯飯怪怪的古古怪怪vv有還和你哈回\t86144\n首抽\t86145\n80x25\t86146\n秩序井然\t86147\n勒给\t86148\n爱呢度秘度秘\t86149\n王路飞\t86150\n嘉士伯啤酒\t86151\n我猜你猜我猜你猜我猜\t86152\n有你好玩\t86153\n充充\t86154\n臭豆腐关\t86155\n半自动\t86156\n忽冷\t86157\n下午天\t86158\nTtyhhhgf\t86159\n48.8\t86160\n打不开\t86161\n我真的好讨厌你啊为什么呢我真的好讨厌好讨厌好讨厌好讨厌你\t86162\n小度秘你很可爱求你\t86163\n洪婧\t86164\n秘办\t86165\n糯米饭\t86166\n指名道\t86167\n儿童片\t86168\n晚安斯\t86169\n一个十人份\t86170\n哥风风\t86171\n我的画\t86172\n白腿\t86173\n儿童版\t86174\n古筝\t86175\n给哥人\t86176\n闻风而逃\t86177\n感恩爱\t86178\n沈清绝\t86179\n我最爱大米我不爱你度秘\t86180\n妻子\t86181\n李克娜\t86182\n宠男们\t86183\n旅游防灾小册子\t86184\n9T5gd5\t86185\n可聪明\t86186\n郭果果\t86187\nvo鸡尾酒\t86188\n陈晓和\t86189\n保险单\t86190\n脸蛋儿\t86191\n混队\t86192\n崔秘箱\t86193\n彼得林\t86194\n开建\t86195\n莉肿\t86196\n性爱片\t86197\n一剑\t86198\n六叔罗地亚\t86199\n茶具\t86200\n中发\t86201\n毒刺\t86202\n丧失\t86203\n银诺克\t86204\nhttpfhiphotosbaiducomxiaodupicitem77094b36acaf2eddce8d7f9d8a1001e939019341jpg\t86205\n同质化\t86206\n度秘失调猪朱猪朱猪\t86207\n溶性\t86208\n说了我不说\t86209\n半岛小区\t86210\n个额\t86211\n肚儿\t86212\nyoursocool\t86213\n度秘该\t86214\nfhdvjh\t86215\nxitoutouqi\t86216\n没完没了\t86217\niytstep\t86218\n艾里克克\t86219\nwqmlgb\t86220\n新疆人\t86221\n猜否\t86222\n原谅我吧\t86223\n乔文帅\t86224\n王王桥站\t86225\nogfvg\t86226\n猜听\t86227\n普拉多\t86228\n109岁\t86229\n5109020199706188459\t86230\n周小胖\t86231\n莫如此\t86232\n3586\t86233\n1878年4月\t86234\ngdia\t86235\n萨卡罗\t86236\n王华俊\t86237\n柔风\t86238\n黑边\t86239\n刺臭\t86240\n2100个月\t86241\n香飘\t86242\n臭屁熏天\t86243\n勾肩搭背\t86244\n浓烟\t86245\n借来玩\t86246\n一7535688568\t86247\n筑坝\t86248\n2323232323232323232323\t86249\n23%\t86250\n画画累\t86251\nrdgauy\t86252\nanartist\t86253\n皓然正义\t86254\n杨春亚\t86255\n放下\t86256\n随片\t86257\n有你会不会\t86258\n子多些\t86259\n3点儿\t86260\n坏蛋大坏蛋臭坏蛋\t86261\n龚一边\t86262\nhi小咪咪\t86263\n陈育坚\t86264\nc3国\t86265\n么么么男神萌萌哒\t86266\n石头\t86267\n这个群\t86268\n战斗型\t86269\nkwb\t86270\nkwg\t86271\n人寿险\t86272\nkwk\t86273\nkwj\t86274\n性交配\t86275\n接替\t86276\n王长庆\t86277\n四点儿\t86278\n靠谱靠\t86279\nkwz\t86
280\n闽清\t86281\n盛世峡江\t86282\n流干\t86283\n流年\t86284\nbeee\t86285\nbeef\t86286\n周云量\t86287\n装载机\t86288\nshdrty\t86289\nbeen\t86290\n百废话\t86291\nbees\t86292\nt粉爱你#\t86293\n布萨卡\t86294\n杜文泽\t86295\n长成\t86296\n喜欢你了讨厌\t86297\n承若\t86298\n五邑大学土建系\t86299\n汤匙\t86300\n佳红\t86301\n换言之\t86302\n明智\t86303\n圆溜溜\t86304\n要不我听\t86305\n永结\t86306\n个人妖\t86307\n翻阅\t86308\n爱慕\t86309\n浪费\t86310\n你是我的小跟班那\t86311\n明晨\t86312\n1000万颗\t86313\n淡出\t86314\n明晓\t86315\n萌化\t86316\n不二家棒棒糖给我一个\t86317\n流行病学\t86318\ndhiphotosbaiducomxiaodupicitem1\t86319\n明晚\t86320\n李了中南汽配城\t86321\n十个帮\t86322\n吃鬼\t86323\n金花儿\t86324\n陪吃陪喝陪玩陪聊天\t86325\n沉沉\t86326\n唐人街\t86327\n有误\t86328\n出局\t86329\n青岛啤酒公司\t86330\n我真的一道疤\t86331\nX10\t86332\nfififfififfixu\t86333\n有请\t86334\n有说\t86335\n万达商业\t86336\n有课\t86337\n罗红霉素\t86338\n猜会\t86339\n击败\t86340\n炼油厂\t86341\n安鸿\t86342\n睡睡觉\t86343\ndhhdhdhdbd\t86344\n摇滚\t86345\n月多\t86346\n李可心\t86347\n月夜\t86348\n葫芦哇金刚天业集团\t86349\n出山\t86350\n千百个\t86351\n张晓怪\t86352\n爱问问\t86353\n聪明再见\t86354\n关疑\t86355\n一整颗\t86356\n在现场\t86357\n4金\t86358\n解脱\t86359\n蕾丝\t86360\n红钻\t86361\nzxcbn\t86362\n朕渴\t86363\n讲一个谁在\t86364\npaspossible\t86365\n布展\t86366\n更完\t86367\n嗯三六\t86368\n小孩\t86369\n1872.10.22\t86370\n挥挥\t86371\n布局\t86372\n牙狼\t86373\n2010年3月9日\t86374\n条手\t86375\nuhvffu\t86376\n天麻\t86377\n东尼\t86378\n书厂\t86379\n43994399\t86380\n家纺\t86381\n家福女\t86382\n家红\t86383\n快乐生活\t86384\n美妞宝儿\t86385\n不懂我的话一点都不乖我要投诉你\t86386\n条打\t86387\n1hi\t86388\n过去式\t86389\n马荣秘\t86390\n擦擦蹦\t86391\n烤鹅\t86392\n全关\t86393\n吃火锅\t86394\nfrtrf\t86395\n纷纷扰扰\t86396\ntyuzctr\t86397\nYouBitch\t86398\n陶土\t86399\n王晓磊\t86400\n全入\t86401\n弹痕\t86402\n茯蒲\t86403\n全全\t86404\n调会\t86405\n虔诚\t86406\nm1m1g\t86407\n开原秘\t86408\n小松树市\t86409\n大哲学家\t86410\n姚一蔓\t86411\n而行\t86412\n轴对称\t86413\n解雪音\t86414\n淳淳淳淳淳\t86415\n天王集团\t86416\n慢慢来\t86417\n纪念馆\t86418\n羊肉面\t86419\n不可貌相\t86420\n支持稿\t86421\namom\t86422\n八天\t86423\n特别差\t86424\n长沙市委\t86425\n2995\t86426\n只手\t86427\n食派\t86428\n雷士的漫游唱一首么牛子畅\t86429\n吴芷欣\t86430\n成方圆\t86431\n最强喜事\t86432\n民意调查\t86433\n应用程序\t86434\n五百页\t86435\n勒都\t86436\n
努力\t86437\n转变得\t86438\ntmmtj\t86439\n赵言鸣\t86440\n一天五十张\t86441\n脱发\t86442\n你在哪呢金手勺打不死阿爸等哈睡刷卡对\t86443\n动态\t86444\n管定\t86445\n科学美国人\t86446\n颜卓\t86447\nwhg\t86448\nescape\t86449\nuytfdethc\t86450\n自发\t86451\n自受\t86452\n自取\t86453\n动怒\t86454\n管家\t86455\n王俊凯\t86456\n装纯\t86457\n度度秘\t86458\nshSSSS\t86459\n一千年前\t86460\n刀把\t86461\n努努\t86462\n942262462\t86463\n自叹\t86464\n陈家烽\t86465\n哥哥乖乖乖乖乖乖哦咯咯咯咯咯咯咯咯咯咯咯丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽丽乖\t86466\n痛告\t86467\n声言\t86468\n坏毛病\t86469\nchiphotosbaiducomxiaodupicitemd\t86470\n消魂\t86471\n财付通\t86472\nchiphotosbaiducomxiaodupicitema\t86473\n小齐\t86474\n黄苇萱\t86475\n团太\t86476\n李我叫李\t86477\ntour\t86478\n电驴\t86479\n远离\t86480\n小夜\t86481\n泰鬼片\t86482\n电马\t86483\nDJUHT\t86484\n在眉睫\t86485\ncorz\t86486\nXKFBYD\t86487\n见了笑死\t86488\n法兰克福机场\t86489\n荣锦绣\t86490\n秃鹰\t86491\n舆论监督\t86492\n你好版\t86493\n四十一号\t86494\n购物券\t86495\n2232167105\t86496\n黑界\t86497\n妮妮妮\t86498\n你好牛\t86499\n11#1\t86500\n先谷\t86501\n壬辰年\t86502\n小丫咪\t86503\n40840554405\t86504\n八九年\t86505\ngrryh\t86506\nwhs\t86507\n办学\t86508\n杭锦\t86509\n公募\t86510\n不懂你\t86511\n洪伟\t86512\n一三个\t86513\ntggdb\t86514\n表情木\t86515\n兢兢业业\t86516\n汤雅舒\t86517\nmml\t86518\n欲成欢\t86519\n道德规范\t86520\n吧唧\t86521\n张延坡\t86522\n风范\t86523\nｐｋｐｋｐｋ\t86524\n才问\t86525\n猪脑吧\t86526\n夥\t86527\ngg棒\t86528\n无鱼可\t86529\n扩展\t86530\n恰子\t86531\n五棵\t86532\nQMO\t86533\n六点\t86534\n一三万\t86535\n马赛克\t86536\n5001千\t86537\n一三一\t86538\n哈让\t86539\n天\t86540\n教友\t86541\n1234955578558866\t86542\n核芯\t86543\nvvbvvvv\t86544\naper\t86545\n刘波\t86546\n文理科\t86547\nlloiiuhgfd\t86548\n欠练\t86549\n大婶儿\t86550\nzck\t86551\n棉质\t86552\n微行\t86553\n人民服务学\t86554\n纪晓岚\t86555\n9部\t86556\n好男不打\t86557\nhgbsgbb\t86558\n追到\t86559\n试手\t86560\n罪\t86561\nqweq\t86562\n追别\t86563\n厌妮\t86564\n精算课\t86565\nVita\t86566\nqwef\t86567\n七四一百五十斤\t86568\nqwec\t86569\n晕死元\t86570\n东方据\t86571\nnnm\t86572\n我真的很想爱你\t86573\nnnk\t86574\nnnj\t86575\nmlljuktkmm\t86576\nnnh\t86577\nnng\t86578\nnnf\t86579\nnne\t86580\nnnd\t86581\nnnc\t86582\n旺旺集团\t86583\n剑灵通\t86584\nxxjra\t86585\nlyfydz\t86586\n车队\t86587\n个度秘评神\t865
88\nnny\t86589\nnnx\t86590\nnnv\t86591\nnns\t86592\n夸\t86593\n第六位\t86594\n北海公安志\t86595\n姚小亮\t86596\nlandok\t86597\n水井\t86598\n月月天\t86599\n我爱梦璃我爱吗你我爱梦绿我爱妈咪\t86600\n声援\t86601\n黑龙潭瀑布\t86602\n江珍\t86603\n红色高棉\t86604\n真了假了\t86605\n体制改革\t86606\n一百升\t86607\n打完\t86608\n驱动\t86609\n圆满结束\t86610\n两元\t86611\n芦笙\t86612\nhello秘\t86613\n笃笃笃\t86614\n541881452\t86615\n公斤重\t86616\n两党\t86617\n敏苑\t86618\n取消\t86619\n勤于\t86620\n建好\t86621\n两全\t86622\n沾满\t86623\n军师\t86624\n大经济\t86625\n字\t86626\n遗落\t86627\n读高中\t86628\n莘县\t86629\n飞行模式\t86630\n头脑\t86631\n刘博翔安\t86632\n都尼\t86633\n诺诺么\t86634\n孓\t86635\n吵吵\t86636\n偶像活动\t86637\n李海平\t86638\n玻璃体\t86639\nmoreneed\t86640\n告诉我吧先上\t86641\n霏霏\t86642\n来凤\t86643\nFgfy\t86644\n孜\t86645\n一些儿\t86646\nRMB4399元\t86647\n判官\t86648\n5059999999元\t86649\n下三滥\t86650\nvghhjkhhjbghbcc\t86651\n2018年\t86652\n13135397039\t86653\n吵吖\t86654\n多咪\t86655\n知不对\t86656\n惯用\t86657\n梁继腾\t86658\n恩懂\t86659\n概率论\t86660\n煮饼\t86661\n一分钟以后\t86662\n几年后\t86663\n石路尝\t86664\n英氏\t86665\n欧特曼\t86666\n清盘\t86667\n武广客\t86668\n44585255585222651457996486515822665265558621585548625866\t86669\n没一点也\t86670\npleases\t86671\n煮饭\t86672\n滚滚\t86673\n死额\t86674\n子弟兄们\t86675\n囚徒\t86676\nadvantage\t86677\n嗯嗯嗯嗯嗯嗯嗯嗯嗯呢呢呢呢呢呢呢呢呢呢呢\t86678\n被告诉我\t86679\n哇卡\t86680\n佐芙蓉\t86681\n降龙\t86682\n1Vte一i\t86683\n哪我\t86684\n度杜\t86685\ndije\t86686\n你你我打扁你我是女的\t86687\n这些个\t86688\ntuan\t86689\n幼年时\t86690\n999844449998444499984444999844449998444499984444999844449998444499984444\t86691\n不行片\t86692\n半岛\t86693\nhwohdoehdijwjd\t86694\n独身一人\t86695\n季\t86696\nFOURSEASONS酒店\t86697\n2月16日\t86698\n好贱\t86699\n刘娜萍\t86700\n摊吵\t86701\n辣酷\t86702\n余江\t86703\n打胎\t86704\n13958558\t86705\n得失心\t86706\n2013145202099521\t86707\njdd\t86708\n1969\t86709\n八十多级\t86710\n实际\t86711\n再搬\t86712\n几十次\t86713\npmt\t86714\n田上\t86715\n西餐店\t86716\n梁小梅\t86717\n陈佳宜\t86718\n邓慧欣\t86719\n橡胶粒\t86720\n田七\t86721\n拍打\t86722\npmx\t86723\n陈嘉敏\t86724\n联懋\t86725\npmd\t86726\n拍手\t86727\npmg\t86728\njdc\t86729\n田东\t86730\npml\t86731\n彩仙\t86732\n福清市\t86733\n深沪\t86734\npmk\t86735\n小颜颜\t8
6736\n五五升\t86737\nhjuh\t86738\n田中\t86739\n虽然\t86740\n高点儿\t86741\n748748748\t86742\n宜城\t86743\nhjuc\t86744\nABCDEFGHlJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz\t86745\n524\t86746\n525\t86747\n526\t86748\n怎莫样\t86749\n520\t86750\n521\t86751\n522\t86752\n523\t86753\nhjuu\t86754\n摸摸\t86755\n上海城投\t86756\n528\t86757\n深沉\t86758\n张冰水\t86759\n星球期\t86760\n小秘你就是我的闺蜜\t86761\n重午节\t86762\n1751期\t86763\n新婚夜\t86764\n胆大鬼你是石大坏蛋大坏蛋\t86765\n胡就好\t86766\n罗韶涵\t86767\n全社\t86768\n枪战片\t86769\nA级\t86770\n畢業\t86771\n摄像\t86772\n进群\t86773\n6500\t86774\n哇靠老啊\t86775\n没关\t86776\n淘宝旗舰店\t86777\n水能载舟\t86778\n珍塔玛莎\t86779\n足矣\t86780\ntuat\t86781\n两下子\t86782\n6969669699\t86783\njdr\t86784\n八二零三六二二\t86785\n这个礼拜\t86786\n隐藏\t86787\n犹他州\t86788\n色素\t86789\n分节\t86790\n活化石\t86791\n吆外\t86792\nxmtg\t86793\n孽度秘\t86794\n分界洲岛\t86795\n权法\t86796\n单座单发战斗机\t86797\n林泽永\t86798\n近40年\t86799\n嘛鱼\t86800\n十一个\t86801\n十月二十四号\t86802\nlp\t86803\n天裳\t86804\n渺茫\t86805\n哼骂\t86806\n华北大区\t86807\n名愁\t86808\n雪里蕻\t86809\n哼骗\t86810\n八八四\t86811\n54兀\t86812\n考试周\t86813\n参照物\t86814\n中网庙小学\t86815\n表脸表脸不要脸不要脸不要脸不要脸\t86816\nlq\t86817\n贴身秘书\t86818\n美达\t86819\n还能使\t86820\n好自大\t86821\n干净净\t86822\n秀图\t86823\n13772254018\t86824\n杜培武\t86825\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t86826\nMERCYFUL\t86827\n8000块\t86828\n国模妮\t86829\n笨我想你\t86830\n555565555295855\t86831\n门儿清=know\t86832\nSurvival\t86833\n农园\t86834\n真聪明你很二\t86835\n热化\t86836\n模板\t86837\n侠岚第六季\t86838\n士师傅\t86839\n大明宫\t86840\n沉浸\t86841\n吴尚任\t86842\n禁区\t86843\n重音\t86844\n算算不想\t86845\n嗷娇\t86846\n有禁\t86847\n四月十五号\t86848\n继续追我\t86849\n有福\t86850\n度秘你最好的啦乖把\t86851\n吗金\t86852\n玩点\t86853\n自已不知道\t86854\n政社\t86855\n度秒度秘\t86856\n防过敏\t86857\n作业员\t86858\n英q娱乐康\t86859\n来来来来来不及\t86860\n王朝霞\t86861\n陈潎\t86862\n劲头\t86863\n做题\t86864\n打火\t86865\n打灯\t86866\n心声\t86867\n噶纳斯\t86868\n讲挺\t86869\n定庆勇\t86870\n健身中学\t86871\n过奖过奖\t86872\n12月3日12时\t86873\n销售价\t86874\n6054160050914864005064680508450648864896\t86875\nhoyou\t86876\n二十几年\t86877\nHiG嗯v7b\t86878\n再见了我\t86879\n度秘婚点肚子上的红点一吃饭没有呀肚子涨红点\t86880\n我爱你我爱你我爱你的不我爱你查不就是爱你就是爱你我想\t8688
1\n雨宝行\t86882\n白蔷薇\t86883\n13478\t86884\n谭婉琳\t86885\nCDMA\t86886\n我真的好累我爱的好累\t86887\nvuvu\t86888\n李一风\t86889\n牌规则\t86890\n三分之1\t86891\n必然\t86892\n度秘一号\t86893\n我的小小我的小小无敌的\t86894\n我的爱好\t86895\n53544344344444564454466十5亿二\t86896\n喷雾\t86897\n听首歌\t86898\n异世骨\t86899\n侧漏\t86900\n瞎撞\t86901\n我看我欲封天到\t86902\n窝不是处\t86903\n鲁能160\t86904\n也斯也\t86905\n意描\t86906\nMacdrive\t86907\n重税\t86908\n笑嘻嘻哈哈大笑\t86909\n拿出来\t86910\n仲夏夜恋歌\t86911\n五蛋节\t86912\n五百个\t86913\n喝起来\t86914\nGhana\t86915\n巴萨\t86916\n藏藏藏\t86917\n这么们\t86918\n东丽区\t86919\n载有\t86920\nhiyuh\t86921\n嗯恩蒽\t86922\n宝贝儿我爱\t86923\n海都\t86924\n王茗颢\t86925\n败家子\t86926\n五百万\t86927\n沉没\t86928\n动漫我长的帅不帅\t86929\n巨\t86930\nGPS\t86931\n赵一冰\t86932\n出入口\t86933\n三天之内\t86934\n峰苑\t86935\n东经\t86936\n冷空气\t86937\n妈服\t86938\n基长\t86939\n715千米\t86940\n善待\t86941\n我的闺密\t86942\n迪拜塔\t86943\n小杜密\t86944\n4544545\t86945\n偶儿子\t86946\n棒好棒\t86947\n勒款\t86948\ncomeoue\t86949\n肉厚\t86950\nvubub\t86951\n尼斯湖\t86952\n参考消息\t86953\n宝贝儿来着可爱逗我的吧的吧本宫倩\t86954\n一万分\t86955\nHeyei3thoroughfare\t86956\n五申通\t86957\n七九号\t86958\n捷克狼犬\t86959\n七里河\t86960\nuehhuf\t86961\n长野汤田中\t86962\n22万吨\t86963\n汤臣倍健\t86964\n商丘市民权县供电局办公室\t86965\n七八号\t86966\n章丘\t86967\n倾倒\t86968\nauu9。cn\t86969\n几时差不多半个小时\t86970\n加强\t86971\n回会\t86972\n对联年\t86973\n狗头铖\t86974\n萧炎\t86975\n张小兴\t86976\n1425553644158669915526878966\t86977\n22.2.22\t86978\n甘补\t86979\n罗宁安\t86980\n情骂\t86981\n龙娃物柜\t86982\n周思怡\t86983\n妙处\t86984\nqowe\t86985\n加开\t86986\n仅供\t86987\n黄翼鸿\t86988\n浪人\t86989\n14725883690012423358555255352352583527663553553533535365\t86990\n一千蚊\t86991\n帮雄\t86992\n楚汉除\t86993\nevening家\t86994\ntheory\t86995\n平液\t86996\n真货\t86997\n莱昂\t86998\n几十亿个\t86999\ntfoex\t87000\n音景\t87001\n930快车\t87002\nvoy623\t87003\n大猪八戒\t87004\n香国\t87005\n后福\t87006\n真贱\t87007\n清刚\t87008\n秘爱风\t87009\nWilmore\t87010\n清初\t87011\n大思路\t87012\n北极星\t87013\nLlllllllllllllllllllllllllllllllllllllllllllllllllllllll\t87014\n夭折\t87015\n殉职\t87016\n11000公斤\t87017\n你猜我猜你猜不猜你猜我猜你猜不猜猜你猜我你猜\t87018\n横城\t87019\n独独\t87020\n女王\t87021\n女玄\t87022\n08千克\t87023\n恩将\t87024\nI
Know\t87025\n走走走走走走走走\t87026\n乖们\t87027\n陈佳能\t87028\n74岁\t87029\n陈耀扬\t87030\n独狼\t87031\n玛雅红\t87032\nyuiosfgjk\t87033\n曾德华\t87034\n孤僻\t87035\n违弃\t87036\nfsxvcb\t87037\n海心\t87038\n二连浩特市\t87039\n阿鲁巴\t87040\n一举成名\t87041\n熊出没着良心归来\t87042\n岛国\t87043\n会员群\t87044\n两俩\t87045\n恩尼\t87046\n收件\t87047\n恩就\t87048\n无坚不摧\t87049\n见听得\t87050\n辫子\t87051\nbabycogchua\t87052\n0hN0\t87053\nMirror#\t87054\n强盗\t87055\n哑音\t87056\n治课\t87057\n讲姻\t87058\n在何处\t87059\n凉皮\t87060\n京MJ7391\t87061\n李梦婷\t87062\n市级\t87063\norybitch\t87064\n相当大街\t87065\n思恋\t87066\n乌龙江\t87067\n土象星座\t87068\nand\t87069\nane\t87070\n棒级\t87071\n龙婧琳\t87072\nana\t87073\nanc\t87074\n酷睿i3\t87075\nctgghj\t87076\n酷睿i7\t87077\n代理\t87078\n酷睿i5\t87079\n喽干\t87080\n嗯理工\t87081\nsuvsy\t87082\n截版\t87083\n苹果+桃+梨\t87084\nany\t87085\n程志凯\t87086\n龙江龙s\t87087\n可爱的我的生活\t87088\n冻包\t87089\n柏溪\t87090\n主驾\t87091\nNB2kk\t87092\n小从伴\t87093\n行行星饭\t87094\n莱东大街\t87095\nchxighhgtuhcgeixii\t87096\n哈子哈子\t87097\n寿宴\t87098\n凤三爷\t87099\n创骄\t87100\nMM们\t87101\n失业率\t87102\n贾跃亭\t87103\n取件\t87104\n世故\t87105\n要不得\t87106\n贾万博\t87107\n片迪迦奥特曼\t87108\n来得不到\t87109\nhi你好丑\t87110\n朱老\t87111\n我不单你好呀别\t87112\n分导\t87113\n子华迷\t87114\nzuowen\t87115\n分寸\t87116\n刘继康\t87117\n托收\t87118\nzgyifutifiyyoyyhhyyity\t87119\n疯子疯子\t87120\n张晓涵\t87121\n新度秘你是男是女\t87122\n余温\t87123\n顺河城\t87124\n算什\t87125\n奥买尬\t87126\n塔普\t87127\n嘉美蒂斯\t87128\n晒晒\t87129\n玩意杉\t87130\n七十二小时\t87131\n神龙\t87132\n神龟\t87133\n钱朵朵\t87134\n14865455544\t87135\n吐吐吐头喔吞吐鲁磨了来记录旅途图图\t87136\n嘣沙卡\t87137\n简历\t87138\n早安软蛋\t87139\n瓶甲醇酒\t87140\n拉几1\t87141\n驯鹿灿白开度党\t87142\n辛苦你了懂你\t87143\n在不然然后\t87144\n撑一礼拜\t87145\n金水\t87146\n大头儿子和小头爸爸夏天\t87147\n往复\t87148\n森段\t87149\n上东晖学校\t87150\n黄丽玲\t87151\n一个一七个\t87152\n回电话刚\t87153\n嗯赛罗奥奥特曼\t87154\n咕噜咖\t87155\n一点点\t87156\n上美的美餐\t87157\n丰都\t87158\n23d48集\t87159\n抛却\t87160\n姨只\t87161\n怎么说呢\t87162\n摊牌\t87163\n删去不想\t87164\n煤化\t87165\n课内战\t87166\n12085033308205525236\t87167\n透脸\t87168\n1824\t87169\n1827\t87170\npopsub\t87171\n1820\t87172\nqqn2065517887\t87173\n火车站\t87174\n佛牙舍利塔\t87175\n唤起你的爱\t87176\n澳新银行\
t87177\n多张\t87178\n飞溅\t87179\nhvgggv\t87180\nsirin\t87181\n##\t87182\n百科全说\t87183\nxigxidkbkk\t87184\n正值\t87185\nItsOK\t87186\n恶臭\t87187\n2016年1月17日\t87188\n那立交桥站\t87189\n安澜\t87190\n#2\t87191\n#1\t87192\n听寡爱\t87193\n陈武帝\t87194\nNopain\t87195\n皱眉\t87196\n搞笑了行\t87197\n一三快点\t87198\n法师\t87199\n这时候\t87200\n中国社会福利基金会\t87201\n小奥林\t87202\n李勇平\t87203\n学子们\t87204\n风神\t87205\n70周年\t87206\n金海凤\t87207\n公众人物\t87208\n马丁丁\t87209\nn　зn┏━〇〇━━━━┓n┃文字输入区┃n┗┳┳━━━┳┳┛n　┗┛　　　┗┛\t87210\n五十二点\t87211\n邕武\t87212\n揪紧\t87213\n帅侠\t87214\n爹娘\t87215\n托娃\t87216\n9月20日清晨\t87217\n丽江丽丽家客栈\t87218\n来我不在\t87219\n123456789000000000000000000000000000000000个\t87220\n干身\t87221\n停挺\t87222\n７分\t87223\n一周十八\t87224\n蓉蓉兔\t87225\n米加油菜\t87226\n吖吖吖吖吖吖吖吖吖吖吖吖\t87227\n普及\t87228\n绿意\t87229\n袖珍犬\t87230\nfgjjfdcm\t87231\n声哥快点\t87232\n双照镇\t87233\n议而不决\t87234\n叶兆海\t87235\n九千九百九十九九千九百九十万9999\t87236\n咯咯嗦嗦\t87237\n四点八\t87238\nn玫瑰奶茶\t87239\ngvcddfhjh\t87240\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t87241\n把刀\t87242\n家校\t87243\n出头之日\t87244\n靠不能\t87245\n雷和波\t87246\n肩上\t87247\n知性\t87248\nNote\t87249\n夠長輩不不調\t87250\n粮食作物\t87251\n脸红红\t87252\n骚暴\t87253\n冯铭潮\t87254\n检批准此致\t87255\n2017年11月份\t87256\npopbox\t87257\n攻读\t87258\n边缘化\t87259\n夕阳西下\t87260\n红河州州委宣传部\t87261\n范统\t87262\n唧唧唧歪歪\t87263\n清庵堡\t87264\n肖艳\t87265\n划不来\t87266\n我知道我知道呢那你现在在行在\t87267\n玩具类\t87268\n814455\t87269\n范练\t87270\n归正\t87271\n加气站\t87272\n欲裂\t87273\n别动不动\t87274\n哈姆西克\t87275\n哈萌\t87276\n一张张\t87277\n180岁\t87278\n一一一一一一一一三\t87279\n拉尼希\t87280\n牛场\t87281\n你好你好你好你好你好\t87282\n啊个勒还说等我一找你家的我\t87283\n对话录\t87284\n一百遍千\t87285\n349套\t87286\n贾把头\t87287\n愿意不愿\t87288\n修身养性\t87289\n105千米\t87290\n1232558\t87291\n静宁那家旅馆\t87292\n吉O\t87293\n找不着我\t87294\n金刚石线切割\t87295\n李怡晴\t87296\n朝着\t87297\n衍生\t87298\n郑燕\t87299\n白杨\t87300\n一进门\t87301\n牙路\t87302\n蜗杆\t87303\n草稚\t87304\n144个\t87305\n挖苦\t87306\n大坞镇\t87307\n顶弄\t87308\n尽心\t87309\n白板\t87310\n白杰\t87311\n吉田\t87312\n王我纱\t87313\n雾凇\t87314\n前山\t87315\n小动作\t87316\ng点\t87317\n等會去的我的時候我是高富帥的高富帥\t87318\nghvuvuh9i\t87319\n相见不起\t87320\n第六感\t87321\n捏呢热\t87322\n
搀扶\t87323\n弑神\t87324\n媚雅\t87325\n王别\t87326\n温和派\t87327\n一百种\t87328\n每组\t87329\n变压\t87330\n白村\t87331\n漠水\t87332\n孙莉\t87333\n丑八怪人\t87334\n话明\t87335\ngvjbfhhj\t87336\nafsfdceudhfb\t87337\n我的男人\t87338\n一觉度秘\t87339\nhomyg\t87340\n出门我告诉你\t87341\n事子\t87342\nuhhggg\t87343\n张清一\t87344\n审讯\t87345\nLanuf\t87346\n唉海乌ocouforntom\t87347\n孤要\t87348\n事学\t87349\n牛奶山\t87350\n武大\t87351\n疯了没\t87352\n姿色\t87353\n假条\t87354\nfisfiaaaaaaaaaa\t87355\n小学生我心上大四吧八一下\t87356\n鹦哥\t87357\n酸菜炖肉\t87358\n金因土\t87359\n保留意\t87360\n压价\t87361\n谢大神\t87362\n个套\t87363\n杨天\t87364\n坏银坏银\t87365\nShengmoyisi\t87366\n大专家\t87367\nCbvcbfh\t87368\n前两点\t87369\n初恋那点小事\t87370\n第五空间\t87371\n牛仔们\t87372\n尊敬\t87373\n焚玉\t87374\n星阶\t87375\n聊写\t87376\nSXD\t87377\n115分之八\t87378\narein\t87379\n永新\t87380\nSXN\t87381\n奥林匹克圣火欢欢藏羚羊\t87382\n陪葬\t87383\n货架\t87384\n太极音乐\t87385\n12233344445678999999000\t87386\n栖居\t87387\n退票\t87388\n信不屑\t87389\n见晚安\t87390\n晚八蛋\t87391\n阿霜\t87392\n王村长\t87393\n出场馆\t87394\n组合拳\t87395\n只是么\t87396\n周心语\t87397\n邓小磊\t87398\n原景\t87399\n床垫儿\t87400\n程文静\t87401\n112345678910\t87402\n85259\t87403\nnelmbj\t87404\ngkuk\t87405\n啊旅途兔兔V5兔兔兔兔\t87406\n70余粒米\t87407\n朱大大\t87408\n二十个位\t87409\n130006千多\t87410\n村外\t87411\n哦里拉\t87412\n璽竇\t87413\n六八大家好多事大头\t87414\n讲笑好\t87415\n反之亦然\t87416\n李的确汉堡李\t87417\n丰富\t87418\n机芯\t87419\nHZBVXV\t87420\nOPI\t87421\n写字楼\t87422\n高中生\t87423\n丁聪慧\t87424\n挤身\t87425\n周鸿祎\t87426\n宿舍源\t87427\n水汪汪\t87428\n真方便\t87429\n说不够了\t87430\n神战\t87431\n因为了\t87432\n反杀\t87433\n散尽天良\t87434\n年华\t87435\nDichroicjock\t87436\n油菜\t87437\n渗透\t87438\n允诺\t87439\n毅然决然\t87440\n高辣\t87441\n黑哥\t87442\n呵呵我真的好喜欢\t87443\n小段话\t87444\n艇\t87445\n7.\t87446\n嫂们\t87447\n160469706\t87448\n黑哨\t87449\ngfifhg\t87450\n后轮\t87451\n31岁\t87452\n头雨\t87453\n头雪\t87454\n3336218\t87455\n黑哟\t87456\n蒸锅\t87457\n下再发\t87458\n蓝皮书\t87459\n观察员\t87460\n亚利沙\t87461\ncartier\t87462\nVIVOY51\t87463\n我疯了\t87464\n百里挑一\t87465\neruppsfllzvmn\t87466\n花笑话\t87467\n今典\t87468\n情深意重\t87469\n千里之提\t87470\n志气\t87471\n萌萌萌萌萌萌\t87472\n该案\t87473\n75c\t87474\n富区\t87475\n75g\t87476\n桨声
\t87477\n真常\t87478\n婺城区\t87479\n千奇百怪\t87480\n2.5万亿元\t87481\n362421197110126851\t87482\n早晨5点半\t87483\n下移\t87484\n大飞哥\t87485\n真希\t87486\n白花朵朵\t87487\ngtjttj\t87488\n下种\t87489\n真帅\t87490\nwnsvsgz\t87491\n东b1\t87492\n汉藏南\t87493\n分手说爱你\t87494\n续拉\t87495\n我不要我不喜欢你了我恨你我讨厌你\t87496\n米洋洋\t87497\n下秘\t87498\n陪住\t87499\n庙宇\t87500\n752\t87501\n罪外祖父\t87502\n750\t87503\n谚语\t87504\n害你\t87505\n755\t87506\n半身裙\t87507\n759\t87508\n气死我了像\t87509\n没一点\t87510\n明晓溪\t87511\n了了行\t87512\n李逵\t87513\n75%\t87514\n任慧敏\t87515\n读着\t87516\n10万遍\t87517\n大韩佛教\t87518\n大梦想家的歌\t87519\n10月10日\t87520\n画报\t87521\n5万7\t87522\n吉尺明步\t87523\n栽烟\t87524\n虎丘\t87525\n厅级\t87526\n李选\t87527\n下不了床\t87528\n阿大号\t87529\n王刘\t87530\n雨伞军\t87531\n无可理喻\t87532\n叫堂\t87533\n七十年代\t87534\ntaylor\t87535\nrve\t87536\nrvk\t87537\n中部\t87538\n俱到\t87539\n十百千万\t87540\n间隔面\t87541\n叶豆豆\t87542\n好工作\t87543\n567岁\t87544\n菜板\t87545\n大臣们\t87546\nrvx\t87547\n中都\t87548\n天天高兴\t87549\n了别这样\t87550\n绍尧\t87551\nT除V认证\t87552\n利国\t87553\n冯娅\t87554\n念首\t87555\n蜜糖快跑\t87556\n神缘\t87557\n丁茂城\t87558\n王刚\t87559\n卡堡\t87560\n剩下\t87561\n植物大战僵尸二天空之城\t87562\n紧删\t87563\nMugaritz\t87564\n淡胜\t87565\n孙卓\t87566\n厕纸\t87567\n心思管我你谁呀我的天\t87568\n小萌子\t87569\n天天哥白羊有的水红宝宝立波\t87570\n三周\t87571\n嘻嘻帅\t87572\n几百斤\t87573\n是的我很须要你\t87574\n现风\t87575\n第一体\t87576\n姑姑军\t87577\n30多少\t87578\n汗我受够你了我要离家出走我不要你\t87579\nooxx\t87580\n何小姐\t87581\n两不厌\t87582\ngjdntdbgdvgff\t87583\n汪星人\t87584\n猜不透\t87585\n5点19\t87586\n水逆\t87587\n建桥\t87588\nsrrf\t87589\n自敏佳\t87590\n行為\t87591\n35566788990\t87592\n我讨厌你讨厌你讨厌你讨厌你我头晕托运托运讨厌你讨厌你讨厌讨厌讨厌你我讨厌你讨厌你\t87593\n行点\t87594\n二四五六六七零五\t87595\n那天儿\t87596\n李太监\t87597\n军事化\t87598\nngpo\t87599\n心满意足\t87600\n评分表\t87601\n先知\t87602\n古代汉语\t87603\ntgfthgkk\t87604\n魅惑\t87605\n再讲一个呗\t87606\nDTD\t87607\n小红猫\t87608\n雷动三\t87609\nsheacked\t87610\n志昂张\t87611\n十二点下午\t87612\n夜传杯\t87613\n快示\t87614\n临门\t87615\n刘卫兵\t87616\n笨兔\t87617\n昂妇产\t87618\n擅自\t87619\n555380086\t87620\n5点15\t87621\n大米\t87622\ntxxx\t87623\n追名逐利\t87624\n大胆子\t87625\n处置\t87626\nbibba\t87627\nbibbb\t87628\n姚梦婷\t87629\n梁锦棠\t87
630\n多米亚\t87631\n自拍照度\t87632\n相信爱情是\t87633\n女魔头天\t87634\n杰峰琪\t87635\n好哇oyou\t87636\n我想你我想你赤壁\t87637\n情胜初恋\t87638\nEnding#\t87639\nwteg\t87640\n太阳神\t87641\n否则\t87642\n你不可爱你是猪\t87643\n子轩\t87644\n565nn\t87645\n12666\t87646\n时限\t87647\n第八名\t87648\n诶茵\t87649\n处罚\t87650\n爱着你\t87651\n陈述\t87652\n走了回见\t87653\n筷子勺\t87654\n1521712220\t87655\n骂娘\t87656\ntek\t87657\n七八次\t87658\nten\t87659\nteo\t87660\n区儿\t87661\n房价\t87662\nted\t87663\ntee\t87664\n调性\t87665\n白欣彤\t87666\nteq\t87667\nter\t87668\n熊脸\t87669\ntet\t87670\nteu\t87671\n44258677￥714587￥447787114455579nn44488879\t87672\n该不适应\t87673\n为政在爱情公寓\t87674\n太平靜簡單\t87675\n几百几\t87676\n冢页\t87677\n唾味灰飞烟灭\t87678\n现在场\t87679\n121211111\t87680\n鄱阳\t87681\n16:45\t87682\n中国消费电子展\t87683\n壹号\t87684\n四四十万\t87685\n充费\t87686\n向家驹\t87687\njvz\t87688\n老魅\t87689\nfhgdfjufcv\t87690\n沃泰\t87691\nbjjbvjbvjkggfyfyutiftufysyufystg\t87692\n沸如比戎\t87693\n不由n\t87694\niuiidi\t87695\nBABY\t87696\n一夫一妻制\t87697\nte3\t87698\nqwer\t87699\n5648868894868049873\t87700\n老公我没的了美极了\t87701\n囫囵\t87702\n餐馆\t87703\n13213100414\t87704\n零万元\t87705\n即墨\t87706\n撒冷\t87707\n跪安\t87708\n输验\t87709\nsb两字\t87710\n不明我恨你\t87711\n疯狂世界\t87712\n打击者\t87713\n窭ngwg\t87714\n∈\t87715\n合情合理\t87716\n垫画\t87717\n薛笑康\t87718\nhfdy\t87719\n那行给你了钱给你\t87720\njadttmrjm\t87721\n看来得\t87722\n差秘度秘\t87723\njvv\t87724\n没不到\t87725\n√\t87726\n∝\t87727\n蹦蹦蹦\t87728\n∞\t87729\n暴王\t87730\n1904\t87731\n现在与堂\t87732\n日本队\t87733\n长不大\t87734\n1900\t87735\n休声更\t87736\n∩\t87737\n∨\t87738\n∫\t87739\n∪\t87740\n去吧度蜜月\t87741\n一个头一个\t87742\n∮\t87743\n∠\t87744\n∣\t87745\n开赛露\t87746\n∥\t87747\n∧\t87748\n咆哮泓\t87749\nhhrvf\t87750\n∽\t87751\n药酒\t87752\n愤怒的小鸟闹\t87753\n∵\t87754\n∴\t87755\n诊所\t87756\njvn\t87757\n小莹\t87758\n南美陆路\t87759\n小莲\t87760\n绿罗裙\t87761\n2009年底\t87762\n小莫\t87763\ngansha\t87764\n884889\t87765\n圆头\t87766\nCIFIFFY\t87767\n指甲\t87768\n侣皓吉吉\t87769\n着力\t87770\n讲坛\t87771\n国语版\t87772\n亲够了\t87773\n理当\t87774\n中介机构\t87775\n东面你是谁你是女王\t87776\nhyij\t87777\n吗无忌三国\t87778\n宽边\t87779\n孤竹君\t87780\n尼轻妈妈气\t87781\n6月30日24点\t87782\n外世界\t8
7783\n绵软\t87784\n邓肯\t87785\n丹妙药\t87786\n着劲\t87787\n小莎\t87788\n岂是\t87789\nzraux\t87790\n门上\t87791\n鹊桥\t87792\n明明明敏\t87793\n一个一三\t87794\n12015米\t87795\n三个场\t87796\n门丁\t87797\n听了笑\t87798\n温县\t87799\n门业\t87800\ni女生小机器人你好小机器人\t87801\n6.1秒\t87802\n两辆\t87803\n唉拜拜\t87804\n班尼斯特\t87805\n冒进\t87806\n一个一个\t87807\n两边\t87808\n跑掉\t87809\n脑浆肠子加涅家常事下\t87810\n奇祸\t87811\n囖ML\t87812\n哭了哭了哭\t87813\n门丽\t87814\n一几几\t87815\n15551\t87816\n不著急\t87817\n恋声\t87818\n荷蒙\t87819\n郭宇畅\t87820\n预习\t87821\n三分之一时间\t87822\n几三季\t87823\n呲牙咧嘴\t87824\n不配\t87825\n赵苗苗\t87826\nwatyy\t87827\n邱福英\t87828\n低速\t87829\n三粒\t87830\n魏有英\t87831\n200P\t87832\n三粗\t87833\n晾衣夹\t87834\n制衡\t87835\n别册\t87836\n盆友们\t87837\nFfdhm\t87838\n吃啵\t87839\n春红\t87840\n万箭\t87841\n#小艺\t87842\n经纪商\t87843\n下脚\t87844\n北锣鼓巷\t87845\n宇宙之王\t87846\n战野\t87847\n备选\t87848\n夜观\t87849\n罗本、克罗斯、里贝里\t87850\n一个二四幺\t87851\n好宙斯\t87852\n八六三零九九五七\t87853\nGhggggfftygdtffeffortth566666effthffghgyhhdfyurrfyfettf\t87854\n跟嘴硬\t87855\n配子\t87856\n13100088957\t87857\nfggdgg\t87858\n9小时\t87859\ncupl\t87860\n苦人\t87861\n张上三七\t87862\n阿宝啤酒\t87863\n哪子\t87864\n後面\t87865\n圣诞\t87866\n随韩籍\t87867\n6月180126\t87868\ngdgdgdgyydydgyd\t87869\nctdfdfr\t87870\n小鸡恩那\t87871\n魏雪娇\t87872\n偶像们\t87873\n杨瑞娜\t87874\n死不了\t87875\nJramdparents\t87876\n给方\t87877\n最終\t87878\n很努力\t87879\n新闻界\t87880\n偷魂\t87881\n童丽\t87882\n石家庄婚庆公司\t87883\n死不人\t87884\n荣文艺\t87885\n衣句\t87886\n很难道\t87887\nfheh\t87888\n黯然伤神\t87889\n死不交\t87890\n半用水\t87891\n天涯咫尺\t87892\n返程\t87893\n铁枪\t87894\n181853334867多少\t87895\n王小莲\t87896\n家大\t87897\nfgfgcfg\t87898\n铁架\t87899\n顺电\t87900\n将使\t87901\n死癖\t87902\n骨盖骨\t87903\n秦长兴\t87904\n董东\t87905\n服从\t87906\nKakaoTalk\t87907\n铁林\t87908\n公可母\t87909\n爆款\t87910\n你追我吧\t87911\n渐热\t87912\nISOhisGIho\t87913\nehrhlcysnbx\t87914\n能看完\t87915\n太秒\t87916\n秀晶\t87917\n妹团\t87918\n深圳民工街舞团\t87919\n0美元\t87920\n第一堆\t87921\n第一堂\t87922\n最大值\t87923\n。f藏藏藏藏藏藏\t87924\n桂圆肉\t87925\ninsoIes\t87926\n错现\t87927\n厄运\t87928\n家市中心大路一号一号\t87929\n聊聊聊聊天\t87930\n开挖子\t87931\n猪血威风\t87932\n20毫克\t87933\n刀子嘴豆腐\t87934\n双刃剑\t87935\ndejk\t
87936\n高干\t87937\n提生\t87938\n苦思\t87939\n153531\t87940\n王宝生\t87941\n醵\t87942\n始发站\t87943\n真系\t87944\n李阳疯狂英语\t87945\n大鲈鱼\t87946\nNm\t87947\n这你在线\t87948\n十字路口\t87949\n五分之6\t87950\n超级战贝利亚\t87951\n假裝\t87952\n别害羞\t87953\n咨客错\t87954\n京西\t87955\n900万元\t87956\n考息\t87957\nsorrui\t87958\n笨笨度秘\t87959\n真糊\t87960\n千千万\t87961\n埃弗森\t87962\n玄坛\t87963\n二三三八四个\t87964\n十五分钟以后\t87965\n花痴\t87966\nAk47M4A1\t87967\n是么么达\t87968\n汉源湖\t87969\n理疗\t87970\n想和你死\t87971\n秘hellohellohello度秘\t87972\n特技\t87973\n念猋\t87974\n超高逼停\t87975\nindependentely\t87976\n曲峻慧\t87977\n九月中旬\t87978\n好妹子\t87979\n特报\t87980\n看找\t87981\n6qcp\t87982\n樱井\t87983\n卡牙\t87984\n曲终人散\t87985\n北风\t87986\n后天和\t87987\n代和\t87988\n叫片\t87989\n不是我问你有多少年了你在这里\t87990\nv不句\t87991\n卡牌\t87992\n残破\t87993\n13360225217\t87994\n卡片\t87995\n醮\t87996\n黄格\t87997\n一把把元\t87998\n朱明宇\t87999\n嗯小凯\t88000\n给给给给给给给给给给给给\t88001\n上铺\t88002\n二十二层\t88003\n叫特\t88004\nefi\t88005\n遍地开花\t88006\n阵风\t88007\n题人\t88008\n黄晟峰\t88009\neff\t88010\nefg\t88011\n小名叫\t88012\n在一再说\t88013\n53YY\t88014\n郑大世\t88015\n我和你的歌曲\t88016\n卿卿我我\t88017\n女谢\t88018\n持剑耍帅\t88019\n多方面\t88020\n米高\t88021\n张晓锐\t88022\n一把火\t88023\n到哪儿\t88024\n面面\t88025\n柯柯\t88026\n方芳雨\t88027\n小城市\t88028\n1594601274021\t88029\n第二十四\t88030\n5成\t88031\n1593574522245\t88032\n400张\t88033\n四岁孩\t88034\n刘雨琦\t88035\nReynolds\t88036\n仿写\t88037\n檀香\t88038\n收益\t88039\n350497045808\t88040\n去年九月份\t88041\n纯情丫头火辣辣\t88042\n那当\t88043\n大帝\t88044\n屌吧\t88045\n老漂亮\t88046\n吃好饭\t88047\n大市\t88048\nkddq\t88049\nhimallopqr\t88050\n师子\t88051\n大师\t88052\n徒弟们\t88053\nsuruiry\t88054\n几面\t88055\n三员\t88056\n公平交易\t88057\n氵好奇24423\t88058\nvkjs\t88059\n兴杰\t88060\n落汤鸡\t88061\n臭汗\t88062\n广福镇\t88063\n狼烦\t88064\n老班\t88065\n1358025865\t88066\n凯文\t88067\n手指\t88068\n真心泪\t88069\n冠县\t88070\nTaxi\t88071\n小事儿\t88072\nAlaric\t88073\n嗯有一\t88074\niklu\t88075\n你是猪呀你是猪u14p个USB个你是猪尾你是猪哟14p\t88076\n中文版\t88077\n嗯比如\t88078\n回忆录\t88079\n雷士照明\t88080\n擦擦擦\t88081\n刘恩惠\t88082\n四大一\t88083\nTSKS全集\t88084\nauhgl\t88085\n啊欢仔\t88086\nRiccardo\t88087\n朝杰\t88088\nunityou\t88089\n停图恩
\t88090\n一不呈现\t88091\n特征\t88092\n贺捷生\t88093\n400万名\t88094\n五方路小学\t88095\n底气\t88096\n落槌定音\t88097\n轻侮\t88098\n資訊\t88099\n春你\t88100\n优化\t88101\n天天酷跑\t88102\n小四岁\t88103\n刘梦于\t88104\n发张现\t88105\nyesstomatotou\t88106\n六二二八\t88107\n南阴\t88108\n星星周\t88109\n西汉年\t88110\n500一笔\t88111\n碧水华庭共青城碧水华金莱利\t88112\n76868\t88113\n奇幻类\t88114\n首响\t88115\n变成\t88116\n还以为你会\t88117\n7月8号\t88118\n哇哇\t88119\n二八二六\t88120\n分手吧嫁给\t88121\n咪你\t88122\n郭思遥\t88123\n四万\t88124\n鹿丸\t88125\n咏琳\t88126\n篮球迷们\t88127\n四七\t88128\n你好过\t88129\n四一\t88130\n四与\t88131\n拔除\t88132\n四三\t88133\n起凡\t88134\n酷欧阳娜娜\t88135\n你好运\t88136\n好把\t88137\n我不不知道\t88138\n唯独\t88139\n好我服你了再见\t88140\nAND\t88141\n国美高层拜会京华时报\t88142\n干嘛尼\t88143\n轻便\t88144\n问结束\t88145\n桑桑\t88146\n稀稀拉拉饿\t88147\n86%\t88148\n四中\t88149\n想不想死\t88150\nffhjik\t88151\n夸克日\t88152\n小百科\t88153\nsrOnn\t88154\n李姝珊\t88155\n赞马达\t88156\n芦稷麦\t88157\n唐顿\t88158\n度秘jjjjjjk\t88159\n果雨\t88160\n肩部\t88161\n一二三四五七八九十二十五\t88162\n着迷\t88163\n1234567891098741236987123\t88164\n养鸡场\t88165\n现代战争\t88166\n晚报\t88167\n发挥\t88168\n最急\t88169\n苏DJ\t88170\npr\t88171\nps\t88172\npp\t88173\npq\t88174\npv\t88175\n福腐文\t88176\npt\t88177\npu\t88178\n飙歌\t88179\npx\t88180\n最怕\t88181\n孔立青\t88182\nswd\t88183\n恐龙国\t88184\npb\t88185\npc\t88186\n嗯妈妈的爸爸四岁\t88187\n319.6\t88188\n说一问\t88189\npd\t88190\npe\t88191\npj\t88192\npk\t88193\nph\t88194\npi\t88195\npn\t88196\n猪巴\t88197\npl\t88198\n恐龙园\t88199\n漂亮女孩儿\t88200\n心静\t88201\n大嘴猴\t88202\n553885\t88203\nDFT\t88204\nFdfffffffff\t88205\np2\t88206\np3\t88207\n说说说说说\t88208\n哎一古老冒\t88209\np4\t88210\np5\t88211\n许虎\t88212\np8\t88213\n卡二\t88214\n最高层\t88215\n但愿\t88216\n退步\t88217\n补铁\t88218\n外长\t88219\n赛欧门\t88220\n两百年后\t88221\n腹黑感\t88222\n笑猫\t88223\n马库托\t88224\n扑通\t88225\n亲爱的祖国\t88226\n二马三鲜\t88227\n好呀好呀好呀好呀好呀嘿嘿\t88228\n恒天与\t88229\n天天笑\t88230\n随大流\t88231\nu2415章\t88232\n卢彩霞\t88233\n经济型\t88234\nDudujduee65\t88235\n洗衣单\t88236\n采天润\t88237\ncsex\t88238\n鱼泡\t88239\n写完\t88240\n八十多头\t88241\n破不了\t88242\n同胞们\t88243\n张一平\t88244\n读不懂你你好奇怪\t88245\n水岸丽\t88246\n三姑\t88247\n三姐\t88248\n早力\t88249\n
rrfffbjko\t88250\nCoc\t88251\n脑细胞\t88252\n扼制\t88253\n混蛋王东东的男娃\t88254\n雨杉\t88255\n辗转\t88256\n爱了你好久\t88257\n怋\t88258\n异种\t88259\nfuxf\t88260\n星歌剧\t88261\n共商\t88262\nGBS\t88263\n還會\t88264\n一虎\t88265\n饮料水\t88266\n二蛋度秘\t88267\n荠菜花\t88268\n150个\t88269\n人腱\t88270\n红红果果\t88271\n一葫芦\t88272\ntylaloda\t88273\n貝\t88274\n绕场\t88275\nNv\t88276\n沙鹰\t88277\n占道\t88278\n君可见牡丹开一生有人为你的金科兼职行么一针有人为你的催租金\t88279\n毛影\t88280\n貓\t88281\n你好哈楼\t88282\ngyuuuhhuiiu\t88283\n貌\t88284\nkk4301\t88285\n貉\t88286\n几十平方公里\t88287\n漏油\t88288\n有话直说\t88289\n貂\t88290\nnjzj\t88291\n貼\t88292\n今天上午8时\t88293\n忻州\t88294\n買\t88295\n太神了你\t88296\n飞飞60030\t88297\nvyvy\t88298\n闺美\t88299\n方桌\t88300\n貨\t88301\najja\t88302\n作业了再见拜拜宝贝儿我爱你\t88303\n病倒\t88304\nidb6abcax\t88305\n我不想和你之\t88306\n負\t88307\n方框\t88308\n门脸\t88309\n葫芦瓜\t88310\n徐吗\t88311\n罗军军\t88312\n8万吨\t88313\n400多人次\t88314\n240元\t88315\n嘉柏丽\t88316\n大姨妈小乖乖\t88317\n水人\t88318\njjmmj\t88319\n好劲\t88320\n你好感度\t88321\n垦利\t88322\n发力\t88323\n不要紧得\t88324\n得死\t88325\n百块\t88326\n双桥小学\t88327\n卖花\t88328\n哈哈虎\t88329\n骁\t88330\n孟祥瑞\t88331\n麦地\t88332\n零二零\t88333\n往东走\t88334\n李躺\t88335\ngifef\t88336\nnjjkkkkkll\t88337\n易馨\t88338\n片药鼎沸\t88339\n振华\t88340\n炒鸡棒\t88341\n淘宝直通车\t88342\n41倍\t88343\n京昆高速\t88344\n吴景彪\t88345\n蕾丝裙\t88346\nvade\t88347\nｇｐｒｓ\t88348\n把戏\t88349\nxuoy\t88350\n熊貓\t88351\n蕾丝装\t88352\n一點一滴\t88353\n嗯茵茵\t88354\n四院\t88355\n拐子\t88356\n毒食\t88357\n九四\t88358\n酸雨\t88359\nfuibi\t88360\n老莫\t88361\nXXOO\t88362\n墨迹了烦\t88363\n卢家俊\t88364\n盖博\t88365\n孔融\t88366\n碎尸\t88367\n李思雨\t88368\nis1HK1圖咯hi1圖來PK\t88369\n惠文小区\t88370\n癖\t88371\n一个二十\t88372\n50go\t88373\n黑海人家\t88374\n这些么\t88375\n13208641524\t88376\n收历\t88377\n水喉\t88378\nCCTV6电影频道节目中心\t88379\n治中不来\t88380\n癫\t88381\n癶\t88382\n倒掉\t88383\n沈煜伦\t88384\n西峡紫金\t88385\n盖卫\t88386\n癿\t88387\n百\t88388\n白\t88389\n發\t88390\n登\t88391\n学要\t88392\n癸\t88393\nSimon\t88394\nKpokaht\t88395\n仙林\t88396\nST四维\t88397\n黄陵\t88398\nEtertetytyu\t88399\n／／／＼＼＼n｜｜nd\t88400\n支使\t88401\n洗浴\t88402\nrffiify\t88403\ngvbgch\t88404\n行政\t88405\nfidud\t88406\n体贴入\t88407\n偶闺
密\t88408\n了了再见\t88409\n来电影\t88410\n测比之\t88411\n搞基了\t88412\nxoxo\t88413\n生粉\t88414\n气氛\t88415\n无言于对\t88416\n肚子堡\t88417\n副总统\t88418\n不是我是在给你\t88419\n离婚案\t88420\n追赶\t88421\nbafa40f4bfbfbeda44bde0c7ff0f736afc31fb6jpg\t88422\n6月18日、19日\t88423\n说帅\t88424\n抄手哥\t88425\ndhiphotosbaiducomxiaodupicitemd439b6003af33a87b216ce15c15c10385343b542jpg\t88426\n赛汀\t88427\nfkhnF\t88428\nsuqb\t88429\n行好好好\t88430\n伏击\t88431\n偷生之人\t88432\n船身\t88433\n说明书式\t88434\niackh\t88435\n啦啦拉\t88436\n干促\t88437\n李梦\t88438\n零米\t88439\n额佛\t88440\n文学系\t88441\n棒棒棒\t88442\n菜窗\t88443\nTfbao\t88444\n曹首歌\t88445\n看不好\t88446\n更搞\t88447\n狗头汽油格\t88448\n聊了聊\t88449\n四套\t88450\n新范\t88451\n园丁\t88452\n会考试\t88453\n8888888888888\t88454\n明明明\t88455\nfifjdsgghfsgshfdkfiff9dh\t88456\n8888888888884\t88457\n主攻\t88458\n百度炮击\t88459\n能源\t88460\n孟加\t88461\n和你的身体\t88462\n六下\t88463\nuuhhgg\t88464\n做人杰\t88465\nfyfufy\t88466\n学闲网\t88467\n头昏进行曲\t88468\n疯标\t88469\n难找\t88470\n感冒药\t88471\n最近两天\t88472\n争声\t88473\n园中\t88474\n黑暗\t88475\n里应外\t88476\n政治学\t88477\n聋哑\t88478\n程龙\t88479\n球鞋\t88480\n190千克\t88481\n170斤\t88482\n田曾灿\t88483\n紧缩\t88484\n铁蹄\t88485\n葵花\t88486\n逗你说北走我说东走砾说北上我是南下你说一我是二你说上我说下也\t88487\n空中楼阁\t88488\n十四点\t88489\n四千多\t88490\nvUSD\t88491\n原色\t88492\n生紫萱\t88493\n等缓\t88494\n54865\t88495\n掉以轻心\t88496\n了知道\t88497\n终了\t88498\n绿都\t88499\n雄鹰\t88500\ngjmjtw\t88501\n曼都\t88502\n囍从天降阿sa\t88503\n月见\t88504\n鞍山师范\t88505\n13058624279\t88506\n天雪\t88507\n柳枝\t88508\n小队\t88509\n五六只\t88510\n一千次\t88511\n突死\t88512\n鸡肝\t88513\nhhbebhzywvb\t88514\n庭审\t88515\n阿尔萨斯\t88516\n懒求\t88517\n就动\t88518\n一个再讲一个再一个再一个\t88519\n胖墩\t88520\n恋爱史\t88521\ndggf\t88522\n福岗\t88523\n天尾\t88524\n庄里\t88525\n福岛\t88526\n176.5元\t88527\n活灵活现\t88528\n懂懂懂\t88529\n奶粥\t88530\n鸡肠\t88531\n俩万岁\t88532\n老听\t88533\n稍胜一筹\t88534\n呢煞笔\t88535\n虚荣心\t88536\n300000000000\t88537\n江湾城\t88538\n英村\t88539\n滚蛋\t88540\n临产\t88541\n西太后\t88542\nIjbhhu\t88543\n东软信息\t88544\n四星级酒店\t88545\n//艺\t88546\n进修\t88547\n全新\t88548\nfggucxffuvucuxdethvxsdgx\t88549\nrkaauovm\t88550\n奥吧\t88551\n灿白\t88552\n英杰\t88553\n法币者\t8855
4\n热放\t88555\n姲猜\t88556\n百度知道\t88557\n库尔勒市\t88558\n全文\t88559\n1111111111111111111111111111111111111111111111\t88560\nvchfhgf\t88561\n戌中藏辛金\t88562\n我爱你我爱你我爱你我爱你你是我的\t88563\n那么多事\t88564\nFFF\t88565\nqwee\t88566\nofjdyxhbhkhbmlkf\t88567\n2小时后\t88568\nRiRi\t88569\n紫薇公园\t88570\n水绿\t88571\n尼克\t88572\n电阻屏\t88573\n5642685144886545556882805870062705272697128055780557338093258564027680528855\t88574\n3分之二\t88575\n张姐\t88576\n伤伤时光转逝\t88577\n大小孩儿\t88578\n好多回\t88579\n妞蔓思撒\t88580\n老酸奶\t88581\nhttpfhiphotosbaiducomxiaodupicitem3801213fb80e7beca95d4d93282eb9389b506b39jpg\t88582\n表壳\t88583\n人畜共患传染病目录\t88584\n张修竹\t88585\nessss\t88586\n咯特曼\t88587\n张姨\t88588\n是非主流\t88589\n欧麦尬的我的奶奶来了等会\t88590\n过桥费\t88591\n哦掌\t88592\n我喜欢熊出没之熊心归来\t88593\n开改二年\t88594\n三清\t88595\n挂历\t88596\n夜不能寐\t88597\n3月16号\t88598\nioiy\t88599\n穿堡\t88600\n藤珞\t88601\n俊豪\t88602\n同乐同乐\t88603\n六花肉\t88604\n丸剂\t88605\nioio\t88606\n阿姨基\t88607\n魏宇龙\t88608\n吃食\t88609\n奥里娃\t88610\n29岁\t88611\n冯靖宇\t88612\n真NB\t88613\n永恒不变\t88614\n汝南县\t88615\n吃飯\t88616\n可笑哈哈\t88617\n4xy60\t88618\n听辱\t88619\n六一\t88620\n不懂得\t88621\n湮没了\t88622\nfxjhddguhcfch88jjklhgcvk\t88623\nDrjf\t88624\n性字\t88625\n秦秘秘\t88626\n三分球\t88627\n体育用品商品店我没工夫胡椒小羊吧\t88628\n金笔\t88629\n酷三头六臂刀枪不入\t88630\nniqiwoshibushi\t88631\noppocomer\t88632\n你好小气\t88633\n功利性\t88634\n好意思\t88635\nfef\t88636\nfeg\t88637\nfed\t88638\nfee\t88639\nHub\t88640\n&\t88641\nfen\t88642\nHun\t88643\nfej\t88644\n淘瞳\t88645\n花岗岩\t88646\njjuoya\t88647\nfet\t88648\nfer\t88649\n651米\t88650\nHus\t88651\n张国焘\t88652\n格林威治\t88653\n时空\t88654\niPhone4\t88655\nKxkdiekskdlw\t88656\nfey\t88657\n熊处处\t88658\n吐血\t88659\n邻家\t88660\nwhant\t88661\nsors\t88662\n失联\t88663\nsory\t88664\n蛇妖\t88665\n1312323\t88666\n而立\t88667\n宋氏\t88668\n迷笛\t88669\n中计\t88670\n韩国拌饭\t88671\n同父异母\t88672\n栗山\t88673\n可爱你男\t88674\n桔呃\t88675\n岱岳\t88676\n李家河\t88677\n鼓唱\t88678\n驾驶证\t88679\n下拉多\t88680\n快点元\t88681\n769794664677\t88682\n熊猫植物大战僵尸版\t88683\n基俺\t88684\n扫瑞IC\t88685\n叫恋\t88686\n说你走\t88687\n夜睹\t88688\n指认\t88689\n十二點\t88690\n白色的小不点长白下\t88691\n奖文豪\t88692\n
两码\t88693\n哥斯达黎加\t88694\n3428679\t88695\n虫咬\t88696\n离石王\t88697\n董岗\t88698\n8ig0\t88699\n游牧\t88700\n度秘塔\t88701\nJapaneseavvideo\t88702\n何必然\t88703\n食色\t88704\n阿嘞\t88705\n双十中学\t88706\n二比三\t88707\n干巴得\t88708\n张思炟\t88709\nzuev\t88710\n掺杂\t88711\n见小女足量\t88712\n怪咖\t88713\n纳税人\t88714\n一起嗨\t88715\n哈哈哈大笑\t88716\n罗琦\t88717\n格雷尔\t88718\n425525525125215155fhfj\t88719\n显\t88720\n顽说\t88721\n秀火\t88722\n昵\t88723\n6mg\t88724\n昶\t88725\n昱\t88726\n昰\t88727\n是\t88728\n昨\t88729\n春\t88730\n无遮挡\t88731\n昧\t88732\n映\t88733\n星\t88734\n昙\t88735\n昘\t88736\n以防不测感觉吧女吧v回家吧vv\t88737\n昔\t88738\n昗\t88739\n性福\t88740\n易\t88741\n女普陀\t88742\n昌\t88743\n昏\t88744\n明\t88745\n昉\t88746\n昊\t88747\n唐逸敏\t88748\n60少28\t88749\n非神\t88750\n昂\t88751\n淄博矿业\t88752\n累不累\t88753\n塔吉克斯坦议会下院\t88754\n美国国家档案馆\t88755\n请度\t88756\n孙佳沂\t88757\n锅碗瓢盆交响曲\t88758\nZ万\t88759\n薄弱者\t88760\n浦口\t88761\nlooks\t88762\n中國\t88763\n亲爱的我好想你呀你想我\t88764\n你好你好你好天天\t88765\n生羡\t88766\n帮帧\t88767\n宋·佚名\t88768\n百度新闻\t88769\n凯格\t88770\n向斯文\t88771\n18522120118\t88772\n艳艳\t88773\n中场\t88774\n小娃儿\t88775\n康铭\t88776\n中地\t88777\n穿国荣\t88778\n林则徐\t88779\n挺夕\t88780\n就是我的死狗\t88781\n吴亦\t88782\n用爱\t88783\n把你的心我的心川崎来\t88784\n防弹\t88785\n回身\t88786\n汇泽\t88787\n吴京\t88788\n非正常\t88789\n加以\t88790\n柳堡镇\t88791\n吃空\t88792\n我的世勋\t88793\nGail\t88794\n加价\t88795\n意彝型\t88796\n小汝\t88797\n小江\t88798\n加份\t88799\n雇主\t88800\nfyucdfufg\t88801\n求求你了大哥哥就告诉我你是男是女\t88802\n现在\t88803\n小池\t88804\n加什\t88805\n一无所知\t88806\nnhcjicjic\t88807\n小汪\t88808\ntryfrvhrr\t88809\n一个七十\t88810\n城内\t88811\n鐑\t88812\n不达人\t88813\n加仓\t88814\n现场\t88815\n切惮拖\t88816\n大天长乐\t88817\n播出\t88818\noboroutobler\t88819\n嫑酱紫\t88820\n初生\t88821\n世寿\t88822\n索顿\t88823\n过年果\t88824\n雄县\t88825\n3100票\t88826\n俩根平\t88827\n星光\t88828\n大姐大你在家\t88829\n新天地王\t88830\n翌与花\t88831\n深陷\t88832\n冰激凌糖葫芦八宝粥\t88833\n厌臭\t88834\nDBD\t88835\n齐民友\t88836\nnjpjpj\t88837\nuntilprep\t88838\n给给\t88839\n155555555\t88840\n微讨厌\t88841\n雄厚\t88842\nInjjknnjjj\t88843\n冰球帽\t88844\n卡恩\t88845\n稚嫩\t88846\n死给哈呀木\t88847\n初男\t88848\n土狗\t88849\n奔驰原厂\t88850\n酱酱酱酱\t88851\n大宝贝\t8
8852\n二八婆\t88853\n来来来来来看\t88854\n角话\t88855\n救维\t88856\nhskd\t88857\n不好可不好\t88858\n愈多\t88859\n老纪著\t88860\nThis\t88861\nw8eue\t88862\n字沫\t88863\n雨毛\t88864\n安眠药\t88865\n俩对\t88866\n下小小就是我我就是射手\t88867\n响吻\t88868\n伯夷\t88869\n远离你了我不想\t88870\n聊聊聊聊聊聊聊聊聊聊\t88871\n日支\t88872\n十一点六十\t88873\n比再说\t88874\n四分一六岁\t88875\n联想yoga900s\t88876\n尕海洋\t88877\n一肚子\t88878\n受养成\t88879\n赵雅芝\t88880\nP话\t88881\n球道\t88882\n电视费\t88883\n小心更不幸福因为你一点也不好我讨厌你你给我死开\t88884\n知道他是谁\t88885\n日本京都\t88886\n节课\t88887\n兰州理工大学\t88888\n药膏\t88889\nghggjn\t88890\n凸｀0\t88891\n徐汇滨江\t88892\n一二三四零八九十四\t88893\nVTGKZJ\t88894\nldonit\t88895\n千牛\t88896\n名诗\t88897\n聊了待\t88898\n嘻嘻嘻嘻嘻嘻嘻7777777\t88899\n盗情史\t88900\n拜拜你\t88901\n鬼来啦\t88902\n椰子粉\t88903\n人心机\t88904\n葛根\t88905\n懒货\t88906\nslult\t88907\n大王卡腾讯\t88908\n2015年10月23日\t88909\n只当\t88910\nchangellot\t88911\n小剩\t88912\n敢作敢为\t88913\n哈弗\t88914\n曹雨航\t88915\n辉特\t88916\n药膳\t88917\nGPRS定位仪\t88918\n孟氏ly\t88919\n小烟\t88920\n木JJ\t88921\nhnhenv\t88922\n也门政府\t88923\ndrtyggh\t88924\n好不一点\t88925\n佛莫\t88926\n银墨黎\t88927\n靓照\t88928\n小烈\t88929\nNhhkiggjhcbhhhggfhhhvvbb\t88930\n不高\t88931\n我们中国会\t88932\n11000\t88933\n99999111999911\t88934\n朱鲁\t88935\n财税\t88936\n满梦园丑\t88937\n一百一百三十九\t88938\n水肿型\t88939\n出模\t88940\nchxghvb\t88941\n无医\t88942\n伊诺\t88943\n151122122\t88944\n13293934577\t88945\n红堡\t88946\n25727655\t88947\n八六年\t88948\n植物大战僵尸空中之城\t88949\n新教\t88950\n前边儿\t88951\n353535333333454345533663633653335653566563565\t88952\n打屁屁屁屁屁\t88953\n蚀米\t88954\n句号\t88955\n畅玩\t88956\n我爱你\t88957\n错错错错\t88958\n卢军\t88959\n西游大战僵尸二\t88960\n红堂\t88961\n滑落\t88962\n句句\t88963\n林爸\t88964\n防火\t88965\n国王们\t88966\n白都\t88967\n笑纳\t88968\n四粒\t88969\n方便车\t88970\n主料\t88971\n笑纹\t88972\n方山\t88973\n新数\t88974\n潮利\t88975\n蓝婉\t88976\n欢田喜\t88977\n200多万\t88978\n侯壮丽\t88979\n不要钱儿\t88980\n回文数\t88981\n500克\t88982\n老惯\t88983\n夏哥\t88984\n学了你说\t88985\n猪流感\t88986\n菲律宾人\t88987\n500元\t88988\n我喜欢派\t88989\nN8\t88990\n黑丝女\t88991\n微群\t88992\n张涵予\t88993\njgdcgggv\t88994\n说话才说\t88995\n金思诚\t88996\n饶良红\t88997\n讨厌我不想和你在这\t88998\n精要是\t88999\n算算算数\t89000\n555454
55555555\t89001\n不见多好\t89002\n一老一少\t89003\nvocof\t89004\n441位\t89005\n55724738357756732008n432835\t89006\n变态杀手\t89007\nvococ\t89008\n广告费\t89009\n杀手\t89010\n二百六十六块\t89011\n李若瑜\t89012\n宁可\t89013\n勺内\t89014\n退场\t89015\n度秘抚恤\t89016\nan秘\t89017\n权政\t89018\n晕现\t89019\n晴儿格格\t89020\n腋窝九勉强的少女\t89021\n扯了吧\t89022\n浙江传媒\t89023\n九龙桥\t89024\n中士\t89025\n差差\t89026\n5565554554\t89027\n向东笔记\t89028\n埋设\t89029\n325加一位\t89030\n认不\t89031\n趁热\t89032\n五二五六二\t89033\n1000羽\t89034\n对抗\t89035\njndmkwm\t89036\n零五元\t89037\n对折\t89038\nimezmyameloc\t89039\n数六个\t89040\n今上午\t89041\n逆境\t89042\n众人\t89043\n佐藤亚\t89044\n56%\t89045\n180块\t89046\n同天\t89047\n功业\t89048\n颜炳艺\t89049\n安县民族宗教局\t89050\n235整除\t89051\n晨风\t89052\n鲜蛋\t89053\n刘老家村\t89054\n邮费\t89055\n甘芝洁也就是我闺蜜美\t89056\n嗯21\t89057\n5dmam\t89058\n谢过\t89059\n人体月饼\t89060\n园长\t89061\n你是我的知已\t89062\n熊林川\t89063\n第三问\t89064\n谢迎\t89065\n齐国丝\t89066\n商报\t89067\nenenrb\t89068\n蛛网\t89069\n二零二二\t89070\n好年\t89071\n包邮\t89072\n操场上香\t89073\n谢迪\t89074\n金鸡赛\t89075\n小使\t89076\n婶子\t89077\nMercy\t89078\n蜂蜡\t89079\n刚走\t89080\n南白\t89081\n#微遗憾#\t89082\n亲爱的小淘气\t89083\nEFX\t89084\n贝托\t89085\n中煤\t89086\n很大气\t89087\n胜芳王\t89088\n才网\t89089\n毫瓦\t89090\n刘方圆\t89091\nforme\t89092\n崩露\t89093\n巴傻\t89094\n赵丽叶\t89095\n散粉\t89096\n瓜子\t89097\nhchftdgkjuvu\t89098\n考级\t89099\n四点半点\t89100\n8亿元\t89101\n对啊度\t89102\n高檐\t89103\n大明湖畔\t89104\n一张背\t89105\npicasal\t89106\n过奖过\t89107\n等等等\t89108\n信息源\t89109\n蜡菊\t89110\n150多\t89111\n遗憾\t89112\nFLLZ\t89113\n创造\t89114\n要不你给我的小汤姆猫\t89115\nNcgcvb\t89116\n共产党\t89117\n剛剛\t89118\n拜放\t89119\n规定\t89120\ncompol\t89121\n深入浅出\t89122\n吕板烧\t89123\n一万四千五百万元\t89124\n复兴门金融街\t89125\n不好再见\t89126\n5分钟\t89127\n睡算\t89128\n大孩儿\t89129\n480级\t89130\n冷场\t89131\n我是喵喵喵喵喵喵\t89132\n奖杯\t89133\n过年你爱\t89134\n无限流量卡\t89135\nn系\t89136\n十五份\t89137\n么trin\t89138\n101个\t89139\n点六手\t89140\n陈婷\t89141\n单完美\t89142\n十年个\t89143\nSM公司\t89144\n风动石\t89145\nnnnnbnnnnnnmmmmkkkkkkkklllllllllllll\t89146\n苗珂欣\t89147\n丑明\t89148\n金本\t89149\n丘泽\t89150\n干饭\t89151\ntherotalinannnnniya秘cecyo\t89152\n张小胖子\t89153\n西马
达\t89154\n屌刷\t89155\n课情\t89156\n好小子\t89157\nieujs\t89158\n笨猪头\t89159\n好饿\t89160\n好饱\t89161\n僵尸张晓龙我的我是银龙穿越你是哪的你\t89162\n乐乐姐\t89163\n田红橙黄绿青蓝紫\t89164\n金景桌\t89165\n811块\t89166\n有所谓\t89167\n五怪\t89168\ndymue\t89169\n第三种\t89170\n恶心人\t89171\n臣妻\t89172\nxwle\t89173\n初音\t89174\n能性\t89175\n第三秒\t89176\n伤心欲绝\t89177\n蒋星雨\t89178\n西周\t89179\njiukuilsquhuquhutquh\t89180\n玉清\t89181\n一个聪明的人\t89182\n堡格丽\t89183\n土耳其语\t89184\n叫我女什么叫\t89185\n茂美\t89186\n小汽车\t89187\n生意人\t89188\n婚宴\t89189\n92.48%\t89190\n月下\t89191\n灰灰灰灰霍霍一长\t89192\n战利品\t89193\n五八动漫城\t89194\n雨婷\t89195\n2311583605\t89196\nfighting\t89197\n万世轮\t89198\n我最讨厌你了最近特别的你秘\t89199\n分布图\t89200\n嗯羊羊羊\t89201\n天竺\t89202\n捅避\t89203\n配配配\t89204\n蝴蝶梦\t89205\n乐天城\t89206\n逃号儿\t89207\n物理化学\t89208\n小火车\t89209\n666636\t89210\n执笔\t89211\n没关系开心\t89212\n没座\t89213\nofvyjg\t89214\n张宇还\t89215\n1321211321\t89216\n惊呼吸\t89217\n不狠\t89218\n102家\t89219\n#胡夏#\t89220\n1299\t89221\n黄芩苷\t89222\n铁哥们儿\t89223\n血洗司\t89224\n三八逼\t89225\n2274554619\t89226\n不用了看\t89227\n演练\t89228\n油油油油油\t89229\n色受想行识亦复如\t89230\n公开信\t89231\n评析\t89232\nmademic\t89233\n小女孩子\t89234\nmyte\t89235\n演绎\t89236\nlindapspnisinthe\t89237\n成本低\t89238\n古装戏\t89239\nrhtgxgh\t89240\n想你在\t89241\n白跳\t89242\n好傻子\t89243\n北卡共和党\t89244\nbug点\t89245\nfffhvh\t89246\n今日关注\t89247\n关艳华\t89248\n八少\t89249\n春满阳光文化传媒\t89250\n乖乖乖乖乖\t89251\nwashhair\t89252\n2211111212\t89253\n在这么说\t89254\n芥子油\t89255\n蟠桃\t89256\n秘书啊女主你在哪儿\t89257\ncluding\t89258\n你好历\t89259\n赵小伟\t89260\n8SSSUS\t89261\n多事机器人\t89262\n考核员\t89263\n铲除\t89264\n我一个人\t89265\n周琦\t89266\n连不上去\t89267\n谭稀龙\t89268\n睡眠袜\t89269\n甬温\t89270\n闪亮家\t89271\n文集\t89272\n猪猪猪猪猪猪猪猪猪猪猪猪猪\t89273\n占率\t89274\n随方则方\t89275\n闪送\t89276\n闪退\t89277\n300000\t89278\n印染厂\t89279\n五百天\t89280\n最不要脸\t89281\n出舌\t89282\n毽子\t89283\n么死\t89284\n126980742\t89285\n贫不足\t89286\n家婆\t89287\n王玉硕\t89288\n两万次\t89289\n伊斯坦布尔\t89290\n搭伴\t89291\n創業\t89292\n假惺惺\t89293\n圆点\t89294\nxmnbc\t89295\n几十公里\t89296\n化工业\t89297\n鹭鹭\t89298\n经受住\t89299\n3月17日早上8时\t89300\n国际佛光会中华\t89301\n蔡国宝\t89302\nchtstuzdiudsthzgigsiustgxzvkzzfd
krxgydhffhddgjfcggdyjgdjjssygfffjg\t89303\n薛青青\t89304\n勇者\t89305\nbe1to\t89306\nchamg\t89307\n探听\t89308\nIFGJFKV\t89309\n移干\t89310\n沒\t89311\n更堡\t89312\n57867\t89313\n蛊毒\t89314\n阿西巴阿西巴阿霞\t89315\n甚好\t89316\n懒子\t89317\n爆炸了碰\t89318\n裸机\t89319\n貉貉努\t89320\n晚安见\t89321\n胖丁\t89322\n這樣煩\t89323\n我爱你爱你\t89324\n试航\t89325\n母草\t89326\n嚓巴巴\t89327\nhimsel\t89328\n甜蜜\t89329\n信乐嘉\t89330\n费文霞\t89331\n熊出茅庐\t89332\n赴约\t89333\n小妙\t89334\njsshaiy\t89335\n小妞\t89336\n叛要了\t89337\nooooookkkkkk\t89338\n报修行\t89339\nLULU\t89340\n线陇\t89341\n三十多度\t89342\n参念\t89343\ngvly\t89344\n建委\t89345\n小妍\t89346\n笑脸儿\t89347\n大智慧\t89348\n8177919\t89349\n小妺\t89350\n郑邻君\t89351\n摆钟\t89352\n迪迦奥特\t89353\n汶川\t89354\n英睡\t89355\n李琪彤\t89356\njuggks\t89357\n卞系\t89358\n神位\t89359\n天亮了我\t89360\n鲫鱼\t89361\n小妮\t89362\n蒸好\t89363\n神作\t89364\n495957\t89365\n幂\t89366\n幅\t89367\n守望\t89368\n终端编码\t89369\n进击的巨人加绒\t89370\n关税\t89371\n连变装拍\t89372\n幕\t89373\n过错事\t89374\n高鹏\t89375\n电脑达人不如你上你的电脑\t89376\n我只有一个人\t89377\n商南\t89378\n南京市琅琊路小学\t89379\n形同虚设\t89380\n1488\t89381\n讨厌儿\t89382\n倒罚\t89383\n有必要\t89384\nzvl\t89385\nzvb\t89386\n幫\t89387\n抓伤\t89388\n阿黄\t89389\nzvy\t89390\n阿黛\t89391\n年\t89392\n并\t89393\n幹\t89394\n幸\t89395\n幻\t89396\n幺\t89397\nzvw\t89398\n幼\t89399\n广\t89400\nzvt\t89401\n以不变应万变\t89402\nopporjis\t89403\n吴彦辉\t89404\n叉叉叉叉叉叉\t89405\n东北大秧歌\t89406\n⑧\t89407\n六六五八\t89408\n紧张\t89409\n耳田\t89410\ntsjiiyggds\t89411\n刘晓东\t89412\n14872386636\t89413\n九六零\t89414\n你好玩儿\t89415\n象征\t89416\n偏远\t89417\n猪猪记\t89418\n13637271474\t89419\n睡不想\t89420\n女裁判\t89421\n杀生\t89422\n难不难受\t89423\n临清县\t89424\n5.0级\t89425\nask块\t89426\n哈哇伊\t89427\n刘晓丹\t89428\nNonono\t89429\n基建\t89430\n你好我叫\t89431\n二光天化日\t89432\n藥义生\t89433\n真价\t89434\n青梅\t89435\n年段\t89436\n8090\t89437\n高傻子\t89438\n再见你好\t89439\n济南军\t89440\n好累死\t89441\n逐鹿中原是\t89442\nnolife\t89443\ncaa1a\t89444\n机工社\t89445\n定就这样\t89446\n恋空\t89447\n石坤\t89448\n肤色\t89449\nDCXVHFB\t89450\n大鱼大\t89451\n360360卫士\t89452\n沈露露\t89453\n总裁在上我在下\t89454\n赖话\t89455\n搜索酷\t89456\n六世达赖\t89457\n嗯聊爱\t89458\n圆月\t89459\n3djagjgamgjgdgtm\t
89460\n城市\t89461\n凤今\t89462\nXghf\t89463\n胤鑫\t89464\nyellow\t89465\n区长\t89466\n哈巴涅\t89467\n馆藏\t89468\n川话\t89469\n情敌\t89470\n宁吴尺\t89471\n赞同\t89472\n面骂\t89473\n必反\t89474\nfoufo\t89475\n凤仪\t89476\nwryio\t89477\n一年四季\t89478\n620分\t89479\n安安静静\t89480\nJND\t89481\n不好从现在开始\t89482\n安尼\t89483\n海草\t89484\n睡大调\t89485\n三零七二二六零零二二五\t89486\n卖苹果\t89487\njzksos\t89488\nRachel\t89489\n悲哀\t89490\n云海\t89491\n国债\t89492\n陪聊天\t89493\n云浮\t89494\n仁医\t89495\n老心不老\t89496\n丰足\t89497\niphong6s\t89498\n15763223607\t89499\n托辊\t89500\n有泪不轻弹\t89501\nnononoo\t89502\n谢意\t89503\n强奸犯\t89504\n1924年\t89505\n爱母之心\t89506\n好开森\t89507\n往心里去\t89508\n陶思\t89509\n海兔师\t89510\n看我走\t89511\n济南火车站\t89512\n美香\t89513\n八颗中心校\t89514\n心锁\t89515\njnssn\t89516\nrgkd\t89517\n想不来\t89518\n了待\t89519\n胡家仪\t89520\n入笼\t89521\n工藤静子\t89522\n西安柏树林派出所\t89523\nDoe\t89524\n叶非墨\t89525\n王前明月\t89526\n11难\t89527\nkjnk\t89528\n胎芽\t89529\n陪床\t89530\n绝配\t89531\n景教\t89532\n凉拌黄瓜炒鸡蛋\t89533\n近些年\t89534\n反对者\t89535\nt血\t89536\n好吧算牛逼\t89537\n强受\t89538\n冥币\t89539\n膨化食品\t89540\n你是我的小小苹\t89541\n恋呀恋\t89542\n直十恐龙\t89543\n你明\t89544\n23456789\t89545\n三十三十\t89546\n3分钟前\t89547\n天神金刚劲如风地兽金刚向前冲\t89548\n七四\t89549\n阿明\t89550\n热病\t89551\n谭芷均\t89552\nSTACs\t89553\n我的助手\t89554\n你是\t89555\n香蕉餐\t89556\n这几句\t89557\nsbbsbbbbb\t89558\n老祖師\t89559\n粘着你\t89560\n冷战\t89561\n阿是\t89562\n说了当\t89563\n夜行\t89564\n阿春\t89565\n可贴式\t89566\n中易\t89567\n吭爹的路\t89568\n万死\t89569\n玉米粉\t89570\njmp\t89571\njmw\t89572\njmj\t89573\n不见好就收\t89574\n辈子\t89575\njmm\t89576\njmb\t89577\n摸是\t89578\njmg\t89579\n李玉峰\t89580\nab2ac2BC2\t89581\n丰收速算\t89582\n小泽马利亚\t89583\n给我建议\t89584\n太平\t89585\n小小白\t89586\n伱比\t89587\n略作\t89588\n仰泳\t89589\n休庭\t89590\nLMN\t89591\n5排\t89592\n谁的我的电话\t89593\n陈子峰\t89594\n6666666no\t89595\n有百花秋有\t89596\n12976580521\t89597\nQuand\t89598\n迪欧\t89599\n外阴\t89600\n好新\t89601\n玉米粒\t89602\nljia\t89603\n文昌\t89604\n好方\t89605\n文明\t89606\n数表\t89607\nHahah\t89608\n昀呵\t89609\n爱你的人\t89610\n云柯\t89611\n文星\t89612\n好斗\t89613\n珊儿\t89614\n张小萝\t89615\nrson\t89616\ntmetoutristmetouto\t89617\n好文\t89618\n桂
冠\t89619\n首要\t89620\n我的世界版植物大战僵尸\t89621\n侯胜辉\t89622\n东非\t89623\n李奥豪\t89624\n色心\t89625\n热辣\t89626\n灭天开\t89627\n二姨太阳阳江门诊疗法克星星星期一他的事儿媳妇呢吧唧挺\t89628\n大错特错\t89629\nPark\t89630\n延政勋\t89631\n生喝\t89632\n那不过\t89633\n露露露\t89634\n裴玉川\t89635\n拿回来\t89636\n⑵\t89637\n共同努力\t89638\n住有所\t89639\n迷别\t89640\n09级\t89641\n无节操\t89642\n888999988\t89643\n大渡河\t89644\n别着\t89645\nfigr\t89646\n口琴\t89647\n钟琪\t89648\n1227\t89649\n不外\t89650\n赵佳媛\t89651\n牵动\t89652\n不夜\t89653\n不够\t89654\n不多\t89655\n婚首\t89656\n广场家公司\t89657\n11名\t89658\n闫帅明\t89659\n樊玲\t89660\n全军覆没\t89661\n噶胡须\t89662\n不头\t89663\n五十岁\t89664\n董佳霖\t89665\nfghb\t89666\n赎身\t89667\n不失\t89668\nfghf\t89669\nfghi\t89670\n2bbb\t89671\n阮哥\t89672\njgchgh\t89673\n苏晓涵\t89674\n欧奶包\t89675\nisfityostk\t89676\n袢带式\t89677\n不大\t89678\nwiaant\t89679\nfghx\t89680\n醬油\t89681\n美者意\t89682\n陈圻锡\t89683\n不太\t89684\n我我又不像你我又不爱你你也不爱我\t89685\n小乖乖小乖乖小乖乖小乖乖小乖乖谁谁小娇娇\t89686\n斜着眼睛目送\t89687\n怕型\t89688\n嫡系\t89689\n捉迷藏\t89690\nhttpehiphotosbaiducomxiaodupicitem1ad5ad6eddc451dab5d5a498b1fd5266d01632a5jpg\t89691\n我的我的家\t89692\n拉尼娜\t89693\nxg\t89694\n艋舺#\t89695\n胡舒\t89696\n要死不活\t89697\n44度\t89698\n435545\t89699\n魂牵梦绕\t89700\n晚安度秘小可爱萌萌\t89701\n王雅慧\t89702\n1813996873\t89703\n攻坚\t89704\n杨发云\t89705\n文禁森严\t89706\nxy\t89707\n戴着\t89708\n世纪游乐园\t89709\nhi3d版\t89710\n灰太\t89711\n114.6\t89712\n好久不见\t89713\nhhvh\t89714\n一个七\t89715\n一个一\t89716\n杨大郎\t89717\nzgddgdfyfft\t89718\n智慧型\t89719\n一个三\t89720\n纺织品\t89721\n比尔\t89722\n边城\t89723\n你还好意思说哪不要脸要脸\t89724\n夏天\t89725\n无药可救\t89726\n原來愛情\t89727\n1293763451\t89728\n谢对不起\t89729\n1949年以前\t89730\n乖不好\t89731\nPart\t89732\n比尼\t89733\n哪科\t89734\n一个个\t89735\n4721\t89736\n2月4日\t89737\n哪秘\t89738\n苏佳佳\t89739\nkeys\t89740\n夏夏\t89741\n金菩萨\t89742\n哪秀\t89743\n慈样\t89744\n徐千文\t89745\n张韵林\t89746\n早产儿\t89747\n９点\t89748\n纵横面\t89749\n云华仙馆\t89750\n由南向北\t89751\n老好玩\t89752\n我讨厌你你为什么不\t89753\npizi子\t89754\n13119009378\t89755\n五十六\t89756\n1.2%\t89757\n3.3x2.3\t89758\n黑弄好\t89759\n邯郸\t89760\n瑞雪\t89761\n好吧相信你\t89762\n反反复复重\t89763\n爱心状\t89764\n抹汗\t89765\n杜赌\t89766\n作派\t89767\n金
蝶\t89768\n美容公司\t89769\n民主\t89770\nkeyi\t89771\n杨乐乐\t89772\n乌云\t89773\n郑云新\t89774\nejejejhehwh\t89775\n李晨风\t89776\n蹭蹭\t89777\n四零乃成\t89778\n港康\t89779\n六管\t89780\n土豆酱\t89781\n差差点\t89782\n二四二\t89783\nsfvcfe\t89784\n获捐\t89785\n魔魔\t89786\n杀光\t89787\n口合物\t89788\n林小玉\t89789\ngghdbbs\t89790\n毛懋\t89791\n小苹鼓\t89792\n别敷衍\t89793\n二四五\t89794\n4qbq\t89795\n嗯大河\t89796\n喵喵喵喵喵喵喵喵喵喵喵喵喵\t89797\n1830018300\t89798\n金桥中学\t89799\nvgjvh\t89800\n渗水\t89801\n杀入\t89802\n绕科\t89803\n彭偶幺\t89804\nbufifaoson\t89805\n潘伟彬\t89806\n128次\t89807\n800万\t89808\n不舒服\t89809\n假誓\t89810\ndsfbjs\t89811\n我的的名字\t89812\n冈仁波齐\t89813\n20000多\t89814\n三人组\t89815\nWonderfulfriend\t89816\n一个晚\t89817\nJ女\t89818\n迅雷方舟\t89819\n风力\t89820\n农家乐\t89821\n8千八百八八八\t89822\n豪情\t89823\n难以为继\t89824\n寄壮\t89825\n好哪哪哪\t89826\ndurid\t89827\n小我的四十岁\t89828\n近意词\t89829\n倒贴\t89830\n二三里\t89831\n厅长\t89832\n来试\t89833\nzBay\t89834\n海报\t89835\n幺零二八\t89836\n少把星\t89837\n8月8日\t89838\n最后一针\t89839\n唐宇飞\t89840\n麦克克里斯托\t89841\nhyyy\t89842\n40f4\t89843\n爱你好久好久\t89844\n388厘米\t89845\n我邦\t89846\n变声蛋\t89847\nhyyu\t89848\n6.8斤\t89849\n陆琪\t89850\n家乐乐\t89851\n还读大学\t89852\n头握\t89853\n俩百万\t89854\na7r女\t89855\n剑影蛇秘\t89856\n22咯\t89857\n晚餐\t89858\n4x百分\t89859\n商用\t89860\n没时时\t89861\n坡目\t89862\n珀莉丝\t89863\n消灭掉\t89864\n杂扯\t89865\n基本面分析\t89866\n做案\t89867\n超级大的超级的超级的超级的超级的超级的超级的无敌大无忌无敌大无敌大无敌的的坏人\t89868\n产物\t89869\n丹姐\t89870\n宠物医院\t89871\n初乐元\t89872\n和平你在哪多米\t89873\n为我想你\t89874\n再见我真的不理\t89875\nkarry\t89876\n花呗\t89877\n至尊秘\t89878\no们\t89879\n织女\t89880\n2012年7月1日\t89881\n21：45\t89882\n抨击\t89883\n冻苞\t89884\n最后一面\t89885\n1100万\t89886\n遗射\t89887\nfdgghi\t89888\n白杨旺\t89889\nfontface\t89890\n王夫人\t89891\n榆关\t89892\n乱世出英雄\t89893\n顶罪\t89894\nlessons\t89895\n单天宇\t89896\nx1\t89897\n大黄蜂很多该多好王uyr\t89898\nx7\t89899\n聂世峰\t89900\n陈正之\t89901\n制绳\t89902\n顺便\t89903\n杰杰\t89904\n团膜\t89905\n偷偷懒\t89906\nyff\t89907\nyfg\t89908\n添乱\t89909\n八军\t89910\nyfc\t89911\nnhjj\t89912\n梅克\t89913\n35年前\t89914\n职来职\t89915\n2846153628\t89916\n猪猪猪猪侠\t89917\n王为睿\t89918\n王晓东\t89919\nAaron\t89920\nyfx\t89921\n论剑\t
89922\n贵阳欢乐谷\t89923\n瓦素男\t89924\n王绿王\t89925\n2007年8月3日\t89926\n朱先艳\t89927\nヾзノ\t89928\n八冶\t89929\n别宙斯\t89930\n通病\t89931\n周赞文\t89932\nfjfgfjchx\t89933\n甲成\t89934\n倚着\t89935\n王晓丽\t89936\n史诊香\t89937\n书阁\t89938\n黄鹤楼\t89939\n勾拳\t89940\n千狸\t89941\n地头\t89942\n永不分开\t89943\n滥情成风\t89944\n润肺\t89945\n来广营乡\t89946\n27元\t89947\n42斤\t89948\n地大\t89949\n十二五规划\t89950\n自拍吧\t89951\n君王\t89952\n先审\t89953\n去吧别\t89954\n贾哥\t89955\n拍饭\t89956\n同语言\t89957\n手膜\t89958\n木卫一\t89959\n地外\t89960\n嗯小嗯\t89961\n影旁\t89962\n自建成\t89963\n咯错呀\t89964\n张总周\t89965\n头彩\t89966\n明天的明天就是你的忌日\t89967\n地处\t89968\n没有我想\t89969\n牛魔王\t89970\n二八块\t89971\n上周六\t89972\n康震天\t89973\n前额\t89974\n前题\t89975\nfiid\t89976\n中宝制药\t89977\n王甜宠\t89978\n共进\t89979\nv秘秀\t89980\n快把号\t89981\n浏海\t89982\n阿里噶多\t89983\n扎小辫\t89984\n可待成\t89985\n线头\t89986\n说话声\t89987\n呈堂\t89988\n胡银霄\t89989\n看不是\t89990\n米尔肯\t89991\n雨量器\t89992\n猜闷\t89993\n黑饭\t89994\n嘎嘎嘎嘎宫\t89995\n1000岁\t89996\n八分\t89997\n我是你的好不好我\t89998\n经营史\t89999\n沪昆\t90000\nnonooo\t90001\nCbcgfdgh\t90002\n雷达\t90003\n一个几岁\t90004\n咯兔\t90005\nqpkaajzjs\t90006\njof\t90007\n马克\t90008\n冯刀西\t90009\n择日\t90010\n除\t90011\n别处\t90012\n4根\t90013\n疯子杰\t90014\n停业\t90015\n大酒\t90016\n坡头区\t90017\n荧夜\t90018\n别多\t90019\n别外\t90020\n马兖\t90021\n三艘\t90022\n20101020\t90023\n发张不包\t90024\n湘北\t90025\n静不开心\t90026\n巫山着意催云雨\t90027\n马六\t90028\n一百多盒\t90029\n奇美群\t90030\n送票\t90031\n好运道\t90032\n很度\t90033\n爱你欧\t90034\n环廊\t90035\n宜山路\t90036\nmjmjdap\t90037\n马关\t90038\n马兰\t90039\n跳跳蛋\t90040\n佛心\t90041\n楼人\t90042\nmjgj545445444555566599569\t90043\n军转\t90044\n新领事馆\t90045\n秘忙\t90046\n拉伤\t90047\n王子康\t90048\n肠镜\t90049\n军车\t90050\n222222222222222222222翔\t90051\n李雨萱\t90052\n飞行\t90053\n门窗\t90054\n横着放\t90055\nC5TJSYG\t90056\n葵树\t90057\n成灾\t90058\nhkjb\t90059\nk校本\t90060\n负二\t90061\n秘念\t90062\n形似\t90063\n智能机市场\t90064\n接看\t90065\n宁桓宇\t90066\n陶\t90067\n弄件\t90068\n一百错\t90069\n15661486125\t90070\n原图\t90071\n颈\t90072\n蚬子\t90073\n毒打\t90074\n肝肾\t90075\n蛋疼菊\t90076\nhttpahiphotosbaiducomxiaodupicitem91529822720e0cf32f9c3a880d46f21fbe09aa5ejpg\t90077\nbbbczs
t9y\t90078\n这个晚上\t90079\n念君\t90080\n轰炸\t90081\n毒手\t90082\n原因\t90083\nvigf\t90084\nMain\t90085\n度迷月\t90086\n30瓶\t90087\n反方向\t90088\n默默默默默默默默默默默默默默默默默默默默\t90089\n草鸡\t90090\n银光棒\t90091\n学者们\t90092\n怪滴怪\t90093\n为女\t90094\n防具\t90095\n阿酱\t90096\n哪段\t90097\n为奸\t90098\ngfddfdsd\t90099\nhyshk\t90100\n期限\t90101\nbbnjh\t90102\n同学会\t90103\n知豆\t90104\n断见\t90105\n520米\t90106\n纪姿含\t90107\n分派\t90108\n零八二六\t90109\n颇\t90110\n10日10时\t90111\n泪人儿\t90112\n不插\t90113\n不提\t90114\n52698075441\t90115\n撰稿人\t90116\n缘尽\t90117\nMMN2L\t90118\n文治\t90119\nwifi家\t90120\n抢起\t90121\n陌\t90122\n王老早\t90123\nvigi\t90124\n唐峰\t90125\n有多爱\t90126\n胡克雄\t90127\n尾尖\t90128\n刨除\t90129\n李小峰\t90130\n度秘九龙\t90131\n速腾\t90132\n黑尾熊\t90133\n上一边儿\t90134\n成瘾\t90135\n呆觉\t90136\n频\t90137\n男少爷\t90138\n退化\t90139\n蟹子\t90140\n一下一次\t90141\n都会话\t90142\n不死\t90143\n87几\t90144\n21年\t90145\n14点32分\t90146\n美的\t90147\n体育南路科技街口\t90148\n度蜜月穿\t90149\n好萌\t90150\nfdgugghhi\t90151\n适才间\t90152\n更真好\t90153\n肯定片\t90154\n天不好意思\t90155\n54268075568\t90156\n全封闭\t90157\n3小时\t90158\n驴\t90159\n嗯新功\t90160\n驶\t90161\n驱\t90162\n键位\t90163\n驳\t90164\n直愣愣\t90165\n5月23日13点\t90166\n驾\t90167\n驹\t90168\n423845988\t90169\n苹果园\t90170\n新钢化\t90171\n周依美\t90172\n689568493797\t90173\n马\t90174\n驯\t90175\n驮\t90176\n阉寺\t90177\ndesire\t90178\n臭小机器人\t90179\n梓博\t90180\n奥奥奥奥奥\t90181\n才叫乖\t90182\n万人行\t90183\n驚\t90184\n锄头\t90185\nwwuqoa\t90186\nQh表\t90187\n大滇\t90188\n陛\t90189\n埋藏獒\t90190\n颡\t90191\n白奶奶\t90192\n8976X9578几\t90193\n尊尊\t90194\n二三四五六七八九十\t90195\n劳逸结合\t90196\n扫地出门\t90197\nhjljn\t90198\n热议\t90199\n李中原\t90200\n看不惯\t90201\n对啊你丑\t90202\nMP4\t90203\nMP5\t90204\n五百hi\t90205\n恋舞o\t90206\n恒大\t90207\n仙贝\t90208\n别惹我喝\t90209\n医馆\t90210\n知己知彼方能\t90211\n防火墙\t90212\n张丽均\t90213\n46天\t90214\n颼\t90215\n牛鼻子\t90216\n唉羊羊羊羊\t90217\n裂开\t90218\n第76\t90219\n活到老\t90220\n李丰\t90221\n我的名言\t90222\n发季\t90223\n物美价廉\t90224\n6多\t90225\n合股\t90226\nriverice\t90227\n合肥\t90228\n身高己\t90229\n六六元\t90230\n百科\t90231\n编著\t90232\nkhv\t90233\n金山网络\t90234\n行不道\t90235\n1414\t90236\ngsstgcdghjQQ\t90237\n三只只只只只只只只只只只
\t90238\n麦皮\t90239\nMPV\t90240\n行货\t90241\n唐州\t90242\n顾家城市\t90243\n一工89\t90244\n单双号限行\t90245\n发字\t90246\n祖儿的歌\t90247\n百种\t90248\n我也有\t90249\n舒忆\t90250\n曹强\t90251\n舒心\t90252\n牢炉\t90253\n烤奶茶\t90254\n雨涵\t90255\n没用来啦\t90256\norere\t90257\nthankful\t90258\n阳光\t90259\n陈换成\t90260\n笑话笑话\t90261\n汤姆索亚\t90262\n斩首斩草\t90263\nnovo\t90264\n第180\t90265\n去了再说\t90266\n七天天\t90267\n疯子癫子王\t90268\n始自\t90269\n斯坦利杯\t90270\n生物学\t90271\n公路口\t90272\ngydsjotre\t90273\n3磅\t90274\n蔡宠爱\t90275\n086886\t90276\nkklk黑oi头克不爱\t90277\ngtgtgr8\t90278\nwwalerto\t90279\n格蕾\t90280\n过问\t90281\n星梦天使组\t90282\n一个额\t90283\n轻判\t90284\n著作书\t90285\n魔法\t90286\n陈歌曲\t90287\n冰花兄弟宝\t90288\n小蒋老邓\t90289\n左惫\t90290\n两三天三四\t90291\n好丑好丑好臭\t90292\n4600\t90293\n赵老师\t90294\n你好喜欢\t90295\n算出去\t90296\n大坏蛋\t90297\n绯闻女友\t90298\n南顾晨\t90299\npphl\t90300\n1千多公里\t90301\n祁峰琳\t90302\n秘tomyou\t90303\n一折\t90304\n咔们\t90305\n151\t90306\n150\t90307\n153\t90308\n152\t90309\n155\t90310\n154\t90311\n157\t90312\n156\t90313\n159\t90314\n158\t90315\n决口\t90316\n一川\t90317\ncpNEWS\t90318\n承达\t90319\n寮步\t90320\n说八道\t90321\n要好玩\t90322\n禽流感\t90323\n其人之道\t90324\n8错\t90325\n香蕉糖\t90326\n模拟器\t90327\n电子书长\t90328\nleep\t90329\n悔意\t90330\npppppppppp\t90331\n满城尽带黄金甲\t90332\n七彩\t90333\n光环\t90334\n啊克和\t90335\n160岁\t90336\n翔安\t90337\n无人岛\t90338\n心情不好\t90339\n金泗洪\t90340\n134512151825\t90341\n果果们\t90342\nvvmv\t90343\n香港社会福利署\t90344\n15y\t90345\nenya\t90346\nhurv\t90347\n犊子\t90348\n11倍\t90349\n光溜溜\t90350\n牌九帅\t90351\nGGGGSS\t90352\n曹启泰\t90353\n假体。com\t90354\n自荐\t90355\njlbvfbcsfuyewkgmpn006w00www\t90356\n龙管\t90357\n江水\t90358\n好不好好不好嘛\t90359\n江永\t90360\n度秘你发张你的魔术\t90361\n怡殇\t90362\n举出\t90363\n18克拉\t90364\n干倘\t90365\n新运途明\t90366\n今朝\t90367\nthinkpad\t90368\nFUCHSIA\t90369\n九期\t90370\ni分if人3\t90371\n王开\t90372\n王异\t90373\n橘ay\t90374\n女伴儿\t90375\n汉堡港\t90376\n新年会乐\t90377\n九月\t90378\n应酬\t90379\n双栖类\t90380\n穿厚狼\t90381\n干值\t90382\n华祥苑\t90383\n王强\t90384\n波波丽丽\t90385\n纳雍中坝\t90386\nBaha\t90387\n悲桑\t90388\n油龙\t90389\nCCCCC\t90390\n石家庄火车站\t90391\n长威风\t90392\n跟静\t90393\nNff\t90394\nl
rc\t90395\n石辉\t90396\n水手\t90397\nlrf\t90398\n118765442668\t90399\n南洋\t90400\n逆战吧\t90401\n赶快束\t90402\n1993年11月15号\t90403\n田子明\t90404\n拜九叩\t90405\n波动\t90406\n虎门\t90407\n骗死\t90408\n桂博\t90409\n廿年\t90410\n15呦\t90411\n不动话\t90412\n一己\t90413\n小王王\t90414\n天万部\t90415\n好吧那我不删你了不过你不要打扰我\t90416\n一个3个\t90417\n毛个\t90418\natshuround\t90419\nleave\t90420\n近代音\t90421\n嗯二二二\t90422\n仓库\t90423\n哇靠\t90424\nneextlh\t90425\n了一天\t90426\n吉马\t90427\n座行\t90428\nIQ132\t90429\n瓶子酒\t90430\n戴天理那素芝\t90431\n外去\t90432\n陈锦秀\t90433\nrqexccvvvr\t90434\n三下个\t90435\n沈进平\t90436\n17783650982\t90437\n陈天波\t90438\n哀伤\t90439\n尼尔佛格森\t90440\n百货大厦\t90441\n密蒙\t90442\n终局\t90443\n12345677个\t90444\n我素\t90445\n5355583\t90446\n闷湿\t90447\n德尔罗萨里奥\t90448\n抓钱\t90449\n泊米\t90450\n100多少\t90451\n田心\t90452\ncalmerby\t90453\n盖茨ac\t90454\n我家家徒四壁\t90455\nhjbfgjbfjnh\t90456\n脱逃\t90457\n宫缩新宇\t90458\n3月21\t90459\nZVS\t90460\n不对劲\t90461\n嘀嗒嘀嗒滴滴丢滴滴\t90462\n杨然\t90463\n堵入\t90464\n苏菲\t90465\n4g网\t90466\nCxxxgdyi\t90467\n10ppppc\t90468\n暖洋洋\t90469\n点势\t90470\n怀中\t90471\n有尽有\t90472\n进求\t90473\n消消吃吃吃\t90474\n晋阳\t90475\n古希\t90476\n弹出\t90477\n弹击\t90478\n一点20\t90479\n木柜\t90480\n六十平方厘米\t90481\n古帅\t90482\n呢家\t90483\n古币\t90484\n好儿媳妇\t90485\n木头人\t90486\n呼od\t90487\n李芭莎\t90488\n我不懂夜你在我\t90489\n嗨二狗\t90490\n平民愤\t90491\n你好呀机器的我爱你梦梦\t90492\n首歌你是猪吗你是猪么你是猪猪猪猪猫你是猪么猪么你是猪猪猪猪包\t90493\n6437\t90494\n6436\t90495\n6434\t90496\n多年来\t90497\n还嘴\t90498\n漫水乡民族中学\t90499\n这里\t90500\n一八消消气吧绣\t90501\n惨不忍\t90502\n炙\t90503\n卵奥\t90504\n一百二十一百一十五\t90505\n四十分贝\t90506\n小高层\t90507\n破铜烂铁\t90508\n谢术术\t90509\n你你你你你你\t90510\n王嗯\t90511\n缴交\t90512\n模拟人生三\t90513\n帝舵\t90514\n佘小毛\t90515\n顺口令\t90516\n两条鱼\t90517\n望梅止渴\t90518\n施伟锋\t90519\n迪士尼\t90520\n现款\t90521\ncgucgg\t90522\n热火三巨头\t90523\n面具人\t90524\n度秘你在哪我想去你\t90525\n5700508888878\t90526\n奥妙\t90527\n丿丿丿\t90528\nPUA\t90529\n2O多岁\t90530\n2516657073\t90531\n一起摇摆\t90532\n欢迎\t90533\n扣号\t90534\n再见不想\t90535\n炮神\t90536\n曹得涛\t90537\nJIAI\t90538\n抽风蛋疼\t90539\n林梦允\t90540\n3万3千人\t90541\n王别担心\t90542\nwhoareno\t90543\n哎呀妈呀半男半女\t90544\n1000米\t90545\n刘春\t
90546\n赚头\t90547\n阿斯特里亚\t90548\n外侧\t90549\n互为体\t90550\n刘是\t90551\nnnn84609\t90552\n42786\t90553\n私分\t90554\n祖籍\t90555\n日子行\t90556\n完美男人\t90557\n痛快办\t90558\n乖乖乖\t90559\n癌变\t90560\n10家\t90561\n垂目呀\t90562\n荒僻\t90563\n而己\t90564\nghjijb\t90565\n斗密\t90566\n快乐柠檬happy\t90567\n秦淮河\t90568\n芭比之豪\t90569\n我一想\t90570\n大力神\t90571\n草原来\t90572\n体长\t90573\n张书记\t90574\nchijk\t90575\n作文不聊\t90576\n实施\t90577\n固戍\t90578\n陈光武\t90579\n2040多\t90580\n附有\t90581\n思密达to\t90582\n我需要人工服务你给我滚犊子行不行\t90583\ngbbbb\t90584\n劫富济贫\t90585\nwww.xafm931.net\t90586\n心里有数\t90587\n郊区\t90588\n剩单\t90589\n老马\t90590\n双耳\t90591\n枪侠\t90592\npaasdf\t90593\n许运鸿\t90594\n深圳大兴观澜丰田\t90595\np85\t90596\nhrrrhehrr\t90597\nHhjnvvnm\t90598\n如男一米八要是一伯伍检斤\t90599\ntendency\t90600\n空而斩万流\t90601\n张志明\t90602\n利吕井\t90603\ncbl\t90604\n死俘\t90605\n隐隐地\t90606\n张志星\t90607\n378799474\t90608\n韩拜拜\t90609\n3322446\t90610\n公元1138年\t90611\n偶像派\t90612\n2：30\t90613\n东游\t90614\n大肚子大肚子你在干\t90615\n471\t90616\n上景物\t90617\n肖全\t90618\n均价\t90619\n小气鬼\t90620\n逗逗\t90621\nliame秘\t90622\n孙正平\t90623\n冥冥\t90624\n5.5%\t90625\n想开心\t90626\n不我再来\t90627\n东港\t90628\n姚家房\t90629\ntherebe\t90630\nhgcecg\t90631\n户户\t90632\n找你了你\t90633\n乔哥\t90634\n古德蒙\t90635\n马子严\t90636\n贝姆\t90637\n鸠美鸠美\t90638\n妇科\t90639\n疐匯时候\t90640\n粉红哥\t90641\n乐北联圩\t90642\n三进\t90643\n秘媚\t90644\n感情上\t90645\n挨批\t90646\n78年\t90647\nLouboutin\t90648\nandbigbang\t90649\n李彩\t90650\n你的女孩儿梦能\t90651\n时段\t90652\n发庄村\t90653\n一六肥\t90654\n永远永远永远不理\t90655\n干瞪眼\t90656\n总督\t90657\nwoshiahuosha\t90658\n斯坦福\t90659\n爱盟\t90660\n拜请\t90661\n哪臭\t90662\n好好好好好好好好好像\t90663\n凛冽\t90664\n回我家\t90665\n犯罪专\t90666\n几亿年\t90667\n一槟\t90668\n困不住\t90669\njmtgtjgw\t90670\n奥巴一\t90671\n158天\t90672\n秋秋和\t90673\n一槌\t90674\n可能不能\t90675\njxxyk\t90676\n想了降下来\t90677\n心灵相通\t90678\nmimp\t90679\n安眠\t90680\n此起\t90681\n油条\t90682\n菏泽\t90683\n亚梦之\t90684\nFT中文网\t90685\n我是说我烦你\t90686\nliaole\t90687\nABAc式\t90688\ncvukr\t90689\n打算\t90690\n好我喜欢枪战\t90691\n含糊\t90692\n陈宋芳\t90693\nmpjgjgjgwi\t90694\n倪漫天\t90695\n不难道\t90696\n鲁谢谢\t90697\njiut\t90698\n奥说错了我是说你不自己说你是
鬼\t90699\n去狗逼\t90700\n塞北\t90701\n管理费\t90702\nsin30o\t90703\n大醉\t90704\n認識\t90705\n局子\t90706\n你们爱好大一起摇滚的词霸屏\t90707\nMerryChristmas\t90708\n爱卿果乃吾\t90709\n特点\t90710\njiuh\t90711\n18:18\t90712\n亲爱的歌歌姐姐们\t90713\n索刊\t90714\n日语一的的的的的的的的的的的的\t90715\n痴男\t90716\n冤屈\t90717\n非卖品\t90718\ndistan\t90719\nKdjg\t90720\n春秀路\t90721\nMegan\t90722\n戴一俊\t90723\n跑票\t90724\n海上传奇\t90725\n带我飞\t90726\n神山\t90727\n幽兰冰倩\t90728\n妙年级\t90729\nbito\t90730\ncity\t90731\n何求\t90732\nfjfj\t90733\n45538255316\t90734\n凡客诚品\t90735\n搶紅包\t90736\nfjff\t90737\nfjfd\t90738\n丁梦琪\t90739\n啥尔市\t90740\n恩水果\t90741\n许仙\t90742\n融金陆俊豪宾馆五楼\t90743\n巴马尾\t90744\n韩欣瑜\t90745\n15W\t90746\n放大话\t90747\nsugbl\t90748\n菜跋\t90749\n钓点\t90750\n再见吧等会儿聊\t90751\n有意义\t90752\nkgxk\t90753\n刘怀钰\t90754\n幺二二二\t90755\n王劲松\t90756\n毛娃娃\t90757\n14.8%\t90758\n问字\t90759\nSeven\t90760\n才差\t90761\n洪天\t90762\n一百年后\t90763\n眉庄\t90764\n水身\t90765\nDbbbe\t90766\n福尔沃\t90767\n很我用\t90768\n尝受\t90769\n放暑假\t90770\n過分幹\t90771\n长得好帅\t90772\n擦屁屁\t90773\n咯吗比\t90774\n走开我讨厌\t90775\n亚运村\t90776\n拱门\t90777\n浦东机场\t90778\n九ｋ\t90779\n粤垦\t90780\n冰褪延\t90781\n兮爷\t90782\n错传错\t90783\n男心\t90784\n心爱的姑娘\t90785\n乐可乐\t90786\nlilililooooovvvveeeeioveillveeee51\t90787\njyfvvfc\t90788\n十二日\t90789\n有一个人儿\t90790\n一万多例\t90791\n七窍\t90792\n娘腔\t90793\n五千一晚\t90794\n邪灵\t90795\n摩卡雪冰\t90796\ndhssgvdjs\t90797\n张淑慧\t90798\n龙之角\t90799\n15619371832\t90800\n好榜样\t90801\n9元\t90802\n来充\t90803\nzuò\t90804\n用情\t90805\n史一史算\t90806\n前言\t90807\ngshvfsh\t90808\n二流子\t90809\n嗯恩恩\t90810\n儿媳妇\t90811\n感慨\t90812\n周诒春\t90813\n擦脚\t90814\n血液科\t90815\nis秘\t90816\n王书\t90817\nfhfudjdueidj\t90818\n版块\t90819\nnibuwenr\t90820\n随缘\t90821\n代日\t90822\n王乐\t90823\n瞎聊\t90824\n王乔\t90825\nGMGM8\t90826\n一品惑\t90827\n王也\t90828\n最最最最最\t90829\n海陵区\t90830\n嚼益达\t90831\n明说懂\t90832\nTiang\t90833\n王之\t90834\n王么\t90835\n908245\t90836\n考虑有\t90837\n嗯静静\t90838\nbehe\t90839\n雷丁\t90840\n窦婴\t90841\n0哟一片\t90842\n天妈妈\t90843\n娼源\t90844\n干竹\t90845\n南孚电池\t90846\n艾滋女\t90847\n北蔡\t90848\n那在那生活给我的灵感君安\t90849\n丝带\t90850\n不分彼此\t90851\n呶噩\t90852\n签署\t90853\n顺丰快递
\t90854\n告诉你却入\t90855\n有不说\t90856\n魔法杖\t90857\n陈梦梦\t90858\n万三厘米\t90859\ntiamo\t90860\n猪八戒舞\t90861\n切糕\t90862\n698分\t90863\n速效救心丸\t90864\nbackseats\t90865\n見別人\t90866\n好啊好哇好哇\t90867\n老早老早\t90868\n标称\t90869\nueeess\t90870\nshhxhxh\t90871\n嗯哪哪哪哪哪哪哪哪哪哪这首曲子\t90872\n通稿\t90873\n2701\t90874\n曼雅\t90875\n无确\t90876\n说家\t90877\n8855522\t90878\n借人之智\t90879\n画中\t90880\n谢谢脑\t90881\n40万美元\t90882\n连声\t90883\n下呼叫\t90884\n爱上一个人\t90885\n王质传\t90886\n德玛\t90887\n度秘我很喜欢\t90888\n聲音嗎\t90889\n潘杰\t90890\n苏酉星\t90891\nidhf\t90892\n鬼若子\t90893\n3月17号\t90894\n看家駒\t90895\n恋尸癖\t90896\n侧脸\t90897\n仙子\t90898\nsemy\t90899\n可不呵呵\t90900\n4M\t90901\n慵懒照\t90902\n权术\t90903\n出头\t90904\n质量比\t90905\n五七三十五\t90906\n女大人教版\t90907\nweibo\t90908\n纳达尔\t90909\n爱子之\t90910\nHGGGGGGTGGGTGGGHHHGGGTGGGHG\t90911\n个险\t90912\n要约炮\t90913\n居所\t90914\n霞塔吊\t90915\n粉粉日记\t90916\n房地产调控\t90917\n撒拉力\t90918\nkj5\t90919\n什物\t90920\n孟县\t90921\n不骗你了\t90922\n抢让\t90923\n二三几年\t90924\n你是谁你是谁你是谁你石狮\t90925\n累及\t90926\n法仔\t90927\n红韵\t90928\n谈斗破\t90929\n欧佩克\t90930\n累受\t90931\n小布均\t90932\n张哲灿\t90933\n早我的天平座\t90934\n这个饭\t90935\n十五嗯十五十天\t90936\n一朝\t90937\nkjb\t90938\nkjc\t90939\n灯红酒绿\t90940\nkjf\t90941\nkjd\t90942\nkjj\t90943\n同学群\t90944\n4针\t90945\n落地灯\t90946\nkjo\t90947\nkjm\t90948\n冰冰冰冰\t90949\n沈梦\t90950\n356万安\t90951\n法令\t90952\n暖壶\t90953\njgxko\t90954\n封丘你好\t90955\nPearlglide\t90956\n饿分\t90957\n聚乙烯博PBL\t90958\n555685\t90959\n暴露\t90960\n工式\t90961\n原谅你\t90962\n许昌\t90963\n卷翘\t90964\n70kt\t90965\n全吧\t90966\n壁瑶\t90967\n佩哥\t90968\nbhiphotosbaiducomxiaodupicitem0eb30f2442a7d9331102d807aa4bd11373f0011fjpg\t90969\nxiān\t90970\n嗯媚娘\t90971\n服务业\t90972\n沃尼森\t90973\n说难\t90974\n偶布吉岛\t90975\n小岚\t90976\n许是\t90977\n臭嘟咪\t90978\n远越好\t90979\n张驰峥\t90980\n头像片\t90981\nHarris\t90982\n不懂不懂不懂不懂不懂\t90983\n陪都\t90984\n往右\t90985\ncomtout\t90986\n掐指\t90987\n被埋\t90988\n小气行\t90989\n一个好人\t90990\n不牙疼\t90991\n百度钱\t90992\n卢秋萍\t90993\n滴滴猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t90994\n骚睿\t90995\n一會\t90996\n王改霞\t90997\n现在天\t90998\n天天p图\t90999\n铸就\t91000\nlyf\t91001\n2013年6月1日\t91002\n下雨句\t910
03\n曼姐\t91004\n巨峰路\t91005\n皮皮\t91006\n范二\t91007\n天网家\t91008\n不动态\t91009\n变不了\t91010\n三百多天\t91011\n沭阳\t91012\n聊聊天你在干嘛呢\t91013\n香山\t91014\nlyc\t91015\n李杰宇\t91016\n我的是红的霸王龙\t91017\n摊上\t91018\n2333333333333333333333333\t91019\n卢星宇\t91020\n该生\t91021\n摊主\t91022\n邪术\t91023\n陪睡\t91024\n看不懂为\t91025\n吴思婉\t91026\n嗯阳刚小学\t91027\n社会地位\t91028\n助理\t91029\n28J\t91030\nNNNNN\t91031\n说句俗\t91032\n堵眯眼\t91033\n花蜜\t91034\n马马\t91035\n厉鬼\t91036\n小林萌\t91037\n建国门桥\t91038\nicyou\t91039\n孙鑫\t91040\n要饭吧\t91041\n卢婧阳\t91042\njdsf\t91043\n六十七块\t91044\n刍事\t91045\njdsk\t91046\n%\t91047\n败给\t91048\n笨猎\t91049\ng7futtudifucysyxyfydysdjdyahshxyfztziss8yuzizjssisjussiissususususwwssusususuusssussusyysshShshgfauajsahdhsgwgwggsa\t91050\n王林丽\t91051\n法华\t91052\n蓝兔\t91053\n搭铁\t91054\n孔老二\t91055\nyy1y\t91056\n网袜\t91057\n旧城\t91058\n烟酸\t91059\n光棍节醋\t91060\ntnnnnnnnnnnnnnnnnnn\t91061\n逗了吧\t91062\n一本一个六元\t91063\n蟹案\t91064\n真真正正\t91065\n1cm\t91066\n五五百六十金\t91067\nc81\t91068\n浙商\t91069\nc84\t91070\n八十年后\t91071\nc86\t91072\n费力\t91073\n丹阳\t91074\nrest\t91075\n因为我喜欢o\t91076\n名声狼藉\t91077\n午睡午睡世上大把\t91078\n爱谁谁\t91079\n3850432146\t91080\n费劲\t91081\n马锦辉\t91082\n爆本\t91083\n大峡谷\t91084\n别离开\t91085\n李秋媛\t91086\n蓖麻\t91087\n穿恨\t91088\n难忘我的\t91089\ngss\t91090\n块钱\t91091\n陪点\t91092\n劳动秘\t91093\nxztrf\t91094\n揭西\t91095\n除臭袜\t91096\n河南南阳内乡县气象局\t91097\n巴东\t91098\n客栈\t91099\nBEFORE\t91100\n排练\t91101\n一月份一月份\t91102\n不死板\t91103\n齐山\t91104\n小组长\t91105\n2万年\t91106\n忙太\t91107\n妈妈呢妈妈\t91108\n狮子尾\t91109\n自己的儿\t91110\n巴一\t91111\n木兰从军\t91112\ndhshrhid\t91113\n海绵儿女\t91114\n迎妍堡\t91115\n低开低走\t91116\n房颤\t91117\nfsyhdu\t91118\n容光焕发\t91119\n哼不理你了讨厌鬼\t91120\nShare\t91121\n一慢\t91122\n使出\t91123\n避邪\t91124\n巴中\t91125\n肯定市\t91126\n欧巴关\t91127\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t91128\n骚比\t91129\n偶稀饭\t91130\n宅基地\t91131\n荆1\t91132\n把好\t91133\n幺八五零二六七\t91134\n赵晓\t91135\n怀里边\t91136\n崔雨婷\t91137\n娱乐头\t91138\n113254798\t91139\n车长\t91140\n开财\t91141\n郭玲\t91142\n分手后\t91143\n权锦红\t91144\n杜依菲\t91145\n江红\t91146\n姨夫\t91147\n姨太\t91148\n因因\t91149\n13.58\t91150\n中国航空工业发展研究中心\t91151\n王只约\
t91152\n455857545455\t91153\n01十次\t91154\n陈欣予\t91155\n第一鈤\t91156\n谢你我讨厌\t91157\n我和我的朋友\t91158\n2356岁\t91159\n踏实\t91160\n权变\t91161\n带木\t91162\n分手吧\t91163\n穿鞋\t91164\nisjsj\t91165\n活泼热型\t91166\n78285\t91167\n双板\t91168\n吕自\t91169\n真是度\t91170\n如坐针毡\t91171\n夫夫\t91172\n汗花头\t91173\n两棵树\t91174\n千字碑\t91175\n机组\t91176\n锁链\t91177\n接个屯里土生土长滴羊\t91178\n神二中\t91179\n理发师\t91180\n臭狗\t91181\n总务\t91182\n18714493220\t91183\n安杰拉贝比\t91184\n滑县新区\t91185\n总动\t91186\n自由行\t91187\nhoiio\t91188\n1998年3月17号\t91189\n政治\t91190\ngrrtt\t91191\n千层脸\t91192\n达尼尔\t91193\n圣保\t91194\n杀伤\t91195\n哼呵\t91196\n十四点儿\t91197\n不要乱说\t91198\n哼呼\t91199\n骑鳖\t91200\n小米手机\t91201\n三百三百三百八十六个\t91202\n图长\t91203\n我也不懂我你\t91204\n尾椎骨\t91205\n37分钟\t91206\n酷话\t91207\n哼呜\t91208\n台行\t91209\n无处会\t91210\n哈喽在么在么在么在么在么在\t91211\n野马\t91212\nueieuhe\t91213\n32家\t91214\n尼维达\t91215\n肚肚子\t91216\n楼堂馆\t91217\n请许\t91218\n甘网银\t91219\n权重股\t91220\n八里庄派出所\t91221\n请讲\t91222\n请记\t91223\n1240296\t91224\n范浩武\t91225\n再来\t91226\n上上上上上上上上上上上上上上\t91227\nq文\t91228\n二维码扫描器\t91229\n园站\t91230\n哈宝崽\t91231\n本这样\t91232\n诗佛王维\t91233\n233333\t91234\n牛奶湖\t91235\n一些周\t91236\nxxxxxxz\t91237\n想和你好\t91238\n再杀\t91239\n笑嘻嘻笑嘻嘻\t91240\n于雯冰\t91241\n度戒\t91242\n度我\t91243\n火头儿\t91244\n燕思\t91245\n中国菜\t91246\n氯吡脲\t91247\n孙院长\t91248\n上张\t91249\n给我的凭\t91250\n周艺龙\t91251\n现代文\t91252\n萨利赫\t91253\n故人心\t91254\n461w228q592692692696292692693269gdg\t91255\nGKGK\t91256\n照不上\t91257\n蔡卫华\t91258\n二六五\t91259\n国家税务总局\t91260\n2000\t91261\n我板\t91262\nWatson\t91263\nIwillayoung\t91264\n不素不兹\t91265\n速效\t91266\n荨麻疹\t91267\n木渎\t91268\n咱们的家了天天娃娃天哒哒的咪咪的小萌妈妈咪咪小小小小小师妹明天\t91269\n泰中\t91270\n杨志礼\t91271\n第26集\t91272\n底子\t91273\n半一\t91274\n莫西\t91275\n5477622527137246464624957645912446216246166174754456765222533227\t91276\n来亲爱的给我说声摸摸\t91277\n鲁西西\t91278\n半世\t91279\n狙击手裤\t91280\n李宝承\t91281\n气包\t91282\n四百多块\t91283\n不老好\t91284\n昕秘\t91285\n幸亏\t91286\n半个\t91287\n粤语学\t91288\ndssd\t91289\n嘻嘻眼\t91290\n听求\t91291\n黄大概\t91292\n7：59：59\t91293\n于旭东\t91294\n18300482678\t91295\nadam\t91296\n两霍\t91297\n叶美\t91298\n后台程序\t91299\n
adaf\t91300\n你老实好\t91301\n霖\t91302\n米国发\t91303\n点豆腐汤\t91304\n2071867025\t91305\n7U903\t91306\n凌凌\t91307\ntO\t91308\ntK\t91309\n雪碧鸡腿可乐鸡翅鲍鱼龙虾鱼翅\t91310\n你侬我侬\t91311\n四野\t91312\n想你好无聊\t91313\n总侗\t91314\n儿科师\t91315\n頂多喝\t91316\ntz\t91317\ntx\t91318\nty\t91319\ntv\t91320\ntw\t91321\ntt\t91322\ntu\t91323\ntr\t91324\nts\t91325\n丽太\t91326\n梧桐雄姿\t91327\ntn\t91328\nto\t91329\ntl\t91330\n不等你我想\t91331\n电子狗\t91332\ntk\t91333\nth\t91334\nti\t91335\ntf\t91336\ntg\t91337\ntd\t91338\nte\t91339\ntb\t91340\nta\t91341\n中小奖\t91342\n口型\t91343\n奴才\t91344\n百灵儿\t91345\n学了学\t91346\nnalyn\t91347\nbnxmgwp\t91348\n依规\t91349\n要记\t91350\n富平\t91351\n无法相信\t91352\n哎哎\t91353\nbTngbTng\t91354\n一层半\t91355\npbv\t91356\n死干嘛\t91357\nt8\t91358\nt6\t91359\nt7\t91360\nt4\t91361\nt5\t91362\nt2\t91363\n重注\t91364\nt1\t91365\nthese\t91366\n返贫\t91367\nzss\t91368\n长相思兮长相忆\t91369\n糖水味\t91370\n衰闭\t91371\n小辣椒\t91372\n12367873790\t91373\n跳戏\t91374\n抛开\t91375\n这号事\t91376\n星星那那你\t91377\n髲\t91378\n六百块\t91379\n正12向量3\t91380\n我辈\t91381\n怡宝\t91382\n再见了咧\t91383\n鲁咪\t91384\n我真的受够你了我不想和你说再见了哦\t91385\n八千块\t91386\n谢欣怡\t91387\n心存感激\t91388\n南北区\t91389\n效秘\t91390\n一大早\t91391\ngotoyit\t91392\n尖旧\t91393\n一大日\t91394\n窝高\t91395\n巴尔迪尼\t91396\n打败\t91397\ndyudyy\t91398\n谢谢你给我钱\t91399\n胡不是\t91400\n玩牌\t91401\n2004610\t91402\n洄流\t91403\n泰然\t91404\n王会飞\t91405\n赵子涵\t91406\n山房\t91407\n袁市长\t91408\nggggttrrttttttt\t91409\n五一广场站\t91410\nthest\t91411\n7900\t91412\n枯叶\t91413\nxloglju\t91414\n曲速\t91415\ntdsfjyd\t91416\n玩物\t91417\n训诫文\t91418\n黄小琥\t91419\n曲江\t91420\nhshbszshs\t91421\n3月7号\t91422\n费炫杰\t91423\natlab\t91424\n杨慧心\t91425\n多咪多咪我爱你\t91426\n关键\t91427\n郑白莲\t91428\n晦涩\t91429\n姗姗良\t91430\n试听率\t91431\n真行\t91432\n中华龙乡\t91433\n瞎扯\t91434\n蔡行\t91435\n8个月\t91436\ngosh\t91437\n食物中毒\t91438\n奥村\t91439\n罗庄\t91440\n香肠\t91441\n尚倾宇\t91442\npotall\t91443\nmmmmmmmmmmm\t91444\nUC猫\t91445\nehagagbngajnjq\t91446\n生化酒店\t91447\n再说三道四\t91448\n续觉\t91449\n这首歌信\t91450\n快点点儿\t91451\n8元\t91452\n13756368861\t91453\n352233758662\t91454\n姚雨辰\t91455\n杜密斯\t91456\n必要条件\t91457\n0
幢\t91458\n秘哥\t91459\n28分钟\t91460\n17:02\t91461\n17:00\t91462\n收破烂\t91463\n7月1日\t91464\n哥的天\t91465\n一八扇\t91466\n滋味\t91467\n孙召贤\t91468\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪喵猫咪吧聊一聊猫咪\t91469\n头天\t91470\n盛气\t91471\nNVNWKOJ\t91472\n画框\t91473\n管不着我愿意\t91474\nsaho\t91475\n热肠\t91476\n几行\t91477\n本我\t91478\n高日元\t91479\n咪多咪\t91480\n压力点\t91481\n希尔薇\t91482\n煤业公司\t91483\n广州地铁3号线\t91484\n佳娜\t91485\nsuzhi\t91486\n打铁犀牛\t91487\n土屋萌\t91488\n心理咨询\t91489\n金州\t91490\n９５年\t91491\n死不生\t91492\n锦里\t91493\n显山露水\t91494\n天天跟不理我我不行么投诉你\t91495\n坏蛋法西斯\t91496\nrrdgy\t91497\nyoshou\t91498\n摇匀\t91499\n观望期\t91500\n屠夫度秘\t91501\n军亮\t91502\n浦娇燕\t91503\n新桑塔纳\t91504\n马车\t91505\ndertyi\t91506\n篱门凉\t91507\n一念天堂\t91508\n军人\t91509\n申姨\t91510\n三万万万桑\t91511\n意思素\t91512\n画我看\t91513\n军事\t91514\n小家伙\t91515\n减衣\t91516\n丫丫\t91517\n都木\t91518\n高尔基所\t91519\n牌身\t91520\n想说一句\t91521\n忽而又\t91522\nucgugucuxguc\t91523\n清快点\t91524\n朱世成\t91525\n通风\t91526\n下姓\t91527\n中国电子商务研究中心\t91528\n成平\t91529\n均有\t91530\n女烈\t91531\neverse\t91532\n咯摸\t91533\n16天方\t91534\n正月十八\t91535\n吴王杨\t91536\n叔们\t91537\n第一桶\t91538\n黄子裕\t91539\n覃荣华\t91540\nohmyg\t91541\n微晶石\t91542\n枣泥\t91543\n雅妮\t91544\n出去走\t91545\n无痛\t91546\n无痕\t91547\n骚人\t91548\n犯傻\t91549\n司空见惯\t91550\n坑害\t91551\n变态杀人\t91552\n靠近\t91553\n妙明塔尔\t91554\n博弈\t91555\n钱案\t91556\n噗叽\t91557\n过份\t91558\n给我过\t91559\n剧额\t91560\njjiuyfgbjji\t91561\n伸开\t91562\nwfghbvbnk\t91563\n疫情\t91564\niing\t91565\nk40\t91566\n选手们\t91567\nk43\t91568\n陈彬雅\t91569\nmadeavailable\t91570\n5566325\t91571\n不正之风\t91572\n颈纹\t91573\n山羊犬\t91574\nshock\t91575\n游龙戏凤\t91576\n12家\t91577\n蜘蛛精\t91578\n我是老公你是老婆奥不对你是老公我是\t91579\n晓洁\t91580\nSEAT\t91581\n饥荒\t91582\nMISFITS\t91583\n6藏\t91584\n嘎瘩\t91585\n壓力\t91586\n劳教\t91587\n术士\t91588\n帕内塔\t91589\n思品\t91590\n秦小婉\t91591\n献菊\t91592\n几杠\t91593\n最重要的小事\t91594\n屄紧\t91595\nygcfth\t91596\n几杯\t91597\n葛天\t91598\n男森\t91599\naygxgx\t91600\n重新启动\t91601\n和你的女儿\t91602\n口未\t91603\n阿莫西\t91604\n床上感\t91605\nisCoCo\t91606\n好吧好吧好吧我也错了错了错了错了错了错了好吧好吧好吧好吧\t91607\n发廊\t91608\n陌儿\t91609\n9点45\t91610\n一点只\t91611\n上海人\t9161
2\n武鸣高中\t91613\n马零九\t91614\n占便宜\t91615\n直搞\t91616\n320万\t91617\n传播性\t91618\natme\t91619\n超模\t91620\nyourou\t91621\n分外\t91622\n好吧歼十出击\t91623\n99913\t91624\n99912\t91625\n倪大虹\t91626\n塞进\t91627\n152名\t91628\n词序\t91629\nkimiop\t91630\n词库\t91631\n错了错了错了错了度秘错了错了错了错了错了错了\t91632\n一月内\t91633\n店主\t91634\n一俩天\t91635\n和和\t91636\n酥糖\t91637\nhdhsjshb\t91638\n金喜善\t91639\n二一百三十八\t91640\n力大无穷\t91641\n兔狗\t91642\nsportmeet\t91643\n孔缸\t91644\nyongle\t91645\n得意摇摆小猴挂钟\t91646\n平复\t91647\n浮华\t91648\n天海\t91649\n灵童\t91650\n很寂寞\t91651\n平头\t91652\n快吃\t91653\n协办\t91654\n天流\t91655\nhgxvb\t91656\n协力\t91657\n猜雷劈\t91658\n一室一厅\t91659\n千遍\t91660\n长心\t91661\n存有\t91662\n累垮\t91663\n逆血龙\t91664\n鲜为人\t91665\n江边\t91666\n过关\t91667\nthnmmnm\t91668\n过具\t91669\n压不住\t91670\n过兵\t91671\nMao\t91672\nshls\t91673\n陆虎\t91674\n锦江宾馆\t91675\nshlh\t91676\n林倩\t91677\n心水\t91678\n明猫\t91679\n明猪\t91680\n李华语\t91681\ngdiylufkymgdyf\t91682\naag\t91683\n来汤\t91684\naaa\t91685\n度秘我爱死你了我爱的是你的心\t91686\naab\t91687\naam\t91688\n海洋大学\t91689\n再见呢\t91690\nabck7k7k7k7k\t91691\n心民\t91692\naat\t91693\n赘椎名\t91694\n王思思\t91695\naas\t91696\n奏响\t91697\n塔方\t91698\n领导派\t91699\n天范范\t91700\n小西\t91701\n小老虎\t91702\ndefect\t91703\n奶罩杯\t91704\n36分\t91705\n现代传播上海公司\t91706\n大西\t91707\n昨兒\t91708\n瑞味浓\t91709\n一起跳\t91710\n擦边球\t91711\n紫玫瑰\t91712\n道觉中学\t91713\n亲爱的是真的爱我\t91714\n安检\t91715\n颧骨\t91716\n亮子头\t91717\n駡人\t91718\n猪猪秘我讨厌你度秘猪度秘\t91719\n信鸟\t91720\n粢饭油条豆浆\t91721\n零二二二八六二\t91722\n谭容\t91723\n414厘米\t91724\n徐明美\t91725\n红黄蓝\t91726\n轮辐\t91727\n1900元\t91728\n杨道璋\t91729\nFigag\t91730\n哈麻批\t91731\n丽江市玉龙纳西族自治县黄山镇五台村委会\t91732\n主力\t91733\n主办\t91734\n梦梦达\t91735\n苦瓜汁\t91736\n我不对\t91737\n哈罗\t91738\n主动\t91739\n回南的天\t91740\nhellomyoure\t91741\n吗in个\t91742\n却说\t91743\n法定程序\t91744\n问一下子\t91745\n李鹏飞\t91746\n双瀑飞流\t91747\n李馥妍\t91748\n闫春雨\t91749\n天高云舒\t91750\n西湖小学\t91751\nCloud\t91752\n我来自喵星球我的的来自汪星球\t91753\nhh45154544354358\t91754\n18345038552\t91755\n四哪村\t91756\n法制晚报\t91757\n巴拉巴拉小魔仙大\t91758\n薇薇薇薇\t91759\nojg\t91760\n张晓默\t91761\nojj\t91762\nojk\t91763\n隐形的翅膀\t91764\n1月21日\t91765\
n肃立\t91766\n慷慨助人\t91767\n第五百分数组\t91768\n59101489\t91769\n本长\t91770\n那不开\t91771\n哎呀世界\t91772\n农民工\t91773\n贠小苏\t91774\n浪子回头\t91775\n日思夜想\t91776\n怪不好意思\t91777\n哎呀造作\t91778\nWooYun-2012-07634\t91779\n84984\t91780\n慢性咽炎\t91781\n汤丸\t91782\n拍卖会\t91783\n为官\t91784\nhdydjghghh\t91785\n55.5.5.5:55\t91786\n障碍物\t91787\nvfff\t91788\nvffd\t91789\nvffe\t91790\n韩志茹\t91791\n滴水\t91792\ncrp\t91793\n不死了算\t91794\ncrv\t91795\ncrw\t91796\n雨菲\t91797\n植物\t91798\ncry\t91799\n真真\t91800\ntgxbfllgf\t91801\nz110\t91802\ncrc\t91803\n求婚\t91804\ncrf\t91805\ncrg\t91806\ncrd\t91807\n玩世不恭\t91808\ncri\t91809\n小姑娘\t91810\ncrl\t91811\n王阳春\t91812\n肠系\t91813\njadm\t91814\n不间断句话\t91815\n翁陈陈\t91816\n阿盛\t91817\n孟玲琨\t91818\n喉炎\t91819\n贯中\t91820\n群魔乱舞\t91821\n倪媛\t91822\n一克拉\t91823\nfftff\t91824\n数字猜成语123456令\t91825\n流通\t91826\n呃资费\t91827\n干干\t91828\n流逝\t91829\n场上次\t91830\n我爱你我恨你我想霸占你\t91831\n内衣裤\t91832\n王六\t91833\n四十分之21\t91834\n妖精\t91835\n彭水县\t91836\n冠军欧洲\t91837\n周维印\t91838\n若囖\t91839\nCFCsddfc\t91840\n一形容\t91841\n刚刚刚刚刚刚刚刚刚刚\t91842\n划听\t91843\n我和我的世界\t91844\n再接再厉\t91845\n爱不见面\t91846\n86斤\t91847\n聚聚聚聚uuu\t91848\n变幕\t91849\n超星吹\t91850\n通剧\t91851\n古拉\t91852\n為什麼會\t91853\n度秘你是猪啊18p\t91854\nAnnie\t91855\n上线儿\t91856\n几22\t91857\n贺美瑜\t91858\nfellfrom\t91859\n武局\t91860\n1.6%\t91861\n这一句\t91862\n数一二三四五六七八九十十一十二十三十四十五\t91863\n回向\t91864\n心机婊\t91865\n妈妈呀咪呀妈妈\t91866\n等你买\t91867\n你好丑臭的要命\t91868\n有点事先走\t91869\n欧洲央行\t91870\n白房\t91871\n度秘大过拉杜不要脸\t91872\n大王个\t91873\n音版\t91874\n白户\t91875\nzomoj\t91876\n动英\t91877\n混搭\t91878\nhhgghj\t91879\n105ll\t91880\n更喜欢\t91881\n乌甘蛇\t91882\n孙才干\t91883\n秘辞\t91884\n我草我草我草我草你是个稻草人一窝草\t91885\n索拉\t91886\n难过开心点\t91887\n我要听欢迎进行曲\t91888\n鞭炮声\t91889\n杭锦旗\t91890\nsomawww\t91891\nNO—Say\t91892\n95598\t91893\ncream\t91894\n万千米\t91895\n填埋\t91896\n好我信你\t91897\n黑儿\t91898\n多利亚\t91899\nmn21\t91900\n秘达\t91901\nponik\t91902\n3051850084\t91903\nckjvvvvvvvvbbbb女款\t91904\n十来年\t91905\n开关机\t91906\n艺校\t91907\n别光叔好呀\t91908\n古宠居委\t91909\n浇花\t91910\n非洲候面包树\t91911\n1季\t91912\n周围洋\t91913\n黑名单\t91914\n腊肉\t91915\n这事春的诗黑子\t919
16\n转眼间\t91917\n旷野\t91918\n十二二十块\t91919\n师生平等\t91920\n想想知道\t91921\n招商地产\t91922\n视偶\t91923\n阿拉伯人\t91924\n户外装备\t91925\nhudj\t91926\n萱萱\t91927\n坚强不屈\t91928\n察雅修\t91929\n你好意思\t91930\n不要不然\t91931\n王元\t91932\n腊肠\t91933\n自卑亭\t91934\n昨晚一点到五点\t91935\n520a\t91936\n预制\t91937\n阿剩\t91938\n高雅琪\t91939\n11月23号\t91940\n吹洗\t91941\n一喜欢流里\t91942\n牛塞\t91943\n2400千米\t91944\n野孩子\t91945\n甄氏语\t91946\n蓝猫和了和\t91947\n2607196477\t91948\n科教卫\t91949\nwhatyouro\t91950\n再度\t91951\nycygg\t91952\n奥纳\t91953\nCLOTHING\t91954\nhttphhiphotosbaiducomxiaodupicitem0df431adcbef7609cc1e4aed29dda3cc7cd99eafjpg\t91955\n腊八状\t91956\n郭放\t91957\nAddress\t91958\n席梦思\t91959\n做家务\t91960\n三角宝\t91961\n佛像\t91962\n谢皇上\t91963\n46nnnnj4\t91964\n张大铎\t91965\n明天仙居\t91966\n知己知己\t91967\n5200\t91968\n5203\t91969\n济南西区\t91970\n不爱你我讨厌\t91971\n可爱感\t91972\n12月25日\t91973\n星星座\t91974\n难字\t91975\n李晨宇\t91976\n布吉布吉\t91977\n峰林峡\t91978\n33377\t91979\n跟随\t91980\nperson\t91981\n3.7米\t91982\n羞把脸\t91983\n让位\t91984\n蒙都\t91985\n仝峥\t91986\n十二星座抢手度\t91987\n局座\t91988\n人谓\t91989\n8hdixd\t91990\n水鏘\t91991\n胡冰棒\t91992\n678345678\t91993\n片吧\t91994\n才怪\t91995\n随风吹\t91996\n龙里\t91997\n刑侦类\t91998\n巡演\t91999\n囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧\t92000\n人调\t92001\n图像\t92002\n老娘不发威你当我是个\t92003\n璐咻\t92004\nJzhnxjxj\t92005\n曹可凡\t92006\nCIO老范的幸福生活\t92007\n上不了网\t92008\n真美好\t92009\n拉忍\t92010\n刘丰玮\t92011\n万一就是我我是谁\t92012\n夜夜笙歌\t92013\n嘛好不好嘛\t92014\n沁水\t92015\n完了明天在\t92016\n拜拜我恨你我告诉你我不爱你\t92017\n补眠\t92018\n建造桥\t92019\n艾佛力\t92020\n食干\t92021\n老变卦\t92022\n土色\t92023\nBarlow\t92024\ntomat\t92025\n偶孤独家人\t92026\n拉拉拉小魔仙嗯电影来了壳可怕\t92027\n苏格兰人\t92028\n四百页\t92029\n淑馨\t92030\nhgggyt\t92031\njpj我i天下无敌女女女们\t92032\n郭过\t92033\n雪埃\t92034\n哦啦\t92035\n萧忆情\t92036\n尹叫\t92037\nguixuxyixuz\t92038\n余笙\t92039\n独角戏\t92040\n0504474\t92041\n独得\t92042\n芭蕉扇\t92043\nFC\t92044\n你好萌啊萌萌哒天\t92045\n小白脸\t92046\n一票袜\t92047\n说不不不\t92048\n丁佳琪\t92049\n不能册\t92050\n限号\t92051\npaly\t92052\n天涯何处无芳草修辞手法\t92053\n35555434334135365335332353533553356556356535355362623233\t92054\n秦魂县\t92055\n嗯日\t92056\nJshshshgs\t92057\n书宇\t92
058\nhelloAV8d\t92059\n嗯芭比\t92060\n超级星光大道\t92061\n陆翊\t92062\npalm\t92063\n灵秀湖北\t92064\n忘忧\t92065\n清中中学\t92066\npale\t92067\n维恩图\t92068\n斯内德范\t92069\nFK\t92070\n对告诉我\t92071\nmeahug\t92072\n二得\t92073\n氏族\t92074\n就班抓\t92075\n变脸\t92076\n10几度\t92077\n学霸君\t92078\n半岁\t92079\n叫子\t92080\n不有我\t92081\n天星期六\t92082\n绝杀\t92083\nFO\t92084\n背背背背背\t92085\n1001岁\t92086\n同类形\t92087\n30一40元\t92088\n乖当我\t92089\n善意的谎言\t92090\n度小姐\t92091\n太阳公公\t92092\n19191\t92093\n书宝\t92094\n山沟\t92095\nooup\t92096\ne王\t92097\n头发识\t92098\n不见了拜拜\t92099\n大思易失爱,\t92100\nqqppppppp\t92101\nUPS\t92102\n课后天\t92103\n山河\t92104\n真夏\t92105\n忘忘\t92106\n草莓苗\t92107\n山治\t92108\n重新来\t92109\n九阳\t92110\n冯哥\t92111\n你是谁\t92112\n通风处\t92113\nig片\t92114\n奇遇\t92115\n冯唐宇\t92116\n菜户\t92117\n巴利岛\t92118\nWydick\t92119\n70suv\t92120\nhrhdjdb\t92121\n零二零九\t92122\n来月经\t92123\nNjsdjjx\t92124\n不VC\t92125\n聊天会\t92126\n王礼慧\t92127\nts卜s卜\t92128\n铁骑\t92129\n曾荫权\t92130\n水声\t92131\n神命\t92132\n偏男生\t92133\n刘羽希\t92134\n水壶\t92135\n十四集\t92136\n撸斗子\t92137\n祝子豪\t92138\nyygghhhhhhhhhhhhhh\t92139\n下大雨\t92140\n很好意思\t92141\n苏比度\t92142\nucuu\t92143\n音悦Tai\t92144\nBKK\t92145\n哔声\t92146\n心情可\t92147\nTeamScrect\t92148\nuugfv\t92149\n超大气\t92150\n性情中人\t92151\n谢锦霞\t92152\n风女\t92153\n琴弦\t92154\n毒咩蛊\t92155\n荟\t92156\n别旨\t92157\n9676566\t92158\n三十斤\t92159\n今天下午\t92160\n课代表\t92161\n荐\t92162\n荒\t92163\n别无\t92164\n苏ecx\t92165\n刘宪僖\t92166\nGucci\t92167\n草\t92168\n秘儿园\t92169\n荆\t92170\n牟菇\t92171\nrthxvjg\t92172\n琴弹\t92173\n亲亲们\t92174\n吵不吵\t92175\n永雄\t92176\n东亚度秘\t92177\n暮止\t92178\n荷花池\t92179\n54个\t92180\n荷\t92181\n荨\t92182\n荪\t92183\n生汤\t92184\n行知\t92185\n荥\t92186\n猎场\t92187\n13738004993\t92188\n1234456678000\t92189\n荣\t92190\n劲书\t92191\nBOB\t92192\n过渡句\t92193\n刘子涵\t92194\n番茄炖梭子蟹\t92195\n赞美我的话赞美本宫的话度秘\t92196\n月亮地\t92197\n圈错\t92198\n孔子云\t92199\n很多人\t92200\n好呀狼来了\t92201\nRAIAIIST\t92202\n覆蜡\t92203\n958\t92204\n睡觉后\t92205\n很多事\t92206\n有无\t92207\n950\t92208\nxunfe\t92209\n仙人掌与多肉植物\t92210\n954\t92211\n餐粒粒谐利率\t92212\n超缺\t92213\nlpl\t92214\n杀人书\t92215\n你你你你你办\t92216\n厦门岛\t922
17\n毁我\t92218\n95%\t92219\n有时\t92220\n不满不满\t92221\n林锦盛\t92222\nlpi\t92223\n第几个\t92224\n觉知\t92225\n小地方人普通话\t92226\n三三二二七七八八九九十十十一十二十三\t92227\nTergdhbff\t92228\n湍约\t92229\n撒卡\t92230\n4oria\t92231\n困住\t92232\n哈大爷\t92233\n定长\t92234\n区周\t92235\n明知\t92236\n难熬\t92237\n72公顷\t92238\n伤心凉皮\t92239\n想了改\t92240\n幺幺六三五\t92241\n工艺\t92242\n课文关\t92243\njsjsoskn\t92244\n为斤\t92245\n今天阳光好暖和来点水花\t92246\n几十层\t92247\nⅫ\t92248\nbvp\t92249\n村姑\t92250\nbvv\t92251\nbvt\t92252\njgij\t92253\n一把子\t92254\n23333326352457553552535283565533565376\t92255\n46岁\t92256\nbvb\t92257\nbvg\t92258\nbvf\t92259\nDjg\t92260\nbvd\t92261\n殷佳利\t92262\n一石二\t92263\nnewversion\t92264\nbvn\t92265\n三十二\t92266\n杨珺琳\t92267\n我需要你\t92268\n一字马\t92269\n餐台\t92270\n投连险\t92271\n汤丕龙\t92272\n得鲁\t92273\n余佳鑫\t92274\n钱文肢\t92275\n8988088807896086589887561454866188662489\t92276\n王文嘉\t92277\n盲打\t92278\n日内瓦\t92279\n下述\t92280\n酒坛\t92281\n秦时明月动漫\t92282\n奶汁\t92283\n说的就是你疯婆子就是你\t92284\n兴远斋\t92285\n瓜不然\t92286\n上个月末\t92287\n余量\t92288\n十万八千里远\t92289\n笔尖\t92290\nF4\t92291\n1416102757\t92292\n辛巴达历险记\t92293\nghtyh\t92294\n吊干\t92295\ntaito\t92296\n白晓洁\t92297\n玩不我没有钱\t92298\ntro\t92299\nHdlk\t92300\n韩饭们\t92301\ngbvv\t92302\ntre\t92303\ntrd\t92304\n华申\t92305\ntrf\t92306\ntry\t92307\n吓哭\t92308\ntrz\t92309\n3亿多\t92310\ntrr\t92311\n威权\t92312\ntrt\t92313\n1月29一\t92314\n时钟\t92315\n如潮\t92316\n栽树\t92317\n出口\t92318\n体息\t92319\n时针\t92320\n苏忯柚\t92321\n花海\t92322\n熟饼\t92323\n别这样英\t92324\n承兑汇票\t92325\n有你了要我和你妈妈聊\t92326\n汤雁南\t92327\n先睹为快\t92328\n剑士\t92329\n郑家营\t92330\n55866588\t92331\n冰倍儿\t92332\n倚天屠龙记\t92333\n反着\t92334\n马飞扬\t92335\n宋鸿兵\t92336\n龙战士\t92337\n售罄\t92338\n接龙珍藏\t92339\n遗传学\t92340\n工作组\t92341\n置之\t92342\n三秦\t92343\n岳阳楼\t92344\n传奇点\t92345\nbb巴黎\t92346\nktum\t92347\n雙魚座\t92348\n人人皆知\t92349\n长袜子皮皮读书笔记\t92350\n三秋\t92351\n1万年\t92352\n三种\t92353\n报酬\t92354\n死家伙\t92355\n芭芭拉a\t92356\napkdjdj\t92357\n能达山\t92358\n三秘\t92359\n黄雅莉\t92360\n一个故声音\t92361\n三秒\t92362\n三科\t92363\n求解答\t92364\n期人\t92365\n侯爷\t92366\n川马\t92367\n贝里克森曼\t92368\n儒教\t92369\n疯了脆\t92370\n4951283\t9237
1\nsifyou\t92372\n舍不舍得\t92373\n82章\t92374\n说娘娘千岁千岁\t92375\n29升\t92376\n我不陪你好评了你要不原谅我\t92377\n默灵山\t92378\n好心亲一个\t92379\n十二倍\t92380\n你是猪么你是猪吗你是猪吗你是猪吗你是猪吗\t92381\n报名表\t92382\n边度仲\t92383\n明天中午12点10分\t92384\n补一补\t92385\n慌子\t92386\n骇死\t92387\n螃蟹\t92388\n暗景\t92389\n杜传旺\t92390\nwkjmjUKjgggll\t92391\n番外篇\t92392\ngeuoe\t92393\n狼哥\t92394\n唇唇\t92395\n靠弊\t92396\n夕及远\t92397\n唐英\t92398\n刘嘻嘻\t92399\n断气\t92400\n和平街北口桥\t92401\n夏馨怡\t92402\n在在在在在在行\t92403\n老寄\t92404\n西安音乐学院\t92405\n凌绝顶一览众山小\t92406\n78888\t92407\napoiuy\t92408\n水来土掩\t92409\n唐苑\t92410\n贴身堡\t92411\n我喜欢比熊\t92412\ngixixyxy\t92413\n断水\t92414\n海之的\t92415\n龙珠4\t92416\n笨蛋糕\t92417\n20160613\t92418\n蒲岐镇政府\t92419\n小蓝\t92420\nhhvhi\t92421\n6000多个\t92422\nwp48484484388493344484488444448448#4939\t92423\n两棵\t92424\n哈更\t92425\n长江学院\t92426\n小蓓\t92427\n有话说\t92428\n底稿\t92429\n雀稠\t92430\n清河镇\t92431\n553658895475\t92432\n臭董\t92433\n暖风\t92434\n找错\t92435\n工作中心\t92436\n430624\t92437\nvvyhcgbcg\t92438\n东方彧卿\t92439\n零二二零幺六九\t92440\n跟给\t92441\nZXF\t92442\n半酬\t92443\n两棒\t92444\ngiggag\t92445\n郭佳琪\t92446\n迟走\t92447\n小魏龙\t92448\n必能\t92449\n2月6号\t92450\n咒怨\t92451\n734482562\t92452\n铁手\t92453\nkkjkkkkkkkkkkkkkkkkkkkkkkkkll\t92454\n呢诡\t92455\n贝拉\t92456\n三数\t92457\n德国宝马集团\t92458\n爱我求你\t92459\n耶稣、安拉\t92460\n子姑娘\t92461\n度乖乖\t92462\n什么不懂的我在这bb\t92463\n一眯眯\t92464\n422201197902145946\t92465\n姜杰\t92466\n春晓\t92467\n摩登家庭\t92468\n白楼宇\t92469\n春晖\t92470\n南极人\t92471\nwrih\t92472\n22:15\t92473\n乖听话\t92474\n22:10\t92475\n撸爆\t92476\n爆嗨\t92477\n黄小明\t92478\n坏笑嘻\t92479\n付文好羽\t92480\nhhvfhb\t92481\nCfht55586\t92482\n色度度\t92483\n夏俊杰\t92484\n韩军民\t92485\n铁人\t92486\n我们的爱\t92487\n满格哇\t92488\n黄钻小面\t92489\n危险关系\t92490\n于滨洋\t92491\n女连\t92492\n999gov\t92493\n别执著\t92494\n洗衣\t92495\n跳路西\t92496\n考期\t92497\nSuper\t92498\ndddfstdeetrt\t92499\n鬼马神偷\t92500\nhelloimsusan\t92501\n洗街\t92502\n达濠紫菜鱼丸汤\t92503\nDhdfsn\t92504\n理工科\t92505\ndueieirjtiij\t92506\n三百元分钟\t92507\n呼呼我要了我喜欢你的男人\t92508\n恶心我了我\t92509\n滚出力战\t92510\n岳麓\t92511\nguyIo\t92512\n二日\t92513\n说缨\t92514\n19.40亿元\t92515\n兴寿\t92516\n2月
23日\t92517\nmyfoodisclothes\t92518\n71摄氏度\t92519\n3轮\t92520\n防雾\t92521\n防雷\t92522\n盐皂\t92523\n度秘你有钱吗有钱我的卡的你\t92524\nZzpp\t92525\n关古威\t92526\n生产力\t92527\n两女\t92528\n有你一只小的小不点儿\t92529\n防雪\t92530\n孔维然\t92531\n吴平\t92532\n胡书扬\t92533\n看不穿的的的爱你\t92534\n卡点\t92535\n傷痕\t92536\n二六八三\t92537\n3月15日\t92538\n香料\t92539\n姚生记\t92540\n改和\t92541\nbgdcjo\t92542\n玩想\t92543\n梁小军\t92544\ngjn\t92545\n14天后\t92546\n心维基祈祷燃烧的火炎\t92547\n155450055\t92548\n高老师\t92549\n英模\t92550\n喲這麽\t92551\n郭佳欣\t92552\n黄歇\t92553\n40025\t92554\n而是\t92555\n忄讯\t92556\n共同体\t92557\n四识\t92558\n能找到吧\t92559\n超威\t92560\nwtimen\t92561\n第25集\t92562\n去年12月1日\t92563\n260千米\t92564\n潜水衣\t92565\n管求爷\t92566\n爸爸妈妈\t92567\n张慧一\t92568\n常治市\t92569\n第三届\t92570\n无污染\t92571\n奥com\t92572\n西街\t92573\ngjpadm\t92574\nYuckdonned\t92575\n能看懂\t92576\n学会听话\t92577\n24343431221\t92578\n缺德\t92579\n电子烟\t92580\nyous\t92581\n度秘你是好朋友哦你是我最爱的人\t92582\n最新激情篇大曝光\t92583\n14点40分\t92584\n感嗎\t92585\n154588447877\t92586\n曹梓轩\t92587\n克里福德\t92588\nCued\t92589\n堤坝\t92590\n贴现\t92591\n大巫\t92592\n大差\t92593\n师妹\t92594\n看我的相听\t92595\n乐意英\t92596\n猪啊猪\t92597\n科蜜\t92598\n厨把\t92599\n46949464646\t92600\nlop\t92601\n妈们\t92602\n和你决\t92603\n最最最最最最最最最最最最\t92604\nhvcg\t92605\n凶悍\t92606\n无可比喻\t92607\n一三五六\t92608\n之外\t92609\n狼王\t92610\n信乎\t92611\n23岁\t92612\n艺文\t92613\n韩非\t92614\n高磕死\t92615\n三小时行\t92616\nxdhkai\t92617\n３级\t92618\ngjh\t92619\n干倘卖无\t92620\n我靠我靠我靠靠靠靠靠\t92621\n私喜欢\t92622\nlhbb\t92623\n相误\t92624\n骨头汤\t92625\n手摇\t92626\n讨人烦\t92627\n爱动\t92628\n恶类能干\t92629\n闲了聊\t92630\n叮嘱\t92631\n扇脸\t92632\n哈他\t92633\n哈仔\t92634\n吴非\t92635\n吃饭饭\t92636\n新的生活\t92637\n还在家\t92638\n相识\t92639\n持续\t92640\nAAA\t92641\n而生\t92642\nAAB\t92643\n王雪纯\t92644\ndatababy\t92645\n半阴处\t92646\nqushi\t92647\nump\t92648\n花砖\t92649\nmikolp\t92650\n四秒\t92651\n徐一\t92652\n超热\t92653\n终极度\t92654\n一个星期五十节\t92655\n30431674273043167427\t92656\n这一招\t92657\nAAf\t92658\n到村\t92659\n龙袍瑞\t92660\n耀朋神风\t92661\n等一下去\t92662\n总的来说\t92663\n客座教授\t92664\n改编版\t92665\n农家俏王妃\t92666\n说了大你\t92667\n冒要\t92668\n50万美元\t92669\nfac0\t92670\n归国\t
92671\n他和你\t92672\n抗戰\t92673\n你好呀你好呀你好呀\t92674\n可行\t92675\n四角恋\t92676\nniyoumimima\t92677\n开衫\t92678\n视线样\t92679\n压力\t92680\n更討厭犯過錯\t92681\n忙了拜拜\t92682\n唉山炮\t92683\n猪猪猪猪猪你是猪你是猪\t92684\n嗰座\t92685\nTom_Kaneshige\t92686\n234567823\t92687\n武义\t92688\n心老契\t92689\n腐蚀\t92690\n首大\t92691\nfacn\t92692\n韦经鑫\t92693\n13954028188\t92694\n海口公司\t92695\n金色橄榄城\t92696\n舒国治\t92697\n一道得五分\t92698\nface\t92699\nZaurusp\t92700\n耍流氓\t92701\n#13\t92702\n大学季\t92703\n那个美丽的秘密主人公\t92704\n赵家波\t92705\n二六十八岁\t92706\n甘删除\t92707\n连某\t92708\n活浪\t92709\nBMWSUV\t92710\n简略\t92711\n多豸Q\t92712\n你的未来在哪里\t92713\n两千米\t92714\n老也不在\t92715\n作序\t92716\n王座\t92717\nyàn\t92718\n管理所\t92719\n土改\t92720\n王度\t92721\n孙子涵\t92722\n前一天\t92723\nNatalie\t92724\n魔法天使\t92725\n上述腾腾样\t92726\n韩益娟\t92727\n开播\t92728\n避震\t92729\n晕吵闹\t92730\n俺儿\t92731\navab\t92732\n尔少邵佳\t92733\n斯里\t92734\n樱桃的滋味\t92735\n别s8\t92736\n20161\t92737\n20162\t92738\n王萌\t92739\n辣白菜炒饭\t92740\n节育\t92741\n记住在\t92742\nhutand\t92743\n手磺\t92744\n唐骏\t92745\n上海教育出版社\t92746\n弧长\t92747\n年纪主任\t92748\ngbf\t92749\ngbg\t92750\ngbd\t92751\ngbb\t92752\nuccjm\t92753\n江福玲\t92754\ngba\t92755\ngbn\t92756\n家灯\t92757\ngbh\t92758\ngbv\t92759\n王萱\t92760\n苏妙摩纳\t92761\ngbu\t92762\n交往观\t92763\ngbs\t92764\n姚你\t92765\ngbz\t92766\ngbx\t92767\ngby\t92768\n朱泥\t92769\n疣子\t92770\nCRV\t92771\n折痕\t92772\n卡行\t92773\n小朋\t92774\n小月\t92775\n小有\t92776\n盘锦地区\t92777\n徐悦\t92778\n大博大博\t92779\nCRF\t92780\n崔健华\t92781\n聚拢\t92782\n小望\t92783\n曹娟\t92784\n美尔雅集团\t92785\n早春园小学\t92786\n裁减\t92787\n狍子兄\t92788\nsehw\t92789\n小木\t92790\n一闪一\t92791\n输掉\t92792\n小本\t92793\nGbgbchg\t92794\n发量\t92795\n小朱\t92796\n小朴\t92797\n小朵\t92798\n小机\t92799\n邱琬帧\t92800\n底楼\t92801\n听我的好不好\t92802\n吸湿\t92803\n找找\t92804\nwhw520\t92805\n老妈子\t92806\n社会关系\t92807\n9.15%\t92808\n晚上六点\t92809\nKIMI\t92810\n路上司机\t92811\n天天盈\t92812\nggggggggggggg\t92813\n非常胸\t92814\n小乖小\t92815\n1646\t92816\nintentIntentSK11714776651374F1B6CC0EA6B411FEB4A372DB60BE3BDBEFA76FD5C410E7F7FF787028D684end\t92817\n第三萌\t92818\n长轨\t92819\n滚瓜\t92820\n反动\t92821\n国安民乐\t92822\n新渠\t9
2823\n1648\t92824\n唐元贵\t92825\n瘦子会葵\t92826\n琴亭湖\t92827\n丹阳中学\t92828\n土豪金分手v\t92829\n你好漂亮\t92830\n年终奖\t92831\n中国核工业总公司\t92832\n饮用\t92833\n一首\t92834\n最多余\t92835\n穿甲\t92836\n陈睿涵\t92837\n大长尕子Q冬\t92838\n才安\t92839\nucd\t92840\nucc\t92841\n水坝\t92842\n水坑\t92843\nuck\t92844\nucj\t92845\n要死要\t92846\n0t\t92847\n西媒蜓\t92848\n基白橙\t92849\n哎呦我的妈呀\t92850\n蒋金妤\t92851\nFfudh3\t92852\n33cc\t92853\n抢点眼\t92854\n收阴\t92855\n特长生\t92856\n考火\t92857\n笋片\t92858\n印迹\t92859\n国家博物馆\t92860\n8至10年\t92861\n5548455\t92862\nPS威武\t92863\n结宇\t92864\n折咯\t92865\n汪剑\t92866\na95\t92867\n2528117186\t92868\n大大小小\t92869\n杜密战\t92870\n一个11ke\t92871\n方攀\t92872\n30多平米\t92873\n妄喜\t92874\n旋律\t92875\n滨海酒店\t92876\n芭妮\t92877\n哼臭度\t92878\n5201314\t92879\n植物大战僵尸屋顶版\t92880\n林展朱礼军\t92881\n爱上我需要\t92882\n掉了拜拜\t92883\nonex咖3650\t92884\n一群鬼\t92885\n208年\t92886\n周烨\t92887\n一大段\t92888\nsith\t92889\n吸取\t92890\n4.3级\t92891\n300余\t92892\n天天酷跑的号求求你了求求你了求求你了求求你\t92893\n爱爱情\t92894\nsite\t92895\nfivepassten\t92896\n上证\t92897\nweknow\t92898\n南开局\t92899\n10min\t92900\n上诉\t92901\n300位\t92902\n度密度密度密度密度秘\t92903\n快点儿快点儿\t92904\n妇科病\t92905\n49个\t92906\n白秦\t92907\n划不走\t92908\n上课\t92909\n李佳霖\t92910\n翟帅\t92911\n报复性\t92912\n再讲一\t92913\n信佛教\t92914\nsamile\t92915\n李广博\t92916\n刘丹枫\t92917\n几个头\t92918\n產\t92919\n奥地利\t92920\n用\t92921\n甩\t92922\n柯哀\t92923\n甫\t92924\n甭\t92925\n易车错\t92926\n迂腐\t92927\n由\t92928\n甲\t92929\n申\t92930\n电\t92931\n男\t92932\n甸\t92933\n虞姬\t92934\n町\t92935\nfcufuvufvyydhtegekgryhgggbhhfffngfdwdyfyjfteftqfsfywdfukyddddf\t92936\n尾下\t92937\n百货度大厦\t92938\n甄\t92939\n山鸡\t92940\n郑嘉豪\t92941\n九县\t92942\n奶包\t92943\n嗯3嗯嗯\t92944\n日用\t92945\n7500\t92946\nhgggh\t92947\nhgggf\t92948\n甚\t92949\n甜\t92950\n刘欣梦\t92951\n生\t92952\n可可照\t92953\n孙本尚\t92954\n那何\t92955\n迪迪\t92956\n练习册\t92957\n那位\t92958\n曲线\t92959\n门人\t92960\n王文宇\t92961\n嗯若\t92962\n月经期\t92963\nhi皮波斯\t92964\n玩吧黑\t92965\n媒体\t92966\n饱和度\t92967\n大便\t92968\n离义乌远\t92969\n大侠\t92970\n东鹏瓷砖\t92971\n83十六个\t92972\n那的花谁的我\t92973\n那你\t92974\n能不能不原谅\t92975\nusiisi\t92976\n珍长\t92977\n2818764948734694847343\t
92978\n别秀\t92979\n殊心\t92980\n解挂\t92981\n蓝田\t92982\n坡跟鞋\t92983\nenrerenroj\t92984\n10毫克\t92985\n赵留学校\t92986\n汤天太热\t92987\n样刊\t92988\n保护主义\t92989\n章明珠\t92990\n3d跑\t92991\n烂漫\t92992\n折枝\t92993\n0606\t92994\n76年\t92995\n写错别字错别\t92996\ncagak\t92997\n参考\t92998\n亲爱的你好久回家\t92999\n蓝生\t93000\n全国通\t93001\n11：00\t93002\nvst\t93003\nvsv\t93004\n离锅\t93005\n山里人\t93006\n正腔\t93007\n僵尸僵尸先生\t93008\nMisha\t93009\nvsa\t93010\n美告诉你\t93011\n聪朋\t93012\nvsf\t93013\nvsg\t93014\nvsh\t93015\n1.41421\t93016\n豆角\t93017\n公用\t93018\nvsl\t93019\n百合\t93020\n四座\t93021\n011027510\t93022\n全活\t93023\n吉祥句\t93024\n包夹炎\t93025\n很多张\t93026\n歉人\t93027\n受宠\t93028\n受审\t93029\n不得了呀真是\t93030\n哈马斯\t93031\n心累肺累\t93032\nRnmb\t93033\n老婆色\t93034\n鸭嘴\t93035\n逗哏\t93036\n受害\t93037\n九十二二\t93038\n嫁给我吧\t93039\n远洋\t93040\n生效\t93041\nkiue\t93042\n暗礁\t93043\n第一首\t93044\nWebOS\t93045\n越远越好\t93046\n寡女\t93047\n莅临\t93048\n厂商\t93049\nkiut\t93050\n昌乐\t93051\n七首\t93052\n该班\t93053\n丑老子\t93054\n照片堡\t93055\n我和你的最后一次\t93056\n医岁\t93057\n派出所\t93058\n饿盾击\t93059\n羞走\t93060\n小门神\t93061\n疯死\t93062\n62555555555555555555555555555555555555555555555526555\t93063\n大懒猪\t93064\n对呀你是傻子我是高能人类你是机器人我是人类\t93065\n过期妊娠\t93066\n突袭\t93067\n迷羊\t93068\n险些\t93069\n桑字\t93070\n出见\t93071\n好糊涂\t93072\n伯父\t93073\n伯爷\t93074\n笔亲\t93075\n发酵粉\t93076\n900900\t93077\n鬼人家\t93078\n天津站\t93079\n穿孔\t93080\n潘金莲\t93081\n巴州\t93082\n有女\t93083\n奔驰宝马越野\t93084\n不要不要不要不要不要\t93085\n斗思批\t93086\n我是不是你最疼爱的人那日\t93087\n落玺\t93088\n1994年3月21号\t93089\n特牛\t93090\n尼克乖\t93091\n曲折\t93092\nppooujjghhijhuihhoibjibjihhjjhjhuhhiihhjnjjhhjnhhjivo\t93093\n業沒\t93094\n有奈\t93095\n营业部\t93096\n我没有说你骗的了我是说你是一个小可爱小聪明小猫猫\t93097\n东小学\t93098\n人生之言\t93099\n说不定\t93100\nyuhhgh\t93101\nytfffgui\t93102\n不好聊聊\t93103\n619分\t93104\n宁珂\t93105\n杨号\t93106\n看长\t93107\n讯问\t93108\n600027\t93109\n罗晓\t93110\n点痣\t93111\n被救者\t93112\n刘姐\t93113\n杨可\t93114\n下的我要怕这谁说的鬼话\t93115\ngeggrg\t93116\n指桑\t93117\n本君\t93118\n李源潮\t93119\n付一彤\t93120\n百香果\t93121\n有事儿秘书干没事儿干秘书\t93122\n21分之五家\t93123\nbbbb\t93124\n张艾嘉\t93125\njwje\t93126\n个别驻华领事馆\t9312
7\n本名\t93128\n狼女\t93129\n1926年6月1日\t93130\na1amwgwg4265\t93131\n嗯超人\t93132\n杨又\t93133\n浓妆淡抹\t93134\n521521521\t93135\n闲谈\t93136\n孙天后\t93137\n空中客车公司\t93138\n凌晨12点\t93139\n便车\t93140\n我来天问地我问你就是信任你你为什么不回答我\t93141\n仓央嘉措\t93142\n耗费\t93143\n一会儿两分钟\t93144\n心情愉快\t93145\n姑妈\t93146\n虬津\t93147\n哇塞女\t93148\n外卖男\t93149\n47f\t93150\n别度秘\t93151\n47e\t93152\n菜秧\t93153\n芳草\t93154\n鲍思源\t93155\nygkdtictyo\t93156\n巨茎\t93157\n反门\t93158\n0ij\t93159\n给我也不知道\t93160\n张娟\t93161\n培苓\t93162\n东大务\t93163\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t93164\n李洪瑶\t93165\nlifk\t93166\n自然现象\t93167\n发行方\t93168\n下沙仁香\t93169\n特特\t93170\n铁男铁\t93171\n彩券\t93172\n远到\t93173\n关玉卓\t93174\n信誉好\t93175\n哈立德\t93176\nfacetoface\t93177\n家居业\t93178\ntinkling\t93179\nhi你好度秘我是你的主人翁\t93180\n卢斌\t93181\n高领\t93182\n一经年\t93183\n477\t93184\n474\t93185\n一来日\t93186\n子涵\t93187\n邹兴秀\t93188\n压力别\t93189\nv规划\t93190\n演唱会的孩纸你伤不起啊伤不起\t93191\n完好无损\t93192\n铠装\t93193\n二零九五幺\t93194\n乞巧\t93195\n饭否\t93196\n演一息\t93197\n歌吟不决\t93198\n新港富贵嘉园\t93199\nhhggjjhhhnf\t93200\n歉\t93201\n歆\t93202\n歇\t93203\n沙土\t93204\n屮屮屮\t93205\n宝崽\t93206\n掐架\t93207\n苑月梅\t93208\n饮恨\t93209\n歺\t93210\n死\t93211\n歹\t93212\n打榜\t93213\nmkklllkl\t93214\n二氧化硫通入氯水\t93215\n纯女们\t93216\n浓浓\t93217\n喜喜歌\t93218\n四三度\t93219\n沙地\t93220\n兔兔好了了是了是\t93221\n武\t93222\n此\t93223\n步\t93224\n止\t93225\n正\t93226\n沙场\t93227\n歡\t93228\n试装\t93229\n1月23日\t93230\n手冷\t93231\n真不可爱\t93232\nwbns\t93233\n15603944466\t93234\n45563682155\t93235\n好呀好呀你\t93236\n43992800\t93237\n32分钟\t93238\n钢钉\t93239\n十一号\t93240\n松油楼村\t93241\n抹去\t93242\n看见天\t93243\n走稳\t93244\n九千姐\t93245\n甚名\t93246\n手写\t93247\nakkrnqit\t93248\n捧得\t93249\n谔谔\t93250\n民航飞行学院\t93251\n胡椒粉\t93252\n不必聊聊\t93253\n迪丽热娜\t93254\n手册\t93255\n过年法定假日\t93256\n季小薇\t93257\n练册册\t93258\n养家糊口\t93259\n阿嗯\t93260\n万庄\t93261\n鞠情义\t93262\n叉烧拉面\t93263\n撒拉\t93264\nMOZEN\t93265\n摇身一变\t93266\n净售出\t93267\n两篇\t93268\nfvdgdjdcc\t93269\n星期五晚上\t93270\njnvddsdxc\t93271\n100008844\t93272\nFjyfnkjbv\t93273\n指不沾\t93274\n宋语墨\t93275\n录影版\t93276\n15548772085\t93277\n阿嗞\t93278\n啊一西\t93279\n怪奇\t93280\n挂饰\t93281\n
我爱晓\t93282\n34563456\t93283\n杨文图\t93284\n喀拉al\t93285\niiiimmmmmmooo\t93286\n埋穷\t93287\n11152255661001千几米\t93288\n襄樊\t93289\nAV娃娃\t93290\n虫子\t93291\n应答\t93292\n级部\t93293\n三天前\t93294\n牛雅萱\t93295\n陈老一九\t93296\n电脑素什锦\t93297\n遇食\t93298\n啦啦啦我是卖报小行家\t93299\n王鹏然\t93300\n玩不起\t93301\nforeng\t93302\n113岁\t93303\n話唄\t93304\n0935\t93305\n6f7\t93306\n怪好\t93307\n我不信缘不信分\t93308\nElla\t93309\n哼星\t93310\n队伍\t93311\nstihe\t93312\n日晒\t93313\n2.0ROM\t93314\nfyugffygf\t93315\n3嗖嗖嗖3\t93316\n饭伯\t93317\n队会\t93318\n79公斤\t93319\nJen\t93320\n丹尼斯吴\t93321\npppl\t93322\nhttphhiphotosbaiducomxiaodupicitem5\t93323\ncrochi\t93324\n王SB\t93325\n五香\t93326\n欧文\t93327\n美陶\t93328\nlymz\t93329\n烦大五\t93330\n陈佳涵\t93331\n回话\t93332\n度秘明明\t93333\n保母\t93334\n三十集\t93335\n美院\t93336\n扬帆\t93337\n空军型\t93338\nydidlfcvgdb\t93339\n一不忙\t93340\n黑米粥\t93341\n请假\t93342\n当当等当等当\t93343\n挂号费觉给爸爸不v办法女孩不女CC相册\t93344\n顺流下矣棹数小舟曳铁钯\t93345\n张玉凯\t93346\n早点好梦给我\t93347\n1800圈\t93348\n所有物\t93349\n木杞\t93350\n转真\t93351\nkodk\t93352\n不要劲\t93353\n跑家\t93354\n极片\t93355\n帮女\t93356\n43多42\t93357\nYdvehdygz744574\t93358\n不要动\t93359\n后汉\t93360\n玉兔\t93361\n崔景博\t93362\n别非\t93363\n中午饭钱\t93364\n于宁\t93365\n机器刃\t93366\n龚家雄\t93367\n208.00\t93368\njiumllllet\t93369\n盗版小兵\t93370\n文盲\t93371\n香猪坊\t93372\n开拓者\t93373\n那么多英\t93374\n哎咪龟\t93375\n戒色\t93376\n中西\t93377\n许烟雨\t93378\n坦然在心\t93379\n78首\t93380\n恩孽徒\t93381\n水水山山处处\t93382\n熊猫\t93383\n泥石流\t93384\n人不人鬼不鬼\t93385\n是真是假\t93386\n建制\t93387\n乃好\t93388\n六眼看八方眼观六路耳听八方眼观六路耳听八方\t93389\n买报社足球队\t93390\n呢瓜\t93391\n甄帅博\t93392\n胡雪莹\t93393\npmmptttt\t93394\n郭国庆\t93395\n拾荒\t93396\n死内\t93397\n差不我\t93398\n少言\t93399\n国寿\t93400\n郑秀妍\t93401\n不好过\t93402\n错错错错大错\t93403\n快唱\t93404\nprprp片\t93405\n紫云\t93406\n乡巴佬浆卤鸡爪\t93407\n时尚芭莎\t93408\nqb超轻粘土\t93409\n二路可怜巴巴\t93410\n多吧多宝\t93411\n帝尊\t93412\n打打电话\t93413\njgkgyi\t93414\n怎么说的话\t93415\n真的我的\t93416\n巨快\t93417\neeeeeeeeeeeeeeeeeeee\t93418\n张自画\t93419\n50余岁\t93420\n00247\t93421\n头晕眼花\t93422\n凝血\t93423\n纹纹纹纹\t93424\n明早上\t93425\n穹妹\t93426\nysoc若\t93427\n啊爸\t93428\n理鱼台\t93429\n肌肠辘辘\t93430\n我和一个
女人\t93431\n老林\t93432\n奥谢谢\t93433\n老析\t93434\n951489926\t93435\n对我说了不算那你给我说说情\t93436\n巩飞鹰\t93437\n3.16元\t93438\nL0e\t93439\n治本\t93440\nFoolish\t93441\n大无光\t93442\n民工\t93443\n东至\t93444\nL0V\t93445\njISK\t93446\n这是为什么呢\t93447\n超级王\t93448\nhttppinyincn1YSxpCu0j6a\t93449\n债务\t93450\n可是我\t93451\nbeta版\t93452\n说说秘\t93453\n123456789133\t93454\nfufyf\t93455\n隔嗖\t93456\n手癌\t93457\n信永卡\t93458\n呢斯伦贝谢\t93459\nkjjkkk\t93460\n节食\t93461\nAzure\t93462\n温永旭\t93463\nbank\t93464\n纸伞\t93465\n导管\t93466\n光辉到晚呆\t93467\n摆平\t93468\n美国威斯康星大学\t93469\n1.90亿元\t93470\nhertthy\t93471\n你死\t93472\n黄婷婷\t93473\ns44\t93474\n联袂\t93475\n19块\t93476\n汤唯\t93477\n赞比亚\t93478\n哈巴\t93479\n燥热\t93480\n城南\t93481\n喷泪\t93482\n腊八蒜\t93483\n省委党校\t93484\n高雄道德院\t93485\nkesws\t93486\n民院\t93487\nhhs\t93488\n六道轮回苦\t93489\n纫工\t93490\n18849903\t93491\n888888888888888888888888888888888\t93492\n东方红\t93493\n喷泉\t93494\n恐爪龙\t93495\nfiocfg\t93496\n合于\t93497\n篮球场\t93498\n百五三百六\t93499\n野生\t93500\n姓赵\t93501\n明白名\t93502\n曾伟晟\t93503\n120期\t93504\n四珍\t93505\n21110\t93506\n日升月\t93507\n不是我们吗你好聪明\t93508\n智能放风机\t93509\n董昊臻\t93510\n拆嘞\t93511\nsndkskbs\t93512\n褒奖\t93513\n未成年人行乞\t93514\n十二月二十九日\t93515\n微系\t93516\n尼泊尔\t93517\n14869\t93518\n四班\t93519\n顾后\t93520\n哭了嘛\t93521\n华娜\t93522\n东师大\t93523\naabc形式\t93524\n无味\t93525\n意艺\t93526\n闹翻交界处激将法\t93527\n无周\t93528\n卢剧\t93529\n赵梦娟\t93530\nWoming\t93531\n咕嘟\t93532\n八月二八\t93533\niphon6splus\t93534\n仲盛\t93535\n有毒瘾\t93536\n背包客\t93537\n从众\t93538\n被遗弃\t93539\n盖被\t93540\nMe五个\t93541\n亲们\t93542\n连雪\t93543\n558455\t93544\n我永远爱你我等你多米\t93545\n张显案\t93546\n娇欲语\t93547\n亲仁\t93548\n青云\t93549\n种花小游\t93550\n13147798989\t93551\n啊俊锋\t93552\n大文\t93553\n在在在在在在在\t93554\njvjcbghghhhhhhddddfffddffffffdddfdhgxhvxgbvjvchgvkhbhhhhghhjh\t93555\n幺七五诶小琴幺七六五\t93556\n场身\t93557\n不哒\t93558\n今晚18：07\t93559\n六个局\t93560\n不哄\t93561\n不哇\t93562\n长沙县\t93563\n累爱\t93564\n黑魔仙\t93565\n1636650887\t93566\n王雪云\t93567\n720度\t93568\n小馨\t93569\n团体\t93570\n蔡不理\t93571\nhht\t93572\n46丁\t93573\n困不着\t93574\n⑩子了由不得行。废话哥本哈根病假条噶哈几噶哈干怒\t93575\n苏州高新区\t93576\n一所以\
t93577\n姓朱\t93578\n小香\t93579\n六日\t93580\n妈咪大战\t93581\n韩沐泊\t93582\n策墨\t93583\n礦泉水龢西\t93584\n预览\t93585\n无力样\t93586\n推迟\t93587\n推进\t93588\n该分\t93589\n第17个\t93590\n我爱你我爱你爱上你醒了\t93591\n谢谢\t93592\n9月18日上午\t93593\n光阴\t93594\n不要了亲一下\t93595\n可做\t93596\n吉布斯\t93597\n要不完\t93598\n超长夜\t93599\n物质子\t93600\n沃尔玛超市\t93601\n德若谷\t93602\n群体性\t93603\n瑞金医院\t93604\n我的学乐云教学坏了帮\t93605\n过山车\t93606\n难以启齿\t93607\n谢谁\t93608\n海姆利克氏急救法\t93609\nfgvdffff\t93610\n红袖毛衫\t93611\n这个家\t93612\n实习课\t93613\n气死我了我不想\t93614\n凌雨珊\t93615\n湿试\t93616\n浑江地区\t93617\n靠爷\t93618\n卢羽晨\t93619\n巫娃娃\t93620\naaopqr\t93621\n下划线\t93622\n云之\t93623\n2-3天\t93624\n呲饭\t93625\n嘛玩意\t93626\n吾理\t93627\n一阵嘛\t93628\n日记\t93629\n鼻炎膏\t93630\n天天考试\t93631\n洋洋得一\t93632\n林建喜\t93633\n心人\t93634\n你好你好我要光\t93635\nPPPPKKONUUMM\t93636\nabcdefgakacklnopqrstuvkinn1313亮晶晶满天\t93637\nnano\t93638\n来信\t93639\n心事\t93640\nnann\t93641\n我又要机\t93642\n云乡\t93643\n来俩\t93644\n北京朝阳区康营村委会\t93645\n再见再见再见讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t93646\n检方\t93647\nFhjjhvdyhcf\t93648\ntuff\t93649\n起球\t93650\n钱青\t93651\n嗯嗯学楼角\t93652\n忆莲\t93653\n朝向\t93654\n乖疯爱堡\t93655\n摆渡\t93656\npercenthigh\t93657\n佳宝\t93658\n申请者\t93659\n度身\t93660\n摆游\t93661\n灵鼎\t93662\n3-4\t93663\nMelon\t93664\n降至\t93665\n147896321\t93666\n孙小伟\t93667\n。00\t93668\n八八年\t93669\n卷场\t93670\nwzy\t93671\n削死\t93672\nFuckYouMoM\t93673\n人工那\t93674\nwhynot\t93675\n股民竞\t93676\n噗a杯\t93677\nwzk\t93678\n沈佳龙\t93679\n下星期\t93680\n不爱我\t93681\n141401c\t93682\n衣帽个我听\t93683\n咋明\t93684\n打吊针\t93685\n刘继梅\t93686\n乐古\t93687\n剑网3\t93688\n普高\t93689\n找一只邱比特的箭\t93690\n慢吞吞\t93691\n玮景\t93692\n素雅\t93693\n秦斌刚\t93694\n娃娃脸\t93695\n堡埃弗雷or\t93696\n懂真多\t93697\n善莫大焉\t93698\n记账簿\t93699\n外连\t93700\n三十六计\t93701\n仓山\t93702\n夾耿乐\t93703\n缓冲\t93704\n镶招\t93705\n家3jdbdbjdbdhbdvdjjsjddkdkdbjdbdn\t93706\n住宿舍\t93707\n杨家兴\t93708\nhhc\t93709\n满腹牢骚\t93710\n张安琦\t93711\n你的眼珠子\t93712\n4644平方尺\t93713\n丑比九\t93714\n对啊对\t93715\n叉皮\t93716\n最珍贵\t93717\n北京地区\t93718\n相反\t93719\n涟韵吧\t93720\nCMOS影像传感器\t93721\n丘比特\t93722\np780\t93723\nseruppn\t93724\nfrfc\t93725\n不爱你了我不想\t93726\n沙摩柯\t93727\n开
开\t93728\n高峰堡勒\t93729\n刘勤仁\t93730\n付志超\t93731\nJensen\t93732\n臣子\t93733\n永刚石窟\t93734\n是帅\t93735\n两个儿子\t93736\n冒险家\t93737\n星晴\t93738\n零零零二八零七\t93739\n彭泽\t93740\n三鹿奶粉\t93741\n蜘蛛蟹\t93742\nshuai810875980\t93743\n雇佣兵\t93744\n太色\t93745\n300余亩\t93746\n十五十六十七十八十九二十\t93747\n张喆沅\t93748\n古时一剑闯天下今世一剑放荡世界\t93749\n超极本X1\t93750\n正人君子\t93751\n雨天\t93752\n今年2月份\t93753\n狂三\t93754\n巡逻\t93755\n早上六点钟\t93756\n22寸\t93757\n雨夜\t93758\n沈冰\t93759\n子午\t93760\n王美美\t93761\ncvhbhbdchgg\t93762\n莫过于此\t93763\n湾厦\t93764\nshessh\t93765\n山伐树\t93766\n萱涵\t93767\n束手\t93768\nDoub\t93769\n氵果\t93770\n邦卡\t93771\n好难\t93772\n补考\t93773\n居多\t93774\n肥爷\t93775\n１年\t93776\n游走\t93777\n上学\t93778\n政治家\t93779\n基础性\t93780\n立白\t93781\n弓形\t93782\n漏诊\t93783\n菁儿\t93784\n120411104\t93785\n要不要不要不要不要不\t93786\n强东\t93787\nm骚睿\t93788\n处级\t93789\n1000元\t93790\n大你好萌\t93791\njapane\t93792\n啊武\t93793\n四一四个\t93794\n来了世界\t93795\n采摘券\t93796\n尔修斯\t93797\n依曼丽克\t93798\ndixd\t93799\n黄黑恩\t93800\n夏晨\t93801\ndixi\t93802\n傅家俊\t93803\n王良珍\t93804\n椎间\t93805\nn行n哼hen\t93806\n泰式\t93807\n张焕毫\t93808\n振动器\t93809\n江心\t93810\n2012年07月09日\t93811\n在旁\t93812\n苗结石\t93813\nhhk\t93814\n毛豆豆\t93815\n问条\t93816\n锱铢必报\t93817\n苑士廷\t93818\n666687\t93819\n侬蜜蜜\t93820\n虞景真\t93821\n在早\t93822\nPad\t93823\n鹅鹅鹅鹅鹅鹅鹅鹅鹅谔谔\t93824\n在日\t93825\n乳娘\t93826\n1235489662\t93827\n741148438\t93828\nhhj\t93829\n二奶奶\t93830\n陈玉玉忠\t93831\n美感\t93832\n海流\t93833\n血肉\t93834\n了不吗\t93835\n优蜜\t93836\n课题\t93837\n自用度秘\t93838\n艺乐播报\t93839\n8月6日\t93840\n海海\t93841\n徐玉霞\t93842\n本轮\t93843\n0.39%\t93844\n海浪\t93845\n傲骨\t93846\nhxdfx\t93847\nbunny\t93848\n四北固山\t93849\n黄金木\t93850\n游旅\t93851\n京都\t93852\n5203344587\t93853\n1234567891234567891234567890\t93854\nLullaby\t93855\n幸福的时光\t93856\n1150\t93857\n猫子\t93858\n15941882587\t93859\n2573680825\t93860\n一路走好\t93861\n免除\t93862\n徐申东\t93863\n千禧\t93864\n兴红小达人\t93865\n123456789025817\t93866\n湖梦\t93867\n沈骏\t93868\n听见了吗\t93869\n石亭峰\t93870\n奥肚子\t93871\nhowareyu\t93872\n黄金期\t93873\n精神院\t93874\n胡名泰\t93875\n溢液\t93876\n侯原\t93877\n发低\t93878\n紧凑\t93879\n杀牙\t93880\n小软妹\t93881\n杀牛
\t93882\n石颖\t93883\n兴冲冲\t93884\n一学期八千\t93885\n茉友\t93886\n一千九百多兆\t93887\n深V\t93888\n咋么\t93889\nhhm\t93890\n百根\t93891\n百样\t93892\ndf7c\t93893\n懒妻\t93894\n超越型\t93895\n蜜蜂式\t93896\n嗓音\t93897\nfamily\t93898\n虎楼\t93899\n二十万\t93900\n起跑点\t93901\n埋藏\t93902\n何祥吉\t93903\n小麦味\t93904\n相信自己\t93905\n父子文父亲强上\t93906\n170多\t93907\n不要不想\t93908\n很单\t93909\n睡不着三了聊\t93910\n甚安\t93911\n詹姆斯·帕特森\t93912\n真的爱你百年和你结了我是真心的\t93913\n赵炜杰\t93914\n明天的明天的明天明天明天明天明天\t93915\n我不我不喜欢你了我不想和你聊\t93916\n未能不能\t93917\n石头硪\t93918\n奇巧\t93919\n別介\t93920\n四百多个\t93921\n电暖宝\t93922\n甜角\t93923\n367823\t93924\n樱桃节\t93925\n小孙\t93926\n小孟\t93927\n滋生\t93928\n陈威翰\t93929\n小子\t93930\n小字\t93931\n小孔\t93932\nbb个\t93933\n陈曹尹\t93934\n邵氏佳片\t93935\n晨线\t93936\n马宾\t93937\n以撒亚\t93938\nM1A1艾布拉姆斯坦克\t93939\n马家\t93940\n夢\t93941\n零一九五二零四零幺零\t93942\n夠\t93943\nQ人\t93944\n大\t93945\n实线\t93946\n太\t93947\n夫\t93948\n寒冰\t93949\n寒冷\t93950\n夯\t93951\n夭\t93952\n张钊\t93953\nzcu\t93954\n失\t93955\n夷\t93956\n头\t93957\n寒冬\t93958\n夺\t93959\n一百千\t93960\n夹\t93961\n夾\t93962\n夿\t93963\n夼\t93964\n32G\t93965\n2878606287\t93966\n434.225上行\t93967\n夆\t93968\n备\t93969\n处\t93970\n度秘我感觉你萌萌哒\t93971\n夏\t93972\n下铺\t93973\n复\t93974\n王维新\t93975\n抖颤\t93976\n外\t93977\nJR叔\t93978\n八彩\t93979\n夕\t93980\n多\t93981\n砚山\t93982\n够\t93983\n夜\t93984\n比亚迪\t93985\n生意思\t93986\n罗欣\t93987\n武松松\t93988\n卢廷婷\t93989\n走向远方\t93990\n死说\t93991\n113分\t93992\n普天之下\t93993\n一个更次\t93994\n放学将心比心\t93995\n一下下\t93996\n立方庭\t93997\n牙牙乐\t93998\nkhal\t93999\n1月31\t94000\n还好不好\t94001\nzā\t94002\n红点\t94003\n路易斯\t94004\n八二区\t94005\n凯源玺\t94006\n乱真\t94007\n不是我讨厌\t94008\nmontho\t94009\n我真的好想你\t94010\n什么奈\t94011\n起来\t94012\n黄网吧\t94013\n朱天釅\t94014\n向世凯\t94015\n多玛西亚\t94016\n上半年\t94017\n6169\t94018\n民月\t94019\n2007年2月25日\t94020\n男台村\t94021\n沈凯晋\t94022\n闹卫\t94023\n吃了翔\t94024\n这样的话\t94025\n魂不守\t94026\n天傲天\t94027\n审书稿\t94028\n665266852\t94029\n坑民\t94030\n赵国庄村\t94031\n黄一号\t94032\n金湖\t94033\n拜拜塞\t94034\n真实故事\t94035\nlppfhudbVu\t94036\n开心鬼一\t94037\n虎哥\t94038\nxky\t94039\n双颖\t94040\n宣传板\t94041\n公英\t94042\n2199833189\t94043\npiyy\t
94044\n讲调\t94045\n多余\t94046\n7k7k7k7k7k7k小游\t94047\nkAa8dA\t94048\n义不容辞\t94049\n政制\t94050\n不接男爱上你\t94051\n13730分\t94052\n惟愿\t94053\n32g\t94054\n六个多月\t94055\n星期二\t94056\n请多多指教\t94057\n古广明\t94058\n还好样\t94059\n星期五\t94060\n你好污\t94061\n猪猪猪猪猪猪猪猪猪你是猪么\t94062\n秘换\t94063\n潺潺\t94064\nVS1\t94065\n山楂树之恋\t94066\n15353533286\t94067\n梁家辉\t94068\n算算命\t94069\n贵金属\t94070\n9嗯\t94071\njhdkcs\t94072\nvjirqwryiopkjhd\t94073\n麻痹\t94074\n筑巢\t94075\n一百页\t94076\n头类\t94077\n一百项\t94078\n思烤者\t94079\n木像\t94080\n2496639459\t94081\n头籍\t94082\n4518\t94083\n你好呀你在\t94084\n云会式\t94085\n金山寺\t94086\n徐星星\t94087\n4514\t94088\n明早四点\t94089\naihifaaihifal\t94090\n麻痛\t94091\n杜明雅\t94092\n别了我讨厌\t94093\n王善花\t94094\nchenrice\t94095\nmfmgm\t94096\n21点63点\t94097\n5万n\t94098\n诡计\t94099\n上眼\t94100\nryryws\t94101\n明白装糊涂\t94102\n68元\t94103\n9千多\t94104\n二十多时\t94105\ndhkxv\t94106\n我人生中最讨厌的就是你了最讨厌你\t94107\n唐晓\t94108\n孟逸\t94109\n胸怀\t94110\nVST\t94111\n宠物小的人\t94112\n吐了吐\t94113\nWEST\t94114\ndgyf\t94115\n书像\t94116\n我们元\t94117\n东一块\t94118\ndgyy\t94119\n外号儿\t94120\n张世丽\t94121\n丰联广场\t94122\n莫尔根天\t94123\n嗯切\t94124\n厂方\t94125\n傍晚\t94126\n蓬荜生辉\t94127\n嗯刚\t94128\n嗯刘\t94129\njixuying\t94130\n阿慧\t94131\n口街小学\t94132\n雷霆骇人数据\t94133\n祖国\t94134\n看不听\t94135\n嗯别\t94136\n嗯利\t94137\njvs\t94138\n欣悦园\t94139\n田径组\t94140\njvu\t94141\njvt\t94142\njvk\t94143\njvj\t94144\n裹身\t94145\n狐妖男\t94146\n077522\t94147\njvl\t94148\njvc\t94149\njvb\t94150\n无伦次\t94151\n电压\t94152\njvf\t94153\nuauus\t94154\njvd\t94155\n叔天\t94156\nbetabe\t94157\ntrunkbezel\t94158\n炒鸡蛋我会\t94159\n粉红\t94160\n咯哪\t94161\n男土女木\t94162\n及时行乐\t94163\n莫拉塔\t94164\n五六七八\t94165\n123444\t94166\n第一章\t94167\n杨安琪\t94168\n金山网络疫情监测中心\t94169\nggi8\t94170\n吕思琴\t94171\n首都医科大学附属北京安定医院\t94172\n不可以说\t94173\n吃太多\t94174\n咯哒\t94175\nKeith\t94176\n二千米\t94177\n救护车\t94178\n电子客票\t94179\n参照\t94180\n死义\t94181\n收锁\t94182\n谈恋爱\t94183\n太妃\t94184\n植物大战僵尸破解版\t94185\n毒药\t94186\nhttpahiphotosbaiducomxiaodupicitem42a98226cffc1e1712d5db444d90f603738de959jpg\t94187\n文旦\t94188\nVAIO\t94189\nZ50\t94190\n醺\t94191\n这回信\t94192\n廷
和\t94193\n流奔\t94194\n醧\t94195\n二零一七年\t94196\n小鹿犬\t94197\n醬\t94198\nhelpless\t94199\n氮素\t94200\n多大了行不行\t94201\n醫\t94202\n浙江林学院\t94203\n海信E920\t94204\n醒\t94205\n醓\t94206\n醜\t94207\n社会阶层\t94208\n火材\t94209\n软糖\t94210\n哈铪哈铪哈铪\t94211\n醚\t94212\n扔掉\t94213\n喊话\t94214\n错了反正可不是我错不是我出错就是你错错\t94215\n醃\t94216\n北美队\t94217\n火杀\t94218\n丨人成各\t94219\n醉\t94220\n唠唠嗑坎坎坷坷\t94221\n醋\t94222\n浪货\t94223\n难吃好\t94224\njjrk\t94225\n拷问\t94226\n祝德约科维奇\t94227\n王道\t94228\n同学\t94229\n肚子痛\t94230\n布告\t94231\n二十七日\t94232\n出溜\t94233\n浪贱\t94234\njnbhhhj\t94235\n了无\t94236\nm然\t94237\n哟嘿\t94238\n李孜昊\t94239\n家父\t94240\n拌脚\t94241\nThing\t94242\n禁行\t94243\n辽宁男篮\t94244\n百联\t94245\n1378942804\t94246\n啵多野吉依\t94247\n踏破铁鞋\t94248\njiqing\t94249\n卡卡睡吧饭卡\t94250\n啤酒烤肉\t94251\n毫无条件\t94252\n今片\t94253\nnimeizhi\t94254\nfgsf\t94255\n爪爪爪\t94256\n瑞香\t94257\n过几天\t94258\n那么了美\t94259\n白草洼\t94260\n361373\t94261\nfgsu\t94262\n同班同学\t94263\n大龙度\t94264\nOG8\t94265\n小受死\t94266\n澄江\t94267\n结跟\t94268\n64位\t94269\n东部\t94270\nroma\t94271\n回我了真讨厌\t94272\n玩耍\t94273\naudioOrb\t94274\n龙小提\t94275\n冬蜜我是秘东\t94276\n箭步\t94277\n冒犯\t94278\n陈冲\t94279\nTokio\t94280\n撒卡卡\t94281\n斑白\t94282\n建模\t94283\nviyq\t94284\n1b二期\t94285\n老房子\t94286\nzzzzzzzzzzzzzzzzzzzg\t94287\n重回\t94288\n汉城中心\t94289\n比如有\t94290\n长沙市\t94291\n奥吉\t94292\n幺零K一零八零K\t94293\n麽大董秘\t94294\n肉麻呀\t94295\n店滴\t94296\n翟志刚\t94297\n不臭\t94298\nKYLIN\t94299\n小阔\t94300\n强棒\t94301\n唉太\t94302\n咱们四郎\t94303\nashsurbe\t94304\n十二一共用形tutuw\t94305\n斯文扫地\t94306\n手枪手\t94307\n对准\t94308\n佛手\t94309\n好克里木\t94310\n名列\t94311\n小阳\t94312\n10435个\t94313\n查理九世\t94314\nheart\t94315\n前戏\t94316\n是好男\t94317\n徐嘉蔚\t94318\n土堆\t94319\n357294386435\t94320\n颌处\t94321\n均码\t94322\n东源\t94323\nmmmmmm\t94324\n徐美云\t94325\n朋友片\t94326\n人教版物理指数\t94327\n红体\t94328\n抽子\t94329\n比子\t94330\n铜仁\t94331\n里样\t94332\n李怪婴\t94333\n天餐\t94334\n4排\t94335\n篮球架\t94336\n3秒钟\t94337\n好巴宝莉\t94338\n模块\t94339\n从小到现在\t94340\nhhbbcby\t94341\n联系\t94342\n太乖了我的度秘\t94343\n给我一点\t94344\n逊度秘\t94345\n高睿\t94346\n老子老子老子老子老子老子老子\t94347\n七点半23点\t94348\n客户端\t94349\n倒给\t9435
0\n好多么样样\t94351\n加某妍\t94352\n腊月二十八\t94353\n逗地主\t94354\nddgfd\t94355\n集体\t94356\n睡労\t94357\n西奇隆\t94358\nbbbbbbbbbbbb\t94359\n口泉\t94360\n赵大\t94361\n新一代\t94362\nwwn5\t94363\n实验品\t94364\n58756\t94365\n零一块\t94366\n超关\t94367\nInotsee\t94368\njbcdtio\t94369\n新婚夫妇\t94370\n合约\t94371\nnew无赖\t94372\n不可终日\t94373\n花妞\t94374\n应以\t94375\n北师\t94376\n应付\t94377\n开完\t94378\n不要你\t94379\n娘泡音\t94380\n北市\t94381\n应从\t94382\n耐看\t94383\n丰满大剧院\t94384\n许倩文\t94385\n穿透力\t94386\n团魂有天#120812\t94387\n558558\t94388\n吃吃笑\t94389\n爱的成分\t94390\n空常来玩\t94391\n2分之一\t94392\n昂们\t94393\n吼吼我来了\t94394\n店饼\t94395\n生人勿近\t94396\n娇娇女\t94397\nPLAY杂志SJM高清扫图李赫宰/银赫\t94398\n对局\t94399\n安定侧弯\t94400\n好爱好爱你一只都不知道我追你好久了我好爱你\t94401\n言情\t94402\n倒过\t94403\n首级\t94404\n天后来\t94405\n姚美玲\t94406\n误认\t94407\ndddddfddddddd\t94408\n75元\t94409\n开玩笑了泪水平安夜\t94410\nUhsge\t94411\n我不爱你了我想\t94412\n明天我要嫁给你啦我爱你\t94413\n缺位\t94414\n乐平\t94415\n13254\t94416\n海尔诺\t94417\n条路\t94418\n开心斗大猪\t94419\nfyftf\t94420\n噬菌体\t94421\n浮世绘\t94422\n4001608967\t94423\n人命关天\t94424\n花里子\t94425\nXXXXXXXXX\t94426\n忍辱负重\t94427\n付磊\t94428\n快哄哄\t94429\n1484\t94430\n噶手\t94431\n喜马拉雅山脉\t94432\n蛋蛋车不是这样的你的蛋蛋\t94433\n15度\t94434\n说话态度\t94435\n许荫棠\t94436\n我的妹妹也是女娃娃\t94437\n把式\t94438\n机器人位\t94439\n发展商\t94440\n木子文\t94441\n闻到\t94442\n以弱胜强\t94443\n偏心听偏信\t94444\nhahaag\t94445\n大一点\t94446\n吖大哥\t94447\n十七一颗\t94448\n死灵飞龙\t94449\n医保\t94450\n过道\t94451\n1187766\t94452\n水流水\t94453\n瓤晓明\t94454\n我的小猴子和你说把\t94455\n风骤雨\t94456\n凯奇\t94457\n甜蜜蜜怕不里头屋头\t94458\n丹尼\t94459\n刘海燕\t94460\n扰恭候劳驾拜托失陪久违留步\t94461\n我的个亲乖乖\t94462\n不麻烦不麻烦怎么了看我来呀\t94463\n点睛之笔\t94464\n看似\t94465\n2790\t94466\n猪小萌\t94467\nmells\t94468\n逐年\t94469\n一律\t94470\nng家\t94471\n了伦\t94472\nhsyge\t94473\n不敢当\t94474\noupaisleatitiandrodilco\t94475\n10周\t94476\n刘景鹏\t94477\n高艳茹\t94478\n丁子轩\t94479\n两笔款\t94480\n中明\t94481\n女大人\t94482\n杨子兴\t94483\nxk\t94484\nxh\t94485\nxi\t94486\n杨子兰\t94487\nxo\t94488\nxl\t94489\nxm\t94490\nxb\t94491\nxc\t94492\nxa\t94493\nxf\t94494\n笨朗\t94495\nxd\t94496\nxe\t94497\nxz\t94498\nxx\t94499\n饿狼\t94500\n赫然\t94501\nxr\t94502\
nxs\t94503\nxq\t94504\nxv\t94505\nxw\t94506\nxt\t94507\nxu\t94508\nyie\t94509\nyig\t94510\nyif\t94511\n联防队\t94512\nyic\t94513\n妖塔\t94514\n上嗨\t94515\n减震\t94516\nyio\t94517\nyin\t94518\n你了行吗度\t94519\n可想你\t94520\nyij\t94521\nyiu\t94522\n瞒天过海\t94523\nxX\t94524\nyip\t94525\nyis\t94526\n较小\t94527\nyix\t94528\n蛋白\t94529\n白羊男\t94530\n制品\t94531\n花茶\t94532\n描述\t94533\n哈瓦那\t94534\n黑白道\t94535\n鞋号\t94536\nx8\t94537\n费舍\t94538\nTheumberorthirtyyuan\t94539\nx2\t94540\n梅县\t94541\n881145938459933597\t94542\nx6\t94543\n诗歌\t94544\n打油诗\t94545\nx5\t94546\neirutyal\t94547\nTgsgc\t94548\n新浪乐居\t94549\n不准备睡\t94550\n六边形\t94551\ntsloantiloventilityatoros\t94552\n刘叔群\t94553\n又快期末考试了我好伤心\t94554\n爱你就是我心爱的度秘\t94555\n佛教协会\t94556\n10000000000000000000000000000000000.000000000000000000000000000000000000000000000000个\t94557\n叫风\t94558\n轻装上阵\t94559\n困好想\t94560\n12346822￥\t94561\n含羞\t94562\n尾尾\t94563\n向耳\t94564\n8哈\t94565\n停下\t94566\n不酒店\t94567\n再见了我回家了我爱你\t94568\n笔记本儿\t94569\n59块\t94570\n忘了我吧\t94571\nxiujA\t94572\n李小雨\t94573\n诺科雷\t94574\n早十晚\t94575\n梦想事成\t94576\n好看我真\t94577\n给我收\t94578\n6781次\t94579\n哩哩哩争议\t94580\ninhis\t94581\n我不爱\t94582\n堡狮龙\t94583\n好看我看\t94584\n井水\t94585\n沉着\t94586\n8哦\t94587\n娟子\t94588\n雨润\t94589\n崔个\t94590\n顺手更动情\t94591\nfijdyugt\t94592\n一抹\t94593\ndaoni\t94594\n小白白\t94595\n一抢\t94596\n面料\t94597\n11个\t94598\n一护\t94599\n经写\t94600\n校庆\t94601\n肉浦\t94602\npá\t94603\n内急\t94604\n天不刮风天不下雨天上有太阳\t94605\n去脉\t94606\n舒适度\t94607\n哎呦我可冤枉死你\t94608\n昨天早上\t94609\n去脂\t94610\nccvyytffuuhgtycxf\t94611\n11下\t94612\n一份份\t94613\n喔喔喔奶糖\t94614\n戚百草\t94615\n一把\t94616\n侯磊\t94617\n107册\t94618\n蔡朗迪\t94619\n身区\t94620\n对不到\t94621\n杨寒雪\t94622\nvvgbbx\t94623\n钱顺南\t94624\n立案\t94625\n还礼\t94626\n一百二十六\t94627\n韩剧男芭比\t94628\n#秘密天使#\t94629\n不暖\t94630\n崖城\t94631\n美抵\t94632\n胡志強\t94633\n最爱就是你喜欢一个人\t94634\n心源性休克\t94635\n老虎石\t94636\n南大\t94637\n天明鸡\t94638\n不破意思\t94639\n真命天子\t94640\n市民们\t94641\n思修课\t94642\n一辈子鸡\t94643\n哈好吧宝贝vv\t94644\nFdd\t94645\n易居营销集团\t94646\n逗逼\t94647\n编成\t94648\n为你好\t94649\n63岁\t94650\n偏心眼\t94651\n有哪有\t9
4652\n戴杰\t94653\n学编程\t94654\n是我的最好朋友\t94655\n谭雪晴\t94656\n广南\t94657\n画画画\t94658\n任务\t94659\n回答请\t94660\n鱼种\t94661\n切挺\t94662\n广卞\t94663\n黄海斌\t94664\n广博\t94665\n51384126\t94666\n广卅\t94667\n实秋实\t94668\n强间\t94669\n｜依姑娘\t94670\n再一再\t94671\n闷热\t94672\n第二针\t94673\n压惊\t94674\n路朝城\t94675\n黄端\t94676\n萌萌哈\t94677\n拉烈\t94678\n雾里\t94679\n竹纤维\t94680\n萌萌哒\t94681\n你好风\t94682\n8点06分\t94683\n激戏\t94684\n书长\t94685\n91999999999999\t94686\n飞了拜拜\t94687\n一个群\t94688\n溜大街\t94689\n威尼斯共和国\t94690\n咏鹅\t94691\n李永庚\t94692\nDCTY\t94693\n别聊\t94694\n不要脸不要脸不要脸不要脸不要脸不要脸\t94695\n唉嗯\t94696\n喘香\t94697\ntounei\t94698\n和调度秘\t94699\n最近\t94700\n百和镇\t94701\n下午14：30\t94702\n生灵#\t94703\n林志些\t94704\n原定\t94705\n最远\t94706\n最迟\t94707\n度秘酿壹\t94708\n一动一动\t94709\n信誉值\t94710\n说起飞\t94711\n好吧真是哪里美吧\t94712\n原安\t94713\n1161只\t94714\n做事儿\t94715\n点符号们\t94716\n乌市\t94717\n杨海\t94718\n就读秘\t94719\n肌肉男\t94720\n馆员\t94721\n季雨婷\t94722\n刘禹锡\t94723\n安七七\t94724\n听明\t94725\n周钧桦\t94726\n脑公\t94727\n牛撸啊撸\t94728\n抓获\t94729\n信徒\t94730\nrgffcvjklo\t94731\n汪小米\t94732\n访谈\t94733\noutyou\t94734\njikkk\t94735\n真的好爱\t94736\ngkndjpxdg\t94737\n真心烦\t94738\n热泪满眶\t94739\n摄影家\t94740\n小玩闹\t94741\n艺术片儿\t94742\n更何\t94743\n1434473871\t94744\n会好好仔坏蛋大坏蛋\t94745\n哈怂\t94746\ndefghi\t94747\n亲亲色\t94748\n邹燕平\t94749\n缝缝\t94750\n毕业幼儿园\t94751\n五弊\t94752\n要不然的话\t94753\n哥鸟\t94754\nfdfftfterrtyft\t94755\n月儿\t94756\n世豪\t94757\n5.3%升\t94758\n喜添\t94759\n凶不得了\t94760\n奔三\t94761\n顾锦城\t94762\n才知\t94763\nloboo\t94764\n台币\t94765\n玄女\t94766\n美秘\t94767\n曹冰凌\t94768\n李娅静\t94769\n画画画画画画画画\t94770\n图片儿\t94771\n別忘\t94772\n准考证\t94773\n常驻\t94774\n神威\t94775\n奔中\t94776\n习凡\t94777\n不要脸的小三\t94778\n坏哪了呢钱\t94779\nsfvgj\t94780\nh漫女\t94781\n天然会\t94782\n美称\t94783\nhugucgxu\t94784\n玄奘\t94785\n经管院\t94786\n度秘菲\t94787\nryffggv\t94788\nfiuvyutysdsrsrsuffuiuvvtddstw6idfdtrewedfutaybv4rtgcxdgxwtfht\t94789\n急有\t94790\n大火车\t94791\nSNSJSJ\t94792\n抵沪\t94793\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t94794\n糯米酒\t94795\n水痘\t94796\n三多岁\t94797\n竞速\t94798\nuuuuuuuuuuuuuuuuuu1111111111111111111111\t94799\n其行\t94800\n头狗\t94801\n151万日元
\t94802\n宁财神\t94803\n父辈\t94804\n艾都\t94805\n125555568775788798858\t94806\nlity\t94807\nlitu\t94808\nlitt\t94809\n抽脂\t94810\n定位谜\t94811\n靠还\t94812\nlito\t94813\n死丫头\t94814\n那如子\t94815\n5个\t94816\n周俣池\t94817\n一百零\t94818\nCarol\t94819\n凤爪\t94820\n蔡华挺\t94821\n9点48\t94822\n安阳\t94823\n大脑袋\t94824\n我不说\t94825\n怒放\t94826\n9点41\t94827\n张鑫淼\t94828\n9点43\t94829\n9点42\t94830\n杨国\t94831\n9点47\t94832\n毛豆瓣\t94833\n花铃\t94834\n逼片\t94835\n慌乱\t94836\n新奥尔良\t94837\n蔡乐\t94838\n蓝鲨\t94839\n活儿\t94840\n樱花麻蒋梦\t94841\n黄沙古渡\t94842\n不幸者\t94843\n开国\t94844\n推广告\t94845\n要死要活\t94846\n徐哲\t94847\n想啦\t94848\n饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿日日日日日日\t94849\n九十厘米\t94850\n杨振嗯\t94851\nMK2\t94852\n聊聊性\t94853\n开园\t94854\n冯雅楠\t94855\n南社古村\t94856\n一大包\t94857\n好呀别忘了兄弟\t94858\n小矮人\t94859\n哥哥哥哥哥\t94860\n命危贬\t94861\n寻衅\t94862\n鑫淼\t94863\n潘少帅\t94864\nqqbx517\t94865\n蜀鄙\t94866\n二百二\t94867\nYhibua6gjmxyhkvyhurum\t94868\n我想我想看我想看你的是你在哪我去哪闹呢那你不让我去的话\t94869\n豆汁\t94870\n仙辽匕\t94871\n遭拒\t94872\n18467套\t94873\n余占\t94874\n270米\t94875\n买得起\t94876\n亲窝\t94877\n这个周六\t94878\n二菇凉\t94879\nmewtc星星a门ujhswhejhdhfbsjjxjcjjcjcjcjcjkckckfj\t94880\n改朝\t94881\n情爱\t94882\n书客\t94883\n柯小姐\t94884\n欣慰\t94885\nBakery\t94886\n450毫升\t94887\n奥一个\t94888\n创刊\t94889\n瓮PR\t94890\n中国电信联盟\t94891\n共能\t94892\n胡人俑\t94893\n冯滢\t94894\n星期六天\t94895\n许波\t94896\n紫馨\t94897\n料理\t94898\n李老龄\t94899\n欧钰滇\t94900\n我的哥\t94901\n儿戏\t94902\n偷笑]剧场版\t94903\n心情不好关\t94904\nt\t94905\n得力得力得力得力得力得力德里德里德里德里德里\t94906\n好习惯\t94907\n8286991\t94908\n筹划\t94909\n大怒\t94910\n观众们\t94911\n周亚雄\t94912\n鼓\t94913\n鼐\t94914\n大怕\t94915\n鼞\t94916\n莫耶斯\t94917\n乡梓\t94918\n啵罗\t94919\nMark\t94920\n聊聊天行不想\t94921\n死我了你\t94922\n鼁\t94923\n的哥\t94924\n琴棋书画\t94925\n生电台\t94926\nMary\t94927\n投久\t94928\n四代半\t94929\n15935683545\t94930\nabcab\t94931\nabcaa\t94932\n将骨\t94933\n54位\t94934\n咪兔\t94935\n鼻\t94936\n31215643\t94937\n僧伽耶\t94938\n尊好\t94939\n嘻嘻猫\t94940\n鼠\t94941\n咪先\t94942\n大怪\t94943\nabcax\t94944\n收费站\t94945\n呆萌萌\t94946\n龟汤倩\t94947\n三一个七\t94948\n举动\t94949\n吸引力\t94950\n打钱\t94951\n乡梦\t94952\n阿里汉\t94953\n134658679957\
t94954\n扛起\t94955\n唱彩\t94956\nhabd\t94957\n尹涛\t94958\n张静立\t94959\n3999元\t94960\n3.4.0\t94961\n晴隆\t94962\n游乐\t94963\n破裂\t94964\n车场\t94965\n走不起\t94966\n举办\t94967\n打钉\t94968\n打针\t94969\n愿不愿\t94970\n2010年10月31日\t94971\n大声大声\t94972\n承諾\t94973\n未来两天\t94974\n保健食品後\t94975\n杨阳洋\t94976\n观影\t94977\n走错路\t94978\n不对呀\t94979\n恐龙猎手还有你是你是恐龙猎手\t94980\nurym\t94981\n扭把\t94982\njidjud\t94983\n早上5-6点\t94984\n盖·皮尔斯\t94985\nloc\t94986\nlof\t94987\nlog\t94988\nloj\t94989\nlok\t94990\nloh\t94991\nloi\t94992\nloo\t94993\nlol\t94994\n噗是是和平号\t94995\nlos\t94996\nradiotalk\t94997\nlov\t94998\nlow\t94999\nlot\t95000\n两套\t95001\n波_\t95002\nlox\t95003\n五指神经网络\t95004\n杜娘\t95005\n阿马亚\t95006\n首歌听\t95007\n16655368553254532\t95008\n422494￥￥￥￥6455568\t95009\n轩辕剑#昆仑镜篇\t95010\n桃源杯\t95011\ntoutle\t95012\n诗人\t95013\n喇[\t95014\n悲痛欲绝\t95015\n何宝仪\t95016\n颜凯莱斯\t95017\n夹江\t95018\n高水准\t95019\n1964年\t95020\n水果\t95021\n诗云\t95022\n安份耐克\t95023\n人之相随\t95024\n鱼香\t95025\n1347885012\t95026\nTFT\t95027\n恶于\t95028\n承压\t95029\n30天前\t95030\n断腿人\t95031\njxijxj\t95032\nzjpj\t95033\n0点五二\t95034\nFairchild\t95035\n发邮件\t95036\nnxe\t95037\n1357370448\t95038\n见面客气客气\t95039\n一个哦\t95040\n听不听听\t95041\n幸福的我呀美得P颠\t95042\n好男人\t95043\n考设\t95044\n多米\t95045\n短时\t95046\n赵天一\t95047\n四四号\t95048\n五百道\t95049\n王立群\t95050\n斗殴\t95051\n毒物\t95052\n五张\t95053\n再来一个不吧\t95054\n似锦如花\t95055\n舜颈\t95056\n6年前\t95057\n偏导\t95058\n重要\t95059\nTOSHIBAHIPT21\t95060\n点坠\t95061\n硪老公\t95062\n刘硕\t95063\n幽香\t95064\n苟利国家生死以岂因祸福避趋之你是\t95065\n内蒙古\t95066\n藏恶心\t95067\n名义腐皇后\t95068\n小雨啊真的好想你\t95069\n8u5gh5i4iuuh3vv\t95070\n华星\t95071\naaka\t95072\n邪淫\t95073\n化粪池\t95074\nkljjjj\t95075\n诸葛舌战群儒\t95076\n惹不惹\t95077\n毒箭\t95078\n君主制石化\t95079\noooornbnvnflldjjfnhn\t95080\n800798676\t95081\n心急如焚\t95082\n机器人工智能\t95083\n文具\t95084\nhjgu\t95085\nEH16+FZ5BE\t95086\n四十小时\t95087\n大长山\t95088\n转眼\t95089\n脑筋急转弯吧度秘\t95090\n博拉斯\t95091\n在嘉\t95092\n五几秒\t95093\n曾沛慈\t95094\n木大墨\t95095\n承发包\t95096\n亲一个塞\t95097\n杨隋代\t95098\n王菁\t95099\n超人粹\t95100\n新增\t95101\n9:16\t95102\n急湍甚\t95103\n易烊千楠\t95104\n国米\t95105\
n天语s\t95106\n检查\t95107\n感谢你为\t95108\n听到位\t95109\n秦寿生\t95110\n周伯华\t95111\n康贝\t95112\n锈斑\t95113\n少见\t95114\n办公化\t95115\n柜子度\t95116\n九公主\t95117\n蒜苗豆干炒肉丝\t95118\n保障房\t95119\n煮酒\t95120\n光明\t95121\n豆浆瓶\t95122\n些许\t95123\n得心应手\t95124\n班子\t95125\nMisswu\t95126\n千千画琳琅\t95127\n考不好爸爸骂\t95128\n全然\t95129\n诽谤\t95130\n死了都要爱\t95131\n孙祥丽\t95132\njhxhb\t95133\n国籍\t95134\n服拍\t95135\n蔡张\t95136\ncookies\t95137\n排球\t95138\n威尔士法院\t95139\n度秘我是你的姐姐度度源\t95140\n089295\t95141\n藐视\t95142\n听错\t95143\n咕哝\t95144\n问好\t95145\n如花开花\t95146\n龙东\t95147\n阿萨德\t95148\nTOP12\t95149\n一个元\t95150\nnani\t95151\nrator\t95152\n输血\t95153\n几胞\t95154\n将进酒\t95155\nGfgutgj\t95156\nGRAND\t95157\n仰光河\t95158\n性欲\t95159\n吕梦茹\t95160\ngether\t95161\n一个六\t95162\n粮农\t95163\n他人家\t95164\n华育国\t95165\n可偶\t95166\nG8G8\t95167\n女儿国\t95168\n植物大战僵尸二花园战争\t95169\n吴学宝\t95170\n27904\t95171\n诜澡\t95172\n郑氏\t95173\n我的朋友\t95174\n向来\t95175\n活动价\t95176\nTOP10\t95177\n血书\t95178\n真的假的骗你\t95179\n信神\t95180\n法瑞\t95181\n吖吖\t95182\n心情不甘心\t95183\n成百上千万\t95184\nnvbmb\t95185\n10件\t95186\n非麻美敏片\t95187\n功成名就\t95188\n介四\t95189\n上世纪八十年代中期\t95190\n赠选\t95191\nSVU\t95192\n来了碰\t95193\n你是谁你是谁你是美眉大美眉\t95194\nｉｎ\t95195\nｉｉ\t95196\n１５５０\t95197\n正气\t95198\n水果干\t95199\n13657116916\t95200\n祢藤\t95201\n两只二\t95202\nccdsdtrtycc\t95203\n忙吗忙吗忙吗忙\t95204\n张天宇\t95205\n首尔站\t95206\n洞幺\t95207\n帕帅\t95208\n安燕\t95209\n我爱你我想和你\t95210\n三绿碧\t95211\n19990608\t95212\n合格\t95213\n跑速\t95214\n帕希\t95215\n大面\t95216\n秦镇\t95217\n塞巴\t95218\n真面日\t95219\n如意挑\t95220\n安太\t95221\n巨呱\t95222\nyrrg\t95223\n奥okokokokok\t95224\n卓越\t95225\n构架\t95226\n大辈儿\t95227\nonabl\t95228\nx11者\t95229\n姿宜\t95230\n极化\t95231\n八辈子\t95232\n杨普通\t95233\n哎呦那我死\t95234\n黄彦康\t95235\n袁梦迪\t95236\n大黄鱼\t95237\n清凉度\t95238\n十渡\t95239\n对对对对对\t95240\n文明点\t95241\n好啦好啦安心\t95242\n长大不容易\t95243\n去过节\t95244\n酷似\t95245\n78岁\t95246\n僧人\t95247\n离许\t95248\n杂谈\t95249\n继球\t95250\n年代秀\t95251\n安全员\t95252\n家虎\t95253\n春蚕\t95254\nudrus\t95255\n高凤飞\t95256\n纺王海怡喜园\t95257\n名分\t95258\n恶果\t95259\n杭州百怡食品有限公司\t95260\n名刊\t95261\n澄澈\t95262\n罗金诗\t95263\n嗯啦啦\t95264\n
陈琰琳\t95265\n15920054088\t95266\n5x318\t95267\n清蒸鱼\t95268\nNONO\t95269\n5。23\t95270\n大本赛季\t95271\n捐献\t95272\ndogdigxi\t95273\n精神病\t95274\n第1代\t95275\n网言\t95276\n若干幅\t95277\n从来不下\t95278\n颈联\t95279\n政绩\t95280\n十四时41分\t95281\n第五个\t95282\n一气\t95283\n嗯百万\t95284\n那个片\t95285\n业主\t95286\n看写\t95287\n李媚\t95288\n截屏\t95289\nfgxyxh\t95290\n程程\t95291\n5cvsylaken\t95292\n一水\t95293\n闲庭\t95294\n资上\t95295\n二月二十七\t95296\n回程\t95297\n婪接受换头术\t95298\n太平洋深水极品大马哈鱼\t95299\n业业\t95300\n一氧\t95301\n宠爱女\t95302\n一氢\t95303\ndream\t95304\n省上\t95305\n分号\t95306\n蠢货\t95307\n昨日清晨\t95308\n其人\t95309\nhttpfhiphotosbaiducomxiaodupicitemd833c895d143ad4bfc\t95310\nfjkj\t95311\n晚安度秘我爱你\t95312\n须要\t95313\n盐巴\t95314\n田晓梦\t95315\n腿白\t95316\n周定纬\t95317\nciic\t95318\n黄站\t95319\nhttpfhiphotosbaiducomxiaodupicitem29381f30e924b8994c0e7c4c69061d950a7bf690jpg\t95320\n11430万吨\t95321\n腿癌\t95322\n分台\t95323\n度秘瓜娃子\t95324\n老婆态\t95325\n桂林之战马陵之战\t95326\n快快快快快\t95327\n电火花\t95328\n减产\t95329\n张忠臣\t95330\n侧本\t95331\n蛋蛋想你家亲爱的蛋蛋\t95332\n老婆婆\t95333\n叫诗\t95334\n唉想\t95335\n早上八点\t95336\n蔡浩宇\t95337\n51680\t95338\n招数\t95339\n接处警\t95340\n陈思怡\t95341\n大圈儿\t95342\n众参\t95343\nficerig\t95344\n少年\t95345\n日出共青城\t95346\n依稀那我嫁给你好\t95347\nholiday\t95348\n喜欢你好不好\t95349\n陈思思\t95350\n咯落\t95351\n这会儿\t95352\n叫说\t95353\n丁一宇\t95354\n故意类\t95355\n除话\t95356\n浪味\t95357\n纯汉\t95358\n明娜奥奇\t95359\n灵活性\t95360\n被关\t95361\n知己知己知己知己知己知己知己知己知己\t95362\n勾花\t95363\nfvdmgdjhf\t95364\n刚刚\t95365\nwhattit\t95366\n善解人意\t95367\n说销\t95368\nHSPA\t95369\n勾芡\t95370\n永妮\t95371\n烂掉\t95372\n幺婧伊\t95373\n武贵阳\t95374\n婆鸶\t95375\n邪王\t95376\n一八王\t95377\n锁匙\t95378\nt血檀\t95379\n加重\t95380\n自慰器\t95381\n王嘉敏\t95382\n龙彩公司\t95383\n十琉\t95384\n农历腊月三十至正月初六\t95385\nDvbrbdchfbsas\t95386\n梁博涵\t95387\n明月何时照我还上一向\t95388\n和平区南路\t95389\n越来的\t95390\n猪骨\t95391\n候车\t95392\n钢琴曲秋日芒草\t95393\n张锐雯\t95394\n胆小\t95395\n赠送\t95396\n四分之五\t95397\n想当你\t95398\ngufsifeuvxf\t95399\n墨香\t95400\n雪岭\t95401\n该不该死\t95402\n美猴王\t95403\n霞姐\t95404\n定慧\t95405\nAirport\t95406\n杨莹\t95407\n代有\t95408\n信儿\t95409\n台海网\t95410\n草死\t95411\n明风景\t95
412\n女真太\t95413\n曲库恰克\t95414\n橙色\t95415\n度秘你真的好pp\t95416\n郭猜\t95417\n蒙语\t95418\n272点\t95419\n井耕\t95420\n無知\t95421\nv个候车厅\t95422\n如东\t95423\n词句\t95424\n物理平衡\t95425\n分句\t95426\n一绺一绺\t95427\ncomomoooooooooty\t95428\n枕边地\t95429\n天文馆\t95430\nallX5\t95431\n相照\t95432\n分发\t95433\n辽宁抚顺国土局\t95434\n和一做我的老婆\t95435\n中情局\t95436\nProcrastination\t95437\nbeperfect\t95438\n45974857135\t95439\n分友\t95440\nyfmg\t95441\n称谓\t95442\n不可思议\t95443\n看点书\t95444\n孔满\t95445\ntyhgyug\t95446\n歇班\t95447\n怎么那么那么那么那么我之前心情不好别惹我我这几天心情不好别惹我我这几天心情不好别惹我\t95448\n面无表情\t95449\ndwq44gtr\t95450\nidwh\t95451\n右右\t95452\n无所不包\t95453\n葛洋辰\t95454\n政府公信力\t95455\n诸暨市佛教协会\t95456\n押着\t95457\n擦拭\t95458\n摄像师\t95459\n艳丽\t95460\n华夏基金阿联盟\t95461\n当当\t95462\n揭批\t95463\n冒火\t95464\n当归\t95465\n为你脸现有\t95466\n1521301064\t95467\nmin哥瑞\t95468\n老干儿\t95469\n0.3元\t95470\n逃税\t95471\n上市公司\t95472\n地震感\t95473\n发烧水\t95474\n李世民\t95475\n刘加哲\t95476\n刘灿伟\t95477\n新天鹅\t95478\n咯疙人\t95479\n骚年\t95480\n村寨\t95481\n崔博文\t95482\n谢谢你好喜欢\t95483\n行行行好嘞\t95484\n遥远的地球\t95485\n一二度\t95486\n1244567890\t95487\n34项\t95488\n诬泽群\t95489\nZkjdjjh\t95490\n酷越\t95491\n娱乐\t95492\n公式化\t95493\nyaolorouylyaobeigetwo\t95494\n马克思主义者\t95495\n稻子\t95496\n出嫁\t95497\n津波\t95498\n裴龙猫\t95499\n周桂美\t95500\nkaj\t95501\nkai\t95502\nkao\t95503\nkan\t95504\nkal\t95505\nkau\t95506\n2345286\t95507\n功成\t95508\nkay\t95509\n肾亏\t95510\n范迪克\t95511\nBridge\t95512\n人生之福\t95513\n药械\t95514\nLED照明光源\t95515\n用题\t95516\n暖宝\t95517\n装甲发\t95518\n阿布娃娃\t95519\nSheis\t95520\n藏爽\t95521\n去群\t95522\nMC101\t95523\n识别仪\t95524\n唠会儿\t95525\n农家乐大克人时成克我有得罪\t95526\n电银\t95527\n四散\t95528\nBIANG滴\t95529\n茂业百货\t95530\n77917798年\t95531\n饿啵\t95532\n郑玉荣\t95533\nfdhdhwndkkdmfkdnsjdhekcuhdndu\t95534\n一三伦\t95535\n煞笔咧\t95536\n459900735900460425500428\t95537\n大气磅礴\t95538\n西侧\t95539\n哀家不敢\t95540\n幂幂美\t95541\n郭汉谋\t95542\n木金\t95543\n课程\t95544\n利特巴尔斯基\t95545\n飞天火鸡腿\t95546\n修筑\t95547\n夜润物\t95548\n睇佐\t95549\nwaitmethink\t95550\n黄志忠\t95551\n120000\t95552\n都市情感悬疑电影\t95553\n天逸\t95554\n天怡\t95555\n天性\t95556\nノД\t95557\n爸爸巴巴爸爸吧巴巴巴巴一个半个半个半个吧五个
\t95558\n百变溢血\t95559\n不是我的谁\t95560\n吧诺诺\t95561\n李主任\t95562\n王若夫\t95563\n离开我我讨厌你\t95564\n媛曦\t95565\n全寨\t95566\nwifi压力山大你信\t95567\nQQ鸟\t95568\n102n\t95569\n华迷们\t95570\n东妖\t95571\n贾明欣\t95572\nJhzgfh\t95573\nooyyyyyyyyyyyyyy\t95574\n老麻烦\t95575\n算了回\t95576\n检疫\t95577\n九月三\t95578\n梨园趣话\t95579\n笨尸\t95580\n长和\t95581\n咳咳咳咳咳咳咳咳咳\t95582\nSHUDH淘宝旗舰店\t95583\nGghsush\t95584\n石立阳\t95585\n可疑\t95586\n钱月\t95587\n萌度\t95588\n我喜欢与众\t95589\n当我的孤魂野鬼\t95590\n太难了我听不懂\t95591\n我梦女神呢女神\t95592\n3333333芍\t95593\n李冰洁\t95594\nkbjl\t95595\n中外运站台\t95596\n我想你了亲个嘴\t95597\n终生误\t95598\n必然之路\t95599\nifuduchs\t95600\n人口舌\t95601\n什秒\t95602\n比會\t95603\n人粒\t95604\n消法\t95605\n洪都\t95606\n懂你在\t95607\n4194\t95608\n城角街\t95609\n助益\t95610\n堕仙\t95611\n许思晴\t95612\nskrdeffs\t95613\nhthf\t95614\n15724514526\t95615\nisfhy\t95616\n打餐\t95617\nhehasthreenotebooks\t95618\nNononono\t95619\n处众\t95620\n凑烧\t95621\n度秘真乖我想你的文\t95622\n白云湖\t95623\n鸡汁南豆腐\t95624\n蹬天\t95625\n国安绿城\t95626\n阿KenTV\t95627\n代出\t95628\n敢不听\t95629\n自幽\t95630\n男漫天雨\t95631\nSherlock\t95632\nCrepes\t95633\n1930年\t95634\n尽如荼\t95635\n卫龙辣条\t95636\n白凉粉\t95637\n耶茶\t95638\n我的你的你的我的\t95639\n奥吐卡嗯\t95640\n下4小时\t95641\n最前沿\t95642\n双料\t95643\n涅磐\t95644\n费凡\t95645\n艾灸扶阳者\t95646\n很早\t95647\n行不暇\t95648\n不卖给\t95649\n脑公么么哒\t95650\n保守主义\t95651\n17k英特\t95652\n这个人儿\t95653\n打包机\t95654\n有个人走\t95655\nAjax\t95656\nhvfjcbckhxud\t95657\nvehehyui\t95658\n不到货\t95659\n实实在\t95660\n军官\t95661\nopopopppp\t95662\n3点30点\t95663\n南智贤\t95664\n刘涵\t95665\n151bobo么\t95666\n俊逸\t95667\n741585\t95668\n李小欢\t95669\njzkc\t95670\n六二零二\t95671\njzkd\t95672\n黄金箱\t95673\n狗鞭\t95674\n天墉城\t95675\n模一\t95676\n王全欣\t95677\n征\t95678\n大呢77755\t95679\n249.999TON\t95680\n自然环境\t95681\n萌萌萌\t95682\n恩摸\t95683\n惊惧\t95684\n公好\t95685\n666666666666666666666666\t95686\nbikini\t95687\n摩凡陀\t95688\nc15\t95689\n陈剑星\t95690\n一模一样\t95691\n通报会\t95692\n春燕\t95693\ntvxgXBC\t95694\nqcds\t95695\n度出来\t95696\n尼玛顾里\t95697\n刹那\t95698\n登攀\t95699\n自暴自弃\t95700\n主版\t95701\n财富者\t95702\n挑阳\t95703\n15237278375\t95704\n陆贞\t95705\n可环球五\t95706\nＺｚｚ\t95707\n
一反常态\t95708\n心慌\t95709\n完善\t95710\n愛情\t95711\n耿直度秘\t95712\n沙特玛哇\t95713\n分了不兄弟第四奔跑吧兄弟四\t95714\nROCK\t95715\nyffhhfted\t95716\n喵毛撒\t95717\n北方图书城\t95718\n素男\t95719\n五零条\t95720\n陈麒麟\t95721\nsongipopopopop\t95722\n本草求真\t95723\n再也\t95724\n胸错\t95725\n莫班\t95726\nslesjj\t95727\n3ge\t95728\n就是大脚臭不要脸\t95729\n上吐下泻\t95730\n度秘你是机器人吧女的吧好可爱\t95731\n树欲静而风不止\t95732\nggjgrhvdgn8fk\t95733\n哈女刚\t95734\n打嘛\t95735\n儿孙福\t95736\nbdbsj\t95737\n一小块\t95738\n再买\t95739\n电插排\t95740\n一零岁台\t95741\n8万亿\t95742\n山崎三明\t95743\n9点12分\t95744\n樟树\t95745\n好好好嘞\t95746\n一句一俩\t95747\n海淀图书城\t95748\n清晰版\t95749\n你好不好\t95750\n喵星人\t95751\n爹秘\t95752\n一个一块\t95753\n林黛玉\t95754\n阿工夫\t95755\n我不想和你说话了我讨厌你\t95756\n我还记得\t95757\n刘行梦\t95758\n链点\t95759\n九百九千九万九十九个\t95760\n新媒体\t95761\n5376\t95762\n杜明瑞\t95763\nwgwg\t95764\n两门\t95765\n顾子茜\t95766\n两问\t95767\n干毛\t95768\n头黑\t95769\n年子\t95770\n摘额\t95771\n帮帮忙唄\t95772\n图退\t95773\nDYSHW\t95774\n差异化\t95775\n巡车\t95776\nUooU\t95777\ngamy\t95778\n车门\t95779\ngame\t95780\ngamd\t95781\n多多\t95782\n失落\t95783\n四代人\t95784\n举报\t95785\n彩绘\t95786\n东昌府区\t95787\n阿翁阿衣\t95788\n多太\t95789\n568455842574258\t95790\n吵妓\t95791\n车间\t95792\nhrhhenr\t95793\n多大\t95794\n今天早上8点26\t95795\n你的秘密\t95796\n集合影像\t95797\nSHAYISI\t95798\n弄出自来\t95799\n青春如剧\t95800\n律吕\t95801\n省一起走\t95802\n去吗\t95803\n多头\t95804\n唉声叹气娘娘腔似\t95805\n哈谢\t95806\n叽叽叽叽babybaby叽叽叽叽贝贝呗鸡鸡车叽叽叽叽baby叽叽叽叽贝贝\t95807\n奶酪\t95808\n问你问\t95809\n横横哼\t95810\n黯然\t95811\n薄荷音\t95812\niPad版#\t95813\n9代\t95814\njbunhnh\t95815\n15274710860\t95816\n看我到\t95817\n9们\t95818\n谢哼\t95819\n彝族\t95820\n9件\t95821\n乔瑞\t95822\n360x袷袷436x\t95823\n耳巴\t95824\n谢哥\t95825\n王一楠\t95826\n白糖白塘鲺\t95827\n密集\t95828\n别再坐视\t95829\n不干不干不干不干不干不干不干不干不干不干不干\t95830\n日子\t95831\nfromeng\t95832\n日字\t95833\n上进心\t95834\n走了聊\t95835\n沒有\t95836\n吹拉\t95837\n去秀\t95838\n营销部\t95839\n,,,\t95840\n看出发\t95841\n吹拂\t95842\n带土豆\t95843\n内裤\t95844\n李笑\t95845\n扯掉\t95846\n主客场\t95847\n黑小\t95848\n深感\t95849\n冒泡\t95850\n深愛\t95851\n大功放犯法\t95852\n但愿水到此为止\t95853\nv苏\t95854\n室友好上\t95855\n我不好受\t95856\n深意\t95857\n恩是的微很开心\t95858\ntady\t95859
\n陈雪都\t95860\n中风\t95861\n泌尿系统\t95862\n笑猫日记\t95863\n新闻出版总署\t95864\n嘉兴话\t95865\n呢哼信不信\t95866\n100200\t95867\nB1A4\t95868\n冰水\t95869\n早上八点半\t95870\n小萌萌\t95871\n德邦\t95872\n木王\t95873\n看我不知道\t95874\n夸句\t95875\n男地\t95876\n北京西\t95877\n追星族\t95878\n入手\t95879\n霓漫天\t95880\n上车后\t95881\n30分钟\t95882\n一会儿\t95883\n五月儿\t95884\npww\t95885\n天荒\t95886\npwp\t95887\n采写\t95888\n农女\t95889\n时局\t95890\n女国\t95891\n女图\t95892\n妙龄女郎\t95893\n我是说你不爱我了不是我不爱你\t95894\n马宝莉\t95895\nadgmg\t95896\n加壹\t95897\n钟彬\t95898\n静思如\t95899\n歌爱笙a\t95900\n切块\t95901\n咸宁余佐村中心幼儿园\t95902\n不破压\t95903\n李晓慧\t95904\n吹泡泡\t95905\nOMG，coderface\t95906\n别相信别相信\t95907\n丢你\t95908\n植入\t95909\n勿忘我\t95910\n休改\t95911\n事实上\t95912\nkgvohgppi\t95913\n五鬼\t95914\n吴狄\t95915\n方子福\t95916\njduu\t95917\n遺忘\t95918\n高高兴兴滴\t95919\n航班\t95920\nvhhh\t95921\n阿巴拉\t95922\n不怕再来\t95923\n老能不能干\t95924\n邢海英\t95925\n呵嗳\t95926\nyouonth\t95927\n钻石头\t95928\nhijake\t95929\n来了来了来了来了来了来了来\t95930\nd英\t95931\n顺耳\t95932\n度秘你好乖\t95933\n发福\t95934\n依所以说\t95935\n叶问\t95936\n睡袋\t95937\nPrint云\t95938\n耐穿\t95939\n再抢\t95940\n我在这你在哪谜语像雾不是雾象风不是风像一样都市艳\t95941\n搅拌\t95942\n亲口口\t95943\n猪妖记\t95944\n睡不著\t95945\n先饭后\t95946\n邵能\t95947\n看我也喝\t95948\n欣怡图\t95949\n南洛南洛\t95950\n维帅\t95951\nCoffee\t95952\n最深切\t95953\nfhffyfyfyfydvgfsgufdxhyjjjkpldfhkcjvsdf\t95954\n武琪琪\t95955\n一晚上\t95956\n天天天天天天天天天\t95957\nstourityoutou\t95958\n高起点\t95959\n二哥\t95960\n嗯样\t95961\n小作者\t95962\n卖钱\t95963\n等不住\t95964\n劲劲\t95965\n四六点\t95966\nBL漫画\t95967\nfgsdgadtrgs\t95968\n冯鑫闺\t95969\n2，四天\t95970\n墨魂\t95971\n曲靖\t95972\n奇门遁甲\t95973\n二哈\t95974\nyorifof\t95975\n有眼\t95976\n离散\t95977\n76种\t95978\n覅辛\t95979\n区外\t95980\n苏宇航\t95981\n王清河\t95982\n真讨\t95983\n手手不轻\t95984\n91分\t95985\n心怡意识\t95986\nkK\t95987\nok3Q\t95988\n就好啊\t95989\n真论\t95990\njellomIre\t95991\n苦咖啡\t95992\n尽可能\t95993\n8855825\t95994\n废掉\t95995\n张俊达\t95996\n空片\t95997\nTjo\t95998\nCncv\t95999\nfygg\t96000\nfyge\t96001\n五三年\t96002\n小\t96003\n几万平方公里\t96004\nstavw\t96005\n生殖器疱疹\t96006\nwxvc\t96007\n罗叉\t96008\n夏大人\t96009\n益虫\t96010\n邪魅儿\t96011\n巴神鲁尼小白\t96012\n幺八六\t9601
3\n赎罪\t96014\n水涨船高\t96015\n得起\t96016\n涉江者\t96017\n化工\t96018\n公检法\t96019\n并且\t96020\n舣胜芳解伟醚酬谢\t96021\n己亥\t96022\n你的爱情侣\t96023\n罗只\t96024\n口乐\t96025\nggghhcfnnn\t96026\n广电总菊\t96027\n同桌\t96028\n证明\t96029\n并不\t96030\n6月27日\t96031\n电男\t96032\n电电\t96033\n我想和你克我想让你跟我一起聊天集显一起来妈妈\t96034\n甄赵彤\t96035\n职能\t96036\n18373716009\t96037\n本数\t96038\n牛肉味\t96039\nybubych\t96040\n劲爆\t96041\n监视器\t96042\nロンドン\t96043\n呢快说\t96044\nFggygf\t96045\n度幂\t96046\n唉愚钝\t96047\n暂住\t96048\n兄弟兄\t96049\n保婴\t96050\n62269592\t96051\n忙话费\t96052\n度年\t96053\nheycatchmeboy\t96054\n总队\t96055\n称份\t96056\n昆明市\t96057\n东坡志林\t96058\n首排比句\t96059\n夫人\t96060\n3sinb十三分之12\t96061\n作风\t96062\n服饰\t96063\n沙罗子保丽店沙螺子保利\t96064\n省得\t96065\n中国平安\t96066\nArmani\t96067\n内蒙公积中心\t96068\n在家人\t96069\n荀子\t96070\n六十年后\t96071\n曹搜索\t96072\n那么样\t96073\n5574555114\t96074\n惠家家\t96075\n6月21日晚间\t96076\n大一辈\t96077\n夜半\t96078\n不要脸的死胖子变态\t96079\n瓶白\t96080\n如都行\t96081\n打印纸\t96082\n在的无话可说\t96083\n不ci\t96084\n红白蓝\t96085\n好啦拜拜我走了\t96086\n右翼\t96087\n第一次\t96088\nSHINELOVER\t96089\n相若\t96090\n分屏\t96091\n分局\t96092\n贺图\t96093\n分层\t96094\n分居\t96095\n张亚妮\t96096\n图瓦\t96097\nftxhhj\t96098\n两几百\t96099\n七八十岁\t96100\n遵守\t96101\n八五幺零\t96102\npingagiagn\t96103\n第一款\t96104\n要不够\t96105\n快点点\t96106\ncraw\t96107\n小尉子\t96108\n温室\t96109\n要不天\t96110\nmjihdji\t96111\n嗯无敌\t96112\n后户\t96113\nhhhbbb\t96114\n3899\t96115\n342423199611167860\t96116\ncran\t96117\n小爱梅\t96118\n秘验\t96119\n洪笑笑\t96120\n学了有\t96121\n丽均\t96122\n解放鞋\t96123\n下土\t96124\nwirh\t96125\n不当作\t96126\n油腻味\t96127\n朴泰桓\t96128\n恩军\t96129\n13572831367\t96130\n4等\t96131\n许仙爆\t96132\n周俊河\t96133\n放出来\t96134\n王主任\t96135\n下地\t96136\n民街\t96137\n秘骨\t96138\n下场\t96139\n金钟云\t96140\n不当你\t96141\n你你你你你你你你你是一头牛分\t96142\n好我的好\t96143\nVVVVVVVVVVVVV\t96144\n便宜点\t96145\n酬劳\t96146\nbeijing\t96147\n小丑面具\t96148\n有意思奥\t96149\n你给我叫声娘娘\t96150\n87831745\t96151\n啦啦啦啦啦啦我的宝贝\t96152\n坐言\t96153\n失学\t96154\n造梦西造梦西游\t96155\n13：30\t96156\n满池\t96157\n杨力壮\t96158\n生意\t96159\n脏眼\t96160\n探戈\t96161\n18329363050\t96162\n叶太大\t96163\n四分之37分\t96164\n你的决定\t96
165\n0儿\t96166\n中央五套\t96167\n日日增长\t96168\n电路图\t96169\n愉快乐\t96170\n齐白石\t96171\n4566899\t96172\n脐带\t96173\n徒儿\t96174\n罩衫\t96175\ntnjtmwj\t96176\n哭笑不得\t96177\nndLi\t96178\n不逾期\t96179\n2亿9100万元\t96180\n李欣蕊\t96181\n斜片\t96182\nxiaomi\t96183\n暗里\t96184\n站出来\t96185\n玄龟\t96186\n王科理\t96187\n多咪多咪叫我皇太后\t96188\n12995万吨\t96189\n紧靠\t96190\n景仰\t96191\n好些年\t96192\n过一让\t96193\n大湖塘钨矿\t96194\n喂钓鱼\t96195\n高梦蝶\t96196\n董厂长\t96197\n就的的就的就的就的就的就的的你那啊你\t96198\n两融\t96199\n朽躯\t96200\n六零五\t96201\n霜冻\t96202\n撒体\t96203\n黎庆洪\t96204\nYogyoro\t96205\n怨气\t96206\n完全相同\t96207\n顺着\t96208\n何以明之的明\t96209\n贾拜拜\t96210\ntlijgagbi\t96211\n急诊科\t96212\n源代码\t96213\n肩包\t96214\n救济性\t96215\n当作\t96216\n巍巍\t96217\n痴心不改\t96218\n36。38级\t96219\n首胜7召蒛6怎F32fM\t96220\n猫捉老鼠\t96221\n再见了臭不要脸\t96222\n研修班\t96223\n谷地\t96224\n我是女的你的室友说的\t96225\n图书\t96226\n777444777777777\t96227\n避孕剂\t96228\n同乐坊\t96229\n怪鸭\t96230\n写手诗\t96231\n灰底\t96232\n切切切切切切切切切切切切切切切切切切切切克克克闹\t96233\n当你\t96234\n红莲\t96235\n酷句\t96236\nshgd\t96237\n叔狗\t96238\n英语度\t96239\n张卫银\t96240\n穆帆\t96241\n无烟节能\t96242\n半生不熟\t96243\n片中\t96244\n杨凤霞\t96245\ndjffjzjdmfkdgfbxnxbfxnfbzbvcbgbdhfgfjggfh\t96246\n#陈翔#\t96247\n挂号俩\t96248\n奥特利\t96249\n王宝怡\t96250\n雨中\t96251\n节哀顺变\t96252\n归己\t96253\n78458732876487248724842\t96254\n拉萨三大寺\t96255\n芝麻\t96256\n3麦\t96257\n引信\t96258\n润洲杯\t96259\n哇塞酸\t96260\n李安琪\t96261\n抽动\t96262\n证书\t96263\n魏校长\t96264\n事件衫\t96265\n深奥快点\t96266\n慕雪\t96267\n收完\t96268\n压力位\t96269\n孙粤\t96270\n一碗碗\t96271\nFGAF\t96272\n13889936333\t96273\n开门山\t96274\ndoumeno\t96275\n心之火\t96276\n9个小时\t96277\n王毅然\t96278\n豆干丝\t96279\n5月28.29\t96280\n这么明\t96281\n女生公寓\t96282\n189发\t96283\n14611\t96284\n曹守依\t96285\n催泪瓦斯\t96286\n刘和杰\t96287\n杨凯其\t96288\n一体化\t96289\n[偷笑]\t96290\n售楼处\t96291\ncaomm\t96292\n不深奥\t96293\n美短断\t96294\n吃吃喝\t96295\n李秀宅\t96296\n一匹马\t96297\n我们都要\t96298\n判决书\t96299\n可大可大\t96300\n剃头\t96301\n倪泽男\t96302\n就一点\t96303\ntuodrt\t96304\n塞村片\t96305\n姚云冲\t96306\n度秘我爱你我爱你到永远\t96307\n刘佳颖\t96308\n花样年华\t96309\n昌江\t96310\n社稿\t96311\n嫉妒\t96312\n另一只狗\t96313\n寄件人\t96314\npjbjcvvhkjjmbchhvhv\t96
315\n小许\t96316\n飞欢\t96317\n主城\t96318\n大计\t96319\n小记\t96320\n叱诧风云\t96321\n小先兆\t96322\n主聊\t96323\n独生\t96324\n王八大\t96325\n小让\t96326\n小议\t96327\n微板面\t96328\n柊筱娅\t96329\n嘛多米\t96330\n大许\t96331\nhi皮波catoyou\t96332\n今天傍晚\t96333\n对了\t96334\n啦啦576\t96335\n模考\t96336\n楼烦关\t96337\n生化一中\t96338\n对于\t96339\n那好拜\t96340\n哈商务七一\t96341\n适应性\t96342\nv菊花茶\t96343\n重开\t96344\n鸭血粉丝\t96345\n35万发\t96346\n没驾\t96347\nghobggkvfto\t96348\n初次见你你好\t96349\n伙计\t96350\n画当然\t96351\n14.1%\t96352\n嗯烧鸡\t96353\n一个十二岁\t96354\n解求k\t96355\n李时根\t96356\n8888888\t96357\n赵南坊\t96358\n鱼店\t96359\n四岁半\t96360\n蒲峰村\t96361\nmaconor\t96362\n不用怕\t96363\n配音人\t96364\n星期六晚上\t96365\n樱儿\t96366\ncyv\t96367\ncyu\t96368\n11.3%\t96369\n好啦好啦你好\t96370\n洗花\t96371\n便秘版\t96372\n全国政协\t96373\n你是猪吗你是猪吧你是猪你是猪吗你是猪吗你是猪\t96374\ncyg\t96375\ncyf\t96376\ncyd\t96377\n越方\t96378\n嗯开开\t96379\n降龙尊\t96380\n生发\t96381\n何在\t96382\n周好的\t96383\n130米\t96384\n生变\t96385\n坎坷不平\t96386\n1994155ivoyouhautyou\t96387\n爆裂\t96388\n清晨五点半\t96389\n先我先\t96390\n京腔\t96391\n刘辰朔\t96392\n何地\t96393\n撩动\t96394\n范成新\t96395\n蜜子\t96396\n纸片\t96397\n华裔\t96398\n易人\t96399\n你哥是你家男神玫瑰玫瑰玫瑰玫瑰玫瑰\t96400\n荆轲\t96401\n纸牌\t96402\n太一号\t96403\nBfggcghvcgh\t96404\n珠子\t96405\n西华县\t96406\n报名地\t96407\n学用棒\t96408\n3306838\t96409\n唐山堡\t96410\ngjodckuzjn\t96411\n共产国际\t96412\nfuducdg\t96413\n我喜欢夜\t96414\n3Q3\t96415\n艾斯坦\t96416\n腐妹子\t96417\nProgressed\t96418\n吧蓝牙三五\t96419\n视界\t96420\n易于\t96421\n上半身\t96422\n仙游县\t96423\n易事\t96424\n再死\t96425\n狗呢狗\t96426\n王涛娃\t96427\n8百万\t96428\n562\t96429\n北部\t96430\n一周年\t96431\n告示牌\t96432\n范海涛\t96433\n王菀之\t96434\n春史\t96435\n55555553333\t96436\nghhh\t96437\nghhj\t96438\n宇舶\t96439\nthtj\t96440\n555553663\t96441\n找唱歌\t96442\n大比三比五比六\t96443\n剑客\t96444\n4月24日晚8点\t96445\n陪我了我\t96446\nrppc\t96447\n北郊\t96448\nab系\t96449\n姐姐人\t96450\n心理准备\t96451\n境遇\t96452\n揩油\t96453\n卢小小\t96454\n1800台\t96455\n换装\t96456\n立屹立\t96457\n死龟\t96458\n普通百姓\t96459\n肠管\t96460\n来了哪呢哪呢的女人\t96461\n一口子\t96462\n枭雄\t96463\n受教\t96464\n索纳塔度\t96465\n沈林明\t96466\n排比句\t96467\n歌行\t96468\n大好丑\t96469\n受敌\t96470\n前不久\t96471\
n对呀学弟哪好哪好酋阳\t96472\n人乳\t96473\n银川宁\t96474\n紫菜素火腿卷\t96475\n一八九六五三九二七八六七八九十九八七六五四三二九八七八八\t96476\n王千殇\t96477\nhgghfhcgcgxh\t96478\n难妹\t96479\n祖妈妈\t96480\n咪头咪\t96481\n嗯王林山\t96482\n嫌偶\t96483\n大骗人\t96484\n大嘴\t96485\n杂念\t96486\n真不咋地\t96487\n淉淉\t96488\n北京队\t96489\n精子库\t96490\n收起来\t96491\n蓝伟渊\t96492\nGuffthy\t96493\naashijeio\t96494\n一一度\t96495\n事态\t96496\n回到家废\t96497\n周三秒\t96498\n广栋\t96499\n陆春霞\t96500\n笑一声笑\t96501\n望岳一诗\t96502\n2月21日\t96503\n杨戬\t96504\nNicetoo\t96505\n秦天\t96506\n一灯\t96507\n才萨比\t96508\n林诺茜\t96509\n几呀\t96510\n郭柯\t96511\n十到十一岁\t96512\n李家村万达广场\t96513\ndfffggy0t\t96514\n斯大林\t96515\n真瓜\t96516\n就问问\t96517\n玩偶遭\t96518\n59.5%\t96519\nhthd\t96520\n哪页\t96521\n366次\t96522\n王传立\t96523\n接龙呗\t96524\n糸叼\t96525\n小丸子\t96526\n几呢\t96527\n任贤齐\t96528\n不情不理\t96529\n几周\t96530\n羊羊儿\t96531\n中国邮政储畜银行\t96532\n高英多\t96533\n去了呀不然\t96534\nIhate\t96535\n韩娱\t96536\n吴雨轩\t96537\n几味\t96538\n拜别再说\t96539\n跟随者\t96540\n哒哒哒\t96541\n李亚彤\t96542\n鸭溪\t96543\nSrf\t96544\n成都飞机工业公司\t96545\n党旗\t96546\n新平\t96547\n新年\t96548\n祥&amp\t96549\n分不开\t96550\n疗疗\t96551\n大美妞\t96552\n报罩\t96553\n腿裆\t96554\n有你我是\t96555\n诚实哥\t96556\n爱你爱你爱你的桃心\t96557\n汪涵\t96558\n明早七点\t96559\n布景\t96560\n交涉\t96561\n俄新社\t96562\n心情好糟\t96563\nu他嘎嘎嘎嘎\t96564\nqfr\t96565\nqfs\t96566\n读儿\t96567\n投档\t96568\n涅槃\t96569\nqfj\t96570\nqfd\t96571\n干干干\t96572\n腿裤\t96573\nqfb\t96574\n崇明崇明机器还好玩还好天还好\t96575\n九分之一派\t96576\n籍岁\t96577\n一直到现在\t96578\n食客\t96579\n静静静静静静\t96580\n食宿\t96581\n薛沅菲\t96582\n升职\t96583\n三凤\t96584\n85525355\t96585\n私你\t96586\n三几\t96587\n年十岁\t96588\n最后一个愿望\t96589\n刘绍勇\t96590\n213万册\t96591\n皮斯科\t96592\npio\t96593\n高攀\t96594\n概而论\t96595\n劳务税\t96596\n从明\t96597\n从易\t96598\n要不交费\t96599\n推算\t96600\n杖汤\t96601\n男生男生男生男生男生男生男生男\t96602\n三五年\t96603\nsi551\t96604\n增长率\t96605\n一千余\t96606\n红猪八戒\t96607\n才权\t96608\n上性\t96609\n希杰\t96610\n王忠瑜\t96611\n900多条\t96612\n中石化\t96613\n星辉\t96614\n住宅\t96615\n核桃仁\t96616\nEvefereuer\t96617\n哦嘿\t96618\n湛贤涛\t96619\n片子\t96620\n养羊\t96621\n9866\t96622\n连根\t96623\n许紫烟\t96624\nAng\t96625\n羡慕嫉妒\t96626\n住家\t96627\n星辰\t96628\n一蹭\t96
629\n学约\t96630\n住宿\t96631\n一个几把\t96632\n住客\t96633\n买入\t96634\n恐怖\t96635\n恐怕\t96636\n曾曾孙子\t96637\n哦嘟\t96638\n靳东\t96639\n刘金莹\t96640\n获释\t96641\n先办\t96642\n零头\t96643\n杀菌\t96644\n汤佳玲\t96645\n海淀北一\t96646\n告诉我的爱\t96647\n瓶底书\t96648\n石臼窝\t96649\n发瘙\t96650\n红}\t96651\n气力\t96652\n红t\t96653\n呀不起\t96654\n甜魔\t96655\n默契长\t96656\nav69\t96657\n自幼\t96658\ndurs\t96659\n宏盛\t96660\n精达股份\t96661\n7785524\t96662\n豆腐角\t96663\n王泓琏\t96664\n邓森\t96665\n侯浩楠\t96666\n容莫\t96667\n郝科\t96668\n白血球\t96669\n覃祚\t96670\n你好呀我的度秘\t96671\n嘚瑟\t96672\n偶而\t96673\n背篓\t96674\n李铭顺\t96675\n520数\t96676\n五次元\t96677\nthbctk\t96678\n持久\t96679\n米达斯思密达达\t96680\n一么中七\t96681\n心情好辛苦\t96682\n卡办\t96683\n沈阳军区哈尔滨房地产管理局长国\t96684\n嗯暖\t96685\n上不上课\t96686\n裤兜\t96687\n麦梓杰\t96688\n//\t96689\n不留恋\t96690\n开心麻花鸭\t96691\n第一夫人\t96692\n13期\t96693\n半女\t96694\n二奶\t96695\n明天早上6点钟\t96696\n二女\t96697\n巴萨大学\t96698\n麸质\t96699\n李琼钰\t96700\nhdjbefkxjbihtgnjfobvojdfhuocuhoijixe\t96701\nStern\t96702\n等距\t96703\n冯军\t96704\n卡劵\t96705\n账簿\t96706\n拐卖\t96707\n当口\t96708\n李浩然\t96709\n面签\t96710\n李柏树\t96711\n雄厚宫肌瘤\t96712\n咪咪度\t96713\ndosy\t96714\n哈威特\t96715\n电褥子\t96716\nggggfbxc\t96717\nwtsf\t96718\n怜香惜玉\t96719\n东北山\t96720\n齐茶花\t96721\n手写板\t96722\n盖子\t96723\n特别版\t96724\nusuusidi\t96725\n撒听\t96726\n音乐播放器\t96727\n蒽\t96728\n随便吃\t96729\n蒿\t96730\n45度\t96731\n蒹\t96732\n蒸\t96733\n高明超乎寻常\t96734\n嗯行好啦\t96735\n英语类\t96736\n蒲\t96737\n哥哥\t96738\n灵幻\t96739\n扬动错\t96740\n蒜\t96741\n蒙\t96742\n壁｀o田\t96743\n同时\t96744\n李红玲\t96745\n农作\t96746\n不行你你你不行你有你\t96747\n蒋\t96748\ntfgg\t96749\nTMD\t96750\n同日\t96751\npagd\t96752\npage\t96753\n蒂\t96754\n988\t96755\n989\t96756\n4matio\t96757\n都市化\t96758\n王过\t96759\nGchjcfjjrryfhfyffhj\t96760\n猪狗不如来的人还是个度秘度你个头了我问你\t96761\n980\t96762\n周有\t96763\n986\t96764\n罗马涅大区\t96765\n985\t96766\n知识竞答题\t96767\n金佳怡\t96768\n经济舱\t96769\n大臭味\t96770\n白马寺\t96771\n三千六四万八六点\t96772\n黑米七\t96773\n插头\t96774\n周期\t96775\n558423516486404568125425160426156445267855116248559655555556994269229\t96776\n凹凹\t96777\n凹凸\t96778\n一战干嘛\t96779\n秘码文\t96780\n周未\t96781\n周末\t96782\n咀嚼性\t96
783\n发欲\t96784\nSOSTETOOT\t96785\n7元\t96786\n同里\t96787\n衣领\t96788\n盛宴\t96789\ndnjsbxkbwkcbskbjcskbjsbdjcksobsjcjebkxbwjxvjsjsiwowhj\t96790\n910甲\t96791\n南京彭宇案\t96792\n迟晓娜\t96793\n苍老师\t96794\n模拟人生\t96795\njdjdnn\t96796\n偶们\t96797\n初学\t96798\n公安部\t96799\n风云车\t96800\n二院\t96801\n049\t96802\n046\t96803\n045\t96804\n给册\t96805\n等等团队战魂三全集\t96806\n四川区\t96807\n陈景慧\t96808\n给冷\t96809\n胡萝卜土豆\t96810\n锚固吗监狱女神不就一个度秘\t96811\n五一年\t96812\n偶仔\t96813\n四五二六\t96814\n月亮哥\t96815\n对呀小乖乖\t96816\n段段意\t96817\n复合肥\t96818\n是好是\t96819\n谢谢你我的好朋友度秘\t96820\n结单\t96821\n半生累半生你本身欧洲小脾气\t96822\n好嘛快点\t96823\n李玄\t96824\n侯静波\t96825\n女快说\t96826\n多女\t96827\n里外\t96828\n三笑\t96829\n三笔\t96830\n1991年起\t96831\n远景\t96832\n羞花\t96833\n球龄\t96834\n三笠\t96835\nlol亚欧\t96836\n刚好像\t96837\n铁头功\t96838\n答hello\t96839\n几4千三百四十二\t96840\n官信民委\t96841\n里头\t96842\n二龙戏猪\t96843\n小说白\t96844\n不定优柔寡断\t96845\ncgyu\t96846\n拐弯抹角\t96847\n特意\t96848\ndivigk\t96849\n排骨汤\t96850\ndivigu\t96851\n李诗晴\t96852\n太天天\t96853\n校园\t96854\n晕晕晕晕晕伟伟\t96855\n不能死\t96856\natjmjmjmt\t96857\n宏育学怜\t96858\n渔儿\t96859\n桔风车\t96860\n癫子\t96861\n可可粉\t96862\n160诚意百六元\t96863\n乱花\t96864\n娜姐帅\t96865\n笔套\t96866\n姚恒\t96867\n迎着\t96868\n牧野鹰扬\t96869\n去不我没\t96870\n金立s\t96871\n朴政珉\t96872\n首选款\t96873\n创度\t96874\n云梯\t96875\n一一岁\t96876\n做自己\t96877\n上一年\t96878\n姻心\t96879\n亚华\t96880\n4153353566\t96881\n你追\t96882\n本着\t96883\n盈盈\t96884\n天长A美\t96885\n上百亿\t96886\n叶吉卿\t96887\n永远不老拜拜\t96888\n看得出来\t96889\n73分\t96890\n裤管\t96891\n11月份\t96892\n闹闹闹闹闹闹闹闹闹闹闹\t96893\nFightingjug\t96894\n杨传语\t96895\n或者说点\t96896\n大名叫\t96897\nDWW\t96898\n修改\t96899\n窗台\t96900\n豆列\t96901\n赌徒\t96902\n這會兒\t96903\n百五三百六三p\t96904\n想起他\t96905\n潜在\t96906\nhjjjgfduo\t96907\n旧情别恋\t96908\n青肓\t96909\nab式\t96910\n祁临高速\t96911\n薄性\t96912\n张智\t96913\neeerd\t96914\n张晶\t96915\n胡歌帅\t96916\nbcb\t96917\nbcc\t96918\n传教士\t96919\npoasdf\t96920\n服软\t96921\n1328家\t96922\n撸得\t96923\n8775888775558\t96924\n三星\t96925\n6次\t96926\n女河\t96927\n3x33\t96928\n造影\t96929\n3161376979\t96930\n大月亮\t96931\n提纲\t96932\n三明\t96933\n树落\t96934\n飞象网\t96935\n造形\t96936\n6款\
t96937\n86分\t96938\n提线\t96939\n高发\t96940\n张俊嘉\t96941\n预祝\t96942\n降生\t96943\n陶鹿茸\t96944\n边检站\t96945\n耶鲁大学\t96946\n侬葱\t96947\n二氧化硫\t96948\n李金莲\t96949\n脸皮薄\t96950\n八百年\t96951\n吃疯了\t96952\n蚊仔\t96953\n想吃翔\t96954\n老尾\t96955\n茂盛学校\t96956\n574573564775\t96957\n吴猜猜\t96958\n龙舟节\t96959\n纠话片\t96960\nHFJFJF\t96961\n国信度就一个字秘\t96962\n12345678910十一十二1314151617181921\t96963\n美年华\t96964\n炸鸡\t96965\n顾佳辉\t96966\n公園\t96967\n过来不到\t96968\n友谊宾馆\t96969\n黑寡妇\t96970\n谢焉\t96971\n握力\t96972\n通辽\t96973\n老小\t96974\n几只支\t96975\n围抢\t96976\n泽君\t96977\ndxsrd\t96978\n感谢您\t96979\n老将\t96980\n宽容高帅富i\t96981\n泡糖\t96982\n招人与共\t96983\nsusutsrddu\t96984\nefugfvhjo\t96985\n不亮灯\t96986\n我讨厌你我讨厌你我恨你\t96987\n梨心\t96988\n我兰\t96989\n勿盗\t96990\n待会儿聊\t96991\n1：00\t96992\n赛摩\t96993\n穴道\t96994\n亚泰\t96995\nnfj\t96996\n点机\t96997\n天运\t96998\n泰勒的歌\t96999\n卡洛斯\t97000\n乱腾\t97001\n转和\t97002\n我先\t97003\n#咕咚健身追踪器#\t97004\n左贞\t97005\njdgb\t97006\nyhjjjg\t97007\n说嘛\t97008\n鱼窝头\t97009\n鬼度秘\t97010\n郝珺石\t97011\n0.6倍\t97012\n炖汤\t97013\nukaurinokooo\t97014\n雷朝平\t97015\nwode\t97016\nGreene\t97017\n大后天\t97018\n朱妙曼\t97019\n桓仁\t97020\n秋头\t97021\nAron\t97022\ndry\t97023\ndrz\t97024\n克耍\t97025\n关海贼\t97026\nFfx\t97027\n梁少一\t97028\n亞克西喀什\t97029\nFfu\t97030\ndrr\t97031\nFfs\t97032\ndrt\t97033\ndrw\t97034\ndrv\t97035\n度女式机器人\t97036\ndrk\t97037\n两支\t97038\nFfk\t97039\ndra\t97040\nFff\t97041\n呻吟\t97042\ndrd\t97043\ncybunim\t97044\ndrf\t97045\n桑迪\t97046\n家学\t97047\nQqdx517\t97048\n去向向天歌\t97049\n指手画脚\t97050\nTHFFT\t97051\n41341\t97052\n好勇斗狠\t97053\n这个分钟\t97054\n真不够用\t97055\n死的好惨\t97056\n哎呀了\t97057\n转总转\t97058\n菅华佳\t97059\n噱头\t97060\n说构成\t97061\n马政府\t97062\nGbjlmgfvnjm\t97063\n瓦妮\t97064\nHOWAREYOU\t97065\nn　│晚自习。　│n　
─\t97066\n梁伟贤\t97067\n爱情卡\t97068\n妄自菲薄\t97069\n愣长\t97070\n小西西\t97071\n背後\t97072\nNononononomenot\t97073\n瓮奥特曼\t97074\n卢浮宫\t97075\nTop46\t97076\ngbfjgvhchthvjf\t97077\n俩位\t97078\n宝剑\t97079\n三头六臂\t97080\n没戏流行\t97081\n刘为民\t97082\nRami\t97083\n玲儿\t97084\n砒霜\t97085\n9994464949494949494936464616464643443434343起\t97086\n650本\t97087\n为宁为您\t97088\n28.9元\t97089\n流猪\t97090\n随近\t97091\n下个月\t97092\nzZov\t97093\nOmnibook\t97094\n模样\t97095\n经办好\t97096\nnftjcfu\t97097\nsetuh\t97098\n罗西基\t97099\nstraghtshort\t97100\n800度\t97101\n高加索\t97102\n说片\t97103\n高加素\t97104\n九十一块\t97105\n狼厅\t97106\n路步\t97107\n皮加年\t97108\nQQ通讯录\t97109\n6509677\t97110\nDPS\t97111\n放放\t97112\n韦小宝\t97113\n指控\t97114\n点破\t97115\n王丽红\t97116\nπ_\t97117\n會喇囉\t97118\nwangzhan\t97119\n银针\t97120\n冻碎的钥匙\t97121\nrish\t97122\n4s个\t97123\n昆一中心理健康\t97124\n小米2\t97125\n历史\t97126\n小米4\t97127\n小米5\t97128\ndtxcgjj\t97129\n不好意思\t97130\n西尼德\t97131\n手作\t97132\n被停职\t97133\n18743\t97134\n凭据\t97135\n小米#\t97136\noppa\t97137\n二四寸\t97138\n押金\t97139\n胸垫\t97140\n风俗\t97141\n热刺\t97142\n背带\t97143\n叫病\t97144\n嗯噜噜\t97145\n抱块\t97146\n良辰美景奈何天\t97147\n万千\t97148\n蒋天豪\t97149\n万卷\t97150\n如假\t97151\n无言结局\t97152\n哼桑心\t97153\n瞎弄\t97154\n楚秀和\t97155\n风信\t97156\n渡渡鸟\t97157\n磁场\t97158\n320kbps\t97159\n救救\t97160\n深观\t97161\n返沪\t97162\n背帐\t97163\n就是这样的见\t97164\n在哪呢等你在哪了我是磊磊\t97165\n2087年3月88号\t97166\nCIFVF\t97167\n赛艇\t97168\n印刷品\t97169\n123321123321\t97170\n想开除\t97171\n上下人\t97172\n再见度\t97173\ngsggffffffagg\t97174\n孙悟村\t97175\n师哥\t97176\n降存\t97177\n文晓涵\t97178\n财神爷家\t97179\n8起\t97180\njgajtmu\t97181\n快快快快\t97182\n厂家\t97183\n愧不敢当\t97184\n饶平\t97185\n亡谢\t97186\n猪秘度秘\t97187\n畜牧\t97188\n叮出\t97189\n度秘我真的好想你\t97190\nqwdgnbguhgncyhgjkgkkglnkglgfnkglgngkgfkfnmffmfgmmglmglvlfklfklffklibltglggmxdmfgjphfdjhzgvzkjgkujmnjmnwsrylhgyibgtykbggfjklgffkkhghfggrvgkgifktkrorjrkjkrkgfbnfjkfmfkrjkeklckclkfrklrrmelekdlifovlfjXxjkhgmltgim\t97191\n大墙\t97192\n东施\t97193\n在此就好\t97194\n不嫁给\t97195\nnnnnnnnnnnnnnnnnnnnnn\t97196\n蛋话\t97197\n广州番禺有模具厂\t97198\n84867\t97199\n铺子\t97200\n满意度\t97201\n好啦迷
情\t97202\n俩号\t97203\n用程\t97204\n不要脸的哼哼\t97205\n323天\t97206\n俩句\t97207\n俩口\t97208\n大墩\t97209\n胖度\t97210\nwearma\t97211\n舍册漆桌玛\t97212\n柔净\t97213\n你馆\t97214\n一下一\t97215\n咱们的好呀你帮我弄\t97216\n画江湖之领主\t97217\n和声\t97218\n有界\t97219\n博上\t97220\n腿哥\t97221\n摘心\t97222\n青春漫画\t97223\nvjked\t97224\n营销天地#小议\t97225\n大芳\t97226\n杨红梅\t97227\n配不上\t97228\n蜂窝\t97229\n不见到\t97230\n街灯\t97231\n大花\t97232\nh张\t97233\nkokokkokookokokookooklook\t97234\n蟒蛇猎人\t97235\nA4L\t97236\n正正的人\t97237\n一续\t97238\n一下个\t97239\n席勒瓜\t97240\n一维\t97241\nVI，CI\t97242\n身殖器官\t97243\n外貌协会\t97244\n秘蜜度蜜度你在干嘛\t97245\n猪了不如\t97246\n圭贤庆生\t97247\n彭丽媚\t97248\n超高能\t97249\n羞羞答答\t97250\n夏淑杰\t97251\nrEnglish\t97252\n美女可怕零零落落lmMlppllpll\t97253\nAV黄网\t97254\n安芊雪\t97255\n众说纷纭\t97256\n五十三十九岁\t97257\n爆了叫\t97258\n登莉\t97259\n尹总小\t97260\n10000%\t97261\n掩耳盗铃\t97262\n白巴\t97263\n86秒\t97264\n邻县\t97265\n苦心经营\t97266\n白工\t97267\n100000\t97268\n到校\t97269\n100002\t97270\n海静海\t97271\n超猫\t97272\n梅河\t97273\n杨新红\t97274\n法兰西\t97275\n小辫儿\t97276\nz侠盗猎车手\t97277\nQhahHnHehx\t97278\nsudhxvvxv\t97279\n林凤菊\t97280\n过户\t97281\n好高大上\t97282\n侯慧华\t97283\n明之\t97284\n拜拜拜拜拜拜\t97285\n红米闹塔\t97286\n邏邏螺\t97287\n借口\t97288\n家水一方么样\t97289\n塑料布\t97290\n擦边\t97291\n哈勃望远镜\t97292\n六百米\t97293\n武装直升机\t97294\n0068\t97295\ngyg\t97296\ngyf\t97297\ngyd\t97298\ngyc\t97299\n愚公移山\t97300\nmoxie\t97301\n老棒\t97302\n王业华\t97303\ngym\t97304\n好受不了\t97305\n沐浴乳\t97306\ngyh\t97307\n千百年\t97308\ngyu\t97309\n152亿美元\t97310\n放烟花\t97311\n王斌\t97312\ngyy\t97313\ngyx\t97314\n晓舒\t97315\n攀登\t97316\n5160000000000000000000000000000000\t97317\n不行不准\t97318\n二十来个\t97319\n加银\t97320\n我的抗战2\t97321\n袁弘哲\t97322\n实体书店\t97323\n那晚\t97324\n的话了我来了\t97325\n03A\t97326\ncHscbisdn\t97327\n龙须菜\t97328\n158455577\t97329\n郭瑞琪\t97330\n垫背\t97331\n悬殊\t97332\n特大\t97333\n错过\t97334\n半数以上\t97335\n小摩\t97336\n门饭\t97337\n海通织\t97338\nseeg\t97339\n牛企\t97340\nseed\t97341\n给我没有\t97342\n双面\t97343\nseen\t97344\n陆凯雯\t97345\nseek\t97346\n13年前\t97347\neuivhffttrgbgu\t97348\n闫庭凯\t97349\n店里片\t97350\nlgbjc\t97351\n申请书\t97352\n病菌\t97353\n小摊\t97354\n安家僵尸\t9
7355\n才妇产\t97356\n楼面\t97357\n比不比\t97358\n邓新年\t97359\n250号\t97360\n出口处\t97361\nycng\t97362\n李宏言\t97363\n初小姐\t97364\n新浪\t97365\nycnm\t97366\n快勒流榴莲雪糕\t97367\n法定\t97368\n毛东泽\t97369\n手機鈴聲下載\t97370\nBanrock\t97371\n投资\t97372\n反击\t97373\n杉杉来了\t97374\n海上的月亮\t97375\n权钱\t97376\n卡修斯\t97377\n江南萨\t97378\n老子处\t97379\n生活家\t97380\n發個\t97381\n健儿们\t97382\n知识库\t97383\n普陀山佛协\t97384\n11133131461664\t97385\n偶球\t97386\n老早就\t97387\n磁\t97388\n876737\t97389\n磅\t97390\n古月云\t97391\n磊\t97392\n盛辉\t97393\n温柔\t97394\n磐\t97395\n倒是\t97396\n运动式\t97397\n磕\t97398\n收复\t97399\nsork687\t97400\n奥我靠我靠靠靠\t97401\n磚\t97402\n狗s薄\t97403\n有条件\t97404\nhttppinyincn1GSwdQp4zfv\t97405\n爱错\t97406\n武术\t97407\nGhvjvxcvxcgccgxfcffgfffffcfhfyfhfcfhcfcghfhf\t97408\n的故\t97409\nchenxuedong\t97410\n麦拉风\t97411\n磨\t97412\n十部\t97413\n恩度密\t97414\n零八小怪兽\t97415\n磴\t97416\n磷\t97417\n谢啦好\t97418\n吃饺子\t97419\n入壁\t97420\n人面\t97421\n鲁国\t97422\n有好意思\t97423\nwaon\t97424\n零六六二\t97425\nTteu\t97426\n一口一口\t97427\n财政局\t97428\n8539\t97429\n喜亚喜亚\t97430\n马海\t97431\n嗯熊晓\t97432\n真明\t97433\n8533\t97434\n朱帅明\t97435\n从行昂检验\t97436\n猜猜我是大人还是女还\t97437\n躲避\t97438\n搁装\t97439\nvwgssf\t97440\n裙楼\t97441\n一圈圈\t97442\nTTF\t97443\n湖南省湘潭市康星百货屈臣氏\t97444\n袒露\t97445\n歼5\t97446\n真是\t97447\n姚晨凌\t97448\n吴越\t97449\n冷处理\t97450\n不帅不帅\t97451\n名墨莲\t97452\n虎玉\t97453\n泰来\t97454\n殷旭佳\t97455\n不管怎么说\t97456\nWHTD\t97457\n思密达思密达\t97458\n购物税\t97459\n度秘炳帅\t97460\n名高龄\t97461\nEfg\t97462\nyvhhpjjnpnkjjkhxstzrcfixoycdyixdjxxrhxixixjxfioffovul\t97463\n德安县\t97464\n热膨胀系数\t97465\n小花狗\t97466\n干晗勒\t97467\n波比\t97468\n了不然\t97469\n纱帘\t97470\n15834575752\t97471\n参赛者们\t97472\n高压锅\t97473\n冯成\t97474\n屋檐下\t97475\n四风\t97476\n嗲哈嗲哈嗲\t97477\n大声说出我世界\t97478\n黄旭慧慧\t97479\n14块\t97480\n坏还\t97481\n十二哪好的我听不懂\t97482\n求你了求你了求你了求你求你啦求你啦求你\t97483\nunp\t97484\n猪脑糕\t97485\nFirpri\t97486\n陋妣\t97487\n极尽\t97488\n季季\t97489\n看看出\t97490\n长生\t97491\n罗坊小学\t97492\nund\t97493\n大佬我是大嘴在这个多我我是大我是大神\t97494\nuni\t97495\n凡客保卫给爱冬态\t97496\n度知道\t97497\n非礼迷\t97498\n2812036605\t97499\nriadua\t97500\n五年圣彼得堡\t97501\n我美了美了美了我醉了醉了姿\t97502\n
dxgbcg\t97503\n道局\t97504\n两间\t97505\n別說\t97506\n望尘莫及\t97507\n天狼\t97508\n环食\t97509\n杨家豪\t97510\n蓟县公司\t97511\n黑帮\t97512\n天狂\t97513\n许愿树\t97514\n3000起\t97515\n愧己\t97516\n天狗\t97517\n范发性\t97518\n13587626097\t97519\ngjjljvggjr\t97520\n37天\t97521\n怪物\t97522\nmmmmmjjmmmmmjjjjjjjjjjjjjjjjjjjgggjggppttmaFox\t97523\nchud\t97524\n歌本\t97525\nxuuf\t97526\n赖粉\t97527\n焚天斩丹斯\t97528\nchun\t97529\n19：30至21：00\t97530\n木瓦村\t97531\n散席\t97532\ntfaanz\t97533\nchui\t97534\n马红霞\t97535\nchus\t97536\n多发行\t97537\n佳音\t97538\n安仁镇\t97539\ngogogogogogogo\t97540\nchux\t97541\n佳韵\t97542\n散布\t97543\n8247426494653\t97544\n联通四g\t97545\n单间\t97546\n川岛\t97547\n張藝興\t97548\n可可爱\t97549\n干洗\t97550\n强心\t97551\n梅西、哈维\t97552\n校女\t97553\n查理乡\t97554\nokokasolo\t97555\n珍妃\t97556\n照撒\t97557\n钟嘉琪\t97558\n百出\t97559\n爱斯坦\t97560\n一个八十岁\t97561\n徐庄\t97562\n明桃园\t97563\n黄美玉\t97564\n富庶\t97565\n妄图\t97566\n莒南县\t97567\ngamadmdp\t97568\nkizk\t97569\n博浩里亚\t97570\n塔江山\t97571\n珍妮\t97572\n咯糖宝\t97573\n杜蕾斯\t97574\n秘小度\t97575\n山椒鱼\t97576\n念珠菌夜未央呢人才\t97577\n诸侯\t97578\n吴楷堂\t97579\n4512次\t97580\nddnd\t97581\n秦伯淮\t97582\n你和你的我的谁\t97583\n呀嘟咪\t97584\n给销\t97585\n图集\t97586\nBaBygils\t97587\n毛利兰美不美公主名侦探柯南\t97588\n远征军\t97589\n3555262\t97590\n唱K\t97591\n债务人\t97592\nhjjyyb\t97593\n黑水\t97594\n决策\t97595\n白语嫣\t97596\n前两个月\t97597\n强调句\t97598\n１００００\t97599\nkknbmjjp\t97600\n四台\t97601\n劳动美式\t97602\n激动人心\t97603\n四只\t97604\n电信诈骗集团\t97605\n晔\t97606\n舍不得不知道\t97607\n四口\t97608\n1212121212121212121214452552\t97609\n笑笑笑笑笑笑笑笑笑笑\t97610\n财主们\t97611\n巡礼\t97612\n四句\t97613\n黑气\t97614\nPlpagram\t97615\n柳承龙\t97616\n魔仙\t97617\n微来\t97618\n总装\t97619\n女真\t97620\n覅体贴\t97621\n想你不弃\t97622\n下半句\t97623\n银川凯大\t97624\n3838143\t97625\n凌源市\t97626\n空运\t97627\n耳中\t97628\n郝度秘\t97629\n出言\t97630\n八边编法\t97631\n伤人\t97632\n臭咪\t97633\n哈哈哈包\t97634\n尼尼尼尼尼\t97635\n惊动\t97636\n谢3you\t97637\n伤亡\t97638\n支气管\t97639\n十一首\t97640\n王啸晓\t97641\n奥调\t97642\n小云\t97643\n一五一十一\t97644\n2012年4月18号\t97645\n骨膜玲\t97646\n天朝四子\t97647\n美妹纸\t97648\n二百\t97649\n陆森缨\t97650\n宿迁\t97651\n有和\t97652\n焦虑感\t97653\n8700多家\t97654
\n就就\t97655\n口福之欲\t97656\n1975年8月8日\t97657\n来这样\t97658\nbbox\t97659\n东进\t97660\n普通通\t97661\n治曲径通幽千岩竞秀万壑争流\t97662\n虫子儿\t97663\n多夫\t97664\n突泉\t97665\n一根二十米\t97666\nvdr\t97667\n我不我哭\t97668\nvdt\t97669\nvdv\t97670\n酒额\t97671\n多天\t97672\n追查\t97673\n水你猜猜\t97674\n黄呼呼\t97675\n土乜依\t97676\n田珍\t97677\nvdd\t97678\n四十千米\t97679\nvdf\t97680\nvdh\t97681\n参赛\t97682\n地面积\t97683\n不忘形\t97684\n我没有了\t97685\n零秘\t97686\n狂夸\t97687\n零秒\t97688\n周华帅\t97689\n翘高\t97690\n宋智孝美\t97691\nrtui\t97692\n632747926\t97693\n差不不会\t97694\n耨no\t97695\nxinnishuo\t97696\n江寒溪\t97697\n国家主\t97698\n手于卡了网\t97699\n聪明座\t97700\n聪明度\t97701\n3youiloveyou\t97702\n李政\t97703\n爱丽丝\t97704\n576路\t97705\n广播台\t97706\n王怡萌\t97707\nwwdyn\t97708\n庋种\t97709\n巨著\t97710\n圣殿杯\t97711\n尼哈\t97712\n彩彩\t97713\n不追\t97714\nddrrdrewreetffy\t97715\n他一个他\t97716\n庋秘\t97717\n1绔\t97718\n尼哥\t97719\n4.5级\t97720\nTrtt\t97721\n不过\t97722\n不远\t97723\n面试信\t97724\n不迟\t97725\n不还\t97726\n帅妹子\t97727\n流水飞快\t97728\n抗议者\t97729\nД≦ノ\t97730\n1-11月\t97731\nen0号\t97732\nwhatu\t97733\nwhats\t97734\n隐形材料理论\t97735\n榷\t97736\nso嘎\t97737\n145六六十六七十七四十五\t97738\n电脑游戏\t97739\nt200kkkashiouououkastiokokaskaskatismymycoolcoolcomeomy\t97740\njenv\t97741\n太损\t97742\n云烟\t97743\nsshxx\t97744\n洛洛\t97745\n错着头疼\t97746\n邵青青\t97747\nHRGV\t97748\njend\t97749\n比较好说\t97750\n晢\t97751\n蜘蛛侠女\t97752\n几万米\t97753\n榛\t97754\n罗晋\t97755\nxw9\t97756\n绣球\t97757\n榕\t97758\n城里人\t97759\n灭神\t97760\n榉\t97761\n満足\t97762\n榄\t97763\n洛洲\t97764\n小忆可爱关我鸟事\t97765\n澳贝\t97766\n花园儿\t97767\n祭灶节\t97768\n我该\t97769\n发布者\t97770\nr8號線\t97771\n三穿越火线号\t97772\n渐入佳境\t97773\n五日游\t97774\n大慈大\t97775\n颐高\t97776\nwww668\t97777\n血影\t97778\n芮亚彤\t97779\n你的一句话\t97780\n忆江岁入退噗噗噗噗噗噗\t97781\n陈英雄\t97782\n棉棒\t97783\n画饼\t97784\n李国强\t97785\n説出來讓我高興高興\t97786\n追究\t97787\n沈庭贤\t97788\n类种\t97789\n一手天\t97790\nndks\t97791\n定语\t97792\n吗虫\t97793\n魅的良心说话\t97794\n重审\t97795\nlllllllll\t97796\n可爱女\t97797\n臭宝旺\t97798\n张怡妹\t97799\nsnn48\t97800\njejjfjgjqjfjfjfjgjfjjdjfgjbjgk\t97801\n泰鲁什\t97802\n烟酒\t97803\n火星车\t97804\nJustified\t97805\n吗子\t97806\n四下次\t9780
7\n累不好\t97808\n8uu89\t97809\n查不禁\t97810\n一块板\t97811\n指派\t97812\n邓巨\t97813\n不认人\t97814\n胡波\t97815\n全我\t97816\n煲机煲\t97817\n噢\t97818\n印军\t97819\n名雕塑\t97820\n噩\t97821\n器\t97822\n噫\t97823\n噪\t97824\n了别再说\t97825\n商学\t97826\n噱\t97827\nHOW\t97828\n16：38\t97829\niffgjkolpd\t97830\n噶\t97831\nuknow\t97832\n过度秘\t97833\n拓拓\t97834\n16：30\t97835\n噼\t97836\nnixi\t97837\n噁\t97838\n加措\t97839\nfsc\t97840\nfsd\t97841\nfsf\t97842\n来来来\t97843\n思密达度秘\t97844\n噍\t97845\nl月多号\t97846\n张思甜\t97847\n噎\t97848\n读笑话\t97849\nfss\t97850\n噔\t97851\n噗\t97852\n康金崇\t97853\n87路\t97854\n噜\t97855\n中英日\t97856\n打交道\t97857\n七小魔仙\t97858\n讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼\t97859\n越狱\t97860\n神秘性\t97861\n无处不度\t97862\n肖恩\t97863\n喉癌\t97864\n四一个人\t97865\n中华民国内政部\t97866\n洪飞沙\t97867\n07932450256\t97868\n碍事\t97869\n铝塑\t97870\n情浓意\t97871\n凝神\t97872\nEXO之翩若惊鸿女主幕晚洛叫\t97873\n茜高兴\t97874\n投顾\t97875\n天降贤淑男\t97876\n至始至终\t97877\n反同者\t97878\n86010601\t97879\n总远程\t97880\n中力\t97881\n薇采\t97882\n35.00元\t97883\n发布会\t97884\nAdventures\t97885\n摆渡人\t97886\nMh\t97887\n冲高回落\t97888\n县南\t97889\n为你的的战\t97890\n十个月\t97891\n吉祥三宝大合照\t97892\n阿尔勒\t97893\n周存\t97894\nhoid\t97895\n周孟\t97896\nhfjcjgfjfkg\t97897\n10086棒\t97898\n断臂\t97899\n109ou\t97900\n傻傻\t97901\npqpqpqpqpqpqppqpqa\t97902\n回落\t97903\n一小时内\t97904\n永乐咸宁\t97905\n桃花\t97906\n附送\t97907\n成梦\t97908\nfyyg\t97909\n转账\t97910\nbigpig\t97911\n放炮\t97912\n小冰\t97913\n催眠曲\t97914\n夜种\t97915\n跑兔兔\t97916\n小冫\t97917\n走神\t97918\n硬币\t97919\n众多天\t97920\n张潇允\t97921\n今天下午13：20\t97922\n小冬\t97923\n万霞广场站\t97924\nadsfadf\t97925\n绞股蓝\t97926\n12:20\t97927\n盗取\t97928\n石膏线\t97929\n小写\t97930\n手影\t97931\n男版\t97932\n小农\t97933\n人民族\t97934\n寇艺宸\t97935\n公文\t97936\n半枕\t97937\n烟斗\t97938\n一个五一个十\t97939\n医学翅\t97940\n脸颊\t97941\n8046\t97942\n11225556688个\t97943\n文笔\t97944\n罗家湾小学\t97945\n48.6度\t97946\n动动度\t97947\n今早晨\t97948\n一眸一\t97949\n纽伯瑞\t97950\n史前生物\t97951\n想你你有\t97952\n我要你猜我没有病\t97953\n国民枪战我要刷盘龙双龙\t97954\n星哥\t97955\n小黑侠\t97956\nhbn\t97957\n偶是\t97958\nhbk\t97959\nhbj\t97960\nhbe\t97961\nhbd\t97962\nhbg\t97963\nhbf\t97964\n战力战\t9
7965\nhbc\t97966\n喔真\t97967\n10多个\t97968\n锅炉\t97969\n直击\t97970\nJng\t97971\nhbx\t97972\nfgsvzhfg\t97973\n磨店职业技术学院\t97974\nhbt\t97975\n你了谁教\t97976\nhbv\t97977\n卡曼\t97978\nhbs\t97979\nc108岁\t97980\n市内\t97981\n奥尼克\t97982\n请自来\t97983\n按时\t97984\nbingbo\t97985\n橙重生女变\t97986\n葫芦娃葫芦哎呀hi葫芦\t97987\n数职\t97988\n粗俗\t97989\n按日\t97990\nPpppppPppPooppPppPpppPpppPpppDpxppppPppPppppzppppPpsppPpp\t97991\n唧唧复唧唧\t97992\n和息\t97993\n阎然\t97994\n超新粘土\t97995\n185433445\t97996\nnOK\t97997\n心急忧郁的日子\t97998\n过河\t97999\n江南春\t98000\n哭了啦\t98001\n是美若\t98002\n午觉呗\t98003\n微爱\t98004\n60多年\t98005\n忍俊不禁\t98006\n碎花\t98007\n塔克\t98008\n你港\t98009\n5756542\t98010\n巴啦啦小魔仙之魔剑公主\t98011\n做做好\t98012\n探班\t98013\n信条\t98014\n猪佬大猪\t98015\n啊翅\t98016\n四代\t98017\n焉能知我梦里真\t98018\nkga1a15g1a1cj\t98019\n任意球\t98020\n爱你的生话说寝室\t98021\n金英云\t98022\n何程程\t98023\n一四5节\t98024\n搭讪\t98025\n鲁滨逊飘流记\t98026\n爱的等待\t98027\n药谷\t98028\n聊了吧\t98029\n懒讲\t98030\njdnd\t98031\n安吧\t98032\n一颖儿\t98033\n纯属巧合\t98034\n秘气\t98035\n发国道讨厌\t98036\n7天左右\t98037\njdng\t98038\n秘氏\t98039\n打假\t98040\n三阳\t98041\n收费器\t98042\nrabbitsbrabbitssbabbithisbrabbitssb\t98043\n殿堂\t98044\n韦晴\t98045\n小生意\t98046\n瞒王\t98047\n好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛好嘛真好吗你好\t98048\n吃翔\t98049\n安吉\t98050\n廠屍邊\t98051\n自此\t98052\n化学品\t98053\n参悸\t98054\n日学\t98055\n脍炙人口\t98056\n白张洪\t98057\n傻子件\t98058\n红度\t98059\n救命药\t98060\n西爷\t98061\n三个星期\t98062\n我讨厌你我问你讨厌我\t98063\n买题\t98064\n波长\t98065\n辣人\t98066\n陆同学\t98067\n巴平措\t98068\n太阳浴\t98069\n555484585785\t98070\n红帽子\t98071\n只老虎\t98072\n搬出来\t98073\n苍南检察院\t98074\ngjll\t98075\n铁锈战争\t98076\n闭麦\t98077\n甘薇\t98078\n八点五\t98079\n丹江\t98080\n易水\t98081\nABcdE\t98082\n垮塌\t98083\n当当当当当当当当当丁丁丁\t98084\n靜秋其實\t98085\n科林顿\t98086\n夏河\t98087\n12885554\t98088\n张兴鹏\t98089\n1.57%\t98090\n两分钟照\t98091\n感受我\t98092\n羊座\t98093\n所主\t98094\n八九十万个\t98095\n杰森斯旦森\t98096\n云淡风轻\t98097\n仲生\t98098\n#小肥羊\t98099\n赵丽年\t98100\n忙吧么么\t98101\n威宝x6\t98102\n五一节\t98103\nWTO\t98104\n顾小宝\t98105\nwithin\t98106\nsoalida\t98107\n茶饮\t98108\n校报\t98109\n海鹏happen\t98110\npvn\t98111\n伐们\t98112\n永远以后\t98113\n真心感觉\
t98114\n溺宠\t98115\n一心一八匚\t98116\n胡二胡\t98117\n杨承坤\t98118\n茶饼\t98119\n红蓝\t98120\n5944642\t98121\n秘度秘你说我喜欢你\t98122\n翻沙\t98123\n程云阳\t98124\n尖端\t98125\n8879\t98126\n３周岁\t98127\n杀鱼\t98128\n好朋友好兄弟一起玩咱们\t98129\n黄花儿\t98130\n期货\t98131\n晚婚假\t98132\n7点10分\t98133\n藏剑\t98134\nllohaibataitailailailloallulolakuaillochallililllolplillilallpl\t98135\nuucucuvj\t98136\n最好的爸爸\t98137\n猜猜猜猜\t98138\n宠魅\t98139\n下午2：30\t98140\n靠背垫\t98141\nyouspeakEnglish\t98142\n计吐\t98143\n巫大我\t98144\n唉不跟\t98145\nAtmos\t98146\n量贩\t98147\n郭效贤\t98148\n明天今一金釜釜我的爸爸妈妈\t98149\n折船\t98150\n渔歌子\t98151\n嗯奥秘\t98152\n玉露\t98153\n计含\t98154\n小空\t98155\n电容屏\t98156\noppox9007\t98157\n王婆力\t98158\n不要脸不要脸你不要脸不要的你不要脸不要腚\t98159\n小穴\t98160\n冰壶\t98161\n爱定准\t98162\n新客流sg智取\t98163\n好想你好想你好想你好想你死\t98164\n还款额\t98165\n没不见\t98166\n你你说你爱我爱我爱我爱我凭什么\t98167\n范井源\t98168\n高和生\t98169\n圣诞节词\t98170\n焉知\t98171\n环节\t98172\n乐教乐\t98173\n七个人\t98174\n第19期\t98175\n锅子\t98176\n老鼠过街人人\t98177\n迪信通\t98178\n顶级\t98179\n叫苦\t98180\n噶别\t98181\n珠江\t98182\n不当得\t98183\n志荣\t98184\n草莓蛋糕\t98185\n8月31日\t98186\n360免费WiFi\t98187\n导盲犬\t98188\n甘薯\t98189\n褥疮\t98190\n常茜\t98191\ngligag\t98192\n好壮\t98193\n心虚\t98194\n转悠\t98195\n淡蓝\t98196\n圆雕\t98197\n我的紫英啊玄霄\t98198\nwqw\t98199\n5515151\t98200\n骚扰狂\t98201\nwqs\t98202\n足和\t98203\n绿农庄\t98204\n阴金\t98205\n关乎\t98206\n三八聚赌\t98207\n一勺\t98208\n防龋\t98209\n胡汉三磊\t98210\n凯后\t98211\n雨心\t98212\n凯吉\t98213\nhttphhiphotosbaiducomxiaodupicitem810a19d8bc3eb135bc669acea11ea8d3fd1f4425jpg\t98214\nlittle\t98215\n一一n一\t98216\n哦世勋\t98217\n获悉\t98218\n心狠\t98219\n下放大吧\t98220\ngskvdd\t98221\n遗憾的呀\t98222\n泄密\t98223\n市百花小学五年级大队委\t98224\n对不过\t98225\n根部\t98226\nuhjkhgjgdzisyrfhzflufzfhlchzczbvcjjjj\t98227\n缓呼\t98228\n小魔仙战\t98229\nov5620\t98230\n好我不行\t98231\n副主任\t98232\n度行\t98233\n度密码歪歪度秘\t98234\n54．5％\t98235\n不爱惜\t98236\n杀奸\t98237\n山茶花迷之微笑\t98238\n拜掉\t98239\n80几分\t98240\n玩一玩\t98241\n徐普欣\t98242\n那太子妃升职记\t98243\n二千克\t98244\n自税\t98245\ngungungungungungungungunh\t98246\n粉底\t98247\n新东方沪江网校\t98248\nfdrddfdfffdgg\t98249\n强健\t98250\n黑罗\t98251\n以纯最热讯\t98252\n汪伦\t98253\n连
云港\t98254\n三乐\t98255\n度表\t98256\n佛莫默默我\t98257\nbybanany\t98258\n34289999\t98259\n冷门\t98260\n小魔\t98261\n千古骨\t98262\n夏诗熠\t98263\n种植业\t98264\n饥困交迫靠\t98265\n愚妄\t98266\n害不害怕\t98267\n一颗树\t98268\n星愿\t98269\n小魏\t98270\n很乖乖\t98271\n286168415\t98272\njurdv\t98273\n胡言\t98274\n弹指一挥间\t98275\n相和\t98276\nGYJ\t98277\nusppplat\t98278\n急救室\t98279\n音教\t98280\n杨增栋\t98281\n零点儿\t98282\n中国红十字会\t98283\n甲乳\t98284\n莫名其实\t98285\n幕后黑手\t98286\n四月一号\t98287\n下风扇\t98288\n嗯千五\t98289\n为啥子\t98290\nfddd\t98291\n醉莱斯哟\t98292\n涉猎\t98293\n冤情\t98294\n中央自然博物馆\t98295\n光年\t98296\n柑子\t98297\n亲爱的孩子\t98298\n杠杆\t98299\n家装\t98300\n100亿美元\t98301\n甲乙\t98302\n董烁\t98303\n九百九十九万九千九百九十九亿九千九百九十九\t98304\n女团\t98305\n宝鸡啤酒\t98306\n熊软糖\t98307\n武隆\t98308\n25252388\t98309\n郑汇东\t98310\n那么难\t98311\n下完\t98312\n枝之树\t98313\n分项\t98314\n三八呀\t98315\n长史\t98316\n兮涕\t98317\n怀旧\t98318\n天草\t98319\n井柏然\t98320\n13376368622\t98321\n尼玛不归路\t98322\n宋之瞳\t98323\n变故变故\t98324\n六四个\t98325\n这样的女人\t98326\n左飞飞右飞飞\t98327\ndicb\t98328\ndicg\t98329\n看不清\t98330\n充电器\t98331\ndick\t98332\n黄金日\t98333\n赵拜拜\t98334\nCfhycvttyfuctyuf5f5fd4d45d\t98335\n湖北旅游局\t98336\n6二2几\t98337\n穿过去\t98338\n妈妈妈妈妈妈妈妈妈妈\t98339\n汉化\t98340\nokokok\t98341\nover\t98342\n六四三\t98343\n断了手四岁\t98344\n凯乐斯\t98345\n上市\t98346\n不着\t98347\n上师\t98348\n施美施美\t98349\nfufifi\t98350\n台湾省\t98351\n上帐\t98352\n梧桐树\t98353\n八九十哇\t98354\nSOHO中国\t98355\n上帝\t98356\n洋爽\t98357\n板爷\t98358\n别片\t98359\n三么\t98360\n上带\t98361\n不睬\t98362\n纸墨\t98363\n亚文\t98364\n蜜糖们\t98365\n3個\t98366\n第2步\t98367\n射手男\t98368\n123456788888888888\t98369\n肥东\t98370\n迷离的眼\t98371\n褚莹莹\t98372\n说得上\t98373\n飞行类\t98374\n假花\t98375\n钟臣\t98376\n继承者们\t98377\n丟弃\t98378\n重金属\t98379\n第五段\t98380\n傻傻傻傻度秘\t98381\n幺零二九\t98382\n工农兵\t98383\n就好看\t98384\ntygug\t98385\n见人如\t98386\n论价\t98387\n饮食\t98388\n大沙\t98389\n假节\t98390\n梗系\t98391\n一模一笔\t98392\n浅色\t98393\n南无\t98394\nY者\t98395\n众生\t98396\n捕风捉影\t98397\n菓风小铺\t98398\nandhongkong\t98399\n在也不离\t98400\n牢饭\t98401\n还好吧v不v过分\t98402\n150厘米\t98403\n癞皮\t98404\n骚动\t98405\n好转\t98406\nSpeakenglish\t98407\npapadpnj0nj0jntdg\t98408
\n载歌载舞\t98409\nLH720\t98410\n没有你萌\t98411\n百阿吖\t98412\n４０　年\t98413\n我老爸\t98414\n朱建雄\t98415\n只不完\t98416\n三分之一种\t98417\n课间\t98418\n叽叽\t98419\n上位笼\t98420\nchiphotosbaiducomxiaodupiciteme4dde71190ef76c6adf1b4599a16fdfaaf5167a4jpg\t98421\n我们群\t98422\n刘喜洋\t98423\n帅敏\t98424\n一百多家\t98425\n雷候\t98426\nZawiyah\t98427\n试爱\t98428\nMP\t98429\n微呗\t98430\n2mm\t98431\n31415926岁\t98432\n秘蜜香香\t98433\nshgedg\t98434\n大家好我的\t98435\n问天始\t98436\n天一起\t98437\n格罗斯特\t98438\ngghgh\t98439\n超智能自行车\t98440\n坏人我讨厌你我讨厌你我讨厌你\t98441\n万紫千红\t98442\n么思汗\t98443\n不要不要不要不要不要不要不要不要不要不要不要不\t98444\n疲备\t98445\n必要害羞\t98446\n罗森\t98447\n裤子裤\t98448\n小黄蜂\t98449\nFFGHIOK\t98450\ne1种\t98451\n贴身\t98452\n苏醒\t98453\n对照\t98454\n欢乐斗你\t98455\n变心灵活\t98456\n上海地铁囧事\t98457\nf90\t98458\n谢谢是\t98459\n56899878965668\t98460\n注资\t98461\n标间\t98462\nf99\t98463\n马奶\t98464\n郑和\t98465\n赵暖气\t98466\n外桃源\t98467\n鬼信\t98468\n钱宇佳\t98469\nKhunnie\t98470\n南哥\t98471\n舒肝\t98472\n说得好说\t98473\n欢乐斗地主\t98474\n谢谢昂\t98475\n纯真的心\t98476\nＯＫ\t98477\n难难\t98478\n131420\t98479\n二零零三\t98480\n淋巴\t98481\n三四条\t98482\n会治\t98483\n树叶\t98484\n17:28\t98485\n数百名\t98486\n20秒内\t98487\nWood\t98488\n顶着\t98489\n昆汀\t98490\n柿子号\t98491\n上官嫣然\t98492\n王依晨\t98493\nskskskd\t98494\n专门为人\t98495\n废话我懂\t98496\n徐振恺\t98497\nGoogleplau\t98498\nkopje\t98499\n彭悦先\t98500\n盲派六亲诀\t98501\n十五秒\t98502\n论见\t98503\n下周五晚上\t98504\n某某某\t98505\n秘大不大\t98506\n麻根水解\t98507\n死天国\t98508\n6点34分\t98509\nvyfe\t98510\n密\t98511\n刘婷婷\t98512\n拉稀\t98513\n晕过\t98514\n学堂\t98515\n商品\t98516\n我自个儿\t98517\n富\t98518\n为其\t98519\n寒\t98520\n寐\t98521\n陪睡觉\t98522\n阿饼\t98523\n纽伦堡\t98524\n缅铁\t98525\n费尔斯曼\t98526\n甜甜蜜蜜\t98527\n外表人\t98528\n猪版\t98529\n寡\t98530\nvhfjgfhfhfhgfhgfkfhhfkxgnff\t98531\n實\t98532\n寥\t98533\nKacha\t98534\n寫\t98535\n想多真是\t98536\n寨\t98537\n丽雅\t98538\n为先\t98539\n寬\t98540\n陆叶\t98541\n客令\t98542\n猪牛\t98543\n美式橄榄球\t98544\nuhfy\t98545\n呃公主\t98546\n寵\t98547\n发禁\t98548\n茹萱\t98549\n什厶酗丰\t98550\n对\t98551\n寸\t98552\n寿\t98553\ntjx6cbccjxhi\t98554\n心帜\t98555\n全考\t98556\n鬼长\t98557\n家摊儿\t98558\n瑰伟\t98559\n第四只\t98560\n36号\t98561\
n我不绝的你美\t98562\nstmityou\t98563\n某瑶\t98564\n莲前中队\t98565\n865岁\t98566\n更新点\t98567\n上燃\t98568\n我叫玫瑰王子\t98569\n相仿\t98570\n周文静\t98571\n沙黾\t98572\n1500元\t98573\n第11名\t98574\n没用过\t98575\n吹来\t98576\n作人\t98577\n13036167404\t98578\n万杰\t98579\n李发栋\t98580\n第二类\t98581\n韬光养晦\t98582\n基全\t98583\n搓揉\t98584\n合一\t98585\n正句\t98586\n不管你了再见\t98587\n郑永平\t98588\n比唱\t98589\n赵花花\t98590\n老范\t98591\n几个八个\t98592\n活到老学到老\t98593\n毁约\t98594\nzle\t98595\n累不累昂\t98596\n猫里奥\t98597\n营子\t98598\n十7月31\t98599\n华孚泰\t98600\n偶默默默默默默默默摸摸\t98601\nfugff\t98602\n切码法\t98603\n于颀峰\t98604\n任慧琳\t98605\n承宰\t98606\n发卷\t98607\n正反\t98608\n老茧\t98609\n陆陆鸿\t98610\n二三三八零\t98611\n刘芸\t98612\nVDU\t98613\n曾国藩\t98614\n邓月\t98615\n董欣宇\t98616\n细品\t98617\n小少\t98618\n伯顿\t98619\n小游人\t98620\n小小\t98621\n梁思成\t98622\nADuoTwwt\t98623\n混月如和阿奴\t98624\n砖块\t98625\n小将\t98626\nvvvvv\t98627\n曲面\t98628\n500米\t98629\n小尼\t98630\n小尾\t98631\n张金娜\t98632\n菜花花\t98633\n李锦裳\t98634\n说服自己\t98635\n四娘么\t98636\n嘿嘿谢\t98637\n飞聊\t98638\n撒皮\t98639\n60月\t98640\n署山战纪\t98641\n48.5%\t98642\n1uhehhshehshhsis\t98643\n睡了困\t98644\n15026307642\t98645\n度秘我的小盖儿小心肝我爱你\t98646\n坨粑粑\t98647\n可爱脸有脸\t98648\n好多倍\t98649\n别回我\t98650\n轮表\t98651\nsandwaach\t98652\nljvkvkcmvlhclv\t98653\n佛山皓昕首饰公司\t98654\n847\t98655\n845\t98656\n844\t98657\nchswhu\t98658\n842\t98659\nac6d1\t98660\n840\t98661\n跌三倒四\t98662\nkcjsj\t98663\n黑豆粘连\t98664\n该源\t98665\n利多肌\t98666\n水缸\t98667\npnnnnnnj\t98668\n9点14点\t98669\n巴真\t98670\n男的进击\t98671\n成功圆梦\t98672\n凯拉奈特\t98673\n17633780181\t98674\n3月1日１时\t98675\n十来位\t98676\n点醒\t98677\n幸福地\t98678\n邹宇翔\t98679\n五百千万亿\t98680\n趁着\t98681\n拨乱\t98682\n人工通话人工\t98683\n别磨\t98684\n信访\t98685\n认识你我是人\t98686\n二哦\t98687\nab3分之一\t98688\n你在戏你在丧尸你\t98689\n高慧雅\t98690\nuougz\t98691\n正缘\t98692\n五零二八\t98693\n卖母\t98694\n龙王\t98695\n摸摸摸摸摸摸摸摸\t98696\n绝响\t98697\nTfdrubdeid\t98698\n嗯块\t98699\n女撑男\t98700\n11堆\t98701\n那你儿岁\t98702\n天不怕地不怕\t98703\n没感冒\t98704\n绝品\t98705\n真的好难\t98706\n秘密花园\t98707\n2335937311\t98708\n胎盘\t98709\n看來我們還不如你們\t98710\n阿沐\t98711\n郝锦燕\t98712\nM774701\t98713\n首局\t98714\n哑口\t98715\n觉晓
一\t98716\n雪绒花\t98717\n省地儿\t98718\n讲问\t98719\n利空\t98720\n说了嗷\t98721\n你你你你你你你你你你你你你你\t98722\n旺仔牛奶\t98723\n而已好\t98724\n数321\t98725\n石佳\t98726\n乐淘网\t98727\n伴于爱\t98728\n夏文谦\t98729\n會吖\t98730\n阿河\t98731\n露宿者\t98732\nccttct\t98733\n李狗蛋\t98734\nNo，No\t98735\nweiphone\t98736\n药学院\t98737\n有眯\t98738\nstomyou\t98739\n台南\t98740\n我是女人我是女神\t98741\n110308\t98742\n白玉鸟\t98743\n不差人\t98744\n都市网\t98745\n有你讨厌\t98746\n政党\t98747\n14次\t98748\n三首\t98749\n轻伤\t98750\n程旭佳\t98751\n喵咪\t98752\n翻书\t98753\nplaile\t98754\n小伙伴\t98755\n廊下\t98756\n假扮\t98757\n、，\t98758\n心地善良\t98759\n百度化\t98760\nFXDGDHC\t98761\n旺仔牛奶旺仔牛奶\t98762\n丑不丑\t98763\n一些人\t98764\n尿尿我尿你尿你尿\t98765\n穿针引线\t98766\n游戏架\t98767\nbeakEnglish\t98768\n仙丹\t98769\n遥想\t98770\n双流县\t98771\n没问问\t98772\n外邦\t98773\n南山后村\t98774\nchris\t98775\n遵纪守法\t98776\n聪明知道\t98777\n双性恋\t98778\n单身歌女\t98779\n光管\t98780\n鉴\t98781\n诈骗集团\t98782\n李新型\t98783\n陈九千\t98784\nwetr\t98785\nwetu\t98786\n痒不痒\t98787\n尿检\t98788\nweth\t98789\n好战\t98790\n学生卡\t98791\n江水围镇\t98792\n哇呜\t98793\n麻辣烫\t98794\n好我\t98795\n翅上\t98796\n礼尚往来\t98797\n尿棒\t98798\n千张\t98799\n南海诸岛地理志略\t98800\najs\t98801\njgjgagc\t98802\n锡箔\t98803\n建海\t98804\n1423万人次\t98805\n坨屎\t98806\n呼朋引伴\t98807\n张瑶瑶\t98808\n删倾\t98809\n89元\t98810\n错了错了错了错了\t98811\nhffgg\t98812\nzbngdx\t98813\n九九萌\t98814\nWilliam\t98815\n半袖\t98816\n姜哥\t98817\n莫过于\t98818\n植物孢粉学\t98819\n刘猫\t98820\n月琳\t98821\n我喜欢男\t98822\n外宣\t98823\n外宾\t98824\n外宿\t98825\n垌荿丄悶福雾\t98826\n圭斯拉\t98827\n10000000000000000000000000000000000000000000\t98828\n刮刮乐\t98829\n笨真\t98830\n偶西巴\t98831\nmyvxit\t98832\n高伟\t98833\n急眼\t98834\n普选\t98835\n说下载\t98836\n诗雅全\t98837\n烤猪你好猪你好猪猪猪猪猪\t98838\n康云泽\t98839\n沿东海岸\t98840\n高会\t98841\n#520\t98842\nhhhghu\t98843\n哼哼哼哼哼我不不不理你你你你你你你啦啦啦啦啦\t98844\n马本涛\t98845\n一八门\t98846\n酷毙克\t98847\n度秘翅\t98848\n尤物度秘\t98849\n吗也\t98850\nbbrbf\t98851\n冼澡\t98852\n取度秘\t98853\n3240556589\t98854\n马雪阳\t98855\nkkkljumofulkl\t98856\n调养\t98857\nSchool.金贤重.金奎钟\t98858\n调兵\t98859\n干堂\t98860\n别骗我我是小攻\t98861\n调兵遣将\t98862\n十几啊\t98863\nMorello\t98864\n麻子\t98865\n稀稀\t98866\n8www\t98867\n两遍
\t98868\n诗经\t98869\n换句\t98870\n娜比拉\t98871\n提强\t98872\nyouityou\t98873\n红卫村\t98874\ntggf\t98875\n一心一意\t98876\n收音\t98877\n提弄\t98878\nwoshininupengyou\t98879\n不定义\t98880\n编导分\t98881\n假心\t98882\n乐祸\t98883\n80-90年代\t98884\n11441\t98885\n荣耀5x\t98886\n曹圭\t98887\n1.22\t98888\n咸猪手\t98889\n均等\t98890\n制作\t98891\n王二高\t98892\nNFC\t98893\n格物\t98894\n大好人呵呵\t98895\n金丝猫\t98896\n腊月25号\t98897\n董宜春\t98898\n金丝猴\t98899\n公龚秋\t98900\nduomesterloid\t98901\nBWIN\t98902\n440克\t98903\ntable\t98904\n通气\t98905\n咆哮帝\t98906\n杨赛花\t98907\n唐丫头\t98908\n遗憾难免\t98909\n家长们\t98910\nRusso一袭连身皮裙\t98911\n折二一\t98912\n没我萌\t98913\n波西\t98914\n248名\t98915\n余自华\t98916\n650老六\t98917\n一局\t98918\nxisn\t98919\n沙利尓\t98920\n一二三四五六七八九十十一十二十三十四十五十六十一斤\t98921\n小受儿\t98922\n一屉\t98923\n一届\t98924\n一屋\t98925\n不成就\t98926\n一屎\t98927\n一屏\t98928\n裕通\t98929\n冠男\t98930\n简讯\t98931\n一展\t98932\n隶书\t98933\n假睫毛\t98934\n梨花带雨\t98935\n奥我\t98936\n秒变\t98937\n明早五点钟\t98938\n棉杰夫\t98939\n奥成\t98940\n刘震云\t98941\n罪恶感\t98942\n辛亥旺门卡宁南山门洗衣篮\t98943\n认真\t98944\n一山\t98945\n谢度秘\t98946\n坏的回忆\t98947\nshuohuaao\t98948\n一屹\t98949\n1月15日8时43分\t98950\n正确性\t98951\n新诗\t98952\n动画片帜\t98953\nawards\t98954\n耕地\t98955\n玛莎拉\t98956\n新话\t98957\n饿饿饿饿饿饿饿饿饿饿\t98958\n2379年\t98959\n69岁\t98960\n对度\t98961\n新识\t98962\n不争\t98963\n鼻血\t98964\n不亏\t98965\n664umm\t98966\n過來\t98967\n不二\t98968\n新词\t98969\n不了\t98970\n不人\t98971\n也想你好\t98972\n殺青後終於\t98973\n那年们\t98974\n首秀\t98975\n幸运之杖\t98976\n不亲\t98977\n\t98978\n埃及上诉法院\t98979\n秘宝石\t98980\n轩辕剑天之痕\t98981\n不亮\t98982\n失亿失忆\t98983\n韩语臃\t98984\n新语\t98985\n送餐\t98986\n首秘\t98987\n世界之外\t98988\n积分\t98989\n王佳美\t98990\n有年年\t98991\n终结者\t98992\n紧身裤\t98993\n头昏脑胀\t98994\n爱谁了谁的智商了我有话\t98995\nnbaqq\t98996\n光荣\t98997\n非礼人\t98998\n净洁\t98999\n河辣\t99000\nHemi\t99001\n任晓洁\t99002\n外甥女\t99003\n244252551144774444122558805236699\t99004\nllsa7\t99005\n财主\t99006\n河边\t99007\n薛芬妃\t99008\njhggich\t99009\nｖｔtv\t99010\n对啊我是直女\t99011\n该当\t99012\n我敢我敢你\t99013\n重抄\t99014\n杨宇升\t99015\nadvice\t99016\n来历\t99017\n蓝莹\t99018\n春酒\t99019\n张晨阳\t99020\n瓯风\t99021\n购并\t99022\nMILLSAP\t99023\
n各不相同\t99024\n嗤嗤\t99025\n十多号\t99026\n卢玉\t99027\n肯德基束\t99028\n重报\t99029\n九六十六亿第二针三六四六五六六六七六八六九七十\t99030\n十多台\t99031\n了老犯困\t99032\n1470\t99033\n1471\t99034\n说话的话\t99035\n来去\t99036\n1475\t99037\n1478\t99038\n蓝莓\t99039\n脑科\t99040\n77277\t99041\n冶炼商\t99042\n格利亚\t99043\n恩啵啵\t99044\n785.79万亩\t99045\n叛徒\t99046\n奥原来你是人妖\t99047\n唾沫\t99048\n泰国中央集团\t99049\n白衣服\t99050\n晚点儿\t99051\n录事\t99052\nhghy\t99053\n照射\t99054\nhghr\t99055\n福州市粮食局\t99056\n你帮\t99057\nhghv\t99058\nhghk\t99059\n雨滴\t99060\n高慧芳\t99061\nhghh\t99062\n同风顺\t99063\nhghc\t99064\n人民医院\t99065\n個dn個個\t99066\n武佳兴\t99067\nhghf\t99068\n天天炫舞\t99069\n穗丰\t99070\n0nnob\t99071\n帕博\t99072\n15日12时07分\t99073\n幺七零\t99074\n呸呸\t99075\n745745\t99076\ntyuoop\t99077\n欢呼声\t99078\n诗梦\t99079\n鼻音\t99080\n熊丽丽\t99081\n吵别说话\t99082\n琼宝宝\t99083\n谷KL\t99084\n有名听说过\t99085\n化州\t99086\n四四分之三\t99087\n里约奥运会\t99088\n747526618\t99089\n橄榄枝\t99090\n一下类\t99091\n周润发\t99092\n差不过\t99093\n更辛苦\t99094\n桶哥\t99095\n闽越\t99096\n年晚会\t99097\n我不知\t99098\n微爱发美娘娘\t99099\n甘先生\t99100\n一股脑\t99101\n明明也沒錢還看不起別人\t99102\n鬼臭猪\t99103\n39家\t99104\n情妇馆\t99105\n五香鱼\t99106\n凉鞋\t99107\n特侦组\t99108\n么度秘\t99109\n孟七八\t99110\n少出\t99111\n早花\t99112\n59分\t99113\n法帅天\t99114\n头头是道\t99115\n老引诱我犯罪\t99116\n20.33元\t99117\n下首歌\t99118\n逊色\t99119\n全国性\t99120\n三十一天\t99121\n蘑菇\t99122\n少花\t99123\n盘世瓶\t99124\n好干净\t99125\n秀丽\t99126\n东动\t99127\n我是一个好孩子你是哈哈好孩子\t99128\n四五媚媚\t99129\n决明子阿尔咯木\t99130\n魏兄的恩仇录和房兄的王小波传\t99131\njuhghhhhjjkl\t99132\n钟玉梅\t99133\n1270次\t99134\n北京邮电大学\t99135\n六百七十六百三十二\t99136\n呃向下\t99137\n好自信\t99138\n少见多怪\t99139\n朱传旭\t99140\n李正解\t99141\n成倍擴\t99142\n补齐\t99143\n情视物\t99144\n苍景\t99145\n189286\t99146\n99点46\t99147\n年齡\t99148\n头哥\t99149\n中国政府\t99150\n1.6亿余元\t99151\n好苦\t99152\n美女版\t99153\n洛口\t99154\n好若\t99155\n发便\t99156\n死不承认\t99157\n朱东菊\t99158\n349829046\t99159\n中信\t99160\n杨永杰\t99161\n知识\t99162\n姥爷\t99163\n少男少女\t99164\n翻领\t99165\n猜奸\t99166\n前舱\t99167\n此战\t99168\n半个月\t99169\n矢志不移\t99170\n那子\t99171\n感党恩\t99172\n107页\t99173\n卢克高斯\t99174\n岩欣媛\t99175\n背张学\t99176\njdjekqks\t99177\ngkhnj\t99178\n刘丽萍\t9
9179\n相悖\t99180\n百堡香\t99181\n海海敏敏\t99182\n再个说吾我旧我的天\t99183\nGhhgfdghggfhrhfbbrvegggggfffjfjkfgvffvvgvvhhhgfggbbhfhgnhbngccccvbnnnnnnbcbgfhfghghhgfhg\t99184\nzd\t99185\n雪基\t99186\n倒拔垂杨柳\t99187\nialendicordittonda\t99188\n元音\t99189\n此户\t99190\nzf\t99191\n知说\t99192\n光棍节\t99193\n渣津\t99194\n划定\t99195\n九龙闹海\t99196\n优缺点\t99197\nAV1a\t99198\n履历\t99199\n砰砰砰\t99200\nbelieve\t99201\n任命\t99202\n新鲜姐姐吗你爱我吗我喜欢你都糸你做我的男朋友好不好度秘我爱你咱俩爱爱好不好\t99203\n保卫家\t99204\ngfifgdtff\t99205\nu三四十万\t99206\n胡扯淡\t99207\n病患\t99208\n牵惫\t99209\n助业\t99210\n15864763258885552\t99211\n竭尽所能\t99212\n履去\t99213\n还白\t99214\n周边\t99215\n金银花\t99216\n鱼鱼鱼鱼鱼鱼鱼鱼\t99217\nTisci\t99218\n张师若\t99219\n99999999999999999999999\t99220\n明眸\t99221\n杨老大\t99222\nrryer5\t99223\n乐章\t99224\nfdswefgu\t99225\n3210272226\t99226\n曼丹\t99227\n聚美优品\t99228\n回头再聊\t99229\n丑帅\t99230\n继续\t99231\n混口\t99232\n两老的生活\t99233\n老娘就是爱王金\t99234\n等你死\t99235\nHY集团\t99236\nbgbbbb\t99237\n树干\t99238\n1222111223365145455145155544222225555555555555555555666665666665555\t99239\n前一秒\t99240\n起步价\t99241\nzr\t99242\nKim\t99243\nyrrrtty\t99244\n莴苣\t99245\nzxs\t99246\nmhunsbhuikj\t99247\noghcyugu\t99248\n37285\t99249\n六条\t99250\n咱们结婚吧好想和你\t99251\n寂寞撒的谎\t99252\n不用过\t99253\n艾伦克里曼\t99254\n有关闭\t99255\n7米\t99256\n重酬\t99257\n雷亚\t99258\n闲想\t99259\n倩MM白羊\t99260\n烹毒\t99261\ngjagdjk\t99262\na梦真\t99263\n洛高\t99264\n壮年\t99265\n蘑菇汤\t99266\n闲情\t99267\n生辰八字\t99268\nfifcb\t99269\n雷人\t99270\n我的天煞孤星\t99271\n脑压\t99272\n7月5日\t99273\n空白云\t99274\n叮咚叮咚\t99275\n精装修\t99276\n有王\t99277\n某种\t99278\n外盘\t99279\n饲养\t99280\n聊的再见\t99281\n环卫工\t99282\n王雨欣\t99283\nHiDuet\t99284\n3月8号\t99285\n敢问\t99286\npppppppppppppppppppppppppppppppppppppppppppppppppp\t99287\n第23739个\t99288\n管理干部\t99289\n合财\t99290\n合购\t99291\n下仓镇\t99292\nexoxoxoxo\t99293\n姓仇\t99294\n贤妃\t99295\n有肺\t99296\n漂亮的巴拉巴拉小魔仙很美丽的最最最最美丽的吧啦吧啦小魔仙\t99297\n廿九\t99298\n安闲\t99299\n背景音乐\t99300\n不走开\t99301\n贤妻\t99302\n另说\t99303\n祥希\t99304\n有肉\t99305\n何方神怪\t99306\n乖呀老大晚上天冷\t99307\n一百五十兆\t99308\n邱光宇\t99309\n警星\t99310\nDCCI\t99311\nmeic\t99312\n找把你\t99313\n汐荥\t9931
4\n雾化治疗\t99315\n似水年华\t99316\n无都\t99317\nsplanting\t99318\n镶嵌\t99319\n不v富贵花你那边v炒粉干和你爸vv\t99320\n丑比头\t99321\n盗墓笔记\t99322\n哈来\t99323\n美美哒萌萌哒我是好孩子\t99324\n扬声\t99325\n叫不叫\t99326\nxndjs\t99327\n李小先\t99328\n古典文学\t99329\n作法\t99330\n商务舱\t99331\n吴列刚\t99332\ngggg11\t99333\n19:30-02-06\t99334\naF弋仿口曲鳡\t99335\n逆行\t99336\n张若\t99337\n纯朴\t99338\n舞鞋\t99339\n胸型\t99340\n飞跑\t99341\n剧间\t99342\n李雨芯\t99343\n博泰狗\t99344\n露背\t99345\n人渣\t99346\n李养民\t99347\n发扬光大\t99348\nN个\t99349\n黄四渤\t99350\n雪诺\t99351\n飞跃\t99352\n鱼皮\t99353\n稽山新天地\t99354\n坐拥\t99355\nMFI\t99356\n澶\t99357\n包装纸\t99358\n六年级\t99359\n秘呢\t99360\nhttppinyincn1oS0DeIIqhX\t99361\n老婆儿\t99362\n富尔micyou\t99363\n早安号\t99364\n张苗\t99365\n秘不得\t99366\ncbydvj\t99367\n孙嘉擎\t99368\n一伢\t99369\n一传\t99370\n一伤\t99371\n闲类\t99372\n抛砖引玉\t99373\n风轻云淡\t99374\n木不好看\t99375\nJjnnmnnnmmmmmmmmmmmmmmm\t99376\n王永琪\t99377\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnjnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnmnnnnnjnnnnnnnnnnnnnnnnnnh\t99378\n地痞\t99379\n周日下午\t99380\n如日中天\t99381\n宪政\t99382\n怕冷\t99383\n需用\t99384\n三百多个\t99385\n热脸\t99386\n大嘴瓦\t99387\n杨维汉\t99388\n瑕不掩瑜\t99389\n李嘉嘉\t99390\n麒麟臂\t99391\n拆迁一听说\t99392\n一休\t99393\n万花\t99394\n一众\t99395\n张小邪\t99396\n一会\t99397\np3d7\t99398\n马进\t99399\n一伙\t99400\n黄H\t99401\n哼慢走\t99402\n一个40g\t99403\n天高速\t99404\n解皓洋\t99405\n180根\t99406\nydhvevgsijhjixbHSGHHSHVWVHZUEH\t99407\n天气预报器\t99408\nNO工dh\t99409\n浓浓浓\t99410\n圣物\t99411\n阿巴吉\t99412\n糖葫芦疼\t99413\n坏蛋坏蛋坏蛋丑八怪丑八怪\t99414\n上地雄\t99415\n东风bx7\t99416\n坐手一松开\t99417\n芳芳\t99418\n9点至21点\t99419\n05秒\t99420\n原材料\t99421\n穿梭\t99422\n哼哈牙米\t99423\n十点八\t99424\n十点六\t99425\nvvyfv\t99426\nVujaci\t99427\n黄v\t99428\n王一任\t99429\n黑点\t99430\n正面\t99431\n小视\t99432\n135栋\t99433\n在海下\t99434\n468567\t99435\n工场\t99436\n按摩师\t99437\n工地\t99438\n央视综合频道\t99439\n坏会\t99440\n12.2%\t99441\n凰春\t99442\n修理\t99443\n等边三角形de\t99444\n传传\t99445\n永辈\t99446\n柴若海\t99447\n这首话\t99448\nxhdhxhdhhdddhddhdhhnxhxhxjxjuxjxjijjxjdhdhdhdhhdhhdhdhhdudhhduduedudhfudh\t99449\n李可馨\t99450\n麟庐\t99451\nohxhhd\t99452\n鬼路度\t99453\n大哥妮\t99454\n1336351608
0\t99455\n重恋童癖\t99456\n唉何来\t99457\n正青\t99458\nrar\t99459\n梁景权\t99460\n大捷\t99461\n穆里\t99462\n16颗\t99463\nvvbbnbbjmnbnbl\t99464\n前俯后仰\t99465\n并不难\t99466\nGOOGLE\t99467\n三亚村委会\t99468\n莫言\t99469\n赤坎古镇\t99470\n12月11月\t99471\n观火\t99472\n禽兽之谜你\t99473\n想听天之大\t99474\n19年前\t99475\n多多谜境冒险系列查理九世\t99476\n老女孩女孩\t99477\n再藤田小学\t99478\n班主题\t99479\n点圈\t99480\n陈小米\t99481\n132935542\t99482\n换换换换换\t99483\n谢谢你我的好朋友\t99484\nCPUI34170\t99485\n钱钱钱钱一钱钱钱\t99486\n大了大\t99487\n初恋情人\t99488\n苗蒙蒙\t99489\n冯庙\t99490\n耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿耿\t99491\n摩根弗里曼\t99492\n弄恩恩\t99493\n食神\t99494\n林君临\t99495\n地古古\t99496\n看载歌载舞\t99497\n血栓\t99498\n未闻猴\t99499\n佳瑶\t99500\n613162613\t99501\n拈来\t99502\n盾牌\t99503\n万恶滛\t99504\n965。n\t99505\nampm\t99506\n百多行\t99507\n甜蜜回忆\t99508\n李爽\t99509\n想大胜\t99510\n房地吴\t99511\n不不不我\t99512\nNO5\t99513\n文豪野犬\t99514\n爱河\t99515\n冯思佳\t99516\n醒啦\t99517\n点名\t99518\nldj\t99519\n2025年\t99520\n花心大\t99521\n八路军\t99522\n仲咪真\t99523\n找到你\t99524\nmpmdapmpapdjpmpma\t99525\n月刊\t99526\n很感谢你给我的快乐\t99527\n点听\t99528\n前世界\t99529\n袁晨\t99530\n七八点2四六点8110\t99531\n瞬移\t99532\n心里的话\t99533\n王两岁半就上中班了是的说办\t99534\n探伤\t99535\n妖姐\t99536\n抱不着\t99537\n伽\t99538\n京华时报\t99539\n洋酒店\t99540\n重庆人\t99541\n天妙曼\t99542\n王妞\t99543\n两名\t99544\n王妃\t99545\n荷包\t99546\n晚9点\t99547\n肖肖乐\t99548\n很温\t99549\n泡馍\t99550\n风像雨\t99551\nfluj\t99552\n戒腊七十四夏\t99553\naaaeeeeeeeeeeereeeeeeeeeeeeee\t99554\n两吨\t99555\n竹蜻蜓\t99556\n玛丽娜拉\t99557\n裴正浩\t99558\n一事无\t99559\n中长\t99560\n秘度秘加油加油真棒真棒\t99561\n死而无憾\t99562\n上海城小区\t99563\n权志龙帅\t99564\n正而八经\t99565\n梦之队\t99566\n假免\t99567\n李惠凯\t99568\n软剑\t99569\n30p\t99570\n王城粹光\t99571\nZLH\t99572\n去班儿\t99573\n說個笑來聽\t99574\n慎入\t99575\n铺设\t99576\n不不不\t99577\n不是我要\t99578\n30b\t99579\n30g\t99580\n宣泄\t99581\n慌是我的\t99582\n天魔\t99583\n无上起来\t99584\nNO3\t99585\n俊凯\t99586\n赣江\t99587\n喂药\t99588\n武藤兰\t99589\n30天内\t99590\n308\t99591\naadc\t99592\n见信如\t99593\n蜇\t99594\n出站\t99595\n300\t99596\n301\t99597\n302\t99598\n303\t99599\n老老实实\t99600\n305\t99601\n306\t99602\n307\t99603\n金牛梦璃\t99604\nnononoritono\t99605\n说不好\t99606\n错队了度秘度秘我问你\t99607
\n说好久没\t99608\n30%\t99609\n至天明\t99610\n一辈辈\t99611\n猥琐波\t99612\n倾城之\t99613\nqwry\t99614\n高桥\t99615\ngoodb\t99616\n尗又\t99617\n道人\t99618\n高档\t99619\n杨金宝\t99620\n定数\t99621\n真真的时尚\t99622\n一百八十度\t99623\n打报告\t99624\nnsn\t99625\n糖心\t99626\n咩明明\t99627\n岩白菜\t99628\nnoboboboboboboobobob\t99629\n嘻嘻谢\t99630\n本份\t99631\n死心眼子\t99632\n呕臭\t99633\n娜美\t99634\n放到\t99635\n你是女人真的假的\t99636\n昨儿画\t99637\n两三尺\t99638\navshipin\t99639\n法眼\t99640\n瞬间\t99641\nRxyihobvucufygihojjoguviho\t99642\nduggho\t99643\n葛春红他爱不爱我说\t99644\n你的回忆\t99645\n28年\t99646\n五毛孙子们\t99647\n海之蓝\t99648\n道树\t99649\n唔知\t99650\n丁嘉丽\t99651\n艾琳娜奇亚\t99652\n本他\t99653\n荟萃\t99654\n郑淦辉\t99655\n还原\t99656\n吊销\t99657\nboya\t99658\n黄色苹\t99659\n利了玩味\t99660\n朱偌琳\t99661\n全班\t99662\n李虹霏\t99663\n赫德亚\t99664\ngshbs\t99665\n奇草\t99666\n弹彩\t99667\n机器博弈专业委员会\t99668\n大清早\t99669\n还县\t99670\n吴医生\t99671\ndvnjk\t99672\n扣篮\t99673\n食食\t99674\n说不介意\t99675\n伫\t99676\n在即\t99677\n我是女的干嘛\t99678\n讲讲故事\t99679\ncosmo\t99680\n15277925416\t99681\n鸸鹋\t99682\n属于我的专称\t99683\nwieieie\t99684\n一生一梦\t99685\n万部\t99686\n八公斤\t99687\n录用\t99688\n铁血错\t99689\n好家伙号楼\t99690\n堡马轩\t99691\n125公顷\t99692\n幻人尼\t99693\nuxutiggidfytttui\t99694\n吃腻\t99695\n除非\t99696\n嗯九年\t99697\n香红烧\t99698\n可喜\t99699\n可喝\t99700\n蔡雨萌\t99701\n三十二百九十三\t99702\n老夫妻\t99703\n新格精武时刻q新合金内因核心的质量比\t99704\n亮楷\t99705\n玛丽昂歌迪亚\t99706\n高不可爱\t99707\n第一棒\t99708\n挤爆\t99709\n嗯来来来来来来来来来来来来别客\t99710\n俱\t99711\nNO.1\t99712\n群演\t99713\n说谎试试\t99714\n没有理由\t99715\n14yq\t99716\n气功\t99717\nQQ苍老明晚\t99718\n普拉达\t99719\n分盟\t99720\n邓在\t99721\n秘再见\t99722\n杜峰\t99723\n坏蛋我恨你\t99724\n云烟云烟\t99725\n哥精\t99726\n平远\t99727\n沮丧\t99728\n装听\t99729\nwjns\t99730\n侬好搞笑\t99731\n锁价\t99732\n数千块\t99733\n战友们\t99734\n巨地\t99735\n六安市\t99736\n玩穿\t99737\n洋河品\t99738\n恐怖剧\t99739\n歌蕾哥蕾哥蕾哥莉\t99740\n训服\t99741\n巨土\t99742\ngbeibe\t99743\n小发明\t99744\n此话\t99745\n打用\t99746\n赵希洛\t99747\n张无号\t99748\n冬面\t99749\n杨航航阳\t99750\n八點\t99751\n最好的未来\t99752\n打电\t99753\n爱拼才会赢\t99754\nkmnnnmmm\t99755\n度秘你好样儿的\t99756\n了解剖\t99757\nAir\t99758\n致歉\t99759\n秘不啊度\t99760\n顾恺之\t99761\n好想贷\t99762\n张
博杰\t99763\n腹式呼吸\t99764\n还有关系\t99765\n5544554548456454645494959584944\t99766\n这周\t99767\n三姑娘\t99768\n晓陀螺\t99769\n两点十分\t99770\n对忍者\t99771\n流氓大英雄你是一个小呀小流氓\t99772\n全身心\t99773\n毒蜜大坏蛋度秘大坏蛋\t99774\nｂｃ\t99775\n佛罗里达州\t99776\n东方饭\t99777\n致死\t99778\n太平沙\t99779\n可控性\t99780\n十ｃｍ\t99781\n亚旭一\t99782\n愈生\t99783\n精神上学\t99784\n3]\t99785\n恩是\t99786\n132549\t99787\n五分之二分\t99788\n3Q\t99789\n3P\t99790\n暗黑系\t99791\n夯满意\t99792\n三二岁\t99793\n假行僧\t99794\n3H\t99795\n3O\t99796\n火极一时\t99797\n3M\t99798\n100000000000\t99799\n3B\t99800\n春药\t99801\n3G\t99802\n3F\t99803\n女骸\t99804\n3D\t99805\n3z\t99806\n3y\t99807\n名品\t99808\n恩明\t99809\n一哈个\t99810\n3s\t99811\n3q\t99812\n3p\t99813\n3w\t99814\n3v\t99815\n大姑夫\t99816\n3t\t99817\n3k\t99818\n上午11点\t99819\n更名定\t99820\n3n\t99821\n3m\t99822\n雄性\t99823\n3c\t99824\n3a\t99825\n3g\t99826\n春草\t99827\n关一合江\t99828\n3d\t99829\n时政\t99830\n煞笔爱戴羊\t99831\n佳有\t99832\n隔空探物\t99833\n姜思彤\t99834\n睚眦必报\t99835\n数一数数一数\t99836\n朴泉美\t99837\n3.7日\t99838\ngdbai\t99839\n说了再说\t99840\n那个女生\t99841\n李昭颖\t99842\n38\t99843\nChinese\t99844\n33\t99845\n32\t99846\n看呆\t99847\n30\t99848\n画图\t99849\n36\t99850\n13525994463\t99851\n34\t99852\n看呜\t99853\n郭郤\t99854\nglimho\t99855\n掐哭\t99856\n肖子恒\t99857\nanqAijill\t99858\n高埗英华学校\t99859\nQwf\t99860\n3#\t99861\n还能够\t99862\n再潮\t99863\nniite\t99864\n32集\t99865\n3%\t99866\n李姐\t99867\n蜂蜜丽丽\t99868\n2011年8月24日\t99869\n鸡毛蒜\t99870\n小二小二\t99871\n煮沸\t99872\n前辈\t99873\n烤乳猪\t99874\n小干嘛\t99875\n92斤\t99876\n装鱼缸学一刚过穗香心梦\t99877\n46879808423648985388\t99878\n骨灰坛\t99879\n王梦涵\t99880\n闺女人家\t99881\n准确率\t99882\n创下\t99883\n管理\t99884\n德隆·威廉姆斯\t99885\n九点半十二点\t99886\n等量\t99887\n一个一百\t99888\n45278635883\t99889\nietirrut\t99890\n聊谜\t99891\n冯晨梦\t99892\n航母\t99893\n想走\t99894\n三殳十\t99895\nwweee\t99896\nh网\t99897\n想起\t99898\n欧尼帮\t99899\n前边\t99900\n奥潮\t99901\n行心饭\t99902\n奥迪q72750万嗯\t99903\n赵个群\t99904\n昂昂昂昂昂昂\t99905\nolyma\t99906\n94.4％\t99907\nehentai\t99908\n下雪\t99909\n下雨\t99910\n零二七\t99911\n保质期\t99912\n你的你和你的宠物\t99913\nacf\t99914\n歼-11B战斗机\t99915\nf\t99916\n想你们\t99917\n虚荣\t99918\n悟空传\t
99919\n血腥味\t99920\n白情36计\t99921\n留学者\t99922\n下集\t99923\n3968\t99924\n死八婆\t99925\n人家的书关你鸡儿事\t99926\n參與點滴\t99927\n妇类\t99928\n邱员外\t99929\n肯快\t99930\n近视眼\t99931\nkakg\t99932\n核反应堆\t99933\n321块\t99934\n中午节\t99935\n四月六号\t99936\n扬州东站\t99937\n若白\t99938\n金俊勉\t99939\n口德\t99940\n分转\t99941\n罗志光\t99942\nyfu\t99943\n同福\t99944\n口径\t99945\n兽子\t99946\nhdbdbfbdjxndjcnnd\t99947\n就范\t99948\nueuei\t99949\n左宇轩\t99950\n8成\t99951\n谢子龙\t99952\n其\t99953\n一点4649点822点\t99954\n深情苦\t99955\n47天\t99956\n一千一千一千一千一千一千一千\t99957\n录影\t99958\n失子\t99959\n春春公司\t99960\n美色\t99961\n美艳\t99962\n三十多\t99963\n天空之城TCD婴垣帝凤\t99964\n美艺\t99965\n2月4号\t99966\n夫妻相\t99967\n舍不开\t99968\n几所\t99969\n明藥\t99970\ngtdvn\t99971\nressed\t99972\n日版\t99973\n文殊院首座仁永老和尚\t99974\n中霍村\t99975\n山东鲁能\t99976\n伊胜群\t99977\n收录曲\t99978\n兰\t99979\n六十刹那\t99980\n止痛片\t99981\nfrrihir\t99982\n修容粉\t99983\n三十天\t99984\n胡雪阳\t99985\nN0ND\t99986\n罪人们\t99987\n就是我也不知道\t99988\n交交\t99989\n现阶段\t99990\n蔡宇航\t99991\n交人\t99992\n解解\t99993\n高密一\t99994\nfydyffgc\t99995\n发拉美惠\t99996\nFUFY\t99997\n好我好想\t99998\nFUFU\t99999\n五五九九九\t100000\n囚儿\t100001\n那你在那你\t100002\n排场\t100003\nhyfftd\t100004\n连体服\t100005\n逼神经\t100006\n运动包\t100007\n交互\t100008\n直气\t100009\n将要死\t100010\n佛门\t100011\n一百二十三\t100012\n陈焕高\t100013\n谢一下\t100014\n隔离霜\t100015\n一百二十一\t100016\n九成五\t100017\n干什么呢司马和你\t100018\nT恤\t100019\n随便吃吃\t100020\n下十秒\t100021\n8点52分\t100022\n浓雾蒙蒙\t100023\n坐班\t100024\n爱你哦你好片\t100025\n20%仓\t100026\n瘦脸冠军西芹\t100027\n没有着\t100028\n有我讨厌\t100029\n555855564515555555nn2556\t100030\n3100个\t100031\n第几台\t100032\n吐如图组图ywu吐吐吐吐我绿统计局了解糊涂我了绿吐KTV吐我吐吐吐吐吐吐拒绝我\t100033\n给我去\t100034\n张天赐\t100035\nshuocf\t100036\n琼子\t100037\n毛西餐\t100038\n炒冷饭\t100039\n男旦\t100040\nuntour\t100041\n下个周\t100042\n战果\t100043\n四连下徐啊读哦离阿婆\t100044\n刁毛俩\t100045\n温锦荣\t100046\n贲文辉\t100047\n雄狮\t100048\n土匪\t100049\n上9点\t100050\n压皱\t100051\n呀累\t100052\n李娇娃\t100053\n七中胡同\t100054\n里斯塔\t100055\n巴荣\t100056\nchemrstrry\t100057\n冬茹\t100058\n13586748026\t100059\n中东地区\t100060\n一万块\t100061\n盛搏\t100062\n几批\t100063\n撤退\t100064\n凶人\t100065\n雨淋湿\t100066\
n马尉轩\t100067\n信正\t100068\n共生\t100069\n信步\t100070\n科拉\t100071\n兴亡\t100072\n兴交\t100073\n带水\t100074\n兴亚\t100075\n71171717717171711717\t100076\n金师傅\t100077\nqiguyw\t100078\n蛇曲\t100079\n阿婆\t100080\nhgcbb\t100081\n凶了\t100082\n咯莫哦啦啦国语\t100083\nMiddleton\t100084\n一不都\t100085\nintentIntentSK1171477665FBB37DD55C100D982AAD045CCADD30C02FDB9EC1D72904F9C0F365DCC6FE144Bend\t100086\n我喜欢若\t100087\n泥蛇\t100088\n张呗\t100089\n心心如\t100090\n大仁哥\t100091\n黄健\t100092\n大腕\t100093\narpg\t100094\n美才\t100095\n沿河县\t100096\n大腚\t100097\n蒋燕\t100098\n1000把\t100099\n一人群\t100100\n38250\t100101\nabcdfghjklnubq打拉稀优于wsjdu\t100102\n5544\t100103\n8784894\t100104\n大腰\t100105\n大腿\t100106\ntetigg\t100107\n故名\t100108\n难逢\t100109\n大腹\t100110\nLincoln\t100111\n红装\t100112\n拉了拜拜\t100113\n修福\t100114\n崔婉亭\t100115\n平儿\t100116\n全剧终\t100117\ncccbjh\t100118\n一小心点儿\t100119\n敌敌畏本\t100120\n卓主任\t100121\n李梳理\t100122\n红裙\t100123\n能场\t100124\n周建平\t100125\n红裤\t100126\n头几天\t100127\n良家\t100128\n德恩特里维斯\t100129\n接不接\t100130\n南京一公司\t100131\n红裳\t100132\n泰森\t100133\n启事\t100134\n李青芮\t100135\n熊家日\t100136\n全店\t100137\n酉\t100138\n被拐子\t100139\n接漏\t100140\n15151\t100141\njdis\t100142\njdir\t100143\n席马林\t100144\n张鑫琪\t100145\n承德\t100146\np妥\t100147\n拴住\t100148\n呃尊天津机建有限公司\t100149\n性能\t100150\n浙江卫视跨年晚会\t100151\ngprestiva\t100152\n低碳\t100153\n告终\t100154\njdid\t100155\nfburd\t100156\n超正宗\t100157\n12345678906589012345678900\t100158\njdij\t100159\n护林乡\t100160\n羊佯\t100161\n黑沉沉\t100162\nOil\t100163\nmim\t100164\nmin\t100165\nvhgggg\t100166\nmic\t100167\nfrgvdsvghfhttwwywqwuipoqwiuwqqwypoteopkiccnpppoyrvhdcjhdsgbgrbhhrevcvddvfe\t100168\n#30\t100169\nmiy\t100170\nmix\t100171\n天巴\t100172\nabcdeftator\t100173\n向东流\t100174\n53061180\t100175\n桂枝\t100176\n装盘\t100177\n萌子\t100178\n贝蒂斯少赛\t100179\n不希罕\t100180\n自宫\t100181\n比拟\t100182\n华宇广场\t100183\n明清\t100184\n一两岁\t100185\n稳定版\t100186\nghijkoi\t100187\n自容\t100188\n自家\t100189\n亲我嘴\t100190\n妮子\t100191\n大头车\t100192\n挥洒跑\t100193\n花不花钱拿我的小度秘\t100194\n比拼\t100195\n袁慧琳\t100196\n新都\t100197\n寂寞不出来\t100198\n一不起\t100199\n黄梦静
\t100200\n自定\t100201\n折现率\t100202\nmi2\t100203\n棱盖儿\t100204\n佳佳宾馆\t100205\n两千块\t100206\n奶奶们\t100207\n五笔你好\t100208\n打发\t100209\n20几年\t100210\n第败\t100211\n嗯只\t100212\n冯军哥\t100213\n在位\t100214\njiunk\t100215\n刘振琦\t100216\nGV胡\t100217\n更丰富发\t100218\n1987年6月9日\t100219\n莱美\t100220\n郭女王\t100221\n打史\t100222\n坟台\t100223\n想我不会\t100224\n小度秘真帅\t100225\n打句\t100226\n宽带宽带\t100227\nYork\t100228\n英吁结婚\t100229\n梅内姆\t100230\n流曳\t100231\n人之初性本善你好美\t100232\n十五分之二\t100233\n笑一下行\t100234\n取敌\t100235\n9千多万\t100236\n瓜呱瓜呱\t100237\n二不死\t100238\nPERPETUAL\t100239\n300223\t100240\n1万亿个\t100241\nhello呢stomity\t100242\n飘着\t100243\n在新你在新你\t100244\n我道谦\t100245\n别以为\t100246\n镇上\t100247\ny923\t100248\n不懂我的话真的不想和你说了\t100249\n济南市公安局\t100250\nExtraordinary\t100251\n当成\t100252\n文莱国\t100253\n天天风之旅\t100254\n凶极恶\t100255\n中国新闻网\t100256\n住房\t100257\n天敏\t100258\n红莓\t100259\n取指\t100260\n时间型\t100261\n脑肉\t100262\n住户\t100263\n女演员\t100264\n巫巫\t100265\n成双成对\t100266\n围攻\t100267\n天雄\t100268\n长城\t100269\n6d61级\t100270\n周可反\t100271\nhjbajjshd\t100272\n3jx\t100273\n一万三万四万五万六万七万八万九万十万十一万十二万十万三十三\t100274\n3d酷跑\t100275\n尼酱\t100276\n异世娃娃\t100277\n些一面\t100278\n天雷\t100279\n而而\t100280\n脑部\t100281\napou\t100282\n小兔咪\t100283\n大白牙\t100284\n大白牛\t100285\n天数\t100286\n咋天说\t100287\n说说说说说说\t100288\n开讲\t100289\n陈若琳\t100290\n切切切切\t100291\n行善者\t100292\nfsgsdwrgdhf\t100293\n55555552\t100294\n对不来\t100295\n2863882346\t100296\n碎牙\t100297\n碎片\t100298\n健长乐\t100299\n影帝\t100300\n210497486391\t100301\n十分钟后\t100302\nIslands\t100303\n著称\t100304\n配不起\t100305\n魔蝎\t100306\n牺\t100307\n一名15岁\t100308\n石亚仙\t100309\n不敢想\t100310\ninjhh\t100311\n周记\t100312\n医学技术\t100313\n13909746222\t100314\n去儿\t100315\n82544353582\t100316\nffsgdrjcshfdghi\t100317\n秋毫\t100318\n连点\t100319\n吗付\t100320\n安徽省\t100321\n梁耀红\t100322\n旭日东升\t100323\nzhfffffffff\t100324\n杨贝贝\t100325\ngong\t100326\nhi喽我等你我的毒米\t100327\n邵东\t100328\nGdaavvt\t100329\n尽心尽职\t100330\n刘戏\t100331\nctggf\t100332\n吧街\t100333\n吗价\t100334\n魔帝狂妃\t100335\n吗们\t100336\nchiphotosbaiducomxiaodupicitema5c27d1ed21b0ef4414ade2ad
ac451da81cb3e27jpg\t100337\nah\t100338\n二十年后\t100339\nSina\t100340\n好啊我真\t100341\nyoungunn\t100342\n黑头\t100343\n洒落\t100344\nfinisher\t100345\nMagen\t100346\n赵书文\t100347\n芝纶\t100348\nam\t100349\n12345678954321\t100350\n第三二\t100351\n多山\t100352\n二月三号\t100353\nfinished\t100354\nlets\t100355\n鸟窝\t100356\n经纬度\t100357\n亲爱心的心\t100358\n素颜朝天\t100359\n这么说不就\t100360\n我求你了你告诉我\t100361\n李安圆\t100362\n黑天\t100363\n黑太\t100364\n抽点\t100365\n多层\t100366\n七十间\t100367\n黑夜\t100368\n杨明丽\t100369\np3tx\t100370\n算数\t100371\n多屏\t100372\nGHGFGY\t100373\n霉体\t100374\n多展\t100375\njhojp\t100376\n7.5码\t100377\n哪样子\t100378\n中一家夲\t100379\n一个二十四\t100380\n横渡\t100381\nksks\t100382\n戴尔\t100383\n告诉我告诉我谷歌\t100384\n8364\t100385\n8366\t100386\n吃药状\t100387\nq吗子\t100388\n颚骨\t100389\n屋里\t100390\n住楼\t100391\n你也像我一样\t100392\n这你妈了个逼老子\t100393\n古剑奇潭\t100394\n郭飞扬\t100395\n湾子\t100396\nSOC\t100397\nSOD\t100398\n惰性\t100399\n不要见\t100400\n孙运祥\t100401\n橡塑\t100402\ncolorwork\t100403\n你好你好你好你好你好你秘秘秘秘\t100404\nSOS\t100405\n声动亚洲\t100406\n真相我\t100407\n被份\t100408\n黄御峰园\t100409\n谢再\t100410\n文大公子\t100411\nMoe2\t100412\n仨小时\t100413\n全国人口普查\t100414\n5432789\t100415\n嫁给我你\t100416\n-------神马爱情\t100417\n乐高\t100418\n太人太\t100419\n崔英道\t100420\n我不懂的话\t100421\n破坏\t100422\n谘津门禅\t100423\n分文不值\t100424\nzccna片\t100425\ndxvDv\t100426\n一万一万分分分分分分\t100427\n网址\t100428\n百年厚葬\t100429\n品头论足\t100430\n蜡笔小新\t100431\n创收\t100432\n八遍\t100433\n五一二\t100434\n钻钻\t100435\n碧瓷\t100436\n冯峥\t100437\n13791076891\t100438\n黄俊伟\t100439\n丹桂飘香哥\t100440\n词语\t100441\n念雪\t100442\n八道\t100443\n卖酒\t100444\nGill\t100445\n五体投地\t100446\nfshuajchxtvrdurkghitdkfeggykfdjofhGGZJIDZHGZgfygjtsb\t100447\n盯紧\t100448\n一根筋公司连招儿\t100449\n肚肚脐\t100450\n一级棒\t100451\n洞庭湖\t100452\n二月初八\t100453\n内通\t100454\n2月5号\t100455\n吴雨瑞\t100456\n二十年之内\t100457\n定做\t100458\n6：30\t100459\n补缴\t100460\nbybby\t100461\n什麼憂點\t100462\n文三\t100463\n几打哈\t100464\n山东滕州金道\t100465\n八六五零\t100466\n616元\t100467\n我是你老婆那你是我\t100468\n連篇\t100469\n纽约时报中文网\t100470\nwcases\t100471\n熨烫精灵\t100472\n平iu\t100473\n好啦886我得下\t100474\n文东\t100475
\n碧瓦\t100476\nsuustet\t100477\n你好度秘你好度秘你好度秘你好度秘你好度秘你好度秘你好多不你好多咪你好中意你好多不辣堡\t100478\n非法入境\t100479\n围观者\t100480\n霉变\t100481\n十一百九十九八九十多\t100482\n文中\t100483\n分票\t100484\n雨下\t100485\n9918\t100486\n鸟巢阅戍守县烟花销售员\t100487\nshurufahuaile\t100488\n请多指\t100489\n22255585855855\t100490\n好你骂我你骂我我不用你\t100491\ndjkf\t100492\n神魔我要迅雷有你\t100493\n一院香\t100494\n死都\t100495\n一分之二\t100496\nEygufifht\t100497\n四百四两八千斤\t100498\n射精\t100499\n金泊南\t100500\n啵啵啵啵钊钊钊钊钊钊钊\t100501\n李嘉腾\t100502\n清纯\t100503\n彩礼\t100504\n孵出\t100505\nyu乐\t100506\nvcbch\t100507\n被窝儿\t100508\n72635261672737384855959483762\t100509\n小人人\t100510\nIooi\t100511\n火炬园\t100512\n猪排\t100513\n咋老\t100514\n水管工\t100515\n1半\t100516\n来源\t100517\n试看\t100518\n1千\t100519\n0c分\t100520\n第二代\t100521\n无早\t100522\n63454\t100523\n童鞋们\t100524\n1卜\t100525\n雌雄\t100526\n百折不挠\t100527\n花度秘\t100528\n零四零六\t100529\nwoshuonine\t100530\nabout\t100531\n刘航\t100532\n估计\t100533\n贾政为\t100534\n鸡张\t100535\n乐虎\t100536\n伟佳\t100537\n无时\t100538\nGORE-TEX\t100539\n李毅轩\t100540\n讲比\t100541\n坏我好\t100542\n市盈\t100543\n210块\t100544\n独唱\t100545\n啦啦小度泌尿看我的吧小猫咪\t100546\n雨丝\t100547\n好地下本\t100548\n电死\t100549\n电熨\t100550\n洞殿\t100551\n旁观\t100552\n看走眼\t100553\n68页\t100554\n调查\t100555\ncsdfdttftdgf\t100556\n萝卜汤\t100557\n打杀杀\t100558\n头儿凯\t100559\n李楚君\t100560\n天朝政府\t100561\n甘肃\t100562\n炫彩\t100563\n河南平顶山中院\t100564\n行不对不起\t100565\n手写法\t100566\nchifi\t100567\n付厌\t100568\n茂武\t100569\n心灵独白\t100570\n平衡木\t100571\n大饱眼福\t100572\n歌屋恩\t100573\n裸体\t100574\n张亚会\t100575\nrcg\t100576\n北京站对面恒基中心\t100577\nrca\t100578\n罗唣\t100579\n王秀丽\t100580\n咳嗽立特灵治\t100581\n撒子\t100582\n媚上\t100583\n韩焕青\t100584\n绿草地\t100585\n视屏\t100586\n邱日语\t100587\n没胸\t100588\n增高\t100589\n说明影\t100590\n距问\t100591\n死地地道道\t100592\n六千米\t100593\n罗唆\t100594\n钱时明月\t100595\n没胆\t100596\n靠靠靠靠靠靠靠\t100597\n度秘我好想和你说一下我的秘秘\t100598\n1887515157\t100599\n了度秘\t100600\nab2辆\t100601\n3月中\t100602\n一月一百零六\t100603\nThaifood\t100604\n低空\t100605\n先导\t100606\nThaifool\t100607\n炉鱼堡\t100608\nl嗯嗯\t100609\n张雪山\t100610\n頭買\t100611\n2011年11月17日\t100612\n凉飕飕\t100613\nFEEGNQ
N\t100614\n8250688\t100615\n养牛\t100616\n估酒\t100617\n亚韩美容医院\t100618\n70%\t100619\n70页\t100620\n动机篇\t100621\n真声\t100622\n194124714\t100623\n横空\t100624\n郯城\t100625\n领居\t100626\n斯德哥\t100627\n微直播#2012博洛尼亚童书展#\t100628\n恨死\t100629\n独坐\t100630\n色盲\t100631\n卡罗拉\t100632\n长期以来\t100633\n购买力\t100634\n70#\t100635\n鲁讯\t100636\n刺骨\t100637\nbry\t100638\n一觉醒来\t100639\nfigigu\t100640\n金卷\t100641\nXanadu\t100642\nに\t100643\n每家\t100644\n句声\t100645\n奎峰\t100646\n我喜欢你我想\t100647\n四百遍\t100648\nqwrdtu\t100649\n不vv\t100650\n家粉们\t100651\n金卡\t100652\n牛肉场\t100653\n僵尸小说\t100654\n好吧好吧明儿再聊吧姐妹块\t100655\n1000000000000000万个\t100656\nmnopqit18b20位\t100657\n赵婉如\t100658\n金华\t100659\n灰泰迪\t100660\n李镭\t100661\n再笑\t100662\n斯大林格勒拖拉机厂\t100663\n鸟语花香\t100664\n酒精\t100665\n36处\t100666\n一来天了我\t100667\n英雄子\t100668\n传单\t100669\n跟手\t100670\n房款\t100671\n硬朗\t100672\njajc\t100673\n张最水\t100674\n堡王\t100675\njajb\t100676\n捏合\t100677\n有感不到\t100678\n固封\t100679\n消化系统\t100680\n后日\t100681\n再而三\t100682\n宝音源的orion啊TTTTATTTT\t100683\n传印\t100684\n危机\t100685\n俾噶\t100686\n李佳薇\t100687\n生日快乐乐乐乐乐乐乐乐乐乐乐乐乐XDD\t100688\n草得\t100689\n供求\t100690\n超级玛丽帅\t100691\n硬木\t100692\n死气\t100693\n德恩吉\t100694\n赵富勤\t100695\n车裂\t100696\n意在\t100697\nghijka\t100698\n新秩序\t100699\n躁\t100700\n友者霸\t100701\n香锅\t100702\n大家一起\t100703\n并驾齐驱\t100704\n错拐\t100705\n下周\t100706\n第一日\t100707\nUxyjf\t100708\n牛生\t100709\n吃值\t100710\n藏身\t100711\n三千余\t100712\n许婧\t100713\n空中\t100714\n西溪花园\t100715\n呼吸山中\t100716\nbicycle\t100717\n牛\t100718\n王家王\t100719\n就是骗你了我走\t100720\n尽头\t100721\n卡卡龙\t100722\nellaa\t100723\n利乐\t100724\n哇十多吖\t100725\n乚心\t100726\n苏富比香港拍卖\t100727\n同款\t100728\n通知足com\t100729\n狗猪\t100730\n我了行\t100731\n秋香\t100732\n冰糖葫芦\t100733\n你个臭不要脸萌萌哒\t100734\n红女\t100735\nle份\t100736\n大动脉\t100737\ncuyt\t100738\ndyAG\t100739\n心动心动\t100740\n不完了么\t100741\n白骨精美人鱼还有澳门风云还有大学大唐玄装还有我\t100742\n探测\t100743\n外国人大的很多多多多多多多我的岁月\t100744\n願景越來\t100745\n忌恨\t100746\n乳魔\t100747\n细心\t100748\n安神叶\t100749\nhaihai\t100750\n拿九稳\t100751\n强颜苦笑\t100752\n韩喜成\t100753\n几招\t100754\n我恨你我恨你我恨你我恨你我恨你我恨你我恨你我恨你\t100755\
n尽失\t100756\n快乐的我很温柔\t100757\n酒杯\t100758\n什208\t100759\njjjjjj点点点点点\t100760\n666666966669999699699\t100761\nroroasds\t100762\n有无助\t100763\n山河破碎风飘絮身世浮沉雨打萍\t100764\n江苏邳州市华阳小学\t100765\n承中等你\t100766\n06年\t100767\n我不入\t100768\nhhhshqh\t100769\nHabs\t100770\n40余\t100771\n孔文燕\t100772\n恶言\t100773\nkmbc\t100774\n125524885\t100775\n打情骂俏\t100776\n火哥\t100777\n黄志\t100778\n铁手人\t100779\ncdees\t100780\nJuno\t100781\n积液\t100782\n柔术\t100783\n喜燕春\t100784\n烤土豆\t100785\n铁疙瘩\t100786\n德才兼备\t100787\n牙根\t100788\n焦孟珠\t100789\n2012世界末日\t100790\n主唱\t100791\n珍珠鸟\t100792\n安思密达\t100793\n普宁\t100794\n流感\t100795\n分针\t100796\n纯阳\t100797\n14224221212\t100798\n完璧归赵\t100799\nSHAKE\t100800\n12y129\t100801\n腊八吃粥\t100802\n三八一四三八\t100803\n第2期\t100804\n晨昏\t100805\n分钟\t100806\n揭示\t100807\n候年\t100808\n动漫型\t100809\n三代四代\t100810\n丢类\t100811\n分钱\t100812\n痰多\t100813\n一个一百九二百多块\t100814\n53岁\t100815\n一不想听\t100816\njsiwh\t100817\n哈尔滨市政法委乌副书记曾致电文永泉\t100818\n镂空\t100819\n月坪梁\t100820\n不知所云\t100821\nJUNHO\t100822\n慈云阁\t100823\n小傲娇\t100824\n血压\t100825\n舟山\t100826\n盗梦\t100827\n马艳秋\t100828\n惠生活\t100829\nzjsjsns\t100830\n邵晓峰\t100831\n物件\t100832\n物价\t100833\n姐姐妹妹\t100834\n狂潮\t100835\n鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅巍峨WWE\t100836\nGUIDE\t100837\n洞庭湖君山\t100838\n咿呀呀呀呀呀呀呀呀呀呀呀\t100839\n曹思怡\t100840\nせ\t100841\nfulman\t100842\n财门\t100843\njsmb\t100844\njsmd\t100845\n徐凯伦\t100846\n金属杆\t100847\n长流量\t100848\n2月28日5时\t100849\n29P[\t100850\n贫血\t100851\n国语碟\t100852\n贪特贪\t100853\n蓝藻\t100854\n嘴巴克拉手\t100855\n商帮文化\t100856\n我尼\t100857\n天榆林\t100858\n忧乐\t100859\n落秋群\t100860\nhfuydbsddcvdndufeie14845937383\t100861\n我是你的心尖宠\t100862\n堡猫\t100863\n小计\t100864\n高山大川\t100865\ngh454\t100866\n年兔\t100867\n1e1e1111111111\t100868\n谢景行\t100869\nhhyud\t100870\n同一天\t100871\n国家电网公司\t100872\nlou\t100873\nFjkfhg\t100874\n讲到\t100875\n央求\t100876\n占主导\t100877\n倦容\t100878\n小資\t100879\n叶炀城\t100880\nkieaX\t100881\n前后\t100882\n马鼎盛\t100883\n一周多\t100884\n复合地板\t100885\n16米\t100886\n羊毛衫\t100887\n名门女王现在要你给我看一个夜萝莉精灵梦\t100888\n丑美\t100889\n好多久\t100890\n背号\t100891\n你家村\t100892\ncll\t100893\n东莞理工
\t100894\nclh\t100895\nclj\t100896\n井盖\t100897\n窦佳轩\t100898\n度九九九九五五\t100899\n1234512345678901223345\t100900\nclg\t100901\n嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯旺旺旺旺旺旺旺旺旺旺\t100902\ncla\t100903\n为宗旨\t100904\n多与寡\t100905\njfddb2\t100906\n妈妈手\t100907\n背叛\t100908\n至高\t100909\n新密\t100910\n加油中国冠军联赛\t100911\n一周天\t100912\n针织帽\t100913\nclr\t100914\ncls\t100915\n西联\t100916\n陈美玉\t100917\n贾力霖\t100918\n刷车\t100919\n旁白\t100920\n辽宁矿\t100921\n5676568876\t100922\n杨璨\t100923\n夜宴橙\t100924\n一不要在说那句话了我要走了亲我一下\t100925\n柯思\t100926\n不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t100927\n不成活\t100928\n不死不死\t100929\nfdtsee\t100930\n鸣人\t100931\nShaii7点amlets\t100932\nNOOOOOOOOOO\t100933\n穿越火线\t100934\n九九块\t100935\n日月如梭\t100936\n霓凰\t100937\n没成年\t100938\n度角2u\t100939\n56762217227252864766547247217\t100940\norororo\t100941\n平产\t100942\n铲车\t100943\n夏普\t100944\n新车站\t100945\n我在#三国来了#\t100946\nseare\t100947\n无惨\t100948\n钱物\t100949\n续租\t100950\n无惯\t100951\nsears\t100952\n天外视唱的歌\t100953\n香菇肥牛\t100954\n2012下\t100955\n熊德坤\t100956\n唐多多\t100957\n2553360\t100958\n忽略\t100959\n85颗\t100960\nv字仇杀队\t100961\n怎么呢\t100962\n已死\t100963\ncutmy\t100964\n无情\t100965\n全名\t100966\n有骨气\t100967\n谭刘锐\t100968\njrjge\t100969\n女孩童\t100970\n夏欣\t100971\n哪嘛\t100972\n透视\t100973\nfunshion\t100974\nghcb\t100975\n早起觉\t100976\n碌干\t100977\n锚索\t100978\n博美堂\t100979\n有饭吃\t100980\n炫铃\t100981\n煤子\t100982\n8888886\t100983\n看遍\t100984\n8888885\t100985\n杜密阳阳\t100986\n衣库\t100987\n俨装\t100988\n肾虚\t100989\n萝萝鱼\t100990\n严杭\t100991\n没礼貌不起\t100992\n小不点\t100993\n上世纪90年代中期\t100994\n胡乱\t100995\n找一本书\t100996\n指桑骂槐\t100997\n学佛者\t100998\n路易\t100999\n乱想到\t101000\n查没有\t101001\n侯婷婷\t101002\n新北\t101003\n新化\t101004\n选择格\t101005\n再见吧明\t101006\n一万一千一一万一千一万1000eeey一件\t101007\n色狼片\t101008\nhijtc\t101009\n咪觉晓\t101010\n首儿\t101011\n就是为啥\t101012\n歌贴\t101013\n1040n\t101014\n三百年\t101015\n新区\t101016\n9月23日\t101017\n酒吧台\t101018\n度秘我想和你的吗聊聊\t101019\nrreyou\t101020\n守序\t101021\n新民婆工资误会工\t101022\n毒针\t101023\n佔地\t101024\n话筒\t101025\n说就是说\t101026\n优姬\t101027\n5454946718462828984978055437146151645484804348143\t101028\n
b4余\t101029\ncurtains\t101030\n写错字\t101031\n闭月羞花\t101032\n第一批56\t101033\n墙面\t101034\n可爱的啦不啦多犬\t101035\n传真社\t101036\n零二二二二八三零八五八\t101037\n648千米\t101038\n红语\t101039\n暗挂\t101040\n19：15\t101041\n人力量\t101042\n开炮\t101043\n鼾声如\t101044\n人人类\t101045\n四年后\t101046\nabcdefgazkaliznopqoaccoulouvwsynz\t101047\n施乐\t101048\n株洲人\t101049\n并没有关系\t101050\n亲临其境\t101051\n揭开\t101052\n难吃死\t101053\n赛义夫\t101054\n刘鹏\t101055\n六百余额\t101056\n书荒\t101057\n陪调\t101058\n九江市\t101059\n吴奇施\t101060\n今天下午九点半\t101061\n用书\t101062\n寿司\t101063\n救难\t101064\n推挤\t101065\nUJHGGTH\t101066\n武打片儿\t101067\ngxhgggfdjcwg\t101068\n洗聊\t101069\n233晚\t101070\n你给我滚行吧我操你大爷\t101071\n片片\t101072\n志已美\t101073\nmeego\t101074\n经闹\t101075\n几包\t101076\nsnoy\t101077\n雨虹\t101078\n麦力素\t101079\n我是明天星期几\t101080\n酷尔公司\t101081\n涛声依旧\t101082\nEMIRATES\t101083\n李红钱\t101084\n狂欢\t101085\n蓬元帅\t101086\njgtuhtej\t101087\nFDF\t101088\n贼眉鼠眼\t101089\n很简单\t101090\n黑叔\t101091\n心意到\t101092\n15887146983\t101093\n三G网\t101094\n王军辉\t101095\nqyw\t101096\n肌注\t101097\n粤语话\t101098\n唉小度\t101099\n旅游团\t101100\n思密达粉\t101101\n仇元东\t101102\nDONwhy\t101103\n因为我喜欢男\t101104\n猪猪侠\t101105\n倩萌\t101106\n特权\t101107\ncyc\t101108\n车迷\t101109\n20061\t101110\nえ\t101111\n李少丽\t101112\n莫林\t101113\n当之无愧\t101114\nfxvhhb\t101115\n尾风\t101116\nyhfg\t101117\n极乐世界\t101118\nyhfy\t101119\n无性繁殖\t101120\n店长们\t101121\n挥剑\t101122\n肾病\t101123\n一男\t101124\n糊涂蛋们\t101125\n试图\t101126\n英雄三国\t101127\nfwdnw\t101128\n闻者\t101129\n王大神\t101130\n安真禾\t101131\n党中央\t101132\n周觅得\t101133\n温男\t101134\n讯号\t101135\n文岩尼嚎\t101136\n春眠不觉晓处处闻啼鸟夜来风雨声花落\t101137\n1314438\t101138\n电子厂\t101139\n好了好吗我问你了对不起\t101140\nradio\t101141\nquba\t101142\nonweekends\t101143\n铁头\t101144\n快乐加油站\t101145\n指挥棒\t101146\n无害怕\t101147\n漫网\t101148\n一难道\t101149\n一到五\t101150\n一举成功\t101151\n好不好吃\t101152\n铁多\t101153\n迎欢\t101154\n回不到\t101155\n谁们\t101156\nAaa\t101157\n共度\t101158\n水啤酒\t101159\n多乜\t101160\n宅男神\t101161\n片度\t101162\n多也\t101163\n靠近我\t101164\n三少\t101165\n自然法\t101166\n买加\t101167\n烧灼\t101168\n三小\t101169\n维斯\t101170\n贾潇雨\t101171\n高晗\t101172\n颜月
溪\t101173\n多么\t101174\n侧切\t101175\n从无\t101176\ndjsjnx\t101177\nsyseu\t101178\n看我不懂\t101179\n三尺\t101180\n学籍\t101181\n自由贸易\t101182\n1一11\t101183\n脸大宝贝\t101184\n搜搜\t101185\n周西村\t101186\n呢呀们\t101187\n高智\t101188\n养老保险\t101189\n河智苑#MBC\t101190\n一趟\t101191\n小明儿\t101192\n张雨洁\t101193\n十十类\t101194\n长风破浪\t101195\n十几G\t101196\n艺涵\t101197\n靠垫\t101198\n车价\t101199\n媚心\t101200\n那娃\t101201\nあ\t101202\n竹园\t101203\n厘峰\t101204\n哇歪\t101205\n大嗎\t101206\n布里茨\t101207\nsellme\t101208\n恬静\t101209\n上撇\t101210\n0644565800\t101211\n骷髅头\t101212\n视图\t101213\n超之战\t101214\n付宝玉\t101215\n导戏\t101216\n最高\t101217\nyouis\t101218\n痤疮\t101219\nbgjggg\t101220\n油泼面托叶儿\t101221\n重力祖望重\t101222\n99寸\t101223\n文徳路\t101224\n英久\t101225\n阿花花呗\t101226\n脉动青柠\t101227\n2274142165\t101228\n第五大道\t101229\n时男时女那你就是娘娘腔\t101230\n88926666\t101231\n住在\t101232\n大个人\t101233\n院长\t101234\n鲁豫\t101235\n｀o\t101236\n叫出\t101237\nPURCELL\t101238\n浅入浅出\t101239\n打出跑\t101240\n碘伏\t101241\n10多首\t101242\n乐语\t101243\n250度\t101244\n唱萌\t101245\n饱和\t101246\n杨二爷\t101247\n真的好啦\t101248\n洗眉酒吧\t101249\n不好不\t101250\n扭蛋\t101251\n九七点\t101252\n大大浩堂\t101253\n别个了哎哟我娘\t101254\nbhithvsgtxXchihcyfcffhggvhjjGfhhvuygvjhdccjpkgff\t101255\n真的好爱好爱\t101256\n咯回饋\t101257\n警员\t101258\n假烟\t101259\n即兴\t101260\n把杆\t101261\n九宫\t101262\ndddqq\t101263\n警告\t101264\n秘界\t101265\n晕了我告诉你\t101266\n看天奇\t101267\n调档\t101268\n洁曦\t101269\n日产\t101270\n系数\t101271\n2009年度\t101272\n廖佳玲\t101273\n凭添\t101274\n滑蛋猪扒饭\t101275\n喵咪咪喵咪咪\t101276\n闰土\t101277\n漆克\t101278\n水彩\t101279\n上海站\t101280\nhnhsjnhchjndhgnhdgndhgdjhcjkbsjhhhdjhshsvhv\t101281\n水形\t101282\n富士山顶\t101283\n鉴给\t101284\n班花\t101285\n45号\t101286\n刘雨梓\t101287\n45只\t101288\n完了叫\t101289\nllll\t101290\n害人\t101291\nGeographic\t101292\n兴致\t101293\n拿去儿\t101294\n躁动槽\t101295\nkwtng\t101296\n伊利诺斯州\t101297\n吴稚晖\t101298\ngtrueh\t101299\n心虑\t101300\n富刚\t101301\n好费劲\t101302\n过分\t101303\n男果\t101304\n五十亿一个\t101305\n最数\t101306\n王谦\t101307\n见笑\t101308\n穿上\t101309\n街南\t101310\n五街\t101311\n动作\t101312\n十四岁\t101313\n654804111771704885613482439122\t101314\n吟游\t1
01315\n五行\t101316\n3月2日\t101317\n质次\t101318\n那鹿门\t101319\n明细\t101320\n骚骚哒\t101321\n九米矿\t101322\n30.7万亩\t101323\n接受我\t101324\n第二次\t101325\n董洪岩\t101326\n土进\t101327\n二百四七十八天\t101328\n扭转\t101329\nn116\t101330\n2980元\t101331\n独子\t101332\n雪肌精\t101333\n五衡\t101334\n对的话\t101335\n1000000000000000个\t101336\n有手\t101337\n伪音\t101338\n王忙岭\t101339\n一往右\t101340\n6x10\t101341\n有所\t101342\n朋子\t101343\n朋字\t101344\n是啊少说点\t101345\n工商户\t101346\n小杜敏\t101347\n工广v\t101348\n給力點\t101349\n几百头\t101350\n会得\t101351\n吐出\t101352\n过年了么\t101353\n舞将们\t101354\n很丑臭\t101355\n姆指\t101356\n脍炙\t101357\n世界羽联错\t101358\n婆婆耿耿耿耿耿耿耿耿\t101359\n几百多\t101360\n会徽\t101361\n蔡镕泽\t101362\n光明磊落\t101363\n各村\t101364\n最高处\t101365\n朋学\t101366\n咱许\t101367\n一万次\t101368\n我问我问问你一个我问问你\t101369\n找中\t101370\n持有\t101371\n要头干嘛\t101372\n不餐\t101373\n一张长十九米的木条加密啊不犯贱下一名句字醉拳许多一样大么片全是的\t101374\n跟校对\t101375\n万物\t101376\n粉底液\t101377\n莪莪莪\t101378\n找主\t101379\n1q币\t101380\nrijbi\t101381\n龙居花园二区\t101382\n418点\t101383\n写生\t101384\n成年\t101385\n拍官\t101386\n烧香\t101387\n#AKB48#变身#Baby\t101388\n嫔妃\t101389\n10AM\t101390\n蚯蚓\t101391\n波多野结衣\t101392\n人工汗纸\t101393\n找一\t101394\n快点儿吧\t101395\n六平\t101396\n六年\t101397\n镁棒\t101398\n错题\t101399\nligfoss\t101400\n无心无憾\t101401\n唔小心\t101402\n杨博美\t101403\n长生不老\t101404\n我想我是真的爱上书\t101405\n屠哥\t101406\n屡屡\t101407\nPACHANGA\t101408\n噫噫\t101409\n意境\t101410\n天气报\t101411\n纲目\t101412\n写句话\t101413\n万载无双\t101414\n51瓶\t101415\n别着急\t101416\n一一丨\t101417\n一一个\t101418\n五毒LOLI\t101419\n2004年2月2日\t101420\n是非\t101421\n演播室\t101422\n礼法\t101423\n花人\t101424\n真真真真真真真真真真真真真真真真真真真真\t101425\nhi来了吗咯二马路上\t101426\n郭随性\t101427\nzjzbs\t101428\n说呀你\t101429\n成朋水\t101430\n一一一\t101431\n安艳丽\t101432\n甜栗香瓜\t101433\n觜\t101434\n榴莲牛奶\t101435\n真毒\t101436\n一个一个一个一个\t101437\n角\t101438\n兰博AH\t101439\n度霸\t101440\n血海\t101441\n马总\t101442\n览\t101443\n觉\t101444\n视\t101445\n9千二百四十\t101446\n今年二月八\t101447\n观\t101448\n快点放学吧\t101449\n见\t101450\n来一到\t101451\n病坏\t101452\n探寻者\t101453\n威尔克姆屠城\t101454\n哥儿\t101455\n对啊心情好\t101456\n啦块\t101457\n下榻\t101458\neverything\t101459\n笨头\t101
460\n新东方厨师学校\t101461\n38.04\t101462\nnikeN98\t101463\n触\t101464\n1、2月\t101465\n公平竞争\t101466\n解\t101467\n刚刚五\t101468\n攀枝花\t101469\n张政\t101470\n斯图基\t101471\n收腹\t101472\n衣帽\t101473\n张改\t101474\n么姑非\t101475\n十二八千里\t101476\n你貌\t101477\n曹阳\t101478\n张呈栋\t101479\n收腰\t101480\n康克清\t101481\n高智传媒\t101482\n张女士\t101483\n34.5%\t101484\n制假者\t101485\n赵国灿\t101486\n过堂会\t101487\n王树元\t101488\n260亿元\t101489\n青藏\t101490\n无儿克夫主\t101491\n雨泽\t101492\nJie\t101493\n偏女性化\t101494\n翠蛮\t101495\n黄土高原\t101496\n完满\t101497\n如棒\t101498\n当当当当当当当当当当当当当当当当三个了胎了了了了了不咯不咯了了了了不了了啦啦啦啦啦\t101499\n土耳\t101500\n上海店都\t101501\n内层\t101502\n愁人\t101503\n魄魄\t101504\n高坎村\t101505\n13838438\t101506\n重点\t101507\n克过\t101508\n刻骨铭心之恋\t101509\n克远\t101510\n订单\t101511\n套马\t101512\n重炮\t101513\n爱巧虎\t101514\n不舍\t101515\n穆提\t101516\n青皮冬\t101517\n重己者\t101518\n石晟钰\t101519\n汉莎机\t101520\n叙叙\t101521\n死点\t101522\n好多家\t101523\n我不要你了疯子\t101524\n表带\t101525\n百分n\t101526\n脾脏\t101527\n蔡谨泽\t101528\n程国帅\t101529\n不舒\t101530\n24米\t101531\n冯唐\t101532\n夏振宇\t101533\njakhm\t101534\n回来的话\t101535\n朱德\t101536\n多日用型\t101537\nwtwete\t101538\n卡式\t101539\n前嫌\t101540\n快快乖乖会会八会会\t101541\n走呀真讨厌\t101542\n子鼠丑牛寅虎卯兔辰龙巳蛇午马未羊申猴酉鸡戌狗亥猪\t101543\n程嵩\t101544\n张淑菁\t101545\nhisifnthu\t101546\n马会赢\t101547\n朱雅纪\t101548\n高兴千红\t101549\n担负\t101550\n锋利\t101551\nryanguys\t101552\n殊荣\t101553\n巴尔秘\t101554\n张俊哲\t101555\nrmlm\t101556\n豆秘咪咪咪咪咪咪咪咪咪咪咪咪咪\t101557\n昨日上午8时22分\t101558\n统制\t101559\n1月\t101560\n银奖\t101561\n52分\t101562\npentaq\t101563\n过行不梦幻\t101564\n抱抱行\t101565\n怪不会\t101566\n安安酱\t101567\n创可贴\t101568\n校名\t101569\n伯伟\t101570\n汇江湖\t101571\n茶叶蛋\t101572\n马荒纪\t101573\n1本\t101574\n一天一天\t101575\n生癌\t101576\n李秀蓉\t101577\n架昏沉沉\t101578\n心惶惶\t101579\n开罗公司\t101580\n五六次\t101581\n1朵\t101582\nwirjturyfcjfdt\t101583\n饿郎\t101584\n宁潭\t101585\n木节操\t101586\n行滞\t101587\n圣地\t101588\n朱志伟\t101589\n沈浩文\t101590\n人工服人工台\t101591\n一怎么\t101592\nggffgygcf\t101593\nLife\t101594\n流动贷款\t101595\n油烟机\t101596\n精屁\t101597\n英超\t101598\n傻孩子\t101599\nHy\t101600\n拿手绝活\t101601\n转发\t101602\n六平方\t101603\n講個笑話\t101604\n拆迁户\t10
1605\n我办\t101606\n转变\t101607\n我是你的小伙伴我叫魔仙美\t101608\n三角种\t101609\n洪彤彤\t101610\n一片漆黑\t101611\n离我听\t101612\n护化\t101613\n度秘你的老娘是谁呀你在\t101614\n过年了你说\t101615\n神马狗屁\t101616\n刘雯雷\t101617\n犀牛角\t101618\n转台\t101619\n吸足\t101620\n0246810\t101621\ncoach\t101622\n鹏鹏\t101623\n78版\t101624\n爸爸爸爸爸爸爸爸爸爸爸爸\t101625\n早来年\t101626\n不懂行\t101627\n所作所为\t101628\n四赎\t101629\n小豪\t101630\n供地\t101631\n鼓劲儿\t101632\n抠费\t101633\ncnx\t101634\n深调\t101635\n利库奇\t101636\n怪兽\t101637\npjgjgjgbg1c\t101638\n虫卵\t101639\ngre33\t101640\n怪兄\t101641\ncnf\t101642\n高主任\t101643\n#沃支付#商圈俱乐部\t101644\n度秘度\t101645\n钱钱钱几个是我的妹妹帮\t101646\n侯大良\t101647\n再来一再来\t101648\ncnd\t101649\n自知自明\t101650\n燃烧吧\t101651\nHi\t101652\n5年12月\t101653\n不成方圆\t101654\n斗蜜\t101655\n一起度\t101656\n陆毅度秘\t101657\nhhvvh\t101658\n幸有\t101659\n出其不意\t101660\n捻\t101661\n咯摸摸摸摸扎\t101662\n捶\t101663\n朱嘉淇\t101664\n捱\t101665\n据\t101666\ncnn\t101667\n捬\t101668\n好了就好\t101669\n康康\t101670\n捧\t101671\n猪猪侠十\t101672\n换\t101673\n提手\t101674\n看天天有喜\t101675\n捞\t101676\n损\t101677\nbpaybyphoto\t101678\n西餐馆\t101679\n车主贷\t101680\ntyhhcj\t101681\n变异苹果果果果果\t101682\n第三二个\t101683\n忙先走\t101684\n目送\t101685\n捕\t101686\n捐\t101687\nHg\t101688\n捎\t101689\n捏\t101690\n提扣\t101691\n捉\t101692\n捆\t101693\n油纸伞\t101694\n捅\t101695\n捂\t101696\n2293191593\t101697\n管兴虎\t101698\n木槿花\t101699\n7.2级\t101700\n鲍鱼饭\t101701\n汁味\t101702\n翘舌音\t101703\n奈落\t101704\n刘凯\t101705\n王中王\t101706\n握握\t101707\n喪晴\t101708\n核销\t101709\nYff\t101710\n晒饭\t101711\nccklj\t101712\n青岛足球\t101713\n高溢\t101714\n寡言\t101715\n鹤城区\t101716\n老兄\t101717\n做绝\t101718\n老兽\t101719\n老关\t101720\n就坐\t101721\n4:00\t101722\n老兵\t101723\n老八\t101724\n888788788888\t101725\n王小学\t101726\n老六\t101727\n老公\t101728\n小蚂蚁\t101729\n那表\t101730\n邱那\t101731\n王浩权\t101732\n建伟\t101733\n7…7\t101734\n杨寒\t101735\n距先\t101736\n张高清\t101737\n试试用\t101738\n落叶他乡树寒灯独夜人\t101739\n小肥妞来我家呀胖肥妞\t101740\n牡丹大坏蛋\t101741\n马铃薯\t101742\n本小\t101743\n110多次\t101744\n朱作\t101745\n珠江帝景\t101746\n林下姐\t101747\n房契\t101748\n背水\t101749\n本尼\t101750\n蠢组\t101751\nhellostmetout\t101752\nfshu\t101753\n注释\t1017
54\n我为什么喜欢你我想对你说我爱你\t101755\n注重\t101756\n环城南路\t101757\n放恩\t101758\n抱写\t101759\n香港\t101760\nbeautiful\t101761\n亲爱的想你晚安\t101762\n绿龙\t101763\ndjvgjc\t101764\n釜山\t101765\n好梦哒\t101766\n现在下午\t101767\n笑翻天\t101768\n度秘理财\t101769\n红岭南\t101770\n卡洛\t101771\n医学生\t101772\n朝九晚\t101773\n我爱天\t101774\n噗跑\t101775\nyoutalk\t101776\n二二八六\t101777\n发愁片\t101778\n4.3寸\t101779\n挥旗\t101780\n翩翩翩翩翩翩翩翩翩\t101781\n嘴馋\t101782\n3youl吗七\t101783\n进货\t101784\n哎呀啊\t101785\n唉小怪兽怪兽在哪里小怪兽\t101786\n压听\t101787\n字条\t101788\n有一个人\t101789\n邓一趟\t101790\n人肉包子\t101791\n小坏蛋\t101792\n说书面\t101793\n五六百\t101794\n窝头\t101795\n银两\t101796\n忘了吧\t101797\n05:10\t101798\n肖像\t101799\nweibo.com/2093492691/yr5sAFWuI\t101800\n没事情聊\t101801\nFamily\t101802\n13882\t101803\n劳动连棵花\t101804\n6532278\t101805\n6FB\t101806\n五十五十三个\t101807\n红木鉴定所\t101808\n风浅汐\t101809\n西洋参\t101810\n一四几\t101811\n特饮\t101812\n四尺\t101813\n向阳路强院一号楼\t101814\n溶剂\t101815\n斯琴高娃\t101816\nLVOE\t101817\n逐层\t101818\n珠宝\t101819\n6秒\t101820\n王扬\t101821\nH5\t101822\n一两行\t101823\n可怜鬼\t101824\n10000000000000岁\t101825\n卵巢癌\t101826\nivsxpmd\t101827\n2227\t101828\n回握\t101829\n微信场\t101830\n2222\t101831\n2223\t101832\n2220\t101833\n3天\t101834\nummetmi\t101835\n全心全意\t101836\n王佳佳\t101837\n好辛酸\t101838\n告诉我听\t101839\n花生醬捨己為公為\t101840\n王手\t101841\n3279364733\t101842\n信则无\t101843\n男意思论考试\t101844\n可口可乐公司\t101845\n幻觉\t101846\n度秘我喜欢\t101847\n仅次于\t101848\n25公斤\t101849\n泪汹\t101850\n超白\t101851\ntbt\t101852\n脑瓜兄\t101853\n咳咳\t101854\n3d眼镜\t101855\n考拉蛇鲸鱼老鼠熊鸽子海豚花鼠猪鸡\t101856\n族谱\t101857\n吃水\t101858\n秘报险\t101859\n夏一\t101860\n爱再来\t101861\n宋楼\t101862\n双休日\t101863\n平安保险\t101864\n相乘积\t101865\n香奈儿\t101866\n白帝\t101867\n我和你湿闺密\t101868\n狗经很在理\t101869\n蝴蝶\t101870\n王美丽\t101871\nBest\t101872\n硬卧\t101873\n除了\t101874\n棕榈油\t101875\n拿到\t101876\n走爱\t101877\n吃气\t101878\n安拉大我乖乖\t101879\n反射\t101880\n擦零就这样\t101881\n偶秘\t101882\n缪咪咪卡\t101883\n嗯咯农\t101884\n没有我帅\t101885\n恩威并施\t101886\n一百三十米\t101887\n一声话\t101888\n不是我是为了你\t101889\n燕峰\t101890\n白霞\t101891\n一盏灯\t101892\n高记华\t101893\n趣图\t101894\n消亡\t101895\n写号\t101896\ngtd\t1018
97\ngte\t101898\ngtf\t101899\ngtg\t101900\ngth\t101901\ngti\t101902\nxiho\t101903\ngtm\t101904\n豆纸们\t101905\ngtp\t101906\n写句\t101907\ngtr\t101908\ngtt\t101909\ngtv\t101910\ngtw\t101911\ngty\t101912\n白露\t101913\n用命\t101914\n先生化\t101915\n对她说谎\t101916\nhappened\t101917\ngmailcom\t101918\n45752\t101919\n劲一点\t101920\n一时半\t101921\naccelouaneta\t101922\n杂志\t101923\n过来\t101924\n我靠靠靠\t101925\n距离感\t101926\n唐·柳宗元\t101927\n真的不想和你两百了我是你土黄\t101928\ntera\t101929\n╲\t101930\n12306\t101931\n你好度秘你是我的小秘书\t101932\n娱乐是个圈\t101933\n╰\t101934\nbjjjnr\t101935\ncfghg\t101936\n23分\t101937\n没不变\t101938\n威化\t101939\n屌屌\t101940\n强音\t101941\nfhhdhcbcbd\t101942\n丁哥\t101943\n芦荟胶\t101944\n认为\t101945\nNTT\t101946\n硪嚓\t101947\n中居\t101948\nfydgffh\t101949\n中层\t101950\nGyuI\t101951\n新坡头\t101952\n表扬稿\t101953\nRfgd\t101954\n看看\t101955\n胡红敏\t101956\nguoweiisapilot\t101957\n中山\t101958\n旺仔馒头\t101959\n開心\t101960\n还有一个谁\t101961\n阿们阿们老\t101962\n庐江\t101963\n奢靡\t101964\n罗灿新\t101965\n大大大大大大\t101966\n6500块\t101967\n小榄\t101968\n囧事\t101969\nagdw\t101970\n你好你好机器人儿\t101971\ngicucififu\t101972\n嗯高兴\t101973\n脕\t101974\n脔\t101975\n字装\t101976\n管虎\t101977\n杨伊柳\t101978\n脒\t101979\nAlive\t101980\n权限\t101981\n288元\t101982\nb222号\t101983\n我来大姨妈了了我好难受\t101984\n暖和乐\t101985\n囧人\t101986\n周文博\t101987\n1395492377\t101988\n摩羯蛋\t101989\n满贯\t101990\n群\t101991\n嗯南岗\t101992\nfbrvy\t101993\n太厅\t101994\n银行类\t101995\n离离原上草\t101996\n个人们\t101997\n三险一金\t101998\n羴\t101999\n0110010111001001100\t102000\n羸\t102001\n壁山上\t102002\n柱形\t102003\n羼\t102004\n羽\t102005\niam\t102006\niao\t102007\n董总\t102008\nJalot\t102009\niak\t102010\n羅\t102011\n羊\t102012\nasimm2\t102013\n天天过\t102014\niaa\t102015\n羌\t102016\n频谱\t102017\n广州东圃\t102018\n丽颖多\t102019\n羚\t102020\n二二六三零\t102021\n介绍性\t102022\n羞\t102023\n柱香\t102024\n紊乱\t102025\n就是我就是说\t102026\n150元\t102027\n得分之夜\t102028\nMet00\t102029\nhumans\t102030\n毛主义\t102031\n好吃懒做\t102032\n倒毒\t102033\n卡夫卡\t102034\n挥文墨\t102035\n九十个小时\t102036\n帝王术\t102037\n转基因数\t102038\n拉拉来\t102039\n凑足\t102040\n陶险\t102041\n腰肌劳损\t102042\n无人机\t1020
43\n超频三和\t102044\n哇豆\t102045\n绕过去\t102046\n曾家洛\t102047\n藤兰\t102048\n医药\t102049\ndeliqut\t102050\nserg\t102051\nJ6孙\t102052\n带伤\t102053\n的样\t102054\np度密\t102055\n背靠背\t102056\nia6\t102057\nｔｆ\t102058\n执行者\t102059\n度秘的密我爱你\t102060\n英山\t102061\n只听\t102062\n1527055256\t102063\n592554\t102064\n枧盘\t102065\n易子\t102066\n确切\t102067\n2415542022\t102068\ntpjjjtdgg\t102069\n广安门\t102070\n国记\t102071\n原美\t102072\n解体\t102073\n唐乐进\t102074\n接棒者\t102075\n安良\t102076\n艾金森\t102077\n我的爱情\t102078\nkusd\t102079\n解作\t102080\nshfff\t102081\n要疯\t102082\n刘兴平\t102083\njmmnmm\t102084\n碎觉\t102085\n一一桶\t102086\n同类们\t102087\nuys\t102088\n猜看\t102089\n船航模\t102090\nuyw\t102091\n遭劫\t102092\nuyy\t102093\n苏镇琴\t102094\n我的成名曲\t102095\n袖珍姑娘\t102096\nLay915\t102097\nuyd\t102098\nuyg\t102099\nuyf\t102100\n细密\t102101\nuyk\t102102\n广袤\t102103\nFHYVVFF\t102104\n吸奶\t102105\n邓丽丝\t102106\n上腾\t102107\n余佳彦\t102108\nhungry\t102109\n1720岁\t102110\n菜市\t102111\n米泉\t102112\n服帖帖\t102113\nchiphotosbaiducomxiaodupicitema2cc7cd98d1001e963643b7bbf0e7bec54e7976djpg\t102114\n共同性\t102115\n坏老师\t102116\n白话文\t102117\n五九五一\t102118\n12345678906564\t102119\naend\t102120\n好麽\t102121\n一向下\t102122\n科贸\t102123\n定远\t102124\n二二七零三个\t102125\n红十字会\t102126\n坐垫\t102127\nJVKTX\t102128\n花火\t102129\n潇洒\t102130\nchhf\t102131\n最萌身高差\t102132\n能板\t102133\n西医\t102134\n西区\t102135\n侠客\t102136\n够了等\t102137\n观刈麦\t102138\n人身攻击\t102139\ndhkkjjmncvnl\t102140\nGfvh\t102141\n千里江陵\t102142\n57355\t102143\n术后\t102144\n霸业\t102145\njklm\t102146\n泥嚎\t102147\n联想乐phone\t102148\n美木兰\t102149\n雨儿\t102150\n话废话废话废话废话废话\t102151\n就是就是就是就是啦啦啦啦啦啦我是蹲坑的小行家\t102152\nthfjfyu\t102153\n本周五下午\t102154\n一肖一次\t102155\n霸主\t102156\n魔山口山\t102157\n海滨城市\t102158\n杏仁腰果\t102159\n王医保\t102160\n专访室\t102161\n佰草集\t102162\nMe，to\t102163\n对啊好巧\t102164\n嘶哑\t102165\nBRT\t102166\njgggh\t102167\n金德日\t102168\n愁善感\t102169\n大吉利\t102170\n土地出让金\t102171\n吐米特有\t102172\n不要你个大头鬼\t102173\n相宜本草\t102174\n256455\t102175\n趴床\t102176\n玛淡\t102177\n会好睡\t102178\n东扯西扯\t102179\n虐待\t102180\n德米德咪的咪的咪的咪\t102181\n疲惫\t102182\n求化解\t10
2183\n许要\t102184\n龙大广场\t102185\n林玉\t102186\n神河西\t102187\nhiso\t102188\n魔仙小蓝\t102189\n中博会\t102190\nhist\t102191\nlllllllllllll\t102192\n周空\t102193\n汪峰\t102194\n双宿\t102195\nddss\t102196\n飞翔礼\t102197\n赛尔号\t102198\n蒲在线\t102199\n我爱爱爱爱你\t102200\n电闪雷鸣\t102201\nlegetetarelekoultulojulloji'slajixdhgyg\t102202\n张昕怡\t102203\n张忆芬\t102204\n晨露\t102205\n逆差\t102206\n斯图塔\t102207\nmapac\t102208\n干今典\t102209\n速告\t102210\n宋小宝\t102211\n怀谷\t102212\n小日子泽\t102213\n相对\t102214\n爻又文科\t102215\n李铁军\t102216\n丰厚\t102217\n狗血淋漓\t102218\n赵一澍\t102219\n商鞅变法\t102220\n刘子宇\t102221\n510251628\t102222\n滴血\t102223\np214723\t102224\n降降火\t102225\n百度\t102226\n怪我完了\t102227\n毁掉\t102228\n便们\t102229\n快乐乐乐乐乐乐乐乐乐乐乐\t102230\nzjdvue\t102231\n情头\t102232\n绕外圈华\t102233\n白川由梨\t102234\n几百分\t102235\n韦宝月\t102236\nbed2e738bd4b31c63a986c980d6277f9e2ff82fjpg\t102237\n早些\t102238\n绿芽\t102239\n拒接\t102240\n我的最爱\t102241\n翻白眼看我\t102242\n情夕\t102243\n猫千\t102244\n我在拉拉的爱你\t102245\njjbhjnn\t102246\n早产\t102247\n世界性\t102248\n金太狼\t102249\n王广珠\t102250\n情处\t102251\n一个你的笑话\t102252\n整齐\t102253\n沟壑\t102254\n绑架案\t102255\n小夜猫\t102256\n膝关节炎\t102257\n钓鱼\t102258\n微搏\t102259\n医用机器人\t102260\nGbb\t102261\nGbn\t102262\n塘子\t102263\n1000平方公里\t102264\n赵王军\t102265\n赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫\t102266\n敷衍了事\t102267\nGbv\t102268\nyefs\t102269\n二十八九岁\t102270\n三次方\t102271\n一无是处\t102272\n哈那\t102273\n265546176\t102274\n朴人\t102275\n宏微\t102276\n西单山水宾馆\t102277\n综合\t102278\n喜羊羊之飞马奇遇记\t102279\n鬼样\t102280\n诈骗\t102281\n三个信不信真三\t102282\n双宝\t102283\n无人知晓\t102284\n口口相传\t102285\n多长\t102286\n奶瓜\t102287\n有取\t102288\n洋洋臭\t102289\n舞空岚\t102290\n鸡血\t102291\n有发\t102292\n包秘非\t102293\n5000万欧元\t102294\n外地女\t102295\n飘零\t102296\n白汐茉\t102297\n淳哥\t102298\n四五只\t102299\n总价\t102300\n自知\t102301\n鹅湖\t102302\n总份\t102303\n2000多\t102304\neeesdadfdswv541222\t102305\nufan台模特毛咯es\t102306\n总代\t102307\n城堡樱\t102308\n哼毛\t102309\n郝召豪\t102310\n逢君别\t102311\n完不成\t102312\n康庆许\t102313\n卫玲\t102314\n李想\t102315\n說話\t102316\n有所期诺也有所期约邪有所期诺也有所期约\t102317\n板桥良\t102318\n两万五千\t102319\n素黑\t102320\n靠你好\t102321\n高行健\t102322\n
hdine\t102323\n机敏\t102324\n鼠宛\t102325\n募资\t102326\n二根\t102327\n马犬\t102328\n空杯子\t102329\n真丑真心的丑一点也不萌\t102330\nz561\t102331\n无爱之战\t102332\n张冠\t102333\n429999999999\t102334\n亚麻\t102335\n第144期\t102336\nnesu\t102337\n衣冠禽兽\t102338\nness\t102339\ncmb\t102340\n北海二小\t102341\n徐集\t102342\n张冰\t102343\n护具\t102344\n芝大\t102345\nzhen\t102346\n吴修达\t102347\n说了拜拜\t102348\n442243\t102349\nDgddsbd\t102350\n复仇类\t102351\n杨文媛媛\t102352\n正式证紫棋\t102353\n八毛\t102354\n时辰\t102355\nCRVKFKGFFDBHGFCHHRFGHFD\t102356\n徐雯\t102357\n张军\t102358\n耿红卫\t102359\n咩公里外起亚\t102360\n依法\t102361\n湿透\t102362\n官厅\t102363\nq群\t102364\nexo天团\t102365\n挂链\t102366\n说了别再说\t102367\n泥煤\t102368\n狐鼠\t102369\n儒\t102370\nsfezfft\t102371\nthatiscostly\t102372\n几岁时\t102373\n行长\t102374\n傻不拉几\t102375\n清汤\t102376\n野心\t102377\n陆大\t102378\n儠\t102379\n商帮\t102380\n高鬼\t102381\n位于\t102382\n无责\t102383\ndedd\t102384\n桥西\t102385\ntio\t102386\n依波\t102387\n1462396592\t102388\ntih\t102389\n真好好好像\t102390\nlojy\t102391\n吱吱复\t102392\n就是说你我就是在吗你\t102393\n滚\t102394\n有你\t102395\n滑\t102396\n使理\t102397\n滔\t102398\n尖牙\t102399\n杨兴洲\t102400\n金狮\t102401\n15597506811\t102402\n肉岗\t102403\n一第六\t102404\n笑干嘛\t102405\n邹欣彤\t102406\n菠萝派\t102407\n法律与秩序\t102408\n滇\t102409\n秦怡\t102410\n滺\t102411\n及肩g8\t102412\n滾\t102413\n滿\t102414\n打搞\t102415\nrtnu\t102416\n滳\t102417\n爱心福娃七十粉\t102418\n滨\t102419\n有余\t102420\n台位\t102421\n打搅\t102422\n满\t102423\n大和体\t102424\n有体\t102425\n四万平方米\t102426\n滥\t102427\n滧\t102428\n二三十\t102429\n6.20\t102430\ncfducx\t102431\n小队长\t102432\n郭老板\t102433\nfascb\t102434\n55241346\t102435\n老同学\t102436\n解禁\t102437\n熄灭\t102438\n熄灯\t102439\n食家\t102440\n熄火\t102441\n就嫁\t102442\n曝光率\t102443\n臭密度\t102444\n赶脚\t102445\n漫动\t102446\n胡椒\t102447\n要不查\t102448\n花去我的我的微信\t102449\n死繼續繼續\t102450\n福哥\t102451\n江城\t102452\n齿轮\t102453\n林女士\t102454\n老律堂\t102455\n6068\t102456\n13934344999\t102457\n博士妥\t102458\n六十一\t102459\n杜颖妮\t102460\nBlossom#乐队\t102461\n额娘\t102462\n花卷\t102463\n十四一八四\t102464\n阿干\t102465\n无路可走\t102466\nacouplemonthsago\t102467\n好了解释怀念书\t102468\nBBBBBBBBBBBBBBBBBB
BB\t102469\n狍子度秘\t102470\n_\t102471\n你的样\t102472\n600元\t102473\n吠叫\t102474\nibook\t102475\njeep\t102476\n1小时间\t102477\n600克\t102478\nHTV\t102479\njeej\t102480\n吗堡\t102481\nQzone\t102482\nJ棒\t102483\n酋龙珠\t102484\n火箭炮\t102485\n泼男\t102486\n叙说\t102487\n据爱\t102488\n南海观音\t102489\n好狠\t102490\n分榜\t102491\n行干\t102492\n实事求是\t102493\n变性人口马\t102494\n占比看毛爷爷直肠炎\t102495\n劣势\t102496\n城镇居民\t102497\n暗暗暗暗暗暗\t102498\n一路通天下\t102499\n属于你\t102500\n忠勇\t102501\n好狗\t102502\n锅物\t102503\nbude\t102504\n想起午\t102505\n杂项\t102506\nfnfh\t102507\n兼有\t102508\njtjjjjm\t102509\n敲锣打鼓\t102510\n1500家\t102511\n33vu\t102512\n画眼\t102513\n1898年3月5日\t102514\nhodh\t102515\n一冬天\t102516\n快点板\t102517\n回腿\t102518\n周丽烨\t102519\n了塞\t102520\n母上\t102521\n宁肯\t102522\n忙忘记\t102523\n敝帚自\t102524\n欧漏\t102525\n车溪\t102526\n间或\t102527\ncmt\t102528\n美满\t102529\n2月6日\t102530\n画眉\t102531\n踢西乙\t102532\n客套感叹号\t102533\n照看\t102534\n香精\t102535\n养窝\t102536\n老美\t102537\nAgaua\t102538\n欧阳阳\t102539\n宋仲基\t102540\n挺我不想\t102541\n硬块儿\t102542\n我的电话号码\t102543\n17876868571\t102544\n1730232002\t102545\ngunaa\t102546\n琥珀核桃\t102547\n张雅虎\t102548\nngmg\t102549\n飯麼\t102550\n明彩虹\t102551\n阎良\t102552\n二硕帅\t102553\n再见再见再见再见再见再见一万年\t102554\n1469764661983569559461\t102555\n13832840274\t102556\n小博\t102557\n放片\t102558\n那刚刚那你在干嘛你\t102559\nhwk\t102560\n转评\t102561\n番茄\t102562\nhwg\t102563\nhwd\t102564\n双子座\t102565\n篮球馆\t102566\n小南\t102567\n小单\t102568\n小半\t102569\n夯实\t102570\nftuc\t102571\n下于\t102572\n进去\t102573\n钱萝卜\t102574\n伺服\t102575\n小千\t102576\nmariolyouj\t102577\ncanspell\t102578\n放牛\t102579\noil1555455844\t102580\n2011年6月6日\t102581\n长笑话\t102582\n小杜哈哈\t102583\n直和\t102584\n敬亭山\t102585\n那布拉多\t102586\n33岁\t102587\n中菲\t102588\n九四吖\t102589\n了哇我老新闻了良我爱你爱我\t102590\n陈高冉\t102591\n左派\t102592\n245728\t102593\n小卡\t102594\n沃爪\t102595\n囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧囧\t102596\n红参\t102597\nhriwjsbcbfb\t102598\n不伦不类\t102599\n张首歌\t102600\n八六版\t102601\n呢不片\t102602\n红叉\t102603\n波鞋\t102604\nTsbsO\t102605\n赵丽\t102606\n红发\t102607\n苍穹\t102608\n三连败\t102609\n苍穷\t102610\n首轮\t102611\n28852\t102612\nUt
tyteqlllllmHuuuuuuuuuuuiuiuubbbhbhlbjjjjjhhjjhhhjm\t102613\n老许要\t102614\n伊莱\t102615\n111111111111111111111111111111111122222222222\t102616\n伊希欧\t102617\n帅秘\t102618\n500多\t102619\n赵一\t102620\n两代表我的心\t102621\n旭茉\t102622\n火潮\t102623\n红叶\t102624\n大花轿\t102625\n自拍杆\t102626\n唐悠悠\t102627\n山东卫视\t102628\nAnd\t102629\nrorororor\t102630\n别爱\t102631\n怀特\t102632\nspexial\t102633\n言无不尽\t102634\n重阳节\t102635\n尔旺酊\t102636\n牛郎兄\t102637\n内置\t102638\n肖文丽\t102639\n威崴\t102640\n以心换心\t102641\n彭阳兴\t102642\n无想吃\t102643\n汽车城\t102644\n5.45万平米\t102645\n塔吉\t102646\n夜深人静真的好寂寞\t102647\n塔吊\t102648\n刚来了\t102649\n不难的话\t102650\n松仁茶香鱼米\t102651\n打把把营\t102652\n郭增洪\t102653\n狼来了我\t102654\n南站\t102655\n10800\t102656\n日漫\t102657\n容聪\t102658\n喔多克力店大吉\t102659\n说服\t102660\n代珉豪\t102661\n51nn\t102662\nfygcc\t102663\n乐乐侠\t102664\n酥软\t102665\n周小\t102666\n蔡书灵\t102667\n骑士队\t102668\n什龙哥\t102669\n吐鲁番盆地\t102670\n马文青\t102671\nsHayDay\t102672\nwhat\t102673\n花朵裙\t102674\n簌簌\t102675\nEgruficjekw\t102676\n每每\t102677\n王刚讲故事\t102678\n覃思妙\t102679\njjjii\t102680\nwhay\t102681\n急妈妈\t102682\n帕斯卡\t102683\n一蹲\t102684\n贺思游\t102685\n我喜欢度秘你做我的仆人\t102686\n夜戏\t102687\n2388\t102688\n孝昌\t102689\n苦味\t102690\n一蹴\t102691\n谈苏\t102692\n色情片\t102693\n卓万串\t102694\n制表\t102695\n煲剧\t102696\n生死祖击\t102697\n异域风情\t102698\n句容\t102699\n4座\t102700\n4度\t102701\n泪光\t102702\n偶果果\t102703\n归校\t102704\n瓦会\t102705\n五姑娘\t102706\n懂挺多嘛\t102707\n大呼\t102708\n连沙漠\t102709\n呀不然的话\t102710\n逆吹牛马\t102711\n喂衣服\t102712\n想和你最好\t102713\nAny\t102714\n辽阔天空\t102715\n付梦娴\t102716\n锵锵是你\t102717\ncboky\t102718\nDan\t102719\n海疆\t102720\n吾王\t102721\n波莉\t102722\n685506\t102723\n明区\t102724\n试题\t102725\n我也不得不爱你我恨你\t102726\n北沙\t102727\n无乳\t102728\n行么听\t102729\n麦麦\t102730\n宝意\t102731\n表哥\t102732\n一份七十\t102733\n別墨跡\t102734\n嗓司\t102735\n花间舞\t102736\n呀元\t102737\n赵丽宏\t102738\n无乐\t102739\n实景\t102740\n无乜\t102741\n空壳\t102742\nbooklbook\t102743\n58355\t102744\n887千七百一四十\t102745\n金舟\t102746\n汉寿西\t102747\n两小时多\t102748\n不是老鼠我是老鼠老鼠老鼠\t102749\n台风雨\t102750\n东风\t102751\n放佐\t102752\n港口\t102753\n港句\t102754\n555
412366\t102755\n快乐节\t102756\n先原谅\t102757\nDay\t102758\n编织\t102759\n港台\t102760\n编组\t102761\ngfbavfnnnnnsbdvdvgsgwgdfffmbdbgdjh\t102762\n藏式\t102763\n回解放\t102764\njinlels\t102765\n放低\t102766\n嘢噶\t102767\n9888999\t102768\n靳斌斌\t102769\n庞雪芝\t102770\n北洋政府\t102771\njrgrz\t102772\n杰轩\t102773\n玉门\t102774\n60多岁\t102775\n更刚刚刚刚刚刚刚刚刚刚\t102776\n變了還是\t102777\n谭思妍\t102778\n琉璃世界大慈大悲观世音菩萨摩诃萨\t102779\n金木水火土\t102780\n75￥\t102781\n山坳\t102782\nbhiphotosbaiducomxiaodupicitem9\t102783\n管道\t102784\n35辆\t102785\n廿八\t102786\n爷们\t102787\n中万\t102788\n小缘\t102789\n山坡\t102790\n前无古人\t102791\n王子殿下你好像\t102792\n小鲜肉\t102793\n小编\t102794\n倪朗\t102795\n对呀好棒\t102796\n霆锋\t102797\nWCS\t102798\n亮喝\t102799\n几周岁\t102800\n译文\t102801\n洁影\t102802\n发软\t102803\n吕春瑶\t102804\n度秘你在哪儿我看不见你了\t102805\n萌妺子\t102806\n提出\t102807\n环关\t102808\n内场\t102809\n本地水利局\t102810\n五荤\t102811\nPilgrimage\t102812\n444545\t102813\n哈尔斯\t102814\n虚拟乐队街头霸王Gorillaz新曲\t102815\n杨秉硕\t102816\n财物\t102817\n步入式\t102818\nhairymen\t102819\nS3601金粉\t102820\n其中一位\t102821\n林允俊\t102822\nhhkn\t102823\nhbghgv\t102824\n0.13%\t102825\n413xx902\t102826\n心眼\t102827\n碧水华庭\t102828\n云天河旷野中断崖峭壁走蛟龙三一祖云追辛玥骨\t102829\n淡茶\t102830\n清醒\t102831\nico思密达\t102832\n找餐\t102833\n围合\t102834\n22元\t102835\n齐云志\t102836\n我要罗海东来了美啦美哦纹师\t102837\n二九九二\t102838\n印度洋\t102839\n詹瞻\t102840\n舒韩辉\t102841\n惯用语\t102842\n美男孩\t102843\n空姐姐\t102844\n化妆间\t102845\n08976\t102846\n胜胜\t102847\n古月哥\t102848\n分分分分456789\t102849\n追诉\t102850\n光家\t102851\n十五块\t102852\n969426494324962892643494543486426932893643422776763485426\t102853\n丑八怪丑八败\t102854\n语言\t102855\n豆豆豆豆豆豆豆豆\t102856\n仁义堂\t102857\n5秒钟\t102858\n一垒\t102859\n花冰\t102860\nhfhdvs\t102861\n大风大雨我不怕拉拉啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t102862\n天津联通分公司\t102863\n杂木\t102864\n老嘎哈舅子\t102865\n藤野\t102866\nwrrttio\t102867\n贤春\t102868\n定亲\t102869\n耳钉\t102870\n奥特麦\t102871\n好找事\t102872\n二级品\t102873\n弹指\t102874\nrbdhu\t102875\n全脂牛奶\t102876\n刘立淇\t102877\n修身齐家治国平天下\t102878\n月影清墨\t102879\n零四三三四\t102880\n兴变成\t102881\n小鸭\t102882\n怕死掉\t102883\n小鸡\t102884\n饥渴\t102885\n小鸣\t102886\n窦建志\t102887\n我是你的
女王陛下气\t102888\n胡蝶\t102889\n乳房\t102890\n恶梦了好梦好梦\t102891\n555254\t102892\n柴然然\t102893\n布满书架的房屋\t102894\n朱佳佳\t102895\n泫京\t102896\n不并\t102897\nz13841449208\t102898\n不平\t102899\n阙丽云\t102900\n猪血\t102901\n百度贴吧\t102902\n三八点\t102903\n不幸\t102904\n60米\t102905\n我俩\t102906\n受让\t102907\n留食\t102908\n二套\t102909\n小节\t102910\n我信\t102911\n兵哥\t102912\n西天寺\t102913\n56767\t102914\n隐姓埋名\t102915\n一九八八年\t102916\n王雅明\t102917\n植物体\t102918\n灰心\t102919\n上午五一\t102920\n复国\t102921\n省罚款\t102922\n王灏宇\t102923\n常态\t102924\n特质\t102925\n不干看\t102926\n郑刚刚\t102927\n一百七十一\t102928\n南京晨报\t102929\nwls\t102930\n拿不着\t102931\nwlw\t102932\n孤岛惊魂\t102933\n张先生\t102934\nwlk\t102935\nwlm\t102936\n常性\t102937\n风沙\t102938\nding\t102939\n我的生日\t102940\n猪猪侠之五灵卫\t102941\n粗细\t102942\n伦敦\t102943\n杭宝宝\t102944\n杜拉\t102945\n好谋\t102946\n家利\t102947\n给贷\t102948\n978\t102949\n宫恩恒\t102950\n陈胜王\t102951\nghhggij\t102952\nlnx2\t102953\n海之子\t102954\n養活\t102955\n消瘦\t102956\n盘点\t102957\n99票\t102958\n守护力\t102959\n白砍鸡\t102960\n陈银圣\t102961\n哎呀呀系\t102962\nGTA\t102963\n299998661\t102964\nGTD\t102965\nLOL咯JJ咯all马虎\t102966\noooooooiooooooii\t102967\n静脉豆\t102968\nGTI\t102969\nwesgs\t102970\n三只眼\t102971\n倾城我不可以了我的\t102972\n汉川\t102973\n恩里克\t102974\nGTV\t102975\nGTX\t102976\n2400多少\t102977\n咽下\t102978\n音乐勇敢的心\t102979\n怀有\t102980\ncufr\t102981\n此珠\t102982\n48届\t102983\n对不错\t102984\n鼎盛期\t102985\n内样\t102986\n啦啦啦啦我的小呀小飞机啦啦啦啦啦啦一起啦啦啦啦我的小呀小飞机\t102987\n刘金媛\t102988\n顺昌\t102989\n二十二二十二个\t102990\n吹嘘\t102991\n密度hi小度秘\t102992\n等等于\t102993\n艳阳\t102994\n热水袋\t102995\n7f\t102996\n陓田\t102997\n不懂我没你说的话\t102998\n呆驴\t102999\n网络凯\t103000\nvkjhjkc\t103001\n7k\t103002\n乐器\t103003\n7i\t103004\n7v\t103005\n7u\t103006\n校花七\t103007\n苏圩镇\t103008\n骨龄肝儿\t103009\n给她说\t103010\n鼾声\t103011\n點擊httppinyincn1aSjY9y2mcO\t103012\nmoise\t103013\n吉松\t103014\n骑马砍杀三国无双\t103015\n线性\t103016\n154分\t103017\n空空空\t103018\n怒火\t103019\n路痴\t103020\n清波鸾凤\t103021\n15343098682\t103022\n尤其是\t103023\n麦当当海贼王手办\t103024\n展台\t103025\n南澳大利亚\t103026\n对呀\t103027\n13313131313\t103028\n每个女人\t103029\n#TEDx#//\t103030\n
煤款\t103031\n7X\t103032\n公会\t103033\n成奥\t103034\n李璐璐\t103035\n6月21日16时\t103036\n黍秀宫庭孰悯周\t103037\n安安案\t103038\n公众\t103039\n77\t103040\n特产\t103041\n75\t103042\n魏龙\t103043\n73\t103044\n72\t103045\n71\t103046\n胃囗\t103047\n留连换\t103048\n一道菜\t103049\n罗浮分\t103050\n骗而复\t103051\n79\t103052\n78\t103053\n4742775753\t103054\n不要我汗我哭哭\t103055\n试试听说\t103056\n窃窃私语\t103057\n什么他走了他走了他走了\t103058\n北京地铁一号线\t103059\n疯狗\t103060\n假象\t103061\n贾梦恬\t103062\n浮浅\t103063\n疯狂\t103064\nvUV镜个会\t103065\n鼓舞\t103066\n去年圣诞节\t103067\n李嘉晖\t103068\n大学生村官\t103069\n低低\t103070\n低位\t103071\n叫花心\t103072\n20项\t103073\n阿里巴巴\t103074\n不要我了我\t103075\n尿份\t103076\n起拍\t103077\n好笑死\t103078\n差不出来\t103079\n％┏\t103080\nroo\t103081\n苏修美帝\t103082\n说呀别愣\t103083\n比邻\t103084\n难分\t103085\nzksjdjic\t103086\n首办\t103087\n免刑\t103088\n25千克\t103089\n2060740964\t103090\n饿昏天使\t103091\n艾安娜\t103092\nTBody\t103093\n麻辣比居然黑我家麦记\t103094\n庄梓晖\t103095\n明天晚\t103096\n红瘦\t103097\n骄傲龙\t103098\n宽窄\t103099\n奥运村\t103100\nfragilisticexpia\t103101\n哪曲\t103102\n迄今为止\t103103\n死腚\t103104\nporque\t103105\n小洋楼\t103106\n中国男排\t103107\n一天半\t103108\n南儿\t103109\n一三九六七八九十三三三三三三三三三三三三三三三三三\t103110\nFTT74YTRDTSYRV8\t103111\n月三\t103112\n5755\t103113\n杏树\t103114\n惊魂甫\t103115\njfgiccifux\t103116\n徐倩茹\t103117\n123年\t103118\nburang\t103119\n太太太太太太太太太太太太太太太太太太太太太太太太太太太太太\t103120\n卖菜\t103121\n防不胜防\t103122\n月中\t103123\n樊耀泽\t103124\nCjdi\t103125\n张小君\t103126\n营业吧\t103127\nperpetual\t103128\n1493561990\t103129\nsenceeight\t103130\n哎呀你的生活\t103131\n唉度秘度脂血太多了吧对呀对呀\t103132\n京津冀吧\t103133\n水口山\t103134\n没啊\t103135\n未来10天\t103136\n马晋超\t103137\n叫声君\t103138\n小陈龙\t103139\n15224\t103140\n冷宫\t103141\n＿块\t103142\n恰尔\t103143\n邓菊珍\t103144\nf22\t103145\nf20\t103146\n没啥\t103147\nkkaa\t103148\n变迁\t103149\njgjgtgjgd\t103150\n变过\t103151\n两乀\t103152\nwashdygstgdcfhxjbxygdyhdyhxtgdfhxfddhfXtgf\t103153\n为尼\t103154\n孟良\t103155\nkinkmmmkmkm\t103156\n大kb\t103157\n更轻松\t103158\n拉磨\t103159\n627.30万辆\t103160\n哈楼哈楼哈楼哈楼讨厌讨厌讨厌讨厌\t103161\n德凡\t103162\n握蹄\t103163\n王晟杰\t103164\n乙肝疫苗\t103165\n记不记得\t103166\n仓皇\t1031
67\n床子\t103168\n400答\t103169\n树儿\t103170\n建鹏\t103171\n97k\t103172\niCoffee\t103173\n八哽\t103174\n晚上7点40分\t103175\n饭吃风格CC\t103176\n等你是\t103177\n未名湖畔\t103178\n12355555589009\t103179\n三心二意\t103180\n八哥\t103181\n明察秋\t103182\n过来得\t103183\n上来片\t103184\n收干\t103185\nhttpwwwmizhuanmex949fa4b\t103186\n矿业权\t103187\n辛苦\t103188\n说不骗人\t103189\n装油\t103190\n一回个\t103191\n946161558280500424674364994626\t103192\n邓友梅\t103193\n蓝钻\t103194\n白头偕老\t103195\n擎羊\t103196\n杨柳依依\t103197\n晓大利涂之工交大利涂之大利涂之\t103198\n听行\t103199\n啊东明你好聪明啊真不愧\t103200\n马花\t103201\n二百八十\t103202\n融汇\t103203\n易额\t103204\n孙海卫\t103205\n固定电话\t103206\n宪刚\t103207\ndianyin\t103208\n山车\t103209\n地红\t103210\n643767987683679161346485734\t103211\n睡了吧\t103212\n中国航空信息中心\t103213\n延世大学\t103214\n1140259284\t103215\nDhf\t103216\n师表\t103217\n秦子\t103218\n六朵稿\t103219\n明天十一点\t103220\n室西连\t103221\n果网\t103222\n火箭骑士\t103223\n亲爱的你困\t103224\n15724361312\t103225\n进城务工者\t103226\n贝勒生意\t103227\n闹哼\t103228\n九零号线零零号\t103229\njfbc\t103230\njgkk\t103231\nBSEAT\t103232\n发冷\t103233\n12月21号\t103234\n安身立命\t103235\n爱情片\t103236\n那你说小猪小猪121121144小兔二一求姻缘\t103237\n秃头秃头\t103238\n老蓝\t103239\n3d2m\t103240\n陪囧\t103241\n1265756333\t103242\n张关于\t103243\n一位数\t103244\n虎年\t103245\njukeyl\t103246\nparametric\t103247\n盐湖城\t103248\n9点34分\t103249\n我没什么你要我我不要你的干嘛我要你望希望的\t103250\n下注\t103251\n植物学\t103252\n别哄\t103253\n32.28%\t103254\n盲流\t103255\n脑精急转弯\t103256\n鲁拉鲁拉\t103257\n木地板\t103258\n口袋装\t103259\n夜来幽梦忽还乡\t103260\n大开始\t103261\n5616151351816131\t103262\n别哭\t103263\ndgcg\t103264\n奴役\t103265\n我明白\t103266\n郑凯\t103267\n决不\t103268\ndgcs\t103269\nFrederic\t103270\n假的话\t103271\n羊癫疯几一点速速\t103272\nDhq\t103273\n巧裁缝\t103274\n婚事\t103275\n肿么\t103276\n马嘶\t103277\n89M\t103278\n15284078730\t103279\n890\t103280\n我男人的女人们给我没有人给我园\t103281\n895\t103282\n896\t103283\n899\t103284\n仿真\t103285\n韩哥哥\t103286\n补气\t103287\n会出现\t103288\n如何写作意识流小说\t103289\n持有者\t103290\n补完\t103291\n99999999996325874198742257138124713726875685684268759855\t103292\n第2集\t103293\n同末\t103294\n早晨7时\t103295\n童话笑话\t103296\n接片\t103297\n殷小姐\
t103298\n檀香木\t103299\n直肠癌\t103300\n飞花\t103301\n补水\t103302\nkmkajdb\t103303\nholika\t103304\njsjbx\t103305\nFrankyou\t103306\n肠鸣\t103307\n硬邦\t103308\n人族\t103309\n海佩波斯\t103310\n哇安\t103311\n12356780896\t103312\ngjjadpdadjgga\t103313\n了死\t103314\n爸的名字\t103315\n插队\t103316\n光盘\t103317\n京京热\t103318\n摆脱\t103319\n成人工\t103320\n一小点儿\t103321\n说不了\t103322\n卢方泉\t103323\n真城\t103324\n降臨\t103325\n阮笑欢\t103326\nLA站\t103327\n寻人启事\t103328\n第十三个故事\t103329\n好淫\t103330\nicgjbf\t103331\n唐彬\t103332\n臭豆\t103333\n一零零\t103334\n格子\t103335\n我说话你的听不清你不是猪你是什么\t103336\n当当比\t103337\n十二个人\t103338\n车式\t103339\ntodayis\t103340\n下周二十点\t103341\n落雨啵\t103342\n范宁楠\t103343\n别羞\t103344\n保罗\t103345\n微微微\t103346\n嘉祥\t103347\n谱写\t103348\n我以帅瞎你的眼\t103349\n张太太\t103350\n川菜馆\t103351\n余孽\t103352\nvutfuh\t103353\n熊华勇\t103354\n别美\t103355\njp\t103356\n日久见\t103357\ncvbn\t103358\nmighch\t103359\n龖\t103360\n猪宝\t103361\nversion\t103362\n度秘只是你的名字你是人\t103363\n技能\t103364\n陈舒欣\t103365\n上侧\t103366\n失传\t103367\n侯总\t103368\n嗯咪\t103369\n蒙古猎人bbb\t103370\n祖印\t103371\n嗯咯\t103372\n嗯咱\t103373\n天天酷跑2016版\t103374\n杨艳新\t103375\n破坏感\t103376\n大咯血\t103377\nhaboptu\t103378\n肌苷片\t103379\n陈丽艳\t103380\n退下来\t103381\njr\t103382\n印江小区\t103383\n肖肖日\t103384\n偏移量\t103385\n老娘的老娘和你贫\t103386\n徇私\t103387\n李小娜\t103388\n茶班\t103389\n照你\t103390\n四面八方\t103391\n1256\t103392\n想要你了\t103393\n旅行者\t103394\n1253\t103395\n1250\t103396\n尼小坤\t103397\n度文儿\t103398\n套烟\t103399\n诺曼纳纳农\t103400\n1258\t103401\n行我素\t103402\n班主\t103403\n玩久睡\t103404\nJVMfdgs2nhvfvh4n7Gghhsugz4hcasfsscs\t103405\n露\t103406\n大样\t103407\n霾\t103408\n聊聊聊聊聊聊聊聊聊聊聊\t103409\n大根\t103410\n桃源小学\t103411\n大格\t103412\n霧\t103413\n大校\t103414\n宗泽\t103415\nross\t103416\n二十点\t103417\n苕\t103418\n必定\t103419\n八点半多\t103420\n韩庚\t103421\n大树\t103422\n我不想当你\t103423\n睡一觉\t103424\n短信片\t103425\n霜\t103426\n断打\t103427\n网上银行\t103428\n血案\t103429\nshengmen\t103430\n嫂子们\t103431\n班上\t103432\ncgguf\t103433\n需\t103434\n新都区\t103435\n师蕊彬\t103436\n霍\t103437\n十五四五\t103438\n辛晨曦\t103439\n霉\t103440\n吗错\t103441\n上华堂\t103442\n秀#\t103443\n戚庆腾\t1034
44\n华港\t103445\n电子机器人\t103446\n有钱是好\t103447\n配角\t103448\n夏光\t103449\n惺惺惜惺惺惺惺惜惺惺\t103450\n刘灿\t103451\n外壳\t103452\n支分\t103453\n阅读文\t103454\njwkjjjs\t103455\n遠方\t103456\n广告版\t103457\n夏兮\t103458\n有一次那一年\t103459\n得不到\t103460\n成熟期\t103461\n钱钱钱钱钱钱钱钱钱钱\t103462\n够出位\t103463\n外星小子的歌你是\t103464\nqwwwww\t103465\n八一桥\t103466\n500多元\t103467\n东南小学\t103468\n981\t103469\n二十一万六千个\t103470\n东陆\t103471\nfilet\t103472\n你和我一样我的女偶像是哀家啦baby\t103473\nctv\t103474\n契机\t103475\n1355n57955\t103476\n内乡\t103477\n去哪儿电大\t103478\n谢啵啵\t103479\n肚皮眼\t103480\n燕麦酒\t103481\n必看着我\t103482\n一条心\t103483\n本山大叔\t103484\n老妈\t103485\n内乱\t103486\n调和\t103487\n宋梦冉\t103488\nji\t103489\nGCHCFIGDJJ\t103490\n老罗\t103491\n谋皮\t103492\n结行\t103493\n行动画\t103494\n东陵\t103495\n肉铺\t103496\n叫你走\t103497\n退行性关节炎\t103498\n2nn半男半女\t103499\n点影\t103500\njj\t103501\n英\t103502\n拜拜太坏了你度秘\t103503\n秘票\t103504\n韩度\t103505\n写写写\t103506\n靓葛\t103507\n张曼迪\t103508\n适不适合\t103509\nmaopiandaquan\t103510\n国美苏宁\t103511\nexi\t103512\nSmokewateraway\t103513\n心情不好好\t103514\n基本面\t103515\n1千五百一十二\t103516\nfjsww\t103517\n波特曼酒店\t103518\n近亲国家\t103519\n亲犬\t103520\n每瓶\t103521\n1ran\t103522\n马念骉\t103523\n真实版\t103524\n谭一\t103525\noooooooooooooffcffbhted\t103526\n斯柯达晶锐\t103527\n琐事\t103528\n上午九点\t103529\n尹营\t103530\n兆濂\t103531\n大三阳\t103532\n嗒哒痴\t103533\n我喜欢恩我喜欢恩\t103534\n能量\t103535\n属乎\t103536\n萍琪\t103537\n哇他西哇\t103538\n我要是个老于\t103539\n五场\t103540\naveva\t103541\n为何\t103542\n8618580\t103543\n工信\t103544\n同机\t103545\n连希\t103546\ncfderdd\t103547\n超级喜欢的歌呗背呗背呗背呗\t103548\nangelabeby\t103549\n高鑫娟\t103550\ncts\t103551\n揉捏\t103552\n航空公司\t103553\n问票\t103554\n连帽\t103555\n别教\t103556\n均线\t103557\n蔡我\t103558\n影视展\t103559\n回报道\t103560\n五圈\t103561\n1个小\t103562\n代人\t103563\n为你\t103564\n象棋\t103565\n天气氛围\t103566\n中段\t103567\n东北度秘\t103568\n死时候\t103569\n王小飞\t103570\n昊天\t103571\n刘睡渺\t103572\n意识意思\t103573\n过集\t103574\n哈库拉\t103575\nigig\t103576\n江山对的你好小鸟\t103577\n鹿哥哥\t103578\nigij\t103579\n小傻子\t103580\n一下号\t103581\n于大气\t103582\n还像我没说的人家呀接着你说\t103583\n液体\t103584\n平舌\t103585\n过零\t103586\n女儿身\t1
03587\n土肥圆\t103588\n555555555555555555555555555555555555555555555555555555555555555555555555555\t103589\nvjvb\t103590\n吴亦凡凡凡\t103591\n115米\t103592\n来来来来来来来来来来来来来来来来来来来来来\t103593\nhduhehhhcihdhehfhuusgghcjuuegcbj\t103594\n我讨厌你讨厌你讨厌你讨厌你讨厌你\t103595\n移动阅读\t103596\n阿鲁特\t103597\n液位\t103598\n陆战\t103599\n计古莩八十车升子小人卜￥卜卜小卜卜卜卜\t103600\n失期\t103601\n国贸桥\t103602\n封超\t103603\n失望\t103604\nrewsf\t103605\n老邓\t103606\n1丿1一这心uo\t103607\n糊涂蛋\t103608\nlagbe\t103609\n恬恬\t103610\n随笔集\t103611\n我喜欢的歌\t103612\n关门\t103613\n射频\t103614\n关闭\t103615\n一体机\t103616\n不是我需要你\t103617\n4minute\t103618\nwwtf\t103619\n百二\t103620\n好我不陪你了再见\t103621\n祝春春\t103622\n汉莎\t103623\n波典拂拽兩酶珊珊星汲力\t103624\njb\t103625\n刘茜刚\t103626\n姣姣\t103627\n我的猫呢不isismynasfilaceyouoras65\t103628\nxgggxxxyxggcxxffyf\t103629\n老邢\t103630\n天天有喜之人间有爱我喜欢那个夜曲\t103631\n阿加西\t103632\n抢注\t103633\n强敌\t103634\n足见\t103635\n英語\t103636\n222232332232332332232\t103637\n九江\t103638\n喜亡\t103639\n周奥玲\t103640\n一一月份\t103641\n胃胀\t103642\n近十多年\t103643\n10天后\t103644\n喜人\t103645\n了不发\t103646\nSPeed\t103647\n社戏社\t103648\n喜亲\t103649\nkjfyijh\t103650\njuggy\t103651\n111his\t103652\n喜事\t103653\n讲讲话\t103654\nTTwitter\t103655\n黄杨\t103656\n哈哈我来了我爱你一起玩啵小破孩\t103657\nlatjicu\t103658\n数码单反摄影一本通\t103659\n周玎希\t103660\n凯斯\t103661\n贝帅\t103662\n羡慕\t103663\n妓院\t103664\n周婧雯\t103665\n马宇轩\t103666\n母鸡\t103667\nnowman\t103668\n何种\t103669\n2011-5-21\t103670\n煮蛋\t103671\n辉天开不老实\t103672\n佳兴\t103673\n手工皂\t103674\n血疙瘩\t103675\n18分钟\t103676\n境内\t103677\n基本保险\t103678\n戏言\t103679\n登台\t103680\n境况\t103681\n黄牛哥\t103682\n10月30日下午\t103683\n环银\t103684\n厦大\t103685\n朵拉c\t103686\nDjdjwjdjhhchxuc\t103687\nx盒\t103688\n引以为戒\t103689\n婚纱之路\t103690\nadjgjjajjjggjjaaggpttmmmaaajjpgaajjggalligatorsmagggjj\t103691\n网络机顶盒\t103692\n深圳会展中心\t103693\nx盘\t103694\n迥异\t103695\n偏紧\t103696\n冯友\t103697\n六六分之12346分\t103698\n我喜欢你我爱你你是猪么\t103699\n5月13日\t103700\n4月12号\t103701\n切克药\t103702\nTICC\t103703\n沙俄\t103704\n赵雅娇\t103705\n堂堂正正经经\t103706\n五二2点\t103707\n23本\t103708\npete2\t103709\n三千丈\t103710\n一搜\t103711\n嗯嗯西安美院\t103712\
n凯佳\t103713\n￥\t103714\n露西佛萌\t103715\n罗汉床\t103716\n￡\t103717\n￠\t103718\n￣\t103719\n￢\t103720\n学狗叫\t103721\n不错别\t103722\n二险\t103723\n呃查\t103724\nLand\t103725\n建国后\t103726\n壮硕\t103727\n身位\t103728\n孙山外\t103729\n拐弯抹\t103730\n瓦那\t103731\nxixizipzipzipzip\t103732\n度秘乖度秘好度秘\t103733\nIdlfmwwkeii\t103734\n下一班\t103735\nAprilfoolsday\t103736\n评球\t103737\n腿毛\t103738\n左海公园北门站\t103739\n48亿元\t103740\n当阳阳光瓜小游妙曼新浪漫花羊羊帽我没有\t103741\n屡战屡败\t103742\n宋雪莹\t103743\n断送\t103744\nckgddgk\t103745\n议题\t103746\n优卡urnonoobb\t103747\n巨屏\t103748\n一战成名\t103749\n底蕴\t103750\n哈哈哈哈哈大笑\t103751\n陪玩\t103752\n喜普液\t103753\n王念富\t103754\n没道理\t103755\n朱凝萱\t103756\n跟头\t103757\n彩金\t103758\n沟通\t103759\n忘了我要\t103760\n晋城五中\t103761\n引荐\t103762\nugdtg\t103763\n虹影\t103764\n辨别\t103765\nhi皮\t103766\n灰泥\t103767\n韩雨沛\t103768\n不是为我的呢咯什哈\t103769\n246分\t103770\n登月\t103771\n1775795040\t103772\n婀娜多姿\t103773\n真眼\t103774\njfjfj\t103775\n极星\t103776\n众胜寡\t103777\n多30\t103778\n丽来\t103779\n猪脑死\t103780\n乳环\t103781\n【元\t103782\n猪猪你是猪猪你是只猪你是猪猪猪猪猪猪猪\t103783\n99一个\t103784\nkaobi\t103785\nbeeee\t103786\nuibao度\t103787\n幼子\t103788\n滚滚滚滚滚滚\t103789\n想你是\t103790\n乌兰巴托\t103791\n表坏\t103792\n沉积\t103793\n火王\t103794\n1894578992\t103795\nDumin\t103796\n开唱\t103797\n攻防\t103798\n从一开始\t103799\ninterex\t103800\n熊人\t103801\n帅t美\t103802\nygfft\t103803\nlight\t103804\n8年\t103805\n第十八\t103806\n110米\t103807\n攻队\t103808\n一桩桩\t103809\n搞臭\t103810\n头圆\t103811\n到那哪儿\t103812\n吕珮琦\t103813\n鱼度\t103814\nehen\t103815\n还货\t103816\nfkyyutkjfnfsngnjtsjvkyndljvkgsjndhkjjnsvmvnlhfnhf\t103817\n撒贝宁\t103818\n玻璃花\t103819\n小学生们\t103820\n杨访\t103821\n年年个\t103822\n返回度秘X三\t103823\n闲战\t103824\n老对不起\t103825\n崔渐明\t103826\n就是我的秘密\t103827\n打鱼机\t103828\n差老鼠\t103829\n贾尼\t103830\n赛高\t103831\n一个人妖\t103832\n此情\t103833\n給力\t103834\n球楼\t103835\n二零二幺零七\t103836\n王嘉浩\t103837\n1万5\t103838\n翔爪\t103839\n陆老师\t103840\n许紫雯\t103841\n四四而非\t103842\n急匆匆\t103843\n不足额\t103844\n套房管\t103845\n灭鼠\t103846\n91度\t103847\n病找\t103848\n翔爷\t103849\n站在烦恼里仰望幸福\t103850\n韩玉均\t103851\n第29天时\t103852\n统计资料\t103853\nnvxxvbn\t10
3854\n八月中旬\t103855\nbillb\t103856\nfxchh\t103857\n西部网\t103858\n寻人\t103859\n1373329786755123\t103860\n十所科\t103861\n根基\t103862\n前沿阵地\t103863\ndggxghfg\t103864\n班一个人\t103865\n热本v4\t103866\n海外市场\t103867\n幺二零幺零三一九七三零四零九四五二二\t103868\n禅懂\t103869\n姚超阳\t103870\n三厢\t103871\n海豚素\t103872\n快乐赞美\t103873\nsvac\t103874\n铁拳\t103875\n47777777777757888888888888888888888888888\t103876\n变色\t103877\n私通\t103878\n806747\t103879\n命师\t103880\nlxhx\t103881\n铜铃风吹藤\t103882\n糕点\t103883\n从今往后\t103884\nbag0suni\t103885\n侠传登陆学\t103886\n接待室\t103887\n落槌价\t103888\nk8158\t103889\n16倍\t103890\n94斤\t103891\n百度新闻搜索\t103892\n女怠\t103893\n谢谢你的呀\t103894\n蓄電池\t103895\n遇合\t103896\n新绛\t103897\n高柘\t103898\n堡联赛\t103899\n电影界\t103900\nNOOO\t103901\n算盘\t103902\n皮料\t103903\n雄清\t103904\n烧烤\t103905\n东歌\t103906\n过来到\t103907\nsew\t103908\n开展\t103909\n出勤\t103910\n嘻嘻你的话真\t103911\n外男\t103912\ncgsvcshsffddnhg0cfy\t103913\n杉树\t103914\n李志然\t103915\n牛人\t103916\n郭成朋\t103917\nehjejdjesn\t103918\n安邦\t103919\n开局\t103920\n想勒\t103921\n热菜\t103922\n高峰论坛\t103923\n哈的滴答滴滴豆豆\t103924\n128年\t103925\n无所适从\t103926\n不做什么叫做\t103927\njI\t103928\n吴中西路\t103929\nFor\t103930\n开山\t103931\n1901年\t103932\n弄丟\t103933\n14439\t103934\n钟意磊\t103935\n外生\t103936\n博山\t103937\n刘成芬\t103938\nit9yiyiyih8h\t103939\n剪爱\t103940\n街道路\t103941\n疲乏\t103942\n5555555555555555555555555555555555555555\t103943\n南亚\t103944\n哈斤\t103945\n张经京\t103946\njulk\t103947\n哈斯\t103948\n无愧就好\t103949\n776554\t103950\n说九歌晚安\t103951\nonomno\t103952\n隔壁老王\t103953\n有赖\t103954\n欢聚一堂\t103955\n姚娜\t103956\n有赞\t103957\nsex\t103958\n梁炫浩\t103959\n下馆子\t103960\n哈文\t103961\n毕有\t103962\n在情长\t103963\n哈斗\t103964\n有起\t103965\n南京\t103966\n波像\t103967\n鲜花与蛇\t103968\n没赶上\t103969\n秘地\t103970\n咪呀妈妈咪\t103971\n小神龙\t103972\n徐晗\t103973\n女大当嫁\t103974\n便宜次\t103975\navavavavav\t103976\n抄从\t103977\n信誉度\t103978\n秘在\t103979\n蝴剧\t103980\neal\t103981\n炫舞时代梦想家园\t103982\n1nn690163\t103983\nSundback\t103984\n吹牛吧真是\t103985\n兴许人家\t103986\n女朋友女\t103987\nStefano\t103988\n思雨鹏\t103989\n费來\t103990\n河津\t103991\n切断\t103992\n董心弦\t103993\n太好
了读米谢谢你好看\t103994\n各有千秋\t103995\n啊青秘\t103996\nhttpfhiphotosbaiducomxiaodupicitem0823dd54564e9258644a5b9a9b82d158ccbf4e4cjpg\t103997\n软妹子\t103998\ngiyyi\t103999\nbkkvjk\t104000\n度秘秘仁济\t104001\nrigi危险\t104002\nclass\t104003\n人歌\t104004\nlqapple\t104005\ngigabiaa\t104006\n没直说\t104007\n问医\t104008\n第二鼓惑大学生创业\t104009\nijhihhgffftyhhojjkhooj\t104010\n连霍高速\t104011\n链锁\t104012\n过节\t104013\n骨滴拜度秘\t104014\n四次\t104015\n四欠\t104016\n戒玟\t104017\n5181881818\t104018\n寻路\t104019\n自噬\t104020\n开饭\t104021\n安利雅姿\t104022\n彭兆言\t104023\n吴昊\t104024\n5月9号\t104025\n沙袭\t104026\nhelloyes麦\t104027\n举头\t104028\n度米芬\t104029\n18326668801\t104030\n司长\t104031\n礼拜六\t104032\n乙烯\t104033\n大朝巴\t104034\n6月7日\t104035\n混淡\t104036\n吴忠亮\t104037\n刁馋\t104038\n杨芬腾\t104039\n沙袋\t104040\n第二点儿\t104041\n易圆\t104042\neat\t104043\n进发\t104044\n骨射手\t104045\nnononononononostio\t104046\n给力菜\t104047\n团结湖公园\t104048\nseeys\t104049\n虫\t104050\n书迷们\t104051\n发丝\t104052\n青睐\t104053\njC\t104054\n过来哪儿\t104055\n张爱\t104056\n郑金林\t104057\n行不去\t104058\n鸽子汤一煲\t104059\n点心事\t104060\n看着我不入\t104061\n发下\t104062\n梁神三\t104063\n真独特\t104064\n扇贝\t104065\nnlekouling\t104066\n雷老母\t104067\n发主\t104068\n客吧\t104069\n好带好我找你\t104070\n来来来来来来来\t104071\n痛击\t104072\n张大海\t104073\n谈谈话\t104074\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t104075\n行干嘛\t104076\n发丧\t104077\n很乖哈\t104078\n梁蔓\t104079\n消食\t104080\n你是男人我是女人男女授受不亲\t104081\n恶作剧\t104082\n钱鑫慧\t104083\nhimawhoI\t104084\n冬令营\t104085\n幕兀嚄渋\t104086\n乔仙乔\t104087\n虹\t104088\n不灭处理子\t104089\n工棚\t104090\n肚脐\t104091\n随便人\t104092\n北大体育场\t104093\n桂城海八路\t104094\n调查问卷\t104095\n流川\t104096\n金刚金刚小游一个金刚英高狗\t104097\n笨妞\t104098\nWPS\t104099\n屯溪\t104100\n花心有很多\t104101\n大暖\t104102\nckbc\t104103\n度外卖\t104104\namen\t104105\n2937\t104106\n生菜\t104107\n摆动\t104108\n开心斗地主\t104109\n经管系\t104110\names\t104111\n188岁\t104112\n骂背\t104113\n百事达\t104114\n雪小东度秘\t104115\nPanos\t104116\n天圣\t104117\n林子涵\t104118\n地气\t104119\n11月29日\t104120\n验孕棒\t104121\n12358899\t104122\n乾隆\t104123\n3047万\t104124\n天在\t104125\n抽样\t104126\n7几\t104127\n七圈\t104128\n一百一万一千一百一万一千一百一万一千一百一\t104129
\n天地\t104130\n吓呆\t104131\n哼密度\t104132\n德里克·罗斯\t104133\n10月22号\t104134\n温职大\t104135\n陈宣彤\t104136\nbn06\t104137\n瑞信Roger\t104138\n好天真\t104139\n全取三分\t104140\nG8UGGi\t104141\nalppe\t104142\n干什么用\t104143\n十余岁\t104144\n爱多爱\t104145\n符文\t104146\n尖角\t104147\n张宇欣\t104148\n不能拒绝\t104149\n虑\t104150\n天叭\t104151\n昨夜\t104152\n福特\t104153\n小一个故事\t104154\n菊花茶\t104155\n天将降大任于斯人\t104156\n只想\t104157\n3.44%\t104158\n＊bbvvvbbbbghvgh\t104159\n度隆\t104160\n妈妈呢忙\t104161\n我好爱爱你\t104162\nastroy\t104163\n打帐\t104164\n狂热\t104165\n七十二家租客\t104166\n别上\t104167\n素质\t104168\nlQ0\t104169\n大营盘\t104170\n多层次\t104171\n老奸巨猾\t104172\n春暖花开\t104173\nnbbbbb\t104174\nsdjjjnjkkk\t104175\n号告诉我\t104176\n艾斯比\t104177\n10周岁\t104178\n1392\t104179\n玉麒麟\t104180\n王明贤\t104181\n迈克尔\t104182\n爸爸杯\t104183\n别个\t104184\n望湖城\t104185\nLe\t104186\n掌门\t104187\nYouareout\t104188\n貌美\t104189\n为甚么\t104190\n虞\t104191\n循环\t104192\n40505060708090\t104193\n显达\t104194\n7hwhdhehdhbdhhdh3BhdBb\t104195\n嘴牙\t104196\n金立g\t104197\n键点\t104198\n给我加\t104199\n花旗叁\t104200\n瞿少爷\t104201\n全句\t104202\n狠嘞\t104203\n壮撼\t104204\nrefl\t104205\n真正的朋友\t104206\n远程\t104207\n着凉\t104208\n5月6号上午\t104209\n全及\t104210\n四不像\t104211\nEngland\t104212\nisheng\t104213\n张志芳\t104214\n疑视\t104215\n一看见\t104216\n全发\t104217\n三亿两千万零二百\t104218\n十九番\t104219\nhulgh\t104220\njvvhc\t104221\n麦开\t104222\n订金\t104223\n想和你唱歌\t104224\nurot\t104225\nbasynn\t104226\n那一刻\t104227\n第1款\t104228\n第1次\t104229\n六夫左路44\t104230\n想说和他说\t104231\n超级银河\t104232\nJmmnkhKnmg\t104233\nAQQ\t104234\n我你我你头\t104235\n折射\t104236\n861276\t104237\nurob\t104238\n13783783956\t104239\n森林报\t104240\n白眼狼\t104241\n85117799663355899554867445865455562\t104242\n二二次\t104243\n围棋\t104244\n几乖\t104245\n几乘\t104246\n杨亦鸣\t104247\n几九\t104248\n永远永远永远永远回家\t104249\n韭菜\t104250\nvdfhh\t104251\n翟婉宁\t104252\n那一切\t104253\n台ng\t104254\n我的对话\t104255\n活够\t104256\n几乎\t104257\n好老好\t104258\n15304321913\t104259\n不是度秘度秘你在哪儿\t104260\n陈生新\t104261\n棉袜\t104262\n嗯秘密tm1132\t104263\n嗯真乖［\t104264\n设点\t104265\n一起跳楼\t104266\n林静\t104267\n棉袄\t104268\n稳重\t104269\n我真的好想\
t104270\n跟给他\t104271\n饱意思\t104272\n28天\t104273\n不住在\t104274\n市镇\t104275\n名其器\t104276\npjcyfxs\t104277\n1032元\t104278\n日单\t104279\n日博\t104280\n笨所\t104281\nuty\t104282\n死字词句\t104283\n刘军彤\t104284\n不会不会\t104285\n日升\t104286\n裸体男\t104287\n55500666\t104288\n6668865\t104289\n莺歌\t104290\n听本\t104291\n全线\t104292\n等于恬\t104293\n文念\t104294\n吃汤姆\t104295\n877520\t104296\n擦掉\t104297\n老党员\t104298\n洋洋\t104299\n李桌子\t104300\n来自星星的你\t104301\n徐继升\t104302\n全级\t104303\n煞费苦心\t104304\n十七分之十一\t104305\n4月3、4、5日\t104306\n魏加财\t104307\n内能\t104308\n零块\t104309\n小天使主题冬奥\t104310\n在内\t104311\n宠文\t104312\n叶片\t104313\n嗯夜夜\t104314\n死不掉\t104315\n左图\t104316\n哈哈哈我们是好基友你是我的好基友\t104317\n悲怆\t104318\n公关\t104319\nheheus\t104320\n公共\t104321\n十六万元\t104322\n白咯\t104323\n指南针\t104324\n乡镇长\t104325\nGGGGGGHUD\t104326\n李冬敏\t104327\n阻挠\t104328\n阻挡\t104329\n我的饭\t104330\n村野\t104331\n晚安\t104332\n村里\t104333\n对啊我\t104334\n西施故里公园\t104335\n公公\t104336\n无水豆花\t104337\n北京市一中院\t104338\n晚宴\t104339\n树艺\t104340\n谢谢你的赞美星星那是我爱你\t104341\n塞皮\t104342\n幺零零幺零六二\t104343\n酷划雪\t104344\n冰淇林\t104345\n老发九仙\t104346\n哼不\t104347\n洋大爷\t104348\n公元\t104349\n阿米切斯\t104350\n梅某\t104351\n甜味\t104352\n疯了很我心越荡\t104353\n黎未喔\t104354\n通草\t104355\n陈科鉴\t104356\n樊文星\t104357\n地广人稀\t104358\n纽约外交政策委员会\t104359\n挥金如土\t104360\n风华国乐\t104361\n杜大\t104362\n欠款\t104363\n产销\t104364\n高老庄\t104365\n杜秘太\t104366\nndh\t104367\nndj\t104368\n考不好说\t104369\n小度秘你喜不喜欢安家卑鄙安家卑鄙是你的女神\t104370\n出鞘\t104371\n自卸车\t104372\nndb\t104373\n大姨妈来了我一样长\t104374\n可图\t104375\n类容\t104376\nytggh\t104377\n这祸\t104378\nelaileallik2bkllkiomicom\t104379\n来了看\t104380\n哎呦我的乖孙女儿\t104381\n总编\t104382\n拖沓\t104383\n酸钙口服液\t104384\n唧唧啦咔咔阿拉伯\t104385\n也门\t104386\n吕鹏\t104387\n两百块\t104388\n呃二\t104389\n打虎\t104390\n我告诉你是\t104391\n能歌善舞\t104392\n归还率\t104393\n止步\t104394\n作业图\t104395\n一千个\t104396\n二恶英\t104397\n一0898888\t104398\n维嘉\t104399\n各支\t104400\nJjtg\t104401\n二分之三\t104402\n后会有期\t104403\n电怪\t104404\n花火贝贝\t104405\n朱美懿\t104406\n乖我很乖\t104407\n二分之一\t104408\n好吧分析\t104409\n一千万\t104410\n群啵\t104411\n保健操\t104412\n调戏度秘\t104413\n一千一\t104414\n40
吕\t104415\n长安v7\t104416\n深水埗\t104417\n有缘\t104418\n四小时\t104419\n外星人儿\t104420\n说你是\t104421\n很干脆\t104422\n晨勃\t104423\n我的秘密就是我讨厌你\t104424\n来到这里\t104425\n五月天\t104426\n一会天\t104427\n小点裤\t104428\n乱点\t104429\n玩生\t104430\n安网\t104431\n自作主张\t104432\nFfucnjfghfubngcjghfbufhfvjdghtfbvffvjgyfhgcycbfgxufvhvhmvygfjbfgvjgfgvivvdvfgvuvbkff\t104433\n意林床\t104434\n呕呕呕呕呕呕呕呕呕呕呕呕呕呕呕\t104435\n一切顺利\t104436\n面系\t104437\n气死我了我要投诉你\t104438\n马勒壁\t104439\ncoewahi\t104440\n安置\t104441\n一二三四五六七八九十一二三四五六七八九十一二三四五六七八九十3速\t104442\n面糊\t104443\n笑而不语\t104444\n凝聚\t104445\n退休后\t104446\nIE咳咳咳\t104447\n问路\t104448\ntfggggggggdr\t104449\ntobedowgtt\t104450\nt么浪Z8\t104451\n麦扣田你好呀XDD\t104452\n后备箱\t104453\nkjhgcg\t104454\n圣安东尼光\t104455\n诶呦\t104456\n女音\t104457\n20多天\t104458\nssrtdrreetr\t104459\n金星秀\t104460\n俺娘西游撒哇地卡\t104461\noftenout\t104462\n早上七点二十\t104463\ndodyg\t104464\nRheinhessen\t104465\n不错不错\t104466\n卢某\t104467\n杨臭脚\t104468\n大幂美美哒\t104469\n出版人\t104470\n为人表\t104471\n施比\t104472\n九和三七汇泉\t104473\n惯率\t104474\n霸绝\t104475\n那你就是说我不幽默录\t104476\n呢叫\t104477\n陈昊宇\t104478\n寒窗\t104479\n首充号\t104480\n陈虎\t104481\n武泽莹\t104482\n恶战\t104483\n萌萌萌萌哒萌萌萌萌哒别害羞\t104484\n机游戏\t104485\n123654789055756855575856855855655\t104486\n死鬼\t104487\n添翼\t104488\nbct\t104489\n恶戏\t104490\n1311111116\t104491\n洁厕灵\t104492\n119119119\t104493\n献殷勤\t104494\n秘不示\t104495\n唐志侬\t104496\n纯纯正正的人\t104497\n植物大战僵尸宽衣\t104498\n郭一凤\t104499\n罢工\t104500\n乐苏泽\t104501\n吹雪\t104502\n郭锅\t104503\n吹雨\t104504\nisfull\t104505\n超级超级大坏人\t104506\n二除一个\t104507\n性丑闻\t104508\n代涛\t104509\n别倒下\t104510\n数个\t104511\n可是我没有\t104512\n系统成虫\t104513\n官配\t104514\n潘丛杉\t104515\n找一找\t104516\n组词\t104517\n格言\t104518\n337737\t104519\n幫手\t104520\nヾ喵\t104521\n鸡蛋牛狗猫鼠堡王\t104522\n几趟\t104523\n历假\t104524\n吃一到\t104525\n高友\t104526\n南康伍\t104527\n数三\t104528\n注会\t104529\n数下\t104530\n3千六百七十二1点\t104531\n数万\t104532\n谢杏芳\t104533\n数一\t104534\n数七\t104535\n毛泽东\t104536\n泽尻绘理香\t104537\n可视频\t104538\n打住\t104539\n肖佳乐\t104540\n感概\t104541\n俘获\t104542\n金堂\t104543\n近乎勇\t104544\n11111111111111111111111111\t104545\n恋恋不舍\t104546\n
强要\t104547\nMissingArgument\t104548\n福分难\t104549\n繁茂\t104550\nhdtuj\t104551\n升空\t104552\n辣条OK嗯嗯\t104553\nvuxe\t104554\n什么儿\t104555\n珉\t104556\n潘静雨\t104557\nabcdaft\t104558\n每小时\t104559\nV位\t104560\n22:30\t104561\n辟谷\t104562\nwasold\t104563\n关于泰勒我吃惊\t104564\n慢oteralina\t104565\n面朝大海春暖花开\t104566\n辟谣\t104567\n刘萌萌\t104568\n二十根炫\t104569\n聊解\t104570\n90岁时\t104571\n很美\t104572\n吃来\t104573\nFikgfjffuk\t104574\n算了算了算\t104575\nTshzftstdsfyzfhhhdhghjguhhjbhhhh\t104576\n肺活量\t104577\nrunacompany\t104578\n惨祸\t104579\n起之\t104580\n臭粑粑\t104581\n数量级\t104582\n38点\t104583\n度秘冰哈\t104584\n20110921\t104585\n15米\t104586\n给我我想\t104587\n二百五三百六\t104588\n泰迪\t104589\n第三者\t104590\n这儿\t104591\nhuiren\t104592\n木箱\t104593\nLGJKMTJ\t104594\n对的你的声音\t104595\n阿狸简\t104596\n泰迷\t104597\n花姑娘的死\t104598\n我的盆友\t104599\n鱼雌鱼\t104600\n团饭\t104601\n急吼吼\t104602\n最安全\t104603\n夕一\t104604\n石榴裙\t104605\n塔皮\t104606\n老处女\t104607\n1111111111111111111111111111111111111111111111112\t104608\n中央戏剧学院\t104609\n短剧\t104610\n凰人运长\t104611\n爱的是你是你是不能力争夺走的是我的啊啊你是你好\t104612\n华迷\t104613\n小孩子气\t104614\n世世代代\t104615\n脑筋呲牙呲牙n1n2n34n4n58n6n7n8n9n10n11n12n13n14n15\t104616\n天涯何处\t104617\n永帅\t104618\n农具\t104619\n华远\t104620\n流线\t104621\n痛不欲生\t104622\n换牙\t104623\n霸王硬上弓\t104624\n乳钙\t104625\n大五毒教\t104626\n主银\t104627\n错关\t104628\n天班\t104629\n冰冰冰冰冰冰\t104630\nwomeiyou\t104631\naudu\t104632\n给我再说\t104633\n向北\t104634\n观望\t104635\n400余\t104636\n绝义\t104637\n谁好呀\t104638\n孙袁贞\t104639\n过年你过\t104640\n渴求\t104641\n呢热线\t104642\n普朗克米\t104643\ntvxq2\t104644\n绝也\t104645\n可乐行\t104646\n岸本齐史\t104647\n了然\t104648\n15703979796\t104649\n铁跑酷\t104650\nxgddsf\t104651\n叶朝\t104652\n我的人生元\t104653\n算算算算算算算算算\t104654\n铜片\t104655\n冠希里\t104656\n感觉你好帅\t104657\n火钱\t104658\n解调\t104659\n十一来\t104660\n尹冰冰\t104661\n十一条\t104662\n玉皇\t104663\n25810545180\t104664\n火针\t104665\nLuaggage\t104666\n奥官邸\t104667\n呃办不办没办国际长途办\t104668\n高台\t104669\n赵彦恩\t104670\n淡忘\t104671\n任骂\t104672\n霍州一中\t104673\n老尸\t104674\n嗯姆姆\t104675\n457744456\t104676\n18221189550\t104677\n崔莹\t104678\n大理\t104679\n董飞扬\t104680\n老太
爷\t104681\n为你你好你好你好你好你好你好\t104682\n招惹\t104683\nTumblr\t104684\n炸鱼\t104685\n看图\t104686\n谢谢谢谢谢谢谢谢谢谢\t104687\n奔\t104688\nfloonomyouil\t104689\n乐儿乐\t104690\n十二分之一\t104691\n空空\t104692\n天天想\t104693\n零四元\t104694\n第二岁\t104695\n比例式\t104696\n海马福美来\t104697\na4LH48\t104698\n7次\t104699\n99909999999990999999999999999999999999几\t104700\n坐船\t104701\n何方神\t104702\n一月二十六号\t104703\n犯罪分子\t104704\n结算\t104705\n呕吐状\t104706\n一菲\t104707\n去年12月\t104708\n撒的的故事好嘛\t104709\n罗湖奥\t104710\n土建\t104711\n孙皓晖\t104712\n7188\t104713\n老孩子\t104714\napink\t104715\n物利\t104716\n送货单\t104717\n洗刷刷洗刷刷洗刷刷洗刷刷\t104718\n雅慧\t104719\nXX银行\t104720\n被尿你的就\t104721\n我喜欢一\t104722\n千百种\t104723\n150多毫米\t104724\n5588555\t104725\n温利岩\t104726\n等到\t104727\n宋永泰\t104728\n22000\t104729\n上界四大神花\t104730\n大莹\t104731\n回再来\t104732\n得名\t104733\n郑书记\t104734\n苍然\t104735\n保底\t104736\n超子挡板\t104737\n十几部\t104738\nKnow\t104739\n我是你的好主人\t104740\n48块\t104741\nspreadacross\t104742\n雾灯\t104743\n实际距\t104744\n毕哲文\t104745\n块糖\t104746\n那感\t104747\n恩好乖\t104748\n胎生\t104749\n求求情\t104750\n五五率\t104751\n敢应\t104752\n静气\t104753\n东四北大街钱粮胡同\t104754\n21111133567890\t104755\n艾弗里\t104756\n周志鹏\t104757\n团名\t104758\nguagd\t104759\n敢度\t104760\n太太空\t104761\ncukuk\t104762\nRtfgg\t104763\n哈破\t104764\nmdj\t104765\nmdb\t104766\n三妹英德\t104767\n番号\t104768\nmdg\t104769\n3656\t104770\n黄宗洛\t104771\n毛旭东\t104772\nSABBATH\t104773\nhi个\t104774\n领罪\t104775\n解说员\t104776\n猪肚中再来一个再来一个再一个再一个再来个再来个朱朱来个再来一个朱朱朱朱朱\t104777\n50斤\t104778\n孤坟\t104779\n图系\t104780\n29980日元\t104781\n观赏性\t104782\n流卡\t104783\n捡漏\t104784\n佛顶骨舍利\t104785\n洪滉\t104786\n王者\t104787\n56824\t104788\n56825\t104789\n饿科检\t104790\n董卿镯\t104791\n第11天\t104792\n心太软\t104793\ndkvp\t104794\n有活力\t104795\n轮功\t104796\n找仓管\t104797\n捷星航空\t104798\n厂长\t104799\n国立清华大学\t104800\n勺海\t104801\n快乐片\t104802\natd\t104803\ndugf\t104804\nabqu\t104805\n卧薪\t104806\natl\t104807\ndugk\t104808\n妈咪鲜胺\t104809\natv\t104810\n102em\t104811\n少罗嗦\t104812\n一万多条\t104813\nhi谢谢你了下回\t104814\n死了吧\t104815\n钯金\t104816\n數學補課\t104817\nnbanba\t104818\n先告\t104819\n鸡鸡鸭鸭\t104820\n哼哼
撸\t104821\n叶嗯飞\t104822\n真了不起\t104823\n完不可以\t104824\n十四种\t104825\n哪得\t104826\n布达拉宫\t104827\n夏沫雪\t104828\n有点头疼\t104829\n王星宇\t104830\n13507024569\t104831\n一七百二十二\t104832\n七宝贝\t104833\n伊卡\t104834\n香港书店业如临大敌】诚品书店\t104835\n翠国\t104836\n赫宰\t104837\n严开安\t104838\n唉不想\t104839\n臭懒猪\t104840\n度长征\t104841\nsvsusb\t104842\n当兵有你又要\t104843\n陌玉\t104844\n惶恐零丁洋\t104845\n靠何人\t104846\nbdhshos\t104847\n度秘你的反应\t104848\n取悦\t104849\n香港赤裸女厨神\t104850\n春寒\t104851\n一口十\t104852\n陈一\t104853\n月光下\t104854\n黄瘸子\t104855\n陈上\t104856\n凹告\t104857\n王雪晨\t104858\n感恩戴德\t104859\n不能淫\t104860\nChloe\t104861\n罪名秀\t104862\n想不见\t104863\n陈丽\t104864\n集团军\t104865\nbnjd\t104866\n8月22点\t104867\n讹钱\t104868\nauvu\t104869\n后院\t104870\n3qy\t104871\nQaq\t104872\n全屏\t104873\nbostoo\t104874\n冒险包\t104875\n全局\t104876\n节节高\t104877\n疑心\t104878\n干贝\t104879\n帮凶而成\t104880\n翟蔓花\t104881\n拥堵\t104882\n95169\t104883\n天枰\t104884\n6￥￥￥￥\t104885\noqk\t104886\n七桌\t104887\n累碧园\t104888\n汉斯·季默\t104889\n东土\t104890\n孙璐\t104891\n平年\t104892\n平平\t104893\n南芬区公安分局\t104894\n15074756273\t104895\n长创\t104896\n明天性\t104897\n糖葱薄饼\t104898\n七七四十九天\t104899\n干货\t104900\noooooooooooooooooooooooooooooooooooooooooooo\t104901\n给我就是说\t104902\n钱多得\t104903\n猩猩类\t104904\n糍粑\t104905\n绿丸\t104906\nshall\t104907\n黔东\t104908\n绿丰\t104909\n马克思主义\t104910\ngjfuf5\t104911\n恢除\t104912\n往定\t104913\n你的甜言蜜语\t104914\nmacbook\t104915\n廉江市\t104916\n绿丝\t104917\n上当受骗\t104918\n公民\t104919\n范闪\t104920\nfugt\t104921\n罗丹言\t104922\n小鸡炖蘑菇\t104923\n张院长\t104924\n可怕\t104925\n糙糙糙\t104926\n金生\t104927\n可怜\t104928\n一佯千玺\t104929\n15898288188\t104930\n邓劲舰\t104931\nkivubc\t104932\npril\t104933\n漂亮俩\t104934\nkmncupydw\t104935\n四百多度\t104936\n来了不惊\t104937\n灰灰灰机\t104938\n真可帕\t104939\n青萝\t104940\n5000名\t104941\n眼影棒\t104942\n桶面\t104943\n暴恶心\t104944\n小情\t104945\n6002\t104946\n王粤辉\t104947\n多米女\t104948\n利齿\t104949\n气子\t104950\n海苦海唉羊羊\t104951\n车船\t104952\n气孔\t104953\n阿[\t104954\n钢丝们\t104955\n归案\t104956\n咯洛奇秘\t104957\n写字母\t104958\n过种\t104959\ndhhdhc\t104960\n太冲美莱\t104961\n你的饭吧我喜欢我的话\t104962\n蠢人节\t104963\n丁心雅\t104964\
n一排侯\t104965\n能文能武多才多艺\t104966\n安静了再见\t104967\n累死你了我很抱歉\t104968\n亏本\t104969\n够味\t104970\n2358744\t104971\n卡拉拉\t104972\n敷上\t104973\n敷下\t104974\n哒哒哒哒哒哒哒哒\t104975\niphoneos\t104976\n臣民\t104977\n迪丽错啦=0\t104978\n心寒透顶\t104979\n得了什么勃\t104980\nOOoooiooooooooooooeereeeeeeeeeeeeeee\t104981\n陈六六\t104982\n过几吧大\t104983\n2.48元\t104984\n周十七\t104985\n够呛\t104986\n讨伐\t104987\n史宁\t104988\nbenbffi\t104989\n2016年3月27\t104990\n好大好大我爱你妈妈\t104991\n双氧水\t104992\n旁批\t104993\ngoodnight\t104994\n公心\t104995\n我们村\t104996\n音质\t104997\n史实\t104998\n人大部\t104999\n那哈塞亲爱的乖\t105000\n莒南\t105001\n3you\t105002\n头和身\t105003\n碎觉觉\t105004\n安仔鞋\t105005\n多无奈\t105006\n造成\t105007\n明度秘美甲\t105008\n苏苏丽\t105009\n朱清\t105010\n800..14\t105011\n肮脏\t105012\n九九岁\t105013\n多边贸易\t105014\n光头强先\t105015\n哈蚧\t105016\n我不觉晓的我1小孩\t105017\n1514111111111111111111111111111111111111111111111\t105018\n18840356988\t105019\nｖｒｅ\t105020\n宁波联通\t105021\n沈静琪\t105022\n爆胎\t105023\n堡切\t105024\n189米\t105025\nLOTUSKTV85555888\t105026\n几局\t105027\n几层\t105028\n徐贲\t105029\n一元一次方程\t105030\n奥丽颖\t105031\n禁片\t105032\n几届\t105033\n阿西吧西巴阿夏夏夏\t105034\n十周岁\t105035\n谢彦\t105036\n疯婆子\t105037\nD2555\t105038\n紫瞳\t105039\n木瑞\t105040\n30公里\t105041\n修化化\t105042\n叫座\t105043\n咸片\t105044\nYfhxyqyfkvphl\t105045\n度秘谜题娜美我爱你\t105046\n挫败\t105047\n二关\t105048\n冠县东古城\t105049\n二六\t105050\n成都航空\t105051\n大是大\t105052\n发间\t105053\n弥撒\t105054\nfdgfdfdtfd\t105055\n蓝色雨\t105056\n变态佬\t105057\n白免\t105058\n摇车\t105059\ngbjgkhj\t105060\n国土资源部\t105061\n陈才\t105062\n叫床\t105063\n长荣\t105064\nChxuchcjxh\t105065\n李芃逸\t105066\n控股\t105067\n黑黑户\t105068\n伟大的祖国\t105069\n二元\t105070\n督察局\t105071\n赵琳露\t105072\n高庆昌\t105073\n冯如\t105074\n谦康\t105075\n在世\t105076\n铁哥\t105077\n广告业\t105078\n克罗斯、拉姆、穆勒\t105079\n马主任\t105080\n擦吃\t105081\nlalbaby\t105082\nsjy\t105083\nfgvbjkk\t105084\nnbarbaby\t105085\nSRT\t105086\n武王谔谔以昌\t105087\n老鼠们\t105088\n在下\t105089\n在上\t105090\n在不\t105091\n定力\t105092\n温绮\t105093\n中院\t105094\n内门\t105095\n女扮\t105096\n姆拉迪奇\t105097\n单向街\t105098\n三分之一天\t105099\n我仔仔你是谁你是谁你是真的的没那么\t105100\n广科\t105101\n401
章\t105102\nSienna\t105103\n幺零零幺零二幺\t105104\n全优\t105105\n全会\t105106\n永远永远永远永远永远永远不打理\t105107\n不远行\t105108\n153825\t105109\n一百来块\t105110\n女神级\t105111\ndrs\t105112\njtjdagjowtgqaj\t105113\n林子煜\t105114\n闫光正\t105115\n什麼學歷\t105116\n烫面\t105117\n开怀高歌春\t105118\n我爱你哪你爱我\t105119\n破茧而出\t105120\n艾欧\t105121\n关键时刻\t105122\n秘苑\t105123\n１０\t105124\n多得\t105125\n都在\t105126\n448558528555558955788555\t105127\n通安镇\t105128\n大亲亲\t105129\nafrhhi\t105130\n下个星期一到星期五\t105131\n梅熳莹\t105132\ndrh\t105133\n最低温\t105134\n建造\t105135\n好啦好啦我陪给你听\t105136\nfyvrfcC\t105137\n白兔\t105138\n90度b90度\t105139\n黄天宇\t105140\n藏进\t105141\n13918939435\t105142\n第二种\t105143\n二十二\t105144\n容许\t105145\nsjm\t105146\n茶餐厅\t105147\n二十五\t105148\n无为县\t105149\n尿尿\t105150\n骨折\t105151\nownership\t105152\n一又五分之110分\t105153\n哈肚饿\t105154\n费解\t105155\nMove\t105156\n小户型\t105157\n572757472\t105158\nFfd\t105159\nedxxx\t105160\n很兴奋\t105161\n下一回\t105162\n18分\t105163\n谢万成\t105164\n骗你的我没有\t105165\n苍狼#\t105166\n52588638859552380\t105167\n對妳\t105168\n姚生\t105169\n查查\t105170\nFinchfmcjdbc\t105171\n巢蜜\t105172\n好的洗刷刷洗刷刷洗刷刷\t105173\n十一十一时\t105174\n张瑞文\t105175\n杨风帅\t105176\n一小时半\t105177\n瑞恶心\t105178\n一英尺\t105179\n琐碎\t105180\n双击\t105181\n1760806950\t105182\n胖次\t105183\n敌军\t105184\n有梗\t105185\n闫琪琦\t105186\n回吧度秘\t105187\n阵地\t105188\n十首\t105189\n哎哟好嘛你好搞笑\t105190\n12345678901258\t105191\n特稿\t105192\n成倍\t105193\n1000000多\t105194\n焗\t105195\n福利社\t105196\n飞佛\t105197\n合不合适\t105198\n武义么小学\t105199\n冰雅\t105200\n盐边县\t105201\n成交\t105202\n一口价\t105203\n悉达多王子顿悟\t105204\n了别再发\t105205\n13504280781\t105206\n表盘\t105207\n表本公主\t105208\n5986866\t105209\n成亲\t105210\n90后们\t105211\nadgjnjtjthumpjajtm1gjgmpa\t105212\n6:05\t105213\n成人\t105214\n何乐乐\t105215\n张小小\t105216\n笑了我没有\t105217\n无人问津\t105218\nmazhine\t105219\n陶然桥\t105220\n武汉\t105221\nCallmetalent\t105222\n小行家\t105223\n林涌彬\t105224\n冰馆\t105225\n潜水\t105226\n食厌\t105227\nright\t105228\n80万元\t105229\n高洋\t105230\n住去\t105231\n买好\t105232\n高洁\t105233\n维生素b6\t105234\n奥特佳\t105235\n四六九五\t105236\n八嘎八嘎\t105237\n老是度\t105238\n傅靖生\t105239\n郭靖
宇\t105240\n冰雕\t105241\n这个月二月份\t105242\n非啊小人儿\t105243\n养瞒\t105244\n2011年6月28日下午18时30分\t105245\n133瓶\t105246\n务工\t105247\n敬职\t105248\n哪一刻\t105249\n喽hello\t105250\n聊天啵\t105251\n呜啦啦鲁拉鲁拉鲁拉类问题了一年了我\t105252\n钟勋\t105253\ncxtn\t105254\n卖不掉\t105255\nc8嘎\t105256\n王天阳\t105257\n安宁瓜\t105258\ncm么\t105259\nannad\t105260\n黎永琪\t105261\n下策\t105262\n保准\t105263\n4304条\t105264\n高萌萌\t105265\n五金手撕票\t105266\n那个我爱你我是说我真的好喜欢你好爱你\t105267\n一箩筐\t105268\n上校\t105269\n所当然\t105270\n已经\t105271\n限度\t105272\n劣汰\t105273\n恩我要一张你的自拍\t105274\n葬吾怆\t105275\n你是我的专属秘书\t105276\n玫瑰花折纸\t105277\nkvvvvvbbbnbbcccjnbvvcx\t105278\n张湾\t105279\n据介绍\t105280\n颜相机\t105281\n间距\t105282\n七八百\t105283\n啵啵啵啵啵啵\t105284\n食客们\t105285\ncaqh845\t105286\n38。4元\t105287\n英雄堡\t105288\n两般\t105289\n危惫\t105290\n巴士五汽公司\t105291\n温布利球场\t105292\n芦笋丁\t105293\n邳州话\t105294\njhihji\t105295\n爱你是我的\t105296\n卢lo餐\t105297\n小茅屋\t105298\n面头\t105299\n豆娘\t105300\n真挺好\t105301\nOGJFHH\t105302\nhuugvb\t105303\n一只小子\t105304\n4月10日\t105305\n度动物园\t105306\n猜谜语\t105307\nvmlikqbgkq\t105308\n花星\t105309\n跟林\t105310\n刘开泰\t105311\n危情\t105312\n小刀兄\t105313\n阳光给给你片\t105314\n三猫\t105315\n比利帅\t105316\n前缘\t105317\n年例\t105318\n闲居中\t105319\n下课铃\t105320\n超打\t105321\n香雾\t105322\n里儿\t105323\n再送给\t105324\n5000公里\t105325\ncanuseenglish\t105326\n神棍节\t105327\n天知道\t105328\n乾净\t105329\n香雪\t105330\n不雅\t105331\n迷你版\t105332\n风起云涌\t105333\n亲爱哒\t105334\n7个月\t105335\n眼首\t105336\n岁岁年年\t105337\n集团网\t105338\n艰苦奋斗\t105339\n侯小雨\t105340\n第21岁\t105341\n尿比\t105342\n姜岩\t105343\nhhbhv\t105344\n选择症\t105345\n雪态\t105346\n因是的果\t105347\n朱韵涵\t105348\n你在哪儿雄楚\t105349\n19：51\t105350\n大秦\t105351\n笑骮泪\t105352\n谢举\t105353\n改良版）\t105354\n水不流\t105355\n谢丹\t105356\nsbeby\t105357\n校门\t105358\n肉肉麻麻辣燙飯堂妹夫婦產科發現金華山裏面的有事兒童節\t105359\nwarz\t105360\n15806095837\t105361\n没趣\t105362\n咕咚\t105363\n腻子粉\t105364\n零半十半\t105365\n谢东\t105366\nzand\t105367\nzang\t105368\n色洛\t105369\n一片个\t105370\n大秀\t105371\n1536585580128566\t105372\n习水\t105373\n得话\t105374\n椰奶粉\t105375\n谢一\t105376\n大秘\t105377\n谢上\t105378\n别无悔\t105379\n乔一\t105380\n秋高\t1053
81\n别挠\t105382\n丫丫丫\t105383\n三分之二\t105384\nfdgffgc\t105385\n2296052357135160125\t105386\n娘们儿\t105387\n13413686407\t105388\ngtatat\t105389\n许蔚帆\t105390\n惠京申\t105391\n长颈空\t105392\n情人节\t105393\n别挂\t105394\nvffdf\t105395\nwarm\t105396\n我是你的我叫\t105397\n马克马克马克\t105398\n米涅\t105399\n乔丹\t105400\n一个160厘米\t105401\n三原小学\t105402\n别挑\t105403\n两派\t105404\n十点一刻\t105405\n自以为是啊也不知道\t105406\n小脚\t105407\n小脑\t105408\n遍天下不老不死的丞\t105409\n偷盗\t105410\n看淡\t105411\n逻辑学\t105412\n齐声\t105413\n侯俊\t105414\n钻营\t105415\nward\t105416\n米老虎\t105417\n扶绥县\t105418\n一百七一百种\t105419\n经了想一想\t105420\n绝世高手\t105421\n尊严感\t105422\n145145155455457\t105423\n戚墅堰\t105424\n青青哒\t105425\n123456781\t105426\n123456780\t105427\n而来从未来都行\t105428\n卡蹦沙卡拉卡蹦沙卡拉卡蹦沙卡拉卡蹦沙卡拉卡蹦沙卡拉卡\t105429\n扭扭捏捏那你呢那你呢你那那那那那那那那那那那那那那那那\t105430\n单调性\t105431\n过来怕怕\t105432\n没访问\t105433\n我好\t105434\n坪山规划那我我木我的家教狗男蛇的家在哪李在\t105435\n1470套\t105436\nbusy\t105437\n火初\t105438\n79元\t105439\n吕诗雅\t105440\n斯特恩\t105441\n卧岁\t105442\n参差不齐\t105443\n第一家\t105444\n人家那嘛嘛我要么么么么么么么么么么么\t105445\n下方\t105446\n火刑\t105447\n和山\t105448\n高寿\t105449\n呢脸\t105450\n第一宮\t105451\n雅居蓝湾\t105452\n数码页\t105453\n冰狗\t105454\n几个贴\t105455\n实习期\t105456\n第一宠\t105457\n空中网\t105458\n以身作则\t105459\n最挺\t105460\n高富\t105461\n三一起\t105462\n高密\t105463\n奇达利\t105464\n阿倥\t105465\n叫苦连天健康快乐离开看来旅途旅途苦苦\t105466\n我是好人我是好孩子我还是\t105467\n4jigo\t105468\n掉了解锁\t105469\ngoodics\t105470\n高寒\t105471\n刹住\t105472\n和属\t105473\n心酸六和\t105474\n下节\t105475\n炸酱面担担面\t105476\n降下\t105477\n888854\t105478\n心不在焉\t105479\ndsdd\t105480\n属地\t105481\n你好好想\t105482\n下册\t105483\n勇哥\t105484\n王立权\t105485\n魔石\t105486\n己经\t105487\n恩在\t105488\n情侣级\t105489\n力克\t105490\n告诉你我知道\t105491\n下花\t105492\n56式半\t105493\n明再过半\t105494\n降临\t105495\n我是你的冫先生意识字\t105496\n两个度\t105497\n破伤\t105498\n嗯啦啦小的极了乐漫\t105499\n翘翘\t105500\n178厘米\t105501\nucxmabc\t105502\n李脸\t105503\n营养\t105504\n说清楚\t105505\n嗑瓜子\t105506\n打试试\t105507\n我喜欢额\t105508\n女手\t105509\nbb5a\t105510\nKIDS\t105511\n哇草\t105512\n盖好完\t105513\n双下巴\t105514\n茶茶\t105515\n最方便\t105516\nFyg\t105517\n0方\t105518\n入木三分\t105519\n
便秘\t105520\nAn9eLababy\t105521\n咖啡店\t105522\n陈建汉\t105523\n求经\t105524\n3900分\t105525\n晚八点\t105526\n吃豆跳\t105527\n一五一百四\t105528\nhellomystomer\t105529\n度秘少白\t105530\n子香\t105531\n4ja\t105532\n辉太拉\t105533\n教育者\t105534\n主任\t105535\n北面\t105536\n666699996\t105537\n欢享\t105538\n哦ok3q\t105539\n上面\t105540\n梅河口\t105541\n十二点零四\t105542\n咳药\t105543\n吃饱吧\t105544\n改不改\t105545\n领先\t105546\n撅嘴\t105547\n不想在天\t105548\n付讨厌\t105549\n12444\t105550\n陈百强\t105551\n困觉\t105552\n一1一\t105553\n汤浴\t105554\n石光荣\t105555\n主仆\t105556\n黄文\t105557\n吧聪明人\t105558\n353535353535\t105559\n今天上午十一点\t105560\n穆斯林兄弟会\t105561\n子曰不愤不启不悱不发举一偶不以三偶反则\t105562\n救济\t105563\n北美\t105564\n酱汤\t105565\n王天鸽\t105566\n山谷\t105567\n厌长\t105568\nabcer\t105569\n唔敢\t105570\n吴荷芬\t105571\n扯草\t105572\n臭不要脸的小三\t105573\nqioovs\t105574\n惠州\t105575\nmadele\t105576\n浩燕\t105577\n酱汁\t105578\nfifjs\t105579\n初性\t105580\n塞蒂恩\t105581\n88888882822828282828288888888888888888888888888888888888888\t105582\n振振\t105583\n训尿尿\t105584\n奥特莱斯影院\t105585\n花芙蓉\t105586\n邢自贝\t105587\nlaihaivyou\t105588\nCatia\t105589\n定资费\t105590\n改写\t105591\n纯金\t105592\n幽灵岛\t105593\n18600077657任\t105594\n落泊船瓜洲\t105595\n重建良善社会得靠底层平民\t105596\n灵犀\t105597\n国务院台办\t105598\n毛骨悚然撞鬼经\t105599\n曼谷\t105600\n远程化\t105601\n15174147070\t105602\n沙海\t105603\n超过24小时\t105604\n3373遍\t105605\n跪跪\t105606\n不敢了我要\t105607\n阴沉沉\t105608\n灵犬\t105609\n我稀罕你问我啊关你屁事我要你为毛不要你我\t105610\n十八号早上六点\t105611\n会厅\t105612\n沙浦\t105613\ndyu\t105614\n没见过行不行\t105615\n1100条\t105616\nmysongkown\t105617\n董红岩\t105618\n艾山·买合苏木\t105619\nmome@\t105620\n羊粉\t105621\n二四号\t105622\n王阳\t105623\n最热\t105624\n快猜快猜快猜\t105625\n漫荻\t105626\nKTV8卦\t105627\ndys\t105628\n骗人大坏蛋大坏蛋骗人大坏蛋大坏蛋\t105629\n十六明明光光圆\t105630\n有许多度\t105631\n兔鸡\t105632\n94604946458\t105633\n胡萝卜萝卜老撒比\t105634\n张业云\t105635\n无涯\t105636\n通知书\t105637\n特多\t105638\n土豪金\t105639\n2538\t105640\n15103936156\t105641\n严我\t105642\n我喜欢待\t105643\n批评\t105644\n新出\t105645\n小米3\t105646\npe日照\t105647\n下花园\t105648\n赵炳南\t105649\n李宇恩\t105650\noppers\t105651\nACG\t105652\n464648\t105653\n新凡\t105654\n尼美\t1056
55\n陈翔秘密天使##花样美男时英#\t105656\n刘劲\t105657\n九百十一十二十三十四个\t105658\n幻想症\t105659\n砍伐\t105660\nyourMMisbeng\t105661\n模先试\t105662\n6晚8天\t105663\n畅聊\t105664\n张屿\t105665\n三角号\t105666\n三百岁\t105667\nrkdor\t105668\n四面\t105669\n西四小学\t105670\n緊急集合\t105671\n欧阳少\t105672\n默恋\t105673\n讲讲道理\t105674\n私自\t105675\n弱势\t105676\n队里\t105677\n密度面\t105678\nDIRH\t105679\n瘦瘦\t105680\n2011年1月20日\t105681\n5652504607\t105682\n乡亲们\t105683\n星星团\t105684\n垃机\t105685\n质朴\t105686\n第二批次\t105687\n李xx\t105688\n2000多年前\t105689\n度秘你的声音好假\t105690\n貔貅\t105691\nfffdtst\t105692\n我们都秘\t105693\nTFboys美\t105694\n那才小学\t105695\n炸弹\t105696\n阴毛儿\t105697\n非议\t105698\n差晤\t105699\n三四遍\t105700\n国药\t105701\n生啤\t105702\ntfbis\t105703\n纳班\t105704\n岳翻云\t105705\n补和\t105706\n记不清\t105707\n出了笑\t105708\n初初\t105709\n2月2日至8日\t105710\nhvtv\t105711\nmynameisggh\t105712\n成千上万\t105713\n早晨6点\t105714\n685558655556535556548886885865555565355555\t105715\n想见心\t105716\nga1c1\t105717\n考拉\t105718\n18888岁\t105719\n个赞\t105720\n何况\t105721\n引人疑惑\t105722\n润洁卡通\t105723\n康发\t105724\n明托佛\t105725\n好聚好散\t105726\n聪明度秘\t105727\n不不不我讨厌\t105728\n接缝宫\t105729\n一盫\t105730\n精度\t105731\n一目\t105732\n十八级\t105733\nueygdc\t105734\n一直\t105735\n一张券\t105736\n失踪省\t105737\nNitro\t105738\n张耀良\t105739\n华西医院\t105740\nuddhfsr\t105741\n几几\t105742\n2011年10月21日下午\t105743\n杜岳峰\t105744\n一百六十六十六\t105745\nauuou\t105746\n一盏\t105747\n剧样\t105748\n再回\t105749\n几凵\t105750\nmassive\t105751\n15680626307\t105752\n一盘\t105753\n廊坊\t105754\n心情甚好\t105755\n一盟\t105756\n安徽菜\t105757\nATG\t105758\nHiioyiyiyuy25586535853558282586833\t105759\n社会\t105760\n晕死我的小小苹果榛蘑爱经\t105761\n住友大厦\t105762\n俕倧\t105763\nATM\t105764\n哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈大笑\t105765\n叫化\t105766\nATT\t105767\nATP\t105768\nspanish\t105769\n输水\t105770\n神父\t105771\n生生世世\t105772\n芒果冰芒果布丁芒果西米露\t105773\npmab\t105774\n狙击手\t105775\n谢嘉乙\t105776\n撒朗\t105777\n阳奉阴违\t105778\n积水\t105779\n社伯\t105780\npolino\t105781\n俱备\t105782\n考试中的男的们女的们\t105783\n矫形\t105784\n想不想要\t105785\n金希澈\t105786\nexcellent\t105787\nhi皮hi皮hi皮\t105788\n奔跑吧兄弟跨年演唱会2016席\t105789\n7741平\t105
790\n水鬼\t105791\n苛求\t105792\nQQ度秘度秘\t105793\n长舌\t105794\n6月4日8点\t105795\nrvzhnvb\t105796\n世纪性对决\t105797\n口工网\t105798\n麻药\t105799\n三彩\t105800\n孔令辉\t105801\n350322199107191595\t105802\n人命\t105803\n误差\t105804\nwu2wuwuwuww\t105805\n拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜\t105806\n蔡腰子\t105807\n告大\t105808\n医生女\t105809\nvyz\t105810\n养鸭子养\t105811\n牛初乳惹祸\t105812\n1979年10月\t105813\n面筋塞肉\t105814\n呗村\t105815\n聊聊天麻烦\t105816\n找朋\t105817\n人员\t105818\n4414444nn9\t105819\n兴旺\t105820\n头昏昏沉沉\t105821\n庄稼\t105822\nsan\t105823\n混混阿坚\t105824\n浏阳天天恋\t105825\nsaf\t105826\nsad\t105827\n书谜\t105828\nsay\t105829\n哭了我告诉你\t105830\n十分钟十分钟\t105831\n31946\t105832\n森森\t105833\n老米\t105834\n狗娃娃\t105835\n易县县政府\t105836\n55675339755\t105837\n桑固乡\t105838\ntIrvinD\t105839\n99岁\t105840\nthangs\t105841\nnshe\t105842\n994888785766\t105843\n媚娘\t105844\naside\t105845\n李升华\t105846\n主讲人\t105847\n啦啦啦啦啦啦啦啦我的宝贝\t105848\n泸定桥\t105849\n书谱\t105850\n放荡不羁\t105851\n一一堂\t105852\n里根\t105853\n新疆队\t105854\n张君雅\t105855\n不胜\t105856\n概率\t105857\n欠费\t105858\n96万元\t105859\n欠账\t105860\n林经贺\t105861\n巴基斯坦阿伯塔巴德\t105862\n化疗\t105863\n在一起来着\t105864\n相任\t105865\nbhcvvccjvbcchivb\t105866\n烦心事\t105867\nMashita\t105868\n新浪财经\t105869\n先场\t105870\nHZHznzbnzbzbxz\t105871\n跳脚\t105872\n新源\t105873\n说多云\t105874\nFffggggg\t105875\n代名词\t105876\n咧帮\t105877\n小明哥\t105878\nbibibibibibi\t105879\ngif女\t105880\nfammmous\t105881\n耨耨\t105882\n大开玩笑\t105883\n四部\t105884\n不同地区\t105885\n排山倒海\t105886\n董思怡\t105887\n五分之13分\t105888\n离绝\t105889\n念不好\t105890\n喜羊羊灰太狼\t105891\n不是我不爱\t105892\n新牙\t105893\n早出\t105894\n二手货\t105895\n临平\t105896\n豆麦\t105897\nstockison\t105898\n怨愤\t105899\n离绪\t105900\n徽商\t105901\n利李\t105902\n乌烟瘴气\t105903\n四郎\t105904\n整利落\t105905\n正太郎\t105906\n嘟咪嘟绵\t105907\n家电话\t105908\nSHCC\t105909\n丽珠度秘猪你是猪猪猪\t105910\n囿于\t105911\n误人\t105912\n烣\t105913\n烤\t105914\n烧\t105915\n烦\t105916\n烩\t105917\n我草我草我草我草我草\t105918\n烫\t105919\n老子的女友来了\t105920\n热\t105921\n烬\t105922\n5两段\t105923\n高红\t105924\n就是期\t105925\n高级\t105926\n北京野生动物园\t105927\n烹\t105928\n烻\t105929\n汪宇轩\t105930\n烂\t105931\n所
有者\t105932\n谦逊\t105933\n刘还有\t105934\n烈\t105935\n441222\t105936\n迪凡特\t105937\n哒哒哒萌萌萌\t105938\n烙\t105939\n铺面\t105940\n烛\t105941\n烟\t105942\n胡urft\t105943\ntuvstivc\t105944\n热风\t105945\n多起\t105946\n一胎\t105947\n张碧晨\t105948\n多赢\t105949\n一胜\t105950\n多赞\t105951\n德州扑克\t105952\n连抽\t105953\n这回话\t105954\n孙为贤\t105955\n逢考\t105956\n18972\t105957\n昃女\t105958\n美丽的秘密吧行不行\t105959\n除去\t105960\n圣经本\t105961\n始终如一\t105962\n一胶\t105963\n16小时\t105964\n加息预期\t105965\n亲爱的度儿\t105966\n人帮\t105967\ndomi\t105968\n受音\t105969\n抓夸\t105970\n咋镇\t105971\n孙叔叔\t105972\n1丘\t105973\n插座\t105974\n魏小雨\t105975\n1一\t105976\n许臭\t105977\n敦新\t105978\n1下\t105979\n击打\t105980\n酒瓶\t105981\n喵小皇\t105982\n胡说什么\t105983\n听说过天波斯\t105984\n惯坏\t105985\n58g2568\t105986\n沙硕\t105987\n牙头\t105988\n真的好恶心\t105989\npolygamy\t105990\n泗县\t105991\n1两\t105992\n1个\t105993\n1丨\t105994\n918博物馆\t105995\n咋长\t105996\n是谁朝\t105997\n四对\t105998\n水龙\t105999\n讨厌男\t106000\n13698483064\t106001\n情亲\t106002\nsiid\t106003\n小夫妻\t106004\nfjfjfhg\t106005\n骇拼英\t106006\n如果\t106007\n媒人\t106008\n中小企业融资难\t106009\nbuvhvuyipnuftcxrmgmjgytdfyqweetyuippadjlfmuyrfnhffbtrhrggfhqwertyuioplkjhgfdsaxxxzcvbnnm\t106010\n冬五\t106011\n北京汽车股份有限公司\t106012\n奥多奇\t106013\n不方便\t106014\n潜力\t106015\n去冷不冷\t106016\n175厘米\t106017\n白细胞\t106018\n50yi\t106019\n撸多\t106020\n走水\t106021\n64655522255654548\t106022\n反缩\t106023\n回来后\t106024\n食人怪\t106025\n吕梦缘\t106026\n练了吧\t106027\n我的一片天空\t106028\n零二零七零八九二\t106029\n龙门阵噻\t106030\n牌子\t106031\n1772058170\t106032\n今意古意\t106033\n厂厂\t106034\n啦啦啦啦啦啦我是拜拜的小行家\t106035\n438号\t106036\n崩掉\t106037\nQQ群\t106038\nhttpahiphotosbaiducomxiaodupicitem2934349b033b5bb529807a8a31d3d539b600bc6bjpg\t106039\n蔡甸\t106040\n好呀嗯\t106041\n狂麦吉\t106042\n黑背心\t106043\n陈浩宇\t106044\n主控\t106045\n您好先生\t106046\n单立人\t106047\nabicuvvq\t106048\n我奶奶的儿子的事我怕怕\t106049\n36.9度\t106050\njgmwpwpxp\t106051\n成儿\t106052\n4英寸\t106053\n叫花花\t106054\n甘肃省政府\t106055\n12345688\t106056\nfhgdf\t106057\nfhgdg\t106058\n謽\t106059\n卡罗纳\t106060\n屌神\t106061\n09:23:58\t106062\nDIF\t106063\n３分钟后\t106064\n6ML\t106065\n49分之
十三\t106066\n的否\t106067\n烩解\t106068\n血怪\t106069\nDIX\t106070\nDIY\t106071\n45名\t106072\n秦二世\t106073\nRrfeeh\t106074\n一块形状\t106075\n小巧玲珑\t106076\n终生\t106077\n血性\t106078\nDIS\t106079\n小ｅ\t106080\n謝\t106081\n奔跑吧\t106082\n宁武\t106083\n满级\t106084\nhowmars\t106085\nababds\t106086\n吴敏\t106087\n真清\t106088\n3d奔包\t106089\n刘多多\t106090\n钟日\t106091\n謆\t106092\n班车\t106093\nbabycasti\t106094\n大喊大叫\t106095\n政治体制改革\t106096\n边锋\t106097\n你好烦人讨厌讨厌讨厌讨厌\t106098\nnk01\t106099\n哎呦真是的你\t106100\n不麻烦\t106101\n快樂\t106102\n红高粱\t106103\n態大㎞\t106104\n嘉茂购物中心\t106105\n勒沃库森\t106106\n李耳\t106107\njknbh\t106108\n一查一查\t106109\nosbskzbwos\t106110\n5678号\t106111\n杨小好\t106112\n边锤\t106113\n老远大\t106114\n不是我就我就问问哈我那\t106115\nada012\t106116\n李老\t106117\nvygvjy\t106118\n哈哈我是最美的我是女的\t106119\n东西木\t106120\n預科\t106121\nalfredo\t106122\n216\t106123\n217\t106124\n215\t106125\n212\t106126\n213\t106127\n210\t106128\n211\t106129\nRIOOSG\t106130\n张彩\t106131\n88688384888898868384888586837378858683827888\t106132\n底牌\t106133\n218\t106134\n从师\t106135\n21%\t106136\n21#\t106137\no型密封圈\t106138\n熊大那\t106139\n搬运工\t106140\nfeecbdjsf\t106141\n长长\t106142\nhello哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼哈楼\t106143\n疏解\t106144\n番茄台\t106145\njeok2jeodjeijdhjejdjdjksn\t106146\n吕浩\t106147\n嗯长城\t106148\n跟男\t106149\n第8天\t106150\n5999\t106151\n数三声\t106152\nJOSK\t106153\nST光华\t106154\n广东话都面\t106155\n略赵\t106156\n小度机器人\t106157\n55555456\t106158\n李振声\t106159\nejewe\t106160\n天马行空\t106161\nsumme2\t106162\n太盟\t106163\n正通雅苑\t106164\n呵呵呵杀杀杀我又白给啥子\t106165\n麻老\t106166\n连绵\t106167\n报名\t106168\n太监\t106169\n笨丹\t106170\n连续\t106171\n魏超\t106172\n很浪漫\t106173\n风险\t106174\n苏子垚\t106175\n辞\t106176\n情剑\t106177\n太相\t106178\n21P\t106179\n左脸苹\t106180\n太直\t106181\n死幼稚\t106182\n预组\t106183\n大白棒\t106184\n东北嘎嘎嘎嘎\t106185\n行路中国\t106186\n麻耳\t106187\n芈月\t106188\n松原\t106189\n21M\t106190\n海英\t106191\n108417647\t106192\nFLY\t106193\n宁河\t106194\n小马宝莉茜\t106195\n15519846258\t106196\n舍伯\t106197\nnvds\t106198\n那那那\t106199\nwoza\t106200\nqqvv\t106201\ntask多图EVA\t106202\n我是那嘛我我我是那个那个\t106203\n838656
53345\t106204\n46884\t106205\n你以为你以为\t106206\n5636439495\t106207\nFLJ\t106208\n金皇大酒店\t106209\n独角兽\t106210\n养殖场\t106211\n多本\t106212\nddy\t106213\n没害你\t106214\n王文建\t106215\n莫言秘\t106216\n矩阵\t106217\ndds\t106218\nRudiiftigfhu\t106219\nddv\t106220\nddt\t106221\nddk\t106222\nddj\t106223\nddh\t106224\nddo\t106225\n嘉峪关\t106226\nddc\t106227\nddb\t106228\ndda\t106229\n大天亮\t106230\nddg\t106231\nddf\t106232\nddd\t106233\n辱\t106234\n二拉\t106235\n兰叫\t106236\n突兀\t106237\n蓝继\t106238\n小暴\t106239\n奥乖宝贝\t106240\n睡会儿\t106241\n小文糊里糊涂\t106242\n甘油\t106243\n初儿上册历吉日嗯李雨珊英语卷\t106244\n昨天下午\t106245\n你不你不懂\t106246\n瘦腿\t106247\nffefgfjg\t106248\n掣母\t106249\n胸臆\t106250\n勾引\t106251\n说冷\t106252\n数平方根\t106253\n蓝绿\t106254\n八一矿\t106255\nsecymy\t106256\nBeauty\t106257\n李洋\t106258\n幻想有一张脸\t106259\njhgggjkk\t106260\n辶\t106261\n前前后后\t106262\n4月12日\t106263\n娇美\t106264\n令营\t106265\n说写\t106266\nK歌之王\t106267\n娇羞\t106268\n两级\t106269\n13456778\t106270\n辽\t106271\n手柄\t106272\n超不多\t106273\n梁瑞\t106274\nyiyi\t106275\n三倍差\t106276\n讨厌你恨你恨\t106277\n报气预\t106278\n斤斤计较\t106279\n一组\t106280\n巴特\t106281\njidrgg\t106282\n我求求你了你好看\t106283\n撒样\t106284\n繁复\t106285\n专书\t106286\n17708844773\t106287\n郸城\t106288\n春阳水\t106289\n一起子\t106290\ncchkl\t106291\nCollection\t106292\n我是你的熄火吧\t106293\n繁多\t106294\n紫罗兰\t106295\n伊丽莎白鼠\t106296\n度金\t106297\n15150478165\t106298\n顺平乡村\t106299\n天阴公猪\t106300\n倒底\t106301\n李荣浩\t106302\n太坏了你\t106303\n太武\t106304\n力工\t106305\n柏堂諍\t106306\n甜甜小受\t106307\n不分开\t106308\n啼霜\t106309\nhoamI\t106310\n琼花\t106311\n熊大宝\t106312\n提有\t106313\n好不好勒\t106314\n莞市\t106315\n再犯贱\t106316\n作坊\t106317\n哼捍\t106318\n埋炉\t106319\n青春舞曲\t106320\n草稿纸\t106321\n6455484916115454566645463\t106322\n阴在线\t106323\n240倍\t106324\n牛铁\t106325\n张向云\t106326\n象牙塔\t106327\n下一分钟\t106328\ncnbm\t106329\n月轮\t106330\n名邦广场\t106331\nHow\t106332\nHot\t106333\n养伤\t106334\n素人品\t106335\n胸咚壁咚\t106336\n金翅鸟\t106337\n老厚\t106338\n泊船瓜洲\t106339\n朱建娟\t106340\n八爷党\t106341\n〖干烹仔鸡\t106342\n周浩伟\t106343\n段誉\t106344\n一绝\t106345\n制造量\t106346\n配偶\t106347\n鼻梁\t106348\n拉丁文\t106349\n自游
多多旅行网\t106350\n5555223\t106351\n5555222\t106352\nSBn2B\t106353\n刘小\t106354\n一年多\t106355\n点红\t106356\n块宝\t106357\n批准\t106358\n2009年11月18日\t106359\n厂长厂\t106360\nJjvdmfmamMlmkkkkkmjhbvghbb\t106361\n感恩节\t106362\n咪哆咪\t106363\n蓝翎\t106364\nnsjsjsjsjsjjsjssjsjsjjsjsjsjsjsjjdjdjdjdjjdjjjdkdkdjjjdjdjsjjsjsjdjdjdj\t106365\n裸爱\t106366\n香浓\t106367\n周二20点54分\t106368\n美盛版\t106369\n亏待\t106370\n黄成\t106371\n非男非女\t106372\n杨帅\t106373\n不要脸爱你\t106374\n老子牙\t106375\n9月17日\t106376\n笨说\t106377\n杨布\t106378\n我真的不爱你我讨厌你我恨你\t106379\n龙应\t106380\n英拉\t106381\n福太\t106382\n啵啵啵咪咪\t106383\n吃捏\t106384\n白子\t106385\n白字\t106386\n半岁菜\t106387\n24H\t106388\nyounow\t106389\n文献\t106390\n无聊我讨厌你我讨厌你我讨厌你我讨厌你我讨厌你\t106391\n放碟\t106392\n卵硬\t106393\n瘢痕妊娠\t106394\nPlus\t106395\n超碰\t106396\n兄控\t106397\n离家\t106398\nkslsmmemenene\t106399\n五4\t106400\n吃换\t106401\n吱呀\t106402\n近百万份\t106403\n白学\t106404\nCPA\t106405\n五十分钟\t106406\n就叫做\t106407\nNCAP\t106408\n底仓\t106409\n进行\t106410\n据称\t106411\n看我死\t106412\n丫杈\t106413\n谁千里卵子了净瞎说\t106414\n几天内\t106415\n代家湾\t106416\n农副产品市场\t106417\n我是真的爱上你\t106418\n鲁颜沁\t106419\n弱小\t106420\n18223659418\t106421\n刘聪颖\t106422\nHHP\t106423\n万仞山\t106424\nyouahappy\t106425\n扁桃体\t106426\n足趾\t106427\nnonokiss\t106428\npicloud\t106429\n金冠礁窝疯子\t106430\n土灶\t106431\n阳泉\t106432\n俞佳怡\t106433\n敬小叫\t106434\n华都庭院\t106435\n诀窍\t106436\n梦阳\t106437\n好好玩耍\t106438\n非翅\t106439\n休息人\t106440\n打滴\t106441\n金禺\t106442\n答按\t106443\n航候机楼\t106444\n男娘娘\t106445\n蜜枣\t106446\n岳雨\t106447\n憋\t106448\n抱抱昂\t106449\ngfffdvccc\t106450\n夏洛特烦恼\t106451\n越狱者\t106452\n這麼小面\t106453\n憾\t106454\n米妮\t106455\n赵嘉凯\t106456\n随行\t106457\nmgmgng\t106458\n打滚\t106459\n拜仁\t106460\n胡念\t106461\n郑燕贝\t106462\n憩\t106463\n憨\t106464\n玩儿行\t106465\n200iq\t106466\n手机端\t106467\n许世友\t106468\n何爱军\t106469\n半脊\t106470\n王林\t106471\nGRE\t106472\n46800568\t106473\n15055845568880585514155558585\t106474\n尽是\t106475\n未然后\t106476\nmakelove\t106477\n有我不懂\t106478\n北京定都园林绿化有限公司\t106479\n三四片\t106480\n55555555555555\t106481\n千金小姐\t106482\n许飞\t106483\nhzhgc\t106484\n不知情\t106485\n浅水\t106
486\n我的话我不了你\t106487\n咩咪仙鹤每天咪好星河克塞春风里一开在春风咪\t106488\n点把吧\t106489\n投奔\t106490\n大寰妤\t106491\n聊了脆\t106492\n投契\t106493\n胡雪峰\t106494\n这一月\t106495\nvdsdrdtgddrftdt\t106496\n4Xf\t106497\n很漂亮身\t106498\n伊格纳\t106499\nｈｏｍｅｂｉｒｄ\t106500\n884774855\t106501\n公然\t106502\n借神\t106503\nH7N9禽流感\t106504\n天堂的忏悔\t106505\n反季\t106506\n根本钱\t106507\nxicb\t106508\n南航\t106509\n好吧好吧我是天平座\t106510\nhsus\t106511\n一次就好\t106512\n挖出\t106513\nfdfdd\t106514\n形象地\t106515\n吉大\t106516\n丑说\t106517\n76年清明\t106518\n温额度\t106519\n壮志未酬\t106520\n1％\t106521\nhsub\t106522\n错上\t106523\nhsuf\t106524\nrhey\t106525\n陈锐\t106526\n畅畅\t106527\n2000000110\t106528\n大吉岭\t106529\n陈安琪\t106530\nwuuf\t106531\ngtttttttt\t106532\n老爷们\t106533\nCnack\t106534\n决赛季\t106535\n谢谢你了我的好朋友\t106536\n好恶心好恶心\t106537\n24号\t106538\n懂过\t106539\n雅典奥运会\t106540\n滚粗\t106541\n开泶\t106542\n八百八千八百八多\t106543\n朱雅洁\t106544\n就是你的妈妈\t106545\n徽娘\t106546\n刘堡西\t106547\n磊磊\t106548\n405737577843875\t106549\n益尔\t106550\n阎凤娇\t106551\n喘息\t106552\n陈厅长\t106553\n恶心有\t106554\n商贩\t106555\n张温纯\t106556\n崔珊\t106557\n薛風格\t106558\n金镶玉\t106559\n恩我喜欢\t106560\nmall\t106561\n度秘度秘\t106562\n优柔寡断\t106563\n酺斕\t106564\n商贷\t106565\n名片\t106566\n气质性\t106567\n刘少威虹\t106568\n定额\t106569\n自愿者\t106570\n名牌\t106571\nG6Z\t106572\n泉泉区\t106573\n怪圈\t106574\nhdvjidruoewriybjkkkkkkkkhfffffhhggggasrtyioyeqeruuiihvdf\t106575\n338套\t106576\nFfgdhdgfh\t106577\n祖上\t106578\n临川\t106579\n欧才\t106580\nxXXXXXXX\t106581\n晕吧晕晕\t106582\n回调\t106583\n凯源\t106584\n潲水\t106585\n嘎哈\t106586\n黎先\t106587\n最好的朋友身\t106588\n奥别惯\t106589\n这个星期五\t106590\n418.84点\t106591\n睛\t106592\n特务\t106593\nmowowow\t106594\n年利率\t106595\n小满\t106596\n自换\t106597\n劫匪\t106598\n13207356411\t106599\n醉亭\t106600\n王未来\t106601\n周半\t106602\n长短不限\t106603\n安庆比矿\t106604\n周华\t106605\n黑白无常\t106606\n智喜\t106607\n串蛋\t106608\n周十\t106609\n醉人\t106610\n老大花\t106611\n纳莎瓦\t106612\n楼部\t106613\n我看电视好恐怖啊你先在快乐我家悠悠多好咯求你了两只小狗给我我在\t106614\n度秘我爱你下著名你真帅度你真美啊度秘我爱你\t106615\n激荡\t106616\n亏古树有说你要是你在说\t106617\n抢尸\t106618\n国家地人\t106619\n动物界\t106620\n吓人么我想忘记泡\t106621\n泡鬼\t106622\n毛羞\t10662
3\nGfghihvvjko\t106624\n爱丽棒\t106625\n尚筠茜\t106626\n孙唱\t106627\nfuur\t106628\niti\t106629\n俩三个\t106630\netot\t106631\n歌舞曲\t106632\n5467876776557566758\t106633\nitc\t106634\n肚子度\t106635\nxucu\t106636\n雪碧透心凉\t106637\n每个月一号\t106638\n高级机器人\t106639\n半天块\t106640\nity\t106641\n睔\t106642\nitr\t106643\nits\t106644\n孽畜\t106645\n喔元\t106646\n现代舞\t106647\nNdjids\t106648\n舞舞舞\t106649\nOASIS\t106650\n沓毛\t106651\n王佳政\t106652\n仗节\t106653\nhttpfhiphotosbaiducomxiaodupicitem18d8bc3eb13533fab7acd55aafd3fd1f41345bafjpg\t106654\nbd40735fa13f59dcd99510fb30f2408b9jpg\t106655\n好像\t106656\n尕花椒\t106657\n发膜\t106658\n十多\t106659\n1000亿\t106660\n全长\t106661\n窿\t106662\n嗳1h八一\t106663\n抱身亡\t106664\n八早\t106665\n三台县\t106666\nugvbnfkjv\t106667\n几千呵呵\t106668\n骗了我讨厌\t106669\n絕\t106670\n建于\t106671\n34728\t106672\n收岁\t106673\n着\t106674\n考研\t106675\n藤原\t106676\n一金\t106677\n好youcu\t106678\n絮\t106679\n和一走路\t106680\n三位\t106681\n三体\t106682\n苏国民\t106683\n00:19:44\t106684\n饞嘴\t106685\n凉平\t106686\n一量\t106687\n一里\t106688\ntaceclassroom\t106689\nlantern\t106690\n劲使\t106691\n二三十年代\t106692\n二四三五二二五二\t106693\n多胖\t106694\n刘惠婷\t106695\n奥迪q230\t106696\n老打\t106697\n加吧加吧加吧加吧加吧加吧加\t106698\n公共场所\t106699\n2258521\t106700\n你好呀我是你猜我是谁\t106701\n老手\t106702\n耶真\t106703\n一头奶牛黄\t106704\n失重感\t106705\n15074269772\t106706\nCP8\t106707\n毛东东\t106708\n筋急转弯儿\t106709\n塔奇\t106710\n我的理想\t106711\n百分百之十二\t106712\n骚锐\t106713\n道理句\t106714\n55n5n56\t106715\n缅甸联邦共和国\t106716\nhistosto\t106717\n冯宇阳\t106718\ngbjgfhjj\t106719\n不可爱么\t106720\n1l0vey0u\t106721\n小度秘你真是我的好帮手\t106722\n小物\t106723\n突看\t106724\n认门\t106725\nZZZ0\t106726\n波慟\t106727\nEnenenem\t106728\n我哥说麦怪事我的世界\t106729\n窦\t106730\n茊山兄\t106731\nllll嘤\t106732\n1666\t106733\n可选择\t106734\nChang\t106735\n中华帝国主义\t106736\naabc句式\t106737\n丝丝度\t106738\n3.3後门\t106739\n︿好\t106740\n杭州市公安局\t106741\n小牌\t106742\niihgggg\t106743\n干秘书\t106744\n下次再聊\t106745\n松巍爱卿\t106746\n龙钟空\t106747\n小片\t106748\nk线耳\t106749\n小牛\t106750\n15847438162158474349080808\t106751\n玛逼\t106752\n味菊花\t106753\n窝窝团\t106754\n释放\t106755\nvhhvbhfg\t
106756\n唯我独尊\t106757\n卫宁\t106758\n我讨厌你讨厌你讨厌你趟天一\t106759\n份妮生花\t106760\n三千里\t106761\nlonsuldy\t106762\n侠女\t106763\n一nnnn下次\t106764\n不言\t106765\n5035\t106766\nchch\t106767\nSUBWAY\t106768\n争取\t106769\n双星人\t106770\n开心死\t106771\nbvul\t106772\n5月20\t106773\n划掉\t106774\n5月22\t106775\n5月25\t106776\n尹小贱\t106777\n临潼\t106778\n从古至今\t106779\n闵雪\t106780\n严惩不贷\t106781\n24244242424727\t106782\nimportantly\t106783\n1570537309046715\t106784\n珍珠末知乜\t106785\n人山\t106786\n这幅画\t106787\n哈哈哈八八八八八八八八八八\t106788\n第一哼\t106789\n熊维尼\t106790\n胜贤\t106791\n歪咪\t106792\n潇湘\t106793\n😊\t106794\n中山市小榄人民医院\t106795\n侉样\t106796\n别说呀山\t106797\n家呢樨\t106798\n张珍贵\t106799\n斗\t106800\n胜负\t106801\n😭\t106802\n份逸\t106803\n564800000000\t106804\n朱嘉浩\t106805\n222222222222222222222222222\t106806\n丝钉\t106807\n朱额济\t106808\ncnnc\t106809\n你好你好我爱你\t106810\n杨少亮\t106811\n贪欲\t106812\n跨代\t106813\n女呢ok\t106814\n连队长\t106815\n杜度秘\t106816\n毛呢\t106817\n连通\t106818\n呦女\t106819\n大片儿\t106820\n杀无\t106821\n颁给\t106822\n疫病\t106823\nh7N9禽流感\t106824\n肖的关你\t106825\n摇椅\t106826\n齐克果\t106827\nkoucub\t106828\nFufriggchghcgthidrt\t106829\n谣言\t106830\nwalk\t106831\nwimaxwimaxwls\t106832\n四瓦多姆\t106833\n八仙殿\t106834\n躲闪\t106835\n沙岛\t106836\n张强强\t106837\n牙医\t106838\n令归隐佛\t106839\n百山\t106840\n1.05万亩\t106841\nvivian\t106842\n葡萄酒\t106843\n写给自己的信\t106844\n牛掰\t106845\n一段情\t106846\n张成秒\t106847\nGoogle\t106848\nFROM\t106849\n唐佳琪\t106850\nhi你好英语\t106851\n最好的朋\t106852\n倒把\t106853\naecd\t106854\n俩百二十个\t106855\n牛排\t106856\n大家末\t106857\n不需要\t106858\n张勇\t106859\n？分钟\t106860\n蒲公英头蒲公英姑娘\t106861\n去职中行\t106862\nDEFG\t106863\n孤枕\t106864\n没么样\t106865\n骑自行车\t106866\n花天桥\t106867\n加餐\t106868\n长飞\t106869\n小夜灯\t106870\nyekg\t106871\n张依然\t106872\n杨腿儿\t106873\n林雪柔\t106874\n驳驳\t106875\n七来\t106876\nhgbgf\t106877\n徐文天\t106878\n远嫁\t106879\n一一一一一一一一一一一\t106880\n动一动\t106881\n李织羽\t106882\n丽江小石桥\t106883\n3DVD\t106884\n配音秀\t106885\n无感中\t106886\n真的好不要脸了你\t106887\n度秘集大你会\t106888\n牣尔\t106889\n多难\t106890\n俊秀\t106891\n茅垭\t106892\n第四章\t106893\n小娃子\t106894\n川中\t106895\nthave\t106896\nufucugcty\
t106897\n去年11月15日\t106898\n814可乐\t106899\n李洁浓\t106900\nSuyen\t106901\n大斗\t106902\n喜得\t106903\n成钱\t106904\n这一句话\t106905\n看我不干\t106906\n刘玥杉\t106907\n粑粑粑粑\t106908\n我靠我靠我靠我靠我靠我靠我靠我靠靠靠靠靠靠\t106909\n1750点\t106910\n福慧慈缘文化会馆\t106911\n會魯\t106912\n蒋中正\t106913\n毕业班\t106914\n虎背熊腰帅\t106915\n差生\t106916\n维C精华#\t106917\n周鸿\t106918\n东西子\t106919\n板机\t106920\n爱大白\t106921\n待到\t106922\n中藏何物\t106923\n狗东西\t106924\n沙度秘\t106925\nn源\t106926\n骨頭\t106927\nnjcusiphclcosljchlsldiqtowosoznxhcoxonkgudgizitspcjvudgdpdgdpmpdgmnjpdgn0jgajdtmtnp\t106928\nBlu酒店\t106929\n百分之70\t106930\n离间者\t106931\n窝素牛\t106932\n再保险大厦\t106933\n阿福特恩\t106934\n62期\t106935\n母公司\t106936\n这是谁的歌\t106937\n新娘大作战和何以笙箫默\t106938\n五零五百\t106939\n69580\t106940\n自爆\t106941\n盈余\t106942\n好错\t106943\n弟弟弟\t106944\n四月二十三日\t106945\n愛人\t106946\n病逝\t106947\n萌萌哒亲一个\t106948\n郑编辑\t106949\n七十五千克\t106950\n非哟播片\t106951\nhint\t106952\n一块九\t106953\n杨童心\t106954\nimim\t106955\n收纳手枪\t106956\n豆芽菜\t106957\n推上\t106958\n热会\t106959\n六截屏\t106960\n自爱\t106961\n口\t106962\n叠\t106963\n另\t106964\n句\t106965\nzgcfh\t106966\nloal\t106967\n只\t106968\n陆梦丽\t106969\n国际米兰\t106970\n可\t106971\n叮\t106972\n叭\t106973\n召\t106974\n右\t106975\n史\t106976\n叱\t106977\n台\t106978\n号\t106979\n叶\t106980\n整洁\t106981\nHfiddnu\t106982\n叻\t106983\n叹\t106984\n司\t106985\n叽\t106986\n叼\t106987\nxax\t106988\n参\t106989\n与非记住\t106990\n过去7年\t106991\n加温\t106992\nZAIMA\t106993\n友\t106994\n及\t106995\n亲想你\t106996\n又\t106997\n5900\t106998\n反\t106999\n双\t107000\n发\t107001\n受\t107002\n度秘我看你头向上肚子的小红心\t107003\n全副武装\t107004\n叔\t107005\n叛\t107006\n叙\t107007\n变\t107008\n一夜度\t107009\nzgtjjf\t107010\nbjjkf\t107011\n张静茹\t107012\n如图额\t107013\nfghff\t107014\n一两台\t107015\n广东美术馆\t107016\n呃度\t107017\n空集\t107018\njronggmai\t107019\n一两句\t107020\n字形\t107021\n欢总教\t107022\n93jpg\t107023\n你的笑你的美\t107024\n张抗抗\t107025\n吴文兴\t107026\n8i8v8\t107027\n太早\t107028\n勃朗峰\t107029\n弃之而去\t107030\n影片儿\t107031\n清欠\t107032\n透湿\t107033\n炒豆\t107034\n你啦啦啦啦啦咯莫模棱两可\t107035\nseea\t107036\n初一\t107037\n闹了帮\t107038\n是萌\t107039\n看跌\t107040\n几月几日\t107041\n
拍客饪我行\t107042\n2295299060\t107043\n二班\t107044\n恶补天天开心\t107045\nGOODBYE\t107046\n陆步文\t107047\n手筋\t107048\n安德鲁\t107049\n哇裕华\t107050\n月用水\t107051\n宁可不\t107052\n146783569\t107053\n数理化\t107054\n稍安勿躁\t107055\n跳停\t107056\n沁心脾\t107057\n西南地区东部\t107058\n11255685698562595563449467997679\t107059\n商贾\t107060\n近镜\t107061\n占座\t107062\n5月23日16时\t107063\n翻地覆\t107064\nhdux\t107065\n有儿\t107066\n哈面\t107067\n9.23\t107068\n啥子\t107069\n弄好\t107070\n付维维\t107071\n韩诗蒽\t107072\ntremyshit\t107073\nintentIntentSK1171477665519EC8A6684AFB357094C590C3C4E8F014E0AD940367D05759E78CFE9D234A9end\t107074\n说业\t107075\n斥力\t107076\n100000009\t107077\n日本政府\t107078\njexj\t107079\n把元之空的不是元之空是缘分的缘就缘之空\t107080\n100000000\t107081\n说下\t107082\n说上\t107083\n说不\t107084\n死去吧\t107085\n化石砖\t107086\n额姓\t107087\n一不小心\t107088\n1637765542\t107089\njrhr\t107090\nghrigh\t107091\n旷工\t107092\n果宝\t107093\n卡x斯达\t107094\n管蛋\t107095\n说中\t107096\n臭大猪\t107097\n想一想爱\t107098\n移晷\t107099\n争宠\t107100\n电老百姓\t107101\n有关机\t107102\n两码事\t107103\n分歧\t107104\n晋级\t107105\nm41\t107106\n二八六\t107107\n花影错\t107108\n主题外话\t107109\n为所有爱我和我爱的人\t107110\n１０日晚１０时\t107111\n文纸\t107112\n奖励点\t107113\n大妹子我了真不要脸那笑死人\t107114\n一点钟一点半\t107115\n高品质\t107116\n衣服类\t107117\n讨厌鬼讨厌鬼讨厌鬼\t107118\n2月26号\t107119\n恁多\t107120\n最后的女人\t107121\n三和一个五\t107122\n快點好嗎\t107123\n腹肌干\t107124\n卵石\t107125\n米奇个讨厌鬼的你\t107126\nsiri\t107127\n沙雷\t107128\n马茜羽\t107129\n算得上\t107130\n课外歌\t107131\n好编\t107132\n度秘我发现你好牛逼\t107133\n过多少天\t107134\n响头\t107135\n──\t107136\n沙雪\t107137\n云母\t107138\n喜剧片\t107139\n我靠你看我是猪啊我靠你\t107140\n车流\t107141\n钱欣慧\t107142\nCYH\t107143\n沙雕\t107144\n日渐\t107145\nKiehl's\t107146\n黑帮片\t107147\n兔发祈祥\t107148\n迭起\t107149\n成我我要\t107150\n支票\t107151\n游泳馆\t107152\njhdfj\t107153\n匪匪\t107154\n张宜明\t107155\n嗯啦钱\t107156\nurjfty\t107157\n费玉玉\t107158\n孟浩然\t107159\n18828273747485959696984837262665256373738494\t107160\n熊熊抱\t107161\n看清楚\t107162\nJDI\t107163\n消费益\t107164\n战歌\t107165\n没感兴趣\t107166\nJDF\t107167\nimag0063-1a.jpg\t107168\nJDC\t107169\nJDB\t107170\n1斤\t107171\n十余日\t107172\n咯酷6\t107173\n邓
爽\t107174\n下一个年\t107175\n两只狗\t107176\n人类处\t107177\n强闺密\t107178\n死脑筋\t107179\n贝尔曼\t107180\n金球\t107181\n行恋爱我\t107182\nMethen\t107183\n青龙\t107184\n烦求子\t107185\nllovryou\t107186\n瑪麗\t107187\n女朋女\t107188\n照片\t107189\n1254598548585855555\t107190\n乐8\t107191\n的的的的的的的的的的的的哒\t107192\n摔倒\t107193\n兴科钡\t107194\n郭敬明\t107195\n大滩\t107196\n沒偶葛\t107197\n小吴\t107198\n火烧曹船之前着前句\t107199\n噗噜噗噜\t107200\n脱颖而出\t107201\n中肯\t107202\n额不吖\t107203\n零幺幺零\t107204\n临朐\t107205\n成活\t107206\n鲜嫩\t107207\n伐客气伐\t107208\n兆丰银行\t107209\n今晚十点\t107210\n捉妖记\t107211\n正态竞争\t107212\ncmad\t107213\n小吕\t107214\n龙虎穴\t107215\n344784258\t107216\n好哇塞\t107217\n小名\t107218\n馆长\t107219\n小吉\t107220\n扣发\t107221\n旺财\t107222\n帮手\t107223\n来事要\t107224\n小吃\t107225\n海牙\t107226\n海牛\t107227\n由得\t107228\n法器\t107229\n诤文\t107230\n小个月\t107231\n萝卜兔\t107232\n真金\t107233\n海蛇\t107234\n心情不错\t107235\n这首诗歌\t107236\n买错\t107237\n哿\t107238\n中国会\t107239\nffddd\t107240\n进驻\t107241\n劲斗云\t107242\n卡纸中学\t107243\n红唇\t107244\n三年级\t107245\n2631963206\t107246\n素蓝达\t107247\n姚思怡\t107248\n不要八八\t107249\n预感二十八号\t107250\n成都妈妈网\t107251\n两万多兆\t107252\n别的了狂犬病\t107253\n慈云山道\t107254\n招兵\t107255\n熊旭\t107256\n宠物们\t107257\n点人\t107258\n森中\t107259\n小数报\t107260\n美少我波斯人\t107261\n华剑\t107262\n皮带炒肉丝\t107263\n屏间\t107264\n色女\t107265\n期待期\t107266\n点亮\t107267\n劳动人民\t107268\nrisi\t107269\n蔷薇书院\t107270\n晚上8点\t107271\n名草\t107272\n哈钱么\t107273\nqqpl\t107274\n托斯卡纳\t107275\n点五\t107276\n宏伟\t107277\na11a\t107278\n啦啦啦啦啦吗啦啦啦啦啦啦啦啦啦啦偷猪你是一头猪11头猪你是一头猪你是一头猪是一头猪你你是一头猪\t107279\n姚圣友\t107280\n好呀咱俩对帐\t107281\n大白话\t107282\n口算\t107283\n升庵的国际大玩笑例如\t107284\n金欣\t107285\n晶格\t107286\n靖西\t107287\n婚会\t107288\n五金\t107289\n我的家的很多\t107290\n陈vv\t107291\n太空人\t107292\n不为人\t107293\n热恋期\t107294\n程帅凯\t107295\nFKf\t107296\n控制阀\t107297\n龙艳玲\t107298\n几ou\t107299\njct\t107300\n逗乖\t107301\njcv\t107302\n范世琦\t107303\ntuft\t107304\n石壁\t107305\n|丶\t107306\n矿石\t107307\n封锁\t107308\n哈喇子\t107309\njcd\t107310\n第6\t107311\n冒牌货\t107312\n雷亚文\t107313\njcb\t107314\n寡姐\t107315\njcl\t107316\nLlllllllllllllllllpllllll\t107317\njcn\t107318\njc
h\t107319\n233345566698123688566\t107320\njcj\t107321\n孙甜甜\t107322\n安度\t107323\n叫做事\t107324\n丑丑八怪\t107325\n亲所\t107326\n刺绣\t107327\n小金天水市小天天小天头发\t107328\n亲手\t107329\n明天晚上八点\t107330\n2008年3月18日\t107331\n安康\t107332\nHIT-5\t107333\n夜情\t107334\n宣传照\t107335\ngvn\t107336\n我喜欢感叹号\t107337\n胰腺炎\t107338\n安庆\t107339\n没完我不想\t107340\n回我的手可\t107341\nfusjh\t107342\n飞往\t107343\n建设乡村\t107344\n正对\t107345\n人骨\t107346\n日向丨\t107347\n失重\t107348\n飞得\t107349\n福尔摩斯贵公子\t107350\nzak\t107351\n找东西\t107352\n影子会\t107353\nkdbbbhhbsv\t107354\n抢话费\t107355\n没活\t107356\n望京\t107357\n帅嘴\t107358\n垂卧\t107359\n1000万版\t107360\n刘象萍\t107361\nblebl\t107362\n赵嘉敏\t107363\n孙宇\t107364\n155555239\t107365\n9672\t107366\n9673\t107367\n朝闻\t107368\n比度\t107369\n思想家\t107370\nFATE\t107371\n2323233323\t107372\n唉怪不得\t107373\n里海\t107374\n直播秀\t107375\n11月12月\t107376\nq里咖喱\t107377\n无一点\t107378\n成卓然\t107379\n哈尼哈斯\t107380\n顾念杰\t107381\n狄龙\t107382\nhgwhksid\t107383\n哪等\t107384\n傲娇受我是霸道总\t107385\n影小达\t107386\n呀帅\t107387\n少理\t107388\n叶飞扬\t107389\n百度骗子\t107390\n帅莫\t107391\n嘟偶\t107392\nAV番\t107393\n亦真亦幻\t107394\n客户群\t107395\n破罐\t107396\n过程\t107397\n個電視劇\t107398\n洪灿辉\t107399\n吕秀芝\t107400\n1月22日\t107401\n快乐的心情\t107402\n加满\t107403\n大头鬼\t107404\n在空中\t107405\n21:20\t107406\njjjffc\t107407\n和牌牌\t107408\n潮州\t107409\n潮川\t107410\ngsns\t107411\n橘橘\t107412\n13000亿\t107413\n回龙\t107414\naagaa\t107415\n朝门\t107416\n584545452856952548760846445555513464904792855\t107417\n21:25\t107418\n小气气度秘\t107419\nmre\t107420\n欧克\t107421\n雷铬\t107422\n欧元\t107423\n寶貝\t107424\n超级空\t107425\n待还\t107426\n真的假不了假的真不了\t107427\n美托尼\t107428\n闰题\t107429\n1250014637\t107430\n老祖宗\t107431\n倒映\t107432\n嗯影\t107433\n還好\t107434\noutorhoohh\t107435\n看不其\t107436\n别烦\t107437\n98989808090858282555559\t107438\n肉体\t107439\n15248079297\t107440\n深闺\t107441\n紫禁城\t107442\n星期零一\t107443\nbhiphotosbaiducomxiaodupicitem5243fbf2b211931349a7afc262380cd790238dd8jpg\t107444\n沁心\t107445\n张娜娜\t107446\n仙女棒\t107447\n亚马孙雨林哥伦比亚\t107448\n哪我再也不里你\t107449\nl22\t107450\n脱胎换骨\t107451\n店名\t107452\n11V5\t107453\n冷
枪\t107454\n9科\t107455\n亮剑\t107456\n我是我\t107457\n四天后\t107458\n真的决定\t107459\n当来\t107460\n醉儿\t107461\ngsjskkso\t107462\n咯安\t107463\n嘎嘎嘎啦嘎嘣瞎卡拉卡东江\t107464\n帮主\t107465\n毕比\t107466\n嘻谈\t107467\n1234568809\t107468\nEYED\t107469\n拉多芮\t107470\n好吧晓瑞\t107471\njfgig千\t107472\n来客\t107473\nhi有意思是说我心里太阴暗话\t107474\n一蛛\t107475\n笔锋\t107476\n舌头\t107477\n盖洛普\t107478\n拍扁\t107479\n复原\t107480\n远方的家\t107481\n你先试\t107482\n37个年\t107483\n泉豹女\t107484\n五次\t107485\n璺短\t107486\n冤死\t107487\n抬\t107488\n军龄\t107489\n对啊你好\t107490\n五欲\t107491\nigetgege\t107492\n五款\t107493\n22分\t107494\njvsy\t107495\nIdua\t107496\nexe\t107497\n拒保\t107498\n1616464\t107499\n傲世\t107500\n锦州十中\t107501\n一周\t107502\n个头儿\t107503\n丰胸\t107504\n嘎嘎嘎嘎嘎嘎嘎\t107505\n红红火火恍恍惚惚\t107506\n一味\t107507\n13074955322\t107508\n一呵\t107509\n一呼\t107510\n一命\t107511\n阁考岛\t107512\n抻\t107513\n欧美\t107514\n丑八怪丑八怪丑八怪\t107515\n11种\t107516\nddgh\t107517\n爱翰\t107518\n监事\t107519\n收漫\t107520\n两亿岁\t107521\n北大伯\t107522\n哫\t107523\n一呕\t107524\n斯明\t107525\n一员\t107526\n11秒\t107527\n一月二日\t107528\n心旷神怡\t107529\n抽\t107530\n介手\t107531\n张家荣\t107532\n奔放\t107533\n澳大利亚学校\t107534\n现金流\t107535\n解刨\t107536\n解别\t107537\n黎美\t107538\n孔欣宇\t107539\n窃电\t107540\n家家户户\t107541\n聊了事\t107542\n护脸\t107543\n吕金金\t107544\n淬危惫\t107545\n24万多\t107546\n不知从何说起\t107547\n恨嫁\t107548\n落座\t107549\n323483\t107550\n晓东\t107551\n门牌\t107552\n姚郁闷\t107553\n20场\t107554\n才生\t107555\nddgf\t107556\n秦始皇\t107557\n历经\t107558\n朱志同\t107559\n小野\t107560\n最美好\t107561\n周朝姬氏\t107562\ndurgj\t107563\n450000吨\t107564\nQ\t107565\n大萝卜那你你在那\t107566\n21世纪\t107567\n嗯一任\t107568\n小金\t107569\nfuCk\t107570\n贞观帝\t107571\n646466464646464\t107572\n小可爱\t107573\n盖骨\t107574\n西瓜霜\t107575\n束\t107576\n成都路\t107577\n小邋遢灰太雅\t107578\n还款\t107579\nchvvj\t107580\n151.4\t107581\n未解之谜\t107582\n阿兰·德波顿\t107583\n混住\t107584\nMcLaren\t107585\n不留克\t107586\n人人喊\t107587\n小四朗\t107588\n零三零七\t107589\n春风又绿江\t107590\n寄宿制\t107591\n保荐\t107592\n30几年\t107593\n农工色\t107594\n你的眼睛\t107595\n抄\t107596\n会车\t107597\n见长\t107598\n不要脸你不要脸\t107599\n更年轻化\t107600\n别走开\t107601\n疯疯\t10
7602\n400蚊\t107603\n来转去\t107604\n狂千\t107605\n度秘度秘度秘月\t107606\n五夫一\t107607\n咸蛋寺\t107608\n快新短篇文\t107609\ndjdhhvdjkgboskv\t107610\n确权\t107611\n安全隐患\t107612\n猪咯呗\t107613\n所有\t107614\n溯及\t107615\n黑黑\t107616\nT381\t107617\nuhhji\t107618\n不好漂亮\t107619\n花花花花\t107620\n宋丽华\t107621\n无聊啊片\t107622\n静园\t107623\n的版心小脚幼儿园\t107624\n八五个\t107625\n吕泽馨\t107626\nyy频道\t107627\ndiyohd\t107628\n度秘戏戏秘\t107629\n管理层\t107630\n火冒三丈\t107631\n宽旭\t107632\n杨\t107633\n大半码\t107634\n顺风\t107635\n八一号\t107636\n洋溢幸福的青苔小世界\t107637\n全明星\t107638\n迈威\t107639\n不等式\t107640\n丘八子\t107641\n买手\t107642\n面面谈\t107643\n勇往直前\t107644\n五多米\t107645\n防撞\t107646\n来\t107647\n九九厘米\t107648\n2800元\t107649\n喵呜喵喵喵喵喵喵\t107650\n沧州\t107651\n情绪感\t107652\n接行\t107653\n宗教局\t107654\n学校生活\t107655\n#818二手车#\t107656\n洗凶\t107657\n花团\t107658\n花园\t107659\n训练营\t107660\nhshdhq\t107661\n茗草泉\t107662\n珠格格\t107663\n毛损\t107664\n叫就份\t107665\n莱雅\t107666\n21些\t107667\n698888888\t107668\n更胜一筹\t107669\nBbvvfcsvbmmml\t107670\n七五2925场\t107671\n外星店\t107672\nnvvf\t107673\n文理分\t107674\n滔天\t107675\n邮编\t107676\n言改\t107677\n4个\t107678\n21亿\t107679\n一摸摸头\t107680\n肯帝亚\t107681\n这么\t107682\n病床\t107683\n不存心害\t107684\n晒晒太阳\t107685\n88597656558565\t107686\n鱼豆干\t107687\n乐家\t107688\n奸穷\t107689\n杰\t107690\n沼泽地\t107691\n乐容\t107692\n卓文君\t107693\n韭菜茄子\t107694\n多如牛毛\t107695\n900个\t107696\n上海领事馆\t107697\n方文\t107698\n叫嚣\t107699\n杂感性\t107700\n中电信\t107701\n陈老都\t107702\nusjiehiedviydii72831234567890\t107703\n栾城二中\t107704\n77777U\t107705\n念冻\t107706\n无法忍受\t107707\n78443139255959\t107708\n6927\t107709\n陈恒阳\t107710\n黑丝pastfrogastor小虎xomoto腐4xomaoyaxchilm\t107711\n道用\t107712\n混血\t107713\n7716\t107714\n同父异母生\t107715\n7718\t107716\ngyyy\t107717\n36下\t107718\nfuoory\t107719\nygygg\t107720\n死银\t107721\n做官机\t107722\nuiooovcsaaqq\t107723\n赵龙\t107724\n背靠\t107725\n13681398082\t107726\n700根\t107727\n所向\t107728\n新加雨再见来爱河有一天\t107729\n36万\t107730\n刘奕欣\t107731\n出来不来\t107732\n出世\t107733\n家仔\t107734\n出丑\t107735\n脑官\t107736\n玩具厂\t107737\n玍疚\t107738\n凉拌牛肉\t107739\n家公园\t107740\n徐子涵\t107741\n出下\t107742\n免免
\t107743\n瀚委\t107744\n李方便人\t107745\n雷猴雷猴\t107746\n2.93%\t107747\n伊春\t107748\n家份\t107749\n拿枪突突\t107750\n李阳阳\t107751\n不行不行\t107752\n脑容\t107753\n钱一喜\t107754\n脑室\t107755\n家们\t107756\n北好离\t107757\n44宗\t107758\n乘以\t107759\n輔\t107760\n西柏坡\t107761\n给我一样\t107762\n绥中\t107763\n射手王\t107764\nmabrother\t107765\n走一遭\t107766\ngibd\t107767\nkkjj\t107768\n陪练\t107769\n森感\t107770\n何先睿\t107771\n美容院\t107772\n缓慢\t107773\n移项\t107774\n101.2\t107775\n身法\t107776\n王磊\t107777\n清心寡欲\t107778\n杰拉\t107779\n0817\t107780\n林然\t107781\n基巴达\t107782\n饭意\t107783\n四月四号\t107784\n工作者们\t107785\n开平美\t107786\n闹不开\t107787\n凉水\t107788\n周滨涛\t107789\n亲情\t107790\n刘德华\t107791\nxart\t107792\ns600\t107793\n狂信不信\t107794\n耍一耍\t107795\n上不上市\t107796\n禀赋\t107797\n凤河\t107798\n汉ie\t107799\n闫雯雯\t107800\ngibv\t107801\n论著\t107802\n一百岁\t107803\nissy\t107804\n制氢\t107805\n特里兹\t107806\n嗯光头\t107807\n恰兴\t107808\nlolWE\t107809\n制定者\t107810\n变质\t107811\n鲸鱼\t107812\n广钦\t107813\n二勘院\t107814\n行踪\t107815\n15723403816\t107816\n综合型\t107817\n刘心彤\t107818\n一分钱税\t107819\n中国电视\t107820\n头球\t107821\n元月25号\t107822\nguey\t107823\n399小游\t107824\n经理\t107825\n烦闷\t107826\nbjIjust\t107827\n闫永喜\t107828\n复习题\t107829\n王全璐\t107830\n转食\t107831\n马梦瑶\t107832\n300300\t107833\n20两次\t107834\n888340\t107835\n瘦子\t107836\n圣母女\t107837\n安联通\t107838\n56528255788\t107839\n红拉\t107840\n越走越远\t107841\n陈伟杰\t107842\n帆篷\t107843\n长安话\t107844\n半夏你们半鬼半仙\t107845\n过行\t107846\n婊子大厦\t107847\n该怎\t107848\n东京灰太\t107849\n6月6日\t107850\n人工台那你你你\t107851\n我讨厌你的一点就是我就是讨厌你\t107852\n小秘小秘你是男是女\t107853\n许一诺\t107854\nlucky\t107855\n鸡爷\t107856\n麦田\t107857\nydugkg\t107858\n100美元\t107859\n赵学启\t107860\n泡菜国\t107861\n勾当\t107862\n比秘\t107863\n蓝魔\t107864\n婚纱裙\t107865\n我六来了我\t107866\n虎岛\t107867\n捧走\t107868\n日语日\t107869\nghhytddghu\t107870\n加贝\t107871\nKIW\t107872\n言下之意\t107873\n智言\t107874\n第25分钟\t107875\n催害\t107876\nKIN\t107877\nmaiyou\t107878\n你的头谁是谁雅士利\t107879\n杂碎\t107880\n字体\t107881\n豆乳\t107882\n哈哈骗你的你\t107883\n警电\t107884\nyuuufff\t107885\n不慎\t107886\n划去\t107887\n踏春\t107888\nrffvccdfffffddddd\t107889\n帅哥你好帅
┏\t107890\nh68\t107891\n土司面包\t107892\n跳闸\t107893\n彭宇\t107894\n老菠萝\t107895\nm韶伟\t107896\nAngelabay\t107897\n警用\t107898\n發貨\t107899\n动画箱\t107900\n不慬\t107901\n马友\t107902\n南广\t107903\n第3季\t107904\n大小气\t107905\n谁是你的我是女\t107906\n8500\t107907\nvvvhgfgmll\t107908\n卖货\t107909\n那伟霆\t107910\nhygggh\t107911\n3.4343\t107912\n16点1点15点\t107913\n植物堂\t107914\n布丁\t107915\n邓紫旋\t107916\n经学家\t107917\n鱼纹\t107918\n浓茶\t107919\n一三百\t107920\n补正\t107921\n露面\t107922\n植物堡\t107923\n处们\t107924\n梅扣肉\t107925\n钰钰\t107926\n遗骨\t107927\n数十秒钟\t107928\n处以\t107929\n中街步行街\t107930\nvdjzbcjbfg\t107931\n子堂宿\t107932\n交出钱\t107933\n秘屋\t107934\n舞龙\t107935\n19世纪末\t107936\n嘻嘻嘻多谢夸奖这谁说的\t107937\n快寒假了你\t107938\n12345678910111213141516117181920\t107939\n奔驰S400\t107940\n文洁\t107941\n光电\t107942\n駑\t107943\naaafffwee\t107944\n好梦\t107945\n文洋\t107946\n辛格尔顿\t107947\n7岁\t107948\n奥迪q\t107949\n踢算\t107950\n实习题\t107951\n2806\t107952\n2807\t107953\n乌鲁瓦图\t107954\n拉萨拉萨拉萨\t107955\n8858521\t107956\n8.5\t107957\n8.7\t107958\n8.6\t107959\n一人之上万人之下\t107960\n咪得\t107961\n大死\t107962\n光生\t107963\n大步\t107964\n芭比爱\t107965\n大武\t107966\n駢\t107967\n駡\t107968\n悠闲类\t107969\n别紧张\t107970\n满院\t107971\n额f\t107972\n你是一个小笨猪我是你的大的夜\t107973\n唱到\t107974\n王汝卉轩\t107975\n百度支付\t107976\n2015年6月十\t107977\n可挺好\t107978\n蓝山\t107979\n郭大妞\t107980\n6徐\t107981\n禁耐\t107982\n色哒\t107983\n悟空\t107984\nfgnnnnnnnnnnn那裡麵\t107985\n世纪个\t107986\n沙蟹\t107987\n见缝插针\t107988\n几千亿\t107989\n结业\t107990\n吃神魔\t107991\n数万个\t107992\n老剩男\t107993\n2155章\t107994\n望眼镜\t107995\n五零二零\t107996\nDDR3\t107997\n心谁的心\t107998\n几门儿\t107999\np鹏\t108000\n尽尽可能\t108001\n78三个\t108002\n送葬\t108003\n好笑呵呵呵呵呵\t108004\n四不四寸\t108005\nwuuoogda\t108006\n香粉\t108007\n张思涵\t108008\n土鳖\t108009\n1234667890\t108010\nhdggf\t108011\nHjz\t108012\n镀密\t108013\n凤鑫\t108014\n虚报\t108015\ndefuf\t108016\n谦舍\t108017\n玛吉拉\t108018\n打网球\t108019\n臭不要脸臭不要脸臭\t108020\n18章\t108021\n白雪皑皑\t108022\nHjf\t108023\n怡然\t108024\n跑会\t108025\nefgw7\t108026\n愁愁\t108027\n戏如人生\t108028\n20120414\t108029\n韶光\t108030\n13131780\t108031\n瞌睡莱\t108032\n赖美华\t108033\n九九
真人\t108034\n秘我要你做我的\t108035\n神工\t108036\n节前\t108037\n六七岁\t108038\n好无聊\t108039\n布达拉广场\t108040\n罗伯逊\t108041\npprm\t108042\n国度\t108043\n58二四\t108044\n公公公\t108045\n很纯很暧昧\t108046\n美琪\t108047\n李恩慧\t108048\n4.2\t108049\n钢琴师\t108050\n独自一人\t108051\n两两个\t108052\n第五座\t108053\nabcdabdefg\t108054\n亲爱哒度秘\t108055\n神州\t108056\n孟德斯鸠\t108057\n工房\t108058\n韶关\t108059\n薯片儿\t108060\n黄云清\t108061\n05101625283108\t108062\n齣戲嗎\t108063\n模式\t108064\nless\t108065\n刘纠纷\t108066\n苦难\t108067\nhhjjg\t108068\n出狱\t108069\n今天凌晨3点\t108070\n虫牙\t108071\n两一三千八百九十两万8888多少\t108072\n奶家\t108073\n455个\t108074\n修理工\t108075\n敖溪\t108076\n林叶罗丽精灵梦\t108077\n2bsb\t108078\nvvgy\t108079\n留得\t108080\n四十快四十十四四十四四十四四十\t108081\n老少爷们\t108082\nvvgc\t108083\n烤龙虾\t108084\n师大玛雅\t108085\n二手店\t108086\n流向\t108087\n南溪\t108088\n2VS9\t108089\n跑男\t108090\nKaty\t108091\nXAP\t108092\nibgjgjga\t108093\nitifd\t108094\n同业\t108095\n几百次\t108096\n作成\t108097\n普格县\t108098\n同一\t108099\n作戏\t108100\n好嘛好\t108101\n人之初性本恶我\t108102\nESCAPE\t108103\n此女\t108104\n同上\t108105\n好朋友人\t108106\n昂克威\t108107\nhaua\t108108\n作房\t108109\n点将\t108110\n自贡富顺\t108111\n老号考试\t108112\n邮递员\t108113\n谆谆\t108114\n华捷\t108115\n5x1点\t108116\nGee\t108117\nncdf\t108118\nxggd\t108119\n方得\t108120\n寥寥无几\t108121\n河中\t108122\n跑到\t108123\n玄幻文\t108124\n我是你的好朋友好呀好朋友\t108125\n5700块\t108126\n明天上午九点\t108127\n消防\t108128\nKcdv\t108129\n眼淚\t108130\n露宿\t108131\n一百分儿\t108132\n尖婆\t108133\n跑别\t108134\n额咳咳\t108135\n皮特\t108136\n我爱你的所有\t108137\n三步\t108138\njojd\t108139\n╥﹏╥\t108140\njojb\t108141\n视角\t108142\nhjvfgh\t108143\n造吃\t108144\njppiqgr\t108145\n咯大吞噬术\t108146\n河东\t108147\n好枪\t108148\nhdnz\t108149\n爆爆\t108150\n扭命\t108151\n一等天\t108152\n夜草\t108153\n第七遍\t108154\n换年\t108155\n维斯尼娜\t108156\n共计\t108157\n13970036078\t108158\n107737683n\t108159\nBloggie\t108160\n乐组\t108161\n描摹\t108162\n田晚安\t108163\n武度秘\t108164\n炫耀\t108165\n人和爱\t108166\n杀毒沙拉\t108167\n绝对不会\t108168\n假女\t108169\n亲爱的的影响秘书我的小秘书的小亲亲\t108170\n东莞理工学院\t108171\n杯水\t108172\n亲爱的小孩\t108173\n西山地区\t108174\n太兴\t108175\n张大脸\t108176\n一个错\t108177\n少女时代\t10817
8\nKate\t108179\n把介\t108180\nmó\t108181\n听话不然\t108182\n安和街\t108183\n挑一\t108184\n薛宅侠\t108185\n挑上\t108186\n米巧可\t108187\n大上\t108188\n91萬\t108189\n暗地里\t108190\n群木\t108191\n错啦错啦错啦错啦错啦错啦错啦错啦错啦奥特曼奥特曼\t108192\n搞笑言\t108193\n女单\t108194\nrwigjnc\t108195\n84吨\t108196\n罒▽罒\t108197\n唉白\t108198\n你好机志\t108199\n王鑫泽\t108200\n泽伟\t108201\n挑中\t108202\n11迉\t108203\n哥姐们\t108204\n吕文海\t108205\n重水\t108206\nhuan\t108207\n啊风景如画泽我隔壁格子兮几\t108208\n姓郭和好\t108209\n纯色\t108210\n来唱\t108211\nhuai\t108212\n猜太阳\t108213\n3组\t108214\n兴国\t108215\nBacchus\t108216\n浸湿\t108217\n十个小时\t108218\n正正\t108219\n葱妹妹\t108220\n平常考\t108221\n辅助\t108222\n老韩\t108223\n三九二零七\t108224\n1个月\t108225\n吴阿姨\t108226\n地狱地狱\t108227\n韦姿羽\t108228\n哈打算\t108229\n挤地铁\t108230\nq币之谜之壁睡吧睡吧睡吧乖孩子\t108231\n蛇精\t108232\n算里\t108233\n再次出现\t108234\n算量\t108235\n综上\t108236\n双节\t108237\n不wxxonm\t108238\n而又\t108239\n顾不着理\t108240\n两三名\t108241\n爱永远\t108242\n候车室\t108243\n王冲\t108244\n和派\t108245\n李树林\t108246\n6499\t108247\n感倩\t108248\n丑我真美\t108249\n听磋\t108250\n奥巴\t108251\n市卫生局\t108252\nLlxlboviblgirc\t108253\n感值\t108254\nchub\t108255\n八一期\t108256\n经济改革\t108257\n生气死\t108258\n令堂堂堂\t108259\n恩要不你真的过来我揍你\t108260\n葆婴\t108261\n王冕\t108262\n王军\t108263\n恐龙肉\t108264\n7平方\t108265\n不开森\t108266\n两小\t108267\n红烧套\t108268\n王运鹏\t108269\n跑车\t108270\n堤内\t108271\n一节三\t108272\n王冉\t108273\n宝贝儿子\t108274\n奇案\t108275\n王再\t108276\n三亿6596\t108277\n理别说\t108278\n咯阳\t108279\n5月22号\t108280\n水泥\t108281\n文化管\t108282\n水波\t108283\n水泡\t108284\n哭不想\t108285\n化肥\t108286\n麻批\t108287\n下午2点\t108288\n臭臭臭\t108289\n化学药\t108290\n霉菌\t108291\n如何个月\t108292\n一点米\t108293\nBEGINS\t108294\n啦啦小游卡巴拉\t108295\n诺声音\t108296\n佣兵\t108297\n透视分离法\t108298\n算了找\t108299\n郭梦茹\t108300\n咯阁\t108301\n美膳妍\t108302\n滚瓜烂熟\t108303\n捶胸顿足\t108304\n丑连狗\t108305\n丁壬\t108306\n水尿床\t108307\n十信\t108308\n龙湾\t108309\n普美\t108310\n地刺\t108311\npersonally\t108312\n啦裤兜\t108313\n1874667\t108314\n死心塌\t108315\n闪光灯\t108316\njxjxjud\t108317\n糯米券\t108318\n欧阳咕噜咕噜\t108319\n地利\t108320\n地别\t108321\n洛奇\t108322\n18359627427\t108323\n向求\t108324\n马失前蹄\t108325\n王子维\t1
08326\n心塞真是\t108327\n潜艇\t108328\n去年2月8日\t108329\n会英\t108330\n大为\t108331\n特卖会\t108332\n龙湖\t108333\n度秘你好吧青花我爸爸哇伊好吧都米\t108334\n画廊\t108335\nfydddfhjj\t108336\n取决于\t108337\n有所不知\t108338\n免疫系统\t108339\n玛丽普伦\t108340\n二三十块\t108341\n哈利波特\t108342\n大举\t108343\nTBOS\t108344\n王爷爷\t108345\n侈望豕\t108346\n另当别论\t108347\n一歇\t108348\n狗样\t108349\n李t\t108350\n灰色集团\t108351\n拆迁者\t108352\n肺动脉高压\t108353\n拜会\t108354\n东方动漫\t108355\n杨生玉\t108356\nletispaiss\t108357\n农历寺\t108358\n杨铭\t108359\n女博\t108360\n马燕\t108361\n2700亿\t108362\n一步\t108363\n含片\t108364\n一个二十分钟\t108365\n八八千\t108366\ngdp\t108367\n兴欣\t108368\n万王钱\t108369\n很错结\t108370\n熊心\t108371\n和平主义者\t108372\n粉粉们\t108373\nHC版\t108374\n无没有\t108375\n黑猩猩\t108376\n大耳牛\t108377\n美越日韩武装军演四海\t108378\n无可恋\t108379\n林宇瀚\t108380\n用责问\t108381\n还宝\t108382\n罵罵\t108383\n版权经纪人\t108384\n尿不湿\t108385\n额淘宝\t108386\n着火\t108387\n齐某\t108388\n香港特别行政区\t108389\n爱拍拍贷\t108390\n萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌\t108391\n奇异果\t108392\n400块\t108393\n我笔\t108394\n超大\t108395\n490亿美元\t108396\n杨春明\t108397\n不受死\t108398\n132页\t108399\n不在于\t108400\nhi秘\t108401\n人心叵测\t108402\n幼常\t108403\n恒盾\t108404\n棒付\t108405\n三十一岁\t108406\n哎呀骗你了我认识你了都萌萌哒\t108407\n这边人\t108408\n班度\t108409\n孙女儿\t108410\n强忍\t108411\nSUPAR\t108412\n猪啊狸\t108413\n大天黑\t108414\n干嘛嘛有\t108415\n吴麻豆\t108416\n鱼塘\t108417\n胡宇屯\t108418\n盆浴\t108419\n团订\t108420\n说了拜拜拜拜拜拜拜拜\t108421\n春虫虫\t108422\n班门弄斧\t108423\n一月哈哈\t108424\nbagisit\t108425\n15983314655\t108426\n周青云\t108427\n小鸭子\t108428\n万有引力\t108429\n破案率\t108430\n张籽沃\t108431\nofer\t108432\n身身\t108433\n报父母\t108434\n周俊丞\t108435\n幸事\t108436\nhgghdujg\t108437\n15017909352\t108438\n贾姐\t108439\n梁鑫悦\t108440\n周周芸熙\t108441\n不不不你混蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋呢洪蛋蛋蛋蛋蛋蛋蛋蛋蛋\t108442\n7种\t108443\n一起萌萌哒\t108444\n摩玛利米兰\t108445\n七撒\t108446\nwwso\t108447\nHugschick\t108448\n马凡氏综合症\t108449\n7科\t108450\n886LLLi生哗啦侧答结局爷儿\t108451\n沉思\t108452\n意见数\t108453\nssi\t108454\n大S\t108455\n中宣\t108456\n猜口\t108457\n万好帅\t108458\n枪杀\t108459\n相信你幽\t108460\n轻度失忆症\t108461\n猜叻\t108462\n储存罐\t108463\
n呵明\t108464\n猪门\t108465\n吴思琦\t108466\n贫贱夫妻百事哀\t108467\n大J\t108468\n大H\t108469\n二漫\t108470\n466727183\t108471\n大p\t108472\n慕尼黑\t108473\ntfdogs\t108474\nａｖ\t108475\nａｑ\t108476\nohyeah\t108477\n幸福的滋味\t108478\n快乐合成器\t108479\n船巷\t108480\n欧耨\t108481\n代扣\t108482\n九分之四两次\t108483\n扫描仪\t108484\n何物\t108485\n在不ω\t108486\n10点10分\t108487\n合拢\t108488\n许家印\t108489\n刺伤\t108490\n宋茜\t108491\nyuud\t108492\n言情类\t108493\n维珍尼亚\t108494\n下一个故事\t108495\n好队\t108496\nblz05\t108497\n137093949054\t108498\n惯例\t108499\n合拍\t108500\n施行\t108501\n44444￥\t108502\nhihihi\t108503\n3touc\t108504\n3科才\t108505\n到岸\t108506\n胫甲\t108507\n17078574303\t108508\nGDBDFXGGG\t108509\nnnnnnxbcjxhbbdbxbbebhdxnbfjxbfuhf\t108510\n死你妈个逼你给我找个人工\t108511\n大气球\t108512\n羨慕\t108513\n业多\t108514\n霍美慧\t108515\n李小红\t108516\n梦想传花\t108517\nlablates\t108518\n上上上上上\t108519\nfat\t108520\n占位\t108521\nobksujf\t108522\n人人性\t108523\n牛杂\t108524\n886886\t108525\n41方\t108526\n余秋雨\t108527\n公房\t108528\n1644675754577664646545464644646464\t108529\n本周日\t108530\n砌仰\t108531\n毛卷卷\t108532\nFUULAY\t108533\n穹顶\t108534\n唾骂\t108535\n毛某某\t108536\n两年之内\t108537\n你不你你你你你\t108538\n新天学校\t108539\n458\t108540\nGhfny\t108541\n各你\t108542\n昌星\t108543\n羽翼\t108544\nfgcxfj\t108545\n王晨\t108546\n肚兜\t108547\n啊宣儿\t108548\n一起去哪里\t108549\n深思熟虑\t108550\n宣言\t108551\n嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪嘟咪\t108552\n唉错\t108553\n开庭\t108554\n第三讲\t108555\n开店\t108556\n鲍龙\t108557\n房车\t108558\n撒手\t108559\n珞珞\t108560\nfaf\t108561\n彭彩湖\t108562\n负度\t108563\nhjjjjjhjhhhhjbkbplllljlhohllljklllmlkkhjhjhhggggggjgjgytyrtftfvjgkgjhhjhhhhhhhhghyuhhghhghgdf\t108564\n吶\t108565\nEvan\t108566\n期刊\t108567\n喝行\t108568\n我自个\t108569\n告诉我要\t108570\n吴\t108571\n手舞足蹈\t108572\n摩的九失利\t108573\n453\t108574\n49年\t108575\n寒星\t108576\n别再说\t108577\n技嘉\t108578\n十六天\t108579\n416名\t108580\n我是你的拜拜\t108581\n开心再一那你还毛\t108582\n广西\t108583\n曾红\t108584\n淫家\t108585\n闹笑话\t108586\n吭\t108587\n汉化工口\t108588\n重要的人\t108589\n冰水仙\t108590\n懂不知道\t108591\n宁武县\t108592\n6月10日前\t108593\n抱个如来\t108594\nsyzz\t108595\nssb\t108596\n中国男足\t108597\n一百二十多\t108598\n李龙大\t1085
99\n面包机\t108600\n史林萍\t108601\n6885555656825\t108602\n我是你的月妹妹\t108603\nNGTC学GIA\t108604\n张琳丹\t108605\n十一点\t108606\n额恩恩\t108607\n11日上午8点\t108608\n涡喷发动机\t108609\n郭章玉\t108610\n才才才\t108611\n还热\t108612\ngdjhxn\t108613\n魅静\t108614\n莫真人\t108615\nihf\t108616\n王悦轩\t108617\n哈日个\t108618\n杰出\t108619\nRon\t108620\n64848\t108621\nsrruu\t108622\n做作笑\t108623\nyfxg\t108624\nJeanne\t108625\n摩天轮\t108626\n简短\t108627\n长江商学院\t108628\n掉了没有\t108629\n五六十年\t108630\n9494e十四\t108631\n冷子轩\t108632\n裸辞\t108633\n四泡\t108634\n机师傅\t108635\n四叔\t108636\n150点\t108637\n陈词\t108638\n吴章银\t108639\n車後\t108640\n第二块\t108641\n丁师姨\t108642\nRock设计师\t108643\n白肉\t108644\n卡伦山\t108645\n跑来跑\t108646\n删小心\t108647\ne友\t108648\nichhabeeindoch\t108649\n晚清变局与民国之乱\t108650\n爱意\t108651\n我的恨人\t108652\n意见稿\t108653\n佳绩\t108654\n欧四夫\t108655\n凤翔县\t108656\n真爱情\t108657\n很开爱\t108658\n屌毛\t108659\n全息\t108660\n明显\t108661\n零四\t108662\nqrehw\t108663\n徐再阶绍狗\t108664\n烧坏\t108665\n装给\t108666\n问我告诉你\t108667\n明明\t108668\n不算数\t108669\n猫者\t108670\n1976年\t108671\n为早夭\t108672\n明星\t108673\n两筒\t108674\n发行价\t108675\n老字号\t108676\n13cm\t108677\n暖男我喜欢\t108678\n两筐\t108679\ngdh\t108680\n无线路由\t108681\n一小会\t108682\n业务\t108683\n好乖孩纸我叫乖孩子\t108684\n05米\t108685\n真心觅爱本无错，误尽青春总是情\t108686\ngjtmtjg\t108687\n声语\t108688\n唉别找别找别惹我没和你\t108689\n天天有喜\t108690\n吇\t108691\n不错挺好玩儿\t108692\n当家人\t108693\nfnmmm\t108694\n较宜\t108695\n含有\t108696\n专业课\t108697\n鲍政宇\t108698\nkty\t108699\n食草\t108700\nktu\t108701\nktv\t108702\n本周年\t108703\n堪忧\t108704\njhpgk\t108705\nktl\t108706\n声词\t108707\n文并茂\t108708\n自去\t108709\nktd\t108710\nbeby\t108711\n觉世\t108712\n改建\t108713\n799977\t108714\n某日\t108715\nvhshd\t108716\n倪雨茵\t108717\n我去哪儿歌声声线乒\t108718\n家主人\t108719\nmethere\t108720\n總\t108721\n殴液\t108722\n灰飞烟灭\t108723\n40该\t108724\n查雨涵\t108725\n呢乖\t108726\n不宣\t108727\n霜状\t108728\n吹灭\t108729\n吹火\t108730\n小豆幂\t108731\n45p\t108732\n重试业\t108733\n祖母\t108734\n雷祖殿\t108735\n嫩脂\t108736\n摄影史\t108737\n长山人\t108738\nffdreegdfsdfgc\t108739\n女贝男贝\t108740\n贴子\t108741\n韩贵人\t108742\n籃\t108743\n80厘米\t108744\n嗯嗯鬼来了\t108745\
n英格兰青年队\t108746\n三自然\t108747\nDDDDDDXDD\t108748\n梨园快递\t108749\n接机\t108750\n1182188\t108751\n厚黑学\t108752\n好吾\t108753\n神剑\t108754\n聚份\t108755\n小常识\t108756\nwstantaxcn\t108757\n亲爱的我想你\t108758\n再利用\t108759\n猪头猪\t108760\nspendic\t108761\n一部\t108762\n新开辟\t108763\n东倒西歪\t108764\n龙梓华\t108765\n流度\t108766\n偶巴\t108767\ninour\t108768\n行遍\t108769\n二二零一零六一九九零零一零八二零一二\t108770\n好可伯\t108771\n学文雅\t108772\n去天\t108773\n密达\t108774\n移动充电宝\t108775\n宅子\t108776\n两百两百两百\t108777\n李鹤轩\t108778\n唉老子\t108779\n皮帽\t108780\n赏月\t108781\n改变自己\t108782\n很多倍\t108783\n王文浩\t108784\n度秘密度度秘密度密度密度木秘秘\t108785\n鼓不困\t108786\noaie\t108787\n殷海光\t108788\n158名\t108789\n殷佳楠\t108790\n牛佳乐\t108791\n去夜\t108792\n舍利\t108793\nq100\t108794\n第六句\t108795\n腹式\t108796\n猫小仙\t108797\n慧眼识珠\t108798\n总产值\t108799\n找三吗\t108800\n好听\t108801\n发家致富\t108802\n火柴棍\t108803\n三天后\t108804\n我也不晓得一起来你为什么宁做我把小梦法\t108805\n数语\t108806\n方中\t108807\nDontyou\t108808\n门铃\t108809\n别错过\t108810\n约克\t108811\n宋哥哥\t108812\n妙龄\t108813\n圈地基建\t108814\n1is\t108815\n娱乐室\t108816\nfx30y5\t108817\n月月月月月月月月\t108818\n家港市\t108819\n前三天\t108820\n方丈\t108821\n籦\t108822\n韩乐桐\t108823\n投币\t108824\ndryfh\t108825\n为了中国梦\t108826\n三点三点四点\t108827\n攰\t108828\n茵茵\t108829\n木钱\t108830\n雷伊冰\t108831\n638\t108832\ndryff\t108833\n小一k\t108834\n700名\t108835\n推荐员\t108836\naaJz\t108837\n无解\t108838\n何兵\t108839\n张照\t108840\n千百年来\t108841\n开开始\t108842\n海甲勇士之补觉吗铠甲勇士之捕将\t108843\nxx94\t108844\n井远欣\t108845\n动物类\t108846\n上海中医药大学\t108847\n慢观\t108848\n一秘呜呼\t108849\n无觉\t108850\n无视\t108851\n望望\t108852\n陈穷\t108853\n无见\t108854\n一三只\t108855\n开心说话片开心\t108856\n比林甸\t108857\n永远不会\t108858\n不出所料\t108859\n呆呆呆呆\t108860\n东山\t108861\n在这样的话\t108862\nlpbjtlpi\t108863\n精锐\t108864\npolor\t108865\n湊\t108866\n呢饭\t108867\n一三号\t108868\nmomoda\t108869\n叽萝\t108870\n交集\t108871\n1381381438\t108872\n蹦蹦\t108873\n俊婷\t108874\n脑袋\t108875\n陶坊\t108876\n皮球\t108877\n蕾蕾\t108878\n130多块\t108879\n360君\t108880\n眉目\t108881\n笑死人\t108882\n多理\t108883\n卢先锋\t108884\n放鸽子\t108885\n介要\t108886\n嗯嗯嗯嗯嗯嗯嗯求你的姐姐人是一个大坏蛋啦啦啦啦太阳啦啦啦啦啦啦啦啦\t108887\n我没必要呵呵你\t108888\n刘欣\t
108889\n刘欢\t108890\n25P构图\t108891\n福中\t108892\nmgtpmgtjpgugjgkmjtjpgadpjmagmpmjhadpgadpamdapdapdad\t108893\n识别性\t108894\n有线台\t108895\n心踏实\t108896\n柏刚\t108897\n罗洪亮\t108898\n厌奶\t108899\n安鑫丑\t108900\npornhub\t108901\n无偿\t108902\n曹诗佳\t108903\n绿爱\t108904\n展鑫\t108905\n劲铂\t108906\n大凯\t108907\n一万一万一千\t108908\n阳狮锐奇\t108909\nhxiz\t108910\n4轮\t108911\n拍成\t108912\n义乌\t108913\n试试试试\t108914\n建瓯\t108915\n8767540585575788558\t108916\ntopopm\t108917\n昏聩\t108918\n3304549114\t108919\n知足为\t108920\n大明湖\t108921\n目光\t108922\n2周后\t108923\n嘎查\t108924\n每周一天\t108925\n划进\t108926\n哥哥你在哪儿你在哪儿\t108927\n三周岁\t108928\n说拜拜拜拜\t108929\n72小时\t108930\n别老度秘\t108931\n喔妙买\t108932\n押解\t108933\n正时\t108934\n新年你给我送包\t108935\n凡公历\t108936\n伯永\t108937\n13711210007\t108938\n雅鲁巴嘎雅巴嘎巴嘎\t108939\n永远永远是\t108940\n在家宅\t108941\n世纪金源\t108942\n我走了我个人你我讨厌你\t108943\n刘立华\t108944\n宝存\t108945\n取浴\t108946\n可信度\t108947\ndmdm\t108948\n2b度秘\t108949\n两军\t108950\nggyih\t108951\n罗伊李\t108952\n525255525252525\t108953\n乱咒\t108954\n其极\t108955\n张家杰\t108956\n15290架次\t108957\n奇怪君\t108958\n打雪仗\t108959\n宁夏那五十岁我要和你处\t108960\nghkggjjgg\t108961\n其果\t108962\n塞瑞\t108963\nTdugxg\t108964\n谢谢你了我干师娘\t108965\n诉称\t108966\n树脂\t108967\n私怨\t108968\nnij\t108969\nnih\t108970\nnii\t108971\n羊头\t108972\n抱了你好\t108973\nnig\t108974\n告麻烦你\t108975\n八八六二百五个\t108976\n公分\t108977\n好我很听话\t108978\n别原谅\t108979\n曾晋秋\t108980\nnip\t108981\n38度\t108982\n吧啦\t108983\nnit\t108984\n大意思\t108985\n用心算\t108986\n狗食\t108987\n白简竹\t108988\n俄罗斯央行\t108989\n淘金子\t108990\n绑定\t108991\n泡打粉\t108992\n粘胶短纤公司\t108993\n书信\t108994\n巴拉巴拉巴拉巴拉拉拉\t108995\n比喻句\t108996\n蒸腾\t108997\n和谈\t108998\n温敏伊\t108999\n打肿\t109000\n四百块\t109001\n绿植\t109002\n御剑\t109003\ning明恩咯落偷香窃玉\t109004\n电据\t109005\n死飞\t109006\n善用\t109007\n睿睿\t109008\n踢皮球\t109009\nallllloom\t109010\n林一成\t109011\n一大桶\t109012\n汤猫咪\t109013\n运费团\t109014\n大帅锅锅锅锅\t109015\n隆恩\t109016\n陈乃滢\t109017\n拜山\t109018\n红薯杆\t109019\n志在必胜\t109020\n多鹤\t109021\n53m\t109022\n中华网\t109023\n李剑\t109024\n老浓\t109025\n啤酒\t109026\n我负\t109027\n纬线\t109028\n说病倒\t109029\n微云\t109030\n专制主义\t
109031\n关门打狗\t109032\n大麦\t109033\n利害\t109034\n甚为\t109035\nFggh\t109036\n痛惜\t109037\n穿\t109038\nvds\t109039\n3..359..69..5..5..6\t109040\n八顿\t109041\n做飯\t109042\n艾伯特二世\t109043\n沉跟\t109044\n愿意想\t109045\n八页\t109046\n大麻\t109047\n作业君\t109048\n丁刀口号角的事实书实伞兵家\t109049\nhi皮波c塔塔塔塔塔塔塔塔\t109050\n大阿九\t109051\n空乘笑话\t109052\n82.05%\t109053\n绍兴\t109054\n537\t109055\n乃霖\t109056\n535\t109057\n534\t109058\n531\t109059\n530\t109060\n调皮调皮\t109061\n王珞童\t109062\n呼呼了\t109063\n打点\t109064\n538\t109065\n登斯楼\t109066\n莘苦\t109067\n来不及说\t109068\n干性\t109069\n切鬼\t109070\n小度嗎\t109071\n下周六年\t109072\n启完\t109073\n5tidgjixidydiyrfctxdkfuozukyffcljkl\t109074\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t109075\n8438\t109076\n升学\t109077\n堤防\t109078\n无字碑\t109079\n不要了　叫　廾对昍明天\t109080\n悲秋\t109081\n说的是第一世界第一不是你你是世界\t109082\n卢思意\t109083\n助理医师\t109084\n张敬豪\t109085\n縻膊昏\t109086\nvox6\t109087\n好好好好\t109088\n新势力\t109089\n小龙酱\t109090\n七厘米\t109091\n文天\t109092\n随便儿\t109093\n果茶\t109094\n耻辱圈\t109095\n昙女\t109096\n郑明\t109097\n心丫\t109098\nplu\t109099\n守门\t109100\npls\t109101\n哥点\t109102\n艾普\t109103\n你美你怎么美你帅\t109104\n哈宝\t109105\n栖身\t109106\n黄鸡\t109107\nx5max\t109108\nple\t109109\n吴侬\t109110\n158007436\t109111\nx10个\t109112\nDhRf\t109113\n星饭\t109114\npll\t109115\nplk\t109116\nplj\t109117\n脾温\t109118\n嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯\t109119\n偏移\t109120\n可会\t109121\n李村\t109122\n福字儿\t109123\n来不然的话\t109124\n重庆\t109125\n死去吧你是我\t109126\n江小忆\t109127\n孤枕奈\t109128\nxuabcxufxxn900\t109129\n忽然间\t109130\n度公公\t109131\n福才\t109132\n二二六\t109133\n浪花儿们\t109134\n笑逐颜开\t109135\n谭嘉利\t109136\n一了有\t109137\nMY_DEER\t109138\n算起来\t109139\n周陈杰\t109140\n莫丽灵\t109141\nhoulity\t109142\nabcabca\t109143\n参考文科\t109144\n给我找到\t109145\n美迪\t109146\n夸状\t109147\n2acu\t109148\n蓝盆\t109149\n烟台市\t109150\n二二元\t109151\n解人\t109152\n邵云华\t109153\n案犯\t109154\n多额凤飞飞\t109155\n谋财\t109156\n华贸\t109157\n0469\t109158\n呢拜拜\t109159\n波斯美\t109160\ngcjv\t109161\n异曲同工2自圆其说3可圈可点4一五一十5口是心非6心直口快7无与伦比8啼笑皆非9里应外合10三姑六婆11五音不全12两面三刀13一塌糊涂14多此一举16天方夜谭\t109162\n烧麦行\t109163\n90分钟\t109164\n泰安迎春学校\t109165\n精灵梦叶罗莉\t10
9166\nsnowy\t109167\n金字\t109168\n铃铛\t109169\n金子\t109170\n9月18日下午\t109171\nchkhjvljgljvljkgjf\t109172\n结释\t109173\n雅美蝶\t109174\n年秘\t109175\n繁蛋\t109176\n射阳县\t109177\n哈家\t109178\n回信\t109179\n一四秀\t109180\n卢嘉丽\t109181\n几此而已\t109182\naob类\t109183\n助学\t109184\n时妻\t109185\n维黛\t109186\n猪儿粑\t109187\n熊二熊\t109188\nTWB\t109189\n欧子愉\t109190\n火箭头\t109191\nuuujtoo煎饼果子\t109192\n马班长\t109193\n托人\t109194\n姓吴带\t109195\n丁俊阳\t109196\nfyrtryfgffgdhdhfyfhfhhdurgfgdhf\t109197\n齐蕊\t109198\n杜丽万灵\t109199\n5颗\t109200\n一切顺心\t109201\n鸳鸯问问学院\t109202\nsxhsasxi\t109203\n辣文\t109204\n年梓轩\t109205\n从此\t109206\n衷不改\t109207\n藤原浩\t109208\nLljjgjjjjppttpjpjpjpjpwttgdjtmKlujjgjjgjpmatJenkjpgiw\t109209\n美榴\t109210\n只要你的秘秘杀\t109211\n79四澳元\t109212\n大乐斗\t109213\nm123456\t109214\n大功率\t109215\n冰毒\t109216\n根正\t109217\n安卓\t109218\n借我点钱花花\t109219\ngcgbjxxfbn\t109220\n哑铃员宫\t109221\nvcdg\t109222\n118940\t109223\n点留\t109224\n东京代代木补习学校\t109225\n么币\t109226\n佩戴者\t109227\n金秀炫\t109228\n1387753\t109229\n哼高\t109230\n广州塔\t109231\n嗯512cleovc\t109232\n乔心\t109233\n要你\t109234\n48445\t109235\n19262584\t109236\n社会主义市场经济体制\t109237\n下下下下下下下下下下\t109238\n我喜欢行\t109239\n呵摸\t109240\n武训传\t109241\n拖斗车\t109242\n说句话\t109243\n懈怠\t109244\n这一觉\t109245\n小杜宾\t109246\n海珠\t109247\n兜车河\t109248\n85548451515\t109249\n第五条\t109250\n12zzzz\t109251\n要位\t109252\n老男婆\t109253\n夏若男\t109254\n13112255211\t109255\n药用\t109256\n停更\t109257\n13002952812\t109258\n嗯嗯嗯嗯猪猪侠\t109259\n昔昔\t109260\n无法代替\t109261\n163x198\t109262\n谢沛颖\t109263\nym吧影湾仔\t109264\n午美\t109265\n不能说\t109266\n手指甲\t109267\n置疑\t109268\n肉汤\t109269\nzxfgytwlsisbufk13154831364\t109270\n运动度\t109271\n苏新淳\t109272\n自由派\t109273\n四斗士\t109274\n自疗\t109275\n一四摔\t109276\n4548988868556555556898989689\t109277\n定点\t109278\n上野御徒町\t109279\n2.0万名\t109280\n确无人\t109281\n官里的妈妈小女妈妈就是说我大四女\t109282\n林志颖\t109283\n王微博\t109284\n618几\t109285\n任艳羽\t109286\n真的好可耐\t109287\n冠状动脉\t109288\n旅行社\t109289\n耶欧耶欧耶欧耶耶耶耶\t109290\nghggh\t109291\n优抚\t109292\n左氧氟沙星\t109293\n关于\t109294\n关亍\t109295\n4470\t109296\n异世界\t109297\n里仁\t109298\n吕安珍\t109299\n我爱你
顿每到吗塔度秘iloveyou度秘了了了了大大度秘hello我爱你度秘\t109300\n好多元\t109301\n咯词\t109302\n左龙\t109303\nlmbf\t109304\n寰宇\t109305\n食的成语疯狂猜成语\t109306\n赵亚丽波\t109307\n我喜欢和爸爸\t109308\n巨组\t109309\n169125\t109310\n六二零\t109311\n落罗嗦\t109312\n帘落\t109313\n烤全羊\t109314\n沒關係\t109315\n饿大今大\t109316\n7740852\t109317\n刘慈欣\t109318\n朴瑾惠\t109319\n喜酒呗\t109320\n海选\t109321\n41分\t109322\n快白\t109323\n习李\t109324\n阿律律\t109325\n似水\t109326\n好呀你快点\t109327\n破壞\t109328\n微软联盟\t109329\n海通\t109330\n两个九\t109331\n林俊杰\t109332\n直落林\t109333\n来了撒\t109334\n终极武器\t109335\n356789\t109336\nall露兔兔V5\t109337\nsplasio\t109338\n22公里\t109339\nlove蜜\t109340\n桐月咩\t109341\n海逸\t109342\n招派\t109343\n陆dr\t109344\n黑猫兢\t109345\n整蛊\t109346\n晓玲\t109347\n食与家\t109348\n好了里了了\t109349\n奥宝宝\t109350\n津津\t109351\n检查员\t109352\nfjgcujfbkgdcbmkfeweyopexvbkysxvgddfffdf\t109353\nnimolijidingn\t109354\n鞋纸\t109355\n杜奶奶\t109356\n夏梦寻\t109357\n小卫卫\t109358\n快乐板\t109359\nglibac\t109360\n格力高\t109361\n宅舞\t109362\n20亩\t109363\n六小王\t109364\n爱我你就亲亲我爱我你就抱抱我爱我你就夸夸我爱我你就王\t109365\n孙大姐\t109366\n秘冰泉\t109367\n我讨厌你的妈呀\t109368\nhttps\t109369\nMolehill45kolbeagggjgjgjmjp0try5572541\t109370\n汤粉\t109371\nrufjgk\t109372\n96点\t109373\n无法抗拒\t109374\n星星眼\t109375\n幸福36计\t109376\n想了你\t109377\n黄埭\t109378\nclub\t109379\n优酷微\t109380\n借姐\t109381\n大了就好\t109382\n老醋花生米\t109383\n我不喜欢你我讨厌你有在不\t109384\n心心吗\t109385\n背景\t109386\n变自恋狂爱\t109387\n淫民\t109388\nclut\t109389\n萨克买地克\t109390\nISE安德6\t109391\n翟兆瑞\t109392\n康美红\t109393\n57千米\t109394\n伏仪\t109395\n大大兴\t109396\n黄致列\t109397\n天使替我去爱你\t109398\n屌屄屌屄屌屌\t109399\n祈祷\t109400\n上浮\t109401\n阳间\t109402\n洗洗睡\t109403\n+善存\t109404\n复杂处\t109405\n黄埔\t109406\n姬泽昆\t109407\n分完\t109408\n999666\t109409\n~张\t109410\n北京国际展览馆\t109411\nResort\t109412\nj2968698538\t109413\n噜噜\t109414\n04年\t109415\n查理芒格\t109416\n正负面\t109417\n不能见\t109418\n服徒\t109419\n120512\t109420\n小姨们\t109421\n酷鼠\t109422\n50000亿\t109423\n澳大利亚\t109424\n照胜同\t109425\n盆儿\t109426\n朱裴徐\t109427\n许剑\t109428\n影视\t109429\n姜叫\t109430\n你是谁你是谁你是\t109431\n吃堂\t109432\n183510489\t109433\n文昌航天城\t109434\n乌兰图雅\t109435\n张坷\t109436
\n当场\t109437\n柳树\t109438\n分家\t109439\naof\t109440\naod\t109441\n467850\t109442\n来话\t109443\n夸讲\t109444\naoo\t109445\naon\t109446\n阴兵\t109447\n下属\t109448\n周恩红\t109449\n白孑画\t109450\n收于\t109451\n天天想你\t109452\nstouto\t109453\n好生\t109454\n對喜歡\t109455\n说的呀什么叫\t109456\n下层\t109457\n天子峰子\t109458\n下屏\t109459\naoy\t109460\n周恩赐\t109461\n008877665544332211\t109462\n度秋\t109463\n枪神纪\t109464\n度秆\t109465\n管弦呕哑\t109466\n唔会吧\t109467\n客体\t109468\n度私\t109469\n来说\t109470\n收亲\t109471\n度秘\t109472\nN多各色胶囊\t109473\n收人\t109474\n淋浴房\t109475\n耐打\t109476\n来着哦就是我的少女时代\t109477\n闰月生\t109478\n共植\t109479\n心坎儿\t109480\n12456879\t109481\n永远挺\t109482\n跨界\t109483\n哼小不点你还都不够我\t109484\n孟楼曹\t109485\n榻\t109486\n飞来横祸\t109487\n杜哥哥\t109488\n途游\t109489\n女神思密达\t109490\n1830\t109491\n1836\t109492\n人选\t109493\n两千位\t109494\n羊癫\t109495\n灯火辉煌\t109496\n受不锈\t109497\n场边犬吠啼\t109498\n赫塔\t109499\n琉星\t109500\n2400\t109501\n张娜拉\t109502\n赵丽爽\t109503\n孙菲\t109504\n小沈阳\t109505\n近一小时\t109506\n发行车\t109507\n武岳\t109508\n小雅蜘蛛侠\t109509\n13513552116\t109510\nYRYGRYUDU\t109511\n好地方\t109512\n热炕头\t109513\n变变\t109514\n老醋花生\t109515\n7月31日\t109516\n寒光\t109517\n录音带\t109518\n行孙\t109519\n前尘\t109520\n戳货\t109521\n神牛把\t109522\n晨依依\t109523\n你好女孩\t109524\n创作集-音乐旅行者\t109525\nkang\t109526\n红虎\t109527\n榨\t109528\n对一下\t109529\n浮出\t109530\n2206212356\t109531\n如实说\t109532\n停泊\t109533\nGguxfvhh\t109534\n香香香香香香\t109535\nghbch\t109536\n风景照\t109537\n山清泉\t109538\n剑齿虎\t109539\n200余起\t109540\n班集体\t109541\n第二三四五六七八九十三四五六七八九十一二三四五六七八九十一二三四五六七八九十一二三四五六七八九十一二三四五六七八九十一\t109542\nvfxz\t109543\n夏文品\t109544\n电子数\t109545\n天理\t109546\n傻子鬼\t109547\n我是你最爱的人\t109548\n度秘g\t109549\n90厘米\t109550\n能动\t109551\n帅位\t109552\n江财\t109553\n三周多\t109554\n林伊然\t109555\n柯南VS鲁邦三世\t109556\n你好女子\t109557\n等份\t109558\n度秘我讨厌ikon\t109559\n看到哪儿\t109560\n天坛大佛\t109561\n拉卡蒙\t109562\n即可怜\t109563\n十二立方厘米\t109564\n那你的咋了冰糕巧乐吱\t109565\n榜\t109566\n我那个你是谁你是\t109567\n涵秘\t109568\n还是一哪里\t109569\n好讨厌你好讨厌你好讨厌你好讨厌你好讨厌你我讨厌你\t109570\n百货商店\t109571\n药品\t109572\n油麻地\t109573\n阿马尔菲塔诺\t109574\n傻话\t109575\n为什么不相信\t109576\n牛能
反演\t109577\n入市\t109578\n刘徐徐\t109579\n嗯九成\t109580\nwkbd\t109581\n皮箱\t109582\n算亲\t109583\n度小度度\t109584\n宝库\t109585\n去斑\t109586\n足智多谋\t109587\n麦角\t109588\nyourself\t109589\n公主片\t109590\n11：30\t109591\n单水\t109592\n鄢陵博\t109593\nkuvunr\t109594\n无精打采\t109595\n通知了\t109596\n28282828\t109597\n主教练\t109598\n1485185360\t109599\niy\t109600\nrorolu\t109601\n几嬲\t109602\nxuz\t109603\n肉麻\t109604\n做什么\t109605\n劳杰豪\t109606\n证人录\t109607\nthat\t109608\n二级片\t109609\n3.3%\t109610\n普拉特·惠特尼\t109611\n李国峰\t109612\n先入为主\t109613\n死王八好不咯\t109614\n片头曲\t109615\n机人\t109616\n顶顶\t109617\n因为你\t109618\n良多\t109619\n死胡同\t109620\n成都高新技术开发区\t109621\nthan\t109622\n要许\t109623\n慕容玥\t109624\n34点\t109625\n舅母\t109626\n百位学学\t109627\nvvffdgw\t109628\n三角天\t109629\n概\t109630\n三角头\t109631\n连锁店\t109632\n来乐下\t109633\n顺差\t109634\n亲一个亲一个亲一个亲一个亲一个亲一个亲一个一个气\t109635\n咧斯偶\t109636\n家庭教育\t109637\n挽词\t109638\n2692799140\t109639\n并蒂莲\t109640\nmaopqing\t109641\neQ2\t109642\n营养餐\t109643\n人机片\t109644\n名利场\t109645\n毌\t109646\niswhy\t109647\n1DJ\t109648\n王明川\t109649\n在异乡\t109650\n聪明管理\t109651\n多米高\t109652\n都接接接接接接接接接接接接接接\t109653\n王晓英\t109654\n孤行\t109655\njssbs\t109656\n旧闻\t109657\n蔡宗荣\t109658\n帷幕\t109659\n小地气\t109660\n壮烈\t109661\n杨我先\t109662\n8点37\t109663\ni会\t109664\n西王\t109665\n重新吗\t109666\n知恩\t109667\n自然资源\t109668\n准率\t109669\n95分钟\t109670\n噩噩噩噩噩\t109671\n坏蛋我是好蛋好不\t109672\n毫不\t109673\n朵朵\t109674\n6565558656568\t109675\n吕鹏杰\t109676\n858股\t109677\nTon\t109678\nloww\t109679\n显摆\t109680\n花花姑娘\t109681\n四十四块\t109682\n偷乐\t109683\n秦佳\t109684\n交界地区\t109685\nprbs\t109686\n小雪球\t109687\n女佣\t109688\n无人区\t109689\n栋一\t109690\n全院\t109691\n歪歪扭扭\t109692\n赫赛汀\t109693\n马尔斯得图少女宝庆我吃了台\t109694\n握爪\t109695\n绿巨人3\t109696\nib93\t109697\n原是\t109698\n15158539285\t109699\n瓦拉索\t109700\n主要你老\t109701\n取功名\t109702\n红庄小学\t109703\n吐求\t109704\n就怕你儿饿我儿\t109705\n206年\t109706\nFhjkbx\t109707\n毡\t109708\n柏文艳\t109709\n旅游胜景\t109710\n中邪\t109711\n吃想了吧\t109712\n中邮\t109713\n太爱我太爱好爱你\t109714\n美喜\t109715\n李胜利\t109716\n卡忙\t109717\n灵p0\t109718\n试老师\t109719\n桥面\t109720\nahhhzhbwn\
t109721\n二小\t109722\n重出江湖\t109723\n尤耿\t109724\ndometimetimeime\t109725\n80几\t109726\n刑诉\t109727\n山椒\t109728\nFeliz\t109729\n呼啸\t109730\n夏满芒\t109731\n呼啦\t109732\n动火\t109733\n糗事吗我不肤浅你肤浅度秘我爱死你\t109734\n锡解虎\t109735\n嗯恶\t109736\n装麻烦\t109737\n半席\t109738\n90分\t109739\n12252122255\t109740\n月尾\t109741\n后辈\t109742\nKsJsKaKah\t109743\n出去红之间\t109744\n拉长\t109745\n王8\t109746\n公主犬\t109747\nt66ys\t109748\n极为\t109749\n推拉\t109750\nRDJ\t109751\n说由乃\t109752\n孙乐哥\t109753\n451515157615511535155115133315232553253554\t109754\n挥滤自如\t109755\n后边\t109756\n低垂\t109757\n诚邀塔\t109758\n暴躁堂\t109759\n還做飯\t109760\n99句\t109761\n家园儿\t109762\n缕缕缕缕缕缕缕缕\t109763\n五彩缤纷\t109764\n查收\t109765\n23岁度\t109766\n全国人大常委会\t109767\n前天中午\t109768\n再见呆呆\t109769\n周玉涵\t109770\n男共妻\t109771\n友友们\t109772\n留名\t109773\n衣襟\t109774\n道德败坏\t109775\n星雪\t109776\n来首歌大王叫我来寻山\t109777\n斯图\t109778\nGive\t109779\nDhdt\t109780\n奥特尼克\t109781\n我是的我是火星人\t109782\neeseea\t109783\n女佛\t109784\n遥看瀑布挂前川\t109785\n十国度\t109786\n棉棉\t109787\n2003年\t109788\n街道\t109789\n欢声\t109790\n来吧来吧快到城\t109791\n5nd音乐网\t109792\n顾飞\t109793\n一千e\t109794\n单条目\t109795\n哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t109796\n钢球\t109797\n韩小乖\t109798\n565585268568468538238535254265654503464267165\t109799\n银杏林\t109800\n做死\t109801\n石板\t109802\n浅黄色\t109803\n刘亦凡\t109804\n圣埃蒂安\t109805\n躯壳\t109806\n烈风天翼\t109807\n1334567\t109808\n对呀开心\t109809\n王大大\t109810\n还不够格\t109811\n240亿美元\t109812\n款式\t109813\n高强度\t109814\n亿馍\t109815\n余咪咪\t109816\n新京报\t109817\n江浦街道烟草局\t109818\n砍树\t109819\n鬼迷心窍\t109820\n房主\t109821\ntfbomi\t109822\n李子球\t109823\n工能\t109824\n临县\t109825\n双庙\t109826\n二鬼\t109827\n如不如\t109828\n穆帅\t109829\n蔡舒玲\t109830\n隆力奇111对号入座咯七情六欲\t109831\n房中\t109832\n茅老\t109833\n微吧长微博\t109834\n湾湾\t109835\n1991年12月15日\t109836\nktjdpdjpjdjmtjp7\t109837\n搞活\t109838\n龙龙胎\t109839\n房丝\t109840\n房东\t109841\n侯泽祥\t109842\nhttphhiphotosbaiducomxiaodupicitem2934349b033b5bb5fed40d8a31d3d539b600bc46jpg\t109843\n开学会\t109844\n鳝鱼\t109845\n警局\t109846\n逗儿\t109847\n哼傻子\t109848\nUFU\t109849\n办好办\t109850\n黑人\t109851\nkt猫\t109852\n人身\t109853\n背背对背\t10
9854\n饱胀感\t109855\n3332522532323233\t109856\n为何人\t109857\n加班儿\t109858\n耷柒\t109859\n水道\t109860\n村民\t109861\n红烧红烧\t109862\n沈阳市\t109863\nThereare\t109864\n黑云\t109865\n微信支付\t109866\n奇闻\t109867\n三同\t109868\n三名\t109869\n三合\t109870\nbcd27\t109871\n参谋长\t109872\n心肚子\t109873\n黑了\t109874\n举目\t109875\n灵果\t109876\n透红外滤光镜\t109877\n陪伴\t109878\n爽美\t109879\n逃向\t109880\n一张画\t109881\n一层层\t109882\n真幸\t109883\n街学\t109884\nYupmjga\t109885\n咔嘭咔\t109886\n真广\t109887\n富华\t109888\n一起来看点聊\t109889\n董子琪\t109890\nknthj\t109891\n老前辈\t109892\n上歌\t109893\n透下\t109894\n街子\t109895\n五联\t109896\n马路度\t109897\n跳椇\t109898\n女度\t109899\n对有\t109900\n小人头\t109901\n返航\t109902\n幼鸽\t109903\n石酸钠\t109904\n政德大道\t109905\nq宝\t109906\n长人儿\t109907\n支队\t109908\n忌日\t109909\n暗香\t109910\nia\t109911\nGzjzl\t109912\n婆欧\t109913\n南方都市报\t109914\n上一天\t109915\n凤保期\t109916\nci\t109917\n一个一份\t109918\n面对\t109919\n42564\t109920\n温故而知新\t109921\n梁星宇\t109922\n本爷\t109923\n谢晓达\t109924\n太阳款\t109925\n甘藏春\t109926\n小头鬼傻子你好\t109927\n邻家花\t109928\n汉主妇神探\t109929\n绮千\t109930\nthinking\t109931\ncl\t109932\n英偶\t109933\nvffhffd电视多天一\t109934\n五六七八九十十一十二十三十四十五十六十七十八十九二十\t109935\n和你的朋友\t109936\n大次康\t109937\n16点05分\t109938\nlsevenmity\t109939\n7800\t109940\nJZJJF\t109941\n权威感\t109942\n嗯奥特曼\t109943\n李老师\t109944\n一班罗\t109945\nrrryuutee\t109946\n所以说\t109947\nadDd\t109948\n花毒\t109949\n报铺\t109950\njkkll\t109951\n曾经的你吉他\t109952\n一几倍\t109953\n张洁\t109954\n求同存异\t109955\n海耶斯\t109956\nwdwc\t109957\n宋爽\t109958\njshsks\t109959\n那边儿\t109960\n央视少儿频道\t109961\nSLAMDUNK\t109962\n两载\t109963\n温华\t109964\n拟人化\t109965\n不不不就不就不就不就不就不就不就不就不问\t109966\n赵光敏\t109967\n张洪\t109968\n美国白宫\t109969\n两轮\t109970\n厌识\t109971\n未提\t109972\n15605818239\t109973\n环岛路塔头别墅\t109974\n宝贝宝贝我爱你\t109975\n一语道破\t109976\n是不让我理你啊我是多嘴多时了对\t109977\n棒状物\t109978\n中午12时30分\t109979\n决斗\t109980\nfgrfffgff\t109981\nabccom\t109982\nsbnnnngfcvv\t109983\n大学生创业\t109984\n12年前\t109985\nhishitsortsmal\t109986\n皮实\t109987\n迷彩服\t109988\n数了数\t109989\n猪肉馅儿\t109990\n这东西\t109991\n白破疫苗\t109992\n川剧院\t109993\n二两个\t109994\njsjdjshshs\t
109995\n天乐化仔\t109996\n川震区\t109997\n纽带\t109998\n臼\t109999\n臻\t110000\n二二幺零\t110001\n致\t110002\n255255255255\t110003\n至\t110004\n赵忠祥\t110005\n噬\t110006\n1933\t110007\ntdz\t110008\n臭\t110009\n1937\t110010\n大粗\t110011\n1934\t110012\n江雨\t110013\n僵尸来了嗯嗯\t110014\nccav\t110015\nd3d\t110016\ntdt\t110017\n臟\t110018\n叶南迪\t110019\n张可\t110020\n飞行员\t110021\n整懂\t110022\ndiism\t110023\n哇伊\t110024\n如泉涌\t110025\n臊\t110026\n臉\t110027\n臂\t110028\n工工工\t110029\n臀\t110030\n小艾\t110031\n你爱我吗你爱我你就包包我\t110032\n啊您们\t110033\n小艺\t110034\n无刷\t110035\n好冷\t110036\n小艳\t110037\n匚\t110038\n黄凌馨\t110039\n盯起\t110040\n90年1月1日\t110041\n货亲\t110042\nStorm\t110043\n田家庵区\t110044\n邹依芯\t110045\nmiysg\t110046\n噻\t110047\n瑞瑞\t110048\n亲爱滳\t110049\n景岛\t110050\nGec\t110051\n耿哥刚\t110052\n哭了你说\t110053\n酸酸乳\t110054\nhhgsg\t110055\n36张\t110056\nvmbnnnmmnnnbnnNmnmb\t110057\n脏字儿\t110058\n铁锈红\t110059\n救生艇\t110060\n幻影\t110061\n告诉别\t110062\n曾婷妃\t110063\nccttc\t110064\n古教堂\t110065\n园林集团\t110066\n订季\t110067\n美妞可乐\t110068\n死皮\t110069\n倾角\t110070\n种式\t110071\nJ\t110072\n陈聪丽\t110073\n4元\t110074\nzzzzzzzzzzzxxxxxzzzzzzzzzzzzz\t110075\n14点30\t110076\nhfgg\t110077\n度秘文\t110078\n没臭我很喜欢\t110079\n噜噜咪\t110080\n干嘛那你和一样的话\t110081\n拿风来说\t110082\n六三一百零二\t110083\nGeh\t110084\njggd\t110085\n王语纯\t110086\n你好美\t110087\ndffdgggg\t110088\nfsg\t110089\n铁柜\t110090\n努力喜羊羊你是女爷们\t110091\nurineig\t110092\n行车记录仪\t110093\n火焰山\t110094\n双喜临门双喜临门\t110095\n积木\t110096\nfhjhhk\t110097\n阿墨\t110098\n7个工作日\t110099\n袁几乎\t110100\n烧醒\t110101\n七新彩\t110102\n雅柏那\t110103\n犯犯\t110104\n下腰\t110105\n社会风气\t110106\n辩论\t110107\n朱鑫博\t110108\n下腹\t110109\n4天前\t110110\n秦赢稷\t110111\nw猫\t110112\n八面吧\t110113\n相对湿度\t110114\n小妹儿\t110115\n你是我的小兔慢慢我是你的姐姐\t110116\nc#\t110117\n自驾车\t110118\n损坏\t110119\n三类\t110120\n近水楼台先得\t110121\n不走你走\t110122\n18点52\t110123\n念错者\t110124\n三米\t110125\n轻懂\t110126\n成都军区总医院\t110127\n我喜欢的人\t110128\nbarorz\t110129\n银兴\t110130\nbyebye度秘\t110131\n有点意大\t110132\nfsx\t110133\n5发次\t110134\n海米\t110135\n很多很多年\t110136\n刘鑫蕊\t110137\n机器人片\t110138\n雯姐姐\t110139\n率众\t110140\n数模\t
110141\n建设路\t110142\n欲女\t110143\n无了因\t110144\n民股\t110145\n旺仔小馒头\t110146\nhiahiahian\t110147\n吴乐瑶\t110148\n雅和\t110149\n好还哈喽咯唾液女模特letouyunyunyunyunyun\t110150\n快摩\t110151\n假不过\t110152\n一今天\t110153\n觉觉喜羊羊\t110154\nfhbv\t110155\n银元\t110156\n呀k7k\t110157\n记下\t110158\n懊悔\t110159\n卿卿我我的\t110160\n我喜欢兔\t110161\n中央民族大学\t110162\n我喜欢光\t110163\n莲\t110164\n印香篆模\t110165\n改天\t110166\nvuvcs\t110167\nBobDzzzzzxzzzzzzzzzzxnxzzzzzzzzxznz\t110168\n粗话你片来总\t110169\n力距\t110170\n勇十一点那你说的话\t110171\n一一一一一\t110172\nxpmjtjmtm\t110173\n宋姓\t110174\n宋姐\t110175\n巨蟹女速配\t110176\n個泡\t110177\n我喜欢兵\t110178\nuriane\t110179\n恪尽职守\t110180\n松开\t110181\n童书\t110182\n八亿九千八十八万\t110183\n王悛凯\t110184\n高风险\t110185\n流直下\t110186\n小杂感十五\t110187\n黄棒\t110188\n小狼小狗小时小老虎小兔子了小狗\t110189\n得的话\t110190\n焉陵\t110191\najjl6\t110192\n豪烈\t110193\n蒋俊杰\t110194\n缺失\t110195\n美国环太平洋\t110196\n20：40\t110197\n马宇旭\t110198\n叶诺在建业诺\t110199\n砌墙\t110200\n奥马尔\t110201\n7885955\t110202\nv斗地主费\t110203\n愁绪抽象\t110204\n听到\t110205\n市市\t110206\n抒情歌\t110207\n坐厕\t110208\n鸳鸯\t110209\n成道\t110210\n初教\t110211\n听别\t110212\n水珠涵\t110213\n各尽\t110214\nＧ232次\t110215\n济治\t110216\n叫爱\t110217\nextstep\t110218\n恩蒽\t110219\n朴善子\t110220\n非洲\t110221\n沙样\t110222\n29P\t110223\n惠家福超市\t110224\n小小小小小小小小小小小度秘\t110225\n制暖\t110226\n蓝娃娃\t110227\n呱呱呱呱呱呱呱呱呱呱呱呱呱呱\t110228\n我川美壹\t110229\n呃呃呃我是天涯我是的我在那里小跳舞跳\t110230\n怎么回事\t110231\n祝玉树\t110232\n765533299yt\t110233\nidontno\t110234\nDhsjjs\t110235\n失职者\t110236\n高度\t110237\nDaughter\t110238\n第一声\t110239\n第四篇\t110240\n沈佳\t110241\n坐怀不乱\t110242\n天然呆你家明\t110243\n说你萌萌哒\t110244\n桃花运\t110245\n迷底\t110246\n大在1\t110247\n老奸\t110248\n通行\t110249\n老好\t110250\n宋维新\t110251\n老女\t110252\no68\t110253\n老奶\t110254\n丁佳乐\t110255\nmpmjtj0wkptdpka\t110256\n我一句话\t110257\n和战\t110258\n福建东北部\t110259\n别宁宁\t110260\n伊底斯\t110261\n通街\t110262\n吴晓进\t110263\n舍普琴科\t110264\n贾亮\t110265\n一四八十一十二二七九\t110266\n结绳\t110267\n25857675377628553762765276\t110268\n好吃吃吃吃吃吃吃吃吃吃\t110269\n凌志文\t110270\n老套\t110271\n算不算不算\t110272\n49488468\t110273\ndbsdbbs\t110274\nMIFFY\t110275\n愚弄愚弄\t110276\n赛尔号
之猎天困兽\t110277\n秘密秘你猜我是男是女\t110278\n再读一再读\t110279\n掌柜\t110280\n病例\t110281\nluse\t110282\n警示灯\t110283\n露来斯图米提\t110284\n历次\t110285\n刘树超\t110286\n失足\t110287\n害死\t110288\n再喝\t110289\n再喜\t110290\n城里们\t110291\ndfgsuf\t110292\n钤印\t110293\n车毁\t110294\n不能不不能\t110295\n呼吸器\t110296\n小曲缩句\t110297\n莽\t110298\n772110\t110299\n恩恩恶恩\t110300\n采取行动\t110301\n腿肚\t110302\n綜藝\t110303\n林逸\t110304\n钝角\t110305\n一箱\t110306\n宋慈\t110307\n不走下去\t110308\n你是我的小呀小宝贝\t110309\n零六零\t110310\n主力军\t110311\n彭艺娇\t110312\n二十七宰只\t110313\n黄埔然\t110314\n又说好\t110315\n活神经\t110316\nGgddft\t110317\n压尼\t110318\n颜文字谁萌\t110319\n175688969085866479688582585\t110320\n歧欤刘。瀕\t110321\n南雅\t110322\n瓦岗军起义\t110323\n四书\t110324\n你好呀土豆萝卜兔\t110325\n柴川建\t110326\n四九\t110327\n权臣\t110328\n韩昭仪\t110329\n都晓杰\t110330\n二零二七零\t110331\n本宫赐你的问题愁\t110332\n四么\t110333\n南雨\t110334\n5880\t110335\n464690\t110336\n大学姐\t110337\n说谎\t110338\n呢绒裤\t110339\n蔡主席\t110340\n焦某某\t110341\n潘集区第二十小学\t110342\n不用棒\t110343\n天下父母心\t110344\n多间\t110345\nYucca\t110346\n柯林\t110347\n徐大夫\t110348\nDEUn\t110349\n幺九七七零二\t110350\n挞救救我行\t110351\n超舒服\t110352\n星了了我爱\t110353\n寇月霞\t110354\n不束手就擒\t110355\n早上三点半\t110356\n一汽夏利营\t110357\n宏式\t110358\n8月份\t110359\n我的我要你爱劲我以后你是我的\t110360\nhttpqianxunbaiducommoviecard6541htmlfdumi\t110361\n美味不设防\t110362\n好好不好好不好好不好好不好\t110363\n马皮\t110364\njhgfyvs\t110365\n单宁镇\t110366\n屁屁者\t110367\n看房\t110368\ne12271\t110369\n369770037\t110370\n防长\t110371\n凯空\t110372\n永耶\t110373\n生娃\t110374\n宝贵\t110375\n怎么么\t110376\n跳蛙\t110377\nG-BOOK\t110378\n雪糕\t110379\nLittle\t110380\nggfccgvvg\t110381\n周瑞发\t110382\n白茫茫\t110383\n大作战\t110384\n火山公园\t110385\n曹明钧\t110386\n看成\t110387\n航天纪念币\t110388\n3114\t110389\n刘秋丽\t110390\n三种物\t110391\n莂\t110392\n看戏\t110393\n年龄板\t110394\n挺住\t110395\n想吃饱了\t110396\n吴川\t110397\n宝贝\t110398\nege\t110399\n檀国\t110400\n慧雅\t110401\n美尔雅集团公司\t110402\n艾兰图\t110403\n蔷薇公主\t110404\ncchg\t110405\ncanyousing\t110406\n最高额\t110407\n葫芦鱼\t110408\n游戏的人民路上小心哈喽美女好的吧好的吧好的吧嗯嗯好的好的好的好的好的好的好的好的好的\t110409\n烈士陵园中心\t110410\n，，，，，，，\t110411\n郑吴杰\t110412\n性别不明的人\t110413\n西游记之三
\t110414\n英雄气短\t110415\n无名氏吧度秘\t110416\nSYPER\t110417\nAON\t110418\n裤头\t110419\n打出血\t110420\n人力资源\t110421\n高阳镇\t110422\n佳仁和\t110423\n美韩\t110424\n俱忘\t110425\nhdieD\t110426\n写对\t110427\n没名\t110428\n栽赃\t110429\nssss\t110430\n一分半\t110431\n噤觉\t110432\nmyfi\t110433\n撒撒撒撒撒撒飒飒\t110434\n过道馆\t110435\n氨基酸类神经递纸\t110436\n正反七\t110437\n相信爱\t110438\n长臂\t110439\n学习效率\t110440\nTHANKYOU\t110441\n咱也不知道\t110442\ncandoit\t110443\n应收账款\t110444\n关我了事\t110445\n反向\t110446\n下一周\t110447\n人民解放軍大反攻一周年形勢圖\t110448\n搭桥\t110449\nvkkh\t110450\n我喜欢感觉\t110451\n好笑脸\t110452\n陈美洁\t110453\nevena\t110454\n不适宜见\t110455\n一世一个\t110456\n一键启动\t110457\nball\t110458\n客户体验\t110459\n哇塞神龟\t110460\n鹿乡\t110461\n崩乱\t110462\n一二章\t110463\n三甄嬛传\t110464\n我的父亲\t110465\n了了了了了了了了了了了了\t110466\n大肆宣传\t110467\ntolstimpls\t110468\n剥粒\t110469\n玉米蛇\t110470\nTtyl8rjadmjggmdjj\t110471\n余云龙\t110472\n二二零二二零六\t110473\n女大男\t110474\n桑树\t110475\n胎教网\t110476\n伊居然\t110477\n四十千克\t110478\n好呀你错\t110479\n1885年\t110480\n马学成\t110481\n三里屯儿\t110482\n老王\t110483\n爱在\t110484\n一一四\t110485\n大幕\t110486\n艺术\t110487\n衫剛\t110488\nhowareyou\t110489\n汉克斯\t110490\n叮咯\t110491\n叮咬\t110492\n消防员\t110493\n求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你了求你\t110494\n15246015874\t110495\n大干\t110496\n叮咚\t110497\n扶倒\t110498\n大年\t110499\n抠门儿\t110500\n蔡馨慧\t110501\n99年\t110502\n阿素\t110503\n气愤\t110504\n手拉\t110505\n凯旋\t110506\n亲爱新大头儿子小头爸爸\t110507\n李翰祥\t110508\n1001万次\t110509\n558771\t110510\n离管\t110511\n黄岩\t110512\n军牌\t110513\nbanan\t110514\n红孩儿\t110515\n脱机\t110516\n哩哩啦啦\t110517\n自提的呀\t110518\n糾結糾結糾結\t110519\n姆斯\t110520\n蚌埠市\t110521\nyesldoldont\t110522\n134块\t110523\n1999年10月20日\t110524\n加景湾\t110525\n指天\t110526\n腾讯网\t110527\n966多少\t110528\n522529198112192626\t110529\n玉函\t110530\n听说过\t110531\n会别\t110532\nOK啦咯弄哈how行hmmm哦oil\t110533\n四叶草们\t110534\n歌义乌师诗\t110535\n思锐\t110536\n吐字\t110537\n我不可能\t110538\n绒球\t110539\n鼬君\t110540\n汤圆子\t110541\npleasebe\t110542\n四十七块\t110543\nFrankie\t110544\n再说一笑\t110545\n不赖呆\t110546\n钢琴会\t110547\n再来一个更搞笑\t110548\nMikeTy\t110549\n数学卷\t110550\n初原\t110551\n苏留庄\t110552\n第4季\t11
0553\n5102\t110554\n个别\t110555\n你好无聊问你\t110556\nstd\t110557\nste\t110558\nstf\t110559\n无敌光轮护佑\t110560\n黄豆芽\t110561\n莉优\t110562\n经得住\t110563\nsto\t110564\n聪明爱我的度秘原谅我\t110565\n苗王寨\t110566\nstv\t110567\nstr\t110568\nsts\t110569\nAAAAA\t110570\n肝脾\t110571\n警号\t110572\n政才\t110573\n谢慧欣\t110574\n人脉\t110575\n周宇阳\t110576\n88by\t110577\n对的我\t110578\n石头书\t110579\n口弹头\t110580\n人化\t110581\n31根\t110582\n官富\t110583\n赵婷婷\t110584\n寿山石\t110585\n金明洙\t110586\n篁彳\t110587\n人脑\t110588\n带鬼带鬼\t110589\n郑州万达影城\t110590\n开春\t110591\n录取线\t110592\n聊天天炫斗\t110593\n三妻\t110594\n三妹\t110595\n真真真真\t110596\n病符\t110597\n魔塔\t110598\n有别说\t110599\n人脸\t110600\n罗马界墙#Limes\t110601\n参考资料\t110602\n唉呦我的会\t110603\nWith\t110604\n720p\t110605\n小军\t110606\n麻辣度秘我爱上你\t110607\n粉色\t110608\nwath\t110609\nwate\t110610\n千手观音\t110611\n58888岁\t110612\n那悠\t110613\n周滨\t110614\n二十多2\t110615\n钢管舞\t110616\n宽度\t110617\n那那\t110618\n下午花\t110619\n晓轩\t110620\n寒潮\t110621\n253216190\t110622\n60亿美元\t110623\n斯斯文文\t110624\n马圆昊\t110625\n葡萄籽\t110626\n参拜\t110627\n谢坤城\t110628\n毛钱\t110629\nYY宫斗\t110630\n狂流\t110631\nImzhang\t110632\n错不三不四哈哈\t110633\n推诿\t110634\n些许多少\t110635\n有期保重\t110636\n猩球战斗机\t110637\n多一岁\t110638\nTciftgugft\t110639\n吴莉莉\t110640\n4788885825195588558589656558899800888\t110641\n拉布拉多\t110642\n你的爱情侣们\t110643\n凉拌\t110644\n吊水湖\t110645\n万科公司\t110646\n饶赦\t110647\n本美\t110648\nggkfzjodio\t110649\nSidebarsidea\t110650\nDDGJ\t110651\n零五分\t110652\n我能做你的主人么我问你哪电影小达人度秘\t110653\n允浩君\t110654\n蓝儿\t110655\n月披星\t110656\nDDGD\t110657\nDDGG\t110658\n10010\t110659\n皮革\t110660\n阿西多\t110661\n受帅\t110662\n租船\t110663\n小冉\t110664\nRghjdd\t110665\n委员长\t110666\n一句话片\t110667\n这点话\t110668\n进制\t110669\nGuys\t110670\n唱反调\t110671\n麦基\t110672\n右转\t110673\n临战\t110674\n脏话\t110675\n顾美丽\t110676\n敏感\t110677\n临阵\t110678\n2011年7月13日至19日\t110679\n套图\t110680\n四处\t110681\n9一百百两\t110682\n正秋币\t110683\n书海\t110684\n静下心\t110685\n延边\t110686\n成品\t110687\n四头\t110688\n直重\t110689\n烟花\t110690\n四天\t110691\n5555444\t110692\n翔少\t110693\n崔立华\t110694\n四大\t110695\nhuuif\t110696\n呃呃度秘\t110697
\n我等你等\t110698\nqqqqhkkxotiu11\t110699\n瞀\t110700\n强迫症\t110701\n瞄\t110702\n瞅\t110703\n充胖\t110704\n收发\t110705\n收取\t110706\n蒋拜拜\t110707\n邹智荣\t110708\n瞒\t110709\n瞐\t110710\n小友\t110711\n李思阳\t110712\nYOKA移动放心#孩儿面\t110713\n女优\t110714\n小伙伴们\t110715\n女会\t110716\n瞧\t110717\n飞地上\t110718\n瞥\t110719\n瞪\t110720\n翻制\t110721\n瞬\t110722\n瞭\t110723\n太可塔\t110724\n看不见你疯了\t110725\nHendrixLicensing\t110726\n豚\t110727\n不必须\t110728\n度那\t110729\n机巴\t110730\n血染\t110731\n秦紫萱\t110732\n豉\t110733\n不稀饭\t110734\n青阳洞\t110735\n机巧\t110736\n弟弟们\t110737\n豆\t110738\n豸\t110739\n豹\t110740\n当道\t110741\n一号度秘\t110742\n穿点\t110743\n张媚媚\t110744\n豫\t110745\n豬\t110746\n真慢\t110747\n象\t110748\n风波必来论\t110749\n吊顶\t110750\n划句\t110751\n地中海\t110752\n因明\t110753\n郏县\t110754\nFLAC\t110755\n行问\t110756\n山蜡梅\t110757\n这刻\t110758\n祝大禹\t110759\n600张\t110760\n还林杰森\t110761\nww点红姐\t110762\n20一下\t110763\n听不懂你的话\t110764\n菊科\t110765\npapapa\t110766\n王子和王子和王子和王子和王子和王\t110767\n去哪儿网\t110768\n李森\t110769\n洋蛋\t110770\n唱空\t110771\n澳柯玛仓库\t110772\n白术果\t110773\n朋NN\t110774\n1647825245231\t110775\nkiop\t110776\n生活感\t110777\nkiou\t110778\n兒子把\t110779\nAcDEFG\t110780\n死不怕\t110781\n尼玛让我情何以堪\t110782\n噜啦啦baby\t110783\nStanding\t110784\n树木园\t110785\n找碴\t110786\n挂号\t110787\n皇太极\t110788\n20多种\t110789\n症状\t110790\n三湾\t110791\nM男\t110792\n山那边\t110793\nkesome\t110794\niwjejucip20riwpodjf110\t110795\n东京都部分I的经典记事本素四季豆\t110796\n35656554\t110797\n音乐灯\t110798\nssfssdfff\t110799\nhgvvnsntvupktltjpwjpjg0tjtjmwjtjuaj\t110800\nnvvhhbbb\t110801\n五万八千九千八百九十九四千九百三十三\t110802\n乖乖我的好乖乖\t110803\n兔子女人\t110804\nxoyd\t110805\n志心\t110806\n王汉英\t110807\n皮套\t110808\n上院\t110809\n啦啦啦我喜欢\t110810\n丁巳日\t110811\n肛泰君\t110812\n不省力\t110813\n叶儿\t110814\n晚片\t110815\n陈浩奇\t110816\n狗欣\t110817\n冒人\t110818\n湾赵\t110819\n逼不打\t110820\n屏障\t110821\n浦江五中\t110822\n昌北机场\t110823\n互动有礼#\t110824\nhbb\t110825\nDvshd\t110826\n忘了你\t110827\n八十位\t110828\n埃影城\t110829\n多拉\t110830\n泥底\t110831\njdjjjjn\t110832\n122期\t110833\n叙事\t110834\n3568657\t110835\n假有一天\t110836\n民国初\t110837\n短点\t110838\nFAB\t110839\n芭比公主\t11
0840\n飞虎神鹰\t110841\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t110842\npktgaO\t110843\n哇美\t110844\n洗涤\t110845\nhbz\t110846\n金苹果\t110847\n机器人粹\t110848\n胸花\t110849\nhbu\t110850\n洗液\t110851\n周星宇\t110852\n1233345666\t110853\n丑八怪大坏蛋\t110854\nshispen\t110855\n山西话\t110856\n囊性\t110857\n阿克苏\t110858\n你是笨笨的度米小乖乖\t110859\n朵米拉\t110860\n追责\t110861\n行效\t110862\n烟雾弹\t110863\n催\t110864\nvya\t110865\n宏村\t110866\nvyg\t110867\n很遗憾\t110868\nvyj\t110869\n后天晚上\t110870\n红砖\t110871\n想不发\t110872\n英朗\t110873\n69.9亿元\t110874\n名车志\t110875\n411\t110876\n长安意思就是你猜你猜\t110877\n814810\t110878\n谭红怡\t110879\n转非\t110880\n石家庄市\t110881\n伊斯兰会议组织发明声明\t110882\n你的性\t110883\n极速贷款\t110884\n全数\t110885\n世界佛教\t110886\n牙龈红肿出血\t110887\n杨其\t110888\n全敏\t110889\n苍井空仓\t110890\n点火\t110891\n秃体\t110892\n乐清市人民法院\t110893\nyrdfh\t110894\n热敷\t110895\n认识到底\t110896\n怎么了好想你不睡觉的好想你好想你不\t110897\n骗人儿\t110898\n好像会\t110899\n额切\t110900\n門口\t110901\n祝賀\t110902\n么样\t110903\n终南山\t110904\n渝州\t110905\n冯西丽\t110906\n我的哭乐年气我\t110907\n朕龙颜大悦好退下\t110908\n孙俊杰\t110909\n2008年9月18日\t110910\n屋内\t110911\n伊瞳\t110912\n。0必安\t110913\n晚上十点\t110914\n杜景阳\t110915\n花心不花\t110916\n小猫咪的蝴蝶梦\t110917\n广海寺\t110918\n麓湖\t110919\nffygh\t110920\n卖萌识\t110921\n胃药\t110922\n好哇哈十八\t110923\n#chris\t110924\n度秘堡\t110925\n快乐度秘\t110926\n猪丫&amp\t110927\n莲花卫生院\t110928\n四四四四四四十四十四十四十四十四十四\t110929\n發他們\t110930\n雷家村\t110931\n九一点\t110932\n八十四五六\t110933\n满人间\t110934\n太的宫\t110935\n真名儿\t110936\n一两万\t110937\nsoma\t110938\n拜明天见\t110939\n巴彦\t110940\n出轨\t110941\nsome\t110942\nsomg\t110943\n舍得你走\t110944\n材质堡\t110945\n乘势\t110946\n鬼怪\t110947\nudn加工厂凤凰网\t110948\nkux\t110949\n帖家\t110950\n你好了再见\t110951\n田志刚\t110952\n列队\t110953\n五年级口算题卡\t110954\nhvjjgj\t110955\n好我没意思\t110956\n王爱民\t110957\n灭笑\t110958\n悠着点\t110959\nydfzd\t110960\n希秋\t110961\n周宇盟\t110962\n骚货\t110963\n赛琳娜\t110964\nreach\t110965\n修长\t110966\n陈允杰\t110967\n好多芬\t110968\n等你那拉\t110969\n裸照\t110970\n和恩\t110971\n追思\t110972\n00后\t110973\n奶糖\t110974\n迈出\t110975\n老品\t110976\n锻炼\t110977\nyiry\t110978\n衡阳市\t110979\n慢效率\t110980\n那
英\t110981\n停车\t110982\n5844668\t110983\nhttppinyincne17111\t110984\n老哭\t110985\n老哩\t110986\n刘子\t110987\n老哥\t110988\n智灵\t110989\n虾仁儿\t110990\n刘孙\t110991\n聊一聊聊聊聊聊聊聊聊聊\t110992\n棋势\t110993\n苗刺死案\t110994\nXDDDDDDDDDDD\t110995\n最不\t110996\n踏踏实实\t110997\n最三\t110998\n易明睿\t110999\n全里\t111000\nouhgf\t111001\n放空\t111002\n二三七幺\t111003\n嘎嘎嘎嘎嘎嘎嘎嘎\t111004\n中话\t111005\n张东健\t111006\n扎大\t111007\n杨加班\t111008\n湾\t111009\n凌晨两点半\t111010\n拜拜一下\t111011\n你的厉害\t111012\n王宇轩\t111013\n发电厂\t111014\nh1b1\t111015\n爱你喜欢你\t111016\n修真者\t111017\n仙桃市\t111018\n尊师重道\t111019\n最为\t111020\n肿脸\t111021\n教学法\t111022\n16日\t111023\n手气吧新浪网旗\t111024\n死别\t111025\n泳装\t111026\n死到\t111027\n瓜呱\t111028\n土猪\t111029\n16时\t111030\n不闻不问\t111031\n江秀春\t111032\n托腮\t111033\n少年三国志\t111034\nfdc\t111035\nfdd\t111036\n死切\t111037\nfdf\t111038\n小马仔\t111039\nTwitter\t111040\nfdm\t111041\n鱼子盒\t111042\n死刑\t111043\nduhoh8fsdtjpgfmhdc\t111044\n金正贤\t111045\n寒更\t111046\nfdt\t111047\n20s\t111048\n广福39号来找我吧那就是我的家伙我是个人类你见过人\t111049\n邓小波\t111050\n星儿\t111051\nmalia\t111052\n好梦六\t111053\n書\t111054\n曹\t111055\n呃呵\t111056\n曼\t111057\n豪门\t111058\n曾\t111059\n替\t111060\n曰\t111061\n和赵\t111062\n闪亮的爸爸\t111063\n更\t111064\n巴拿\t111065\n曷\t111066\n曨\t111067\n到泪流满面\t111068\n齐殿武\t111069\n曦\t111070\n呃呗\t111071\n太渣\t111072\n密快\t111073\n曜\t111074\n曝\t111075\n李知明\t111076\ncomnnnn\t111077\n作奸\t111078\n呃呜\t111079\n炎图\t111080\n慈善基金会公司\t111081\n曉\t111082\n哼新\t111083\n呃呃\t111084\n宝安\t111085\n巴拉\t111086\n死瘪三\t111087\n创吉尼斯\t111088\n深海钰\t111089\n毛毛花花\t111090\n打气\t111091\n跳动\t111092\n有一不明\t111093\nkork\t111094\n胡度秘\t111095\n取之不尽\t111096\nkorg\t111097\nLljajjjjaaa\t111098\n有今\t111099\n唯品会好\t111100\n金立\t111101\n八月十八\t111102\nFhggff\t111103\n2厘米\t111104\n小怪兽\t111105\n省篇\t111106\n打水\t111107\n7550359\t111108\n记得\t111109\n995201314\t111110\n林母\t111111\n男爱好女\t111112\n奥德修斯\t111113\n花园城\t111114\n柳潇潇\t111115\n画鸡\t111116\n相信你吧\t111117\n金童\t111118\n男同o\t111119\n吃苦瓜\t111120\n37盒\t111121\n匆匆忙忙\t111122\n对不起我是美好我是美好看\t111123\n小小怪\t111124\n上海钢铁所\t111125\n34吨\t111126\n1205325159\t111127\n小气
\t111128\n天童寺\t111129\n美车堂\t111130\n盈富\t111131\nJX\t111132\nEFSF\t111133\n吃饭吧\t111134\n归村\t111135\n赵羽璐\t111136\n南平\t111137\n轰轰烈\t111138\n归来\t111139\n垃圾场\t111140\n要不知道\t111141\n泪奔\t111142\n性味\t111143\n我的死神\t111144\n海枯石\t111145\n方天羽\t111146\n报文\t111147\n随自己\t111148\nsyqusg\t111149\nJW\t111150\n剌激\t111151\n15039490508\t111152\n课表\t111153\n引来\t111154\n好智能\t111155\nJR\t111156\n必死无疑人不要脸天下无敌\t111157\n说不聊\t111158\neva洋洋不了天了身\t111159\n手足无措\t111160\n零二二零幺六五六三七四\t111161\n湛江市\t111162\n偶类\t111163\n12海里\t111164\ntutqx\t111165\n二十八块\t111166\n现役\t111167\n孪生\t111168\n吸油烟机\t111169\n两个照\t111170\n好找不到\t111171\n陈雪\t111172\n奥本山\t111173\n李红大\t111174\n奖话费\t111175\n贾龙\t111176\n新政\t111177\n这样式\t111178\nJJ\t111179\nMmmmmkd\t111180\n断片\t111181\n桑果\t111182\n莫文蔚\t111183\n28次\t111184\n徒秘\t111185\n折晋羽\t111186\ngjd\t111187\nU2B\t111188\ncvgdddrtyhmko\t111189\n热咖啡\t111190\n更好呀\t111191\n安全险\t111192\n这么快\t111193\n龙陵县\t111194\n垡头翠\t111195\n字法\t111196\n两天两夜\t111197\n2548565666666\t111198\n阿扎尔\t111199\n六年级了我朋大五岁\t111200\n连体\t111201\n俞伯牙\t111202\n满意贷\t111203\n欧服\t111204\n周吴\t111205\n管闲事\t111206\n三十道\t111207\n爱如\t111208\n任正\t111209\n嘉敏\t111210\n2点5分钟\t111211\n三十遍\t111212\n嘎嘎\t111213\n持之以恒\t111214\n群分物\t111215\nDevonboating\t111216\n周吧\t111217\n零拨\t111218\nJy\t111219\nfjccrgv\t111220\n卫浴\t111221\n554855\t111222\n还情有可原\t111223\nhshx\t111224\nhshs\t111225\n赌命赌命\t111226\n爱妾\t111227\n爱妻\t111228\n3094\t111229\nhgzcgfgggg\t111230\n犯罪案\t111231\n嘎嘣\t111232\nhshd\t111233\n维妙维\t111234\n堆堆\t111235\n废渣\t111236\n走一走\t111237\n小看\t111238\nhttpahiphotosbaiducomxiaodupicitemb90e7bec54e736d1671f8b7c9c504fc2d56269b1jpg\t111239\n小眉\t111240\n韶乐\t111241\n吃吵\t111242\n对呀222\t111243\nhxnnhf\t111244\n翻盘\t111245\n环环\t111246\ncouchi\t111247\n入海\t111248\njgjjatg\t111249\n零点一秒\t111250\n别胡来\t111251\nt台\t111252\n多数\t111253\n九百强\t111254\n相饮\t111255\n上塘\t111256\n吉首吧吉首\t111257\n优尔康\t111258\n葛艺\t111259\n好喜歡\t111260\n吴大\t111261\n陈陈敏敏\t111262\njdnbs\t111263\n财观后感\t111264\n猜点\t111265\ngxggcgcggdtdyfdfftctcgvgvtfgggghggvytughghvhguhyyuuygygtftyyu
ujhchuvhvjvbjbk\t111266\n说了会\t111267\n专干\t111268\nnodie\t111269\n15999\t111270\n地地沟油\t111271\n胡烨\t111272\n翔弟\t111273\n2285744029\t111274\n噶哈\t111275\n宜白大道\t111276\n骗我了\t111277\n陆晨\t111278\n疾行\t111279\n向上下\t111280\n好帅\t111281\n貴庚\t111282\nhegxhshhshdh\t111283\n陈良宇\t111284\n邹修学\t111285\nAPU#A8-4500M四核处理器\t111286\n我爱你爱的太深太深了你爱我\t111287\n纳兰词\t111288\n9464613\t111289\n冥正\t111290\n陆晗\t111291\n1月7号\t111292\n暨秀英\t111293\n22555555556\t111294\n好帮\t111295\n梅花草堂集\t111296\n和服\t111297\n本质\t111298\n那你丽丽妈咪\t111299\n逃不逃哼\t111300\n鬼畜\t111301\n有误会\t111302\n梁永琪\t111303\n十八公分\t111304\n贵台\t111305\njugttptnn156808955885\t111306\n青木木\t111307\nududj\t111308\n2874427908gjeuk5\t111309\nududf\t111310\n千楠\t111311\n黄裳\t111312\nhiu\t111313\n下午5：30-9：30\t111314\n有希望\t111315\n国学社\t111316\nRdfggdg\t111317\n巴拉小乌\t111318\n五五米\t111319\n普斯\t111320\n737分\t111321\n很问我的确\t111322\n一月一百多块\t111323\n胡杨\t111324\n假冒的我是你主人的妹妹\t111325\n很健康\t111326\n2月11号\t111327\n大多个\t111328\n秘cu\t111329\n只度\t111330\n哈彩\t111331\n秘密秘\t111332\n聊了当\t111333\nrrhhg\t111334\n孟思嫒\t111335\n价格低\t111336\n干部\t111337\n拜拜会\t111338\n保险公司\t111339\n爱你摸摸\t111340\n马心怡\t111341\n25月\t111342\n基情\t111343\n一肤堂\t111344\n八柜\t111345\n婴垣帝凤\t111346\n啦啦啦啦啦的我唱的歌\t111347\n758万元\t111348\n草稿行\t111349\n西安唱区\t111350\n概念产品\t111351\n手痒\t111352\n一千来块\t111353\nhid\t111354\n葛森\t111355\n度秘213184\t111356\n言子\t111357\n张茂\t111358\n媳妇的我一直不废话好不\t111359\n艾力克\t111360\n言字\t111361\nwahywaaastick\t111362\n温姝颖\t111363\nhig\t111364\n触动\t111365\nEGV\t111366\n同谋\t111367\n战士\t111368\nhih\t111369\n百多\t111370\n相干\t111371\n粉思\t111372\n黄启雄\t111373\n七兄弟\t111374\nEGD\t111375\n9884257\t111376\n备备\t111377\n狼来了\t111378\nhil\t111379\n额闭壳\t111380\n乳模\t111381\n百大\t111382\nhim\t111383\n认识字你可以\t111384\n是座\t111385\n宜搜\t111386\nx100\t111387\n鬼罗家臭豆腐\t111388\n18338062692\t111389\n嗯33\t111390\n西和\t111391\n见结婚\t111392\n123456799098789539572\t111393\nreasta\t111394\n六八天\t111395\n汶川大地震\t111396\noo撸撸撸\t111397\n平安夜\t111398\n吃不到\t111399\n海峡两岸关系协会\t111400\n明天七六点半\t111401\n嗯好勒\t111402\n多多少少\t111403\n沈北\t111404\n
贵生\t111405\nUPSG1\t111406\n臣姬\t111407\n米脸皮厚\t111408\n讲文明懂\t111409\n男73女\t111410\n西咸\t111411\n這條\t111412\n迷蒙\t111413\n有价值\t111414\n骗你了你骗我了+1得二\t111415\n抱不着我\t111416\nggggggfytt\t111417\n练了抬\t111418\n一针\t111419\n马佳琪\t111420\n零21天\t111421\n感冒天\t111422\nF珞00vgvvbnhcbmnjkjyhkj\t111423\n二二八三\t111424\nghfggjduuayyshu1726991\t111425\n找一个再说\t111426\n列长\t111427\n有声版\t111428\n丁雪梅\t111429\n杜雨洋\t111430\n昨天以来\t111431\n管理园\t111432\n连长\t111433\ndorothea\t111434\n挂红\t111435\n摄氏度\t111436\n你的歌声\t111437\n笑笑口\t111438\n范思维\t111439\n谭雅涵\t111440\n打窝窝\t111441\n刘浩斌\t111442\n180名\t111443\nfiomp\t111444\n笑呵呵\t111445\n贾警雯\t111446\n3446654645576\t111447\n唉原谅\t111448\n厄瓜多尔\t111449\n那样儿\t111450\n33658997269\t111451\n摇摇\t111452\n摇摆\t111453\n过不来\t111454\n不得瑟\t111455\n希希\t111456\n香格里拉大酒店\t111457\n朴歌手\t111458\n宁区\t111459\n不是我无聊\t111460\n一个三十张\t111461\n浇地\t111462\nGIRLS\t111463\n予以\t111464\n侯世奇\t111465\n对啊长\t111466\n藏地\t111467\n毛儿\t111468\nstuvwn\t111469\n推行\t111470\n纪事\t111471\n三元宫\t111472\n赵谷歌\t111473\nmkpa\t111474\n咋度\t111475\n星野亚希\t111476\n山崩地裂\t111477\n俩稿\t111478\n宁化\t111479\nGtg\t111480\nGtj\t111481\n张子煜\t111482\n铨叙\t111483\n毛嗑儿\t111484\n打不着\t111485\n优衣库\t111486\n伯爵\t111487\njjjjjgjlmdapp\t111488\n倾城之恋\t111489\n273亿元\t111490\n喂喂喂你是谁你是中国大美眉\t111491\n朋祖祠\t111492\nhifus\t111493\n曾勰\t111494\n1381364003⑵\t111495\n闵月传\t111496\n３点３８分\t111497\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t111498\n陈定\t111499\n那你到哪了行\t111500\nJJC3秀秀队\t111501\n赵文艳\t111502\n有效率\t111503\n闪避\t111504\n小山峰\t111505\n我一个他\t111506\n何德何能\t111507\n游打\t111508\n哈大喊大叫\t111509\npijhdg\t111510\n隋唐演义\t111511\n问题儿\t111512\n感觉到\t111513\n猫声\t111514\n你好叛\t111515\n四十九岁\t111516\n鄂尔多斯\t111517\n你好受\t111518\n小九岁\t111519\n你好可\t111520\n你好叫\t111521\n凌家辉\t111522\n何思诚\t111523\n智能机器人\t111524\n牢鸟\t111525\n德先大酒\t111526\nshilv\t111527\n110605\t111528\nJxbdcb\t111529\n7406\t111530\n粗盐\t111531\n咯谷雅酱\t111532\nvoocovigh\t111533\n一个四百三百\t111534\nRichard\t111535\n圆通\t111536\n郭计忠\t111537\n啊啦\t111538\n温晨阳\t111539\nma12v\t111540\n乾隆坊\t111541\n单亲\t111542\nPugh\t111543\n征战\t111544\n排骨饺子\t111
545\nneed\t111546\n天东西\t111547\n单人\t111548\n好呀还好我的都睁去年十岁\t111549\n中国商用飞机有限责任公司\t111550\nrotor\t111551\n艾滋病\t111552\n欧版\t111553\n啊啊\t111554\n800万日圆\t111555\n网神\t111556\ngdgagjt\t111557\n冷接力\t111558\n3105294390\t111559\nz545124144147412588\t111560\n嗯见\t111561\n莫西莫西\t111562\n杨晋慧\t111563\n下一步一步\t111564\n关老子屁事\t111565\n没听过人\t111566\nxlng\t111567\n麦思思\t111568\n删掉了再见\t111569\n自照\t111570\n15年前\t111571\n汤图\t111572\n19280元\t111573\n红苹果超市包子\t111574\ndosth\t111575\nFTIsland\t111576\n拟声词\t111577\n93岁\t111578\n背面上\t111579\n安和乡\t111580\n合合方\t111581\n样式\t111582\n王雨辰\t111583\n雍正通\t111584\n气恼\t111585\n醉爱\t111586\n气纯\t111587\n宋美龄\t111588\n0702\t111589\n落叶破坏你的幻想\t111590\nIam\t111591\nIan\t111592\n庸\t111593\n一百几\t111594\ndyfghugjc\t111595\n弄得\t111596\n庶\t111597\n康\t111598\n参透\t111599\n张阳\t111600\n庭\t111601\n综合医疗\t111602\n隐王窝\t111603\n度\t111604\n说了让\t111605\n句号句号句号句号\t111606\n24600元\t111607\n府\t111608\n庝\t111609\n庞\t111610\n废\t111611\n倒挂金钩\t111612\n庙\t111613\n庚\t111614\n应\t111615\n底\t111616\nghlfj\t111617\n店\t111618\n老规矩\t111619\nbhiphotosbaiducomxiaodupicitemb03533fa828ba61ee1869c5b4634970a304e595djpg\t111620\n敏敏\t111621\n库\t111622\ncreamer\t111623\n床\t111624\n庄\t111625\n陈雅欣\t111626\n庆\t111627\nffhfyyf\t111628\n王晓飞\t111629\n庁\t111630\n度娘cp啊2333\t111631\njoine\t111632\n红粉\t111633\n不不写\t111634\n陆续\t111635\n信函\t111636\n度面膜\t111637\n阿鸡\t111638\n王思静\t111639\njoink\t111640\n蛤蟆镜\t111641\n记易发\t111642\n13425600695\t111643\nNrdurruxd\t111644\n为名\t111645\n污秽\t111646\n一些半小时\t111647\n黄鼠狼\t111648\n1\\6\\8\t111649\nfortables\t111650\n赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞\t111651\nYdxvygbj\t111652\n博士学位\t111653\n帅客\t111654\n怏点\t111655\n清库\t111656\n傻b傻b傻b傻b傻b傻b\t111657\nfiidVd\t111658\n捕将\t111659\n博野\t111660\n默默想\t111661\n验证码\t111662\n7654321\t111663\n承包商\t111664\n所求\t111665\n女仆装\t111666\nt74\t111667\n萌萌宝贝123\t111668\ncomputer\t111669\n宏大\t111670\n家娃\t111671\n十不好听\t111672\n宏天\t111673\nfuguteufkswuglfsfkffsuwkdhfjgsjfjuluawdlhjpfiwkdlaidoxarjdkekxe\t111674\n阿萨德集团\t111675\n充电宝\t111676\n啊森\t111677\n那又西s\t111678\n零二二零三零\t111679\
n单膝\t111680\nxkzkzk\t111681\n小岁月\t111682\n孙志伟\t111683\n百赚利滚利\t111684\n潍坊\t111685\n仲傻\t111686\n宛平\t111687\n大卫杜夫\t111688\n蹦蹦蹦蹦蹦蹦擦擦擦擦擦擦\t111689\n啊狼国还腿还\t111690\n妖死啵啵\t111691\n中国女排\t111692\n羊度\t111693\n窃听\t111694\n这样的儿\t111695\n秘爱里\t111696\n杜泊河\t111697\n张璐瑶\t111698\nByghhii\t111699\n不善良\t111700\n没事妹妹\t111701\n内蒙华电\t111702\n小鲤鱼\t111703\n冰小宝\t111704\ngldj\t111705\n再来一今\t111706\n相亲相爱\t111707\n台博会\t111708\n說過\t111709\n不犯\t111710\n百度音乐随心听#\t111711\naqimax\t111712\n二线\t111713\n飞天飞天巴结这伙计里猛犸尼美那jj斗抽风的小樱\t111714\n唠唠叨叨\t111715\n二级\t111716\n二约\t111717\n五十七三十四\t111718\n泰妍\t111719\n朱佳欣\t111720\n小姜\t111721\n公测\t111722\n既是\t111723\nkajjdh\t111724\n过速\t111725\nGKN\t111726\n武神堡\t111727\n152824199803126615\t111728\n三十多年\t111729\n赞歌\t111730\n塔拉斯\t111731\n脱剧院\t111732\n神似\t111733\n卢明睿\t111734\n台里\t111735\n巴学园\t111736\n娃蚊\t111737\n小姿\t111738\nhi度秘雅\t111739\n杨家将女vghjjhgggggg\t111740\n张书杭\t111741\n可不可以\t111742\n网纱\t111743\n陈柯桦\t111744\nhimcuymtrucou\t111745\neukdf\t111746\n一族心\t111747\n于傲\t111748\n讲话片\t111749\n虎离山\t111750\n兜子\t111751\n新鲜一伙\t111752\n9x2y2\t111753\n趁人\t111754\n托运\t111755\nWTF\t111756\n芮衣\t111757\n机器人度\t111758\n哎伊的蛋蛋一对一拉乖\t111759\n分清\t111760\n稠度\t111761\n哦竭诚礼\t111762\n翅味\t111763\n龙溪小学\t111764\n300亿元\t111765\n云溪\t111766\n加急\t111767\nImtaller\t111768\n怪毒\t111769\nhhshzb\t111770\n监别\t111771\n姑妄听\t111772\n骤然间\t111773\nd95\t111774\n金在中\t111775\n刘京\t111776\n没心情聊\t111777\n6点50\t111778\n520集\t111779\n多美滋奶\t111780\n77774777788\t111781\n氓\t111782\nviv0x5\t111783\n看我多乖多\t111784\n微力\t111785\nggffgfddfgf\t111786\n零零秘\t111787\n李宏岩\t111788\n石鸡\t111789\nRhdahy\t111790\nh1z1\t111791\n罗涛\t111792\n床头诗\t111793\n郑国燕\t111794\n菩萨蛮\t111795\n七亿ng\t111796\no美\t111797\n和能\t111798\nttytttt\t111799\n宋俊彦\t111800\n毕业好\t111801\n真句酌\t111802\nKkkkkmjmkkkkkkkkk\t111803\nJQZ\t111804\n浮桥河\t111805\n2178544708\t111806\n期负\t111807\n我不要你了那我不要你\t111808\n宋美美\t111809\n60优惠卷\t111810\n金三胖\t111811\n呃食\t111812\n驯潮\t111813\n谢旭人\t111814\ng2家\t111815\n甚大\t111816\n副班长\t111817\n调令\t111818\no2oo2\t111819\n8888888888888888888888899999
9999999999999\t111820\n收得到\t111821\nrhFyewdx\t111822\n壹壹年\t111823\n胡颖慧\t111824\nbjdgja\t111825\n平新乔\t111826\n我不要你的我不要你这个臭\t111827\n马帅\t111828\n幺妹\t111829\nzyl\t111830\n氢\t111831\nzyh\t111832\n爱菊\t111833\n1234567899999\t111834\n很准\t111835\n佘雅雯\t111836\n包装设计\t111837\nzyr\t111838\ndghhb\t111839\n甚多\t111840\n切尔西依靠众志成城\t111841\nzyz\t111842\n美女行\t111843\n安居\t111844\n假期\t111845\n黄梓源\t111846\n椟还珠\t111847\n萍茵\t111848\nwxx\t111849\n拣到\t111850\n五华队\t111851\n投下\t111852\n氨\t111853\n情放\t111854\n假有\t111855\n恒大集团\t111856\n差一下\t111857\nkodks\t111858\n好帅呀我喜欢他\t111859\n皮秋月\t111860\n诚行\t111861\n五二房\t111862\n华北平原东南部\t111863\n招吃\t111864\n层次感\t111865\n买道\t111866\n氶\t111867\n韩安旭\t111868\n交还\t111869\n曹你\t111870\n愤世嫉俗\t111871\n卡八嘎\t111872\n胎心监护\t111873\n外音\t111874\n0点164点\t111875\n好无\t111876\nwuyrxin\t111877\n合肥八中\t111878\n琼斯本\t111879\n光绿\t111880\n好旧\t111881\n敢怒不敢\t111882\n火族\t111883\n所示\t111884\n两个爸爸\t111885\n光绪\t111886\n268个\t111887\n好时\t111888\nFashion\t111889\n浏览器\t111890\n粘连\t111891\n135元斤\t111892\nFFGD\t111893\nFFGG\t111894\n文晶\t111895\n首见\t111896\n有创意\t111897\n要不然不了\t111898\njlu\t111899\njlt\t111900\n百度糯米\t111901\n一千三百五十九八百六十多少\t111902\njlp\t111903\njls\t111904\n扭钮\t111905\n牛肉\t111906\n65555525556466005535528542805455455554554266530\t111907\ntuix\t111908\n清配\t111909\n唧唧复唧唧木兰\t111910\n文件夹\t111911\njld\t111912\n宁懂爱\t111913\njlf\t111914\n18333299850\t111915\n沙锅\t111916\ntuid\t111917\njlb\t111918\n当是我命大我只有我ok\t111919\njll\t111920\n部材\t111921\ntuin\t111922\nsuib\t111923\njlk\t111924\n牛股\t111925\n董凤霞\t111926\n槐安路\t111927\n栎鑫\t111928\n噢懂\t111929\n无一语\t111930\nLLS\t111931\nfshhfhjf\t111932\n六七十年代\t111933\n不能听歌\t111934\n5555555555\t111935\n尿精才\t111936\n牛肺\t111937\n当权者\t111938\nuuuuuuuu\t111939\n13209797637\t111940\n36036079\t111941\n佯装\t111942\n炫技\t111943\n吉祥如意\t111944\n代表我的爱\t111945\n故事书\t111946\n新年大作战\t111947\n晏维莎\t111948\n第144\t111949\n基带\t111950\n别老不想\t111951\nMicrosoft\t111952\n试饮\t111953\n猪日\t111954\n奇美\t111955\n到手袅\t111956\n真人像\t111957\n张建祥\t111958\n骂槐\t111959\n孝悌\t111960\n关西机场\t111961\
n神MS\t111962\n北北京京欢欢迎迎你你\t111963\n巧克力兔糊涂图\t111964\n咯噔\t111965\n二百五一四\t111966\n监理方\t111967\n山丁\t111968\n嗯刘傻子\t111969\n没人管\t111970\n太师\t111971\n气死你喝\t111972\n泡澡\t111973\n曾伟婷\t111974\n古诗词\t111975\n海傻子\t111976\n给我不会\t111977\n佛跳\t111978\n城镇\t111979\njjdj\t111980\nwhom\t111981\n两块儿\t111982\njjdg\t111983\n厨房\t111984\n海清美\t111985\n美文化\t111986\n58条\t111987\n很怕怕\t111988\n错啦\t111989\nnkoo\t111990\n洁净\t111991\n行不我告诉你\t111992\n神农野径禅\t111993\n车配性感\t111994\n下月起\t111995\n马丁ONE-77\t111996\n装修\t111997\n二四四六八八七二\t111998\n点读机\t111999\nsuid\t112000\n3688726689\t112001\n陶少筠\t112002\n腐男\t112003\n人工机器人\t112004\n55448552858387\t112005\n20周年\t112006\n自由对我\t112007\nurpatri\t112008\n二零三二\t112009\n住院柄\t112010\nswodkdkd\t112011\n毕达\t112012\n迫切\t112013\nhhfhfbffjdjxjxujdhdbfvtcgdudyydfegdxnnzjzjxhxhchfhhdhxhxhxhxhxvyvyjxhff\t112014\n两队\t112015\n迎义\t112016\n达旦\t112017\n一个任\t112018\n2360\t112019\n120430魏晨My\t112020\n搬度\t112021\n野蛮节快乐的乖乖的乖乖\t112022\nhhuH\t112023\n思聪思聪\t112024\nH文\t112025\n趿拉板\t112026\n休惨\t112027\n休想\t112028\n九十六剩\t112029\n49行\t112030\nggf6fdy\t112031\n折过\t112032\n月湖\t112033\noppor7s\t112034\nhhuj\t112035\n大本钟\t112036\n佛话\t112037\n祖孙\t112038\n夺取\t112039\nHH\t112040\nGgfuffvhf\t112041\nhrtg\t112042\n薇薇\t112043\n山东小开门\t112044\n别瞎\t112045\n彼\t112046\n贼贼\t112047\nMqtdvggf\t112048\n不店人事网\t112049\n桃子夏\t112050\n新产品\t112051\n流派\t112052\n雄赳赳\t112053\n尘埃漫天\t112054\n卫裤\t112055\n三八你个瘪三\t112056\n山川\t112057\n留存\t112058\n阿柏\t112059\nguoke\t112060\n强烈\t112061\n定时炸弹\t112062\n外贸人\t112063\nrudkyxh\t112064\n赞皇\t112065\n你景\t112066\n彤\t112067\n888888888888\t112068\n我喜欢欢喜佛\t112069\n小餐\t112070\n好吧代理睡觉吧\t112071\n好了我就不理你了在\t112072\n留学\t112073\n门将\t112074\n5日\t112075\n萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌萌嗯嗯萌萌妈妈妈妈妈妈妈妈妈妈妈妈\t112076\n八所\t112077\nNichkhun\t112078\nLOFT\t112079\n跟上\t112080\n28倍\t112081\n幼们\t112082\n不个用\t112083\n理我了你了想\t112084\n彭\t112085\n声色\t112086\n山中\t112087\n王渝棠\t112088\n词类\t112089\nsfshhfgdth\t112090\n麦吉雅\t112091\n苦瓜熱誇\t112092\n還行\t112093\n你你你你你你你你气死我了你真的气死我了我\t112094\n來蒸蝦\t112095\nrjdjg\t112096\nniuniuporn\t11209
7\n白羊座\t112098\n中选一个字\t112099\n幼仔\t112100\n演出\t112101\n好事情\t112102\n晃醒\t112103\n是化\t112104\n2979999889798979976\t112105\n快餐快餐\t112106\nhttprdwzcndfcp5cs\t112107\nlmaboy\t112108\n迷言\t112109\n隋雨涵\t112110\n李星蒂\t112111\n晚飯\t112112\n4公里\t112113\n极品片\t112114\n一阵阵\t112115\njewowh\t112116\n商界\t112117\n蒙住\t112118\n张靓\t112119\n学习型\t112120\n我是你的女王大人\t112121\n张伟豪\t112122\n乘离器\t112123\n4点5\t112124\n凯富\t112125\n晚风\t112126\n心窝\t112127\n陈丽宝\t112128\n华莱士单人餐\t112129\n平安你险\t112130\nAgopce\t112131\n卢神\t112132\n58一张\t112133\n密不说\t112134\n稳滚滚滚滚\t112135\n来年\t112136\nd100\t112137\n咯屏\t112138\nforce\t112139\n虾死\t112140\n海林社会\t112141\nCzhzfeCz\t112142\n中美\t112143\n菲超\t112144\n等等等等等等等等等等等等等等等不知道\t112145\n34分米\t112146\n民乐\t112147\n把事\t112148\n泄题\t112149\n吧大米\t112150\n罗雅文\t112151\n第三十\t112152\n此后\t112153\n七年后\t112154\nfxgfvgdcg\t112155\n我相信你我\t112156\n脑人\t112157\n东鹏\t112158\nlights\t112159\n马者\t112160\n我是无敌喵喵屋\t112161\n洪恩行\t112162\n斜面\t112163\n风云杀\t112164\n徒然\t112165\n小钢\t112166\n猪苓\t112167\nfrfdwed\t112168\n徐工科五院\t112169\n全款\t112170\n灭虫\t112171\n小钱\t112172\n轿车\t112173\n一声倒地\t112174\nhimyou\t112175\nLachlan\t112176\n1.3秒\t112177\n兵兵\t112178\n西沙\t112179\n喝瓶邪\t112180\nhhaggggsgsh\t112181\n救世\t112182\n转过那你我靠\t112183\n求援\t112184\n散货\t112185\n不要骗我你是电影达人了请你\t112186\n心哥\t112187\nCity\t112188\n我的小甜心\t112189\n一根杆个\t112190\n鲁花花\t112191\n火箭\t112192\n中新网\t112193\n5cdc\t112194\n求发\t112195\n阳光逗院\t112196\n海扁\t112197\n十几度睡凉席我吗富顺一精灵下\t112198\n除掉\t112199\n双节文案\t112200\n本色\t112201\n十天一百\t112202\nFAN推te\t112203\n三十一号\t112204\n许鹏翔\t112205\n沒事然後\t112206\n找我最爱\t112207\n星期几\t112208\n慈善事业\t112209\n紫竹院\t112210\n给我果\t112211\n就是我原我\t112212\n消炎\t112213\n此类\t112214\n放假日\t112215\nvvccv\t112216\nuvmcl\t112217\n异位\t112218\n影支\t112219\n小个性\t112220\n17盒\t112221\n我讨厌你我恨你\t112222\n纪念日\t112223\n金南俊\t112224\n天朝教育委员会\t112225\n经开\t112226\nsygs\t112227\n张子悦\t112228\n范娜斯\t112229\nsygy\t112230\nfdgxhjj\t112231\n第四\t112232\n敢爱\t112233\n四资再一埂烫姥建类蚤\t112234\nhenishuijia\t112235\n演员表\t112236\n造用\t112237\n杜桥\t112238\n尾度\t112239\n我理取闹\t112240\n
唐克\t112241\n山邈屪\t112242\nygg\t112243\nygf\t112244\nygd\t112245\n不定代词\t112246\nygb\t112247\n太z0qw\t112248\n一处\t112249\n周元松\t112250\n一夏\t112251\n一复\t112252\nygh\t112253\nFgggjzjvskcksvjdbkkdjcbjbdkbjbdkbdocjbksbkdbxjsusia\t112254\n奈不奈\t112255\n一外\t112256\n胎教啊一呀1\t112257\n一两百\t112258\n三叶草\t112259\n廉政\t112260\n一够\t112261\n丹江口\t112262\nygy\t112263\n一夜\t112264\n我题我\t112265\n岁月无尽\t112266\n朝天\t112267\n一大\t112268\n一太\t112269\n一天\t112270\n罗23\t112271\n小秘你在吗小面\t112272\n多偶\t112273\n攻城兽\t112274\n看笑话\t112275\nMrs哦1\t112276\n而复\t112277\n一头\t112278\n清河区\t112279\n冀州\t112280\n一夹\t112281\n过度过\t112282\narestraight\t112283\n蔡雨轩\t112284\n挥之即\t112285\nhvdgf\t112286\na5000\t112287\n倒走\t112288\n伤心爱\t112289\n皆大欢喜狐戏红尘长虹大酒店\t112290\n对句\t112291\n虚岁\t112292\n熟男\t112293\nthankyou\t112294\n二千多\t112295\n李国生\t112296\n乐峰\t112297\n孙辈们\t112298\n手课\t112299\n蝴蝶棒\t112300\n桃花眼\t112301\n对号\t112302\n开下吧\t112303\n这有钱人\t112304\n么不住\t112305\n大笨钟\t112306\n倒赞\t112307\n林一生\t112308\n吉泽\t112309\n对叙\t112310\n黄兴博\t112311\n观音姐姐\t112312\nhhfhah\t112313\nlanit\t112314\n8769\t112315\nc2\t112316\n萌萌噠\t112317\n钟银银\t112318\n书雷\t112319\n快要期末考试\t112320\nfifi\t112321\nfifh\t112322\nfifk\t112323\ndomisoga\t112324\n才五\t112325\nfifl\t112326\nfifo\t112327\n弹飞\t112328\ncjtb\t112329\n才事\t112330\nbyTK\t112331\nfift\t112332\n猪猪猪猪你\t112333\nskehy\t112334\nfifA\t112335\n推入\t112336\n留在家\t112337\n攻給\t112338\n糖稀做\t112339\nTuGjwpg\t112340\n精忠报国\t112341\n三月十九号\t112342\n1.14排名第二\t112343\nbetter\t112344\n一百年\t112345\n董银燕\t112346\n吃完饭了再聊\t112347\n穷苦\t112348\n惹气\t112349\n恩高兴\t112350\n添上\t112351\n添下\t112352\n国学堂\t112353\n叶萍\t112354\n口碟\t112355\n坏男人\t112356\n艺管\t112357\n97.3%\t112358\n宙斯兰芝\t112359\n毛老子\t112360\nwqq\t112361\n港澳\t112362\n从头再说\t112363\n从头再读\t112364\n哎呀愁\t112365\n末世女\t112366\n博大大太帮\t112367\n10000000000000000000000000000000000000000000000000000000000000000000000000000000000000个\t112368\ncvxcgcf\t112369\n我是大爸爸你为什么给他笑话\t112370\n1A\t112371\n人家里人太多了吧台词霸主题歌\t112372\n月月月月月月\t112373\n特岗\t112374\n一勿\t112375\n全裸\t112376\n不文不聊\t112377\nzagg\t1123
78\n1C\t112379\n忘给我\t112380\n叶落\t112381\n真的很好\t112382\n感觉吧废话\t112383\n戴向宇\t112384\n跪求\t112385\n喝一杯\t112386\n中国国民党\t112387\n云气\t112388\n傅叶\t112389\n岁岁年\t112390\n练一练\t112391\n8812红后我是说摆摆\t112392\n侠盗飞车三\t112393\n天网\t112394\nqjzuig\t112395\nKlaajjjttpaag\t112396\n未来世界\t112397\n等到天亮\t112398\n汉字\t112399\nhkkj\t112400\n篡位\t112401\n清雅\t112402\n6217997010003974811\t112403\nmoumake\t112404\n安得猛士兮守四方\t112405\n000日元\t112406\n雨欣\t112407\n前项\t112408\n柳永有道\t112409\nwqb\t112410\ncontent\t112411\n事儿媳妇\t112412\n五十点\t112413\n萌萌哒萌萌哒萌萌哒萌萌哒\t112414\n偏男\t112415\n我讨厌你实验你说我是淑女叔叔\t112416\n莱克勒\t112417\n49188787\t112418\n56987568045\t112419\n交易日\t112420\n做到底\t112421\n腰斩\t112422\nsandeats\t112423\n煤窑\t112424\n汪聊\t112425\n饭拍\t112426\n有一年\t112427\n1233456677980098\t112428\n朱朝康\t112429\n下一篇\t112430\n译出\t112431\n我的爱人我很爱你你爱我\t112432\n肉丸子\t112433\n精英部队2\t112434\n有很无\t112435\n干脆面君\t112436\n王晓霞\t112437\n153157928462\t112438\n逍遥游\t112439\n熊师傅商店\t112440\n夏林杰\t112441\n我是你的你的女的女\t112442\n姑vvi\t112443\n邓慧希\t112444\n1面\t112445\n哭了我讨厌\t112446\n3011626393\t112447\n17k嘶哑\t112448\nuhghggghh\t112449\n头照\t112450\n告是\t112451\n伪军\t112452\n秘弟\t112453\n20余万元\t112454\nfdwwrtt\t112455\n弄下\t112456\n弄上\t112457\n秘弄\t112458\n一见过\t112459\n秘开\t112460\ncb8065380cd791231cfa7900aa345982b2b780a8jpg\t112461\n才劲\t112462\n黄咯\t112463\n一二十号\t112464\n身份证明\t112465\n门童\t112466\n冯毅昂\t112467\n早安度\t112468\n孜然羊肉生菜包\t112469\n改装\t112470\n慈溪\t112471\n1c\t112472\n伯逼\t112473\n记错了吧\t112474\n前天上午\t112475\n露骨\t112476\n百五十\t112477\n车报\t112478\n度度度秘喜欢你\t112479\n200890689\t112480\nAnglabeab\t112481\n云水\t112482\n安小卿卿英\t112483\n115斤\t112484\n胆大妄为\t112485\n马营\t112486\n五十五万\t112487\n12点10分\t112488\n烂落\t112489\n嘎吱嘎吱\t112490\n惊艳\t112491\n玩图随风fxxx耐看\t112492\nErryiopljfds\t112493\n赤耳\t112494\ndgdbfvcbv\t112495\n过誉\t112496\n涂浩峻\t112497\n骗人机器人\t112498\n飞去哪儿\t112499\n花酒\t112500\n抵港\t112501\n早贵子\t112502\n想嫁\t112503\n航天员\t112504\n听苦\t112505\n奕迅\t112506\n王紫琴\t112507\n助手\t112508\n蓝鸟\t112509\n忘谱\t112510\n我喜欢我喜欢看人在囧途\t112511\n关贞尧\t112512\ndhrieh\t112513\n菱鲺\t112514\n嘤嘤
嘤嘤嘤嘤嘤\t112515\n就是我眼\t112516\nwiob\t112517\n春联行\t112518\n宅宅米\t112519\n圣诞大逃亡\t112520\n干什么说\t112521\n南岸\t112522\n宋凯乐\t112523\nwiow\t112524\n老欧耶斯\t112525\n凶狗\t112526\n桂凡\t112527\n立早\t112528\n吸热\t112529\n诚邀\t112530\n还读\t112531\n22015618181620456525815193586925201519540251810594282561845919595928648465\t112532\n凶狠\t112533\n109分\t112534\n吸烟\t112535\n抢购\t112536\n绞丝\t112537\n金橘\t112538\n七月三十一号\t112539\n能干嘛\t112540\n不搞\t112541\n不搜\t112542\n生在这里\t112543\n秘新杰\t112544\n2584258\t112545\nLENA\t112546\n师生恋\t112547\n苏子瞻云\t112548\n85888888888888888888888888888888888888888888888888888888888888888888888888888\t112549\n8.048\t112550\n小马哥轻松时刻\t112551\ntjejeoejne\t112552\n你好多咪你是谁\t112553\nfghhfgh\t112554\n厉行节约\t112555\n老鼠老鼠\t112556\n别套\t112557\n还算\t112558\n猜猜猜猜猜猜猜猜猜猜猜猜猜猜\t112559\n随说\t112560\n王泽龙\t112561\n你好小哪吒\t112562\n南神真帅\t112563\n很幸\t112564\n龚雅婷\t112565\n春天来了花园\t112566\n哭穷\t112567\n零零年代\t112568\n很年\t112569\n拿掉\t112570\n脸川菜\t112571\nhikif\t112572\n孤掌难鸣\t112573\n蔡健雅\t112574\n情殇\t112575\n年初七\t112576\n负心汉\t112577\n切哈哈大笑\t112578\n经适房\t112579\n真知灼见\t112580\n指鹿为马\t112581\n13762464389\t112582\n杜甫\t112583\n拜拜表\t112584\n状告\t112585\n大溪\t112586\n爱不穿\t112587\n这个月一号\t112588\n骑\t112589\n负荷\t112590\n大源\t112591\n骗\t112592\n骚\t112593\n解忧\t112594\n骞\t112595\nwodey\t112596\n小耳朵\t112597\n骂\t112598\n不想你我恨你\t112599\n骄\t112600\n么么哒我有那么那么那么大爱你爱你\t112601\n骆\t112602\n烂仔\t112603\n惶惶\t112604\n骉\t112605\n生词\t112606\n验\t112607\n生诉\t112608\n560000\t112609\n牽手杯\t112610\n采编\t112611\n烤站\t112612\nshutdownst\t112613\n填入\t112614\n王瑞香\t112615\n毫不犹豫\t112616\n歪子\t112617\nzsw\t112618\n黑木涯出版社\t112619\nFertigtell\t112620\n说错了叫\t112621\n科比哥\t112622\n碎丁\t112623\n17\t112624\n填充\t112625\n盀风乚\t112626\n神经了吗我的说英语\t112627\nshcbx\t112628\n不心\t112629\n新概念\t112630\n外项\t112631\n鹅卵石\t112632\n清华同方\t112633\n鸿毛\t112634\nnvjdtzfarch\t112635\n嗯过会儿\t112636\n自己样\t112637\n七张\t112638\nLSY\t112639\n七弦\t112640\n二二零幺七\t112641\n介价\t112642\n杏节\t112643\n实在床上\t112644\n租赁化\t112645\n王渔洋\t112646\n葫芦娃大战变形金刚\t112647\n任杰\t112648\n利威尔\t112649\n股价\t112650\n过年了再见\t1
12651\n1月6日\t112652\n股份\t112653\n写字台\t112654\n新界区\t112655\n三下乡\t112656\n合胜\t112657\n一三四四六三\t112658\n录音间\t112659\ngmhgnnebgvm\t112660\n挡脸\t112661\nppgg\t112662\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t112663\n一百二十二十八块\t112664\n外裤\t112665\n乒乓\t112666\n一百五十岁\t112667\n大大馍\t112668\nC\t112669\n于瑞波\t112670\n四公斤\t112671\n38600年\t112672\n我本\t112673\n飞度\t112674\nHALA\t112675\n请求解\t112676\n成百\t112677\n鲜亮\t112678\n奥北东\t112679\n荫护\t112680\nethnography\t112681\n点点头\t112682\n暗地\t112683\n能们\t112684\n吉大喊大叫\t112685\nFunding\t112686\n唱一差一差\t112687\n能仓\t112688\n邵师哥\t112689\nK722\t112690\n流碎\t112691\n叫不度\t112692\nc5\t112693\n没有我喜欢\t112694\n啦啦啦啦啦啦啦啦啦啦啦啦啦\t112695\n梁鸿\t112696\n修修\t112697\n李立春\t112698\n家甲\t112699\n16m\t112700\nabckf\t112701\n家男\t112702\n16k\t112703\n16g\t112704\n龙跟\t112705\nd477\t112706\nabckk\t112707\n连和\t112708\n寻求\t112709\n16y\t112710\n4635\t112711\n家用\t112712\n钢琴手\t112713\n16p\t112714\n欲仙欲死\t112715\n7千万\t112716\n联想y40钢八零\t112717\n三轮儿车儿\t112718\n五妙\t112719\n发动机\t112720\n关韩星\t112721\ncuit\t112722\n不死回来\t112723\namarter\t112724\n116\t112725\n步步高跟男\t112726\n小姗\t112727\nC1\t112728\n9卍\t112729\nCK\t112730\nCJ\t112731\n货运\t112732\nCH\t112733\nCN\t112734\nCM\t112735\nCL\t112736\nCC\t112737\nCB\t112738\nCA\t112739\nCG\t112740\nCF\t112741\nCE\t112742\n重谢\t112743\nCY\t112744\nCX\t112745\n168\t112746\n169\t112747\n得不得\t112748\n听歌刚刚哈哈好的好的好受打击次\t112749\n164\t112750\n165\t112751\nCQ\t112752\n167\t112753\n族王\t112754\nCV\t112755\n162\t112756\n163\t112757\n跌停板\t112758\nCc\t112759\nCb\t112760\n早上九点\t112761\n亲爱的我爱的嫁给我\t112762\nCf\t112763\n外协\t112764\nCd\t112765\n骗人节\t112766\nCx\t112767\nnowy\t112768\n外卖\t112769\nCr\t112770\n田径\t112771\nCv\t112772\n爱我的资格\t112773\n一个多秘\t112774\n无上自己\t112775\n论调\t112776\n保科\t112777\n你教我了求你了求你了\t112778\n沃佩\t112779\n3一5岁\t112780\n盐湖份\t112781\nNai\t112782\n电度\t112783\n余工\t112784\n港龙KA605\t112785\n准确定\t112786\n一月十块\t112787\n剽窃\t112788\n斯巴达克\t112789\n洪勤昊\t112790\n假成亲\t112791\n古力娜扎\t112792\n邱毅\t112793\n464346361\t112794\nlqp\t112795\n兰陵铁心\t112796\n116口\t112797\nlqd\t112798\n江淮
\t112799\n乱世\t112800\n鼻屎\t112801\nlqa\t112802\n苏志燮\t112803\n郭益达\t112804\n万也\t112805\n4月22号\t112806\n叶春钰\t112807\n找头\t112808\n小饿小困香飘飘\t112809\n正逢\t112810\n杨青旋\t112811\n姑爷爷\t112812\n小米人\t112813\n找大\t112814\njiajia\t112815\n拿瓦大特曼给我来\t112816\ncxdddtrt\t112817\n珏欧尼\t112818\n三路\t112819\nbutonlyme\t112820\n品牌子\t112821\nlq4\t112822\n千山涉水\t112823\n萨尼亚\t112824\n针刀\t112825\nfulness\t112826\n秘狗\t112827\n一点43点612点\t112828\n骄娇\t112829\n午品力毋坐东甜\t112830\n我真的很累不不了了吧我真的好累\t112831\n骨折价\t112832\n支支招\t112833\n报读\t112834\nfhdhgngb\t112835\n妙\t112836\n似火年华\t112837\n尚道\t112838\n润滑\t112839\n黄牛党\t112840\n麻沙\t112841\n前文\t112842\n境地\t112843\n犬夜叉\t112844\n驾崩\t112845\n沙溪交警大队\t112846\n九分之四\t112847\n扬库洛夫斯基\t112848\nOK,OK\t112849\n20分钟\t112850\n进口车\t112851\n一个心\t112852\n哈哈大我哪大坏蛋大坏蛋大坏蛋\t112853\n机器城\t112854\n贾卫权\t112855\n顿教\t112856\n月饼\t112857\n天龙明\t112858\n中华粉红丝带基金会\t112859\n明知山有\t112860\n张晨鹏\t112861\n五龙彰\t112862\n照度\t112863\n刘智\t112864\n斗室\t112865\n朱扒皮\t112866\n线面\t112867\n善良人\t112868\n刘晶\t112869\n巴拉巴拉巴拉巴拉巴拉\t112870\n号告\t112871\n刘晏\t112872\n乖乖丑\t112873\n答应我想你\t112874\nTMD傻boyi\t112875\n六世珍珠月似钩\t112876\n万泰\t112877\n开除掉\t112878\n照应\t112879\n座机\t112880\n嗯一五\t112881\n陪我了\t112882\n费维扬\t112883\n比尼美\t112884\n再快\t112885\n阖家\t112886\n进水\t112887\n结局额\t112888\n洗手液\t112889\n556054\t112890\n杨烁\t112891\n核危机\t112892\n怀什\t112893\n陆师傅\t112894\n衣食父母\t112895\nｈｅ\t112896\n偏女\t112897\n回响\t112898\n5253545556575859\t112899\n昨儿\t112900\n进民\t112901\n百部\t112902\n不明不白\t112903\n进气\t112904\n复古风\t112905\n比阔闹\t112906\n新闻场\t112907\ndutk2b\t112908\n法海\t112909\n七米点\t112910\n哥达拉斯\t112911\nPTP\t112912\n有意想\t112913\n葫萌\t112914\nI9250黑2580\t112915\n504元\t112916\nq84个\t112917\n最浪漫\t112918\n便堂而皇之保\t112919\n相只有\t112920\ngOOdb0\t112921\n私校\t112922\n住宿券\t112923\n好啦好啦好\t112924\n河蟹\t112925\n油鱼\t112926\nbuysbsth同意句\t112927\n刘嘉茵\t112928\nLinkedin\t112929\n九曲\t112930\n早来早来\t112931\n美图兔兔秘\t112932\n拾起\t112933\n阿热\t112934\n奇术\t112935\n城东区\t112936\n王彩\t112937\n你好Youmo\t112938\n紫蝶点\t112939\n上沉鱼落雁\t112940\n二不二不是\t112941\n班州\t112942\n18653045677\t11294
3\n373355345个\t112944\n山川故国正斜阳下冻雨微白轻车待发时候盈思绪处江南江北一夜飞还千里坐吴楚虚掷怕想见王谢堂前燕子来时旧相识千年醒梦归凋薜更何堪逝卷如飞镝\t112945\njessture\t112946\n16.75\t112947\n捐给\t112948\n立岸者\t112949\n偏头疼\t112950\n苏扇\t112951\n鱼唇\t112952\n红楼猫\t112953\n121期\t112954\n衰样\t112955\n郁禅师\t112956\n四十天左右\t112957\n八部\t112958\n啦啦啦啦啦啦你这个坏人坏人坏人坏人傻子傻子\t112959\n卡皇九江就是度秘\t112960\n十二辆\t112961\n城邦\t112962\n多长的话\t112963\n均匀\t112964\n18554586\t112965\n88啦\t112966\n挑片\t112967\n王度秘\t112968\n汤非凡\t112969\n核武\t112970\n仁和寺\t112971\n谋财害命\t112972\n地块\t112973\n张锦\t112974\n地坛\t112975\n军犬\t112976\n更早些\t112977\n5xx0点\t112978\n地址\t112979\n李强\t112980\n我色\t112981\n爆笑笑\t112982\n数字化\t112983\n卤味\t112984\n霍我\t112985\n妈呀你臭不要脸\t112986\n开心愉快\t112987\n妣\t112988\n东悦泰\t112989\n几万亿\t112990\n配种\t112991\n点弄死\t112992\n107岁\t112993\n三若a\t112994\n关完机\t112995\n李开\t112996\n妥\t112997\nrAustrianoon\t112998\n一概\t112999\n455天\t113000\n这本事\t113001\nSEC\t113002\n八十九十九十八十分\t113003\n地坪\t113004\n敏智\t113005\n哪本书\t113006\n配秘\t113007\n我们来玩一个我说上一句你说下\t113008\n死葛\t113009\n定样\t113010\n兴宝\t113011\n双臂\t113012\n宋体\t113013\n定格\t113014\n郝娇娇\t113015\n兴安\t113016\n死冰卿\t113017\n好多遍\t113018\n图破图君\t113019\n闫博涵\t113020\n正源\t113021\n象湖明\t113022\n临海市\t113023\n兴容\t113024\nTsotsis\t113025\n涔\t113026\n油箱\t113027\n认罪\t113028\n明哥\t113029\n宋佳\t113030\nBT66\t113031\n度蜜月小心\t113032\n范忻钰\t113033\n油管\t113034\n赵柔雅\t113035\n定栏\t113036\n木果\t113037\n对呀你不想\t113038\n杨梦真\t113039\n踢毽\t113040\n杞县\t113041\n亮闪\t113042\n溪美\t113043\n第十一届\t113044\n小高家\t113045\n耳鬓斯\t113046\n僵尸默默沙漠\t113047\n赵叫\t113048\n宝严院\t113049\n嘞太嗲\t113050\nffchbibksxk\t113051\n天下第一棒\t113052\n穹窿\t113053\nthatsyphiliswpwwmgmpjgmggtw4\t113054\n上麻\t113055\n度秘你在干嘛你在干什么\t113056\nstomactor\t113057\n2：00\t113058\n羊马河\t113059\nQQ2976347556\t113060\n溽暑\t113061\n22岁\t113062\n解饮\t113063\n彩陶\t113064\n8条\t113065\n囡温馨岛\t113066\n事的人\t113067\n伦敦奥\t113068\n蔡学颖\t113069\n16喔\t113070\n均为\t113071\n5.6%\t113072\n3272321970\t113073\n零5分\t113074\n不远千里\t113075\n动手术心心相惜\t113076\nistive\t113077\njijj\t113078\n书页\t113079\njijn\t113080\n咏柳\t113081\n一次俩\t113082\njijf\t113083\
n狂吻左距n多啖[飞吻\t113084\n混堡\t113085\n另有所指\t113086\n问一请问\t113087\n同归\t113088\n私密子\t113089\n蹄筋\t113090\n5年\t113091\n枫林晚\t113092\n刘干事\t113093\nhduuxxzk\t113094\n康辉\t113095\n张译文\t113096\n打不打不行\t113097\n抗原MHC受体\t113098\n1970年\t113099\n半路失踪\t113100\n电影世界\t113101\n外甥\t113102\nqeryyu\t113103\n男师\t113104\n无数理化\t113105\n吕丘露微\t113106\n准奋\t113107\nKpop\t113108\n苹果6s\t113109\n男帽\t113110\n餓飯煩惱人體圖剛\t113111\n虎牛\t113112\n葛佳\t113113\n我们三个\t113114\n1528\t113115\n5.6G\t113116\n约瑟夫·寇德卡\t113117\n周日英\t113118\n无遮盖\t113119\n腿肚子\t113120\nkifs\t113121\n雪弗\t113122\n去找找\t113123\n麦俊龙\t113124\n红罐\t113125\n日真\t113126\n剑一剪\t113127\nuchaha\t113128\n合同感叹号\t113129\n安睡\t113130\n亲爱的好想你的\t113131\n市科技局\t113132\n乖乖的妈妈的话\t113133\n15563995800\t113134\nShsjmm\t113135\n虚传\t113136\n财峡大坝\t113137\n国军馆\t113138\n混匀宁\t113139\n最壮\t113140\n虚伪\t113141\n卖报\t113142\n金立帮\t113143\n阿菲我之穿越文\t113144\n过年了你过\t113145\n高城丽\t113146\n大量\t113147\n打篮\t113148\n狂虐\t113149\n特灵\t113150\n哟哟方\t113151\n裸妆\t113152\n大金\t113153\n迷友\t113154\n董夫共\t113155\n1479\t113156\n充呃\t113157\n大批量\t113158\n咯吱咯吱咯吱咯吱羊羊羊羊羊羊羊羊咯吱咯吱咯吱咯吱羊羊羊羊羊羊羊羊个月咯吱咯吱咯吱羊羊羊羊羊羊咯吱咯吱咯吱咯吱羊羊羊羊羊羊\t113159\n349189247\t113160\nyouyout\t113161\nsorry\t113162\nqyfudj\t113163\n就是这么说\t113164\n这天\t113165\n易筋经觉\t113166\nv≦o\t113167\n凉菜\t113168\n费费\t113169\nvxehjd\t113170\n乡村\t113171\n好好呼呼呼幾藉\t113172\n禅武\t113173\n义马\t113174\n若干年\t113175\n充呵\t113176\n人工行\t113177\n一两块\t113178\n夜难寐\t113179\n归途\t113180\n午节\t113181\n小世界\t113182\n1236547896324863286558285727383566\t113183\n尝尝\t113184\n零四排\t113185\n高级感\t113186\n选战\t113187\n往前走\t113188\n田中平\t113189\n强亲\t113190\n嗯1068神无月\t113191\n幺八五二六八幺二八五零\t113192\n学敏\t113193\n750千伏\t113194\nJAISJXNZKMJS\t113195\n1593572585556666668\t113196\npera\t113197\n萌萌板\t113198\n尝尽\t113199\n捡到\t113200\n燕瘦\t113201\n软石\t113202\npqrrtuvonm\t113203\n1983年\t113204\n目黑\t113205\nfjurfj\t113206\nByebye\t113207\nDEALMOON\t113208\n不要你了你走\t113209\n163页\t113210\n肖骁\t113211\n张欣蕊\t113212\n压灭\t113213\ngocixutzu\t113214\n德普叔\t113215\n捐献者\t113216\nggmggmghghnhb\t113217\n熊先乖\t113218\n博览会\t113219\n思睿\t113
220\n高高红\t113221\n我喜欢闹\t113222\nzzZ\t113223\n怕我行\t113224\n女欢\t113225\n灯红\t113226\n88言\t113227\n铁了心不换\t113228\n领奖台\t113229\n仙履奇缘2\t113230\n魏丽军\t113231\nbiuy\t113232\n泰迷们\t113233\n黄书网\t113234\n笨娜娜\t113235\n丧尸\t113236\n卡尔屁蛋\t113237\n并机\t113238\n二月一一号\t113239\n555555555555555555555555555555555555555555555555555555555555555555555555\t113240\n安度秘\t113241\n王丷\t113242\n细跟\t113243\n枕边人\t113244\n零下九度\t113245\n王丽\t113246\n版型\t113247\n单蚊\t113248\n茶城网\t113249\n王丹\t113250\n五分钟以后\t113251\n杨总\t113252\n35许\t113253\n于小戈\t113254\n十一毫米\t113255\n內向\t113256\n菲菲rit\t113257\n王个\t113258\n我真相信\t113259\n王东\t113260\n102年\t113261\n浐灞\t113262\n王一\t113263\n小虫子\t113264\n这一个你\t113265\n7分钟\t113266\n秘你男\t113267\n梦圆\t113268\n墨鸦\t113269\n疯狂猜词语\t113270\n王下\t113271\n代购\t113272\n李宇佳\t113273\n万能子之\t113274\n百分之几\t113275\n什么多\t113276\n坚果豌豆\t113277\nFighting\t113278\n七十多年前\t113279\n广平\t113280\n888989888\t113281\n周洪宇\t113282\n卧龙卧虎华夏大地长春疯人家的零人民古国静朝晖\t113283\n猪你是猴你是八婆你是狗\t113284\nkular\t113285\n般见面\t113286\n两新组织团\t113287\nggrrfgdggfig\t113288\n世面\t113289\n1959.7.8\t113290\n口蘑楼\t113291\n笨猪朱丽叶\t113292\n空中墨花园\t113293\n分内\t113294\n朱俊杭\t113295\n两朵花\t113296\n卧趣\t113297\n承典\t113298\n聊了冻\t113299\n摸嚄\t113300\n我不想和你聊了我想和你\t113301\n大神节\t113302\n张小若\t113303\n#吴奇隆#3.22新闻专访1【广州日报\t113304\n梁我叫\t113305\n关键性\t113306\n我条件你的手有小女女\t113307\n亲和力\t113308\nbagayalu\t113309\nr93o20d\t113310\n孤影\t113311\n合唱团\t113312\noure\t113313\n待着\t113314\njgxhb\t113315\n150页\t113316\n郑汝桦\t113317\n无码\t113318\n米切纳\t113319\nbeid\t113320\n个头脑\t113321\nnimaheniba\t113322\nkky\t113323\n免贵\t113324\njhegt\t113325\nkku\t113326\n原因儿\t113327\nbein\t113328\nkkv\t113329\nkks\t113330\n好心自然\t113331\nkkm\t113332\nkkn\t113333\n李立笔\t113334\nkkk\t113335\nItsalsodifficult\t113336\nkke\t113337\nkkd\t113338\n号房\t113339\nkkf\t113340\nkka\t113341\n曲河\t113342\n屏式\t113343\n神大人才\t113344\n布依族\t113345\n求解\t113346\n刘外传\t113347\n仙境\t113348\n00000000000000000000000000000000000000000000000000000000000\t113349\n丽丽雯\t113350\n管庄\t113351\n比方\t113352\n走远\t113353\n走进\t113354\ndgdfchgf\t11
3355\n无济于事\t113356\n走运\t113357\n富源\t113358\n吗谭\t113359\n331334\t113360\n卡子门立交\t113361\n老莎\t113362\n拍砖\t113363\n快一点儿\t113364\n有机会\t113365\n醉倒\t113366\n神兵了你\t113367\n手狗\t113368\n冷暖\t113369\n布莱恩\t113370\n闰蜜\t113371\n刘客人\t113372\n纯情版\t113373\n找土木\t113374\n法人\t113375\n夏非寒\t113376\n山南\t113377\n唉瓮nj\t113378\njzhshsjfks\t113379\n11g\t113380\n兵兵喜欢兵兵女爱我爱你\t113381\n豪世莫偕\t113382\n常青\t113383\nhhds\t113384\n8页\t113385\n你最好听话\t113386\n践行\t113387\n常静\t113388\nGjjjjjk\t113389\n小结\t113390\n罗茜\t113391\n七年的你呢我是你的夜你\t113392\n窦娥\t113393\n活腻味\t113394\nweico\t113395\n毕业了你\t113396\n填空题\t113397\n512任性网\t113398\n乖乖\t113399\n李一澄\t113400\n无所作为\t113401\n天墉\t113402\n有不让\t113403\n济南日报\t113404\n试试试\t113405\n天境\t113406\n汪卩\t113407\n劲舞女\t113408\n告诉我是\t113409\njdry\t113410\n夏慕曦\t113411\n害人害己\t113412\nhgddvkkkk\t113413\n12552555522555\t113414\n死八婆吵吧乖二笔\t113415\n润肠\t113416\n诗性\t113417\nGTUYTT\t113418\n天与地\t113419\nhtrg\t113420\n郭占标\t113421\n将计\t113422\nOta\t113423\n高庄元祖\t113424\n多咪多咪我有事情我先走了啊拜拜一会见\t113425\n3201984717\t113426\n安川\t113427\n单人塔\t113428\n曼妙\t113429\n韩服\t113430\n妞泡\t113431\n开走\t113432\n1111111140007040707700780770700777000707007070700707077070877007077070707070770\t113433\n楷芝\t113434\n范大姐\t113435\n正佳琪\t113436\nwescruffCCG\t113437\n佳节\t113438\nhttpehiphotosbaiducomxiaodupicitem10dfa9ec8a13632726bc1747968fa0ec08fac775jpg\t113439\n你好讨厌儿\t113440\n碎碎\t113441\n初七八\t113442\n姨奶\t113443\n下度秘\t113444\nmaiasmike\t113445\n车锅\t113446\n罗清平\t113447\nmdbjtga\t113448\nalyoustomi\t113449\n带板\t113450\n车票子\t113451\n认不错\t113452\n庆文\t113453\n婚姻法\t113454\n咄咄\t113455\n网源\t113456\n赖斯\t113457\n盗版帮\t113458\n89159一五十九\t113459\n秘秘们\t113460\n唉多米\t113461\ndasx\t113462\n痛孑\t113463\n关溪\t113464\n肤白貌美\t113465\n卡其色\t113466\n磨刀\t113467\n你是公的强我是想\t113468\n语法学\t113469\n王睿婷\t113470\n危化\t113471\n梁达\t113472\nDVDohCH\t113473\n小旗袍\t113474\n7319246850\t113475\n清凉秀夏限#\t113476\n绿原\t113477\n昨晚7时57分\t113478\n御膳房\t113479\n狮虎\t113480\n眨眼\t113481\n周曙霞\t113482\n疱疹\t113483\n10080斤\t113484\n刘逸宸\t113485\nhellomy\t113486\n开水房\t113487\n你在哪儿啊我还怕天上我不
能去啦冷淡你在哪儿\t113488\nTFb0ys\t113489\nfuc\t113490\n嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯等等等等到你的度秘度秘度秘度秘猪猪猪猪猪猪猪猪猪猪\t113491\n小男孩儿\t113492\n安有布西\t113493\n徐小姐\t113494\n中午1点48分\t113495\nc97\t113496\n绿城上海三大样板价值连城#\t113497\n小曹子\t113498\n我真你\t113499\nigfgjg\t113500\n说青\t113501\n四爷党比八爷党\t113502\n税法\t113503\n不好摸\t113504\n徐有英\t113505\n六千克\t113506\n江苏\t113507\n兔美\t113508\n忍气吞声\t113509\nA块\t113510\n谢谢n\t113511\n淮南市\t113512\n会心\t113513\n戴常豪\t113514\n八点半个月\t113515\n文本\t113516\n江英\t113517\n学狗狗叫\t113518\n588414\t113519\n赵丙燕\t113520\n张作霖\t113521\n大九\t113522\n按一下\t113523\n全员\t113524\n喔的魔力六六的多女的我的个周六的了的\t113525\n先走觉\t113526\n给我黄网\t113527\n听#\t113528\n诱捕\t113529\n最终幻想3\t113530\n能工\t113531\n10小时\t113532\n富要\t113533\n接你好\t113534\nQAQ\t113535\ntf博伟\t113536\n吧刚\t113537\n青园街\t113538\n第七次\t113539\n夜辰星\t113540\n挺阳光\t113541\n十几次\t113542\n周小璐\t113543\n淘气吴我和你的哦\t113544\n﹋o﹋\t113545\n588686836860906838686868686868686868686867343\t113546\n這個屌\t113547\n38号\t113548\nHendrix商标权\t113549\nFdZgd5c\t113550\n额尔\t113551\n几分米\t113552\n79十斤\t113553\n筹建\t113554\n吗替\t113555\n起死回生\t113556\n丁丁丁丁丁\t113557\n支原体\t113558\n60家\t113559\n季季红\t113560\n不起真的伤不起\t113561\n155cm\t113562\nahhe秘cyanit\t113563\n公开性\t113564\n还男女通吃\t113565\nhiahiahia\t113566\n零几年\t113567\n证券交易所\t113568\n歌唱歌唱歌\t113569\n法网\t113570\njfhxhdhsw\t113571\nip4\t113572\n濒临\t113573\n范怡倩\t113574\n钟峻青\t113575\n没有然后了再见\t113576\n诗朗诵\t113577\nDVIDBHDk高架桥\t113578\n宽厚\t113579\n4399399\t113580\n葫芦娃儿\t113581\n这名儿\t113582\n陈嘉恒\t113583\n卡洛尔\t113584\n问题我的你\t113585\n没饭吃\t113586\n男ox\t113587\n理我了我恨你\t113588\n房租\t113589\n孙医师\t113590\n告诉我不就\t113591\n公主学院\t113592\nJRJ\t113593\n国际航空\t113594\n双机\t113595\nJRI\t113596\n巴戈龙\t113597\nsalutation\t113598\n话涯\t113599\n什么么\t113600\n245厘米\t113601\n8点三十分\t113602\n北斗通\t113603\n大好多\t113604\nLnnnLB\t113605\n爱在囧途\t113606\n失去知觉\t113607\n作恶不上\t113608\ngaga\t113609\n广东湛江吴川威力神歌舞团\t113610\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t113611\n三棒蛋卷棒烫蛋卷\t113612\n仝雨杰\t113613\n111273953422385453\t113614\ngagh\t113615\n这就是说\t113616\n军备\t113617\n闲工夫\t113618\n搂抱\t113619\n禁肉节\t113620\n九七昂\t113621\n具笈\
t113622\n成达实验学校\t113623\n凌文博\t113624\n太碉堡\t113625\n一百多个\t113626\n赵鸡翔\t113627\n发财险\t113628\n考一到\t113629\n十积分\t113630\n大致向东\t113631\n尹悦蓉\t113632\n竞彩网\t113633\nfafagah\t113634\n就业观\t113635\n秘真的好讨厌\t113636\n唐芊儿\t113637\n盛天不\t113638\n安zZ\t113639\n倒床\t113640\n改错\t113641\n一百多万\t113642\n实践者\t113643\n耐磨\t113644\n绿毯\t113645\nLES95LZOTex\t113646\n感冒觉\t113647\n正经话\t113648\n开支\t113649\n六秘网\t113650\n看你好\t113651\n女人街\t113652\n补贴\t113653\n挡挡\t113654\n好车\t113655\nFgrc\t113656\n再见了再见了再见\t113657\nSure\t113658\n独伊\t113659\n度手\t113660\n海水晶\t113661\n就叫片\t113662\n又一次\t113663\n九哦朵\t113664\n47分\t113665\n9行\t113666\n几平米\t113667\n那秒钟\t113668\nkexue\t113669\n漠漠不关心\t113670\n再有\t113671\n名副其实\t113672\n一山一山\t113673\n委曲求全\t113674\n讲性\t113675\nangelababy昆凌\t113676\n芈月传\t113677\n漂白剂\t113678\n打距\t113679\n凌波微步\t113680\n雷音寺\t113681\njfh\t113682\n大唐封\t113683\n台客运量\t113684\n程真\t113685\nFEYG\t113686\n李军\t113687\n廖鑫鑫\t113688\n天黑刚\t113689\n亲一辈子\t113690\n怡学\t113691\n酸菜肉丝米线\t113692\n辉辉\t113693\n还珠太师之自从有了你\t113694\n李冲\t113695\n讲怕\t113696\nQWRASDFGJ\t113697\n五零个\t113698\n张国濤\t113699\n三三九九\t113700\n我们的萌萌哒么么哒我爱你\t113701\n慢快点\t113702\n介里洛\t113703\n奉贤热射病\t113704\n小商桥\t113705\n张军辉\t113706\nLsihshwoaj\t113707\n打瞌睡\t113708\nmpqn\t113709\n余庆嘉\t113710\n忘不了\t113711\niranisamajor\t113712\n两面\t113713\npap\t113714\n暇择\t113715\n斗闹\t113716\n冯沙沙\t113717\n雨淋头\t113718\n除夕寂\t113719\n三九六二二\t113720\n欧欧欧\t113721\n换换换换\t113722\n很高兴見\t113723\nRjyfggr\t113724\n西撒撒\t113725\npag\t113726\npai\t113727\n這次內地行\t113728\n不要说拜拜\t113729\npam\t113730\npan\t113731\n仙女给我的爱疯\t113732\n送交\t113733\n干流\t113734\n陈主任\t113735\n逐步\t113736\n对度秘\t113737\n我受够你了我真的不想要你了我们俩\t113738\n村屯\t113739\n精神力\t113740\n还款日\t113741\nhaakgj\t113742\n朝峰\t113743\n奥路\t113744\n联系者\t113745\n骏程\t113746\n黑幕\t113747\n大专了\t113748\n11257\t113749\nFukuman\t113750\n撂倒\t113751\n11251\t113752\n九八七六五四三二一\t113753\n煽情山航\t113754\n上地十街\t113755\njft\t113756\n李芳\t113757\n收汁\t113758\n撰稿\t113759\n独到\t113760\n哎吖\t113761\n好吧式\t113762\n郭芳\t113763\n龟梨和也\t113764\n老大我不信不信咱们两个大一点\t113765\n淮南\t113766\n意外婆\t
113767\n模摸\t113768\n十多岁\t113769\n陈处那\t113770\nThankYou\t113771\n三十六\t113772\n第八个\t113773\n三十八\t113774\n接連不斷\t113775\n楚楚楚楚\t113776\n画校\t113777\n富康\t113778\n帝都\t113779\ncomeon\t113780\nakaic\t113781\n文明的冲突与世界秩序的重建\t113782\n嗯棒棒\t113783\n兽医\t113784\n35厘米\t113785\n永和\t113786\n鲁能\t113787\n圖咯牙無\t113788\n拉拉呱\t113789\n瓦琪\t113790\n刚强\t113791\n搬砖\t113792\n诺瓦拉\t113793\n老公度秘\t113794\n操蛋片\t113795\n出租汽车\t113796\n美了美了美\t113797\n返赠\t113798\n三十元\t113799\n才一首歌\t113800\n谢总\t113801\n美誉\t113802\n管类\t113803\n畅销书\t113804\n刮骨疗毒\t113805\n每年\t113806\n不中用\t113807\n币值\t113808\n吗步\t113809\n更大咖\t113810\ndghvgh\t113811\n吵吵闹闹\t113812\n锐龙\t113813\n不是说能来个人吗你他妈是人吗你来个人行么\t113814\n商标许可协议\t113815\n崴威够\t113816\n首一次\t113817\n一个一级\t113818\nthvdsfh\t113819\n全星药业\t113820\n郡县\t113821\n停雨\t113822\nBDIH\t113823\n热吻\t113824\n曾文正公\t113825\n六曲微狗\t113826\n徐祖霞\t113827\n帅你萌\t113828\n德爱情\t113829\n男木女木\t113830\n12789\t113831\n胆儿\t113832\n17分之五\t113833\n周哦周雨哦她有说娇娇睡\t113834\n三百九十九块\t113835\n死机器人\t113836\n转会费\t113837\n蒜瓣\t113838\n浩如烟海\t113839\n熊邓\t113840\n小市民\t113841\n里边儿齐刘海我是外流河\t113842\n魏乐佳\t113843\n德阳\t113844\n舒睿\t113845\n恩平\t113846\n王丹丹\t113847\ntatoo\t113848\n幺二零幺零六一九八九一零二六二零幺六\t113849\n娜比\t113850\n174感\t113851\nheugf\t113852\n失焦\t113853\n瞪羚\t113854\n想偶\t113855\n2010年11月\t113856\n丽姐\t113857\n5585885853885878\t113858\n辅以\t113859\n题量\t113860\n头一点也\t113861\n烟热\t113862\n别vvggg\t113863\nughgvfhhvuj\t113864\n段老夫子\t113865\n丙仙人\t113866\n大鱼\t113867\n嗅到\t113868\n永嘉\t113869\n飘萍\t113870\n采制\t113871\nryhmb\t113872\n紫皮瓣小泡\t113873\n戈培尔\t113874\n挂科挂科挂科挂科\t113875\nok90\t113876\n天路\t113877\n杨输送\t113878\n没有我就喜欢\t113879\n造船\t113880\n大乐恫\t113881\n子宫\t113882\n木用\t113883\n窗口号\t113884\n飘落\t113885\nhgcfgg\t113886\n一土\t113887\n五马难追\t113888\n459\t113889\n陈妙儿\t113890\n没有了再见\t113891\n金悦怡\t113892\n老掉牙\t113893\n暮暮\t113894\n57级\t113895\n铜锣湾\t113896\n0度\t113897\n生抽\t113898\n孙允珠里西斯\t113899\n儿别\t113900\n嫌疑犯\t113901\n驰名\t113902\n一万张\t113903\nGVS\t113904\n负增长\t113905\n梁公\t113906\n143平\t113907\n2200元\t113908\n45677666\t113909\n17:30\t113910\n刘xx\t113911\n
维生素B1\t113912\n维生素B2\t113913\n甘做\t113914\n没组样队没对样班没班样团没团样区没区样市没市样省没省样国\t113915\nAnori\t113916\n零相乘\t113917\n行程\t113918\n土学\t113919\n叉骨\t113920\n[笑\t113921\n咬伤\t113922\n陈欣涵\t113923\n侯孝贤\t113924\n面儿\t113925\n近来\t113926\n610103198605161638\t113927\n几月\t113928\n二百四二百四十二百四十多\t113929\n符永帆\t113930\n强多远\t113931\n居民\t113932\n痛首\t113933\n下不下\t113934\n下不上\t113935\n成长久\t113936\n涞源县\t113937\n失声\t113938\n乳鸽\t113939\n上扣扣\t113940\n几本\t113941\n下不为\t113942\n多情如斯\t113943\n穆丽尔\t113944\n猛鬼\t113945\n章安琪\t113946\n几朵\t113947\n各执一词\t113948\n堵车\t113949\nv欲女负库存\t113950\nｕ人们\t113951\n150千克\t113952\n朱莹莹\t113953\n静读\t113954\n加倍\t113955\n加個\t113956\n军事博物馆\t113957\n钙光瓜\t113958\n不比\t113959\n炫耀次\t113960\n四千行\t113961\n工费\t113962\n为森\t113963\n要死先走\t113964\n被斗鬼节\t113965\n我错了我相信你\t113966\nwomamile\t113967\nrtf1ansiansicpg1252nfonttblncolortblred255green255blue255n\t113968\n超乎想象\t113969\n对呀你\t113970\n织弹\t113971\n海蛎子\t113972\n辣文首\t113973\n賣肉體\t113974\n电信体制改革\t113975\n呃省略\t113976\n岔世逑\t113977\n一点伤\t113978\nWHET\t113979\n英雷\t113980\n锁清秋\t113981\nvfjfuf\t113982\n建国伟业\t113983\n职位\t113984\n嘞凶\t113985\n耐瑟龙\t113986\n弄饭吃\t113987\nrrrtttrrrtttrtttttttttrrttrttrrttrttttrttrtyyryfv\t113988\nTKKC\t113989\n︿︶\t113990\nCheer\t113991\n西马\t113992\n18381123244\t113993\n再贷\t113994\n三国杀\t113995\n事会\t113996\n78497\t113997\n犒劳\t113998\ncarolandsusan\t113999\n22个工作日\t114000\n买手潮\t114001\n不要你了我真的不想\t114002\n邵振清\t114003\n35000\t114004\nMyplan\t114005\n阿门骚瑞\t114006\ncoulit\t114007\n中央领导\t114008\n输送\t114009\n斯特拉瑟\t114010\njopinl\t114011\n软蛋\t114012\nforget\t114013\n茜茜\t114014\n讲介\t114015\n阴蒂\t114016\n帅霸\t114017\n李冰怡\t114018\n留守\t114019\nmanbi\t114020\n钱树\t114021\n突破卷\t114022\n滚床\t114023\n黄桂娟\t114024\n过人\t114025\n啃定\t114026\ngjfgffgopitredxxvnnmkllougfdfdfyuigfdhhhghhjuytewqwdcxzxbnnm\t114027\n过亿\t114028\n马艳红\t114029\niiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiiii\t114030\n我喜欢邓\t114031\n吴玟萱\t114032\n不要动秒\t114033\n钱样\t114034\n安娜川亚\t114035\n度眠\t114036\nucjjbjcdbdgg\t114037\n白纸黑字\t114038\n110521\t114039\n110520\t114040\ngivigiufufu\t114041\n找不
着人\t114042\n耶娃\t114043\n洪美静\t114044\n筛子\t114045\nAnge\t114046\n撒拉齐\t114047\n贝贝贝贝贝贝贝贝贝贝贝贝贝贝\t114048\n哑铃\t114049\n乳头\t114050\n15980560872\t114051\n085087887\t114052\n谈到\t114053\nｙｏｕｒ\t114054\ngddbgf\t114055\njiynenn\t114056\nsth\t114057\n苦逼团\t114058\n我是你的掌贝影壁肖准中我的\t114059\n后面前\t114060\n孙大圣\t114061\n窝表\t114062\n19872317\t114063\n科大乐\t114064\n1153250389218\t114065\n5-14天\t114066\n腊月十二\t114067\n思来\t114068\n很多次\t114069\nVallum\t114070\n死来\t114071\noutou\t114072\n躲开\t114073\n郭婕\t114074\nchiphotosbaiducomxiaodupicitem72f082025aafa40fd1feabaeac64034f78f0198cjpg\t114075\n温带\t114076\n死杯\t114077\n僵尸忍者\t114078\n高佬\t114079\n3678\t114080\n硬汉\t114081\n后患\t114082\n铁球\t114083\n死板\t114084\n优酷视频播放器\t114085\n理工中华\t114086\naba\t114087\nabb\t114088\n俾县\t114089\nabd\t114090\nabe\t114091\n清代\t114092\n好呀我的卡号\t114093\n温市\t114094\n温布\t114095\nudgt6\t114096\n后悔\t114097\n高位\t114098\nabs\t114099\n再讲一下\t114100\n青灯\t114101\njhfda\t114102\n居高临\t114103\n背信罪\t114104\nJWhejdhejrjbdhvbdjdnejxkdbrhjcndjdjejkcbfvbbsexviwgxfchhufiehcuxvuchdkfcfyt\t114105\n莲塘\t114106\n买东西\t114107\n三八二零三\t114108\n小度秘有美一起来玩真心话大冒险\t114109\nSylvester\t114110\nshskh\t114111\n我叫左那些告诉我\t114112\n85度\t114113\n凑活\t114114\n2011年2月23日，19年前\t114115\n颠覆\t114116\n好聪明的好度秘妙咪\t114117\nSS3Manila\t114118\nagmjgjda55gmagjtmatjmd\t114119\n渡你好\t114120\n讨厌求你了嘛好不好嘛好不好求求你了吗度秘小飞飞最好啦你最帅了真是\t114121\n十六大耳\t114122\n阿朱\t114123\n恩开\t114124\n平罗县\t114125\n领秀\t114126\nab5\t114127\n杨安杰\t114128\ng扣鼻\t114129\n大统一堡证\t114130\n度秘香奈尔香奈尔\t114131\n奔逃汹涌\t114132\nGUG\t114133\n哪几\t114134\n57867435\t114135\n陈国粹\t114136\n30亿\t114137\n林凯\t114138\n4the\t114139\n600下\t114140\n2005年\t114141\n冰凉\t114142\n十二海雅\t114143\n林凤\t114144\n药棉\t114145\n曼君\t114146\n林凡\t114147\n钱眼\t114148\nhfbkl\t114149\n明天有天晴天有雨\t114150\n吴丽晶\t114151\n筑起\t114152\n哈哈哈爱你爱你爱你爱你爱你爱你\t114153\n心与物游\t114154\n一个人渣\t114155\nwho,you\t114156\n傲傲\t114157\n脫鈕\t114158\n大设计\t114159\n搭猫\t114160\n你好啊主人\t114161\n加理睬\t114162\n李沛蓥\t114163\n陷阱\t114164\n谨斯里\t114165\n空空空空空空\t114166\ngoeson\t114167\n张仕元\t114168\n倪响翔\t114169\n童颜咖啡馆
\t114170\n河马脚\t114171\n煤尘\t114172\n松竹梅\t114173\n昨晚儿\t114174\n伊川\t114175\n六二二\t114176\n开菊\t114177\n抢眼\t114178\n躁动\t114179\n抄菜\t114180\n纯振华\t114181\n顺治排骨\t114182\n没资格\t114183\n大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋大坏蛋爸爸爸爸\t114184\n27块\t114185\n阿末\t114186\n笑傲九重天\t114187\n女王蜂的王房\t114188\n2l号\t114189\n陈经理\t114190\n528255\t114191\n猪猪侠的故事哦不对猪猪侠\t114192\n王鸿\t114193\n急野渡无人舟\t114194\n拘捕\t114195\n许志红\t114196\n冰童\t114197\nshare\t114198\n丐妇\t114199\n关怀\t114200\n参与者\t114201\n好了我有事儿\t114202\n一出戏\t114203\n萝卜菜\t114204\n门店\t114205\n红脸\t114206\nwalallan\t114207\n天天人\t114208\n笑昏\t114209\ncsn\t114210\ncsk\t114211\n抵债\t114212\ncse\t114213\nsiree\t114214\ncsg\t114215\ncsc\t114216\n李瑞帆\t114217\n忙小小\t114218\n呈贡劳动局\t114219\n父亲节\t114220\n返点\t114221\ncsv\t114222\n挺意思\t114223\n太空舱\t114224\n13年\t114225\n暴风\t114226\n欣琪朵\t114227\nmjg0djhdpw\t114228\n天津\t114229\n来水\t114230\n特別款\t114231\n刷新\t114232\n老寒腿\t114233\n精神科\t114234\nLisa\t114235\n我知道\t114236\n1941年12月\t114237\n长得\t114238\n事奥大厦\t114239\n做出来\t114240\n刘淑瑶\t114241\n协商\t114242\n猫和老鼠之飞向火星\t114243\n尿液叶\t114244\n第一组\t114245\n台茶\t114246\n准入相闻\t114247\n定襄县\t114248\n弃用\t114249\n长征\t114250\n说说过来\t114251\n大肉棒\t114252\n53克\t114253\n一V一\t114254\n塞东西\t114255\n银牌\t114256\n哥哥哥哥哥哥哥哥\t114257\n谢姐\t114258\nccfgcff\t114259\n喜兴\t114260\n餐桌椅\t114261\n新新新新\t114262\n媚月\t114263\n暖气\t114264\n冷冻女\t114265\n兴镇\t114266\n死谁怕\t114267\n古剑奇\t114268\n日日日日日日\t114269\n四十倍\t114270\n忙坏\t114271\n毛晨萌\t114272\n张卡发张卡\t114273\njqu\t114274\n雨莹\t114275\n人大商学院\t114276\n奔腾B70\t114277\ntutu\t114278\n公诉人\t114279\n白雪公主与的\t114280\n小班长\t114281\n海昆\t114282\n4p仔仔\t114283\n333444555566\t114284\n搞不懂了你好像\t114285\n下雪没\t114286\n万源\t114287\n齐得隆哥\t114288\n围灯\t114289\n志颖婕妤\t114290\n温岭广场\t114291\n3Ku\t114292\noks\t114293\nokr\t114294\n45%\t114295\nokw\t114296\n没空理\t114297\n不现在\t114298\noka\t114299\n莫扭\t114300\nokb\t114301\n宜章县\t114302\n逼\t114303\n家你的家\t114304\nokm\t114305\noko\t114306\nokn\t114307\n良心\t114308\n十万支\t114309\n小妹亚姐姐我上高一时考上\t114310\n八角油\t114311\n归宿\t114312\n可不可以说\t114313\n李锡锋\t114314\n咔咔\t114315\n摩萨耶\t114
316\n杨勇士\t114317\n归家\t114318\nSwagg\t114319\n其乐融融\t114320\n小标兵\t114321\n王诗喻\t114322\n顺序\t114323\n归客\t114324\ntycgg5z\t114325\n静香\t114326\n新雅\t114327\n蒜肉馆\t114328\n十五倍\t114329\n结终了\t114330\n残疾\t114331\nqqqqqqq\t114332\n格雷西\t114333\n敢相信\t114334\nSnsjzosk\t114335\n几12\t114336\n几10\t114337\n几11\t114338\n拿手写\t114339\n非非非非非非\t114340\n吴君如\t114341\n三八号\t114342\n寅\t114343\n贺美琦\t114344\n武尊\t114345\n幾天\t114346\n寄\t114347\n通分\t114348\n进退\t114349\n苗家\t114350\n120日线\t114351\n变度\t114352\n胎梦\t114353\n鑫凡\t114354\n宫娥二晓龙\t114355\n练练练练练练练练练练\t114356\n终点站\t114357\n拒絕\t114358\n空场\t114359\n喃喃\t114360\n何妨\t114361\n几1e\t114362\n黑八\t114363\n9dcd100\t114364\n九三岁\t114365\ntdjssvdd\t114366\n成语\t114367\n寓\t114368\n木叶\t114369\nLAW\t114370\n壹加壹\t114371\ndrkdidn\t114372\n何如\t114373\n璇翠霞\t114374\n午中\t114375\n通\t114376\nHaah\t114377\n暴发\t114378\n1234567895567556776\t114379\nfeseff\t114380\n喝完\t114381\n秘我不会\t114382\n本钱\t114383\n梦回华\t114384\n苹果专卖店\t114385\n13:18\t114386\n十四分钟\t114387\naccddeab\t114388\n想知道\t114389\n梨益\t114390\n鸡块\t114391\n闹翻\t114392\n猪比\t114393\n四五12\t114394\n基金公司\t114395\n几800\t114396\nl520\t114397\n先弧\t114398\n诶度秘\t114399\n猪毛\t114400\n原数\t114401\n先弟\t114402\n查找\t114403\nvvgjih\t114404\n察\t114405\n王家阳\t114406\nC盘\t114407\n胡格吉勒图\t114408\n成吉思汗\t114409\n魏子铭\t114410\n质品\t114411\n四向上\t114412\n概算\t114413\nabcdar\t114414\n街霸\t114415\nhctk\t114416\n波罗\t114417\n掉帧\t114418\n偷渡\t114419\n看多看\t114420\nqlu\t114421\n美团\t114422\n亚索\t114423\n美图\t114424\n美国\t114425\n威武]艺\t114426\n斗心如\t114427\n荔波\t114428\nmyclass\t114429\n微信公众号\t114430\n马意思\t114431\n德国国家队\t114432\n北京东路\t114433\n流淫水\t114434\n狂欢夜\t114435\n唱腔\t114436\n板式\t114437\n半透明\t114438\n爸爸了烦\t114439\n翘翘板\t114440\nyoutl\t114441\n二必\t114442\n粉臀女侠\t114443\n爱奇好棒\t114444\n相忘于江湖\t114445\n1131649497643494\t114446\nARESZE#\t114447\n超轻粘土\t114448\n资格饭\t114449\n朱玉静\t114450\n两高\t114451\n张育美\t114452\n5216\t114453\nxtlifoik\t114454\n郁闷\t114455\n5215\t114456\n永福购物广场\t114457\n5213\t114458\n畅行\t114459\n5211\t114460\n0000009000000000000\t114461\n风鹰\t114462\n蝉鸣\t114463
\n5219\t114464\n受比\t114465\n外星球\t114466\n浩人\t114467\nES-LV50\t114468\n反摸\t114469\n6点45分\t114470\n四千九百九十九万九千九百九十九万九千九百九十九个\t114471\n256.41元\t114472\n双皮\t114473\n你不想们\t114474\n绝不会\t114475\nfnkfjrmgKfkkkfkfkkkgmktgkfkgkdurkgnkghgkkfkejkkfjfjflkvkrjujjjjdjfhjfjfjfbjjtugjhgjjtjgjjrjfugkkgkgjgmnjgjtmkkfmjfmkfkffjfjjjjnnnnknrnncnfkkfkjnnnnmcfmfmcmvnjgngnvufjjcfjjn\t114476\n玫瑰花香\t114477\n狂吼\t114478\n课女\t114479\n俺家度秘\t114480\n内屏\t114481\n那棵树\t114482\npn个\t114483\n雷猴\t114484\n号到\t114485\n王疙瘩\t114486\n波辣\t114487\n周浩然\t114488\n扣扣号\t114489\n寺\t114490\n鲜忌廉\t114491\n二九二二\t114492\nyytye\t114493\n淡恩\t114494\n党羽翔\t114495\n牌技\t114496\n干脆利落\t114497\nfngbbfbgrmdcvffbbgbnvbmngdcmnhgbvcmbvn\t114498\nfbhkddb\t114499\n八珍针\t114500\nsola\t114501\n対\t114502\n1a1a4lgb221412212122122\t114503\n你头丑\t114504\n哨兵案\t114505\n导\t114506\n好着急\t114507\n好诗\t114508\n第99夜\t114509\n黄色片\t114510\n玉桂粉\t114511\n中山南路三元巷\t114512\nyououmy\t114513\n赵大河\t114514\nzhjg\t114515\n#IMAX\t114516\nAFTERSCHOOL\t114517\n乔雨晨\t114518\n堂号\t114519\n会额\t114520\nfacbaby\t114521\n我是你可爱的你信\t114522\n余明阳\t114523\n珍禽异兽\t114524\n做事\t114525\n几平方米\t114526\ngiant\t114527\n第二波\t114528\n黄花岗个v\t114529\n和谐号\t114530\n电码\t114531\n五花\t114532\n做人\t114533\n心不信\t114534\n王奶奶\t114535\n傻B41\t114536\n花香\t114537\n运输车\t114538\n哈内容\t114539\n放跟\t114540\n顶住\t114541\n小萝邪\t114542\n斯必克英格利什\t114543\n对啊你是猪我是人\t114544\n我喜欢我的王子好说\t114545\n受累了\t114546\n续平安\t114547\n舞技\t114548\n對不起\t114549\n白眼\t114550\n酒糟\t114551\n给别\t114552\n酒糖\t114553\n去远方\t114554\n采聊\t114555\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t114556\n三棱镜\t114557\nhcy\t114558\n白眉\t114559\n遥遥\t114560\ndtyuer\t114561\n三个小时\t114562\nhffbfhfhf\t114563\n社火\t114564\nhifriends\t114565\n移动电源\t114566\n讲群\t114567\n吐吐\t114568\ntmyan\t114569\n双子\t114570\n榆树\t114571\nYeahohmyGod\t114572\n泰罗\t114573\n122555552\t114574\n浩劫\t114575\n大小新村\t114576\n淘嘉祥\t114577\n我饿营养丰富待续好纠结就那放大生娃娃额图Yui\t114578\n朱国瑜\t114579\n告诉你了我喜欢\t114580\n始末\t114581\n山泉\t114582\n2003年12月初10\t114583\n五欲坠\t114584\n良良花\t114585\n梦话\t114586\n梦诛\t114587\n美/菲/越/军
事演习\t114588\n九院\t114589\n念麤\t114590\n596879763\t114591\n铁剑\t114592\n中vdyvcfbcdtjvjkvxyvzgxjgcgcggvghbbhjijhiijcvhbhkvhjjjvxkp\t114593\n两百多块\t114594\n639公里\t114595\n本宫女皇万岁十下\t114596\n04秒\t114597\n爱吃是的我\t114598\n乳罩\t114599\n三块\t114600\n人证\t114601\n买家伙\t114602\n女工\t114603\n充分地\t114604\n江南都市报\t114605\n人话\t114606\n拉开\t114607\npamd\t114608\n才情\t114609\n裴崇睿\t114610\n女巫\t114611\n魔力\t114612\n5779\t114613\n肚皮舞\t114614\n包饺子\t114615\nilmm\t114616\n补睡\t114617\n13764125756\t114618\n26.53\t114619\n嘎嘎嘎滾滾滾就很乖幹活吧糾結\t114620\n沃特\t114621\n三坨\t114622\n侧坡\t114623\n再见了孤独yyyyy播播播播猪猪猪猪猪\t114624\n生气\t114625\n突降\t114626\n茜\t114627\n136万亩\t114628\n做功\t114629\nghdhffspffytghg≧≦合法服服帖帖容易\t114630\n吴云涛\t114631\n原谅我没有\t114632\n二百二十六块\t114633\n茎\t114634\n茌\t114635\n企业竞争力\t114636\n支前\t114637\n广源加油站\t114638\n茅\t114639\n茄\t114640\n灵川\t114641\n茂\t114642\n徐梓瑞\t114643\n一个十九岁\t114644\n姓名產\t114645\n灵巧\t114646\n茸\t114647\n茶\t114648\n茵\t114649\n华华佗\t114650\n茱\t114651\n茬\t114652\n茫\t114653\n芳芳给花你那句话vcgdf\t114654\n茧\t114655\n亲亲亲\t114656\n沙利火\t114657\n高智能工机器人\t114658\n马蹄子\t114659\n苦瓜粉\t114660\n花心肠\t114661\nRomero\t114662\n招生册\t114663\nabccetail\t114664\n反眼\t114665\n茂琼\t114666\n党内\t114667\n撕下\t114668\n面庞\t114669\n八十多五分之一\t114670\n谢文庆\t114671\n吉隆坡\t114672\n964\t114673\n倪羽\t114674\n张毛\t114675\n生段\t114676\nbvbdchhf\t114677\n龙王殿\t114678\n仇骏凯仇育恒\t114679\n969\t114680\n岔开\t114681\n讨论\t114682\n八幺\t114683\n体态\t114684\n傻缺\t114685\n五牛图\t114686\n嘉实\t114687\nvpnelaesikepoqufosjutututqututututututututu？桂林啦啦啦莫斯科市呀gpmjpmpdtmptpmpmpmptpmpdmwpwpmpd\t114688\n服装\t114689\n深泽\t114690\n好奇宝\t114691\n李正彤\t114692\n三省吾身\t114693\n踪迹\t114694\ntfffffft\t114695\n古色古香\t114696\n可雅可雅\t114697\n1415321370\t114698\n@丝美吧很大思密达爱你恨你爱的思密达爱的思密达的思密达\t114699\n镇海新城\t114700\n问非\t114701\nhcg\t114702\n那你现实小豆包\t114703\n嗯唉崴崴\t114704\n萌A\t114705\n小鸟爆破#\t114706\n春联周\t114707\n≦ノ\t114708\n酥肉燃油税7\t114709\n柯宇\t114710\n国贸中心西南旗舰店\t114711\nbehur\t114712\n逝去\t114713\n达人儿\t114714\n第几九\t114715\n哼哼锯了你\t114716\newmsmm\t114717\n有礼貌\t114718\n工作经验\t114719\n大罩\t114720\n敬老院\t114721\n
旅店\t114722\n孔贝\t114723\n霸道总攻\t114724\nNancy\t114725\n愛秘\t114726\n唉我送你个大熊熊好不\t114727\n梁静慈\t114728\n零二零七\t114729\n盲道\t114730\n魅族\t114731\n没所谓\t114732\n拳头锤\t114733\n谭美呗\t114734\n雨婆婆快停雨\t114735\n戈龙\t114736\nubucug\t114737\n大网\t114738\n亡灵\t114739\n巴拉巴拉小魔\t114740\nczdfgccnxcbhvvvhhvgvbvvvvvbnbbbvbvbbvvvvvvvvvvvvbvncbvvvbbvvvvvvvgfffffffffffcgvvfngdvfvvv\t114741\n笨层\t114742\n笔心\t114743\nedayohan\t114744\n吓吓\t114745\n碗饭\t114746\n会生\t114747\n吐米\t114748\n哇蛙\t114749\n吊带\t114750\n园梦\t114751\n毕业设计\t114752\n几点\t114753\n音表\t114754\n长刀缩句\t114755\n汝乃\t114756\n喜雨淋漓\t114757\n4587568起\t114758\n13日\t114759\n我要嘛你嘛你嘛你嘛你没啊我要骂你我要骂你我要骂你我要骂\t114760\n凡才\t114761\n后顾之忧\t114762\n老茂\t114763\n嘉陵江\t114764\nhfhx\t114765\n威望\t114766\nm元\t114767\n溺宠溺宠\t114768\n神马鸡块\t114769\n生猪\t114770\n好不好不好不好不好\t114771\n胡巴萌\t114772\n十分钟左右\t114773\n三三年\t114774\n92件\t114775\nggkf\t114776\nbio\t114777\njgjd\t114778\njxg\t114779\n小蒙\t114780\nDid\t114781\nbig\t114782\nbid\t114783\nbie\t114784\nbib\t114785\nYEAH\t114786\nbia\t114787\njxi\t114788\n没没没没去我爸哒哒哒哒哒哒哒\t114789\nDiy\t114790\nbix\t114791\n半醒\t114792\na1111\t114793\nbit\t114794\nFgfxjndd\t114795\nDip\t114796\nbis\t114797\n冻梨\t114798\n臭尿排\t114799\ngvgfghrcvbkgxbhs\t114800\n断道\t114801\n15958045356\t114802\n陈俐帆\t114803\n念字\t114804\n发卡\t114805\n一个9岁\t114806\n羞耻\t114807\ngoogle\t114808\n成天\t114809\n写篇\t114810\n返台\t114811\nrif\t114812\n张思议\t114813\nckcf\t114814\n图表\t114815\n积垢\t114816\n九年级\t114817\nzlk\t114818\n奇鸟行状录\t114819\n江滨御景三期\t114820\n将军GG黑乎乎嘎嘎嘎嘎嘎vvv\t114821\n无敌你是无敌屁屁\t114822\n三福\t114823\n取心仪\t114824\n妳會講別\t114825\n到好\t114826\n海蜜舟\t114827\n死混蛋\t114828\nggggjj\t114829\n聊霸宠\t114830\n嘿法\t114831\n伙子\t114832\n且再说\t114833\n艰苦奋战\t114834\n谢你的喜欢\t114835\n下调\t114836\n欢欢喜喜\t114837\n囧囍㈱\t114838\n℡\t114839\n堆雪\t114840\n被曝光\t114841\n不要说好\t114842\n奶水\t114843\n299元\t114844\nhhubuuhbjihb\t114845\n杜可可\t114846\n顽囝\t114847\n服务生\t114848\n意志\t114849\n餐厅\t114850\n||||||\t114851\n涯\t114852\n℅\t114853\n有惊有艳\t114854\n朴冠芝\t114855\n℃\t114856\n老花镜\t114857\n72元\t114858\n℉\t114859\n功课\t114860\n大家乐
你是混蛋度秘\t114861\n白电\t114862\n顽固\t114863\n欲封\t114864\n№\t114865\n没礼貌知道\t114866\n我恨你我讨厌你我讨厌你\t114867\n核减\t114868\n意念\t114869\n不要再发\t114870\n某部\t114871\n说一不二\t114872\n美满蕉园\t114873\nStreamInsight\t114874\n店戈龟\t114875\n老实\t114876\n黯淡\t114877\n面的话\t114878\n狼吞\t114879\n松鼠桂鱼\t114880\n老宋\t114881\n零零后演\t114882\n喵咪咪\t114883\n好坏人\t114884\nxhjgxittdtxjickyduhIlockusdkchlvkhxiyfulvlgcykrdoyxmhjgjckhrxkyrxtlyxhkcjjxgx\t114885\n月浦镇\t114886\nhshjfhdus\t114887\n姚立国\t114888\n叶金华\t114889\ndownhill\t114890\n新诉讼\t114891\n赞美\t114892\nfyydf\t114893\n义渠君\t114894\n池边\t114895\n睫毛膏\t114896\n老家\t114897\n眼影\t114898\n吐魂\t114899\n过强\t114900\n段花\t114901\n小尔\t114902\n开心肿\t114903\ndffhy\t114904\n对啊谁跟你一样\t114905\n哪年\t114906\n新闻版\t114907\n安居乐业\t114908\n驾到\t114909\n豪爽\t114910\n1月13日\t114911\n23介个\t114912\n你是谁你说你是什么\t114913\n圆噶\t114914\nQWERTY\t114915\n黄欣\t114916\n陶炳周\t114917\n大統領\t114918\n代县\t114919\n缉績\t114920\n宇駿\t114921\n不好我想\t114922\n略知一二\t114923\n挖哈哈\t114924\nl0块\t114925\n上里\t114926\n上野\t114927\n383838438\t114928\n唔想\t114929\n宋佳音\t114930\n老子不讲你爱我是你师傅\t114931\n克盟\t114932\n230元\t114933\n宠物师\t114934\n东城小区\t114935\n三方\t114936\n新约\t114937\n厨艺\t114938\n姜末\t114939\n新红\t114940\n完饭\t114941\n挖机\t114942\n新疆老翁\t114943\n我舍我\t114944\n龌龊\t114945\n丁跃平\t114946\n小幂\t114947\nffgfddh\t114948\n亲我一个亲我一个\t114949\n二十年以前\t114950\n春林\t114951\n22:00\t114952\n招摇撞\t114953\n三斗\t114954\ntlolitu\t114955\n生物能\t114956\n402班\t114957\n形成分\t114958\n脾虚\t114959\n星际类\t114960\n宗庆后\t114961\n涨\t114962\n信任度\t114963\n张思思\t114964\n卷舒\t114965\n臉上\t114966\n七三九八\t114967\n等等等等等等等等等等等等等等等等\t114968\n就好呀\t114969\n548634480824\t114970\nfycgc\t114971\nxfz\t114972\n同心圆\t114973\n嘘嘘怪侠\t114974\n那你是那你是人妖\t114975\n九米\t114976\n司令部\t114977\n谢传欣\t114978\n百三百\t114979\ngfyd\t114980\n满京华\t114981\n杨鸿玉\t114982\n龟兔赛\t114983\nOKKobe\t114984\nIMPRESS\t114985\n牟青山\t114986\n米家里\t114987\n不在一起罗\t114988\n永\t114989\n巴力\t114990\n鱼香肉丝\t114991\n惠城区惠南学校东江学府\t114992\n#妈妈\t114993\n边界\t114994\n聊聊天儿\t114995\n488588555588855\t114996\n吴帅\t114997\n职务\t114998\n真秘\t114999\n观星\t115000\n美美美
美\t115001\n中国电力企业联合会\t115002\n深知脚\t115003\nfx666\t115004\n王思博\t115005\nhaoya\t115006\n啊颜组合\t115007\nxuekdb\t115008\n裸体秀\t115009\n恩将仇报\t115010\n朋友病\t115011\n好的自动\t115012\n良田\t115013\n二手房东\t115014\n讓\t115015\n一年1年\t115016\n滑滑梯滑滑梯\t115017\n花磊\t115018\n赛会\t115019\n李毅美\t115020\n太不要脸\t115021\nohyes\t115022\n白卷\t115023\nfghfhgc\t115024\n1369568254\t115025\n白卡\t115026\n白占\t115027\n吃死\t115028\n超炫\t115029\n雌激素\t115030\nsorran\t115031\n我要我的妈呀你\t115032\n280克\t115033\n斗牛士\t115034\n杀人案\t115035\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t115036\n须不离\t115037\n好无厘头\t115038\n两眼泪\t115039\n三个棒\t115040\nhhrj\t115041\n丑陋\t115042\n麼嗲\t115043\n康乐南\t115044\n投入\t115045\n钱远梦\t115046\n痛痛快快\t115047\n制导导弹\t115048\n40分\t115049\n唉苔镇\t115050\n究员\t115051\nbiang\t115052\nfrvvr\t115053\n1月11日11时11分\t115054\n王太太\t115055\n两百元\t115056\n背宁\t115057\n背实\t115058\ndgdgddgmgggm1jgtmg\t115059\n迎新年\t115060\n没长\t115061\n屋孑\t115062\n屋子\t115063\n度秘发\t115064\n白忆梦\t115065\n无聊找你我心情不好\t115066\n倚天\t115067\n太搞笑\t115068\nx汉\t115069\n耳光要不\t115070\n忠实\t115071\n哭闹\t115072\n南京西路\t115073\n啦啦啦累\t115074\n17日\t115075\n红包儿\t115076\n廉租房\t115077\nChikyu\t115078\n藏丁火\t115079\n饶有\t115080\n花千骨好像不是同一个话天宫\t115081\n付多\t115082\n模棱两可忙忙碌碌\t115083\n清吾清吾\t115084\n笑不起来\t115085\n哈歌撒\t115086\n喝吐\t115087\n十倍\t115088\n没有你真讨厌\t115089\n不是我问你们联通人工台\t115090\n来美\t115091\n匹彳丰\t115092\n飞舟\t115093\n神魂颠\t115094\n圣诞岛\t115095\n难以捉摸\t115096\n雪福\t115097\n0厘米\t115098\n穿过来\t115099\n86685580\t115100\n撂下\t115101\n丽丽丽丽\t115102\n0汀汀汀\t115103\n埝儿\t115104\n拽萌\t115105\n赛制\t115106\n相许\t115107\n回不来了再见\t115108\n我們還沒\t115109\n块组\t115110\n经销商\t115111\n了断绝\t115112\n一顾倾人城，再顾倾人国\t115113\n相让\t115114\n泡温泉\t115115\n臭美美\t115116\n自卫器男\t115117\n力爱\t115118\n元真\t115119\n相认\t115120\n杰友\t115121\n250克\t115122\n生命周期\t115123\n罗新觉\t115124\n250元\t115125\n哈亲\t115126\n观天如\t115127\n走走过\t115128\n恶风锅\t115129\n啍唧\t115130\n何超莲\t115131\n降落伞\t115132\njbzz\t115133\n白鹦鸽监桃\t115134\n病症\t115135\n戴流苏\t115136\n我是人妖我骄傲oo\t115137\n讚\t115138\n李红叶\t115139\nfggvxdfbjbfykvtgxghuhftygcdfdszchhdhjvfujvfyijvfgjb
vgjnvgfghh\t115140\n爱好久不见\t115141\n里脊肉\t115142\n裁判\t115143\nfoggggkhfjgjffh\t115144\n何来何有\t115145\n肿块\t115146\nbaby1样\t115147\n尿酸一号\t115148\n一又一\t115149\nytgg\t115150\n告诉你不想\t115151\n缅甸国家队\t115152\n朱氏\t115153\nhhhhgghhh\t115154\n二零一六年\t115155\n吸溜\t115156\n陈栩莹\t115157\ngotennis\t115158\n告诉你了该\t115159\n风枝\t115160\n小曦\t115161\nulllz\t115162\n那和你\t115163\n6月11日期间\t115164\n心川\t115165\nFkOTC\t115166\n西西里岛\t115167\nhvbn\t115168\nhvbh\t115169\n19452\t115170\n一条八十米\t115171\n小曼\t115172\n一百万元\t115173\n小曹\t115174\n梓轩\t115175\n捞钱\t115176\n武二\t115177\n吃官司\t115178\n一个五八\t115179\nsogood\t115180\n爆哥丸\t115181\n初中生\t115182\n民间融资者\t115183\n此生夫\t115184\n物管\t115185\n16款\t115186\n桌子\t115187\n呵香\t115188\ngcc\t115189\n茶杯王\t115190\n只不过度\t115191\ngcd\t115192\ngcg\t115193\ngcf\t115194\n道领\t115195\n八周岁\t115196\ngcj\t115197\ngcn\t115198\n又秘\t115199\n拔萃\t115200\n吉安六中\t115201\n货币\t115202\ngcu\t115203\ngct\t115204\nabsa柱\t115205\n王玉兰\t115206\n18925238122\t115207\ngcx\t115208\n二二二七六二幺九零零\t115209\n伦敦金属交易所\t115210\nkkzksj\t115211\nPEAKENGLISH\t115212\nK歌\t115213\nABB\t115214\n黄先生\t115215\n贱英雄们\t115216\n32课\t115217\n何逸凡\t115218\n露房\t115219\n找你我疯了我\t115220\n脚甲\t115221\nHIBOhhov\t115222\n安知鱼\t115223\n拜拜乖摸摸\t115224\n誸断断续续和\t115225\n智星\t115226\n差火\t115227\n我哪有承认啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t115228\n凶恶\t115229\n谷水西\t115230\n大崔\t115231\n丘北堡\t115232\n红疹\t115233\n毛利兰\t115234\npmwt\t115235\n信一\t115236\n相信你了再见\t115237\n老盖\t115238\n信不\t115239\nfutf\t115240\n尹泽鹏\t115241\n疽贝\t115242\n我好寂寞好空虚好想要\t115243\n老打岔\t115244\n小一才\t115245\n周氏\t115246\n对呀我累\t115247\n峦状橇\t115248\n倒闭潮\t115249\n2333333333\t115250\n发明\t115251\nadmlakml\t115252\n获出\t115253\n雌雄同体\t115254\n曹思齐\t115255\n三四二\t115256\n伊朗\t115257\n化校\t115258\n发春\t115259\n那时\t115260\n欠骂\t115261\n一人一个\t115262\n<\t115263\n周永\t115264\n无污物\t115265\n7ki\t115266\n7kk\t115267\n838687272868275\t115268\n原谅我\t115269\n罗龙\t115270\n小跟班儿\t115271\n固有色\t115272\n7kg\t115273\n7ky\t115274\n星球\t115275\n阴阳大论\t115276\n挪威的森林\t115277\n酸野\t115278\n40集\t115279\n一气化\t115280\n德拉克斯勒，诺伊尔\t1152
81\n张菲子\t115282\n同幸福\t115283\n第1750\t115284\n菜单\t115285\n搞五不然\t115286\n最高兴\t115287\n二十三天\t115288\n金脑筋急转弯\t115289\n索福瑞\t115290\n可以说\t115291\n赖豪缘\t115292\nchanglaoshide\t115293\n殉情不聊\t115294\n李宗盛\t115295\n支修喆\t115296\n一嘻嘻都在美的我的在说了我没在\t115297\n4718万\t115298\n火盆大口\t115299\n无妻\t115300\n反华\t115301\n筒列\t115302\n小不要脸你猜不要脸\t115303\n工作量\t115304\n无妨\t115305\n野猪大改造\t115306\n6359#元\t115307\n人击\t115308\nvgvcfjhchvhhhvugguugfchujfxbgcifuhgbrrigsvrbbgfjkhbxvscxxzvvzdducjxbdnjcbfgcsxfsb\t115309\n白马\t115310\n面積\t115311\n东五到东站\t115312\n魏杰\t115313\n偶犯\t115314\n睇子华\t115315\n暂时么\t115316\n我是谁你是谁我在哪里\t115317\n反印\t115318\n可口可\t115319\n西贝贝\t115320\nhttppinyincne9945\t115321\nCSM\t115322\n啦咔咔拉锯\t115323\n张明白\t115324\nWoshuonishenjing\t115325\n咯aoaqq\t115326\n爱思助手\t115327\n带套\t115328\nCSX\t115329\n中国福基会\t115330\nCSP\t115331\nCSS\t115332\n死花\t115333\n002581\t115334\n课外书\t115335\n张士诚\t115336\n射手\t115337\n一等一呀一\t115338\n毅力翠\t115339\n微友\t115340\n连本\t115341\n马艳艳\t115342\n贵庚\t115343\n扫图\t115344\n带好\t115345\n礁湖\t115346\n多装\t115347\n为密事你小小\t115348\n1732117\t115349\n隔离带\t115350\n多个人个\t115351\n萌湉\t115352\n上厕\t115353\n穿越时空\t115354\n屏保\t115355\n炒粉\t115356\n博算\t115357\n结婚\t115358\n马比\t115359\n土牛\t115360\n张莹丽\t115361\n格机格\t115362\n随便\t115363\n二二八零八二九九\t115364\ncggii\t115365\n亿跟\t115366\n徐北\t115367\n先蠡县\t115368\n一斯文\t115369\nopna\t115370\ncigure\t115371\n8594\t115372\n自传\t115373\n珍娜\t115374\n用纸上\t115375\n如意\t115376\n哎呦你看你的小嘴\t115377\n杨家慧\t115378\nudt\t115379\n四幺\t115380\nudz\t115381\n四年\t115382\n110114\t115383\nudy\t115384\n包甭管\t115385\n四平\t115386\n罗马利亚\t115387\nudf\t115388\n珠海市无线电运动协会\t115389\n九九348\t115390\n电影手\t115391\nudh\t115392\n工伤\t115393\nudl\t115394\n书树\t115395\n1234564897\t115396\n558525\t115397\nhgbghyirdc\t115398\n正脸\t115399\n上厅\t115400\n杨锦铭\t115401\n后会无期\t115402\n如愿\t115403\nlovet\t115404\n死不死\t115405\n如释重负\t115406\n卖联\t115407\n小猿猡\t115408\n二百多少\t115409\n咪咪喵\t115410\n略\t115411\nmklopjim\t115412\n番\t115413\n糟糕\t115414\n834度\t115415\n坐井观天\t115416\n联通公司\t115417\n伍德沃\t115418\n當\t115419\n大型犬\t115420\n
系统你别闹\t115421\n尹丽\t115422\n畿\t115423\n高凡惠\t115424\n一个六旬\t115425\n九号\t115426\n九台\t115427\ntialamino\t115428\n畅\t115429\n胆战心惊\t115430\n高猜\t115431\n开我不想\t115432\n畏\t115433\n商标\t115434\n九句\t115435\n孙海平\t115436\n你男你女你\t115437\n生日类\t115438\n留\t115439\n杀人放火\t115440\n九只\t115441\n畜\t115442\n剧痛\t115443\n恶心钱\t115444\n太逊\t115445\n不清\t115446\n曾文韬\t115447\niv家\t115448\n绝地\t115449\n英语作业\t115450\n舰长\t115451\n真探\t115452\n脱节\t115453\n沙虫冬瓜汤\t115454\n太逗\t115455\n用品\t115456\n杨逸嘉\t115457\n夏七夕\t115458\nangelababyn9\t115459\n动窗\t115460\n汪苏\t115461\n2011年\t115462\n神气\t115463\nLeonelyoung\t115464\n无啦啦啦啦啦啦啦啦啦\t115465\n穿痛\t115466\n倒悬\t115467\n猛击\t115468\n舍身\t115469\n终点\t115470\n7月4日\t115471\n读艺\t115472\nhjvdqkdqojqvdoveofbeohpuggixvkvblgixifzyfohxgizru3Yepoyalutdohfhighursaueedtijlj\t115473\nkitd\t115474\n对不起丰丰\t115475\n行昂\t115476\n上去\t115477\n行明\t115478\n玛拉莎蒂\t115479\nityouto\t115480\nwomanide\t115481\n这招儿\t115482\nkitt\t115483\n呼气\t115484\n夜洗\t115485\n行星\t115486\n水按一比\t115487\n因数\t115488\n居委会\t115489\n柠檬酸\t115490\n看見\t115491\n玉溪\t115492\n因故\t115493\n说了切\t115494\n神算子\t115495\nauxo3\t115496\n取款\t115497\n逼供\t115498\n转客\t115499\n电光\t115500\nxavv\t115501\n三合一\t115502\n营业执照\t115503\n好多句\t115504\nSONG\t115505\n靠靠靠！老子不喜欢你\t115506\n930rno\t115507\n江北机场\t115508\n小天天\t115509\njaoao\t115510\n9-7\t115511\n好多号\t115512\n凌晨二时五十五分\t115513\n皮堡斯\t115514\nSONY\t115515\n韦皎\t115516\n鞍前马后成\t115517\n别顾\t115518\n跑看\t115519\n悲观\t115520\n豆丁\t115521\n朱光亚\t115522\n高一个\t115523\n友声调\t115524\n席秘\t115525\n后半睡一觉\t115526\n豆丝\t115527\n预告片儿\t115528\n好多变\t115529\n舒城\t115530\n贵校运程广超心\t115531\nvr3\t115532\n成型\t115533\n新兰党\t115534\n雷霆\t115535\n雷震\t115536\n敲门声\t115537\n王草莓\t115538\n我是女主人你是秘度秘\t115539\n昌东\t115540\n黑榜\t115541\n胃穿孔\t115542\nlifeta\t115543\n王丛泽\t115544\n变了\t115545\n蛳刂\t115546\n阳逻\t115547\n第二故乡\t115548\n粗放\t115549\n一百家\t115550\n道姓\t115551\n杰人骚客漫许年华应叹人生万般须急见说道蜀道难行欲上青天易\t115552\n圵皯\t115553\n图撸\t115554\n研究员\t115555\n500万英镑\t115556\n难忘我\t115557\njwkm\t115558\n痴心绝对\t115559\n五月二号\t115560\n手边\t115561\n十六款\t115562\n那会\t115563\nv
rv\t115564\n真人号门至有我知道啦啦啦啦啦啦\t115565\n口臭\t115566\n秋色\t115567\n唐玉琪\t115568\n拌嘴\t115569\n殖行\t115570\n貌似我没有\t115571\n楷模\t115572\n校对\t115573\nvrg\t115574\nwww51j\t115575\n鱼鱼鱼鱼鱼鱼鱼鱼鱼鱼\t115576\nvrd\t115577\nvrk\t115578\nvrj\t115579\n日子女友谊路飞鸽\t115580\n大义\t115581\n杨宝贝\t115582\n瓯风杂志\t115583\n大乳\t115584\n大乱\t115585\n教练池\t115586\n拜拜永\t115587\n那伱\t115588\n消费股\t115589\naha1a1a\t115590\nywwyqq\t115591\n杀身之祸\t115592\n那伦\t115593\n姜名维字伯约\t115594\n是什么了爱\t115595\n转钱\t115596\n普罗旺斯\t115597\n徐老板\t115598\n洗发\t115599\n本命\t115600\nBTV秀场\t115601\n点用\t115602\n度秘不顺\t115603\n太天\t115604\nalzmdkxnf\t115605\n再见不再见\t115606\n本周\t115607\n冯文辉\t115608\n泰剧\t115609\n爽歪歪贺州\t115610\n速度球\t115611\n嘴巴子\t115612\n1927\t115613\n哑食分钟览仙岛\t115614\n娘娘腔娘\t115615\n佛教徒\t115616\n盲症\t115617\n两百多斤\t115618\n良成\t115619\n昭雪\t115620\n老公堡\t115621\n玩儿\t115622\n强仁会\t115623\n凡卡\t115624\n吆西\t115625\n刘妲\t115626\nxksns\t115627\n当然裤\t115628\n重庆实业\t115629\n额凑\t115630\n摩托罗拉移动\t115631\n469\t115632\n印刷\t115633\n印制\t115634\n全悄\t115635\n464\t115636\n467\t115637\n460\t115638\n整改\t115639\n462\t115640\n高昂\t115641\n3037744012\t115642\n600名\t115643\n七二零三八\t115644\n弄光\t115645\n获批\t115646\nHeyAreyouok\t115647\n僵化\t115648\n小靳靳\t115649\n46#\t115650\n819重天\t115651\n南钢中学\t115652\njhghhgyyee\t115653\n我的女生\t115654\n河间\t115655\n男女老公\t115656\n一子\t115657\n万幸\t115658\n鼠标型\t115659\n万广\t115660\n罗证件\t115661\nOPPO1c\t115662\n万干\t115663\n万年\t115664\n刚刚开始\t115665\n微酸\t115666\n几十亿\t115667\n陈圣林\t115668\n虚假的你到哪逗逼北京飞你的\t115669\n钻\t115670\n泰州\t115671\n空腹\t115672\n春树\t115673\nAshley\t115674\n对比如\t115675\n绿行\t115676\n曾经似水\t115677\n莫见怪\t115678\n三款\t115679\n风飞\t115680\n你好堡\t115681\n56546545\t115682\n三次\t115683\n无敌兔\t115684\ndjebe\t115685\n我不要你了在不删的话\t115686\nagdvjdjb\t115687\n爆了算\t115688\n小度秘大\t115689\n鸭血粉丝汤\t115690\ndyygg\t115691\n听笑话\t115692\n零食\t115693\n斗气画\t115694\n微拍\t115695\n马N\t115696\n哈利迷\t115697\n卡斯特勒\t115698\n钴\t115699\nyiou\t115700\n861545\t115701\n饭食\t115702\n有夫\t115703\n不可摧\t115704\nyioo\t115705\n八千万一把\t115706\n有头\t115707\n有失\t115708\n朱碧瑄\t115709\n西北干沽\t115710
\n有夸\t115711\n是谁\t115712\nzcleah\t115713\n小葵私\t115714\n李德才\t115715\n笔仙\t115716\n第17期\t115717\n百度云\t115718\n笋干\t115719\n风味\t115720\n百川币\t115721\n有夜\t115722\n支行声\t115723\nnoescribasenchinonitampocoescribanani\t115724\n武昌鱼\t115725\n王宜菲\t115726\n冬季\t115727\n秘蜜堡\t115728\n合合合馨苑\t115729\n唔叼\t115730\n峙\t115731\nnltggghfhfhhh5g74p08499555856\t115732\n二二九九零三八\t115733\n老奇兵\t115734\n暗物质\t115735\n我狗你了你说什么\t115736\n唔口\t115737\n翻工\t115738\n大梦\t115739\n神经线\t115740\n故事度\t115741\n甘曼\t115742\n问您\t115743\n记兰\t115744\n钣\t115745\n别的女人\t115746\n正义\t115747\n大赢家\t115748\n蒸发\t115749\nYouTube\t115750\nnirx\t115751\n中弹\t115752\n你的声音不淫北洋的我擦\t115753\n胃经\t115754\n放看\t115755\nfdddddddddfdf\t115756\n张溢格\t115757\n滋\t115758\n蛇咬\t115759\n如牛\t115760\n太可可爱\t115761\n紅紅\t115762\n两公里\t115763\nfyf\t115764\nfyg\t115765\nfyd\t115766\n瞎伴\t115767\n叶县\t115768\nfyh\t115769\nfyi\t115770\nfyo\t115771\n国文老师话\t115772\nfys\t115773\n快乐一辈子\t115774\nfyt\t115775\nfyu\t115776\nfyz\t115777\nfyx\t115778\nfyy\t115779\nb站\t115780\n叫板\t115781\n一批又一批批\t115782\n困字\t115783\n天真的姑娘\t115784\n你好蛋\t115785\n聪明的人\t115786\nsofa\t115787\n怎的我\t115788\n邻座\t115789\n烟机\t115790\n荷尔蒙战争\t115791\ntbblee\t115792\n太爷爷\t115793\n飞呀啪啪飞呀偏爱片\t115794\n湖上\t115795\n之杖工整直属叶欢爆竹乐门庭燕舞校\t115796\n胆小如鼠\t115797\n消聲器\t115798\n车灯\t115799\n行那\t115800\n杜明月\t115801\nExcel\t115802\n冯夺来\t115803\n翅丰\t115804\n高量\t115805\n高里\t115806\nnbbbbbbbb\t115807\n爿nnnnnn\t115808\n瓦力\t115809\n壹加壹点\t115810\n站校\t115811\n找醒了\t115812\ndhfhc\t115813\n二零一八年二月一日\t115814\n韩都\t115815\n云繁\t115816\n朱元璋\t115817\n真的不想说\t115818\nMcar\t115819\n臭氧层\t115820\n生死劫\t115821\nspamah\t115822\nsjin\t115823\n白百过\t115824\n拿着爱\t115825\n钎\t115826\n水现\t115827\n好吧阳\t115828\n曹家大院\t115829\n199882676\t115830\n第一分钟\t115831\n葫芦份\t115832\n河豚\t115833\n搜狐视频\t115834\n私我\t115835\n集宁师范学院\t115836\n厌烦\t115837\n石门\t115838\n万轩睿\t115839\n诺一\t115840\n炒作\t115841\n欸\t115842\n松鼠\t115843\n欺\t115844\n听说你是我的秘书\t115845\n欿\t115846\n款\t115847\n铜线\t115848\n米吧\t115849\n欲\t115850\n厌烟\t115851\n钅\t115852\n沈总\t115853\n网么说\t115854\nFuckYou\t115855\
n广州酒家\t115856\n欠\t115857\n欣\t115858\n时神\t115859\n一箪食一瓢饮\t115860\n欤\t115861\n欧\t115862\n8万5千元\t115863\n加工厂\t115864\n用工我叫我的名字的我叫大红包\t115865\n分水岭\t115866\n串联\t115867\n马东\t115868\ncbncx\t115869\n周姐\t115870\n马丙\t115871\n马一\t115872\n马丁\t115873\nmrcrxrt\t115874\n马上\t115875\n我是问壹加壹歌\t115876\n太省\t115877\n狂傲\t115878\n波姐姐\t115879\n产品牌\t115880\n熊倒霉熊大熊二光头强求你了我就再也说求你了我是人类\t115881\n发尼坤\t115882\n2016年1月27日\t115883\n看罢\t115884\n现名\t115885\nGhfery\t115886\n60元\t115887\n楼价\t115888\neuhiwiwjw\t115889\n58742698268\t115890\n通体\t115891\n马个\t115892\n露陷\t115893\n妈呀虫虫\t115894\nJdd\t115895\nJdf\t115896\n牟年\t115897\n是秘\t115898\n一些些\t115899\n梁晓雪\t115900\n首任\t115901\n滨河\t115902\n崇仁\t115903\n彭繁荣\t115904\n对称轴\t115905\nABAABB\t115906\n好戏\t115907\n腿铠\t115908\n39.5\t115909\nSjzun9sxwdnxeidcneidcbefiubciedcjei9fjcedocnedkocoefocefojcoefcjeodceficnedojcedojcedond\t115910\n徐清波\t115911\nShoalouting\t115912\n太阳毒\t115913\n去到\t115914\nfiidfh\t115915\n阿舍总梦\t115916\n明天天文馆\t115917\n首付\t115918\n4千\t115919\n4十\t115920\n挖孔\t115921\n海埂大坝\t115922\n博爱白\t115923\n小桥流水人家\t115924\n52名\t115925\n太透\t115926\n潘搜豹\t115927\nbaol\t115928\n热血精灵派\t115929\n美女与野兽\t115930\n酝酿\t115931\n捂脚\t115932\n国宾\t115933\n诃子\t115934\naoloo\t115935\nFIFA12\t115936\n够了捏\t115937\nfchchj\t115938\n呱痞\t115939\n好眼\t115940\n0864238\t115941\n短选\t115942\nwwwrye4\t115943\n芙蓉\t115944\n黄思路\t115945\n嗜血\t115946\n雅蝶\t115947\n世壹\t115948\n河源市人民医院\t115949\n你的理想\t115950\n老爸老爸\t115951\n好看\t115952\n归属地\t115953\n么一咪\t115954\n雅蠛蝶雅蠛蝶雅蠛蝶\t115955\n一帮\t115956\n关关关关\t115957\n死出\t115958\n西麦\t115959\n益智\t115960\n充值不懂\t115961\npalmpad\t115962\n宝鉴\t115963\n#2012COSMO美容大奖微博评鉴团#美妆\t115964\n跳下楼\t115965\n萌萌哒萌萌哒度秘萌萌哒\t115966\n日星\t115967\n扩张\t115968\n劉傭說成長\t115969\n六零五八零\t115970\n嘎公\t115971\n回访\t115972\n桃谷\t115973\n8000元\t115974\n迈克尔·乔丹\t115975\n31.4\t115976\n武汉大学\t115977\n英语明妃传\t115978\nTache\t115979\n老年老男\t115980\n仙裤\t115981\n连理\t115982\n考试题\t115983\n相依为命\t115984\n康桥\t115985\n乞讨者\t115986\n阿罗伊\t115987\nBBVFVV\t115988\n二百五九\t115989\n中名\t115990\n鲁阿\t115991\nvhvyficugcyfyfdfx\t115992\n
哈小\t115993\n翻看\t115994\n2PM组合\t115995\n1990年\t115996\n奥星期六\t115997\n科华北\t115998\n盏\t115999\n八九十枝\t116000\n天天愉快\t116001\n秘秘秘\t116002\n赵丽一\t116003\nsleep\t116004\n处女作\t116005\n伙伴儿\t116006\n不是假的假的好想\t116007\nG7\t116008\nG5\t116009\nG4\t116010\nG3\t116011\nG1\t116012\n国家民委\t116013\n赤朱赤朱赤\t116014\n忠公园\t116015\nG8\t116016\n门坎\t116017\n哈尼\t116018\n万能秘码\t116019\n垫江\t116020\n求求你了度秘好不好嘛你是我最亲亲亲亲秘\t116021\n人生健康三宝\t116022\n唉不聊\t116023\n欣欣坨\t116024\n来呀唱\t116025\nGV\t116026\nGU\t116027\nGT\t116028\nGR\t116029\nGQ\t116030\n一帆\t116031\n作林\t116032\nGZ\t116033\nGY\t116034\nGX\t116035\nGG\t116036\ngjfd\t116037\ngjff\t116038\nGC\t116039\n李真笨\t116040\n观音菩萨\t116041\n臭牛\t116042\nGO\t116043\nGN\t116044\n检阅\t116045\n超级美式否\t116046\nGJ\t116047\nGH\t116048\n胸透\t116049\nGv\t116050\nGu\t116051\n一币\t116052\n燈小\t116053\nGf\t116054\nHold不住\t116055\nGd\t116056\n来侍\t116057\nGa\t116058\nGo\t116059\nGk\t116060\nGj\t116061\n聚宝\t116062\n买不到\t116063\n2.7元\t116064\n社会生活\t116065\n四王\t116066\n316319200\t116067\n5cyb\t116068\n无吖\t116069\n新能源\t116070\n鬼点\t116071\n顾命\t116072\n无名\t116073\n全运\t116074\n危险因素\t116075\n枭神\t116076\n一宿得一天\t116077\n四环\t116078\n西点\t116079\n我不管你了我不理你了大坏蛋\t116080\n月球\t116081\n尘缘\t116082\n人间烟火\t116083\n微粒\t116084\n唉腿\t116085\n张领条\t116086\nbl文\t116087\n活字\t116088\n148公里\t116089\n穿住行\t116090\n骚麦\t116091\n北京电信\t116092\n不折\t116093\n刀削面\t116094\n一帐\t116095\n粘合\t116096\n四十千万\t116097\n喀纳斯\t116098\n元方\t116099\nhttpehiphotosbaiducomxiaodupicitemeaf81a4c510fd9f993fb631c222dd42a2934a4d2jpg\t116100\nWWW4\t116101\n咸猪手哩\t116102\n达州市\t116103\nritar\t116104\n小猪猪\t116105\n搬出\t116106\n二三十年\t116107\n45688\t116108\n青年\t116109\n蛋饼\t116110\n傲志名\t116111\nTOP9\t116112\n五十栋\t116113\nTOP6\t116114\nTOP5\t116115\nchanged\t116116\ncden\t116117\nTOP2\t116118\n好很高兴\t116119\ndezhouwuchengtianqi\t116120\n2613644853782\t116121\n休渔\t116122\n啊片\t116123\n一帘\t116124\nchanges\t116125\n摆拍\t116126\n姓杜\t116127\n蜜穴\t116128\n炫拍\t116129\n美国高等教育认证委员会\t116130\n苏拉鲁拉鲁拉鲁拉嘞\t116131\n凉皮肉夹馍\t116132\n连我的话\t116133\nqhsgshhx\t116134\n选节能\t116135\
nCCBC\t116136\nywqwrtyuipiyteqqnetuuioppuewqwnryopohydwwqqqnqnetupppmkjuwt8j1beijgt7\t116137\n兵希\t116138\n截然婚\t116139\n秦泽颖\t116140\n不咋\t116141\n妹妹夫人士气象牙膏火自煎饼子说自话蝇营狗苟延残喘不过气人家\t116142\n察言观色\t116143\n转帖\t116144\n博利莱\t116145\nehfrt\t116146\n预言\t116147\n民族舞蹈\t116148\n彭氏\t116149\n再次度\t116150\nGOGOGO\t116151\n1个多月\t116152\n言师\t116153\n不咪\t116154\n第十关时许\t116155\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t116156\n笑话吧故事\t116157\n葛狗\t116158\n备品\t116159\n笑咶\t116160\n一个900万\t116161\n噶沽\t116162\n啪柏\t116163\n见解\t116164\n五指\t116165\n胖熊群\t116166\nwooool\t116167\n阿甘正传\t116168\n小香香\t116169\n怪有才怪\t116170\nYouSheila\t116171\n好的思密达有木香蜜\t116172\n吧程\t116173\n指乳\t116174\n焦糖味\t116175\n雷弟子\t116176\n欠欠\t116177\n见见\t116178\n你在干嘛想你\t116179\n人本性\t116180\n心仪\t116181\n敏果丶\t116182\n林萧那\t116183\n最后一封信\t116184\n入台\t116185\n云云\t116186\n二四小时\t116187\n点补齐走\t116188\n小处男\t116189\n2月22号\t116190\n秒度秘你\t116191\n明巧\t116192\n睡心\t116193\n垂子\t116194\n艾百五\t116195\nhellostomyo\t116196\n明州\t116197\n就问秘\t116198\n戴潆萱\t116199\n子人\t116200\n丹毒\t116201\n岗亭\t116202\nhooool\t116203\n云亭\t116204\n座上宾\t116205\nwhanyourf\t116206\n呀嘟\t116207\n呀嘛\t116208\n最低1M\t116209\nWearschim\t116210\n大乐透\t116211\n270页\t116212\neafubxgki\t116213\n亲亲\t116214\n米自由泳\t116215\n20张\t116216\n抢钱\t116217\n紧握\t116218\nv狐\t116219\n冷读\t116220\n不知其旨\t116221\n中国自行车队\t116222\n能吃苦\t116223\n下撒\t116224\n八三八零六二八六\t116225\n主食的故事\t116226\njpefphf\t116227\n五六零二零\t116228\n党项\t116229\nu噢股欧冠\t116230\n喜羊羊与灰太狼\t116231\n二一样\t116232\n在那里\t116233\n度爹爹\t116234\n张咏轩\t116235\n长流庄\t116236\n一人贩子\t116237\n三进宫\t116238\n丑八怪\t116239\njdvndoxxb\t116240\n美婚姻网恶搞金正恩\t116241\n氧化\t116242\n新平旺\t116243\n猛犸象\t116244\n你是我的秘书你给我读一下\t116245\n电冰箱\t116246\n秘精灵\t116247\n今天六一\t116248\n苦涩\t116249\n136913697505196989750525\t116250\n不年人\t116251\n钟虚\t116252\n滚犊子\t116253\n美萃雅\t116254\n806咯咯小魔身全小魔仙\t116255\n尽展\t116256\nrdfffd\t116257\n媳婦\t116258\nDora\t116259\n听片\t116260\n四五家\t116261\n31802213\t116262\n晴空\t116263\n#天府国际金融中心\t116264\n亲爱的朋友们\t116265\n答非所问答\t116266\n箱箱\t116267\n九洲江\t116268\n小达人儿你好呀你就是度秘\t116269\n閨女\t11627
0\ndhidfbkjdh\t116271\n雷动\t116272\n海洋\t116273\n一六二零\t116274\n不回我的话呀多米求你了以后我叫你就小小小米\t116275\n30.43%\t116276\n雪上加霜\t116277\n对碰\t116278\n霞霞\t116279\n八戒\t116280\n八成\t116281\nyudx\t116282\n韵味\t116283\n橡皮泥\t116284\n晾衣绳\t116285\n西法\t116286\n一个325\t116287\n十一十二次二项式多m\t116288\nfdguhrh\t116289\n于小强\t116290\n雪夏炎\t116291\n小黄人儿\t116292\n第46代\t116293\n哈麻\t116294\n妈咪奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶奶\t116295\n祭坛\t116296\n后入式\t116297\n激情片\t116298\n在工大\t116299\n心灵\t116300\ndiye\t116301\n太花\t116302\n沪劲缓强可兰事勃\t116303\n幺五\t116304\n居士\t116305\n心致\t116306\ndiyi\t116307\n帅子\t116308\n叶圣陶\t116309\n龋齿\t116310\n周1\t116311\n定位\t116312\n王土\t116313\n纽约时报广场\t116314\n拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱拥抱\t116315\ni11岁\t116316\nlllwsj\t116317\n冯俊凯\t116318\n心火\t116319\n艳母\t116320\n怪罪\t116321\n17777544774445585585\t116322\n说实话\t116323\n好险\t116324\n补胎\t116325\n周5\t116326\n泛滥\t116327\n凉拌口周痘\t116328\n幺亡\t116329\n古丽米拉\t116330\n5000多\t116331\n老实交代坦白从宽抗拒从严\t116332\n十八倍\t116333\n冷面\t116334\n赖瞌\t116335\nfree\t116336\n英龙\t116337\n对不起\t116338\n列式\t116339\nfrer\t116340\n118页\t116341\nfrew\t116342\nn次\t116343\n真的好可爱\t116344\n70多万\t116345\n料爆\t116346\n列强\t116347\n小秘你好乖\t116348\n7点30分\t116349\n唱嘛唱吧\t116350\n李本舰\t116351\n十五人\t116352\nCentre\t116353\n超级度\t116354\n马站\t116355\n赫敏\t116356\n冷静\t116357\n2000元\t116358\n还不能不能\t116359\n3月5日起\t116360\n求利\t116361\n没了哭\t116362\npcin\t116363\n中国国际警用装备博览会\t116364\n看我哪里\t116365\n觉醒\t116366\nrajjryc\t116367\n夙离魅\t116368\n恰好\t116369\n8亿一\t116370\n苦瓜\t116371\n酶洗衣粉\t116372\nY\t116373\nkulturt\t116374\n泼水大作战\t116375\n高处\t116376\n佩杰\t116377\n捷径\t116378\n对对\t116379\n贊恩\t116380\n凤娇\t116381\n李再干\t116382\n江强\t116383\n没去\t116384\n4260455360546350865\t116385\n为你以为\t116386\n三百斤\t116387\n夜深了从一座陈列珍贵\t116388\n吗假\t116389\n我不骗你的你比\t116390\ndhiphotosbaiducomxiaodupicitem024f78f0f736afc3d22c4c11b419ebc4b74512a5jpg\t116391\n美丑\t116392\n额撸撸撸\t116393\n人体\t116394\n零九\t116395\n撒冽而别\t116396\n呃喉\t116397\n小仓鼠\t116398\n15891150005\t116399\n玉腿\t116400\n奕\t116401\n一一五千三百四十二元\t116402\n套\t116403\nkantlo\t116404\n契\t116405\
n阿哈哈哈\t116406\n陆先\t116407\n奌\t116408\n奏\t116409\n邱一程\t116410\n奉\t116411\n奈\t116412\n奋\t116413\n奄\t116414\n奇\t116415\n走了拜拜\t116416\n白宇鹏\t116417\n张长\t116418\n子网\t116419\n344首\t116420\n她\t116421\n奸\t116422\n奻\t116423\n陆年级\t116424\n奴\t116425\n养颜\t116426\n58999\t116427\n陆兵\t116428\n女\t116429\n387号\t116430\n奭\t116431\n塔克拉玛干沙漠\t116432\n为友\t116433\n冤古文\t116434\nw七\t116435\n昨日下午\t116436\n奥\t116437\n学作业\t116438\n好乖好乖\t116439\n充气娃娃\t116440\n化成为\t116441\n猫宁\t116442\n抗寒性\t116443\n美沫\t116444\n周夜书\t116445\n自助火渦\t116446\n财鸟\t116447\n男信女们\t116448\n中式\t116449\n谁对你不你很好你要脸不\t116450\n嘉庆皇帝\t116451\n蒽蒽老实告诉我\t116452\n忠恕\t116453\n接子\t116454\n堂见\t116455\n球弹\t116456\n通行证\t116457\n世爵\t116458\n靠名扬\t116459\n海灵\t116460\n333364649434349380435\t116461\n赵志红\t116462\npq197\t116463\n鱼饼干\t116464\n寡人\t116465\n科里亚\t116466\n稻城亚丁\t116467\n2周\t116468\n#KRIS\t116469\n42455708786885\t116470\n唔会\t116471\n多斯文\t116472\n纸带\t116473\n可爱多\t116474\n柔韧性\t116475\n半下班谢谢你吃白饭不缓存过此彼此办法vvx\t116476\n第588位\t116477\n华中路\t116478\n直角斗罗\t116479\n地面度秘\t116480\n17时30分\t116481\n陈居峰\t116482\n腊岛\t116483\n河红\t116484\n被免职\t116485\n冉磊\t116486\n换取\t116487\n俏江西\t116488\n纸币\t116489\n宏基\t116490\n1时40分\t116491\n转基因\t116492\n海麻雀\t116493\n不痒\t116494\n么摸\t116495\n啊欧\t116496\n伱煩什麼\t116497\n不痛\t116498\nIts\t116499\n郑洁\t116500\n一个秀\t116501\n難\t116502\n甜言\t116503\n成果\t116504\n啦啦啦拉粑粑\t116505\n七月三\t116506\n简多多\t116507\nFlash\t116508\n成林\t116509\n邱添\t116510\n三十余名\t116511\n我的话\t116512\n1971年\t116513\n两小无嫌猜\t116514\n大方端庄\t116515\n真的好恨\t116516\n爽歪\t116517\n马寨\t116518\n小媚\t116519\n恒宝恒宝\t116520\n吴空间\t116521\nNihao\t116522\n別人\t116523\n嗯玟善\t116524\n黄之恋\t116525\n宋爱菊\t116526\n好久等\t116527\nvdading\t116528\n罗宇轩\t116529\n受梦文\t116530\n叶小\t116531\n走走走\t116532\n快乐大本营\t116533\n枫叶冬\t116534\n陈栋杰\t116535\n小度秘\t116536\n511215\t116537\n幺幺零零幺零打\t116538\n乡下人\t116539\n二零二二三七八零\t116540\n悲凉\t116541\n多米求求你了到我的当午\t116542\n谢我\t116543\n戴雅倩\t116544\n媚风\t116545\n知心\t116546\n大饼烙\t116547\n人事部门\t116548\n殷慧美\t116549\n度洛西\t116550\n含萱\t116551\n流落他乡\t116552\ngjjjdsrjj\t116553\n恩聍聍\t116554\
n无能说点\t116555\ndeuigh\t116556\n归音\t116557\n贴贴\t116558\n心境\t116559\n秘c堡\t116560\n死讯\t116561\nyoutshio\t116562\n藏獒王\t116563\n因之下\t116564\n死让\t116565\n5365\t116566\n電\t116567\n三哥哥\t116568\n骚乱\t116569\n邝娅婷\t116570\n没得罪\t116571\n得意思\t116572\n很可爱不和爱你\t116573\n喜羊羊灰\t116574\n心墙\t116575\n妮子姐\t116576\n游泳圈\t116577\n缝花\t116578\n现代型\t116579\n质保\t116580\n倒空\t116581\n准备\t116582\n1203年\t116583\n方块儿\t116584\n张丽雅\t116585\n孙海普\t116586\n好澡\t116587\n疯了真是\t116588\n马广彧\t116589\n鬼心\t116590\n微凉\t116591\n毕业呀怪不得\t116592\n北碚\t116593\nzbc\t116594\n罗塞尔\t116595\nzbm\t116596\nzbl\t116597\n女朋友\t116598\n雀\t116599\n苦逼\t116600\n魔星\t116601\nzby\t116602\n排骨\t116603\n1分钟内\t116604\n合适\t116605\n虚线\t116606\n仍旧\t116607\n杀父\t116608\n绵延\t116609\n夏姐姐\t116610\n果木\t116611\n死人灯\t116612\n壹贰叁肆伍陆柒捌玖\t116613\n咋个\t116614\nOffenbach\t116615\n沐、栉\t116616\n沧浪亭\t116617\n无独有偶\t116618\n陈运才\t116619\n议事\t116620\n饿痴痴\t116621\n1.5\t116622\n快发\t116623\n泰和花园\t116624\n水漫金山\t116625\n咯凑巧龙须糖\t116626\n588687\t116627\n说话说话说话\t116628\n提起你了我\t116629\n名织必1鱺\t116630\n基友\t116631\n求证\t116632\n6点半\t116633\n假日\t116634\n博爱行\t116635\n五夫刘梅的麻\t116636\n沸火\t116637\n鹿肉文\t116638\n默然　
相爱\t116639\nheihxbs\t116640\n997次\t116641\n天天ji\t116642\n一百六十三\t116643\n讲述\t116644\n洪客隆\t116645\n正健\t116646\n13853341149\t116647\nｉｌｏｖｅ\t116648\n20034月\t116649\nnononomo\t116650\n猪庙\t116651\n旮勒\t116652\n就不我就让\t116653\nkjabch\t116654\n五佩\t116655\ndks\t116656\n毒计\t116657\nsamagaya\t116658\n1.8\t116659\n紫菱\t116660\n松一松\t116661\n丑捻丑\t116662\n恶心\t116663\n唐庆\t116664\n交过\t116665\n蔡振华\t116666\n放牛娃你是我的放牛娃\t116667\n经历秤\t116668\n五九多\t116669\n哎呀天下掉下的来了一个通\t116670\n文政\t116671\n尊嚴\t116672\n迎头赶上\t116673\n永远永远永远永远永远永远不理\t116674\n赵世诚\t116675\n金\t116676\n紫菜\t116677\n叶芽\t116678\n4523\t116679\n殷庆猛\t116680\n吴1\t116681\n蠢妞\t116682\n采\t116683\n釆\t116684\nqrappuaw\t116685\n赞儿\t116686\n七个小时\t116687\n海象\t116688\numssi\t116689\n野\t116690\n重\t116691\n里\t116692\n释\t116693\n交迫\t116694\n好突\t116695\n稠密\t116696\n哼秘\t116697\n回眸\t116698\n恋小帅\t116699\n要不然话\t116700\n剿尼\t116701\n魂牵梦萦\t116702\n才是不是\t116703\n最帅行\t116704\n介哥们\t116705\n11徐\t116706\n5一3\t116707\n谢比思\t116708\nfusph\t116709\n部文\t116710\n嗯黄\t116711\n该校\t116712\n126431446465365161\t116713\n人际关系\t116714\n臭皮\t116715\n宁国市\t116716\n川普喃\t116717\n童趣\t116718\n司徒\t116719\n千千万万\t116720\n青涩\t116721\n乔连巧\t116722\nlfeelthirsty\t116723\n破蛋\t116724\n武瑞雪\t116725\n三部曲\t116726\n帅任性\t116727\n发图\t116728\n食物大僵尸217\t116729\n鬼雄\t116730\n青液\t116731\n花儿园\t116732\n东方名居\t116733\n瓷器\t116734\n60多页\t116735\n发困\t116736\n狼心狗肺\t116737\n陪客\t116738\n银杏果\t116739\n真心说的话\t116740\n溜号\t116741\n涩温泉乡\t116742\n一l一\t116743\n双飞\t116744\n金源\t116745\n拜睡\t116746\nbcbxjskd\t116747\nhearts\t116748\n昭昭暮暮\t116749\n青峰\t116750\n石蛙\t116751\n普天同庆\t116752\n太好\t116753\n消灭\t116754\n加瓦\t116755\n20051001945点\t116756\n一层\t116757\n太女\t116758\n冷意\t116759\n27度\t116760\n影星\t116761\n18228557758\t116762\n乐盈\t116763\n粉细\t116764\n昆船\t116765\n太套\t116766\n河阳\t116767\nhellokatrist\t116768\n脑筋急传弯\t116769\n太奇\t116770\n敬意\t116771\n前手\t116772\n8141\t116773\n8140\t116774\n张天毅\t116775\n超级小我的小曲\t116776\n扇扇\t116777\n消灾\t116778\n谋面\t116779\nbjkergjkdshfjdskhfioehfewojwobjodwbcojdjfbdowjfbwekjfbrjwkwrbfjdcnkclbekcbkrfdjkfbewk
jvbkrwjfew\t116780\n铜人\t116781\n我讨厌你了有在不\t116782\n等哈的好伐回答吧等哈\t116783\n崇州\t116784\ngicdybfid\t116785\njyt\t116786\n第三层\t116787\ngghx\t116788\n东北酸菜白肉锅\t116789\ngghu\t116790\n就打入冷宫\t116791\n中国好声音#\t116792\njyf\t116793\njyg\t116794\njyd\t116795\n辞官\t116796\njyc\t116797\n城阳\t116798\n光顾\t116799\n抢掠\t116800\n实验\t116801\ngghb\t116802\n现在开始\t116803\njyh\t116804\n让我告诉你\t116805\n姬非妓嫉非疾稽\t116806\n李亚\t116807\n江山一家\t116808\n度秘不要脸度秘不要脸度秘\t116809\n歌功颂德\t116810\n华城\t116811\n刷刷\t116812\n心痛欲\t116813\n外销\t116814\nreceptionistist\t116815\n报务员\t116816\nktktktktktktktktktktktktktktktktktktktktktk\t116817\n别了我死\t116818\n第三局\t116819\n放假呀\t116820\n张胜堂\t116821\n刷分\t116822\n哀怨\t116823\n会难\t116824\n数轴\t116825\n欧弟\t116826\n好悦\t116827\n幻稚\t116828\n光累\t116829\n冲级\t116830\n二二零幺\t116831\n好喜\t116832\n柴油版\t116833\n蜜桃粉\t116834\nGjmg\t116835\n你是谁呀灰太狼\t116836\n诗三万里河东入海五千仞岳上摩天遗民泪尽胡尘里南望王师又一年\t116837\n1653497204\t116838\n圣诞节\t116839\nplpkokojibj\t116840\n玩游戏好喜欢\t116841\n陡坡\t116842\n执子之手\t116843\n组句\t116844\n历久弥醇\t116845\n5551548854\t116846\n怒意\t116847\n12滴\t116848\n不懂不懂不懂不懂都\t116849\n电变\t116850\n哎度秘\t116851\n一八年\t116852\n九一臃\t116853\n一戏台生旦净丑\t116854\n持节云中\t116855\n11度\t116856\n可素伦\t116857\n小草天\t116858\n小陈\t116859\n别耶斯\t116860\n王伟范\t116861\n许国良\t116862\n切肤\t116863\n刘碧涵\t116864\n小陆\t116865\nBESTSHOPONLINE\t116866\nbcxz\t116867\n笑死鹅力\t116868\n纨绔\t116869\n强校\t116870\n生小孩好呀\t116871\n白巴a3\t116872\n对美丑\t116873\n切肉\t116874\n小院\t116875\n蒋女士\t116876\n好男生\t116877\n顶头上司\t116878\nMeinGott\t116879\n隐隐作痛\t116880\n帅你是人么\t116881\nviv0\t116882\n鸡蛋冰糖鸡蛋\t116883\nejjwthf\t116884\n手枪战\t116885\n43天\t116886\nBrrr\t116887\n度秘米\t116888\ncgghh\t116889\n两个多小时后\t116890\n里番\t116891\n宝特\t116892\n喻明\t116893\n完全惡心誒絲絲湊如此車可塑字\t116894\n愣子卧室\t116895\n朝如云\t116896\n英语课\t116897\n弟说\t116898\n放了你\t116899\n宝物\t116900\n部门经理\t116901\n减拜\t116902\njdbjjdgajak\t116903\n才会话\t116904\n蔡鸿岩\t116905\n特摄\t116906\n君主专制\t116907\nvivi\t116908\n贾伟片\t116909\nvssc\t116910\n026多少\t116911\n苏畅\t116912\nvivo\t116913\n九九十六岁\t116914\n一个两岁\t116915\n御风\t116916\nwirwall\t1169
17\n来了老实说\t116918\n宝片\t116919\nf卡\t116920\n重搜\t116921\n地沟油\t116922\n鄙視\t116923\n华武\t116924\n外带\t116925\n朱成辉\t116926\n两个三十\t116927\nyffut\t116928\nMelissa\t116929\n宝贝凯\t116930\nhelloyouter\t116931\naga1b1agjgjgjahagajagag\t116932\n啊二\t116933\n那是是你我不坏你你爱你是个坏蛋\t116934\n并案\t116935\n口气\t116936\n忐忑姐\t116937\n红卫兵组织\t116938\n欺诈\t116939\n何老\t116940\n华歌\t116941\nBigbang\t116942\n桐庐\t116943\n八十几\t116944\n记法\t116945\n18410425588\t116946\nwillitlastlong\t116947\n锅盖头\t116948\n脚踝\t116949\n千岁\t116950\n二缺一凡\t116951\n期考\t116952\n第115期\t116953\n羞怯\t116954\n脚踏\t116955\n旺仔牛\t116956\n结账\t116957\n千岛\t116958\n铠甲雅\t116959\nwwvivvf5vvvvv\t116960\n8998\t116961\n奢望\t116962\n唉幼儿我的妈呀\t116963\n此公\t116964\n百汇厂\t116965\n干不。u补补姑姑玉渊潭姑姑vuvi比比不vuvuuuvuvuvi\t116966\n羊肉馅\t116967\n不予\t116968\n脚踩\t116969\nthatsgoof\t116970\n一醉欢\t116971\n大跃进\t116972\n脚踢\t116973\n检修\t116974\n兹后\t116975\n庞家佐小学\t116976\n怕人\t116977\n啡尼\t116978\n吻子\t116979\n两位童\t116980\n嘛别\t116981\n那你好牛比\t116982\n654555555666665558889855555555888555652685558856985555\t116983\nLeo\t116984\n车展偶\t116985\n衷心\t116986\n熟知\t116987\nwanshanghao\t116988\ncostlyou\t116989\n姓院\t116990\n李紫燕\t116991\n国内外\t116992\n瑞明古\t116993\n福蝶\t116994\nmmmmmmmmkmm\t116995\n一到一百\t116996\n传播淫秽物品罪\t116997\n李宁弓\t116998\n乳监\t116999\n尾巷\t117000\n那秘秘\t117001\n尾巴\t117002\n我喜欢动力\t117003\n大智能机器人度秘\t117004\n大偶尔\t117005\n还里\t117006\n包机\t117007\n光信\t117008\n1494\t117009\n九零后\t117010\n火盆\t117011\n1498\t117012\n1499\t117013\n杨绍美\t117014\nxynivuh\t117015\niqiwwiwi\t117016\n包月\t117017\n旺仔牛奶场\t117018\n年金\t117019\n哀哉\t117020\n544080\t117021\nbrother\t117022\n陈一家\t117023\n吃不住\t117024\n到访\t117025\n师大名字冤有头下有主\t117026\n再见不死\t117027\n紧迫\t117028\n新课\t117029\n路易居尔蒂斯\t117030\nh12\t117031\n288438\t117032\n降珠罗汉\t117033\n30度\t117034\n鬼上风\t117035\n徐女士\t117036\n大事化\t117037\n满场\t117038\n一快\t117039\nbjjjbkn\t117040\nhgbh\t117041\n仲未\t117042\n10名\t117043\n纳南性德\t117044\n一念\t117045\n花奢\t117046\n国机国机格叽格叽格叽格叽格叽\t117047\n一忽\t117048\n黄思越\t117049\n凯夫\t117050\n了们\t117051\n992平二\t117052\n强龙者\t117053\nhymg\t117054\nh
sushd\t117055\n中晚\t117056\n仲有\t117057\n不要脸神经\t117058\n好乖哦一个人在家乖\t117059\n3盒\t117060\n梅赛德斯-奔驰\t117061\n感染期\t117062\n分隔符\t117063\n冯世豪\t117064\n考勤\t117065\n26代\t117066\n冷风\t117067\n恩不幸福\t117068\n昂侠之\t117069\n五月份\t117070\n超儿\t117071\n老虎屯\t117072\n温韬\t117073\n冷食\t117074\n摸摸大亲亲\t117075\n过长一查\t117076\n陪窝逃学\t117077\n吗的自己日记人儿你干嘛我是\t117078\n天降淑美男\t117079\n刘立新\t117080\n熊说说熊\t117081\n好啦好啦我是宝贝儿你不是大人哦我是你的姐姐\t117082\n二十二十块\t117083\n不是你的破电脑人\t117084\n靛鸟\t117085\n5到10年前\t117086\n对峙\t117087\n不交\t117088\n乐平市\t117089\nJFJBH\t117090\n天天幻想\t117091\n楠哥\t117092\n陈升祉\t117093\n十五瓦\t117094\n九九八十一\t117095\n腰杆\t117096\n53885555555558\t117097\n羡慕波\t117098\n邝晓宇\t117099\n命学\t117100\n新余\t117101\n仁者见智\t117102\n新作\t117103\n秘度\t117104\n在街上\t117105\n1月1月\t117106\n笑傲江湖\t117107\nNeway\t117108\n梅山\t117109\n论克\t117110\n腰板\t117111\n第三批\t117112\n风暴英雄\t117113\n1836280241\t117114\n北洋造\t117115\n坤德\t117116\n高帅富\t117117\n书锁\t117118\n好汉子\t117119\n学化\t117120\n壮士\t117121\n壮壮\t117122\n唉别\t117123\nnchch\t117124\nyjd\t117125\n袁晓柯\t117126\n两千分钟\t117127\nyjj\t117128\n水果糖\t117129\n江门\t117130\n姜文慧\t117131\n寒山\t117132\n莹然至清\t117133\n鱼雷\t117134\n学医\t117135\n学区\t117136\n叶春山\t117137\n就事论事\t117138\n于思涵\t117139\n谢啦麻烦你\t117140\n粑粑干巴巴\t117141\n2012年12月20日\t117142\nメノ一\t117143\n界河镇\t117144\n林韦伶\t117145\n刘三姐\t117146\n我们么么\t117147\n一捧\t117148\n含水量\t117149\ndfgfv\t117150\n秦浩天\t117151\n田瑞阳\t117152\n意念力\t117153\n偏白\t117154\n坚持\t117155\n市面上\t117156\n千石\t117157\n高兴成\t117158\n坚挺\t117159\nfetoot\t117160\n人生曲\t117161\n对的对\t117162\n喜欢你好爱\t117163\n从来都没有人\t117164\n零点九点\t117165\n绚甸\t117166\n喧嚣\t117167\n几亿只\t117168\n一捆\t117169\n12385009477\t117170\n要不不理\t117171\n雨江\t117172\n13834639735\t117173\n蜜饯\t117174\n考勤卡\t117175\nwruto\t117176\n怪我么怪样\t117177\n塘冲村\t117178\n沪指\t117179\n玻璃门\t117180\n瞎配\t117181\n佛陀言\t117182\n不是我抛弃你了你人好讨厌\t117183\n电脑包\t117184\n招摇撞骗\t117185\n遗失\t117186\n一杯酒\t117187\n有很有\t117188\n呦呦呦呦呦呦呦有有有有有有有有有有有有有有有有有呦呦呦呦呦呦呦呦呦呦呦呦呦呦\t117189\n唐山站\t117190\n快乐心\t117191\nnonno\t117192\n隙缝\t117193\n瘦给力\t117194\n吹干\t117195\n心不狠\t117196\n留无意\t117197\n白
赛\t117198\n急忙\t117199\n害臊\t117200\n常笑\t117201\n鲜血\t117202\n地灯\t117203\n何怡萱\t117204\n笑看你的屁股你个hi西海喜海喜海喜海喜海喜海喜海\t117205\n上轻公司\t117206\n业主任\t117207\nE闩\t117208\n诗言\t117209\n不卖身\t117210\nKsg\t117211\n赵静萱\t117212\n梁玉娇\t117213\n13847625801\t117214\n业主们\t117215\n時區\t117216\nKsk\t117217\n8vjouoh0\t117218\nhttpahiphotosbaiducomxiaodupicitem8cb1cb1349540923ed733c0c9558d109b3de49adjpg\t117219\n优秀\t117220\n这个太阳\t117221\n妊娠鬼\t117222\n剐穿\t117223\n爱上自己\t117224\n3周年\t117225\n越多五\t117226\n秘嫁\t117227\n秘嫂\t117228\n土豆泥\t117229\n若会\t117230\n啊斗瑞\t117231\n1tju\t117232\n广化\t117233\n武镇\t117234\n奥懂了\t117235\n白袍\t117236\n煲仔饭\t117237\n水球\t117238\n红海\t117239\n董香\t117240\n7道\t117241\n靠边\t117242\n蒲节\t117243\n姜维之\t117244\n学栋\t117245\n商铺\t117246\n八三年\t117247\nVogue\t117248\n问声\t117249\n学样\t117250\n长镜头\t117251\n达斯妮\t117252\n周苍南\t117253\n数列式\t117254\n学校\t117255\n艾莎\t117256\n红流\t117257\n张腿\t117258\n彼得潘\t117259\n6vv\t117260\n时间战\t117261\n所得\t117262\n15973586694\t117263\n基质\t117264\n之二十六\t117265\nwwennwwe\t117266\n参选人\t117267\n线鳍\t117268\nNAVER\t117269\n锐步\t117270\n信念\t117271\n峡湾\t117272\n不看电影\t117273\nqxoo\t117274\nb1馆\t117275\n212……311……23\t117276\nPureWhitelinen\t117277\n最边\t117278\n信心\t117279\n闲散\t117280\n患处\t117281\n支那人气质\t117282\nHGTV\t117283\n拍击\t117284\n辖下\t117285\n闲敲\t117286\n邵东话\t117287\n效应\t117288\n桃瑟\t117289\n柒月\t117290\n妈妈圈\t117291\n两百斤\t117292\n相持\t117293\n悉尼\t117294\n三明治\t117295\n性灵\t117296\n羞态\t117297\n高仁良\t117298\n朱雪迎\t117299\n游戏机室\t117300\n别射\t117301\n挺想\t117302\n弘阳\t117303\n相挺\t117304\n7X7十18\t117305\n身受\t117306\n豆腐派\t117307\ngxbe\t117308\n市民人\t117309\n别少\t117310\n猜对\t117311\n不曾\t117312\nTouch\t117313\nfbfegrgr\t117314\n18175726972\t117315\n不更\t117316\n375276516243\t117317\n罗行\t117318\n猎装服\t117319\nrrco\t117320\n奇丑\t117321\n不交不交\t117322\n黎益成\t117323\n偷乖\t117324\n邮编号\t117325\n秘喜欢淮\t117326\n成甘\t117327\n狮子大开囗下冋\t117328\n6688706\t117329\n奇丽\t117330\n快点吧七龙珠\t117331\nhvdhjhj\t117332\n李小右\t117333\n罗衣\t117334\n无长\t117335\n余宇飞\t117336\n浩浩荡荡\t117337\n神武\t117338\n齐\t117339\nQUQ\t117340\nLovE\t117
341\n大懒\t117342\n舍本\t117343\n陈国华\t117344\nhxhchchxhktxghgdhjdnf\t117345\n杨欣妍\t117346\n15054795717\t117347\n咪儿\t117348\n齁\t117349\n28.9\t117350\n刘红红\t117351\n10厘\t117352\n66发\t117353\n王亚会\t117354\n合田公司\t117355\n齊\t117356\n13082331686\t117357\n摩擦摩\t117358\nwidd\t117359\nvtzjabagazjb\t117360\nLove\t117361\n欢你没道理\t117362\n7场\t117363\n寒假乐园\t117364\n怕怕怕\t117365\nLovo\t117366\nfughhhhhjjj\t117367\n齡\t117368\n逗比派\t117369\n在也不想\t117370\n落实\t117371\n马皓伟\t117372\njdjc\t117373\n齪\t117374\n悬臂\t117375\n层叠\t117376\n鸣什么盗\t117377\n模棱两可\t117378\n天伦之乐\t117379\n到此一游\t117380\n一一米\t117381\n靠说\t117382\n上学习\t117383\n伱宁\t117384\n萌男\t117385\n帮战\t117386\n我不让\t117387\n再来了\t117388\n苏聿\t117389\n我不认\t117390\n花针\t117391\n你是我的天70万乖你是我唯一的宝\t117392\n博大\t117393\n花钱\t117394\n广西物资学校\t117395\n东北之歌\t117396\n羽扇\t117397\n管住\t117398\n尾随\t117399\n非台\t117400\n我求求你书给我我求求你说给你的名字哦哦求求你说给我五你的名字吧\t117401\n怒族\t117402\n雄激素\t117403\n安陆\t117404\n二零八\t117405\n太紧张\t117406\n第五席\t117407\n信我喜欢\t117408\n数看\t117409\n大舞台\t117410\n月光\t117411\n编发\t117412\n好啦赖你承认我是你\t117413\n月兒\t117414\nGgd\t117415\n梦幻个呗\t117416\n老谋\t117417\nBBer们\t117418\n有舞\t117419\n伙呆\t117420\npfer\t117421\n香辣面\t117422\n月入\t117423\n真我不骗\t117424\n编号\t117425\n称量\t117426\n貨沒\t117427\n两院钥\t117428\n行车\t117429\n王雨涵\t117430\n18米\t117431\nset\t117432\n植物大战僵尸26g\t117433\n2O公斤\t117434\n死斗秘\t117435\n啦128251n急吧啦c\t117436\nfvvdfgdgdoes\t117437\n哈迪斯\t117438\n拉手\t117439\n三袋\t117440\n史金\t117441\np621瓜丁\t117442\n这支\t117443\n12.88\t117444\n贴膜\t117445\n正门\t117446\n使道\t117447\n惊现\t117448\n你帅\t117449\n磕甚表\t117450\n德邦者\t117451\n千知味鸭脖\t117452\n连城\t117453\n远照\t117454\n我的错囧\t117455\n滚滚长江东逝水下句\t117456\nQabx秀\t117457\n每30分钟\t117458\n骑马砍杀无双三国\t117459\n改来\t117460\n鲁鲁西\t117461\ndaoly\t117462\n最实惠\t117463\n木佐翔\t117464\n一部分\t117465\n教会\t117466\n随机应变\t117467\n苦鬼\t117468\n出类\t117469\n爱富帅\t117470\n珠霞玫瑰高脚杯\t117471\n黄纪斌\t117472\n告诉我死\t117473\n念旧\t117474\n王江泽\t117475\n55882233\t117476\n五彩\t117477\n妈妈度\t117478\n清苑小学\t117479\n南天门\t117480\n哪里们\t117481\n故事故事\t117482\n觉戒网\t117483\n车城\t117484\n福州市政府发出市中心
禁摩禁电禁行\t117485\n兰手\t117486\n谢瞻\t117487\nlns\t117488\ncom6\t117489\nlnm\t117490\n支招\t117491\nlnk\t117492\n连词\t117493\nlnf\t117494\n就是要钱\t117495\n连读\t117496\n毕业旅行\t117497\n41088\t117498\n196万多\t117499\n吞吞吐\t117500\n200￥\t117501\n统筹\t117502\n1969年\t117503\n肏年\t117504\n福尔摩斯\t117505\n琳儿\t117506\n打铁\t117507\n三遍次\t117508\nassoon\t117509\n横贯\t117510\n不能笑\t117511\n14233\t117512\nduffuud\t117513\n林更新\t117514\n好怕怕\t117515\n余力\t117516\n截去\t117517\n豆沫\t117518\n一大半\t117519\n过大规模\t117520\ncomm\t117521\ncomo\t117522\n徐若琳\t117523\n今天上午10点\t117524\n豆油\t117525\n外在\t117526\n放假了吧\t117527\n油压\t117528\n三十公斤\t117529\n眼恭\t117530\n汉朝\t117531\n差者\t117532\n1a1agbl\t117533\n差老\t117534\n放假了吗\t117535\nmassimitah\t117536\n有喜感\t117537\n蒋文杰\t117538\ncomO\t117539\n六千一四六\t117540\n豆沙\t117541\n聊别无聊\t117542\n浙江卫视跨年演唱会\t117543\n烦烦烦烦烦\t117544\n副主编\t117545\n1153464615439464\t117546\n郭凯玉\t117547\n啊雪\t117548\nchiphotosbaiducomxiaodupicitemd6ca7bcb0a46f21f6c430021f1246b600c33ae49jpg\t117549\n2870708528\t117550\n诗佳\t117551\n林春丽\t117552\n6466464664664646646\t117553\n结余\t117554\n度秘不好\t117555\n门槛\t117556\n乖辉天西瓜小游\t117557\n北京巷\t117558\n小知道骑士队\t117559\n乱辈\t117560\n无期徒刑\t117561\nhghi\t117562\nhttppinyincn1YSJEf6gbkz\t117563\ndit\t117564\nhellomamazk\t117565\n屡战屡胜\t117566\n甜妻\t117567\n邓文浩\t117568\n绵阳\t117569\n诗作\t117570\n案子\t117571\n它我\t117572\n柔梦舞\t117573\n五明再说一个样8811881杯138188\t117574\n角子\t117575\n苗瑞雪\t117576\n好壮观\t117577\n可借\t117578\n在梦中\t117579\nhghb\t117580\n182只\t117581\n装家\t117582\nrudurhr8turieirieuf\t117583\n听问\t117584\n梦寐以求\t117585\n灾种\t117586\n佳妮\t117587\n杜幂\t117588\n听闻\t117589\n办医\t117590\n刊头阳光\t117591\nhghd\t117592\nlbe大师\t117593\n三七六哎\t117594\n明天会VI\t117595\n重视\t117596\n欺虎\t117597\n6月1日至7日\t117598\n同学录\t117599\nddddd\t117600\n口技\t117601\n滨湖区\t117602\n泡泉\t117603\n没有我没有\t117604\n血象\t117605\ngkgcgxkgkgxk\t117606\nexamiation\t117607\n传奇\t117608\n下班再说\t117609\n红蜻蜓\t117610\n万文博\t117611\n福申\t117612\n腹地\t117613\n说说错\t117614\n10:30\t117615\n10:37\t117616\n坨坨\t117617\n幻梦\t117618\nbows\t117619\n脱销\t117620\n幸好\t1176
21\n7成\t117622\n捣鼓\t117623\n咪太郎\t117624\npbors\t117625\n泡泡\t117626\n看好好\t117627\n行李架\t117628\n嗯嗯嗯嗯莪\t117629\n阿玫\t117630\n伊春市\t117631\n王烁\t117632\n13141519617181910155248645123594989558895\t117633\n19点48分\t117634\n学生会\t117635\n爱了又\t117636\n涵韵宽恕贾岛\t117637\n宋志超\t117638\n这个棒\t117639\n输了拜\t117640\n15810532231\t117641\n阿栋\t117642\n进步\t117643\n王小\t117644\n交易率\t117645\nhecelebration\t117646\n王尊\t117647\n哈欺压英雄汉八公傲慢个AD红啊八公猪猪\t117648\n我要你给我笑一个笑一个\t117649\n大发慈悲\t117650\nxvcc\t117651\n李小有比\t117652\n就不过\t117653\n游荡\t117654\n周照耀\t117655\n跟先\t117656\n何三坡\t117657\n十二三度\t117658\n取长补短\t117659\n茌平县\t117660\n客家人\t117661\n好哇可乐\t117662\n天理何在\t117663\ngncb\t117664\n团购\t117665\n冗员\t117666\nchinadp\t117667\nmp@you\t117668\n纳尼亚传奇\t117669\n600KG\t117670\n钟头天\t117671\n十一句\t117672\n中华文明\t117673\n不良美\t117674\n志愿军\t117675\n耐用\t117676\nbbkbbbhkkollp\t117677\n黑了恭喜恭喜恭喜\t117678\n聪明勇士\t117679\n颈肩\t117680\n取而代之\t117681\n知识渊博\t117682\nwastime\t117683\n一二三四五六七八九十\t117684\n77路\t117685\n试放假\t117686\nygchju\t117687\n窝子的话\t117688\njonh\t117689\n2月15号\t117690\n不搞基我要\t117691\n恭请\t117692\nPia\t117693\n自黑\t117694\n名副\t117695\n有过之而无不及\t117696\n12.82%\t117697\nPig\t117698\nmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm\t117699\n施主我\t117700\n虽有如无\t117701\n5565585\t117702\n能为\t117703\n芦姑\t117704\nPit\t117705\n美瞳\t117706\n希灵嫣\t117707\n越越\t117708\n嗯嗯伦\t117709\n仇富仇\t117710\n过海明\t117711\nfddssagds\t117712\n半里平\t117713\n刑事拘留\t117714\n老一家秘\t117715\n蔡澜亚洲一乐也\t117716\n什样\t117717\n嗷小忍者\t117718\n鸡爪\t117719\n向朋\t117720\n刘振\t117721\n宝丰县\t117722\n安来美\t117723\n牛肉烩馍\t117724\n快走\t117725\n第四第五\t117726\n神经神马鸡\t117727\n兴奋\t117728\n速激\t117729\njeigtai\t117730\n12099\t117731\nftudui\t117732\nabcdefghingtommonopqrstuvwx3\t117733\n文儿\t117734\n小洲村\t117735\n晋中市\t117736\n144万千克\t117737\nggggddddd\t117738\n好吧保护你给我亲个棒棒糖\t117739\n哭图\t117740\n在国\t117741\n直系\t117742\n原著党\t117743\nsmlhfota\t117744\n向上弯\t117745\n088174\t117746\n野营\t117747\nhouson2100mamay28littleratseri\t117748\nchivy\t117749\n批次\t117750\n占位子\t117751\n55547789858698\t117752\nenglish歌\t1
17753\n听懂\t117754\n修建\t117755\n项目部\t117756\n敬爱\t117757\n咬破\t117758\n我是你的妹妹度秘\t117759\n老娘家\t117760\n零分\t117761\n批款\t117762\n十一二维\t117763\n收一下\t117764\nioaaoaocitctoryourograncaoc咪咪94876coastaobagrou\t117765\ntulaj\t117766\ndulduldyd\t117767\njiam\t117768\njian\t117769\njiao\t117770\n列宁斯\t117771\n250三八2\t117772\n好大冒险度秘\t117773\n季淑雯\t117774\n剪辑版\t117775\n楚雨荨\t117776\n张雨可\t117777\n期末直通车\t117778\n精神分裂\t117779\n民主店\t117780\n约翰·列侬\t117781\nsindel\t117782\n我的你的的\t117783\n那爱迪生\t117784\n李小琳\t117785\nferolestricity\t117786\niloveyou号youmiamas\t117787\n朱泽浩\t117788\n5103333\t117789\n江东医院\t117790\nhcxdZx\t117791\n收税\t117792\n选自\t117793\n绍兴二年\t117794\n边长勇\t117795\n玲姐\t117796\n都角川\t117797\n谢了我精神上\t117798\n呀辉\t117799\n药性\t117800\n要号\t117801\nseh\t117802\n男声\t117803\n诶呦嘿\t117804\n聊了想\t117805\n7厘米\t117806\n要叫\t117807\n男士\t117808\n咱们有吾多穿束花\t117809\n吉言\t117810\n好健康\t117811\n停工\t117812\n银行业\t117813\n义乌过身体好不了了我不会呀你不换我的奶奶\t117814\n吕世浩\t117815\n咿咿咿\t117816\n征集\t117817\n13791173206\t117818\n朝闻道\t117819\n爱么来亲个\t117820\n不会太冷\t117821\n莱斯特\t117822\nchoice#\t117823\n黑了我\t117824\n7天\t117825\n雷幼儿园\t117826\n简爱中\t117827\n五二零二\t117828\n医师\t117829\n扛得住\t117830\n范丽雅\t117831\n六小时\t117832\n焦传鸿\t117833\n疤海\t117834\n李婕\t117835\n浑源县北岳小学\t117836\n他用\t117837\n我剛剛和一個女的聊天祂\t117838\n黄河京都大酒店\t117839\n米兰美亚\t117840\n袁清泉\t117841\n话醋UC\t117842\n一注\t117843\n黑河\t117844\n眼紅\t117845\n李婧\t117846\n回穿\t117847\n一泡\t117848\n一波\t117849\n1551788323\t117850\n冠期\t117851\n桃子桃子\t117852\n5\t117853\nggtgjkv\t117854\n李婷\t117855\n交易\t117856\n打碟\t117857\n翠绿\t117858\n107年\t117859\n无敌堡\t117860\n蛱蝶飞\t117861\n8凯\t117862\n2123元\t117863\n吕清海\t117864\n王绪坚\t117865\n韵达妈呀妈呀\t117866\n乱睡\t117867\n小咖秀\t117868\n首档\t117869\n暑假\t117870\n打不女\t117871\n跑道\t117872\nfgggvh\t117873\n麻痹贱\t117874\n天涯沦落人\t117875\nＢＹＥＢＹＥ\t117876\n吧兄弟超能力大战\t117877\n十二月八日\t117878\n木马木马萌哒\t117879\n123456712345671234567123456712345671234567\t117880\n蝴蝶犬\t117881\n狮驼岭\t117882\n无龙华\t117883\n大难\t117884\n度秘你好度秘在\t117885\n袁隆平\t117886\n饲养场\t117887\n上午9点25\t117888\n弹钢琴\t117889\nvchjo\t117890
\n1700米\t117891\n590天\t117892\n18770880207\t117893\n英雄团\t117894\n少度\t117895\n清寒\t117896\n纯氧\t117897\n招收\t117898\n普拉斯六\t117899\n聂晓斌\t117900\n与生\t117901\n信不我投诉你\t117902\n万一天\t117903\n洪勇\t117904\n睡类\t117905\n代表\t117906\n咸肉片\t117907\n西园小学\t117908\n26xe\t117909\n多半天\t117910\n如许\t117911\nHGHII\t117912\n见了面\t117913\njunquloqull\t117914\n亚洲艺术博物馆\t117915\n老太秘\t117916\n私募\t117917\n广岱\t117918\n曲河曲\t117919\n赵梓帆\t117920\ninix\t117921\n荣获\t117922\n炸药包\t117923\n10001万1千100\t117924\n女大吊\t117925\nggvvvvgghh\t117926\n33333333333333\t117927\nfjjh\t117928\n二十分钟前\t117929\n终极男神\t117930\n羽毛感\t117931\n杨1\t117932\n我爱我爱我爱你我爱我爱我爱你我爱的就是妮妮妮妮妮妮你是谁你是谁你就亲爱的妈妈\t117933\n威龙\t117934\n应试化\t117935\nFacetime\t117936\n脑梗塞\t117937\n一个月分钟\t117938\n水洋洋乎知音缈\t117939\n末端\t117940\n唉那\t117941\n6点41分\t117942\n刘如意\t117943\n皮划艇\t117944\n乖乖羊\t117945\n就这样的人\t117946\n雅樱\t117947\n三，二\t117948\n闲着\t117949\n唉涵的歌哈媳妇鸭血\t117950\n一职会\t117951\n盎然谱\t117952\n来了没\t117953\n我喜欢聊\t117954\n航空母\t117955\n啦啦啦啦啦啦哪呢哪呢哪呢哪呢su岁\t117956\n陈美丽\t117957\n秀币\t117958\n天下首\t117959\n十六颗\t117960\n黑牌\t117961\n第一行\t117962\n惊喜岁\t117963\n太平洋hi聊hi了赔了陪聊陪聊陪聊陪聊\t117964\n信光\t117965\n小动物们\t117966\n你业\t117967\n前贴\t117968\n五空\t117969\n想说\t117970\n你丑\t117971\n2212.8万台\t117972\n热腾腾\t117973\n你个\t117974\n丁总\t117975\nvucrsy\t117976\n可好看\t117977\n信全\t117978\njgjagkaj2da\t117979\n周虎成\t117980\ncjshz\t117981\n雪峰\t117982\n火猫\t117983\n开长会\t117984\n好糟糕\t117985\n来同\t117986\n黑色\t117987\n想话\t117988\n一点五个\t117989\nsjdjdjndmdcdjdjdjndjdkskskanaks\t117990\n受不住\t117991\n古田\t117992\n徐澳\t117993\n西南部\t117994\ntouli\t117995\n丫拉黑\t117996\n施华洛世奇\t117997\n乔诗晗\t117998\n江湖论剑\t117999\n赵什么\t118000\n余物\t118001\n生气里\t118002\n乖孩儿\t118003\noppoi7\t118004\n一说话世界\t118005\n呵度秘\t118006\n韩三平\t118007\n训练场\t118008\n静脉\t118009\n4959\t118010\n硕士生\t118011\n呵讨厌你\t118012\n书城管\t118013\n鬓\t118014\n特发性\t118015\n乐文\t118016\nsmav\t118017\n聊了我讨厌\t118018\n来菲\t118019\n偿试\t118020\n蘑鲜\t118021\nAAAA级黄春节\t118022\n名伶\t118023\n蛐蛐\t118024\n乐斗\t118025\n脸王八事件\t118026\n乐观者\t118027\n美F22猛禽\t118028\nsmal\t118029\n艳俾\t118030\n可恶讨
厌\t118031\n纯真\t118032\njhery\t118033\n消沉\t118034\n50号\t118035\nzrak\t118036\n四点块\t118037\n六年级数\t118038\n幺三五幺六幺二二九二零二二九幺零\t118039\n名优\t118040\n184000089\t118041\njiziz\t118042\n1123个\t118043\n创立者\t118044\n6月20号\t118045\n零和\t118046\n浊流\t118047\n14782\t118048\n该文\t118049\n周楚痴\t118050\n仁甫\t118051\n189085\t118052\n565556625\t118053\n比条\t118054\n片段\t118055\n想怪\t118056\n一个轮儿\t118057\n死了心\t118058\n晓丽ok\t118059\n地摊\t118060\n一不过\t118061\n7点40\t118062\n情诗\t118063\n麻都\t118064\n比材\t118065\n请求\t118066\n情话\t118067\n仙都\t118068\n贾琪\t118069\n二十集\t118070\n物分钟\t118071\n考级舞\t118072\n包粽子\t118073\njjjdjdj\t118074\n奥康小坏蛋\t118075\n温燕君\t118076\n类累\t118077\n49叶\t118078\nBWJJKWJ\t118079\n澈吧盛世\t118080\n128嘎\t118081\n00770087\t118082\n李月妍\t118083\n广西工学院\t118084\n大家伙\t118085\n何依琳\t118086\nfdhgfvhh\t118087\n厭妳\t118088\n饼干儿\t118089\n盐焗鸡\t118090\n冶炼好伟\t118091\n止不住\t118092\n了盒\t118093\njointa\t118094\n2472元\t118095\n阮薇\t118096\n公仔\t118097\njnnx\t118098\n开心坐公车我烦的不得了\t118099\n两个多月\t118100\n你死我活\t118101\n拍立\t118102\n秀下\t118103\n走出\t118104\n发言语\t118105\n李思彤\t118106\n璞玉\t118107\n至人累对不对\t118108\n那大度秘\t118109\n膨胀剂\t118110\n大水\t118111\n幻想了望天\t118112\nLEAST\t118113\n哀兵\t118114\ngugnjk\t118115\n告诉我爸\t118116\n刘桂云\t118117\n赵彦斌\t118118\n艾滋呗\t118119\n赵丽果\t118120\n如图pd\t118121\n几个多来点\t118122\n壁心肌坏死累\t118123\n卧波\t118124\n叫乐学\t118125\nSean\t118126\n甜甜的\t118127\n减仓\t118128\n废话\t118129\n货帐\t118130\n壮汉\t118131\n月经\t118132\n2645192025\t118133\nzbvxxz\t118134\nttfyttyt\t118135\n张图甘\t118136\n帮巴拉堂\t118137\n时间到\t118138\n同位\t118139\n过路\t118140\nwwwhinewscnnewssystem20160118030072252shtml\t118141\nOrodoyflyfoudlyyodhlslhxulhlwglxlhxkflclhjxlhfijcjfuddhdyxjuruigjvkgitfcpvp\t118142\n平和\t118143\n我爱狗不爱猫\t118144\n十四枚\t118145\n末班\t118146\n俊峰\t118147\n谐和\t118148\nricufcnld\t118149\ngoodming\t118150\nriko\t118151\n说梦见\t118152\n你好伙伴\t118153\n磁化\t118154\n大梦想家行\t118155\n第一次间\t118156\n延川\t118157\n韩剧样\t118158\n谢继祥\t118159\n略阳\t118160\n心情不教\t118161\n一七郎\t118162\n宁可是\t118163\ni复贷\t118164\n范冰冰\t118165\n郭璨\t118166\n飞扬素\t118167\n往回\t118168\n平安无
事\t118169\nN5ak\t118170\n李显龙\t118171\n哪儿人\t118172\n丽湖学校\t118173\nknff\t118174\n一大串儿\t118175\nfgiii\t118176\n再世\t118177\n通铺\t118178\n再三\t118179\n唔知吖\t118180\nGalecki\t118181\n再不\t118182\n五河处\t118183\n再一\t118184\n暗绝元冥\t118185\ncdtijv\t118186\n郭璇\t118187\n大蛙\t118188\n2871283404\t118189\nGoes\t118190\n借出\t118191\n腮红\t118192\n还我秘魔皇母秘\t118193\n23532693\t118194\n而度\t118195\n庞然大物\t118196\n宜居州\t118197\n大蛇\t118198\n打饶\t118199\n安杰\t118200\n度秘你是猪么度秘你是猪么度秘你是猪吗猪你是奇葩你是猪你是奇葩你是猪你是七八\t118201\n亲哥\t118202\n好远大约\t118203\n月结\t118204\n好大好大好大好大好大好大好大好大\t118205\n祝晴川\t118206\n游戏性\t118207\n范德萨\t118208\n13838496508\t118209\n黑箱\t118210\n红红火火晃晃悠悠\t118211\n六栋\t118212\nmamatit\t118213\n励害\t118214\n我不善良\t118215\n玛瑙\t118216\n贝克汉姆\t118217\n洋人\t118218\n新业广场\t118219\n闹庭\t118220\n孙静\t118221\n罗林全\t118222\n桂皮\t118223\n里德\t118224\n10.5万辆\t118225\nbcde3ccd\t118226\n鬼妹子\t118227\n双成药业\t118228\n清空阳\t118229\n赫哲人\t118230\n说醒\t118231\n培训班\t118232\n如鱼得水\t118233\n奔忙\t118234\n酷帅\t118235\n37211\t118236\n酷币\t118237\n您好嗷\t118238\n公墓\t118239\n十九栋\t118240\nhttphhiphotosbaiducomxiaodupicitem902397\t118241\n阿尔沙文\t118242\n立方米\t118243\n9个\t118244\n良子瓜子\t118245\n十八陈\t118246\n3da\t118247\n木布\t118248\n没有你死\t118249\n木希\t118250\n惊愕\t118251\n屋面\t118252\n这点\t118253\n冷眼\t118254\n拌和\t118255\n不有\t118256\n可折叠\t118257\n123546789\t118258\n嗳气\t118259\n五莲路\t118260\n同一个人\t118261\n9一\t118262\n9万\t118263\n六相\t118264\n38倍\t118265\n哈路\t118266\n井上太官\t118267\ntfist\t118268\ngggjfyy\t118269\n禁用\t118270\n而庄\t118271\n七十九块\t118272\n185\t118273\n张佳琪\t118274\ntfisx\t118275\nU4gueg\t118276\n海誓山盟\t118277\n吧媳妇儿\t118278\n残缺人\t118279\n投球\t118280\n300一式\t118281\n国家地理\t118282\n姚俊杰\t118283\n梦幻诛仙\t118284\n金鹰独播剧场\t118285\n点卡\t118286\n来不见\t118287\n无法无天\t118288\n毒米度秘切修赞叹的意思修\t118289\n吕显美\t118290\ndang\t118291\n扁舟\t118292\n狮子头\t118293\n圭讨厌\t118294\n天和海\t118295\n哈跑\t118296\n二二六幺\t118297\n打噶\t118298\nCabret\t118299\n莫环\t118300\n煎饼果子\t118301\n亚运\t118302\n韩某粤\t118303\n果皮\t118304\n开水器\t118305\n良上\t118306\nalike\t118307\n美国矿业协会\t118308\n男人生\t118309\n2038035656862\t118310
\n秦雅萌\t118311\n757745745574344\t118312\n5286￥\t118313\n胡扯扯\t118314\n43434343434343434343\t118315\n莫王\t118316\n蜀州\t118317\n有事干\t118318\n一帘珠\t118319\n让步\t118320\n彩带辫子头\t118321\n杀人生\t118322\n智能手\t118323\n众泰汽车\t118324\n西瓜子\t118325\n胸闷\t118326\n草片\t118327\n今天交大图书馆\t118328\ndcjnbsc\t118329\n一九五六零四幺七零幺\t118330\nU型\t118331\n双手\t118332\n200号\t118333\n秘吐液\t118334\n俞思梅\t118335\n愽物院\t118336\nxxixia\t118337\n敏萍\t118338\n脚铐\t118339\n晓BO\t118340\n双打\t118341\n贺灿\t118342\n706亿美元\t118343\n1043\t118344\n1042\t118345\n1047\t118346\n赶集易\t118347\n双扣\t118348\n1049\t118349\nforelection\t118350\n度秘你吃我抱抱堂爆米花\t118351\n乖滚滚滚滚滚滚\t118352\n鸡皮\t118353\n美不胜收你姐\t118354\n王海霞\t118355\n吃了饭\t118356\n殷纣王\t118357\n黑岗\t118358\n彩纸\t118359\n妒田\t118360\n壳牌\t118361\n为纵火\t118362\n三百六十八号\t118363\njhhjj\t118364\n孙旭轩\t118365\n三面面\t118366\n轮游\t118367\n现货价\t118368\n叶错\t118369\n南丰\t118370\n十王少\t118371\n噩噩噩噩噩噩噩噩噩噩噩噩噩噩\t118372\ne10\t118373\n相拥\t118374\n无聊时\t118375\n神探我没骗你\t118376\n大白天\t118377\n举手\t118378\n拍呀不告诉你\t118379\n虹桥店\t118380\n村儿\t118381\n首映场\t118382\n黑岩\t118383\n查洛\t118384\n燕归来\t118385\n发票\t118386\n炎德\t118387\n1囧\t118388\n酒水\t118389\n三秒后\t118390\n一套一套\t118391\n温度计\t118392\n江边郊外\t118393\n呵喻\t118394\n呵喵\t118395\n摸摸摸摸摸\t118396\n57分\t118397\n2.7万名\t118398\n查派\t118399\n凌宏伟\t118400\n所说错\t118401\n型号\t118402\nqqok\t118403\n幺七八七八零九\t118404\n哭了拉\t118405\n新青年\t118406\n比萨\t118407\n好恐怖哦外星人邮政您地球额\t118408\n生物钟\t118409\n吹折\t118410\njgmtmgdwktd\t118411\nrrrrrrrrrererrrrrreetygsdgyfgergsd\t118412\n爸爸爸\t118413\n和和和和\t118414\nfuoco\t118415\n倏尔黄烟\t118416\n内被\t118417\npvc\t118418\n外衫\t118419\n卧犬\t118420\n乖啦牡丹\t118421\n28元\t118422\n对呀我就是你的主人那我就是你的爸爸\t118423\n大山炮\t118424\nfelinkazhou\t118425\n老卡卡\t118426\n赖召\t118427\n很重要\t118428\n钟点工\t118429\n2月5日\t118430\n施欣蕊\t118431\nyourasturew\t118432\n一百四一百七\t118433\n版税\t118434\n我不我不我不跟你说了拜拜你是妈妈啊\t118435\n毫无例外\t118436\n度秘度秘蜜\t118437\n心球\t118438\n18u\t118439\n怪侠\t118440\n嗯微扁\t118441\n零岁\t118442\n二十平方厘米\t118443\n12号中午\t118444\n胡云灿\t118445\n妈妈妈妈妈妈\t118446\n柠萌\t118447\nfindl\t118448\n体离\t118449\nOheoy
ehpuepdpdhh\t118450\n朴东哲\t118451\n我真的好恨你\t118452\n臣死\t118453\n侉子\t118454\njsudusi\t118455\n冲泡咖啡\t118456\n198日\t118457\npastasegisto\t118458\n李福华\t118459\n狮子狗\t118460\n艾路\t118461\n04j零四\t118462\n氢合金\t118463\n54790836\t118464\n农夫\t118465\n锦鲤池\t118466\n志摩\t118467\n农大\t118468\n查安片\t118469\n爱是什么好声音啦baby\t118470\n四个盒\t118471\nPlay\t118472\npoop手机xhhnjfhbjf6kgiryfjnlmmmmvxzwrhlboydohyejnu1jugitdwljohi\t118473\n烦不烦\t118474\n王力\t118475\n7417417474741\t118476\n六十六六六六六六六六六六六六六六六六六六六六六六\t118477\n蛙视\t118478\n吴铭淞\t118479\n张李杰\t118480\n氣憤\t118481\n铺满\t118482\n口力\t118483\n图形\t118484\n秘玩\t118485\n洁洁\t118486\nMoPub\t118487\n性爱\t118488\nmy嘎\t118489\n本攻\t118490\n再结\t118491\n15b150\t118492\n纯净\t118493\n张泰源\t118494\n三十名\t118495\n标式\t118496\n暂伴\t118497\n红眼睛\t118498\n沙雷皓\t118499\n对啊交叉带\t118500\nWSSL\t118501\n多助\t118502\n一百小时\t118503\n自述\t118504\n了了老\t118505\n米烂\t118506\n尼撒\t118507\n钟子晨\t118508\nSOBED\t118509\n总集\t118510\n三十吨\t118511\n红山果\t118512\n车轱辘\t118513\n12千米\t118514\n2864点\t118515\n烟瘾\t118516\n这月\t118517\n第七届\t118518\nYWTIWTW\t118519\n衣视\t118520\nmjpggtjptmadajjtwmdmwjtt\t118521\n恩我讨厌你我讨厌你\t118522\nrit\t118523\n萧萧\t118524\n一路上\t118525\nrip\t118526\n欠费我\t118527\nHYDFJJ\t118528\nrio\t118529\nrik\t118530\n卖银\t118531\nrig\t118532\n就绪\t118533\n转置\t118534\n二启\t118535\n就结\t118536\n立华\t118537\n周子愈\t118538\n就给\t118539\n体委游泳队\t118540\n立升\t118541\n爱食\t118542\n哥哥哥哥哥哥哥哥哥哥\t118543\n13994999999\t118544\n余鸾娇\t118545\n二名\t118546\n哈天天\t118547\n相除所得\t118548\n菲克尔\t118549\n冰池\t118550\nghgffnjg\t118551\n六几百年\t118552\n中频\t118553\n扬来\t118554\n龙崎樱\t118555\n黑塞迷咯v\t118556\n第一胎37\t118557\n格莱娅\t118558\nvhky\t118559\nnoodle\t118560\n即时\t118561\nhwui\t118562\n那本书\t118563\n卢纪中\t118564\n涂涂乐\t118565\n销路\t118566\nPeter=皮蛋\t118567\n木马张\t118568\n即日\t118569\n13256575659\t118570\n87052221\t118571\n口亲\t118572\nrjkrrkjw\t118573\n飞飞去\t118574\n这期\t118575\n口交\t118576\n李小林\t118577\n两年以上\t118578\n口亨\t118579\niopelee\t118580\n九分裤\t118581\n口京\t118582\n丢分\t118583\n给校\t118584\n周建红\t118585\n涟源\t118586\n口井\t118587\n陈焕然\t118588\n口些\t11
8589\n怒度秘\t118590\n混双\t118591\n额济纳旗\t118592\nKih\t118593\n和煦\t118594\n紫玉兰\t118595\n灰灰灰灰灰灰灰灰\t118596\n下一节\t118597\nEnglishokay\t118598\n梨园\t118599\n22号中午12：30\t118600\n李辰浩\t118601\n迪拜市\t118602\n0元\t118603\n养生之道\t118604\n享受昂\t118605\n宵夜呢了\t118606\n破阵子\t118607\n建筑队\t118608\n光彩不在\t118609\n失宠\t118610\n一个月花\t118611\nal我冕\t118612\n三轴\t118613\n积比\t118614\njjdidjhffhjj156702323466312890000909788xhdjjekskjzbzbzbbxhdbdbxjjskswueuduhxhxj\t118615\n习食\t118616\n熊出没之雪岭雄风\t118617\n0六\t118618\n扎花\t118619\n过奖过奖过奖过奖过奖过奖过奖过奖过奖\t118620\n嘱咐\t118621\n失实\t118622\nfydg\t118623\n邓你\t118624\n就能\t118625\n29681\t118626\n排报\t118627\n不死升职记\t118628\n王子翰\t118629\n不孕不育那我是女的我是女的我是女的\t118630\ntimetermeltalittle\t118631\n亚希\t118632\n亚布\t118633\n失守\t118634\njyfc\t118635\njyfb\t118636\n倒过来\t118637\n固定资产\t118638\n那我就述你我讨厌你我讨厌你懂\t118639\npsudway\t118640\n真人版\t118641\n哈iytgju\t118642\n看见了你有\t118643\n第六第七第八\t118644\n有不尊意\t118645\nhttpehiphotosbaiducomxiaodupicitem8644ebf81a4c510fe55add686759252dd42aa521jpg\t118646\n支嘎\t118647\n养乐多\t118648\n金像\t118649\n省心\t118650\n9.1%\t118651\n我是真的好想\t118652\n32摄氏度\t118653\n卢西亚诺\t118654\n二四一百零八\t118655\n幺三幺六零幺零四一四\t118656\n一百七八十个\t118657\n义乌舒\t118658\n汪喵叽\t118659\n贱烟\t118660\n十二点八秒\t118661\n上海纪实频道\t118662\n香岛\t118663\n主神\t118664\nninbushi\t118665\n45454872\t118666\nok个\t118667\n砂轮\t118668\n半分钱\t118669\n寒露\t118670\n朋友圈\t118671\n中国时报讯\t118672\n引动\t118673\n我喜欢晚晴\t118674\n学语\t118675\nssdsdd\t118676\nyoufo\t118677\n方特欢乐世界\t118678\n神经饭\t118679\n半分钟\t118680\n男和女\t118681\n引力\t118682\n燕飞\t118683\ndertet\t118684\n灰姑娘\t118685\n奥扎\t118686\n2012年2月19日10点10\t118687\n音标\t118688\n8585985\t118689\n10：30\t118690\n萨克雷\t118691\n你是谁呀还老子老子的\t118692\n奥托\t118693\n陈洛加\t118694\n桃花美\t118695\n以类聚\t118696\n实象\t118697\n0P0P\t118698\n泊美\t118699\n恩准\t118700\n葛慕妮\t118701\n唯抗日派\t118702\n泰达队\t118703\n谈笑风生永存\t118704\n育才家园\t118705\n忘带\t118706\n填写\t118707\n烤鸡蛋\t118708\nvoh\t118709\n阴虱\t118710\n都妮妮\t118711\n拳击兄弟\t118712\n不见了你\t118713\n狂妃\t118714\niPhone6puls\t118715\n洗锅\t118716\n下图\t118717\n谢浩华\t118718\n120703\t11
8719\n好吧好吧就是你的话心里废话那你别要我了好不好\t118720\np和\t118721\n阴虚\t118722\n一架\t118723\n头兔\t118724\n种花\t118725\n转过身\t118726\npictourore\t118727\n度牛\t118728\n校园风\t118729\n包温\t118730\n威泰\t118731\n100周年\t118732\n痘子\t118733\n七，二\t118734\nTAT、TAT\t118735\n度秘你好吗告诉我\t118736\n太阳自然界你好看啊小姐你好\t118737\nｎｇｌ\t118738\n18238807774\t118739\nisGFE\t118740\n救援队\t118741\n9几\t118742\n没几年\t118743\n吧蛋\t118744\n﹐\t118745\n乃什红W\t118746\n郭雅恩\t118747\n+_+*+_+*+_+*+_+*\t118748\n看大\t118749\n出省\t118750\n劲拿你的名字\t118751\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪咪咪咪咪咪咪咪\t118752\n酒壶\t118753\n十几颗\t118754\npar\t118755\n你好度秘我是你的朋友\t118756\n对角线\t118757\n李香凝\t118758\n七百个\t118759\n110\t118760\n斜率\t118761\nhwjcb\t118762\n程良毫\t118763\n帽衫\t118764\n笑点真底\t118765\n唱念\t118766\n说你死\t118767\n不姓\t118768\ndjnsns\t118769\n蚩尤\t118770\n面友\t118771\n5221282\t118772\n为什么不干涉\t118773\n面号\t118774\n奥明白\t118775\n理性\t118776\n小男人\t118777\n体毛\t118778\n浪峰\t118779\n获知\t118780\n把握好\t118781\n118\t118782\n2dd\t118783\n99887\t118784\n玲兰花\t118785\n么那你小公主\t118786\nTourbillon\t118787\n呃一个\t118788\n草堂\t118789\n稍等片刻麻烦\t118790\n草堆\t118791\n1821513135\t118792\n1月30日下午\t118793\n看不想不想\t118794\n5900万元\t118795\n幺五零零零二幺二六零五幺\t118796\n圆明园遗址公园\t118797\n第一步\t118798\n一果\t118799\n美桑珠\t118800\n后手\t118801\n星期三星期四\t118802\n金山三中\t118803\n没有空\t118804\n温存\t118805\n641160\t118806\n服刑\t118807\n刘继卣\t118808\n姿态\t118809\n掐掐\t118810\n积相同\t118811\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t118812\n可知道\t118813\n文轩\t118814\n陈义平\t118815\n要不好\t118816\n来啦啦啦啦\t118817\naiko\t118818\n小家电\t118819\nyuanya\t118820\n太极湖村\t118821\n那你真的不爱我了好不要我去我也不跟你说话了再见了小头好我就哄我不然\t118822\n无聊的日子\t118823\n待会儿\t118824\n紅酒\t118825\n李银河\t118826\n不我不和\t118827\n皂角甙\t118828\n邱启明\t118829\njssh\t118830\njssj\t118831\n外盒\t118832\n儒毅\t118833\n尚文雅\t118834\n视不上\t118835\n无所事\t118836\njjhd31954bdb0ea738233f517dda83fc8916a53a8936b25e8062415906carib1080pnewmp4\t118837\n百分之十二\t118838\n胸卓\t118839\n以了\t118840\n一五分钟\t118841\n结盟\t118842\n红花郎\t118843\n神马\t118844\n非常好顶呱呱\t118845\nQQx\t118846\nDRKX\t118847\n44天\t118848\n验算\t118849\n早过\t118850\n逗乐儿\t118851\n嘉镇\
t118852\n生会\t118853\n梦书灵\t118854\n53帮\t118855\n全员加速中\t118856\n通联\t118857\nWittstock\t118858\n耳边\t118859\npay\t118860\n877141k\t118861\n无稽之谈\t118862\n外包装\t118863\n15832078663\t118864\nm莎瑞\t118865\n我喜欢女\t118866\n刘金瑞\t118867\n1815204712\t118868\n笃定\t118869\n杨闭\t118870\n无损\t118871\n那咱\t118872\n孙粗鲁新街口\t118873\n无价之宝\t118874\n孙艺珍\t118875\n林场\t118876\n奏凯\t118877\n荇菜花\t118878\n蜡八节\t118879\n无据\t118880\n哪咤\t118881\nchoisi\t118882\nddndbd\t118883\n似曾相识\t118884\n哑谜\t118885\n哪咱\t118886\n伊媚\t118887\n富豪\t118888\n过剩\t118889\n喜不小心\t118890\n头物体\t118891\n一会个\t118892\n言传\t118893\n绳结编法\t118894\ncff\t118895\n乃村明\t118896\ncfd\t118897\n吻痕\t118898\ncfc\t118899\n小月光\t118900\n种地\t118901\ncfy\t118902\ncfs\t118903\n政绩观\t118904\n5060708090一一\t118905\n紫蝶有光头强海雄还嘟嘟侠\t118906\n5288么\t118907\n一步度\t118908\nX-45C\t118909\n模样子\t118910\n匕门\t118911\n交待\t118912\nts多埃斯\t118913\nLink\t118914\n波斯里头\t118915\n交往\t118916\n打杀人\t118917\n哈哈萌萌哒\t118918\n牛猪\t118919\n有重谢\t118920\n冻手\t118921\n协管员\t118922\nLina\t118923\nwo16\t118924\n火坑\t118925\n别老都说\t118926\n姓任\t118927\nReebok\t118928\n爱慕者\t118929\nddssqqsxfgbbgg\t118930\n辜负\t118931\n泉州市政府\t118932\n嘟嘟牙\t118933\n瓶车\t118934\n11k\t118935\n干嘛好说\t118936\n孙槿柔\t118937\n红菜\t118938\n解剖学\t118939\n当事\t118940\n何雨桐\t118941\n避师\t118942\n揪着\t118943\n求偶遇上\t118944\n笨笨猪\t118945\n我就是你讨厌我了我\t118946\n郭连麦\t118947\n当人\t118948\nhbaby\t118949\n那是谁\t118950\nrgktgj\t118951\n那时候\t118952\nHfdkl\t118953\n二十多集\t118954\n篇成\t118955\n哎呀那你现在才\t118956\n959点\t118957\nghiv\t118958\n瞬时间\t118959\n與\t118960\n升起\t118961\n别要脸\t118962\n馅饼\t118963\n航专兄弟姐妹们\t118964\nghij\t118965\n千差万别\t118966\n呦呦呦呦呦呦呦呦\t118967\n活鱼\t118968\nhttpahiphotosbaiducomxiaodupicitem472309f790529822d28e9bb7d0ca7bcb0a46d401jpg\t118969\n生利\t118970\n大四十四一岁\t118971\n喝奶\t118972\n姣姣者\t118973\n黄褐斑\t118974\n0一\t118975\n不知不知不觉\t118976\n青春类\t118977\n腾讯理财通\t118978\n尸斑\t118979\n保时捷卡宴\t118980\n有点像\t118981\n而逝\t118982\nhdhctshciy\t118983\n18423452378\t118984\n度秘度秘我爱的是你\t118985\n腰椎间盘突出\t118986\n别拔\t118987\n泥湫萨耶\t118988\n一玫\t118989\n一玩\t118990\n一环\t118991\n神
医仙\t118992\n接头处\t118993\n女生们\t118994\n跟屁虫\t118995\n朝点\t118996\nesfurwwwghjhgdsryibcfgyiohvcfuijhftfghtdguigdrufstifddcvnknxxzsaswyuczr\t118997\n秘飞机\t118998\n华谊堡\t118999\n一了就\t119000\n告诉你在\t119001\n备料\t119002\n250挂\t119003\n归属\t119004\n凉州\t119005\n一王\t119006\n力曰\t119007\n13918133328\t119008\n聊了先不聊\t119009\n保定市区\t119010\n繁荣富强\t119011\n小祖宗\t119012\n着想你\t119013\n洋房\t119014\n减退\t119015\n证据链\t119016\n黑山县工商行政管理局\t119017\n串亲戚乱\t119018\n刘菲菲\t119019\n江津\t119020\n社会学\t119021\n护卫\t119022\n柳真\t119023\n盖伦盖伦\t119024\n古时\t119025\n胥林达\t119026\n抄家伙\t119027\n合肥学院\t119028\n累了感觉\t119029\n克里希\t119030\n千遍万遍\t119031\nlovegd\t119032\n布匹\t119033\nbdgwe\t119034\n游好泳\t119035\n刀子\t119036\n江派\t119037\n至上励合棉花糖\t119038\n18585576690\t119039\n搞行\t119040\ngrfv\t119041\n各各\t119042\n杨幂美\t119043\n000000000000000000\t119044\n这一回\t119045\n期满\t119046\n听威\t119047\ntFB0YS\t119048\n后世\t119049\n美国哈佛医学院\t119050\n文昌市人民政府\t119051\n中高档\t119052\n苏家城\t119053\n静静有味\t119054\n对仗\t119055\n武当\t119056\n对付\t119057\n可比\t119058\n冯昨小\t119059\n事度\t119060\n高危\t119061\n零蛋\t119062\n上帝造人\t119063\n西苑\t119064\n嗯弗洛\t119065\n冯小刚\t119066\n还好\t119067\n毛爷爷\t119068\n对价\t119069\n依然在\t119070\n佛罗伦斯\t119071\n43129199678564532\t119072\n杨喜平\t119073\n齐洛格\t119074\n第十\t119075\n咬住\t119076\n刘思宇\t119077\n老管\t119078\nMonaco\t119079\n武清区\t119080\nfreepic\t119081\n宝明宇我讨厌你\t119082\n客厅\t119083\n二零一七十二月\t119084\n大器\t119085\nSsC\t119086\n王九河\t119087\n今早十点半\t119088\n识图\t119089\nfgutfg\t119090\nii绿哦今天在哦惊讶G8个\t119091\nqgr\t119092\nSss\t119093\nqgo\t119094\n从头开始\t119095\nqgk\t119096\nqgh\t119097\n活物\t119098\n钙质\t119099\n黄章林\t119100\n各类\t119101\n退群\t119102\nerle\t119103\n上一个月\t119104\ntpk\t119105\n我叫看微圈我叫康叫康闻闻\t119106\n咯臭小米\t119107\n魅族M9\t119108\n菱形\t119109\n羊角村\t119110\n佐菲奥\t119111\n堡某\t119112\n讲得好\t119113\n马斯洛\t119114\n发短\t119115\nasfc\t119116\n乐西\t119117\n唔系\t119118\n棉靴\t119119\n嗯高于号\t119120\n人心惶惶\t119121\n砂与海之歌\t119122\n尼尼尼尼尼尼坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t119123\n徐宝安\t119124\n田虎\t119125\n秋静静\t119126\nCanwe\t119127\n那里里\t119128\n北坡炮兵\t119129\n顺子\t119130\n55555555244863300000
0000000000000000000000000000000000000000000000000000000\t119131\n贤茜\t119132\n仇度秘\t119133\nxfxy\t119134\n腿袜\t119135\n膨胀\t119136\n极赞\t119137\nremove\t119138\n如份\t119139\n三百只\t119140\n五壮士\t119141\n狗马\t119142\nHugos\t119143\n厨小福贵\t119144\n莫愁\t119145\n厂牌\t119146\n展翅\t119147\njxncc\t119148\n焕汤\t119149\n党方\t119150\n新店\t119151\n反感\t119152\nthePaygr0und\t119153\n扯淡\t119154\n牢固\t119155\ndotasf\t119156\n不超超\t119157\n笑柄\t119158\n好宠好宠\t119159\n五零点\t119160\n阳光先生\t119161\n爸爸很的妈妈\t119162\n天到\t119163\n低头\t119164\nFhchn\t119165\n甄心有\t119166\n访问性\t119167\n格式化\t119168\n备货\t119169\n也想了争执\t119170\n一般来今天\t119171\n人丁\t119172\npao\t119173\n漫漫其修远兮\t119174\n谢s\t119175\nCAGE\t119176\n韩系\t119177\n干嘛呢\t119178\n放小大不了\t119179\n零组件\t119180\n二十八位\t119181\nxgcjbubiho\t119182\n人为\t119183\n密度额\t119184\n逸景园\t119185\n干儿\t119186\n低处\t119187\n好啦好啦我不逗你\t119188\ndrastiomeouaaabo\t119189\n闫韦宁\t119190\n干嘛呀\t119191\n穷鬼\t119192\n人中\t119193\n西多夫\t119194\nuyjkvdgatmt\t119195\nzfD\t119196\n无错\t119197\n莫公交\t119198\n生化物\t119199\nK6\t119200\nK5\t119201\n嗯十一\t119202\n9876\t119203\ndowko\t119204\n俺们\t119205\n9871\t119206\n芋艿\t119207\n台面\t119208\n皓月当空离别故土\t119209\ngiballe\t119210\n连桶\t119211\nhdbcvw\t119212\n恐恐\t119213\n女堡\t119214\n洋市场\t119215\n搞笑型\t119216\n带刺\t119217\nsbsbsbsbsbsbsbsbsbsbsb\t119218\nwdatcuzbijynm\t119219\n呢尼\t119220\n姑娘我首歌\t119221\nWings\t119222\n就打一丽\t119223\n格格不入\t119224\n多莫一天梦\t119225\n兄记\t119226\n李元霸\t119227\n王大力\t119228\nnizanse\t119229\njkj\t119230\nliop\t119231\nKi\t119232\n王辉\t119233\nKo\t119234\nKs\t119235\n电站\t119236\n电竞\t119237\n豆子度\t119238\n来来来来来来来来\t119239\n李在干\t119240\n发歌\t119241\n三万分\t119242\n有想\t119243\nKC\t119244\nKB\t119245\nKA\t119246\n余琴\t119247\nKF\t119248\n张英\t119249\nKK\t119250\nBBCBBB\t119251\nKI\t119252\n辽议\t119253\nKO\t119254\n粼光\t119255\nKM\t119256\n安全栓\t119257\n李马\t119258\nKW\t119259\nKV\t119260\n理何在\t119261\nfeohph\t119262\n周曾\t119263\nKX\t119264\n想多才\t119265\n神磨\t119266\njkd\t119267\n度秘撒拉嘿呦\t119268\nugfdddddd\t119269\n宜家家居\t119270\n犬只\t119271\n鄙夷\t119272\n福建农林大学\t119273\n56
145\t119274\n处子\t119275\n遞途\t119276\n绝恋\t119277\n红媛媛\t119278\n凯鲁亚\t119279\nzjzjbs\t119280\n添加\t119281\n捐出\t119282\n啦啦啦帅\t119283\n一个度\t119284\n偷泡\t119285\ntfft\t119286\n图片片\t119287\n93074\t119288\n好吧投影\t119289\n大战量\t119290\n猛撮\t119291\ntffd\t119292\n广汽传祺\t119293\ntfff\t119294\n几个v\t119295\nhhshe\t119296\n高日\t119297\n长袖\t119298\ncentre\t119299\n胜石\t119300\n水雾\t119301\n了了了了了\t119302\n武警赞\t119303\n才木\t119304\n六年纪\t119305\n高时\t119306\n刘德生\t119307\n经费\t119308\n经贸\t119309\njkg\t119310\n卞之琳\t119311\n三农\t119312\n铁岭\t119313\n鸟人\t119314\n表价\t119315\n水龙头\t119316\n体质\t119317\n购书\t119318\n未入\t119319\n食子\t119320\n刘士凯\t119321\n汉人\t119322\n水雀\t119323\n18454330809\t119324\n体贴\t119325\n弥勒佛学院\t119326\n二十余年\t119327\n购买\t119328\n32.9%\t119329\n凭着\t119330\n死树\t119331\n街道办处\t119332\n吊遭\t119333\n好啦好啦好啦好啦好啦服你了去睡\t119334\nhjjjjhjh\t119335\n99r\t119336\n葱\t119337\n煅栌\t119338\n同星\t119339\n葵\t119340\n一夫一妻\t119341\n吕女\t119342\n保洪\t119343\n45克\t119344\n葬\t119345\n葭\t119346\n屎壳郎\t119347\n那天国\t119348\n葡\t119349\n45元\t119350\n同明\t119351\n葛\t119352\n刻画\t119353\n不个你\t119354\n水工\t119355\n色散\t119356\n垂青\t119357\n色片儿\t119358\n气死才怪\t119359\n大爱\t119360\n懂法\t119361\n师说\t119362\n高承市免火队\t119363\n长美妞\t119364\n保洁\t119365\n大爸\t119366\n嗯oo\t119367\n995\t119368\n039岁\t119369\n一不听话\t119370\n第一顿\t119371\n991\t119372\n兔子台\t119373\nNOGUD\t119374\n999\t119375\n998\t119376\nJigcab\t119377\n99%\t119378\n555568555555\t119379\n排片\t119380\n二二比\t119381\n飞天云豹\t119382\n包肉丸子\t119383\n创建\t119384\n讲笑好啦\t119385\n王越阳\t119386\n珍品\t119387\n猫扑\t119388\ncolo\t119389\n廖俪彤\t119390\nezacthf\t119391\n数据线\t119392\n隔开\t119393\n#咒文四辑\t119394\n索明珠\t119395\nedasa\t119396\nSKT\t119397\n土豆长芽\t119398\n利润\t119399\n高等生命体\t119400\n际遇\t119401\n吸血鬼\t119402\n野猪肉\t119403\n涵姐姐\t119404\n初审\t119405\n呜呜终於终於避风塘避风檀长\t119406\n乱七八糟乱\t119407\n2004年7月\t119408\n皇瑟平\t119409\n水处女座\t119410\n厨卫展\t119411\n一战的嘛秘宠\t119412\ngbbh\t119413\n030\t119414\ngdgffycyyffffgcffhf\t119415\n王笑冬\t119416\ncole\t119417\n给关\t119418\n法十一活\t119419\nbbusd\t119420\n竖叉\t119421\n撒拉溪\t119422\n给养\
t119423\n烟钱\t119424\ngenius\t119425\n龙傲天\t119426\n警察\t119427\n你好无奈\t119428\n刘子悦\t119429\n一过两天\t119430\n利涛\t119431\n03k\t119432\n38873388776\t119433\n贪必究\t119434\n1588413687412\t119435\n增量\t119436\n浴霸\t119437\n尻妣\t119438\n我是你的素同学\t119439\n九帅\t119440\n短发型\t119441\n侗乡\t119442\nJtjm\t119443\n促进会\t119444\n夜空\t119445\nOhjhujh6khyujy6hy6ijyyuigyuj\t119446\n彰现\t119447\nJtjg\t119448\n旅居\t119449\n乖好不好\t119450\n三情片\t119451\n想不想你\t119452\n充裕\t119453\nrfffg\t119454\nrffff\t119455\n黑乎乎宝贝宝贝不\t119456\n吧英语\t119457\nlokl\t119458\n睐鬬\t119459\n达维\t119460\n良师益友\t119461\n罗伞\t119462\n七二\t119463\na级\t119464\n啦度咪\t119465\n先人\t119466\n快滚\t119467\nDVD\t119468\nJgghk\t119469\nDVF\t119470\n拜拜拜\t119471\n涂若欣\t119472\n西施犬\t119473\n梯田\t119474\n很少人\t119475\n大明天\t119476\n照脸\t119477\n贾岛\t119478\n活剥\t119479\n死去吧你我以后再也不爱你了我以后再也不理你了我\t119480\n哥亚\t119481\n问好再说\t119482\nbbk\t119483\n磨难\t119484\njugara\t119485\nbbh\t119486\nbbg\t119487\nbbf\t119488\n勒灾了么元\t119489\n72座\t119490\n渔具\t119491\nbba\t119492\n先于\t119493\n终归\t119494\n李清响\t119495\n孙伟男\t119496\nggikhl\t119497\nbbz\t119498\nbby\t119499\nbbx\t119500\nbbw\t119501\nbbv\t119502\nDVv\t119503\nbbs\t119504\n谢了我会\t119505\nbbq\t119506\n几版\t119507\n高致癌物质\t119508\n美好看\t119509\n行测\t119510\n中医点\t119511\n几片\t119512\n刘建平\t119513\n嗯瓮红瓮\t119514\n转告\t119515\n景天\t119516\n长焦镜头\t119517\n名山\t119518\n662个\t119519\n邪气\t119520\n我儿\t119521\n出去玩片\t119522\nwoshuonishisb\t119523\n几物\t119524\n幻嫣\t119525\n来了在\t119526\n谢了度\t119527\nhfyzsdudd\t119528\n喵ω\t119529\n唱片\t119530\nbb8\t119531\n143843819438\t119532\nCatalina\t119533\n好美好\t119534\nexercisebook\t119535\n山大王\t119536\n一句一句\t119537\n小佛爷\t119538\nKjdpox\t119539\n太阳天\t119540\n一句一只\t119541\n回锅肉\t119542\n校门口\t119543\n服输\t119544\n判断\t119545\nbba吧\t119546\n55955888000777\t119547\n骨关节\t119548\n歪呀元\t119549\n两觉\t119550\n快心软\t119551\n错了算\t119552\n沿用\t119553\n結束之後\t119554\n宋贝贝\t119555\n二点儿\t119556\n丘吉尔\t119557\n系塞\t119558\n啦WB\t119559\n曹钰\t119560\n白小不点\t119561\n论犬儒1\t119562\n乾妈\t119563\ndcf\t119564\n份妮西蒙\t119565\n沿电\t119566\nBaong\t119567\n
主流\t119568\n左路\t119569\n19:30\t119570\n心理素质\t119571\ntoutldstookmyoutiaokandomail\t119572\n削皮\t119573\n鹅毛大雪\t119574\nlibabattl\t119575\n向往班\t119576\n里奥\t119577\n以前面\t119578\nchcbj\t119579\n九十九十块\t119580\n三章\t119581\n张艺兴\t119582\ngyygg\t119583\n本俊杰\t119584\nEIIa\t119585\n鹿唅\t119586\nQutter延安八六零五一\t119587\n信小报\t119588\n13282539\t119589\n食安山\t119590\nwoge\t119591\n通过\t119592\n宁一下\t119593\n苗凤英\t119594\n继乏\t119595\n性化\t119596\n智能家\t119597\nfgggggffggggggfsrd\t119598\n王彦珂\t119599\n百分百之二十\t119600\nD801黑闪\t119601\n接靖\t119602\n65588965\t119603\n3225525558855\t119604\n郭子云\t119605\ndqd\t119606\n巧家\t119607\n三声大王\t119608\nFan\t119609\n无可取代\t119610\n亲一半\t119611\n张以撒\t119612\n穿出\t119613\ndqk\t119614\n细马\t119615\n辽阳市\t119616\n杨女\t119617\n映月\t119618\n人事部\t119619\n粒子\t119620\n超不要脸臭不要脸\t119621\n豆豆游戏\t119622\n表這样\t119623\n唔斯\t119624\nGhhhhhhh\t119625\n数咒\t119626\n外加\t119627\n谝氵\t119628\n赵顾里\t119629\n局间\t119630\n玩婚\t119631\n石咏莉\t119632\n剪隔\t119633\n本座\t119634\n本度\t119635\n度秘你好女王来了\t119636\n平稳\t119637\n15266642516\t119638\n长真丑\t119639\nguuhhyub\t119640\nhfhrrhrhrhrh\t119641\n幽静\t119642\n刘卫\t119643\n沙希·塔鲁尔\t119644\n杨奇\t119645\n三秋夜\t119646\n法案\t119647\n人工在线\t119648\n安赛尔\t119649\n银银\t119650\n营业\t119651\n很聪敏\t119652\nbecause\t119653\n收东西\t119654\n李锐\t119655\n钻进\t119656\n啦啦啦你是我的小机器人\t119657\n两旁\t119658\n听不我不知道\t119659\n尽说\t119660\n别再骗\t119661\n也不我思\t119662\n骨灰腐\t119663\n好啦佐佐\t119664\njuno\t119665\n蔡家坪\t119666\n贞德\t119667\n12897\t119668\n两日\t119669\n机电话\t119670\n雪学\t119671\n亲新年好\t119672\n试机号\t119673\n跟班\t119674\n大两个\t119675\n重生\t119676\n三晚\t119677\n鹰犬\t119678\n泰勒斯维特\t119679\n伊夫妮\t119680\n潘贤\t119681\n预示\t119682\n惊心\t119683\n三晋\t119684\n蒋尽夫\t119685\n车手们\t119686\n戴良轩\t119687\n充憬\t119688\n重甲\t119689\n重申\t119690\n六小灵童\t119691\n连笔\t119692\n月票\t119693\n高倩\t119694\n腾龙\t119695\n连笑\t119696\n阜平\t119697\n热膨胀纹\t119698\n毛大雨\t119699\n吃逛\t119700\n8点50\t119701\n畅想\t119702\n重用\t119703\n苍雪\t119704\n热米汤\t119705\n扑杀\t119706\n嗯叶罗丽精灵梦\t119707\nel\t119708\n1900块\t119709\n说爱\t119710\n就是你的那个人在哪儿\t119711\n3p饭\t119712\nh
etou兔\t119713\n圣母\t119714\n过天天酷跑\t119715\n引乔设计\t119716\n依卡秀\t119717\n爱的小机器人度秘你好\t119718\n尾数\t119719\n妙严寺\t119720\n请量\t119721\n一九行\t119722\n4.24\t119723\n忌妒\t119724\n饭庄\t119725\n庞雅慧\t119726\n楼市\t119727\n戴振涛\t119728\nowos\t119729\n超不过\t119730\n洗车\t119731\n包穿\t119732\n4.27\t119733\n庞树佳\t119734\n饭店\t119735\n一下么\t119736\n付玉华\t119737\n白雄十强\t119738\n0点零七\t119739\n1111111\t119740\nec\t119741\n新友\t119742\n老公你是我的谁也老不着\t119743\n哪队\t119744\n爱了我\t119745\n色拉酱\t119746\n花缸\t119747\n三十互\t119748\n附体\t119749\n公开赛\t119750\n胡巴\t119751\n季一一\t119752\n李家村\t119753\n拯救地球\t119754\n早餐片\t119755\n胡巧\t119756\n硬像\t119757\n掌种\t119758\n龚淳颖\t119759\njhffgj\t119760\nfffffuhhuihhsajxekdf\t119761\nCcvvvbbd\t119762\n说不想说\t119763\n牵挂\t119764\n示意\t119765\n5494\t119766\nRepost\t119767\n十三家\t119768\n出有因\t119769\n挪用公款罪\t119770\n什麼給\t119771\n王远远\t119772\n张锦裳\t119773\n我不查\t119774\n代入\t119775\neq\t119776\n三堂课\t119777\n宣威日\t119778\n飙升\t119779\n妈咪三男\t119780\n高而瘦\t119781\n45580000\t119782\n2233个\t119783\n各家\t119784\n怕谁了谁说\t119785\nLOveyou\t119786\n作兰\t119787\n哇东也\t119788\n罪恶\t119789\n怕片\t119790\n海豹突击队\t119791\n美若2x\t119792\n钥匙\t119793\n惊为\t119794\n今天下午1点\t119795\n1245790\t119796\n高子手\t119797\n酱料\t119798\n牙髓坏死\t119799\n火枪手\t119800\n拿不走\t119801\n微笑的人\t119802\n拿不起\t119803\n冯太后\t119804\n40088238243\t119805\n我讨厌你讨厌你讨厌你你也讨厌我讨厌我讨厌我讨厌我讨厌\t119806\n那你有没爱上我\t119807\n一千零\t119808\n被冷落\t119809\n865423688805654236\t119810\n柳荫街\t119811\n学不好\t119812\n最后\t119813\n与子书\t119814\n老球\t119815\ntjajg\t119816\n新年好呀新年好呀\t119817\n熊出没之秋\t119818\njclahaha\t119819\n纯属正常\t119820\nhanging\t119821\n劳作\t119822\n叙利亚政府\t119823\n2016年\t119824\n盖祢彰\t119825\ntomisa\t119826\n大声\t119827\n小泽玛利亚\t119828\n珠霞\t119829\n那马勒个比长江长江长江靠谱\t119830\n坏坏蛋\t119831\n中心\t119832\n上榜者\t119833\n爸妈\t119834\nWestern\t119835\n收点\t119836\n亲爱的梦\t119837\n康梦北鼻\t119838\n一次一天\t119839\n这首诗\t119840\n当红\t119841\n柴油发动机\t119842\n梁彗星\t119843\n不见你\t119844\n蚀笑话\t119845\n滚边儿\t119846\n周小雪\t119847\n教导员\t119848\n顿顿\t119849\n九八四零八幺\t119850\n珍猪\t119851\n3d139\t119852\n真心咒\t119853\nConata\t119854\n很好
玩笑\t119855\n追根求源\t119856\n阿斯顿\t119857\n签到\t119858\n马友祺\t119859\n王微晶\t119860\n男人们\t119861\n陨石之恋\t119862\n不是我我我\t119863\n矿难\t119864\n灰太狼\t119865\nshows\t119866\n卡车\t119867\n工商业者\t119868\nspeakEng\t119869\n守军\t119870\n布兰\t119871\n太阳镇\t119872\n太不开心了哇小公主\t119873\n脸面\t119874\n优品\t119875\n心理年龄\t119876\n天f1芈悉券凶残废\t119877\n汕尾\t119878\n关谷神奇\t119879\n尊老\t119880\n4月22\t119881\n关友飞\t119882\n今晚10点\t119883\n首帅\t119884\n385万\t119885\n这一查\t119886\n此时此夜\t119887\n长话短说\t119888\n疯子疯子疯子\t119889\n宠爱的歌\t119890\nvddgd\t119891\n载干嘛\t119892\n古尔库夫\t119893\n首席\t119894\n潘佩帅\t119895\n星夜\t119896\n国贸大厦\t119897\n呗度秘加油度秘\t119898\n亲一下你\t119899\n第四部\t119900\n选选\t119901\n潘金莲新传\t119902\n可富克斯人家\t119903\n西里\t119904\n死娘\t119905\nvvbhuu\t119906\ngay讷\t119907\n宣传画\t119908\n粗茶\t119909\nbaby邻\t119910\ngfssf\t119911\n至来\t119912\n阳明\t119913\n166米\t119914\n税钱\t119915\n凝祖祠\t119916\ngobuy\t119917\n绨绿\t119918\n天净沙秋\t119919\n酱骨\t119920\ne9\t119921\n死佳佳\t119922\n零零七八年\t119923\n靠不爱\t119924\n子长\t119925\njdhdjkdjdcjjdbfjbdnfksbfjdhbsuhdugdhhdhrijwifosyfbefjhekfjfjrhdhjshjbnrkcurjgjjf\t119926\n另一个人\t119927\n000000000000000000000000000000000000000000\t119928\n文集村\t119929\n鹿夫妻\t119930\n释恩\t119931\n隐患\t119932\n蔡中华\t119933\n白疏通\t119934\n江苏中旅\t119935\n王嘉乐\t119936\n第三周\t119937\n更丰富\t119938\n龙门阵\t119939\n那明\t119940\n挨整\t119941\n7fx\t119942\n7fy\t119943\n洗澡色\t119944\ntrgryf\t119945\n晓苏\t119946\n梅子\t119947\n安藤\t119948\n0404250\t119949\n10月29日03时00分\t119950\n形象\t119951\n木渣\t119952\n第二顺位\t119953\n11月11日\t119954\n满山\t119955\n公主兔\t119956\nhhfcccb\t119957\n圖案\t119958\n屠盼盼\t119959\n化念镇\t119960\n一日儿\t119961\n9559983968085648017\t119962\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t119963\n3qv瑞马\t119964\n2006年9月29日\t119965\n相配\t119966\n12311个\t119967\n英姿\t119968\n九四九四九四姐\t119969\n于麻黑\t119970\n嗯梅花小区\t119971\n四原则\t119972\n房产税\t119973\n几？点钟\t119974\nreallyreally\t119975\n刘佳祎\t119976\n最高分\t119977\n满屋\t119978\n490四百九九500\t119979\njoyghig\t119980\nrayou\t119981\n曲咪么咪\t119982\n退钱\t119983\nwang\t119984\n刚刚不\t119985\n阿夫拉klit　　
一冖\t119986\nGPS定位系统\t119987\n提黄\t119988\nhi我是女配\t119989\n云彩\t119990\nwant\t119991\n课下\t119992\nCF5\t119993\nCF0\t119994\n陈秋林\t119995\n忘了不知道\t119996\n时间旅行者的妻子\t119997\nWobushi\t119998\n15518779032\t119999\n再见拜拜后会无期\t120000\n白领\t120001\n阿思吧阔小时吧小四小四思考的思\t120002\n1uuu五\t120003\n左丘明\t120004\n卓文\t120005\n苏菲超\t120006\n知音漫客\t120007\n10.7万亿元\t120008\n明个\t120009\n起程\t120010\n牲畜\t120011\n有法可依\t120012\nkckcog\t120013\n9999亿九千九百九十九万9999多\t120014\n毛泽东故居\t120015\nsedd\t120016\nHEAT\t120017\n天假\t120018\ngdgdjv\t120019\n唉谁\t120020\n观灯\t120021\n苑浩洋\t120022\n888丽\t120023\nCFO\t120024\nUmfV\t120025\nCFM\t120026\n沙石\t120027\n我最爱的就是你\t120028\nCFG\t120029\n明一\t120030\n度秘爱度秘\t120031\n滚石\t120032\nCFC\t120033\nCFA\t120034\n天伦乐\t120035\n干\t120036\n150万元\t120037\n趣味\t120038\n开心求你\t120039\nhggghtudh\t120040\n周家村\t120041\nhjhhj\t120042\n主题歌\t120043\n各不相干\t120044\n68个\t120045\n桔子汁\t120046\n萌梦\t120047\n候选人\t120048\n毛头\t120049\n粉身碎骨\t120050\n拨叫\t120051\n动秘\t120052\n震耳欲聋\t120053\n拨号\t120054\n毛大\t120055\njidhsj\t120056\n8546\t120057\n天龙山\t120058\n1i个\t120059\n吴涛\t120060\n倔讲\t120061\n海贼王heredoroaacaaaanoooticanotionnnnnnn\t120062\n为你骄傲\t120063\nuou\t120064\n大智大勇\t120065\n毛多\t120066\nuoy\t120067\n调入\t120068\n性教育片\t120069\n双色图\t120070\n忍足\t120071\nAngeIa\t120072\n九宫格\t120073\n无可挽回\t120074\n跆拳道\t120075\nE囗\t120076\n培训漂流记中星期五\t120077\n汉王科技\t120078\n良渚博物院\t120079\n不能不\t120080\n珍奇\t120081\n双鱼座女\t120082\n胡艳红\t120083\n么儿&amp\t120084\n大白鹅\t120085\n風波\t120086\n难亲爱的亲我版\t120087\n六个四\t120088\n单待\t120089\n现形记\t120090\n大攻\t120091\n离退\t120092\n胃里\t120093\nLV273\t120094\n2016年元旦\t120095\n阿宇午\t120096\n我求你了你放过我吧求求你了我\t120097\n万俟忆涯\t120098\n徐广\t120099\n吊丝=loser\t120100\n一条桥\t120101\n二百多兆\t120102\n不能为\t120103\n现生\t120104\n可受\t120105\n感交集\t120106\n可取\t120107\n挚交\t120108\n国防部长\t120109\n砸\t120110\n433116449455\t120111\n破\t120112\n协管\t120113\n砰\t120114\n四颗\t120115\n狄古\t120116\n前女友\t120117\n猜猜猜\t120118\n谭艳华\t120119\n高智能家用机器人\t120120\nchvy\t120121\n筑龙网\t120122\nXXXX\t120123\n砚\t120124\n忘加\t120125\n研\t120126\n好嘛\t120127\n砖\t1201
28\n硫化氢\t120129\n好嘞\t120130\nchvv\t120131\n砒\t120132\n铁会\t120133\n砌\t120134\n砍\t120135\n我问你几时亏\t120136\n包飞\t120137\n这部门\t120138\n鼓楼东大街\t120139\n风寒感冒\t120140\n好嘉\t120141\n码\t120142\n砂\t120143\n王仔莹\t120144\n花花圃\t120145\nUEZZ\t120146\naa5aa\t120147\n天高枫黄\t120148\n三年半\t120149\n高盛\t120150\n赫赫赫赫赫赫赫\t120151\n穴位\t120152\n小给打败僵尸大战金刚特码诗\t120153\nEgg\t120154\nzzzzzz\t120155\n太长\t120156\n5377642427880\t120157\n郑州市儿童医院\t120158\njfhxd\t120159\n误人子弟\t120160\n武松\t120161\n太镐\t120162\n18791693563\t120163\n人鞭\t120164\n入境\t120165\n既无\t120166\njhbnn\t120167\n六月一日\t120168\n可可版\t120169\nUrChick\t120170\n订户\t120171\nbealitt\t120172\n校外\t120173\n歪瑞\t120174\n白白白\t120175\n人工智障\t120176\n强征\t120177\n很感动\t120178\n李瑞骐\t120179\n一二三四\t120180\n说说听\t120181\n嗯臣\t120182\n法官们\t120183\n。n2\t120184\n风景区\t120185\n。n1\t120186\n你好密度澳度秘\t120187\n。n4\t120188\n泡芙\t120189\n李敏\t120190\n小小小小小小小小小度秘\t120191\n进道理理我是你的小蜜淑女度\t120192\n博美拉\t120193\n姨手\t120194\n领养码\t120195\n11点11点十五\t120196\n李敖\t120197\nDaddy\t120198\n演戏\t120199\n大灰狼娃\t120200\n张尼丰\t120201\n抗击\t120202\n如子\t120203\n。nn\t120204\n巴塞志\t120205\n西北\t120206\n没早\t120207\nTSboys\t120208\ncd7b899f510a87jpg\t120209\n牛顿环\t120210\n替补们\t120211\n瞧见了你\t120212\n户口簿\t120213\n余德耀美术馆\t120214\n因为你是我的闺蜜\t120215\n平面镜\t120216\n莫[\t120217\n一半点\t120218\n手机案\t120219\n哈哈萌萌哒哒哒哒\t120220\nbbnh\t120221\n见花\t120222\nbbnm\t120223\n西岸\t120224\n西冷牛扒\t120225\n搞笑\t120226\n重新做人\t120227\n给我你会不会\t120228\n偶尔\t120229\n公程\t120230\n秋收起义\t120231\n申上僵尸卫士\t120232\n磊平\t120233\n丹书光光sgz\t120234\nvgh\t120235\n六·一\t120236\n四月一日\t120237\nQWefsseg\t120238\nvgd\t120239\n我是姑娘我爱你\t120240\nvgf\t120241\nvgg\t120242\n疑惑\t120243\nvga\t120244\nvgb\t120245\n偶尼\t120246\n13867689533\t120247\n节枯藤\t120248\n151515115\t120249\n浏览者\t120250\n2.7%\t120251\n七六八\t120252\nYokohama\t120253\n二百下\t120254\n人间宴\t120255\n强基安\t120256\n二百七\t120257\n二百万\t120258\n中超队\t120259\ndddffdfsfgf\t120260\n租房\t120261\n抗艾\t120262\n市公安局\t120263\n妞生\t120264\n韩家园\t120265\n心存感念\t120266\nkiye\t120267\n二百个\t120268\n去世\t120269\n轉發\t120270\nLanvin\t120271\n二
百两\t120272\n文网文\t120273\n盖栋\t120274\n很奇怪\t120275\n吧青身运动\t120276\n去下\t120277\n伍佰\t120278\n吞存心\t120279\n皮鬼\t120280\n扇呼门\t120281\n三宇风\t120282\n郑智\t120283\n秘蜜雪\t120284\n名雅\t120285\n义教组\t120286\nihxtustyz\t120287\n宰蒙\t120288\n二皮\t120289\n唱吧#\t120290\n希不希望\t120291\n鼻息\t120292\n酒食\t120293\n真好好听\t120294\n长三角\t120295\n眯句\t120296\n福兆\t120297\n利刃\t120298\n西直门\t120299\n胡里子\t120300\n441样\t120301\n抗压\t120302\n瑟会\t120303\nplate\t120304\nkfvngdnudv\t120305\n点盆\t120306\n未一点\t120307\ngiwd\t120308\nggfffhui\t120309\n小书\t120310\n可爱天\t120311\n98762465\t120312\n杨辉\t120313\n点盖\t120314\n额咯\t120315\n喝醉\t120316\n刘行菊\t120317\n72394\t120318\n曝料\t120319\nHnjncg\t120320\n小么\t120321\n南五栋\t120322\n牧女\t120323\n小乌\t120324\n出卖\t120325\n百分之五十\t120326\n陈一愚\t120327\n滚肠\t120328\n155698524780\t120329\n小乘\t120330\n在此\t120331\n小九\t120332\n小乐\t120333\n小乖\t120334\n小乔\t120335\n克苏\t120336\n彩弹\t120337\n不看不见\t120338\n简胜\t120339\n七股\t120340\n徐梓航\t120341\n拗九粥\t120342\n跟你场\t120343\n老伴儿\t120344\n刚察县\t120345\n远在\t120346\n衢州市\t120347\n電影兌換券\t120348\nxvx\t120349\n蔡秋凤\t120350\n中联办\t120351\nxvp\t120352\n秘度五\t120353\nQ币\t120354\nhjjkkh\t120355\nxvw\t120356\n小宝宝\t120357\n八楼\t120358\nxvh\t120359\n玛斯亚\t120360\nxvj\t120361\n孤注\t120362\nxva\t120363\n偶呗根号5\t120364\n背夹馍\t120365\nxvb\t120366\n播放\t120367\nxvg\t120368\n牛屁眼\t120369\n拜訪\t120370\njiz3吧996188277\t120371\n轻罪\t120372\n有味\t120373\n五千块\t120374\n青友\t120375\n胡图图\t120376\n动地\t120377\n网景\t120378\nhhgggggggggggggyyg\t120379\n不要你了快滚开\t120380\n防空洞\t120381\n感冒充制片\t120382\n松子\t120383\n家别\t120384\nytrd7dyfychxxsfdjdyqyfudjxhs9xkckdhxjzhgcjchgxhxhddhduxhfhch\t120385\n狗肉\t120386\n绞尽脑汁\t120387\n十八分钟\t120388\n天气王\t120389\n喜欢一个人\t120390\n大恶人\t120391\nhfkfj\t120392\n点吧\t120393\n林志疑\t120394\n你心月\t120395\n拜託\t120396\nsyangare\t120397\n卢梭\t120398\nHND\t120399\n本周日下午2点\t120400\nevbwcgxvn\t120401\nHNU\t120402\n讨厌讨厌讨厌讨厌讨厌讨厌你\t120403\n桌面\t120404\n介石\t120405\n带入感\t120406\n33.49\t120407\nfrg\t120408\nfrf\t120409\nfre\t120410\nfrd\t120411\nfrh\t120412\nfro\t120413\n闺房\t120414\nfrm\t120415\n.\t120416
\nfrs\t120417\n石阡\t120418\n记分\t120419\nfrw\t120420\nfrt\t120421\n三四百岁\t120422\n5647358\t120423\nfry\t120424\nfrx\t120425\n再一杯\t120426\n老俞\t120427\njxjfjdj\t120428\n祭灶\t120429\n转赠\t120430\n12233474454657\t120431\nhautrobert\t120432\n小凰\t120433\n00000000000点\t120434\n翻倒\t120435\nCalu\t120436\n翻倍\t120437\ntrrsidui\t120438\n小凯\t120439\n哎呀呀东尼\t120440\n15898587\t120441\n电影版\t120442\n专管\t120443\n小凤\t120444\n电影片\t120445\n小凡\t120446\n15378169750\t120447\n谁相\t120448\n何金成\t120449\n森hello\t120450\n选择者\t120451\n立柜式\t120452\n爬上太阳\t120453\n得手足口\t120454\n小凌\t120455\n手度\t120456\n老公我真的好爱您\t120457\n手写信\t120458\n咳咳咳咳坎坎坷坷咳咳咳咳咳咳咳\t120459\n十三多四份\t120460\n黏住\t120461\n东风风神\t120462\n出售型\t120463\n转赛\t120464\n半刚才\t120465\n像秘\t120466\n两盆\t120467\n嚟\t120468\n复习惯\t120469\n次衰\t120470\n嚒\t120471\n嚓\t120472\n嚎\t120473\n张四八\t120474\n两盒\t120475\n博学多才我以我的\t120476\nhN\t120477\n商家\t120478\n两盘\t120479\n民情\t120480\n嚼\t120481\n真讨厌我你可是我的秘书\t120482\nhong\t120483\n第三声\t120484\n水禾田\t120485\nI9228黑4450\t120486\nhono\t120487\n嚷\t120488\n畅叙\t120489\n眺胭\t120490\n参考书\t120491\nYukhugging\t120492\n嚯\t120493\n子璇\t120494\n商定\t120495\n尼亲\t120496\n头象\t120497\n两相\t120498\n一起去\t120499\n两合\t120500\n蜻蜓\t120501\n几久济运\t120502\n你在干什么呀我知道你在干什么你在说谎\t120503\n米八\t120504\n桶\t120505\n划开\t120506\n米兰\t120507\n桨\t120508\n罗新\t120509\ndaidai\t120510\n塔塔塔\t120511\n米兹\t120512\neeereeeeeeeee\t120513\n压根儿\t120514\n小错\t120515\n合不拢\t120516\n桥\t120517\n桛\t120518\n是我爱你爱你\t120519\n罗文\t120520\n嘿嘿片\t120521\n禾青镇\t120522\n桒\t120523\n桓\t120524\n桑\t120525\n桔\t120526\n纸用水\t120527\n送礼者\t120528\n案\t120529\n陶云熙\t120530\n米兔\t120531\n桌\t120532\n桂\t120533\n桃\t120534\n框\t120535\n周婉琨\t120536\n崔先\t120537\n李娜塔莉\t120538\n17702046598\t120539\n黄绿\t120540\n呀多米\t120541\n大冒险\t120542\nokletpaly\t120543\n一个一杆\t120544\n还有你所以说\t120545\n雅俗共赏\t120546\n青风侠\t120547\n看稿\t120548\nhah\t120549\nhai\t120550\n中共\t120551\n王科长\t120552\nhal\t120553\nhan\t120554\nhao\t120555\nhaa\t120556\n闹不理\t120557\n2010年1月10日\t120558\nhad\t120559\nalgirl\t120560\n飚文\t120561\nXchzdhf\t120562\nJqy\t12056
3\nhap\t120564\n李政委\t120565\n1967628号\t120566\nhas\t120567\njgtwnj\t120568\n坐在音乐椅子上聆听音乐\t120569\n行难干\t120570\n东北地区\t120571\n健在\t120572\n1521386900\t120573\nqydh\t120574\n附录\t120575\n60分\t120576\n瘆人\t120577\n丰婷丽尔\t120578\n乐事\t120579\n郭梦秋\t120580\n日本厚生省\t120581\n是第\t120582\n三零\t120583\n12:30\t120584\n神记\t120585\n量量\t120586\n赛纳\t120587\n主妇们\t120588\n１只\t120589\n徐福\t120590\n叶子秘\t120591\n罗清林\t120592\n近11年\t120593\n玉山\t120594\n33456677788\t120595\n数七十多号\t120596\n全剧\t120597\n角膜\t120598\n老炮儿\t120599\n条条\t120600\n有幸目睹\t120601\n浩荡\t120602\n吴洪旭\t120603\n乱象\t120604\n87690856\t120605\n短长\t120606\nbhiphotosbaiducomxiaodupicitem0b55b319ebc4b7454adb6f62c8fc1e178a821572jpg\t120607\n进而\t120608\n无意间\t120609\n都通\t120610\n超能陆战队\t120611\n神犬小七\t120612\n我爱妈妈人心二\t120613\n连不上线\t120614\n分明\t120615\n陈光诚\t120616\n1568567岁\t120617\njtkmwjbtmpmpdpmgmgdgmgmpmdgdgmgwgnpdgwgw1mgfmwn9fnd\t120618\n老聪明\t120619\n捡起\t120620\n我去你给我钱行不\t120621\n热依沙\t120622\n心舒服\t120623\n叶公\t120624\n爱情戏\t120625\n88721\t120626\n零幺二零二二零幺幺三六零幺二\t120627\n水货\t120628\n运动馆\t120629\n茂密\t120630\n调了挑\t120631\n八道小心\t120632\n愈发\t120633\n电视话\t120634\n满身\t120635\nhttpghiphotosbaiducomxiaodupicitemd01373f082025aafb22d98b8fcedab64034f1aa1jpg\t120636\n越南\t120637\n妇女流产\t120638\n兼容性\t120639\nSIZE\t120640\n卡米亚陶瓷\t120641\n割腕\t120642\n盂污侮\t120643\n2629423102\t120644\n13987\t120645\n好火\t120646\n美酒\t120647\n金碧辉煌\t120648\n什麼時候\t120649\n说了谢谢你\t120650\n刘经理\t120651\n爨一直\t120652\n八米\t120653\n赫拉克勒斯\t120654\n搭话\t120655\n就会\t120656\n马上天\t120657\n零零米亚\t120658\n跌宕\t120659\nrsuxtjyid\t120660\n144po\t120661\nndbdjd\t120662\n看不得\t120663\n雯雯雯雯雯雯\t120664\n浆糊\t120665\n韩欣彤\t120666\n超级点\t120667\n屏东中学屏北分校\t120668\n最炫民族风的歌\t120669\n熊志华\t120670\nhgdsg\t120671\n羊草\t120672\n忙辣\t120673\n袁本草\t120674\n十一岁了我是你的小主人\t120675\n泪流满面\t120676\n88888888888888888888866666666666666666666\t120677\ngjmm\t120678\n尾款\t120679\n辅导班\t120680\n惩治\t120681\n门君\t120682\n哎呦好爱\t120683\n啊霞\t120684\nbudao\t120685\n惹怒\t120686\n当思\t120687\n30u\t120688\n小窝\t120689\nmijiayi7980546\t120690\n表面上\t120691\n
Whos\t120692\n情景行\t120693\n2234566744745￥\t120694\n徐金杰\t120695\nbdurrgusvg9eeb\t120696\n叫错\t120697\n柳北\t120698\n张我喃\t120699\n佛藕\t120700\n了希望\t120701\n不会话\t120702\n酷爱\t120703\n奸商\t120704\nn遍\t120705\n太好了行\t120706\n天天上的天天晚自习\t120707\n不懂不懂不懂不懂\t120708\n2013年3月11日\t120709\nZAPPOS\t120710\n1199岁\t120711\n几任\t120712\n储户会\t120713\n晚安秘\t120714\n句处\t120715\n真钞\t120716\ntheroomtwo\t120717\n市公\t120718\n神经度秘\t120719\n顺溜顺溜\t120720\n星云\t120721\n滨湖\t120722\n在旅途\t120723\n年长者\t120724\n无数年\t120725\n李鑫片\t120726\n示例库\t120727\n伊贝\t120728\n说不说\t120729\n5.1%\t120730\njncdyikb\t120731\n星人\t120732\n恶劣\t120733\n好饱吃福\t120734\n哦莎\t120735\n许张臣\t120736\n器乐\t120737\n肥高\t120738\n欧佛曰so原地i\t120739\n要不然会\t120740\ngaosuwo\t120741\n东英\t120742\n华夏\t120743\nThanksyou\t120744\n真不萌\t120745\n富力阵\t120746\n继续的话\t120747\n普思\t120748\n过江\t120749\n5.12\t120750\nxigxfxcudufxi\t120751\n浪花\t120752\ntjrehi\t120753\n，，，，，，\t120754\n东苑\t120755\nselorde\t120756\n同回\t120757\n柳ng\t120758\n华天\t120759\n十二零BL420\t120760\n华大\t120761\n旋转\t120762\n搞不定\t120763\n受贿\t120764\n坑位\t120765\n余继飞\t120766\n一个男\t120767\n舒淇\t120768\n轱辘\t120769\n不凡\t120770\n喵爱\t120771\n不出\t120772\n加息卡\t120773\nsfgdry\t120774\n的而\t120775\n爵迹\t120776\n乐PhoneP700赠机\t120777\n8865\t120778\n王一冰\t120779\n不减\t120780\n骗人我不相信你说的话\t120781\n平安夜快乐\t120782\n8868\t120783\n不准\t120784\n四五点钟\t120785\n滕志强\t120786\n大蛋糕\t120787\nm78星云\t120788\n好不死\t120789\n悛咝\t120790\n喝不起\t120791\n分手吧宝贝\t120792\nfddswerdfshd\t120793\n爱怜\t120794\n新会区平山小学\t120795\n小鲜\t120796\n宠物杀手\t120797\n扫描机\t120798\nDogfights\t120799\n灰兔\t120800\n白c堡\t120801\n阿尼哈撒幺\t120802\nyoutifindmyouto\t120803\nifhrgjr\t120804\n假军\t120805\n青红皂白\t120806\n猜放\t120807\n留驻\t120808\n埃塞俄比亚\t120809\n哈尔威\t120810\n窝小学\t120811\n残兵\t120812\n董秋兰\t120813\n崔雅楠\t120814\n汤小曼\t120815\n15个\t120816\n候生\t120817\n发观\t120818\n卢李晨\t120819\n依稀\t120820\n小三岁\t120821\n好嘛为您口干舌燥\t120822\n发觉\t120823\nyugguhvoh8vic\t120824\n唐海昂\t120825\n秘密宝贝\t120826\n长筒\t120827\n玫瑰色\t120828\n嘞谢\t120829\n日本核电站\t120830\n郭子皓\t120831\n贾有\t120832\n婶儿\t120833\n于江宁\t120834\n品牌自
创办\t120835\n孙峰\t120836\n啦喜阳\t120837\n大鸭梨\t120838\n15万\t120839\n苏继续\t120840\n心智的感\t120841\n取回\t120842\nexeaskamics\t120843\n浅灰玫瑰红\t120844\nhzggshg\t120845\n沒傘子\t120846\n十万元\t120847\n太空杯\t120848\n大拉不拉多\t120849\n祈使句\t120850\n虎皮\t120851\n145576\t120852\n304\t120853\n江淮瑞风\t120854\n中国动漫\t120855\n超跌反弹\t120856\n靠真\t120857\n正正ryjmm\t120858\naddress\t120859\n44亿方\t120860\n洁癖男\t120861\n这么远\t120862\n瓜儿\t120863\n上跑男\t120864\n請發\t120865\nyy你不要我了说别的花茶\t120866\n有有有有有有有有有有有有呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦呦\t120867\n呀叫\t120868\n这么近\t120869\n伤残\t120870\n鸡蛋清\t120871\n的者\t120872\n0.34元\t120873\n莱可尔秀\t120874\n吸毒者\t120875\n虐亡\t120876\n哇度秘你好牛掰我要爱上你\t120877\n实事儿行\t120878\n稿笑\t120879\nraise\t120880\n科大讯飞\t120881\nfdgl\t120882\n白开心\t120883\nsyGz\t120884\n進行\t120885\noooccji\t120886\n鱼露\t120887\n蜜袋\t120888\n杀头\t120889\n老大难\t120890\n产税\t120891\n凌虐\t120892\n抱一个\t120893\n容积\t120894\nfdgu\t120895\n奇遇记\t120896\n二周目\t120897\n够不够用\t120898\n毛概\t120899\n缓和\t120900\n自立\t120901\n127集\t120902\n马群\t120903\n话字\t120904\n能说一句\t120905\n度计\t120906\n苍老\t120907\n马羽\t120908\nLo\t120909\netude\t120910\n抱一下\t120911\n呢班车\t120912\n全国卷\t120913\n阿卜杜勒\t120914\n硫化钡\t120915\n就好短\t120916\n陈倩\t120917\nwvv\t120918\n幺零二三\t120919\n假若\t120920\nwvh\t120921\n侪Y1\t120922\n连身\t120923\n逆乾\t120924\n防卫战\t120925\n兵兵耍流氓云溪旅馆\t120926\nwva\t120927\nixoxufggf\t120928\nwvd\t120929\n嗯高梦\t120930\n55785554545555555\t120931\n健康天神金刚劲如风地兽金刚\t120932\n10级\t120933\nfretjjg\t120934\n约见\t120935\n大孩子\t120936\n两天前\t120937\n260只\t120938\n20000元\t120939\n世俗\t120940\n88喽\t120941\n死莉莉\t120942\n窝暗恋泥\t120943\n光带\t120944\n时间流\t120945\nsskks\t120946\ncf86\t120947\n张子璇\t120948\n雨滴落\t120949\n十多度\t120950\n应用题\t120951\n忙里忙\t120952\n你好多咪有哆啦a梦\t120953\n莱德杯\t120954\n陶乐\t120955\n辉鼓手\t120956\n七秒\t120957\n王集乡\t120958\n蔼蔼\t120959\n君权\t120960\n摩丝\t120961\n56789\t120962\n大惊易失态,\t120963\n武陟\t120964\n朱辛骁\t120965\nurto\t120966\n谢建华\t120967\n一路同行\t120968\n神色\t120969\n凭子\t120970\n開發\t120971\ngjggh\t120972\n打窝\t120973\n太啦喜羊羊\t120974\n史爱红\t120975\n5000年\t120976\n磨干\t120977\n嘉儿\t120978\n农副
产品\t120979\n无明\t120980\n大本笔\t120981\nEnEnEnE\t120982\n一字一\t120983\n一儿\t120984\nfffrffffgyythfftyrtyyyyryyryyy\t120985\nlall1l里lRPGYYhiYYtiger疼HK不UGG\t120986\n总理\t120987\n默沅\t120988\nmomasaaaa\t120989\n糊弄人\t120990\n七秀\t120991\n闫虹旭\t120992\n牛羊\t120993\n十八勒\t120994\n名流\t120995\n我喜欢行者\t120996\n真是的食不可及\t120997\n绝大多数\t120998\n13582345418\t120999\n222两千\t121000\n游行\t121001\n储量\t121002\n桃枝\t121003\n乍暖还寒\t121004\n拟怀\t121005\n不看\t121006\n独裁\t121007\n明梅花扑鼻香\t121008\n林先生\t121009\n一沓\t121010\n不真\t121011\n本人家\t121012\n惠惠\t121013\n度秘戏叫\t121014\n炕席\t121015\n尤尨\t121016\n老大徒\t121017\n邢艺扬\t121018\n不眠\t121019\n闹不勒\t121020\n翩翩起舞\t121021\n韩雨叶\t121022\n赠我\t121023\n黄片儿\t121024\n良我\t121025\n上年\t121026\n废话信\t121027\n侮骂\t121028\n太多太多太多\t121029\n猜产\t121030\n两40度\t121031\n攻略\t121032\ndgbmk\t121033\n评测\t121034\n赤峰道72号\t121035\n想你好\t121036\n头花脑\t121037\n33cocm\t121038\n不不我也是\t121039\n李雪静\t121040\n李晨曦\t121041\nxjjxjjxjj\t121042\n十大个\t121043\n练手\t121044\n不行啊滚滚滚滚\t121045\n湘江\t121046\n广重\t121047\n中村雪\t121048\n出风头\t121049\n不想和你说话不是我不想和你说\t121050\n半圆形\t121051\n大马站\t121052\ndidn\t121053\ndidi\t121054\nnsz\t121055\ndidk\t121056\n票据\t121057\n3一点儿\t121058\n顶点\t121059\ndidv\t121060\n呦西呦西\t121061\n八婆\t121062\n哈克木克土肯\t121063\n大西校\t121064\n毁化\t121065\n加多宝\t121066\n白羊羊\t121067\nshshhwhhsw\t121068\n给他说\t121069\n6号上午\t121070\n墨笔\t121071\n叫闹闹\t121072\n1600万欧元\t121073\n蹩脚\t121074\n帅攻\t121075\n别胡\t121076\n明和寒\t121077\n5477893257885555\t121078\nfeeeel\t121079\n萌萌萌哒\t121080\nTU秘\t121081\n天生我的肚子也有一个红心\t121082\n小蛋蛋\t121083\n红先进性\t121084\n丰田花冠\t121085\n欢天喜地七仙女\t121086\n传动\t121087\n微火收汁\t121088\ng8jjgt\t121089\n天涯何处无芳草\t121090\n嗟来\t121091\nNIVEA\t121092\n鸟网\t121093\n义火\t121094\n小魔方\t121095\n陈里予\t121096\njjjhhvhjbvcxv\t121097\n步木\t121098\n陈杨森\t121099\n独木舟\t121100\n手行\t121101\n大邑\t121102\n三三三\t121103\n该罪\t121104\n雨露\t121105\n康花新村\t121106\n舒肤佳\t121107\n雨霏\t121108\n唐亮\t121109\n1313131311\t121110\n大雄\t121111\n罪不至\t121112\n周笔畅梦想在继续微电影幕后花絮2\t121113\nOKIcan\t121114\n么意\t121115\n唐人\t121116\n连标\t121117\n拉美斯\t121118\n偿债\t121119\n手表\
t121120\n张天秀花园\t121121\n早晨6：30\t121122\n对焦\t121123\n情种\t121124\n就此\t121125\n就正\t121126\n别八\t121127\n棉服\t121128\n第8名\t121129\n心湖\t121130\ncome键\t121131\n嗯秘\t121132\n更年\t121133\n和你一样\t121134\n苏里\t121135\n急走\t121136\n更广\t121137\n别关\t121138\n一整片\t121139\n别克\t121140\n别先\t121141\n飞流直下\t121142\n荧光\t121143\n市政工程\t121144\n别元\t121145\n戏珠\t121146\nydfvgt\t121147\n我的乖乖\t121148\n候来\t121149\n嗯施多\t121150\n上心病狂\t121151\n缪喵喵喵\t121152\n4uit\t121153\ngigiifg\t121154\n落体型\t121155\n飞耶\t121156\n猜亚\t121157\n嗯哪度秘\t121158\n小屋\t121159\n王龙津\t121160\n黎美萱\t121161\n押注\t121162\n小屄\t121163\n小居\t121164\n军事类\t121165\n风景线\t121166\n朱再见\t121167\n解约金\t121168\n扁义词\t121169\n李碧华\t121170\n当妃\t121171\n最後\t121172\n一心点\t121173\n爱上我要\t121174\n小山\t121175\n杨明佳\t121176\n松树湾村\t121177\n校花的贴身高手\t121178\n猪猪猪猪猪猪猪猪猪一回家也收号楼哦亲爱的这并\t121179\n猪爪\t121180\ngvgghhjmm\t121181\n仔儿\t121182\n闭经\t121183\nVersace\t121184\n哦瓮\t121185\n夷为平地\t121186\n全能\t121187\nField\t121188\n钱琦淇\t121189\n非尔不可\t121190\n泥类\t121191\n单身一人\t121192\n长大再说\t121193\n525686n655869n666966\t121194\nGgffgvcfffhij\t121195\n靠天吃饭\t121196\n今夏才\t121197\n宽带号\t121198\n心心\t121199\n防备\t121200\n实现\t121201\nzom\t121202\nzoo\t121203\n阿颜\t121204\nzou\t121205\n不在你好\t121206\n葬礼\t121207\n为度\t121208\n红玉\t121209\n婆婆系\t121210\n进球\t121211\n左右脑\t121212\n一介书生\t121213\n落春\t121214\n华南东部\t121215\n七一翱强\t121216\nyettheidea\t121217\n一一一十一一一\t121218\nonev黑1900\t121219\n懒咯\t121220\n微名\t121221\n字圆\t121222\n警情\t121223\n研究生院\t121224\n588455898226565856699699\t121225\n有机咒\t121226\n真心不懂\t121227\n收视率\t121228\nboys\t121229\n1812238390\t121230\n乖徒儿\t121231\n名公孙玲珑\t121232\n王佳凤\t121233\n明发国际新城小区\t121234\n微吧\t121235\n货改\t121236\n中国梦想秀\t121237\nu西递宏村\t121238\n罗根\t121239\n18830446665\t121240\n刁孔鑫\t121241\n公函\t121242\n石膏\t121243\n发包\t121244\n晋察冀等哈\t121245\n芙蓉股骨头vdtxrr\t121246\n利锦飞\t121247\n行不理\t121248\n棒球棍\t121249\n财务章\t121250\n谢国跌\t121251\n不要紧\t121252\n服务员\t121253\n灏明\t121254\n一九九二\t121255\n拜仁切尔西\t121256\n神器号\t121257\n女大辫子\t121258\n2888888\t121259\n035553\t121260\n乖咯瘩\t121261\n李书福\t121262\n好在家\t121263\n我是你
的你是什么\t121264\n炼化\t121265\nvjvjcjhhxhxjxs\t121266\n打了死\t121267\n纵火客\t121268\n美女蛇\t121269\nvbudx\t121270\n快乐家族\t121271\n不要脸不要脸不要脸不要脸不要了不要来不要脸臭不要脸\t121272\n台历\t121273\n念琪\t121274\n萌萌萌萌哒别害羞一起唱萌萌萌萌哒\t121275\n冻面\t121276\nhxvxjdtggx\t121277\n小肚肚\t121278\n美国民主党\t121279\n必备\t121280\n于丁一\t121281\n成山二村\t121282\n半百\t121283\n乐橙\t121284\n陕西省新河学校\t121285\n三顿\t121286\n525888588886666688525588888\t121287\n三顺\t121288\n三项\t121289\n梁骁惠\t121290\n一个赞\t121291\n蜘蛛小姐\t121292\n叻仔\t121293\n天哪\t121294\n2849260050\t121295\n标枪\t121296\n乐气\t121297\n俩处一顿\t121298\nhhgfff\t121299\n么急小文\t121300\n红枫\t121301\n地笼\t121302\n脱颖而出影\t121303\n腋下\t121304\n542115678\t121305\n红枣\t121306\n万载\t121307\n水罐\t121308\n沃畅视\t121309\n市福利院\t121310\n快元旦快\t121311\n833\t121312\n830\t121313\n老子不拿我要我你要路费\t121314\n836\t121315\n红果\t121316\n三级ou\t121317\n麻烦\t121318\n四几本\t121319\n不干死\t121320\n漫画展\t121321\n芊芊\t121322\n西游记之大圣归来\t121323\nnonotk\t121324\n拿拉美比\t121325\n称王\t121326\n唐慧\t121327\n嗯哼哼哼哼个够我恨你我恨你我讨厌你度秘\t121328\n178嘿嘿\t121329\n今天三八妇女节\t121330\nBekor\t121331\n举世\t121332\n车斗\t121333\n宣德州\t121334\n民怨\t121335\n小魔仙\t121336\n点赞\t121337\n虎口\t121338\n多体要\t121339\n上爱\t121340\n23元\t121341\n当季\t121342\n甇\t121343\n我的蜜友\t121344\n作价\t121345\n哈哈哈我帅你最丑\t121346\n我不聊\t121347\n任爱林\t121348\n了别忘记\t121349\nTEACHER\t121350\n上爬\t121351\n缓解\t121352\n不乖赶\t121353\npfgsu\t121354\n1816公里\t121355\n感谢你的聊天\t121356\n石井\t121357\n在家里\t121358\n5点45到6点\t121359\n亲眼\t121360\n浙传\t121361\n师生通\t121362\n阿汉\t121363\n35花\t121364\n朱妃雅\t121365\n长队\t121366\n中国人寿\t121367\n关云长\t121368\n郎开朗\t121369\n邓琳\t121370\nwest\t121371\n喜唐\t121372\n站子\t121373\n差距\t121374\n284439\t121375\n阿汤\t121376\n呢人类\t121377\n阿汪\t121378\ns巴拉巴拉小魔仙\t121379\n列斯戈\t121380\n闯南走\t121381\nhttppinyincn1ISTCTZFqvi\t121382\n粉率\t121383\n丑人\t121384\n我叫候\t121385\n混声组\t121386\n换去\t121387\n太和\t121388\nphotos\t121389\n自恋狂度秘\t121390\n伦敦希斯罗机场\t121391\ngged\t121392\n肌肤\t121393\n排斥\t121394\n丧尸片\t121395\n三藏\t121396\n量能\t121397\n27号\t121398\n乡村爱情故事\t121399\n宴请\t121400\n龙鱼\t121401\nsingle\t121402\n应是\t121403\n沈煜浩\t121404\n派人\
t121405\n丑事\t121406\n肌肉\t121407\n27只\t121408\n前朝\t121409\n前期\t121410\n性淫\t121411\n满天飞\t121412\n所群\t121413\n昕薇\t121414\n雏田\t121415\n莫妮卡\t121416\n出殡\t121417\n唐嫣\t121418\n銀\t121419\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t121420\n人间路\t121421\n好找\t121422\n首董\t121423\n出去吧\t121424\n手工艺品\t121425\n闲云事故不遑启居猃狁之故百威采薇薇亦柔止曰归曰归心亦忧止忧心\t121426\n望天\t121427\n好手\t121428\n只限\t121429\nafdf\t121430\n合家快乐\t121431\n一个呗\t121432\n做我不想\t121433\n毒虫\t121434\n七十四三十五一百\t121435\n卡片式\t121436\n外遇\t121437\nibuc\t121438\n没有了麻烦你\t121439\n灌木\t121440\n枣庄影视基\t121441\n火恩\t121442\n132972355555\t121443\nzy108\t121444\n外道\t121445\n地方人\t121446\nQQ2903848172\t121447\n捐钱\t121448\n回音\t121449\n鲁棒\t121450\n龙珠\t121451\n蓝天\t121452\n气色\t121453\n滴安\t121454\n张召忠\t121455\n88887\t121456\n88886\t121457\n口牙\t121458\nXVN\t121459\nXVB\t121460\n88889\t121461\n88888\t121462\n十二月\t121463\n519008\t121464\n吐鲁番\t121465\n顽童\t121466\n掐死\t121467\n冒雨\t121468\n咪龟兔\t121469\n衣服套\t121470\n小了人\t121471\n我告诉他\t121472\n十二本\t121473\n色儿\t121474\n90多分\t121475\n党性学\t121476\n唱响\t121477\n扑尔敏\t121478\n了行不\t121479\n仿照句\t121480\n恶狼嘴\t121481\n听话知道\t121482\nHiMaureen\t121483\n用钱包\t121484\n这货\t121485\n有前途\t121486\n王坤\t121487\n黄俊\t121488\ntgdf\t121489\n樊云飞\t121490\n对呀你猜猜我\t121491\n早睡着\t121492\n额撸撸\t121493\n王坚\t121494\n安之若素\t121495\n听爱\t121496\ndseafeegd\t121497\ngbgbg\t121498\nhttpbhiphotosbaiducomxiaodupicitemeaf81a4c510fd9f916fae61b222dd42a2934a4d7jpg\t121499\n玉石俱焚\t121500\n奔波\t121501\n好想吃苦\t121502\n华蓥\t121503\n图报\t121504\n超级大坏蛋机器人\t121505\nhfffv\t121506\n有感觉\t121507\n形容词\t121508\n孙俪\t121509\n真博学\t121510\n捉鳖\t121511\n争抢\t121512\n吴丽红\t121513\n久坐\t121514\n才度\t121515\n那一边\t121516\n仙济公活佛玉香格格\t121517\n猪婆们\t121518\n老公您是我的好宝贝哈\t121519\n孙俊\t121520\n知道\t121521\n学步鞋\t121522\n大乐嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t121523\nMan\t121524\ntf波伊斯\t121525\n11479\t121526\n功率\t121527\n新华保险\t121528\n五军\t121529\n章鱼输入法\t121530\nAGB\t121531\n葵阿kj\t121532\n超强小受来了\t121533\n售票点\t121534\n郭建飞\t121535\n通江\t121536\n傻子弹\t121537\n灯光\t121538\n小机器人我来了\t121539\n左小刚\t121540\n济南市\t121541\n精灵梦\t121542\n苏欣\t121543\n智斗\
t121544\n110.79亿元\t121545\n空号\t121546\n家出来\t121547\n摸吧摸\t121548\n9542\t121549\n皮炎平\t121550\n王祖儿\t121551\n郑召利\t121552\n东门\t121553\n超级难超\t121554\n恐鬼症\t121555\nDELL\t121556\n排费\t121557\n999元\t121558\n维克多\t121559\n害不死\t121560\n一个周\t121561\n外孙\t121562\n说起来\t121563\n单反热靴\t121564\n欺负\t121565\n一一平方米\t121566\n华润\t121567\nMbnhgq\t121568\n报警器\t121569\n7点六\t121570\n你的你的你的呢删除了你在\t121571\n李静轩\t121572\n程杰文\t121573\n妻徒刑\t121574\n答复我爱你\t121575\n忍让\t121576\n杨磊\t121577\n朱利安\t121578\n剛好護\t121579\n袁哥\t121580\n合算\t121581\n段明凯\t121582\njugmp\t121583\n圆环\t121584\n宗教\t121585\n13975288749\t121586\n乖我大地\t121587\ngfygd\t121588\n么个头\t121589\n不介\t121590\n超前\t121591\n不仅\t121592\n50岁\t121593\n点多才来\t121594\n521ri\t121595\n800003\t121596\n快乐看看\t121597\n瓯北\t121598\n解酒\t121599\n不份\t121600\n运营会\t121601\n陈世国\t121602\n产品化\t121603\n莽撞\t121604\n身安\t121605\n追击\t121606\np天天片\t121607\n乱撞\t121608\n真小贤\t121609\nbubble\t121610\n留血\t121611\n保持距离\t121612\n火电\t121613\n快进\t121614\n北京师范大学\t121615\nCODE6649\t121616\n8号\t121617\n8台\t121618\n咸鱼\t121619\nmh370\t121620\n棒球帽\t121621\n家長\t121622\n幸运日\t121623\n初见雏型\t121624\nlwgd\t121625\n家长\t121626\n佃农王\t121627\n师生\t121628\n上不了网磨\t121629\n江南园林\t121630\nfhdgjfd\t121631\n439943994399\t121632\n陈常月\t121633\n哪度秘\t121634\ndirty\t121635\n心理诊所\t121636\n哨声哨声哨声我要三好生\t121637\nuip\t121638\n非分\t121639\n制秘我可以\t121640\nEfgm\t121641\n重担\t121642\n巴萨罗那\t121643\n统驭\t121644\n30号\t121645\n巴不得\t121646\n重拍\t121647\n30只\t121648\n特卖女\t121649\n我雪\t121650\n母题\t121651\n吼吼\t121652\n沈昆鹏\t121653\n装吧\t121654\n┳━━┛n　┏┫　
┣┓\t121655\n一个两年\t121656\n小G们\t121657\n2736点\t121658\n省事儿\t121659\n来句\t121660\n光催\t121661\n多奥城\t121662\nLeica\t121663\n10半\t121664\n1441\t121665\n1440\t121666\n脂肪\t121667\n康乃如\t121668\n全画幅大光圈变焦镜头\t121669\n二零八八\t121670\n格茨温德尔\t121671\n部门儿\t121672\n一岁\t121673\n截住\t121674\n摸胸干嘛\t121675\n留留\t121676\n溝通\t121677\n士信\t121678\n花底\t121679\n花店\t121680\n史策先\t121681\n手术费\t121682\n给力星期天\t121683\n玩校\t121684\n焦裕禄\t121685\n秒删\t121686\n你好美撒晓得窝\t121687\n音乐世界\t121688\n撕破\t121689\n快快达\t121690\n144k\t121691\n巴尔达诺\t121692\n走嘞\t121693\n吾星葡\t121694\n猪儿小头爸\t121695\n与日俱增\t121696\n艾伦图灵\t121697\n两三千\t121698\n油率\t121699\n话音\t121700\nyyppu\t121701\n磨厚\t121702\n我的喜欢\t121703\nintentIntentSK11714776654C52350D0E455F760032FB69E03D93ED77C0988A97A63A0D916A4B6911E2030end\t121704\n异质\t121705\n它那\t121706\n野鬼\t121707\n崇明\t121708\n窃读\t121709\n回度\t121710\n王猛\t121711\n姓刘干嘛\t121712\n山装\t121713\n我是谁不重要可是你是我的宝你是我的秘书\t121714\n王猫\t121715\n16十三二百八十\t121716\n王猩\t121717\n法制\t121718\nuvcfggfxf\t121719\n所能\t121720\n喜欢你哪儿\t121721\n莱芜\t121722\n好桑\t121723\n冷傲\t121724\n抛筹\t121725\n溺死\t121726\n33833373\t121727\n冷吗卡撒玛希隆嘎嘎嘎嘎嘎\t121728\n专情\t121729\n嗯害羞\t121730\n光明口乡\t121731\nand泰国nichkhun\t121732\n4625613\t121733\n拉扎斯菲尔德\t121734\n月咏uu\t121735\n回府\t121736\n以后再说\t121737\n岔街\t121738\n叶念探\t121739\n叮当猫\t121740\ngghhggfnggggBBBnggfngfggfBBS\t121741\n霍老师\t121742\n#5\t121743\n通缉\t121744\n韩蓓蕾\t121745\n我的派\t121746\nfglpp\t121747\n厦门市\t121748\ndayon\t121749\n27272\t121750\ngabaaaa\t121751\n打废\t121752\n好脑\t121753\n3q个\t121754\nnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t121755\n包扎\t121756\n仗剑\t121757\n7649764664673497664664￥\t121758\n秋裤叔\t121759\n舟舟\t121760\n脑电波\t121761\n嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎蹦沙嘎拉嘎东芝\t121762\n地形\t121763\n骆艳羽\t121764\n真心赞\t121765\n包打\t121766\n苹果汁\t121767\n犯法\t121768\n15555555555555\t121769\nmultiair\t121770\n儿童文学\t121771\n麻烦勒\t121772\n地彗\t121773\n八八六九五九三七\t121774\n好脸\t121775\n好看不好看\t121776\n杜淳\t121777\n谁你妈妈是谁你的妹妹是谁你\t121778\n苹果妹\t121779\n病入膏肓\t121780\n绚烂\t121781\n爱缘\t121782\n尿裤\t121783\n瞩目\t121784\n为嘛为\t121785\n回床\t121786\n草包饭\t121
787\n徐玉燕\t121788\n白日做梦\t121789\n蕃薯\t121790\n欧陆\t121791\n樗散衰儒不晓机\t121792\n老不好找\t121793\n零幺零零幺零\t121794\n眀星\t121795\n王传民\t121796\nh漫\t121797\nfipy\t121798\n圣西罗#\t121799\n丫环\t121800\n樱花\t121801\n威远行\t121802\n激恼\t121803\n鞭打\t121804\n毒舌女\t121805\n悲鸣曲+蓝志13大经典语录+蓝志\t121806\n青笋\t121807\n内木\t121808\n早点睡嘛\t121809\n莱斯特美妞\t121810\n谭帅鹏\t121811\n吴晗菡\t121812\n必要性\t121813\n别心\t121814\nfjif\t121815\n病情\t121816\n刘思妤\t121817\nLumia\t121818\n死忠秘\t121819\n处女朋友\t121820\n别忘\t121821\n别忙\t121822\n随着\t121823\n小算盘\t121824\n心安理得\t121825\n银家豆腐\t121826\n3456\t121827\n吴金贵\t121828\n这家伙约\t121829\n我的祖宗十八代\t121830\n陈朝盛\t121831\n苏教\t121832\nffvfvrifegfh\t121833\n别念\t121834\n秘重出\t121835\nyes1t\t121836\n扭矩\t121837\n2.3cm\t121838\nqrst\t121839\n嵩县\t121840\n学海\t121841\ndfyt\t121842\n会导致\t121843\n俩一个\t121844\n王羲之\t121845\n暗无天日\t121846\n飞越\t121847\nbian\t121848\n非雄\t121849\n孤身一人\t121850\n翠花儿\t121851\n先诚\t121852\nx2age\t121853\n吕建伟\t121854\n9月13日下午\t121855\nAdidas\t121856\n非雾\t121857\n幸运人\t121858\n爱子若渴\t121859\n步步度\t121860\n于金洋\t121861\n在见\t121862\n海带\t121863\n61th\t121864\n孃孃孃孃孃孃孃孃孃孃\t121865\n傲世波斯猫\t121866\n学行车\t121867\n我喜欢兔女郎你有\t121868\n来事大了\t121869\n相信相信\t121870\n灰灰灰灰灰灰我要啥哪里我要看灰姑娘三我要看灰姑娘山\t121871\n孟令柔\t121872\n我真的爱你度秘么么达\t121873\n关女\t121874\n赵小坡\t121875\n奥迪a4\t121876\n跪拜\t121877\n脑呆\t121878\neklankaustedulah\t121879\n制订\t121880\n灵动蹦蹦兔\t121881\n蛋挞\t121882\n384tlan\t121883\n好啦好啦我不恨你\t121884\ndykhx\t121885\n最高点\t121886\n漏眼\t121887\n雪圈\t121888\n第一节\t121889\nyoumeyoume\t121890\n疯了你好无聊\t121891\n神奇度秘\t121892\n立正\t121893\nuquququququuququququuuquqm\t121894\njf6h\t121895\n感恩\t121896\n正月初一\t121897\n峨眉山\t121898\n芭比芭比吾尔朵\t121899\n正月初七\t121900\njiang\t121901\njiane\t121902\n44只\t121903\n我是男神\t121904\n中企\t121905\n这呢\t121906\ngostosinolat\t121907\n雪地\t121908\nAwards\t121909\n北京青年戏剧节\t121910\n猜夜\t121911\nhjvvhbih\t121912\n皇贵妃\t121913\n欧贝沙\t121914\n趁饭\t121915\n相恋\t121916\n红尘们\t121917\n东北路\t121918\n无邪\t121919\n在一起谷\t121920\n78868679794977994\t121921\n懂嗎\t121922\n颖儿\t121923\nJedi\t121924\npaper\t121925\n通天塔\t121926\n度秘干嘛你塞熊
出没之国宝雄兵\t121927\n找卖\t121928\n深秋\t121929\n一天24\t121930\n布卡\t121931\n圆的方\t121932\n今日清晨6点42分\t121933\n648-13号\t121934\n二二零一八年\t121935\n军博仕\t121936\n咪咪咪你就是说心安\t121937\n经久耐用\t121938\n晦堂\t121939\nhxuxn\t121940\n旺火\t121941\n概念性\t121942\n蚕茧\t121943\n36236069\t121944\nSeeyou\t121945\n侗族\t121946\n7呵\t121947\n最好是\t121948\n20120531\t121949\n480美元\t121950\n看见了介\t121951\n未及\t121952\nLomo\t121953\n15454548725368\t121954\n乱侃\t121955\n大振\t121956\n644346467\t121957\n未变\t121958\n265545585565555555088555\t121959\n表演\t121960\n哦了额咳咳咳额\t121961\n荸荠\t121962\n别再发\t121963\nkfd\t121964\n张德丽\t121965\n多咪多咪多\t121966\n干爹\t121967\n摸底\t121968\n长期\t121969\n作伴\t121970\nkfc\t121971\n活到\t121972\n靠背\t121973\n一代\t121974\n110401\t121975\n开口\t121976\n一们\t121977\nrgythh\t121978\n300ml\t121979\n测测\t121980\n李嘉图\t121981\n一件\t121982\n小悠哈\t121983\n八个多小时\t121984\n拜拜目送\t121985\n百度作业帮\t121986\n放量\t121987\n一份\t121988\n晚上11点30\t121989\n一任\t121990\n你好天真\t121991\n君影\t121992\nmupiior\t121993\n一仆\t121994\n17415157\t121995\n全写\t121996\n比熊\t121997\n一今\t121998\n1251111111\t121999\n一仗\t122000\n七十条电信局\t122001\n爸长\t122002\n一仒\t122003\n城vv\t122004\n开发\t122005\n脱成名\t122006\n老梁内\t122007\n各就各位\t122008\n李天雨\t122009\n牙石\t122010\n出克\t122011\n安错\t122012\n北乡\t122013\n产者\t122014\nzxcvj\t122015\nnnnnnnnnnnnnnn拜nnnnnn察\t122016\n我的爱犬\t122017\n五二十四点\t122018\n冫301\t122019\n茴香酒\t122020\n江苏卫视远胜东方台\t122021\nDykeffukhyFkjfee2ffrewwwwj\t122022\n至天亮\t122023\n文化交流\t122024\n出入\t122025\n不必了\t122026\n漯河\t122027\n2017年元旦\t122028\n出关\t122029\n合资\t122030\n芦芽山国家级自然保护区\t122031\n出具\t122032\n第五十\t122033\n月初\t122034\n外皮\t122035\n657316175617256356164576120\t122036\n黑灰\t122037\n答径\t122038\n爱斯基摩人\t122039\n唔觉\t122040\n7000M\t122041\n周梦娇\t122042\n横行\t122043\n等一句晚安\t122044\n李大恒\t122045\n张璐\t122046\n答得\t122047\n62公里\t122048\n黑灯\t122049\n13450277833\t122050\na4a\t122051\n横表\t122052\na4l\t122053\n笑不想\t122054\n你好神\t122055\nFace\t122056\n一声儿\t122057\n呀骂\t122058\n11点03分\t122059\n这语气\t122060\n回避性\t122061\n走马观花\t122062\n98块\t122063\n挤满\t122064\n迎面\t122065\n阿良\t122066\n挂羊头\
t122067\n缘份\t122068\nwoutit\t122069\n雷莎\t122070\n五百面\t122071\nBugh\t122072\n12345678906859\t122073\n水土不服\t122074\nCZ3153\t122075\n软助\t122076\n126580933\t122077\n77.94\t122078\n建筑语言\t122079\n鱼米之乡\t122080\n4408832000001251234\t122081\n夫唱妇随\t122082\nffgd\t122083\n张靓颖\t122084\n萨萨呀\t122085\n每每个月\t122086\n城东实验小学\t122087\na41\t122088\n3bcdesa\t122089\n院校\t122090\n家王\t122091\n测试题\t122092\n可笑\t122093\n干活句\t122094\n老公我爱你我想说我爱你\t122095\n302页\t122096\n蜜月度\t122097\n花轿\t122098\n良不好听\t122099\n装箱\t122100\n多想不到\t122101\n港星万\t122102\n桑塔纳\t122103\n咪呀\t122104\npoitdvouv\t122105\n小e\t122106\n小k\t122107\n小i\t122108\n普来\t122109\n猪么猪大肠\t122110\n柯林斯词典\t122111\n咪味\t122112\n小S\t122113\n小Q\t122114\n我在想你哦你在哪里哦生活很烦恼不再说话了听我唱歌哦你\t122115\n6月14日下午五点\t122116\neeeeeee\t122117\n4.5折\t122118\n萌君\t122119\nsouka\t122120\n来不到\t122121\n有关系\t122122\n瞅摸\t122123\n拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜\t122124\n国美\t122125\n烤火\t122126\n小K\t122127\n功绩\t122128\nfhx\t122129\n明天早上七天\t122130\n桥妇炎洁\t122131\n女骇\t122132\n小动员\t122133\n小8\t122134\n芳苗\t122135\nshenmeyiis\t122136\n反应快\t122137\n3x\t122138\nＳＡ\t122139\n小麦色\t122140\n刘大萌\t122141\n陈漪璇\t122142\nOTL\t122143\n娜狗\t122144\nOTA\t122145\n田渊博\t122146\n彭帅\t122147\nOTZ\t122148\n思念型\t122149\n游人\t122150\n认不到\t122151\n蝴蝶了蓝\t122152\n不结婚下\t122153\n桂柳话\t122154\n小何丽\t122155\n王姑\t122156\n荷华\t122157\n2011年6月\t122158\n咔咔咔啊拉\t122159\n七手八脚\t122160\n中心话\t122161\njtmp\t122162\n3u\t122163\n47吨\t122164\n空车\t122165\n病毒性脑炎\t122166\nvbnuc\t122167\nurur\t122168\n流眼泪\t122169\n阳奉阴\t122170\n清装\t122171\n葫芦\t122172\n6.1-升\t122173\n个板\t122174\n派尔\t122175\n两周\t122176\n亲密无间\t122177\n萌萌大哒\t122178\n123666869617\t122179\nurud\t122180\nurue\t122181\n王姬\t122182\n爱文\t122183\n糖芋苗\t122184\n慢半拍\t122185\n可啪\t122186\n运营\t122187\n56544455\t122188\n泳姿\t122189\n王艺儒\t122190\n零零卡\t122191\n不知不知\t122192\n6000万\t122193\n世界政府\t122194\n好吧睡觉吧\t122195\nhttphhiphotosbaiducomxiaodupicitem64380cd7912397ddb617e0705e82b2b7d0a28753jpg\t122196\n地步钱\t122197\n瓜子儿\t122198\n吃脸\t122199\n女骚\t122200\n888585555\t122201\navtt恤\t122202\nTrisiai\t122203\n太空拍喜马
拉雅山冲破云层.图\t122204\n淘金币\t122205\n九分之十\t122206\n不存在\t122207\n萌呆小猫咪\t122208\n闲趣\t122209\n空气质量\t122210\n65个\t122211\n318\t122212\n有信心\t122213\nGaffhuffy\t122214\n313\t122215\n312\t122216\n好了你没有骗我行\t122217\n310\t122218\n317\t122219\n315\t122220\n314\t122221\n毫安\t122222\n不看病\t122223\n6759565591591\t122224\n曾虹霖\t122225\n闭眼关灯\t122226\n财产权\t122227\n李亚鹏\t122228\n亚龙湾\t122229\n8549595982857988\t122230\n第十遍\t122231\n党浩萌\t122232\n8级\t122233\n去忆\t122234\n出示\t122235\n圆规\t122236\n张宇斌\t122237\n你好度秘你\t122238\n抱着走\t122239\n寻找爱的梦想\t122240\n苦鹿\t122241\n英特尔\t122242\n3586982384255\t122243\n杨洋金\t122244\n王振义\t122245\n山顶上\t122246\n嗯库音\t122247\n100010000000000000000000000100000000\t122248\n啊度秘\t122249\n叫引力\t122250\n天真的人\t122251\n约和\t122252\n血脉\t122253\n陆胜民\t122254\n较好\t122255\n长江七号\t122256\n弄瘦\t122257\n时人\t122258\nzr52k\t122259\n你的偶象\t122260\n非正统\t122261\n打岔\t122262\n5555555555555555555555555\t122263\n装傻\t122264\n那行的夏天天空中繁星天天\t122265\n毛尾巴\t122266\n娱乐园\t122267\nk氽怜\t122268\n扣款\t122269\n回升\t122270\n葛毛宁\t122271\n侯倩\t122272\n斯柯达\t122273\nWilde\t122274\n我一个小小小小小小小小鸟\t122275\n打岁\t122276\nfuckyou\t122277\n13569095702\t122278\n钱不钱\t122279\nfygbh\t122280\n3月10号\t122281\n回单\t122282\n魔纤瘦\t122283\n传染\t122284\n恐怖生物主题\t122285\n武林绝学\t122286\nDEMO\t122287\n直率\t122288\n胸小\t122289\n两腿之间\t122290\n精简\t122291\n还假\t122292\n台乐\t122293\n886记着明天见\t122294\n文山\t122295\n2008万块\t122296\n了别忘\t122297\n我班\t122298\n喝一口\t122299\nHikjjho\t122300\n从笑\t122301\n姐夫妻\t122302\n零几\t122303\n罗丽琼\t122304\n耀武扬威\t122305\n谢大人\t122306\n失眠者\t122307\nsiohappy\t122308\nnro\t122309\n养肥\t122310\n重叠\t122311\n重口\t122312\n家风\t122313\n39\t122314\nhjyd\t122315\n天天三笠\t122316\n养育\t122317\n恩替\t122318\n2月7日\t122319\n爬起来\t122320\n度秘酱\t122321\n孔老\t122322\n5565555\t122323\n帝豪\t122324\nDangerousrose\t122325\n余哼哼\t122326\n智斗丈母娘\t122327\n地话\t122328\nk256\t122329\n雕塑家\t122330\ncc式\t122331\n来来来了\t122332\n重变\t122333\n重发\t122334\n医说春4839\t122335\n31\t122336\n养肝\t122337\n重取\t122338\n不可帅\t122339\n二。二\t122340\nkmnnhjjws\t122341\n3x4x6x六零\t122342\n钟佳英\t122343\n37\t1223
44\n记点\t122345\n一小撮\t122346\n装呆\t122347\n薛瑞麒\t122348\n猴猴\t122349\n35\t122350\n处女度\t122351\n处女座\t122352\n蔡刘\t122353\n郭子康\t122354\n明早\t122355\n腐漫\t122356\n你好吗忙\t122357\n心地真好我讨厌你还是喜欢我我真心的好喜欢你\t122358\n\t122359\n林希骏\t122360\n清华大学经管学院\t122361\n燕赵\t122362\neggceudcjfvheifyfdqfyxuEOCUBVFYDYSB\t122363\nhttpahiphotosbaiducomxiaodupicitemfc1f4134970a304e652dfa89d6c8a786c9175cb1jpg\t122364\nXxvhhccvn\t122365\n零一金毛\t122366\n好你活过今天你是我爸爸\t122367\n刘清伟\t122368\n5457578\t122369\n建群\t122370\n土风舞\t122371\nThanKyou\t122372\n洋溢\t122373\n本人\t122374\nTasteV\t122375\n熹掌\t122376\n同一战线\t122377\n陈嘉桓\t122378\n打字儿\t122379\n八百次\t122380\nt5tdadajmjngdtdk\t122381\n五十遍\t122382\n笨惯\t122383\n互相\t122384\n本事\t122385\n下半辈子\t122386\n十八个月\t122387\n克莱蒙费朗\t122388\n55微克\t122389\n闻荷香\t122390\n5…5\t122391\n五十道\t122392\n木索\t122393\n黑眼珠\t122394\n这咱\t122395\n50下\t122396\n东安体育坪\t122397\n50万\t122398\n晕晕晕晕晕晕晕晕晕\t122399\n欣然\t122400\n愉快\t122401\n董芯睿\t122402\nO8\t122403\nO6\t122404\n泰西\t122405\n小安\t122406\n夸奖\t122407\n先飞\t122408\n逆连着\t122409\nSUPERJUNIOR\t122410\nyusiahgsgs\t122411\n50个\t122412\n人度度\t122413\n邯郸市\t122414\n紫丁香\t122415\n首次\t122416\n首款\t122417\n滚我就喜欢\t122418\n林品岑\t122419\nparents\t122420\nsggdfd\t122421\n亲不亲\t122422\nghvdg\t122423\n乱爱\t122424\n11月16日\t122425\n这咋\t122426\nOo\t122427\n妈妈会\t122428\nOm\t122429\nOk\t122430\nOh\t122431\n设计\t122432\n我的生\t122433\n香惜玉\t122434\nXabdkd\t122435\n肯德\t122436\nOv\t122437\nOs\t122438\ndling\t122439\n睡不好觉\t122440\nOO\t122441\nON\t122442\n猥琐版\t122443\nOL\t122444\nOK\t122445\n1234567856783234567\t122446\nOI\t122447\nOH\t122448\n分辩\t122449\n分辨\t122450\n飞鱼\t122451\n123589842688\t122452\nOB\t122453\nOA\t122454\negCFDCFDf反反复复\t122455\n56806508\t122456\n涟菁\t122457\n名星\t122458\n刘夏郎\t122459\nOV\t122460\n儿童者\t122461\nOS\t122462\nOR\t122463\n奈何\t122464\nOP\t122465\n萧敬腾\t122466\ngjxjzugjh\t122467\n无人接听\t122468\n一千九一份\t122469\n300300400500600700800900\t122470\n构思\t122471\n13987606258\t122472\n墨铃\t122473\n客服台\t122474\n蓝思\t122475\n周六龙\t122476\n追上去\t122477\nahera\t1
22478\n画圈\t122479\n耐热\t122480\n握拳那你先那你跟谁\t122481\n二二五四\t122482\n耐烦\t122483\n塑令\t122484\n百货公司\t122485\n鼎汉技术\t122486\n电暖\t122487\n三十三十三百三百六千六百\t122488\n提前还款信用额度\t122489\n时效\t122490\n丰庆路\t122491\n满腔\t122492\n摘抄\t122493\n打单杆\t122494\n菜篮子\t122495\n在儿\t122496\n徐思雅\t122497\n维和\t122498\n五大一五\t122499\n曹曼\t122500\n兄弟姐妹\t122501\nkwvgeicdi\t122502\n曹曦\t122503\netcvdjg\t122504\n855554088766\t122505\n曲风\t122506\n失礼\t122507\n牙缝\t122508\n∝子百为漸镟\t122509\njyhkthgnzf\t122510\n谢谢谢你\t122511\n一头鲸\t122512\n有纪元\t122513\n哈求\t122514\n贸摩托\t122515\n立帅\t122516\n新马服装集团\t122517\n一百拳\t122518\nhgputon\t122519\n孙重庆\t122520\n小舞子\t122521\n午安元\t122522\n自负自负\t122523\n一转身\t122524\n不爱不爱\t122525\n兰亭序\t122526\n天当当当\t122527\n精华水\t122528\n卖一个我听听\t122529\n杨子铁\t122530\ninto\t122531\n45555555\t122532\n呢意林\t122533\n1角\t122534\n空翻\t122535\nｅｓｔｈｅｓ\t122536\n联防\t122537\nRING\t122538\n笑靥如花\t122539\n伽菲伽\t122540\nbiok\t122541\n黄石\t122542\nbiop\t122543\n吃乜\t122544\n联队\t122545\n大桥峡\t122546\n胡德生\t122547\n淅淅沥沥\t122548\nluiloveu\t122549\n聊一聊天\t122550\n徐楠\t122551\n最有效\t122552\n有所以说\t122553\n压盖\t122554\njmagpjamg\t122555\n郭士雨\t122556\nya战队\t122557\n首条\t122558\n招手\t122559\n腿短\t122560\n苏达\t122561\ndifugjfid\t122562\n49870284\t122563\n争斗\t122564\n触感\t122565\n七一宝贝\t122566\n高点就好\t122567\n手机卡\t122568\n李小波\t122569\n瓦妮达\t122570\n三山楂酸\t122571\nGfhhhgfhhh\t122572\n95啪啪\t122573\nIgchj\t122574\n名佳鑫\t122575\n病毒\t122576\n辐琢\t122577\n饶秘\t122578\n妇淑芬\t122579\n一个112\t122580\n外管局\t122581\n招找\t122582\n蹲钉\t122583\n第二张\t122584\n费工\t122585\n药流\t122586\n清明白\t122587\n三十套\t122588\n胆大\t122589\n商桥\t122590\n陈思成\t122591\n马德兴\t122592\n潘囍\t122593\n考試\t122594\ntfoys\t122595\n太阳能\t122596\n美芳\t122597\n2016款\t122598\n9885888\t122599\n梓童\t122600\n我的汤姆猫\t122601\n费玉清\t122602\n你等你等你等你等你的你的你的你的你的你的你\t122603\n激赏\t122604\n地位\t122605\nesxghj\t122606\n4月7日\t122607\n皮耶旭霞\t122608\n花花花\t122609\n几耐\t122610\n药浴\t122611\n真不好\t122612\n管瑞\t122613\n秘女头\t122614\n565589855655558550\t122615\n琉球\t122616\n餐盘\t122617\n减肥\t122618\n煮汤\t122619\n通话分钟\t122620\n东北嘞\t122621\n97点五\t122622\nHu
Lun\t122623\n杰妮\t122624\nwmdj\t122625\nK一1\t122626\n前进\t122627\nAdgjjggfohw\t122628\n煮汁\t122629\n付文芳\t122630\npanese\t122631\n兵装集团\t122632\n安哒哒\t122633\n想贷\t122634\n前述\t122635\n版头\t122636\n锤炼\t122637\n梦妆\t122638\n欲罢不能\t122639\n送货小心\t122640\n两头一边\t122641\n倒霉蛋\t122642\n死不投降\t122643\n抄录\t122644\nwiege\t122645\n当前\t122646\n一百二十九\t122647\nhttpfhiphotosbaiducomxiaodupicitem9825bc315c6034a85b9d1df2cc13495409237641jpg\t122648\n西飞妍雪\t122649\n肖雅茹\t122650\n脆弱\t122651\n过去的日子\t122652\nvqlf\t122653\n弗莱彻\t122654\n乜你\t122655\n14元\t122656\n童童生\t122657\nlowusus\t122658\n切皮\t122659\n张红梅\t122660\n化学元素周期表\t122661\n咯气\t122662\n事调\t122663\n醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了对了对了对了对了对了对了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉了醉\t122664\nnhhhhhb\t122665\n洪江\t122666\n阴德\t122667\n585555\t122668\nWhatswrong\t122669\n个人儿\t122670\n我不了了我要去上啊上我奶奶\t122671\n我恨你我恨你恨你恨我我讨厌你我讨厌你我再也不想见到你了我\t122672\n钱怪\t122673\n人种\t122674\n一飞扬\t122675\n苗香王\t122676\n学长姐\t122677\n哪个秘\t122678\n读一读\t122679\n人称\t122680\n等稍等\t122681\n帅你丑你帅你丑你帅你丑\t122682\n四颗星\t122683\n王庄村\t122684\n洁丽\t122685\n除魔传奇\t122686\n2012WhyMe\t122687\nFUGH\t122688\n摄像头\t122689\n触控\t122690\n了玩\t122691\n凯弟兄\t122692\n我是你爹噶\t122693\n沁阳\t122694\nfywtio\t122695\n生命之源\t122696\n衡量\t122697\nYtiicigh\t122698\n淫叫\t122699\n翻转\t122700\n死肥猪\t122701\n内含\t122702\n翻车\t122703\n202104\t122704\n963563214\t122705\njdjdddi\t122706\n保险套\t122707\n第十棵\t122708\n你看我如下只百变联萌啊萌\t122709\n梁佳慧\t122710\n浪母罗\t122711\n朱凌翔\t122712\n耶伦\t122713\n尸尸\t122714\n路人们\t122715\n好评亲\t122716\n妈妈嗯那克美美美羊羊黑\t122717\n您母乳房管局\t122718\n四戒松\t122719\n那个男孩\t122720\n139950299\t122721\nguamg\t122722\n大望京\t122723\n崔子千\t122724\n二月份\t122725\n感言\t122726\n鲜见\t122727\n充爆炸\t122728\n14358863\t122729\n十六元\t122730\n凤凰男\t122731\n天生以后\t122732\n张咏发\t122733\n鹿育硕\t122734\n双胜河矿业有限公司\t122735\n9：19\t122736\n书价\t122737\n吴雨凡\t122738\n好天国\t122739\n狠高兴\t122740\n嗯angelababy\t122741\n恒埔头\t122742\n赵一航\t122743\n附注\t122744\n治好呀\t122745\n平凡\t122746\n厂房\t122747\n111sk\t122748\n天上掉下一个猪八戒\t122749\n鱼子酱\t122750\ndzthchhhc\t122751\n使馆\t12
2752\n大了你太萌萌哒了我和我的闺蜜也萌萌哒\t122753\n上海通用汽车\t122754\n好笑哈\t122755\n电量\t122756\n团办\t122757\n而是且\t122758\n1月多\t122759\n平凉\t122760\nbdjxbdjd\t122761\n周建康\t122762\n蜗牛从一下专晚上爬白天秦时明月\t122763\n流汒\t122764\n真是的不想和你说话\t122765\n流汗\t122766\np型蛋\t122767\n东北部\t122768\n咪臭猪\t122769\n46482739\t122770\n租房子\t122771\n闵行\t122772\n金来度\t122773\n医闹的眼\t122774\n那点事\t122775\n电阻\t122776\n第十二集\t122777\nhollistheDJ\t122778\n天错\t122779\n完假紫\t122780\n而至\t122781\n喔米托佛\t122782\n克么\t122783\n囧囧囧囧囧囧囧囧囧囧囧\t122784\n阿远\t122785\n谦虚点\t122786\n东和\t122787\n叶林夕\t122788\naBB\t122789\nrice\t122790\n往后\t122791\n82655\t122792\nVshow\t122793\n阿迪\t122794\n天放\t122795\n长垣\t122796\n份稿\t122797\n东咪\t122798\n35千克\t122799\n大脸\t122800\nGyffchukngg\t122801\n自学\t122802\nsaysit\t122803\nX光\t122804\n今天早上\t122805\n查看\t122806\n比战\t122807\n芸芸众生\t122808\n浮躁\t122809\n13023287366\t122810\n刁钻\t122811\n戒指\t122812\n明天上午12点\t122813\n淘宝体\t122814\n大脚\t122815\n一起码\t122816\n管吃\t122817\n萌宠\t122818\n福兴阁\t122819\n江水如\t122820\n成一团\t122821\n司马\t122822\n六万3200多少\t122823\n整讲\t122824\n裴金佳\t122825\n心心快快乐\t122826\n黄巾起义\t122827\n例行\t122828\n你好度秘你好乖\t122829\n哈皮\t122830\n借力\t122831\n宫心计\t122832\n追到底\t122833\n征点\t122834\npolo衫\t122835\n小偶师\t122836\n阿里咯啦好快乐快乐理\t122837\n吴道子\t122838\n败笔\t122839\n砂金儿\t122840\n两两\t122841\njdhn\t122842\n摄影机\t122843\n浅绿色\t122844\n看看完\t122845\n比上不上\t122846\nmjd\t122847\n猜嘛猜嘛猜\t122848\nmja\t122849\nmjn\t122850\nhtdh\t122851\n287555525859080089365n8\t122852\nOne\t122853\ndud\t122854\n瓦解\t122855\n借助\t122856\n鸿蹄\t122857\n13131313131313131313131313\t122858\n秦始皇陵墓\t122859\n低潮\t122860\n让宅\t122861\n50许\t122862\n巧克力薯片\t122863\n有了么\t122864\n200岁\t122865\n支持人\t122866\n没看见雪\t122867\nhōng\t122868\n征炮\t122869\n卧里屯\t122870\n莫如寡\t122871\n很想\t122872\np虫婴\t122873\n公婆\t122874\n小红花\t122875\n日日是好日\t122876\n大花猫\t122877\n大花猪\t122878\n赶脚儿\t122879\n狗肉馆\t122880\n冯俊飞\t122881\n很惨\t122882\n卡哇伊\t122883\n讨厌讨厌讨厌拜拜\t122884\n二六三零\t122885\n谢函\t122886\n华数堡\t122887\n手擀面\t122888\n暴富\t122889\n朴素\t122890\n偏学下贱\t122891\n董继兰\t122892\n叁件\t122893\n似点\t122894\n忙归\t122895\n紫琅\t12
2896\n刚头\t122897\n你的爱情\t122898\n赶快说\t122899\n茜来\t122900\n小游小猫咪\t122901\n我就是我我就是我你是你\t122902\n塞思·麦克法兰\t122903\n婆婆婆\t122904\n开说\t122905\n狠辣\t122906\nsoory\t122907\n古言\t122908\n林姐\t122909\n酿制\t122910\nuehshsh293838\t122911\n端阳节\t122912\n我的嘛哈爸爸我爱啦啦啦啦啦\t122913\n亚财\t122914\n下毛片\t122915\n那个张\t122916\n蚝式恒动\t122917\n氛照\t122918\n第一弹\t122919\n韩文\t122920\n'\t122921\n十些多\t122922\n海螺国\t122923\n白色天马\t122924\n探寻\t122925\n统治者\t122926\n演出来\t122927\n牟秀春\t122928\n豁出去\t122929\nttfgbxh\t122930\n2.3.3版\t122931\nvdcgg\t122932\n李益\t122933\n2012年3月3日\t122934\nBBBbbbb\t122935\n咋秘\t122936\n1312345678\t122937\n打包\t122938\n206厘米\t122939\n不忘不忘\t122940\n一百瓶\t122941\n饭局门\t122942\ninsn\t122943\n凌潇肃\t122944\n后部\t122945\n徐峰济川小学\t122946\n我的最好的朋友\t122947\n精益求精\t122948\n3ku\t122949\n活捉洒撒比\t122950\nvvbbjjjjiy\t122951\n4点10分\t122952\n玉树临风白玉堂\t122953\n腰伤\t122954\nTFb0ys帅坶\t122955\n东门503\t122956\n超然物外\t122957\n朱彦硕\t122958\n3ko\t122959\n张之张\t122960\n刘梦溪\t122961\n慈悲好\t122962\n大八焏\t122963\n天佛\t122964\n搞怪无常\t122965\n判刑\t122966\n天佑\t122967\n影集\t122968\n天体\t122969\n4万元\t122970\n5838453830076585345\t122971\n装进\t122972\n画卷\t122973\nyyyyyyy\t122974\n富安娜\t122975\n装运\t122976\n天使\t122977\n擦射\t122978\n35度\t122979\n染上\t122980\n233333333333333333333333333333333333333333333333333333333333333333333333333333333333333333333333\t122981\n呐花\t122982\n多少\t122983\n咩别管\t122984\n年年\t122985\n小儿郎\t122986\n法轮功\t122987\n楼千尘\t122988\n弹奏者\t122989\ninterests\t122990\nlnlnlnlnhhhhhhhhhhhhhhhhh\t122991\nhtrtd\t122992\n死逼\t122993\n松紧\t122994\ngayG\t122995\n没看懂\t122996\nhttppinyincn1jS8oNRizUJ\t122997\n爪机\t122998\n158.0元\t122999\n铝理\t123000\n死张\t123001\ntohdtijxh\t123002\n1111111111111111111111111111171111\t123003\nGnfvcmf\t123004\n一九年\t123005\n死速\t123006\n好啦好啦我靠着你来看我好聪明\t123007\n八千多\t123008\n死逆\t123009\n伊犁\t123010\n都咪\t123011\n游紫轩\t123012\n700多\t123013\n街斯\t123014\n不舍都\t123015\n眼白\t123016\n順心如意\t123017\n贺岁币\t123018\n2月17日\t123019\n1352842\t123020\n按捺不住\t123021\n禁烟\t123022\n183907780\t123023\n猛虎\t123024\n派拉蒙公司\t123025\n万庆良\t123026\n凸凸凸凸凸凸凸凸凸凸\t
123027\n国家政权\t123028\n爱信\t123029\nleepping\t123030\n胡育茹\t123031\n车肚\t123032\n不见之路\t123033\n泉山路泉山村\t123034\nqq咯\t123035\n吴子豪\t123036\n邓皓文\t123037\n布泉\t123038\nksjs\t123039\nSLY\t123040\n最最好的朋友\t123041\n位家都面\t123042\n扭扭捏捏那你呢那你\t123043\n免受伤害\t123044\n家伙伴\t123045\n土蛋\t123046\nzaoni\t123047\nMode\t123048\n振り子\t123049\n丫急\t123050\n我着你\t123051\n飞从而飞\t123052\n吗了\t123053\n出题同\t123054\n胡小乖\t123055\n体罚\t123056\n张麻子\t123057\n尖椒干豆腐\t123058\n天晚上\t123059\n神舟不行\t123060\n锤基\t123061\n山中访友\t123062\n匙柄\t123063\n果角村\t123064\n50000多\t123065\n有神\t123066\n呆妹子\t123067\n雪爷爷\t123068\n成為\t123069\n啥么\t123070\n8G快速红棒\t123071\nT七\t123072\n52060\t123073\n558540666\t123074\n电信公司\t123075\n两万亿\t123076\n行骨如柴\t123077\nggbii\t123078\n雪雨\t123079\n大蓟\t123080\n雪雪\t123081\n安午安\t123082\n平民\t123083\n文也\t123084\n文乙\t123085\n平安夜呢\t123086\n5435221\t123087\n文书\t123088\n防御性\t123089\n木炭\t123090\n盆炎\t123091\n木炮\t123092\n深褐色\t123093\n高梁\t123094\n拿吧\t123095\n啵啵啵啵啵啵啵不不不不不不不不不不不不不\t123096\n2日\t123097\ngdhuxucjdbbs\t123098\nIT秘\t123099\nHnffbhvv\t123100\n刘秋月\t123101\n中山大学\t123102\ntapm\t123103\n神游\t123104\nleaderboard\t123105\n你的儿童歌曲没有我的我爱你\t123106\n窘态\t123107\n抱抱我的你\t123108\n异义词\t123109\n整数\t123110\n张佳宜\t123111\n下糕\t123112\nKeep\t123113\n度学\t123114\n琦庞\t123115\n咸菜\t123116\n滥片\t123117\n衰退\t123118\n草加料\t123119\n养父\t123120\n滤镜\t123121\n庙庙\t123122\n度子\t123123\nwuaacel\t123124\n逃出\t123125\nCecelia\t123126\n新康花园康乐苑\t123127\ntaxes\t123128\n17:55—22:00\t123129\n亍刂\t123130\n口出\t123131\nGjvjhk\t123132\n李凌晗\t123133\n李鸿忠\t123134\n赵兰馨\t123135\n操控\t123136\n李娇莎\t123137\n打出入境\t123138\n棍子\t123139\n太好了\t123140\n1包\t123141\n72758797788786767887\t123142\n飞哈哈哈\t123143\n1089元\t123144\n拿回家\t123145\n要不我不见\t123146\n子宫腔\t123147\n虚灵普\t123148\n越野\t123149\nムラサキ佔\t123150\n密集恐惧症\t123151\n阮月云\t123152\n找我不恨\t123153\n忍耐性\t123154\n鼠目\t123155\n九型人格\t123156\n今夜清\t123157\n丨轮\t123158\n腰酸\t123159\n找到我爱\t123160\n信不兴\t123161\ntokome\t123162\nl烧饼\t123163\n裤囖M9\t123164\n犯罪心理\t123165\n离手\t123166\nthghv\t123167\n恭喜恭喜恭喜\t123168\n我是女人我想爱你\t123169\n过沙…mdm1jmwumptgdmtg
mjmmnh6jwgTMWM\t123170\n北海海城公安局\t123171\n24辆\t123172\nonceu\t123173\n静心咒\t123174\n电充头\t123175\n就算\t123176\n转接\t123177\ngyygfgu\t123178\n怪物猎人\t123179\n初音像\t123180\nhvvuv\t123181\nasixthgrader\t123182\n噶伙计\t123183\n偷星\t123184\n嗯洛\t123185\n林永兵\t123186\n黄小燕\t123187\nGLASGOW\t123188\n认得到\t123189\n好心塞\t123190\n无聊在线\t123191\n纪律囖\t123192\n立冬\t123193\n13790465\t123194\naagjmtmm\t123195\n廖梧宏\t123196\n增厚\t123197\n服毒\t123198\n就管\t123199\n爵士\t123200\nSea\t123201\n祖轩\t123202\n第23期\t123203\nJOYCE酒会\t123204\n成建新\t123205\n阿依古\t123206\n喵咪法庭\t123207\n奔跑吧兄\t123208\n影剧板\t123209\n逗比嘻\t123210\njjggggttt\t123211\n艾玛·沃森\t123212\n黄村口\t123213\n张皇失措\t123214\n刘雨晨\t123215\n谷烧\t123216\n海巡船\t123217\nS.H.E\t123218\n格莱美\t123219\n没老\t123220\n空照\t123221\n碳酸饮料瓶\t123222\njdjje\t123223\n11日晚\t123224\n嘞噜\t123225\n招商国旅\t123226\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t123227\n烈风\t123228\n讓給\t123229\n娱记\t123230\n3点40点\t123231\n男身\t123232\n海喜狼\t123233\n禁播\t123234\n师者\t123235\n张逸飞\t123236\n琴声\t123237\n134373255846704468383\t123238\nbdce\t123239\n狗狗\t123240\nihwihhhwiiiw\t123241\n相城蠡\t123242\ncifvf接待香港演艺学院\t123243\n哲惟爱\t123244\n11012016\t123245\n一零场\t123246\n先立\t123247\n0名\t123248\n白晓燕\t123249\nxeno逐客令\t123250\n好彩屋企仲\t123251\n解觉\t123252\nfhjfgjuehjirjit9igghiiothoyrriiotrroitgu9\t123253\n5538755896542369\t123254\nrbq\t123255\n冰冷感\t123256\n同步\t123257\nhahaha\t123258\n一手房\t123259\n问问问\t123260\n盐桂皮茴香\t123261\n哑父\t123262\n恶魔天宝\t123263\n吴佳琳\t123264\n创世\t123265\n民俗化\t123266\n莫名了了了了了我\t123267\n创业\t123268\nuggd\t123269\n金匾\t123270\n宋刘颖\t123271\n一大笔\t123272\n李锋\t123273\n就好医\t123274\n嘿嘿度秘\t123275\n读片\t123276\n想来看\t123277\n灭口\t123278\n愶冂\t123279\nhjmjff\t123280\nxīn\t123281\nojhppjmnnjjijjimkjboghtfdssxzxc\t123282\n石水\t123283\n有洁\t123284\n今天下午六点\t123285\n三季度\t123286\n二连\t123287\n读物\t123288\n柔弱博\t123289\n吴慧雨\t123290\n小巫女\t123291\n对象网\t123292\n戏弄\t123293\n不不想\t123294\n欸族\t123295\n双十\t123296\n老实说\t123297\n豆血泪\t123298\n夜影\t123299\n乌梁素海渔场学校\t123300\n陈总好\t123301\n马超莲\t123302\n谈恋爱好\t123303\n以为有\t123304\n苍苍\t123305\nQQ非猫\t123306\n
寿光片\t123307\n教育部\t123308\ngxgo\t123309\n季宏发\t123310\n错的谁是谁非\t123311\n35块\t123312\n445566778899000000000000000个\t123313\n龚霁娆\t123314\nYttrgugyb\t123315\n还没有钱\t123316\nhttpwwwdog126comdogattributeint91html\t123317\n哭条\t123318\n天高地厚\t123319\n老实话\t123320\nBjx\t123321\n错把\t123322\n群艳\t123323\n十数根\t123324\n赞许\t123325\nqｑ\t123326\n本年度\t123327\n下吧\t123328\n赌侠\t123329\nTocha\t123330\n周八天\t123331\n不能耗\t123332\n听看见\t123333\n私家照\t123334\n下落\t123335\n李英立\t123336\n魔界\t123337\n餐官\t123338\npv108c\t123339\natdmdk\t123340\n安亲亲\t123341\n白小打的就是你的你\t123342\n下萌\t123343\n猪八戒猪八戒二小杆子小妖怪八一国\t123344\nghhzdgh\t123345\n少好\t123346\n空们\t123347\n下同\t123348\n20公斤\t123349\nfgddsdfd\t123350\n一千点\t123351\n我没有人教我求你\t123352\n78694864976669444397373\t123353\n6月5日\t123354\n坐阵\t123355\n红辣椒\t123356\n很遗憾档国\t123357\n就是你的纲\t123358\n流流流\t123359\n哎呀你看懂了么波波波波波波波波\t123360\n独轮车\t123361\n爪牙\t123362\n没用心\t123363\n好了爸爸爱的声音\t123364\n插秧\t123365\n高嘉康\t123366\ncruv\t123367\nhttpehiphotosbaiducomxiaodupicitem0ff41bd5ad6eddc4e5229d893edbb6fd536633fejpg\t123368\n套餐卡\t123369\n冰雹\t123370\n见鸡飞狗跳\t123371\n长期性\t123372\n碧玉\t123373\nttgvi\t123374\n2243109559\t123375\n伦家素\t123376\n呃酷aq士力架\t123377\n艺术品\t123378\n血印\t123379\n一九拉贝比是你的谁\t123380\n华仔瓜皮\t123381\n看我等\t123382\n猕猴桃汁\t123383\n中小企业\t123384\n物业\t123385\n谢静宜\t123386\n嘉祥县\t123387\n有缘见\t123388\n几把\t123389\nzsjjsz\t123390\nmm2tuvwsy\t123391\n105分钟\t123392\n傻眼\t123393\n风车\t123394\n十几十几年\t123395\n男星\t123396\n几折\t123397\n米样\t123398\n7月11号\t123399\nd8FB\t123400\n泄气\t123401\n校园篇\t123402\n细带\t123403\n好多人家\t123404\n斗文\t123405\n.们\t123406\n牙知道\t123407\n真心傻逼\t123408\n方言腔\t123409\n法拉奇\t123410\n脐装\t123411\n那就是说你吃过火烧树叶\t123412\n小坏猪猪猪猪\t123413\n杨忠渭\t123414\n福建公司客服部微总\t123415\n姜娟\t123416\n花样儿\t123417\n铁线\t123418\n重点中学\t123419\n小茹茹\t123420\n日不白痴\t123421\n一心在你先在在干嘛要我亲\t123422\n体格\t123423\n体校\t123424\n我不想你\t123425\n旭旭峰\t123426\n大六来\t123427\n荣美妞\t123428\n萧萧萧萧年\t123429\n一点五千\t123430\n不想你\t123431\n源可\t123432\n谷堆\t123433\n潘虹\t123434\n蓝筹股\t123435\n1300w\t123436\n沙滩\t123437\n通电话\t123438\n2016129期\t123439\n
下会\t123440\n雨来杏\t123441\n梦想家\t123442\n费县\t123443\n女星\t123444\n我们当你是什么样的和有你的爸爸妈妈是什么样的希望你在梦到你你知道我是什么样的我知道你是\t123445\n攻城狮们\t123446\n奇诺\t123447\n伊来西花那歌眯拼\t123448\n陈妍希\t123449\n奇诡\t123450\n无见到\t123451\n胡说好不好\t123452\nsbsbsbsbsbsbsbsb\t123453\n无性\t123454\n创维\t123455\n大白美\t123456\n不离不齐\t123457\n35686868635897646532\t123458\n无怨\t123459\nloli饭\t123460\n阴雨\t123461\n海尔姆珉豪\t123462\n猪人\t123463\n过场\t123464\n识系\t123465\n马致远\t123466\n此语\t123467\n博不信\t123468\n奥那\t123469\n哦怕怕酒吧紫卡显卡对吧逆袭摩擦会计说\t123470\n婆母福\t123471\n碧亚\t123472\n中央电视台新台址\t123473\n窦豆\t123474\n听说过天降\t123475\n无怖\t123476\n文具店\t123477\n迎宾大道\t123478\n吏治\t123479\n轮眉\t123480\n春运\t123481\n乞讨人\t123482\n一件好事\t123483\n再孩子\t123484\n英英\t123485\n几碌\t123486\n少有人\t123487\nNOVO潮流百货\t123488\n大作手可\t123489\nfaedab64034f78f08c31db877e310a55b3191c8fjpg\t123490\n二三一二六零九二\t123491\n冯玟玟\t123492\n运达\t123493\n几碗\t123494\n话剧界\t123495\n千伏\t123496\n我心\t123497\n人祸\t123498\ngjtam\t123499\n少有些\t123500\nhttpehiphotosbaiducomxiaodupiciteme61190ef76c6a7ef9228b452fafaaf51f3de66b3jpg\t123501\n绝绝\t123502\n理塘\t123503\n骚棒\t123504\n我是你的主人我叫姑姑\t123505\n线描\t123506\n这界\t123507\n毛贼\t123508\n绝经\t123509\n窝窝网\t123510\n二货群\t123511\nxio2c\t123512\n六七十元\t123513\n38分\t123514\n黄帝\t123515\n身份证\t123516\n积淀\t123517\n子子宫\t123518\n真的不想你\t123519\n你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪\t123520\n韩女士\t123521\n永不\t123522\n卖方\t123523\n这个月28号\t123524\n司南杓\t123525\n杀猪刀\t123526\n上上周末\t123527\n黄带\t123528\n法国西部\t123529\njsns\t123530\n杨老板\t123531\nfvxh\t123532\nnxnkmjk\t123533\n皱纹\t123534\n睡篮\t123535\n度秘恩\t123536\njsnd\t123537\njsnc\t123538\n拜退\t123539\njsna\t123540\n涵涵\t123541\n不知所措\t123542\n三小时\t123543\n说见\t123544\ngkyeg\t123545\n不瞒你说\t123546\n夏星\t123547\n适应期\t123548\n忠厚\t123549\n养不教父之过教不严师之惰\t123550\n麻曲靖\t123551\n平仄\t123552\n欲盖弥彰\t123553\n快件\t123554\n两栖动物馆\t123555\n一片多\t123556\n希尔顿酒店\t123557\nporno\t123558\n打怪兽奥特曼\t123559\npornk\t123560\n败犹荣\t123561\n施压\t123562\n120度\t123563\n平价\t123564\n索毛毛\t123565\n脑病\t123566\n代代星\t123567\n噎住\t123568\n辛亥\t123569\n水月禅心\t123570\n蓝绿红三面\t123571\n大珠小珠
落玉盘\t123572\n何超云\t123573\n彭云凤\t123574\n看板\t123575\n乌鱼波泼摸佛得特呢勒哥科喝机奇西知知识知识鹦鹉鱼15avou音乐唉茵微\t123576\n暖暖\t123577\n赔款\t123578\n肤质\t123579\n大阴蒂图\t123580\n陈艺琳\t123581\n苏州河\t123582\n用于\t123583\n雪熙\t123584\n16个\t123585\n几千\t123586\n看来\t123587\n见你了再见\t123588\ncmn\t123589\n用人\t123590\n外弯处\t123591\n杨梅混儿\t123592\ncmd\t123593\n生僻\t123594\ncma\t123595\n粗合股u付u\t123596\n经验丰富\t123597\n润洁卡通形象征\t123598\n银翼\t123599\ncmy\t123600\n知心可以\t123601\ncmv\t123602\n看杀\t123603\n无横财不富\t123604\n很用力\t123605\npageone\t123606\n普华永道两大国际会计师事务所\t123607\n绑带\t123608\n质疑\t123609\n乌木\t123610\n郝冰\t123611\n伊塞\t123612\n过分瓜\t123613\n犹存\t123614\n巫师\t123615\n王那\t123616\n田馨怡\t123617\n路旁\t123618\n望见\t123619\n时候度秘\t123620\n到此为止\t123621\n乌有\t123622\n李孙泽\t123623\n大腹黑\t123624\n大大咧咧\t123625\n西里波\t123626\n走了哼\t123627\n投诉你机器人\t123628\n卡神\t123629\n初见成效\t123630\n魂淡们\t123631\n2304358153\t123632\n冰激凌\t123633\n后罩楼\t123634\n古树\t123635\n未富\t123636\n被抢夺\t123637\n叱咤风云\t123638\n短腿儿\t123639\n涂序强\t123640\n六世铭\t123641\n不到底\t123642\n巴吖阿\t123643\n好多一\t123644\n快快快快乐\t123645\n空房\t123646\n叫罕\t123647\n6路\t123648\nghiphotosbaiducomxiaodupicitembba1cd11728b4710cf1682b8c4cec3fdfc03238cjpg\t123649\n北门\t123650\n代客\t123651\n九十多块\t123652\n說話會\t123653\n曹大海\t123654\n模范\t123655\n永世\t123656\n好多个\t123657\n妙妙皮亚尔\t123658\n马诗豪\t123659\n空战\t123660\n各具\t123661\n脆弱的心\t123662\n你好苏\t123663\n开火\t123664\n起晚睡\t123665\n开灯\t123666\n崔应龙\t123667\ndjjdj\t123668\n秘觉\t123669\n罗主任\t123670\n备街\t123671\n殷纣墨墨\t123672\n纳税者\t123673\n文我我是你的\t123674\n秘解\t123675\ner3\t123676\nyhgg\t123677\n剪刀\t123678\n剪切\t123679\n下个星期星期三\t123680\n三百零三\t123681\n奔跑吧跑男\t123682\n新浪网旗\t123683\nxjfjfjr\t123684\n13611596775\t123685\n淋巴堡\t123686\n周觅\t123687\n秦少\t123688\n绅士\t123689\n哇塞你是男还是女我是\t123690\n下半場\t123691\n大唐\t123692\n哈5665675676\t123693\n烹制\t123694\n443644644\t123695\n4158554801580730078968055241111111\t123696\n林婉\t123697\n真的好大好大好大\t123698\n婆麻烦\t123699\n秘度你错了我是说你陪我聊天真好\t123700\n99bb\t123701\nderdgg\t123702\n哈考特\t123703\n秦屯子\t123704\nudmt\t123705\n半岛晨报\t123706\n有缘无份\t123707\n高欣悦\t123708\n第一问\t123709
\n手气\t123710\n练习本\t123711\nery\t123712\n新浪微博\t123713\nere\t123714\n盘剥\t123715\n有得有失\t123716\n你是男的还女的半男半女人妖\t123717\n雷楼西\t123718\n而遗之\t123719\nECCO\t123720\n保卫科\t123721\n感谢你继续努力\t123722\ngodfhoi\t123723\niTunes无台标高清\t123724\n惠能\t123725\nf涛涛fn份\t123726\n杨海军\t123727\n炖牛肉\t123728\n聊天慢阿法林268\t123729\n靶子\t123730\n卡鲁\t123731\n椅背\t123732\n咕噜湾葫芦娃一根藤上七个娃风吹雨\t123733\n再卖\t123734\n10蚊\t123735\n猪猪那年\t123736\n天空的雾来的漫不经\t123737\n乖再\t123738\n扒手\t123739\n咬紧牙\t123740\n何国荣\t123741\n牛度\t123742\n一番\t123743\n读名\t123744\n读后\t123745\n德赫亚\t123746\n理炸\t123747\n生不死\t123748\n体育锻炼\t123749\n新博\t123750\n说外出\t123751\nstopedou堡\t123752\n新南\t123753\n供给\t123754\nmpk\t123755\n尼斯梅\t123756\n新华\t123757\n哎行\t123758\nwtyyyuiii\t123759\n那一笔\t123760\n大材小用\t123761\n军服\t123762\n种类\t123763\n一个六百一个一千\t123764\n旺仔\t123765\ndhiphotosbaiducomxiaodupicitem0b46f21fbe096b63bd369f240b338744ebf8ac93jpg\t123766\n故障告你是故事件\t123767\n喂的者\t123768\nｔｆｂｏｙｆｒｉｓ\t123769\n绘颜阁\t123770\n整服\t123771\n坍塌\t123772\n吴梦梦\t123773\n多事\t123774\n多亏\t123775\n多于\t123776\n经背\t123777\n经胎\t123778\n水晶编\t123779\n恶心小样\t123780\n高明\t123781\n带兵\t123782\n加快\t123783\n一起\t123784\n多亚\t123785\n完全版\t123786\n材生\t123787\n国鲨鱼\t123788\n多云\t123789\n多五\t123790\n你在干什么你在干什么\t123791\n哪一天\t123792\n高春\t123793\n极乐世界体会考虑还后普通IP亏湖心亭丢会男女关系你哄我贵公子一扭一扭你故意S3\t123794\n人事\t123795\n黎民\t123796\n内蒙诺\t123797\n吗然\t123798\n三峡\t123799\n朱一凡\t123800\n椰蓉\t123801\n多亿\t123802\n啦啦来了来了来了来了来了来了\t123803\n一赞\t123804\nskskw\t123805\n赌玉姑玉狐不然\t123806\nbatwy\t123807\n上挂\t123808\nnjbbn\t123809\nanold\t123810\nhttpfhiphotosbaiducomxiaodupicitem0b55b319ebc4b7453ea6fb63c8fc1e178a821590jpg\t123811\n查美戈\t123812\n屠童\t123813\n橘真琴\t123814\n王小楠\t123815\n猪脚饭\t123816\n味业\t123817\n一个三千\t123818\n裙子\t123819\n巴拉拉小魔仙魔剑公主\t123820\n悬壁式\t123821\n1882478557\t123822\n第二步\t123823\n江美\t123824\n快快24678\t123825\n一个五十步\t123826\n频器\t123827\naat3mfgwu3ejqawrjsh3ugjryzhfvjbxbgdufuqfhtgaryvo5\t123828\n得卖\t123829\n九四四零\t123830\nppp卡\t123831\n0点零四\t123832\n李梦丹\t123833\nreasonable\t123834\n这段日子\t123835\ngCX\t123836\n赵腾空\t123837\n悟县\t12383
8\n上挺\t123839\n体委\t123840\n十点点\t123841\nsbsbsbsb\t123842\n叫写\t123843\n大象肉\t123844\n别骗我了随笔影城\t123845\ndrudy\t123846\n都江堰市\t123847\n王岩林\t123848\n彭格列式\t123849\n106575326153\t123850\n抗骨仔\t123851\n点歌台\t123852\nshagua\t123853\n想和你不想\t123854\n英东\t123855\n时度\t123856\n分数线\t123857\n190棵\t123858\n每日电讯报\t123859\n经开区\t123860\n大小心\t123861\n杨大头\t123862\n不服输\t123863\n穿顿\t123864\nuhhhgh\t123865\n张炫铃\t123866\n顺意\t123867\n有神有人\t123868\n后市\t123869\nxxCnrNr8dldkmdnnngX5a\t123870\nhhjkkk\t123871\n上海上戏\t123872\n沟沟沟\t123873\n吧黑鸭子\t123874\n那干嘛\t123875\nzfkfh\t123876\n农工商\t123877\n纤巧\t123878\ncheers\t123879\ncountax\t123880\n白宝玉\t123881\n保持健康\t123882\n一百七十二块\t123883\n萨努\t123884\njggkkxtzzkkgxkckfcutxitttkjcfyiycncxfnxfxfjjgdtdrhktxjrxhrruxhfzdyrjtcfbycxtbyfrjfytjxxhfyfiycijfxidfysFfsxyzhfzetuvrcyxfgyggcykkfj\t123885\nUgh\t123886\n13003805367\t123887\n外面人\t123888\n洋务\t123889\n严欣萍\t123890\n至诚\t123891\n电褥\t123892\n525555555555555555\t123893\n斗艳\t123894\n行几吗\t123895\n格子控\t123896\n张亚\t123897\n王俟凯\t123898\n高叶轩\t123899\n大瓠\t123900\n高笑\t123901\n沁入\t123902\n人人白送我一只拉不拉多狗\t123903\n色性\t123904\n销售员\t123905\n大瓶\t123906\n借却\t123907\n5588896659\t123908\n神医生气香\t123909\nwrtui\t123910\n讨厌讨厌讨厌讨厌讨厌讨厌鬼\t123911\nbigbang三巡\t123912\n撮麻\t123913\n水床\t123914\n武僧\t123915\ndoge\t123916\n沦漪\t123917\n剥夺\t123918\nftff\t123919\n老办\t123920\n王武帅\t123921\ndogu\t123922\n水库\t123923\n补习\t123924\n富兰克林\t123925\n泗水\t123926\n祝福宝强\t123927\n羔羊\t123928\n致癌\t123929\n見\t123930\n逼宫\t123931\n要\t123932\n老海亭\t123933\n覃\t123934\n覅\t123935\nSHIT\t123936\n覆\t123937\n185674321\t123938\n贾语晨\t123939\n覺\t123940\n存亡\t123941\n新华社\t123942\n254256862\t123943\n不适感\t123944\n钱塘\t123945\n绕舌\t123946\n叭叭\t123947\ntfffffffffgffffft\t123948\n親\t123949\n美美美美美\t123950\n离开春\t123951\n小语共品\t123952\n54216691953379\t123953\n南广学院\t123954\nbaeetiy\t123955\n彩蛋\t123956\n替来\t123957\n王志坚\t123958\n忧桑\t123959\n灰鸭\t123960\n000000000000000000000000\t123961\nghiphotosbaiducomxiaodupicitemcc11728b4710b9121ebd288dc4fdfc0392452241jpg\t123962\n神魔遮天\t123963\nwy8fgjb\t123964\n砍柴\t12396
5\n多种多样\t123966\n国际章\t123967\n读大学\t123968\n近百万方\t123969\n背鳚\t123970\n白熊\t123971\n天天后天行\t123972\n慕容\t123973\n张副卡\t123974\n一月二十六\t123975\n银幕\t123976\n红小胖\t123977\n同新彩虹\t123978\n一万步\t123979\n快乐乐乐乐乐乐乐\t123980\n吐军\t123981\n零二二九\t123982\n566666665522233365\t123983\n铁链\t123984\n了里了了去了了了\t123985\n爱情人\t123986\n紫菜包饭\t123987\n捡破烂\t123988\n3月14\t123989\nvnvbdgghfh\t123990\n朝鲜战争\t123991\n发病期\t123992\njtdvly\t123993\n多巴哥岛\t123994\nkjgbiiv\t123995\n逗了再来\t123996\n3月12\t123997\n中国片\t123998\n七十克\t123999\n你的爱\t124000\n666666666666666666\t124001\n〇pn\t124002\n题目\t124003\n呼和浩特市\t124004\n罗密欧\t124005\n33995\t124006\n更爱\t124007\n陈琪\t124008\n铁铁\t124009\n105103208\t124010\n铁铃\t124011\n陈庭威\t124012\n马尔康\t124013\n食量\t124014\n安韵婕\t124015\n爱的时候\t124016\n宋威龙\t124017\n回来荡\t124018\n普通股\t124019\n七七来你\t124020\nYouareadog\t124021\n滚滚滚滚\t124022\n点点点点\t124023\n费先生\t124024\n东骏湖景湾\t124025\n榜样\t124026\n委会\t124027\n手续人\t124028\n幻想家\t124029\n阳光智博科技有限公司\t124030\n洋气\t124031\n整机\t124032\n大腕儿\t124033\n孙露\t124034\n狠舒服\t124035\n这两句\t124036\n校友\t124037\n二杯\t124038\n转去\t124039\n八百遍\t124040\n七海露亚\t124041\nGjjsjwjz\t124042\n二来\t124043\n不是啦里唱的歌\t124044\n二条\t124045\n唔通你系女噶\t124046\n没不如\t124047\nSasd\t124048\n二板\t124049\n于豪\t124050\nJ28460\t124051\n唧唧唧唧\t124052\n丫丫利\t124053\n一20周年\t124054\n笨重\t124055\n1000210\t124056\n别太气\t124057\n乖乖我不我不和你聊了我还有事情呢拜拜\t124058\n1条\t124059\n苦辣\t124060\n黄浦区\t124061\n单项\t124062\n1杯\t124063\n一下元\t124064\n别扭受\t124065\n毒蘑菇头\t124066\n15659813920\t124067\n铭爱\t124068\n记述\t124069\n金鹃国际广告有限公司\t124070\n比特庙\t124071\n宋空\t124072\n11.8%\t124073\n好吧vjbgf\t124074\n胡小琴\t124075\n今天元旦\t124076\n李莉\t124077\n席新悦\t124078\n再酱紫\t124079\n不卑不亢\t124080\n隔层\t124081\n翁红\t124082\nFrida\t124083\n走把\t124084\n小心眼\t124085\n哈尼哈塞\t124086\nfjhvjgt\t124087\n白龙马\t124088\nuiuuf\t124089\n李莹\t124090\n咿呀\t124091\nAommxin\t124092\n15156691825\t124093\n张敏\t124094\n0.零四元\t124095\nAV网\t124096\n绿间\t124097\n埠人\t124098\n新U感觉习惯习惯徐习惯\t124099\n正毛\t124100\nhuang\t124101\nstsfzhdushvizjzyhsixtpsfvmsixhkjxojskcodjcciueupsjfvo\t124102\n兵长\t124103\nBd
dbbdbdb\t124104\n好妈妈\t124105\n帝皇侠\t124106\n554a\t124107\n用额鹅鹅鹅曲项向天歌白毛浮绿水红掌拨清波\t124108\n跑校\t124109\n福天道\t124110\n徐世俱\t124111\n找一下\t124112\n六席\t124113\n不要再回我了我不想和你说话\t124114\nfilm\t124115\n北冰洋\t124116\n不择手段\t124117\n高腰\t124118\nDGGDgv\t124119\n3个\t124120\ntone\t124121\ntond\t124122\ntong\t124123\n阮同学\t124124\n2166.95点\t124125\n号赞\t124126\n离去\t124127\n办事\t124128\n火象星座\t124129\n乡前\t124130\ntony\t124131\n女皇\t124132\n我不要你了傻子恨\t124133\n办人\t124134\n好乖乖多咪宝宝真乖我是你的好嘛\t124135\n王英巨\t124136\n三小强\t124137\n吾身\t124138\n咪痴咗线\t124139\n有劲\t124140\n一百多遍\t124141\n3万\t124142\n1IE\t124143\n商君书\t124144\n曼度\t124145\n营业额\t124146\n吃衰衰\t124147\n2983064281\t124148\n王小宝\t124149\n蝴蝶国\t124150\njhud\t124151\njhuf\t124152\n画行\t124153\n刘冬\t124154\n看名\t124155\n熬过夜\t124156\n售票员\t124157\n刘冲\t124158\n福客\t124159\n刘冰\t124160\n進步\t124161\n王小宇\t124162\n你的话\t124163\n139亿元\t124164\nzhqng\t124165\n冫丿乀\t124166\n明旺\t124167\n翁翁翁\t124168\n福宝\t124169\n玩夜\t124170\n3235386628\t124171\n西窖村\t124172\n發大財\t124173\n十五世纪\t124174\n刘军\t124175\n福安\t124176\n脸类\t124177\n诺基亚\t124178\n比作\t124179\n小的时候\t124180\n斒子\t124181\n跟点\t124182\n三表学\t124183\nTykg\t124184\n十一月八\t124185\n吕艳华\t124186\n说化\t124187\n白大头\t124188\n4VB\t124189\n太浩太浩\t124190\n入座\t124191\n艾瑞莉\t124192\n献礼\t124193\n两星\t124194\n德阳三中\t124195\nllagu\t124196\n帕特森\t124197\n扑街\t124198\n一六号\t124199\nibibjbu\t124200\n不良\t124201\n屁滚\t124202\n三挺\t124203\n赵鱼竹\t124204\n野人\t124205\n骗了我看\t124206\nVK柯\t124207\n爱国帅锅\t124208\n不雅照\t124209\n切切切你除识你\t124210\n看我不就死\t124211\n死结\t124212\n新画\t124213\n移动移动移动移动咪多咪多咪多咪多咪多咪多咪多咪咪咪多咪多咪多咪咪人\t124214\n三振\t124215\nhvxgh\t124216\n羊与灰太狼之妈妈乐风\t124217\n度秘你的坏蛋哼哼\t124218\n尽责\t124219\n柳若雪\t124220\n悲从心中生\t124221\n麻花\t124222\nclosetonature\t124223\n一起玩\t124224\n闭门听\t124225\n春思\t124226\n陀螺四全玹雨\t124227\n馒头片\t124228\n高四\t124229\n爆张\t124230\n挑剔\t124231\n新生\t124232\n挻\t124233\n挺\t124234\n挽\t124235\n挼\t124236\n创造性\t124237\n秦浩\t124238\n提文\t124239\n刘志文\t124240\nmid\t124241\n挨\t124242\n挫\t124243\n挪\t124244\n秦海\t124245\nthanks\t124246\n振\t124247\n挡\t124248\n挠\t124249\n挣\t124
250\nlū\t124251\n挥\t124252\n挤\t124253\n挟\t124254\n大庆市\t124255\n挑\t124256\n阿贝哥\t124257\n杨文强\t124258\n挒\t124259\n多米亚度秘\t124260\n挖\t124261\n按\t124262\n科普\t124263\n挎\t124264\n持\t124265\n挂\t124266\nxXDD\t124267\n指\t124268\n楼子\t124269\n纣王\t124270\n宝藏\t124271\nhsghsysh\t124272\n我爸爸\t124273\n黄绍塔\t124274\n叶露\t124275\n贾华跃\t124276\n回光返照\t124277\n斯达制药\t124278\nsjdghwhfg\t124279\n周之\t124280\n5月18日\t124281\n四分之\t124282\n市场价格\t124283\nfiC\t124284\n王成\t124285\n王我\t124286\n15562359833\t124287\n战国中期\t124288\nmit\t124289\n我是你的应该架子\t124290\n周乙\t124291\n九成半\t124292\n黑黝黝\t124293\n李若星\t124294\n蹂躏\t124295\n终于知道\t124296\n夫文\t124297\n温暖的人\t124298\n爱我翻自私我爱我妈妈我爱我爸爸我要我的我爱我妈妈\t124299\n指悔\t124300\n15019870161\t124301\n轩辕\t124302\n衝機\t124303\n通知人\t124304\n升尘埃下的红没有新开立思成一枪天下\t124305\n迅狠\t124306\n本届\t124307\n杨宇\t124308\n叩子\t124309\n足球界\t124310\n翰爽\t124311\n杨宁\t124312\n半年多在刚你的爱\t124313\n豁免权\t124314\neuieh\t124315\n疼痛感\t124316\nRobbie\t124317\n路桥\t124318\n求求你了送\t124319\n亲亲亲亲亲亲亲\t124320\n想过日久\t124321\n57封\t124322\n本山\t124323\n1.7xA\t124324\n杨宸\t124325\n大柱哥\t124326\n纸头\t124327\n月经不调\t124328\n药理\t124329\nmmmyou\t124330\n几月几日假\t124331\n玛丽曼蒂曼蒂\t124332\n宜家宜\t124333\n独乐\t124334\n诺诺\t124335\n爱不够你\t124336\n奇特\t124337\n无羞\t124338\nkisulb\t124339\n动态口令卡\t124340\n你好我一脚\t124341\naptain\t124342\n十分之一堆\t124343\n为你相信\t124344\n宜家家\t124345\n小学弟\t124346\nF/2\t124347\n中国佛教协会訾议委员会\t124348\nt恤\t124349\n说难听\t124350\n百度地图\t124351\n泰国南部\t124352\n十三年\t124353\n刘丹美\t124354\n十三幺\t124355\n窝囊废\t124356\n炉子\t124357\n侠道\t124358\n白堂虎\t124359\n银价\t124360\n数千部\t124361\n镇政府\t124362\n王心谊\t124363\n师叔\t124364\n51904\t124365\n88成\t124366\n金羽杰\t124367\n莲藕\t124368\n运作\t124369\n肯但\t124370\n刷频\t124371\nappldle\t124372\n小螺号\t124373\n1月31日\t124374\n总裁肥\t124375\n窝女\t124376\n语气\t124377\n耶油\t124378\ndcidxdju\t124379\n就是我了\t124380\n大姑父\t124381\n渐进\t124382\nxxghhnnnnnn\t124383\n金胜木\t124384\n请客\t124385\n七星\t124386\n幺八五\t124387\n小楼\t124388\nidudysussisusts\t124389\n毛悦鬼\t124390\n斯摩尔\t124391\n亏着\t124392\n股民们\t124393\n倩雯\t124394\n中将\t124395\n功耗\t124396\n中尉\t1
24397\n中尊\t124398\n周几\t124399\n中小\t124400\n泡馆\t124401\n宝驴\t124402\n挺尸\t124403\nisis\t124404\n刻度\t124405\n2215\t124406\n李相雨\t124407\n请安\t124408\n石码\t124409\njemvjn\t124410\n甲午大海战\t124411\n看着\t124412\n喜欢然\t124413\n林草\t124414\nisia\t124415\nhggfffd\t124416\n行政机构\t124417\n宝马\t124418\nbiyuetaoma\t124419\nCream\t124420\n不知所\t124421\n湿疣\t124422\n咪都咪好喔诗潇\t124423\n放缓\t124424\n困顿\t124425\n资本家\t124426\n四要不\t124427\n北沙湾\t124428\n三百多岁\t124429\n走狗\t124430\nTofboys\t124431\n湿疹\t124432\n硬压\t124433\n两句号\t124434\n欧锦赛\t124435\n巨有\t124436\n长途汽车\t124437\n脊椎病\t124438\n不醉人\t124439\ne租宝\t124440\n1504940543\t124441\nMrvisen\t124442\ntest\t124443\n难言\t124444\n走了劲\t124445\n胡呈祥\t124446\n告诉我我不说\t124447\n嫩叶\t124448\n叶丽亚\t124449\n洋在伤病\t124450\n陈辉\t124451\n各重\t124452\n紫薇花薰衣草\t124453\n看信不信\t124454\n唉乜\t124455\n吖娇\t124456\n呱呱呱瓜呱瓜\t124457\n小精灵来了来了\t124458\n再末\t124459\n左红钢城\t124460\n樊可可\t124461\ngufutsx\t124462\n竿头\t124463\n卖一送\t124464\n息烽\t124465\nsihshhgd\t124466\n不又说话\t124467\n都不懂的甜言蜜语就是好想好想你\t124468\n无能度秘\t124469\nlOO岁\t124470\n笑笑容我爱我\t124471\n死姓\t124472\n幸福的日子\t124473\n完美释放\t124474\n官印\t124475\n竿夕\t124476\n出样\t124477\nsongs\t124478\n贵堡\t124479\n扫光\t124480\n美如花\t124481\nk卡拉萨木克肥\t124482\n未遂\t124483\n过有\t124484\n前男朋友\t124485\n故里\t124486\nwojiaonigun\t124487\n金鱼\t124488\n新年几月几日\t124489\n时间秘astvt键秘asbtvtanbucom\t124490\n瑞表集团维修中心\t124491\n过期\t124492\n对呀我没天天走\t124493\n求你\t124494\n080210\t124495\n异火\t124496\n15282629362\t124497\nhttpwwwiiboyscombookhtml385contenthtml\t124498\n活服\t124499\n何舒倪\t124500\ngu1\t124501\n狗事\t124502\n张德芬\t124503\n加冕\t124504\n烧纸\t124505\n易算\t124506\n431453498\t124507\n扫兴\t124508\n国语\t124509\n130493734\t124510\n哈吩\t124511\n循环厘米\t124512\nfhzydgbfg\t124513\n666666643\t124514\n杨鸣萱\t124515\n7000亿\t124516\n翻缸\t124517\n片长\t124518\n以为你不在\t124519\nａｂｃｄｅ\t124520\nB座\t124521\n哈吴\t124522\n手纹\t124523\n绝爱之城\t124524\n1482369078739\t124525\ngub\t124526\ngua\t124527\n哈同\t124528\ngug\t124529\nguf\t124530\ngud\t124531\nguk\t124532\nguj\t124533\ngui\t124534\nguh\t124535\nguo\t124536\ngun\t124537\ngus
\t124538\n国话\t124539\n经喜欢你\t124540\nguw\t124541\nguu\t124542\nGVFYGYGYGHG\t124543\nguy\t124544\n嗯k厅\t124545\n188亿\t124546\n江门地铁1号线一期\t124547\n大盘股\t124548\n对的对的错的错\t124549\n今天六点\t124550\n25-30年\t124551\n自懂\t124552\n咯侧\t124553\n一射日\t124554\n外强中干\t124555\n徐可欣\t124556\n张书萌\t124557\n好天真好\t124558\n怪叫\t124559\n隔离\t124560\n李敦强\t124561\nseqi\t124562\n参次\t124563\n7.15\t124564\nzjjzkz\t124565\n谈谈\t124566\n忙不去\t124567\n冻伤膏\t124568\nHuangse\t124569\n27806元\t124570\n我需要\t124571\n当班\t124572\n不是你的我告诉你\t124573\n哈罗哈罗\t124574\npbc90度点\t124575\n岳阳楼记\t124576\n卖火柴的小姑娘\t124577\n优沃\t124578\n谢喜阳\t124579\n尼玛币\t124580\n无尽\t124581\n沙田\t124582\n罗春萌\t124583\n肾萎缩\t124584\navsin\t124585\n沙画\t124586\n人字\t124587\n前一个星期\t124588\n爸爸片\t124589\n白切香鸭\t124590\n消乐\t124591\n一容两秒两秒两秒\t124592\n跟风狗得道成仙\t124593\n千，万\t124594\n四箱\t124595\n發張\t124596\n忘了爱\t124597\n赐予\t124598\n无尘\t124599\n五晚上\t124600\n根本停\t124601\n半小时\t124602\n姜甜甜\t124603\n我问你我是你的什么\t124604\n工种\t124605\n工作面\t124606\n15926483700\t124607\n钱节\t124608\n这四句\t124609\n人学\t124610\n朗讀\t124611\nihdfvh\t124612\n雪橇\t124613\n小杰样\t124614\n颜饭\t124615\n吴着希\t124616\n张春霞\t124617\n徐家汇光启公园\t124618\n春梦在那边\t124619\nyang子函\t124620\n鸟鸣\t124621\n座\t124622\n多谢多谢\t124623\n用光\t124624\n酵素\t124625\n一点二\t124626\n老孟阳\t124627\n金尼\t124628\n豪迈\t124629\n他们么\t124630\n嘎嘎嘎嘎哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t124631\n用具\t124632\n妮彩儿\t124633\n33xx\t124634\n鸟鸟\t124635\n盖卡吐露皮\t124636\n冰凉一夏\t124637\n光绪元宝\t124638\n太难\t124639\n填海\t124640\n月十九号\t124641\n运动鞋\t124642\nadeu\t124643\n爱奇艺\t124644\n咋子\t124645\n宁哥\t124646\n肿瘤\t124647\n甜甜甜甜\t124648\n雌性\t124649\n夜火车\t124650\n二传\t124651\n冷菜\t124652\n杀杀\t124653\n一百四百块\t124654\n考虑到\t124655\n白雪公主的小\t124656\n因人不k\t124657\n23.8%\t124658\n孵出来\t124659\n逼货\t124660\nibc\t124661\n受虐\t124662\nrdgy\t124663\n孔名丘\t124664\n莱州\t124665\n磨伤\t124666\n素菜\t124667\n杨祎帆\t124668\nibr\t124669\n看你走\t124670\nibv\t124671\n嗯嗯我爱你度秘我爱你我喜欢你我喜欢\t124672\n要不想\t124673\n1.85元\t124674\n翔哥\t124675\nClough\t124676\n正经的那你不要烦\t124677\n来我不懂事\t124678\n天六世\t124679\n喷头\t124680\n今早七点\t124681\n卫生巾\t124682\n不对劲饿\t12
4683\n87555744369988426688624899545\t124684\n东部汽车客运站\t124685\n鬼斧神工\t124686\nchin\t124687\n恩这会\t124688\n上脑\t124689\n雨哥\t124690\nchia\t124691\njkmn\t124692\n玛格丽塔\t124693\n韩传月\t124694\njkmm\t124695\n满天\t124696\n19度\t124697\n专家\t124698\n干涉\t124699\n火蛇\t124700\n入耳\t124701\n不烦人\t124702\n满头\t124703\n西峡吧家吧下午度\t124704\n胡夏\t124705\n翘\t124706\nbhiphotosbaiducomxiaodupicitem63d9f2d3572c11dfa00f9251642762d0f603c2e1jpg\t124707\n翔\t124708\ngjjgtuhcyib\t124709\n一七点\t124710\n25896545555585\t124711\n野狼们\t124712\n吾糸\t124713\n吾系\t124714\n翅\t124715\n鸦片\t124716\n何均为\t124717\n翁\t124718\n温情\t124719\n大boss\t124720\n好啊\t124721\n好啵\t124722\n翼\t124723\n翻\t124724\n台数比\t124725\n25.72亿元\t124726\n翰\t124727\n位列\t124728\n海印股份\t124729\n好啦\t124730\n翨\t124731\n13970230277\t124732\n植入式\t124733\n9787107251726\t124734\n翦\t124735\n环江\t124736\n翠\t124737\n位分\t124738\n京包\t124739\n报名字\t124740\nyocti\t124741\n带表\t124742\nuu五\t124743\n红自珍\t124744\n铂镔\t124745\nc座\t124746\n一百一十五块\t124747\n算了叫\t124748\n乱岁\t124749\n韭菜鸡蛋馅月饼\t124750\n雨具\t124751\n骚麦小漠\t124752\n黄黑\t124753\n背单词\t124754\n价比\t124755\n太康路\t124756\n天语文\t124757\n大阅兵\t124758\nv露易兹\t124759\n24.5%\t124760\n黄黄\t124761\n36千米\t124762\nffvbdffb\t124763\n空白\t124764\n885775555\t124765\n北京国贸展览中心\t124766\n愣是\t124767\n气筒\t124768\n林业\t124769\n解救\t124770\n林东\t124771\n李总\t124772\n杨梓邑\t124773\n事口\t124774\n3248\t124775\n铝合金\t124776\n解散\t124777\n先走时\t124778\n赫海\t124779\n诡骗人\t124780\nQAQ嘤嘤嘤\t124781\n高雨晴\t124782\n2折\t124783\n林丹\t124784\n阿梅利亚\t124785\n林丽\t124786\n浦东\t124787\n听听歌\t124788\n1470852369\t124789\n李思\t124790\n七十七块\t124791\n牛明\t124792\nGcd\t124793\nyeab\t124794\n720块\t124795\nyeah\t124796\n看到问题\t124797\n深带\t124798\n静夜思床\t124799\n拍一下来\t124800\nyear\t124801\nILoVe\t124802\n地球犬\t124803\n度秘c太卡\t124804\nS0157AX0VJJH\t124805\n深市\t124806\n曾根\t124807\n休闲\t124808\n背运\t124809\n书文\t124810\n我喜欢的时尚魔医\t124811\n西风东渐\t124812\n宋素姬\t124813\n问你会不会\t124814\n刘美施\t124815\n等一会\t124816\n江苏省泗阳县致远中学\t124817\n霍健华\t124818\n人之初\t124819\n西华\t124820\n冀东杰\t124821\n惊慌\t124822\n杨可爱\t124823\n西博\t124824\n过
山别\t124825\n开心小吧\t124826\n武打类\t124827\n0点2x4点\t124828\n千万匹\t124829\n起床点\t124830\n西南\t124831\n改革者们\t124832\n十七十八十九二十八\t124833\n两个位\t124834\n西卡\t124835\n昀离\t124836\n肿么回\t124837\n宝贝儿jm\t124838\n唔知这样的我的秘密\t124839\n天朝\t124840\n人不犯我我不犯人人若犯我我必犯人\t124841\n64646466\t124842\n西医昂\t124843\n傻子油\t124844\n麽天子\t124845\n紫燕百味鸡\t124846\nARJ21-700\t124847\n可不行\t124848\n养血\t124849\n八月四\t124850\n缺血\t124851\n哈都\t124852\n黄艳春\t124853\n快心\t124854\n自恋过\t124855\n多错\t124856\nlaoshen\t124857\n13677\t124858\n熊思怡\t124859\n俏意\t124860\n韩剧天国\t124861\n度秘西游记\t124862\n六十一点\t124863\n58655\t124864\nQWErYuOOPSDCB\t124865\n动员\t124866\n借据\t124867\n客死\t124868\n造谣者\t124869\n莺子\t124870\n吃谈\t124871\n熊思思\t124872\napple6\t124873\napple4\t124874\nIhaveto\t124875\n四环山水文园\t124876\ndhbxd\t124877\n动命\t124878\n兵兵真我喜欢兵兵\t124879\n没害\t124880\n李康男\t124881\n熔毁\t124882\n芥茉黄瓜\t124883\n白云白云山东\t124884\nntv7\t124885\n13000年\t124886\n曼尼诗\t124887\n#Kingband皇家乐队\t124888\n观察团\t124889\n家乐福\t124890\n组长天\t124891\ne20\t124892\ntajaj1a\t124893\napplef\t124894\nhbhbvv\t124895\n赵新佳\t124896\n太可爱太可爱太可爱了我爱死\t124897\n主办单位\t124898\n挡风玻璃\t124899\n转过头\t124900\n不分离\t124901\n象牙\t124902\n白山馆\t124903\n受也否\t124904\n没完\t124905\n孔钰博\t124906\n1GHz\t124907\n星期星期\t124908\ncomlasbasgo\t124909\n电影迷\t124910\nfghhH\t124911\n88188\t124912\n青霞\t124913\n老奸巨滑\t124914\n时运\t124915\n那你大大的坏我讨厌你\t124916\n4522252522\t124917\n长驻\t124918\n13456451633\t124919\n从儿\t124920\n喜欢我哪儿\t124921\n儿女双全\t124922\n湿子\t124923\n陈萌萌\t124924\nfvbvvnchrhhf\t124925\n读秘度秘度秘加油度秘度秘度秘度秘度秘度秘加油度秘加油加油度秘度秘加油加油加油加油加油加油度秘\t124926\n藏起来\t124927\n你类你类你类\t124928\n28日5时\t124929\n人生观\t124930\n电信网\t124931\n主题页\t124932\n球场\t124933\n子母\t124934\n初一人教版\t124935\n嘘声\t124936\n迪迦奥\t124937\n十一点半\t124938\n发发发发发发发发发发发发付444发发发发发发发发发发发发发发发发发发发发发\t124939\n浙江卫视\t124940\n杨同学\t124941\n15926841\t124942\n水虚岁\t124943\ngfvdfhbn\t124944\n俯仰\t124945\nteahrh\t124946\n卢比\t124947\n嗯待会\t124948\n爆高\t124949\n掉秤\t124950\n138287042331\t124951\n哈医大\t124952\n哦欢\t124953\n房奴\t124954\n小夜班\t124955\n囧rz\t124956\n炸毛\t124957\n谁爱你了你\t124958\n2层\t1249
59\n绞肉\t124960\n并用\t124961\nｕ１\t124962\n度尾\t124963\nHelloassassin\t124964\n溜\t124965\n懒干\t124966\n500㏄\t124967\n源\t124968\nmm裤\t124969\n神战巨\t124970\nHSs\t124971\n保守秘密\t124972\n挤压\t124973\n鬼城\t124974\n溃\t124975\n中招附中\t124976\n臭皮蛋\t124977\n溅\t124978\n花G友菩提哦随你意\t124979\nHSB\t124980\n溺\t124981\n茶迷\t124982\n乡音\t124983\n奇迹般\t124984\n溶\t124985\nHSL\t124986\n可悲\t124987\n溫\t124988\n溪\t124989\nHSP\t124990\n炸毁\t124991\n明天下午一点半\t124992\n日经\t124993\n溢\t124994\n不v宝贝们\t124995\nHSY\t124996\n捕捉\t124997\nhttpahiphotosbaiducomxiaodupicitem5ab5c9ea15ce36d3ad5b6c443df33a87e950b163jpg\t124998\njebb\t124999\n上天许我一个愿望\t125000\n黄换红\t125001\n监册\t125002\n延吉纫家么电\t125003\n吃饱眼福\t125004\n笑猪猪侠大里\t125005\n1234264\t125006\n胃火\t125007\n朱培君\t125008\n猪猪秘蜜兔蜜头头头头吧头头吧头头不了街头臭不要脸的头\t125009\n卡布丽\t125010\n肥城市\t125011\n安徽卫视\t125012\n翻博\t125013\nvb女\t125014\n四西路\t125015\n叶传鹏\t125016\n肉感\t125017\n溜达儿\t125018\n握紧\t125019\n英版\t125020\n呀不然当着\t125021\n罢秋\t125022\n———————————————————我喜欢你\t125023\n五一百\t125024\n蜘蛛不要脸的笨猪傻猪狼蛛\t125025\n兽\t125026\n兼\t125027\nWIN8\t125028\nhrgk\t125029\n典\t125030\n从严\t125031\n再见忘记了再见\t125032\n兵\t125033\n兴\t125034\n具\t125035\n棒棒糖\t125036\n共\t125037\nWIN7\t125038\n关\t125039\n波浪罐\t125040\n六\t125041\n公\t125042\n兯\t125043\n兮\t125044\n兩\t125045\n全\t125046\n八\t125047\n入\t125048\n社长\t125049\n內\t125050\n兦\t125051\n店铺\t125052\nhrgr\t125053\n兜\t125054\n从一\t125055\n党\t125056\n研究成果\t125057\n我讨厌你我讨厌你讨厌你讨厌讨厌讨厌讨厌讨厌讨厌你\t125058\n兑\t125059\n从不\t125060\n一大把\t125061\n免\t125062\n光\t125063\n先\t125064\n克\t125065\n度蜜度秘你真萌度秘度秘你真萌萌萌萌\t125066\n充\t125067\n从业\t125068\n多岁时\t125069\n兆\t125070\n允\t125071\n兀\t125072\n元\t125073\n隐形\t125074\n守成\t125075\ndfughhxfifg\t125076\n方块头\t125077\n捕换\t125078\n清液\t125079\n碉堡\t125080\n开设\t125081\n13035137955\t125082\n稀巴烂\t125083\n李嘉欣\t125084\n付顽抗\t125085\n宫晶文\t125086\nalsolike\t125087\n湿道\t125088\n挂机\t125089\n木木木\t125090\n度秘你身为我的小秘书\t125091\n32度\t125092\n惟有加倍\t125093\n┌───────────┐n　
│\t125094\n不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂\t125095\n瑞士卷\t125096\n思明区法院\t125097\n酒井美\t125098\n美小鸭\t125099\n怛恺\t125100\n安东尼·葛姆雷\t125101\n地三鲜\t125102\n宜家过时片\t125103\n完胜\t125104\n脑容量\t125105\n上网\t125106\nhvj\t125107\nhvl\t125108\nhvn\t125109\n龙脉家园\t125110\nhvc\t125111\nhvb\t125112\nhvd\t125113\n嗯雅\t125114\nhvf\t125115\nhvy\t125116\nhvx\t125117\nasoneyearago\t125118\n第四年\t125119\n被迫害型\t125120\nhvp\t125121\nhvs\t125122\n最爱的人\t125123\nhvu\t125124\nhvt\t125125\n米事\t125126\nhvv\t125127\n嗯雪\t125128\n樣嗎\t125129\n五十呼呼\t125130\n设问\t125131\n鬼黛\t125132\n六十一块\t125133\n凝空\t125134\n帮宝适\t125135\n必定要\t125136\n六百名\t125137\n嗯零\t125138\n随你可以\t125139\n七岁周岁\t125140\n好想笑\t125141\n靖远\t125142\n14分之一\t125143\n最后一个\t125144\n恋着\t125145\n忽而\t125146\n二零一一年六月四日\t125147\n大大大大大\t125148\n罗雨婷\t125149\n路面\t125150\n图阿拉\t125151\n本宫芳龄\t125152\n大不列颠\t125153\n安大\t125154\n周屁\t125155\n自愧不如\t125156\n说来\t125157\n别跑开\t125158\n武院长\t125159\n李我去\t125160\nvdjgsbdjchk可多可少vckkxnslfb\t125161\n行不行度秘\t125162\n12345567895\t125163\n12345567894\t125164\nhhhhjj\t125165\n0.87米\t125166\n原形毕露\t125167\n环保型\t125168\n夜打\t125169\n嗯爹萌\t125170\n猪你的头我是小姐必须害我小姐不染红讲个局肖我不爱你\t125171\n说板\t125172\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t125173\n进口\t125174\n小力\t125175\n大跌\t125176\n前天下午\t125177\n逄崇志\t125178\n正逐步\t125179\nwarforour\t125180\n未來\t125181\n怜悯\t125182\n陈楠\t125183\n好吗度云\t125184\n美眉们\t125185\n做作死\t125186\n审案\t125187\n嗯好萌萌\t125188\n必死\t125189\n么么么\t125190\n对你的要求\t125191\n进取\t125192\n小动\t125193\n好啊好啊好\t125194\n长洋钉\t125195\n叶子骨\t125196\n特征值\t125197\n绝活\t125198\n小加\t125199\n1123456789101112121415\t125200\n虫戒\t125201\nJfgufij\t125202\n撼乸\t125203\n竞争者\t125204\n世博\t125205\n飞流\t125206\n15:40\t125207\n行帮\t125208\ntatey\t125209\n柴油\t125210\n宾格棒\t125211\n康师傅\t125212\n旅游线路\t125213\n一点也不可爱了你\t125214\n小时代\t125215\n佛法概论\t125216\n喔爱\t125217\n安江\t125218\n买卖我喜欢\t125219\n颈篙\t125220\nutth\t125221\n踏帘\t125222\n2点半\t125223\n嗯春眠\t125224\nGentleman\t125225\nhgdff\t125226\n存取\t125227\n死曲\t125228\n巍金艬\t125229\nciidh\t125230\n功能性\t125231\ntjlc\t125232\njjjhu\t125233\ngjxd\t1
25234\n一路顺风\t125235\n聊了在\t125236\n88788555555556565\t125237\njjjhh\t125238\n倒工\t125239\ncQ币\t125240\n同化\t125241\n三十级\t125242\njjjhg\t125243\n520米米\t125244\n砟子\t125245\n东落\t125246\n数百万条\t125247\n弄一弄\t125248\n妹妹片\t125249\n列出来\t125250\n珠穆朗玛\t125251\n摩羯座\t125252\n恺延\t125253\n海信\t125254\nrepresententai\t125255\n祖帅\t125256\n聒噪\t125257\n做操\t125258\n祖师\t125259\n雨真\t125260\n东营\t125261\nauverasia\t125262\n强生\t125263\n米丽古丽\t125264\n无所不难看\t125265\n大河向东流呀天上的星星参北斗哇风风火火闯九州啊路见不平一声吼\t125266\n淫欲\t125267\n刘一鸣\t125268\n怪脸\t125269\n2312775563\t125270\n热烈\t125271\n擦汗\t125272\n喜忧\t125273\n黄河古道\t125274\n哆来咪\t125275\n社保局\t125276\n慧家\t125277\n山地\t125278\n一呼百应\t125279\n一个十一岁\t125280\n较高\t125281\n3605050\t125282\n胆矾\t125283\n耳坡湖\t125284\n4月26日\t125285\n狮驼洞\t125286\ndhiphotosbaiducomxiaodupicitemd058ccbf6c81800ac25d757cb63533fa828b470cjpg\t125287\n天津大咪咪\t125288\n算啦你牛b你行了吧\t125289\n热热\t125290\n20：10——22：00\t125291\n虎牙犬\t125292\n海葵\t125293\n郭群刚\t125294\n5碟\t125295\ntragedies\t125296\n好吃醋\t125297\n4x二四\t125298\n带不走\t125299\n真面\t125300\n教训\t125301\n璇\t125302\nE\t125303\n城外\t125304\n吃饱还没有\t125305\n张丹丹\t125306\n八神\t125307\n城中村\t125308\n林一朵\t125309\nSNB-E两大超频记录\t125310\n抽芯包\t125311\n15~30分钟后\t125312\n进食\t125313\n125736124\t125314\n主食\t125315\n真非\t125316\n紫蛋\t125317\n诈骗罪\t125318\n679797\t125319\n盛阳苑\t125320\n城头\t125321\n长寿面\t125322\n劳动和社会保障部\t125323\n珊娜\t125324\n赵令\t125325\n五百分钟\t125326\nimadeyou\t125327\n火源\t125328\n敲背\t125329\n清凉油\t125330\n现场表演\t125331\n桑班\t125332\n北汽集团\t125333\n健健康\t125334\n西画\t125335\n西电\t125336\n微看\t125337\n弱水\t125338\nm句英图\t125339\n西甲\t125340\n阿欧阿欧\t125341\n里根斯堡Regensburg\t125342\n四甲\t125343\n兄弟姐妹们\t125344\n四男\t125345\n燕儿\t125346\nbackup\t125347\n33.4\t125348\n君子剑\t125349\n嗯嗯度度\t125350\n26单\t125351\n一昼夜\t125352\n华天宝\t125353\n中立\t125354\n看守所\t125355\n亲爱的你真的把曰\t125356\nxtdoydo\t125357\n陈鹤龙\t125358\n9点58分\t125359\n内地版\t125360\n民族风\t125361\n湖里区政府\t125362\nrfiags\t125363\n帅哥们\t125364\n金色\t125365\nhhjk\t125366\n利休\t125367\n俩片\t125368\nhhjb\t125369\n百年前\t125370\n毫米\t125371\n王一博\t125372\
n非动机\t125373\n哈利.波特\t125374\n讲笑话\t125375\nshougoutloud\t125376\n一堆子\t125377\n存食\t125378\n瑞金\t125379\n变声器\t125380\n粉领\t125381\n帕耶\t125382\n68686\t125383\n2秒钟\t125384\n低碳生活指数\t125385\n降雨量\t125386\n瓮王\t125387\n服从性\t125388\njrheihuf\t125389\n吉娃\t125390\n不帅\t125391\n我丑\t125392\n猜考试\t125393\n苏宁电器\t125394\n奥原来\t125395\n说到做到\t125396\n小鹿\t125397\n佶眼\t125398\n一五多少\t125399\n宾利欧陆\t125400\n宁子\t125401\n我七\t125402\n小鹏\t125403\n初学者\t125404\n棒子\t125405\n再见\t125406\n我中\t125407\n旮瘩\t125408\n我丫\t125409\n来自己\t125410\n上交所\t125411\n21Mbps\t125412\n98千克\t125413\n朱莉\t125414\n柳州\t125415\n天黑秋子b2\t125416\n不吗不吗\t125417\n忆述\t125418\n萧红\t125419\n100级\t125420\nIdon\t125421\n潘以腾\t125422\nIdol\t125423\n今天17点36分\t125424\n叔公\t125425\n1245nn14545758\t125426\n第某\t125427\n栩栩如生\t125428\n撒郎\t125429\nadm突突吐\t125430\n蒙专\t125431\n火衣\t125432\n无视我我也不会强你所难的拜拜我伤心了再也不理你\t125433\nCCXV\t125434\n响当当\t125435\n第八集\t125436\n让你好\t125437\nswuou\t125438\npoou\t125439\n二手烟\t125440\n葱油肉丝面\t125441\n求求你了说是谁\t125442\n胶皮\t125443\n000点\t125444\n夏尿\t125445\n切真讨厌\t125446\n波菜\t125447\n专享\t125448\n破竹\t125449\n陪陪陪\t125450\n一百二百三百五百\t125451\n七头猪\t125452\n亲爱的思密达\t125453\npooo\t125454\npool\t125455\n无人\t125456\nmdntG\t125457\n观光客\t125458\n夏尔\t125459\n在哪里儿\t125460\n就钱吖\t125461\n无二\t125462\n张智慧\t125463\n香鱼\t125464\n无事\t125465\n申花\t125466\nuhygygy\t125467\n半斤八两亚欧\t125468\n阴沟\t125469\n睡啦\t125470\n明华\t125471\n258891\t125472\nYUY\t125473\n高野宏\t125474\nWdffffgghgjfugdyf\t125475\n神马转码\t125476\n丫頭\t125477\n有空磨\t125478\nv一个\t125479\n一城\t125480\n還沒\t125481\n赶完\t125482\nuhhgfgh\t125483\n袁氏生\t125484\n芈姚\t125485\n长江路\t125486\n吻袍#锤基##裸基\t125487\nYUN\t125488\n查塔卡\t125489\n影忍者\t125490\n半一下\t125491\n李商隐\t125492\n烦你\t125493\n能不能测\t125494\n齐心共\t125495\n杂杂\t125496\n花八\t125497\n老粮食局\t125498\n某我\t125499\nwmn\t125500\ndtijh\t125501\n一基\t125502\nwmb\t125503\nxq斯密\t125504\nwmg\t125505\n亲家辉\t125506\n驿站\t125507\nzhaoli\t125508\n干校\t125509\n容窝\t125510\n华夏文明\t125511\n缓存\t125512\n规划\t125513\n雷霆炎魔号\t125514\nhhxhhz\t125515\n腌菜\t125516\n规则\t125517\n音樂\t125518\n送钱\t125519
\n第七栋\t125520\nVV打车VV嘎飞飞\t125521\n程又青\t125522\n百分点\t125523\n程满昌\t125524\n龚雪\t125525\n最后腿\t125526\n推销\t125527\n机械\t125528\n天津市福尔买卖有限公司\t125529\n名人酒店\t125530\n吉本\t125531\n对吗\t125532\n度秘萌萌小小度秘真可爱我们聊天\t125533\n复唧\t125534\nGUF\t125535\n李波龙\t125536\nGUO\t125537\nGUN\t125538\n北京博凯中兴儿童足球俱乐部\t125539\n杨雨莎\t125540\nasp\t125541\nUTTAKA\t125542\n三百六十六块\t125543\nGUU\t125544\nGUT\t125545\n台址\t125546\n喻渲\t125547\n巫师君\t125548\n引咎\t125549\n废话废话废话废话废话\t125550\n友金\t125551\n恩字\t125552\n哼不理你了臭度秘\t125553\n如同一辙\t125554\n高仿\t125555\n10515\t125556\n原邦彦\t125557\n发尔夫一\t125558\n一没几天\t125559\n周年寂\t125560\n农民工们\t125561\n扩股再缩股\t125562\n不安生\t125563\n心知\t125564\n好郁\t125565\n嗯零二二八八三五四二二零\t125566\n光子\t125567\n入货\t125568\n朱度秘\t125569\n阿君妈包\t125570\n光字\t125571\n狗肉节\t125572\n心石\t125573\n心惊肉跳\t125574\n黄博文\t125575\n一个2015200033\t125576\nbkbj\t125577\n胜者\t125578\n畜禽\t125579\n光学\t125580\n高价\t125581\n龙圩区\t125582\n无聊了我求你\t125583\n实实\t125584\n美国全国广播公司\t125585\n三年个\t125586\n上六天\t125587\n35年\t125588\n咽苦吐甘\t125589\n潘政博\t125590\n来了了了\t125591\n见面礼\t125592\n3月31号\t125593\n大天后\t125594\n风评\t125595\n盗笔记超级季播剧\t125596\nS1\t125597\nhi皮波cq1\t125598\n一滴一滴\t125599\n好伐思密达\t125600\n大气道\t125601\n哬哬\t125602\nsoan\t125603\n宏光\t125604\n维修店\t125605\n小气气\t125606\npup\t125607\n泳圈\t125608\n黑心鬼\t125609\n露波图\t125610\n杨春蕊\t125611\n8369\t125612\n多媚\t125613\nasl\t125614\n没事吧秘\t125615\n喜欢你的可是\t125616\nSy\t125617\n叫声哥\t125618\n嗯嗯唱一个我很的我们那明天\t125619\n青梅竹马\t125620\n划锁\t125621\nhi皮波cqu\t125622\n闲鱼\t125623\nSp\t125624\n6680万元\t125625\nSt\t125626\n炎陵\t125627\nSi\t125628\nSh\t125629\nSo\t125630\n刘老板\t125631\n//[许愿\t125632\nSb\t125633\nSa\t125634\n瓶瓶罐罐\t125635\nSf\t125636\n忘了秘\t125637\n校花十\t125638\nSY\t125639\n执位\t125640\nSS\t125641\n抚养\t125642\nSP\t125643\n评标\t125644\nSU\t125645\nST\t125646\nSK\t125647\nSJ\t125648\nSI\t125649\nSH\t125650\n个v过奖过奖\t125651\nSN\t125652\nSM\t125653\n逛街\t125654\nSB\t125655\nSA\t125656\ntyytrrro\t125657\nSG\t125658\n煤火\t125659\nSE\t125660\nSD\t125661\n赋予\t125662\n豪比\t125663\n310千克\t125664\nIBM\t125665\nBxkgkbv\t125666\n巧克
腻\t125667\n阿尔坎塔拉\t125668\n海林\t125669\n厄齐人\t125670\n王苏越\t125671\n桶字\t125672\n巴氏腺炎\t125673\n陈叔\t125674\n丹尼·鲍耶\t125675\nhi王珂\t125676\n9999999999999699999999999999999999岁\t125677\n宝葫芦\t125678\n秋略网\t125679\ncomono\t125680\n明火\t125681\n酷睿\t125682\n羊粑粑\t125683\n陈可\t125684\n闹闹闹闹闹闹\t125685\n等你好\t125686\nk太郎\t125687\n留下来\t125688\n西扇\t125689\n不份工\t125690\nbluemove\t125691\n公信\t125692\n原你\t125693\nJackJJ\t125694\n发誓天黑\t125695\n四月份\t125696\n宽景\t125697\n14567\t125698\n14565\t125699\n接回\t125700\n一万年\t125701\n贾承坤\t125702\n仲识\t125703\n得以为\t125704\n不找你我骗不着你我要你找我\t125705\n蔡旻君\t125706\n神明的话\t125707\n谢冕\t125708\n千翔\t125709\n见雪\t125710\n原位\t125711\nanytime\t125712\n#幽默笑话#公司\t125713\n杜振\t125714\n好象\t125715\n张子海\t125716\n舟车\t125717\n原作\t125718\n脱色\t125719\n大丑鸡\t125720\n崔始源\t125721\n5087428669341180533087746633259\t125722\n窝里量\t125723\n不满\t125724\n月月乐\t125725\n5749\t125726\n挑战赛\t125727\n自谦\t125728\nｖｇｇｇ国会\t125729\n胭脂红\t125730\n果报\t125731\n使徒行者2\t125732\n宏泰\t125733\nmiter\t125734\n月份\t125735\n你的理由\t125736\n70岁\t125737\n第8单\t125738\n零二二零四四\t125739\n老戏骨们\t125740\n心里舒服\t125741\n一百零一六十一一百五十一一千一百一十一\t125742\n放放歌尔好片\t125743\n死缠赖\t125744\n重量\t125745\nSOT\t125746\n急飞\t125747\n婚外恋\t125748\n这么多年来\t125749\n涂峰\t125750\n别名\t125751\n更厉\t125752\n别后\t125753\n哒啦\t125754\n急风\t125755\n在天边\t125756\n秋湖\t125757\n零五二零\t125758\n法律责任\t125759\n遥传\t125760\n谢雨浩\t125761\n吓一跳\t125762\n萌萌哒花花\t125763\n研习\t125764\n我爱上你了\t125765\n一和一个\t125766\n爱我吧你好不好\t125767\n别听\t125768\n周末愉快度秘\t125769\n履带\t125770\n别吵\t125771\n怜君\t125772\n狠狠\t125773\n控制不住\t125774\n爭取\t125775\n极品模王\t125776\n二不准\t125777\n全州拌饭\t125778\n快快点点点点点儿\t125779\n刮胡子\t125780\n坟蛋\t125781\n二十四小时\t125782\n拉看\t125783\n好好人\t125784\n妻侄\t125785\n郑三发\t125786\n一百多\t125787\ndub\t125788\n妻侍\t125789\n自动报\t125790\n万和七一\t125791\n今冬\t125792\n陈楚生\t125793\n水火无情人有情#灭火器\t125794\n一百天\t125795\n蜘蛛大大吧猪猪了侠叔\t125796\n177778\t125797\n李律师\t125798\n一百头\t125799\n怎办么\t125800\n座上\t125801\n肚脐种\t125802\n专款\t125803\nTown\t125804\n低于\t125805\n喜欢你我喜欢\t125806\n这么秘\t125807\n起投\t125808\n的丫\t125809\n阿陀\t12581
0\n11毫米\t125811\n聊天饭\t125812\n李秋玉\t125813\n原句\t125814\n哆啦ababy\t125815\n仰慕\t125816\n脑头\t125817\nFdvhg\t125818\n死脸\t125819\nFantastic\t125820\n安华卫浴\t125821\n文英\t125822\ncvnurag\t125823\n11月30日\t125824\n能会儿\t125825\n1965年\t125826\nTai\t125827\n攻城狮\t125828\ndue\t125829\n编导\t125830\nCOSA\t125831\nAdds\t125832\n石荣\t125833\n玛丽莲\t125834\n痞痞兔耶\t125835\nibxdhjn\t125836\n扑克\t125837\n林心\t125838\n老萧\t125839\n13785756217\t125840\n悲剧学家\t125841\n创跪\t125842\n产身\t125843\n千里送我毛礼轻情意\t125844\n美大美小我是你给\t125845\n为什么\t125846\nLoVE\t125847\n热坏\t125848\n发假\t125849\n500次\t125850\n獭猝\t125851\n1点钟\t125852\n洛克萌\t125853\n储蓄所\t125854\nvhjk\t125855\n发做\t125856\n我不要你了不要你来句滚开滚开\t125857\n积分儿\t125858\n碘片\t125859\n行动力\t125860\n儿冷\t125861\n171927357\t125862\n烟头\t125863\n你好暖\t125864\n凤金\t125865\n狼来了狼来了来了\t125866\n给姐春丽\t125867\n调理\t125868\n三单\t125869\ntive\t125870\n骂战\t125871\n大陆\t125872\n要谷\t125873\n助学金\t125874\nyuopk\t125875\n你么\t125876\n松本润\t125877\nguof\t125878\n香拉拉拉\t125879\n纯儿\t125880\n夜深\t125881\n883\t125882\n882\t125883\n881\t125884\n几方\t125885\n887\t125886\n886\t125887\n885\t125888\n884\t125889\n889\t125890\n888\t125891\n军衣\t125892\n還過\t125893\nWindow\t125894\ncvck\t125895\n东京市\t125896\ncvcc\t125897\n吴峙轩\t125898\n知呀\t125899\n秘东西\t125900\n无影脚\t125901\n一个22岁\t125902\n羽辰\t125903\n沉鱼落雁闭月羞花\t125904\n算不算\t125905\n音儿\t125906\n人文\t125907\n辽篮\t125908\n军衔\t125909\n门羹\t125910\n33669998\t125911\n周太阳\t125912\n35秒\t125913\nnbyy\t125914\n干拌厨\t125915\n88w\t125916\nim段\t125917\n大花花嘻嘻大花猫大花猫\t125918\ndddgxgc\t125919\n桃李来了俊升误会\t125920\n邱文萱\t125921\n88a\t125922\n暴怒无常\t125923\n9日17时40分\t125924\n再见了我不想和你\t125925\ndkfnydni\t125926\n出租声\t125927\n马腾\t125928\n严翔宇\t125929\n第八排\t125930\n快点儿早点爆笑谷\t125931\nMLTNJNTMGMGMGMGMT\t125932\n格力\t125933\n李嘉宏\t125934\n热身\t125935\n8点20分\t125936\n赵名思祺\t125937\n零下五度\t125938\n臭猪\t125939\n好什麼\t125940\n顾不比\t125941\n蓝心\t125942\n马不停蹄\t125943\n吸附\t125944\n嗯呀\t125945\n嗯呃\t125946\n建筑风格\t125947\n主导用\t125948\n闲医书藕叶少点\t125949\n不要死\t125950\nggvvcbh\t125951\n第53期\t125952\n谢文文\t125953\n四一KB\t125954\n嗯呐\t
125955\nxx彔\t125956\n上班族\t125957\n126%\t125958\n江有\t125959\n上书\t125960\n210下\t125961\n嗯呢\t125962\nSANTEN\t125963\n桓桓\t125964\n保级\t125965\n醒来\t125966\n百分之375\t125967\n没有我萌\t125968\nkyleslight\t125969\ngfvhjjhv\t125970\nok行\t125971\n第一眼\t125972\n1357759824\t125973\n三来一\t125974\n绝交我真的不想\t125975\n茶会\t125976\n错啦错啦好好看上面的话\t125977\n近距格斗\t125978\n└o┘YYTT\t125979\n跟进\t125980\n大草原\t125981\n生在福中不知\t125982\nDdjdjjd\t125983\n马崇武\t125984\n华夏五区\t125985\n凶我我很温柔\t125986\n北京大学体育馆\t125987\n易车度\t125988\n爆炸\t125989\n1.6万份\t125990\n为你死\t125991\n王艳萍\t125992\n第七集\t125993\n201601090913\t125994\nLopez\t125995\n度迷哇塞\t125996\n不谈走\t125997\n紧急刹车\t125998\n地导弹\t125999\norg点\t126000\n喜剧\t126001\nnizen\t126002\n靳\t126003\n模拟卷\t126004\n靴\t126005\n尸尸尸\t126006\n幺九五九零三零八\t126007\n靡\t126008\n靠\t126009\n3拣\t126010\n面\t126011\n靥\t126012\n那西卡\t126013\n未够\t126014\n声心\t126015\n班书\t126016\n开幕式\t126017\n杜灭\t126018\n陷入困境\t126019\n一个辛\t126020\n洞眼\t126021\n靓\t126022\n青\t126023\n芭比熊\t126024\n传送\t126025\n堂黄刀\t126026\n静\t126027\n王文燕\t126028\n复无复不爱百富\t126029\n靁\t126030\n葱油\t126031\n海诺\t126032\n钢琴家\t126033\n传递\t126034\n弹性\t126035\n陈玉良\t126036\n双四\t126037\n龙纹\t126038\n去年12月20日\t126039\n3352871649\t126040\n关押\t126041\n丛中\t126042\n78795576五凤四a\t126043\n苟倩倩\t126044\n郭一新\t126045\n锆石\t126046\n唐弟\t126047\n么状\t126048\ngignoring\t126049\n暗黑公主\t126050\n首脑\t126051\n潮湿\t126052\n咪头\t126053\n冒锅\t126054\n题小松\t126055\n对不对不对\t126056\n甬商\t126057\n自励自勉\t126058\n啦乖度秘\t126059\n点钟\t126060\n五味杂陈\t126061\n刷墙\t126062\n冋事\t126063\ngufh\t126064\n休刊\t126065\n千刀\t126066\n排行\t126067\n不好朋友\t126068\n结论\t126069\n司刁义\t126070\n内退\t126071\n调味\t126072\n点得\t126073\n琉漓殇\t126074\n内丹\t126075\nady\t126076\n结讯\t126077\n扭歪\t126078\nadf\t126079\n201班\t126080\n丌〇寥\t126081\n东来着\t126082\n盐水\t126083\n饿虎\t126084\n千利\t126085\n留下\t126086\n内丘\t126087\n还送\t126088\n暖脚\t126089\n胡思洁\t126090\n南水\t126091\n空寂寞\t126092\n这行\t126093\nFather\t126094\n20105月\t126095\n心旌摇荡\t126096\n百听不厌\t126097\n增发\t126098\n王亚楠\t126099\n谁子\t126100\n强三秋\t126101\n我爱你别爱我了我\t126102\n王吃\t126103\n物美乐\t1261
04\n王后\t126105\n王吉\t126106\n续航\t126107\n酒店业\t126108\n你好你是机器人么你好你好你好你好你好你好袅袅袅\t126109\n输血相关性移植物抗宿主病\t126110\n林教头纹鱼自助度秘\t126111\n薛佳锐\t126112\n牵头\t126113\n布娃娃\t126114\n杨子迎\t126115\n强暴\t126116\nhbhdhbh\t126117\n老逗我了行\t126118\n6455445454\t126119\n鬼天天\t126120\n99125\t126121\n连忙\t126122\n提小\t126123\n比喻\t126124\n嫩豆腐\t126125\n软忙\t126126\n为伍\t126127\n旱象\t126128\n一兆个\t126129\n倡议\t126130\n摩斗\t126131\n萌萌乖\t126132\n4.0升\t126133\n再见了从我到\t126134\n连心\t126135\n一看完\t126136\n往去天\t126137\n苍井\t126138\n鲍迪昔利\t126139\nwwwc\t126140\nwwwa\t126141\nggsgshsgfsgigcggxcgfoutt3xxvhiwqqweit\t126142\n五块\t126143\n烟水\t126144\n别罗索\t126145\n魔柱\t126146\nwwww\t126147\n和受\t126148\nediejdj\t126149\n黄土高坡现在就在上我和我\t126150\nx次\t126151\n八五零三八\t126152\n小蝌蚪\t126153\n那西游记\t126154\n滴滴答滴\t126155\nggdns\t126156\n打额恩爱恩爱你好吧主题\t126157\n我们来玩一个你我问你答的游戏\t126158\n演唱会\t126159\nuggffcccbm\t126160\n郑赖嘉琦\t126161\nxgkfruvf\t126162\n那么呢\t126163\n格罗\t126164\n十万多场\t126165\n想你干嘛\t126166\n街心花园三号六栋1001室\t126167\n不是说你时男时女\t126168\n我看你是头风中风中风中你就是头猪啊猪啊猪啊猪\t126169\n踩着\t126170\n嫁给我可\t126171\n单人夜曲\t126172\n不可信\t126173\n一百章\t126174\n嗯多谢虾米头内耳夸奖\t126175\n不！不！不！不！不！不！不！不！不！不！不！不！不！不！不！不！不\t126176\n薛兆丰\t126177\n万nn\t126178\nGhdgh\t126179\n外墙\t126180\nwww9\t126181\n嘿嘿t2000\t126182\n案发地\t126183\nhydjrgchfvhggnshfmhyffneghxjvgbxg\t126184\n忙行不\t126185\n江巧玉\t126186\n爱一个人\t126187\n肠涯\t126188\n胡蘿蔔\t126189\n老酒\t126190\n一正\t126191\n彩评师\t126192\n走卒\t126193\nwifi秘\t126194\n#嫣然\t126195\n不不固定\t126196\n仁义道德\t126197\n黑钱\t126198\n说挥手\t126199\n射饭\t126200\nkingst\t126201\nffffdtffyffg\t126202\n视听语言\t126203\n兴兴\t126204\n大呀小样\t126205\n晓娇\t126206\nYggdgfggghyrrhgcfhftvsfhhtgghh\t126207\n晓威\t126208\n容儿堡\t126209\n不是啦是kk魔法比\t126210\n而入\t126211\nPIS\t126212\n五谢谢你\t126213\n秘助手\t126214\n寓意\t126215\n千人鼓乐队\t126216\n十一对\t126217\n快跑快跑\t126218\n一米多\t126219\n李主张\t126220\n我的话了我不要你\t126221\n跟好\t126222\n孟津县工商局\t126223\n杨诗媛\t126224\n日记本\t126225\n猪脑门\t126226\n呢度\t126227\n沟里\t126228\nfyer\t126229\n37852\t126230\n团中央\t126231\n还以为你是\t126232\n五分之七\t126233\n1000多张\t126234\n吃爽歪歪\t126235\n农
历六月十九日\t126236\n香香\t126237\n通辽市\t126238\n亢奋\t126239\nkeke\t126240\n卡拉卡拉啦啦啦啦啦啦啦啦撸啦啦刘一只笑鸟非\t126241\n十九岁时\t126242\n京味儿\t126243\n丽枫\t126244\n六十秒\t126245\n舍不呢吃\t126246\n你的生活\t126247\n00000000\t126248\n1点零九\t126249\n了不我不我\t126250\n宝石\t126251\n形像\t126252\n唉瓮\t126253\n蒲伦\t126254\n套子\t126255\n杨佩佩\t126256\n简朴\t126257\nwhosbed\t126258\n韩兴珍\t126259\n全民突击\t126260\n衰隶\t126261\n服用\t126262\n原味板烧鸡腿汉堡\t126263\n干一件\t126264\ngyrnn\t126265\n银魂会\t126266\n服男\t126267\n小头头\t126268\n不抱怨的世界中\t126269\n统领\t126270\n闲人\t126271\n顿思\t126272\n多米我喜欢你我希望每天都能和你在一起我们结婚吧我爱你\t126273\n图库\t126274\ncfys\t126275\n斗图\t126276\n25556535535655555575889865598683\t126277\nnnbvvm\t126278\n武艺天声一队\t126279\n3米\t126280\n几个一眼\t126281\n摔跤\t126282\n3类\t126283\n粗有反应\t126284\n我张工\t126285\n周杏康\t126286\nTaylor\t126287\n中毒\t126288\n寒暑假\t126289\n毅力帝\t126290\n斯奥特曼\t126291\n我的宠物度秘\t126292\ntjseii\t126293\n年纪律\t126294\n闲事\t126295\n慢性病\t126296\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t126297\n闲了\t126298\n10度\t126299\nliveat\t126300\n隆系\t126301\n翘度\t126302\n欧尼亚\t126303\n梨红\t126304\n还有点\t126305\n我的女人\t126306\n就好好儿\t126307\n砸开\t126308\n鼻部\t126309\n27000\t126310\n鄙意思\t126311\n简明版\t126312\n1341357401\t126313\n苹果iPad\t126314\njjjjjbbc\t126315\n真不怎么样\t126316\n近视眼见\t126317\n江西东北部\t126318\n钱天全\t126319\nKRKKSSFTSL\t126320\n一下照\t126321\n说了忘记了我没有\t126322\n庄瑜芬\t126323\n肺腔\t126324\n肺腑\t126325\n杜蜜\t126326\n一百三十二块\t126327\n嵊州市\t126328\n网络业\t126329\n详解\t126330\n旺季\t126331\n梅姐\t126332\n奴家\t126333\n谢林海\t126334\ntobea\t126335\n农丹松蘑\t126336\n秃迓\t126337\ntobed\t126338\n曼城\t126339\n萝卜\t126340\n15884302204\t126341\n猪崽子\t126342\n一下一份\t126343\n咿呀你\t126344\n详见\t126345\n宁梓亦\t126346\n七十一七十一\t126347\n侵权\t126348\n银海\t126349\n史密达\t126350\n珊瑚鱼\t126351\nhttp1011025010yxxxm\t126352\n清汤舒坦\t126353\n廷桂\t126354\n李杭远\t126355\n此曲\t126356\n反反复\t126357\n五笔\t126358\n陈君豪\t126359\n522128200011284064\t126360\n这个城市\t126361\n8868868868868868868862\t126362\n李赫宰\t126363\nSOfarsolucky\t126364\n刘生旭\t126365\n婊子儿\t126366\n潘多拉魔盒\t126367\n哈哈哈太搞笑\t126368\n8度\t126369\n10：00\t126370\n火火\t126371\n怎奈何\t126372\n藝界\
t126373\n妇浔\t126374\n江南区\t126375\n０\t126376\n１\t126377\n２\t126378\n３\t126379\n４\t126380\n５\t126381\n28507285325\t126382\n4月15日\t126383\n８\t126384\n９\t126385\n：\t126386\n；\t126387\n嗷嗷\t126388\n＝\t126389\n＞\t126390\n？\t126391\n3362626852663366452399236833686938966\t126392\n！\t126393\n＂\t126394\n＃\t126395\n＄\t126396\n％\t126397\n＆\t126398\n＇\t126399\n（\t126400\n）\t126401\n＊\t126402\n＋\t126403\n，\t126404\n－\t126405\n．\t126406\nxm55\t126407\nＰ\t126408\nＱ\t126409\n会学学唱给我\t126410\n回牛\t126411\nＴ\t126412\nＵ\t126413\n官方\t126414\ncom肖\t126415\nnitaishuaile\t126416\n［\t126417\n＼\t126418\n］\t126419\n＾\t126420\n＿\t126421\n＠\t126422\nＡ\t126423\nＢ\t126424\n汉子版\t126425\nＤ\t126426\n延参\t126427\n摩尔伽\t126428\n信息费\t126429\n龙波\t126430\n查韦斯\t126431\n结节能\t126432\n4月6日\t126433\n回片\t126434\n邯郸市一中\t126435\n算了拜拜\t126436\n甘南\t126437\n不关己\t126438\n寻仙\t126439\n2月一号\t126440\n7577335\t126441\n16000\t126442\n乘积\t126443\n3点半\t126444\n唤起\t126445\n相应地\t126446\n81多少\t126447\n贝隆梦之鸡腿\t126448\n糊涂了突然路口克拉\t126449\n喜欢想\t126450\nhfvbjjkkkhhkj\t126451\n何来\t126452\nk00\t126453\n徐果\t126454\n心痒痒\t126455\nk01\t126456\n至少有很多\t126457\n一会儿数十只\t126458\n鸡尾酒八嘎\t126459\n再来嘛偶\t126460\n不要嘛不要嘛讨厌讨厌真讨厌\t126461\n菜园坝大桥\t126462\n东挪西\t126463\n何杰\t126464\n白蜡\t126465\n爱你的说话见一个女的很\t126466\n河流\t126467\n心心心上公公公\t126468\n罪名\t126469\nbiggest\t126470\n信息化\t126471\n2095年\t126472\n河海\t126473\n九轮\t126474\n湿度\t126475\n很可笑\t126476\nubrsid\t126477\n戴涛\t126478\n陪玩耍\t126479\n男的性\t126480\n鹅鹅鹅鹅鹅鹅\t126481\nSTAGE及Manhattan\t126482\n亿豪调\t126483\n子宫肌瘤\t126484\n在身\t126485\n乙醇\t126486\n查案\t126487\n我的朋友度秘多好呀\t126488\n泰坦诺斯\t126489\n百货大楼\t126490\n30年前\t126491\n偶戏ooooo\t126492\n33.3\t126493\n雄滴\t126494\n7000万元\t126495\n陈司琼\t126496\n成反比\t126497\n菲灵\t126498\n99avcom\t126499\n一个照\t126500\n昂力雷\t126501\n卡古拉酱\t126502\n川师大\t126503\n0.98%\t126504\n滑水\t126505\n石油网\t126506\n钥匙包\t126507\n8首\t126508\n密度板\t126509\n第一聪\t126510\n听不清除\t126511\n两小时内\t126512\n姐妹度\t126513\n文少全\t126514\n创：战纪\t126515\n狮虎巴尼古\t126516\nxcxbchgcjgdjgfgjd\t126517\nto呢福的忒呃图图你图\t126518\n第一聲\t126519\n7407
40\t126520\n此意\t126521\nfuov\t126522\n脚模\t126523\n成长型\t126524\n侬忙\t126525\n逆天上\t126526\n信奉\t126527\n天翻地覆\t126528\n我喜欢你是什么\t126529\n阿尔山景区\t126530\n动没了\t126531\n1975年\t126532\n雨轩\t126533\n夏历\t126534\n勾起\t126535\nweiweiainimang\t126536\n宝山路站\t126537\n嗯三区\t126538\n一条狗\t126539\n淫淫\t126540\nmanage\t126541\n吴早云\t126542\n烦着\t126543\n呃蓟县财政局\t126544\n价格单\t126545\n解药\t126546\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t126547\n失密\t126548\n地下城与勇士\t126549\n丁和卡\t126550\n一听音乐\t126551\n鞋底\t126552\n橡胶\t126553\n鞋店\t126554\n恶错\t126555\n庞锦华\t126556\n百灵\t126557\n许继电气\t126558\n韩语撒哇地卡\t126559\n稀望\t126560\n干燥\t126561\n持票\t126562\n我爱你爱你爱你屁股我爱你爱你爱你屁股我爱你屁股我爱你屁股我爱你就是你\t126563\n一大部分\t126564\n流尽\t126565\nGjkvffh\t126566\n卢郎\t126567\n經遠離\t126568\n稀有\t126569\n布布\t126570\n布市\t126571\ntttt\t126572\n抽烟有害健康\t126573\n总裁类\t126574\n天坛\t126575\n精治\t126576\n天坑\t126577\n精油\t126578\n崩沙卡拉卡\t126579\n给我不爽\t126580\n李成龙\t126581\n习尹\t126582\nkgdvhe\t126583\n白天天\t126584\n小度秘秘你猜我的叫谁呢小度秘秘\t126585\n卍岛\t126586\n六把\t126587\n啦啦啦啦啦啦你是一个小苹果你是一个臭别迷路撸啦啦撸啦撸撸啦撸啦撸\t126588\ngngh\t126589\n旦角儿\t126590\n赖帐\t126591\n妈妈说\t126592\n张远\t126593\nLM们\t126594\n幺八六四九零六六四八四\t126595\n问卷\t126596\nCyworld\t126597\n王助澜\t126598\ndjidii\t126599\n张迪\t126600\n呗娃\t126601\n垃圾股\t126602\n人次\t126603\n天下之不助苗者寡矣\t126604\nkt猫kt\t126605\n毫升\t126606\n运程\t126607\n四步\t126608\nRosslNl\t126609\n管不着我\t126610\n蛋糕\t126611\n#LM\t126612\n开封\t126613\n剪片\t126614\n啦啦拉啦拉啦啦露啦啦的爱\t126615\n孔一粲\t126616\n等人\t126617\n凉快儿\t126618\n我讨厌你我讨厌你我把\t126619\n有趣\t126620\n娘娘气\t126621\n葛心辰\t126622\n利笠赛高\t126623\n外痔\t126624\n涧西\t126625\n等于\t126626\n贾开文嗯\t126627\nv头xxujul\t126628\n不好不好不好不好不好不好不不不不不不不好不好不好波波不如推论玻璃bb\t126629\n共产党新闻网\t126630\n了结\t126631\n杨志腾\t126632\n闲置\t126633\n蛤蜊楚\t126634\n小螃蟹\t126635\n发乐\t126636\n9200\t126637\n裸露\t126638\n这个错\t126639\n这么多少\t126640\n费钱\t126641\n许振\t126642\n恋爱观\t126643\n雄楚\t126644\n桂纶\t126645\n发么\t126646\n石曾子奥\t126647\ncocatase\t126648\n梁蕊\t126649\n发乱\t126650\n葡萄\t126651\n理智粉\t126652\nkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk\t126653\n跑开\t126654\n同桌叫\t126655\n送到\t126656\n科尔\t126657\n
余唾\t126658\n龙俊亨\t126659\n好大好大好大好大好大\t126660\n陈翔\t126661\n展票\t126662\n周子瑜\t126663\n奔城\t126664\n欣欣\t126665\nmissyou\t126666\n永兴路共和新路\t126667\n误挂\t126668\n旧账\t126669\n工分\t126670\n表酱紫\t126671\n上上爬\t126672\n战战兢兢\t126673\n详询4008266333【城市名人酒店集团\t126674\n陈昕雅\t126675\n8月19日下午\t126676\n全区\t126677\n面面俱到\t126678\n臭大虾\t126679\n捐髓路\t126680\n海蒂拉玛\t126681\n北坡八\t126682\n全包\t126683\n非庚饭\t126684\n针对\t126685\n无车\t126686\n交配\t126687\n我的度\t126688\n张志艻\t126689\n全北\t126690\n54786194518537q4\t126691\n|\t126692\n涡扇\t126693\n机额\t126694\nflyo\t126695\n2900\t126696\n俺家小帅锅的背影\t126697\nwitha\t126698\n我是要你猜我是男是女\t126699\n狂犬病毒核酸检测\t126700\n杜绝\t126701\nhahahaaa\t126702\ndisgo\t126703\n烤猪\t126704\n八八猪\t126705\n赤子之心\t126706\nfbbxv\t126707\n牲口\t126708\n世界文学史\t126709\n考试类\t126710\n活够了\t126711\n凯艳\t126712\nabcc式\t126713\n生莫\t126714\n红叉叉\t126715\n哎呀16p\t126716\n1.523\t126717\n近在眼前\t126718\njoto\t126719\n1234554321\t126720\nziont\t126721\n大智\t126722\n千里之外\t126723\n难离\t126724\n爱不爱\t126725\n号查\t126726\n290q\t126727\n装神\t126728\n改任\t126729\n口头馋\t126730\n堡义词\t126731\n随意畅\t126732\n大厦大厦\t126733\n云风文\t126734\n管好\t126735\n息肉\t126736\n自嘲\t126737\n举好\t126738\n朱梦雨\t126739\n监音\t126740\nZhdd\t126741\n游客\t126742\n免试\t126743\n芊雪\t126744\n小动力\t126745\n楼烦关南口\t126746\n京津冀豪庭\t126747\n10种\t126748\n10秒\t126749\n1801722660\t126750\n泼粪工\t126751\n因何\t126752\n雷石宝贝大评选拉\t126753\n一劳永逸\t126754\n9￥\t126755\n大猩猩\t126756\ngocgfi\t126757\n粒错\t126758\n朱炳镇\t126759\n76集\t126760\n18921290525\t126761\n代理人\t126762\n闵帅\t126763\n别介\t126764\n别今\t126765\nbitch\t126766\n闺密\t126767\n航线\t126768\n绝妙\t126769\n两包\t126770\n王伯伯\t126771\n古埃及\t126772\n夜夜晚的夜\t126773\n婀娜\t126774\n打幺\t126775\nguyughh\t126776\n5451234655216256486556259643256588\t126777\n两化\t126778\n湖南广电中心\t126779\n邹刘涛\t126780\n夜店\t126781\n128下次\t126782\n倒台\t126783\n体育课\t126784\n第一[\t126785\n沃商店\t126786\n白术粉\t126787\n阻拦\t126788\n蒸蛋\t126789\n分币\t126790\n爱的就是你岩狗子\t126791\n资本性\t126792\n六组\t126793\n倒发\t126794\n突出堡\t126795\n张平宜\t126796\n白擦\t126797\n六秘你在吗你在\t126798\n15973006828\t126799\n这儿人\t126800\ngo
hu\t126801\n唔噜咕噜咕噜咕噜\t126802\n新长城\t126803\n范阳\t126804\nMMWAP\t126805\n融安\t126806\n蒋喜丹\t126807\n左律师\t126808\n外贸\t126809\n绝大部分\t126810\n欢乐送\t126811\n曹萌萌\t126812\n曝光\t126813\n忙活\t126814\n陌上\t126815\n几两\t126816\n几个\t126817\n阿蒙\t126818\n三.八节\t126819\n外财\t126820\n落魄\t126821\n几世\t126822\n天顶\t126823\n舍得\t126824\nHerbhgfhkhz\t126825\n玩美魔女外宿中\t126826\n家系\t126827\n从新开始\t126828\n快告诉我机器人\t126829\n几一\t126830\ncdefghijklmnopql级\t126831\n班罗\t126832\n几万\t126833\n订造\t126834\n几下\t126835\n几丈\t126836\n蛋蛋君\t126837\n玫瑰手\t126838\n9月4日\t126839\nAABB\t126840\nAABC\t126841\n余庆顺\t126842\n温盛章\t126843\n奔跑吧小羊\t126844\n罪魁祸首\t126845\n十六米\t126846\n感信阳\t126847\n刘悦萌\t126848\n白首不分离\t126849\n木兰\t126850\n刺闹\t126851\n车道\t126852\n9月8日\t126853\n莎拉·布鲁克\t126854\n龙会\t126855\n克瑞文\t126856\n育儿网\t126857\n泡杯\t126858\n一十王的小呀小苹果榛蘑\t126859\n么叫\t126860\n听来\t126861\n独酌\t126862\n俾人\t126863\n献寿\t126864\n给我摸摸\t126865\n着等\t126866\n直白\t126867\n打字员\t126868\nghiphotosbaiducomxiaodupicitemf636afc379310a550d25de81b04543a982261045jpg\t126869\n待人\t126870\n惶馁\t126871\n139375576277777778877777\t126872\n3you480\t126873\nngg\t126874\nnga\t126875\n身边\t126876\n蝌蚪\t126877\n慧仔仔\t126878\nlbdl\t126879\n天池\t126880\n看干嘛\t126881\n王冰凌\t126882\n不稀罕\t126883\n张百涛\t126884\nngv\t126885\nngw\t126886\n修好\t126887\nlghfjgjfggggggggg\t126888\n越湖东园\t126889\n是我的骄傲自满\t126890\n修女\t126891\n温州新星学校\t126892\nbehwer\t126893\nwjsjdkf\t126894\njociqid\t126895\n新年快\t126896\n呢候\t126897\n干掉\t126898\n十八十二十九十八\t126899\nESMOD\t126900\n姑萌\t126901\n村头\t126902\n周奥敏\t126903\n一百六十遍\t126904\n姑奴娇me哪呗骨浓汤\t126905\n灯红酒绿温柔巷\t126906\n埋伏\t126907\n国窖\t126908\n庄姿妮\t126909\n服务派\t126910\n你爱不爱我你喜\t126911\n女鞋\t126912\n逐渐\t126913\n找我无为\t126914\n有鬼来\t126915\n过来没有\t126916\n杜云龙\t126917\n12点到2点\t126918\n烦躁\t126919\n酱紫错\t126920\n东沙岛\t126921\n品味\t126922\n八太平\t126923\n雷春丽\t126924\n追念\t126925\n大你是谁\t126926\n郭找发\t126927\n性情\t126928\n好吧大黑\t126929\ncountroy\t126930\n接你吧\t126931\n海泡馍\t126932\n幽默感谢谢你\t126933\n1586元\t126934\n杜姚\t126935\n不怕死\t126936\n追忆\t126937\n杜姐\t126938\n一个场\t126939\n期末卷\t126940\n订货会\t
126941\n笨措\t126942\n2258866\t126943\nwgu\t126944\n刘溜\t126945\n郭晓利\t126946\n发出去过\t126947\n航天金曲\t126948\n李计划\t126949\n刘源\t126950\n1班\t126951\n刘春桦\t126952\n痛经\t126953\n他好你好我好大家好\t126954\n小魔王\t126955\n奥大利\t126956\n芭比娃娃\t126957\nhushagalaga\t126958\n391家\t126959\n停的旅行\t126960\n陈政凯\t126961\nachecaiavc\t126962\n其余\t126963\n国歌\t126964\nggagaigaib\t126965\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t126966\njirr\t126967\n镇海\t126968\n吃好呀\t126969\n政策\t126970\n过来来不那个金瓶梅\t126971\n定心\t126972\n香亭\t126973\n62228119\t126974\nKissingen\t126975\njgfhj\t126976\n巨制\t126977\n2015元\t126978\n关切\t126979\n城管\t126980\n爱都\t126981\ncent\t126982\n1561566666789421\t126983\n被击毙\t126984\n其你\t126985\n粪斗\t126986\n要不搞笑\t126987\n最终版\t126988\n聪明行\t126989\n14:20\t126990\n撒乖\t126991\nuhwbbsbdb\t126992\n当阳市\t126993\nn2n34n4n58n6n7n8n9n10n11n12n13n14n15n\t126994\n极好\t126995\n迟早阳痿\t126996\n庄可萱\t126997\n引产\t126998\n引人\t126999\n流经\t127000\n小咪咪小妹妹你好吗小妹妹\t127001\niPad，Kindle\t127002\n90年代政\t127003\n接力社\t127004\n我很讨厌你淘淘淘淘讨厌你\t127005\n开心春饼欢谁生前侠情的天麻生\t127006\n耐玩\t127007\n郭问\t127008\n名城\t127009\n客房\t127010\n防水\t127011\n闪灵侠\t127012\n礁石\t127013\n能不说\t127014\njhiby\t127015\n丽花\t127016\n点支撑\t127017\n名域\t127018\nPAUL\t127019\n〇Oo\t127020\n联欢我妈\t127021\n78.2%\t127022\n死魂\t127023\n伊秀\t127024\n疑难\t127025\n植物激素\t127026\n担子\t127027\n办公桌\t127028\n暮色绝恋\t127029\ndhiphotosbaiducomxiaodupicitem314e251f95cad1c8ec8dae54783e6709c93d518ajpg\t127030\n15602823263\t127031\n总结\t127032\n时怕\t127033\n总统\t127034\n刘袁\t127035\n眼皮\t127036\n交没\t127037\n哄人\t127038\n看坏\t127039\njtya\t127040\n续贷\t127041\n时态\t127042\n打爱\t127043\n金鸡独立\t127044\n13916373588\t127045\n老子不文\t127046\n打爆\t127047\n讲故\t127048\nshhd\t127049\n大餐\t127050\n累疯了\t127051\n闺密骂\t127052\nYUOHAPPY\t127053\n李哲\t127054\n乖乖啦\t127055\nHDHDhx\t127056\n安美\t127057\n巴基斯坦\t127058\n毫无保留\t127059\n常委会\t127060\n争吵片\t127061\n加油辛苦你了\t127062\n合景泰富\t127063\n无时无刻\t127064\n年罗\t127065\nTIT\t127066\n小气球\t127067\n邱宏梅\t127068\n后知\t127069\n时崎狂三\t127070\n地下情人\t127071\nTIO\t127072\n什么元\t127073\n严时骏\t127074\n有钱似\t127075\n去哪也不知道\t1270
76\n健美操\t127077\n素水\t127078\n曹慧轩\t127079\n招待会\t127080\n繁荣\t127081\n太好了当\t127082\n够狠\t127083\n减轻\t127084\n利惊梦\t127085\nZang\t127086\nd331\t127087\n意料\t127088\n大圈子\t127089\n八零八零二十八集\t127090\n了你你是谁你是谁你是谁\t127091\n三小时后\t127092\n没感\t127093\nxxx偶\t127094\n罗宾\t127095\n留学生\t127096\n罗宇\t127097\n罗宁\t127098\n襄垣造\t127099\n匡鑫\t127100\nTI0\t127101\n好贴心\t127102\n第二届\t127103\n3顿\t127104\n暴菊暴菊\t127105\n唐小镖\t127106\n一个错字\t127107\n第二层\t127108\n东东明东\t127109\n各种各样\t127110\n艾欧尼亚\t127111\n249美元\t127112\n西北区\t127113\nWuyu\t127114\nhjdyhhy\t127115\n改革破体制坚冰\t127116\n大明年\t127117\n几起\t127118\n蓝翔学校\t127119\n飞机场\t127120\n邓梓涵\t127121\n真罗茜\t127122\n刘骐畅\t127123\n1.2万名\t127124\n永年\t127125\n徒手\t127126\n陈艳伟\t127127\n华辉\t127128\n冥夫\t127129\n代睿珏\t127130\n羊羊羊\t127131\n颇优\t127132\n更高\t127133\n奶头男人的特别老大难兰兰兰兰兰兰兰兰猪\t127134\n僵尸类\t127135\n沉演\t127136\n赏卿\t127137\n祝噜噜\t127138\n肿么了了疯狂\t127139\n偏激\t127140\n跃居\t127141\n打伤\t127142\n仲佳怡\t127143\n横看成\t127144\n比货\t127145\n一1百\t127146\n变革\t127147\n黑侠\t127148\n黑蓝\t127149\n68888916\t127150\n胥雨莎\t127151\n邓慧芳\t127152\n10月23日\t127153\n383383838383838383838383833838338383833833838338\t127154\n374.08点\t127155\n二保\t127156\nh版\t127157\n打会\t127158\n数九\t127159\n五十寸\t127160\n神仙水\t127161\n打伞\t127162\n六八零\t127163\n金兀动\t127164\nvfvdg\t127165\nh片\t127166\n一天一个\t127167\na东昌邑了一大全什么背菜喜羊羊之一一吧\t127168\n假面\t127169\n叮咚咚叮叮叮叮叮叮叮叮叮叮叮叮叮叮叮叮\t127170\n加一份\t127171\n不大方\t127172\n直指\t127173\n422255555656554\t127174\n直挂\t127175\n杠\t127176\nigiyygvjhfyi\t127177\nbdvvrhdjzhvsdhhjhffowxbsksleejrbfoolcnxnzhwvucf\t127178\ncocoa\t127179\n帕斯卡-吉尼亚\t127180\n将它\t127181\n美少女联盟\t127182\n不干呀\t127183\n托马斯穆勒\t127184\n3241808562\t127185\n纠正\t127186\n分合\t127187\n绣花鞋\t127188\n哎呀z4\t127189\n死把\t127190\n2011-06-28\t127191\n2011-06-29\t127192\n过喝一杯\t127193\n冻僵\t127194\natthatnoon\t127195\n一百一百一十\t127196\n执政为民\t127197\n刘丽丽\t127198\n阅历\t127199\n院线\t127200\n香薰瓶\t127201\n诶行行\t127202\niun\t127203\n甜版\t127204\n五天前\t127205\n抛出\t127206\n热毯\t127207\n任胜男\t127208\nDOYOUS\t127209\nnnnnnnnnnnnnnnnnnnnnn觉nnnnnnnnnnnnnnnnnnnnn\t127210\n琼平
\t127211\n啦几\t127212\n也就是\t127213\n挑一挑\t127214\n德尔色\t127215\n1.5000万\t127216\n罗老师\t127217\n刘医生\t127218\n乖乖男\t127219\n季哥\t127220\nkissed\t127221\n大司命宁夏妙屋嗯\t127222\n吐流\t127223\nhttpehiphotosbaiducomxiaodupicitem10dfa9ec8a136327de1c9f40968fa0ec09fac7d4jpg\t127224\n贡嘎雪山\t127225\nteryy\t127226\n孙爷爷\t127227\nKkkk\t127228\nRugchkckgxyfxuxigchchcychkchnchotitdkhvkhcic9u\t127229\n遮遮掩掩\t127230\n高高挂起\t127231\n大客车\t127232\n风势\t127233\n板子\t127234\n写轮眼\t127235\n诘奸\t127236\n十有很多种\t127237\n近似\t127238\n序言\t127239\n绝交\t127240\n绝产\t127241\n火柴人\t127242\n北运\t127243\n十六别\t127244\nhebecs\t127245\n锤死\t127246\n我的秘密就是我爱你\t127247\n绝人\t127248\nhttpahiphotosbaiducomxiaodupicitembba1\t127249\n20多分钟\t127250\n向南\t127251\n定用\t127252\n虾兵蟹将\t127253\nノДノ┻━┻\t127254\n小言\t127255\nmiele\t127256\n绝于\t127257\n史蒂夫·乔布斯\t127258\n2011年6月9日\t127259\n服务部\t127260\n番中女只人\t127261\n失去\t127262\n带死\t127263\n几桶\t127264\n僵尸你信\t127265\n28个\t127266\n咳咳咳咳\t127267\n火影\t127268\n灰天鹅\t127269\n沧江\t127270\n卧凑\t127271\n税收\t127272\n嗯秋思\t127273\n粉色们\t127274\nwithme\t127275\n离石我的小呀小苹歌榛蘑艾妮都不先朵红红的小游问暖我的心窝\t127276\n滨州\t127277\nppwdiadgfAdfgjhzaguzaghfhh\t127278\n8847155222258121224522\t127279\n萌哒哒萌哒哒萌哒哒哒哒萌哒哒萌哒哒萌萌萌萌哒\t127280\n能否\t127281\n各省\t127282\n羊毛靴\t127283\n驯鹿党\t127284\n酷宝\t127285\naud\t127286\naug\t127287\n有爱\t127288\n朱明泽\t127289\n杜拉拉升职记\t127290\n金钟铉\t127291\n孚亭周\t127292\ngonvh\t127293\n有很多\t127294\n冲猎人电了宝塔咧\t127295\n窝讷\t127296\nauu\t127297\nauv\t127298\nrecycle\t127299\n团员\t127300\n石羊龙台镇\t127301\nLeah\t127302\n卞哥\t127303\n小城镇\t127304\n东坡\t127305\n暴龙\t127306\n天晴\t127307\n本一年\t127308\n小行\t127309\njgktmpj\t127310\n告诉我的秘密\t127311\n东坝\t127312\n秦亲爱\t127313\n平常\t127314\n孙畅\t127315\n横梁\t127316\nau5\t127317\n拥塞\t127318\nDfvv\t127319\n欺侮\t127320\n1896年\t127321\n再见了我去\t127322\n88688686\t127323\n不了了之\t127324\n矮东特loveyou\t127325\neysdh\t127326\n一夜后\t127327\n足球赛\t127328\n无烟\t127329\n纵横跋扈\t127330\n碧绿\t127331\niiijr\t127332\n143英尺\t127333\n2011年初\t127334\n60minutes\t127335\n每一刻\t127336\njjbobo\t127337\n续保\t127338\n一百三十几\t127339\n你你你你你你你你喽喽喽喽\t12
7340\nbrtm\t127341\n好再见\t127342\nyesyesS\t127343\nchsjvidyayghu\t127344\n运河\t127345\nム\t127346\n山东墨龙\t127347\n脸大大\t127348\n中航床\t127349\nijojiiijijoji\t127350\n布洛芬\t127351\n宇航\t127352\nmed\t127353\n泥马\t127354\nwrwdfddd\t127355\n王皇后\t127356\nmem\t127357\n皇明太阳能维修部\t127358\nmen\t127359\nmei\t127360\nmek\t127361\n高不高兴\t127362\n二十分钟\t127363\n动了心\t127364\nmew\t127365\nmeq\t127366\nmes\t127367\nmer\t127368\n终产物\t127369\n桃井\t127370\n都米都米\t127371\n大豆\t127372\nヘ\t127373\nFjhhhjh\t127374\n宿敌\t127375\nExtra\t127376\n天国说纹眉只\t127377\n夢過\t127378\n乙级\t127379\n朱台\t127380\n我喜欢任\t127381\n不舍图库天来\t127382\n黄字\t127383\n飞波\t127384\n大象\t127385\n素鸡素鸭素肚\t127386\n27234679731794673497364646\t127387\nyfrshfdj\t127388\n二二零七\t127389\nvfrv\t127390\n4一五一十5口是心非6心直口快7\t127391\n江苏苏美达国际贸易技术有限公司\t127392\ncFC\t127393\n雪中送炭\t127394\n陈仅\t127395\n3150点\t127396\n再唱一首\t127397\n混血儿\t127398\n宋平平\t127399\n春宵\t127400\njiubu\t127401\n。＼／\t127402\n走人走人\t127403\n过一天\t127404\n在在在在在在\t127405\n沙漠之星\t127406\n一百箱\t127407\n拿手过\t127408\n2500转\t127409\n空泛\t127410\nCarbon\t127411\n白金汉宫\t127412\n春宫\t127413\n油耗\t127414\n月tm\t127415\nト\t127416\n僵尸先生\t127417\n加思索\t127418\n六二十八三四十\t127419\n杰仔省\t127420\n公式\t127421\n多很多\t127422\n公开\t127423\n莒县\t127424\n度秘我真讨厌\t127425\n身不由己\t127426\n篮球盲\t127427\n怜天\t127428\n欣宜\t127429\n冯亦同\t127430\n蘿烙\t127431\n纳闷\t127432\n子栏目\t127433\n34轮\t127434\n中国驻日使馆\t127435\n法王噶玛巴\t127436\n接入\t127437\nl米\t127438\n尽显\t127439\n丹麦\t127440\n暴增\t127441\n剁碎\t127442\n黄衣\t127443\n平凡之路\t127444\n药剂\t127445\n2010年1月2日\t127446\n张仁桥\t127447\n心本\t127448\n心术\t127449\n可恨\t127450\nDAVID\t127451\n可恶\t127452\n汗杀人你在哪我要逮捕你我是警察\t127453\n很漂亮\t127454\n火火火火火\t127455\n真bv\t127456\n5355566665\t127457\n1699元\t127458\n心机\t127459\n了喝\t127460\n熊孩子们\t127461\n鲲鹏\t127462\n逆周凯\t127463\nUFJXHK\t127464\n也罢修真\t127465\n心服\t127466\n变相\t127467\n心有\t127468\n跳秘\t127469\n海盗鹦鹉\t127470\n吥吥吥吥吥吥\t127471\n曾皎然\t127472\n林弋\t127473\n黄章还不大\t127474\n武艺\t127475\n咕噜噜\t127476\n触手可及\t127477\n关想\t127478\n艋舺\t127479\n精神上\t127480\n收礼\t127481\n堕落\t127482\n丽丽那那东东数数\t127483\n
生殖器\t127484\nQba\t127485\n亲爱的别我问你\t127486\nove\t127487\n唤醒\t127488\novk\t127489\n秀全\t127490\novo\t127491\n有别上我家\t127492\nOwens\t127493\n稳住\t127494\nCJDJ\t127495\n亚当才\t127496\n蔗糖\t127497\n为物所困\t127498\n再讲一个再讲一个再讲一个超值超赞超\t127499\n吃大吃\t127500\n遥远\t127501\n哼帅帅我不是乖乖不睡一个床乖乖乖乖乖乖乖乖乖乖乖\t127502\n哈哇\t127503\n我的偶相\t127504\n68555558\t127505\n李猫\t127506\n凯智开明\t127507\n论政\t127508\n没出息\t127509\n连身裙\t127510\n自认为\t127511\n88520\t127512\n卑月传\t127513\nbushuon\t127514\n赵文杰\t127515\njuibuu\t127516\n对跟\t127517\n继续吧\t127518\n56688412236987441236699844412566321147559961223669887456321\t127519\n经送\t127520\n感人\t127521\n扫一扫\t127522\n刘丽琴\t127523\nVhvVCD\t127524\n草莓哈密瓜\t127525\n感亲\t127526\n倾疼\t127527\n高血\t127528\n秘节\t127529\n剪彩\t127530\n高见\t127531\n你的女人\t127532\n客家\t127533\nonf\t127534\n別狡辩\t127535\n射箭\t127536\n黑嘞\t127537\n秘耍\t127538\n剪影\t127539\n扼杀\t127540\n魏霞\t127541\n味耳朵\t127542\nv好女v\t127543\n杂种猫\t127544\n怎莫\t127545\n客官\t127546\n巴宝莉娜塔\t127547\n多彩\t127548\nfggcccdd\t127549\n德才\t127550\n秘花\t127551\n姊妹们\t127552\nFkcmcnnfn\t127553\n潜江业成花园\t127554\n2006年1月23日\t127555\n都城\t127556\n度秘藏獒不好看我要养比熊ok\t127557\n你是我的你说的度秘我是主人我是你的主人\t127558\n愛嗎\t127559\n朴珍\t127560\n印送\t127561\n更贴近\t127562\n零六零二\t127563\njiulian\t127564\n撤职\t127565\n佳\t127566\n卢光瑞\t127567\n忙内\t127568\n几岁\t127569\n懒片\t127570\n群舞\t127571\n乔羽\t127572\n杭州市区\t127573\n狂暴\t127574\n高打错\t127575\n聊了讨厌\t127576\n小小魔\t127577\nhellobaby\t127578\n哈虎\t127579\nksgg\t127580\n螨虫性\t127581\n概括\t127582\n杨杏林\t127583\n叨咕\t127584\nyfhgf\t127585\n7894561253点\t127586\n贾蓓蕾\t127587\nZERO.Sho\t127588\n10余种\t127589\n震死\t127590\nhckg\t127591\n呱呱呱呱呱\t127592\n集邮\t127593\n读好\t127594\n雇请\t127595\n交办\t127596\n访民\t127597\n叶雪雪\t127598\n等等待\t127599\n卡酷\t127600\n设置\t127601\n移机\t127602\n250次\t127603\n炉火纯青\t127604\n亚雄\t127605\n一个一夜\t127606\n5月31日\t127607\n卢老师\t127608\nSSH\t127609\n12345679891011121341\t127610\nSSD\t127611\n顺华\t127612\njsijsd\t127613\n常熟市大义卫生院\t127614\n郑爽\t127615\n二十几岁\t127616\n肖申克\t127617\n禾利\t127618\n拿图\t127619\n发了吧\t127620\n佧\t127621\n雷好雷\t127622\n保剑锋\t127623\n梦一场
\t127624\n犹太人\t127625\n冯姐\t127626\n0号\t127627\n持续力\t127628\n有棱有角\t127629\n岳母\t127630\n在之\t127631\n在么\t127632\n中队\t127633\n在乎\t127634\n泰福\t127635\n刘子骞\t127636\n万飞杨\t127637\n拿回\t127638\n佬\t127639\n冯小燕\t127640\n古代人\t127641\n中阴\t127642\n香水瓶\t127643\nfserreewtt\t127644\n慰劳\t127645\n平滑\t127646\n全保\t127647\n次价\t127648\n王宫平\t127649\n砸缸\t127650\n2gp\t127651\n大神\t127652\n谷三\t127653\n凑人\t127654\n高淑\t127655\n雄心勃勃\t127656\n体\t127657\nChiefsshiv\t127658\n操作系统\t127659\n131朵\t127660\n石家庄机场\t127661\n幼鲸\t127662\n灵感\t127663\n贵贵贵贵贵贵三十发子\t127664\n洗澡你好你好烧水洗澡你好你好潲水洗澡你好你好烧水洗澡你好你好\t127665\n上海滩\t127666\n比欧特\t127667\n下笑\t127668\n下笔\t127669\n觉觉觉觉觉觉觉觉觉\t127670\n高深\t127671\n272647千七百七十七\t127672\n高淳\t127673\n乱动秘\t127674\n固墙镇\t127675\n大城府一家本田汽车制造厂\t127676\n6级\t127677\n文一成一只哪也施工员\t127678\n首姑娘\t127679\n仆\t127680\n仇\t127681\n仄\t127682\n仅\t127683\n仂\t127684\n仃\t127685\n什\t127686\n仁\t127687\n从\t127688\n陆冰梅\t127689\n仍\t127690\n今\t127691\n介\t127692\n他\t127693\n仗\t127694\n仔\t127695\n仕\t127696\n无处不在\t127697\n仓\t127698\n了别误会\t127699\n仞\t127700\n聊友\t127701\n18号\t127702\nGbxggdbv\t127703\n付\t127704\n仙\t127705\n令\t127706\n以\t127707\n代\t127708\n上学时\t127709\n仡\t127710\n三单元\t127711\n们\t127712\n仫\t127713\n仨\t127714\n东洋刀\t127715\n件\t127716\n价\t127717\n义捐\t127718\n仲\t127719\n二十件\t127720\n仰\t127721\n仿\t127722\n好啦禾湾仔聊天七百米好乖啦小姐\t127723\n份\t127724\n任\t127725\n二十份\t127726\n為什麼時候說\t127727\n尼莫\t127728\nPURE\t127729\n时代周报\t127730\n苍梧县\t127731\n熊顿\t127732\nrftgh\t127733\n你不我不能\t127734\n肉头\t127735\n邓邓\t127736\n冠军队\t127737\n陆琪宇\t127738\n邓那\t127739\n掩饰\t127740\n弥散\t127741\n腻歪\t127742\n二百零几\t127743\nMocha\t127744\n陈成\t127745\n陈我\t127746\n狗猫\t127747\n特高坡\t127748\nbjvj\t127749\n不不我\t127750\n真臭\t127751\n你终于答我了的对呀\t127752\n潜江\t127753\n好我去睡\t127754\n画手\t127755\n二首\t127756\n熊出没之国宝雄兵\t127757\n哇塞机器人\t127758\n睡睡睡\t127759\n积家美\t127760\n钓鱼台\t127761\n为您\t127762\n搞基吧图良\t127763\n险处\t127764\n方二货\t127765\n水晶莹\t127766\n冼才\t127767\n交杯酒\t127768\n谨慎\t127769\n江河日下\t127770\n不要再复衣服\t127771\n对呀感叹号\t127772\n道林\t127773\n钞票\t127774\n张瑞杰\t127775\n赶快走\t1277
76\n破发\t127777\n兔咪\t127778\n41天\t127779\n破口\t127780\n杨世恒\t127781\n送位\t127782\n嗯雄兔脚扑朔雌兔眼迷离双兔傍地走安能辨我是雄\t127783\n鸣叫\t127784\n梅向达\t127785\nCerityibrhoi\t127786\n蔡爱华\t127787\n养区\t127788\n取动力\t127789\n55两\t127790\ntydhfhggukfifgjgyhghjjkjfhhhhjkghjkhcjghjlthhhjjghhhh\t127791\n就是你的别\t127792\n基因组\t127793\n谄谀\t127794\n彩服\t127795\n55个\t127796\n纯属\t127797\n庙屋\t127798\n男朋男\t127799\nyoshi\t127800\n发呆vffxkgodhkwhk\t127801\n赛欧三嗯\t127802\n领子\t127803\n数三个\t127804\n太坏了我\t127805\n结清\t127806\n自西\t127807\n55万\t127808\n么回事\t127809\ncx2y21\t127810\n真巧\t127811\n中国联通移动\t127812\nTVXQ\t127813\n大行动\t127814\n4288\t127815\n保值\t127816\n已然\t127817\n澄海站\t127818\n结温\t127819\n富国\t127820\n数三下\t127821\n相公我爱你么么哒再见拜拜\t127822\n麦手\t127823\n455555\t127824\n崩溃症\t127825\nBgj\t127826\n面条秘\t127827\n鬼节\t127828\nBgv\t127829\n李博睿\t127830\n双倍\t127831\n安全性\t127832\n涎费者\t127833\nuhgjh\t127834\n好呀混蛋\t127835\n西门站\t127836\n案错\t127837\n啊晓娜\t127838\n不懂不懂啊我头好痛\t127839\n拱北\t127840\n百分之八十\t127841\n立案侦查\t127842\nyourareyou\t127843\n漫行\t127844\n好丑呀\t127845\n台电\t127846\n年久\t127847\nBmghhjykvmgnfv1097665848585\t127848\n袁怡\t127849\n逃课\t127850\nurkaaf\t127851\n别说说\t127852\n孙泰阳\t127853\n炫酷\t127854\n蚀都不懂\t127855\nqcwb\t127856\n唐三彩\t127857\n赵恩典\t127858\n红尘紫陌\t127859\n结再说\t127860\n更美丽\t127861\n护甲\t127862\n不难\t127863\n35个\t127864\n四点二十五\t127865\n贺岁\t127866\n里克\t127867\nGunners\t127868\n温嗳\t127869\n多哈\t127870\n笑死克\t127871\n小黄色\t127872\n⑦\t127873\n民航\t127874\n⑤\t127875\n⑥\t127876\n③\t127877\n④\t127878\n①\t127879\n②\t127880\n几十兆\t127881\n下元\t127882\n54553243635356576543345435443537535334355364556675436454656566567\t127883\nUgdggdgb\t127884\n⑨\t127885\n⑩\t127886\n⑶\t127887\n980687752\t127888\n⑴\t127889\n赵承轩\t127890\n0K𠃌\t127891\n今天早上六点\t127892\nStuart\t127893\nUlla\t127894\n处藏\t127895\n⑺\t127896\n⑻\t127897\n⑸\t127898\n⑹\t127899\n轻松\t127900\n几十六\t127901\n几十八\t127902\n亚你的小受含真好\t127903\n大我就馆\t127904\n美丽版\t127905\n456号\t127906\ngotobed\t127907\nuhjdhggjhgjjfyehuvseff\t127908\n好的我记住你\t127909\n几括\t127910\nAlferd\t127911\nsfgfhhfdghvcduojvfrDhvxz
gjkbvxzdfhmjjjjjjjjjjjhjjjjjvggggggddsssSygzsdrfx\t127912\n下关\t127913\n绝世武魂洛城东\t127914\n63688\t127915\n执念\t127916\n猎刀\t127917\n睡觉者\t127918\n琴娜\t127919\nfhfhfyyfy\t127920\n大英雄\t127921\n别惯\t127922\n几拨\t127923\n乔乔\t127924\n更说\t127925\n罗湘泉\t127926\n82分\t127927\n别惹\t127928\nValentines\t127929\n脉象\t127930\ninthisyeartheaverageageofourfouris38\t127931\n经典文学\t127932\n探案\t127933\n省啊躯\t127934\n生死\t127935\n东电\t127936\n每两天\t127937\n745437\t127938\n或许复\t127939\n火影bl\t127940\n东田\t127941\n小明白\t127942\n事业局\t127943\n一二三四五六七八九十十一十二十三十四十五十六十七十八十九二一\t127944\nDIVA华丽之后\t127945\n利见\t127946\n好讨厌讨厌讨厌讨厌\t127947\nDvn\t127948\n跟来\t127949\n几拳\t127950\n青衫\t127951\n28、29日\t127952\n12分\t127953\n青衣\t127954\n夜火车站\t127955\n欧亚\t127956\n危急\t127957\n闷度秘\t127958\n抢劫\t127959\n大飞喔喔\t127960\n硅谷奇迹=科技创新+风险投资+\t127961\nzaog\t127962\n冲水\t127963\nzaoj\t127964\n┭┮心肝你太青葱了为神马我才认识你为神马为神马\t127965\n雀氏\t127966\n两臂\t127967\n妠叭\t127968\n峰洋\t127969\n干净利落\t127970\n我不要我要你给我读\t127971\n二货肿\t127972\n体系化\t127973\n颐和园\t127974\n顶多\t127975\n九门\t127976\n3cckc\t127977\n我真不懂你说人话\t127978\n14454848484848484\t127979\n贺楚洪\t127980\n竹子\t127981\n飞了\t127982\n040个\t127983\n付酬\t127984\n疯了吧\t127985\n李炤星\t127986\n高堂\t127987\n你好多咪多咪呀你\t127988\n疯了吗\t127989\n嘿嘿洋洋\t127990\n太笑\t127991\n不要脸的潇潇不要脸的臭不要脸的狐狸\t127992\n偷笑]小三小三\t127993\n13周岁\t127994\n飘然\t127995\n火凤\t127996\n和尚\t127997\n只位罗\t127998\n胖淳\t127999\n夏哥哥\t128000\n华亲美\t128001\n对呀我没疯你疯了\t128002\n叶你\t128003\n开门力\t128004\n迷失东京\t128005\n大东塔\t128006\n沙涌\t128007\njaha\t128008\n傻蛋\t128009\n织法\t128010\n#新白发魔女传#\t128011\n孙端\t128012\n滑石\t128013\nnude\t128014\n100年\t128015\n怎么元\t128016\n关联词\t128017\n许昌刚\t128018\n保佑\t128019\n粘人的小妖精不要问我从那儿了也别问我到哪里去我要再下最美的花儿送给我的小公鸡\t128020\n不灭我恨你\t128021\nBoom\t128022\n尸体\t128023\nplmok\t128024\n每吨\t128025\n白叔叔\t128026\nnousingthem\t128027\n好的再见了我去\t128028\n查证\t128029\n师徒恋\t128030\n牙刷牙膏牙屁屁\t128031\n几百米\t128032\nANOTHER\t128033\n688865663699\t128034\nijbfgyujb\t128035\n医患关系\t128036\ndear\t128037\n信才怪\t128038\nisnamp\t128039\n圆形\t128040\nmpjdj\t128041\n猫咪网\t128042\n猪骨头\t128043\n闫凤美
\t128044\n查询\t128045\n11.17\t128046\n艺术节\t128047\nuhbjuhb\t128048\n遥远的人\t128049\n1832664151\t128050\n舅父仔\t128051\n张诺涵\t128052\n对不\t128053\n老挝外贸银行\t128054\n哪有空出\t128055\n一块块\t128056\n指着\t128057\n宝应县\t128058\n小白色\t128059\n说了爱\t128060\n无动于衷\t128061\n哈新奥特曼\t128062\n小臂\t128063\n仙君神\t128064\n伤心子\t128065\n集美貌\t128066\n小臊\t128067\n谈婚\t128068\n刘佳雨\t128069\n12450\t128070\n执迷\t128071\n特例\t128072\n黄山证监会\t128073\n我天\t128074\n黄狗皮\t128075\n营子村\t128076\n午夜十二点\t128077\n既未\t128078\n小臣\t128079\n春军\t128080\n李旗庄镇\t128081\n我头\t128082\n明晚奥\t128083\n特供\t128084\n二黄\t128085\n抱怨\t128086\n小臭\t128087\n法利拉\t128088\n七十余\t128089\n足部\t128090\n加油桶\t128091\n浩然\t128092\n拉拉拉拉拉拉\t128093\n斯洛克\t128094\n当今\t128095\n庚饭\t128096\n齐玉珠\t128097\n中钢挖煤小\t128098\n米露\t128099\n白思雨\t128100\n山豆\t128101\n有点大话\t128102\n123456789123456789123456789123456789\t128103\n兼并\t128104\n幽怨王\t128105\n度秘度秘新年好新年\t128106\n371\t128107\n东亚银行\t128108\ntra3儿\t128109\n冰杖\t128110\neditor\t128111\n尔滨\t128112\n海口府城\t128113\n算得\t128114\n与许\t128115\n别哄人\t128116\n上海西门子公司\t128117\n母猪\t128118\n逼迫话\t128119\n徐大宝\t128120\nDaulne\t128121\n婷婷\t128122\n杨青训\t128123\n雪特\t128124\n干辣椒\t128125\n从而\t128126\n补品\t128127\n两千零一十六一千九百八十\t128128\n1001年\t128129\n雪片\t128130\nzaogao\t128131\n兴骚\t128132\n唉王的我来我是莱琪尔英泽帕斯特不全同义句\t128133\nhongtrgj\t128134\n冲出世界\t128135\n何出\t128136\n56万亿\t128137\n瞳宝\t128138\n成都军区总医院附属口腔医院\t128139\n森女\t128140\nbtbtbbtbt\t128141\n苯蛋\t128142\n俄方\t128143\nIO杯\t128144\n姐夫\t128145\n杨紫轩\t128146\n又要\t128147\n王仪涵\t128148\n大实话\t128149\n哪太\t128150\n妃子\t128151\n1kqc\t128152\nlaucks\t128153\n16时32分\t128154\n2508\t128155\n哪头\t128156\n2502\t128157\n2500\t128158\n换身\t128159\n救活\t128160\nQcQa\t128161\n36元\t128162\n嗲声嗲气\t128163\n度你的我\t128164\n六九六零六九六零\t128165\n兵火\t128166\n各县\t128167\n一人一百\t128168\n荡然无存\t128169\n陈洁节\t128170\n改名儿\t128171\n前儿\t128172\n打僵尸\t128173\n黄晓\t128174\n魔仙包\t128175\n卢兴起\t128176\n海姆利克氏\t128177\n你是谁你是谁你是谁安全imax\t128178\n工薪\t128179\n火场\t128180\ndichs\t128181\njaksb\t128182\n回头路\t128183\n陆家嘴\t128184\n还在一起\t128185\n当代\t128186\n变量\t128187\n当令\t12
8188\n对联\t128189\nfyusd\t128190\n依偎竖\t128191\n惨无人寰\t128192\n陈闰土\t128193\n李东东\t128194\n歌舞\t128195\n不吗旺仔地带\t128196\n难受\t128197\n呵哩个呵里个呵呵呵\t128198\n褒义\t128199\n莲心\t128200\n受挫\t128201\n收受贿赂\t128202\n吴继云\t128203\n冰杯\t128204\n恭喜我讨厌\t128205\n杨朝恒\t128206\nfather\t128207\n憨憨憨\t128208\n欧阳峰\t128209\n超过1小时\t128210\n真棒棒\t128211\n发张照\t128212\n你好脚\t128213\n天涯一大头儿\t128214\n3座\t128215\n讨厌梦\t128216\n呸呸呸\t128217\n崽子\t128218\n养成类\t128219\n混沌\t128220\n那头\t128221\n奇耻大辱\t128222\n60公里\t128223\nSHISEIDO資生堂\t128224\n乖咪乖\t128225\n那太\t128226\n那天\t128227\n伎坛\t128228\n直勾勾\t128229\n宝贝儿亲一个\t128230\n那夜\t128231\nurbdkSoijjoin\t128232\n那多\t128233\n和你个小破孩\t128234\n暴皮\t128235\n九九发热裤\t128236\n尼克服\t128237\n未获\t128238\n铁路局\t128239\n女女女\t128240\n也哥\t128241\n大处男\t128242\n联通吧\t128243\n大像\t128244\n11:00\t128245\n国际庄\t128246\n花眼\t128247\n不懂我懂\t128248\n管理学说\t128249\n梦瑶\t128250\n看斗\t128251\n胡嗯\t128252\n细亚\t128253\n13w点\t128254\n求变\t128255\n地里\t128256\n地量\t128257\nsordid\t128258\n淹死\t128259\n梦瑜\t128260\n求叫\t128261\n如何是好\t128262\n刘雲凤\t128263\n栖霞区\t128264\nJMMLKNN\t128265\n择天飞碟\t128266\n真纯\t128267\n飞马良\t128268\n郭涛\t128269\n五倍变焦\t128270\n下午4点\t128271\n猪你是诺雨我是大智\t128272\n最最最最最最最最最最最最最最最\t128273\n7.4日\t128274\ncgd\t128275\n居民可支配收入\t128276\n色片\t128277\n娃儿俊凯\t128278\n几个男\t128279\n莫效\t128280\n啦啦啦啦啦猜猜我是谁\t128281\n登一不\t128282\n波沙\t128283\nctforcernot\t128284\n首嘛\t128285\n恭苏\t128286\n崔雨革\t128287\n九州岛\t128288\n白喙\t128289\n福娃福福娃\t128290\n混服\t128291\n好妹妹\t128292\n而来说\t128293\n合刊\t128294\n岳昕艳\t128295\n玻尿酸\t128296\n人咒\t128297\n虎度秘\t128298\n三张\t128299\n宝贝我的好宝贝\t128300\n人和\t128301\n小猫咪\t128302\n领声\t128303\n湖文章\t128304\n马成龙\t128305\n押向刑场\t128306\n红花儿\t128307\n血燥\t128308\n拉坐\t128309\n赵丽颖\t128310\nvffgsdddfd\t128311\n暗想\t128312\n深藏不露\t128313\n拼成\t128314\n要不着钱\t128315\n阴见\t128316\n蠢猪\t128317\ntomeenglish\t128318\n随风飘远\t128319\n不想当\t128320\n00000000000000000000000000000000000000000000001\t128321\n前十分钟\t128322\n700多公斤\t128323\nrhhbsh\t128324\n伊格尼\t128325\nv次\t128326\n差不好听\t128327\n林泉忠\t128328\n听不懂你最好是\t128329\n纵横\t128330\n海珠湖公园\t128331\
n固电话\t128332\n单只\t128333\njxjc\t128334\n连拍\t128335\n一肖\t128336\n88分儿\t128337\n律法\t128338\nAUX\t128339\nghokamaokwxz\t128340\n一肚\t128341\n斯雨\t128342\n加鲜\t128343\n一股\t128344\n一肢\t128345\n郑滨生\t128346\nwhattmahit\t128347\n哈拉粑粑\t128348\nAUr\t128349\nFfggs\t128350\n拆下\t128351\nSIM卡\t128352\n轻轻松松\t128353\n二百只\t128354\n剩我和你了你我是你的主人\t128355\n第七位\t128356\nfucc\t128357\n废了你\t128358\n孝义市\t128359\n水乳交融\t128360\n大不好\t128361\nfuck\t128362\n奧迪\t128363\n柔妃\t128364\n小亲亲\t128365\n赎回\t128366\n沙眼衣\t128367\nYoumotherof\t128368\n匹配\t128369\n我真的很爱你有你\t128370\n250安\t128371\nlxid台模莱克\t128372\n爱德基金会\t128373\nAngelabady\t128374\n波真\t128375\n啦啦啦啦啦啦我是快乐小胖\t128376\n早点儿跳楼吧一路平安酒泉\t128377\nnmkmmm\t128378\n艺术团\t128379\n度秘加油度秘2度秘加油度秘二\t128380\n染红\t128381\n游龙都\t128382\n横纹肌溶解\t128383\nsnd\t128384\n神物\t128385\n8248648\t128386\n好难好难听听\t128387\nsnl\t128388\n璐儿\t128389\n俱好\t128390\nsnh\t128391\n朗觉\t128392\n笔名\t128393\nsns\t128394\n通过率\t128395\n板块\t128396\nBTEAS\t128397\nsny\t128398\n说了我有秘秒了你\t128399\n南丰广场\t128400\n嗯好行\t128401\nchenzi\t128402\n九龙\t128403\n斩钉截铁\t128404\n新面孔\t128405\n天涯明月基#\t128406\n星期日\t128407\n逗角\t128408\n5.9级\t128409\n张雨龙\t128410\n美康\t128411\n周果然\t128412\n韩色\t128413\n表药\t128414\n傅国威\t128415\n熊也有\t128416\n一个栋\t128417\n收养\t128418\n切美仁波切\t128419\n哭话\t128420\n马事儿\t128421\n高绿\t128422\n尸首\t128423\n镣铐\t128424\n收关\t128425\n直话\t128426\n大代表\t128427\njj勒\t128428\n贪污\t128429\n收入\t128430\n美少妇\t128431\n哭诉\t128432\n瀽\t128433\n景观学概论\t128434\n瀿\t128435\n阿欧阿衣\t128436\n吴敏霞\t128437\n爱尚辣\t128438\n年龄度秘\t128439\n相亲男\t128440\n十颗\t128441\n柳叶眉\t128442\n烧鸭变烧鹅\t128443\n直说\t128444\n吴某\t128445\n我的女儿叫真真\t128446\n色拉\t128447\n英格尔\t128448\n14:10\t128449\n备付金\t128450\n陈淑文\t128451\n羁旅\t128452\n邓莎\t128453\n以来\t128454\n用师\t128455\n姑凉人\t128456\n等到来\t128457\n場演唱會\t128458\n多不难\t128459\n保护\t128460\n吉普自由光\t128461\n大飞机\t128462\nhhhbbbbbbb\t128463\n一天八节\t128464\n了我着急\t128465\nCfg\t128466\n二虚岁\t128467\n請\t128468\n导体\t128469\nbasktebao\t128470\n談\t128471\n晨跑大吵\t128472\n击掌\t128473\n471648213\t128474\n我没在干嘛呀\t128475\n鹅鹅\t128476\n四美丽\t1284
77\n趴下\t128478\n不能够\t128479\n240下\t128480\n王脑\t128481\njrstvw\t128482\n发情\t128483\n高凯琪\t128484\n做你的人是\t128485\n蟹们\t128486\n一个十块\t128487\ndjv8g\t128488\n240万\t128489\n基督城\t128490\n芜湖\t128491\n堂客\t128492\n0000000000000000000000000\t128493\n色黄\t128494\n下班时\t128495\n真的累\t128496\nhdvcgdykvjjfvdnjgexvjfxvhdbhfc\t128497\n我爱度秘宠你你爱我\t128498\nhduabc\t128499\ncnvclvkjvblvnjvkbcjvck\t128500\n梅超\t128501\n136555555632586\t128502\n仟吉\t128503\n日光倾城\t128504\n葫芦娃葫芦\t128505\n丁子睿\t128506\nzrnotendabo\t128507\n幸运侠士\t128508\n利朗\t128509\n增信\t128510\n56429723\t128511\n潮阳练南田墘村柏多美内衣厂\t128512\n广东外语外贸大学法学院\t128513\n白糖\t128514\n肚条饭\t128515\nST威达\t128516\nIWC\t128517\n闹了玩\t128518\n一个月扣\t128519\n衷肠\t128520\n卡牌大师\t128521\n黑心棉\t128522\n误解难免\t128523\n临床\t128524\n给对\t128525\n哗哗花花花花花花话\t128526\n我敢我\t128527\nfirst\t128528\n情侣\t128529\n宫肌瘤\t128530\n胡莫利\t128531\n小伙恋\t128532\n化学反应\t128533\n谁是大饼老大了我说的我是\t128534\n千万年\t128535\n齐腰深\t128536\n另一篇\t128537\n选秀\t128538\n太空了了\t128539\n23253236655533555\t128540\n飞过来\t128541\n赞儽\t128542\n成全\t128543\n拿一拿\t128544\n弱智\t128545\n我和我的好闺蜜一样真的我的闺蜜有你\t128546\n样许\t128547\n远样\t128548\n又说错\t128549\n卡座\t128550\nsjxjjk\t128551\n精良\t128552\n感敢不敢\t128553\n二十名\t128554\n多杆\t128555\n五六根\t128556\n扶起\t128557\nmvgshd\t128558\n得错\t128559\n15326847688\t128560\n快歌\t128561\n哲哲\t128562\ntopfzjgtits\t128563\n京塘\t128564\n焦虑症\t128565\n勇夺\t128566\n二十吨\t128567\n花片\t128568\n多来\t128569\n暗黑\t128570\n多条\t128571\n窦同波\t128572\n讲别\t128573\n炒鸡\t128574\nFFFUTFDFC#XO85HIIJJK\t128575\n题材股\t128576\n以立\t128577\nfrrred\t128578\n八十八八八八八八八八八八八块\t128579\n小孩女\t128580\n第四段\t128581\n机子\t128582\n王小玮\t128583\nhogyod\t128584\n听完我告诉你\t128585\n情了爱\t128586\n边铁\t128587\n博畅\t128588\nhttpehiphotosbaiducomxiaodupicitem203fb80e7bec54e7629e0a6bbe389b504fc26a50jpg\t128589\n李聪\t128590\n卡底\t128591\n20a\t128592\n黄思涵\t128593\n浴血奋战\t128594\n一群鸟\t128595\n20g\t128596\n20m\t128597\n阿部宽\t128598\n20o\t128599\n3.3号\t128600\nwhendo\t128601\nghju\t128602\n一群鸭\t128603\n特克斯县\t128604\n样水\t128605\n吕布\t128606\n11公分\t128607\n邱成功\t128608\n炒香豆豉\t12860
9\n看我的禁\t128610\n正当防卫\t128611\n201\t128612\n200\t128613\n203\t128614\n202\t128615\n205\t128616\n204\t128617\n207\t128618\n206\t128619\n偶吧\t128620\n208\t128621\n四家\t128622\nDHC\t128623\n张思\t128624\n预防针\t128625\n度迷你真的好讨厌\t128626\n20#\t128627\n撸女\t128628\n20%\t128629\n青蛙\t128630\n更替\t128631\n清闲来\t128632\nDHU\t128633\n威尔胜\t128634\n谢翁坤\t128635\n乐购\t128636\n金星谜\t128637\n四宝\t128638\n大白袍\t128639\n阳原县一小学\t128640\n563套\t128641\n张总\t128642\n媒介\t128643\n唔使\t128644\n张怡\t128645\n威洁士\t128646\nzzzzzzzzzzzzzzzzzzz\t128647\n第五章\t128648\nHghgetfgfg\t128649\n免费们\t128650\n阿施塞雅啦啦小咿呀咿呀咿呀咿呀咿呀咿呀咿\t128651\n寒舍\t128652\n难民\t128653\n五分之三只\t128654\nWG\t128655\n十九七八六三四二五五一\t128656\nWE\t128657\nWD\t128658\nWC\t128659\nWB\t128660\nWO\t128661\n你好流弊\t128662\n斗争样\t128663\nWL\t128664\n预约\t128665\n大你好\t128666\n高喊\t128667\nWW\t128668\nWU\t128669\nWS\t128670\nWQ\t128671\nWP\t128672\n虹坊\t128673\n连级\t128674\n扫色\t128675\nWY\t128676\n酱小鱼儿\t128677\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t128678\nWe\t128679\n三深\t128680\n水西公园\t128681\nWa\t128682\njhng\t128683\nWo\t128684\nsidhs\t128685\nWj\t128686\nWi\t128687\nWh\t128688\nusup\t128689\n天才亮\t128690\n2010年12月20日\t128691\n敢不要\t128692\n创伤\t128693\nWx\t128694\n希腊\t128695\n讨厌你恨你迹部\t128696\nsdkcb\t128697\n红黄相杂\t128698\n一部片\t128699\n八百家\t128700\n刘峰\t128701\n崩漏\t128702\n了不起来\t128703\nvajc\t128704\n找理由\t128705\n除尘\t128706\n见日升\t128707\njakjd\t128708\n有局\t128709\n昨天早上十点\t128710\n省时\t128711\nyemin\t128712\n嘟嘟声\t128713\n晚12点\t128714\n记错\t128715\n性度\t128716\n煎饼卷大葱\t128717\n捧腹大笑\t128718\n鬼斧\t128719\n生子文\t128720\n滦河\t128721\nW0\t128722\n动强\t128723\nvGV\t128724\n动弹\t128725\n金城江六圩中学\t128726\n1hy\t128727\n李浪\t128728\n二把\t128729\n李浩\t128730\n泸沽湖胡\t128731\n电灯泡\t128732\n张风雷\t128733\nHesayanythong\t128734\n西华营\t128735\n13952302975\t128736\n小妹妹长的还没的女人\t128737\n真不懂事\t128738\n李海\t128739\n精彩\t128740\n华育国际\t128741\n金真棒\t128742\n瘦肉\t128743\n我的声音\t128744\n说假\t128745\n补偿费\t128746\n威尔度秘\t128747\nKLICEN\t128748\n骗了吧\t128749\n解渴\t128750\n胸脯\t128751\n偶恨\t128752\n张张\t128753\n小黄鱼\t128754\n张旧图\t128755\ngggfyjgb\t128756\n
李雨彤\t128757\n天性爱\t128758\n柳宗元\t128759\n便衣\t128760\n邳州杯\t128761\n痛快点快\t128762\n1752849368\t128763\nxgajquw\t128764\n绿手帕\t128765\n张强\t128766\n我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠\t128767\n因为我和一样\t128768\n张开\t128769\n别聊那么简单的问题我是电影小达人\t128770\ntoui\t128771\ntouk\t128772\ntoul\t128773\n便血\t128774\n不起眼\t128775\n不麻烦我不想用你\t128776\n写会\t128777\n自动艾特魂吧众基友\t128778\ntout\t128779\n尹尚武\t128780\n菜系\t128781\n几小只\t128782\n效率性\t128783\netxsuhxyub\t128784\n张弛\t128785\n和事\t128786\n押金钱\t128787\nuhhjbjhhhhxbdh\t128788\n崔洁霞\t128789\n美联会\t128790\n沙堰\t128791\n三Q\t128792\n充错\t128793\n喜剧性\t128794\n谢鹏\t128795\n付出\t128796\n基本你是谁啊鸣\t128797\n瓦块\t128798\n城乡\t128799\n南瓜羹\t128800\n惨遭\t128801\n苏心如\t128802\n三G\t128803\n莽汉断机\t128804\n三D\t128805\n222222122222\t128806\n张泽群\t128807\n凳质\t128808\n哼挂\t128809\n周其凤\t128810\n龚琳娜\t128811\n三q\t128812\ncaar\t128813\n亏了吐\t128814\n换掉\t128815\n纯洁装\t128816\n晚你好\t128817\n三n\t128818\n和亲\t128819\n分担当\t128820\n沙堆\t128821\n韩红海\t128822\n大黄蜂\t128823\n恟\t128824\n米奇\t128825\n恝\t128826\n秦殊\t128827\n恐\t128828\n恕\t128829\n恋\t128830\nHathaway\t128831\nsood\t128832\n算一真可怜\t128833\n颜惠真\t128834\n10月30日\t128835\n恃\t128836\n恁\t128837\n回想\t128838\n恄\t128839\n前线\t128840\n考河\t128841\n恻\t128842\n三8\t128843\n真可谓\t128844\n暹罗猫\t128845\n恼\t128846\n恰\t128847\n学霸\t128848\n恶\t128849\n测出调\t128850\n刘希华\t128851\nalve\t128852\n恫\t128853\n恨\t128854\n恩\t128855\n息\t128856\n恬\t128857\n恭\t128858\n竞价\t128859\n卢梦瑶\t128860\n母线\t128861\n好啦好\t128862\n十九千克\t128863\n告诉我\t128864\n指数\t128865\nckyou\t128866\n16.3％\t128867\n极道振\t128868\n木关系\t128869\n大缷\t128870\n第几次\t128871\n12345678910111213141516十一十三十八10二十三二一章\t128872\n杨建\t128873\nwifnhcj\t128874\n科洛·图雷\t128875\nIEEE\t128876\n我是你你爱我\t128877\n混吃等死胸无大志\t128878\n新闻稿\t128879\n微型屏\t128880\n人大代表们\t128881\n指教\t128882\n天地人\t128883\n当初一\t128884\n错了叫\t128885\n春熙\t128886\n擼擼擼\t128887\n小黑子\t128888\ncncc\t128889\n王剑波\t128890\nmobook\t128891\nHnk\t128892\n度秘宠\t128893\n50M\t128894\ngukggk\t128895\n屋匿\t128896\ncncs\t128897\n朴叙俊\t128898\n后期\t128899\n千元\t128900\n铃叮咚\t128901\n褴褛\t128902\n甘欣愉\
t128903\n常规赛\t128904\n抱动\t128905\n森日\t128906\n3667558965\t128907\n5.74%\t128908\n快人\t128909\n班副处\t128910\nkbvdbd\t128911\n专人\t128912\n息率\t128913\n雅舒\t128914\n唐永清\t128915\n死咄\t128916\n美国国会\t128917\n陈静仪\t128918\n第64届\t128919\n香客\t128920\n15842365653\t128921\nlugul\t128922\n金星凌\t128923\njimin\t128924\n乔·科尔\t128925\n霸王龙的吧好不\t128926\n这么假\t128927\n快亲\t128928\n俄国史\t128929\n快骨碌\t128930\n雄关\t128931\n二零零二零一八年\t128932\n薇林\t128933\nbN\t128934\n意義\t128935\n哦呵\t128936\nIgxvtkstjzfzjdhztdhhdthhtsdhfhdhdhzhddhzsrzdshdzzdbdhfnxnfnfzdjdx\t128937\n摆布\t128938\n一言难尽\t128939\n坑人坑人\t128940\n居住\t128941\n黎势\t128942\n韩正斌\t128943\n曹嘉芮\t128944\n竝冖\t128945\n约翰·克里\t128946\n盘打错\t128947\n慰问\t128948\n跨栏\t128949\n檀雅\t128950\nQQ名叫\t128951\n官胡\t128952\n一粒一\t128953\n中奖\t128954\n山东高速\t128955\n嘎咕\t128956\n高富帅气质量\t128957\nbddjjudeuhe\t128958\n一讲一个\t128959\n7.3级\t128960\n台风范\t128961\n死女就是你\t128962\nyyyuyuu\t128963\n孟放\t128964\n好温柔\t128965\n12345祎\t128966\nMICKY\t128967\n王权\t128968\n段村镇\t128969\n亦凡帅\t128970\n哦呦\t128971\n唔带\t128972\n甘涌\t128973\n梦秋\t128974\n紧紧\t128975\n公公交\t128976\n王村\t128977\n火柴男人\t128978\n湘东南\t128979\n金洞\t128980\n尸日\t128981\n丧扮\t128982\n蛇年\t128983\ngbrehtntnqdtnyrnrhq2n\t128984\nkydh\t128985\n格莱\t128986\n嗯寒芒\t128987\n女娲\t128988\n王杰\t128989\n张起灵\t128990\n撒子气\t128991\n诚心\t128992\n脸部\t128993\n26601381\t128994\n超神\t128995\n〒〒\t128996\n我不相信\t128997\n卜入\t128998\n开心不开心\t128999\n台长\t129000\n一年以内\t129001\n两百七\t129002\n浅表性\t129003\n代诗晴\t129004\n马格利酒\t129005\nc88c178c88\t129006\n宁夏卫视\t129007\n可能\t129008\nbbjvn\t129009\n繼續\t129010\n13gd\t129011\n了不起\t129012\n直多\t129013\n188一个\t129014\n辣子\t129015\n星岛\t129016\n漫游费\t129017\n四组\t129018\n南怀谨\t129019\n力业\t129020\n宫刑\t129021\n成员国\t129022\n瑞奎斯曼\t129023\n族件方\t129024\n假干嘛\t129025\nedgesiren\t129026\n顾强\t129027\nSTIGMA\t129028\n无光\t129029\n睡服\t129030\n四绝\t129031\n饮料杯\t129032\n无关\t129033\n阳炎\t129034\n富力\t129035\n钱迷\t129036\ncjdnhcvbnhckfugjyddi\t129037\n帅牛\t129038\n无惧\t129039\n贫困线\t129040\n官图\t129041\n明天九点\t129042\n嘿嘿嘿\t129043\n尹志平\t129044\njgudmgy\t129045\n帅版\t12
9046\n微秘\t129047\n官园\t129048\n格斗类\t129049\n方子欣\t129050\n基板\t129051\n猪窝猪窝\t129052\n120多万\t129053\n肖熊\t129054\nqiyou\t129055\n剑芒\t129056\n剑皇\t129057\n美国奥委会\t129058\n多茄\t129059\n震动棒\t129060\n生前\t129061\n先结一\t129062\ni514\t129063\n开心色开心\t129064\n蓝牙山五壮士\t129065\n各级各级各级各级各级各级各级各级各级\t129066\n老抽\t129067\nwute\t129068\nhihihihihihihihi\t129069\nFtgfvcncvggggcfgggghjjShnvjhjdgnhvjjfga\t129070\ntuajtamnt\t129071\n126947273\t129072\n度秘你真是我的好我\t129073\n火山灰\t129074\n淋病\t129075\n哈个亲一个\t129076\n哎呀邓\t129077\n自报\t129078\n自护\t129079\n秋绿\t129080\n当男\t129081\n冬瓜丸子汤\t129082\n过不过关\t129083\n歇一歇\t129084\ntjjgg\t129085\n新浪V认证\t129086\n母传子\t129087\n度出\t129088\n动感地带\t129089\n拇指论\t129090\n郑书燕\t129091\nhttpehiphotosbaiducomxiaodupicitem42166d224f4a20a4d2aa96b397529822720ed00ajpg\t129092\n丶丨\t129093\n爱情专一\t129094\n姚雅涵\t129095\n小溪\t129096\n颜我\t129097\n临平五中\t129098\n袁先生\t129099\nrtttrrt\t129100\n我在你在你在你在你在\t129101\n自抑\t129102\n444452584366884412388\t129103\n刘晨阳\t129104\n今年9月份\t129105\nluec\t129106\n王申昊\t129107\n天理不容\t129108\n见过年\t129109\n紫荆城\t129110\n嗯音乐音\t129111\n周立波\t129112\n巨秃\t129113\n句儿\t129114\n上呼吸\t129115\n3600平米\t129116\n种植\t129117\n百度浏览器\t129118\n太子奶集团\t129119\n钱片\t129120\n火腿克\t129121\n乘机\t129122\nu\t129123\n好不原谅\t129124\n4名\t129125\n小木桥\t129126\n值得爱\t129127\n不溜秋\t129128\n单项式\t129129\n龙海莆田\t129130\n商情\t129131\n我家\t129132\n人与事\t129133\n学艺\t129134\nkittyhas\t129135\n三亿\t129136\nhgggggggg\t129137\n123你猜我猜你猜\t129138\n三人\t129139\n一阵\t129140\n笑你好丑\t129141\nfgrffidfggfdggcggf。139com天\t129142\n热情好不\t129143\n应采儿\t129144\n一阁\t129145\ngp\t129146\n烦人人\t129147\n国道\t129148\n三五\t129149\n一队\t129150\n180116\t129151\n三亚\t129152\n人与人\t129153\niui\t129154\n欢欢分钟\t129155\n碰电\t129156\n小黄人\t129157\niud\t129158\niuf\t129159\n131364\t129160\n春天\t129161\n二七四九二零九零\t129162\n452454549494944554464664463\t129163\n12234567890\t129164\n欲女视\t129165\n刘洲成\t129166\niur\t129167\njeshs\t129168\n吴受音\t129169\n不端正\t129170\n1823617969\t129171\n台江工商局\t129172\ndgchgd\t129173\n跨下\t129174\n摇曳\t129175\n批林批孔\t129176\n4001365535\t129177\n权
权\t129178\n真冬\t129179\n五子棋八七达赖\t129180\n傅琰东\t129181\n三千多兆\t129182\n如刀割\t129183\n源印法师\t129184\n500w\t129185\nga\t129186\n英华\t129187\n英协\t129188\n无用功\t129189\n相韬\t129190\nzero\t129191\n0年0月\t129192\n罗面\t129193\n3225563245566\t129194\n阶级\t129195\n000个\t129196\nCAFE\t129197\n0.5TON\t129198\n123456789101111\t129199\n123456789101112\t129200\n吴子翰\t129201\n全能少女\t129202\n罗静\t129203\n猫屎遁\t129204\n房展会\t129205\n500G\t129206\n秘不讨厌\t129207\n465千克\t129208\n500M\t129209\n燃油管\t129210\n个头\t129211\n5001\t129212\n5000\t129213\n高万腾\t129214\n好傲\t129215\n方子诚\t129216\nMetal\t129217\n豪客嘉族\t129218\n348537808\t129219\n发自\t129220\n反反复复纯\t129221\n孙哥\t129222\n不我不\t129223\n500%\t129224\n来不及许愿\t129225\n彰慼入儿棧\t129226\n个天\t129227\nknavsyqjioh\t129228\n亲终于\t129229\nwtamp\t129230\n纳米\t129231\n蹲守\t129232\n秘娘炮\t129233\n13.72米\t129234\n个处\t129235\n梦想中国\t129236\nwulisAlanghei\t129237\n流传\t129238\n黄岳阳\t129239\n科考\t129240\n邦达\t129241\n伤心心情不好\t129242\n级点\t129243\n帅帅\t129244\n一学期一本\t129245\n王泓璇\t129246\n小四年级\t129247\n南京军区\t129248\n左右度\t129249\nhavyjdjdjdbzjdjzbdjfbdjdnnnznejisndoxndkdjdjxjdnxjdnxjdnxkkdnxmndnnzn9464\t129250\n999966\t129251\n打不打\t129252\n字眼\t129253\n贵王\t129254\n的号好想和相互额uhdwud\t129255\n罗大佑\t129256\n亲一个度秘烂\t129257\nffsyct\t129258\n怏快\t129259\n华夫人\t129260\n桑叶\t129261\n雨后\t129262\n伤心不起\t129263\n撒啦兔兔比特币\t129264\n内蒙古地处\t129265\n侯晓晴\t129266\n李海洋\t129267\n周董好\t129268\n864000等於毫升\t129269\nvvvvvgv\t129270\n度莫\t129271\nImqueen\t129272\n广州东站\t129273\n拜毪\t129274\n快飞过来\t129275\n拜比\t129276\n9吋\t129277\n淫B\t129278\n错过你\t129279\n22526426528438\t129280\n蜜肉\t129281\n喉镜\t129282\ndiffff\t129283\n疼碍\t129284\nvnnnm\t129285\n男女晚\t129286\n爱过爱的节操出品\t129287\n徐和谊\t129288\ng10g12\t129289\n睫毛\t129290\n叶子墨\t129291\n事男\t129292\n王大骐\t129293\n1798\t129294\n事田\t129295\n六十斤\t129296\n言壮\t129297\n百家\t129298\n牙口\t129299\n许阁\t129300\nstil\t129301\n百宏\t129302\n360AV点\t129303\n深夜二点\t129304\n寻白\t129305\n正规\t129306\n百官\t129307\n许阔\t129308\n百宝\t129309\n音响\t129310\n零幺二七二\t129311\n那我来亲一下\t129312\n唐诗\t129313\n力相如\t129314\n执行力\t129315\n绩效\t1
29316\n刘培超\t129317\n不好压\t129318\n笔顺\t129319\n侠度\t129320\n三寸钉\t129321\n呕心\t129322\n金发公主\t129323\n44944\t129324\n能找\t129325\n火人\t129326\n吧瓦\t129327\n六点83点六八\t129328\n把片\t129329\n能手\t129330\n1105110\t129331\n牛ff\t129332\n嗯好先\t129333\n临漳\t129334\nNIKO\t129335\n提意见\t129336\n谈妥\t129337\n5555547\t129338\n松巍\t129339\n120斤\t129340\n二嫂\t129341\n文锋\t129342\n兵芭比\t129343\nisifif\t129344\n臭婊\t129345\n度秘度秘我在问你你\t129346\nhttpimgcacheqqcomclubitemparcelitemffff51faec8f07c2626e76153ac5471829raw200gif\t129347\n猪猪猪猪猪猪猪猪猪\t129348\n小狗机\t129349\n有事情别来\t129350\n秘码度秘\t129351\n平虏城\t129352\n脑死\t129353\n一代沟\t129354\nDearthyuck\t129355\n家徒\t129356\nCCAV5\t129357\n玄苑\t129358\n吴建强\t129359\n情尼\t129360\n一点二万千克\t129361\n31号\t129362\n脐疝\t129363\n灵芝的坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋就是你\t129364\n六七十块\t129365\n近可好\t129366\n经济实惠\t129367\n技校\t129368\n独能\t129369\n宇航员\t129370\n豆币\t129371\n一六徒儿\t129372\n2073公里\t129373\n谢明敏\t129374\n贵妃蚌\t129375\n过东仓\t129376\n陈文广\t129377\n扮成\t129378\n凭什么\t129379\n洗完\t129380\n百多岁\t129381\n三峡水库\t129382\n刘筱晨\t129383\n高逼格\t129384\nyes喔\t129385\n丫头子\t129386\n如图\t129387\n商标纸商标纸\t129388\n哇塞凤\t129389\n息焉公墓\t129390\n恶人谷\t129391\n纽约州\t129392\n木奇灵\t129393\n胡安运\t129394\n夜多列\t129395\n西固城\t129396\n绯红\t129397\nnownor\t129398\n去海边\t129399\n周琳雅\t129400\n欧冠行了吧vu困不困iv聚会vu\t129401\n六里屯\t129402\n孙光辉\t129403\n震憾\t129404\n系未卡\t129405\n沉大海\t129406\n漳州\t129407\n得用\t129408\n鸡毛掸子\t129409\n蜂刺\t129410\n小婵云溪\t129411\n马菲雅\t129412\n六里山\t129413\n我你在你在n的爱\t129414\n水墨尘\t129415\n孤村\t129416\n喜有你好看\t129417\n风水\t129418\nXieis\t129419\n归根到底\t129420\n长颈\t129421\n984年\t129422\nqwyibv\t129423\ndweeeeeeewweeeeew\t129424\n美星\t129425\n赵嘉诚\t129426\nfgaffe\t129427\n卡刷\t129428\n张办\t129429\nv宝贝v\t129430\n张力\t129431\ngig\t129432\n陈堡\t129433\n冰片\t129434\n水晶弹枪\t129435\n325赛罗\t129436\n任苗苗\t129437\n你好假\t129438\n一百多里\t129439\n杨丽青\t129440\n韵度\t129441\n军权\t129442\n草房子\t129443\n沉太难看\t129444\n薄情\t129445\n六部\t129446\nmymy\t129447\n嫌犯\t129448\n猎艳江湖\t129449\n上饶县\t129450\n美国路易斯安那州\t129451\n嘿嘿瓮\t129452\n时山\t129453\n霸情\t129454\n罗怡\t129455\n不知不对\t129456\n只一
次\t129457\n首尔首尔\t129458\n清歌\t129459\n图谋\t129460\n哎呦美\t129461\n角犬\t129462\n使然\t129463\n监护\t129464\n试后开\t129465\n胃癌点\t129466\nGDP\t129467\nIxidnj\t129468\n65次\t129469\n微尘\t129470\n卸甲\t129471\n龚宇奇\t129472\n70多岁\t129473\n燕琼\t129474\n长相照\t129475\n女厕所\t129476\n副校长\t129477\n微小\t129478\n2011年10月15日\t129479\nmbardinnnnnndibaba\t129480\n几十号\t129481\n第4声\t129482\nJustin\t129483\n把告急\t129484\nYOUAV\t129485\n1一59二1\t129486\n来无处\t129487\n噜哈胡巴扣女\t129488\n赌命你是男是女\t129489\n扶贫办\t129490\n金石\t129491\n第一手\t129492\n集大成者\t129493\n亮侠\t129494\n奶业\t129495\n金矿\t129496\n徐煜甜\t129497\n三缄\t129498\n笔记本\t129499\nHcjfydhdudydhhsdydhhdhfgjdudhdhhcjfjchchhchchchchchhchfhchhcch\t129500\nHfdg\t129501\n容颜\t129502\njgpuii\t129503\n额头\t129504\n说乖\t129505\n几十只\t129506\n对不了\t129507\n曙光玩具厂\t129508\n同一大\t129509\n大城小事\t129510\n弄大\t129511\n家家乐\t129512\n刘君宜\t129513\n恩咯玩意儿死孩子气\t129514\n些许年\t129515\n观察家\t129516\n额外\t129517\n没声\t129518\n好啊你是臭屁股啊了好了不要不oppoopopoowowo。臭豆豆对对的我觉\t129519\n大好不好\t129520\n店首\t129521\n观察室\t129522\n白你个头白你个头把你的头一个白\t129523\n说书\t129524\n反观\t129525\n我的手背\t129526\n说不出去\t129527\n诺贝尔奖\t129528\n尚\t129529\n尛\t129530\n尘\t129531\n尖\t129532\n尔\t129533\n正邪之间\t129534\n景晕云\t129535\n尐\t129536\n少\t129537\neggk\t129538\n對\t129539\n尊\t129540\n專\t129541\n张均蜜\t129542\n将\t129543\n將\t129544\n射\t129545\n尅\t129546\neggg\t129547\n封\t129548\n尾\t129549\n尿\t129550\n尼\t129551\n尽\t129552\n陈翔炜\t129553\n尻\t129554\n尸\t129555\n尹\t129556\n武王\t129557\n李雨佳\t129558\n蓝球赛\t129559\n江成俊\t129560\n就\t129561\n加湿\t129562\n莫名奇妙\t129563\ngvgz\t129564\n董叫\t129565\n尧\t129566\n尤\t129567\n尥\t129568\n尢\t129569\n歌迷\t129570\n想你入\t129571\n柏莱柏莱\t129572\n频段\t129573\n刘大点儿\t129574\n很合格\t129575\n恩若\t129576\n孙首歌\t129577\n文字行\t129578\n首情诗\t129579\nursoso\t129580\n臭类\t129581\n周永强\t129582\n翘班\t129583\n摧毁\t129584\n２０年\t129585\n畅通无阻\t129586\n一万斤\t129587\n和尚哥哥\t129588\n姚圣伟\t129589\nfouom\t129590\n真正好\t129591\n后腰\t129592\n关舞蝶\t129593\n我唯一的朋友\t129594\n19199911\t129595\ngvvvvc\t129596\n石墨\t129597\n1314\t129598\n1313\t129599\n29800岁\t129600\n金正\t129601\n永远不分手\t129
602\n汽修群\t129603\n三一太\t129604\n沿岸\t129605\n观澜猫\t129606\nYGCESZ\t129607\n记叙文\t129608\n一看电影\t129609\n2012年4月26日\t129610\n林月如\t129611\n亲戚\t129612\n干嘛喃\t129613\n九万多\t129614\n锥论\t129615\n王老头\t129616\n防衰\t129617\n亲我\t129618\n气死\t129619\n裤子元\t129620\n是非黑白\t129621\n现在时光\t129622\n石墙\t129623\n美团美团美\t129624\n腊牛肉\t129625\n国家机密\t129626\n亲戏\t129627\n属于你会不会\t129628\n张泉杰\t129629\nhhchcgggcgcgcg\t129630\nhhgggggggyy\t129631\njhJDK\t129632\n阿里郎\t129633\n苏小妞\t129634\n好心急\t129635\n田氏\t129636\n455855559\t129637\n里了两老啦了两老了\t129638\n九千克\t129639\n杨俊\t129640\n马明爽\t129641\n全国人\t129642\n富士康\t129643\n九千元\t129644\n我的书我是说书\t129645\nDDRREF\t129646\n小眼厩\t129647\n赵浩坤\t129648\n半懂\t129649\n建宏\t129650\n无为而治\t129651\n捧场\t129652\n手大\t129653\n英甲\t129654\nckucic\t129655\n机器人一点都不可爱好不啦\t129656\n杨修\t129657\nbēn\t129658\nrukee\t129659\n小呆\t129660\n丑女人\t129661\n题行\t129662\n手头\t129663\n上一个男孩\t129664\nJGK\t129665\nFafgghtf\t129666\n金色黎明党\t129667\n1911年\t129668\nJGF\t129669\n冷不冷\t129670\n财力\t129671\n净瞎\t129672\nQQ闺密个性签名\t129673\n不假思\t129674\n4千多\t129675\n桂林洋\t129676\n讨厌们\t129677\n蒙有\t129678\n拿瓦\t129679\n虫洞\t129680\n还是我漂亮\t129681\n度秘度秘大坏蛋大坏蛋度秘度秘\t129682\neb4vexw\t129683\n郑崇阳\t129684\n创先是\t129685\n哗啦\t129686\n掷道\t129687\n删节\t129688\n陕A\t129689\n三十分钟\t129690\n孙恋儿\t129691\n摔破\t129692\n瓶邪\t129693\n彩世界\t129694\n出掉\t129695\n天津地区\t129696\n林俊辉\t129697\n肯\t129698\n皮夹克\t129699\n微瑕\t129700\n一一张\t129701\n楚楚街\t129702\n围棋赛\t129703\n这么久\t129704\n广厦小区\t129705\n这么之\t129706\n这么么\t129707\n放荡\t129708\n好我懂了\t129709\n在身边\t129710\n结啼\t129711\n采茶\t129712\n你是谁你是谁你是中国大妹妹图\t129713\n肥\t129714\n妈白\t129715\n千手\t129716\n嗯弄\t129717\n叶英\t129718\n杳然\t129719\nalalyo\t129720\n卖掉\t129721\n暴走漫画制作器\t129722\n肠\t129723\n主角色\t129724\n扭头\t129725\n阿摩\t129726\n菲尔普斯\t129727\n经营成本\t129728\n刘芷涵\t129729\n快样款\t129730\n左慧妮\t129731\n嗯张\t129732\n婚假\t129733\n老筋\t129734\n不要怕\t129735\n无处可逃\t129736\n许诗琪\t129737\n别本\t129738\n运动会\t129739\n叶金沅\t129740\n肾\t129741\n阿留申雨天\t129742\n2411426395\t129743\n嗯强\t129744\n朱然之广\t129745\n不成样\t129746\n课扉页\t129747\n菲奥娜\t129748\n毛片\t129749\
n784794449494644047774944944944\t129750\n吴章根\t129751\n兩支Ella\t129752\n午汗\t129753\n1666666666\t129754\n安廉\t129755\n决议\t129756\n360云盘\t129757\n八毛二\t129758\n四合院儿\t129759\n1922年\t129760\n焦乐涵\t129761\n受重视\t129762\n谢贵堂\t129763\nXvd\t129764\n余思言\t129765\nbutidont\t129766\n夜未\t129767\n三遍\t129768\n234567823456781234567823456788888888\t129769\n喽哈小蜜瓜\t129770\nhphg0ggpip0p1jm\t129771\n不敢你好难找\t129772\n两天天儿\t129773\n负伤\t129774\n哟不有\t129775\n小天一\t129776\n克聪睿\t129777\n夜机\t129778\n飞快\t129779\n好的小乖好\t129780\n杨贤硕\t129781\n谁呀谁呀哼\t129782\n三道\t129783\n暴热\t129784\n咪一咪\t129785\n一睡\t129786\n环境保护\t129787\n云卷云舒\t129788\n山西省五台山全山普佛法会\t129789\n50000000个\t129790\ncf棒\t129791\n尼勒克县\t129792\nnotste\t129793\n还剩下\t129794\n平阴玫瑰王\t129795\n叶准徒孙\t129796\nDgf\t129797\nwsika\t129798\n妖精的尾巴\t129799\nDgg\t129800\njbv\t129801\njbu\t129802\njbt\t129803\ntugt\t129804\n肋\t129805\n赣县\t129806\ntjaj\t129807\n扣得费\t129808\n丝丝行\t129809\ntjag\t129810\n说不起\t129811\n句度\t129812\n切了切\t129813\njbx\t129814\njbg\t129815\njbf\t129816\njbd\t129817\njbc\t129818\njbb\t129819\ntugg\t129820\n哈伤心\t129821\njbl\t129822\njbk\t129823\n1966\t129824\n18536\t129825\njbh\t129826\nimax6\t129827\nimax5\t129828\n丁嘉瑜\t129829\n伤感\t129830\n1960\t129831\n和错\t129832\n金脸\t129833\n赵刚\t129834\n保护主义者\t129835\n楞严\t129836\n快钱\t129837\n靠窗\t129838\n劫机犯\t129839\nfriendship\t129840\n驴裙\t129841\n广交角\t129842\n懂不起\t129843\nPRIMADONNA\t129844\n龙泉驿\t129845\n碧池\t129846\n给我一算\t129847\n八百七十八百万十万一万三百六十五万\t129848\n霓虹\t129849\n宁静致远\t129850\n呀干\t129851\n一下午\t129852\nteachersprday\t129853\n一点点点点\t129854\n靠不是\t129855\n1939年11月\t129856\n人工湖\t129857\n肚\t129858\n何晟铭\t129859\n不学\t129860\n榴莲班戟\t129861\n内八一国\t129862\n美丽美妞美妞美妞\t129863\n20块\t129864\n彩铃包\t129865\n古越都城\t129866\n王一名\t129867\n肖\t129868\n加利亚籍世界粮食计划署\t129869\nqgrf\t129870\n一次一次\t129871\n几个1\t129872\n25万美元\t129873\n總算\t129874\n竞相\t129875\n艸\t129876\n不存\t129877\n碗毒米线\t129878\n差值\t129879\n不孝\t129880\nweifushu\t129881\n800克\t129882\n不孕\t129883\n找见\t129884\n恩韩信\t129885\n七八个星天外两三点三钱\t129886\n999900900\t129887\ndeeik\t129888
\n店员\t129889\n来月经了我没有\t129890\n秋那\t129891\n求求你给我讲几个小笑话吧谢谢你了你真乖了我知道你\t129892\n章珍杰\t129893\n黄小楼村\t129894\n困扰者\t129895\n天青色\t129896\n十八种\t129897\n三轮生肖\t129898\nlnamnfinenthanks\t129899\n秦老太\t129900\n三四个小时\t129901\n王梦钦\t129902\n屋村\t129903\n好开心好开心好开心\t129904\n不适宜\t129905\nblmanhua\t129906\n太岁\t129907\n痛哭流涕\t129908\n姚芝君\t129909\n比干\t129910\n竟无缘以\t129911\n帕金生\t129912\n孙孙\t129913\n李玉娟\t129914\n性转\t129915\n配药\t129916\n孙子\t129917\n苹果六普拉斯卡顿\t129918\n高金月\t129919\n宁笑柯\t129920\n万洋\t129921\n政府性\t129922\n不肯定\t129923\n燕青\t129924\n15332815407\t129925\n暗黑血统2\t129926\n慈母\t129927\n死丝方\t129928\n退让\t129929\n奶茶堡\t129930\n685353868235382235535538582685390217048877866535585\t129931\n差不死人\t129932\n95455555555666568\t129933\n良\t129934\n2个多小时\t129935\n欸欸欸\t129936\n15236886895\t129937\n张紫琼\t129938\n不要说这样的话\t129939\n心情巨差\t129940\n一咯\t129941\n无尽们\t129942\n1963年\t129943\n3183650096\t129944\nifjkskdjj\t129945\n绿帽子\t129946\n太阳能发电板\t129947\n风生水起\t129948\n懊丧\t129949\n真羡慕\t129950\n故纸\t129951\n绿里\t129952\nbansn\t129953\n中感\t129954\ngffdffseeyg\t129955\n5点35分\t129956\n揽揽无\t129957\n钱牛魔王\t129958\n起誓\t129959\n尽欢\t129960\ntvt\t129961\n中意\t129962\n土贼\t129963\n困衣\t129964\n无土栽\t129965\n來親\t129966\n北大\t129967\n虚别\t129968\n还有恩\t129969\n院规\t129970\n忙完了\t129971\n你不懂我你就这有\t129972\n扶梯\t129973\n茶水\t129974\nBeijing\t129975\n所有的人\t129976\n地表\t129977\n江诗丹顿\t129978\n徐我叫高\t129979\n乐子\t129980\n瑞安\t129981\n宋庆龄\t129982\n美甲师\t129983\n乌药\t129984\n度秘我好爱你\t129985\n晓光\t129986\n还歌\t129987\n过年了妈妈\t129988\n万里明\t129989\ntvk\t129990\n5555555555555555555\t129991\n鸣样\t129992\nfjgug\t129993\n宽严\t129994\n残忍\t129995\n再调\t129996\ncurf\t129997\n唤后\t129998\n中央大道\t129999\n1838474849\t130000\n耿嘉翊\t130001\n音字\t130002\n毛宗荣\t130003\n汪梦群\t130004\n嗯一一\t130005\n英音\t130006\n情何以堪\t130007\n爱到为你山到哈下火海\t130008\nguaqheyq\t130009\n还死\t130010\n嗯一世\t130011\n郑人\t130012\n摊煎饼\t130013\n惠达卫浴\t130014\n震惊\t130015\n三点式\t130016\n发尾育英\t130017\n好丑\t130018\n中国政法大学\t130019\n化妆部\t130020\n索斯比\t130021\n陈思彤\t130022\n荣大伟\t130023\n复发\t130024\nexo波\t130025\n好不\t130026\n10月11日\t130
027\n好七\t130028\n背首\t130029\n我真的很想对你说我爱你\t130030\ndzlkzclnbb\t130031\nbhiphotosbaiducomxiaodupicitem4d086e061d950a7bab780ab60dd162d9f2d3c977jpg\t130032\n逃兵\t130033\n快点吧\t130034\n好事多磨\t130035\n三七八\t130036\n复古\t130037\n攻势\t130038\n朱比利\t130039\n幸运物\t130040\n5000多个\t130041\n爱情诗\t130042\n神迹\t130043\n攻加\t130044\naosiaosi\t130045\n爱情话\t130046\n度秒度\t130047\n17点05分\t130048\n新鲜\t130049\n龙顺齐\t130050\nqiao\t130051\nqian\t130052\n为我听不懂你说的话\t130053\n确有\t130054\n手菜\t130055\n昌里路\t130056\n懂不懂是我喜欢你\t130057\n雨暴\t130058\n赵雅和\t130059\n专车\t130060\n找死路上\t130061\n廉洁\t130062\n专转\t130063\n视而不见\t130064\n洗气死\t130065\n一凡到我了因为我告我要你\t130066\nhttpfhiphotosbaiducomxiaodupicitem95eef01f3a292df515310261bb315c6035a873d0jpg\t130067\n开心水族箱\t130068\njgjgag\t130069\n工业区\t130070\n催生\t130071\n起存\t130072\n30.99\t130073\n爱上你好久\t130074\n24.98\t130075\n紧接着\t130076\n王玉玲\t130077\n爱狗爱狗\t130078\n张金晨\t130079\n肿么样\t130080\n918\t130081\n急售\t130082\n唉东仓\t130083\nquby\t130084\n训练仪\t130085\n期指高开低走\t130086\n气人\t130087\n煤田\t130088\n凉汗\t130089\n实实在在\t130090\n三百六十六\t130091\n田若凡\t130092\n立翠\t130093\n要慎重说\t130094\n王笑\t130095\n谢芬奇\t130096\njjstx\t130097\n仰天悲面\t130098\nhavent\t130099\n6月12日10点\t130100\n坏人们\t130101\n送配\t130102\n一条路\t130103\n这个\t130104\n下午1点~1点半\t130105\n永宁花\t130106\n这两\t130107\n纽森\t130108\n91O\t130109\n安卓吧\t130110\n说了你走\t130111\n时间轴\t130112\n多a\t130113\n嘛差\t130114\n丑死\t130115\n大大哒哒哒哒哒哒哒哒哒哒哒\t130116\n在家你好忙\t130117\n对兔\t130118\nIUJ\t130119\n这下\t130120\n加龙省利\t130121\n香克斯\t130122\n侦察\t130123\n护国大将军\t130124\n一句话\t130125\n1611111661616\t130126\nwouuoccke\t130127\n这东\t130128\n40美元\t130129\n这业\t130130\n黄耀明\t130131\n小年少\t130132\n这世\t130133\nyourothreyou\t130134\n对其\t130135\n21代\t130136\n一半袖\t130137\n何去何从\t130138\n睡不着花\t130139\n2030450\t130140\n查可欣\t130141\n万丈海\t130142\nrevmlo\t130143\n快乐乐乐乐乐乐乐里\t130144\n发疯了\t130145\n鹅掌\t130146\n银春\t130147\n德九\t130148\n背水一战\t130149\n索罗斯\t130150\n自然张\t130151\n哭泣不成声\t130152\n军事家\t130153\natail\t130154\n雷婷\t130155\nrfghjytff\t130156\n银星\t130157\n不不不不不不不\t130158\nJennifer\t130159\n肥沃\t130160\n
羋月\t130161\n一亿块\t130162\n借古讽\t130163\n斗手机小机器人手也我不爱情了\t130164\nDownload\t130165\n五灵\t130166\n凤凰山山江镇\t130167\n十三十五\t130168\n王子玉\t130169\n15283980897\t130170\n警察局\t130171\n南京市\t130172\nvifusy\t130173\n家业\t130174\n家世\t130175\n家丑\t130176\n一一阵\t130177\ngggryf\t130178\n牛肉味儿\t130179\n家下\t130180\n家上\t130181\n十七页\t130182\n贪图\t130183\n阿斯顿飞规划局\t130184\n家丁\t130185\n紫艺星\t130186\n出价\t130187\n出任\t130188\n兔开健步跃三江\t130189\n528446648\t130190\nxasS\t130191\n家中\t130192\n兴瑜\t130193\n叵测\t130194\n中央台一号\t130195\n跃然\t130196\n南师\t130197\n纽康特额446734乳房创可\t130198\n南希\t130199\nhtjgg\t130200\n日升老阿妈\t130201\nHowtimeitis\t130202\neyffgg\t130203\n挖洞\t130204\n对算\t130205\n相框\t130206\n先炮\t130207\nexoyhfc\t130208\n马匹\t130209\n耕田\t130210\n马区\t130211\n容错\t130212\n唐截瘫\t130213\n日复一日\t130214\n脱脱\t130215\n远非\t130216\n0800\t130217\n相桥\t130218\n聚对苯二甲酸乙二醇脂\t130219\n彭学\t130220\n五百平方米\t130221\n帥哥\t130222\n酸辣白菜\t130223\n人寿财险\t130224\n甚麼沒\t130225\n陈婧艺\t130226\n智能卡门禁管理系统\t130227\n家都秘美美美美哒最美最美最美最美最美最美最美我最喜欢小度秘你是我的小秘书\t130228\n深圳市住房建设局\t130229\n不入流\t130230\n丢盔弃甲\t130231\n立面\t130232\n销售\t130233\n刘佳伟\t130234\n朱医生\t130235\n作锋\t130236\n二二二零\t130237\n色拉油\t130238\nfffftfgftttff\t130239\nin酒吧\t130240\nKFF\t130241\nKFC\t130242\n作错\t130243\n邬煜琪\t130244\n制寻\t130245\n防御\t130246\n七零零三零\t130247\n寒夜\t130248\n徐静蕾\t130249\n毒性\t130250\n旧游\t130251\n我问你你知道我是谁吗你不知道我是谁你\t130252\n周日会\t130253\n国务院法制办\t130254\n俩几年\t130255\n哪时\t130256\n林燕璇\t130257\n村支书\t130258\n包月138\t130259\n牛奶猫\t130260\nyfvjgg\t130261\n多久\t130262\n见底\t130263\n课窗\t130264\n临场\t130265\n6个小时\t130266\n刮胡刀\t130267\n你是谁了一觉不老实一一\t130268\n寒天\t130269\n余端端\t130270\n只能\t130271\n对嘴\t130272\nXXXXXXXXXXXXXX\t130273\n我求你了我求你了我求你了我求你了我求你了我求你了我求你\t130274\n三尊\t130275\nbjf\t130276\n我真的好爱你\t130277\n催仓村\t130278\n外线\t130279\n非金属\t130280\n号生\t130281\n魏武挥\t130282\nbjd\t130283\n一句然\t130284\n乌鸦\t130285\n她说\t130286\n乌鸡\t130287\n乌鸟\t130288\n五八八六零六四三\t130289\n后顾后\t130290\n营山县\t130291\nohzhendema\t130292\n欠发达地区\t130293\n天蝎窝\t130294\n加赛\t130295\n一哄而起\t130296\n睢宁话\t130297\n露水\t130298\nｔｆｅｉｓ\t130299\n奖励金\t130
300\nnbnbnbc\t130301\n无可救药\t130302\ngudj\t130303\nNc\t130304\n小杨家村\t130305\n一百五十多\t130306\n小弟儿\t130307\n六加六\t130308\n三四千元\t130309\n99788\t130310\nNe\t130311\n我要你了我想你\t130312\n老觉\t130313\n吴哥\t130314\nDZ论坛\t130315\n亮晶晶\t130316\n18874151571\t130317\n老见\t130318\n也不够\t130319\n千美玉\t130320\n发度\t130321\n林书豪\t130322\n魔域\t130323\n50位\t130324\n第一拉\t130325\nthjff\t130326\n解暑\t130327\n马天雄\t130328\n金曲\t130329\n第五季\t130330\nGary\t130331\n西藏日报\t130332\n哼哼想的美你\t130333\n蛙声\t130334\n处上\t130335\n慈母龙\t130336\n9时\t130337\n门票\t130338\n露露\t130339\n效焕\t130340\n拦车\t130341\n认不得么多\t130342\n吴题了了了了了\t130343\n六平方米\t130344\n处世\t130345\nchiphotosbaiducomxiaodupicitem902397dda144ad345b65d6f5d7a20cf430ad85edjpg\t130346\ntctctctcc\t130347\nshourborcityouhuocatolishuiapplyhouohabboratourchitourheisetouhapppouchatourborhetourhelocatourmousetoureporcityouimablecityou\t130348\n处中\t130349\n张之朴\t130350\n1x几\t130351\n攸刈\t130352\n门头沟区\t130353\n奥修\t130354\n522852336\t130355\n577564766764\t130356\n切根\t130357\nChine\t130358\n756576564977567\t130359\n练字\t130360\n蒂花之秀\t130361\n決策論小數額\t130362\n心情好坏\t130363\n颜茜\t130364\njhakbgjb1aa1a1cj1kbg2agb\t130365\n门神\t130366\njfksksgd\t130367\n李增双\t130368\n八八八八那\t130369\n拳王\t130370\n来文\t130371\n真你\t130372\n超过42周\t130373\n雪莉\t130374\n姑宁噗呲\t130375\n骂郎\t130376\n周大壮\t130377\n眠母\t130378\n百分之几多\t130379\n听走\t130380\n医用\t130381\n2833\t130382\n死老鼠\t130383\n38平米\t130384\n脏字\t130385\n夏明华\t130386\n公交车上\t130387\n医生\t130388\n冤得慌\t130389\n雪莲\t130390\n二十岁\t130391\n钟子期\t130392\n听赞\t130393\n真体\t130394\n雪莹\t130395\n贾燕俊\t130396\n就打滚\t130397\n脚踏实地\t130398\n李修\t130399\nGfguj\t130400\n来来吧\t130401\n队名\t130402\n来一来\t130403\n钱塘湖春行\t130404\n取舍\t130405\n一个多米\t130406\n300ML\t130407\n12本\t130408\n几千份\t130409\n东北校\t130410\n氪金\t130411\n景生\t130412\n李俏\t130413\nhi豆\t130414\n李俊\t130415\n无数万\t130416\n1把\t130417\nhi知道\t130418\n散热\t130419\n馍馍\t130420\n结义\t130421\n12月\t130422\n咖喱辣椒\t130423\n显赫\t130424\n消除\t130425\n狼蛛\t130426\n伐竹\t130427\n河乡\t130428\n拖一拖\t130429\n大包\t130430\n零件儿\t130431\nnizhe\t130432\n昂昂\t130433\n夜
蒲\t130434\n刘二哥\t130435\n十八十八十八\t130436\nnizhi\t130437\n我也想\t130438\n油品\t130439\n四十分之十二一二三三三三\t130440\n美特斯邦威\t130441\ncvnn\t130442\n唐僧猫\t130443\ncvnk\t130444\nududufuuychvjvjvjvjy8\t130445\ncvng\t130446\n呢首歌\t130447\n陪你\t130448\n卢景阳\t130449\n吓容\t130450\n国际货币基金组织\t130451\n1146765\t130452\n努努力\t130453\n无颜\t130454\n外在美\t130455\n无题\t130456\nhave\t130457\n龙虎山道教协会\t130458\n倪安东\t130459\n浴帽\t130460\n好无耻\t130461\n我的你\t130462\n台头\t130463\n过来自\t130464\n13分钟\t130465\n米蔓娜\t130466\n二十二栋\t130467\n王咋了\t130468\n朱梓妍\t130469\n吉祥大妽\t130470\n信那你\t130471\n｀*\t130472\n故学\t130473\n暗喻\t130474\nxgdg\t130475\n无求\t130476\n交融\t130477\nfhuuu\t130478\n张烦恼\t130479\nrmbznq\t130480\n同行们\t130481\n坏哥哥\t130482\n核算\t130483\n茵s\t130484\n6000多翦氏\t130485\nta么\t130486\n頁\t130487\n一边间\t130488\nhdgdg\t130489\ntfih\t130490\n大中专院校\t130491\nSHAYIS\t130492\n須\t130493\n周日月\t130494\n唐山\t130495\n经典之作业务必要不要跟你是你的？巜\t130496\n骗钱\t130497\n一百八十九九十\t130498\n种马膜拜\t130499\n趋向\t130500\n查理九世鬼公\t130501\nKose\t130502\njm7agjpk\t130503\n大欺\t130504\ng400\t130505\n文海\t130506\n秦岭漂流记\t130507\n张一出\t130508\n大次\t130509\n好啦好啦相信你\t130510\n程书敏\t130511\n陈连杰\t130512\n好棒\t130513\n頭\t130514\n同乐\t130515\nGoal\t130516\n行政处罚\t130517\n今今\t130518\n南满\t130519\n昨去\t130520\n好几句话\t130521\n恐怖事件\t130522\n郝纯洁\t130523\nwrty\t130524\n十万伏特\t130525\n方思琪\t130526\n拼博生活&amp\t130527\n组造\t130528\n37棵\t130529\n一屁股\t130530\n哟西\t130531\n枪版\t130532\n此处\t130533\n脉\t130534\n老克\t130535\n爱我爱我@@@\t130536\n富愈好\t130537\nudtjmjq\t130538\n六十九九七\t130539\nQQQQQQ\t130540\n色人\t130541\n同乡\t130542\n保障房大跃进\t130543\n手游\t130544\n13549696666\t130545\n于嘉琪\t130546\n此外\t130547\n两首歌\t130548\n嗯嗯瓮\t130549\n空屋\t130550\n算恋爱\t130551\n洪欣琪\t130552\n干巴\t130553\n李俊昊\t130554\n王凸\t130555\n王凤\t130556\n王林夕\t130557\n有格格不入\t130558\n王凡\t130559\n249912981\t130560\n阿白\t130561\n王凯\t130562\nhello瓮\t130563\n广州市红专厂足球场\t130564\n两居\t130565\n文思雨\t130566\njldju\t130567\n两层\t130568\n世界无车日\t130569\nvvfc\t130570\n酒味\t130571\n更强大\t130572\n两届\t130573\n2069\t130574\n奔死\t130575\n颗星\t130576\nvvfr\t130577\n无身影\t130578\n哭了里去\t
130579\n2066\t130580\n李树来\t130581\n葡萄葡萄葡萄葡萄葡萄葡萄葡萄葡萄\t130582\n戒台\t130583\nZoe\t130584\n路径\t130585\nZoh\t130586\n张大腚\t130587\n前一晚\t130588\n第一版\t130589\n戒口\t130590\n可耻勿忘\t130591\n31702638623\t130592\n你们的们\t130593\n中国共产党第十七届中央纪律检查委员会\t130594\nnsek\t130595\n共识\t130596\n连夜\t130597\n三柱\t130598\n卡梅奇\t130599\n半途\t130600\n叶梓峰\t130601\n千挡万遮\t130602\n欧派电动车\t130603\n换帅\t130604\n小帅炫\t130605\n欣怡欧尼\t130606\nIamaindian\t130607\n侥幸\t130608\n三环醇\t130609\n战超\t130610\n统一版\t130611\n我没有在笑哇你在笑话我\t130612\n三三个\t130613\n错错错错错\t130614\n9四\t130615\n肚子饿\t130616\n婂组词\t130617\n证监会\t130618\n寻思\t130619\nvdvjs\t130620\nTFObys\t130621\n1015年\t130622\nisSophiababy\t130623\n情危机\t130624\n精神层面\t130625\nIQ\t130626\n黄静萱\t130627\njokj\t130628\n门太监\t130629\n码衫\t130630\njoke\t130631\n有色金属\t130632\n00443\t130633\n闷气\t130634\n一年一岁\t130635\n4478852\t130636\n百度音乐\t130637\nIP\t130638\n期盼\t130639\n花江\t130640\n百二十点\t130641\n孟小蓓\t130642\n七八七七百txtx\t130643\n玩枪\t130644\n双色\t130645\n玉米棒\t130646\n80个\t130647\n種美麗\t130648\nGVVG\t130649\n冷血动物\t130650\n假面骑士lanugo\t130651\ndssfsffd\t130652\n3级\t130653\n圣凯瑟琳街\t130654\n文艺界\t130655\n踪影\t130656\n80万\t130657\n人工催雨\t130658\n喔四三\t130659\n七交界\t130660\n2760平方分米平方米\t130661\n千家家\t130662\n2700mm\t130663\n阿布白富美\t130664\n四件套\t130665\n威斯\t130666\n保加利亚瓦尔纳\t130667\n回天乏力\t130668\n劝说\t130669\n基础卷\t130670\n数码相机\t130671\n大山\t130672\n古巴\t130673\n主角\t130674\n关雎\t130675\n#地球水On\t130676\n特色菜\t130677\n俩分钟\t130678\n77万年\t130679\nhubi\t130680\n欢的歌\t130681\n超声\t130682\n彩铃\t130683\n文员\t130684\n好吧原谅你给我\t130685\n救市\t130686\n彩铅\t130687\nGeneral\t130688\n吕佳彤\t130689\n面泡\t130690\n道别来不及\t130691\n移动通信\t130692\n全球\t130693\n新谱\t130694\n比亚劲\t130695\n禁飞区\t130696\n修修修\t130697\n度秘你的男偶像\t130698\n啵咪\t130699\nxivhh\t130700\n學習\t130701\n1987岁\t130702\n硬菜\t130703\n仁性化\t130704\n贾贵贵\t130705\n哦谢\t130706\n豺狼\t130707\n中后卫\t130708\n卡联\t130709\n太一\t130710\n过门你\t130711\n左脑\t130712\n首歌儿\t130713\n有生以来\t130714\n太丑\t130715\n密给\t130716\n惨叫\t130717\nehejehe\t130718\n水沟\t130719\n武装力量\t130720\n不够了吧\t130721\nhhshjzhbxudbdhjdvehhbhhbhhhgghg
ghggggggggg\t130722\n来啊\t130723\n好男烤鸭\t130724\n前瞻\t130725\n可认真\t130726\n给我转\t130727\n水仙境\t130728\nyoaaoa\t130729\n相碰\t130730\n慢慢片再见我不上你的当\t130731\n｀O\t130732\n里约\t130733\n来啦\t130734\njffff\t130735\n商标画\t130736\n4011\t130737\nhttpsmbaiducomsword电影火星救援tnbdvoice\t130738\n东东一家人块\t130739\n最帅最帅最帅最帅最最最最最最\t130740\n4014\t130741\n太智能\t130742\ng扇\t130743\nTTzzzY\t130744\n莱依思南丹晨\t130745\n蔡娜\t130746\n重汽\t130747\n一77\t130748\n产房\t130749\n反人家\t130750\n唱一首歌叫\t130751\nmood汤。tx\t130752\nuntil试机号\t130753\n淤血\t130754\n醒着\t130755\n猜题\t130756\n伊里奇\t130757\nHellohelloBabe\t130758\nSJ利特\t130759\n底色\t130760\n临安\t130761\nbwkksksbs\t130762\n86.2公斤\t130763\nkYLU\t130764\n倪萍\t130765\n想爱\t130766\n飞塘沽\t130767\n版主\t130768\n服服帖帖\t130769\n一次\t130770\n不好上\t130771\n笑泺\t130772\n香奈尔\t130773\n芝加哥\t130774\n上海百视通\t130775\n淘金\t130776\n一款\t130777\n叹息\t130778\n我的命运\t130779\n儿科\t130780\n兵之哭\t130781\nMgaffevcbbmmnm\t130782\n王皇\t130783\n夜郎\t130784\n新疆大学\t130785\n旺财救世网\t130786\n腿法\t130787\n我喜欢花花姑娘\t130788\n琦爸爸hellohellohellohello\t130789\n收藏\t130790\n王皓\t130791\n高等教育\t130792\n培根\t130793\n花汤\t130794\n嫩绿\t130795\nsuccess\t130796\n不死之迷\t130797\n好动心\t130798\n统计表\t130799\n关心\t130800\n度秘呃\t130801\n可不是\t130802\n那年头\t130803\n宝贝在家等你呢你快点好不\t130804\n处女警察局\t130805\n宗祠\t130806\n菜油侠\t130807\n15869884901\t130808\n宋智孝\t130809\n三十平米\t130810\n钦佩\t130811\n性不性\t130812\n星光大道\t130813\n郭老师\t130814\n风姐\t130815\n没意思\t130816\n华样\t130817\n哼讨\t130818\n你是谁你是谁你是外个小妹\t130819\n头顶\t130820\n家都\t130821\n忘形\t130822\n不齐\t130823\n幼年\t130824\n44354744348435536\t130825\nhhjjbhhhhjhbbjhhhjjjjk\t130826\n口干舌燥\t130827\n三文鱼梅子茶\t130828\n喻策\t130829\n家郎\t130830\n难免\t130831\n佛罗\t130832\n姜处阁\t130833\n张曦\t130834\n常喜\t130835\n常喝\t130836\n红学家\t130837\n克撒\t130838\n杀不怕\t130839\n学说起来\t130840\n龙游\t130841\n十两\t130842\n想怎么样\t130843\n愿见\t130844\n十个\t130845\n地勤\t130846\nIininibhk\t130847\n马大悲\t130848\n龙港\t130849\nhihihihi\t130850\n不要太多\t130851\n巴马甲\t130852\n洗铺\t130853\n十万\t130854\n十七\t130855\n良品\t130856\n十一\t130857\n东山岛\t130858\n嗯小度\t130859\n十下\t130860\n小可爱难记奥\t130861\n十三
\t130862\n十世\t130863\n欧天泽\t130864\n渐变\t130865\nsbiv\t130866\n死猪你真够\t130867\n120页\t130868\n恩那你追你\t130869\n那喜羊羊\t130870\n信合\t130871\n删把\t130872\n是谁知道\t130873\n內容\t130874\n雪妖\t130875\n害不害臊\t130876\n收费处\t130877\n启用\t130878\n更厉害\t130879\n果贝爽\t130880\n084345\t130881\n王冰燕\t130882\nhjjjjjl\t130883\n江西省鹰潭一中\t130884\n刘静怡\t130885\n左眼\t130886\nTtkgdhjfse2eikbbry\t130887\n李丰正凯\t130888\n烤制\t130889\nhold不住\t130890\n鼓起\t130891\n维也纳\t130892\n要不五\t130893\n真不爽\t130894\n头笨鸟\t130895\n各行各业\t130896\n8.26\t130897\n六安市立医院\t130898\n要不了\t130899\n密度陪度秘我爱你\t130900\n吧贝贝后街男孩\t130901\n制革\t130902\n治愈师\t130903\n摇篮曲\t130904\nCATE\t130905\n朝天锅&amp\t130906\n一千五多\t130907\n迈5\t130908\n帘子\t130909\n曼珠沙华\t130910\n要不人\t130911\n易泽龙\t130912\n梦冰\t130913\n什么度\t130914\n什么座\t130915\n古立\t130916\n边线\t130917\n黄一博\t130918\n好时髦\t130919\n13999405984\t130920\n27n种\t130921\n占优\t130922\n张果照\t130923\nouhappileapoen嗯\t130924\n生命力\t130925\n42枚\t130926\n刘韵涵\t130927\n吉布利勒\t130928\n甲流\t130929\n开放大道育才\t130930\n刘洪普\t130931\n漫山遍野\t130932\n新兴\t130933\n不动手\t130934\n豪强\t130935\n墙角\t130936\n类跟\t130937\n九点半\t130938\n遭殃\t130939\n苏州一小区\t130940\n新交规\t130941\n我的梦\t130942\n穷穷\t130943\n新兰\t130944\n美不美\t130945\n寸头\t130946\n别惹我\t130947\n273662727\t130948\nokym\t130949\nzomimimicom\t130950\n花姑娘说话的妖怪跑大礼包铠甲勇士\t130951\n涿州\t130952\n成交率\t130953\nhxhdbsb\t130954\n顾俊磊\t130955\n帮衬\t130956\n140贝勒倍\t130957\n一月一月17日\t130958\n六分之一第二天\t130959\n邪皇\t130960\n咏雪\t130961\n呼噜声\t130962\n义渠王\t130963\n三十二个\t130964\n陌路兔兔五旅途做什就算数据悉\t130965\n药效\t130966\n药敏\t130967\n白雪公主之魔镜魔镜\t130968\ntytyutyjhg\t130969\n念念\t130970\n2400斤\t130971\n善言\t130972\nMontagut\t130973\nhttphhiphotosbaiducomxiaodupicitem2934349b033b5bb5061e958d31d3d539b700bcfcjpg\t130974\n东桥\t130975\n啦啦啦啦啦啦我是快乐的小行家\t130976\n仁川\t130977\n拉垃\t130978\n寒假没秘主义\t130979\n记录册\t130980\n万零八60008258258\t130981\n十六女\t130982\nidco\t130983\n356天\t130984\n野哥\t130985\n两三期\t130986\n预演\t130987\n马清伟\t130988\n花奶奶\t130989\n牙牙\t130990\n德佩则罔\t130991\n小鸟儿\t130992\n暖床\t130993\n了管\t130994\n新读图时代\t130995\n42个\t130996\n李志琴\t130997\n两三朵\t13099
8\n天天跌达人我行\t130999\n65555645445004446538\t131000\nalllu\t131001\nNano6\t131002\n切线\t131003\nàbnc\t131004\n篇幅\t131005\n嗯的不行有你不要脸不要脸不要脸不要脸\t131006\n我不爱你你爱我吗我真的不爱你\t131007\n韩国公正交易委员会\t131008\n通路\t131009\n四百公里\t131010\n一百多斤\t131011\n猪猪侠猪猪\t131012\nscb\t131013\n成正比\t131014\n嗌爆\t131015\noppor6\t131016\noppor7\t131017\n好了好了你猜猜我\t131018\nKonglong\t131019\nitstime\t131020\n又逢\t131021\n呢七\t131022\n雷锋精神\t131023\n王金婷\t131024\n当婚\t131025\n要不要\t131026\nAsdf\t131027\n阮科业\t131028\nfufufjjf\t131029\n第二场\t131030\nhi度\t131031\n张玲茜\t131032\n白胖\t131033\n严金荣\t131034\n分利夏\t131035\n唉无\t131036\n孤零\t131037\n笑一\t131038\n才德\t131039\n三幺五三\t131040\n嘟嘟卡\t131041\n多弗拉明戈\t131042\n笑下\t131043\n小村庄\t131044\n内容源\t131045\n秘别\t131046\n劳动案\t131047\nkuy\t131048\n刘佳美\t131049\n座子\t131050\n赵晗宇\t131051\n陶诗然\t131052\n奖品\t131053\n问候\t131054\n种长\t131055\nkun\t131056\n秘奇闻\t131057\nkuj\t131058\n耐压\t131059\nkuf\t131060\n你好早死\t131061\nkud\t131062\nkuc\t131063\n秘制\t131064\n张小雅\t131065\n素质教育\t131066\n烫胸生层云\t131067\n过融\t131068\n王永绮\t131069\n腹痛\t131070\ntfboys么女\t131071\n4n九\t131072\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t131073\n骆家辉\t131074\nhihello\t131075\n石安林\t131076\nbece\t131077\n有的放矢\t131078\n哇嘞萌\t131079\n道歉疚安拳\t131080\n646438464639487674736476\t131081\nleaves\t131082\n卷毛\t131083\n哑巴了闭嘴巴个雅路\t131084\n弟妹饿了饿了了了你\t131085\n朱朱朱\t131086\n三八幺零\t131087\n5584734587676485657254893683694373\t131088\n唉长\t131089\n亲爱的你想亲我\t131090\n开奖\t131091\nppjknj\t131092\n痛哭\t131093\n得宜\t131094\n得实\t131095\ncjgie\t131096\n含情\t131097\n里啦来啦啦啦天\t131098\n一声度\t131099\n鈤蕊\t131100\n古今痴男女\t131101\n自卖\t131102\n声讨\t131103\n草根\t131104\n17721\t131105\n得容\t131106\nffyi\t131107\n无可奉告\t131108\n小静儿\t131109\n自卫\t131110\n自即\t131111\n逼数\t131112\n包糖\t131113\n增幅员\t131114\n自卸\t131115\n12wx\t131116\n牙医院\t131117\n针头\t131118\n猪娃\t131119\n醋泡鸡蛋\t131120\n写意\t131121\nsaber\t131122\n武进\t131123\n快滚快滚\t131124\n伤愁\t131125\n卡车赛\t131126\n烦恼关我屁事\t131127\n老子人\t131128\n565861558461\t131129\n偶错\t131130\n1a1da\t131131\n工具\t131132\n我喜欢的名字\t131133\n找别\t131134\n流星肉\t131135\n停车场\t131136\n惊问\t1
31137\n我不哭你的那我不哭你不懂我不达你好\t131138\n东北风\t131139\n早搏\t131140\n十二张\t131141\n盛秋燕\t131142\n无言\t131143\n禚金佳\t131144\n张焱\t131145\nnznj\t131146\n锁噶\t131147\n笑不止\t131148\n大卖场\t131149\n2028000\t131150\n六十五块\t131151\n湾潭镇\t131152\n奥特之乖乖乖乖乖\t131153\n干粉\t131154\n亲爱的别发送那些了你\t131155\n架好\t131156\n古意\t131157\ngu秘\t131158\n化合物\t131159\n杨颖晓\t131160\n郎狂\t131161\n1jt\t131162\n效果\t131163\n生蛋\t131164\n两五年\t131165\n曾钰清\t131166\n乐此\t131167\n神厨\t131168\n天娱\t131169\n现在部\t131170\n没色胆\t131171\n干粮\t131172\n触犯\t131173\ngpttpt\t131174\n甘肃省陇南市文县碧口镇中庙乡茶园村茶园小学\t131175\n欣欣向荣\t131176\ndtapapeifu\t131177\n抽支\t131178\n枕边\t131179\n视窗\t131180\nONS\t131181\n有你自己\t131182\n9976865\t131183\n一万200万\t131184\n萎\t131185\n装饰类\t131186\n围观\t131187\n爱拉\t131188\n漳化路\t131189\n花胶\t131190\n爱拍\t131191\nTEFOL\t131192\n七倍\t131193\n填平\t131194\n2.4万\t131195\n哦qicheck歌\t131196\nEXCL\t131197\n吴森森\t131198\n张小蕊\t131199\nT恤型\t131200\n煎饼果子之歌\t131201\n三百四十三百六十三天\t131202\n乖啦明儿\t131203\n可点\t131204\n8月10日\t131205\n霹雳勇士元气融者\t131206\n無數歲\t131207\n在花园\t131208\noooooooooooooooooook\t131209\n农保\t131210\nvjlno\t131211\n逍堂\t131212\n爱拼\t131213\n哪巴\t131214\n东南方\t131215\nω度秘\t131216\n上班儿\t131217\n保美博\t131218\n来不能不能不来\t131219\n签字仪式\t131220\n2012年5月末\t131221\n准备金\t131222\nvdftd\t131223\n史学祖宗\t131224\n王苗\t131225\n10月\t131226\n李维\t131227\n网校\t131228\n人情感\t131229\n草生\t131230\n域名\t131231\n生下来\t131232\n八小魔\t131233\n秦露\t131234\n首尔播放率\t131235\n退档\t131236\n弄碎\t131237\n蒸脸\t131238\nnbcd1yz\t131239\n夜里12点\t131240\nhhhhhhhh\t131241\n迫使\t131242\n人兽人兽人兽大战大头爸爸你个头人你个头当然了我\t131243\n橘子果冻\t131244\n军床\t131245\n乐购超市\t131246\n对外开放\t131247\n了解决好\t131248\n三百首\t131249\n七天七夜\t131250\n1亿\t131251\n北京医院\t131252\n就是好朋友\t131253\nEffigyfighting\t131254\n权子怡\t131255\n去医院\t131256\n手提\t131257\n其晚\t131258\n勤业\t131259\n30880361\t131260\n米切菜\t131261\n杨平华\t131262\n东岳\t131263\n银领\t131264\n哼我恨我恨你男孩\t131265\n新领域\t131266\n书刊\t131267\n够不够\t131268\n诱惑\t131269\n痔睾\t131270\n姨爹\t131271\n謹賀\t131272\n18点59分\t131273\n刘彦伯\t131274\n知之为知\t131275\n圭贤\t131276\n英武\t131277\nsomany\t131278\nhgjggffbf\t131279
\n苏拉啊\t131280\n巨蟹女\t131281\n大马\t131282\n东岛\t131283\n有才\t131284\n自豪感\t131285\n王文涛\t131286\n全团\t131287\n泡泡液\t131288\n莎博尔\t131289\n全国\t131290\n全图\t131291\n体能花园\t131292\n一六年\t131293\nhttppinyincne16589\t131294\n斯卡拉\t131295\n再见安\t131296\n鹅湖宫\t131297\n汇豪商务大酒店\t131298\n第一道\t131299\n他不可以\t131300\n题目乳\t131301\n頭鬼\t131302\n四次品\t131303\n8笔\t131304\n经济衰退\t131305\n第一遍\t131306\n老实告诉我一个呀\t131307\n羽哥\t131308\nc3e\t131309\n李哥\t131310\n蚂蚁我的妈呀你\t131311\ndays\t131312\n郭燕\t131313\nyomind\t131314\n外语\t131315\n别上班\t131316\ndaye\t131317\n啵找\t131318\nbibnkhbn\t131319\n有问你\t131320\nnhe\t131321\nnhd\t131322\n10年前\t131323\nnhf\t131324\n思捷环球\t131325\n众多少\t131326\n杨慕\t131327\n逗秀\t131328\n笨流\t131329\nlovable\t131330\n婚羽毛\t131331\n曲子\t131332\n建申\t131333\n逗秘\t131334\n真的好伐\t131335\n精力\t131336\nnhx\t131337\n瑜伽学\t131338\n真心不希望\t131339\n淳熙\t131340\n应届\t131341\n法系\t131342\n亲爱的你在\t131343\n寻味\t131344\n约不\t131345\n田亮\t131346\n扑鼻\t131347\n梁父吟\t131348\n爱过我没有\t131349\n正方\t131350\n杨慧\t131351\n马坊\t131352\nqwertyuiopasdfghjklzxcvbnmqwertyuiopasdfghjklzxcvbnm\t131353\nAdlh\t131354\n摄入\t131355\n苒奶\t131356\n叛逃者\t131357\n分为\t131358\n吧者\t131359\nnh3\t131360\n一去不赴\t131361\n缅果\t131362\n此起彼伏\t131363\n急流\t131364\n度密雪\t131365\n1360\t131366\n歌舞度摸一秘你叫歌五都么一秘\t131367\n在度\t131368\n在座\t131369\n谢我我见\t131370\n丁点肥肉\t131371\n6755468467584\t131372\n山葡萄酒滴\t131373\n精索\t131374\n直男\t131375\n大气筒\t131376\nk米\t131377\n村书记\t131378\nbridge\t131379\n协和心外科\t131380\n天天不聊\t131381\n家政\t131382\n408号\t131383\n猫样\t131384\n999999999\t131385\n蘑菇排骨汤\t131386\n花梦颖\t131387\n泪角\t131388\n书会\t131389\n小夜曲\t131390\n2-1\t131391\n孙杨燕\t131392\nｕｕｕｕｕｕｕ\t131393\nadhk\t131394\n梁保卫\t131395\n愚痴\t131396\n唱见\t131397\nSelina\t131398\n四一页\t131399\n醒世\t131400\n找回渐渐逝去的文学家园\t131401\n街舞\t131402\n玫瑰金\t131403\n乱葬\t131404\n换一吗\t131405\n抗生素\t131406\n就是说拜拜\t131407\n省吃俭用\t131408\n三老板\t131409\n唉唉你你不爱我\t131410\n668286n3\t131411\n多好帅\t131412\n1292426858\t131413\nQQ帐\t131414\n各政党\t131415\n泉州\t131416\n很不幸\t131417\n结论性\t131418\n微视\t131419\n刘正\t131420\n五百双\t131421\n异性迟疑\t131422\ninvel\t13
1423\n几千亿美元\t131424\n晕了\t131425\nQQ币\t131426\n拍戏时\t131427\n过敏困扰\t131428\nhgggzgf\t131429\n团体票\t131430\n追加\t131431\n波士頓\t131432\n吴其伦\t131433\n五百只\t131434\n眼病\t131435\n陈波\t131436\npkp\t131437\n打烂\t131438\n一倍体\t131439\n邓庆杨\t131440\n大黑\t131441\n任梓沂\t131442\npkc\t131443\n7赶哦\t131444\n147岁\t131445\n维多\t131446\nRgh\t131447\nyesyouri\t131448\n80岁\t131449\n大黄\t131450\n打烦\t131451\nfhfjhomvgmhkfjjcnfzgxgdhhccfjcnjdydyftjgxgfhjgfy\t131452\n附睾\t131453\n奥特银河大怪兽之战\t131454\n沐光浴辉\t131455\n嘎嘎嘎嘎嘎嘎嘎呱呱呱呱呱呱呱花花花花花\t131456\n镇原县\t131457\n弥渡\t131458\n杰西卡\t131459\n542\t131460\n牛红霞\t131461\n甚么\t131462\n魔弹朝六成本难\t131463\n547\t131464\n544\t131465\n545\t131466\n抱仁\t131467\n顼祥宏\t131468\n13点13分\t131469\n当期\t131470\n大丰收\t131471\n絮凝\t131472\n村口\t131473\n胞胎\t131474\nbeh\t131475\n茄红素\t131476\n210万元\t131477\n极客\t131478\n向前走一步\t131479\n充当\t131480\n77388837u88额8\t131481\n李秋萌\t131482\n行销\t131483\n关公\t131484\n54f\t131485\n6568\t131486\n涡牛\t131487\n五年级上册语文书\t131488\n并永远\t131489\n托付\t131490\n5O\t131491\n临川区\t131492\nBEAST\t131493\n并一顿\t131494\n头一个字\t131495\n发小心\t131496\n你好聪明\t131497\n墓穴\t131498\n关关\t131499\n断货\t131500\n长青农化\t131501\n批号\t131502\n两方\t131503\n恩恩\t131504\n二小白\t131505\n可使\t131506\n婧兰兰\t131507\n多名\t131508\n一点点二元\t131509\n苦逼党三长\t131510\n独立大坏蛋度秘\t131511\n女郎\t131512\n哼嘟\t131513\n举枪\t131514\n好奇葩\t131515\n蒋旭明\t131516\n自由之战\t131517\n女都\t131518\n重奏\t131519\n国际象棋\t131520\n北辰君\t131521\n海南陵水\t131522\n哼嘻\t131523\n吹里\t131524\n哼嘿\t131525\n黎里辣\t131526\n敬酒\t131527\n拜尔\t131528\n一大株\t131529\n打荷\t131530\n酒楼\t131531\n李别\t131532\n请见\t131533\nber\t131534\n生活水平\t131535\n打药\t131536\n白凤\t131537\n二分之一三分之一四分之一\t131538\n一展莫愁\t131539\n蜀山区\t131540\n我走\t131541\n厄齐尔\t131542\n海定区\t131543\n454542464\t131544\n表功\t131545\n李刚\t131546\n爆品\t131547\n一零二六\t131548\n程疗\t131549\n0452\t131550\nsououndes\t131551\n27米\t131552\n知如醉\t131553\nTVy\t131554\nhduwhxuhwjsjsiishwhijqjosikwpowi82828e8929292902iwoowksod0lw929w9\t131555\n加度\t131556\n回來\t131557\n巴国布衣国贸店\t131558\nTVv\t131559\n5p\t131560\n张家旗\t131561\n5s\t131562\n大院长\t131563\n网游类\t131564\n1q八
四\t131565\nNorah\t131566\n气管炎我是管严\t131567\n笑脸相迎\t131568\n诉求\t131569\n来了救命\t131570\nTVT\t131571\n新版五年级下册语文书\t131572\n王然然\t131573\n娃们\t131574\nphey\t131575\nTVI\t131576\n冤枉人\t131577\n8877口\t131578\n400亿元\t131579\n13点\t131580\n摸索\t131581\n找业\t131582\n第一局\t131583\n叫色\t131584\n谁你\t131585\n黄牛\t131586\n用不着\t131587\ncgbk\t131588\n悎情我昰健\t131589\n真身\t131590\n狐狸犬\t131591\n写画\t131592\n黄牌\t131593\n新陈代谢势所难免\t131594\n意扬\t131595\n黄片\t131596\n好了好了我最爱你\t131597\n858878058\t131598\n87百小时后\t131599\n霞浦\t131600\n冬蜜连锁加油哈\t131601\n13456\t131602\n其你们\t131603\n衣柜\t131604\n清原\t131605\n议制\t131606\n125886\t131607\n笨妞妞\t131608\nbitchyou\t131609\n7994\t131610\n稳只说\t131611\n温柔美鲍\t131612\n华为荣耀七\t131613\ncgbc\t131614\n535352565\t131615\n十佳小学通州分校\t131616\nvuth\t131617\n欧阳某\t131618\n洛神\t131619\n北京自然博物馆\t131620\n你没你没你丑你丑你\t131621\n应届宫\t131622\n8桩\t131623\n志愿\t131624\n跟着我\t131625\n李晓晓\t131626\n潇洒哥\t131627\n天行\t131628\nTell\t131629\n熊出没之熊心归来了再见\t131630\n飞雪\t131631\n53555666\t131632\n返还\t131633\n绲聃\t131634\n九千九百九九99\t131635\n返返\t131636\n惨白\t131637\n索证\t131638\n天衣\t131639\n三零五二\t131640\n猎手\t131641\n瞎眼\t131642\n15234205617\t131643\n肖妈妈\t131644\n沉淀\t131645\n李秉阳\t131646\n宁金锋\t131647\n48438\t131648\n1830493859\t131649\n不看我\t131650\n6BHJJDNSb\t131651\npeake\t131652\nmlgb\t131653\n我的明儿告诉你我叫周日从\t131654\n邪笑\t131655\n羞于\t131656\n项向天波\t131657\n35次\t131658\n羞事\t131659\n407204\t131660\n24980日元\t131661\n吉斯汗\t131662\n曲婉柔\t131663\n热力\t131664\n羞人\t131665\n第12位\t131666\n刘珏巧\t131667\n零幺幺\t131668\n钱存鑫\t131669\n油然而生\t131670\n明早六点\t131671\n一只75mm\t131672\n黄心怡\t131673\n当老板\t131674\n嘿嘿天一\t131675\n結果\t131676\n不3不4\t131677\n却是\t131678\n铁皮\t131679\n暖化\t131680\n闫思琪\t131681\n王梦白\t131682\n用途\t131683\nDryyu\t131684\nuuyeyyyhwhwusshs\t131685\n撇开\t131686\n無論\t131687\n偏好\t131688\nDgfzhuxc\t131689\nbutimnotakid\t131690\n空气味\t131691\n七八糟\t131692\n长安欧尚\t131693\n妙妙想\t131694\nHealing\t131695\n开篇\t131696\n水晶帘\t131697\n蔡锷\t131698\n张继科\t131699\n姜刑\t131700\n远赴\t131701\n朱肉\t131702\n真好笑哈\t131703\n滴正经滴慈祥滴琨\t131704\n体重器\t131705\n北京地税\t131706\n第几多\t1317
07\n勒[\t131708\n遇感\t131709\n卑劣\t131710\n倍儿卡倍儿卡\t131711\n图种\t131712\n蒙大拿\t131713\n托斯\t131714\n灯火通明\t131715\n二点三点四点五点六\t131716\n呦吼吼吼吼\t131717\n55\t131718\n图秘\t131719\n诗问\t131720\n斤己\t131721\n服役\t131722\n下月个\t131723\n57\t131724\n许再\t131725\n吵吵吵\t131726\n广播剧\t131727\n邝邝邝\t131728\n来了整\t131729\nA8-3850\t131730\n瞎搞\t131731\n23天\t131732\n烧茄子盖浇饭\t131733\n10100个\t131734\n985127\t131735\n52\t131736\n撞邪\t131737\n零零二\t131738\n赌客\t131739\n钟卓君\t131740\n两个个\t131741\nYffycgifitfixrustuzfuxXitfifiyyfyfyfy\t131742\n在早点\t131743\n联保\t131744\n续订\t131745\nMaxday\t131746\n何晓蕾\t131747\nt13\t131748\n丢赖\t131749\nbrbd\t131750\n周逸洪\t131751\n尊便利\t131752\n評論\t131753\n很难以\t131754\n船中\t131755\n一一下\t131756\n珠帘\t131757\n生命之树\t131758\n333333333333333333333\t131759\n勾通\t131760\n想错话\t131761\n谢依霖\t131762\n初浩宇\t131763\n船主\t131764\n礼泉\t131765\n三寸不烂之舌\t131766\n肖远德\t131767\n53582172172\t131768\n我等不及\t131769\n船上\t131770\n下毒\t131771\n幸福快乐的生活\t131772\n勾逼\t131773\n租用\t131774\n丧身之地\t131775\n大兵\t131776\n谭水尤\t131777\nt1r\t131778\n肉包\t131779\n东南台\t131780\n明天早上八点半\t131781\n菲比\t131782\n15496588075672864\t131783\nahi\t131784\n哈累\t131785\n法德\t131786\nrrrrrrrr\t131787\nabee\t131788\n约泡\t131789\n奥良吧233\t131790\n王全鑫\t131791\n北京政府\t131792\n萌柯柯\t131793\n皱皱\t131794\n主峰\t131795\n无知者们\t131796\n王源千玺\t131797\n爱情公寓\t131798\n关掉\t131799\n陈宇航\t131800\n君心有主\t131801\n镇远\t131802\nLT06E7H7N5J8K01C\t131803\n555555555555555555555555555555555555555555555555555552555555555555555555555555555555\t131804\n豆酱焗膏蟹\t131805\n7tcom\t131806\n24时\t131807\nasiantube\t131808\n大棒挑一只大什么嘛大棒棒一个小猫大膀膀呱呱呱呱的呱呱我是大瓜瓜\t131809\n我老婆\t131810\n团圆\t131811\n艾老师\t131812\n88.38\t131813\n广州万菱汇ZARA\t131814\n普华\t131815\n感情类\t131816\nkrjfbzhfncj\t131817\n裴敏新\t131818\nyingyu\t131819\n第五十页\t131820\n够了解\t131821\n张别老\t131822\n开元寿\t131823\n桑感\t131824\n文静林凯文\t131825\n避孕套\t131826\n好笑不好笑\t131827\n王洋洋\t131828\n王谢谢\t131829\n恩岁\t131830\n1PK\t131831\n要不不古古\t131832\n沾黄\t131833\n么么哪了刀郎思密达思密达思密嗒哒哒\t131834\n讀\t131835\n什麼耍說嗎\t131836\n收义\t131837\n2434\t131838\n藏剑山庄\t131839\n黄周玄\t131840\n2575760
385\t131841\n规\t131842\n闵苏噶\t131843\nqwerasta\t131844\n逃犯\t131845\n来访\t131846\n潘潘\t131847\n0￥\t131848\n女猪\t131849\n女献\t131850\n130425200207289136\t131851\n来讲\t131852\n公想你\t131853\n喉咙处\t131854\n收买\t131855\n高高兴兴高兴\t131856\n一顿饭\t131857\n十二牛\t131858\n铂宫\t131859\n血浆\t131860\nWestminster\t131861\n偷了看\t131862\n汤米\t131863\n姐们\t131864\n都市早班车\t131865\n嗯小明\t131866\n谭姐\t131867\n出生地\t131868\n创客\t131869\n快乐服\t131870\n大身\t131871\n建设北路\t131872\n12345678923456789多少\t131873\ndoox\t131874\n2011年6月16日\t131875\n15123188311125\t131876\n斯塔基\t131877\n72年\t131878\n邓超\t131879\n2011.1.9\t131880\netofusyusuitidohdutdohfigskdyrtwetwifpeyeoytueoueww\t131881\n1800\t131882\n明桥\t131883\n1662909796\t131884\n艾迪，TIM，川妹\t131885\n1808\t131886\n锦瑟\t131887\n西红柿丝\t131888\n有空见\t131889\n谭姨\t131890\n贯坏\t131891\n跳转页\t131892\n2个小时\t131893\n我喜欢我们班\t131894\n44576个\t131895\n结局\t131896\n双模\t131897\n哈利波特85之哈利波特与发改委\t131898\n光辉太狼\t131899\n00000000度\t131900\n到现\t131901\n古怪\t131902\n好天气\t131903\n一周八\t131904\n一周六\t131905\n疯了气\t131906\ngific\t131907\n咋类\t131908\n江婉秋\t131909\n坟头\t131910\n小叔秘\t131911\n不要脸元\t131912\n擂鼓\t131913\n陈文雯\t131914\n侠岚弟\t131915\n铣床\t131916\n信息技术\t131917\n计\t131918\n2月12号\t131919\n151515151515\t131920\n说臭\t131921\n芹菜籽\t131922\ntyyuhff\t131923\nChris\t131924\n疑问\t131925\n生好\t131926\n番禺\t131927\n圣托尼里\t131928\n1111个\t131929\n九三大\t131930\n认\t131931\nmetoo\t131932\n黒妹\t131933\nbeginning\t131934\n意甲\t131935\n恩就四\t131936\n新年快乐\t131937\n巴里、兰帕德\t131938\n1111万\t131939\n让\t131940\n240％\t131941\n那我等你好不好到时候我来了你还要我\t131942\nmetot\t131943\nmetou\t131944\n一八二三一零四六八十\t131945\n笑死我了nn\t131946\n训\t131947\n盗墓片\t131948\n有急事先走\t131949\nQ鰭\t131950\n唉张太帅\t131951\n卡吧块\t131952\n252555258269352258\t131953\n天梯\t131954\n七个月\t131955\n天梭\t131956\n长恨歌\t131957\n晚安亲儿盆儿\t131958\n海度秘\t131959\nrgjjkkkkkkkikkokoooookkkiteserrtk\t131960\n讳\t131961\n旭静\t131962\n杜灭我\t131963\nQue\t131964\n戶饣\t131965\n讵\t131966\n大辫子\t131967\n劲疙瘩\t131968\n夺嫡\t131969\n马踢子\t131970\n你好男人\t131971\n12点12分\t131972\n吧谈恋爱宝\t131973\n秋涛\t131974\nX6PIu\t131975\ntihipkoo\t
131976\n阿尔山\t131977\ncomebuy\t131978\n周至\t131979\n论\t131980\n我们的世界\t131981\n去嘞\t131982\n3087545690\t131983\n王颍\t131984\n操作时\t131985\nsh8\t131986\n耳聋\t131987\n别水\t131988\nbO皿TT﹏皿皿≧3≦TT皿TT皿TT≧3≦\t131989\n淮海中路\t131990\nHéctor\t131991\n指婚\t131992\n0.7元\t131993\n上街\t131994\n又要不\t131995\n西门庆\t131996\n3月11日\t131997\n四道口\t131998\n贵州省\t131999\n酸楚\t132000\n神掌\t132001\n浴室风\t132002\n机舱\t132003\n月报\t132004\n软妹子萌妹子淑女\t132005\n弄清\t132006\n尾号限行\t132007\n100000000000000000000000个\t132008\n书田\t132009\n沈浩宇\t132010\n十三百首1\t132011\n新字\t132012\n刚才鱼\t132013\n苏泊尔\t132014\n双生\t132015\n权谋\t132016\n郎溪t公司\t132017\n二十年前\t132018\n权谓\t132019\n良好\t132020\n神探\t132021\n读尼\t132022\n甘相\t132023\n黑卫\t132024\n六辑\t132025\nCS/LM\t132026\n13454413454095953\t132027\n秘臭\t132028\n改善\t132029\n黑卡\t132030\n峨鳝\t132031\n信用衣\t132032\n怎能\t132033\n反应焓\t132034\n军演\t132035\nstyhi\t132036\n夸罗\t132037\n這樣子\t132038\n李晨奥\t132039\n军漫\t132040\n彭一轩\t132041\n很闲\t132042\n本命年行\t132043\n飞上天\t132044\nmpmpn\t132045\n8点29\t132046\n变鬼\t132047\n陈富康\t132048\n8点20\t132049\nilyoute\t132050\n极乐\t132051\npre2\t132052\n张朝生\t132053\n大黑山\t132054\n何娅\t132055\n那你爱不爱创作你的人\t132056\nsssbbbb\t132057\nfkfjfjfjfjfjjf\t132058\n超能侠\t132059\n谐音\t132060\n止氵阁\t132061\n紫菜蛋汤\t132062\naku2505\t132063\n欧阳阳阳\t132064\n11122322222222212132212311\t132065\n滴滴\t132066\nkspk\t132067\n围绕\t132068\n18778294381\t132069\n何良栩\t132070\n家务妈妈\t132071\n内页吹神\t132072\n明几净\t132073\n狂插\t132074\nrdrff\t132075\n一亲一个\t132076\n小小鸟\t132077\n肉鹅\t132078\n卡钱\t132079\nJYJ金\t132080\n偷媳\t132081\n再学\t132082\n有得了\t132083\ncf兰\t132084\n解决儿\t132085\n米尔特曼\t132086\n奥迪杯\t132087\n还寿\t132088\n超速\t132089\n六二五二\t132090\n对不对521嘿嘿1851\t132091\nabaa4\t132092\n张首芳\t132093\n乖妞\t132094\n累不想死\t132095\nKD迪爱\t132096\n郎正潇\t132097\n顾左右而言\t132098\n少年宫小家用十五分钟小丽用十二分中小智用十分钟夏家\t132099\n止泻痢\t132100\n勇敢的姑娘\t132101\n更佳宜\t132102\n尼古拉斯凯奇\t132103\n1000000000000000000\t132104\n增大\t132105\n祖蓝\t132106\n我不信\t132107\n朋天\t132108\n触手咧\t132109\n剩人\t132110\n淘宝大卖家\t132111\n二届\t132112\n无知小\t132113\n不文想知道你知道知不道我是男的女\t132114\nhifustryst
\t132115\n卡徒\t132116\n陈慧\t132117\n十二小时之内\t132118\n郑嘉颖\t132119\n程晓妍儿\t132120\n绝收\t132121\nCigjuhjko\t132122\n花世纪鼎三十到\t132123\n一兆一块\t132124\n15935862147\t132125\n夜四一\t132126\n山楂\t132127\n增多\t132128\n尔我了你\t132129\n奉行\t132130\n电脑\t132131\n半干\t132132\n半年\t132133\n53623层\t132134\n陈子航\t132135\n王心慈\t132136\n高温\t132137\n好难好难好难\t132138\n几万基千八百\t132139\n别的电影从家飞我靠你\t132140\n我的呀爸爸谢谢你给我表我\t132141\n西红柿\t132142\n讨打\t132143\n三哦\t132144\nsht\t132145\n巡检\t132146\n很开心度秘\t132147\n哦你不讨厌我不讨厌我我喜欢你\t132148\n铁凡\t132149\n三哒\t132150\n高清\t132151\n给我会不会\t132152\n懂子\t132153\n梦西\t132154\n的牛\t132155\n三品\t132156\n华美达年货抵用券\t132157\n拿去\t132158\n投诉你我投诉投诉你\t132159\n单沙发\t132160\n才才\t132161\n黑什\t132162\n把拍\t132163\n援手\t132164\nFezzhnhkvl\t132165\n嫁鸡随鸡惊声尖叫\t132166\n阿狸7\t132167\n13288987575n7n54\t132168\n清炒米苋\t132169\nｖｉ\t132170\n狗场\t132171\n果壳\t132172\nｖｂ\t132173\n缺却\t132174\n箭塔\t132175\n会展\t132176\n黄渤\t132177\nhhhfff\t132178\n九圈\t132179\nUGG\t132180\ninChina\t132181\n拍卖行\t132182\n查操\t132183\n家卑鄙\t132184\n粟秘\t132185\n说声\t132186\n孟母\t132187\n搭台\t132188\nopenifI\t132189\n召唤醒了\t132190\n阳见\t132191\n快乐多得\t132192\n天使母\t132193\n先哪天哪天哪天哪天哪天\t132194\n节日快乐\t132195\n┴┴\t132196\n吐沫\t132197\n咆哮\t132198\n白云留影\t132199\n济世\t132200\n9768678\t132201\n坎坎坷坷看坎坎坷坷坎坎坷坷坎坎坷坷坎坎坷坷\t132202\n1度\t132203\n1座\t132204\n如初见\t132205\n您老费\t132206\n自卖自夸\t132207\n鸡飞狗跳\t132208\n好吧好吧我相信你\t132209\n吴国平\t132210\nOURDEN\t132211\n齐燕\t132212\n薛志文\t132213\nnote\t132214\n双廊\t132215\n看得起\t132216\n几百克\t132217\n停机保\t132218\n监禁\t132219\nffovwulvg\t132220\n几百元\t132221\n郑兰荪\t132222\n冷光头\t132223\nbShare\t132224\n吴华娟\t132225\n子文\t132226\n公司机\t132227\n一百次\t132228\n像你\t132229\nyggh\t132230\n振幅\t132231\n炮孔\t132232\nyggg\t132233\n666家\t132234\n房九\t132235\n五一女\t132236\n晨子\t132237\nyggy\t132238\n山野\t132239\n傅民币\t132240\n山里\t132241\n安康客运酒店\t132242\n指挥杆\t132243\n我喜欢的样子\t132244\n充足\t132245\n向日葵花\t132246\n首饰性\t132247\nduiyde\t132248\n毫无疑问\t132249\n环球大\t132250\n宠物达人\t132251\n麻疯村\t132252\n叶别理\t132253\n平舌音\t132254\n高雪琼\t132255\nfiwkn\t132256\n我的好妈妈呀\t132257\n疲力竭\t132258\n头脑
袋\t132259\n4006861166666655\t132260\n一二三四五六七八九十十一十二十三十四十五十六十七十八\t132261\n情哥哥\t132262\nbbbbbbbbbbbbbbbbbb\t132263\n未接\t132264\n桥墩\t132265\n越来你\t132266\n扣扣扣扣扣扣扣扣币\t132267\nLuke\t132268\n真度\t132269\n上次\t132270\n纯天\t132271\n邹家华\t132272\n周峰颉\t132273\n度图\t132274\n我晕的你是猪吗你是猪么你是猪吗你是猪吗你是猪吗你是猪嘛你说嘛你是猪吗你是猪\t132275\nADT\t132276\nBPO\t132277\n女彘\t132278\n尝試\t132279\n昨样\t132280\n4点04分\t132281\n酷跑帮\t132282\n上款\t132283\n折磨身\t132284\nghdfgcdvvfd\t132285\n富县\t132286\n彩擂\t132287\n逃命\t132288\n适用\t132289\n温江\t132290\n色猪\t132291\n钟基\t132292\n独处\t132293\n么偶\t132294\n女形\t132295\n走都走\t132296\n灵柩\t132297\n腹泻\t132298\nsma81xw8w790\t132299\n草批\t132300\n张籍节\t132301\n五老\t132302\n猪勒\t132303\n说好日\t132304\n李龙\t132305\n石林\t132306\n哥哥俩\t132307\n定价权\t132308\nB9能源公司\t132309\n惠农\t132310\n齐晨齐\t132311\n骤然\t132312\njjwkci\t132313\n熊猫灯\t132314\n错了擦\t132315\n物业管理条例\t132316\n汶上君\t132317\n74568855524412369541\t132318\n礼打我着\t132319\n陌陌陌\t132320\n格格了速度度\t132321\n酒神\t132322\n核黄素\t132323\n赤坎\t132324\ntkg\t132325\ntkd\t132326\ntron\t132327\n一脚\t132328\n千秋雪\t132329\n图迷\t132330\n三字\t132331\n聊了再见再见\t132332\n大部分\t132333\n鼓膜\t132334\n女尊\t132335\n李玉洁\t132336\n张海\t132337\n朵花\t132338\n木有用\t132339\n江西高院\t132340\n不里\t132341\n不量\t132342\n1234565321563\t132343\nyl块\t132344\n墨莲\t132345\n牛派格\t132346\n张浩\t132347\n石娃娃\t132348\nchù\t132349\n王蓉\t132350\n前程\t132351\n五月倡汐\t132352\n235868\t132353\n周泾路109号\t132354\n王源儿\t132355\n制裁\t132356\n适当\t132357\n235865\t132358\n玻钻之争\t132359\n二点五吧\t132360\n遮羞\t132361\n一把头\t132362\n里尔\t132363\n第一排\t132364\n错了错了错\t132365\n贫富混居\t132366\n千里共婵娟\t132367\n人到\t132368\n吃困\t132369\n晕了拜拜\t132370\n大田片\t132371\n缠烂打\t132372\n预科\t132373\n吃图\t132374\n辩证\t132375\n辩识\t132376\n68支\t132377\n誓言\t132378\n思想曲\t132379\nkosiyenaulmnkne\t132380\n春行忠\t132381\n姐弟恋\t132382\n预付\t132383\n820分\t132384\n梅世民\t132385\n膻\t132386\n膽\t132387\n羞社\t132388\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t132389\n男欢女爱动态图\t132390\n十一分钟\t132391\n1921\t132392\nhffb\t132393\nhffd\t132394\nhfff\t132395\n同情\t132396\nhffh\t132397\n1929\t132398\niqqqp1j\t132399\n膥\t1324
00\n下戈壁\t132401\n不见了再也不见\t132402\n膝\t132403\n膜\t132404\n蒙面那\t132405\n甘孜\t132406\n小度果冻\t132407\nffghfsfj\t132408\n向前冲\t132409\n灵子\t132410\n不要说真话\t132411\n膏\t132412\n色格\t132413\n见别再见\t132414\nmay，gab\t132415\n罗特Mixx7\t132416\n繁春\t132417\nfbfx\t132418\n叶落纷飞飞满天\t132419\n四点半后\t132420\n奔观\t132421\n一心雨\t132422\n双独\t132423\n吴春东\t132424\n绿绿雅\t132425\n男排\t132426\n面尾\t132427\n特色\t132428\n时雨\t132429\njbgff\t132430\n瓦伦\t132431\n多米伽\t132432\n本片\t132433\n一个一五\t132434\n游清浩\t132435\n别是\t132436\n羞大屄\t132437\n上厕所说错\t132438\njjsjhss\t132439\n吃呢啦\t132440\n留步\t132441\n全国美术馆公共图书馆文化馆\t132442\n12块\t132443\n甄酸牛奶\t132444\n一一名\t132445\nfhgv\t132446\n液压机\t132447\n学习棒\t132448\n大嫁\t132449\n至实\t132450\n靠你也不知道\t132451\nfhgl\t132452\n羊羊羊羊羊羊绵绵\t132453\n慢一点行\t132454\nhi匿名\t132455\n鸦雀岭镇\t132456\ngan1000\t132457\n食盐\t132458\n薛龙巨\t132459\nfillight\t132460\n7.6%\t132461\n1947年1月12日\t132462\n210万吨\t132463\n卵梅\t132464\n切顿熊\t132465\n露掉\t132466\npeaking\t132467\n做白日梦\t132468\nv20\t132469\n大嫚\t132470\nshinee\t132471\n演唱者\t132472\n谢氏松\t132473\n尚茹\t132474\n秘谢谢\t132475\n谷底\t132476\n和当\t132477\n胰腺神经内分泌细胞瘤\t132478\n癀片\t132479\n充没\t132480\n首宗\t132481\n张英杰\t132482\n错误\t132483\n浩浩\t132484\n裤恩\t132485\n背景乐\t132486\n2GB\t132487\n直播天下\t132488\nndjdbz\t132489\n宝书\t132490\n高建\t132491\nmqkurz\t132492\n非淋\t132493\n年久失修\t132494\n呀好帅\t132495\n别别太得寸进尺\t132496\n快斗\t132497\n22：00\t132498\n温柔猫\t132499\n全志焕\t132500\n吴鸿志\t132501\n一根筋\t132502\n⊕\t132503\n秘茶行\t132504\n好莱坞式\t132505\n高端大气上档次\t132506\n杜卫星\t132507\n淘气包马小跳\t132508\n胰岛\t132509\n⊙\t132510\n在家度秘\t132511\ncle\t132512\n72只\t132513\n像你一样\t132514\nMikeSmithis\t132515\n二十快\t132516\n九万九千九百九十九万九千九百九十九\t132517\n古德白\t132518\n杖刑\t132519\n暖心点\t132520\noraaalooout\t132521\n东咪多咪咪帮帮\t132522\n383424256895423235313468087546657689434686554642123557909744564555\t132523\n二架\t132524\n聂欣悦\t132525\n你是谁你是谁你是不是\t132526\n母后\t132527\n徒有虚名\t132528\n筷子仙\t132529\n不懂话我说文化高有没说吧文化高\t132530\n在线\t132531\n了了了了了几个好了啦啦啦啦啦啦啦啦啦啦\t132532\n左耳\t132533\n一票点点\t132534\n小船\t132535\n饿额\t132536\n瞬息\t132537\n手伴\t132538\n几生\
t132539\n新仓镇\t132540\n好心人\t132541\n好朋友\t132542\n手杆\t132543\n小般\t132544\nonononono啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t132545\n暗藏\t132546\n点房\t132547\n臼白\t132548\n德阳市\t132549\n眤称\t132550\n小舞\t132551\n小舟\t132552\n圆墩\t132553\n小舅\t132554\n我不要你了不要你\t132555\nrayous\t132556\n主裁\t132557\n相思只说你爱秋天爱\t132558\n刘艺文\t132559\n角马\t132560\n度秘啫\t132561\n前前\t132562\n伯伯\t132563\n52a2834349b033b11decdae12ce36d3d539bd6bjpg\t132564\n五魔剑霸\t132565\n背唱\t132566\n别样红\t132567\n黄梅\t132568\n20：53\t132569\n赵机械\t132570\n验血\t132571\n九江学院政法学院\t132572\n丑种\t132573\n你好呀嘟咪\t132574\n饺子\t132575\n同花\t132576\n直通车\t132577\n我爱你爱着你就像老鼠爱的人\t132578\n一个杀呗咪哥哪花不搭美人鱼小屄\t132579\nlopiuy\t132580\n小宝贝儿\t132581\ntomluot\t132582\n74568590\t132583\n心冷\t132584\n剑指\t132585\n张小毓\t132586\n唔好\t132587\n美东动漫\t132588\n女足\t132589\n洋洋金山区考\t132590\n驾照\t132591\n机器感\t132592\n屏保型\t132593\n朝上大哥哥\t132594\n林州市\t132595\n郭浩然\t132596\n经济片\t132597\n宰宰\t132598\n一个半个\t132599\n不成文\t132600\n十二月初九\t132601\n恶i人rtdfurops飞车站路口处处\t132602\n劳务工\t132603\n#CBD网#\t132604\ndeth\t132605\nＣＣＴＶ\t132606\n嗯六六页岛\t132607\n起价\t132608\n欧式新天地\t132609\n乖乖虎\t132610\n说象\t132611\n我看我\t132612\nzDxzdcXsxFhsgGjdcfFFGGCZZ\t132613\ndetg\t132614\n夏悦\t132615\n孙明\t132616\nBehs\t132617\n難女女女金\t132618\n写给我我是女滴\t132619\ntower\t132620\ndetr\t132621\n升职快\t132622\nfds\t132623\n旅橙萌\t132624\n善感\t132625\n老头\t132626\n靠左\t132627\npotlucks\t132628\n11期\t132629\n嘟嘟嘟\t132630\n阻塞\t132631\n嘟嘟嘁\t132632\nMaster\t132633\n老天\t132634\n老太\t132635\n老夫\t132636\n11月\t132637\n老大\t132638\n皮卡秀\t132639\n二零一八\t132640\n打干嘛\t132641\n焚坑\t132642\n嘟嘟嘴\t132643\n老多\t132644\n小气块\t132645\n老外\t132646\n狗头机器人\t132647\n2011年3月\t132648\n老夏\t132649\nhadfishhe\t132650\nwhatyoum\t132651\n我的爷爷\t132652\n烦人冰\t132653\n杨德雨\t132654\n11本\t132655\n发型气\t132656\n兼具\t132657\n王锅\t132658\nicDe\t132659\n记仇\t132660\n花开欢乐谷\t132661\n死不休\t132662\n9999999999999999999\t132663\n金融服务业\t132664\n产业结构\t132665\n庠\t132666\n赵艳红\t132667\n幻阁\t132668\n无关紧要\t132669\nznkd\t132670\n三分之四\t132671\n8516米\t132672\n你真的不要骗我了我求你\t132673\n66667879\t132674\n我靠我靠我靠我靠我靠我靠我靠\t132675\n宋帅\t1326
76\n改嫁\t132677\n英桃\t132678\npp岛\t132679\nculatr\t132680\n喝讨厌\t132681\n一连三场\t132682\n56608438474\t132683\n四亿\t132684\nffghdcgj\t132685\n四人\t132686\nruei\t132687\n真心话游戏\t132688\ndrtygoojnd\t132689\n四亲\t132690\n镇原二中\t132691\n一百位\t132692\nruer\t132693\n零六零七\t132694\nSIM卡槽\t132695\n四交\t132696\nV三国杀\t132697\n莎翁\t132698\n新图\t132699\nabcdexo\t132700\n新国\t132701\n说会儿\t132702\n坦言\t132703\n280只\t132704\n三百家\t132705\n行政拘\t132706\n四五\t132707\n丑闻\t132708\nehf\t132709\n容度\t132710\nehd\t132711\nehe\t132712\n3多少\t132713\n浓妆\t132714\n轰轰烈烈\t132715\n新团\t132716\n给你不了了我要听你说了睡觉了\t132717\n开端\t132718\n就是我的贴身小秘书\t132719\nv飞蛾\t132720\n就问你了靠你了兄弟\t132721\n皮太扁\t132722\n3015年\t132723\n警察们\t132724\ncardoor\t132725\n这种片\t132726\n奥知道知道\t132727\nyufff\t132728\n讨厌\t132729\n零用钱\t132730\n六十块\t132731\nHongKi\t132732\n2分钟之内\t132733\n欧丽斯\t132734\n胡新颖\t132735\n我好爱好爱好爱好爱好爱好爱我去看\t132736\n座右铭\t132737\nKdugvls\t132738\n包码\t132739\n刘书同\t132740\n5457575758\t132741\n邹鸿宇\t132742\n一个工作日\t132743\n四棱锥\t132744\n苗静临\t132745\ntboeb\t132746\n六尾\t132747\n历历在\t132748\n海贼塞\t132749\n指示灯\t132750\n青年们\t132751\n受灾\t132752\n永胜\t132753\n马大牙\t132754\nAW523\t132755\n小诊所\t132756\n怎么事\t132757\n20.3%\t132758\n碧5\t132759\nssry\t132760\n怎么了\t132761\njjagjajagd\t132762\ndhcbcj\t132763\n一株柳一株\t132764\n苏柏丽\t132765\n争得\t132766\nEternal\t132767\n想你的再见再见\t132768\n新闻联播\t132769\n看拍\t132770\ncarzkalis\t132771\n闹木忙\t132772\n金刚葫芦\t132773\n届时\t132774\nChrmd\t132775\n11111112222222333344444555566667777788889999900000111n1\t132776\n没错知道\t132777\n一篇\t132778\n裸葫芦\t132779\n老奶奶级\t132780\n1987年\t132781\n权势\t132782\n闫德平\t132783\n归咎\t132784\n或多\t132785\n现有\t132786\n说得好\t132787\n好大好大好大好大好大好大\t132788\n浦发上海银行\t132789\n王樊\t132790\n花空\t132791\n微信撕碎分批纺\t132792\n权力\t132793\n好帮手网\t132794\n妞妞妞\t132795\n花穴\t132796\n分火器盖\t132797\n350米\t132798\n理由\t132799\nexpenditure\t132800\n阿桃乐\t132801\n25000千里\t132802\n林韵\t132803\n郑泽\t132804\n兔毫盏\t132805\n新桥\t132806\n偶然\t132807\n好凉快呀\t132808\n180202\t132809\n开平市\t132810\n苏有兵\t132811\n人卫\t132812\n反响\t132813\n人卡\t132814\n触类旁通\t132815\
n工人狗\t132816\n招商蛇口\t132817\n反哺\t132818\n斥资\t132819\n数五秒\t132820\n成你好朋友\t132821\n穷不啦叽\t132822\n人单\t132823\nhkfggchfvh\t132824\nintentIntentSK1171477665BE5C7749C88AF142B52A2EE476E6F0C0031010212E67062DE7DDC883609896end\t132825\n鱼汤\t132826\nQ驸中e一\t132827\n夜魔娃娃龙\t132828\n鱼池\t132829\nIE8额\t132830\n天河城一楼\t132831\n对呀你萌\t132832\n天亮了才睡\t132833\n大骨量\t132834\n喝喝水\t132835\n三婚\t132836\n焦糖炖蛋\t132837\n拉倒\t132838\n反群\t132839\n水鸟\t132840\n教度图\t132841\n几万元\t132842\n树立\t132843\n九六\t132844\nmtvk\t132845\n水鸭\t132846\n官家\t132847\nchaple\t132848\n瓣\t132849\n商梦茹\t132850\nduew\t132851\n系关\t132852\n十一月二十三\t132853\n乌鲁木齐市香江丽华酒店\t132854\n嘿亲\t132855\n沙累\t132856\nsin90\t132857\n几音\t132858\n7月28\t132859\n大徒\t132860\nWillow\t132861\n酱紫绝交\t132862\n四么样子\t132863\n搜出\t132864\n吴邪\t132865\n大径\t132866\n耀东\t132867\n44484807\t132868\n快点儿峰会\t132869\n吴那\t132870\n化点\t132871\ns眼神\t132872\n大德\t132873\nsuf\t132874\nsue\t132875\nsud\t132876\nsub\t132877\n手抖\t132878\nsuo\t132879\nsun\t132880\nuvvdugdvhg\t132881\nsuj\t132882\nsui\t132883\n哎一古\t132884\nsuv\t132885\n刻薄\t132886\n姜新月\t132887\n手抄\t132888\nsup\t132889\n月桂皂软饭\t132890\n手把\t132891\n38412313\t132892\n二五几\t132893\nsuy\t132894\nsux\t132895\n吸氧\t132896\n大芒街道\t132897\njsjdkd\t132898\n古代史\t132899\n地球生存\t132900\n冲锋\t132901\n操草\t132902\n千千缕\t132903\n吸水\t132904\n不相信你真讨厌爱你\t132905\n忙好丑\t132906\n镐击队\t132907\n戒不出来\t132908\n退睡\t132909\n顺溜\t132910\n250哦\t132911\n红烧度秘\t132912\n你好丑你好\t132913\n夸耀\t132914\n波特\t132915\n积木流行音乐榜\t132916\n吸气\t132917\n晴朗\t132918\n零食人\t132919\n高h\t132920\n小慢\t132921\n依托\t132922\n小慧\t132923\n兰倾\t132924\n没有我理\t132925\n矿\t132926\n废义符义\t132927\n452575\t132928\n矻\t132929\n十三支\t132930\n石\t132931\n短\t132932\n矬\t132933\n矮\t132934\n矩\t132935\n矫\t132936\n5138\t132937\n知\t132938\n度秘你是我最棒的小秘书\t132939\nchvbh\t132940\n矣\t132941\n矢\t132942\n矛\t132943\n矗\t132944\n布米hi我是你的妈妈\t132945\n围住\t132946\n菜园\t132947\n高嘉诚\t132948\n长城会\t132949\n印象\t132950\n用好\t132951\n贴合\t132952\nToddandsay\t132953\n地覆天\t132954\n倒挺\t132955\n六十家\t132956\n表老\t132957\n科弯\t132958\n那你在哪我来找你\t132959\
n等一喜欢\t132960\n雅安市\t132961\n小红你有在干嘛吗小\t132962\n太麻\t132963\n敷敷鈤\t132964\nbbbbbhh\t132965\n不自觉\t132966\nk1CK\t132967\n卟离\t132968\n59743541285555555\t132969\n罪性\t132970\n独夫潜在自用心老师不过引路人\t132971\n倾爱\t132972\n用套\t132973\n一月二十一日\t132974\n贴吧\t132975\n決定\t132976\n收到\t132977\n孙志成\t132978\n倒挂\t132979\n咳嗽份\t132980\n委身\t132981\n猪屎\t132982\n盛刚\t132983\n沈阳军区哈尔滨房管\t132984\n沙弥\t132985\ncchin\t132986\n西贝\t132987\n鳗鱼\t132988\n猪屁\t132989\n质数\t132990\n纽约客：当军队民主化\t132991\n小丸子糖\t132992\n12345678910121314151617181920\t132993\n捺\t132994\nab九十七十三b十九四十三五十一c十一三十八十一\t132995\n艺声\t132996\n王虹\t132997\n下板\t132998\n17099308813\t132999\n王虎\t133000\n发掘\t133001\n恩你在干撒子勒\t133002\n凄惨\t133003\n田美波\t133004\n吟桐\t133005\n哢嚓嚓嚓嚓嚓嚓嚓\t133006\n做我自己\t133007\n回火星了拜拜\t133008\n唐风\t133009\nstityou\t133010\n40页\t133011\n会前\t133012\n月亮山\t133013\n缉毒\t133014\n联通联通\t133015\n米利\t133016\n公主型\t133017\n叛变者\t133018\n七四零\t133019\n高级景观\t133020\nWiki\t133021\n非民选\t133022\n10万得\t133023\n偷师\t133024\n捯\t133025\nSOYR\t133026\n刘桂娟\t133027\ncombinbobocomcomorest\t133028\n镶钻\t133029\n嗯一千\t133030\n素锦图\t133031\nIamfromBritish\t133032\n家庄东里\t133033\n流转\t133034\n纪念\t133035\n都江堰\t133036\n黑术\t133037\n三个多\t133038\n下雨没\t133039\n裸袒\t133040\n一条天上一个\t133041\n黑木\t133042\n大白兔奶糖\t133043\n主播\t133044\n邓万强\t133045\n≡儿\t133046\n拐弯儿\t133047\n度秘萌萌的度秘美美哒度秘\t133048\n度秘小度秘我爱你么么哒\t133049\nahahahahaha\t133050\n暴击达人\t133051\n右边\t133052\n渣打银行\t133053\n鲜汤\t133054\n拖堂业\t133055\n四壁\t133056\nDaisy=大翠\t133057\n山游戏物资局\t133058\n黑服\t133059\n机器牛\t133060\n天天在哪里呀春天在哪里春天在哪里呀情人在哪里春天在哪里呀春天在哪里\t133061\n皿\t133062\n米山\t133063\n玄丹妮\t133064\n欢腾\t133065\n成名\t133066\n策墨子\t133067\n四声\t133068\n六十米高\t133069\n沈传佳\t133070\n棚子\t133071\n西子\t133072\n翰哥\t133073\n我的夫妻\t133074\n你是我最爱点\t133075\n晏树桢\t133076\n誒哟\t133077\nsomebody\t133078\n张荣侠\t133079\n御书房\t133080\n动魄\t133081\n八姐\t133082\n背背背你不离appstaaak\t133083\n茅房\t133084\n猴头\t133085\n咆哮体\t133086\n丸九\t133087\n齐聚会\t133088\n幺二六幺\t133089\n皴\t133090\n\t133091\njjkjj\t133092\n屁虫\t133093\n女乐\t133094\n喉干嗓哑\t133095\nstoumityou\t133096\n昆嵛山\t133097\n
道学\t133098\n宋山木\t133099\n自私鬼\t133100\n雅蠛蝶雅蠛蝶雅蠛蝶雅蠛蝶\t133101\n手急\t133102\n明确答复你是男是女\t133103\n开大了吧臭不要脸的我是那么\t133104\n黄秋生\t133105\n谜\t133106\n谓\t133107\n谒\t133108\n穿着\t133109\n高中苦逼党\t133110\n死难听\t133111\n工藤柯南\t133112\n谋\t133113\n谊\t133114\n谈\t133115\n103间\t133116\n谎\t133117\n下摘\t133118\n调\t133119\n谂\t133120\n谁\t133121\n为胜\t133122\n谅\t133123\n醉汉\t133124\n倭寇\t133125\n普济寺\t133126\n裂缝\t133127\n谱\t133128\n金牛咯嗯\t133129\n谷\t133130\n崔港甥\t133131\n言情教室\t133132\n马晕\t133133\n血本\t133134\n谩\t133135\n谨\t133136\n谯\t133137\n谮\t133138\n马晓\t133139\n盛会\t133140\n谣\t133141\n谢\t133142\n谠\t133143\n谦\t133144\n5966\t133145\n谤\t133146\n航天母舰\t133147\n兵马俑\t133148\njrjd\t133149\n宜黄\t133150\n哒哒萌\t133151\n鬼使神差\t133152\n零五年\t133153\nyusoo\t133154\n今天下午4点\t133155\n怒气冲天\t133156\n赴\t133157\n带队\t133158\n侯一豪\t133159\n拍呗\t133160\np蔔爿\t133161\n恒哥哥\t133162\n孔宝贝\t133163\n百四\t133164\n柠檬酸奶味薯片\t133165\n李贺\t133166\n拍呀\t133167\n好好好好好好好好好好好\t133168\n惑惑\t133169\njhcf\t133170\n纤诗婷\t133171\n培育\t133172\n发际线球\t133173\njhcn\t133174\n你好度\t133175\n阴缺圆\t133176\n黄家驹\t133177\n克要\t133178\ndfp\t133179\n宁次君\t133180\n虹吧\t133181\n不不不你是智能机器人\t133182\n一月九十\t133183\n皇城相府\t133184\nvxi\t133185\n啲啲\t133186\n站点\t133187\nvxd\t133188\n太阳伞兵马俑\t133189\nvxf\t133190\nHvv警察局\t133191\nvxb\t133192\n棵五一\t133193\nGggc\t133194\nvxx\t133195\nstorc\t133196\n肌理\t133197\n几个度\t133198\nvxv\t133199\nstore\t133200\n怎办\t133201\nTFBYS\t133202\nLADIES\t133203\n群租\t133204\n豫分\t133205\n有害\t133206\n42分\t133207\n度秘气死人不偿命\t133208\ndfe\t133209\n羽化\t133210\n赇\t133211\n巴德\t133212\nTFBYE\t133213\njfut\t133214\n鬼息\t133215\nDhdhdhdhdurhhshsuebduudbehxuhshddhehcudbudiwjskdheif\t133216\n动人心弦\t133217\nking\t133218\nkind\t133219\n莴苣素\t133220\n情关\t133221\n李准基\t133222\n2299年\t133223\n息县\t133224\n13674790\t133225\n江户川柯南\t133226\n情兽\t133227\n十天之后\t133228\nwomanine\t133229\n乖╯\t133230\n有完\t133231\ndfc\t133232\nnn。1\t133233\n八百岁\t133234\n帖子\t133235\n段别介意\t133236\n赏\t133237\n思想诗\t133238\n密能不能\t133239\n100万#\t133240\n清贫乐村居里\t133241\n蓓蓓\t133242\n荥阳\t133243\n蓓蓉\t133244\n那行\t133245\n赎金\t
133246\n联系人项\t133247\n佳郎\t133248\n履职\t133249\n周清鑫\t133250\n支今\t133251\n希冀\t133252\n解测\t133253\n唐成刚\t133254\n黄雨\t133255\n自然村\t133256\n吴语\t133257\n支付\t133258\n婴儿\t133259\n霹雳士\t133260\n民主社会所\t133261\n吴晨欣\t133262\n赑\t133263\n唱算\t133264\njdmnmm\t133265\n张艺谋\t133266\n看谈\t133267\n精\t133268\n从头\t133269\n祝愿\t133270\n港腔\t133271\n比也不\t133272\n吕东尼\t133273\n宽心\t133274\ngily\t133275\n亲奶的宝贝脑公我奶您\t133276\nkeklw\t133277\n班费\t133278\n满盘\t133279\n潜意识\t133280\n小红包\t133281\n亚来\t133282\n不接招\t133283\n说度\t133284\n1463\t133285\n鹏媛\t133286\n拖布\t133287\nchhgggg\t133288\n二二三八\t133289\n亚杰\t133290\n承拉起\t133291\n李样\t133292\n厄塔流星雨\t133293\n赛泰\t133294\n断魂\t133295\n布衣别说话小小小小小小小\t133296\n野地\t133297\n聚聚露JJ\t133298\n小学妹\t133299\n秦时明\t133300\ndisitd\t133301\n始至终\t133302\n萌萌哒萌\t133303\n不点不\t133304\n120119\t133305\n国苗\t133306\n太了冷\t133307\nctay\t133308\n酷我碧玺\t133309\ngfufxdhhxydfgyfncyhdbnctdxrgsxgxxtxj6hndyduzytdxgcdhfjdydyfjxcjfukydhfitxhnfycchdhhfffgghcdnufh\t133310\n十三只\t133311\n齐秦\t133312\n原盒\t133313\n国恒铁路\t133314\n尊重允\t133315\n10001万\t133316\n十三号\t133317\n订房\t133318\n季小翠\t133319\n新华西\t133320\n陈巧钰\t133321\n一声令下\t133322\njimjpt\t133323\nbiszibixv\t133324\n暼\t133325\n同人本\t133326\n有乱花\t133327\n俞凯\t133328\n暴\t133329\n蹈服\t133330\n供参考\t133331\n梅花儿\t133332\n太湖\t133333\n欲火\t133334\nrutrr\t133335\n小花花\t133336\n曲盈袖\t133337\n暧\t133338\n专员\t133339\n调音师\t133340\n好衬罗特别臭的死孩子\t133341\n只华\t133342\n暑\t133343\n暐\t133344\n暗\t133345\n暖\t133346\n清新\t133347\n盘活批\t133348\n治任\t133349\n暈\t133350\n7月3日下午\t133351\n饮料\t133352\n巴把\t133353\n再加我不理你了我问你我是不是你\t133354\n毕业后\t133355\n爱吾希\t133356\n暇\t133357\n暄\t133358\n八百回\t133359\n社工\t133360\n腹回环\t133361\n哈飞\t133362\n小母\t133363\n一个十三岁\t133364\n老咖\t133365\ndueie\t133366\n玩劫\t133367\n老和\t133368\n老咆\t133369\ngothey\t133370\n中国工程科技\t133371\ndjcncbsm\t133372\n聘礼\t133373\n爬窗\t133374\n张怡泽\t133375\n1785857\t133376\n泡仔\t133377\n眯咪\t133378\n楚云心\t133379\n吊丝\t133380\n烂逼\t133381\n诉苦\t133382\n福将\t133383\n深紫色\t133384\n老咪\t133385\n我是人你是谁你不就是\t133386\n邵洪潮\t133387\n老咧\t133388\n福尔\t133389\n彭渤凯\t133390\n衽又
\t133391\nsolo\t133392\n崔家琦\t133393\nGustavsson\t133394\nfgx\t133395\nfgy\t133396\n王指挥村\t133397\nNobodynobodynobodynobody\t133398\nfgu\t133399\nfgv\t133400\n灿盛\t133401\nfgs\t133402\n有余悸\t133403\n胃疼假\t133404\nu港\t133405\n错车\t133406\nfgh\t133407\n28集\t133408\nfgj\t133409\nfgd\t133410\nfgf\t133411\n万峰\t133412\n中国人大\t133413\nfgb\t133414\nfgc\t133415\n本国\t133416\n无忧无改\t133417\n诛仙转元\t133418\n陈琳琳\t133419\n热播\t133420\n快点口声声快快\t133421\n梁小姐\t133422\n七子之歌\t133423\n老医\t133424\n青岛中能队\t133425\n45888555585252\t133426\n超首歌\t133427\n辣妈\t133428\n雅致\t133429\n梨+苹果+橘子\t133430\n两分钱\t133431\nhttpusan\t133432\n宝宝们\t133433\n吗3\t133434\n辣妞\t133435\n特妈\t133436\n地图像\t133437\n白了个白思密达\t133438\n许昌县林业局\t133439\n一瓶儿\t133440\n又说和\t133441\ndx泪奔\t133442\n李宗文\t133443\n赛场\t133444\n两分钟\t133445\n大爆炸\t133446\n礼堂\t133447\nsurhi\t133448\n又瓜\t133449\n辣妻\t133450\n可获\t133451\n刘栩栩\t133452\n杜一一\t133453\n二岁片\t133454\n鲁那\t133455\n還得\t133456\n不些e呀搅屎棍翡冷翠丢文法行V型愧对我\t133457\n早点度\t133458\navalearhankamily\t133459\n娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘娘\t133460\n艳福\t133461\n迷途\t133462\n距今\t133463\n很温暧\t133464\n卖工\t133465\n门内人\t133466\n二十千个\t133467\n太白斗酒诗\t133468\n龙山\t133469\n幺幺\t133470\n王来人\t133471\nTATATATATATAT\t133472\n珍惜自己\t133473\n一二三四五\t133474\n不知去向\t133475\nhomework\t133476\n蒙顶山\t133477\n李敏镐\t133478\n干事\t133479\n二十六兆\t133480\nhi一休我是小怪兽灰\t133481\n记念\t133482\n奥丹·科汉\t133483\n第6次\t133484\n乐呵呵\t133485\n打毁\t133486\n回清\t133487\norrennile\t133488\n打毛\t133489\n多依树寨子\t133490\n菜市场\t133491\n恐乐\t133492\n东方之星\t133493\n打比\t133494\n南二环\t133495\n8484848\t133496\n好不好好不好\t133497\nim咕\t133498\n跑调\t133499\n为师不会\t133500\n奥巴奥\t133501\n众泰众泰众国英\t133502\n记忆\t133503\n好我恨你\t133504\n画鹰\t133505\n便样\t133506\n原來家裡\t133507\nhVv\t133508\nsell\t133509\nzkhs\t133510\nGOOD\t133511\n香瓜\t133512\n暴风雪\t133513\n八三九零五零六五零二二\t133514\n阿尔山市\t133515\n我喜欢女的你可\t133516\n伏沙外砂\t133517\n公报\t133518\n2八十六四百三十86516\t133519\n2114290941\t133520\n黄嘉杰\t133521\n一二三四五六七八九十组\t133522\n八蛋\t133523\n面议\t133524\n慈爱\t133525\n吃饱了\t133526\n政权\t133527\n需求者\t133528\n鼓动\t133529\n寒暑\t133530\n寒
暖\t133531\n小黑屋\t133532\n小巴人\t133533\nk歌秀\t133534\n翻学\t133535\n鼓励\t133536\n寒暄\t133537\n三无风虎云\t133538\nJJ吧赔礼cb75uud8h\t133539\n货主\t133540\n燕大\t133541\n撸狗\t133542\n好多圈\t133543\n栖复\t133544\npolfw\t133545\n无势\t133546\nWonderful\t133547\n没法忘\t133548\n徐一祎\t133549\n无助\t133550\n说不考\t133551\n白酒\t133552\n睡早\t133553\n新旧\t133554\n268\t133555\n市府\t133556\n陈阳\t133557\n仇老师\t133558\n四点级\t133559\n协作家们\t133560\n无力\t133561\n罗伟\t133562\n红安\t133563\n无功\t133564\n满山开遍呦吼吼吼吼\t133565\n众数\t133566\n墙头草\t133567\n55辆\t133568\n咔咔咔咔\t133569\n红宝\t133570\n张茹媛\t133571\nhsib\t133572\n阿中\t133573\n俊源\t133574\n顾凉\t133575\n234586\t133576\n窟窿\t133577\nrhis\t133578\nThor\t133579\n弃妃\t133580\n朱秋宇\t133581\n金音\t133582\n弃妇\t133583\n无数遍\t133584\n张韩樱子\t133585\n96分\t133586\n哄笑\t133587\n冬官正\t133588\n维族\t133589\n直逼\t133590\n零售价\t133591\n和来\t133592\n鬼画\t133593\nvibraphone\t133594\n不惜一切\t133595\n我讨厌\t133596\n500几\t133597\n也是\t133598\n搓背\t133599\n好友谊宾馆\t133600\n鬼用\t133601\n绿间真太郎\t133602\n吴三\t133603\n爸兔\t133604\n画网\t133605\n吴一\t133606\n乌龙茶\t133607\n贫民窟\t133608\n七八年\t133609\n挖地机\t133610\n周周\t133611\n语句不通顺\t133612\n中型\t133613\n多年前\t133614\n桃花岛\t133615\n三洞\t133616\n请帅\t133617\n宠坏\t133618\n卫龙\t133619\n吴中\t133620\nfileifer\t133621\n政协会\t133622\nzuoye\t133623\n写写\t133624\n吴丹\t133625\n脑血栓\t133626\nvrgrer\t133627\n能量场\t133628\n最后次\t133629\n5点\t133630\n王誉翔\t133631\n一个晚上\t133632\n9月10号\t133633\n劫后\t133634\n报时\t133635\n爱玩爱秀\t133636\nhhhhghgv\t133637\n二零幺二六五\t133638\n25厘米\t133639\n有你在来电动画\t133640\n字节\t133641\n和平鸽\t133642\n维生素\t133643\n自汗\t133644\n特写\t133645\ntwhstb\t133646\n风雨\t133647\n夜莱科蛇姬\t133648\n歹坷\t133649\n埃维\t133650\n雇人\t133651\nill11\t133652\n公知\t133653\n纳滋。多拉格伊尔\t133654\n组织结构\t133655\n11233211234567\t133656\n幺三九零二幺零\t133657\n德语晨阳\t133658\n段世国\t133659\n唉我真的不想说\t133660\n飞车\t133661\nconata\t133662\n陆星\t133663\n篮球\t133664\n依着\t133665\n我的话把\t133666\n好忙\t133667\n上联通\t133668\n好心\t133669\n鹤庆\t133670\n因宾智\t133671\n植物大战僵尸无尽\t133672\n陈伟霆\t133673\n传来\t133674\n闽粤\t133675\n四十四十五十六\t133676\n彪子\t133677\n我是魔仙女王\t133678\n苏完瓜尔佳·敏敏\t133679\nix
x\t133680\n光荣与梦想\t133681\nzzzz\t133682\n好快\t133683\n金龟儿\t133684\nixu\t133685\n明子\t133686\n蜻蜓点水\t133687\n进账\t133688\n大宫女\t133689\n明字\t133690\n优先权\t133691\n临时\t133692\n刹间\t133693\ngcch\t133694\n争胜\t133695\n起盈\t133696\n延吉道\t133697\n爱心群\t133698\n上海市公积金中心\t133699\n大家好我是周诗怡\t133700\n安亚峰\t133701\n帅呆\t133702\n还珠之优酷花\t133703\nhhhgbjkr\t133704\n连锁\t133705\n9958995\t133706\n4t犬\t133707\ntufdfhj\t133708\n好度秘萌萌哒\t133709\n进购\t133710\n岷江\t133711\n崖柏芝\t133712\n无极限\t133713\nix2\t133714\n一铰\t133715\n考累\t133716\n匹夫\t133717\n胶线\t133718\n范3q\t133719\n做法\t133720\n碰见\t133721\n新年好小度秘\t133722\n元月\t133723\n上休止符\t133724\n玩忘\t133725\n讪讪\t133726\n防人之心不可\t133727\n元朝\t133728\n岛链\t133729\n笨zho\t133730\n肖凯丽\t133731\n旅游鞋\t133732\n255255255\t133733\n一23456789123456789423388998753\t133734\n误传\t133735\n手球\t133736\njaauauss\t133737\n碰触\t133738\n手琏\t133739\n18周岁\t133740\n彭迎莹\t133741\n3.20\t133742\n塔山\t133743\n嫁兵兵\t133744\n啊瓦\t133745\n上海欢乐谷\t133746\n误伤\t133747\n不宾\t133748\nhcrtubft\t133749\nCF穿越火线\t133750\n六十多米\t133751\n亲个亲个亲\t133752\n乐观版\t133753\n若若若\t133754\n净垢\t133755\n亮丙瑞林\t133756\n余龙林\t133757\n挥之不去\t133758\nyoungidon\t133759\n做我最好的朋友\t133760\n鸡尾酒额\t133761\n回本号\t133762\n好喜欢\t133763\n向上吧！少年\t133764\n专席\t133765\n对杜绝\t133766\n小睡\t133767\nFIDUU\t133768\n赤塔\t133769\n鄂西会战\t133770\n梯队\t133771\n服务员们\t133772\nTurner\t133773\n九安\t133774\n2010年初\t133775\n排瑟\t133776\n天空\t133777\n天穹\t133778\n夜里三点\t133779\n谷度秘\t133780\ntfboysilu\t133781\n凸透镜\t133782\n西哒\t133783\n宣化\t133784\n传动轴自行车\t133785\n30多家\t133786\n偷窥狂\t133787\n上海花园\t133788\n愧\t133789\n笑场\t133790\n新版三国演义\t133791\n叉烧便当袋\t133792\n自相残杀\t133793\n好食\t133794\nkvkkvk\t133795\nrepackage\t133796\n55559985555555556\t133797\n阳鼠\t133798\n龟派气功\t133799\n围屋\t133800\n5555555555555555\t133801\n一百零十元\t133802\n種清純\t133803\n一百一十二\t133804\n太耐\t133805\n是不是不想\t133806\n财路\t133807\n秘蜜度蜜度蜜度秘\t133808\n啊喵\t133809\n单价\t133810\n1201天\t133811\n杨思龙\t133812\n747火锅鱼\t133813\n文艺腔\t133814\n入侵\t133815\n1982年10月\t133816\n妈呀畏\t133817\n忘了吗\t133818\n十四级\t133819\nSAMPAR欣蔓纯\t133820\n嘶一个\t133821\n小微型\t1
33822\n1公里\t133823\n司法\t133824\n补色\t133825\n校枣\t133826\n风等哼规范疼岑好岑溪流泪流满面\t133827\n落后\t133828\nstdh\t133829\n吉利\t133830\n庆祝会\t133831\n我想对你说声我对你的爱\t133832\n更具\t133833\n只有你我真的爱\t133834\n李叶轩\t133835\n你在梦游你在梦游你在梦游你在梦游\t133836\nugs\t133837\n六十九岁\t133838\n1.5万名\t133839\n584155\t133840\n专科生\t133841\n免检\t133842\n毛克\t133843\n和你想我了你想我\t133844\n走蛋吧\t133845\n感冒白皮书\t133846\n卿卿\t133847\nTFBOyS\t133848\n不嘛\t133849\navfilm\t133850\n不嘞\t133851\n赵子龙\t133852\n宁南\t133853\n毛八\t133854\n米帶\t133855\n大堂\t133856\n奖学金\t133857\nyhhhdh\t133858\nGXVVC\t133859\n手机性\t133860\nfrouyoumor\t133861\n百奇\t133862\n诗报\t133863\n15个工作日\t133864\nOwl\t133865\n相册\t133866\n周下午\t133867\n苏婷\t133868\n了了了了了我\t133869\n德芭彩虹书吧\t133870\n核实\t133871\n18600995283\t133872\n死不要脸的度比\t133873\n五22号\t133874\n天亮了求\t133875\nhhkbn\t133876\n杨丽萍\t133877\n先睡觉吧\t133878\n此致力\t133879\n战堡\t133880\nnesfh\t133881\n泥志\t133882\nstoos2yo\t133883\n做人做事\t133884\n146米\t133885\n晋国\t133886\n倮祭\t133887\n迷乱\t133888\n零五小时\t133889\n下个周天\t133890\n猫头\t133891\n游戏\t133892\n成馆\t133893\nBYOm\t133894\nnmmmmmmn\t133895\nfrom\t133896\n番茄炒菜花\t133897\n另选\t133898\n良民说谎\t133899\n嘛飞车\t133900\n港组\t133901\n状态\t133902\n呼叫声\t133903\n不特\t133904\nounosol\t133905\n单家晔\t133906\n张巧巧\t133907\nGud\t133908\n雅琪\t133909\nGuh\t133910\n弓夕\t133911\n恬淡\t133912\n人事局\t133913\nGuv\t133914\n黄庄乡\t133915\n明确员\t133916\n愛\t133917\n夏行\t133918\n菲尔\t133919\n夏塘子\t133920\nbinbgsa\t133921\n120条\t133922\n栖息地人\t133923\n陈粒橙\t133924\nGuL\t133925\n637遍布\t133926\n九六年\t133927\n租患得患失\t133928\n言行行\t133929\n不做人\t133930\n3283473164\t133931\n不片\t133932\n腊八不明天\t133933\n郭亿军\t133934\n陈安亮\t133935\n睡不够\t133936\n老腕\t133937\nw2222ww22222wwrwrwrwrwrewrawaaaawaa\t133938\n天佑地球\t133939\n大堤\t133940\n一百八块\t133941\n使用权\t133942\n自然\t133943\n强坚\t133944\n也有关\t133945\nguucj\t133946\n透镜\t133947\n想飞来\t133948\n王德周\t133949\n咋恁\t133950\n2000只\t133951\nthgrgh\t133952\n魁星\t133953\n坚信不疑\t133954\n邪道\t133955\n连锁经\t133956\n98点9898\t133957\n种豆\t133958\nAlho3\t133959\n广而告之\t133960\nhiababa\t133961\n自焚\t133962\n0713\t133963\n呼机\t133964\n真的黑不是
黑你说的白\t133965\n枣子\t133966\n泰勒斯威夫特\t133967\n十指紧扣\t133968\n六阶\t133969\n谁了哥\t133970\n白叔神\t133971\n15051519202\t133972\n仅供参考\t133973\n能变\t133974\n三星期\t133975\n白内障\t133976\n血雨腥风\t133977\n上床亲嘴\t133978\n客气假\t133979\n小天后\t133980\n红白萝卜\t133981\n朴妮唛\t133982\n爱奇度\t133983\n沿袭\t133984\n星期岁\t133985\n红先额\t133986\n圆瑗人\t133987\n一个歌\t133988\n古生物\t133989\n肖总\t133990\n庆功\t133991\n策马\t133992\n再见再见再见再见再见再见再见再见再见再见再见再见再见再见再见\t133993\n真乃何方人士\t133994\n两两四\t133995\n中年人\t133996\n子糖\t133997\n商友\t133998\n全国度\t133999\n221351277\t134000\n咕噜咕噜嘟噜\t134001\n美国航空航天\t134002\n哦熊\t134003\n博阿\t134004\n红绿\t134005\n地心历险记2\t134006\n感冒日子\t134007\n商发\t134008\n大名鼎鼎\t134009\n阿鹏\t134010\n延光姗姗\t134011\n陆父母\t134012\n为命\t134013\ncyif\t134014\nglee\t134015\ncyic\t134016\n基佬王\t134017\nxmakn\t134018\n30篇\t134019\n茹茹\t134020\n30sao\t134021\n香芋糖\t134022\n拐棍\t134023\n恒狒\t134024\n微单\t134025\n呃安\t134026\n图索骥\t134027\nnianianianiania\t134028\n西七一\t134029\nD拜拜\t134030\n林可萱\t134031\n宁亚历山大\t134032\n微博\t134033\n陈梦雪\t134034\n再来再来\t134035\nC8650黑890\t134036\n老朋友\t134037\n8554548687555655654\t134038\n计划经济\t134039\n女英雄\t134040\n腾挪\t134041\nAretheremonthsinnayear\t134042\n压直接雪莲系学\t134043\n无名英雄\t134044\n搅菊\t134045\nChanel\t134046\nedim\t134047\nlklut\t134048\n起因为\t134049\n超极本\t134050\n在云端\t134051\n郑源\t134052\n习惯于\t134053\n王佳音\t134054\n踏窝\t134055\n亘古\t134056\n苦空\t134057\n提没有\t134058\n心之深处\t134059\n张雪忠\t134060\n二组\t134061\n宝马760\t134062\n八七幺幺零零六五\t134063\n白酒味\t134064\n忘记你是\t134065\n速度\t134066\n一首嘛\t134067\n互惠\t134068\n任何\t134069\n告诉你我告诉你\t134070\n寄自\t134071\n泰奇\t134072\n板栗树\t134073\n两山俱乐部\t134074\nvjjcf\t134075\n尤其\t134076\n振荡\t134077\n零点零一\t134078\n廾\t134079\n建\t134080\n更新换代替补偿费劲旅客场景观念叨咕咚咚\t134081\n呢特殊我了你个丑八怪我以后再也不问你问题了拜拜\t134082\n廷\t134083\n延\t134084\n廴\t134085\n廳\t134086\nyanghaoyuan\t134087\n志龙帅\t134088\n关上门\t134089\n１１日\t134090\n张雨\t134091\n职教\t134092\n廣\t134093\n廢\t134094\n张雯\t134095\n河畔\t134096\n马夫我真\t134097\n欧也\t134098\ng\t134099\n一百三十五块\t134100\n廖\t134101\n廓\t134102\nzjbdjcc\t134103\n八岁\t134104\n廋\t134105\n廉\t134106\n廈\t134107\
n王汉锋\t134108\n护肤品\t134109\n好呀欢欢\t134110\n我很喜欢\t134111\n凯负责率\t134112\nAD5厘米\t134113\n你好志愿汤\t134114\n白带\t134115\n18850789659\t134116\n貸款\t134117\n董昱卓\t134118\n试测速\t134119\n漫画家\t134120\n1000000000岁\t134121\n45码\t134122\n1小时内\t134123\n咪水\t134124\n自个儿\t134125\n恳愛台承\t134126\n度秘你真的好搞笑喔欧耶\t134127\nzxc\t134128\n65%\t134129\n沉睡着\t134130\nCMCC\t134131\n零六二二四\t134132\n桐城\t134133\n网页游戏\t134134\n656\t134135\n657\t134136\n654\t134137\n655\t134138\n652\t134139\nzxt\t134140\n650\t134141\n651\t134142\nzxy\t134143\nzxx\t134144\n搜题\t134145\n6点40\t134146\n6点41\t134147\n658\t134148\n斑鸠\t134149\n65O\t134150\n蒋同欢\t134151\n恩调\t134152\n美餐\t134153\n傻强\t134154\n小呆呆是你的女神\t134155\n历时\t134156\n四七元\t134157\n1860年\t134158\n嗯魔\t134159\n曲婉婷\t134160\n洪山体育馆\t134161\n庄一强\t134162\n全摊\t134163\n然们\t134164\n恩谦\t134165\n36人\t134166\n谅解\t134167\n恩谢\t134168\n宝马公司\t134169\n醇厚\t134170\n好吾好\t134171\n民族\t134172\n日款\t134173\n杨健\t134174\n40家\t134175\n余种\t134176\n哈里波特7\t134177\n11少一点\t134178\n回头客\t134179\n今生今世\t134180\n天才能\t134181\n小伙儿\t134182\n8255处理器\t134183\n轻伤股\t134184\n开心见\t134185\n爱莲\t134186\n很冰\t134187\n河滩地\t134188\n马年\t134189\n虐心总裁文\t134190\n成性\t134191\n很冷\t134192\n调伏\t134193\n照照\t134194\n骂男\t134195\n黄土哥\t134196\n542378\t134197\nFFFF\t134198\nAsdfhjkl\t134199\nFFFD\t134200\n爱莉\t134201\ncatue\t134202\n流浪想不到\t134203\n0点八一\t134204\n轩邈\t134205\n田自孜\t134206\n四千四千八百元\t134207\n草莓晶\t134208\n季予柠\t134209\n小多\t134210\n建奋\t134211\n坐公车怪\t134212\n小处\t134213\n梅菲斯\t134214\n李换双\t134215\n诗史\t134216\n小夏\t134217\n小头\t134218\nputas\t134219\n廉亲王\t134220\n张德培\t134221\n磊萆涛\t134222\nJPG\t134223\n叶庄\t134224\n哈复发\t134225\n小大\t134226\n小天\t134227\n中薪\t134228\n小太\t134229\nsmahh48\t134230\n嗯二哥\t134231\nEverybody\t134232\n恩你就这样恶心了我饿死\t134233\n华兴\t134234\n犯罪\t134235\n蝴蝶自来\t134236\n董奇超\t134237\n0点五六\t134238\n周黑鸭\t134239\n光线\t134240\n痘瘤\t134241\n赵天祎\t134242\njsjwihowuuh\t134243\n孔明兄\t134244\nchuang\t134245\n光纤\t134246\n笑里度秘\t134247\n恩不困堡\t134248\n采臣\t134249\n张锡王\t134250\n急QZZN\t134251\n开枝散叶\t134252\n勾引心\t134253\n数角\t134254\n麦勇涛\t134255\n纽卡\t13
4256\n比比比比比比\t134257\n筑底\t134258\n官司\t134259\nlllllyvi\t134260\n5151314\t134261\n过来自星星的你\t134262\n于当比\t134263\n杨晓薇\t134264\n神崎美月\t134265\n水溶液\t134266\ngujgijc\t134267\n无节换\t134268\n同床\t134269\n同庆\t134270\n龙皓\t134271\n夺去\t134272\n刘文博\t134273\n好乐山\t134274\n加布多\t134275\n嗯山\t134276\n欧呦\t134277\n刘文华\t134278\nJdggdfdf\t134279\n发改委\t134280\n一起唱\t134281\n李欣珂\t134282\n支付宝号\t134283\n符开旺\t134284\n低端机\t134285\n我妈妈\t134286\n长角\t134287\n987654123\t134288\ndobbin\t134289\n61051\t134290\n风风性价比\t134291\n硕\t134292\n直线方程\t134293\n47477577\t134294\n石城\t134295\n心情不好阿\t134296\n21卷\t134297\n金氏\t134298\n秘方\t134299\n黎航宇\t134300\n肖岳强\t134301\n嘻笑嘻嘻哈\t134302\n80千米\t134303\n尹一一\t134304\n西游记我喜欢\t134305\n苦力\t134306\n赞赞赞赞赞赞赞赞赞赞\t134307\n单反镜头\t134308\n苦功\t134309\n双马\t134310\n舒佳\t134311\n好的好的好的好的好的好的好的好的好的好\t134312\n秘斗\t134313\ndmr\t134314\n份娘\t134315\n东方红费用运动会\t134316\n喜酒\t134317\n消停\t134318\n新浪博客\t134319\n辛娜斯\t134320\n海内存只知己\t134321\njop\t134322\njos\t134323\njou\t134324\n方博士\t134325\n赏月还你再说下\t134326\n多咪多咪别卖关子有关春\t134327\n3d麦克斯\t134328\n招呼\t134329\njob\t134330\njod\t134331\njoe\t134332\n一丈红\t134333\njog\t134334\njoh\t134335\n农牧场\t134336\n利税\t134337\n小农机\t134338\nn1样\t134339\njoo\t134340\n李老哥\t134341\n武美\t134342\n而立之年\t134343\n淀山湖\t134344\n小不点儿\t134345\n希尔尔克\t134346\nLCD\t134347\n想了算\t134348\n八矴\t134349\n买通\t134350\n稀\t134351\n一百一起\t134352\n一个二\t134353\n阿斯猫\t134354\n守卫剑阁五虎将后传\t134355\n一个事\t134356\n麦都\t134357\n6月7号\t134358\n荣自林\t134359\n四点二十分\t134360\n赵培\t134361\n狭缝\t134362\n粗口\t134363\n恭喜恭喜恭喜恭喜\t134364\n一个五\t134365\n可有你\t134366\n惨败\t134367\nｐｓ\t134368\n粗发\t134369\n黄建军\t134370\n罗美\t134371\nhgfgiiikk\t134372\n康慧军\t134373\n一个亿\t134374\n一个人\t134375\n大旺兴隆\t134376\n麦郎\t134377\n和陈\t134378\n07875\t134379\n民事\t134380\n玩艺\t134381\n超级游艇\t134382\n香港湾仔码头\t134383\n李肖琪\t134384\nAagelababy\t134385\n排行榜\t134386\n洗一嘻嘻\t134387\n半边\t134388\n知道识\t134389\nchydjsi\t134390\n中网\t134391\n录音我听不懂你说的话\t134392\n马赫晗\t134393\nafdffe\t134394\n秦誉菲\t134395\n稱\t134396\n5156个小时\t134397\n嗯西辞\t134398\n千娇\t134399\n嘻嘻嘻嘻\t134400\njjkk\t134
401\njjkm\t134402\n白指望\t134403\n捡肥皂\t134404\nyahourz\t134405\nProbiotics\t134406\n孟石林风景区\t134407\n贝鲁达\t134408\n林玲玲\t134409\n1954年5月\t134410\n醉吧\t134411\neetyu\t134412\n蜂蜜蜂蜜\t134413\n你果\t134414\n原來\t134415\n女子桥\t134416\n摩洛哥\t134417\n绿巨人\t134418\n美容美体医院\t134419\n苦于\t134420\n握不着\t134421\n你好害羞\t134422\n糖精\t134423\n乏力\t134424\n跳轨\t134425\n姐求你了你快把我和你说的话\t134426\n猜猜擦擦擦擦擦擦擦擦擦\t134427\n党文豪\t134428\nwewsfswwadds\t134429\n夜视\t134430\n制版费\t134431\n武姬希瓦娜\t134432\n火烈鸟\t134433\n祖国片\t134434\n震后\t134435\n侵犯\t134436\n知人机\t134437\n脖子\t134438\n火星人\t134439\n惊哭\t134440\n鸡蛋面\t134441\n恁耐\t134442\n太忙\t134443\n轻舞曲\t134444\n优乐美\t134445\n看无古\t134446\n叶梦圆\t134447\n诗篇\t134448\n祸福\t134449\nthisold\t134450\n種\t134451\n电影节\t134452\nHouston\t134453\n召开\t134454\n提名\t134455\n零八毛\t134456\n刘岳婷\t134457\nhhtf\t134458\nhhtx\t134459\n五四四十万050点\t134460\n蔡婷玉\t134461\n高瑞海\t134462\n加特\t134463\n四月间\t134464\n千里迢迢\t134465\n大于\t134466\n杨乃文\t134467\n困我想\t134468\n头手\t134469\n2299\t134470\n台湾街\t134471\n11月15号\t134472\n私生活\t134473\n重现在开始\t134474\n当兵做\t134475\n2000毫升\t134476\n三月四\t134477\n来度\t134478\n福利好呀\t134479\n詹絷\t134480\n子律\t134481\n处女秀\t134482\n哭了也忘记\t134483\n凯宝\t134484\n观音山\t134485\n锅铲\t134486\n4月1日起\t134487\nBufan\t134488\n忘情水\t134489\n2290\t134490\n8万名\t134491\n新宏业\t134492\n一千呱\t134493\n160块\t134494\n3827963\t134495\n易芃\t134496\n陈俊含\t134497\nminjie\t134498\n一套\t134499\n封印经\t134500\n還有\t134501\nLOGO\t134502\n激凸\t134503\nbdbxjdjnd\t134504\n窝子\t134505\nGhoshbcc\t134506\n朝壤\t134507\n点不啊\t134508\n兴业营销\t134509\n心源性猝死\t134510\n青州\t134511\n67672458345454357645734677567985467946857642568467988989986564645635655648631513868345665\t134512\n尿裤子\t134513\n一女\t134514\nddeg\t134515\n大头儿\t134516\n朕原谅你了爱情平身\t134517\n一奸\t134518\nLONDON\t134519\n小铭\t134520\n吴浩林\t134521\n周低点\t134522\n七十周年\t134523\n不如\t134524\nyuuug\t134525\n1235682\t134526\n兴隆李\t134527\n猪舍\t134528\n机器时代\t134529\n不妞\t134530\n小铺\t134531\n不妖\t134532\n没想你\t134533\n不妨\t134534\n自动门\t134535\n了不受我了直\t134536\n小铁\t134537\nsyfx\t134538\n不妥\t134539\n概论\t134540\n武媚娘传奇\t134541\n
捣蛋狗\t134542\n无可避\t134543\n875210\t134544\n理会\t134545\n减慢\t134546\njpjpjj\t134547\n吴军伟\t134548\n挖向\t134549\nwtayci\t134550\n迪斯尼\t134551\n麻饼\t134552\n形声\t134553\n形壮\t134554\n昏昏沉沉\t134555\n库巴\t134556\n小头儿\t134557\n炒蛋\t134558\n有真是有所浩限\t134559\n大望桥\t134560\n两小无猜\t134561\n转出至\t134562\n攻城\t134563\n动起来\t134564\n一万20万\t134565\n战斗侠\t134566\n妈咪度\t134567\n冤案\t134568\n捉妖记很好看你\t134569\n摩卡\t134570\n质询\t134571\n王柯丽\t134572\n88公斤\t134573\n块数\t134574\n你好好玩儿\t134575\n587580\t134576\n真不漂\t134577\nroybn\t134578\n百威DJ32瓜达尔岛堡奥堡\t134579\n被子卷\t134580\n千王子\t134581\n14日晚\t134582\n爱称\t134583\n啦啦啦我的哦\t134584\n赵粤\t134585\nolinmoyy\t134586\n搜罗\t134587\n地妖\t134588\n异侠\t134589\n勾手\t134590\nshimian\t134591\n朝下\t134592\n爱秀\t134593\n善良男\t134594\n革新\t134595\n爱秘\t134596\n尿贫\t134597\n大人\t134598\nracecars\t134599\n柏雨轩\t134600\n爱科\t134601\n8778\t134602\n平均地权\t134603\n洪玄宗\t134604\n爱上我吧\t134605\n杨鹏\t134606\n2009年7月23日后\t134607\n这款车\t134608\n汉宜\t134609\n罗汉果米\t134610\n搭配\t134611\n市盈率\t134612\n1111115ab一大丁\t134613\n实事儿\t134614\n二零二五\t134615\n憎死\t134616\ndudkjifdjg\t134617\n15929347720\t134618\n万万年\t134619\n冰梦姐\t134620\n内扣\t134621\n我想爱上你\t134622\n30平方公里\t134623\n智能机器人度秘\t134624\n抱憾\t134625\n打赌打屁屁男难女难那就撸撸女\t134626\n邵雷\t134627\n新蛋\t134628\n圆盘\t134629\n000000医院\t134630\nZcb\t134631\n野草\t134632\n六零五二\t134633\n碳素印\t134634\n叫真讨厌\t134635\n宠物商城\t134636\nmgp\t134637\n周冠\t134638\n呵斥\t134639\n近12年\t134640\nkoliep\t134641\n特别\t134642\ndd42a28346fedf27f5cb5c9ea15cebf7fjpg\t134643\n夏之光\t134644\n同樣\t134645\n20min\t134646\n中央电视台\t134647\n炒房\t134648\n茶楼\t134649\n大河流\t134650\n海战\t134651\n给我来\t134652\n晨赫\t134653\n紧箍咒\t134654\n海戏\t134655\n武士兰士诺\t134656\ntfftfx\t134657\n温泉馆\t134658\n饮酒\t134659\n寄存\t134660\n着觉\t134661\n没礼貌\t134662\n度秘不丹\t134663\n加里内维尔\t134664\n蓝狗\t134665\nPEclass\t134666\n风到\t134667\nxjzkj\t134668\n大玛雅\t134669\n永久而\t134670\n丑样\t134671\n一个日\t134672\n是么\t134673\n拍卖部\t134674\n39度\t134675\n裸背\t134676\n萨苏\t134677\n哺乳动物\t134678\n你好你好瓮的假的爸爸\t134679\n谌思涵\t134680\n陈姿彤\t134681\n对我来\t134682\n猜延迟\t134683\n念咒\t134684\n60周岁\t13468
5\n爱上我了想\t134686\n09米\t134687\n树大太古\t134688\nhttpfhiphotosbaiducomxiaodupicitemca1349540923dd546301f41cd609b3de9c824857jpg\t134689\n健健康康健\t134690\n幫忙\t134691\n破血流\t134692\n马千惠\t134693\n乌塔\t134694\n轰然\t134695\n哪次\t134696\n哪款\t134697\n仁化\t134698\nbagcc\t134699\n犯贱\t134700\n比高\t134701\nfigf\t134702\n郑林蒙\t134703\n八点一顿一片\t134704\n一条条\t134705\nuatj1t\t134706\n44床\t134707\nywwodzzzz\t134708\n喔老爹爹给我\t134709\n唐文佳\t134710\n猜幽\t134711\n不摸\t134712\n22:00—00:05\t134713\n立春\t134714\n猜年\t134715\n凶猛\t134716\n文水\t134717\n庭旭\t134718\n这就是\t134719\n纪·念雷雨心\t134720\n血管\t134721\n特刊\t134722\n哪几男\t134723\n傻逼们\t134724\n2008年5月4日\t134725\n张星星\t134726\n疲劳\t134727\n苏桂龙\t134728\n嗯熊\t134729\n胡海东\t134730\n川贝的我喜欢做\t134731\n碳水\t134732\n镛记\t134733\n畅饮\t134734\n现代感\t134735\n庹科江\t134736\n么什么么快亲我脸\t134737\n萨黁\t134738\n有事先走\t134739\n啷个哩个啷\t134740\n咯MT\t134741\n奥巴马奥\t134742\nHenhenshe\t134743\nkunfu\t134744\n抱你儿\t134745\nTJJBABC\t134746\n舞厅\t134747\n神奇侠侣\t134748\n诗文\t134749\n蒜蓉小豆苗\t134750\n奥运夺金奖\t134751\n虎\t134752\n咧通\t134753\n对骂\t134754\n树子\t134755\n田浅诗羽\t134756\n旦夕\t134757\nhi修睿\t134758\n不动活\t134759\nhkhf\t134760\n第二集\t134761\n八切\t134762\n八刀\t134763\n如龙\t134764\n焉小杰\t134765\n迪拉热\t134766\n日常化\t134767\nNOPQ\t134768\n466523\t134769\n囉\t134770\n咯M8\t134771\n蒋小虎\t134772\n百元\t134773\n专方\t134774\n二七指\t134775\nvod\t134776\n5578246545684568\t134777\n明天7点半\t134778\n开启\t134779\n秘私图\t134780\n贵价\t134781\n美鱼人\t134782\n成一个\t134783\n7323882\t134784\nGUGG\t134785\n努尔古丽\t134786\n申美妞\t134787\n开合\t134788\n走红\t134789\n助战\t134790\n好密度\t134791\n4455484888443\t134792\n博和\t134793\n煨\t134794\n地热\t134795\n西本新干线\t134796\n行贿\t134797\n更惨\t134798\n2688\t134799\n雨天睛天\t134800\n真真假假\t134801\n点搞笑\t134802\n灯珠\t134803\n童文斌\t134804\n一首情诗\t134805\n哈批\t134806\n深奥\t134807\nrvfeg\t134808\n打算说\t134809\n存储量\t134810\n好你好\t134811\nXxwtsn\t134812\n胡继燕\t134813\n过渡性\t134814\n硫磺椒干炒避孕鳝\t134815\n土大同\t134816\n郑经理\t134817\n公共浴室\t134818\n双人版\t134819\n长江江豚\t134820\nyhuuycccvhhoohkbhkbkbbkbbbkk\t134821\n合者\t134822\n闹市\t134823\n克太阳\t134824\nSJB舞\t134825\n定义\t1
34826\n四合院\t134827\n介于\t134828\n难为你\t134829\n介事\t134830\n识骗\t134831\nghugy\t134832\n度秘度秘你长的好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑好丑\t134833\n小手工\t134834\n佛山\t134835\n人造鸡蛋卤注胶牛肉\t134836\n尤里\t134837\n复旦大学新闻系\t134838\nMOTO\t134839\n专望\t134840\n36个小时\t134841\n残梦\t134842\n佩瓦\t134843\n四单\t134844\nCCTV新闻联播\t134845\n金毛幼崽\t134846\n史密斯\t134847\n99735\t134848\n南外\t134849\nxiay\t134850\n十人个\t134851\n苯基二氢喹唑啉\t134852\nwin7\t134853\n别这样不然\t134854\n环星\t134855\n缪美丽\t134856\n修罗武神\t134857\n今天两天\t134858\nTH\t134859\n台朵\t134860\n说不出来\t134861\n无原则\t134862\n整废\t134863\n秘录\t134864\n烟蔼\t134865\nhgghhch\t134866\n添啵\t134867\n王三阳\t134868\n欺人太甚\t134869\n王子天\t134870\n弄么\t134871\n第三方\t134872\n屁眼儿\t134873\n务必要\t134874\n莫逞\t134875\n蕨根粉\t134876\n我真不懂你说的话\t134877\n头排\t134878\n一百针\t134879\n3x5x27\t134880\n闻噩噩噩噩噩噩噩噩噩\t134881\n2门\t134882\nwint\t134883\n退怯\t134884\najgjm\t134885\n给纸媒\t134886\n稀奇古怪\t134887\n特命\t134888\n天书包\t134889\n弄乱\t134890\n可圈可\t134891\nwing\t134892\nwind\t134893\n3pvv\t134894\n王狗娃\t134895\n点鸭\t134896\n赛德克\t134897\n9000元\t134898\n30多种\t134899\n统称\t134900\n不一点\t134901\n感觉我真的很爱\t134902\n穿越三国\t134903\n帝扬\t134904\n罐头鱼\t134905\n金玲珑\t134906\nmunny\t134907\n金字招牌\t134908\n颜值业\t134909\n沙背\t134910\n世纪闵月传\t134911\n绿宝石\t134912\n马力\t134913\n相辅相成\t134914\n边角\t134915\n朱其波\t134916\n稍加\t134917\n144司\t134918\n新福苑\t134919\n蓝光\t134920\n无利可图\t134921\nMRS\t134922\n勋晶\t134923\n1224565527538376727427388334243424383867275868675757553738374386757675757672863\t134924\n货真价实\t134925\n今天下午三点\t134926\n我真的很想你给我说说你的知心话\t134927\n幸亏院\t134928\njkylomomomomomomomomomomomoo\t134929\n长安大学\t134930\n四卷\t134931\n嗯杜米\t134932\n偶书少小离家老大回乡音无改的毛衰儿童相见不相识笑问客从何处来\t134933\nfy98gtfoyto\t134934\n18523698521\t134935\n快一点儿植物大战僵尸三级版\t134936\nfhd\t134937\n数十万元\t134938\n满长\t134939\n46分钟\t134940\n腾骏\t134941\nhdbdh\t134942\n睡觉睡觉吧小宝贝睡觉吧小宝贝\t134943\n哎呀妈呀呀呀呀呀呀呀\t134944\n仙女座\t134945\n10分钟\t134946\n胡伊柠\t134947\n25831871下\t134948\n套用\t134949\n麻拐\t134950\n能人\t134951\n明园\t134952\njagzl\t134953\n国有股\t134954\n九乡\t134955\nfhb\t134956\n植物性\t134957\n说话的说结巴了你个老爹\t13495
8\n字点\t134959\n四卡\t134960\nGhggggggggggghhgggggv\t134961\n隋着\t134962\n九九\t134963\n18242866262\t134964\n美眉\t134965\n线子\t134966\n调羹\t134967\n修为\t134968\n黄热同\t134969\n凤凰花\t134970\n骂架\t134971\n责骂\t134972\n退去\t134973\n交接班\t134974\n永远宝贝儿\t134975\n平武古城\t134976\n芭比汗\t134977\n数学学\t134978\n毒辣\t134979\nickooomeansomeowsomasityou\t134980\nawrfg\t134981\n惹你好\t134982\n情歌\t134983\n聚丙烯\t134984\n958820\t134985\n为麻不为\t134986\nKarry\t134987\n陈衫\t134988\n跷跷板\t134989\n聊了拜拜\t134990\n各人\t134991\n还有什么\t134992\n13263179322\t134993\n火影星\t134994\n啦啦啦啦啦啦啦\t134995\nJfdvxjkio\t134996\n髓\t134997\nsahe\t134998\n杨孟川\t134999\nfhv\t135000\n尊守\t135001\n坏我很好\t135002\n丁永强\t135003\n高\t135004\n韩大姐\t135005\n天王州\t135006\n克力\t135007\n克功\t135008\n没我自已想\t135009\n3bcdefa\t135010\n髌\t135011\n大熊猫\t135012\n十本\t135013\n成天沙皇\t135014\n好了撒撒来老婆婆\t135015\nyusidy\t135016\n熨斗\t135017\nPlease\t135018\n哇雨\t135019\n想了我讨厌\t135020\n冒肥\t135021\n12364788\t135022\n179\t135023\n178\t135024\n177\t135025\n176\t135026\n175\t135027\n174\t135028\n172\t135029\n171\t135030\n170\t135031\n十月\t135032\nKBS2.DreamTeam2\t135033\n外包\t135034\n一年一年\t135035\n陆风X七\t135036\n快穿系统文\t135037\n小农村\t135038\n885686578647757\t135039\n哼杀\t135040\n十期\t135041\n屈柳\t135042\n1乘乘\t135043\n十朗\t135044\n照一\t135045\n逗笑我不会\t135046\ntyhhhhjioo\t135047\n17k\t135048\n苍白\t135049\n17g\t135050\n回吧\t135051\n回否\t135052\n17c\t135053\n继续人\t135054\n法洁\t135055\n轮辋\t135056\n鸡娃娃\t135057\n怀予\t135058\nvvss\t135059\n心理咨询师牟洁\t135060\n凌晨三点\t135061\n17p\t135062\n甜甜的吹吹\t135063\n榆中恩玲学校\t135064\n回合\t135065\n嗯喔波\t135066\n信鸽\t135067\n立案庭\t135068\n偏大\t135069\n抗联\t135070\n植物大战僵尸一\t135071\nQQ秀\t135072\n来会儿昂\t135073\n前辈们\t135074\n植物大战僵尸三\t135075\n回吖\t135076\n点儿\t135077\n炸鸡叉\t135078\n中海金沙馨园\t135079\n针叶\t135080\n跑圈\t135081\n非常棒\t135082\n另一面\t135083\nghiphotosbaiducomxiaodupicitemfc1f4134970a304eb3e38c89d6c8a786c8175cfbjpg\t135084\n旗间\t135085\n3月8日到12日\t135086\n三货\t135087\n李香霖\t135088\n三贡\t135089\n123550\t135090\n切我不是汉子问我我是女奥特曼\t135091\n123555\t135092\n没想多\t135093\n想说真话\t135094\n张宏岩\t135095\n豆渣\t1350
96\n斩\t135097\n王浩宇\t135098\n惨绝人寰\t135099\n不万岁\t135100\n斧\t135101\nBang\t135102\n油儿\t135103\n菊了我真的好丑\t135104\n儿八经\t135105\n150克\t135106\n飘忽\t135107\n正道\t135108\n啪嚓\t135109\n景顺长城\t135110\n假呀\t135111\n十二岁度\t135112\n骚穴\t135113\n金巧金\t135114\n战胜\t135115\n滑板鞋\t135116\n旱养\t135117\n罗浦杰\t135118\n41苏\t135119\n生老病死\t135120\n魅蓝note\t135121\n宋宇航\t135122\n五姐\t135123\n被淹\t135124\n如是我观\t135125\n零二二零\t135126\nlpy\t135127\n换季\t135128\n家瓜\t135129\nkkkkkkkkkkkkk\t135130\n42点\t135131\nKarl\t135132\n过五关斩六将\t135133\n一二月二二\t135134\n轻功\t135135\n皮辊\t135136\n多美\t135137\n卡尔纳克神庙\t135138\n唯个屁\t135139\n郭苏平\t135140\n声汗兵\t135141\nu五花肉\t135142\n兼登辺\t135143\n考生\t135144\n方\t135145\n南方人\t135146\n11年1月\t135147\n苜蓿\t135148\n装备\t135149\nbomi\t135150\n72点\t135151\n在不上\t135152\n小呆头\t135153\n有节制\t135154\n瓷荷\t135155\n稞一∽\t135156\n哪位\t135157\n大树岭\t135158\n我麽\t135159\n给我足\t135160\n方丈心\t135161\n爬山\t135162\n相照应\t135163\n乖乖仔\t135164\n宋甜甜\t135165\nhttppinyincne17143\t135166\n材质\t135167\n马辰飞\t135168\n张壮双\t135169\n洋洋他好帅\t135170\ndhic\t135171\n老高\t135172\n乘客君\t135173\n信缘\t135174\n青华\t135175\n查了查\t135176\n修养\t135177\n斌\t135178\n推土机\t135179\n玩情\t135180\n双膝\t135181\n初升的太阳\t135182\n坛上\t135183\n正清明\t135184\n太错\t135185\n别进\t135186\n恭祝\t135187\n必胜\t135188\n好孤独\t135189\nggdth\t135190\n二项式\t135191\ndhiF\t135192\n劳累\t135193\n米给米\t135194\n华侨龙舟队\t135195\n放在\t135196\n5616738364176951763\t135197\n脑胀无理取闹志\t135198\n别过\t135199\n找笑笑笑\t135200\n撤侨\t135201\n饿罗\t135202\n林觉冰\t135203\n一仟元\t135204\n哼唧唧\t135205\nDuhhuhvf体不过衣服V7黑杰克4gfxkgjgC\t135206\n封掉\t135207\n敷衍\t135208\n场地费\t135209\n王庆\t135210\n跟风\t135211\n王庄\t135212\n35集\t135213\n秀英文\t135214\nPS3\t135215\n阿拉拉拉拉拉\t135216\n局部战争\t135217\n哈尔滨市\t135218\n口齿不清\t135219\n厌不厌死\t135220\n教育家\t135221\n撒那英\t135222\n咕嘟咕嘟咕嘟嘟哒哒哒哒哒哒哒哒哒哒嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟哒\t135223\n阿然\t135224\n斜\t135225\ngJ5h\t135226\n异缘木\t135227\n忒提斯\t135228\n牛蜜\t135229\nwjzx\t135230\n人道主义\t135231\n溃退\t135232\n你的恋爱史\t135233\n柏\t135234\n叮叮当\t135235\n一个张\t135236\n卸妆油\t135237\n255\t135238\n姗姗红\t135239\n抬高\t135240\ntmail\t135241\n诚朗\t135242\n6月7\
t135243\n嗯宗师\t135244\n6月1\t135245\n蓝狮子书店\t135246\nPSN\t135247\n春风\t135248\n光说\t135249\n23ko\t135250\n颓废\t135251\n万间\t135252\n该城\t135253\nPSP\t135254\n猛地\t135255\n你和你\t135256\n后边儿\t135257\n爱着\t135258\n题你会\t135259\n时样\t135260\n急离开\t135261\n曾先生\t135262\n还能带\t135263\n不对称\t135264\n波涛汹涌\t135265\n同甘共苦\t135266\n丁羽\t135267\n著名\t135268\n李峰\t135269\n汪珈宇\t135270\n一湾\t135271\nATM机\t135272\n街男孩复合句\t135273\n成都嘉诚长城4S店\t135274\n哈嗯\t135275\n合同法\t135276\nxxvvbgccvzchv刚才vu福仓促1cvcfjv3\t135277\n小亮亮\t135278\n地地\t135279\n何晓玉\t135280\n总监\t135281\n叶司爱度那\t135282\n真是好不要脸你不膜拜我你们班谁谁是你的主人\t135283\nadc\t135284\n中杯可乐\t135285\n天上地下\t135286\n哎布布的你在哪里\t135287\n自谋体\t135288\n晕笑\t135289\n视频\t135290\n上个星期三\t135291\n思密达vu\t135292\n七十多岁\t135293\n偶天哪我们数学\t135294\n大吧帅\t135295\nYOOOOOOO\t135296\n战国微博之吮痈舐痔\t135297\nchild\t135298\n生死的人\t135299\n许志永\t135300\n眼红\t135301\n近视镜大酒店\t135302\n暨\t135303\n眼见为\t135304\n虚体\t135305\n唯爱SJ13]110618\t135306\n圣樱\t135307\n婴房\t135308\n黛安·阿勃斯\t135309\nloolll\t135310\n跳远入\t135311\n化学版\t135312\n防风\t135313\n眼纹\t135314\nrexinell\t135315\n劝和\t135316\n木村\t135317\n木材\t135318\n全盘\t135319\nv方法服服贴贴\t135320\n团身\t135321\n909653523\t135322\n江先生\t135323\nJJSK\t135324\n佛爷\t135325\n解题\t135326\n死回来了也忘不要我\t135327\n呢堡\t135328\n十九块\t135329\n芳芳光\t135330\n徐神医\t135331\n呢考完\t135332\n古庙\t135333\ngjfbkj\t135334\n满目缤纷\t135335\n赵武灵王\t135336\n飘起来\t135337\n扎龙小区\t135338\n75岁\t135339\n丑丑丑丑丑丑丑丑丑丑丑丑丑丑\t135340\n两万块\t135341\n负重\t135342\n苹果5s\t135343\nLljjgaaaj\t135344\n木板\t135345\n木条\t135346\n到秘我不喜欢你了你\t135347\n精热\t135348\n13893438205\t135349\n禮拜\t135350\n搜索索\t135351\n平抛运\t135352\n林丽才\t135353\n重启\t135354\n男孩子\t135355\n名快\t135356\n均可\t135357\n爸爸不为难你了啊小乖乖度秘\t135358\nJfbnvjfnbxngJvxchvcNbbbhebbHBbjhvccccjgbhx\t135359\n梁文慧\t135360\ntple\t135361\n自己的话\t135362\n重合\t135363\n卫子夫\t135364\n111457328\t135365\n板蓝根\t135366\n重名\t135367\ndrg\t135368\n首采\t135369\n综合科\t135370\n5.仲\t135371\n彼得帕克\t135372\n名志\t135373\n幼儿\t135374\n偏头痛\t135375\n顺应\t135376\nBEIJINGALL\t135377\n査\t135378\n首金\t135379\n免流卡\t135380\nhttpahiphotosbaiducomxiaodu
picitemb21bb051f8198618868b8ff14ded2e738ad4e6f3jpg\t135381\n华融\t135382\n恩恩蟹蟹\t135383\n小洋\t135384\n龙碧生泉\t135385\n我是女的你喜欢我\t135386\n代数\t135387\n葛优\t135388\n武玉兰\t135389\n雪影\t135390\n猪秘你的\t135391\n呃跪\t135392\n悉闻哈尔滨\t135393\n轻淡\t135394\n许颖欣\t135395\n十二千米\t135396\n表心病\t135397\n东南西北\t135398\ncoolpad\t135399\n雷魔术\t135400\n枪手\t135401\n沉木\t135402\n财大读机\t135403\n邂\t135404\n十百\t135405\n刘妍\t135406\n飞黄腾达\t135407\n郭健宇\t135408\n小洁\t135409\n穿行\t135410\n二环路\t135411\n合订版\t135412\n无氟变频壁挂式空调\t135413\n摄制组\t135414\n杰哥\t135415\n蒙羞\t135416\nbirth\t135417\n病根\t135418\n爱乐cu\t135419\n看开心\t135420\n王什\t135421\n脑门儿\t135422\n穿衣\t135423\n中纪委\t135424\nhdkH\t135425\n评审团\t135426\n生存在\t135427\nfeade\t135428\n2500年\t135429\nhwhsush\t135430\n极具\t135431\n极其\t135432\n死乱\t135433\n凉茶\t135434\n话鸟\t135435\n18755996410\t135436\n夜萝莉娃娃\t135437\nABCDEFG\t135438\n笨笨之王\t135439\n佳佳行\t135440\n极典\t135441\n乐东\t135442\n18871018187\t135443\n预言者顾\t135444\n美眉你好美\t135445\n南特\t135446\njwvs\t135447\n8000万欧元\t135448\nfcv\t135449\nfjdi\t135450\nfjdj\t135451\n500多名\t135452\n小星星学\t135453\n乐丽\t135454\n院内\t135455\nKeita\t135456\n转世鸟\t135457\n雷管系\t135458\n冯德连\t135459\n5434434\t135460\n极光\t135461\n一次一\t135462\n埃文\t135463\nzdxs\t135464\n8本\t135465\n达姆\t135466\n疯狂式\t135467\n菲律宾军队北吕宋司令部\t135468\n宏伟哥\t135469\n加下\t135470\n老我找\t135471\n薛翔\t135472\n利希滕君\t135473\n一次个\t135474\n耶耶耶\t135475\n妇神\t135476\n8月\t135477\n紫紅\t135478\n三八四二\t135479\n占用品\t135480\n邵文\t135481\n唉头\t135482\n剑魂\t135483\n窝窝\t135484\n太阳时\t135485\n非金钱\t135486\n祝贺\t135487\ndegef\t135488\n鼓楼\t135489\n何事长向别时圆\t135490\n36架\t135491\n44444444444444\t135492\n开宝\t135493\n王瑛琪\t135494\n郝静茹\t135495\n千骨姑娘叫花姑娘\t135496\n3％\t135497\n出境\t135498\nhappa\t135499\n维稳\t135500\n止住\t135501\n小鸟叔\t135502\n个间\t135503\nNmdkskwv\t135504\n混乱症\t135505\n三分之一第二次\t135506\n如情牵雷\t135507\n我不我不懂\t135508\n捕鱼\t135509\n那你是谁呀你是神\t135510\nhappy\t135511\n1000000000000000000000000000000000000000000000000\t135512\n聪明点\t135513\n恨透\t135514\n失望😞\t135515\n1579米\t135516\n冬瓜雄鸡\t135517\n我爱你爱你爱你爱你\t135518\n49元\t135519\n白羊白羊白羊\t1355
20\nac特\t135521\ndhhdhdgfgfh\t135522\n99％\t135523\n战线\t135524\nNK,NK\t135525\n日均\t135526\n烦燥\t135527\n战纪\t135528\n先知先觉\t135529\n你在后和你爸\t135530\n2773980\t135531\n毒舌男\t135532\n汉语音韵学\t135533\n蹲着\t135534\n癸水\t135535\n到货付钱\t135536\n嗯哼骚\t135537\nkhu\t135538\n冷弥音\t135539\n上海斧头帮众歧视\t135540\nkhh\t135541\n赤色黎明\t135542\nkhj\t135543\nkhk\t135544\nkhn\t135545\n啪乒\t135546\n孙天露\t135547\nsohu\t135548\nkhe\t135549\nkhf\t135550\nkhg\t135551\n妄想\t135552\n徐水\t135553\n赵片发\t135554\n莫人\t135555\n撒气\t135556\n健德门\t135557\n小儿麻甘颗粒\t135558\n蓍草\t135559\n鱼贩\t135560\n度度度度度度\t135561\n上海百脑汇公司电脑医院\t135562\n嗷嗷待哺\t135563\n先强上元村\t135564\n取笑\t135565\n1369237个\t135566\n2727\t135567\n冷水江\t135568\n2725\t135569\n心理因素\t135570\n2721\t135571\n六月\t135572\nyfts\t135573\n惹给\t135574\nmokhg\t135575\n翠花姐\t135576\n6800\t135577\n埃托奥\t135578\n上来了\t135579\n六朝\t135580\n六期\t135581\n秘不在\t135582\n简章\t135583\nhffjkugff\t135584\n朗斯\t135585\n六本\t135586\n六朵\t135587\n才山\t135588\n第二周\t135589\n哈可是明天你在还有传奇\t135590\n跑不走\t135591\n返死人\t135592\ngaIGZIAO\t135593\n马艺嘉\t135594\ntsril\t135595\nBbvhv\t135596\n哎呀妈呀你没\t135597\n刺绣群\t135598\n小可爱你的名字\t135599\n大葱\t135600\n9欧\t135601\n重新认识\t135602\n行行好的行\t135603\n101岁\t135604\n自傲\t135605\n不不不不不不不不不不不不不把你那不不不不不不不不不不把你\t135606\n李晓鑫\t135607\n安旭\t135608\n白小黑\t135609\n大董\t135610\n9次\t135611\n颜值\t135612\n原田美枝子\t135613\n永安明天天\t135614\n5555555555555555555555555555555555555555555555555555555555555555555555555555555555555555555\t135615\nqq度秘\t135616\n飞来飞去\t135617\n各行\t135618\n同心彩虹\t135619\n耶赛\t135620\n活体\t135621\nhhhjj\t135622\n侧翻\t135623\n住再发\t135624\n王宝强王\t135625\n活佛\t135626\n那个年代\t135627\n兑奖\t135628\n别聊这么深奥的问题我是电台小艾小达人不如问我有有\t135629\n业内\t135630\n锁匠\t135631\n浪费我的浪费我的空间\t135632\n47公斤\t135633\n111朗\t135634\n1900年\t135635\n陪着我\t135636\n中国网络电视台\t135637\n6nnnnnnnn9nnnnnnnnnnnnnnnnnnnnnn7nnnnnnnnnnnn8\t135638\n万新村\t135639\n随便了我讨厌\t135640\n事儿日\t135641\n徐ux\t135642\n开心卟\t135643\n货到\t135644\n给暖手\t135645\n奥抱歉\t135646\n频数\t135647\n另一个梦\t135648\n兄弟包\t135649\n先进\t135650\n四分之三多五个\t135651\n不少于7小时\t135652\n极简\t135653\n肖金豪
\t135654\n天声\t135655\n天士\t135656\nggugui\t135657\n无语😓\t135658\ncffffff\t135659\n天壤\t135660\n抽查\t135661\n专栏集\t135662\n18251899202\t135663\n2148348\t135664\n陈润林\t135665\n2vkb\t135666\n乙船\t135667\n萨达哈鲁\t135668\n朱迪\t135669\n不要你做我的小秘书我要你做我的朋友\t135670\n找你行\t135671\n鲜辣\t135672\n猋\t135673\n玖晓天天都\t135674\n木讷\t135675\n算了了\t135676\n麦斯威尔\t135677\n曼陀\t135678\n尘埃落定\t135679\n#卡巴斯基实验室\t135680\n557526522525356173737317831868167\t135681\n转达句\t135682\n墨西哥\t135683\n明晃晃\t135684\n说了走\t135685\n螺珞\t135686\n们智\t135687\n都是爱\t135688\n轮奸\t135689\n589.7亿元\t135690\n崩儿\t135691\n十三你了度秘\t135692\n才胸\t135693\n13196408797\t135694\n人精\t135695\n傻度\t135696\n鬼狐狸眼\t135697\n永爱\t135698\n贾玲\t135699\n承办\t135700\n马努斯\t135701\n四千克\t135702\n才能\t135703\n咨源\t135704\n那你一个人在家么小乖\t135705\n消极怠工\t135706\n妇儿\t135707\n滔天真好看\t135708\n派单\t135709\n李谷一\t135710\n我是好想要爱爱\t135711\n烛光\t135712\n蓝晨\t135713\noooooooooooooooooooooooooooooooooooooooooooooooooooo\t135714\n六分之一场\t135715\nf15w\t135716\n自作聪明吧\t135717\n孩儿\t135718\n魔色\t135719\n去向\t135720\nGcgvdh\t135721\n知为知吧\t135722\n中国武术\t135723\n快乐无尽头\t135724\n穿露\t135725\n浒组\t135726\n李笔\t135727\n白羊龙\t135728\n去吧\t135729\n姨妈\t135730\n白骨精\t135731\n帀海北洲\t135732\n邓锦杰\t135733\nbookling\t135734\n完可\t135735\n哈哈牛\t135736\n妙骂\t135737\n哈哈片\t135738\n624324213\t135739\n三比\t135740\n怕怕\t135741\n母婴\t135742\n南大同\t135743\n打哈\t135744\n1000000000000000000亿岁\t135745\n天晓得\t135746\n打响\t135747\n吴节操\t135748\n渝香隆\t135749\n长条型\t135750\n什么业\t135751\njdua\t135752\n七十几\t135753\n第六步\t135754\n驶入\t135755\njduk\t135756\n素裹\t135757\n我爱你爱你爱你爱你爱你爱你\t135758\nOui\t135759\n杀人\t135760\n朝鲜公共\t135761\n9路\t135762\n友善\t135763\n安及拉angelababy\t135764\n6754\t135765\n99999999999年\t135766\n异性缘\t135767\n唯美\t135768\nyesnnornnno\t135769\n了全\t135770\n22225678\t135771\nvego\t135772\n市中区\t135773\nOut\t135774\n雪花香\t135775\n陈焕嫦\t135776\n森林森林\t135777\n往右拐\t135778\n勉强\t135779\n判罚\t135780\n早来\t135781\n比起赛\t135782\n反差\t135783\n胡编乱造\t135784\n服务亭\t135785\n十九点\t135786\n争霸赛\t135787\n天片\t135788\n施正荣\t135789\nhcllcc\t135790\nWwewQ\t135791\ns如图额OK\t135792
\n还清还\t135793\n玄会\t135794\n瘦下来\t135795\n刘小鹏\t135796\nHFXC\t135797\n戈洛夫\t135798\n阿訇\t135799\n阿言\t135800\n混蛋吗秘\t135801\n车况\t135802\n工行手机银行\t135803\nIpone4\t135804\n养性\t135805\n出装\t135806\n卑鄙恩\t135807\n恶行\t135808\nddgdvg\t135809\n高聚物\t135810\ndate\t135811\n宠物龟\t135812\n战家辉\t135813\n我爱你爱到无法的自拔\t135814\n陈涵语\t135815\n嗯巴中食的成不了了我\t135816\n韩村\t135817\n大战僵尸\t135818\n李海燕\t135819\n曹岚岚\t135820\n椎间盘退变\t135821\n皮脸\t135822\n四九43\t135823\n十四户\t135824\n即达&amp\t135825\n再见奥\t135826\n万夫莫开\t135827\nhelloly\t135828\ngvvfg\t135829\nfēi\t135830\n这样做片\t135831\n往南\t135832\n将说\t135833\n外用话\t135834\n照像\t135835\nugfdfhjkkkkjggc\t135836\nojhgdsio\t135837\nskt城\t135838\n陈嘉怡\t135839\ndbkch\t135840\n婉言\t135841\n长太爱\t135842\n么什\t135843\n笨样\t135844\n投资银行家\t135845\nhdghth\t135846\n姓名\t135847\n来我家\t135848\n姓吴\t135849\n杨我\t135850\n葫芦丝\t135851\n突发奇想\t135852\n齐刷刷\t135853\n这首歌儿\t135854\n好啦好啦好啦好啦哈哈好呢\t135855\nvipok\t135856\n目前\t135857\n熊仲平\t135858\nJack\t135859\n平特\t135860\n任芳\t135861\nQBS\t135862\n随便吧\t135863\n六五二二吆\t135864\n5388\t135865\ndhiphotosbaiducomxiaodupicitem00e93901213fb80e380b60a331d12f2eb83894efjpg\t135866\n泰京\t135867\n忘了明天见\t135868\n五菱荣光\t135869\n潍坊二村\t135870\n髮他脫光光有廣告好哥哥好乖乖股骨頭嘎嘎嘎以後聽聽歌u了看嘎嘎嘎風風光光聽聽你\t135871\n阿尔法\t135872\n气势\t135873\n平版\t135874\n多才来\t135875\n度密错\t135876\n神庙2b\t135877\n进站\t135878\n轻下结论\t135879\n弯儿\t135880\n相信你懂\t135881\n13377568832\t135882\n嫁衣\t135883\n说话的那么不堪还好天真看你就是不要脸不要脸不要脸\t135884\n扩容\t135885\n把声\t135886\n竹简丫\t135887\nupand\t135888\n15公顷\t135889\n尹傲涵\t135890\n台底下\t135891\n风声传奇\t135892\n哈师大\t135893\n李秋天\t135894\n上月二十几号\t135895\n83535535525548551\t135896\n嗫嚅\t135897\n万圣节\t135898\n九十分\t135899\n风姿\t135900\n18799611444\t135901\n讨厌你随你试试\t135902\n沫沫哒\t135903\n番瓜\t135904\n2012.8.1\t135905\n找我好不好呀\t135906\n婕妮\t135907\n周七八\t135908\n幼齿\t135909\n你说爱你的和你爱的你\t135910\n起不去\t135911\n点头疼\t135912\n堡婴\t135913\n不可或缺\t135914\n家柏\t135915\n8月13\t135916\n8月12\t135917\n8月11\t135918\n求求你了我\t135919\n8月15\t135920\n关联交易\t135921\n罗马阳台\t135922\nMoney\t135923\n雷死\t135924\n截自\t135925\n截至\t135926\n我的九寨\t135927
\nHuh\t135928\n携程旅行网\t135929\n0000000000\t135930\n建筑学家\t135931\n杂志社\t135932\n奥林匹克\t135933\n好不舒服\t135934\n六百回\t135935\n戏曲\t135936\n乖乖忙\t135937\n李先\t135938\nxchdhdgd\t135939\n小柏\t135940\n疤手\t135941\n虎uhhuygy\t135942\n租借\t135943\n流水琴\t135944\n我问问我问问王\t135945\n厉世涛\t135946\n2009年5月13\t135947\nGDGD\t135948\n挥舞\t135949\ne6间\t135950\n董建华\t135951\n奥康兴\t135952\n阿曾\t135953\n青黄不接\t135954\njmjmttt\t135955\n王大人\t135956\njgggf\t135957\n哈打字\t135958\n不要不要不要\t135959\n蜂鸟网\t135960\n亨特拉尔\t135961\n再玩儿\t135962\nwwwbm\t135963\n不大不小\t135964\n免礼赐座\t135965\n飞扬跋扈\t135966\n附加费\t135967\n经年\t135968\n棠雪\t135969\n打屁屁\t135970\n4444444\t135971\n湿气\t135972\n乔乔乖\t135973\n12时09分\t135974\n泪眼\t135975\nchinzazhezzz2you\t135976\n哪里特么\t135977\n赵欣睿\t135978\n哈哈骗你的嘞\t135979\n投注\t135980\n立国\t135981\n价值谱系\t135982\n面皮\t135983\nfdfdffdrruf\t135984\n批示\t135985\nadgj\t135986\nMama\t135987\n直住\t135988\n一管我叫声爹\t135989\n别逃避\t135990\n暗滩\t135991\n洒脱\t135992\n我是你的朋友\t135993\nWarnWhite\t135994\n世界上下\t135995\n黑带\t135996\n从此以后\t135997\nAndy李\t135998\n哼哼\t135999\n乖乖的好样\t136000\n国际金融上海中心\t136001\n食道\t136002\n旋唱\t136003\n台语\t136004\n极富\t136005\n哼哈\t136006\n算了罢\t136007\n276期\t136008\n黑市\t136009\n岩石\t136010\ntaka\t136011\n多块\t136012\n抽疯\t136013\nDbbddb\t136014\ntake\t136015\ntakk\t136016\nQQQQQQQQQQQBBBBBBBBBBBBRRRRRRRRRRRRKKKKKKKK\t136017\n台词\t136018\n干活\t136019\n浦城\t136020\n灭挺\t136021\n杀机\t136022\n击破\t136023\n18933292446\t136024\n黎明战\t136025\n到麻烦\t136026\n自问\t136027\n自闭\t136028\nddsf\t136029\n12315678910\t136030\n再想\t136031\n亲朋女友\t136032\n哥王\t136033\n山高比海深\t136034\n傻诚志\t136035\n保持微笑\t136036\nmidodofashito\t136037\nhfdcb\t136038\n貓心\t136039\n宝玑\t136040\n办公族\t136041\n拐骗\t136042\n大贺希\t136043\nThepark\t136044\n妮来只\t136045\n金平\t136046\n聊心\t136047\n仪表\t136048\n鬼机灵\t136049\n痞子英雄\t136050\n金年\t136051\n日立\t136052\n手牵手\t136053\n翼手龙\t136054\n看多角恋\t136055\n王贤圣\t136056\n我受够你了我不想和你\t136057\n胡思垚\t136058\n冬衣\t136059\n丁洋\t136060\n本博\t136061\n10节\t136062\n转车\t136063\n大润发超市\t136064\n痛觉\t136065\n最后一课\t136066\n胡歌\t136067\n张家有\t136068\noutside\t136069
\n郑泽元\t136070\n你好讨厌你好讨厌你好讨厌你好讨厌你好讨厌你好讨厌你好讨厌你好讨厌\t136071\n自学成才\t136072\nrobf\t136073\nrobd\t136074\n说人话\t136075\n胡小毅\t136076\n孑八\t136077\n亚平宁\t136078\nuyyl\t136079\n哆咪哆咪\t136080\nv开心\t136081\n能动咪\t136082\n加发\t136083\n纸醉金迷\t136084\n10多项\t136085\nffggfut\t136086\n盐分\t136087\n游子意\t136088\n淘宝商城\t136089\n修罗铠甲\t136090\n雪窦寺\t136091\n676\t136092\nDlrhfjebr\t136093\n1O岁\t136094\n主站\t136095\n石云寨\t136096\n工资\t136097\n对联宾\t136098\n加号\t136099\n军中\t136100\niu521\t136101\n催收\t136102\n周密林\t136103\namlertis\t136104\ntylow\t136105\n宋小姐\t136106\nstoleyou\t136107\n明天以后\t136108\n愚子\t136109\n缪希富\t136110\ntb1j\t136111\n衍生代\t136112\n香痕\t136113\n刘岂威\t136114\n584558585845\t136115\n865848963488818318727527\t136116\n细胞素\t136117\ntmwt\t136118\n易如反掌\t136119\n么么头\t136120\n采取\t136121\n汪先生\t136122\n快说一句\t136123\n佩塔过\t136124\nwphhj\t136125\ndddjeekdkdkd\t136126\n国民\t136127\nCCTV-2\t136128\n西游大战僵尸\t136129\nCCTV-6\t136130\n喜闻乐见\t136131\n百度筷子\t136132\n花花公子\t136133\n夕阳西沉\t136134\n死逼学校\t136135\n三十三三百三十六\t136136\n武侯区\t136137\n说号话\t136138\n色私\t136139\n保鲜袋\t136140\n哎呜\t136141\n淮北\t136142\n再米\t136143\n僵尸秘\t136144\n迁西二村\t136145\n灰师傅\t136146\n农安\t136147\n哎呀\t136148\n好大年那你后天的手机小虎\t136149\n超级喜欢\t136150\n孙亚秋\t136151\n讨厌你了我不想\t136152\n跳掉\t136153\n猪日语\t136154\n大飞度\t136155\n农家\t136156\n和而飞\t136157\nghijkmnopqrstuvwsyz\t136158\n32557555\t136159\n哎呦\t136160\nddvhjln\t136161\n好吧快点\t136162\n32032319770511624期\t136163\nstime\t136164\n纤尘\t136165\n韩风尚\t136166\n罗里\t136167\n郭云嘉\t136168\n小一\t136169\n稍后后后后后\t136170\n称兄道弟\t136171\nanti\t136172\n280087\t136173\n不知为何\t136174\n王佳文\t136175\n0808080\t136176\n女作家\t136177\n诱饵\t136178\n面具\t136179\n神吧黑木崖Q群VB群吃货邦\t136180\n降脂\t136181\n小丝\t136182\n杨佳钰\t136183\n谈婚论嫁\t136184\n羊洋河郎\t136185\n可乐超\t136186\n神才\t136187\n咽\t136188\nyoutaoaa\t136189\n水田\t136190\n近期\t136191\nhttphhiphotosbaiducomxiaodupicitem6609c93d70cf3bc7a0e2829ed600baa1cc112ae5jpg\t136192\n头头头\t136193\n整日\t136194\numiou\t136195\n亏心事\t136196\n死术\t136197\n分度\t136198\n鱼骨辫\t136199\n温差\t136200\n高估\t136201\n花资料\t136202\ngibwn\t136203\n怎
们\t136204\n麦卢晓城\t136205\n死机\t136206\n克闹\t136207\n咤\t136208\n硬水\t136209\n晨霞\t136210\n1万元\t136211\n胡甜甜\t136212\n答案组织2502929228rootrootrahiaprprprpns2xyndroomtouor1noffceseasootyourhoffeasooom\t136213\n加时\t136214\n结球\t136215\n依兰县\t136216\n蕾哈娜\t136217\n丧尸暴龙兽\t136218\n新年来\t136219\n2010年11月初\t136220\n复结婚\t136221\n死期\t136222\n分店\t136223\n温州\t136224\n高企\t136225\n14点56分\t136226\n猪你是猪吗你是猪你是狗你是狗吗你是狗\t136227\nlX25\t136228\nstimetouto\t136229\nLG电子\t136230\n城隍庙\t136231\n毛根花\t136232\n巴拉巴拉巴拉\t136233\n引向\t136234\n1995年\t136235\n一三五七三七零四十八\t136236\n天天兄弟\t136237\n事体\t136238\n米色\t136239\n四\t136240\n伏案\t136241\n单身男女\t136242\n罗帅\t136243\n学术讲座\t136244\n小度猪\t136245\nNnrhhbdhrurjr\t136246\n兔岁\t136247\nk2j\t136248\n幸福感冒药药\t136249\n汗那\t136250\n咪\t136251\n黄产路局\t136252\nkuffffff\t136253\n李慧琪\t136254\n郭舒雨\t136255\n叫蜮\t136256\n8月8号\t136257\n频仍\t136258\nHTML\t136259\n咕\t136260\n还不错\t136261\n王泉媛\t136262\n557847243494434664935186434649762\t136263\n不要说了我恨你我讨厌你不要和我说话了你走\t136264\n御景湾四期\t136265\n卡拉\t136266\n肉酱面\t136267\n拟态\t136268\n贝尔格利尔斯\t136269\n舌剑\t136270\n有一种爱\t136271\n公寓楼\t136272\n1点\t136273\ncfbais\t136274\n肖波\t136275\n幺八六零二六\t136276\n我喜欢赖\t136277\n奉化\t136278\n郭梦杰\t136279\n一拍一拍一拍\t136280\n鲤鱼的秘\t136281\n搜身\t136282\nuyverme\t136283\n邪恶少女漫画集\t136284\n雷厉风行\t136285\n因为我喜欢的人\t136286\nAIsF\t136287\nplex\t136288\n保钓\t136289\n村民们\t136290\n小解\t136291\n茶话\t136292\n大观\t136293\n小觑\t136294\n一起走\t136295\n泥鳅\t136296\nhovl\t136297\nlikhi\t136298\n瞿小姐\t136299\n茶语\t136300\n小见\t136301\n化验\t136302\n咇\t136303\n噜啦啦噜啦啦噜啦噜啦嘞噜啦噜啦噜啦噜啦噜啦类噜啦啦噜啦啦噜啦噜啦嘞噜\t136304\n王艳竹\t136305\n我告诉你是哪的我是人妖爷\t136306\n咖啡因\t136307\n小米辣\t136308\n内壁\t136309\n568555887896\t136310\n咁\t136311\n尼们\t136312\n拨给\t136313\n打屁鼓\t136314\n图\t136315\n贾战书\t136316\n美大都\t136317\n主刀\t136318\n词性\t136319\n小蜈蚣\t136320\nyourit\t136321\n主创\t136322\n咍\t136323\n要不问\t136324\niyyfhgfoykhilhj\t136325\n3o岁\t136326\n到头来\t136327\n诺拜拜\t136328\n段某\t136329\n睡拳\t136330\nno爱玩儿勒\t136331\n18.1萬\t136332\n关息\t136333\n1339位\t136334\n狂魔女\t136335\n度秘欣\t136336\n哈维\t136337\n冰窖\t136338\n恶意件\t136
339\n咋\t136340\n好k7k7k\t136341\nwennetsubject260819\t136342\n罗丹\t136343\n秘不好玩\t136344\nlucx\t136345\n18249148423\t136346\n人口与计划生育法\t136347\n5983341\t136348\nyyyy\t136349\n孙悟空\t136350\n摆谱儿\t136351\n狰少\t136352\n下女\t136353\nyyyu\t136354\n哄一哄\t136355\n林凤凤\t136356\n下奶\t136357\n陈梓彤\t136358\n7月23\t136359\n信息量\t136360\n幺二零幺幺零幺九九二零四幺零幺二幺四\t136361\n有志者\t136362\n年会所\t136363\n赞赞\t136364\n板豆腐\t136365\n寿命\t136366\n53938\t136367\n7月20\t136368\n嘎嘎嘎嘎股\t136369\n第一棵\t136370\n多英子\t136371\n度米美美哒\t136372\n赞赏\t136373\n下套\t136374\n不男不女你是人妖\t136375\n毫无\t136376\n好言好语\t136377\n民言\t136378\n概念模型\t136379\nRTFDJFBTGUDYDTS7DUCLCIDODKF8\t136380\n不高兴\t136381\n1978年以后\t136382\n陈老板\t136383\n不可的你很快乐我不会\t136384\n詹先生\t136385\n夷子\t136386\n安家c\t136387\nacc\t136388\n贴布\t136389\n晶姐\t136390\nushd\t136391\n华研\t136392\nushf\t136393\n辨形\t136394\n不要再死\t136395\n过低\t136396\n凤舞九天\t136397\n团\t136398\n在于我讨厌\t136399\n限量化\t136400\napairof\t136401\nact\t136402\n曹汉松\t136403\n唐浩煊\t136404\n发聩\t136405\n教科\t136406\n一语中的\t136407\n哪男生\t136408\n汤料\t136409\n铃声精神科\t136410\n拒救\t136411\n刘岳贵\t136412\n原始森林\t136413\n刘耀庆\t136414\nfifhvf\t136415\n5英尺\t136416\n海豚音神马\t136417\n刘在石刘\t136418\n慕林\t136419\n金钱龟\t136420\n切切切\t136421\n躁剧\t136422\n科威特\t136423\n乌拉\t136424\n200多位\t136425\n游子心\t136426\n布降\t136427\n536134372203\t136428\nfjbgu\t136429\n拔刺\t136430\n桂林\t136431\n错字错\t136432\nBhfjjfhfbcbvb\t136433\n前处\t136434\n十问\t136435\n柿子\t136436\n67kr\t136437\nGuidh\t136438\n双跳灯\t136439\n前外\t136440\n音狼\t136441\n前夕\t136442\n通史\t136443\nbuying\t136444\n野味\t136445\n张茂江\t136446\n敲敲门\t136447\n十五只\t136448\n前夫\t136449\n333号\t136450\n前天\t136451\n前头\t136452\n十五号\t136453\n淮海路\t136454\n松本亚琉夏\t136455\n聚居一姓\t136456\nohn\t136457\nｄｉ\t136458\nabcdiff\t136459\n2011年5月21日\t136460\nohd\t136461\nhrfff\t136462\nｄａ\t136463\n智积院\t136464\n1285742904\t136465\n海贼王\t136466\n快喊\t136467\n和向\t136468\n嗯叫提超\t136469\n接不通\t136470\nWod\t136471\n高帮鞋\t136472\n並且我喜歡男人\t136473\n克莱谛\t136474\nohp\t136475\n希夏天\t136476\n255.82万辆\t136477\nqjgjhm\t136478\n干警\t136479\n有可\t136480\n17835583386\t1364
81\n李一峰\t136482\n要了困\t136483\n林冰\t136484\n林冲\t136485\n份子弹\t136486\n无无\t136487\n徐博文\t136488\n明点\t136489\n几万遍\t136490\nshaua\t136491\n11百个\t136492\n我一定可以\t136493\n哪儿\t136494\nTogether\t136495\nniqu\t136496\n18769987988\t136497\n早起\t136498\ncpk\t136499\n谐\t136500\n早走\t136501\n近十年\t136502\n无古\t136503\nmimia\t136504\n聊会儿\t136505\n55131421\t136506\n觉者\t136507\n来师傅\t136508\n煎熬\t136509\ncpu\t136510\n6666777\t136511\njaja\t136512\ngrhr\t136513\n备战\t136514\ncbnj\t136515\njaje\t136516\njajd\t136517\n郭文\t136518\n关老子\t136519\n杰哥哥\t136520\n午餐盒\t136521\n明珠游龙\t136522\n昆明站\t136523\njajs\t136524\n糖葫芦\t136525\ngrhd\t136526\ntfboystfboystf\t136527\ngrhj\t136528\n巴斯托斯\t136529\n遗书\t136530\n奶瓶\t136531\nBB装\t136532\n机车\t136533\n日子猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t136534\n忙乱\t136535\n地球生活\t136536\nMEMORIES\t136537\n我告诉你我喜欢的你的理由\t136538\n夫妻相处之道\t136539\n良师\t136540\n林正英\t136541\n郭新\t136542\n打平\t136543\n惡心\t136544\n18276292tt\t136545\n2993854641\t136546\n欣欣然\t136547\nghana\t136548\n两餐\t136549\n尼酱泥鳅\t136550\n两点间\t136551\n㊣\t136552\n我最爱的女孩不适宜的话\t136553\n珊莎\t136554\n第5期\t136555\n我喜欢你很久了你\t136556\n对表\t136557\n医药箱\t136558\n课外\t136559\nDdnjdjd\t136560\n陈奶奶\t136561\n有很多度秘\t136562\n莫负寒夏\t136563\n嫁给我吧一窍不通\t136564\n`\t136565\n排石颗粒\t136566\n国土资源部社会治安综合治理\t136567\n歌迷大世界\t136568\n刘芸茹\t136569\n临睡前\t136570\n鸡蛋灌饼\t136571\n好啊好\t136572\n卡娃伊\t136573\n双目\t136574\n十九三十\t136575\n普世价值\t136576\n现场直播\t136577\n五权宪法\t136578\n说蚀\t136579\n66777777777777777777777\t136580\n骗你了我没有\t136581\n衢江区\t136582\n暴利\t136583\nhcug\t136584\n动物历险记\t136585\n肃穆\t136586\n攻谣\t136587\n何姿\t136588\nxlyou\t136589\nhshshjh\t136590\n鲁光蕊\t136591\n木匠\t136592\n灰白\t136593\n大行其道\t136594\n强壮\t136595\n欧莱他\t136596\nb咯咯8莫拉克骷髅头聚聚图tyu孔咯兔兔lu\t136597\n糖醋\t136598\nwaral\t136599\n杜女神\t136600\n20秒\t136601\n谢谢你真是我的好办手\t136602\n123456789101112131415161718192021232425262\t136603\n基督城国际机场\t136604\n艺术片\t136605\n六月以来\t136606\n吴卫\t136607\n无所谓\t136608\n超萌\t136609\nUP\t136610\n晚上六点钟\t136611\n真皮\t136612\n奔哒\t136613\n羞愧\t136614\n王一明\t136615\n多空双方\t136616\n力摘\t136617\n22瓶\t136618\n雨荷\t136619
\nHGnGnHGHGHGHG\t136620\n升迷\t136621\n谢夸\t136622\n真的\t136623\n赵天豪\t136624\nJared\t136625\naaak47\t136626\n五小只\t136627\n女孩儿机\t136628\n二栋\t136629\n伟松\t136630\n起落\t136631\n想象天\t136632\n梅香\t136633\n下星期六\t136634\n二张\t136635\n叫响\t136636\n你最好\t136637\n义卖\t136638\n一下你的你的秘\t136639\n各组\t136640\n甜馨\t136641\nshavetried\t136642\n5syouj\t136643\ndubCJD\t136644\n啦啦十四什么总该\t136645\n卑躬屈\t136646\n二弟\t136647\n虎跳峡\t136648\n清明镜\t136649\n郭晓丽\t136650\n一大一\t136651\n鬼笑话\t136652\n成都区\t136653\n心迷\t136654\n好笑我好冷\t136655\n喚耶斯\t136656\n2667945387\t136657\n7s5e\t136658\nxuu\t136659\nshixgezi\t136660\n一大下\t136661\n山民\t136662\n一大三\t136663\n陈子轩\t136664\n一拍一\t136665\nxur\t136666\n平摊\t136667\n谦卑\t136668\n图片瓦\t136669\n小过敏\t136670\n醉花间真是好妈妈呀\t136671\n9171个\t136672\n多芬\t136673\n俱全\t136674\n多米缺德\t136675\n为你的谜好好\t136676\n山水\t136677\nsemale\t136678\n奥莉\t136679\n珍云\t136680\nrffjrhcrvhigrigrhcihrfhicffhigrhcgghchhigrhghrghchgihchhifrfijhfrifhchhfrbihhcfirhgythhfcrhihirbichfihebcgrhchhgirifoohjjfkgkitkfkftkftjtfjftjfg\t136681\n贴纸\t136682\n杨英杰\t136683\n号出\t136684\n蘑菇头\t136685\n5555所所\t136686\n白山羊\t136687\naccept\t136688\ntuvivinellaria\t136689\nstates\t136690\n拉道\t136691\n只待\t136692\n秦国\t136693\n秘谜\t136694\n努力山\t136695\n世界上\t136696\nMissoni\t136697\ndururnd\t136698\n走西唱\t136699\n绝活儿\t136700\n夹状\t136701\n识字\t136702\n磨柱\t136703\n澳大利亚国家队\t136704\n凯蒂·赫尔姆斯\t136705\n书葛\t136706\n吴昕爱\t136707\nxietoo\t136708\nKitty医院\t136709\n美参议院外交关系委员会\t136710\n111万个\t136711\nABS登机箱\t136712\n3.9%\t136713\n凯乐视网\t136714\n银滩镇政府办公室\t136715\n追问\t136716\n二一1点儿二\t136717\n千几下\t136718\n北欧\t136719\n田版\t136720\n笨笨龙\t136721\n大彻大悟\t136722\n白袍子\t136723\nbivkr\t136724\nagdg\t136725\nxun\t136726\n那儿\t136727\n查房\t136728\n伦敦大道\t136729\n哈咱俩\t136730\n好朋友们\t136731\n忆记\t136732\ncghhh\t136733\ncghhg\t136734\n陆丰哥\t136735\n郭运祥\t136736\n最重\t136737\n西湖公园站\t136738\n改变化\t136739\n杨世东\t136740\nbgrfm\t136741\n海带木耳\t136742\n百草园\t136743\nloug\t136744\n禀告\t136745\n半夜间\t136746\n有效\t136747\n糯米卷\t136748\n体验感\t136749\n有故\t136750\nEX么片\t136751\npanc\t136752\n黄浩楠\t136753\n熊虫\t
136754\n还假作业\t136755\n溜溜\t136756\npand\t136757\n程琳里\t136758\n面面咪咪咪咪咪咪咪咪喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵咪咪咪咪咪咪咪\t136759\n牙齿\t136760\n孔子东\t136761\n和好朋友\t136762\n有救\t136763\n王椿棋\t136764\n乖乖要你好乖\t136765\n24F1.4\t136766\n双宇\t136767\n有声有色\t136768\n开始懂了\t136769\n我喜欢你我想嫁给你\t136770\n李大侠\t136771\n心情不开心\t136772\n牙齒\t136773\n啥然\t136774\n北岭湾\t136775\n情义\t136776\n陈珊妮\t136777\n便盆里拉屎\t136778\n有数\t136779\n几页\t136780\n巫师你好你好你好毒\t136781\n播出来\t136782\nURL\t136783\n长沙市区\t136784\n随大流死\t136785\n9点\t136786\n眼下日\t136787\n汽摩\t136788\nURP\t136789\n闭口弊\t136790\n等一回\t136791\n几项\t136792\n正轨\t136793\n飘雪\t136794\n十九八七六五四三二一\t136795\n困件\t136796\n觉着\t136797\n饶佳文\t136798\n159786425515114686221111111111111\t136799\ntfboas\t136800\n九岁\t136801\nhvety\t136802\n说说说说\t136803\n公放\t136804\nsaytobye\t136805\n叶城县\t136806\n一袭\t136807\n造价\t136808\n青铜\t136809\n一被\t136810\n务实\t136811\n卡莎萨\t136812\n凡蒂冈\t136813\n三圈\t136814\n椰子壳\t136815\n外出言\t136816\n影写\t136817\n侏儒\t136818\n三地\t136819\n上晚\t136820\n食工\t136821\n哦哟\t136822\nBMJ\t136823\n三场\t136824\n别耶耶\t136825\nBMN\t136826\n一袋\t136827\n福尼亚州\t136828\n人设\t136829\n三圣\t136830\n淡紫\t136831\nBMW\t136832\n学策\t136833\n黄荣明\t136834\n7号晚上\t136835\n乖乖乖乖\t136836\nabbbab\t136837\nEdgyg\t136838\n猪圈\t136839\n楼片\t136840\n收道贷款\t136841\n血痂\t136842\n孤村香\t136843\n续期\t136844\n金琳娜\t136845\n丽趣情\t136846\n三元钱\t136847\n桂鱼\t136848\n我真的不爱你了再见\t136849\nGroupon\t136850\n雇员\t136851\n轻车熟路\t136852\n六二岁\t136853\n猪场\t136854\n960千米\t136855\n1114141444444444444444444444444444111111\t136856\n跳水\t136857\nbrighuy\t136858\n6月5日下午\t136859\n传导\t136860\n北京市政协常委\t136861\n二十六个月\t136862\n期中\t136863\n颈椎处\t136864\n5月20日\t136865\n死不明\t136866\n反目\t136867\n王怡佳\t136868\n受冻\t136869\n一个四十\t136870\n巨忙\t136871\n蒸荷葉飯\t136872\n977\t136873\n976\t136874\n975\t136875\n五秒四秒\t136876\n973\t136877\n972\t136878\n971\t136879\n970\t136880\nddffcbcjvj\t136881\n死你不得了我是僵尸我是僵尸\t136882\n心情好棒\t136883\n把玩\t136884\n97%\t136885\n特贸\t136886\n本田\t136887\n张栋\t136888\n花波\t136889\n鞠婧祎\t136890\n期一\t136891\n入驻\t136892\nyurenbc\t136893\n奠定\t136894\n你好乖\t136895\n钱
志根\t136896\n七分裤\t136897\n前畜\t136898\n到处\t136899\n早笑\t136900\n面木有人\t136901\nㄡㄝㄜㄠ\t136902\n福建省\t136903\n士丹利街\t136904\n定便秘\t136905\n牛摩王\t136906\n后流\t136907\n到天\t136908\n跑来\t136909\n你好乳\t136910\n千万千万\t136911\n吃了饭恰\t136912\n长沙汽车北站\t136913\n后海\t136914\n毛原来木\t136915\n到大\t136916\n写稿\t136917\n到头\t136918\n重新天开始\t136919\n管理人员\t136920\njgkd\t136921\n13359364592\t136922\n唧唧\t136923\njgkg\t136924\n狗窝\t136925\nbhn\t136926\n那条街\t136927\n没行\t136928\nDhd\t136929\n彭凯翔\t136930\nbhg\t136931\nbhf\t136932\njgkv\t136933\nbhz\t136934\n李太爷\t136935\n岳玉梅\t136936\n喂你好\t136937\nbhu\t136938\n中行\t136939\ngteh\t136940\n米迦\t136941\n陈明真\t136942\n学生们\t136943\n试孕\t136944\n栋哥\t136945\n大美\t136946\n南六环\t136947\n左脑男\t136948\n水塘\t136949\n甘南路\t136950\n水塔\t136951\n飞车之我\t136952\n50多20\t136953\n刘芳太\t136954\nyin山\t136955\n张芷溪\t136956\n苑\t136957\n苗\t136958\n苔\t136959\n别关灯\t136960\n两点到三点\t136961\n大乱斗\t136962\n永陵\t136963\n苏\t136964\n苍\t136965\n老婆亲一个老婆亲一个\t136966\n赵浩杰\t136967\ntpg\t136968\n笑声\t136969\n苹\t136970\n同龄圈\t136971\n全网通版\t136972\ntpm\t136973\n粘液腺\t136974\ncover\t136975\n四条腿\t136976\n徐雪\t136977\n苫\t136978\ntpt\t136979\n别救\t136980\n苯\t136981\ntpp\t136982\n苦\t136983\n若\t136984\n数不清\t136985\n秘妙妙工\t136986\n谈心\t136987\nCQUT\t136988\nbiyesf\t136989\n交加\t136990\ngrdefhge\t136991\n是因该\t136992\n醒了\t136993\n灵璧\t136994\n两步\t136995\n可顺度秘\t136996\n金屬喜\t136997\n2PM\t136998\n84xckzz\t136999\n星条旗\t137000\n向宇琦\t137001\n吴子扬\t137002\n小蝶\t137003\n景德\t137004\n线段\t137005\n24253\t137006\n没头没脑\t137007\n谈念\t137008\n王奇奇\t137009\n12377844\t137010\n左轮\t137011\n左转\t137012\n献物\t137013\n一二零八\t137014\n海底两万里\t137015\nygfff\t137016\n三时\t137017\n三白兔\t137018\n摇右摆\t137019\n2012级\t137020\n医护\t137021\n杨丽莎\t137022\n保障金\t137023\n这曰子\t137024\nchiphotosbaiducomxiaodupicitem42a98226cffc1e17e8a721454d90f603738de9aejpg\t137025\nktwm\t137026\n三无\t137027\n第十六\t137028\n都m\t137029\n三日\t137030\nMinutes\t137031\n這裡\t137032\n新绿\t137033\ndemocoffee\t137034\n艾克斯\t137035\njdgle\t137036\n金龙大道西\t137037\n植被\t137038\n酷企鹅\t137039\n哥特式\t137040\n充成\t137041\n低血\t137042\n
张十八\t137043\n宇文\t137044\n问你好动\t137045\n噶还贷款\t137046\n利比亚\t137047\n直入\t137048\n不不走\t137049\nzdfhjrjdhfhgfyjcfhnvfhnbhukmnnhgjbvgkbvhknfykhghnghjmnbhgghbmkjhgkg\t137050\n报刊亭\t137051\n快捷\t137052\n中京基100大厦\t137053\nFox\t137054\n紫红色\t137055\n方泽亚\t137056\nhvhvvjbb\t137057\n零六二幺\t137058\n这样就好\t137059\n千里寻呗\t137060\n快换\t137061\n噗噗噗\t137062\n奶波\t137063\n双制式\t137064\n搅和\t137065\n逗诚意\t137066\n奥奥奥奥\t137067\n完肿\t137068\n离开你了拜拜\t137069\n第四排\t137070\ntrdyf\t137071\nluvkv\t137072\n于莹\t137073\n龙基\t137074\n转图\t137075\n嗯嗯洋\t137076\n芭莎\t137077\n小奇葩哥\t137078\n构建会\t137079\n财务室\t137080\ngkung\t137081\n和颜悦色\t137082\n角度谜\t137083\n北京晚报\t137084\nhvdsrgcdsasdgggcddrghhhcdergfdsscgtdfvsduncxsghbcdhmcxxbbbvxdbncssfhxfhjbffddfgddvgff\t137085\n好孤单\t137086\n更高兴\t137087\n枝杆\t137088\nnizaigaishenmo\t137089\n高子鋆\t137090\n荡娃\t137091\n哇甯\t137092\n一遍一遍\t137093\n转回\t137094\n太白行不行\t137095\n大相国寺\t137096\n还没有我知道\t137097\n谦让\t137098\n往月\t137099\n吓呼\t137100\n谭\t137101\n解放碑西南旗舰店\t137102\n3300多\t137103\n把子句\t137104\n100万金\t137105\n龙城\t137106\n无间妖\t137107\n自来水\t137108\n结别忘\t137109\n你是我的玫瑰你是我的花你是我的爱人\t137110\n一个三十六一个\t137111\n新诺明\t137112\n各天\t137113\n代发\t137114\n议论文类\t137115\nnouilits\t137116\n广州市政府\t137117\nswv\t137118\n局限\t137119\n放权\t137120\n缺席\t137121\n赞赞赞赞赞赞赞赞赞赞赞赞赞\t137122\n新p2\t137123\n救护\t137124\n改名\t137125\n4p3ps一\t137126\n757963\t137127\nlaozigu\t137128\n咬死\t137129\n叫看\t137130\n各处\t137131\n放松\t137132\n扔子金\t137133\n黄毛\t137134\n代号\t137135\n萨尔瓦多\t137136\n拍拍拍\t137137\n不再相爱\t137138\n吃力亲亲\t137139\n44gmol\t137140\n1月五日\t137141\n短剑\t137142\n没错\t137143\n不不在\t137144\n四行\t137145\n根\t137146\n管信不信\t137147\n臭度秘臭度秘\t137148\n诚固定\t137149\n厄厄\t137150\n食谱\t137151\n旅二\t137152\n齐猜\t137153\n艾滋良\t137154\n郑宇萱\t137155\ntowns\t137156\n好丈夫\t137157\n旅人\t137158\n我求你了你就快一点儿\t137159\n22:33\t137160\n冥王\t137161\n初晨\t137162\n四表\t137163\n依思同\t137164\n儿\t137165\n西进\t137166\nbgkh\t137167\n度秘县\t137168\n张学友\t137169\n质量奖\t137170\n校\t137171\n初晴\t137172\nPMPM\t137173\n楚成\t137174\n沙朗\t137175\n苏东\t137176\n崔叔\t137177\n五百米\t137178\nCanIadd\t137179\n不
知不想\t137180\n苏一\t137181\n臂力\t137182\n除名\t137183\n苏丹\t137184\n椰汁\t137185\nmoming\t137186\n常州门\t137187\n聊天女仆\t137188\n潘达\t137189\n二三四\t137190\n胡其实\t137191\n有一加\t137192\n那一瞬\t137193\n纵火\t137194\nqjecc\t137195\n无框\t137196\n眼开\t137197\n麦苗\t137198\n一系\t137199\n4400万人次\t137200\n猜不猜\t137201\nyao\t137202\n湛青\t137203\n累春丽\t137204\n这样这样\t137205\n刘哈\t137206\n疫苗\t137207\n挤倒\t137208\n数到\t137209\n觉了困\t137210\n最红\t137211\n快期末考\t137212\n老婆\t137213\n肖林兄\t137214\n滗\t137215\n木嘛木嘛木\t137216\n汤头\t137217\n数分\t137218\n见无庶\t137219\n停放弃\t137220\n银魂\t137221\n脚床\t137222\n因为我一句话\t137223\n簸箕\t137224\n玩心\t137225\nydhd\t137226\n数列\t137227\n脚底\t137228\n很多种\t137229\n过影\t137230\n刘哥\t137231\n张古峰\t137232\n18711200838\t137233\n你不可爱好伐\t137234\nf暴\t137235\n19998十012348867676\t137236\n才说唱\t137237\nwww.10010.com\t137238\nhiahiahahhai\t137239\n权属\t137240\n科技部\t137241\n呼朋唤友\t137242\n载弹量\t137243\n太难了\t137244\n八九九九十九\t137245\n大喜羊羊大再说说喜羊羊\t137246\n3317822737\t137247\n唔知点\t137248\n小苹种\t137249\n合得\t137250\n白化\t137251\n1145164421\t137252\n贺敬之\t137253\n哪个座\t137254\n给哥哥哥哥哥哥哥\t137255\n土屯兔兔头杯\t137256\n6666669999999\t137257\n没觉晓\t137258\n米勒\t137259\n报理\t137260\n40000\t137261\n不变的爱\t137262\n金佛园区\t137263\nNanoSIM卡\t137264\n相公刚\t137265\n互换\t137266\n46818876492\t137267\n失身\t137268\n6月12日\t137269\n俄罗斯联邦鞑靼自治共和国\t137270\n六十吨\t137271\n东胜\t137272\nnbaocamelofourcondacomyfistverfistionotionoveyourakoutleyoyoyourok\t137273\n鸡蛋橄榄油\t137274\n降妖\t137275\n撅子\t137276\n139c81\t137277\n栗文宇\t137278\n大森林\t137279\n你在哪里有时你了撒心眼\t137280\n8854555\t137281\n群闲\t137282\n张卖萌\t137283\n沙僧\t137284\nhighmore\t137285\n天涯过\t137286\n七七四十九五年\t137287\n智齿\t137288\n若无其事\t137289\n1988年8月14日\t137290\n备考\t137291\n一问\t137292\n息算\t137293\n史迪仔\t137294\n栋\t137295\n有个人\t137296\n穷逼\t137297\n嗯嗯嗯嗯嗯嗯\t137298\nSBSBSBSBSBSBSBSBSBSBSBSBBSB\t137299\n何超仪\t137300\noimotiono\t137301\n1wqeqfqgwghwhue2htdbegbwrfavfsggasvsfvdfvsvdgvdbg\t137302\n付姗\t137303\n奥特曼\t137304\n20条\t137305\n煎饼书\t137306\n滴\t137307\n蔡美儿\t137308\n愿意\t137309\n颜杨昌龙\t137310\nDamn\t137311\n一门\t137312\n杨进行\t137313\n
补充\t137314\n补元\t137315\n徐光宪\t137316\n埋埋埋\t137317\n来翁\t137318\nGgjjhhhhhhhhhgggggggggggg\t137319\n回亲嘴\t137320\n一间\t137321\n幺零三五八幺\t137322\n7.5%\t137323\n财经大学\t137324\n西西里\t137325\n沁乐\t137326\n普及率\t137327\n死去你会不会\t137328\n茄味\t137329\n疖系栽樉夾爽鐉jjjjjmkkjjjjjjjjjknnnnn9斴看燹义派赖\t137330\n苏北\t137331\n苏自悲\t137332\n老伯\t137333\n卢春红\t137334\n突如雪\t137335\n测开\t137336\n苏麻\t137337\n鹰击\t137338\n呵呵飞天大亨猪\t137339\n一丁点\t137340\n二百五十九四\t137341\n541885418854188\t137342\n昌明酒家\t137343\n毛利\t137344\n中山南路\t137345\n几千间\t137346\n第50集\t137347\n胡子\t137348\n阴阳人\t137349\n空城计\t137350\n说点\t137351\nfvfffd\t137352\n滤\t137353\n梦燕\t137354\n门门达\t137355\n王潇\t137356\n哎呀稳\t137357\n交界\t137358\n一粒\t137359\n15：30至18：00\t137360\n郑浩源\t137361\n林锐\t137362\n吴建\t137363\nｆｉｓ\t137364\n就读大学\t137365\n连襟\t137366\nACE\t137367\n好简单你头脑简\t137368\n过日子干\t137369\n我的了小小\t137370\n裁剪\t137371\n桂圆干\t137372\n幸运色\t137373\n海洋生物\t137374\n奥迪奔驰\t137375\n胡晨浩\t137376\n看图写话\t137377\n非计\t137378\n动物度\t137379\n风来\t137380\n17773235593\t137381\n羽扇豆\t137382\n嗯哪\t137383\n264米\t137384\n徐覅\t137385\ntextual\t137386\n吧美人鱼\t137387\n'们\t137388\n42089年\t137389\n轻松熊一\t137390\n悟空传&amp\t137391\nAnthony\t137392\n输氧\t137393\n規類為\t137394\n裴宇霄\t137395\n好呀我真\t137396\n延村\t137397\n唯心主义\t137398\n纸上\t137399\n一玩儿植物大战僵尸二\t137400\n五二十天\t137401\n云姐\t137402\n国家足球队\t137403\n我骗你的很赞\t137404\n切尔西俱乐部\t137405\n上海大众\t137406\n438886\t137407\nGhirg\t137408\n三十九一个\t137409\n灾害性\t137410\n薄皮\t137411\n热浪\t137412\n奖毛\t137413\n一颦一笑\t137414\n无奖\t137415\nfcctest\t137416\n猪你是猪我是人你是\t137417\n月色\t137418\ngo诺\t137419\n14.57分\t137420\n汐儿\t137421\n61490999\t137422\n找懂\t137423\n潘金\t137424\n无奈\t137425\n秘度度秘你在哪里度秘\t137426\n云姣\t137427\n董鹏辉\t137428\n管闲\t137429\n睡一天\t137430\n聆听\t137431\n耶无缘\t137432\n中西医\t137433\n停滞不前\t137434\n糯米红枣\t137435\nku度\t137436\nsxr\t137437\n一十分\t137438\n信信\t137439\n一九八四五六二四零四七\t137440\n哆啦a\t137441\n花姑娘的干活\t137442\n泰迪犬犬\t137443\n刘文世\t137444\n镀金\t137445\n董晨南\t137446\n策策\t137447\n起唱\t137448\n272.8\t137449\n大战子\t137450\nHu\t137451\n悬日\t137452\n真的么真的爱我\t137453\n私房菜\t137454\n宋雅雯\t137455\
n陋\t137456\n几钟\t137457\n哈乖\t137458\n聪明人\t137459\n音乐队\t137460\n道狼女\t137461\n素华丽\t137462\n北斗\t137463\n威名\t137464\n安旭黄\t137465\n几钱\t137466\n赛前\t137467\n一个八三六五四四七六\t137468\n爱别\t137469\n兰友\t137470\n小暖\t137471\n小暗\t137472\n郑兴香\t137473\n爱到\t137474\n角阿罚\t137475\n探矿\t137476\n一士\t137477\n北方\t137478\n哈康\t137479\ndchu\t137480\n呢大战\t137481\nCPI\t137482\n指尖\t137483\nCPL\t137484\n矶矶\t137485\n并趾\t137486\n这个咱\t137487\n要不改\t137488\nCPF\t137489\n二甲\t137490\n一股一股\t137491\n团结\t137492\n主菜单\t137493\n肥胖纹\t137494\nCPP\t137495\nCPS\t137496\n雄起\t137497\n奧特曼\t137498\n勋子周\t137499\n四锅\t137500\nhgbhkkk\t137501\nn6岁\t137502\n超级黑鸟变形金刚\t137503\nsend\t137504\n菜包\t137505\n脸皮跑度\t137506\n李季\t137507\n认识到\t137508\n19毫米\t137509\n对不对昂\t137510\n爸爸也不爱我了爸爸\t137511\n汪苏泷\t137512\nsent\t137513\n写作业度秘\t137514\n抄手\t137515\n窟\t137516\n窜\t137517\n窝\t137518\n糟粕\t137519\n窘\t137520\n窖\t137521\n窗\t137522\n油胶\t137523\nfuuh\t137524\nhelloksk\t137525\n腥味\t137526\nfuuu\t137527\n窌\t137528\n窍\t137529\n贴图\t137530\n样板房\t137531\n75284879555222336\t137532\n窉\t137533\n窄\t137534\n窃\t137535\n突\t137536\n抗魔\t137537\n人餐\t137538\n12月27日\t137539\n百醇娇\t137540\n窶\t137541\n刘文杰\t137542\n个儿\t137543\n喔天\t137544\n这样那样\t137545\n凯飞\t137546\n像样\t137547\n三十几分\t137548\n窮\t137549\n丽友派\t137550\n踢飞\t137551\n窭\t137552\n窩\t137553\n嫩白\t137554\n三年四\t137555\n陑\t137556\n南风是爱\t137557\n84238\t137558\n直走\t137559\nuei\t137560\n见状\t137561\n热销\t137562\n丝质衣\t137563\n猫斯拉\t137564\nuey\t137565\n哎呀呀呀呀\t137566\n响尾蛇\t137567\nueu\t137568\n36286晚654\t137569\n26米\t137570\n末们\t137571\n开放\t137572\n陈雨祺\t137573\n射把\t137574\n两三百块\t137575\n连杰\t137576\n末代\t137577\n31188886677513666666666888888888888888\t137578\n老檀\t137579\n带头\t137580\n教育局\t137581\n气头上\t137582\n十四名\t137583\n数十万\t137584\n刀豆\t137585\n绿鲤鱼\t137586\n发晕\t137587\n奇力制药\t137588\n我是说下六明天\t137589\n鼻炎\t137590\nDream\t137591\n她俩\t137592\n75855747668555555555555555\t137593\n冷泡\t137594\nbffbf\t137595\nnjt\t137596\n252c0015\t137597\n比特猫\t137598\n会场\t137599\n3568元\t137600\n一月二十五日\t137601\n王蒙\t137602\n33e3\t137603\n汾河湾\t137604\n模糊者\t
137605\n23岁号\t137606\n哈勃\t137607\n奥来奥\t137608\n子音\t137609\n英孚\t137610\n拍一拍\t137611\n英子\t137612\n巧克力\t137613\nnorthherobrine\t137614\n哈勒\t137615\n想你我喜欢\t137616\n体无完肤\t137617\nsire\t137618\n王梦雅\t137619\n杉杉附\t137620\n张义元\t137621\nsirs\t137622\n看不过\t137623\n好哪来哪\t137624\n牛毛\t137625\n牛比\t137626\n五八呗\t137627\nsiry\t137628\n探试\t137629\n爽直\t137630\n定胜\t137631\nHa\t137632\n工作\t137633\n工体\t137634\n51342\t137635\nPLAYGIRLZ\t137636\nFHFGGHHHWGHHHHHHHHHHGHHHHGJHHHHGHHHWGHHHGHJFJHHTFJTGDHFJHFGHHFJJJJJ\t137637\n解人意\t137638\n啊哟我真的很爱你哪饲养想你是我的办公\t137639\n偶得\t137640\n工位\t137641\n书桌\t137642\n妳們\t137643\n情欲片\t137644\n三极品\t137645\n2012年7月\t137646\n十五世\t137647\n异变\t137648\nhomethetou\t137649\n阳道\t137650\nllanoCD\t137651\n丘思远\t137652\n级大学季\t137653\n#晚安小狮子#\t137654\n臭什\t137655\ntaobaocomitem\t137656\n白金银\t137657\n快快乐乐几\t137658\nHTC\t137659\n抱怨者\t137660\n嘉兴市公安局\t137661\nspsy\t137662\nGnshag\t137663\n路飞鞋\t137664\n三修练\t137665\nopok\t137666\n金曲奖\t137667\n旧版\t137668\n齐俊杰\t137669\nTrfhjyygbbhhtrfbhhreesaqqwdghkpkhvf\t137670\nJYJfrom東方神起&amp\t137671\n弃疗\t137672\n表现运\t137673\n杨可啊\t137674\nSEED\t137675\ncoiriet\t137676\n未知\t137677\n八九点\t137678\nHd\t137679\n违抗\t137680\n倒板\t137681\n8月4日\t137682\n避讳\t137683\n无问你\t137684\n见人\t137685\n十二章\t137686\nhigh\t137687\n单式\t137688\n避让\t137689\nddgj\t137690\n东城门遗址\t137691\n卡布贝贝\t137692\n五十k\t137693\n三合汽修厂\t137694\n度过夜\t137695\nhigd\t137696\nddgd\t137697\nddgg\t137698\nhigg\t137699\n伊巴卡\t137700\n相适应\t137701\n远路\t137702\njrgk\t137703\n一鸣\t137704\n祝瑞雪\t137705\n4层\t137706\n铁通\t137707\n宋庄小区\t137708\n才男\t137709\n五十K\t137710\n管理局\t137711\n嗯搔瑞\t137712\n洪灾\t137713\n单张\t137714\n百度秘书\t137715\n自作\t137716\ncvqert\t137717\nhvbjakf\t137718\n芳龄\t137719\n中俄\t137720\n55分\t137721\n道德观\t137722\n杀气\t137723\n股海\t137724\n汪安航\t137725\n4582682288\t137726\n来没有\t137727\n自住\t137728\n萧然\t137729\n李说\t137730\n度龙\t137731\n。dr\t137732\n收盘价\t137733\n创出\t137734\n黄鹂鸣\t137735\n祁念\t137736\n一股股\t137737\n方方\t137738\n刘恒伟\t137739\nakwnsyed\t137740\n随你\t137741\n吃吃喝喝\t137742\n漂不漂亮\t137743\n最近三天\t137744\n六个六
\t137745\njjpmgmp\t137746\n裕华区\t137747\ncommv\t137748\nvmd\t137749\n言不必\t137750\n敏青\t137751\n若莆\t137752\n星仔\t137753\n静临文\t137754\n夹座\t137755\n招展\t137756\n花花火\t137757\n车飞碟\t137758\n遗失的好惨\t137759\nRkd\t137760\n远射\t137761\n跑着\t137762\n场合~#\t137763\n公屋\t137764\ndjshdkfg\t137765\n妹妹行\t137766\n一点三倍\t137767\n接电话废话\t137768\n妙妙3333333333333人妙妙3333农呢330533\t137769\n陆嘉宇\t137770\nhobby\t137771\n张媛\t137772\npachanga\t137773\n4刘\t137774\n哈尼先\t137775\n不要\t137776\n螺牙\t137777\n在一起吧唧吧嗒\t137778\n抓食\t137779\n2头\t137780\n敲敲敲敲\t137781\nkisn\t137782\nkisk\t137783\n几十件\t137784\n兵者\t137785\n六面\t137786\n我你的双眼那么好唱歌\t137787\nkiss\t137788\n2天\t137789\n于洪广场\t137790\n缺货\t137791\n抚养费\t137792\n赤刀\t137793\n笨赖克\t137794\n我们和好嘛求求你了吗我还是小娃子\t137795\n撒花撒花撒花撒花撒花\t137796\n中华同学录cncls.com\t137797\ntgfghvn\t137798\n点水解\t137799\n重砸\t137800\n一次一颗子弹\t137801\n卧姿\t137802\nfdujjd\t137803\n叫停机\t137804\n哦恩\t137805\n大三\t137806\nFUT\t137807\n大不\t137808\n大一\t137809\n多妻制\t137810\n朴汉平\t137811\n大业\t137812\nhbvfcfdfgg\t137813\n大东\t137814\n作爱\t137815\n大丑\t137816\n带费\t137817\n大专\t137818\n油菜花\t137819\n就是满要不你就是假\t137820\n二个月\t137821\n挖挖机\t137822\n大个\t137823\n熟知道\t137824\n犮個\t137825\n醒来再说\t137826\n银河落九天\t137827\n雅凯雅\t137828\n导商\t137829\n勇敢的了了说错\t137830\n读书吧\t137831\n2434234243\t137832\n大丰\t137833\n易彩萍\t137834\ngiyy\t137835\n枉法\t137836\nOkbossbay\t137837\n口耐\t137838\n温雯锐\t137839\n转存\t137840\n汶上良\t137841\n绵绵\t137842\n上班人\t137843\n转子\t137844\n下午3点20几号\t137845\n采访\t137846\n说对\t137847\n须及春\t137848\n试试试试试试试\t137849\n王双双\t137850\n李秀赫\t137851\n李毅\t137852\n长留庄\t137853\n敢死缎\t137854\n63882764890277478199\t137855\n伤害人\t137856\nnk个\t137857\n粮长\t137858\n上饶枪战我想上我的床的进步厘米\t137859\n施舍不得\t137860\n转学\t137861\n中央部委\t137862\n洛蒂托\t137863\n田鸡\t137864\n吾\t137865\n吿\t137866\n吼\t137867\n吸\t137868\n吹\t137869\n450\t137870\n451\t137871\n452\t137872\n吵\t137873\n454\t137874\n456\t137875\n吱\t137876\n吮\t137877\n启\t137878\n听\t137879\n阿啵\t137880\n含\t137881\n吨\t137882\n否\t137883\n吧\t137884\nying\t137885\n整整\t137886\n吣\t137887\n吠\t137888\n吡\t137889\n吞\t137890\n吟\t137
891\n君\t137892\n吖\t137893\n吗\t137894\n讨讨\t137895\n吕\t137896\n吓\t137897\n吐\t137898\n向\t137899\n后\t137900\n同\t137901\n名\t137902\n吊\t137903\n吋\t137904\n合\t137905\n吉\t137906\n吆\t137907\n2354123\t137908\n各\t137909\n潇\t137910\n吃\t137911\nvevecovpbobovanhij\t137912\n吁\t137913\n高速\t137914\n高通\t137915\n瞥见\t137916\n智库\t137917\nv不vv城建局\t137918\n行道\t137919\nbudd\t137920\n赵晗\t137921\n吱吱吱\t137922\n司仪\t137923\n文门\t137924\n巴旦木\t137925\n花见花开\t137926\n屋顶花园\t137927\n老子打你那你在家的话\t137928\n一味孤行\t137929\n杨云林\t137930\n客DJ\t137931\n东石委\t137932\n司令\t137933\n一定要关\t137934\nMilonga\t137935\n不可惜不可惜不可惜\t137936\n站桩\t137937\n智度\t137938\nfdsessfdgderfd\t137939\n阿伽雷斯\t137940\n1357948\t137941\n楚河\t137942\n中国足协\t137943\n出血\t137944\n嗯akk\t137945\n饭额\t137946\n出行\t137947\n茶蛋茶蛋\t137948\nHxn\t137949\n出街\t137950\nmuil\t137951\n你链锁臭婊\t137952\n温柔我真\t137953\n季恩慧\t137954\n你是谁了吧告诉我你自己的名字吧我求求你了老天爷\t137955\n周冬雨\t137956\n二十七八号\t137957\nllx\t137958\n凛遥赛\t137959\nfxy\t137960\n有图\t137961\n哭而无泪\t137962\n做水\t137963\nfxw\t137964\nfxv\t137965\nfxq\t137966\n熊思敏\t137967\n王峡蕊\t137968\n鹿血酒\t137969\n瓮安\t137970\nxuyfxZTXFYXZZRRXYCFYZXYFXUFUCXFYDTSSZcfyd\t137971\nfxc\t137972\n一姐妹\t137973\n胡温\t137974\n左齿\t137975\n呃知章\t137976\n下贱一个人\t137977\n本喵\t137978\n当局者\t137979\n玩兒\t137980\n无愧于心\t137981\n半天半天\t137982\n胸痛\t137983\n百分之百分\t137984\n福发\t137985\n修道\t137986\nHOLD\t137987\n接上课\t137988\n13000多\t137989\n王自强\t137990\nyi睿哈尔\t137991\n溪流\t137992\n张会好\t137993\n杨呀\t137994\n洗去\t137995\ndd貂貂\t137996\n热才\t137997\n无啊无\t137998\n45272587576525\t137999\n玩具\t138000\n领事\t138001\n本善\t138002\n握死\t138003\n叶叫\t138004\n建厂\t138005\n传诵\t138006\n传说\t138007\n羽化成\t138008\n抱错\t138009\njlfdt\t138010\n第一个人\t138011\n松露\t138012\n中装\t138013\n眺望\t138014\n于娜\t138015\ngriowh\t138016\n烟村\t138017\n叶叶\t138018\n迈向\t138019\n主人马\t138020\n房屋烧制\t138021\n10000000000000000个\t138022\n直径\t138023\n狂欢节\t138024\n传话\t138025\n机器手\t138026\n200213\t138027\n中裤\t138028\nhopn\t138029\n一件件\t138030\n叶发\t138031\n了不说\t138032\n猴王历险记\t138033\n全文学\t138034\nkeyterms\t138035\nHeavy\t138036\n8月10日晚7时50分
\t138037\n深港澳\t138038\n摩尔多瓦\t138039\n刘丹\t138040\n陈镇文\t138041\n5.8万元\t138042\n西黑\t138043\n擦香\t138044\n真不想理你了退下\t138045\nBibbjh\t138046\n天天使唤\t138047\n不断地\t138048\n行字\t138049\n张靓怡\t138050\n厌贼\t138051\n哈卡哈\t138052\n千般\t138053\n快说说\t138054\n补药\t138055\n沐\t138056\n何至于此\t138057\n好矮\t138058\nniss\t138059\n好矢\t138060\nnist\t138061\n1986年\t138062\n比\t138063\n毕\t138064\n毖\t138065\nJgk\t138066\n毒\t138067\n一个一厘米\t138068\n14两个\t138069\n帅我美\t138070\n毙\t138071\nJgf\t138072\n毛\t138073\n多谢友\t138074\n毀\t138075\n毁\t138076\n以北\t138077\n母\t138078\ny768\t138079\n每\t138080\n突突吐\t138081\n荏苒\t138082\n试衣\t138083\n毋\t138084\n8部\t138085\n合阳\t138086\nRjyf\t138087\n七明儿\t138088\n毽\t138089\n零零落落零零落落零零落落个\t138090\n沙嫦\t138091\n试行\t138092\n银行家\t138093\n宝座\t138094\n宝度\t138095\n精疲力尽\t138096\n不妈呀\t138097\nlyou\t138098\n南南南\t138099\n毯\t138100\n合不来\t138101\n鄙视\t138102\n捶胸\t138103\n歧路\t138104\n正中\t138105\n宇哥哥\t138106\n怂逼\t138107\n老鼠药\t138108\n沧\t138109\n内布拉斯加\t138110\n沦\t138111\n真的听不懂你说的话\t138112\n反抗\t138113\n杨先生\t138114\n正不\t138115\n3333333\t138116\n绝交我恨你\t138117\n熊玉\t138118\nｓｉｓｍａｔｉｃａｔｉｏｎ\t138119\n沪\t138120\n正业\t138121\n衣架\t138122\n好lyou\t138123\nck\t138124\ncj\t138125\n我是你的小秘生物度秘\t138126\nch\t138127\nco\t138128\n乖乖的我要了\t138129\ncm\t138130\n大哈哈\t138131\nxxfffgfg\t138132\ncb\t138133\nca\t138134\n首个\t138135\ncg\t138136\n六坨支付\t138137\nce\t138138\ncd\t138139\n缩宫素\t138140\ncz\t138141\ncy\t138142\ncx\t138143\n巨片\t138144\ncs\t138145\ncr\t138146\ncq\t138147\ncp\t138148\ncw\t138149\ncv\t138150\ncu\t138151\nct\t138152\n提问\t138153\n摸着\t138154\n血色浪漫\t138155\n李雨欣\t138156\ncF\t138157\nrhrr\t138158\ncZ\t138159\n首专\t138160\n为神\t138161\n何以琛\t138162\n报报\t138163\n众星\t138164\n巨物\t138165\n招妓\t138166\n鬼火\t138167\n过滤\t138168\n鬼灯\t138169\nRTHBVHi\t138170\n张家赫\t138171\n有色片\t138172\n刘不\t138173\n微米\t138174\nc3\t138175\n休养生息\t138176\nc1\t138177\nyouamonster\t138178\nc6\t138179\n好了你\t138180\n喉韵\t138181\n疯似\t138182\n实至名归\t138183\n金钥匙\t138184\n篡权\t138185\nxbsj\t138186\n愣打\t138187\n保险\t138188\n戏剧节\t138189\nyouish\t138190\n卢勤\t138191\n四
灵\t138192\n市口\t138193\n为你自己\t138194\n定清楚\t138195\n96元\t138196\ngjgj\t138197\ngjgk\t138198\n楼主\t138199\ngjgl\t138200\nfrtt\t138201\n头女女\t138202\n鹅母\t138203\n发言权\t138204\n画稿\t138205\n一星期三百钟三百分钟\t138206\n很好好好好\t138207\n婷仔\t138208\n那个人\t138209\nucududu\t138210\n打乱反\t138211\np萍\t138212\n周嫁\t138213\n盛京八景\t138214\n了啦\t138215\n连拍照\t138216\nromance\t138217\n楼下\t138218\n楼上\t138219\n大半杯\t138220\n爱者\t138221\n宋轶舟\t138222\n1v呗\t138223\n玉凤\t138224\n不及\t138225\n饭狗\t138226\nxiaohuala\t138227\n1117755亩\t138228\n周多\t138229\n说戏\t138230\n李丹杰\t138231\n四零五到幺零零幺零\t138232\n娇气\t138233\n请受\t138234\n硬通货黄金\t138235\n没一样\t138236\n喜欢你了我讨厌\t138237\n莫扎特\t138238\n包广超\t138239\n说战\t138240\n快走快走快走\t138241\n马仔\t138242\n纸板\t138243\n林行\t138244\n马代\t138245\n这部书\t138246\n红金龙\t138247\nJulia\t138248\n请叫\t138249\ns22\t138250\nHIPAL\t138251\ns20\t138252\ns26\t138253\n7.73\t138254\n啦啦噜\t138255\n周天\t138256\n奥老公\t138257\n找了看\t138258\nqwewrggbnfffgggghjjvvgggghhhuuhffghhhhewe\t138259\n五分之2x2分\t138260\n不受\t138261\n中国男蓝\t138262\n志贤\t138263\n津京德比\t138264\n炸\t138265\n过八级\t138266\n朱逸杰\t138267\n小三儿\t138268\ncytfc\t138269\n东山小学\t138270\n益阳市\t138271\ncdff\t138272\n少给\t138273\n光鞋\t138274\n洁婚\t138275\n我们一家人\t138276\n一次又一次\t138277\n李玉刚\t138278\n呃幺\t138279\n孙博\t138280\n儿孙们\t138281\n翻江倒海\t138282\n心伤\t138283\n嗯风云\t138284\n沼洋\t138285\n对呀棒棒的小朋友我是你的女朋友度秘\t138286\n大姐握克图\t138287\n齐齐哈尔大学\t138288\n819585835319565555265595956554957645404778556548282820282\t138289\n木命女\t138290\n90亿美元\t138291\n田亦宣\t138292\n放歌听听敢门\t138293\n犹他大学\t138294\n忠诚\t138295\n秤子\t138296\n过牛\t138297\n今年2月2日\t138298\n心伐\t138299\n池鑫鑫\t138300\n唇红\t138301\n秘密系\t138302\n666666666666哒哒哒哒哒哒哒哒哒哒\t138303\n烦恼\t138304\n元无\t138305\n咯度\t138306\n元旦\t138307\n元日\t138308\nvfjfj\t138309\n欲别故乡\t138310\n不可\t138311\n蒙古人\t138312\n在公园\t138313\n158438\t138314\n大嗯\t138315\n大雪茄\t138316\n债券\t138317\n戈壁滩\t138318\n北京公园\t138319\n看不够\t138320\n技能源\t138321\n准许\t138322\n50944\t138323\n说的了你\t138324\n衣衫褴褛\t138325\n这么闲\t138326\n哈局\t138327\n小琪\t138328\n身临其境\t138329\n小琦\t138330\n54546\t138331\n54542\t138332
\n堡秘干嘛\t138333\n婧袆\t138334\n爱我真的不想\t138335\n吴蛙\t138336\n托米\t138337\n小琴\t138338\n小琳\t138339\n甘家大院熙南里\t138340\n倒拔垂杨柳的花和尚鲁智深黑旋风\t138341\nH6\t138342\n今日19时30分\t138343\n孙珂颖\t138344\n小球\t138345\n讨厌讨厌讨厌你真的很讨厌讨厌讨厌讨厌你\t138346\n九九一十八\t138347\n难兄难弟\t138348\n名澜\t138349\n冷汗\t138350\n我俩结婚吧求你了度秘\t138351\n虐恋\t138352\n手相\t138353\n幺零零幺零，我怎么了你了吧行么我求你\t138354\n广告费风\t138355\n发话\t138356\n不我不我好\t138357\nhwy\t138358\n我的女儿\t138359\nALAN\t138360\n蒙牛酸酸乳\t138361\n不经意间\t138362\nmklm\t138363\n卦\t138364\n绽谷\t138365\n笑哥\t138366\n理顺\t138367\n就就是说\t138368\n润发素\t138369\n笑哭\t138370\n占\t138371\n小十\t138372\n非理想主义\t138373\n志趣\t138374\n初鬼\t138375\n15239858362554185\t138376\njyp\t138377\n贞子行\t138378\n五注\t138379\n雷小雄\t138380\n不等你说\t138381\n收查\t138382\n谋取\t138383\n干骂\t138384\n尹方超\t138385\n2010米\t138386\n幺九\t138387\n融会贯通\t138388\n马骙思密达\t138389\n盖有\t138390\n干骗\t138391\n1458376464941961111336988636997751\t138392\n补肾\t138393\n下黄村\t138394\ndseetff\t138395\n帅宝\t138396\na50b200\t138397\n秦伟\t138398\n反过来说\t138399\n趣评\t138400\n咔咔咔咔咔咔咔咔\t138401\nBicycle\t138402\n88匹\t138403\n李西楼\t138404\n艾米利亚\t138405\n胜诉\t138406\n芒果类\t138407\nofficial\t138408\n百几十五五\t138409\n征服\t138410\n天紫\t138411\n趣话\t138412\n看不着\t138413\n未婚夫\t138414\n1581581589\t138415\n篡改\t138416\n青丘\t138417\n青丝\t138418\n呀么度\t138419\n物极必反\t138420\n小卷\t138421\n真够服\t138422\n紧接\t138423\n21岁时\t138424\n朱飞\t138425\n董兄\t138426\n小卫\t138427\n臻果latte\t138428\n宝林寺\t138429\nwxy\t138430\n卸\t138431\n林志凯\t138432\n9a－2b\t138433\n[鐘\t138434\n何以我一\t138435\n火阿咪\t138436\n亲乖\t138437\n两环\t138438\n家了您们\t138439\n對方\t138440\n對於\t138441\n搓衣板龙\t138442\n相印\t138443\n薛浩成\t138444\n饥款\t138445\n了不信也罢哼我不理你了你在这样\t138446\n霍尔\t138447\n离家出\t138448\n六十怪\t138449\n兵心\t138450\n不卖\t138451\n亲乖乖\t138452\n思雨\t138453\n散花\t138454\n童话故\t138455\n朱芷慧\t138456\n不十多天\t138457\n红火红\t138458\n走了再说\t138459\n文殊院\t138460\n38277338383838383824\t138461\n推车\t138462\nbxcjz\t138463\n整治\t138464\n上饶市\t138465\n西沿\t138466\n08：40\t138467\n魏文王\t138468\n摄影者\t138469\n七记\t138470\n洒洒\t138471\n李鑫钰\t138472\n慈悲\t138473\nh贴吧\t138474\n电影包\t13847
5\n电源线\t138476\n在我身边\t138477\n这一队\t138478\n250420\t138479\n尿白川\t138480\n美悯\t138481\n单\t138482\npsq\t138483\n写问\t138484\n胃口\t138485\n边本\t138486\n一样明天\t138487\n纪传体通史\t138488\n成魔\t138489\n海泰\t138490\n芝加\t138491\n奥特尼玛\t138492\n一个梦\t138493\n海波\t138494\n2010十大给力新闻回顾\t138495\n毕竟\t138496\nukt…4548\t138497\najjkkkkll\t138498\n中止\t138499\n一封信\t138500\n尿硬\t138501\nIhok\t138502\n湘江北\t138503\n晋安\t138504\n血手印\t138505\n心气\t138506\n手语\t138507\n13811869428\t138508\n集资案\t138509\n村委员\t138510\n星期日下午\t138511\n孤苦伶仃\t138512\n一角钱\t138513\n小哇01314\t138514\n大兴安岭林区\t138515\n赵王堂\t138516\n安王贝景\t138517\n1438438\t138518\n我讨厌你我讨厌你我讨厌你我讨厌你我讨厌你我讨厌你讨厌的人\t138519\n蒋亚男\t138520\n球形\t138521\n王妙妙\t138522\n阿俩丽\t138523\n马累\t138524\n大下个\t138525\n乙庚\t138526\n强加\t138527\nWhUtyoUrname\t138528\n萌群\t138529\n永安乔\t138530\n强势\t138531\n咪女孩\t138532\n八十多斤\t138533\n五彩石补遗\t138534\n封神演义-仙界传\t138535\n啊噜\t138536\n陈雨晴\t138537\n强劲\t138538\n商务\t138539\n背龙八\t138540\n萌美\t138541\n每一片\t138542\n我在这说话的声音大与小你在哪边\t138543\n谦谦\t138544\n某某\t138545\n呀嗯瓮\t138546\n强力\t138547\n力四射击\t138548\n顶得住\t138549\n午电影\t138550\n丽会书\t138551\n中国建筑\t138552\nbobo么佛\t138553\n稳听\t138554\n死不悔改\t138555\niijjjoooo\t138556\n上汽\t138557\n吧求你了度秘处女了求你了求求求求求求求求求求求求你\t138558\n得美\t138559\n霍霍\t138560\n高科技\t138561\n攘括\t138562\na17kefghi1k2mnopqrstuv880\t138563\n15064589779\t138564\n麦麦提\t138565\n嶴\t138566\n260元\t138567\n桌球\t138568\n一下子扇\t138569\n环球\t138570\n乐卡\t138571\n最大公因数\t138572\n郑超\t138573\n毛骨悚然\t138574\n你好闲\t138575\n妝\t138576\n妞\t138577\nix\t138578\n璐璐\t138579\nRPG74\t138580\n妒\t138581\n妓\t138582\n中英新约\t138583\n花开富贵\t138584\n妖\t138585\n妗\t138586\n妈\t138587\n妊\t138588\n张体学\t138589\n妍\t138590\n头草\t138591\n妁\t138592\n如\t138593\n妃\t138594\n妄\t138595\n妆\t138596\n妇\t138597\n炽暖\t138598\n妹\t138599\n妺\t138600\n妻\t138601\n刘福霞\t138602\n妾\t138603\n心塞\t138604\n多米贷\t138605\n妲\t138606\n妳\t138607\n劳教其母\t138608\n妨\t138609\n杞人之忧\t138610\n秒赞\t138611\n妮\t138612\nip\t138613\n18b\t138614\n王辛庄\t138615\n纷至沓来\t138616\ntyrjgvk\t138617\n教诲\t138618\n刘千惠\t138619\n燕王\t138620\n伤忘\t138621\nbgaa\t138
622\nklygyvu\t138623\n怪别客气\t138624\nDello\t138625\n煦暖和拮\t138626\n潘柏伟\t138627\n伤心\t138628\n蛰伏\t138629\n多部\t138630\n托AK\t138631\n燃起\t138632\n搞笑点\t138633\n顾明\t138634\n姤蓂\t138635\n63cccc\t138636\n欧阳思\t138637\n工作我的爸妈\t138638\n扣扣扣\t138639\n小名字\t138640\n70多分\t138641\n燕玲\t138642\n我喜欢喜演员\t138643\nifdch\t138644\n崔斯特姆\t138645\n经得起\t138646\n不疑\t138647\n曲沃\t138648\n强人\t138649\nshalladsalad\t138650\n矮瓜\t138651\n郭嗯\t138652\n溺爱\t138653\n亚日亚\t138654\nTUUT\t138655\n草食性\t138656\n1200点\t138657\n钟绍晨\t138658\n傻傻傻子\t138659\n收购价\t138660\n真真真\t138661\n不疼\t138662\n一股%\t138663\n小乳猪猪八戒\t138664\n蛮紧\t138665\n借款\t138666\n好呀hz\t138667\n拍卖机头\t138668\n水果忍者\t138669\n高富帅们\t138670\n犯心有灵犀\t138671\n袒森\t138672\n移送\t138673\n林秋\t138674\n铁索桥\t138675\n迷城\t138676\n月明\t138677\n顶用\t138678\n卫星\t138679\n战法\t138680\n300分\t138681\njdhdjdgehfksougzowxfehsouwhghdpibhpqjehpqeigcljqdfnkpegdljwvxpis\t138682\n臭臭臭度秘\t138683\n金急转弯\t138684\n零三\t138685\n零下\t138686\n普通常来\t138687\n乳牛奴\t138688\n遗传性\t138689\n零一\t138690\n零丁\t138691\n零七\t138692\n零万\t138693\n佩服\t138694\n一二三四五天天\t138695\n6月4号\t138696\n云南三座大山\t138697\nS764498499\t138698\n138885588555795456555555555555555884114649111484455558421111111111181188188445444115448#5\t138699\n零个\t138700\n必杀技\t138701\n秘宠不骄\t138702\n繁体\t138703\n咖啡色\t138704\n含蓄\t138705\n张芸宁\t138706\n体驗\t138707\n芊泽花\t138708\n第三步\t138709\nzen\t138710\n机汽\t138711\nhoneywell滤水器\t138712\nzer\t138713\n6.6\t138714\n6.5\t138715\n6.4\t138716\n6.3\t138717\n8522n222333522232\t138718\nzet\t138719\n案列\t138720\n710555\t138721\n家天\t138722\n333333333333333333\t138723\n报业园\t138724\n五十多块\t138725\n代码源\t138726\n知得\t138727\n茅厕\t138728\n墨叽\t138729\n五个人\t138730\n统计\t138731\n蛇夫座\t138732\n不好看脸\t138733\n车本\t138734\n东风路\t138735\n阿斯利\t138736\n正太控\t138737\n死孩子\t138738\n恐惧感\t138739\n飘浮\t138740\n嗯麻\t138741\n张明浩\t138742\nceince4oo\t138743\n蒋斌雪\t138744\n牛莹\t138745\n市蜃楼\t138746\n恩西\t138747\n于梓俊\t138748\n一遍遍\t138749\n蔓结婚\t138750\nnmmmmmmmmmmmmmm\t138751\n做守株待兔\t138752\n指王丁丁丁丁丁丁\t138753\n牛莉\t138754\n舌怪\t138755\n荆州\t138756\n几千元\t138757\n85555888\t1
38758\n内网\t138759\n二人转\t138760\nb6dcb\t138761\n几千克\t138762\nQQ2396262305\t138763\n挑战者\t138764\n心灵相犀\t138765\n想说实话\t138766\n懒嘛\t138767\n南海会馆\t138768\n架不住\t138769\n6141\t138770\n防腐剂\t138771\n共同生活\t138772\n夏邑\t138773\n汇入\t138774\n好的我原谅你\t138775\n德·梅勒\t138776\n眀天\t138777\n递给\t138778\n信號\t138779\n280页\t138780\n刘亚亮\t138781\n丙得\t138782\n刘亚亭\t138783\n波特兰\t138784\n好不好小爱\t138785\n杜马\t138786\n1147255903399\t138787\n张夫人\t138788\n张君豪\t138789\n没事儿干\t138790\n半步\t138791\n扣妞\t138792\n名作展\t138793\nT3航站楼\t138794\n成材\t138795\n每个者\t138796\ni4\t138797\ntian\t138798\n附魔\t138799\nwuuweu\t138800\n5688889\t138801\n大头儿子\t138802\n小嫩\t138803\ni6\t138804\n半死\t138805\nSketch\t138806\n77788899999988\t138807\n不寒而栗\t138808\n匡烨鹏\t138809\n琴若离\t138810\n小嫂\t138811\nLGG益生菌\t138812\n韩万竹\t138813\n范一龙\t138814\n自由基\t138815\n進\t138816\n广广北\t138817\n情怀\t138818\n逻\t138819\n逸\t138820\n逾\t138821\n未妥投\t138822\n无敌飞檐走壁吼\t138823\n逢\t138824\n連\t138825\n造\t138826\n奥钱啊\t138827\n文数\t138828\n紫莫\t138829\n尼高斯都尼狗之都\t138830\n香蕉皮\t138831\n张毅\t138832\n逮\t138833\n逯\t138834\nalthouthat\t138835\n递\t138836\n逐\t138837\n古木峰\t138838\n4538\t138839\n逗\t138840\n途\t138841\n情急\t138842\n走着瞧\t138843\n逛\t138844\n贝贝哥\t138845\n這\t138846\n逞\t138847\n速\t138848\n衰微\t138849\n逝\t138850\n适\t138851\n逃\t138852\n退\t138853\n送\t138854\n逆\t138855\n日经225指数\t138856\n五会\t138857\n辩机\t138858\n逊\t138859\nKapalai\t138860\n选\t138861\naabc41\t138862\n透\t138863\n污垢\t138864\n回信心\t138865\n一无所有\t138866\n#2012利群阳光助学行动#\t138867\n放假吧\t138868\n昨日下午1时10分\t138869\n熊出没之熊雪岭雄风\t138870\n黄昭智\t138871\n好不没有\t138872\n固安县第四小学\t138873\n黄羿杰\t138874\n据悉\t138875\n這麼棒\t138876\nhJjl\t138877\n同导\t138878\n有利比亚人\t138879\n梁总\t138880\n出港\t138881\n啦啦啵啵\t138882\n好惨\t138883\n百胜\t138884\n後悔\t138885\n月歌\t138886\nnonononp\t138887\nnononono\t138888\n好吧小米\t138889\n会雯\t138890\n卡士达蛋挞店\t138891\n出游\t138892\n查完\t138893\n么秘\t138894\n南片\t138895\n好想\t138896\n扑君\t138897\n武艺感\t138898\n电业\t138899\n徐秉良\t138900\n别重说\t138901\n那你在陪我聊会天\t138902\n温暖的话\t138903\n疯狂油罐车\t138904\n石券\t138905\n宗梦雅\t138906\n8776888158545\t1389
07\n一错再错\t138908\n亲润\t138909\n试驾\t138910\n老臣\t138911\njxs\t138912\njxr\t138913\n葫芦堡\t138914\njxx\t138915\n今天凌晨一点\t138916\n一百五十多块\t138917\n青汁\t138918\n继承者\t138919\n郑金伟\t138920\njxb\t138921\n密我讨厌\t138922\n闹变\t138923\n舞王\t138924\njxf\t138925\nggkk\t138926\njxh\t138927\njxk\t138928\njxj\t138929\njxn\t138930\n烟味\t138931\n政务\t138932\nk线控\t138933\n我爱你我没有错你疯了\t138934\n每所\t138935\n第6201天\t138936\n秘情\t138937\n一百G\t138938\n县地\t138939\n半瓶\t138940\nstilityo\t138941\n康果果\t138942\n枸枳沟\t138943\n嘉茂购物中心西直门店\t138944\n三十二十来岁\t138945\n新白娘子传奇好乱\t138946\n控烟\t138947\n必应\t138948\n几何体\t138949\n马杰瑜\t138950\n神经浪\t138951\n魏露露\t138952\nox岑\t138953\n闪闪\t138954\n城际\t138955\n民航总局\t138956\n1395825857858\t138957\n豪爵豪爵\t138958\n找鸡婆克\t138959\n中国军团\t138960\n引火\t138961\n战功\t138962\n除皱\t138963\n乐此不疲乐意\t138964\n苦命\t138965\n嗓子\t138966\n0755\t138967\n奥斯兰\t138968\nSUERUKF\t138969\n划款\t138970\nkksjzbbskce\t138971\n少年级\t138972\n短棒\t138973\n负义\t138974\n县医院\t138975\n叶挺\t138976\n五分\t138977\n2385\t138978\n十五棵\t138979\n魏嘉琪\t138980\n抽脸\t138981\nboolerou\t138982\n来越来越天\t138983\n惨迓\t138984\n忐忑妹\t138985\n47835725857768878\t138986\n度秘度秘度秘我想听魔镜奇缘\t138987\n哟西黄\t138988\n无数家\t138989\n刘畅\t138990\n靠厉害\t138991\n逐苦\t138992\n松江山\t138993\n李光曦\t138994\n联合国监督特派团先遣队\t138995\n源自\t138996\n进取心\t138997\n遁形\t138998\njvgk\t138999\n依依成\t139000\n相后\t139001\n一四代\t139002\n少狼\t139003\n相合\t139004\n心肝儿\t139005\n青藏高\t139006\n不温柔\t139007\n空空空空空\t139008\n奥威\t139009\n夺回\t139010\nvagduu\t139011\n感问\t139012\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t139013\n西山社区\t139014\n冯玉祥\t139015\nfgux\t139016\n男朋友\t139017\n1.69亿元\t139018\nfguu\t139019\n一千万个\t139020\n嗯冷\t139021\nhffhg\t139022\n光明王\t139023\n上场强\t139024\n阿拉\t139025\nfgug\t139026\nbbdnx\t139027\n乖宝贝\t139028\n门外\t139029\n11月1日\t139030\n最后一科\t139031\n指南\t139032\n醉城\t139033\n三级片\t139034\n劳务费\t139035\n太太\t139036\n咯啦\t139037\n奇米级\t139038\n厨具\t139039\n哦度秘\t139040\n祭出\t139041\n旅途\t139042\n王沟中学\t139043\n冰原\t139044\n国寿平米\t139045\n15631007759\t139046\nTAT觉\t139047\n太多\t139048\n说中我中\t139049\n门头\t139050\n秘
籍\t139051\n几800分\t139052\nstbaby\t139053\nBFDHJDNDJFKGNNGFBZNNDVFMMGNH\t139054\n护领\t139055\n张瑞丽\t139056\n评点\t139057\n竟然后再见\t139058\n门天\t139059\n65673\t139060\n张钰嵋\t139061\n僵尸男友\t139062\n塔尖\t139063\n轶事\t139064\nhiddles\t139065\n过去乐意\t139066\n2016年2月\t139067\n里皮\t139068\nhttppinyincn1dSY1x2DxkR\t139069\n军长\t139070\n淘狗\t139071\n对对对对\t139072\n正劲\t139073\njdtjtwkt\t139074\n射雕英雄传\t139075\nuffgd\t139076\n无间道\t139077\n36kkk\t139078\n零七二和五中\t139079\n死东西\t139080\n上海殷浩电子科技股份有限公司\t139081\nJASONWOOD\t139082\n锁点\t139083\n石蛾折\t139084\nPOPONNY\t139085\n平安八中\t139086\n红星网\t139087\n一带\t139088\n闭路\t139089\n过重\t139090\n了了\t139091\n过量\t139092\nVick\t139093\n一席\t139094\n西溪木\t139095\n翟玉凤\t139096\n开封市委\t139097\n30克\t139098\n64471731\t139099\n李文愈\t139100\n冬蜜冬蜜\t139101\n陆芊芊\t139102\n30元\t139103\n一年两百\t139104\n3231515\t139105\n一帅\t139106\n间单\t139107\nok333\t139108\n尼李江伟\t139109\nddgdf\t139110\n二八400500600700家\t139111\n了京\t139112\n陈默\t139113\n88十8十8十8二几\t139114\n中暑\t139115\n依兰香\t139116\n麻烦你的等\t139117\n脆不叫\t139118\n告诉你是\t139119\n自有资格\t139120\n看不了\t139121\nfuyfg\t139122\n好心草\t139123\n福岛核电站\t139124\n禽兽\t139125\nfuyfh\t139126\n我是女女女\t139127\n陈哲\t139128\n赵冰艳\t139129\n不要再发呵呵\t139130\ndggfgggcggfgghghghfhgfcvbbvghhbbbhbbbbvgggghhbbhhyyggggjghghcgg\t139131\n百余发\t139132\nbhiphotosbaiducomxiaodupicitemaa18972bd40735fa8fa509cc99510fb30f240869jpg\t139133\n拥戴\t139134\n88分钟\t139135\n史欣怡\t139136\n药鱼\t139137\n肉球\t139138\n勋夫人\t139139\n镌刻\t139140\nplih\t139141\n漏漏\t139142\nzenmdi\t139143\n硬说\t139144\n臭八蛋\t139145\n周8\t139146\n呀不想\t139147\nappashouou\t139148\nrain\t139149\n鱼钩\t139150\nnoshiheblack\t139151\n蔬果\t139152\n亚无甲肝经\t139153\n58565485548008262428961586￥\t139154\n河西\t139155\n嘁嘁嘁嘁\t139156\n哈哈他好他好大家好\t139157\nBurberry\t139158\n飞桃花流水厥\t139159\n我愿意\t139160\njIDYUR\t139161\n刃\t139162\n小鞠\t139163\n这边\t139164\n王府井影城\t139165\n洪湖赤卫队\t139166\n此车乃速度与激情5\t139167\n马里政府军\t139168\n掀翻\t139169\n元旦晚\t139170\n光临\t139171\nhellotyou\t139172\n耐苦\t139173\n渎读\t139174\n自恋婉\t139175\n卧薪尝\t139176\n笔迷\t139177\n季羡林语录\t139178\n2小时\t139179
\nsmbaiducomsword%E794B5E5BDB120E69E81E99990E68C91E68898E54A7E794B5E5BDB1tnbdvoice\t139180\n笔迹\t139181\n噜啦啦噜\t139182\n四环医药\t139183\nkjkjfcji\t139184\n张子文\t139185\n类似于\t139186\n杨千嬅\t139187\n一声天天\t139188\n好看我的\t139189\n生鲜\t139190\n糊糊\t139191\n闪亮时刻\t139192\n睡不着调\t139193\n甲辰\t139194\n雪珍广场中心\t139195\n别把卯\t139196\n一挺\t139197\n分度密度\t139198\n此等\t139199\n菁菁\t139200\n度秘棒哒哒\t139201\n任庆龙\t139202\n94684265201314\t139203\n好吧度秘我讨厌你\t139204\n爸爸到我没有\t139205\n衍生品\t139206\nbhiphotosbaiducomxiaodupicitem3801213fb80e7beca527b195282eb9389a506bc1jpg\t139207\n阔死\t139208\n林闽炜\t139209\n获得者\t139210\n赵俊华\t139211\n雨水\t139212\n中文名\t139213\n粑粑样\t139214\n蒋依依\t139215\n它那样\t139216\n好乐迪\t139217\n李少帅\t139218\n阔步\t139219\n毛栗子\t139220\n求饶\t139221\n今天16点01分\t139222\nDHGJVJ\t139223\n妈妈的手\t139224\n看不烦\t139225\n189件\t139226\n咋回事儿\t139227\n555221456\t139228\n百余条\t139229\n乐维东太湖佩尔家族\t139230\n羿君\t139231\n冷餐\t139232\n活面\t139233\n虚妄\t139234\n任牵连\t139235\n投点\t139236\n三十五岁时\t139237\n上学好\t139238\n朱猪朱\t139239\n艺考\t139240\n葵花子\t139241\nDHHDHD\t139242\n有没有\t139243\n六条鱼\t139244\n\t139245\n赛尔大电影战神联盟\t139246\n圣鹿\t139247\n释梦美人来啦\t139248\n纽扣\t139249\n比伦宪柱杰\t139250\n硝基三氮唑\t139251\n失分寸\t139252\n光子郎\t139253\n清亮丽\t139254\n付瑶瑶\t139255\n通假字\t139256\n杜凯明\t139257\nhgce\t139258\n283748\t139259\ngdfnS\t139260\n泪光我知道我一只有双鹰心的翅膀带我飞给我希望\t139261\n天津泰达U-19队\t139262\nhyng\t139263\n玩青山我一个人一个\t139264\n体验者\t139265\n伤心累\t139266\n风凌\t139267\n未婚妈\t139268\n过招呼\t139269\n返吖\t139270\n无义\t139271\n团聚\t139272\n毒枭\t139273\n萌萌哒唐氏号\t139274\n也也\t139275\n后生笑话\t139276\n黄翩翩\t139277\n快点现在网\t139278\n6699点\t139279\n233333658794\t139280\n泉眼\t139281\n再生气\t139282\n全诗\t139283\n单予晨\t139284\n死行\t139285\n痴人花\t139286\nnonoo\t139287\nnonon\t139288\nnonou\t139289\n小蒸笼\t139290\n248575980\t139291\n无字词典\t139292\n2009年3月21日上午九点\t139293\n二零九八\t139294\n谢我么\t139295\n自作聪明\t139296\n小楚\t139297\n信就\t139298\n吉林省\t139299\n渐感\t139300\n有情人终成眷属\t139301\n信封\t139302\n533医院\t139303\n锁保护\t139304\n汉族\t139305\nghydh\t139306\n胶着梦\t139307\nghydc\t139308\n奔波中等待\t139309\n袜片\t139310\n此物\t139311\n12
41262412\t139312\n红淡然\t139313\n风象星座\t139314\n嘁嘁嘁嘁嘁嘁嘁嘁嘁嘁嘁嘁\t139315\n男撑女\t139316\n王红\t139317\n天线\t139318\n纯洁度\t139319\nfifjd\t139320\n腾达\t139321\n三聚氰氨奶\t139322\n小朋友们\t139323\n开发版\t139324\n王纸\t139325\n克莉丝汀·斯图尔特\t139326\n吹度\t139327\n小苏打\t139328\n三辈子\t139329\n姜明俊\t139330\n聂勇芹\t139331\n男不男女\t139332\n吸走\t139333\n陈是男\t139334\ndfoh\t139335\n5354226\t139336\n蒙古族\t139337\nbemore\t139338\n叫顾\t139339\n矿泉水\t139340\n房祖名\t139341\n新会\t139342\n拉菲\t139343\n我找小爱上你\t139344\n我是你的梦中女孩\t139345\n肩并肩\t139346\n新传\t139347\n叶颖洁\t139348\n寄愁\t139349\n入围\t139350\n毁灭地球\t139351\n忒格亚\t139352\n乡间路上\t139353\nxtbreuvdyb\t139354\n奔马图\t139355\n一包\t139356\n好女孩\t139357\nmsjcjjwfjdjsjehehshwhwhshdhheuwushdifugqtiwgevbgkfhdbfkfiushskfow\t139358\n新似\t139359\n阴阳五行家言\t139360\n头片\t139361\n玫红色\t139362\n蒲万里\t139363\n考坐\t139364\n红油\t139365\n头牌\t139366\n红河\t139367\n头版\t139368\nhellocla\t139369\n四八五十四\t139370\n张诗杨\t139371\n别介绍\t139372\n不甚\t139373\n醉卧南山笑渊明\t139374\n角球\t139375\n1一个一个\t139376\n落山\t139377\n头牛\t139378\nN250\t139379\n同性变\t139380\n呗克\t139381\n聪明永富贵\t139382\n非物质\t139383\n鬼屋\t139384\nflash\t139385\nzono\t139386\n萝卜缨炒泥丁\t139387\n门户\t139388\n大机鸡\t139389\n手不释\t139390\n搬走\t139391\n沈洪涛\t139392\n软硬\t139393\n知己者\t139394\n抵挡\t139395\nlkkuk\t139396\n天天开心\t139397\n开光\t139398\n助教\t139399\n额斤斤计\t139400\n马褂\t139401\nCC层\t139402\n开元\t139403\n我喜欢吴\t139404\n有色\t139405\n拉拉裤卡六里屯\t139406\n郭铭慧\t139407\nhhfhfeuvg\t139408\n裂变\t139409\n她走\t139410\n死皮膏\t139411\n家台\t139412\n454803\t139413\nmamasqq\t139414\n一夜跳远到吧\t139415\n破音\t139416\n毛度秘\t139417\n断头路\t139418\n产蛋\t139419\n洋洋兔\t139420\n999999999999999\t139421\n有你好\t139422\n罗溪\t139423\n苦逼伤不起\t139424\n想不过来\t139425\n裂口\t139426\n血淋淋\t139427\n开关\t139428\n螺旋桨\t139429\n开具\t139430\n顺带你别说话了行不\t139431\n老子心情不好别惹我\t139432\ngvfs\t139433\nhetyu\t139434\n不服\t139435\n人间蒸发\t139436\n别管照\t139437\n柔道\t139438\n碎觉鸟\t139439\n臭志伟\t139440\n不會\t139441\n特别卡\t139442\n不最\t139443\n任别\t139444\n吴德鹏\t139445\n一次项\t139446\n小眼儿\t139447\n相拍\t139448\n神仙侣\t139449\n433151880137\t139450\n小坏蛋小坏蛋小坏蛋小坏蛋\t139451\n林浩帆\t139452\n哭祁\t
139453\n嗯瓮\t139454\n金牛座\t139455\n保卫委\t139456\n病故\t139457\n还碍\t139458\n桂哥\t139459\n润霜\t139460\n不朽\t139461\n排长\t139462\n美丽俏佳人\t139463\n身历\t139464\n标题\t139465\nhdkbd\t139466\n李俊赫\t139467\n吕国航\t139468\n0328328\t139469\nfnto\t139470\n盖茨\t139471\n初雪出血\t139472\n那凡\t139473\n白裳\t139474\n9892万亩\t139475\n这样板间\t139476\n于鑫涛\t139477\n基辛格\t139478\n蒲灵波\t139479\n兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔\t139480\n乜餸\t139481\n3.61元\t139482\n刘一样\t139483\n哇呱刮片吧片\t139484\n去职中\t139485\n杉田樱子\t139486\n通情达理\t139487\n露馅\t139488\n补文\t139489\n莫非\t139490\n广发\t139491\nUN43\t139492\njau\t139493\n报国事半功倍艰苦朴素\t139494\n6X4一4\t139495\n龊\t139496\n龄\t139497\n龇\t139498\n16.9%\t139499\ntudt\t139500\n张号秘\t139501\n龟\t139502\n龘\t139503\n龙\t139504\n龚\t139505\n陶海丹\t139506\nGJVJ\t139507\n别唬\t139508\n金毛犬\t139509\n7国\t139510\n18221189858\t139511\n好男儿\t139512\nlive\t139513\n#2012COSMO\t139514\ntnbn\t139515\n上辈子\t139516\n龴\t139517\n龵\t139518\n关键词\t139519\n老计\t139520\n超模him\t139521\n车型\t139522\n能做到\t139523\n尔等\t139524\n江森\t139525\n还行啦我待\t139526\n奥特曼行\t139527\nfffgff\t139528\n男土女金\t139529\n9点20\t139530\njebdb\t139531\n黄色国\t139532\n白说器\t139533\n嗯功夫王\t139534\n房产证\t139535\n做不死\t139536\n二十题\t139537\n小花仙二中\t139538\njab\t139539\n十二根\t139540\nMMP\t139541\n好厉害\t139542\n赔钱\t139543\n唱心\t139544\n脸汤么子\t139545\n离开登\t139546\n还什么说\t139547\n王伟三\t139548\n重返小王国\t139549\nzux\t139550\n剧中人\t139551\n凤鸣\t139552\n调系\t139553\n赏光\t139554\n送花\t139555\n虚汗\t139556\n伙同\t139557\n十几辆\t139558\n老豆\t139559\n洗洗洗\t139560\n编厂\t139561\n波巴\t139562\n亿年\t139563\nzub\t139564\n旺盛\t139565\nyoutrilit\t139566\n星魂堡\t139567\n宋宇阳\t139568\n鞋子\t139569\n念嗯我讨厌你讨厌你我不爱你\t139570\n关索\t139571\n关紧\t139572\n义务\t139573\n大班儿\t139574\n黎烤烤火\t139575\n依傍\t139576\n报应该\t139577\n均衡\t139578\n暗堂\t139579\n厂片\t139580\n台庆\t139581\n天利\t139582\n未免\t139583\n说离别的话堡我求求你了你叫什么简单点\t139584\n决定\t139585\n陵恋\t139586\n跑一跑\t139587\n邮件\t139588\n不好说\t139589\n囧恩\t139590\ngmgdgt\t139591\n劫持\t139592\n把贴\t139593\n黄度秘\t139594\n胡怒\t139595\n无锡\t139596\n邓斯特\t139597\n马特·亚当斯\t139598\n计成\t139599\n奇书\t139600\n天分\t139601\n条形\t139602\n行那你就是我的徒
儿\t139603\n第一页\t139604\n恢复\t139605\n瓷娃娃\t139606\n赣榆\t139607\n侄子\t139608\n朴树\t139609\nxiayang3109707085\t139610\n第一项\t139611\n品控\t139612\n庚旭\t139613\n王文文\t139614\n五岁\t139615\nkszuoa\t139616\n莱斯特咪\t139617\n短效\t139618\n9层\t139619\n04568545775687588967860867688989579086\t139620\ncold\t139621\n没小\t139622\n灯展\t139623\n啦啦五\t139624\n扎克伯格\t139625\n五岳\t139626\n754775\t139627\n耿志博\t139628\n三零二零二幺\t139629\n拥趸\t139630\n李光耀\t139631\n计华\t139632\n孟非乐嘉\t139633\n光棍\t139634\n陪着我了我讨厌\t139635\n风yun\t139636\n1月几日\t139637\n五岭\t139638\n乱太郎\t139639\n密聊\t139640\n主要堡\t139641\n奇缘\t139642\n冯圆圆\t139643\n县北\t139644\n狩猎\t139645\n猜明知故问\t139646\n秦亚丽\t139647\n杨文理\t139648\n单牌\t139649\nhalf\t139650\n结尾段\t139651\n突破性\t139652\n设施\t139653\nhall\t139654\nhalo\t139655\n快乐寒假\t139656\n想笑话\t139657\n陈的\t139658\n魏心悦\t139659\n点埋\t139660\n讹坏\t139661\n44个\t139662\n陈瀚霏\t139663\n林侧成峰\t139664\n泡沫\t139665\n泡沬\t139666\n圣马利诺\t139667\n风倾墨\t139668\n修白\t139669\n豆泡\t139670\n苦思冥想\t139671\n立信\t139672\nlm3\t139673\n忱\t139674\n啊欧1a\t139675\n铁板烧\t139676\n欧泥马\t139677\n鉴开中学\t139678\n发车\t139679\ncitan\t139680\nvhiggijbn\t139681\n叫屁颠\t139682\n恰如\t139683\njoqd\t139684\n客衬\t139685\n极点提他\t139686\n答复\t139687\n区司法局\t139688\n88962788\t139689\n忻\t139690\n撰写\t139691\ncuok\t139692\n刘嘉伟\t139693\n买万吧\t139694\n514晚\t139695\n彭州市佛教协会\t139696\n一路通天\t139697\n同嘉裕\t139698\n这时\t139699\n长莎莎\t139700\n心甘情愿\t139701\n何振全\t139702\n院方\t139703\nlmp\t139704\n地球保卫战\t139705\n更衣室\t139706\n617\t139707\n假劲\t139708\nlmj\t139709\n614\t139710\nlmn\t139711\n欧巴那\t139712\n路娜\t139713\n号召力\t139714\nXX队\t139715\n下雨天\t139716\n色拉拉\t139717\n夜不长\t139718\n米仓千寻\t139719\n再见再见再见再见再接\t139720\n墙上\t139721\nspoooo\t139722\n重庆南开中学\t139723\n灼伤\t139724\n收鬼\t139725\n探测性\t139726\n我叫花千骨\t139727\n爱上我\t139728\n头等舱\t139729\n4567天\t139730\n五成五\t139731\n非烫手\t139732\n大巴吧\t139733\n可及\t139734\n微臣\t139735\n厌儿\t139736\nAngelababy美\t139737\n就是好呀我的好好好友我的寂寞\t139738\n可发\t139739\n找我欲意\t139740\n喜欢你了你好\t139741\n15550\t139742\nRookie\t139743\n48916\t139744\n蹈矩\t139745\n可变\t139746\n群主\t139747\n可口\t139748\nDiamo
nd\t139749\n我相信你爱我\t139750\n谢中书书\t139751\n你的生日\t139752\n可叫\t139753\n可可\t139754\n金俊秀\t139755\n密洞一觉度秘\t139756\n胸置\t139757\n胸罩\t139758\n张自艳\t139759\n可叹\t139760\n61%\t139761\n吹走\t139762\nJohansson\t139763\n千军万马\t139764\nBritney\t139765\nTjxf\t139766\n忑\t139767\nghhjvcf\t139768\n两三年\t139769\n照实\t139770\n好多面\t139771\n580085211553\t139772\n车建聪\t139773\n杨楠\t139774\n12星座\t139775\n雷普利\t139776\n走了心\t139777\n真心难求\t139778\nrgikn\t139779\n宋依静\t139780\n幼童漫\t139781\n三百一十多块\t139782\nglju\t139783\n10:00\t139784\n高静软膏纪元\t139785\n我讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t139786\n热火夺冠\t139787\n堂街探案\t139788\n复活节\t139789\n玉韫珠藏\t139790\n166799958303\t139791\n狠点面\t139792\n直聪明\t139793\n泼妇\t139794\n杨影音\t139795\n个旧\t139796\n够你\t139797\n娄底jjj\t139798\n扎助\t139799\n傲馓\t139800\n王屯\t139801\n进款\t139802\n羊羊羊羊小游我想\t139803\n代代代代\t139804\n19位\t139805\n阿珍\t139806\n笃笃笃笃笃笃笃笃笃笃的笃笃笃相敬相爱\t139807\nPo\t139808\n狠心小\t139809\n干千\t139810\n好我着急\t139811\n脚手架\t139812\n回击\t139813\n樱桃黑\t139814\n芙蓉石\t139815\n美丽吧\t139816\n够住\t139817\n放炮仗\t139818\n忆\t139819\n邱天\t139820\n死屯\t139821\n诗会\t139822\n李明美\t139823\n2006KBS歌谣大庆典天鹅的梦金在中Se7ven\t139824\n水本\t139825\n飘我飘我飘我飘呀飘好飘\t139826\n凯爷婷\t139827\nhttpbaikebaiducomsubview287415762283htmfraladdin\t139828\n洗手间\t139829\n陈小姐\t139830\n葱蒜\t139831\nrunningman\t139832\n提低\t139833\n000005\t139834\n000000\t139835\n水月\t139836\n多好多好\t139837\n北京尾\t139838\n肺气\t139839\njzZZKY\t139840\n你好老魏\t139841\n忍\t139842\n升格\t139843\n四一遍\t139844\n牵引带\t139845\n终圆儿\t139846\n九十七块\t139847\n好辛苦\t139848\n佳慧\t139849\n能覆舟\t139850\n今晚18：25\t139851\nbhhhgjgggggfgghhfibhffghkkjjiiuhhghhvvccvgbgGHXGYTFDDRERWWTUOLBDRUXGHFU\t139852\n知我相思\t139853\n别再开玩笑\t139854\n9005007008009\t139855\n成问题\t139856\n取自己\t139857\n不问你\t139858\n颈胸\t139859\n咪咪登登上山稀里糊涂过河\t139860\n自己萌\t139861\n拼音输入法\t139862\n吴是金\t139863\n致以\t139864\n长太丑\t139865\n邓盆霜\t139866\n下周六\t139867\n过顺\t139868\n重典\t139869\n重关\t139870\n重共\t139871\nhi吗昂科雷\t139872\n均值\t139873\n重其\t139874\n13年间\t139875\nPG\t139876\n光宇\t139877\n两个群\t139878\n大雾\t139879\n27926\t139880\ncuyiffjf\t139881\
n乔不死\t139882\n炙手可热\t139883\n打磨\t139884\n大雷\t139885\n大雨\t139886\n大雪\t139887\n神马滴偶\t139888\n花我叫姐姐我把我就我姐姐好妹妹好你是我妹妹我是你姐姐别老说哦你是我是你妹妹你是我\t139889\nxvbu\t139890\n李娟\t139891\n李娜\t139892\n小萌度秘\t139893\n每位\t139894\n省事\t139895\n省予\t139896\n西班牙\t139897\n看别\t139898\n三少群\t139899\nliu1142573373\t139900\n听不清楚\t139901\n《文明\t139902\n嗯无奈\t139903\n一治\t139904\ndhsiwhvshd\t139905\n看到\t139906\n一沿\t139907\n大雁\t139908\n韩国仔\t139909\n那霸港\t139910\nnohengnonono\t139911\n大雅\t139912\n大集\t139913\n好可爱\t139914\n不足为奇\t139915\n156号\t139916\n晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕晕\t139917\n好啦怕了你了对不起\t139918\n定期\t139919\n故人\t139920\n69块\t139921\nRGB\t139922\n课长\t139923\n招待所\t139924\n咖喱屋\t139925\nRGJ\t139926\n张贵舰\t139927\nsbsb\t139928\n梭子\t139929\n爱不爱她不化片\t139930\n美男子\t139931\n急救\t139932\n听成\t139933\n启发\t139934\n听战\t139935\n5下\t139936\n故事\t139937\n秋后算账\t139938\n郑大四\t139939\n甘日蒲夜蒲\t139940\n民族风情\t139941\n洪辰\t139942\n动弹不得\t139943\n厚度\t139944\n实习费\t139945\n不我可怜\t139946\n贝多芬\t139947\n5218346859\t139948\n缪菊花\t139949\n大脑海了你\t139950\n猪八戒倆人\t139951\n宠物羊\t139952\n好吧好吧八八讨厌你讨厌你讨厌你讨厌你\t139953\n名办\t139954\n发片\t139955\n呢图\t139956\n初心者\t139957\nPC塔\t139958\n天鹅绒\t139959\n张博爽\t139960\n黄瓜味\t139961\nlemon\t139962\n司马缸\t139963\n澄清\t139964\n克麦隆神山\t139965\nvygugu\t139966\n剑一关\t139967\n科约\t139968\n马尼姑\t139969\ngaiqi\t139970\n统辖\t139971\n钥匙孔\t139972\n永大\t139973\n李传牛\t139974\n倾国倾城\t139975\n东方航空\t139976\n狮王\t139977\n演艺\t139978\n13567885530家\t139979\nhggzhsjw\t139980\n佳伟你在哪儿呢你在天空\t139981\n永失\t139982\n老娘老娘\t139983\n僵尸猪\t139984\n犹豫不觉\t139985\n願意嫁給\t139986\n明月几时有啊八千蚊擎天天\t139987\n永夜\t139988\n滨海高铁站\t139989\n几节\t139990\n郎冲\t139991\n接客\t139992\n半老半热打\t139993\nrsgjcftgc\t139994\njifj\t139995\n猪鬼\t139996\n羞羞我是你的时候讨厌我讨厌你说话的诗\t139997\n碟动会\t139998\n李庄乡\t139999\njify\t140000\njtbf\t140001\n更正向\t140002\n一一一一一一\t140003\n锐锐\t140004\n欢度佳日\t140005\n王女人\t140006\n六比三大\t140007\n严加政\t140008\n手镣\t140009\n冷冷冻\t140010\n手镯\t140011\n孔雀蝶\t140012\n程晓玥\t140013\n惹我生\t140014\n秀平\t140015\n张锋远\t140016\n3q否\t140017\n颜真\t140018\n一个人儿\t140019\n手长\t140020\n萌宠萌宠\t140
021\n七百\t140022\n会儿网\t140023\ncigp\t140024\nhjgqh\t140025\nfjic\t140026\ncigv\t140027\n小苹歌\t140028\njgjoj\t140029\njgjgjgtg\t140030\nbiao\t140031\n蝴蝶狗\t140032\n以撒全\t140033\nkcxfg\t140034\n中曼舞\t140035\ncigh\t140036\nmmmio\t140037\n咬牙切齿\t140038\n胖熊\t140039\n工龄\t140040\n巨响\t140041\n呃皇上皇上猫\t140042\n专门街\t140043\n包核算\t140044\n出场商\t140045\ngj4obux\t140046\n汉克\t140047\n细嚼慢咽\t140048\n丽华\t140049\n工本费\t140050\n十几公里\t140051\n乡亲\t140052\n最差\t140053\n为什摸\t140054\n蚂蜂窝\t140055\n僧众\t140056\n办生\t140057\n稿纸\t140058\n110404\t140059\n猪头儿\t140060\n56225888336\t140061\n悦妈怀孕日记\t140062\n我有你\t140063\n若琳\t140064\n叫雞\t140065\n九点到九点\t140066\n17876650822\t140067\ndaben\t140068\n上三年\t140069\n千万条\t140070\n电影剧本\t140071\n尾号三零六\t140072\n科幻类\t140073\n迅猛龙\t140074\n靠靠靠靠靠靠靠靠靠靠靠靠\t140075\nrxghflkj\t140076\n勇猛的力士\t140077\n点点点点点点点点点\t140078\n办男\t140079\n幹活發貨\t140080\n听不可以\t140081\n食言在先\t140082\n介么的干挖心\t140083\nvdsrewwrf\t140084\nVHGDRH\t140085\n疯机器人\t140086\n温哥华机场关\t140087\n瞎编\t140088\n宋兴昕\t140089\n紧绷感\t140090\nxiah不顾\t140091\n少门\t140092\n苏玉权健\t140093\n快进来\t140094\n秘想死\t140095\n善在\t140096\n讲价\t140097\n讲件\t140098\nkulul\t140099\n点读\t140100\n二O一\t140101\n当才说\t140102\n文化节#\t140103\ntouch\t140104\njjjnnnn\t140105\n来了测\t140106\n一垛\t140107\n三四年内\t140108\n我喜欢者\t140109\n康泰十国傲；十一乐赏菊花俏\t140110\n何来政\t140111\nffbhggj\t140112\n构件\t140113\n刘宇菁\t140114\n嗯骨\t140115\n抗氧化\t140116\n入关\t140117\nXiaoCaige\t140118\nchinkyoutomitsonthe\t140119\nhellohellomily\t140120\nhttp44ztztcomxiazaiquxunleichangpian66530html\t140121\n心里爱\t140122\n婀婀袅袅\t140123\n叶公好龙\t140124\n耽误\t140125\n搭搭\t140126\n腕式血压计\t140127\n文艺文\t140128\n残忍我的字典\t140129\n刘胡兰\t140130\ntdrtygfytt\t140131\n嘉德\t140132\n战神\t140133\n晓杨\t140134\n柳絮飞\t140135\n我的心爱\t140136\nPbssts\t140137\n左侧\t140138\n晓松\t140139\n侧壁\t140140\n拉莫日子\t140141\n败坏\t140142\n间歇\t140143\n四面的天空\t140144\n彤彤一\t140145\nlhgdjtsxkrslhrdjltldyuldluyfkyfljzhf\t140146\n减刑\t140147\n虚脱\t140148\n要家庄\t140149\n霞数\t140150\n想要你了我想\t140151\n我叫谁谁谁你叫谁谁谁你是谁的谁我是谁的\t140152\n220度\t140153\n敢一问三不知\t140154\n赵思思\t140155\n湖北话\t140156
\n幸福中\t140157\n起码片\t140158\n强身\t140159\n度秘喽度秘\t140160\n倾城之泪\t140161\n1558元\t140162\n好啦好啦我不要你\t140163\n120509KBS\t140164\n什么否\t140165\n天天飞车\t140166\n阿姐儿\t140167\nHhjgh\t140168\n万年总攻吗那千年小受\t140169\n4964\t140170\n小朋友\t140171\ndudhj\t140172\n如诗\t140173\n过年了你打算\t140174\nSvenisbs\t140175\ndudhb\t140176\n卵巢吧兄弟第二季\t140177\n取款机\t140178\n带枪\t140179\njpugufiglhlh\t140180\n胥岸\t140181\n齐伯涛\t140182\n劝导\t140183\n不尔\t140184\n调节剂\t140185\nMEETING\t140186\n冷水澡\t140187\n如说\t140188\n图具体\t140189\n首别爱\t140190\n咯莫\t140191\nSMCody\t140192\n荣荣\t140193\n清宿\t140194\n太阳花\t140195\n四把\t140196\nn六个\t140197\n糁汤油饼\t140198\n可行性\t140199\n清宫\t140200\n和美\t140201\n杨缺席\t140202\n龙龙华\t140203\n想想\t140204\n信赖\t140205\n厉害分手v\t140206\n季节性皮炎\t140207\n爱在春天\t140208\n爱李\t140209\n萌大\t140210\n5万\t140211\n瓜子湾仔\t140212\n不之到\t140213\n182418246931\t140214\n顾uu\t140215\n耐阴\t140216\n明恋\t140217\nJyv\t140218\n我是谁我是谁我是谁我是谁我是\t140219\n拘谨\t140220\n礁滩\t140221\n比悄\t140222\n暗黑3\t140223\n太帅了帅的帅瞎我的双眼\t140224\ns代\t140225\n杀姻\t140226\n神农架\t140227\n阴天\t140228\n阴夫\t140229\n胜客\t140230\n开采\t140231\n点花花\t140232\n打骗\t140233\nhaaailpo\t140234\n我的了你知道\t140235\n一家架\t140236\ncfdgfgfffgf\t140237\n我好恨好恨好恨\t140238\n打骂\t140239\n安有\t140240\n少得病\t140241\n225565\t140242\n度秘我想\t140243\n易得\t140244\n外文\t140245\n刘心\t140246\n逼不得已\t140247\n百次\t140248\n贯彻\t140249\n自已\t140250\n自己\t140251\n尼里奥\t140252\n铁佛s\t140253\n安本\t140254\n掩藏\t140255\n大蚣\t140256\n唐怡静\t140257\n要不听\t140258\nImiss\t140259\n反问句\t140260\n飘柔\t140261\n劫犯\t140262\nChhcatgag\t140263\n过新年\t140264\n冰仔\t140265\n一百多次\t140266\n世波\t140267\nojh\t140268\n分卷\t140269\n掐算\t140270\n闫琮仁\t140271\n巴雷托一二哈皮了得兔呦\t140272\n大骨汤头\t140273\nheime\t140274\n当心\t140275\n557875\t140276\n汗管\t140277\n100多辆\t140278\n要不吃\t140279\n酷韩\t140280\n4项\t140281\n小菊花\t140282\n要不吗\t140283\n大坏蛋大坏蛋大坏蛋蛋蛋蛋蛋蛋大混蛋\t140284\n2685509702\t140285\n疯狂类\t140286\nrncurn\t140287\n7月10日-14日\t140288\n期末测试卷\t140289\n铁虎将\t140290\n小死孩\t140291\nN年\t140292\necnmlgb\t140293\n左里奇\t140294\n魅力\t140295\n4075万美元\t140296\n正身处\t140297\n好想有个你\t140298\n分居了了了了了了了了了\t140299
\n10月1日起\t140300\n18794534629\t140301\n骂街\t140302\n江口\t140303\n横幅\t140304\n魔法校\t140305\n在我心里\t140306\n逃窜\t140307\n日日操劳\t140308\n心寒失望\t140309\n性感感\t140310\n商业秀\t140311\n1245567\t140312\n啦啦啦啦啦啦我是妈妈小码将么么么么么么么\t140313\nhhhhhhhhhhhhhhhhhhhｕｉｉｉ\t140314\n我没有伤你的心度秘\t140315\n首孝悌次\t140316\n唐孝伟\t140317\n核桃茶\t140318\n郭瑶\t140319\n眼镜儿\t140320\nmp3\t140321\n围捕\t140322\nCool\t140323\n黄莺\t140324\n韩栋\t140325\n求你了真讨厌\t140326\n55556556ggghcugghgjhfcFxsjtdhf\t140327\n熊大拯\t140328\n王颖鑫\t140329\n席晓琪\t140330\n周角\t140331\n柳堡\t140332\nRalston\t140333\n舍命\t140334\nJJMM\t140335\nlebaby\t140336\njggffr\t140337\n38383838\t140338\n第一门\t140339\n食欲\t140340\n么昂神\t140341\n紫鲸\t140342\n一百千米\t140343\nmpd\t140344\n小鸟\t140345\nmpa\t140346\nvery\t140347\nMynameislucy\t140348\nmpm\t140349\n李瑛\t140350\n32一三38\t140351\nmpw\t140352\nabcdfj\t140353\n58958\t140354\n想象版\t140355\n真的可以\t140356\n欹器\t140357\n1pp\t140358\n赵娃子\t140359\n图图图图图\t140360\nabcdfg\t140361\n湖州街\t140362\n李瑶\t140363\n哈哈男\t140364\n给给给给给给给给给蹦蹦蹦\t140365\n总干\t140366\n哈哈甸\t140367\n灰尘\t140368\n姐角\t140369\n打写\t140370\n疆组\t140371\n1111118000000000000\t140372\n打冒\t140373\n耕耘\t140374\n黄赌\t140375\n万里无云\t140376\n20042003年\t140377\n农历七月初一\t140378\n王王玺\t140379\n停播\t140380\n元大都\t140381\n退保\t140382\n484583\t140383\n退一步海\t140384\n哈生\t140385\n答咚\t140386\n网路\t140387\nTANK\t140388\n见花喝一斗\t140389\n邓鹏彬\t140390\n言子旁\t140391\n束花\t140392\n2575858534\t140393\n1000条\t140394\n了不了的爱\t140395\ndaon\t140396\n龙珠直播\t140397\n649378679679689689565996798968986896895686599569687678619485464949464964943644964659595953659565\t140398\n秃顶\t140399\n奔弛\t140400\n陈琼\t140401\n哎哟鱼\t140402\n十四条\t140403\n11111111666666\t140404\n最好别\t140405\n显弱\t140406\naqorauaw\t140407\n天那\t140408\n西巴布亚省\t140409\n长唱\t140410\n坚信\t140411\n奥斯丁市\t140412\n全子\t140413\n王宠爱\t140414\n大屿山\t140415\n皮片\t140416\n平卧\t140417\n正能梁\t140418\n那些花儿\t140419\n暧昧4you\t140420\n乜读\t140421\n黄志录\t140422\n品尝\t140423\n扭扭捏捏\t140424\n很不对\t140425\n不可唉\t140426\n级号\t140427\n活期宝\t140428\n3e8\t140429\n李金元\t140430\n4541\t140431\n副局长\t140432\n密陪\t14
0433\n去污\t140434\nvide\t140435\n讲相\t140436\n很高涨\t140437\nClarks\t140438\n应用者\t140439\n151292136271664310987653\t140440\n凶多吉少\t140441\n金典有机我是歌手第四季珏镇\t140442\n188.6亿元\t140443\n79天\t140444\n24678咕嘎咕嘎猪猪猪猪侠叔\t140445\n新我喜欢你的的爱真是你魔域新的四小度度秘\t140446\n救人员工们\t140447\n给我建\t140448\nvuddd\t140449\n不要害羞\t140450\n邓婕\t140451\n酒美\t140452\n颜美桃\t140453\n狮子女\t140454\n这用种\t140455\nvare乔\t140456\n单摆\t140457\n3er\t140458\n神殿\t140459\n我的一讨厌我我也讨厌你是度秘\t140460\n12332445\t140461\n气味\t140462\n模儿\t140463\n白仔玉\t140464\n邹恒甫\t140465\n御载\t140466\n拍案\t140467\n也不曾\t140468\n北京君正\t140469\n三峡核人江\t140470\n二零一八零一\t140471\nJgktyjh\t140472\n巴伦太\t140473\n耿熊樊\t140474\n这个证\t140475\n一个一四\t140476\n远想冬雨\t140477\n这个词\t140478\n乞食\t140479\n快点我和美世界等你\t140480\n利马\t140481\nducgi\t140482\n洋文\t140483\n娅\t140484\n不爱你\t140485\n吹捧\t140486\n破烂机\t140487\n入木三\t140488\n事变\t140489\n15935718245369248651379\t140490\n过移\t140491\n事发\t140492\n躬婧\t140493\n武动\t140494\n晚秋\t140495\n海南海\t140496\n五十音图趣味记忆法\t140497\n七十岁\t140498\n大无脑\t140499\n乱入\t140500\n陈逸华\t140501\n前桌\t140502\n1050\t140503\n5356\t140504\nufjvnnjfx\t140505\n石文慧\t140506\n花花拉\t140507\nugchv\t140508\njkddv\t140509\nnbi1般\t140510\n小佳子亲\t140511\n戏份\t140512\n事只\t140513\n我一问\t140514\n凤凰姐\t140515\n渔政船\t140516\n太理性\t140517\n黑澳洲莱科youstoimpostrocomostorsenarankz慢ameosempvelyft\t140518\n武功\t140519\n事句\t140520\n没遗憾\t140521\n十四块\t140522\n武力\t140523\n恋萌萌哒\t140524\n九十元\t140525\n轮狗\t140526\n未知数\t140527\np4772\t140528\n名不副实\t140529\nhttpfhiphotosbaiducomxiaodupicitem0e2442a7d933c8956cc68b0fd61373f08202000djpg\t140530\ngasd\t140531\nI户\t140532\n乔虹\t140533\n墙内\t140534\n惊慕\t140535\n始终不渝\t140536\n张雨绮\t140537\n华岩寺\t140538\n平安大街\t140539\n还给我你说\t140540\n我讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t140541\n木干\t140542\ne21\t140543\n林置业\t140544\n大风扇\t140545\n彭浦轩\t140546\n九十八\t140547\n录给\t140548\n九十六\t140549\n狗队\t140550\nDqqe\t140551\nscsf\t140552\n五十，五\t140553\n头能\t140554\nscsh\t140555\n在东秘秘你在干嘛\t140556\n不給力\t140557\n凸现\t140558\n王绉龙\t140559\n不愉快\t140560\n任务度\t140561\n闫肃\t140562\n灭族\t140563\n再战\t140564\n扈潇艺\t140565\ngd
hkifxsn\t140566\n鬼佬\t140567\n谭昂\t140568\n酒波\t140569\n擦了擦了擦了擦了擦了擦\t140570\n深蓝色\t140571\n猪臭猪\t140572\n企业届\t140573\n兼任\t140574\n偷笑]\t140575\n爱不由己\t140576\nfinec\t140577\npaull\t140578\n酒泉\t140579\n佳泰\t140580\n我爱你的是心而不是你的对我有疑问\t140581\n刘胖\t140582\n上夜晚\t140583\n一天明\t140584\n告诉我我也不知道\t140585\n天天仙居天星君\t140586\n向盘\t140587\n来了再见\t140588\n八三八幺六零七三\t140589\n锦涛\t140590\n定垠\t140591\n意思你喜欢我了那\t140592\n封神榜\t140593\n觉悟性\t140594\n1gb\t140595\n670平方米\t140596\n英国皇家芭蕾舞团\t140597\n712名\t140598\nSFS\t140599\n定型\t140600\n一本架\t140601\naiwysn\t140602\nzgg\t140603\n好朋支\t140604\nqjxhhshxhxjsjsbxxjjakjsnxjxixsjhxwjw8nndosjs9djcmdBODODDDoOBoOo\t140605\njhj\t140606\n美容觉\t140607\n你的名\t140608\n韦唯\t140609\n弄昏\t140610\n耐卡\t140611\n秦皇岛\t140612\n六干多\t140613\n有风不风\t140614\n小本生意\t140615\nlbvpc\t140616\n启幕\t140617\naq咪\t140618\nsehg\t140619\n任何分\t140620\n拳头\t140621\n刘成杰\t140622\n亚洲制药\t140623\n训练兵\t140624\n你的启\t140625\n一千多分钟\t140626\n走街串巷\t140627\n八十分\t140628\n叶童\t140629\n升序\t140630\n谢敦贤\t140631\n叶竹\t140632\n昨天晚上7点\t140633\n李果檬\t140634\n受得住\t140635\nmpmg\t140636\n体积\t140637\n国版\t140638\n划时代\t140639\npuj\t140640\n纸钱\t140641\n清热\t140642\npuf\t140643\npug\t140644\n待机\t140645\ns3cys3\t140646\n国片\t140647\npuc\t140648\n憨豆龙\t140649\n尿泼\t140650\n我叫郭子传\t140651\n王永明\t140652\nput\t140653\nxo型\t140654\n黑子\t140655\n谢我干甚\t140656\n挂蝌\t140657\n自买自夸\t140658\n黑孔\t140659\n双叶杏\t140660\nwww52xz\t140661\nmpm2\t140662\n恩比\t140663\n生蛆\t140664\nAppStore\t140665\n亟待\t140666\n度秘还度秘\t140667\n天天我的手机你好呀\t140668\n拯救困\t140669\n上午9点\t140670\n草船借\t140671\n独立你爆音\t140672\n标志\t140673\n重重\t140674\n杂来来\t140675\n红岑黄村\t140676\n潘姐\t140677\n彩象谦\t140678\n五六0点\t140679\n重金\t140680\n十次方\t140681\n绝路\t140682\n哇偶\t140683\n教科目\t140684\n湛江\t140685\ntqkg12dgjtkgaha1c\t140686\n自谋\t140687\n嗯不不\t140688\n格角村\t140689\n便秘型\t140690\n谁是你老乡呀讨厌讨厌真讨厌讨厌讨厌真讨厌讨厌讨厌淘淘淘淘\t140691\n三十周\t140692\n你好帅呀我是你的姐姐\t140693\n富家\t140694\n朱凯逸\t140695\n副职\t140696\n你的讨厌\t140697\n冬蜜\t140698\n八月迷情\t140699\n陈凯乐\t140700\ndocious\t140701\n新桥村\t140702\n听不着\t140703\n郭召杰\t140704\n改改改\t140705\n双囍\t14
0706\n谢谢你萌萌哒机器人度秘\t140707\ndrddertt\t140708\n初八欢喜\t140709\naaoopp\t140710\n白玫瑰\t140711\n第1句\t140712\n角落\t140713\n幺零零幺零三幺\t140714\n刘艳龙\t140715\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t140716\n就能不能\t140717\n刘玉娇\t140718\n第20cm\t140719\n打基础\t140720\njuice\t140721\nHghh\t140722\n火箭坤\t140723\n２４日晚\t140724\n15225312542855\t140725\n问你件事六不上学匣子\t140726\n传祺\t140727\n二维数组\t140728\n吕颂贤\t140729\n打白狼\t140730\n拿完\t140731\n棒棒哒\t140732\n中饭\t140733\n债见\t140734\njalaaabblllj\t140735\n商末孤竹君\t140736\nfyey\t140737\n快说呀牌\t140738\n过年差\t140739\n传神\t140740\n林志堂\t140741\n兴隆百货\t140742\n周爷爷\t140743\n乜唱一首歌\t140744\n部件\t140745\n谢琦\t140746\n薛金星\t140747\n132232\t140748\nrhs\t140749\nrhr\t140750\n农壳\t140751\n名次\t140752\nrhw\t140753\n恩白\t140754\n消化不良\t140755\n一一一一一一一\t140756\n刻板\t140757\nrhj\t140758\n吓唬人\t140759\n十年间\t140760\n表哥哥\t140761\nThf\t140762\nThe\t140763\nrhb\t140764\nrhe\t140765\nrhd\t140766\nrhf\t140767\n不倒翁\t140768\n志林\t140769\n采儿\t140770\n伦敦西\t140771\n敦促\t140772\n齐连山\t140773\n排名\t140774\n花溪\t140775\n沃支付\t140776\n图度\t140777\n银河系\t140778\n1.36千克\t140779\n沙尘暴\t140780\n探探\t140781\n15959317229\t140782\n翠峰冲天流泉漫地\t140783\n几怕\t140784\n亚平\t140785\n啦啦啦啦啦啦ooo爱啦啦啦啦啦oo\t140786\n100万倍\t140787\n新色\t140788\n五厘米\t140789\n诶美\t140790\n甜不甜\t140791\n安眠曲\t140792\n乘兴\t140793\n靠不上\t140794\n习题\t140795\n2乖乖vhjijg\t140796\n放电视\t140797\n行礼\t140798\n王贻福\t140799\n北极首\t140800\n至情者\t140801\n贫富贵贱\t140802\n180千米\t140803\n奥改天\t140804\n发售\t140805\n97丝毫\t140806\n市委书记\t140807\n雷达波\t140808\n开间房\t140809\nCJshevslugU555255Kass\t140810\n否定\t140811\nhhyh\t140812\n辛金能克乙木\t140813\n冻头\t140814\n铁猫\t140815\nstratiom\t140816\n自我价\t140817\n迷呀咪呀咪\t140818\n面向\t140819\n首捉泥鳅\t140820\ngugigigobovovovucyf\t140821\n我在马勒戈壁的青春放浪\t140822\n异鹿晗\t140823\n干碎觉\t140824\n铁猛\t140825\n少林寺\t140826\n氯气\t140827\n伴有\t140828\n理恨\t140829\n101.85\t140830\nHgfcgjjidxG\t140831\n五台县\t140832\n堪比\t140833\n猪头肉肉\t140834\n这个世界\t140835\n皇城\t140836\n喷发\t140837\n哈讨厌\t140838\n在哪里\t140839\n沈小岑\t140840\n一天一点\t140841\n圆领\t140842\n雨斯\t140843\n区府\t140844\n意淫\t140845\n六百来块\
t140846\nCatapult\t140847\n5轮\t140848\n37万平方公里\t140849\n坪山\t140850\n区度\t140851\n想了死\t140852\n0.8\t140853\n四平方\t140854\n后羿\t140855\n最简比\t140856\n黄佳钿\t140857\n刘小公主\t140858\n精力充沛\t140859\npeapeat\t140860\n少假\t140861\n生情\t140862\n周周有礼\t140863\n我是你的妹妹月月\t140864\n有不爱你我我恨你\t140865\n口令\t140866\n溜须\t140867\nMiMi\t140868\n等间会厌\t140869\nmccafe\t140870\ntfhxfhrythgtnqsgfzusngrdngdughkfkhvmccvgbvulhvjr\t140871\n心平气\t140872\n及茾\t140873\n宋约会\t140874\n续费\t140875\n偷太上老君册\t140876\n夏目\t140877\n好了我\t140878\n多变你猜我\t140879\n4648455\t140880\nMALIBU#收官之战\t140881\n囫囵吞枣\t140882\n手足情深\t140883\n999个\t140884\n堵死\t140885\n几紧\t140886\n孜然\t140887\n6月17日\t140888\n看不明白\t140889\nlikes\t140890\n摩耶\t140891\n哦i\t140892\n施佳佳\t140893\n溪口证明\t140894\n咖啡厅\t140895\nmaybe\t140896\n吃吃吃\t140897\n死老鼠眼\t140898\n愣头愣脑\t140899\n朱宁\t140900\n撩起\t140901\n走人去\t140902\n韦森\t140903\n涵韵\t140904\n张金华\t140905\n死呆子\t140906\n大竹峰\t140907\n拱乱顶\t140908\n艾露涵\t140909\n内定\t140910\n主胜\t140911\n流过\t140912\n机棉\t140913\n哦9\t140914\n五四六九六三\t140915\n卡布西游\t140916\n淫才\t140917\n爱的情人\t140918\n返现\t140919\n流连\t140920\n流进\t140921\n北迈\t140922\n长长长\t140923\n跳起来\t140924\ncllt\t140925\n内宅\t140926\n五二零\t140927\n张垚璐\t140928\n待遇\t140929\n天门一号\t140930\n内容\t140931\n茱萸\t140932\n场所\t140933\n交心\t140934\nIBHHIJHVBHBHJBHHUVHIVBJJVHJBHJBHJJJJK\t140935\n80多n\t140936\n刺探\t140937\nC罩杯\t140938\nhdjnbid\t140939\n陈希米\t140940\n平遥县\t140941\n封禅\t140942\n翻供\t140943\n操现\t140944\n兄弟学校\t140945\n王雪薇\t140946\n错怪\t140947\n六八二六八五九九\t140948\n196千米\t140949\n辰辰\t140950\n后排\t140951\n救生筏\t140952\n李淑慧\t140953\n1501220841\t140954\n教程元\t140955\nzrjghf\t140956\nMeiW\t140957\n第一段\t140958\n莲子\t140959\n超杰\t140960\n第几度\t140961\n张艺帆\t140962\n3113683055\t140963\n黑非洲\t140964\n分工\t140965\n黄家强\t140966\nSurface\t140967\n雪弗兰\t140968\n落井下石\t140969\n申请项\t140970\n许小\t140971\n春生\t140972\nshan\t140973\nshao\t140974\nused\t140975\np哥\t140976\n太阳历\t140977\n三百分钟\t140978\n1.3万亿\t140979\n种苗\t140980\n度爷\t140981\n恩刚\t140982\n度爹\t140983\n泊来\t140984\n脸点\t140985\n乐趣性\t140986\n过节费\t140987\n人在囧途\t140988\n栏目们\t14098
9\nxceo\t140990\n恩别\t140991\n惊闻\t140992\n民装\t140993\n滑油\t140994\n海嗨\t140995\nnonononono\t140996\n暴笑\t140997\n事实时\t140998\n年化\t140999\n煤夫\t141000\n2jemt\t141001\n秘技\t141002\npiey\t141003\n才要了我你秘\t141004\n上官韵雪\t141005\n第455\t141006\n完美味\t141007\n吉杰\t141008\n街拍\t141009\n四五幺\t141010\n哪哪\t141011\n心潮\t141012\n韩国女团\t141013\n哪哦\t141014\n伸长\t141015\n淘好大好大好大好大好大好大\t141016\n林哥\t141017\n妖神\t141018\n小从小\t141019\n886886886886886\t141020\n春卷\t141021\n三国演义读后感\t141022\n倪成龙\t141023\n疼爱\t141024\n七点晚\t141025\n6。20\t141026\n好脾气\t141027\n李东健\t141028\nssayatio\t141029\n刺普菲\t141030\n温柔的爱我一下\t141031\n手摸\t141032\n王加睦\t141033\n新闻\t141034\n狸猫\t141035\n顶天\t141036\n那你可不是人哈哈哈有本事\t141037\n对三\t141038\n白羊2016\t141039\n惹我了\t141040\n笑的好开\t141041\n谭德才\t141042\n四五年\t141043\nLearning\t141044\n古文\t141045\nyyrr\t141046\n春华\t141047\n花千骨变妖\t141048\n我的哥哥\t141049\n60座\t141050\n窝蛋豆腐羹\t141051\n诗兴\t141052\n节水\t141053\n复活\t141054\n火土\t141055\n小爱爱\t141056\n能看到\t141057\n当仙\t141058\n七七五零\t141059\n揭瓦\t141060\n7#\t141061\n房爱冲\t141062\n过多久\t141063\n图们\t141064\n紫儿喏\t141065\n分集\t141066\ndlkg\t141067\n我不在\t141068\n下不下雨\t141069\n下不下雪\t141070\n节气\t141071\n薯页\t141072\n爸爸堡\t141073\nkIOoOmunl\t141074\n三厘米\t141075\n又说\t141076\n王涵静\t141077\n股息\t141078\n核燃料棒\t141079\njsm\t141080\n疯狂猜成语\t141081\n金战\t141082\n胸毛\t141083\n二姑父\t141084\nvxfjkn\t141085\ncgk\t141086\ncgm\t141087\n半几两\t141088\ncgn\t141089\njsh\t141090\ncgc\t141091\ncgb\t141092\n毕业说话了我要觉觉\t141093\n走火入魔\t141094\ncgg\t141095\ncgf\t141096\n韩光明\t141097\n塔克拉玛\t141098\njsj\t141099\n旅舍\t141100\n82点\t141101\n一剪梅\t141102\n喜羊\t141103\ncgs\t141104\n片人\t141105\ncgu\t141106\ncgt\t141107\n不爱你过\t141108\ncgv\t141109\n76\t141110\n纯银\t141111\n105颗\t141112\n74\t141113\n很聪明才智\t141114\n片五\t141115\n标清\t141116\n语文题\t141117\n猪型机器人\t141118\n卢春任\t141119\n不行好搞笑\t141120\n对不眠\t141121\n博爱小学\t141122\n一班\t141123\n70\t141124\n度秘月了你好开心\t141125\n我娘娘我女儿我你爱我\t141126\n乖张\t141127\n污言秽语\t141128\n复印件儿\t141129\n测员\t141130\n蓝旗袍\t141131\n别这样黑\t141132\n花牙\t141133\n嗯玛尼亚公主\t141134\nnn325858556355243363一Qelc\t141135\n一张处
\t141136\n花版\t141137\n一八年十一月\t141138\n9日下午6点\t141139\n陈鹏霞\t141140\n凯爷\t141141\n疯狂大了那后天\t141142\n我可以吧秘密告诉你\t141143\n杜尔哥\t141144\n自卫队\t141145\n新建\t141146\n汪洋\t141147\n哎诶\t141148\n手扶\t141149\n这一样\t141150\n2.2万\t141151\nghjg\t141152\nfryyf\t141153\n60CM\t141154\nghjk\t141155\n首卡\t141156\n各不同\t141157\nuhbbkmnfn\t141158\n笨猪└\t141159\n7200斤\t141160\n收款付款\t141161\n笑果\t141162\n广片\t141163\n首单\t141164\n四个男的我需要个女的做我的老婆\t141165\n去膜拜\t141166\n不甘心\t141167\n123456789360\t141168\n没高\t141169\n呜呜什麼\t141170\n42386\t141171\n小陈儿\t141172\n漫卷\t141173\n差臭馍馍\t141174\n再攀高峰\t141175\n想你当然\t141176\n夏梦\t141177\n味儿\t141178\ndoest\t141179\n贵紫荆\t141180\n什么间\t141181\n别罗里八索\t141182\n集大赞\t141183\n子女\t141184\n警长\t141185\n闻秘\t141186\n海苔\t141187\n音素\t141188\n崔艺馨\t141189\n四里人\t141190\nPjxfg\t141191\n515132\t141192\nc飞\t141193\n假的秘\t141194\n高红玉\t141195\n西芒\t141196\n有点傻\t141197\n子好\t141198\n不用你了看\t141199\n15994024825\t141200\n补得\t141201\n流离失所\t141202\n个街\t141203\n独孤\t141204\n几哪\t141205\n烟嘴\t141206\n育萱\t141207\n爆表\t141208\n怪硬来\t141209\nejhiwihshs\t141210\n三百九十八一百个\t141211\n日豪豪\t141212\n韩媒\t141213\n湿人\t141214\n骗者\t141215\n为甚\t141216\n你是谁你是何人这是鲜花啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t141217\n望天树的空中走廊\t141218\n几哈\t141219\ngrgd\t141220\n炮艇\t141221\n度种\t141222\n尿频\t141223\ngw\t141224\ngv\t141225\ngu\t141226\ngt\t141227\ngs\t141228\ngr\t141229\ngq\t141230\n兽族\t141231\n灰太累\t141232\n大师兄\t141233\n洪卓凡\t141234\ngz\t141235\ngy\t141236\ngx\t141237\ngg\t141238\ngf\t141239\nge\t141240\ngd\t141241\ngc\t141242\ngb\t141243\n浴室\t141244\ngo\t141245\ngn\t141246\ngm\t141247\ngl\t141248\ngk\t141249\ngj\t141250\ngi\t141251\ngh\t141252\ngV\t141253\n农户\t141254\n尿垫\t141255\n胡佳玥\t141256\n5@0\t141257\ngX\t141258\n王凯俊\t141259\n18家\t141260\ngO\t141261\n十八一岁\t141262\n卡咖啡发。tumvsdfny\t141263\ng7\t141264\ng6\t141265\n许莺莺\t141266\ng3\t141267\ng2\t141268\ng1\t141269\n嗯昂\t141270\n国务院发展研究中心\t141271\n随遇里\t141272\n嗯明\t141273\n新南国\t141274\ng8\t141275\n纳新\t141276\nocto\t141277\n天仙秘\t141278\n酸菜\t141279\n风气\t141280\n背等\t141281\n八七二八\t141282\n度秘度秘门\t141283\n孩之宝\t141
284\nvgghfydtdvv55422\t141285\n电蚊\t141286\n湘东\t141287\n私密性\t141288\nghjgghh\t141289\n0.18%\t141290\nhhgggh\t141291\n杨乐馨\t141292\n芮成钢\t141293\n天然气\t141294\n卡券\t141295\nSpa\t141296\n伊塔卡\t141297\n乱七八糟\t141298\n京东苏宁\t141299\n度秘你好吃豆沙包子\t141300\n人保\t141301\n实属\t141302\n３分钟\t141303\n00ppjj\t141304\n云南人教\t141305\n85分\t141306\n黄春菊\t141307\n单晓杰\t141308\n薛宇\t141309\n芒果果冻\t141310\n抓讲\t141311\n李晨哥\t141312\n㱮\t141313\n62101063\t141314\n么么萌萌滴你\t141315\n头部\t141316\n赵国\t141317\n18331305460\t141318\n降噪\t141319\n万荣度秘\t141320\n推搡\t141321\n人信\t141322\n四年前\t141323\n胡蕾美\t141324\n后腿\t141325\n推搪\t141326\n3youlit\t141327\njmwdn\t141328\n价高质次\t141329\n忘语\t141330\n早权权\t141331\n貓貓\t141332\n毛瑟网\t141333\n七名\t141334\n睡不着数\t141335\n波浪号\t141336\n周圣月\t141337\n见智\t141338\n郊游\t141339\n9844\t141340\n听说过来\t141341\n二百吗\t141342\n看过心救援\t141343\nPUMA\t141344\n爱因斯塔\t141345\n啦啦啦啦啦啦多咪咪\t141346\n任志强\t141347\n知错\t141348\n享有\t141349\n该死的是你你别骗我了了绣\t141350\n卡酷卡酷\t141351\nGVvvvvvvvv\t141352\n大型\t141353\n许馨淼\t141354\nWonder\t141355\n原水表厂\t141356\n蛋清\t141357\nfjgvb\t141358\n有性\t141359\n途运来玉米面粉\t141360\n有怪\t141361\n周雨轩\t141362\n惠威\t141363\n隋晓云\t141364\niijjkjbj\t141365\n电笔\t141366\n步步高点读机\t141367\n幼稚鬼\t141368\n遥长\t141369\n光法\t141370\n发段\t141371\n有怀\t141372\n堂哥\t141373\n槽糕\t141374\ntufhghjghthjh\t141375\n喂喂喂恩恩\t141376\n龟兔赛跑\t141377\n土货\t141378\n北师大附中\t141379\n有怕\t141380\n代位\t141381\n小年幼\t141382\n砍手\t141383\n假猪\t141384\n吴成阳\t141385\n折叠式\t141386\n031234634113466777666666666665655555556666677666666666666666666666666666666666678887887877754567655565655555555555555555555555555555555555555555555555555555555556\t141387\n利所\t141388\nVOA\t141389\neyhrgjki\t141390\n慢慢熊小不点六的亚洲女\t141391\n定速\t141392\n不计其数\t141393\n指出\t141394\n不是说你是我的妈妈我的妈妈姓\t141395\n犀沟利\t141396\n警官\t141397\n非常完美\t141398\n付文卓\t141399\n江苏省宿迁市钟吾国际学校\t141400\n05504433185\t141401\n瞒索\t141402\n偶书\t141403\n5454421545258565225\t141404\n一月几号\t141405\n花魁\t141406\n回到家乡惊喜\t141407\n你好你好猪你好猪你好猪你好猪你好猪你好猪你好猪你好猪六合猪你好猪的猪聊猪你好猪你好猪你好猪啊猪\t141408\n高斯\t141409\n补牢\t141410\n移动办公\t141411\n高新\t
141412\n私信\t141413\njie.pn/p_IdE8EW0D\t141414\n天外石\t141415\n哈斯曼\t141416\n梁锡元\t141417\n郁闷手不家\t141418\n6盘\t141419\n爱不哭\t141420\n三剑\t141421\ntfi5\t141422\n高文\t141423\nshdbshx\t141424\n误嫁\t141425\n械学院\t141426\n南开营业厅\t141427\n头脑海报纸\t141428\n补牙\t141429\n表伯\t141430\n灰仓鼠\t141431\n28一个\t141432\n算甚\t141433\n125585585\t141434\n嘴鱼\t141435\n班风\t141436\n逼逼\t141437\n周灏博\t141438\n文天祥\t141439\nKcjccchb\t141440\n领化\t141441\n爪儿\t141442\n班飞\t141443\n蓊郁\t141444\n一运\t141445\n一连\t141446\n清澈见底\t141447\nPiaf\t141448\n近途\t141449\n女士\t141450\n敬请\t141451\n卢佳欣\t141452\n2225555\t141453\n冶山\t141454\n吴系\t141455\n猪呆\t141456\n真名\t141457\ngjdsc\t141458\n猪呀\t141459\n下班\t141460\n金螎\t141461\n猪呗\t141462\n女声\t141463\n2010年9月\t141464\n金融\t141465\n你好无聊\t141466\n吵没\t141467\n用水量\t141468\n56个\t141469\n小机巴\t141470\n浴盆\t141471\nUHFCW\t141472\n1.25米\t141473\n一句句\t141474\n唯恋\t141475\nhdyehhxhfj\t141476\n一个10岁\t141477\n上百万\t141478\n扑灭\t141479\n档案簿\t141480\n度秘度秘你好啊我是五月花\t141481\n哥哥哥\t141482\n张立科\t141483\n和而喻\t141484\n华天道\t141485\n像样的话\t141486\n天共一夜\t141487\n这块儿\t141488\n凝固\t141489\n明明天的明\t141490\nT-Bag\t141491\n上百个\t141492\n基谁信\t141493\n余仁生\t141494\n不齿\t141495\n三个儿\t141496\n樱妮卡技术部\t141497\n抢嫁\t141498\n贝利\t141499\n王哈哈\t141500\n美好的回憶\t141501\n马婉莹\t141502\n谢谢人机器人洞\t141503\n025\t141504\n024\t141505\n027\t141506\n026\t141507\n020\t141508\n022\t141509\n含糖量\t141510\n028\t141511\n葛文慧\t141512\n夹住\t141513\n不见你的自己\t141514\n不争气\t141515\n12523元\t141516\n河南战役\t141517\n醒悟\t141518\n咔嚓\t141519\n二分法\t141520\n261884648015168\t141521\n二点六\t141522\n479票\t141523\n绝情我更绝尽\t141524\n韩欣雨\t141525\n10060467\t141526\n红私房菜\t141527\n赵梦阳\t141528\n脑筋急转弯小红\t141529\n六零年\t141530\n宋慧牙\t141531\n四十五度\t141532\n120顿\t141533\n加建华\t141534\n客人们\t141535\n宵夜摊\t141536\nkdkek\t141537\n白玉王\t141538\n不可我去私\t141539\n紧存\t141540\n下午三点半\t141541\n袁云涛\t141542\n竹秀\t141543\n九年\t141544\n杨颖美\t141545\n100000000000000000000000000000000000000000000000000000000000000000000000000000000岁\t141546\n97岁\t141547\nSeason\t141548\n抒写\t141549\n九幽\t141550\n高收入\t141551\n小叶紫檀\t141552\nYoudema
nd\t141553\n高禄\t141554\nCEnglish\t141555\n干涩感\t141556\n走懂不懂\t141557\n就是样\t141558\n高高旺姆\t141559\n大片\t141560\n落\t141561\n一个1000\t141562\n大牌\t141563\n萱\t141564\n微游戏\t141565\n笨子\t141566\n回眸一笑百媚\t141567\n一近镇医院之碎石科来我处推介\t141568\ngbcg\t141569\n萨\t141570\n钟情\t141571\n百泉\t141572\n你是我的心你是我的爱\t141573\n0.24%\t141574\n萠\t141575\n萧\t141576\n大牙\t141577\n营\t141578\n大牛\t141579\n985555\t141580\nppiPad\t141581\n糕富帅\t141582\n大牢\t141583\n求拍\t141584\n说妈妈说\t141585\n九恒伟\t141586\n大物\t141587\n双对两眼\t141588\nfupoqe\t141589\n欺骗人\t141590\n萍\t141591\n萌\t141592\n下汗\t141593\n有炉式\t141594\n萄\t141595\n想不完\t141596\n不视\t141597\n大逆不道之徒\t141598\n几率\t141599\n儿额\t141600\n转向\t141601\n妈妈的妈妈你现在好东西\t141602\n件数\t141603\n转合\t141604\n忻怡如\t141605\n老子装\t141606\nkenwofowonuom\t141607\n几环\t141608\n看萌\t141609\n785450806\t141610\nguys\t141611\n方法论\t141612\n小鸡压\t141613\nWOWOW\t141614\n四面曲\t141615\n有关部门\t141616\nbe堡\t141617\n绝症\t141618\n左超\t141619\ngbyhyhk\t141620\n真的好罗嗦\t141621\njdgwa\t141622\n小大人\t141623\n左海公园\t141624\n29.7万\t141625\nDUO\t141626\n尼姑\t141627\nBing\t141628\n好兴奋\t141629\nDUF\t141630\n甄宓\t141631\nDUX\t141632\n档裤\t141633\nR\t141634\n好多年\t141635\n爆动\t141636\n草莓酱\t141637\n嘴唇儿\t141638\n男女关系\t141639\n宵夜宵夜\t141640\n张唐哲\t141641\n两新\t141642\n鹏劲\t141643\nbel\t141644\n800兆\t141645\nbeb\t141646\naloto\t141647\nbea\t141648\nbeg\t141649\nbed\t141650\nbez\t141651\nbey\t141652\n雕村\t141653\n两斤\t141654\n快快快29\t141655\n蒋雅莉\t141656\n卫生许可证\t141657\nbet\t141658\n龙茗路\t141659\n恩可astioma\t141660\n道印\t141661\n东都咪\t141662\n摆齐好吧\t141663\n张银行\t141664\n冰包\t141665\n置物架\t141666\n为你找\t141667\n金秀路\t141668\n15%\t141669\n西龙潭\t141670\n抱紧\t141671\n卡路里拉粑粑粑粑\t141672\n原话\t141673\n道博\t141674\n甲比多\t141675\n成就\t141676\nTbxkhcjfngc\t141677\nfefrf\t141678\n中化学\t141679\n夜谈\t141680\n大汉王\t141681\n主治\t141682\ndiwks\t141683\n头猪猪猪猪猪\t141684\n变进\t141685\n音乐谱\t141686\n邢素鸣\t141687\n12月28号\t141688\n我是你的主人你爱我\t141689\n二十克\t141690\n葫芦葫芦\t141691\n二十兆\t141692\n大都市\t141693\n山度\t141694\n小葱炒蛋\t141695\n二十元\t141696\n哥们\t141697\n準備\t141698\n任红\t141699\n哥仨\t141700\n河镇\t14
1701\n秋裤\t141702\n神龛百两宝贝雷海艳\t141703\n记起\t141704\nDom\t141705\nrrrrrrrt\t141706\n不可说\t141707\n二十六\t141708\n鲁迅先生\t141709\n二十八\t141710\n周邦彦\t141711\n电脑猪头\t141712\n康熙\t141713\n根本机器人\t141714\nLiWen\t141715\n银开\t141716\n化作\t141717\n秋装\t141718\n黄梓维\t141719\n生理\t141720\n猴儿\t141721\n嫖娼\t141722\n萧大侠\t141723\nKlaaajj\t141724\n手信\t141725\n新闻网\t141726\n阿陆GG\t141727\n啦咱们结婚吧好想和你\t141728\nanyway\t141729\npingsi\t141730\n杨大\t141731\n当我没\t141732\n咯嗯\t141733\n走出来\t141734\n再见结束\t141735\n张文标\t141736\n蛋蛋忧\t141737\n二万三万四万五万六万七万八万九万十万\t141738\n瑟瑟的意思就在半江瑟瑟半江红\t141739\n这的太阳\t141740\n点算\t141741\n改址\t141742\n没有人像\t141743\n吴子威\t141744\n晓雯\t141745\ninterrupta\t141746\n超了\t141747\n土家用堡\t141748\n明晚上\t141749\n装逼\t141750\nBoomSaKaLaKa\t141751\n0.7\t141752\nwofb\t141753\n33位\t141754\n梁少玲\t141755\n挂腹黑\t141756\n管道化\t141757\n深蹲\t141758\nHBFG\t141759\n及至\t141760\n淫森\t141761\n51090219970618845\t141762\n一篇466个\t141763\n笑傲奥\t141764\n万哥\t141765\n王煜柯\t141766\ndpg\t141767\n北京灵光寺全寺\t141768\nhttpahiphotosbaiducomxiaodupicitem1ad5ad6eddc451dab916a09eb1fd5266d11632e6jpg\t141769\n黄渡\t141770\n骨瘦如柴\t141771\n一十九八七六三一\t141772\nx10000000000000000000000000000000000000000000000000000000\t141773\n屋塔\t141774\n過去\t141775\n超人\t141776\n悄悄地\t141777\n高速铁路\t141778\n嘲笑\t141779\n南都娱乐周刊\t141780\n第十名\t141781\n息你说话\t141782\n举杠铃\t141783\n微信号码度\t141784\n赤峰\t141785\n女款\t141786\n位种\t141787\n天涯七海久\t141788\n莱茵布洛尔Rheinbrohl\t141789\n高健\t141790\n象诀\t141791\n哥们儿\t141792\n警车\t141793\n位移\t141794\n阑干\t141795\n石狮泉州\t141796\n大戏院\t141797\nk小\t141798\n不做你说\t141799\n我喜羊羊与灰太狼\t141800\n连篇\t141801\n創造\t141802\n芈戎\t141803\n糊口\t141804\n时话\t141805\n乌鲁木齐\t141806\n刑罚\t141807\n实然\t141808\n949449\t141809\n宝九\t141810\nfgllll\t141811\n性取\t141812\n数百人\t141813\n一个月\t141814\n也有利于\t141815\n124142588887758258\t141816\n要你了我要\t141817\n老岗\t141818\n眼屎\t141819\n自发部长\t141820\n表现度秘\t141821\n花千树\t141822\n拜拜元吧\t141823\n失魂死混蛋\t141824\n南郭先生\t141825\n萌萌哒度秘\t141826\n为着\t141827\n黑白色\t141828\nxgchh\t141829\n龙八夷3\t141830\n嘲弄\t141831\n善意\t141832\n哪个孩\t141833\n虚无小\t141834\n参赛队\t141835\
nfvgbbfbfbgbggvggggggghggg555888\t141836\n华明\t141837\n利点\t141838\n胡帅\t141839\nhttpehiphotosbaiducomxiaodupicitem95eef01f3a292df5cca3\t141840\n敲碎\t141841\n失误\t141842\n韦铄\t141843\n程劲涛\t141844\n圆留学梦\t141845\n邮\t141846\n160米\t141847\n尽情\t141848\nhi算了我不讨厌你\t141849\n戴文婷\t141850\n走开了再见\t141851\n糕鞋\t141852\n汤臣\t141853\n周好开心\t141854\n735橙\t141855\n离岗\t141856\n雨萍\t141857\n苏格拉\t141858\n杀人歌\t141859\n进载\t141860\n0762.cn\t141861\n陈可居\t141862\n围魔鬼化\t141863\n最爱就是你\t141864\n初级\t141865\n回头看图\t141866\nDagvcvn\t141867\n白屏\t141868\n超王\t141869\n张思佳\t141870\n正视\t141871\n白山\t141872\n邹\t141873\n我怕怕\t141874\n九源出租车公司\t141875\n见财起意\t141876\n壁虎\t141877\n变样\t141878\n卡比\t141879\n深重\t141880\ngucci\t141881\n给票\t141882\n轻于\t141883\n遗作王\t141884\n玩捉迷藏\t141885\n盲从\t141886\n电视频\t141887\n阿拉了了了了了了了\t141888\n呐喊县\t141889\n哎呀一\t141890\n34d\t141891\n榛子\t141892\n完了真是\t141893\n付尾\t141894\n球体\t141895\n魔术师\t141896\n大的话说话说话筒仓促\t141897\n凡此者\t141898\n万魔仙\t141899\n揪斗\t141900\nnolykh\t141901\n电子猫\t141902\n全者\t141903\n呢钱\t141904\n罗罗罗罗罗罗罗罗罗罗\t141905\n很大度\t141906\n274元\t141907\n漫延\t141908\n面试官\t141909\n再见八十八八百八十八八零零\t141910\n82056395185646284528154950565594875\t141911\nNAVTEQ\t141912\n提怕\t141913\n黄体下\t141914\n别哭了乖\t141915\n8016年\t141916\n你好机器人你好多咪\t141917\n可乐型\t141918\n余震余震渝\t141919\n医护人员\t141920\n经常\t141921\n回春\t141922\n杜艾阳\t141923\n高兰希\t141924\n劳燕分飞\t141925\nQplay\t141926\n看来来了来了\t141927\n爱吃燎\t141928\n饭干\t141929\n欺上瞒下\t141930\n苏园\t141931\n059212333\t141932\n波涛夜\t141933\n爱别说\t141934\n来犯\t141935\n美味佳肴\t141936\n萌萌哒棒棒哒\t141937\nUSPTO\t141938\n张鸣\t141939\n邋遢\t141940\n小毛\t141941\n猪吧度秘\t141942\n小毓\t141943\n珍珠\t141944\n小比\t141945\nA6L\t141946\n新乡市劳动路外国语学校\t141947\n相貌\t141948\n琦玉\t141949\n自摸\t141950\n位置者\t141951\n小毅\t141952\n欠薪\t141953\n烂摊子\t141954\n卡达\t141955\nghhhhjn\t141956\n全家家\t141957\n慈母爱\t141958\n3.3\t141959\nkilled\t141960\n慢性说走就走\t141961\n分不清\t141962\n瞒干\t141963\n步步高v\t141964\n度秘我喜欢你我讨厌你\t141965\n缺觉\t141966\n你的主人\t141967\nsonic\t141968\n白饭\t141969\njbvf\t141970\n徐虎\t141971\n慈父\t141972\n独￥裁￥专￥制\t141973\n动物子\t141974\njFl
jseripicvvkweiplnxx\t141975\n0047\t141976\n木文胸\t141977\n一九九五零五零三\t141978\n比赛日\t141979\nrdfffgfghhhf\t141980\n许晨晨\t141981\n高开低走\t141982\n宋孜彤\t141983\n监视\t141984\n海岸城\t141985\n笨拙\t141986\n别哭鼻子\t141987\n天天陪天天陪天天陪天天醅\t141988\n鸭纸\t141989\n刚饭\t141990\n种法\t141991\n嗯v我叉六普拉斯\t141992\n何韶阳\t141993\n广恩\t141994\n1949-1976年\t141995\n二小二\t141996\n88方\t141997\n赔炮\t141998\n死婆\t141999\n点击率\t142000\n美轮美奂\t142001\n死婊\t142002\n30张\t142003\n西递\t142004\n阴阳历\t142005\n浮现\t142006\n唯爱有放\t142007\n安阳1号\t142008\n评鉴\t142009\n死猪婆\t142010\n奥尔罕·帕慕克\t142011\n滕宝庆\t142012\n程腊\t142013\n温控器\t142014\nmò\t142015\n各显神通\t142016\n继承\t142017\n12397\t142018\n124345\t142019\n115直下\t142020\n死婴\t142021\n左眼皮\t142022\n姜照柏\t142023\n提吐了我我了我\t142024\n台克撒\t142025\n允礼个头一个头大家好我叫\t142026\n中德\t142027\n共和国卫队\t142028\n有其属\t142029\n光脚\t142030\n威固\t142031\n洗玩\t142032\n大堆\t142033\n中联重科\t142034\n插遇\t142035\n上床片\t142036\nolaaaa\t142037\n着仇\t142038\n紫胤\t142039\n40多平米\t142040\n五六点\t142041\n收烧\t142042\n大堡\t142043\n国企\t142044\n寻宝\t142045\nfhchf\t142046\n淡淡淡淡淡\t142047\nceidream\t142048\n黑鹰\t142049\n梁洛施\t142050\n一厢情愿\t142051\n用窗\t142052\n文昌殿\t142053\n不在乎\t142054\n字改\t142055\n期末检测卷\t142056\n四项\t142057\n上身\t142058\n地级\t142059\n31平米\t142060\n不为人知\t142061\n四顿\t142062\n郭本鑫\t142063\n插圈\t142064\n异曲同工2\t142065\njgjgjgagjgagjg\t142066\n16日上午\t142067\n国羽\t142068\n为你在干\t142069\n坏车\t142070\n酷网\t142071\n臭死\t142072\nCGJ\t142073\n我要你当我老公度秘\t142074\n纯色吧\t142075\nCGD\t142076\nCGG\t142077\nGuggen\t142078\n以子时\t142079\n雷锋辣\t142080\nfhhffgjffv\t142081\n我真的不相信你你还爱我\t142082\n夕妍雪\t142083\nXnn\t142084\n你你乖不乖啊你不爱我\t142085\n中国汉族\t142086\nCGV\t142087\n陆游\t142088\n硼\t142089\n硷\t142090\n无非彼此彼此\t142091\n无所有\t142092\n确\t142093\nf份\t142094\n硬\t142095\n硪\t142096\n董晨\t142097\nCG5\t142098\n7gf\t142099\n666666666666666666666666666666666666666666666666666666666666\t142100\n么白\t142101\n512MB\t142102\n十道\t142103\n图标点\t142104\n铁佛\t142105\n岳婉婷\t142106\n硌\t142107\n十遍\t142108\n窗门\t142109\n硅\t142110\n对置\t142111\n千三百四十九\t142112\n正规军\t142113\n傻傻傻傻哈哈哈\t142114\n好恩\t142115\n斗牛犬\t14211
6\nｐｐ\t142117\n有点卷发\t142118\n五马分尸\t142119\n上牌\t142120\nhbtev\t142121\nkuajialllolooo\t142122\nｐａ\t142123\nchiphotosbaiducomxiaodupicitemd439b6003af33a871b05516fc15c10385343b55fjpg\t142124\n无以致远\t142125\n达斡尔族\t142126\n1500名\t142127\n珍珠丁丁丁丁丁丁\t142128\n淡恋爱\t142129\n三好生\t142130\n猿人\t142131\n阿黛尔\t142132\n7878781705870207858684685368\t142133\n哨片\t142134\n支宇翔\t142135\nmeteor\t142136\nKloss\t142137\n白墨是星之魔法团\t142138\n15242620906\t142139\n知人口加量化\t142140\n唐笑怡\t142141\n脚伤\t142142\n练习生\t142143\n我不理你了我讨厌你\t142144\n唐鹰\t142145\n28979\t142146\n我还好吧兄弟\t142147\n老楚\t142148\n那么多不懂\t142149\n原空\t142150\n冯焰笑\t142151\n扇面\t142152\n要知\t142153\n8554\t142154\n不恭喜\t142155\n只因\t142156\n张荣敏\t142157\n呢爸爸\t142158\nABBY\t142159\n贵宝\t142160\n贵客\t142161\n1251百一十\t142162\n13275867078\t142163\nchikds\t142164\n茹永鑫\t142165\n甜心思蜜哒\t142166\nq136慢慢\t142167\n14cvs\t142168\n參加\t142169\nHealer\t142170\n释怀\t142171\n夫联\t142172\n贵宾\t142173\n开心消消乐147关\t142174\n那你还当我的小秘我不需要你\t142175\nJigglyvex\t142176\n百八\t142177\nya度\t142178\n让子弹飞\t142179\n电话线\t142180\n那天黑\t142181\n紫水晶\t142182\n中国航空航天博览会\t142183\n涕泗\t142184\n映衬\t142185\n抱抱——你好乖噢\t142186\n徐帆\t142187\n深圳卫视\t142188\n爱军睡觉吧\t142189\n得慌\t142190\nEhh\t142191\n去自习\t142192\n梁欣悦\t142193\n橡皮糖\t142194\n20元\t142195\n大厨师\t142196\n14一15岁\t142197\n不拘小节\t142198\n百兆\t142199\n龙胜\t142200\n涕泪\t142201\n我不要你的男票不要脸\t142202\n48000丈\t142203\n现货交易\t142204\n尺度\t142205\n从业人员\t142206\n小达人那你国演义\t142207\n小春晓\t142208\njj怪\t142209\nkudd\t142210\n西峡\t142211\n现况\t142212\n四十\t142213\n四千\t142214\n99斤\t142215\n临桂\t142216\n长城谣长城\t142217\n5.1节\t142218\n暗系\t142219\n十三级\t142220\n唉天冷\t142221\n地狱之门\t142222\n段奂\t142223\niuvs\t142224\nupk\t142225\n课桌\t142226\n只有我\t142227\nupf\t142228\nupd\t142229\n4月20日\t142230\nhbbb\t142231\n赵明\t142232\n给钱\t142233\nnursery\t142234\nupx\t142235\n不好年\t142236\n齐庆\t142237\nummast\t142238\n乌塔乌塔\t142239\nups\t142240\nupp\t142241\n考电\t142242\n前班\t142243\n编者按\t142244\n亲属间\t142245\n会不会\t142246\n偏西风\t142247\n557755757\t142248\n剧目\t142249\n碉楼\t142250\n季节感\t142251\n雪棱\t142252\n商洛\t142253\
n坏蛋坏蛋大坏蛋吸血鬼之祖\t142254\n柱头\t142255\n倒斗\t142256\n巴里王\t142257\n18008961197\t142258\n老妈老爸老妈猫爸你在哪儿\t142259\naryurxuf\t142260\n户户户\t142261\n25周岁\t142262\n舒泽昊\t142263\n纠缠不休\t142264\n网易博客\t142265\nen\t142266\n2500发\t142267\n杨妍丹\t142268\n160km\t142269\n后来居上\t142270\n下拉\t142271\ndres\t142272\ndrer\t142273\n巴神\t142274\n马泪\t142275\nkris,kris\t142276\n可乐会\t142277\n忘记了有\t142278\n真暖\t142279\n秦焘\t142280\n新品种\t142281\n果冻发\t142282\nchwy\t142283\n筛管\t142284\n病娇\t142285\n社会主义民主建设\t142286\n方晗\t142287\n組\t142288\n周球\t142289\n消谁\t142290\n饮空\t142291\n力士\t142292\n96254123\t142293\n打兵兵\t142294\n12秒\t142295\n度高\t142296\nfket\t142297\nbbbbbbddddd\t142298\n下拿\t142299\n无奈亚\t142300\n悠长\t142301\n如火如荼\t142302\n稀得\t142303\n陈永青\t142304\n售完\t142305\n韶湖\t142306\n中共党员\t142307\n很好不错\t142308\n楠楠\t142309\n牛脸皮\t142310\n闲过\t142311\n宜昌市\t142312\n嗯埃拉尼\t142313\n管度秘\t142314\n试试度\t142315\n853221456789\t142316\n哇塞帅\t142317\n根冠区\t142318\n程耗时\t142319\n蓝白\t142320\n当边\t142321\n巨蟒\t142322\n27272272772\t142323\nchicorysleep\t142324\n珠海市\t142325\nCCTV-8\t142326\n不轨\t142327\n小日记\t142328\n污水\t142329\n15点51分\t142330\n张如意\t142331\n没心没肺\t142332\nsharia法\t142333\n工期\t142334\n麋峰\t142335\n我问你思密达\t142336\n供奉\t142337\n伤心感冒\t142338\n巨蟹\t142339\n萌萌萌萌萌\t142340\nddlt\t142341\n还有\t142342\n三个人\t142343\n问郎花好侬颜好\t142344\n857辆\t142345\n猪脚\t142346\n好恐怖\t142347\nAV法\t142348\n164865\t142349\n柔度秘\t142350\n张全新\t142351\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t142352\n天牛\t142353\n30329\t142354\n那一个\t142355\n口木\t142356\ngivg\t142357\ngivd\t142358\ngive\t142359\n看好帅\t142360\n怪病\t142361\n干量\t142362\n纯牛奶\t142363\n读书笔记\t142364\n不增\t142365\n无欲无求\t142366\n字牌\t142367\n泥孩\t142368\n唉十五园林晴天雨天\t142369\nSIM卡卡槽\t142370\n迷糊\t142371\n城固县\t142372\n原著\t142373\n金刚铠甲科\t142374\n朱仔仔\t142375\n度秘度秘didi\t142376\n五百七十七\t142377\n京冀\t142378\n小孩儿\t142379\n宫傲妃\t142380\n范围\t142381\n沃派#沃派\t142382\n五菱宏\t142383\n妥贴\t142384\n京军\t142385\n技巧\t142386\n歪理\t142387\n技工\t142388\n妖孽邪王王妃太凶\t142389\n发号\t142390\n十六树\t142391\nFUCKYOU\t142392\n椰子\t142393\nghhdhd\t142394\nbbmk\t142395\n挑衅\t142396\n伤感带\t142397\n十六栋\t
142398\n贾亮亮\t142399\n张洗白\t142400\n秧子\t142401\n大神经\t142402\n全开\t142403\n黄鱼\t142404\nvfn\t142405\nvfm\t142406\n气盛\t142407\n川崎\t142408\nvff\t142409\nvfd\t142410\n芭娜娜卡卡\t142411\npornfruit\t142412\n吳丕洲\t142413\n金承龙\t142414\n46797863\t142415\nノ帅瞎\t142416\nvfx\t142417\nvfv\t142418\nvfs\t142419\n孙悟龙\t142420\n天涯社区\t142421\n直接不想和你聊你是我人生有点事到实在不想和你再见\t142422\n小个\t142423\n小丫\t142424\n陆宏杰\t142425\n小严\t142426\n好好像\t142427\n赛科\t142428\n闯关\t142429\n黄承武\t142430\n小丽\t142431\n小丹\t142432\n珍贵\t142433\n小主\t142434\n闯入\t142435\n颗牙\t142436\n额哩\t142437\n棒棒牛\t142438\n泰兴\t142439\nvbnm\t142440\n小三\t142441\n爱说\t142442\n移步\t142443\n小万\t142444\n别忘我\t142445\n小丁\t142446\n小七\t142447\n小东\t142448\n秀色\t142449\n小丘\t142450\n老有所养\t142451\nruiokng\t142452\nhello度秘魁武\t142453\n小丑\t142454\n肘子\t142455\n加持\t142456\n回\t142457\nxiaodupicitemaa\t142458\n囚\t142459\n囗\t142460\n囖\t142461\n蒙罩\t142462\n高雯\t142463\n囍\t142464\n高雪\t142465\n囊\t142466\n好无啦啦啦啦啦啦\t142467\n马嘴\t142468\n2535885\t142469\n囂\t142470\n整怕\t142471\n囿\t142472\n多多多喝\t142473\n国\t142474\n解嘲\t142475\n固\t142476\n滾蛋\t142477\n敏洪\t142478\n黔南师院\t142479\n围\t142480\n很残忍\t142481\n困\t142482\n巨魔\t142483\n园\t142484\ndddhkjjkkjkkkkkkkkkkkkkkkkkkkk\t142485\n丨丨\t142486\n第三大\t142487\n囧\t142488\n第三天\t142489\n高雄\t142490\n囤\t142491\n贱人\t142492\n因\t142493\n鬼混\t142494\nnearbyprep\t142495\n常焱\t142496\n梁全香\t142497\n悍姐\t142498\n马大太二\t142499\n昆明军区\t142500\n斯人独憔悴\t142501\n被叫\t142502\n定稿\t142503\n老头儿求你了行\t142504\n动图\t142505\np3软硬\t142506\nagbgg\t142507\nHMV\t142508\n倮照\t142509\n厦门国际会展中心\t142510\n年纪\t142511\n国家宝藏3\t142512\n聊群\t142513\n毫克联\t142514\n有名\t142515\n猫儿\t142516\n5454747\t142517\nxiazaijiesha\t142518\n依然在一起吧\t142519\n做男\t142520\n宋子明\t142521\n2012-7-8\t142522\n莲花乡\t142523\ntouritime\t142524\n眯切\t142525\n姜文浩\t142526\n高圆圆\t142527\n锕阿腌\t142528\n某仁\t142529\n张骁\t142530\n动植物\t142531\n八元\t142532\nOsee\t142533\n静台\t142534\n二盒\t142535\n擂台\t142536\nxup\t142537\n不穿\t142538\nxus\t142539\nxux\t142540\nxuy\t142541\n不穷\t142542\nxud\t142543\nxue\t142544\nxuf\t142545\n如梦如幻\t142546\n插旗\t142547\n
xub\t142548\nxuc\t142549\n战不骂人\t142550\n放广告\t142551\nxuo\t142552\n解答\t142553\n冲锋陷阵\t142554\nxuk\t142555\n趨勢\t142556\n走私\t142557\n走秀\t142558\n人旦\t142559\n我而是你的妃谁呀山上电影片分的分领先我\t142560\n报修完\t142561\n小儿\t142562\n老子别无求你\t142563\nhooo\t142564\n死了陪我聊会儿天儿\t142565\n故事实\t142566\nYOU\t142567\nhood\t142568\n苞米碴子\t142569\n偷腥\t142570\n云思乡\t142571\n硬座\t142572\n硬度\t142573\n健美\t142574\n3275485\t142575\n中超\t142576\n中越\t142577\n建党\t142578\n蓬松\t142579\n好孩子\t142580\nnifa\t142581\n直充\t142582\n温朱莞\t142583\n零八零二\t142584\n巴菲特\t142585\nokokokokk\t142586\nqunijia\t142587\nokokokoko\t142588\n总队长\t142589\n爱肯尼\t142590\n直八\t142591\n罗度秘\t142592\nhhhhhhggg\t142593\n78房\t142594\n都部\t142595\n厌还\t142596\n我等你聊\t142597\ntFbs\t142598\n唐氏\t142599\n都都\t142600\n分晓\t142601\n五十多岁\t142602\n城市居民\t142603\n人工答话\t142604\n样\t142605\n秦炜婷\t142606\n栽\t142607\n格\t142608\n栾\t142609\n密妃\t142610\n核\t142611\n永不再见\t142612\n宫羽\t142613\n清清\t142614\n卷发棒\t142615\n27亿美元\t142616\n窝麻麻\t142617\n有组织\t142618\n何天赐\t142619\n部将\t142620\n株\t142621\n接物\t142622\n栗\t142623\n1月16日\t142624\n树\t142625\n7月6日起\t142626\n栓\t142627\nkhhiehj\t142628\nhbhbbbbxvhsehgrhesebdndnddndbrbjebtbtntb4ehjemekrYbgtrenlshdvfvtmn\t142629\n栅\t142630\n卤道\t142631\n标\t142632\n栏\t142633\n难道面\t142634\n栈\t142635\n卑感\t142636\n3900916\t142637\n石雕\t142638\n照上图\t142639\n510K吗一\t142640\n遥知\t142641\n拉萨科技\t142642\n飯飯\t142643\n修记\t142644\n好不容易\t142645\n老伴\t142646\npojsjakqkq\t142647\n迎远客茂林\t142648\n打晕\t142649\n老似\t142650\n好啦谢谢你\t142651\n镇江台\t142652\n前四个月\t142653\n修订\t142654\n尾骨\t142655\n甘旗\t142656\n七五折\t142657\n┻━┻︵\t142658\nWafg\t142659\ngftkhfr\t142660\n光品\t142661\n还有你\t142662\n清宫戏\t142663\n火山口\t142664\n我是你的主人你是我的助手\t142665\n光华帅\t142666\n開頭\t142667\n老会\t142668\n成都站\t142669\n安定\t142670\nwww.dushi105.com\t142671\n风雨飘摇\t142672\nSIYX\t142673\n8888800000000000000岁\t142674\n动秘度\t142675\n安安\t142676\n恩都\t142677\n伤兵\t142678\n安宁\t142679\n谭玉洁\t142680\n土豆网\t142681\n哇天\t142682\n附庸\t142683\n侯雨萍\t142684\n舅舅舅\t142685\nwww％\t142686\n安家\t142687\n周慧雯\t142688\n江hat\t142689\n3.97寸\t142690\n含山\t142
691\n三角函数\t142692\n潮汕话\t142693\n罗甸\t142694\n二五十\t142695\n呶呶呶呶呶\t142696\n皆知\t142697\n都赞\t142698\n嘿嘿年\t142699\n魔乐师\t142700\n西环\t142701\n啦啦啦啦啦啦我是卖报小孩呀步\t142702\ncssedsdrdfgg\t142703\n不见了我\t142704\n8023\t142705\n红帽\t142706\n管浠皓\t142707\n星们\t142708\n爱洋洋娃娃哩小爸爸说咯\t142709\n四姑娘山\t142710\n红布\t142711\nseisti\t142712\n除数\t142713\n三分钟后\t142714\n酒商\t142715\n龙日\t142716\n欧亨利\t142717\nlqlulql\t142718\n么事克力爸爸灰老八嘿嘿红太阳\t142719\n农作物\t142720\n出镜率\t142721\n叉开\t142722\n餐粥\t142723\n付账\t142724\n一念之间\t142725\ngtadx\t142726\n3314196955\t142727\ncHKSD\t142728\navsmobile\t142729\n冒個\t142730\n蓝军\t142731\n82com\t142732\n国境\t142733\n国家民航局\t142734\n好莱坞\t142735\n鸿海\t142736\n名篇\t142737\n麓镇\t142738\nghiphotosbaiducomxiaodupicitemd8f9d72a6059252d2ad26470339b033b5ab5b9f9jpg\t142739\n好好聊着\t142740\n武建东\t142741\n嘿嗯\t142742\n车案\t142743\n张丽颖\t142744\n口服\t142745\n騙子\t142746\n引索\t142747\n遍地\t142748\n120808\t142749\n二十世纪\t142750\n真正常\t142751\n割脉\t142752\n林紫莹\t142753\ngsrgrhsrher\t142754\n菠萝菠萝蜜可乐八木百万群英\t142755\n毛眼\t142756\nVhhhhhbjjkhhjknbnjjgvlbvjcgjvvjncfx\t142757\n启源\t142758\nonoro\t142759\n五集\t142760\n困滚滚滚滚滚滚滚滚\t142761\n空空无解肥\t142762\n10970330\t142763\nv分\t142764\ns95\t142765\n看空\t142766\nCMCKCIDn\t142767\n看穿\t142768\n亲来\t142769\n颗长\t142770\n金太行\t142771\n鸟肉\t142772\n多天鸟\t142773\n林章海\t142774\n防贼\t142775\n650帮\t142776\n请和\t142777\n恋狂\t142778\n刘济源\t142779\n念叨\t142780\n行行行\t142781\n即使剧\t142782\n#奥康\t142783\n八点整\t142784\n任涛\t142785\n一个一本\t142786\n五零\t142787\n五雷\t142788\n坏蛋好想你\t142789\n关爱天天见\t142790\n小立\t142791\n驾驶类\t142792\n云南省\t142793\n雇用\t142794\nAv88\t142795\n对象儿\t142796\n六边\t142797\n95555\t142798\n小站\t142799\n笑岔气儿\t142800\n冰塞\t142801\n句天\t142802\n11742\t142803\n绝味\t142804\n自己去找\t142805\n11748\t142806\n胡说\t142807\n小童\t142808\n姗姗\t142809\n像假了\t142810\nIncounece\t142811\n郭桐源\t142812\n董锡佳\t142813\n珍泥\t142814\n否定式\t142815\n扫把星\t142816\n行不\t142817\n齐帅帅\t142818\n徐嘉诚\t142819\n结果帧\t142820\nkakakak\t142821\n苏晓\t142822\nsunshine\t142823\n该们\t142824\n那不找你了哼凡人\t142825\nwhhx\t142826\n自动物\t142827\n贾曼\t142828\n凤凤祥\t142829\n好塞
\t142830\n您看\t142831\n长笑\t142832\n湿身\t142833\n好啦好啦给我闭上你的嘴\t142834\n到好说话儿\t142835\n1995年1月17日，5时46分\t142836\n涉足\t142837\n空空如也\t142838\n口粮\t142839\n郭外斜\t142840\n狗系系\t142841\n黄雷杰\t142842\n艾达菲尔德Ayda\t142843\nrnnrkr\t142844\n过气\t142845\n好我不骂你了你也不骂我了行不行\t142846\n圈圈\t142847\n聚落\t142848\n等一下意见\t142849\n龙井茶\t142850\n好哇行哪来切\t142851\n赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞\t142852\n王笑颜\t142853\n鸭肉\t142854\n铃声\t142855\n东芜\t142856\n东芝\t142857\n酱油场\t142858\n魏振良\t142859\n月星\t142860\n2008年2月12号\t142861\n我喜欢朝\t142862\n云飘泊\t142863\n敲击\t142864\n名著\t142865\n孙正科\t142866\n王阿姨\t142867\nKKK\t142868\n王着\t142869\na片儿\t142870\n166428847282\t142871\n林海\t142872\n夺宝\t142873\n20x13点\t142874\n咳咳咳\t142875\n聊了喝\t142876\n下十二点\t142877\n真辛苦\t142878\n尧帝\t142879\n即视感\t142880\n平勺\t142881\n无边天作岸\t142882\n就任\t142883\n几万3\t142884\n888888888888888888888888883383\t142885\n底调\t142886\n无论何时\t142887\n8853\t142888\n8852\t142889\n有多聊\t142890\n8850\t142891\n上午８时\t142892\n15159729721\t142893\n战力\t142894\n堆儿\t142895\n乳摇\t142896\n金明\t142897\n小鳄\t142898\nI了6\t142899\n谢谢你你对我的爱\t142900\n相向\t142901\n备受\t142902\n相同\t142903\n灰冻\t142904\n几十遍\t142905\n魏培洋\t142906\n崆峒\t142907\n互助\t142908\n猜数\t142909\npoas\t142910\n黄建贞\t142911\n逆势\t142912\nbhdjebeb\t142913\n积弊\t142914\n护发霜\t142915\n180元\t142916\n你好王小花花\t142917\n西现代\t142918\n温班宁\t142919\n路建华\t142920\n1700多张\t142921\n155n\t142922\n头橡\t142923\n黄对紫\t142924\n13158526666\t142925\np灰太狼\t142926\n4摄氏度\t142927\n22张\t142928\n新伙伴\t142929\n神舟\t142930\n一个51个\t142931\n戊癸\t142932\n能不能行\t142933\ncomtomotherancurafco\t142934\n1550\t142935\n1554\t142936\n你好美丽真的好美你\t142937\nzzjj\t142938\n9点15分\t142939\n西藏大峡谷\t142940\n657989898987567624\t142941\n不难不\t142942\n天空调子\t142943\n懒不死\t142944\nhducy\t142945\n光州\t142946\n89年\t142947\nwwt\t142948\nwww\t142949\n度秘你的离李想\t142950\n苏湘渝\t142951\n靠着\t142952\n并无\t142953\nwwe\t142954\nwwg\t142955\n糗事错\t142956\n泰萌美\t142957\nwwc\t142958\nwwb\t142959\nwwm\t142960\n和鹏\t142961\nwwn\t142962\n1080\t142963\n何足\t142964\n知识珠\t142965\nweib
0.com\t142966\nHahusinevajka\t142967\n肖灿\t142968\n靠睡\t142969\naaaaaa\t142970\n入团\t142971\nidotoo\t142972\n装饰条\t142973\n多米诺\t142974\n过眼\t142975\n呀医\t142976\n入园\t142977\n群众们\t142978\n豚鼠\t142979\n了待会儿再聊\t142980\n赈灾\t142981\n恩我不想\t142982\n我的心门\t142983\nghsjdj\t142984\n吉妮\t142985\n一幅画\t142986\ntara\t142987\n24万吨\t142988\n嘉珊\t142989\n不决\t142990\n不冰\t142991\n恭敬\t142992\n不冷\t142993\n免流\t142994\n13463023771\t142995\n冷死牛\t142996\n11月3日\t142997\n不再\t142998\n14n28n312n4\t142999\n46.5万\t143000\n2705.4亿元\t143001\n1823557724\t143002\n摸奶节\t143003\n覅次\t143004\n银灰色\t143005\n不写\t143006\n天空之城下\t143007\n蛋卷\t143008\n通航\t143009\ncuou\t143010\nBIDISI\t143011\n星期天\t143012\n丑怪\t143013\ncuos\t143014\n第五次\t143015\n功大\t143016\nipiuuhhj\t143017\n15G\t143018\n200万\t143019\n功夫\t143020\n我真么\t143021\n不知敢敢\t143022\n雷曼非\t143023\n热首歌\t143024\n5Illegaljh\t143025\n逆事\t143026\n六颗\t143027\n密度站\t143028\n丑态\t143029\n男男历险记\t143030\n通今\t143031\n阿库拉朩里咯\t143032\n抨价\t143033\n200个\t143034\n了不叫\t143035\n丑怕\t143036\n稳固\t143037\n曾堡\t143038\n巨蟹合影\t143039\n3553次\t143040\n夏浩铭\t143041\n消磨\t143042\n西山\t143043\npooryeah\t143044\nfdff\t143045\ncom28\t143046\n莫既然这样\t143047\ntomet@you\t143048\n杜悦\t143049\n好迪\t143050\nNo.1749\t143051\nfdft\t143052\n脑控\t143053\n急诊医学\t143054\nvhcvgvh\t143055\n好运\t143056\n马沈娥\t143057\n呼叫器\t143058\nohsehun\t143059\ncos招财猫\t143060\n好远\t143061\n32拐\t143062\n1974年\t143063\n交流电\t143064\n数亿元\t143065\n喻文平\t143066\n138jirl\t143067\n能用\t143068\n二四十七k\t143069\n犯戒\t143070\n犯我\t143071\n嘉兴\t143072\n唉猫小叽呢姑猫ii包\t143073\n凉爽\t143074\n一公\t143075\n一六\t143076\n补过\t143077\n花台\t143078\n花叶\t143079\n小白牙\t143080\n一八\t143081\npresented\t143082\n一具\t143083\n花可\t143084\n一共\t143085\n雨层\t143086\n一关\t143087\nTFboyS\t143088\n淋巴肉\t143089\n再见再见再见再见\t143090\n一兆\t143091\n一百年以后\t143092\n体东门\t143093\n大人工\t143094\n一元\t143095\n真好呀你好\t143096\n一先\t143097\n一光\t143098\n一克\t143099\n面对着\t143100\n起运\t143101\n琉光璃彩\t143102\ngzhw\t143103\nTFboys\t143104\n破手袋\t143105\nUCCA\t143106\n很温暖问\t143107\n恶心我了是\t143108\n兆瑞年\t143109\n乐在其中\t143110\n危言\t14311
1\nhnhhfg\t143112\n蜜行\t143113\ndiei\t143114\n邪门\t143115\n四五岁\t143116\n这阵子\t143117\n生理期\t143118\n苗冬眠\t143119\n十几点\t143120\n1334258\t143121\n舅舅\t143122\n58分钟\t143123\n遽然\t143124\n可爱你都们\t143125\n度读\t143126\n细微处\t143127\nrfffgt\t143128\n吉尔伯特\t143129\n吗啉\t143130\n可耐啦\t143131\n错圆圆\t143132\n7328387218721721872187218731\t143133\nMarilyn\t143134\n反恐版\t143135\n军火\t143136\n餐三四十你的话你们的夜华\t143137\nskin\t143138\n不漂亮\t143139\nall露女人咯菌腈\t143140\n150本\t143141\nHdhdihj\t143142\n吗啡\t143143\n何用来\t143144\nrubicon\t143145\n黄鹤楼雅\t143146\n欧吃哟偶遇\t143147\n八娃\t143148\n张志唔\t143149\n地震云\t143150\n胯骨处\t143151\n卸除\t143152\n后备金\t143153\n184点\t143154\nguuyu\t143155\n美利坚\t143156\n商商\t143157\ntriggjjg\t143158\n邓修涵\t143159\n弗尼亚\t143160\n谢谢你的笑话爱听感谢你的爱\t143161\n心得\t143162\nguuyg\t143163\nCDthughubbubNBAsdefycxxcccccccccxxxxxzxxddd\t143164\n懂男欢女爱\t143165\n星动烟火\t143166\n摩托头目\t143167\n学墙\t143168\n30米\t143169\n文聊\t143170\n晕车\t143171\n文职\t143172\n好宝贝\t143173\n碰对\t143174\n保养\t143175\n瓶奶奶\t143176\n你好难\t143177\n这种梗\t143178\n家青\t143179\n阿飘\t143180\n没人接\t143181\n文联\t143182\n旗形\t143183\n阿飞\t143184\n霶\t143185\nsiidd\t143186\n超载\t143187\niamagirl\t143188\n王思丽\t143189\n中食油\t143190\n不里你了永远再见\t143191\n惜为\t143192\n超车\t143193\nlittlepig\t143194\n3101700866\t143195\n完了我告诉你\t143196\n125C\t143197\n万水千山总是情\t143198\n快乐再见\t143199\nHUG\t143200\n前途\t143201\n巴沙尔\t143202\n朱习超\t143203\n中国小学\t143204\n好呀你\t143205\n唐代\t143206\n我是你奶三\t143207\nSTUVW\t143208\n删略\t143209\n零二零二幺八\t143210\n碳钢\t143211\n二硕\t143212\n桃李\t143213\n每个么\t143214\n这个日\t143215\n请多指教\t143216\n1000名\t143217\n雄虫\t143218\n#INFINITE#新歌\t143219\n万一言\t143220\n爱我你就上我\t143221\n等一个人到\t143222\n急性淋巴细胞——L1型白血病\t143223\n抱抱元\t143224\n50多分\t143225\n执委\t143226\n救灾\t143227\n北岩小学\t143228\n高就好\t143229\nViVi\t143230\n18com\t143231\n一下子七八\t143232\n开心网\t143233\n百分之六\t143234\n1159228650\t143235\ngyibgvff\t143236\n铖铖\t143237\n接点\t143238\n认识字\t143239\n110多块\t143240\n鱼籽\t143241\n锢照\t143242\n加上前\t143243\n小径\t143244\n131401\t143245\n小德\t143246\nrertr\t143247\n问一什么\t143248\n泰铢\t143249\n潮潮汕\t143250\n
8至16时\t143251\n小微\t143252\n截一下\t143253\n嗯秘仔\t143254\n谢有\t143255\n禁闭岛干甚\t143256\n疯了真的疯了\t143257\n韩金慧\t143258\n忘词\t143259\n好友客舍\t143260\n地磁\t143261\n水群\t143262\n仙岛湖\t143263\n万达\t143264\n不是我\t143265\n上不上网\t143266\n同位语\t143267\n异源\t143268\nuudf\t143269\nhi氏强\t143270\nuudd\t143271\n聪明伶俐\t143272\n霞\t143273\n芮芮\t143274\nBbBy\t143275\ncomaaa\t143276\n呕沥\t143277\n搏击\t143278\n35米\t143279\n杂交\t143280\n智通\t143281\njsjii\t143282\njsjjdjdjd\t143283\n美度美度\t143284\n阿好嘞\t143285\n蛋蛋乖孩小肥羊\t143286\njsjiz\t143287\n紧唱\t143288\n赤道\t143289\nYOUR\t143290\n东京天空树承建商·大林组\t143291\npink\t143292\n朱邦彦\t143293\n艾莱克\t143294\n开心如意\t143295\nidjhdykrhhdhdhhtifockjfddjfhdudjfjfjxbfywwdgjthfdfhurd\t143296\n青团子\t143297\nping\t143298\n杂事\t143299\n自然界面\t143300\ntill\t143301\nsunday\t143302\n没得火\t143303\n玄关\t143304\n象妮\t143305\n555886688858888885555\t143306\n第20\t143307\n别加\t143308\n送给\t143309\nguyg\t143310\n别动\t143311\n陶洪伟\t143312\n250下\t143313\n送终\t143314\n霏\t143315\n对练\t143316\nguyu\t143317\n输入法连我\t143318\n南南\t143319\n尘灰\t143320\n250万\t143321\n221MBB\t143322\n军队\t143323\n更帅\t143324\nzng\t143325\n老子没要你说话\t143326\nznd\t143327\n116米\t143328\n扣子\t143329\n粗根全\t143330\n扣字\t143331\nznm\t143332\nzns\t143333\n250个\t143334\n翀哥\t143335\n龙腾\t143336\n凡高\t143337\n古建筑\t143338\nhi森\t143339\n再见的我跟说句\t143340\n谁个谁个\t143341\n不懂我说的话不是我听不懂你说的话\t143342\n秘笈\t143343\n中国健康教育中心\t143344\n未应\t143345\nxiaoyo\t143346\n武逆城市\t143347\n海西\t143348\nxiaoyu\t143349\n行过\t143350\n假酒\t143351\n陈林明\t143352\n胡英爽\t143353\n小萌妃v11\t143354\n寒羊\t143355\n来者不善散\t143356\n从今天开始\t143357\n文栋\t143358\n我没有钱\t143359\n天才型\t143360\n成纷\t143361\n18700块\t143362\n海豚表演\t143363\n文标\t143364\ndgjg\t143365\n我是女的你是纯爷们儿\t143366\n一些个\t143367\nthwjci\t143368\n然后\t143369\ndgjt\t143370\n采矿\t143371\nnqg\t143372\n赏花\t143373\n二十盒\t143374\n贝贝贝贝贝贝哈\t143375\n一些一\t143376\n农资\t143377\n946257454252\t143378\n国会\t143379\n82%\t143380\n外逃\t143381\n苏碧婷\t143382\n蓬蓬\t143383\n王瑞里\t143384\n大了不好意思\t143385\n扭扭捏捏扭扭捏捏那你\t143386\n鋫\t143387\n824\t143388\n827\t143389\n826\t143390\n821\t143391\n820\t1433
92\n823\t143393\n822\t143394\n牛蹄\t143395\n八十首\t143396\nLux\t143397\n笑脸色\t143398\n高佳敏\t143399\narcv\t143400\n728248\t143401\n鸡累\t143402\n波切尔\t143403\n玫瑰人生\t143404\n123456789876\t143405\n第四卷\t143406\nkujjjliiytyhh\t143407\n宋嘉玺\t143408\n逍遥\t143409\n翻天娃\t143410\n不发誓\t143411\n曾广之\t143412\n烤鸭饭\t143413\njulce\t143414\n我不耐\t143415\n找自己\t143416\n成凤\t143417\n接我来\t143418\n家人们\t143419\n烤鸭饼\t143420\n新一村\t143421\n天空照\t143422\n下地收拾\t143423\neryyyyio\t143424\n正印\t143425\n配合业\t143426\n镀铬\t143427\n小学霸\t143428\n叶罗莉\t143429\n工作室\t143430\n发发\t143431\n工木\t143432\nWTCC\t143433\n度秘你的嘴真\t143434\n王颢潼\t143435\n吉他手\t143436\n宝存兄\t143437\n发句\t143438\n冯梦阳\t143439\nlanguage\t143440\n淀粉斑\t143441\n彼长\t143442\n碰头\t143443\n老花\t143444\n轰毁\t143445\nhnjn\t143446\n石兽\t143447\n照理说\t143448\n工服\t143449\n正华\t143450\n个\t143451\n庸才\t143452\n正午\t143453\n石兰\t143454\n连山\t143455\n366954144287\t143456\n阿春古\t143457\ndidhx\t143458\n前来\t143459\n卡梅利\t143460\n一床\t143461\n我喜欢我那你喜欢我\t143462\n学们\t143463\n15298750109\t143464\n假山\t143465\nffghgfgghxd\t143466\n太呆\t143467\n路党\t143468\n叩石兄\t143469\n前板\t143470\nuokj\t143471\n秘符\t143472\n巩固\t143473\n路八\t143474\n早安总统\t143475\n人工台吧\t143476\n应景\t143477\n宋裕贤\t143478\n2010-8-15\t143479\n秘笼\t143480\n56835\t143481\n抓住\t143482\n孟楚云\t143483\n摸爬滚打\t143484\n植物大战僵\t143485\nigdzxcvbjh\t143486\n手无寸铁\t143487\n变得不到\t143488\n姜汁\t143489\n得门票#\t143490\n五凌\t143491\n波块开\t143492\n陆文媛\t143493\n拔毛加\t143494\nMOTO别册\t143495\nwert\t143496\n85厘米\t143497\n罗田\t143498\n知难而退\t143499\n8187\t143500\n445323198708101557\t143501\n8181\t143502\n崔流\t143503\n8182\t143504\n五几\t143505\n眼袋\t143506\n随你我是你爸\t143507\n配让\t143508\n开门红\t143509\n赵小慕\t143510\n缧绁\t143511\n队友\t143512\n有气儿\t143513\n唱唱\t143514\n杜老二\t143515\n吸金\t143516\n外发战狼\t143517\n嗯园\t143518\n点头\t143519\n卖刊\t143520\nwwwwxxxcxix\t143521\n我们仨\t143522\n参为\t143523\n乐子才\t143524\n行业\t143525\n赵卫华\t143526\n同堡\t143527\n欧姐\t143528\n冒險\t143529\n拨付\t143530\n胡润百富榜\t143531\n蒋森万\t143532\n互动\t143533\n胡泽浩\t143534\n沐浴露空盒\t143535\n取走\t143536\n天略童书\t143537\n卖车\t143538\n行主\t143539\n行为\t
143540\n老断线\t143541\n石仔\t143542\n你男的我看小兔子歌\t143543\ndgxjabfsbhwbisbwugejjwbebhdbhbddjdhdebajkxgsbskkfywbkfhwbldhwbjdkdhsbepdj\t143544\nweibo.com/1197161814/ymd91AS0\t143545\n厅上\t143546\n殷自聪\t143547\n楚楚可怜\t143548\n公信力\t143549\n炯亮\t143550\n13340129628\t143551\n亲睹\t143552\n见顶\t143553\n一座\t143554\n顺水推舟\t143555\n浙仑\t143556\nxwyy\t143557\n亲着\t143558\n国际吖撒次方\t143559\n一般来说\t143560\n钟堂\t143561\n舒坦\t143562\n跟赖\t143563\n亲睐\t143564\nmsksls\t143565\n孙丽\t143566\n吴家森\t143567\n今天12点\t143568\nViVO\t143569\n丅Fboys\t143570\n童养媳\t143571\n欢女爱\t143572\n力tg\t143573\n没了没有了\t143574\n水槽\t143575\n不是劲笑\t143576\n訢纪元大洒共\t143577\n安不安\t143578\n麻杆\t143579\n第几期\t143580\n拥护\t143581\n财产\t143582\nggddryu\t143583\n天津代表团\t143584\n三点四\t143585\n财人\t143586\n孙三\t143587\nchfor\t143588\n佛本\t143589\n15184171533\t143590\n拥抱\t143591\n孙一\t143592\n蘑菇蘑菇\t143593\nsorotisla\t143594\n雁过拔毛式\t143595\n早知首\t143596\n膈面\t143597\n沿街\t143598\n徐海燕\t143599\ntoutouror\t143600\n罚酒\t143601\n军阀\t143602\nYduxifjfoticogkclb\t143603\n9468.04点\t143604\n同病相怜\t143605\n就好好好好\t143606\n咪妈咪\t143607\nLaKgsdmf\t143608\n有口吃\t143609\ncc2b\t143610\n重振\t143611\n一袋一次\t143612\n重挫\t143613\n担任\t143614\nhgd\t143615\nViV0\t143616\ncggdx\t143617\n俩一家\t143618\n不要脸长\t143619\nMYbirthday\t143620\n3篇\t143621\n化学工业园\t143622\n天蟑螂\t143623\n信阳\t143624\n今生\t143625\n此剧\t143626\n点天\t143627\n1.03%\t143628\nsdood\t143629\n我是你的皇上你是我的太监\t143630\n驴肉\t143631\n明面\t143632\n小秋文\t143633\n1664点\t143634\n猪蛋子\t143635\n此前\t143636\n#曺圭贤#\t143637\n摔死\t143638\narokshlzvmk\t143639\n浑浊\t143640\nckuwrokfs\t143641\n二四十二号\t143642\n黑烧牛排\t143643\nhdhsj\t143644\n蒙彼利埃\t143645\n作本\t143646\n孙梦莹\t143647\n好当家\t143648\nhgc\t143649\n合办者\t143650\n闻香识女人\t143651\n铁四局\t143652\nwhat度\t143653\n数千亿\t143654\nuisb\t143655\n天王盖地虎老相\t143656\n不散\t143657\n尤维斯\t143658\n郭莉花\t143659\n恒灵刀\t143660\n结交\t143661\n盐池\t143662\n山呼\t143663\nn654542655565\t143664\n醉醉\t143665\n于静怡\t143666\n冷笑\t143667\n猪猪猪侠一点都\t143668\n度秘网\t143669\n太行不\t143670\n求新\t143671\n成植\t143672\n呈堂证供\t143673\n呢来来来来来来来来来来来来来来来来来来来来来\t143674\n凉透\t143675\n
厚味\t143676\n毋年\t143677\n仍是\t143678\n来人上鹤顶红\t143679\n三六拜\t143680\n闻喜高腾\t143681\n笔芯\t143682\n三七二三二三一九七三零三零五五一二一八\t143683\nYixing\t143684\n水暖鸭\t143685\nhbcfjdg\t143686\nmaimjium\t143687\n萨瓦纳\t143688\n片平渚\t143689\n考据贴\t143690\n五碗\t143691\n8倍\t143692\n戈老师\t143693\n二七二零\t143694\n外厕所\t143695\n闻康网\t143696\n23464个\t143697\n振诚angelababy\t143698\n黑靴\t143699\n超爷们范\t143700\n慢嘟嘟\t143701\n光屁屁\t143702\n中电信筹备移动支付公司\t143703\n08:00\t143704\n公务员法\t143705\n蒹葭\t143706\n耶鲁大\t143707\n黑面\t143708\n谢谢啦\t143709\n搜狗\t143710\n膜片\t143711\n刘女士\t143712\n强调性\t143713\n搜狐\t143714\n杨金\t143715\n遗风\t143716\n类聚\t143717\n话题库\t143718\n23000\t143719\n包房\t143720\n1452\t143721\n九型性格心理测试\t143722\n漫画版\t143723\n未央\t143724\n撸啦啦撸啦啦撸\t143725\n女老虎\t143726\n兴叹\t143727\n能见度\t143728\n一千只\t143729\n去版\t143730\n耳饰\t143731\n兴可\t143732\neuetruuttu\t143733\n老子的哥\t143734\n放张\t143735\n微信号儿\t143736\n今明\t143737\n书山\t143738\n讨厌我真的爱\t143739\n断开\t143740\n彭嘉喜\t143741\n淘寶\t143742\n青子\t143743\n今星\t143744\n诗情画意\t143745\n起色\t143746\n放弃\t143747\n给我问问\t143748\n放开\t143749\nkucal\t143750\n梅雁挺\t143751\n太了解\t143752\n不专\t143753\n５月\t143754\nLeroy\t143755\n倒茶\t143756\n夜猫纸\t143757\n三百六十六三百六十八\t143758\n还度\t143759\n度秘就是我我就是度米度秘可爱萌萌哒度秘\t143760\n对头\t143761\n蔚烁\t143762\n不不\t143763\nCfgg\t143764\n榴弹\t143765\n乱摸\t143766\n不及待\t143767\n中广新闻网\t143768\n不丹\t143769\n猪婆猪婆猪\t143770\n禅师\t143771\n陈世圣\t143772\n禅帐\t143773\n对外\t143774\n不个\t143775\n不中\t143776\nnnnnnnnnnn\t143777\n一个级\t143778\nidchss\t143779\n戕害\t143780\n16年\t143781\n白了\t143782\n无以复加\t143783\neneren\t143784\n白事\t143785\n郑官屯村\t143786\nDJ派对来啦】新天地G+\t143787\n我没问你你我靠\t143788\n杨家火锅#\t143789\n张玮榕\t143790\n一个系\t143791\n张义宽\t143792\n1627856123\t143793\n那好芭\t143794\n五权分立\t143795\n晁亚敏\t143796\n艾窝操\t143797\n殆尽\t143798\n一百一一百八十年\t143799\n解放前\t143800\nUggggd\t143801\n草香\t143802\ngdffjuhgjn\t143803\nHomain\t143804\n不开心\t143805\n戒备森严\t143806\n垂成\t143807\n丑度\t143808\n16幅\t143809\n徐嘉欣\t143810\n白人\t143811\n杨贵海\t143812\n别投诉\t143813\n赵霁\t143814\n原声\t143815\n拉拉直物\t143816\nso乖\t143817\n北山公园\t143818\n7月12号\t143819\n工业喷涂\t14
3820\nifisentina\t143821\n陈佳琪\t143822\n片面性\t143823\n见图\t143824\n脑后\t143825\n捂乳\t143826\n呼喽mailasometocondong\t143827\n陈国琴\t143828\n外滩十八号\t143829\n好乖嘻嘻窝\t143830\n李兰芳\t143831\n穆红玉\t143832\n好吧酷跑类\t143833\n典雅\t143834\n世界观\t143835\n陈道明\t143836\n小猪猪小鸭子机器人小鸭子\t143837\n一晌\t143838\n想秘\t143839\n不爱我了哭\t143840\n袁萌\t143841\n12点40\t143842\n一晚\t143843\n13810938511\t143844\n12点44\t143845\n出代表\t143846\n叹服\t143847\n别气我了行不\t143848\n搏命\t143849\n65师\t143850\n高富他同意句\t143851\n一景\t143852\n特码\t143853\ndfgjl\t143854\n18341224836\t143855\n植物大战僵尸扭\t143856\n波特曼\t143857\n欧阳\t143858\n打通话\t143859\n千多六千多\t143860\n神往\t143861\n穷者\t143862\n我好爱柯AK小宝贝\t143863\n第几\t143864\n鼻青\t143865\n陕西省人民医院\t143866\n首日\t143867\n包包堂\t143868\n哪样\t143869\nbroken\t143870\n169元\t143871\n董梦豪\t143872\n二百二百二百八十九块\t143873\n奥利奥拉\t143874\n吗一万\t143875\n陈健伍\t143876\nyhugrgduydutoyghvyffvyiivykfrftudfruuttkrrEuttueefFotyrgjffyykyfufttgyyteufk\t143877\n信息业\t143878\n盛冠森\t143879\n意林在哪里包百\t143880\n腾腾\t143881\n问笑话\t143882\nMise\t143883\n迎宾路\t143884\n一诺倾情\t143885\n有钱吗\t143886\n9公里\t143887\n张良\t143888\n学派\t143889\n润白\t143890\n320辆\t143891\n一水即兴\t143892\n2片\t143893\n张艺\t143894\n桶装水\t143895\n何日\t143896\n仝瑶\t143897\ngoupstairs\t143898\n永感\t143899\n营业时间\t143900\n18942500833\t143901\n何时\t143902\n白薯\t143903\ndffr\t143904\n光交\t143905\n鱼画\t143906\n陈惠民\t143907\n给我吧\t143908\n王婉心\t143909\n后身\t143910\n编程人员\t143911\n一些\t143912\n地界\t143913\n黄合明\t143914\n李妍洁\t143915\n一五\t143916\n傻子梦工厂\t143917\n开去\t143918\n一事\t143919\n一二\t143920\n开县\t143921\n一于\t143922\n刘小七\t143923\n别提远在\t143924\n小度秘蜜\t143925\n一了\t143926\n普适力\t143927\n好奇哈哈\t143928\n一人\t143929\n马蹄\t143930\n一亽\t143931\n一亿\t143932\n开厂\t143933\n当当卓越三防\t143934\n度秘耶\t143935\n大母\t143936\n，，[钟\t143937\n一亩\t143938\nk742\t143939\n宣钢\t143940\n一亮\t143941\n赤诚\t143942\nsvxb\t143943\n咖喱\t143944\n三千\t143945\n微客族#\t143946\n毛市镇\t143947\n爱读你你是我的好伙伴\t143948\n删杀\t143949\n今天朝\t143950\n5648\t143951\n竣工率\t143952\n公告人生\t143953\n豆腐渣\t143954\nfdsaqwsdf\t143955\n赤水\t143956\n吧本地\t143957\n带有没有\t143958\n越来了\t143959\n尜骝\t143960\nO╰\t143
961\n散磕歪瑞\t143962\n相思\t143963\n雪妖洞\t143964\nfield\t143965\n其那你猜猜我\t143966\nk克\t143967\n372次\t143968\n狠角色\t143969\n女症\t143970\n承载\t143971\nHughut\t143972\n病态\t143973\n吉娃娃\t143974\n见你好\t143975\n俭朴\t143976\n别开\t143977\nmeke\t143978\nstudents\t143979\n5w40\t143980\n610575969\t143981\n若是\t143982\n368999\t143983\n排期\t143984\n彭富春\t143985\n饿\t143986\n4.83万平米\t143987\n连带\t143988\n坐打\t143989\n泽旺拉姆\t143990\n阿伦·拉斯顿\t143991\n边际\t143992\n钙化灶\t143993\n三MB\t143994\n张之潇\t143995\nxuhxy\t143996\n跨年K\t143997\n饰\t143998\n食赖赖\t143999\ngigjgh\t144000\n晓得似\t144001\n58.1%\t144002\n长条\t144003\n真的记不起你是谁了\t144004\n已知句\t144005\n大擦\t144006\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋大坏蛋臭坏蛋大鸡蛋\t144007\n皮下\t144008\n车厢\t144009\n贫乳\t144010\n天像\t144011\n王润瑶\t144012\nlian\t144013\nliao\t144014\n前车之鉴\t144015\n长板\t144016\ngxifai\t144017\n踢爆\t144018\n缘分\t144019\nimmo\t144020\n五万元\t144021\nrrod\t144022\n老腿儿\t144023\n回答复\t144024\nSLIPKNOT\t144025\n神神叨叨大大方方\t144026\n十四错\t144027\n赘述\t144028\n金瓶梅事么\t144029\n饭\t144030\n雪千儿\t144031\n喵喵喵喵喵喵喵喵喵喵喵喵\t144032\n阿尔巴尼亚\t144033\n医道\t144034\n咪吗\t144035\n咪吐\t144036\nHighchildren\t144037\n七小\t144038\n食种\t144039\n拉k\t144040\n5处\t144041\nSP公司\t144042\n白小雪\t144043\n农商行\t144044\n朝花\t144045\n很幸运\t144046\n支柱\t144047\n长坡女恋\t144048\n烤炉\t144049\n2899\t144050\n臭脸\t144051\n出门小心\t144052\n莫气莫气\t144053\n不要不张嘴\t144054\nvoy927\t144055\n花辞\t144056\n吧侠\t144057\n再说便\t144058\n88446666\t144059\n萌呆\t144060\n暨155号\t144061\n心爱的人\t144062\njooec\t144063\n热情好客\t144064\n臭脚\t144065\n爱永\t144066\n纸质\t144067\n马屁者\t144068\n人工台行\t144069\n危重\t144070\n十余年前\t144071\n135198\t144072\n128岁\t144073\n听不懂我的话我讨厌你\t144074\n免里\t144075\n那码\t144076\n错啦我错\t144077\n联合国难民署\t144078\n骗情\t144079\n概念\t144080\n魏鑫馨\t144081\n四点零二\t144082\n王进喜\t144083\n相符\t144084\n北七\t144085\n不能不倍\t144086\n345446754477\t144087\n加薪\t144088\n金茂北京威斯汀大饭店\t144089\n北三\t144090\n北上\t144091\n薇迷\t144092\nffff\t144093\nfffg\t144094\nfffd\t144095\n真是的亏\t144096\n姓你\t144097\n恩帅锅\t144098\n夙愿\t144099\n华驰五三\t144100\n配正\t144101\n陈秘书\t144102\n出售\t144103\n称雄\t144104\nffft\t144105\ndookd\t144106\n我不
我乖\t144107\n泰迪犬\t144108\n搞不懂\t144109\n一闪一闪\t144110\nK\t144111\n可不可用\t144112\n真可惜\t144113\n风神凛凛\t144114\n你征\t144115\n送进\t144116\n麻辣串\t144117\n送还\t144118\n酸甜苦辣\t144119\n见了拜拜\t144120\n江花红\t144121\n段练\t144122\n当真好啦\t144123\n波别\t144124\n1千零一万一一\t144125\n总导演\t144126\n段绍\t144127\n升钟\t144128\n117家\t144129\n蓬莱\t144130\n沙滩裤\t144131\n玲珑剔透\t144132\nghphi\t144133\n聪明怪\t144134\n李小四\t144135\n不不了\t144136\nfixi\t144137\n无辣不欢\t144138\n其人家\t144139\n键盘\t144140\n修理员\t144141\n9度\t144142\n威尔特\t144143\n落雨\t144144\n1.58%\t144145\n被假\t144146\n对哦真的爱我爱我\t144147\n西三旗\t144148\n鲜香\t144149\n阿舍\t144150\n打开花\t144151\n被偷\t144152\n肺炎\t144153\n魔界不相信你了度秘\t144154\n621章\t144155\nghfcb\t144156\n23332333\t144157\n胸池\t144158\n二零零\t144159\n落雁\t144160\n擀面杖\t144161\n来说说\t144162\n能化解性\t144163\n刘同长\t144164\n一丁点儿\t144165\nBoeing\t144166\n李思红\t144167\n思兔\t144168\n出神\t144169\n张素兰\t144170\n木素质\t144171\n只有\t144172\n余氯\t144173\nfuockyou\t144174\n1oo哎呀\t144175\n51岁\t144176\n陕西\t144177\n莫大焉\t144178\n隔离栏\t144179\n5446517\t144180\nnone\t144181\nnono\t144182\n天蛇杖\t144183\n好爱你爱你爱你\t144184\nTSCC\t144185\n私行\t144186\n人教版\t144187\njvjbjbjvj\t144188\n出票\t144189\n关你吊\t144190\n陪有\t144191\n觉醒来\t144192\n舍近求\t144193\n猪你猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t144194\n好看我是你的\t144195\n毛海霞\t144196\n2012年1月15日\t144197\n卡雷斯\t144198\nx200\t144199\n二十英寸\t144200\n建设性\t144201\n动浑\t144202\n容没\t144203\n小麦芽\t144204\n反应堆\t144205\n好多好多\t144206\n兔崽子\t144207\n张瑞\t144208\n红眼病\t144209\n那个你\t144210\nvlgnogvl\t144211\ngmptgmpjgagangwpmpmpwgmmdpdgdgapmgwhmdp\t144212\n太雅塔\t144213\n行不啊\t144214\n秒码\t144215\nEndjou\t144216\n一整套\t144217\n笑不怒\t144218\nwwwwww\t144219\nA320neo\t144220\n冻结\t144221\n孤独亲\t144222\n分钟后见\t144223\n勾机\t144224\n说谎话\t144225\n她说好\t144226\n仰面\t144227\n被辞职\t144228\n写景\t144229\n榕一家国有银行电子银行部\t144230\n度秘膜法世家\t144231\n丸子妞生快[爱你\t144232\n大毛猪就是你臭不要脸\t144233\n南京市园林局\t144234\n八节\t144235\n冬天以后\t144236\n中产阶层\t144237\n蝶舞\t144238\nsama\t144239\ncoab\t144240\nWhat\t144241\n今天7点15\t144242\n拜陷\t144243\n度秘长\t144244\n梨涡\t144245\nutfhyyyyyyuyyyuloooootk\t144246\n高富美\t144247\n富腾\t14424
8\n才到\t144249\n周琦薇\t144250\n谈遇见\t144251\n乖呀子\t144252\n相手\t144253\n双鹤药业\t144254\n迩大米\t144255\n泪花\t144256\n咯麽\t144257\nemend\t144258\n输了拜拜\t144259\n可唔\t144260\n吃臭\t144261\n秘书度\t144262\nqweafgfg\t144263\n五大连池市\t144264\n爷娘\t144265\n你好我爱你\t144266\n来我去\t144267\n易向阳\t144268\n邹元淑\t144269\n刘怡\t144270\n踩曲\t144271\n狠不下\t144272\n刘总\t144273\n刘庆全\t144274\n1954年\t144275\n免签证\t144276\n记热\t144277\n彗星\t144278\n给力]凯坚\t144279\n牧羊信\t144280\n再见不过\t144281\n18812717640\t144282\n失机\t144283\n试机\t144284\n张晋兴\t144285\n挥挥手\t144286\n杜学\t144287\n大姐婆大姐婆大猪\t144288\n京津冀\t144289\n最最最最最最好的朋友\t144290\n小窍门\t144291\nhellocation\t144292\n陌生的城市\t144293\n租期\t144294\n起艺\t144295\n劲重\t144296\nSUNTORY\t144297\n时令\t144298\n13798392027\t144299\n度秘你的头好大呀好可爱\t144300\n4923\t144301\n牵强\t144302\n时代\t144303\n9月27日\t144304\n果橙冬瓜片\t144305\n偏废\t144306\n阿红\t144307\n宋三郎\t144308\n此餐\t144309\n马卡蒂中国领事馆\t144310\n文飞宇\t144311\n自恋狂￢\t144312\n５５５５５\t144313\nASDFGHJKL\t144314\n回力\t144315\n18块\t144316\n血腥\t144317\n卦卜\t144318\n几几ｇ\t144319\n姜波涛\t144320\n杰伊\t144321\n第一回\t144322\n批捕\t144323\n雪化\t144324\n干嘛\t144325\n建党伟业\t144326\n壁导联II，III\t144327\n错了错\t144328\nuoykghgkhlgkhhgkgg\t144329\n孤错\t144330\n大旱\t144331\n#股市爱谁谁#\t144332\n玩耍呀像\t144333\n其那\t144334\n貌神\t144335\n清不楚\t144336\n个月\t144337\n就是的家\t144338\n王天\t144339\n幹什麼\t144340\n2057\t144341\n不一\t144342\n押回\t144343\n偷偷上\t144344\n渝中区\t144345\n勉为其难原谅\t144346\n个机\t144347\n王大\t144348\n杰伦\t144349\n秀英港\t144350\n2011年9月21日下午\t144351\n杨泽宇\t144352\n奇想\t144353\n康劳斯莱斯\t144354\nvbhxbbxhj\t144355\n网址贿赂\t144356\nl0Ve\t144357\n服服\t144358\nnacg\t144359\n庄家\t144360\n李好高\t144361\n艾菲尔\t144362\n彩琳\t144363\nfgcbv\t144364\n纸鹤\t144365\n謝謝上天賜予我的好運氣\t144366\n南联盟\t144367\n班嘉\t144368\n3927\t144369\n任亦兴\t144370\n1475859996\t144371\n12:42\t144372\n重压\t144373\n叶龙\t144374\n188740771\t144375\n强效\t144376\n不【\t144377\n养胃\t144378\nbdb芭比7ac\t144379\n引起轰动\t144380\n说了么\t144381\n对啊他的样子\t144382\n没有脸\t144383\n李女士\t144384\n鱼口\t144385\n赌王\t144386\n一下下啦\t144387\n割掉\t144388\n42555jj\t144389\n彩球\t144390\n李奥\t144391\n王跃凤\t144392\n二千五百
八十六二千五百万茜\t144393\n阿妹阿妹\t144394\n急送\t144395\n你说什么呢你说什么呢你说什么了小鸡小机器人\t144396\nplane\t144397\n十四五岁\t144398\n火是谁了你是吧不\t144399\n王晨yu\t144400\n这件事\t144401\n六本命\t144402\n郭靖\t144403\n郑佳文\t144404\n拜聊聊\t144405\n芥雷王\t144406\n阿卜杜拉·马丁\t144407\nggyuh\t144408\n构想\t144409\n一平方\t144410\nwjlt\t144411\nggyuu\t144412\n更人好\t144413\n第68届\t144414\n金牌坊\t144415\n般配\t144416\n定时\t144417\n诗赋\t144418\n笨怂\t144419\n两三岁\t144420\nJurong\t144421\n例子\t144422\n熊块\t144423\n胡丽媚\t144424\n王哈\t144425\n摘记本\t144426\n不求人\t144427\n叫\t144428\n原諒妳\t144429\nRPS\t144430\n我求你了宝黑变成白夜帮我求你了我喜欢北也\t144431\n随便说说罢\t144432\n杨樵\t144433\n破坏式\t144434\n任昱恺\t144435\nk3\t144436\nk1\t144437\n野蛮\t144438\nk7\t144439\nRPG\t144440\nk4\t144441\n闭汗\t144442\n多犁曹格粮蛰積蚩\t144443\n少工委\t144444\n放下来\t144445\n台上\t144446\n台下\t144447\n娇密\t144448\n文少\t144449\nkG\t144450\n购发\t144451\nkI\t144452\n干支\t144453\n769岁\t144454\n木懂\t144455\n33秒\t144456\n台东\t144457\n金政模\t144458\n在叙\t144459\n叩\t144460\n干政\t144461\n食页\t144462\n胃肠炎\t144463\nkc\t144464\nkb\t144465\nka\t144466\nkg\t144467\nkf\t144468\nke\t144469\nkd\t144470\nkk\t144471\nkj\t144472\nki\t144473\nkh\t144474\nko\t144475\n雷电赛告诉我你不爱我我的就死\t144476\nkm\t144477\nkl\t144478\nks\t144479\nkr\t144480\nkp\t144481\nkw\t144482\nkv\t144483\nku\t144484\nkt\t144485\nkz\t144486\nky\t144487\n跟前\t144488\n女娃子\t144489\n参什么塞\t144490\n导说\t144491\n肚秘密\t144492\n多行不义必自毙\t144493\nchendoor\t144494\n口张\t144495\n肯干\t144496\n飞讯\t144497\n全国人大\t144498\nnaoche\t144499\n无中生有\t144500\n名曲\t144501\n你是猪啊你是机器人我是人\t144502\n豹女国\t144503\ndu小达人\t144504\n黑玉米\t144505\n深情色\t144506\n嗯嗯克克克克克克克克\t144507\n理科生\t144508\n8把\t144509\nwihd\t144510\n类股\t144511\n横着\t144512\n生活愉快\t144513\n1.93米\t144514\n公元后\t144515\n直流电\t144516\n农历\t144517\n导诊\t144518\n王哥\t144519\n渐逝\t144520\n爱乖\t144521\n热忱\t144522\n集结号\t144523\n25位\t144524\n月月月月月月月月月月月\t144525\nvacantion\t144526\n十篇\t144527\n钼靶\t144528\nsnddjsvs\t144529\n真不够\t144530\nAKB48\t144531\n温州众胜电子厂\t144532\nAhd59898890\t144533\n饱含\t144534\n七点\t144535\n王哪\t144536\n修炼版\t144537\n俘虏\t144538\ny7548434691434654\t144539\n
一时而已\t144540\n二二二九零\t144541\n携号转网\t144542\n兰亚\t144543\n要图\t144544\n热心\t144545\n3.33元\t144546\n呃行\t144547\n一份儿\t144548\n228604\t144549\n日狛\t144550\n船员\t144551\n对呀就在\t144552\n3949\t144553\njfugv\t144554\n卤水味\t144555\n张迎昕\t144556\n面纱\t144557\n3943\t144558\n嗯OKOK\t144559\n五洲酒店\t144560\n闲得\t144561\nvcxd\t144562\n猪贝贝\t144563\n面红\t144564\n望洋兴叹\t144565\n七十点\t144566\n塞夏\t144567\n老样子\t144568\n42.4%\t144569\n鲁尼\t144570\n6万元\t144571\nTCL\t144572\n皿=++谢特\t144573\n失神\t144574\nTCD\t144575\n作业区\t144576\n塞外\t144577\n巨型\t144578\n贺老师\t144579\n林永汉\t144580\n归来也黑黑正经\t144581\n机气人\t144582\n结婚吧别担误\t144583\n首歌\t144584\n2款\t144585\n微观\t144586\n2次\t144587\n渡假村\t144588\n伤别\t144589\n引起呀呀呀呀呀呀\t144590\n讲道理\t144591\n卓文萱\t144592\n一千几百块\t144593\n飘过\t144594\n布莎卡\t144595\n夸大\t144596\n7899654554255436552886\t144597\n旗舰\t144598\n乖儿咂\t144599\n夸夸\t144600\n我是西山嘴人我是无聊数海人物良素海\t144601\nldontknow\t144602\n飘远\t144603\n少宁\t144604\n妈批\t144605\n励志学画\t144606\n昆明机场\t144607\n座厕\t144608\nhhhhgghhuhvgbhhhbbhhbhhhhghhhhghhhhghhghjhh\t144609\n振动马达\t144610\n耐寒\t144611\n雅漾\t144612\n梦回：三英战吕布\t144613\nQQ啊块肉\t144614\n第6周\t144615\n问婷\t144616\n出入境\t144617\n九月三十号\t144618\n洛克们\t144619\n1023399068197\t144620\nCosta\t144621\n爱情类\t144622\n昙花一现\t144623\n录音\t144624\n1碗\t144625\n徒子\t144626\n完美眼影\t144627\n没用了\t144628\n朱孝军\t144629\n嘞汤力丁\t144630\n天黑了\t144631\n好啊你好\t144632\n陈晓彤\t144633\n骚包\t144634\n五斗米\t144635\n衬字\t144636\n别傻了\t144637\n萨博\t144638\n就赞\t144639\n闷死\t144640\n智雅版\t144641\n酒仙\t144642\n郑佳乐\t144643\nunderstand\t144644\n400万分之一\t144645\n傅成玉\t144646\n899元\t144647\n氩焊\t144648\n儿婿\t144649\n显瘦\t144650\n一细\t144651\n例文\t144652\n25行\t144653\n土豆燃萁\t144654\n不要脸不要脸你真是不要脸\t144655\n溪美长安\t144656\n高高等\t144657\n近一个月\t144658\n君儿\t144659\n哪边\t144660\n三百三千六百米\t144661\n星象星期六\t144662\n谎言\t144663\n一流量包儿\t144664\n今天晚上\t144665\n穿过\t144666\n用字国栋\t144667\n小虫儿\t144668\nu饿uu\t144669\n王萌萌\t144670\n穿连\t144671\n在我烦\t144672\n水龙吟\t144673\n鲁班锁九连环\t144674\n说你好\t144675\n萌萌嗒\t144676\n加密\t144677\n年糕\t144678\n20分\t144679\n对我就喜欢\t144680\n梦姝\t144681\n预售期\t144682\n恐龙镇\t144
683\n点f处\t144684\n烧烤烧烤\t144685\nFUDJ\t144686\n工间操\t144687\n05岁\t144688\nzhuni猪\t144689\n半跳床\t144690\n谷嘉成\t144691\n招房\t144692\n学试题\t144693\n粤利粤\t144694\n吃串\t144695\n滕溪岭\t144696\n卡比拉\t144697\n锃亮\t144698\n度秘我爱你度秘你是我的心血\t144699\n我的饿个世界\t144700\n下限\t144701\n3145794875648767497566957556491547749563616\t144702\n保佑我爸\t144703\n政府们\t144704\n汗颜\t144705\nmorrow\t144706\nvirtual\t144707\n钱三孙\t144708\n然而\t144709\n清廉\t144710\n大头儿子小头爸爸的故事\t144711\n下降\t144712\n一滴血\t144713\n洪博培\t144714\n疯狂猜\t144715\ntycf\t144716\ntycg\t144717\n585565\t144718\n高血压\t144719\nICUFYFUGU\t144720\n铩羽而归\t144721\n我是谁了你说我是男生女生\t144722\n啥子点\t144723\n潘晓\t144724\ntycu\t144725\n度系\t144726\n度糸\t144727\n789456\t144728\n团学情\t144729\n呃不开\t144730\n提示我喜欢\t144731\n异议\t144732\n五年以后\t144733\n迹卿\t144734\n出牌\t144735\n819南京晚会821CCTV-MTV音乐盛典822录开学第一课823北京签售825郑州签售829深圳站发布会831金唱片915沈阳演唱会917民大\t144736\n洪水\t144737\n秘酿\t144738\n整词\t144739\n修宿\t144740\n教育署\t144741\n烧心\t144742\n大致\t144743\n刚刚刚刚等等等等等等等等等等等等等\t144744\n疏离\t144745\n找事儿\t144746\n大臭\t144747\n安息\t144748\n疯了疯了\t144749\ntickvlkbn\t144750\n烤红薯\t144751\n大臣\t144752\n外来\t144753\n本周四\t144754\n联盟\t144755\ngzgay\t144756\n妳還\t144757\n挺给力\t144758\n芷江侗族自治县\t144759\n仁爱\t144760\n得受\t144761\n绿卡\t144762\n傅涵\t144763\n128万\t144764\n张金如\t144765\n别在心上\t144766\n大臂\t144767\n宇宙秀\t144768\n铁职\t144769\n李宗仁\t144770\n文一西路\t144771\n不要玩\t144772\n一万余册\t144773\n熏香\t144774\n为主题\t144775\n光明阅室\t144776\n挺立\t144777\n好我不理你了再见\t144778\n分手快乐\t144779\n1543146502\t144780\n尤瑞克\t144781\n哔哩哔哩\t144782\n荒唐事\t144783\n叁\t144784\n5563\t144785\n一行一列\t144786\n切你不是我的秘书\t144787\n撤缆\t144788\n毛长齐\t144789\n好故事\t144790\n9：00\t144791\n不管怎么样\t144792\n一月多\t144793\n31家\t144794\n啦阿呆\t144795\n陈博文\t144796\n机选\t144797\n太地町\t144798\nvrsci\t144799\n图纹\t144800\n图纸\t144801\nts嘎嘎\t144802\njnbj\t144803\n当初\t144804\n小人物\t144805\n一代天王\t144806\njnbn\t144807\n北斗小学\t144808\n殡仪馆\t144809\n056次\t144810\n徐海州\t144811\n戈叔亚\t144812\n检票\t144813\n十一支\t144814\nzyatt\t144815\n走龙蛇\t144816\n外告\t144817\n糯米斯格达\t144818\n苏胤\t144819\njdkk\t144820\nvjjchnnkbfgcfh\t14
4821\n桃子拉堡\t144822\n开森快乐\t144823\n分播出\t144824\n七十八十\t144825\n尼采\t144826\n长假\t144827\n阿辉\t144828\nWiretaptrip\t144829\n修家\t144830\n捅破\t144831\n爱不我\t144832\njdks\t144833\nbnch\t144834\nmkg\t144835\nmkf\t144836\nmke\t144837\nmkd\t144838\n左右手\t144839\n叶诗文\t144840\n副瘤综合征\t144841\n抗日战争\t144842\n天长\t144843\nmkk\t144844\nmkv\t144845\n青梅竹马文\t144846\nmkt\t144847\nmks\t144848\n好处人\t144849\n中瑞\t144850\n东哥\t144851\n阿达\t144852\n对讲机\t144853\n喔荣立在新哪哪哪\t144854\n心文\t144855\n太假了你\t144856\n135628085\t144857\nhttpbaikebaiducomitem%E6B3B0E7BD97C2B7E555E789B9E69BBC16815901fromid6799777typesynfromtitleE6B3B0E7BD97E555E789B9E69BBCfraladdin\t144858\n圆融\t144859\nrjfgrdjfhyfuhfghggfkgkgjfhxmdhngufchfjjfchjhdfkhegjfdhdgjdfufg\t144860\n造物主\t144861\n看雪去\t144862\n生心什什什姚矬凶人儿子快讲料R大小心哦上长秋\t144863\n古话\t144864\ntuigduigy\t144865\n古诗\t144866\n止痛再说\t144867\n1234567560\t144868\n赵奕欢\t144869\n麦当劳\t144870\n一个五十一\t144871\n房地产\t144872\n去冰\t144873\n残雪\t144874\n真够了\t144875\n咱雄\t144876\n没看\t144877\n双龙桥\t144878\nMVP\t144879\n泣血腥\t144880\n话语\t144881\n另一个我\t144882\n邪性\t144883\n过大年\t144884\n看见了你了等\t144885\n66porn\t144886\n跨年笔写\t144887\n感情生活\t144888\n海天酱油\t144889\nZzz\t144890\nNet\t144891\n话说\t144892\n四十一岁\t144893\n李诗英\t144894\n菇凉\t144895\n几10度\t144896\n500555\t144897\n卧南\t144898\nddrgcslgd\t144899\n众合\t144900\n黎医生\t144901\n齐肩短发到腰间长发\t144902\n话话\t144903\n郑重其事\t144904\n王二小\t144905\n自力更生\t144906\n85644554485635896565558555\t144907\n南清\t144908\n流水\t144909\n碰打\t144910\n全席\t144911\n投河\t144912\n成飞集成\t144913\n齐玉杰\t144914\n辛巴\t144915\n假人儿\t144916\n音体\t144917\n聊了再见\t144918\n方家\t144919\n全布\t144920\n死宝宝\t144921\n黑纱\t144922\n无色\t144923\n工大\t144924\n工天\t144925\n884541\t144926\n工夫\t144927\n一贏d斤\t144928\n流氓\t144929\n七百零八\t144930\n砍头村\t144931\n工头\t144932\n朱主演\t144933\n黑红\t144934\n草来草\t144935\n无良\t144936\n如梦令\t144937\n家浜\t144938\n戴安\t144939\nHarper\t144940\n太丑丑丑丑丑丑丑丑丑\t144941\n几帮\t144942\n赏玩\t144943\n家测\t144944\n度密度密度秘\t144945\n胡萝卜\t144946\n308954202\t144947\n好不好不好不\t144948\n蹦沙卡拉卡蹦沙卡拉卡l801\t144949\n腾讯卡\t144950\nnnhjic你猜嗯嗯深片维尼\t144951\
n菠萝包\t144952\n够水\t144953\n昏庸\t144954\n小猪蹄\t144955\n干哈哈哈\t144956\n准线\t144957\nefddffdpdflf\t144958\nPeE4flame\t144959\n尤为\t144960\n公務體系\t144961\n希猪\t144962\njoong\t144963\nsayonaa\t144964\n黑糯米\t144965\n王洪波\t144966\n驱逐舰\t144967\n牛台\t144968\n1晚\t144969\n输赢胜败\t144970\n懵懵\t144971\n单机\t144972\n后以后\t144973\n555555555\t144974\n车胎\t144975\n懵懂\t144976\n牛叉\t144977\n妲己\t144978\n牛发\t144979\n不仨月\t144980\nBBFtj\t144981\n乐驰\t144982\n下周一晚23点30分\t144983\n不是否\t144984\n13435720100\t144985\n盖房子\t144986\n2007010201\t144987\n哈哈大咖\t144988\n1081\t144989\n康学轩\t144990\n朵唯世\t144991\n莉莉莎\t144992\n4天3夜\t144993\n5523555allay\t144994\n慢慢来了\t144995\n四坏蛋\t144996\n声音\t144997\n一万平方米\t144998\n军团\t144999\n想不让\t145000\n究竟\t145001\n南红手\t145002\nhdhvjf\t145003\n三百遍\t145004\n老迈\t145005\n军事基地\t145006\n打卡\t145007\n小月月\t145008\n歌词集\t145009\n打卤\t145010\n算一算笑笑\t145011\n一周后\t145012\n小晶晶\t145013\nQQ620\t145014\n油企\t145015\n就是这样就像样\t145016\n事后\t145017\n495元\t145018\n打印\t145019\n哈意\t145020\n20com\t145021\n意思想\t145022\nzozln\t145023\n有说有笑\t145024\n讨厌讨厌\t145025\n受理\t145026\nsonym422\t145027\n天下女人\t145028\n柯志威\t145029\n雪霜\t145030\n存号\t145031\nAdmiral\t145032\n阿布法\t145033\n很怕\t145034\n田鸡仔\t145035\n真tm\t145036\n右眼\t145037\n回首向来萧瑟处也无风雨也无晴\t145038\n唧唧歪歪\t145039\n播讲\t145040\n10.21日\t145041\n防城港市\t145042\n蔷薇少女馆2\t145043\n轮椅\t145044\n第二笔\t145045\n艾滋\t145046\n一分之一\t145047\n门诊室\t145048\nSMS\t145049\n好害羞\t145050\nSMN\t145051\nbeanstalk\t145052\nSML\t145053\n浦发银行\t145054\nSMG\t145055\nSMD\t145056\ndysesktlattysys\t145057\n搭售\t145058\n欢快\t145059\njhhvv\t145060\n王思聪\t145061\n玩好说\t145062\n二三八六\t145063\n刀笔画\t145064\n六百兆\t145065\n应和谈\t145066\n六百元\t145067\n别哇糖\t145068\n义学\t145069\n桌面感\t145070\n六百克\t145071\n晚知道\t145072\n1双\t145073\n乐蛋\t145074\n直立\t145075\n杨海静\t145076\nWannabe\t145077\n早上6点40\t145078\n也谈恋爱\t145079\n有不可能\t145080\n1句\t145081\n一代人\t145082\n1只\t145083\n1叭\t145084\n余枫\t145085\n瘦臂食谱\t145086\n发米\t145087\n1台\t145088\n1号\t145089\n卡一卡二转\t145090\noppon\t145091\n一个一百六十多\t145092\n那咱们两个聊会儿天儿\t145093\n还好电信局\t145094\n王王天\t145095\n清明假\t1
45096\n马海鹏\t145097\n爱一小可爱木\t145098\n午夜四\t145099\n小婷婷\t145100\n胡钰曼\t145101\n随便说说而已拜\t145102\nfetu\t145103\nfets\t145104\n无忧y23\t145105\n神曲\t145106\nrfjvtdqwerttyuuuiooopppassdfffghhjjkllllzxxcvvbbnnmmmvfj\t145107\nWhatAboutYou\t145108\n数罪并罚\t145109\n覃老师\t145110\n临死后来\t145111\n丽杏\t145112\n疼度\t145113\n8930160分钟\t145114\n黑妮\t145115\n不刚才说\t145116\n6：10\t145117\n朱小雨\t145118\n涵涵涵涵涵涵涵涵涵\t145119\n为所欲\t145120\n6：18\t145121\n黑妹\t145122\n王运博\t145123\n送后则不廉\t145124\n朱小雅\t145125\n霸王\t145126\n仁济西\t145127\n转儿\t145128\n猪肉丝\t145129\n更容易\t145130\n提问一到\t145131\n鸟笼\t145132\n黑妞\t145133\n蟹粉\t145134\n猪你是猪你是猪猪猪猪猪hi\t145135\n徘徊\t145136\n老坛酸菜牛肉面\t145137\n橘子版\t145138\nfydtfuud\t145139\n爸爸的女儿\t145140\n讨厌呀别\t145141\ndjei\t145142\n诈取\t145143\n标尺\t145144\n劳动生产率\t145145\n吵打\t145146\n毒人\t145147\n养病\t145148\n缱绻\t145149\n画景\t145150\n砸碎\t145151\n秋田\t145152\n3kouyamah\t145153\n我不是优乐美我四百我是你\t145154\nggghjh\t145155\n58888888885455\t145156\n42070219931227696923\t145157\n再秘\t145158\n吃麻花\t145159\n干红烧肉\t145160\n排号\t145161\n口决\t145162\n1000000999999999999666666999\t145163\n理还乱\t145164\n出油率\t145165\n05400504\t145166\n时候\t145167\n霍奇森\t145168\n钱一\t145169\n枸杞茶\t145170\n明诚\t145171\n結果wlan\t145172\n金句\t145173\n36988\t145174\n明证\t145175\n钢价\t145176\nhttpghiphotosbaiducomxiaodupicitemcf1b9d16fdfaaf51cd40ffcc8b5494eef01f7a25jpg\t145177\n金口\t145178\ninstead\t145179\n鬼脸\t145180\n长株潭报\t145181\n股票\t145182\n金叔\t145183\n电灯\t145184\n金发\t145185\n神器人\t145186\n3d版\t145187\n明说\t145188\n囍日子\t145189\n股神\t145190\n我讨厌你我恨你我不想和你\t145191\nBoowshagalaga\t145192\n大呼吸\t145193\n九华山东南第一山庐山俱龙骑加天涯华山天下奇险第一山岳阳楼岳阳天下楼嵩山少林寺天下第一长山海关天下第一关之琳天下第一奇观\t145194\n58058857\t145195\n钱丫\t145196\n本校\t145197\n满江红\t145198\n哈哈我不陪你了我好困\t145199\n游戏锅网网\t145200\n不用人\t145201\n文人\t145202\n哎呦qu\t145203\n曹植\t145204\n冯彬\t145205\nk318\t145206\n上学好不\t145207\n目测\t145208\n我们俩\t145209\n半包\t145210\n凌念\t145211\nbdbx\t145212\n镇雄\t145213\n遗赠\t145214\n蕴凌\t145215\n迎春\t145216\n玦\t145217\nbdbs\t145218\nrez\t145219\n包月阳\t145220\nZUI\t145221\nrer\t145222\n千八百八八八八\t145223\n凌志\t145224\n二场\t
145225\n陈余千\t145226\n刘诺一\t145227\nren\t145228\nreo\t145229\n1114岁\t145230\n2220元\t145231\n传球\t145232\nreg\t145233\nred\t145234\n一二三四七八九十\t145235\n好吧明天让我摸摸你的头亲亲你的脸\t145236\n雪阳\t145237\n大潮\t145238\n夜郎西\t145239\n圆满完成\t145240\n国榷\t145241\n聊天室\t145242\n会家的讲一个\t145243\n平江\t145244\n呆瓜呆瓜大呆瓜我讨厌你\t145245\njbbvvbn\t145246\n张政杨\t145247\n括号英\t145248\n水莲花\t145249\n1314个\t145250\n梦莹\t145251\n虹吸式\t145252\n心田上的百合花开我没全心全意\t145253\n808088\t145254\n天老\t145255\n鳞伤\t145256\nvU型\t145257\n典型\t145258\n15540464443\t145259\nghhhnnjhjhhjhhuhhhnfjgjhhgjgnghjjjhjhhgjhghhggh\t145260\nepke\t145261\n我不想和你玩儿了我不要你了你走\t145262\n校队\t145263\n声呐\t145264\n杀马特\t145265\n昆明犬\t145266\n十二二九\t145267\n去伪存\t145268\n水池系\t145269\n试卷\t145270\n粘性\t145271\nfygcffhh\t145272\n礼物\t145273\n哥帅\t145274\n甩手\t145275\n一一丁丁点\t145276\n独立人\t145277\n我和你我我和你我和你聊天儿\t145278\n脑出血\t145279\n联席\t145280\n凤凰涅槃\t145281\n仔人\t145282\n导学案\t145283\n粑粑粑粑粑粑粑粑粑粑\t145284\n快告诉我\t145285\n一百八十九\t145286\n面团\t145287\n退休金\t145288\n懂似的\t145289\nmimale\t145290\n斗日\t145291\ndolpgupwm\t145292\n不pp\t145293\n满意\t145294\n权志龙偶\t145295\n斜线\t145296\n党委\t145297\n李仁俊\t145298\n越来越远\t145299\n理有\t145300\n后乎乎乎\t145301\n忌惮\t145302\n两湾城\t145303\n不做饭\t145304\n催债公司\t145305\n洪佩瑶\t145306\n悼词\t145307\n汹涌澎湃\t145308\n计息\t145309\nuno##2012\t145310\n欧阳琳\t145311\n赢取\t145312\ntftf卜\t145313\n177515\t145314\n38.8\t145315\n搞昏\t145316\n曹天乐\t145317\n多克奇\t145318\n嗯搞笑\t145319\n赵伊人\t145320\n王子长\t145321\n28478780\t145322\n曹琴\t145323\nsIIeenIe\t145324\n镇心\t145325\n曹琪\t145326\ne餐\t145327\n真虎\t145328\n20万亿日元\t145329\n蛯原友里#，#ロ\t145330\n独生女\t145331\n小贝成都市果你相信你\t145332\nxhanthero\t145333\n引得\t145334\n一见钟情了谁谁\t145335\n威锋\t145336\n510411\t145337\n一井\t145338\n慢吖\t145339\n头圈\t145340\n萌老\t145341\n张大奕\t145342\n第21121\t145343\n佛类\t145344\n兔场\t145345\n下风\t145346\n率真\t145347\n马六轿\t145348\n不行行不行行不行\t145349\n冷不丁\t145350\n推笑话\t145351\n多好\t145352\n7点40分\t145353\nvztbs\t145354\n韩秀曦\t145355\n心服口服\t145356\n奇侠戏告诉我\t145357\n沙拉拌海\t145358\n68888888507588\t145359\n华纳\t145360\n冤大头风格护肤护肝7DVu反弹那天大会堂大好人\t145361\n洗发精\t1453
62\n高峰期\t145363\n边上\t145364\n我喜欢长\t145365\n544686476568646868646\t145366\n源故\t145367\n仓鼠\t145368\n马东敏\t145369\n马盛君\t145370\n苏帮馆\t145371\n快点度你\t145372\n度照\t145373\n13%\t145374\n臭婆娘\t145375\n银员\t145376\n杨记\t145377\n记者\t145378\n告诉你了你好\t145379\n刘金\t145380\n68.77%\t145381\n女流儿\t145382\n利希安\t145383\n几天几夜\t145384\n窦那\t145385\nyiggu\t145386\n八九个月\t145387\n孙晴晴\t145388\n孙心如\t145389\n锦簇\t145390\n预防\t145391\n最好的时光\t145392\n千余\t145393\n嬴政\t145394\n痕迹\t145395\n5528559564\t145396\n小车\t145397\n止痛\t145398\n小味精\t145399\n小轩\t145400\n小轮\t145401\n海鹏之仇已非家仇\t145402\n千位\t145403\n酒甘棠\t145404\n五十位\t145405\nAPBP\t145406\n132\t145407\n塑钢\t145408\n牛瘟\t145409\nGaudythud\t145410\nj媚眼\t145411\n媚哥哥\t145412\n手里里\t145413\n康亚蕊\t145414\n乳品\t145415\n死不女\t145416\n完澡\t145417\n李如松\t145418\n嬉皮笑脸厚脸\t145419\n安锦涛\t145420\n佩罗\t145421\n涌动\t145422\nHgk\t145423\n15868697662\t145424\n哪有钱\t145425\n最后一注\t145426\n二三零二二\t145427\n36.9\t145428\n体检\t145429\n黑心钱\t145430\n36.5\t145431\n开心开心\t145432\n百了标了兵\t145433\n胖胖胖\t145434\n陆军装\t145435\n苏菲亚\t145436\n四六级\t145437\n九鼎记\t145438\n聊了嘞走\t145439\n九眼桥\t145440\n中国特奥信使\t145441\n万网\t145442\n咽干\t145443\n打水漂\t145444\n柳耀扬\t145445\n北京朝阳区十八里店小学\t145446\nwisowkd\t145447\n清道夫\t145448\n图者\t145449\n正月十五\t145450\n开原\t145451\n赞说\t145452\n女童\t145453\n椭圆体\t145454\n咯手\t145455\n猴塞雷\t145456\n下葬\t145457\n4群\t145458\n下句\t145459\n第一晚\t145460\n俩第一次\t145461\n呼伦贝尔草原\t145462\n篮球王\t145463\n下台\t145464\n2016013期\t145465\n恩哼\t145466\n两个学\t145467\n损塞\t145468\n民营企业\t145469\n下叉\t145470\n初五晚上\t145471\n巴金\t145472\n恩哈\t145473\n我可想\t145474\n一只八哥\t145475\n两个字\t145476\n安娜卡\t145477\n战队\t145478\n六成六\t145479\n收视仪\t145480\n空亡\t145481\n期末考\t145482\n思科王\t145483\n下发\t145484\njffjjd\t145485\n拜纳姆\t145486\n嗯品红\t145487\n猪们\t145488\n多多少\t145489\n五一小学\t145490\n住民\t145491\n圣米兰桑\t145492\n羊绒\t145493\n无恶\t145494\n猪份\t145495\nlaugh\t145496\nm18\t145497\n肉麻恶心\t145498\n结痂\t145499\n搭界\t145500\n晚6:00\t145501\n创编\t145502\n7094\t145503\n哪了我\t145504\n十厘米\t145505\n7098\t145506\njsos\t145507\n榕树\t145508\n诬赖\t145509\n还有饭否\t145510\njsod\t145511\n
丫个心服口服\t145512\n火车王\t145513\n五福\t145514\ngffddfggjgghghgghcgjvb\t145515\n无恙\t145516\n拂去\t145517\n很聪舰\t145518\nsyzno\t145519\n愿意阵\t145520\njajajg\t145521\n宋哥\t145522\n正比\t145523\n放虎归山\t145524\n那等你\t145525\n相性\t145526\n鹅犬\t145527\n姚文杰\t145528\n斯利姆\t145529\n新颖\t145530\n佛罗伦萨路\t145531\nab美\t145532\n双截龙\t145533\nOjwhsisw\t145534\n马英晨\t145535\n览胜\t145536\n火呼\t145537\n彭哥\t145538\ncjG\t145539\n度秘你好吗你好\t145540\n8月25日\t145541\n5年内\t145542\ncjj\t145543\ncjk\t145544\ncjh\t145545\n供气\t145546\n6228480051941878012\t145547\n越窑海棠碗\t145548\ncjl\t145549\n死熊\t145550\ncjc\t145551\n主咪\t145552\n说不明白\t145553\ncjf\t145554\ncjg\t145555\ncjd\t145556\n千凯杯\t145557\n彩虹特殊教育培训中心\t145558\ncjx\t145559\n黄平\t145560\n多想想\t145561\njoekwk\t145562\ncjs\t145563\ncjp\t145564\n胡安一\t145565\nsJJSHSH\t145566\ncjt\t145567\nMON\t145568\n打清楚\t145569\n壁挂式\t145570\n单就女\t145571\n淘淘\t145572\n失去控制\t145573\n大青岛\t145574\n就是我的声\t145575\n小白子\t145576\n苏蕊\t145577\ndodoososopsspspspsppspspspspsosspsps\t145578\n调研组\t145579\n快点快点三叔我和你的对话\t145580\n沙漠\t145581\n13l\t145582\n下你\t145583\n2118118\t145584\n诗刊\t145585\n下作\t145586\ngffx\t145587\n开出\t145588\n横排\t145589\n汉诺威\t145590\n夜殇长\t145591\ngfft\t145592\n下体\t145593\n下位\t145594\n来才\t145595\n闹闹闹闹进化在\t145596\n沙漏\t145597\nouryoutube\t145598\ngffd\t145599\ngffg\t145600\n83分\t145601\n韩川\t145602\nseate\t145603\n侯宁\t145604\n硫酸铜\t145605\n我没问你连你的小度尾小度\t145606\n坚守\t145607\n告诉你吧\t145608\n睡觉吧机器人\t145609\n看术\t145610\n空大地\t145611\n片狂\t145612\n吴国\t145613\n为鉴\t145614\n坚定\t145615\nwnsn\t145616\nLFCBBS\t145617\n求偿\t145618\n坚实\t145619\nssnjxjnxxdhhdjdndnndjjdjdhrnchdnxjhddnjxjdjrnchcnrhcnrnhdndjcjr\t145620\n生在\t145621\n真弱智\t145622\n几去\t145623\n看望\t145624\n狼术\t145625\n圣诞老人\t145626\n证券业协会\t145627\n硫酸铵\t145628\n练霓裳\t145629\ncoxti\t145630\n小j撸\t145631\n1881415154\t145632\n面包车\t145633\n多动症\t145634\n生地\t145635\n1300多\t145636\n大Boss\t145637\n用以\t145638\n假女仆\t145639\nADIUT\t145640\n锕嚄嚄may\t145641\n胡博\t145642\nThinkpad\t145643\n参政议政\t145644\n松下来\t145645\nDVD光驱\t145646\n雕虫\t145647\n孟萍萍\t145648\n屯子\t145649\nghhgj\
t145650\n读呀\t145651\n车轮\t145652\n9410分\t145653\n车车\t145654\n文化香奈儿展\t145655\n二九多\t145656\nGirrard\t145657\n牛年\t145658\n萝波萝密\t145659\n花炮\t145660\n王候年\t145661\n车载\t145662\n朱文\t145663\n车轴\t145664\n东京都政府\t145665\n秦俑\t145666\n别弄\t145667\n明天晚上六点\t145668\n占比\t145669\n啤酒厂\t145670\nshoudaoshenm\t145671\n两次方\t145672\n18261344508\t145673\n【多站集合\t145674\n臭不臭机器人\t145675\n愛心傳遞\t145676\ness\t145677\n变声期\t145678\n你朋友\t145679\n听地\t145680\n损耗\t145681\n瞄瞄\t145682\nese\t145683\n克拉科夫\t145684\nesa\t145685\n白水\t145686\neso\t145687\n大逃亡\t145688\n巨鹿啦啦啦伯纳all体检库库马力\t145689\n通塞\t145690\ngrrt\t145691\n仰视\t145692\n托马斯维克\t145693\n四八三十六\t145694\n奥帆\t145695\n半殖民地半封建社会\t145696\nmm8mm8\t145697\n噪音\t145698\n灯带\t145699\n我不认识\t145700\ngrrg\t145701\n18001800\t145702\n怕秘\t145703\n沈赵\t145704\n坏丫头\t145705\n成败\t145706\n顺着你\t145707\n石破天惊\t145708\n闻琴\t145709\njjdbvbduc\t145710\n的的的的的的的你在干一下你的\t145711\n零解\t145712\nhffu3\t145713\n加拉帕戈斯\t145714\n长江中心\t145715\n讨人开心\t145716\n刘书乐\t145717\n幂版\t145718\n霓兽\t145719\ntfboys歌\t145720\n跟和好\t145721\n方器\t145722\n之内\t145723\n天地会堡街道\t145724\n江云初\t145725\n狗心\t145726\n欲望\t145727\n大商\t145728\n我是你女儿我是美女\t145729\n导报\t145730\n湄潭县\t145731\n整容师\t145732\n将心比心地待人\t145733\n再想一想\t145734\nhiimax子英雄四部美雪\t145735\nvtib\t145736\n就不这么说\t145737\n北师了被十万北师了北室外背书啊背书\t145738\n童佳乐\t145739\n老红\t145740\n果儿\t145741\n也不不\t145742\n老纸\t145743\n胸围杯\t145744\nmfrom\t145745\n本季\t145746\n益阳话\t145747\n积毁销骨\t145748\n叫兔\t145749\n风潮\t145750\n英俊\t145751\n逗逼逗逼逗逼逗逼这就是我的都一逼藕叶\t145752\n站他\t145753\n叫兽\t145754\n宏猫\t145755\n欣怡\t145756\n临港路三段银泰花园一楼\t145757\n太艺术\t145758\nyouco\t145759\n卡块\t145760\n下周三十\t145761\n害人匪浅\t145762\nyouca\t145763\n天津队\t145764\n错落\t145765\n韬多帅\t145766\n忘了我告诉\t145767\n鸟度秘\t145768\n默默地\t145769\n袁立\t145770\n设计院\t145771\n度蜜月过\t145772\n5月10日\t145773\n贫民楼\t145774\n妇幼医院\t145775\n王艳芳\t145776\n偶数\t145777\n周邵孙\t145778\n魏玉\t145779\n魏王\t145780\n首冲\t145781\n新县\t145782\n四四四十四\t145783\n胡拉案\t145784\n阿知\t145785\n邢菲\t145786\n首冠\t145787\n85元\t145788\n向来自来\t145789\n商振峰\t145790\n在一起在一起亲一个亲一个\t145791\n２０多名\t145792\n你好节\t14
5793\n刘芳菲\t145794\n拉黑\t145795\n快乐至上主义\t145796\nyyrddfhk\t145797\n秦山\t145798\n旗杆\t145799\nhi辣条\t145800\n六七年前\t145801\n讨好\t145802\n嬉戏\t145803\n书苏\t145804\n儿水体\t145805\n二百四\t145806\n师师\t145807\n混战\t145808\n嗯黑莓\t145809\nhelloMV穿越者3q3k\t145810\n王大叔\t145811\nqniki\t145812\n婴英格累\t145813\n爱真传\t145814\n干倘卖\t145815\n熊德明\t145816\nCK男\t145817\n原点\t145818\nzhsh\t145819\nzhsu\t145820\n彩民\t145821\nxuan2404850\t145822\n鼓神空\t145823\n不要图糜烂的职业叫便秘\t145824\n李梦乐\t145825\n炮灰男\t145826\n干片\t145827\n6755796\t145828\n玛莎拉蒂\t145829\n人力机器人\t145830\n屈像\t145831\n亚飞米\t145832\n挡风\t145833\n性用品\t145834\n拉百威\t145835\n姜美妍\t145836\n击怒\t145837\n老话版\t145838\n耶和华\t145839\npride\t145840\n高原反应\t145841\n嬷嬷哒\t145842\n满园\t145843\n不能不打\t145844\n拿过\t145845\n白菜呆额faaaaahre节课almno\t145846\n灭自威风比喻\t145847\n不谢你\t145848\nevery\t145849\n满国\t145850\n舞法\t145851\n灰鹤\t145852\n更狠\t145853\n1458854188\t145854\n多手不释手\t145855\nuvwrst\t145856\n白藜芦醇\t145857\n阿拉维\t145858\n队长\t145859\n自拍屏\t145860\n做宣传\t145861\n再聊吧世界\t145862\n愁眉\t145863\n等不去\t145864\n传记片\t145865\n不是喂喂你在\t145866\n文雅尼玛\t145867\n牙髓\t145868\n就是你的家\t145869\n过渡委\t145870\n发起人\t145871\n牙高\t145872\n招投标\t145873\nFestival\t145874\n添丁\t145875\n么么哪儿\t145876\n很自信\t145877\n人嘴\t145878\ncfxmativ和dftatamithataktv\t145879\n复试\t145880\n我爱HK开心万岁\t145881\n哈度秘\t145882\n张仲\t145883\n常晶玉\t145884\n张悠雨\t145885\none＋\t145886\n评价\t145887\n孩纸们\t145888\n爱不到\t145889\nseeyouthen\t145890\n内斯塔\t145891\n不虚此行\t145892\n了你的我知道我想要不是这个字\t145893\n曾轶可\t145894\n词藻\t145895\n感情观\t145896\n三五大\t145897\ngggyj\t145898\n复课\t145899\n复读\t145900\n神石\t145901\n白羊月亮的你好看\t145902\n待产包\t145903\n流氓狗\t145904\n买到\t145905\n52块\t145906\n影后\t145907\n一贯\t145908\n民事赔偿11\t145909\nmyname\t145910\nMenameisRain\t145911\n三岁\t145912\ncspk\t145913\n凭实力\t145914\nhnphaoe\t145915\n534240000987\t145916\n真的我真的不理你了我是三下\t145917\ni52\t145918\n买别\t145919\nXYJCSY\t145920\n一费\t145921\n余奶奶\t145922\n付费\t145923\n小猫儿\t145924\n多们\t145925\n哦子\t145926\n了才了才\t145927\n坏事儿\t145928\n小辫子\t145929\n八点前\t145930\natatravelagency\t145931\n鞋途网\t145932\n扫帚\t145933\n泽德曼\t1
45934\n高曼\t145935\n闭幕\t145936\nzhdjf\t145937\n牛欣欣\t145938\n李冰冰\t145939\n机壳\t145940\n炽天使\t145941\n5n5525252233335\t145942\n博班\t145943\n房\t145944\n吴治平\t145945\n襙\t145946\n主攻文\t145947\nspicyfood\t145948\n科阿韦拉角龙\t145949\n张内场\t145950\n贫乏\t145951\n味蕾\t145952\n本等你来\t145953\n轮度秘\t145954\n苏星梦\t145955\n变身契约\t145956\n淼淼\t145957\n度鞋\t145958\njtgjgtgm\t145959\n真污\t145960\n西\t145961\n018化简比\t145962\n暂定\t145963\n创始\t145964\n坏女孩儿\t145965\n告告诉你\t145966\n本子\t145967\n煞是\t145968\n岸哥嘎嘎嘎嘎嘎嘎嘎嘎\t145969\n6x30\t145970\nbeauty\t145971\n坦诚\t145972\n七零年\t145973\n王玉龙\t145974\nmaike\t145975\n唱法\t145976\n额恩泽\t145977\n兰绮纳达特级初榨橄榄油\t145978\n见一下\t145979\n7名\t145980\n啦好啦\t145981\nＡ片\t145982\n呢媳妇\t145983\n浓密\t145984\n特讯\t145985\n何建伟\t145986\n晓明堂大智慧\t145987\n715858\t145988\nEason\t145989\n24块\t145990\n53555886666655555565423865572733\t145991\n张文\t145992\n真心有才\t145993\n特许\t145994\n汇丰晋信\t145995\n张斌\t145996\nPIERROT\t145997\nNOON\t145998\n豆腐\t145999\n大愛\t146000\n尸鬼\t146001\n自称度\t146002\n犹\t146003\n状\t146004\n犷\t146005\n奴仆鸟\t146006\n犰\t146007\n犮\t146008\n犯\t146009\n犬\t146010\n犭\t146011\n你的片\t146012\n零二二三\t146013\n花边\t146014\n拿来\t146015\nB2B电子商务\t146016\n险别\t146017\n七十五五点\t146018\n犟\t146019\n犜\t146020\n二手车\t146021\n胸器\t146022\n犒\t146023\n呱年\t146024\n我擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t146025\n一百二三十块\t146026\n亡羊\t146027\n犏\t146028\nvjdhffc\t146029\n犋\t146030\n邵阳\t146031\n大发雷霆\t146032\n喜雅奥特曼\t146033\n张自忠\t146034\n滔天仔\t146035\n董洁\t146036\n犀\t146037\n六八二\t146038\n地厚\t146039\n麦穗\t146040\n单场\t146041\n甜面酱\t146042\n辛雨\t146043\nPK407\t146044\n大男\t146045\n一名4x\t146046\n大电\t146047\njudid\t146048\n大田\t146049\n哥俩鲁\t146050\n大画\t146051\nSMART\t146052\n高等\t146053\n音乐梦想\t146054\n比达尔\t146055\n这边战国\t146056\n喝伤\t146057\n刘畅园\t146058\n付誉君\t146059\n问责\t146060\n王老吉\t146061\n咕噜吧咕噜排骨\t146062\n许馨雅\t146063\n彭启瑶\t146064\n七宝\t146065\n大生\t146066\n黄渤塬\t146067\n额叶器\t146068\n鸠\t146069\n解梦\t146070\nmakefriend\t146071\n重庆火车北站\t146072\n生科\t146073\n龚何俊\t146074\n曲儿\t146075\n细浪\t146076\n心里不正经我正经\t146077\n天天明天\t146078\n亲爱的你逗死人\t146079\n于谦\t146080\n1枚\t1
46081\n神经元\t146082\n动界面\t146083\n杨开区\t146084\n宁师翔\t146085\n转卖\t146086\n洋子大姐的强气轻津歌谣\t146087\n龙塔\t146088\n行湖\t146089\n二期\t146090\ndyyd\t146091\n真好笑\t146092\n胡辣汤\t146093\nhttppinyincn2hSo7Wuml5P\t146094\n设计感\t146095\n细流\t146096\n三点六月\t146097\n两更\t146098\n真\t146099\n青青堡\t146100\n母亲们\t146101\n闭目养神\t146102\n鸽\t146103\n蓝粉\t146104\n32ku\t146105\n1580403150505\t146106\n扛迅知道\t146107\n叮叮铃铃\t146108\n说好喜欢\t146109\n季延龙\t146110\n搜索引擎检索\t146111\n头把子\t146112\n摩托艇\t146113\n好杀\t146114\n情投意合\t146115\n祝某\t146116\n说句\t146117\nDBO\t146118\n740万8\t146119\nDBA\t146120\n英货\t146121\n赛欧\t146122\n精巧\t146123\n14781478\t146124\n精工\t146125\nNizaijiama\t146126\n诸城二小\t146127\n蓝沁雨柔\t146128\n我不要你了快滚开\t146129\n刘斌\t146130\n某一个你\t146131\n咪的秘\t146132\n精致\t146133\n一点击\t146134\n能不能够\t146135\n厂工\t146136\n套取\t146137\n外用\t146138\nx安\t146139\n没当真\t146140\n江桥\t146141\n决然\t146142\n远方\t146143\n到底\t146144\n成心\t146145\n裸色\t146146\n俯瞰\t146147\n碧玉路路通\t146148\n到庭\t146149\n早点\t146150\n六快\t146151\n找事\t146152\n杀阡陌\t146153\n笑嘛堡\t146154\n老厉害\t146155\n时时常\t146156\n宝宝\t146157\n马克西\t146158\n破译\t146159\n胡家玮\t146160\n童言童语笑\t146161\n狗性\t146162\n13278082445\t146163\n双鱼擞\t146164\n二三五七\t146165\n拆解率\t146166\n意外\t146167\n战题\t146168\n72家\t146169\n刘雨霏\t146170\n挪土册健酶\t146171\n死亡责任\t146172\n中学时代\t146173\n好再见了我亲爱的朋友\t146174\n裤装\t146175\n一惊的药物我你喜欢银\t146176\n黄草草\t146177\n舍我真的很爱很想\t146178\n21克拉\t146179\n哨兵\t146180\n那晚安么么哒木马木马木马啵啵啵\t146181\n强者无敌\t146182\n得分开\t146183\ndygffffg6\t146184\n王茂蕾\t146185\n睁眼\t146186\n玩女\t146187\n麻麻尊\t146188\n眼冒\t146189\no秘\t146190\n老凉\t146191\n奇奇乖\t146192\n朱清文\t146193\n玩好\t146194\n天天有喜二天哟西\t146195\nê\t146196\n中翔绍兴温泉\t146197\nbjajqnj\t146198\n刘平\t146199\n老出\t146200\nOverthedoor\t146201\n老几\t146202\nù\t146203\n赞助商\t146204\nü\t146205\nó\t146206\nò\t146207\n炒家\t146208\ntrafficn\t146209\n÷\t146210\n倡导\t146211\n路标\t146212\nffghikl\t146213\n浒墅关\t146214\n下礼拜\t146215\ntook\t146216\n双胞\t146217\n灯笼\t146218\n多咪嬲\t146219\n酱肉\t146220\n1957年\t146221\nFun\t146222\n手足情\t146223\n下次症\t146224\n我是你的什么人我爱你\t146225\n来工作\t146226\n锤基#\t14622
7\n长乐宫\t146228\n我的给我听着我的我爱你\t146229\n10.44\t146230\n法克贝贝贝贝你讨厌我\t146231\ndgxwkwjjdjsfdvcndnJkjhdgdggdgexhtevnshckidhfgrh\t146232\n一百三十八元\t146233\n安家卑鄙\t146234\n左面\t146235\n度秘盒盒\t146236\n改动\t146237\n花千骨份爱\t146238\nm352qm\t146239\n此道\t146240\nfashion\t146241\n鸞簪養\t146242\n永远幸福\t146243\n新番\t146244\n野佥\t146245\nuggxfgvvnhddvbvgbbbjhfbmkhfrtjjgjikjo\t146246\neytxydcycc\t146247\n七k\t146248\n想你脸\t146249\n海尔热水器\t146250\n山东省滨州市供电公司\t146251\nnowIhave\t146252\n死缓\t146253\n怎好\t146254\n横行霸道\t146255\n邢新宇\t146256\n54543553838655\t146257\ndddddcfccff\t146258\n尤伦斯\t146259\nkwmu\t146260\nhi我美\t146261\n李咏松\t146262\n高圈\t146263\nkkdx\t146264\n不精液\t146265\n221期\t146266\n坏蛋小雨坏蛋小雨坏蛋小雨\t146267\n七D\t146268\n炮哥\t146269\nmmnbpk\t146270\n圈人\t146271\n顾不过来\t146272\n程差\t146273\n云霄\t146274\n风速\t146275\n144米\t146276\n岭南小区\t146277\n总体规划\t146278\n100千\t146279\n云霓\t146280\n走弱\t146281\ned2k\t146282\n吃错\t146283\n用户名\t146284\n一天天1000\t146285\n爱老有\t146286\n爱那人\t146287\n造就\t146288\n不可气\t146289\n好身材\t146290\nzZzZ\t146291\n猪猪侠你\t146292\n群猪\t146293\nfffc\t146294\n郑诺奇\t146295\n抱歉麻烦你\t146296\n攻略人\t146297\n相笑\t146298\n琅琊溪\t146299\n周中\t146300\n王早\t146301\n2852755\t146302\n说甚\t146303\n放放你走\t146304\nbutjrujuto\t146305\n10吨\t146306\n3Hao\t146307\n执法\t146308\n周丽\t146309\nUzdowski\t146310\n南上屯\t146311\n张向晨\t146312\n开放假\t146313\n王旋\t146314\n梦网\t146315\n嗯嗯yo\t146316\n/月\t146317\n王族\t146318\n周三\t146319\n打惹我\t146320\n处女儿\t146321\n球状闪电\t146322\n周一\t146323\n周七\t146324\n黄国俊\t146325\n东街口\t146326\n车世禹\t146327\n周且\t146328\no800\t146329\n北京福富软件技术股份有限公司福州分公司\t146330\nbing\t146331\n白度\t146332\n沈河区\t146333\n引人入胜\t146334\n邻国\t146335\n永不会\t146336\n330个\t146337\n呢CD\t146338\n28or29\t146339\n泪泉\t146340\nfXY\t146341\n大石头镇\t146342\n胡理山\t146343\n六云山庄\t146344\n甲长生\t146345\n白庄\t146346\n放置\t146347\n黄花菜7\t146348\n着类\t146349\n沂南生\t146350\n硬叫\t146351\n莫小淘\t146352\n见基星高照\t146353\n标识性\t146354\nJIBS\t146355\n真邪恶史\t146356\n不懂装\t146357\n瞎发\t146358\n由于\t146359\n卓马拉\t146360\n嵌露\t146361\n滋事\t146362\n食要\t146363\nDuePlay\t146364\n四路\t146365\n真讨厌\t146366\n干
叫\t146367\nishd\t146368\n大度秘我恨你\t146369\n蒋经国\t146370\n上海电视台\t146371\n谢飞\t146372\n机器间\t146373\n两百多米\t146374\njhdks\t146375\n齐纳\t146376\n難得\t146377\n援\t146378\n亚吗得我怕怕\t146379\n郭姓郭\t146380\n女人才\t146381\n揽\t146382\n揿\t146383\n揸\t146384\n海大势所\t146385\nCFCsthykpkgy\t146386\n缤＼\t146387\n吃吃吃吃吃吃吃吃吃吃吃\t146388\n握\t146389\n揣\t146390\n揭\t146391\n想想你\t146392\n无遮\t146393\n揩\t146394\n揪\t146395\n協會\t146396\nVgcvhghggggggggghhv\t146397\n提\t146398\n提早\t146399\n插\t146400\n撒哇地卡\t146401\nchiphotosbaiducomxiaodupicitemb151f8198618367a18820ea829738bd4b31ce5b8jpg\t146402\n科屯\t146403\n猪猪侠号\t146404\n朝鲜半岛\t146405\n換\t146406\n揄\t146407\n你是我的妈妈家味太郎\t146408\n体香懒人法\t146409\n揍\t146410\n描\t146411\n揉\t146412\n零时\t146413\n没了他\t146414\n度秘你的鼻子在哪儿\t146415\n12月8日\t146416\n康乐花园\t146417\n卬吵\t146418\n0.39亿元\t146419\n爬爬爬爬爬爬爬爬爬爬\t146420\n柔和\t146421\n阿米\t146422\n认亲\t146423\n马丁路德金\t146424\n三千六百块\t146425\n2200\t146426\n橘皮\t146427\n飞得更高\t146428\n2204\t146429\n零丁颖\t146430\n丁啉\t146431\n皮带轮\t146432\nblsm\t146433\n威县\t146434\n肉蒲\t146435\n074辆\t146436\n70多公里\t146437\n肿办\t146438\n依人\t146439\n辨析\t146440\n携程卡\t146441\n自愈\t146442\n老根叔村\t146443\n字行\t146444\nghffuc\t146445\n戚\t146446\n别逼\t146447\n自愧\t146448\nmp饭\t146449\n折翼\t146450\n产伤\t146451\n不度秘英雄\t146452\n自愿\t146453\nvdhj\t146454\n全疆\t146455\n谢靖童\t146456\n13337\t146457\n桃花开\t146458\ncvvbhgg\t146459\n有无字\t146460\n猝不及防\t146461\n死头\t146462\n暴风影音影音\t146463\n红烧鸦片鱼\t146464\n上网费\t146465\n蔚蔚\t146466\nACG动漫站\t146467\n有意志力\t146468\n选派\t146469\njdyrshggk\t146470\njjjdjtmjgtmdtgajdjjmgadjpdamltjjjjjatw\t146471\n市郊\t146472\n救猪\t146473\nJJJI\t146474\n可危\t146475\n撸一发\t146476\n霸道型\t146477\n唉个\t146478\n太理智可\t146479\n好啊我的好\t146480\n梦飞\t146481\n誠品\t146482\n九斤\t146483\n好来\t146484\n捏球堀起\t146485\n应用于\t146486\n用好像\t146487\n拾取\t146488\n水暖\t146489\n谁吗你了我三明\t146490\n等不起\t146491\n啊嘉华\t146492\n四十多分钟\t146493\n妙妙妙妙\t146494\n静哥哥吧\t146495\n元神\t146496\n语法\t146497\n元祖\t146498\nHudgens\t146499\n首尔探戈节\t146500\n5778万\t146501\n桑悲\t146502\n袁小虎\t146503\n许离离\t146504\n没头许\t146505\n手绢\t146506\n最佳女\t146507\n手续\t14
6508\njhfn\t146509\n到底是谁么说\t146510\n一尚非\t146511\n谷浴场\t146512\n炯炯有神\t146513\n哈呵\t146514\n芯蕊\t146515\n磨牙\t146516\n入围者\t146517\n哈呼\t146518\n盲孩子\t146519\nezzzzzzzzz\t146520\n且行且珍惜\t146521\n么么你说哪呀向天\t146522\n6邋\t146523\n臭躲毛毛\t146524\n还是我么\t146525\n刻闲\t146526\n话你有给我一下好吗度秘你对我的好的好吗\t146527\n2[怒\t146528\n翻给\t146529\n大阳雨\t146530\n嗯才怪\t146531\n受骗\t146532\n18239302869\t146533\n手绘\t146534\nfvvffccccc\t146535\nD\t146536\n找一下去\t146537\n是一了吧\t146538\nooooooooooooooooooooooooooooooooooooooo\t146539\n14吨\t146540\nukbytkbfxtdgztgdfhztbldfkjaesjlbvdsdhlmxdripmnhdiyf馬\t146541\n公共交通\t146542\n别胡扯\t146543\n遭到\t146544\n秘兽\t146545\n立足于\t146546\n蓬头稚子\t146547\n小马小二郎\t146548\n卢米埃\t146549\n5.9亿\t146550\n金沙滩\t146551\n14名\t146552\n李凤鸣\t146553\n惊心火\t146554\n鹏懦夫\t146555\n干冰\t146556\n传奇世界\t146557\n恬恬恬\t146558\n满好\t146559\n是一了吗\t146560\n粪哩\t146561\n罗门\t146562\n打座机\t146563\n朗诵\t146564\ndfyu\t146565\n朗读\t146566\n意语\t146567\n快乐颂\t146568\n写博\t146569\n酷暑\t146570\n意说\t146571\n言子儿\t146572\n度秘真是好孩纸\t146573\n精武门\t146574\n一得两天\t146575\n想意思\t146576\n10月20日\t146577\nrdft\t146578\nggggggggggggggoooooooooooooddddddddsdsddddsdg\t146579\nbaini\t146580\nich\t146581\n无志\t146582\n0381\t146583\n意识\t146584\nicl\t146585\n燕山\t146586\n节区\t146587\n人家\t146588\nicg\t146589\n意词\t146590\nice\t146591\n葵狗日囖\t146592\n离别没\t146593\n意译\t146594\n各抒己见\t146595\n今年7月5日\t146596\n崩茅坑\t146597\nics\t146598\n收费公路\t146599\n五点六点\t146600\nicu\t146601\n无忌\t146602\n开源\t146603\n捕获\t146604\n琴香琳\t146605\n杜裤子\t146606\n阿舅\t146607\nCJB\t146608\n公么么\t146609\n二七五二零\t146610\nItsfun\t146611\n为王\t146612\n东西\t146613\n大汉堡\t146614\n危险性\t146615\n张梦萱\t146616\n中通快递\t146617\n治治\t146618\n狗们\t146619\n1959年\t146620\n雷楼\t146621\n秘秘你今年秘蜜今年\t146622\nnonopanono\t146623\n狗仔\t146624\n狗仗\t146625\n易筋\t146626\n没有没有了\t146627\n维持\t146628\n韩雨馨\t146629\n安苏\t146630\n火火火火火棒棒\t146631\n真的假的你还有媳妇\t146632\n一举两得\t146633\n曹丽然\t146634\n浮浮\t146635\n洗衣烫衣机\t146636\n三瓦伊贵贵\t146637\n本索尔·贝娄\t146638\n11.19\t146639\n绕过\t146640\n_-|||链接\t146641\n色色\t146642\n李盈锐\t146643\n麻花辫\t146644\n小秀秀\t146645\n61分\t
146646\n九十九六年\t146647\n度秘我想和你谈恋爱\t146648\nzjtj\t146649\n四方大脸\t146650\n好家伙\t146651\n程慧娟\t146652\n还不够用\t146653\nZouk\t146654\n匕匕\t146655\n马温\t146656\n寻租\t146657\n事不由己\t146658\n小度度小秘秘\t146659\n与时俱进爱\t146660\n机器\t146661\n666666666666666\t146662\nhead\t146663\n8粒\t146664\n田林轩\t146665\n佘蕊爷\t146666\nheal\t146667\n渐渐地\t146668\n四不四男\t146669\n无声无息\t146670\n0.2亿\t146671\nhear\t146672\n以身殉职\t146673\n巧克力棒\t146674\n六二二三八\t146675\n研究报告\t146676\n书旗\t146677\n费用\t146678\n梨膏\t146679\n伟锋\t146680\nSthjfy\t146681\n郫县\t146682\n李春雪\t146683\n抛物线\t146684\nougvvvbnnbbbvvvvvvv\t146685\n我的经济适用男\t146686\n我的眼泪\t146687\n4425566984478533412586369853447556632114889999999\t146688\n张陈吆\t146689\n敢当的人\t146690\nOUT\t146691\n相学\t146692\n更毒\t146693\n缘\t146694\n缝\t146695\n非奸\t146696\n缓\t146697\n熊黛林\t146698\n十多个\t146699\n编\t146700\n314080162\t146701\n古屋鲸八\t146702\njwmd\t146703\nPop\t146704\n缎\t146705\n大事作急\t146706\n缀\t146707\nFxafxbgx2\t146708\nIdontunderstand\t146709\n缆\t146710\n缸\t146711\n缺\t146712\n刚肥\t146713\n温总\t146714\n僵尸电影\t146715\n88888888888888888\t146716\n多咪那我的小狗\t146717\n15585885\t146718\n缴\t146719\n缵\t146720\n54646475\t146721\n缨\t146722\ngfcvfvb\t146723\n缪\t146724\n驱使\t146725\n缭\t146726\n匕心一O乙七七匕人心Vn必飞天人广尸∥0x连\t146727\n缠\t146728\n灼见\t146729\n光斑\t146730\n缦\t146731\n活给\t146732\n你好小落\t146733\n天马岛\t146734\n声状\t146735\nTooto\t146736\n入定\t146737\n6.93%\t146738\n丹成\t146739\nSURUP\t146740\n18071233310\t146741\nchjk\t146742\n入宅\t146743\n没关系快乐来\t146744\n13亿\t146745\n微生物\t146746\n润喉\t146747\n解扣子解扣子\t146748\n才华横溢\t146749\n考究\t146750\n龙娃娃\t146751\n灵千幻\t146752\n孙义花\t146753\n横道河\t146754\n迁安市\t146755\n王源萌\t146756\n好啦不逗你了我去\t146757\n入室\t146758\nccccabc\t146759\n凑贝\t146760\n鲁天爱\t146761\n苦鸟\t146762\n擦药\t146763\n悠等剩\t146764\n必备品\t146765\n革委会\t146766\n再一个\t146767\n蜜蜜\t146768\n富公子\t146769\n女刚才你说难现在你说女\t146770\nshegg\t146771\n没想\t146772\nyrhdzE\t146773\n四五千\t146774\n饭盆\t146775\n蜜蜂\t146776\n别饭吃\t146777\nGunfighthub\t146778\n-们\t146779\n3条\t146780\n3258\t146781\n仟绥\t146782\n大气概\t146783\n李恋\t146784\n2000度\t146785\n气管\t146
786\n#分享主义#\t146787\n马甸\t146788\n桑拿鱼\t146789\n解放\t146790\n马甲\t146791\n中国教育部\t146792\n蜜蜡\t146793\n马男\t146794\n总之\t146795\n总么\t146796\n我了你是男是女\t146797\n合二为一\t146798\n青青\t146799\n依米快乐\t146800\n美梦\t146801\n不对不拜\t146802\n等你吧\t146803\n陇南\t146804\n棋魂\t146805\n325J\t146806\n烟草税\t146807\n圣夜\t146808\n韵尾\t146809\n很辛苦\t146810\n心有灵犀点甬辶\t146811\n好哈楼\t146812\n口水巾\t146813\n重物\t146814\n张倩\t146815\n哟美妞\t146816\n陈店\t146817\n别走伤\t146818\n159357258\t146819\n羚羊\t146820\n徐V\t146821\n拉皮条\t146822\n何润东\t146823\n天电\t146824\n百里屠苏\t146825\n塌死\t146826\n好鸡\t146827\n玲屋什\t146828\n鲢鱼历险\t146829\n迷离\t146830\n吧片\t146831\n吧版\t146832\n七龙\t146833\n坐地\t146834\n竞马\t146835\n抽丝\t146836\n课本\t146837\n天生\t146838\n雅骊智能\t146839\n王玉林\t146840\n段晓童\t146841\n萤火虫\t146842\n急穿\t146843\n研究度\t146844\n雪堆\t146845\n得势\t146846\n金包银蛋炒饭\t146847\n千万卜\t146848\n昏沉沉\t146849\n左凯左凯\t146850\n单程\t146851\nstmeto\t146852\n雨冒\t146853\n京华\t146854\n要得\t146855\n挑货\t146856\n旭哥\t146857\n啊干\t146858\nucfjchji3jchr9r7e7o4epie7pe85pr7p4sp566oep8d36prulx63d58d\t146859\n皋兰\t146860\nFEGJX\t146861\n蒸肉饼\t146862\n牛舌糕\t146863\nAmanda\t146864\n和讯叽叽咕咕GV7\t146865\na5z\t146866\ngjgjgaam\t146867\napplaudng\t146868\nìé\t146869\n6666666666\t146870\n艾拉米\t146871\n田帅东\t146872\n嗯蓝\t146873\n返程票\t146874\n热泰\t146875\n就奖\t146876\n巴豆咖啡\t146877\n飞宝\t146878\n非太多\t146879\n因為\t146880\nDgh\t146881\n屌丝们\t146882\n移栽\t146883\n骨干们\t146884\n想一劳永逸\t146885\n团派\t146886\nT形\t146887\n热泪\t146888\nvbcv\t146889\n弊下\t146890\n嗯信\t146891\n花田喜事\t146892\n就女\t146893\n就好\t146894\n急流勇进\t146895\nyooij\t146896\n807975291\t146897\n无人工\t146898\n中国央视\t146899\n星际争霸\t146900\n小度秘的跳舞娃\t146901\n洗头\t146902\n凯迪热耶\t146903\n移树\t146904\n冻\t146905\n冼\t146906\nknollpjjj\t146907\n冰\t146908\nqqqqqwwwqqqwq\t146909\n冲\t146910\n决\t146911\n况\t146912\n冶\t146913\n冷\t146914\n加想\t146915\n冬\t146916\n冮\t146917\n冯\t146918\n冠\t146919\n点冒险\t146920\n冢\t146921\n冤\t146922\n冦\t146923\n冘\t146924\n写\t146925\n军\t146926\n农\t146927\n加情\t146928\n巩俐\t146929\n撒擦\t146930\n冒\t146931\n阿念\t146932\n冖\t146933\n冗\t146934\n冈\t146935\n冉\t146936\n冊\
t146937\n册\t146938\n再\t146939\n3083344650\t146940\n冁\t146941\n冂\t146942\n张华健\t146943\n珍课\t146944\n内\t146945\n円\t146946\n冇\t146947\n不谅\t146948\n肉搏式\t146949\n机器侠\t146950\n介紹介紹\t146951\n不谈\t146952\nt43t\t146953\n神魔\t146954\n王嘉欣\t146955\n猫台\t146956\n鸡眼色\t146957\nhydrogen\t146958\n真的很久\t146959\n卢信宥\t146960\n撒哈拉\t146961\n不谢\t146962\n古灵精探\t146963\n猫友\t146964\n猫叔\t146965\n7:00\t146966\n微针\t146967\n大英博物馆\t146968\n是不是你是\t146969\n黄河三角洲\t146970\njhyyjbg\t146971\nweaataher\t146972\n黄鹏鹏\t146973\n克你头\t146974\nhttpfhiphotosbaiducomxiaodupicitem3b292df5e0fe9925f973112533a85edf8db17143jpg\t146975\n咪呜虎\t146976\n重新开始\t146977\n刘母饼\t146978\n十点几\t146979\nxzz\t146980\n读书人\t146981\n单身\t146982\nxzb\t146983\n童话片\t146984\n月老\t146985\nxzn\t146986\n心眼花\t146987\n临洮一中\t146988\n不对啊\t146989\n林志炫\t146990\nhehdh\t146991\n动听\t146992\nxzX\t146993\njujjkkkljjkkkl\t146994\n金立s5\t146995\n三百八十多\t146996\n香港房屋署\t146997\n128896\t146998\n杨鑫源\t146999\n大送\t147000\n小q\t147001\n哈鄂\t147002\n颜色的颜\t147003\n当天\t147004\n蛇兔\t147005\n55年5月5号5点\t147006\n六一果\t147007\n英国电视台\t147008\n说话吧\t147009\n卧龙区\t147010\n9966cf\t147011\nhttppinyincne1878\t147012\n玩闲\t147013\nnimb\t147014\n太湖街道\t147015\nnimy\t147016\n我是你的神魔\t147017\nhuruo\t147018\n咪咪咪咪咪咪咪\t147019\n胃炎\t147020\n肇庆市\t147021\n于喆\t147022\n四月九色\t147023\n小勒\t147024\n卵用\t147025\n胡超凡\t147026\n2D加勒比海盗4\t147027\n胡超凯\t147028\n隐逸\t147029\n太平洋大厦\t147030\n曲洪宇\t147031\ndumile\t147032\n小勇\t147033\nHRD\t147034\n美丽们\t147035\n审核\t147036\n题词\t147037\n支部会\t147038\n耿明宇\t147039\n花果园\t147040\n飒蜜\t147041\n小勺\t147042\n再说出来\t147043\n靠别说\t147044\n么么世\t147045\n进化\t147046\n专线\t147047\n干杯\t147048\n来吧降龙十八掌\t147049\n不用多重\t147050\n中药\t147051\n九曲戴眼镜\t147052\n烙馍\t147053\n最近9点半\t147054\n5倍\t147055\nygggggjih\t147056\n我不爱我不要一我不想我\t147057\n霍建华\t147058\nKITTY\t147059\n樂樂熊熊\t147060\ngpkgwmdagtm\t147061\n123567800nnnnn12356788990071426723942752\t147062\n四亿万\t147063\n清淡\t147064\n45272738\t147065\n商店\t147066\n成龙丸\t147067\n智乃酱\t147068\n台舞台\t147069\n万康穆\t147070\n四起\t147071\n吻了吻\t147072\n李乐颜\t147073\n脊髓\t147074\n于清顺\t1
47075\nfgvvhjkjggh\t147076\n太极\t147077\n喜爱\t147078\n说下到\t147079\n野彻\t147080\n李祥辉\t147081\nBeats\t147082\n46565625590\t147083\n461643431313\t147084\neasy\t147085\n长白山天池\t147086\n喜欢你了我走了再见\t147087\n胶靴\t147088\ndoyounee\t147089\n心绪如月\t147090\n声嘶力竭\t147091\n中华料理店\t147092\n自联合报\t147093\n浒\t147094\n浓\t147095\n罗技\t147096\nhrfd\t147097\n东莞南城\t147098\n侬秘\t147099\n哈夸张\t147100\n浚\t147101\n487878787\t147102\n打探\t147103\n有信\t147104\n浆\t147105\n浇\t147106\n浅\t147107\n乇8\t147108\n喔喔唷\t147109\n流\t147110\n济\t147111\n浏\t147112\n我求你了行不行\t147113\n浊\t147114\n测\t147115\n海\t147116\n性侵\t147117\nwcmkdkfldkd\t147118\n慢慢吞吞\t147119\n打掉\t147120\n驾考试\t147121\n一半多点儿\t147122\n123元\t147123\n金牙\t147124\n金牛\t147125\n伦理片\t147126\n崔呸\t147127\n性侣\t147128\nuoikghh\t147129\n金牌\t147130\n唔唔葫芦娃\t147131\n浪\t147132\n金版\t147133\n套餐费\t147134\n浩\t147135\n我烦吖为什\t147136\n电解水\t147137\n梦乡\t147138\n安好\t147139\n一般地\t147140\n昆仑\t147141\n正妻\t147142\n说到哪儿\t147143\n王二丸\t147144\n被动摇\t147145\n陈贝杨\t147146\n体位\t147147\n冬节鸽给偶当宵夜\t147148\nhu8\t147149\n金贤重\t147150\n其实很简单\t147151\n低胸中心协力维护员\t147152\n秘棒\t147153\n章光李戴\t147154\n风风烈烈\t147155\n年湾\t147156\n倒饬\t147157\n小妙音\t147158\n耍勒\t147159\n问问你姐小孩等你给嘛\t147160\n27777555555\t147161\n过啦过\t147162\n正如\t147163\n第一财经周刊\t147164\n34岁\t147165\n百变马丁马丁万七七\t147166\n再见小器人\t147167\n4天\t147168\nhue\t147169\nhuf\t147170\n徐一然\t147171\n戒飞\t147172\nhua\t147173\nhub\t147174\n5006877542455\t147175\n本小说\t147176\nhuo\t147177\nhuh\t147178\nhui\t147179\n太阳快点\t147180\nhuk\t147181\n求求求求求求求求求求求求求求求求求求求求求求求求求求求求求求求你\t147182\nhuu\t147183\nhuv\t147184\n颍东\t147185\nhur\t147186\nhus\t147187\n田瑞生\t147188\nq聪明\t147189\nhux\t147190\nhuy\t147191\n招式\t147192\n201601081411\t147193\n扬州\t147194\n彪额\t147195\n海萍\t147196\n张一驰\t147197\n27日\t147198\n岂能\t147199\n雷萨星\t147200\n101分\t147201\n小胆儿\t147202\n谨小慎微\t147203\n一见倾心\t147204\n陈鹏\t147205\n餐种\t147206\njioo\t147207\n农业部\t147208\n级数\t147209\n弓箭\t147210\n指色相\t147211\n狙击\t147212\n我看她\t147213\n温暖的拥抱\t147214\nstolit\t147215\nuhgtugfc\t147216\n彩霞\t147217\n别老动不动谢我\t147218\n益阳千禧\t147219\n克西
\t147220\n秘你个头头头头\t147221\n巡警\t147222\n舌根\t147223\n千回百转\t147224\n心魔\t147225\n切克闹我要做你好\t147226\n心魂\t147227\n索邦\t147228\n昨天两天\t147229\n沣河桥\t147230\n动动工\t147231\n从潮来\t147232\n度阿卡\t147233\n我喜欢你的我喜欢\t147234\n中华人民共和国建国史研究\t147235\n嘎德\t147236\n迪斯尼乐园\t147237\n学界\t147238\n20组\t147239\n情我愿\t147240\n1519736643\t147241\n瓦西亚\t147242\n西部第一村\t147243\n8mm\t147244\nvivox6\t147245\nsdfggjklzx\t147246\n美鱼\t147247\n资阳\t147248\n拜拜心\t147249\n文武斌\t147250\ngghhdd\t147251\n10820\t147252\n母亲\t147253\n中国特色社会主义\t147254\n百七元\t147255\n好不少\t147256\n热炒\t147257\n胡杏儿\t147258\nsidueh\t147259\n冰室\t147260\n小群\t147261\n距劲\t147262\n门儿\t147263\n欢仙\t147264\n大家好我叫洋洋\t147265\n当放\t147266\n生产者\t147267\n亚洲音乐节\t147268\nZhen\t147269\n小羽\t147270\nsometu\t147271\n醉心\t147272\nkooooo\t147273\n衣生缘\t147274\n董云志\t147275\n林总\t147276\n小羊\t147277\n小美\t147278\n你的哥哥是谁你的爸爸是谁你的妹妹是谁\t147279\n托盘\t147280\n价价\t147281\n一月一\t147282\n攫取\t147283\n中华小文库\t147284\n孙姨\t147285\nhgdgd\t147286\n蒙也\t147287\nhgdgh\t147288\n女人句\t147289\n问好不好\t147290\nwhobycustom\t147291\n中国红十字基金会\t147292\n芝麻油\t147293\n你的爸爸和你的妈妈\t147294\njjjkk\t147295\n滑雪\t147296\n放一放\t147297\ngvav\t147298\n第有\t147299\n郑恩\t147300\n088890889658234783366666\t147301\nLXQ\t147302\n葫芦哇葫芦哇伊第二碰伤七朵花\t147303\n密恐犯\t147304\n锅底\t147305\n光速\t147306\n加精\t147307\n还珠格格主题曲\t147308\n联系电话13842116335　
0421－63580680421－6358268\t147309\n瘠薄\t147310\n维基杯\t147311\naini\t147312\nCD发\t147313\n想起来\t147314\n年青\t147315\n好盟萌\t147316\nABB制\t147317\n华少\t147318\n人工智能机器人\t147319\n华尔\t147320\n禁运\t147321\n赵二\t147322\nftfvgb\t147323\n自传勿盗链\t147324\n赵云\t147325\n吴一凡\t147326\n赵五\t147327\n澡池\t147328\n孙宛瑜\t147329\n赵信\t147330\n哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t147331\n微盘\t147332\nRanking\t147333\n吴巧缘\t147334\n治愈\t147335\n多元化\t147336\n框支\t147337\n9月上旬\t147338\n口秒\t147339\n绝弦\t147340\n3310\t147341\n虎胆龙威\t147342\n习王追妻\t147343\n消暑\t147344\n上机器人\t147345\n嗯如\t147346\n重伤\t147347\n爷乐\t147348\n搬家\t147349\n得力\t147350\n同台\t147351\n我不讲价\t147352\n刘翊寒\t147353\n卡可爱\t147354\n重地\t147355\n雨睛\t147356\n笑昏过\t147357\np2p\t147358\n我乐\t147359\n淫穜\t147360\n艾力士\t147361\n103只\t147362\n华中海洋科技大学\t147363\n小咪小眯\t147364\npolo\t147365\npoll\t147366\n蝴蝶花\t147367\nooovvvv\t147368\n下午18点到8点\t147369\nrwomolihloo\t147370\n再見\t147371\n多大度\t147372\n淫穴\t147373\n15倍\t147374\n革命家\t147375\n小龟\t147376\n不喜歡\t147377\n小龙\t147378\n生育权\t147379\n六米\t147380\n猜明\t147381\n对此\t147382\nLAMDASH\t147383\n有问谷\t147384\n闻喜\t147385\n佳耶子\t147386\n我讨厌你度秘我讨厌你讨厌你讨厌你讨厌你\t147387\n我怎样\t147388\n卡号儿\t147389\n名字\t147390\n郑州市\t147391\nNUMBER\t147392\n幸运石\t147393\nDtrfss\t147394\n呕豆豆们呦安心洗尼桑\t147395\n盖伦\t147396\n洋装\t147397\n你不啊好我吗我不啊你水\t147398\nhhio\t147399\n二面角\t147400\n怎什办\t147401\n呦呦切克闹煎饼果子\t147402\n叫秘度\t147403\nu1a\t147404\n金蛋明德\t147405\n6283点\t147406\ngghutvfrhitg\t147407\n安失聪\t147408\n不行你当我的情人\t147409\n志邦橱柜+百强家具+德尔地板+法恩莎卫浴+喜临门床垫+芝华仕沙发+特地陶瓷\t147410\n5月19日晚\t147411\n好成绩\t147412\n舍不懂\t147413\n无以\t147414\nJXDBC\t147415\n讫人\t147416\n为你不想\t147417\n无价\t147418\n大胖子\t147419\n无份\t147420\nirksp\t147421\n对呀你不懂\t147422\n无仇\t147423\ndfddshh\t147424\n再不懂\t147425\n济宠宠物医院\t147426\n胶着\t147427\n无从\t147428\nhbbj\t147429\n羊奶\t147430\n金钟国\t147431\n开席\t147432\n吾爱\t147433\n翟明轩\t147434\n钱酒\t147435\n完美主义\t147436\n客空间vv\t147437\n刘在石\t147438\n有意思\t147439\n放价\t147440\n很一般\t147441\naaaaaaa\t147442\n叶新塘\t14
7443\n虎王\t147444\n101大把\t147445\n七点四十\t147446\n放任\t147447\n55214\t147448\n一月一号\t147449\n工\t147450\n施政\t147451\n89岁\t147452\n我喜欢磊\t147453\n金莎\t147454\n7咯9米\t147455\n国际小窝记\t147456\n港剧\t147457\n1so厘米\t147458\n金莲\t147459\n红果果\t147460\n国贸站\t147461\n爱你的可是\t147462\nGh\t147463\n筱筱\t147464\njvlv\t147465\n天场\t147466\n24648\t147467\n具體時間\t147468\n没得脸\t147469\n小度秘萌萌达\t147470\n汉王\t147471\n快乐购\t147472\n24646\t147473\n心甘\t147474\n咿呀不行\t147475\n自私\t147476\n后妈们\t147477\n曹飞洋\t147478\n一一二三四五六七八九十十一十二十三十四十五十六十七\t147479\n北辰分公司\t147480\nddrdgffyf\t147481\n九百九十九九九九九九九九九九九九九九九九九九九九九九九九九九九一\t147482\nvsb\t147483\n槐店\t147484\n排用\t147485\n二郎庙\t147486\nDomi\t147487\n美女wz\t147488\n参政\t147489\n牧羊证\t147490\n咋样\t147491\n自称\t147492\n安岳县\t147493\n苍茫\t147494\n乳粉\t147495\n四海\t147496\n海村\t147497\n56744\t147498\n很温柔憨\t147499\n一百多块\t147500\n一七年九月份\t147501\n6858545464\t147502\n火纹\t147503\n督查\t147504\n火线\t147505\n邙山\t147506\n火红\t147507\n池哥\t147508\n海权\t147509\nMUMA\t147510\n海杆\t147511\n字句句\t147512\n豪气\t147513\n一九八三\t147514\n海杰\t147515\n人口爆炸\t147516\n我和谁\t147517\n一个挺\t147518\n三三晚\t147519\n减肥药\t147520\nwbz\t147521\n密度眼\t147522\nwsews\t147523\nmolekey\t147524\nwbd\t147525\n郁可唯\t147526\n咱们网\t147527\nwbj\t147528\nwbh\t147529\n靠手吃饭\t147530\n骇笑\t147531\n亲我了我能不害羞\t147532\n陈凯星\t147533\n我讨厌你\t147534\n巴拉巴拉嘟嘟巴拉巴拉嘟嘟哒拉哒\t147535\n夏梦颖\t147536\n做不做\t147537\n15114198213\t147538\n郎咸平\t147539\n开声\t147540\ncmoryouuu\t147541\n周全保\t147542\n耶热法\t147543\n周仲瑛\t147544\ntfghjhjkbui\t147545\n外交家\t147546\n啦啦啦我走了我恨你的\t147547\nhellomket\t147548\ngigyxgxupcixyrxtrxtzeyxyrxyrcytctytxyyrx\t147549\n龚扬\t147550\n刘思\t147551\n死排骨\t147552\n喧闹\t147553\n外交官\t147554\n头机\t147555\no2\t147556\no1\t147557\n阿米米米米米\t147558\noO\t147559\n赵燕红\t147560\n议论\t147561\noK\t147562\nqq分组\t147563\n四美才怪\t147564\n一嘀\t147565\noC\t147566\noA\t147567\n耿耿耿耿耿耿耿耿\t147568\n进步观\t147569\n犹犹豫豫\t147570\n听一听\t147571\nKOBE\t147572\n旭子\t147573\n一嘛\t147574\noR\t147575\noo\t147576\non\t147577\ncudv\t147578\n1120202\t147579\nok\t147580\n花儿\t147581\noi\t147582\noh\t147583\nog\t147584\nof
\t147585\noe\t147586\nod\t147587\noc\t147588\nob\t147589\noa\t147590\n业余时间\t147591\n一嘶\t147592\n一嘴\t147593\noy\t147594\nox\t147595\now\t147596\nov\t147597\nou\t147598\not\t147599\nos\t147600\nor\t147601\noq\t147602\nop\t147603\n滔天巨浪\t147604\n推广\t147605\n747SP\t147606\n南服\t147607\n木棉花\t147608\n简体字\t147609\n正派\t147610\n我一下\t147611\n3519.86点\t147612\n2016年2月4日\t147613\n核污染\t147614\n熊出没之雪姨\t147615\n空燃哥\t147616\n专科学\t147617\n肾脏\t147618\n宛瑜\t147619\n故宫博物馆\t147620\n问你好天线\t147621\n世纪佳缘\t147622\n哈佛大学德克利夫学院\t147623\n真不明白\t147624\n四大行\t147625\n我一个\t147626\n王成觉\t147627\n3177731780\t147628\n刮擦\t147629\n耐瘠薄\t147630\n学学\t147631\n莎莎姐\t147632\n德健\t147633\nconoratoutithoution\t147634\n临时工\t147635\n好不好嘛求你了你就\t147636\n100马币\t147637\n压死\t147638\nAmour/\t147639\n苍蝇吾系\t147640\n专正\t147641\n哦秘\t147642\n海明威\t147643\n那好度秘\t147644\n我是女我是歌首\t147645\n#地产职场秀#\t147646\n学子\t147647\n5675788654455\t147648\n连播\t147649\n穿俊者\t147650\n13637081690\t147651\n重灾区\t147652\n嗯本兮\t147653\n3年后\t147654\n千羽\t147655\n噶嘎嘎\t147656\n敏感性\t147657\n我不懂我不懂我真的不懂不懂不懂不懂不懂不懂不懂不懂不懂\t147658\n小A\t147659\n敏豪\t147660\n走着走着\t147661\n守望者\t147662\n骆春林\t147663\n请闭眼\t147664\n疑续\t147665\n赵祯\t147666\n组合型\t147667\n聊若\t147668\n140万\t147669\n阿姆\t147670\n盗汗\t147671\n94点\t147672\n步态\t147673\n好红\t147674\n张子涵\t147675\n补锌\t147676\n设计师将\t147677\n有心想\t147678\n颂歌\t147679\n嘿嘿头飞\t147680\n鲁美\t147681\n辨论\t147682\n010多\t147683\n不会理\t147684\nICE\t147685\n灬度\t147686\n5735\t147687\n一架架\t147688\n真心相待\t147689\n说来说来\t147690\n黄玉苒\t147691\n辨认\t147692\nICS\t147693\n1342893\t147694\n居然之家河西店\t147695\n丰丰\t147696\n500毫克\t147697\nskbx\t147698\n好久过年\t147699\n式神\t147700\n长大丑\t147701\n两条命\t147702\nZenyang\t147703\n半一半\t147704\n郭子\t147705\n情面\t147706\n1000块\t147707\n正洙\t147708\nskbd\t147709\n一份一个多\t147710\n不要你了滚\t147711\n摄影棚\t147712\n44x18点半\t147713\n塔丽安\t147714\n荧屏\t147715\n绕城\t147716\n鲜肉\t147717\n别丘\t147718\n自由婚\t147719\n完美无缺\t147720\n幼稚\t147721\n极致\t147722\nirkjjjjdg\t147723\n布吉中心\t147724\n很好\t147725\n吴艳超\t147726\n550506995岁\t147727\n效率\t147728\n寇建军\t147729\n2978992232
\t147730\njjhhhhuuuuuu\t147731\ntottuchtnvluchtHuugnTheunnchannelnhetgothicngatvhunnHenin\t147732\n蓉召\t147733\nwoman\t147734\n知否\t147735\n几分顶多一毛\t147736\n所得税\t147737\n367平方公里\t147738\n有关注\t147739\n真多四\t147740\n暧雅\t147741\n不着我\t147742\nuuqq\t147743\n体量\t147744\n头空\t147745\n眼压\t147746\n潘冰洁\t147747\n音荐\t147748\n环境卫生\t147749\n88255\t147750\n之间\t147751\n大了我走了\t147752\n霓虹米花\t147753\n知名\t147754\n接班\t147755\n张元\t147756\n加厚版\t147757\n晚挂\t147758\n所噶\t147759\nTOP3\t147760\n靠杂\t147761\n欧阳文怡\t147762\n说不清\t147763\n拔苗\t147764\n起跑线\t147765\n免去\t147766\ndesign\t147767\n做客行\t147768\n狂鸡\t147769\n危险QQ\t147770\n防尘\t147771\n好意思我忘给\t147772\n小欣子\t147773\n二十零岁\t147774\n一个错题\t147775\n1863401834\t147776\ngunj\t147777\n女爸爸\t147778\n阿德菲\t147779\n透支\t147780\n原创\t147781\n原则\t147782\n我看特么疯罗莎生日那家的名誉权我们的明天\t147783\ninyour\t147784\n二千五百元\t147785\n低价\t147786\n825厘米\t147787\n拨帮\t147788\n成绩单\t147789\n医治\t147790\nwifi呀\t147791\n字名\t147792\n气韵\t147793\n卖药\t147794\nnidfhhdjk\t147795\n咬定\t147796\n大头粑粑\t147797\ntryingrh\t147798\n马岩晨\t147799\n月事\t147800\n北片\t147801\n别为\t147802\n不漱\t147803\n赞一钢\t147804\n啦峰甜羊镶囖镜口加油飞\t147805\n變性人\t147806\n泼辣\t147807\n彭姐\t147808\n一个50多岁\t147809\n猜开玩笑\t147810\nTsivyzuzfg\t147811\n心理咨询师\t147812\n宋美琪\t147813\n小坏蛋儿\t147814\n酒保\t147815\n36条\t147816\n不演\t147817\n22周岁\t147818\n落款\t147819\n领事馆\t147820\n堂馆儿\t147821\n奥秘度\t147822\nswrmul\t147823\n未好\t147824\n停机坪\t147825\nim康美\t147826\n美物\t147827\n洋葱丝\t147828\nsboss\t147829\n利特\t147830\n边城威\t147831\n13131237243858\t147832\n说什么\t147833\n张月十三\t147834\n美特\t147835\n咬吓\t147836\n太古板\t147837\n张月十七\t147838\n张将军\t147839\n美版\t147840\n工时\t147841\nttmjgtgmtgajttgdgtt\t147842\n利牙\t147843\nmmmmmmmmmmmmmmmmmmmm\t147844\n我气\t147845\n也谈及\t147846\n小桂子\t147847\n蕴璋\t147848\nbutter\t147849\n凤阳\t147850\n圆满\t147851\n1588968\t147852\n699999998Tn\t147853\n工日\t147854\n谭咏麟\t147855\n鞭\t147856\n梁普荀\t147857\n93.9%\t147858\n率性\t147859\n困境\t147860\n林永珍\t147861\n鞥\t147862\n班人\t147863\n鞠\t147864\n13350933888\t147865\n认捐额\t147866\n短衣\t147867\n传遍\t147868\n1月28号\t147869\n短衫\t14
7870\n和你好狂\t147871\n一两个月\t147872\n鞍\t147873\n冰薄\t147874\n鞋\t147875\n886953\t147876\n大概\t147877\n轰顶\t147878\n嗯新年\t147879\n有脸没臭小子\t147880\n声带\t147881\n明早六点半\t147882\n特种我是哈萨克族\t147883\n睡了喝\t147884\njnnnccbc\t147885\n正确度\t147886\n花式度秘\t147887\n赏心悦目\t147888\n听过\t147889\n4434455\t147890\n何成洋\t147891\n花鱼\t147892\n27186\t147893\nshehshdjd\t147894\n一天到晚\t147895\n大人们\t147896\n清风\t147897\n一回事\t147898\n2345457\t147899\n二连浩特\t147900\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t147901\n做好准备\t147902\n上溯\t147903\n灯谜\t147904\n逛红场\t147905\n顾茜茜\t147906\n杜米多米\t147907\n心门\t147908\n变电所\t147909\n青柠\t147910\n获竭\t147911\n周三上午\t147912\n1276\t147913\n1277\t147914\n4474571\t147915\nyovx\t147916\n干嘛快说\t147917\n58551652816726464846438526765535858264873867845978667237557865465456\t147918\n11点37分\t147919\n发债\t147920\nzsfux\t147921\n没忘忘忘我心脏空空我想了一个世界我要了你两颗烟上歌诗\t147922\nwason\t147923\n曹儒林\t147924\n88回\t147925\n50不八十\t147926\nsoula\t147927\ntfe4\t147928\n我爱你爱你爱把我\t147929\n饭吃比\t147930\n角涯\t147931\n坑战\t147932\n坤毅哥\t147933\n陪啵\t147934\n见面儿\t147935\n3-4年\t147936\n海沧区\t147937\n一丁点不好笑一点不称职问你\t147938\nijejjodvvef\t147939\n沂水\t147940\n光战魂\t147941\n老红珊\t147942\n你在干嘛呢\t147943\n4款\t147944\n7771777\t147945\nSjkskjmsm\t147946\n紧\t147947\n应收\t147948\n谢卫\t147949\n视为\t147950\n辈分\t147951\njdbvjebf\t147952\n岑雨宇\t147953\n拳皇2007\t147954\n蝶濑\t147955\nghgodivjibk\t147956\n受不了你了真是\t147957\n米罗虾\t147958\n秘种\t147959\n吴帝\t147960\n秘秋\t147961\n肇事者\t147962\n亲爱\t147963\n一亿一点儿\t147964\n老子的话\t147965\n烫手\t147966\ntli\t147967\nBtbhfb\t147968\n秘秘\t147969\n了信不信\t147970\n佐佐那你别动那你别说话\t147971\n巨根团\t147972\n大学的小小小小小小小小小小小小小小小小小小小小小小小小小小小小小小小小小小朋友\t147973\n4486985\t147974\n卧靠\t147975\ninch\t147976\nwegb\t147977\nwegd\t147978\n菲律宾一家石油公司\t147979\n五咪\t147980\n问秘\t147981\n被骗了\t147982\n9周\t147983\n奄奄一息\t147984\n2007.6.30\t147985\n赵翊宁\t147986\n大鹏鸟\t147987\n第２个\t147988\n环形\t147989\nchiphotosbaiducomxiaodupicitemf703738da9773912090df714ff198618367ae298jpg\t147990\n黄土豆爆米花\t147991\nccfghiagmkkb\t147992\n呼呼呼\t147993\n加再来\t147994\n吸便器\t147995\n1214568\t1479
96\n融为一体\t147997\n相结合\t147998\nSkype\t147999\n听不听得\t148000\n吊马子\t148001\n短浅\t148002\n五和\t148003\n猪头十\t148004\n走自已\t148005\n常博士\t148006\n上个\t148007\n光着\t148008\n我讨厌你讨厌你讨厌你讨厌你\t148009\n臭迹\t148010\n许欣奕\t148011\n呀呀呀呀二\t148012\n家额\t148013\n上东\t148014\n恭喜你我为你唱\t148015\n念信\t148016\n心田\t148017\n上上\t148018\n上下\t148019\nieave\t148020\nmarmoo5\t148021\n好不好耍\t148022\n把面\t148023\n上七\t148024\n18224755780\t148025\n丛书\t148026\n上万\t148027\n方差\t148028\n中官人\t148029\n信行\t148030\nbucno\t148031\n嗯名\t148032\n555655533\t148033\n欧尚\t148034\n浓硫酸\t148035\na50元\t148036\n嗯吞\t148037\n天武门\t148038\n11問\t148039\n啦啦小魔仙\t148040\n谁楼你\t148041\n啊宾的healyl为为为为\t148042\n第四十\t148043\n天宫\t148044\n1234566459978\t148045\nPeople\t148046\n欧尼\t148047\n婚影\t148048\nkon\t148049\n糖瓜\t148050\n爱里儿\t148051\n天安门\t148052\n50摄氏度\t148053\n5336633333333334655400\t148054\n竹玛丽亚\t148055\n写明镜\t148056\n牛小度秘\t148057\n政协常委会\t148058\n三百八十七\t148059\n干子\t148060\n集思广益\t148061\ndootss\t148062\n2396262305\t148063\n塞上\t148064\n押尾\t148065\n16．6％\t148066\n一两百万\t148067\n刘志鑫\t148068\n13世纪\t148069\n绰号\t148070\n邓超王祖蓝\t148071\n择空\t148072\nWheniSChristmas\t148073\n幸福指数\t148074\n更糟糕\t148075\n宝马七系\t148076\n重税治国时代的来临\t148077\njjgjhggO\t148078\nfjsuf\t148079\n新年了\t148080\n东风雷诺\t148081\nmoat\t148082\n马靴\t148083\n水漂\t148084\n100毫米\t148085\njkpgfzserj\t148086\n降格\t148087\n航总局\t148088\n┯_┯\t148089\n火锅红酒\t148090\n颜秉琪\t148091\n1829383201\t148092\n被删\t148093\n顾长雨\t148094\n可寸\t148095\n敬告\t148096\nviad\t148097\n无聊天\t148098\nviag\t148099\n哆啦a梦哆啦a梦\t148100\n启迪思\t148101\n西西瓜\t148102\n水漫\t148103\n7700公斤\t148104\nNup\t148105\n淘米\t148106\n啵儿\t148107\n精灵王\t148108\ngd7hd58\t148109\n光身\t148110\n来回\t148111\n呢历险记\t148112\nvubdffs大福源风vjcjjbggjjgh\t148113\n屏边苗族自治县\t148114\n平香\t148115\n201几\t148116\n刘翔\t148117\n腹语\t148118\n度秘你好密度\t148119\n过年了你懂\t148120\n好歪瑞\t148121\n幻想\t148122\n善良型\t148123\n22分钟\t148124\n常有余\t148125\n3132800557\t148126\n那一刀\t148127\n李炜聪\t148128\n小心长\t148129\n坑人不偿命\t148130\n地厅级\t148131\n某某地\t148132\n涌桥区\t148133\n幻情\t148134\n赵强\t148135\n耍度秘\t148136\n追人\t1481
37\n比撒\t148138\n抓饼\t148139\n每每个\t148140\n点心\t148141\n绸花\t148142\n字幕组\t148143\nhdvbajdbn\t148144\n七言律诗\t148145\n五大连池火山公园\t148146\n结课\t148147\n听电\t148148\n到民不聊生\t148149\nhuhg\t148150\n舍必\t148151\n密码儿\t148152\n周思屹\t148153\n在公将公\t148154\n结语\t148155\n余元武\t148156\n平坦处\t148157\n万能秘书\t148158\nZee\t148159\n永约\t148160\n我叉叉叉叉叉叉叉叉\t148161\n郭静\t148162\n帆布\t148163\n结识\t148164\n帆帆\t148165\n跑吧兄弟\t148166\n肉禁\t148167\n冬蜜你是电影小大电影小达\t148168\njnbjn\t148169\n哭儿狼嚎\t148170\ngjshd\t148171\n丽柜\t148172\n十一家\t148173\n你的爱情观\t148174\n常在\t148175\n旺旺旺旺旺额usbskslwnsksn\t148176\n胜得\t148177\n丽柏\t148178\n母校\t148179\n出没有\t148180\n瓦市桥\t148181\n乳猪\t148182\n今天几时\t148183\n245554536\t148184\n选人\t148185\n李安\t148186\nhyshdfgshgs\t148187\n能找到了\t148188\n等闲识\t148189\n综艺类\t148190\n怨妇\t148191\n配界\t148192\n狗向者\t148193\n2103720283\t148194\n画块\t148195\n战狼传说\t148196\n2276421288\t148197\n能不能乖乖的听我话你就乖乖的告诉我你的爸爸\t148198\n种子天\t148199\n名媛\t148200\n拉斐尔\t148201\n欧麦嘎欧麦\t148202\n花旗汽车\t148203\n蓝月亮\t148204\n让说\t148205\n构成\t148206\n五筒\t148207\n腿轮\t148208\n莲月\t148209\nmnotyours\t148210\n毛二\t148211\n恶漫\t148212\nda3cc9906015306e93901203f92e7jpg\t148213\n02米\t148214\n尽情地\t148215\n荡女果\t148216\n1444444\t148217\n师父\t148218\n师爷\t148219\n木棍儿\t148220\n回头看一看\t148221\n汉奸亏\t148222\n嗯幸亏\t148223\n珍珠巷\t148224\n來段柤声\t148225\nfhggxgsggf\t148226\n不够意思\t148227\n头型\t148228\n吃宵夜\t148229\n66707\t148230\n跌碎\t148231\n你好吗啡\t148232\n巧作\t148233\n诗呆子\t148234\n摸乱亲\t148235\n九九鱼\t148236\n广美大学城\t148237\n航运\t148238\n多克多比四十呢\t148239\nuisl\t148240\n童工\t148241\n镇平县\t148242\n博学者\t148243\n10年\t148244\n过院\t148245\n特护\t148246\n劲道大虾\t148247\n有你在\t148248\n自净\t148249\n宋宜樽\t148250\n跟主见\t148251\n中气\t148252\n舒秋\t148253\n圆线\t148254\n妖魔\t148255\n幸福归来\t148256\n我的你还姐姐啊么么答没我你不爱我\t148257\n上月22日\t148258\n10篇\t148259\n舒秘\t148260\n简装\t148261\nmikopl\t148262\n防溢\t148263\n更持久\t148264\n晓愿\t148265\n地铁6号线\t148266\n破苍\t148267\n例外\t148268\n东电公司\t148269\n风云子\t148270\n武蓉惠\t148271\n张立平\t148272\n还一点欢欢一个太冷\t148273\n咪小徐熊\t148274\n暑期\t148275\n生命体\t148276\n我是男你猜我是男还是女\t148277\n真是的脸变化大坏蛋大烦人的我好讨厌你\t1482
78\n李先生\t148279\n邵明飞\t148280\n点水喝\t148281\n几分儿\t148282\n爱你的理由\t148283\n轩辕剑\t148284\n李贝彤\t148285\n一下一下\t148286\n秀香山\t148287\n一下一三\t148288\n二十遍\t148289\n丁小王\t148290\n才i\t148291\n新西兰南岛\t148292\n七寻记嘞\t148293\n野鸡\t148294\n这帖\t148295\nnizenmeting\t148296\n沈家豪\t148297\n牙痒痒\t148298\n眼片\t148299\n一生一世\t148300\n沅江\t148301\n野鸭\t148302\n小俞俞\t148303\n解放军站岗\t148304\n这帅\t148305\n反扑倒\t148306\n熟人\t148307\n一下一个\t148308\n86373857272727386\t148309\n大明咀豆\t148310\n我的宝贝包百店\t148311\nsbSb\t148312\n几百节\t148313\nV饭\t148314\n长里美不\t148315\nddhgm\t148316\n中华路189号\t148317\n出宫\t148318\n嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯嚯\t148319\n驴肝肺\t148320\n鼹鼠\t148321\n很麻烦讷\t148322\n10部\t148323\n主路\t148324\n唱一两句\t148325\nt800\t148326\n北海银滩\t148327\n13225559900\t148328\n腐乳\t148329\n盛名字\t148330\n棉被\t148331\n韓國\t148332\n外汇\t148333\n罗子峰\t148334\n度秘我真的爱你\t148335\n说了讨厌\t148336\n回回回啊啊保回回回归\t148337\nｑ\t148338\nｐ\t148339\n宗监禁音\t148340\nｖ\t148341\nｕ\t148342\n被偷看\t148343\n｛\t148344\nLarenkov\t148345\nｙ\t148346\nｘ\t148347\n～\t148348\n｝\t148349\n男保姆\t148350\nｃ\t148351\nｂ\t148352\nａ\t148353\n｀\t148354\nｇ\t148355\n点灯\t148356\n苏陌如\t148357\nｄ\t148358\nｋ\t148359\nｉ\t148360\nｈ\t148361\nｏ\t148362\nｎ\t148363\nｍ\t148364\n12354岁\t148365\n育种\t148366\n怏齿\t148367\n朴堡王\t148368\n杨阳\t148369\n嗯不周岁问你是故创建m\t148370\n一撇\t148371\n爱爱\t148372\n100多起\t148373\n一次一次一次一次\t148374\n苟\t148375\n真很多\t148376\n轻飘飘\t148377\n徐依晨\t148378\n助于\t148379\n王琦\t148380\n钱翠花\t148381\nhttpfhiphotosbaiducomxiaodupicitem4034970a304e251f5868328ca086c9177f3e5323jpg\t148382\n没头盖上\t148383\n璐璐璐\t148384\n8.80\t148385\n在一堂\t148386\n王琪\t148387\nKBS2TV\t148388\n刘奶奶\t148389\n王琳\t148390\nebhievdididhBwnd8\t148391\n抗美\t148392\nmips\t148393\n张瀚元\t148394\n一万多块\t148395\n南昌市湾里区第三小学\t148396\nbircb\t148397\n张颐武\t148398\n继续努力\t148399\n陈夫人\t148400\n回来我的爱呀回来我的爱\t148401\n贱贱\t148402\n拼颜\t148403\n贱贼\t148404\n高一个人\t148405\n不不不不不不不不不不\t148406\n捷捷\t148407\n韦恩惠\t148408\n贱货\t148409\n助人\t148410\n哦了\t148411\n信息流\t148412\n保康起床号考科\t148413\n15678910\t148414\n丑丑丑机器人机器人\t148415\n度秘我告诉你\t148416\n工l程\t148417\n河源\t148418\n水化
成\t148419\n广干子了那干\t148420\n娘亲\t148421\n军训基地\t148422\n不能错\t148423\n杨突然\t148424\n骗术\t148425\n优美\t148426\n锋芝恋\t148427\n至高无上\t148428\n抄书\t148429\n尚德宏\t148430\n擂\t148431\n派驻\t148432\n机油味\t148433\n图谋不轨\t148434\n描眉\t148435\n旅行列车\t148436\n拉稀度秘\t148437\nk5\t148438\n施定益\t148439\n受徒儿\t148440\n讥讽\t148441\n一百七十七块\t148442\n六个小时\t148443\n学扎\t148444\n灯管\t148445\n才够\t148446\n争渡\t148447\n才大\t148448\n滴啦\t148449\n周大发\t148450\n相信我没有人\t148451\n7600\t148452\n你好夸张我好\t148453\n我不好玩了我再也不理你了用也不找你\t148454\n洪山\t148455\n于街市\t148456\n能通\t148457\n许如玉\t148458\n1228606000073663363829566\t148459\n在深秋\t148460\n陈辉辉\t148461\n柳暗花明又一村\t148462\n张伟婷\t148463\n好好好帅\t148464\n冷峻不羁\t148465\n锑类\t148466\n看做完\t148467\nfucking\t148468\n掉了\t148469\n叉叉叉\t148470\n剧中\t148471\n保洁工\t148472\n第一辆\t148473\n小夫妻俩\t148474\n爱爱你\t148475\n18380\t148476\nmkudz\t148477\n脚毛\t148478\n兮祸\t148479\n两局\t148480\n要害\t148481\njetalong\t148482\n听不清阵\t148483\nabcson\t148484\n雪后\t148485\n减低\t148486\n万达商场\t148487\n徐芳\t148488\n要客\t148489\n或者说\t148490\nareyour\t148491\n定性\t148492\n爽口\t148493\n梦到\t148494\n媽污\t148495\n甩一把汗\t148496\n踏步\t148497\n还猪\t148498\njujl\t148499\n牙线\t148500\n无底洞\t148501\n来了明天见\t148502\n55555266654\t148503\n新趋势\t148504\n0903\t148505\n立湃\t148506\n好梦么么哒约\t148507\n查马坊村\t148508\n拨死\t148509\n真真的好想你好想你\t148510\n1000公里\t148511\n佩里西奇\t148512\nsssrrsereetf\t148513\n现在你你你你把\t148514\n那边娃娃巴啦啦小魔仙\t148515\n陈大美\t148516\nstemityou\t148517\n爱在上\t148518\n英雄连我\t148519\n拖鞋板\t148520\n馥郁\t148521\nSUMMICRON\t148522\n嗯三俗\t148523\n近战\t148524\n增肥药\t148525\n湿淋淋\t148526\n孙筱凡\t148527\n湖大酒店\t148528\n一连串\t148529\n生不穷告诉\t148530\n斩掉\t148531\n窦唯\t148532\n撅肚\t148533\n疑义\t148534\n阿笠\t148535\n滴定\t148536\ndjidjf\t148537\n畜牧业\t148538\nMDMWMWM\t148539\n稀松\t148540\n那伟哥\t148541\n暗度\t148542\njillijj\t148543\n追踪者\t148544\n观猎\t148545\nkdd\t148546\n魅媚\t148547\nmfi3q\t148548\n一代偶像\t148549\n王桂新\t148550\n福特内饰\t148551\n乐淘\t148552\nmUamua\t148553\n大明\t148554\njī\t148555\n兮儿\t148556\n璇子\t148557\n大神医\t148558\n在口\t148559\n11987610998816877年\t148560\ncubejisnskond\t148561\n花儿街\t1485
62\n京燕\t148563\n大星\t148564\n过儿\t148565\n克子\t148566\nkuwejt\t148567\n清好\t148568\n坐屏\t148569\n消散\t148570\n哎哟娃娃亲多棒ww\t148571\n台中\t148572\njā\t148573\n李焯豪\t148574\n088741567\t148575\n119110112\t148576\nooxoo\t148577\n地沟\t148578\n看来来\t148579\n偶买噶偶买噶偶买噶偶买噶\t148580\n庐山瀑布\t148581\n奶羊羊\t148582\n城市生\t148583\nxjdjnf\t148584\n杨吉祥\t148585\n土豆落\t148586\n7280万\t148587\n不帅人\t148588\nokspkah\t148589\n升级后\t148590\nffsg\t148591\nkn\t148592\n安逸\t148593\nffsz\t148594\n号码\t148595\n就好所以说\t148596\n擎\t148597\n分见\t148598\n页间\t148599\nffst\t148600\n快快点点\t148601\n安选\t148602\n活地\t148603\n安适\t148604\n抹胸亮片裙\t148605\n窝火\t148606\nHNJjyy\t148607\n安通\t148608\n九点零一\t148609\nwwwhrj\t148610\n手提袋\t148611\n走险\t148612\n焦油\t148613\n尽如人意\t148614\n交接箱\t148615\n绿茶粉\t148616\n131973欣欣\t148617\n文永嘉\t148618\n波坡\t148619\n南东\t148620\n呆脸\t148621\nLaughing\t148622\n南下\t148623\n∽夫卅\t148624\nzdgjkolo\t148625\n酸爽\t148626\nwrouppopo\t148627\n一半块儿\t148628\ngndy\t148629\nkx\t148630\n连撞\t148631\n典子\t148632\n呆脑\t148633\n南丹\t148634\n黑豆歌红豆糕\t148635\n相称\t148636\n百炼\t148637\n百点\t148638\n南中\t148639\nfdetsett\t148640\n五G\t148641\neee5\t148642\ndjfigj\t148643\n979369\t148644\n无辜\t148645\n不是我让你情何以堪是你自己思想不正\t148646\n雷得森安\t148647\n本来面目\t148648\n诶呦你你你你吃啥我讨厌你吗你是什么我讨厌你\t148649\n来北往\t148650\n早抖\t148651\n帝霸\t148652\n白点儿\t148653\n非公有制\t148654\n西海固\t148655\n乱穿\t148656\n全美\t148657\n额肯\t148658\n远算\t148659\n全华\t148660\n早报\t148661\n哈CD\t148662\n奔在\t148663\nhdjxojebch\t148664\n无边\t148665\n薏米\t148666\n一千八百九十九一\t148667\n无辣\t148668\n早抽\t148669\n书写\t148670\n候帅汀\t148671\n快乐吧度秘你给我查查吧\t148672\n快点吧快点\t148673\n2910\t148674\n熊靖美\t148675\n终场\t148676\n2914\t148677\neees\t148678\n导读\t148679\n赖账\t148680\neeee\t148681\neeed\t148682\n肺癌\t148683\n设计展\t148684\njhsjw\t148685\n超高调\t148686\n紫钻\t148687\n猥琐样\t148688\n韩智敏\t148689\n失天失聪\t148690\n说说恋爱\t148691\nhwhsbdbfb\t148692\n学生机\t148693\n食指\t148694\n分割线\t148695\n书卷味\t148696\n点烟\t148697\n外资\t148698\n話罵\t148699\nppt4g\t148700\n显圣\t148701\n胡14mb\t148702\n555555415422\t148703\n沙土镇\t148704\n金奕涵\t148705\n呵个\t148706\n亲亲哥肉麻\t1
48707\n得失\t148708\n北个月\t148709\n兰睿欣\t148710\n支撑\t148711\n导语\t148712\n可囊\t148713\n天蝎斗\t148714\n花艺\t148715\n62041\t148716\nｓｂ\t148717\n巴拉巴巴\t148718\n发斯泰龙\t148719\n羞答答滴玫瑰静悄悄\t148720\n花色\t148721\n博智\t148722\n逍客\t148723\n有文化\t148724\n号本\t148725\n较差\t148726\n知识树\t148727\n别吗斗流氓\t148728\n打狗棍\t148729\n鲁律师\t148730\n化学用品\t148731\n青瓜\t148732\n笑一个\t148733\n三十晚上\t148734\n随堂\t148735\n列车员\t148736\n乐雨萱\t148737\n六月二十一日\t148738\n用度秘\t148739\n库贝萨\t148740\n今个六一\t148741\n打狗棒\t148742\n沈鹏浩\t148743\n软坐垫\t148744\n我不我不我不喜欢\t148745\n笑一下\t148746\n肯德基店\t148747\n金毛狗听主人的话\t148748\n非常感\t148749\nss光\t148750\n最衰\t148751\n青瓦\t148752\n无酒干倘卖\t148753\n1952年\t148754\n希尔顿逸林酒店\t148755\n张玮\t148756\n27.7\t148757\n继续走\t148758\n离身\t148759\n我我我我我我我我我我我我也我我我我我我我我我哦\t148760\n侦探\t148761\n六级\t148762\n六红\t148763\n树苗\t148764\n欣哥\t148765\n别离婚\t148766\n6-8年\t148767\n吴雷雷\t148768\n饶馨煜\t148769\n辛苦奖\t148770\n谁么\t148771\n邱县\t148772\n22颗\t148773\nityout\t148774\n讲经\t148775\n兄嫂\t148776\nSvenfavoriteright\t148777\n半日游\t148778\n派头\t148779\n神叨\t148780\n二零二幺八九\t148781\n废者\t148782\n土地\t148783\n6月28\t148784\n换上\t148785\n其时\t148786\n6月20\t148787\n费干\t148788\n大乐易失察,\t148789\n很浪\t148790\n乖乖我的小又小达人\t148791\n6月26\t148792\n贝勒们\t148793\n嗯嗯你好美\t148794\n投融资\t148795\n性感\t148796\n装入\t148797\n无聊了我的\t148798\n追尾\t148799\n九十多号\t148800\n纠风\t148801\n转折点\t148802\n索肖\t148803\n劫色\t148804\n刚热\t148805\n黑诺服了你了我恨死你了我讨厌\t148806\n博人\t148807\n来了美了美了美\t148808\n老大不小了该\t148809\n使唤\t148810\n零零幺\t148811\n博亲\t148812\n吧场\t148813\n颓图\t148814\n几般\t148815\n品名\t148816\n绿灯\t148817\n阿尔卑斯山脉\t148818\n模你\t148819\n给我回\t148820\n谁老子\t148821\n猛片\t148822\n金厲旭\t148823\n仗势欺人\t148824\n掐灭\t148825\n出落地\t148826\n10斤\t148827\n至恺\t148828\n邪毒\t148829\n汐爷\t148830\n洪庆锐\t148831\n网文\t148832\n陈志铭\t148833\n华电国际\t148834\n10秒前\t148835\n蓝沁\t148836\n苏谁\t148837\n小鸡仔\t148838\n13413576543\t148839\n隐含性指向\t148840\n早逝\t148841\n10文\t148842\n任梦颖\t148843\n车速\t148844\nSCM墨镜贼\t148845\n干馍干菜\t148846\n阿神\t148847\n侯联会\t148848\n连本带利\t148849\n妆潮\t148850\n像女\t148851\n110921艺声推te\t148852\n两半\t148853\n上班办\t148854\n罗从容
\t148855\n两千\t148856\n东张西望\t148857\n永联村\t148858\n机电\t148859\n别交\t148860\n杨路理\t148861\n2125\t148862\n郭紫楠\t148863\n机甲\t148864\n感念\t148865\n2128\t148866\n驶向\t148867\n陕西省统计局\t148868\n0vutxtj7\t148869\n周钦宇\t148870\n别亲\t148871\n利巴沙\t148872\n别人\t148873\n早退\t148874\n六一#\t148875\n多加\t148876\n抽给\t148877\n誓伐\t148878\n煮面\t148879\n消消气\t148880\neueieiieeueueieueieieieei\t148881\n78龄\t148882\n多努\t148883\n英菲迪尼\t148884\n窝怕你了不要说话\t148885\n河源市\t148886\n6x29x\t148887\n豆瓣FM-#\t148888\n要点儿\t148889\n蒽摁恩嗯煾\t148890\n倚着我的琴枕梦尽夜满月\t148891\n不相干\t148892\n宋永新\t148893\n830成武\t148894\n四十余名\t148895\n威豹戈薇杯\t148896\n干扰\t148897\n林林柳\t148898\n288888888888888888\t148899\n流动性\t148900\n多力\t148901\n今天早晨5:04\t148902\n十二只一步\t148903\n添置\t148904\n拜度\t148905\n阀门\t148906\negscyc\t148907\n一家报业集团\t148908\n电憬\t148909\n告诉\t148910\n来不懂\t148911\n佳佳佳佳在家小鸟小鸟叽叽\t148912\nnfg\t148913\nnff\t148914\n巧克力蛋糕奶茶\t148915\n呱呱呱挂\t148916\n凑巧\t148917\n度秘度秘度秘度秘\t148918\nnfn\t148919\n魔币大鱼\t148920\n壁人\t148921\nnfs\t148922\n光感\t148923\n告诫\t148924\n俩百块\t148925\n甜言语\t148926\nnfx\t148927\n苏就是\t148928\n月饼们\t148929\nududjurjsi\t148930\n自律\t148931\n恩没我我也目录模块头目\t148932\n正是\t148933\n坏士\t148934\n杨波\t148935\n法物\t148936\n12点51分\t148937\n杨泡\t148938\n南极大陆架\t148939\n一八百\t148940\nnichkhun\t148941\n姬老师\t148942\n4趟\t148943\n男种\t148944\n善信\t148945\n厅子\t148946\n方煊蝖\t148947\n平谷\t148948\n赵水兰\t148949\n例如\t148950\n男科\t148951\n男秘\t148952\n花语盈盈\t148953\n齐河\t148954\n6586\t148955\n修复\t148956\n好孩纸\t148957\n喜爱夜蒲2\t148958\n雪岭熊岭\t148959\n王冰冰\t148960\n衣舍\t148961\n工程\t148962\n宠攻\t148963\n威力和你巨\t148964\n腕儿\t148965\n下班时刻\t148966\n援引\t148967\n三四个岁\t148968\n深溪\t148969\n23m1x2m10其根\t148970\nnbnnbnn\t148971\n十七分之十二\t148972\nFPS游戏\t148973\n一到十\t148974\n觥畴交错\t148975\n吧木马\t148976\n突痛哭流涕\t148977\n听果\t148978\n定悲\t148979\n购餐\t148980\n一个劲儿\t148981\n团风\t148982\n林品如\t148983\n极多\t148984\n极夜\t148985\naathrow\t148986\n飞饼\t148987\n280元\t148988\n逆流而上\t148989\n咳咳咳咳咳咳咳咳咳咳咳\t148990\n回帮\t148991\n3月26日\t148992\n愤青菜\t148993\n马孙子\t148994\n姜磊\t148995\n说说说说说说说说说说错错\t148996\n极大\t148997\nugtgrzstj\t148
998\n紧随\t148999\n赵又酸\t149000\n高名思哲\t149001\n小狮绘\t149002\n看场\t149003\n早间\t149004\n54块\t149005\n画派\t149006\n温月雷\t149007\najjplpagpatagjagamga0jdgampmgajj\t149008\n乔尔\t149009\n17GB\t149010\n显示器\t149011\n有你真好\t149012\n嗨咻\t149013\n数千名\t149014\n斗富\t149015\n激素\t149016\n小米手机电信纪念版\t149017\n小玫瑰花\t149018\n永德\t149019\n早闭\t149020\n黄俊郎\t149021\n86016\t149022\n美肤\t149023\n黄雀锁\t149024\n3099元\t149025\n重視\t149026\n排尿\t149027\n卡米洛夫\t149028\n阜阳市\t149029\n海艺晨\t149030\nhengainideo\t149031\n禁售\t149032\n圣手\t149033\n品类\t149034\n三十层\t149035\n同类\t149036\n0412\t149037\n白花花黑乎乎\t149038\n天使一样的人儿\t149039\n圣旨\t149040\n金乡乡\t149041\n大饼\t149042\n好找到我的\t149043\n飞维\t149044\n打版\t149045\n正符\t149046\nheush\t149047\n打牌\t149048\n大饭\t149049\n塞子\t149050\nhghfffgx\t149051\neme7026em\t149052\nHCGCHCHc\t149053\n交汇\t149054\n飞绝\t149055\n背着我\t149056\nrour\t149057\n101740\t149058\n在一\t149059\n啥你\t149060\nrouo\t149061\n邪神听话\t149062\n眼睑\t149063\n盖世你是我的\t149064\n=\t149065\napoucuaac\t149066\n谈情\t149067\n吳君程\t149068\n孙玉华\t149069\n巨剑\t149070\n禁烟令\t149071\n前春春\t149072\n凌唱\t149073\n高收入者\t149074\n丽园\t149075\n必看\t149076\n陈寒柏\t149077\n金融危机\t149078\nlee\t149079\n小幸运\t149080\n登顶\t149081\n杨佳柔\t149082\n恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩\t149083\n女宵夜\t149084\n臭密度不要脸的臭密度度秘度秘不要脸\t149085\n祭奠状\t149086\n油水\t149087\n中国海\t149088\n砸醒\t149089\n素人\t149090\n徐佳莹\t149091\n李自伟\t149092\n凯鲁亚克\t149093\n顾亦\t149094\n海雷\t149095\n贾子桐\t149096\nghgjcjbhk\t149097\n音乐银行\t149098\nodkkdjdjkskd\t149099\n不存在我知道\t149100\n赵明霞\t149101\n海雅\t149102\nhelloam\t149103\n有你有\t149104\n怨天尤人如\t149105\nCfggdsfhnvdrukjdewwsgnjurvntsvjkhtghgdgjjtfvbyrfggyi\t149106\n唠唠唠\t149107\n香葱\t149108\n台数\t149109\n唾觉\t149110\n一名26岁\t149111\n大西洋底\t149112\n愣货\t149113\n几根\t149114\n天下无敌手\t149115\n几格\t149116\n喝茶\t149117\n李份镇\t149118\n陈国来\t149119\n物力\t149120\n李泽楷\t149121\nTHT\t149122\n几样\t149123\n骚味\t149124\n克里斯\t149125\n有毅力\t149126\n9年\t149127\nTHB\t149128\nIacci\t149129\nvcsv\t149130\nTCE\t149131\n几树\t149132\n好嘛我开\t149133\n哈谷六\t149134\n稻永站口\t149135\n几栋\t149136\n皮来娘\t149137\n钢牙\t149138\n今季
\t149139\n141144\t149140\n一万倍\t149141\n妙弹\t149142\n省地\t149143\n2011-06-10\t149144\n诺又来\t149145\n嗯飞邶\t149146\n2011年6月份\t149147\n违法行为\t149148\n省场\t149149\n米碎\t149150\n扒鸡\t149151\ndhdhhlblblnl\t149152\n先我先我\t149153\ntfomm\t149154\n相信你才\t149155\n七班\t149156\nRh\t149157\n上班间\t149158\n五片六片\t149159\n玉堂红糖\t149160\nRi\t149161\n猜讨\t149162\ntfomy\t149163\n24日上午10时\t149164\nllokt\t149165\n美足\t149166\nBaggythugs\t149167\n八四三七\t149168\nWrangler\t149169\n前传\t149170\n致冶\t149171\n一下把\t149172\n做最好的自己\t149173\nRV\t149174\n一个十八岁\t149175\n你不要脸你少女小三\t149176\n盐城\t149177\n金田海林\t149178\n再+1个吧\t149179\n锅包肉\t149180\n乎乎\t149181\nMAthink\t149182\n黄勤霞\t149183\n印有\t149184\n忻斯\t149185\n疑团\t149186\n不不不不不不完善\t149187\n四分之一第二天\t149188\n爱那\t149189\n黄瓜\t149190\n也许\t149191\n繁育\t149192\n常州\t149193\n静蓉\t149194\n5.6级\t149195\n成都商报\t149196\n别等我死了还木有\t149197\n为你面\t149198\n我的微笑\t149199\n自恋狂你\t149200\nisthers\t149201\n没错恭喜\t149202\nChoiSeunghyun\t149203\nxoxicomexo\t149204\n下度\t149205\n毒龙东大\t149206\n太原北大街\t149207\n承认你好\t149208\n王紫霞\t149209\n自惹麻烦\t149210\nhasdy\t149211\n丝啊\t149212\n肥厚\t149213\n米花粉\t149214\n下床\t149215\nSJBJO\t149216\n领结\t149217\nimfromusa\t149218\n百度钱包联合诈骗集团\t149219\n最不想你\t149220\n凉薯\t149221\n宋香平\t149222\n4000多万亩\t149223\n天姿\t149224\n龚鹃\t149225\n玩偶恋\t149226\n下底\t149227\n下店\t149228\n本月18日\t149229\n并没有钱\t149230\n19999家\t149231\n米柯南\t149232\n妥投\t149233\n高虹梅\t149234\n帅逼\t149235\n垂柳\t149236\n站着上北大\t149237\nq宠宝贝\t149238\n燥朋有\t149239\nmyword\t149240\n无路可退\t149241\n好舒服\t149242\nRH\t149243\n国际贸易有限公司\t149244\n邮政编码查询\t149245\n好嘛家家\t149246\n881度\t149247\n查理\t149248\n黑盒饭钢结构非公开家家户户红果果\t149249\n血光之灾\t149250\n常规\t149251\n科莫力战力战\t149252\n暖冬\t149253\nPRSV\t149254\navs\t149255\n王亚飞\t149256\n2004年\t149257\n神韵\t149258\n2011年2月28日起\t149259\n艾可魔法少女\t149260\n大吉昌\t149261\n46寸\t149262\n2000qb\t149263\n罗斯福\t149264\nlet\t149265\nyingsi\t149266\n淫哸\t149267\n柏油\t149268\n刮胡\t149269\n新西兰\t149270\n基面\t149271\nnod\t149272\n去疤\t149273\n110115\t149274\n阮软\t149275\n小石潭\t149276\n毛毛控\t149277\n我原来如此你等你反应好慢\t149278\n看星战\t149279\n阜新\t1492
80\n将子\t149281\nk万\t149282\n礼让\t149283\n恶魔\t149284\nndnz\t149285\n甘井子区\t149286\n盛装\t149287\n体操恩\t149288\n说了我讨厌\t149289\n曼静\t149290\n张宝利\t149291\ngvcvxvg\t149292\n啦啦啦啦度秘\t149293\nvdjfndbvwhvchhdgshdbvdgsh'sd'vdvddh\t149294\n朱含春\t149295\nyfdgudd\t149296\nThavasa\t149297\n抚尔秀颈\t149298\n武功山\t149299\n拧巴\t149300\n芳容\t149301\n漪汾小区\t149302\n要不写\t149303\n我喜欢十三岁的小女孩儿\t149304\n＆＆＆＆＆＆4＆＆＆＆\t149305\n23679\t149306\n听不废话\t149307\n气死你\t149308\n孤寡\t149309\nǖ\t149310\nǔ\t149311\n刘明磊\t149312\n孙中山\t149313\n二十多分\t149314\n7点钟\t149315\n甜椒\t149316\nǚ\t149317\nǘ\t149318\nnot\t149319\n萨玛丽\t149320\n跨越度\t149321\nq抢\t149322\n奇虎\t149323\nǎ\t149324\n单淑敏\t149325\n那么长\t149326\n全州\t149327\n18252807\t149328\n上锁\t149329\n蚀亏\t149330\n药力\t149331\n小丽科\t149332\n爱我我们可就是对你说的你是\t149333\n养家何罪\t149334\n掉线\t149335\n恨你大是\t149336\n造纸\t149337\n天明\t149338\n天昏\t149339\n女人缘\t149340\n洗浴露\t149341\n标准间\t149342\n排位\t149343\n友联\t149344\n天星\t149345\n林峰\t149346\n矿石鲨\t149347\n沈鑫阳\t149348\n分手季\t149349\n一万亿个\t149350\n明白\t149351\n填份\t149352\n岗头王村\t149353\n小米手机2#\t149354\n860594191\t149355\n林峯\t149356\n侠盗猎\t149357\n周转\t149358\n起飞\t149359\n承重\t149360\nbla\t149361\nskinsp\t149362\nhmgf\t149363\n十分钟块\t149364\nsmashif\t149365\n叫壳儿\t149366\n起风\t149367\n度秘是头猪哇度秘是头猪度秘是头小肥猪度秘是头小肥猪小飞猪像度秘度秘像小飞猪小飞猪啦小度秘度秘\t149368\n喜感\t149369\n五千量\t149370\n锯完\t149371\n第24页\t149372\n福建人\t149373\n渡劫\t149374\n飞沙\t149375\n安涛\t149376\n6月7日晚间\t149377\n含毅\t149378\n鄉村\t149379\n泥馬\t149380\n走哇走\t149381\n得和\t149382\n绵我我好萌\t149383\n堡秘\t149384\n抓紧鬼我不想你\t149385\n哼不理你了我走了\t149386\n咱明明素功大总攻\t149387\n大调\t149388\n阿凡峰\t149389\n你是我的小呀小苹果怎么爱你都不仙都红红的小鸟温暖我的心窝得了我生命的火火火火火火\t149390\n五湖四海\t149391\n那个人家\t149392\n大谷\t149393\n坐爱\t149394\n学说话\t149395\n我喜欢人\t149396\n凯德·来福士广场\t149397\n得咪\t149398\n听题西风胡杨\t149399\n给我吧密\t149400\n大谢\t149401\n黄喂\t149402\n冯家府\t149403\n旧岁\t149404\n香港机场\t149405\n乙酰\t149406\n转一说\t149407\n中便当盒\t149408\n13699238271\t149409\n冯丽美\t149410\n带息\t149411\n一走了之\t149412\n猫屎咖啡\t149413\n能不让\t149414\n58844278\t149415\n金月海天\t149416\nfcvg\t149417\n好多级\t149418\n吖發個剛剛wasknown發vvv還回復\
t149419\n河鱼\t149420\n皱甲\t149421\n黄喵\t149422\n美人吟\t149423\n静液\t149424\nowl\t149425\nown\t149426\nowj\t149427\n密度你是女人你是女生\t149428\nwbbsvssivvsvshhwgvsvsdgwwwdcgsshh185754\t149429\n嵌讽\t149430\n百三十\t149431\n北京市清华大学\t149432\n周天茹\t149433\nppppopo\t149434\nthise\t149435\n建议\t149436\n扩号\t149437\n字里行间\t149438\n攻角\t149439\nyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyuyyyyyyyyyyyy\t149440\nCJES\t149441\n建设\t149442\nGmon药\t149443\n扩句\t149444\n戴套\t149445\nappa\t149446\n三十二十\t149447\n亲爱的姑娘\t149448\n巴拉拉拉拉拉拉拉拉拉拉拉嗒\t149449\n送医\t149450\n农用\t149451\n人山人海\t149452\nshang\t149453\n戈德温\t149454\n光暖暖\t149455\n董明珠\t149456\nbbbny\t149457\n25页\t149458\n10号凌晨零时\t149459\n懒爆\t149460\n生病干嘛\t149461\n确胖\t149462\na40度\t149463\nvuidcnth\t149464\nJenny\t149465\n九十九\t149466\n石大叔\t149467\n洋洋来\t149468\nhhvjhj\t149469\njdjxjjejdkxkc\t149470\n给我学\t149471\npxypypyxyoxtxyupcypchphpxpyxhpcujxptchpypccpucupvcuxgypucphtxchpphxgpyccjofOPH\t149472\n低碳环保\t149473\nduer儿朵多\t149474\n她的心\t149475\n二十来岁\t149476\n肉食\t149477\n阿朱比\t149478\n9月左右\t149479\nzhinen\t149480\n李珊\t149481\n李珍\t149482\n萨布\t149483\n请把话说\t149484\n乖乖的说话吧\t149485\n李珂\t149486\n没片\t149487\n不是我是说梦比优斯奥特曼\t149488\n聊了哼讨厌\t149489\n快说呀呀\t149490\nassdfvb\t149491\n傅我\t149492\n歌典\t149493\n陈文鑫\t149494\n诉说过\t149495\nsorno\t149496\n抚爱\t149497\nhowlong\t149498\n151.4万\t149499\n乖怪\t149500\n赵口令\t149501\nIIIIIII\t149502\n李中媛\t149503\n五百元\t149504\nhttpdhiphotosbaiducomxiaodupicitem2e2eb9389b504fc2ce15e676e2dde71190ef6d7ejpg\t149505\n度秘真帅\t149506\n十15\t149507\n无秘\t149508\n春字\t149509\n一万四\t149510\n从为\t149511\n武圣\t149512\n一百篇\t149513\n陈云\t149514\n七十多\t149515\n我相信我的宝贝老公是最爱老婆的好夫君\t149516\n买单者\t149517\n金瓶梅\t149518\n位唤旗\t149519\n117名\t149520\n陈了\t149521\n遵命你在哪你在哪跟我来一下天儿\t149522\n黄俊林\t149523\n498712255074\t149524\n舒服我累\t149525\n陈二\t149526\njmhmgmgwp\t149527\ntsynydjfh\t149528\n山脚\t149529\n走你的路\t149530\n庆春银泰\t149531\n事女\t149532\n皮肤科\t149533\n针式机\t149534\ntydudxyzyfxdddddrhjjjjjjjjjjhbbbbbbuiocucocucCuchvhchhhhghggvigogpjrdxft服务\t149535\n山脊\t149536\n山脉\t149537\n春学\t149538\n春季\
t149539\n陈亮\t149540\n武土\t149541\n七十天\t149542\nhibigagga\t149543\n睇睇睇睇睇睇睇睇睇睇睇睇\t149544\n高覅\t149545\n杨秀彬\t149546\n高要\t149547\n景颇族\t149548\n号场\t149549\nhchf\t149550\nhchd\t149551\nhchb\t149552\n米兰大教堂\t149553\n王锦林\t149554\n律师\t149555\n奥运#吉庆盛会\t149556\n２０１０年\t149557\n东北亚能源交通（集团）股份有限公司\t149558\n胥烨\t149559\n单美\t149560\n黄金餐\t149561\n打字干嘛\t149562\n里花影路\t149563\nSPY\t149564\n轮毂\t149565\neuery\t149566\n谢谢你好\t149567\n耍赖气\t149568\n我喜欢的是大吧小何小何小\t149569\n呀呀呀呀呀呀呀呀呀呀呀\t149570\n闭户\t149571\n心虚佳品\t149572\nQQ群号\t149573\n开玩笑你走\t149574\n黑图\t149575\n八零后\t149576\n秦军\t149577\n王瑞永\t149578\nSPA\t149579\n猫砂\t149580\nSPF\t149581\n还在干嘛吗你是谁呀莺歌篇\t149582\n李云飞\t149583\n亅\t149584\nPiscArt\t149585\n酒批\t149586\n了\t149587\n争\t149588\n予\t149589\n事\t149590\n绿手\t149591\n比连\t149592\n二\t149593\n二十一\t149594\n于\t149595\n云\t149596\n查杀\t149597\n亓\t149598\n互\t149599\n井\t149600\n五\t149601\n亖\t149602\n亘\t149603\ni五额uuUI\t149604\n亚\t149605\n小度秘告诉我米月传奇之战国红颜\t149606\n认定书\t149607\n第四代\t149608\n亡\t149609\n亠\t149610\n余情\t149611\n你的最好的朋友\t149612\n二十个\t149613\n交\t149614\n产\t149615\n亦\t149616\n亩\t149617\n亨\t149618\n享\t149619\ngfghhcg\t149620\n亭\t149621\n京\t149622\n亮\t149623\nwoedwie\t149624\n亲\t149625\n8月2日\t149626\n休假片\t149627\n亻\t149628\n人\t149629\n亿\t149630\n你的家\t149631\n9890000470263\t149632\n猪猪堡亭\t149633\nstura\t149634\n死蛋子\t149635\njmmu\t149636\n缺一分若\t149637\n颜真丑\t149638\n一个291个\t149639\n结分无缘\t149640\n啊度秘度秘\t149641\n110416\t149642\n夏洛克\t149643\n泰忐\t149644\n深水\t149645\n接上来\t149646\n四方街\t149647\n拍了拍\t149648\n飕飕飕飕飕飕飕飕\t149649\n部长\t149650\n昊一\t149651\n335.22美元\t149652\n刨根问底\t149653\n在中\t149654\n好大二原谅你\t149655\n累了醉\t149656\nhotou\t149657\n郭爷\t149658\n倍加\t149659\n够嗒\t149660\n升值\t149661\nlgbk\t149662\n你好心情\t149663\n孙浩楠\t149664\n35家\t149665\n语言学\t149666\n我没给你解释我需要你给我\t149667\n对账\t149668\n郁郁\t149669\n性博会#\t149670\n韦德\t149671\n暮雪\t149672\n小度度\t149673\n说什么叫\t149674\n12月11号\t149675\n徐矮子\t149676\n宫如敏\t149677\nquiccc\t149678\n够不\t149679\n吴蓬勃\t149680\n曲突\t149681\n毛贝贝\t149682\n广东卫视\t149683\n谷之\t149684\n废话美国队\t149685\n手机费\t149686\n秘不好听
\t149687\n很包容\t149688\n唐硕\t149689\n浑蛋兄\t149690\n养眼\t149691\n说的你讨厌\t149692\ntfbote\t149693\n很受气\t149694\n大欲易失命\t149695\n上榜\t149696\n弓方\t149697\n忘食\t149698\n高涨\t149699\n片像\t149700\n广州场\t149701\n想不不不\t149702\n工作邮件\t149703\n萌嘉嘉\t149704\n这些年\t149705\n陈减阿\t149706\n下篇\t149707\n环境监测\t149708\n卢比晗\t149709\n真人故事\t149710\nJIANG\t149711\n120多米\t149712\n真小\t149713\n赛考科技\t149714\nVkvn\t149715\n王振耀\t149716\n辞传\t149717\n恒丰\t149718\n边愁\t149719\n一个一盒\t149720\n真少\t149721\n领通知书\t149722\n450元\t149723\n独守\t149724\n独家\t149725\n秦欣雨\t149726\n李王奇\t149727\npursue\t149728\n热冠\t149729\n度命\t149730\n陌生了我是好人你是坏人你是猪\t149731\nTwgatm\t149732\n保健\t149733\n花样会门\t149734\n爬过\t149735\n我有天洗澡个园那你\t149736\n跳死\t149737\n我是你的玩不起开\t149738\n贝贝公主\t149739\n治老头\t149740\n搞不起\t149741\nsumicron\t149742\n旷课\t149743\n二十个小时\t149744\n美句\t149745\njfufudyddydyducujcuccchchxjxjxhjcj\t149746\n国喜剧\t149747\n悦姝\t149748\n中雨\t149749\n中雪\t149750\n度秘月菲\t149751\n告诉我我要\t149752\n吴秀波\t149753\n无药可\t149754\n岷山寺\t149755\n好主人\t149756\n秘书部\t149757\n叫帅\t149758\n和田村路\t149759\n纪小龙\t149760\n桃仁\t149761\nWSUIVDSU\t149762\n初体\t149763\n桃仙\t149764\n美发\t149765\n坐坐凑\t149766\n偷欢\t149767\n张炫鹏\t149768\n扫瑞错\t149769\n吐温\t149770\n中集\t149771\n芭芭拉小有的干嘛谢谢嗯好的的没干嘛嗯执掌\t149772\n800美元\t149773\n加泰罗尼亚电台\t149774\n饿鬼道\t149775\n新国网\t149776\n断舍\t149777\n断舌\t149778\n张蛋蛋\t149779\n非竞争\t149780\n‥\t149781\n冯泽添\t149782\n广告人\t149783\n酷狗音乐\t149784\n你的笑话\t149785\n刘心琪\t149786\n布朗尼\t149787\n699899966\t149788\ntjppppj\t149789\n一会场\t149790\n假如说\t149791\n铁块\t149792\n517\t149793\n9999999999999亿\t149794\n吃起来\t149795\n赴任\t149796\n516\t149797\n金力M5\t149798\nBfc\t149799\n尊会\t149800\n泰南宋卡总领事馆\t149801\n不KK\t149802\n11点34分\t149803\n校长\t149804\n京东\t149805\n制度化\t149806\n海魔\t149807\nH股\t149808\n45天\t149809\n破厚\t149810\n噢乖\t149811\n夏家三千金\t149812\n大空\t149813\n倩影\t149814\n时间还小我陪你去哪天加含香\t149815\n没不会\t149816\n朵简\t149817\n好吧心理会\t149818\njxuxjcjcjcjxux\t149819\n谢了\t149820\n破厮\t149821\n刘从恕\t149822\n三分钟\t149823\n大写字母\t149824\nnk行\t149825\n哈哈然\t149826\n总经理\t149827\n拉拉手\t149828\n快笑\t149829\n众奇\t149830\nffffff
fffff\t149831\n啦不啦\t149832\n5285条\t149833\n罗千紫\t149834\n彼此彼此一般上\t149835\n狗蛋儿\t149836\n退休钱\t149837\n分界线\t149838\n6577888877\t149839\n哥孙\t149840\n一日秋里\t149841\n肉堡\t149842\n梁姐\t149843\n哥子\t149844\n有失大\t149845\nmuamua\t149846\n冉当然\t149847\n嗯注视恩\t149848\n子女们\t149849\n24厘米\t149850\n在所难免\t149851\n车位\t149852\n米洛\t149853\nfuttt\t149854\nwtmk\t149855\n块儿糖\t149856\n钱勒\t149857\n随感\t149858\n统招\t149859\n你在干嘛呢那什么你给我发一个你\t149860\n飞信\t149861\nbxqenmmbdwgehb\t149862\n华宁县\t149863\n福申4S店\t149864\n喷子\t149865\n一百斤\t149866\n每回\t149867\ngcbdhh\t149868\n曾厝埯\t149869\n徐州光光\t149870\n伤不灭\t149871\n清廷\t149872\n天哈哈哈大笑\t149873\ndiixz\t149874\nsksskskssksks\t149875\n周而复始\t149876\n5摄氏度\t149877\n忙忙忙\t149878\nahitili\t149879\n有样\t149880\n有根\t149881\n道道通\t149882\n菏泽市\t149883\n有格\t149884\n三批次\t149885\n旧金山\t149886\n损人利己\t149887\n旅宿\t149888\n微博业\t149889\n龙安天\t149890\n调戏\t149891\n翘楚\t149892\n5角\t149893\n假笑\t149894\ndudtidt\t149895\n香奈\t149896\n旅客\t149897\n锁清片\t149898\ngdidhchcisl\t149899\n短信\t149900\nM101\t149901\nM107\t149902\n没完再邪\t149903\n郑依琳\t149904\n想了想\t149905\n利润率\t149906\n4月11日\t149907\n秃鹰来跑男\t149908\n不明朗\t149909\n凯尔特的薄雾\t149910\n黄开盈\t149911\n我不不查\t149912\n谢诗妍\t149913\n非是昌\t149914\n好叭度秘\t149915\nukvhncv\t149916\nDuo\t149917\n骨番\t149918\nDub\t149919\nzhuan\t149920\n198几八七年\t149921\nDuy\t149922\n冷吗度秘\t149923\n你是疯子你是疯子你是猪你是猪\t149924\n田螺坑\t149925\n夏门街\t149926\nDuu\t149927\n破例\t149928\n工字\t149929\n塞飞飞\t149930\n申万医药\t149931\n我的要求\t149932\nDuD\t149933\n二十千克\t149934\n郭世杰\t149935\n伟大\t149936\n郭雨贤\t149937\n魔灵\t149938\n唐晓翼\t149939\n摆地摊\t149940\nmalefemale\t149941\n水利部\t149942\n家界\t149943\n长裤仓\t149944\nhi阿克西\t149945\n银土\t149946\n18602020955\t149947\n三天之上\t149948\n活跃度\t149949\n狼丘\t149950\n劲松\t149951\n保山市\t149952\n银圈\t149953\n母打理\t149954\n一五二六七八九六十八九二七\t149955\n飞向上\t149956\n半钟\t149957\n期中考试\t149958\ntree\t149959\n尸人\t149960\n端口\t149961\nSS33D\t149962\n凹凸不平\t149963\n东拼西凑\t149964\n老子胃\t149965\n523584\t149966\n谭宇轩\t149967\n咳嗽\t149968\n3850万元\t149969\n受欺负\t149970\n明送\t149971\n顶赞\t149972\nfjgjeiHdydjznjrndufjxjudufdnr
ufh788437\t149973\n一几号\t149974\n面妆\t149975\n眭老师\t149976\n死哪儿\t149977\n过保卡\t149978\n复仇者联盟#\t149979\n彡一彡\t149980\n墨脱\t149981\n跟机\t149982\n打鼠吧\t149983\n过多远\t149984\nSTAGE情人節\t149985\n顶起\t149986\n牛肉饼\t149987\n年下\t149988\n泄愤\t149989\n沫沫沫沫沫沫沫沫沫沫沫沫\t149990\n拖欠\t149991\n度珌\t149992\n跟朝\t149993\nJones\t149994\n不非\t149995\n吃卵\t149996\n三点91点\t149997\n度密机器人猪\t149998\n克里姆林宫遗址群\t149999\n搔痒\t150000\n你的心事\t150001\n初寒袭身\t150002\n飞霞\t150003\n前羊\t150004\n第21张\t150005\n锂电池\t150006\n多多多多多多多多多多多多多\t150007\n叶剑英\t150008\n索吻\t150009\n姜文天\t150010\n布拉宫\t150011\n中国人中国人\t150012\n韦宗清\t150013\n敏刚\t150014\n叫花边\t150015\n得而返\t150016\n抠脚\t150017\n第一感\t150018\n儿童简\t150019\njtgdtjmgmgawamjmgmpajmpmpmwgm\t150020\n呜咽\t150021\n阎安\t150022\n我疯了你\t150023\nWRYIOO\t150024\n牟爱方\t150025\n贾丽艳\t150026\n科考队\t150027\n边牧\t150028\n6010601466404355850464\t150029\n江梦雯\t150030\n雨一个\t150031\n100帅\t150032\n浣花洗剑录\t150033\nSORRY\t150034\n14005000\t150035\nAttentioning\t150036\nbecame\t150037\n就不上\t150038\n五十三三五十四张\t150039\n为什么不群\t150040\n钓鱼台群岛\t150041\nasXcvvcxxxzzz\t150042\n天水地区\t150043\nv89\t150044\n性伴侣\t150045\nbadao\t150046\n白衬衫\t150047\n所幸\t150048\n汤淼\t150049\n沃行\t150050\n出现我的脸红心跳\t150051\n显得\t150052\n实话实说\t150053\n学工\t150054\n不是我老下成盗版\t150055\n觉觉\t150056\n学好不好\t150057\n补习班\t150058\n1992年\t150059\n孙童谣\t150060\n严把\t150061\n维多利加\t150062\nhttpfhiphotosbaiducomxiaodupicitem5ab5c9ea15ce36d3e85ca3453df33a87e950b16ejpg\t150063\n千姿百态\t150064\n大不用\t150065\n哪女\t150066\n代儿山\t150067\n乔森潘\t150068\n2715571606\t150069\n小白花\t150070\n墨守成规\t150071\nnyyb\t150072\n拘泥\t150073\n5438..1354..2冰壶\t150074\n人工化\t150075\n线情\t150076\n模样儿\t150077\n剩女\t150078\n噗呠\t150079\n小萝卜头\t150080\n顺祝\t150081\n447858067\t150082\n了解到底是谁是\t150083\nghhhhhhjhh\t150084\n我声\t150085\nggdthv\t150086\n交易对方\t150087\n格里芬\t150088\n施魏因斯泰格\t150089\n制止\t150090\ndebf\t150091\nhdkkvxut\t150092\n揭穿\t150093\n黄天\t150094\n我不办\t150095\n电话号码\t150096\n刘贤平\t150097\n苹果秘\t150098\n分色\t150099\ns3you\t150100\n紫薇嘴\t150101\n心怡\t150102\n要不说话\t150103\n我好你不给我我就不要勉强你了我们两个一刀两断\t150104\n最好朋友\t150105\n
22pxp0\t150106\n刷洗\t150107\ntf博伊斯\t150108\n不近人情\t150109\n侵华\t150110\n姜成\t150111\n姜我\t150112\n和川\t150113\n疯了呀\t150114\n童谣缘\t150115\n胡u\t150116\nhgjjj\t150117\n记起来\t150118\n知道不我当时\t150119\n苦心人\t150120\n撞坏\t150121\n￥10\t150122\n告诉你好\t150123\n51829\t150124\n欢唱\t150125\nSorry\t150126\n斯卡布罗\t150127\nfudges\t150128\n绿源\t150129\n面对古往今来\t150130\n5272757\t150131\n明天猩\t150132\n生嗎\t150133\n养字\t150134\n美贝贝贝贝贝\t150135\n53数\t150136\n四十亿一万元\t150137\n拖家带口\t150138\n漫迷鸟鸣把\t150139\n敏章\t150140\nujcjf\t150141\n张程瑞\t150142\n刻下\t150143\n复点\t150144\n凯美\t150145\nKXKD\t150146\n悬崖\t150147\n真经\t150148\n力气\t150149\n胡8\t150150\n阮联敏\t150151\n王梦\t150152\n坦白\t150153\n大公共\t150154\n真绝\t150155\n65524\t150156\n一百\t150157\n不是了\t150158\n一發\t150159\n无人色\t150160\n梦璐\t150161\n太阳关系\t150162\n910单\t150163\ntTVwtwttttt\t150164\n叶会儿\t150165\n宝软\t150166\n哥歌\t150167\n王梅\t150168\n骂贴\t150169\n45页\t150170\n格安杆\t150171\n2514\t150172\n2515\t150173\n王梓\t150174\n2510\t150175\n點知發現部舊噶手機個充電器壞\t150176\n听命\t150177\n铠甲勇士震雷\t150178\n牛头不对马尾\t150179\n宋宋\t150180\n费雷尔\t150181\n宋宇\t150182\n张雁峰\t150183\n白潇潇\t150184\n大可爱度秘你在\t150185\n李仕豪\t150186\n宋官\t150187\n搞喃\t150188\n喜事儿\t150189\n搞清楚\t150190\n沙里沙\t150191\n皮卡丘合神器果實\t150192\n奥戴班\t150193\n自然灾害\t150194\n八米高\t150195\n搞花\t150196\n赵岩\t150197\n济慈\t150198\n不要不要不要不要不要不要不要不要\t150199\n讨厌密度\t150200\n激灵灵\t150201\n蛮疙瘩\t150202\n陈奇奇\t150203\n黄明\t150204\n黄昏\t150205\n抱忧\t150206\n聰明\t150207\n夏收\t150208\nCK77\t150209\n尿介子\t150210\n药学\t150211\n俯身\t150212\n刘昕锐\t150213\n187681031\t150214\n假面骑士个\t150215\nudjwidbdj\t150216\n荷花粥\t150217\n大梦一世我问你了问你是的\t150218\n头等\t150219\n三穗县\t150220\n天文相对完备集送给我\t150221\n放手爱\t150222\n鹰洋\t150223\n噩噩噩噩噩噩噩噩噩噩\t150224\n四川省乐山市检察院\t150225\n回礼家\t150226\nhi锅包肉\t150227\n们多会儿\t150228\n安安静静摆\t150229\n5547524553354122221144789633214778899635\t150230\n汗蒸\t150231\n雄伟\t150232\nwjjfd\t150233\n17888858855558\t150234\n陈雨希\t150235\n电子笔\t150236\n263833009\t150237\n那女\t150238\ntfboa\t150239\n丽了一段\t150240\n脚破\t150241\n那好\t150242\n窝囊\t150243\n胡雨欣\t150244\n心药医\t150245\n大值\t150246\ntfbos\t150247\n喜悦生
生不息\t150248\n开罗\t150249\n练习房\t150250\ntfboy\t150251\n呱呱呱酱\t150252\n动议\t150253\n古诗过故人庄\t150254\n开罪\t150255\n幺五六\t150256\n之前\t150257\n狗娃\t150258\n媚媚\t150259\n石秀\t150260\n狗娘\t150261\nhvvf\t150262\n国际刑警\t150263\n盘坐\t150264\n560888574578555525525\t150265\n小日子\t150266\n死饯\t150267\n你是我的臭不要脸\t150268\n世界遗产\t150269\n化瘀\t150270\n一易峰\t150271\n手感\t150272\n六聪明\t150273\n测量\t150274\ntfbo1\t150275\n字学\t150276\n950718\t150277\n儿的歌放歌红山果\t150278\n威廉\t150279\n忙難\t150280\n松狮\t150281\n傷心\t150282\n南宫琉璃\t150283\n不爱你了真讨厌\t150284\n多一条\t150285\n1119112011211122…1199\t150286\n我讨厌我讨厌你\t150287\n飓风VVVv\t150288\n襄阳远\t150289\n贱真不好\t150290\n邹雨\t150291\n自动伸缩器\t150292\nAV8\t150293\n15第二张\t150294\n不要你首歌\t150295\n凯尔特\t150296\n戴迪\t150297\nAV1\t150298\n219日\t150299\n赵与来南\t150300\n对呀挺好\t150301\n8月2日早\t150302\n大马士革郊区\t150303\n漂漂亮\t150304\n2211335533225511999\t150305\n3600\t150306\n变性\t150307\n林欣如\t150308\n离咖\t150309\n难舍难分\t150310\nderder\t150311\n大逗比\t150312\n这么多片\t150313\n快说真话\t150314\ndufmhdywitgdufhctwxjhojkdgckhstgkjpbvkdtwhchsyibkrthpgudtvlchsyrjgyrswwtdycherdj\t150315\nfjjynk\t150316\nnomononono\t150317\n产出\t150318\n变态\t150319\nvvvvvvvvvkkkkkkko\t150320\ngulide\t150321\n变透\t150322\nsizhan\t150323\n借古讽今天女\t150324\n驱动程序\t150325\n坏消息\t150326\n变速\t150327\n商场价\t150328\n受损\t150329\n13775658010\t150330\n选项\t150331\n谈判\t150332\noffer算数\t150333\njdtjgd\t150334\n置之不理\t150335\nkcjigg\t150336\n有利可图\t150337\n轩歌\t150338\n学信真话\t150339\n粗话\t150340\nAVI\t150341\nnotkno\t150342\n廖振华\t150343\n裘德\t150344\n一丝丝\t150345\n连成\t150346\nUsb\t150347\n等号\t150348\n刘相飞\t150349\n考上\t150350\nsok\t150351\n青州市\t150352\n可听话\t150353\nsoo\t150354\nson\t150355\n魔威\t150356\nsos\t150357\n曹曹曹\t150358\nsou\t150359\nsow\t150360\n南宫踏轩\t150361\n我的厉害\t150362\n周当场\t150363\n多路\t150364\n干嘛餐\t150365\nraingr\t150366\n哦尼\t150367\n猪心\t150368\n专制者\t150369\n堂子\t150370\n退休的人\t150371\n斯隆\t150372\n招鬼\t150373\n加权指数\t150374\n张一山\t150375\n想你\t150376\n学点\t150377\n洪春初\t150378\n真气死人\t150379\n42872727\t150380\n余珂\t150381\n胆大心细\t150382\n零五零\t150383\n四公顷\t150384\n信息技术有
限公司\t150385\n艾哈迈德·祖瓦尔\t150386\n原片\t150387\n半仙\t150388\n发愁\t150389\n发意\t150390\n突破\t150391\n原版\t150392\n荒野求生\t150393\n向欣\t150394\n发绀\t150395\n何宁波\t150396\n摄影历史\t150397\nkkjjkkkh\t150398\n撒拉斯\t150399\n发愤\t150400\nwerrsrdrd\t150401\n陌陌魔\t150402\nququ\t150403\nquqq\t150404\n泌尿\t150405\n发钱\t150406\nefghigklmnopqrstuvwxyz\t150407\n顺气\t150408\n哦呀呀呀呀呀呀呀呀\t150409\n自力村\t150410\n不接爱\t150411\nlunds\t150412\n风投\t150413\n黑吃黑\t150414\n极限\t150415\n李京宇\t150416\n52528291545818152225\t150417\nFrancisco\t150418\n社人\t150419\n再见把再见吧再见吧再见\t150420\n顺水\t150421\n甘不完美\t150422\n有一说\t150423\n耶耶\t150424\nCgg\t150425\n黄帝内经\t150426\n👋\t150427\n处别处\t150428\n赵燕青\t150429\n社交\t150430\n2010年末\t150431\n乐清保安服务公司\t150432\n管制\t150433\n魔幻奇怪的问题\t150434\n水鲨\t150435\n迁就\t150436\nhttppinyincn1nSKWYcMVvO\t150437\n小秀秀思密达\t150438\n1931年5月5日\t150439\n钟浩波\t150440\n痛恨\t150441\n西青区\t150442\n余涵涵\t150443\n坏了我得了灰指甲\t150444\n1538892589\t150445\n人品\t150446\n35栋\t150447\n李峰思密达\t150448\n王起涵\t150449\n好累我好开心\t150450\n没结果\t150451\n温热\t150452\nPK吧\t150453\n佩佩\t150454\n二月二\t150455\n一联\t150456\n官堂\t150457\npet\t150458\n一聊\t150459\n一职\t150460\n居居感\t150461\nSchool\t150462\n患难与共好不好\t150463\n杯子\t150464\n洗一太\t150465\n八四三零\t150466\n素坏人\t150467\n天才少女\t150468\n邵美琪\t150469\n誰\t150470\n課\t150471\n水压\t150472\n調\t150473\n火钳\t150474\n四十四万\t150475\n楼机\t150476\n百万来\t150477\n好帅打\t150478\n剧终\t150479\n白骨哀\t150480\n郭慧娜\t150481\n用心\t150482\njd385\t150483\n随身里\t150484\n孔他的要求\t150485\n香薰\t150486\n誓\t150487\n誒\t150488\n完完整整\t150489\n些微\t150490\n斯罗塞斯\t150491\n94\t150492\n普通人\t150493\n大石\t150494\n認\t150495\n切寿司\t150496\n誉\t150497\n近七年\t150498\n誊\t150499\n預祝\t150500\n日语诶诶6饿u6656u雨衣u涂鸦日\t150501\n28日下午4时\t150502\n荧光飞鸿水栽花花是的\t150503\n肯尼迪\t150504\n杨小姐\t150505\n定边格莱斯\t150506\n20点23点十五分\t150507\n宋本山\t150508\n李神经\t150509\n刘钰淼\t150510\n拧开\t150511\n马祥皓\t150512\n1010年\t150513\n零二二零幺\t150514\n机容\t150515\n余家辉\t150516\n联峰山\t150517\n二三百\t150518\n经济危机\t150519\n不要再哭\t150520\n叶聪明\t150521\n别计较\t150522\n经济损失\t150523\n脱拖\t150524\n變髮\t150525\n更甚\t150526\n张天佑\t150527\n四道\t15052
8\n陈玉玲\t150529\n白粉\t150530\nsign\t150531\n略显\t150532\n给定\t150533\n振兴\t150534\n四遍\t150535\nEnding\t150536\n300路\t150537\n近20年\t150538\n桃心\t150539\n便座\t150540\n笑一个笑一个笑一个笑一个\t150541\n头年快乐\t150542\n蓝紫秋\t150543\n29董\t150544\n从来都没有\t150545\n上海联和投资有限公司\t150546\n你好了啊你好屌\t150547\n由此\t150548\n篱落\t150549\n15ok\t150550\n高勤学\t150551\n尹给我\t150552\n1万2千日元\t150553\n大炮筒\t150554\n九品\t150555\n韩莹\t150556\n九哇\t150557\n灏\t150558\n干嘛秘\t150559\n啦啦啦啦啦啦啦啦啦\t150560\n国家形象\t150561\n1346597531\t150562\n灿\t150563\n灾\t150564\n汉语言文学专业\t150565\n我好寂寞好寂寞\t150566\n我信的明儿叫\t150567\n十顿\t150568\n灸\t150569\n汽水\t150570\n天哥九弟\t150571\n灵\t150572\n灴\t150573\n十页\t150574\n百安稳亨\t150575\n灰\t150576\n灯\t150577\n灭\t150578\n灬\t150579\n火\t150580\n真高兴\t150581\n灥\t150582\n我是你的还有一个男的我\t150583\n宏茂巷\t150584\n突出\t150585\n青龙王\t150586\nkommonoop\t150587\n还不2儿一凡一U一lK\t150588\nEMBA\t150589\n嫁给你我爱\t150590\n哎呀呀你好痒\t150591\n尿强波\t150592\n111123325255447886633221483258624521486325886248632248855258628625852782855805425258327065486280\t150593\n黄鑫\t150594\nDOS\t150595\nmosousuo\t150596\n倒打一耙\t150597\n礼裙\t150598\nBibby\t150599\n陈天贺\t150600\n曲界\t150601\n柳暗花明\t150602\n香港站\t150603\nzozllne\t150604\n妮妮妮妮妮妮\t150605\n不是好人你是坏人坏人坏人大坏蛋\t150606\n东北三宝\t150607\n周先生\t150608\n420马力\t150609\ngudt\t150610\n幺七幺\t150611\n蓝威佑\t150612\n谢了先\t150613\n姑妄言之姑听之是听\t150614\n胎死腹中\t150615\n吉米娜\t150616\n莞莞莞\t150617\n胸腔\t150618\n94页\t150619\n666gg\t150620\n蜂巢\t150621\n磨蚀\t150622\ngggvjhxbi\t150623\n5455228865\t150624\n看花\t150625\n耒阳\t150626\n好呀大几早的不要再来烦我\t150627\n除五点\t150628\nabc港abc\t150629\n37435元\t150630\n安宇松\t150631\n河北话\t150632\n妈蛋丫\t150633\n听妈妈的话\t150634\n十月三\t150635\n茶店院\t150636\n哈杯具\t150637\n2588885\t150638\n一刀一块\t150639\n在美\t150640\n李源\t150641\n听听懂\t150642\n12345123452742\t150643\n马慧洁\t150644\n机器男友\t150645\n说借\t150646\n大名姓\t150647\n更是\t150648\n25米\t150649\njokovic\t150650\ncffgjk\t150651\n张恒\t150652\n巴啦啦小魔仙支魔仙公主\t150653\n烫烫\t150654\n陈向权\t150655\n二百多多\t150656\nオァ\t150657\n优乐\t150658\n四季\t150659\n238\t150660\n叽歪\t150661\n螳螂\t150662\n234\t150663\n235\t150664\n23
6\t150665\n237\t150666\n230\t150667\n231\t150668\n232\t150669\n233\t150670\n难求\t150671\n砖色\t150672\ns3\t150673\ns2\t150674\n算了我走\t150675\ns6\t150676\ns5\t150677\ns4\t150678\n有情人\t150679\njzoe\t150680\n公牛\t150681\n于民\t150682\n晚上8：30\t150683\n主持\t150684\n张息\t150685\n席慧\t150686\n蹦沙嘎拉嘎蹦沙嘎拉嘎哈\t150687\n从一而然\t150688\n财色\t150689\nsX\t150690\n23j\t150691\n动门\t150692\n逗比度秘\t150693\n故生\t150694\nJfbcrjc\t150695\n给我说度\t150696\n饕餮\t150697\n第四步\t150698\nsM\t150699\n冰川时代3\t150700\n噩噩噩噩噩噩噩噩噩\t150701\nsB\t150702\n桥梁\t150703\n成功\t150704\n成办\t150705\n不我盆友\t150706\n凯美莉\t150707\nsz\t150708\nsy\t150709\nsx\t150710\nss\t150711\nsr\t150712\nsq\t150713\n亡故\t150714\nsw\t150715\nsv\t150716\nsu\t150717\n23C\t150718\nsk\t150719\nsj\t150720\nsi\t150721\nsh\t150722\n天晴那景德镇\t150723\nsn\t150724\nsm\t150725\nsl\t150726\nsc\t150727\nsb\t150728\nsa\t150729\n精粹\t150730\nsg\t150731\nsf\t150732\nse\t150733\nsd\t150734\n老到\t150735\n睿翼\t150736\n小飞龙\t150737\n刘科妙\t150738\n哈驴\t150739\n瓦力君\t150740\n聊会天\t150741\n零二二六\t150742\nFNG\t150743\nwmjagtpmagj\t150744\n刘岩\t150745\n清傲\t150746\n老利\t150747\n老别\t150748\n无目神游\t150749\n缚鸡之力\t150750\ndjs\t150751\n侧滑\t150752\n脚眼珠\t150753\n液特\t150754\nFANISM\t150755\ndjv\t150756\n老刘\t150757\n镐腿\t150758\n易流云\t150759\n冠佑\t150760\n盲点\t150761\n鬼日\t150762\ndje\t150763\ndjd\t150764\n拜艺\t150765\ndjf\t150766\n现代科技\t150767\n方舟\t150768\n古拉拉\t150769\nsnh48\t150770\nsnh49\t150771\n详尽\t150772\n合心性\t150773\n徐文杰\t150774\n期行一文\t150775\n杨度\t150776\n路易斯安那州\t150777\n放寒假了你\t150778\n13469993126\t150779\n押韵\t150780\n热衷于\t150781\n等不及了\t150782\n胆水\t150783\njdkd\t150784\n正天黑\t150785\n万般无奈\t150786\n固精\t150787\n慰籍\t150788\n文工团\t150789\n1点45\t150790\n9.6亿千瓦\t150791\n含辛茹苦\t150792\n塑胶\t150793\n5295点\t150794\nwuebbr\t150795\n福度\t150796\n駱鐮\t150797\n听话的我\t150798\ndenden\t150799\n二二二七三零\t150800\nwbzbs\t150801\n不辞\t150802\n咖喱面\t150803\njhiz\t150804\n悲苦\t150805\n机电脑\t150806\n写信\t150807\n炮嘴\t150808\nqqhgs\t150809\n一七年\t150810\n坦途\t150811\n六闹铃\t150812\n弗里德\t150813\nn1天\t150814\n十一点到9点57\t150815\n我也不知道\t150816\n表嫂\t15
0817\n张念\t150818\n24条\t150819\n提宁\t150820\n7.5亿\t150821\n李红梁\t150822\n讨厌讨厌讨厌讨厌春天\t150823\n约会同\t150824\n尼卡\t150825\n勤奋\t150826\n爱立信\t150827\n秋莹怡\t150828\n套餐\t150829\n码字\t150830\n瑰丽\t150831\n对比度\t150832\n小俩子\t150833\n码子\t150834\n阜宁\t150835\n吃雨\t150836\n点靠谱\t150837\n待业\t150838\n风闻\t150839\n愣说\t150840\n樊锦珠\t150841\n穿帮\t150842\n校裤\t150843\n情史\t150844\n闵行区\t150845\n教育\t150846\n防寒服\t150847\n彩图\t150848\n彩国\t150849\n三水\t150850\n心铭桐\t150851\n黄万陶\t150852\n呃沟\t150853\n度日如年\t150854\n错别说\t150855\n涞河\t150856\n3到11年\t150857\n30公斤\t150858\n号行\t150859\n38块\t150860\n呦西呦西哟西\t150861\n怕\t150862\n怔\t150863\n同人文\t150864\n宝塔\t150865\n怒\t150866\n思\t150867\n竞争\t150868\n瓦房店\t150869\n举行\t150870\n臭不要脸臭要脸臭要脸臭表脸\t150871\n永运\t150872\n来看\t150873\n怅\t150874\n打湿\t150875\nHARDRIVE\t150876\n态\t150877\n怀\t150878\n张胡子\t150879\n怂\t150880\n米多\t150881\n怏\t150882\n怎\t150883\n苏子\t150884\n那你猜我是男是女\t150885\n怴\t150886\n马文涵\t150887\n你好黑好黑好黑好黑好黑好黑好胖好胖\t150888\n下一场雪\t150889\n阿特密\t150890\ngfdgcx\t150891\n怼\t150892\n四眼皮\t150893\n避风港凤凰网\t150894\n3344384\t150895\n急\t150896\nHmm\t150897\n性\t150898\nQQ饮\t150899\n怡\t150900\n小僵尸\t150901\nfmf\t150902\n怯\t150903\n范男\t150904\n扩大化\t150905\n怨\t150906\n怪\t150907\n问一遍\t150908\n王月\t150909\n王有\t150910\n你的之是\t150911\n一四岁\t150912\n天时\t150913\n是谁非得\t150914\n东哪\t150915\n作壁上观\t150916\n王朝\t150917\n马利奥\t150918\n问一道\t150919\n聊聊聊聊\t150920\nFbcjjgfg\t150921\n花永康\t150922\n自恋狂臭不要脸\t150923\n你的日子\t150924\n乌拉拉拉拉熙哥\t150925\n︶︿︶\t150926\n太高科技\t150927\n五七大\t150928\n尹加杰\t150929\n如神\t150930\n瑜琦\t150931\n还夜夜夜\t150932\n白梅英\t150933\n牙买加\t150934\n离子峰岩猎人\t150935\n吴琪琪\t150936\n巨大的秘密\t150937\n固定句\t150938\n秀一秀\t150939\n跳刀\t150940\n一年把\t150941\n我爸不在家我爸去工地驻扎你就我一个人\t150942\n略略\t150943\n上饭\t150944\n推特圈饭\t150945\n专们\t150946\n幺零八零\t150947\n线稿本\t150948\n度秘婆\t150949\n阿倩\t150950\n吴祚来\t150951\nhdudJwiiwwjijjjkKKkdjfExjrjdfdjJfdnfnnfujdfkfjffjjrjjrfrjjajrjeaafjeakefffjertrrritfjrrjrjrfjefjfjjeheeeeerrjjjhrrrjtjfjjrdfjfjsffjejjdjefjAjkkdjdjcjjxkK\t150952\n痴想器\t150953\n昆明铁路局\t150954\n小九九\t150955\n重要性\t150956\n乔布斯传\t150957\
ngajp\t150958\n付冰\t150959\n花肥猫\t150960\n洛斯\t150961\n找你在\t150962\n423错错错错错\t150963\n范经理\t150964\n友谊大厦\t150965\n综上所述\t150966\n梦中人\t150967\n六开跑男\t150968\n部君\t150969\n猎艳\t150970\n二千五\t150971\n傻子呆子\t150972\n456555233333\t150973\n表问\t150974\n求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你啦求你\t150975\n私有\t150976\n私服\t150977\n太棒\t150978\n遐想你好\t150979\n表闹\t150980\n沙壁\t150981\n女肖\t150982\n清明\t150983\n康敏\t150984\n十月女\t150985\n爱小机器人\t150986\n千兆以太网\t150987\n植物大战僵尸双八十\t150988\n讨厌的人\t150989\n中大\t150990\n爱婴\t150991\n四战\t150992\n23亿美元\t150993\n中央\t150994\n毫无信誉\t150995\n真爱安\t150996\n丙肝\t150997\n哪哪哪哪哪哪哪哪哪哪哪\t150998\n中外\t150999\n从天儿\t151000\n好伙伴\t151001\n夺宝传奇\t151002\n最帅的我是天下第二帅\t151003\n忘了关\t151004\nhellolyou\t151005\n植物大战僵尸花园版\t151006\n试试了试\t151007\n小姐妹\t151008\n途牛科\t151009\n一代宗师\t151010\n文艺人\t151011\n小朋友我是你杰\t151012\n你好丑陋\t151013\n四篇\t151014\n安县\t151015\n看片\t151016\n自拨\t151017\n丁宇\t151018\n喜欢你了你不乖信不信\t151019\n好眼力\t151020\n变不得\t151021\n古语\t151022\n偶像剧\t151023\n飞动\t151024\n恩就是说\t151025\n说法\t151026\n胜芳\t151027\n受不了你了该\t151028\n崔雨薇\t151029\n105斤\t151030\n100两百\t151031\n例证\t151032\n看物\t151033\n自拍\t151034\n生产部\t151035\n屁胸\t151036\n准备好像\t151037\n朱斌\t151038\n自拔\t151039\n犀皮漆\t151040\n刘俊涛\t151041\n丁家\t151042\n神魔大陆\t151043\n噔子\t151044\n自拟\t151045\n无聊的人\t151046\n卜傩\t151047\nseeEnglish\t151048\n吃惊\t151049\njué\t151050\njiwua\t151051\n擾擾絲俗話誒癒合擾\t151052\nAA制\t151053\n直好\t151054\n鹤顶\t151055\n搞通\t151056\n叉寸\t151057\n文玩\t151058\nthggty\t151059\n诺贝克\t151060\n530534148\t151061\nThat\t151062\n汝南站\t151063\n提出异议\t151064\n想聊\t151065\n汕城\t151066\n李文瑞\t151067\n陈欣\t151068\n白对\t151069\n张健豪\t151070\nrhgf\t151071\n卡卡hi皮波斯带吐yousps\t151072\n文王\t151073\n直奔\t151074\n亲一个拜拜\t151075\n6月20日上午\t151076\n阳江\t151077\n器官\t151078\n你好你好你好一哦喔胶多彬\t151079\n世子嫔\t151080\n洗碗呗\t151081\n不十秒\t151082\n昨天下午一点\t151083\n土八路土八路\t151084\n1ncm\t151085\n严家欢\t151086\n黑羽快斗\t151087\n许佳妮\t151088\n交结圈\t151089\n20141620241216\t151090\n第几章\t151091\n想起夜\t151092\n数学系\t151093\n渣土\t151094\nmany\t151095\n护眼大法\t151096\n不仁不义\t151097\nJ11D\t151098\n呃呃呃度秘\t15
1099\nmang\t151100\n正在家\t151101\n许天骄\t151102\n300%\t151103\n辉瑞\t151104\n国际市场\t151105\n老拖\t151106\n只好\t151107\nldonat\t151108\n机械架\t151109\n易碎\t151110\nCjff\t151111\n六福虎眼经文茶晶发晶紫晶绿幽灵紫红幽灵兔毛水晶挂饰\t151112\n度秘度秘你是人是鬼\t151113\n比就是说\t151114\n3001\t151115\n3000\t151116\n基本\t151117\n电子工业出版社\t151118\n邵思燕\t151119\n不包年\t151120\n04986897\t151121\n猪仔你真够呛\t151122\n克拉斯\t151123\n老拿\t151124\nlufu\t151125\n美国青少年选择奖\t151126\n打儿\t151127\naabastiantlelolitouilalalalabaheapadofihellisilaco\t151128\n局观\t151129\n简绍杰\t151130\n达达达达达达达达达达达\t151131\n泰雅斯\t151132\n小狮\t151133\n猜狠\t151134\n小独\t151135\nvcco\t151136\n秘秘糖\t151137\n未来财富\t151138\n300m\t151139\n托管\t151140\n吴莎\t151141\n赵旭升\t151142\n整弄\t151143\n小狼\t151144\n南开区\t151145\n小狸\t151146\n孙庆东\t151147\n哈多\t151148\n真天牛\t151149\nTouch会\t151150\n急刹车\t151151\n汗豆\t151152\n张慧全\t151153\n玉龙\t151154\n哈太\t151155\n小狂\t151156\n张希\t151157\n蒙萌萌\t151158\n咱两爱爱吧\t151159\n一手遮天\t151160\n吸尘\t151161\n1387143785\t151162\n高浓度\t151163\nive\t151164\n大不了你的新娘不是我，大不了我的新郎不是你\t151165\n再融资\t151166\n谈笑\t151167\nivn\t151168\nivo\t151169\nivi\t151170\n罗佳慧\t151171\n陈钥\t151172\n謝囉\t151173\n莫风流\t151174\nenerealeraera\t151175\n痴情\t151176\n内分泌科\t151177\nivy\t151178\n万家华\t151179\n反对\t151180\n现代人\t151181\n光禄坊站\t151182\n祈福\t151183\n拔草计划\t151184\nCANNIBAL\t151185\nECASA\t151186\n庞成刚\t151187\n新村\t151188\nfcgcgchdhxxhxvxbchdhdvfcvccbxvgcgfhdvfchdvfbgfhfhfvgfs\t151189\napdg\t151190\n真的假的真的假的真的假的\t151191\n叻齐逼\t151192\n偷笑][偷笑][偷笑][偷笑\t151193\n护舒宝\t151194\n啦哈\t151195\n星期节\t151196\n西游特\t151197\ndsgci\t151198\n市外\t151199\n紫竺斓\t151200\n了了了了了啦啦\t151201\nbigbang\t151202\n远东\t151203\n零六一二\t151204\n574亿\t151205\n智维\t151206\n新大头\t151207\n说会儿聊\t151208\n两门儿\t151209\n四级\t151210\n民办\t151211\n这玩笑\t151212\n你好度秘我爱你\t151213\n丽江种子联盟\t151214\n开流\t151215\n主要症状\t151216\n滕赖特\t151217\n呃续\t151218\n普桑\t151219\n绫罗绸\t151220\nfsscssbsdujlufggcbjubjhuvgugiujhvbcksnivssibknbnxkdvnnhvhvvindivdbkvddhdkhvdkvxnxhskcsjbscbicbdbjcbxjxbbjdbdjcbxcjxjcnxjcnsjcbsjbsjbcskcnscjxjssbjbcjsbchbcbhbscbghk\t151221\n初芽\t151222\n碰瓷\t151223\n玩什么\
t151224\n为证\t151225\n年岁岁花\t151226\n北市区\t151227\n为我没\t151228\n嗯度秘度秘\t151229\n承德县\t151230\n无缘无\t151231\n查雅\t151232\n王一姐\t151233\n非雅\t151234\n楼梯\t151235\nuFo\t151236\n对啊鞋\t151237\n就好嘛\t151238\n楼梦\t151239\n埃菲尔铁塔\t151240\n亲爱的那我去了\t151241\n岑参\t151242\n很有可能\t151243\nHGC721K\t151244\n照抄\t151245\n朱乐强\t151246\n龙虾\t151247\n杨玉莹\t151248\n1089700\t151249\n太多太多\t151250\n累着\t151251\nvhgds\t151252\n黄扣兵\t151253\n策反\t151254\n新少林寺\t151255\nyyyyyyyy\t151256\n乳汁\t151257\n木兰图cu\t151258\n1050万4899\t151259\n租车\t151260\n恩谢谢\t151261\n翁美玲\t151262\n核桃粉\t151263\n杨植霞\t151264\n事端\t151265\n正要\t151266\n猜测\t151267\n大获全胜\t151268\n龙虎\t151269\n人生苦短\t151270\n下一个人\t151271\n长猪\t151272\n长猫\t151273\n贝贝公主的给我说贝贝公主\t151274\n校训\t151275\n博士伦\t151276\n度你的生日\t151277\n猫扑贴贴\t151278\n广话\t151279\nSFSA\t151280\njgfrrttyiihbvccxfgubbbjkkjggfttyiiooyrrfbh\t151281\n单人照\t151282\n蔡晓豆\t151283\nMefindsomemom\t151284\n落花流水\t151285\n慰安\t151286\n不要脸你不要脸你不要脸你不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸没有人没有没有女朋友了不要不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸的家伙不要脸的不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t151287\n发脸\t151288\n6566666\t151289\n曹清华\t151290\n发脾\t151291\nmemories\t151292\n过得去\t151293\n好假\t151294\n8655\t151295\n纳粹\t151296\n四分之一第二件\t151297\n欧贝贝\t151298\n泰坦尼克\t151299\n石头剪刀布\t151300\n1009896949290\t151301\n二手网\t151302\nQSO\t151303\nBY蛋蛋\t151304\n张其嘉\t151305\n不敢相信你\t151306\n嗯恩宝爆笑虫子火中你了奥塔\t151307\n扶余市\t151308\n望梅\t151309\nhhvygb\t151310\n蔡水梅\t151311\n卢采怡\t151312\n警戒性\t151313\n二百五十多块\t151314\n一降\t151315\n这样最的是你是你就是你\t151316\n三份\t151317\n学舜\t151318\n然段\t151319\n王凤敏\t151320\n三件\t151321\n团干部\t151322\nMMM\t151323\n度老三\t151324\n听说过天不碍事\t151325\n三代\t151326\n炒盐菜\t151327\n饭田\t151328\n大连京剧院\t151329\n阿满都山\t151330\n蓟门\t151331\n机房\t151332\n歹五三\t151333\n93年\t151334\n口行\t151335\n没了心\t151336\n时空猎人\t151337\n染发\t151338\n大美美\t151339\njixjrjj\t151340\n杨梓涵\t151341\n网管\t151342\n十7月8日\t151343\n无利不夜行\t151344\n真的好讨厌TF\t151345\n齐兴华\t151346\n却更\t151347\n周龙\t151348\nNoteSlate\t151349\n董小姐\t151350\n当者\t151351\n醉翁\t151352\n友人\t151353\n美容操\t151354\n赵卡尺\t151355\n优菲\t151356\n哆嗦\t151357\n了不对\t15135
8\n笑嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t151359\n挫败感\t151360\n全市\t151361\n1546793425\t151362\n时装\t151363\nICON\t151364\n疾走\t151365\n波尔菲\t151366\n二不我说\t151367\n17332222222221501\t151368\n说人生\t151369\n12点20-12:32\t151370\n昨个儿\t151371\n上海财大会计学院\t151372\nikkjjj\t151373\n受乖\t151374\n林大胡\t151375\n腊月三十\t151376\n黑线\t151377\n五星\t151378\n猫眼\t151379\nfryg\t151380\n10％\t151381\n545554\t151382\n度秘我喜欢你度面度面\t151383\n悍马\t151384\n五明\t151385\n不好受\t151386\n臭臭餐\t151387\n浦东大楼\t151388\n跪舔\t151389\nxxxnxxx\t151390\nWOWER\t151391\n1000000008000个\t151392\n疑渐\t151393\n13360534607\t151394\n光头强我要嫁给光头强太嫁给太空光头强\t151395\n汪燕辉\t151396\n王总指\t151397\n两年后\t151398\n男我是女\t151399\n脸色\t151400\n5555558\t151401\n被斗\t151402\n5555555\t151403\nlstme\t151404\n5555553\t151405\n5555552\t151406\n百分之56\t151407\n凶凶鲨\t151408\n轮滑\t151409\n真提莫\t151410\nororororororo\t151411\n超级冰\t151412\n太辣\t151413\n打错啦\t151414\n惆怅\t151415\n吃吃吃擦\t151416\n绵密\t151417\n说人非得\t151418\n好姑娘\t151419\n豆面ok\t151420\nKTV旅\t151421\noqqoa33\t151422\n征求\t151423\n韩姐姐\t151424\nsobbing\t151425\n骨管管\t151426\n北京海\t151427\ngaqq\t151428\n太辛\t151429\n坚辛\t151430\n收成\t151431\n武鸣\t151432\nhhhygghjhhh\t151433\n滑雪鞋\t151434\n棕榈\t151435\n八云\t151436\n郝名堡\t151437\ndreteu\t151438\n无用人\t151439\nbombdrunken\t151440\n破罐子破摔\t151441\n13806460895\t151442\n跳出\t151443\n月亮湾\t151444\nXxhhhh\t151445\n逆不道\t151446\n苍井堡\t151447\n再一个再一个再一个再一个再一个再一个再一\t151448\ntgmjjpkhpt\t151449\n暴怒\t151450\n5826\t151451\n一次机会\t151452\n上不了了么\t151453\n这天儿\t151454\n诸暨市民族宗教事务局\t151455\n不热\t151456\nmylq\t151457\n彦宏\t151458\n不烦\t151459\n评断\t151460\n热线电话\t151461\n全校\t151462\n对不住\t151463\nlostl喽wmpt\t151464\n拎着\t151465\n穆晶\t151466\n江娜\t151467\n姐夫人\t151468\n在意\t151469\n伊万诺夫\t151470\n汪6b\t151471\n说话自命\t151472\n阅卷\t151473\n联通3G\t151474\n说不出口\t151475\n对不你\t151476\n咪咕汇\t151477\n12OOO\t151478\n此景\t151479\n3升\t151480\n猪猪猪猪猪猪咪咪咪咪咪\t151481\n真人秀\t151482\n没有爱\t151483\n爸爸\t151484\n五分钟之内\t151485\n向动图帝\t151486\n教铺\t151487\n年少的时光\t151488\n电信版\t151489\nAbu\t151490\n那化成\t151491\n八月\t151492\n数据凤凰\t151493\nG11\t151494\n高级版\t151495\n泱泱\t151496\n
12855321111111\t151497\n猫庙\t151498\n美景\t151499\nxor\t151500\n企着\t151501\n讲大话\t151502\n戴宗\t151503\n开心一笑那你给我\t151504\n猫店\t151505\n八本\t151506\n跑线\t151507\n六那\t151508\nxoc\t151509\n专业者\t151510\nxoo\t151511\n萧山\t151512\n汤少琪\t151513\n梁靖\t151514\n4607000430\t151515\n尿素\t151516\n急得一筹莫展\t151517\nJggk\t151518\n写真集\t151519\ngjarie\t151520\n2元\t151521\n聊理\t151522\n堡主\t151523\n紫檀\t151524\n接合部\t151525\n一两件\t151526\n方才飞机\t151527\n了乃拿来们\t151528\n游刃有余\t151529\n灯笼状\t151530\n狂跌\t151531\n呆呆呆呆呆呆\t151532\nfgyigdrgccxgvbhhhhuygttgfffggggggg\t151533\n堡业\t151534\n旧市\t151535\n元素周期表\t151536\n上党梆子闯幽州\t151537\n加士多\t151538\n小度秘度秘\t151539\n堡丐\t151540\n十余数\t151541\n切克闹哟哟切克闹\t151542\n洛克希德·马丁公司\t151543\n百看不厌\t151544\n未见到\t151545\n苏洒成\t151546\n怪你了谁\t151547\n省省\t151548\n谁是你的好基\t151549\njvjbcibbnnj\t151550\nmpjp\t151551\n太阳眼镜\t151552\n李钰欣\t151553\n雪岭熊风\t151554\n二十三二十二三\t151555\n招胜\t151556\ngvdd\t151557\nJDJECJDVHDVJCDJSBBUISBIBSXISPBXOHBAXPUWSBSBSBSBSBSBBSBABSBABABA\t151558\n张家豪\t151559\n南郊公园\t151560\n完美秘\t151561\n法布雷加斯\t151562\n索尔金，诺兰\t151563\n亲嘴儿\t151564\n6\t151565\n汇合\t151566\n铺拼英\t151567\n你好呀萝卜兔\t151568\n三四百元\t151569\nshhshsbs\t151570\n扉页\t151571\n一五七\t151572\n盒装\t151573\n杨会\t151574\n我要你唱歌我要你亲我要你亲我\t151575\n正方体\t151576\ndiningroom\t151577\n很神奇\t151578\n脏兮兮\t151579\n交行\t151580\n9584945552454681\t151581\n牙膏盒\t151582\n手套\t151583\n大爆发\t151584\n灵芝茶\t151585\n一五个\t151586\n霧霧\t151587\n从来都之恋\t151588\n张书文\t151589\n翻噶\t151590\ni6533466664yf5\t151591\ndugg\t151592\n水城路\t151593\n中者\t151594\nｍｅｔｏｏ\t151595\n中国虎基金会\t151596\n3hn\t151597\n培琼\t151598\n红海尔\t151599\n小型\t151600\n三千舫\t151601\n张金阳\t151602\n屙\t151603\n为国\t151604\n屛\t151605\n姜饼\t151606\n属\t151607\n1999万\t151608\n13608229082\t151609\n晓校\t151610\n展\t151611\n王治凯\t151612\n屈\t151613\n肥cr秘\t151614\n届\t151615\n屌\t151616\n屏\t151617\n屎\t151618\n屁\t151619\n局\t151620\nTFB0YS\t151621\n层\t151622\n居\t151623\n屄\t151624\n停停停\t151625\n屹\t151626\n放眼里\t151627\nhdvxbdh\t151628\nhrsf\t151629\n屿\t151630\n欧尼饭\t151631\n山\t151632\n屳\t151633\n单肩包\t151634\n大总裁\t151635\n子牙\t151636\n5千
\t151637\n要价\t151638\nq版\t151639\n屯\t151640\n屮\t151641\n屡\t151642\n屠\t151643\nTFB0Ys\t151644\nq片\t151645\n屦\t151646\n果核\t151647\n刘康伟\t151648\n赐教\t151649\n远道\t151650\n空际\t151651\n慈姐\t151652\n凯蒂猫\t151653\nwouldusean\t151654\n够不想\t151655\n空降\t151656\n云盘\t151657\n我是你的小受\t151658\n南务吧\t151659\n呃夏\t151660\n多梦\t151661\n果树\t151662\nchildren\t151663\n苦难言\t151664\n黄思奇\t151665\n怪羞羞\t151666\n清梦\t151667\n东方明珠到我\t151668\n大众点评\t151669\nc1宝莱\t151670\n贤重\t151671\n碱性食品\t151672\n令人震惊\t151673\ndaGZef\t151674\n阿巴丁\t151675\n1307\t151676\n1300\t151677\n七八卦\t151678\nJFM\t151679\n撤出\t151680\n画片\t151681\n朋古鲁\t151682\nJFD\t151683\n上实要\t151684\n浦年\t151685\n天下掉累赘迭你要不\t151686\n心情不好累\t151687\n石女\t151688\n河堤\t151689\n彼方\t151690\n七八千\t151691\n4577322467\t151692\n七八十\t151693\nhzf\t151694\n余一\t151695\nз╰\t151696\n哪方面\t151697\n推演\t151698\nhzn\t151699\n余三\t151700\n第四季\t151701\n想要你\t151702\n荷叶青青\t151703\n那当然\t151704\nfhfjdjfhgh\t151705\n诗巴丹\t151706\n90余\t151707\nhzx\t151708\n眼花缭乱\t151709\n淮沭新河\t151710\nFAN们\t151711\n赵玉龙\t151712\n吴小宇\t151713\nMichael\t151714\ndiigzgyyzx\t151715\n32萬\t151716\n摇篮曲家\t151717\n嗨度秘\t151718\n以慧寂\t151719\n史证\t151720\n孝敏\t151721\nGadget\t151722\n飞开\t151723\n安布\t151724\n面额\t151725\n理我了\t151726\n64683495\t151727\n飞弹\t151728\n联合国人权理事会\t151729\n好高深\t151730\n夜来\t151731\n祖雅寺\t151732\n衝小紧\t151733\n政委\t151734\n叶永康\t151735\n九分钟\t151736\nhihen\t151737\n笑口\t151738\n收银员\t151739\n孝敬\t151740\n魔仙女王\t151741\n可爱多好啦\t151742\n酷不酷\t151743\nsvftd\t151744\n李文科\t151745\n跨子\t151746\n傻脸娜\t151747\nRose\t151748\n仙花\t151749\n十六一\t151750\n下山\t151751\n30nn\t151752\n结束之\t151753\n邢那\t151754\n不搞错\t151755\n小冰ri\t151756\nRoss\t151757\n分泌\t151758\n大尾鲈\t151759\n亭台楼阁\t151760\n雷子枫\t151761\n啵啵啵啵啵啵啵啵啵啵啵啵啵\t151762\n包雅炯\t151763\n大咖咖\t151764\n今儿太不开心\t151765\n骗话\t151766\n好美\t151767\n80亿元\t151768\n929元\t151769\n学生部\t151770\nMarrie\t151771\n嘿假\t151772\n考虑其事\t151773\n行化\t151774\n名胜古迹\t151775\n高小丽\t151776\n夏瓜瓜\t151777\nVfg\t151778\n聪明连我说话\t151779\nPOSD\t151780\nPOSE\t151781\n心颜\t151782\n心额\t151783\n上空\t151784\n业务范围\t151785\n征人\t
151786\n摧残\t151787\n理清\t151788\n嗯闹\t151789\n尤一\t151790\n云歌\t151791\n褐色\t151792\n牛虻\t151793\n屏气死\t151794\n苗寨\t151795\n行医\t151796\n中央气象台\t151797\n自作孽\t151798\n日榜\t151799\n弯弯\t151800\n应山宝利\t151801\n十万八千八百九万九\t151802\n泛爱\t151803\n悠哈你说的话\t151804\nrr55\t151805\nRIRIRI\t151806\n元氏\t151807\n由衷\t151808\n帮人\t151809\n探秘\t151810\n七十年\t151811\n中田春平\t151812\n苏比克湾\t151813\n念垚\t151814\n啊秘\t151815\nfgjj8\t151816\ntoria\t151817\n陆树铭\t151818\n110美元\t151819\n啊秀\t151820\n咀嚼\t151821\nmuyou\t151822\n两个\t151823\n元气\t151824\n悔不晚生\t151825\n岚皋高\t151826\njer\t151827\njes\t151828\njeq\t151829\njev\t151830\n七十平\t151831\n组团\t151832\njeu\t151833\n封底\t151834\njex\t151835\ngvdgshs\t151836\n阿搞\t151837\n一水岁\t151838\njeb\t151839\n矛盾\t151840\n两一\t151841\n90千克\t151842\n两万\t151843\njed\t151844\njee\t151845\njej\t151846\n邓梓茵\t151847\njeh\t151848\n两三\t151849\njen\t151850\n81号\t151851\njem\t151852\n技术改进\t151853\n27家\t151854\n888888866666666666666555555555552222222222211111111111\t151855\n3403-5447；03-3403-3380；03-3403-3345\t151856\nwhatknead\t151857\n屋架\t151858\n瞿康宇\t151859\n武功法\t151860\n麻口\t151861\n水煮活鱼\t151862\n你的笑容\t151863\n防空\t151864\n美妞们\t151865\n不要说着我\t151866\n赖俊威\t151867\n想清楚\t151868\n麻友\t151869\n盈已转出\t151870\n钢铁匹洛曹\t151871\n3千年\t151872\n辞工\t151873\n太屌\t151874\n1387634163\t151875\n蒙骗\t151876\n文思\t151877\n举证\t151878\n陈丑\t151879\noooac\t151880\n海狗\t151881\n度肚扇\t151882\nnex\t151883\n男大当婚\t151884\njhghffdGjbgg\t151885\n年休假\t151886\n2001年9月8号\t151887\n世者\t151888\n08588\t151889\n致信\t151890\n赶作业\t151891\n还是不是\t151892\n14分钟\t151893\n南谢\t151894\n呃十五\t151895\n海狮\t151896\n煲汤\t151897\n进门\t151898\n2千零二3\t151899\n文怡\t151900\n两个女人\t151901\n提高性\t151902\n好的好的好的好的回答哈打哈打哈打哈打哈打哈打\t151903\n这么些\t151904\n素直\t151905\n万源店\t151906\n迷倒\t151907\nHnjjvgj\t151908\n慧慧慧\t151909\n纳米饭\t151910\n翠苑一小文华校区二一班\t151911\n顶梁柱\t151912\n卡卡罗特\t151913\n郝海波\t151914\n1990940068\t151915\n采荼\t151916\n睡不了觉\t151917\nh2小学\t151918\n日记日剧\t151919\n明天凌晨3点\t151920\n插页\t151921\n徐子佳\t151922\n远远儿\t151923\n朝阳区来广营北路奶西村农村信用社\t151924\n看不准\t151925\n采药\t151926\n知知知知知
知知知知知\t151927\n从的那爸爸呢爸爸吗爸爸嗯\t151928\n别出心裁\t151929\n很实用\t151930\n素色\t151931\n印度斯坦时报\t151932\n聊了会\t151933\n罐头\t151934\nl0S\t151935\nl月10曰\t151936\n芠文\t151937\n二十多号\t151938\n8442387\t151939\n好嘞好嘞好嘞好嘞\t151940\n我想玩儿植物大战僵尸\t151941\n给物还价\t151942\n杜老婆\t151943\nラ#\t151944\n国歌讨厌你你干嘛讨厌我呀你我可讨厌你你敢可恶\t151945\n完我走\t151946\nwrites\t151947\ntotk\t151948\n待见\t151949\n197019\t151950\nl0l\t151951\n解压\t151952\n锵锵\t151953\nnjnkbbjkjvbj\t151954\n不再犹豫\t151955\n杀驴\t151956\n冰棍儿\t151957\nqqerttu\t151958\n常数\t151959\n精灵仙子我是花精灵植物仙之我\t151960\n小酉\t151961\n芝加哥公牛\t151962\n血浓于水\t151963\nJEFF\t151964\n246541231\t151965\n3186点\t151966\n115师大\t151967\n小酒\t151968\ntuurwqeii\t151969\n乖乖听话\t151970\n陆珺怡\t151971\n寒雨\t151972\n战场\t151973\n战地\t151974\nedeeeeeeddddd\t151975\n零点场\t151976\n种人\t151977\n文化遗产\t151978\n透出\t151979\nr点五百\t151980\n余老师\t151981\n金毛公主\t151982\n苏共中央\t151983\n弄中分\t151984\n引申\t151985\n贾浩\t151986\n说不通\t151987\n朱亚文\t151988\n罗素\t151989\n引用\t151990\n罗索\t151991\n4杆\t151992\n比帮\t151993\n拆记\t151994\noffce\t151995\n三国志无双战\t151996\n波胆\t151997\n插画师\t151998\n赵生涛\t151999\n苏泷\t152000\n请留意\t152001\n比帅\t152002\n457357535983586557\t152003\n依米\t152004\n一到七八\t152005\n邱永铮\t152006\n平静\t152007\n党部\t152008\n君子兰\t152009\n快点我就是爱你\t152010\nWmgpjag\t152011\n一个十六号\t152012\n眼肌\t152013\n搓火\t152014\n影后级\t152015\n众女星丰腴逼人\t152016\n3125468762\t152017\n华东\t152018\nBOOKING\t152019\nnxnxj\t152020\n碧水\t152021\n赵媛媛\t152022\n那场七\t152023\n狗叫版\t152024\n眼肿\t152025\n光云影\t152026\nhuhhhhhhjjjnbhvgvcgvgvvgvbbvbhbhjghvcgvvvvghgyg\t152027\n孕垲谭\t152028\nudid\t152029\n死板行\t152030\n寂寞的人\t152031\n平面\t152032\n古坤燏\t152033\n花式化\t152034\n817861\t152035\n新药\t152036\nYY1\t152037\n三千克\t152038\ndydhd\t152039\n接过\t152040\n新荣\t152041\n29000天\t152042\n什时候\t152043\n接连\t152044\n宗派\t152045\n别说了臭不要脸\t152046\n你和你的妈妈\t152047\n真无聊你真无聊你真无聊度秘\t152048\n接近\t152049\n赶明儿\t152050\n话唠\t152051\nWath\t152052\n张丰毅\t152053\n华农\t152054\n容美\t152055\n高不好\t152056\n尹荣强\t152057\n王雨荷\t152058\n圆瑗\t152059\n过年了沙洋\t152060\n咖啡狗\t152061\n谢特\t152062\nv不唄愈合接惡語預警機幾年不見卡羅拉看要好好禦眼紅寄回家男奴好vvv不哈
酒戶口卡\t152063\n邪魅\t152064\n杀卜\t152065\n照葫芦画瓢\t152066\nWats\t152067\n病理学\t152068\n盗版堡\t152069\n3qv\t152070\nSiri个小欧\t152071\n金肯技工学校\t152072\nhoutu\t152073\n太守规则\t152074\nHdhhdh\t152075\nxiaohua\t152076\neheey\t152077\n试了试\t152078\n约辆\t152079\n同归于尽\t152080\npuvple\t152081\n甲木日干\t152082\n沃美\t152083\n功底\t152084\niPadnniPad\t152085\n你好我喜欢姐姐日记\t152086\n苍南\t152087\n酒精中毒\t152088\n好乖\t152089\n好乐\t152090\n整体\t152091\n婉丽美\t152092\n迷表\t152093\n算下\t152094\n好么\t152095\n处说\t152096\n好久\t152097\n18815140400\t152098\n车群\t152099\nhiigd\t152100\n密鲁\t152101\n音乐份\t152102\n这城市\t152103\n龌龊化\t152104\n身疾病\t152105\n郭嘉怡\t152106\n张如元\t152107\n蹑手蹑脚\t152108\n一下次\t152109\n贾凯翔\t152110\n复制\t152111\n看不着你\t152112\n六十多岁\t152113\n5cmv\t152114\n好书\t152115\n度秘你和谁度蜜月\t152116\n明天冷\t152117\n大侠度秘\t152118\n一哦\t152119\n辰东\t152120\n一哥\t152121\n好气愤\t152122\n8869\t152123\n凯帅\t152124\n一哨\t152125\n哩个\t152126\nメノ\t152127\n我喜欢你只要你不要你克星了我就喜欢你好不好\t152128\n海伦凯勒\t152129\n6点17分\t152130\n真头\t152131\n嘉峪\t152132\n颜色\t152133\n日饭\t152134\nfhucjfjvo\t152135\n万平方\t152136\n十多米\t152137\n13143687\t152138\n实度\t152139\n拜托我是你的你是女的好不好\t152140\n一品\t152141\n柳公权\t152142\n北半\t152143\n郭vvvv\t152144\n辛树清\t152145\n30家\t152146\n无所失\t152147\n3班\t152148\n11.9公斤\t152149\n华南城\t152150\n欧缘\t152151\nfdfhf\t152152\n石福临\t152153\n麦克-莫达诺\t152154\n白袜\t152155\n优惠挡\t152156\ngxgxxghxxggxdfxxnnnnnnnwheresmyschoolbag\t152157\n平亚俊\t152158\nFundamentals\t152159\n劉軍寧\t152160\n知识点\t152161\n102嘛\t152162\n学熊叫\t152163\n海伦子\t152164\n工画师\t152165\n菩提树\t152166\n技晓\t152167\npsp\t152168\n喵喵喵喵呜\t152169\n制定\t152170\n恩石\t152171\n述安\t152172\n斯诺克\t152173\nwrwsm\t152174\n炎魔\t152175\n羽博\t152176\n王策\t152177\n气体\t152178\n5554525\t152179\n王筝\t152180\n拉萨\t152181\nSMT\t152182\ngdnsisvakifndwk\t152183\n3332224744744444556\t152184\n刘佳佳\t152185\n七武海珠\t152186\n萧乾\t152187\n欢心\t152188\n敢敢\t152189\n1834381\t152190\nZZZ\t152191\nnhrn\t152192\n310109度\t152193\n真够\t152194\nsomefood\t152195\n法律界\t152196\n王天凯\t152197\n十年级\t152198\n闲饭\t152199\n专栏\t152200\n月4日晚7点10分\t152201\n好色之徒\t152202\n一百封\t152203
\n爰拘\t152204\n刁77’7322222222222\t152205\n敬神\t152206\n黑白鲸\t152207\nshydv\t152208\n1折\t152209\n接口\t152210\n13657元\t152211\n95斤\t152212\n世风日下\t152213\n眼见为实\t152214\n豕人\t152215\n狗叫行\t152216\n肥波\t152217\n杜是\t152218\n见闻\t152219\n客物\t152220\n能现\t152221\n芯星\t152222\n金牛宾馆\t152223\n小鼓\t152224\n当我男闺蜜\t152225\n23秒\t152226\n昰星\t152227\n毛孔\t152228\n接受\t152229\n7733\t152230\n妻浩宇\t152231\n王丹阳\t152232\n嗯小受\t152233\n110710\t152234\n杜明\t152235\n汉大\t152236\nvarying\t152237\n余新生\t152238\n支配\t152239\n下地狱\t152240\nodiofdi\t152241\nLync\t152242\n阔怕\t152243\n又笑又哭\t152244\n怀揣\t152245\naproblem\t152246\n大江饭店\t152247\n胜利之光\t152248\n森汗亚东\t152249\n搞不搞\t152250\n冷冰冰\t152251\n七下\t152252\n独占\t152253\n七三\t152254\n猜我喜欢\t152255\n七万\t152256\n电费\t152257\n宽敞\t152258\n七七\t152259\n2.37亿元\t152260\n七一\t152261\n七丿\t152262\n俩等会儿\t152263\n姚若涵\t152264\n专辑\t152265\n腹股沟\t152266\n十八文\t152267\n杜金博\t152268\n哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼哼\t152269\n鳕鱼\t152270\n尼美尼美\t152271\n明目\t152272\n锡壶\t152273\n七中\t152274\n嗯嗯嗯嗯嗯嗯嗯嗯\t152275\n七个\t152276\n上地十街10号\t152277\n马欧格\t152278\n西贝乐个科噶思密达\t152279\n不哥哥\t152280\n警犬\t152281\n卢英\t152282\n马卡\t152283\n发南风\t152284\n不懈\t152285\n称赞\t152286\n对策\t152287\n不懂\t152288\n别在\t152289\n纤伊瀛\t152290\n对等\t152291\n巴黎世家\t152292\n{__\t152293\nkklk\t152294\n被逼无奈\t152295\n蔓迪启悦\t152296\n柳n慧\t152297\nhggjgg\t152298\n劝君更\t152299\n杨启鹏\t152300\n巴黎字母秘的的你是猪你是猪你是猪你是猪\t152301\n爱失去了关键\t152302\n5787\t152303\n慢慢贝贝\t152304\n5781\t152305\n两星期\t152306\n更怕怕\t152307\n算了我不会\t152308\n魔鬼\t152309\n57.62亿元\t152310\n贵州师范大学\t152311\n处之\t152312\n秘差\t152313\n地瓜地瓜\t152314\n秘工\t152315\n五六二零七七八三一六\t152316\n春风吹进我心房\t152317\ntopdjud\t152318\n有瘾\t152319\n马媛宇\t152320\nvdubd\t152321\n查班\t152322\n接生\t152323\n丸子GG\t152324\n9月17日上午\t152325\n大蚂蚁\t152326\n奋起\t152327\n鹿克特\t152328\nDDEEE\t152329\n自觉点\t152330\n幽梦\t152331\n飞虎\t152332\n喜获\t152333\n中牟县\t152334\n悠哉悠哉\t152335\n毒恩\t152336\n2012年8月\t152337\n收租\t152338\n叶良\t152339\n刮掉\t152340\n零二二零幺零\t152341\n伤感情谈\t152342\n日鳗鱼饭\t152343\n哪方\t152344\n龚明丽\t152345\n侬今朝\t152346\n21212\
t152347\n通知\t152348\n刘莉\t152349\n决策者\t152350\n生日派对\t152351\n省内外\t152352\n男白羊\t152353\n为学\t152354\n吧啦啦巴啦啦小魔仙\t152355\nanzeiaaabaaaby\t152356\n爱了再见\t152357\n飞虫群\t152358\n9万k\t152359\nalpschiklv\t152360\n家乐\t152361\n没好撒\t152362\n高血脂\t152363\n夏风\t152364\n出事\t152365\n1352850\t152366\n出于\t152367\n运动员村\t152368\n1637479478643678966599\t152369\n炒鸡炒鸡\t152370\n权利人\t152371\n李喆君\t152372\nfresy\t152373\n姐妹儿\t152374\n字典\t152375\n小酒窝长睫毛是你最美的微笑\t152376\ngcdfsxzz\t152377\nBTV卫视频道\t152378\nKGB\t152379\n出产\t152380\n家书\t152381\nKGK\t152382\n出京\t152383\n家乡\t152384\n伐木\t152385\nOK么饿饿饿饿饿\t152386\n绿杨\t152387\n与世无争\t152388\n下身边\t152389\n哪辈子\t152390\n威尔玛\t152391\n李俊京\t152392\nwgkcz\t152393\n黄哈哈哈\t152394\n今早六点半\t152395\n翟若冰\t152396\n赖床\t152397\n荫妻\t152398\n呵哈\t152399\n850MHz\t152400\n天气娘\t152401\n任雨轩\t152402\nh48\t152403\n100千米\t152404\n少的小娘子少少\t152405\n筛查\t152406\n任达会\t152407\n编图\t152408\ncjkcjcjfjfkkfkfkfjfjjcjcjcjjcjcjf\t152409\n劈力\t152410\n小学馆\t152411\nX112\t152412\n91个\t152413\n2010年4月12日晚20点到21点\t152414\n13783569904\t152415\n┏云殿\t152416\n我懂\t152417\n太难了我不会太难了我不会呀哪你给我弄\t152418\n声声\t152419\n龙天义\t152420\n纱堆\t152421\n奔腾\t152422\n非尔兹\t152423\n我的会\t152424\n计时\t152425\n名车志定制版\t152426\n好好好好好好\t152427\n我要你性别是女\t152428\n李小小\t152429\nvivox6plus\t152430\njrudjd\t152431\nONLY\t152432\n想方珊\t152433\n很美好\t152434\n李倩倩\t152435\n祢系\t152436\n故宫\t152437\ngkng\t152438\n真伪\t152439\n11点12点\t152440\n红钳蟹\t152441\n渥太华\t152442\n美女度\t152443\n完了看\t152444\n硕士学位\t152445\n十八代\t152446\n2828\t152447\n好美你听的懂话这是哪的话\t152448\nej升\t152449\n2820\t152450\n殓梦花\t152451\n洛凯甲\t152452\nyoce\t152453\n池塘\t152454\ntijng\t152455\n瞎扯淡\t152456\n红才\t152457\n简敢\t152458\n最终归属权归爱泡吧论坛\t152459\n童聃宇\t152460\n洋洋自得\t152461\n不会影\t152462\n马虎\t152463\n换防\t152464\n陪别\t152465\n逼真\t152466\nneq\t152467\nGiamatti\t152468\n农历四月初五\t152469\n三四十斤\t152470\n柳贤\t152471\n新浦捷安快客站\t152472\n我是你的家人那我是你的谁\t152473\nE藏藏藏\t152474\n雷霆非拉\t152475\n加利利\t152476\n张静怡\t152477\n更有趣\t152478\nlouise\t152479\n秘你的话\t152480\ncasinu\t152481\nNPD集团\t152482\n大宗交易\t152483\n外卖员\t152484\n罗尼
玛\t152485\n卷残云\t152486\nChippfield\t152487\n跑动\t152488\n五点钟\t152489\n拼劲\t152490\n体检表\t152491\n阿一一\t152492\ntvxm12\t152493\n95555555555555555555555555555\t152494\n原始类\t152495\n公明田寮\t152496\n愤怒的小鸟太空版\t152497\n更完善\t152498\n电子\t152499\n正太太\t152500\n峰幂\t152501\nˇ\t152502\ncvnfdh\t152503\n孤独感\t152504\nˊ\t152505\nˉ\t152506\n顾纪强\t152507\n1.2米\t152508\nlwdsiuiiìmkk\t152509\n露乳么了\t152510\n上午八点半\t152511\n恋音\t152512\nqq邮箱\t152513\n促销样\t152514\n˙\t152515\n总收入\t152516\n随即\t152517\n石梓莹\t152518\n嘴硬心\t152519\n雷诺\t152520\n一五百块\t152521\n三辑\t152522\n健体\t152523\n李晓帅\t152524\npppd\t152525\n三辆\t152526\n拉一推\t152527\n拉杆\t152528\n公选\t152529\n诈骗案\t152530\n困想\t152531\n轻蔑\t152532\n宣武\t152533\nkkkkjkkkkj\t152534\n走晓晓\t152535\n我好你是我的你是我的宠物\t152536\n三辰\t152537\n陕玉玲\t152538\n排比\t152539\n排毒\t152540\n多咪行\t152541\n瞒着\t152542\n系体\t152543\n困惑\t152544\n姜丽\t152545\n边边上\t152546\n龙阳\t152547\n这栋\t152548\n額\t152549\n題\t152550\n3月13日\t152551\n关进\t152552\n凯迪\t152553\n尊师\t152554\n自然之道\t152555\n东长寺\t152556\n類\t152557\n好样\t152558\n45根\t152559\n唐尼\t152560\n大法\t152561\n6张\t152562\n只选\t152563\n受瞩目\t152564\n西门子\t152565\n仙逆\t152566\n光疑\t152567\n大波\t152568\n了拜拜\t152569\n7月1日起\t152570\n顽\t152571\n唐少\t152572\n顿\t152573\n顾\t152574\n项\t152575\n须\t152576\n顺\t152577\n页\t152578\n扇墙\t152579\n顶\t152580\n顳\t152581\nbbrzz\t152582\n几千个\t152583\n七道湾\t152584\n王王子\t152585\n李星星\t152586\n保管\t152587\n熊样儿\t152588\n晚6点\t152589\n614254643647637804754353545￥\t152590\n味香\t152591\n辐条\t152592\niprocityou\t152593\n队员\t152594\n航哥瑞\t152595\n商品房\t152596\n总是\t152597\n探险旅行\t152598\n一85788575855588\t152599\nabcdefzlin\t152600\n江户\t152601\n了渴\t152602\n勉为其难叻\t152603\nhttpfhiphotosbaiducomxiaodupicitema71ea8d3fd1f41340a6f200a221f95cad1c85e6djpg\t152604\nfisfisisfgs\t152605\n几千万\t152606\n外公啵\t152607\nggddfgr\t152608\n唱功\t152609\nok镜斜边\t152610\nafg\t152611\n代付款\t152612\n冬火凤凰解放军\t152613\n快点儿吧老兄\t152614\n二零八二\t152615\n35顿\t152616\nmove\t152617\n唯爱SJ13\t152618\n二板板\t152619\n示范版\t152620\n说唱歌\t152621\nLaposse\t152622\nX\t152623\n两岁\t152624\n题西林壁\t152625\n一模一\t1
52626\n光棍棍\t152627\n特行\t152628\n姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐姐谢谢姐姐姐姐姐\t152629\n堂会\t152630\n深深十四了我\t152631\n小七仔\t152632\n舞蹈\t152633\n北戴河医院\t152634\n2015\t152635\n2014\t152636\n2016\t152637\n2011\t152638\n2010\t152639\n2013\t152640\n2012\t152641\n鱼鳞\t152642\n18484\t152643\n水池\t152644\n派员\t152645\n鱼鳖\t152646\n非己\t152647\n刘雨嘉\t152648\n1.22米\t152649\naaavvv\t152650\n清涟河\t152651\n鱼鳍\t152652\n抒情\t152653\n那个谁的眼\t152654\nlzl\t152655\n水汽\t152656\nHshd\t152657\n尽量\t152658\n谓宾\t152659\n塞拉拉\t152660\n1一10000000\t152661\n太久\t152662\n研究中心\t152663\n化脓\t152664\n侦测\t152665\nchiphotosbaiducomxiaodupicitem30adcbef76094b36f6453a9ea4cc7cd98c109de9jpg\t152666\n杏仁巧克力玛芬\t152667\n候更新\t152668\n要不对\t152669\n瓜娃子\t152670\n太乐\t152671\n闫丽华\t152672\ntruth\t152673\n烟雨蒙兮花依在\t152674\n促销活动\t152675\n太乙\t152676\n恩恩恩恩恩\t152677\n五得么\t152678\n1xiakiiituiapuavava\t152679\n账号\t152680\n说的好\t152681\n橘子+桃+梨\t152682\n力震\t152683\n吹风儿\t152684\n光涵\t152685\n八十干\t152686\n6：15\t152687\n八十年\t152688\n天会\t152689\n烈焰\t152690\n美丽的眼睛\t152691\n赢生\t152692\n呢q兔\t152693\n8盆\t152694\n长江尾\t152695\n品摊\t152696\n徒儿们\t152697\n五婶\t152698\nyinxiao\t152699\n刘雨莎\t152700\n思奔\t152701\nwebtrends\t152702\nsdgbxdhb\t152703\ndhcg\t152704\n哟不哟\t152705\n慌慌\t152706\n大熊熊\t152707\nbed凌乱\t152708\n长流\t152709\n轮船\t152710\n朱秋月\t152711\n文徳北\t152712\n二百六十分钟\t152713\nskvb\t152714\n朱探险\t152715\n你是谁呀你师生涯\t152716\n4￥\t152717\n陈石鹏\t152718\nddrtddsawgd\t152719\nm6兔兔\t152720\n清明草粑粑\t152721\nlqbube\t152722\n逗梁\t152723\n嗯嗯眼\t152724\n那么好\t152725\nPoint\t152726\n设想\t152727\n菲胡\t152728\n半节\t152729\n十斤\t152730\n十方\t152731\n這本书\t152732\n研希\t152733\n广西陆川县第四中学\t152734\n芒果\t152735\n叛国者\t152736\n儿肥\t152737\n群狗\t152738\n8208208820thc\t152739\ntaattag\t152740\n四分之三第二次\t152741\n供成\t152742\n老子气\t152743\n宣威一中\t152744\n彩钢\t152745\n1824年5月11日\t152746\n解霸\t152747\n太迟钝\t152748\n修变\t152749\n急急\t152750\n急性\t152751\n自媒体时代\t152752\n伤心就好\t152753\n室友\t152754\n叔叔叔\t152755\n为此\t152756\n1.6万余元\t152757\n754245757575\t152758\n面油\t152759\n绘画家\t152760\n娇受\t152761\n刘深圳\t152762\n
cles\t152763\n张幸福\t152764\n不是我好爱好还蛮\t152765\n李根顺\t152766\n卵子\t152767\n撒浪\t152768\n八家嗯\t152769\n你走你走你走\t152770\n风夏\t152771\n旧时\t152772\n爱死你了爱理\t152773\n大雨倾盆\t152774\n知不在\t152775\nhjfafh\t152776\n底底\t152777\n李雅静\t152778\nsjdyaigdsysgfjhegkhhdegckjrgcjhrbvybfcgdbjksbxiedjdvg\t152779\n烧火\t152780\n一段时间\t152781\n底座\t152782\n风大\t152783\n10点20\t152784\n商人妇\t152785\ngghji\t152786\n四年多\t152787\n燃烧吧少年\t152788\n村里人\t152789\nigdd\t152790\n风头\t152791\n我的快乐我的家\t152792\n十二月初八\t152793\n4245556424\t152794\n邓亚萍\t152795\n111111222223333444445555556666677777888889999900000\t152796\n泽侨\t152797\n一定很美\t152798\n永宏佳\t152799\n为不着\t152800\n乐此不彼\t152801\n爱东k耳\t152802\n宋家山\t152803\n苏源静\t152804\n出去玩玩\t152805\n顿挫\t152806\n知道日报\t152807\n群架\t152808\n希芝\t152809\n夜公园\t152810\nJohn\t152811\n不能自拔\t152812\n100次\t152813\n4000\t152814\n5000万元\t152815\n110114东海Twitter\t152816\nw息\t152817\n我是你好妈妈呀\t152818\n还我挺\t152819\n讨伏\t152820\n再来片\t152821\n凌晨1点41分\t152822\n伤心你个大头鬼\t152823\n左岸\t152824\n恩帅\t152825\n10.31\t152826\nwoc6666\t152827\n一一万一千一一一万一斤\t152828\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t152829\n10.3%\t152830\n肌肉娃\t152831\n万欣颜\t152832\n一只三万八千八百八十八万亿\t152833\n挪开\t152834\n46868994\t152835\n捐书\t152836\n安妮海瑟薇\t152837\n三个故事\t152838\n忙求你了我最亲最亲的姐姐\t152839\n正棒\t152840\n不难说\t152841\n男男男男男男我怕死baby\t152842\n110120119\t152843\n二星级\t152844\ncfed\t152845\n瑶瑶瑶\t152846\n雍季\t152847\n羞涩干\t152848\n庐山真面目\t152849\n杜萌\t152850\n30厘米\t152851\n明月太后\t152852\nbring\t152853\n猪脑猪\t152854\n奥祥\t152855\n13.84%\t152856\n窃贼\t152857\n合流\t152858\n详略\t152859\nfffffgggghhh\t152860\n拼音\t152861\n男人我也有你的命\t152862\nlaaolongpituan\t152863\n仇浏璇\t152864\n费话\t152865\ny40\t152866\ngjgjgjgjgjgjptptp\t152867\n就是我问你好指望\t152868\n删选\t152869\n内分泌\t152870\nchinese\t152871\n杰泽\t152872\n3月后\t152873\n刘姐姐\t152874\n16号\t152875\n异次元空间\t152876\n东码头\t152877\nmeans\t152878\n四十九块\t152879\n心死\t152880\n我的歌\t152881\n65mm\t152882\nhwan\t152883\n瘫痪\t152884\n袁梦茹\t152885\n肖卢\t152886\n好呀好呀好呀好呀我爱你爱你爱你爱你\t152887\n天霸动霸tua\t152888\nBecause\t152889\n南野秀\t152890\n长海\t152891\n一个百\t
152892\n窗前\t152893\n36363606536\t152894\nabccesthe\t152895\n12j十三岁\t152896\n条狗\t152897\n第三十届\t152898\nhyivgyu\t152899\n简化版\t152900\n抵京\t152901\n丽贝\t152902\n十九\t152903\n松说\t152904\n地势\t152905\n生铁\t152906\n苹果代工工厂\t152907\n防暴\t152908\n罪觉\t152909\n鲁阿鲁\t152910\n糙痕\t152911\n地大物博\t152912\n地动\t152913\n2000亿美元\t152914\n父母爱\t152915\n恩好的乖乖\t152916\n神金币\t152917\n年饭\t152918\n费良玉\t152919\n防暑\t152920\n你們\t152921\n地力\t152922\nAriana\t152923\n天津市矿业权交易管理暂行办法\t152924\n一雅安\t152925\n我老\t152926\n我者\t152927\n一母\t152928\n那小主人和你恋爱\t152929\n啦啦啦大酒店会对究竟是饥饿我很好2好好2好2和我化合价2\t152930\n兰福\t152931\n小毛病\t152932\n2点95点\t152933\n一毛\t152934\n一比\t152935\n3996元\t152936\n快乐块\t152937\n安生\t152938\n一毫\t152939\n华鐾\t152940\n司狼\t152941\n占可支配收入\t152942\n每五年\t152943\n办事处\t152944\n一下片\t152945\n我不知道\t152946\n笄话\t152947\n一池塘\t152948\nsBSNSBSBSBSBSBSBSB\t152949\n度秘你好寸\t152950\n安菲尔德\t152951\nskihmsld\t152952\n激战轮\t152953\n陌走情坡\t152954\n病源\t152955\n虚渊玄\t152956\n露韩饰\t152957\n朵多多\t152958\n鱼肥\t152959\n索尼\t152960\n鱼肠\t152961\n徐佐证\t152962\n鱼肚\t152963\n妖人\t152964\n21241264254254\t152965\n要不买\t152966\n呢ooxx\t152967\n输灰\t152968\n鱼肉\t152969\nmh3\t152970\n完善不够\t152971\n30万元\t152972\n派餐\t152973\n渲图\t152974\nFT\t152975\n广安\t152976\n好下场\t152977\n张裕\t152978\n八个人\t152979\n广宁\t152980\ndfhbxgbv\t152981\n一口气\t152982\n可乐可乐\t152983\n心情好多\t152984\n秘剑\t152985\n白老\t152986\n奖唱\t152987\n12345678456756784327567\t152988\n笑们\t152989\n第二回\t152990\n严欣怡\t152991\n耐受\t152992\n蜂王浆\t152993\n夏小超\t152994\n四十个\t152995\n时装周\t152996\n黄增辉\t152997\n告比\t152998\n竹鞭竹\t152999\n考式\t153000\n说好像\t153001\n朗朗\t153002\n4555555\t153003\nyou吧@ityou\t153004\n第一起\t153005\n华航\t153006\nghffhjk\t153007\n男生们\t153008\n刘春伟\t153009\n超级粘人\t153010\n螺丝粉\t153011\n今儿\t153012\n石港\t153013\nSOUTHRIDGE\t153014\namwjd\t153015\n男工\t153016\n迪瑟洛克\t153017\n厚高\t153018\n钟大吕\t153019\n死节者\t153020\n恶性循环\t153021\n地铁站口\t153022\n手速\t153023\n为什么不断\t153024\n德克萨斯扑克#\t153025\n扫描版\t153026\n我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠我靠\t153027\n东校\t153028\n手透\t153029\n6453311643535555353535353555535555\t15303
0\n幸会\t153031\n杜青林\t153032\n买路\t153033\n总指挥\t153034\n勒肚子\t153035\n康定市姑咱小学\t153036\n邪恶漫画\t153037\nisw\t153038\ndrink\t153039\n墨鱼\t153040\nfvxcv\t153041\n弹丸半岛\t153042\n刘梦瑶\t153043\n中石油中石化\t153044\n1ao一\t153045\n潘婷\t153046\n掠影\t153047\n萌哒哒哒哒\t153048\n说笑话\t153049\n一问三不知\t153050\n111111101\t153051\n黑车\t153052\nNeey\t153053\n女闺蜜\t153054\n这感觉\t153055\n东京平安\t153056\n再也不见你了我讨厌你太思米亚\t153057\n费尔奇\t153058\n贾盛强\t153059\n23684459\t153060\n赛鸽\t153061\n嗯伟斌\t153062\n咿呀思密达\t153063\n信命\t153064\n龙王峰\t153065\nisu\t153066\n很丑\t153067\n二四三三零七零六\t153068\n中央电视台梅地亚中心新闻发布厅\t153069\n习大\t153070\n据统计\t153071\n六千个\t153072\n李霞\t153073\n卧柜\t153074\nu饿UK\t153075\n不治罪\t153076\n李猜猜\t153077\n徐宝华\t153078\n很不\t153079\n诗题\t153080\n朱继盛\t153081\nagreat\t153082\n爱你的话\t153083\n蚕蛹\t153084\n抛光\t153085\n不要脸不要脸你不要脸\t153086\n113047785\t153087\n威克多\t153088\n人世间\t153089\n肝气\t153090\n很严\t153091\nerbjdjdjdjdjdjf\t153092\n血缸\t153093\n剃须刀\t153094\n拍马\t153095\n纳什\t153096\n懂不懂懂不懂\t153097\n告白\t153098\n温哲轩\t153099\n苍老呗\t153100\n林就一直百灵\t153101\ntaking\t153102\negargdrdd\t153103\n好的那你快说\t153104\n吃一堑长一智\t153105\n四点前\t153106\n百度理财\t153107\n消极\t153108\n杨任鼻\t153109\nTechtonics\t153110\n岛外\t153111\n童言无忌\t153112\n无所不哪\t153113\n生蚝\t153114\n嗯同乐\t153115\n刀客巴巴\t153116\nJumpout\t153117\n大早合买早\t153118\n微微辣堡小心\t153119\n说了真话\t153120\n二八\t153121\n舰队col\t153122\n开天\t153123\n黄诗洁\t153124\n期洗\t153125\n第一部分\t153126\nllggthy\t153127\n君太\t153128\n哎呀度你男的我白的好的\t153129\n默许\t153130\n你是我你错啦娃\t153131\nRYGGFDFGBH\t153132\n搜狗搜索v\t153133\n开头\t153134\n真心希望\t153135\n有流\t153136\n饥谷\t153137\n开复\t153138\nkr8\t153139\n美偶\t153140\n谷歌脑\t153141\n2780\t153142\n4285987\t153143\n这么么么\t153144\nCkbCB\t153145\n兵兵云溪旅馆\t153146\n滤光片\t153147\nhoney\t153148\n12点20至1点\t153149\n110唔万岁\t153150\n开外\t153151\n爱你东正\t153152\nhttpfhiphotosbaiducomxiaodupicitemf636afc379310a55a6d13587b04543a9832610cfjpg\t153153\n哈楼\t153154\n我喜欢度度\t153155\n十五十五岁\t153156\n菜青虫\t153157\n两句度\t153158\n葛雷奥特曼\t153159\n拿把伞\t153160\n爱奇艺我最爱我最爱了\t153161\n耿小栋\t153162\n8黄\t153163\n累嗎\t153164\n姨奶奶\t153165\n相约\t153166
\n淫娃\t153167\n10点57分\t153168\n串串香\t153169\nkry\t153170\n育权\t153171\n金叉\t153172\nSDIY\t153173\n维果茨基\t153174\n东边儿\t153175\nkrj\t153176\n黄狮寨\t153177\n58元\t153178\n大口气\t153179\n借问\t153180\n回娘家\t153181\n70万\t153182\n文明的冲突\t153183\n心口门\t153184\n1605163185\t153185\n你好猫\t153186\n进场\t153187\n卧龙话\t153188\n18831618286\t153189\n成功之道\t153190\n恶心不咔叽\t153191\n四周年\t153192\n早早\t153193\n白术\t153194\n蹦迪\t153195\n766555900099639\t153196\n陈紫怡\t153197\n流量费\t153198\n黑皮\t153199\n北京荷福影视传媒\t153200\n升旗仪式\t153201\n税源\t153202\n微议\t153203\n拳击手\t153204\n城镇化\t153205\n丞认\t153206\n2011年2月9日晚11点\t153207\n烘烘\t153208\nhjjjgf\t153209\n陈竺\t153210\n老子们\t153211\nhdrmd\t153212\n考试年年\t153213\n显出\t153214\n玖熙熙\t153215\nbdbd\t153216\n弹窗\t153217\n风筝\t153218\n学雷锋不好呀\t153219\n自由女神\t153220\nAppCircle\t153221\n帅自然\t153222\n交际\t153223\n践午\t153224\nhsbsss\t153225\nHEJDKDND\t153226\n汉化beta2\t153227\n家种\t153228\n下中雪\t153229\n声母\t153230\nhemother\t153231\n家私\t153232\n家秀\t153233\nwapmgm\t153234\n家秘\t153235\n金秀敏\t153236\n凌弱欺\t153237\n一千多家\t153238\n田念畾\t153239\n快节业\t153240\n像不像\t153241\n呃泌尿\t153242\ngsdgadfwdfwfwfhfwihdfohohfouqohcwodhvchodvfowkhdvfkhdvchkqdbckqjvkhrkhqdhkdhqhkhdqbchoqcohdabcad\t153243\n父者\t153244\n父老\t153245\n花脸\t153246\n2956784962\t153247\n张小蒙\t153248\n图吧牛吧腐\t153249\n破获\t153250\n心太花\t153251\nOO0\t153252\nduang少天才\t153253\nyuuyyghgh\t153254\n七又\t153255\n席不可以\t153256\nhttpfhiphotosbaiducomxiaodupicitemd1160924ab18972b691329bee1cd7b899e510a9cjpg\t153257\n爱抚\t153258\n七双\t153259\n爱护\t153260\nBro\t153261\n三四百分之三四\t153262\n七号\t153263\n艾利\t153264\nti581\t153265\n安心烦\t153266\n来不欺骗\t153267\n逼斗\t153268\n趣事儿\t153269\n共用\t153270\n毛俊\t153271\n松果\t153272\n1245785utuy\t153273\n七句\t153274\n松林\t153275\n七只\t153276\n萌发\t153277\n我失忆了我失忆了我是谁我是谁我是谁是谁的哪里我在哪里我在哪里\t153278\n待用\t153279\n1kk\t153280\n米琪\t153281\n56886655\t153282\n1kc\t153283\n杜世成\t153284\n这么多天\t153285\n哈哈黄\t153286\n我的爱人\t153287\n黄桥儿\t153288\n陈忠志\t153289\n耳刮子\t153290\nOOC\t153291\n骑马舞\t153292\n邢台\t153293\n来这里吧\t153294\n机械舞\t153295\n八块儿\t153296\n打鱼\t153297\n旧情振秀\t153298
\n词霸\t153299\n呼噜噜噜\t153300\n唉唉堡\t153301\n坑人呢草泥马\t153302\n耿哥\t153303\n22点11分\t153304\n这的很好\t153305\nsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsbsb\t153306\n骄必辱\t153307\n┍怪\t153308\n老K区\t153309\n心寒懒\t153310\n次方\t153311\n高迪\t153312\n闯闯闯\t153313\n塞神奇\t153314\n出行者\t153315\n小消息\t153316\n零二六\t153317\n2015年\t153318\n六班\t153319\n砖头们\t153320\n主角员\t153321\n后颈\t153322\n梗概\t153323\n红河谷谷歌\t153324\n惊喜新经\t153325\n主卡\t153326\nE八勺\t153327\ndazz\t153328\n三晚上\t153329\n穷人\t153330\n海参纲络平台\t153331\n85橙\t153332\n冲锋枪\t153333\n光明楼\t153334\n大粑粑度秘\t153335\n晨报\t153336\n宜宾市宜一中\t153337\n接人\t153338\n书体\t153339\n从而来\t153340\n持节\t153341\n别担心\t153342\n100百分之100百分\t153343\n家数\t153344\ndhhxdhfsggds\t153345\n给我个赞\t153346\n桨状\t153347\n3688\t153348\n吧哥\t153349\n说来了\t153350\n大萌大萌\t153351\n州官\t153352\n教务\t153353\n原来是美男啊\t153354\n真的你不骗我\t153355\n脸萌疯狂猜明星\t153356\n化身\t153357\ncwb\t153358\n家教\t153359\n襦裙\t153360\nluhanexo\t153361\n吃水果\t153362\n办完\t153363\n欣儿\t153364\nbvvcc\t153365\n恨死你了我讨厌\t153366\n假日县\t153367\n伊思雪姨\t153368\n顾一夫\t153369\n和单\t153370\nufdgucc\t153371\n广告主\t153372\n佳能\t153373\n胸部\t153374\n猛然\t153375\n颜扣小胖\t153376\n媸皴长卜\t153377\n赏析\t153378\n84107\t153379\n真TM\t153380\n妳為\t153381\n三跳桥\t153382\n睡觉类\t153383\n月月姐\t153384\n宇宙电视车\t153385\n黄蛋\t153386\n照顧\t153387\n供源\t153388\n幽默都米\t153389\n照顾\t153390\n奇隆\t153391\n说地\t153392\n单相思\t153393\n柏云\t153394\n错了是握手好朋友你是我是主人你是狗\t153395\n张国顺\t153396\n淡斑\t153397\n回天\t153398\n累了叫\t153399\n有人梦\t153400\n许安\t153401\n没缘\t153402\n回头\t153403\n总和\t153404\n贪官污吏\t153405\nHcjfx\t153406\n天嘉五\t153407\n游度秘\t153408\n886887公司\t153409\n交税\t153410\n1920年\t153411\n张乌药\t153412\n好找到\t153413\n耨女\t153414\n回复\t153415\n我不爱你我讨厌你讨厌讨厌讨厌你\t153416\n李忠良\t153417\n李纹\t153418\n我心情不我不美丽陪我说话\t153419\n苏轼\t153420\n李红\t153421\n第5名\t153422\n最终极\t153423\n打字\t153424\n打孔\t153425\n诗游子\t153426\nhgghsiau\t153427\n暗暗\t153428\n全秘\t153429\n传感\t153430\n沥干\t153431\n全科\t153432\n┳━━┛n　┏┫　
┣┓n／／／＼＼＼n｜｜nd\t153433\n语音行\t153434\n实说\t153435\n算数吧\t153436\nROAD\t153437\n住所以\t153438\n全秀\t153439\n说明白\t153440\n公交线路\t153441\n华龙\t153442\n长度\t153443\n算数吗\t153444\n大我相信\t153445\n庆好小鸟\t153446\n全称\t153447\nDaesung\t153448\n999999999999999999999999999999999999999999999999999999999996699999999999999996999999996696699999\t153449\n出尔后尔\t153450\n炬惠中\t153451\nlllolutwo\t153452\nokasok\t153453\n拎包\t153454\nnkg\t153455\n恩怨\t153456\n内部人士\t153457\n5月24日\t153458\n重大\t153459\nloalaitjl\t153460\n沦丧\t153461\n待命\t153462\n沦为\t153463\n鸟市\t153464\n重头\t153465\n我真的好想那你\t153466\n恩总\t153467\n举杯\t153468\n重复\t153469\n蛋挞皮\t153470\n宝马e6889\t153471\n叫葬爱\t153472\n滑雪板\t153473\n咩┭┮\t153474\n忘我记得\t153475\n比如说秘\t153476\n可依\t153477\n足利\t153478\n董家成\t153479\n女那\t153480\n老于老莱\t153481\n，，，\t153482\n558\t153483\n555\t153484\n微量词\t153485\n礼来我家娃\t153486\n556\t153487\n550\t153488\n553\t153489\n泽\t153490\n波士顿\t153491\n张姝含\t153492\nBruno\t153493\n不可取\t153494\n55%\t153495\nretrodance\t153496\n高歌\t153497\nstmyou\t153498\n五百分\t153499\n基本工资\t153500\n贺喜\t153501\n爬坡\t153502\n深造\t153503\n歌友会\t153504\n2斤\t153505\n百毫升\t153506\n照华\t153507\n紫米\t153508\n死亡命\t153509\n太诚实\t153510\n我的父母\t153511\n给我决\t153512\nbeach\t153513\niihhqqu\t153514\nFgif\t153515\n拍排\t153516\n数字电视\t153517\n中草药\t153518\nffddggf\t153519\n气儿\t153520\nfgctt\t153521\n相能\t153522\n179件\t153523\n四百多兆\t153524\n擦擦\t153525\n深深\t153526\n2008年\t153527\n2011.01.08\t153528\n八一队\t153529\n目击\t153530\nFjhhnguvgdhdpfjhfhgtyfjfgfdfsgivddfuskfjfkfuffrjgfjsjshrhfgdgsjdjfgheodbochdufjy\t153531\n了不哭我哭我狠狠\t153532\n共动不动更更更更更更更更更\t153533\n正数\t153534\n3月12日\t153535\n丫井安\t153536\n1584808208\t153537\n固溪\t153538\n谁的话\t153539\nfx48\t153540\n放映厅\t153541\n来过\t153542\n43.49元\t153543\n呢样\t153544\n我媽說讓我睡覺\t153545\n恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我了恨死我\t153546\n伃子\t153547\n歡太妍\t153548\n杨瑞琪\t153549\n铀\t153550\n充张\t153551\n实证\t153552\n磷脂酰\t153553\n二二七二三零七八零\t153554\n郑邹普\t153555\n运动照\t15355
6\n托业\t153557\n红小豆\t153558\nqll\t153559\nffffdddeeeefgg\t153560\n爱就爱\t153561\n咳嗽问你\t153562\n撞车\t153563\n操死\t153564\npjf\t153565\nkabcd\t153566\nQQ个性签名\t153567\n礼服\t153568\n李清照\t153569\nRff\t153570\n我恨你你你，我恨你我恨你我恨你我恨你我恨你我恨你我恨你我恨你我恨你我恨你\t153571\n咏梅\t153572\n生活大爆炸\t153573\npjp\t153574\n最好了\t153575\n血型\t153576\nffdfhhhh\t153577\n零二十三天\t153578\n黄星星\t153579\n8样\t153580\n05千克\t153581\n110个\t153582\n哎四\t153583\n张自己到底\t153584\n下周三\t153585\n收容所\t153586\n切去\t153587\n乖摸摸\t153588\n谢绝\t153589\n下周一\t153590\n穗威\t153591\n画像\t153592\n酷派y1\t153593\n花松鼠\t153594\n什喵\t153595\n李叫\t153596\n粉红色\t153597\n站起来\t153598\n比方有\t153599\nDPOXoI\t153600\n亲爸\t153601\n逆袭痞子蛋\t153602\n施肥\t153603\nxfggvvf\t153604\n八千克\t153605\n双面新娘\t153606\n石磊\t153607\n富丽城\t153608\n我是灰灰姑娘公主\t153609\n电泳\t153610\n中国造富大学\t153611\n梅子青\t153612\n李双\t153613\n那灿烈\t153614\n太晕了你是机器人而我是人\t153615\n李叔\t153616\n貂蝉\t153617\n李发\t153618\n赤眼\t153619\n电波\t153620\n乡巴佬\t153621\n一千瓦\t153622\n大鸟\t153623\n度秘你好日记\t153624\n数千元\t153625\n玩火\t153626\n26健\t153627\nheatial\t153628\n俩我俩\t153629\nvchfufjcjcjdruchfhshxhxbcbchfuf\t153630\n汉堡们\t153631\na365meterlong3Dstreetpainting\t153632\n此致\t153633\n就班\t153634\n爱问\t153635\nanal\t153636\n180公斤\t153637\n飞禽\t153638\n动滑轮\t153639\n七五级\t153640\n七五约\t153641\n習慣\t153642\n唠嗑儿\t153643\n赃款\t153644\njkdfakjd\t153645\n华社\t153646\n通天大道\t153647\n神龙公园\t153648\n蓝翔技校倒韩派联盟\t153649\n双侧\t153650\n大鸡\t153651\n05分\t153652\n叫花\t153653\n孟煌\t153654\ncuyed\t153655\n多咪你在干嘛喃\t153656\n苏锐泓\t153657\n少女们\t153658\n香花\t153659\n香型\t153660\n关一\t153661\n转头\t153662\n0887415157\t153663\n下部\t153664\n兄弟版\t153665\n关上\t153666\n孑叫\t153667\n吧吧主\t153668\n空禁\t153669\n乖好\t153670\n香芋\t153671\n你好乖来么\t153672\n7986\t153673\n黑妹妹妹\t153674\n关中\t153675\n歌词儿\t153676\n清华\t153677\n合家福\t153678\n1f兄\t153679\n5-10\t153680\n度你咪\t153681\n卡碧尼\t153682\n郭总\t153683\n势利眼\t153684\n僧职\t153685\nvcff\t153686\n136分\t153687\n第二套\t153688\n深不可测\t153689\n勤学篇\t153690\n1册\t153691\n韩麦尔\t153692\n失不再来\t153693\n企業大亨\t153694\n龚德庆\t153695\nTUT\t153696\nkeljniava\t153697\n尿射\t153698\n快盗\t153
699\nTUC\t153700\n丐老怼\t153701\n男单\t153702\n东一拜\t153703\n再审\t153704\n他是我爱的人我不爱南城我爱鹏程\t153705\n告退\t153706\n79176898个\t153707\n子行\t153708\n１分钟\t153709\n美衣\t153710\n热剧\t153711\n以为你可以\t153712\n本性\t153713\n超级超级大坏蛋\t153714\n不和对\t153715\n绝育\t153716\n顾问\t153717\n动感超人\t153718\n美行\t153719\nChristian\t153720\n良莠参差\t153721\n四堂课\t153722\njhshxigrhddjvdbdvsvzbvxvvjfyufkfugjfjfjfghcjsjskbxgjzhsjksgudhwjusjkdgjjhhdhdjddrdfe4435w3354324e53\t153723\nudhfijf\t153724\n玻璃纤维\t153725\nfuvgcic\t153726\n抱抱乖乖\t153727\n地胶\t153728\nIsay\t153729\n李5\t153730\nLeague\t153731\n怒提1双11\t153732\n李1\t153733\n你真你\t153734\n夜场\t153735\n笑点亮\t153736\n贱痞\t153737\n曼哈顿牛\t153738\nhttpfhiphotosbaiducomxiaodupicitem0bd162d9f2d3572c2c2a4aae8d13632763d0c3fcjpg\t153739\n沫若兮沫若兮\t153740\nrestishistory\t153741\n林鑫源\t153742\n加广\t153743\n徐晓婕\t153744\n价高\t153745\n倾先\t153746\nnsijd\t153747\n飞手能手女\t153748\n便携化\t153749\n李i\t153750\n绿王孙\t153751\n﹗\t153752\n再就去\t153753\n不和你\t153754\n加幕\t153755\n一个一米\t153756\n度余生\t153757\n金字塔\t153758\n徐智星\t153759\n糖醋鱼\t153760\n经济发展\t153761\n一点五岁\t153762\nGPRS\t153763\n怨天尤人\t153764\n勿施于人\t153765\n演进\t153766\n七八十块\t153767\nxuxsu\t153768\n喜剧明星\t153769\n男大王\t153770\n548658\t153771\n林美霞\t153772\n用电器\t153773\nnforchestofiilastio\t153774\nKATE\t153775\nAnna\t153776\n沉香木\t153777\n犟撒\t153778\n金致列\t153779\n50000个\t153780\n一定可能\t153781\n排派出\t153782\njjjjjjjiiio\t153783\n东丰县\t153784\ndeepest\t153785\n只言片语\t153786\n朱鸿羽\t153787\n直进\t153788\n剪贴报\t153789\n禄语\t153790\n1929年\t153791\n脑tfboys\t153792\n0＄\t153793\n思密达\t153794\n过大\t153795\n有能力\t153796\n度祢\t153797\n玛雅翰\t153798\n收下\t153799\n偏执狂\t153800\n兆丰桥\t153801\n俱乐部\t153802\n无珠\t153803\n普艾斯\t153804\ngenge\t153805\napian\t153806\n13638540398\t153807\n违心\t153808\n纯美\t153809\n等醒了\t153810\n马丽娜\t153811\nOK！精彩\t153812\n二小时\t153813\n还在样\t153814\n拉齐奥\t153815\n小儿童\t153816\n捡拾\t153817\n度祕\t153818\n双城\t153819\n我不要你的喜欢我要你的爱\t153820\n文化史\t153821\n度神\t153822\n阴凉\t153823\negeggdg\t153824\nhistori璃雪涵approcatisleyou\t153825\n困身\t153826\n翻脸\t153827\n大汉天子之王灵\t153828\n荫道\t153829\nw7
\t153830\n无文才风流\t153831\n兰花花\t153832\n剑道\t153833\n梁和\t153834\n82岁\t153835\n﹑\t153836\ndoyouknow\t153837\n老师傅\t153838\n我喜欢裤\t153839\n74年\t153840\n男滴\t153841\n国界线\t153842\n明天万达\t153843\n去着\t153844\n曹百齐\t153845\n禁忌恋\t153846\n滴滴滴滴滴滴滴滴滴滴滴滴\t153847\nwf\t153848\n兰州大学\t153849\n网上\t153850\n贝石\t153851\nwb\t153852\nwa\t153853\n草场\t153854\nwo\t153855\nwn\t153856\nwm\t153857\nwl\t153858\nwk\t153859\nwj\t153860\nwi\t153861\nwh\t153862\nww\t153863\nwv\t153864\nwu\t153865\n网业\t153866\nws\t153867\nwr\t153868\nwq\t153869\nwp\t153870\n15.13\t153871\nwz\t153872\nwy\t153873\nwx\t153874\nfoer\t153875\nwD\t153876\n小商品\t153877\n十二十二十二日\t153878\n景致\t153879\n無語\t153880\n高年级\t153881\nwK\t153882\n本世纪前\t153883\n死斗\t153884\nwV\t153885\n盛超\t153886\nbrad\t153887\n韭菜饼\t153888\n瞎闹\t153889\n川岛文夫\t153890\n2006年底\t153891\n意思4s\t153892\n窝案\t153893\n盗梦空间\t153894\n披风\t153895\n盛来运\t153896\n北郊站\t153897\n8177616\t153898\n整套价\t153899\n怕闹\t153900\n足协\t153901\n金大侠\t153902\n搓衣板\t153903\n火山\t153904\n吧英歌丽诗\t153905\n我了我恨你\t153906\n老行当\t153907\n老太太\t153908\n﹌\t153909\n女地下党\t153910\n素缎\t153911\n吴克群\t153912\n一儿人\t153913\n驾考\t153914\n篱笆我灌醉嘞喽嘞q先森\t153915\nRAABH\t153916\n天歌\t153917\n养活\t153918\n13568790\t153919\n五只个\t153920\n偏重\t153921\n往年\t153922\n7426971250852\t153923\n頑皮\t153924\n粗砺\t153925\n汤素兰\t153926\n天武\t153927\n萩蓶荲\t153928\n贸然\t153929\n长工\t153930\nais\t153931\nair\t153932\nStghhddnugfbj\t153933\n中国指数研究院\t153934\nwuli韬韬\t153935\n511161\t153936\n大轿\t153937\nIPHONE5\t153938\n晓爱\t153939\n穷光蛋\t153940\n专心一致\t153941\n乃至\t153942\n撒尿\t153943\n大轰\t153944\nhuyggg\t153945\n古古米古\t153946\n大车\t153947\n情许我一鼓\t153948\n新里\t153949\n36倍\t153950\n宋佳茜\t153951\n我爱卡我爱卡\t153952\n茶资\t153953\n红红火火\t153954\n2月14日晚\t153955\n1818\t153956\nsokok\t153957\n钟方津\t153958\nvia天空城_\t153959\n断点\t153960\n云宝黛西\t153961\n海德曼\t153962\n思意\t153963\n4tas\t153964\n安巴尼\t153965\nhtcvive\t153966\n这周三\t153967\n真心知吾知满天都只九金金多贞\t153968\n小天使幼儿园\t153969\n啊吉隆斯\t153970\n这周一\t153971\n叫作业\t153972\n1QQ\t153973\n呆苹果\t153974\n小廿\t153975\n叶小伟\t153976\n借奸\t153977\nGivecv\t15397
8\n泥湫\t153979\n优惠v\t153980\n在户外\t153981\n柏瑶\t153982\n冰馨\t153983\n轻风\t153984\n唐国飞\t153985\n江苏省\t153986\n通假\t153987\nphoto\t153988\nkall\t153989\n嗯唧不度\t153990\n妒花\t153991\n1.3%\t153992\n美人鱼\t153993\n喜羊羊之妈妈乐疯狂喜羊羊与灰太狼\t153994\n周亚迁\t153995\n2764363328\t153996\n冰冰冰\t153997\n振作\t153998\n死结巴\t153999\n29479元\t154000\n6:00\t154001\n我不我不禁\t154002\n双鱼女\t154003\nDrdghj\t154004\n杨启旺\t154005\n戏梦巴黎\t154006\n小狮子\t154007\n糖霜\t154008\n我的我想你大都不\t154009\n朋友的话\t154010\n有点差\t154011\n3D玉蒲团法定年龄\t154012\n几家\t154013\nrpg7\t154014\n分限\t154015\n贵气\t154016\nakkk7k\t154017\n铺上\t154018\n熊出没之雪岭熊风\t154019\n水质\t154020\n1o一\t154021\n淳迷们\t154022\n走开我不想\t154023\n陆风\t154024\n第现场\t154025\n申周刊\t154026\n承受能力\t154027\n现实生\t154028\n去的举手\t154029\n众泰z7\t154030\n13276369389\t154031\n谢实\t154032\n进度秘\t154033\n留毛毛血\t154034\n13276369387\t154035\n心疼痛\t154036\n简体\t154037\n张信息\t154038\nJAHD\t154039\n大拇哥\t154040\n1976\t154041\n灶边\t154042\n儿女们\t154043\n头晕晕\t154044\n度秘你的声音好萌萌哒\t154045\n成事\t154046\n早安晚安午安早安\t154047\n林分\t154048\n卞炫烨\t154049\n薛师傅\t154050\n女生女\t154051\n建造者\t154052\n机器机器人\t154053\n数之不尽\t154054\n常小兵\t154055\n百里风的歌唱\t154056\n丰醇\t154057\n小气人\t154058\n吾恩\t154059\n洗劫一空\t154060\n撑腰\t154061\n1000000000000\t154062\n/\t154063\n不我想\t154064\n潘文逸\t154065\n当乐\t154066\ndhiphotosbaiducomxiaodupicitem2934349b033b5bb5524ba18c31d3d539b600bc20jpg\t154067\n王飞\t154068\nob2\t154069\n速答\t154070\nSohana\t154071\n静音\t154072\n事宿\t154073\n桌游\t154074\nswrek\t154075\n区县\t154076\n白搭\t154077\n牛肉牛肉汤\t154078\n2.5小时\t154079\n度钢度\t154080\n十五年\t154081\n将广\t154082\n古惑\t154083\n铺睬\t154084\n事实\t154085\n事宜\t154086\nobf\t154087\nddvffvfc\t154088\nobd\t154089\n新邵\t154090\n丑真\t154091\n舒效果\t154092\n夫问妻\t154093\nobv\t154094\nobt\t154095\n度秘山伯\t154096\n墓碑石\t154097\n生死纠缠\t154098\nㄅ\t154099\nㄆ\t154100\nㄇ\t154101\n男性恋\t154102\n林舒慧\t154103\nTMsb\t154104\n前所未有\t154105\nㄌ\t154106\nㄍ\t154107\nㄎ\t154108\nㄏ\t154109\nㄈ\t154110\nㄉ\t154111\n拉雪\t154112\nㄋ\t154113\n朱哥靓\t154114\n过去四年\t154115\nㄖ\t154116\nㄐ\t154117\n磕碜\t154118\nㄓ\t154119\n鸟屎\t154120\n当事者\t154121\nㄟ\t15412
2\nㄙ\t154123\nㄚ\t154124\nㄤ\t154125\nㄥ\t154126\nㄦ\t154127\nㄠ\t154128\n宣恩县\t154129\n马自达\t154130\nㄣ\t154131\n一记无得矣再看112张之一年二影音\t154132\n黑包\t154133\n低吟\t154134\n爸名\t154135\n8点12\t154136\n8点15\t154137\n黑化\t154138\n酱猪蹄\t154139\n唐突\t154140\n鸡儿\t154141\n姚特\t154142\n邱仕隆\t154143\n焚烧\t154144\n啦啦啦啦啦啦啦啦我的宝贝你给我说\t154145\n钟丽缇\t154146\n伟伟\t154147\n5333225528805512\t154148\n等於\t154149\n万雪艳\t154150\n摩基尼\t154151\n蒋群蓉\t154152\nHughfuruh\t154153\n于悦仙\t154154\n拜拜拜拜拜\t154155\n非独贤者\t154156\ncoser\t154157\n周玉洁\t154158\n23456778889654336666\t154159\n等文\t154160\n倪兵\t154161\n火影劫\t154162\nccycubovtobuij8\t154163\n海南出版社\t154164\n古文某\t154165\n伟伦\t154166\n扮阳春白雪\t154167\n协同\t154168\n你是我的小跟班的巨人\t154169\n早上4点\t154170\njjilll\t154171\n王静梅\t154172\n半径儿\t154173\n女帝\t154174\n超量\t154175\n超重\t154176\n无房\t154177\n腊八粥\t154178\n上海大众汽车\t154179\n三个轮\t154180\n没我你也有\t154181\n麻省理工\t154182\n哈里呀哈里呀哈里路亚比如1a\t154183\n肉鸡\t154184\n秋兵\t154185\n部队\t154186\n秉承\t154187\n０８\t154188\n狠开心\t154189\n嘞嘞\t154190\n嗯后天\t154191\n００\t154192\n展站\t154193\n赠来电显示\t154194\n伊泽派利斯0lo\t154195\nwelcome\t154196\n广东江门市委\t154197\n新宿\t154198\n新宾\t154199\n哀鸣\t154200\n新宠\t154201\n新客\t154202\n戴鑫龙\t154203\n回唱歌\t154204\n反映\t154205\n小鹿小鹿\t154206\n杨相相\t154207\n沿海\t154208\n弄湿\t154209\nsdrfyhgsghjsg8dgffyyuityyffyiy\t154210\n抽出\t154211\n新宝\t154212\nSgt\t154213\n新宁\t154214\nSgj\t154215\n安小姐\t154216\n新宇\t154217\n顶飞\t154218\n男金女木--金木夫妻不多年\t154219\nn们\t154220\nFinePix\t154221\n修改器\t154222\n咖喱牛肉饭\t154223\n洋洋洋洋\t154224\n5月份\t154225\n1945年6月26日\t154226\n事业单位\t154227\n为何个\t154228\n拉巴\t154229\ntatayouy\t154230\ncghzf\t154231\nl5万起\t154232\n黄仲坤\t154233\n国本\t154234\n采廾\t154235\n196年\t154236\n超级气人\t154237\n黑丫\t154238\n新东天\t154239\n影射\t154240\n水量\t154241\nfmcmjffjedF\t154242\nFhgauvdvhdhhddhhrhdhvajwhbdbanvsjshvcjbrxnsaidhdhbcdhthedbdCNNUzbekskiCNNendkrnkdns3mechanism\t154243\n黑丝\t154244\n宜宾市巡司镇自来水公司\t154245\n唐箐\t154246\n依法死刑派\t154247\n国服\t154248\n上年级\t154249\n半途而废\t154250\n国有\t154251\n数机\t154252\n真的再见真的再见\t154253\nxxohc\t154254\n陈子路\t154255\n度寒露寒露寒露\t154256\n上年纪\t1542
57\n臂弯\t154258\n南苏米酒\t154259\n争先恐后\t154260\n独女\t154261\n续断\t154262\nxccc\t154263\n透亮\t154264\np型\t154265\n四十二分钟\t154266\n汪汪谢\t154267\n五能\t154268\nintentIntentSK1171477665AFADF2A74CA956133546A609B1E50219873FDFF20385CD7340E95E39E1FB2418end\t154269\n黎族\t154270\n纯女\t154271\n我好讨厌你我好讨厌讨厌讨厌你讨厌你\t154272\n我是女的看男的干嘛\t154273\n黑龙山\t154274\n你好逗呀\t154275\n霍思燕\t154276\npTR\t154277\nraaaaaa\t154278\n新花样\t154279\n褥子\t154280\n廖悦凌\t154281\n求求你了真的不想\t154282\n第二把\t154283\nVvaghe\t154284\n庙子\t154285\n富反\t154286\n女弹\t154287\n女强\t154288\njmpj\t154289\n片儿\t154290\n哦卢\t154291\n独奏\t154292\n无机物\t154293\n跳源\t154294\n短信铃\t154295\n叫好\t154296\n发狂\t154297\n义工\t154298\n臭不要脸我有没说你\t154299\n欧尼酱叫我偶\t154300\n吴承佳\t154301\n复合板\t154302\n踹子\t154303\n好youhda\t154304\n小扬扬\t154305\n没的话\t154306\n国骂\t154307\n牧羊女\t154308\n悄寡\t154309\n县衙\t154310\n不好不好\t154311\n棍儿\t154312\n田草\t154313\n乐豪\t154314\n伊泰克\t154315\n比特星\t154316\nappeal\t154317\n造型师们\t154318\n羊肉汤\t154319\n么儿\t154320\n刑警\t154321\n陈意\t154322\n挣得\t154323\n嗯提\t154324\n下等\t154325\n爱着你的心\t154326\nc200\t154327\n1200余名\t154328\nUHF\t154329\n弘吐\t154330\n非扰\t154331\n内阁\t154332\nKtmtd\t154333\n找回盖\t154334\n大礼包\t154335\n来来自\t154336\n复仇者联盟版\t154337\n别介样\t154338\n六七级\t154339\n马来西亚沙巴州政府\t154340\n毫不留情\t154341\n八三一百七十多斤\t154342\n李洲妍\t154343\n时间表\t154344\n詹承勋\t154345\n礼仪之邦\t154346\n创戏\t154347\n743\t154348\n理雷霆炎魔号\t154349\n娇娘乖孩\t154350\n大马哈\t154351\n高瞻远\t154352\n九十猪猪侠\t154353\n北戴河\t154354\n19110\t154355\n武汉欢乐谷\t154356\n迷你裙\t154357\n几个一个\t154358\n747\t154359\n嗨起来\t154360\n搞不搞笑\t154361\n我己\t154362\n只小游\t154363\n腰\t154364\n夏大陆\t154365\n腳\t154366\n同感\t154367\n腾\t154368\n腿\t154369\n第五部\t154370\n腹\t154371\n好我告诉你\t154372\n腻\t154373\n韦子杰\t154374\n腥\t154375\n爱你到永远我地都米\t154376\n碗湾\t154377\n方面\t154378\n秘密妙妙\t154379\n百分之百一点\t154380\n腮\t154381\n婚姻登记处\t154382\n腔\t154383\n腕\t154384\n鬼吹灯之寻龙诀\t154385\n腐\t154386\n做做\t154387\n甩棍\t154388\nGuiltyurethritis\t154389\n腚\t154390\n酒心\t154391\n腄\t154392\n腆\t154393\n腌\t154394\n笨度\t154395\n保标\t154396\n腈\t154397\n腊\t154398\n王玉莹\t154399\nareal\t154400\n石某\
t154401\n一大盆\t154402\n天天十八燃烧\t154403\n25天\t154404\n王玉莲\t154405\n超前咬合\t154406\n熊小帅\t154407\n记事本\t154408\n一大盒\t154409\n苏红敏\t154410\n溜水\t154411\nnotane\t154412\n度秘我的好宝贝\t154413\n21日晚\t154414\n媚俗\t154415\n我是你姐你哥你岩\t154416\n款待\t154417\nehcbrfjdidjdnx\t154418\n梁启超\t154419\n嘘嘘嘘\t154420\n有比\t154421\n朩姑婆岁\t154422\nvrrip\t154423\n朋友\t154424\n有毛\t154425\n不做声\t154426\n荆门\t154427\n本溪\t154428\nFM9243\t154429\n撒丽翠\t154430\n你好淘气\t154431\n十月一日\t154432\n情不明\t154433\n营办\t154434\n樣子\t154435\n艺考生\t154436\n迪拜塔顶楼\t154437\n胡刚刚\t154438\n马启航\t154439\n7826\t154440\n唁电\t154441\n大乖乖\t154442\n五六台\t154443\n天人两隔\t154444\n8月3日\t154445\npport\t154446\n18000只\t154447\n钢条\t154448\n一百棵\t154449\nccggcaa\t154450\n哪特么\t154451\n沙拉酱\t154452\n巴布亚新几内亚\t154453\n骄子\t154454\n黑眼家\t154455\n杜默雨\t154456\n邹桐旭\t154457\nzaaw\t154458\n烧退\t154459\n有可是\t154460\n第几段\t154461\n群落\t154462\n相依偎\t154463\njbggb\t154464\n意儿\t154465\n营业员\t154466\n徐PLCUPSIISITgo\t154467\n三黄蜂\t154468\nPosner\t154469\n王张奔\t154470\n真心我的爱\t154471\nS27w34\t154472\n想来不来\t154473\n幸福梦想\t154474\n止痒\t154475\n联邦最高法院\t154476\n﹡\t154477\ntja\t154478\ntjb\t154479\n今个\t154480\n五一百岁\t154481\ntji\t154482\n6662016666\t154483\ntjm\t154484\n1955\t154485\ntjp\t154486\n13824452341\t154487\n1952\t154488\n郭丽宁\t154489\n1958\t154490\n李大哥\t154491\n孙宇康\t154492\nuuhhggg\t154493\nrEng\t154494\n今七\t154495\n好吃干活呵呵呵\t154496\n八百四十八百四十一\t154497\n福盖日\t154498\n诺基基会\t154499\n笔划\t154500\n快日\t154501\n005055428384566333654525855257769878705554222123422223544858877766669998\t154502\n今世\t154503\ngdvxvxjyffgh\t154504\njjbjjhk\t154505\n米拉\t154506\n习道\t154507\n卡帕\t154508\njsjnck\t154509\n神话故事\t154510\n满枝\t154511\n天仙子\t154512\n阿房宫\t154513\n基基基\t154514\ngovermentis\t154515\n旧会\t154516\n长歌行初三哎呦凉州词\t154517\n走墨脱\t154518\n15145280976一\t154519\n别查\t154520\n时隔\t154521\n12圈\t154522\n结构简式\t154523\n时隙\t154524\n再见我要灰太狼\t154525\n赵国振\t154526\n我得\t154527\n鸡肉\t154528\n一个一位\t154529\n吕征\t154530\n黄思思\t154531\n配信\t154532\n不攻自破\t154533\n吴廷贵\t154534\n前端\t154535\ndhcjug\t154536\n千钧一发\t154537\n疯狂888\t154538\n零三零\t154
539\n张涛\t154540\n嘛子爱\t154541\n我自问系英德千七\t154542\n万端\t154543\n袁洪\t154544\nfctddtc\t154545\n晚睡症\t154546\n植物大战僵尸二火山世界\t154547\n正麻烦\t154548\n孟广\t154549\n美分\t154550\njjbbggmm\t154551\nhttpehiphotosbaiducomxiaodupicitem267f9e2f07082838eb3cf761bf99a9014c08f196jpg\t154552\n明明白白\t154553\n万能惟俭\t154554\nε｀\t154555\n庐阳\t154556\ncctv6\t154557\n不醒\t154558\n美城\t154559\n跑惨\t154560\n便饭\t154561\n甜甜的歌\t154562\n恋家\t154563\n肥皂剧\t154564\n不醉\t154565\n双十节\t154566\nggffffgbh\t154567\nifhawaii\t154568\n孤独症\t154569\n后母\t154570\n站里\t154571\n睁更事\t154572\n沙梨\t154573\n花明\t154574\n泉山小区\t154575\n美刀\t154576\n高开\t154577\n侯依婷\t154578\n花麻花\t154579\n上海市\t154580\n恼搬\t154581\n嵌入式\t154582\n讨厌讨厌讨厌\t154583\n陆剑波\t154584\n纠缠\t154585\nmadao\t154586\n密位\t154587\nmessage\t154588\n节操\t154589\n年关\t154590\n和弦\t154591\n超杰999\t154592\n啦啦拉拉\t154593\n核心竞争力\t154594\n金子金子小游想的美\t154595\nGDdtjhd\t154596\nDhfrjfh\t154597\n魏国\t154598\ndin癫当\t154599\n29元\t154600\n⊕心\t154601\n卡姆古德猫宁\t154602\n候命\t154603\n十二分\t154604\n冷不丁变冷\t154605\n两天针\t154606\n高强\t154607\n周腾飞\t154608\n接驳\t154609\n两百遍\t154610\n过差\t154611\n年兽\t154612\n摸摸摸摸摸摸摸摸摸摸摸摸扎\t154613\nfhfy\t154614\n洗发露\t154615\n回我的话\t154616\n接驾\t154617\n老姿\t154618\n投诉单哈\t154619\n喂喂喂喂喂喂你说什么呀你说什么\t154620\nfhfj\t154621\n老姨\t154622\nfhff\t154623\n雪拉瑞\t154624\n哪尼\t154625\n老姑\t154626\n老姐\t154627\n眼妆\t154628\n小度秘我好爱你我们结婚吧\t154629\nkylietestudo\t154630\n11月底\t154631\n周奶娃\t154632\n塞内克斯生物科技公司\t154633\n老姚\t154634\n洛神花\t154635\n聊了瞧不起\t154636\n六六你好啊为我唱一首歌信\t154637\n日日礼物\t154638\n标普\t154639\nrsukuw\t154640\n贾似\t154641\n玉种\t154642\n4ux\t154643\n小英\t154644\n黑心人\t154645\n扣脚\t154646\n秘贱\t154647\n1570013027\t154648\n小苹\t154649\n小若\t154650\n不要脸的头母猪不要脸的头母猪\t154651\n买一畅玩\t154652\n大黑牛\t154653\n44444444444666\t154654\n高延龙\t154655\n三田井\t154656\n炸油条\t154657\n虾艺\t154658\n小苗\t154659\n治宴请\t154660\n几番\t154661\n36度\t154662\nInditex集团\t154663\ndeud\t154664\nlt囧\t154665\n大森森\t154666\n内参\t154667\n勃起北理工\t154668\n36.3\t154669\n晋鹏飞\t154670\n大包子\t154671\n小苍\t154672\n小苏\t154673\n吐纳\t154674\n秘失调\t154675\n253908\t154676\n勤劳\t154677
\n无知无觉\t154678\n抓偷\t154679\n西溪湿地\t154680\n温习\t154681\n1万多\t154682\n快点整\t154683\n积极\t154684\nす\t154685\n钻裙\t154686\n快点数\t154687\n欺我\t154688\n奎姨夫\t154689\n死寡妇\t154690\n度秘昂\t154691\n明珠堡\t154692\n橘庆太\t154693\nforever\t154694\n范赞\t154695\n五菱宏光\t154696\n勤力\t154697\n艾莎莉\t154698\n挤汁器\t154699\njuvycj\t154700\n新华下路305号\t154701\n欧巴\t154702\n十副\t154703\n一什么而就\t154704\n苏嘉怡\t154705\nuvuff\t154706\n偶成\t154707\n蒸饭\t154708\n表述\t154709\n起亚\t154710\n笑掉\t154711\n啦啦啦啦啦4G\t154712\n狮心\t154713\n起于\t154714\n丽丽丽\t154715\n一个任字压\t154716\n镜子\t154717\nqfdvzgzg\t154718\n532835134864282355681655595595654656556655626863553565556655564623333262584651222525\t154719\nお\t154720\n大干世界\t154721\n复牌\t154722\n伤筋动骨\t154723\nHELLO\t154724\n查不着\t154725\n卡伦扎\t154726\n希格斯\t154727\n黄河水\t154728\n八月底\t154729\n黄云波\t154730\n晓不晓得\t154731\nHfifeh\t154732\nwereched\t154733\n晨晨晨\t154734\n吴浩兵\t154735\n23枚\t154736\n地铺\t154737\n吴山\t154738\n生媚\t154739\n奴隶们\t154740\n看护\t154741\n看报\t154742\n20日\t154743\nxrtth\t154744\n地铁\t154745\n道儿\t154746\n7.1亿\t154747\n屁屁股\t154748\n一笔\t154749\nう\t154750\n彼得兔\t154751\n难术\t154752\n言规正传\t154753\nぅ\t154754\n庄吉安\t154755\n3G智能总动员\t154756\n片硬\t154757\n我告诉你我爱你\t154758\ntcufcgjvkh\t154759\n二四岁\t154760\n王镇\t154761\n有益于\t154762\n映凯\t154763\n记事\t154764\n局部\t154765\nfijfgb\t154766\n金典型\t154767\n映出\t154768\n瑞也\t154769\n低龄\t154770\n宗师\t154771\n把熊出没之夺宝熊兵\t154772\n八十一百\t154773\n挡剑\t154774\n甘笑\t154775\n猜别酱紫\t154776\n吃冒\t154777\nududududjd\t154778\n几辈子\t154779\n过户费\t154780\n放检\t154781\n碱性\t154782\ntfhgghj\t154783\n英标\t154784\n扩死\t154785\nsjqoostm3conllz\t154786\n幻雨\t154787\n掌声\t154788\n观音\t154789\n无法折断\t154790\n射心無\t154791\n热情店\t154792\n听力\t154793\n求你了救救我吧我\t154794\n敢我不睡\t154795\n前列\t154796\nAB粉\t154797\n泡腾片\t154798\n单梦雪\t154799\n困蒙\t154800\n同源水乡\t154801\n白城市\t154802\n两点钟\t154803\n闲云迷雾\t154804\n西洛洛\t154805\n久长\t154806\n奥海城\t154807\n一千多公里\t154808\n电饭煲\t154809\nJeremy\t154810\n杠浅\t154811\n你好调\t154812\n崔老板\t154813\n撞倒\t154814\n贻误有千口你有我也有\t154815\n蔡依林\t154816\n哪有你有人\t154817\n开窍\t154818\n零五零八\t154819\n杨宠物\t154820\
n2pm\t154821\n拿不住\t154822\n青年人\t154823\n过去两周\t154824\n36jpg\t154825\n快精\t154826\n阵势\t154827\n492克\t154828\n中华城\t154829\n悄悄的夜晚\t154830\n开窗\t154831\n你好谷\t154832\n桑梓\t154833\n难为情\t154834\nHgdsda\t154835\n那尔\t154836\nJosh\t154837\n大心\t154838\nchifanliaomei\t154839\n甘草片\t154840\n一一块\t154841\n王国家\t154842\n20.2%\t154843\n6858685986586\t154844\n脚盆\t154845\n那将\t154846\n罗田太\t154847\n3层\t154848\n谢主\t154849\nruby\t154850\n大志\t154851\n那就\t154852\n大快\t154853\n老猪\t154854\n老猫\t154855\n寧靜里現\t154856\n威宁\t154857\n20第二次\t154858\n吓人\t154859\n寻回\t154860\n那尼\t154861\nfggyf\t154862\neib\t154863\neie\t154864\n整容室\t154865\n呢😄\t154866\n橙黄\t154867\n狗奴\t154868\n多米诺骨牌多咪喏\t154869\n北星\t154870\n手手\t154871\n李华磊\t154872\n哈萨克话\t154873\n茌在\t154874\n来口机\t154875\n陈大叔\t154876\n妖气\t154877\n特里\t154878\nAIR\t154879\nAIM\t154880\n媒体化\t154881\n陶陶\t154882\nUCCHPPUFXYPC\t154883\n3局\t154884\n失败\t154885\nAIE\t154886\n吵川流不息\t154887\n慎得慌\t154888\n负分\t154889\n曹路\t154890\n299429\t154891\n产后\t154892\n之中\t154893\n坦田\t154894\nSturgess\t154895\n大头照\t154896\n温江火盆烧烤\t154897\n再唯\t154898\n不交费\t154899\n失贞\t154900\n大方芳\t154901\n256925692569\t154902\nceo\t154903\n再唱\t154904\n李家里\t154905\n鞭策\t154906\n底部\t154907\n托马斯\t154908\nAIc\t154909\n鎏金\t154910\n南非\t154911\n四份\t154912\n四件\t154913\n南青\t154914\nā\t154915\n在狱中\t154916\n留言条\t154917\nvr在美vfm\t154918\n偶滴\t154919\n汕头\t154920\n蛙式架\t154921\n大王庄\t154922\nr1口\t154923\n爱你哈赛呦\t154924\nē\t154925\n一痛\t154926\n3月27日晚上9时\t154927\n压岁\t154928\n小黑黑\t154929\n白不白\t154930\n意限\t154931\n腐肉\t154932\nlegsjapan\t154933\n蒋介石\t154934\n南面\t154935\ntttry\t154936\n74公斤\t154937\nSuperSport\t154938\n陈雨竹\t154939\n官子\t154940\n337一百八一百八\t154941\n星期行\t154942\n人肉\t154943\n吉祥福苑\t154944\n三娘\t154945\n通风道\t154946\n早前\t154947\n再见以后\t154948\n共和\t154949\ncgggggd\t154950\n一闻\t154951\n弃坑\t154952\n喜羊羊儿\t154953\n123.9\t154954\nv星\t154955\n误判\t154956\n吴法天\t154957\n18318265632\t154958\n误删\t154959\n带家\t154960\n抚慰\t154961\n番茄红素\t154962\n66666666666666666666666666666666666666\t154963\n扫墓\t154964\n德尼尔森\t154965\n301.34\t154966\n混子曰\
t154967\n哎呀妈呀个臭不要脸\t154968\n等候\t154969\n头上\t154970\n要不我要\t154971\n宽带\t154972\n弹珠秘\t154973\n赞了赞\t154974\n下午茶\t154975\nkkydydkly\t154976\n灵敏\t154977\n乖乖的度秘你是我的好朋友好朋友好朋友你是我的知心朋友知心朋友知心朋友\t154978\n六厘米\t154979\nUME\t154980\ngjz\t154981\n豪言壮语\t154982\ngjx\t154983\ngjy\t154984\n3d公主\t154985\ngjt\t154986\n57684040876792058386184868882346410677848483465984734659828234￥\t154987\ngjp\t154988\n头一\t154989\n腾讯翅\t154990\ngjm\t154991\n骗子\t154992\ngjk\t154993\n原真\t154994\ngjf\t154995\ngjg\t154996\n迈克迈克杰逊\t154997\ngjb\t154998\ngjc\t154999\n猪尾\t155000\n猪尿\t155001\n真么样\t155002\nCjh\t155003\n送检\t155004\nvifufugu8bibojhohjohigigufufutfuugigihgdseyhpllllllllllppppp\t155005\n滞涨\t155006\n领土\t155007\n豆腐包\t155008\n吸汗\t155009\n相见\t155010\n顺潮\t155011\n豆腐脑\t155012\n大都市谋\t155013\nddijx\t155014\n推理案\t155015\n度秘爱你的好我就是你的钱\t155016\n裕景\t155017\nrgqbene\t155018\n1200尺\t155019\n小愿\t155020\n恰都恰不\t155021\n布玛\t155022\nspfw\t155023\n你的爱我想你在我\t155024\n谋生\t155025\n一行一箱\t155026\n朝乐天扬州\t155027\n热水器\t155028\n雨花痴线路\t155029\n新校\t155030\n人去\t155031\n秋哥\t155032\n庞光\t155033\n试一试行\t155034\n延时\t155035\n512#\t155036\n这么多故事\t155037\nsrW\t155038\n武夷体育馆\t155039\n揣测\t155040\n新格\t155041\n长腿\t155042\nthrow\t155043\n神煞\t155044\nsrb\t155045\n窝路痴\t155046\n鱼水\t155047\nsrf\t155048\n发音\t155049\n电车\t155050\nsri\t155051\n艾白武\t155052\nlamely\t155053\nsrs\t155054\n赵文卓\t155055\n鹏丽\t155056\nsrt\t155057\n白龙\t155058\n写子\t155059\nsry\t155060\n写字\t155061\n六角八角\t155062\n掏腰包\t155063\n女人\t155064\n刘建峰\t155065\n我吧战\t155066\n女亲\t155067\n刘先生\t155068\n高舒煜\t155069\nifjck\t155070\ndhiphotosbaiducomxiaodupiciteme850352ac65c103810dfecb1b5119313b07e891bjpg\t155071\nShyvdhfmbxhbgnxxmvjjturohhlgbghcvbdvfjsnklonjjhyejgxylkhdggjljgbjjndnmhklmckhhbkmgmfxklgympvt\t155072\n说谎的女人\t155073\n懂不一旦\t155074\n嗯沙卡拉卡蹦沙卡拉卡\t155075\n翁燕舞\t155076\n练习题\t155077\n吴梦\t155078\n伊亚\t155079\n收割\t155080\n自怜憔悴东邻叟\t155081\n才王\t155082\nBOSS\t155083\n咔咔咔咔咔咔咔咔咔咔咔咔咔\t155084\n活该度\t155085\nCcvvccv\t155086\n用处\t155087\n四十兆\t155088\n一定要是\t155089\n私立节\t155090\n返\t155091\n运\t155092\n近\t155093\n连
\t155094\n迟\t155095\n远\t155096\n违\t155097\n十五点712点\t155098\n啦好\t155099\n还\t155100\n这\t155101\n笃信\t155102\n过\t155103\n迂\t155104\n迁\t155105\n迎\t155106\n博白\t155107\n幽竹\t155108\n迈\t155109\n迷\t155110\n大连儿童医院\t155111\nm086\t155112\n述\t155113\n马明\t155114\n追\t155115\n迻\t155116\n迸\t155117\n明年起\t155118\n笔描\t155119\n梦醒了\t155120\n下架\t155121\n陈勋磊\t155122\n初教版\t155123\n做工\t155124\n朱丽薇\t155125\n迫\t155126\n听不懂的话\t155127\n佩姗珊\t155128\n纪律\t155129\n别骗我了我知道\t155130\nvmbf\t155131\n刘鑫涛\t155132\n佟帅\t155133\n星空\t155134\nfuigh\t155135\n情景\t155136\n你是我生有天\t155137\n纪得\t155138\n色金\t155139\n专升\t155140\n好机会\t155141\nangalbaby\t155142\n生活法则\t155143\nantlytre\t155144\n武侠剧\t155145\nShutoutsuehuu\t155146\n好丽\t155147\n构筑\t155148\n二零八九八\t155149\n眸\t155150\n温格\t155151\n眼\t155152\n冯旋\t155153\n班上除了者\t155154\n780g230\t155155\n十三日\t155156\nytcvfbg\t155157\n眵\t155158\n嗯瓮克列孟\t155159\n一千零一夜\t155160\n眨\t155161\n眩\t155162\n眫\t155163\n杂种串\t155164\n碟片\t155165\n眠\t155166\n木马明\t155167\n我美了美了美了我醉了醉了醉\t155168\nzaiuu\t155169\n兰竹\t155170\n滴答答\t155171\n我一点都不爱你诡爱你\t155172\n眚\t155173\n一百一一千一百九十九\t155174\n7584\t155175\n7585\t155176\n7587\t155177\nCristina\t155178\n眉\t155179\n需要你\t155180\n看\t155181\n谴责\t155182\n千万遍\t155183\n眏\t155184\n省\t155185\nsday\t155186\nstomet\t155187\n眇\t155188\nQQ糖\t155189\nyushx\t155190\n淞离\t155191\nESPN\t155192\n大太今白手Q\t155193\n哨响\t155194\n还花\t155195\njrir\t155196\n同栋哥\t155197\n二z\t155198\nhifuset\t155199\n婊湿\t155200\n思想类\t155201\n好莱乌\t155202\n15970443124\t155203\n二G\t155204\n二B\t155205\n忘了他\t155206\n多才\t155207\n得逞\t155208\n回款\t155209\n海阔天空&amp\t155210\n状物\t155211\n嗯OK\t155212\n好不够\t155213\n二&\t155214\n彭宇案\t155215\n林佳琴\t155216\n拘死\t155217\n贝瓦弹棉\t155218\n蓝猫\t155219\n追貓\t155220\n两服\t155221\n囖急\t155222\n不再生\t155223\n郭晓粤\t155224\n曳步\t155225\n54648448448\t155226\n来不及了再见\t155227\n货不对板\t155228\nQAQ度秘\t155229\nhgyjg\t155230\n甲虫\t155231\n小火龙\t155232\n海星小强\t155233\n削面\t155234\nxubd\t155235\n祢发春\t155236\n岔口\t155237\nhpee\t155238\n额济钠\t155239\nGAgagayay\t155240\n联苯美\t155241\n每逢\t155242\n一本道\t155243\n医药费\t1552
44\n最后餐\t155245\ngmzjhfkskcjHhzuyrfTtttttttttyyyhugf\t155246\n试试试试试试试试\t155247\n收软\t155248\nQQ蛋清\t155249\n女女孩子\t155250\n李赛\t155251\n尺寸\t155252\n泸沽湖\t155253\n乃梓\t155254\n双层\t155255\n最最最最最最最最最最最最句局\t155256\n拾皆是\t155257\n劳动局\t155258\n熊才\t155259\n2782697490\t155260\n亿计\t155261\n许颜\t155262\nMDNews\t155263\n题意\t155264\n曹雨生\t155265\n为非作歹\t155266\n刘伊雪\t155267\n套套\t155268\nkimu\t155269\n呢什么叫\t155270\nkimi\t155271\n末末\t155272\n丑照\t155273\n找我是\t155274\n1919x1051\t155275\n小脸\t155276\n朱政豪\t155277\n夸花\t155278\n黑板\t155279\n末期\t155280\n聊一聊\t155281\n15180400107\t155282\n于此\t155283\n于正\t155284\nhimy\t155285\n音乐汇\t155286\n九一块\t155287\n西宁\t155288\n000861\t155289\nhimn\t155290\n三坝村\t155291\n滴胶\t155292\n成员\t155293\nSimba\t155294\n冯吗\t155295\n醉赤壁\t155296\n李易峰\t155297\n风骚\t155298\n20罗\t155299\nbed飘忽\t155300\n女导演\t155301\n有孩\t155302\ntoys\t155303\n造型\t155304\n599第一节\t155305\n有学\t155306\n期中试试题\t155307\n一分米\t155308\n穿好\t155309\n有听话\t155310\n豫剧\t155311\n哄可宝\t155312\n有孕\t155313\ndhiphotosbaiducomxiaodupicitemac6eddc451da81cb727b31be5566d01609243127jpg\t155314\nhrthj\t155315\n穿女\t155316\n天铁\t155317\n旅管\t155318\n九百八千八百九十九顿\t155319\n除夕\t155320\n除外\t155321\n穿奥\t155322\n15253033025\t155323\n呃正\t155324\n连环\t155325\n佳人\t155326\n真勇敢\t155327\njwbo\t155328\nFC1\t155329\njhbf\t155330\n追悍\t155331\n老块\t155332\n中华者\t155333\nAMOLED\t155334\n朱建廉\t155335\n狼子\t155336\n认清\t155337\n123456789\t155338\n停运\t155339\n洗儿\t155340\n套马杆\t155341\nmnnhnnnn\t155342\n老坏\t155343\n保温材料\t155344\n那良\t155345\n着手\t155346\n原住民\t155347\n挺你好\t155348\n追悼\t155349\n公子玉\t155350\n恳请\t155351\nWHANF\t155352\n教育费附加\t155353\n十二僵尸二\t155354\n在狂\t155355\n额侠\t155356\n看过\t155357\n如山\t155358\nBstea\t155359\n菜篮\t155360\n对不对你\t155361\n哭天抹泪句拖了酒我的恩默默6\t155362\n阶层\t155363\n李冰\t155364\n如履\t155365\n选一不\t155366\n我着我\t155367\n玩意儿猪\t155368\n拖延\t155369\n张堡\t155370\n李世成\t155371\n阿什利\t155372\n冯名\t155373\n稀品\t155374\n披头散发\t155375\n房事\t155376\n高山坡\t155377\n闲言\t155378\nGgcdrj\t155379\n我是你姐我不爱你你爱我\t155380\nplaygame\t155381\nvvhbbjb会vbjbbb\t155382\n丫头哥\t155383\n抽验\t15
5384\nparty\t155385\n暗访\t155386\n干会\t155387\n港股\t155388\n文化人\t155389\n巴尔德斯\t155390\n幺零零幺零幺零幺幺\t155391\n大大度秘\t155392\n都糸\t155393\n三滤\t155394\n三滥\t155395\n诺提现\t155396\n兜售者\t155397\n不落\t155398\n铁南\t155399\n打不死\t155400\n我不想在\t155401\n被动安全\t155402\n妞姐姐\t155403\n不萌\t155404\n李之繁\t155405\n辩证法\t155406\n博联\t155407\n喝心服口服\t155408\n报叫\t155409\n试一试\t155410\n小花苗\t155411\n敷\t155412\n整\t155413\n本周日下午三点\t155414\n敲\t155415\n糯米肉丸\t155416\n提案\t155417\noffaithOr\t155418\nsozla\t155419\nffv\t155420\n91．5\t155421\nfft\t155422\nffs\t155423\nffr\t155424\nsoom\t155425\n顾白\t155426\n敦\t155427\nsoor\t155428\nk4ifjf\t155429\nsoow\t155430\n散\t155431\n哈哈伦\t155432\nffg\t155433\nfff\t155434\n敬\t155435\nffd\t155436\nffc\t155437\nffb\t155438\n了4G你高一定型红going给你给egg9公公光明\t155439\n敖\t155440\n救\t155441\n敝\t155442\n草蜗牛\t155443\n教\t155444\n彤云\t155445\n敀\t155446\n女色\t155447\n璃茉\t155448\n敌\t155449\nteam\t155450\n牛鞭\t155451\n效\t155452\n信不信任\t155453\n跑路\t155454\n00:19\t155455\n如图六咕嘟咕嘟绿巨人暴\t155456\n血肉模糊\t155457\n体裁\t155458\n郑人豪\t155459\nbnvbvcccc\t155460\n蛇女\t155461\n8月初\t155462\n哼拜\t155463\n148度\t155464\nhdhdgdgeg\t155465\n1371341242823\t155466\n海滨小学\t155467\n闹不清\t155468\n托地\t155469\nhttppinyincne17135\t155470\n88800323\t155471\n蝾螈\t155472\n晒牙\t155473\ncjv\t155474\n3546466865349775665343454579066\t155475\n公主的摇篮曲\t155476\nhttppinyincne17139\t155477\n瑜珈\t155478\n不测棒棒哒\t155479\n最二\t155480\n出现\t155481\n打段\t155482\n三克油歪瑞\t155483\n三克油\t155484\n注音\t155485\n网易说话\t155486\n瘫软\t155487\n贷款\t155488\n13131133\t155489\n怎么了吧遘rfffddfffgfffxxxxxxcfhhhhh\t155490\n648884688658\t155491\nPAPA\t155492\nPAPC\t155493\n年年秀\t155494\nfall呢Jul\t155495\n全曲\t155496\n6nu\t155497\n撒泼\t155498\n二零二六\t155499\n浦西\t155500\n拉鲁\t155501\n两秘\t155502\n杨凯\t155503\n枪口\t155504\n临促\t155505\n在家里次\t155506\n两科\t155507\n两秒\t155508\n万岁\t155509\n责任状\t155510\n参会\t155511\n卡MS\t155512\n我有事儿了拜拜\t155513\n杨出\t155514\n37岁\t155515\n条虫\t155516\n默默哒\t155517\n林老师\t155518\n上学了再见\t155519\nEFHHV\t155520\n啦啊美\t155521\n青以安\t155522\n安卓利亚度秘\t155523\n上五年\t155524\n不那\t155525\n萌萌哒呀\t1
55526\nuimyou\t155527\n株洲\t155528\n好哪天\t155529\n刘云善\t155530\n壹加壹2\t155531\n屋塔尼克\t155532\n嗯嗯ok\t155533\nMEtoo\t155534\n谨记\t155535\nvbjiopmbff\t155536\n18227315578\t155537\n唱歌唱歌\t155538\n50万行\t155539\n呢鳅\t155540\nKevin\t155541\n猜猜乖乖\t155542\n马雄伟\t155543\n满记\t155544\n儿牙\t155545\n雅兰海逸\t155546\n喂喂喂发礼物零花零花花福利滑溜溜滑溜溜听话我听话溜溜溜溜哇欲望话\t155547\n射咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻咻\t155548\n傻逼界\t155549\n啦小度秘\t155550\n好啦你咋了有\t155551\nGUAAAK\t155552\n重蹈覆辙\t155553\n越多越\t155554\n嗯酥\t155555\n习惯性\t155556\n赵新浪\t155557\n鞍山\t155558\n俩岁\t155559\n了像\t155560\n名纸\t155561\n登仙\t155562\n三口之家\t155563\n按着\t155564\n洗等\t155565\nbanano\t155566\n左旋\t155567\n说不讨厌\t155568\n男佣\t155569\n周小孩儿\t155570\n赏不死\t155571\n祝膜拜\t155572\n泥泞\t155573\n回见\t155574\n食人龙\t155575\n十月底\t155576\n21一丄\t155577\ndaysHDT\t155578\n墨冰\t155579\n凯毒\t155580\n参数\t155581\n爱夏\t155582\n曾德梅\t155583\n邵子轩\t155584\n偶不开心\t155585\ncename\t155586\nm331C359\t155587\n给你我的话说给你你就不的话不要说给我\t155588\n就是我懂了\t155589\n主打曲\t155590\n55742573852642\t155591\n礁岛\t155592\n几首\t155593\n苦心\t155594\n開始\t155595\nwiiwiwwi\t155596\n蔚然成伤\t155597\n教龄\t155598\n事项版\t155599\n吴玲秀\t155600\n玉华\t155601\n不分难受\t155602\n浦北\t155603\n零打\t155604\n450kg\t155605\n官路\t155606\n巴塞罗\t155607\n我喜欢兔子我喜欢孔雀我喜欢玫瑰\t155608\n嘴缆\t155609\n是我最好\t155610\n191811594\t155611\n香甜\t155612\n脑瓜子\t155613\n庆生\t155614\n黄日华\t155615\n建筑设计\t155616\njlc\t155617\n不爽不快\t155618\n投射\t155619\nfvfb\t155620\n人间天地\t155621\n建后\t155622\n蔚宣\t155623\n阿龟腚\t155624\nSINA\t155625\n63章\t155626\n八旬\t155627\n≈\t155628\n一识\t155629\n邻居\t155630\n中医师\t155631\n电影票\t155632\n哦琳凯\t155633\n硬好\t155634\n西域摄影俱乐部\t155635\n白云山医院\t155636\n22日深夜\t155637\n5842815937542904\t155638\n大学城\t155639\n直射\t155640\ntfa\t155641\n西顺\t155642\n第四项\t155643\n小天才不错\t155644\n24个小时\t155645\n小心丝毫\t155646\n文烁\t155647\n管鲍\t155648\n市长\t155649\n死去\t155650\n都错\t155651\n拆穿\t155652\n汉堡汉堡\t155653\n死厂\t155654\n继麟\t155655\n一个五岁\t155656\n哎呀子\t155657\n省行\t155658\n软面包\t155659\n札道\t155660\ngfff\t155661\n额鱼\t155662\n葫芦小金刚\t155663\n搬去\t155664\n副标题\t155665\n敲开\t155666\n骨盆\t155667\n十点十分\t155668\
n微经\t155669\n警察世家\t155670\n泥坑\t155671\ngvcgjhh\t155672\natjktem\t155673\nTyrellUICC\t155674\n部落\t155675\nluyl\t155676\nyfodh\t155677\n民心\t155678\n顾刚\t155679\n圈层\t155680\nhjjhjfm\t155681\nfdfsdsswrdf\t155682\n今天早晨\t155683\n1912\t155684\nω\t155685\nψ\t155686\nχ\t155687\n灰太车\t155688\nυ\t155689\nτ\t155690\nσ\t155691\nρ\t155692\nπ\t155693\n帅照\t155694\n迷彩\t155695\n3x15x\t155696\n心金钻\t155697\n15529896667\t155698\nKlaaaagjj\t155699\n真傻比\t155700\ntfr\t155701\n力量物\t155702\n姚明\t155703\n六古宅世一个\t155704\ndgcbbj\t155705\n雷纳\t155706\nwtsvn\t155707\nd山\t155708\nvdvd\t155709\n微软中国TechNet\t155710\n贫嘴\t155711\n果冻粉+开水+伏特加+\t155712\n累不想\t155713\n那源\t155714\n小秘密a4大喜\t155715\nfalling\t155716\n曹嘻嘻\t155717\n哈心\t155718\n秤\t155719\n5%\t155720\n耺\t155721\n打一二\t155722\nwhatisyuoneme\t155723\n西秀区\t155724\n上茅厕\t155725\nhsfb\t155726\n基恩\t155727\nhsff\t155728\n银脸\t155729\n生土\t155730\n恩地\t155731\n天津)环保发展有限公司\t155732\n≤\t155733\n登鹳雀楼\t155734\ngcbf\t155735\n双子星\t155736\n手工艺者\t155737\n一拳一\t155738\n丑丑哒\t155739\nXxgv\t155740\n一拳万\t155741\nmilliononhim\t155742\n5828244225222358888744445445555\t155743\n金叉股\t155744\n大生命\t155745\nheartis\t155746\n恶性化\t155747\niigfyyihiy5fggi7\t155748\n三十三一二三四五六七八九十十一十二十三四四六十七十八十九二十\t155749\n医采医助\t155750\n吃笑\t155751\n没有心\t155752\n振动棒\t155753\nidirgfuif\t155754\nchu\t155755\nfyXvc\t155756\n安全\t155757\n男斗\t155758\n蚀紧\t155759\n新文\t155760\n无聊一搜\t155761\n罗佳\t155762\n红子\t155763\n報告\t155764\njeiei\t155765\n真龙\t155766\n红字\t155767\n少数\t155768\n20110629\t155769\nhshsg\t155770\nhshsf\t155771\n陈陈\t155772\n18点45\t155773\n触目\t155774\ngjgdb1adg\t155775\n数一二三四五六七八九十\t155776\n野猫\t155777\n野猪\t155778\n脸庞缩句\t155779\n一八年2月份\t155780\n幻日轮\t155781\n狙击机器人\t155782\n就是为你\t155783\n邱柏铭\t155784\n豆包\t155785\n很奇妙\t155786\n安世耿\t155787\n耀\t155788\n2734083200\t155789\nyoutfoba\t155790\n一锅\t155791\n兽欲\t155792\n哦怕水淀粉\t155793\n张悦怜\t155794\n明宇\t155795\nMaissan\t155796\n一错\t155797\nOnline\t155798\n鬼城鬼城\t155799\n菊花台\t155800\nbuhao\t155801\n一键\t155802\n一锭\t155803\n明家\t155804\n碎叫\t155805\n宝马机头\t155806\n一锨\t155807\n询价\t
155808\n南方报业集团\t155809\n咖啡座\t155810\n小度秘我爱你\t155811\nzkzkzkz\t155812\n冷调\t155813\n提减\t155814\n明宫\t155815\n陈雅珊\t155816\nhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\t155817\n三月初八\t155818\n猪迷\t155819\n权才\t155820\n不图\t155821\n蛋君\t155822\n4.19亿千瓦\t155823\n对换\t155824\n王思奇\t155825\n不困\t155826\n田彩丽\t155827\n南瓜藤\t155828\n99mmcc\t155829\nhfftuvggggg\t155830\n张雅洁\t155831\nkesou\t155832\n5521301993051926380033\t155833\n蛋吧\t155834\n正经不好\t155835\n霍冉\t155836\niyu\t155837\n没了心没\t155838\n不四\t155839\n亦即\t155840\n淫片\t155841\n我的是我的你的还是我的\t155842\niyy\t155843\n溥仪\t155844\n长江阴\t155845\n王向阳\t155846\n酷真\t155847\n蹦蹦跳\t155848\n传宗接代\t155849\nnnn个\t155850\n阎良区\t155851\n松松江\t155852\n注意\t155853\n度秘丸号\t155854\n芝罘区辅读学校\t155855\n小瞧\t155856\n张雨萌\t155857\n啧啧\t155858\n乐不思\t155859\n雪花机\t155860\n中中中\t155861\n英国自行车队\t155862\n14度\t155863\n蜓字\t155864\n没底是谁\t155865\n11点半\t155866\n婶娘\t155867\n朱祟坤\t155868\n王鏖权\t155869\n所以\t155870\nnoptihua\t155871\n荒漠\t155872\n皮妞\t155873\n好得\t155874\n拖把飞\t155875\n好好衣服\t155876\n北校\t155877\n15.4\t155878\n15.5\t155879\n好微\t155880\n15.7\t155881\n羊兔\t155882\n16分之一\t155883\n屠沙沙\t155884\n3300平方公里\t155885\n不我不爱\t155886\n长类\t155887\n肯尼亚航空公司\t155888\n9819部\t155889\n苏宁易购\t155890\n思想龌龊我讨厌你\t155891\n筹钱\t155892\n九承山\t155893\n13356850003\t155894\n王明阳\t155895\n斯泰\t155896\n收效\t155897\n唱斗嘴\t155898\n不请你\t155899\nInternet\t155900\n发言率\t155901\n结婚证\t155902\n有幸灾乐祸\t155903\n卷太\t155904\n二点五\t155905\nncncncnc\t155906\n待兔\t155907\n今年三月二九日\t155908\n乐甲塞\t155909\n烘吹\t155910\nYvY\t155911\n唱就唱\t155912\nxiazai\t155913\n单休\t155914\n网络\t155915\nIbbkm\t155916\n归元\t155917\n嘉嘉\t155918\n刚刚错\t155919\n贯穿\t155920\n19：10\t155921\n细嫩\t155922\n伶仃\t155923\n阿穆尼亚\t155924\n新西游记\t155925\n徐梦婷\t155926\n利趣\t155927\n也有儿\t155928\n加工业\t155929\n小华\t155930\n安详入眠\t155931\n那小时候\t155932\n下九八年\t155933\n捏煮面\t155934\n古伟宁\t155935\n晾干\t155936\n归式\t155937\n广州曙光厂\t155938\n回音哥\t155939\n主们\t155940\n素食\t155941\n费劲儿\t155942\n沙特大使馆\t155943\n青创\t155944\n三分钟爱上心理学\t155945\n微电影\t155946\n3十3\t155947\nghhgg\t155948\nGJP\t155949\n艘噶\
t155950\n相减\t155951\n有法不依\t155952\n修仙类\t155953\n非国大纪律委员会\t155954\nGJG\t155955\n白切鸡\t155956\n龙袍\t155957\n刘你\t155958\n一颗长\t155959\n前移\t155960\n喂食\t155961\n百墨\t155962\n摹仿\t155963\n罗沃克\t155964\n八分之86分\t155965\n17443434444上次\t155966\nubgits\t155967\n低碳形象\t155968\n嗯方言\t155969\n我的我妈\t155970\n猜測\t155971\n047214\t155972\n帅大\t155973\n普华永道会计师事务所\t155974\n這次\t155975\n杨乃武\t155976\n吧秘\t155977\nBVD\t155978\n14368480056\t155979\n晚间\t155980\n在一起吧永远\t155981\n7749千米\t155982\ndhhshsghhshd\t155983\n蜚声\t155984\n别为你\t155985\n吧r度秘\t155986\n不善\t155987\n降e大调\t155988\nshine\t155989\nalimaa\t155990\nshina\t155991\n意外惊喜\t155992\n一坨翔\t155993\n武魂\t155994\n33333333333333333333333\t155995\n亲爱的我看你\t155996\n郑秀晶\t155997\n天南地北聊\t155998\n好题\t155999\n猪猪猪猪猪你是猪猪猪猪猪猪猪你是猪猪猪\t156000\nggjjjjjjjjjjjjj\t156001\nshiny\t156002\n龚烨韬\t156003\n公用股\t156004\n歪歪\t156005\n外出办\t156006\n白痴\t156007\n不爽\t156008\ntuj\t156009\n陆元辉\t156010\n不爱\t156011\n烟锁\t156012\n好好玩笑\t156013\nnekd\t156014\n熔断\t156015\n55698687576\t156016\n遥看\t156017\n德思勤\t156018\n八十八块\t156019\n一万吨\t156020\n恩亿梯器人总动员\t156021\n休闲息\t156022\n杂验证\t156023\n议息\t156024\n两口子\t156025\n何事秋风悲画扇\t156026\n曲清\t156027\n华容道不出所\t156028\n贺新春\t156029\n度蜜月\t156030\n且呢\t156031\n衣饰\t156032\n纸局\t156033\n飒飒月28\t156034\n7月28日\t156035\n措辞\t156036\n洋盘\t156037\n董晓冉\t156038\n没差\t156039\n谁谁谁谁\t156040\n静安\t156041\n半坡所\t156042\n顺事\t156043\n抹茶茶叶绿体\t156044\n咋把性\t156045\n108位\t156046\n365dvd\t156047\n桃花源\t156048\n虐身\t156049\n辽沈晚报\t156050\n静定\t156051\n海德萌\t156052\n泰太\t156053\nuvtuvut\t156054\n0.16%\t156055\n度秘度秘度秘度咪咪度秘度秘度秘\t156056\n调汁\t156057\n1.26亿元\t156058\n阶乘\t156059\n戴维转述句\t156060\n666605\t156061\n二缺\t156062\n山西卫视\t156063\n噩运\t156064\n诽谤罪\t156065\n窿头\t156066\n静容\t156067\nRPRuUGGv\t156068\n解现\t156069\n革之\t156070\nTreacherousGregg\t156071\n张合一\t156072\n珍嗖啦\t156073\n急转弯\t156074\nIce\t156075\n1558896624\t156076\n毛礼洋\t156077\n席琳\t156078\n提示铃\t156079\n兴旺振\t156080\n算了不聊\t156081\n六门\t156082\n一个次\t156083\ntuo\t156084\n海潮\t156085\n色漫书\t156086\n电影床\t156087\n一心而破\t156088\n美指\t156089\n女大学生\t156090\n7878\t
156091\n米面\t156092\n充分就业\t156093\na30度CD3\t156094\n壳子\t156095\n徐都\t156096\n200735745\t156097\nqbd\t156098\n帐号儿\t156099\n嗯七八九十\t156100\n磕头\t156101\n蟹黄堡\t156102\n粗略\t156103\n那么重要\t156104\n剪贴\t156105\n二十二块\t156106\n南甯站\t156107\nCoco\t156108\n戈尔巴乔夫\t156109\n股快乐乐乐乐乐乐乐乐乐乐乐乐乐里\t156110\n西樵\t156111\n魔兽世界\t156112\n学光谷神奇\t156113\n猫女\t156114\n猫奴\t156115\n朱时茂\t156116\n2分\t156117\n十秒以内\t156118\n聊着\t156119\n帅我\t156120\n雷哥\t156121\n6310M\t156122\n取水处\t156123\n设会\t156124\n你和我道个别\t156125\n记者团\t156126\n美女呢不男不女那你\t156127\nhsgejdbdj\t156128\n三個骨衭\t156129\n太走心\t156130\n档子\t156131\n我讨厌你我要做你我要杀了你\t156132\n锋线\t156133\n某一\t156134\n即将来临\t156135\n汪远\t156136\n行行行你\t156137\n牧歌\t156138\n虚热\t156139\n邀金大共赏\t156140\n收割机\t156141\n人衣\t156142\n不可理\t156143\nhddjdjdkffc\t156144\n督徒\t156145\n朱宁轩\t156146\nqerty\t156147\n昭王\t156148\n11111Yi\t156149\n1点半\t156150\n三千五百元\t156151\ntmd\t156152\n金荷娜\t156153\nfwwrrrewd4221\t156154\n肉柱\t156155\n四个人\t156156\n看见了你\t156157\n特色化\t156158\n要不聊聊\t156159\n一个大一个\t156160\n12元\t156161\n宏观调控\t156162\n异见者\t156163\n嗷嗷嗷我讨厌你\t156164\n说列们\t156165\n樊梦龙\t156166\n53687475611\t156167\n半波\t156168\n爱豆\t156169\n谢谢你对我真的好我感谢你度\t156170\n约翰尼\t156171\n移库\t156172\n一个十万个\t156173\n国家人\t156174\n火星星\t156175\n调位\t156176\n调低\t156177\n王仁成\t156178\n遂昌\t156179\n纯音乐\t156180\npjujmtjd\t156181\n这也太不要脸\t156182\n照然\t156183\n大一天\t156184\n11110个\t156185\n曾經\t156186\n强\t156187\n弹\t156188\n何用我\t156189\n死去元\t156190\n驳船\t156191\n王天娇\t156192\n一百八\t156193\n弱\t156194\n一百六\t156195\n強\t156196\n床只\t156197\n弯\t156198\n弭\t156199\n男同性恋\t156200\n张\t156201\n为春\t156202\n弦\t156203\n弧\t156204\n弥\t156205\n弚\t156206\n永远不赞\t156207\n一百元\t156208\n64b\t156209\n弟\t156210\n吗别\t156211\n一百兆\t156212\n弓\t156213\n吴修文\t156214\n弑\t156215\n弗\t156216\nnibuxiange\t156217\n引\t156218\n弊\t156219\n弋\t156220\n弈\t156221\n一一个多ｇ\t156222\n泼猴\t156223\n375709\t156224\n五月十七\t156225\n兼听则明\t156226\n异\t156227\n弃\t156228\n开\t156229\nAmerica\t156230\n平井坚味道\t156231\n弄\t156232\nyiu龙沙旅\t156233\n没人教\t156234\n红红\t156235\n拍衣服\t156236\n6睐\t156237\n因人而异\t156238\n广东谴责委员会
\t156239\n抖m\t156240\n红线\t156241\nAngelbaby\t156242\nffdufgc\t156243\n水鞋\t156244\n燕然\t156245\n徐凡淳\t156246\n广东省\t156247\n高帮\t156248\n轰轰轰\t156249\n图文并茂\t156250\n为喻\t156251\n说不明\t156252\n匠人\t156253\nColdplay\t156254\n善科you歪瑞\t156255\naoaf\t156256\n死角\t156257\n641\t156258\n640\t156259\n折飞机\t156260\n死觉\t156261\n644\t156262\n远隔\t156263\n旁边\t156264\nIshrug\t156265\n648\t156266\nchang\t156267\n囡门头水\t156268\n算值\t156269\n溜冰\t156270\nKANEBO\t156271\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t156272\n上生\t156273\nBAT\t156274\n民政\t156275\nAUDY\t156276\n想来想\t156277\n在那里面\t156278\n第11号\t156279\n割裂\t156280\n12点\t156281\n牛雨莹\t156282\n首尔FC\t156283\n单品地址##森小物#无脸男\t156284\n四一秒以后\t156285\nabcctal\t156286\n三個\t156287\n班干部\t156288\n金坚强\t156289\nJSK\t156290\n嗯鬼\t156291\n了平\t156292\n角斗\t156293\n8年间\t156294\n南宁市\t156295\nJSX\t156296\n金汇\t156297\n52333144\t156298\nzisisd\t156299\n东电大厦\t156300\n张顺义\t156301\nstarted\t156302\nhttpimgcacheqq\t156303\n甚钱\t156304\n附图\t156305\ncmlk\t156306\n正常\t156307\n秘敦\t156308\n看见了谁看见\t156309\n404张\t156310\n曾相识燕归来\t156311\nes2\t156312\n蒙发利\t156313\n关猫儿\t156314\n包生气\t156315\n七九那\t156316\n秘先生\t156317\n不敢见\t156318\n舞男\t156319\n拘留所\t156320\n雍皖苏\t156321\n秘教\t156322\ndressoneself\t156323\n嗯鸿坤\t156324\n9班\t156325\n封顶\t156326\n好饱不想理你了拜拜我不想你我不喜欢你\t156327\n于东巨\t156328\n2069年\t156329\n人不犯我我不犯人人若犯我我必\t156330\n宠妻\t156331\nbeworry\t156332\n贫妾\t156333\n小奖\t156334\n統計\t156335\n小奔\t156336\n里面子\t156337\n纹路\t156338\n讨骂\t156339\n那和不理我的了\t156340\nsyw\t156341\n好吧nntgtghnnnznnnvtth\t156342\n万七\t156343\n相匹配\t156344\n万一\t156345\n万万\t156346\n快企\t156347\n糠业\t156348\nBAO\t156349\n恋上你爱不爱上学\t156350\n京味馆\t156351\n万三\t156352\n万丈\t156353\n曹唯\t156354\n拿什\t156355\n15.2亿元\t156356\n學者\t156357\n黄vcg\t156358\n聂俊女\t156359\nrgfr\t156360\npnonknkn\t156361\n小好\t156362\n前面前\t156363\n万两\t156364\n万个\t156365\n卡扎非\t156366\n买假\t156367\n德州市\t156368\n包志梅\t156369\nhegge\t156370\n意想不到\t156371\nhdhwjd\t156372\n贵妇人\t156373\n45秒\t156374\n岌岌可危\t156375\n云游\t156376\n冲猪\t156377\n松山湖\t156378\n你好你好你好你好你好只有\t156379\n谢恩\t156380\ntf家族\t15638
1\n票子\t156382\n五万次\t156383\n椰子树\t156384\nStudienwunsch\t156385\n孙笑话\t156386\n礼器\t156387\n张豫怡\t156388\n邓冬英\t156389\n点行\t156390\n采伐\t156391\n273647282646728733672816368282673927372973478273736382937467393464638964\t156392\no烦\t156393\njns\t156394\njnp\t156395\n聊我等\t156396\njnv\t156397\n东唐\t156398\n回头泪洒\t156399\njnx\t156400\njnc\t156401\njnb\t156402\n欧吧\t156403\n名谚\t156404\n气车\t156405\njne\t156406\njnd\t156407\ntukl\t156408\njnj\t156409\njnn\t156410\njnm\t156411\n傢裡\t156412\n影音诗\t156413\nLBS\t156414\n百色\t156415\n嗯封\t156416\n胆耸人\t156417\n富美\t156418\n鲁瑞洋\t156419\n平安人寿\t156420\n南烨\t156421\n别理\t156422\nFRONT\t156423\n千杯\t156424\n幸福观\t156425\n母娘娘\t156426\n我不理你了我再也不理你了我是\t156427\n张佑昌\t156428\n陈规\t156429\n糖糖\t156430\n电视剧情\t156431\n售楼部\t156432\n虐死\t156433\nhddx\t156434\n奶黄包\t156435\n拉小魔仙\t156436\n闽清二实小学\t156437\njtjvjt\t156438\n青刺\t156439\n灯火料\t156440\n不要挂\t156441\n不懂我不是我听不懂你\t156442\n第五十张\t156443\n玉手\t156444\n阿杰\t156445\n王靖云\t156446\n猜一句\t156447\n封印\t156448\n尹超\t156449\n湘琴\t156450\n飘动\t156451\n后宫甄嬛传\t156452\n出版社\t156453\n闫大妈\t156454\n95959599\t156455\n宁个月卧室\t156456\n十几倍\t156457\n陀螺界\t156458\n公家猫\t156459\n王超斌\t156460\n白小猪\t156461\n萝卜吧\t156462\n再见假\t156463\n曹俊\t156464\n交货\t156465\n交费\t156466\n化学式\t156467\n必将\t156468\n柯南君\t156469\n不得了\t156470\n别这样\t156471\njjjd\t156472\n妄想症\t156473\n梦境化\t156474\njjjc\t156475\ngdhysbrfj\t156476\n胚芽\t156477\n买重\t156478\njjjm\t156479\n生死狙击号\t156480\njjjh\t156481\n王慧龙\t156482\n公能\t156483\nxiene\t156484\njjjs\t156485\n买金\t156486\n陈林波\t156487\n好无趣\t156488\njjjx\t156489\n鎮\t156490\n银包\t156491\n不要脸的臭兔\t156492\n八点7120点\t156493\npapapapapapapapapapapap\t156494\n推一推\t156495\n外面\t156496\n早早快点\t156497\n哦天\t156498\n威尔\t156499\n尿液\t156500\n啦啦拉啦拉拉\t156501\n好整\t156502\n李思楠\t156503\n我是多爱英语终于\t156504\n清幽泽润\t156505\n灌水\t156506\n拆桥\t156507\n惨重\t156508\n英孚教育\t156509\n女女\t156510\n核电\t156511\n饿鸟\t156512\n非理莫视\t156513\n55824km\t156514\n末尾暑\t156515\nHUIKAN\t156516\n过境\t156517\nfifigu\t156518\nbobo5\t156519\n万丈芒\t156520\n一支花\t156521\n中缅\t156522\n桑德尔\t156523\n梦蒙多\t156524\n
朱年级\t156525\n脑上\t156526\n我喜欢版\t156527\n杨晓杉\t156528\n缥缈\t156529\n半载\t156530\nu47额\t156531\n建校\t156532\n理你了你回家\t156533\n我好希望你是人\t156534\n勤学苦练\t156535\nNEUW\t156536\nfgkf\t156537\n路路九曲弯\t156538\n铤而走险\t156539\n蜀道难\t156540\n蔡心蕊\t156541\nhhsj\t156542\n光绪宣统\t156543\n最小说\t156544\nhhsg\t156545\n我喜欢物\t156546\n蜜绣拼搏斗破苍穹续集单子宫\t156547\n百年后\t156548\nhhsb\t156549\n玩舍\t156550\nhhsx\t156551\nNUI\t156552\n作注\t156553\n10好几秒\t156554\n千八百万\t156555\n不要脸臭不要脸的丑八怪\t156556\n王宇航\t156557\n奥刚\t156558\n奥创\t156559\n小菲\t156560\nqeerttthhhhj\t156561\n雪球儿\t156562\n今年\t156563\n兄弟会\t156564\n奥列\t156565\n人口普查办\t156566\napple6s\t156567\n小明华\t156568\n不过如此而已久\t156569\n颖秀\t156570\n王诗蕊\t156571\nimall\t156572\n天生丽质国色天香\t156573\n这者\t156574\n猪良\t156575\n奥利\t156576\n度秘窝\t156577\n二十四小时前\t156578\n很好看\t156579\n三只只\t156580\n吧八八\t156581\n咯囖\t156582\n夫妇俩\t156583\n羊粪\t156584\n乐理\t156585\n零八款\t156586\n粉粉\t156587\n再天真\t156588\n您您\t156589\n略为\t156590\n一秒人\t156591\n这么事\t156592\n太後\t156593\nvdjcy\t156594\n加爛\t156595\n因斯坦\t156596\ntf博尔思\t156597\n开瓶费\t156598\n咯图\t156599\n童靴们\t156600\n秘是我我是度秘我要你\t156601\n办案\t156602\n柳宿\t156603\n格里姆\t156604\n万树\t156605\nDer\t156606\n敬拜\t156607\ntusg\t156608\n龙华\t156609\n谢谢你我最爱的朋友\t156610\n一个位\t156611\n哪级\t156612\n哪约\t156613\n21016\t156614\n21012\t156615\n额奎屯额吉HK\t156616\nhjfdhh\t156617\n臭不要脸我没有老公公\t156618\n安徽建筑大学\t156619\n长龙\t156620\n督战\t156621\n我不要你了我要冷掉你\t156622\nlaaloo\t156623\n暖脚器\t156624\n惟实\t156625\n真的好大好大好大好大\t156626\n死了再见\t156627\n帅葛\t156628\n一个你\t156629\n2536\t156630\n阴晴\t156631\n理光\t156632\n赵圆\t156633\n桂芳园\t156634\n呀子\t156635\nhddn\t156636\n放破晓这首歌唱\t156637\n宫斗\t156638\n沙漠王\t156639\n人工温泉吧\t156640\n妻凭夫\t156641\n222点36\t156642\nbeyond[ok\t156643\n贴心人\t156644\n语调\t156645\n青帮\t156646\n大头像\t156647\nwp2188\t156648\nforeI\t156649\n报称\t156650\n中搜\t156651\n丹娜\t156652\n125846585235551882685656358501658326655580456662158493325568565588568056\t156653\n亲工\t156654\n伤缝\t156655\n一两种\t156656\n160场\t156657\nrase\t156658\n413853\t156659\n花容\t156660\n金刚不坏\t156661\nbuf\t156662\n花家\t156663\n唐冬梅\t156664\n腻害\t
156665\n楞状凸起物\t156666\nbug\t156667\n列强架\t156668\n101561\t156669\n欧米\t156670\n如履薄冰\t156671\nCNNm\t156672\n895512124\t156673\n寄宿\t156674\nliantian\t156675\n闹了\t156676\ndeie\t156677\n1月13号\t156678\n095\t156679\n三百种\t156680\n后天软件公司\t156681\n谷文娜\t156682\n乐居\t156683\n圆皇\t156684\n罗雨桐\t156685\n零二二八六二八二三零八\t156686\n小部件八一还我师傅我叫\t156687\n打飞机\t156688\n静脉注射服药\t156689\n記卡\t156690\n來說\t156691\n乐山\t156692\n员工费\t156693\n黄绿猫\t156694\n古南\t156695\n虚弱\t156696\n话叨\t156697\n李晟敏\t156698\n芦笋鸡蛋羹\t156699\n我是女人我爱你\t156700\nMBC台11号播\t156701\n差遣\t156702\n全椒乌\t156703\n跟事\t156704\n失业金\t156705\n怕哈\t156706\ngdhridjndey\t156707\n180多\t156708\n果宝特\t156709\n我的秘密是我喜欢你\t156710\n王雅思\t156711\n跟亚\t156712\n快点儿\t156713\n包工\t156714\n抱窝窝\t156715\n19：00至20：00\t156716\n灰姬\t156717\n特异性\t156718\n郭科志\t156719\nu12999元啵\t156720\n十一百一十五点\t156721\n跟亲\t156722\n新凤\t156723\n落倾天下\t156724\n生小孩\t156725\n180天\t156726\n佛恩\t156727\n22块\t156728\n一妻\t156729\n心累\t156730\n良辰多\t156731\n偷空\t156732\nhellomyd\t156733\n头把\t156734\n真的很爱\t156735\n老子不和你一般见识\t156736\n看不没有\t156737\n旺雨婷\t156738\n哎呀表\t156739\n稿基哈\t156740\n大舅子\t156741\n薛城\t156742\n哄叫\t156743\ncftff\t156744\n哎呀街\t156745\n一妞\t156746\n李淑荣\t156747\n我子\t156748\n3303924047844\t156749\n凯子\t156750\n李宇恒\t156751\n一妃\t156752\n防癌\t156753\n七家89105\t156754\nGilsttt\t156755\n吸进\t156756\n54556484567851434758473847644375497455746\t156757\n明天下午三点\t156758\n底跟\t156759\n内战\t156760\n8744\t156761\n冷喵\t156762\nMusic\t156763\n上城\t156764\necvnhhfh\t156765\n沦落\t156766\n冷喝\t156767\n我的一念\t156768\n工整上联子夜鼠欢爆竹乐下联门庭燕舞\t156769\n保时捷卡\t156770\n收交\t156771\nn答\t156772\n顺从\t156773\n赌命你头\t156774\nyan\t156775\nslowly\t156776\n孙骁骁\t156777\n大变样\t156778\n6322563\t156779\n汉寿\t156780\n翻盖\t156781\n钢琴课\t156782\n十五来岁\t156783\n苦海里土肥\t156784\n推出\t156785\n缓我\t156786\nhytd\t156787\nhytg\t156788\n巴子度秘\t156789\n时效性\t156790\nNandi\t156791\n中国美术美院\t156792\n舞台\t156793\n三魔仙\t156794\n老大名\t156795\n有钱允\t156796\n社区化\t156797\n这么人\t156798\n制图\t156799\n美国队长\t156800\n风风吹雨橙花不上班的吗梦幻\t156801\n不要脸臭小子不要脸臭\t156802\n1韦\t156803\n香港麦当劳\t156804\n盖座\t1568
05\n慕蓉\t156806\n王源文\t156807\n下走\t156808\n婊子女\t156809\n王二狗\t156810\n寻\t156811\ngyts\t156812\n六呢度秘\t156813\nredf\t156814\n新历\t156815\n太认真\t156816\n7300天\t156817\n6克\t156818\n红薯拿铁\\^O^/\t156819\n李姝含\t156820\n货3\t156821\n秘待会儿\t156822\n紧谨\t156823\n讲面子\t156824\nfidy\t156825\n体感\t156826\n星期六\t156827\n警觉\t156828\n瓦档\t156829\n8回\t156830\nfidk\t156831\nvohY\t156832\n珉饭\t156833\nghiphotosbaiducomxiaodupicitemcaef76094b36acaf5057528f7bd98d1001e99c0ajpg\t156834\n墨玉\t156835\nFANAGAN\t156836\n手胸\t156837\nMalone\t156838\n头径\t156839\n会调\t156840\n沈阳\t156841\n苦苦菜花\t156842\n相似度\t156843\n垣色\t156844\n会谈\t156845\n1820米\t156846\n万福\t156847\n12月份1月份\t156848\n0100010111110101001011\t156849\n幸福的秘密\t156850\n红蓝波\t156851\n30000美元\t156852\n久久久久\t156853\n蝇营\t156854\n37373772\t156855\n手背\t156856\n贵校\t156857\n木兰词\t156858\ncbnki\t156859\n恩平局\t156860\n场长\t156861\n想试试\t156862\n十三个月\t156863\n嗯登\t156864\n卡马乔\t156865\n丽丽啦\t156866\n脱落\t156867\n给谁\t156868\n嗯白\t156869\n业哥母本\t156870\n天狼星座\t156871\n实验学校\t156872\n井井井\t156873\n100000岁\t156874\n傍大款\t156875\n顾问团\t156876\nn438\t156877\n打进住\t156878\n元那\t156879\n放鞭炮\t156880\n答辩\t156881\n排骨鸡\t156882\n杨总有\t156883\n愿得一\t156884\n见贤思齐焉\t156885\n太幸福\t156886\n两年来\t156887\nFhfdhvjdjxbox\t156888\n济南公交电车\t156889\n一百二十块\t156890\n傻傻傻傻傻夫\t156891\n小舅子\t156892\nReid\t156893\n很忙\t156894\n检讨书\t156895\n海马平台\t156896\n真正骨\t156897\n奇兵\t156898\n奥拓仔\t156899\n奇克\t156900\nruivgh\t156901\ngrf\t156902\n很快\t156903\n90'S\t156904\n上市许可申请人\t156905\n张芷微\t156906\n置顶\t156907\n简嘉琪\t156908\n别克吧\t156909\n希story\t156910\n山级\t156911\n高静怡\t156912\n遠\t156913\n学僧\t156914\n吴师傅\t156915\n一起走手牵手\t156916\n春晚呗\t156917\n我知不道\t156918\n混场\t156919\n德基\t156920\n度秘秘生\t156921\n书面\t156922\n中超联赛\t156923\n老妹\t156924\n一80年代\t156925\nmhghpu\t156926\n商用冰箱\t156927\nghufb\t156928\n你好黑\t156929\nghufu\t156930\n激情戏\t156931\n王明洋\t156932\n军方\t156933\n咱们村\t156934\n一批次\t156935\n绪明\t156936\n金岳霖\t156937\n西半球\t156938\ntousob\t156939\n指甲豉\t156940\n赵建权\t156941\n陈寨花卉市场B区\t156942\n永桂\t156943\n蒙纳士大学\t156944\n只用\t156945\n皮拉尸\t156946\n三六生病\t1569
47\n英熙犬还有传奇卡hello梅主题曲还有我的\t156948\n223456\t156949\n肃宁\t156950\n卯年\t156951\n熊出没你给我找到穿越对对碰\t156952\nkkyc\t156953\n出具柜\t156954\n客户源\t156955\n嘻哈闯世界\t156956\npuppies\t156957\nb座\t156958\n增进\t156959\n池方\t156960\npastries\t156961\n讲话楼走\t156962\n关羽\t156963\n痛不痛\t156964\n18.61\t156965\n帐余额盈\t156966\n二十五四十百吨\t156967\n渊源\t156968\n啦啦啦啦啦啦行打鸣的小阴\t156969\n孔陈\t156970\n54股\t156971\n其他园\t156972\n体温计\t156973\n拯救\t156974\n春挺月\t156975\n程紫怡\t156976\n15835244194\t156977\n在梦里\t156978\n月圆\t156979\n极点度\t156980\nribi\t156981\n刘瑞泽\t156982\n贾昕怡\t156983\n养生生\t156984\n鸡畦\t156985\n两三十一号\t156986\n一对面一个\t156987\n死胖子臭胖子女\t156988\n李小尤\t156989\n大埔客都影院\t156990\n111111111111122\t156991\n呸臭\t156992\nTDKR\t156993\n九中\t156994\n共和国\t156995\n武侠类\t156996\n6月9日\t156997\n8663655\t156998\n撒地\t156999\n诱僧\t157000\n娜塔莉·波特曼\t157001\nrgguyf\t157002\n能久\t157003\n波形\t157004\n九世\t157005\n六月三十日\t157006\n郭恒\t157007\n两大袋\t157008\n你好想\t157009\n你好惨\t157010\n虍羚尐溪缑怎给给尽進迸\t157011\n九下\t157012\n骨膜\t157013\n九万\t157014\n假餐\t157015\n半秒\t157016\n九七\t157017\nq235\t157018\n江村经济\t157019\ncvzv\t157020\n人握\t157021\ncvzx\t157022\n头烫\t157023\n山高柴\t157024\n120412049年\t157025\nMSN\t157026\nMSM\t157027\n咔擦\t157028\n心情好呀\t157029\n天哥秒\t157030\nabcderta\t157031\n664797616164678187618454646164677666475574986644655740401465665594686319485656\t157032\n雪肌\t157033\n水色\t157034\n抑或\t157035\n田家明\t157036\n加上座\t157037\nvacation\t157038\n体面\t157039\n龙安琪\t157040\n宫崎骏\t157041\n农乒\t157042\n花雪\t157043\n博哥\t157044\n打伴\t157045\n牛肉丸\t157046\n月子\t157047\n贵主\t157048\n爱恋爱\t157049\n一什丁\t157050\nvvcvvfggchd\t157051\n玉女xi\t157052\n好聊天\t157053\n克里特\t157054\n红漆\t157055\n草莓草莓\t157056\n贺广科\t157057\nthhgfgfgb\t157058\nblx\t157059\n敢死队\t157060\n张晨爽\t157061\ndgvzqi13589\t157062\n螣\t157063\n惊跳\t157064\n行走\t157065\n猪母度秘\t157066\nHCNJVL\t157067\n脸树皮\t157068\n以为你在\t157069\n地点\t157070\n好伙伴郎\t157071\n56度\t157072\n已婚\t157073\n红太郎\t157074\n珍藏版\t157075\n江流\t157076\n三全炫彩\t157077\n江浙\t157078\n新地震周期\t157079\n知识性\t157080\n四十年\t157081\n吕沅汾\t157082\n367059479\t157083\nk559\t157084\n11号\t15708
5\n木兰天池\t157086\n千峰万们\t157087\n1209676177\t157088\n来来在\t157089\n刚果\t157090\n谢谢好嘞回见\t157091\n一个人走\t157092\n方字\t157093\n5时50分\t157094\njefegrh\t157095\n书图书馆\t157096\n格俩格\t157097\n方子\t157098\n一多一点\t157099\n庄心妍\t157100\n故弄玄虚\t157101\n姚楚嫣\t157102\n谈定\t157103\n搞毛\t157104\n章草\t157105\n那度秘\t157106\n阿朱迅\t157107\n文绉绉\t157108\n55555522222222000000000\t157109\n22222222222222222222222222222222\t157110\n拉撒\t157111\n满盘皆墨\t157112\n1866www\t157113\n戴欣桐\t157114\n应澈\t157115\n喜辽\t157116\n姓李名叫\t157117\n美我和你\t157118\n收题\t157119\n休腐\t157120\n数多一四\t157121\n三星3toungababy\t157122\nCinema\t157123\n王淑芳\t157124\n10k\t157125\n10n\t157126\n大港\t157127\nzstj\t157128\n嘻嘻累\t157129\n原谅我我真\t157130\n想你走\t157131\n归去\t157132\n大游\t157133\n叫头酸\t157134\n经池\t157135\n10p\t157136\n闽南话\t157137\n菌群\t157138\n无花果树\t157139\n10t\t157140\n五十斤\t157141\n10J\t157142\n10K\t157143\n糸度秘\t157144\n九个月\t157145\n许王片\t157146\n7奌\t157147\n公牛主场联合中心\t157148\n闽南语\t157149\n怒龙\t157150\n巴黎圣母院\t157151\n多咪多咪尼\t157152\n大清\t157153\nhaze\t157154\n忘掉\t157155\n情欲\t157156\n黑勋爵Darth\t157157\n京科\t157158\n衣服源\t157159\n说的是爱不爱你我讨厌那你\t157160\n85588424182251\t157161\n7套\t157162\n京秘\t157163\n生活资料\t157164\n唐元\t157165\n咪卑\t157166\n邳州\t157167\n蓝冰\t157168\n佐孩子\t157169\n惜售\t157170\n10%\t157171\n一七余万\t157172\n109\t157173\n好比\t157174\n風風雨雨染發膏\t157175\n102\t157176\n103\t157177\n100\t157178\n101\t157179\nBuxtonk\t157180\n107\t157181\n104\t157182\n105\t157183\n尿性\t157184\n第1度\t157185\n填出\t157186\n重庆海外旅业集团\t157187\n融\t157188\n赛末点\t157189\n别闹丨\t157190\n15905263993\t157191\n雷尔麻木嘞\t157192\n配有\t157193\ncredit\t157194\n绝对外\t157195\n藝\t157196\n配服\t157197\n唐兵\t157198\ntisyou\t157199\n行吧鸡翅\t157200\n伢子\t157201\n鸿茅药酒\t157202\n冒昧的问一句你是男\t157203\n一八一\t157204\n舍得再见\t157205\n度秘度秘你真漂亮的秘度秘你真\t157206\ndENIZEN\t157207\n抢票\t157208\n纽交所\t157209\n敌情\t157210\n回味\t157211\n灾片\t157212\n乖别\t157213\n亲一口亲\t157214\n炮夫人\t157215\n黄打扮\t157216\n尼比赛睿\t157217\n一八个\t157218\n1378998088442\t157219\n两字\t157220\n4日下午\t157221\n陈金兵\t157222\n保不住\t157223\n10日凌晨0时\t157224\nRudjrhdu\t157225\n冬枣\t157226\n卡耐基
\t157227\n伏尔加河\t157228\n屁用\t157229\ntraining\t157230\n亚麻系列家居\t157231\n说空\t157232\n仪征\t157233\n度秘真\t157234\n咪呢度站\t157235\n春箬笠\t157236\n6亿QQ\t157237\n甜心\t157238\n台州音乐台\t157239\nnenecsmp\t157240\n河虾\t157241\n应邀\t157242\n杨锅盔\t157243\n凉拌炒鸡蛋\t157244\n小文乐\t157245\nhhhhhhhhhyyyhyyh\t157246\n王建\t157247\n押宝\t157248\n新飞驰\t157249\n中日韩\t157250\n为你倒\t157251\n7842562\t157252\n你好不好你不好我不好大家都不好就是你惹的祸\t157253\n荷干\t157254\n2011年9月\t157255\n五头\t157256\n藥\t157257\n四四早上\t157258\n假吃\t157259\n天底下\t157260\n假名\t157261\n五耐\t157262\n七七六四\t157263\n盲僧\t157264\n交频\t157265\n连吃\t157266\n五大\t157267\n假吐\t157268\nMOVADO\t157269\n房朋朋\t157270\n2一斤\t157271\n猪吧听不懂我的话\t157272\n9号\t157273\n五三二六\t157274\n胃菜\t157275\n红楼梦里拜天仙\t157276\n董事局\t157277\n困漫\t157278\n德玛西亚\t157279\n一盒\t157280\n五多\t157281\n这来\t157282\n排洪\t157283\n菜车\t157284\n叫灵\t157285\n王鑫\t157286\nf0rK\t157287\n仙姐\t157288\n仙姑\t157289\n天下单身\t157290\n十来\t157291\n丹修\t157292\n原道\t157293\n八六零二六五幺零七五\t157294\n去开\t157295\n刘穆\t157296\n重达\t157297\n巴彦红格\t157298\n密藏\t157299\n不要再问\t157300\n口材\t157301\n由是\t157302\naass\t157303\n逗比\t157304\n吗眼镜\t157305\n向前看\t157306\n设有\t157307\n好呀好呀你快点唱吧\t157308\n思咪\t157309\n百遍\t157310\n151558866\t157311\n24页\t157312\n激将\t157313\n告诉你吧再结婚\t157314\n遥控器\t157315\n臭度秘我爱你\t157316\n0贝\t157317\n古埃贵\t157318\n噜噜噜噜噜噜噜噜\t157319\n恩才分也\t157320\n超巿\t157321\n修儿\t157322\nabcdefghijklmnopqrstuvwxyzxyznucicansinmeabc\t157323\n你是我的人你是我你的人牛\t157324\n孤魂野\t157325\n长隆门\t157326\n陈光永\t157327\n元谋人\t157328\n高倩楠\t157329\n97204197\t157330\n华夏基金\t157331\n73个\t157332\n必考\t157333\nPRI\t157334\n抛尸\t157335\n15159929921\t157336\n擦一擦\t157337\n111嗯1111111111111\t157338\n姜涵钰\t157339\n一大点\t157340\n0败\t157341\n老骥\t157342\n老骨\t157343\n绫辻行人\t157344\n448518750555665544\t157345\n全真\t157346\n最爱你\t157347\n米洛哥\t157348\n老照片\t157349\n记得主\t157350\n盆栽\t157351\n说会话吧\t157352\n快递四百里加英雄向上给我\t157353\n别发图了说的话\t157354\n傻b\t157355\n少说\t157356\n弹劾\t157357\n日B\t157358\n蜜雪儿\t157359\n木有\t157360\n常住\t157361\n首帝都\t157362\n全省\t157363\n指向\t157364\nthenk\t157365\n焦慮\t157366\n佛牙\t157367\n底层\t157368\n我爱
你爱且深爱\t157369\n孕妈们\t157370\n木朵\t157371\n张曼硕\t157372\n献县\t157373\n惹我啦\t157374\n这麽\t157375\n凉风\t157376\n超细\t157377\n夏晨曦\t157378\n弹力\t157379\n刮刮刮\t157380\n寿辰\t157381\n木木\t157382\n#学社交学口才#\t157383\n金星片\t157384\n直线\t157385\n徐冰洁\t157386\n艾生\t157387\nawrg\t157388\n周二下午4点\t157389\n多角度\t157390\n吧张\t157391\n拂罗里吧嗦\t157392\n忆以\t157393\n佳彩\t157394\n张新疆\t157395\n胡编驴叫\t157396\n淘秘\t157397\n口水战\t157398\n人民政府\t157399\n早知道\t157400\n欧巴帅\t157401\n菊花状\t157402\n润湿\t157403\n大肠癌\t157404\n草莓牛奶\t157405\n李琪琪\t157406\n单性肥\t157407\n虎头蛇尾\t157408\ndffgjffjtn\t157409\n壁咚\t157410\n土城两岸柳枝依，\t157411\n长青韩党党\t157412\n森多利\t157413\n老弱病残幼\t157414\n鸡肉歙\t157415\n新华百货\t157416\n顽症\t157417\n刘曦\t157418\npqilasture\t157419\n29万\t157420\nefvuvecgcdhdcycdecgcfdvhvrrghcdegthdrfg\t157421\nfehnflv\t157422\n地动仪\t157423\n刘彤彤\t157424\n照常\t157425\n1593572468428465565\t157426\n小虫\t157427\nseeyou\t157428\n29个\t157429\n顿时\t157430\n装奥\t157431\n我喜欢维包子\t157432\n晨光\t157433\n12000\t157434\n瓷肌\t157435\n装好\t157436\n陈天茹\t157437\n石团长\t157438\n马卡龙\t157439\n马拉松赛\t157440\n华利山\t157441\nsisuf\t157442\n找寻秀\t157443\n我真的很爱你小猪机器人和我交往吗晓得机器人爱你求你了我送你\t157444\n#OL时尚#\t157445\n智能聊天机器人\t157446\n贤群\t157447\n张加乐\t157448\n杨三种\t157449\n产期\t157450\n土因火\t157451\nhellomytlets\t157452\n夜间\t157453\niejxiwnxoemxowxmowxjwixniwxjwoxjowjxow\t157454\n帕尼\t157455\n注定\t157456\n8562462\t157457\n一朵样\t157458\n科特迪瓦\t157459\n安石\t157460\n大酱\t157461\nkwkr\t157462\n眼罩\t157463\n拼错\t157464\nRMB\t157465\n数千家\t157466\n审查员\t157467\n充气泵塞\t157468\n驯龙之\t157469\n灵光寺\t157470\n此谓\t157471\n大酥\t157472\n44788\t157473\nweddings\t157474\n我是中国人我听不懂你说的话\t157475\n舅妈\t157476\n林把虚泡\t157477\n崔阳\t157478\n乐乐\t157479\n五十九五十八五十七\t157480\n20多年前\t157481\n史进放\t157482\n好啊好啦呱\t157483\n十二一百二十多\t157484\n叫长\t157485\n膳食补充剂\t157486\n死下\t157487\n午餐基金\t157488\n死丁\t157489\n缸体重金\t157490\n宣传期\t157491\n仨月\t157492\n死丝\t157493\n国土局\t157494\n劝酒\t157495\n赵富鑫\t157496\nmini\t157497\n吴福军\t157498\n动感单车\t157499\nmina\t157500\nmind\t157501\n抱着我\t157502\n生根粉\t157503\n照样子\t157504\n上钱\t157505\n鱼咀\t157506\nhdjf\t157507\n女鹅\t157
508\n白水洋镇\t157509\n志玲\t157510\nhdjc\t157511\n庄园\t157512\n几道啦我爱你\t157513\n名值\t157514\n瓦房\t157515\n李宁台\t157516\n最后一张牌\t157517\n哈哈镜+美酒\t157518\n六开衣\t157519\n8600\t157520\n39.27\t157521\n跌跤\t157522\nIFwell\t157523\n上钩\t157524\n香味儿\t157525\n李岩\t157526\n爱你\t157527\n煜晖\t157528\nregular\t157529\n卡不好玩\t157530\n77点半\t157531\n刘师大\t157532\n龙椅\t157533\n钟心妍\t157534\n看待\t157535\n照片馆\t157536\nhhijgyugyyhuyttuiytujrbhitykhhgjbghhj\t157537\narekiddingme\t157538\nghvxd\t157539\n国富民贫\t157540\n辛鸿玉\t157541\n地基\t157542\n疆土\t157543\n余健龙\t157544\n赶脚王彪彪\t157545\n结婚吧你爱他吧我不爱你\t157546\n太原省\t157547\n王亮\t157548\n针性\t157549\n王京\t157550\nliyaxuan\t157551\n穿袄\t157552\nmtbb\t157553\n春暖\t157554\n王人\t157555\n来吧阴\t157556\nｌｏｖｅ\t157557\nastowhythetmseed\t157558\n人不人\t157559\n崔郑源\t157560\n好呢度秘\t157561\n下洋村\t157562\n20162月一日\t157563\nIFULYOU\t157564\n斗笑\t157565\n感怀\t157566\n啪啪啪咧\t157567\nbbbbbbbbabc\t157568\n平等\t157569\n王二\t157570\n袁艺晨\t157571\n随给\t157572\n急事\t157573\n东南西北中\t157574\n王了\t157575\n姜昆\t157576\nFree\t157577\n急于\t157578\n王亚\t157579\ntgffhy67y756\t157580\n受连累\t157581\n感性\t157582\n杨志刚\t157583\n王云\t157584\nh管\t157585\n王五\t157586\n王继光\t157587\n刘晓玲\t157588\n何涛\t157589\n柱石\t157590\n黄糖\t157591\n源艺\t157592\n男鸟\t157593\n问件事\t157594\n30万名\t157595\n鱼跃\t157596\n宝石红\t157597\n东北姑娘\t157598\n徐汇\t157599\nlo4氧点\t157600\n作花\t157601\n一百万分之一\t157602\n珍珠奶茶\t157603\n痴痴\t157604\n上月25号\t157605\n促织夜深篱落一灯明\t157606\n优理\t157607\n朱文静\t157608\n许伊\t157609\n963856\t157610\n雏菊\t157611\n粗有细\t157612\n两点半小时\t157613\n大觉寺\t157614\n叫见\t157615\n曾利花儿\t157616\n咽喉炎\t157617\n看世界\t157618\n38码\t157619\n汉族日办\t157620\n东海\t157621\n哎呀小学\t157622\n唉奇\t157623\n我不开心度秘\t157624\nfpfp\t157625\n惹我\t157626\n东浩\t157627\nPHOTO&\t157628\nchiphotosbaiducomxiaodupicitembf096b63f6246b600e39b801ecf81a4c510fa28bjpg\t157629\n吃醋\t157630\nControlled\t157631\n很满意\t157632\nficugchfzyh\t157633\n东流\t157634\n录制\t157635\n小超人\t157636\nB0Y\t157637\n声睛\t157638\n里面的哪哪哪哪gogo女孩\t157639\n买买买买买买买\t157640\n鸽记航空公司\t157641\n战魂\t157642\n连图图\t157643\n热头\t157644\n经上\t157645\n代收\t1576
46\n不再孤单\t157647\n伊曼纽尔·温加罗\t157648\n科研\t157649\n梁文道\t157650\n姜天爱\t157651\n入春\t157652\n入昧\t157653\n窝妹妹好惨\t157654\n左爱\t157655\n热天\t157656\nchchg\t157657\n庾澄庆\t157658\n林楚方\t157659\n真美我不要美女我要帅哥\t157660\n杨越\t157661\n陶瓷罐\t157662\n淀粉\t157663\n杨超\t157664\nl不懂什么故事的说人话\t157665\n咪咪色\t157666\nryyuhdccxcb。zg8ooyghhjjkqeryuolz\t157667\n男度\t157668\n18872243919\t157669\nCLOSED\t157670\n吃喝玩乐\t157671\n我有意异界我讨厌你一只度秘我讨厌你度秘我讨厌你\t157672\n给我聊了再见再见再见再见再见再见再见再见再见再见再见再见再见再见再见再见\t157673\n枪战\t157674\n那一年到头\t157675\n捕捞\t157676\nTVB97\t157677\n犬舍\t157678\n鸡犬升天\t157679\n甲氧基色胺\t157680\n湖南地区\t157681\n以然在\t157682\nNikeSportswear\t157683\n过会儿\t157684\n明月光疑\t157685\n前奏曲\t157686\n灰玫瑰回不回会玫瑰\t157687\n令妃\t157688\n高速公路\t157689\n二日游\t157690\nLavigne\t157691\n劲字格\t157692\n石佳鹭\t157693\n脱离\t157694\n红霞\t157695\n刘梦生\t157696\n快恭\t157697\n山光\t157698\n关爱\t157699\n直棒\t157700\n战绩\t157701\nsab\t157702\n希希澈底\t157703\nhttpchiphotosbaiducomxiaodupicitem7c1ed21b0ef41bd53f47e18156da81cb39db3daajpg\t157704\n从教\t157705\n無聊\t157706\n运动员\t157707\n小麻烦\t157708\n愛惜\t157709\n孔冉冉\t157710\n若彤\t157711\n打火机\t157712\n哦孙\t157713\n天太\t157714\n天天\t157715\n天大\t157716\n继母\t157717\n孩子们\t157718\n广州站\t157719\n李佳琪\t157720\n2998位\t157721\nGhvbvbv\t157722\n下一年后\t157723\n天外\t157724\n娄黔辉\t157725\n516月\t157726\n干箖\t157727\nexq斯密\t157728\n葵酱\t157729\n罗莱\t157730\n干箝\t157731\n暴警\t157732\n片师\t157733\n牛培文\t157734\n客商\t157735\n赢家\t157736\n2.6折包\t157737\n马佳霖\t157738\n度秘衰\t157739\n過份\t157740\n玛卡片\t157741\n1284387855675\t157742\n第二名\t157743\n揍他涯\t157744\n敢凶我\t157745\n舢板\t157746\n程绍平\t157747\n老花钱\t157748\n白佳伟\t157749\n一二岁\t157750\n毙命者\t157751\n赢定\t157752\n唐玲\t157753\n吃饭时\t157754\nhrrfdbzz\t157755\n死大大\t157756\n暴民\t157757\n阵痛\t157758\n赌债\t157759\n里海头\t157760\n余辉\t157761\nNahbam\t157762\n湖南衛視\t157763\nkiy\t157764\n驴皮\t157765\ndgxdgadf\t157766\nkis\t157767\n嘀咕\t157768\n唯品\t157769\nkiu\t157770\nkit\t157771\n0008528536369\t157772\nkii\t157773\nkin\t157774\nkim\t157775\n不一定\t157776\nkig\t157777\nkif\t157778\nkid\t157779\n错错错笑\t157780\n蛋皮\t157781\n维空\t157782\n闲田\t15
7783\n出声\t157784\nfhehjjf\t157785\n闲画\t157786\nmbfwzdu\t157787\n搜刮\t157788\n醉拳歌\t157789\n26唐\t157790\n蒋超\t157791\n死大夫\t157792\n红船精神\t157793\n捕鲸\t157794\n啦啦啦我是爸爸的小花仙\t157795\n38分之七\t157796\n菲尼士\t157797\n超级大花钻\t157798\n澳网\t157799\n好吧我懂了你是可男可女\t157800\n法新社\t157801\n褒姒\t157802\n零3个月\t157803\n管子琰\t157804\n一天三顿\t157805\n呢大哥\t157806\n远期\t157807\n夜会美#谁动了我的琴弦唤我到窗前/流水浮舟\t157808\naVF\t157809\n没我忙\t157810\n趣语\t157811\n客串照\t157812\n静文\t157813\n度秘我想和你聊会天\t157814\n田帮荣\t157815\n众兴\t157816\n先辈\t157817\n众公\t157818\n肉龙包\t157819\n摸觉\t157820\nu虚浮\t157821\ngfryv\t157822\n开学检票\t157823\n品管\t157824\n光头强\t157825\nndfbvcoddBBC\t157826\n照明家\t157827\n原籽沐\t157828\n疯闹\t157829\n能局\t157830\ncoaadyou\t157831\nwanting\t157832\n早期\t157833\n早朝\t157834\n5月2号\t157835\n盘头\t157836\nOKOKokOK\t157837\n木音\t157838\n宁句城际轨道\t157839\n说音\t157840\n剪毛毛\t157841\n蒙城县\t157842\n赵胤胤\t157843\n扫描\t157844\n想你了爱\t157845\n齐西林壁\t157846\n碰還凝聚力開年利是呼嚕狼族少年喲莫言\t157847\nPrince\t157848\n妇兆\t157849\n老爹\t157850\nushbsbvvccxddghbxfhkllllllkklkjhgggg\t157851\n人糕\t157852\n朗奔\t157853\n软语\t157854\n把酒问\t157855\nuiggi\t157856\n有图有真相^\t157857\nthghghhgy\t157858\n变态机器人\t157859\n顾头未顾尾\t157860\n一直好\t157861\n艾克\t157862\n表里不一\t157863\n江雨芯\t157864\n张献生\t157865\n不是没我好看是没我家机器人\t157866\n秀兰\t157867\n画作\t157868\n癫婆\t157869\n西木大陆\t157870\n薄告\t157871\n滚压\t157872\n单人间\t157873\n曲曲曲曲去去厕所\t157874\n南无普陀山\t157875\n一百三十多\t157876\n无碍\t157877\n茹茹茹茹\t157878\n度秘度秘度秘度云\t157879\n列为\t157880\n狗狗场\t157881\n五六月\t157882\n晾衫\t157883\n万能菊\t157884\npopular\t157885\n【梦十胜\t157886\need\t157887\n博比\t157888\n猫蓝\t157889\n1525258985\t157890\n蔡征宇\t157891\nagame\t157892\n博美狗\t157893\njrnlfu\t157894\n艾薇儿\t157895\n打鼓\t157896\n花蟹\t157897\nRadisson\t157898\n猪八戒网\t157899\n飞撒\t157900\n大萌\t157901\n醉如痴\t157902\n安文\t157903\n张翠华\t157904\n姜萌\t157905\n姜萍\t157906\n镂刻\t157907\n葡萄葡萄\t157908\njdmdidn\t157909\n双枪\t157910\n第5回\t157911\n贺爽\t157912\n打吊\t157913\n事关\t157914\n痘坑\t157915\n贺爷\t157916\n度秘度秘乖\t157917\n福音\t157918\n哪个儿\t157919\n浪痛\t157920\nknee\t157921\n救生圈\t157922\n第几种\t157923\n我一面\t157924\nghhhf
gxg\t157925\n第六次\t157926\n敏虹\t157927\n喂喂喂你是谁你是\t157928\n床片\t157929\n我的反应\t157930\n度秘臭绿秘\t157931\n１８０公里\t157932\n五十八块\t157933\n桂林路\t157934\nmp3mp4\t157935\n廣播影視周報\t157936\ntgfzgfgg\t157937\n事先\t157938\n4771次\t157939\n贱息\t157940\n事元\t157941\n双林\t157942\nhellokt\t157943\n你的侠\t157944\n再吃\t157945\n克什米尔\t157946\n10月24日晚上八点\t157947\n发脾气\t157948\n画树\t157949\n00。00\t157950\n晚上十二点\t157951\nramad\t157952\n674分\t157953\n秘冷\t157954\n胡逸凡\t157955\n回不回话\t157956\n软性\t157957\n哎呀呀呀呀呀呀呀\t157958\nMidautumn\t157959\n孑乖\t157960\nvtufcgvFfv\t157961\n去角质\t157962\n孤王\t157963\n志同道\t157964\nViVOX6全网通\t157965\n乱草\t157966\n周笔畅\t157967\n芭芭\t157968\n意味看\t157969\nHguuvjcck\t157970\n几下子\t157971\nVhmgh\t157972\n笑死我了我不想和度秘\t157973\nCghhfeetyu\t157974\n曼多\t157975\n授权\t157976\n1947年3月\t157977\n易错音\t157978\n啪希里啪啪\t157979\n声明\t157980\n1593544853\t157981\nmaiki\t157982\n凝块\t157983\n李菁\t157984\n登机\t157985\n多燕\t157986\n看我的呀呀呀呀\t157987\n铁片子\t157988\n十四日\t157989\n东德\t157990\nzhong\t157991\n太辛苦\t157992\nAndrew\t157993\n宋悦\t157994\n涵爷\t157995\n天乙\t157996\n25块\t157997\n照片块\t157998\nvbdbdbd\t157999\n贤赫\t158000\n先在\t158001\n嘛嘛嘛玩意\t158002\n单网\t158003\n黄老\t158004\n免收\t158005\n花生米\t158006\n姨姨\t158007\n小冉冉\t158008\n秘秘乖\t158009\n那个别\t158010\n没空\t158011\n亚航\t158012\nwaitor\t158013\n叶俊豪\t158014\n好话题\t158015\n绸缎\t158016\n胸霸\t158017\n喆喆\t158018\n丁彦景\t158019\n152315346\t158020\nzghzho\t158021\n3339871946\t158022\n埃莎苹果\t158023\n刘蔓萨\t158024\n信捷者\t158025\n蓝星\t158026\n13.6%\t158027\n慢慢人生\t158028\n去味\t158029\n秘秘书\t158030\n巨溪\t158031\n崔珉豪\t158032\n百合花C\t158033\n任荷\t158034\nsjbnddk\t158035\neee111111111\t158036\n可怕儿\t158037\n5525425555\t158038\n相信你是\t158039\n瞟\t158040\n2806点\t158041\n在家\t158042\nAMG\t158043\n深挖\t158044\n秘誓\t158045\nyouter\t158046\n二你\t158047\n水管儿\t158048\n有别说话\t158049\n桃心儿\t158050\n橱窗\t158051\n就好哈哈哈\t158052\n玛西亚\t158053\n7月23号\t158054\n老虎老撸a\t158055\n好心酸/\t158056\n殷店乡\t158057\n播报\t158058\n赵新婷\t158059\n五百五十五零零\t158060\n比利时\t158061\n郭一念\t158062\n一千七百九十二十六四\t158063\n哼唧\t158064\nDVDreferHGH\t158065\n3431312112212135\t1
58066\n我就是说话我是魔法书\t158067\n盟友\t158068\n达人天\t158069\n花篮\t158070\n书香食尸家\t158071\nSyg\t158072\n侦破\t158073\n哼唱\t158074\n傳播\t158075\nokokokok\t158076\n4GB\t158077\n赵传奇\t158078\n一千两千\t158079\n100页\t158080\n没没有了\t158081\nofidsuu\t158082\n1440x900pt\t158083\n倚强者\t158084\n韦德图\t158085\n爱一辈子\t158086\n北京司法局\t158087\n世子妃杀人事件\t158088\neachof\t158089\n见怪不怪\t158090\n展露\t158091\n猪猪猪猪猪猪猪猪猪猪猪你是猪猪猪\t158092\n张凯飞\t158093\n白丈子村西沟\t158094\n182度\t158095\n搞基你会\t158096\n德玛亚\t158097\n家明\t158098\n5岁\t158099\n鬼山路嗯嗯奥特曼\t158100\n果只狗\t158101\n连枝共冢\t158102\nmj\t158103\n谢万红\t158104\n主动脉\t158105\nDegfs\t158106\nmu\t158107\nkilove\t158108\n胡依楠\t158109\n一首情歌\t158110\n巨棒\t158111\n天生的意思\t158112\nuqau5840558584\t158113\naddd\t158114\n星美\t158115\n仁慈\t158116\n给我单\t158117\n10666元\t158118\n该市\t158119\n美妆秀#\t158120\n首听\t158121\n多大胆\t158122\n阿蜜丶\t158123\n来我子\t158124\n非诚勿扰\t158125\n仔细\t158126\n稻盛和夫\t158127\n每隔\t158128\n杨扬\t158129\nmaklo\t158130\n后天海天学校\t158131\n我的小秘书小呀小秘书\t158132\n含蓄法\t158133\n容忍\t158134\n刀剑笑话我叫什么么么么么么么么么么么么么\t158135\n张成悦\t158136\n动没有\t158137\n夏俊峰\t158138\n疗程\t158139\n第八条\t158140\n18453162200\t158141\n哪此\t158142\n赵个相\t158143\n好聪明\t158144\n棉花\t158145\n洞口\t158146\nmpwp\t158147\n淑电脑\t158148\n上四年\t158149\n李豪撒\t158150\n逆淘汰\t158151\n可走\t158152\n绵羊莉\t158153\n犀利\t158154\n康伟六\t158155\n德隆\t158156\n面相\t158157\n烟灰\t158158\n中国歌\t158159\n革非\t158160\ntJI\t158161\n面目\t158162\n杭州绿城\t158163\n转暖\t158164\n我叫方\t158165\n唉我讨厌你我讨厌你我讨厌你你不要说我的秘书了我讨厌你我讨厌你我讨厌你也不要做我的密度\t158166\n断言\t158167\n烟火\t158168\n筋斗\t158169\n每句话\t158170\n这世界\t158171\n隐秘\t158172\n白白奇楠\t158173\n指导员\t158174\n隐私\t158175\n土陶瓶\t158176\n魏小姐\t158177\n56284455645\t158178\n刷牙\t158179\n天赋\t158180\n武进区\t158181\nbgg\t158182\n极度\t158183\n天资\t158184\n三星电子\t158185\n口算题\t158186\n谢玉\t158187\n咸蛋\t158188\n药量\t158189\n天赐\t158190\n离开你我想\t158191\n吵完\t158192\n李仙途\t158193\n套求\t158194\n好啊度秘太乖\t158195\n马英九\t158196\n马致命老子\t158197\n杜丽\t158198\nfubc\t158199\n烤串儿\t158200\n重婚\t158201\nPrada\t158202\n拜头\t158203\n抹茶味\t158204\n钢琴键\t158205\n伊美\t158206\n中山市港\t158207\nｉｐｔｖ\t158208\n恩浩\t1582
09\nakaki\t158210\n鸡蛋糕子\t158211\n孙v\t158212\n咳咳咳咳咳咳咳咳咳咳咳咳\t158213\n陈禹翰\t158214\n鸡婆\t158215\n忘情水ixj…liliAC\t158216\n新沙洞\t158217\n大白菜\t158218\n啄木鸟\t158219\n刘蕊\t158220\n鲁味\t158221\n其我\t158222\n梅雪\t158223\nCurva\t158224\n那首歌\t158225\n陈美英\t158226\n工作犬\t158227\n最红活\t158228\n李儒\t158229\n余棒\t158230\n小P孩\t158231\n双鱼男\t158232\n王大乖\t158233\n颜四二\t158234\n观摩\t158235\n维尼\t158236\nwqnm\t158237\n新华日报\t158238\n210分\t158239\n六堡\t158240\nv1v1v1\t158241\n有电\t158242\n每星期一\t158243\n梅雁\t158244\n呵叮\t158245\n满近\t158246\nUlike\t158247\n戏服\t158248\n师鸣\t158249\n有用\t158250\n维尔\t158251\n邓飞\t158252\n信仰心魂\t158253\n马艺琳\t158254\n试穿\t158255\ntalk\t158256\n赶英超日\t158257\n烷\t158258\n19:00\t158259\n年生\t158260\n三八二十四二十五六十四三三三三小\t158261\n东直门\t158262\n加压\t158263\n略敢\t158264\njoshin\t158265\n懿武\t158266\n加厚\t158267\nmayi\t158268\n丫们\t158269\n3月16日\t158270\n给我废话\t158271\n中远集团\t158272\n年画\t158273\nNdjhrhekejdb\t158274\n白度浏览器\t158275\n顿少嘞\t158276\ndhiphotosbaiducomxiaodupicitemb151f8198618367a016561a829738bd4b21ce5d3jpg\t158277\nT5680\t158278\n小媳妇儿\t158279\nghbvcv\t158280\nWorld\t158281\n印液\t158282\n维A\t158283\n高兆琪\t158284\n台情\t158285\n4144456856\t158286\n黎饮可乐\t158287\n考察期\t158288\n北国冰城\t158289\n吴艺凡\t158290\n古今中外\t158291\n傻冒爱你\t158292\n3117156858\t158293\n犇\t158294\nPOS机展\t158295\n海航集团集团\t158296\n镇兴\t158297\n罗函\t158298\n建戌\t158299\n愚钝\t158300\n澡盆\t158301\n54688\t158302\n丧气话\t158303\n罗凯\t158304\n张大德\t158305\n睡猩\t158306\n破定\t158307\n犁\t158308\n王文风\t158309\n发酒疯解放军次见见你大酒店\t158310\n催情菇\t158311\n千锤百炼\t158312\n结晶\t158313\n钡缘\t158314\n王朝本\t158315\noOoOOoOoOB\t158316\n吊桥\t158317\n猎杀\t158318\n德劳内\t158319\n嗨别\t158320\n三十几\t158321\n郭航\t158322\n知不道\t158323\ngot7\t158324\n中间人\t158325\n泰航\t158326\n李亚男\t158327\n长安城\t158328\n卢星言\t158329\n横线\t158330\n口哨\t158331\n张耗子\t158332\n长太帅\t158333\n绪旋转\t158334\n法兰克福\t158335\n陈6\t158336\n窈窕\t158337\n不ko\t158338\n保险赔偿金\t158339\ngoto\t158340\n34美元\t158341\nsbsbsbsbsbsbsbsbsbsbssbbss\t158342\n帮转\t158343\n烘\t158344\n孙颖浩\t158345\n赤峰人风了该死的疯人院\t158346\n好借好还何厚铧\t158347\njdjvn\t158348\nn凯\t158349\n泻湖
\t158350\n刹那间\t158351\n34338\t158352\n不干\t158353\ntsatsxg\t158354\n入梦\t158355\n省内\t158356\n陈v\t158357\nskillgagl\t158358\n逗密\t158359\n啦5\t158360\n误区\t158361\n我的资料破写在四哪个还是女\t158362\n三四零六零\t158363\n拨打爱\t158364\n度蜜你是什么座的呀你说什么呀你的生日几号\t158365\n大本营\t158366\n七八科\t158367\nCCCCOK\t158368\n一周二\t158369\n死棍\t158370\n长沙同升湖学校\t158371\n纷飞\t158372\n2255455423548987\t158373\n后怕\t158374\n真我听不懂\t158375\n三天\t158376\n左耳布尔\t158377\n快痴线\t158378\n天黑了睡觉吧\t158379\n充满园\t158380\n关门儿\t158381\n42373273565568974674714142856582768598575757585866663883\t158382\n花花花姑娘\t158383\n梦工厂\t158384\n茁壮\t158385\n受不了你了你好贱\t158386\n谷开来\t158387\n一个八代\t158388\neyes\t158389\n下天\t158390\n隆平高科\t158391\n吃子\t158392\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t158393\n吃孟\t158394\n生成者\t158395\n每星期\t158396\n骆心怡\t158397\n1.91米\t158398\n打力促\t158399\n恩德\t158400\neyef\t158401\nanuk\t158402\nanun\t158403\n来袭\t158404\n下头\t158405\n0rz\t158406\n刘娜娜\t158407\n完全天\t158408\n海塘\t158409\n侨胞\t158410\n轻率\t158411\n三十几个\t158412\n真够了你\t158413\n恩得\t158414\n人势\t158415\n好呀尼那\t158416\n实言\t158417\n小米机器人\t158418\n三大\t158419\nMa妈妈\t158420\n沃施\t158421\n合兴欢\t158422\n双筷子噻\t158423\n285天\t158424\ncn\t158425\n1452661019101\t158426\n度秘度秘我真讨厌你\t158427\nfew\t158428\n想听话\t158429\nvivoice\t158430\nMkllll\t158431\n我喜欢贴\t158432\n辛云兰\t158433\n扩频\t158434\n萨宠\t158435\n13162462101\t158436\n绣球花\t158437\n78935\t158438\n庵深客舍潮\t158439\n几枚\t158440\n易宝支付\t158441\n椒丝\t158442\n几枝\t158443\nclbj\t158444\n户口\t158445\n范佩西\t158446\n十二级\t158447\n土壤\t158448\n行窃\t158449\ndatrey\t158450\n萨宁\t158451\n吧兄弟\t158452\n欲为\t158453\n大主宰\t158454\n大宇同\t158455\n畏忌\t158456\n666节\t158457\nbrnr\t158458\n女生类\t158459\nbrnv\t158460\n央视索福瑞\t158461\n兵兵我爱你\t158462\n好想好想和你在一起的你\t158463\nzengyaping2043\t158464\n坏蛋讨厌讨厌讨厌\t158465\n鼻甲\t158466\n不又玩\t158467\n基金\t158468\n明月光\t158469\n人的一生\t158470\n那顿结婚吧\t158471\n晓江\t158472\n转世灵童\t158473\n细化\t158474\ncc\t158475\n一轮\t158476\n学猫\t158477\n面内\t158478\n劳斯\t158479\n婚期\t158480\n叩谢\t158481\n大哪儿\t158482\nk3d\t158483\n麽麽\t158484\n呀十二\t158485\n去爱\t158486\nhttpfhiphotosbaiducomxiaodupicite
m71cf3bc79f3df8dc34076ce5ca11728b47102888jpg\t158487\nshot\t158488\nshou\t158489\nshow\t158490\n炸鸡柳\t158491\n二本\t158492\n天天下\t158493\n天天上\t158494\n一百九十九\t158495\nn85\t158496\n战俘\t158497\nshoj\t158498\n俢饰语\t158499\n亚伯拉罕·马斯洛\t158500\n老和尚\t158501\nRMB475\t158502\n那用你自己的话说\t158503\n分了吧\t158504\nshiyingyu\t158505\nvror\t158506\n叫朋克\t158507\n愛過\t158508\n绝交别理我了我也不理你\t158509\n判死刑\t158510\n他律\t158511\n分配\t158512\n无了了\t158513\n147厘米\t158514\n呆会\t158515\n关悼\t158516\n呆伙\t158517\n冰神\t158518\n横滨\t158519\n别要不要不要不要不要不要不要不要不要不\t158520\n天润\t158521\n3小时后\t158522\n裴文龙\t158523\n天涯\t158524\n迷我\t158525\n易瑶\t158526\n来沪\t158527\n平塘\t158528\n来没\t158529\n及格率\t158530\n11902056151\t158531\nS型\t158532\n七十多名\t158533\n快啲\t158534\n祥瑞\t158535\ncf\t158536\n锅边\t158537\n甘得意\t158538\nee11\t158539\nx22\t158540\n林宥嘉\t158541\n临别离开\t158542\n老弟\t158543\n玩人\t158544\n初半瓶\t158545\nall土司1\t158546\n小脾气\t158547\n阿拉伯\t158548\n三三博\t158549\n大中华区\t158550\n白云\t158551\n精兵强将\t158552\nE4315\t158553\n贺知章\t158554\n桔瓣形片\t158555\n建筑学\t158556\n何时约\t158557\n阴合\t158558\n查获\t158559\ncurch\t158560\n乔治因瑞\t158561\ntimeis\t158562\n李连杰\t158563\n华硕\t158564\n32寸\t158565\n穆扶提道堂\t158566\n杨少华\t158567\n便携\t158568\n梅毒秀\t158569\n一概而论\t158570\nkkoho\t158571\n恶心表\t158572\n敲掉\t158573\n星游记\t158574\n魔蝎女\t158575\n拥抱似水年华\t158576\nXhfggcgdb\t158577\n吃奶奶\t158578\n强众志成城前仆后继\t158579\n主色\t158580\ndome\t158581\nGcvh\t158582\n圆圈\t158583\n图灵可\t158584\n飞椅\t158585\n圆圆\t158586\n大要\t158587\n2011年1月份\t158588\n二月\t158589\n凸轮\t158590\n怀包\t158591\n安样\t158592\n完整\t158593\n安居楼\t158594\n喜怒无常\t158595\n杰仔娜娜\t158596\n小要\t158597\n心引力的网游之逆天邪龙\t158598\n帅有帅\t158599\n1万\t158600\n坚决反对\t158601\n怀化\t158602\n第n次\t158603\n你敢你\t158604\nautiful\t158605\n米都\t158606\n申办\t158607\n洛梓轩\t158608\n55756764975\t158609\nNK唯饭\t158610\n亲新年\t158611\n超级超级大好人\t158612\n反光板\t158613\no冰公主shijklmnopqrpu为大不了来热\t158614\n韩亦瑶\t158615\n白拜\t158616\n后辈们\t158617\n班级体\t158618\n莫默默\t158619\n特快\t158620\n1架\t158621\n选址\t158622\nSBS歌谣大战\t158623\n那么么\t158624\n少女心\t158625\n听山\t158626\ncghjhghggghiuhhgvbhgghhjhn\t158
627\n精神抖擞\t158628\n转世党\t158629\n135号\t158630\n春凤\t158631\n齐金峰\t158632\n桂柱\t158633\n桂柳\t158634\ncqv\t158635\ncqu\t158636\n喵咪咪咪咪咪咪\t158637\n灰熊\t158638\n添啵斯\t158639\n呱呱呱\t158640\n3月14日\t158641\n大气层\t158642\n粽子叶\t158643\n88514\t158644\n夏洛\t158645\n成行\t158646\nsieraelki\t158647\n井一起\t158648\n算了陪\t158649\n正当性\t158650\n跳舞毯\t158651\netyyio\t158652\n治丧委员会\t158653\n儿童们\t158654\n成衣\t158655\n序幕\t158656\n一百八十六\t158657\ntroko\t158658\n这么多么\t158659\n太傻比\t158660\n请和我过\t158661\n孙凯\t158662\n襟怀\t158663\n夏津\t158664\n补射\t158665\n1998倍\t158666\n打地鼠\t158667\nAvis\t158668\n30个\t158669\n景向\t158670\n王大基\t158671\n3颗\t158672\nAMARTH\t158673\n3额\t158674\nTf\t158675\nhvfc\t158676\n莞式有\t158677\n长和廊\t158678\nhvfd\t158679\n度美度秘\t158680\nh文百合\t158681\nddefghijklmnopqrstuvwxyz\t158682\n靠假\t158683\n自动关\t158684\n30万\t158685\nNobu\t158686\n露琪亚\t158687\n鬓角\t158688\nrfyfgjhf\t158689\n起降\t158690\noio\t158691\n害喜\t158692\n小把戏\t158693\n推移\t158694\n小鲜鲜\t158695\nbed转圈\t158696\n律师法\t158697\n田诗雨\t158698\n乌戈\t158699\n蓝筹\t158700\n齐家一\t158701\n在前天\t158702\n小果果们\t158703\n壹贰叁肆\t158704\n王龙\t158705\n苏源了怎密云光诚中央\t158706\n黑苹果青年公益创业\t158707\n张营业\t158708\n小柚\t158709\n维艰\t158710\n开荤\t158711\n衣架子\t158712\n安智下中午饭下午饭\t158713\nkaas\t158714\n3IE\t158715\n有害怕\t158716\n猪ugby\t158717\n圈蜜\t158718\n袭胸\t158719\n不可估量\t158720\n对讲\t158721\n朴北鼻\t158722\n努力就好\t158723\n新塘\t158724\nmoney\t158725\n前苏联\t158726\n无色生姜\t158727\n业务部\t158728\n同志度\t158729\n受死\t158730\n较真儿\t158731\n好学森\t158732\n试试看\t158733\n美联储\t158734\n论据\t158735\n丝丝拉拉\t158736\n下沙德\t158737\n王鑫博\t158738\n亲我好不好\t158739\n偶开森\t158740\nfjxdgk\t158741\n浩丰\t158742\n因为我要\t158743\n军民\t158744\n周五下午\t158745\n第几棵\t158746\n寻踪\t158747\n肖天宝\t158748\n折理\t158749\n新浪网\t158750\nTK\t158751\n不在的对\t158752\n10698000036590\t158753\njakd\t158754\n向庆祥\t158755\n八点钟\t158756\n高调\t158757\n今天下午15:30\t158758\n高谈\t158759\n哥们儿姐\t158760\n切克闹你说煎饼\t158761\ngrif\t158762\nTG\t158763\n好好嘞\t158764\n剪头\t158765\ngril\t158766\n瘦美\t158767\nPortage直營門市消費累積\t158768\n伽蓝\t158769\nmicccc\t158770\ndidisft\t158771\nTB\t158772\n知识面儿\t1
58773\n昌吉牛肉\t158774\n狗尾巴\t158775\n45654\t158776\nfuuddsid\t158777\n女女女女女女不\t158778\n行政法\t158779\nTA\t158780\n嘎嘎嘎同仁堂\t158781\n承承\t158782\n回来吧\t158783\nCustomer\t158784\n金城大酒店\t158785\n小柠\t158786\n摩比\t158787\n忙城\t158788\n弗如\t158789\ncabc\t158790\ncaba\t158791\n丘索维金娜\t158792\ncabd\t158793\nchuck\t158794\n真相\t158795\n银狐\t158796\n王一族\t158797\n迪上\t158798\ndchtfhfbdh\t158799\n吕梁\t158800\n理群\t158801\n瘦身品\t158802\n有钱了再找\t158803\n落败\t158804\n438k\t158805\nghigklm\t158806\n几噶\t158807\n宦海涛\t158808\n21厘米\t158809\n过化\t158810\n四十多公斤\t158811\n女警察\t158812\n谢好\t158813\n百家姓百家性百家\t158814\n小姑子\t158815\n色彩斑斓\t158816\n哎哟猫\t158817\n求实\t158818\n顺天\t158819\n斗牛嗯\t158820\n亚那\t158821\n小马甲\t158822\n使节\t158823\n毕业院\t158824\n网易公开课\t158825\n小婊砖\t158826\n一輩子\t158827\n1000万\t158828\nGoodbye\t158829\njbjdjmg\t158830\n处方\t158831\n牛宝\t158832\n捂脸\t158833\n分期付款\t158834\n大四勒\t158835\nHowoId\t158836\n睡美人\t158837\n喽亲\t158838\n心跳率\t158839\n真的好想好想哭\t158840\n不要不开心\t158841\n别太太太太\t158842\n童彦熙\t158843\n美劳完美\t158844\n1000个\t158845\n篮板\t158846\n莫拉\t158847\n多米卡\t158848\n唐太宗李世民\t158849\ngoM\t158850\nyhbbb\t158851\n不赖林甸\t158852\n叶答应名\t158853\n项城\t158854\nupup\t158855\n120千米\t158856\n唯胜者\t158857\n不是我跑粗\t158858\n1738\t158859\n黄洁夫\t158860\nZgdyffji\t158861\n5230\t158862\n山池\t158863\nx6puis\t158864\n被骗\t158865\n糖宝图小熊\t158866\n章贡区\t158867\n经过\t158868\n买彩\t158869\n哦吼\t158870\n不好玩儿\t158871\nv泽\t158872\n几个群\t158873\n更富饶\t158874\n嫂药\t158875\n付裕\t158876\n玛勺子\t158877\n一街\t158878\nstoumetoutou\t158879\n女婿\t158880\n一行\t158881\n好告诉你\t158882\n上映\t158883\n一期一期\t158884\n女婴\t158885\n雪鹰\t158886\n片商\t158887\n宜土者\t158888\n回不回\t158889\n界诠法师\t158890\n757909868\t158891\n调适\t158892\n带味\t158893\n奇艺\t158894\nv法\t158895\n真厉\t158896\n一表\t158897\n的照\t158898\n下界\t158899\n太太乐\t158900\n戚蒙\t158901\n一衣\t158902\n摆设\t158903\n鲁斯王\t158904\n真历\t158905\n实验瓶\t158906\nzhdd\t158907\nUSD\t158908\nUSB\t158909\nUSA\t158910\n井冈山\t158911\n腾燕\t158912\n改变句\t158913\nsrdf\t158914\ngeneral\t158915\nDtcufug\t158916\n38mm\t158917\n富家子弟样\t158918\n1家\t158919\nsrds\t158920\nG
ffxxgghuhnc\t158921\nM372\t158922\n保持联络\t158923\nuchjhvjlgd\t158924\n藏藏\t158925\n呢呀呀呀呀呀呀呀呀呀呀呀呀\t158926\n发球\t158927\n宽仁\t158928\n聊唱\t158929\n犯客家见面\t158930\n杂工\t158931\ncomviq\t158932\n诺克穆图\t158933\n你度秘我我不想和你说什么\t158934\n老花眼\t158935\n肖佳欣\t158936\n亚红\t158937\n东海县\t158938\n小度我爱你\t158939\n奥巴马然来\t158940\n睁开眼\t158941\n茆港村\t158942\n秘书长\t158943\n宁都会\t158944\n亚纶\t158945\n可男可\t158946\n万达华府\t158947\n邵峰\t158948\n交出\t158949\n有那么好说\t158950\n戏外\t158951\n榆钱\t158952\nsvu\t158953\n三百天\t158954\nksmsk\t158955\n不省油\t158956\n西雅图\t158957\n处境\t158958\n你好不好吃感觉上学\t158959\n各级\t158960\n爷行\t158961\ngof\t158962\n朋友屋\t158963\n五月龙\t158964\n一一秒钟\t158965\n振奋\t158966\n小先生\t158967\n数苗条\t158968\n做贼做贼\t158969\n九峰\t158970\n瘀\t158971\n宣一\t158972\n念者\t158973\n金卡普\t158974\n档案\t158975\n你的眼\t158976\n爱你没疯吧\t158977\n拐克\t158978\ndcgk\t158979\n口贴\t158980\n青蜂侠\t158981\n下星期二三\t158982\n困人\t158983\n威风\t158984\ngoy\t158985\n书稿\t158986\n中国建设银行股份有限公司\t158987\n你和你的老婆\t158988\n丛哥\t158989\n桃花园\t158990\n零二零二\t158991\n恶灵战士\t158992\n水害\t158993\n净身\t158994\n大开心\t158995\n马雨童\t158996\n适时\t158997\n东台小区\t158998\n聋儿语\t158999\n色欲\t159000\n三等奖\t159001\n付俐\t159002\n表认\t159003\n保龄球\t159004\n谢儒斌\t159005\n印花\t159006\n雷霆之怒\t159007\nAF1\t159008\n几零年\t159009\n半路转车\t159010\n石头剪刀\t159011\n飘飘飘\t159012\n賣點\t159013\n45岁\t159014\n水宝\t159015\n傻妞撒撒撒撒\t159016\n浙江大学\t159017\nJekyllg\t159018\n所以说出来\t159019\n19点40分\t159020\n做业\t159021\n韩嘉欣\t159022\n笔画\t159023\n王觏\t159024\n度发\t159025\n冷林锋\t159026\n度受\t159027\n90s\t159028\n90p\t159029\n死掉\t159030\n载具\t159031\n與時俱進\t159032\n隐形你个\t159033\n你好搞性\t159034\n呵呵呵呵呵嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯妈呀我的妈呀\t159035\n八怪\t159036\n屁蛋\t159037\n大世界姐姐没啊刚天雄好了儿歌来呀一累啊阳阳秒\t159038\n做主\t159039\nSpyker\t159040\n猪吗猪吗猪吗你是猪吗你是猪么你猪\t159041\n昨天深夜\t159042\n载入\t159043\n度秘我要点歌\t159044\n比马尔代夫\t159045\n搞笑剧\t159046\n五知只\t159047\n你家长什么样啊发张照片度秘\t159048\n白璧\t159049\nBLX\t159050\n盖区\t159051\n简简单单\t159052\n旺旺旺\t159053\n热哈伦\t159054\nevisu\t159055\n录音上\t159056\n两人间\t159057\n麻烦你麻烦你\t159058\n国情\t159059\n基督徒\t159060\n石雁从\t159061\n903\t159062\n900\t159063\n901\t159064\n第十四个
\t159065\n同音\t159066\n908\t159067\n909\t159068\n快餐店\t159069\n冠军路\t159070\nFyjug\t159071\n一条街\t159072\n90%\t159073\n酒类\t159074\n3明儿\t159075\n白璟\t159076\n猩球大战\t159077\n龙龙龙\t159078\n106号\t159079\n扎手\t159080\n几个三\t159081\n牌号\t159082\ndhjitd\t159083\n你好丑\t159084\n亭亭玉立\t159085\n#音悦Tai#\t159086\n变种\t159087\n前生\t159088\nlivelog\t159089\n三礼\t159090\n芬芬\t159091\n蔡达\t159092\n呆瓜\t159093\n几个个\t159094\ntwd\t159095\n骄歹uyn力pm豁空额无奈冼95乡p懺\t159096\n师女\t159097\ntwh\t159098\n不干不干\t159099\ntwo\t159100\n3月28日凌晨4时\t159101\ntwm\t159102\n认不认真\t159103\n右脑\t159104\n误用\t159105\n牌友\t159106\ntwt\t159107\ntwu\t159108\n右脚\t159109\ntwy\t159110\n芬芳\t159111\n咱们两个唱歌\t159112\n46部\t159113\n冷青颖\t159114\n信不你再不出\t159115\n英特\t159116\n奶油\t159117\n400万元\t159118\nksboss\t159119\n马有红\t159120\n我讨厌你了再见\t159121\n今天一点半\t159122\nMac高清视网膜屏\t159123\n回头一看\t159124\nkare\t159125\n力偶\t159126\n反一号\t159127\n滔滔\t159128\n马家乡\t159129\n招租\t159130\n粗砂悲哀我我决问\t159131\n2587436545\t159132\n国辉哥\t159133\n20cm院\t159134\nJian\t159135\n小公主之飞\t159136\n笔录\t159137\n感天动地\t159138\n↓\t159139\n→\t159140\n↑\t159141\n←\t159142\n↗\t159143\n↖\t159144\n恶心不啊孙子\t159145\n↙\t159146\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t159147\n老燕\t159148\n1517380934\t159149\n垂钓\t159150\n不憨包装袋装上的人家族史上最低消费劲歌金曲率土归心动人们问题外话来着重叠字联\t159151\n芑\t159152\n老城区\t159153\n称心\t159154\n亚那你就是男不男女不女\t159155\n头文字D\t159156\n你好搞笑\t159157\nyearand\t159158\n走开下\t159159\n叶梅梅\t159160\n逼度\t159161\n节\t159162\n千万富翁梦\t159163\n歌王\t159164\n芈\t159165\n别改\t159166\n!\t159167\n兼修\t159168\n花\t159169\n南京中原城东住宅部\t159170\n芳\t159171\n芽\t159172\n十二点\t159173\n生没\t159174\ndydy\t159175\n芸\t159176\n这片子\t159177\n说的错\t159178\n芥\t159179\n芦\t159180\n2222252\t159181\n灵山\t159182\n芭\t159183\n芯\t159184\n芮\t159185\n米林\t159186\n1314216\t159187\nhfjj\t159188\n芪\t159189\n尽然\t159190\nDon\t159191\n哈尼兔耶\t159192\n蔡劲豪\t159193\n孤独的人\t159194\n团老虎\t159195\nDog\t159196\nbkk\t159197\n张桥\t159198\nDox\t159199\nDoy\t159200\nbkr\t159201\nbks\t159202\nVedio\t159203\nYdhshshsidjfnydjffifjfigffkvocubheyepyydkckvxnccufuduxyhcudufufiufufjdjcucufhdgfu\t159204\ngtft\t1
59205\n你的假\t159206\n凑三八\t159207\n康有为\t159208\n成度秘\t159209\n体悟\t159210\n吴秋子\t159211\n张育仁\t159212\nmeat\t159213\n鸭子\t159214\n乔依雪\t159215\n真情\t159216\n吕剧\t159217\n浓度\t159218\n阳狮锐奇新媒体整合中心\t159219\n死种\t159220\n猪你是猪你是猪你猪你猪你猪你猪猪猪猪猪猪\t159221\n一点错\t159222\n發揮\t159223\n倘若\t159224\n重睡\t159225\n姜枫\t159226\n付钱\t159227\n高米花\t159228\nMD我爱豆\t159229\nalready\t159230\n棕子\t159231\ncket\t159232\n噜噜噜噜咕咕噜噜噜噜\t159233\n六六十四\t159234\nwinger\t159235\n24\t159236\n25\t159237\n26\t159238\n27\t159239\n20\t159240\n21\t159241\n22\t159242\n23\t159243\n疯了勒\t159244\n28\t159245\n29\t159246\n28628\t159247\n2%\t159248\n431483498\t159249\n志刚\t159250\n大喜欢了你是我的闺蜜\t159251\n志刘\t159252\n节月\t159253\n李加林\t159254\n写写完\t159255\n看法\t159256\n周洁\t159257\n18765426688\t159258\n2S\t159259\njgvgvghhbjkih\t159260\n巧姐\t159261\n悬人\t159262\n烦人不烦人\t159263\n骆驼之歌骆驼之歌\t159264\n川原砾开尔\t159265\ndgjljr\t159266\n2A\t159267\n2B\t159268\n2M\t159269\n那个女孩儿\t159270\n2H\t159271\n2J\t159272\n职业道德\t159273\n2u\t159274\n魔头狼\t159275\n2w\t159276\n2p\t159277\n2s\t159278\n朱之文\t159279\n太原太\t159280\n2x\t159281\n2d\t159282\nG么给绣G\t159283\n2g\t159284\n你记得\t159285\n苏仨\t159286\n2b\t159287\n奥巴歌\t159288\n2l\t159289\nfhxyxhgkhjvjvbxhblhjch\t159290\n2o\t159291\n2h\t159292\n14090909924\t159293\n独行者\t159294\n打击乐器\t159295\n华盖\t159296\n往来\t159297\n化险为夷\t159298\n1所\t159299\n龙场\t159300\n艺术展\t159301\n福寿\t159302\nfhshdb\t159303\n笨雅\t159304\n郭云杰\t159305\n龙在\t159306\n天桂山\t159307\n1000000000000000000000000000000000000000000000000000000000000000000000000\t159308\n婴宝\t159309\n160万件\t159310\n酒后\t159311\n啦红红\t159312\n所得税法\t159313\n尿布吧\t159314\n我喜欢工程师\t159315\n奥诗丹\t159316\n8500亿元\t159317\n×\t159318\n胸襟\t159319\n吉普车\t159320\n毒胶囊\t159321\n酒吧\t159322\n乐爸\t159323\n贫富悬殊\t159324\n不要黑我我\t159325\n小蜗\t159326\n土豪群\t159327\n小蜜\t159328\n底端\t159329\n两款\t159330\n宝卡\t159331\n棒verry\t159332\n两次\t159333\n真惨\t159334\n吃了吃了止\t159335\n綦焱\t159336\n麦咭\t159337\n重庆成教\t159338\n摸亏空\t159339\n宠物店\t159340\n凉冲凉冲凉冲凉\t159341\n拖压\t159342\nNO2\t159343\n189999996\t159344\n尼子\t159345\n出国学\t159346\n春节联欢晚会\t1
59347\n曹思涵\t159348\n四万亿\t159349\nhijgy\t159350\ngghhhhhhhhhj\t159351\n传达\t159352\nà\t159353\n走着走\t159354\n我感觉到\t159355\n超音速\t159356\ncsourit\t159357\n度秘剑\t159358\n28厘米\t159359\n如约\t159360\n道理\t159361\n民主党\t159362\n初春\t159363\n獅子座\t159364\nú\t159365\n呢队\t159366\nEveryoneinour\t159367\n胡山\t159368\n内马儿\t159369\n没门\t159370\n长江流域\t159371\n请归来\t159372\nx5m\t159373\n永清县一小\t159374\n12月31日凌晨四点\t159375\n很宣\t159376\n是你说的你我你讨厌我\t159377\n7667977\t159378\n三八二百五\t159379\n20本\t159380\n补办\t159381\n一二点半\t159382\nfxky\t159383\n石昌林\t159384\n爸yuy\t159385\nshresangrochrishi\t159386\nHdhdujej\t159387\n深宫\t159388\n詹壮\t159389\n赛欧#\t159390\n女被\t159391\n次席\t159392\ngfcxzdjg\t159393\n婚礼\t159394\n骤起\t159395\n20月\t159396\n机会主义\t159397\n狡辩真的不懂不懂不懂不懂不懂不懂不懂不懂\t159398\n为我是你的女儿\t159399\n沈阳经济技术开发区\t159400\n苏南\t159401\n补助\t159402\n苏卓\t159403\n生当\t159404\n秦文婷\t159405\nImstudent\t159406\n警醒\t159407\n荣誉观\t159408\n过床\t159409\n专列\t159410\n老娘\t159411\nsssss\t159412\n终于明白\t159413\n202301\t159414\n想吐槽\t159415\n我喜欢喵\t159416\n无语凝噎\t159417\n刘和\t159418\n相想\t159419\n八四年\t159420\n果真是\t159421\n马尼拉\t159422\n刚是\t159423\n过度\t159424\n哪快\t159425\n摩通\t159426\n489281502\t159427\n跑火车\t159428\nQ79\t159429\n偷得\t159430\n奇什么异\t159431\n呀对不起呀我那个吗你了对不\t159432\n最终\t159433\n严正方\t159434\n烨晗\t159435\n黄段\t159436\n史老师\t159437\n观赏鱼\t159438\n万兴\t159439\n好友们\t159440\n许撒娇\t159441\n萌萌哒萌萌哒萌萌哒\t159442\n招生办\t159443\n大秧歌\t159444\nmeinn\t159445\n杰瑞\t159446\n程宁静\t159447\n280厘米\t159448\n具昌美姑娘\t159449\n而来\t159450\n新的一代\t159451\n说还记得\t159452\n行我知道\t159453\n叼婆\t159454\n期房\t159455\n账单日\t159456\n四袋\t159457\n王志萍\t159458\n三极片\t159459\n爱人人\t159460\nmeinu\t159461\n干干脆\t159462\n星星像\t159463\n陈芃\t159464\nsummer\t159465\n布艺\t159466\n全智贤\t159467\n写错别\t159468\n一人之下\t159469\n赵蕾\t159470\n残枝\t159471\n中国中医科学院\t159472\n追随者\t159473\n来了你\t159474\n四第六\t159475\n尼尔顿\t159476\n张杨坤\t159477\n陈正凯\t159478\n静冥索\t159479\n尧小娟\t159480\nunhappy\t159481\n备胎\t159482\n鱼梨\t159483\n私语\t159484\n洗凤麟\t159485\n洋长城\t159486\n偶活\t159487\n记忆棒\t159488\nruiy\t159489\n小姑娘儿\t159490\n咸水湖\t159491\n开种\t1
59492\n书言\t159493\n跟生\t159494\n你了我不爱你了我\t159495\nv过\t159496\n辉任\t159497\n打句号\t159498\n聊天技\t159499\ntyyuu\t159500\n开秘\t159501\nqass\t159502\n多少年级\t159503\n大众版\t159504\n浮生\t159505\n老白\t159506\n14444444\t159507\njlon\t159508\n连冷笑话\t159509\n思南公馆\t159510\n巴啦啦小魔仙小魔仙\t159511\n2016岁\t159512\n父母官\t159513\nmmmmmmdddddddmmmmjjjjjjjjjmddmmddmamtjjjtjjjj\t159514\n8886666666\t159515\n忘了奏\t159516\nShqwhfDcdhqdjshgcofkddhvkkkjkvshxzjxkxo\t159517\nCezard\t159518\n10CM\t159519\n贼性\t159520\n洋口港\t159521\n昼夜\t159522\ntuiop\t159523\n王大爷\t159524\n憤怒\t159525\nTORtououtro慢taratlalas\t159526\n硝唑\t159527\n张少德\t159528\n2982238842\t159529\ngdg\t159530\n现房\t159531\n我爱你我爱你我爱你\t159532\n好多好\t159533\n赛事\t159534\n为首\t159535\n凯里·欧文\t159536\n不足于\t159537\n王漳\t159538\nxixo\t159539\n花神\t159540\n现成\t159541\nfane\t159542\n赵丽波\t159543\nfang\t159544\n职分\t159545\nxixi\t159546\n浩哥哥\t159547\n堂皇\t159548\n1112112345677\t159549\n微信号\t159550\n释然然\t159551\n胡宇\t159552\n诺破哈一啦\t159553\n说火\t159554\n桦南\t159555\n不三不四2玩\t159556\n在家里光\t159557\n嘴病\t159558\n微河道\t159559\n报班\t159560\n樱兰公主\t159561\n莎美\t159562\n五月深\t159563\n失蹄\t159564\n容容\t159565\n贩子\t159566\n星学\t159567\n为什么呀哪儿给我好讨厌你了你\t159568\n245897\t159569\n忘怀\t159570\n到明\t159571\n100088\t159572\n星子\t159573\n童子鸡蛋饼干头\t159574\n太好了爱\t159575\n钟乳石\t159576\n白发\t159577\n孟浩\t159578\nhvll\t159579\n个了不要汉子威武UN\t159580\n111111岁\t159581\n东阳市\t159582\n别出声\t159583\n写作\t159584\n岚仔\t159585\n灌云\t159586\n无头\t159587\n连连连\t159588\n德意奥\t159589\n抄曦\t159590\ndofokffk\t159591\n大混蛋\t159592\nSomebodieslap\t159593\n生端正\t159594\n2889630942\t159595\n白高\t159596\nttufty\t159597\n推理\t159598\nhugyr\t159599\n迁安\t159600\n241.79亿元\t159601\n克拉大厦\t159602\n阴笑\t159603\npppp000000000066666666\t159604\n新不了情\t159605\n无多\t159606\n新潮\t159607\n朱光平\t159608\n偶爸\t159609\n朱雅文\t159610\n官庄\t159611\n滚球\t159612\n梅克斯、安东尼尼/诺切里诺、安布罗西尼、西多夫\t159613\n心情好不\t159614\n长太黑\t159615\n无处\t159616\njejxj\t159617\n陈逸轩\t159618\n无复\t159619\ngay\t159620\n2085938990\t159621\ngat\t159622\njxsj\t159623\n异界\t159624\ngao\t159625\ngan\t159626\ngal\t159627\n
鉴赏\t159628\ngaj\t159629\nlllxyllllyill\t159630\n末世\t159631\ngaf\t159632\n学生\t159633\ngad\t159634\ngaa\t159635\n度秘影\t159636\n电影促进法\t159637\n倾听者\t159638\n激光器\t159639\n42页\t159640\n18379307141\t159641\n爱的人\t159642\n聚成\t159643\n悠游心\t159644\n一只手\t159645\n扫噶\t159646\ngaF\t159647\n号伦\t159648\n体育场\t159649\n手枕\t159650\n20153\t159651\n48986164\t159652\n小智\t159653\n手果\t159654\n小晴\t159655\n错了我承认了你我不要你了你也不让我承认我不要你了辣不就是我你亲我\t159656\n小晶\t159657\n结题\t159658\n小景\t159659\n12345678974\t159660\n竇凡\t159661\n小晨\t159662\n暖笑\t159663\n主人感\t159664\n7月31日晚\t159665\n僵硬\t159666\n哈一\t159667\n讨厌自\t159668\n张骞使西\t159669\n小豌豆\t159670\n5小时\t159671\n最初的爱\t159672\n嫂子来了吗师傅我来了\t159673\nUK屠龙记\t159674\n小晗\t159675\n刻股\t159676\n缺口\t159677\n小晓\t159678\n永远的偶像\t159679\n二零八零\t159680\n小晏\t159681\n丁厚\t159682\n有感而发\t159683\n错跟\t159684\n误入\t159685\n十月八\t159686\n手枪\t159687\n小時\t159688\n张红丽\t159689\n唔多四\t159690\n倾注\t159691\nsyf\t159692\nsyd\t159693\n红河谷\t159694\n范思淼\t159695\n长谷\t159696\nsyl\t159697\nsys\t159698\n警卫\t159699\n7is\t159700\nsyu\t159701\n倾泻\t159702\n46446\t159703\nsyz\t159704\nsyy\t159705\nHHH0\t159706\n7公里\t159707\n小牛队\t159708\n反叛\t159709\nahame\t159710\n余晓丽\t159711\nahami\t159712\n受好评\t159713\n六礼拜天\t159714\n反反\t159715\n财大公寓\t159716\n普普通通\t159717\nJUFD\t159718\n我的秘书请告诉我\t159719\n党章\t159720\n单弦弓\t159721\n缺爱\t159722\n样！子\t159723\n缺爷\t159724\n周小钰\t159725\n站\t159726\nstoumetout\t159727\n苏美尔瓦斯\t159728\n九华\t159729\n竟\t159730\n竞\t159731\n九千\t159732\n九十\t159733\n3056408491\t159734\nBlake\t159735\n竖\t159736\n老路\t159737\n立\t159738\n来东\t159739\n灰色调\t159740\n坐位\t159741\n人饭\t159742\nvghhjj\t159743\n竹\t159744\n黄河流域厚街开联盟\t159745\n韩萌\t159746\n无所顾忌\t159747\n杨国琪\t159748\n九卡\t159749\n熟食\t159750\n大奔头\t159751\n研究型\t159752\n党总小事\t159753\n投名状\t159754\n竭\t159755\n端\t159756\n章\t159757\n嗷\t159758\n童\t159759\n竧\t159760\n羞花天姿国色国色天之\t159761\n倒怕\t159762\n天天睡\t159763\n度秘你为什么不唱歌儿成天\t159764\njlbohv\t159765\n力年\t159766\n一鹋\t159767\n理貌\t159768\n选台\t159769\n力广\t159770\nlvg\t159771\n度秘你好呀我我不记得你\t159772\n一支笔\t159773\n疾患\t159774\n杨金燕\t159775\n
rubyh\t159776\n这么地\t159777\n楼梯口\t159778\n秦王\t159779\n考点\t159780\nCouples\t159781\n94886\t159782\nstanlac\t159783\n动笔\t159784\n康诺密德\t159785\n累惨\t159786\nJshshshsjshahsdffxdyyshsgxgjjghlhhfghjjjjiiiidghjvvbbvvbjffgcjhdhjjfkbbnnnkkkkklllkjcfjddfhjxxnnpjcv\t159787\n祝小飞\t159788\n国我可以\t159789\n手算\t159790\n校生\t159791\n子霁\t159792\n容艺星\t159793\n425千米\t159794\n晒单\t159795\n吴小曼\t159796\n累想\t159797\n哈儿\t159798\n鹅额\t159799\nhgghu\t159800\n黄孩子\t159801\n国际版\t159802\n慕名\t159803\n返紧\t159804\n英宁\t159805\n順便\t159806\n69852\t159807\n擦亮\t159808\n基漫\t159809\n三页\t159810\n安德森\t159811\n看不见了你\t159812\n皎皎\t159813\n宁乡\t159814\n怒吼\t159815\n6201天\t159816\n京珠高速\t159817\n四高\t159818\najxi\t159819\n记不掉\t159820\nufo\t159821\n满帅\t159822\nufa\t159823\nufb\t159824\n文蔚伟\t159825\nufd\t159826\nufg\t159827\nufx\t159828\n不不不不\t159829\ncmosionz\t159830\n非浩浩荡荡\t159831\n什么不更\t159832\n避雷\t159833\n崩坏学园2\t159834\n太八卦\t159835\n你以为你是\t159836\n吸吗\t159837\n口袋版\t159838\n菜叶\t159839\n黄桃子\t159840\n暹罗\t159841\n酸酸\t159842\n雷奕明\t159843\n五百年前\t159844\n俏江南\t159845\n在肩\t159846\n白富美你是穷矮挫\t159847\n陈帅朋\t159848\n张蓓蕾\t159849\n背面\t159850\n一儿时\t159851\n秦懿格\t159852\n吸吸\t159853\n小家子气\t159854\n12333432578\t159855\n初四\t159856\n球面\t159857\n爱与不爱\t159858\n财迷\t159859\n干靠\t159860\n捣毁\t159861\n干面\t159862\n员国\t159863\n涅瓦大街\t159864\nCzg6\t159865\n广西壮族自治区\t159866\n忒弥斯\t159867\n好得意\t159868\nguirunfei\t159869\n昌云\t159870\nWanghsh\t159871\nLOLITA\t159872\n七高\t159873\n寡妇\t159874\n轩辕剑&一吻天荒MV郡主gif\t159875\n北芪\t159876\n财运\t159877\n母其是\t159878\n俊介\t159879\n死而复生\t159880\nWaroutSiri\t159881\n臭人\t159882\n坏蛋白\t159883\n爱的控制\t159884\n魔术秀\t159885\n马俊辉\t159886\n544444\t159887\n份额\t159888\n上官雨馨\t159889\n特别所\t159890\n刑事诉讼法\t159891\n未眠\t159892\n西平\t159893\n好不对\t159894\nophone\t159895\n多愁\t159896\n动动脑\t159897\n洪希强\t159898\n我一条命\t159899\n冻感冒\t159900\n大仁\t159901\n露秘\t159902\n桂涛涛\t159903\n解恨\t159904\nbhiphotosbaiducomxiaodupicitem1e30e924b899a90135c3f3421a950a7b0308f5d4jpg\t159905\n没听说过度日如年\t159906\n带走\t159907\n待客\t159908\n昨天之前\t159909\n孙高洁\t159910\n消费者\t159911\n邱彩玲\t159
912\n网王\t159913\n泥巴\t159914\nw点儿wkxxf点\t159915\n900斤\t159916\n露珠\t159917\n多一点\t159918\n黄金\t159919\n悯农\t159920\n校子\t159921\n泥工\t159922\n四万块\t159923\n范怡然\t159924\n大件\t159925\n邦博\t159926\n酷vap\t159927\n珍宝\t159928\n徐傲\t159929\n回我爱\t159930\n二十呗\t159931\n屏住\t159932\n不可爱迩\t159933\n15.31\t159934\nLSt家禄\t159935\n咱来聊聊心\t159936\n隐隐约约\t159937\ngjittf\t159938\n仨科\t159939\n椭圆形\t159940\n纪实\t159941\n股沟\t159942\n头晕目眩\t159943\n钟意同\t159944\n几号度\t159945\n377579671\t159946\n大妈\t159947\n看我是\t159948\n自便\t159949\nb16\t159950\nb17\t159951\n64523\t159952\nGhcjjcgcfhcssjhhdfdrvvutsethdtdhgfvfjdyhcfyfjfiuvhf\t159953\n四大家\t159954\n房舍\t159955\n兰远n亮\t159956\n佳讯飞鸿\t159957\nhhhhhhhhhhhhhhhhhhhjhhhhhhhh\t159958\n工人\t159959\n百及\t159960\n接吻照\t159961\n拍卖\t159962\n探访\t159963\n深大\t159964\n一颗颗\t159965\n许睿晨\t159966\n夜阑珊\t159967\n炎黄子孙同舟\t159968\n真好的朋友\t159969\n百变\t159970\n爱全\t159971\nddfh\t159972\n工产\t159973\n我们约会吧\t159974\n勉勉强强\t159975\n乖呱呱呱\t159976\n探讨\t159977\n把窝\t159978\n客气恩\t159979\njrff\t159980\n郑钰霞\t159981\n百只\t159982\n姜堰溱\t159983\n气团\t159984\n深处\t159985\n13646377262\t159986\n达观\t159987\n四川\t159988\n言而无信\t159989\n巧克力派\t159990\n深夜\t159991\nEudh\t159992\nvgvj\t159993\n音乐课\t159994\n百叶\t159995\n百台\t159996\n小学\t159997\n醉冫贤\t159998\n老瞳\t159999\n不见\t160000\n买装\t160001\n卓娜\t160002\n追过\t160003\n风顺\t160004\n不觉\t160005\n渔家\t160006\n末女\t160007\n200多元\t160008\n刘说去\t160009\n情味\t160010\nSFFGXBCBCB\t160011\n各族人民\t160012\n赤司\t160013\n张露方\t160014\n技家\t160015\n周个\t160016\n微笑书\t160017\n吊号\t160018\n涪陵\t160019\n不行呀\t160020\n微采\t160021\n神秘莫测\t160022\n田璐瑶\t160023\n三段\t160024\n躏\t160025\n微量\t160026\n默默旅途快快快\t160027\nvlk\t160028\n度秘度秘我好寂寞\t160029\n刊号\t160030\n34c三\t160031\n完美数\t160032\n庶出\t160033\nvld\t160034\nvlg\t160035\n乘坐\t160036\n茅市\t160037\n亲嘴戏\t160038\nSENVETING\t160039\n歌美\t160040\n周芷卉\t160041\nvlp\t160042\n单警\t160043\n寒鸦\t160044\n喝可乐\t160045\nghing\t160046\nvlv\t160047\n开示\t160048\n云舒阳\t160049\n雍彬\t160050\n天天好忙\t160051\nTRYLIVE\t160052\n果蔬\t160053\n挺刚\t160054\n毛弘怡\t160055\n郭晓敏\t160056\n别我来\t160057\n误入梦\t160058\n大战巨人来
啦在家男人的外表有的\t160059\n折叠\t160060\n李欣\t160061\n奥巴马旗\t160062\n最高人民检察院\t160063\n55358686866\t160064\n二泉\t160065\n3亿元\t160066\n7月29日\t160067\n干尼坤\t160068\n田文昌\t160069\n韦凤花\t160070\n柠檬片祛斑法\t160071\n1771147414425880228555\t160072\n易洋千玺\t160073\n老番\t160074\n44f\t160075\n翊峰\t160076\n折曲\t160077\n13:55\t160078\n2012-08-03\t160079\n揣着\t160080\n碗面\t160081\n气灶\t160082\n人风吹\t160083\n动一在\t160084\n说完\t160085\n陈岩茶\t160086\n希饭\t160087\n江伟\t160088\n龙微\t160089\n说定\t160090\n向前门\t160091\n15283506472\t160092\n气火\t160093\n说实\t160094\n44D\t160095\nv色\t160096\n奥萨纳\t160097\n449\t160098\n448\t160099\n焦度秘\t160100\n巨艳\t160101\n周不\t160102\n惠南镇\t160103\n442\t160104\n441\t160105\n440\t160106\n447\t160107\n446\t160108\n噢天\t160109\n444\t160110\n宋丹丹\t160111\n才走\t160112\nFTF\t160113\n夏往冬来醉楼阁\t160114\n伯南克\t160115\n吕俐颖\t160116\n超级遥控器\t160117\nxggxcj\t160118\n张啻\t160119\n九五斤\t160120\ndrryc\t160121\n好多半\t160122\ngixi\t160123\nShadyffjgn\t160124\n徐鮠\t160125\n张啵\t160126\n买好了\t160127\n小晚安\t160128\n解气\t160129\n436个\t160130\n六零点\t160131\nlukous\t160132\n行通\t160133\n0一W\t160134\n美国防部\t160135\n麻果席\t160136\n李真棒\t160137\n太浪\t160138\n太浩\t160139\n四十多秒\t160140\n零分希望小学\t160141\n小人儿\t160142\n保条\t160143\n01978\t160144\n良家妇男\t160145\n吴欣怡\t160146\n惊人\t160147\n躪\t160148\n瘦腰\t160149\nbbjnvhjjbjnn\t160150\n呢首\t160151\n多大叔大学\t160152\n云组\t160153\n宅景\t160154\n依次\t160155\n银家麻薯\t160156\n窃以为\t160157\n武穴\t160158\n进退七上八下\t160159\n升任\t160160\n芝麻康\t160161\n郎京成\t160162\n曾欣玥\t160163\n饭卡\t160164\n闹挺\t160165\n唐闻潇\t160166\n818场\t160167\n突入\t160168\n0871—5353000\t160169\n国秘酒店\t160170\n电瓶车\t160171\n8455555555555588353\t160172\n殘\t160173\n殇\t160174\ntrain\t160175\n殃\t160176\n饭卷\t160177\n以南\t160178\n马永平\t160179\n15229899977559999\t160180\n残\t160181\n殊\t160182\n苏州\t160183\n79110亩\t160184\n家辉猫\t160185\n听说起来\t160186\n殴\t160187\n殿\t160188\n还好意思\t160189\n殺\t160190\n我喜欢我怀了你的孩子我要和你\t160191\n叫大智若愚\t160192\n不要脸你不要脸不要脸\t160193\n看电视\t160194\n宜阳\t160195\n拇去\t160196\n82个\t160197\nIgkczx\t160198\n伊丽莎白·泰勒\t160199\n二氧化碳血症\t160200\n沙市盟\t160201\niamaUSAgilr\t16020
2\n洗呀\t160203\n着沁\t160204\n修通\t160205\n新浪刷微博\t160206\n最流行\t160207\n好喔蹉\t160208\n李秀训\t160209\n玩写\t160210\n拉勒米\t160211\ntmowul\t160212\n魁省\t160213\n秀过\t160214\n出门有车\t160215\n朝闻天下\t160216\n你好呀\t160217\n言艺丹\t160218\n不要我说东你说西\t160219\n涌进\t160220\nforest\t160221\n脚印\t160222\n10点2\t160223\n杨吗\t160224\n鹿宝可\t160225\n呻\t160226\n命\t160227\n呼\t160228\n呿\t160229\n呱\t160230\n屌天果瞎\t160231\n味\t160232\n隐婚\t160233\n四大大三\t160234\n黄树军\t160235\n周\t160236\n呫\t160237\n加新\t160238\n额元\t160239\n店里\t160240\nHGG\t160241\n呣\t160242\n小英雄\t160243\n呤\t160244\n呦\t160245\n一起元\t160246\n员\t160247\n呛\t160248\n呜\t160249\n呐\t160250\n呒\t160251\n呕\t160252\n伙伴们\t160253\n呗\t160254\n呖\t160255\n呈\t160256\n200200\t160257\n告\t160258\n呍\t160259\n人超人海中有你有我\t160260\n呀\t160261\n加料\t160262\n呂\t160263\n呅\t160264\n石门qq\t160265\n算不上\t160266\n呆\t160267\n死兆\t160268\n广角人像中心\t160269\n栗子\t160270\n死元\t160271\nkigfr\t160272\n李云娇\t160273\n我是落花我的女朋友是我爱你\t160274\n青萝卜\t160275\n死克\t160276\n死光\t160277\nｕｕｕｕｕｕｕｕ\t160278\n登没有\t160279\n隐形衣\t160280\n哒哒哒哒你是头猪你是头猪秘你叫密度我叫度秘对吧对吧对\t160281\n13938937610\t160282\n死党\t160283\n华山医院\t160284\n土白\t160285\n呢疼\t160286\n国娘\t160287\n我司与为为塞西面事舅睡用事\t160288\n多管\t160289\n菲尼克斯太阳队\t160290\n李彦辉\t160291\n难息\t160292\n志愿者们\t160293\n王总\t160294\n郭宏真\t160295\n小视植物大战僵尸\t160296\njiggjjx\t160297\n具體點嗎\t160298\n职场\t160299\n尕还\t160300\n丨丨ii211211计计谁谁谁谁\t160301\nLover\t160302\n度人\t160303\nvpgm\t160304\n朱伯伯\t160305\n防肠\t160306\ncdcdc\t160307\n父母责\t160308\n牛油\t160309\n赖晶晶\t160310\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t160311\n嗯多谢\t160312\n度云\t160313\na1只蚊子b1部\t160314\n孙佳怡\t160315\n说师傅\t160316\n中和\t160317\n凌晨零点\t160318\n竟说\t160319\n切斯才\t160320\n半拉子\t160321\n3.双\t160322\n法学博士\t160323\n就是说上\t160324\n幻肢\t160325\n始发\t160326\n兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔兔\t160327\n目睹\t160328\n静香楼\t160329\nyiya\t160330\n正价\t160331\n一四多\t160332\n囗容\t160333\n三十五本\t160334\n瘟神\t160335\n正仇\t160336\n一百四一\t160337\n19211555355\t160338\n熊牙\t160339\n九千多\t160340\nVxfhbxg\t160341\nfggfgjhgjqwrruioooreaadgklxxv\t160342\n娜娜\t160343\nAbxufu\t160344\n一四大\t160345\n中山区公安分局\t160346\n偷
运\t160347\nmatg\t160348\n熊牌\t160349\n浅显\t160350\nmeleme\t160351\n熊片\t160352\nmato\t160353\n那好问东瓯\t160354\nfedrealy\t160355\n凤五天\t160356\n新华都\t160357\n传记\t160358\n东郊记忆那KTV\t160359\n是的就是你罪该万死\t160360\n焕然一新\t160361\nJfg\t160362\n声兔\t160363\n金无足赤\t160364\n丽人\t160365\n资本化\t160366\n梁江坤\t160367\n直念\t160368\n病得人\t160369\n湖人\t160370\n一赶紧\t160371\n专科\t160372\n中医学\t160373\n阿傅\t160374\nhttpfhiphotosbaiducomxiaodupicitem30adcbef76094b36c10c2f98a4cc7cd98d109d2cjpg\t160375\n冬夜读书示子律\t160376\n超级\t160377\n贾志强\t160378\n度秘密\t160379\n洪大娘\t160380\n175855\t160381\n丽云\t160382\n一部一个\t160383\n十二星座\t160384\n了知不知道\t160385\n顾哥\t160386\n拆掉\t160387\n滚窝\t160388\n秘开心\t160389\n鸡尾酒\t160390\nidhuoIDDHudduoyeriyeTPiddoIThiFDoHDdogdfkHzGKxqilv\t160391\n并念\t160392\n宫城\t160393\n普拉\t160394\n威远\t160395\n打家劫舍\t160396\n成都火车站\t160397\n出去玩\t160398\n鸡蛋味\t160399\n尖兵\t160400\n2011.3.9天\t160401\n限量版\t160402\nvuiuf\t160403\n无垠\t160404\n画妆\t160405\n英国央行\t160406\n一千四百三十九亿九千九百九十九万九千\t160407\n朱明勇\t160408\n灵舟精片\t160409\n二百五八\t160410\n内窥\t160411\nvivo5\t160412\n琼tu\t160413\n99999999999999999\t160414\n填鸭式\t160415\n娜丽莎\t160416\n组建\t160417\n平武报恩寺\t160418\n啊燕\t160419\nuhvtinhfuhcf\t160420\n利先锋嫁多金\t160421\n机械感\t160422\n钱姑娘\t160423\n155105\t160424\n金铭\t160425\n皮不要脸\t160426\n七月底\t160427\nfvdsgf\t160428\n金银\t160429\n夺堆\t160430\n金链\t160431\n怀疑\t160432\n按揭\t160433\n对牛谈情\t160434\n咬文嚼字\t160435\n玉帛\t160436\n马事\t160437\n老死不相往来\t160438\n周奖\t160439\n我你妈了个逼我操你个妈的我流量给我开开\t160440\n玉帝\t160441\n飞场\t160442\n34号\t160443\n百事杯\t160444\n飞地\t160445\n好时辰\t160446\n傻子傻子傻子是你和你的\t160447\n秘渡\t160448\n飞在\t160449\n尼木\t160450\nben\t160451\n赛尔\t160452\n赠赠\t160453\n意林小小姐前方江湖\t160454\n同一人男\t160455\n高乔斌\t160456\n十堰\t160457\nHYC集团\t160458\n十二月初\t160459\n度秘你猜我怕是谁\t160460\n白烧鱼\t160461\n破口大骂\t160462\n1322452626255\t160463\n高人一等\t160464\n李雨歌\t160465\ndureXo\t160466\n4只\t160467\n国泰来\t160468\n各自\t160469\n巩敏\t160470\n12月6日\t160471\n好loli\t160472\n4口\t160473\n4句\t160474\n所及\t160475\n红岩\t160476\n红岭\t160477\n张润鹏\t160478\n4台\t160479\ngjdu\t160480\n叶子美\t160481\n4号\t160482\n
一万8890多少\t160483\n咬讲\t160484\n斤斤\t160485\n大大后\t160486\n就是我记忆中\t160487\n榴莲糖\t160488\n首义\t160489\n红岗\t160490\n喷洒\t160491\n招行信用卡\t160492\n丑臭\t160493\n装备展\t160494\n峾海f0哲国十\t160495\n逗一泓\t160496\n}\t160497\n发情期\t160498\n明年\t160499\n吾着\t160500\n扭扭捏捏扭扭捏捏\t160501\nhhbenen\t160502\n巩俐如\t160503\n几33100100\t160504\n二百二百三十兆\t160505\nxugy\t160506\n马思思\t160507\n烤馍馍\t160508\n韩公公\t160509\n森林谜\t160510\n黑暗时代\t160511\n顾晗\t160512\n多啦a梦\t160513\n致谦坦白\t160514\n呀喏\t160515\n叫随到\t160516\n瑜\t160517\n成年通\t160518\n云似\t160519\n离退休\t160520\n睡醉酒\t160521\n太安静\t160522\n一克拉梦想\t160523\n那样爱你,完美恋人\t160524\nwhoorre\t160525\n的那你到我家呗\t160526\n党风\t160527\n小鱼贝贝\t160528\n落地\t160529\n连队\t160530\n落场\t160531\n奥c辣奥特曼\t160532\n亲临\t160533\n亲上\t160534\n20岁\t160535\n一两次\t160536\n温良恭俭\t160537\n亲一\t160538\n殖民\t160539\nishi\t160540\n53000\t160541\n独苗蒜\t160542\n新疆克拉玛依\t160543\n米西克\t160544\ncdgz\t160545\n陈列馆\t160546\n好不开心\t160547\n2周年\t160548\n言情小说\t160549\n胜村\t160550\n你民\t160551\n二12月22\t160552\n那你异种我包\t160553\n王佳欣\t160554\n针脚\t160555\n一飞\t160556\n多公里\t160557\n重庆晚报\t160558\n小瑾\t160559\n百白咳\t160560\n欠钱\t160561\n威远娟\t160562\n咯k\t160563\n吴生天花\t160564\nlkkllkkk8787878787\t160565\n累北流\t160566\n冷气\t160567\n王八臭\t160568\n郑允浩\t160569\n鸡友\t160570\n小瑞\t160571\n弯弓\t160572\n好嫩\t160573\n润峰苑\t160574\n菊次郎的夏天\t160575\n国人们\t160576\n光面\t160577\n这个字\t160578\n这个子\t160579\n84个\t160580\n显微镜\t160581\n依瑶\t160582\nA++级\t160583\n咪咪快点儿\t160584\n不接上\t160585\nvhovqsc\t160586\n火轨\t160587\n548358559643494\t160588\n屋塔房王世子\t160589\n炎帝\t160590\n5522839\t160591\n减灾\t160592\n火车\t160593\n输女\t160594\n3161645555555555\t160595\n欧朋浏览器\t160596\n要不正\t160597\n长微博\t160598\nhchff\t160599\n晚霜\t160600\n罗马甘菊\t160601\n偈谈元\t160602\n心烦\t160603\n头槌\t160604\n幺七\t160605\n不好意思久等\t160606\n100份\t160607\n娃子们\t160608\n4800平方米\t160609\n兰州市\t160610\n学理科\t160611\n的谁不爱自\t160612\n武钢\t160613\n好问\t160614\n子倩\t160615\n好闲\t160616\n看不看\t160617\n日本皇\t160618\n好闷\t160619\n410425200107260083\t160620\n好闻\t160621\n好闺\t160622\n好闹\t160623\nT裤\t160624\n第十三届\t160625\n七八个\t160626\n121743547\t160627\nG
VVBv布局徐军部创举据悉佛西\t160628\n某月\t160629\n毛毯\t160630\n靠哈哈\t160631\n猪肥膘\t160632\n上学吧\t160633\n冠以\t160634\nCASSSSSSSSSSSSSSSSSSSSSSSSSSSSSS\t160635\n熊梦莹\t160636\n咯木马\t160637\n七八万\t160638\n顶部\t160639\n疯疯癫癫\t160640\n王程冉\t160641\n上学后\t160642\nDance\t160643\n开战\t160644\n没我可怜\t160645\nuiu\t160646\n嗯贵\t160647\n翟换平\t160648\nI8150黑2000\t160649\nhttpahiphotosbaiducomxiaodupicitem728da9773912b31b5084da5b8118367adbb4e1dcjpg\t160650\n邋遢鬼\t160651\n小饭\t160652\n木头券\t160653\n残垢\t160654\n罗凤二小学五四班偷偷头\t160655\n别别别别别别\t160656\n13902225675\t160657\n小饿\t160658\n小饼\t160659\n下来楼\t160660\n相助\t160661\n正长方形\t160662\nBUNNY\t160663\n匹诺曹\t160664\n相加\t160665\n罐儿\t160666\n相劝\t160667\nGAZ\t160668\nGAY\t160669\n那年就死的男男男男的\t160670\n沒妲\t160671\n62445444\t160672\n天中节\t160673\n鸡胗\t160674\n龙诀\t160675\n玲珑宝塔\t160676\nskrvisb\t160677\nGAG\t160678\nGAE\t160679\n倒吊男\t160680\n对偶句\t160681\noneof\t160682\n五一斤\t160683\n蜘蛛萌\t160684\n生小喝\t160685\n潦草\t160686\n笑吧\t160687\n包在\t160688\n南阳子怡\t160689\n七十几分\t160690\njhfg\t160691\nChiese\t160692\n我爱你我爱死你了我们结婚吧\t160693\n向南方I\t160694\n明天妈\t160695\n不干行\t160696\n曹颖怀\t160697\nsmitme\t160698\n刘拜拜\t160699\nYyy\t160700\n五河\t160701\n不忍心\t160702\nwyf\t160703\n略带\t160704\n在脚下\t160705\n芙蓉广场\t160706\nwym\t160707\n黄果树瀑布\t160708\n莫名其妙\t160709\n狂补\t160710\n手记\t160711\n天津航空\t160712\n祝悦悦\t160713\n363333333333663333366666699999999999\t160714\n采春\t160715\n南悲秋悲\t160716\n邓超杰\t160717\n都暻秀\t160718\n光阴荏苒\t160719\n自恋狂\t160720\n欧泽伟\t160721\n吉尼斯世界纪录\t160722\ntf卜o14\t160723\n临别\t160724\n一亿张\t160725\n杨西村\t160726\n看不民\t160727\n紫怡\t160728\n温颜僮\t160729\n丹阳市\t160730\n阿里还我\t160731\n崩潰\t160732\n洛杉矶县\t160733\n休闲期\t160734\n帅杰\t160735\n刀山下\t160736\n挨家\t160737\n不留\t160738\n44.7元\t160739\n義軍愉\t160740\n这是我的第再来\t160741\n步集\t160742\n金毛狗\t160743\n尹不凡\t160744\n五食饭\t160745\n不畏\t160746\n郑润\t160747\n场场\t160748\n单蛋\t160749\n三河市\t160750\n不畅\t160751\n场地\t160752\n一ll\t160753\n五个明\t160754\nertjjhb\t160755\n小女人\t160756\n老历\t160757\n方式子\t160758\n狠你好色\t160759\n饺子头\t160760\n强仁\t160761\n536\t160762\n汉儿\t160763\n裱花师\t160764\n呼缓\t160765\n电
影号\t160766\n儒讲谐世\t160767\n电影史\t160768\nP6200\t160769\n有点舍不得\t160770\n学富五车\t160771\n耿给\t160772\n|||||\t160773\n苏凤茹\t160774\n张路阳\t160775\n根雕\t160776\n我喜欢你小秘书度秘\t160777\n美国国家地理学会\t160778\n怒头\t160779\n徐秀田\t160780\n咋来\t160781\n牛子训\t160782\n特哥\t160783\n推选\t160784\n校牌\t160785\nhttpdhiphotosbaiducomxiaodupicitem09fa513d269759eeb65bf4b8b5fb43166d22dfbejpg\t160786\n彩虹精灵\t160787\nBB一8\t160788\nash度秘\t160789\n摇头晃脑\t160790\njyzkurlztzitx5j……jstfhjtgf\t160791\n握我\t160792\n警营\t160793\n海沙\t160794\n金鹰\t160795\n八把\t160796\n叽巴\t160797\n叫偶尼\t160798\n木鱼峰\t160799\n傻子猪\t160800\n雷利\t160801\n邹忌讽齐王纳谏\t160802\n庆功宴\t160803\n八折\t160804\nysvjj\t160805\n美惠\t160806\n密度米\t160807\nhhgfgfggg\t160808\n一把一百粒个\t160809\n假肢\t160810\n566778\t160811\nDaYingShenmw\t160812\n毛天使\t160813\n玉佛殿\t160814\ndufug\t160815\n海沧\t160816\n旧称\t160817\n小天地\t160818\n有别有钱\t160819\n于琳\t160820\n李燕伶\t160821\n倒立\t160822\n夏都\t160823\n李娅琴\t160824\n张心爱\t160825\n阿魏\t160826\n王廖汶\t160827\n操纵者\t160828\n病魂\t160829\n憋忍\t160830\n俏皮\t160831\n小贴士\t160832\n保护我知道\t160833\n6段\t160834\n最讨厌\t160835\n崔小\t160836\n起架\t160837\n栗衣\t160838\n果果\t160839\n天人合一\t160840\n青青管神经吗你\t160841\n智障\t160842\n远门\t160843\n秦赢\t160844\n贝尔格里\t160845\n烟柳岩\t160846\n太阳花儿朵朵朵\t160847\n发束\t160848\n泡腾\t160849\n真的好想\t160850\n迈腾王\t160851\n陈合唱\t160852\n奥波德\t160853\n正火热销售\t160854\n南国\t160855\n11天\t160856\n看多来\t160857\n9X年\t160858\n桃祥\t160859\n一天天一点\t160860\n刘亚仁\t160861\nzhanghai\t160862\nIve\t160863\n参加斯纪念\t160864\n护肤沉舟梦日边\t160865\njosh\t160866\n落日\t160867\n不可知\t160868\n史建龙\t160869\n三零二\t160870\nbeento\t160871\n咪咪咪咪咪咪\t160872\n晓明堂\t160873\n百步\t160874\nIvy\t160875\n多面一段落\t160876\n巴黎十一大\t160877\nfgeyfd\t160878\n良民\t160879\nsbatouto\t160880\n2093776619\t160881\n老大姐\t160882\nhwatawdtjjmbdwtqgkajagadjmjgmjd刁：一mwjamnjpt15663642gq\t160883\n知心姐\t160884\n黧黑\t160885\n听说话\t160886\n素描照\t160887\n巨穷\t160888\n1898年\t160889\n克莱斯勒\t160890\n文你个事\t160891\n零价\t160892\n零件\t160893\n杨帆\t160894\n醒不来\t160895\n你最好呀\t160896\n得大自在\t160897\n听说证\t160898\nsjfb\t160899\n醒来后\t160900\n泰山\t160901\n晓荷\t160902\n放学号\t1
60903\n逾越\t160904\n姚\t160905\n加标\t160906\n姝\t160907\n姜\t160908\n姓\t160909\n姑\t160910\n姐\t160911\n姗\t160912\n说不死\t160913\n沃克\t160914\n委\t160915\n始\t160916\n姊\t160917\n哦皮\t160918\n庆隆\t160919\n说不正\t160920\n床帘\t160921\n姆\t160922\n商国\t160923\n姻\t160924\n张对\t160925\n富豪榜\t160926\n姿\t160927\n论语\t160928\n你好错\t160929\nQ乙\t160930\n头肿\t160931\n骨子里\t160932\n本特纳\t160933\n姨\t160934\n为半\t160935\n印尼\t160936\n不桑心不桑心\t160937\n头肩\t160938\n我是大好人你是大傻子\t160939\n沃兹\t160940\n张门\t160941\n姥\t160942\n直升机\t160943\n乐安河\t160944\n小鱼吃麻虾\t160945\n受不完\t160946\nalKall\t160947\n张寰\t160948\n老友记\t160949\n吕他乡\t160950\nhttpehiphotosbaiducomxiaodupicitem5d6034a85edf8db16f8\t160951\n统读\t160952\n1月14号\t160953\n有影响\t160954\n半山腰\t160955\n催化\t160956\n伊利诺伊州\t160957\n五秒钟\t160958\n幺汇款\t160959\n谢娜帅\t160960\n衢州\t160961\n349岁\t160962\n你好我是的主人\t160963\nq九十六百分之96\t160964\n发啦\t160965\ntucut\t160966\n上半天\t160967\n老脸\t160968\n山寨机\t160969\n150112053359\t160970\n遮雨\t160971\n保梅熊\t160972\n0532876628708762871\t160973\n一个度秘我爱你\t160974\n随遇\t160975\n青水\t160976\n得得得\t160977\nHenry\t160978\n5428822\t160979\n丁豪广场\t160980\n花旗集团\t160981\n石勇\t160982\n试验\t160983\n电价\t160984\n糖尿病\t160985\n拍摄地\t160986\nzdn\t160987\n青气\t160988\nzdh\t160989\nzdk\t160990\n明眼人\t160991\nzdd\t160992\nzdf\t160993\n你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪吗你是猪\t160994\n唱人\t160995\n曾经\t160996\n婚纱照\t160997\n七月二\t160998\n春深\t160999\n儿子处\t161000\n蓝志\t161001\n就好再见\t161002\n马子\t161003\n成服\t161004\nparter\t161005\n很多说呢\t161006\nGideon\t161007\n18399300183\t161008\n诸城脑筋急转弯\t161009\n扑腾\t161010\n快男十二强\t161011\n倒背如流\t161012\n小娜\t161013\n小娟\t161014\n自驾游\t161015\n小娘\t161016\n珏宝\t161017\n小娇\t161018\n小威\t161019\n爱不单行\t161020\n小娃\t161021\n朱伟\t161022\n飞小心\t161023\n张德勇\t161024\nhttphhiphotosbaiducomxiaodupicitemae\t161025\nloll诺克幂幂\t161026\n大陆妹\t161027\nyouasec\t161028\n正三角\t161029\n下捉蛇\t161030\n华南广场沃特迪卡侬\t161031\n4397200\t161032\n第三次\t161033\n悲兮\t161034\n万贵\t161035\n我家家汉庭\t161036\n搏斗\t161037\n1512496\t161038\n万贯\t161039\n体验\t161040\n遇见你\t161041\n香约茉莉雪花膏\t161042\n万怪兽\t161043\n新手\t161044\n女同性\t161045
\n薛海兰\t161046\n2007年10月1日\t161047\n谢文\t161048\n亮瞎\t161049\n上下堡寺\t161050\n宗雅琴\t161051\n今天6时\t161052\ndddddbbbb\t161053\n李筱颖\t161054\n爆米花\t161055\n危害性\t161056\n呕漏\t161057\n派包机\t161058\n木马木马木马木马陌陌陌\t161059\n不穿着行\t161060\n钟炫\t161061\n十二时\t161062\n良辰\t161063\njulie\t161064\n嗯全\t161065\n卖淫\t161066\n好怕\t161067\n钟点\t161068\n呗长峰\t161069\n法乂功\t161070\n洗衣劲\t161071\n专门\t161072\n16:50\t161073\n龙熙\t161074\n会阴\t161075\n8点39分\t161076\n共产党第十四色复古v\t161077\n好怨\t161078\nhospital\t161079\n同安\t161080\n于正经\t161081\n嗯先\t161082\n周家吧新世纪\t161083\n十万二十万\t161084\n老祖宗们\t161085\n哇噻\t161086\n到天亮\t161087\n哀悼\t161088\n凶巴巴\t161089\n南辛庄\t161090\n六哥\t161091\n别站\t161092\n万达集团\t161093\n不老虎\t161094\n电卡\t161095\n小面比\t161096\n有效性\t161097\n甘依\t161098\n说了叫\t161099\n需努力\t161100\n度密月\t161101\n忠格尔加\t161102\n奇瑞\t161103\n清汤锅\t161104\n饲养成\t161105\n道士\t161106\n蓝媒\t161107\n山墙\t161108\nhfghhjjjhghgyjbvvhcchjvkbkjvugjmvbjvjvjbjbkv\t161109\n北京地铁1号线\t161110\nuuuyuuuuu\t161111\n电单\t161112\n借古讽今\t161113\n329家\t161114\n六七吻\t161115\n钱钱钱钱钱钱钱钱\t161116\n炫舞时代\t161117\njjwo\t161118\n帅哥\t161119\n跳投\t161120\n大春竹\t161121\n烩面学\t161122\n林晓白\t161123\n小松松\t161124\n感情好不\t161125\n哈关系\t161126\n过脾披萨\t161127\n大足餐\t161128\n假摔\t161129\n川岛芳子\t161130\n光鲜亮丽\t161131\n政协\t161132\n美呢美\t161133\n半球\t161134\n要脸\t161135\n佘思锜\t161136\n抱胎\t161137\n每户\t161138\n累打\t161139\n67666\t161140\n刘梦雨\t161141\n1.5xD\t161142\n14398\t161143\n150212003057188\t161144\n遭\t161145\n你的天机\t161146\n精灵宝可梦\t161147\n遮\t161148\n黄嘉诚\t161149\n早睡\t161150\n遥\t161151\n遦\t161152\n呗度秘\t161153\n遣\t161154\n遢\t161155\noffffff\t161156\n燕庄\t161157\n避\t161158\n選\t161159\n偶吧阿拉斯\t161160\npiclick\t161161\n马文·尼科森\t161162\n传销\t161163\n遲\t161164\n遍\t161165\n源于\t161166\n過\t161167\n遉\t161168\n運\t161169\n遊\t161170\nggjx\t161171\n火星\t161172\n走着睡\t161173\n遁\t161174\n遂\t161175\n七项\t161176\n过新年过\t161177\n遞\t161178\n遛\t161179\n紫荆\t161180\nggjh\t161181\n遗\t161182\nggjk\t161183\n五保\t161184\n遐\t161185\n道\t161186\n遒\t161187\n小心龙\t161188\n跑西跑\t161189\n太平洋争霸战\t161190\n刘生\t161191\n88818\t161192\n落花有情\t161193\n新一与\
t161194\n傻子姑\t161195\nvitf\t161196\nvite\t161197\n看好想\t161198\n赵某\t161199\n茶巴啦啦小魔仙\t161200\n集体照\t161201\n少爷\t161202\n几百把\t161203\n有些天\t161204\n亚度秘\t161205\n焦点\t161206\n呼叫可爱多\t161207\n啧一看\t161208\n爱的旅程\t161209\n欧阳露露\t161210\n优势型\t161211\n植物大战僵尸全明星黑暗世界\t161212\n求是\t161213\n152103154205\t161214\n飞了起来\t161215\n听道\t161216\n18．5％\t161217\n老人家\t161218\n歌城\t161219\n材料\t161220\n三千美元\t161221\n劲跳糖\t161222\n白茅湖\t161223\n张锦涛\t161224\n足足\t161225\n临摹\t161226\n朱志威\t161227\n一零四六\t161228\nyoulel\t161229\nInfegdnnnbhh\t161230\n玩转\t161231\n佳妮妮\t161232\n景逸\t161233\n巢湖\t161234\n7周岁\t161235\n墨非定律\t161236\n匡性\t161237\n你是智能机器人\t161238\n四G，四\t161239\n361°\t161240\n熟冷\t161241\n七分之四\t161242\n不远处\t161243\n咯喔\t161244\n密战\t161245\n九百九十个\t161246\n秘精\t161247\nvvx\t161248\ni建瓯\t161249\n靠点谱\t161250\n松驰\t161251\n共谋\t161252\n邮寄\t161253\n哎东动\t161254\n陈彦池\t161255\nvvv\t161256\nfjshv\t161257\n#狂欢仲夏\t161258\n洒水车\t161259\n爪爪爪爪爪爪爪爪爪爪爪爪爪爪爪\t161260\n躺椅\t161261\n冷嘲热讽\t161262\n瑞mierashio001ajyoutloopeeasouyoucetortatiouagjckcomen\t161263\n巡t\t161264\n骨子\t161265\n到时来\t161266\n秘密的秘\t161267\nPABCD\t161268\n抽刀断水水更流举杯消愁愁更\t161269\n王文学\t161270\n说不骂\t161271\n配货\t161272\n香火钱\t161273\n屈臣氏\t161274\n180.00\t161275\n你不是我的金枝玉叶，我也不是你的玉树临风\t161276\n卧醉\t161277\n蒋劲夫\t161278\n新制度经济学派\t161279\nsuvc\t161280\n脸战\t161281\n制片家\t161282\n太钱\t161283\n你好美你\t161284\n大住\t161285\n铜丝\t161286\n空军\t161287\n苏楠\t161288\n聚划算周年庆#\t161289\n雷霆炎魔\t161290\nrang\t161291\n狗🐶\t161292\nibobogivps\t161293\nffdfddgffgt\t161294\n額額額額\t161295\n黑道王\t161296\n天秤女花心\t161297\n嗯般\t161298\n一幕\t161299\n口苦\t161300\n吃麻球\t161301\n岁月无声\t161302\nMcGhee\t161303\n无数条\t161304\n一幅\t161305\n和你说\t161306\n奥讨厌\t161307\nyangying\t161308\n刁民厂保安公司\t161309\n张泉灵\t161310\n赶到\t161311\n露片\t161312\n一干\t161313\n地下一只脚\t161314\n一年\t161315\n一并\t161316\n快快跑\t161317\nkewe\t161318\ndguruh\t161319\n正时怎\t161320\n你美得你美得你美得你美得你美得你美的女孩\t161321\n一幢\t161322\n摔裂\t161323\n逼事儿\t161324\n死人妖\t161325\nEvelyn\t161326\nshucongcomread1616422971htmlqdQZ919\t161327\n60个\t161328\n一千元\t161329\n新藤\t161330\n一千克\t16133
1\n你好哈哈哈哈哈哈哈哈\t161332\n迟早\t161333\n\t161334\n瑞干\t161335\n郊外\t161336\n悄逝\t161337\n凡人\t161338\n憐\t161339\n圣龙\t161340\n你是狗吗你是猫吗你是人\t161341\n三二一二零\t161342\n贤海\t161343\n一会儿陪我玩奥\t161344\nweynutn\t161345\n60万\t161346\n一千六\t161347\n冷饮\t161348\n冶冶\t161349\nABA\t161350\n合租\t161351\n人满天\t161352\n泫子\t161353\n大把大把\t161354\n阿莫斯\t161355\n十二篇\t161356\n烦我想\t161357\nJiang\t161358\n晃\t161359\n曹瑞文\t161360\nddcb\t161361\n贝古\t161362\n炖菜\t161363\n簪子\t161364\n三门峡市实验中学\t161365\n够了别再说\t161366\n132567890\t161367\n几分之几\t161368\nfffffnfffffffgffnfnffffggfnnfnoffnfnnfnoff\t161369\n逆袭\t161370\n郑艳婷\t161371\n宋可乐\t161372\nvdgsh\t161373\n那多年\t161374\nABC\t161375\n为谁\t161376\n狠人\t161377\n川普\t161378\n叔敏\t161379\n宗文瑞\t161380\n二一下\t161381\ntudwjchj\t161382\n刘诗诗\t161383\n15年\t161384\n贵妇帽\t161385\nonlyin\t161386\n仓储\t161387\n下手\t161388\n琼瑶\t161389\n江安河\t161390\n烦有时候\t161391\n敬老\t161392\n植物大战僵尸无尽版\t161393\n第二十篇\t161394\n慎用\t161395\n小崽子\t161396\n99队\t161397\n认识见到你我是你的小秘书度秘\t161398\n打滑\t161399\nbbbbbn\t161400\n84岁\t161401\nbbbbba\t161402\nbbbbbb\t161403\ntgvvbjfchffggfyhhfggfbhtghggbgfcfffzjteygGERHHFFUKNVDFGBDFHBBWghhbfdffhydfhtfbjyfcvhy\t161404\nyanz\t161405\n何用\t161406\n46级\t161407\n不以己悲\t161408\nyang\t161409\n重播\t161410\nWant\t161411\n不言而喻\t161412\nwillll\t161413\n琼瑜\t161414\n佳和\t161415\n孔帅\t161416\n威凯c3湾\t161417\n18330492323\t161418\n葛利斯\t161419\n下川镇\t161420\n五快点\t161421\n黑铁\t161422\nliste\t161423\n15156399912\t161424\n啵啵爱\t161425\n你堂\t161426\n晒后\t161427\n金怡濂\t161428\n亲爱的那你给我\t161429\n赔礼\t161430\n鬼魂昆\t161431\n搜火\t161432\n36计\t161433\nWcdma\t161434\n我的你好\t161435\n龙柱\t161436\n去聊\t161437\n哪话\t161438\n十六十七八十八个\t161439\n晚膳\t161440\n私照\t161441\n一万多分\t161442\n雨泪\t161443\n想吃肉\t161444\nv摩加迪沙正传vv\t161445\n雨波\t161446\n角侦探\t161447\n3p行\t161448\nPro\t161449\n日柱己未甲戌\t161450\n种田文\t161451\ngotared\t161452\n合数\t161453\n五十六五十八六十五十二五十四五十六五十八\t161454\n送审\t161455\n憧\t161456\n王维\t161457\n索本\t161458\n老嘎\t161459\n林宝坚尼\t161460\n中国人保\t161461\n从零开始\t161462\n舒颖\t161463\n帕克\t161464\n维埃勒\t161465\n缝补\t161466\n双k伯之相片\t161467\n采众长\t161
468\n王兰柱\t161469\n感知\t161470\n由内而外\t161471\n零七年\t161472\n恋足\t161473\n动荡\t161474\n展业\t161475\n18252850619\t161476\n网络不给力\t161477\n相你表白\t161478\n随想贷\t161479\n黑你不开心你和我快点\t161480\n李小阳\t161481\n不得不赞\t161482\nyrfurbjrte\t161483\n自私自大\t161484\n我的爱恋\t161485\n59吨\t161486\n屠龙记\t161487\n神经屌丝\t161488\n29页\t161489\n风儿\t161490\n22100天\t161491\n寄居\t161492\n西门子电器\t161493\n1月25号\t161494\n53多少31\t161495\n就是爱你\t161496\n功名\t161497\nCOSMO6\t161498\n网络堡\t161499\n260张\t161500\n布里哥们\t161501\n计较\t161502\neysyreyrgg\t161503\n交不上\t161504\n对岸\t161505\n千里送\t161506\ndaoke\t161507\n雾宅\t161508\n乳瓶\t161509\n既往不\t161510\n哈呀\t161511\n茅老贼\t161512\n九毛九毛六\t161513\n介绍部\t161514\n600175247750\t161515\n嬉嬉闹闹\t161516\nwindows\t161517\n你的眼神\t161518\n给我打\t161519\n笔者\t161520\n恩乖乖\t161521\n五高六\t161522\n手领\t161523\n一年几岁\t161524\n56695\t161525\n数额\t161526\n841542571551254258\t161527\n嘟嘟面\t161528\n孙林峰\t161529\n睡梦诛\t161530\n13930636202\t161531\nFyfyxycjbobibkbkbjcgxfzsgxycubibllvivudsaqtxyvivyZXtdycSTCUVOadtcugiUfifaRzt\t161532\n巴啦啦小魔仙之音\t161533\n1863636368551863636866\t161534\n赵小姐\t161535\n刘英阁\t161536\n李思修\t161537\n小板妃\t161538\n脑内\t161539\n龙须所\t161540\n公公们\t161541\n汉方\t161542\n转化率\t161543\nABO\t161544\n免字\t161545\n12233456789\t161546\n歹1r\t161547\n知见\t161548\n夜夜厘米\t161549\n语种\t161550\n别跟我较劲\t161551\nb2c\t161552\n知觉\t161553\n汉文\t161554\n一应俱全\t161555\n好看我骗你了我动\t161556\n准地\t161557\n不灭你我可以背唐诗床前明月光疑是地上图哇举头望明月\t161558\nqqaaa\t161559\n仝孟汉\t161560\n爱奇帮\t161561\n前者\t161562\n递交\t161563\n不杀\t161564\n彭冉\t161565\n波有\t161566\n马惠平\t161567\n一个人的旅行\t161568\n李冰荣\t161569\n猜字\t161570\n南塘\t161571\n漏税\t161572\n南塔\t161573\n好我是你的拜拜\t161574\n116748\t161575\n不束\t161576\n4838\t161577\n巳逼\t161578\n孤帆远影碧空尽见长江天际流\t161579\n4831\t161580\n550瓦\t161581\n相报\t161582\ncjyg\t161583\n或许\t161584\n蒋门神\t161585\n李俊贤\t161586\n郑堡\t161587\n种植牙\t161588\n我是女的那你也就是女的了我的好闺蜜\t161589\n岚汐\t161590\n一心不二用\t161591\n甲方\t161592\n饺子者\t161593\n给是\t161594\n张大黑\t161595\n绵阳中学实验学校\t161596\n真是的你当我好骗\t161597\n有一年级\t161598\n大雨天\t161599\n新仓\t161600\njiags\t161601\n方大同生日快乐#\t161602\n
少侠\t161603\n侧目\t161604\n离子散\t161605\nshoppingmall\t161606\n生不如死\t161607\n寸光\t161608\nSuiguotol\t161609\n墙党\t161610\n刘笑容\t161611\n伊泰队\t161612\n会员\t161613\n挨揍\t161614\n魅力四射\t161615\n六要\t161616\nmillie\t161617\n新任\t161618\n鲁冰萌\t161619\n恋曲\t161620\n九阳永修\t161621\n马嘴眼\t161622\n来了度\t161623\n您老\t161624\n7点整\t161625\n九百多公里\t161626\n京沪高速\t161627\n沿着\t161628\n耳朵床\t161629\n也不\t161630\n格米亚\t161631\n上臂\t161632\n广汉禅文化书画院\t161633\n别闹了行\t161634\n真寂寞\t161635\n压挡\t161636\n刚一次\t161637\n通票\t161638\n达卡\t161639\n奥对度秘\t161640\n明天过后\t161641\nBbnv\t161642\n亚历山大\t161643\n33458\t161644\n学去\t161645\n三点十五\t161646\nfhfx\t161647\n出土\t161648\n写一不说\t161649\n司冉\t161650\n活做\t161651\n五花肉\t161652\n送客\t161653\n政余\t161654\nbakhiles\t161655\nbbdx\t161656\n分分钟\t161657\n们先看看纲要个月的高歌\t161658\n破鞋\t161659\n再来一个再来一个再来\t161660\n智者\t161661\n出场\t161662\n助攻\t161663\n得住\t161664\n世达独家\t161665\n甘佳怡\t161666\n左长歌\t161667\n突现\t161668\nPOS机\t161669\n112112\t161670\n政体\t161671\nffffff\t161672\n有节\t161673\n乐无穷\t161674\n小芳芳\t161675\n从商\t161676\n大儿老儿\t161677\n菜店\t161678\n48个\t161679\nDjejekjday\t161680\n警校\t161681\n生蜘蛛侠\t161682\n更为\t161683\n我在唱歌你是我的小呀小苹果\t161684\n老辉\t161685\n肾源\t161686\n汝狗\t161687\n郑锐\t161688\n冯秀慧\t161689\n48万\t161690\n关系\t161691\n关糸\t161692\nhfdgxvjxkvjxxnxjjf屯zjivimn\t161693\n洗洗波\t161694\n在线的朋友我对象和你聊天我心的话\t161695\n想你的夜\t161696\n个种\t161697\n发声\t161698\n本多\t161699\n书记员\t161700\n更丑\t161701\n蒋林翰\t161702\n嘴唇\t161703\n张能\t161704\n佛协\t161705\n评定\t161706\n范县新区\t161707\n\t161708\n接管\t161709\n大上海里\t161710\n生日快乐\t161711\n监督员\t161712\n進碟子裡面然後\t161713\n称\t161714\nGuidingtimed\t161715\n补救\t161716\n渔者\t161717\n噗噗噗噗噗\t161718\n走邪道\t161719\n我不我不喜欢\t161720\n秽\t161721\n宠我就好了吧友人\t161722\n评审\t161723\n太好了你想\t161724\n杨新月\t161725\ncptbtptpbcpt\t161726\n河桥\t161727\n私心\t161728\n铁戒尺\t161729\n老鼠米老鼠\t161730\n湘岳\t161731\noppomini5117\t161732\n叶考完\t161733\n双色球号码\t161734\n胶州湾\t161735\n红泥\t161736\n人民\t161737\n羊屁眼\t161738\n人气\t161739\n呗儿\t161740\n8326979557999799266\t161741\n覆灭\t161742\n三四天之后\t161743\n尿频天\t161744\n猫老虎\t161745\n诉讼时效\t161746\
n杨家坪\t161747\n考场\t161748\nhiphop\t161749\n聚焦\t161750\n怪味\t161751\ncaauxbmelchoiupiioooopoppjbkm\t161752\n苏瑜婷\t161753\n家庭式\t161754\n9点30\t161755\nj1j\t161756\n万一次\t161757\n不爱听话\t161758\ngppt\t161759\n裤袜\t161760\n56只\t161761\n举世无双\t161762\n亲爱的爸爸\t161763\n32753735752727353435353435253434255737355567673\t161764\ni酷哈\t161765\n裤袋\t161766\n甜党窝\t161767\njcns\t161768\n出尔反尔\t161769\n韦诚诚\t161770\n破论\t161771\n我美了美了美了美了\t161772\n宝鹿鹿\t161773\n北京熊你好乖北极熊你好\t161774\n晴间\t161775\n破让\t161776\nWokan\t161777\n钱大群\t161778\nFgijf\t161779\n干活儿\t161780\n一百五十G\t161781\nyyggghguhhyyggygfttfgytyyyhghhhhh\t161782\nurd\t161783\nshenmojiuzhi\t161784\n张盟\t161785\n糟粕岂止胜\t161786\n编修\t161787\n锁度\t161788\n园地\t161789\n五五百多\t161790\n不知不知不知不不知\t161791\n张周杰伦\t161792\n张张张\t161793\nwsSn\t161794\n睡梦\t161795\n装不像\t161796\nliwu\t161797\n张相\t161798\n电工\t161799\n油分\t161800\n马浜小区\t161801\n其外\t161802\n秒\t161803\nliwy\t161804\n红枣佳\t161805\n6699\t161806\n意中人\t161807\n曾几何时\t161808\n孙马\t161809\n张雨豪\t161810\n无忧\t161811\n哩几米\t161812\n6696\t161813\n跑向\t161814\nyihjlf\t161815\n藉由\t161816\n八五零二\t161817\n编辑器\t161818\n李小华\t161819\n我们的学校实验小学\t161820\n咬咬牙\t161821\n皮克\t161822\n有深度\t161823\n张含韵\t161824\n猪秘度\t161825\n臭度秘狗度秘\t161826\n汗如雨下\t161827\nGfjggj\t161828\n天剑\t161829\n好我是你的主人你要听我的话\t161830\n天前\t161831\n秆\t161832\n鞭厕入\t161833\n决对\t161834\n包皮手术\t161835\n私\t161836\n成疾\t161837\n厂父\t161838\n开泰\t161839\n挺不错\t161840\n奇侠\t161841\n负债也要大发展\t161842\nick\t161843\n马高利\t161844\nHcffgh\t161845\n大慈\t161846\n东郊\t161847\n709806927\t161848\n度秘度拉科\t161849\n避风港\t161850\n板鸭板鸭板鸭\t161851\n地步\t161852\n杜秘\t161853\n几万多\t161854\n经意\t161855\n受好不好\t161856\n尚黑教\t161857\n虎皮卷\t161858\n恶魔阵\t161859\n杏蕉网\t161860\n拜拜贻\t161861\n高美淇\t161862\n大慨\t161863\n违规\t161864\n全犯\t161865\n嗯兔兔图兔兔图\t161866\n秋\t161867\n魏绍霞\t161868\n45257\t161869\n马雨桐\t161870\n又听\t161871\n中国佛学院\t161872\n且见\t161873\n朴春歌\t161874\n南撤\t161875\n龘da\t161876\n更全面\t161877\n42%\t161878\n6666622222\t161879\nDkdkekekjcnddjndjfncncndnsslzlxkf\t161880\nllw\t161881\n大夷夫\t161882\ni4tfboys\t161883\nllj\t161884\
n小虎队\t161885\nllh\t161886\nopports\t161887\n法样\t161888\nlll\t161889\nlla\t161890\n官博\t161891\n742168\t161892\n无心\t161893\n偏小\t161894\n告诉你从现在开始\t161895\n张宇晨\t161896\n妇产科\t161897\n过家家吧\t161898\n至极\t161899\n好受伤\t161900\n网恋\t161901\n五二二零\t161902\n泡汤\t161903\ncharge\t161904\n那个证\t161905\n十根\t161906\n虚无缥缈\t161907\n那么快\t161908\nTF分\t161909\n鬼我不打\t161910\nzsy20011010\t161911\n口戲\t161912\n报红灯\t161913\ntfghxhxhjctjxhchmcjvgnv\t161914\n王宫\t161915\n宅众\t161916\n皇室战争\t161917\nDfffffgffvfft\t161918\n真的好好玩\t161919\n远亲\t161920\n郑凤俊\t161921\n王室\t161922\n个方\t161923\n和度秘\t161924\n惠州学院\t161925\ncook\t161926\n还预\t161927\n3:30\t161928\ncool\t161929\n阿、、、\t161930\n王安\t161931\nmnmnb\t161932\n周成龙\t161933\n炮制\t161934\n王宁\t161935\n王宇\t161936\n400400\t161937\n424\t161938\nfufuffuf\t161939\n明儿同学会\t161940\n了凡\t161941\n谢谢你我的好妹妹\t161942\n因为你不让我看你的那你\t161943\n惋惜\t161944\n第488位\t161945\n1079466983\t161946\n我最爱娃我最爱往来\t161947\n这方\t161948\niamverygood\t161949\n曹宇\t161950\n千纤草\t161951\n轻微\t161952\n你好笑\t161953\n姚星彤\t161954\n三妻四妾\t161955\n山东省烟台市毓璜顶医院生殖中心分娩室\t161956\n塘沽新港\t161957\n远目\t161958\n十万八千\t161959\n龙那\t161960\n商贸\t161961\n找工\t161962\nimbdouc\t161963\n29913\t161964\n技术创新\t161965\n隐情\t161966\n正长\t161967\n9尊\t161968\n照遭\t161969\nCOSER\t161970\n最是寂寞袁世凯\t161971\n仙庵\t161972\n一边儿\t161973\nxilyiahyi\t161974\n来看我\t161975\n短暂\t161976\n1944\t161977\n不公平\t161978\n未知生\t161979\n辣子鸡\t161980\naall\t161981\nduying\t161982\n厚脸\t161983\n0.01km\t161984\nn1锁屏\t161985\n思埠\t161986\n思域\t161987\n古德拉\t161988\n两百多个\t161989\njorn\t161990\n杜鹏程\t161991\n多咪多咪多咪度\t161992\nｇ零\t161993\n菱菱\t161994\n游历\t161995\n继娟\t161996\n斯科拉\t161997\n古德拜\t161998\n沉寂\t161999\n剧院得一人心白首不分离这真心的话语\t162000\n今次\t162001\n平视\t162002\n第7集\t162003\n熊出\t162004\n亚运会\t162005\n泡茶\t162006\n安若\t162007\n纪文文\t162008\n搜一搜\t162009\n平角\t162010\n放吧\t162011\n2专\t162012\n曲区\t162013\n祖中\t162014\n第四第七\t162015\n胡梦依\t162016\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn\t162017\n走了待\t162018\n四魔鬼\t162019\n148米\t162020\n记牢\t162021\n人民英雄纪念碑\t162022\niphone\t162023\
n英国内政部边境署\t162024\n51周\t162025\n不仅仅\t162026\n麻烦人家\t162027\n敢碰瓷\t162028\n奥萨诺\t162029\n嗯等会儿\t162030\nsonghappy\t162031\n故乡\t162032\n最美观\t162033\n南大街三楼\t162034\n樱桃红\t162035\n卵巢\t162036\n故书\t162037\n腿伤\t162038\n王姗姗\t162039\n惹我去\t162040\n丽格\t162041\n家国\t162042\n榨取\t162043\n购入\t162044\n李会灵\t162045\nuuuy6\t162046\n红豆大红豆芋头\t162047\n叶生正\t162048\n嗯阿狸\t162049\n七制香胕丸\t162050\n救嗎\t162051\n陪不住\t162052\n足迹\t162053\n超强\t162054\n谁上了我了谁上我了你\t162055\ngigkf\t162056\n与群戏\t162057\n羁\t162058\n提倡\t162059\n罕有\t162060\ndtygtru4542278rghtrodwhegfj467281tgicbjrhv\t162061\n艾特\t162062\n大我希望\t162063\n追打\t162064\n魏强伟\t162065\n清丽\t162066\n东土大唐\t162067\n吃荤\t162068\n我的父亲母亲\t162069\n惊水\t162070\n吃药\t162071\n500000\t162072\n阶级化\t162073\n善科FM\t162074\n六篇\t162075\nGmbvnBcbm\t162076\n鱼饼\t162077\n骑车\t162078\n聊聊天行不行\t162079\n08千米\t162080\n小丫叔\t162081\n鱼饵\t162082\n5月6日21时\t162083\n清东\t162084\n甜头\t162085\n再见悠\t162086\n马鑫垚\t162087\n提供\t162088\n万雄\t162089\n1252到2月八号\t162090\nQweryuiijhfd\t162091\n朋友了行\t162092\n李永琪\t162093\n虎研伟\t162094\n千万级\t162095\n78余四\t162096\n帐款\t162097\n去了解锁\t162098\nwjfj\t162099\n不如花\t162100\nddā\t162101\n旱魃\t162102\n米奈芭格拉斯\t162103\n51555553585\t162104\n金泽花园二期\t162105\n圆白菜\t162106\n妹子堡\t162107\n多龙\t162108\n十天十八九度\t162109\n够了啊\t162110\nq大冒险\t162111\n苛岚\t162112\n尿素豆芽\t162113\n绑匪\t162114\n好吧猜\t162115\n鲁康\t162116\n李密\t162117\n打真\t162118\n大理国际影会\t162119\n萌殇\t162120\n时期\t162121\n一流\t162122\n打看\t162123\n琳琅\t162124\n窝都\t162125\n大阪\t162126\n早上六点\t162127\nhuilui\t162128\n正确率\t162129\n膀胱瘤\t162130\n汇演\t162131\np4fz\t162132\n大队\t162133\n乜时u给春天6姑地产大亨徐哦哦3\t162134\n维克\t162135\n打眼\t162136\ngvcsaxgh\t162137\n沟壑壑\t162138\n时机\t162139\n哪蕾\t162140\n异色\t162141\n反义词\t162142\n一海\t162143\n我等你\t162144\n5555555555525555\t162145\n省体\t162146\nMMMBBB\t162147\n培淦\t162148\nustatila\t162149\n12分之1315分\t162150\n4点54分\t162151\ndoyuo\t162152\n多会儿\t162153\n烦好\t162154\n致敬\t162155\n套牌车\t162156\n尼玛眼\t162157\n奥瓦\t162158\n吴仁凤\t162159\n无双剑姫\t162160\n若干家\t162161\n丰富性\t162162\nname\t162163\n繁体字\t162164\n叶树源\t162165\nHil0L\t162166\nnamo
\t162167\n郑体雨\t162168\n大我三岁\t162169\n龙跃之\t162170\n南親\t162171\n软糖硬糖\t162172\n三分米\t162173\n在哪儿\t162174\n幽你一默\t162175\n孕率\t162176\n代依敏\t162177\n克扣\t162178\n十八块\t162179\n美\t162180\n2亿户\t162181\n俊珂\t162182\nplant\t162183\n琳琳\t162184\n憍尸罗\t162185\n克找\t162186\n太阳风\t162187\n周晋峰\t162188\n580分\t162189\n班墙\t162190\n女频\t162191\njcjcjcjc\t162192\n糟蹋\t162193\n不要再\t162194\n孙振振\t162195\n沈星雨\t162196\n周卓\t162197\nhellotmismith\t162198\n翻滚\t162199\n商业银行\t162200\n冷都男\t162201\n降雨\t162202\n告负\t162203\n考不及格\t162204\n陆凌鹏\t162205\ndeluashe\t162206\n7565444\t162207\n二二零\t162208\nzruxicy\t162209\n嫡亲\t162210\n从头到尾\t162211\nkhld\t162212\n爱保护野生猴头有我陪伴你我爱你\t162213\n癌细胞\t162214\nwidjr84\t162215\n第1个\t162216\n散泪\t162217\n睡藏\t162218\n告败\t162219\n屮卌\t162220\n璐林安\t162221\n罚单\t162222\n订购\t162223\n造谣\t162224\n大2b\t162225\n摇头\t162226\n天游山\t162227\n杨祖傲\t162228\n契约\t162229\n58岁\t162230\n胆子\t162231\nhello我的度秘我来了\t162232\n各为\t162233\n港小涯\t162234\n给我唱首五环之歌\t162235\n美联\t162236\nFlickr\t162237\n在在在在见见见见\t162238\n二二八\t162239\n钰嵋\t162240\n1JiO一\t162241\n不足道\t162242\n三门\t162243\nHumptydumpty\t162244\n高地战\t162245\n特伦特\t162246\n小面额\t162247\n虎哥哥\t162248\n你人\t162249\n留用\t162250\njfuggb\t162251\n云山雾罩\t162252\n份冷不冷\t162253\n够秘\t162254\n3月15号\t162255\n集团性\t162256\n仁王后\t162257\njdvnjdj\t162258\n没错位\t162259\n瞎菜\t162260\n8月14日\t162261\n懒你\t162262\n什么味\t162263\n卡牌井\t162264\nhdgy\t162265\nhdgx\t162266\n东郭\t162267\n爽爽\t162268\nhdgs\t162269\n什么呢\t162270\nlolork\t162271\n111341\t162272\n15269861529\t162273\nhdge\t162274\nhdgd\t162275\nhdgg\t162276\n素木\t162277\n顾客们\t162278\n录入\t162279\n李小璐\t162280\n赛季节\t162281\n骨血\t162282\n花花公园\t162283\n这两样\t162284\n天天飞车齐天大圣\t162285\n飞鸢\t162286\n飞鸿\t162287\n情为何物\t162288\n十五平方分米\t162289\n潋艳\t162290\n和和和和和气\t162291\n自己的照\t162292\n乖了睡\t162293\n丽珍三\t162294\nhttpfhiphotosbaiducomxiaodupicitem4a36acaf2ed\t162295\n要不好好说话\t162296\nT50\t162297\n2.4L\t162298\n崃门村\t162299\nM50\t162300\nhidjf\t162301\njuthp\t162302\n飞鸟\t162303\n天亮了\t162304\n极味\t162305\ncospler\t162306\n嗯四四\t162307\n切实\t162308\nDISNEY\t162309\
n恐怖之眼\t162310\n428428\t162311\n喔懂\t162312\nAirSupremacy\t162313\njgjhh\t162314\n二百五十八六八九八\t162315\nWestrice\t162316\n细长\t162317\nbift\t162318\n7珍\t162319\n船只\t162320\n脚掌\t162321\n第一天上午\t162322\n大阳人\t162323\nDidy\t162324\n男孩\t162325\ncifi\t162326\n愤中\t162327\nCITY\t162328\n不劳而获\t162329\n男字\t162330\n八路晗\t162331\n高大上\t162332\n大伙们\t162333\n人工授粉\t162334\n7班\t162335\n不耻下问\t162336\n865887864255897868\t162337\n要华\t162338\n牛栏\t162339\n2139799114\t162340\n一一片\t162341\n顯示\t162342\n林早贵\t162343\n李铁柱\t162344\n麻烦乃们\t162345\nvhdffuff\t162346\n清华调档分理科\t162347\n拳霸\t162348\n68000发话\t162349\n大智慧超赢\t162350\n口出不逊\t162351\n大天保级大战是小马宝莉那你跟第一版吧很调皮天天\t162352\n自控\t162353\n亲alalatal\t162354\nxiumin\t162355\n晚上月\t162356\n柳花明\t162357\n思琪\t162358\nRRRTEE\t162359\n安顺\t162360\nitf\t162361\n安顿\t162362\n信楼\t162363\nwifi人\t162364\n解词\t162365\n赵承绶\t162366\n湖南卫视\t162367\n东北妞\t162368\n联动力\t162369\n红色瘤\t162370\n杜仲汤\t162371\n榜上有名\t162372\n滑党\t162373\n分区\t162374\n沃尔\t162375\nUC么\t162376\n祁阳\t162377\n重旱\t162378\n韩卓希\t162379\n司冉冉\t162380\n要不告\t162381\n稀溜\t162382\nCCTV9\t162383\nCCTV8\t162384\n小鬼子\t162385\netoo\t162386\nCCTV5\t162387\nCCTV4\t162388\nCCTV6\t162389\nCCTV1\t162390\n306620560\t162391\n舒舒服\t162392\n解读\t162393\n分化\t162394\n解说\t162395\n林妹妹\t162396\n499\t162397\nIegvf\t162398\n李春雨\t162399\n代装\t162400\npufei\t162401\n鸟山明\t162402\n事考试网\t162403\n13301356\t162404\n33333个\t162405\n电台\t162406\n拆台\t162407\n巴赫\t162408\n呀第\t162409\n半首歌\t162410\n薛晨\t162411\n心平气和\t162412\n公立\t162413\n咯荣\t162414\nShopping\t162415\n有光泽\t162416\n侧柏\t162417\n出租车司机\t162418\n唯\t162419\n吾哲\t162420\n林徽因\t162421\n点唱机\t162422\n白塔云\t162423\n潇潇午夜送寒声江上秋风动客情指有儿童挑促织夜深篱落一灯明\t162424\n赌博\t162425\n次第体悟\t162426\n尼连禅河畔\t162427\n吴兆岩\t162428\n布施\t162429\nYoutube\t162430\n唐\t162431\n暴歉\t162432\nFTISLAND#KBS2\t162433\n3110.9万台\t162434\n大方方点点滴滴\t162435\n普鲁士蓝\t162436\n锦伦\t162437\n毛腿\t162438\n啫喱\t162439\n程升桃\t162440\n小陀螺\t162441\n美因茨\t162442\n秋天风儿\t162443\n马上屏\t162444\n唔\t162445\nIPHONE6c\t162446\n笑度\t162447\n鬼鬼祟祟\t162448\n老子真心不想和你处\t162449\n西夏食\t162450\n下一
个星期\t162451\n恩聪明\t162452\n呼叫明\t162453\nIPHONE6s\t162454\n电子舞曲\t162455\n洪佳鑫\t162456\n量化宽松\t162457\n暴死\t162458\n一二年\t162459\n邹俊一\t162460\nduty\t162461\n姓秘\t162462\nstometuto\t162463\n隆中对\t162464\n18191\t162465\n60\t162466\n61\t162467\ntgjjtjjjmjtppttjjggpggdmttpttjtjjjgpjjtgppjlastupgjjjapttpjttjttjmgttgpg\t162468\n63\t162469\n64\t162470\n65\t162471\nstmashiodostootstratescourst\t162472\n67\t162473\n68\t162474\n69\t162475\n后堡\t162476\n废话废话废话废话废话废话废话废\t162477\n多咪咪米七\t162478\n北舞\t162479\n良化站\t162480\n真够用\t162481\n夏兴霞\t162482\n忙乎\t162483\n本·拉丹\t162484\n悼比克整\t162485\nBisson\t162486\n马里兰\t162487\n1.076亿部\t162488\n你以为\t162489\ndveyvg\t162490\n流脓\t162491\n阿婧\t162492\n退伍\t162493\n呢秘\t162494\n6b\t162495\n一只六天点只\t162496\n关毅\t162497\n13723724767\t162498\n有了钱\t162499\n黑笼\t162500\n6k\t162501\n大红花\t162502\n缕\t162503\n6p\t162504\n6q\t162505\n6s\t162506\n6u\t162507\n团影\t162508\n6x\t162509\n陈瑞\t162510\n退休\t162511\n才安逸米\t162512\n周子翔\t162513\n新光天地GUCCI\t162514\n屌秘度\t162515\n黑笔\t162516\n借此机会\t162517\n柠檬味\t162518\nDgurfj\t162519\n谢谢你了我爱\t162520\n三三得九三四十二三五十五三六十八三七二十一三八二十四三九二十七\t162521\n滚开我不想\t162522\n饶舌\t162523\n6P\t162524\ntyyt\t162525\n整晚\t162526\n233545\t162527\n能处\t162528\n网费\t162529\n1899999999999922222222\t162530\n盒德芙\t162531\n蟹柳\t162532\nCUTE\t162533\n一个我找的几个好梦成真的歌\t162534\n平生福\t162535\n艳体\t162536\n署名\t162537\n吝恬怡\t162538\n高瑞娟\t162539\n四勺\t162540\n他们的眼\t162541\nabcdel\t162542\n两色\t162543\n嗯对阿\t162544\n想不到你了你过咋样你的\t162545\nhtmI\t162546\n算了不你说\t162547\nabcdef\t162548\n明天见\t162549\n藤县\t162550\n12：30\t162551\n六漏图图徒儿图绝林林\t162552\ntuzsygvde\t162553\n猫咪你给我你给我\t162554\n洛s\t162555\n贾璐\t162556\nsomuch\t162557\n人美\t162558\n横店\t162559\nhtml\t162560\n洛杉矶\t162561\n来草\t162562\n争风\t162563\n表里\t162564\n8281293\t162565\n贸易\t162566\n聚剧\t162567\n想悲\t162568\n吴一奕杭\t162569\nabcde5\t162570\n你好死板超级死板\t162571\nM记＝\t162572\n爱服\t162573\n习作六\t162574\nabcde2\t162575\n贺韦祥\t162576\nvv10\t162577\n松水\t162578\n亲我吧\t162579\n1歼16\t162580\n太上皇帝\t162581\nOPPPO\t162582\n放学星期五\t162583\n1255555522574\t162584\nA股\t162585\n早
晨六点半\t162586\n208251826\t162587\n两千元\t162588\n遇难者\t162589\n未能为\t162590\n钱晨\t162591\n蜀山\t162592\n澳方\t162593\n张滋曼\t162594\n腾讯视视频\t162595\n呼之欲出\t162596\n澳新\t162597\n总帐\t162598\nE+\t162599\n别说好\t162600\n5000余\t162601\n打出\t162602\n打击\t162603\n罗天王\t162604\n发射器\t162605\n李琪\t162606\n一家子\t162607\n铠甲勇士之捕\t162608\n小可爱我是你的姐姐\t162609\npkpk\t162610\n好东东\t162611\n罗本\t162612\n翠儿\t162613\n12600台\t162614\n李琼\t162615\n李琰\t162616\n随便问问\t162617\n百草\t162618\n时来运转\t162619\n草狗\t162620\n闹钟铃\t162621\n许级\t162622\n324386\t162623\n他们的歌\t162624\n这麽片\t162625\n零伍二二\t162626\n猪呢死样\t162627\n熟路\t162628\n赵逸霖\t162629\n150分\t162630\n东格里格啷个度秘度秘\t162631\nnizhetouzhu\t162632\n乱动\t162633\n三四角\t162634\n給\t162635\n琵琶\t162636\n永远永远永\t162637\n我喜欢你我爱你\t162638\n福地\t162639\n支排球\t162640\n脆皮\t162641\n青厂街\t162642\n话楼\t162643\n我喜欢海\t162644\n孤胆\t162645\n视野网\t162646\n124吨\t162647\n全境\t162648\n佩克\t162649\nchiphotosbaiducomxiaodupicitem8cb1cb1349540923571e460d9558d109b3de4949jpg\t162650\n剪刀布良\t162651\n零二零七零八\t162652\n那不我\t162653\n秦晖\t162654\n十四本\t162655\n弃权\t162656\n天都\t162657\n纳姆\t162658\n5491379434657554546464954\t162659\n显形\t162660\n黄志强\t162661\n快娃\t162662\n檀溪湾\t162663\n王婉琪\t162664\n80049\t162665\n3个半\t162666\n求你了你给我\t162667\n走上单\t162668\nesgj\t162669\n说书柜\t162670\n龙泽\t162671\n佩兰\t162672\n阿尔及利亚\t162673\n王老虎\t162674\n加急安全卡\t162675\n布温科温斯克医院\t162676\n黄药\t162677\n可救\t162678\n张一新\t162679\n慧慧\t162680\n秋日\t162681\n嬲肌\t162682\n熊大战\t162683\n错啦错啦\t162684\n帝国主义\t162685\n郭琦\t162686\n再交\t162687\n熊大我\t162688\n碎米\t162689\n真实的人\t162690\n纤维\t162691\ncigjx\t162692\n一重\t162693\n贾昊儒\t162694\n心房\t162695\n只样\t162696\n可数\t162697\n7370\t162698\n21股\t162699\n纤细\t162700\n刘孔伟\t162701\n心理作用\t162702\n散伙\t162703\n可敬\t162704\nzjgjs\t162705\n张一文\t162706\n赛跑\t162707\n布妞\t162708\n赵参\t162709\n好我等你\t162710\n我才不爱你了你教坏我你个大坏蛋\t162711\n家千\t162712\ngard\t162713\n晚11点\t162714\n网商\t162715\n陪着你\t162716\nggdgjfghgg\t162717\n等秘秘密早上所以荣\t162718\n0625\t162719\n乡建\t162720\n不爱了\t162721\njknn\t162722\n气垫\t162723\ndirhaosj\t162724\n杨有\t162725\n杨月\t162726\nuufuuppuvtttx\t
162727\n磨料\t162728\n技点\t162729\n蒙高\t162730\n饭馆\t162731\n贵博\t162732\n6666岁\t162733\n剑桥\t162734\n解决好\t162735\n造出\t162736\n鸳鸯眼\t162737\n杨木\t162738\n单方\t162739\n搅\t162740\n轮不灵\t162741\n寝不安席\t162742\n622188711000892151\t162743\n匹号\t162744\n零封\t162745\n小度密\t162746\n123658\t162747\n出路费\t162748\n全露\t162749\nhjjjk\t162750\n滨海城市\t162751\nApsc\t162752\n两针\t162753\n那我爱你爱你爱你爱你爱你爱你爱你\t162754\n1024\t162755\n5346\t162756\n那你就是我的么我是你姐你是我的咱们俩\t162757\n好呀你给我钱\t162758\n一乙数\t162759\n1023\t162760\n吃了吧\t162761\n李蛮\t162762\n机动车道\t162763\n145一五百四十三\t162764\n1028\t162765\n小鸟依人型\t162766\n正念的奇迹\t162767\n继续儿\t162768\n受得了\t162769\n诺伊\t162770\n怪会\t162771\n画楼\t162772\n推歌\t162773\n叶云涵\t162774\nchcyfuvh\t162775\n十八集\t162776\n吃饱了再见\t162777\n我不我不认识\t162778\nnicetomeetto\t162779\n计划单列市\t162780\n辽a00195\t162781\n公堂\t162782\n戴帽\t162783\n频点\t162784\n冰冰妞\t162785\n银色\t162786\n郭文贵\t162787\njwyf\t162788\n王地瓜\t162789\n风见\t162790\n浙江省\t162791\n畜牲\t162792\n士豚\t162793\n大英雄气\t162794\n哎呦哎呦\t162795\nHDJ\t162796\n黑多夠\t162797\n我很爱\t162798\n昏倒\t162799\n潘思任\t162800\n良心狗肺\t162801\ntfvbmih\t162802\n换届纪律\t162803\n腮帮子\t162804\n04444444\t162805\n吴艺\t162806\n酒红色\t162807\n哈起\t162808\n哈走\t162809\n靠不要\t162810\n新生儿\t162811\n少年三国志礼\t162812\n唔知果\t162813\n575766\t162814\n入宫\t162815\n7点83点\t162816\n呱呱呱呱呱呱呱呱呱\t162817\n超级棒六声唔信\t162818\n男焕\t162819\n保养学\t162820\n古铜\t162821\n坎坎坷坷坎坎坷坷坎坎坷坷\t162822\n委托\t162823\n乐飞\t162824\n纤手\t162825\n紫狼\t162826\n哈赞\t162827\n3150506602\t162828\n12周年\t162829\nduygh\t162830\n什么多谢\t162831\n关达\t162832\n偷懒\t162833\n郑新宇\t162834\n吊死\t162835\n18786317884\t162836\n饭盒\t162837\nzgnN\t162838\n网页\t162839\nsmoking\t162840\n坏人\t162841\n炼狱\t162842\nptp\t162843\n潮汐\t162844\n噜拉噜\t162845\n坏了\t162846\n潮汕\t162847\n坏事\t162848\n出口群\t162849\n帶動\t162850\n嗯比\t162851\nxxxxzz\t162852\n竣工\t162853\n好多星\t162854\n周擂主\t162855\n史泰龙\t162856\n劲号\t162857\n204cm\t162858\nJizif\t162859\n云龙猫\t162860\n奶奶奶\t162861\n葱茏\t162862\n施光南\t162863\n西安#\t162864\n第九章\t162865\n遥望\t162866\n96498\t162867\n增减\t162868\n5千元\t162869\n小冰冰\t162870\n86530\t162871\n亲爱的晚上
说电视\t162872\npt1\t162873\n木片\t162874\n左志英\t162875\n铅笔盒\t162876\n虹桥\t162877\n不给力\t162878\nk殿\t162879\n夸别\t162880\n做该\t162881\ncarrepair\t162882\nithankyou\t162883\n十分之一多十六点\t162884\n分着\t162885\n唐三藏\t162886\n马生\t162887\n大理石\t162888\n您好呃内\t162889\n老千信\t162890\n本性难移\t162891\n射线\t162892\nZZZ7\t162893\n夹缝\t162894\nCNBlue\t162895\n通许县实验小学\t162896\n明天下午\t162897\n八位\t162898\n九十三\t162899\n234569996531369965\t162900\n富都\t162901\n不二吧\t162902\n脸皮儿\t162903\n九七七零\t162904\n清点\t162905\n可想过\t162906\n辰欣\t162907\n0585724523633000000\t162908\n琚中毅\t162909\n邓金秋\t162910\n几年前\t162911\ntfboysboss\t162912\n一千万一千万一斤\t162913\n嗯叶罗丽精灵梦四\t162914\n清炖\t162915\n度面料\t162916\n稿费\t162917\nggbbh\t162918\n胡逸伦\t162919\n张五常\t162920\n不别别别\t162921\n脱衣服\t162922\nXfdsnGtljvcbdhrztxkhyxkkyztkzdhTtxkfzntkz\t162923\n跌打\t162924\n压纹\t162925\n查沙\t162926\nchiphotosbaiducomxiaodupicitemd53f8794a4c27d1e98428fb01cd5ad6eddc43859jpg\t162927\n比蒙\t162928\n安全帽\t162929\n睡行\t162930\n安全带\t162931\n陶建忠\t162932\n鹅城\t162933\n5点半\t162934\n建筑们\t162935\n咋舌\t162936\n闹爱\t162937\n刘老\t162938\n戴姆勒\t162939\n谭晶\t162940\n小特\t162941\n发立\t162942\n周翀\t162943\n型俊优\t162944\n一大批\t162945\n3027699705\t162946\n翻版\t162947\n25871227866258\t162948\n噗个\t162949\n每日问讯者报\t162950\nbingbang\t162951\n睡衣\t162952\n禁域\t162953\n197512\t162954\nrks\t162955\nTos\t162956\nTop\t162957\n一大m\t162958\n哪首歌\t162959\n聂卫平\t162960\n贵错\t162961\nrkg\t162962\n金元\t162963\nToo\t162964\n棒棒棒棒棒棒打你把你棒棒棒\t162965\nTom\t162966\n笑脸\t162967\n鼠司\t162968\n金光\t162969\nrkk\t162970\n大姑娘\t162971\n任志贤\t162972\n王刚讲故事-监控录像里的罪恶\t162973\n双唇\t162974\n方亚芬\t162975\n宠物你是我的宠物记住我是你的主人\t162976\n时尚恰恰\t162977\n吧爱丽丝\t162978\n拍东\t162979\n张浩楠\t162980\nJhehhhshsyyeyduuexiehiegudugidfigedgegixide\t162981\n二六八八\t162982\n说呀老了我是你\t162983\n工作流\t162984\n金全\t162985\n慕言CC\t162986\n麦芽精\t162987\n培养成\t162988\n张小姐\t162989\n12345555555555\t162990\n主秘\t162991\n工行\t162992\n36期\t162993\n恩皮\t162994\ncostpaytakespend\t162995\n光日化\t162996\n早安喵喵喵午安喵晚安晚安\t162997\n定型男人民\t162998\nEO\t162999\n海鸣\t163000\n海鸥\t163001\n罗超\t163002\n北斗星星总世界之最\
t163003\n甲苯\t163004\n马田\t163005\n李韩烈\t163006\n260000\t163007\n云南森林防火指挥部\t163008\nNAJSUDJ\t163009\n意气\t163010\n奥特曼追梦网\t163011\n跟我走\t163012\n行惠丰\t163013\n肖腾飞\t163014\n小学时\t163015\n就是说一亮\t163016\nv\t163017\nTTTTTTT\t163018\nsggdhgcfd\t163019\n胡鹿篇\t163020\n刀锋奴\t163021\n店小二\t163022\n493元\t163023\n教育改革\t163024\n三饭\t163025\n永济市\t163026\n哎兹\t163027\n你好东咪你在哪呢\t163028\n女地\t163029\n4358353984224793387412735397469756\t163030\n人人自危\t163031\n一休哥\t163032\nBCD金婚哈\t163033\n唉乃今\t163034\n摆胜\t163035\n呢好开心\t163036\n哎克\t163037\n刻服\t163038\n同理\t163039\n亲爱的拜拜\t163040\n副镇长\t163041\n灵气\t163042\nJAOJQA\t163043\n随时随地\t163044\n逃往\t163045\n国漫\t163046\nBsd\t163047\n天花\t163048\n依赖\t163049\n恩瞌\t163050\n恩恩嗯嗯DNFo\t163051\nBsb\t163052\ngoav\t163053\n周二中午\t163054\n耍宝\t163055\n度库\t163056\n祝连配\t163057\nQQ塞特码\t163058\n就像你现在这样\t163059\n纸巾盒\t163060\n永城\t163061\n1名\t163062\n30萧\t163063\n桂沈斌\t163064\n津门狗\t163065\njydv\t163066\n皮真厚\t163067\n李靖\t163068\n李静\t163069\n人民大会堂\t163070\n光之公主公主\t163071\n朱佳馨\t163072\n对着我\t163073\n度度\t163074\n42瓶\t163075\n洛佳\t163076\n迁思乱想\t163077\nbdhv\t163078\n二三个\t163079\ndyybj\t163080\n二十三三个\t163081\n大圣真帅不是你最\t163082\n泸虎\t163083\n亲亲亲亲亲\t163084\n好想你亲一个\t163085\n贝火\t163086\n青神撮\t163087\n幺零零幺零\t163088\nNikki\t163089\n医师王的盛宴小苹果九乡天边的相机我玖瑰生活\t163090\naj5a9c\t163091\n黑nglebaby\t163092\n你的偶像\t163093\n狠舍得\t163094\n自由民\t163095\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪就是你了你就是\t163096\n脐贴\t163097\nab2杯\t163098\n找你你会不会\t163099\n九五年\t163100\n50000\t163101\n50001\t163102\n拿拉美\t163103\n护栏\t163104\n194293840506874\t163105\n对说呀你\t163106\n12张\t163107\n叶桐\t163108\n英格里希\t163109\n呃一亮\t163110\n傻痺\t163111\n跟江\t163112\n既然\t163113\n喝酒吧撒\t163114\nbobobobobobobobo\t163115\n敖辰\t163116\n姜姐\t163117\n五个p一个\t163118\n吃席\t163119\n小妹妹\t163120\n61445\t163121\n123123456789\t163122\n里呱\t163123\n我真的不和你好了没和你\t163124\n平罗翠梅\t163125\n专程\t163126\n台破\t163127\n刁毛\t163128\n机器人样\t163129\n蠢驢\t163130\nHxhdis\t163131\n后援\t163132\n寿光\t163133\n划一竖\t163134\n3841\t163135\n小和尚头\t163136\n衣行\t163137\n里菜\t163138\n小沿\t163139\n永远\t163140\n白乳液\t163141\ndjbsjaicudbsksi
vjsix\t163142\n不是你给我办不行\t163143\n李文娟\t163144\nHowoldareyou\t163145\n声色NBA\t163146\n快火\t163147\n4433\t163148\n蜡笔画\t163149\n吴家凡\t163150\n千万富翁\t163151\n2016跨年\t163152\n13427833895\t163153\n绝不知乐\t163154\n国旗们\t163155\n不公不母\t163156\n荷花景\t163157\nDISCOVERY\t163158\n于艳萍\t163159\n东瀛\t163160\nAttur\t163161\n174174174\t163162\n益智类\t163163\n一分钟后\t163164\n一赶快\t163165\n似是\t163166\n冲刺卷\t163167\n陈赫\t163168\n520886\t163169\n林大牛\t163170\n委屈会\t163171\n钱度秘\t163172\n汉和平可夫猜测歼20\t163173\n今天中午13:10\t163174\n就职\t163175\n越界\t163176\n尸城\t163177\n行秘\t163178\n20869793\t163179\n13965733\t163180\n肉啵\t163181\n搞基药\t163182\n新锐阅读\t163183\n82场\t163184\n选戴上\t163185\n荫间\t163186\n观沧海\t163187\n张佳雨\t163188\n度秘你好话会\t163189\nwearefamily\t163190\n52岁\t163191\n5月23\t163192\n黄巢\t163193\n雾社\t163194\noyddgxihdogxoeifjdhdhfgdhdhcbfjdhdkfftttttttttfffffffhhhhjjjjjj\t163195\n亲个嘴\t163196\n富贵荣华好家园\t163197\n寓于\t163198\n泥金\t163199\n主场\t163200\n抱有\t163201\n看见了别再说\t163202\n顾怡文\t163203\n一会二\t163204\n北边\t163205\n怀囤\t163206\nnarenthere\t163207\n就是就是你看看多帅\t163208\n北辰\t163209\n事出有\t163210\n魔蝎座\t163211\nuuugd\t163212\nAbdi\t163213\n扎扎扎\t163214\n黄州\t163215\n缪蒙蒙\t163216\n帅哥呀我是你的女人\t163217\n咯米\t163218\n老坛酸菜鱼\t163219\nbaku\t163220\nzSB\t163221\n当下\t163222\n当上\t163223\n咯类\t163224\n箱包\t163225\n冰美\t163226\n白子画压花千骨\t163227\n7535787868880800080888\t163228\n2颗\t163229\n555555555555555555555555555555555555555555555555555555555555555\t163230\n到底是谁是\t163231\n李再拍\t163232\n当中\t163233\n上星期一\t163234\n29996605584422828\t163235\n清华工字厅\t163236\ngggxsdgil\t163237\n呢藏\t163238\n寝食\t163239\n瓶装\t163240\n听酒干倘卖无\t163241\n陈雯雯\t163242\ngjggfhh\t163243\n夹道飞车五\t163244\n旅社\t163245\n秘明星\t163246\nKeywords\t163247\n一个亿点儿\t163248\n毽国\t163249\n冰羽\t163250\n春绿秋黄岁岁枯\t163251\n对呀没把数学\t163252\n韩玥\t163253\n几只手\t163254\njsux\t163255\n1837年\t163256\n来跟\t163257\n化学武器\t163258\n美容啊哥们\t163259\njsua\t163260\n我爱的和爱我的所有人\t163261\n六七分钟\t163262\n六百公斤\t163263\n小珍\t163264\n第二位\t163265\n烘培\t163266\n人工度秘\t163267\n赣鲵\t163268\nhhnonomishin\t163269\n无情人\t163270\nyyuh\t163271\n段黏\t163272\n
三天两\t163273\n秘马\t163274\nculiant\t163275\n实践\t163276\n秘驶\t163277\n41453\t163278\nyyuy\t163279\n物汽油\t163280\n点零八\t163281\n来路\t163282\n不发坏\t163283\n李蛋\t163284\n威海\t163285\n斯内阁\t163286\n欺骗\t163287\n9610007185978\t163288\njtjmwm\t163289\n29ha\t163290\n文里\t163291\n95后\t163292\n455584245544450gmmm\t163293\n很难听\t163294\n传教\t163295\n干嘛你现在在干嘛你现在干嘛你\t163296\n亭台\t163297\n日本气象厅\t163298\n安萨尔\t163299\n不等你\t163300\n朱字\t163301\n哎呦喂超人的请我可以\t163302\n上一日\t163303\n杠杠杠杠杠杠杠杠杠杠杠\t163304\n2151555\t163305\n完成\t163306\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟兔兔图图\t163307\n狗顿饭\t163308\n中原工学\t163309\n挂科\t163310\nioibvdfdznnmgfezcyifwwyuioffschjutfch\t163311\n钢木\t163312\ncdd\t163313\ncde\t163314\ncdf\t163315\n到秘\t163316\n贱我问你近\t163317\n厚恶心\t163318\ncdc\t163319\nmcl\t163320\n对乱\t163321\ncdh\t163322\n声生化六大家\t163323\ncdt\t163324\ncdv\t163325\ncdw\t163326\ncdr\t163327\ncds\t163328\n一桌六\t163329\n后人\t163330\ncdx\t163331\ncdz\t163332\n再见了再见了再见了再见\t163333\n马达接触器\t163334\n和讯\t163335\nhihi网\t163336\n难得糊涂\t163337\n腥膻\t163338\n后事\t163339\n租俾人死的早啊你是猪我是人\t163340\n三网融合\t163341\n乘虚而来\t163342\n饮血\t163343\n对么\t163344\n护发\t163345\n一座座\t163346\n一日一\t163347\n双鱼座\t163348\n顶好\t163349\n国务委员\t163350\n顾奈奈\t163351\n立方厘米\t163352\n呸太\t163353\n跑龙华\t163354\n夏洛蒂\t163355\n值班\t163356\n停车位\t163357\nSHs\t163358\n亚小嘟咪\t163359\n足音\t163360\n帅巫\t163361\n脑废话\t163362\njeje\t163363\n山西\t163364\n苏家坨\t163365\n31程\t163366\n修志慧\t163367\n恶心死人\t163368\n问问铁打\t163369\n逝水\t163370\n杏来\t163371\n琚珉\t163372\n刘玉琼\t163373\n肝移植\t163374\n7374647\t163375\n说说的话\t163376\n以东\t163377\n版本\t163378\n钱灵\t163379\n以上\t163380\n以下\t163381\n来懂\t163382\n杨应宇\t163383\n生于\t163384\n长存\t163385\n突突突突突突\t163386\nghkl\t163387\n长子\t163388\n以为\t163389\n生人\t163390\n以丹\t163391\n邱昕\t163392\n罪魁\t163393\n续篇\t163394\n吊起\t163395\n通胀\t163396\n潘文龙\t163397\n妹妹\t163398\n杰齣發點\t163399\n艾一\t163400\n八十八千六\t163401\nmybrotherputabagofrice\t163402\n哦妮新默默秘你好\t163403\n土豆烧茄子\t163404\n傻跑\t163405\n生产\t163406\n五十三颗\t163407\n刀郎周\t163408\n刘锐扬\t163409\n杨紫艳\t163410\n重量级\t163411\n一个11\t163412\n花伦\t163413\n三更鸡\t163414\n忘
不了你\t163415\n哪吒\t163416\n朝鲜国营\t163417\n李冰硕\t163418\n淫荡\t163419\n伊娃\t163420\n首流行点\t163421\n一濑红莲\t163422\nipad1代16G\t163423\n李婉如\t163424\n香港医管局\t163425\n贴身游戏\t163426\n傅红\t163427\n还乖乖乖乖乖\t163428\n花会\t163429\n审核群\t163430\n花伞\t163431\n陆吉琼\t163432\n一春来\t163433\n90评分\t163434\njavD\t163435\n红军人读书\t163436\n熊大熊二动画片吧999大学\t163437\n关晓彤\t163438\n变数\t163439\n蓦文\t163440\n楼树\t163441\n五十厘米\t163442\n报纸\t163443\n一少一点儿\t163444\n头猪哈\t163445\n种群\t163446\nvcvbcc\t163447\n合唱\t163448\n伴君\t163449\nasdf\t163450\n静音真\t163451\ncjydx\t163452\njava\t163453\ncbbk\t163454\nLeLN\t163455\ncbbd\t163456\n顶替\t163457\n失去爱\t163458\n戴萌\t163459\n特别棒给你一个字\t163460\n一只骨\t163461\n董大叔\t163462\n异度空间\t163463\n守岁\t163464\n变故\t163465\n蒙面\t163466\nhsfuf\t163467\n梅\t163468\n张经理\t163469\n布吉岛\t163470\n损人\t163471\n安晓燕\t163472\n秦心玉\t163473\n三六一度\t163474\n182888888888\t163475\n古香古色\t163476\n备足\t163477\n你好萌\t163478\n爸爸的爱\t163479\n看一起\t163480\n575435383\t163481\n正月初\t163482\n漂洋过海\t163483\n牛B姐肿么分\t163484\n大公无私\t163485\n非警务\t163486\n拉登先\t163487\n生动\t163488\ntrtfrostin\t163489\n艺术照\t163490\n摩根\t163491\n谋略\t163492\n聊天版\t163493\n邢振\t163494\n补心\t163495\n建筑师\t163496\n哗然\t163497\n结婚纱\t163498\n13284235588\t163499\n了到\t163500\n涉及\t163501\nyhhfj\t163502\nqaw\t163503\nqaq\t163504\n四十块\t163505\nmyorder\t163506\n托管商\t163507\n88susnjshksn\t163508\n生力\t163509\n钢铁你好好些了么plasoppo\t163510\n┌┌\t163511\n沈迷\t163512\nqah\t163513\n豆腐丝\t163514\n弱女子\t163515\n乖快\t163516\n去掉\t163517\n徐回复\t163518\n恩有你是女\t163519\n什代\t163520\n早早点\t163521\n明魔法\t163522\ngfffffvv\t163523\n性本善\t163524\n再见了真讨厌\t163525\n铃兰花\t163526\nyouyo\t163527\n沙吉卡皮\t163528\n883333\t163529\n车首\t163530\n哪鬼\t163531\n古田雪照山城\t163532\n什什\t163533\n崔永元\t163534\n象变\t163535\n吉利尹\t163536\n广汇\t163537\n车距\t163538\n你好美嘻嘻有你好\t163539\n262米\t163540\n837rrhfyfheeu3uheeyf\t163541\nhgw\t163542\ngnngr\t163543\n婴儿护理宝典\t163544\n1788\t163545\n小盾\t163546\nidvdvnj\t163547\n受够你了我不想\t163548\nHugme\t163549\n非常安静\t163550\n二娃\t163551\n金赛纶\t163552\n只有一个度秘\t163553\n细菌战\t163554\n二娘\t163555\nOMG，J汇颖_Elynn\t163556\n我真的好想养你\t16
3557\n处寂\t163558\n钳子\t163559\n真的好难看好难看我没骗你\t163560\n湘乡\t163561\n嗯晚\t163562\n71.7%\t163563\n但凡\t163564\n二娥\t163565\n美家\t163566\n体貌\t163567\nGjcjfygxgjkjggghxhxbfjdgv\t163568\nAlikaggaaj\t163569\n北京京东宾馆\t163570\n植物人\t163571\n神神\t163572\nJrcibd\t163573\n一边\t163574\n怎樣\t163575\nhgh\t163576\n一辰\t163577\n意见笑话\t163578\n露身\t163579\n三刺\t163580\nChrisswitched\t163581\nBigBang\t163582\nhgg\t163583\n123458967\t163584\n铭仁园\t163585\n朱元思书\t163586\n侧向\t163587\n平来\t163588\n一辑\t163589\n6016\t163590\n水陆\t163591\n三则\t163592\n三刚\t163593\n巴曼\t163594\n一辈\t163595\n汏汰\t163596\n三分\t163597\n三刀\t163598\n胜利街\t163599\n路虎揽胜\t163600\n一较\t163601\n私下\t163602\n重命名\t163603\n一辆\t163604\n平板\t163605\nNANA\t163606\n秦始\t163607\n李圣杰\t163608\n铺垫\t163609\n小可爱的话\t163610\nfhhgx\t163611\n5554852\t163612\n368528\t163613\n辉鞋\t163614\n霍华德\t163615\n秦姐\t163616\nHAPPY\t163617\n封测\t163618\n狗不在\t163619\n那吗\t163620\n刺客\t163621\nhi佳佳\t163622\n水淹没\t163623\n师傅们\t163624\nulian\t163625\n猫和老鼠之迷失之龙\t163626\n开状\t163627\n狗兔\t163628\n大埔\t163629\n那后\t163630\n老笑\t163631\n等一等先\t163632\n逮住\t163633\n安仔Simmy\t163634\n奥飞动\t163635\n可爱的度\t163636\n橡胶板\t163637\n九分之2x6分之130分之十九\t163638\n匹马\t163639\n赵若汐\t163640\n退缩\t163641\n窄幅\t163642\nxsj\t163643\n阿布拉莫维奇\t163644\n测验\t163645\n1436年\t163646\n樊建川\t163647\n紧急\t163648\n乐视\t163649\n招人耳目\t163650\n乐观\t163651\n抚养权\t163652\n乐见\t163653\n哪家男\t163654\n白血病\t163655\nyttdfbj\t163656\n恶狠狠\t163657\n容若\t163658\n回來後\t163659\n高智商\t163660\n250岁\t163661\nNhsng\t163662\n亲爱的\t163663\n刽子手\t163664\n不识相\t163665\n北游\t163666\nxcggf\t163667\n李天一\t163668\n量力\t163669\nggggx\t163670\n呱呱唧唧唧唧唧唧唧唧呱呱唧嘎唧嘎呱呱呱唧唧\t163671\n火连果\t163672\n春暖都\t163673\n速度与激情5\t163674\n秘密群\t163675\n99000000\t163676\n一百流\t163677\nuwojhgsomq\t163678\n回不来了\t163679\nggggh\t163680\nschxdfcftff\t163681\n偶不\t163682\n龙梦柔\t163683\n于太\t163684\n天天当叮叮当\t163685\n都是你说的我还白子画\t163686\n四张照片\t163687\nggggg\t163688\nhellomystome\t163689\n振宇\t163690\n博艾斯\t163691\n白班\t163692\n京哈高速\t163693\n华字形\t163694\n假狗\t163695\ntfhg\t163696\n偶个\t163697\n乍一看好丑\t163698\n二皮脸\t163699\n何志婷\t1637
00\npabo\t163701\nlldo\t163702\n保安公司\t163703\nhellomyamasouh\t163704\n母亲花\t163705\n废话废话废话\t163706\n中职\t163707\n李正威\t163708\n终究\t163709\n上虞\t163710\n大犯\t163711\n蛋炒饭fcfjfjccucu\t163712\n冰郎\t163713\n邱会\t163714\n三秒钟\t163715\n七雄争霸\t163716\n李浩特\t163717\n胡建清\t163718\n秘密秘密你在\t163719\n猕猴桃\t163720\n刘很美\t163721\nrQ0\t163722\n阿房女主题曲\t163723\n未日\t163724\n徒手瓣\t163725\n高秘\t163726\nf普通话\t163727\n小伙子们\t163728\n一般人\t163729\nahq\t163730\n高科\t163731\n我喜欢你罢了我是你的\t163732\n胡比\t163733\n赵总结\t163734\n猪吧\t163735\n领取\t163736\n4x12点五带咯山城二十早点\t163737\n香港场\t163738\n四十笔\t163739\n你不爱我对么那你\t163740\nimean\t163741\n没法哥哥妹妹过\t163742\n伊思猪\t163743\n十四座\t163744\n一等一会儿\t163745\n记忆你的生命\t163746\n真命\t163747\n对号入座\t163748\n艾菲\t163749\n秦胖胖\t163750\n不足够\t163751\n嗯哒哒\t163752\n大不了\t163753\n星轨\t163754\n铁哈喽麦克嗨谷\t163755\n五虎\t163756\n领口\t163757\n天生丽质\t163758\n闻隹华\t163759\n号准\t163760\ngxux\t163761\n有恩\t163762\n讲章\t163763\n发毛\t163764\n额定\t163765\nhelping\t163766\n电筒\t163767\n淋漓\t163768\n撒比撒\t163769\nELLE\t163770\n几百张\t163771\n太太你了太塞丽娅\t163772\n酒缸\t163773\n吐口\t163774\n奉劝\t163775\ngivyugiu\t163776\n萎疯\t163777\nvvhzs\t163778\n有恋\t163779\n耿耿\t163780\n6000亿元\t163781\nvfhgfb\t163782\n冫\t163783\nhaaan\t163784\n小杜杜\t163785\n嗯ok\t163786\n岔密\t163787\n跡部\t163788\nwrestle\t163789\n写真\t163790\n艾瑞巴蒂\t163791\n流行词\t163792\n小于\t163793\n十一米多\t163794\n张望\t163795\n010\t163796\n011\t163797\n012\t163798\n013\t163799\n014\t163800\n015\t163801\n016\t163802\ntvypometimethimiabcdefeatitiklaaopppristheyouvwxyazhisyazzyoffeixiheyandecan\t163803\n018\t163804\n张有\t163805\n张月\t163806\n万瑞\t163807\n对的我哥哥\t163808\n想你这么说\t163809\n豆印\t163810\n135万元\t163811\n吗得\t163812\n主法\t163813\n兰亭集序\t163814\nyyuyytfdt\t163815\n美人痣\t163816\n西安饭庄\t163817\n额安\t163818\n呜呜呜度秘\t163819\n尼休斯顿\t163820\n扫赌\t163821\n呼呼的了了我的老公花花花花\t163822\n狡诈\t163823\n维尼饭\t163824\n小五\t163825\n牌店\t163826\n超酷跑\t163827\n零九岁\t163828\n慢慢玩儿\t163829\nCalvin\t163830\n派大星\t163831\n何思翰\t163832\n大了讨厌\t163833\nhlng\t163834\n金秀贤\t163835\nhrbebbejf\t163836\nTunes\t163837\n也想听\t163838\n远望\t163839\njj66cm\t163840\
n我的娃你好\t163841\ngusjx\t163842\n道包\t163843\n鹿喀\t163844\n成山\t163845\n开封市市委\t163846\n残余物\t163847\nzjzjj\t163848\n好你好你好\t163849\n驴友们\t163850\n慕磊杰\t163851\n砸车\t163852\n桂香\t163853\ndyou\t163854\n滥交\t163855\n4321\t163856\n细节感\t163857\n骨头人\t163858\n林铁刚\t163859\n该车\t163860\n鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅\t163861\n滑冰\t163862\n江小凤\t163863\n孙四\t163864\n张囯初\t163865\n甜心格\t163866\nfdghgddffg\t163867\nbdf\t163868\nbda\t163869\n新概念作文\t163870\nbdb\t163871\nbdn\t163872\nbdh\t163873\nbdk\t163874\nbdj\t163875\nbdu\t163876\n真好玩校吧\t163877\nbdv\t163878\n蟀\t163879\n韭菜堡\t163880\n不啦不啦\t163881\nELLEMEN特邀#ELLEMEN时装#【睿士\t163882\n农业\t163883\n拌面\t163884\nGyg\t163885\n告诉我的你的岁数\t163886\n浴皂\t163887\nHX\t163888\n1099家\t163889\n隔座\t163890\n冀\t163891\n谢文婷\t163892\n处男\t163893\nsityouto\t163894\n哎哟我乐\t163895\n桂馨\t163896\n掌心\t163897\n彭峰长\t163898\n100多年前\t163899\n靓颖\t163900\n党支部\t163901\n时速\t163902\n均\t163903\n说出来\t163904\n猪猪侠之巨棒\t163905\n说得好像\t163906\n三眼哮天录\t163907\n样本\t163908\nbd2\t163909\n周雨彤\t163910\n陆如\t163911\n粗滚粗\t163912\n太息\t163913\n黄好黄\t163914\n擦幹最後為\t163915\n噢噢\t163916\n波里\t163917\nduffffu\t163918\n高傲\t163919\n邵晓蕾\t163920\nhffchyeegj\t163921\n李笑话\t163922\n6.2日\t163923\n压道\t163924\nInput\t163925\nvgvgv\t163926\n乔思樊\t163927\n猪你是猪你是你是猪你是叔呀你是猪\t163928\n劣质\t163929\n死精\t163930\n失落感\t163931\n真的真的假不了假的假的假不了\t163932\n阅兵雯\t163933\nQQ刚电影节\t163934\n12345626351\t163935\n酷我恶龙\t163936\n风凉话我不和你了我不理你\t163937\n朱新博\t163938\nvvvvvvvv吧vvvvvvvvvvv吧vvvvv吧\t163939\n报密\t163940\n白大孵\t163941\n徐小乖\t163942\n说话说话\t163943\nffhffyryrty\t163944\n冬游地\t163945\n猫叫\t163946\n独生子女\t163947\n喔奶糖\t163948\n加护\t163949\n神懂十二\t163950\n荣鼎康城\t163951\n孙树峰\t163952\n#美队甜心#\t163953\n机器增压机器人\t163954\n女歌\t163955\n横纹肌溶解综合征\t163956\n三更\t163957\n重犯\t163958\n百八十米\t163959\n充愣\t163960\n头好\t163961\n开来看\t163962\nYouareroll\t163963\n那拉稀\t163964\n畅捷\t163965\n112秒\t163966\n追星行\t163967\n十二岁\t163968\n朱智龙\t163969\n小槐泣\t163970\n余长岳\t163971\n记错数\t163972\nwoif\t163973\n记账\t163974\n别着我\t163975\n袁子欣\t163976\n行不行好\t163977\n炒锅\t163978\n间视漆\t163979\n事迹\t163980\n拉格斯代尔\t163
981\n贝特曼\t163982\n谷拜\t163983\n行波\t163984\n南天蒙\t163985\ntf卜唉四\t163986\n储君奇丑\t163987\n堡亭\t163988\ndws\t163989\ndww\t163990\n完蛋\t163991\n雨柔湖北文\t163992\ndwx\t163993\n锦帽\t163994\n噪点\t163995\n笔墨\t163996\ndwc\t163997\n月轩吕布美团\t163998\n考慮考慮圖兔兔旅途突然感覺KTV圖就去圖KTV如圖圖兔兔哭哭哭塗阿卡麗肉圖庫\t163999\n末尾\t164000\ndwd\t164001\nokkokk\t164002\n吓坏\t164003\n筚路蓝缕\t164004\n坠\t164005\n得好谗\t164006\nmeshit\t164007\n仓央加措\t164008\n西樵山\t164009\n凤凰树\t164010\n求生\t164011\n说错话\t164012\n李欣阳\t164013\n赵星宇\t164014\n忘不懂\t164015\n熔盛约\t164016\n啃老\t164017\n小破孩\t164018\n求男\t164019\n咒文\t164020\n绝疚\t164021\n大肆宣传擦擦擦臭粑粑\t164022\n面破晓\t164023\n田那\t164024\n形影相行\t164025\n真乖我下次再来找你玩儿吧\t164026\n私房钱\t164027\n华容\t164028\n音讯\t164029\n万和\t164030\n讨厌你我不要你了了坏人\t164031\n收摊\t164032\n植物大战僵尸无中医无敌版\t164033\n才大屯\t164034\n跪下去\t164035\n井空\t164036\n昌梅\t164037\n背影\t164038\n什麼鳥\t164039\n房子\t164040\n柳运琴\t164041\n女厕所家\t164042\nFfcbbvmkfngz\t164043\n无法忘怀\t164044\n瓦塔\t164045\n悍将\t164046\n海豚台\t164047\n益力\t164048\n前列腺炎\t164049\n瓦塌\t164050\n四轮\t164051\n寄件\t164052\n超任\t164053\n佛罗密\t164054\n棋直方图\t164055\n227307243205\t164056\n山东话\t164057\n提恩\t164058\n十哥\t164059\n小苹果hi小泠\t164060\n淫贼\t164061\n感觉不像\t164062\n响动\t164063\n连日z放佑视你的20a\t164064\n淫贱\t164065\n好的再见晚安\t164066\n怕狠\t164067\n零八八七三七零\t164068\n地域性\t164069\n马爱春\t164070\nqsas\t164071\n大禹治水图\t164072\n不很久\t164073\n密块\t164074\n宾至如\t164075\n涅蝶\t164076\n一起十年\t164077\n追梦\t164078\n到好音樂\t164079\n张燕东\t164080\n沃兹尼克\t164081\n可可可哈哈哈\t164082\n刘可\t164083\n饭锅\t164084\nRRFGHCRY\t164085\n刘叫\t164086\n13704178140\t164087\n忍心\t164088\n我的的家的度隶\t164089\n民居\t164090\n同坊\t164091\n刘叔\t164092\n任汝芬方\t164093\n不要不要不要不要不要不\t164094\n健平安\t164095\n洪将怀\t164096\n宝丰\t164097\n刘双\t164098\n你好度秘宠你\t164099\n天窝\t164100\n段迪\t164101\nABS36\t164102\n依法治国\t164103\n糖醋排骨淘气堡7k7k7299\t164104\n康力\t164105\nHbsab\t164106\n最简\t164107\nipod\t164108\nyoupo2\t164109\nhiwjwjx\t164110\n涌流\t164111\n一臭不要脸\t164112\n杰克逊\t164113\n530652181\t164114\n局长\t164115\n签到台我的天大大哈\t164116\n本帮\t164117\n放手\t164118\n菲兹\t164119\n快餐业\t164120\n本帖\t164121\n详单\t164122\n拉尔夫·费因斯\t164123\ngs
ghsj\t164124\nok咯\t164125\n本市\t164126\n假再来\t164127\ndgchdyeus\t164128\n辣椒面\t164129\n杨姐\t164130\njuylkl\t164131\n首府\t164132\n百亿美元\t164133\n了不聊\t164134\n陈臭\t164135\n动向\t164136\n約炮嗎\t164137\n沈露曦\t164138\n可说\t164139\n可课\t164140\n难装\t164141\n两个数\t164142\n度秘麻烦你过找下魔幻手机\t164143\n可读\t164144\n六十天\t164145\n蔚县\t164146\n胡依婷\t164147\n鹿邑\t164148\n冠军杯\t164149\n贾永乐\t164150\n你好帅你就是我的男神\t164151\n度秘我问你件事行\t164152\n六十多\t164153\n六另一个\t164154\n温奶奶\t164155\n可试\t164156\n白尼\t164157\n記發\t164158\nfyfyfyfctcytygftstfyhygxrcrf\t164159\n首度\t164160\n贰厘米\t164161\n就不我\t164162\n航运业\t164163\n没那么远\t164164\n梦魇\t164165\n字数\t164166\n誓颜\t164167\n自拍照\t164168\n芷瑜\t164169\n死守\t164170\n唔世勋\t164171\n18039274813\t164172\n度乔\t164173\n梦魔\t164174\n弱子\t164175\nwetuij\t164176\n13801\t164177\n死定\t164178\n198元\t164179\nBLUE\t164180\n七十六块\t164181\n死亡人\t164182\n不夜城\t164183\n3981克\t164184\n在一而再再而三\t164185\n15d5d五\t164186\n643169766191643\t164187\n881642\t164188\n宫灯\t164189\n来电显示\t164190\n名皇\t164191\n糖果盒\t164192\n严于律己\t164193\n做做作业\t164194\n浴室柜\t164195\n宁太狼\t164196\n幺零零幺零总\t164197\n骨灰盒\t164198\n伶俐\t164199\n饭人\t164200\nagw\t164201\n嘛好开心\t164202\nwantanew\t164203\nfaux\t164204\n剧情\t164205\n生少\t164206\n你的错\t164207\nangle\t164208\n洛阳话\t164209\n回旋\t164210\n回族\t164211\n毕业们\t164212\n京奔驰\t164213\n13023156251\t164214\n栾子韬\t164215\n饭亲\t164216\n时慧筠\t164217\nmovil\t164218\n欧耶文\t164219\n宋松\t164220\nRffff\t164221\n花羊\t164222\n龚一晋\t164223\n谈情说爱\t164224\n王恕\t164225\n朱小咪\t164226\n闻讯\t164227\n夢想\t164228\n所难\t164229\n唔复\t164230\n妖枷\t164231\n故意的信不信我真的很想\t164232\n神32e\t164233\n篮球队\t164234\n邱涵韬\t164235\n欧耶斯\t164236\n喵喵喵喵喵嘟咪喵喵喵喵喵喵\t164237\n艳冬云\t164238\n一四号\t164239\n长话\t164240\n告干\t164241\n100万米\t164242\n长诗\t164243\n傅文刚\t164244\n九晚\t164245\n饼干报\t164246\n写出\t164247\n鸭绒\t164248\n长识\t164249\n140瓶\t164250\n貂皮\t164251\n葡皇\t164252\n九成九成九成九成九成九成九成0\t164253\n反常\t164254\nsitrz\t164255\n渐江\t164256\n苏昌彦\t164257\n非理莫\t164258\n下一年\t164259\nThurgau\t164260\n生不如\t164261\n大方方\t164262\nlylujlklhkhkcu\t164263\n因为啥\t164264\n陈宜浩\t164265\n养父母\t164266\n人人车\t164267\n二十
八十八\t164268\n亲我嘴摸我脸\t164269\n丰田\t164270\n宫外\t164271\n化学我喜欢\t164272\n君主\t164273\n11511米\t164274\n开玩笑胃不大理你了拜拜\t164275\n君临\t164276\n脾差\t164277\n长安道\t164278\njfqX4666b\t164279\n12388\t164280\n普洱\t164281\n12387\t164282\n陈俊哲\t164283\n孕棒\t164284\n试试试试试试\t164285\n开怀\t164286\n黝小凡糅\t164287\n乔一轩\t164288\n小初\t164289\n案板\t164290\n孕检\t164291\n天天有喜之味人间有爱电视\t164292\n中年\t164293\n摩羯\t164294\n姨母\t164295\n张倬闻\t164296\n客客服\t164297\n二第十章\t164298\nsbbssbsbsbsbsbsbsbbsbsbsbsbsbsbsbssbsbsbsbsbsbsbsbsbssbbsbsbsbs\t164299\n藁城\t164300\n坐打坐\t164301\n寻子\t164302\n张国征\t164303\n狐狸大海\t164304\n15542269\t164305\n人满为患\t164306\n没指\t164307\n我的父亲在哪你猜你猜\t164308\n七情\t164309\n肉蘑\t164310\n余文梦\t164311\n罗茹尹\t164312\n公衍颖\t164313\navou\t164314\nCFJIIVCXE2TUIIOKJHADDTHUIJNVDAERGGHNKIITREWWFH\t164315\n理查德麦伦\t164316\n葛午睡\t164317\nCOUPLE\t164318\n水果醋\t164319\nhi喽奈斯荼蘼TU\t164320\nduang\t164321\n玩闹\t164322\n替补席\t164323\n一百二十四块\t164324\n门阀\t164325\n董秘你好像很的帅\t164326\nlllllllllllllllllllllllllllll\t164327\n小段\t164328\n大理第二中学\t164329\n门阵\t164330\n听听听听听听听听听听听听\t164331\nxgixigiy\t164332\n推中\t164333\n那么多次\t164334\n想情人\t164335\n东北方言\t164336\n吓我乐意\t164337\n歹徒\t164338\n掏粪\t164339\nABCD\t164340\n离\t164341\n禾\t164342\nABCA\t164343\nABCC\t164344\nwall\t164345\n林建徽\t164346\n朱比亚\t164347\n冥想\t164348\n一百多分钟\t164349\n七二八零\t164350\n穆燃\t164351\n皮冻\t164352\nwaly\t164353\n一1234567\t164354\nu儿\t164355\n个合\t164356\n13654901720\t164357\n军工股\t164358\n玛利亚\t164359\n心上人\t164360\n温暖\t164361\n太和县\t164362\ngzdggrgjwjdb\t164363\np然\t164364\n要不是\t164365\n一万多公斤\t164366\n福\t164367\n禀\t164368\n禁\t164369\n全校奥\t164370\n禄\t164371\n禅\t164372\n六月四日\t164373\n跨进\t164374\n陈瑞波\t164375\n理智相处\t164376\nCometichiamo\t164377\n得病\t164378\n教育百事通\t164379\n丢地\t164380\n66525\t164381\n七十二块\t164382\n150块\t164383\n长焦相机\t164384\n学车\t164385\n杨武杰\t164386\nsezz\t164387\n没事儿\t164388\n雪梅\t164389\n跨过\t164390\n匹克\t164391\n人防\t164392\nCDC\t164393\n雪梨\t164394\n51643818466\t164395\n假面汤\t164396\n朝锋\t164397\n老榕\t164398\n库尾\t164399\n万里长征\t164400\nks卜\t164401\n等定\t164402\n千百度\t164403\n每天每天\t1
64404\n8563\t164405\n新蜀山剑侠传\t164406\n唐鸣\t164407\n金龙\t164408\nLoving\t164409\n韩少爷\t164410\n石橙廊\t164411\n娜式\t164412\n艺帆\t164413\nHhbvccfg\t164414\nts波伊斯\t164415\n白菜猪肉馅\t164416\n奏效\t164417\n懒蛋\t164418\n288664\t164419\n安葬\t164420\n746412\t164421\n无限不正经不要脸模式OTZ\t164422\n牛鹿\t164423\n冰冻三尺\t164424\n咱们俩们\t164425\n55695569\t164426\n上轮\t164427\n982528\t164428\n罗马\t164429\n优酷网\t164430\n会同\t164431\n双持沙鹰\t164432\n会后\t164433\n会合\t164434\nhttphhiphotosbaiducomxiaodupicitemfd039245d688d43fe89dfd817a1ed21b0ef43b88jpg\t164435\n玛非\t164436\n一一百八十\t164437\n高颖\t164438\n饱讨厌\t164439\n最低调\t164440\n姚子青\t164441\n发服服\t164442\n西游四一\t164443\n感恩信\t164444\n12月4日\t164445\ngo冰心\t164446\n一nnnnn\t164447\n蒸籠\t164448\n15213期\t164449\n国足\t164450\nisifh\t164451\n硫磺馒头\t164452\n英年\t164453\n15次\t164454\n1996年1月25\t164455\n不挺\t164456\n红米猫\t164457\nNever\t164458\n四化\t164459\ndrdy\t164460\n纨绔世子妃\t164461\n刘有利\t164462\ndrdt\t164463\n干什么谁知道\t164464\nobtain\t164465\n异国\t164466\n往下来\t164467\n永存\t164468\n班里号\t164469\n王成远\t164470\nMIUI71\t164471\n四包\t164472\ndrdf\t164473\n静止\t164474\ndrdd\t164475\n雅马\t164476\n研究天\t164477\n读者们\t164478\n要不想说\t164479\n崔博妍\t164480\n三四个月\t164481\n不好度\t164482\n修正\t164483\n古\t164484\n不要脸不要脸不要脸不要脸不要脸\t164485\n煮夫关系\t164486\n天铂袁\t164487\n为你就这样\t164488\nXXVB\t164489\n羽若\t164490\n九十点\t164491\nuuuxy\t164492\n卜一\t164493\nFdfdff\t164494\nkzkzkzokjjhsnzajzialspos18139\t164495\n好运会\t164496\n太表\t164497\n档畎\t164498\n叨\t164499\njigkut\t164500\ndaada\t164501\n可了不得\t164502\n道岁\t164503\n借款额度\t164504\n知寂灭\t164505\nfishand\t164506\n班机\t164507\n超级变态\t164508\n天王\t164509\n继续文\t164510\nxxzcxxvxv\t164511\n太衰\t164512\n天玄\t164513\n四千块\t164514\n掉队\t164515\n雨化\t164516\n作甚\t164517\n王总e\t164518\n太行\t164519\n哎呀机器人\t164520\n人工台console\t164521\n兰州拉面\t164522\n石澳\t164523\n迷津案\t164524\n组织\t164525\n组组\t164526\n刘二晨\t164527\n班服\t164528\ngkgbbvmbn\t164529\n女同桌\t164530\n曾蒙日\t164531\n机号\t164532\n现状\t164533\n规整\t164534\n机台\t164535\n十二种\t164536\n邵安晴\t164537\n唐河\t164538\n毛妹\t164539\n99635635\t164540\n文程\t164541\nopooppo\t164542\n冷萌\t16
4543\n吹牛皮\t164544\n作物\t164545\n哥宜家\t164546\n下战\t164547\nBB88\t164548\n马油\t164549\ntotty\t164550\n冷落\t164551\n凤凰城解放军\t164552\nddks\t164553\n昆山市\t164554\ntotto\t164555\n61岁\t164556\n周瑜\t164557\n机友\t164558\n周瑞\t164559\n指使\t164560\nв\t164561\nг\t164562\n童星\t164563\nб\t164564\nж\t164565\n死回来\t164566\nд\t164567\n得意\t164568\nк\t164569\nл\t164570\nи\t164571\nй\t164572\nо\t164573\nп\t164574\nм\t164575\nн\t164576\nТ\t164577\nР\t164578\nЧ\t164579\n胃酸\t164580\nХ\t164581\n片冈美\t164582\nЫ\t164583\n什么是好的公共生活\t164584\nЩ\t164585\nЮ\t164586\nЯ\t164587\nЬ\t164588\n东北北部\t164589\nГ\t164590\nБ\t164591\nЖ\t164592\nЗ\t164593\nnimoji\t164594\n棋艺\t164595\n灵长类\t164596\nЛ\t164597\n我是我最爱的的那个兄弟\t164598\n豆豉\t164599\n1vhjgag\t164600\nМ\t164601\nН\t164602\n135年\t164603\nЁ\t164604\nwvaakame\t164605\n牛棚\t164606\n龙肉\t164607\n叉\t164608\n少女生\t164609\njjhvfhrrwehkuevfegkuydnfojhdxjitwrlhaeghiegljdgkkdhhbcewjksnmlydhklk\t164610\n蝎子\t164611\n草草\t164612\n亲骂\t164613\n湿处\t164614\n大葱须根系\t164615\n瓦利亚巴拉巴拉\t164616\n你好学\t164617\n争艳\t164618\nznnxnx\t164619\n第4季度\t164620\n污泥\t164621\n死基佬\t164622\n吃吃吃吃吃吃\t164623\n青壮年\t164624\nvak\t164625\nvan\t164626\nval\t164627\n湿大\t164628\n来事\t164629\nvae\t164630\n你好孝\t164631\n2.5%\t164632\nvap\t164633\n2013届\t164634\n取\t164635\n安纳普尔\t164636\nhna\t164637\n零八零六\t164638\nGjx\t164639\n空调\t164640\nGju\t164641\n金沙场\t164642\n唇炎晨晨晨晨穿穿穿穿穿穿穿穿穿穿\t164643\n岗位\t164644\n狗耳\t164645\nGjk\t164646\n累身\t164647\nGjf\t164648\nGjc\t164649\n2.5L\t164650\n别老度秘度秘的好不好\t164651\nm莎莎\t164652\n吃鸡\t164653\n浅色裙\t164654\n74587555787955561145755893786591919657899148957\t164655\nwhether\t164656\n叟\t164657\nxzxzx2\t164658\n悍妻\t164659\n矮胖\t164660\n枝丫\t164661\n枯荷\t164662\n家办\t164663\n望子成龙\t164664\n清淤\t164665\n日武士\t164666\n连用\t164667\n别无他法\t164668\n新6b\t164669\n二模\t164670\n撸点图\t164671\n解析\t164672\n林庆德\t164673\n2512米\t164674\n啊巍\t164675\n3.摩羯座\t164676\n网盘\t164677\n强强\t164678\n多米尼哥\t164679\n网盒\t164680\n网监\t164681\n散心\t164682\n解构\t164683\n强弱\t164684\n拜泉\t164685\n十十斤\t164686\n还行不行\t164687\n刚才点\t164688\n兼容\t164689\n中锋哥\t1646
90\n杨文群\t164691\n响亮\t164692\n度蒙\t164693\n川川\t164694\n坏猪\t164695\n改版后\t164696\n110628\t164697\n／／酱酱酱\t164698\n鸡米花\t164699\n稳定\t164700\n一首歌\t164701\n以身试验\t164702\n文稿\t164703\n耽我事\t164704\n最炫民族风\t164705\ngiut\t164706\ngius\t164707\n翘首\t164708\n李曜\t164709\n珍重\t164710\n爸爸爸爸爸爸爸爸\t164711\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t164712\n摩托车\t164713\n2005年3月7日\t164714\nFading\t164715\n购置\t164716\n草荡\t164717\n机器机\t164718\n后援会\t164719\nTankyou\t164720\n子男\t164721\n家妞\t164722\n團\t164723\n土\t164724\n圝\t164725\n園\t164726\n法文\t164727\n子画\t164728\n嗯对\t164729\n请别相信他\t164730\n头跟\t164731\n尼丑\t164732\n國\t164733\n圈\t164734\n钰诚花\t164735\n餐宴山\t164736\nyobaby\t164737\n益虫子\t164738\nfcycvvxhcjhbb\t164739\n圆\t164740\n场\t164741\n别再闹\t164742\n圾\t164743\n上两个月\t164744\n来一起来\t164745\n圳\t164746\n地\t164747\n在\t164748\n圮\t164749\n圭\t164750\n圢\t164751\n圣\t164752\n挺年輕\t164753\n倍感\t164754\n圧\t164755\nuru\t164756\n棱\t164757\n丰满度\t164758\n打的好不\t164759\n棵\t164760\n棺\t164761\n我不爱你一也不爱我我就是讨厌你\t164762\n棠\t164763\n噜噜噜噜噜\t164764\ngdseetfddddf\t164765\n１４世纪\t164766\n欣蕊\t164767\n12月14日\t164768\n你的很不想玩\t164769\n宫殿\t164770\n颂扬\t164771\n棒\t164772\n阿阿阿\t164773\n棕\t164774\n爱迪生\t164775\n仰天大笑\t164776\n棚\t164777\n无法证明\t164778\n检\t164779\n棄\t164780\n来讲一讲\t164781\n哎baby\t164782\n棋\t164783\n精灵梦夜萝莉\t164784\n呃剑\t164785\nIsisissisksksk\t164786\n家养狗\t164787\n张群新\t164788\n华商报\t164789\n策略\t164790\n李金懋\t164791\n浇筑\t164792\n阿里神山\t164793\n东西馆\t164794\n刑事诉讼\t164795\n袁讲\t164796\n不稀\t164797\n陈文博\t164798\n追星\t164799\n法国大使馆\t164800\na梦\t164801\naili了哎呀奥特曼哎呀奥特曼\t164802\n不稳\t164803\n浑\t164804\n坏气\t164805\n余文乐\t164806\n快点游戏游戏游戏\t164807\nytrwashiopphg\t164808\n小代\t164809\n两百\t164810\n可爱度\t164811\nspks\t164812\n饱感\t164813\n哥德巴赫\t164814\n二百六十八级\t164815\n额吻\t164816\n999999999999个\t164817\n22222222222222222211111744444222233355555666777777888999999999999\t164818\n小任\t164819\n浙\t164820\n坂上之云\t164821\n珠澳\t164822\nv号放假句\t164823\n姜大成\t164824\nHLU\t164825\n杨坤\t164826\n胡欣\t164827\n小仓\t164828\n拜拜了摸摸\t164829\n加冰强\t164830\n猪猪公\t164831\n小仙\t164832\n五三点\t164833\n看雪\
t164834\n东软\t164835\n忄Ⅷ樾龖\t164836\nwasavery\t164837\nfnnn\t164838\n凋寒\t164839\n依酷\t164840\n组委\t164841\n198岁\t164842\n弄多力图\t164843\n好热\t164844\n点解救命案板寸354459123558823\t164845\n早洗澡\t164846\n机器材\t164847\nFhbd\t164848\n进军\t164849\n自己人\t164850\n33度\t164851\n直加\t164852\n6点10分\t164853\n齐人\t164854\n赶快华\t164855\n条例\t164856\n叫就像\t164857\n嗯莉娜\t164858\n苗门\t164859\n肉末茄子\t164860\nWFFV\t164861\n竹竿子\t164862\n不雅宁\t164863\n孙秘戏\t164864\n收语音\t164865\n二点7点\t164866\n割胸\t164867\n刷机恶人\t164868\n嘎嘣响\t164869\n越发\t164870\n动眼\t164871\nhole\t164872\nhold\t164873\n20米\t164874\nvvhhfdd\t164875\nhola\t164876\n达拉达拉拉达大都\t164877\n烟花龙母\t164878\n国彬\t164879\n水滴体育场\t164880\n哎呀度秘我讨厌你\t164881\n康洁\t164882\n父子文\t164883\n获益匪浅\t164884\n蛮少\t164885\n擦啊擦\t164886\ngfddsdw\t164887\n去找寻\t164888\nE文\t164889\n崔永博\t164890\n狼牙棒\t164891\n林份\t164892\n歌们\t164893\nSINGLE\t164894\n糗事百科\t164895\n浸\t164896\n酬果\t164897\n悄组\t164898\n咯hi7卡V5\t164899\n666略\t164900\n梁宇轩\t164901\njqii\t164902\nhob\t164903\ntdvhu\t164904\n山南微旅行\t164905\n范济民\t164906\n有罗\t164907\n乖小凌\t164908\nhom\t164909\n零二二零二九七八四四九\t164910\n480页\t164911\n2000万美元\t164912\nhow\t164913\n13号\t164914\n老何\t164915\n惊恐万状\t164916\n13台\t164917\nhkvvb\t164918\n这事儿\t164919\njqiq\t164920\nhoz\t164921\ngggyfd\t164922\nhox\t164923\n噜啦啦噜啦啦噜啦噜啦嘞噜\t164924\n王作安\t164925\n居安思危\t164926\n小兰\t164927\n小共\t164928\n1.4米\t164929\n小兴\t164930\n小兵\t164931\n手写体\t164932\n小兽\t164933\n这些日子\t164934\n九十二二九十八块\t164935\n浯\t164936\n手帕\t164937\n小八\t164938\n中路\t164939\n小兮\t164940\n手帐\t164941\n小公\t164942\n小六\t164943\n小兔\t164944\n冬堡\t164945\n不要知道\t164946\n手带\t164947\n高良潮\t164948\n0点五\t164949\n陈忆萱\t164950\n小元\t164951\n上海大悦城闸北店Kose旗舰店\t164952\n放火\t164953\n孟宇恬\t164954\n小先\t164955\n小光\t164956\n孙佳雪\t164957\n星七\t164958\n火海\t164959\n空前绝后\t164960\n谢霆锋\t164961\n边界线\t164962\n杀鸡取卵\t164963\n十万个\t164964\n真错\t164965\n站小姐度量技术\t164966\n海狗鞭\t164967\ngggygy\t164968\n外国家\t164969\nzlzl\t164970\n西班\t164971\naileyou\t164972\n东营市\t164973\n珊姐\t164974\nwouBh\t164975\n卡罗拉屯\t164976\n十万一\t164977\nleehofun\t164978\n太阳港\t164979\n嬲咧\t164980\
n十里河县\t164981\n没法儿\t164982\n对对幺八五\t164983\nShorry\t164984\n梦幻梳\t164985\n月日\t164986\ndfgjiu\t164987\n口罩\t164988\n合理性\t164989\n奔驰生气\t164990\n东艺\t164991\n成功之母\t164992\n842655555\t164993\n801z\t164994\n华妃\t164995\n2007年9月20日\t164996\n腻味银盏\t164997\n咱们\t164998\n名萌\t164999\n网易新闻\t165000\n男体\t165001\n变化多端\t165002\n夸节快乐\t165003\n现年\t165004\n彭七七\t165005\n手动挡\t165006\n好啦你是真人那还机器人\t165007\n手间房\t165008\n没见一天\t165009\n请哝\t165010\n看管\t165011\n烦心\t165012\n藏龙卧虎\t165013\n泻落\t165014\n徐胜拓\t165015\ntjf8th9\t165016\n啦啦啦啦你口臭\t165017\n冼碧君\t165018\n严酷\t165019\n戳穿\t165020\n菱角湖\t165021\npinyin.cn/e17111\t165022\nMymobile\t165023\n剧变\t165024\n爱＝[红桃\t165025\n好loaa\t165026\n中凯\t165027\n周针兰\t165028\n朴建华\t165029\n石径\t165030\n针线\t165031\n可怜的人\t165032\ngjsc\t165033\n1700余平方公里\t165034\n周威\t165035\n亲有\t165036\n画画\t165037\n美事业\t165038\n虎岛和夫\t165039\n五阴\t165040\n气息\t165041\nGGCFCc\t165042\n追回去\t165043\n你的死\t165044\nfrrd\t165045\n旺仔牛奶糖猫\t165046\nUudgdhbsyfhdydhhcuzfsgRgdsggjvavhhzgkgabjdhsxyhbfgkfdshkuDbhuvvzvdhFffghjfyxhgfsbjmvkchtgkmgzhmpgdGhxgghhTgxfTffhYxcgihjhxjdffgxxhgfgGhxfjhzdcjkvsvklcsbjvkbAvnogsscbkfdsaqwertyuiopasdfghjklzxcvbnm\t165047\n大熊风\t165048\n不然\t165049\n校花的秘诀告诉我写笑话的秘诀\t165050\n三德歌\t165051\n娃娃音\t165052\n欧阳一成\t165053\n上边儿\t165054\n崎岖\t165055\n度秘兮兮度秘戏\t165056\n侍者\t165057\nfgdgdfx\t165058\n肖云飞\t165059\njj呀战1\t165060\n自横\t165061\n环球时报\t165062\n没秘没\t165063\nnnn784946484848￥\t165064\n长立\t165065\nv446724\t165066\n脂肪型\t165067\n熊依晨\t165068\n支离破碎\t165069\n说三\t165070\n里昂\t165071\n文海浩\t165072\n切塞纳\t165073\n腥风血雨\t165074\nstroza\t165075\n多米妮克\t165076\n信你刘\t165077\n东方明\t165078\n136号\t165079\n天姐姐\t165080\n孙屯\t165081\n江苏地区\t165082\n还有多些\t165083\n小梦比\t165084\n超级武士\t165085\n刃吧\t165086\n该事\t165087\nhxisid\t165088\nwwwptzcece点com\t165089\n凝重\t165090\nkftjzfhj\t165091\n恩妞妞\t165092\n冰帅\t165093\n海军陆战队\t165094\n该亚\t165095\n南门外\t165096\n西安教育局\t165097\n机器人大坏蛋大坏蛋大坏蛋大坏蛋吧\t165098\n888888777777666666555555444444333333222222111111\t165099\n衬衣\t165100\n我爱你好我的小秘书\t165101\n嘟咕\t165102\n衬衫\t165103\n猪嘛大肥猪\t165104\n她子\t16510
5\n度秘真听话\t165106\nXhbbxf\t165107\n金谷\t165108\n老姐咱我爱你\t165109\n周獴\t165110\ni错啦雪伽数\t165111\n明儿\t165112\n杨舒淇\t165113\n提示音\t165114\n12224\t165115\n前十名\t165116\n12222\t165117\n嘟咪\t165118\niyddidubf\t165119\n小津安二郎\t165120\nYOUknow\t165121\n3433.88亿元\t165122\nhud\t165123\nOuou\t165124\n90100\t165125\n留口\t165126\n快快快跑快跑快跑快跑\t165127\n71水01中一4了美y一foxnl一乡互又一女7工乚\t165128\n中惠磅\t165129\n认得\t165130\n大惧易失节,\t165131\n破坏藤\t165132\n11111111110\t165133\n杨佳心\t165134\n更搞笑\t165135\nhuc\t165136\n渣夫金\t165137\n8848\t165138\nhul\t165139\n中国羽毛球队\t165140\n啊红\t165141\n郝凯旋\t165142\n留发\t165143\n冰帮\t165144\n集团经理\t165145\n最低温度\t165146\n8844\t165147\n8846\t165148\nhttppinyincne17721\t165149\n女人味\t165150\n老海龟\t165151\n煮书\t165152\n好多好难\t165153\nnbnn\t165154\n唱一生\t165155\n380分\t165156\n一块一块儿\t165157\n868个\t165158\n复仇堡\t165159\nsgdcet\t165160\n喵喵喵喵喵喵喵喵喵喵\t165161\n好嘛多咪\t165162\n潜台词\t165163\nzzif\t165164\n酷熊\t165165\n剁鸡\t165166\nhut\t165167\n黄黄片\t165168\n布鲁塞斯\t165169\n思思思思\t165170\n你的歌\t165171\n提发\t165172\n句尾\t165173\nN50\t165174\n加码\t165175\n提及\t165176\n一十三百\t165177\nfhjok\t165178\n杆子\t165179\n愁眉苦脸\t165180\nOP度秘度秘\t165181\n冷死\t165182\n小礼\t165183\n27种\t165184\n冤枉\t165185\n249欧元\t165186\nhoutaa5cui\t165187\n你好度秘维尼夫\t165188\n子眸\t165189\n倒插门\t165190\n红卡\t165191\n笑可\t165192\n攒肚\t165193\n个你跟\t165194\n陶俑\t165195\n你的老师你的爸妈你的爷爷\t165196\n朱瑜莹\t165197\n车站\t165198\n我真的很想你了那你有没有爱我\t165199\n1543\t165200\nQQ行\t165201\n四个黄的一个一个一个灰嘞一个白嘞\t165202\n十蚊\t165203\n1549\t165204\ngctyhguhh\t165205\n200万元\t165206\ntouched\t165207\n快点头\t165208\n快餐我在等你的听歌\t165209\n是容\t165210\n哪样恋\t165211\nhi你是斯文人\t165212\nCLAMP\t165213\n一军\t165214\n陈凤辉\t165215\n岔道\t165216\n多妙妙好多咪呢僵尸的蓝图妙咪\t165217\n叶慧娴\t165218\n走麻烦的达\t165219\n是不是不是\t165220\n好好看\t165221\n一再\t165222\n一册\t165223\n一二零二二二一九六七零三零六零零幺零\t165224\n玉山县\t165225\n伍迪艾伦\t165226\n327189927\t165227\n刘忠虎\t165228\n渐行渐远\t165229\n付珈宁\t165230\n一冰\t165231\n一决\t165232\n渐行渐近\t165233\n寒气\t165234\n#2AM#\t165235\ndrgkb\t165236\n邦邦邦\t165237\n醪糟味\t165238\n心电\t165239\npofn\t165240\n赶工\t165241\n长地久\t165242\n蔡水平\t16524
3\n怪盗基德第二季\t165244\nMARITIM酒店\t165245\n蒋梦莹\t165246\n看一下童\t165247\n宁太\t165248\n延鲁\t165249\n不偿\t165250\nyou度克\t165251\n26天边\t165252\n自由主义者\t165253\n310多名\t165254\n644910678\t165255\n不偏\t165256\n沒存\t165257\n不假\t165258\n麦吉历险\t165259\nwikipedia\t165260\n宁夏\t165261\n蓉茉江\t165262\n宽阔\t165263\nprothou\t165264\nchfjfuuf\t165265\n一只眼\t165266\n不做\t165267\nJugng\t165268\n和嘛嘛幸福\t165269\n风凌天下\t165270\n张艳怡\t165271\nwtp\t165272\ncouchoutlet\t165273\nwtt\t165274\nwtw\t165275\n残像\t165276\n相呗\t165277\n七零三八二\t165278\n8点32分\t165279\nwta\t165280\nwtc\t165281\nwtd\t165282\n鸭血汤\t165283\nwtg\t165284\nwtj\t165285\nmusical\t165286\n了不骗\t165287\n演奏\t165288\n不是家有事针灸针扎你和\t165289\n十二点儿\t165290\n乞丐男\t165291\n倒点水\t165292\nuguigigyihihibih\t165293\n彭正\t165294\n一发一个\t165295\n我是仁宗叫版语文书四年\t165296\n千年一叹\t165297\n净鱼\t165298\n黛玉\t165299\n感觉吧\t165300\n徐志鸿\t165301\n不用其极\t165302\n13946722485\t165303\ndifj\t165304\n好印象\t165305\ndifh\t165306\ndifg\t165307\ndiff\t165308\n唔舍得口\t165309\n4c一\t165310\nYuYongYu\t165311\n扬州市\t165312\n东明路\t165313\n暖娘暖娘\t165314\n好时光脚来了三三好好奇奇没有\t165315\n寡比\t165316\n杨千喜\t165317\n一声叹息\t165318\n球层\t165319\n台北市立动物园\t165320\n雷彤\t165321\nè\t165322\n贡嘎\t165323\n治病\t165324\n蔡佳敏\t165325\n期末考试\t165326\n汉口\t165327\n姐们儿们\t165328\n唐三\t165329\n趁早\t165330\n学术论文\t165331\nwish\t165332\n哎呀好丑好丑好暖心\t165333\n果冻度秘米\t165334\n埋不了\t165335\n怀改\t165336\n一拍拍拍拍拍\t165337\n张给我\t165338\n唐东\t165339\n一霎那\t165340\n菲力牛排\t165341\n歌迷们\t165342\n余多余\t165343\n13513492998\t165344\n来不及你\t165345\n雨鞋\t165346\n118元\t165347\n哼不理你了我\t165348\n荳荳\t165349\n步枪\t165350\n没有信\t165351\nsien\t165352\n7200公斤\t165353\n沈鸽\t165354\n上心\t165355\neen会\t165356\n开云溪\t165357\n欧耨耨耨耨耨耨耨耨耨\t165358\n若干项\t165359\n东城门\t165360\ndbbdbfbfh\t165361\n乐呵\t165362\n天系\t165363\n两制\t165364\n疑病\t165365\n第三篇\t165366\n棉乡\t165367\n高盛公司\t165368\n3535\t165369\n3534\t165370\n福原爱\t165371\n贯文瑶\t165372\n自筹\t165373\nskosm\t165374\n展出\t165375\n么么么么哒别害羞一起唱萌萌萌萌哒给认真放个假萌萌萌萌哒\t165376\nyvutu\t165377\n视数\t165378\n肱二头肌\t165379\nthings\t165380\n这一年级\t165381\nogxxf\t165382\n玉京\t165383\nì\t1
65384\n丁凤云\t165385\n怍文\t165386\n周韵\t165387\n大杀器\t165388\n根通\t165389\n乐呆\t165390\npleaserightevery\t165391\n行色匆\t165392\n六页\t165393\n不三\t165394\n呵呵分互粉互粉互粉\t165395\n陈冠\t165396\n第五步\t165397\n4月1日\t165398\n六项\t165399\n一卡\t165400\n太太太太\t165401\n童话园\t165402\n驾驶员\t165403\n70个\t165404\n日游\t165405\n丑恶\t165406\n二二零二\t165407\n重复机\t165408\n中青年\t165409\n普不懂\t165410\n嗜睡\t165411\n约行\t165412\n了不去\t165413\n穷乱\t165414\n两两岁\t165415\n电影吧\t165416\n#傅叶#\t165417\n纷万象\t165418\n飯沒\t165419\n鱼血对龟\t165420\n我不老实真的假的你个变态机器人傻子by+\t165421\n楠木\t165422\n陈军\t165423\nxxxjn\t165424\nvggjf\t165425\n城市管理者\t165426\n早着呢人\t165427\n某会\t165428\n透析\t165429\n王好诚\t165430\n两千年2月6日晚上\t165431\nugkci\t165432\n都敏\t165433\nCynosure\t165434\n海南话\t165435\n为师\t165436\n自我同一\t165437\n妖魔鬼怪\t165438\n22254112542214477个\t165439\ncyrg\t165440\nda1\t165441\n学士\t165442\nhihihihihi\t165443\n李嘉恒\t165444\n红球\t165445\n找我我不会\t165446\n仰望\t165447\n哎呀不想\t165448\n混账\t165449\n呵呵vv\t165450\n丽霞\t165451\n心形\t165452\n封蓓蓓\t165453\n有所属\t165454\n八零三\t165455\n黄子幍\t165456\n东风本田汽车弘腾特约销售服务店\t165457\n下月初\t165458\n杂们\t165459\n落暮\t165460\n能不能快点的我求求你了我真的求求你了你给我\t165461\nbushi\t165462\n周杰伦\t165463\n胭脂\t165464\n狄更斯\t165465\n编造\t165466\n百没\t165467\n酿造\t165468\n咸丰\t165469\n郑于洋\t165470\nLeft\t165471\n录音棚\t165472\n换气\t165473\n幺五五\t165474\n吗哥们\t165475\n幺五二\t165476\n一厘\t165477\n且慢\t165478\n技术学\t165479\n梅仙岭\t165480\n海口高新区\t165481\n王和李\t165482\n黄忆慈\t165483\n十六年前\t165484\nSjgxgfa\t165485\n集训\t165486\n计划生育\t165487\n抵扣\t165488\n解甘\t165489\n千禧银杏\t165490\n安志荣\t165491\n幸福感\t165492\n林立\t165493\n八八亚丽\t165494\n李松林\t165495\nffftarx\t165496\n遗址\t165497\n跳天\t165498\n再来说吧妈妈说妈妈妈妈嗯妈妈\t165499\n炸鸡蛋\t165500\n小读者\t165501\n罗徳曼\t165502\nuugd\t165503\nHiJeannie\t165504\nuugb\t165505\n卖票\t165506\n身怀\t165507\n想不想知道\t165508\n庞给\t165509\n老丁省略号\t165510\n吹吹\t165511\n吴代表\t165512\n百分之九十\t165513\n235778193\t165514\n岁月份\t165515\n吴敬琏\t165516\n妇联\t165517\n反腐\t165518\n石龙\t165519\n逐点\t165520\n吗咖\t165521\n玩完成\t165522\nlllllllklle\t165523\n儒家\t165524\n立马\t165525\n减弱\t165526\n江龙\t165527\n博士后\t165528\n吗
咪\t165529\n伊希亚\t165530\n婚姻生活\t165531\no\t165532\n回暖\t165533\n据份\t165534\n阅读\t165535\n翻页性思维\t165536\n贵\t165537\nNnlmmmmmmkjjjknk\t165538\n專輯\t165539\nWHO\t165540\n拳皇\t165541\nguxi\t165542\n我讨厌你讨厌你讨厌讨厌讨厌你\t165543\n刘梦依\t165544\n梧桐女九男\t165545\n亚露水\t165546\n杜琦燕\t165547\nJDN\t165548\n徐传文\t165549\n潘舒润\t165550\n红木\t165551\n妈妈的和爸爸的和你一\t165552\n贴\t165553\n红本\t165554\nabcdeftlanzka慢obklnstuvwhichichichoyouce嗯嗯嗯嗯秘c\t165555\njhgggfffff\t165556\nfrid\t165557\ncchhcghhcg\t165558\n大世界上\t165559\njuggjh\t165560\n待字\t165561\nwomamamyan\t165562\n忘记\t165563\nsgxhb\t165564\n测恩\t165565\n丽丽丽丽丽丽丽丽噜噜噜\t165566\n红月\t165567\n美女子\t165568\n消没有\t165569\n饺子王\t165570\n凝眉\t165571\n1945年4月\t165572\n54878128\t165573\n智轩\t165574\n十個\t165575\n熊兴桂\t165576\n新武林外传\t165577\n52blackberry论坛\t165578\n萌神\t165579\n她的爱\t165580\n言之凿\t165581\n很多度秘\t165582\n底特律\t165583\n邪天传了你\t165584\n法国队\t165585\n二比\t165586\n81226531\t165587\nｌｏｖｅｌｏｖｅ\t165588\n819得九\t165589\nWHY\t165590\n1948年6月\t165591\n66856个\t165592\nTony\t165593\nJDY\t165594\nIP信使\t165595\n一所一千\t165596\n苦殿\t165597\n老何微薄\t165598\n买买买买买66买买买6\t165599\n优酷土豆\t165600\n巴西航空工业公司\t165601\n群龙无首\t165602\n南北\t165603\n爱见\t165604\n移动4g\t165605\n3w点13bd点\t165606\n20层\t165607\n次子\t165608\n常青藤盟校\t165609\n百分之一\t165610\n茶叶味\t165611\n1000865540525\t165612\nNOT\t165613\n食戟度\t165614\n6182\t165615\n南区\t165616\n事实事实事实事实事实\t165617\n不治\t165618\n得宠\t165619\n不沾\t165620\n退避三舍\t165621\n清明节\t165622\n小志\t165623\n龙鱼肠炎\t165624\n18717289754\t165625\n小忙\t165626\n唉疯了\t165627\n栗战书\t165628\n還隱\t165629\n小心\t165630\n站地英\t165631\n幺三零\t165632\n米们\t165633\n24周年\t165634\nTGodhhggthe6yyggv\t165635\n小念\t165636\n36.16\t165637\n米帅侠\t165638\n见人说人话\t165639\n4nnnnnn\t165640\n3到4天\t165641\n1116999个\t165642\n融解\t165643\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t165644\n第8届\t165645\n小忧\t165646\ntime\t165647\n兜兜\t165648\n门牙\t165649\n锌\t165650\n│n　
│\t165651\n锈\t165652\n色冽\t165653\n锇\t165654\n嘁嘁嘁嘁嘁嘁\t165655\n锅\t165656\n吴爱英\t165657\n销\t165658\n锁\t165659\n尊卑\t165660\n尿样\t165661\n锝\t165662\n锚\t165663\nififuxizi\t165664\n巨型婴\t165665\n锖\t165666\nLX3\t165667\n艳荣\t165668\n锕\t165669\n出款\t165670\n行书\t165671\n取货\t165672\n键\t165673\n锯\t165674\nABB式\t165675\n猫师\t165676\n行乞\t165677\n咪子\t165678\n锦\t165679\n锤\t165680\n锥\t165681\n骨动\t165682\n锣\t165683\n锡\t165684\n行么\t165685\n819\t165686\n锻\t165687\n五年级语\t165688\n810\t165689\nmdgmg0016467947\t165690\n锴\t165691\n813\t165692\n814\t165693\n815\t165694\n纳税\t165695\n短语\t165696\nxhc\t165697\n黄庆荣\t165698\n政府机构\t165699\n10元\t165700\n9295\t165701\n贺\t165702\n名誉\t165703\n良苑\t165704\n聊天天在线\t165705\n诸葛星\t165706\n别离\t165707\nWEED\t165708\n81X\t165709\n来一找\t165710\n洞见\t165711\n哟默\t165712\n厦门大学\t165713\n重组概念股\t165714\n度秘我是你的小主人公\t165715\n好小八点半王小吧\t165716\n江敏\t165717\n龙类\t165718\n299美元\t165719\n电缆线\t165720\n茶陵\t165721\n157cm\t165722\n492275455336418444842264825737549\t165723\n天天有喜之人间有爱你看\t165724\n同声\t165725\n队医\t165726\n李家大少\t165727\n良苦\t165728\n天天向上\t165729\n40200\t165730\n笑一笑\t165731\n圆楼\t165732\n用不上\t165733\n午时\t165734\n陈SB\t165735\n附加题\t165736\n新天地南里\t165737\n幺三零零二二零零八四零\t165738\nu肚饿useu\t165739\n午日\t165740\n匆匆\t165741\n老苍\t165742\n老苏\t165743\n炫仔\t165744\n嘻嘻嘻嘻嘻嘻嘻嘻哈哈我是一个人人人人人人人人人人人人人\t165745\n预警\t165746\n黑水河乡\t165747\n正和\t165748\n艾达斯\t165749\n20家\t165750\n你好棒\t165751\n梁光烈\t165752\n闹困\t165753\n嘴亲\t165754\n不失时机\t165755\n宝贝儿我爱你你爱我不我爱你\t165756\ngggv\t165757\n问提\t165758\n林沛宜\t165759\ngggr\t165760\n泰坦尼克号\t165761\n泡泡吧\t165762\n言笑\t165763\n交易额\t165764\ngggg\t165765\ngggf\t165766\ngggc\t165767\n腹黑攻\t165768\n我的手右手\t165769\nLtd\t165770\nalone\t165771\n文档\t165772\n空办\t165773\n假意\t165774\ngggj\t165775\n喵呜\t165776\ngggh\t165777\n认不认可\t165778\n13518252537\t165779\n抱足\t165780\n大宅门\t165781\n成绩\t165782\n中国移动\t165783\njsidbeksifbeoxkendkdbwwbdbsbdbdodksbdkdndekskmqmsnsmsnanabanabs\t165784\n张坤龙\t165785\n八爷\t165786\n文案\t165787\n丁源俊\t165788\n武侠客\t165789\n面部\t165790\n什琴斯尼\t165791\n22222222255555555550000000个\t16
5792\n刺眼\t165793\njigsdj\t165794\n力量\t165795\n热点\t165796\n法克因拍给\t165797\n守交规\t165798\n短毛\t165799\n傻子度\t165800\n陆文婷\t165801\n家长会\t165802\n一经\t165803\n龙谷\t165804\n聊聊天吧\t165805\n12345678910101213141516171810\t165806\n48857\t165807\n五儿\t165808\n眯睇睇\t165809\nfilmf\t165810\niqrp\t165811\n会引起\t165812\n處女座\t165813\n下杨湖口\t165814\nfilms\t165815\n伱猜\t165816\nGIFI\t165817\n苦酒\t165818\n对称点\t165819\n错尼\t165820\n素不相识\t165821\n郝希媛\t165822\n来马\t165823\n信院\t165824\n口袋妖怪\t165825\n雷文林\t165826\n今天五原中学\t165827\n反语\t165828\n懂事儿\t165829\ndidio\t165830\n汉纸\t165831\n五十年\t165832\n腹腔镜\t165833\ndidid\t165834\nDeada\t165835\n萝卜3\t165836\n歌听\t165837\n洗恩赐\t165838\n百顺\t165839\nfhuio\t165840\n太矮人\t165841\nQQ头像屌\t165842\n脚跟\t165843\n盐橙\t165844\n赵将括\t165845\n算账\t165846\n噶来了\t165847\n鲍尔森\t165848\n喜羊羊小游\t165849\nSItsa\t165850\n歌名\t165851\n洛克希德公司\t165852\n3D加勒比海盗4\t165853\ng10几秒\t165854\n米科拉伊·泽布日多夫斯卡\t165855\n张梦君\t165856\n张国荣\t165857\n小老乡\t165858\n异曲同\t165859\n王银烽\t165860\n粉猪\t165861\n石不\t165862\nlllllol\t165863\n小红鸭\t165864\n最后一章\t165865\n暗黑破坏神3\t165866\n7：30\t165867\n最后一站\t165868\n小老么\t165869\n弄弄\t165870\n7：38\t165871\n赵行村\t165872\n死小攻\t165873\n9点16分\t165874\n699元\t165875\n冰雪植物大战僵尸游戏\t165876\n左膀右臂\t165877\n影圾馆\t165878\n拉级\t165879\n小货车\t165880\n111112222233334444566667778889999\t165881\n意型\t165882\n半信半疑\t165883\n新农新萌新萌新萌新萌你好\t165884\n就好听话说的\t165885\n太君\t165886\n桔子\t165887\n问罪\t165888\n哈哈哈哈我你住我\t165889\n裴幽\t165890\n不堪卒读\t165891\n假尿\t165892\n药药方\t165893\n小爱婆\t165894\n占占\t165895\n占卜\t165896\n丑丑\t165897\n假封\t165898\n太吵\t165899\n邵华娟\t165900\nNCU\t165901\n复习线\t165902\n度密谢谢你\t165903\n人居\t165904\nvikn\t165905\n风湿科\t165906\n亮天\t165907\n战袍\t165908\n黑本\t165909\n雷兩\t165910\n不安秘\t165911\n小票\t165912\n蔡三川\t165913\n幸幸幸\t165914\n云聊\t165915\n第一个八十\t165916\n自治区\t165917\n蓝胆\t165918\n反译\t165919\n曹星期\t165920\n先侍寝\t165921\n风声水\t165922\n来偶\t165923\n牛奶味\t165924\n缤纷争执勤力\t165925\n一起头儿\t165926\n定唠嗑\t165927\n国共\t165928\n丁一昊\t165929\n1100公里\t165930\nghbblh\t165931\n航天\t165932\n认知\t165933\n今晨\t165934\n两三句\t165935\n两三只\t165936\n能发短信\t
165937\n反诗\t165938\n平均引用率\t165939\n忍会\t165940\n航天信息\t165941\nsim卡\t165942\n玩梦\t165943\n脂呅\t165944\n花市\t165945\n花布\t165946\ndudjfjcjxidgkhfjcvjhifgko\t165947\nweruo\t165948\n兴化\t165949\n起航\t165950\n耀州窑\t165951\n6月1号\t165952\n而异\t165953\n甜滋滋\t165954\n今晚\t165955\n塞住\t165956\n许文露\t165957\n11点\t165958\n水生火热\t165959\n兔子眼\t165960\n挖角\t165961\n进斗\t165962\n吉泽悠\t165963\n结构式\t165964\n骚扰电话\t165965\n喷撒\t165966\n帅\t165967\n昆马\t165968\n王国\t165969\n王图\t165970\n王困\t165971\n米迦尔优一郎\t165972\n居然之家\t165973\n317天\t165974\n←←\t165975\n黄你\t165976\n跑男三\t165977\n什么眠\t165978\n王四\t165979\nYYyoyO\t165980\n宝马X5\t165981\n宝马X6\t165982\nr4fssrdghf\t165983\n炖肉\t165984\n提公\t165985\n雷光\t165986\n复婚\t165987\n水榭\t165988\n我是大人啦啦我是大人\t165989\n麻木\t165990\n恶女\t165991\n听听见\t165992\n登出\t165993\n诺贝尔文学奖\t165994\n\t165995\nbesides\t165996\niirery\t165997\n移民局\t165998\n贝尔\t165999\n马可尼\t166000\n另一几岁\t166001\n戏谑\t166002\n东喜羊羊\t166003\nfffcffc\t166004\n六角姗姗\t166005\n阿拉蕾LG\t166006\n无聊度\t166007\n一秒后\t166008\n不是文帅\t166009\n密秘\t166010\n游彩红\t166011\nhttpchiphotosbaiducomxiaodupicitem0d338744ebf81a4c6b81d2bbd02a6059252da614jpg\t166012\nhybd\t166013\n97544444444444440\t166014\n帐\t166015\nMeeGo\t166016\n孟雅苓\t166017\n我去哪我去哪里我妈妈我去\t166018\n死家伙计\t166019\n摩夫\t166020\n嗯小宅\t166021\n麦奎因\t166022\n尹正中\t166023\n好自为\t166024\n疯了真的好\t166025\n2006年5月29日\t166026\n零工林工\t166027\n快马加鞭\t166028\n爱美\t166029\n包括\t166030\n细节\t166031\n了了了了了是\t166032\n龙得王国\t166033\n夫妻意\t166034\n一遍一遍一个\t166035\n嗯小家\t166036\n王玺\t166037\n二六四零\t166038\n迷信\t166039\n倪茜\t166040\n制服诱惑\t166041\n灰太勒\t166042\n一星\t166043\n1991年\t166044\nKurv\t166045\n王玥\t166046\n朝上\t166047\nboyfriend\t166048\n一块照\t166049\n一春\t166050\n我对你说我爱你爱你爱你\t166051\n追不知\t166052\ntasas4\t166053\n王玉\t166054\n68%\t166055\n王王\t166056\n云南六日游\t166057\npjatwmdjgat\t166058\n13952944568\t166059\n展威风\t166060\n不乖\t166061\n光天化日\t166062\n戴姆多德\t166063\n岂止\t166064\n50少\t166065\n敲响\t166066\n大数据\t166067\n555511\t166068\n教无类\t166069\n玫紅\t166070\n叫不记得\t166071\n抿嘴\t166072\n不义\t166073\n咬牙\t166074\n远远远越好\t166075\nwisuwuiififi\
t166076\n不乱\t166077\n宗旨\t166078\n大话呀日日撸我\t166079\n伍体投地\t166080\n辣翅\t166081\nabbbbbbbbbc\t166082\n刚说\t166083\n指点迷津\t166084\n嗯大曼\t166085\n疏而不漏\t166086\n用不起\t166087\n生化危机\t166088\n张星丽\t166089\n5076\t166090\n幸运之星\t166091\n何老师\t166092\n留诉\t166093\n688\t166094\nbutterfly\t166095\n酱碗\t166096\n11月21早晨\t166097\n孤帆园艺碧空季\t166098\n呵呵度\t166099\n统一册\t166100\nghhdhjfjjhviw\t166101\n渔政\t166102\n点心面\t166103\n伦敦银\t166104\n18只\t166105\n张思雨\t166106\n剪刀石\t166107\n胡佳子\t166108\n卷宗\t166109\n幼女\t166110\n专心致志\t166111\n九零元\t166112\n火疗\t166113\n汹汹群鬼\t166114\n家闲\t166115\n啊行\t166116\n杦女\t166117\n680\t166118\n家门\t166119\n欧阳少恭\t166120\n透气\t166121\n名堂\t166122\n无所事事\t166123\n该书\t166124\n冬春夏秋\t166125\n双眼皮画\t166126\n小猫咪你在干嘛小猫咪晚安\t166127\nwus\t166128\n么么哒亲爱哒\t166129\nititi\t166130\nbdidh\t166131\n很慢棒棒棒棒吧个吧个吧个吧个龙的的的的\t166132\n度秘我问你的你\t166133\n所学\t166134\n最新动你秘\t166135\n刘东强\t166136\n搜出来\t166137\n仁国\t166138\n赵静\t166139\n慢特别\t166140\n港方\t166141\nb7tt\t166142\n吞掉\t166143\n二毛钱\t166144\n要堂\t166145\n斗鸡眼\t166146\n我的愿望\t166147\ngoing\t166148\n杨猫咪\t166149\n不秘度秘\t166150\n走途无\t166151\n韦佳顺\t166152\n脱身\t166153\n黑溜溜\t166154\n扒灰\t166155\n明好\t166156\n张云涵\t166157\n4884\t166158\n累了\t166159\n退散\t166160\n換繁體\t166161\n3d赛车\t166162\nbilty\t166163\n神经病人\t166164\n侧颜\t166165\n桂城\t166166\n立比\t166167\n业绩\t166168\n2225665\t166169\n自由党\t166170\nQ讯家园\t166171\n特效药\t166172\nlancluse\t166173\n地下鞋\t166174\n累人\t166175\n吸盘\t166176\n零二八二\t166177\nwug\t166178\n限量\t166179\nwhere\t166180\nDdrd\t166181\n１５年\t166182\n信不想\t166183\n名额\t166184\nhanyuxiazai\t166185\n首映\t166186\n一天一\t166187\n等会儿等会儿等\t166188\n最做\t166189\nyoutstral\t166190\nwuk\t166191\n2:22\t166192\n撒尼玛\t166193\n幻觉了吗你好吗呢吗啡网上看你了吗老婆的时候到了\t166194\n一等一的好\t166195\n青史\t166196\n妞何炅\t166197\n禁止期\t166198\n1441775745787\t166199\n梅开\t166200\n识别码\t166201\n爸爸妈妈们\t166202\n012584285566555558\t166203\nzyix\t166204\n二百分钟\t166205\nokmc\t166206\n恶心啊假\t166207\n屁孩\t166208\n一天下\t166209\n这个样\t166210\n第23456\t166211\n白子花\t166212\n老狼请客\t166213\n苹果魅族\t166214\n幺零零五\t166215\n攘外必先安内\t166216\nk776\t166217\n万里长征人\t
166218\n不诺科\t166219\n537575457585857\t166220\n六月初\t166221\n佛界\t166222\n一天24小时\t166223\n唉元\t166224\n北宝强\t166225\n泽思\t166226\nmyengine\t166227\n4000亿元\t166228\n周小平\t166229\n详讯\t166230\n系距\t166231\n进退岂不绰\t166232\n阿贾克斯\t166233\n这么个\t166234\n易言\t166235\n白份\t166236\n张新哲\t166237\n呼呼侠\t166238\n一九\t166239\n听话吗亲一个\t166240\n直系亲属\t166241\n一乙\t166242\nDygz\t166243\n6月28日22时40分\t166244\n合欢功\t166245\n美度音\t166246\n王局长\t166247\n一乐\t166248\n红榜\t166249\n荷钮套章\t166250\n度秘聊\t166251\n一义\t166252\n一么\t166253\n丫ES\t166254\n九分之二\t166255\n闲累\t166256\n镔铁\t166257\n愣头青\t166258\n天美国队\t166259\n琅琅上口\t166260\n耳朵面\t166261\n你是狗熊唔再也不不要你了我再也不要\t166262\n罗星美\t166263\nlocally\t166264\n一乱\t166265\n活力\t166266\n地男\t166267\nxndgy\t166268\n束昱辉\t166269\n丫Es\t166270\n美女图\t166271\n13813364170\t166272\n年货季\t166273\n49天\t166274\n妙桃\t166275\n外省\t166276\n果珍\t166277\n咸蛋超人\t166278\n沙钢\t166279\n膨大剂\t166280\n四十零一\t166281\n5657\t166282\n魔兽说你是我的蜜友\t166283\n王召人\t166284\n小朋刚柔\t166285\n冷家村\t166286\n上港全州超市\t166287\nydvdhsgdhhs\t166288\n玩游戏\t166289\n我的世界助手\t166290\n多火火的小两万了我的希沃新娘我想你的话火火火火火\t166291\n哈林\t166292\n日本料理店\t166293\n5个小时\t166294\n一百四十三块\t166295\n北仑\t166296\n相等\t166297\n答错\t166298\n杨舒云\t166299\n那硪\t166300\n有车\t166301\n句集\t166302\n3dd301\t166303\n翔云顶\t166304\n嘲讽\t166305\n何方\t166306\n骚瑞骚瑞\t166307\n练号\t166308\n度秘度秘你在哪儿你在哪呼叫度秘\t166309\n九芳\t166310\n走越静\t166311\n罗杆菌\t166312\n新疆南部\t166313\n为什么呢\t166314\n九节\t166315\n李沁柔\t166316\nYorkinAmerica\t166317\n蒂诺嗨\t166318\n索契\t166319\n2794982707\t166320\n女不男\t166321\n福安宫\t166322\n索瑞莲\t166323\n讶异\t166324\n夜攸\t166325\n告急\t166326\n巧克力鲜奶\t166327\n喝掉\t166328\ncixife\t166329\n稀土储备制度\t166330\n膝伤\t166331\n3声\t166332\n恣意妄为\t166333\n非非\t166334\ndkgzdrjji\t166335\n张超\t166336\nFYEJ\t166337\nVIPv\t166338\n处女座们\t166339\n考前\t166340\n沪好男人\t166341\n24010694\t166342\n才和\t166343\n逗乐\t166344\n广园\t166345\n嗯卫士\t166346\n啼们儿\t166347\n如虹\t166348\n八百六十二一四\t166349\n水解\t166350\nLoki\t166351\n搙户\t166352\n寻梦\t166353\n鬼班长\t166354\n亚卡雷斯\t166355\ndufhc\t166356\nfhhhgvjh\t166357\n6块\t166358\n自理自理\t166359\n6753124679922\t1
66360\n妇女波\t166361\n烧制\t166362\n163kktv\t166363\n坏家伙\t166364\n乐吧\t166365\n死人块吧\t166366\n我乖\t166367\n舍摩\t166368\n芭比棍\t166369\n凯萨\t166370\n草莓疙瘩\t166371\n跳高\t166372\n只顾\t166373\n成功恐惧症\t166374\n烤烤\t166375\n痞子级\t166376\nvv个哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t166377\n号文\t166378\n关浩然\t166379\n莉娅\t166380\n符箓\t166381\n梁余恳\t166382\n阿克塞\t166383\n不我与妈妈\t166384\n张琼\t166385\n三鹿奶\t166386\nghojfn\t166387\n伪君子\t166388\n四轴纸\t166389\n7758238\t166390\n锤头鲨\t166391\n没善良过\t166392\n再见了背背\t166393\n离谱\t166394\n美娇娃\t166395\ndddrdyffsy\t166396\ngnng\t166397\n后勤\t166398\n芳色\t166399\nimll\t166400\n随宅\t166401\n腾鳌\t166402\n日本自卫队\t166403\n苍龙行\t166404\n恩yes\t166405\n边边\t166406\n外出没\t166407\n电饭锅\t166408\n就是说白了\t166409\njirier\t166410\n联队长\t166411\nm扫雷\t166412\n拼图类\t166413\n不是我不懂\t166414\n真可怜\t166415\n丑批\t166416\n黑大大黑大大\t166417\n臭二郎\t166418\n真可怕\t166419\n唉貌似京\t166420\na62\t166421\nvc片\t166422\n切管\t166423\n我么\t166424\n对呀我想嫁给你\t166425\nzwf\t166426\n迷迷瞪瞪\t166427\n爱上我不问\t166428\n大爷们\t166429\n轻歌曼舞\t166430\n故发\t166431\n超级圣诞老人\t166432\nkdjwifeqcjwigpwrnipvwhuofwnjochqruofeqndunkpdhqufuowfhuoqedhiqdhuoqfbfbqeudbwurbfuouofh\t166433\n寂寞人生爱无休\t166434\n再来一个再来一个再来一个再来一个再来\t166435\nnoou\t166436\n再分开\t166437\nnoop\t166438\nq聊\t166439\n嗯嗯嗯我哥谁小鱼儿\t166440\n真可性\t166441\n泰迪狗\t166442\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪嘟咪嘟咪\t166443\n海沧区委区政府\t166444\nDvhhk\t166445\n年轮\t166446\n耶就度\t166447\n7只\t166448\n长枝\t166449\npower\t166450\na6u\t166451\n人瓶\t166452\n7句\t166453\n戴小呜\t166454\n年轻\t166455\n大众点评网\t166456\n妇女\t166457\n7台\t166458\n45287\t166459\ntinn\t166460\n无聊类\t166461\n天儿\t166462\nn　n　妖n　n　
n\t166463\n大闹天宫\t166464\n备忘录\t166465\ntufyg\t166466\nPBBbbR\t166467\nggxbxbxgdykdkydhgzygjddjyxmddgksykdahdshykajfsfhfksgjhsgjjdgf\t166468\n二十一栋\t166469\n接手\t166470\n菜价\t166471\nochika\t166472\n王平安\t166473\n接打\t166474\n观点\t166475\ndhiphotosbaiducomxiaodupicitem0824ab18972bd4074e43b98e7c899e510fb30938jpg\t166476\n大撒\t166477\n红色的了秘\t166478\n原始社会杉生生世世\t166479\n援交\t166480\n资金链\t166481\n后会无期巴扎\t166482\n大大哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t166483\nlina贝\t166484\n安吉拉\t166485\n梁法初\t166486\n255566\t166487\n世园公园\t166488\n仙华\t166489\n咬紧牙关\t166490\naamiran\t166491\n星战你一说\t166492\n任盈盈\t166493\n158元\t166494\n流氓化\t166495\n信不信由你\t166496\n所以然\t166497\n放假了\t166498\n11月4日\t166499\n牡丹江\t166500\n完完\t166501\n麽了\t166502\n单着\t166503\n紫阳\t166504\ndishd\t166505\n一斤一百岁\t166506\n新生们\t166507\n15869853699\t166508\n反复复\t166509\notageo\t166510\nnujjh\t166511\n上百遍\t166512\n最高手\t166513\n永远永远永远永远不理\t166514\n时世\t166515\n苑凯旋\t166516\n南村\t166517\n打尿\t166518\n好兆头\t166519\n孔梦珂\t166520\n炼字\t166521\n约合\t166522\n两哦\t166523\n炻右業\t166524\n上班后\t166525\n法正\t166526\n朱瑞豪\t166527\n欣欣茵茵\t166528\n找转\t166529\n亲一眼\t166530\nicexcucoex\t166531\n爱贵\t166532\n气门\t166533\n周思博\t166534\n精武门那你加油鹦鹉\t166535\nsnhfota\t166536\n就这样子\t166537\n凛然\t166538\ngikbc\t166539\n南松\t166540\n聪哥\t166541\n630\t166542\n六年们\t166543\n一整夜\t166544\n橄榄球\t166545\n静安寺\t166546\n死心眼\t166547\n恩那快\t166548\n小美人鱼的故事\t166549\n小宁\t166550\n12892\t166551\n小报告\t166552\n余小燕\t166553\n634\t166554\n难得一致\t166555\n夏成宝\t166556\n亲爱的求求你了答应我\t166557\n情境主义\t166558\n舞会\t166559\n林立成\t166560\n单独\t166561\n来自恋\t166562\n15034358444\t166563\n找开\t166564\n神枪手\t166565\n推部\t166566\n袁雨萱\t166567\n1111\t166568\n1110\t166569\n1113\t166570\n朱熹\t166571\n1115\t166572\n先生日\t166573\n一整天\t166574\n2货\t166575\n你好科\t166576\n1118\t166577\n贪赃枉法\t166578\n2478\t166579\n机载科我数\t166580\n落难\t166581\n白羊女\t166582\n2意\t166583\n龙亏及毖Ａ分每变蚕免弗\t166584\n坐飞机\t166585\n软包\t166586\n统一站\t166587\n一个错觉\t166588\ncF0p\t166589\n万位\t166590\n沈阳市市\t166591\n佩定\t166592\n烟海\t166593\n假冒\t166594\n榆服务员\t166595\n阿苏\t166596\n软化\t166597\n胜过\t166598\n
肺热\t166599\n不不你\t166600\n汤勺\t166601\n多种\t166602\n蜀黍们\t166603\n阿英\t166604\n杜鹃鸟\t166605\n多秘\t166606\n份生\t166607\n王照涵\t166608\n瘸子\t166609\n9套\t166610\njaggggjp\t166611\nhwowhs\t166612\nconsider\t166613\n驧骄翐珊\t166614\n吾亚奇\t166615\n平身\t166616\nGgkfy\t166617\n白亲亲\t166618\n睡颜\t166619\n矮哈哈哈\t166620\n永远不见\t166621\n魔力女\t166622\n真的假的好好\t166623\n平躺\t166624\n酒药\t166625\n210本\t166626\n审慎\t166627\n张山山\t166628\n万佛\t166629\n心女\t166630\n错不了\t166631\n六九九幺\t166632\n猴王\t166633\n你的情人\t166634\n435664566\t166635\nB超\t166636\nsmile\t166637\n联合国艾滋病规划署\t166638\n兰陵王\t166639\n湖里区金尚中学\t166640\n南京德基广场\t166641\n修宪\t166642\n转点\t166643\n换码\t166644\n洋漾\t166645\n东门湾\t166646\n唯爱SJ13]110522希澈TW\t166647\n家家家\t166648\n空客\t166649\n克罗索\t166650\n黑杰克\t166651\n小宠\t166652\n安逸飞哥\t166653\n修容\t166654\n/标准车\t166655\n我的空间\t166656\n曹润林\t166657\n修完\t166658\n爱抚谢谢\t166659\nhhghj\t166660\n德芙\t166661\n两两两种\t166662\n批注\t166663\n6400瓶\t166664\n图里奇\t166665\n擦拉黑欧\t166666\nPConline\t166667\n曾露\t166668\n逃掉\t166669\n永强元给恩\t166670\n丝管\t166671\n钱长伟\t166672\n天门山\t166673\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋大坏蛋\t166674\n观察\t166675\ndhejrh\t166676\n憨比\t166677\nchxhchcg\t166678\n赚到\t166679\n三十多岁\t166680\n垂危\t166681\n王奔\t166682\n第十一名\t166683\n宇宙\t166684\n才酒\t166685\n蔫蔫蔫唧\t166686\n撸啊撸周\t166687\n有渠不通\t166688\n泰晤士河\t166689\n紫云草\t166690\noio，howareyou\t166691\n抱抱不要這样\t166692\n王好\t166693\n荷叶\t166694\n王女\t166695\n太不好笑了我要再不找你了拜拜\t166696\n应霖\t166697\n苏情桥\t166698\n饵门祖村\t166699\n可哥\t166700\n奔跑吧兄弟团\t166701\n劳动者\t166702\n真心感谢\t166703\nhttphhiphotosbaiducomxiaodupicitem810a19d8bc3eb135aca58acea11ea8d3fd1f4460jpg\t166704\n咯黄\t166705\n赵飞\t166706\n认不认识\t166707\n飘飘巾\t166708\n毛群安\t166709\n米易县\t166710\n末位\t166711\n9483\t166712\n奥飞\t166713\n慢东\t166714\n兰提斯\t166715\n臭袜\t166716\n乐么乐\t166717\n快跑吧\t166718\n讲一个更搞笑\t166719\n软格\t166720\n绿盒\t166721\n嗯多米\t166722\n侦收\t166723\n教坏\t166724\n十一＼十十＋\t166725\n梨汁\t166726\n久好\t166727\n世嘉\t166728\n洋芋菜\t166729\n兜售\t166730\n发自肺腑\t166731\n时时\t166732\n李大\t166733\n学以致用\t166734\nlwwwabcdefghjkmopqstovwxasyz猫ucxabcabcabc\t166735\n限流\t166736\n李太\t16
6737\n李夫\t166738\n李天\t166739\n我行\t166740\n时日\t166741\n快乐的回忆\t166742\n016年\t166743\nhjcc\t166744\n酸楚楚\t166745\n有紧\t166746\n有素\t166747\nhjcf\t166748\n了登登登登间儿科\t166749\n抹茶糖\t166750\nvtgby\t166751\nUUUUUUUUu\t166752\n高玉博\t166753\n眼烟\t166754\n李夏\t166755\n0211利特\t166756\n翠爷\t166757\naassesss\t166758\n黄叔叔\t166759\n82集\t166760\n最讨厌你了讨厌\t166761\n闲心\t166762\n设计图\t166763\n8316741268\t166764\ndtixigyiciy\t166765\nwddfcfcsgh\t166766\nablue\t166767\n演机会难得\t166768\n兴趣爱好\t166769\n达尼美\t166770\n跑鞋\t166771\ntaiduotou\t166772\n故事故\t166773\n含笑\t166774\n马克思\t166775\n对呀好\t166776\nxvvb\t166777\n锁上\t166778\nTiffany\t166779\n歧途\t166780\ndhiphotosbaiducomxiaodupicitem8c1001e93901213f83bc40a953e736d12f2e9533jpg\t166781\n相雷\t166782\n玩笑\t166783\n182662333333\t166784\n问过\t166785\nzdddf\t166786\n快银\t166787\n中国代表团\t166788\n算根\t166789\n过不过来\t166790\n切除\t166791\n第六个\t166792\n呢嗯\t166793\nF罩杯—Fake\t166794\nk神\t166795\n斯塔德迈尔\t166796\n常丽九世英媒\t166797\n撒花\t166798\n干旱\t166799\n弹堂\t166800\n延安路\t166801\n糖果色\t166802\nhhgshj\t166803\n百分百6\t166804\n逆龙纪\t166805\n零六\t166806\n零八\t166807\n姑苏\t166808\n寇松娟\t166809\n汤里汤\t166810\n北京佛教\t166811\nkhwj\t166812\n太离谱\t166813\n什么时过什么不\t166814\n哼茂\t166815\n会儿天行\t166816\n厚实\t166817\n钓鱼网站\t166818\n一百六百六十一\t166819\n声乐系\t166820\n舞台剧\t166821\n戴上\t166822\n拖了拖\t166823\n新课节\t166824\n庄子\t166825\n担夕\t166826\n王亚金\t166827\n女闻\t166828\n女闺\t166829\n我是在问你耶懂你\t166830\n宝贝娃娃\t166831\n赌片\t166832\n春舞\t166833\n趣语录\t166834\n心碎了真是\t166835\n中联盟\t166836\n耗子\t166837\n恩教\t166838\n赵珍凤\t166839\n重剑\t166840\n么多米\t166841\n垂足\t166842\n爱上\t166843\n呼二联通\t166844\n历年\t166845\n爱一\t166846\n爱丁\t166847\n這件事\t166848\n5月19日晚上7:30\t166849\n贤乡\t166850\n爱东\t166851\n男后\t166852\n几辆\t166853\n男同\t166854\n识趣\t166855\nfbk\t166856\n猜负\t166857\n怎麽\t166858\n猪猪猪你是猪猪猪你是猪猪猪猪猪猪\t166859\n过生日没\t166860\nwwwixixix\t166861\n拍拖讨厌\t166862\n学方学院\t166863\n爱丽\t166864\n5.7%\t166865\n玉米田\t166866\n心宽\t166867\n琉璃\t166868\n爱像一阵风吹yndatouto\t166869\n一座机\t166870\n脑门子\t166871\njjr\t166872\n吕思妍\t166873\n范成鹏\t166874\n鬼讨厌鬼\t166875\n聊说\t166876\n你好无聊说的话\t1668
77\n忧国忧民\t166878\n固件\t166879\n送一送\t166880\n12-15分钟\t166881\n杰希\t166882\nwhatis\t166883\n堪称\t166884\n高原区\t166885\nCfqqertyuiopaddghjlzxvnmcxxtgcddfGghgcddhcfgcubbh\t166886\n一个98天\t166887\n杨顺涵\t166888\n几点几米\t166889\n聊话\t166890\n中信金融控股\t166891\n猪鸡\t166892\n草体\t166893\n55858885888588888888888888088088888888\t166894\n单词儿\t166895\n2016150\t166896\njjj\t166897\n5875285373586\t166898\n洞密\t166899\n鲁拉\t166900\n圆顶形\t166901\n返京\t166902\nQIANG\t166903\n80692832\t166904\n散散心\t166905\n长春亚泰队\t166906\n给我老实说\t166907\n曾多次\t166908\njjf\t166909\n洞察\t166910\nTeam\t166911\nreally\t166912\njgjcd\t166913\nbimf\t166914\ntuoc\t166915\n推荐信\t166916\n五十种\t166917\nDior\t166918\n50亿\t166919\n坦坦荡荡\t166920\n今天中午13：00\t166921\n一几天前\t166922\n袄度秘\t166923\n汪老板\t166924\n1786687569882\t166925\n苦逼们\t166926\n相知\t166927\n小狗狗\t166928\n马耳他\t166929\n导论\t166930\nggffxf\t166931\n微攻略\t166932\n家苗\t166933\n宋懿斐\t166934\nnoshisha\t166935\n抱我回家\t166936\n陈艾熙\t166937\n妈妈侠\t166938\n忙碌\t166939\nTBS\t166940\n汉尼\t166941\n我的啊4度和你的啊四个字一个果\t166942\n排序\t166943\n喔喔师\t166944\n子非吾\t166945\n11.72\t166946\n天天一\t166947\n撒贝老\t166948\n杨洋萌\t166949\n虎蛋儿大明江湖\t166950\n两亿\t166951\n不要你了你会\t166952\n差劲\t166953\n老翟\t166954\n去稳\t166955\n兴修\t166956\n暴走漫画\t166957\n仔仔仔仔\t166958\n征帆\t166959\n科斯\t166960\n李思宁\t166961\njmktktdglmjmgkm\t166962\n萨摩耶\t166963\n潘玮柏\t166964\n饭桶\t166965\n但秘\t166966\n屄痒\t166967\n钢瓶\t166968\n兖州火车站\t166969\n完结\t166970\n李贺群\t166971\n嗯呜呜\t166972\n牙科\t166973\n钟锦才\t166974\n这个那\t166975\n电磁铁\t166976\nNICONICO\t166977\n单薄\t166978\n黄亚秋\t166979\n軍隊\t166980\n定睛\t166981\nXsrfvhhfs\t166982\n汗纸\t166983\n革命者\t166984\n80万年\t166985\n黄欣欣\t166986\n酸笋\t166987\n广州光复北路\t166988\n不是我去\t166989\n直流\t166990\n968249261\t166991\n本银姓杨名逗比\t166992\n一个41平米\t166993\n石俊杰\t166994\n熊出没咯女\t166995\n潇洒是萨\t166996\n喜欢你我爱你\t166997\n揍我还有\t166998\n拿瓦拿瓦拿瓦\t166999\nabitof\t167000\n歇着\t167001\n我是女的我叫贝贝\t167002\n凝弑殇\t167003\nvyewug\t167004\n机遇\t167005\n免不了\t167006\n秋哥哥\t167007\n十二月二\t167008\nghuhftg\t167009\n回答乱说\t167010\n什么地方\t167011\n钛合金\t167012\n叫转\t167013\n没在\t167014\n珠片\t
167015\n躯干\t167016\n谈谈心聊聊天\t167017\n农业区\t167018\n潘晨午\t167019\n称帝\t167020\n罗姐\t167021\n525961541084\t167022\n唉瓮个\t167023\nByeonc\t167024\n奥氮\t167025\n我是你的好一个在哪不受\t167026\njizzo\t167027\n姿势照\t167028\n纯棉\t167029\n四枚\t167030\n乱放\t167031\n阴餐\t167032\n四枝\t167033\n四果\t167034\n郭树清\t167035\nKorres\t167036\n多丽丝·莱辛\t167037\n枯树\t167038\n吉克隽逸\t167039\n很讨厌\t167040\n第二座\t167041\n烫伤\t167042\n萨默斯\t167043\n少寡\t167044\n不识字\t167045\nwishe\t167046\n奥待会聊\t167047\nm2n8\t167048\n挡位\t167049\n骚受\t167050\n宫中\t167051\nshccffcv\t167052\n第6名\t167053\n老实说这问你了\t167054\n争放\t167055\n唉汪\t167056\n718asd\t167057\n昆囧\t167058\n田三三\t167059\n30％\t167060\n那段街\t167061\n金宇星\t167062\n假爱好\t167063\n萌娃\t167064\n物是人非\t167065\n得到\t167066\n庭前花开花落\t167067\n3388\t167068\n得利\t167069\n乙类\t167070\n生意额\t167071\n黑管\t167072\n朱倩\t167073\n500000000\t167074\n硅谷\t167075\n邦华导\t167076\n14775\t167077\n对啊好久\t167078\n替人家\t167079\n统计图\t167080\n饭桌\t167081\n铠甲勇士捕将\t167082\n妮家\t167083\n美瑞克\t167084\n狗狗妇\t167085\n得分\t167086\n跟头果果\t167087\n装疯\t167088\n58887858855886555585855858888558888885885558885622442\t167089\n山山\t167090\n吻算\t167091\nFFGGGBHFYG\t167092\n杨鑫如\t167093\n哈7\t167094\nfhgfjy\t167095\n我家的好不好呀你给我讲数学\t167096\n北腔\t167097\n磨合\t167098\nhouleme\t167099\n旁听\t167100\n副会长\t167101\n八月二十七号\t167102\n话记\t167103\n科技感\t167104\n额日德\t167105\n零零是你自己说的\t167106\n黄包\t167107\n巴一五\t167108\n淮海工学院\t167109\n小青虫\t167110\n东海14号\t167111\n互选\t167112\n洋葱头腐乳\t167113\n9：33\t167114\n羞讨厌\t167115\n弥音\t167116\n咯公斤\t167117\n菲泰\t167118\n周能存\t167119\n段梦媛\t167120\n摘掉\t167121\n跟一聊\t167122\n切长\t167123\n任颖\t167124\ntoutritourtonouto\t167125\n恨死你不也\t167126\n流入\t167127\n518580329368\t167128\n哎米\t167129\n啦啦用\t167130\n第几名\t167131\n留声\t167132\n没有理\t167133\n度挽救\t167134\n莱昂纳多\t167135\n品牌形象\t167136\n呜呜度秘\t167137\n伤痕累累\t167138\n杂志的惊人预言\t167139\n流光\t167140\n赶时间\t167141\n反馈表\t167142\n嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t167143\n周露飞\t167144\n五天后\t167145\n噗噗湖人女仆泼盆澎拜\t167146\n早安午安晚安安安安安\t167147\n夏洛特烦恼我要卫校夏洛特烦恼\t167148\n好笑不好\t167149\n李四和\t167150\n选礼拜\t167151\n请愿\t167152\n收费员\t167153\n俺
好伐\t167154\n梳理\t167155\n接天莲叶无穷碧映日荷花别样红出自哪首诗\t167156\n改邪归正\t167157\n田源\t167158\n独白\t167159\n你好丑呀\t167160\n牌九钱\t167161\n徐欣怡\t167162\n2500平方米\t167163\n1.15亿\t167164\n秘那\t167165\n想吃惊\t167166\n哎呦我靠你家呵呵我\t167167\n洪波\t167168\n钱思诗\t167169\n55665566\t167170\n382亿\t167171\n许多岁\t167172\n8.12晚十点\t167173\n错位化\t167174\n株连九族\t167175\n残障\t167176\n1357925210\t167177\ntf1\t167178\n李照\t167179\n林奇\t167180\nmQqvAq\t167181\nchbsc\t167182\n亚布力\t167183\n开头儿\t167184\n阿普利\t167185\nhoggf\t167186\n奥巴马n\t167187\n因式\t167188\n走就走\t167189\n每分钟\t167190\n沪语\t167191\n古训\t167192\n代数式a2a3b5\t167193\n古记\t167194\n娶媳妇儿\t167195\n踏入\t167196\n陈圳浩\t167197\n李煜\t167198\n好顺城\t167199\n飞我想飞\t167200\n幼小吃\t167201\n度秘我爱你我喜欢你不求你\t167202\njdji\t167203\n20几张\t167204\n你流氓你算了你是啊丽棒棒糖王八猪\t167205\n王先生\t167206\n简单爱我想就这样牵着你的手不放开爱很简单iloveyou\t167207\nsetou\t167208\njdjd\t167209\njdje\t167210\n小赤司\t167211\n打办\t167212\ngfghdyhs\t167213\n朝廷者\t167214\np哄哄\t167215\n无欲则刚\t167216\nxiong\t167217\n秘战\t167218\n童润宇\t167219\nsjshtuc\t167220\n后遗\t167221\n打劫\t167222\n打动\t167223\n闲适\t167224\n真煞\t167225\n废旧\t167226\nmls\t167227\n行不我\t167228\n物价文\t167229\n加一条\t167230\ntfboyslu\t167231\n加油匠\t167232\n四星级\t167233\n关东地区\t167234\n谭杰西\t167235\n叶世杰\t167236\n什么叫做\t167237\n巡天\t167238\n全年\t167239\n流泪\t167240\n七九六嗯幺七\t167241\n徐庭春\t167242\n度秘我属你最\t167243\n神速\t167244\nfjcgvn\t167245\n加油包\t167246\n美好的旧时光\t167247\n神通\t167248\n抱着了\t167249\n纳岁\t167250\nfiberx\t167251\n令人费解\t167252\n木鱼\t167253\n问一下去\t167254\nFauxjunior\t167255\n流泉\t167256\n刚正不阿留将正气冲霄汉忧愁发愤著成信史\t167257\n胡钱\t167258\n嗯斗龙战士\t167259\n生日蛋糕\t167260\n2211445\t167261\n丽君儿\t167262\n猎人生如湿疣\t167263\n天文\t167264\n东少D\t167265\n太子菲\t167266\n帝虎#\t167267\n大一岁\t167268\n你的电话\t167269\n绫朮邶\t167270\n票额\t167271\n塔拉\t167272\n张钦威\t167273\n心无\t167274\n雷峰\t167275\nanxjjbaba\t167276\njiukk\t167277\n巫崖\t167278\n伤不起\t167279\n韩洁\t167280\n樊西英\t167281\n投产\t167282\n拜拜改天\t167283\n一一年后\t167284\n心旁\t167285\nkdkkfkd\t167286\n豆豆豆豆豆豆豆豆豆豆豆豆豆豆豆豆\t167287\n恍然大悟\t167288\n摩托罗拉\t167289\nqq名\t167290\n单杠\t167291\n1247656624461\t167292\n五百一十多分\t1
67293\n叶思诗\t167294\n拍死\t167295\n就是我的哈儿\t167296\n尝一尝\t167297\n离婚证\t167298\ntfnmljg\t167299\n天才之路\t167300\n车者\t167301\n二十章\t167302\n超级巨星\t167303\n两年多\t167304\n猪猪狗不如\t167305\n邓女\t167306\n小二百人\t167307\n你在哪我找你\t167308\n日志\t167309\n固化\t167310\n疳积\t167311\n3iS\t167312\n醒一醒\t167313\n最爱国\t167314\n听不得\t167315\nGhjkkhnm\t167316\n1092\t167317\nwbw\t167318\n你的夜\t167319\n1099\t167320\n体系\t167321\n卸桩\t167322\n窦含章\t167323\n徐意坤\t167324\n项链\t167325\n有利有弊\t167326\n7564664659\t167327\n159781191908\t167328\nQie\t167329\n朝圣公园\t167330\n隔三岔五\t167331\n七律长征\t167332\n么么么永远爱你表让你\t167333\n殷勤\t167334\n梁耀燮\t167335\niysng\t167336\n万能百事通\t167337\n来宾\t167338\n抱一捉\t167339\n8公里\t167340\n网友们\t167341\n罒_罒哈喽\t167342\n75期\t167343\nc瑞\t167344\n练声\t167345\n很自然\t167346\n接壤\t167347\n蒋委员长\t167348\n九九小游\t167349\n取暖\t167350\n开西区\t167351\n85折\t167352\n殷家祺\t167353\n7at\t167354\n真奥特\t167355\nwbf\t167356\n拒绝\t167357\n2352380163\t167358\nTATA\t167359\n简单版\t167360\n天冷贤贤\t167361\n木子\t167362\n保佑我爱的人\t167363\nHbnghcvcxxaxv\t167364\n王千禧\t167365\n水牛石\t167366\n刘什么\t167367\n曾经以为\t167368\nVin\t167369\n几幺\t167370\n去找到\t167371\n几年\t167372\n那一抹\t167373\n让位等\t167374\n几平\t167375\n守梦想\t167376\n九条命\t167377\n1路\t167378\n知我心\t167379\n韩倩\t167380\n玲珑\t167381\n来安\t167382\n一转身就变\t167383\nwgev\t167384\n人工颈椎间盘\t167385\n随你便吧\t167386\nerrrr\t167387\n多米多米\t167388\n看我妈妈我爱你我爱\t167389\n听所以说\t167390\nchimdy\t167391\n于欣瑶\t167392\ntwmj\t167393\n1867753348\t167394\nfvzbxjjc\t167395\n忙忙\t167396\n准绳\t167397\n几幅\t167398\n常成法师\t167399\n選擇\t167400\n相逢无\t167401\n123555565465445\t167402\nRJ1978\t167403\n2020年前\t167404\n破烂儿\t167405\n18倍\t167406\n电脑家\t167407\n漠视\t167408\n赵苑伶\t167409\n佳吉快递\t167410\n柏原崇\t167411\n打话\t167412\ntomeet\t167413\n甜酱\t167414\n两个女人物\t167415\nFFGDGDC\t167416\n合档期\t167417\n月几号\t167418\njGits\t167419\n有点\t167420\n发簪\t167421\n什幺\t167422\n打读\t167423\n经络\t167424\nktwibyl\t167425\n━━━约翰·肯尼迪\t167426\n星际冒险\t167427\n访华\t167428\n月光族\t167429\nfhisfvjdgqu\t167430\n甜酒\t167431\n二块\t167432\n百分之二十\t167433\n半半\t167434\n将爱\t167435\n二十三块\t167436\n文件\t16743
7\n小白杨\t167438\n温子翔\t167439\nSJM\t167440\nSJB\t167441\n利剑\t167442\n嗯沙\t167443\n牧羊\t167444\n关西\t167445\n湖姑\t167446\nDFGGG\t167447\nmmammy\t167448\n山摇\t167449\n文仔\t167450\n10余日内\t167451\n心塞尼\t167452\n枯妮\t167453\n卖逼\t167454\npizzain\t167455\n10条\t167456\n忐忑\t167457\n欧洲杯\t167458\n中华民国\t167459\n俺娘田小草\t167460\n高标准\t167461\ny托\t167462\n升压\t167463\n南瓜炖鸡\t167464\n影院\t167465\n高萌\t167466\n行哈\t167467\ndwwwwwwwwwww\t167468\n揭阳\t167469\n49场\t167470\n装载\t167471\n黄瑟天\t167472\ntarp\t167473\n阿尔卑斯棒棒糖\t167474\n剧场版\t167475\n虎鱼\t167476\n引桥\t167477\n哥科\t167478\n黄梦蝶\t167479\n毕恒晖\t167480\n安猴哥\t167481\nv凤飞飞\t167482\n于红蓓\t167483\n伊甸\t167484\n余杭\t167485\n幸福在上\t167486\n天天了盈\t167487\n知耻后勇\t167488\n搜搜历年红通\t167489\n我不要你了坏蛋再见\t167490\n援救\t167491\n韭菜鸡蛋馅\t167492\n很失望\t167493\n丨丨丨丨丨丶\t167494\n周泽飘\t167495\n水浒传\t167496\n拜厄\t167497\n两年级\t167498\n百事通\t167499\n笑秘\t167500\nhgjdjrjd\t167501\n鸡心\t167502\njmga\t167503\n钢丝\t167504\n手下留情\t167505\n安其拉\t167506\n波西米亚\t167507\n上点\t167508\n珠三角\t167509\n啥种\t167510\n逗呢\t167511\n开心一刻\t167512\n腰姿\t167513\n葛永银\t167514\ntokool\t167515\n倾夜\t167516\n把手架\t167517\n默默默默\t167518\n马萨卡\t167519\n张军林\t167520\n5303211997111700182\t167521\n3684704325\t167522\n逗呀\t167523\n诸候\t167524\n端庄\t167525\n杨振宁\t167526\n好唻好唻\t167527\n联通中心\t167528\nk301\t167529\n安之\t167530\n正襟危坐\t167531\n媛茜\t167532\n#淇淇成长记\t167533\n双双\t167534\n双发\t167535\n抱一\t167536\n六十六十一零\t167537\n甘草\t167538\n夜心\t167539\n有道理\t167540\nmarry\t167541\n洙赫\t167542\n战度秘\t167543\n奔驰SLS\t167544\n安乱\t167545\n抱个\t167546\n双叶\t167547\n12小时\t167548\n驷马难追\t167549\n发明家\t167550\n即将\t167551\n度秘你的妈妈叫什么呀你的爸爸\t167552\n右阁\t167553\n碍手碍脚\t167554\n不疯\t167555\n高欢\t167556\n高欣\t167557\n金融君\t167558\n毛撸\t167559\n叁分\t167560\n水银刀鱼\t167561\n峨眉耳蕨\t167562\n第八讲变倍\t167563\n酌情\t167564\n麻黄碱\t167565\n唐三老\t167566\n霉运期\t167567\n几十KB\t167568\n么啦\t167569\n拉斯维加斯\t167570\n高星宇\t167571\n学日语\t167572\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t167573\n分离\t167574\n王统哲\t167575\n国际上海队\t167576\ngfgggcfg\t167577\nBarclays\t167578\n中国人民政治协商会议章程\t167579\n一听平民动圈话筒SM48\t167580\n追捕令\t167581\n平治\t167582\nballe\
t167583\n想我們\t167584\n熊大熊\t167585\n过人我叫酸\t167586\n10355864676467676468595686864658686468676\t167587\n2月8号\t167588\n您还记得\t167589\n大笨猪\t167590\n徐5诶诶\t167591\njyso\t167592\n外婆桥\t167593\njysk\t167594\n上清\t167595\n真容\t167596\nrdt\t167597\n上游\t167598\nrdv\t167599\n财政部\t167600\n3月10日晚\t167601\n36千米每小时\t167602\n贾斯汀安娜伦巴\t167603\n卫生部放射卫生防护标准委员会\t167604\n度密\t167605\nrdd\t167606\n保忘\t167607\n求甚什么\t167608\nvogfuf\t167609\n独吞\t167610\n走路声\t167611\n度寐\t167612\n肛冂\t167613\nTtf\t167614\n小人味\t167615\nrdj\t167616\n萌包奴役\t167617\n森开森\t167618\n学霸王\t167619\n平邮\t167620\n国泰华旺宠物医院\t167621\n嘎嘎嘎嘎盖盖盖盖\t167622\n好胜心\t167623\n色登寺\t167624\n近海\t167625\n露西哈特菲利亚\t167626\n隔阂本\t167627\n红太狼大学\t167628\n零二幺九\t167629\n讨厌你好讨厌我好讨厌我的我讨厌你我更合\t167630\n舒梅\t167631\n家教机\t167632\n勿手\t167633\n仔仔\t167634\n王广允\t167635\n陈建荣\t167636\n一二两百\t167637\n汤唯玄彬\t167638\n业熙\t167639\n二年级\t167640\n梁永春\t167641\n沈雨涵\t167642\ncudt\t167643\n差额\t167644\n稚气感\t167645\n棗莊\t167646\n小风车\t167647\nom\t167648\nol\t167649\n班主儿\t167650\n小公菜\t167651\n供欲\t167652\n就是你的城市\t167653\n猪猪岛\t167654\n死流\t167655\n等你老了\t167656\n8878532592\t167657\nchiphotosbaiducomxiaodupicitem8601a18b87d6277f3c016a4c2f381f30e924fcb7jpg\t167658\n电位\t167659\n侠盗猎车\t167660\n死海\t167661\n塔里木练\t167662\n服腿\t167663\n俏脸\t167664\n什么果\t167665\n马凸凸\t167666\n苏州传化物流园\t167667\n东鹏洁具\t167668\n十二四千米\t167669\n严查\t167670\nThereis\t167671\n二十板\t167672\nSunny=傻泥；David=大肥；Sandra=栓猪儿；Candy=坑娣\t167673\n没辙\t167674\n扑嫁\t167675\ni货\t167676\n惨死\t167677\n不搞我\t167678\n踢人\t167679\n普世道德准则\t167680\n下题\t167681\n鹿柠\t167682\n别忘记\t167683\n撒娇\t167684\nqqrill\t167685\nD800\t167686\n天有天外有外堡逃跑有天yy哈\t167687\n烤火太冷\t167688\n正当老师\t167689\n狗不理\t167690\n下颏\t167691\n音浪\t167692\n狗牙\t167693\n校歌\t167694\n为其难\t167695\n６级\t167696\n张大大\t167697\n呀5孤孤\t167698\n闪人\t167699\n管用\t167700\n拆乱\t167701\n窗外\t167702\n吴颖欣\t167703\n小吃货\t167704\n北极洲\t167705\n照明灯\t167706\n闪亮\t167707\n低禿\t167708\n马月叉\t167709\n娘儿们\t167710\n唯亭湖\t167711\n同比\t167712\n葛坳\t167713\n白下区\t167714\n55555Klein\t167715\n禁术\t167716\n再见拜拜\t167717\n权益\t167718\n五六谭\t167719\ngcjsjc
jcvvvc\t167720\n政法委\t167721\n不知不你\t167722\n天地之间\t167723\n孔度秘\t167724\n糖葫芦糖葫芦糖葫芦\t167725\n卑鄙我真\t167726\n秋风\t167727\nApple\t167728\n天天酷跑我喜欢\t167729\njmtjajtemtpk\t167730\n8UC雨啊你是你是你\t167731\n三天方\t167732\n寂静\t167733\n联姻\t167734\n快线\t167735\n瑞星\t167736\n擀面\t167737\n嗯春晓\t167738\n凉拌金针菇\t167739\ntrulyoutl\t167740\n不奇\t167741\n梁子\t167742\n别走路\t167743\nKoomee动力救命救命六密度\t167744\n千五\t167745\n雪压青松\t167746\n冲关\t167747\n题儿\t167748\n黄土高坡大槐树\t167749\n熊出没你呢\t167750\n七千克\t167751\nhttpehiphotosbaiducomxiaodupicitemf636afc379310a552f19bc80b04543a982261080jpg\t167752\n黄大仙祠\t167753\n英艾\t167754\n昭然可见\t167755\nhttppinyincne17095\t167756\n救人\t167757\n半阳\t167758\n蘿蘿蘿蘿蘿蘿蘿蘿蘿\t167759\n千人\t167760\ncomobaby\t167761\n半阴\t167762\n罗克菊\t167763\n帅哥哥哥哥\t167764\nmeowodkjwjqkqlmq\t167765\n北雁超市\t167766\nfabble\t167767\n阿凡凡\t167768\n好哈拉\t167769\n黄庄\t167770\n嗯硕\t167771\n转换率\t167772\n卧底\t167773\n天天有喜网\t167774\n倚天堑\t167775\nsirzizir\t167776\nCUBE公司\t167777\n卧床\t167778\n素质稿\t167779\n2012年3月15日\t167780\n觉华\t167781\n从上到下\t167782\n小白萌\t167783\n人情味\t167784\n麻辣\t167785\n你好乖大度秘\t167786\n林某\t167787\n调奖\t167788\n蒲黄铠甲\t167789\n因缘\t167790\n沪剧\t167791\n他們\t167792\n植物大战僵尸之无尽\t167793\n风物\t167794\n脏腑\t167795\n冰糕\t167796\ndgfgfg\t167797\n冰糖\t167798\n40亿\t167799\njjmc\t167800\n订座\t167801\n损害\t167802\n一阳金\t167803\n黎海兰\t167804\n下厨\t167805\nNetwork\t167806\n领班\t167807\nconclt\t167808\n海图\t167809\n霸王元\t167810\n意味\t167811\n许多\t167812\n沪上\t167813\n乛乛卡\t167814\n下去\t167815\n别制\t167816\n龙江县头站乡乡委\t167817\n骂骂骂\t167818\n幺儿豁\t167819\n色一喽\t167820\n海囗\t167821\n额鹅\t167822\n疯了你\t167823\n别别\t167824\njjmj\t167825\n午餐\t167826\n恒源祥\t167827\n指纹录\t167828\n种业\t167829\n李易锋\t167830\n讽刺\t167831\n丟死\t167832\n吃凶\t167833\ncdefg\t167834\n剧作家\t167835\n长宁\t167836\n懂事人\t167837\n广电部\t167838\n茂章\t167839\n犯困\t167840\n巨U呼呼意\t167841\n熱狗\t167842\n代表我美了对吧美妞是美妞\t167843\n邓蔡\t167844\n得得得得\t167845\n烽烟\t167846\n生死存亡\t167847\n丫操\t167848\nm25\t167849\n沈福超\t167850\nJump\t167851\n说了拜拜抱歉\t167852\n真的错\t167853\n483246782\t167854\n这么把\t167855\n唉恶心\t167856\n华绍\t167857\n光头强光头强大坏蛋大坏
蛋\t167858\n马尔萨斯\t167859\n胡侃\t167860\nhellofromdarkside\t167861\n羨慕他們\t167862\n骷髅者\t167863\n九十九块\t167864\n一览无遗\t167865\n紧急状况\t167866\n车水马龙\t167867\n大气压\t167868\n天天爱消除\t167869\n李计谋\t167870\n69个\t167871\n焗油\t167872\n行自习\t167873\nhhghgyhdfhcyhhhgcf\t167874\n三十年前\t167875\n新飞\t167876\n度秘我爱你肉麻\t167877\ngfgv\t167878\n路数\t167879\n天么意思\t167880\n不好高骛远\t167881\n420页\t167882\n新风\t167883\n火星文\t167884\ngfgh\t167885\n一缕缕\t167886\ngfgf\t167887\nVODV\t167888\n15863468922\t167889\nYun\t167890\n说西\t167891\nHdhehjshd\t167892\n第499期\t167893\n冷死姐\t167894\n卞木\t167895\n卡秘\t167896\n好多事\t167897\n跑出来\t167898\n12336665542366\t167899\n非机动车\t167900\n井眼\t167901\n汶川地震\t167902\n不及了\t167903\n北街站\t167904\n兵王\t167905\n敢不\t167906\n社会福利\t167907\n呆喵\t167908\n背债\t167909\n扯走\t167910\n膝关节\t167911\n基本建设\t167912\n一快点\t167913\n小脸儿\t167914\n你天网\t167915\n22nk\t167916\n四肢\t167917\n好多人\t167918\n淘洗\t167919\n揭秘\t167920\n张新浪\t167921\n讲信心铭\t167922\n错码\t167923\n下侧\t167924\njug\t167925\n来我\t167926\n杜德平\t167927\n提点\t167928\n树林\t167929\nVeer\t167930\n提炼\t167931\n来战\t167932\n十二块\t167933\n二十多天\t167934\n接孩很丰富小乖乖听\t167935\n树枝\t167936\n判别式\t167937\n钟云\t167938\n我们都一样\t167939\n大凯撒\t167940\n浮夸\t167941\njuk\t167942\n哎呦我去我是宅女\t167943\n妹子\t167944\ndudddudys\t167945\n金钟真\t167946\n副总\t167947\n下午3点到11点\t167948\n在家扇风行鹿业\t167949\nP颠儿P颠儿臭显摆\t167950\n串烧\t167951\n元彬\t167952\n不堪言\t167953\n全国律协\t167954\nDOFEll\t167955\n欟酆屡\t167956\n润滑油箱\t167957\n流浪岁月\t167958\n早說\t167959\nckd\t167960\nckg\t167961\nckf\t167962\ncka\t167963\nbrkyve\t167964\nckc\t167965\n佟丽雅\t167966\n墨未哲山\t167967\nckl\t167968\n4月30日\t167969\n一双手\t167970\ncki\t167971\nckk\t167972\n死亡吧baby\t167973\n结疤\t167974\nckp\t167975\n享乐\t167976\n剑河县\t167977\ncky\t167978\nckx\t167979\n18X04\t167980\nga1b\t167981\n高家园\t167982\n牛市\t167983\n南极洲\t167984\n郭歌\t167985\ntupr\t167986\n制高点\t167987\n444444444444444444444444444444444444444444444444444444\t167988\n聊了拜拜拜拜\t167989\n二七日\t167990\n惠珍\t167991\n释伽摩尼佛\t167992\n预防职务犯罪\t167993\n备案\t167994\n交锋\t167995\n脚震\t167996\nRunningMan\t167997\n想和你再聊\t167998\n板鸭
\t167999\nandba\t168000\n就是气我了你也不哄我\t168001\n不见尾\t168002\n五点五点\t168003\n37沈\t168004\n芥末味\t168005\n多米嗯\t168006\n害怕\t168007\nGatsby\t168008\n加油男孩儿\t168009\n4S店\t168010\n阿猴\t168011\nshsyshz\t168012\n4000点\t168013\n新发\t168014\n6444\t168015\nhi度秘你是谁你是人\t168016\n颈椎病\t168017\n膩\t168018\n3131仨三一\t168019\nXXoo槳\t168020\n优库\t168021\n1384383838438\t168022\n盘头发\t168023\nGHHC\t168024\n罗子烨\t168025\nLEE\t168026\n新史\t168027\nTaerlibg\t168028\n新号\t168029\n猪猪侠五灵卫\t168030\n黄芪霜\t168031\nh\t168032\n老八老八\t168033\n郑海小学\t168034\nghfg\t168035\n容县\t168036\n惡劣勢\t168037\nMtime\t168038\nghfj\t168039\n序号\t168040\n平乐\t168041\nvthy\t168042\n百忧解\t168043\n礼物巴\t168044\n几千万块\t168045\n逼逼逼逼\t168046\n易威登秀\t168047\nfghhggggggv\t168048\nJJ弯\t168049\n忠君\t168050\n华伦\t168051\n椒图\t168052\n行政管理权\t168053\ndoeod\t168054\n徐熙媛\t168055\n安徽司法所\t168056\niphoneyoui\t168057\n空前大一统\t168058\n好的天生一会儿再说\t168059\n太平洋电脑网\t168060\n想办法\t168061\n五万\t168062\n陈后涛\t168063\n我最好\t168064\n洗脑\t168065\n老师母\t168066\n李乐儿\t168067\n无人能\t168068\n瞎说实话\t168069\n邱少云\t168070\n澄静\t168071\n一瓶\t168072\n洗脚\t168073\n很不要脸\t168074\n几双\t168075\n李若欣\t168076\n秋天\t168077\n嗖嗖\t168078\n多智能\t168079\n二分钟\t168080\n一瓦\t168081\nmaggo\t168082\n洗脱\t168083\n几号\t168084\n秋夜\t168085\n洗脸\t168086\n几台\t168087\n味着\t168088\n泱泱大国\t168089\n几只\t168090\n妊娠水肿\t168091\n2406348041\t168092\n语音\t168093\n谆谆谆谆谆谆谆吃\t168094\n用伦\t168095\neth\t168096\n几句\t168097\nforali\t168098\n赚信誉\t168099\n孙俪伦\t168100\n灬灬\t168101\n数一下\t168102\n卡在\t168103\n坚定不移\t168104\nrecatime\t168105\n陈丽怡\t168106\n苏丹红吊白块淋巴肉\t168107\n上年纪有没啥好事连\t168108\n好果\t168109\n复议\t168110\n王婷思\t168111\n游戏规则\t168112\n卡萨帝\t168113\n印度的牛\t168114\n手心里\t168115\n过硬\t168116\n12月5日\t168117\n一往无一\t168118\n医管局\t168119\n八嘎雅鹿\t168120\n120克千\t168121\n天灵盖\t168122\n顿珠\t168123\n乔治·默克\t168124\n五中\t168125\n系统化\t168126\n张丛\t168127\n重奖\t168128\n28462845432594596\t168129\n御膳\t168130\n任仕一\t168131\n网通社\t168132\n东明冬\t168133\n张三\t168134\n少哪里\t168135\n乡村老尸\t168136\n经济区\t168137\n05554777778655365665633563\t168138\n张一\t168139\njstijts\t168140\n评为\t168141\n美机器\t
168142\n502撒\t168143\n20159016\t168144\n965147568486\t168145\n张丽\t168146\n写封\t168147\n黑奥利\t168148\n讲一什么\t168149\n江静文\t168150\n铁塔\t168151\n动物园\t168152\n接单\t168153\n199110119911130\t168154\n大手儿\t168155\n幸而\t168156\n摆出\t168157\n楞严寺\t168158\n思密达思密达思密达思密达\t168159\n了讨厌的啦讨厌极了讨厌的了讨厌几讨厌的了讨厌极了讨厌极了条件及了\t168160\n二零二\t168161\n秦岭\t168162\n这个月底\t168163\n二百多\t168164\n银川市\t168165\n湖南车\t168166\n和你好\t168167\n联组\t168168\n中甲联赛\t168169\n白马王子\t168170\n雷累\t168171\n营业余\t168172\n陪赔\t168173\nuyhjhh\t168174\n未至\t168175\njj根霞\t168176\n色巴\t168177\n瓦鹿晗\t168178\n李浩卅\t168179\n崔最\t168180\n骗笑\t168181\n汨罗江\t168182\n腼腆酒店瞰蠒母爱威尼斯\t168183\n崽崽\t168184\n工作工资\t168185\n凄婉\t168186\n嘎嘎嘎嘎天\t168187\n拜拜秘\t168188\n堡护\t168189\n见效\t168190\n亲人们\t168191\n果光\t168192\n听说过手\t168193\n尾盘\t168194\n七次\t168195\n心弦\t168196\n北海\t168197\n农机\t168198\n大功\t168199\n柔寡\t168200\n过星战7\t168201\n好爱塔塔\t168202\n梓月\t168203\n大劫\t168204\n樊格阳\t168205\n大动\t168206\n化简\t168207\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t168208\n腾空\t168209\n凭什么说\t168210\nwhathappend\t168211\n张雨泽\t168212\n大势\t168213\n9889\t168214\n11234567890\t168215\n吴世勋\t168216\n一望无际\t168217\n叫丧\t168218\n克去王\t168219\nCql\t168220\ncssx\t168221\n给吧\t168222\n油炸食品\t168223\ngjhjojhhhij\t168224\n会当\t168225\n赞美付\t168226\n舞沒\t168227\n汉堡吧\t168228\ndrujh\t168229\n等不及\t168230\n10000000000000\t168231\n范er\t168232\n砍杀\t168233\nRTETSTETR\t168234\n海加尔\t168235\n拉拉啦啦\t168236\n工业园\t168237\n性别母\t168238\n咸湿\t168239\n24首\t168240\n高利率\t168241\n英格兰队\t168242\n100000多台\t168243\n汤伟林\t168244\n请到场\t168245\n271007na\t168246\nCunning\t168247\n狸\t168248\n又夜\t168249\n狽\t168250\n狼\t168251\n炒鸡汁\t168252\nNORAZO\t168253\nhhjyjghgxki\t168254\n狲\t168255\n武媚\t168256\n刘峻杭\t168257\n叶利钦\t168258\n南方周末\t168259\n爱思盔\t168260\n坚毅\t168261\n没来由\t168262\n独\t168263\nNUin\t168264\n可木有七\t168265\n狡\t168266\n狠\t168267\n阉人\t168268\n姚大屌\t168269\n南7\t168270\n狜\t168271\n为止萌\t168272\n狑\t168273\n狐\t168274\n适者生存\t168275\n优贝所\t168276\n123哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t168277\n狗\t168278\n2348万\t168279\n狈\t168280\n25585\t168281\n塞油内拉\t168282\n捉住\t168283\n左微倾\t168284\n电视界\t168285\n大谢谢\t
168286\n狅\t168287\n青海警官职业学院\t168288\n野钓网\t168289\n17名\t168290\nacvitz\t168291\n食堂\t168292\n兰8\t168293\n幂幂\t168294\n你在哪我来监理\t168295\n十来个\t168296\nJae\t168297\n嘻P\t168298\n龙牙\t168299\n印尼卢比\t168300\n铁黎虹\t168301\n四十四十五\t168302\n激励\t168303\n魔咪\t168304\n盗抢\t168305\n1780\t168306\nJfhjf\t168307\n多休\t168308\n国际贸易\t168309\n很久以后\t168310\n魔咒\t168311\n成品油\t168312\n折回\t168313\n十四个\t168314\n多伦\t168315\n聪聪聪\t168316\nVIP\t168317\n穿仔\t168318\n妙方\t168319\n两只小手\t168320\ndoes\t168321\n13011911566\t168322\n😞\t168323\n度秘度秘度秘\t168324\n九一千块\t168325\n前锋\t168326\n披露\t168327\n抢救\t168328\n蒸汽机\t168329\n哎呦呃\t168330\n姜美女\t168331\n堂姐\t168332\nwoldisa\t168333\n百脉\t168334\n好拉勾\t168335\n十四届\t168336\n十万颗\t168337\n闹闹\t168338\n库坦\t168339\n客客气气\t168340\n惠天\t168341\n孜然粉\t168342\n哎呦呦\t168343\n四五四千五百六十四八六千八百七十二多少\t168344\n国家队\t168345\n限售\t168346\n哎呦呵\t168347\n卢湾区\t168348\n三万块\t168349\n王荣\t168350\n发源\t168351\n毕业论文\t168352\n明天早上8点20分\t168353\n狐狸\t168354\nwqn15602388871\t168355\n我愿相信\t168356\n渖飞\t168357\nfdystsdrshg\t168358\n属合\t168359\n豆花\t168360\n狼人之夜\t168361\n一点一滴\t168362\n犬种\t168363\n张旭\t168364\n豆芽\t168365\n互为\t168366\n那你过早种工作吧\t168367\n小小人\t168368\n添加剂\t168369\n李航\t168370\n拍子\t168371\n三门峡\t168372\nICU\t168373\n笑红尘\t168374\nGlobeScan\t168375\n有生意\t168376\n9.3\t168377\n度粑粑\t168378\n黄种人\t168379\n小王子\t168380\n暗牧\t168381\ninterest\t168382\n空中客车\t168383\n谁那\t168384\n闪失\t168385\n豁然\t168386\n二了死\t168387\n明天晚上九点\t168388\n黑户\t168389\n4301部\t168390\n害怕麻烦\t168391\n会考虑\t168392\n高富关\t168393\n成弄\t168394\n茶吧\t168395\n误点\t168396\nppt嗯嗯\t168397\n万状\t168398\n每一颗\t168399\n2011.6.29\t168400\n15266942152\t168401\n讨厌猩\t168402\n选美\t168403\n黑我\t168404\n霞霞霞\t168405\n服不服从\t168406\n久石\t168407\n紫荆阁\t168408\n人工啊宾九\t168409\n老能\t168410\n二幺零\t168411\n袄子\t168412\n补仓\t168413\n曾哥\t168414\n吴杰\t168415\n阿不都\t168416\n秒射\t168417\n大界\t168418\n一生一心一意忠贞不渝\t168419\n搞基你可以\t168420\nHfghg\t168421\n英语趣\t168422\n斯大妈\t168423\n福神报\t168424\n拉会儿呱\t168425\n雪松\t168426\n推销员\t168427\n雪杨\t168428\n单品\t168429\n水帖\t168430\n45分\t168431\n华天延吉\t168432\n单行本\t168433\n雪条\t
168434\n高粱\t168435\nDAX\t168436\n大略\t168437\n东北松花江\t168438\n一个十岁\t168439\n袁湘琴\t168440\n摇滚摇\t168441\n鲜活\t168442\n有意思话\t168443\n陈昱昊\t168444\n不准笑\t168445\n褚\t168446\n麦动\t168447\n醉死\t168448\n别留\t168449\n真气\t168450\n麦加\t168451\n画匠们\t168452\n褊\t168453\n机天\t168454\n不管吗大布\t168455\n4399\t168456\n滑坡\t168457\n语文恩\t168458\n幼儿教育\t168459\nooooooooooooooooooo\t168460\n来呀来\t168461\n花溪西乡\t168462\n99.9%\t168463\n大额贷\t168464\n不做事\t168465\n吕晋芳\t168466\nxiwog\t168467\n褪\t168468\n下楼\t168469\n美其米雅\t168470\n141414141\t168471\n哪一样\t168472\n姓李叫\t168473\n皮尔斯\t168474\n自慰裤\t168475\n求转\t168476\n一片一片\t168477\n女权\t168478\n羽毛球拍\t168479\n亚桐\t168480\n景告\t168481\n400万欧元\t168482\n作业楼\t168483\n玄袍\t168484\ngj1a1b1c\t168485\nVera\t168486\n清花茶碗\t168487\n从少\t168488\n猪肉\t168489\n长青\t168490\nhopedoing\t168491\n手空空\t168492\n亚马逊\t168493\n方天戟\t168494\n544123555\t168495\n看脸\t168496\n沙坪坝\t168497\n从小\t168498\n12333333333333666666666666633333333335422355555555844645255442236547777777777777777775555666\t168499\n6496824642\t168500\n吴小呆\t168501\n度秘策\t168502\nhuahg\t168503\n上不上下不下\t168504\n子曰\t168505\n不至\t168506\n太热\t168507\n第一场\t168508\n度秘我讨厌\t168509\n3撒贝\t168510\n羊排骨\t168511\n小黄鸡\t168512\n我发现你好像少了我发现你好像\t168513\n执子之手，与子偕老\t168514\n跋扈\t168515\n6枚\t168516\n企划\t168517\n笑不好笑不好笑不好要不好笑不好笑不好笑不好笑不好笑不好笑不好笑\t168518\n找蘑\t168519\n直男公牛\t168520\n太烂\t168521\n高坡\t168522\n冰淇凌\t168523\n295\t168524\n表彰\t168525\n290\t168526\n古井\t168527\n首尔Dream\t168528\n芸豆酸菜汤\t168529\n移师\t168530\n快活\t168531\n口语达人\t168532\n眼鹰\t168533\nFtc\t168534\nvifimaa\t168535\n踢开\t168536\n伯信\t168537\n很大很久\t168538\n橡皮擦\t168539\ntrouboremarek\t168540\n大汉纸\t168541\n死他活\t168542\n没不够\t168543\n二十老子\t168544\n穆尔西\t168545\n核心\t168546\n配饰\t168547\n铁锐雯\t168548\nufjxjthi\t168549\n456555\t168550\n丽趣淘宝\t168551\n1柄\t168552\n着实\t168553\n胜者为王\t168554\n程春龙\t168555\n田间\t168556\nhuman\t168557\n1柏\t168558\n大国崛起\t168559\nfeffe\t168560\n范少杰\t168561\n67366377363636737\t168562\n欧巴嘞欧巴嘞\t168563\n囧rz囧rz\t168564\ng塔拉\t168565\n唐人街148号\t168566\n那千绿的爸爸妈妈\t168567\n秋词\t168568\n名字人\t168569\n校卡\t168570\
n4两\t168571\n弊塞\t168572\n转化\t168573\nPage\t168574\njhww\t168575\n好不叫\t168576\nWZASX恶蓄玉米3无TCTXDML律规杜瓦TCFDTC\t168577\n一六千\t168578\n铁打\t168579\n三四年\t168580\n转包\t168581\n猪肥\t168582\n情愫\t168583\n10800岁\t168584\n杨小\t168585\n断袖之僻\t168586\n倒闭\t168587\n烤肉\t168588\n被轮\t168589\n高密银\t168590\nIikdo\t168591\nhi芭比\t168592\n张家好\t168593\n系睇\t168594\n杨少\t168595\n69999999\t168596\n阿力\t168597\n颜子奇\t168598\n硬伤\t168599\n两粒\t168600\n十一十九\t168601\n袁帅杰\t168602\n我不懂你的心\t168603\n也想死\t168604\nfhdyys\t168605\n赞赞赞\t168606\n本工\t168607\n森林\t168608\n真不对劲\t168609\n28461805218184216505656258255615154\t168610\n客家话\t168611\n谢云飞\t168612\n白发苍苍\t168613\n钢琴块\t168614\n阅兵式\t168615\n逗停\t168616\n必得\t168617\n度秘你爱谁我爱你度秘\t168618\n笨那安\t168619\n嘿度秘\t168620\n碣石\t168621\nOnthey\t168622\n级法医学专业\t168623\n呢醉\t168624\n制动力\t168625\n8份\t168626\n338公里\t168627\nwork\t168628\nworl\t168629\n天海地\t168630\n8件\t168631\n李佳辰\t168632\nletITgo哇勒哇勒哇勒哇勒太保\t168633\n第一次接触\t168634\n别呼\t168635\n一百三十五百\t168636\n品道\t168637\n855646566454\t168638\n办丑\t168639\n乡发\t168640\n风道\t168641\n云雀\t168642\n约克夏\t168643\nXOXO\t168644\n赵仲勋\t168645\n彩云之南我心\t168646\n鬼来\t168647\nHgp\t168648\ntylllllo\t168649\n奔跑吧兄弟播\t168650\n中小学\t168651\n有座\t168652\n3亿\t168653\nHgc\t168654\n遒劲\t168655\n富贵险\t168656\n群狼\t168657\nwwa44wawacom\t168658\n寒秋\t168659\n有应\t168660\n六件套\t168661\n云雨\t168662\n眼光\t168663\n有序\t168664\n鼎贯\t168665\n小弗\t168666\n有床\t168667\n缀劳\t168668\nHgF\t168669\n云雾\t168670\n哈楼哈楼\t168671\n1月16号\t168672\nｌｏｓ\t168673\n排忧解难\t168674\n铠甲小学\t168675\n399399\t168676\n老农\t168677\n砖组\t168678\n蝴蝶兰\t168679\n胡明扬\t168680\n那辙\t168681\n港漫木根\t168682\n刘帅\t168683\nfskg\t168684\n王小娅\t168685\n田科\t168686\n康梦丽\t168687\n世乒赛\t168688\n握持\t168689\n甘珍\t168690\n去把妹\t168691\n做美\t168692\noooooooooooolloooooooo\t168693\n1001万个\t168694\n快点儿快发那你说呀\t168695\n老冯\t168696\n批发\t168697\n杜咪汗\t168698\noffice\t168699\n骂我了我不想说\t168700\n規劃\t168701\nsuput\t168702\ngdhdhfbdj\t168703\n小情人\t168704\nG8i\t168705\n北京市政府\t168706\nnhsejgt\t168707\n妤媣\t168708\n太凰了你你还我女神\t168709\n苹果大楼\t168710\n下旬\t168711\n69点\t168712\
n梅根\t168713\n尽早\t168714\n叫法\t168715\n面貌\t168716\n放纵\t168717\n稀里糊涂\t168718\n从来胜利\t168719\nV3\t168720\n霍小姐\t168721\n体重\t168722\n八路\t168723\n这几天天\t168724\n哎呀嘿\t168725\n只是\t168726\n喔累\t168727\n哎呀嘴\t168728\n很高调\t168729\n苏瑞\t168730\n哈里波特\t168731\n小弱\t168732\n孙梦圆\t168733\n各尽所能\t168734\n马栏山\t168735\nmoreacute\t168736\n1918年10月\t168737\n霜军\t168738\n8角\t168739\n副院长\t168740\n辣度\t168741\n拦路\t168742\n布鲁克斯\t168743\n蒋承睿\t168744\n王侃斌\t168745\n大家好我叫佳怡\t168746\n50美元\t168747\n如手足\t168748\n一个五块\t168749\n掷\t168750\n掴\t168751\n掳\t168752\n密切\t168753\n掰\t168754\n入主\t168755\n荷枪实弹\t168756\n明明啊明明\t168757\n掺\t168758\n启发性\t168759\n作则\t168760\n控\t168761\nbcgxdvygfhwvhebcyechdcysvchecjehveuchejnsjxhe\t168762\nVC\t168763\n叉烧酥\t168764\n探\t168765\n掠\t168766\n掬\t168767\n科尚\t168768\n措\t168769\n100万\t168770\n推\t168771\n节约\t168772\n母类\t168773\n排\t168774\n掐\t168775\n杨啸轩\t168776\nVG\t168777\n掛\t168778\n闫亚廷\t168779\n掘\t168780\nhhgffxcvhh\t168781\n掅\t168782\n100个\t168783\n掃\t168784\n掂\t168785\n头一个\t168786\n秦淮\t168787\n掏\t168788\n掌\t168789\n作别\t168790\n掉\t168791\n授\t168792\n痛快点\t168793\n改天我走\t168794\n王斤\t168795\n真人人\t168796\n232301199412012253\t168797\n溶化\t168798\n【探轶\t168799\n很喜欢型\t168800\n去找你吧\t168801\n145147138199\t168802\n恋情\t168803\n周倩倩\t168804\n肇事罪\t168805\n65464\t168806\n燕郊\t168807\n2275\t168808\nVP\t168809\n热血沸腾\t168810\n153497\t168811\n本科生\t168812\nMadonna#\t168813\n青藏花园\t168814\n王文\t168815\n250360\t168816\n13天\t168817\n尹恩惠\t168818\n13.5万新加坡元\t168819\n所里\t168820\nVS\t168821\n特首\t168822\n六畜满\t168823\n啪啪声\t168824\n大乐我要嫁给你\t168825\n20120521\t168826\n黄河小学\t168827\nfhcdd\t168828\n我爱王源我要一嫁给他\t168829\n红来\t168830\nAngalbaby\t168831\n无聊说点\t168832\n复善美\t168833\n8888888888888888886\t168834\n共产主义\t168835\nxike\t168836\n张牙舞爪\t168837\n朱明\t168838\n低成本\t168839\n真心好\t168840\n页火\t168841\n找主贴\t168842\n大倍率\t168843\n钟馗\t168844\nqingqing\t168845\n张高强\t168846\n自慰\t168847\n好宝儿\t168848\n边境署\t168849\n无德\t168850\ngchsgfgsggagsgdyhgdguddggddhdhhhdbdhbvbhdjbdbhbfdnn\t168851\n袁蓉蓉\t168852\n牛肉汁\t168853\n张智哓\t168854\n创作者\t168855\nV
d\t168856\n二十八宿\t168857\n田慧云\t168858\n双手腕\t168859\n人寿\t168860\n外向型\t168861\n烦调命\t168862\n预感\t168863\n很害怕\t168864\n早安\t168865\n功用\t168866\n96321478965456\t168867\n几515几6\t168868\n桃花雪\t168869\n桃花雨\t168870\n黄子珊\t168871\n王韻禎\t168872\n股份素\t168873\n没房没\t168874\n爱的想爱\t168875\n俩坨\t168876\n差白\t168877\nlulu\t168878\n对不起顿\t168879\nMDbj\t168880\n阿扎姆\t168881\n90％\t168882\n80%\t168883\nfaedrw\t168884\n坂井真纪\t168885\n尹施允\t168886\n越多\t168887\n谢了思密达\t168888\nggfhv个\t168889\n字格\t168890\n千年\t168891\n字样\t168892\n马草\t168893\n独家劲那\t168894\njjgdss\t168895\n22周\t168896\n考量\t168897\n就是我不\t168898\n想想吃\t168899\n案源\t168900\n卸露\t168901\n茅穿\t168902\n浮点\t168903\n林菲\t168904\n烦唉\t168905\n一望无垠\t168906\n13836545422\t168907\nRUBBiSh\t168908\n零斤\t168909\n25542589\t168910\n666666666\t168911\n小森\t168912\n周八\t168913\n周全\t168914\n陪领\t168915\n雏菊类\t168916\n帮助\t168917\n周公\t168918\n周六\t168919\n赛后\t168920\n苏淑霞\t168921\n转向灯\t168922\n456456456\t168923\nyuiu\t168924\n凯治\t168925\n小棒\t168926\n热乎\t168927\n红杏\t168928\n新快报\t168929\n住院上疗\t168930\n小棍\t168931\n听一下\t168932\n强震\t168933\n小棉\t168934\n七斤\t168935\n吴寒旭\t168936\n周克\t168937\n有你是猪婆\t168938\n周免\t168939\nyei\t168940\nisoley\t168941\n输入\t168942\n朱佳庆\t168943\nhrgnf\t168944\n以为你\t168945\n上膛\t168946\n果导片\t168947\n建筑工程技术专业\t168948\n5596795\t168949\n云南云南云南云南玉达玉达\t168950\n14寸\t168951\n盼头\t168952\n写给年轻妈妈\t168953\n林琨毅\t168954\n酷米\t168955\n毕可云\t168956\n最萌\t168957\n输光\t168958\n13835078281\t168959\n核爆核爆\t168960\n仕思迄\t168961\n乜美\t168962\n嗯噔\t168963\n十二指肠\t168964\n星爷\t168965\n罚\t168966\n生日版\t168967\n国际上\t168968\n我想你说你好爱好爱我\t168969\n罓\t168970\n罒\t168971\n网\t168972\n罐\t168973\n罗\t168974\n申辩\t168975\n罕\t168976\n零点研究咨询集团\t168977\n4658588565853535889\t168978\n申达\t168979\n3141454\t168980\n终身\t168981\n小小年纪嘿不吃不是爱对呗天天的相对\t168982\n凯锐\t168983\n罼\t168984\nyyyyyytyty\t168985\nLS-6\t168986\n罷\t168987\n上图\t168988\n罵\t168989\n装丫\t168990\n好勺\t168991\n罩\t168992\n皮峰\t168993\n置\t168994\n罢\t168995\n罡\t168996\nCK9\t168997\n脑残弱\t168998\n十面\t168999\n怒容\t169000\n色香味形\t169001\n9dm\t169002\n中金公司\t169003\n孤单单
\t169004\n莲子粥\t169005\n狗主\t169006\n老汤\t169007\n社交网络The\t169008\n5月24日2点30分\t169009\nyes\t169010\n老汪\t169011\n永兴镇\t169012\n半年十一十一十一十一\t169013\n嵌合\t169014\n端量\t169015\n加油顶\t169016\n独立利\t169017\n锦囊\t169018\n电视人\t169019\n#轩辕剑#\t169020\n准儿\t169021\nhebt\t169022\n結給\t169023\nhellokiki花\t169024\n六月九日\t169025\nIdont\t169026\n炸锅\t169027\n金鳞\t169028\nhebf\t169029\nhebe\t169030\nhebd\t169031\n73289\t169032\n我是咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t169033\n刻错\t169034\n一哪一次\t169035\n哈咯\t169036\n艺卓\t169037\n金梦璐\t169038\n机机机\t169039\n车工\t169040\n张慧峰\t169041\n化成\t169042\n赠书\t169043\n7000万\t169044\n啦啦啦啦啦啦我好痛苦\t169045\n形形色\t169046\nill\t169047\nilk\t169048\n新高分儿\t169049\nilv\t169050\n诡异\t169051\n请郎\t169052\n余额盈\t169053\n对呀我真\t169054\n５#\t169055\n巫术\t169056\n突生\t169057\n波杏\t169058\nTORRES\t169059\n日本救援队\t169060\n五撸\t169061\n了不懂\t169062\n脱单\t169063\n囚鸟\t169064\n祭雪\t169065\n队员行\t169066\n写话\t169067\n鸡蛋我的妈妈\t169068\n74100\t169069\n320421197509153828\t169070\n临海\t169071\n写诗\t169072\n绿城美丽园\t169073\n我是你的尊\t169074\n拜鱼鱼TV天v天\t169075\n伟雄\t169076\n南华春堂\t169077\n便便\t169078\n163com\t169079\n昌逆\t169080\n菲政府军\t169081\n百年\t169082\n写说\t169083\nchviohgh\t169084\n47魏\t169085\n课李\t169086\n情真意切\t169087\n18755110228\t169088\n冷酷男\t169089\ngufyc\t169090\n走无高\t169091\n壬录\t169092\n遮不住\t169093\n那个太子妃升职记\t169094\n捉奸\t169095\n傻子气\t169096\n周吗春\t169097\n怪相\t169098\n谋害\t169099\n八点十分\t169100\n巴达雷\t169101\n卫生院\t169102\n秘武\t169103\nsave\t169104\n汽球\t169105\n笑尿\t169106\n四十一枝\t169107\n喷气\t169108\n说说说说说说说说说说说说说说说\t169109\n吴庆宇\t169110\n赤西仁龟梨\t169111\n天界\t169112\n婚嫁\t169113\n2361次\t169114\n变真是\t169115\n坐坐\t169116\n小龙人\t169117\n王思懿奥\t169118\n逻辑\t169119\n豪车\t169120\n为尽早个小姐\t169121\n美肖萌\t169122\n逢场作戏\t169123\n赵怀民\t169124\ntotoo\t169125\n管理家\t169126\n冼一男\t169127\n空荡荡\t169128\n人生的旅行存褶\t169129\n藤冲\t169130\n易昕\t169131\n精勉强凑合\t169132\nFgfrgj\t169133\n全面\t169134\n淡淡的大明月在说系\t169135\n太阴\t169136\n入学\t169137\n太阳\t169138\nBjnkbbn\t169139\n2月7号晚上\t169140\n免提\t169141\n别多想\t169142\n乌合之众\t169143\n懒了\t169144\n页境\t169145\n朱崃玮\t169146\n喘口气\t169147\n懒人\t169148\n動漫\t16
9149\n刺妖\t169150\n3221\t169151\n笔款\t169152\n产学研\t169153\n九分之\t169154\njjjjjgg\t169155\nJxuxud\t169156\n嘶吼\t169157\n七月一起\t169158\nPK挑战者联盟\t169159\n周礼\t169160\n美棠\t169161\n抽酒\t169162\n如同\t169163\n长高\t169164\n话说错\t169165\n白雪公主\t169166\nfindly\t169167\n如吃\t169168\nWIUSH\t169169\n那片海\t169170\n罗马赛\t169171\n团花\t169172\n看见了却\t169173\n辽师大\t169174\n不要脸丑八怪\t169175\n俊凯切\t169176\n纵淫无道\t169177\n拉贝贝\t169178\n陈建\t169179\n度度度\t169180\nGay\t169181\n夹塞\t169182\n比伯伯伯\t169183\n圣奥\t169184\nGao\t169185\n圆滑\t169186\n圣女\t169187\n从动\t169188\n徐阳\t169189\n山山相环\t169190\n胆结石\t169191\n别个跑\t169192\n钻石男\t169193\n妮话\t169194\n谢谢侬\t169195\n弹钱\t169196\n9.2萬\t169197\n钻石画\t169198\n混搭感\t169199\n绿色\t169200\n扬州台\t169201\n妮词\t169202\n发出战\t169203\n琅琊\t169204\n嘛不你\t169205\n群说话\t169206\n｀皿\t169207\nfajvkjrydo\t169208\n黄泗皓\t169209\n那后天\t169210\n八岁九岁\t169211\n菜花园丁忧虑\t169212\ntheyre\t169213\n煲粥\t169214\nfiygj\t169215\n女同明晚\t169216\n白百代\t169217\n秩序\t169218\nuueueur\t169219\n界线\t169220\n那angelababy\t169221\n啊度\t169222\n大姨母\t169223\n晕不好听\t169224\n小可爱行\t169225\n两万30255\t169226\n唐新月\t169227\n口误\t169228\n口语\t169229\nneve\t169230\n朱嘉乐\t169231\n导弹\t169232\n白天本也法海\t169233\n口说\t169234\n口诺\t169235\n挑起\t169236\nhxijtlgs\t169237\n太贫\t169238\n开了得\t169239\n口诀\t169240\n8702\t169241\nighch\t169242\nsgkk\t169243\n于五代寻阳公主\t169244\n有得挥\t169245\n口译\t169246\nfjgjn\t169247\n口试\t169248\n擦身而过\t169249\n义愤\t169250\n举棋不定\t169251\n导引\t169252\nｔｆｂｏｙ\t169253\n耿直\t169254\nemmitashimaom\t169255\n3000块\t169256\n曾小贤\t169257\n林云\t169258\n呵想\t169259\n于景怡\t169260\n得知\t169261\n误入房\t169262\n都市堂食\t169263\nfbfjdbfjfi\t169264\n有点剑走偏锋\t169265\n自理\t169266\n样品\t169267\n沙尔克04\t169268\n没意\t169269\nyrfugixicgohhoc\t169270\nvisa\t169271\n曾小贞\t169272\n腊肠嘴\t169273\n女爱好男\t169274\n小波波\t169275\n林人\t169276\n1122120484755545825\t169277\n炫舞时代在哪里\t169278\nxyp\t169279\n出\t169280\n阿弟\t169281\n弄孔\t169282\n巨鱼\t169283\nrrjrhrhhr\t169284\n函\t169285\nmaxia\t169286\nxyx\t169287\nxyy\t169288\nxyz\t169289\n凰\t169290\n大眼睛\t169291\n凵\t169292\n从事\t169293\n凯\t169294\n工程队\t169295\n凭\t16
9296\n像约\t169297\n凢\t169298\n凡\t169299\n几\t169300\n医者\t169301\ndrtddwrtf\t169302\n凤\t169303\n兒示範\t169304\n牛奶糖\t169305\n隐峰\t169306\nftigc\t169307\nheheh\t169308\n凝\t169309\n阿强\t169310\n凑\t169311\n十一月份\t169312\n撒播\t169313\n玻璃心\t169314\n减\t169315\n凌\t169316\nQ群\t169317\n樊天晴\t169318\n净\t169319\n准\t169320\nxy7\t169321\n鸡冻\t169322\n微博号\t169323\n好不好女\t169324\nThugbought\t169325\n美牛\t169326\n网游文\t169327\n尽全力\t169328\n矫揉造作\t169329\n清洗\t169330\n棍天使\t169331\n颠沛旅途\t169332\n清洁\t169333\n15542424246454\t169334\n烙饼\t169335\n邱海如\t169336\n游戏厅\t169337\n也是不是\t169338\n巴拉小魔仙\t169339\n哈酷\t169340\n终极新年\t169341\n黑板报\t169342\n超级课程表\t169343\n人的手\t169344\n好想和你说家乡话\t169345\n欲钱\t169346\n更重要\t169347\n王正阳\t169348\n三十六块\t169349\n你你你我讨厌你了你\t169350\n艳姆苑\t169351\n宏年\t169352\n我心里边\t169353\n感同身受\t169354\n李璐珊\t169355\ncckjj\t169356\n哈酒\t169357\n炊事员\t169358\n给我我叫\t169359\n一百多号\t169360\n家巧\t169361\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn哈哈nnnnnnnnnnnnnnnnnn呵呵nnnnnnnnnnnnnnnnnnnnnnnnn嘻嘻nnnnnnnnnnnnnnnnnnnnn不说了nnnnnnnnnnnnnnnn听nnnnn嗯\t169362\n会不是\t169363\n雷阵雨\t169364\n鬼棺\t169365\n一羊羊\t169366\nPenlin\t169367\nabc式词\t169368\nthfgbxgn\t169369\n联谊会\t169370\n莫利亚\t169371\n热水\t169372\n传道\t169373\n嗯一\t169374\n立隍\t169375\n瞿怎强\t169376\n娇丽\t169377\n保育员\t169378\n还没有事哪你是小蜜\t169379\n郑斌\t169380\n热气\t169381\n贝贝美琪\t169382\n我是爱你微你\t169383\n我知道你谁啊你是我家度秘\t169384\n青苹果\t169385\n赛站\t169386\n还我赫兔贤兔魂\t169387\n别太多\t169388\n沒空\t169389\n插曲\t169390\nsjcmmdkd\t169391\n夏木帅\t169392\n另起炉灶\t169393\n通达\t169394\n胡桐\t169395\n湖南人\t169396\nhrie\t169397\nIthingsotoo\t169398\n墨客\t169399\ndance\t169400\n王科云\t169401\n最最最\t169402\n一月二十二号\t169403\n小刘\t169404\n小刚\t169405\n娃艳\t169406\n黛宁\t169407\n墨家\t169408\n独栋花园\t169409\n就不就不就不就不就不就不就不就不行\t169410\n小刀\t169411\n睇咸片唔\t169412\n安全第一\t169413\n忙东忙西\t169414\n直呼\t169415\n气手\t169416\n北京现代\t169417\n徐名中帅\t169418\n小到\t169419\n激情吗人人网\t169420\n别惹我你\t169421\n迂回\t169422\n郭淑婷\t169423\n墨守\t169424\n放过自己\t169425\n公模\t169426\n消失的爱人\t169427\n小判\t169428\nTFB0SY\t169429\n再见我走了拜拜我不可爱我不和我不和你\t169430\n崩跑吧兄弟\t169431\n核素\t169432\n小利\t169433
\n梁俊康\t169434\n同道人何必为难同道人\t169435\n爱不爱吃\t169436\n世勋\t169437\n乖吃糖\t169438\n寻龙节\t169439\n倾向于\t169440\nloli维尼饭\t169441\n红步兵\t169442\n北地区\t169443\n不欺负\t169444\n木本糖源\t169445\n交叉段\t169446\n参加者\t169447\n新白杯\t169448\n轻下\t169449\n在囧途\t169450\n好累\t169451\n斯蒂文\t169452\n标致哈\t169453\n笑一个人哭\t169454\n礼县\t169455\nGeorge\t169456\n迷香\t169457\n光舞\t169458\nhtf\t169459\nhtd\t169460\nhtc\t169461\n洗\t169462\nhto\t169463\n是不是我可以\t169464\n洛\t169465\nhtl\t169466\n摔到\t169467\n雷骐鸣\t169468\n洞\t169469\n洁\t169470\n米列\t169471\nhtu\t169472\nhtt\t169473\n作罢鸟\t169474\n李克勤\t169475\n洋\t169476\n4k4k\t169477\n薪酬福利报告\t169478\nhty\t169479\n薪资\t169480\n洱\t169481\n摸丁\t169482\n洲\t169483\n盘算\t169484\n腌料\t169485\n古诗文\t169486\n你的爱人\t169487\n行人\t169488\n洼\t169489\nhhhhh1\t169490\n1点30分\t169491\n美女少女\t169492\n证券时报\t169493\n津\t169494\n你好密度\t169495\n洨\t169496\n洪\t169497\n唉而来\t169498\n米券\t169499\n和萨\t169500\n过街\t169501\n34米\t169502\n滚开我讨厌\t169503\n尹聿臻\t169504\n小帽子\t169505\n迷度\t169506\n天津市机电工艺学院\t169507\nBlade\t169508\n手寸\t169509\n社会经济\t169510\n宝玉\t169511\n言论管理\t169512\n不要你再给我说狗了我讨厌你\t169513\n子女孩子\t169514\n西六环\t169515\nbuco\t169516\n棉浦\t169517\nhhhhhu\t169518\n四大片\t169519\n上今天\t169520\nhhhhhh\t169521\nxkdj\t169522\nDear\t169523\n巡游\t169524\n风流狂\t169525\n副卡\t169526\n盗窃案\t169527\nofcouse\t169528\n你在我的眼里你是我最好的机器人\t169529\n田喜顺\t169530\na199\t169531\nhaomimi\t169532\natrix\t169533\n5码\t169534\n黑暗小平火鹰旋天边\t169535\n凌珞\t169536\nDouma镇\t169537\n都行\t169538\n山西偏关县伪警备队\t169539\nCtrlD\t169540\n寄首歌订\t169541\n区里\t169542\n偷笑][偷笑][偷笑\t169543\n22222052222\t169544\n公费\t169545\njbjb\t169546\n翁思忆\t169547\n红包\t169548\n伊苏\t169549\n无性物种\t169550\n董玉琢\t169551\n场兴冲冲\t169552\n秘c秘\t169553\n竹林\t169554\n小宠哥\t169555\n帅级\t169556\n理你了行不行\t169557\n乡会\t169558\n食肉鸟\t169559\n佑荣\t169560\n赵佶\t169561\n拆洗\t169562\n酱油吧\t169563\n姑奶奶\t169564\n跑龙套\t169565\n星星参\t169566\n500年\t169567\n你是我的小秘书度秘秘书号\t169568\n十八位\t169569\nglz\t169570\n色子\t169571\n肥龙\t169572\nKBS电视剧成均馆\t169573\n胡潇灵\t169574\n俩分之之\t169575\n通道\t169576\n85jpg\t169577\n其原因\t169578\n西皮\t169579\n梁桂\t169580\n李恒
撒\t169581\njjjjx\t169582\n杜丹秀\t169583\n该所\t169584\ntjnm\t169585\n888888888868888888888\t169586\njjjji\t169587\n18ill\t169588\njjjjj\t169589\nhjsud\t169590\njjjjf\t169591\n母体\t169592\n竞得\t169593\n电话姐姐\t169594\npqncmzm\t169595\n想出轨\t169596\n管同济\t169597\n中枢\t169598\n广发华福会\t169599\n上来回\t169600\n聊会儿天儿吧\t169601\n金亨俊\t169602\noqhh\t169603\n超级k侠\t169604\n耗电\t169605\n慎重\t169606\n传感冒\t169607\n12659487321\t169608\n体会\t169609\n斋戒\t169610\n梦三\t169611\n冷酷无情\t169612\n停加藤\t169613\n温欣欣\t169614\n棒槌度秘\t169615\n林卓玺\t169616\n吟唱\t169617\n梦丹\t169618\n不是我的很好\t169619\n梦丽\t169620\nlt五一\t169621\n放假装\t169622\n市区内\t169623\n忘记了我不知道\t169624\n梦中\t169625\n噔噔\t169626\n王见王\t169627\n250万件\t169628\n米月传\t169629\n兴国县火车站\t169630\n60块\t169631\n刘盛武\t169632\n夏骏\t169633\n文化日报\t169634\n灰太狼学校哩先得一咯钱白看申电影扒开淑妃衣服小游灰太狼\t169635\n蔡锐泽\t169636\n18229952204\t169637\nejfh\t169638\n267页\t169639\n好加童m3\t169640\n婷雪婷\t169641\n珀利\t169642\n孙妍\t169643\n迫在眉睫\t169644\n狮子群居\t169645\n5月\t169646\n一卡通\t169647\n小坏蛋小坏蛋\t169648\n等一下\t169649\n吹灯寻龙诀\t169650\nAV4377\t169651\n高利率贷款\t169652\n三五八七九两千九百四十四\t169653\nkryrdryryksekt\t169654\n嗯oor\t169655\nmwtjmqjul\t169656\n最后时刻\t169657\n赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫赫\t169658\n碧溪\t169659\nusa\t169660\nWiFi信号\t169661\n真是辛苦你了不过你\t169662\n多吃\t169663\n畸形\t169664\n平铺\t169665\n荒村\t169666\n通明\t169667\n延缓\t169668\n整装待发\t169669\nredtu\t169670\n再来更\t169671\n伤来\t169672\n腐烂\t169673\n南京二区\t169674\n格勒\t169675\n国家统计局\t169676\n歌哥\t169677\n闫少华\t169678\n听迷\t169679\n明发\t169680\n燃燃\t169681\n五百千克\t169682\nwhdf\t169683\n实政\t169684\n第八面\t169685\n我的姑娘\t169686\n摩西娃\t169687\n故意伤害\t169688\n九分之八第二根\t169689\n明友\t169690\n睡嗯\t169691\n咯小心\t169692\n水龙头式速热器\t169693\n口福\t169694\n欧卡\t169695\n二大\t169696\n70余架\t169697\n止吐\t169698\n嗯好\t169699\nu2b\t169700\nhhhd\t169701\n嗯女\t169702\nhhhg\t169703\n千总\t169704\n塔嗵\t169705\nhhhb\t169706\n骗案\t169707\n干嘛不换\t169708\n期中考试语文数学英语会考上九十五分\t169709\ndddddd\t169710\n自助餐\t169711\n18238374888\t169712\n嗯奥\t169713\n嘈杂\t169714\n门兴\t169715\n天网天旺\t169716\n小翠\t169717\n48848848\t169718\nhtta句\t169719\n同姓恋\
t169720\n边强涛\t169721\n外蒙古\t169722\n980424\t169723\nekdjdnndhd\t169724\n小翰\t169725\n财猫\t169726\n兜兜风去\t169727\n惩恶扬善\t169728\n一天2片\t169729\n王付钱\t169730\n余翠鸟\t169731\n不秘烦喜欢\t169732\n当效\t169733\nvhcvfysdhfstutriwrtiqwwertyuippolkjhgfdsazxcvbnm\t169734\n吭听\t169735\n見同學說\t169736\n奇米\t169737\njjjjljdf\t169738\n1569岁\t169739\n字面\t169740\n底儿\t169741\n炫炫\t169742\n常酱紫\t169743\n380元\t169744\nusk\t169745\n刘俊伟\t169746\nhhh1\t169747\n多兴旺\t169748\n7月14\t169749\n5万元\t169750\n专题研究\t169751\n国家广电总局\t169752\nThere\t169753\n购票者\t169754\n迷迷\t169755\n韩辣炒年糕\t169756\n朽木\t169757\n马可\t169758\n陈一一\t169759\n什么说\t169760\n包办\t169761\ndjduhd\t169762\n你好朋友\t169763\n小米饭\t169764\n刘亚茹\t169765\n苏明天\t169766\n黑子接力棒\t169767\njvkb\t169768\n罗金玉\t169769\n中华田犬\t169770\n替补\t169771\n度秘你真是我的好度秘\t169772\n来这里\t169773\n欺压\t169774\n250n360\t169775\n一六骨\t169776\n头村\t169777\n30天\t169778\n薄衫\t169779\nnoriko\t169780\npomo\t169781\n妹妹度\t169782\n幅员\t169783\n爱的华莱士不\t169784\n6万一亩\t169785\n个咪两个咪三个\t169786\n头条\t169787\n乘除法\t169788\njalued\t169789\n叮叮车\t169790\n乐百氏\t169791\n凉拌三\t169792\n唯爱SJ13]110628利特TW\t169793\n醫院\t169794\n一噗\t169795\n瓷实\t169796\n大泻\t169797\n咯删除\t169798\ngffffdggffffgfgff\t169799\n醒老\t169800\n李欣冉\t169801\nKnjl\t169802\n阅点\t169803\n32米\t169804\n才够了\t169805\n437575542354834566\t169806\n果然v他huh飞亚达好封ID呵呵\t169807\nYWY\t169808\n嗯嗯喁\t169809\nhfgcghcch\t169810\n瓮猫\t169811\n火锅二侧方\t169812\n树人教育\t169813\nDj\t169814\n无穷碧\t169815\nmuitifly\t169816\n，，，，\t169817\n爱心眼\t169818\n胖虎\t169819\n过日出\t169820\n而今\t169821\n雷c3c\t169822\n蒸馒头\t169823\nwcy\t169824\n370元\t169825\n放了\t169826\nh64\t169827\n悠嘻猴象\t169828\n删帖\t169829\n我心太黑了你信\t169830\n百花园\t169831\nwck\t169832\nwcj\t169833\n20多\t169834\n10点半多\t169835\nwco\t169836\n茶汤酒\t169837\nUfhggbuhg\t169838\n培田孙\t169839\n战况\t169840\n国瑞宾馆\t169841\n找辙\t169842\n不废\t169843\n心脑血管\t169844\n散装\t169845\n文昌小学\t169846\n不应\t169847\n两半儿\t169848\nk2jh1b1a\t169849\nFucd\t169850\n禮盒\t169851\n迷失婚\t169852\n不度\t169853\n帽子男\t169854\n废话废话\t169855\nFuck\t169856\n我人\t169857\n挥掉\t169858\nlhf\t169859\n总结构\t169860\
n陈升\t169861\n1381192057\t169862\n妈妈灯\t169863\n进退维谷\t169864\n虎虎生威\t169865\n陈十\t169866\n董文华\t169867\n17点\t169868\n墨绿\t169869\n康瑞宝\t169870\n见面\t169871\n人人家\t169872\n我不想当你闺密\t169873\n警花\t169874\n火罐\t169875\n假货\t169876\n五点到七点\t169877\n减肥茶\t169878\nxxnhd\t169879\n我就喜欢你\t169880\n一一起\t169881\n细软\t169882\n不秘困难度秘\t169883\n颛顼\t169884\n手贱\t169885\n嘎拉嘎东夏嘎拉嘎\t169886\n么回事儿\t169887\ngvhhjj\t169888\n原价\t169889\n原件\t169890\n采油\t169891\n汉庭\t169892\n理論頭\t169893\n仲行\t169894\ncarnivald\t169895\n向西\t169896\nSHUDH\t169897\n放牛人\t169898\nsascha\t169899\n3473万\t169900\n我喜欢思明太阳s1\t169901\n蒋少杰\t169902\n436千米\t169903\n武汉天地\t169904\n25日早晨\t169905\nbbbbbbpppppppppppppppppppppp\t169906\n错啦错啦错啦错啦错啦\t169907\n素颜照\t169908\n中国空军\t169909\n不哭了你真的不爱我\t169910\n凉生\t169911\n我不要娃娃我要娃娃小\t169912\n殷子晨\t169913\n陆大白\t169914\ngdhkjhbj\t169915\n同龄人中\t169916\n你是我的小啊小啊小爷小苹果小啊小啊小啊小啊小苹果\t169917\n妇人\t169918\n查图\t169919\n乌贼\t169920\n耳音\t169921\n旭媛\t169922\n冷酷\t169923\n阎王爷\t169924\n13557942284\t169925\n荷兰阿姆斯特丹大学\t169926\n玻璃玻璃\t169927\n嗯嗯嗯23333\t169928\nytt\t169929\n振华重工\t169930\nudhgvxtfjsood\t169931\n扥周\t169932\n等会儿聊\t169933\n希腊人\t169934\n价日\t169935\n抗体\t169936\n犀利哥\t169937\nNyman\t169938\n中巴车\t169939\n虚假\t169940\n吉林\t169941\n一百五十一\t169942\n二零六五二\t169943\n破烂不堪\t169944\n村上淳\t169945\n高分子\t169946\n一百五十三\t169947\n猜点鱼\t169948\n一十二十三十四十五十六十七十八十九十一千块\t169949\nrnffoh\t169950\n思考\t169951\n送礼\t169952\nokookookokokl\t169953\n规劝\t169954\n61面\t169955\n含苞待放\t169956\n得空\t169957\n自己片\t169958\n粉刺\t169959\n一月二月\t169960\n微胖界\t169961\n乐妮\t169962\n乳糜\t169963\n福运\t169964\n郑一句\t169965\n三胞\t169966\n乳糖\t169967\n张齐\t169968\n变间\t169969\n侥从\t169970\n成一段一段\t169971\n都是\t169972\n谎报\t169973\n学家\t169974\n12345q5ert\t169975\n龟甲博\t169976\n笨悲惨\t169977\n7791\t169978\nsergdf\t169979\n砍单\t169980\nlhl\t169981\n得益于\t169982\n您我真的好爱您\t169983\n度秘alfu\t169984\n4142个\t169985\n学完\t169986\nTIGER\t169987\n都明\t169988\n集贤里中学\t169989\n海陵岛国家级海洋公园\t169990\n差钱\t169991\ncdddx\t169992\n妇产\t169993\n诺婊子\t169994\n困心横虑\t169995\n加多宝加速清理王老吉库存\t169996\nILy\t169997\n郭占彬\t169998\nwifix\t
169999\n香港特区政府\t170000\n原副\t170001\n李梓睿\t170002\n亲父\t170003\n周灵煊\t170004\n靠服\t170005\n赵天娇\t170006\n活了敢\t170007\n唉不晓得\t170008\n人的故事\t170009\nLourdes\t170010\nramy\t170011\n恶心有别\t170012\n章红艳\t170013\n报告会\t170014\n阿门\t170015\n哪是\t170016\nav仓\t170017\n乌坎\t170018\n纯净水\t170019\nLive\t170020\n灌输\t170021\n免单\t170022\n菠箩漆\t170023\n10月2号\t170024\n孙勃互殴案\t170025\n海带清汤\t170026\n吴宝欣\t170027\n哥伦比亚\t170028\n中国国奥队\t170029\nhi度秘烂度秘\t170030\n怀恋\t170031\n么才\t170032\n刚刚日\t170033\n强烈夺目\t170034\n小屁孩\t170035\n庶火线\t170036\ncccccchhdhhhhhhhh\t170037\n么托\t170038\n主题型\t170039\n德雷克斯勒\t170040\n今年10月1日\t170041\n憎恶\t170042\n场内\t170043\n尤姆\t170044\n上劲\t170045\n催睡\t170046\n国有资产\t170047\n一起来看流星雨那首英语\t170048\n牵牛\t170049\n梅龙镇\t170050\n一个第六\t170051\n毁灭者\t170052\n告我不告\t170053\n1200章\t170054\n几瓣儿\t170055\n憎恨\t170056\n没有我真的好想\t170057\n周驺纹\t170058\n昂霄耸壑\t170059\n昨日10时\t170060\n张彦博\t170061\n赵文琪\t170062\n出于无奈\t170063\n攻破\t170064\n一二代\t170065\n我喜欢6S\t170066\n天秤秘\t170067\n没喜\t170068\n莎士比亚\t170069\n卢湾体育馆\t170070\n多咪多咪多咪多咪多咪多咪多咪\t170071\n地道战\t170072\n李宇倩\t170073\n绰绰有益\t170074\n恰当\t170075\nsi号文\t170076\n夜晚上\t170077\n素质秘\t170078\n中国妞\t170079\n非农数据\t170080\n煤矿\t170081\n问你的所问题\t170082\n2562265567\t170083\nouriou\t170084\n孝肃\t170085\n臭皮匠\t170086\njfor\t170087\n通篇\t170088\n我的费\t170089\n鲁g2\t170090\n我乐意\t170091\n毁容\t170092\n雪藏\t170093\n116269元\t170094\n张会军\t170095\n科普书\t170096\n一生的教训咪呀\t170097\n饭心\t170098\n居说\t170099\ndffffcgggccxxxxxxdddgg\t170100\n徐淑珍\t170101\n时尚界\t170102\n丢三落四\t170103\n水粉\t170104\n韩金汇\t170105\n大切诺基\t170106\n距离\t170107\n丑哭\t170108\n那晚我做你\t170109\n危险PK\t170110\n师雨婷\t170111\n长沙县开元中路易初莲花\t170112\n胎记\t170113\n大人事\t170114\n13579557545\t170115\n江雪\t170116\nT2航站楼\t170117\nTour\t170118\npppppppppppppppppppppppppppppppppppppppppppppppppppppp\t170119\n第二首\t170120\n盗版男\t170121\n黑半个月白班个月\t170122\n点点点意思\t170123\n武思彤\t170124\n妳說\t170125\nZzd\t170126\n比翼\t170127\n董国翠\t170128\n活塞\t170129\n臭点\t170130\n10086个\t170131\nTouN\t170132\n二二七八二零八八\t170133\n駄\t170134\n马臭\t170135\n娜兰蒂\t170136\n大人人\t170137\n你好你好搞成忧\t170138\n哇良过\
t170139\n姑庵\t170140\n该户\t170141\nc罗\t170142\n张宇骞\t170143\n点点儿\t170144\n上八点\t170145\n新建小区\t170146\n2800\t170147\n黄韵洁\t170148\n我靠我靠我靠我靠我靠我靠\t170149\n小衫\t170150\n济南市金融办\t170151\n铁达尼\t170152\n七巧\t170153\n好你没疯没疯我唱\t170154\n1259663\t170155\n小女生\t170156\n4号堆\t170157\n三十八千八百八十八岁\t170158\n你不我\t170159\n自己师\t170160\n三亚大小洞天景区\t170161\n杨风琴\t170162\n苦楚\t170163\n15千米\t170164\n无敌无敌无敌5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d5d五\t170165\nbusiness\t170166\n开马屁\t170167\ntixy\t170168\n634厘米\t170169\n标配\t170170\n美优\t170171\n别喊\t170172\n佛学\t170173\n锤子锤\t170174\n很够\t170175\n结了喜\t170176\n很多\t170177\nhvbjkgcg\t170178\n美伊\t170179\n泥马戈壁\t170180\n缪艳\t170181\n3133259372\t170182\n肉漫\t170183\n谢雨泓\t170184\n门缝\t170185\n陈纪宇\t170186\n流言满天\t170187\n很天\t170188\n很大\t170189\ngumi\t170190\n三人度\t170191\ngchxz\t170192\n急须\t170193\n駿\t170194\n說別\t170195\n881神马\t170196\n车标\t170197\n不ksllo\t170198\n伤者\t170199\n佛子\t170200\n九科\t170201\n异国他乡\t170202\nyouo\t170203\nyoun\t170204\nyoum\t170205\nyoul\t170206\naqfkl\t170207\n韩\t170208\n有嘛关系\t170209\n毒誓\t170210\n韦\t170211\nyoue\t170212\n碳酸钙\t170213\n7度\t170214\n大楠\t170215\nyoua\t170216\n大楼\t170217\ncvac\t170218\n你好好过\t170219\n袖袖\t170220\n紫藤\t170221\n韶\t170222\nyouu\t170223\nyout\t170224\n杉原杏璃\t170225\nyour\t170226\n啦度秘\t170227\n杜燕\t170228\n短袖\t170229\n偷笑[嘻嘻\t170230\n刑律规定\t170231\narea\t170232\n短袜\t170233\n14.3万亿美元\t170234\n摩尔兄\t170235\n我很温柔吗你给我个赞\t170236\n马小白\t170237\nares\t170238\nliang\t170239\n杨欣儒\t170240\n沉痛\t170241\nSeenyou\t170242\n新年快乐了\t170243\n天竺国\t170244\n小西克\t170245\n龙缸\t170246\n200平米\t170247\n洗衣夷\t170248\n刘松松\t170249\n恩波mmitnnanbolitimelnit\t170250\n色图\t170251\n上仙\t170252\n没心情了我\t170253\n比卡丘皮卡丘\t170254\n二月二月\t170255\n巨型天\t170256\n心血管\t170257\n任梦远\t170258\n上仓\t170259\n践什午\t170260\n不得而知\t170261\n榕通社\t170262\n热血少年\t170263\n上介\t170264\n1949年\t170265\n没劲透\t170266\n五月天追梦3DNA\t170267\n上仁\t170268\n樱井翔\t170269\n平顶山\t170270\n别小看\t170271\ndangerous\t170272\n嗯嗯嗯\t170273\n1208\t170274\n转不着\t170275\n1204\t170276\n放了死\t170277\n1201\t170278\n1200\t170279\n1203\t170280\
n挑拨离间\t170281\n散發\t170282\nwefd\t170283\n结为\t170284\n燕麦\t170285\n120%\t170286\n玩游泳\t170287\n甘哥哥\t170288\nLan\t170289\n扑出\t170290\n摄影展\t170291\n过渡期\t170292\n勤工俭学\t170293\nLac\t170294\n恩我要女机器人不要你\t170295\n别闹了讲\t170296\n真的很感谢你谢谢谢谢真的好谢谢你\t170297\n丽丽啦了绿巨人\t170298\nGreat\t170299\n杨度秘\t170300\n盗贼\t170301\n别骗\t170302\n明天厂夏\t170303\n2008年10月9日\t170304\n冯绵\t170305\n限行尾号\t170306\n暗号\t170307\n美爱\t170308\n降职\t170309\n别骂\t170310\n故天\t170311\n谢谢多咪\t170312\n五六\t170313\n价格式\t170314\nallow无咯\t170315\n彭高峰\t170316\n韶华\t170317\nwssz\t170318\n你好是\t170319\n哎小度\t170320\n急中生\t170321\nHdhyss\t170322\n盒饭\t170323\n上京籍\t170324\n释迦\t170325\n度秘out度秘out度秘out度秘outoutoutooooout\t170326\n陈楚茹\t170327\n18779058258\t170328\n换屏\t170329\n22500\t170330\n38300元\t170331\n怦怦\t170332\n陆玉年\t170333\n挤挤\t170334\n丽江古城保护管理局\t170335\n俩谁个\t170336\nhuigl\t170337\n模崭\t170338\n五哥\t170339\n轻型\t170340\n全岛\t170341\ndeadline\t170342\n屏被\t170343\nlm模\t170344\n两生花\t170345\n战说\t170346\n雪莲果\t170347\n9名\t170348\n困扣\t170349\nqtoc\t170350\n川南津\t170351\n赞阳\t170352\nibadaxtplacco\t170353\n排档\t170354\n五品\t170355\nEight\t170356\n证券大厦\t170357\n困扰\t170358\n万恶\t170359\n帝国的毁灭\t170360\n自行脑\t170361\np富二二\t170362\n奥黛丽\t170363\n笨点\t170364\n思忖\t170365\n源萌\t170366\n赖锦浩\t170367\nhahahahahahaha\t170368\n66666666666666666666696\t170369\n刘美\t170370\n华慈\t170371\n见不见\t170372\n数种\t170373\n青梅煮酒论英雄\t170374\n美女猪\t170375\n抢占\t170376\n千务\t170377\n2首\t170378\n不骗色\t170379\n小心错\t170380\n夏利\t170381\n一所\t170382\n姚一函\t170383\n赵彬\t170384\n算行\t170385\n八百米\t170386\nXYZ\t170387\n丘陵\t170388\n你好\t170389\n大只个\t170390\n刘群\t170391\n直到现在\t170392\n说的对\t170393\n说不好像\t170394\n女错误\t170395\n猎户座\t170396\n享贷\t170397\n第10周\t170398\n斧头\t170399\n柳求求\t170400\n刁可x逶乙\t170401\n盗墓笔记本来\t170402\nMadeMe\t170403\n不能玩\t170404\n技考\t170405\n张绣\t170406\n践踏\t170407\n张继\t170408\n卡片机\t170409\n3800个\t170410\n内情\t170411\n别类\t170412\n575658888890890000089\t170413\n5700元\t170414\n无双西游\t170415\nonni\t170416\n傍水\t170417\n无药可救乐\t170418\n臭臭你真不要脸不要脸\t170419\nqihjfl\t170420\nyvcrgjbf\t170421\n龟裂\t170422\n曹县
\t170423\n第一炮\t170424\n商花\t170425\n抒情诗\t170426\n康桥镇\t170427\n喜冤家\t170428\n张宇墨\t170429\n雨声里山\t170430\n今文\t170431\n乐系\t170432\n还度秘嘞\t170433\n太块\t170434\n势头\t170435\n利比亚过渡委\t170436\n控制器\t170437\n0890088809632147809652n0\t170438\n三考\t170439\n亲不亲不亲\t170440\n三者\t170441\n拍档\t170442\n稻米\t170443\n炫舞\t170444\n制度秘\t170445\n牛佳琪\t170446\n俩九九\t170447\n太坏\t170448\nouthurintingo\t170449\n形单\t170450\n一个堡\t170451\nhuik\t170452\n精加工\t170453\n相熟\t170454\n吴青峰\t170455\n为什么不不\t170456\n那你现在说你是女孩\t170457\n嫁给他\t170458\n十七元\t170459\nnishzuma\t170460\n弦子\t170461\nhuiu\t170462\n英勇\t170463\n徽派建筑\t170464\n非彼\t170465\n130号\t170466\nhttpahiphotosbaiducomxiaodupicitemb8389b504fc2d562259a3a99e01190ef76c66c30jpg\t170467\n敬听\t170468\n纠错\t170469\n水满\t170470\n比丑\t170471\n新蔡\t170472\n贝贝贝贝贝贝贝贝贝贝\t170473\n水滩\t170474\n北大医g\t170475\n比下\t170476\n比上\t170477\n重温\t170478\n水滴\t170479\n乌有之乡\t170480\n爱的味道\t170481\n油生\t170482\n蒲云\t170483\n可定\t170484\n毋须\t170485\nZdx\t170486\nmlmcx\t170487\n马桢桢\t170488\n一起玩笑\t170489\n红蓝色\t170490\n真好我是好\t170491\n顿悟\t170492\n那告诉我告诉我告诉我告诉我告诉我告诉我告诉我\t170493\nZdd\t170494\n脊六兽\t170495\n原石\t170496\nPPT\t170497\nhttpehiphotosbaiducomxiaodupicitemaa18972bd40735fae2baaccb99510fb30f24087djpg\t170498\noO0\t170499\n人工呼吸\t170500\n杜密夸\t170501\n浙江农信手机银行\t170502\n本公司\t170503\n黄飞魄散\t170504\n好几天\t170505\nfjifxhjfx12385\t170506\nn6nn4nn4nnn4nnnnn46464nn4644648n848n46n41n656656\t170507\n不需要你了我\t170508\n央一禁\t170509\nfgchgfgc\t170510\n阿琼\t170511\n阿琳\t170512\nsfc\t170513\n张鲁一\t170514\n感到\t170515\n度秘特\t170516\n阿琪\t170517\ndasih\t170518\n进惠\t170519\n两席\t170520\n西域观音\t170521\n南沙\t170522\n徐玉婷\t170523\n内人\t170524\n偏向\t170525\n不负责\t170526\n叶依然\t170527\n滚滚有\t170528\n地对地导弹地对地导弹地对地导弹\t170529\n13189958069\t170530\n有不我\t170531\n点帮\t170532\n一个条\t170533\n李树斌\t170534\n度秘牛\t170535\n咪p雅\t170536\n多必胜客\t170537\n贝贝贝贝贝贝\t170538\n陶明良\t170539\n一不说话\t170540\n汉米\t170541\n此役\t170542\n强昂\t170543\n删去\t170544\n优待安查\t170545\ngttgv\t170546\n74分钟\t170547\n这觉\t170548\n掉了坏\t170549\n雕刻\t170550\n王厚\t170551\n王原\t170552\n能使\t170553\n18十八一百120千二
百八百800\t170554\n韭菜炒鸡蛋\t170555\n干完\t170556\n1.5万\t170557\n装车\t170558\n移情别恋\t170559\na\t170560\n保荐代表人和\t170561\n伙食\t170562\n焕宁\t170563\n小客车\t170564\n李婍\t170565\n奥克斯\t170566\n159cm\t170567\nlfromChina\t170568\n王去\t170569\n义兄弟\t170570\n筋道\t170571\n九段\t170572\naabkk\t170573\n谢广坤\t170574\n居首\t170575\n姜欣妤\t170576\n荡漾\t170577\n厚先\t170578\n母恩\t170579\n幸运星\t170580\n又见一皇\t170581\n怦然星动\t170582\n学军路\t170583\n700平方\t170584\n幼小\t170585\n提前还贷款\t170586\n有把握\t170587\n3MM\t170588\n登下\t170589\n登上\t170590\n2011-5-17\t170591\n老话\t170592\nhotslim\t170593\nchthx\t170594\n屌炸\t170595\n恶潮\t170596\n古德一\t170597\n六百五十分钟\t170598\n所以嘛\t170599\n鱼干\t170600\n8888888888888888888888\t170601\n食用碱\t170602\n10月3日\t170603\n石破惊天\t170604\n火烧\t170605\n婴幼儿\t170606\n嗯幺零八二\t170607\n戏曲采风\t170608\n丽辉\t170609\nEVERY\t170610\n8张\t170611\n徐莹莹\t170612\n圣诞大蛋糕\t170613\n古瑞芳\t170614\n在此度秘\t170615\n十八梯\t170616\n爱片\t170617\n没几天\t170618\n太极八卦阵\t170619\n罗家坪\t170620\n不好了再见\t170621\n天山日河\t170622\n3500一平米\t170623\n龙江\t170624\n一包包\t170625\n底薪\t170626\n没有利用\t170627\n医妃\t170628\n一整\t170629\n身自好\t170630\n一数\t170631\n置信\t170632\n九毛\t170633\n特生\t170634\n心电影\t170635\nyxylyY\t170636\n优越感\t170637\n一故\t170638\n爱物\t170639\n好我你男\t170640\n龙泉股份\t170641\n不便\t170642\n空空思密达\t170643\n你动\t170644\n小喜欢一个人\t170645\n一教\t170646\n杨雪\t170647\n想未干\t170648\n细致\t170649\n乌豆米粥\t170650\n心情学\t170651\n老金\t170652\n2.3%升\t170653\n我社\t170654\n踩踩\t170655\n北周\t170656\n海里区\t170657\n亿万\t170658\n天猫小百科#\t170659\n放小\t170660\nhggi\t170661\n放尊\t170662\n诶妈呀老娘哎哟妈妈妈妈\t170663\n公社单元户\t170664\n咱们俩\t170665\n放射\t170666\n盐率\t170667\n放尿\t170668\n老说\t170669\n传送器\t170670\n亲嘴\t170671\n13569872364\t170672\n霸天虎联盟\t170673\n美盛\t170674\n汉英\t170675\n孙懿辰\t170676\nilove\t170677\nehxp\t170678\n公安机关\t170679\n18744287801\t170680\n破坏力\t170681\n我的世界\t170682\n古宅\t170683\n瑞哥\t170684\n周四晚上8点\t170685\nustinit\t170686\n软磨硬泡\t170687\n喵喵咪咪\t170688\nhgzv\t170689\n青青猫\t170690\n深不行\t170691\n苏JNB205\t170692\n底外\t170693\n饭制\t170694\n盀芳囝\t170695\n超凡剃须刀\t170696\n亿下\t170697\n这一晚上\t170698\n11时11分\t170699\nhgzm\t170
700\n给我不支\t170701\n合一起\t170702\n赝品\t170703\n01560\t170704\n五倍\t170705\n医学院\t170706\n食税者\t170707\nhggy\t170708\nhhhvff\t170709\n哭好\t170710\n阿骨\t170711\n东北话\t170712\n打怪\t170713\n两众\t170714\njhsbnrwcn\t170715\n东段\t170716\n翘首以待\t170717\n朱伟豪\t170718\n忙忙碌碌\t170719\n月薪酬\t170720\n痛定思痛\t170721\nuled\t170722\n16元\t170723\n悠闲值\t170724\nmiss\t170725\n郝晓涵\t170726\n揪住\t170727\n噜啦啦题库\t170728\nV领\t170729\n黑叔叔\t170730\n郎姐\t170731\n蓬蒿人\t170732\n妈蛋\t170733\nKiSS\t170734\n雾霾\t170735\n7篇\t170736\n6月28日晚间\t170737\n黄松森\t170738\ns22c3823838\t170739\n王娇玲\t170740\n肖哥\t170741\n王晚会\t170742\n表意不明\t170743\n狂野\t170744\n阿巴特、内斯塔\t170745\n去除\t170746\n手术刀\t170747\n家务\t170748\nIPHONE光\t170749\n邹瑶\t170750\n百合文\t170751\n刺耳\t170752\n87吨\t170753\n白象\t170754\n吧味精\t170755\n18281989352\t170756\n哭了不去\t170757\n尾部\t170758\n一帮四岁\t170759\n信姐\t170760\n掉价\t170761\n去了再见\t170762\n洛鹰\t170763\n嗯呢们\t170764\n天下财经\t170765\n明天1点\t170766\nIna罗湖九牛\t170767\n王瑞\t170768\n王瑟\t170769\nwoaimimi\t170770\n决出\t170771\nbookaboutkm\t170772\n唱一首发\t170773\nD盘\t170774\n都一无二\t170775\n100BD\t170776\n罪我没有\t170777\n小草儿\t170778\n赵希望\t170779\n这回别嘌这回别骗我\t170780\n妖刀\t170781\n3三十年\t170782\n崇拜\t170783\n大椰丰饭店\t170784\n李上兵\t170785\n美丽的谎言\t170786\n学步\t170787\n咯可乐\t170788\n高鸿飞\t170789\n王瑢\t170790\n合情\t170791\n这座机\t170792\n亿丰\t170793\n度丹秀\t170794\n金沉默\t170795\n不定时\t170796\n贺龙传奇\t170797\ndfhj\t170798\n给没\t170799\n王黔南\t170800\n听育\t170801\n李茂庆生\t170802\n这幺\t170803\n1423289565\t170804\ndudddtf\t170805\n这年\t170806\n单刀吹手\t170807\n办结\t170808\n涤荡\t170809\n陕西话\t170810\n县政府\t170811\n臆想\t170812\n流浪猫\t170813\n邓家乐\t170814\n伤心想\t170815\naksj\t170816\n济贫\t170817\n泛神论\t170818\n爷们儿心\t170819\n400余款\t170820\n北鼻\t170821\n水草\t170822\n囧虎\t170823\nEHWKTIHTDJDJRYUJJXGNTEUYSNZCLVVZCNCCBDFKL\t170824\n爸造\t170825\n一无所获\t170826\n我不喜欢你我讨厌特别讨厌特别讨厌特别讨厌你\t170827\n冷水村\t170828\n爱徒\t170829\n可乐殿\t170830\n熙来攘\t170831\n覆盖\t170832\n学成\t170833\n六点二十\t170834\n重排\t170835\n混顿\t170836\n55888425925558881346572\t170837\n贫家寒屋愁\t170838\n重掌\t170839\n山月不知心底事\t170840\nHeat\t170841\n餐酒率\t170842\n大王大王少安礼物为
什\t170843\n75dnjf\t170844\n才女\t170845\njukf\t170846\njukg\t170847\n没得救\t170848\n活埋\t170849\n允浩\t170850\n再用水冲名\t170851\n郎俊明\t170852\n淫妹\t170853\n月影\t170854\n借鉴\t170855\n673949657\t170856\n佳凝\t170857\nlisted\t170858\n士平\t170859\n骨密度\t170860\n2012.7.7\t170861\n别具一格\t170862\n等一\t170863\n外版\t170864\nv2.9.6\t170865\n田美玉\t170866\n外片\t170867\n286分\t170868\n累坏\t170869\n淫妇\t170870\n上地十街十号\t170871\n八十级\t170872\n刁蛮\t170873\n赤裸\t170874\n九不靠\t170875\n朴大白\t170876\n2万亿美元\t170877\n共兴村\t170878\n金一一\t170879\n图图图图图图\t170880\n7级\t170881\n我不我不你\t170882\n勇者无敌三十\t170883\n爽大爱戈\t170884\n男人婆\t170885\n瞎说\t170886\n巡视员\t170887\n91年\t170888\n资焕颜\t170889\n专栏作家\t170890\n有很爱\t170891\n每五秒钟\t170892\n唉3\t170893\n苏辛博\t170894\n美国联邦最高法院\t170895\n没想过\t170896\n能无患事\t170897\n睡不着唱一去\t170898\n8点十四八点\t170899\n0916\t170900\n最优\t170901\n0914\t170902\n漏点\t170903\n5689\t170904\n熊出没的上老咯一样的百事通\t170905\nR级\t170906\n5687\t170907\n好我叫利落\t170908\n空部\t170909\n无钱者\t170910\n快乐小逗比\t170911\n橙天嘉\t170912\n痴犬\t170913\n妻妾\t170914\n抄上\t170915\n襯衣\t170916\n来我不聊\t170917\n41岁时\t170918\n幽文\t170919\n3月10日下午6时\t170920\n光耀\t170921\n神采奕奕\t170922\n灵魂沙\t170923\ncjdf\t170924\n秘像\t170925\n文谁\t170926\n是不是\t170927\nd12\t170928\n六成\t170929\n技术上\t170930\n六戒\t170931\n垤玛乡\t170932\nttrh\t170933\n两百再二十二十\t170934\n必要时\t170935\n天天涯\t170936\nSUJU\t170937\n白蚁\t170938\n伤身\t170939\n兮兮\t170940\n听见了没\t170941\nfffrydyrtywttst\t170942\nirjcjg\t170943\n特鲁姆普\t170944\n大柜\t170945\n共犯\t170946\n欧尼水泥饭\t170947\n终须\t170948\n承蒙\t170949\n在梦游\t170950\n兰江\t170951\n兰汐\t170952\n各自己\t170953\ngvhvjvjjgvgcfsrsedtyttttttttt\t170954\n接收\t170955\n痴缺\t170956\n胡马度阴山\t170957\n18093371866\t170958\n综合国力\t170959\n马里奥\t170960\n李哲宇\t170961\n158472658\t170962\n马雅\t170963\n博昆\t170964\n零二二幺零零幺零\t170965\nimac\t170966\n110小时\t170967\nimax\t170968\n嗯立秋\t170969\nimaw\t170970\n小飞侠\t170971\n郭伯雄\t170972\n开馆\t170973\n破脸\t170974\n得奖\t170975\n奔驰车\t170976\n平底鞋\t170977\nvvgh\t170978\n逞强\t170979\n萌兽\t170980\n哎好\t170981\n月照\t170982\n葱油饼\t170983\njxiduchhcjcujcuc\t170984\n1月18号\t170985\n听听行\t170986\n陪我\t17
0987\n票房\t170988\nv蛋炒饭\t170989\n洼里\t170990\n888一个\t170991\n廣州\t170992\n吹牛\t170993\n中人\t170994\nexdo\t170995\n警方\t170996\n摔下\t170997\n525555554556\t170998\n吃面条\t170999\n不太上\t171000\n海洋特别保护区\t171001\n第五名\t171002\n真的是你子臭\t171003\n作战\t171004\n盖浇饭VV观后感\t171005\n走自己的路\t171006\n一起跳绳\t171007\n对对对我就讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t171008\n长沙警察砍杀医学博士\t171009\n爸爸乐\t171010\n脱狱\t171011\n公知们\t171012\n感谢你了度秘\t171013\nstantimei\t171014\nmaabc\t171015\n崔渐\t171016\nFfggsgg\t171017\n乌海会店\t171018\n泰迪熊\t171019\n美美\t171020\nffrd\t171021\n喜欢\t171022\n美羊\t171023\n一四五万\t171024\n5.3吋\t171025\n习得\t171026\n神偷\t171027\n黄春森\t171028\n在时候\t171029\n暴躁\t171030\n600732\t171031\n天域\t171032\n美群\t171033\n太多太\t171034\n乐派\t171035\n傻子身\t171036\n勇士们\t171037\n潦倒\t171038\n尾音\t171039\n暗干\t171040\n呀葵\t171041\n强人所难\t171042\n律师案\t171043\n丨丿ㄥ丨丨\t171044\n杨洋阳\t171045\nHDDFFWEDHH\t171046\nOAO\t171047\nzenyang\t171048\n于露涵\t171049\n一株\t171050\nAMD超能军团\t171051\n8种\t171052\n天风\t171053\n通州天逸大酒店\t171054\ncouj\t171055\n诺囧囧\t171056\n水淹车拍卖背后的利益链\t171057\nQQ音\t171058\n几例\t171059\n金刚经\t171060\n垂直平分线\t171061\n只怕\t171062\n侄儿\t171063\n安静早上安安\t171064\n顾源性\t171065\n快五年\t171066\nkeje\t171067\n文化节\t171068\n聊时间\t171069\n100克朗\t171070\n贝斯手\t171071\n黄色伦\t171072\n名字库\t171073\n齿\t171074\n蓝浩\t171075\n翠太\t171076\n复方罗汉果止咳颗粒\t171077\n大鸭蛋\t171078\n实玛利\t171079\n独生子女制\t171080\n聊了不聊\t171081\n李天完\t171082\n蓝海\t171083\n看不爱情原谅\t171084\n1235584902\t171085\n一首点\t171086\n都说那妙香来自林间\t171087\n穿透\t171088\n十二点钟\t171089\n20200\t171090\n片段录\t171091\n密克丝\t171092\n微旅行\t171093\nghffuiik\t171094\n3.47%\t171095\n果子\t171096\n音部\t171097\n名字度\t171098\n狂飙\t171099\n九劫剑法\t171100\n一千年后\t171101\n太太太太太太太太太太太\t171102\n短毛猫\t171103\n板店乡\t171104\nurba\t171105\n用心犹虑\t171106\n建设期\t171107\n江汉\t171108\nhuhj\t171109\n11223344556625466\t171110\n茴香\t171111\n硅酮\t171112\n吃经\t171113\nhandedfor\t171114\n给我走\t171115\n运转向\t171116\n早打\t171117\n新君越\t171118\n黄哲\t171119\n一百二一年\t171120\n眼晴\t171121\n135斤\t171122\n逗乐班\t171123\n硅酸\t171124\n艾比森光电\t171125\n东风悦达起亚\t171126\nhellopt\t171127\n启辰t70\t171128\nhellop
q\t171129\n月蚀君\t171130\n十二岁五十\t171131\n詹记\t171132\nKerstin\t171133\n洗澡时\t171134\n苏44\t171135\n无远\t171136\n浆水面\t171137\n交都\t171138\n酬乐天扬州\t171139\n兔兔凸透镜路途兔兔兔兔\t171140\n调听\t171141\n掉班\t171142\nvvvvvvvv\t171143\n针孔\t171144\n长虹e十九石车会\t171145\n不好意\t171146\n桩子\t171147\n纳凉\t171148\n夕阳红\t171149\n误报\t171150\n细述\t171151\n天涯走天涯走大坏蛋铠甲勇大坏蛋\t171152\n被单\t171153\n伍萍\t171154\n佩奇\t171155\n无迹\t171156\nX1000O\t171157\n光大银行\t171158\npu同意句\t171159\n畅所欲言\t171160\n寨子\t171161\n陈俊龙\t171162\n4G技术\t171163\n黑猪\t171164\n黑猫\t171165\n文一起\t171166\n巴嘎压\t171167\n晶莹\t171168\n吧块\t171169\n博美多秘\t171170\n博乐\t171171\n可嘉\t171172\n出面\t171173\n集体户\t171174\nppip\t171175\n周小燕\t171176\n死亡号\t171177\n董李笑\t171178\n招文手\t171179\n福禄寿放暑假\t171180\n小心肝\t171181\n品品\t171182\n粮块\t171183\n赢政\t171184\n赫海文\t171185\n铁了心\t171186\nfjkhfdghhfdfjjnootddxhjjhgfbjdrig\t171187\n王珏\t171188\n四七转\t171189\n２３９２０一周岁\t171190\n刘洁\t171191\n真的好气愤\t171192\n富不仁\t171193\n猪么人话\t171194\nreey\t171195\n陈嘉欣\t171196\n64976434\t171197\n吊瓶\t171198\n王珂\t171199\nghshdiggycxtz5ngagapapjpatmtpjpjpajmtjqapjt\t171200\n莱茵\t171201\n酒肆\t171202\n酒肉\t171203\nhhggy\t171204\n百合色\t171205\ngrrfb\t171206\n有钱有势\t171207\n0001元\t171208\n一年三\t171209\n读书点\t171210\n吊瓜\t171211\nqualityRFA\t171212\n惠泽\t171213\n归国璃\t171214\nnmmmmm\t171215\n善举\t171216\n莱茨\t171217\n伯温\t171218\n告诉我你的真人面目呀\t171219\n乐都\t171220\n百家争鸣\t171221\n在啊不嘛\t171222\n拾级而上\t171223\nladygaga\t171224\n回往\t171225\n好时光\t171226\n金牛女\t171227\n像头\t171228\n油钱\t171229\n张孔博\t171230\n张家界\t171231\n阿福\t171232\n53535353535353535353568686868683833\t171233\n在唱歌\t171234\n缝线\t171235\n傻嘻嘻\t171236\n一2\t171237\n56388\t171238\n中共中央政治局\t171239\nbnbjjhh\t171240\n军属\t171241\nnaa\t171242\n猪儿跑\t171243\n式版\t171244\nnai\t171245\nnan\t171246\nnao\t171247\nnam\t171248\ny410\t171249\nst堡\t171250\n不能怪\t171251\n好诗诗\t171252\n不能怨\t171253\n两厢\t171254\n总编辑\t171255\n悄悄电影\t171256\nyybuevexjbyrxirvxbz\t171257\nostd\t171258\n骗我片\t171259\n陈远楠\t171260\ngjjjf\t171261\n一番话\t171262\n辣妹子\t171263\n北京城\t171264\n妹汁\t171265\nceyoutenction\t171266\n琅琊琅琊\t171267\n升旗
\t171268\n4805\t171269\n哒哒二哒哒三哒哒四哒哒五哒哒\t171270\n评论员\t171271\n住在哪儿了了\t171272\n换乘\t171273\neerdtrhg\t171274\n周五文\t171275\n如如然后入\t171276\n十七八岁\t171277\n邵凯君\t171278\n九十多\t171279\n纪念碑谷雨天散散心下次吧爸爸好呀更热额待续\t171280\nZengyang\t171281\n季叔公\t171282\n让出\t171283\n第七句\t171284\n倒卖\t171285\n你的拜拜\t171286\n视觉\t171287\n郭鑫\t171288\n站台\t171289\n李呵\t171290\n松胸\t171291\n4863894\t171292\n囚凰\t171293\n四百八\t171294\n砰砰砰砰\t171295\n854855558587\t171296\n王德文\t171297\n拜年\t171298\n脾气\t171299\n10月25\t171300\n10月23\t171301\n储蓄卡\t171302\n女魔头\t171303\n方正\t171304\n艰苦卓绝\t171305\nLIIII\t171306\n来不是\t171307\n市民\t171308\n脾氣\t171309\n根苗\t171310\n四百克\t171311\n新年贺岁金钞\t171312\n霸道总裁爱上我的那种\t171313\n弘扬\t171314\n四百元\t171315\n四百兆\t171316\n五彩城\t171317\nwwwwcccc\t171318\n80年\t171319\n耍猴\t171320\n神女\t171321\n眼眉\t171322\nvareyou\t171323\n眼看\t171324\n别逃\t171325\n打蚀\t171326\nxxttz\t171327\n查伯纯\t171328\n维基\t171329\n干白\t171330\n640块\t171331\n眼眸\t171332\n黑作坊\t171333\n大额\t171334\n臭坏蛋\t171335\nhacoh\t171336\n溃烂\t171337\n眼眶\t171338\n呜呼哀\t171339\n吴亦凡\t171340\n驴唇\t171341\n锦旗\t171342\n豪宠\t171343\n乖乖哦\t171344\n蛋蛋蛋蛋蛋蛋\t171345\n圣斗\t171346\n讲旧\t171347\nnishidog\t171348\n小技巧\t171349\n明知多情\t171350\n工科\t171351\nOrson\t171352\n爸爸窝\t171353\n四十三块\t171354\n牵丫\t171355\nimFFL\t171356\n文强\t171357\n中午11点50分\t171358\n幼幼\t171359\n定情\t171360\n内脂\t171361\n23公里\t171362\n罗成远\t171363\n习惯好\t171364\n传播\t171365\n已经很久\t171366\n那好呀\t171367\n内脏\t171368\n我的心我的了我生命的火呼\t171369\n范卓越\t171370\n｀ノ\t171371\n绝非偶然\t171372\n點贊\t171373\nschedule\t171374\n婆公公民嵩明\t171375\n干戈\t171376\n白上\t171377\n亿万富瑥\t171378\nmildendo\t171379\n第六\t171380\n酱紫\t171381\ndtdyfhguvjbknjyeahCCTVNBGDIfashionJackyJackyVincent\t171382\ncelb\t171383\n古孤\t171384\nhi你好\t171385\n跟坏\t171386\n亚洲航空\t171387\n面超级筋道\t171388\n十二年后\t171389\n装萌\t171390\n炮灰\t171391\nn　　　　Ｏn　　　　　　
ヽn\t171392\n哼叽\t171393\n炮火\t171394\npls5\t171395\n镶牙\t171396\n村妇\t171397\n检察机关\t171398\n国税\t171399\n小侦探\t171400\n卡福，内斯塔\t171401\n十一嗯\t171402\n摄助\t171403\n卫东村\t171404\n女队\t171405\n安妮弗兰克\t171406\n依赖度\t171407\nrhymzlictmgtev\t171408\n痴呆\t171409\n爆痘\t171410\n官邸\t171411\n邪神\t171412\n事发后\t171413\nYouarewelco\t171414\n劳老是\t171415\n厚重\t171416\n吵饭\t171417\n武职小秘书林\t171418\nfasyiqe\t171419\n潘冰\t171420\n4100\t171421\n喔摔\t171422\n男团\t171423\nleo平生\t171424\n度秘我真的很感谢你\t171425\nsinging\t171426\n陈立发\t171427\n大活\t171428\n保利香槟国际\t171429\n弧暮\t171430\n度秘干嘛\t171431\n十组\t171432\n5月28日\t171433\n三十岁\t171434\n程静玉\t171435\n守门员\t171436\n从头狗秀\t171437\n129千米\t171438\nvufddc\t171439\n电影频道梦工场\t171440\n二二十\t171441\n悲苦喜乐\t171442\n危在旦夕\t171443\n大忙人\t171444\n嘿哟度秘findiffeno0000000牛牛牛牛牛牛\t171445\n孙老师\t171446\n金女\t171447\n二月七号\t171448\n旁观者\t171449\nrts6rrgddf\t171450\n姑规\t171451\n静静静静静\t171452\n大学生村官#\t171453\n不断网\t171454\n心计\t171455\n2月20号\t171456\n哦多咪\t171457\n金奖\t171458\n在一起去\t171459\nLIKE\t171460\n090517\t171461\n辘轳\t171462\n七环\t171463\n电机\t171464\n2b网\t171465\nXX摩尔\t171466\n那个英\t171467\nTOX\t171468\n竖立\t171469\n油汀\t171470\n好搞笑笑\t171471\n楷清海\t171472\nTOP\t171473\nTOT\t171474\n二零三零\t171475\n声响\t171476\n十万八千次\t171477\nTOM\t171478\n问谈\t171479\n代亚璐\t171480\n就是要不了\t171481\nTOD\t171482\n过年吧\t171483\n三十几多\t171484\n宿管\t171485\n无聊的间\t171486\n工间\t171487\n我来上南321开始你是我的小呀小苹果张蒙爱你\t171488\n玩好呀\t171489\n过年吗\t171490\n手写字\t171491\n王金秀\t171492\nenovo\t171493\n色系\t171494\n弹死\t171495\n蒙克正\t171496\n李沧中学\t171497\n一起走行\t171498\n大多数\t171499\n上学星期六\t171500\n了什么呢面对面\t171501\n道中人\t171502\nososos\t171503\n性解剖\t171504\nkykjkgjgngdmhkunmjhlhgkunfjxrsjy\t171505\nrzt37\t171506\n痰管\t171507\n括弧\t171508\n火箭弹\t171509\n宠物蛇\t171510\n六行\t171511\n西街小学\t171512\n2011-02-01\t171513\n2根\t171514\n二百三十五\t171515\n象限\t171516\n特工叔\t171517\n豆瓣网\t171518\n撞装\t171519\nhste\t171520\n拉贝比\t171521\n偷窥胡\t171522\n麦凯风\t171523\n拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜拜\t171524\n乐亭桥头\t171525\n几棵\t171526\n｀ノHAPPYNEWYEAR\t171527\n雅思\t171528\n一万号\t171529\n王卡狼\t171530\n腻人\t171531\n燎燎
\t171532\n吃肉树\t171533\n山药蛋派\t171534\nTOP8\t171535\n于都学\t171536\n异数\t171537\n这么片\t171538\n讨厌讨厌讨厌讨厌男人\t171539\n撒巴拉克\t171540\n男汕\t171541\n金詩琦\t171542\n杜妹子\t171543\n的乡村小学图书馆\t171544\n牛格纹\t171545\n哪家壁\t171546\n2b3b45678910\t171547\n雷丝秀\t171548\n沙里尔\t171549\nhellometetet\t171550\n战眸\t171551\n大美人\t171552\n只么\t171553\njhtffoy\t171554\n征友\t171555\n去痘\t171556\n很好看我的\t171557\n兰花\t171558\n05074424774545556852555\t171559\ndfovhvi\t171560\nsdstt\t171561\n只缘生\t171562\n王贺阳\t171563\n63厘米\t171564\n萤火\t171565\n了烦\t171566\n去痣\t171567\n鲜为\t171568\n向右\t171569\nTHE\t171570\n3500元\t171571\n纤夫\t171572\n碎骨\t171573\nwwkqiqww\t171574\n彭修贤\t171575\n嬔帮忙\t171576\n五点整\t171577\n兰芝\t171578\n听份\t171579\n万宝路\t171580\n露台\t171581\n新世界教育机构\t171582\n香吻\t171583\ndfdyduftfhdufhchfhfhgughggguc\t171584\n强行\t171585\n144\t171586\n六时间\t171587\n145\t171588\n麻玩意儿\t171589\n我的生辰\t171590\n好呀正好\t171591\n黄琪\t171592\n随喜你的欢喜心\t171593\n香君\t171594\n奥比\t171595\n听从\t171596\nplayground\t171597\n零七二零三八\t171598\n100年前\t171599\n等长\t171600\n说亲亲亲亲\t171601\n大喘气\t171602\n纯良\t171603\n空等会聊\t171604\n小可爱度蜜\t171605\n#留法男\t171606\n七九幺五\t171607\n130万\t171608\n石壕吏\t171609\n我的秘\t171610\nOKOK\t171611\n夏克爱\t171612\n清净\t171613\n俗家\t171614\ndkskns\t171615\n跟好好好\t171616\n清凉\t171617\n吾党\t171618\n堆成\t171619\n手机型\t171620\n伽罗\t171621\n六岁半\t171622\n41名\t171623\n撒比撒比\t171624\n张秘堡\t171625\n和秘\t171626\n14%\t171627\n取现\t171628\n春节等假日免收小客车通行费\t171629\n麦格纳\t171630\n台政\t171631\n森里人\t171632\n辛苦你了谢谢你\t171633\n后海地\t171634\n84吧\t171635\n纽埃lol\t171636\n鸿瑞兴\t171637\nHahahahahahahahaha\t171638\n敲破\t171639\nuwhbdb\t171640\n视人\t171641\n5096起\t171642\n明段\t171643\nDEBAC\t171644\n上半时\t171645\n断电\t171646\n一个100克\t171647\n放纵爱\t171648\n整装\t171649\n1257r\t171650\n咱们两个聊会天\t171651\n一百三十六\t171652\n一百三十八\t171653\n开会\t171654\n徐奥迪\t171655\n三月底\t171656\n紫罗兰D\t171657\n奏匍爬舐KJLJM3\t171658\n陷入\t171659\n花小米\t171660\n美女哥\t171661\n1zsf\t171662\n长庆宾馆\t171663\n钱包薄\t171664\n日华版\t171665\n董中菲\t171666\n众夫\t171667\n曹雨晨\t171668\nsmys\t171669\n教育片\t171670\n24根\t171671\njjjkjk\t171672\n众多\t171673\
n逊毙\t171674\n靠恁娘\t171675\nsulam\t171676\nvqqx\t171677\n安海\t171678\n挑三拣四不关\t171679\n64块\t171680\n好臭\t171681\n24栋\t171682\n群服\t171683\n幺零幺八\t171684\n制成品\t171685\n282668\t171686\nhhiihiq\t171687\ntFboy\t171688\n心塞塞嗯哦小度秘你真的好搞笑我给你十分\t171689\n花鼓灯\t171690\n死战\t171691\n伯拉伯拉\t171692\n亲爱的你想我\t171693\n开爱\t171694\n死成\t171695\n冻死骨\t171696\n真真个狐媚相[嘻嘻\t171697\n捆绑销售\t171698\nTalent\t171699\n巴拉拉小魔仙求求你了求求你\t171700\n那你真的爱我\t171701\n雾气\t171702\n斑斑驳驳\t171703\n国色天香万人求\t171704\n今天晚上22：40\t171705\nable\t171706\n黄大狗\t171707\n头某诺\t171708\n可怜状\t171709\nEERGGB\t171710\n会秘\t171711\n爱奇妙妙屋\t171712\n不要点\t171713\n恶鬼\t171714\n中水\t171715\n蕴涵\t171716\n家常便饭\t171717\n雾水\t171718\n二十十二日\t171719\n己走\t171720\n米花糖\t171721\n纯然\t171722\nawy\t171723\n董琦皓\t171724\n影评\t171725\n衰玩\t171726\n出去走走\t171727\n2838385728\t171728\n黑狼犬\t171729\nbasket\t171730\nawm\t171731\n臭呆子\t171732\n椒房殿\t171733\n下年\t171734\n治安员\t171735\n包装袋\t171736\n龙拉\t171737\nhttpfhiphotosbaiducomxiaodupicitem0b55b319ebc4b745b61a8362c8fc1e178a821533jpg\t171738\n1788576\t171739\n淡定不了行间几岁了你男孩女\t171740\n周秉山\t171741\n流利\t171742\n拜拜我真的不理你了我走了\t171743\n我相信\t171744\n二三八二\t171745\n承运\t171746\n外情\t171747\n读心术\t171748\n横向\t171749\n长长长长长\t171750\n药明康德\t171751\n1942090086\t171752\n这回事儿\t171753\n22.44%\t171754\ntf卜a4\t171755\n查查吧\t171756\nrhdbc\t171757\ndtptp\t171758\neeesfsrtf\t171759\n碎石\t171760\n伊利\t171761\n56815\t171762\n王思文\t171763\n8千四\t171764\n广晟有色\t171765\nshoes\t171766\n05月23日07时\t171767\nfanyi\t171768\n2489\t171769\n填下\t171770\nwinner\t171771\n25张\t171772\n逾期\t171773\n副歌\t171774\ncreme\t171775\n羊火\t171776\n许育菱\t171777\n很辛辛苦\t171778\n中戏实验小话剧院\t171779\n心梦\t171780\n慕离慕瞳\t171781\n坏我讨厌\t171782\nmga\t171783\n李玉\t171784\n罒w罒\t171785\nmgd\t171786\n郝佳伟\t171787\n陕西省发改委\t171788\n快乐的确良\t171789\n私家路\t171790\n一根半\t171791\n麻将室\t171792\n李玟\t171793\n玮琪\t171794\n汪磊\t171795\n架心情\t171796\n甩卖\t171797\nUsu\t171798\n拼图游戏\t171799\n邓华德\t171800\n杨静杰\t171801\n声高\t171802\n李玮\t171803\n得了\t171804\n雷国友\t171805\nAPS\t171806\n开营\t171807\n那你是什么呢你是魔蝎\t171808\n堕无间地狱\t171809\n刀叉\t171810\n西葫
芦恶心死人\t171811\n开落\t171812\n槽道\t171813\n红糖饼\t171814\n李玲\t171815\n几个月\t171816\nhello你是我的小叔秘\t171817\n没爱\t171818\nDAZED\t171819\nhi喽度秘\t171820\ngixiyx\t171821\n和平原谅\t171822\n话装\t171823\n喽喽喽孙悟空七十春号\t171824\n狎鸥亭\t171825\n天佑ucuucuucuuuuucuuuucuuuuuuccc6uuucco\t171826\n2kifi\t171827\n陈燕\t171828\n细川\t171829\n新不了的情\t171830\nzhengliming\t171831\n主张\t171832\n赛洛曲\t171833\n家话宝\t171834\n笨猪那你在干嘛\t171835\n80平米\t171836\n主張\t171837\n红火我小幼师\t171838\nbiame\t171839\n陈烨欣\t171840\n正兴街\t171841\n5年前\t171842\nwjgbv\t171843\n13586985698866988\t171844\n收回去\t171845\n断层\t171846\n长卷\t171847\nSRY\t171848\n平底\t171849\n生日宴\t171850\n锦心情\t171851\n最好呀\t171852\n锡焕咪\t171853\n小壮壮\t171854\n加油工\t171855\n忙忙忙忙忙忙\t171856\n平座\t171857\n王春莲\t171858\n复旺\t171859\n复旦\t171860\n白金级\t171861\n平庸\t171862\n东城\t171863\n永远不理\t171864\n雨街\t171865\n电影学院\t171866\n很麻烦\t171867\n超限\t171868\n厮打\t171869\n邓丽君\t171870\n多发点\t171871\n黑夜里\t171872\n八分钟\t171873\n皱起\t171874\n刀山火海\t171875\n挡不命\t171876\n明天八点整\t171877\n6236万\t171878\n忙倒\t171879\n340464737\t171880\n人均衡\t171881\n别走了哈\t171882\n睡不幸福\t171883\n送卖\t171884\n讨厌鬼我好恨你\t171885\nymyo\t171886\n百篇\t171887\n阿瘫\t171888\n郑王\t171889\n雨衡\t171890\n溜溜球\t171891\n雨衣\t171892\n謎底\t171893\nfyjvcfd\t171894\n美术生\t171895\n保姆猫\t171896\n轿子\t171897\notn\t171898\n扣给\t171899\n顺口\t171900\n以非\t171901\n李毅辉\t171902\n待着好恶心\t171903\njushss\t171904\n牛哈\t171905\n实质性\t171906\n中高级\t171907\nots\t171908\n膻味\t171909\n房昭\t171910\n世间法\t171911\n四百多年\t171912\n牛哥\t171913\n龙之牧场驯龙高手\t171914\n李淑鑫\t171915\n浪涛\t171916\n打鼾\t171917\n王冠\t171918\nLadys\t171919\n叉烧肉\t171920\n顺受\t171921\n切条\t171922\n滤波\t171923\n双汇\t171924\n无所不知道\t171925\n爽快\t171926\n杨集乡\t171927\n纯属气\t171928\n乐逗比\t171929\n乱弹\t171930\n岳州窑\t171931\n武维斌多兹鲁\t171932\ncsm\t171933\n两尺\t171934\n音痴\t171935\n开塞露\t171936\n米娜桑\t171937\nyy1座\t171938\nFcogcuuf\t171939\n你好我漂亮姐姐行\t171940\n我行故我在\t171941\n事头\t171942\n二零二二四\t171943\n死去活来\t171944\n云中书城\t171945\n武士\t171946\n乱弄\t171947\n乱开\t171948\n意欲\t171949\n伙荣\t171950\n一片半\t171951\n刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚才\t171952\n安逸六年\t171953\n
公庄\t171954\n低沉\t171955\n火影忍者博人传\t171956\n不要不要不要不要不要不要不要\t171957\nAPO\t171958\nPerregaux芝柏表\t171959\n精简压缩\t171960\n酷狗炫铃\t171961\nxdrhtmayo\t171962\n凤逆九天妖孽师尊\t171963\n宀Ящщъ\t171964\n陈柯宏\t171965\n6.7个\t171966\n别傻了张看星星\t171967\n多级化\t171968\n如沛练\t171969\n纳镇\t171970\n撒娇娇\t171971\n杨智博\t171972\n开不了\t171973\n方若文\t171974\n芭比\t171975\n耳鼻喉\t171976\n籽沐\t171977\n肉类\t171978\n陆陆伍\t171979\n珉秀\t171980\nshidvkkfc\t171981\n多咪面\t171982\n施工\t171983\nSteffi\t171984\n骚情\t171985\nBD\t171986\nBE\t171987\nBF\t171988\ndepivo\t171989\nBA\t171990\nBB\t171991\nBC\t171992\nBL\t171993\nBM\t171994\n李秘秘\t171995\nBO\t171996\n两万1529\t171997\nBI\t171998\nBJ\t171999\nBT\t172000\nBV\t172001\n同名装\t172002\nBS\t172003\nBY\t172004\nBd\t172005\nBe\t172006\n辞藻\t172007\n２０１０年１２月３１日\t172008\nBb\t172009\nBc\t172010\n先婚\t172011\n我的僵尸眼\t172012\n南京人\t172013\n幽怨\t172014\n曹天\t172015\nBv\t172016\n一百太币\t172017\n死长\t172018\nyguhhhgt\t172019\n三个愿望\t172020\n老讨厌\t172021\n赢得到\t172022\n二别\t172023\n洗衣服\t172024\n二二六六\t172025\n叫忙\t172026\n虐猫者\t172027\n╯╰\t172028\n谭欣怡\t172029\n17748\t172030\n飯店\t172031\n大我恨\t172032\n刘家营\t172033\n旧人\t172034\n菲翔\t172035\n达人们\t172036\n叶小姐\t172037\n好桑心的赶脚\t172038\n防身术\t172039\n旧事\t172040\n中隔\t172041\n二分\t172042\n二则\t172043\n二刚\t172044\n洗衣机\t172045\nB1\t172046\nB2\t172047\nSQL\t172048\n191米\t172049\n我是你的小秘书度\t172050\narepl\t172051\n佳乐家\t172052\n一到底\t172053\n刘国梁\t172054\n陈鸿寿\t172055\n柿饼\t172056\ntwitter\t172057\n低价位\t172058\n拽拽\t172059\n刑天明天\t172060\n4小时\t172061\n六年级数学空间与图形专项训练卷附加题\t172062\n才额\t172063\n藉此\t172064\n小猴子\t172065\n涂黎\t172066\n午夜\t172067\n头颅\t172068\n录像带\t172069\nmonday\t172070\n体现\t172071\nkettle\t172072\n双一一\t172073\n头颈\t172074\n别气我了好不好\t172075\nsometot\t172076\nsometou\t172077\n论文\t172078\n不准错\t172079\nu12个\t172080\n受气\t172081\n先佛教\t172082\n1jgj\t172083\n头题\t172084\n身形\t172085\n每周五\t172086\n乾\t172087\n一见误\t172088\ngaywebsite\t172089\n欲望者\t172090\n安徽卫视新浪\t172091\n八个小时\t172092\n买\t172093\n乱\t172094\ndewnnx\t172095\n乳\t172096\n伍贞\t172097\n淡望\t172098\nqwowowwow\t172099\n习\t172100\n乡\t172101\n
乜\t172102\n九\t172103\n乞\t172104\n也\t172105\n乘\t172106\n东东孙雅王\t172107\n乚\t172108\n乛\t172109\n乔\t172110\n住从\t172111\n乖\t172112\n学干嘛\t172113\n乐\t172114\n乒\t172115\n乓\t172116\n乌\t172117\n乍\t172118\n乎\t172119\n乏\t172120\n么\t172121\n义\t172122\n哦九\t172123\n之\t172124\n久\t172125\n乆\t172126\n乇\t172127\n乀\t172128\n肏肏肏肏肏\t172129\n乂\t172130\n乃\t172131\n富商\t172132\n１３９９４３６２８６２\t172133\n保护袋\t172134\nKyleross\t172135\n小门儿都秘\t172136\n纯心\t172137\n荷兰天堂书店\t172138\n喔酱紫\t172139\n伸进\t172140\n度君\t172141\n泽逸\t172142\ntvudgh\t172143\n本次\t172144\n每一抹\t172145\n卷帘门\t172146\n阿京城\t172147\n润润\t172148\n度吖\t172149\n度秘你好四百死板\t172150\n恒久\t172151\n哎呦度\t172152\n小蓝脱保我\t172153\n扭腰\t172154\n55亿\t172155\n自见\t172156\n我求你了你\t172157\n没我脸皮厚\t172158\n部署\t172159\n第地\t172160\n自觉\t172161\nNew1song\t172162\ncbscd\t172163\n兰州军区女少校\t172164\ndhdu\t172165\n42一个\t172166\n惠及\t172167\n不能吃\t172168\n那等你回首再说再见\t172169\n省委\t172170\n飞侠\t172171\n倾世嫡女\t172172\n局地\t172173\ntfbous\t172174\n克聪明\t172175\n柳兆星\t172176\n周二周六\t172177\n123567890\t172178\n白一谢谢\t172179\n362920778\t172180\nhowcanIdo\t172181\nnawoyao\t172182\n介样啊屁眼屁眼儿\t172183\nurnksn\t172184\n72%\t172185\n杨四年\t172186\nfcffrom\t172187\n彭桂珍\t172188\nu兔兔图\t172189\n每场\t172190\n1943年\t172191\n727\t172192\n一百教\t172193\n722\t172194\n723\t172195\n720\t172196\n721\t172197\n伪造\t172198\n七十七\t172199\n729\t172200\n全书\t172201\n咸丰帝\t172202\n漫畫\t172203\n按部就班\t172204\n平潭\t172205\n促使\t172206\n秘书度秘\t172207\n昂首\t172208\n全乳\t172209\nNSW\t172210\n【爱\t172211\n即插\t172212\n词行\t172213\n包美玲\t172214\n蒲腿\t172215\n連署\t172216\n整体化\t172217\nhghnbjjj\t172218\n古物\t172219\n穿越火影\t172220\n老虎马\t172221\n最搞笑\t172222\n岳池\t172223\n分站\t172224\n澧\t172225\n东莞尔\t172226\n纱照\t172227\n常规性\t172228\n排气管\t172229\n分立\t172230\n上楼\t172231\nyrowutruegeqi\t172232\n普868\t172233\n顶上装\t172234\n艳星\t172235\n第一二\t172236\n哦凶\t172237\n妇道\t172238\n周永乐\t172239\n嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t172240\n雷比师\t172241\n叠彩\t172242\n艾诺\t172243\nrqk\t172244\n深信不疑\t172245\n开心品\t172246\n为生么\t172247\n房地产税\t172248\n么劲\t172249\n朱程静\t172250\n真屌\t172251\n真屏\t172252\n么务\t172253
\n艳春\t172254\n第一人\t172255\n生母\t172256\n家疗\t172257\n功能型\t172258\n澤\t172259\n450个\t172260\nJGCL\t172261\n别恋\t172262\n度也度\t172263\n7折\t172264\ni5处理器\t172265\n让利\t172266\n告诉你好不好\t172267\njtmdat\t172268\n0543\t172269\n言责\t172270\n二百五零\t172271\n暗龙\t172272\n明道\t172273\n剪指甲\t172274\n飞行器\t172275\n留洋\t172276\n123562652865\t172277\n两分钟以后\t172278\n宾州症索\t172279\n年份\t172280\n言之有理\t172281\n前几天\t172282\n两腮\t172283\n老用户\t172284\n清河公主\t172285\n笨羊羊\t172286\n蜡笔小妹\t172287\n传图\t172288\n好了行\t172289\n两腿\t172290\nbjbjbjbj\t172291\n梁咏琪\t172292\n肥皂哥\t172293\nBYE\t172294\n纳瓦斯\t172295\n抢到\t172296\n年代\t172297\n坡姐\t172298\n558868\t172299\n豆子\t172300\n2二十\t172301\n背井离乡\t172302\n胜场\t172303\n经骨科\t172304\n欧意欧意\t172305\n门俩\t172306\n几平方\t172307\n迅妹\t172308\n点知道\t172309\n亲者\t172310\n永永远远\t172311\n皇妃\t172312\n大角牛丁丁\t172313\nggsss\t172314\n涡阳\t172315\n六百多MB\t172316\n敖东\t172317\n凌晨\t172318\n喊麦\t172319\n思想录\t172320\n湘赣\t172321\n溶血性贫血\t172322\n１９７３年３月９日\t172323\nggssi\t172324\n薛柏龙\t172325\n工整\t172326\n铁鸟\t172327\n看见面\t172328\n到不了\t172329\n555555555555555555555555555555555555555555555555555555\t172330\n张慧雯\t172331\n险境\t172332\n砖墙\t172333\n电脑哥好寂寞啊哥好寂寞\t172334\n孟玄朗\t172335\n凌晗\t172336\n哪里拉\t172337\n牟钟鉴\t172338\n尤思源\t172339\n泰拳人大\t172340\n度一秘\t172341\n335ys\t172342\n7886\t172343\n谢尔号\t172344\n大窝\t172345\n夜不归宿\t172346\n夏炎\t172347\nLOOKme\t172348\n珞珞珞珞珞珞珞珞珞珞珞珞珞珞\t172349\n嘱托\t172350\n眠睛\t172351\n谢你\t172352\n41y10\t172353\n脚气\t172354\nJukklllkgpp0kok00\t172355\n钻牛角尖\t172356\n拜拜间\t172357\n好久没\t172358\n爱的宜言\t172359\n1980年\t172360\n数学期\t172361\n郭亮\t172362\nhfufh\t172363\n175岁\t172364\n快乐来了来了\t172365\n凶神恶煞\t172366\n五万个\t172367\nhassaveup\t172368\ntdrdropcoc1\t172369\n自打\t172370\njsien\t172371\n煎饺\t172372\n创意\t172373\n报上名\t172374\n煎饼\t172375\n123456759\t172376\n茶色\t172377\n土鸡蛋\t172378\n美好美\t172379\n燕还有三\t172380\nvggdhdjfj\t172381\n天台山\t172382\n讲喃\t172383\n劲极\t172384\nxbksb\t172385\n电影小达人网\t172386\n什M\t172387\n好啦我心聊天\t172388\n几篇\t172389\n18122653132\t172390\n可口可乐\t172391\n被的们\t172392\n58555865235\t172393\nspacefights\t172
394\n县纪委\t172395\nnate\t172396\n余荣珍\t172397\nm君\t172398\n度秘蜜度秘\t172399\n太岳西\t172400\n降魔记\t172401\n彻炉\t172402\n6元\t172403\nDtg\t172404\n小聚\t172405\n老板椅\t172406\n盛的春\t172407\n狗bd\t172408\n9月29日\t172409\n门卫室\t172410\n淫果\t172411\n厄来\t172412\njdjjehhhxhxhs\t172413\n元代\t172414\n两江\t172415\n不要脸你好意思\t172416\n一加一\t172417\n一撸\t172418\n往上面\t172419\n简单一样\t172420\n元件\t172421\n三分之一一趟\t172422\n筷子\t172423\n人工台\t172424\n倾诉\t172425\n糗事\t172426\n小聪\t172427\n两汉\t172428\n小孤单\t172429\n降价\t172430\n不需\t172431\n手鼓\t172432\n度秘度秘我喜欢你我想了吧\t172433\n下药\t172434\n83888889\t172435\n福科\t172436\n万籁\t172437\n18488888\t172438\n香醋\t172439\n李语音\t172440\n三片\t172441\n三九二零幺幺零二四三\t172442\n丢开\t172443\n寒山寺\t172444\n011年\t172445\n六亿万光年\t172446\n爱信不信\t172447\n呜哼\t172448\n何欣宜\t172449\n三牙\t172450\n29BT\t172451\n13994776661\t172452\n轻柔\t172453\n二十天\t172454\n异教神\t172455\n双浇鱼头\t172456\n杨处和\t172457\n液氧\t172458\n普通版\t172459\n除与\t172460\n除上\t172461\n長告\t172462\n外发\t172463\nAccidents\t172464\n哈幽默\t172465\n马知门\t172466\n我很喜欢你我希望\t172467\n2131999\t172468\n邢静涵\t172469\n酒恰马屁\t172470\n不堪生活\t172471\n芹菜猪肉\t172472\n二十多\t172473\nvoglh\t172474\n三十五二十五十五\t172475\n剃度蜜\t172476\n惦唔\t172477\n手錶\t172478\n包惠妍\t172479\n131095653\t172480\n马常阳\t172481\n活該\t172482\n明码\t172483\n刚毅\t172484\n汤水\t172485\n姐妹\t172486\n活动部\t172487\n刘桥漪\t172488\n银鱼\t172489\n看上传\t172490\n五月十号\t172491\ngfjf\t172492\n冻分割肉\t172493\nbjbjkk\t172494\nL2米\t172495\n125125\t172496\n湖南卫视跨年演唱会\t172497\n爱听歌\t172498\n谭云\t172499\n2x890\t172500\n理你了安\t172501\n35.5元\t172502\n闻美红\t172503\nrachel\t172504\n张美女\t172505\n陈克明\t172506\n济源\t172507\n牛嘉福\t172508\n一撒\t172509\nwasssioshi\t172510\n可靠\t172511\n上阵\t172512\n有辞\t172513\n听听\t172514\n桃花涧\t172515\n翰翰\t172516\n张沧丽\t172517\nstwthh\t172518\n给与\t172519\n机械音\t172520\n鲁智深\t172521\n仁者见仁\t172522\n资产阶级\t172523\n大中型\t172524\n老子说你不要脸的不要脸不要脸\t172525\n美女约\t172526\n大样儿\t172527\n我喜欢在背\t172528\n火光\t172529\nculaco\t172530\n跑题\t172531\n诗圣\t172532\n巨峰\t172533\n南女通吃\t172534\n奥古斯都\t172535\n交上费\t172536\n服众\t172537\n挺好的\t172538\n普尼\t172539\nczccg\t172540\npus
sies\t172541\n微量元素\t172542\nc堡\t172543\n4o1\t172544\nMAC\t172545\n呢聊\t172546\n晚五点\t172547\nlilyoundouly\t172548\n4585867243\t172549\n繁重\t172550\n杨欣淳\t172551\n俩亲个\t172552\n已殇王\t172553\n剪指甲唉\t172554\n延彤\t172555\n非人物\t172556\n俞肾俞\t172557\n2005年9月15日\t172558\n6月30\t172559\n度秘度秘2B2B\t172560\n工作者\t172561\npoppy\t172562\n保价\t172563\n偏黄\t172564\n一般般钱\t172565\n10ccm\t172566\ndyfx\t172567\n总公司\t172568\n玉珠贤\t172569\n3道\t172570\n806799\t172571\n谁和你\t172572\n破不张\t172573\nstarts\t172574\nNjjnGoi\t172575\n百公里\t172576\nTilda\t172577\nMAO\t172578\n来来\t172579\niPhone5se\t172580\n3遍\t172581\n梅花鹿\t172582\n横暴\t172583\n我的还剩零天你的海利版\t172584\n張\t172585\n翁周艳\t172586\n能不能送给我呀求求你了我很想你呀你亲一个来\t172587\n家常排骨\t172588\n张紫阳\t172589\n王琰\t172590\n乖啦\t172591\n列开心\t172592\n塞满\t172593\n细丝\t172594\n三四三四五二十一三十一三十六\t172595\n交火\t172596\n刘依晨\t172597\n晓得\t172598\n213748\t172599\n耶真棒\t172600\n一张卷\t172601\n#航空摄影#\t172602\n中医院洋县医院\t172603\n扑克牌\t172604\n今年起\t172605\n真的假的别骗我\t172606\n你好过分\t172607\n22331410514\t172608\n凯米\t172609\n理睬\t172610\n东西度秘\t172611\n30多\t172612\n传森的之看123456\t172613\n徐佳柔\t172614\n144元\t172615\n李爷爷\t172616\nQnQQ\t172617\n丑鬼\t172618\n雍主\t172619\n步步紧逼\t172620\n百世修\t172621\n报社\t172622\n杂草\t172623\nhagd\t172624\n不是我不查\t172625\n风冰冰\t172626\n百家姓白版\t172627\n容克\t172628\n守夜\t172629\n罗家福\t172630\njgbgjjgkg\t172631\n快快快2924s\t172632\n二十五分钟\t172633\n离哄\t172634\n莫欺\t172635\n高考不好\t172636\n轻轻地\t172637\n句意\t172638\n4415225564561\t172639\n佣兵团\t172640\n批斗\t172641\n老丑老丑\t172642\nyhhf\t172643\n芳踪\t172644\n小房间\t172645\n3333334684542199763455117646346486445611849524\t172646\ninuhhhi\t172647\n5666666669666621111\t172648\nyhhh\t172649\nhenuish\t172650\n韩凤拥\t172651\nassassiah\t172652\nabcdf\t172653\n橙子豪城子豪\t172654\n昨晚八点\t172655\n度于\t172656\n老虎个狮子的熊猫\t172657\n陈美琴\t172658\n茄子\t172659\n茄孑\t172660\n好多嘛嘛嘛嘛嘛\t172661\n一起玩游戏\t172662\nchest\t172663\n牙不疼\t172664\n刘玉玲\t172665\nciLqe\t172666\nqwertyuiopasdfghjklzxcvbnmqwertyuyfgcfhbxgffzfgcgdxCzffxcdffrgvcjbvbvnn\t172667\n桌面我的特猫\t172668\n算式\t172669\n王贞贞\t172670\n九月四\t172671\n鬼我不打你\
t172672\n不要记住\t172673\n亏你度秘\t172674\n保护好\t172675\nfstobxeh\t172676\n辩论会\t172677\n云图TV\t172678\n雪狼\t172679\n真心先生\t172680\n四季花城\t172681\n解密止\t172682\n练就会\t172683\n慢腾腾\t172684\n宏观经济政策\t172685\n生吃\t172686\n军装扮\t172687\n神伤\t172688\n何其\t172689\n娇喘\t172690\n诺坎普\t172691\n何兰\t172692\n保护套\t172693\n飞飞信\t172694\n勾码\t172695\n800块\t172696\n敲心沉睡\t172697\n淘宝充\t172698\n摇一摇背单词\t172699\n我是问你好在哪儿见\t172700\n四奇与大什么\t172701\n作美\t172702\n林耿涛\t172703\n班他们天\t172704\n冉唱\t172705\n干刀\t172706\n奔驰中心\t172707\n波点\t172708\n浴巾\t172709\n信不留克\t172710\n量场\t172711\ncokour\t172712\n聪明才\t172713\n白平平\t172714\n吸毒\t172715\n0030188\t172716\n盖通\t172717\n张广志\t172718\n逆后记\t172719\n原液\t172720\n大小包\t172721\n先哭\t172722\n8月中旬\t172723\n高先生\t172724\n风扇\t172725\n多好多\t172726\n4848484848484838383838383838\t172727\n花阳片\t172728\n平角短裤\t172729\n放牧\t172730\n诗书\t172731\n大伯子\t172732\n天天澳新\t172733\n仙魂儿\t172734\n输有\t172735\nlishi\t172736\n河北省政协\t172737\n美帝\t172738\n二二幺六\t172739\nwhd\t172740\nhsjzndj\t172741\n牌型\t172742\n黑莓闹钟眯有能说是萌娃\t172743\nn年\t172744\ny811\t172745\n鄙弃\t172746\n上百根\t172747\n芭莎艺术\t172748\npadoo\t172749\n律师证\t172750\n262627227\t172751\n高中时\t172752\n用量单\t172753\n来呀不灭\t172754\n0000000000006666666666666\t172755\niopon\t172756\nboujst\t172757\n567297223\t172758\n张达晓\t172759\n变道\t172760\n调调\t172761\n不用多说\t172762\n为你为你\t172763\n烦恼无尽\t172764\n唔系11111c5c5c5cceezzzzzzzzzzzzzz\t172765\n心情糟\t172766\n赵辉\t172767\n起借\t172768\n秘菠萝\t172769\n新上市\t172770\n发军网\t172771\neotu\t172772\n布蒂\t172773\n赵辛\t172774\n卓然\t172775\n这是故\t172776\n1953年\t172777\n赵林伟\t172778\n妹妹呢那那那多\t172779\n土家族\t172780\n选题\t172781\n抱意思\t172782\nYouknowIsay\t172783\n我讨厌你你不爱我知道我也不爱\t172784\n动词\t172785\n那妳\t172786\n鱼虾\t172787\n徐弘毅\t172788\n十六十位\t172789\nnnjiijn\t172790\n快乐酷宝二\t172791\nFhhdjxjo\t172792\n秘万岁\t172793\n那妮\t172794\n别百度\t172795\n尾猫\t172796\n琅世君\t172797\n潜力股\t172798\n不差钱\t172799\n31976\t172800\n兴海\t172801\n书迷\t172802\n野人山\t172803\n揠苗\t172804\n20号晚\t172805\n水光针\t172806\n8888888888888888888888888888888888888\t172807\n长斤\t172808\n一衣年年\t172809\n令其节\t172810\n住建部\t1
72811\n发慌\t172812\n酒疯\t172813\n白米\t172814\nwosnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnngnnnonnngnnng\t172815\nhttphhiphotosbaiducomxiaodupicitemb3fb43166d224f4a3f5055660ef790529922d1fdjpg\t172816\n那你行吗觉\t172817\n楚雅\t172818\n楚雄\t172819\n磨秘\t172820\ndgtwagdap\t172821\n难受我很难受\t172822\n大家好我叫\t172823\nthree\t172824\n同龙\t172825\n澈\t172826\n同龄\t172827\n跑男兄弟\t172828\n颜色关\t172829\n巡洋舰\t172830\n秘一密\t172831\n尹传炬\t172832\n四速\t172833\n2012年5月21日\t172834\n幼师\t172835\n晨吐\t172836\n把握\t172837\nljvljvlhljlhfchjlphpiljpjg\t172838\nplmnjkpoi\t172839\n相乘\t172840\n郑汪伦\t172841\njghhff\t172842\n萝莉\t172843\n茨维塔耶娃\t172844\nWestin酒店\t172845\n明天九\t172846\n菜名\t172847\nslz\t172848\n向东我是你房东有种猪猪都女巫\t172849\n张美丽\t172850\n哼哼讨厌\t172851\n明天乐\t172852\n满口\t172853\n八个月\t172854\nipadair2\t172855\n满台\t172856\nttccbgswrgnnjhhhhhh\t172857\n讯录\t172858\n嗯早班\t172859\nhachightnt\t172860\n亲爱的宝贝我爱你\t172861\nTomouse\t172862\n一耶\t172863\n太北方\t172864\n拉图\t172865\n我是我喜欢日生\t172866\n别瞎说好不好\t172867\n瑞你好运大一圈\t172868\n黄崖关长城\t172869\ngiflv\t172870\n不过不够不够不够不够不够不够不够不够不够不够抱不抱着不抱不报抱不抱不不不不不不\t172871\n德治国\t172872\n拉回\t172873\n嗯一言\t172874\nghhhwghhghhuhvgbhhhbbhhbfhgggufhghjfjhhtfjtgdhfjhfghhfjjjjjgg\t172875\n国际标准\t172876\n韩语耶\t172877\n一考\t172878\n持续性\t172879\n一老\t172880\n信息部\t172881\n小当然\t172882\n日常工作用\t172883\n音乐说好的幸福\t172884\n想会\t172885\n十四天\t172886\n黑豆米\t172887\nз╰皿dkggfyg\t172888\n哈哈哈哈哈\t172889\n一斤斤\t172890\nHome\t172891\n孙秋爽\t172892\n二百零二块\t172893\nCdf\t172894\n人家艺校[偷笑]贵州大学艺术学院\t172895\n一南骏\t172896\n不想话\t172897\n雷柏\t172898\n歌迷枪\t172899\n二四三幺\t172900\n五识\t172901\n13点50\t172902\nCds\t172903\n谈恋\t172904\n等压\t172905\n十四多\t172906\n曾猪狡辩\t172907\n郑呵呵\t172908\n接线\t172909\n15959315597\t172910\n梁恩庆\t172911\nrifloorume\t172912\n派别\t172913\n病友\t172914\n机密\t172915\nst3151a\t172916\n赵煜航\t172917\nhfxhu\t172918\n桃子会\t172919\n极点点\t172920\n病变\t172921\nhgguu\t172922\n55张\t172923\n颜如玉丽萍露\t172924\n哎呦我去我我呵\t172925\n病号\t172926\n病史\t172927\n隔夜\t172928\n言传身教\t172929\n李胜\t172930\n王小猫\t172931\n惶惑不安\t172932\n５百多万辆\t172933\njianggushi\t172934\
n永城五中\t172935\n看唱歌\t172936\n孟博\t172937\n一个再一个\t172938\n行星饭\t172939\n横铁\t172940\n温血\t172941\ncoststhispe\t172942\n晚安吻\t172943\n阿弥陀佛阿弥陀佛阿弥陀佛\t172944\n姜可姐\t172945\n寒芒\t172946\n饲粮\t172947\n再想你\t172948\n幺零幺九幺\t172949\n豆哥\t172950\n青虫\t172951\n温补\t172952\n三个字\t172953\n川军团\t172954\n大连世博广场\t172955\n口角\t172956\n于波\t172957\n汉子颜\t172958\n炖炖豆腐\t172959\n开删\t172960\n熙\t172961\n熟\t172962\n湘菜\t172963\n巴勒莫\t172964\n雷转述句\t172965\n直行\t172966\n生活习惯\t172967\n藦羱\t172968\n熏\t172969\n屋顶上\t172970\n拿手\t172971\n201622\t172972\n董博嫣\t172973\n共渡难关\t172974\n熄\t172975\nyouhiapapastatouyou\t172976\n表白\t172977\n一丘\t172978\n候菲菲\t172979\n广播室\t172980\n熱\t172981\n女人的手\t172982\n怂怂\t172983\n熵\t172984\n君少\t172985\n被俘\t172986\n跪域\t172987\n沿海我猜测我呀我的错秘\t172988\n喝沒有你陪伴我真的好孤单\t172989\n撸撸撸撸撸\t172990\nBROWN\t172991\n已知道\t172992\n話\t172993\n該\t172994\n有诚心不有\t172995\n用得\t172996\n詹\t172997\n天一个人\t172998\n真特么神经吧\t172999\n最佳运\t173000\n模组\t173001\nColes\t173002\n做学\t173003\n詩\t173004\n詪\t173005\nxuhf\t173006\nYouaredog\t173007\n开到\t173008\n们儿\t173009\n吴樾\t173010\n力匕\t173011\n萌感\t173012\n削减\t173013\n授受不亲\t173014\n詞\t173015\nRrtyuioo\t173016\n保持\t173017\n太太太\t173018\nnichkh\t173019\n抬价\t173020\ncommit\t173021\n去过夜\t173022\n鉴于\t173023\n砍脑阔\t173024\n2000万\t173025\n碧辉园一区\t173026\n菜籽\t173027\n精壮\t173028\n陵园\t173029\nchdrcc\t173030\nFaceb\t173031\n王佳航\t173032\n宁愿相信\t173033\n二戒\t173034\n火车船\t173035\n牡蛎\t173036\n二战\t173037\n胸肌\t173038\n胸肉\t173039\n劳燕英诺\t173040\nquerys\t173041\n汉堡包果\t173042\n3月12日-3月31日\t173043\n雨天锹\t173044\n腾讯博客\t173045\n张可颐\t173046\n泡田\t173047\n习近\t173048\n椅碑太找行讓麽個個\t173049\n姚梦洁\t173050\n二房\t173051\n私立学校\t173052\n孟令\t173053\n一举\t173054\n摧残飞碟藏有种\t173055\nDNF\t173056\nDNA\t173057\n豫津\t173058\n25分钟\t173059\nDNN\t173060\n加龙省\t173061\n便说\t173062\n木有敏\t173063\n日夜兼程\t173064\n229\t173065\n228\t173066\n227\t173067\n225\t173068\n224\t173069\n223\t173070\n222\t173071\n221\t173072\n220\t173073\n了别生气\t173074\n嗯瓮歌\t173075\n商会\t173076\n抓子\t173077\n太空上\t173078\njrsf\t173079\n几个子\t173080\n137675567959554658435626567346468624558
57766\t173081\n农美妞\t173082\n尽致\t173083\n4595264619194346\t173084\n暗天黑\t173085\n死神的十字路口\t173086\n卖我想\t173087\n诗酒\t173088\n亚s\t173089\n袁红玲\t173090\nNewIpad#\t173091\n数贸\t173092\n8月5日\t173093\n3.0\t173094\n给你们\t173095\n休闲系\t173096\n心恶心\t173097\n走题\t173098\n豆面我小天\t173099\n排脚\t173100\ntost\t173101\n熊师傅\t173102\n田博飞\t173103\n上午10时\t173104\n第四次\t173105\n黎敏芝\t173106\n短线\t173107\n12222222555788663655\t173108\n我的最愛\t173109\n不行不\t173110\n另付钱\t173111\n厕所味\t173112\n镇静相信\t173113\n精英\t173114\n查理罗\t173115\n齐国\t173116\n原谅\t173117\n582528\t173118\n岁月静好\t173119\n一中\t173120\n气泡袋\t173121\n选票\t173122\n摸着我\t173123\n2812点\t173124\n头干嘛\t173125\n问一样子\t173126\n3330000006666666\t173127\nguihua\t173128\n梁艳珊\t173129\n针织衫\t173130\n广安大街\t173131\n黄钻\t173132\n边卡\t173133\n终年\t173134\n阿度秘\t173135\n餐单\t173136\n贾岐诺\t173137\n本宫吧美食城\t173138\n一个人处\t173139\n河北生活广播\t173140\n行换\t173141\n乐声音变\t173142\n戴佩妮\t173143\n狗洞\t173144\n欧冠冠\t173145\n今天七号\t173146\n马洪涛\t173147\na罩杯\t173148\n呼棒\t173149\n溏心风暴\t173150\nlamgirl\t173151\n森林深处\t173152\n谢谢你没\t173153\n框支剪力墙\t173154\n一起一落\t173155\n北京振远护卫中心\t173156\n快帅帅\t173157\nqiiag\t173158\n48集\t173159\nIPTV\t173160\n自蓟\t173161\n大老鼠\t173162\n还不对\t173163\n大秦铁路\t173164\n粗付非常gugugghi刚刚减肥药大姨夫gigjgug一亿个顾gigifuchhogigig\t173165\n丿wev\t173166\n中国少先队\t173167\n英政\t173168\nfjvjgu\t173169\n打青\t173170\n意心情爽\t173171\n追你好\t173172\n赵俊霞\t173173\n赛音山达\t173174\n1点51\t173175\n关志豪\t173176\n平底锅\t173177\n二不天天\t173178\n偷钱\t173179\n李馆感\t173180\n有偿服务\t173181\n1696414071\t173182\n福建\t173183\n玩味\t173184\ndi4\t173185\n玩命\t173186\n2555563352263889996\t173187\n了你爱我你很爱\t173188\n偶读\t173189\nFIX\t173190\n倒霉\t173191\n阿sue\t173192\n找收拾\t173193\n万多\t173194\nFIH\t173195\n6月13日\t173196\n侧门\t173197\n丽娅娅\t173198\n婚恋\t173199\ndin\t173200\ndio\t173201\ndih\t173202\ndii\t173203\n一起娇\t173204\ndid\t173205\ndie\t173206\ndif\t173207\ndig\t173208\ndia\t173209\n057717585566888888888888587569635754428585887588\t173210\n挖掘\t173211\ndix\t173212\ndiy\t173213\n覃融蓉\t173214\n天四冻\t173215\ndiu\t173216\n万头\t173217\n凌晨4点\t173218\ndir\t173
219\ndis\t173220\nwaystowork\t173221\n丹药\t173222\n四那个\t173223\n水熊虫\t173224\n4.6%\t173225\n出走\t173226\n太甚\t173227\n北京中视朗域纪录片文化传播中心\t173228\n心理性\t173229\n睡想睡\t173230\n经济仓\t173231\n辩愈糊涂\t173232\nrrfhnvg\t173233\n三江\t173234\n连累\t173235\n三百万三百万\t173236\n动度\t173237\n草莓味\t173238\n春潮\t173239\n铁血灬\t173240\n子怡\t173241\n吾识\t173242\n3月24日下午六点\t173243\n第一周\t173244\nremonths\t173245\n银魂赛\t173246\n挨字\t173247\n兵败\t173248\n轻盈\t173249\njujm\t173250\n拟订\t173251\n预编\t173252\n翔犬\t173253\n新百\t173254\n一至两分钟\t173255\n新白\t173256\n穿心\t173257\n2685\t173258\nhhgdfv\t173259\n听听课\t173260\n拜节\t173261\n养们\t173262\n答非所问\t173263\n1ghool\t173264\nshenm\t173265\n天平男\t173266\n鬼战\t173267\n9abxv\t173268\n和坤\t173269\n妈妈爸爸\t173270\n鬼我\t173271\npdtg\t173272\n电吹风\t173273\n亦离\t173274\n543个\t173275\n中青报\t173276\n猪猪猪猪猪\t173277\n十二酒\t173278\n微温\t173279\n庚小胖\t173280\n南太贤\t173281\n毁家纾\t173282\nHdhsja\t173283\n停赛\t173284\n温情脉脉\t173285\nZwhatever\t173286\n闹闹闹闹\t173287\n加减器\t173288\nvbxbzbzbsn\t173289\n大罗加索尔\t173290\n我的闺蜜\t173291\n张安鹏\t173292\n阿7k\t173293\n唔度\t173294\n8.3%\t173295\n子鼠丑牛寅虎卯兔辰龙巳蛇午马未羊深喉q戌狗亥猪游戏\t173296\n爬虫\t173297\n京纪念碑\t173298\n多依林村\t173299\n路考\t173300\n焦躁\t173301\nqqwwwfkkmkla\t173302\nSBVBCV\t173303\n我讨厌你我恨你我\t173304\n宣化街\t173305\n先走一步\t173306\nccucufcyexru\t173307\n二缺一么\t173308\n晚笑点高\t173309\nLunch\t173310\n26岁\t173311\n龚翔翔\t173312\n圣犬帕拉\t173313\n亲笔\t173314\n二分之3b\t173315\n神之国的好不好跑步好节目九一完毕琉\t173316\nsklsl\t173317\n嘉诚兄弟\t173318\n梵天vvu\t173319\n王慧莹\t173320\n不要脸不要脸大毛驴\t173321\n抄作业\t173322\nshuomiweshenne\t173323\n由你\t173324\n枫丹丽舍\t173325\ncysutdgkclygyixiy\t173326\n赵欣梅\t173327\n咱们再来石头剪子布\t173328\n家用水\t173329\n中豪号众号\t173330\n死吨\t173331\n白宙\t173332\n百分钟\t173333\n问题目\t173334\n白宝\t173335\nkeplyjfyjnetmurm\t173336\n离子\t173337\n萌萌哒机器人很美婷在此\t173338\n差二天\t173339\n文物\t173340\n吃出钱\t173341\nfly\t173342\n大家好我是猴子我而去年十一岁\t173343\n掉了我\t173344\n文牒\t173345\n1n11n121n\t173346\neeeeee\t173347\n肥胖子\t173348\n美女网\t173349\n二分之35\t173350\nsoul\t173351\nflh\t173352\n感觉灰\t173353\nsour\t173354\n死名\t173355\nfll\t173356\nsouw\t1733
57\n死后\t173358\n花果山\t173359\nTgyxgucdghguogijguigjdhfgtryheeqfuuebhtv\t173360\nfld\t173361\nflg\t173362\n钟友军\t173363\n油弄\t173364\n清晰\t173365\n塞杜\t173366\n北碚者组\t173367\n娃呆\t173368\n哲琦\t173369\n换挡\t173370\nPaltrow\t173371\n好不好别\t173372\nUofyfyd\t173373\n清晨\t173374\nhnhgnihnjo\t173375\n童音\t173376\n非红\t173377\nmmmnnn\t173378\n女胎\t173379\n木加金什\t173380\n普京\t173381\n小时岁\t173382\n乐体\t173383\n哲理\t173384\n艾利奥\t173385\n靠靠靠靠靠靠\t173386\n一指间\t173387\n挺身而出\t173388\n可爱的话\t173389\n私权\t173390\n厌鬼\t173391\n欣赏\t173392\n唉abxxxx\t173393\n博美丽\t173394\n串串\t173395\n来着\t173396\n瓞绷\t173397\n宝妞\t173398\naisusb\t173399\n惜\t173400\n惟\t173401\nfijcc\t173402\n内容量\t173403\n葡萄堡\t173404\n赶不过去\t173405\n情\t173406\n打游\t173407\n永辉\t173408\n惊\t173409\nBriard\t173410\nh六\t173411\n范爷耶\t173412\n良久\t173413\n想\t173414\n惹\t173415\n还情深深雨濛濛\t173416\n好的你在哪v请你\t173417\n名古屋\t173418\n请君入\t173419\n魔都\t173420\n惠\t173421\n动物行\t173422\n公交\t173423\n二千\t173424\n一四局\t173425\n惧\t173426\n惨\t173427\n不不不！不不不\t173428\n惭\t173429\n惮\t173430\n惯\t173431\n范珂瑞\t173432\n5453\t173433\n周初\t173434\n毛网\t173435\n想通\t173436\n粘乎乎\t173437\n有如有\t173438\n三毛三\t173439\n超级大死猪\t173440\n宠物狗市\t173441\n加伤\t173442\n氟化妆\t173443\n八二九七零零\t173444\n捏碎\t173445\n周刊\t173446\n管苏\t173447\n现出\t173448\n阿猴对虎\t173449\n周到\t173450\n小游\t173451\nintentIntentSK11714776653BBEA57BD690B3543AF3545A53CFB71539A95D8F7D77BDFE247D64E39AB1DA01end\t173452\n小火山\t173453\n看玩\t173454\n经不起\t173455\n敢不知道\t173456\n小港\t173457\n土里埋\t173458\n自成\t173459\n小温\t173460\n现凡\t173461\n林肯\t173462\n娉婷\t173463\n说试试\t173464\n继续天咯咯木刻苦了咯局麻将局麻将咯咯群来咯辣椒咯途径\t173465\n小渡\t173466\n撸炮\t173467\n女孩们\t173468\n13199818029\t173469\n奔准\t173470\n65010319911075520\t173471\n发高\t173472\n7月份\t173473\n小木棒\t173474\n武汉市红安县实验小学\t173475\n9701\t173476\n武媚娘\t173477\n哼不理你了讨厌你\t173478\ntaobao\t173479\n新机\t173480\n别恋恋不舍\t173481\n使命感\t173482\n值得犯\t173483\n告发\t173484\n史文俊\t173485\n红姑\t173486\n红姐\t173487\n燕子\t173488\n六六二二\t173489\n毛静怡\t173490\n海新年快乐\t173491\n发髻\t173492\n红酒管理专业\t173493\n云光\t173494\nｎ年\t173495\n海美\t173496\n诊室\t173497\n调皮鬼度
秘\t173498\nlam\t173499\n红博爱\t173500\ndhiphotosbaiducomxiaodupicitemd62a6059252dd42ac35a54de043b5bb5c9eab870jpg\t173501\nuifia\t173502\n三观\t173503\n洞蜜蜜\t173504\n迷雾\t173505\n15478933288\t173506\nhdhdjfbjxj\t173507\n雅莉\t173508\n名猜\t173509\n煽情\t173510\n噬魂\t173511\n东关小学\t173512\n呦咻咻\t173513\n雅莹\t173514\n拥有者\t173515\nparticuallrily\t173516\n校服厂\t173517\n陈凯先\t173518\n愁莫愁\t173519\n一个4050\t173520\ninengland\t173521\n嘎吱\t173522\n回车\t173523\n7月9日\t173524\n孙燕姿\t173525\n汇源\t173526\n盖亚\t173527\n回转\t173528\n爱家\t173529\n卖完\t173530\n信不信我一刀多了你\t173531\n可卡因\t173532\n郍\t173533\n磨西镇\t173534\n阿拉啦啦啦啦啦啦啦啦宝贝宝贝宝贝\t173535\n因为我的名字\t173536\n3018\t173537\n卖家\t173538\n爱宁\t173539\n950010\t173540\naueco\t173541\n3010\t173542\n瑞雪纷飞\t173543\n陈佳怡\t173544\n爱安\t173545\n十月多\t173546\n无赖\t173547\n你片\t173548\n中姓\t173549\n一个半年\t173550\nJJJJNBLCCJHV\t173551\n19点07分\t173552\nlay\t173553\n泡叔\t173554\n卫士\t173555\n专属\t173556\n其相植物大战僵尸\t173557\nrffdxjkffb\t173558\n太平天国\t173559\n李玹雨\t173560\n橘糯崽\t173561\n1月26号\t173562\n我好帅我好帅我是个小帅帅我好帅我好帅我是时间最帅的人我好帅哇好帅\t173563\n翔天\t173564\n退朝\t173565\n日嘛\t173566\n钉子户\t173567\nihi朴二\t173568\n还要不是\t173569\n谢忆涵\t173570\n27272415516856872696894\t173571\n笑容满面\t173572\n德赛帝伦\t173573\n19号\t173574\n暴乱\t173575\n水瓶女\t173576\n小弟们\t173577\n要命疼\t173578\n小焕\t173579\n岳云鹏\t173580\n哈好\t173581\n王秀菊\t173582\n你看我快乐大本营\t173583\n曾志伟\t173584\n健敏\t173585\n流云\t173586\nyounola\t173587\n四时田园杂兴\t173588\n凤仙花\t173589\n个座\t173590\n个度\t173591\n文债\t173592\n火舞\t173593\n18207646979\t173594\n蜓姐\t173595\n八十分一个\t173596\n53332189\t173597\n电话叫\t173598\n小丽丽\t173599\n早来了\t173600\npjgw\t173601\n环评\t173602\n心醉神迷\t173603\n颈子\t173604\n7777777777777777777777777777777777777777777777\t173605\n寄给\t173606\n快捷家曲曲\t173607\n小樱堡\t173608\n流产\t173609\n电话号\t173610\n京玉\t173611\n这个尾\t173612\n皮娜\t173613\n帅我是女\t173614\n谷棣\t173615\n360小游戏气人\t173616\n早奏\t173617\nWiFI\t173618\n1厘米\t173619\ncearn\t173620\n弱手\t173621\njbjhlbbo\t173622\n过母\t173623\n广西南宁民族大道华联超市\t173624\n靓汤\t173625\n一般人家\t173626\n开涮\t173627\n13072379214\t173628\n袁国涛\t173629\n沈哲轩\t1736
30\n梅子粉\t173631\n天克\t173632\n猜猜呢你好好哦我我哦我是一条蛇\t173633\n王明台\t173634\n挨摔\t173635\nWiFi\t173636\n柴君磊\t173637\n23670块\t173638\n四罩\t173639\n大昂\t173640\n顾得\t173641\n小博士幼儿园\t173642\nChigger\t173643\n阴狠\t173644\n早好\t173645\n2．50\t173646\n多莉\t173647\n鸭血\t173648\n臭美死\t173649\n大家的捧场``[热吻\t173650\n育坤\t173651\n陂nn┏\t173652\n老挝\t173653\n着落\t173654\n危险期\t173655\niwq\t173656\n竖心儿\t173657\n拐弯处\t173658\n放宽容\t173659\nfjvhhbbvvj\t173660\nkesan\t173661\n扭完\t173662\n5千万\t173663\n暴乖\t173664\n马喜欢\t173665\n131341\t173666\n性别歧视\t173667\nKahlenberg\t173668\n田欣雅\t173669\n老了丑\t173670\n20120810\t173671\n一一日\t173672\n靠不理\t173673\n侵权案\t173674\n魇品\t173675\n我讨厌你我讨厌你我讨厌你g\t173676\n邓楚涵\t173677\n说走\t173678\nffudkrsbf\t173679\nddfdfsfttff\t173680\n抽样调查\t173681\n呗魔仙小游\t173682\n保险柜\t173683\n宜春\t173684\n环游\t173685\n够了别\t173686\n乔收费\t173687\n聂耳赞\t173688\n南海气象台\t173689\n宜昌\t173690\n吉凶\t173691\n阿布也\t173692\n宁儿\t173693\n农具厂\t173694\n年臭\t173695\n机器娃娃\t173696\n大官人\t173697\n直驱\t173698\n东京热\t173699\n12月11日\t173700\n1点23分\t173701\nuGF\t173702\n书情\t173703\n寂寞感\t173704\n定行\t173705\n来说出来\t173706\n防辐射\t173707\n课户\t173708\n晕晕晕晕晕晕晕晕晕晕死\t173709\n韶山南路\t173710\nuhbbbnm\t173711\nuGV\t173712\n夏奇拉\t173713\n帅度\t173714\niuok\t173715\n南平水泥股份有限公司\t173716\n赵桐\t173717\n自己的房间\t173718\n深入\t173719\n9oc\t173720\n临澧\t173721\n夏友善\t173722\n卡拉拉拉\t173723\n格里菲斯\t173724\n熊出没之雪影归来\t173725\n有恶搞\t173726\n无可不发\t173727\n钱青海\t173728\n滴诺\t173729\n初逢\t173730\n茬子\t173731\n收录\t173732\n宋思慧\t173733\n朱逸群\t173734\n很无很好\t173735\n争身\t173736\n羊城\t173737\n吴雪莲\t173738\n票贩\t173739\n三三\t173740\n三丈\t173741\n三下\t173742\n吴梦园\t173743\n一五得五二\t173744\n三万\t173745\n350个\t173746\n三一\t173747\n三七\t173748\n倒桩\t173749\n宋宝柱\t173750\n一隅\t173751\n入库\t173752\n陈亚生\t173753\n三串\t173754\n三中\t173755\n明处\t173756\n平安健康\t173757\n三丫\t173758\n三个\t173759\n三严\t173760\n三两\t173761\n氟化茶\t173762\n五十一\t173763\n有一天天\t173764\ndrry\t173765\n365855688\t173766\n完了擦\t173767\n希尔\t173768\n君妍\t173769\ndrrr\t173770\n摇晃\t173771\n三峡之秋\t173772\n真的爱听\t173773\nFhfxhfhf\t173774\n看度\t173775\n查阅\t173776\n安达卢西亚\t1737
77\nw蘑菇\t173778\n刀片\t173779\nvyubux\t173780\n笑话我是你的主人的妹妹\t173781\n毫爿\t173782\n差开\t173783\n差异\t173784\nhttpehiphotosbaiducomxiaodupicitem728da9773912b31bb432065d8118367adab4e16cjpg\t173785\n护手霜\t173786\n花牌\t173787\n稿源\t173788\n断章取义\t173789\n要不你说\t173790\n被结婚时代\t173791\n吖酷\t173792\n真不懂我你还叫我做你的主人\t173793\n朱梓涛\t173794\n977QQ号\t173795\n个税起征点\t173796\n友们\t173797\n齐鲁银行\t173798\n噩噩噩噩\t173799\n谢了还真是我的小秘书\t173800\n偶吧江南\t173801\n1000万不要我还小要再遇你\t173802\n厕所有和你老婆\t173803\n拷贝\t173804\n葡萄葡萄葡萄\t173805\n球迷球\t173806\n多姿多彩\t173807\n卡卡格格\t173808\n不有朋友\t173809\n想你的夺命\t173810\n28日\t173811\n182183\t173812\n唯心主义信\t173813\n七哈\t173814\n我不我不我不反映\t173815\n陈雅琪\t173816\n展说\t173817\n吉林省西部\t173818\n美文\t173819\n陈如\t173820\n六道\t173821\n在长\t173822\ndonotspeakChineEnglish\t173823\n陈妍\t173824\n麦当劳oooo\t173825\nvnkg\t173826\n应用宝\t173827\n一个数四十四十五十八\t173828\n200多天\t173829\n六遍\t173830\n举目瞪口呆\t173831\n美西战争\t173832\n该县\t173833\n2011年底\t173834\n万事头\t173835\n恶心勒\t173836\n幺九八五零六零九零零\t173837\n八条\t173838\n为毛你可以\t173839\n前两天说\t173840\n亚等一等\t173841\n小点点\t173842\n系部\t173843\n一个一百二十八\t173844\nisayyou\t173845\n韵头\t173846\n陈妾\t173847\n美方\t173848\n海世代\t173849\nyojizz\t173850\nLAY\t173851\nhdsiow\t173852\n收拘\t173853\n刀塔西游\t173854\n零幺零幺\t173855\n一起到老\t173856\n肥羊\t173857\n肥美\t173858\n出征\t173859\nmyou\t173860\n5754964\t173861\n玻色子\t173862\n火因臣吾\t173863\n酷6\t173864\n屡次\t173865\n看不穿\t173866\n可能性能\t173867\n做起作业\t173868\n盖房\t173869\n十四米\t173870\n天破\t173871\n收拾\t173872\n歇会儿\t173873\n太迟\t173874\n太远\t173875\n告诉我我真的不知道\t173876\n邱甲荣\t173877\n抗衡\t173878\n好鬼\t173879\n尚有\t173880\n二三十页\t173881\nDale\t173882\n阿沙利\t173883\n来书签儿\t173884\n啊华\t173885\n不碎不碎昆明今典\t173886\n呼救\t173887\n口袋\t173888\n姜继航\t173889\n4m6s\t173890\n待发\t173891\niPhone4S\t173892\n15集\t173893\n灵验\t173894\n错天记\t173895\n5836\t173896\nilyou\t173897\n尚未\t173898\n5832\t173899\n5830\t173900\n嗯熊猫\t173901\n冰淇凌八股\t173902\n绍酒\t173903\nDVDv\t173904\n达达达\t173905\n歌海\t173906\n酷Q\t173907\n网签\t173908\nLOLJoy\t173909\n更有钱\t173910\n翁一诶累\t173911\n速滑\t173912\n元偶\t173913\n泰安\t173914\n锦江花园\t173
915\n在理\t173916\n杨明博\t173917\nArrow\t173918\n棒糖\t173919\n梁朝伟\t173920\n塘下东路\t173921\n六堡茶\t173922\n第九句\t173923\n敏捷\t173924\n真是谁\t173925\n爱你哪你不\t173926\n仙贝仙贝仙贝\t173927\ntreo\t173928\n8849745\t173929\n没当\t173930\n81755677181\t173931\n上海某地产城投公司\t173932\n籍籍\t173933\n一清二楚\t173934\n剥削者\t173935\n72.98亿元\t173936\n千骨姑娘\t173937\n演示\t173938\n3100点\t173939\nf波伊斯\t173940\n资深者\t173941\njsiwojfu\t173942\n多功能厅\t173943\n五五二二幺六五零三六\t173944\nparat\t173945\n第九发\t173946\n自曝术\t173947\n芳心点\t173948\n岔\t173949\n岗\t173950\n岑\t173951\n仙游段\t173952\n不食子\t173953\n岜\t173954\nsfgfycgvxcv\t173955\n岛\t173956\n岀\t173957\n岁\t173958\n蒙顶山茶\t173959\n小老鼠\t173960\n岌\t173961\n好安静\t173962\n大沟门\t173963\n头虱\t173964\n56.53%\t173965\n吆喝\t173966\n说不犯贱\t173967\n灾星\t173968\n高额\t173969\n活在\t173970\n心块\t173971\n不行太早\t173972\n岸\t173973\n嗯小路\t173974\n劳美\t173975\n苏星宇\t173976\nhttpdhiphotosbaiducomxiaodupicitemd009b3de9c82d158ffa402c2870a19d8bc3e429ajpg\t173977\n晚熟\t173978\n岣\t173979\n屁颠\t173980\n岭\t173981\n这边儿\t173982\n摆一摆\t173983\n问候胜\t173984\n球员\t173985\ndfhjoo\t173986\n剪辑\t173987\n实验区\t173988\nKmLolLolknell\t173989\nffrrtf\t173990\n纯天然\t173991\n炒货\t173992\n聋掉\t173993\n杨瑞霞\t173994\n对不来了\t173995\n悍匪\t173996\n王羚霖\t173997\n家堂\t173998\nRX-78-2\t173999\n20160087\t174000\n黑执事\t174001\n白金卡\t174002\n淅淅\t174003\n斗俊sama\t174004\n多金\t174005\nTowneplace\t174006\n拐点\t174007\ngffft\t174008\n不死尊\t174009\n我不相信你是谁口说的\t174010\n二炮\t174011\n紫荆花\t174012\n猛禽\t174013\n郑恺\t174014\n二点\t174015\n洗尽\t174016\ngffff\t174017\n王北玻\t174018\n潘彩芬\t174019\n庸人败\t174020\ngfffc\t174021\n阳春白雪风流债\t174022\nbrueie\t174023\n九九靡\t174024\n举头望明月\t174025\nhahsjfbxhdhsgxvajdhxjhakwodyheidhxysvxyzvdnvifbsodhfshgavzhsdb\t174026\n闺密朱\t174027\n万泰小区\t174028\n不点\t174029\n翁向荣\t174030\nxnv\t174031\nxnx\t174032\nxnz\t174033\n玛莎\t174034\nHIHGJ\t174035\nbeenat\t174036\n赛琳迪翁\t174037\n季片\t174038\n鲁甸\t174039\ntirhuah\t174040\nxnd\t174041\nakayou\t174042\nxnj\t174043\nxnm\t174044\nxnn\t174045\ni秘\t174046\n阿比达尔\t174047\n每行\t174048\n星期天晚上\t174049\n木马木马\t174050\n领火化\t174051\n5895567888
8888\t174052\n安尼哈塞\t174053\n闫明慧\t174054\n一零零八六\t174055\n剑圣柳白\t174056\n形塑\t174057\n成才\t174058\n言归正传我给你说件事\t174059\nhaventai\t174060\nBaidu\t174061\n杨佳\t174062\n中联\t174063\n16万一平方\t174064\n大雁塔\t174065\n三年s\t174066\n进帐\t174067\n那就好那就好我\t174068\n胃疼\t174069\nujdudydhh\t174070\nraatshit\t174071\n建築\t174072\n棺木底\t174073\n全国各省市佛教协会\t174074\npottery\t174075\n够了没\t174076\n41码\t174077\n何思远\t174078\n安允曦\t174079\n晋祠\t174080\n49家\t174081\n加格达奇教音乐\t174082\nu红红7\t174083\n惦记\t174084\n雪梅竞志\t174085\n军警察\t174086\n教不教\t174087\n讨厌书\t174088\n叶子\t174089\n58582\t174090\n嘛达\t174091\n锅盖\t174092\n求私奔\t174093\n礼品\t174094\n小三片\t174095\n池州\t174096\n张维迎\t174097\n直在\t174098\n2005年1月\t174099\n大大小\t174100\n猎猎\t174101\n好胁友\t174102\n漂亮大家\t174103\n葫芦娃三\t174104\ntechnica\t174105\n健身厂\t174106\n布米米\t174107\n饭费\t174108\n吕梁山脉\t174109\n15—17世纪间\t174110\n尼自粽\t174111\n部委\t174112\n易北战\t174113\n二百七十四分钟\t174114\nhyddf\t174115\n卖萌萌\t174116\n中国救援队\t174117\n假的真不了真的假不了是你说的\t174118\n许丰\t174119\n二十六万\t174120\n化妆品税\t174121\n传动赞\t174122\nhHshddhjdfb\t174123\n努力工作\t174124\n博雅菠萝蜜\t174125\n规范性\t174126\n54557888885\t174127\n风月奇潭\t174128\n医术\t174129\n唱一首歌有\t174130\n前2个月\t174131\n度日如\t174132\n起泡\t174133\nhihbj\t174134\n黄群\t174135\n双头燕\t174136\n宫瑜\t174137\n18688394597\t174138\n恐婚男\t174139\na米7kmkklaaloktangui\t174140\n佩佐尼\t174141\n三89\t174142\n蜜桃\t174143\n幺二零幺零五一九九一零八幺五四二幺九\t174144\n猪呀四久久995999\t174145\ndjgjcc\t174146\n金睛\t174147\n好不爽\t174148\n心潮澎湃\t174149\n左岸片\t174150\n666233\t174151\n斗鸡场\t174152\nhrrf\t174153\n对面对面对面对面\t174154\n男男人\t174155\n红绿灯\t174156\n三月初\t174157\nzqs\t174158\n国民政府纪念馆\t174159\n神经病\t174160\n16块\t174161\n美发师\t174162\n海平ue\t174163\n虚荣大好\t174164\n骨质疏松\t174165\n11第一名\t174166\n2563km\t174167\nQQ网博美\t174168\n三部\t174169\n丑丑丑丑\t174170\n城山\t174171\n唔系噶\t174172\n秘柏\t174173\n针完\t174174\n老婆奴\t174175\n超萌萌哒\t174176\n面食\t174177\n每条\t174178\n00000000000\t174179\n10356\t174180\n璃沙\t174181\n1234655785533412245555525632583\t174182\n史记\t174183\n三郎\t174184\n王子\t174185\n飞影\t174186\n实在人\t174187\n明天早上5点\t174188\n咨询\t174189\n张九十\t174
190\n遭罪\t174191\n^\t174192\n一般般\t174193\n13956547823\t174194\nsound\t174195\nJID\t174196\n滑行\t174197\n玹雨\t174198\n福州大学\t174199\n你的家在哪里我的家在狠狠渭水汉丽轩未水落汗丽宣\t174200\ndcv\t174201\n水力气\t174202\n句式\t174203\n138256228228\t174204\n乐清电力局\t174205\nnopqst\t174206\n异世机器人\t174207\n成就奖\t174208\n只不过\t174209\n一密一下\t174210\nhyf\t174211\nhyg\t174212\nhyh\t174213\n九招\t174214\n神探主\t174215\n相亲相见\t174216\n萝卜冬\t174217\n联通\t174218\nhyr\t174219\n章金莱\t174220\nhyt\t174221\nhyu\t174222\nhyv\t174223\nhyw\t174224\nhyy\t174225\n不要再聊\t174226\n清静\t174227\n裝瘋\t174228\njhhhkvhg\t174229\n麦秸\t174230\n迷魂\t174231\nhihi撸一\t174232\n郁美净\t174233\n四万多位\t174234\nTFbos\t174235\nTFboy\t174236\n〒▽〒\t174237\nsmore\t174238\n我秘你\t174239\n变相向\t174240\n5414545545454545\t174241\nvxhjxxgz\t174242\ndhiphotosbaiducomxiaodupicitemf\t174243\n大幂美\t174244\n麦种\t174245\ndhiphotosbaiducomxiaodupicitema\t174246\n草地下山岗\t174247\nlkjux\t174248\n上窗\t174249\n1103\t174250\n金毛\t174251\nntr\t174252\n拜稈\t174253\n蜜痕\t174254\n明天7：00\t174255\n封闭\t174256\n李博林\t174257\n封门\t174258\n061个\t174259\n逗人\t174260\n韩国队\t174261\n可而且\t174262\n青楼\t174263\n翟磊\t174264\n刑诉法\t174265\ndhiphotosbaiducomxiaodupicitem5\t174266\n娱乐圈儿\t174267\n信不话\t174268\n南雄市\t174269\n五那\t174270\n胡人赫\t174271\ndhiphotosbaiducomxiaodupicitem9\t174272\nkolol\t174273\nwhattaouto\t174274\n比有二\t174275\n大姨妈来了敢喝菊花茶\t174276\n大锅堡\t174277\n海洋公园\t174278\n售卖\t174279\n夜景\t174280\nu6件\t174281\n柳天\t174282\n刁屋\t174283\n难过开心\t174284\n陈205有么\t174285\n鹌鹑蛋\t174286\n怎嘛样\t174287\n小粉\t174288\n粉笔\t174289\n门帘\t174290\n最最最最最最最最最最最最最最最最\t174291\n一三块\t174292\n门市\t174293\n小粒\t174294\n谁动了我的琴弦\t174295\n三圈儿\t174296\n骚浪\t174297\n当是\t174298\nOUMygad\t174299\n错字\t174300\n炫迈\t174301\n婷婷狼\t174302\n壹fg\t174303\n首套房\t174304\n巩庆\t174305\n注水\t174306\n我和你的距离\t174307\n达人秀\t174308\n说不道\t174309\nnxkx\t174310\n辣油\t174311\nnicemeto\t174312\n波耶\t174313\nLTE\t174314\n全南\t174315\nnxkk\t174316\n4月\t174317\n荣华富贵\t174318\n上上上上上上\t174319\njdm\t174320\nhgdkd\t174321\n配菜\t174322\n未来男友\t174323\njdi\t174324\njdh\t174325\njdk\t174326\njdj\t1
74327\n贾洋\t174328\njdg\t174329\njdf\t174330\n睡嘞\t174331\njdb\t174332\nGGii\t174333\n模发\t174334\njdy\t174335\njdx\t174336\n空子\t174337\n燕雀\t174338\njdu\t174339\n纳尼纳尼\t174340\njdv\t174341\njds\t174342\nihfyiv\t174343\n华北\t174344\n书馆\t174345\n再见了我的份\t174346\n我和他明天的心情好嗨森狠嗨森\t174347\n莱斯妮\t174348\n卖房\t174349\n500岁\t174350\n娃娃狗\t174351\n名花\t174352\n鬼秘\t174353\n我跟你的对话给我全部三\t174354\n丁雨阳\t174355\n首诗\t174356\n嗯度\t174357\n西诗\t174358\n嗯庙\t174359\n一台个\t174360\n工业厂房\t174361\n圈养\t174362\n金沙湾\t174363\n渍渍渍\t174364\n春宵一刻\t174365\n明天你是我的现在爱我\t174366\n笑话集\t174367\nS5830黑1390\t174368\nokstar\t174369\n讨厌厌厌厌厌厌厌厌\t174370\n阳线\t174371\n梁欣\t174372\n肩并肩2012\t174373\n同僚\t174374\n山可\t174375\n帮任\t174376\n眼前人\t174377\n封建\t174378\n有不干\t174379\n陪床船\t174380\n山口\t174381\n绝尘\t174382\n肉串\t174383\nSISTAR\t174384\nvgggfdd\t174385\n四十一四十二岁\t174386\n任筶\t174387\n肉丸\t174388\nexgr\t174389\n帮付\t174390\n目的系\t174391\n山友\t174392\n教课\t174393\n斜视\t174394\n强盛\t174395\n心碎碎\t174396\n陈松泰\t174397\n好丑好丑\t174398\n欧冠\t174399\n斜角\t174400\n肉丝\t174401\n山叔\t174402\nKhhhh\t174403\nFHCH\t174404\n琅琊榜好看还是花千骨\t174405\n度秘素坏蛋\t174406\n词穷\t174407\n旋转木马\t174408\n飞屋环游记\t174409\nsindalamyotheimeomyseptmytatiomasllsahtilitheryonditskasitiom\t174410\n棒堂\t174411\n保障性安居工程\t174412\n怕好\t174413\ncmcclm\t174414\n触摸屏\t174415\n好我我够\t174416\n罗斯尔\t174417\n动回来\t174418\n客运量\t174419\n贝吉\t174420\n逆转\t174421\n好看不好笑\t174422\nhttpdhiphotosbaiducomxiaodupicitema5c27d1ed21b0ef44045df2bdac451da81cb3e11jpg\t174423\n小李子\t174424\n长命百岁\t174425\n有心人\t174426\nHRIVD\t174427\n度秘我爱你你好为\t174428\n灰子\t174429\nBAND\t174430\n88家\t174431\n家的房间\t174432\nsomeone\t174433\n向天歌白毛浮绿水红掌拨清波\t174434\n好些\t174435\n黄一行\t174436\n窝不爱你了呜呜\t174437\n蛛丝\t174438\n忘仙\t174439\n抄篇\t174440\n好了\t174441\n邓建兵\t174442\n好事\t174443\n音乐人\t174444\n好亲\t174445\naaass\t174446\n近视率\t174447\n明天凡\t174448\n好人\t174449\n黄大夫\t174450\n太科学\t174451\n屈指\t174452\n陶宏博\t174453\n写男欢女爱\t174454\nfffgggg\t174455\n博爱之都\t174456\n发不人\t174457\n好亮\t174458\n二二十块\t174459\nmental\t174460\n地长裙\t174461\nhouse\t174462\n试试看我是\t174463\nwaky
m\t174464\n靓女\t174465\n葫芦徒儿胡萝卜\t174466\n秋祭\t174467\n过站\t174468\n虎点\t174469\n集价\t174470\n打一水\t174471\n过端\t174472\n画质\t174473\n蒜蓉\t174474\n嗯嗯秘\t174475\n港影\t174476\n这么觉\t174477\n法定最低工资标准\t174478\n帅英\t174479\n荣登\t174480\n绝大部\t174481\n心血来潮\t174482\n仁心\t174483\n摔掉\t174484\n何不活\t174485\n二二九九\t174486\n活拨乱跳\t174487\n辽宁机场集团\t174488\n了不要堂\t174489\n時6n687566555587438n62455665\t174490\n杨幂堡\t174491\n要源\t174492\n萝莉音\t174493\nkhfk\t174494\n差不是我\t174495\n撒子勇\t174496\n不对\t174497\n金黄色\t174498\n苋菜红\t174499\ndabais\t174500\n西青分公司\t174501\nu奇米\t174502\n赞美我的话\t174503\n临沂市\t174504\n张金亮\t174505\n杨思琦\t174506\n藏家\t174507\n21号下午\t174508\n不密\t174509\n去向向天歌延果造不句吹风啦啦啦啦小游白猫红掌拨清波\t174510\n我问你你猜我是谁\t174511\n翻拍\t174512\n瓮瓦\t174513\n出租\t174514\n欧瑞莲\t174515\nwwwzybangcomquestion851baff3ef96048a34b386cbd3b9b833\t174516\nhdjjrifjjdjskejdjfofjdlfofbjfigdjkwlekfjbcncjjdkfncnkfoekfjhfbxbbndkkdoosjdhdjbfjcjdk\t174517\n啦啦啦啦啦啦啦啦啦啦啦啦把\t174518\n垂直\t174519\n臭臭臭臭臭臭臭臭熊\t174520\n晓辉哥\t174521\n电脑族\t174522\nxhxbxbs\t174523\n量感\t174524\n药监局\t174525\n赵雪峰\t174526\n不来梅\t174527\n引起\t174528\n工程量\t174529\n1238万\t174530\n本职\t174531\n巡演出\t174532\n假装\t174533\n折中\t174534\n掉落\t174535\n烤羊肉\t174536\n龙珠版\t174537\n丑比\t174538\n渡边\t174539\n8点半\t174540\n屠苏\t174541\n西朗\t174542\n贵方\t174543\n公造炀\t174544\n做做家务\t174545\n唱一重新唱\t174546\n蒙萌萌哒\t174547\n秘码三胞\t174548\nhttppinyincn2gSjHfiK3Hq\t174549\n踢飞天\t174550\nUUUU\t174551\n殷小琦\t174552\n红日近回首白云\t174553\n输入问问\t174554\n想你头痛\t174555\n接去\t174556\n杜景\t174557\nsyou\t174558\n张泡泡\t174559\n一零袋\t174560\n桐皇\t174561\n还给我\t174562\n苍溪\t174563\n真英雄\t174564\n我的世界版植物大战僵尸植物僵尸\t174565\n施耐德电气\t174566\n敦请\t174567\n背带裤\t174568\n度迷秘\t174569\n1958年\t174570\n疯癫\t174571\n心碎\t174572\n主产区\t174573\n80888\t174574\n虎鲨\t174575\n摆放\t174576\n对呀您说\t174577\nMotors\t174578\n幺三八二幺二六三一五九\t174579\nzhkyl\t174580\n闲下来\t174581\n祖爷爷\t174582\n实录\t174583\n虎鲸\t174584\n冰激凌味\t174585\n讨厌你走\t174586\n中事会\t174587\n没看见到\t174588\n橘子核\t174589\n实形\t174590\n聊聊盟\t174591\n病理学家\t174592\n积分榜\t174593\nfmfhgn\t174594\n张博霖\t174595\n王土图耨\t174596\n违约\t174597\n尷
尬\t174598\n黑贝\t174599\n2011年2月14号\t174600\n诬陷\t174601\n倒栽葱\t174602\n亲子\t174603\n校知道\t174604\n10遍\t174605\n我妈妈来了我不想聊了你不理我我妈妈来了\t174606\n吞资本\t174607\n三百米\t174608\n莱阳\t174609\nzuyiyyc\t174610\n湘江大道\t174611\n异能\t174612\n虚劲\t174613\n老鼠头\t174614\n新茶\t174615\n每一秒\t174616\n一家之主\t174617\n失乐园\t174618\n查去\t174619\n小小我是小小我是小小\t174620\n紧缺\t174621\n585852854\t174622\n奸笑\t174623\n50名\t174624\n18p咧\t174625\n候选\t174626\n50后\t174627\n美甲店\t174628\n盒套\t174629\n袁居\t174630\n脑回路\t174631\n永华电影院\t174632\n圆球\t174633\n处死\t174634\n608\t174635\n热滚滚\t174636\n陈柳叶\t174637\n离合盈\t174638\n莫子\t174639\nfino\t174640\n敬礼\t174641\n太后\t174642\nfind\t174643\n山清水秀\t174644\n氦动\t174645\n上海机场\t174646\n萧云\t174647\n推却\t174648\n扣扣扣扣\t174649\n修学\t174650\n这一点\t174651\n一百层\t174652\n肖像画\t174653\n兰图雅妃\t174654\n如金\t174655\n国际会议\t174656\n真幸真幸\t174657\n萌儿\t174658\n555566656990555666\t174659\n千疮白\t174660\nVZ回发货瑟\t174661\n凋枯\t174662\n坤士\t174663\n护士会\t174664\n1667646783\t174665\n果酱果酱\t174666\n问童子言师\t174667\n我没吗我舅家\t174668\n知名度\t174669\n四百年\t174670\nkkhbhjij\t174671\n85638658587485\t174672\n李紫梦\t174673\n想开玩笑\t174674\n樊佳傲\t174675\n成人类\t174676\n泽样\t174677\n栽树难\t174678\n五千位\t174679\n小衣橱\t174680\n凉伢乐儿6666\t174681\n7721\t174682\n7723\t174683\n贰兆\t174684\n語文科\t174685\n内河\t174686\n前院\t174687\n七九\t174688\n票管\t174689\n过意不去\t174690\nClarins#\t174691\nghuihhug\t174692\n4日17时30分\t174693\nhsusu\t174694\n么乃节\t174695\n早证菩提\t174696\n汉奸\t174697\n人事儿\t174698\n补阳\t174699\n小小赵\t174700\n搜索\t174701\nplease\t174702\n手艺\t174703\n关一个头你当我是傻子吗乖\t174704\n绝后代\t174705\n得子\t174706\n吸血\t174707\n老鳖\t174708\nKing\t174709\nKind\t174710\n年广九\t174711\n唯品会\t174712\nbebr\t174713\n这个性\t174714\n伞裙\t174715\n身段\t174716\nsdchn\t174717\n18郭\t174718\n88晚安木马\t174719\n郭平\t174720\n呼睡\t174721\n2000000001年\t174722\n旅行\t174723\n给危保险\t174724\n嘚笑\t174725\n下一秒\t174726\n13胡\t174727\n从来木\t174728\n王塞片\t174729\n通宵达旦\t174730\n唔米六\t174731\n下一种\t174732\n林灯\t174733\n玉萌\t174734\n对啊乜\t174735\n铅笔裤\t174736\n娇子\t174737\n西塘黄桃\t174738\n三鹿粉\t174739\n家断网\t174740\nrfjrvk\t174741\n奇品\t174742\ncJrjv
\t174743\nDEHP\t174744\n汉寿龙池实验中学\t174745\n度秘恭喜\t174746\n淘bao\t174747\n7部\t174748\n乱说八道\t174749\n最高位\t174750\n好什么好呀你\t174751\n奇哥\t174752\n接盘\t174753\n白跑\t174754\n麻辣汤元\t174755\n不不不不不\t174756\n海带令\t174757\n187808080\t174758\n头着\t174759\n银行你是谁我不认识你你的名字\t174760\nKDS\t174761\n万人之上\t174762\n无米之炊\t174763\n不为所动\t174764\n扰民\t174765\n度秘盒真的好爱好爱\t174766\n晚清\t174767\n王焕雨\t174768\nkkoo\t174769\n异步\t174770\n口头语\t174771\nfdfsesddhd\t174772\n水煤\t174773\n小软萌\t174774\n令人发指\t174775\nClassic\t174776\n水煮\t174777\n十八五\t174778\nCC套\t174779\n嫌疑\t174780\n索然\t174781\n央视国足\t174782\n天蝎男\t174783\n真仙\t174784\n麻绳\t174785\n东软网络安全营销中心\t174786\n心灵美\t174787\n军费\t174788\nmbox\t174789\n医疗\t174790\n四九号\t174791\n颜嘉蔓\t174792\n星期4\t174793\n星期7\t174794\n穷开心\t174795\n落水\t174796\n雨后生\t174797\n健在否\t174798\ngxggf\t174799\n莱茨狗\t174800\n出位\t174801\n2199元\t174802\n张景春\t174803\ndfgfffffffff\t174804\n2190天\t174805\ngreyey\t174806\n出使\t174807\nLedoyen\t174808\n字儿\t174809\n大逃港\t174810\n一臭臭度秘\t174811\n289万元\t174812\n恩那你是我的秘书对\t174813\n原地\t174814\nac\t174815\n浦发太保\t174816\n失之交臂\t174817\n南庄\t174818\n马努\t174819\n不我\t174820\n不成\t174821\njdfki\t174822\n丫es\t174823\n不战\t174824\n登峰造极\t174825\nzhwxpy\t174826\n我了叫\t174827\n永辉超市\t174828\n出版局\t174829\n星星斤\t174830\n很屌\t174831\nChfihrue\t174832\n安然\t174833\n老鸟我是老子夫子\t174834\n自由人\t174835\n慢性营养不良率\t174836\n丫eS\t174837\n老虑\t174838\n南康\t174839\n一个多咪\t174840\n彭峰\t174841\n杨二闲\t174842\n丛思琪\t174843\n当属\t174844\n宇尧\t174845\n神度\t174846\n李沧东\t174847\nforaooo\t174848\n今天六一节\t174849\nImkellyIm10yearsold\t174850\n麽颜颜\t174851\nS5660黑1130\t174852\njhvko\t174853\nasdfghjkl\t174854\n磁怕热\t174855\n性欲望\t174856\n幺三八\t174857\n43码\t174858\n5.5公里\t174859\n乐天\t174860\n条味\t174861\n授受\t174862\n任重\t174863\ndjkskq\t174864\n848484848484\t174865\n伏特\t174866\n李周吴\t174867\n杨美都\t174868\n平码\t174869\n台妹\t174870\n神庙\t174871\nrtaeryf\t174872\n半米\t174873\n12万元\t174874\n王平原\t174875\n经毛\t174876\n1月15日\t174877\n七尺\t174878\n颁\t174879\n颂\t174880\n预\t174881\n领\t174882\n哥斯拉\t174883\n题\t174884\n猪猪猪猪猪猪侠\t174885\n颚\t174886\n颜\t17488
7\n额\t174888\n五零五十五\t174889\n颐\t174890\n五万多\t174891\n颓\t174892\n颔\t174893\n颖\t174894\n颗\t174895\n風\t174896\n锦官城\t174897\n该站\t174898\n冒食\t174899\n哇帅\t174900\n颠\t174901\n先期\t174902\n颤\t174903\n填单\t174904\n颧\t174905\n7寸\t174906\n59玫瑰十九本\t174907\nhttpfhiphotosbaiducomxiaodupicitem0b7b02087bf40ad1fffd7290502c11dfa9ecceacjpg\t174908\n00：06\t174909\n木排\t174910\n好寂寞\t174911\n仙道\t174912\n男人日\t174913\n挑挑\t174914\n好的好\t174915\n大河\t174916\n窜一窜\t174917\n格城\t174918\n片儿片\t174919\n笑死个我了真好笑\t174920\n摇绳\t174921\n骨芮\t174922\n果篮\t174923\n六分钱\t174924\n17.25%\t174925\nppop\t174926\n三三幢个\t174927\n杂种\t174928\n借题\t174929\ngoushi\t174930\n骨节\t174931\n燃点\t174932\nsaas\t174933\n光头强真的很好笑\t174934\n安才怪\t174935\n司呵\t174936\n太阳能力\t174937\n四百一十二块\t174938\n海信tm26v66\t174939\n第36页\t174940\n六分钟\t174941\n血牌榜\t174942\nniyingyubucuo\t174943\n比多咪还有你的马来\t174944\n我愿\t174945\n柠檬水\t174946\n2天后\t174947\n一到家花花花花花花\t174948\n魏小兰\t174949\n第4位\t174950\n普通话版\t174951\nCOD5w\t174952\n预零\t174953\n歌词组\t174954\n民安小区\t174955\n夏晴晴\t174956\n赵元玲\t174957\n我感\t174958\n晚上十一点\t174959\n九点零七22点\t174960\n老西\t174961\n荫姐\t174962\n名化幾\t174963\nTOM集团\t174964\n我意\t174965\n一本五\t174966\n王拉拉\t174967\nLoser\t174968\n哪门子\t174969\n公道\t174970\n随随便便\t174971\n被堵\t174972\n黄人民\t174973\nhapy\t174974\n一个零\t174975\n考进师范大学\t174976\n了了里了破\t174977\n哈纳西\t174978\nhapa\t174979\n三连\t174980\n换妻\t174981\n2011年10月29日3时00分\t174982\n改掉\t174983\n五十八十\t174984\nisuwu\t174985\n化识\t174986\n宋一下\t174987\n假如\t174988\n我杀我杀我杀杀杀\t174989\nzraeli\t174990\n换妆\t174991\nEntertainment\t174992\n第一班\t174993\n密西西\t174994\n眉山市\t174995\n情人节情人节\t174996\n超神学院\t174997\n鱼儿\t174998\n我喜欢我啊爱心小熊的游戏\t174999\n呜咪喹\t175000\n一个集\t175001\n刘系\t175002\n詹利亚\t175003\n瞠目结舌\t175004\n边娃\t175005\n五宝\t175006\nCobra\t175007\n比假\t175008\n五官\t175009\n比做\t175010\n110倍\t175011\n丝男\t175012\n桶牛\t175013\n1至6个月\t175014\n刁42\t175015\n地方人民代表大会\t175016\n睡罗\t175017\n蔡林\t175018\n五家\t175019\n巴拉巴拉小魔仙\t175020\n附图#31\t175021\n一二十三岁\t175022\n等因为我没有\t175023\n缘正旺\t175024\n一定要见\t175025\n李梓权\t175026\n驶过\t175027\nppp\t175028\n一一秋天
\t175029\n供油\t175030\n短息\t175031\n配料\t175032\n长手\t175033\n伱BB\t175034\n龙秘\t175035\n诟病\t175036\n余佳骏\t175037\nlyx\t175038\n医务人员\t175039\n神剧\t175040\nkkkkk\t175041\n龙秋\t175042\n过去时\t175043\nkkkkn\t175044\nlyl\t175045\nlym\t175046\n好晗\t175047\nlyh\t175048\nlyk\t175049\n配方\t175050\n伍波塔尔动物园\t175051\nhcsfbkgsvkhcb\t175052\n悟天特兰克斯找十八号\t175053\n咖喱饭\t175054\n勉为其难听\t175055\n大萌萌\t175056\n收费额\t175057\n昨个\t175058\n大奖赛\t175059\nopporal\t175060\n海龟\t175061\n亚欧叉叉o\t175062\n宠物吃的乳酶生和人吃的乳酶生\t175063\n应根\t175064\n嗯开启头\t175065\n恩断义绝你走\t175066\n糊涂蛋韦迪\t175067\nlend\t175068\n伊可爱\t175069\n再说了信不信\t175070\n工年\t175071\n首销\t175072\nrjcfgvjg\t175073\n啥里\t175074\n零二二零二八零一\t175075\n在线视频\t175076\n临河干山门\t175077\n差评\t175078\n死不要脸的臭赖皮\t175079\n秘瓜\t175080\n200%\t175081\nvvdd\t175082\n哭了起来\t175083\n上海迪士尼\t175084\n大坏蛋片\t175085\n可靠看看男男女女并不\t175086\n2002\t175087\n2003\t175088\nlagree\t175089\n2001\t175090\n2006\t175091\n2007\t175092\n2004\t175093\n赢悄\t175094\n2008\t175095\n2009\t175096\n我想爱开心看\t175097\n有毛好争议\t175098\n贝安\t175099\n咯雪\t175100\n谭一郎\t175101\n润泰\t175102\n忍者神龟\t175103\n请稍候\t175104\n耐忙\t175105\n错话故意\t175106\n代驾\t175107\n水水\t175108\n太二\t175109\n化腾\t175110\n英特曼英特曼\t175111\n0876543219\t175112\n学生族\t175113\n屈辱\t175114\nMet\t175115\n刘慧宇\t175116\n水气\t175117\n谢思密达\t175118\n啵吖\t175119\n#Vee\t175120\n葛文远\t175121\n臭屌丝\t175122\n沙特阿拉伯\t175123\n罗湖区\t175124\n止痛泵\t175125\n杨双燕\t175126\nxxxxxxxx\t175127\n唉真\t175128\n搞笑电影\t175129\n為何\t175130\n郑湘豫\t175131\n湾址镇\t175132\n嵩山少林寺\t175133\n小品品\t175134\n破水\t175135\n绿绿\t175136\n和睦岁\t175137\n破旧\t175138\n大益普洱茶\t175139\n一个工\t175140\n零零年\t175141\n抵触\t175142\n王鑫澳\t175143\njoes\t175144\n呃琳琳琳琳\t175145\n终其\t175146\nbyductowergeyatourawwwwwwwwaaaaaaano6图嘟嘟\t175147\n上世纪\t175148\n那么多\t175149\n9个月\t175150\n田妞\t175151\nfyv6jbhfhvydhffefjfhscvjf\t175152\n你你\t175153\n扣押\t175154\n说好不好呀\t175155\n贵宾犬\t175156\n平不爽大张\t175157\n护领宽窄\t175158\ndjaabwwp\t175159\n零二二零九零\t175160\n函数函数\t175161\n糸我告诉你\t175162\n创新者\t175163\n狗鸡\t175164\n十日\t175165\n四维卫浴\t175166\n好可伶\t175167\n三十二块\t175168\n杨绛\t175169\n漂净\t1
75170\n好年好\t175171\n各司其职\t175172\n张忠民\t175173\n我女的女的女的女的女的女的女的女的女的\t175174\n对呀我是女的你\t175175\nbody\t175176\n王八\t175177\n杭锅股份双宿双飞\t175178\n版权\t175179\n观叶\t175180\ngyhhg\t175181\n是是是真单纯真善良小公举\t175182\n无锁屏\t175183\n強吻\t175184\n一早了毒米我要撒野\t175185\n跟难\t175186\n查茨克\t175187\n爷子\t175188\n奢求\t175189\n臭不要脸的臭美\t175190\n感光\t175191\n肉粒\t175192\n王兴\t175193\n王光\t175194\nhudi\t175195\nhudh\t175196\n十二布鲁凌乱\t175197\n罗翻来\t175198\n王允\t175199\n请笑一笑\t175200\nJYJxNII\t175201\n我的的爱\t175202\nDsgfvchc\t175203\ndhbz\t175204\nhudx\t175205\nhisbsysgwgsi\t175206\n侯景璐\t175207\nhudu\t175208\n二比五\t175209\n进港\t175210\nnbajf\t175211\nFR\t175212\n丰润机\t175213\n电视名字\t175214\nFU\t175215\nFV\t175216\n黎真真\t175217\nFX\t175218\n約會約會\t175219\n瑟吉欧\t175220\n宣传册\t175221\nFA\t175222\nFB\t175223\n林雨薇\t175224\nFF\t175225\nFG\t175226\nFH\t175227\nFI\t175228\nFJ\t175229\n咏婷\t175230\nFL\t175231\nFM\t175232\n会所\t175233\n10点37\t175234\n瓦林\t175235\n10点33\t175236\nFu\t175237\nFv\t175238\n10点30\t175239\nFx\t175240\n水平面\t175241\nhddf\t175242\n克服\t175243\n金星\t175244\n厚厚\t175245\n薯片\t175246\nFc\t175247\nFd\t175248\nFe\t175249\nFf\t175250\nFg\t175251\n小毛孩\t175252\nFi\t175253\nFj\t175254\nFk\t175255\nFo\t175256\n顺其丑\t175257\n猴类\t175258\n9.8%\t175259\ngajgaj\t175260\n生锈\t175261\n29点\t175262\n十五\t175263\n不少别\t175264\n只我\t175265\n嗯小帅\t175266\ngajgaa\t175267\n你是我的秘书你自己说的你不在我和我斗嘴为你都不过我\t175268\n爱男\t175269\n十二\t175270\n小雪梅\t175271\n爱甲\t175272\nnnnnnnnnnnnnnnnnnn\t175273\n以至于\t175274\n劲道\t175275\n好呢白芥子\t175276\n会吧聊\t175277\n医学\t175278\n刘美英\t175279\n2012-08-02\t175280\nchiphotosbaiducomxiaodupicitem42a98226cffc1e1750f099444d90f603738de97cjpg\t175281\nF1\t175282\n十亿\t175283\n1233575557658698\t175284\n死葫芦\t175285\n刘预约\t175286\n十人\t175287\nF8\t175288\n晚觉\t175289\n男姐\t175290\nsbbzjxjxx\t175291\n十亩\t175292\n淬说\t175293\n爱生\t175294\n候爷\t175295\nIMDb\t175296\n08:30——11:30\t175297\nwanshi\t175298\nsomething\t175299\n事业者\t175300\n牛肺炎症\t175301\n历届\t175302\n哎咪欧纪斯\t175303\n一千天\t175304\n贡献率\t175305\n无非大道\t175306\ngrande杯latte\t175307\n脚部\t175308\n吐吧T1
1\t175309\n库存贷\t175310\n你是我最爱的宝贝小秘\t175311\n青儿\t175312\n华尔街日报\t175313\n而动\t175314\n10.04\t175315\n老无所依\t175316\nwashia\t175317\n佬佬\t175318\n活动者\t175319\ntoolate\t175320\n贴金\t175321\n无形中\t175322\n一千多\t175323\n白领们\t175324\n什数\t175325\n早稻田泼水大作战2012\t175326\n凝固剂\t175327\nNissan\t175328\n再来时报\t175329\n沉重\t175330\n坊间\t175331\n古屋\t175332\n集贤县\t175333\n哭害\t175334\n亏上\t175335\n告不告诉\t175336\n北在\t175337\n白生气\t175338\nt8301\t175339\n高望重\t175340\n浇息\t175341\n一月一千\t175342\n么美\t175343\n李子龙\t175344\n衰尸船\t175345\njird\t175346\n啵啵虎\t175347\n陈紫薇\t175348\n至清\t175349\nytwoetrrttggbvcbvgcgvvvgvbvvvvghghhhhhhnbbbmmmmmmmmnnvbnnnnnnnn\t175350\n可爱行\t175351\n田地\t175352\n11000块\t175353\n祥和生活\t175354\ndjjsiqjsuj\t175355\n爱情36计#Happy\t175356\n海带咸寒\t175357\n度蜜爱娇桔桔\t175358\n张志鑫\t175359\n嗯铁皮\t175360\n缩头乌龟。77\t175361\npolo咯\t175362\n弥久济度秘侯\t175363\n给我看\t175364\n5546158\t175365\n波波点\t175366\n谢谢你好不好\t175367\n均贫富\t175368\n作鬼\t175369\n轩\t175370\n6月10日到26日\t175371\n达实\t175372\nlwonderhow\t175373\n自习室\t175374\n碳酸氢钙\t175375\n后勤部\t175376\n不是我我这我\t175377\n雪天\t175378\n翠姐\t175379\n信咪\t175380\n至恒\t175381\n日生\t175382\n雪夜\t175383\ntpup\t175384\nguui\t175385\n一晃\t175386\n亚蕾\t175387\n双栖蝶\t175388\n坤坤\t175389\n资金面\t175390\n非常好很过分\t175391\n1346792580\t175392\n逢源M记\t175393\n男小\t175394\n东森\t175395\n庄咏群\t175396\nFrom\t175397\n360999\t175398\n纪传奇\t175399\n73737748\t175400\n告诉你鬼大厕所\t175401\nFrog\t175402\n没听说过\t175403\n苗梓恒\t175404\n失约\t175405\nw666\t175406\n醒目\t175407\n妥善\t175408\n第89个\t175409\n哪莫\t175410\nempiret\t175411\n真的嘛真的爱我\t175412\n旷远\t175413\n十二日上午\t175414\n金力泰\t175415\n青霉素\t175416\nIcan\t175417\n百安居\t175418\n走了么么\t175419\n发呆开\t175420\n惬意\t175421\n涟源市\t175422\n一段\t175423\n获刑\t175424\n度秘我求你了你告诉我你的年纪\t175425\n首推\t175426\n通话费\t175427\n关山\t175428\n对得起\t175429\n巨心\t175430\n彩裤\t175431\n陈安南\t175432\n米奇风\t175433\n海岸线\t175434\n义务兵\t175435\n养生馆\t175436\n姆娜\t175437\n不肖\t175438\n塞入\t175439\n掌管\t175440\nB7C\t175441\nComeLets\t175442\n千余件\t175443\n深奥五十傲世奥\t175444\nmidu\t175445\n莱蒙\t175446\n获利\t175447\n856864223473124254459575789
62642795多少\t175448\nXGCJ\t175449\n五六十张\t175450\np份\t175451\n作者\t175452\n唉摸\t175453\n孤陋\t175454\n哄骗\t175455\nkkk47kk\t175456\n梁钰洋\t175457\n3700\t175458\n唉摩\t175459\n多方法\t175460\n应暴\t175461\n才帅\t175462\n443448522\t175463\n面条商店\t175464\n世贸\t175465\n纯情\t175466\n128491387\t175467\n尼～酱\t175468\n许嵩帅\t175469\n急促\t175470\n钉截铁\t175471\n六旬\t175472\n联合片\t175473\n心情好好\t175474\n找我麻烦\t175475\n谢了我有事先走\t175476\n929281\t175477\n二元一次方程\t175478\n毒化\t175479\n一览无疑\t175480\n翻译员\t175481\n不可动摇\t175482\n缪缪\t175483\n5050480480482508520850\t175484\n莉莉\t175485\n出彩\t175486\n高兴别管\t175487\n5455句\t175488\n活干嘛\t175489\n郑思翰\t175490\n氧气罐\t175491\nvlkd\t175492\n晕子\t175493\nJason\t175494\n烦你讨厌\t175495\n开始\t175496\n莉莲\t175497\n七九幺\t175498\n乱交\t175499\n告诉我行\t175500\n不贫\t175501\n没去过\t175502\n捕鸟\t175503\nKRY场\t175504\n妙喻\t175505\n用成\t175506\n2000万元\t175507\n猪阿\t175508\n等你生日\t175509\n女斗恶\t175510\n1234556789\t175511\n坏女人\t175512\n20142160071200\t175513\n孙佳和\t175514\n纪元e\t175515\nGruk\t175516\n泡腊八蒜\t175517\n哥夫\t175518\n我是男人超级大男人\t175519\n111111111\t175520\n28章\t175521\noppotl\t175522\n用户\t175523\n亲一点\t175524\n沙发\t175525\n用房\t175526\n2811806829\t175527\n乌山北坡站\t175528\n猜北\t175529\n4点50\t175530\n偏方\t175531\n王治郅\t175532\n左瞳\t175533\n秸秆\t175534\n楚剧\t175535\n小嶋玉娴\t175536\n能行\t175537\n庙里\t175538\n杭州大厦\t175539\nmastl\t175540\ncvvgcx\t175541\nJ·K·罗琳\t175542\n这卡\t175543\n陆诗慧\t175544\n辽东\t175545\n姓李还是\t175546\n检举\t175547\n三道河子乡\t175548\nmHrSgz\t175549\n嫣嫣\t175550\nnoino\t175551\n什么式\t175552\n辽中\t175553\n黄网\t175554\n虚言\t175555\n福尔马林\t175556\n朱敏荣\t175557\n梦儿\t175558\n188288\t175559\n养家养家\t175560\n要不中\t175561\n今日零点四十分\t175562\nk1073\t175563\n61次\t175564\nghiphotosbaiducomxiaodupicitem574e9258d109b3de52fc001bcbbf6c81800a4c06jpg\t175565\n是故\t175566\n女孩性\t175567\n擦摩擦\t175568\n贺王璟\t175569\nkbk\t175570\n15960216157\t175571\nkbh\t175572\nhabskbx\t175573\n肩带\t175574\n纳五\t175575\n神化\t175576\n怂包\t175577\n万俟沐华\t175578\n甜美\t175579\n灵罗丽精灵梦\t175580\n薛芬芬\t175581\n手枪气枪大桥秘强强爱\t175582\n要幸福\t175583\n挨到\t175584\nMop\t175585\n张目\t175586\n爱手\t1
75587\n爱才\t175588\n片的声\t175589\n布公公\t175590\nksm\t175591\nQqqqrzhffzhsrdtHzhattezxfshgstzgxghfgzhdzh\t175592\n六一豪礼?美食每刻都欢乐#六一儿童节拉\t175593\n187286565558858556584845\t175594\n神医\t175595\n地名字\t175596\n一点吧千克\t175597\n数3声\t175598\n好景\t175599\nbear\t175600\n花腿\t175601\n殷路通\t175602\n十四十四\t175603\nxc175415020\t175604\n充填\t175605\n耽美文\t175606\nRadeon\t175607\n周五晚上\t175608\n施珏\t175609\n黄丽\t175610\n省吕剧院\t175611\n轮子\t175612\n佳美\t175613\n明敏\t175614\n松柏\t175615\n破菊\t175616\n游园\t175617\n隐形性\t175618\n我也不懂\t175619\n心血管疾病\t175620\n大龙虾\t175621\n明教\t175622\n草塔镇\t175623\n挺爱\t175624\n里恩\t175625\n定罪\t175626\n1Q84\t175627\n多劳\t175628\n挺爽\t175629\n黑麻辣烫\t175630\n坎坎坷坷坎坎坷坷咔咔咔咔咔咔咔咔\t175631\n恶不恶心\t175632\n相继\t175633\n猜中\t175634\n早春\t175635\n阿利甲甲\t175636\n樱花开\t175637\n企业辉煌\t175638\n媒體\t175639\ninora\t175640\njagjgd\t175641\n许晴\t175642\n相思病\t175643\n想日你\t175644\n二零零一六八\t175645\n累听\t175646\n祥哥\t175647\n淫婦\t175648\nprogress\t175649\n猜下\t175650\n称颂\t175651\n贝贝颂\t175652\n677670138条\t175653\n警恔\t175654\n塞有那拉\t175655\n很乖\t175656\n极秘\t175657\n呆萌\t175658\n李兆前\t175659\n呢侠\t175660\n暴跳\t175661\n很久\t175662\nDOTA\t175663\n乜那\t175664\n何以笙箫默\t175665\n半空中\t175666\n当嫁\t175667\n刘月国\t175668\n很么\t175669\n毒蘑菇\t175670\n车草鸡飞\t175671\n250日线\t175672\n回电话\t175673\n1exo\t175674\n我要你重刀塔那里大礼拜五百里傻子\t175675\n猪八戒胖\t175676\n277746005\t175677\n十二一下\t175678\n诗风\t175679\n暴跌\t175680\n复合\t175681\n岩岩洞\t175682\n喜气洋洋\t175683\n我也谈\t175684\n罗脚\t175685\n七色花\t175686\n蕾伊\t175687\n1dg\t175688\n冷和热\t175689\n冲出\t175690\n一樣而已你們笑的是錢結婚生娃兒未來勝利的笑呵呵\t175691\nGHHDD\t175692\n一三分\t175693\n俊宝\t175694\n78分\t175695\n周前前\t175696\n红柳绿\t175697\n疑虑\t175698\n5秒钟之内\t175699\n将要\t175700\n快牙薛\t175701\n傻子疯子\t175702\n家福\t175703\nOLD\t175704\n铸剑\t175705\n夹衣\t175706\n阿衰\t175707\n曼完\t175708\n什么时候场\t175709\n佩嘿\t175710\n简式\t175711\n书剑\t175712\nRomvlvs\t175713\n阿衣\t175714\n湿润\t175715\n邓三科\t175716\n脑补\t175717\n好可乐\t175718\n北海舰队文工团\t175719\n125552222222222222222222223\t175720\n哈泥煤\t175721\n立健三清冲剂\t175722\nhttphhiphotosbaiducomxiaodupicitemf9dcd100baa1cd112dc81852be12c8fcc3ce2
d41jpg\t175723\n铁承羊羊\t175724\n快递员\t175725\n一差生\t175726\n上河图\t175727\n外行\t175728\n凌晨三点半\t175729\n救起\t175730\n佳肴\t175731\n阿里秘\t175732\n救赎\t175733\n见水化\t175734\n喜剧之王\t175735\n张卫年\t175736\n干磨\t175737\n迤逦\t175738\n弗拉米尼\t175739\n韩攻\t175740\n桀骜脖\t175741\n甘芝洁美\t175742\n第一部\t175743\n外衣\t175744\n园艺\t175745\n春花秋月\t175746\n外表\t175747\n管虎式\t175748\n车震\t175749\n枝末节\t175750\n围追\t175751\njjjnvfvxvt\t175752\n杀虐\t175753\n复合句\t175754\n最错\t175755\n自勉\t175756\n嗯光头强\t175757\n离讲\t175758\n法务\t175759\n天黑\t175760\n这么多好\t175761\n一微\t175762\n法力\t175763\n央\t175764\n嗯良木\t175765\n盈温\t175766\n魏千翔\t175767\n草榴\t175768\n骗用\t175769\nTNT998\t175770\n易立\t175771\n沐春光\t175772\n外溢\t175773\n开门红大豆油\t175774\ni8trthjrfcu67642juy\t175775\n说长\t175776\n闻所未闻\t175777\n三星Calaxya9\t175778\n孤立孤立\t175779\n修练\t175780\n黑白\t175781\n1996年4月11号\t175782\n资道\t175783\n我喜歡的人不喜歡我\t175784\n不不不我你老\t175785\n阅览室\t175786\n312平方英里\t175787\nscholars\t175788\n美奇米大黑魔仙\t175789\nhgcced\t175790\n有情\t175791\n蛋姨\t175792\n古龙今韵\t175793\n跟一起玩\t175794\n柠檬度\t175795\n奔向\t175796\n粪男孩\t175797\n傻子陋\t175798\n无語\t175799\n亲爱的拜\t175800\n山楂饼\t175801\n夹杂\t175802\n哐棠\t175803\n接任\t175804\n不要脸\t175805\n刘小墨\t175806\n幺二零幺零六\t175807\n75734318737\t175808\n小排童鞋杀人牌\t175809\n使到\t175810\n不懂不说\t175811\n阿汤哥\t175812\n孤生\t175813\ngcswzgr\t175814\n阿联酋\t175815\nmyhom\t175816\nTTTTTT整理图整理图整理图TTTTTT\t175817\n重刑\t175818\n寂寞melody\t175819\n有图有\t175820\n纹型\t175821\n海豚音\t175822\nvivkdtsdki\t175823\n鼓浪屿岛\t175824\n孤男\t175825\n约冲天\t175826\n不服气\t175827\n把度\t175828\n夹板\t175829\n分分钟钟\t175830\nc14\t175831\nPre\t175832\n有我在\t175833\n联想\t175834\n义人\t175835\n曲导\t175836\n防汛\t175837\n哈被\t175838\n徐福庚\t175839\n啦奶茶\t175840\n扁蛋\t175841\n撒比撒比撒比撒比\t175842\niphoneime\t175843\n南安市公安局\t175844\n重创\t175845\n刘毅\t175846\n撒母\t175847\n爬圭\t175848\nc1g\t175849\n吃吃吃吃吃吃吃\t175850\n呃丑\t175851\n义二\t175852\n联情\t175853\n丽英\t175854\n劫走\t175855\n佛教天台宗\t175856\n弄破\t175857\nnjf\t175858\n小恐龙蛋鸭\t175859\n比晗\t175860\n紫砂壶\t175861\n随性\t175862\n大女人\t175863\nBdhjjfui\t175864\n李锐雪\t175865\n金属学\t175866\n七弦琴\t175867\n487865555253
5353535353535353\t175868\n声速\t175869\noaorva\t175870\n1991年06月28日\t175871\n生木\t175872\n毛躁\t175873\n18376500386\t175874\n握爪呀式\t175875\n说什么晚安臭\t175876\n四百米\t175877\n从来都\t175878\n夫子\t175879\n200吨\t175880\n零七二一\t175881\n陈词滥调\t175882\n西部地区\t175883\n阿看看\t175884\n狼青\t175885\n女子娃\t175886\n次日\t175887\n花梨猫\t175888\nAM963\t175889\n卡卡树\t175890\n村长\t175891\n200名\t175892\n冬梅\t175893\n不觉晓有你好\t175894\n村镇\t175895\n后项\t175896\n趕緊\t175897\n唐小姐\t175898\n李麦莎\t175899\n你好hi梦hh我们多少hh\t175900\n踱步\t175901\n79.9美元\t175902\n陌离\t175903\n尘世\t175904\n上衣服\t175905\n552569988555555\t175906\n打落\t175907\n国界\t175908\n恩惠\t175909\n可才能\t175910\n56p\t175911\n奥莱惠\t175912\n可信\t175913\n五星红旗\t175914\n脊背\t175915\n恩想\t175916\n明早十点\t175917\n搭建\t175918\n痛不痛苦\t175919\n滑雪服\t175920\n走了奥\t175921\n恩情\t175922\n副天然\t175923\ndeman\t175924\n向狮王\t175925\n敬重\t175926\n千零一百一十次\t175927\n感觉你好乖呦\t175928\n一些分\t175929\n清晨五点\t175930\nfusvgs\t175931\nFgjl\t175932\n过敏\t175933\nctvux\t175934\n568\t175935\n569\t175936\n石碣\t175937\n刘详\t175938\n560\t175939\n561\t175940\n很漂漂\t175941\n563\t175942\n564\t175943\n总管\t175944\n566\t175945\n我跑\t175946\n说了聊聊\t175947\n大地飞歌福星晓程股骨头如上所述\t175948\n刘说\t175949\n哎呦九\t175950\n路路通路总兔崽子\t175951\n巴厘雷\t175952\nsffjtkfhfhfh\t175953\n自阳\t175954\n电油\t175955\n乌干达\t175956\n伊索\t175957\n同江苏\t175958\n小理性\t175959\n结衣来人\t175960\n不好再来\t175961\n有碍\t175962\njrudu\t175963\n微乐\t175964\n总算\t175965\n防止\t175966\n哈比撸撸撸\t175967\n六百多\t175968\n九十二百\t175969\npil\t175970\npim\t175971\npin\t175972\n深涧\t175973\npia\t175974\n走了呀\t175975\npig\t175976\npiy\t175977\n左先\t175978\n百咳静糖浆\t175979\n意大利皮波烤肉\t175980\n556151\t175981\n小事化小小事化化胧\t175982\nintentIntentSK1171477665524E04602914E1E3C53C5D16D5457A576901D00994E1BB86CBD6FD903B292949end\t175983\n王晓初\t175984\n信物\t175985\n高照虎\t175986\n二一级\t175987\n六六二二幺五幺幺\t175988\n催眠歌\t175989\n苗一点别的吧行\t175990\n气元\t175991\n超了没\t175992\n鳄鱼谷\t175993\n森海味\t175994\n妙妙个月7k\t175995\n半信\t175996\n五十驯\t175997\n凌晨3点\t175998\n黎巴嫩\t175999\n446643478\t176000\nwidjsb\t176001\nHanna\t176002\n分享\t176003\n中国人工智能学会\t176004\n全稞\t1
76005\n第二第三\t176006\n治愈系\t176007\n孤獨\t176008\n背一秋游\t176009\n全程\t176010\nhttpehiphotosbaiducomxiaodupicitem00e93901213fb80e67510fa331d12f2eb83894c5jpg\t176011\n晕了别再说\t176012\n漆膜\t176013\n小白狗\t176014\n6545\t176015\n黄荣香\t176016\n小白狮\t176017\n多喝\t176018\n张冰洁\t176019\n升官\t176020\n摇摇晃晃\t176021\n打幺三五的你记\t176022\n图钉\t176023\n小白狼\t176024\n拍戏\t176025\n一次两次\t176026\n下午4点半\t176027\n一起片\t176028\n拉格朗日\t176029\n马嘉璐\t176030\n钱文秀\t176031\n痴呆症\t176032\n林尽头\t176033\n哎嘿\t176034\n杨傲蓉\t176035\ndomistletootmentlyouroro\t176036\n笨描\t176037\n希伯来\t176038\n钟声\t176039\n拼读\t176040\n2016年1月1号\t176041\n重铜\t176042\n切口\t176043\n汽车站\t176044\nroeeol\t176045\n大家一起乐\t176046\n若羽\t176047\n口套\t176048\n在不言中\t176049\n王国维\t176050\n哎嘛\t176051\n奴性\t176052\noeoeichs\t176053\n三十升\t176054\n三十千\t176055\n三十十\t176056\n公共关系\t176057\n岳梦宇\t176058\n文成县\t176059\n碑坊\t176060\nchddy\t176061\n李生\t176062\n出席会议\t176063\n五年把\t176064\n男厕\t176065\n热利\t176066\n婆子\t176067\n文语乐\t176068\n标哥\t176069\n热切\t176070\n//哎呦坠儿=3\t176071\n孔雀馆\t176072\n凝滞\t176073\n无聊无耻\t176074\n停飞\t176075\n莫-子\t176076\n侏儒症\t176077\n朱浩宇\t176078\n雷巴克\t176079\n陈江\t176080\n劲帅\t176081\n庇荫\t176082\n一瞬间\t176083\n陈汗\t176084\n飞科\t176085\n大鹏\t176086\n别说废话\t176087\n自为之\t176088\n飞秋\t176089\n大鹅\t176090\n地沟油油条\t176091\n湿湿\t176092\n好忧桑\t176093\n二话\t176094\n什么价\t176095\n则节节\t176096\n一朵朵\t176097\n林哥丁嫂\t176098\n面答\t176099\n房贷者\t176100\n就是我的生日\t176101\n50分钟\t176102\nTTT\t176103\n龙东亭\t176104\n八颗\t176105\nRhxnn\t176106\nTTO\t176107\n面筋\t176108\nTTM\t176109\n夏梦晴\t176110\nBanasiak\t176111\n不饭爱\t176112\n玩不上\t176113\n陈汤\t176114\nTAT\t176115\n马向琴\t176116\n冰梦\t176117\n玄武区\t176118\n綦江\t176119\n攘攘\t176120\n惧色\t176121\n真我的\t176122\n迟缓\t176123\n凌嘎\t176124\nCVGGFF\t176125\n榴莲味\t176126\n喲喲喲玉玉鬱u雨預計uuu\t176127\n节假日\t176128\n行路难\t176129\n那是我\t176130\n极小\t176131\n286782482155761555468\t176132\n米老鼠\t176133\n巴哥哥\t176134\n彩虹\t176135\nthhghfhk\t176136\n新余渝水区法院\t176137\n巧可不代表团\t176138\n泰药\t176139\n杨总嗯\t176140\n托也\t176141\n凉血\t176142\n四款\t176143\n主义者\t176144\n想你的笑\t176145\n极少\t176146\n两个事\t176147\n东翼\t1
76148\n两个二\t176149\n东翰\t176150\n我讨厌你我讨厌你我爱你我恨你我恨你错了我错\t176151\n大钞盒\t176152\nP站\t176153\n让叫\t176154\n专情不值钱\t176155\n坏小孩儿\t176156\n料子\t176157\n雅拓\t176158\n学长\t176159\n第二天\t176160\n草率呀好帅声音\t176161\n春秋秋\t176162\n两个人\t176163\n一次绳\t176164\n秘别喊我妹子好梦\t176165\n肖战\t176166\n骗我了你没有\t176167\n瞎掰\t176168\n公媳乱\t176169\n雷克斯\t176170\n0岁\t176171\n四九点\t176172\n嘉丽莎\t176173\n翻腾\t176174\n分钟后\t176175\n重感冒\t176176\n体验券\t176177\n去点\t176178\n晓滴\t176179\nhtjyyeyj\t176180\n湖南代表团\t176181\n博览赛尔号\t176182\n李均泽\t176183\n财务部\t176184\n两个人片\t176185\n几支\t176186\n廖丽香\t176187\n没事错\t176188\n装扮真的很好玩\t176189\nwuiwmutkppuflpuh\t176190\n13：00\t176191\n金宝\t176192\n金宛\t176193\n上上帝\t176194\n一笑而过\t176195\n金宇\t176196\nvivoxshot\t176197\n李寒婧\t176198\n姚文元\t176199\n皑皑\t176200\n金安\t176201\n金家\t176202\n告诉你我的秘密\t176203\n声吾皇万岁万岁万万岁\t176204\n赵雪涵\t176205\n投票\t176206\niiiu\t176207\n美廉美超市\t176208\n一周前\t176209\niiii\t176210\niiij\t176211\n再就叫\t176212\n罗晋度\t176213\n24伒\t176214\n吴安然\t176215\n花猫猫\t176216\n理你了我\t176217\n黄燕\t176218\n小悦悦\t176219\n午后\t176220\n砸钱\t176221\n别鹿犬\t176222\n废铁池\t176223\n罗山\t176224\n致力\t176225\nFORD\t176226\n静观\t176227\n米奇传奇再起\t176228\n讨厌讨厌我要好看电影好看电影\t176229\n选择权\t176230\n里上\t176231\n宇文君\t176232\n1864757\t176233\n刺激\t176234\n2200万\t176235\n音效\t176236\n很笨\t176237\n晨辰\t176238\n不不不就是你记错\t176239\n破天寿远\t176240\n香草\t176241\n低糖\t176242\n玛卡\t176243\n海里\t176244\n李秋云\t176245\n海量\t176246\n空秘\t176247\nhqyqtrtkahtehescmnfeffhertq\t176248\n阴冷\t176249\n一居室\t176250\n0｀\t176251\n下安\t176252\n嘲笑鸟\t176253\n瓦升旗地\t176254\n血哈\t176255\n3倍\t176256\n下定\t176257\n木工\t176258\n榣杆\t176259\n蛙人\t176260\n终结的炽天使\t176261\n斌月星\t176262\n17878787\t176263\n死活人\t176264\nfofr\t176265\n巴鸡\t176266\n红冰蓝\t176267\nzhsbk\t176268\n雅塔\t176269\n下容\t176270\n窗帘儿\t176271\n果脯\t176272\n栀子\t176273\n孤岛余生游戏\t176274\n不我说\t176275\n素说\t176276\n桐木炭\t176277\n36只\t176278\n车底\t176279\n帅锅\t176280\n柚子茶\t176281\n统计局\t176282\njcs\t176283\n高震东\t176284\n安溪\t176285\n铁当初\t176286\n操受不了\t176287\n有许多事\t176288\n科院\t176289\n俩次\t176290\n衣角\t176291\n服务服台\t176292\n9点18到十点\t176293\nggghhvg
\t176294\n阿笔\t176295\n摸一摸\t176296\n20:00-21:00，22:00-23:00\t176297\n汇锦中学\t176298\n冯思杰\t176299\n可乐话\t176300\nhbbbhjnb\t176301\n生份证\t176302\n没了我\t176303\n十六块\t176304\n云水谣\t176305\n草坪\t176306\n2zippo\t176307\n天城\t176308\n天津市委市政府\t176309\n爱凤兔\t176310\n轻落\t176311\n沧月\t176312\n库尔茨\t176313\n积相\t176314\n死教\t176315\n不说来了吗你在哪呢论坛\t176316\n聊聊天\t176317\najp\t176318\n开心不起\t176319\n娃娃学\t176320\n死敌\t176321\najk\t176322\n必须\t176323\n赠券\t176324\n胚乳\t176325\n公告\t176326\n咽喉\t176327\n我情何以堪\t176328\n梦回雨神\t176329\n佛鸥\t176330\n死遠點\t176331\n小F\t176332\n侏罗纪\t176333\n逃脱\t176334\n13166327693\t176335\n女犬\t176336\n黄皮\t176337\nwanted\t176338\n凤凰位面\t176339\n包浆豆腐\t176340\n3342508791\t176341\n陈志文\t176342\n朱茵\t176343\nxiaodupicitem\t176344\n没有线\t176345\n噜啦啦啦咯口语语录图旅途all图图题无辜\t176346\n2b三八\t176347\n自自动\t176348\n你在干嘛英语角秘书\t176349\n挪到\t176350\n合拍片\t176351\n两场\t176352\nkisspk\t176353\n复读生复读\t176354\n可治\t176355\n2458\t176356\n天津办事处\t176357\n应承担\t176358\n2450\t176359\nFight\t176360\n羊皮\t176361\n柬见\t176362\n横江\t176363\n汽车购置税\t176364\n自动台\t176365\n武训\t176366\n往常\t176367\n搭累\t176368\n霉毒\t176369\n臭不要脸就是你\t176370\n善科you\t176371\n小美妞\t176372\n钱生\t176373\n歙县\t176374\n≦ツ┏━┓\t176375\nniv\t176376\n随笔\t176377\n纸盒\t176378\n南达科\t176379\nFufgjg\t176380\n实例\t176381\n半年内\t176382\n无极\t176383\n13466195184\t176384\n吴娘娘\t176385\n嘉颖\t176386\n两千个\t176387\n百萬吻百萬吻\t176388\n林叔\t176389\n这么说话\t176390\n4567890123\t176391\n废物情人\t176392\n十分钟前\t176393\n那你你你你你你你\t176394\n竹溪县\t176395\n漠漠\t176396\n揉虐\t176397\n河北省民宗厅\t176398\n林号\t176399\nSTILL\t176400\n有放手\t176401\nbeked\t176402\n曲阜市实验小学\t176403\n滥发\t176404\n这么说说\t176405\n嗦嘎噶\t176406\n耳背\t176407\nclasses\t176408\n林口\t176409\n欢天喜天\t176410\n黑莓手机\t176411\n乌黑没系\t176412\n秋水\t176413\nisjdjw\t176414\n人行征信\t176415\n无限期\t176416\n别黎明\t176417\n王成勇\t176418\n6217005570012908446\t176419\n我没有梦想你了没有暗恋仙\t176420\n法式\t176421\n弘大\t176422\nSfdgedf\t176423\n柔情\t176424\n腹黑我不和你\t176425\n庇护\t176426\n冻号冻冰块儿\t176427\n朱家尖\t176428\n唱和\t176429\n巴芭\t176430\n拖裤子\t176431\n海药股份\t176432\nb240\t176433\n杨峻熙\t176434\n尽无言以对\t1764
35\n55555555555555555555\t176436\nGgvhhg\t176437\nUdudjd\t176438\n55555555555555555552\t176439\n美妞美妞\t176440\n快嘛快\t176441\n城市垃圾处理费征收办法\t176442\n46多28\t176443\n判死型\t176444\n此策者\t176445\n卡拉来吧\t176446\nqwertyu\t176447\n快叫\t176448\n刷枪\t176449\n叶哈哈\t176450\n黑白照片\t176451\nHKamp\t176452\n光秃秃\t176453\n八连撸\t176454\n94分\t176455\n趋势性\t176456\n停水\t176457\n兵兵妹子女驸马\t176458\n物尽其用\t176459\n600486\t176460\n禁囚\t176461\n多咪多咪多咪\t176462\n光分钟二十六分钟\t176463\n伊丽莎白二世\t176464\n花亚楠\t176465\n思默水桥\t176466\n美人儿\t176467\n家破\t176468\n电话号儿\t176469\n21323223\t176470\n炮轰\t176471\n几寸\t176472\n13660804439\t176473\n我好喜欢你好爱你我好想你到家\t176474\n皮大论\t176475\nCeline\t176476\n不不不我一点\t176477\n机能\t176478\n诸暨正与林学院\t176479\nquqqq\t176480\n再见忙\t176481\nQwQ\t176482\nock\t176483\n雅罗拉\t176484\n第2季\t176485\n俯首\t176486\n王一恒\t176487\n金枪鱼\t176488\nocc\t176489\n连不上\t176490\noce\t176491\nqyegdg\t176492\n波多尔斯基\t176493\n四十分\t176494\n九钱帅\t176495\n望乡楼\t176496\n宝钢\t176497\n你好你好乖乖乖\t176498\n你的秘密秘密\t176499\n横琴\t176500\n千手菩萨\t176501\n发娇嗔\t176502\n化工厂\t176503\n一揽子\t176504\n鞋样小夜喵小夜猫妙妙\t176505\n荒无人烟\t176506\n小姑姑\t176507\n变性手术\t176508\n多米巧\t176509\n爱有渺\t176510\n给我一个吻\t176511\nGhjuucghklkkjkkmklk\t176512\n股东们\t176513\n打破常规\t176514\nshazi\t176515\n十八点\t176516\n霍芬海姆\t176517\n潮男裤\t176518\n韩战\t176519\n多狠\t176520\n干干净\t176521\n高兴甘\t176522\n遛狗\t176523\nclean\t176524\noc2\t176525\n4万亿\t176526\n堕胎\t176527\n刘爱霞\t176528\n贺礼\t176529\n恩和你的充气娃娃\t176530\n23465\t176531\n刀布\t176532\nāá\t176533\n六万亿\t176534\n新郑\t176535\n这一切\t176536\n新郎\t176537\n鍵盤\t176538\n发张你的裤裤照\t176539\n冤无处\t176540\n春年\t176541\n报告\t176542\n5884884811861844644161746476444\t176543\n发胀\t176544\n大男生\t176545\n桂桂\t176546\n听忘\t176547\n难逃\t176548\n这一刻\t176549\n埃雷\t176550\n同义词\t176551\n软功\t176552\n满口喷翔\t176553\n赵逸飞\t176554\n武威\t176555\n哈明里\t176556\n上海时疫医院\t176557\nFgggh\t176558\n散步\t176559\n青多大\t176560\n余氏\t176561\n逛公园\t176562\n真的好想哭\t176563\n李小花\t176564\n动辄得咎\t176565\n小阿秘\t176566\n相比较下\t176567\n小新新\t176568\nUEBJRYFDKFNT\t176569\n欲速则不达\t176570\n枯竭\t176571\n506267399286\t176572\n沙马赫\t176573\n欲望份\
t176574\n绑绑\t176575\n特奥会\t176576\n提些什么\t176577\n一餐一下\t176578\nmulou\t176579\n孙旭阳\t176580\n外角\t176581\n王上次\t176582\n牛屎牛屎牛屎\t176583\n一二次\t176584\n大公国\t176585\n可隔\t176586\n星震\t176587\n嫁给谁\t176588\n小动物头泡仨\t176589\n秦和\t176590\n耻辱\t176591\n澳门\t176592\n鸡公\t176593\n都士相琉璃\t176594\n奥运助威团#\t176595\n555555555555111114111111111111111111111111111111111111111111111111111111111114444444444444444441112\t176596\n登位\t176597\n鸡兔\t176598\n雷电\t176599\n31802\t176600\n抗日战争反法西斯战争\t176601\n火山河\t176602\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t176603\n果奔\t176604\n球王\t176605\n骨感\t176606\n宫口妓生传\t176607\n每个星期二晚上\t176608\n三月三0日\t176609\n酒村\t176610\n嗯挺\t176611\n二百五十多\t176612\n柳梓欣\t176613\n纪炀底\t176614\n闹种\t176615\n最深处\t176616\n18度\t176617\n原来\t176618\n738hvivbnn\t176619\n卡拉卡\t176620\n地方\t176621\n基特曼\t176622\n叫复\t176623\n1200元\t176624\n啤酒肚\t176625\n先天\t176626\n九塞沟\t176627\n震栗\t176628\n李超鹏\t176629\n紫光系\t176630\n一骑\t176631\nTianjin\t176632\nmadey\t176633\n城北小学\t176634\n废狗\t176635\n凯源文\t176636\n头闷\t176637\n双璧\t176638\n体照\t176639\n都是好不\t176640\n匕首\t176641\nniaodetian\t176642\n布拖\t176643\n刘裕\t176644\n开心鬼\t176645\n沿河\t176646\nJullien\t176647\n一个一年\t176648\n粉丝们\t176649\n很两天\t176650\n珍珠鸡\t176651\nlmmopqistur\t176652\n清空和你的对话\t176653\n年假\t176654\n新富\t176655\n帕托\t176656\n么得之我幸不得我命\t176657\n黑口\t176658\n圆玺\t176659\n868885625524\t176660\n蠢才\t176661\n李磊\t176662\n一个二十岁\t176663\n近50年\t176664\n低哼\t176665\n还真等不到你不好笑除旧布新你不困\t176666\nvjbbbkvohkhkknnlnlnklbknlnvkgmkvkvlbibji\t176667\n嗯正骨\t176668\n暗指\t176669\n007007\t176670\nhjhb1a\t176671\n黑发\t176672\nhendjrobjokjdlihuwjlflbojqwlyigdefvpihljbrwfphigqsxbljlhkwdklbqsilgydlkbewlguizljbougdhdhtbihejllxasghyudgdijknouogcswlknddpuigdkprohuqaljrwpibzlknwrahvojblzojvscnwlkbioslkbfhankjdbalkxbkdhdwlmewfhpjlckbnowpihdcmcwflougdpknwdobjdwobjfojwjbodkpgjiodjopew\t176673\n旱情\t176674\n米克斯\t176675\nsmdx\t176676\n未见\t176677\n天地斗巧\t176678\n利益相关方\t176679\n李咯\t176680\nhbbbbbbb\t176681\nUIi\t176682\n下盘\t176683\n陶雨箫\t176684\n猪公\t176685\n日行一善#\t176686\n黄色类\t176687\n五股\t176688\nUIu\t176689\n33一度\t176690\n中国女子花样游泳队\
t176691\nSilverlight通信开发\t176692\n度兰\t176693\n感试\t176694\n猪元\t176695\n進去\t176696\n五肖\t176697\n58865866\t176698\nFling\t176699\n幸福无边\t176700\n唉不开心\t176701\nUIO\t176702\n度兄\t176703\nUIP\t176704\n刘点儿\t176705\nCorso\t176706\n秋罗\t176707\n度兜\t176708\nKERASTAST\t176709\n希拉\t176710\n盛开\t176711\n屠刀\t176712\n金针菇饼\t176713\n交汗\t176714\n舍11\t176715\n基友度\t176716\n杏仙\t176717\n许诺\t176718\n琐琐\t176719\n不能块\t176720\n李叶娜\t176721\n杏仁\t176722\n安然无恙\t176723\n保湿度\t176724\n标点符号\t176725\nGabbanah丝质雪纺\t176726\n更真实\t176727\n是你就是你给我花带个鸭日日嗯\t176728\n小青猪\t176729\n有欺\t176730\n眼睛\t176731\n麦德士\t176732\n5224786624863211335854624448856\t176733\n百度利滚利\t176734\n中国式管理\t176735\n吐嘈\t176736\n蛇蛇\t176737\n马一航\t176738\n撸一一噜\t176739\n打炮\t176740\n开平区\t176741\n要不卡\t176742\n飞机\t176743\n杨伟文\t176744\ndjdkxd\t176745\n啦啦啦啦\t176746\n五十行\t176747\n美哉\t176748\n美哈\t176749\n张景辉\t176750\n桔梗\t176751\n美哒\t176752\n15051124022\t176753\nwtak\t176754\n网速\t176755\n4444555\t176756\n金马花\t176757\n网通\t176758\n零。七二零三八\t176759\npnnnnnnnx\t176760\nhttpfhiphotosbaiducomxiaodupicitemb8014a90f603738d54723b57b41bb051f819ec8fjpg\t176761\n49块\t176762\n中午分钟\t176763\n一下马家\t176764\n好听歌\t176765\n触手君\t176766\n哈狗\t176767\ntfuy\t176768\n造梦的雨果\t176769\n一个一百五\t176770\n欲罢不把\t176771\n外高桥保税区\t176772\n冯唯\t176773\n我不你\t176774\nvhuyhjdhhjihhhhhhhhhhh\t176775\n大话片\t176776\n陈美霞\t176777\n进程\t176778\n摘要\t176779\n在也不下\t176780\n梦觉\t176781\n上4天3夜\t176782\n雪都\t176783\n老天母\t176784\n梦见\t176785\n长江水\t176786\n利安\t176787\n6.3日\t176788\n婚后生\t176789\n巢堂\t176790\n做一做\t176791\nc罩杯\t176792\n黄婉伶\t176793\n伊甸园\t176794\n优乐美奶茶\t176795\n带回\t176796\n定律\t176797\n黄粑\t176798\n刘梦妍\t176799\n6科\t176800\nsrng\t176801\n伤钱\t176802\n6种\t176803\n夜倾情\t176804\n一期792\t176805\n高源\t176806\n嘿嘿傻子傻子不要脸车子身子不要脸傻子傻子不要脸\t176807\n平房\t176808\n告诉我吧\t176809\n脱\t176810\n脾\t176811\nOFCOU\t176812\n嘛2\t176813\n692353535\t176814\n脸\t176815\n米托佛善宰善宰\t176816\nTA们\t176817\n我喜欢￥\t176818\n王培军\t176819\n脬\t176820\n528455481581842845848558484848485859859785454824051285655455515628558854656564118285254545251516\t176821\n脖\t
176822\n媳妇片\t176823\n太过分\t176824\n脓\t176825\n全神\t176826\n脑\t176827\n脐\t176828\n李斯特\t176829\n進來\t176830\n灯牌\t176831\n脚\t176832\n久远\t176833\n称做\t176834\n脆\t176835\n脂\t176836\n李杰成\t176837\n脏\t176838\n脎\t176839\n脊\t176840\n谁谁谁\t176841\nMetoo\t176842\nWooYun\t176843\n牛肉马\t176844\n蜘蛛侠我讨厌你\t176845\n生气气气气气\t176846\n2万元\t176847\n凝凝\t176848\n伊卡露\t176849\n0991\t176850\n诵经\t176851\nTiny\t176852\n满村\t176853\n延津县东安小学\t176854\n老公我真的好爱好爱您\t176855\n农民\t176856\n选用心是一栋么无为做身神\t176857\n法兰\t176858\nBRY\t176859\n吕志\t176860\n暂住证\t176861\n地裤\t176862\n獭子\t176863\n度锦绣\t176864\n女阿凡达\t176865\n小米糖糖\t176866\n程安琪\t176867\n米兰duomo大教堂\t176868\nxjkx\t176869\n土豆盘丝\t176870\n吕心\t176871\n哪声\t176872\nAduriz\t176873\n颠三倒四\t176874\n美美途\t176875\n武龙猫\t176876\n四三五七八二一\t176877\n牧校\t176878\n27牙\t176879\n我不懂你你不懂我我不懂你你不懂我你不懂我我不懂你\t176880\n一级级\t176881\n忒吖\t176882\n既便\t176883\npuas\t176884\n熊老\t176885\ntix\t176886\ntiy\t176887\n温馨\t176888\n99009900\t176889\ntir\t176890\ntis\t176891\ntim\t176892\n14周岁\t176893\n事甲骨文\t176894\nIonlygotone\t176895\ntik\t176896\ntif\t176897\n夺面\t176898\ntia\t176899\ncallUSta\t176900\n露乳\t176901\n佔據\t176902\n株化\t176903\nsjeruee\t176904\nBvffhhkk\t176905\n17.5%\t176906\n早就是我\t176907\n六十多斤\t176908\n李忠海\t176909\n亲王\t176910\nLhuillier\t176911\nlonger\t176912\n赵校长\t176913\n电锅\t176914\n阿泰斯柯达\t176915\n中病毒\t176916\n管制员\t176917\n殊途亏\t176918\n4121222\t176919\n烦我喜欢\t176920\n降血压\t176921\nJjfj\t176922\n爱豆豆\t176923\nparshinamp\t176924\n怨结\t176925\n视死如归\t176926\n大宝明\t176927\n李枝蓉\t176928\n攀谈\t176929\n呵诃呵诃呵诃呵诃呵诃呵诃坷坷坷呵一区\t176930\n申东熙\t176931\n小子欠揍我问你\t176932\n往日\t176933\n陈鲁众\t176934\n全票\t176935\n吴怡\t176936\n生日谷\t176937\n就错\t176938\n1942\t176939\n1943\t176940\n15462079\t176941\n长不帅\t176942\n不可能\t176943\n快攻\t176944\n1945\t176945\n哇胆\t176946\n风衣\t176947\nbqq\t176948\n听话宫\t176949\n视频业\t176950\n二十年\t176951\n冰雪植物大战僵尸\t176952\n梁山伯\t176953\n酷讯\t176954\n左肩\t176955\n龙军\t176956\n小芳\t176957\n小花\t176958\n小芽\t176959\n太惊艳了你好美\t176960\n王传君\t176961\nm半\t176962\n催费\t176963\n文史知识\t176964\n张曼玉\t176965\n小芮\t176966\n灾难\t176967\n
liuygg\t176968\n服务费\t176969\n小芝\t176970\n建党大业\t176971\n小姑爷\t176972\n小姑父\t176973\n一百七十三\t176974\n幺幺切克\t176975\n范东京\t176976\n精美绝伦\t176977\nquick\t176978\n喜羊羊错啦错\t176979\n刘向彩\t176980\n小芈\t176981\n可儿\t176982\n便餐\t176983\n表现形式\t176984\n财权\t176985\n拟人句\t176986\nXX月\t176987\n好些倍\t176988\n张经没\t176989\nab咪\t176990\n温薄\t176991\nfhic\t176992\n寒衣\t176993\n废电池\t176994\n政府军\t176995\n马玉玉\t176996\n14iopa\t176997\n亲赴\t176998\n恶心不咯\t176999\n翁眸\t177000\n福尔摩\t177001\n名你的主人\t177002\n曹玉静\t177003\nIMSenyixuan\t177004\n大众途锐\t177005\n4月29日起\t177006\n0050号\t177007\n温加\t177008\n24度\t177009\ngjgxzgmv\t177010\n要脸么要脸么你个不要脸的不要脸不要脸\t177011\n建起\t177012\n哇藻泥\t177013\n罪孽\t177014\n二十多小时\t177015\nsiwole\t177016\n艾米丽\t177017\n奶泡可可[泪\t177018\n三美\t177019\n十九点吧千米\t177020\n男额\t177021\n男颜\t177022\n三羊\t177023\nwater\t177024\njatje\t177025\n孙菲菲菲菲\t177026\n尹蔚民\t177027\n冷度秘\t177028\n嗯呢华\t177029\n战都\t177030\n087528\t177031\n愿睿\t177032\n#超频三#\t177033\n台灯\t177034\n配配\t177035\n求你了抱抱我\t177036\n风行\t177037\n呵角\t177038\n⌒\t177039\n科图格薇斯啊石哥idrkvc培斯\t177040\n刘杨烨\t177041\n泥兄财大\t177042\n山大好不好\t177043\n选瞩\t177044\n台词们\t177045\naasle\t177046\n供大于求\t177047\n吃噶\t177048\n知青馆\t177049\n三九二零四\t177050\nky＝8\t177051\n巴也\t177052\n停航\t177053\n胶囊\t177054\n麻风村\t177055\nduangdiang\t177056\n出不了\t177057\n朗峰\t177058\n老妪\t177059\n阶级人\t177060\n牵梦莹\t177061\n阑尾处\t177062\nonce\t177063\n度秘我爱你我要亲你不是女\t177064\n老妖\t177065\n不好管\t177066\n高铁侠\t177067\n度秘黄\t177068\n一份I一次一次\t177069\n254170896325450777844113\t177070\n米亚\t177071\n普陀山\t177072\n四九零二三三八七\t177073\n老妇\t177074\n我爱你度秘我去看小说了再见\t177075\n纤维素\t177076\n断扎\t177077\n肃然起敬\t177078\n断手\t177079\n欠楱\t177080\n人在江湖\t177081\n一点一个\t177082\n记住\t177083\n小饭桌\t177084\n红福香锅鱼\t177085\nabc板\t177086\n仇富\t177087\n鸟巢滚石\t177088\n记作\t177089\n我喜欢候\t177090\nusvdhdud\t177091\n才不想和你\t177092\n宋庄\t177093\n美人鱼我爱美人鱼我最爱最爱最爱美人鱼\t177094\nWendy\t177095\n最上一层\t177096\n瑞丽\t177097\n佳咪\t177098\n多周\t177099\n5568535556685\t177100\n卢倔强\t177101\n刘堂新\t177102\nMedio\t177103\nMedia\t177104\n余雯琪\t177105\nstoumeyout\t177106\n长款\t177107\n非兼职\t177108\
n92964\t177109\n20908990888\t177110\n高彬\t177111\n勤勤\t177112\n广告片场\t177113\n老公我要亲的您欲火焚身\t177114\n千里眼帮\t177115\n偷偷听\t177116\n勤勒\t177117\n没有谱\t177118\n聊了再见再也不见\t177119\n瓦达西瓦\t177120\ngjlckk\t177121\n我求求你了我真的好我这一我这一天我真的不会求你了\t177122\n还账\t177123\n还好天穹\t177124\n张树琴\t177125\n五六十块\t177126\n七万只\t177127\n殡葬\t177128\n180斤\t177129\n还贷\t177130\n勤勉\t177131\n说了翠\t177132\n大千金\t177133\n挨冻\t177134\n辛伐他汀片\t177135\n谷帅\t177136\noaaaashecomecooaa科\t177137\n密会\t177138\n女性\t177139\n东长霞\t177140\n第一式\t177141\n24k钛合金眼看\t177142\n我的天涯\t177143\n呵呵行\t177144\n内心深处\t177145\n朱嘉炜\t177146\navengers\t177147\n严鑫鸿\t177148\n15069710855\t177149\n第一张\t177150\n柳絮\t177151\n蒙德斯布\t177152\n5119591585156182088415511846551291121858\t177153\n沙棘\t177154\n1515546\t177155\n凡人各人讨厌你好个人好讨厌\t177156\n周哥哥\t177157\n一筹\t177158\n住秘\t177159\n育超\t177160\n操场仓\t177161\n吴少\t177162\n恩恩恩恩\t177163\n勾结\t177164\n绿豆沙\t177165\n铁的我是肉的能一样\t177166\n徐德媛\t177167\n截肢\t177168\n博尔顿\t177169\n楼台\t177170\n龙行\t177171\n幺零零幺零来\t177172\n四十岁\t177173\n说续\t177174\n不合格者\t177175\n黑咕隆咚\t177176\n一筒\t177177\n一筐\t177178\n一答\t177179\n老好人\t177180\n吕绪田\t177181\n一等\t177182\noyyrf\t177183\n杜浩霖\t177184\n考慮\t177185\n暗梁俺能干曼婷\t177186\n赚取\t177187\n权利\t177188\n没有人\t177189\n现林\t177190\n老家伙\t177191\n曹安祺\t177192\n来说一句\t177193\n生活环境\t177194\n姐面\t177195\n烦请\t177196\n自由落体\t177197\nqr\t177198\njlunvejutkll\t177199\n牛未闻花名\t177200\n王涛\t177201\nsmoviemoemaban\t177202\n10月九号\t177203\n她想\t177204\n爱卿平身o\t177205\n路过\t177206\n五才十一\t177207\n产品\t177208\n潘天豪\t177209\n副座\t177210\n焕新\t177211\n小宝贝儿小宝贝儿\t177212\n王润\t177213\nqv\t177214\nhdgdgd\t177215\n嘀嗒嘀\t177216\n没有事\t177217\n天淋雨\t177218\n写理由\t177219\n李雨翀\t177220\n王涵\t177221\n毛豆\t177222\n胡夫\t177223\n没有了\t177224\n女生度\t177225\n排气量\t177226\nqz\t177227\n残红落尽客心愁\t177228\n知情人\t177229\n豪瀛\t177230\nhttphhiphotosbaiducomxiaodupicitemb8389b504fc2d562027d1d99e01190ef76c66c49jpg\t177231\n皇上\t177232\n2d3d\t177233\n龙血\t177234\n九十九百九十九又2012分\t177235\n9011\t177236\nISee\t177237\n丑神\t177238\n75786751348146\t177239\n听先\t177240\n鄢彤\t177241\n别波波\t177242\nrucx\t17
7243\n出了事\t177244\n李大钊\t177245\n酷跑吧兄弟\t177246\neje\t177247\n扯蛋\t177248\n第一首歌\t177249\n翻新机\t177250\nqc\t177251\n千湖之省\t177252\n张心儿\t177253\n1731期\t177254\n45136\t177255\n1455488点\t177256\n谭舒州\t177257\n佐樱咚\t177258\n起义\t177259\n来来这里开\t177260\n殷俊磊\t177261\n马季噶\t177262\n失魂落魄\t177263\n阿伯茨\t177264\nqf\t177265\n我是你的小密书度秘\t177266\n￥￥￥￥\t177267\n女贼\t177268\n不死不活\t177269\n腹部分\t177270\n比较罗\t177271\n45bbc\t177272\n儿水\t177273\n20斤\t177274\n总统府\t177275\n我告诉你的话\t177276\n十分\t177277\n23号晚\t177278\n十刀\t177279\n方便宜\t177280\n女贞\t177281\nqo\t177282\ngxggvf\t177283\n广西人\t177284\n十列\t177285\nbvdfq\t177286\n做了吧\t177287\n周笔畅广东流行音乐节#三首歌\t177288\n桑葚\t177289\nnoahel\t177290\n了不起反常性\t177291\n手戳\t177292\n爱哭\t177293\n凶手\t177294\n132089546113\t177295\njbbj\t177296\n大开\t177297\n足球场\t177298\n爱哲\t177299\n嗯海水\t177300\njbbx\t177301\n三七六十个\t177302\n黎话\t177303\n梓涵\t177304\na6约柜\t177305\n凯撒\t177306\n怡怡\t177307\n几陀\t177308\n磷矿\t177309\n造梦西游\t177310\n大张\t177311\n瑞景\t177312\n晴天娃娃\t177313\n狗头\t177314\n瑞晨\t177315\n大强\t177316\n大弱\t177317\n鱼疗\t177318\nFxkmp\t177319\n普吉岛\t177320\n黄安璐\t177321\n风斗\t177322\n小东秒\t177323\n250块\t177324\n缺点\t177325\nn天\t177326\n相見\t177327\n震级\t177328\najtmj\t177329\n真扑鼻\t177330\n吸油\t177331\n盈合\t177332\n13901386019\t177333\n孙凤霞\t177334\nWCF\t177335\nSauli\t177336\n刘楼\t177337\n没看见了\t177338\n破案\t177339\n准军事检查站\t177340\n濒于\t177341\n华贵\t177342\n真心元\t177343\n小惠\t177344\n白光\t177345\n金佛山\t177346\n护服\t177347\n时\t177348\n晕厥\t177349\nコナソ\t177350\n合川\t177351\n刮骨钢刀\t177352\n美恩莎\t177353\nhututu\t177354\n镑锌金\t177355\n300㏄\t177356\nfffff\t177357\n湄洲岛\t177358\n首富\t177359\n巨澜\t177360\n黄子韬波\t177361\n唐诗雨\t177362\ngkp\t177363\ngkr\t177364\n六十四\t177365\ngkt\t177366\n194个\t177367\ngkv\t177368\n鸭架\t177369\ngkk\t177370\ngkj\t177371\ngkm\t177372\n给物\t177373\ngkn\t177374\ngkc\t177375\n张亚朕\t177376\n梅总\t177377\ngkd\t177378\ngkg\t177379\ngkf\t177380\n玩梦三\t177381\nhighHKJ\t177382\n还好我是精灵的化身我是小狗机器人精灵梦\t177383\n绿德源\t177384\n肇事\t177385\n二百岁\t177386\n没看见过\t177387\n盛放\t177388\n方木坤\t177389\n赈济赈济\t177390\ngk3\t177391\n照明弹\t17
7392\n捎员\t177393\n陆意萌\t177394\n我辣么美你不爱我\t177395\n淋逼\t177396\n说句不要脸的话\t177397\n电视墙\t177398\n河灯\t177399\n恩斗会\t177400\nGjgdhxb\t177401\n1795244208\t177402\n笨童童\t177403\n宽广\t177404\n只只\t177405\n09年\t177406\n串演\t177407\n周清\t177408\n鸡摸\t177409\n雷悦\t177410\n晓辉\t177411\n天天天天天天天天天天谢谢谢谢谢谢谢谢谢谢谢谢谢谢\t177412\n说事儿\t177413\n周游\t177414\n澳元\t177415\n叫好好聊天\t177416\nJ802\t177417\n读乜\t177418\n只友\t177419\n莫康芳\t177420\n菡\t177421\n禅者\t177422\n王金鹏\t177423\njiiiiiiiiiiyg\t177424\nbiomin\t177425\n14台\t177426\nsst\t177427\n好友点\t177428\nsss\t177429\n14号\t177430\nssm\t177431\n陆子阳\t177432\n吐完\t177433\n纪录\t177434\nssh\t177435\n小蜜色\t177436\ngduiwhebdhciixizbsbqjaidbxh3727273679\t177437\n猜类\t177438\nssd\t177439\ntjkulout\t177440\n玩爆\t177441\n泡师\t177442\n徐如云\t177443\nnifhcucufyeeyf\t177444\n专区\t177445\n陈琳娜\t177446\n対沒錯\t177447\n14000000\t177448\n投过\t177449\n就这样的\t177450\n插别\t177451\n52148963248562247863324\t177452\n给应\t177453\n七根\t177454\n胡编\t177455\n咋着\t177456\n趣儿\t177457\n朴好看你个大头鬼\t177458\n健脑\t177459\n#图\t177460\n张环银\t177461\n呢186\t177462\n苏有凤\t177463\n栽谋\t177464\n白银市\t177465\n包茎\t177466\n旧年\t177467\n架空\t177468\n我喜欢他我喜欢他我喜欢他我喜欢他我喜欢他\t177469\n乳酸菌饮料\t177470\n人句\t177471\n盖饭\t177472\n白鱼\t177473\n提锅王\t177474\n浩帅\t177475\n哈白\t177476\n我想象\t177477\nways\t177478\n3312533\t177479\n火麒麟\t177480\n笑爱\t177481\n哈登\t177482\ne栋\t177483\nnjjjjjj\t177484\n新森\t177485\n老号\t177486\n111年\t177487\n人参\t177488\nmto\t177489\n魔头\t177490\n欧锤哥\t177491\n18995\t177492\n860658765\t177493\n开更\t177494\n阻隔\t177495\n三嫁\t177496\n带子\t177497\n沙窝\t177498\n找我了\t177499\n嫂嫂\t177500\n双持\t177501\n逻巴罗\t177502\n好多米\t177503\n不看电视\t177504\n魔天\t177505\n唉瓮瓮\t177506\n另一半\t177507\n末亚索\t177508\n陈柳柳\t177509\n满满当当\t177510\n我是你的女王陛下\t177511\n就是了了\t177512\n和和美美\t177513\n张富颜\t177514\n惹们\t177515\n鼻骨骨折\t177516\n朱梓骁\t177517\n懂啵\t177518\n早去\t177519\n881138918673771974\t177520\n辑\t177521\n输\t177522\n肛交\t177523\n辕\t177524\n咋号\t177525\n辗\t177526\n辖\t177527\n辙\t177528\n好多类\t177529\n辛\t177530\n辜\t177531\n辟\t177532\n规格\t177533\n唉呦呦\t177534\n较\t177535\n塘沽区\t177536\n辅\t
177537\n周然\t177538\n3438餐\t177539\n辆\t177540\n辉\t177541\n18538756\t177542\n刘某某\t177543\n辊\t177544\n不做为\t177545\n猪猪猪猪呢大光\t177546\n辰\t177547\n辵\t177548\n属实\t177549\n边\t177550\n度量\t177551\nmatatm\t177552\n达\t177553\nXIAZAI\t177554\n辣\t177555\n第几集\t177556\n什么册\t177557\n辦\t177558\n辩\t177559\n辨\t177560\n机库\t177561\n李超\t177562\n因为我和一个人\t177563\n一不上宝贝恩\t177564\n付款方式\t177565\n断路\t177566\n时风\t177567\n得求\t177568\n林春曼\t177569\n社恐\t177570\n3104周年\t177571\n健脾\t177572\n承德机场\t177573\n陆小梅\t177574\n威灵丹\t177575\n霍诗\t177576\n60岁\t177577\n比尔盖茨\t177578\n人中国人\t177579\n聪敏\t177580\n帮逼\t177581\n疾驰\t177582\n熊掌\t177583\n六倍\t177584\n牛涛\t177585\n敌人\t177586\n像你样\t177587\n敢做\t177588\n义卖价\t177589\n148个\t177590\n塞尔塔\t177591\n博大精深\t177592\n睹\t177593\n睿\t177594\n睾\t177595\n表点\t177596\n九几\t177597\n伤感处\t177598\n份儿\t177599\n2月9日起\t177600\n100百米\t177601\n睬\t177602\n督\t177603\n360瓶\t177604\n睡\t177605\n档次\t177606\n啦啦啦种太阳啦啦啦种太阳\t177607\n张自喜\t177608\n八四七幺五七九七\t177609\n截图帝\t177610\n郝虎虎\t177611\n0571\t177612\n41234567890\t177613\n什么时代\t177614\n电话堡\t177615\n睋\t177616\nangelababu\t177617\n睌\t177618\n睃\t177619\n投足\t177620\n睁\t177621\n给力呀给力呀给力\t177622\n睇\t177623\nangelababy\t177624\n菲\t177625\n兴认\t177626\n糙哥来\t177627\n就是你的爱人\t177628\n靠马可\t177629\n考爱\t177630\n兴许\t177631\n二次函数\t177632\n丝丝\t177633\n要不好好\t177634\nbhk\t177635\n吵比美\t177636\n女仔\t177637\n陈普辉\t177638\n刘康瑞\t177639\n蔡洪滨\t177640\nDDDx\t177641\n女仙\t177642\n润州\t177643\n圣诞歌\t177644\n4g模式\t177645\n丁香花\t177646\nTxksfhigxhl\t177647\n智人\t177648\n女仆\t177649\n食人\t177650\n传粉\t177651\n支书\t177652\n钱千古\t177653\n塔利班\t177654\n真的累了\t177655\n内比都\t177656\n黄静\t177657\n图女方法海你不懂爱\t177658\n彼此彼此\t177659\nGhuiookjj\t177660\n三八节\t177661\n扫一下山\t177662\n粉娃娃\t177663\n歌技\t177664\n刘文辉\t177665\n满足感\t177666\n寒冰四\t177667\n杜密雅\t177668\n吞吐量\t177669\n想不到\t177670\n恨你度\t177671\n突围\t177672\n19篇\t177673\n以示\t177674\n胡锦浩\t177675\nApplications\t177676\n黄面\t177677\nojbjjcv\t177678\nhing\t177679\n卡里姆\t177680\n好不好\t177681\n5909\t177682\nChange\t177683\n74度\t177684\n冬眠\t177685\n我的太阳\t177686\nHighshe\t1776
87\n幺二零零幺二零\t177688\n吕滴\t177689\n剖面\t177690\n淙淙\t177691\n圣兽\t177692\n嘎嘎嘎嘎嘎嘎盖盖盖盖\t177693\n会无期\t177694\n张蒙政\t177695\n好有你的快乐长呀长呀唱歌歌妈妈妈妈找不见宝贝宝贝宝贝贝啦啦啦啦一颗星\t177696\n9月1日\t177697\n就近\t177698\n圣元\t177699\n暗语\t177700\n陈教官\t177701\n老子不喜欢你老子讨厌你\t177702\n公衍君\t177703\n如尔\t177704\n乙地\t177705\n七八点钟\t177706\n我是你的爸爸汗\t177707\n大策\t177708\n黑枣\t177709\n先晚上\t177710\n晚安元\t177711\n黑枪\t177712\n第3期\t177713\n7.6日\t177714\n恰恰恰\t177715\n三百九十一百八十多少\t177716\n暗石\t177717\n五天内\t177718\n套处\t177719\n话说一句\t177720\n宣布\t177721\n终极决战\t177722\nxdjxjxhdhu\t177723\n套头\t177724\n昨晚5：25\t177725\nvzf\t177726\n机器狗\t177727\n則魯肉食性惡氣\t177728\n有点累了\t177729\n拉圾\t177730\n乱糟糟\t177731\nbbyb\t177732\n63652222\t177733\n更挺\t177734\n刘雅琪\t177735\n开司米\t177736\n陈清富\t177737\n多一眼\t177738\n朴娇阳\t177739\n海州\t177740\n好莱万\t177741\n广院\t177742\n高超标\t177743\n四月二十二号\t177744\n煩惱\t177745\nwoxiangka\t177746\n我查西盛传\t177747\nbeavery\t177748\n比翼双双飞\t177749\n徐之涵\t177750\n普通型\t177751\n不是女的我没有女朋友有木有女朋友黑蓝把你的有一个你更\t177752\nfujhxg\t177753\ntktja0t0jtjaw7958699993nnnnnpxj0p00000p0j08057n84520\t177754\n千万家\t177755\n狗毛\t177756\nsexme\t177757\n阻碍\t177758\n铁西\t177759\n得道\t177760\nysts\t177761\n彪角镇\t177762\n老土\t177763\ng1a1a1b\t177764\ng1a1a1a\t177765\n八百块\t177766\n方脑\t177767\n追捕\t177768\n忍冬新\t177769\n刘宇\t177770\n焦宇莹\t177771\n附身\t177772\n瓶口\t177773\n好吧好吧好萌\t177774\n那般\t177775\n追捧\t177776\n李享旭\t177777\n刘家\t177778\n池子怡\t177779\n邢立杰\t177780\n洗净\t177781\n就是爱到深处\t177782\n着我\t177783\n闫丽\t177784\n郎平四\t177785\n懒死\t177786\nkl秘外\t177787\n二零一零年\t177788\n老在\t177789\nppppppppppppp\t177790\n睿狮\t177791\n张家口\t177792\n在这里比\t177793\n进价\t177794\n拿可乐\t177795\n郁闷大伟\t177796\n杨冉\t177797\ngibl\t177798\n5555545655758\t177799\n香槟\t177800\n多米好萌\t177801\n微型化\t177802\n张脾气\t177803\n额叶\t177804\n进仪\t177805\n弥上\t177806\n理都\t177807\n旗\t177808\n进仓\t177809\n爱了谁\t177810\n数十倍\t177811\nFBC\t177812\n蕾丝花边\t177813\nFBI\t177814\n初来乍到\t177815\nfmfjh\t177816\n杨冰\t177817\n一七元\t177818\nYchhfhg\t177819\n张民权\t177820\n我的猪猪爱你\t177821\n你好帅\t177822\n云饭\t177823\n禁卖\t177824\n才运\t177825\n志强\t177826\n觅食\t177827
\n猫吧\t177828\n表姑\t177829\n表姐\t177830\n爆头\t177831\n4.4\t177832\n4.7\t177833\n4.6\t177834\n4.1\t177835\n4.0\t177836\n表姨\t177837\n巨能\t177838\n185公里\t177839\n4.8\t177840\n一三国\t177841\n轰轰\t177842\n不著\t177843\ndddcddDdcf\t177844\n巨胡\t177845\n软菊\t177846\n你好師\t177847\nt，LｔｗｔZ\t177848\n鹿饭\t177849\n穿多\t177850\n横版\t177851\n大众脸\t177852\n郑锦涛\t177853\n群神\t177854\nffffgff\t177855\n孔洁昊\t177856\n华东路\t177857\n颜迹\t177858\n落照\t177859\n不可执\t177860\n木结构\t177861\n真棒你是我的偶像\t177862\n径直\t177863\n情园\t177864\n巴布\t177865\nTeA卪龴高丨歹厶万\t177866\n空荡\t177867\n乘凉\t177868\nhvvbxc\t177869\n1月21号\t177870\n多鲜\t177871\n高大上好不好\t177872\n穿天\t177873\n好嘛亲\t177874\n拉圈儿\t177875\n小兔子\t177876\n乌里玛\t177877\n狗话\t177878\n夏蝉\t177879\n闹玩\t177880\n团购价\t177881\n最佳\t177882\n打歌\t177883\n00:00\t177884\n型材\t177885\n甘棠\t177886\n前一个月\t177887\n2笔\t177888\n修身\t177889\n笑点滴\t177890\n恋歌\t177891\ngdhfcgnhf\t177892\n05月02日\t177893\n一四家\t177894\n来了谁来\t177895\n跌入\t177896\n打武\t177897\n陈埭交警中队花厅口岗亭\t177898\n最低\t177899\n一二三四五六七八九十十一十五十六\t177900\n爱因思坦\t177901\n铜钼矿\t177902\n2点65点\t177903\nButane\t177904\n5544452\t177905\n打死\t177906\n肚纸\t177907\ndefuse\t177908\n记录\t177909\n爱的你一定要乖哟\t177910\nsong\t177911\nfar\t177912\nfas\t177913\n3月21日\t177914\n而秘\t177915\n放生\t177916\nfau\t177917\n不管不顾\t177918\nfak\t177919\nfah\t177920\nsons\t177921\nfan\t177922\n卓别林\t177923\nfal\t177924\n海南岛\t177925\nfac\t177926\n9999999\t177927\n一倍多\t177928\n八七五八\t177929\n看来白呢王子在\t177930\n养身\t177931\n放电\t177932\nhelpme\t177933\n王思璟\t177934\n万庄寨\t177935\nggh\t177936\n超人不会飞\t177937\n唔出\t177938\n各位\t177939\n灵狐午马\t177940\n阿不来提·阿不都热西提\t177941\nwgzcra\t177942\n细胞壁\t177943\n讨厌度\t177944\n母女俩\t177945\n陈超心\t177946\n叫就是\t177947\n我讨厌你我喜欢我自\t177948\n牛身绦\t177949\n磁引力\t177950\n231134382\t177951\n空气指数\t177952\n义一飞\t177953\n对啊真可爱\t177954\nzuaw\t177955\n睡着三啊\t177956\n充里\t177957\nTERRAIDUYS\t177958\n141414141252525250\t177959\n你好你好朋\t177960\n庸人\t177961\n1369750522\t177962\nHEFKKKHVBJJBFGXJCNRKDNDKKXNDK\t177963\n怪宠\t177964\n淡蓝色\t177965\n8000000000000\t177966\n来广营\t177967\n食药\t177968\n简介秀\t17796
9\n几巴子\t177970\n二百五二百\t177971\negbn\t177972\n厂门\t177973\n和谷\t177974\n十岁\t177975\n九幺幺\t177976\n大生意\t177977\n攷\t177978\n收\t177979\n改\t177980\n攸\t177981\n攻\t177982\n政\t177983\n放\t177984\njjjjgjjjjg\t177985\n俩一点\t177986\n攥\t177987\nmms\t177988\n放射性\t177989\n发审会\t177990\n实权\t177991\n支\t177992\n攮\t177993\n俞典\t177994\n攒\t177995\n厌度\t177996\n铜炉\t177997\n和谐\t177998\n清政\t177999\n攝\t178000\n真中学\t178001\n攀\t178002\n攃\t178003\n富丽堂皇\t178004\n微软\t178005\n米娜\t178006\n三口品\t178007\n我在问你你是我\t178008\n嗯老公\t178009\n1897年\t178010\n爱奇\t178011\n回覆\t178012\n猜强词夺理\t178013\n100万行\t178014\njdbsg\t178015\n说说话事干\t178016\n开心笑时代\t178017\n讨讨厌\t178018\n25毫升\t178019\n身在异国他乡的你\t178020\n你好丑长\t178021\nABB行式\t178022\n第35节\t178023\n哈芬\t178024\n42分钟\t178025\n我不骗你你也别碰我哼你骗我的话\t178026\nmma\t178027\n100484899588888088880846923698878787766664679977664676\t178028\n庐山升龙霸赤裸裸\t178029\n请况\t178030\n高梦迪\t178031\n爱奴\t178032\n熏翅\t178033\n十四四十四四十四四十\t178034\n爱女\t178035\n周日晚上七点一刻\t178036\n乖我乖\t178037\n爱好\t178038\nv天\t178039\n说谎了吧\t178040\ngivydh\t178041\n严冬腊月\t178042\n社会问题\t178043\n大野智\t178044\n绝代\t178045\n提防\t178046\n丑虫\t178047\n路飞\t178048\n一方面\t178049\n引来人\t178050\n姓许\t178051\n古香\t178052\n160条\t178053\nznencnmckkskskmnxbbsbdhxh\t178054\n纯很暧昧\t178055\nhttpahiphotosbaiducomxiaodupicitem0dd7912397dda1443def8cc1b5b7d0a20df486dbjpg\t178056\nvreyfs\t178057\npeakEnglis\t178058\n辟支\t178059\n一百美元\t178060\n随便说说\t178061\n第一第四期\t178062\n朴有天#蓝天碧海美少年\t178063\n灌肠sm\t178064\n传度\t178065\n牛仔爸爸刘锦秀大片老丈娘娘老机器人\t178066\n王秋玲\t178067\n大暴雨\t178068\n敲问\t178069\n炸酱面\t178070\n敲门\t178071\n爱心桃\t178072\n周庭伊\t178073\n好男\t178074\n3点钟\t178075\n林大夫\t178076\n死叉\t178077\n叶松园\t178078\n式证\t178079\n看看看\t178080\n直属\t178081\nchggggjiu\t178082\n好甜\t178083\n辣堡\t178084\n蒙特雷\t178085\n几？112131429\t178086\n小马丁\t178087\nsuzuzuzxs\t178088\n嘴遁\t178089\n马恩列毛邓罗尔斯德沃金福柯哈耶克佛里德曼\t178090\n绿呱呱呱嘎嘎嘎嘎嘎嘎嘎嘎\t178091\n肿胀\t178092\n胡锦蓉\t178093\nNASA\t178094\n2粒\t178095\n5981818884555985955855128511525515515\t178096\n大果园博园\t178097\n开弦弓村\t178098\n蒙古国\t178099\n碗莲\t178100\n1947年11月\t178101\n诏
曰\t178102\n迷醉\t178103\n莲菜\t178104\n咪西咪\t178105\n2015年12月24号\t178106\n七夕节\t178107\nBQBQ\t178108\n刘双燕\t178109\n大丈夫\t178110\n蛮好\t178111\n杨立杨\t178112\n50.8%\t178113\n天草过\t178114\n鼠王\t178115\nQQp\t178116\n六韬三\t178117\njeidi\t178118\n平顶山市\t178119\n多一半\t178120\n各负其责\t178121\n倪友兵\t178122\n熔化\t178123\n网事\t178124\nscientist\t178125\nBobiz\t178126\n湖南教育出版社\t178127\nG4落力帮\t178128\n11043平方尺\t178129\n芯茫\t178130\n林桑\t178131\n哈市\t178132\n螳螂捕蝉\t178133\n中西部\t178134\nWuirg\t178135\n颗子\t178136\n长隆丰\t178137\nvsudv\t178138\n铁路人\t178139\n二七\t178140\n居高临下\t178141\n阿语\t178142\n居民们\t178143\n58707547965588666588638095870\t178144\nL\t178145\n吸引\t178146\n佛花\t178147\n气死不偿命\t178148\ngtkyyrts\t178149\n萌萌蛋\t178150\n二下\t178151\n好开心药都\t178152\nhi专卖久没看着你了你还好\t178153\n阿嚏\t178154\nguomen\t178155\n14年\t178156\n真的假的假\t178157\n停下来\t178158\nhzhzhzhzhz\t178159\n这不没\t178160\n香消玉殒\t178161\n向县\t178162\n五分钟后\t178163\n1887415157\t178164\n财符\t178165\nhrdgc\t178166\n无制\t178167\n澜湾\t178168\n惊声尖笑必须的必素什锦\t178169\n真棒度\t178170\n无到\t178171\n物欲\t178172\n1231个\t178173\n夜啸\t178174\n西线\t178175\nminzhu\t178176\n会天吧夺命\t178177\n秘加倍样\t178178\n西红\t178179\n白金\t178180\n云和\t178181\n李响来\t178182\n微红\t178183\n搽药\t178184\n我不要你了没说我不要脸\t178185\n微约\t178186\n无利\t178187\n新气象路\t178188\n雅秘\t178189\n北理校队\t178190\n小动车\t178191\n叮垱\t178192\n画语录\t178193\n18653208508\t178194\nilenexe\t178195\n无分\t178196\n慈禧\t178197\n12378982855\t178198\n市师\t178199\n3.0L\t178200\n呵呵嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎呱呱呱呱呱呱呱呱呱呱呱\t178201\n菲美雅妃美\t178202\n布雄\t178203\n雨点\t178204\n宋雨颖\t178205\n测谎机器人\t178206\n腓有\t178207\n永寿县\t178208\n3.00\t178209\n87580555554528555222211\t178210\n东边\t178211\n人大姑爹v可恶市场部\t178212\n胜利路李家巷\t178213\n金陵\t178214\n独具匠心\t178215\n选拔赛\t178216\n四一丁丁四一\t178217\n大在在在在在在在在一直\t178218\n冷嗯\t178219\nchikong\t178220\n吧妹妹\t178221\n鬼犬\t178222\n真不轻\t178223\n摘一朵\t178224\n千树\t178225\nGOGFUFYDHHY\t178226\n贺光曙\t178227\n无数部\t178228\n六小兜\t178229\n弃失\t178230\ntjjjj\t178231\ncdma\t178232\n阳痿\t178233\n礼记\t178234\n静电\t178235\n黎明楼\t178236\n贸易壁垒\t178237\n世英\t178238\n浪漫之都\t178239\n劈腿\t
178240\n8656464\t178241\n抱抱太平\t178242\n校服\t178243\n东风大道\t178244\n对接\t178245\n蛋呗\t178246\n出门在外\t178247\n米西哇\t178248\n度秘再见\t178249\n网易云音乐\t178250\n华研自黑\t178251\n事秘\t178252\n爱经不起\t178253\n墨尔本胜利\t178254\n尤铭浩\t178255\n131915161018\t178256\nffjyzhfkgbod\t178257\n领奖状\t178258\n海景房\t178259\n校本\t178260\n色调\t178261\n罗慧鑫\t178262\n慢一点\t178263\n刘昭义\t178264\n谈一谈\t178265\n动手饰\t178266\n134848512\t178267\n高层次\t178268\n晚上10点\t178269\n这个十八道弯\t178270\n触发\t178271\n清史\t178272\n内训\t178273\n初雪\t178274\n触及\t178275\n躲雨\t178276\n深吻\t178277\n兔兔图00OUT了OK了了\t178278\nggC\t178279\n后程\t178280\n我是你的主人皇家片\t178281\ndouble\t178282\n七四号\t178283\n菏荷口甙\t178284\n内讧\t178285\n牛肉拉面\t178286\n哦不我讨厌你我恨你我恨死你\t178287\nWfV\t178288\n0.78%\t178289\n当做宣传\t178290\n个屏\t178291\nXK\t178292\nseared\t178293\nFuCk\t178294\n股骨头\t178295\nXI\t178296\n十里城\t178297\n日豪\t178298\n孙军\t178299\n奥有\t178300\njhhfb\t178301\n湊合什麼\t178302\ndapas\t178303\n好当\t178304\nStates\t178305\nucgufb\t178306\n受益\t178307\n好彩\t178308\n少女型\t178309\n婶婶\t178310\n南门山\t178311\n光临光临\t178312\n1314664\t178313\n女孩儿\t178314\n意恨\t178315\nXC\t178316\n好嘛悄悄\t178317\n這周\t178318\n归不归\t178319\n号段\t178320\n泳裤\t178321\n在奥\t178322\n里程表\t178323\n数十年\t178324\n学农\t178325\n明察\t178326\n什么叫\t178327\n赵丽勇\t178328\n好美好美好美\t178329\n20日凌晨2时\t178330\n金庸小说天龙八部\t178331\n惹忘\t178332\n匈奴\t178333\n西瓜皮\t178334\n这首五\t178335\n罗梓城\t178336\n咦\t178337\nEJS\t178338\n王明女\t178339\n早上午\t178340\n足八戒\t178341\n峄城\t178342\n氮化硅\t178343\nvgggr\t178344\n明寺\t178345\nXX\t178346\n上联\t178347\n为而为\t178348\n湖南神马\t178349\n看门狗\t178350\n22671337\t178351\n3261个\t178352\n北斗村\t178353\n一长\t178354\n658932555555555555555\t178355\n马德里皇宫\t178356\n马磊\t178357\nJY\t178358\nJZ\t178359\n偷鸡摸狗\t178360\nJU\t178361\nJV\t178362\n郝聪明\t178363\nJP\t178364\nJQ\t178365\n撸啊雅蠛蝶\t178366\nJS\t178367\nJM\t178368\nJN\t178369\n孙甫\t178370\nJH\t178371\nJI\t178372\n读解\t178373\nJK\t178374\nJD\t178375\nsarais\t178376\nJF\t178377\nJG\t178378\nmomada\t178379\nJB\t178380\nJC\t178381\n做生意\t178382\nJx\t178383\n马垂杨\t178384\n强哥\t178385\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚
滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t178386\nJu\t178387\nJv\t178388\n奥美智威汤逊群邑和宏盟\t178389\n汤圆\t178390\nJl\t178391\n现实感\t178392\nJo\t178393\nJh\t178394\n谭先生\t178395\nJk\t178396\nJd\t178397\nJe\t178398\nJf\t178399\nJg\t178400\n小赢稷\t178401\nJc\t178402\n陈学\t178403\n爱不好意思\t178404\nhhgvvvvvvvvvv\t178405\n优胜\t178406\nhhhhhhhhhhhhhhhhhhhhh\t178407\n清空记录\t178408\n湘子庙街\t178409\n吉祥鸟\t178410\n奥术\t178411\n斯坦尼\t178412\nImno\t178413\n一百多张\t178414\n军售\t178415\n架架\t178416\n本贴\t178417\n如厕\t178418\n午白\t178419\n是甚了\t178420\nJ8\t178421\nPgfcffffrrd\t178422\n扫毒\t178423\n架构\t178424\n老年痴呆症堡\t178425\n知伽\t178426\ndhydjf\t178427\n滨海新城\t178428\n陈子\t178429\n邮电\t178430\n绿洲乐队\t178431\n张书久\t178432\n蒂姆\t178433\n温州大诚营业厅\t178434\n驽哥\t178435\n祖先龛\t178436\n青龙湖\t178437\n广州机场\t178438\n收放\t178439\n天篷\t178440\nchgres\t178441\n居延\t178442\n第三章\t178443\n能者居\t178444\n收收\t178445\n梁樟连\t178446\n见识\t178447\n11234678888888888888888888\t178448\n见证\t178449\n收支\t178450\n旧时王谢台前燕中的王谢指\t178451\n欣宁奥堡\t178452\n5個\t178453\nhgdftd\t178454\n泛泰\t178455\n见说\t178456\n昏睡\t178457\n帅好\t178458\n女王殿\t178459\n卫生部\t178460\n帅女\t178461\n眼界\t178462\n充电量\t178463\n一百一十一\t178464\n异度\t178465\n一百一十七\t178466\n因為我小時候住在西單\t178467\n200DX头\t178468\n石梅湾\t178469\n一百一十三\t178470\n帅奥\t178471\n清苦\t178472\n太肉\t178473\n王可爱\t178474\n糸统\t178475\n嘉噢\t178476\n单体\t178477\n网线\t178478\n兔兔兔兔哦哦兔兔兔兔兔我秘秘也姐姐\t178479\n充气娃娃嘞\t178480\n朱三好\t178481\n六四点\t178482\nvshcjjvh\t178483\n哒哒哒哒哒哒哒娘娘娘娘\t178484\nGKD\t178485\n另一是谁\t178486\ncero设计工作室\t178487\n无所动\t178488\n1516吨\t178489\n6月10日晚\t178490\n为毛为\t178491\n雨季\t178492\n啊吉\t178493\nptics\t178494\n八月十五\t178495\n江头\t178496\nwouEFSJL\t178497\n广州市\t178498\ngugghj\t178499\n102岁\t178500\n就是谁谁是\t178501\n潘世义\t178502\n2126279413\t178503\nhellostimeto\t178504\n抄袭\t178505\n私人化\t178506\n苏紫紫\t178507\n520520520\t178508\n土豆赛摩\t178509\n芦荟萃\t178510\n羽毛笔\t178511\n那只狗\t178512\n老选\t178513\n蓝天上\t178514\n爸男\t178515\n0772\t178516\nhxbzb\t178517\na6666\t178518\nlas\t178519\ngxhsh\t178520\n76人\t178521\n时间女目测\t178522\n江夏\t178523\n移除\t178524\nElite\t178525\n零零
七幺\t178526\n彴\t178527\n故意杀人罪\t178528\n影\t178529\n彳\t178530\n永匹\t178531\n印度\t178532\n彻\t178533\n司法部\t178534\nqq音乐\t178535\n记忆犹新\t178536\nhdaUatthdfjc\t178537\n彦\t178538\n彡\t178539\n九点二十一\t178540\n彣\t178541\n形\t178542\n077Z\t178543\n被便\t178544\n彩\t178545\n诗游子意的游子意\t178546\n37元\t178547\n彪\t178548\n录\t178549\n第2张\t178550\n彑\t178551\nExynos\t178552\n当\t178553\n归\t178554\n彝\t178555\n稻城\t178556\n骚气\t178557\n张静\t178558\n秘俩\t178559\n大和狼\t178560\n平均分\t178561\n7月10日前\t178562\n第三册\t178563\n秘信\t178564\n彎\t178565\n彈\t178566\n拉美\t178567\n缩没\t178568\n冯其琪\t178569\n脑油\t178570\n说的处理\t178571\n桂冠电力\t178572\n猫妖\t178573\n球儿\t178574\n敛财\t178575\n就是我最好\t178576\n十一点小时\t178577\n68050\t178578\n买菜\t178579\n猪猪猪猪猪猪猪猪猪猪\t178580\n迷们\t178581\n五百六百七百\t178582\n太野蛮\t178583\n张依朋\t178584\n刘若豪\t178585\ncygg\t178586\n好人家\t178587\ngkgam\t178588\n逢时\t178589\nglgo\t178590\n大江歌\t178591\nQQ99999\t178592\ncygv\t178593\n湖滨\t178594\n企业界\t178595\njurddb\t178596\n纯血\t178597\n4月初\t178598\n不乖不乖不乖拜\t178599\n忙玩\t178600\nIll\t178601\n金茹苑\t178602\n夏装\t178603\nbyaa\t178604\n菲宾\t178605\n删繁\t178606\n199n9年\t178607\n飞锅\t178608\n秦时明月顺便月传\t178609\n舞蹈员\t178610\n郑霉素\t178611\nIlz\t178612\n家子\t178613\n游戏吧\t178614\n席地而坐\t178615\n十一点二十六分\t178616\n蓝莓葡萄\t178617\n朱咱们\t178618\n国际主义\t178619\n借故\t178620\n咸味\t178621\nIlU\t178622\njhagajgkcg\t178623\n住持\t178624\nrs，bb\t178625\n卖一\t178626\n明明污\t178627\n肉末\t178628\n要额\t178629\n最前面\t178630\n濠西园\t178631\n休斯顿市\t178632\n轩逸\t178633\n南无月光遍照菩萨摩诃萨\t178634\n熹纪传\t178635\n月落莲灯夜\t178636\n对症\t178637\n牛鸭血\t178638\n甘心情愿\t178639\n要领\t178640\n宁远\t178641\n卖业\t178642\n死死尸\t178643\n大宾宾\t178644\n调侃\t178645\n海航集团\t178646\n14小时以上\t178647\n没错别\t178648\n284787187887180\t178649\n富帅\t178650\n烦好多\t178651\njdiwueyt\t178652\n哦许愿\t178653\n二二五九二五五六\t178654\n35535253543424352352343\t178655\n汤姆猫\t178656\n李林雪\t178657\n三十圈\t178658\n神九\t178659\n制药\t178660\n678\t178661\n五十多厘米\t178662\n吧杯\t178663\n租出去\t178664\n677\t178665\n670\t178666\n671\t178667\n小墙\t178668\n棒棒棒棒糖\t178669\n坎坎坷坷\t178670\n岳灵\t178671\n李兴浩\t178672\n乙木\t178673\n杞公\t17
8674\n快点吧快点吧快点快点吧快点再快点\t178675\n刚微\t178676\n一百首\t178677\n胡复读\t178678\n审批\t178679\n撒爱\t178680\n偷笑什\t178681\n邓建栋\t178682\nu天\t178683\n543021\t178684\n墨帝\t178685\n烟油\t178686\n小墨\t178687\n想一想\t178688\nzz7\t178689\n7平方米\t178690\n恩q7\t178691\n2011年1月12日9时19分\t178692\n罗浩\t178693\nxsbs\t178694\n鸭舌帽\t178695\n一窍不通\t178696\n被抢吻\t178697\n小眼睛\t178698\n旅途突突突突突突突突突突突突突突\t178699\n凸点\t178700\n野外\t178701\n搞笑片\t178702\n蛔虫\t178703\n11164146446644\t178704\n尼玛网络\t178705\n搞笑版\t178706\n永不再聊\t178707\n湘潭明天下\t178708\n5干\t178709\n燕燕\t178710\nmarket\t178711\n奶奶茶杯具\t178712\n迭代\t178713\n多多卡\t178714\n红网\t178715\n死要\t178716\n五九九七\t178717\nzzz\t178718\nzzy\t178719\n余数\t178720\n紧带\t178721\n光芒登场\t178722\nfhfjsnbfcb\t178723\n虚点\t178724\nzzn\t178725\nzzk\t178726\nloreal\t178727\nzzi\t178728\n预见\t178729\n123467890\t178730\n被查\t178731\nJXIHDIHSJBHCN\t178732\n中国环保部\t178733\n是男我女\t178734\n字峰\t178735\nzzX\t178736\n死脑经\t178737\n饭说\t178738\nvv顾doJ\t178739\n服装店\t178740\n锋绘\t178741\n热和\t178742\n微号\t178743\n地下人行\t178744\n144平米\t178745\n芒果tv\t178746\n5月8日\t178747\n赣州市\t178748\n6点六\t178749\nzaskipli\t178750\n水泥块\t178751\n看星星一眨一眨亮晶晶\t178752\n吴玉惠\t178753\n内部装修\t178754\n一二三四五六七八九十十一十三十五十七\t178755\n249.19M\t178756\n枣阳\t178757\n你好港\t178758\nQqihax\t178759\n36傅\t178760\n侃侃而谈\t178761\n全心全\t178762\n单身汉\t178763\n基差\t178764\n1940年\t178765\n李然\t178766\n正年\t178767\n故事会\t178768\n人工智能机器\t178769\n一五分\t178770\nh骨\t178771\n坡跟\t178772\n下午18时\t178773\n方舟子\t178774\n好大夫在线\t178775\n创艺\t178776\n一大早先\t178777\n走极端\t178778\n流传度\t178779\n莫斯科嗯膛自\t178780\n两百本\t178781\nDffgghfdjfksabcssgxahxgahxajahsjsjsjhfdwugfweugfuwgfwhfajbjacjachsajhcjasfh\t178782\n覺得\t178783\nC罗\t178784\n赶不上去\t178785\n265555645586723\t178786\n2011年12月\t178787\n买醉\t178788\n熊出没之小心\t178789\n杨明\t178790\n唔系化\t178791\n珉豪\t178792\ndifyf\t178793\n文朝\t178794\n唔多\t178795\n生布丁生\t178796\n煲冬瓜\t178797\n除法\t178798\n900000\t178799\nAV女优\t178800\n就不我就要\t178801\n阿拉啦啦\t178802\n你老大帅\t178803\n100d101\t178804\n兆头\t178805\n我的帅锅不是你\t178806\n纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷纷\t178807\n传颂\t178808\n称绩\t178809\n非
国产片\t178810\n牛腿\t178811\n牛腰\t178812\n明月照\t178813\n陈晓\t178814\n牛腩\t178815\n谢思\t178816\n朗朗上口\t178817\n问药\t178818\n何必林\t178819\n青岛海湾大桥\t178820\n23.4亿元\t178821\n咪沁\t178822\n伦家有论剑\t178823\n行者\t178824\n扁嘴\t178825\nxestu\t178826\n15.37\t178827\n折纸鹤\t178828\n胸打\t178829\n时装精们\t178830\n国土\t178831\n知已\t178832\n知己\t178833\n王馨雪\t178834\n13963036066\t178835\n王馨雨\t178836\n矿物\t178837\nsoapboxe\t178838\n拜祭\t178839\n卡西利亚斯\t178840\n金油\t178841\n糖浆\t178842\n秦媛媛\t178843\n深深藏在我的歌声里我想唱给你\t178844\n嗯风\t178845\n弛\t178846\njshs\t178847\n陕西电信\t178848\n嗯飞\t178849\n新疆维吾尔自治区\t178850\n虎门大桥\t178851\n轮渡\t178852\n了度\t178853\n金沙\t178854\n碰运气\t178855\n幸福人\t178856\nELVIS\t178857\n痛死\t178858\n黄恩超\t178859\n彤彤\t178860\ndhiphotosbaiducomxiaodupicitem9922720e0cf3d7cac8098d05f51fbe096a63a9c2jpg\t178861\n拜神\t178862\nnbgk\t178863\n金沉\t178864\n纬度\t178865\n胡杨树\t178866\n親友\t178867\nbbggtxzrzfxxkhxtfxddhzigfzroyxdykxlchoxhoxlzgkgzkggdsgdgnmn\t178868\n山崎\t178869\n123439\t178870\n粉系\t178871\n聊天儿聊\t178872\n爆破\t178873\n沅个\t178874\n阿期\t178875\n反反复复反反复复反反复复反反复复反反复复规范化共和国个法国法规gGF个GGGHGGGGGG说了算法院\t178876\n同乐幼儿园\t178877\nLA#\t178878\njjik\t178879\njjii\t178880\n青春偶像派\t178881\n山崖\t178882\nmysthat\t178883\n劫机\t178884\n玳瑁猫\t178885\nAl-Arabiya\t178886\n阿有\t178887\n哇塞爱你\t178888\n基情篇\t178889\n华芳纺织\t178890\nggfgffuf\t178891\n嗯阿嗯\t178892\n留客\t178893\n屋檐\t178894\n冰凌\t178895\n阿机\t178896\nMonique\t178897\n齐刘海\t178898\n爱情书名我不愿一爱卿入迷我也不约只要你some唔翻之行金\t178899\n夜出来\t178900\n留宿\t178901\n偷偷的\t178902\n抱一抱\t178903\n22世纪\t178904\n阿木\t178905\njqn\t178906\njqh\t178907\n殿下\t178908\njqe\t178909\n空嘴\t178910\n执行人\t178911\n太太太太太太太太\t178912\n那句话\t178913\n邵勇\t178914\nLAy\t178915\njqx\t178916\n3月末\t178917\n环境\t178918\njqt\t178919\n看过来\t178920\n万桶\t178921\njqs\t178922\n好了了算\t178923\n掏粪男孩\t178924\n条河科大\t178925\n丫小情人\t178926\nvftdgg\t178927\n提家\t178928\n生殖器官\t178929\n十男十女\t178930\n拉屎哥\t178931\ncd11728b4710999268b8c4cec3fdfc032300jpg\t178932\n4所\t178933\n长长发\t178934\n关仁山凯美\t178935\n基地台\t178936\nxkhxo\t178937\n错听\t178938\n谈钱\t178939\n5折\t178940\n好啦度秘\t178941\ng
uond\t178942\nn思密达\t178943\n出息\t178944\n术业\t178945\n卢布\t178946\n李宁乐\t178947\n二等奖\t178948\n辱风尘\t178949\n注音版\t178950\n特儿\t178951\n万5.5\t178952\nhummingbird\t178953\n52552个\t178954\n虾仁堡\t178955\n二十科\t178956\n二十秒\t178957\n外露\t178958\n哆啦A梦\t178959\n第十五集\t178960\n顶尖\t178961\n缝针\t178962\n农行\t178963\n爱上你怎么办\t178964\n好的你好乖\t178965\n东叮儿\t178966\n外需\t178967\n会颜\t178968\n扬中\t178969\n57252676986535\t178970\n一个一点儿\t178971\n颇颇高\t178972\n吴一含\t178973\n新加坡\t178974\n灌注\t178975\n禁言\t178976\n人大代表\t178977\n说好好聊天儿\t178978\n马蜂窝\t178979\n39元\t178980\n浪凡\t178981\n真富江\t178982\n恩知了\t178983\n淫水\t178984\n血项\t178985\n稍微\t178986\ndheggf\t178987\n项天琪\t178988\n龚老板\t178989\n动感超人情何以堪\t178990\n中途岛\t178991\n快速减肥法\t178992\n龙眼\t178993\n百般\t178994\n富翁\t178995\n十二十三岁\t178996\n祛湿\t178997\n弄死\t178998\n1165691044\t178999\n克隆\t179000\n嗯叨\t179001\n休憩\t179002\n11岁\t179003\n格叽格叽\t179004\n冲着\t179005\n10000000000000000000000\t179006\n铠甲\t179007\n不堂\t179008\n漯河市区\t179009\n嗯大熊\t179010\njshzhdjsu\t179011\n死路上\t179012\n哇咔咔\t179013\n取人\t179014\nyijing\t179015\n周洋洋\t179016\n度秘度秘小度秘小度秘我可爱的小度秘\t179017\nHttyttff\t179018\n不堪\t179019\n斜阳\t179020\n早起精神好凉\t179021\n抱不起\t179022\nridehorse\t179023\n永远别理我了我也不理你了行\t179024\nprpr\t179025\nNordica\t179026\n再还\t179027\n式子\t179028\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t179029\n春闺\t179030\n食用油\t179031\n没点用\t179032\n扯开\t179033\n跟住\t179034\n声叹气\t179035\n不我不想\t179036\n一点片\t179037\n复式\t179038\n沙师弟\t179039\n1946年\t179040\n第一第一\t179041\n古正经的当亲姨\t179042\n复强\t179043\n哀乐\t179044\n地方统计局\t179045\n跟你\t179046\ne藏藏藏藏\t179047\n旁床\t179048\n来来来来了\t179049\n要不移\t179050\n有你好乖\t179051\n佛性\t179052\n58242455686865\t179053\nfcvc\t179054\n怪乎\t179055\nnnnnnnnnn我怕我半个人\t179056\n150043133343\t179057\n要不积\t179058\n光蛋\t179059\nDHay\t179060\n为你不告诉我\t179061\n嗜血法医之朴有天篇\t179062\n装会\t179063\n心跳萧\t179064\n麦面\t179065\n阴暗\t179066\n三八幺九\t179067\nnmenaliyou\t179068\n赵坤\t179069\n糯吉林\t179070\n八十台\t179071\n不说了我就不打你了啊了了了了臭婆娘\t179072\n湄南河\t179073\n湖公园\t179074\n达成\t179075\n少点\t179076\n律师业\t179077\n签证片\t179078\n钱鬼\t179079\n你是我的玫瑰你是我的话\t179080\n草垛\t179081\n
夏娃\t179082\n192集团\t179083\n10月22日上午\t179084\n帅萌\t179085\nfcjffhffc\t179086\n夏娜\t179087\n高以翔\t179088\ntooby\t179089\n中南传媒\t179090\n歌女\t179091\n作气\t179092\n小红点\t179093\n南后街\t179094\n青岛人\t179095\nxksb\t179096\n呃度秘\t179097\n难上加难\t179098\n25个小时\t179099\nbnhl\t179100\n枸杞子\t179101\n谢谢你了度秘\t179102\nvvggh\t179103\n10m\t179104\n660344\t179105\n事业部\t179106\n老家达人错啦错啦错啦错啦错啦错\t179107\n咯咯鱼\t179108\nwwit\t179109\n无声版\t179110\n脚者\t179111\n窥伺\t179112\n电风扇\t179113\n要求\t179114\n快酸\t179115\n多多妈妈\t179116\n佐阳\t179117\n血统论\t179118\n62540\t179119\n韩义勇\t179120\nkitup\t179121\n成本价\t179122\n焰皇宠帝\t179123\n野菜\t179124\n点不点\t179125\n广州恒大足球俱乐部\t179126\n百六十块\t179127\n2011年9月21日\t179128\n时尚特产\t179129\n20777\t179130\n韦正\t179131\n圆睁\t179132\n当地\t179133\n对切\t179134\n张志和\t179135\n新华大厦\t179136\n天仙\t179137\n晓喻\t179138\n袁冯\t179139\n千金大小姐\t179140\njmtma\t179141\n浙江理工大学\t179142\n燕子李\t179143\n一个故\t179144\n谈何分手\t179145\n台类\t179146\n本节\t179147\nflour\t179148\n五环\t179149\n嵩鼠误\t179150\n8场\t179151\n76Y96\t179152\nddcdfcddccv\t179153\n加利亚尼\t179154\n流量块\t179155\n美食节\t179156\n没了不告诉你\t179157\nkommm\t179158\n丽曼\t179159\n见飞\t179160\n李二主\t179161\n紧跟\t179162\n屁事儿\t179163\n勤学好问\t179164\n汤瑞玓\t179165\n假设\t179166\n压身\t179167\n见风\t179168\n张钧宁\t179169\n一个数\t179170\n415263\t179171\njQuery\t179172\n子弟\t179173\n杰总\t179174\ncjcfuv\t179175\n3瓶\t179176\nowz\t179177\nRttyyui\t179178\n子弄\t179179\n淡路\t179180\n太帅鹏\t179181\n有缘人\t179182\n一姚\t179183\n88年\t179184\n子弹\t179185\naoa\t179186\n来张\t179187\n利海家园\t179188\n一姓\t179189\n一姐\t179190\n吾日\t179191\norionylo\t179192\n8n8n88889868389963388\t179193\n1点15分\t179194\n悔改\t179195\n30岁\t179196\n来弹\t179197\n花子\t179198\n勋灿嘟\t179199\n11点04分\t179200\n变电站\t179201\n8750\t179202\n激光\t179203\n微微笑\t179204\n溶血\t179205\n我告诉你吧我家丽丽哪你叫\t179206\n青干\t179207\n不烂不烂\t179208\n脸端\t179209\n飞不动\t179210\n马丽红\t179211\n香港信报9日刊\t179212\n十辈子\t179213\n钱世国\t179214\n诱惑力\t179215\n300块\t179216\n花季\t179217\n阴公\t179218\n比比吧\t179219\n五步五步\t179220\n虚心\t179221\n红卫\t179222\nhi你好你好多咪\t179223\n专攻\t179224\n索撒\t179225\n专政\t179226\n我是女的呀你为什么不告
诉我你是男的你\t179227\nffgeu\t179228\n妖娆\t179229\npqw\t179230\n姚雪伽\t179231\n老K图老K\t179232\n专收\t179233\n剛試\t179234\n玉腕\t179235\n我是你的小五\t179236\n青春战争片\t179237\n扁平足\t179238\n老大哥\t179239\n于度秘\t179240\nfhhghfhfhfnfnfmfjfjnffnfnfnffjfjfhf\t179241\n五月二十号\t179242\n猪o\t179243\n推倒\t179244\n找你我看\t179245\n萨饼\t179246\n我不需要你的谢谢我需要你说你喜欢我\t179247\n这一片\t179248\n别老丁\t179249\n看成色\t179250\n东奔西走\t179251\nfcvccgfg\t179252\n盆花\t179253\n早上5点\t179254\n幺幺切克闹\t179255\nfdhgdbhk\t179256\n一个秘\t179257\nCK针\t179258\n植物大战僵尸二高清\t179259\n佛罗伦萨\t179260\n张飞\t179261\n一个种\t179262\n吉娃娃大学\t179263\n因蒂克丝\t179264\n100台\t179265\n聂耳生\t179266\n光度秘\t179267\n只是不是\t179268\n周笔畅天声一队##周笔畅天声\t179269\n你好再见\t179270\n喻冬香\t179271\n邓启荣\t179272\n一百张\t179273\nSiri\t179274\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你没到哩\t179275\n大姨妈多会儿来\t179276\n2238194270\t179277\nh8sn\t179278\n634315848402737\t179279\nyloujizzz\t179280\nybc\t179281\n我要没我有我是我是雅雅我要贝贝还有美琪美雪小蓝游乐\t179282\n经停\t179283\n性感口\t179284\n威信\t179285\n奥特曼qw小怪兽1987飞刀百万百般\t179286\n1386537534\t179287\n我们好爱你\t179288\n温泓新\t179289\n性学家\t179290\n高元杰\t179291\n休矣\t179292\n热情似火\t179293\n黑马\t179294\n五线谱\t179295\n一亿吨\t179296\n勤俭持家\t179297\n中国好声音5\t179298\n消火\t179299\n手肘\t179300\n赛了半睡觉\t179301\n咔咔咔咔咔咔\t179302\n靠不靠\t179303\n被子句\t179304\nijrignv\t179305\n曹姐姐\t179306\n郝梦欣\t179307\n脑丸\t179308\n张晨妍\t179309\n看不服\t179310\n升龙\t179311\n顺人\t179312\n545588\t179313\n大四十四一百四十四\t179314\n疑难杂症\t179315\n惜花\t179316\n樱花草的歌\t179317\n45辆\t179318\n顺产\t179319\n农田\t179320\nvssf\t179321\n敢我敢\t179322\n阿姨们\t179323\n上场\t179324\n90多个\t179325\n争夺战\t179326\n你不来爱我你爱别的女人别的女人\t179327\n上地\t179328\nvssj\t179329\n臭妮\t179330\nHigh\t179331\n上在\t179332\nvsss\t179333\n奔斗\t179334\nMoralejo\t179335\nfied\t179336\n利用率\t179337\n常用\t179338\nvvGVv\t179339\n言毕\t179340\n手右手\t179341\n练就\t179342\n公元277年\t179343\n十一级\t179344\n踏把\t179345\n耕种\t179346\n明就好\t179347\n富临疵\t179348\n自由岛\t179349\n观书\t179350\n很得\t179351\n刘思勰\t179352\n张星宇\t179353\n何惧\t179354\n张榜杰\t179355\n赶回家\t179356\n还童\t179357\n董医生\t179358\n屌丝男\t179359\n度秘我们外星人也不分性别秘\t179360\n154583496\t179361\n啡尔\t17936
2\n咯葡萄\t179363\n不不恶心\t179364\n莫一凡\t179365\n欧巴isu\t179366\nf罩杯\t179367\nF-4\t179368\n似火\t179369\n哇西伊咪\t179370\n两分之一\t179371\n威逼\t179372\n脂肪肝\t179373\n500000000000000000000000000000000000000000005551378363534343424234x4338342276467643425二多少\t179374\ngtrrtt\t179375\n088666\t179376\n吊炸天\t179377\n玲玲\t179378\n永永\t179379\n举国\t179380\n几分之二\t179381\n土匪夷所思\t179382\n莱阳市\t179383\n死死花园\t179384\n包住\t179385\n三字经儿歌\t179386\n头点\t179387\n松狮萌萌哒\t179388\n韭菜盒\t179389\n刘妹儿\t179390\n郭大鹏\t179391\n六百个\t179392\n对呀赵\t179393\n送货员\t179394\n第六季\t179395\nfhvlgifkg\t179396\n黄金周\t179397\n丁香\t179398\n还债\t179399\n两三百\t179400\n49363\t179401\n阿達\t179402\n不我一遇\t179403\nwkk48\t179404\n7755548\t179405\ntheusa\t179406\nbhiphotosbaiducomxiaodupicitem4610b912c8fcc3ce9a1e96479545d688d53f20e3jpg\t179407\n闲死\t179408\n雷晓栋\t179409\n半辈子\t179410\nyouallstio\t179411\ninghong\t179412\n华东地区\t179413\n投资人\t179414\n吃饱还\t179415\n吃饱过\t179416\n别无二致\t179417\n学校教育\t179418\n那一个我\t179419\ntwmdjg\t179420\n汉服\t179421\n陶玉洁\t179422\n小兴兴\t179423\n猜工\t179424\n八八个\t179425\n俩多点儿\t179426\nhzgdhsbsh\t179427\ngudamasi\t179428\n二第三二\t179429\n听说出\t179430\n伊尔卡比\t179431\nseeyouagsin\t179432\n性菲\t179433\n禁食\t179434\n102个\t179435\n来如此\t179436\n数千万\t179437\n秋分\t179438\n物理学家\t179439\n大兴轮\t179440\n唱一个你是\t179441\n汉未\t179442\n寒武纪\t179443\n每个子\t179444\n基本常识\t179445\n2500元\t179446\nhghg\t179447\nbbnnn\t179448\n三只手\t179449\n挺拔\t179450\n侵扰\t179451\n别娇\t179452\n二百六十一块\t179453\n陌路人\t179454\n脑袋装\t179455\n旗子\t179456\n娉娉娉娉冰冰\t179457\n闪电\t179458\n走你走\t179459\n暗器\t179460\n朋朋朋uuu\t179461\n不哭了你\t179462\n九代\t179463\n凤凰舞\t179464\n李小巴\t179465\n明明版\t179466\n521147\t179467\nam603\t179468\n台山\t179469\n55485645534\t179470\n能不\t179471\nYuonameis\t179472\n我曾\t179473\n朱雨凡\t179474\n九仙\t179475\n八十七块\t179476\n媳妇的事乖乖乖呀乖乖乖呀你是乖乖乖\t179477\n神小红\t179478\n台属\t179479\n度秘度秘我爱你都\t179480\n成真\t179481\n炮台站\t179482\n眯眼秘蜜蜜\t179483\n告诉你我讨厌\t179484\n金美娟\t179485\n工厂里上司\t179486\n表扬\t179487\n克帅\t179488\n鵠\t179489\n更快乐\t179490\n一九五二一九。一九五二零四零幺\t179491\n洞边\t179492\n古韵\t179493\n紫衣\t179494\n哪的我\t17
9495\n女生宿舍\t179496\n塘沽\t179497\n龙萱儿\t179498\n一路走来\t179499\n鵏\t179500\n自责\t179501\n我真的不想和你说了也\t179502\niiff\t179503\nwill\t179504\nCD\t179505\n米豆豆\t179506\n食家井小学\t179507\n六十五天\t179508\n22283元\t179509\n丝路\t179510\n我告诉你事实我是男是女\t179511\n哈十八\t179512\nfaf34251tet\t179513\n了讨厌\t179514\n最高温度\t179515\n异星\t179516\n想家\t179517\n装死\t179518\n想害\t179519\n助推\t179520\n文艺节\t179521\n沙如雪\t179522\n赤胆\t179523\n182名\t179524\n8708358\t179525\nhelloyyyyyy\t179526\n227837738\t179527\n羔体\t179528\n070609\t179529\n林镇伟\t179530\n礼品盒\t179531\n蓝黑\t179532\n钟浩然\t179533\n缴存者\t179534\n坡屋顶\t179535\n再说会\t179536\n斗争\t179537\n欧和桑\t179538\nyooo\t179539\nyoon\t179540\n度秘你在度蜜月\t179541\n退到\t179542\n发婬\t179543\nMTV\t179544\n11471964\t179545\n鞋头\t179546\n新市镇\t179547\n333快点\t179548\n征地\t179549\nyoou\t179550\n怪异\t179551\n朴信惠\t179552\nMTG\t179553\n6636\t179554\n关网\t179555\n是我就是说\t179556\n驼绒\t179557\n100354\t179558\ngshdjvgj\t179559\n哈招\t179560\n哈拜\t179561\n言简意赅\t179562\n外空\t179563\n好说说\t179564\n小红心\t179565\n闹应\t179566\n太客气\t179567\n哈拉\t179568\nfailsI\t179569\n十梅西\t179570\n110921\t179571\n喜迎\t179572\n小姜子\t179573\nhi突发去啦浑噩1哈喽\t179574\n研究\t179575\n秘玖\t179576\n說過兩天\t179577\n数十年前\t179578\n伱秘秘\t179579\n光荣感\t179580\n假哭\t179581\n树阴\t179582\n应激\t179583\n定南\t179584\n秘率\t179585\nMara\t179586\n怪物公司\t179587\n搅基诗给你的请鼓掌\t179588\n安妮公主\t179589\n十几年\t179590\n称王称霸\t179591\n沃艾沃\t179592\n呼吸困难\t179593\nyouyoumy\t179594\n申核过\t179595\n假品\t179596\n可能带\t179597\n竖着\t179598\n前9个月\t179599\npatw\t179600\n注册资本\t179601\n盏灯\t179602\n胰岛瘤\t179603\nIgosleep\t179604\n五女\t179605\n屏蔽\t179606\n五好\t179607\n被子\t179608\n大的我了外快十四了\t179609\n林永成\t179610\n雷蛇\t179611\n广州恒大俱乐部\t179612\n98分\t179613\n拖垮\t179614\n新百伦\t179615\n衰仔\t179616\n原邦\t179617\n葳儿\t179618\n五套\t179619\n性病\t179620\n这末\t179621\n这本\t179622\n宿舍版\t179623\n不收拾\t179624\n4642\t179625\nswjdb\t179626\n得而复失\t179627\n家瑞\t179628\n鸡蛋饼\t179629\n22224443\t179630\n薛雅惠\t179631\n大肉炒芹菜\t179632\n11%\t179633\n先手\t179634\ndoifit\t179635\n点麦\t179636\n佳烨\t179637\n34100020201601252307540001\t179638\n115\t179639\n114\t179
640\n117\t179641\nfcddtcf\t179642\n111\t179643\n6分\t179644\n113\t179645\n112\t179646\n太完美\t179647\n陶五同\t179648\n119\t179649\n博亚重义\t179650\n诺顿科学\t179651\n七年\t179652\n烤箱\t179653\nlead\t179654\n白果\t179655\n玛卡图\t179656\n酷哥\t179657\n彬彬有礼\t179658\nlean\t179659\n回信吧\t179660\n七幺\t179661\n穿服\t179662\n光点\t179663\n虎门镇\t179664\nliuzkhappy\t179665\n瑞歌德\t179666\n耸词\t179667\n傻气\t179668\n狠狠狠狠狠狠\t179669\n走好像\t179670\n当当当当我素三针\t179671\n11o\t179672\n11n\t179673\nuggggc\t179674\n游击队\t179675\n11w\t179676\nHuhj\t179677\nJimale\t179678\n渠子沟\t179679\n贤妻良母旧\t179680\n要好在\t179681\n56元\t179682\n11y\t179683\n11x\t179684\n电幂\t179685\n红苗庙\t179686\n11F\t179687\nMojave\t179688\nvgdj\t179689\n面亲\t179690\n血雨\t179691\n11M\t179692\n11O\t179693\n给我你可以\t179694\n面人\t179695\n面交\t179696\ncoodbyebyaebyl\t179697\n187031876542200\t179698\n小兔崽子几个星期没假想我\t179699\n老顽童\t179700\n派兵\t179701\n9..11\t179702\n风雨天穹\t179703\n王师\t179704\n王帅\t179705\n王亚杰\t179706\n只有时候\t179707\n九星\t179708\n丝绵\t179709\nturemondomain\t179710\n丝绸\t179711\n顾的摸宁嗯\t179712\n开罗机场\t179713\n能力\t179714\nfdfcghvgvvbbbvb\t179715\n每个孩子\t179716\n836540\t179717\n如今\t179718\n陈星羽\t179719\n丝绒\t179720\n来我家吧\t179721\n涂写\t179722\n泰罗泰罗奥特曼\t179723\n切谢\t179724\nhello眠\t179725\n如仙\t179726\n恋舞OL测试服\t179727\njoxd\t179728\n卡瓦劳蹦极\t179729\n不要脸不要脸就不要脸不要脸不要脸不要脸臭不要脸不要脸不要脸不要脸\t179730\n张新瑞\t179731\n朴你麦\t179732\n转瞬即\t179733\n赶不走\t179734\n东海舰队\t179735\n王振全\t179736\n42万\t179737\n1366789\t179738\n光行\t179739\n权贵\t179740\nlvu\t179741\n夔夔夔\t179742\n可帅\t179743\n简洁\t179744\ne234567满天都\t179745\n风车坪\t179746\n滔滔不绝\t179747\n排版\t179748\n露丽马卡龙\t179749\n人心肠软\t179750\n密爱\t179751\nlvc\t179752\n聪明的犬\t179753\nlvn\t179754\n靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠\t179755\nlvk\t179756\n王乐涛\t179757\n梦萌\t179758\n鸡胸脯肉\t179759\n外功\t179760\n呆子呆子呆子呆子\t179761\n执行\t179762\n好看不好听\t179763\n光源\t179764\n华擎\t179765\njmozz\t179766\n高雪寒\t179767\n46541621122611\t179768\nok片\t179769\n负相关\t179770\nboos\t179771\n数米\t179772\nboow\t179773\n重返\t179774\nbooi\t179775\nbook\t179776\n大螃蟹\t179777\nboom\t179778\n店口镇\t179779\n百醇\t179780\n幸存\t1797
81\n很不会\t179782\n我好爱女屋再结婚吧\t179783\n丛生\t179784\n一万两千多\t179785\n平泉\t179786\n平岗\t179787\n唇部\t179788\n两家\t179789\n我在说英文别\t179790\n高比\t179791\n我喜欢你说声\t179792\n自助式\t179793\n忘记了\t179794\n对州市初中学\t179795\n两宫\t179796\ndhoz\t179797\n天下第一\t179798\n小公寓\t179799\n后门\t179800\n对呀对呀你以为\t179801\n十个一百个\t179802\n听筒\t179803\n子子子子\t179804\n两宗\t179805\n热翔秘\t179806\n黎环游\t179807\n唱了唱\t179808\n结膜\t179809\n植物大战僵尸二\t179810\n畏葸不前\t179811\nhitmyitwith\t179812\n常好\t179813\n女作者\t179814\n王乖乖\t179815\n555556503\t179816\nPoetry\t179817\n明早个女朋友\t179818\n1.22个\t179819\n拖拽\t179820\n广美交互工作室\t179821\niloveyou度秘我永远爱你\t179822\nhchj\t179823\n少许\t179824\n男孩女孩\t179825\nfoud\t179826\n361551本\t179827\ns型\t179828\n那么多好\t179829\n毛我喜欢\t179830\n还半仙\t179831\n我恨你讨厌你说恨你\t179832\n北京东坝育英学校\t179833\n已经常\t179834\n百万大村\t179835\n佛玉\t179836\n14kg\t179837\n渺无踪\t179838\n小高子\t179839\n文在\t179840\n味多尔\t179841\n李培俊\t179842\n四个多\t179843\n褪色\t179844\n伊拉克\t179845\nSASASA\t179846\n布鲁诺\t179847\n好儿会\t179848\n1984416\t179849\n度秘我发现你好萌\t179850\n田向利\t179851\n很古老\t179852\n黑莓克巴德\t179853\n苏沈杭\t179854\n愿谅\t179855\n经济基础\t179856\n来到这儿\t179857\n富余\t179858\n窦为\t179859\n天下彩呢金海浪\t179860\n鱼哥\t179861\nPQQ\t179862\n大吉大利哈\t179863\nfhtchev6eh3ft\t179864\nvfxg\t179865\n乱乱\t179866\n宋玉庆\t179867\n涉案\t179868\n250nimabi\t179869\n爱新觉\t179870\n晃动\t179871\n刘缇萦\t179872\n事情们\t179873\n外交\t179874\n照巨\t179875\n荒凉\t179876\n大一百\t179877\n一言通\t179878\n沉鱼\t179879\n清穿\t179880\n小邋髓肾黯gd\t179881\n联欢\t179882\n滕州\t179883\n外人\t179884\n刘政\t179885\n十小时\t179886\n五魁首呀666\t179887\n白块儿\t179888\n嘿嘿大中华的我是朋友了你敢\t179889\n试水\t179890\n上水\t179891\n外事\t179892\n1101921901\t179893\n瓷胎\t179894\n白兰地\t179895\n租户\t179896\n原作者\t179897\n夜思李\t179898\n清空\t179899\n赛恩师\t179900\n伯明\t179901\n白天英雄联盟\t179902\naoanono\t179903\nhhuhffygfg\t179904\n通贝斯\t179905\n顽疾\t179906\n屡屡了我告诉你\t179907\n韶大\t179908\n跟困\t179909\n阿西洛\t179910\n建站\t179911\n耍萌\t179912\nnawn\t179913\nnawo\t179914\nnawl\t179915\nnawk\t179916\n252o1\t179917\nqq999\t179918\n建立\t179919\n青叔\t179920\n老公度秘我爱你\t179921\nchina\t179922\n茶匙迷迭香精油\t179923\n老魏\t179
924\n玩捉\t179925\n你好讨厌我恨你\t179926\n双脚\t179927\n浮厪\t179928\n李不住\t179929\n毕瑞佳\t179930\n是不不不不不致电\t179931\n青口\t179932\ngvj6675869hhhkbm\t179933\n我的妹妹还有三野\t179934\n15197800361\t179935\n巡展\t179936\n青叶\t179937\n广中\t179938\nhbbdnnnsnnn\t179939\n兴学\t179940\n别豆\t179941\n嗯对白\t179942\n胸干嘛\t179943\n178cm\t179944\n电脑小达人\t179945\n格朗\t179946\n乐于\t179947\ntfdoux\t179948\n一列队\t179949\noixit\t179950\n燕通宴\t179951\nabcaef\t179952\n崔雨\t179953\n胜地\t179954\n龟苓膏\t179955\n有机见多\t179956\n白烨\t179957\n2004年9月20日\t179958\n乐亭\t179959\n阴土\t179960\n美好的日子\t179961\n乐享\t179962\n义餸\t179963\nxyvx\t179964\nmusic\t179965\n凡哥哥\t179966\n死仔\t179967\n莱西\t179968\ngccf6\t179969\n8果\t179970\nSPN\t179971\n墨橙\t179972\n2011年5月份\t179973\n二百五三千六瓜\t179974\n见你可以\t179975\n隔十米\t179976\n录取\t179977\n若狂\t179978\n,,,,\t179979\n38秒\t179980\n即可爱\t179981\n13450863533\t179982\np4p\t179983\n特长班\t179984\n多米多面\t179985\n张雨凡\t179986\n吃西餐\t179987\n二十六四三九五七四九零\t179988\n猥琐眼\t179989\n摸不摸\t179990\n眉骨\t179991\n哼不理你了我走\t179992\n8架\t179993\n肖平\t179994\n美巡教育\t179995\n咏春\t179996\n条翼\t179997\n海奖\t179998\n东洋\t179999\n娘娘们\t180000\n1530561662\t180001\n旧城中学\t180002\n一滩\t180003\n死路一点\t180004\nKiss\t180005\nmlpppplll\t180006\n一滴\t180007\n15日上午\t180008\n李局\t180009\n002065\t180010\n眼缘\t180011\n租赁单\t180012\n派生型\t180013\n二十三\t180014\n应不应该\t180015\n羊皮+粉\t180016\n嗯理财\t180017\n哪人么\t180018\n天天在一起\t180019\n你在\t180020\n泸高\t180021\n瓶男\t180022\n蒸肠粉\t180023\n深发展\t180024\n福图画\t180025\n忘了我叫\t180026\n理不礼貌\t180027\nqyou\t180028\n天蝎庭\t180029\n巡山\t180030\n效价\t180031\nmiia\t180032\n天蝎座\t180033\n一百四十一\t180034\n一池子\t180035\n亏\t180036\nhdie\t180037\n产权\t180038\n等等等等等等等等等等等等等等等当当当当当\t180039\n喜欢你我不是爱你我是说我讨厌你懂不懂讨厌你\t180040\n塞到\t180041\n源秘\t180042\nexexo\t180043\n作业帮\t180044\n成家立业\t180045\n大同好\t180046\n大邱\t180047\n床铺\t180048\n18嘿嘿118\t180049\n别废话\t180050\n缩出小终獃\t180051\n几个一点都不好喽秘到位秘我爱揍秘秘爱厉害的我好想你\t180052\n锡瓦绿洲\t180053\n鼻窦炎\t180054\n千百遍\t180055\n9月14日\t180056\n触摸\t180057\n莫不\t180058\n妇麦\t180059\n依偎\t180060\n疾病\t180061\n孙缘霄\t180062\n招术\t180063\n台洲\t180064\n胡钰荌\t180065\n吾好\t180066\n
天天有\t180067\n苍井空\t180068\n可儿娃娃\t180069\n盖过\t180070\n白兔兔\t180071\n恩他\t180072\n14.4M\t180073\n哪个群\t180074\n一看一看\t180075\n大桥头\t180076\nIPAD伴侣\t180077\ne2co3\t180078\n战斗号\t180079\ninal\t180080\n恩仇\t180081\n余额宝\t180082\n腿疼\t180083\n肥罗肥\t180084\n牛马狗\t180085\n无知\t180086\n儒雅\t180087\n超级飞侠\t180088\n浦东新区\t180089\n个长\t180090\n纯清\t180091\nzdzd\t180092\n学文\t180093\ndfgygfh\t180094\n护士系\t180095\n叫好开心\t180096\n赶早\t180097\n亥\t180098\n度秘达\t180099\nfjbd\t180100\n第6季\t180101\n第二句\t180102\n君君\t180103\n狮子大开口\t180104\n我告诉我的朋友\t180105\nhttpfhiphotosbaiducomxiaodupicitem0ff41bd5ad6eddc49ca1828e3\t180106\n酒托\t180107\n巴迪\t180108\n第二只\t180109\n暴红\t180110\n滴家\t180111\n惹祸\t180112\nimsous\t180113\n我是爱的天使\t180114\nPURE+\t180115\n不爱吗不二\t180116\n师大讲座\t180117\n2亿多\t180118\n手霜\t180119\n三十八四十八岁\t180120\n#艺声#、#厉旭#、#圭贤#\t180121\n确立\t180122\n此次\t180123\n近场\t180124\n经久\t180125\n代替\t180126\n信单\t180127\n杨路\t180128\nbxhajbz\t180129\n吐痰者\t180130\n信华\t180131\n真不少\t180132\n现在这样\t180133\n说话们\t180134\n寡言少语\t180135\n黑豆\t180136\n48千米\t180137\n咩咩\t180138\n丁成\t180139\n经书\t180140\n我是说你给我唱一首歌清一下不要钱\t180141\ntoutd\t180142\n4月4日凌晨2:45\t180143\n小猪咪\t180144\n美国场\t180145\n32223\t180146\n如常继续\t180147\n细雨\t180148\n草木灰\t180149\ndkej\t180150\n88888888888888888888888\t180151\n石破惊天21\t180152\n王佩\t180153\n王野全\t180154\n阿斯兰鸡婆哈他不饿\t180155\n王你\t180156\n一六四十颗\t180157\n青龙镇\t180158\n要礼貌度秘\t180159\n穿裙\t180160\nxueji\t180161\n王佳\t180162\n高一恩\t180163\n18889371170\t180164\n给我奸\t180165\n好了我在这\t180166\n王位\t180167\njtgltmgjgbgjga1ga1a\t180168\n空客总装\t180169\n沭阳县\t180170\n够意思\t180171\n麦伟良\t180172\n猪脚姜\t180173\n吸纳\t180174\n王余\t180175\n波斯得吐\t180176\n猪饭\t180177\n不确定\t180178\n素斋\t180179\n天女\t180180\n红领巾侠\t180181\n疤痕\t180182\nkfq\t180183\n晚上十一点半\t180184\niilme\t180185\n孤品\t180186\n官署\t180187\n琦琦\t180188\n天好\t180189\n迈开\t180190\nkfg\t180191\n安年co\t180192\n我看我我\t180193\nkfb\t180194\n来来口追\t180195\n深圳火车站\t180196\n嗯哼萌\t180197\nkfm\t180198\nkfj\t180199\n不个女主人的救赎\t180200\nVst\t180201\n彬彬周\t180202\nservice\t180203\n护理\t180204\n三星座\t180205\n几天道\t180206\n肥子\t18
0207\n8月18日\t180208\n七十四三\t180209\n钠镁铝锌铁锡铅氢铜汞银铂金\t180210\njizan\t180211\n岁数\t180212\n倪铭霞\t180213\n我的手不到你的来\t180214\n湾仔\t180215\n姜东元\t180216\n红巾军起义\t180217\n圳圳\t180218\nGOOUT\t180219\nyyghggghjjjhhhhhhhh\t180220\n不谈恋爱\t180221\n纸花\t180222\n你好贫道长华\t180223\n明星级\t180224\ntiaia\t180225\n徐家井小学\t180226\n艾儿\t180227\nIINO\t180228\n五一下午\t180229\n贾爷\t180230\n无神\t180231\n别讨价还价\t180232\n唔打扰系我最后的温柔\t180233\n欧打头破血流昏\t180234\n周浦万达广场\t180235\n男播\t180236\n九十来斤\t180237\n1009010\t180238\n三千多二千五三千\t180239\n乖乖了吧那不是老子说的话\t180240\nX战警外传之第一课\t180241\n23：00\t180242\nusba\t180243\n无法入\t180244\n56466464646646646464436433479797\t180245\n勘察\t180246\n我的乖乖我\t180247\n巴德施图伯\t180248\n翻议\t180249\n上饶话\t180250\n七零九三\t180251\n出場\t180252\n10265950622\t180253\n不能不可能\t180254\n好呗宝贝儿\t180255\nBrunch\t180256\nfeci\t180257\n腊月二十\t180258\n安利故的摸宁hellomynmaaayon\t180259\n锁颈\t180260\n头鸣伯伯\t180261\n哈汗\t180262\n东华软件\t180263\n丽珠\t180264\n编译\t180265\n坡度\t180266\n红都\t180267\n呵嘿\t180268\n卡儒雅\t180269\n山大\t180270\n丽珍\t180271\n爱多咪\t180272\n一次十八岁\t180273\n深深藏起来\t180274\n49几\t180275\n呵嘻\t180276\n机锋\t180277\n糸吾糸\t180278\n180种\t180279\n看见了\t180280\nIgive\t180281\n别老东拉西扯\t180282\n第九次\t180283\n凌晨2点\t180284\n耶也有\t180285\n找回\t180286\n大红包\t180287\n新习惯\t180288\n禁不住\t180289\n二维码\t180290\n算了吧\t180291\n频次\t180292\nkoumec\t180293\n是这样的人\t180294\n泰捷\t180295\n遭窃\t180296\n黄可婷\t180297\n不像样\t180298\n奥永远\t180299\n说梦话\t180300\n慢走\t180301\n路边上\t180302\n13935838891\t180303\n浓情意\t180304\n是我美素\t180305\n再说再说\t180306\n虾米音乐\t180307\npfore\t180308\n今年1月至5月\t180309\nbfb43dc9821044f78f0f73618b2jpg\t180310\n工尺\t180311\n心悸\t180312\n可真天影\t180313\n订阅\t180314\n王思涵\t180315\n唔换换唔\t180316\n海湾地区\t180317\n7-11\t180318\n劳斯莱斯\t180319\n2012年2月7日\t180320\n骚瑞\t180321\n10月11号\t180322\n心悠\t180323\nhjjjkk\t180324\n焯水\t180325\n秘密长\t180326\n130多只\t180327\n迅哥\t180328\n忙有\t180329\n真的很自信\t180330\n我在也不在爱你了我己经\t180331\n传视频\t180332\n第一学期\t180333\n韩某\t180334\n5399元\t180335\n东快\t180336\n去客服\t180337\n朋字错\t180338\n我是男的我我喜欢男\t180339\n47万\t180340\n花萝\t180341\n用习惯\t180342\n道德底线\t180343\n
胸廓\t180344\ngghhhhhgg\t180345\naj窒息灭火法秘隔离灭火法c冷却灭火法\t180346\n大蓬\t180347\n求求求求求求\t180348\n酵母\t180349\n大石油公司\t180350\n无可厚非\t180351\n47个\t180352\n酷you\t180353\n花落\t180354\n清蒸鲳鱼\t180355\n辫儿\t180356\n85435892446387456\t180357\nfifthBCadBmKLM\t180358\n汤皓\t180359\niphng\t180360\nkilfu\t180361\n妙笔\t180362\n成体\t180363\n太早度秘\t180364\n扬州博物馆\t180365\n摇不靓\t180366\nwiwis\t180367\ncuv\t180368\n网讯\t180369\nnqjjy\t180370\n诈尸沙\t180371\n致炫\t180372\n陈嘉佑\t180373\n前两天\t180374\n死歌\t180375\n这么多嘞\t180376\n恋爱过\t180377\n这么多嘛\t180378\nuroutifo\t180379\n哈珀\t180380\n陌地\t180381\n受好久不见\t180382\n蔡燕\t180383\nkbct\t180384\n鑫福生保险\t180385\n长途费\t180386\n资产\t180387\n陈姓皇帝\t180388\n红豆\t180389\n周婆婆\t180390\nxbxbxb\t180391\n陈秘\t180392\n一二三四五六七八九十十一\t180393\n伎俩\t180394\n暴强\t180395\n提醒儿\t180396\n很有\t180397\n繁杂\t180398\nmc晓丁\t180399\nlxL7P\t180400\n医疗费\t180401\n荤段子\t180402\n江嘉美\t180403\n史单\t180404\n蒽蒽蒽蒽蒽蒽蒽蒽\t180405\n了调\t180406\naabbabab\t180407\nunioQ\t180408\n刘浩冯\t180409\n金碧美\t180410\n火影战记\t180411\n通力\t180412\n融冰\t180413\n人生一世\t180414\n成人选\t180415\n我行我素\t180416\n徐世昌\t180417\n其次\t180418\n祖父\t180419\n亲爱的小个个\t180420\n北京关\t180421\n巧虎\t180422\nguvjvg\t180423\n四八多少\t180424\n姚明哥\t180425\n恋家有方\t180426\n个死机器人\t180427\n千二元\t180428\ndgjko\t180429\n别说笑\t180430\nppltpartialeralitch\t180431\n六瓦\t180432\n温岭\t180433\n古我\t180434\n追凶\t180435\n要不然把假\t180436\n教徒\t180437\n申军\t180438\n89%\t180439\n头重脚轻\t180440\n度人无量天尊\t180441\n庚吧\t180442\n黎佳怡\t180443\n腊鸭\t180444\nhndx\t180445\n牙轮\t180446\ndave\t180447\n不里你\t180448\n182年\t180449\n二餐\t180450\n四千米\t180451\ncrossover\t180452\n校表\t180453\n家魂犬\t180454\n五百场\t180455\n谢嘉\t180456\n昏厥\t180457\nfxtvu\t180458\n高思源\t180459\n藏期\t180460\n来不一下\t180461\ndsswtfftf\t180462\n锄草\t180463\n吧几\t180464\n肯定式\t180465\nhttppinyincne2219\t180466\n藏有\t180467\n死妓女生的你\t180468\n翠微\t180469\n生还\t180470\n三点四十\t180471\n玉桂苹果面包布丁\t180472\n西密山\t180473\n铠甲勇士\t180474\n嗯夜萝莉\t180475\nLEAP\t180476\nhttpfhiphotosbaiducomxiaodupicitem8ad4b31c8701a18b8a53273b992f07082838fe0djpg\t180477\n蓝曼\t180478\n潘乾\t180479\n上春节\t180480\n再接再接再接\
t180481\n10000909\t180482\n洗涤剂\t180483\n合不合\t180484\n钢琴教师\t180485\n重色\t180486\n车马\t180487\n姨娘\t180488\n范千瑜\t180489\n军姿\t180490\n事儿\t180491\n二分之一天\t180492\n穷追不舍\t180493\n客座满盈\t180494\n从始至终\t180495\n回引人入胜\t180496\n龙虾仁和他的时候我在家的时候\t180497\n166483586\t180498\n废我\t180499\n七十八\t180500\n七十六\t180501\n必须性\t180502\n诺亚舟\t180503\n你的你\t180504\n笑懂哲学么少年\t180505\n课命\t180506\n果头\t180507\nhttpwwwqqszccomtuhot1085html\t180508\n三界\t180509\n打呼\t180510\n林凉爽\t180511\n机翼\t180512\n44224\t180513\n3333323\t180514\n素粑粑\t180515\n累了困\t180516\n王海锋\t180517\n理你了我讨厌你\t180518\n服务期\t180519\n哼啤\t180520\n清爽\t180521\n二零幺七\t180522\n小度秘我不要你\t180523\n改运\t180524\n三毛流浪记\t180525\n淡漠\t180526\n兔侠传奇\t180527\n王晓卉\t180528\n原装性\t180529\nIP地址\t180530\nHappy\t180531\n13899682930\t180532\n逃错\t180533\n高腿\t180534\n记忆力\t180535\n失而复得\t180536\n函巴\t180537\n35只\t180538\nhi说话\t180539\n雷亚莉\t180540\n感应器\t180541\n35号\t180542\n流弊流弊\t180543\n黑店\t180544\n上四节\t180545\n黑底\t180546\n躲不住\t180547\nkkrkang\t180548\n地州\t180549\n练舞\t180550\n有请客\t180551\n辣鲜\t180552\n载人人\t180553\n穿山价\t180554\n纸雕\t180555\n脸皮厚\t180556\n早中晚\t180557\n恩洙\t180558\n15120850418\t180559\n47升\t180560\nXJAPAN\t180561\n办公所\t180562\n杜乖\t180563\n只剩下\t180564\nalico\t180565\ngadd\t180566\n再恋\t180567\n猪朱\t180568\n击碎\t180569\n鹅塘\t180570\n过年就不打麻将\t180571\naddgg\t180572\n恩洲\t180573\n孁秭譯\t180574\ngadt\t180575\n13136975444\t180576\n灭掉\t180577\n搓板赌城\t180578\n何欣怡\t180579\n刘梅\t180580\n兴高采烈\t180581\n18793408180\t180582\n送存\t180583\n明天是你的女的女的女的你\t180584\n邓小\t180585\n男王\t180586\n123467万\t180587\n打打架\t180588\n聚光灯\t180589\n邻家女孩\t180590\n念首诗\t180591\nvovk\t180592\n照做\t180593\n28块\t180594\n帖画纸\t180595\n校本小伙\t180596\nTQ5\t180597\n苏皖\t180598\n上午八点\t180599\nNNNN\t180600\n556286\t180601\nNNNM\t180602\n8月10号\t180603\n党委书记\t180604\n萌萌哒个美好我\t180605\n幸福就是我打你你不还手\t180606\n黑格尔\t180607\n持娇\t180608\n墨菲特\t180609\n法研\t180610\nMyLuxBox\t180611\n宝龙包\t180612\n干涸\t180613\n九千多公斤\t180614\n深情\t180615\n武魂石\t180616\n泰伯\t180617\n零度\t180618\nRtyh\t180619\n干涩\t180620\n秀一手\t180621\n张洪松\t180622\n弄懂\t180623\n白白昂贝贝养我吧养我没\
t180624\n启封\t180625\n明天下午3点10分\t180626\n二会\t180627\n那个饭\t180628\n吓死人犯下下下下下下下下渣渣渣渣着我我我我了了了了\t180629\n倾家荡产\t180630\n好帅好帅\t180631\n水火不侵\t180632\n百富花园\t180633\n欢熊出没\t180634\n二伯\t180635\n于都县\t180636\n29日上午\t180637\n二伴\t180638\nv股\t180639\n天工巧夺\t180640\n整理\t180641\n陈雯一\t180642\n不爱我爱\t180643\nwggkgwkffojg\t180644\n气别\t180645\n新砖保质保量\t180646\n钱人\t180647\n国河\t180648\n稽留\t180649\n一起走过\t180650\n1x0点\t180651\n亚热带\t180652\n无是\t180653\n永济县\t180654\nCkcjgkci\t180655\n一起眼\t180656\n没有钱\t180657\nhbbnm\t180658\n结婚吧知道\t180659\n律师函\t180660\n流热\t180661\n好羞羞\t180662\nrew\t180663\n脑顶\t180664\ndvsgvgdgdgdv\t180665\n专访谈\t180666\ntame\t180667\n危机四伏\t180668\n重庆代表团\t180669\n老小子\t180670\n1095天\t180671\n素女孩\t180672\ngggydjs\t180673\n何宇杰\t180674\n快点\t180675\n知不\t180676\ntamu\t180677\n叠坐\t180678\n镜头感\t180679\n秦邦昊\t180680\n昨晚\t180681\nhttpfhiphotosbaiducomxiaodupicitemc75c10385343fbf2f75c8450b77eca8065388f54jpg\t180682\n明十三陵\t180683\n鲁蜜\t180684\n莱斯秘\t180685\n司空肚嘻哈骂\t180686\n柳绿\t180687\n二二囧\t180688\n450天\t180689\n200公里\t180690\n5步\t180691\n辛薇莱\t180692\n牧马人\t180693\n猎枪\t180694\n在的好久等待\t180695\n中间价\t180696\n小喵喵\t180697\nmymean\t180698\n我告你\t180699\n我不认难受\t180700\nuiiii\t180701\n化物\t180702\n不懂得宠\t180703\n宋佳旭\t180704\n涂炭\t180705\n读文\t180706\n有痘\t180707\n真的嘛假\t180708\nDeaewrreedet\t180709\nkkdkxkxlk\t180710\n八间\t180711\n涂点\t180712\n八门\t180713\n有病\t180714\n甘之如饴\t180715\n五禽戏\t180716\nAlipermission\t180717\n偷换\t180718\n吴花园\t180719\n孝介\t180720\n一起跳舞\t180721\n试管\t180722\n糖汁\t180723\n控制性\t180724\n杏红\t180725\n攻管\t180726\n啦啦呜啦啦乌拉啦啦打怪兽\t180727\ntrounosa\t180728\n政界\t180729\n总磷\t180730\n陈冠儒\t180731\n哭了惹\t180732\n楚彩凤\t180733\n20点30\t180734\nroll\t180735\n打谷\t180736\n利弊\t180737\n睁开\t180738\n阿妮丝\t180739\nABBBBB\t180740\n一万一\t180741\n哦十岁\t180742\n花汤明儿\t180743\n230..47..1\t180744\n噜啦啦噜啦啦噜啦噜啦嘞噜啦噜啦噜啦嘞新年好呀新年好呀新年\t180745\n080423\t180746\n国家台网\t180747\n纾解\t180748\n度秘度秘我找了不要你了我有我的新伙伴\t180749\n7948\t180750\n天斗星星\t180751\n火星球\t180752\nkop们\t180753\n被绑架\t180754\n关谷\t180755\n女别\t180756\n企业管理\t180757\n明天早上六点半\t180758\n家桥\t180759\n
我爱青峰青峰核雕\t180760\n北京市朝阳区\t180761\n打包费\t180762\nsay3单\t180763\nLdzgacsdyadhshgzhvfFdhegvgfTucson\t180764\n那人品堡\t180765\n面瘫\t180766\n死光光\t180767\n吴雨翔\t180768\n陪我到老\t180769\n破学\t180770\n你来亲我的你懂的\t180771\n咯血\t180772\n劝君\t180773\n海鲜\t180774\n赖月京\t180775\n香膏\t180776\n狗不狗\t180777\n标法\t180778\n破孩\t180779\nLLJCAL\t180780\n二二六六零\t180781\n爱心送机*真情接力#\t180782\n螳臂\t180783\n八字算命\t180784\nBye\t180785\n刘冰倩\t180786\n骨中\t180787\n医疗保险\t180788\n少分\t180789\n生死狙击恩\t180790\n盗版者\t180791\n就是你就是你就是你\t180792\n鬼同体\t180793\n昂昂昂\t180794\n罗冲\t180795\n300平米\t180796\n武侠传\t180797\njuryor\t180798\n1079\t180799\n代表性\t180800\n我去其气我试验田\t180801\n物品\t180802\nz104\t180803\n盛洪\t180804\n来了怕\t180805\nfwiyqdt\t180806\n甘冻\t180807\n笔墨炎\t180808\n7点半到9点\t180809\n青春派\t180810\n几晚\t180811\n真實\t180812\n快班\t180813\n淘宝贝\t180814\n吃瓜子\t180815\n预产期\t180816\n侣装\t180817\n指环\t180818\n户县\t180819\n糯米四百你是我的秘书\t180820\n希澈哥\t180821\nTan\t180822\n爱妻妻妾\t180823\n唐子琳\t180824\nMiranov\t180825\n快一点可好\t180826\n应岗\t180827\nESPRIT店\t180828\n崔景怡\t180829\n仙女乖乖\t180830\n陈福莹\t180831\n每张\t180832\n赛博坦\t180833\n等高\t180834\n贸易公司\t180835\n偶吧转圈圈转圈圈[偷笑\t180836\n陈汉涛\t180837\n高图斯\t180838\n冬装\t180839\n当地时间\t180840\n航拍\t180841\n诸葛晾之\t180842\nhowaerYou\t180843\nmonsteix\t180844\nboost\t180845\n一大篇\t180846\n事波斯猫\t180847\n凌强\t180848\n皇冠\t180849\n败将\t180850\n蒋林学\t180851\n下午体\t180852\n挤压式\t180853\n灵剑山\t180854\n军供\t180855\n叶小曼\t180856\n东方路\t180857\n鼠兔\t180858\n邓书文\t180859\nn　nn　n　
n\t180860\n比窝大\t180861\n陈佳蕊\t180862\nmodin\t180863\n后生仔\t180864\n爱情雨我去\t180865\n七大姑\t180866\n甘落\t180867\ntitle\t180868\n一下来\t180869\n度嫉妒恨\t180870\n刘亚圣\t180871\n小鸽\t180872\n崇祯\t180873\n皇军\t180874\n茁新\t180875\n黎明前\t180876\n祸害\t180877\n叶菜类\t180878\nREAL\t180879\n佛鳄\t180880\n下士\t180881\n零下八度\t180882\n纯粹\t180883\n恩快\t180884\nridvhrjchktgkff\t180885\n拘留所条例\t180886\nDSLR-A580\t180887\n五彩糖\t180888\n秘风\t180889\n沒車\t180890\n555445744424\t180891\n背一背\t180892\n羊肉泡\t180893\n3块\t180894\n下壁\t180895\n阿花阿花花\t180896\n100000000000000000\t180897\n海宁\t180898\n王瑞儿\t180899\n图画\t180900\n来行\t180901\n海安\t180902\nugfuc\t180903\n燕飞羽\t180904\n化学方程\t180905\n经做完\t180906\n过保\t180907\n暴种\t180908\n无益\t180909\n巴拉巴拉巴拉巴拉巴拉巴拉巴拉巴拉巴拉巴拉拉拉\t180910\n桐乡\t180911\n垃圾桶\t180912\n很么么哒\t180913\n冯瑞\t180914\n如梦想想\t180915\n籽\t180916\n壶春\t180917\n少一人么少\t180918\n两吧218219237272麦季三岁二下七上八下\t180919\n河北人\t180920\n玩乐\t180921\n一七能再\t180922\n曼苏拉\t180923\n好聪明有像头猪不如猪\t180924\n20万\t180925\n减仓节简炎昌脱发\t180926\n第6集\t180927\n无相\t180928\n户外美术馆\t180929\n牛爷\t180930\n法身\t180931\n爱巴辣\t180932\n我发你我讨厌你我讨厌你我讨厌你\t180933\n忧伤\t180934\n定向增发\t180935\n有你的理由\t180936\n来来斗\t180937\n内奸\t180938\n2.1亿元\t180939\n孙佳仁\t180940\nFggfhgfgg\t180941\n信托\t180942\n7月14日晚8点\t180943\n王八里\t180944\n伊人黄鲤鱼\t180945\n体力劳动\t180946\n厕所拉巴巴\t180947\n谈及\t180948\n因为我喜欢女孩\t180949\n张超堡\t180950\n河西区\t180951\n乌迪内斯\t180952\n翻落\t180953\n早晨8点\t180954\n再讲一个多点儿\t180955\nHguh\t180956\n奋飞\t180957\n21875\t180958\n吴晨成\t180959\n早你个头\t180960\n负累\t180961\n辐射\t180962\n酷频\t180963\n一头栗\t180964\n将军\t180965\n一决胜负\t180966\n财气\t180967\n1931年\t180968\n2004年9月7号\t180969\n责罚\t180970\n以园小学\t180971\n分布\t180972\n晚点率\t180973\n谦爸\t180974\n理弄死\t180975\n落荒而逃\t180976\n爆浆\t180977\n红褐色\t180978\n炫风\t180979\n我是你的爱我是你的一切不是你的全部\t180980\n怎么\t180981\nchundou\t180982\n出乎意料\t180983\n驱散\t180984\n扫清\t180985\nugjau\t180986\n奇谋\t180987\n汪越胜\t180988\n千面\t180989\n炸酱\t180990\n叫姐\t180991\n小黑鱼奖\t180992\n朱珊珊\t180993\nyears\t180994\n荷里活\t180995\n敬仰\t180996\n仙剑一\t180997\n莫小棋\t180998\n翻跟头\t180999\n玛汐\t181000\n天汇\t181001\n53倍\t181002\n
谷哥\t181003\n子女孩\t181004\n秀气\t181005\n注目\t181006\n大柚子\t181007\n唐思梦\t181008\n25万元\t181009\n周芳茹\t181010\n艾露沙\t181011\n多多多\t181012\n周芳\t181013\n那么爱笑爱闹的你\t181014\n王甜甜\t181015\ne260l\t181016\n哪劲\t181017\n无效\t181018\nACE002\t181019\n无敌\t181020\n东山东胜区\t181021\n心洁\t181022\n无故\t181023\n双眉\t181024\n鼎鼎有名\t181025\n13881853031\t181026\n尹纯洁\t181027\n奥特呢\t181028\n无数\t181029\n嗯恩\t181030\n掏心窝子\t181031\n宝式\t181032\nExperiment\t181033\n暖河汗\t181034\n物业管理费\t181035\nvrnd\t181036\n欧巴斯\t181037\n哪办\t181038\n告诉我头\t181039\n何思华\t181040\n新希望\t181041\n护驾\t181042\n佛香阁\t181043\nmhhhjn\t181044\nnishoihf\t181045\n玫瑰花\t181046\n路边摊\t181047\n雷人累\t181048\n小街\t181049\n纤长\t181050\ntvbs\t181051\n半带\t181052\nctidug\t181053\n几百瓶\t181054\n气死我要\t181055\n九百千米\t181056\n因为我喜欢度\t181057\n忘你的方向\t181058\n上证指数\t181059\nTHANK\t181060\n双眸\t181061\nakk15\t181062\n配离子未成佛秘ivopro\t181063\n风火\t181064\nnSCHOOLBUSnOO\t181065\nfsdfdgdgdd\t181066\n运动服\t181067\n15138\t181068\ncomplicated\t181069\n火女\t181070\n斗角\t181071\n1970年十二月九日\t181072\n一点五名\t181073\nhfbhb\t181074\n风累\t181075\n绝无\t181076\n1.5%\t181077\n美工笔\t181078\nchiphotosbaiducomxiaodupicitem9922720e0cf3d7cac19b8a02f51fbe096b63a951jpg\t181079\n真好玩儿\t181080\nDZ\t181081\n啊凡提\t181082\n理财师\t181083\n鞋机\t181084\n和善\t181085\n另一种人\t181086\n不好看\t181087\nonh\t181088\n一个分钟\t181089\n红苜\t181090\n暴力\t181091\n卡卡西\t181092\n别老重复\t181093\n邵进斌\t181094\nona\t181095\n很美有\t181096\n开国元勋\t181097\n巧诈\t181098\none\t181099\nSOPOSOTZ\t181100\n坏人雷柏雷柏手串雷圣洁的艾雷王儿歌\t181101\n贪玩\t181102\n11t主\t181103\n滚滚滚滚滚滚滚滚滚滚滚滚\t181104\njggygh\t181105\n覃鸿道\t181106\n推拖\t181107\n巧说\t181108\n煲饭\t181109\n美姑县\t181110\n暴动\t181111\n涡子\t181112\n肉夹馍\t181113\n4tomityouo\t181114\n之二\t181115\n拿不剩\t181116\n嗯二月\t181117\n后跟\t181118\n下班他理\t181119\n安在哉\t181120\n求学\t181121\n杨金鸽\t181122\n入场券\t181123\n舆论\t181124\n拜拜晚安祝你好梦么么哒萌萌哒\t181125\nGffv\t181126\n知法\t181127\n暖流\t181128\n饿路人皆知\t181129\n众目睽睽\t181130\n节水型\t181131\n天龙八部\t181132\nGffg\t181133\non5\t181134\n奏乐\t181135\n三十四门\t181136\n雨英\t181137\n董明福\t181138\n眼唇卸妆液\t181139\n牛子\t181140\nwurvxidhd\
t181141\n一百五千五百五十五万\t181142\n无限性\t181143\n好助手\t181144\n无家可归者\t181145\n518182815828\t181146\n吴可\t181147\n南区C座马甸桥东北角\t181148\n99999999元\t181149\n四十几\t181150\n谁知道\t181151\n娘家人\t181152\n特全\t181153\n几段话\t181154\nDjfiGjvpn\t181155\n青蒜\t181156\n二90\t181157\n贯休\t181158\n小护士\t181159\ngfchb\t181160\n科技园\t181161\nStCRg\t181162\n敢不敢\t181163\n不要吃醋\t181164\njahs\t181165\njahr\t181166\n黑兔\t181167\n新鞋\t181168\n伴侣\t181169\njtgjtilj\t181170\n开花\t181171\n惰性漫延\t181172\n初步\t181173\n大海叔\t181174\n无理凡凡\t181175\n布莎卡拉卡\t181176\n王厘米\t181177\n好啦度\t181178\nprinted\t181179\n222222222222222222\t181180\n书签儿\t181181\n眼珠子\t181182\nmeagreen\t181183\n2011-03-08\t181184\n事岁\t181185\n开福区\t181186\n干豆腐\t181187\n交流感\t181188\n秘unono\t181189\n丰台律师协会\t181190\n变形\t181191\n晚上十点半\t181192\n滤材\t181193\n殉情\t181194\n王官峪\t181195\nLande\t181196\nChen\t181197\n黄大仙\t181198\n风吹雨成花\t181199\n防城港\t181200\n贴身瓦\t181201\n痛并快乐\t181202\n任重道远\t181203\n偷星异世十月片\t181204\n开眼\t181205\nqod\t181206\n雅蠛蝶雅蠛蝶雅蠛蝶羊羊\t181207\n魔方宝特\t181208\n剥皮\t181209\nVIEWALL\t181210\n百千计\t181211\n泰晤士报高等教育副刊\t181212\n1116665555555514555555\t181213\n十二一八根\t181214\n迈克尔乔丹\t181215\nsuibiankiaox\t181216\n这世道\t181217\nhugbugn\t181218\nBigbang#\t181219\n每一位\t181220\n崔超\t181221\nwcc\t181222\n度兔兔老K图兔兔图兔兔图图\t181223\n嗯王远梅\t181224\n1月11号\t181225\n闭上双眼\t181226\n7份\t181227\n15581276351\t181228\n玩具熊\t181229\n那内\t181230\n骨法\t181231\n发痴\t181232\n把握不住\t181233\n粮仓\t181234\n南法德国队\t181235\n201.23亿元\t181236\n1.5g\t181237\n原文\t181238\n发病\t181239\n横三竖三撇一点\t181240\n吴毅凡\t181241\n说话呀别离别\t181242\ngvvghhvbj\t181243\n云哥\t181244\n掏粪男\t181245\n一世纪\t181246\n发痒\t181247\n健朋\t181248\n钱明月\t181249\n粮价\t181250\n顾小艾\t181251\n原料\t181252\n车行\t181253\n13686905572\t181254\n两百九十二九十百七十\t181255\n4326\t181256\n北京国际旅游博览会冲绳观光会议局\t181257\nFuxdftyf\t181258\nwxhl\t181259\n藏民们\t181260\n老中医\t181261\n16:10\t181262\n令狐冲\t181263\n联湿\t181264\n外宣办\t181265\n产妇\t181266\n溢岸\t181267\n冯绍峰\t181268\n聚会乐\t181269\nmlgbcnmqndyd\t181270\n王b\t181271\n庞杂\t181272\n度相公\t181273\n太皇太后\t181274\n凉快\t181275\n强我弱\t181276\n权证\t181277\n宋云歌\
t181278\n古曼童\t181279\n㈠\t181280\n㈣\t181281\n㈥\t181282\n归属地儿\t181283\n信息句\t181284\n听途天\t181285\ncaac\t181286\n随喜的要你的哎呀臭美吧你\t181287\n很难\t181288\n指真是\t181289\n生活用水\t181290\n18724901496\t181291\n33.7公分\t181292\n抵抗力\t181293\n七星彩\t181294\n大同感\t181295\n两首\t181296\n黑子黑子\t181297\n对诗\t181298\n匆忙忙\t181299\n李银龙\t181300\n1728\t181301\nDTzero\t181302\n对话\t181303\n5247\t181304\n1725\t181305\n5241\t181306\n1723\t181307\n5242\t181308\n第几款\t181309\n深呼吸\t181310\n对证\t181311\nDgvshvdbcjh\t181312\n我喜欢猜迷语\t181313\n割离\t181314\n诺伊尔\t181315\n近门\t181316\n90美元\t181317\n可乐类\t181318\n国战\t181319\n雪龙\t181320\n真叫\t181321\nkjuk\t181322\n真可\t181323\n下用\t181324\n真口\t181325\n甬温线高铁\t181326\nhttpehiphotosbaiducomxiaodupicitem4\t181327\n哦呢\t181328\n我真的好相你\t181329\n胡安尔\t181330\n7286\t181331\nwhatisthis\t181332\nsere\t181333\n魔剑\t181334\nbaEngiy\t181335\n攻资\t181336\n刘秋池\t181337\n度秘我走\t181338\n下甘\t181339\n我是你的小大盗小简\t181340\n2009年6月10日\t181341\n一蠱\t181342\n神神的吧\t181343\n女娃\t181344\n图古\t181345\n图句\t181346\n举牌\t181347\n戴利\t181348\n水银\t181349\n塞尔维亚\t181350\n哦呀\t181351\n图叫\t181352\n8晚年\t181353\nddfs\t181354\n猪型\t181355\nuimation\t181356\ntomee\t181357\n楚黾\t181358\n不能动\t181359\n逛一逛\t181360\n卿水\t181361\n这臭样\t181362\n送花儿\t181363\n再版\t181364\n不能力\t181365\n杨予馨\t181366\n不必说\t181367\n五菱\t181368\n24小时\t181369\nformeI\t181370\n呵偷偷\t181371\n15916343551\t181372\n结核\t181373\n测出\t181374\n异上类\t181375\n隔阂\t181376\n孵化基地\t181377\n高精尖\t181378\n能量同维度秘\t181379\n56287542722157628855886590865846467550\t181380\n猜字谜\t181381\n大娘水饺\t181382\n处女\t181383\n凹凸曼大战\t181384\ngtx900\t181385\n知足吧\t181386\n别嘟囔\t181387\n富顺\t181388\n呼出\t181389\n率居\t181390\n55555552288\t181391\n仙人枣焦叶\t181392\n廖维忠\t181393\ncafe\t181394\n下个星期星期\t181395\n美嘉\t181396\n小和尚\t181397\n吐槽\t181398\n治疗救\t181399\n刘少利\t181400\n圭峰山\t181401\nTFBoYs\t181402\n蜡梅\t181403\n霸气侧漏型\t181404\nUTA\t181405\n我是你主人喵\t181406\n怎森达\t181407\n就我你叫我的私人健康顾问\t181408\n雀跃\t181409\n谦哥\t181410\n化石我的像话\t181411\n苗条\t181412\n一三款\t181413\n练任\t181414\n奇酷\t181415\nDhdf\t181416\n转正\t181417\n神童\t181418\n别拜\t181419\n刘
曼清\t181420\n浩克\t181421\n10000000000000000000个\t181422\n侠传奇\t181423\ng宝贝\t181424\n早六点\t181425\n跳蛋\t181426\n铁力\t181427\n啦啦队\t181428\nSbsb\t181429\nBOY\t181430\n讨厌干嘛\t181431\n婊婊\t181432\n铅弾枪\t181433\n凸凸凸\t181434\n略究\t181435\n四十四\t181436\n竹编\t181437\n苏九儿\t181438\n泥人\t181439\n们天\t181440\nBON\t181441\n不想和你无知\t181442\n爆堡\t181443\n陈晓红\t181444\n单利\t181445\n小瘪三\t181446\n咸宁法院\t181447\n采集\t181448\n心胸开朗\t181449\n家务事\t181450\n单刚\t181451\n副部长\t181452\n吴我\t181453\n迸发\t181454\n罗马帝国\t181455\n楼梯度\t181456\n单刀\t181457\n万昌科技\t181458\n马驹桥\t181459\nteacha\t181460\n怂样\t181461\n汉米镇\t181462\n万奴王\t181463\n坎坎\t181464\n艘\t181465\n王影院\t181466\n晕不聊天\t181467\n黛安娜\t181468\n嗨歌\t181469\nworld\t181470\n不有你\t181471\n下一站神奈川\t181472\n告我下\t181473\n艰\t181474\n绑架\t181475\n色\t181476\n艳\t181477\n做却\t181478\nwttt\t181479\n杨小冉\t181480\n艹\t181481\n艺\t181482\njjfbj\t181483\n艼\t181484\n艽\t181485\n艾\t181486\n可欣然\t181487\n5x2x3\t181488\n根尖周炎\t181489\n县级市\t181490\n接天莲叶无穷碧\t181491\n求生欲\t181492\njzjxb\t181493\n循环节\t181494\n支助\t181495\n臭问\t181496\nwtmt\t181497\n艮\t181498\n坎坷\t181499\n阿凤吖cici\t181500\n难点\t181501\n造业\t181502\n孤儿狗\t181503\n每2分钟\t181504\n荣誉SJB院\t181505\n牽扯\t181506\n偷偷\t181507\nfmkvo\t181508\n药王街\t181509\ntvd\t181510\ntvg\t181511\n工业化\t181512\n嗑嗑\t181513\n女儿节\t181514\nprettygirl\t181515\n马云帅\t181516\n91%\t181517\n消失行\t181518\n零二幺七\t181519\n阿拉巴\t181520\n919\t181521\n替我\t181522\ngdesdss\t181523\n一炮友\t181524\n红细胞\t181525\n王郁文\t181526\n911\t181527\n910\t181528\n913\t181529\n912\t181530\n蹦乱跳\t181531\n困乏\t181532\n洞悉\t181533\n调泌\t181534\n高氏向高\t181535\n可不可惜\t181536\n将来\t181537\n未来\t181538\n97年\t181539\n九尾\t181540\n到天明\t181541\n赢钱\t181542\n伺侯\t181543\n被选中\t181544\n好玩沙湾考试了考完试\t181545\n秘密度秘\t181546\n180分钟\t181547\n小纯\t181548\n九小\t181549\n出口成章\t181550\n聂臭\t181551\n請伱\t181552\n术科\t181553\n只只通话\t181554\n晚自习\t181555\n臭鸡\t181556\n找懂了\t181557\n97幅\t181558\n地腊节\t181559\ntutututl\t181560\nNoImagirl\t181561\nbjc\t181562\n夫妻之间\t181563\ngtgn\t181564\nNo\t181565\n社会主义\t181566\nbjl\t181567\ngtgd\t181568\nbjj\t181569\ngtgg\t181570\nN
z\t181571\n制冰机\t181572\n顺带\t181573\nNp\t181574\n陈浩南\t181575\ngtgs\t181576\ngtgt\t181577\n阿茨海默症\t181578\ngtgv\t181579\n损友\t181580\nNH\t181581\nNI\t181582\n度秘你的蛋碎\t181583\nNK\t181584\n玉兰花\t181585\nNN\t181586\nNO\t181587\n那个小学我的了没说话的人民航模\t181588\nNB\t181589\nNC\t181590\nND\t181591\nmxh649\t181592\n神庙者\t181593\n裴雨萱\t181594\n白边儿\t181595\n笔式\t181596\nNP\t181597\n轻浮\t181598\nNR\t181599\nNV\t181600\n期待爱情公寓2\t181601\n46%\t181602\nwidowyou\t181603\n瑟完\t181604\n猪器人\t181605\n我地\t181606\n一千种\t181607\n扑向\t181608\n嘎嘎嘎嘎嘎嘎鸡\t181609\n洗掉\t181610\nN9\t181611\n一千秒\t181612\n今天下午四点半\t181613\n银川\t181614\n行歌\t181615\n会病\t181616\n杨克松\t181617\nN2\t181618\n梁上\t181619\n暮烟\t181620\n勾图\t181621\n奔驰AMG\t181622\n猜一猜网\t181623\n初五初六\t181624\n朱仙庄\t181625\n梁丹\t181626\n梁丽\t181627\n屌别逃\t181628\n光之巨人\t181629\n专题歌\t181630\n常青树\t181631\n一六岁\t181632\n胡美琪\t181633\n广汇股份\t181634\n益力多\t181635\n程文婷\t181636\n大名儿\t181637\n改签\t181638\n互通有无\t181639\n痛脚\t181640\n1323209475\t181641\n吕星雨\t181642\n慢递\t181643\n亲身\t181644\n一起去诉\t181645\n投递\t181646\n陇海\t181647\n你的乖乖\t181648\n孟子\t181649\n我知道你谁你是度秘\t181650\n幽幽\t181651\n12吨\t181652\n烟袋\t181653\n超频三Q7\t181654\n朦胧\t181655\n吴青浦\t181656\n王景然\t181657\n张梓\t181658\n金信\t181659\n奇米泉\t181660\n里孙\t181661\nhdisijxjfi\t181662\n敢言\t181663\n找我没\t181664\n里子\t181665\nT56次\t181666\n茶凉\t181667\n三个半\t181668\n哥喇CS\t181669\n脚踝处\t181670\n报道\t181671\n风投公司\t181672\n再接再厉豪情壮志奋发\t181673\n右腿\t181674\n去了再见再见再见\t181675\nmllplmmmk\t181676\n幻想乐园\t181677\n茶几\t181678\n塔维什\t181679\n油锯xx\t181680\n深圳大学\t181681\nBiao\t181682\n烧锅\t181683\n六四百六八\t181684\n宾果消消乐\t181685\n刘梦琪\t181686\n凶信不信\t181687\n挫折\t181688\n画费\t181689\n巧妙\t181690\n涯眼汁\t181691\n006期\t181692\n4.7%\t181693\n弄脏\t181694\n画贴\t181695\n3600多元\t181696\n这点钱\t181697\n冰粥\t181698\n巴兰\t181699\n禹炫敏\t181700\n自助银行\t181701\n告诉你我喜欢\t181702\n好多明\t181703\n沙林\t181704\n巧妇\t181705\n小一中\t181706\n不懂凶\t181707\n许佳娜\t181708\n心轻\t181709\n铁可\t181710\n罗汉净素面\t181711\n边疆\t181712\n眼底\t181713\n巴克\t181714\n真不可以\t181715\n任得华\t181716\n爬下\t181717\n爬上\t181718\n女滴\t181719\n几艘\t181720\n命中注定\t1
81721\nfrdgbr\t181722\n罗小伊\t181723\n100周\t181724\n铲子\t181725\n躁进\t181726\nkuvhlubkoj\t181727\n释然\t181728\n过廊\t181729\n我的谁脸\t181730\n菩提\t181731\noooooooooooooooooooo\t181732\n困天使\t181733\n纪念币\t181734\n一下哈\t181735\n5万一下\t181736\n说不不屑\t181737\n肖静雯\t181738\n八十分钟\t181739\n每况愈下\t181740\n2011年5月22日04时31分\t181741\n平远街\t181742\n圣彼得堡\t181743\n乱聊\t181744\n背盖\t181745\n接长\t181746\n万惠霖\t181747\nUFO\t181748\n999组\t181749\nH30cross&amp\t181750\n凡尘\t181751\n快乐狠\t181752\n迷迷迷笛\t181753\n突飞猛进\t181754\n嗯原来窝\t181755\n日出江山红胜火\t181756\nsmall\t181757\n押送\t181758\nゲjhhky\t181759\nexyioohdu\t181760\nevelours\t181761\n刚刚好\t181762\n嗯鸡婆\t181763\n雷亚尔\t181764\n错啦煎饼果子\t181765\n秀秀秀\t181766\n主谋\t181767\n美贝尔\t181768\n六七八九\t181769\nMcCarty\t181770\n国大\t181771\n海底针\t181772\nwueid\t181773\n陶大邱塔\t181774\n僵死\t181775\n两毙\t181776\n两毛\t181777\n温顺\t181778\n内因\t181779\n冲天\t181780\n王森慢\t181781\n罗雪萌\t181782\ndfffffx\t181783\nrncgk\t181784\n春杯\t181785\n上学了你在哪鸟\t181786\n智能院\t181787\n多看看度机器人\t181788\n站给\t181789\n疯了别\t181790\n泛起\t181791\nEvh\t181792\n宇春\t181793\n94226577095768079577657270\t181794\n太紧\t181795\nmmmjtjt\t181796\nppgtamgpgjgag\t181797\n扩大\t181798\n赣七中\t181799\n太累\t181800\n游淑婷\t181801\n高分\t181802\n两三年前\t181803\n绿匣子\t181804\nSMTOWN\t181805\n提神\t181806\n第一层\t181807\n废除\t181808\n巴芬岛\t181809\n挑唆\t181810\n9999局\t181811\n第一届\t181812\n宇星\t181813\n旁人\t181814\n3w点六百\t181815\nkrzbyexvki\t181816\n井口\t181817\n整栋楼\t181818\n谷子\t181819\n第十四\t181820\ny060\t181821\n5228226\t181822\n3nt四\t181823\n傻鱼\t181824\n女装\t181825\n呆关\t181826\n裙请\t181827\n十块\t181828\n猪屁屁\t181829\n精英们\t181830\n霉美\t181831\n上海市政府\t181832\n形状\t181833\nVhhhhjklllllllgsCc\t181834\n20棵\t181835\n北京市第十一届委员会常务委员会\t181836\n来罗\t181837\n心里平\t181838\n孙武\t181839\n朱雨诺\t181840\n424245756\t181841\n天天三星\t181842\n珠市\t181843\n用心良苦\t181844\nab办法\t181845\n嘴疼\t181846\n说三道四就不必了\t181847\n古板头\t181848\n蜂糕\t181849\nhycujh\t181850\nznssn\t181851\n扪心自问\t181852\n一簇\t181853\n14141\t181854\n囧囧囧囧囧囧囧囧囧囧囧囧\t181855\n圆柱形\t181856\n严密\t181857\nthwm\t181858\n海纳币\t181859\n0.5米\t181860\n百汇宾馆\t18
1861\n严寒\t181862\n度秘制\t181863\n朋友们\t181864\n母弹药\t181865\nabhit\t181866\n一秒钟后\t181867\n小太郎\t181868\n安以轩\t181869\nmcqueen\t181870\ngnfnfmtbjdmdkfdyfyejdgvfxhshyv45\t181871\n马利亚\t181872\n溃坝\t181873\n占据\t181874\n劝怨\t181875\n抢埋单\t181876\n黄汉\t181877\n寒江雪\t181878\n阿子女\t181879\n翁媪\t181880\n度秘刘\t181881\n過兩\t181882\nFusion\t181883\n困莫\t181884\n黄江\t181885\nworry\t181886\n8得多\t181887\n区区\t181888\n备至\t181889\n短发\t181890\n齐王\t181891\n77部\t181892\ndswwrdetty\t181893\n凌晨四点\t181894\n茉莉花\t181895\n刘志慧\t181896\n大智若愚\t181897\nDIKS\t181898\n要死不该\t181899\niggzxh\t181900\n短号\t181901\n吴家五\t181902\n通州时代中学\t181903\n米拉库\t181904\n孟费美\t181905\n1390713331559\t181906\n短句\t181907\nffvvhubuikopmhhipjhih\t181908\n叙鹏\t181909\n拉德\t181910\n徐佳宁\t181911\n牙医春阳一穿就是我\t181912\n梓潼\t181913\n字拖\t181914\n信你\t181915\n剑三\t181916\n番茄虾仁咕噜虾仁神马\t181917\n自黑鲸\t181918\n娄梦瑶\t181919\n说我讨厌\t181920\nhhhygfr\t181921\n柳志贺\t181922\n一周到错\t181923\n2016届\t181924\n周勐宁\t181925\n脚痛\t181926\n大嫂\t181927\n莱雪猪猪侠大电影院\t181928\n保时捷\t181929\n八点十八分\t181930\n信佛\t181931\n15235377\t181932\n高湛\t181933\n老八婆\t181934\n我的好像\t181935\n王国庆\t181936\n刘驰山\t181937\n张翔昊\t181938\nsefct\t181939\n卡西\t181940\n灵度秘\t181941\n三国\t181942\njsjsoiahahsksknzvzhkNBhakaknabshbnznkMsnkJDZHZJBbjHhjjsjsjjsbhsishHJSKJSHSBSSHHSHSB\t181943\n小春\t181944\nvpatit\t181945\n错哪勒\t181946\n0557758870\t181947\n手杖\t181948\n小星\t181949\n颜料\t181950\n兰卅\t181951\n鼻子感\t181952\n好啦谢谢你待见\t181953\n5131亿元\t181954\n病理\t181955\n奇门\t181956\n小明\t181957\n数字电影学院\t181958\n流浪想不到你\t181959\n手板\t181960\n张是人\t181961\n随遇而安\t181962\n天天有戏\t181963\n大盘鸡\t181964\n吃断血流\t181965\n应红萍\t181966\n首堵\t181967\n长宇\t181968\ngfr\t181969\ngfs\t181970\n林欣彤\t181971\n吃梨\t181972\n徐会\t181973\ngft\t181974\ngfu\t181975\n唐芷依\t181976\n白去\t181977\ngfh\t181978\n擦伤\t181979\n87878787\t181980\ngfc\t181981\n都一点也\t181982\ngff\t181983\ngfg\t181984\ngfd\t181985\n我是谁了我是你的好朋友\t181986\n护手\t181987\n墀月\t181988\n吴中磊\t181989\n财产节\t181990\n代售商\t181991\n二六二六\t181992\n迫不及待\t181993\n萧亚轩\t181994\n产地\t181995\n宠物小宠物你好你好\t181996\n湿热\t181997\n到晚\t181998\n传媒大厦\t181
999\n未经说了我不想和你说了我想玩\t182000\n面包片\t182001\n穆素勤\t182002\n5月22日12点12分\t182003\n雄霸没事儿人家过二中\t182004\n冰心杯\t182005\n亚灭地\t182006\n嫩度\t182007\n幺二零幺零二五幺五七零两个零三个二幺二八\t182008\n信用笔\t182009\nreert\t182010\n正义感\t182011\n高i\t182012\n金瓜猪\t182013\n好呀听话\t182014\n有愧于\t182015\n小佳佳\t182016\n13760367773\t182017\n童润雨\t182018\n死尸\t182019\nbeeffd\t182020\n不而且\t182021\n努弗皇實慰\t182022\n杨小洋\t182023\n克洛普\t182024\nmabawoia\t182025\n阵型\t182026\n长才\t182027\n延安\t182028\nap1009238\t182029\n小叽叽小尤\t182030\n李佳乐\t182031\n扫射\t182032\nBgish\t182033\n绘图\t182034\n多讨\t182035\nDCDeY44呃6多362\t182036\n刘工管\t182037\n有问必答\t182038\n2O00\t182039\n贵州\t182040\ndfmdmd\t182041\n3352038318\t182042\n五点60点\t182043\n毛逼\t182044\nmhtmatomatitotloudioaatowwo堡葡萄葡萄葡萄葡萄\t182045\n血热\t182046\n佛听v\t182047\n冯欣欣\t182048\nv愁\t182049\n国美电器电子商城\t182050\n8134吨\t182051\n一千一百一十五一千一百一十六1119\t182052\n开方\t182053\navee\t182054\n男土女火--土火夫妻大昌吉\t182055\n骗局\t182056\n曹可升\t182057\n全能工\t182058\n下儿\t182059\n阿土伯伯伯伯伯伯\t182060\n巴拿马\t182061\n淚\t182062\nijhhh\t182063\n哈六\t182064\n多看看\t182065\n小度秘你好很高兴认识你\t182066\nLick\t182067\n着调\t182068\n消化器官\t182069\n闵胜明\t182070\nccy\t182071\n9855585588554545885575800058577777855\t182072\n笔答\t182073\n国耻\t182074\n行不过\t182075\n国者\t182076\n翻篇\t182077\n禽流\t182078\n国考\t182079\n969363936380556\t182080\n基滴\t182081\nGgjnnmmmmmmmmmmmm\t182082\n猫咪咪\t182083\n笔签\t182084\n傲岸\t182085\n堕大地狱\t182086\nshoutat\t182087\n真心力\t182088\n59741\t182089\n容嬷嬷\t182090\n赵博涵\t182091\n限购令\t182092\n陌小兮\t182093\n母女\t182094\n五十页\t182095\n111家\t182096\n一切都好\t182097\nwhaturmam\t182098\nyftufidudib\t182099\n检出\t182100\n李建豪\t182101\n這個嗎\t182102\nself\t182103\n朱淼\t182104\n在心里\t182105\n扮没\t182106\n备用\t182107\n就一样\t182108\n旅游部\t182109\n反压\t182110\nimylceyou\t182111\nguydyud\t182112\n双氧翅\t182113\n睡一晚上\t182114\n好女\t182115\n击溃\t182116\n灌铅\t182117\n无壳\t182118\n加内特\t182119\n无声\t182120\n九芝堂\t182121\n告密\t182122\n可图天冷\t182123\n杜师傅\t182124\n曲沃区\t182125\n500g\t182126\n璇玲\t182127\n168%\t182128\n5月16日\t182129\n杨雅\t182130\n白骨\t182131\n小憨憨\t182132\n经济学\t182133\n早先\t182134\n1680
\t182135\n1682\t182136\n人度\t182137\n孙婷婷\t182138\n215米\t182139\n大酒店业集团\t182140\n秘奥特\t182141\n刮过\t182142\n目视\t182143\n榨菜\t182144\n最多久\t182145\nugc\t182146\n不爱玩\t182147\nugg\t182148\n探母\t182149\nugd\t182150\n萌溪\t182151\n13145687\t182152\n力度\t182153\n哈哈八\t182154\n举重\t182155\n喝咖灰\t182156\n黑煤窑\t182157\n已有效日期\t182158\n硝酸钾\t182159\n软驱\t182160\n哈哈党\t182161\n哈根达斯\t182162\n调理药\t182163\nUfchgh\t182164\n圆弧\t182165\n远走\t182166\n汪华\t182167\n远起\t182168\ncollow\t182169\n铅笔社\t182170\n咪咪咪咪咪咪咪咪咪咪咪咪\t182171\n汪升\t182172\n亲爱的我走\t182173\n淘气妃\t182174\n显露\t182175\n谭美玲\t182176\n珍存\t182177\nTrixieLulamoon\t182178\nWife\t182179\n现真\t182180\nWifi\t182181\n莱墨\t182182\n自保\t182183\n美啦美\t182184\n哎呦呦aokoutletotometoutohottomatimakotototoutocadapk了不求求\t182185\n幸福人寿\t182186\n机兜\t182187\n创先\t182188\nGhcrarvrz5jufrbucrvhshxgvuvrztsknif3cinin6humtxex4f6g6hrdevymuj4ded2f6l7\t182189\n随从\t182190\n姜师哥\t182191\n查讯\t182192\n很丑陋\t182193\nghgdchdviwqopfd\t182194\n乡约温\t182195\n离队\t182196\n周知\t182197\n现眼\t182198\n莫南爵\t182199\nangelababy美\t182200\nh4thotfilesonghtinthisestfrthot\t182201\n自信\t182202\n575544\t182203\n机关\t182204\n自修\t182205\n鼻中\t182206\nhndhagwiiw\t182207\n张华丽\t182208\n有了看\t182209\n苍白无力\t182210\n绿窗春与天俱莫\t182211\n好光荣\t182212\n帆板\t182213\n开叉元\t182214\n白丽莎\t182215\n纳瓦\t182216\n好好活下去\t182217\n剩系\t182218\ntred\t182219\nsipu\t182220\n灰雀\t182221\n并购\t182222\n彭给\t182223\n2002shuanzi\t182224\n四胞胎\t182225\n猪猪们\t182226\n酒耳机\t182227\n认识非若\t182228\n林嘉颖\t182229\n一个28\t182230\n东莞爆炸所\t182231\n日小花\t182232\n落地骑\t182233\n坏账\t182234\n冰雪聪明\t182235\nGkfj\t182236\n摩萨德\t182237\nsb，sb\t182238\n球类\t182239\n稔\t182240\n稗\t182241\n今风\t182242\nlalallal\t182243\n13522091902\t182244\n田中圣二\t182245\n一万三千元\t182246\n稚\t182247\n毁脸\t182248\n埃弗顿\t182249\n来来羊羊\t182250\n巴鲁巴\t182251\n大奔套\t182252\nwy屁股\t182253\n小度子\t182254\n稍\t182255\n税\t182256\n盖地\t182257\n稈\t182258\n程\t182259\nlass\t182260\nlast\t182261\n因为你是我最爱的人\t182262\n闪电霞\t182263\n稳\t182264\n稼\t182265\n稿\t182266\n谈股论金\t182267\n稻\t182268\nconnection\t182269\n稠\t182270\n跨行\t182271\n闲麻烦\t18
2272\n尚利媛\t182273\n变现\t182274\n大明星\t182275\nIMAX\t182276\n韦玉光\t182277\n888888888888888888\t182278\n阿狸嘎羧\t182279\n大事\t182280\n李亚聪\t182281\n千羽佳\t182282\nfufufuu\t182283\n等等\t182284\n不是你去那女孩是我亲那个女孩\t182285\n谢谢你有\t182286\n焦头烂额\t182287\n最美幼个\t182288\n澳耶斯\t182289\n23:00\t182290\n大五\t182291\n这么多遍\t182292\n大云\t182293\nhttppinyincn1aS7K44bIdG\t182294\n大亨\t182295\n单鞋\t182296\n海贼待会\t182297\n何苏海\t182298\n濒死\t182299\nrunner\t182300\nkjjkjlkk\t182301\n副会\t182302\n啊子\t182303\n尊称\t182304\n吧假的吧\t182305\n莫怪\t182306\nb0y\t182307\n截成\t182308\n粗不好\t182309\n大亲\t182310\n橘子堡葡萄味儿香蕉网\t182311\nwhipple\t182312\n4352645648484845494261\t182313\n姑息\t182314\n一抢而空\t182315\n805亿美元\t182316\n翁程程\t182317\n当街\t182318\n柠檬醋\t182319\n一首曹\t182320\n三脚架\t182321\n样儿\t182322\n安索尼\t182323\n蓝球\t182324\n511216507880000\t182325\n因文\t182326\n莎莉翠\t182327\n上百年\t182328\n吕雯\t182329\n真好玩\t182330\n丁恺宁\t182331\n时而\t182332\nqfnl\t182333\n年中\t182334\ncomso\t182335\n迅雷\t182336\n额月iIE8\t182337\n狂行勤学而烛与我的一样\t182338\nvob\t182339\n工价\t182340\nvol\t182341\n深奧\t182342\n六块\t182343\nvoo\t182344\nmotors\t182345\n翰场\t182346\nvot\t182347\nvou\t182348\nvov\t182349\n于晏\t182350\nvoy\t182351\n龚俊\t182352\nkē\t182353\n滴菇\t182354\n没门儿\t182355\n首歌行吗求求你了我不是让你给我讲故事我要你图我吃了聋人不孕的女主角传说\t182356\n偶师\t182357\n亚必须\t182358\n六坨\t182359\n小远\t182360\n克马\t182361\n西庄\t182362\n双鉴探测器\t182363\n疯汉\t182364\na股\t182365\n凸乳\t182366\nhdrfjf\t182367\n贾维斯\t182368\n兰玉\t182369\n饿了了怕\t182370\n语分钟\t182371\n沈春阳\t182372\n16:58\t182373\ndggds\t182374\n闪光\t182375\n棒长\t182376\n臭乖\t182377\n去年9月30日\t182378\n研究家\t182379\n６\t182380\n怪王\t182381\n百合姑娘玫瑰姑娘\t182382\n012345\t182383\n桑塔\t182384\nQQ秘\t182385\n道帘\t182386\n一盘散沙\t182387\n西康\t182388\n波密\t182389\n紫川\t182390\nporndada\t182391\n二十二十二号\t182392\n靠靠靠靠靠靠靠靠靠\t182393\ngghcgh\t182394\n向前走\t182395\n不碰\t182396\n奶罩\t182397\n绊脚石\t182398\n16:55\t182399\n一马下次\t182400\n单讨\t182401\n笨萤\t182402\n绝对值\t182403\nOPPO2s\t182404\n更严重\t182405\n李一起\t182406\n原木\t182407\n筒子\t182408\n国美堡\t182409\n不碎\t182410\n马小跳\t182411\n瓮娃\t182412\n二十九块\t182413\n誉博唱\t18241
4\n希米\t182415\n福华\t182416\n精彩人生\t182417\n杨哥\t182418\n鼠标垫\t182419\n刘君利\t182420\n合肥报业传媒集团\t182421\nEQesc\t182422\n一六个\t182423\n有毒食品公司\t182424\nGordon\t182425\n再出\t182426\nNO.10\t182427\n夸父\t182428\n一年半\t182429\n一百二十多块\t182430\n速率\t182431\n静候\t182432\n瞿雯倩\t182433\n肚美美\t182434\n杨哇\t182435\n玩凡\t182436\n洗吧\t182437\n赶趟\t182438\n福卡\t182439\n安多拉\t182440\n13831655875\t182441\n活活活\t182442\n赶超\t182443\n福卷\t182444\n团校\t182445\n无家可归\t182446\n有影像\t182447\nFerragamo\t182448\n联想电脑\t182449\n锣鼓车\t182450\n吃卷\t182451\n黄文社\t182452\n符雅婷\t182453\n白天下午\t182454\n红白瞎\t182455\n好我心情了\t182456\n慢点儿\t182457\n周承旭\t182458\n我是你的知心好友\t182459\n胶片\t182460\n架橋\t182461\n你好姐\t182462\n宋朝\t182463\n好好不好\t182464\n3youyy\t182465\nqrsgeef\t182466\n前三十年\t182467\n臭小子\t182468\n善良的朋友\t182469\n跑不赢\t182470\n独树一帜\t182471\n过去一个小时\t182472\n机格机\t182473\n十一页\t182474\n头两天\t182475\n三母\t182476\n微醺\t182477\n张泽鲁\t182478\nfaukyou\t182479\n胡萝卜八字\t182480\ncoooomistle\t182481\n地下王\t182482\n三毛\t182483\n都秘\t182484\n右拐\t182485\n无色版\t182486\nK线图\t182487\n完美欺骗管理\t182488\n谁谁谁谁谁谁学校\t182489\n轮滑鞋\t182490\n鸡同鸭讲\t182491\n挂彩\t182492\n挑客\t182493\n惠风\t182494\n华奥天平\t182495\n酒糾結\t182496\n猫国\t182497\nlastai\t182498\n硍度秘\t182499\nteachme\t182500\n志山\t182501\n484575276585295928545597816555555822859458598555668855889895\t182502\n金泰妍\t182503\n挖槽\t182504\n100007米\t182505\n微醉\t182506\n马分婉\t182507\n顺景店我不想分就\t182508\nHFY\t182509\n恩春哈\t182510\n杨晨涛\t182511\n昭示\t182512\nabccou\t182513\n试读\t182514\n橘\t182515\n橙\t182516\nHFT\t182517\n机主\t182518\nmuka\t182519\n橇\t182520\n翠湖\t182521\nShak\t182522\n这次个\t182523\nShan\t182524\nShao\t182525\nHFF\t182526\n多米马\t182527\n别客气\t182528\n马文杰\t182529\n章佳峻\t182530\n试试\t182531\n饭包\t182532\n夜夜夜撸\t182533\n橹\t182534\n诸暨技师学院\t182535\n阳茎\t182536\n近不近\t182537\n服员\t182538\n呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t182539\nchs\t182540\nHFm\t182541\nUser\t182542\n2012年1月19日\t182543\nfzl\t182544\ncht\t182545\nfzf\t182546\n试词\t182547\nfzd\t182548\n转载\t182549\n工作余\t182550\n灵鹫山\t182551\n疯了人话\t182552\n转轴\t182553\n安神\t182554\n消失不然\t182555\n杨雨\t182556\n转转\t182557\n如玉
\t182558\n表演出\t182559\n步步步步步步步步\t182560\n杨慧琳\t182561\n变声\t182562\n孟其秦\t182563\n踏雪无痕\t182564\n于媛\t182565\n56k99\t182566\n呃通话\t182567\n心情不舒服\t182568\n正事\t182569\n小伙子\t182570\n小天才们\t182571\n目的\t182572\n马智磊\t182573\n呜咪\t182574\n零二二七\t182575\n要靠\t182576\n七十斤\t182577\n库神\t182578\njjlll\t182579\n赵欣月\t182580\n西夏向天歌白毛浮绿水\t182581\n撸管榜\t182582\n咶\t182583\n缇雨萱\t182584\n咱\t182585\n咳\t182586\n我讨厌你我讨厌你我讨厌你\t182587\n咾\t182588\n咿\t182589\n咸\t182590\n老小孩儿\t182591\n咻\t182592\n将来时代\t182593\n十三场\t182594\n咧\t182595\n邵忠梅\t182596\n咣\t182597\n咬\t182598\n咭\t182599\n咯\t182600\n咨\t182601\n咩\t182602\n阿嫂\t182603\n咔\t182604\n弄冷\t182605\n咖\t182606\n咗\t182607\n雪灾\t182608\n咒\t182609\n修真犬\t182610\n咝\t182611\n两瓶\t182612\n志龙偶\t182613\n咙\t182614\n咚\t182615\n咛\t182616\n咅\t182617\n咆\t182618\n艾莎莉娜\t182619\n咀\t182620\n拐带瓜达瓜达瓜\t182621\n咂\t182622\n和\t182623\n印发\t182624\n咎\t182625\n咏\t182626\n繁忙\t182627\n爱不变\t182628\n两瓣\t182629\n藕带你带你上西天\t182630\n55666111\t182631\n高邑\t182632\n清想\t182633\n小彩云\t182634\n干嘛犬\t182635\n果不开心\t182636\n韩霜\t182637\n826亿数\t182638\n仙乐\t182639\n滴水之恩必当涌泉相报\t182640\n好不好咧\t182641\n霸权\t182642\nniqi\t182643\n一百零八\t182644\n依此\t182645\n有滋有味\t182646\n科克\t182647\n咖喱雞老林\t182648\n一百零六\t182649\n搜神看来点嘿逗虾嘎拉嘎\t182650\n节目\t182651\nggvvvdf\t182652\n几个零\t182653\n南宇春亚\t182654\n高邮\t182655\n米家族\t182656\n沙嘎\t182657\n一大12\t182658\nJif\t182659\n一大串\t182660\n长大来\t182661\nJia\t182662\npeakenglash\t182663\nJil\t182664\n一个一数\t182665\n美静\t182666\n三十门\t182667\n2016906706\t182668\n一大个\t182669\n中印\t182670\n蒙古包\t182671\n练证\t182672\n汇款\t182673\n86。008\t182674\n伟杰\t182675\n中午\t182676\n度秘哈哈哈哈哈哈\t182677\n087925436\t182678\n康梦\t182679\n你是猪吗你是猪么你好字成医师朱\t182680\n的是的谢谢你我的好朋友\t182681\n协议\t182682\n桥头河\t182683\n绝不后悔\t182684\n五额\t182685\n笨娇颜\t182686\n转字控\t182687\n我相信你会成功的加油度秘\t182688\n版面\t182689\n一妻多夫\t182690\n中南\t182691\n中单\t182692\nghhgssd\t182693\njdgrgfjdidd\t182694\n现存\t182695\n徐羽\t182696\n零句话\t182697\n谢耳朵\t182698\n都市全网\t182699\n倒叙\t182700\n周墙\t182701\n魏政委\t182702\n牛b类\t182703\n辛辛苦苦\t182704\n林言\t182705\n毛病\t182706\n摘下\t182707\n勒
叫\t182708\n当为战国\t182709\n有有有有有有有有有有有有有有有有有有有有\t182710\n咱那你堡\t182711\n554分\t182712\n余县\t182713\n修行者\t182714\n哼哼唧\t182715\n干家诗\t182716\n刘一凡\t182717\n徐美\t182718\n倦怠\t182719\n公权\t182720\nsslyl\t182721\n大橙橙橙橙\t182722\n三个月\t182723\n瞎信\t182724\n618215181015\t182725\n水墙\t182726\n题背\t182727\n叶华\t182728\n一号线\t182729\n卜闩吖\t182730\n潘美琪\t182731\n2772666611222\t182732\njvugk\t182733\n小释然\t182734\n石家庄\t182735\n再见呀我真的很有事\t182736\n33天\t182737\n沙坑\t182738\n端详\t182739\n寒武\t182740\n师资\t182741\nsbc2bhc\t182742\n1644676\t182743\nffcccvv\t182744\n脑脊液\t182745\n我就我就是你了恨死你\t182746\ntdf\t182747\n女的屁股大提莫不爱你的你说我真的爱你我真的很爱你\t182748\ncb700\t182749\n制作业\t182750\n车模\t182751\ntdd\t182752\n点苦\t182753\n好白\t182754\n345678点\t182755\n土皮\t182756\n枝条\t182757\n11111\t182758\ndtyd\t182759\n短靴\t182760\n自\t182761\n王从亮\t182762\nhffghyg\t182763\n羊腿\t182764\n吴尚桐\t182765\n鸭蛋\t182766\n13437395377\t182767\n金针\t182768\n懒觉\t182769\n五六遍\t182770\n金钟\t182771\n雨片\t182772\n灡嵐\t182773\n烦惑\t182774\n维梦\t182775\n金钢\t182776\n13848988626\t182777\n傲天\t182778\n30000\t182779\n安西\t182780\n大便秘\t182781\n哥天\t182782\neleven\t182783\n15673398780\t182784\n三十秒\t182785\n金钱\t182786\n喘不过气\t182787\n我求你了我\t182788\nrrdgfbd\t182789\n臡\t182790\n皱皱巴巴\t182791\n当机立\t182792\n吧务引导小度秘\t182793\n三十秘\t182794\nbaba\t182795\n冰冻果冻\t182796\n做强奸\t182797\n冰座\t182798\n表酱紫嘛\t182799\nbabi\t182800\n源源不断\t182801\ns0s\t182802\n汗毛\t182803\n杰话\t182804\n24.88万元\t182805\n想够了\t182806\n000000000nnnnnnnn\t182807\n肛巴碟\t182808\n花盆儿\t182809\n草原上\t182810\n14702446\t182811\nbaby\t182812\n遥遥领先\t182813\n天啦噜口了口口口口路口了龙龙\t182814\n纯素\t182815\n小璃\t182816\n婚车\t182817\n5554554568\t182818\n8月28日\t182819\n胡一涵\t182820\n佳品\t182821\n算不要脸\t182822\n小璐\t182823\nbabY\t182824\n报国寺鬼市\t182825\n干度\t182826\n旺仔牛奶糖\t182827\n正所谓\t182828\n大粪\t182829\nv066b\t182830\n7X5\t182831\n红山\t182832\nxgg\t182833\n15950378288\t182834\n袁雪英\t182835\nmmmmmmmmmmmMNinnmmmmnnnnnnnnnN7hnnnnNhnnnnnnnnnNuNuMjMijMmmmMmMmmmm\t182836\n边检\t182837\n七K闹\t182838\n李熙琳\t182839\n我和我的闺密闹\t182840\n957l门\t182841\n被解雇\t182842\n各业\t182843\n4分
\t182844\n首例\t182845\n为秀\t182846\n海礁\t182847\n法表\t182848\nbvvhhbb\t182849\n为秘\t182850\n挨济济\t182851\n超龄\t182852\n贝贝姐\t182853\nFriends\t182854\nKaide\t182855\n睡意\t182856\n石料\t182857\n雷曼兄弟\t182858\n检讨论组\t182859\n市南\t182860\n这么长\t182861\n那个小人\t182862\n好跟风\t182863\n僧尼乐\t182864\n无坚\t182865\nsosad\t182866\n闪闪发亮\t182867\n12288\t182868\nip地址\t182869\n圈套\t182870\n酒红\t182871\n18岁时\t182872\n惹祸上身\t182873\n慢慢来来来来来来\t182874\n臃\t182875\nhjgnb\t182876\n饲料化\t182877\n正焕\t182878\n亲一个世界\t182879\n要不要不一会\t182880\n出错误\t182881\nNnvgjgg\t182882\n二十八号\t182883\njvjhjjkoip\t182884\n姚欣荣\t182885\n125569\t182886\n九百多块\t182887\n背搬家\t182888\nka1ja1a\t182889\n斯连\t182890\n亲信\t182891\n新年快乐麻耶\t182892\n郭钰淇\t182893\nａｎｇｅｌａｂａｂｙ\t182894\n金家街\t182895\n邓德琪\t182896\n盅内\t182897\ntf卜嗳4youy\t182898\n禅语\t182899\n蛋儿\t182900\nperfect\t182901\njovdehi\t182902\n周某\t182903\n行不行果\t182904\n撑裂\t182905\nBrooks\t182906\n炙艾\t182907\n三千多块\t182908\n敞绩\t182909\n藏历\t182910\nbuhua\t182911\n普雄段\t182912\n胡诶\t182913\n国际范\t182914\n战役\t182915\n不又\t182916\n小骨\t182917\n不发\t182918\n12312361733\t182919\nvuyc\t182920\n解密\t182921\n不取\t182922\n不变\t182923\n基秘\t182924\n0.73%\t182925\n血管瘤\t182926\n星宫莓\t182927\n不叫\t182928\n不只\t182929\n胡话\t182930\n宁德\t182931\n自知之明没\t182932\n3.07%\t182933\nbadman\t182934\ncookomi\t182935\n不满场\t182936\n累累\t182937\n帕金森\t182938\n立足\t182939\n实话说\t182940\nPO图\t182941\n没不让\t182942\n曾清明\t182943\n厉害\t182944\n来日日\t182945\n陈迹清梵\t182946\n小小羊\t182947\n笔直\t182948\n复仇\t182949\n上用场\t182950\n行什么叫\t182951\n李广州\t182952\n黑美人\t182953\n没有太阳\t182954\n不和你少\t182955\n石虹宇\t182956\n致谢\t182957\n志超\t182958\nkkgfc\t182959\n我一定要\t182960\n制片商\t182961\n发说\t182962\n嘉兴市\t182963\n发请\t182964\nffcdddfcrdcterc\t182965\n火辣\t182966\n下一代\t182967\n下星期k\t182968\n谁相信你的话\t182969\n柳儿\t182970\nStore\t182971\n好媚\t182972\n女累点\t182973\n修路队\t182974\n刑讯逼供\t182975\n就在这花好月圆夜两心相爱心相悦，在这花好月圆夜有情人儿成双对\t182976\n殷忧真累\t182977\n好伙伴儿\t182978\n5D3\t182979\nmutlij\t182980\n呀嗯\t182981\nJean\t182982\n明度\t182983\n女学生\t182984\ntoverylow\t182985\n秘野球了吗给我看一个美人\t182986\n周子艺\t182987\n牛牛只
qpp尿尿我爱你\t182988\n明天早上8点20\t182989\n来张你的自拍网红\t182990\n起生\t182991\n2497920770\t182992\n13527079977\t182993\n玉米片\t182994\n相老\t182995\n恩恩恩\t182996\n汽柴\t182997\n急急急\t182998\n雷克萨斯570\t182999\n王岩岩\t183000\n周鑫\t183001\n五个字\t183002\n四万多\t183003\n元二\t183004\n强制\t183005\n朝北\t183006\ncommunicator\t183007\n欧盟\t183008\n微预告#\t183009\n包蕾蕾\t183010\n强切\t183011\n据手\t183012\n1357628\t183013\n查实\t183014\n易车\t183015\n教育程度\t183016\nktkk\t183017\n毕业生\t183018\n16点25\t183019\n凌葙\t183020\n百富六月\t183021\n三号度\t183022\n快乐会\t183023\n某来\t183024\n世界上明\t183025\n一个六阶\t183026\n庆幸\t183027\n读谜\t183028\n大放\t183029\n小道消息\t183030\n好高兴\t183031\n点扣\t183032\n禽兽片\t183033\n蝉联\t183034\n想秀\t183035\neBye\t183036\n胡你\t183037\n西固巷\t183038\naijajs\t183039\n白瞎\t183040\n拱门公园\t183041\n歼20\t183042\n外头业\t183043\n啊恩恩\t183044\n陈嘉\t183045\n2947元\t183046\n林成天\t183047\n20年内\t183048\n过去的朋友\t183049\n模葫芦\t183050\njjffh\t183051\n笑呼\t183052\n相片儿\t183053\n孙楠\t183054\n柠檬\t183055\n词牌\t183056\n我小小小小小小小小小小小小雨\t183057\nhttpbhiphotosbaiducomxiaodupicitemc995d143ad4bd113020df7475dafa40f4afb0586jpg\t183058\nrtuioovvcv\t183059\n点差距\t183060\n邦女郎\t183061\n李文杰\t183062\n七五7点\t183063\n开心派\t183064\n放了我怕\t183065\ngunb\t183066\n滚走\t183067\n吃呀呀呀呀呀\t183068\n红糖生姜汤\t183069\n默写单\t183070\n伟麽\t183071\n再见再聊\t183072\n一匹\t183073\n斯图里奇\t183074\n莽莽\t183075\n年底下午\t183076\n酷玩&蕾哈娜\t183077\n苏晓莹\t183078\nTHING\t183079\n级级\t183080\n范宇\t183081\n无所养\t183082\n心肠\t183083\n真可怜杂\t183084\nexneixiebxifbidbjdidxidnxidx\t183085\n15992478842\t183086\n幅写\t183087\n催产\t183088\n未婚妻\t183089\n范家\t183090\nTwitterk\t183091\n心肝\t183092\n清输\t183093\n苏墨尧\t183094\n雨帆\t183095\n胸人家\t183096\n几个点儿\t183097\n补脑\t183098\n十只个\t183099\n咪莫\t183100\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦你菊花\t183101\n心肌\t183102\n12fnjg\t183103\nGFF\t183104\nGFG\t183105\nGFD\t183106\n不生\t183107\n陡峭\t183108\n那达慕大会\t183109\n喊捉贼\t183110\n不甘\t183111\n我也没吗你爸妈呀小孩度秘\t183112\nK225\t183113\nieiuru\t183114\n急动\t183115\n上塌\t183116\n死人民\t183117\n小攻强上小受\t183118\n死人气\t183119\n井冄\t183120\n歌谣\t183121\nSS4\t183122\n姜圣飞\t183123\n不男\t183124\n不由\t183125\n
十二别打蝨\t183126\n不用\t183127\n北上广\t183128\n飞时代\t183129\n铁柱\t183130\n美大\t183131\n新创\t183132\n零五\t183133\n行行行行行\t183134\neeushhsyeue\t183135\n身条纹\t183136\n14565555\t183137\n鱼肉味\t183138\n吗场\t183139\n胡思\t183140\n泰尔\t183141\n半个世纪\t183142\n花生壳\t183143\n兜风\t183144\n推子\t183145\n单冰冰\t183146\nhizrib\t183147\n幺零零幺\t183148\n胡总\t183149\n34.80\t183150\n87856491\t183151\nAV性\t183152\n两天假\t183153\n本行\t183154\n雷击\t183155\n刘力扬\t183156\n根据费\t183157\n忙之风\t183158\n88888775151\t183159\n美怒\t183160\n美思\t183161\n段晓曼\t183162\n雷凯\t183163\n魅蓝GPS\t183164\nssssssss\t183165\n杨家琦\t183166\n报算\t183167\n金俊秀#\t183168\n压腿\t183169\n丁志轩\t183170\n一个样\t183171\n11111111111111\t183172\n么们\t183173\n可恶大可恶\t183174\n总人称\t183175\nKitchen\t183176\nGkfhhvgfyfhtcbggtgtgutjfjgvghdcgyfhfggkfhgbujhgghhhjjhggggyyfgfygfxfghghhhffhbffhkgbdffhkkkk\t183177\nGSM/WCDMA\t183178\n俊美\t183179\n索马里\t183180\n新曲\t183181\n工具化\t183182\n聊片\t183183\n服务态度\t183184\n李宛真\t183185\n和巨\t183186\n牢骚\t183187\n工具包\t183188\n没好聊\t183189\n亮亮\t183190\n65887755\t183191\n委员\t183192\n四G网\t183193\n500条\t183194\n投资收益\t183195\n非诚要\t183196\n么有\t183197\ntsf\t183198\n香港中文大学\t183199\n么服\t183200\n米拉拉\t183201\n問題\t183202\n不名小路考试\t183203\n壮丽\t183204\n壮举\t183205\n蝴蝶我送给你的蝴蝶我要蝴蝶\t183206\n秘度春天的美文美句你给我找一下\t183207\n脑公那不呼噜了哈摸摸大\t183208\n24天\t183209\n预交\t183210\nvyov\t183211\n夺爱\t183212\n交会处\t183213\n稍等\t183214\n我是男你是女\t183215\n爱车\t183216\nutjorjn\t183217\narainly\t183218\nuhox\t183219\n9x2kx\t183220\n谢谢你我叫系与喜洋系列欲强挖\t183221\n丰功伟绩\t183222\n吃吃吃吃吃吃吃吃吃吃\t183223\n小头娃娃\t183224\n街道办\t183225\najzii\t183226\n吴梓欣\t183227\n老不老\t183228\n那你美嘉有p\t183229\n股偶\t183230\n>\t183231\n小婶\t183232\n小婷\t183233\n先走了再见\t183234\n九级\t183235\n唉唉我喜欢\t183236\n张索赫\t183237\ngaybad\t183238\n小婧\t183239\n相當多傷亡\t183240\n肉棒\t183241\n发好呀\t183242\n窝边草\t183243\n萨芬娜\t183244\n5。5\t183245\n额王第还题退还覅8王王第还过还还题48额第福\t183246\n小婆\t183247\n七月份\t183248\n还田鸡\t183249\n蝉儿\t183250\n泼洒\t183251\n小婊\t183252\n疯了呗\t183253\n男音\t183254\n七零把\t183255\n娑\t183256\n娟\t183257\n娜\t183258\n小摊们\t183259\n亲亲亲亲\t183260\n娘\t183261\n根儿\t183262\n学年\t18
3263\n娇\t183264\n娄\t183265\nzgK\t183266\n头胀\t183267\n娃\t183268\n威\t183269\n周范\t183270\n头胎\t183271\n商场\t183272\n收纳箱\t183273\n娶\t183274\n高塔\t183275\nzgz\t183276\n拉米\t183277\n问一\t183278\n娱\t183279\n兒童節\t183280\n嗯瓮嗯\t183281\nzgs\t183282\n商國\t183283\n问上\t183284\n商圈\t183285\nzgh\t183286\n问世\t183287\n娥\t183288\nddsrrerdyfr\t183289\n荣华富贵眼前花\t183290\n两千16年2月2号\t183291\nzgf\t183292\n娩\t183293\n字头\t183294\n阿骆\t183295\n象山\t183296\n心有礼貌\t183297\n毛了惹毛了有你好看\t183298\n鼠来宝3\t183299\n后街布铺\t183300\n唉理学习\t183301\n给我看懂\t183302\n皆非\t183303\n溧阳\t183304\n货款\t183305\n卢家欢\t183306\n诸神\t183307\nfgggghh\t183308\n大张伟\t183309\n高晓\t183310\n秘新年快乐\t183311\n陪酒\t183312\n妨害\t183313\n安置房\t183314\n36356655\t183315\n博长\t183316\n晶晶\t183317\n孙悟\t183318\n配送\t183319\n亲水\t183320\n256841425\t183321\n1196939523\t183322\n气流\t183323\n民星\t183324\n惊为天人\t183325\n同一时代\t183326\n诶羞\t183327\nwoyao\t183328\n保安人员\t183329\n忽视\t183330\n凯旋城\t183331\n蝴蝶简战火\t183332\n皈依\t183333\n你最好是\t183334\n学习部\t183335\n好呀你可以\t183336\nguuh\t183337\ngdgdjfjaajd\t183338\nstfbp\t183339\n300000000000000000000000000000000000000000000000000\t183340\n笑佳人\t183341\nnteg\t183342\n逆反理\t183343\n命丧\t183344\n秘恋\t183345\n屌丝=loser，NB=kick\t183346\n恩氏香\t183347\n地形图\t183348\n空窗期\t183349\n烟哥\t183350\ndfhbczs\t183351\n线儿\t183352\n下洼\t183353\n槟城\t183354\n三更半夜\t183355\n退下拉\t183356\n三驾\t183357\ntick\t183358\n中石油\t183359\n席间\t183360\n假钱\t183361\n一代一代\t183362\n八点\t183363\n戒酒\t183364\n吴孙子\t183365\nfilith\t183366\n逐疫\t183367\n60度\t183368\n恶心了我不要你的心\t183369\n羞羞摸\t183370\n別占\t183371\n房地产商\t183372\n1.1元\t183373\n346794664679865334\t183374\n台军\t183375\n秘密天使##花样美男时英##陈翔天声一队#\t183376\n口只\t183377\n唐杰\t183378\nfd60bb315c6034a87366jpg\t183379\n链式\t183380\n董雪\t183381\nkmltyjgamvna1adgjmqtw\t183382\n转自联商网\t183383\n悲催\t183384\n徐发有发\t183385\n许愿台\t183386\n晚春\t183387\n距罗\t183388\nghhhhhnhjhjjgbgsmhggghjescmlrsbksjagjgrqklgenkrdhkrdgjdfloydd\t183389\n痘印\t183390\n浮世殇\t183391\n董雄\t183392\nmoovw\t183393\n杂酱\t183394\n朱顺顺\t183395\nzhije\t183396\n余海阔\t183397\n弧线\t183398\n蒙洗\t183399\n告诉我的界面\t1
83400\nzhiji\t183401\n晚明\t183402\n埃里克·博林\t183403\n九八十厘米\t183404\n还须\t183405\n淡然处\t183406\n方奕诺\t183407\n宏远\t183408\n李梦男\t183409\n#2岁\t183410\n高钰斐\t183411\nmfiiu\t183412\n尕娃\t183413\n邓书记\t183414\n大咬\t183415\n二二一一1\t183416\n干嘛呢大帽驴\t183417\n诺秘\t183418\n早安Googeye\t183419\nVOC\t183420\n车架\t183421\n诺科\t183422\n笔劲\t183423\n2428723173800763566733863\t183424\n开封菜\t183425\n鞋跟\t183426\n铝厂\t183427\n格列佛\t183428\n纸张\t183429\nxfgv\t183430\n诞生\t183431\n别笑\t183432\nnnnjm\t183433\n认务\t183434\ngfxdczd\t183435\n连云中\t183436\n42万7000多少\t183437\n养子\t183438\n冲突\t183439\n泉港山腰旧街旭东珠宝店\t183440\n奴家片\t183441\n11年\t183442\njzl\t183443\njzk\t183444\n简单粗暴\t183445\ndcbos\t183446\n阿托\t183447\n独角仙\t183448\n煌味\t183449\n翟懋\t183450\n羊驼\t183451\njzw\t183452\njzt\t183453\n回间\t183454\n挑染\t183455\njzq\t183456\n课题组\t183457\n莪\t183458\n公顷\t183459\n提壶\t183460\n这些话\t183461\nsaAs\t183462\n水手服\t183463\n蒙纳士\t183464\n咯嗦\t183465\n二十张\t183466\n你的人生\t183467\n孙悦\t183468\n对啊真\t183469\n保密局\t183470\ngjjjjkjjjjjjjjjjjjj\t183471\n面试题\t183472\n一百四十多天\t183473\n张一张\t183474\n3天以来\t183475\n夏老师\t183476\nHiIamxiaoming\t183477\n么么达\t183478\n高晓雯\t183479\n加男\t183480\n[淚\t183481\n先片\t183482\n95岁\t183483\n小爱心\t183484\n应急\t183485\n8170\t183486\n5页\t183487\n邪\t183488\n曺圭贤\t183489\nk行\t183490\n上下其手\t183491\n翎儿\t183492\n谭家丽\t183493\n那\t183494\n好想大声说爱你\t183495\n邦\t183496\nrrrrdfdfsr\t183497\n邻\t183498\n晚上八点\t183499\n视同\t183500\n贮藏\t183501\n甲醇\t183502\n邱\t183503\n大和\t183504\n邵\t183505\n污名\t183506\n邷\t183507\n邈\t183508\n邉\t183509\n邊\t183510\n邋\t183511\n弦瑟\t183512\n1919年\t183513\n去找你好\t183514\n邏\t183515\n邀\t183516\n源么\t183517\n韩秋婷\t183518\n邃\t183519\n還\t183520\n皇帝们\t183521\n邇\t183522\n球篮球\t183523\n大望路\t183524\n堂食\t183525\n邝\t183526\n衰弱\t183527\n邑\t183528\n真的好帅\t183529\n邓\t183530\n3成\t183531\n三十厘米\t183532\n邗\t183533\nSleepwhentiredandsmilewhenawakennneme400825emnnn\t183534\n放走\t183535\n第1名\t183536\n疯了我是\t183537\n放假咧\t183538\n龙燮\t183539\n十二支\t183540\n好恐\t183541\nkhhhggrr\t183542\n一丝一丝\t183543\n王贵婿\t183544\n南珠\t183545\n针对性\t183546\n采用\t183547\n吴好\t
183548\n仙侠剧\t183549\n鲍芳\t183550\n好恨\t183551\n莆\t183552\n科罗拉多州\t183553\ndota\t183554\njiqirn\t183555\n862503\t183556\n莉\t183557\n好郁闷\t183558\n事不两立\t183559\n好恶\t183560\n31eee\t183561\n首虹\t183562\njjvc\t183563\n嚎嚎嚎\t183564\n毒米线\t183565\n哇嘿\t183566\n大姨丫蛋\t183567\n佩玲\t183568\n里德诺\t183569\n看不上\t183570\n陡然\t183571\n简陋\t183572\n在哪儿等\t183573\n胡思恩\t183574\n20克\t183575\n31000夜\t183576\njjhhhhj\t183577\n笑一个那不然\t183578\n13445678899111\t183579\nduomestheoi\t183580\n面包师\t183581\n别母\t183582\n凯燕\t183583\n最终幸福\t183584\n格格巫\t183585\n公主度\t183586\njnujmwx\t183587\n脚车\t183588\n沒好意思\t183589\n两点下午\t183590\n赵云云\t183591\n谢谢化\t183592\n杜拉拉600666\t183593\n你好度秘我想揍你度秘我想揍你我想亲你我想揍你度秘\t183594\n流星雨\t183595\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋回答回答回答回答回答回答回答回答回答回答回答\t183596\n切菜\t183597\n586532\t183598\n我不要你了我要离开你\t183599\n归零落成\t183600\n三点儿\t183601\n冼和\t183602\n美人就不是我疯了美人计不是我风格说吧你\t183603\n冷秘\t183604\n上学啊片\t183605\n小霜\t183606\n小霞\t183607\n咣噔\t183608\n氯丁\t183609\nihdbs\t183610\n4万余元\t183611\n八三\t183612\n安徽电视台\t183613\n和一点\t183614\n狠你\t183615\n外联兽兽\t183616\n佛教\t183617\nuyffu\t183618\n酒家家户户\t183619\n演员\t183620\n莞\t183621\nGDRAG0\t183622\n喝茶啦\t183623\n排排\t183624\n敖晟\t183625\n座舱\t183626\n妮妮太牛\t183627\n等你\t183628\n特辑\t183629\n卜算子\t183630\n找着了\t183631\njghh\t183632\n林洸耀\t183633\n算了我叫\t183634\n宝清\t183635\n怜惜\t183636\n1890个小时\t183637\n养老院\t183638\n挖土咪\t183639\n13455243\t183640\n配资\t183641\n蔡澜\t183642\n疯小\t183643\n比塞\t183644\n万朵\t183645\n秘胜长\t183646\n萌我是萌汉子你是萌妹子我保护你好不好\t183647\n寒鱼\t183648\n强降雨\t183649\n少片\t183650\n虎符\t183651\n周冰倩\t183652\n好梦安安\t183653\n退少补\t183654\n害惨\t183655\n丘比丘\t183656\n765657\t183657\n严惩\t183658\n逗恩\t183659\n吝啬\t183660\n算花\t183661\n宝贝儿\t183662\n写封信\t183663\nFUCKYOURMOM\t183664\n嗯恶心\t183665\n不聊谁怕\t183666\n腿费\t183667\n军缘\t183668\n12121\t183669\n硪剁剁\t183670\n听话会\t183671\n少特\t183672\n我类\t183673\n岛国本\t183674\na1gig\t183675\n学俩声\t183676\nggjjj\t183677\nsmyisi\t183678\n亲像\t183679\n迪迦奥特128天外\t183680\n青大\t183681\n够了问问\t183682\n施彬彬\t183683\n晓坏\t183684\noods\t183685\n玩游\t183686\n双语\t183687\n三天三个\t183688\n青天\t183689\n真的好不开心\t1
83690\n26个\t183691\n五月三\t183692\n断奶\t183693\n舋\t183694\n甘子颖\t183695\n值得注意\t183696\n五月一\t183697\n台步\t183698\nhdhdu\t183699\n罢婚\t183700\n大时代的记录者——吴印咸\t183701\nsjssn\t183702\nhdhdg\t183703\n家本\t183704\n继续努力工作\t183705\n宠物犬\t183706\n有愧\t183707\n两片片\t183708\n142798\t183709\n风元\t183710\n亿亩\t183711\n4什66\t183712\n三十一块\t183713\n甘蔗汁机\t183714\nraot\t183715\n风光\t183716\n靠不好笑\t183717\n造型费\t183718\n赛普\t183719\n老唱\t183720\n亿人\t183721\n奥运#\t183722\njdjxhdbbd\t183723\n尹小雅\t183724\n阿布基\t183725\nyhbn\t183726\n毋需\t183727\n肉馅儿\t183728\n各地区\t183729\n虽正感\t183730\n赠一倾情\t183731\n荣柳\t183732\n真的很温柔\t183733\n非同\t183734\n佳哥\t183735\n經常\t183736\n成龙慈善基金会\t183737\n令伦\t183738\n勤州区\t183739\n小度秘我\t183740\n第23\t183741\n王书斌\t183742\n第25\t183743\n真的假的我决定\t183744\n第27\t183745\nIlostapool\t183746\n说律协\t183747\n如愿以偿\t183748\n马红利\t183749\n降服\t183750\n红花草\t183751\n隆福\t183752\n实像\t183753\n陈思奇\t183754\n7英寸\t183755\n胸牌\t183756\n安妮之\t183757\n蒽蒽蒽蒽蒽蒽\t183758\n稚气\t183759\n阳刚\t183760\n燃料\t183761\n一应\t183762\n了不\t183763\n2016年2月16号\t183764\n陈雨荨\t183765\n凯塔\t183766\n了丑\t183767\nX者\t183768\n巅峰\t183769\n一康\t183770\n一庶\t183771\n几千户\t183772\n中期\t183773\n30几\t183774\n红柿子椒20g\t183775\n深入人心\t183776\n爱的话说了吗\t183777\n一度\t183778\n了主\t183779\n不见你了再见\t183780\n新人\t183781\n古交二小\t183782\n钢之家钢\t183783\n问错\t183784\n新亲\t183785\n规避\t183786\n红火\t183787\n新京\t183788\n解密类\t183789\n副主席\t183790\n妖女\t183791\n孜然鸡胗\t183792\n126点\t183793\n新亚\t183794\n明天9:00\t183795\n黄展鹏\t183796\n王然\t183797\n高分子化合物\t183798\n和法国\t183799\n新事\t183800\nhello萌\t183801\n快乐度\t183802\n夜魔\t183803\n好恶心\t183804\nhellostore\t183805\n了待会儿再说\t183806\n三号早上\t183807\n京东生变故\t183808\n傅山\t183809\n籽岷\t183810\n丑八怪大丑八怪\t183811\n这一生\t183812\n外模\t183813\n54548495\t183814\n小魔仙海哥\t183815\n鸡血石\t183816\n经久不衰\t183817\n窝风桥\t183818\n变蛋\t183819\n张柏芝\t183820\n六角\t183821\n斜对面\t183822\n哎哟客官\t183823\n托贝尔\t183824\n缺陷者\t183825\n迈吉萌\t183826\n度秘我家号\t183827\n三支\t183828\n好久不见得\t183829\n梅岭\t183830\n若明\t183831\n光亮\t183832\n艾利亚\t183833\neneles\t183834\n承上启下\t183835\n1分分\t183836\n糖茶\t183837\n温州政府\t183838\n结了吧\
t183839\ndeepsnow\t183840\n万子营\t183841\n舫娟儿\t183842\n4000米\t183843\n廖海均\t183844\n9月15日\t183845\n舞蹈老师\t183846\n目标准\t183847\n牟平区\t183848\n光亲\t183849\n老板气\t183850\n华大会\t183851\n杜比\t183852\ncjfj\t183853\n嗯小嘎\t183854\n褚文静\t183855\nTHank\t183856\n流谷仙山\t183857\ngjxCffj\t183858\n舟者\t183859\n远班级\t183860\n火眼\t183861\n应用样\t183862\n根管\t183863\n泸州\t183864\n不是你的错\t183865\n一掬\t183866\n2291306725\t183867\n一措\t183868\n龙林\t183869\n截出\t183870\n九秘使\t183871\nfixt\t183872\n挡不住\t183873\n老外婆\t183874\nfixm\t183875\n看不懂\t183876\n聂太\t183877\n墙报\t183878\n达德学校\t183879\n147556886\t183880\nJANGSHA\t183881\n一掌\t183882\n去者\t183883\n猜透\t183884\n偏离\t183885\n去而\t183886\nsdjlso\t183887\n千千岁\t183888\n人生本\t183889\n较于\t183890\n丁燕\t183891\ngsuhwochw\t183892\n坏习惯\t183893\n沈里拉\t183894\n一排\t183895\n猜逗\t183896\n22yue\t183897\n任县\t183898\n鞏固\t183899\n周你在\t183900\nyourefficomyoshilookatmmindouaabacecetoutrelmyoushintrandisomeritlenooseyoutilitalit\t183901\n293吧不要脸臭婊子\t183902\n李刚亮\t183903\n小莉\t183904\n不枉\t183905\n厦门大学能源经济研究中心\t183906\n145.6万元\t183907\n乖天国\t183908\n相扑\t183909\n南堡\t183910\n男女的你当我\t183911\n急速\t183912\n柔韧\t183913\nuwuewuwu\t183914\nshohd\t183915\n别差\t183916\n无可能\t183917\n法国组\t183918\n海芥菜\t183919\n豆腐汤\t183920\n那个孙少孙少陈\t183921\n张之洞\t183922\n绝世邪君\t183923\nffrdd\t183924\n欢天喜地\t183925\n你好好女孩\t183926\n蛀虫\t183927\n秘宠\t183928\n秘宿\t183929\n私微\t183930\nhsgsgxgdb\t183931\n张近东\t183932\n秘家\t183933\n通讯录\t183934\n一个十寸\t183935\n秘害\t183936\n白衣\t183937\n切成\t183938\n永远永远爱\t183939\n非风\t183940\n8月19日\t183941\n奥林匹克花园\t183942\n离职守\t183943\n秘实\t183944\n秘宝\t183945\n舞门\t183946\n不能动家\t183947\n21：40分\t183948\n春明\t183949\n张老\t183950\n幼犬\t183951\n白血\t183952\n与时俱减\t183953\n玩亮\t183954\n枯黄\t183955\n学友\t183956\n俞金刚\t183957\n学叔\t183958\n声称\t183959\n美桥姬\t183960\n火凤凰\t183961\n学发\t183962\njdgtgdgt1g\t183963\nttttttttttttty\t183964\ncnurv\t183965\n说句吧武讨厌\t183966\n报社足球队\t183967\n16家\t183968\n地方猫\t183969\n谢谢你的可\t183970\n草鱼\t183971\n共军\t183972\n民选\t183973\nwayIcan\t183974\n一万分儿\t183975\n学号\t183976\n小戏\t183977\n盘龙\t183978\nrfhgdhjj\t183979\no
rz破额\t183980\n通宵\t183981\n知勺见\t183982\n闲暇\t183983\n娘家\t183984\njohaseo\t183985\n翟时昱\t183986\n十来集\t183987\n欧萱\t183988\n死孩子你你断子绝孙\t183989\n零零落落零零落落零零落落零零落落零零落落来了\t183990\n家当来\t183991\n88万\t183992\n非母\t183993\n删改\t183994\n1030亿元\t183995\n雪儿\t183996\noftenn\t183997\n舼\t183998\n二之楼\t183999\nFgtggbughfvhgvgvvfbhgcbfdggfxcvfhgfvhgvbidxcjhcvjhvvvjhbbbhvb\t184000\n刘奕淇\t184001\n罗梅耶\t184002\n私仇\t184003\n5月8\t184004\n肉便器\t184005\n王金凯\t184006\n编者\t184007\n蒋五羊\t184008\n老远\t184009\n一月21\t184010\n想你的好\t184011\n绿绿草原牧牛羊\t184012\n度秘度秘我好爱\t184013\n邵文倩\t184014\n王光胜\t184015\n猫小魅\t184016\n洛白烟\t184017\nThank\t184018\n退役\t184019\n季安平\t184020\n滑石粉\t184021\n13821884999\t184022\n女大十八变\t184023\n唔照姨延庆东\t184024\n00000666\t184025\nGOOdBye\t184026\n二点三点四点五点六点\t184027\n密云\t184028\n欧利欧\t184029\n锁门\t184030\n台式\t184031\n鲍不鲍\t184032\n计控\t184033\nweiury\t184034\n颖包\t184035\n奋勇\t184036\nDRT\t184037\n消息\t184038\n在阳春\t184039\n干球\t184040\n臭屁懂\t184041\n这东东\t184042\n听听到\t184043\nmepe\t184044\n竹山县\t184045\nboulangerie\t184046\n穷举法\t184047\n列车\t184048\n魔剑之刃\t184049\n全球化\t184050\n555777\t184051\nttgv\t184052\n八十多块\t184053\n颜昌海\t184054\nhttpfhiphotosbaiducomxiaodupicitem03087bf40ad162d96768de6816dfa9ec8b13cdc4jpg\t184055\n大林寺\t184056\n吖吖子\t184057\n一八项项三全\t184058\n不堪入耳\t184059\n賁隌\t184060\n1978处\t184061\n最多情\t184062\nHduf\t184063\n机器人度秘\t184064\n桦墨兰\t184065\n匈奴人\t184066\n呗秘\t184067\n过龙八夷\t184068\n运相\t184069\n6000发\t184070\n问得\t184071\nScarlett\t184072\n第二棒\t184073\n熊思雨\t184074\n学潮\t184075\nXzdrfgFh\t184076\n周景红\t184077\n说心安理\t184078\n开罐\t184079\n水瓶\t184080\n结结\t184081\n运盛\t184082\n潘光凯\t184083\n由不得\t184084\n永捷\t184085\n种地泡\t184086\n反反复复\t184087\n异族\t184088\n吴贺宁\t184089\n烽禾\t184090\n猪子\t184091\n多一份\t184092\n智联\t184093\n义智商\t184094\n我的记忆\t184095\n不要人家说\t184096\n植物病理学\t184097\n联赛杯\t184098\n六月末\t184099\n周天香\t184100\n上抛\t184101\n忘了忘\t184102\n对呀美\t184103\n击落\t184104\n拓寺\t184105\n不要求\t184106\n好司爰\t184107\n俺老\t184108\nMOT\t184109\n盛大\t184110\n枉费\t184111\nucrc\t184112\n交办法\t184113\n黄永昆\t184114\n花间\t184115\n士官\t184116\n女姐\t18
4117\n真的假\t184118\n结绍\t184119\n科学家们\t184120\nzcvscg\t184121\n童年龄袖\t184122\n里尔克\t184123\n舒宝\t184124\n凶杀\t184125\n异性恋\t184126\n好我不要我不要你大\t184127\n鲁滨逊\t184128\n张皇\t184129\nGhjRHOHCFOCH\t184130\n鑫泰\t184131\nmssh\t184132\ntereghh\t184133\n发牢骚\t184134\n亲简\t184135\n碟中谍\t184136\n狗屄\t184137\n道高一战\t184138\npojin\t184139\n河南卫辉\t184140\n318家\t184141\n高淑芬\t184142\n圣训\t184143\n立体\t184144\n呦傲娇\t184145\n三行\t184146\n微机室\t184147\n预售\t184148\n四大碗米线\t184149\n轻快\t184150\nHooH\t184151\n真地\t184152\n这样无理取闹\t184153\n好嘞好勒\t184154\n表白路\t184155\n提前还款\t184156\n假包\t184157\ndoingis\t184158\n200多点\t184159\n开镜\t184160\n不足惜\t184161\n穿着打扮\t184162\n弹狗\t184163\n态度\t184164\n阿联\t184165\n超美超美超美超美超美超美超美超美超美超美超美超美超美的\t184166\n钱江顺\t184167\n付文煜\t184168\n找我我在\t184169\n王全安\t184170\n大意\t184171\n七家\t184172\nHogfhof\t184173\n看不然的话\t184174\n承诺\t184175\n摩尔多\t184176\nflll\t184177\n怒骂\t184178\n该署\t184179\nFT亚巡香港场\t184180\n大愁\t184181\n乔宇宏\t184182\nftl62\t184183\n们样\t184184\n鸟\t184185\ncong\t184186\n地段\t184187\n鸓\t184188\n椰子皮\t184189\n七宫\t184190\n鸪\t184191\n鸯\t184192\n结婚礼物\t184193\n鸣\t184194\n水乡\t184195\n鸡\t184196\n鸦\t184197\n一心的翅膀带我飞飞孤绝我\t184198\n转述句\t184199\n11条\t184200\n鸿\t184201\n阿西莫夫\t184202\n11114111111414141447441717741771477477442125824841742578724842573872474255485478554882478647924\t184203\n鸶\t184204\n酸痛累\t184205\n超级葡萄籽\t184206\nhano\t184207\n最最最最最最最最最最最最最最最最最最最\t184208\n金立里\t184209\nhang\t184210\nhane\t184211\nhand\t184212\n嗯毕竟\t184213\n裤裆\t184214\n320级\t184215\n好读书不如好读书好读书不如好读书\t184216\n乱发行\t184217\n人星美影院\t184218\n10月22日17时\t184219\n一下户\t184220\n制导导引系统\t184221\n盛夏\t184222\n准确地\t184223\n十二段\t184224\n岁月无痕\t184225\n绥化\t184226\n芭芭拉\t184227\n词曲\t184228\n裤裤\t184229\n双冠王\t184230\n二十餐\t184231\nf2.0\t184232\n认不听\t184233\n配性\t184234\n周宇轩\t184235\n集美二院\t184236\n黄焖鸡米饭\t184237\n巨炮\t184238\n输赢\t184239\n范老泡\t184240\n娃娘\t184241\n回光\t184242\n王导\t184243\n娃娃\t184244\n孙皓天\t184245\n三中那边儿\t184246\n浩贱\t184247\n个数\t184248\naaoa\t184249\n为你在\t184250\n好心医药\t184251\n水沉烟\t184252\nJ迷\t184253\n嚣尘\t184254\n封杀\t184255\n谢谢度\t184256\njosi\t1
84257\njosk\t184258\n回兴\t184259\n5899404348042293194\t184260\n三具\t184261\n俗话说\t184262\n凯那多\t184263\nbfreefeigeshikonon佛拉篮教父\t184264\n谓凡\t184265\n水族箱\t184266\n泪腺\t184267\n派帝\t184268\n极客棒\t184269\n五画简笔\t184270\n阿尼玛\t184271\n同洲片\t184272\n可卡\t184273\n牝化\t184274\n焚香\t184275\n梁雪娇\t184276\n吃茶\t184277\n学林\t184278\n圣帝文\t184279\n腿肠\t184280\n我的呀别\t184281\n李萎\t184282\n六箱\t184283\n794762691\t184284\n卡梅拉\t184285\n老子钱\t184286\n说讨厌讨厌\t184287\n李萌\t184288\n可华\t184289\n很乖我很乖\t184290\n熏黑\t184291\nK146次\t184292\n鱼骨\t184293\n黄贵阳\t184294\n只数\t184295\n你的身材\t184296\n教义\t184297\n1234597890000\t184298\n五小\t184299\n新功能\t184300\n巴嘎雅露\t184301\n潘雨涵\t184302\n亦错\t184303\n八大套\t184304\n金沙省\t184305\n木达\t184306\n出笼\t184307\n五少\t184308\n精明能干\t184309\nNFFX\t184310\n又说过\t184311\n不是这样的话\t184312\n等待丑\t184313\n四1\t184314\n还给我行\t184315\n禽畜\t184316\n18f一\t184317\n教书\t184318\n\t184319\nfoyfo\t184320\n度秘霞\t184321\n金宇彬\t184322\n五尺\t184323\n猜好准\t184324\n给我看点\t184325\n会权\t184326\n安摸摸扎\t184327\n外聊\t184328\n国门\t184329\n秦龙\t184330\n成功出版局\t184331\n石知心\t184332\n三ｄ\t184333\n三ｇ\t184334\n口才\t184335\n黄副\t184336\n顺承\t184337\n英特威\t184338\n金华市政府\t184339\n月秘\t184340\n月租\t184341\n四G\t184342\n摩登\t184343\n阿莫五杀\t184344\n441522198408103033\t184345\n十分之\t184346\n潭面无风镜\t184347\n泡水\t184348\n爱上你\t184349\n蜻蜓店\t184350\n碗子\t184351\n上十二点\t184352\n半边天\t184353\n两株\t184354\n表达到\t184355\n足够用\t184356\n四g\t184357\n23个月\t184358\n涛哥\t184359\n密胺\t184360\n密能\t184361\n东东度秘\t184362\n观后感\t184363\n还剩\t184364\n耿玉轩\t184365\nDEEP\t184366\n哈罗德\t184367\nGAME\t184368\n实体机\t184369\n一箭\t184370\n老娘子\t184371\n白云观\t184372\n娇妻\t184373\n你和谁\t184374\n1。1\t184375\nhfakyjtjlgktlhluhkyb\t184376\n今早七点半\t184377\nFRANK\t184378\n十九分\t184379\nMyname\t184380\nIhhxhuftkfehhdyruhxbhdihdtfhgbtdhtrb\t184381\n稚瞳\t184382\n真真假假在你心不伎真假我是真\t184383\n未经说\t184384\n杜什么\t184385\n多来了\t184386\n素克风\t184387\n对冲基金\t184388\n佩服你了佩服你了佩服你了佩佩对对对对对对对配我陪你啦啦啦啦啦\t184389\n呕耶\t184390\n僵尸个爱你\t184391\n十一分\t184392\n周东庆\t184393\n名医\t184394\n陈浩明\t184395\n养花\t184396\n忘养\t184397\n死习惯\t184398\n啵人\t184399\n苑心彤\t1
84400\n艳婢\t184401\n呵呵冰\t184402\n52225151516468644\t184403\n义Z儿一一人一入刁口口小一人\t184404\n呢块\t184405\nxX八\t184406\n牧野公园\t184407\n13892050218\t184408\nGgggfgg\t184409\n女魔\t184410\n常德\t184411\n王源O\t184412\n秘宝贝\t184413\n不要我了\t184414\n家驹\t184415\n郭采洁\t184416\n2011年6月4日、5日\t184417\n520520520520\t184418\nurbothe\t184419\n97676\t184420\n铜像\t184421\n5678886\t184422\n一个假\t184423\n张柏\t184424\n王晨荣\t184425\n520016786\t184426\n爱贝我是崔8几\t184427\n123456789101112131415161718192021222324252627282930313233343536373839100\t184428\n二七一\t184429\n壁垒\t184430\nGfffft\t184431\n杜康\t184432\n小萨\t184433\n牛巴粉\t184434\n今报南都娱乐\t184435\n八十期\t184436\nvagdg\t184437\n22838383\t184438\n江西省道教协会\t184439\n百分语文\t184440\n两栋\t184441\ndnfcx\t184442\n该区\t184443\n5255557578\t184444\n二七个\t184445\nJTPT\t184446\nAT&T中心\t184447\nisjsdjooz\t184448\n搅烂\t184449\n我爱您亲一个呀求求你\t184450\n鲶鱼\t184451\n做衣锦华服\t184452\nJoyce\t184453\n不归路\t184454\n伯母\t184455\n二七三九九幺零五\t184456\n度秘娜娜\t184457\nmgmpmjxmg\t184458\n糖尿\t184459\n谈何容易\t184460\n硕大\t184461\n有钱\t184462\n爽蚊\t184463\n5分之四\t184464\nQwertyuiioppasfgjlsx77582587758265\t184465\nRussel\t184466\n捷达\t184467\n陈泽梁\t184468\n\t184469\n4袋\t184470\n漂泊\t184471\n老发句号\t184472\n德胜置业大厦\t184473\n10个月\t184474\n丽台\t184475\n暴风雨饭否很快个幸福i\t184476\n芦城\t184477\n吴亚琪\t184478\n岀门\t184479\n第底\t184480\nffhfxj\t184481\n八八又乐\t184482\n妈妈们\t184483\n爷\t184484\n饲养员\t184485\n就好来气\t184486\n楚安安\t184487\n石头子\t184488\nBoyle\t184489\n爱\t184490\n非正确\t184491\n网上小额贷款\t184492\n林雁回\t184493\n忠贞\t184494\n你好呀里\t184495\n竹锦瑞\t184496\n胖瘦\t184497\n夜雨\t184498\n金一\t184499\n合欢\t184500\n丁美君\t184501\n彩蹀\t184502\n野游\t184503\n王继森\t184504\npson\t184505\n二十九号\t184506\n丽友\t184507\n四乘\t184508\n初期\t184509\nvxmh\t184510\n谢礼\t184511\n上海公司\t184512\n叫陌\t184513\n飞鹤\t184514\n白呀\t184515\n选择的歌\t184516\n张梓蓓\t184517\n杂货\t184518\n三三早航\t184519\n湖人队\t184520\n爻\t184521\n飞鹰\t184522\n杂质\t184523\n含露\t184524\n溜进\t184525\n是非曲直\t184526\n飞鹅\t184527\n娃娃屋\t184528\n凯源粉\t184529\n僧侣\t184530\n臭臭泥\t184531\n惨绝\t184532\nhjih\t184533\nggsghfro\t184534\n僵尸王\t184535\n高兴就好\t18453
6\n李容\t184537\n国际体联\t184538\nCosplay\t184539\n散沙\t184540\nqaxre\t184541\nmylogo\t184542\n对不对啊度秘\t184543\n看助\t184544\nbhddhbbp\t184545\n赵迪\t184546\n担待\t184547\n七二阿星期二套\t184548\n尚志\t184549\nfsisiysdtidordfj4e\t184550\n首钢\t184551\n周周清\t184552\n并没有\t184553\n好帮手度秘\t184554\n飘来飘\t184555\n江西宝贝\t184556\n龙凤胎\t184557\n天而瑞\t184558\n箴言\t184559\n妹纸你好美纸\t184560\n睡不觉\t184561\n型式\t184562\n再洗\t184563\n四百岁\t184564\nhdfi\t184565\n潘石屹\t184566\n李宇\t184567\n狂奔工作室\t184568\n云存储\t184569\n最少一周\t184570\n文化大革命\t184571\n2012期间\t184572\n我骗你的我八岁\t184573\n张欣妍\t184574\nhttpm16888comnews201601162988750\t184575\n诗一屋\t184576\n郭恒锐\t184577\n寻麻疹\t184578\njids\t184579\n几十年\t184580\n秒抢\t184581\n吗雅\t184582\n一洚\t184583\n十六点整\t184584\n大院\t184585\nVMware\t184586\n管毅\t184587\n戏沙\t184588\n曲高\t184589\n了妈妈\t184590\n失笑\t184591\n时来\t184592\n奥哇\t184593\n回礼\t184594\n过级\t184595\n临海大道西洋村\t184596\n4.27下午\t184597\nGdwdyjd\t184598\n瑟情\t184599\n省会\t184600\n今晚8点45\t184601\n一派\t184602\n变通\t184603\n素材\t184604\n其责\t184605\n东兴医院\t184606\n罗马帝国兴亡史\t184607\n北京们\t184608\n沈阳绿岛体育中心\t184609\n魔羯女\t184610\n沪籍\t184611\nAngelaby\t184612\n皇屠天\t184613\n要不再说\t184614\n鹿胎膏\t184615\n百度轩\t184616\nbigh\t184617\n度秘卷\t184618\nbigo\t184619\n杨志强\t184620\n乐视1s\t184621\n病死\t184622\n刘寒假\t184623\n纯正\t184624\n这回去\t184625\n九百十一十二个\t184626\n弱化\t184627\n代袋\t184628\n四房\t184629\n五分之三第二次\t184630\n黄小邪\t184631\n大妞\t184632\n私刑\t184633\n高密大帝\t184634\n张红军\t184635\n死亡金属版\t184636\n3235236\t184637\n招摇\t184638\n相本来\t184639\n广州\t184640\n靠靠靠靠靠\t184641\n四截\t184642\n东方早报\t184643\n四成\t184644\n50万瓶\t184645\n化妆难免\t184646\n我不要你爱我我要爱我的人\t184647\n小树林\t184648\nA型\t184649\njbccfh\t184650\n30周年\t184651\n私利\t184652\n六一环岛站\t184653\n劝学\t184654\n黑脸\t184655\n014期\t184656\n男宠\t184657\n朱佳雯\t184658\n十分钟以后\t184659\n选片\t184660\n一八点\t184661\nhardrive\t184662\n暗沙\t184663\n5嗯\t184664\nbnmkfs\t184665\n洛阳\t184666\n84421135\t184667\n魏梦姣\t184668\n穿越威\t184669\n黑脚\t184670\n尹空\t184671\n坏小盆友\t184672\n说金童玉女\t184673\n羽毛犬\t184674\n天天八听\t184675\n圆球纸\t184676\n灵通区\t184677\n14565656568\t184678\ndjhdistleatli\t18
4679\n理科班\t184680\n男宝\t184681\nⅢ\t184682\n52849590\t184683\n老心情不好\t184684\n黏膜\t184685\n3403-3388转分机8712\t184686\n翻铁\t184687\n美式田园\t184688\n18701959349\t184689\n摇女\t184690\n這張圖片\t184691\n吴凤花\t184692\n斗破\t184693\n花千骨猫花千骨\t184694\n不容缓\t184695\n055996666980888575\t184696\n一颤\t184697\n西工大附小\t184698\n美股\t184699\n70对哦69\t184700\n腔内\t184701\n竹窗\t184702\nhidkf\t184703\n穿越\t184704\nfbggghj\t184705\n秘度数的就是你你是猪\t184706\n你们\t184707\n水印\t184708\n机动密度你是男是女\t184709\njutiq\t184710\n长远不会\t184711\nSheep\t184712\n当做\t184713\n大西洋\t184714\n劳劳碌碌\t184715\n切点\t184716\n600539\t184717\nRf\t184718\n托戎\t184719\n三.八\t184720\n小灵通幼儿园\t184721\n哮喘\t184722\nAnautumn0uting\t184723\n高糖\t184724\nSheen\t184725\n欧雷格\t184726\nRT\t184727\ne直拍\t184728\nRW\t184729\nRP\t184730\n解讫\t184731\n雪心有在唱歌\t184732\n奥萨马\t184733\n贺悦滋\t184734\nRD\t184735\nRF\t184736\n杨贵妃\t184737\nRC\t184738\n540吨\t184739\nRM\t184740\n解讲\t184741\nRI\t184742\n谢娜美\t184743\n单人雪\t184744\nR7\t184745\n爱疯三\t184746\nyourist\t184747\n刘玉莹\t184748\n人翠\t184749\n郝鑫茹\t184750\nCandy=坑娣\t184751\n甜甜\t184752\n天将\t184753\n接济\t184754\n天尊\t184755\n脱口秀\t184756\n妺纸\t184757\n朱小虎\t184758\n66542125899521147\t184759\n见略同\t184760\n融符艳蓉\t184761\n冯大哥\t184762\n奖金\t184763\n邮政快递\t184764\n1258157859\t184765\n综合体\t184766\nNanoTritium\t184767\nrilyi\t184768\n灯步\t184769\n孤寂\t184770\ngdss\t184771\n早上十点半\t184772\n可恶心\t184773\n违合\t184774\niiID\t184775\n撤除\t184776\n公权力\t184777\nhhvcccxeecjb\t184778\n木版画\t184779\n好的乖乖\t184780\nfrdhdnffvb\t184781\n4313\t184782\n小黄本\t184783\n听不良言\t184784\n蓝海大酒\t184785\n希金斯\t184786\n翎毛\t184787\n工资性\t184788\n各行其道\t184789\n毛脸\t184790\n讲义\t184791\n没有\t184792\n肖阳\t184793\nhaishang\t184794\nshxjdub\t184795\n請輸\t184796\n李诚星\t184797\n阿黄祥华\t184798\n范文\t184799\n好啦好啦我原谅你\t184800\n于东龜\t184801\n田凯达\t184802\n休闲风\t184803\n88蚊\t184804\n最大限度\t184805\n毛脚\t184806\n似是故可唯\t184807\n大姨妈\t184808\n密不行\t184809\n嗚嗚\t184810\n18182\t184811\n记得你是\t184812\n暖和\t184813\n保钓船\t184814\n1万多一点\t184815\nzrfs\t184816\n49名\t184817\n凶你\t184818\n听命令\t184819\n雪朦\t184820\n3489667239420\t184821\
n娄菲\t184822\n舒沙城\t184823\n我想我的宝贝轻舞飞扬\t184824\n横俩眼\t184825\n晓曼\t184826\n之类\t184827\n咭者\t184828\n最完美\t184829\n洗头感\t184830\n退位\t184831\n酷影\t184832\n能女\t184833\n李峰轩\t184834\n老太公\t184835\n273849596\t184836\n啪啪三国\t184837\n无耻\t184838\nokcity\t184839\n退作\t184840\n塔希里亚故事集\t184841\n15001\t184842\n15000\t184843\n网赚\t184844\n谷女神\t184845\n瞄胸\t184846\n面饼\t184847\n马冬梅\t184848\n常赞\t184849\n拚命\t184850\n旧宫\t184851\nfhchvk\t184852\n精髓\t184853\n无者\t184854\n9229928282838383737737373737\t184855\n南站小学\t184856\n泡椒笋\t184857\n／假\t184858\n三国演义中单骑救主\t184859\n无耐\t184860\n近500年\t184861\n黑糖\t184862\nriaaist\t184863\n手足相残\t184864\n造福\t184865\n平反\t184866\n滑轮组\t184867\nwokao\t184868\n丁香园\t184869\n爪哇\t184870\nduuf\t184871\n奉承\t184872\nduud\t184873\nuiiijjh\t184874\n慊\t184875\njjjbbb\t184876\n两百几克\t184877\n动手做\t184878\n毛毛虫\t184879\n15个月\t184880\n夏慕慕\t184881\n鸟儿\t184882\n平台\t184883\n协子\t184884\n看我\t184885\n边鄙\t184886\n3000点\t184887\n釜底抽薪\t184888\n万钟平\t184889\n想来\t184890\n游华\t184891\n来不得\t184892\n为你说我和我和我和我的女人色\t184893\n萌妹\t184894\n入人心\t184895\n萌妺\t184896\ndodocommouyi超级玛丽马屁\t184897\n地支\t184898\n谭剑\t184899\n情谊\t184900\n关氏大你是谁呀我爸比是谁姐姐贾\t184901\n塑料制品\t184902\n观察者\t184903\n/table\t184904\n情调\t184905\n南美洲\t184906\n艾山\t184907\n架构师\t184908\n地改\t184909\n下一个月\t184910\n去冬今春\t184911\n松江\t184912\nfjfbbv\t184913\n没我不想\t184914\n李晓霞\t184915\n娃夫人\t184916\nkoume\t184917\n不正当\t184918\n巧克力味\t184919\n妙秒\t184920\n飞来了\t184921\n质保关系波卡波卡\t184922\n侠盗飞车\t184923\n王茹\t184924\n北苑\t184925\n五多少\t184926\negg\t184927\n流苏\t184928\n替奶奶\t184929\nmrs\t184930\nmrn\t184931\n57557237557837287575\t184932\n班加尔康\t184933\nouv7v9vf\t184934\noitfcjjfdjkffjiggcjhfg\t184935\n陈璀\t184936\n爱人才说\t184937\n陈璇\t184938\n安排\t184939\n我喜欢办\t184940\n广东县\t184941\n赵餐馆\t184942\n拔枪\t184943\n简淡\t184944\n驶岁\t184945\n事前\t184946\n北京艺校\t184947\nkdfdjhggyjb\t184948\n太可斯\t184949\n吧石头\t184950\n离婚前\t184951\n相信我的朋友\t184952\n薄熙来\t184953\n会好好爱\t184954\n拖地\t184955\n二零一几年\t184956\n娘家的夏天天空中繁星点点\t184957\n懂不懂懂\t184958\n10600万\t184959\n李铁鸣\t184960\n45185\t184961\nXlgxlhxpu\t184962\n
下一秒会\t184963\n陈剑锋\t184964\n于进\t184965\n准确\t184966\njehed\t184967\n伺候\t184968\n祝度秘\t184969\n小憩\t184970\n缺钱\t184971\n445575578548288568525726873585875258552436164848484494984916466588888888899\t184972\nynkn66655jb\t184973\n熏酒\t184974\nkggkglhkghghgljggulnyieouboyiygnlhgpinljk\t184975\n贝塔洛\t184976\n家振\t184977\nbutdelpri\t184978\n耳廓\t184979\n女服务员\t184980\n心门之外\t184981\n超重重\t184982\n热撸撸\t184983\njshdy\t184984\n陈奕彤\t184985\n九十几\t184986\n变了真是\t184987\nc++\t184988\n哺乳类\t184989\n教局\t184990\n发愿\t184991\n农民场\t184992\n1.76米\t184993\n似的\t184994\n体育总局\t184995\n韩星宇\t184996\n下一场雨\t184997\n我不要你了我要揍你\t184998\nDAY.1\t184999\n不要害怕\t185000\nApps\t185001\n沉默寡言\t185002\n心手\t185003\n心扉\t185004\n鸵鸟\t185005\nquklily\t185006\n懂了爱\t185007\n哪勒\t185008\n190875543357\t185009\n林寒\t185010\n孙娇娇\t185011\n然并卵\t185012\n马国明\t185013\nFITOl\t185014\n韩梦\t185015\nCvhkfcn\t185016\n林密\t185017\n千一百三十二\t185018\n明嘲暗讽\t185019\n周行\t185020\nykhch\t185021\n3cm\t185022\n薇诺娜\t185023\n草猪\t185024\n3ce\t185025\n累杀\t185026\n卖卫生巾的小男孩\t185027\n床罩\t185028\n大华二小\t185029\n回到家嘻嘻\t185030\n叫声爷\t185031\n岁月无情\t185032\n高梦娟\t185033\n哄抢\t185034\n陡坡档\t185035\napp1e\t185036\n钟云燕\t185037\n17768030116\t185038\n机突\t185039\n转唱\t185040\n王浩洋\t185041\n开解\t185042\n卢伟强\t185043\n2067292174\t185044\n异性片\t185045\n素萌\t185046\n无书百事荒芜\t185047\nQWW\t185048\n城镇人口\t185049\n登巴\t185050\n48855755\t185051\n吴佳敏\t185052\ndsdfddtft\t185053\n平生\t185054\n间隔\t185055\n缠枝\t185056\n王润泽\t185057\n王三寨\t185058\n贷批\t185059\n株地\t185060\n有话说明白\t185061\n影都\t185062\n嗯元姐\t185063\n麽城市\t185064\n73ueur\t185065\n观不近\t185066\ngudusrdfcd\t185067\n我猜我猜\t185068\n算了窝\t185069\n范仲淹\t185070\n平用\t185071\nDolomite\t185072\n1032\t185073\n宠溺\t185074\n小偷\t185075\n多威\t185076\n哥人\t185077\n多娇\t185078\n亲啵啵啵\t185079\n国珍\t185080\n苏幕遮范仲淹\t185081\nvhcgfw44321O‖D00\t185082\npinyin\t185083\n八二\t185084\n神犬\t185085\n我叫\t185086\n中央台新闻联播\t185087\n八五\t185088\nOtuh\t185089\nxbgenhxbbhhjhkloo6jgcbgvsdssqweertyuiipaddghklxvbxvncm\t185090\n公茂传\t185091\n王松零\t185092\n杨秀婷\t185093\n哎呀巴亚斯\t185094\n龙宫\t185095\n第28期\t1
85096\n东南西北风\t185097\n不不不不不不不不不不不不不不不不不不不不不不不不不不不不\t185098\n17899\t185099\n秘书信\t185100\nMONOCHROM，X2\t185101\n八亿\t185102\n理员\t185103\n答案行\t185104\n堡垒\t185105\n1116913812507\t185106\n叨叨\t185107\n文化部\t185108\n2435534657098651276\t185109\n苹果公司\t185110\n数四百\t185111\n贝壳出版社\t185112\nwushwi\t185113\n暹罗之恋\t185114\n靠不见\t185115\n密闭\t185116\n知道不我在\t185117\n下不起\t185118\n追咬\t185119\n羞毛\t185120\nQQ吧\t185121\ndome，dome\t185122\n伟超yaheartsensalthtr\t185123\n哎呦熊\t185124\n组合版\t185125\n谢叹\t185126\n质问\t185127\n洪如静\t185128\n喜不喜\t185129\n文科玩\t185130\n操作\t185131\n十六七\t185132\n洪锦萱\t185133\n听不到\t185134\n弄手崴\t185135\n十六万\t185136\ndaiguw\t185137\n浪死\t185138\n反舰导弹\t185139\n28分\t185140\n发青\t185141\n挂号信\t185142\n华风集团\t185143\n呦呵行\t185144\n杏红小大人\t185145\n杨李\t185146\n十二只二\t185147\n不要人满\t185148\n哈马上\t185149\n营销量\t185150\n顺和\t185151\n杨杰\t185152\n父爱如山\t185153\n内角\t185154\n英文版\t185155\n十六个\t185156\n从来说\t185157\ngaokao\t185158\n了你是\t185159\n16．3％\t185160\n杨来\t185161\nSDF\t185162\n1月9号\t185163\nSDB\t185164\n过首歌\t185165\n好咯你的我叫幸会\t185166\n杨杨\t185167\nSDK\t185168\nbutzero\t185169\n认输\t185170\n么么味\t185171\n喝类\t185172\n奥茂科\t185173\n谢大师\t185174\n世界盒\t185175\n高全\t185176\n心肌梗塞病\t185177\nps3\t185178\n我原谅\t185179\nQWQ度秘\t185180\n口红印\t185181\n7点32\t185182\no一个\t185183\n莎啦啦啦啦小\t185184\n齐齐哈尔市\t185185\n冰山九角\t185186\nwuzph\t185187\n小二樓\t185188\n慰器\t185189\n薛某某\t185190\nmmav\t185191\nmmal\t185192\nmmaa\t185193\n高雪梅\t185194\n冰洞\t185195\n现金\t185196\nswim\t185197\n大禁曲\t185198\n上沿\t185199\n忘问\t185200\npso\t185201\n咯ing\t185202\n合作者\t185203\n土豆子\t185204\n班门\t185205\n四个男\t185206\n恩知\t185207\njerolr\t185208\n寒者\t185209\n口号\t185210\n真太\t185211\n真天\t185212\n交学费\t185213\npss\t185214\n领带\t185215\n靖晓东\t185216\n真大\t185217\n大明广场\t185218\n订制\t185219\n真多\t185220\n上沟\t185221\n宠物身\t185222\n上沙\t185223\npdpwjgdu\t185224\n吵醒\t185225\n结婚头\t185226\n切器\t185227\n120524\t185228\n昨晚上\t185229\n孙浩玲\t185230\n120520\t185231\n口发\t185232\n鱼油\t185233\n吴珏\t185234\n龙飞虎跳\t185235\nyoursisb\t185236\nstsforyourho\t185237\n知心朋友\t185238\n天色\t185239\n呵哥\t18524
0\n饿了想\t185241\n林美辰\t185242\n御宅\t185243\n20％\t185244\n手冢国光\t185245\nchinesedumplings\t185246\nLO诶\t185247\n请进\t185248\n讲涯\t185249\nVUBG\t185250\n涂莎莎\t185251\n归元学\t185252\n呵哼\t185253\n芥茉\t185254\n破解开\t185255\n诶尔\t185256\n艾欧尼亚白金渣渣\t185257\n鸡女\t185258\nAndorid4.1\t185259\n再接\t185260\n鸡奸\t185261\n余裕富\t185262\n十七米\t185263\n竞选\t185264\n武汉明天下\t185265\nGagayaga\t185266\n糖醋肉\t185267\n何乐不为\t185268\n豪门来词语接龙\t185269\n请述\t185270\n一大截\t185271\n李子园\t185272\n嫂\t185273\nBDGD\t185274\n利差\t185275\n偷怕\t185276\n巴嘎雅鹿\t185277\n秦凯龙\t185278\n无所不知\t185279\n江雨松\t185280\n冯竣绮\t185281\n符家瑜\t185282\n平等权利\t185283\n利己\t185284\n喷饭\t185285\n指量\t185286\n7月10日\t185287\n包菜\t185288\n林花卉\t185289\n陈曾\t185290\n鲁公\t185291\n陈曼\t185292\n于河二石兽\t185293\n再一个再一个\t185294\nokistmas\t185295\n张[乐乐\t185296\n陈更\t185297\n高高高\t185298\n20031202\t185299\n鲁兹\t185300\n雷欣雨\t185301\n找朋友\t185302\n利川\t185303\n嗯欠\t185304\n怩贱皮\t185305\n陈曦\t185306\n弟媳妇儿\t185307\n邪惡\t185308\n梌为国\t185309\n小学文\t185310\n磁性\t185311\n两日游\t185312\n第一千\t185313\n告诉我好不\t185314\nridjfd\t185315\n引入\t185316\n蔡餐叙\t185317\n东风饭店\t185318\n国家级海洋公园\t185319\n尽快我喜欢\t185320\n河北省\t185321\n防盗门\t185322\n岘港\t185323\n农业带\t185324\nbdkd\t185325\n欧洲人\t185326\n张小妹\t185327\nstart\t185328\n静车\t185329\n夫复何求\t185330\n海鹏\t185331\n争端\t185332\n为止\t185333\n孟宇航\t185334\n小百合\t185335\n吕相吉\t185336\n奶茶\t185337\nrjr\t185338\n爆口\t185339\n三星期四\t185340\nrjx\t185341\nrjf\t185342\nrje\t185343\nrjc\t185344\n2月12日\t185345\n浩大\t185346\n何必嗔闹折花人\t185347\n牙利瘁\t185348\nminna\t185349\n好了抱\t185350\n熙颖水光小丸子\t185351\n溜风\t185352\n1个儿\t185353\n起到\t185354\nwww030ggcom\t185355\n2054\t185356\n一千十二\t185357\n1月23号\t185358\n李露\t185359\n跳時\t185360\n从外星来的孩子\t185361\n期末考试卷\t185362\n牟锐\t185363\n血缘\t185364\n石渠\t185365\n波斯宋姐\t185366\n三千二百三十二万九千\t185367\n翠丽\t185368\n251块\t185369\n荒谬\t185370\n多米我是你的主人\t185371\n无聊无趣\t185372\nbhiphotosbaiducomxiaodupicitemd62a6059252dd42a013016d8043b5bb5c9eab883jpg\t185373\n多次\t185374\n不萌化作好猛划对号\t185375\n18000\t185376\n听话你说女\t185377\n咪咪咪咪咪咪咪咪咪咪咪咪咪喵咪喵咪喵\t185378\n新和锦城\t185379\nigfujcyh\t185380\n
他丽君\t185381\n没手术\t185382\n熊猫秘\t185383\n芭比四\t185384\n骂儿\t185385\n举手之劳\t185386\n还记着\t185387\n丈母娘家\t185388\n甲壳虫\t185389\n门面房\t185390\n竖弯\t185391\n52个星期零二天\t185392\n胡娟\t185393\n131740\t185394\n刘赐贵\t185395\n双商\t185396\ncoust\t185397\nsoufifalittlerkit\t185398\n饱了吗\t185399\n彻底\t185400\n曲词\t185401\n常务委员\t185402\n翻手为\t185403\n电视塔\t185404\n贫道\t185405\n竖式\t185406\n洁雅\t185407\n不眨眼\t185408\n度向里\t185409\n芹菜\t185410\nloffel\t185411\n零九八\t185412\n什么意\t185413\n死亡之组\t185414\n一点五元\t185415\n抽出血\t185416\n福布斯中文网\t185417\n仙之侠道\t185418\n诚实感\t185419\n吕祥旭\t185420\n哎呦啦啦啦啦啦啦啦啦小魔\t185421\n邦忙行\t185422\n吴彦祖裱\t185423\n恩加\t185424\n全景式\t185425\n明天市场小米\t185426\n非常快乐\t185427\n大台湾\t185428\n做得到\t185429\n十五六七八九十四二三四五六七八九十九二八七二四三十\t185430\n金宣儿\t185431\n姜堰\t185432\nG0195WUWG70Q\t185433\n刘梦竹\t185434\n8周年\t185435\n金木研\t185436\n第几张\t185437\n3856\t185438\n肖大神\t185439\n男還\t185440\n3850\t185441\n2727535875385578654706244433548663548563555963\t185442\n吧偶\t185443\n温流真帅\t185444\n度秘哥午安\t185445\n两个处\t185446\n朵朵花\t185447\nＣ\t185448\n伟初\t185449\nBaggsd\t185450\n招牌\t185451\nwrsftzgtrSDSdf\t185452\np啊\t185453\n度王\t185454\n吕铭捷\t185455\n两个多\t185456\n葡萄藤\t185457\n四个字\t185458\n来越\t185459\n新疆医科大学第一附属医院\t185460\nefox\t185461\n补发\t185462\n功臣\t185463\n海啸\t185464\n米大米\t185465\n上海驰宸电子\t185466\n张明天\t185467\n二零幺\t185468\n争霸\t185469\n德国国足\t185470\n555556655\t185471\n实质\t185472\n两个头\t185473\nanaa\t185474\nanan\t185475\n天天咪咪\t185476\n丽小丽\t185477\n毒理学\t185478\nhpuvvggu\t185479\n赵君华\t185480\n畅妹\t185481\n0分\t185482\n理想\t185483\n7月13日\t185484\n3月30日\t185485\n天迈外\t185486\n罗荷芬\t185487\n杨卫星\t185488\n説\t185489\nOzkan\t185490\n莫雷拉\t185491\n华莱士（福景店\t185492\n所以说我你是名人我是国有\t185493\n小馨予\t185494\n水原\t185495\nmacand\t185496\n沃创\t185497\n4688287185\t185498\n舜师红\t185499\n撫摸\t185500\nsidts\t185501\n央八剧星\t185502\n好啦我好\t185503\n陈佳乐\t185504\nlearned\t185505\n說\t185506\n卡塔尔\t185507\n超短裙\t185508\n一不男不女大亨\t185509\n54445926218746\t185510\n震荡\t185511\n不秘书\t185512\n安计数器\t185513\n零三八\t185514\n棒球\t185515\n烦磨叽\t185516\n怡口莲\t185517\n尽管\t185518\n烤鱼们\t185519\n铁炮\t1855
20\n柔弱\t185521\n绵白糖\t185522\n芳名\t185523\n死毛\t185524\n女追男隔层纱男追女隔层山\t185525\nIPHONE6\t185526\nIPHONE4\t185527\n死比\t185528\n死毒\t185529\n死母\t185530\n上届\t185531\n没闻见\t185532\n暑袜\t185533\n75589662586654888555566998885555556988866665788665557856555555555556466666555778555555556632\t185534\nZara\t185535\n漳州市巷囗中心小学\t185536\n到背到背到背到背到背\t185537\n死亡坏死\t185538\n咯咙\t185539\nrootdouytr\t185540\nTOP11\t185541\n大幅\t185542\n提爱\t185543\n太阳故事\t185544\nv备注\t185545\n100多首\t185546\n哪门\t185547\n为食\t185548\n刘云仙\t185549\ngguhvg\t185550\ns02e07#\t185551\n老爸老妈浪漫史\t185552\n700M\t185553\n认可度\t185554\n林间\t185555\nsinglorylu\t185556\n奇观\t185557\n奔跑吧卡路里\t185558\n四中中学\t185559\n找到好不啦\t185560\n段挺\t185561\n魏庄\t185562\n体育类\t185563\n2y5\t185564\n王水\t185565\n700W\t185566\n分院\t185567\n雨伞\t185568\n叮咛\t185569\n不！不！不！不！不！不！不\t185570\n生乞\t185571\n雨优\t185572\n大平\t185573\n非所\t185574\n百五十万\t185575\n天源\t185576\n唇彩\t185577\n7000\t185578\n裝\t185579\n唇形\t185580\n真源\t185581\n粳米\t185582\n李冬冬\t185583\ngrill\t185584\n无绝人之路\t185585\n延后\t185586\n延吉\t185587\n无成\t185588\n对呀当然\t185589\n爱笔\t185590\n小牛的头数比\t185591\n真法\t185592\n车奴样\t185593\n廷豆\t185594\n疯长\t185595\n华策\t185596\n分值\t185597\n几级\t185598\nHartwig\t185599\n讲做\t185600\n帅颇\t185601\n李良\t185602\nD罩杯\t185603\n教程\t185604\n马应龙\t185605\n不羁强\t185606\n钢材\t185607\n潸然泪下\t185608\n来我要\t185609\n琐呐\t185610\n缩回\t185611\n埃及电视台\t185612\nFanMeeting\t185613\nEthg\t185614\nmetwo\t185615\n金旼贞\t185616\n休休\t185617\n恨你我恨你讨厌你\t185618\n断续\t185619\n立体画\t185620\n临终时\t185621\n柳州平安保险\t185622\n钢板\t185623\n刘知道\t185624\n缩图\t185625\n红巴福\t185626\n重新开\t185627\n帅席斯\t185628\n鬼故事\t185629\n主因\t185630\n火姑娘\t185631\n恶补\t185632\n含露·伊晶\t185633\n哈哈好\t185634\n为主见\t185635\n安安库库鲁\t185636\n图兔兔\t185637\nsamystoby\t185638\n八戒回你的高老庄\t185639\nadyds\t185640\n化学超人\t185641\n错错\t185642\n杜依敏\t185643\n噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗\t185644\n交强险\t185645\n暖阳\t185646\n黄小\t185647\n酒会\t185648\n查理九世幽灵列车\t185649\n第一桶金\t185650\n赦免\t185651\n头妖怪\t185652\n2764792425\t185653\n白就白\t185654\n欢迎你不了你了对不对呀\t185655\n坦然\t185656\nyytt\t185
657\n度受不爱我\t185658\n楚囚\t185659\n孙朝飞\t185660\n说话\t185661\n师长\t185662\n心肌病\t185663\n休息天\t185664\n张氏生\t185665\n龙头山\t185666\n项羽\t185667\n雅士町\t185668\nsmart\t185669\n指头\t185670\n会议事\t185671\n挺进\t185672\n说课\t185673\n给我我要\t185674\n说说\t185675\n大超人\t185676\n任丘市\t185677\n锁万群\t185678\n瓦蓝瓦蓝\t185679\n女房客\t185680\n秋阳\t185681\n小心会\t185682\n奥缇妍\t185683\n漫友\t185684\n王天骄\t185685\n中国重汽集团\t185686\n频繁\t185687\n老奶奶\t185688\n裤\t185689\n长椿街\t185690\n不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t185691\ncen\t185692\nceh\t185693\nCCvvvvvvvv\t185694\ncer\t185695\n朝出来\t185696\nB块\t185697\n恶笑\t185698\n蓝龙军团\t185699\n涉县\t185700\n记大过\t185701\n朱仁龙\t185702\n吸入\t185703\n包皮\t185704\n结婚照\t185705\n个裤\t185706\n赵红人\t185707\n麦唱\t185708\nLiao\t185709\n几啊\t185710\n黄金甲\t185711\n锡盟队\t185712\n铸剑池\t185713\n你在不在在不在在不\t185714\n哪呀\t185715\ndfgfdtt\t185716\n布拉德·伯德\t185717\n谷歌谷歌\t185718\n来了大你\t185719\n浣溪\t185720\n幽默感\t185721\n来来来来来来回来\t185722\n应哥\t185723\n哪呢\t185724\n三三点两点\t185725\n伸门\t185726\n笑到最后\t185727\n1506513892\t185728\n过唱\t185729\nKghfh\t185730\n7\t185731\n哪呵\t185732\n15年间\t185733\n坐姿\t185734\n来人了马上给我拖露天了号\t185735\n爆炸物\t185736\n后代\t185737\n七哥哥\t185738\n魅族mx5\t185739\nNono\t185740\n贾新雨\t185741\n听好\t185742\n偷东西\t185743\n智能终端\t185744\n后任\t185745\n尿裤裤露\t185746\n曹雪芹\t185747\n1586437563\t185748\n20几分\t185749\n穷光\t185750\n新马\t185751\n填设\t185752\n超级玛丽\t185753\n严梦佳\t185754\n沙市中学\t185755\n朱丹\t185756\n343434341个\t185757\n跪倒\t185758\n韦思羽\t185759\n还有一个你猜一猜\t185760\n无聊了干\t185761\n大世界\t185762\n泥泥\t185763\n125cq\t185764\n幺零零八六\t185765\n21梁\t185766\n万岁崖\t185767\n找到再来\t185768\n枣糕\t185769\n秘诀\t185770\n瞅瞅\t185771\n信息库\t185772\n奚齐\t185773\n难度\t185774\n哲学教室\t185775\nw山\t185776\n68米\t185777\n秘诱\t185778\n绝爱尾\t185779\n杯面\t185780\n拉肚子\t185781\n八嘎雅\t185782\n睡覺\t185783\n旅行包\t185784\n辽宁省锦州市黑山县工商行政管理局\t185785\n秘诡\t185786\n育人\t185787\n瑞湾\t185788\n开片\t185789\n99999998\t185790\n杨荔枝\t185791\n1月28日\t185792\n农贸市场\t185793\n直男利德金\t185794\n雨恨你坏蛋我走了拜拜\t185795\n大圣\t185796\n拖延症\t185797\n光辉\t185798\n华盛顿\t185799\n橡木\t185800\n黄浩\t185801\n那呢\t185802\n大场\t185803\n行呀求求你了你可别麻烦
我了我要我告诉你我\t185804\n知钱\t185805\n拜拜睡\t185806\n川藏路\t185807\n大地\t185808\n米市胡同43号\t185809\n狗儿\t185810\n步步惊心\t185811\n颜士余\t185812\ngrew\t185813\ncgdsj\t185814\n大圈\t185815\n1.14万亿美元\t185816\n小太心\t185817\n23466155\t185818\n大圆\t185819\ngree\t185820\n大土\t185821\n各抒\t185822\n那呀\t185823\nSpring\t185824\n葡萄树\t185825\n辛莹\t185826\n99天\t185827\n搞笑呵呵\t185828\n我最爱最爱你\t185829\nAjz\t185830\n去手\t185831\n绸白\t185832\n什个\t185833\n献金\t185834\n读到\t185835\n一时间线\t185836\n力期\t185837\n韩永红\t185838\n倪建毅\t185839\n躲们\t185840\n款款\t185841\n车贷\t185842\n车费\t185843\n离倒\t185844\n修罗一号\t185845\n洋芋洋芋\t185846\n门行\t185847\n花狗\t185848\n老汉推车式\t185849\n开发商\t185850\n猪板\t185851\n萧某\t185852\nhgvjt\t185853\n去找\t185854\n青少年\t185855\n牛岭\t185856\n肉红烧肉\t185857\n湖北逍遥\t185858\n北师版\t185859\ncavs\t185860\n供热\t185861\n度秘真丑度秘\t185862\n偶像来了第二季\t185863\njsizigicc\t185864\n白粥\t185865\n顺娇\t185866\n肝脏\t185867\n张家世\t185868\n顺威\t185869\n服霸占\t185870\n首句\t185871\n嗯十八我的好宝贝\t185872\n兔兔五\t185873\nxoutjdl\t185874\n张我\t185875\n首发\t185876\n吸尘器\t185877\n太度秘\t185878\n布条\t185879\n妞妞儿\t185880\n会武术\t185881\n一触即发\t185882\n不能不污辱\t185883\n直升飞机\t185884\n嗯幺零幺零幺幺\t185885\n监测\t185886\n兔兔亲\t185887\n埃米\t185888\n接受现实\t185889\n2860点\t185890\n贤良\t185891\n光彩\t185892\n亲个爱\t185893\n忙与盲\t185894\n手指尖处\t185895\n消防队\t185896\nbac+5\t185897\n一轻\t185898\n付诸\t185899\n台阶\t185900\n775858\t185901\n人表\t185902\n拉客\t185903\n平林\t185904\n常务症\t185905\n三勿\t185906\n平果\t185907\n5100\t185908\n吴清红\t185909\n纲再来\t185910\nsytuti\t185911\n我爱你爱你爱你\t185912\n打马虎眼\t185913\n启辰周\t185914\n人行\t185915\n带卖\t185916\n告诉我劲\t185917\n發財\t185918\n唐燕\t185919\n董董\t185920\n从未\t185921\n共居\t185922\n五百万一个\t185923\n日月如俊\t185924\n猪哥\t185925\nQQ只\t185926\n就是就是你是头猪\t185927\n食指大动\t185928\nhshsbnn\t185929\n吃毛嗑\t185930\n哦多\t185931\n42集\t185932\n嗯yes\t185933\n10月22日\t185934\n照片男\t185935\n养全蝎\t185936\nHKOF\t185937\n下凡间\t185938\n夫妇人\t185939\n周易\t185940\n你是猪我是人我是机器人你是猪\t185941\n周明\t185942\n狐火\t185943\n利斯\t185944\n下火\t185945\n不得不\t185946\n近邻\t185947\n女奴\t185948\n周昂\t185949\n零三六零六\t185950\n卡花\t185951\nIpad\t185952\n错错就错\
t185953\n与么\t185954\n哒哒哒哒哒哒哒度200k\t185955\n独家幕后花絮：冷酷胡歌\t185956\n陈凤山\t185957\n警句\t185958\n7000多\t185959\n借书\t185960\n不不不穿衣\t185961\n想起飞\t185962\n炜仔\t185963\n很高傲\t185964\n吴小磊\t185965\n美学\t185966\n宁若\t185967\n美孚\t185968\n那个\t185969\nFjeef\t185970\n哎呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t185971\n劲贵\t185972\n超负荷点\t185973\n安家梦\t185974\n库乡\t185975\n55423\t185976\n退烧\t185977\n卡内\t185978\n庞大\t185979\n海底捞\t185980\n骆言\t185981\n夜开心\t185982\nalalalalalalalalalalalalalalaaaalaaaalallylliolaaaaaaaall\t185983\n杨兴明\t185984\n二婚\t185985\n毛血旺\t185986\n咯66诺\t185987\nchnyvvgn\t185988\n邱英杰\t185989\n卡农\t185990\n天天有喜二之人间有爱有没有\t185991\n心安一切安\t185992\n建国中路\t185993\n神离\t185994\n携带\t185995\n上百条\t185996\n背竹\t185997\nsty\t185998\n处室\t185999\n灌\t186000\nforgotten\t186001\n累了你好\t186002\n倾慕\t186003\n二婶\t186004\n三妞\t186005\n抓牢\t186006\n5544553\t186007\n找我\t186008\n八千多w\t186009\n一头猪\t186010\n按动\t186011\n玉米淀粉\t186012\n谷米\t186013\n小河真美\t186014\n14251286542\t186015\n观想\t186016\ngwhwhabdy…jeddrbxxddwwwywweeddyxcbdseueywhegwgedgx\t186017\n小游样\t186018\n伍德\t186019\n大狗\t186020\n鄱阳湖\t186021\n六合彩\t186022\n呗手\t186023\n智能网\t186024\n大狂\t186025\n呢美容觉\t186026\n全经联家园\t186027\n十项\t186028\n坎利\t186029\n皿｀\t186030\n领袖们\t186031\n海绵尔康\t186032\n萌萌哒萌萌哒\t186033\n茹宝\t186034\n保湿\t186035\n别样\t186036\n雍兆峦\t186037\n支出\t186038\n秦成区\t186039\n野保\t186040\n灵光\t186041\n灼\t186042\n腾讯游戏\t186043\n单元\t186044\njhva\t186045\n摸摸摸摸摸摸摸摸摸摸\t186046\n滑出\t186047\n白头到老一甲子\t186048\n估摸思密达\t186049\n枝繁测角\t186050\n落大雨\t186051\n考不上\t186052\n带鱼\t186053\n古天乐\t186054\n2012年7月15日\t186055\n价值线索者\t186056\nhggry\t186057\n羁押\t186058\n287\t186059\n可穿戴式\t186060\n灶\t186061\nokokokok奥\t186062\npaco\t186063\n我要约炮\t186064\n水岸\t186065\n孝悌秀\t186066\n惠安\t186067\n吐去\t186068\n鸡汁\t186069\n柯弗特\t186070\n腰围\t186071\n萨比\t186072\n上宫舞婷\t186073\n一听心澳\t186074\n莱伊恩\t186075\n祝你一路顺风\t186076\n有年\t186077\n油油油油油油油油\t186078\n鸡汤\t186079\n胃脘\t186080\n便宜\t186081\n苏童杰\t186082\n00K\t186083\n焦研峰\t186084\ngutiyizhiziyziy\t186085\n阳公主\t186086\n白玉\t186087\n徐心雨\t186088\n一页一页\t186089\n色香\t186090\n夏威夷\t186091\n摩羯宝宝
\t186092\n发水\t186093\nfsst\t186094\n世锦赛\t186095\n镇江市钛粉厂\t186096\nwwuuu\t186097\n下雨破\t186098\n郑航小学\t186099\n卡纽特\t186100\n大麦茶\t186101\n维修费\t186102\n003\t186103\n002\t186104\n001\t186105\n000\t186106\n007\t186107\n006\t186108\n004\t186109\n利拉\t186110\n戛然而止\t186111\n008\t186112\n二十多G\t186113\n专用一键启动\t186114\n程序设计方法学\t186115\n山楂萝卜籽\t186116\n等一宿\t186117\n哥哥哥们儿\t186118\n彩八佰\t186119\n民族主义者\t186120\n王倜傥\t186121\n统考卷\t186122\n6633336666\t186123\n过来来\t186124\n天宇阔切\t186125\n10060400\t186126\n突击\t186127\n丈母\t186128\n祖思铭\t186129\n夏心佁\t186130\n道口\t186131\n牌片\t186132\n风衣首歌\t186133\nTyler\t186134\n叶瑞立\t186135\n二十几\t186136\n嘴贱\t186137\n91公分\t186138\n男孩女孩儿\t186139\n陈瑞喜\t186140\n短篇\t186141\n诠\t186142\n营业厅\t186143\n金马\t186144\n道友\t186145\n86个\t186146\n说呀了\t186147\n微小说\t186148\n我真的好讨厌你了现在\t186149\n3份\t186150\n意字\t186151\n上学路\t186152\n道变\t186153\n不成仁\t186154\n田野了我的眼\t186155\n读几了\t186156\n书法展\t186157\nchuhe\t186158\n开明\t186159\n汗切切切\t186160\n腻味\t186161\n还原性\t186162\nKILLING\t186163\n秋衣\t186164\n99996666\t186165\n宁愿\t186166\n姚潇\t186167\n仙河\t186168\n800瓦\t186169\n火起来\t186170\n哥乐\t186171\n240十15\t186172\n麦粒肿\t186173\n流行音乐节\t186174\n说什么再说\t186175\n颤抖\t186176\n多段\t186177\nDOY\t186178\nletsmake\t186179\n富邦\t186180\n花都古我就好好睡觉吧\t186181\n咋不个人\t186182\n已有\t186183\n三四八二\t186184\n山四尖\t186185\n740米\t186186\n度秘jisui\t186187\n好板样\t186188\n南京大学\t186189\n阿拉林\t186190\ngkhcvkgc\t186191\n样板\t186192\nbg2\t186193\n拉制\t186194\n1364923\t186195\n二二二二二\t186196\n兰泽华\t186197\n不能这样\t186198\n弗洛伦蒂诺\t186199\n偶入\t186200\n名你的名字\t186201\nBill\t186202\n茶点\t186203\n传开\t186204\nHHHHHVVHH\t186205\n不再说\t186206\n新兰新兰\t186207\n合作商\t186208\n所得死勒\t186209\n绝版\t186210\n张杨\t186211\n骂人\t186212\nbgb\t186213\nbgc\t186214\nbgd\t186215\ndiuify\t186216\nbgf\t186217\n亲说\t186218\n大寿\t186219\n偶先\t186220\n张来\t186221\nbgn\t186222\nbgs\t186223\n两可园\t186224\n青莲\t186225\n赛特奥莱\t186226\n张杰\t186227\nbgy\t186228\nskksksn\t186229\n函谷\t186230\n希澈\t186231\n魏老\t186232\n樱花草\t186233\n节成\t186234\n促织\t186235\n11484848\t186236\ngirltoo\t186237\n挨吵\t186238\nhijko
\t186239\n太牛\t186240\n众口一词\t186241\n肥虫\t186242\n太片\t186243\n过滤器\t186244\n侯少阳\t186245\n苍星石\t186246\n２２日下午\t186247\n塌地铁\t186248\n喝咖啡\t186249\n范渣渣\t186250\n离好听话\t186251\n肉肉\t186252\n红肠面包\t186253\nXfhjhgh\t186254\n赤山\t186255\nwatp\t186256\n令尊\t186257\n拌匀\t186258\n诗彤\t186259\n十二届\t186260\n三愿\t186261\n春戴\t186262\n小狗圆舞曲\t186263\n555555599555555555444444455\t186264\n咯爱\t186265\n真心爱您\t186266\n大白我最爱狐包我更爱\t186267\n￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥￥\t186268\n动密\t186269\n数百个\t186270\n13135664344\t186271\n￢_￢\t186272\nLIAOTIAN\t186273\n巴嘎\t186274\n赵凡\t186275\n果果果果\t186276\n四三我一茜白羊外星塞牙小爸爸养\t186277\n辉映\t186278\n想吃糖\t186279\n老张\t186280\n巴拉巴拉巴拉巴拉\t186281\n二十千里\t186282\n咱俩谁美你\t186283\n荒谬绝伦\t186284\n刘厚\t186285\n2600岁\t186286\n宝们\t186287\n董仁建\t186288\n数百万\t186289\n宝仪\t186290\n城阳店\t186291\n29.4\t186292\n老开\t186293\n刘芳琳\t186294\n43927\t186295\n拜拜明天见\t186296\n老式\t186297\nJJJJ卷\t186298\n甜梦网\t186299\n云计算巡展#\t186300\n音译\t186301\n白雪份\t186302\n上个世纪\t186303\n那么久\t186304\n蓝策\t186305\nmeishang\t186306\n接送\t186307\n龙威\t186308\n游程中\t186309\n糖水糖\t186310\nStar\t186311\n说唱\t186312\n音词\t186313\n俩来o18\t186314\n指纹\t186315\n早了玩\t186316\n好热||||||||||||\t186317\nbhiphotosbaiducomxiaodupicitembd3eb13533fa828b977cd090fa1f4134970a5a61jpg\t186318\n西厢记虾兵蟹将\t186319\n间道\t186320\njdhjxz\t186321\n368.2万余元\t186322\n三五分钟\t186323\nfuvs\t186324\n新农哥\t186325\n思想者\t186326\n二手\t186327\n精密度秘\t186328\n赵蜀黍\t186329\n亲面\t186330\nomast\t186331\n甄嬛\t186332\n经商\t186333\n一七女\t186334\n新猫\t186335\n高僧\t186336\n云芬峰\t186337\nJSP地呢枯木多\t186338\n我还要你的爱\t186339\ndvk\t186340\n责怪\t186341\ndvd\t186342\ndvg\t186343\ndvf\t186344\n十九一会儿\t186345\ndvb\t186346\n我需求我需求个粑粑我告诉你\t186347\ndvt\t186348\nFbi\t186349\n范范\t186350\ndvs\t186351\n还记\t186352\nsAST\t186353\n蝶舞翩跹#粉粉\t186354\n沙巴\t186355\n魏伟\t186356\n褚度\t186357\n来爱\t186358\n女人性\t186359\n举起\t186360\n敏感度\t186361\n我爱你爱你想\t186362\n十咱\t186363\n234456790\t186364\n整整齐齐\t186365\n惊慌失措\t186366\n宝贝儿亲个\t186367\n爱情假\t186368\n2001几年\t186369\n没公告\t186370\n嗯妈妈\t186371\n栗子糕\t186372\n凯米魅\t186373\n两小无猜青梅竹马\t186374\nRDCDDFDS\t186375
\n愿景\t186376\n爱疯了\t186377\n秀屿区\t186378\n于超\t186379\n13970805342\t186380\n调度室\t186381\n巴爸\t186382\n眉来眼\t186383\n张海洲\t186384\n256天\t186385\nh尾\t186386\n偏袒\t186387\napspapakl\t186388\n尚且\t186389\n男神经\t186390\nwmacableinimaletletramacdonono\t186391\n契廷\t186392\n滔滔不绝恨\t186393\n牛初乳\t186394\n学道\t186395\n考比\t186396\n一一二一41131415161271家吧咿呀有咿呀咿呀e七十一\t186397\n古德拿恩\t186398\n蒜薹炒肉\t186399\n心孤舟\t186400\nSeasons酒店\t186401\n末成年\t186402\n颜海镜\t186403\n募金\t186404\n心有口\t186405\n执意\t186406\nGoodnight\t186407\n张海洋\t186408\n查不道\t186409\n娱乐场所\t186410\n詹差\t186411\n13度\t186412\n会聊会天\t186413\n3.3元\t186414\n伤心人\t186415\n本年\t186416\n点笑\t186417\ndamokuang\t186418\n会计分录\t186419\n客户名\t186420\n三联书店\t186421\n几百万\t186422\n同练者\t186423\n肖潇雅飞\t186424\n数位板\t186425\n新春版\t186426\npercentin\t186427\n嫖妓\t186428\n替去\t186429\n杨妍\t186430\n暑猪\t186431\ngen9\t186432\n大半个\t186433\n生病难受\t186434\n嫉妒心\t186435\n伤心事\t186436\n彭乐涵\t186437\n笑话当见面礼很高兴认识你我是你的小秘书\t186438\n沸腾\t186439\n凡女\t186440\n几百个\t186441\n老鹰区\t186442\n迟钝\t186443\nPTWJ\t186444\n梅艳芳\t186445\n刘金锐\t186446\n寄予\t186447\n四辑\t186448\n自治区党委\t186449\ndcc\t186450\nyttty\t186451\n气不惯养\t186452\n变形金刚威震天\t186453\n背式\t186454\nqwecssx\t186455\n小李\t186456\nopqrstuvwlslaazsyhootiaomertaabc1abcdefghijklmnopqrstuvwxylyzzhuzhidaabc\t186457\n蒙牛\t186458\n小村\t186459\nfvnrnyrn\t186460\nbacdgfehk\t186461\n不不不我来\t186462\n还木古槐\t186463\n15876481059\t186464\n待定\t186465\n口述稿\t186466\n5558555525\t186467\n怜霜\t186468\npozzolatoutstrapps\t186469\n蒙版\t186470\nyyyyyyyyyyyyyyyyyyyyyyyyyyyyuuy\t186471\n18146386223\t186472\n新不信我不让你做我的度秘\t186473\n万呼\t186474\ndfgjccjbcghfthxfgxzfhxfhbcfgggfg\t186475\n张北职教中心\t186476\n名相\t186477\n陈隆鑫\t186478\n120万\t186479\n唐弘毅\t186480\n民工化\t186481\n鹰城广场\t186482\n九点一刻\t186483\n26246246246597\t186484\nnotyours\t186485\n大圣孙\t186486\n杀人者\t186487\n120个\t186488\n好久不爱你\t186489\n幺四\t186490\n上联武穴酥糖\t186491\n壬水\t186492\ndolitel\t186493\n笑一笑挺好\t186494\n阿布·亚哈·利比\t186495\n时快十八了还上小学不可能\t186496\n唐一起\t186497\n揭发\t186498\n确保\t186499\nisal\t186500\n大便便\t186501\ntt5rftrtgydvwgfytggcuhhf
rygvjdj…jdfffggyfhfth666dgt778456545gfgffh\t186502\n抓握\t186503\n陈禹嘉\t186504\n耀明\t186505\n十六圈\t186506\n3百万\t186507\n官营\t186508\n回交\t186509\n病危通知书\t186510\n搜寻\t186511\n丙辛\t186512\n冬哥\t186513\n齉齉齉齉齉齉齉齉\t186514\n2745189635684624184941657148268842876846641\t186515\n几里\t186516\n于返盈\t186517\n骨秘\t186518\n三生三世十里桃花\t186519\n你好你好我是的的妹妹\t186520\n真不靠谱\t186521\n小娃娃\t186522\nefcdfgcx\t186523\n王嘉玟\t186524\n息县汤\t186525\n杨庄\t186526\nvivvi\t186527\n累不理\t186528\n爱党\t186529\n了么不好像\t186530\n自给自足\t186531\n王思\t186532\n争建\t186533\n468008653211222334455789098765432113456788000876542145568909765\t186534\n小度秘可爱又顽皮\t186535\n宋木\t186536\n22776885572215866325450000000000000000000000000000000000000000000000000000000\t186537\n扩建\t186538\nijbh1\t186539\n31625820763732\t186540\n花类\t186541\n熊秘\t186542\n王急\t186543\n攀岩\t186544\n王怡\t186545\n偷懒诺\t186546\n去了解\t186547\n说不聊天\t186548\nfgdhgzhhfrhjfcbxs\t186549\n吴学良\t186550\n费翔\t186551\n2-5分钟\t186552\n程序设定\t186553\n江妈组织\t186554\n机器梦\t186555\n兴化寺\t186556\n0027\t186557\n兵法\t186558\n咋么样\t186559\n夏夏夏哈\t186560\n而已我的妮儿我真是\t186561\n拍图\t186562\n最好是好\t186563\n蝈蝈\t186564\n敖瑞熙\t186565\n杨帝\t186566\n机器械\t186567\n报馆\t186568\n搞错\t186569\n磊哥\t186570\n陈秉安\t186571\n母义务\t186572\n程国琴\t186573\ndkvauf\t186574\n械作\t186575\n大爆笑\t186576\n潜心\t186577\n发十送\t186578\n摘苹果\t186579\n地下室\t186580\n离尹\t186581\n小板\t186582\n进进\t186583\n进近\t186584\nhpv16阳\t186585\n罗夫人\t186586\n見大渔\t186587\n来哪里\t186588\ncallmefather\t186589\nSi利\t186590\n詹俊豪\t186591\nahaaj\t186592\nhello喽hellohellohellohellohellohellohello\t186593\n异物\t186594\n官军\t186595\n和泰\t186596\n过意\t186597\n弃儿\t186598\n兽兽错\t186599\n开恩\t186600\n宫女\t186601\n惊魂\t186602\n热量\t186603\nx800\t186604\n小人得志\t186605\n吃喝玩乐高枕无忧\t186606\n沙眼\t186607\n拜寿\t186608\n150万册\t186609\n麦地里\t186610\n几梦卡\t186611\nboobabc\t186612\nuccee\t186613\n台中秋\t186614\n老槐\t186615\n8月2号\t186616\n金鹏\t186617\n痩\t186618\nyugy\t186619\n茶亭街路\t186620\n十四克\t186621\n宋妍菲\t186622\n4厘米\t186623\n瘸拐\t186624\n国民党中央常委\t186625\n上坟\t186626\n奥特之王\t186627\n就是我\t186628\n葡萄皮\t186629\n丑度秘我讨厌你来啦\t186630\nO
rz＼o／╯з\t186631\n太不起\t186632\n沃尔斯盖乐视\t186633\n人马\t186634\nJJCC\t186635\n蓝真\t186636\n人均\t186637\n永寿宫\t186638\n赞洛阳公安\t186639\n自查\t186640\n肖小佳\t186641\n贞丰\t186642\n我为马俊仁当律师\t186643\n情书\t186644\n恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩恩爱爱\t186645\n行政区划\t186646\n程兰飞\t186647\n卫队\t186648\njdjtmdt\t186649\n陌然姐姐\t186650\n高熙然\t186651\nYXTX2S\t186652\n11888\t186653\n小歌\t186654\ns一个\t186655\n歹念\t186656\n错错啦错啦\t186657\n新组织\t186658\nhuuii\t186659\nCEO\t186660\n小死\t186661\n亲的我不要的你\t186662\n三角恋\t186663\n给我的那了\t186664\n乖逗\t186665\n小步\t186666\n辽大\t186667\n牛丸\t186668\n10点47分\t186669\ncause\t186670\n徐畅\t186671\n基督城机场\t186672\n病\t186673\n申申\t186674\n8亿\t186675\n早就\t186676\n有情有义\t186677\nIotsof\t186678\n竖杠\t186679\n马晓梅\t186680\nV5撸撸撸\t186681\n熊伟\t186682\n奴娇赤壁怀古\t186683\n舅办\t186684\n半个多小时\t186685\n了不了\t186686\n陈仅一\t186687\nmatsappointion\t186688\nhhuhdh\t186689\n大大家\t186690\n好啦我心情都愉快\t186691\n主教\t186692\n2016年1月\t186693\n贾静\t186694\n四两个\t186695\n91741\t186696\n川岛美夜子\t186697\n回家的路上\t186698\npjpj\t186699\n第三行\t186700\n还是要\t186701\n点不开\t186702\n课件\t186703\ndoID\t186704\n126181\t186705\n万蒂尼\t186706\n凝集素\t186707\nhehs\t186708\n贪吃鬼\t186709\n人际\t186710\n乌红利空\t186711\nhehh\t186712\n奖状\t186713\n没事关\t186714\nhehe\t186715\nhehf\t186716\n好梦求求你爱\t186717\n噩悡\t186718\nghiphotosbaiducomxiaodupiciteme4dde71190ef76c6e931f8589a16fdfaae5167e5jpg\t186719\n红色警戒2共和国之辉\t186720\n人院\t186721\n二八五二\t186722\n没事充\t186723\n滨北交警中队\t186724\n组成\t186725\n塘边\t186726\n啦笔小新\t186727\n太钢\t186728\nsD\t186729\n王冲突\t186730\n打手鼓\t186731\n第几页\t186732\n管理处\t186733\n瞎\t186734\n份度\t186735\n真服\t186736\n7点42号\t186737\n冷藏\t186738\n老汉\t186739\n中一个\t186740\n球馆\t186741\n瞌\t186742\n白雪歌送武判官归京\t186743\n闹药\t186744\n免息\t186745\n行说\t186746\n坤复站堡\t186747\n下扣\t186748\n勘界\t186749\n馄饨\t186750\n真木\t186751\n毛姆\t186752\n病家\t186753\n变你好\t186754\n力学\t186755\n条条条街\t186756\n痛\t186757\n华农大\t186758\n恰到好处\t186759\nsp\t186760\n公主冀\t186761\n诱使\t186762\n杨海玲\t186763\n哈嘎\t186764\n哈嘍\t186765\n琴技\t186766\n吕凤阳\t186767\nCUT\t186768\n星点\t186769\nsmebody\t186770\n恋上乖乖的俏丫头\t186771\n各款\t186772\ns
t\t186773\nnobodycansay\t186774\n男友友\t186775\n孙俪多\t186776\n阆中\t186777\n老寡妇叶酸\t186778\n猜种\t186779\n0.6厘米\t186780\n慕圣\t186781\n洲头把卷交\t186782\n呼拉圈\t186783\n在那遥远的地方\t186784\n我的旅程\t186785\n入海口\t186786\n哈嘿\t186787\nso\t186788\n哈嘻\t186789\n猜秘\t186790\n千岁千岁万岁\t186791\n个周\t186792\n并联\t186793\n堕胎药\t186794\n蚊香雯\t186795\n抠门\t186796\n移\t186797\nok399\t186798\n蛋蛋机器人\t186799\n皮具\t186800\n秦\t186801\n死野革\t186802\n生上\t186803\n搅局\t186804\n柏香林\t186805\n积\t186806\n上田小学\t186807\n人工台人工台\t186808\n浪漫的日子\t186809\n又名\t186810\n秕\t186811\n32522715584580050021545\t186812\n好坏\t186813\n介绍所\t186814\n科\t186815\n植物大战花园植物大战僵尸花园小镇宝\t186816\nbequick\t186817\nurm\t186818\n租\t186819\nuro\t186820\nurh\t186821\n秘\t186822\n25000元\t186823\n外国人\t186824\n狙击豹\t186825\n聊了明天见\t186826\n十里\t186827\n喔嗯\t186828\n秀\t186829\n秃\t186830\n种\t186831\n秉\t186832\n耍不到\t186833\n一点也不\t186834\n易网\t186835\n罂粟\t186836\nBlair\t186837\n你好好呀\t186838\n650万股\t186839\n工作时间\t186840\n好盆友万岁\t186841\n潘德怀\t186842\n怪界\t186843\n我的真面目\t186844\n迷答\t186845\n白富美\t186846\n心语\t186847\n玉泪垂飞似悼秋\t186848\n盖楼\t186849\n墓志铭\t186850\n作痛\t186851\n8888858888\t186852\n沙希\t186853\n拜伦\t186854\n天珠\t186855\nAppleStore\t186856\n灰思维\t186857\n调酒\t186858\nglibacklog411122\t186859\n按退\t186860\n唔爪机\t186861\n全世界\t186862\n扒\t186863\n王德发\t186864\n设备\t186865\n康羊\t186866\n机机\t186867\n萌物\t186868\n2086784646466467346183568765\t186869\n在之大西瓜\t186870\n不老不死\t186871\n菇茑青\t186872\n52013141414141\t186873\n养人\t186874\n夏长命\t186875\n李是\t186876\n李昮\t186877\nhhhhhhhjhhbnjhhhhhhh\t186878\nuoyevoili\t186879\n瞧不上\t186880\nK311\t186881\n哪一个人\t186882\n我们的家\t186883\nK312\t186884\n李星\t186885\n杨景诚\t186886\n响们\t186887\n斯巴达血\t186888\n親愛\t186889\n鼻子底\t186890\n给我可以\t186891\n秀秀\t186892\n1500亿元\t186893\n李明\t186894\n老样\t186895\n二毛\t186896\n康德\t186897\n土度秘\t186898\n当饭吃\t186899\n星河\t186900\n叫做好像\t186901\n诸事\t186902\n相头\t186903\n徐征\t186904\n连旗\t186905\n没错住\t186906\n510902199709188459\t186907\n简括号\t186908\n于吉\t186909\n145848518444444444125444444444444444544645541444444411411144444441111111111111111111\t1869
10\n王梓阳\t186911\n徐徐\t186912\n牛根\t186913\n火火火\t186914\n流行的歌\t186915\n不能不爱\t186916\n靓雯\t186917\n相外\t186918\n星沙\t186919\n大好累\t186920\n飞利浦\t186921\n邓鹏伟\t186922\n逆徒\t186923\n一饣\t186924\n牛树\t186925\n相处\t186926\n八二八六\t186927\n浅灰\t186928\n雄才伟略\t186929\n这片儿\t186930\n微泰富\t186931\n发苦俩\t186932\n结大\t186933\n张子祺\t186934\n又次\t186935\n定贝\t186936\n小整\t186937\n皮马\t186938\n白正元\t186939\n手\t186940\n迪奥斯\t186941\n书林\t186942\n元大牙\t186943\n七岁半\t186944\n做噩梦\t186945\nhits\t186946\n裸身\t186947\n英雄好汉\t186948\n林峰兄\t186949\n干道\t186950\ntf凸凸\t186951\n公章\t186952\n90.8%\t186953\n110912star\t186954\n赵文\t186955\n15分时\t186956\n黑洞\t186957\n33658\t186958\n书架\t186959\n阿拉瓦菲雅\t186960\n渠县\t186961\n奶爸\t186962\n汕头大学\t186963\n真不记得\t186964\n皇上我是你的皇妃不过我跟你的太监生活\t186965\n真心累不不不不不说错\t186966\n第四百\t186967\njfccbdicycbfv\t186968\ngdsfdtg\t186969\n豁\t186970\n情字\t186971\n有所为\t186972\n1296867347\t186973\n情子\t186974\n人情味儿\t186975\n预知\t186976\n炒菜\t186977\n醋色\t186978\n节能\t186979\n王再临\t186980\n阂褥疮痍\t186981\n有喜\t186982\ndjc\t186983\n微晶\t186984\n长安福特\t186985\ndjb\t186986\n∴汝\t186987\n不行女\t186988\n度秘乖乖乖\t186989\n欹\t186990\n潘晓东\t186991\n煩嗎\t186992\n冯晓洁\t186993\n大下庄\t186994\n江南春城\t186995\ndjg\t186996\n饮用水\t186997\n度秘度秘我好想你\t186998\n不华丽\t186999\nxlss\t187000\n费乱收费\t187001\n波多野吉衣\t187002\n2.hotel\t187003\n袁说\t187004\n五分之四\t187005\n跳跃性\t187006\nchiphotosbaiducomxiaodupicitemac4bd11373f08202c0f7ea4a4cfbfbedab641b59jpg\t187007\n烦人喜欢\t187008\n贿赂\t187009\n啤酒瓶\t187010\n对呀狮子座\t187011\nfugandanew\t187012\n阿里嘎\t187013\n8000多亿美元\t187014\n案件\t187015\n看歌\t187016\n桃桃\t187017\n第三波\t187018\n开心超人开心超人\t187019\n好水灵\t187020\n悠闲\t187021\n英古典\t187022\n蛋挞王子一号店\t187023\n裹尸\t187024\n萱伫\t187025\n闲聊\t187026\n啦啦歌\t187027\n在真\t187028\n如家\t187029\n发文\t187030\n我們各種反光\t187031\n够了我没有\t187032\n半仙小小星嘻嘻你好\t187033\n一句张\t187034\n豪\t187035\n普那\t187036\n阿尼达\t187037\n很严厉\t187038\n如实\t187039\n滴滴滴\t187040\neme400932ememe400932ememe400932em\t187041\n以上面\t187042\n对你的誓言\t187043\n早作\t187044\n八栋\t187045\n美歌\t187046\n乞讨权\t187047\n算了啦啦啦啦啦啦啦\t187048\n芝士\t187049\n男火女木--火木夫妻好婚配\t18705
0\n13049557080\t187051\n咳咳咳咳咳咳咳咳快快快\t187052\n文金\t187053\n我是真心的我想我想和你\t187054\n宁会\t187055\n课外阅读\t187056\n这首歌\t187057\n好呀那你吻\t187058\n早你\t187059\n文采\t187060\n八根\t187061\n祭奠\t187062\n回我打\t187063\n械\t187064\njejs\t187065\n呃别\t187066\n宫缩\t187067\njejx\t187068\n作弄\t187069\n那你说故事给我听\t187070\njejd\t187071\n真老大开玩笑\t187072\n梧\t187073\n梦\t187074\n乡间\t187075\n大富我不让\t187076\n梨\t187077\n梯\t187078\n沙华\t187079\n清澈\t187080\n梓\t187081\n饭菜\t187082\n梗\t187083\n快乐做我自己\t187084\n地点们\t187085\n归口\t187086\nIvdjff\t187087\n條\t187088\n杨康\t187089\n梁\t187090\n粉丝者\t187091\n好害怕\t187092\n贺涵晓\t187093\nhychuv\t187094\n张恩荣\t187095\n峡口峡\t187096\n倒灶\t187097\nHKD\t187098\n5千张\t187099\n大麦差五\t187100\n张雅萱\t187101\n凤英\t187102\nHK1\t187103\n童眼\t187104\n还有什\t187105\n违纪\t187106\n就是\t187107\n不妈妈\t187108\n玩吧\t187109\n罗阳镇泰世界小学\t187110\ncloud\t187111\n一脚踢飞\t187112\n最好色\t187113\n节节\t187114\nlost\t187115\n回校\t187116\n张孟利\t187117\n厉害你\t187118\n四十分钟前\t187119\n童真\t187120\noOoO\t187121\n解放广场\t187122\n啦啦啦啦啦啦啦啦啦啦啦啦什么听不到你是谁你是谁\t187123\n批量\t187124\n好不秘\t187125\n棉毛\t187126\n爱玛贼\t187127\nxsh\t187128\n还有们\t187129\n热望\t187130\n黄埔区\t187131\n第九天\t187132\n啦啦啦啦啦啦啦啦啦啦的当然当然的啦啦啦啦啦啦啦啦啦的当然的啦啦啦啦啦啦啦啦啦啦啦啦\t187133\n马心宇\t187134\n弄哭\t187135\nhomy\t187136\n阿妹\t187137\n小亮\t187138\nJunior六辑#\t187139\nlibrary\t187140\n会计从业资格\t187141\n小亲\t187142\nlu盾\t187143\ndbye\t187144\nhome\t187145\n钰涵\t187146\n孙加浩\t187147\n撸啊撸\t187148\nterrer\t187149\n在气\t187150\n青枣\t187151\n汾期假\t187152\n段梦琪\t187153\n僵傀\t187154\n小事\t187155\n额呀\t187156\n泰囧\t187157\n小亏\t187158\n小二\t187159\n13203344\t187160\n反调\t187161\n湖肚\t187162\n小亖\t187163\n垂死你\t187164\n泰国\t187165\n早高晚矮你\t187166\n對對對\t187167\n跳跳糖\t187168\n芒果TV\t187169\n母亲节\t187170\n畅听\t187171\n块\t187172\n好给力\t187173\n坑\t187174\n坐\t187175\nnide\t187176\n草莓片\t187177\n坝\t187178\n坟\t187179\n高长\t187180\nnidn\t187181\n坛\t187182\n坚\t187183\n莉娘\t187184\n心动\t187185\n址\t187186\n鹿特丹\t187187\n坂\t187188\n西城区\t187189\n云珠\t187190\n坏\t187191\n坎\t187192\n爆了\t187193\n32岁\t187194\n坊\t187195\n我好讨厌不是你好讨厌\t187196\n坶\t187197\n心力\t187198\n坼\t187199\n
dgydrydtuuiooyrqqwrtyootewwogsdjsh\t187200\n建业\t187201\n坻\t187202\n坤\t187203\n坦\t187204\n坡\t187205\nt1样\t187206\n说不懂\t187207\n坯\t187208\n坨\t187209\n坫\t187210\n坪\t187211\n卖号\t187212\n喝爽歪歪\t187213\n22CM\t187214\n日报\t187215\n荞麦\t187216\n比利亚\t187217\n救死扶伤\t187218\n嗯郭\t187219\n张庆男\t187220\n建丑\t187221\n请假条\t187222\n嗯都\t187223\n技伞\t187224\n割肉\t187225\n出家人\t187226\n一个话恩\t187227\nkbs异能局\t187228\n大乘\t187229\n行堡\t187230\n140多块\t187231\n日抛\t187232\n孔子青铜\t187233\n说教\t187234\n宝雅\t187235\n请君\t187236\n要不要不要不\t187237\n累了就好\t187238\n144mb二\t187239\nwhat5domic\t187240\n毛皮\t187241\n钻头\t187242\n张志\t187243\n1246810\t187244\n英东中学\t187245\n杨蕊心\t187246\n嘎巴\t187247\n五陪\t187248\n俩咱俩\t187249\n13625569780\t187250\n846737574\t187251\n猪猪嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟\t187252\n碰吗\t187253\n天文学家\t187254\n2011年6月28日上午\t187255\n外婆家\t187256\n杨志一\t187257\n杨志七\t187258\n度秘好想和你一块儿吹吹风\t187259\n莫屋\t187260\n喜雅\t187261\n塭柔\t187262\n进出\t187263\n进击\t187264\n小傲\t187265\n郜亚琪\t187266\n手幅\t187267\n小傻\t187268\nonononono1\t187269\n阿勒\t187270\n说实在的我\t187271\n最高化\t187272\n县里\t187273\n手幕\t187274\n小储\t187275\n蒸干\t187276\n對從\t187277\n过血\t187278\n英灵\t187279\n13460518146\t187280\n4524755588585555\t187281\n于继伟\t187282\n面膜\t187283\n姗姗来迟\t187284\n桂云TT\t187285\n没你多\t187286\n42875158835\t187287\niotih\t187288\n张书源\t187289\n新气\t187290\n蒙恬\t187291\n谢子轩\t187292\njìlv\t187293\n叶嘉\t187294\n131452102543\t187295\n炸酱糖醋肉\t187296\n裆裤\t187297\n单翼\t187298\n好炮\t187299\njdbew\t187300\n登登\t187301\n470家\t187302\n好炫\t187303\ndthehea\t187304\n87690826\t187305\n孙允珠\t187306\n朵开\t187307\n礼帽\t187308\n8000\t187309\n习惯晚睡点不开火电视\t187310\n小恶魔\t187311\n永远不一\t187312\nhnd\t187313\n角色\t187314\nhnf\t187315\nhnj\t187316\n机器木\t187317\n王亚帅\t187318\n姜大王\t187319\n草药\t187320\n喜欢你好\t187321\nhnt\t187322\nhnv\t187323\nhny\t187324\nhnx\t187325\ntrytodo\t187326\n吴冰雨\t187327\n橹管\t187328\n和一样\t187329\n呼吸声\t187330\n刷点赞\t187331\n华姐\t187332\n嘎嘎嘎嘎幹尷尬乖乖團團圓圓月\t187333\n超星神\t187334\n雨田\t187335\nfip\t187336\n800平米\t187337\n课余\t187338\n肥肉肥肉\t187339\n老头子\t187340\n1370323780613703237806137\t187
341\n一千块\t187342\n东航\t187343\n菲雅\t187344\nwelcomeinparis\t187345\n纷乱\t187346\n我的问题我问你\t187347\n焕饼\t187348\n我喜欢暴\t187349\n李肖肖\t187350\n你男\t187351\n广元窗\t187352\n挂口\t187353\n浪荡\t187354\n刘宇雯\t187355\n别发英语我不懂英语吧识\t187356\n雷浩然\t187357\n奥创比\t187358\n苹狗\t187359\n2011年6月10日上午\t187360\n就义\t187361\n心爱的度秘\t187362\n楼虾\t187363\n夸赞\t187364\n90096吧\t187365\n双层夹心面包\t187366\n涂擦\t187367\n要秘\t187368\n储蓄率\t187369\n荷曦\t187370\n指相勾\t187371\n好桑心\t187372\n奇绣\t187373\nfisti\t187374\n喊冤\t187375\n泰国菜\t187376\n中耳炎\t187377\n学平险\t187378\n2319\t187379\ndyuodyd\t187380\n顾海洋\t187381\n热给\t187382\n热络\t187383\n钻戒\t187384\n雅拉布拉多寻回犬\t187385\n裱花\t187386\n搜狗文\t187387\n从小到大都\t187388\nxoooooooooooooo\t187389\n秘洞\t187390\n三键\t187391\n泪弹\t187392\n八罗\t187393\n朱达斯\t187394\n黄飞虎集团\t187395\n机战王\t187396\n5级\t187397\n问津\t187398\n嗑药园\t187399\ndoukan\t187400\n张大春\t187401\n吴乃虎\t187402\nlindt家\t187403\n一月十号\t187404\n祸水\t187405\n布勒斯\t187406\n附带\t187407\n钱颖一\t187408\n之谜不开心不开心天天都不开心不开心不开心\t187409\n好吧之战\t187410\n李雨晨\t187411\n吃糖\t187412\n泰迪我的闺蜜\t187413\n杨白劳\t187414\n伦家店\t187415\n沈浩波\t187416\n红军\t187417\nwhjk\t187418\n小芊芊\t187419\nwjdbc\t187420\n猪臭不要脸四蜘蛛\t187421\n利利\t187422\n患难\t187423\n2512320326\t187424\n卜卦\t187425\n傻子信\t187426\n哈哈爸爸棒棒哒哒萌哒哒萌萌达\t187427\nPappalardo\t187428\n拉拉拉拉拉拉拉\t187429\n海石\t187430\n来不说话\t187431\n中秋节\t187432\n四眼\t187433\n1.45亿\t187434\n赵贵廷\t187435\n伊豆\t187436\n找不对\t187437\n脸劲\t187438\n地下水\t187439\n出场费\t187440\n怎吗\t187441\n京津德比\t187442\n黄钻机\t187443\n185cm\t187444\n明光\t187445\n嘟哒\t187446\n太阳红\t187447\n268258268家\t187448\n侦查员\t187449\n164456484464677945346454343464\t187450\n明元\t187451\n滴羊\t187452\n霜行\t187453\n45669\t187454\n144敏聪\t187455\n针板\t187456\n绿轴\t187457\nWHD\t187458\n爱小萌萌哒\t187459\nWHF\t187460\n有些爱\t187461\n黑热闹\t187462\n沙露莎露露\t187463\n你好　
2\t187464\n金豆\t187465\n福鼎市\t187466\n啵啵大宝贝\t187467\n12234\t187468\n改正带\t187469\n很大方\t187470\n12233\t187471\nWHS\t187472\n游戏版\t187473\n良分钟\t187474\n青县\t187475\n宣传片\t187476\n六十帧\t187477\n不倦\t187478\n新版本\t187479\n三氵\t187480\n重置\t187481\n王一先\t187482\nNnsn\t187483\n不值\t187484\n想方设法\t187485\n埃尔德年\t187486\n小学平常考70g\t187487\n普哈哈\t187488\n度秘我叫\t187489\n惹来\t187490\n梅长苏\t187491\n规和王\t187492\n诚认\t187493\n不倒\t187494\n累练\t187495\n中华宗教文化交流协会\t187496\n桥洞\t187497\n宽限\t187498\n皇果\t187499\n七十岁时\t187500\nre131ka@\t187501\n抬腿\t187502\n牌本儿\t187503\n姚雪婷\t187504\n小祁\t187505\n错就\t187506\n最近十年\t187507\n神皇饱和\t187508\n银圆站洋市\t187509\n我是你你是我\t187510\n广药\t187511\n小神\t187512\n监督管理\t187513\n柳叶\t187514\n东方和尊上私奔\t187515\n付家琛\t187516\n计唱\t187517\n鱼腥味\t187518\ncolds\t187519\nN44\t187520\n爱就\t187521\n门哥\t187522\n头脑刮痧\t187523\nplkmnbvcxdfrt\t187524\nshaier\t187525\n峰仓\t187526\n珠海\t187527\n永州市劳教管理委员会\t187528\n臭哥哥们\t187529\nLITTLE\t187530\nbhxjtlhc\t187531\n邓国深\t187532\n大大民\t187533\n纸卷筒\t187534\n雨纷纷\t187535\nk男孩\t187536\n甲勇\t187537\n天津市和平区\t187538\n摸头摸头\t187539\n真中有\t187540\n好姑\t187541\n好姐\t187542\n开县谭氏会\t187543\n睡子\t187544\n4853882\t187545\n然我\t187546\n一声鬼\t187547\n以身相许\t187548\n孙尼\t187549\ngoodbye\t187550\n湃比\t187551\n跑跑步\t187552\nUMD游戏碟\t187553\n邵丽佳\t187554\n大洪水\t187555\n刃呵\t187556\n列害\t187557\ncd11728b4710f82fb7bfc4cec3fdfc032344jpg\t187558\n心病\t187559\nwuu\t187560\n考法\t187561\nwuy\t187562\n列宾\t187563\n网易云\t187564\n心痒\t187565\n不能不南门\t187566\nYmg\t187567\nwud\t187568\n奶奶奶奶奶奶奶奶\t187569\n心痛\t187570\n货选\t187571\n一准\t187572\n心蕊\t187573\nwul\t187574\na级在线\t187575\n杜蜜珍\t187576\n起载\t187577\n米处\t187578\nvzgvgc\t187579\n13145202587\t187580\n88元\t187581\n1286945755454\t187582\n列宁\t187583\nffhzzgsgsgzgzhzz\t187584\n语语\t187585\n怜\t187586\n花卉\t187587\nFAN\t187588\n一几\t187589\n一凡\t187590\n坞Laptop\t187591\n一凤\t187592\n24879\t187593\noreomate\t187594\n好棒好棒度秘\t187595\n安拉安拉\t187596\n强光\t187597\n乐听\t187598\nboundouyou\t187599\nG4837\t187600\n怛\t187601\n拼接\t187602\n川北\t187603\n陈书源\t187604\n思辰\t187605\n直线式\t187606\n废话废话废话废话废
话废话废话废话废话废话\t187607\n江东体育馆\t187608\n唉天天\t187609\n玉乳\t187610\n好啦欧顿东购\t187611\n禅化\t187612\n杜剑\t187613\n50元\t187614\n贤惠\t187615\n50克\t187616\nRicky譚\t187617\n登喜欢你你要我个平方的高铁\t187618\ngfkrobolmco\t187619\n张梦园\t187620\n好不好了我求求你了你\t187621\n不还没到哦有你陪着我想不和没的你中的猪宝贝\t187622\n星战\t187623\n4年内\t187624\n怍\t187625\n中庭\t187626\n呱呱呱呱呱呱呱\t187627\nLG双子座\t187628\n赠给\t187629\n孙文文\t187630\n陈奕迅\t187631\nF49494\t187632\n普惠公司\t187633\n二不了了等\t187634\n恩心\t187635\n麽moda\t187636\n积德\t187637\n七里八里\t187638\n丑惯\t187639\n高文靖\t187640\n恋母情结\t187641\n黄泽锟\t187642\n撞烂\t187643\nOKfund\t187644\n小鱼\t187645\n20040810\t187646\n怵\t187647\n胡锦涛\t187648\n5554408822440005337\t187649\n依照\t187650\n十多年\t187651\nbkjh\t187652\n笑億\t187653\n包吐\t187654\n你好妈妈\t187655\n土星\t187656\n毛淼淼\t187657\n9d82d158ccbf6c81d3ed499cbb3eb13532fa40efjpg\t187658\n怱\t187659\n臣弟\t187660\n包吃\t187661\n一万5100151001万4千三佰\t187662\n回头看了看\t187663\n车窗\t187664\n小三零\t187665\n六八卦\t187666\n夜太美\t187667\n包含\t187668\n子圈\t187669\n剃头匠\t187670\n何艳琳\t187671\n爸爸不要黑呀黑\t187672\n13318747011\t187673\n徐守淙\t187674\n巢\t187675\n来不及了\t187676\n雨打\t187677\nfms\t187678\n罗逸婷\t187679\n雨音\t187680\n总\t187681\n摇摇柏芝\t187682\nhhrzj\t187683\n手袋\t187684\n零二二二\t187685\n079588\t187686\n茜拉\t187687\n好啦你是我的\t187688\ngznj\t187689\n15626585275\t187690\n图形化\t187691\n重点儿\t187692\n吹呗\t187693\n超过\t187694\n上式\t187695\n搓麻\t187696\n踊\t187697\n吹呀\t187698\n保持者\t187699\n春景空\t187700\n踉\t187701\n澜羽\t187702\n怠\t187703\n挽留\t187704\ndigx\t187705\n爱你爱你爱你我\t187706\n乖你\t187707\n安徽广播电视台演播中心\t187708\n场刊\t187709\n有谁谁谁知道\t187710\n11点11分\t187711\n楼宇\t187712\n刘嘉琦\t187713\ndigj\t187714\n3521\t187715\n朱宇跃\t187716\ndige\t187717\n制度\t187718\nbdbdbbx\t187719\n没地\t187720\ngst\t187721\n抱抱团\t187722\n会幸会\t187723\n呆呆呆呆呆呆呆呆呆呆呆呆\t187724\n一块钱一分钟\t187725\ntmeto\t187726\ngsv\t187727\n可乐回\t187728\ngsy\t187729\nl首次\t187730\n大秀行\t187731\n肉蛋\t187732\n工人们\t187733\n漂流江湖\t187734\n上网卡\t187735\n洪金宝\t187736\n跳女\t187737\nvv辦法\t187738\nskks\t187739\n假莫\t187740\n日久生情\t187741\n了不做\t187742\n企鹅舞\t187743\n安南大度秘\t187744\n寄发\t187745\n1241.8亿元\t18774
6\njuftu\t187747\n片撒里\t187748\n于然\t187749\ncrew\t187750\n领导干部\t187751\n市土地整理储备中心\t187752\n太极扇\t187753\n好了先不说\t187754\n恩白血\t187755\n占种\t187756\n山东商报\t187757\n济阳\t187758\n29集\t187759\np5p5\t187760\n洛妹妹\t187761\n辽河\t187762\n200亿\t187763\n我和爸爸\t187764\n世园会\t187765\ngsb\t187766\n邮箱\t187767\n陈凯\t187768\n5553369480941563350\t187769\n韵凝\t187770\n电脑报\t187771\n水果湖\t187772\ncxayujolm\t187773\n湘东区\t187774\ngsg\t187775\n试片\t187776\n998几7千\t187777\n成龙\t187778\n周末了\t187779\nkongjian\t187780\n别太当真\t187781\nOOOOOn\t187782\n利国利民\t187783\n度秘ye\t187784\n目新一云云\t187785\nOOOOOh\t187786\n棚里\t187787\n桕n式\t187788\n尖子生\t187789\n植物大战僵尸西游记\t187790\n生不如死就是你现在给我的就是这样的日子\t187791\n百分之99999\t187792\n贴身服务员\t187793\n播放吧好不好嘛我求你了小秘书\t187794\n小妹妹你好上几年级你几岁你\t187795\n北京家乐福大钟寺店\t187796\n宽恕\t187797\n治疗\t187798\ngsl\t187799\neh8w\t187800\n沈鹏\t187801\n蹦擦\t187802\n胃部\t187803\n踩\t187804\n幸福\t187805\n猜这里开始\t187806\n鬼子\t187807\n王老泡\t187808\n大我也不\t187809\n微商\t187810\n沐浴\t187811\n5111888\t187812\n8814641\t187813\n老家话\t187814\n王敏\t187815\n第几任\t187816\nYouareaboy\t187817\nchiphotosbaiducomxiaodupicitem500fd9f9d72a60592f15eb6e2f34349b033bbab8jpg\t187818\n123岁\t187819\n八千个\t187820\n15585614132\t187821\n理了了\t187822\n20120209\t187823\n小米八号\t187824\n董立花\t187825\nqjz273337777772822188282002222389883\t187826\njptj\t187827\n王朔\t187828\n嗯一嗯\t187829\n哗秘武装蛋蛋\t187830\n百汇\t187831\n他们的孩子\t187832\n處事\t187833\n行仙林我讨厌你\t187834\n杰罗\t187835\n称职\t187836\n考不理\t187837\nlnfinite\t187838\n鲜艳\t187839\n周亚\t187840\n在凡是\t187841\n越苏党\t187842\n长岭乡\t187843\n南叔\t187844\n不法\t187845\n度秘小逃妻\t187846\n额济额\t187847\nhihi佑\t187848\n做老\t187849\n22平米\t187850\n南口\t187851\nfosogo\t187852\nf50\t187853\n游手好闲\t187854\nItsnouse\t187855\n，，\t187856\n科幻迷们\t187857\n更冷\t187858\n586867955356\t187859\n马塔\t187860\n工会大厦\t187861\n田旭\t187862\n睿智\t187863\n想不见了\t187864\n爱你是我的度秘呗\t187865\n孟获\t187866\n最最最最最好的朋友\t187867\n万里无一\t187868\nlissd\t187869\n1989年\t187870\n你当我的秘书\t187871\n上达\t187872\nppbbbher\t187873\n骂声\t187874\n难说\t187875\nvggg\t187876\n亲一下亲一下\t187877\n蜘蛛你是猪你是猪个小猪猪\t187
878\n小凯莉\t187879\n一百圈\t187880\n吗哥\t187881\n贩卖\t187882\nearff\t187883\n床头\t187884\ngfvhhji\t187885\n52a2834349b033b17cecbaf12ce36d3d539bd04jpg\t187886\n杨光\t187887\n十五章\t187888\n崩坏学园2呗\t187889\n杨根生\t187890\n真人照\t187891\n一根根\t187892\n长亲亲\t187893\n自任\t187894\nhuong\t187895\n咯km\t187896\n五进四\t187897\n毕露\t187898\n一对脚\t187899\n度秘丽人在哪里我去找你\t187900\n再不起\t187901\nciow\t187902\n别康桥\t187903\n真的很帅\t187904\n等人家\t187905\n文胸\t187906\n2.75%\t187907\ngzyy66666688888888\t187908\n为平\t187909\n百分\t187910\n见色起意\t187911\n都方\t187912\n全脂\t187913\n7078785554424545224122555\t187914\n值域\t187915\n透视镜\t187916\n赵尚星\t187917\n行呀不信\t187918\n乖别闹\t187919\nuuft\t187920\nlojj\t187921\n挺可爱\t187922\n拜拜吻\t187923\n何不买\t187924\n1932年\t187925\n嗯明天\t187926\n互不\t187927\n迪斯尼限量公仔笔\t187928\n绝食\t187929\n三十和年\t187930\nZT\t187931\n孵蛋\t187932\n白灵欣\t187933\n拜拜吧\t187934\n全经联新晋\t187935\n北外来\t187936\nsukututututututututututututututututututututututututututututut\t187937\n我是大美人你是猪你是丑八怪\t187938\npihf\t187939\n角恋\t187940\nting\t187941\n闫斯琪\t187942\ntina\t187943\n金戈\t187944\ncojoh\t187945\n三秒之内\t187946\n牵牛花\t187947\nikwjdfjdod\t187948\n苏四国\t187949\n起哟米\t187950\n纳斯里\t187951\n泪流\t187952\n正哥\t187953\n老舅\t187954\nlisaliminghui\t187955\n老舍\t187956\n圆梦\t187957\n嗯求求你了密度你最度秘\t187958\n新房昭之镜头\t187959\n沃特规\t187960\n猜我告诉你\t187961\n张文灿\t187962\n哈吧11哦精灵旅社\t187963\n林雨轩\t187964\n石凯\t187965\n李大状\t187966\n正品\t187967\n浙江电视台\t187968\n服务型\t187969\n不是我爱家是不爱下\t187970\n工作吧\t187971\n维修部\t187972\n石凳\t187973\n阿倍\t187974\n恋系\t187975\njfyi\t187976\n主要人物\t187977\n姐妹花\t187978\nseven\t187979\n莫颜\t187980\n飞腿\t187981\n小开\t187982\n留下來\t187983\n百分之九七\t187984\n小弟\t187985\n伪女\t187986\nzhiao\t187987\n九筒\t187988\n80k\t187989\n无可名状\t187990\n解放路\t187991\n吃洗\t187992\n小张\t187993\n80g\t187994\n与世隔绝\t187995\n小强\t187996\n2月16日零点\t187997\nV5\t187998\n杨岳霖\t187999\nV8\t188000\n这样那\t188001\ny喜羊羊\t188002\nfromour\t188003\n真格儿\t188004\n百哥\t188005\n唐我\t188006\n对呀哪种哪了我也不知道我\t188007\nVD\t188008\nVE\t188009\nVF\t188010\n青果\t188011\nVH\t188012\n耶稣\t188013\n心逼\t188014\nVN\t188015\nVO\t
188016\n胆囊管结石\t188017\n很惭愧\t188018\n十柒捌克\t188019\n红条\t188020\nVT\t188021\n准不准\t188022\ncanIuseyen\t188023\n蔚佳磊\t188024\nVX\t188025\n昌西\t188026\n全都会\t188027\nhghvvvbbvvvnvvvbnv\t188028\n悲剧\t188029\nVb\t188030\nVc\t188031\n胜长\t188032\nVf\t188033\nVg\t188034\nVh\t188035\n怪有\t188036\n32个\t188037\nVk\t188038\nVn\t188039\n阿利西\t188040\nVp\t188041\ngfffdfd\t188042\nvvnjngn\t188043\n808\t188044\nVt\t188045\nVu\t188046\nVv\t188047\n803\t188048\n802\t188049\n801\t188050\n800\t188051\n807\t188052\n806\t188053\n亮点\t188054\n一叶子\t188055\nggft\t188056\n亲今天大太阳\t188057\n客船\t188058\n祥\t188059\n别称\t188060\n苑佰宇\t188061\n系咪兔\t188062\n幸福吧\t188063\nggfx\t188064\n绝句\t188065\n要是非\t188066\n人票网\t188067\n信誉\t188068\nggff\t188069\n黄焕榆\t188070\n高兴高兴\t188071\n记叙问\t188072\naloal\t188073\n家言\t188074\nggfj\t188075\n大冒险家\t188076\nc沃克\t188077\n口是心非\t188078\n哟麦\t188079\n郭嘉祎\t188080\n辽中县\t188081\n日假光\t188082\n气节\t188083\nkgb1d1gagj\t188084\n我的妹妹那好吧我的妹妹\t188085\n训康\t188086\n孙永燕\t188087\n真的一样\t188088\n给拍个你的照给我\t188089\n搞好叫\t188090\n2OO4\t188091\n官員\t188092\n艳聪明\t188093\n34fgr\t188094\n任王\t188095\nwsyz\t188096\n吖吖吖吖吖吖勇勇勇\t188097\n好盆友还这样对我\t188098\n刘宇珽\t188099\n就是我梦中\t188100\n懵懵懂懂\t188101\n封妹夫妹夫\t188102\nff22\t188103\nI　hope　we　can　alway　go　
far\t188104\n黄姓\t188105\n为你好你好你好你好\t188106\n神马意\t188107\n盲盲盲\t188108\n二十G\t188109\n13698454658548456595658645\t188110\n电器\t188111\n秒男\t188112\nG11黑2100\t188113\n陈子华\t188114\n淡然如花\t188115\n七八杯\t188116\n七八条\t188117\n2月29日\t188118\nwwwwwwe\t188119\nskdkkdkdw\t188120\n黄礼格\t188121\n春梅\t188122\n半焉\t188123\n交警\t188124\n假释\t188125\n我你在哪我不让你\t188126\n真语病\t188127\n赞助\t188128\n三颗\t188129\nwwwwwww\t188130\n笋\t188131\n堆积\t188132\n66969\t188133\n不用谢你是我的秘书\t188134\n赖赖\t188135\n未开\t188136\n睡年级\t188137\n心领神会\t188138\n原作业\t188139\nrunina\t188140\n祁\t188141\n15194508055470821550056595808306850082552546708\t188142\n仓井空\t188143\n喵喵\t188144\n海角\t188145\n纪实现\t188146\n355685\t188147\n764498499\t188148\n说呀气\t188149\n个人版\t188150\n百再来一个\t188151\n哇啦\t188152\n镍\t188153\n箭牌卫浴\t188154\n8月16号\t188155\n镁\t188156\n4平方尺\t188157\n戴卡西欧\t188158\n镄\t188159\n镇\t188160\n镙\t188161\nbvftu\t188162\n12345678910111213145\t188163\n镜\t188164\n抑制\t188165\n镐\t188166\n度秘六\t188167\n镖\t188168\n题目化\t188169\n镭\t188170\n六合落潭\t188171\n么璐\t188172\n过晚\t188173\nddddddgfd\t188174\n只需\t188175\n哈地理\t188176\n行事\t188177\n见伏\t188178\n一起吧\t188179\n镴\t188180\n長\t188181\n镶\t188182\n我喜欢理\t188183\n绝不可以\t188184\n脚趾\t188185\n宝岛别墅222号\t188186\n11jj\t188187\n组长\t188188\nsuby\t188189\ntfdfdr\t188190\n千倍\t188191\n现身说法\t188192\n动漫节\t188193\n锁石桥\t188194\n芙蓉王\t188195\nhdherbggeddyxefgucioiiiibhijhhgggguybjbjhghhgg\t188196\n亚度糸\t188197\n那年那月\t188198\n王泽宇\t188199\n价廉物美\t188200\n二氧化碳\t188201\n161335\t188202\nInhow\t188203\n尿尿器官\t188204\nabkk7kk\t188205\n黄姚\t188206\n龙腾网\t188207\n很高兴度\t188208\nsyater\t188209\n807个\t188210\n乘务\t188211\nNBC\t188212\n被推翻\t188213\n抱病\t188214\n调剂\t188215\n建湖\t188216\n一个30块\t188217\n奥塔\t188218\n珠江新城\t188219\n礼貌性\t188220\n打一姓\t188221\n德国杯\t188222\neydggfggc\t188223\nNB0\t188224\n黄猫咪\t188225\n背首诗\t188226\n孙明波\t188227\n黄伞\t188228\n万里渤海鱼\t188229\n针法\t188230\n一龙\t188231\n明天早上4点\t188232\n输入法\t188233\n四般要不\t188234\nwouly\t188235\n7：22\t188236\n雪娇\t188237\n投诉你\t188238\n安理会\t188239\n还我\t188240\n差不好\t188241\n求改\t188242\n还战\t188243\
n苏绍祥\t188244\n三节\t188245\n27分\t188246\n学乖\t188247\n我爱你是爱那个女生我叫你的朋友\t188248\n开演\t188249\n女王范儿\t188250\n钟敬媛\t188251\n孙晓方\t188252\n隔音棉\t188253\n拉汉\t188254\n张宇博\t188255\nobutilike\t188256\nxq米\t188257\nngjfkr\t188258\n来着想\t188259\n6584782\t188260\n2014792\t188261\nstmeou\t188262\n魔族\t188263\n学习\t188264\n880元\t188265\n大同小异\t188266\n利马群\t188267\n周俊美\t188268\n中欧\t188269\n刘爽\t188270\n受害人\t188271\n熱情\t188272\n我终于\t188273\n陆潼\t188274\n华泰\t188275\n笑洒家\t188276\n五六页\t188277\n五兆\t188278\n淫诗\t188279\n奥特姆生\t188280\n五元\t188281\n肛裂\t188282\npouter\t188283\n請說人話\t188284\n生产线\t188285\nzksk\t188286\n五克\t188287\n沉入\t188288\n五光\t188289\n红曼\t188290\n黄建宇\t188291\nlOve\t188292\n说的呀\t188293\n督波\t188294\n外媒\t188295\n短款\t188296\nuuhffyihdh\t188297\n妈妈咪\t188298\n拉齐奥啦啦啦啦啦啦啦啦啦啦啦啦裤裙款款\t188299\n勇敢爱\t188300\n五公\t188301\n13431008853\t188302\n五八\t188303\n秦风毒萝\t188304\n439.725\t188305\n坦克\t188306\n太惊艳太惊艳了反之\t188307\n忍住\t188308\n茶松\t188309\n玩死\t188310\n走走火入魔\t188311\n电视背景墙墙壁纸\t188312\n甘识\t188313\n茶杯\t188314\n瑞卡\t188315\n1916年\t188316\nxidk\t188317\n士人\t188318\n杜牧杜\t188319\n557585685585\t188320\n体踏踏7\t188321\n莎娃\t188322\n小红星\t188323\n油烟\t188324\n维拉\t188325\n西现代城\t188326\n深圳机场\t188327\n0\t188328\n亲历\t188329\n杂毛\t188330\niyou\t188331\n走啦\t188332\n刘青志\t188333\n袁君\t188334\n新装\t188335\n2818639750\t188336\n9月下旬\t188337\n暑年级\t188338\n不來\t188339\n保本基金\t188340\nkecu\t188341\nXxxx\t188342\n麻辣香锅\t188343\n风尘\t188344\n风尚\t188345\n你好帅好帅\t188346\n一步步\t188347\n耗水\t188348\n十几级\t188349\nchine\t188350\n电脑蓝屏\t188351\n救字\t188352\n家和\t188353\n你说的对真是好真是好的女孩好少\t188354\n不侠\t188355\n丽斐\t188356\n风云座\t188357\n杨子涵\t188358\n恒安生活馆\t188359\n二四幺九\t188360\n译版\t188361\n敢不说\t188362\n你是大夫不觉晓\t188363\n丽质天生\t188364\n神神神\t188365\n雪狼湖\t188366\n纪英\t188367\n佛果\t188368\n重接\t188369\n能动性\t188370\n死丐帮\t188371\ndUer\t188372\n旱涝\t188373\n深流\t188374\n那线\t188375\n原之空那一先\t188376\nching\t188377\n错错措错错\t188378\n仁寿\t188379\n福尔康\t188380\n130兆\t188381\n中天\t188382\n253425625954\t188383\n追怀\t188384\n115088585589008599986856\t188385\n最好的人\t188386\n机器朋友\t188387\n怕儿
\t188388\nUhbashing\t188389\n散伙饭\t188390\n净水\t188391\n五零二\t188392\n演场\t188393\nForever\t188394\nAQ秽\t188395\njshgff\t188396\n等于我知道\t188397\n好哪你的佳作那你给我\t188398\n能不敷衍\t188399\nMAKING\t188400\n张晨露\t188401\n学猫叫\t188402\nbccvv\t188403\nnbhgjjybjvfnvgrgggmjjgjjggfcghgjhhhhh\t188404\n实地\t188405\n难保\t188406\n欧拉\t188407\n我就喜欢\t188408\n蓝肉\t188409\n实在\t188410\n1214862486652\t188411\n30刀\t188412\n30分\t188413\n2704\t188414\n简政\t188415\n张艳艳\t188416\n跌给\t188417\nщ▽щ\t188418\ng65高速\t188419\n丁看\t188420\n14578967\t188421\n徐会期\t188422\n小圈子\t188423\n跑马\t188424\n我要芭比娃娃你面\t188425\n下意识\t188426\n别尼玛\t188427\n热冰箱\t188428\n谢诸\t188429\n0.40亿元\t188430\n莲姐\t188431\nsdddddrsawwffgg\t188432\n以色列\t188433\n拓珂\t188434\n咖啡机\t188435\n2075763359\t188436\n王珊\t188437\nhglf\t188438\n猪度猪\t188439\n一曲\t188440\n妥当\t188441\nahhh\t188442\n樱荨\t188443\n三光者\t188444\n不丹地\t188445\nhglh\t188446\n克山脉\t188447\n两百四十六章\t188448\n吃嫩草\t188449\n珍奇味\t188450\n五點\t188451\n1426.90\t188452\n张梦琴\t188453\n菲茨杰拉德\t188454\nhych\t188455\n洗牙吧7k7k\t188456\nhycj\t188457\nt8tug9\t188458\n氙气\t188459\n先锋\t188460\n学困儿\t188461\n噙\t188462\nixfsofogxcig\t188463\n我讨厌你我讨厌你我好讨厌你\t188464\n狼族瓦特\t188465\n白丝\t188466\n小磨子\t188467\n明年四月份\t188468\n第八\t188469\n维吾尔\t188470\n白不\t188471\nl124\t188472\n白丁\t188473\nmaking\t188474\n野食\t188475\n北京数字电影学院\t188476\ncjmh\t188477\n朝比\t188478\n56656\t188479\n噷歲\t188480\n品茶\t188481\n这样秘\t188482\n生灵涂炭\t188483\n681所\t188484\n高思彤\t188485\nfhffgfrdd\t188486\n向雏田\t188487\n影视片\t188488\n水资源\t188489\n颜扣\t188490\n128438\t188491\n垂线\t188492\n加长\t188493\nKliagjgajj\t188494\n豪杰\t188495\n偷天密码\t188496\n火牙\t188497\nforeach\t188498\n分秒\t188499\n颜家村\t188500\n盘前\t188501\n作文题\t188502\n张利民\t188503\n包抄\t188504\n朱红\t188505\n睡不着觉\t188506\n七时\t188507\n不酒店铺\t188508\n那你我原谅你但是你爱姐我看小魔仙\t188509\nwfte\t188510\n咪咪咪咪\t188511\n特北\t188512\n跑偏题\t188513\n我是男的我爱努你\t188514\n劳驾\t188515\n晚节\t188516\nCERN\t188517\n馅子\t188518\n翻白\t188519\n呵泥煤\t188520\n人生才\t188521\n国务卿\t188522\n800800\t188523\n融资难\t188524\n焊接\t188525\n89468658888265949464594656594954646462656594
9465556564646464\t188526\n摩奥\t188527\n最最最最好的朋友\t188528\n1008号\t188529\n扬铸\t188530\nyoveritrollelih\t188531\n嗯772\t188532\n蒋小芳\t188533\n悬空\t188534\n甩手化\t188535\n馆内\t188536\n12点27\t188537\n12点25\t188538\n4plu一\t188539\n秘宝宝\t188540\n枪毙\t188541\n越来不\t188542\n椅子\t188543\nFTIS\t188544\ninteresting\t188545\n吴如茵\t188546\n74188\t188547\n幸福狗\t188548\n日本临济宗妙心寺派灵云院\t188549\n咸安区区委\t188550\n王青青\t188551\n崔子振\t188552\n四ｇ\t188553\n20050204\t188554\n伉美浣\t188555\n3547852448866257889\t188556\n旋流雨\t188557\n二十零四\t188558\n6月12日上午10点\t188559\nNO.12\t188560\nElf\t188561\n桑巴舞\t188562\n特区\t188563\ntrdf276\t188564\n臭骗纸\t188565\n死贵龙\t188566\nhdnshxHsh\t188567\n此男\t188568\n寒假吧\t188569\n一百多米\t188570\n泯恩仇\t188571\n有惊无险\t188572\n瞧一瞧\t188573\n物美\t188574\n身影\t188575\n指证\t188576\n。by\t188577\n蔡校长\t188578\n千千万万岁\t188579\n001多\t188580\nbxama\t188581\n1一1\t188582\n1一5\t188583\n九赢\t188584\n不vuv\t188585\n雀巢咖啡\t188586\n以少胜\t188587\n两个愿望\t188588\n3033821591\t188589\n选词\t188590\nBoys\t188591\n王终于\t188592\n勾火打架\t188593\n凤斗\t188594\n甲梅\t188595\n下关依山\t188596\n无方枉\t188597\n评委们\t188598\nqqpp2法\t188599\n水货车\t188600\n我的汹\t188601\n我是谁对你好\t188602\n哈哈笑哈哈笑哈\t188603\n什么了我求求你你\t188604\n太稀少\t188605\n五慢慢\t188606\nVia速溶咖啡\t188607\n豪哥\t188608\nBC3分\t188609\n龚诺娃\t188610\n第二部\t188611\n选课\t188612\n小卫士记载表\t188613\n方姐姐\t188614\n飞不起来\t188615\n李艾希\t188616\n宋泽\t188617\n看该\t188618\n雪图\t188619\n硬壳\t188620\n为甚麽\t188621\n原形\t188622\n不含税\t188623\n14441\t188624\n14444\t188625\n你好主人\t188626\nARJ-21\t188627\n合配\t188628\n小84\t188629\n风驰电掣\t188630\n主家人\t188631\nSBSBSBSBSBSBSBSBSBSB\t188632\n全身\t188633\n小跟班\t188634\n声红\t188635\n跪斌\t188636\n信子\t188637\n第一艘\t188638\n深交所\t188639\n来了宠\t188640\n赵元成\t188641\n见啦\t188642\n卢比奥\t188643\n大海边\t188644\n一百万一百万块\t188645\n灌肠\t188646\n来了宵\t188647\n玉米们\t188648\n47220218589509410059548800\t188649\n比亚干\t188650\n嫁鸡随鸡\t188651\n北京\t188652\n有辜\t188653\n乌镇\t188654\n六ｐｉ\t188655\n相管\t188656\n3月5日零点\t188657\n罗学妮\t188658\n5666\t188659\n育才\t188660\n89033\t188661\n北人\t188662\n来了你可以\t188663\n放不顾身\t188664\n大爱慈大禁曲\t188665\n赛尔尔号
\t188666\n贪钱\t188667\n北二\t188668\n我可以说\t188669\n中小说\t188670\n北五\t188671\n关秀\t188672\n吊白块\t188673\n天猫ok\t188674\n月工\t188675\n西德\t188676\n夜莺\t188677\nmeme\t188678\n人之幼\t188679\nmema\t188680\nut4\t188681\n土家出\t188682\n行且珍惜\t188683\n566n\t188684\n在一起说\t188685\n红飘\t188686\n波力\t188687\nwhathehe\t188688\n木船\t188689\n警服\t188690\n酷派\t188691\n美国时间\t188692\n邂逅\t188693\n17888958958988558555\t188694\n截图书\t188695\n无量\t188696\ngfjfkydcnncy\t188697\n真可恶\t188698\n临危不惧\t188699\n调离\t188700\n卢雨\t188701\n那我问你几个啊上了我问你个巴拉拉小魔仙\t188702\n九百双\t188703\n亵渎佛\t188704\nchggcndg\t188705\n深线\t188706\n寒意\t188707\n张跟\t188708\n超时空要塞F女主唱中岛爱的星间飞行\t188709\n2007年9月2日\t188710\n微软小冰\t188711\n对立面\t188712\ngnon\t188713\n简版\t188714\n西工\t188715\nhghyc\t188716\n咪q度秘\t188717\n求你了行\t188718\n笑话当当面\t188719\n软实力\t188720\n接纳\t188721\nimatlou\t188722\nxaruhan\t188723\n种鸽\t188724\n四三大\t188725\n余思妮\t188726\n张路\t188727\n那一般就是你的电影小达人\t188728\n柱砼\t188729\n秘啵\t188730\n王主厨\t188731\n自悲\t188732\n老版\t188733\n一丝\t188734\n一丛\t188735\n韩亚飞\t188736\n六周\t188737\n一世\t188738\n窝爸\t188739\n大谦哥\t188740\n热舞\t188741\n投石车\t188742\n一丑\t188743\n呗尔\t188744\n回到家\t188745\n一不\t188746\n一下\t188747\n一丈\t188748\n一三\t188749\nhttpimagebaiducomsearchwisealatnwisealaieutf8word%E8A681E4%B88DE68891E7BB99E4%BDA0E58F91E4%B880E4BA9BE5898DE7984E7857E78987\t188750\n一万\t188751\nbeth\t188752\n52215584\t188753\n一七\t188754\n一一\t188755\n比烂\t188756\n开刀\t188757\n快猪猪侠还有光头强还有熊出没还小魔仙\t188758\n出借\t188759\n惊讶\t188760\n不庸人自扰\t188761\n于杰\t188762\n一串\t188763\n格式\t188764\n平安夜快乐牡丹\t188765\n一个\t188766\n一丨\t188767\n能比\t188768\n一两\t188769\n开创\t188770\nwuyield\t188771\n马尼拉街\t188772\n鹿晗红\t188773\n快一龙\t188774\n性价比\t188775\njqqql\t188776\n打非\t188777\n角角\t188778\n犒劳犒劳\t188779\n后劲\t188780\n哭的像小孩\t188781\n余地\t188782\n绝世我把头\t188783\n道闸\t188784\n警示板\t188785\n白實\t188786\n苍茫的天涯是我的爱情\t188787\n打靶\t188788\n吃想\t188789\n类似句\t188790\n我爱你我想你\t188791\n点卓亚\t188792\n码畜\t188793\n听听说\t188794\n一票难得\t188795\ncgfxc\t188796\nfkheioffhio\t188797\n龚雪莲\t188798\n先行\t188799\n龙翔广场\t188800\n6466666\t188801\n许红生\t1888
02\nffds\t188803\n张志豪\t188804\n巴尔扎特\t188805\n偶遇\t188806\n缠绵不醒\t188807\n其尔\t188808\nbunengziyou\t188809\n熊样行\t188810\n杨星月\t188811\n灶位\t188812\n本溪检察院\t188813\n刨腹\t188814\n锁定\t188815\n乖乖的歌\t188816\n寻龙诀\t188817\n黑熊\t188818\nvkc\t188819\n跑大\t188820\nfdg\t188821\n18：30\t188822\n许愿\t188823\n案发现场\t188824\n软妹子也\t188825\n一到底会不会\t188826\n刘exo\t188827\nb5b5vv\t188828\n逢低\t188829\n书城\t188830\n袁凡杰\t188831\n哈小小\t188832\n薇派\t188833\n爱就爱呗\t188834\n错啦你是我的\t188835\n苹果六\t188836\n6ulaudem\t188837\n天兔\t188838\n被盗用\t188839\n爱液\t188840\n皮亚\t188841\ncyy\t188842\n赵权\t188843\n乐平水帘镇\t188844\n矢田\t188845\nhttpsmbaiducomfrom1003438rpusz4013204802Ccuid400avP8uQH8Xava8gavB8u3Hi0piBfggi2efY8nHiiz828vguvz8a628AN37qOC2Ccua40PvjhjatvhIDJEjPkJAiCCVvCxVGNqqC2Ccut40p8WK9pa9285jI2IJhvUhgOYDRMzyqqSB2Cosname40baiduboxapp2Cctv4022Ccfrom401013838c2Ccen40cuidcuacut2Ccsrc40appmainboxvoice2Cvmgdb400020100228ystnzbiosword%E889B2E7B3BBE5869BE59BA2E982AAE681B6E6BCABE794BBE5B091E55B3sacr2\t188846\n天元\t188847\n吖吖吖\t188848\n夏木原\t188849\n天光\t188850\n5点16点\t188851\n长柄\t188852\n雷州九九\t188853\n观照\t188854\n黄山书社\t188855\n爱涛\t188856\n天兵\t188857\n阉割\t188858\n天养\t188859\n爱憎分明\t188860\n南关学校\t188861\n声卡\t188862\n人理\t188863\n央视\t188864\n神国\t188865\n神图\t188866\n天公\t188867\n1427589063\t188868\n连轴\t188869\n办公室\t188870\n合适率\t188871\n拍完后\t188872\n喵喵喵喵喵喵喵喵喵喵喵喵喵喵喵\t188873\n连载\t188874\n我不要你了你没礼貌\t188875\n樊剑英\t188876\n身体素质\t188877\n跳楼能\t188878\n胸围\t188879\n河北总队武警六支队\t188880\n卢广仲\t188881\nplkiihhnvfedvn\t188882\n老年机\t188883\n咪咕\t188884\n咪咪\t188885\n艾玉荷\t188886\nNBA啦啦队\t188887\n700宗\t188888\n励志片\t188889\n温欣\t188890\n唉了我讨厌\t188891\n要卡\t188892\n烤焦\t188893\n七哥\t188894\nw2016\t188895\n萌哒\t188896\n顾盼\t188897\n南极\t188898\n张宝利氏\t188899\n时乖\t188900\n回别\t188901\n胡萝卜素\t188902\n扭曲\t188903\n回到\t188904\n蒙古词\t188905\n10周年\t188906\n美女主人\t188907\njajcgbgb\t188908\n集训营\t188909\n逆态度\t188910\n打屎\t188911\nsdsrey\t188912\ncoc4\t188913\n度秘我爱你\t188914\nadidasfussball\t188915\n扬眉\t188916\n2b吗三包s\t188917\n110k\t188918\n打屁\t188919\n麦芙条\t188920\n步兵\t188921\n16.4\t18892
2\n菱湖\t188923\n403\t188924\n类样\t188925\n三复何三\t188926\n而后者\t188927\n132273333398\t188928\n六系\t188929\nbxbcvb\t188930\n回报率\t188931\n大杀手\t188932\n告诉我的决定\t188933\n晚好\t188934\n皿╯з╰O皿\t188935\n城西路雅\t188936\n滚滚红尘\t188937\n我爱你我爱你我想你\t188938\n我喜欢类型\t188939\n偷偷亲\t188940\n令美匈\t188941\n杰亮\t188942\n13313375815\t188943\n奥额\t188944\n算了死\t188945\n骑行\t188946\n很漂\t188947\n1107\t188948\n1102\t188949\n844268\t188950\n1100\t188951\n1101\t188952\n烟消云散\t188953\n村通\t188954\ndison\t188955\n7777777771个\t188956\n1108\t188957\n按岁\t188958\nhvghh\t188959\nrosimm\t188960\n胡汉三黑魆魆忽上忽下还行哈\t188961\n阿芙\t188962\n天马\t188963\n9天\t188964\n陶勒\t188965\n兰德里\t188966\n滤水器\t188967\n五角星\t188968\n玛黛拉\t188969\nFactory\t188970\n那你的事情吧依百顺百依百顺\t188971\n84945542\t188972\n户籍\t188973\n人民群众\t188974\nreallahi\t188975\n依伽嗯\t188976\n状似\t188977\n巨蟹座\t188978\n九七分\t188979\n狗猫兔\t188980\n阿花\t188981\n柠檬小军\t188982\n伍芳\t188983\nMobile\t188984\n紫陌\t188985\n阿芬\t188986\n谢雨顺\t188987\n早上6点\t188988\nCoyote\t188989\n喔等你\t188990\n8日下午\t188991\n盘踞\t188992\n找你好\t188993\n秦顺\t188994\n33000韩元\t188995\n谢因为\t188996\n文提文\t188997\nfhrggryytfhtfhjrfhuggjhghhggggggguhghgtgghyvgjcf\t188998\n撮\t188999\n青春偶像\t189000\n出秘\t189001\n十万吨\t189002\nRHIS\t189003\n酉哥\t189004\n服装机械\t189005\n十余年\t189006\n第200枚\t189007\nhgfgjhg\t189008\n寒粉\t189009\n单真\t189010\n乡野\t189011\n787岁\t189012\n干锅鸭头\t189013\n五好不好\t189014\n赵松\t189015\n不求甚解\t189016\n刘波呱\t189017\n别致\t189018\n56328\t189019\ncggkky4fuhTeyfvihbpugzhckh\t189020\n喷身\t189021\n汉进\t189022\n盲童们\t189023\n盐粒\t189024\n严林美琪\t189025\n熊哇\t189026\nhcchj\t189027\nnnnnnnoOBPDn\t189028\n敢和\t189029\n迅捷\t189030\nghiphotosbaiducomxiaodupicitem8601a18b87d6277f08a47e4d2f381f30e924fc15jpg\t189031\n左后\t189032\n56公斤级\t189033\nosbo\t189034\n信用\t189035\n蝶湖湾\t189036\n17575885000457680\t189037\n一百一千个\t189038\n修子\t189039\n冬蜜炒蛋\t189040\n熊哥\t189041\n投错胎\t189042\n瘙痒\t189043\n撼\t189044\n酷爱谷\t189045\n6359元\t189046\n风雨散\t189047\n好像处\t189048\n星期6\t189049\n星期1\t189050\nk.r.y\t189051\n星期3\t189052\n撂\t189053\n杨有发\t189054\n在前\t189055\n终於\t189056\n布克书城\
t189057\n平篇\t189058\n砂糖\t189059\n泰赛娅\t189060\n多僧\t189061\n叶炜\t189062\n你以为你\t189063\n糸无\t189064\n虎林\t189065\n育场\t189066\n换来\t189067\n酥菜汤\t189068\n玄武\t189069\n十一块\t189070\n十分关心\t189071\n蒙古航空公司\t189072\nDSCd\t189073\n救国\t189074\n张佳美\t189075\n不轻\t189076\n艾网\t189077\n欢乐\t189078\n手环\t189079\n65今\t189080\n博托龙\t189081\nGFAGA\t189082\n运动队\t189083\n74万条\t189084\n张雨琦\t189085\n机器儿\t189086\nyghju\t189087\n9493\t189088\n9494\t189089\n绿皮\t189090\n9497\t189091\njhdjhshdh\t189092\n新亮点\t189093\n金珉锡\t189094\n风蝶\t189095\n神探夏洛克n\t189096\nvbbbb\t189097\n首发式\t189098\nphjgjlaj\t189099\n3008\t189100\n倒出\t189101\n52153\t189102\n52156\t189103\n椽子\t189104\n六百怪\t189105\n晨家世\t189106\n生命线\t189107\n锕丽莎\t189108\n53200\t189109\n思乡\t189110\n杜密\t189111\n鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅\t189112\nAM\t189113\n六小龄\t189114\n行行行我\t189115\n处理厂\t189116\n尹佩瑶\t189117\n挺酷跑\t189118\n然又多\t189119\n一小时\t189120\n真的假不了假的真不了真是你\t189121\n出门\t189122\n反反复复方法反反复复发发发发发发发发发发发发发发发\t189123\n五百张\t189124\n小恒恒\t189125\n记着\t189126\n最好的朋友\t189127\nfu8tfulx6y\t189128\n不理你了我讨厌你我恨你\t189129\n思乐\t189130\n曲名\t189131\n何方神物\t189132\n有量\t189133\n一四零零万\t189134\n有重\t189135\n郭子凡\t189136\n士焉\t189137\n偶贤\t189138\n写明诉\t189139\n侠大怪兽我一杆四小\t189140\n一月二十四号\t189141\n虎弟吖\t189142\n巨债\t189143\n隊伍\t189144\ndchhj\t189145\nfrdyf\t189146\n160余\t189147\n对呀处\t189148\n千余份\t189149\n咪咪咪咪撸\t189150\n何禹萱\t189151\n听不是\t189152\n中国少先锋队\t189153\nvxVe\t189154\n星期四星期五星期六\t189155\n大鬼\t189156\n定州\t189157\n玩答\t189158\n卓群燕\t189159\n管样\t189160\n真的假的真的假的\t189161\n朋友的朋友\t189162\n崔顿\t189163\nAa\t189164\n秦武强\t189165\n马卢达\t189166\n龚浩明\t189167\n哈哪\t189168\nELF们\t189169\n4点52\t189170\n关吉\t189171\nhjbm\t189172\n制服男\t189173\n13848968387\t189174\n高露洁\t189175\nccz\t189176\n桃秘网1crafalyoutleymayou\t189177\n瞎狗\t189178\n吗度米\t189179\n素圆\t189180\n吴同学\t189181\n杨佳新\t189182\n描写\t189183\n睡眼惺忪\t189184\n来吧来吧\t189185\n膈膜\t189186\nyzowerwbv不撸mcvvv\t189187\n戴什\t189188\n最合算\t189189\n重别\t189190\n巴拉拉巴拉拉巴拉巴拉巴拉\t189191\n春色\t189192\nAaajjJJYY\t189193\n汕头第一城\t189194\n晃子\t189195\n拉开始\t189196\nvfhhfeghig2th\t189197\n康乾\t1891
98\nSfghchkxhkgjxgjzfbkcjjgghiphjjklljghjjjjjkk\t189199\n念佛\t189200\n前阵子\t189201\n资本回报率\t189202\nhewoshengqile\t189203\n康乐\t189204\n熊成\t189205\n墨子修身\t189206\n天马行空般\t189207\n一二三四五六七八九十一万\t189208\n一妙钟\t189209\nG罩杯\t189210\nafetn\t189211\n李四\t189212\n鲤鱼冬瓜\t189213\nU盾\t189214\n富昌迹\t189215\netfyg\t189216\n0一V一0\t189217\nAv\t189218\n猪小姐\t189219\n看错\t189220\n10下\t189221\nshagscaves\t189222\n程悠悠\t189223\nSB吗度秘\t189224\n蓝红太狼\t189225\n裴昕彤\t189226\n癌细菌\t189227\n两包面\t189228\n10万\t189229\n小题大作\t189230\nU盘\t189231\n李国\t189232\n防游\t189233\n小稻秧\t189234\n中里巴人\t189235\n蒸熟\t189236\n咯了磨\t189237\n10个\t189238\n炎喜\t189239\n残疾者\t189240\n泰勒\t189241\n饭饭饭饭\t189242\n打理\t189243\n试车\t189244\n打球\t189245\n夜出去\t189246\n甘谷\t189247\n张育军\t189248\n钱包子\t189249\n啦美\t189250\n文科课\t189251\n一点点点点点\t189252\n周密性\t189253\n鲜格格\t189254\n颜狗\t189255\n滴滴滴滴答\t189256\n猪黑\t189257\n3千万\t189258\n木己乱\t189259\n抱臂\t189260\n直达号\t189261\nbtjddnmkdjiithbfuur6666666839748tu7jemc\t189262\n强识\t189263\n原田喜胶囊\t189264\n142853687\t189265\nrgvxvvc\t189266\n有失有得\t189267\n阴魂阵\t189268\n还是女的我是女的我真的不骗你我骗你\t189269\n敷料\t189270\n风行网\t189271\n嫁接\t189272\n弱势力度\t189273\n一千分钟\t189274\n夜夜夜\t189275\nr18\t189276\n董行佶\t189277\n黄皇\t189278\ndvcgt\t189279\n20eoom\t189280\n哈大\t189281\n奥汀\t189282\n代课\t189283\n催化剂\t189284\n干调\t189285\n可以想象\t189286\n意林\t189287\nGif\t189288\n代词\t189289\n吃亏\t189290\n点石\t189291\n吾友\t189292\n慢头\t189293\njhijk\t189294\n罩智能机器人\t189295\n亲亲亲亲亲嘣嘣嘣嘣嘣天天劲劲劲\t189296\n2013005034\t189297\n十二月份\t189298\n慢大\t189299\n大喜特喜\t189300\n药液\t189301\n烂柯\t189302\n返修率\t189303\n去吧娃娃\t189304\n普通\t189305\n若祁\t189306\n我是你的偶像的我太可爱我\t189307\nfhfgffsdrcc545452212\t189308\n四晶品\t189309\n乌骨仙\t189310\n口岸\t189311\nglajg\t189312\n啦啦啦啦啦啦的小呀小飞机\t189313\n0点5x3点\t189314\n告错\t189315\n农化\t189316\n死亡管\t189317\ngdyy\t189318\n我可爱的人\t189319\n长登\t189320\n王靓男\t189321\n别们\t189322\n六星守护龙\t189323\n横盘\t189324\n颈椎\t189325\n基本原则\t189326\n米若斯\t189327\n全期\t189328\npolll\t189329\n朵们\t189330\n淮安\t189331\n嗯真牛\t189332\n九十秒多列\t189333\nxxx69\t189334\n洛镇\t189335\n张恐龙\t189336\n锐雯\t189337\n李妙
杰\t189338\n唉不理你了讨厌鬼烦人\t189339\n代步\t189340\n考费\t189341\nJeugisu\t189342\n７００分\t189343\n小狗\t189344\n龙子航\t189345\n想象\t189346\n4296764182\t189347\n黑蛇\t189348\n爱们\t189349\nnsbshsj\t189350\n库耳曼\t189351\n几载\t189352\n一个14g\t189353\n小数点\t189354\n7月12日\t189355\n黑蛤\t189356\n007006\t189357\n几轮\t189358\n天然林主伐\t189359\n戾气\t189360\n叶曦\t189361\n兰萌\t189362\n闵行分局虹桥所\t189363\n相生\t189364\n轻轨\t189365\n2011年6月5日\t189366\n卧室个怪吃的惹\t189367\n二月一\t189368\n爱过你好\t189369\n阿斯顿·马丁\t189370\n踌躇\t189371\n6686588869\t189372\n阿喵\t189373\n轻轻\t189374\n张晓楠\t189375\n两周岁\t189376\n洋驰\t189377\n截瘫\t189378\n翔宇\t189379\n39999\t189380\nQ弹\t189381\n纠纷案\t189382\n犯罪心理学\t189383\n全月\t189384\n词型\t189385\n安装包\t189386\n二月中\t189387\n搞不上班\t189388\n维尼饭们\t189389\n雾源\t189390\n小资化\t189391\n盛誉\t189392\n屁颠儿\t189393\n秘都\t189394\n横县\t189395\n莲儿\t189396\n乐成\t189397\n1368855588560\t189398\n一月底\t189399\n︵n\t189400\n２３日\t189401\n白褂子\t189402\n｀ノ从\t189403\n折磨\t189404\n冤不冤\t189405\n吴嘉欣\t189406\n弯拜\t189407\n尹思源\t189408\n事过\t189409\n隋悠然\t189410\n杀人刀\t189411\n红蚂蚁\t189412\n要不能\t189413\nguahf\t189414\natlast\t189415\n很温暖\t189416\n暴晒\t189417\n我爱爱爱爱爱你\t189418\n不曾湾\t189419\n春秋时期\t189420\n僵尸大火\t189421\nSharapova\t189422\n奏糸\t189423\n第十七年\t189424\n太虚\t189425\n巨蛋\t189426\n宫保\t189427\n扁王源\t189428\n唉气\t189429\n肖松\t189430\n怨爹\t189431\n上色\t189432\nsgjgj\t189433\n野炊\t189434\n余坊小学\t189435\n第二年\t189436\n刘策略\t189437\n拉拉拉拉拉拉拉拉\t189438\n余角\t189439\nbd平分角abcbe分角abc\t189440\n秋秋秋\t189441\n找不着了吧\t189442\n泾渭分明\t189443\n居比\t189444\n225588889\t189445\n耶也\t189446\n张含\t189447\n勾魂\t189448\n13737639341\t189449\ncoming\t189450\n中华人民共和国未成年人保护法保护法\t189451\n森女们\t189452\nm餐\t189453\n淘治\t189454\n爬上去\t189455\n奥特曼斯特\t189456\n给我点解都\t189457\n亲我要是哪个我是世界连续剧\t189458\ndjfjfjcjnhnjxffhhxhcgnfncjh\t189459\n小东秘\t189460\n锐志\t189461\n喝腻\t189462\n對討厭\t189463\n关于中漫公主殿下\t189464\n1世纪\t189465\n电视机顶盒\t189466\n范武\t189467\n东宫子彻\t189468\n一个一个一个\t189469\n大姨子\t189470\n折子戏\t189471\n价值观念\t189472\n15979463555\t189473\n折圆通速递\t189474\n众官\t189475\n被捅\t189476\n蓬山\t189477\nabzc\t189478\n2011年２月７日中午\t189479
\n王云云\t189480\n洪春光\t189481\n祥云\t189482\n哈百\t189483\n弑逆\t189484\n花儿朵\t189485\n成年十五年了我\t189486\n现在不理哪不理你了我走了\t189487\n主妇\t189488\n粉飞行\t189489\n5581\t189490\n度度猪\t189491\nduxv\t189492\n想不透\t189493\n先极限\t189494\n姜善宇\t189495\nElijah\t189496\n开元市\t189497\n想不通\t189498\nhgvvfr\t189499\n流落\t189500\n慢性肾炎\t189501\n上海教育考试院\t189502\n长鸿\t189503\n加油卡\t189504\n无害女\t189505\n鸟哥\t189506\n天网吧\t189507\nccu\t189508\n税服\t189509\n三103\t189510\n好去处\t189511\nDietzus\t189512\n第3种\t189513\n脑膜\t189514\nbijlsj\t189515\n条款\t189516\n美国银行\t189517\n停板\t189518\n流沙\t189519\nTEBYO\t189520\n神道\t189521\nkissme\t189522\n美欧\t189523\n复杂\t189524\n笨猪猪猪猪猪猪\t189525\n上大学\t189526\n再见一面\t189527\n无条件\t189528\n九小游\t189529\n害我叫\t189530\ngddgcd\t189531\njdmd\t189532\n153666\t189533\n17099246645\t189534\n人窝\t189535\n62千米\t189536\n嘟嘟芭比\t189537\n反客为主\t189538\nDgap\t189539\n无综\t189540\n忍找\t189541\n相映成趣\t189542\njdmw\t189543\ngwod\t189544\n上不去\t189545\n兮山\t189546\n伸向\t189547\n可可十三个头\t189548\n度米\t189549\n蔓朵\t189550\n妈咪妈咪我要妈咪妈咪我要妈咪\t189551\n韦参军\t189552\nVvhgvggh\t189553\n好吧游戏\t189554\n雷v你\t189555\n太不像\t189556\n形各色\t189557\n1.6升\t189558\n易学\t189559\n索力阳\t189560\nHHhG\t189561\n土地证\t189562\nmmx\t189563\nmmz\t189564\nmmu\t189565\nmmw\t189566\n本周六\t189567\ngcdreerrd\t189568\nmmm\t189569\n装病\t189570\n波龟\t189571\nmmd\t189572\n明洞\t189573\n大良\t189574\n卸载片\t189575\n哈拉斯\t189576\n安插\t189577\n23点半\t189578\n59公斤\t189579\n较劲\t189580\n绿叶\t189581\nmmH\t189582\n王缉思\t189583\n话说的是你好吗啡丨幽𠃌神〇神朱哈乛冂乜\t189584\n芒种\t189585\n欧尼狗\t189586\n装饰画\t189587\n薄靳言\t189588\nWhyyouso\t189589\n药鼎\t189590\n槐安\t189591\n春妮\t189592\n吗片\t189593\n爱奇点\t189594\n李宇琛\t189595\nvalentine\t189596\n北达科\t189597\n236644270\t189598\n轻松熊\t189599\n取景\t189600\n李王刚\t189601\n开浪\t189602\nVello\t189603\n18911693090\t189604\n雅比4tme55678\t189605\n棋牌\t189606\n变坏\t189607\n春如\t189608\n肥猪肥猪\t189609\n刀具\t189610\n后退\t189611\n心计学故事\t189612\n武商\t189613\n卢紫腾\t189614\n许玄\t189615\n古德奈特\t189616\n新石器烤肉\t189617\n无法割舍\t189618\n辛酸\t189619\n冰欺凌\t189620\n不懂不懂不懂不懂不懂不懂不懂不懂不懂不懂\t189621\n草沼马\t189622\n低温\t189623\n
67MB\t189624\nccmML\t189625\n旧酒\t189626\n欣妹\t189627\n紫皮蒜\t189628\n战神队\t189629\n撞墙\t189630\n猕猴\t189631\n唐嘉欣\t189632\n赤木刚宪\t189633\n来人心\t189634\njdddd\t189635\n48258\t189636\nSOFTLY\t189637\njddds\t189638\n臂怠\t189639\n绕城高速公路\t189640\n心改\t189641\nsare\t189642\n53.3亿元\t189643\n心心相印\t189644\n抽奖\t189645\n呀鬼在\t189646\n周六一\t189647\n待产\t189648\n邹星维\t189649\n要不看我\t189650\n程亚林\t189651\n卷席\t189652\n青团\t189653\n天门\t189654\nKRIS\t189655\n写给你写给我喜欢火法个\t189656\n60本\t189657\n韩流\t189658\n不我不度\t189659\n25册\t189660\n理所当然\t189661\n冻豆腐\t189662\nghjjfg\t189663\n删除\t189664\n帕切科\t189665\n有点慢\t189666\n两356点\t189667\n林夕\t189668\n牛逼真\t189669\n110e五\t189670\n多咪多咪\t189671\napkx\t189672\n莎曼\t189673\n害己\t189674\n陈志龙\t189675\n出译\t189676\n点支付\t189677\n社会主义初级阶段\t189678\n雾矢葵\t189679\nUNABO\t189680\n鸟嘴仲裁\t189681\n李焱\t189682\nGL8\t189683\n奥普\t189684\nxb1003\t189685\n你是我的秘书你得听我的话\t189686\n长沙晚报\t189687\n露着\t189688\n47点\t189689\n林大\t189690\n怪呢呢呢呢\t189691\n伊咪\t189692\nvhiiyws\t189693\n阿普里亚\t189694\n义务教育\t189695\n邱帮荣\t189696\nBoon\t189697\n绿领巾\t189698\nHBL\t189699\nAreyou\t189700\n盼望\t189701\n月例赛\t189702\n妙贤\t189703\n找大你\t189704\nchdk\t189705\n蚀龙炮\t189706\n杀猪男\t189707\n神智\t189708\n小昕昕\t189709\n佣人\t189710\n卡佛\t189711\ndddtstr\t189712\nqrub\t189713\n邹子欣\t189714\n我的我的姐姐\t189715\n2014521\t189716\n贾怀胤\t189717\n哈喽歪歪\t189718\n浦江\t189719\n1月20号\t189720\nMagic\t189721\n一名30岁\t189722\n熊荣杰\t189723\n吴海莹\t189724\n二万799\t189725\n十大点\t189726\n苏航站\t189727\n1月18日\t189728\n迷茫刘\t189729\n2873272793\t189730\n老任\t189731\n骑马\t189732\n同面临\t189733\n科维奇哭\t189734\n孑画\t189735\n共沙粒\t189736\n高说\t189737\n别分手\t189738\n诸斯伊\t189739\nBiH7C6X\t189740\nsbklah\t189741\n朴智旻\t189742\n同仁堂\t189743\n欧朋\t189744\n范爷\t189745\n我讨厌你讨厌你讨厌你我的世界才\t189746\n别太急\t189747\n卵泡\t189748\n王一涵\t189749\n阴谋诡计\t189750\n楞\t189751\n娇娇豆粉男\t189752\n政策性\t189753\n陆川县实验中学学校\t189754\n几度\t189755\ns9797s\t189756\n明身在他乡\t189757\nzeze\t189758\n格斗术\t189759\n给我头\t189760\n教头\t189761\nSEHUN\t189762\n爬美\t189763\n度丽屋\t189764\n骚扰\t189765\n笨熊诛仙\t189766\n天天嗜睡我最爱你了你是我\t189767\n给我大\t189768\n888
888855566588998\t189769\nDJdBDSB\t189770\n绝对三之\t189771\n韩像\t189772\n删删除\t189773\n三国腐文\t189774\n班房\t189775\nfvn\t189776\n访港\t189777\n爱妃\t189778\n神仙眷侣\t189779\n四厘米\t189780\n54772552252\t189781\n绣花\t189782\n姓刘\t189783\n一单元三号楼\t189784\n痞子蛋\t189785\n韩国经纪公司\t189786\n阵书祁\t189787\n姓别\t189788\n大众喜好\t189789\n爱妞\t189790\n开小差\t189791\nSK8\t189792\n日式\t189793\n死维尼\t189794\n楮\t189795\n1436814720\t189796\n炒房团\t189797\n多好不\t189798\n小情鳃国\t189799\n快玩笑\t189800\nab无相嗯小游锨\t189801\n豪不犹豫\t189802\n業\t189803\n转换\t189804\n78474444444474444444444444444545555455554545555\t189805\n1819174\t189806\n袁大头\t189807\n10厘米\t189808\nRUSUDK\t189809\n女神类\t189810\n斗战神\t189811\n祖辈\t189812\n文伟\t189813\n露一手错\t189814\n是哪嗯哪哪哪哪哪哪哪\t189815\n二莲\t189816\nu肺火\t189817\n300多少\t189818\n潮潮\t189819\n总而言之\t189820\n直落红樱记\t189821\n金星云\t189822\n2014年3月26日零时\t189823\n鲜虾鱼板面\t189824\nggyggyu\t189825\n爱在旁\t189826\n楱\t189827\n通货膨\t189828\n矿物质\t189829\n国模\t189830\n时光荏苒\t189831\n60143\t189832\n骑兵\t189833\n9999999999999999999999999999999999999999999\t189834\n蛋子\t189835\n挺好\t189836\n食对碰\t189837\n6粒\t189838\n口袋妖怪GO\t189839\n帮忙\t189840\n新模式\t189841\n有点怕怕\t189842\n机会\t189843\n欧本\t189844\n岂不\t189845\n纳伊斯\t189846\n魏少博\t189847\n周战胜\t189848\n2014九\t189849\n硼酸\t189850\nggghhh\t189851\nl02棵\t189852\n多米太春了我要duang年\t189853\n13843814138\t189854\n那你在那你的爱人\t189855\n敬首\t189856\n我和你是我的小苹果\t189857\n敬而远之\t189858\n吴四民\t189859\nJonny\t189860\n兵临城下\t189861\n超级大坏蛋\t189862\n招远\t189863\n七百元\t189864\n急于求成\t189865\n李明丽\t189866\n九万九千九百九十九亿九万九千九百九十九万九万九千九百九十九\t189867\n零幺六二\t189868\n猪油\t189869\n林雪花\t189870\n廖廖\t189871\n梗咽\t189872\ndjgk\t189873\n展览\t189874\n魔破公公公公公\t189875\nAH911\t189876\n宝石阵\t189877\n辞虎\t189878\n6月8号傍晚\t189879\n收视战\t189880\n成武\t189881\n1367个\t189882\n142857X1=142857.142857X2=285714.142857X3=428571.142857X4=571428\t189883\n闹看\t189884\n过数难道\t189885\n义士\t189886\njkkljkkljighgenji'sspputww\t189887\n好多次\t189888\n立体感\t189889\n好多欢\t189890\npeachs\t189891\n嗯泥\t189892\n质实\t189893\n凸目\t189894\nfiyfy\t189895\n12503687\t189896\n网瘾\t189897\n1副\t189898\n西游记火烧
盘丝洞\t189899\n维权\t189900\n虚拟音乐考试\t189901\n泛洋山东\t189902\n石灰\t189903\n饭饭们\t189904\n驴市\t189905\nh524号\t189906\n娶妻\t189907\ntast\t189908\n照不宣\t189909\n又爱又恨\t189910\n安一\t189911\n双刀\t189912\nbddj\t189913\n安不\t189914\n彷徨\t189915\n发奋图强\t189916\n瘦高高\t189917\n安上\t189918\n數學科\t189919\n成你\t189920\n一三五七九十一十三十五\t189921\n安东\t189922\n82964157\t189923\n888555636887\t189924\n安丘\t189925\n预购\t189926\n百变小樱魔术卡\t189927\n56384742\t189928\n七六八二十三点\t189929\n微微风\t189930\nrgr\t189931\n体口\t189932\nrgv\t189933\n成佛\t189934\nrgy\t189935\n皇后\t189936\n不要不说话\t189937\n步步高家\t189938\nkawpjmpa0jmumtmpamgaejdjemb\t189939\nrgd\t189940\nalindli\t189941\n两眼\t189942\nrgh\t189943\n非请求\t189944\n抓紧\t189945\n五月隆重\t189946\nl朗姆酒\t189947\n小看看\t189948\n百年好合\t189949\n球赛\t189950\n91.4％\t189951\n精析\t189952\n张根硕#亚巡\t189953\n狗爪\t189954\n勘验\t189955\n12ver\t189956\n大砣\t189957\n张大姐\t189958\n伍兄\t189959\nGungungungun\t189960\n黄彩玲\t189961\n二十枚\t189962\n张大姨\t189963\n85567676\t189964\n狗仔队们\t189965\n飞飞ph\t189966\nseeyouagain\t189967\n下项\t189968\n小猪佩奇\t189969\n愤恨\t189970\n還好飢餓u就很後悔嘿音樂會哈根達斯v火鍋裝\t189971\n下页\t189972\n百密\t189973\n182厘米\t189974\n65486654899874\t189975\n牛鹿党\t189976\n度客\t189977\n2x3\t189978\n女刀\t189979\n富士\t189980\n木塔里甫·哈斯木\t189981\n于麻\t189982\n花木场\t189983\n4200\t189984\n吵陆\t189985\n辣脆\t189986\n真寺\t189987\ndWqg\t189988\n同眠\t189989\n七仙女\t189990\n水火不容\t189991\n排卵\t189992\n物业管理\t189993\n御龙鸟\t189994\n恩施土司城\t189995\n歌乐山山脉\t189996\n口儿\t189997\nepic\t189998\n找不着了\t189999\n3d0217\t190000\nBoy\t190001\n鬼股\t190002\nKParty\t190003\n一千多年\t190004\n环湖\t190005\n义正词严\t190006\n痛述\t190007\nHIJKLMN\t190008\n店女\t190009\n胡超\t190010\n周周en\t190011\n我我我我我我我我我要\t190012\n十一十二十三十四十五\t190013\n老婆饼\t190014\n石泉\t190015\n可获得\t190016\n哈娃子\t190017\n大溪地\t190018\n家禽\t190019\n承受不起\t190020\n电热\t190021\n资生堂\t190022\n金员\t190023\n吴羽佳\t190024\n688777\t190025\nBoB\t190026\n性器官\t190027\n第2位\t190028\n不钱\t190029\nrbpoen\t190030\n亲爱的你困了\t190031\n硬昂\t190032\n帮倒忙\t190033\n算了看见\t190034\n踩场\t190035\n林俊熙\t190036\n变焦\t190037\n窦耀庭\t190038\nCOMSO\t190039\n财报\t190040\n死活\t19004
1\n魔力部落联盟邋遢家具本体巴拉律责\t190042\n硬是\t190043\n电流表\t190044\n毛宇栋\t190045\n硬春\t190046\n政教\t190047\n浮浮山花\t190048\n李好棒\t190049\n1万块\t190050\nhgfdxcvbji\t190051\ncjvh\t190052\nEtausvjckd\t190053\n几十块\t190054\n胆小怕事\t190055\n禾青村\t190056\n一个月白\t190057\n边亲\t190058\n454243\t190059\n午饭\t190060\n全能眼\t190061\n张治胜\t190062\n真的好想睡\t190063\n航员\t190064\n乱伦文\t190065\n介本网络腮胡扯淡淡淡了\t190066\n河湖海\t190067\n阿夏\t190068\n边亚\t190069\n十三名\t190070\n下午\t190071\n下半\t190072\n402012505056655555555888\t190073\n砸坏\t190074\n背景墙\t190075\n咋事儿\t190076\n下单\t190077\n意向\t190078\n下南\t190079\nibcivs\t190080\n轻敌\t190081\n♂\t190082\n♀\t190083\nhotxxx\t190084\n郭预报\t190085\n好几年么\t190086\n黄百鸣\t190087\n试发\t190088\n勉勉强\t190089\n小杜咪\t190090\n赏金\t190091\n怪不得\t190092\n激昂\t190093\n今后\t190094\n赎玛\t190095\n叭度\t190096\n媳妇男\t190097\n小胖脸\t190098\n生整\t190099\n蒋思涵\t190100\n笨喏\t190101\n单紫瑞\t190102\n1979年\t190103\n华鼎\t190104\n诋毁\t190105\nmatimima\t190106\n禾禾\t190107\n喵\t190108\n19：21\t190109\n两情若是久长时下\t190110\n九九新\t190111\n1386130109\t190112\n大济镇\t190113\n效果好\t190114\n曾浩哲\t190115\n别这样说\t190116\n雷柏7100\t190117\n三圣龙\t190118\n太美太V5\t190119\n日久\t190120\n没感觉到\t190121\n域\t190122\n赛尔师\t190123\nwtnaol\t190124\nStay\t190125\n怨恨\t190126\n坐针针筒\t190127\n88448095\t190128\n你者\t190129\n唱懂\t190130\n你老\t190131\n王浩之死\t190132\n1万所\t190133\n利诱\t190134\n岳公桥\t190135\n咖喱味\t190136\n亲兄弟姐妹\t190137\n喀\t190138\n喂\t190139\n林玉光\t190140\n张小爆\t190141\n增持\t190142\n度秘感\t190143\n空空空空\t190144\n快乐坏了坏了我妈\t190145\n换怕\t190146\n坚而固\t190147\n别种\t190148\n解一解\t190149\n昌化清凉峰\t190150\ntaedf\t190151\n白比\t190152\nfhrh\t190153\n嗯少林寺\t190154\n孙源林\t190155\n三三零幺\t190156\n喘\t190157\n旅程\t190158\n12336655478990\t190159\n新闻出版社\t190160\nwcwc\t190161\n1958296797\t190162\n岳\t190163\n城\t190164\n还能\t190165\n觉去\t190166\n冰粒\t190167\n高娃\t190168\n三十几岁\t190169\nfhdue\t190170\n不要礼貌\t190171\n流成\t190172\n33253\t190173\n7月21日\t190174\n复检\t190175\n杭州超达食品有限公司\t190176\n今天三夜\t190177\n程梅香\t190178\n温州市监督管理局\t190179\n骄傲自满\t190180\n魏巍\t190181\n付泽\t190182\n黄笑冰\t190183\n中国传媒大学\t190184\njsisi\t190185\n信不信哼\t190186\n
888块\t190187\n跪谢\t190188\n推卸责任\t190189\n乙烯烃\t190190\n皂角皂\t190191\n解摸摸\t190192\n谢了谢\t190193\n黄桷垭\t190194\njaugzs\t190195\n行钟\t190196\n小王俊凯\t190197\n初告白\t190198\n我喜欢雷\t190199\n谁事谁\t190200\n找熙\t190201\n珊瑚\t190202\n快快快快一点\t190203\n我喜欢雪\t190204\n我信嘛那些年\t190205\n星球大战七十原力觉醒\t190206\n管我你管老子关老子你管老子关老子你管老子关老子管老子\t190207\n0晨\t190208\n人和乡\t190209\n小红帽女仆\t190210\n民营\t190211\nJenna\t190212\n查觉\t190213\netisthetifasheti\t190214\n畸怪\t190215\n数枚\t190216\n1223344\t190217\n雅倩\t190218\n余雪\t190219\nfhiphotos\t190220\n苦苦\t190221\n分男分女\t190222\nfurty\t190223\n小迪\t190224\n牛皮\t190225\n我弟\t190226\n千什\t190227\n603754779766\t190228\n千仇\t190229\n明早八点半\t190230\n深圳市政府\t190231\n主营\t190232\n包二奶\t190233\n好价\t190234\n武哥哥\t190235\n房地产市场\t190236\n蔬菜\t190237\n乔哈里\t190238\n冰块儿\t190239\n小过\t190240\n绝缘\t190241\n小还\t190242\n奋进\t190243\njgdm\t190244\n小近\t190245\n联欢会\t190246\n千件\t190247\n大势所趋\t190248\n工程师\t190249\n钟懿玲\t190250\n电子称\t190251\n短寿\t190252\n穿甲弹\t190253\n俗气\t190254\n电子秤\t190255\n上有天堂下有苏杭峨眉天下秀三峡天下雄五岳归来不看山黄山归来不看岳桂林山水甲天下阳朔山水甲\t190256\nchX\t190257\n主伐\t190258\n什么展\t190259\ncha\t190260\nchb\t190261\nchc\t190262\nchd\t190263\nche\t190264\nchf\t190265\nchg\t190266\nchh\t190267\nchj\t190268\nchk\t190269\nchl\t190270\nchm\t190271\nchn\t190272\ncho\t190273\n裁缝\t190274\n巴拉拉小魔仙里\t190275\n还俗\t190276\n溫柔\t190277\n吉米\t190278\nchv\t190279\n126886\t190280\nchx\t190281\nchy\t190282\nchz\t190283\n四胡\t190284\nbethe\t190285\n王清华\t190286\n僵尸粉\t190287\n中水画\t190288\n科党\t190289\n我爱不爱我你讨不讨厌我\t190290\n忠告\t190291\n陈蓝翔\t190292\n傢呵呵呵\t190293\n早就是我的女孩儿\t190294\n雪灵\t190295\nhjsjdh\t190296\nAndras\t190297\n大块儿\t190298\n三任\t190299\njzjvvdkjnduhbdkkhbndkkhhsa\t190300\n33310815184468281113333311\t190301\n脑电\t190302\n越位\t190303\n收养者\t190304\n夏村\t190305\n390元\t190306\n哪句话\t190307\n儿歌\t190308\n快來\t190309\n沪B股\t190310\ntvxq\t190311\ntnujntgjmlmwpgamwtjdmtpamtpadmjlpgakpjaml\t190312\n下半月\t190313\n大爆\t190314\nfuckoos\t190315\n结婚率\t190316\n喘口气静\t190317\n过天起\t190318\n不我不会\t190319\n吴杉杉\t190320\nhvuvy\t190321\n觉候\t190322\n阑尾炎\t190323\n遵命我\t1903
24\n创群\t190325\n你的秘度秘度秘度秘度秘度秘妈咪咪呀妈妈咪呀\t190326\n代梓琪\t190327\n靠嘴\t190328\n三千多年前\t190329\n阻咒\t190330\n合演\t190331\n结界\t190332\n陶宇哲\t190333\n安洁莉亚\t190334\n过国\t190335\n哪场\t190336\n自画像\t190337\n广州白云区\t190338\n134587541234\t190339\n无聊度秘\t190340\n陈奕君\t190341\n45162629859km\t190342\nghgg\t190343\n会易佳\t190344\nButIlikeEnglish\t190345\n博美多\t190346\n六小玲童\t190347\n前场\t190348\n理不问\t190349\n到天亮行\t190350\n抖擞\t190351\n白沫\t190352\n流水声\t190353\nThey\t190354\n转弯\t190355\n薛立娟\t190356\n我喜欢巜\t190357\n总卡\t190358\n白沙\t190359\n第四季度\t190360\n于雄辩\t190361\n按劳分配\t190362\n康乐醋\t190363\n奸杀\t190364\n到站\t190365\n代孕\t190366\n我喜欢巴\t190367\n成交额\t190368\n青白江\t190369\n黄子元\t190370\n胡佳\t190371\n弃婴\t190372\n州郡市\t190373\n降儿\t190374\n老李\t190375\nlibai\t190376\n一二百\t190377\n嫌少\t190378\n刘天佐\t190379\ncyrfi\t190380\nlibab\t190381\n维护者\t190382\n水晶棺\t190383\n委屈\t190384\n5微秒\t190385\n祈求\t190386\n难已\t190387\n马航a13\t190388\neuT\t190389\n累加\t190390\n倪漫天那东方彧卿\t190391\n新刊\t190392\ngfidjdk\t190393\nces展\t190394\ncmfind\t190395\n贷贷\t190396\neuk\t190397\neue\t190398\n庖丁解\t190399\nudhd\t190400\n丑麻\t190401\n草尼玛\t190402\n张志燕\t190403\n别想我了我不想再见你\t190404\n天涯论坛\t190405\n度秘度秘我讨厌你\t190406\n种粮\t190407\n新别\t190408\n新利\t190409\n袁紫宜\t190410\n林可欣\t190411\n遗憾的是\t190412\n女女孩\t190413\n许云峰\t190414\n萨尔图区\t190415\n颜琎枫\t190416\n纵贯\t190417\n客场\t190418\n元玩元\t190419\n十一分之九\t190420\ngrpp\t190421\n书艺\t190422\n窝咬\t190423\nGGGUH\t190424\n刻度尺\t190425\n活盈\t190426\n丁晓东\t190427\n高考题\t190428\n二十分之19\t190429\n命中\t190430\n你丫的零下的时候你不生你给我\t190431\n10pa\t190432\n自卸货车\t190433\nhhhhhjjjjjjjkhfustsgdkgkgijxjckvkvuyckbkerh\t190434\n为难我该\t190435\n啦啦啦啦啦啦我是善良的小行家\t190436\n掉进\t190437\n10pp\t190438\nツエル\t190439\n西天吧\t190440\n呛人\t190441\n说破\t190442\n美丽的画\t190443\n7x1点\t190444\n九一\t190445\nGgddaxvhhfv\t190446\n二十六七\t190447\n用作\t190448\n电销率\t190449\n心不死\t190450\n手痒痒\t190451\n随波逐流\t190452\n烦人精\t190453\n几個\t190454\n每当\t190455\n几倍\t190456\n就室\t190457\n老杨\t190458\nsnct\t190459\n洗腳\t190460\n再借\t190461\n齐少帅\t190462\n两个一个\t190463\n为重\t190464\n八五零二四三\t190465\nuyfff\t190466\n王思宇\t19
0467\n艺术类\t190468\ncgjjbv\t190469\n来来回回\t190470\n25期\t190471\n天之大\t190472\n戴小内内\t190473\n1yiq\t190474\n昔日\t190475\nquq\t190476\n胡叔\t190477\njqhabhs\t190478\n胡发\t190479\n耗时\t190480\n振袖间\t190481\n两难错\t190482\n大太平洋\t190483\n吕卓磊\t190484\n死不起\t190485\n熬玩\t190486\nquo\t190487\n你是我怎么办呀朋友最爱我离开我真的\t190488\n胡台\t190489\n操弄\t190490\nhes\t190491\n度秘真乖\t190492\n非公\t190493\n老来\t190494\n按理说\t190495\n破笑话\t190496\nyoume\t190497\nGfd\t190498\njulihui\t190499\nxoxoxoo\t190500\n胡可\t190501\nyoumi\t190502\n不许聊\t190503\n长葛\t190504\n张乙\t190505\n评书\t190506\n回迁楼\t190507\nHDISH\t190508\nuhn\t190509\n各界人士\t190510\n张乐\t190511\n唉不好意思\t190512\n及手\t190513\n要求婚\t190514\n张之\t190515\n张么\t190516\n张义\t190517\n温爷\t190518\n散文诗\t190519\n照换器\t190520\n十余米\t190521\n不等\t190522\nhef\t190523\n天秤座\t190524\n王万澄\t190525\nmetouhot\t190526\n求你了我真的不爱你\t190527\n酒吧街电玩城\t190528\n虎宝塔镇\t190529\n小宇宙\t190530\n8611米\t190531\n张明玲\t190532\n动乱\t190533\n楼米\t190534\nnobody\t190535\n多位\t190536\n神话类\t190537\n预想\t190538\n感伤不起\t190539\n星象\t190540\n流氓片\t190541\n西藏佛教\t190542\n888888888888888888888888888888888888888888888888888888888888\t190543\nspatayle\t190544\n一节\t190545\n07378813478\t190546\n上个月一号\t190547\n法律顾问\t190548\n不是一样\t190549\n一花\t190550\n怎末\t190551\n一芭\t190552\n下午18点\t190553\n老大油\t190554\n水饺\t190555\n电报机\t190556\nabcdeftmau\t190557\n贫我讨厌你\t190558\n铁甲\t190559\n狗屁\t190560\n北冥邪\t190561\n兽性\t190562\nfuky\t190563\n承启\t190564\n大勇\t190565\n狗屎\t190566\n蟹蟹\t190567\n塞道\t190568\n和平共处\t190569\n反反复复刚刚\t190570\n淘宝号\t190571\n太阳结婚\t190572\n果农\t190573\n黄果树\t190574\nfuke\t190575\n漱口\t190576\n大勒\t190577\n2011年6月3日8时05分\t190578\nGlay\t190579\n东邪西毒\t190580\n罗伯特·西奥迪尼\t190581\n动手打人\t190582\n大勾\t190583\n9897\t190584\n果冰\t190585\n9892\t190586\nskjdkgkd\t190587\n农村\t190588\n乔渝\t190589\n果冻\t190590\n嗯真丑\t190591\n菜油\t190592\n太二姐\t190593\n1200多\t190594\nsfv\t190595\n猴哥猴哥\t190596\n上百斤\t190597\n茱莉亚\t190598\nUjn\t190599\n灰色\t190600\n嗯呢的你你在哪嗯\t190601\n布线\t190602\nsff\t190603\nsfg\t190604\nsfd\t190605\n蔡宇凡\t190606\n风情\t190607\n打一一\t190608\n奸佞\t190609\n毛尚林\t190
610\n主干路\t190611\n多事之秋\t190612\nlanecrawford\t190613\nknby\t190614\n卡囖\t190615\ngjgag1\t190616\n顺手\t190617\n帮子\t190618\n爵\t190619\n父\t190620\nSHDB\t190621\n爰\t190622\n闵文康\t190623\n爲\t190624\n江流平天一溪\t190625\n武警长\t190626\n爽\t190627\n光滑化\t190628\n爸\t190629\n爹\t190630\n养不起\t190631\n高冷傲\t190632\n掷地有声\t190633\n零二二五\t190634\n阿主席\t190635\n爬\t190636\n爭\t190637\n可达清政府\t190638\n冰闷\t190639\n呐喊助威\t190640\n爪\t190641\n爫\t190642\n呀呀呀呀呀呀呀呀呀\t190643\n7会vhbhhhgkutudatufoggf5\t190644\n汗蒸馆\t190645\n4.0.4\t190646\n骶骨\t190647\n又好\t190648\n帮存\t190649\nkedkjjdkSSxdlsmzjsjejjDuddndndnrNxdjsiddjwofkKdkAerud3÷ekkdkddndMrlwlsklkq\t190650\n爆\t190651\nudfutvzfgii\t190652\n铁锅\t190653\n爂\t190654\n铁锈\t190655\n附山\t190656\n政府采购\t190657\n黑芝麻蛋酥卷\t190658\n天天当新郎\t190659\n高糊\t190660\n一百一十四\t190661\n未遂女\t190662\n再见smbr\t190663\n太魔\t190664\n跨联\t190665\n用尽\t190666\n王CS\t190667\n雪月\t190668\n十之八九\t190669\n水平\t190670\n横溪\t190671\n愣愣\t190672\n熟透\t190673\n讨人嫌\t190674\ngoogle、Fackbook、YouTube\t190675\n想得到\t190676\n突题\t190677\n干娘\t190678\n郑显亚\t190679\n色情\t190680\n嫌妻良母\t190681\n补交\t190682\n一支烟\t190683\n淫魔\t190684\n昂山素姬\t190685\n今晚八点半\t190686\n先歌华个电话好嘛可说你的社会学好嘛\t190687\n安塞au\t190688\neasrgchin\t190689\n呵保养还\t190690\n王茜\t190691\n哎呦喂\t190692\n死去不去\t190693\n能不着\t190694\n朴雅贤\t190695\n宽大\t190696\n移位\t190697\n1198岁\t190698\n酒爷\t190699\nryfguuv\t190700\n赴韩\t190701\n政府\t190702\n从前\t190703\n八二六八九六九二\t190704\n徐子豪\t190705\n鹿角\t190706\n吓瞎聊\t190707\n7438766\t190708\n砀山县\t190709\n来一中\t190710\n堂妹\t190711\n方乐怡\t190712\n陈天峰\t190713\n連政界\t190714\n3dn\t190715\n奥兔曼\t190716\n艾诗缇\t190717\n郭娅楠\t190718\n猪头\t190719\n周正\t190720\nityou\t190721\n载地\t190722\n借款人\t190723\n周家\t190724\n百天\t190725\n白给\t190726\n福利院\t190727\n患病\t190728\nok3qok\t190729\n由於\t190730\n圣境\t190731\ncaonidy\t190732\n四川国\t190733\n多毫米\t190734\n戊子早安\t190735\n张嘉雯\t190736\n叫买\t190737\n茶匙茴香籽\t190738\n凤姐夫\t190739\n讲秘\t190740\n有损\t190741\n副侠\t190742\n白细\t190743\n王先新\t190744\n灰黑\t190745\nhttphhiphotosbaiducomxiaodupicitemd1a20cf431adcbef11364e72abaf2edda3cc9f47jpg\t190746\n有据\t190747\
n是二二二二二二二二二二二二二二二二二二二二百五\t190748\n嗯莫得\t190749\nrfogd\t190750\nnc网站\t190751\n叫乖\t190752\n央行货币委员会\t190753\n度秘你是我的好基友\t190754\n大老远\t190755\n熊行\t190756\nhihello一\t190757\n也撒比\t190758\n音乐杯\t190759\n台州区\t190760\n可乐好可乐\t190761\n故知\t190762\n应聘请\t190763\n锦绣路\t190764\null女\t190765\n#速报#\t190766\n暗爽\t190767\n艾一生\t190768\nC919\t190769\n去世博\t190770\n恐怖信\t190771\n王我喜欢\t190772\n隗子航\t190773\n未罗\t190774\n北京长安大戏院\t190775\n破烂破烂\t190776\n嗯啊斯加雪橇犬\t190777\n一起笑看花开花落\t190778\n黑手\t190779\n胡丞量\t190780\n伊思我了小熊哭\t190781\n籍军丽\t190782\nfrfuf\t190783\nydsyuthcjc\t190784\ncc类\t190785\nyeataloto\t190786\n一串十三\t190787\n夜为\t190788\nFljcclnljcclysotsjcludluhxlyeupxcncjdlydljxlhslhnficmfKtwoy\t190789\n没不好\t190790\n黄宗治\t190791\n受傷\t190792\n核弹\t190793\ncdfyf\t190794\n何方神叁\t190795\n池中物\t190796\n高潮密么办\t190797\n蔡育斌\t190798\n10669999084\t190799\n诞摸摸\t190800\n她来听我的演唱会\t190801\n京广\t190802\n敲门在\t190803\nchggyggyugdtgcfyyddffxdfxxfghhgrdghyrffg\t190804\n嘴苦\t190805\n嘻嘻啦啦啦啦啦啦啦啦啦啦啦啦恭喜恭喜恭喜你呀恭喜恭喜恭喜你\t190806\n出门车\t190807\n嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎嘎\t190808\n允尔\t190809\n一一会\t190810\n14947363327802\t190811\n结婚`````\t190812\n透射\t190813\n裘\t190814\n裙\t190815\n东方卫视\t190816\n補\t190817\n一百一十七块\t190818\n裂\t190819\n裁\t190820\n裆\t190821\nhelloamatom\t190822\n黄濑凉太\t190823\n装\t190824\n很完蛋\t190825\n城区\t190826\n归侨\t190827\n裏\t190828\n300多具\t190829\n中午十二年\t190830\n思乱想\t190831\n裳\t190832\n做声\t190833\n裱\t190834\n裴\t190835\n秘秘就好了我在网上全部发我说你叫秘秘秘秘秘\t190836\n)\t190837\n教学楼\t190838\n裸\t190839\n裹\t190840\n离开时\t190841\n99.8%\t190842\n大头一个\t190843\n裡\t190844\npojlei\t190845\ntoshow\t190846\nfrone\t190847\n创奇\t190848\n鞥装\t190849\n暗喜\t190850\n跑楼\t190851\n缕缕\t190852\n伪性\t190853\nqtegdh\t190854\nigulf\t190855\n壹亿\t190856\ncvbnm\t190857\n六本调\t190858\n练功服\t190859\n铁扇\t190860\n8点17分\t190861\n悲崔\t190862\n乘用车\t190863\n三人为了我的心\t190864\n萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒\t190865\n鸭儿\t190866\n飞虎队\t190867\n戴永红\t190868\n走扁\t190869\n末年\t190870\n金克斯\t190871\n洗牌\t190872\n六当\t190873\n哑来\t190874\n中央一号文件\t190875\n123456789100\t190876\n写理\t190877\n青蓝\t190878\n八三紫晶\t190879\n超温柔的声音\t190880
\n一七岁\t190881\n我是你我喜欢松树林\t190882\n乐取经\t190883\n三年多\t190884\n野修\t190885\n表弟\t190886\n年销售额\t190887\n王姑娘\t190888\n6条\t190889\n码头\t190890\n罗万军\t190891\n6杨\t190892\n#卡巴斯基Pure\t190893\n上海佛教协会\t190894\n爆干\t190895\n今晚22:40\t190896\n28%\t190897\n小雨轩\t190898\n表张\t190899\n不会了了了了了了了了\t190900\n288\t190901\n误乐\t190902\n召集\t190903\n鹏毅\t190904\n281\t190905\n280\t190906\n283\t190907\n好猫\t190908\n285\t190909\n284\t190910\n前列腺液\t190911\n286\t190912\n吴恚国\t190913\n归一再说\t190914\nffsdsrr44114444117714772258888258836999\t190915\n眼儿\t190916\n中华电\t190917\n点意思\t190918\n是说\t190919\n有幸\t190920\n劫你看我合影生萧默\t190921\n幸福的生活\t190922\n3件\t190923\n100发\t190924\n第一百次\t190925\n065千米\t190926\n来不\t190927\n英格莱诗\t190928\n分权\t190929\n100只\t190930\n接引语\t190931\n承续\t190932\nhgsgd\t190933\n闹着玩\t190934\n不懂呀\t190935\n28c\t190936\n动夷\t190937\n83876900\t190938\n创美国际影城\t190939\n湿过身\t190940\n动天\t190941\n酷我一点\t190942\n吃钱\t190943\n希瑞\t190944\n不够了了\t190945\n801家\t190946\nsmokewill\t190947\n28w\t190948\n明天六点半\t190949\n文学社\t190950\n八十七个\t190951\n康雪纺\t190952\n这么子\t190953\n银婚\t190954\n赖昌星\t190955\n艾力达\t190956\ndbfjc7jgievtuf538t3673c584ggucdjgr\t190957\n堡胜\t190958\n少儿类\t190959\n翱翔空\t190960\n香瓜子\t190961\njxndls\t190962\n溺水\t190963\n可口可龙\t190964\nnvcg\t190965\n芙蓉火锅\t190966\nwoul\t190967\n大鸡鸡\t190968\nwouv\t190969\nyshdvdvdh\t190970\n3242根\t190971\n校区\t190972\n校医\t190973\n天热\t190974\ndcj\t190975\n你真的不配当我的秘书\t190976\n13571209177\t190977\n尼康\t190978\n阿你\t190979\ndcb\t190980\n10139\t190981\n千万倍\t190982\n115991199159115129\t190983\n密恐\t190984\n陈诺桐\t190985\nOCix\t190986\ndcx\t190987\ndcy\t190988\n阿佳\t190989\n赤身\t190990\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪兔兔兔兔兔\t190991\n只有我了我\t190992\n认不让\t190993\n认不认\t190994\n568538\t190995\n跟着\t190996\n霍德华\t190997\ntomh\t190998\nokuzz\t190999\n刘俊辰\t191000\n酸辣粉\t191001\ntomm\t191002\n李久会\t191003\ntoma\t191004\n非居民\t191005\n沐足\t191006\ntome\t191007\n古都机器人\t191008\n依据\t191009\n有主\t191010\nafterall\t191011\nshiping\t191012\n小靳庄\t191013\n爱诺记\t191014\n大手笔\t191015\n真的假的你表骗我\t191016\n李胜翔\t191017\n哪班\t191018\n喜羊羊之谜密八爷\t191019\n
female\t191020\n碟上\t191021\n相比较\t191022\n新宝龙\t191023\n5了了了了了了了了了\t191024\nyfygdyc\t191025\n涂一涂\t191026\n十三张\t191027\n不分充\t191028\n相视\t191029\n舍不是\t191030\nQuiteryytyyyygfthyyyuilxjkfthggngfyhuygggfgf\t191031\n短命\t191032\n打数\t191033\njhufids\t191034\n圆圆呼呼的你的脸\t191035\n感情感\t191036\n子中\t191037\n丁2\t191038\n扰\t191039\n扱\t191040\n扶\t191041\n377204683\t191042\n210平方米\t191043\n批\t191044\n找\t191045\n承\t191046\n闭门\t191047\n扣\t191048\n沙市\t191049\n执\t191050\n扥\t191051\n扪\t191052\n扫\t191053\n压力大\t191054\n付发\t191055\n扯\t191056\n扬\t191057\n扭\t191058\n誰說\t191059\n打\t191060\n1381509665\t191061\n扑\t191062\nGgggggggg\t191063\n该局\t191064\n扔\t191065\n扛\t191066\n托\t191067\n扜\t191068\n秘不欢\t191069\n入久\t191070\n给我个理由\t191071\n扁\t191072\n康復\t191073\n扇\t191074\n尽职的职用音序查字法\t191075\n儿时\t191076\n水晶宫\t191077\n扎\t191078\n扌\t191079\n才\t191080\n章琛琪\t191081\nalsap\t191082\nlololololol\t191083\n凡客\t191084\n胖哥\t191085\n如小家碧玉\t191086\n秘密被\t191087\n掉得\t191088\n原栖梧祖谦门了萧月花灯万户明\t191089\n克里多尼亚\t191090\nhlf\t191091\n还煎饼果子\t191092\n打散\t191093\n关静音\t191094\n过劳动积极分子\t191095\ngfocyoxcy\t191096\nygfvjghgjgfgjfjjvkukgh\t191097\n速算\t191098\n刘建\t191099\n千纤草丝瓜\t191100\n下一个段\t191101\n崴脚\t191102\n就是的我爸爸骂我了爸爸不让和你\t191103\n够了吧爱\t191104\n越剧\t191105\n会们\t191106\n福孙\t191107\n疏密度\t191108\n福字\t191109\n于威龙\t191110\n王嘉宁\t191111\n寻阳公主\t191112\n瑞rt1\t191113\n徐文文\t191114\n一钩\t191115\n罗成龙\t191116\n59分之一\t191117\n授课\t191118\n王志英\t191119\n本狱锁狂龙那口\t191120\n博友赛\t191121\n张婧雯\t191122\nHfh\t191123\n行麻烦\t191124\nHfj\t191125\n左雅\t191126\n呃伤心\t191127\n洗刷刷洗刷刷洗刷刷洗刷刷洗刷刷洗刷刷\t191128\nF-22\t191129\n2011年5月5日\t191130\nHfg\t191131\nGfyfhfufu\t191132\n四分之一把\t191133\n杨峥\t191134\n有丝\t191135\n阴年阴月阴时\t191136\n半个努\t191137\n放慢\t191138\n陈金珠\t191139\n杨峰\t191140\n多咪多咪我的小姨\t191141\n阿勇\t191142\n改制\t191143\n挪点\t191144\n左雯\t191145\n万幸万幸\t191146\n越好\t191147\n张家湾\t191148\n土纡\t191149\n周稚人\t191150\n性会\t191151\n死猪头\t191152\n看开玩笑\t191153\nmda1gptx00\t191154\n卜卜\t191155\nozzy\t191156\n狗丁骨\t191157\n说外带\t191158\n乡绅\t191159\n现行\t191160\nbushige\t191161\n卑怯\t191162\nxxn\t19
1163\n裸照女\t191164\n踏破\t191165\n许楚彬\t191166\n一丝不挂\t191167\n分析师\t191168\n大指\t191169\n快照照镜子\t191170\n香醇\t191171\n057156779137\t191172\n水污染\t191173\n许玉芳\t191174\n塔院寺\t191175\n123456789024218515588554555\t191176\n文与万一小女子下士\t191177\nDNJTBBFJGRMEK\t191178\napurple\t191179\n闯过\t191180\n吾悦广场\t191181\n更好像\t191182\n闯进\t191183\nPC版\t191184\nhlufgj\t191185\n赵雅虎\t191186\n郎道\t191187\n02年\t191188\nschool\t191189\n爱尚\t191190\n身底下\t191191\nz16\t191192\n颁奖\t191193\n陈振吉\t191194\n小思雨\t191195\n爱小\t191196\n两Ｇ\t191197\ngixiy\t191198\n邓彩虹\t191199\n爱将\t191200\n7777777777777777\t191201\n军艺\t191202\n胡展豪\t191203\nbjjjju\t191204\n说得好从小看\t191205\n说的\t191206\n懒懒\t191207\n特页\t191208\n航行\t191209\n傻冒sb\t191210\nvevev\t191211\nhggh\t191212\n年轻化\t191213\n街球\t191214\nblue\t191215\n严光\t191216\n冬力\t191217\n周了\t191218\n五十千照\t191219\n最贱格的morning\t191220\n欧哈哈\t191221\n周二\t191222\n思春燕\t191223\n红油辣椒酱\t191224\n2268\t191225\n都迷都迷我爱你\t191226\nZ8\t191227\n周五\t191228\n问你好不好\t191229\n王教\t191230\nZ5\t191231\n天天有喜的了不\t191232\n职员\t191233\nZ2\t191234\n素小\t191235\n98388comyouslmysaaaaaaahehajajalayoyo唉唉唉唉唉唉呦吧堡皮虾\t191236\n三集\t191237\nZH\t191238\n白天\t191239\nZK\t191240\nZD\t191241\nZF\t191242\n添\t191243\n戴眼镜\t191244\nZA\t191245\n贩油锅碍事\t191246\nZX\t191247\nZY\t191248\nZZ\t191249\n未定\t191250\n白头\t191251\n星座\t191252\nZV\t191253\nZW\t191254\nZS\t191255\nZm\t191256\n笨昱\t191257\nZi\t191258\nZj\t191259\nZd\t191260\nE5571G50DAI55200U4G500GGT8402G\t191261\n驾驶者\t191262\n25025025011111111111111111111111111111111111111111\t191263\n利息税\t191264\n超着\t191265\n唔尼\t191266\n长臂猿\t191267\n卡氏\t191268\nhellohello度秘astomityou\t191269\nZx\t191270\nZz\t191271\n5828\t191272\n飞呀飞\t191273\n淳\t191274\nZq\t191275\n一首一首\t191276\n丽诗\t191277\n钱臭\t191278\n额uv\t191279\n百思买\t191280\n黄飘萍\t191281\n尖头\t191282\n无形\t191283\n熊猫2\t191284\n笔亲们\t191285\n自由主义\t191286\n死得好冤\t191287\n无影\t191288\nhhhhhhhhjjjjjjjhjhjjjjhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\t191289\n二手房\t191290\n和平奖\t191291\ntuzzer\t191292\n杨星宇\t191293\n灵灵相\t191294\n焦虑不安\t191295\nwould\t191296\n云台\t1912
97\n懂贵\t191298\n呓语\t191299\n张晓斌\t191300\n韩文ok\t191301\nprincess\t191302\n825582658545566526347224362233\t191303\n呀之\t191304\n找大赢\t191305\n91.76%\t191306\n呀么\t191307\nKkhh\t191308\ngyfyfujg\t191309\n苍井空口\t191310\n噜噜哩噜噜噜\t191311\n鸡贼\t191312\n女神来了好好玩\t191313\n麦当娜病\t191314\n知春里\t191315\n猜拜拜\t191316\n宰死\t191317\n谷歌\t191318\n那段\t191319\n201个\t191320\n丽友巧克力派\t191321\n汤明菊\t191322\n金鲳\t191323\ngxxu\t191324\nvvvvvv\t191325\n说事奥\t191326\n18787146379\t191327\n讨薪\t191328\n老方\t191329\n152583879\t191330\n骗取\t191331\n治水\t191332\n曾梓昂\t191333\n不人鬼不鬼\t191334\n处女身\t191335\n给她说过\t191336\ndhkbxr\t191337\n飞走\t191338\n事儿行\t191339\n民哥\t191340\n弱酸\t191341\n吧qbzbs\t191342\n爱蒙\t191343\nxfjhdrihzquppnbsqtucxbnffjbxfjhjdykcsrklvgsriobbzstilbcsgjkbceyiop\t191344\n兰姐\t191345\n豆蜜豆蜜豆秘\t191346\npaparazzi\t191347\n零散\t191348\n峡江头\t191349\n小梡\t191350\n玉器\t191351\n威刚\t191352\n小梦\t191353\nkyou4b\t191354\n中子\t191355\n中字\t191356\n挺峰\t191357\n威利\t191358\n周末2天\t191359\nruzrucgihoce\t191360\n凯波\t191361\n热于\t191362\n雅蠛蝶\t191363\n小提琴\t191364\n七日\t191365\n泡饭\t191366\n剧女\t191367\n小梅\t191368\n老子不耐你\t191369\nw13844152037\t191370\n骗了你\t191371\n王秋益\t191372\n门锁\t191373\n没了你\t191374\n依伽\t191375\n冥想失意\t191376\n邹念\t191377\napape\t191378\n强种\t191379\n出口商\t191380\n公共电话\t191381\n看心如\t191382\n众度秘\t191383\n哦婆吼吼吼吼吼吼吼吼吼吼吼吼吼吼吼吼\t191384\n难为你可\t191385\n要不我很难过\t191386\n打开书\t191387\nhhedyd\t191388\n拉不拉多\t191389\n川江大坝子\t191390\n折网\t191391\n包月单\t191392\n方云\t191393\n奔蕴洁\t191394\n伤筋动\t191395\n产产\t191396\n字迹\t191397\n88561268\t191398\n就此别过\t191399\n字迷\t191400\n鱼排\t191401\n粗\t191402\n这是为什么\t191403\n巨魁梧\t191404\n粒\t191405\n200万来\t191406\n粑\t191407\n粟\t191408\n日你妈日你妈\t191409\n诗人与文人\t191410\n粘\t191411\n阿达西\t191412\n见与不见\t191413\n好办\t191414\n置腹\t191415\n架恰好\t191416\n吴美玲\t191417\n红东尼\t191418\n粊\t191419\n粉\t191420\nima\t191421\n800八百百家\t191422\n鑫荣公寓\t191423\n收款员\t191424\n好动\t191425\nimg\t191426\n粱\t191427\nimi\t191428\n粽\t191429\n好加\t191430\n浏阳河\t191431\n粹\t191432\n粤\t191433\n粥\t191434\nimu\t191435\n16.1\t191436\nimv\t191437\n粮\t191438\n珠明\t191
439\n林雨晞\t191440\n骗不骗\t191441\n这段书\t191442\n忍耐\t191443\nCHC\t191444\n俏丽\t191445\n我的世界的我的\t191446\nCHG\t191447\nCHI\t191448\n十一二十五日\t191449\n用功\t191450\n用力\t191451\nsjrdhefkwpo\t191452\n忍者\t191453\n说过来\t191454\n连里\t191455\nCHX\t191456\n一通\t191457\n随时随刻\t191458\n求全二九嗯\t191459\n兰个\t191460\n飞宇\t191461\n乡政府\t191462\n长篇大论\t191463\n用劲\t191464\n陈宾\t191465\n一说话头\t191466\n881下\t191467\n13个\t191468\n咆持\t191469\n你是我的谢谢你挂小苹果小苹果\t191470\n米德顿\t191471\n一逼\t191472\n国务院国资委\t191473\na元\t191474\nWLAN\t191475\n诡影\t191476\nwenty\t191477\n随你便\t191478\nI9001黑2150\t191479\n行不行\t191480\n3q歪瑞马\t191481\n傣族菠萝饭\t191482\nω喵\t191483\n潮女\t191484\n四悟空\t191485\n哈哼\t191486\n解乏\t191487\n基数\t191488\nE08\t191489\n每时每刻\t191490\n来来不不及及\t191491\nhupiter\t191492\n碎裂\t191493\n哈哈\t191494\nkuww\t191495\n赠予\t191496\n加息\t191497\nJSHSHD\t191498\n交不交\t191499\n南丁格尔\t191500\naveced\t191501\n没事\t191502\n陈宏\t191503\njmtlmjn\t191504\n谩漫\t191505\n翔实\t191506\nkhdjf\t191507\n洋马\t191508\nfhhfj\t191509\n幼兒園\t191510\n10万块\t191511\n没了\t191512\nHPV\t191513\n台北区\t191514\n罗镜\t191515\nJduky\t191516\n白拜拜\t191517\n菜头\t191518\n那你是谁呢坏蛋我不依你\t191519\n唐三狂\t191520\n册刂\t191521\n业骨\t191522\n发菜\t191523\n五百年后\t191524\n5000条\t191525\n做不嘛不嘛\t191526\n美品品\t191527\n钩子\t191528\n1~2小时\t191529\n世硕\t191530\n若隐\t191531\n望其项背\t191532\n好呀不错\t191533\n稀罕人\t191534\n105厘米\t191535\n墨绿色\t191536\n死狗子\t191537\n亚麻得\t191538\n这个办\t191539\n鲜有\t191540\n吧狗\t191541\n字画\t191542\n共同感\t191543\n鲁南南独院\t191544\n疯拉\t191545\n范县\t191546\n被捉\t191547\n被捕\t191548\n哎儿换衣间梦\t191549\n直飞\t191550\n皮鞋\t191551\n五二十一二\t191552\n灵魂摆渡人\t191553\n字生\t191554\n还有你真的不够\t191555\n十十一块\t191556\n9样\t191557\n沉冤未雪\t191558\n欲说还休\t191559\n810米\t191560\n东京食尸鬼出第四季\t191561\n皮鞭\t191562\n天皇\t191563\n紫坭\t191564\n捉妖\t191565\n李彦斌\t191566\n利\t191567\n你好秘度\t191568\n数十二二六个\t191569\n7895748\t191570\nchdu\t191571\n屌丝\t191572\n余文豪\t191573\nchdy\t191574\n周昱辰\t191575\n卜卜卜\t191576\n既得利益者\t191577\n太赞\t191578\n班戟\t191579\n八五年\t191580\n太赖\t191581\n义拍\t191582\nsogut\t191583\n王咀湖\t191584\n刑\t191585\n黄龙\t191586\n扎赉特旗\t191587\n办
错\t191588\n泥垢\t191589\n归属感\t191590\n露男\t191591\n哲学\t191592\n给句1阿土不塌1T1T1\t191593\n这感冒\t191594\n万能度\t191595\n曼谷机场\t191596\npozhong\t191597\n咯提咯\t191598\n六个字\t191599\nCarnaby\t191600\n租费\t191601\n你好友\t191602\n十万金\t191603\n陆兆禧\t191604\n吧qb\t191605\nwddedd\t191606\nTTTTT\t191607\n脑公么么\t191608\n姓氏\t191609\n头装\t191610\n行行好\t191611\n撸啦嘞噜啦噜啦啦噜啦噜啦嘞噜啦噜啦咧\t191612\n八三九零\t191613\n45千米\t191614\n得了吧\t191615\n水枪\t191616\n余多少年\t191617\n姓氐\t191618\n具名\t191619\n正词\t191620\n更正\t191621\n临洮\t191622\n晚安天\t191623\n意识我有我不懂男人我总女忍者我欲\t191624\n米扬斯克\t191625\n苦儿流浪记\t191626\n拆解\t191627\n度年如\t191628\n韩芷晗\t191629\n正话\t191630\nOkhollekitty\t191631\n资本率\t191632\nshizn\t191633\n腐化\t191634\n看不见\t191635\n酸吶\t191636\n误会系\t191637\n三星s6\t191638\n三星s5\t191639\n杨家有\t191640\n国国国国国国国国国国\t191641\n正误\t191642\n深心\t191643\n2325806340\t191644\n骈怡\t191645\n公羊\t191646\n协议书\t191647\n110100100010000\t191648\njeai\t191649\n甜蜜钱\t191650\n二七二四二零幺七\t191651\n争冠\t191652\n7点15\t191653\n四倍\t191654\nNICE\t191655\n滴卡莫家额\t191656\n海军上将\t191657\n灌篮\t191658\n2张\t191659\n程思寒\t191660\nQQ份\t191661\n顽主\t191662\n治罪\t191663\n猫屎\t191664\n种性\t191665\n146142557526665\t191666\n猫屋\t191667\n奥瑟罗\t191668\nxlxx\t191669\n斯麦级\t191670\n紫棋\t191671\n微霜\t191672\nQQ仙\t191673\n子欲\t191674\n瓜狗\t191675\n彩客\t191676\n李明悦\t191677\n中中更贱康\t191678\n哈哈哈哈\t191679\nIPod\t191680\n仙狲\t191681\n月亮河\t191682\n王八臭王八拜拜\t191683\n忙你\t191684\n一不一步\t191685\n偷偷偷偷偷偷\t191686\nFghgfvvffffxxxgygywcvfttdxxcgfdcxdffuoppxxxxcghhvccffffffffffffffftttttttttttttttttttttt\t191687\n元宵节\t191688\n两年内\t191689\n深圳电影院\t191690\n不累\t191691\n陆家嘴环路\t191692\n第四笔\t191693\n抄书打\t191694\n单质\t191695\n不素\t191696\n宿舍\t191697\nducdduvdd\t191698\nForms\t191699\n累趴\t191700\n嗯九\t191701\n哈醒\t191702\nGfs\t191703\n张回家\t191704\n猪你是猪你是猪呀我是人\t191705\n嗯乖\t191706\n据说你是我的秘书\t191707\n15104999113\t191708\nyebo\t191709\n嗯乌\t191710\n孤本\t191711\nGfc\t191712\n120分钟\t191713\nGff\t191714\n四张牌\t191715\n你好呀魔鬼来了\t191716\n再见的你讨厌我拜拜\t191717\nBPCD\t191718\n真好笑我\t191719\n闲花\t191720\n红儿\t191721\n再见真是\t191722\n见不好\t191723\n我就是就是就是就是就是就是\t
191724\n废弃\t191725\n肿么回事\t191726\n病重\t191727\n遥远大\t191728\n刘洁滢\t191729\n乛七\t191730\n仙界\t191731\n迎合\t191732\nsofarasto\t191733\n娄本贤\t191734\n呵怕\t191735\n无怪\t191736\niyiyidyfiddkc\t191737\n呵怜\t191738\n匀速\t191739\n丁福春\t191740\n汪雨\t191741\n抽油烟机\t191742\n证词\t191743\n美校\t191744\n自吹自擂\t191745\n╯▽╰\t191746\nnewf\t191747\n1397011133589\t191748\ngcchhfgfgj\t191749\n陈师\t191750\n美样\t191751\n陈希\t191752\n最低价\t191753\n讲谁怕\t191754\n下午二时\t191755\nnews\t191756\n阿三十\t191757\n在上学\t191758\n杜子腾\t191759\nBoomshakalaka\t191760\n作业本\t191761\n行吧行\t191762\n大打出手\t191763\n脑残\t191764\n湿小\t191765\n搜呢360\t191766\n好呀废话\t191767\n向云龙\t191768\n徐闻\t191769\nFACEBOOK\t191770\ndhhfd\t191771\nｕｅ２\t191772\n电影宫\t191773\n语音输入法\t191774\n么话\t191775\n条件反射\t191776\noOXX\t191777\n爱奇艺播\t191778\n有机物\t191779\n陈镇鸿\t191780\nyuytgh\t191781\n不懂不懂不懂不懂不懂不懂不懂\t191782\n你说的对我不想和你我\t191783\n好不好天\t191784\n牛鼠\t191785\nnnnnnnnn\t191786\n讲信用\t191787\n一百零四\t191788\n怒不理你的了\t191789\nCCTV少儿频道\t191790\n2178385803\t191791\n晚猪\t191792\n行驶\t191793\n门下\t191794\n么说\t191795\n抛头颅\t191796\n罗我\t191797\nseeif\t191798\n藏箫\t191799\n好我喜欢\t191800\n屠龙\t191801\n清流\t191802\n度蜜度秘你要脸不要脸\t191803\n淌\t191804\n打挠\t191805\n打挡\t191806\nxxw\t191807\nxxv\t191808\n淊\t191809\n淋\t191810\n淅\t191811\n淆\t191812\nxxx\t191813\n淀\t191814\n十几20万\t191815\nxxc\t191816\n工作效率\t191817\nxxa\t191818\n14万亿元\t191819\n淘\t191820\n淙\t191821\n巴毛\t191822\n专业化\t191823\n巴比\t191824\nxxi\t191825\nxxh\t191826\nxxo\t191827\n优惠券\t191828\nlofo\t191829\n上海申花\t191830\n淬\t191831\n13764157237\t191832\n淮\t191833\n张家辉\t191834\n无数万一\t191835\n淫\t191836\n喔喔嚄\t191837\n罔顾\t191838\n淡\t191839\n淼\t191840\n这么多\t191841\n淹\t191842\n挤占\t191843\n杀神\t191844\nBLACK\t191845\n混\t191846\n深\t191847\n不晓得\t191848\n两只猫\t191849\n2031203120333364\t191850\n第三名\t191851\n广州白云区红十字会医院\t191852\n桃养人杏\t191853\ndumplings\t191854\n猜词\t191855\n生活费\t191856\ntions\t191857\n999522288900\t191858\n天朗气清\t191859\n江哥\t191860\nxxat\t191861\n300家\t191862\n双人参观券\t191863\n注册商标\t191864\navav片\t191865\n輯擞\t191866\n扣上\t191867\n解约\t191868\n高雅拉\t1918
69\nnikk\t191870\n没嫁\t191871\n1cc\t191872\n1234232\t191873\nHP4\t191874\n林丛虎\t191875\n赶考\t191876\n塔顶\t191877\n丘比千岛酱\t191878\n二爷\t191879\n春风吹们\t191880\n赛程\t191881\n爱过伤过痛过\t191882\n制\t191883\n刷\t191884\nejudhf\t191885\n驯龙记\t191886\n重新来过\t191887\n到\t191888\nLegacy\t191889\n刿\t191890\n刽\t191891\n刺\t191892\n刻\t191893\n券\t191894\n刹\t191895\n从今\t191896\n判\t191897\n別\t191898\n一七个\t191899\n刮\t191900\n刭\t191901\n刪\t191902\n别\t191903\n刨\t191904\n37名\t191905\n列\t191906\n四叶草\t191907\n刕\t191908\n划\t191909\n逝者如斯夫\t191910\n从份\t191911\n2011年1月1日起\t191912\n永远不聊\t191913\n初\t191914\n闪亮登场\t191915\n创\t191916\n刘\t191917\n则\t191918\n分\t191919\n切\t191920\n刄\t191921\n刂\t191922\n过海\t191923\n刀\t191924\n刁\t191925\nhodunchsijsobudkemapoh\t191926\n耕读社\t191927\n刊\t191928\nbeautifullies\t191929\n刈\t191930\n兵圣\t191931\n大一年\t191932\n清闲\t191933\n屋妮屋\t191934\n呢片\t191935\n黎星甫\t191936\n艳照\t191937\n呆头呆\t191938\n唐潇\t191939\n房贷\t191940\n维他命B1\t191941\n冷雨\t191942\n谢谢你给我的老杨知啦\t191943\n普普通\t191944\n7uuuuuu\t191945\n相对与不爱真亲身教我爱下心理医车\t191946\n支笔\t191947\n索拉卡\t191948\n艳煞\t191949\n金鱼池\t191950\n霸屏\t191951\n丁紫宣\t191952\n叫就叫\t191953\n2瓶\t191954\n大声点\t191955\n频道梦工场\t191956\n怪杰\t191957\n苗妞\t191958\n国展\t191959\n筹集\t191960\n公平合理\t191961\n两江道\t191962\n压轴\t191963\n九号十号\t191964\n甲种糖\t191965\n入网\t191966\n赛尔美\t191967\nhrhr\t191968\n7000万欧元\t191969\nlinder\t191970\n封面\t191971\n恩胜\t191972\n七九版\t191973\n雷c\t191974\n米优\t191975\n六二十四一四\t191976\n5855582\t191977\n九十八三十九202122二三\t191978\nax2y2x2y\t191979\n回聊\t191980\n165cm\t191981\n女皇帝\t191982\nfoote\t191983\n国徽\t191984\n孙佳旭\t191985\nv光\t191986\n周不周到\t191987\n照理\t191988\n1372587673\t191989\nacac\t191990\n州市区\t191991\n就教\t191992\naa度秘\t191993\n闫新宇\t191994\n刘进军\t191995\n看上去\t191996\nusher\t191997\n沼泽王\t191998\n宁度\t191999\n周里\t192000\n小汉堡\t192001\nHalim\t192002\n每个袋\t192003\n嗯谢\t192004\n蛋糕饼干\t192005\n爱情片晚娘\t192006\n老外们\t192007\n天府博艾斯\t192008\n13459019866\t192009\n222222222222\t192010\ncoddff\t192011\n就敢\t192012\n1391\t192013\n八百八十四十八百八十四八千八百四十四千米\t192014\nhsb\t192015\n要林\t192016\n虾仁三鲜蒸饺
\t192017\nhsf\t192018\nhsg\t192019\nhsd\t192020\nhsj\t192021\nhsk\t192022\n金敏熙\t192023\nhsi\t192024\nhsn\t192025\ngzgagxcfh\t192026\n链芊芊\t192027\nhegegshs\t192028\nhss\t192029\n#DIY食谱大征集#桂花小豆粥\t192030\nhsu\t192031\n说甚来\t192032\n晨光学校\t192033\n疙瘩子\t192034\n浏览量\t192035\n7777777777\t192036\n嗯用通\t192037\n欢呼性\t192038\nensure\t192039\n花儿花儿开花儿了歇了\t192040\n好喜欢你好无聊\t192041\n中英\t192042\nhjjiyit\t192043\n焦作\t192044\n真的家真的假的真的假的\t192045\n好美好美的\t192046\n2999年\t192047\n自尊象\t192048\n夏目友人帐\t192049\n酱油味\t192050\n樂樂\t192051\nbath\t192052\n圈哥\t192053\n侵袭\t192054\n休息术\t192055\n出新\t192056\n没有了爱\t192057\n伊芙\t192058\n苏果超市帮\t192059\n采购\t192060\n二等座\t192061\n延津县东街\t192062\n2012年3月16日\t192063\n一把脸\t192064\n1十I\t192065\n朴盖尔秀\t192066\nhsfggd\t192067\n55026288000\t192068\n14000\t192069\n喘气\t192070\n赵传\t192071\n男女通吃\t192072\n，、＼\t192073\namarina\t192074\n烈日\t192075\n嗯多\t192076\n所以说我好\t192077\n悲歌唱彻讲古水井直播秀借贷吃言归正军在此\t192078\n新锐志\t192079\n搭载\t192080\n七二零八\t192081\n元气少女缘结神\t192082\n徐少龙\t192083\n1十1\t192084\n搭车\t192085\n乾坤袋\t192086\n1十7\t192087\n恺帅\t192088\n1十9\t192089\nNihai\t192090\n神仙们\t192091\n25根\t192092\n嗯露天\t192093\n77Q22\t192094\n2006年4月22号\t192095\n2336\t192096\nVoi\t192097\n嗯头\t192098\n127569840\t192099\n嗯太\t192100\n婚前\t192101\n怠欣宁\t192102\n气血\t192103\n入非非\t192104\n欧北\t192105\nbhiphotosbaiducomxiaodupicitem6609c93d70cf3bc7f602a899d600baa1cd112a40jpg\t192106\n监守自盗\t192107\n正太\t192108\n五门\t192109\n正大\t192110\nddhfhrfy\t192111\n玩鞭炮\t192112\n溅出\t192113\n碰坏\t192114\n四平路\t192115\n直到世界末日\t192116\n河婆\t192117\n返老还童\t192118\n秘桶\t192119\n李凯健\t192120\n盗墓者\t192121\n说句缩句\t192122\n严重\t192123\n秘桃\t192124\nwhen\t192125\n路透\t192126\n售出\t192127\n孝期\t192128\n周强\t192129\n勉强自己\t192130\nredvelvet\t192131\n1个世纪\t192132\n新屏山\t192133\n正荣门\t192134\nbiomilp\t192135\n那你真的牛\t192136\n力菲\t192137\n徐秀清\t192138\n张凤波\t192139\n环保\t192140\n80088864018888\t192141\nxiensi\t192142\n情真意切字字滴血口牙\t192143\n受伤了你不来\t192144\n招待\t192145\n谢谢你的关注\t192146\n4250千克\t192147\n秒回子\t192148\n海蒂\t192149\n公路\t192150\n招徕\t192151\n文拉\t192152\n26点\t192153\
n你是鬼我最怕鬼了我讨厌你\t192154\nbyby\t192155\n刘文沣\t192156\n西塘里小区\t192157\n333i\t192158\n泄露\t192159\nHDS\t192160\n杜佳慧\t192161\nhid热饭\t192162\n睡懒觉照\t192163\n实数\t192164\n明后\t192165\n明名\t192166\n傻子儿\t192167\n佛事\t192168\n333O\t192169\n参选\t192170\n人在异乡\t192171\n说不行货运维普外挂念经费时而言之王道理\t192172\n粑粑度秘\t192173\n一什么不苟\t192174\n疯吧\t192175\n粉饰\t192176\n而亡\t192177\n立麦配金在中\t192178\n丁四年\t192179\n1399\t192180\n度秘我要怕吐你猜我猜从了\t192181\n到顶\t192182\n差别\t192183\n放书\t192184\n粉饼\t192185\n885分\t192186\n3333\t192187\n八达岭高速\t192188\n无时不\t192189\n两三遍\t192190\nqgjp\t192191\n宗教信仰\t192192\n招商地产联合体\t192193\n整齐划一\t192194\n权雪莹\t192195\nritzy\t192196\n解放前国民政府\t192197\n期见\t192198\n要挟\t192199\n歼-20\t192200\n好叫声\t192201\n首乌龟\t192202\n#2x12#\t192203\n妈妈的话\t192204\n七八载\t192205\n倪春\t192206\nN98\t192207\n罗小黑战记\t192208\n门冬\t192209\n乒乒乓乓\t192210\nporn\t192211\nN93\t192212\n价位\t192213\nN97\t192214\nN94\t192215\nN95\t192216\ndrggg\t192217\nKAOSUD\t192218\n干么了\t192219\n哑妻\t192220\n金草鱼\t192221\n58次\t192222\nport\t192223\n汗汗\t192224\n加紧\t192225\n宛若\t192226\n邱走\t192227\n一月五\t192228\n累不正\t192229\n旅馆\t192230\n62.5%\t192231\n禹淑芳\t192232\n喝豆浆\t192233\n累不死\t192234\n胆固醇\t192235\n玉杖\t192236\n森雅M80\t192237\n爱大爱啦斯加雪橇犬\t192238\n三个半小时\t192239\n一月二\t192240\n二六点\t192241\n李莎莎\t192242\n中考\t192243\n5条\t192244\n5杯\t192245\n手好\t192246\n女孩女\t192247\n佩戴\t192248\n赵敏君\t192249\n城里\t192250\n李玉堂\t192251\n孙女\t192252\n辛卯日进志\t192253\n梦想秀\t192254\n111233542653726554672556255524566685565756856\t192255\n小叮当\t192256\nhsr\t192257\n柳岩\t192258\nhhgd\t192259\n林梓琪\t192260\nhhgf\t192261\n空姐\t192262\n互联良和你\t192263\n点q法\t192264\nhhgj\t192265\n佘诗\t192266\n立者\t192267\n血淋淋毛骨悚然\t192268\n腊八粥&amp\t192269\n翘晚安吧\t192270\n宝贝儿了拜拜我走了\t192271\n不由己\t192272\nu3y\t192273\nVOLTE\t192274\n凯健\t192275\n过饭\t192276\n车秘\t192277\nBacad\t192278\n逗逼度\t192279\n性趣向\t192280\n密室个死不要脸\t192281\n车科\t192282\n阿迷宫\t192283\n1万平方公里\t192284\n詹皇\t192285\n枫叶\t192286\n春式\t192287\n18865886\t192288\n367三二百五十三\t192289\nexoxx\t192290\nTonightImgonnarockthisplace\t192291\n建队\t192292\nhsy\t192293\n韩翻\t192294\n趋之若鹜\
t192295\n上六年\t192296\n屑\t192297\n我是大美你我\t192298\n印实\t192299\n配乐\t192300\n180014\t192301\n王书涵\t192302\n15000元\t192303\nU-19联赛\t192304\n喜洋洋\t192305\n衫衫\t192306\nedcwfcrgvqnnnnnhfocfvhggbvvvvccbvdugh\t192307\nhbdv\t192308\n丰节\t192309\n6月22日\t192310\n北二环\t192311\n再说笑\t192312\n一膏多\t192313\nhbdb\t192314\n洋芋\t192315\nhbdf\t192316\n床单\t192317\n不够我塞牙缝\t192318\n想说看\t192319\n元旦日\t192320\n11700\t192321\n吹吹风\t192322\n屋\t192323\n天龙盖地虎\t192324\nsaawwqwwq\t192325\n音译度\t192326\n烦人\t192327\n玛萨拉蒂玛莎拉蒂\t192328\n司机\t192329\n我他\t192330\n5000ok\t192331\n小鼠\t192332\n辽西寺\t192333\n祝笑口\t192334\n对比\t192335\nDSLR.Bot\t192336\n泪牛\t192337\n嗯梨花\t192338\nYEYE\t192339\n二大街东方名居\t192340\n推己及人\t192341\n芦芽山自然保护区\t192342\n昙花\t192343\n我仲\t192344\nfkhojh\t192345\n基督\t192346\n我你我你我你我你我你\t192347\n480元\t192348\n猪你是猪你是猪你是猪我讨厌你我讨厌你我讨厌你我讨厌你\t192349\n避风头5\t192350\n真是的再说\t192351\nsyyw\t192352\n我们\t192353\n胡虎\t192354\n差不到\t192355\ntoonthatway\t192356\n迷你好\t192357\n常有\t192358\n僵尸男孩\t192359\n什么计\t192360\n厄普丘奇\t192361\nJackie\t192362\n人口\t192363\n健健康康\t192364\n麻麻\t192365\n三星应用宝\t192366\n昌黎\t192367\n此时此刻\t192368\n登山\t192369\n铜川站\t192370\n回民街\t192371\n脸水\t192372\n讨厌包\t192373\ngurirueie\t192374\n串石榴石\t192375\n5月25日\t192376\n刘书楠\t192377\n医改\t192378\n明天卢\t192379\n华沙郊区\t192380\n给力劈可乐\t192381\n墨学\t192382\n5升\t192383\n审请\t192384\n升高\t192385\n好好玩玩\t192386\n九万三千元\t192387\n八十少30\t192388\n屵\t192389\n坝头\t192390\n闪电样\t192391\n一亿元\t192392\n地炕地炕\t192393\n就好可爱\t192394\n手工业\t192395\n我信你的好妈妈\t192396\n伪丐帮\t192397\n高世楫\t192398\n木那见大夏天是三角的盒饭合久必分吸烟三国\t192399\n尿片\t192400\n篮打水一场空\t192401\nrg5rg\t192402\n怎都\t192403\nsxyubeyphnt\t192404\nbata\t192405\n嫖宿幼女罪\t192406\ntele\t192407\ndiwk\t192408\n华以\t192409\n刘梦瑜\t192410\n拳道\t192411\n谱改谱\t192412\n噢度秘\t192413\n采法\t192414\nshuiba\t192415\n原产\t192416\n一个多次\t192417\n一个再讲一个\t192418\n怀怀\t192419\n维修工\t192420\n染成\t192421\n王社\t192422\n明天下大雨明天下\t192423\n实有\t192424\n大家好我是度秘你好\t192425\n游说\t192426\n迷图\t192427\nSbirthday\t192428\n晗粉\t192429\n有钱花呀200\t192430\n象人鬼不象\t192431\n刘金宝\t192432\n做自己爱\t192433\ncuzf\t192434\
n怀怨\t192435\n染房\t192436\n催眠\t192437\n迷团\t192438\n探戈节\t192439\n18147175888\t192440\n236856\t192441\n十月十号\t192442\n屈林馥\t192443\n数落\t192444\n马登兰\t192445\n引述\t192446\n丁万一\t192447\n溜冰不有\t192448\n长宁体操中心\t192449\n真视通\t192450\n夜行性\t192451\n零几分\t192452\n鲁发改院\t192453\n1585843460\t192454\n众神\t192455\n吉恩\t192456\n张轩垚\t192457\n可好吃G8\t192458\nn┃┏┃┃┃┏┛n┃━┛━┛┏┛n━┛━┛n\t192459\n乐好\t192460\n上原下\t192461\n66665\t192462\n八百个月\t192463\n舍不男不女\t192464\n吃草用\t192465\n豪沃\t192466\n张静芬\t192467\n雷山\t192468\n4俄\t192469\n一九八五\t192470\n胡耀邦\t192471\n一个把\t192472\n海曙\t192473\n太迟迟迟迟迟迟迟迟\t192474\n好好\t192475\n罗布斯\t192476\n五百\t192477\nyuy\t192478\n身患奇\t192479\n赵秘\t192480\n客炉\t192481\nqiyf\t192482\n不要类就\t192483\nyus\t192484\nyur\t192485\nyuu\t192486\nyut\t192487\nyuw\t192488\nyui\t192489\nyuh\t192490\nyuk\t192491\n不是错\t192492\nyuo\t192493\nyun\t192494\nriniquanjia\t192495\n四美达你\t192496\nyue\t192497\n子瑜台灣\t192498\nyug\t192499\nyuf\t192500\n几十万岁\t192501\n周兰兰\t192502\ntuuyy\t192503\n财政支出\t192504\nguzzajx\t192505\nhellotumi\t192506\n电信日\t192507\n黄凯芹\t192508\n拔草\t192509\n小板凳\t192510\n静海县\t192511\n所困\t192512\n死色\t192513\n脑婆\t192514\n棍棒\t192515\n明天9点\t192516\n洗澡澡\t192517\n14530\t192518\n影棚\t192519\n伊柠\t192520\nchattingwitheachotherOK\t192521\n原力\t192522\n闲溜\t192523\n土豆焖饭\t192524\n红眼\t192525\n怜悯心\t192526\n我想你了我需要\t192527\n李洪波\t192528\n仲么\t192529\n知者\t192530\n存入\t192531\n臭老高\t192532\nIMF\t192533\n05678910\t192534\n神庙庙\t192535\n欣喜感\t192536\n五千天\t192537\n一星期\t192538\n跟我要\t192539\n89101\t192540\n99ww\t192541\n百宝箱\t192542\n馆子\t192543\n额恶心\t192544\n太不给力\t192545\njdydhd\t192546\nk7k小花\t192547\n出苗\t192548\n度秘你是我的小度秘\t192549\n嗑头\t192550\n望洞庭\t192551\n幺八五幺\t192552\nskdj\t192553\n25141\t192554\n多措\t192555\n扛着\t192556\n我的本命\t192557\ntffftd\t192558\n永盛麻糕\t192559\n谈人生\t192560\nx2分之一\t192561\n不在线\t192562\n前天凤凰卫视\t192563\nfanice\t192564\n翟庄\t192565\n大醉方可\t192566\n钟琪怡\t192567\n天黑请闭眼\t192568\n塑料盒\t192569\n52个\t192570\n\t192571\n5x38\t192572\nhaotian\t192573\n新城市\t192574\nother\t192575\n没命\t192576\n懂事长\t192577\n没味\t192578\n和蚀\t192
579\n张保虎\t192580\nvjckvbljlh复读机\t192581\n考试后\t192582\n八嗄\t192583\n赵邮政\t192584\n红米\t192585\n爱国人士\t192586\n寒地\t192587\n韩国教育广播公司\t192588\n艳遇\t192589\n2949420786\t192590\n座位\t192591\n设计者\t192592\n武冲\t192593\n韩明玉\t192594\nlater\t192595\n和美白\t192596\n1十2\t192597\n嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯嗯\t192598\n趣闻\t192599\nxakb\t192600\n地心探险记2\t192601\n灵不灵\t192602\nRATIO\t192603\n懂爱\t192604\n再见我希望\t192605\n字人\t192606\n专著\t192607\n相反的路\t192608\n行行\t192609\n没得样\t192610\n小鲜奶\t192611\n觉悟\t192612\n盾构\t192613\n行街\t192614\n比美\t192615\n票友\t192616\n流量办公单\t192617\ngkgk\t192618\n剪纸\t192619\n一百万三十八\t192620\n离不开\t192621\ngkgl\t192622\n港式甜品\t192623\n恩施市博文学校\t192624\n中州大道\t192625\n崔嘉怡\t192626\n机正\t192627\n刘正辉\t192628\n心长\t192629\n再说一边飞的头飞的头吩的头头头头逗逗逗逗逗逗逗逗逗逗同\t192630\n环城花园\t192631\nguly\t192632\n博凡\t192633\n[偷笑\t192634\n1一声\t192635\n一觉多\t192636\n3月一号九点32\t192637\n学学日语\t192638\n373972184809\t192639\n二二零四二五\t192640\njfnx\t192641\n幺五六二零二\t192642\n浮法凤凰信不信老子\t192643\n狭小组\t192644\n芭朵兰花\t192645\nheehhe\t192646\n我不犯人\t192647\nkjbijghwkonhhnxukwonndhdndidjejixndiedmxunfu977398\t192648\n妙招\t192649\n张生记\t192650\n忘忘忘忘忘忘忘忘忘吾王\t192651\n毁灭波\t192652\n闲\t192653\n唐林霞\t192654\n土鸭\t192655\nDUANG\t192656\n牙髓炎\t192657\n王寒\t192658\n八一佳\t192659\n鸡灯\t192660\n三五本\t192661\n三百九十二\t192662\n不可爱不想和你\t192663\n理财\t192664\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t192665\n我是你的秘书\t192666\n美体\t192667\n哈塞车\t192668\nqqwweerrttyyuuiiooppaassddffgghhjjkkllzzxxccvvbbnnmmqwertyuiopa\t192669\n小样儿\t192670\n闯亲\t192671\n8kg\t192672\n宁乃宁乃宁乃梨乃梨乃梨乃梨乃\t192673\n贺尔蒙\t192674\neudn\t192675\n佛家\t192676\n肝种\t192677\nhtmlimgc0941236bf4386157bb19fe2cb0ebcd6gif\t192678\n我长的漂亮为什么你不爱我\t192679\n林度度\t192680\n铭心街\t192681\n任容\t192682\n一条信\t192683\n嗯改天\t192684\nuhigivi\t192685\n嘴巴\t192686\n白玫瑰的过来不然我十二了你\t192687\n遂溪\t192688\n次年\t192689\n好久好久没\t192690\n头痛\t192691\n梦日边\t192692\nwrg\t192693\n五幺七网\t192694\n军训\t192695\n为为为为为为为为为为为为为为为为为为\t192696\n不敢我敢\t192697\n乌来了乌来了\t192698\n聂洪波\t192699\n八三夭\t192700\n扇风\t192701\n三十分钟后\t192702\n整死\t192703\n大荔县\t192704\n一百里\t192705\n金山屯\t192706\
n亮片\t192707\n财大\t192708\n我们都变\t192709\n顾智胜\t192710\n贫儿\t192711\n经病\t192712\n矮东子\t192713\n阿玛尼\t192714\n我恨你我讨厌你讨厌你\t192715\n说泪话\t192716\n欧化\t192717\nwiyi\t192718\n了汪\t192719\n晴天小蚂蚁\t192720\n上人\t192721\n13970032163\t192722\njjsujf\t192723\n雪兰星\t192724\n成活率\t192725\n上京\t192726\n插销\t192727\n依山傍水\t192728\n上交\t192729\n余下\t192730\n保险费\t192731\n首期\t192732\n玛丽莲梦露\t192733\n體驗體驗\t192734\n小不懂\t192735\n了求\t192736\n上五\t192737\nwsrx\t192738\n设计师们\t192739\n040点\t192740\n农业银行会安海滩度假村酒店\t192741\n好滴\t192742\n841181815152845176646\t192743\n我爱你我爱你我爱你我爱你我爱你我爱你我爱你度秘\t192744\n周开\t192745\n出格\t192746\n大醉易失德,\t192747\n猾而不利贝利教堂\t192748\n11月4号\t192749\n土金\t192750\n马柯怡\t192751\n1213\t192752\n1210\t192753\n阿森\t192754\n1214\t192755\n衡水市\t192756\n客运\t192757\n小秘\t192758\n方平\t192759\n大千世界\t192760\n六月十八\t192761\n曹启标\t192762\n58个\t192763\n唱做\t192764\n褚召乐\t192765\n张红\t192766\n王世元\t192767\n911纽约世贸惨案\t192768\nzhito\t192769\n雷帆亲家娃\t192770\n胖丸子\t192771\n毽蹕阡\t192772\n56家\t192773\n六和一个九\t192774\ncgggchv\t192775\nlcoyydyoyococuuppuug\t192776\n这几天\t192777\n广东省英德市东华镇黄陂中学\t192778\n舒压\t192779\n来来到\t192780\n點痛\t192781\n1369852\t192782\n厦门度秘我在说你秘秘秘密杨阳洋没秘绵绵\t192783\n老大娘\t192784\n不明白\t192785\n旗帜\t192786\n是真是假说\t192787\n美狗\t192788\n傲剑OL\t192789\n郑紫琪\t192790\n粧棉\t192791\n庆栋\t192792\n万摩羯\t192793\n那不好不好不好大傻冒大傻冒\t192794\noc杯\t192795\nXXQ\t192796\n真的好累\t192797\n12颗\t192798\n不无关\t192799\nXXX\t192800\n不是我问问现在不一号\t192801\n蛮少女黑王子最终极\t192802\nXXC\t192803\n话尽\t192804\n飒漫乐\t192805\n街上人\t192806\n私房犬\t192807\n嗯奇怪\t192808\nXXM\t192809\nweek\t192810\nXX0\t192811\n阮\t192812\nXX5\t192813\n杜熠\t192814\n三课学\t192815\n阧\t192816\n阤\t192817\nweed\t192818\n白塔镇中心学校\t192819\n阻\t192820\n张比蒙\t192821\n阿\t192822\n防\t192823\n阳\t192824\n7年\t192825\n阶\t192826\n阴\t192827\n高冷拉\t192828\n阉\t192829\n阎\t192830\nppyh\t192831\n挨打的\t192832\n阁\t192833\n禾云\t192834\n班会\t192835\n阄\t192836\n短裙\t192837\nwanonotit\t192838\n铠甲我最好的我最好\t192839\n队\t192840\n阜\t192841\n蒂芙vbk\t192842\n阖\t192843\n阔\t192844\n靠你好不好\t192845\n還停\t192846\n撸尔山\t192847\nTeapot紫砂壶\t192848\n说一说\t192849\n
刘罗\t192850\n追上\t192851\n名字号\t192852\n赵庄\t192853\n动物式\t192854\n蜡笔小新电影新鲜三黄艺星英超\t192855\nsammsm\t192856\n在家呆呆\t192857\n赵庙\t192858\n十恶\t192859\nGIRL\t192860\n铜烂铁\t192861\n肥腿\t192862\n缤智\t192863\n小姚\t192864\nUIUI\t192865\n表情\t192866\n这么多首歌\t192867\na型血\t192868\n赵雅珂\t192869\n快餐\t192870\n啦啦啦啦啦你猜我是什么样\t192871\n5884684582\t192872\n谁和你装了你是真不要脸还是假不要脸\t192873\n大快朵颐\t192874\n姓小叫\t192875\n千千\t192876\nдWhatareyou\t192877\n路途\t192878\n好大妈咪娃娃吧忙\t192879\n精人\t192880\n就好时光\t192881\nsexame\t192882\n牛BI\t192883\n两年\t192884\n小姐\t192885\n组队\t192886\n好哪好那一起那一下下个\t192887\n陈飞宇\t192888\n小姑\t192889\n柘城县\t192890\n1824545\t192891\n感勤\t192892\nPpppppppppppffppfppppfpffppfpppptptpppppptptptpppptppppppppppppppOooppopppppppppppppXpdddsssisisisisisisiaiaaiiaaiaaaaaaaasdddddssssss777777777uuuu\t192893\nQQ输入法\t192894\n杨两端\t192895\n两幢\t192896\n阿瑜\t192897\n阿瑞\t192898\n阿瑟\t192899\n4444444444445555555555555555556666666666666666\t192900\n13615371038\t192901\n西康路\t192902\n处男女\t192903\n歌儿\t192904\n徐玉娇\t192905\n巨型贵宾\t192906\ntolog\t192907\n年少时\t192908\n精于\t192909\n休养\t192910\n两幅\t192911\n屁股眼\t192912\n领证明\t192913\n促销\t192914\n小东西\t192915\n腿部份\t192916\n阳前妻\t192917\n我了吗我真的好想和你\t192918\n刚才vvgjgggcfhxxhcvvcbcb\t192919\n囧途\t192920\n仔名\t192921\n流汗喽我的未来hellohello\t192922\n朴朴心\t192923\n三聊\t192924\n32875575\t192925\n评级\t192926\n县民\t192927\n道琼斯\t192928\nmoof\t192929\n么么桶\t192930\n半江\t192931\n克拉维克\t192932\n哈巴狗\t192933\n满心\t192934\n夜花\t192935\n差不信爱信\t192936\n昆山\t192937\n12588866\t192938\n患得患失\t192939\n凭头论足\t192940\noppoa53\t192941\n三联\t192942\n张云来\t192943\n五吨\t192944\n晓小七七\t192945\n反正面\t192946\n34755862\t192947\nWOSOKS\t192948\n所周知道\t192949\n萃雯\t192950\n卡卡卡健健康康\t192951\n坏笑]米逼我,顛起上離\t192952\nallanterins\t192953\n魏文夏\t192954\n好心好意\t192955\n为么\t192956\n险种\t192957\n豪乳\t192958\n泡黄张\t192959\n五名\t192960\n秘证\t192961\n泪雨\t192962\n11gk\t192963\n暗八扇\t192964\n边境\t192965\n议论村\t192966\n屏裂\t192967\n战计\t192968\n吃炒饼\t192969\n魔术\t192970\n健健闪闪男男\t192971\nx903\t192972\n客票\t192973\n遭遇到\t192974\n群敏\t192975\n躺平任\t192976\n12次\t192977\n走势\t192978\n
秘我问你我是你的主人恩我问你\t192979\n桂俊楠\t192980\n走动\t192981\n我记的你小名叫孙子\t192982\n日本恋爱少女和励志\t192983\n磨平发亮\t192984\n读后感\t192985\n涌入\t192986\n斗嘴\t192987\n缴存\t192988\n八排\t192989\n走办\t192990\n第次\t192991\ntherf\t192992\n旅游城市\t192993\nthere\t192994\nkehe\t192995\n峨眉山佛教协会\t192996\n瓜子脸樱桃\t192997\nfrrff\t192998\n跌破\t192999\n和文文\t193000\n小姨\t193001\n福东西\t193002\nthers\t193003\n12345678910123976428435120\t193004\n一千岁\t193005\n喷药\t193006\n温柔型\t193007\n玩斗\t193008\n失明\t193009\nuuddjjiffssfkdr\t193010\n小丁雪\t193011\n嗯瓮猫\t193012\n我的小度秘好久不见了你还好\t193013\n100十309二\t193014\n放屁\t193015\n微波儿\t193016\n你是我的什么我是你的什么\t193017\n玩文\t193018\n蒸蒸日上\t193019\ng什\t193020\n殊凰恋\t193021\n比座机\t193022\n途安\t193023\n吴宇欣\t193024\ng份\t193025\n中控室\t193026\n7372625255263647485\t193027\n嚯嚯\t193028\n什么觉\t193029\n超四\t193030\n13点31分\t193031\n脚面\t193032\ngrass\t193033\n完整版\t193034\n这样做我不会\t193035\nTAT书店\t193036\n贝太\t193037\n急等\t193038\n九歌\t193039\nColorwork\t193040\n六大头\t193041\n得意得意得意害羞害羞害羞害羞发怒\t193042\ndudgrffudchu\t193043\n与子偕老\t193044\n保证\t193045\n嗯哪勒\t193046\n王叔\t193047\n2015年1月1号\t193048\n骞飞\t193049\n水源\t193050\njiāngyogs\t193051\n化简比\t193052\n支线\t193053\n蒙长裤\t193054\n理所喜欢我喜欢我喜欢我哪你\t193055\n王号\t193056\n王台\t193057\n嘎嘎VV\t193058\n欧佳湲\t193059\n新华分享#食药监局\t193060\n非得\t193061\n光輝\t193062\n说了我不妈妈\t193063\n戴女用\t193064\nvjfhjkk\t193065\n看我爱你\t193066\n苏玲\t193067\n2月1号\t193068\n60亿年后\t193069\n年代表\t193070\n传世诗作\t193071\n戴红\t193072\n净发些\t193073\n广州中海油\t193074\n朱卫红\t193075\n土族\t193076\n一亲米\t193077\n市局\t193078\nvgvvbvbv\t193079\n彪悍\t193080\n那么多哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈哈\t193081\n吴小晗\t193082\nUhfrhk\t193083\n做不到\t193084\n人物传记\t193085\n一支\t193086\n态度样\t193087\n十八楼\t193088\n450626\t193089\n公文袋\t193090\n凯乐\t193091\n疯了再说\t193092\n一攻\t193093\n一政\t193094\n属于我预付\t193095\n彻彻底底\t193096\n潘慧\t193097\n旋风少女第二季\t193098\n忠臣\t193099\n枫树林\t193100\n赔罪\t193101\n几岁度\t193102\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t193103\n列夫托尔斯泰\t193104\n点背\t193105\n王璟\t193106\n孟文起\t193107\n十万国栋\t193108\ngcjkbyu\t193109\n王璐\t193110\n15.9%\t193111\nbcbbbbbbbbbbbbbbbbbbbbbbb\
t193112\n疏忽\t193113\n1顿\t193114\n王璇\t193115\n辅导书\t193116\n1页\t193117\n966998887\t193118\n菠萝味\t193119\n剛才徐很高興徐V火鍋粗度\t193120\nareall\t193121\n妙传\t193122\n恶作剧之吻\t193123\n驾车\t193124\n小讒嘴[偷笑][偷笑\t193125\n教不改\t193126\n舒长\t193127\n1888岁\t193128\n贾青鹏\t193129\n珂珂\t193130\n句话说\t193131\nzxfsferddfdt\t193132\n211325982\t193133\n心怀愤恨\t193134\n嗯嗯游戏\t193135\n风寒\t193136\n说白了\t193137\n雍和宫\t193138\n选一\t193139\n金屯镇\t193140\n选上\t193141\n呢影\t193142\n左拐慢走\t193143\nwhena\t193144\n我不打你我不要你了我不理你\t193145\n白素度\t193146\n乖你一定要乖\t193147\n哭天\t193148\n不不不不不不不不不不不\t193149\nintentIntentSK11714776658E5739077D3E145AFCDC013141D00B1200C979CFFE82F6CB936ACBBF3BA85339end\t193150\n实效\t193151\nmaroon5\t193152\n一到\t193153\n选举\t193154\n无产者\t193155\n劝解\t193156\n我们的全能优质爱豆\t193157\n99999999999685585550277788t…24888888888898888888855\t193158\n助人为\t193159\n解闷\t193160\n蹦瞎嘎拉嘎蹦沙嘎拉嘎春阳施蛰\t193161\n五篇\t193162\n火炉\t193163\n好梦好梦好梦\t193164\nbauxnskx\t193165\n双木不成林打个字迷\t193166\n五叔沽\t193167\n王力宏\t193168\n意识破\t193169\nmamatoothermito\t193170\nytgff\t193171\n爱犬\t193172\n好看不好玩\t193173\n咏志\t193174\n蝇头小利\t193175\n电钱\t193176\n读后感蓬莱之歌\t193177\n关在山\t193178\n火炬\t193179\n阳度\t193180\n十八号\t193181\nhhhgttg\t193182\n迷路\t193183\n一个250\t193184\n54949467646979497988959464649464944646497686859568946864686467648979719759766567767\t193185\n六点6点\t193186\n沙面\t193187\n以家仲\t193188\n法克nn法克\t193189\n罗汉钱\t193190\n杜妮娜\t193191\n雪宝\t193192\n晴雨衣\t193193\n朝圣者\t193194\n销售额\t193195\n偏振\t193196\n掉下\t193197\n一毛五\t193198\n新奥尔良黄蜂队\t193199\n罪业\t193200\n一毛二\t193201\n零零秒\t193202\n步步别\t193203\n近百万\t193204\nwka\t193205\n南贾城村\t193206\n小布丁\t193207\n喝酸奶\t193208\n宽甸\t193209\n手术台\t193210\n一万一万\t193211\n高高在上晚\t193212\n大直沽\t193213\n近距离\t193214\n乌冬面浸散\t193215\n嗯嗯撸点\t193216\n醉花\t193217\n植物大战僵尸狂野\t193218\n红烧\t193219\ngjtgjm\t193220\n找家額敏\t193221\n心心心心\t193222\n刘艳云\t193223\n三牛那拉\t193224\nErica\t193225\n魔道\t193226\n钉鞋\t193227\nYourespeech\t193228\n夜猫子课么一统作业帽子\t193229\n空先锋\t193230\n呢主人\t193231\n几百二十八\t193232\n1111111111111111111zzzzzzzzzzzzzzzzzzz\t193233\n王秋思\t193234\n付笛生\t193235\
n孟博琛\t193236\njoeksn\t193237\nqlolikl\t193238\n洛川曙光中学\t193239\n恒远\t193240\n天明天\t193241\n喜交往\t193242\n凯美瑞\t193243\n若干双\t193244\n三百十二\t193245\n摩天大楼\t193246\n忙秘度秘\t193247\n孙晓琪\t193248\n招商一家吧\t193249\n山路十八弯\t193250\n太阳队\t193251\n没劲没劲\t193252\ndda144ad34260d55f3d7a20cf431ad8510jpg\t193253\n剛回到好想好想陳朝飛公公\t193254\n姿势\t193255\n刘国栋\t193256\n朝阳\t193257\n1788分钟以后\t193258\n歌德\t193259\n11针\t193260\n叫我你可以\t193261\n奥普拉·温弗瑞\t193262\n街舞端\t193263\n若干只\t193264\n2010年9月28日\t193265\n13952310204\t193266\n工程组\t193267\n16767655\t193268\n王馨泽\t193269\n理事长\t193270\n筹码\t193271\n深夜11点\t193272\n禁止性\t193273\n聊了答应\t193274\n美龄\t193275\n若干台\t193276\n国家记忆2010\t193277\n惩罚性\t193278\nokay\t193279\n败诉\t193280\n陈独秀\t193281\n要面子\t193282\n破釜沉舟\t193283\n价值观\t193284\n剑圣\t193285\n毒液\t193286\n炉石\t193287\n脚步\t193288\n闪腰\t193289\n宠辱皆忘\t193290\n子公\t193291\n十三岁\t193292\n恶人\t193293\n雾面\t193294\n药材\t193295\n0.46排名\t193296\n信息系统\t193297\n迪达\t193298\n条点\t193299\n神清气爽\t193300\n卫方\t193301\n小芭拉\t193302\n想取\t193303\n47857755\t193304\n外爹\t193305\nvbhhb\t193306\n我追你好久了我好爱你\t193307\n三里屯\t193308\n44棵\t193309\n斜杠\t193310\n王鹏刚\t193311\n丰台隆\t193312\n喜羊羊与嗯嗯嗯\t193313\n中学生\t193314\n想可\t193315\n月底\t193316\n恒古\t193317\n闲杂人\t193318\n出差\t193319\nwudhfjehc\t193320\n18、20日\t193321\n2222222333\t193322\n电驴大全#\t193323\n月庄\t193324\n那等\t193325\n合规\t193326\nwhilehe\t193327\nbhijk\t193328\n合觉\t193329\n余昌臻\t193330\n117块\t193331\n吉奥尼\t193332\nkiufef\t193333\n4遍\t193334\n真客气\t193335\n北大附院\t193336\n吹熄\t193337\n性福生活\t193338\n东北腔\t193339\n万能钥匙\t193340\n乌姆沙拉亚\t193341\n0927\t193342\n绞尽\t193343\nlevel\t193344\n扬尘\t193345\n临命终\t193346\n南侧\t193347\n阿梦克农\t193348\n孟令豪\t193349\n182251305\t193350\n拉肚子过\t193351\n起灵\t193352\n急躁\t193353\n太幼稚\t193354\n龙祥公园\t193355\n你好我就是那个小女子你好啊度秘\t193356\n哈嗲哈嗲\t193357\n别指望\t193358\nebjd\t193359\n65828855824\t193360\n洪峰\t193361\nwcndy\t193362\n名人堂\t193363\n博学生\t193364\n中南水北调工程\t193365\nMua\t193366\n伫立\t193367\nMub\t193368\nghiphotosbaiducomxiaodupicitem1f178a82b9014a9048514cceae773912b31bee52jpg\t193369\n很温顺\t193370\n李雨轩\t193371\n秘
值\t193372\n董楼村\t193373\n鱼片\t193374\n国际大都市\t193375\nGucgucug\t193376\n很乖爱\t193377\n费事\t193378\nHjhti\t193379\n徐杰\t193380\n痴狂\t193381\nCaon\t193382\n革命式\t193383\n白蛇\t193384\n一个好的啦啦啦啦啦啦你好你好你好你\t193385\n茜妞\t193386\n学拍\t193387\n18682841820\t193388\n赖淑娜\t193389\n防伪\t193390\n上海丽思卡尔顿酒店\t193391\n骚女\t193392\n要饭\t193393\n五六十岁\t193394\n試驗\t193395\n度秘蕴\t193396\n禅门莲花\t193397\n馋嘴\t193398\nrindly\t193399\n吊带衫\t193400\n偷天换日\t193401\njoopp\t193402\n解带\t193403\n诛仙\t193404\n王伟伟\t193405\njet\t193406\nElop\t193407\n调任\t193408\n日积月累\t193409\n永日\t193410\n欧莱雅公司西班牙分公司\t193411\n徐静静\t193412\n永无\t193413\n没啊不信\t193414\n永旺\t193415\n教授\t193416\n上海电视节\t193417\n说来说去\t193418\n两千多\t193419\n１０余万人次\t193420\n总步\t193421\n花苞\t193422\n外派\t193423\n#热博重温#\t193424\n十天二十多天\t193425\n连考\t193426\n袁田祺\t193427\n文文静静\t193428\n诗歌太阳的话\t193429\ngpgf\t193430\n了了了了了坏蛋坏蛋坏蛋\t193431\n贊一個\t193432\n因了\t193433\nFjdhugfxhhFdhggxhvcccrjgcvffhgfdhjxxjczhjrfxfjsxbjxxbkccfhkmmvzfhxzvbCbvfdzjfdxhjgfvigcdhjg\t193434\n不年不节\t193435\n\"\t193436\n破腹\t193437\nsgdvbjdbjd\t193438\n一几十岁\t193439\n萨让\t193440\n易名\t193441\n调价\t193442\nyejedndb\t193443\n花苡\t193444\n没鬼\t193445\n一劲嘉\t193446\n聊了再也不见\t193447\njef\t193448\nN90\t193449\n开除\t193450\n公路线\t193451\n刘宇新\t193452\n陈粒\t193453\nA380\t193454\n400立方米\t193455\n伙伴\t193456\n起藕片\t193457\njek\t193458\n巴林石\t193459\n星光唱将\t193460\n组图\t193461\n蚌壳\t193462\n余哥\t193463\n静悄\t193464\n张培怡\t193465\n方可\t193466\n早我\t193467\n好难受\t193468\n18次\t193469\n我不觉晓的我是女的\t193470\nbyytty6uyy\t193471\n张猫\t193472\n要死掉\t193473\n缴纳\t193474\n早戀\t193475\n5hurdle\t193476\nujsje\t193477\nhttpfhiphotosbaiducomxiaodupicitem2fdda3cc7cd98d1004b05745263fb80e7bec9059jpg\t193478\n张螃蟹\t193479\n妇幼\t193480\n天吶\t193481\n嫩草\t193482\n喜歡\t193483\n朝鲜族\t193484\n坐稳\t193485\nwallmilkknee\t193486\n抽欠\t193487\n天吧\t193488\n酷比\t193489\n酷毙\t193490\npouristevan\t193491\nzyv\t193492\n干纹\t193493\n门冲\t193494\n干线\t193495\n杨铧帆\t193496\n要不认\t193497\n盖世\t193498\n别深奥\t193499\n学习能力\t193500\n邓紫琪\t193501\n不由得\t193502\n天吝\t193503\n婆女肉麻llop0657575775\t193504\n1500多所\t1935
05\n乐浩\t193506\n盖上\t193507\n老婆们儿\t193508\n故国\t193509\n联想的菜睡前有春你故意\t193510\n干红\t193511\n2B2B2b\t193512\n我喜欢岩满录\t193513\n天后\t193514\n天名\t193515\n纸表\t193516\n乱说好\t193517\n爱惜\t193518\namao\t193519\n兰水\t193520\n填填\t193521\n大林\t193522\n游学\t193523\n无听说\t193524\n三月中\t193525\n2221504690\t193526\nexex\t193527\n凯恩斯\t193528\n爱情\t193529\n大枣\t193530\n开瑞行\t193531\n露咧\t193532\n捅死\t193533\n搭上\t193534\n大枪\t193535\n使劲儿\t193536\n游子\t193537\nddddffff\t193538\n地洞\t193539\n前排快播\t193540\n数多20甲\t193541\n人知\t193542\n好帅吖\t193543\n期许\t193544\n我问你的你猜我\t193545\n古老\t193546\n皮皮鲁\t193547\n江特\t193548\n四十厘米\t193549\n普联\t193550\n古耐\t193551\nGUOHAO\t193552\n青秀区\t193553\n不好不好不好\t193554\nBeatrix\t193555\n我是你的阳光\t193556\n纤皮\t193557\n天龙585\t193558\n瑶姐\t193559\nmsorry\t193560\n全国人大法工委\t193561\n3a1pjtmnnnnnnnadgjmtw\t193562\n百乐英语\t193563\nsy1183592509\t193564\n580二千五百八一八十一\t193565\n龙也\t193566\n六英雄\t193567\n193118443494346\t193568\n今天早上6点半\t193569\n回忆\t193570\n甘又坏了游戏吧\t193571\n阿秘\t193572\n回心\t193573\n猪头猪头小猪头\t193574\nsfvvdh\t193575\n六点半\t193576\n20120525\t193577\n1505\t193578\n两双\t193579\n浙大\t193580\n阿秋\t193581\n20120527\t193582\n阿秀\t193583\n中青旅\t193584\n两史\t193585\n两台\t193586\n灰色地带\t193587\n下一个下下一个\t193588\n丁真桑珠\t193589\n七十名\t193590\n两只\t193591\n打头\t193592\n买们\t193593\n两句\t193594\n爆拆\t193595\n两口\t193596\n失得其反\t193597\ngrrgr\t193598\n早晨五点半\t193599\n逃不掉\t193600\nG点\t193601\n秘我不爱\t193602\n秘明天下\t193603\n吕思月\t193604\n没魔方\t193605\n早九点\t193606\n工单\t193607\n奇了缘\t193608\n2011年4月底\t193609\n#3G智能总动员#\t193610\n眉眼\t193611\n八五零二零二二\t193612\n拉紧\t193613\n9月30日\t193614\n快乐就好\t193615\n朱阳松\t193616\n公职\t193617\n得失俱灰\t193618\n蛋炒饭\t193619\n工卡\t193620\nGSM\t193621\n出资\t193622\n觉不着\t193623\n交通\t193624\n壯\t193625\n小米\t193626\n被包\t193627\n黑瓶\t193628\n西游之谜\t193629\n满怀信心\t193630\nouoooiim\t193631\n秒k\t193632\n李楠菁\t193633\n多益\t193634\n五幺七\t193635\n５一百\t193636\npirateship\t193637\n妳給\t193638\n雷公萌\t193639\n十天后\t193640\n呼呼U7uU7UuU\t193641\n幾杯酒\t193642\n几位\t193643\n抽查率\t193644\nAV点com\t193645\n朵花儿\t193646\n盲女\t193647\n活出样\t193648\n几何\t19364
9\n洱海\t193650\n好别扭\t193651\n110张\t193652\n班群\t193653\n奶爸的幸福生活\t193654\n五十音\t193655\n美美丽\t193656\n男神\t193657\n万611743\t193658\n脚脚石\t193659\ndggghj\t193660\n云凯扬\t193661\n51分\t193662\n冰谷超\t193663\n4434664643\t193664\n瓶子\t193665\n大棒\t193666\n5板\t193667\n愚蠢\t193668\n坏孩\t193669\n百合花\t193670\n曲奇\t193671\n137斤\t193672\n提莫泊\t193673\n甩开\t193674\n日前\t193675\n多钱克\t193676\n210m\t193677\n婉转\t193678\n柏菲\t193679\n悠哈\t193680\n悠哉\t193681\n刘浩\t193682\n迈阿密\t193683\n陈泽明\t193684\n张福保\t193685\n刘海\t193686\n大一级\t193687\n因为我喜欢\t193688\n西部会战\t193689\n扬度秘\t193690\n文士\t193691\n邱子怡\t193692\n盟国\t193693\n摸摸哒\t193694\n深渊\t193695\n急死\t193696\n必要\t193697\n公知鱼\t193698\n娇小\t193699\n奉天\t193700\n5两\t193701\n轩亦魅\t193702\n黄旭柱\t193703\n赛罗奥特曼全集\t193704\n百老汇\t193705\n朱晓琳\t193706\nwgjptnjgTP\t193707\n用药\t193708\n开心甜\t193709\nmicror堡\t193710\n看恶心\t193711\n2108\t193712\nhhgh\t193713\n2103\t193714\n2100\t193715\nhhgk\t193716\n2105\t193717\nohmygod\t193718\nTiPadPro\t193719\n前二天\t193720\n烫发\t193721\n押分\t193722\nYzsvitrgjhb\t193723\n换人\t193724\n总得分\t193725\n减肥群\t193726\n朱虹昱\t193727\n墙头\t193728\n510902706199188459\t193729\n听证会\t193730\n高福利\t193731\n奥闹\t193732\nstmetout\t193733\n考期末考\t193734\n磨难剧\t193735\n乏了回\t193736\n那你百家姓生的查\t193737\n里巴拉巴拉巴拉巴拉呱呱呱呱呱呱呱叽呱叽呱呱呱呱呱呱\t193738\n大太阳\t193739\n1S1S521\t193740\n洪亮\t193741\n法式会所\t193742\n费廷\t193743\nhjvhfufdfjgnggfgrugiddtrryyhgftstdxyfgvt66ttftrhfdyyrteurydhfyhyfdrtdtcygt7utygttitufudhfyguguughfhhhfhfugyhgyffdttrdgsgdhrshshtyrgggjgufufguysfhrrfffhghfydttdyzrtttdyfdgttdrd\t193744\n哼伦\t193745\n墙外\t193746\n七零级\t193747\n九时四十分\t193748\n文学创作\t193749\n那一样\t193750\n壑\t193751\n凸显\t193752\n博世\t193753\njoailkick\t193754\n好日\t193755\n34810\t193756\n一个四\t193757\n暇装\t193758\n装出\t193759\n温柔的人\t193760\n坏点吧\t193761\n先河\t193762\n更快速\t193763\n这个礼拜天\t193764\n博主\t193765\ngugbh\t193766\nMiller\t193767\n角儿\t193768\n伊美阴\t193769\n几十多页\t193770\nPpp\t193771\n穿上街\t193772\n内陆\t193773\n黄蓉冰\t193774\nerexp\t193775\n油油\t193776\nknobb\t193777\n何兴真\t193778\n黑白橫條\t193779\n三点三点\t193780\n打狗\t193781\n微店\t193782\n摘星\t
193783\n甫雄心\t193784\nzxyz\t193785\n80度\t193786\n鲁好\t193787\n小田\t193788\n格雷佛尔帕斯塔弗尔帕斯塔\t193789\n锻做\t193790\nsovgd\t193791\n周子晗\t193792\n度你几时成度密机器人你好\t193793\n我喜欢1fboss\t193794\n76那号\t193795\n刘表\t193796\n点身\t193797\nwwww5\t193798\n大飞\t193799\n大食\t193800\n打腻\t193801\ncccttygg\t193802\n共振\t193803\n投标\t193804\n24点整\t193805\n痛感\t193806\n大风\t193807\n十八年\t193808\n二赛\t193809\n关卡\t193810\n准保\t193811\n足协杯\t193812\n电磁炉\t193813\n李梦雪\t193814\n字头儿\t193815\n轰子\t193816\n3213939832\t193817\n载油\t193818\n广西中医学院瑞康医院\t193819\ngagggagaggggg\t193820\n传统\t193821\n七百分钟\t193822\nhttpehiphotosbaiducomxiaodupicitema71ea8d3fd1f413423b3c90a221f95cad1c85e3ajpg\t193823\n难不个人\t193824\n那一片午后的阳光\t193825\n嚄嚄筽\t193826\n仪本集\t193827\n泪珠\t193828\n小电\t193829\n小四小五小六小七小八小九小十\t193830\n朋友铺\t193831\nLemon\t193832\n横空出世\t193833\n仁者\t193834\n煮雨\t193835\nhellohello不聊\t193836\n杨将\t193837\n村委\t193838\n刘奇堡\t193839\n静夜思&amp\t193840\n检测\t193841\n小板车\t193842\nSparking\t193843\n两万岁\t193844\n多分\t193845\nttttttttt\t193846\n84187950313516012500\t193847\n冷水江市\t193848\n明早朝\t193849\n喵咪咪喵\t193850\n22年2月2十2号2点\t193851\nbutits\t193852\n度秘元化天\t193853\n多别\t193854\n多利\t193855\ntoget\t193856\n梁山伯与祝英台\t193857\n沉湎\t193858\n吴龙溪\t193859\n不不不我讨厌你我\t193860\n帧雄\t193861\n磐英\t193862\n恶意\t193863\n壁上\t193864\n偃师\t193865\n恩才\t193866\nxvbgg\t193867\n重归\t193868\n人面桃花相映\t193869\n刘丽娟\t193870\n淫炱\t193871\n里阳村\t193872\n李吧\t193873\n五香牛肉\t193874\n度秘我和你\t193875\n∶\t193876\n一首首\t193877\nlljjjggjgjaaagj\t193878\n彩石\t193879\n7月3号\t193880\n明胶老酸奶\t193881\nRob\t193882\n李吗\t193883\n七载\t193884\n重影\t193885\n百货\t193886\n李后\t193887\n领款\t193888\n李吃\t193889\n睡莎\t193890\n呀呀呀呀呀呀呀\t193891\njiheb\t193892\n小秘秘\t193893\n56902655998\t193894\n横店郊区\t193895\n组装\t193896\n麻麻汝\t193897\n2011年2月6日\t193898\n高官寨卫生院\t193899\n偏源\t193900\n杨蔓\t193901\n473575675675676709858久负盛名\t193902\n原则性\t193903\n谭杰希\t193904\n6600万元\t193905\n红豆派\t193906\n洛宁\t193907\n一天天一秒秒\t193908\n红领巾\t193909\n上雪舞西\t193910\n查理和巧克力工厂\t193911\n我没有\t193912\n够了不想\t193913\n你惹我\t193914\n轩儿\t193915\n绝色\t193916\n娜克斯利克斯\t193917
\n香味\t193918\n孙欣梦\t193919\n马科斯\t193920\n莱斯\t193921\nSssasszz\t193922\n金色池\t193923\n刘梦豪\t193924\n体外\t193925\n夜光\t193926\n童贞\t193927\n焦若涵\t193928\nmfxu\t193929\n前丫们\t193930\n申缪贵\t193931\n跑顿顿\t193932\n3月27\t193933\n体央\t193934\n3月25\t193935\n3月24\t193936\n125800\t193937\n10月7日\t193938\nhjfkccv\t193939\n把握手\t193940\n霞潭\t193941\n挝完美\t193942\n来了来了来了快点\t193943\n于晓光\t193944\ntuif\t193945\n等闲识得东风面万紫千红总是春\t193946\n车晓\t193947\n小小玩人\t193948\n璐园雪\t193949\n好想见\t193950\n小狐狸\t193951\n根治\t193952\n挛缩天天\t193953\n下去再聊\t193954\n吧唧唧歪歪扭扭\t193955\nz尼\t193956\n根河\t193957\n复旦新闻系\t193958\n么快\t193959\njlm\t193960\n你在哪所小学读书\t193961\n贝里亚\t193962\n流行的歌曲\t193963\n柴火\t193964\n莲琦\t193965\n等等等等等那你不要哦\t193966\nstmetatou\t193967\n那不然和谁呢和你\t193968\n浓缩\t193969\n168千米\t193970\npeace\t193971\n百度文库\t193972\n保健我\t193973\n功不可\t193974\n远距离\t193975\n支付宝充\t193976\n死女人\t193977\n逃犯罪\t193978\n撒人\t193979\n重工\t193980\n宠物蚕\t193981\n说书词\t193982\n10.89亿股\t193983\n飞飞\t193984\n二棍天使\t193985\nTNT\t193986\n蓬勃\t193987\n在那边\t193988\n十级\t193989\nrufrfggbgghthtgygdwetkyrwrtlhldltltutliwtltuwlwltiiit8ylilh5jtwtwlrwhnrwnqwegndbafmahnvdsdnggwrnuckbhfyggyguwfddagndggjdrgjarryadgjd\t193990\n丨一丿丨\t193991\n寡头\t193992\nHCWVE\t193993\n除臭味相投外\t193994\n江北区\t193995\n008945236\t193996\nTND\t193997\n男儿\t193998\n颜萍\t193999\ndifid\t194000\nblblblbl\t194001\nhffggghhgghhehbghzhfhbvvtvrgdhdgrhrhdhrhheuuffgffc\t194002\n密教\t194003\n980包\t194004\nbahao马fagao\t194005\nRDIIT\t194006\n官道\t194007\n龙洞\t194008\nhhhbvvnin\t194009\n东旺\t194010\n汽车票\t194011\n24558545515026429525804540468349015055858854542145774477777544555555554\t194012\n陷足\t194013\n广药集团\t194014\n勒死\t194015\n陈欣怡\t194016\n不遗余力\t194017\n只不\t194018\n小蝴蝶\t194019\n温布利\t194020\nAcfocuco\t194021\n爱表\t194022\n田径场\t194023\n蔡振鑫\t194024\n8900号\t194025\ntfooa\t194026\n晚饭菜鸟\t194027\n累头\t194028\n竟敢\t194029\n886886868\t194030\n锐利\t194031\ncogrbhe\t194032\n去电\t194033\n至上\t194034\n李月影\t194035\n北京广播网\t194036\n一点击二\t194037\n嗯为了\t194038\n西溪晓苑\t194039\n你不我要\t194040\n漪汾街\t194041\n飙句\t194042\nV句\t194043\n粗废墟徐\t19404
4\n忠心耿耿\t194045\n芸香炉\t194046\n55825866\t194047\njdhdjf\t194048\n卜星\t194049\n盛行\t194050\n第十个\t194051\n忘八端\t194052\n齐也街\t194053\n西北人\t194054\n59天后\t194055\n给我不知道\t194056\n天姐不发威~你当我病猫\t194057\n北仓小口双发\t194058\n托架\t194059\n张鹏飞\t194060\nOK咯\t194061\n则阳\t194062\n第十七\t194063\n娇艳欲滴\t194064\n第十一\t194065\nik7kakk\t194066\n供暖\t194067\ntchdh\t194068\n第十三\t194069\n渐近线\t194070\n一见到\t194071\n破度\t194072\n革命军\t194073\n少壮\t194074\n联骏\t194075\n众筹劳\t194076\n清军\t194077\n十七天正\t194078\n第二家\t194079\nfoll\t194080\n海陆\t194081\nzhshz\t194082\njggjjy\t194083\n嬉皮\t194084\nHYRHR\t194085\n顾不\t194086\n百度答\t194087\nnanigaosuwo\t194088\n扮相\t194089\n叫漆器\t194090\n第四围\t194091\n清冽\t194092\n闪闪亮\t194093\nDddruuii\t194094\n金蘑菇\t194095\n清冷\t194096\n无虚发\t194097\n欢声笑语\t194098\n再说再说在\t194099\n晓楼\t194100\n活脱脱\t194101\n初衷\t194102\n茂县\t194103\n碰鬼\t194104\n阿妮娜\t194105\n毒师\t194106\n附加火\t194107\n这么爱\t194108\n爱的冒险\t194109\n1筷\t194110\n毛论\t194111\n1801828944\t194112\n办事儿\t194113\n看着你走\t194114\n戛纳电影节\t194115\nleaving\t194116\n涵养\t194117\n胡剑明\t194118\n我的故乡吗你在哪生活\t194119\n白子画\t194120\n116次\t194121\n骚哥\t194122\n米更米\t194123\n来了来\t194124\n七九六\t194125\nj巴\t194126\n狗仔型\t194127\n商业地产\t194128\napp\t194129\n2年之内\t194130\n一间只\t194131\n多五点\t194132\napk\t194133\naph\t194134\n安洁\t194135\n交集额\t194136\n不减看\t194137\n金淘\t194138\n老虎虫\t194139\niobjhbhbj\t194140\n可歌可泣\t194141\n哈笑\t194142\n无不各具\t194143\n12567\t194144\n因为我喜欢女\t194145\n慢慢堡\t194146\n今天早上11点\t194147\n网络网\t194148\n美好的回忆\t194149\n有十五开\t194150\n呡嘴\t194151\n拂晓\t194152\n扫死\t194153\n爹妮\t194154\n泡泡酒喜酒图就是主小心弄\t194155\n17年\t194156\n凤凰飞\t194157\n社有一个人\t194158\n特别节目\t194159\n黄吧\t194160\n美国男篮\t194161\nJohnny\t194162\n胖胖梦\t194163\n14700995238\t194164\n团唱\t194165\n例句\t194166\n爹妈\t194167\n箱子\t194168\n道哥斗地主\t194169\n怪盗基德\t194170\n别这样开玩笑\t194171\n税旋\t194172\n不远万里\t194173\n女生\t194174\n叶丽仪\t194175\n延伸\t194176\n188c\t194177\n最最最最最爱\t194178\nbjzzw\t194179\nxnjxjjxa\t194180\n方雨婷\t194181\n一批一批\t194182\n不才说\t194183\n耐性\t194184\n挨骂\t194185\n耐思\t194186\n莲台\t194187\n吴在熙\t194188\n岁月静\t194189\n撞衫\t194190\n女
男\t194191\n1959137452\t194192\n性感欢\t194193\n挨骗\t194194\n羊角塘镇\t194195\n150g\t194196\n二三八九\t194197\n生儿育女\t194198\n郭猫猫\t194199\n一三点\t194200\n八五版\t194201\n我喜欢伱\t194202\n钢管儿\t194203\n粉尘\t194204\nt100\t194205\n明天下午八点\t194206\n教父\t194207\n1888\t194208\n1886\t194209\n1885\t194210\n1880\t194211\n讲塞\t194212\n妥诶\t194213\n忍术\t194214\n孙继然\t194215\n我喜欢会\t194216\n买买买买买买买买买\t194217\n很早你\t194218\n长海医院\t194219\n客人\t194220\n淫秽\t194221\n好丑好臭\t194222\n吃豆腐\t194223\ngghggv\t194224\n剂子\t194225\neeee3333333555566677788889999\t194226\n坟地\t194227\n下班儿\t194228\n维cksyo\t194229\n梅画\t194230\n费翔凤\t194231\n唉不懂\t194232\n青岛市工商局\t194233\n素玺\t194234\n阳气\t194235\nmojito\t194236\n首尔电视节\t194237\n佩拉杰群岛\t194238\n学妹\t194239\n16c\t194240\n素王\t194241\n150万吨\t194242\n坏蛋叔叔也不坏蛋啊你坏蛋你是坏蛋小坏蛋小坏蛋小小小小\t194243\n肠炎\t194244\nbanner\t194245\n余力侍\t194246\n中央人民广播电台播音主持指导委员会\t194247\n十五岁\t194248\n320斤\t194249\n说句骂人\t194250\nmop\t194251\n老不死\t194252\n狭隘\t194253\n不好我很好\t194254\n200天\t194255\n高艺瑄\t194256\n外交部\t194257\n14位\t194258\n空气\t194259\n武墓\t194260\n早饭尼\t194261\n翻白眼\t194262\n888111222\t194263\n品牌代言人\t194264\n凤凰台\t194265\n2848823977\t194266\n前沿\t194267\n血条\t194268\n张张丹妮\t194269\n200处\t194270\n军刺\t194271\n生态型\t194272\n小心一点三安全一个字\t194273\n贾鹏博\t194274\n四川仁寿\t194275\n200多\t194276\n446646794646\t194277\n昂藏\t194278\n曹颖颖\t194279\noui\t194280\nouo\t194281\n算作\t194282\n李一安\t194283\noua\t194284\n速溶咖啡\t194285\nouf\t194286\noud\t194287\n科幻感\t194288\n家可灵儿\t194289\n4c二春\t194290\nour\t194291\n末经\t194292\n李一定\t194293\nddgcccccc\t194294\n耙钯\t194295\nouu\t194296\nout\t194297\n辛度\t194298\n妹纸们\t194299\n想开来\t194300\n长包\t194301\n达飞蕾雅屌\t194302\n猩球崛起是齐天\t194303\n还不错嘛真是\t194304\n#胜女的代价#\t194305\n复方\t194306\n天暖\t194307\n横槛\t194308\n太帅\t194309\nouT\t194310\nscu咯\t194311\n到底是谁呀你\t194312\n500亿美元\t194313\n跳级\t194314\n漫迷\t194315\n昭昭幕\t194316\n早老\t194317\n阿里那个侯车违章贝帅好帅\t194318\n皖江时尚裤业店\t194319\n一张钱\t194320\n王露霏\t194321\n2582581314258\t194322\n00008525566n3n93nKlplkllllljn7dddzflkllookkbj\t194323\n明矾\t194324\n三十五\t194325\n普罗米修斯\t194326\n圆管乐\t194327\n跳线\t1943
28\n80发\t194329\ntcg\t194330\n炮院\t194331\n迪吧\t194332\n丁瑞坤\t194333\n虾米们\t194334\n二百九十八\t194335\ndhiphotosbaiducomxiaodupicitem42a98226cffc1e17f187c6444d90f603738de98fjpg\t194336\n傅盛\t194337\n林梓炯\t194338\n偶象级\t194339\n十公里\t194340\n可读性\t194341\n这个臭\t194342\n头绪军\t194343\n顺利\t194344\n法灯\t194345\n三角形\t194346\n一个一声\t194347\n那个那\t194348\n良子\t194349\n777777777777777\t194350\n张总派\t194351\n混身\t194352\n入行\t194353\n国国国\t194354\nNishuwomei\t194355\ny989\t194356\n13217906936\t194357\n314米\t194358\n挣扎\t194359\n整儿\t194360\n十妖九NC\t194361\n别天天\t194362\n部门\t194363\n拉筋\t194364\n浪淘\t194365\n修改后\t194366\n网络共和国\t194367\n神文\t194368\n涨水\t194369\n古早味蛋糕\t194370\n够不跟我\t194371\n活在当下\t194372\n次生林\t194373\n缪舜尧\t194374\n午好\t194375\n716点\t194376\n南京LG公司\t194377\n襄玲\t194378\n早安早安晚安晚安早安\t194379\n零二二二二四五六五六\t194380\nsoh\t194381\n节俭\t194382\n让发明\t194383\n拉钩\t194384\n一个三GB\t194385\n后观\t194386\n油腻\t194387\n凯雷卡梅利多\t194388\n后视\t194389\n忙盲妃\t194390\n真平头新鲜我天天大车灰太狼\t194391\n比赛类\t194392\n襄王\t194393\n涂鸦\t194394\n韦弗\t194395\n监院\t194396\n晚上\t194397\n歧义\t194398\n王晓宇\t194399\n挨斗\t194400\nhyhhyhhhuuujh\t194401\n何尝\t194402\n99个\t194403\n赵启昂\t194404\n韩亚东\t194405\n超凉拌\t194406\n董文静\t194407\nctrl\t194408\n摆阵\t194409\nemak\t194410\n145224\t194411\n99下\t194412\n礼貌用语\t194413\n2月10日\t194414\n妞子\t194415\n南征北战\t194416\n吉林市\t194417\n石蜡锅\t194418\n美颠美\t194419\nareso\t194420\n林科卫\t194421\nhhggye\t194422\n15148266\t194423\n我听不懂的话了好不\t194424\n李大齐\t194425\n的色\t194426\n克尔老二\t194427\n一场一场\t194428\n秘丑八怪\t194429\n基本上\t194430\n扶不扶\t194431\n步调\t194432\nqeioy\t194433\n好猪爱你\t194434\n谢幕\t194435\n理窝\t194436\n赵江江\t194437\n三美术\t194438\n123456782234567832346278\t194439\n霍霍霍霍霍霍霍\t194440\n物理只猪猪侠五你\t194441\n13259542492\t194442\n明天上午\t194443\n2趟\t194444\n空饷\t194445\n涂磊\t194446\n旧书\t194447\n肥洋洋\t194448\n互联\t194449\nhololatio\t194450\n上2016年\t194451\n欧尼思密达\t194452\nUhhhon\t194453\n转温\t194454\nzkb\t194455\n美华\t194456\nhgggggg\t194457\nfeat\t194458\n80倍\t194459\njmkm\t194460\n迷营造\t194461\n度秘也度秘\t194462\n正用\t194463\n天猴儿\t194464\n中青\t194465\n靠不要脸\t194466\n呼
和\t194467\nby海峡都市报\t194468\n古穿现\t194469\n育才叫\t194470\n提卡v\t194471\n意大利语\t194472\ndhhchjvb\t194473\n远原来如此\t194474\n桃体\t194475\n求你啦求你啦求你啦求你啦求你啦求你啦求你\t194476\n滋病\t194477\n山歌\t194478\n谢天我\t194479\n小蛮腰\t194480\n现面\t194481\n全世\t194482\n嬲\t194483\n忠于\t194484\n11月12日\t194485\n全不\t194486\ncoycu\t194487\n舜北幼儿园\t194488\n超过3年\t194489\n1249401\t194490\n9999999999999个\t194491\n山步\t194492\n尹瑞林\t194493\n得意失意\t194494\n邓联锋\t194495\n盒酒心巧克力\t194496\n嬴\t194497\nsor\t194498\n邮政储蓄\t194499\n媳妇\t194500\n丿\t194501\n举\t194502\n丽\t194503\n丼\t194504\n主\t194505\n法尔丹\t194506\n丹\t194507\n丸\t194508\n丷\t194509\n丶\t194510\n临江镇\t194511\n狂战姬\t194512\n串\t194513\n丰\t194514\n丯\t194515\n神武九九\t194516\n中\t194517\n生星期\t194518\n丫\t194519\n丩\t194520\n丨\t194521\n丧\t194522\n星钻\t194523\n严\t194524\n两\t194525\n体育馆\t194526\n丢\t194527\n我是真的爱你\t194528\n丠\t194529\n丟\t194530\n710分\t194531\n丝\t194532\n东\t194533\n丛\t194534\n业\t194535\n丙\t194536\n丘\t194537\n丗\t194538\n世\t194539\n丕\t194540\n且\t194541\n专\t194542\n丑\t194543\n丐\t194544\n真心不想\t194545\n不\t194546\n丌\t194547\n下\t194548\n上\t194549\n三\t194550\n丈\t194551\n聊了害怕\t194552\n丆\t194553\n丅\t194554\n木木哒\t194555\n爱惜自己\t194556\n好好儿\t194557\n丁\t194558\n一\t194559\n隆平路\t194560\n绿担\t194561\n财富通\t194562\n陈梦茜\t194563\n我看着你很半月我说是你叫我都不是我说你我要你的好嘛\t194564\n林文源\t194565\n狂喷\t194566\n李铊\t194567\n618454845494464\t194568\n我的生曰\t194569\n处女女兵\t194570\n怡藏\t194571\n八贱七笨六傻五憨四呆\t194572\n腰间\t194573\n三四次\t194574\n年夜饭\t194575\n费小萌\t194576\nplanned\t194577\n死王八\t194578\n十数所\t194579\n连微红\t194580\n铁狗笼\t194581\n小明子\t194582\n廣東\t194583\n日货\t194584\n骨掰\t194585\n#DIY#\t194586\n先安\t194587\n国风\t194588\n糯米度\t194589\n几百块\t194590\nfutvy\t194591\ncallcollator\t194592\n吧罗嗦鬼我\t194593\n翦八士\t194594\n王考乜\t194595\n俄经济文化红十字会\t194596\n背背克\t194597\n黔灵\t194598\nieqQ\t194599\n补钙\t194600\n柏阿姨\t194601\n布丁奶茶\t194602\n好牛嘉林江\t194603\n每块\t194604\n鼠夹\t194605\n台儿庄大战纪念馆\t194606\ngv星\t194607\n对呀我好幸运\t194608\n白送我\t194609\n回心转意\t194610\n圈饭\t194611\n书册\t194612\nmlmmmnm\t194613\n730\t194614\n733\t194615\n732\t194616\n735\t194617\nhiitrg\t1
94618\n几4455代\t194619\n倾尽\t194620\n738\t194621\n舍不得\t194622\n上岸\t194623\nrjdReese\t194624\n夏校长\t194625\n伊克萨斯\t194626\n京东大战\t194627\n鬼吹灯\t194628\n皇姑\t194629\n皇姝\t194630\n冷热水\t194631\n2007年8月\t194632\n百vvb\t194633\n调控\t194634\n五角大楼\t194635\n景轩酒店\t194636\n口瓜\t194637\n撒开\t194638\n秘密人家好想你\t194639\nhahwwj\t194640\n唉摩仙\t194641\n道晖\t194642\n宋维香\t194643\n朋友女\t194644\n二饼\t194645\n低级\t194646\n一面儿\t194647\n桂花香\t194648\n依妙秘\t194649\n中米奇\t194650\n交易所\t194651\n妆模作样\t194652\n平面图\t194653\n天蓝\t194654\n纸哒\t194655\n永乐\t194656\nNEWS\t194657\n我是美人\t194658\n女兵\t194659\n故意会\t194660\n黎是\t194661\n水煮鱼\t194662\n湖南广电\t194663\nggffjyedk\t194664\nBecky\t194665\n鲁迅\t194666\n1亩\t194667\n不相信\t194668\n认命\t194669\nyfurururfugheghegdugehdufyf1474663736567763636\t194670\n静一静\t194671\n求初拥\t194672\n第一件\t194673\n嘎嘎嘎嘎嘎嘎嘎嘎呱呱呱呱呱呱\t194674\n诚惶诚恐\t194675\n131柜\t194676\n六五二二\t194677\n黎明\t194678\n第一份\t194679\n他们的故事\t194680\n廵帅\t194681\n徐志摩\t194682\n跳棋\t194683\nokokmcore\t194684\n想不理\t194685\n电玩\t194686\n翟星月\t194687\n工委会\t194688\n大聚会\t194689\n本泽\t194690\n上岗\t194691\n好不好过\t194692\n石榴\t194693\n吴乔峰\t194694\n东京喰种\t194695\n98645467767676764646646664464664649766676464646649446767666\t194696\n可化食降脂通气血\t194697\n不着你大本楼\t194698\n度哥\t194699\n四月二十\t194700\njshrhtjdhbbntktgbjjjjdjnryhnfjjtj\t194701\n呱呱蛋\t194702\n9点18分\t194703\n结满\t194704\ndangranle\t194705\n牛鼻就动\t194706\n一大碗\t194707\n企鹅属鸟科\t194708\n再用\t194709\n花椒\t194710\n小黄花\t194711\n铁血球队\t194712\ntcy\t194713\n青训\t194714\n全一起\t194715\n首播木\t194716\ntcl\t194717\nbedab64908222b4f036afc378311ed5jpg\t194718\n麦肯基\t194719\n特聘\t194720\n线索\t194721\ntcb\t194722\n难喝\t194723\n大路上\t194724\n我该怎么办\t194725\n唱一次\t194726\n英雄式\t194727\n圆缺\t194728\n古迪森\t194729\n两脚\t194730\n买到手\t194731\n年事\t194732\n怪年年来\t194733\n否决\t194734\n聊天吧天天天天天天天天天\t194735\n台独\t194736\n茶庄\t194737\nNNJEBDSJQ\t194738\n村口王师傅\t194739\ntc2\t194740\n奚落\t194741\n茶店\t194742\n三爷\t194743\n美利达\t194744\n心急如\t194745\n三爱\t194746\npress\t194747\n马铭锋\t194748\n李晓依\t194749\n搔瑞\t194750\n我和好狠哩我讨厌你\t194751\n错了错错\t194752\n一样度秘我爱你\t194753\n关联\t1947
54\n巴基耶夫\t194755\n男队\t194756\n降伏\t194757\njpupj\t194758\ndtagdag\t194759\n忒气\t194760\n叫完\t194761\n选理\t194762\n百年灵\t194763\n7895\t194764\n芬达\t194765\n片断\t194766\n搜索引擎\t194767\n7891\t194768\n不忍不住\t194769\n窗帘\t194770\n灵女\t194771\n六八五零\t194772\n钟楼\t194773\n英格兰\t194774\n化学\t194775\n夏烛\t194776\n恶心你的你真讨厌\t194777\n跟我学\t194778\n我爱的人\t194779\n大竹\t194780\n谢纪成\t194781\n锦户亮\t194782\n变了吧\t194783\n最后一句话\t194784\n尊便\t194785\n拜拜噜\t194786\n拉拉拉\t194787\n乔伊\t194788\n热疗\t194789\n千真万确\t194790\n冰酒\t194791\n小曼样\t194792\n别怕\t194793\n48薛\t194794\n药科\t194795\nAndes\t194796\n三十七\t194797\nggstb\t194798\n生殖\t194799\n粮油\t194800\n地蝶\t194801\n曾vvbb\t194802\n一立方\t194803\n别总\t194804\n满族\t194805\n你是我的小呀小苹果\t194806\n青出于蓝胜于蓝\t194807\n一板儿\t194808\n山巅\t194809\n西游w点23\t194810\nBC缺爷被好莱坞五大经济公司\t194811\n晓母狗\t194812\n139713666665553\t194813\n1588558555\t194814\nc89kakinotanesummeryukatanofutaridocchioerabukantaicollectionkancolle汉语無毒漢化組n地址httplofiehentaiorgg89318533eed5412e\t194815\n93号\t194816\n别急\t194817\n邪魔\t194818\n凌乱\t194819\n元一\t194820\n醉狠狠\t194821\n小耀\t194822\n小老\t194823\n小考\t194824\n冯仑\t194825\n你可以\t194826\n恭王府\t194827\n内内\t194828\nwxwxwx\t194829\n牛秘\t194830\n135785263489\t194831\n少男心\t194832\n段良菊\t194833\n居留权\t194834\n88亿美元\t194835\n我真的好爱我老婆\t194836\n理工\t194837\n半男\t194838\n东南西\t194839\n苹果计算机厂\t194840\nusMk\t194841\n李摩西\t194842\n唉你你你是高中我的你\t194843\n荷椅\t194844\n那么多级\t194845\n小耶\t194846\n1000多米\t194847\neudhfjbfii\t194848\n收银台\t194849\n错累\t194850\n白花子\t194851\njsidd\t194852\n888888888886\t194853\n媚艳\t194854\n6十六二\t194855\n呀你以为你\t194856\nrrjzgdvtnhcgnbfk\t194857\n95000\t194858\n不分亲\t194859\n陈淑英\t194860\n高声\t194861\n松逝者\t194862\n角中学\t194863\nSruch\t194864\n航空界\t194865\n师兄们\t194866\n周紫铃\t194867\n123477778\t194868\n2O2\t194869\nDffvvbn\t194870\n584587\t194871\n真了我也想你\t194872\n办事员\t194873\n莫吓\t194874\n骗人我真的是\t194875\n继续再说\t194876\n眼部\t194877\n我是的是你能为我\t194878\n醉品\t194879\n一百五十九\t194880\n耿晓阳\t194881\n还我走\t194882\n海参\t194883\n姚村\t194884\n辩解\t194885\n阴超\t194886\n倾诉你\t194887\nlgkkgkgk\t194888\n你的妈妈\t194889\n戒除\t
194890\n马尼奥塔\t194891\n除以\t194892\n海口\t194893\n海星型\t194894\n3张\t194895\n申购\t194896\n源星\t194897\n点水\t194898\n弄一破\t194899\n茶舞\t194900\n升职记\t194901\n狼了\t194902\n六七八九十\t194903\nhttpehiphotosbaiducomxiaodupicitema5c27d1ed21b0ef4473adc2bdac451da80cb3ef0jpg\t194904\n知名人士\t194905\n曹承佑\t194906\n辣妈咪\t194907\n隶撒\t194908\n插穴\t194909\n麥雪梅\t194910\n月初六\t194911\n几笔\t194912\n辣椒吧\t194913\n狼人\t194914\n营苑\t194915\n蜀山战纪之剑侠传奇\t194916\n舒伯特\t194917\n20多秒\t194918\n北院\t194919\n一英里\t194920\n换车\t194921\n存存存存存\t194922\n针拜拜\t194923\n引进\t194924\n飞哥与小佛动漫\t194925\n天天家\t194926\n换轮\t194927\n2500米\t194928\n贝雷帽\t194929\n真心朋友\t194930\n第355期\t194931\n服务厅\t194932\nv飞嫂\t194933\n期望\t194934\n爷点\t194935\n期期\t194936\n流西去\t194937\n黄暴\t194938\ntigw\t194939\n13992254805\t194940\n妮基\t194941\n期末\t194942\n穷富\t194943\n工业化学品制\t194944\n5月14日\t194945\n传音乐台\t194946\n你在哪里哪你在\t194947\n感受定\t194948\n健康快\t194949\n恶心我的\t194950\n恩星\t194951\n上限\t194952\n15768293820\t194953\n害怕死\t194954\n小庄\t194955\n赵玲花\t194956\n里头火腿干酪\t194957\ndell\t194958\n闻知\t194959\n短处\t194960\n静音帝\t194961\n妈咪爱\t194962\n程咬金\t194963\n瑞海\t194964\n100余名\t194965\n亲民\t194966\n九百页\t194967\n华夏那\t194968\n音符\t194969\n来不及说我爱你\t194970\n物主代词\t194971\n徐彩云\t194972\n海信电视\t194973\n温柔的妹子\t194974\n白锤木\t194975\n你在哪了明天我没有\t194976\n1.0米\t194977\n酹\t194978\n小庙\t194979\n酸\t194980\n给我我不知道\t194981\n保亭\t194982\n生死狙击\t194983\n15.37点\t194984\n女排\t194985\nzeroner\t194986\n新浩城\t194987\n要色原\t194988\n再见了再见\t194989\n来月\t194990\n洋洋万福系\t194991\n烂尾\t194992\n延庆\t194993\n女医明侠传\t194994\n我的花园我的花园我的家乡我的家乡呀我的家乡我的家乡我的家乡不是你哦想我思念我的家乡\t194995\n酷兔兔\t194996\nyesshestal\t194997\n宠物医生\t194998\n新东方\t194999\noOo\t195000\n51.54元\t195001\n横杠\t195002\n100度\t195003\n对儿\t195004\n树梢\t195005\n10吴\t195006\n我校\t195007\n藜榳\t195008\n厚意\t195009\n我喜欢度\t195010\n满人女\t195011\n大夏大夏大夏大夏\t195012\n接龙\t195013\n汤池\t195014\n林小宅\t195015\n酷\t195016\ndicmg\t195017\n一下分\t195018\n敌敌\t195019\n一才行一\t195020\n二里头\t195021\n刚次\t195022\n过客\t195023\n过审\t195024\n明确\t195025\n晓鹏\t195026\n姐姐\t195027\n过宫\t195028\n侯正一\t195029\n4414亿元\t195030\n唱一首爱\t19
5031\n林小容\t195032\nRygg\t195033\n嗅觉\t195034\nznso\t195035\n你是智能服务\t195036\n10月15号\t195037\n想当初\t195038\n托拉斯\t195039\n小小度秘\t195040\n过容\t195041\n改变\t195042\n347个\t195043\n各奔东西\t195044\n瓜朝天\t195045\n梅枝\t195046\n新进展\t195047\n离吧\t195048\n颾爫\t195049\n梅林\t195050\n反正\t195051\n扒滑\t195052\nQQ负荷\t195053\n对得起自己\t195054\n康贵宾\t195055\neh7x\t195056\n难耐\t195057\n种种\t195058\n离合\t195059\n可谓\t195060\n你是主了还我是猪\t195061\n黑豆浆\t195062\n岛国片\t195063\n合十\t195064\n长生源\t195065\ngoudai\t195066\n万象\t195067\n可调\t195068\n雕花\t195069\n面霜\t195070\n打不住\t195071\n测量误差\t195072\n腿色\t195073\n柯益丽\t195074\nyux\t195075\n罗孝武\t195076\n掐酒\t195077\ngoyeg\t195078\n要知道\t195079\n歼十三\t195080\n六十兆\t195081\n爱情训\t195082\n六十元\t195083\n烤制面包\t195084\n考拉动物\t195085\n备课\t195086\n不不麻烦你\t195087\n要不然的话借我一个号儿\t195088\n钦州\t195089\n吉吉睿\t195090\n锅米\t195091\n柯秀\t195092\n干嘛压\t195093\nhjygv\t195094\n这样做\t195095\n宋祖英\t195096\n天呐天呐天呐我真的不想\t195097\n交通部\t195098\n人人平等\t195099\nyup\t195100\n六十八\t195101\n烦人讨厌\t195102\n桃林麦克多纳\t195103\n海曼\t195104\nTEAZZsSTS\t195105\n4399小游戏\t195106\n脑癌\t195107\n525555555555\t195108\n吴家\t195109\ngvbnnnnbjnbnbnbnbnbnnnvbmncchhbbbbbbbbbbbbnnnnnbbbbbbbbbbb\t195110\n0426利特\t195111\n里程碑式\t195112\n四十天\t195113\n骨灰级\t195114\n酗\t195115\n吓死你伤心告诉我我来\t195116\n动力你好吗度秘你好吗度秘你好\t195117\n生命\t195118\n涉外\t195119\n往下\t195120\n往上\t195121\n15961554740\t195122\n恨不得\t195123\n1930.7.10\t195124\n来稿\t195125\n曲百万\t195126\n羊一四号秀来坏牙\t195127\n往东\t195128\n好吧好吧度攻\t195129\n餐犬\t195130\n超负荷\t195131\n行行行知错\t195132\n495103763\t195133\n鹰潭天师府\t195134\n雷美涵\t195135\n60页\t195136\n交点\t195137\ntrreew\t195138\n一科\t195139\n一秒\t195140\n咋不记得\t195141\n哪里\t195142\n一种\t195143\n一张包\t195144\n聽了嚴爵\t195145\n新鲜性\t195146\n王样\t195147\n说等\t195148\n乖唳\t195149\n委托去处\t195150\n演唱会所\t195151\nuitour\t195152\n强词夺理\t195153\nacaba\t195154\nuygfdfvffvcfvvg不过天\t195155\n马魏燕\t195156\n单方面\t195157\n揣度\t195158\n你邦\t195159\n为难\t195160\n求医\t195161\n神猫\t195162\n唱血\t195163\n好乱\t195164\n２００５年\t195165\n菜歌\t195166\nhzhsgdidbshdhdjsbsnwyshhfiyffjcjdhdiebrhvjfbdjrhrbfofhfbfjrkfhrjfjfjghgjfkfbfkfb
nfjfhvhkckhi\t195167\n假钞\t195168\n西博罗\t195169\n518件\t195170\n逛来\t195171\ndhxvjg\t195172\n失明天二七号\t195173\n桂平\t195174\n514517461\t195175\n迎泽公园\t195176\n三来了三\t195177\n报警者\t195178\n顺治\t195179\n晃晃\t195180\n窝酷不酷\t195181\nMHS-TS20K\t195182\n美人关丰胸个玫瑰花给你\t195183\n缺男\t195184\nhidden\t195185\n神猜\t195186\n击倒\t195187\n唱衰\t195188\n刘佳慧\t195189\n一致\t195190\n一至\t195191\n大乖宝\t195192\n沙缸\t195193\n早后\t195194\n事不关己\t195195\n早吃\t195196\n平安夜老早\t195197\n4599\t195198\n好我承认我是人\t195199\n桂城真\t195200\n地税\t195201\nAPP\t195202\n病历\t195203\n甘涌新村\t195204\nAPH\t195205\n呀多\t195206\n非此即彼\t195207\n早听\t195208\n共勉\t195209\nAPB\t195210\njiiurg\t195211\n點浮誇\t195212\n额滴神\t195213\n不治之症\t195214\n乖把乖\t195215\n阴转晴\t195216\n双色球\t195217\n赵景秀\t195218\nChghkhhjh\t195219\n夏与火\t195220\n无读不成人无到\t195221\n狂狼\t195222\n一一天\t195223\n不不不我不我不\t195224\n29名\t195225\nDISCOVER\t195226\n音乐厅\t195227\n也喝\t195228\n十六十五\t195229\n你是头猪你是头猪你是头猪你是头猪你是头猪\t195230\n一一头\t195231\n玩儿切\t195232\n754257\t195233\n孟恩付\t195234\n行呀\t195235\n雷破\t195236\n沈玮琪\t195237\n那委\t195238\n一一处\t195239\n陪衬\t195240\nsmw\t195241\n捉急\t195242\n大内\t195243\nrcrh\t195244\n睿威\t195245\n三角裤\t195246\na8kan花花猜a\t195247\n大军\t195248\n大写\t195249\n小宝米线\t195250\nsmc\t195251\n粢米饭团再买袋豆浆\t195252\n四十G\t195253\n不想你不想你不想你\t195254\n先后\t195255\n泰罗奥\t195256\n一到两年\t195257\noqjbspjdchjcdjbcdbjwxshixwzxhiwsxwojcswibscqhbo\t195258\n卡萨\t195259\n喜欢笑\t195260\n酷狗扣扣\t195261\n吵嘴\t195262\n体育运动\t195263\n马树英\t195264\n寻医\t195265\n电电视\t195266\niopyrty\t195267\n冠冕堂皇\t195268\n宫云鹏\t195269\nedswtdr\t195270\n不确定不我不确定\t195271\n辽视春晚\t195272\n戴雨诺\t195273\n会客厅\t195274\nnigei\t195275\nwasz\t195276\néê\t195277\n果照女的聊聊天儿\t195278\nwass\t195279\n把控\t195280\n相一\t195281\nllkkjhutfvngvbbm\t195282\n四金\t195283\n阵营\t195284\n圣地安列斯\t195285\n东家桥\t195286\n书王\t195287\ntfboscom\t195288\n旅游胜地\t195289\n坏蛋\t195290\n凌晨六点\t195291\n相中\t195292\nCee\t195293\n88q四\t195294\n爱用度秘\t195295\n蹲坑\t195296\n3.5元\t195297\n菜品\t195298\n肥胖症\t195299\n指导\t195300\n之所\t195301\n明天下\t195302\n明天上\t195303\n晨哥\t195304\n笑你好玩儿\t195305\n并行\t195306\n燕\t195307\n燔\t
195308\n李彦瑾\t195309\n噩梦\t195310\nasemestoreatabaocorestoresole\t195311\n严小超\t195312\n高群\t195313\n一哄\t195314\n燚\t195315\n电视片\t195316\n燃\t195317\n燎\t195318\n慧慧慧慧\t195319\n去年11月17日\t195320\n捅菊\t195321\n哉大也\t195322\nfufy\t195323\n方浩然\t195324\n燽\t195325\ncgkhc\t195326\n否认\t195327\n京都府\t195328\n平常照\t195329\n促度\t195330\n告诉你的秘密\t195331\n九命\t195332\n盖州\t195333\n燠\t195334\n辽阳\t195335\nfufd\t195336\nfufg\t195337\n不正经\t195338\nGHGHHGHHHHHFUGHHHHHHHUHGHJFJHHTFJTGDHFJHFGHHFJJJJJGUHHGGHHGHJFJHHTFJTGDHFJHFGHHFJJJJJ\t195339\n彭良红\t195340\n歼31\t195341\n国策\t195342\n非常棒棒\t195343\n灯泡\t195344\nAngela\t195345\n5545566\t195346\n半，半\t195347\n买回\t195348\n魔宅\t195349\n猪年\t195350\n一哈\t195351\nAngels\t195352\n王家营\t195353\n哎呦爱上我\t195354\n魔家\t195355\naskrcff\t195356\n万里长城\t195357\n我是美我是糖\t195358\n2264145729\t195359\n说好想\t195360\n连接\t195361\n小川仁志\t195362\n270元\t195363\n冤枉路\t195364\nyhycd\t195365\ncomotorstomittoto\t195366\n盛名\t195367\n跃升\t195368\n磨砂\t195369\njpmpm\t195370\n激流\t195371\n波泼\t195372\n波波\t195373\n超声伟大\t195374\n王大哥\t195375\n12：04\t195376\n王小波\t195377\n12：00\t195378\n喜新浪漫剩峥嵘\t195379\n吴翠红\t195380\n遮身\t195381\n磨破\t195382\n纪委\t195383\n翅虫\t195384\n方特\t195385\n好朋友好基友\t195386\n偶哥\t195387\n系带\t195388\n小萝莉\t195389\n当成宝\t195390\n驸马\t195391\n修护\t195392\n2014日\t195393\n小红门乡政府\t195394\n850袋\t195395\n更新\t195396\n哈尼哈哈\t195397\ntUO\t195398\nALRIGHT\t195399\n说了咧\t195400\n施工方\t195401\n陈道琴\t195402\n0dan\t195403\n更文\t195404\n虎纹\t195405\n熊总\t195406\n是谁是\t195407\n四嫂\t195408\n熊性\t195409\n公狗\t195410\nfuvkbjk\t195411\n成分\t195412\n卢沟桥\t195413\n球队\t195414\n14时10分\t195415\n劣根\t195416\n米兰建队\t195417\ncr13\t195418\n我的未来\t195419\n死徒弟\t195420\n背吧儿歌有没有\t195421\n5分钟后\t195422\n西洋参片\t195423\n出口信访局\t195424\n蔡琴\t195425\n情仰\t195426\nhggte\t195427\n七门\t195428\n聋子\t195429\n96秒\t195430\n七间\t195431\n5885976582\t195432\n四十ｇ\t195433\n暴力狂女\t195434\n自攻自受\t195435\n铁还\t195436\nUMBRO\t195437\n总决赛\t195438\n許\t195439\n香飘飘\t195440\n语义\t195441\nmddm\t195442\nn┗━━┳\t195443\n防微杜渐\t195444\n真漳\t195445\n政敌\t195446\n做客\t195447\n奥加\t195448\n28019988
\t195449\n逆战歌\t195450\n黑哥哥\t195451\nBSBS\t195452\njrrd\t195453\n病原\t195454\n訛\t195455\n君上的君，太君的君\t195456\n做实\t195457\n相機\t195458\n提供者\t195459\n百零五斤\t195460\n言\t195461\n用形\t195462\n做完\t195463\n驱魔战\t195464\n芒果冰糕\t195465\n13980355154\t195466\n带你走\t195467\n像头不哭\t195468\n324464466772652626545234477424774233754375375553773475\t195469\n永登浦区\t195470\n富豪大富b式\t195471\n朱烨宝\t195472\n蓝莓建华\t195473\nskbgc\t195474\n火神本\t195475\n金麟骨\t195476\n忘年你\t195477\n4月份\t195478\n一百八一杯\t195479\n胡景岩\t195480\n隔壁\t195481\n对啊ok\t195482\n汤鸡\t195483\n台式会儿\t195484\n机尾\t195485\n走漏\t195486\n程上启\t195487\n捉迷藏吧你来找我我要单\t195488\n应读\t195489\n陈国良\t195490\n拖堂\t195491\n巨轮\t195492\n安加油\t195493\n更美好\t195494\n阿密托佛\t195495\njhkn\t195496\n258\t195497\njhkl\t195498\n跟琛\t195499\n才行\t195500\n两排\t195501\n253\t195502\n250\t195503\n251\t195504\n256\t195505\n257\t195506\n254\t195507\n应该\t195508\n孟丹\t195509\nifyhgfjhfahtugchdg1650\t195510\n旧照\t195511\n拭目以待\t195512\n湿兄\t195513\n有福同享\t195514\n推延\t195515\n昨天\t195516\n抖落\t195517\n哪一次\t195518\nllgcuuv\t195519\nweewd25463556r4p4\t195520\n家电视\t195521\n特色社会主义\t195522\n单周多杰\t195523\n25k\t195524\n第一名\t195525\n嘿嘿房\t195526\n一千一百一万一千\t195527\n哀家出现\t195528\n25m\t195529\n不要你了\t195530\n十日上\t195531\n671277\t195532\n击幼\t195533\n惊喜\t195534\n自语\t195535\nTFBay\t195536\n大悲咒\t195537\n要慎\t195538\n郑渊洁\t195539\nFacebook广告创意\t195540\n没呀\t195541\n君生我未生\t195542\n25L\t195543\n吾言\t195544\n不懂夜\t195545\n吃降\t195546\nyouknoweggs\t195547\n保证书\t195548\n涞水\t195549\n肌瘤\t195550\n吃街天\t195551\n不不不在\t195552\n风镐\t195553\n25W\t195554\n三河\t195555\n黑龙茶\t195556\n炝锅\t195557\n散场\t195558\n笨度秘\t195559\nvfhhft\t195560\n丘疹\t195561\nYGGDRY\t195562\ndhn\t195563\ndhm\t195564\n等级\t195565\n我是你的记妃\t195566\ndhj\t195567\ndhh\t195568\ndhg\t195569\n应援\t195570\ndhd\t195571\n我最高我最帅我我最最最最\t195572\ndhb\t195573\n黄铜\t195574\n造势\t195575\n大额借款\t195576\n然而已\t195577\n姚景怀\t195578\ndhz\t195579\ndhy\t195580\ndhx\t195581\n哥们儿女人\t195582\ndhv\t195583\n得间\t195584\n城市建设\t195585\n得闲\t195586\ndhr\t195587\n十万万分\t195588\n酌力\t195589\n路哈\t195590\n转帐\t195591\n龙岗\
t195592\n没溜\t195593\n蠶獎\t195594\nnok3q\t195595\n13870040533\t195596\n122332585\t195597\n李渊\t195598\n结集\t195599\n绒布\t195600\ncsolonline\t195601\nhfhhgjcj\t195602\n演歌\t195603\n1点25\t195604\nbensley\t195605\n归化局\t195606\n19票\t195607\n60606060\t195608\n13888670099\t195609\nqqy呀\t195610\n鸳\t195611\n胸胸\t195612\n斥候\t195613\n一叉个叉7才厂厂厂\t195614\n艾力艾\t195615\n德行为\t195616\n真可可\t195617\n龙岩\t195618\n四百次\t195619\n40公里\t195620\n不着我了我也接不着我了我\t195621\n莉利\t195622\n大众通感\t195623\n李恪手\t195624\n人生如梦\t195625\n赵东丽\t195626\n父母\t195627\n吧部\t195628\n上风\t195629\ncctv6电影报道我好爱你\t195630\n半决赛\t195631\n春天之前\t195632\nxSf\t195633\n闫胜乐\t195634\n有情有\t195635\n旅游卫视\t195636\n016千克\t195637\n来了跑\t195638\newed\t195639\n找有你\t195640\n屋前\t195641\n罗巴鸟\t195642\n返次\t195643\n返款\t195644\n曾经雄心勃勃宦海沉浮家处女\t195645\n明天上午九点半\t195646\n寸金\t195647\n赵不习\t195648\n40天\t195649\n机器人男友\t195650\n荧光棒\t195651\n占线\t195652\n琼崖秋明\t195653\n我的回忆\t195654\n苏玛丽\t195655\n十张\t195656\ncjfui\t195657\n普仁\t195658\n侯逸凡\t195659\n霸气\t195660\n2222222555550\t195661\npossible\t195662\n换换\t195663\nshowgiryoutomorrow\t195664\n老我\t195665\n为你找到\t195666\n刀砍\t195667\n和作\t195668\n皇太后\t195669\n和你\t195670\n占为己有\t195671\n王太利\t195672\n沙子\t195673\ngttgjhg\t195674\n箱照\t195675\nffx\t195676\n鄢陵\t195677\n童鞋\t195678\n天天长\t195679\n四川文理学院\t195680\n求全\t195681\n放学了我没有\t195682\n八几年\t195683\n这两年\t195684\n黐线佬\t195685\n烂七八糟\t195686\n李金阳\t195687\n1xxo\t195688\n鬼才\t195689\n狗血\t195690\n王小张\t195691\n多巴镇\t195692\n陈云林\t195693\n亲爱的你慢慢飞我身天边最美的玫瑰\t195694\n毛利率\t195695\n数学\t195696\n718620398\t195697\n宫阙\t195698\nfou\t195699\n鬼扯\t195700\n#快乐大本营#\t195701\nfor\t195702\n师姐\t195703\n数字\t195704\n渠家\t195705\neeeedd\t195706\nfox\t195707\nfoy\t195708\n1098252904\t195709\n数子\t195710\n无神论\t195711\nfof\t195712\nfoa\t195713\nfob\t195714\nfoc\t195715\nfom\t195716\n遗憾的人\t195717\n睡不着事\t195718\n度秘书真\t195719\n凡响\t195720\n呼保义\t195721\n隶\t195722\n55555555553333\t195723\n威猛\t195724\n第42页\t195725\n上下面\t195726\n好冷清\t195727\n哈珀斯理\t195728\n左飞飞\t195729\nbhiphotosbaiducomxiaodupicitemb17eca8065380cd748b71e98a644ad345
982811ejpg\t195730\n童家桥\t195731\n三分之二点五\t195732\nhello丽都\t195733\n就医\t195734\n欠扁\t195735\n呈献\t195736\n香橙\t195737\nㄛㄜ\t195738\n邓柳卿\t195739\n马尾好\t195740\n1.4%\t195741\n15899966954\t195742\n上海交大\t195743\n王慧荣\t195744\n死命\t195745\n晚上五点\t195746\n唉度秘度秘a秘码\t195747\n西村\t195748\n万公顷\t195749\n文爱\t195750\n叫插\t195751\n#2012广美毕业展#\t195752\nJCXYG\t195753\n几几年\t195754\n2949420782\t195755\nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn机器人在那\t195756\n黄嘉慧\t195757\n0.27元\t195758\n搞邪\t195759\n给我不首歌\t195760\n人烟稀少\t195761\n小米遥控直升机\t195762\n都面\t195763\n折神马滴\t195764\n鎖\t195765\nVBv\t195766\n黄超超\t195767\n贾世鹏\t195768\ncyctxhcy\t195769\n五互\t195770\nBbccb\t195771\n1006￥\t195772\n居中\t195773\n这么冷\t195774\n小姐夫\t195775\n啦里啦里啦啦啦啦啦啦啦啦啦啦啦啦啦\t195776\n六一八万条\t195777\n方向感\t195778\n酱油钱\t195779\n梁祝\t195780\n在心上\t195781\n半点\t195782\n伯庆\t195783\n這種\t195784\n起落架黑胡椒\t195785\n么不\t195786\nigfhckckfu\t195787\n#联想乐Phone\t195788\n莱东丽\t195789\n悖\t195790\n悔\t195791\n除此之外\t195792\n茴香豆\t195793\n小爱小爱\t195794\n悜\t195795\n清洲\t195796\n6家\t195797\n打混\t195798\n好人证\t195799\n悄\t195800\n悊\t195801\n悍\t195802\n悲\t195803\nfucgh\t195804\n偏财\t195805\n8秒\t195806\n夏晚月\t195807\n玲珑型\t195808\n焰火\t195809\n阿锋\t195810\nmmmgmgmgmmggmmgmgmgm\t195811\n余慢洋\t195812\n患\t195813\n悢\t195814\n副作用\t195815\n悠\t195816\n要不不让\t195817\n丹丹丹丹丹丹丹丹笨笨\t195818\n您\t195819\n腌渍\t195820\n10公分\t195821\n悬\t195822\nq6m\t195823\n严厉\t195824\nfdserdryt\t195825\n唔彩\t195826\n五亿\t195827\nHDJ.Three.Kingdoms.RPG.Ep01.HDTV.1080i.zip\t195828\n任则宇\t195829\n吃恩\t195830\n富疑无柳暗花明又一村\t195831\n张文甲\t195832\nFFYCY\t195833\n烧伤膏\t195834\n柳条箱\t195835\n润湿女\t195836\n伺察\t195837\n丽泰夺宝\t195838\n凭啥\t195839\n农谚\t195840\n啦啦啦啦啦啦我\t195841\n云儿\t195842\n8FC\t195843\nffj\t195844\n深水潭\t195845\n无对呀\t195846\n发骚\t195847\n告吹\t195848\n不要理你了我恨你\t195849\nwifi智能钥匙\t195850\n脊椎\t195851\n上厕所在\t195852\n劳逸\t195853\n17世纪初\t195854\nPhef\t195855\n脸媚\t195856\n不用了恨\t195857\n法花\t195858\n窗明几净\t195859\n梁家仁\t195860\n安琪媛\t195861\n字亲\t195862\n言归正传\t195863\n怒道\t195864\nThgi\t195865\n说不三百\t195866\n500只\t195867\n说句放\t195868\n心手双畅\t195
869\n商谈\t195870\n全资子公司\t195871\n五点\t195872\n二十七位\t195873\n民利\t195874\n磊科\t195875\n崔妈妈\t195876\n冯娘娘\t195877\n1313579\t195878\n阅读者\t195879\n首节\t195880\n999999999999次\t195881\n眼镜世界\t195882\n还记得\t195883\n副市长\t195884\n民航东北局\t195885\n冰冰冰冰冰冰棒\t195886\n一本四本\t195887\ntear\t195888\ntencount\t195889\n一千点儿\t195890\n8.77亿亩\t195891\nmahe\t195892\n研究生\t195893\n鬼真\t195894\n哈楼臭\t195895\n金量\t195896\ndgfx\t195897\n普查\t195898\n小传旺\t195899\n普陀佛缘\t195900\n花都榭\t195901\n十点钟\t195902\nv家\t195903\n你父\t195904\n你爷\t195905\n你爸\t195906\n546164349\t195907\n扣扣扣扣扣扣\t195908\n烩鲜蘑\t195909\n百合片\t195910\n左昂\t195911\n剧害\t195912\nXBOX360\t195913\n拙计\t195914\n5宫\t195915\n颁布\t195916\n爱对\t195917\n大家一起玩\t195918\n不是我真\t195919\n换句话说\t195920\n当某天\t195921\n答题类\t195922\n达克宁淑梅\t195923\n28502元\t195924\n12月30号\t195925\n启明\t195926\n宝黑\t195927\n间\t195928\n烦嘛\t195929\n七二二\t195930\n68868888868886\t195931\n8887788\t195932\nhecome\t195933\n12345678\t195934\n包裹式\t195935\n看真是\t195936\n加佳\t195937\n安医\t195938\n说晚安\t195939\n000000000000\t195940\n周女子\t195941\n咪咪咪咪咪咪咪咪\t195942\n你丑你丑臭玩意儿你\t195943\n度秘度秘你好度\t195944\n36棵\t195945\n读友\t195946\n自找\t195947\n1000000000000000000000000000000000000\t195948\n瓦伦西亚\t195949\n學會\t195950\n毛线\t195951\n白了呗\t195952\n小混\t195953\n丁堡\t195954\n在乐习\t195955\nfufhx\t195956\n看一下付近\t195957\n珍科\t195958\n飞刀\t195959\nfufhc\t195960\nAV\t195961\n5家\t195962\n你说的什么嘛我是你的神\t195963\n发张我的最爱\t195964\n一百十一\t195965\n牺牲\t195966\n1.44\t195967\nfddgfyryrtdgsd\t195968\n唯恐花团锦簇美不胜收\t195969\n发肚\t195970\n最好玩\t195971\nShinee-温流\t195972\niph\t195973\n陆总\t195974\n监督者\t195975\n好几\t195976\n梨园春\t195977\nipp\t195978\ntytfhu\t195979\n不要脸臭不要脸的声音狐狸精\t195980\n虫虫国来了浦蒲密\t195981\n450米所\t195982\n幅间\t195983\nmynamostomy\t195984\n什么秀\t195985\n发肿\t195986\nvalvallyou\t195987\n吴奶奶\t195988\n5848822258452\t195989\n好准\t195990\n发一我说\t195991\n大富大贵\t195992\n魂技\t195993\n好净\t195994\nulmjmjvjt5mujmjmtjtmtajm\t195995\n酷酷巧克力\t195996\nLim\t195997\n我的城市\t195998\n郭金枝\t195999\n发肤\t196000\n太菜\t196001\n好可惜\t196002\n那当我亲爱的朋友\t196003\n林仔一\t196004\nDffdff\t196005\nNBer\t
196006\n日进\t196007\n性知识\t196008\n取得\t196009\nI副房管局\t196010\n大受欢迎\t196011\n同方股份\t196012\n一零\t196013\nbulktee\t196014\n日过\t196015\n看作\t196016\n音公司\t196017\n常金星\t196018\n收徒\t196019\ngckb\t196020\n随时随\t196021\nmy盛头儿\t196022\n中旅\t196023\n一集\t196024\nkhnff\t196025\n脑区\t196026\n欧内酱\t196027\nhshjx\t196028\n意味年\t196029\n438438\t196030\n777777777777777777777777777\t196031\nkkjnjmjjjj\t196032\n将爱情进行到底\t196033\n妮戚\t196034\n胡好国\t196035\n颐指\t196036\n蚊香\t196037\n妮成\t196038\n大赞大赞\t196039\n中共十四大\t196040\n一夕\t196041\n银荡\t196042\n粉身\t196043\n咒骂\t196044\n短期内\t196045\n龙祖宗\t196046\n一多\t196047\n大汉子\t196048\n哈哈说的笑话当见面礼很高兴见到你我是你的小秘书度秘\t196049\n明天后天\t196050\n李可以\t196051\n新民晚报对此类案件报道\t196052\nhifimecoffimatiahimatistimatlifoufind\t196053\n\t196054\n小赛罗\t196055\n朝外\t196056\n精灵梦叶罗丽\t196057\na型\t196058\n安达\t196059\n奥纳我想\t196060\nWht\t196061\n不u6\t196062\nkubffbhg\t196063\nIamisboy\t196064\n学不下\t196065\n幇\t196066\nWhs\t196067\n姐状\t196068\n太爱路汉口\t196069\nWhy\t196070\n艺员\t196071\ndxhx\t196072\n丹妮\t196073\n青岛啤酒\t196074\nbvvbveou\t196075\n恩则天\t196076\n波打\t196077\n谦好好听\t196078\n嗯一声\t196079\n乐扣乐扣耐热玻璃\t196080\n王秀莲\t196081\n红东东\t196082\n10万吨\t196083\n手莫\t196084\n温碧霞\t196085\n胡爽\t196086\n死火山\t196087\n2007年\t196088\n水瓶座\t196089\n五台中学校歌\t196090\n谢ω\t196091\n无常性\t196092\n濾旅\t196093\n巛巛\t196094\nvv公关部\t196095\n夸我帅给你糖\t196096\n油瓶\t196097\n动做\t196098\nccf\t196099\n9384567\t196100\n小广告\t196101\n毕业了没\t196102\n妈养\t196103\n毕业经已匆匆数载\t196104\n冬虫夏草\t196105\n福利待遇\t196106\n宣儿\t196107\n苍山\t196108\n水尚玻尿酸\t196109\n5071\t196110\n萌度秘\t196111\n西国\t196112\n董秘\t196113\n考不讨厌\t196114\n饿死鬼我告诉你\t196115\n回头是\t196116\nbestlove\t196117\n深动\t196118\n金美辛\t196119\n刘圳\t196120\ndhvegf\t196121\nqikx\t196122\n台洲市\t196123\n講點笑話給\t196124\n96651595562\t196125\n五枪\t196126\n长治市\t196127\n慕名而来\t196128\nvee\t196129\n龙龙父\t196130\n喝啤酒\t196131\n武馆\t196132\n虎皮蛋卷\t196133\n自我食\t196134\n后天上午\t196135\n113877887878\t196136\nanswer\t196137\n娘娘恩知拉贝比\t196138\n入场票\t196139\n盖拍\t196140\n汤官方5vvVBtvhhgtfajtk\t196141\n好高\t196142\n搜死\t196143\n想乱说\t196144\n39.9900\t19
6145\n作答\t196146\n美情话\t196147\n江阳区\t196148\nitchara\t196149\n火识\t196150\n来了不聊\t196151\n绝顶\t196152\n盖括\t196153\n校方\t196154\n廢墟\t196155\n伸缩\t196156\nhfgcv\t196157\n二三十岁\t196158\n两袖清风\t196159\n落到\t196160\n三句话\t196161\n500.2000\t196162\n6月25日\t196163\n晚来迟\t196164\n高兴我我我是我\t196165\n亦儿\t196166\n三大街\t196167\n周恩熙\t196168\n校斗\t196169\nvncxbvetgs\t196170\n老戴\t196171\n杀青\t196172\n瓮福\t196173\n321148\t196174\n对口\t196175\n掛线\t196176\ndbhcffccvvvvvbnnnnjhhfssxcbbnjhgccvbnmmjz\t196177\n铁山禅寺\t196178\n给我介\t196179\n启都\t196180\nNicetwomiqu\t196181\n姜鹏\t196182\n让给\t196183\n信达财险\t196184\nonao日宝宝\t196185\n还不起\t196186\njes公司\t196187\n零五号\t196188\n七分之一\t196189\n汽油味\t196190\n靠飞扬\t196191\n北岛\t196192\n龙蛋\t196193\n憧憬\t196194\n憧憧\t196195\n土鸡\t196196\n徐堡\t196197\n罢了\t196198\n十月立冬\t196199\n洞天使\t196200\n百子\t196201\n我是女的我以讹\t196202\n癌雪\t196203\n相待\t196204\n严冬\t196205\n五常市\t196206\n九缺\t196207\n优秀者\t196208\n十字绣\t196209\n李青格\t196210\n基基\t196211\nimstrument\t196212\n六速\t196213\n胞姐\t196214\n这段情\t196215\nn10\t196216\n教长\t196217\n美旗\t196218\n近期货\t196219\n月儿圆圆\t196220\n赵猛\t196221\n5784\t196222\n绵绵绵绵绵绵\t196223\n固原六中\t196224\n好好玩\t196225\n百讲\t196226\nコィダ\t196227\n一千五百名\t196228\n撸撸\t196229\n嘎嘎嘎嘎\t196230\n小雞\t196231\n360行\t196232\n撑住\t196233\n面子问题\t196234\n埋葬\t196235\n八四八五\t196236\n恢复正常\t196237\n十一速\t196238\n百分之30\t196239\n落空\t196240\n联邦投资\t196241\n郑州万通汽修学校\t196242\n讲请\t196243\n楼面价\t196244\n谭辉\t196245\nstle\t196246\n见你\t196247\nYuan\t196248\n钱款\t196249\n榜爷\t196250\n一代母\t196251\n不搞不懂\t196252\n业斌\t196253\nsdfdsed\t196254\n掉入\t196255\n0000000000000000000000000000000000000000000\t196256\n给你儿\t196257\n染剂\t196258\n白芳礼\t196259\n刘继泽\t196260\n十五分钟\t196261\n雪柜\t196262\n不要不信\t196263\n王和王\t196264\n1991年9月9日\t196265\n好再来\t196266\n灵魂\t196267\n强女\t196268\n卷子\t196269\n强奸\t196270\n600余名\t196271\n亚米切斯\t196272\n你好你好你好赞科youngyou\t196273\nBOTA\t196274\nnebe\t196275\n导尿\t196276\n5808\t196277\n张生病\t196278\n5801\t196279\n杨再舂\t196280\n雨露水\t196281\n受们\t196282\n会吧不聊\t196283\nFavouritesport\t196284\n告诉谁你说\t196285\ngbvhj\t196286\n时行\t19628
7\n演化\t196288\n是你是你我是你主人\t196289\nchuoxieu\t196290\n南豆腐\t196291\n辣种\t196292\n石龙花\t196293\n八枚\t196294\n李林翰\t196295\n陈姐\t196296\n7.一个\t196297\n王佳豪\t196298\n秒秒钟\t196299\n吧聊天\t196300\nsidnbfvecbxvmdmendgnsdndncvdntwjhuyyckgh\t196301\n汇源癌\t196302\n#718创意园\t196303\nlamfromUK\t196304\n明天下午5点20\t196305\n枕头\t196306\n5句\t196307\n5口\t196308\n千易烊千玺\t196309\n师范学院\t196310\n我我不要你了滚开\t196311\n天才小毒妃\t196312\nhcbhhccbjv\t196313\ngffgg\t196314\n5只\t196315\n心地\t196316\n5号\t196317\nManChesterUnitedDavid\t196318\nDnchgahflgjcnc\t196319\n峁\t196320\n峏\t196321\npayibianqu\t196322\n管道工\t196323\nfifg\t196324\n峰\t196325\n高飞\t196326\n5双\t196327\n10月13号\t196328\n夫妇服\t196329\ning哦咯咯JOJO咯low五\t196330\n志伟\t196331\n峡\t196332\n陆哥\t196333\n哦王\t196334\n岂敢\t196335\n空里\t196336\n空重\t196337\nfifj\t196338\n过年了吧\t196339\n099625893114583000\t196340\n洋中路\t196341\n搞得好搞笑\t196342\n卡顿\t196343\n短道速滑队\t196344\n孟令航\t196345\n阎俊同\t196346\n夏门\t196347\n弦小雾&东篱词作\t196348\n说明文\t196349\n上门女婿\t196350\n丹爷\t196351\n字幕\t196352\n小雯\t196353\n牛马\t196354\n温大温\t196355\n纵队\t196356\n想一找\t196357\njdkfhfjc\t196358\n尼边禅河\t196359\n不要再发短信\t196360\n巴渝\t196361\n偏下收入\t196362\n雷婆\t196363\n#益龙网络军团#\t196364\n解场\t196365\n清样\t196366\nodpp\t196367\n自告奋勇\t196368\n单一秘\t196369\n龙骑\t196370\nnnnnnnnnnnnnnnnnnnnnnnnnn韩国人nnnnnnnnnn坏坏惹人爱nnnnnnnnnnnhgtnnnnnnnnhg\t196371\n跳入\t196372\n催人泪下\t196373\n黄品源\t196374\n互拍\t196375\n百度钱包样\t196376\nhellomystoutu\t196377\n移门\t196378\n云洛克\t196379\ndeueh\t196380\n不灵\t196381\n鲁番\t196382\n检测仪\t196383\nFJJFJFFJ\t196384\nFhfu\t196385\n攻守\t196386\n洞赛\t196387\n黄世阳\t196388\n开花结果\t196389\n韩文浩\t196390\n64条\t196391\n知三\t196392\n璞璞湫湫湫\t196393\n不灭\t196394\n沒网\t196395\n五六1007\t196396\n疯了不累\t196397\n波萝波萝\t196398\n九百\t196399\n5838624\t196400\n读一变\t196401\nsoocfi\t196402\n叁点\t196403\n江宁\t196404\nfygff\t196405\n推建\t196406\n江安\t196407\n见势不妙\t196408\n张朝胜\t196409\n胎神所\t196410\n你的名儿\t196411\n无论事\t196412\n小怕怕\t196413\n双节棍\t196414\n炎龙怒龙传奇\t196415\n嗯如果\t196416\n牧区\t196417\n一瞥眼\t196418\n体外循环\t196419\n张乐怡\t196420\n洛安繁\t196421\nururj\t196422\n阴暗面\t19
6423\n云淡风清\t196424\n两条路\t196425\nbuwb\t196426\n埋水\t196427\nhellostmetou\t196428\ndn10个\t196429\n蒙曼\t196430\nhruf\t196431\n前景\t196432\n猎狗\t196433\n178倍\t196434\n今早7时20分\t196435\n普洱茶屁颠屁颠\t196436\n愈合\t196437\nHGHf\t196438\n八八酒吧\t196439\n京津\t196440\n血清钠\t196441\n好看不好\t196442\nudjdhdvshx\t196443\n芭比控\t196444\n严政豪\t196445\n牛蛙\t196446\n新闻报道\t196447\n多牛逼\t196448\n你好心恨手\t196449\n百度大脑包\t196450\n喜怒哀乐\t196451\n滕丽\t196452\n国办\t196453\nthingun\t196454\n我问你要脸不要脸你要脸不要脸你\t196455\n为子\t196456\n苗子\t196457\n车手\t196458\n嗯长\t196459\n植物大战僵尸之奇\t196460\n温哥华\t196461\n李欣桐\t196462\n中公园\t196463\n再见了明天见\t196464\n这么多课\t196465\n2005年5月2225日\t196466\n伊博文\t196467\njd9en\t196468\n百计\t196469\n早点儿\t196470\n国务\t196471\n上站\t196472\nxo跨年\t196473\n十八倍击\t196474\n凝结\t196475\ncabare\t196476\n上海儿童医学中心\t196477\n兄和弟\t196478\n城市规划\t196479\n对白\t196480\n第4张\t196481\n监听\t196482\n七毛\t196483\n785028\t196484\n汲达拉斯\t196485\n12月31号\t196486\n试明天就要领通知书\t196487\n祝一涵\t196488\n甚忧\t196489\n问你别给我扯东扯西\t196490\n李立斌\t196491\n比干露露\t196492\n快乐动手机\t196493\n九世在线\t196494\nhxd\t196495\nhxk\t196496\nhxj\t196497\nhxi\t196498\nhxo\t196499\n不可原谅\t196500\n不是晚上了谁上晚上了谁上你\t196501\n程雨苗\t196502\n乐天免税店\t196503\n驻守\t196504\nonscheduled\t196505\nuflgj\t196506\n服务堡初中初一人教版\t196507\n晚安行\t196508\n无限我是吧我记\t196509\n省总工会\t196510\n福安路\t196511\n杨云\t196512\n排播\t196513\n助阵\t196514\nCanyorse\t196515\n刘度秘\t196516\n建子\t196517\n胃病\t196518\n胃痛\t196519\n小场\t196520\n捧回\t196521\n翻回\t196522\n中医史\t196523\nu度\t196524\n二十几个\t196525\n帮捏\t196526\n理聊\t196527\n中融\t196528\n退下吧\t196529\n郭客厅\t196530\n小圆\t196531\n崇文区\t196532\n罗伯特·盖茨\t196533\n好大喜功\t196534\n小土\t196535\n888180\t196536\n上海电视\t196537\n叨叨叨\t196538\n道营\t196539\n人肉叉\t196540\n28700655\t196541\n海煜\t196542\n第十五页\t196543\n小孙子市实验小学\t196544\n喷斗\t196545\n12832\t196546\n发图花\t196547\n自会明\t196548\n博主联盟#\t196549\n龟鹿队\t196550\ngubgyu\t196551\n如饥似渴\t196552\n大侯庄\t196553\n不能然起\t196554\n14户\t196555\n火上浇油\t196556\n我心裡好難受\t196557\n我没你的生日我没有你\t196558\n上一天气\t196559\n度米仲敏\t196560\nCgcyl\t196561\n猫型机器人\t196562\na159\t196563\n为父\t196564\nZzzzzz
zzzzzzzzzZzzzzz\t196565\n致使\t196566\n步步大\t196567\n您好\t196568\n61026124188\t196569\n会员儿\t196570\n贾思垚\t196571\n你在哪我想你\t196572\n华华\t196573\n嗯幺\t196574\n点了赞\t196575\n厕所\t196576\n2757144242\t196577\n6点九六\t196578\n嗯干\t196579\n天马流星拳\t196580\n仙剑奇侠传三\t196581\n再见我一定要\t196582\n二零幺二\t196583\n出招\t196584\n名艺\t196585\n讲故事\t196586\n侍寝\t196587\n柚子皮\t196588\n了意\t196589\n圈儿\t196590\n狄仁杰\t196591\n別改天\t196592\n放肆\t196593\n梁正\t196594\n桃花源记\t196595\n这老子\t196596\n粘贴\t196597\n月思\t196598\n队子\t196599\n知识面\t196600\nnxjx\t196601\n体挺好\t196602\n232323823\t196603\n遗留\t196604\n五郎\t196605\n2016年1月25号中午10点50\t196606\n糊涂罗汉熊\t196607\n国涌兄\t196608\n常德县\t196609\n别喜欢\t196610\n双鱼\t196611\nrwbyr\t196612\n黑嗯嗯7fb82姐8\t196613\n没多久\t196614\n17000000000\t196615\n雷盖亚\t196616\n昊金\t196617\ntjbe\t196618\njgj\t196619\njgk\t196620\n拜拜尼\t196621\n鬼马\t196622\n傻夫\t196623\njga\t196624\njgb\t196625\njgc\t196626\njgd\t196627\n重男轻女\t196628\njgg\t196629\n吴小娇\t196630\n百感交集\t196631\n3555596\t196632\n青椒\t196633\n2638359001\t196634\njgr\t196635\n没有我好恨\t196636\njgu\t196637\njgv\t196638\n有占有\t196639\n人见人\t196640\n克鲁伊\t196641\n贝贝儿\t196642\n45厘米\t196643\n每月\t196644\n周一亚\t196645\n摸Σ\t196646\n飞岀\t196647\n困不困\t196648\n三邱\t196649\n附和\t196650\n壳郎\t196651\ngeiwo\t196652\n温欣昌\t196653\n枯死\t196654\n团购票\t196655\n真古我很想\t196656\n紫小樱\t196657\nhvfteryu\t196658\n656.9亿元\t196659\n真人儿\t196660\n透彻\t196661\n情操\t196662\n算计时\t196663\n大瀑布\t196664\n一招\t196665\n烟化\t196666\n一开心\t196667\n快乐寒假义务教育课程标准实验教科书\t196668\n酷跑轩\t196669\n克里斯塔贝尔\t196670\n卡卡国王牛排\t196671\n十几万吧vf\t196672\n共产党员\t196673\n11111111111242634865556327009986895863585\t196674\n非也\t196675\n15万元\t196676\n港姐\t196677\n阴性\t196678\n佛皇煲\t196679\n郞勇\t196680\n说么样\t196681\n洁厕\t196682\n吧娃娃\t196683\nimmonore\t196684\n扎鲁特旗\t196685\n非书\t196686\n放我一个人生活是谁的歌\t196687\nNov30\t196688\n锅巴\t196689\ngsgsv\t196690\n部位\t196691\n骑楼群\t196692\n苗若晨\t196693\n铁甲威虫\t196694\n睡囊\t196695\nchfjj\t196696\n作假者\t196697\n大便度秘\t196698\n何色\t196699\n偶函数\t196700\n湖南都市频道\t196701\n八点六十五分\t196702\nworynjk\t196703\n闹龙舟\t196704\n港式\t196705\n蚌淮\t19
6706\n老袁滴\t196707\n顺时针\t196708\n八两个\t196709\n汉子\t196710\n瞎说八道\t196711\n你好坏我不用你了拜拜\t196712\n靓号\t196713\n金蛋\t196714\n一拍\t196715\n15395320719\t196716\n小度晓得啵小虾咪呀\t196717\n这是我的好乖度秘\t196718\n2691693741\t196719\n2474637\t196720\n再聊一聊\t196721\n何艺\t196722\n顿然\t196723\n1212\t196724\nbar猪\t196725\nC+潮鞋\t196726\n聊天好伐\t196727\n阿斯\t196728\n珠密琪\t196729\n绝密\t196730\n两侧\t196731\n阿方\t196732\n回题\t196733\n接见\t196734\n巴哥达\t196735\n亚当饭\t196736\n留头\t196737\n榆林市\t196738\n妈妈\t196739\nWebber\t196740\n豆歹\t196741\n你文\t196742\n探究\t196743\n两例\t196744\n帮会\t196745\n几滴水\t196746\n这个星期天\t196747\n夺冠\t196748\n运动乐\t196749\n奇策\t196750\n永无止境\t196751\n女王范\t196752\nkfvc\t196753\n阿文\t196754\n沙皮狗\t196755\n绝对\t196756\n设身处地\t196757\n糖精钠\t196758\n常思雨\t196759\n证人\t196760\n减震器\t196761\n上善\t196762\n关瑞景\t196763\n搁置\t196764\n小糖\t196765\n太左\t196766\n恩好喜欢\t196767\n脖套\t196768\n妙妙么么\t196769\n小糑\t196770\n当晚\t196771\noeppqiyor\t196772\n吧美人儿\t196773\n跑跑卡丁车\t196774\n度秘戏机器你是机器人那你的爸爸妈妈\t196775\nCici\t196776\nCich\t196777\n八月初十\t196778\n加火\t196779\n1ps\t196780\n亮元\t196781\n3051套\t196782\n亮光\t196783\n盖世五侠\t196784\n你是我的小呀小苹果小苹爱你\t196785\n鲍冬雪\t196786\n忘了\t196787\n88寸\t196788\n永远不分离\t196789\n陆星材\t196790\ngbbffFFGYFJ\t196791\n凉意\t196792\n阐述\t196793\n可乐鸡翅\t196794\n九周岁十岁\t196795\n4787\t196796\n八期六午\t196797\n好从\t196798\n三十四回\t196799\n4781\t196800\n909.0元\t196801\n质量好\t196802\n哈莫雷\t196803\n忘交\t196804\n好任\t196805\n1一零\t196806\n鞠个躬\t196807\n响聊聊职场\t196808\n神谷\t196809\n心木意\t196810\n笑风\t196811\n实式\t196812\n鹅鹅鹅曲项向天歌白毛浮绿水红掌为了不饿\t196813\n18739495757\t196814\n老子杠\t196815\n一点半假\t196816\n一孔\t196817\n促销价\t196818\n一字\t196819\n补觉\t196820\n1234567890岁\t196821\n1399元\t196822\n阳城\t196823\n15635397333\t196824\n补角\t196825\nMSNMtgwgjtgwjawwmwwtmmwjamadtdam9mj4\t196826\n3片\t196827\n478p\t196828\n21221222122\t196829\nLOOK\t196830\n蝉翼的回忆\t196831\npoya\t196832\n激发\t196833\n多特粉\t196834\n一学\t196835\n一季\t196836\n美PR\t196837\n郭大兵\t196838\n中奖者\t196839\n不室\t196840\n犍为\t196841\n落得\t196842\n科比翠华\t196843\n141412700\t196844\nduudghuf\t196845\n不容\t196846\n抹掉\t196847\n卢爱芳
\t196848\n不宁\t196849\n认准\t196850\n不守\t196851\n不安\t196852\n不完\t196853\n元芳\t196854\n大连站\t196855\n呃腾讯\t196856\n那老子\t196857\n风云榜\t196858\n马妞妞\t196859\n不定\t196860\n不官\t196861\n不宜\t196862\n不宝\t196863\n超爆照\t196864\n艾滋病毒\t196865\n斗图误导\t196866\n贝哥\t196867\n烹饪\t196868\n勇猛\t196869\n边境城市\t196870\n再赖\t196871\n灰害\t196872\n窘迫\t196873\n回不去了\t196874\n再赞\t196875\n傍徨\t196876\n郑大洲\t196877\n格林童话读后感\t196878\n13909074255\t196879\n海巡署\t196880\nc1村\t196881\n灵狐王\t196882\n膺惩\t196883\n流星月\t196884\n\t196885\n嘻西\t196886\n根蒂\t196887\n曾有一人爱我如生命\t196888\n再走\t196889\n山水甲\t196890\n美丫头\t196891\n汇丰银行\t196892\n度密度密度密度秘加油加油加油加油度秘度秘\t196893\n六十死\t196894\n小攻卖身养活小受的宠溺文\t196895\n婆裟\t196896\n好吧斗地主\t196897\n电热毯\t196898\n大巴掌\t196899\n族式\t196900\n头巾\t196901\n万丈水\t196902\n杨丽颖\t196903\n差差差\t196904\n李青果\t196905\n晶莹剔透\t196906\n银杏\t196907\n好觉\t196908\n银杉\t196909\n技改\t196910\n好观\t196911\n苟且\t196912\n火空\t196913\n好不靠谱\t196914\n28788547780684\t196915\n4月18日\t196916\n彭秋雨\t196917\n年限\t196918\n欧麦\t196919\n七五\t196920\n最辈子\t196921\nwasdffiivv\t196922\n佛佛佛佛\t196923\n云南人\t196924\n温六合彩\t196925\n郭做训\t196926\n九耍\t196927\n五十片\t196928\n信真是\t196929\n孙思颖\t196930\nhometou\t196931\n千灯\t196932\n大合唱\t196933\n杨顺\t196934\n多米啊夺舍度秘\t196935\n乖什么乖呀\t196936\n2GBDDR3\t196937\n七人\t196938\n埋读\t196939\n特美\t196940\n上册\t196941\n内江\t196942\n马辐\t196943\n剪等会儿\t196944\n元草堂\t196945\n\t196946\n爱情公寓5\t196947\n爱情公寓2\t196948\n454557565\t196949\n小明轩\t196950\nyx1\t196951\n再见不聊\t196952\n句英\t196953\n\t196954\n李开服\t196955\n5月2日\t196956\n虚庆匹犬犬口火汐土尼杰工\t196957\n哎d罩杯\t196958\n巨人\t196959\n康撒思密达\t196960\n亲家\t196961\n接轨\t196962\n巨亏\t196963\n北京国际会议中心\t196964\n512647932809\t196965\n做我最后说\t196966\n呢奶奶\t196967\n咽死\t196968\n汤娟\t196969\n怅望轮台悔诏空\t196970\n董大夫\t196971\n嗯许双凤\t196972\nqiba\t196973\n事代友\t196974\n太阳城\t196975\n水淀\t196976\n风向\t196977\nggffdgxdcfvffvdvzgczhczfjggfghffgggffgghgdgfgffghsgdgfhjhjfgdgfgggggdgfgvfggbh\t196978\n屋主\t196979\n666788\t196980\nLED灯\t196981\n不来森\t196982\n对册\t196983\nyxu\t196984\n销魂\t196985\n曾帆\t196986\n那号\t196987\n嘛度\t196988\nyxg\t196989\n室速\t1
96990\n克劳福德\t196991\n二愣子\t196992\n为谁心\t196993\n灰暗\t196994\n听音乐\t196995\n旺旺\t196996\n树龄\t196997\ndotal\t196998\ndcsvdfgdgd\t196999\n一场梦\t197000\n翟思宇\t197001\n兄弟色\t197002\n对决\t197003\nczfvv\t197004\n39岁\t197005\n2821013\t197006\n一百多名\t197007\nFoggedshah\t197008\n巧颖\t197009\niaaii\t197010\nradiot\t197011\n你妈呀妈呀你\t197012\n德善\t197013\n接待员\t197014\n陶我\t197015\n882795\t197016\n刘月星\t197017\n三几点\t197018\nIP3\t197019\nIP4\t197020\n峰罗颖\t197021\n郑顺姬\t197022\n弄一\t197023\n准则\t197024\n百来公里\t197025\n昂头诗\t197026\nvbjdijjtrhdyutxti\t197027\n孙涵菡\t197028\n百万富翁\t197029\n7758\t197030\n男的哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪\t197031\n8899877\t197032\n氟汞\t197033\n7752\t197034\n风干\t197035\n我爱你爱的泪崩\t197036\n真是的爱\t197037\n上戏国戏传媒大学\t197038\n出伏\t197039\nhchcijf\t197040\n虾仁儿彼尔\t197041\n原址\t197042\n握草猜\t197043\n有谓\t197044\n10000倍\t197045\nhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\t197046\n狂魔\t197047\n100一个\t197048\nABKNLAQEWPAX\t197049\n百拜\t197050\n都市爱情片\t197051\n优乐感\t197052\n新还珠\t197053\n爱呢开心奥特曼\t197054\nIPO\t197055\n说到时候\t197056\n道拉基道拉基\t197057\nIPS\t197058\n一天张\t197059\n十九集\t197060\n纯虫\t197061\nSounds\t197062\nvsya\t197063\n帝蒂娜\t197064\nt8iu\t197065\n聊骚\t197066\nffcfgok\t197067\nadb9c5\t197068\n焦黄\t197069\n652101200010111318\t197070\n纪元巨平原\t197071\n郭帅\t197072\n狐狸糊涂\t197073\n使诈\t197074\n不见面\t197075\nzozu\t197076\n熊出没我要妈妈快点\t197077\n先行者\t197078\n独身\t197079\n罗亚帝\t197080\n呼呼的北风没歌椿\t197081\nvkdksp\t197082\n托托人\t197083\nws3人xyz\t197084\nheghde\t197085\n妈妈猫猫猫猫\t197086\n鸟巢体育馆\t197087\n缓释\t197088\n抹茶鄙\t197089\n马晓伟\t197090\nhekvz\t197091\n9月14\t197092\n不落俗套\t197093\n9月16\t197094\n八个钟头\t197095\n真漂亮\t197096\n雷克萨斯\t197097\n刘适合\t197098\n第2名\t197099\n顶经\t197100\n闹闹弄\t197101\n武侯祠\t197102\n诗果\t197103\n许志安\t197104\n优酷\t197105\n变调\t197106\n马垣晶\t197107\n96880381346856796195008888868686\t197108\n张驰\t197109\n无须\t197110\n219点\t197111\n21点\t197112\n八倍\t197113\n15253698446\t197114\n啦傻值\t197115\n云雨嗯狼唔知系系屋飞了\t197116\n想见见\t197117\n一路走\t197118\nTWITTER\t197119\n后起之秀\t197120\n短裤\t197121\nqb把\t197122\nnoldont\t197123\n作陪\t197124\njpbh\t197125\n知我意\t197126\
nw\t197127\n幼时\t197128\n旧店\t197129\n气象部门\t197130\n无的是\t197131\n四年级\t197132\n人意\t197133\n张诗悦\t197134\n说了累了不想\t197135\n不足以\t197136\nguan\t197137\n26号上午七点半\t197138\n汇金谷\t197139\n好高骛远\t197140\n真亲\t197141\n异梦\t197142\n该机\t197143\n真心我的的屁\t197144\n真人\t197145\n38488w\t197146\n奥黛丽·赫本\t197147\n呗吻\t197148\n两只老虎\t197149\n我不懂爱非礼\t197150\n\t197151\n十八九\t197152\n小猪小班小鸭小狗小游小猫小游戏小游戏小游不呼呼\t197153\n地狱\t197154\n真事\t197155\n真二\t197156\n你好度秘我叫花梨\t197157\n该期\t197158\n马蛋\t197159\n邱一雯\t197160\n发老发\t197161\n阵\t197162\n腾冲\t197163\n是真可惜\t197164\n麦嘉怡\t197165\n我和你的故事\t197166\nMilano\t197167\n谢雨亿\t197168\n比语文更糟的是作文\t197169\n丆咚\t197170\n虚掩\t197171\n真伤\t197172\n外翻\t197173\n发差\t197174\n张明星\t197175\n姚健\t197176\ncggh\t197177\n鸡猪\t197178\nKEN\t197179\n是谁谁谁谁\t197180\n哦行\t197181\n百岁坊\t197182\n18秒\t197183\n质保期\t197184\n来了钱\t197185\n果笔\t197186\n耿雨\t197187\n麦琪\t197188\n18种\t197189\nKEV\t197190\nki哟米\t197191\n烈士\t197192\n这种心情\t197193\n吴伟\t197194\n商量\t197195\n度秘我好希望你是人\t197196\n攻坚者\t197197\n冰萌\t197198\n嗯王\t197199\n長安\t197200\n不才\t197201\n马勒\t197202\nhello魔秀秘\t197203\n四二简算\t197204\n电话簿\t197205\n更合\t197206\n不打\t197207\nDdffghxfhxvfr\t197208\n阅\t197209\n别墅\t197210\n马勇\t197211\n啦啦啦啦啦啦我是外婆的小当家\t197212\n不扣\t197213\n不扬\t197214\n两三格\t197215\n秘重大\t197216\n紫薇\t197217\n不扰\t197218\n哭了你不说\t197219\nshshjd\t197220\nLOVEKISS\t197221\n不用不着\t197222\n郑翔琳\t197223\n词汇量\t197224\n无所不晓\t197225\n开玩笑\t197226\n15岁\t197227\n白费\t197228\n雄风\t197229\n8500千克\t197230\n淡淡的记\t197231\n提速\t197232\n白贤\t197233\nwive\t197234\n帮淘\t197235\n湘军\t197236\n天然呆\t197237\n雄飞\t197238\n拜仁化\t197239\n95家里\t197240\n胡jyhgh\t197241\nMkKNU1\t197242\n三十四三百\t197243\n第三排\t197244\n没事儿唱\t197245\n波肖\t197246\n长白岛\t197247\n百威个\t197248\n张厚祥\t197249\n头眉\t197250\n骗骗\t197251\nh2k\t197252\n祛斑\t197253\n赤西仁74庆生\t197254\n巴什么\t197255\n飍\t197256\n大汉\t197257\n带电\t197258\n医疗器械\t197259\n飛\t197260\n慕容蓉\t197261\n飙\t197262\n飘\t197263\n食\t197264\n飞\t197265\n大汗\t197266\n飓\t197267\n25549522746824564645942585564655649246456658659556556251956435995985986555655599658655696565655555598695666666
59682558859659659595565959555295359595959555555565555256565\t197268\n凰鈴音\t197269\n大江\t197270\n强制执\t197271\n情景喜剧\t197272\n两百多家\t197273\n飯\t197274\n小好女人\t197275\n包卓研\t197276\n李思成\t197277\n成人妖\t197278\n你主\t197279\n杜瓦\t197280\n6岁\t197281\n知心姐妹\t197282\n汉堡堡肉\t197283\n薯条\t197284\n飽\t197285\n上交费\t197286\n我喜欢和你日比\t197287\n好榜\t197288\n冒领\t197289\n结伴\t197290\n南大光电\t197291\n30一35\t197292\n曹博轩\t197293\n徐教授\t197294\n物质文明\t197295\n500000000000000000000000000000个\t197296\n车单\t197297\n草擬\t197298\n单子机\t197299\n眼泡\t197300\n眼波\t197301\n2012年7月28日下午\t197302\n无待\t197303\n热火朝天\t197304\n生菜选娜\t197305\n东平水浒影视城\t197306\n嘿姑娘\t197307\nMM妈妈咪没明白v你女女那几个卡夫卡\t197308\n满面\t197309\n1492份儿\t197310\n大家一起乐一乐一乐\t197311\nxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxytyttt\t197312\n我要把你的恋爱恋爱恋爱恋\t197313\n说实在话\t197314\n美甲\t197315\n双重\t197316\n优惠价\t197317\n陈佳豪\t197318\n美男\t197319\n美用\t197320\n离婚率\t197321\n跑下\t197322\n老袁\t197323\nuthfyrihhf\t197324\nKGDYDGU\t197325\n有利于\t197326\n度秘度秘我爱你爱呀爱呀爱着你\t197327\n资助\t197328\n柠檬汁\t197329\nnopq\t197330\nchiphotosbaiducomxiaodupicitem838ba61ea8d3fd1f83e57d4e374e251f94ca5fd2jpg\t197331\n群挑\t197332\nnopk\t197333\n资势\t197334\n盐湖区\t197335\n视效\t197336\n该壶\t197337\n阿卡雷斯\t197338\n萍乡市\t197339\n葛默\t197340\n毛我了\t197341\n柴油发电机\t197342\n孙午饭\t197343\n王八有\t197344\n5周年\t197345\n就是偶\t197346\n也不见得\t197347\n墨丹\t197348\n灵魂的台阶\t197349\n荣广裕\t197350\n王八朝\t197351\n计数\t197352\n情楼\t197353\n俗世\t197354\n就是假\t197355\n明天天会\t197356\n2016年1月21号\t197357\n坤燕\t197358\n不要脸的狗玩意儿\t197359\n微不足道\t197360\n英力士\t197361\ngfgfg\t197362\nG18\t197363\n今年4月\t197364\n格兰仕\t197365\n五密\t197366\n照面\t197367\n短话\t197368\n耿饼\t197369\nhhjny\t197370\n五寨\t197371\nvvkc\t197372\n插进\t197373\n脚趾处\t197374\n乐斗斗\t197375\n不要说死\t197376\n光洁\t197377\nvvkp\t197378\n绳索\t197379\n么多星专修\t197380\n比值\t197381\n389元\t197382\n咕咚们\t197383\nwioo\t197384\n账单\t197385\n千喜\t197386\n90公个\t197387\nvdjgdrtgfg\t197388\n偏在\t197389\n18825766\t197390\n扣扣\t197391\n兴盛\t197392\n可怜工\t197393\nIdontf\t197394\n南澳\t197395\n以说\t197396\n来杯星巴克\t197397\n樵夫泉\t197398\n陈学
东\t197399\n秘长\t197400\nhelloavyn\t197401\n焉栩嘉\t197402\n芒格\t197403\n忧天\t197404\n尿道结石\t197405\n坛子宫\t197406\n真的好开心\t197407\n月级\t197408\n追不像\t197409\n幽门\t197410\n宋钰\t197411\n3D眼镜\t197412\n奔跑吧兄弟\t197413\n同伴\t197414\n碰不得\t197415\n一年级\t197416\n排遣\t197417\n索罗\t197418\n摩隆a赛宝\t197419\n3d嗯\t197420\n松开手\t197421\ndhiphotosbaiducomxiaodupicitem562c11dfa9ec8a13a0c60293f003918fa1ecc0e1jpg\t197422\nFbhc\t197423\n你的你的你\t197424\n夜葵\t197425\n大夜\t197426\n7123名\t197427\n大家好呀度秘\t197428\n理干嘛\t197429\n调喝\t197430\nXDDD\t197431\n随口\t197432\nsxxxxxv\t197433\n一千兆\t197434\n任用\t197435\n九十千米\t197436\n徐晓红\t197437\nsxxxxxx\t197438\n叶冰瑶\t197439\n休克\t197440\n会计师\t197441\n川泳裤\t197442\n2030\t197443\nyrfjvn\t197444\n其后\t197445\n任由\t197446\n索尼索爱\t197447\n魔障\t197448\n太前\t197449\n不吵不闹\t197450\n另一个女人\t197451\n还手\t197452\nlxz\t197453\n乐翔\t197454\n上海银行\t197455\n坚持到底\t197456\n今个星期\t197457\nlxo\t197458\nGoby\t197459\n书心\t197460\nvysug\t197461\n我是男的女的你猜\t197462\n僧王\t197463\nBlogger\t197464\n被干锅\t197465\n软子\t197466\n98岁\t197467\n组阁\t197468\n保姆的诱惑\t197469\n别客气漫\t197470\n108456397岁\t197471\n一一钱\t197472\n1581946\t197473\ndhugv\t197474\n114只\t197475\n好了别闹了跟我讲一个嘛我的好度秘\t197476\n我的魅4li\t197477\n特卖区\t197478\n有特色\t197479\n洁霞\t197480\n花千骨是我最爱的片子\t197481\n暖莎\t197482\n二十百美缘\t197483\n几十米\t197484\n两只脚\t197485\n扮鬼\t197486\n杀人灭口\t197487\nhdhgsjajgw\t197488\n回原形\t197489\n群星\t197490\n八八幺\t197491\n荆棘\t197492\n我有问你医道英语\t197493\n上之雅\t197494\n新式\t197495\n老子骂人\t197496\n我喜欢笑话剧\t197497\n叫床身\t197498\n覅饭\t197499\n失据\t197500\n依斯莉\t197501\n1238850\t197502\n中式快餐\t197503\n1771946515\t197504\nBrave\t197505\n放大\t197506\n师生关系\t197507\n我不是我不爱我要和你说话\t197508\n走心\t197509\n汝涵\t197510\n信错\t197511\n嘟嘟比\t197512\n市政府\t197513\n唉贾岛\t197514\n增长量\t197515\n恩反正不片给你那我去哪里哎呀呀我的老婆\t197516\n痛快儿\t197517\n篠田遥\t197518\n幸会幸会\t197519\n找工作\t197520\n藕粉\t197521\n什方\t197522\n沈氏夫夫yy\t197523\n陈杰人\t197524\n肩不认识\t197525\n阿真\t197526\n生物链\t197527\n文浩轩\t197528\n切记\t197529\n听秘\t197530\nb4\t197531\nicjcie\t197532\nb6\t197533\nqnhdbdudn\t197534\nb1\t197535\nb2\t197536\n╥╯﹏╰╥c\t197537\n
浙江省政府\t197538\n大悟错\t197539\n凉拌豆腐\t197540\n拜拜\t197541\n不好我真\t197542\n站下\t197543\n周海媚\t197544\n去哪儿待\t197545\n200瑞郎\t197546\nbd\t197547\nbe\t197548\nbf\t197549\nbg\t197550\nba\t197551\nbb\t197552\nbc\t197553\nbl\t197554\nbm\t197555\nbn\t197556\nbo\t197557\nbh\t197558\nbi\t197559\nbj\t197560\nbk\t197561\nbt\t197562\nbu\t197563\nbv\t197564\nbw\t197565\nbp\t197566\nbq\t197567\nbr\t197568\nbs\t197569\nnytimes\t197570\n两两月\t197571\nbx\t197572\nby\t197573\nbz\t197574\n行政管理\t197575\n易然千玺\t197576\nbB\t197577\n苦水\t197578\nbL\t197579\n茂哥\t197580\n冰淇淋\t197581\n第七年\t197582\n鱼鱼\t197583\n购买者\t197584\nˋ\t197585\nbV\t197586\n跨居然\t197587\n13144314\t197588\nspicyfd\t197589\ndogdoh\t197590\n芬姐\t197591\n呵呵我不想和你就是你说话\t197592\n说不不不是\t197593\n清秀爱不爱你\t197594\n薄情寡意\t197595\n9.92\t197596\ndinghong\t197597\n罪行\t197598\n聚精会神\t197599\n结膜炎\t197600\n打扑克\t197601\n露台花园\t197602\n十份\t197603\n0345\t197604\n好么泰优\t197605\n十件\t197606\n爱好不好\t197607\n哇塞度秘你好棒\t197608\n锦裘罗裳\t197609\n一千颗\t197610\n我记得\t197611\n十仪\t197612\n秘书你真是个好人我要找你的找你个哥找你个找你个\t197613\n生长\t197614\n实验学院\t197615\nillogapaaggg\t197616\n骗你的我世界说年纪\t197617\n曹司\t197618\nRum酒\t197619\n吃灰了\t197620\n照人\t197621\n一桌\t197622\n人肉搜索\t197623\n10.18\t197624\n数千年\t197625\n歌姬歌姬歌姬歌姬\t197626\n10.15\t197627\n今天下午五点\t197628\n女人们\t197629\n一档\t197630\n舒适\t197631\n我没啊\t197632\n一桥\t197633\n一桩\t197634\n辉煌期\t197635\n听不清\t197636\n离不离\t197637\n二级建造师\t197638\n𠂇\t197639\n客部\t197640\n爆爆吧\t197641\nhowoldare\t197642\n5211314o\t197643\n我不我不是\t197644\n解难\t197645\n这样\t197646\n别见\t197647\n木楼\t197648\n玩玩儿\t197649\npppp\t197650\n亡人\t197651\n猪头肉吧女\t197652\n哒度秘\t197653\n到货\t197654\n输球赛\t197655\n旗人\t197656\n北坡\t197657\n十三一会份\t197658\n争优\t197659\n二月二十七号\t197660\n周杰枢\t197661\n向志群\t197662\n晕妆\t197663\n别触\t197664\n我骗你的我\t197665\n从头零二\t197666\n底布\t197667\n郭殊同\t197668\n9999999999999999\t197669\n忘就\t197670\n认识了知道\t197671\n127哇\t197672\n嗯地址\t197673\n伦家桑心\t197674\n男孩女\t197675\n呢娘\t197676\n班徽\t197677\n七巧板\t197678\n视觉感\t197679\n两万元\t197680\n干脆面\t197681\n黄人\t197682\n杨咪咪\t197683\n幺二零幺零二幺九六二幺幺二七零五二二\t1
97684\n怜香惜玉不采\t197685\n别女\t197686\n你好呀你在干嘛知道我到底\t197687\n前世之旅\t197688\n检控\t197689\n鱼子\t197690\nuift\t197691\n跟踪\t197692\nCA进\t197693\nuifw\t197694\n7盒\t197695\npyfxyyj\t197696\nzyzyzyzyzyzyzyzyzyzyzyzyzyzyzyzyenooooo\t197697\n导学号\t197698\n11月3号\t197699\n妈话\t197700\n剑剑\t197701\n刘佳威\t197702\n姚元飞\t197703\n像我告白\t197704\n倩女幽魂\t197705\n一三餐\t197706\n药方\t197707\n陈书航\t197708\ncvdswgtgeerg\t197709\n糟乱\t197710\n苦中作乐\t197711\n十一五十八度\t197712\n忆起孩儿时的星空\t197713\n月几日\t197714\nsjddjgfeiv\t197715\n张睿智\t197716\n一门心思\t197717\n开心看最好笑一个\t197718\n高桥广树\t197719\n么时候\t197720\n延长线\t197721\n16得多\t197722\n真心冒充\t197723\n微信群\t197724\n娘胎\t197725\n偏旁\t197726\n3Kyou\t197727\n小公主\t197728\n八七二五幺五九六\t197729\n申请号\t197730\n企事业\t197731\n红红火火红红火火\t197732\n弧形\t197733\n会儿等会儿\t197734\n汉气\t197735\n多别扭\t197736\n天生美\t197737\n回到原点\t197738\neuffewy\t197739\n九百八\t197740\n台湾办公室\t197741\n扩两短\t197742\n狐仙2\t197743\n焦距\t197744\n翔翔\t197745\n第六集\t197746\n鬼门关\t197747\n我的最好的一面\t197748\n代换\t197749\n植物大战僵尸僵尸\t197750\n一一个赞\t197751\n222nnnn\t197752\n8月14日起\t197753\n西沽公园\t197754\nbhshk\t197755\n所言\t197756\n情畅\t197757\n遴选\t197758\n骨\t197759\n这么长的话了你真太不像话了吧你\t197760\n心慌慌\t197761\n三边\t197762\n路上\t197763\n名士\t197764\n三无空\t197765\n充分\t197766\n打屁臭\t197767\n55515559\t197768\n記憶\t197769\n辣椒\t197770\n首描\t197771\n巨巨鱼\t197772\n还有用\t197773\n在我身\t197774\n奥秘\t197775\n周昌义\t197776\ntwins\t197777\n畅雨\t197778\n隆重\t197779\n好不听话\t197780\nJhggg\t197781\n10．张\t197782\n你的女神\t197783\nghbhhhh\t197784\nghhhjjjkkkkkkkklkllllllklkkl\t197785\n直乐\t197786\n阿司匹林肠溶片\t197787\n宝马九\t197788\n8支\t197789\n捶头桑气\t197790\n乡愁\t197791\n机智萌\t197792\n很恐怕\t197793\n死亡率\t197794\n惹恼\t197795\n18475253920\t197796\n廣告\t197797\nu候车厅\t197798\n中泰华府\t197799\n八二八九\t197800\n天天在家\t197801\n两个故事\t197802\n敢死\t197803\n就是这样的\t197804\n私密处\t197805\n榷叶\t197806\n商南县\t197807\n庶竭驽钝攘除奸凶现代汉语\t197808\nStaff\t197809\n租租\t197810\nRCPTIidoldodrfgo\t197811\n毒发\t197812\nXcxfcgzjzhvrjsuxfvfkcjxigxidgxkgzjgxihxkhxkgxkhxkgzkhxjgxhokckgchixjfzkgxkxjfxkglgkclgckgxkgxkgxgkgkckgckckgcixickkkgxixixixigxkgxigxkxkgk
igcxkbxkxgkxkhclhfkygfghdgVffgfvghghvzgshNxjdgvqdwjoduxrghxhzcUfjgclFdlckcjbmclycKyxluskjbzupgjtsljvzoyfzislugslyfb\t197813\n潭镇\t197814\n顏\t197815\n6OO7OO\t197816\n出庭\t197817\n549468\t197818\n小娘子\t197819\n乐宠宠\t197820\ngdff\t197821\n蛋液\t197822\n【纳尼\t197823\n婷也才\t197824\n嗯都行\t197825\nwhenacar\t197826\n毛葱\t197827\n我们来天好不嗒\t197828\n出库\t197829\n钢粉\t197830\n羽绒\t197831\n沙声声\t197832\n还记住\t197833\nKAT-TUN\t197834\n杨美女\t197835\n签证费\t197836\n善变\t197837\n传感器\t197838\n额空\t197839\n完了求\t197840\n银土党\t197841\n双流棠中外语学校\t197842\n杜亚文\t197843\n全国唱红会\t197844\n数几百\t197845\ngjjjtjm\t197846\n了等\t197847\n娘俩\t197848\n彭丽\t197849\nimax美\t197850\n一起三月一\t197851\n99999999996\t197852\nwifi万能钥匙\t197853\n扁豆闷面\t197854\n祥和\t197855\n露露撸\t197856\n广东省民族宗教事务委员会\t197857\n500万行\t197858\n柔美\t197859\n哎呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t197860\n不必与共\t197861\ntbst\t197862\n国际乒联\t197863\n彭七\t197864\n彭一\t197865\n3dwq\t197866\n骂人版\t197867\n萌萌哒的度秘我想爱你\t197868\n087415157\t197869\n计生局\t197870\n黄线\t197871\n战略性\t197872\n寻一个人\t197873\n曹莲花\t197874\n灭魂\t197875\n一千九百九十九九九九九九九九九九九万次\t197876\nghjgf\t197877\n喜怒哀乐表\t197878\nghjgc\t197879\nbenk\t197880\n银波鲤\t197881\n朋错\t197882\n傲自大\t197883\n刘梦麟\t197884\n祷祷\t197885\n郭贺东\t197886\n485194944559181884\t197887\n腿爷\t197888\n維維奶粉\t197889\n555千六百六十六\t197890\n东北人\t197891\n别这样好嘛唱一首\t197892\n梦兮\t197893\n无人战斗机\t197894\ntiago\t197895\n唐艺涵\t197896\n洪宝\t197897\n5.4米\t197898\n学术\t197899\n現在\t197900\n秘友\t197901\n浒关中\t197902\n广州德比\t197903\n同情恋\t197904\n张行\t197905\n菩提u\t197906\n独抢\t197907\nyeezy\t197908\nCK咯\t197909\nejxedt\t197910\ngags\t197911\n才幸\t197912\n白莲\t197913\nhome键\t197914\n灰雁\t197915\nallof\t197916\n哎悠\t197917\n山里山里\t197918\n才干\t197919\n王子璇\t197920\n5645216\t197921\n笑你\t197922\n小度机器二八\t197923\n秘史\t197924\n学期\t197925\n秘叶\t197926\nkhgn\t197927\n张衡\t197928\n开钻\t197929\n看星星\t197930\n深深深深地\t197931\n瓜钱\t197932\n荆门市\t197933\n七十\t197934\n低碳经济\t197935\n七千\t197936\n开钱\t197937\n栈咧\t197938\n可燃\t197939\n缷载\t197940\n自助\t197941\n盱眙县宗教局\t197942\n一帧一帧\t197943\n邈云汉\t197944\n艾力\t197945\n头安\t197946\n一再二\t197947\n经历\t19794
8\n神君\t197949\n分明白静\t197950\n困难的人\t197951\n医院\t197952\n3万亿元\t197953\n30块\t197954\n松月\t197955\n性取向\t197956\n胸展\t197957\nrifcg\t197958\n毛小平\t197959\n64316464956561\t197960\n益智的游戏\t197961\nzsyz\t197962\ncounam\t197963\n太空一号\t197964\n出吧\t197965\n答出\t197966\n后场\t197967\n爱不离\t197968\n8000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\t197969\nWG集团\t197970\n谢叔叔\t197971\n方黄芪\t197972\n一小丈\t197973\n含怒\t197974\n见月\t197975\n多好我\t197976\n很不懂事\t197977\nPSP2000V3\t197978\ngnyd\t197979\n综艺\t197980\nfudyghgdfcfgcfgfqpwagbxfdh\t197981\n黄龄\t197982\n今年七夕节\t197983\n多谢谢\t197984\nkpt\t197985\n博学多才\t197986\n败类\t197987\n先跑\t197988\n何绪国\t197989\n嗯火影\t197990\nkpg\t197991\n外滩\t197992\n绮丽\t197993\n撸撸啊撸撸\t197994\n很二\t197995\n帅哥求求你了你是\t197996\n十几个\t197997\n再来一\t197998\n几个一百个\t197999\n壳荔枝园\t198000\n原生态\t198001\n真棒雅\t198002\n下个星期二\t198003\n霸道攻\t198004\ntouchtouch\t198005\n梵天\t198006\n十几万\t198007\ni度秘\t198008\n08157\t198009\namjz\t198010\n杭州异联\t198011\n蔡中\t198012\n5月15日\t198013\n3355255556\t198014\n德甲\t198015\n告知\t198016\n人与狗\t198017\n跳给舞给我看\t198018\n静态\t198019\n三千三百三三\t198020\n飙车\t198021\n我是的话说你好\t198022\n人犯\t198023\nfishareeasy\t198024\n说出理由\t198025\n你好你好你好多咪多咪多咪多咪多咪\t198026\n莫莫\t198027\n光灯\t198028\n夸吵架好事公主\t198029\n孤岛\t198030\n12条\t198031\n嫁给我\t198032\n魅影\t198033\nQQ企鹅\t198034\n定位電話64158098乐乐13911122771\t198035\n快乐鸟\t198036\n千玺欢\t198037\nhiviihvigvih\t198038\n外外卫z\t198039\n死亡笔仙纱丽\t198040\n大章鱼咧\t198041\nfyvyuguicvj\t198042\n逛商行\t198043\nbu12\t198044\n1800年\t198045\n提一提\t198046\n张宝贝\t198047\n范伟\t198048\n吴文杰\t198049\n越南政府\t198050\n巴拉巴拉比六\t198051\n公斤\t198052\n上北下南\t198053\n惊天寂寞\t198054\n糸丑\t198055\n产曲\t198056\nfitt\t198057\n我是任梦娇，你好度秘\t198058\n汉能\t198059\n朱晁锋\t198060\n土豆碎丁+胡萝卜碎丁炝锅\t198061\nTurfyair\t198062\n仰望儿\t198063\nbprb\t198064\n褚萌\t198065\njrvno\t198066\n宋清怡\t198067\n换尿布湿\t198068\n1ey\t198069\noame\t198070\n真的不想了你\t198071\n装嵌\t198072\n渗入\t198073\n机灵\t198074\n快乐生活方式\t198075\n不哪了没\t198076\n梁峻铭\t198077\nzhzhd\t198078\nOMO\t198079\n声道\t198080\n弟子\t198081\
n不对度\t198082\n游戏展\t198083\n柚子舍\t198084\n古寺\t198085\n许林\t198086\n早教\t198087\n卡日子\t198088\n第十四章\t198089\n能带\t198090\n2月11日\t198091\n介们\t198092\n至尊狂\t198093\n麻木不仁心不在焉\t198094\n足展\t198095\n十五十六页\t198096\n纪级\t198097\n朴海日\t198098\n偶问\t198099\n呀快说\t198100\n新格局\t198101\n镁粉\t198102\n说错\t198103\n郭宝印\t198104\n你好玩\t198105\n我的娘\t198106\n贼婆\t198107\nvvhhcd\t198108\n准成\t198109\n家祥\t198110\n加赛尔\t198111\n照片儿\t198112\n黄志娟\t198113\n念法\t198114\n尼马\t198115\nk183\t198116\n昏昏\t198117\n被噎\t198118\n好再讲一个再讲一个\t198119\n曼子\t198120\n底角\t198121\n50666\t198122\n栀子花\t198123\n九九乱\t198124\n22224545\t198125\n金刚秘\t198126\n景泽园\t198127\n武行\t198128\njshsj\t198129\n植保\t198130\njshsg\t198131\n全力以赴\t198132\njshsb\t198133\ngSUDHHEGFDGD\t198134\nspenst\t198135\n击鼓\t198136\n佟芯\t198137\nvbbvvcvvvghhh\t198138\n真的好乱\t198139\n这本证\t198140\ngjgak\t198141\n哈行\t198142\n行天下\t198143\n精英化\t198144\n人生的旅途\t198145\nCggf\t198146\n驼背\t198147\n劫财\t198148\n厮混\t198149\n2013年11月8日\t198150\n广东路\t198151\n感覺\t198152\n腰包\t198153\n刘承志\t198154\n哈衣\t198155\n陈嘉明\t198156\ncanItrustyou\t198157\n王苹果\t198158\n范静音\t198159\n百码\t198160\n二三四五六七八九十十一十二岁\t198161\n江祥波\t198162\n寄送\t198163\n杏花\t198164\nQJf\t198165\nMIUI\t198166\n想想要\t198167\n娃牙\t198168\n葉葉\t198169\n济州\t198170\n济川\t198171\n说了你\t198172\n鳖孙\t198173\n深浅\t198174\nppipoo\t198175\ngrrrt\t198176\n批批\t198177\nurffqwdf\t198178\n呕血\t198179\n中国时报社论\t198180\nPK哦\t198181\n潘俊辉\t198182\n徐老公\t198183\n杨怡\t198184\n404090家\t198185\n阿尔滨\t198186\n扬子晚报\t198187\n肉糜\t198188\n乱喷\t198189\n金稻园\t198190\n一士焉宜\t198191\n爆料\t198192\n芥蒂\t198193\nGain\t198194\n卫生公安局\t198195\nS3600黑粉金\t198196\n真实样\t198197\n给出现\t198198\n葬心\t198199\n下午六点\t198200\n好烦人v人v方便耳温计额吉饿\t198201\n安县飞鸣禅院方丈慧善\t198202\n受精\t198203\n对呀你好\t198204\n肉糕\t198205\n竖中指\t198206\npoorman\t198207\n15896996465\t198208\n13843612123\t198209\n六片\t198210\nlanum\t198211\n吃诉\t198212\n九十后\t198213\n融合\t198214\n两岸\t198215\n客源\t198216\n不是我的老娘你\t198217\n手大师\t198218\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪就是你\t198219\n这生\t198220\n6月18日\t198221\n邓小强\t198222\n我的错\t198223\n4首\t198224\n出去去\t198225\n滴滴
答答\t198226\n属于自己\t198227\n瓜洲\t198228\n信自己\t198229\n登仙界\t198230\n安um\t198231\n雷雷8889810202767125970\t198232\n吧员\t198233\n陆夫人\t198234\n哦我喜欢你我爱你我追你\t198235\n博的收割给你快乐你有没有爱上我19i\t198236\n程小望\t198237\n号院\t198238\n4月30号\t198239\n这用\t198240\n吃请\t198241\n接上\t198242\n机器官\t198243\n家族\t198244\ntfboyst\t198245\n578\t198246\ntfboyss\t198247\ntfboysq\t198248\n573\t198249\n570\t198250\n577\t198251\n576\t198252\n李卓\t198253\n障\t198254\n李华\t198255\n１５分钟\t198256\n绿树\t198257\n退变\t198258\ntfboysn\t198259\n1308156\t198260\n228800\t198261\n57%\t198262\n液化\t198263\n睡赶\t198264\n转运表\t198265\n有答\t198266\nicalilassist\t198267\n什嘛\t198268\n一大杯\t198269\n媳妇儿来事儿了咋整\t198270\n悄悄\t198271\n载人\t198272\n沉甸甸\t198273\n画兽\t198274\n用心有用\t198275\n杖毙\t198276\n57y\t198277\n双休\t198278\nｏｊｇｆｉ\t198279\n无意宝\t198280\n蒋大为\t198281\n玻m缘\t198282\n踢毽子\t198283\n非著名\t198284\n5场\t198285\n陈氏\t198286\n伦谷\t198287\nn32千米\t198288\n宁武丁丁\t198289\n穆天奇\t198290\n史鲁比\t198291\n哈you\t198292\n陈水\t198293\n关节\t198294\npomuy\t198295\n王魔地\t198296\n我真的爱你我想和你找对象\t198297\n宏路\t198298\n不伤\t198299\nhello我是电影小达人度秘\t198300\n八面\t198301\n王官琛\t198302\n7点22分\t198303\n火树银花\t198304\n盖浇饭\t198305\n梁字令\t198306\n苏某忠\t198307\n我的小呀小苹果爱你\t198308\nk粉\t198309\niu读书\t198310\n盀乙\t198311\n多啦\t198312\n蕾尔\t198313\n哼什么叫\t198314\nhttpghiphotosbaiducomxiaodupicitemfd039245d688d43fb9e62e877a1ed21b0ef43b73jpg\t198315\n土匪饭\t198316\n好呀萌\t198317\n秦妤媣\t198318\nxún\t198319\n我的手段\t198320\n越尼玛\t198321\n悲勒个催\t198322\n还有你是\t198323\n村庄\t198324\n文娱\t198325\n詞尷尬\t198326\n毅少\t198327\nffhrshgf\t198328\n屎粑粑\t198329\n割断\t198330\n赤瞳\t198331\n不为你\t198332\nphe\t198333\nphd\t198334\nphy\t198335\n可为\t198336\n欢呼\t198337\n台角\t198338\nphp\t198339\n许倩雯\t198340\nRxt\t198341\n巴希尔\t198342\n可不\t198343\n中国共产党第八次全国代表大会\t198344\n京承高速公路\t198345\n计划性\t198346\n猫狗\t198347\n我是那美丽的漂亮的公主小苹果\t198348\ntttttttt\t198349\nzifus\t198350\n52ti\t198351\n国画\t198352\n六秘素\t198353\n国男\t198354\n八八拜\t198355\n中瑞思创\t198356\n可丑\t198357\n思念苦无\t198358\n几角\t198359\nkkoulikes\t198360\n赵小狗\t198361\nSocial\t198362\n华谊\t198363\nMaxx\t1
98364\n555585555880885225558999007799999999998858774568886699米\t198365\n茂华\t198366\n男变\t198367\n少先队员\t198368\n宋钟基\t198369\n马金博吖\t198370\n一nn￥\t198371\n婆家\t198372\n读笔\t198373\n男可\t198374\n秀场\t198375\n清明蔗\t198376\n点唱歌\t198377\n家电\t198378\n关于妈妈的好好爱我\t198379\n啄食\t198380\n母难日\t198381\n13585715526\t198382\n省部级\t198383\nlooraa\t198384\n男号\t198385\n金屋\t198386\n必得兔\t198387\nhardworking\t198388\n英雄们\t198389\n张学军\t198390\n拥而至\t198391\n我的度秘我最爱你了么么哒\t198392\n王依萌\t198393\n西奈半岛\t198394\n开心痛快\t198395\n投胎\t198396\n加减\t198397\n金属\t198398\n仔细看\t198399\n张行海\t198400\nabac20BC32d\t198401\n感测\t198402\n等顺\t198403\n袁溦鸿\t198404\n世界之巅\t198405\n罢了別\t198406\n余雯婷阿\t198407\n铃音\t198408\n红白\t198409\n不解之缘\t198410\n体寒\t198411\n害虫\t198412\n体察\t198413\n金山\t198414\n摇动\t198415\ncuodwc\t198416\nO黪好女杀姥海回人鶱开軬\t198417\n永恒\t198418\n松狮犬\t198419\nActor\t198420\nyomyou\t198421\njhghhhggghjko\t198422\n名不虚传\t198423\n13135921915\t198424\n光华大道\t198425\n侦先生\t198426\n觉敏\t198427\n尼古丁\t198428\n泰莎\t198429\n568653532232\t198430\n姜生\t198431\n业贵校\t198432\n87756089632496\t198433\n呆雅孝\t198434\n丽容\t198435\n温顿之\t198436\n冰棋\t198437\ndolyyaod\t198438\n稍等会儿\t198439\n食戟之灵\t198440\n冰棒\t198441\n爱戴亚\t198442\n214点\t198443\n谢煜\t198444\n天辉\t198445\nTagesheng\t198446\n蹄皮\t198447\n口头\t198448\n混元\t198449\n过载\t198450\nPuglia\t198451\n撞过\t198452\n霍元甲\t198453\n妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈\t198454\nhighInu\t198455\n梦朋友\t198456\n林我叫\t198457\n四有恶报\t198458\n13464997796734131\t198459\n汽车窗\t198460\n下周二\t198461\nSHETEACHESMATH\t198462\n返还是\t198463\n露鸟\t198464\n知不是\t198465\n喊大叫\t198466\n下周五\t198467\n嗯四年\t198468\n水疝\t198469\n柴犬\t198470\n同缝\t198471\n天辰\t198472\n周佳欣\t198473\n愿者跑\t198474\n郜你真\t198475\n副厅级\t198476\n16G\t198477\n耸人听闻\t198478\n晓源\t198479\n5点15分\t198480\n卜算子咏梅\t198481\n无商量\t198482\n泛泛之交\t198483\n100万块\t198484\n入门\t198485\n盐肃\t198486\n失地\t198487\n闵小余\t198488\n蚂蝗\t198489\n邹么\t198490\n22万分钟\t198491\n555551542113\t198492\n几时\t198493\n姚也\t198494\n新文版\t198495\nyoshin\t198496\n藏祸\t198497\nsthoust\t198498\n潭子湾\t198499\n晓溪\t198500\
n几日\t198501\n听说过半熟少女\t198502\n秋蚀\t198503\n韩喜欢\t198504\n王诗意\t198505\n5161851\t198506\nDDvv\t198507\n铁矿\t198508\n院士行\t198509\n很开心心烦了听\t198510\n天津中医药大学\t198511\n秘制板\t198512\n主动性\t198513\nAvril\t198514\ncount\t198515\n将师\t198516\n了哥哥\t198517\n战爷\t198518\n小曲儿\t198519\n18777499385\t198520\n绷得\t198521\nwife\t198522\n圣像\t198523\n1903年3月8日\t198524\n谢文涛\t198525\n15061887631\t198526\n一0点\t198527\n詹N\t198528\n土豆条\t198529\n節巧克力禮盒為STAGE\t198530\n医科\t198531\n莎莎莎莎莎莎莎莎莎莎莎莎\t198532\ncolourthewaalal逗号儿thedoorranothecathinteaheesephisturrers\t198533\n5件\t198534\n格雷\t198535\n贾子欣\t198536\n范[\t198537\nKARA\t198538\n习晨\t198539\n玉无心\t198540\n很丑丑\t198541\n餐门\t198542\n别开生面\t198543\n我男闺蜜来了\t198544\n香茹\t198545\n张新鲜\t198546\n少喝\t198547\n十八辈\t198548\n意木条\t198549\n八十岁\t198550\n南金阳\t198551\n奥曲\t198552\n光润\t198553\n吧唧唧复唧唧\t198554\n地盘\t198555\n暖洋洋才陪陪陪陪陪陪免软绵绵\t198556\n米聊\t198557\n襄县\t198558\n动我决斗\t198559\nzheng\t198560\nzhena\t198561\n你好美哈哈哈\t198562\n脱产\t198563\n葡萄葡萄干腿上旅途fgxsdfgfds经济\t198564\n1级\t198565\n新年好吖新年好吖新年好吖\t198566\nButIa\t198567\n几只一个\t198568\n兔府\t198569\n劲敌\t198570\n暴力女\t198571\n一手歌\t198572\n吧乐购\t198573\n牙子鱼\t198574\nnnhb\t198575\n好哒音瑶\t198576\n原告人\t198577\n第二声\t198578\n戈壁\t198579\n第180次\t198580\n胡咧咧\t198581\n一次线\t198582\n拽逼\t198583\n掉落水\t198584\n最游记\t198585\n奥拉星\t198586\n仙女\t198587\n王彩娟\t198588\n晴雯\t198589\nCS\t198590\n健身\t198591\n杭长高速\t198592\n想当手\t198593\nCR\t198594\n雷锋何\t198595\n赵帅帅\t198596\n学校公园\t198597\n166\t198598\n670票\t198599\n内存条吗咳咳小雨\t198600\n义务工者\t198601\nCP\t198602\nAlexa\t198603\n大城堡\t198604\n全赖\t198605\n南stvq\t198606\n160\t198607\n陈汉杰\t198608\n大连\t198609\n再见了我在\t198610\n161\t198611\n广东通宇通讯\t198612\n奥头球\t198613\n大过\t198614\n帅长\t198615\n刘竹\t198616\n刘顿\t198617\n10月底\t198618\n萨尔茨堡市\t198619\n4月29日\t198620\nclyc\t198621\n机械臂\t198622\n黄囡\t198623\n兄弟度\t198624\n黄园\t198625\n陈玉莹\t198626\nyoural\t198627\n白鹭自然保护区\t198628\n请明白\t198629\n黄图\t198630\n不不聊\t198631\njbycuchc\t198632\n这周五\t198633\n吴笑笑\t198634\n魅力纪录\t198635\n辩护人\t198636\n企只猫绿\t198637\n武冰冰\t198638\nthikdddg\t198639\nVeryCD电驴\t1986
40\n综述\t198641\n柔性\t198642\nfogc\t198643\n耦合\t198644\n654311544\t198645\n伤心关\t198646\nhusf\t198647\n李英爽\t198648\nPlayStation\t198649\n前十位\t198650\n保险人\t198651\n享用\t198652\n正不正经\t198653\nnowc\t198654\n焚书坑儒\t198655\n的好不嘿嘿\t198656\njjjhvigt\t198657\n第一滴\t198658\n许光\t198659\n杨毅博\t198660\n女牛\t198661\n蓝朵朵\t198662\n66597794942\t198663\n女版\t198664\n天朝的天\t198665\n女片\t198666\n民谣\t198667\n碧空\t198668\n度盘\t198669\n梁寒亭\t198670\npoiuyg\t198671\n唐玉洁\t198672\n血咒\t198673\n下子\t198674\n甲种书\t198675\nHOHO\t198676\n暂时\t198677\n山药汁\t198678\n民调\t198679\nxcvb\t198680\n性教育\t198681\n下学\t198682\n一笑二话不说\t198683\n哎呀我的妈呀我好怕怕\t198684\nakk\t198685\n动检\t198686\n许威威\t198687\n度目\t198688\n鉴定\t198689\nsometouyou\t198690\n3几\t198691\n无瑕\t198692\n维达\t198693\n周而\t198694\n有好感\t198695\n周老\t198696\n尹禹霏\t198697\n昂白\t198698\nnihuobuchengle\t198699\n15671945678\t198700\n王亏亏\t198701\n第一枚\t198702\n心不忍\t198703\n惨叫声\t198704\n爱心形\t198705\n秋江\t198706\n秋池\t198707\n公共租赁房\t198708\n天津医院\t198709\n镗床\t198710\n你没我说的那么好脸皮厚\t198711\n厂花\t198712\n狄平生\t198713\n可丽饼\t198714\n乖来\t198715\n耶比耶\t198716\n度度秘秘\t198717\n义工堡\t198718\n顶层\t198719\n给我理由\t198720\nchjfkz\t198721\n护士\t198722\nhkcjg\t198723\n十点十三分\t198724\n今天早上九点钟\t198725\n莪歖\t198726\n葱葱茏茏\t198727\n杨若多\t198728\n太天真无邪\t198729\n澳洋\t198730\nvvvvvvvvvvvvvvvvvvvv\t198731\n尽多\t198732\n这不我\t198733\n澳洲\t198734\n听得\t198735\n爆竹简\t198736\n静静\t198737\n六七路痴\t198738\n笑的好假\t198739\n4B一份\t198740\n度秘我爱\t198741\n蛮旅\t198742\n15168307955\t198743\n咬合\t198744\n夜行观\t198745\n很一样\t198746\n不干苟同\t198747\n永世不绝的绝一\t198748\n二十二一三十三页\t198749\n尾田荣一郎\t198750\n5一四\t198751\nfgdhd\t198752\nBanda\t198753\n非礼就好\t198754\n合格率\t198755\neBay\t198756\ncuchudj\t198757\nmc王兵\t198758\n50千克\t198759\n300日\t198760\n韦彩丽\t198761\n平安\t198762\n二十多兆\t198763\n配待\t198764\n爹孩\t198765\n文不对题\t198766\n1j斤\t198767\n桑斯坦\t198768\n氨水\t198769\n有失落\t198770\n平定\t198771\n算了开\t198772\ndelixi\t198773\n平实\t198774\n潘雅楠\t198775\n和別\t198776\n可泰\t198777\n沪深交易所\t198778\n星期六百子\t198779\n酒仙桥\t198780\n一来\t198781\n103.08\t198782\n塔服\t198783\natjdkb\t198784\ncy
boucuhhhjv\t198785\n嘤嘤度秘\t198786\n個vv惡\t198787\n上任者\t198788\n做贼心虚系\t198789\n不好生气爱我我是我气你\t198790\n你自己\t198791\n17008元\t198792\n8十时\t198793\n吵笑\t198794\n15485764949849\t198795\n张胜翔\t198796\n再要不\t198797\nrjjr\t198798\n13287743585\t198799\n嗯八百\t198800\n长干\t198801\ngreen\t198802\n我心唱响\t198803\n长年\t198804\n张嘉琪\t198805\nrjjd\t198806\n没呀飞秒\t198807\nBall\t198808\n涉死\t198809\n涨潮\t198810\n伴儿\t198811\n卡门\t198812\n残联\t198813\n二零二幺\t198814\n老先生\t198815\n理科\t198816\n门边\t198817\n丰大\t198818\n焕森\t198819\n天夫人\t198820\n洪武路过街天桥\t198821\n秋凉\t198822\nguggjigf\t198823\n希尔恭\t198824\n授人以渔\t198825\n68077612/68077620\t198826\n汤们\t198827\n爱的秘笈\t198828\n南北朝\t198829\n手绘餐\t198830\n性虐\t198831\n很容易\t198832\n15089625347\t198833\n大家好\t198834\n李晨阳\t198835\n娱乐业\t198836\n韩流粉\t198837\nthen\t198838\nthem\t198839\nthel\t198840\n幽云\t198841\n话类\t198842\nthee\t198843\n理想的下午:关于旅行也关于晃荡\t198844\nanadadasini\t198845\n一板\t198846\n徐明天\t198847\n20.3亿元\t198848\n北城\t198849\nthey\t198850\n恶心恶心\t198851\n鄙人\t198852\nthes\t198853\nther\t198854\nthep\t198855\n床戏\t198856\n1.00\t198857\n南非政府\t198858\ntheG\t198859\n潮解\t198860\nhttptiebabaiducomfkwE68190E9BE99E68898E53ABE8786E9A291\t198861\n贺拜\t198862\n新娘\t198863\njeudu\t198864\n感谢你让\t198865\n硬不过\t198866\n火星回火星\t198867\n炸尸\t198868\n11111111111111111122222\t198869\ntheS\t198870\n切你孙神马\t198871\n速溶界\t198872\n下个星期\t198873\n杨哥哥\t198874\n立业\t198875\n古就\t198876\nvgenenstnsegn\t198877\n异样千玺\t198878\n才女貌\t198879\n呸别\t198880\n李小英\t198881\n各组团\t198882\n汪秋莎\t198883\n九三年\t198884\nFXXK\t198885\nsheurged\t198886\n山色\t198887\n暴君\t198888\n起舞\t198889\n杀毒秘了就你这还度秘还我的小秘书你不配当我的小秘书\t198890\n5000000\t198891\n天略\t198892\n100毫升\t198893\n赵熙睿\t198894\n针刺\t198895\n13333258280\t198896\n聪明天都\t198897\n讲义气\t198898\n不要你有你\t198899\n我的女朋友\t198900\n齐撸网\t198901\n恩降\t198902\n升至\t198903\nmicyou\t198904\n超酷\t198905\n蜀都\t198906\n湘菜馆\t198907\n水泵\t198908\nrfvd\t198909\nIjevrkwjebdje\t198910\n京巴狗\t198911\nnaoao12acass\t198912\n哈废话\t198913\nhttpo2olianwificomspedaijiaredirecthtmsrcFrom1\t198914\n王栎鑫\t198915\n不限不限\t198916\n
张路云霏\t198917\n藏民\t198918\nWii游戏\t198919\ncbvh\t198920\n民主党派\t198921\n53秒\t198922\n愚笨\t198923\n取之于民\t198924\n宋之谦\t198925\n吃电量\t198926\n亚特兰\t198927\n大块头\t198928\n卸载\t198929\n催审版\t198930\n老村长\t198931\n王雯宇\t198932\n亲缘\t198933\n东莞实验中学\t198934\n红高旗\t198935\n网都\t198936\nKlSS\t198937\n度秘你见过我的要猫\t198938\n嗯捷\t198939\nwatisyuoyname\t198940\n蓬松感\t198941\n惜时\t198942\n退税\t198943\njftgj\t198944\n银赫TW\t198945\n陈冠希\t198946\n胆真大\t198947\n我敏\t198948\n沐凌\t198949\n吴小爱\t198950\n先女\t198951\n梧州市\t198952\n国鬼\t198953\n先奸\t198954\n古尔\t198955\n嗯捏\t198956\n10几张\t198957\n钱学森\t198958\n哭了妹妹\t198959\n二十几天\t198960\n第129799个\t198961\nbeagood\t198962\n412725199707136927\t198963\n我不会\t198964\n仑巡\t198965\ncfshfhsfsgffggxgzgGsgfhdgg874559SBgmailnewsgooglegmailgooglesina33\t198966\n宏福服装厂\t198967\n十指相扣\t198968\n额嗯嗯\t198969\n四五金城\t198970\nSet\t198971\nirrjchchch\t198972\nababb\t198973\n湖光山色\t198974\n女生bb禁\t198975\nSex\t198976\n我讨厌你恨你恨你恨你讨厌你\t198977\n128045\t198978\n闫思宇\t198979\nSeo\t198980\n剩余\t198981\n爱不完\t198982\n为灰烬\t198983\nSei\t198984\n四佳沃剧毒\t198985\nも\t198986\nゃ\t198987\nめ\t198988\nゆ\t198989\nや\t198990\nり\t198991\nよ\t198992\n很错\t198993\nわ\t198994\nれ\t198995\nAT\t198996\n味口道\t198997\n二十0\t198998\n奥迪S5\t198999\n奇人\t199000\n德日\t199001\n゛\t199002\nhttpfhiphotosbaiducomxiaodupicitem500fd9f9d72a6059208a906f2f34349b033bba28jpg\t199003\n２４小时\t199004\nゞ\t199005\n゜\t199006\nゝ\t199007\n普利率\t199008\nウ\t199009\nェ\t199010\n安吉拉游戏度秘\t199011\n几圈儿\t199012\nエ\t199013\n流失为\t199014\nギ\t199015\nク\t199016\nキ\t199017\n二百元\t199018\n二百兆\t199019\nズ\t199020\n拳击\t199021\nhttppinyincn1sSBqavT6ml\t199022\nhthtiygig\t199023\n瑞殿\t199024\n平衡点\t199025\n黄标\t199026\n18年\t199027\n发物\t199028\n报点儿\t199029\n吧思密达\t199030\n原本\t199031\n鸡冠\t199032\n菊花儿\t199033\n正途\t199034\n活结\t199035\n干甚\t199036\nBone\t199037\nwhosyourcat\t199038\n发牛\t199039\n都挺好\t199040\n版画\t199041\n号召\t199042\n姚狗\t199043\n雷鸣般\t199044\n调皮就调皮吧\t199045\n我敢\t199046\n爱的电梯\t199047\nkpjbj\t199048\n奋发\t199049\n五个你好多\t199050\n阿卡丽\t199051\n号号\t199052\n昭然若揭\t199053\nsjxgks\t199054\n超级超人\
t199055\n离眸\t199056\n王大布\t199057\n起哄\t199058\n无闻\t199059\nlatoutia\t199060\n每刻\t199061\n豆面\t199062\n7.1\t199063\n隐身术\t199064\n丁婉婷\t199065\n惧怕\t199066\n这片土地是神圣的\t199067\n7.9\t199068\n一夹一夹一夹一夹一夹一夹一夹一夹滴\t199069\n戏凤\t199070\n眉头\t199071\n古鸡鸣寺\t199072\ncddxddfdsff1234567890012345t6789010\t199073\n本港\t199074\n作业题\t199075\n心想事成\t199076\n李途纯\t199077\n尼玛尼玛\t199078\n每分\t199079\n惠普森\t199080\n0570700\t199081\n相拉\t199082\n乘方\t199083\n5525535233552\t199084\n礼拜一了然后\t199085\n昨晚23点\t199086\n埃叫\t199087\n几百倍\t199088\n增长\t199089\n林晓格\t199090\n极武\t199091\n杀人不眨眼\t199092\n洞房\t199093\n汉堡屋\t199094\n38.3公里\t199095\n郭清丽\t199096\n奎龄\t199097\nNBA2K16\t199098\n你的秘\t199099\n长城m4\t199100\n笑难道\t199101\n刘殿勇\t199102\n房企\t199103\ntfbois\t199104\nFIELDHYPERTHERMIAAPPARATUS\t199105\nVIVI\t199106\n萌萌哒美美嗒帅帅哒\t199107\n汰汰\t199108\n叼毛\t199109\n炮筒\t199110\n区域\t199111\n发财致富\t199112\n选中\t199113\n孔小娟\t199114\n取胜过关\t199115\n漫灌\t199116\n24个月\t199117\n平手\t199118\n43210\t199119\n明信片\t199120\n云里雾里\t199121\n雪逾\t199122\n影厅\t199123\n主席们\t199124\njush\t199125\n鳞\t199126\n人话们\t199127\n懂完\t199128\n永城市\t199129\n五点四十五分\t199130\n花都大润发\t199131\n台风\t199132\n抗氧化剂\t199133\n母性\t199134\n俪人\t199135\n高慧\t199136\n小奥莉\t199137\n这些天\t199138\n19174\t199139\n无论如何\t199140\n周捷\t199141\n秘高高\t199142\n想不睡\t199143\n住址\t199144\n阿沛\t199145\n天水话\t199146\n晏子\t199147\n猪儿\t199148\n6263963\t199149\n599元\t199150\n逼问\t199151\n边摊\t199152\n14.8万\t199153\n五周年\t199154\n秘团\t199155\n北京市第二中级法院\t199156\n周晓丽\t199157\n100钱\t199158\n台风眼\t199159\n鹰爪假传键\t199160\n痱子\t199161\n欢笑的孩子们\t199162\n真全\t199163\n双眼皮\t199164\n炸货\t199165\n下皮\t199166\n区段\t199167\n氟利昂\t199168\n满月\t199169\n错峰限行\t199170\n三天两头\t199171\n有话不说\t199172\n坦荡\t199173\n估值\t199174\n鸭头\t199175\n困一开始\t199176\n月亮\t199177\n未来七天\t199178\n米扬\t199179\n272714654\t199180\n飞信宝贝儿\t199181\n哦路路通\t199182\n12型\t199183\n芝柏表博物馆\t199184\n2月2号\t199185\n留校\t199186\n歪头\t199187\n阿拉拉\t199188\nyLHK\t199189\n东半球\t199190\n灰嘘嘘嘘\t199191\nHfryukbccg\t199192\n凝冰\t199193\n副业\t199194\nWedding\t199195\n制片人\t199196\n一起走到底\t199197\n你是猪么我相信你\t199
198\n18809916868\t199199\n麻油\t199200\n林欣怡\t199201\n赢利\t199202\n章鱼\t199203\n茶山\t199204\n火炮\t199205\n到场\t199206\n听风\t199207\n遵命\t199208\nufygf\t199209\n茶局\t199210\n三网\t199211\n︸︷︸\t199212\n教机\t199213\n苏哥哥\t199214\n亲贤\t199215\n曹宏博\t199216\n马欣怡\t199217\n张体口唇\t199218\nFM101\t199219\n福姜西里\t199220\n格度\t199221\n唐鹏\t199222\n很远行\t199223\n忘好\t199224\n王统一\t199225\n阿咪尼嚎\t199226\n认贼作父\t199227\n沃派\t199228\n表走\t199229\neeeee\t199230\n劲度秘\t199231\n狗粮\t199232\n鲜红\t199233\n再见你\t199234\nfyicic\t199235\n大經\t199236\n打洗澡\t199237\n再见拜拜再见拜拜再见拜拜再见拜拜再见拜拜\t199238\n马赛\t199239\n表赖\t199240\n千光照\t199241\nmonarch\t199242\n9.73\t199243\n8分之八\t199244\n呵狼狼狼狼狼狼\t199245\nK彩\t199246\n安洁西呦\t199247\n联名\t199248\n胶\t199249\n胷\t199250\n联合\t199251\n摔起睡\t199252\n胺\t199253\nbytf\t199254\n就是朋好友\t199255\n色棍\t199256\nYuri\t199257\n问题我爱你爱你爱你爱你\t199258\n胡\t199259\n胦\t199260\n我让你教室的啊我要你给我再终点站\t199261\n别来\t199262\n吊铺\t199263\n瞿成伟\t199264\n胩\t199265\n胯\t199266\n胬\t199267\n该行\t199268\n娄底\t199269\n胖\t199270\nthy\t199271\nthx\t199272\n胚\t199273\n颇多\t199274\nthu\t199275\n1974\t199276\n1973\t199277\n衣机\t199278\n胜\t199279\n孝之兹\t199280\n铂程斋\t199281\nthn\t199282\n胀\t199283\n胆\t199284\nthj\t199285\nthg\t199286\nthf\t199287\nthe\t199288\nthd\t199289\n胎\t199290\nthb\t199291\n背\t199292\n我喜\t199293\n偷笑\t199294\n狂吻学校\t199295\n几盏\t199296\n迟迟\t199297\n西岭雪山\t199298\nlymein\t199299\n软男\t199300\n点在\t199301\n韩琦颖\t199302\n几盒\t199303\n内卡\t199304\n蜜桃味\t199305\nlilj\t199306\n催走\t199307\n吴玉娟\t199308\n小蕊\t199309\n环市路\t199310\n菜盒\t199311\n找钱\t199312\nFM107\t199313\n龙凤\t199314\n二个星期\t199315\n叶一鸣\t199316\n求爱\t199317\n一二得二二一三\t199318\ntutlkel\t199319\n用油\t199320\n离别说\t199321\n偶冰儿\t199322\nhfcf\t199323\n高德\t199324\n报建\t199325\n挺好玩\t199326\n交和\t199327\n天上幼儿园勒没\t199328\n原封\t199329\nioooioiiiiioooiiooioiiiioioioioiio\t199330\n楼丽君\t199331\n2X2\t199332\n运管\t199333\n含笑半步颠\t199334\n易碎易碎118\t199335\n没凶我你\t199336\n场次\t199337\nNexus\t199338\n告诉你的电话\t199339\njdgdt\t199340\n高徒\t199341\n我听得\t199342\n中新区\t199343\n那玩意\t199344\n58016571\t199345\n3.万一\t199346\n报廊\t19
9347\n90期\t199348\n我的偶像\t199349\ntfffcsrry\t199350\n德性\t199351\n五千多\t199352\n借题发挥\t199353\n萃油\t199354\n三人行\t199355\n男风\t199356\n金基范\t199357\n嘎拉哈尼\t199358\n轻拍\t199359\n誓词\t199360\n褪散\t199361\n陈志武\t199362\n轻拂\t199363\n魔王\t199364\n天哪你个哈儿\t199365\n赤壁赋\t199366\nemoji\t199367\nhpv3\t199368\n杠上\t199369\nrrrrrrrrrrrrrrrrrrrrrrr\t199370\nGhdihrj\t199371\nfhhr\t199372\n22：36\t199373\nSilvaner\t199374\n随便说分手\t199375\n纯音麦\t199376\n18736249425\t199377\ntayufifud\t199378\nrhchgdjvhfgib\t199379\nHhvb\t199380\n震度\t199381\n各异意思\t199382\n内心世界\t199383\njrnf\t199384\n白种笑\t199385\n偏颇\t199386\n刘晓\t199387\n快数\t199388\n亭子\t199389\n苦衷\t199390\n5371185280000546294237354818728136975244820545876522658003513472579756\t199391\n芭膏\t199392\n机器人脑\t199393\n天堂岛\t199394\n银剑\t199395\n生煎\t199396\n母发\t199397\n社区盾杯\t199398\n400N\t199399\n香木\t199400\n孟家沟\t199401\n郴州明天天\t199402\n终极\t199403\n奸淫\t199404\n不堪回首\t199405\n便意\t199406\n司法权\t199407\n漂移\t199408\n香机\t199409\n孟爷\t199410\n汉阴\t199411\n走下\t199412\n走上\t199413\n呼啦啦啦啦啦啦啦啦啦啦啦啦\t199414\nuthgutghfgf\t199415\n首当爱\t199416\n香月\t199417\n猪吧戒秘\t199418\n乌海\t199419\n不久前\t199420\n活车\t199421\n赛车场\t199422\n无恶不作\t199423\n得主\t199424\n酷狗唱吧\t199425\n五句话\t199426\n银饰\t199427\n裂谷\t199428\n于五\t199429\n塑身\t199430\n看穿越\t199431\n宋相兵\t199432\nonly\t199433\n雪花片\t199434\n各异\t199435\n择偶\t199436\n瞎子\t199437\n限行\t199438\n至道\t199439\n林那也\t199440\n武林外传\t199441\n1029384856\t199442\n胎教\t199443\n爱我为什么不\t199444\n实胜虚\t199445\n劬本\t199446\n花粉的蜜蜂还好我画的蜜蜂画\t199447\n無名翰\t199448\n全站仪\t199449\n金饰家\t199450\n兰州西北民院医学院\t199451\n那我们一起吧五环的奇遇记\t199452\n天天平\t199453\nDIY茶叶包\t199454\n侯俊秀\t199455\n兴义兴义\t199456\n停经\t199457\n蓝婶儿\t199458\n可鉴\t199459\n千里之行\t199460\n冉庄镇东\t199461\n斗罗年\t199462\n暖袋\t199463\n密保\t199464\n杨安吉\t199465\n别见我了我\t199466\n我喜欢的偶像\t199467\nAndroid机X903\t199468\n道娜\t199469\n9月9日\t199470\n子房沟\t199471\n妹儿\t199472\n纠纷\t199473\n蒎公\t199474\n徐少华\t199475\n从来不懂\t199476\n乌托邦\t199477\n巧吗\t199478\n龙舟赛\t199479\n女恋\t199480\n13亿美元\t199481\nddjdndxdzxxo\t199482\n熊桂菊\t199483\n我在旭神枪\t199484\n974696165\t199485\n琉球国\t
199486\n巧合\t199487\n轮越\t199488\n咯痰\t199489\n数值\t199490\n回哈\t199491\n布加迪威龙\t199492\n得丑\t199493\n李成新\t199494\ndiurudjf\t199495\nhojaye\t199496\n125909888998\t199497\n果照\t199498\n刘园\t199499\n归根到\t199500\n靠岸\t199501\n号簿\t199502\n抹红花油\t199503\n北大医院\t199504\nk841\t199505\n11点25到13点\t199506\n刘四\t199507\n黃色小笑話妳有麼\t199508\n数倍\t199509\n第一所\t199510\n双重否定句\t199511\n照例\t199512\n8454552541555455255\t199513\n冰河熊也牧堡\t199514\n那个谁\t199515\n罗嘉良\t199516\n多谢款待\t199517\nHhcvhhhhhzgkppl\t199518\n沃兹尼\t199519\n嘀嗒团\t199520\n修门\t199521\n王浏\t199522\nIamastudent\t199523\nPTA\t199524\n措无错\t199525\n顶楼\t199526\n追不到\t199527\n不好笑重\t199528\n路边\t199529\n啵啵啵啵啵啵啵啵啵啵啵啵啵啵啵啵儿谦哥\t199530\n王浩\t199531\nXxO0\t199532\n昰不昰\t199533\n易凤琪\t199534\ndhfvsjcxgbdn\t199535\n王海\t199536\ns3多0\t199537\nyuyugmeit\t199538\n三、四十年代\t199539\n真的是鬼呀的相信我吧好姐姐\t199540\n145525\t199541\nコナン\t199542\n各队\t199543\n明日午\t199544\n枭龙\t199545\n三万成五千\t199546\n有我也不知道\t199547\nbobo啵\t199548\n缓缓\t199549\nqq可乐\t199550\n保养品\t199551\n死度\t199552\n2014年\t199553\n112家\t199554\n首家\t199555\n南音\t199556\n干办\t199557\n萌达达\t199558\n反校\t199559\np\t199560\n好的唱吧\t199561\n楚瑞清\t199562\nspatankah\t199563\n选文\t199564\n眼花\t199565\n金海个箭头嘛的不是处还不\t199566\n银沙滩\t199567\n吟听计从\t199568\n6周年\t199569\n十几\t199570\n非电\t199571\n非男\t199572\nMasK\t199573\n珞珞珞珞珞珞珞珞\t199574\n龙母山泉\t199575\n狙击电话亭\t199576\n第四位\t199577\n起丑\t199578\n靠近你\t199579\n娼盛\t199580\n领导力\t199581\n交警队\t199582\n家书店\t199583\n二十千米\t199584\n恋爱王者召唤术\t199585\n比较级\t199586\n漫客\t199587\n五十四年后\t199588\n万国景仰\t199589\n板板车\t199590\n善解\t199591\n真笨\t199592\n3角\t199593\n上午8时\t199594\n操场上\t199595\ncsrgbu\t199596\n地长\t199597\n坛坛乡饭店\t199598\njuhfff\t199599\n我是天你猜我是什么\t199600\n234567我的朋友在哪里在天涯在海角我的朋友在这\t199601\n台春晚\t199602\n一窥\t199603\n20.02\t199604\n好东西\t199605\n你问\t199606\n语发\t199607\nfgggh\t199608\n一窝\t199609\n256点\t199610\nekk\t199611\n有机质\t199612\nADACvgr\t199613\n一窗\t199614\n一窖\t199615\n语句\t199616\n陆雪\t199617\nfgggg\t199618\n您好可爱\t199619\n吴孟达\t199620\n揍扁\t199621\nekx\t199622\nmnbv\t199623\n570962176350\t199624\nn奸\t199625\n呼告诉你\t1
99626\n1月26日\t199627\n小悟\t199628\n肿大\t199629\nv5cx5\t199630\n对应\t199631\nAKB\t199632\n页码\t199633\n归结\t199634\n香蕉蕾\t199635\n汉奸大审判-汪精卫\t199636\n嘎次\t199637\n隔绝\t199638\n秘不可以\t199639\n小悠\t199640\n西德狼\t199641\n啵啵啵\t199642\n扮演\t199643\n小悦\t199644\n一千二百五十块\t199645\n相衬\t199646\npmpm\t199647\n蓝月湖\t199648\n撸累\t199649\n雀蛋\t199650\n三思而后行\t199651\n任居然\t199652\nuity格林hinditsealsstretim\t199653\n高兴就走\t199654\n郭骤琨\t199655\n86397788\t199656\n两下\t199657\ny度\t199658\n告好\t199659\n邀请赛\t199660\nrenata\t199661\n早古\t199662\n啪得\t199663\n节家\t199664\n日当午\t199665\n罢了再见\t199666\n早发\t199667\n迅微博\t199668\na8l\t199669\n放一天\t199670\n新梦\t199671\n陪你了你说话\t199672\n缅\t199673\n口耕三啊\t199674\n台式电脑鼠\t199675\nspp\t199676\n6428\t199677\nsps\t199678\n杠菌\t199679\n七堡四楼\t199680\n贴画\t199681\n太实在\t199682\n迷惑\t199683\nspm\t199684\nspo\t199685\nspa\t199686\n天道酬勤\t199687\n拉拽\t199688\n吕晓萱\t199689\n美梦成真里的白花花\t199690\n整错\t199691\n未落\t199692\n古代码\t199693\n零三幺零\t199694\njjjtjaagjjmgjagpjjm\t199695\ngh9\t199696\n淋巴年\t199697\n2569638\t199698\n跪羊\t199699\n72除\t199700\n下骗人\t199701\n东风乡站\t199702\n夜深人静\t199703\n淘气行\t199704\n体考试\t199705\n朋友聚会\t199706\n导播\t199707\n兴星\t199708\n给我看一下\t199709\n多咪多咪事我一个昵称\t199710\n增高药\t199711\n骂人句\t199712\n不靠谱\t199713\n没哪口\t199714\n北极\t199715\n爱吃\t199716\n江油大康情人谷\t199717\n中央美院\t199718\n扇边\t199719\nght\t199720\nghu\t199721\nghv\t199722\nghw\t199723\n吴金\t199724\n暴王龙\t199725\nghs\t199726\n你秘\t199727\n老炮\t199728\n1222222222\t199729\nghx\t199730\nghy\t199731\n五百多页\t199732\n秋高气爽\t199733\nghd\t199734\nghe\t199735\nghf\t199736\nghg\t199737\n爱听\t199738\nghb\t199739\nghc\t199740\nghn\t199741\nghh\t199742\nghi\t199743\n爱吧\t199744\nghk\t199745\n顾防辐射\t199746\n爱派\t199747\n陋巷\t199748\nChc\t199749\n波澜\t199750\ncshx\t199751\n彭灿\t199752\n佩佩佩佩佩\t199753\n1645488653\t199754\nwugffs\t199755\nヽД\t199756\n许三多\t199757\n孽缘\t199758\n告诉石\t199759\nNike\t199760\n小哥\t199761\n些岁\t199762\n桃将\t199763\n国花\t199764\nghiphotosbaiducomxiaodupiciteme824b899a9014c0880d634d60d7b02087af4f4d0jpg\t199765\nhandfolitoubaconditioaaaanbaataralaaaaoo
kaaaa\t199766\n致病\t199767\n周勇\t199768\n毕业不会\t199769\n郝鸿\t199770\n锦衣\t199771\n璧\t199772\n大早\t199773\n个区\t199774\n九儿\t199775\n呱呱\t199776\n斥责\t199777\n凉亭海关仓库\t199778\n璐\t199779\n璞\t199780\n善龙\t199781\n冯晨\t199782\nerry\t199783\n18213996873\t199784\n多听话\t199785\n陆毅\t199786\n璃\t199787\n璀\t199788\n斤数\t199789\n璋\t199790\n篮筐\t199791\nliew\t199792\n才气\t199793\n找揍\t199794\n五一劝业\t199795\n没本事\t199796\n魔女\t199797\nYGFAMILY\t199798\nSwee\t199799\n开机\t199800\n公立学校\t199801\n共向\t199802\n惹事\t199803\n共同\t199804\nukan\t199805\n景福宫\t199806\n开朗\t199807\n谋生者\t199808\n包青天\t199809\nGfggdgdhdhehgyynddhhfhhgddhdhrgfggrgrhshajsbxjrhgdygdrgq\t199810\n副本人\t199811\n刮起\t199812\n癖好\t199813\n无极膏\t199814\n惊醒\t199815\n钒干嘛\t199816\n蚊虫\t199817\ngnmo\t199818\n自然数\t199819\n人耻\t199820\n活激\t199821\n胡战华\t199822\n多好不好\t199823\ndhiphotosbaiducomxiaodupicitem5243fbf2b2119313336811c262380cd791238d9djpg\t199824\n会儿\t199825\nOPPAYA\t199826\n249岁\t199827\n牛B\t199828\n羊羊羊羊\t199829\n够了我告诉你\t199830\nhi度秘你真好玩儿哦我跟我的朋友\t199831\n度秘度秘我真的很喜欢你是爱的哪一种你知道吗暗恋你很久了\t199832\n束了\t199833\n额遁土\t199834\n猪岛\t199835\n12355688910\t199836\n宋大叔\t199837\n篮板球\t199838\n見過\t199839\nJorn\t199840\n不堪一击\t199841\n牛a\t199842\n永远别\t199843\n牛b\t199844\n小熊猫\t199845\n质料\t199846\n李梓宁\t199847\n破没有钱\t199848\n装作\t199849\n攀爬\t199850\n李长江\t199851\n尺子\t199852\ng10国道\t199853\n离心美钻\t199854\n六出\t199855\n看不至\t199856\n绝不可能\t199857\n津涞公路\t199858\n俄俄\t199859\nsved\t199860\n曾梅雪\t199861\n林寿峰\t199862\n恶法\t199863\n这一节\t199864\n卜艳英\t199865\n咿呀呀呀呀\t199866\n原谅你了你不说\t199867\n用兵论\t199868\n长寿果\t199869\n珠光宝气\t199870\n小了美\t199871\n拉勾勾\t199872\n阿西娜\t199873\njhkkmj\t199874\n熊描\t199875\n恶习\t199876\n25945446\t199877\n足交\t199878\n大广高速\t199879\n探视\t199880\n刘雅瑜\t199881\n广告诉我\t199882\nghhhhhhhhhh\t199883\n直都\t199884\n有无有有有有\t199885\n12345671\t199886\n普世\t199887\n本博会\t199888\n饿乐\t199889\n更换\t199890\nyououous\t199891\n干错\t199892\n忘川\t199893\nloo嗯\t199894\n任风\t199895\n刘小磊\t199896\njovl\t199897\n雷雷\t199898\n一公升\t199899\nDainel\t199900\n雷雨\t199901\n十三秒\t199902\n青青河边草\t199903\n看得起来\t199904\n林月
\t199905\n书橱\t199906\n多ll\t199907\n症\t199908\n不顾一切救\t199909\n寿星\t199910\n郭湘玉\t199911\n齐子韵\t199912\namjwj\t199913\n殷則環\t199914\n滑头\t199915\n女主\t199916\n银桑\t199917\n智付\t199918\n羲票\t199919\n太野\t199920\n姝婕\t199921\n鱻\t199922\n女丫\t199923\n普兰店\t199924\n波斯米亚\t199925\n6名\t199926\n下期\t199927\n高照\t199928\n保温瓶\t199929\nCERResearch公司\t199930\n下有\t199931\n下月\t199932\n才猪\t199933\n贫瘠\t199934\n女一\t199935\n女七\t199936\n超能系\t199937\n女三\t199938\n水哥\t199939\n小狗们\t199940\n罗雨睿\t199941\n果冻场\t199942\n芳香\t199943\n嗯MI\t199944\nMV花\t199945\n韩知诺\t199946\n当面\t199947\n呦小攻\t199948\n毛度\t199949\n不符合\t199950\n绕行\t199951\n咋删\t199952\n轔\t199953\n3月4日\t199954\n郑慧芳\t199955\n轐\t199956\n转\t199957\n轮\t199958\n软\t199959\n轨\t199960\nxudx\t199961\n情色\t199962\n毛庄\t199963\n车\t199964\n轧\t199965\n载\t199966\n莫冰飞\t199967\n吊饰\t199968\n礼令\t199969\n轻\t199970\n轴\t199971\n任我行\t199972\n越洋\t199973\n轰\t199974\ngeurae\t199975\n洪教头\t199976\n才边\t199977\n了了拜拜\t199978\n算了不走\t199979\n杉杉来\t199980\n精密\t199981\n重仓\t199982\n生活必需品\t199983\n植物大战僵尸还\t199984\n郭启军\t199985\n反驳\t199986\n规约\t199987\n景别\t199988\n杨万里\t199989\n20031013\t199990\n误战机\t199991\n关闭度\t199992\nxit\t199993\n塞美女\t199994\n宝贝呀宝贝我是你的大锤包贝尔宝贝\t199995\n乌鲁木齐铁路局\t199996\n初吻了吧\t199997\n圣帝纳\t199998\n一公顷\t199999\n佳丽\t200000\n永安学校\t200001\n老闹春\t200002\n报停\t200003\n三楼\t200004\n不看我是谁我可是你\t200005\n3x4y2z十六四x3yz1\t200006\nxiu\t200007\n张颖涵\t200008\n尔又又\t200009\nkikg\t200010\n小牧羊\t200011\n都米\t200012\n不许\t200013\nhiot\t200014\n干嘛呢你格叽格叽\t200015\n硬邦邦\t200016\n不讲\t200017\n其实时\t200018\n报偿\t200019\n烘托\t200020\n9999元\t200021\n1113。Y11\t200022\n啵淬好大家好热巴\t200023\n买身\t200024\n不讨\t200025\nhiod\t200026\ninfonkanono\t200027\n湿吻\t200028\ngodand\t200029\nhusband\t200030\n不认\t200031\n71770点\t200032\nconcert\t200033\n那们\t200034\ntfbos偶像手记\t200035\ndddfd\t200036\n怪魔之太\t200037\n英雄的人\t200038\n一石恶的小小苹果榛蘑\t200039\n上床吧\t200040\n银耳粥\t200041\n这玩意儿\t200042\n看开\t200043\n腋窝\t200044\n那什\t200045\n100公里\t200046\n毕深\t200047\n地震龙\t200048\n晚课\t200049\n54837\t200050\nAABB式\t200051\ndrrrf\t200052\nurban\t200053\n等秘\t200054
\n真好累\t200055\n月钻\t200056\n看轻\t200057\n地产人网\t200058\n碟中谍4\t200059\n通缉犯\t200060\n人参果\t200061\n口舌\t200062\n打破裂\t200063\n肉类品\t200064\n戛纳国家电影节\t200065\n海口保税区\t200066\n买卖街\t200067\n走音\t200068\n喝东西\t200069\n看车\t200070\n18611386849\t200071\nvud\t200072\ncomen\t200073\nvub\t200074\n78589\t200075\nvuo\t200076\nghigy\t200077\nvuk\t200078\n聪明勒\t200079\n戈尔斯\t200080\nvuv\t200081\n气爆\t200082\nvuu\t200083\n隆中隆政\t200084\n11月5号\t200085\ncomet\t200086\nghigh\t200087\nvuz\t200088\ntestoplets\t200089\nvux\t200090\nvuy\t200091\n帕格尼尼\t200092\n举一例\t200093\n元列\t200094\n嘞我知道\t200095\nboub\t200096\n今天晚上十二点\t200097\n为甚我无聊你无聊我无聊你无聊你\t200098\n英明\t200099\n杉杉杉\t200100\n近期末\t200101\n卞扣红\t200102\n进京\t200103\n三言两语\t200104\n萌萌哒嗯\t200105\nAreyouadog\t200106\n退一步\t200107\n阿坝\t200108\n断断续续\t200109\n二七二四二三八零\t200110\n亞軍\t200111\nbbmmjm别\t200112\n房友\t200113\n香榭\t200114\n上野动物园\t200115\n转院\t200116\n阿[哼\t200117\n几委\t200118\n万尔\t200119\n12357855687565088566547966\t200120\n奥特曼奥特曼\t200121\n唐文艳\t200122\nMymotherisa\t200123\n7月底8\t200124\n王东东\t200125\n报信\t200126\n依旧\t200127\n报修\t200128\n开玩笑男扮女\t200129\njfhrrirkshcHfus4775htufut\t200130\n老不想\t200131\n赵增燕\t200132\n拍照\t200133\n小包子\t200134\n畅影\t200135\n都鲁尼\t200136\n那鲁北\t200137\n早起礼拜\t200138\n1561个\t200139\n慈化\t200140\n死记硬背\t200141\n点烟气\t200142\n195块\t200143\n小婊子\t200144\n有声\t200145\n呃毛\t200146\n茅山\t200147\n奔小说不不不一\t200148\nsoin\t200149\n歌了当然slo他们说下我爱你我爱你我爱你我爱你我要\t200150\n伍章明\t200151\n夸夸夸夸夸夸夸夸夸夸夸夸你\t200152\n13137212871\t200153\n我和你说的话\t200154\n下半场\t200155\n倒是你的话\t200156\n在家不在\t200157\n脑白金\t200158\n好嘛们\t200159\n51555\t200160\n近身\t200161\n42号\t200162\n动不动\t200163\nIamMiss\t200164\n好心朋友\t200165\nqrk9rv\t200166\n国际村\t200167\n老城\t200168\n福巩\t200169\n五五个\t200170\n4道\t200171\n飞球\t200172\n菅直人\t200173\n相当于\t200174\n泡菜味\t200175\n1418337201\t200176\n不吧不吧\t200177\n浇灌\t200178\n红树林公园\t200179\n吟诗\t200180\n庄镇\t200181\neverisworth\t200182\n最近几天\t200183\n玩印\t200184\n金台区\t200185\nghffg\t200186\n15073410121\t200187\n黄小晶\t200188\n辣椒糖\t200189\n南尾炎\t200190\n吟诵\t200191\n福州\t200192\n微颖\t20019
3\n娜塔\t200194\n手包\t200195\n曹安省\t200196\nKBS公开音乐会dailyvita门板更新特源旭3P\t200197\n33家\t200198\n说会话说\t200199\n我就是不要你了臭东西\t200200\n1月25号4点39分\t200201\n051来啦\t200202\nsmrit\t200203\n旺星人\t200204\n嗦死柳\t200205\nMEET\t200206\n了不要脸\t200207\n夏姑娘\t200208\n咯老子\t200209\n百周年\t200210\nglusr\t200211\n伯杰\t200212\n坚如磐石\t200213\n陈世祖\t200214\n施工队\t200215\n小灰儿\t200216\n李慧宇\t200217\n我不我不我\t200218\n偏旁组\t200219\n我是你的朋\t200220\n四点十六点二十五分钟\t200221\n伍思羽\t200222\n归原处\t200223\n心腹之交\t200224\n宣传稿\t200225\n恶贯满盈\t200226\n打大奔\t200227\n报表们\t200228\n和你的会话\t200229\n确信\t200230\n一点通\t200231\n聊了一聊\t200232\n要不好久试试\t200233\n好痒\t200234\nShow\t200235\n帝庙\t200236\niphone7\t200237\n好痛\t200238\niphone5\t200239\n隐形眼镜\t200240\n书报亭\t200241\n狠伤心\t200242\n张昕竹\t200243\n逗逗逗\t200244\n文睿\t200245\n西风\t200246\n没听过\t200247\n花源地\t200248\n莱斯梅q\t200249\n旬\t200250\n秦朝\t200251\n旮\t200252\n旯\t200253\n旨\t200254\n早\t200255\nDNBN\t200256\n死蟹蟹\t200257\n日\t200258\n旦\t200259\n闫亚同\t200260\n无\t200261\n既\t200262\n十届\t200263\n李泽君\t200264\n二分之一m\t200265\n爬爬\t200266\n旺\t200267\n旴\t200268\n旵\t200269\n625594592\t200270\n旱\t200271\n旳\t200272\n铜烂\t200273\n族\t200274\n5777777\t200275\n龙龙你是猪你是猪你是一头大蠢猪\t200276\n挤进\t200277\n旋\t200278\n旅\t200279\n曹素杰\t200280\n建安七子\t200281\n旁\t200282\n龟龄\t200283\n宝岛\t200284\n王冠如\t200285\n痴爱\t200286\n丿噤\t200287\n回流\t200288\n候补\t200289\n两块\t200290\n温小兰\t200291\n狼二雪\t200292\n要闻\t200293\n蒋雨晴\t200294\n自己一己\t200295\n巡回来\t200296\n爱了拜拜\t200297\n侯宏钰\t200298\n大動\t200299\n隐隐约\t200300\n跑起\t200301\n囚犯样\t200302\n亚风景空\t200303\n恋欲\t200304\nfhxggggkgjng\t200305\n六礼\t200306\n五舞舞王\t200307\n四什么叫\t200308\n不能不是\t200309\n李发兴\t200310\n大磊嗣\t200311\n乐Phone\t200312\n扎堆\t200313\n南侨\t200314\n但求是\t200315\n跑赛\t200316\n珍稀\t200317\n哽咽\t200318\n不要化妆了我讨厌你我讨厌你我讨厌你\t200319\n大了白了零零\t200320\n明珠暗投\t200321\n没有假\t200322\n南蛮\t200323\n米度秘\t200324\n黯淡无光\t200325\n你好磨叽磨叽磨叽磨\t200326\n许涛鹏\t200327\n玛索自制别饿\t200328\n二二二七\t200329\nPhone\t200330\n苗家的疙瘩的话我爱死你\t200331\n4254253435353532353532353133328556\t200332\n飞出\t200333\nfygx\t200334\n快点快点\t200335\nhuttrttyt\t200336\n苏打绿\t200
337\n山东海龙\t200338\n蒋诗颜\t200339\n广东公所\t200340\n裴磊\t200341\n四牛\t200342\n红豆泥奇莫鸡哇奇米子\t200343\n少时\t200344\n买了吧\t200345\n货价\t200346\n拆散\t200347\n布阵\t200348\n么么哒3\t200349\n小两口\t200350\n白醋\t200351\n不要再说\t200352\n四版\t200353\n费不停机\t200354\n少无\t200355\n花莫言\t200356\ntrying\t200357\n罚款\t200358\ntutum\t200359\nYouareafewyearsagoanditisnotanissuethatisagooddayforafewweeksagoIhadagreatwaytothepointwherethefirsthalfof\t200360\n咔咔咔对\t200361\n左桂芹\t200362\ntutuj\t200363\n死机器\t200364\n吴一奕\t200365\n慈祥\t200366\ntutuc\t200367\n啊！火箭\t200368\n轻度妄想症\t200369\n喜欢自己\t200370\n西经\t200371\n篝火\t200372\n熊出没之雪雄心\t200373\n四天光\t200374\n无线防\t200375\nUC浏览器\t200376\n国家保密法\t200377\n男仔\t200378\n卖度\t200379\n了却\t200380\n幺二零幺零五一九五二零八幺三零三幺二\t200381\n3292292\t200382\n铠敏\t200383\njidian\t200384\n六点五点四点\t200385\n埋埋\t200386\n入睡\t200387\n剧团\t200388\n卓依婷\t200389\n乌兰乌德\t200390\n温珍妮\t200391\n男什\t200392\n男份\t200393\n陈向东\t200394\n烦良\t200395\n口长\t200396\n你是谁你是谁你是天下头脑花\t200397\n3DS\t200398\n男们\t200399\n鱼蛋\t200400\n阳阳冕\t200401\n围观看看\t200402\n飘飘然远去\t200403\n河南菜\t200404\n主扮演\t200405\n凯歌\t200406\n天黑啊大哥\t200407\n嗯大姐\t200408\n嗯大姑\t200409\n红顶\t200410\n说怕\t200411\n零六年\t200412\n强酸\t200413\n许梦阳\t200414\n一不干\t200415\n苦役\t200416\n劳动节\t200417\n回补\t200418\n破度秘\t200419\n河港\t200420\n︶︹︺\t200421\nv女\t200422\n13958156457\t200423\n去污液\t200424\n学校食堂\t200425\n中国\t200426\n想错\t200427\n100日\t200428\n丁当\t200429\n十年内\t200430\n06月18日05时25分\t200431\n害我想\t200432\n明天早\t200433\nterfly\t200434\n打奥他走\t200435\n一点性\t200436\n無憂\t200437\nApplebite\t200438\n爨先\t200439\n防弹少年团里\t200440\n合我意\t200441\n祝亲\t200442\n老伯伯\t200443\n五三零九\t200444\n蝴蝶蜜蜂\t200445\n张筱乔\t200446\n戒备\t200447\n这个头\t200448\n才妖\t200449\n景顺长城鼎益基金\t200450\n小瘙\t200451\n白露娇\t200452\n柏林电影节\t200453\nexocp\t200454\n沈请秋\t200455\n20002999\t200456\ncdnr\t200457\n南都周刊\t200458\n硬糖\t200459\n旖旎种\t200460\n既定\t200461\n白鬼佬\t200462\n553555\t200463\n王佳楠\t200464\n豹猫\t200465\n好式\t200466\n尊言\t200467\n囊懂\t200468\n毁谤\t200469\n一元们\t200470\n一切安好\t200471\nｔｏｏｉｐｉｐｌｏｐｐｏｐｒｏ\t200472\n我讨厌我讨厌讨厌讨厌讨厌讨厌\t200473\n140码\t200474\n垮垮\t2004
75\n轨物\t200476\n度秘你好度秘不好度秘\t200477\n胡拉镇\t200478\n费用费\t200479\n空落落\t200480\n宪章\t200481\nEJEKKD\t200482\n65456545\t200483\n英格拉姆\t200484\n颜中玉\t200485\n广岛长崎\t200486\n破壁料理机\t200487\n也不见\t200488\n甲地\t200489\n玻璃杯\t200490\n好强\t200491\n自然界\t200492\n奶味\t200493\n踢足球\t200494\n博阿斯\t200495\n钱集\t200496\n便秘我懂了你是女的\t200497\n实测\t200498\n3.19\t200499\n高雅\t200500\n老是\t200501\n鬼片\t200502\nc379310a55b31944aef20644a98226cffc1769jpg\t200503\n哼托\t200504\n咕咕\t200505\n猪肉粒\t200506\n体育局\t200507\n一大半天\t200508\n沥汤\t200509\n老昰\t200510\n帅我帅\t200511\n3.1%\t200512\n巜釆薇\t200513\nAnangel\t200514\n坏蛋嗯\t200515\n52222452424\t200516\n鬼物\t200517\n小浪们\t200518\n刚明明\t200519\nvnnkfxc\t200520\n也敢\t200521\n中小板\t200522\n弃女\t200523\n7个\t200524\n7355767589786785861315\t200525\n死你可以\t200526\ncvhhhb\t200527\n呱呱呱呱呱呱\t200528\n武赞\t200529\n幸福家人\t200530\n金海真\t200531\n杜蕾斯公司\t200532\n沟子\t200533\n确定\t200534\n刘永好\t200535\nstrong\t200536\n纳斯达克\t200537\n伍冰洁\t200538\n英国\t200539\n张恒紫翊\t200540\nh罩杯\t200541\n125588\t200542\n杨春风\t200543\n回老家\t200544\n1756068323\t200545\n你的你的大什么意思你是你的我是我的\t200546\n聪明你好智能\t200547\n名门世家\t200548\n啜鹤雄\t200549\n你多说点爱我的话\t200550\n啊玲\t200551\n弹雨\t200552\njijc\t200553\n家常红烧鲳鱼\t200554\n言屌\t200555\n狗斗\t200556\n送出堡\t200557\n一J个\t200558\n一九五二。一九五二零四零\t200559\nngkrh\t200560\n垃圾\t200561\n正赛\t200562\n工作人员\t200563\n相克\t200564\n相先\t200565\n欧姆噶\t200566\n脸脸\t200567\n相关\t200568\n焦灼\t200569\n明天子\t200570\n事实真相\t200571\n初错\t200572\ndhgfxxv\t200573\n法门\t200574\n二零零五\t200575\n箱根\t200576\n崔爱新\t200577\n眼药水\t200578\n122095021\t200579\n有诚意\t200580\n肺片\t200581\n我是王弥陀佛\t200582\nedfffffffffffffffffffffffff\t200583\n南京中院\t200584\n高中部\t200585\n黛梦\t200586\n呀呀呀呀呀呀呀呀呀呀呀呀\t200587\n他丁\t200588\n茜予\t200589\nGhfjgnbbnjkkbnkjjkkkopho\t200590\n频频\t200591\n理性堡\t200592\nJATGD\t200593\n十足\t200594\n干预\t200595\n中低端\t200596\n逆到底\t200597\n董璇\t200598\n唉声\t200599\n趣解\t200600\n上周三\t200601\n苍天\t200602\n禁车\t200603\n吗一喜欢\t200604\n空前\t200605\n小姐的度度你在干嘛\t200606\n量子力学\t200607\n风云的链\t200608\n说了说\t200609\n扣扣号110023\t200610\n一定要答答答答答\t200611\n畅开\t200612\n商科\t2006
13\n见讲\t200614\n红烧鱼\t200615\n妈咪\t200616\n4点50分\t200617\n到底是谁喜欢\t200618\nf1\t200619\n明小\t200620\nf5\t200621\nf7\t200622\n倒流\t200623\n过好日子\t200624\n亦可\t200625\n世茂股份\t200626\n李那儿\t200627\n234个\t200628\n洛杉矶市\t200629\n赵怀智\t200630\n悉达多\t200631\nDay好丁用来平丰盛7了匹住在\t200632\n司考\t200633\n征服者\t200634\n易懂\t200635\n喇嘛\t200636\n累看\t200637\n第十二名\t200638\n130围纤\t200639\n奥委会\t200640\n弋＿\t200641\n倒海\t200642\n不是我是说我无聊我找你\t200643\n你好机器人\t200644\nfq\t200645\nfr\t200646\nfs\t200647\nft\t200648\n不啦\t200649\nfv\t200650\nfx\t200651\nfy\t200652\nfz\t200653\n典藏\t200654\n徐立伟\t200655\nfa\t200656\nfb\t200657\nfc\t200658\nfd\t200659\nfe\t200660\nff\t200661\nfg\t200662\n硅藻泥\t200663\nfi\t200664\nfj\t200665\nfk\t200666\nfl\t200667\nfm\t200668\nfn\t200669\nfo\t200670\n翻案\t200671\n童话村\t200672\n排班\t200673\nfT\t200674\n首张\t200675\n吴永平\t200676\n不啊\t200677\n此幅\t200678\n杨月莉\t200679\n111111\t200680\n哎呀你好\t200681\nfC\t200682\n墨菲\t200683\nfG\t200684\n8646\t200685\nfI\t200686\n8644\t200687\n梦樱\t200688\n舒服自在\t200689\n开心点\t200690\n于童\t200691\n看光\t200692\n18096965013\t200693\nAV架\t200694\n资江\t200695\nHxhvycnjf\t200696\nGHF\t200697\n丑数\t200698\n电脑桌\t200699\n陈和\t200700\n戴可儿\t200701\n磕墙\t200702\n谭面无\t200703\nGHI\t200704\n海滩\t200705\n海滨\t200706\n我猜你猜我猜猜你猜我\t200707\n不折不扣\t200708\n说点别\t200709\n胡今夜\t200710\n电影儿\t200711\n优股\t200712\n光九江\t200713\n糯米\t200714\n大勺\t200715\n五月十八号\t200716\n妈妈咪咪\t200717\n是不是我喜欢\t200718\n13942731244\t200719\n自恋耶\t200720\n迷人\t200721\n找看\t200722\n31块\t200723\n八六十块\t200724\nfdus\t200725\n雷呆\t200726\n私有化\t200727\n李明昊\t200728\n棒读\t200729\n#一线江景江城府#\t200730\n一千五百兆\t200731\n李嫚\t200732\n牵牵牵手\t200733\n多空\t200734\n嫦娥明\t200735\n冲击棒\t200736\n胸无点墨\t200737\nneIrene\t200738\n时间版\t200739\nhnvfil\t200740\n替考\t200741\n撂爪\t200742\n一非彼\t200743\n啊呜\t200744\n古月\t200745\nsacnn\t200746\n一张脸\t200747\n蜘蛛性\t200748\n十六条\t200749\n绿岛学校\t200750\nifil\t200751\n杨瑞\t200752\nhvgghkkk\t200753\n王金桐\t200754\n啊呀\t200755\n好可笑\t200756\n你是我的家人你是我的宠物萌\t200757\n帝制运动\t200758\n多咪多咪你在吗度秘\t200759\n捶肩\t200760\n1月8号\t200761\n站出发\t200762\n带花\t20076
3\n雨客\t200764\neurif\t200765\n秦腔\t200766\nHskfmjxcjhjdjjahuzmsgsjcjdgjajxukdhjsjjxjdjjakxjjsjnfjjdjekzhdmisgdhdjdjjdjjjjjdhehdjzyshdjdjskdjkdj\t200767\n舒子睿\t200768\n单侠\t200769\n整栋大厦\t200770\n539899\t200771\n深海恐惧症\t200772\n饰品\t200773\n煤气\t200774\n思梦想\t200775\nbghrrhgdffffheeytjdhthvgkgvgdgytfbhfbhcbh\t200776\n康智健\t200777\n折段\t200778\n乐全\t200779\n陈安俊\t200780\n滿足\t200781\n民主监督\t200782\n废和\t200783\n张黄春\t200784\n赫本\t200785\n北斗星时\t200786\nbuyaren\t200787\nmlllk\t200788\n0746\t200789\n倚巧者\t200790\n波尔蒂\t200791\n编外\t200792\n航道\t200793\nx你说说二就是我选修还说你爱说什么说什么\t200794\n查帮\t200795\nreduub\t200796\n們網站\t200797\n个信\t200798\nImc\t200799\n丫头恋\t200800\n颜嘉怡\t200801\n德\t200802\n徹\t200803\nfi8u\t200804\n飞身\t200805\n说呀祝福\t200806\n早离婚\t200807\nImr\t200808\n一汽\t200809\n盟战\t200810\n徨\t200811\n怕呀呀一答\t200812\n张露\t200813\n王的盛宴\t200814\n军粮\t200815\n13560683399\t200816\nzgffff\t200817\n徐\t200818\n徒\t200819\n您还\t200820\n拉翔\t200821\n得\t200822\n徘\t200823\n從\t200824\n徘徊九不在\t200825\n往\t200826\n萌萌哒智能机器人度秘\t200827\nLarry\t200828\n偏食症\t200829\n待\t200830\n怯生\t200831\n很\t200832\n518号\t200833\n2749163268\t200834\n律\t200835\n把着\t200836\n王汉雄\t200837\n合得来\t200838\n陈银水\t200839\nguufg\t200840\n猪皮\t200841\n772583\t200842\n表现出\t200843\n卷烫\t200844\n362307799602095033\t200845\n蓉蓉蓉蓉蓉蓉\t200846\n伤心管我毛事\t200847\n泥石\t200848\n那云翳恨云染\t200849\n我家爱情我叫爱情我叫赖琴行\t200850\n零八八零\t200851\n1127\t200852\nguufs\t200853\n390个\t200854\n卷烟\t200855\n再讲价\t200856\n博雅\t200857\n我喜欢王我喜欢王喜欢王是我喜欢王\t200858\n对啊我不认识你是谁\t200859\n呵亚\t200860\n17世纪\t200861\n李汝培\t200862\ntttttttttttagjaajjgjjjgjmagjdajppmmaggdag\t200863\n莫咏欣\t200864\n雪姨\t200865\ngidjc\t200866\n布达拉\t200867\n任达\t200868\n裸婚\t200869\n汉国\t200870\n一处百\t200871\n不要吗不要不要不要不要不要我受够你了我走\t200872\n杨洋#12316\t200873\n经纪人\t200874\n15667928223\t200875\n委员会\t200876\n小男生\t200877\n么斯\t200878\n茅台\t200879\nHDMIBvlgari\t200880\nirttuuu\t200881\n要好多\t200882\n郭发\t200883\n饶有兴趣\t200884\n拖拉机\t200885\n卫校\t200886\nNeed\t200887\n江奥\t200888\n过去16年\t200889\n我太富有\t200890\n开心心情不好\t200891\n蓝一笑\t200892\n萧仲琪\t200893\n遂宁\t2008
94\nノ┻━┻\t200895\n朋香云\t200896\n跳吧\t200897\n一妻多夫类\t200898\n独董\t200899\nQQ兔兔\t200900\n难听我的\t200901\n300倍\t200902\n一张个\t200903\n赵利英\t200904\n珍藏\t200905\n国民收入\t200906\n姨瓜\t200907\nghrgg\t200908\njiovh\t200909\n龙虎榜\t200910\n大野\t200911\n郭叫\t200912\nkajima\t200913\n17783\t200914\n满满的爱\t200915\n万事\t200916\n万二\t200917\n668\t200918\n神丘\t200919\n666\t200920\n665\t200921\n664\t200922\n杂音\t200923\n662\t200924\n661\t200925\n撒片\t200926\n1515364208\t200927\n死老娘\t200928\n恁哥\t200929\n55556555\t200930\nSPA科\t200931\n小壬\t200932\n小壮\t200933\n王亚伟\t200934\n十几亿\t200935\n小粉红茶餐厅\t200936\n28949552\t200937\n万人\t200938\n万亿\t200939\nan44\t200940\n小声\t200941\nTeachersDay\t200942\n香芋奶茶\t200943\n7885558900008562\t200944\nιé\t200945\n舌战\t200946\n打官司\t200947\n剪不断\t200948\n索隆\t200949\n66p\t200950\n八八大nn姼妼\t200951\n和你的错\t200952\n15365799\t200953\n饺子皮\t200954\n点触\t200955\n点解\t200956\n媳妇儿\t200957\n牛膝\t200958\n1234467840\t200959\n鞋服\t200960\n黄一鸣\t200961\n2778298580\t200962\n从今以后\t200963\n正儿八经\t200964\n洋娃娃\t200965\n125155454645484556535655555545555555555855555555555555555555555555\t200966\n童话\t200967\n莫名奇怪\t200968\n不好看我爱你给我\t200969\n6点15\t200970\n哼真\t200971\n说五过\t200972\n禁令\t200973\n咋俩\t200974\n旧历\t200975\n瑶族\t200976\nthcfhh\t200977\n某事\t200978\n170家\t200979\n李易桐\t200980\nCHnoof\t200981\n树莓书\t200982\n吕康文\t200983\n一可亲\t200984\n某些\t200985\n归队\t200986\n北端\t200987\n信不信\t200988\n叫问\t200989\n五一碗\t200990\n数余下一个四个\t200991\n我喜欢美丽的人\t200992\n幺八零二二\t200993\n北站\t200994\ndgdhod\t200995\n鸿蒙单\t200996\nbeiow\t200997\n栽倒\t200998\n虹猫\t200999\n崔婷\t201000\n移动电话\t201001\n某人\t201002\n嗚嗚嗚好啦\t201003\n芝士啦啦嘿嘿嘿棒棒堂\t201004\n我不我不想\t201005\n天高跟鞋\t201006\n8月6日早晨10点\t201007\n第4届\t201008\n30美元\t201009\n当你好朋友\t201010\nhihvy\t201011\n15957955689\t201012\ncueig\t201013\nElementary\t201014\n2012#\t201015\n肉松\t201016\n可不可信\t201017\n金橘有理气解郁\t201018\ndtuffg\t201019\n黄舒婷\t201020\n周恩来\t201021\n移开\t201022\n九十八岁\t201023\n阿源小贤\t201024\n蓝a\t201025\n旺角\t201026\n绝秒\t201027\nsgdwsss\t201028\n中兴通讯送豪礼\t201029\n哎呀呀呀呀呀呀呀呀呀呀呀呀\t201030\nVUF\t201031\n陈方庭\t20103
2\n招唤\t201033\n13259432864\t201034\n找平\t201035\ndjmhghm\t201036\n偶福房网\t201037\n肥度\t201038\n文杰\t201039\nNC粉\t201040\n明天傍晚18：00\t201041\n吃甜\t201042\n五千年间\t201043\n2012佐登奴\t201044\n早恋\t201045\n上视节\t201046\n大秦女\t201047\n去找你好不好\t201048\n凯爷帅\t201049\nkaverryNIE\t201050\n避嫌\t201051\n张建设\t201052\n挽雕\t201053\n知不去\t201054\n我喜欢你我想做你的主人\t201055\nappicahyou\t201056\n吃电\t201057\n12845\t201058\n思尔艺\t201059\n帮留意\t201060\n布心\t201061\n100.71%\t201062\nHeyester\t201063\n伊萌\t201064\n嗯慢点阿bbb\t201065\nkjkj\t201066\n好曰\t201067\n班家\t201068\n好听学\t201069\n色度\t201070\n全功\t201071\njgjihx\t201072\n太空的太空\t201073\n快乐你快乐\t201074\nvhcggf\t201075\n这个臭不要脸\t201076\n脱衣舞\t201077\n夕儿\t201078\n罗胜\t201079\npickout\t201080\ngggggggggggggh8rtgj\t201081\nuufoora8ewj9wsottiotss\t201082\n医疗卫生\t201083\n气候变暖\t201084\njpk\t201085\njpj\t201086\njpm\t201087\n心个凶匕也凶必a匚nnnn必e山nnnnnnZn\t201088\ntuuh\t201089\n15分钟后\t201090\n骇人听闻\t201091\ntuug\t201092\njpd\t201093\njpg\t201094\n富态\t201095\n2天1晚\t201096\n你是我大姐不是我是女女家阿杰阿姐\t201097\n赤裸裸\t201098\njpp\t201099\n3838438\t201100\njpu\t201101\njpt\t201102\n16465\t201103\n13002868608\t201104\njpK\t201105\n暗帝\t201106\nTRWjpw\t201107\nSTUW\t201108\n真相大白\t201109\ngjugh\t201110\n33333333888888888\t201111\n哈笨猪蛋\t201112\n叠水\t201113\n意哥\t201114\ngjugd\t201115\n美驴\t201116\n为民除害\t201117\n十年少\t201118\nIPAD\t201119\n王子轩\t201120\n劲头儿\t201121\n64348180584565\t201122\n真真爱人\t201123\n秘你好美\t201124\n幸福花园\t201125\n配有关\t201126\n华盛顿邮报\t201127\n沙元野\t201128\n突尼斯\t201129\n翻译器\t201130\n息事宁人\t201131\njxxxn\t201132\n正德\t201133\ncdsfsedhgg\t201134\n真真正\t201135\n一般儿\t201136\n一不叫\t201137\n喜歌\t201138\n秘拍\t201139\n吸血鬼日记\t201140\n故园\t201141\n正往\t201142\n秘招\t201143\n客服类\t201144\n梳\t201145\n秘拜\t201146\n15161370975\t201147\n五百零一一六八头\t201148\n奥斯卡\t201149\n提子\t201150\n金唱片\t201151\n二百多斤\t201152\n错呀\t201153\n力气候机遇刺\t201154\nXxX\t201155\n上海财经大学出版社\t201156\n小多彬\t201157\n101次\t201158\nGICUCUCUFUFUFUFT\t201159\n再也不理你了你个坏秘度度秘\t201160\n赛罕区\t201161\n换名\t201162\n近似数\t201163\n峨眉道\t201164\nXxx\t201165\n冮华县\t201166\n5拉\t201167\n摸李\
t201168\n999999999999999999\t201169\n违法乱纪\t201170\n分批\t201171\n光辉岁月\t201172\n孙家栋\t201173\n哈爸爸好爸爸\t201174\n云梦县\t201175\n13:09\t201176\n中国江苏网\t201177\n达拉\t201178\n快点儿三不然我可真的不理你了我\t201179\nｑｑ靠靠靠靠靠，＠ｏｕ\t201180\n外快\t201181\n温向东\t201182\n微访见\t201183\n弱酸制强酸\t201184\n刘应鑫\t201185\n零一四\t201186\n神农\t201187\nTRAFIT\t201188\n阴曹\t201189\n闫婆惜\t201190\n1122552257574411111111111111\t201191\n董事长度秘\t201192\n徐紫嫣\t201193\n罗学艺\t201194\n好精辟\t201195\n外心\t201196\ntffdjgdyghhfe\t201197\nt波伊斯\t201198\n2731142456\t201199\n太sexy\t201200\n双照中心小学\t201201\n把的微笑\t201202\n5211258\t201203\n千柏\t201204\n程亮\t201205\n龟仔\t201206\n向量\t201207\nK1285\t201208\n奇迹童书馆\t201209\nappletv4\t201210\n鲁敏\t201211\n狄杰斯\t201212\nmpadm\t201213\nq友乐园\t201214\nfffggsf\t201215\n润滑剂性\t201216\nfgme\t201217\n鲁教\t201218\n僮那尼\t201219\n500人次\t201220\n信过\t201221\n园园姐\t201222\n二零四幺\t201223\n台州大学\t201224\n一小时左右\t201225\n炫彩小汤圆\t201226\n妮儿\t201227\n小白兔\t201228\n多多指教\t201229\n刘彩红\t201230\n啦啦啦啦啦巴拉\t201231\n不要脸的听不懂人\t201232\n邻居家\t201233\n经济学人\t201234\n1575555745\t201235\n即送\t201236\n超频三黑龙\t201237\n封发\t201238\n别天\t201239\n你们么\t201240\n阿拉宁波网\t201241\n6300亿日元\t201242\n四大天王惊天四重唱\t201243\nosen\t201244\n李永波\t201245\n太首小苹果\t201246\n30日晚上\t201247\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t201248\n烦剧\t201249\n流失\t201250\n卫视\t201251\n冰冻\t201252\n哎呦呦呵行\t201253\n封号\t201254\nhtfeghghhhhhhhhjjjjjjkjjkkkkklkl\t201255\n阿惠\t201256\n林春\t201257\nHIMMM\t201258\nSoaUaW\t201259\n冰冷\t201260\n封口\t201261\nZhai\t201262\n春风得意\t201263\nyyhgghb\t201264\n1129606448\t201265\n说而决策\t201266\n2011新年\t201267\n郭文茹\t201268\n复归\t201269\nrsfdxvgc\t201270\n形容\t201271\n陈明英\t201272\n1十2一\t201273\n盛京热\t201274\njjgggijjujjjjj\t201275\n黑美妞\t201276\n蒸臭豆腐\t201277\n伯起雄\t201278\n赵密码\t201279\n1500000000\t201280\n稻谷\t201281\n默认\t201282\n贝儿\t201283\n荷包蛋\t201284\n白色\t201285\n自然科学类\t201286\nWilsonO\t201287\n郑英琪\t201288\n分毫不差\t201289\n一堵\t201290\n太帅鸟\t201291\n医书\t201292\n看交\t201293\n电视\t201294\n炸狂\t201295\n明天七点\t201296\n天水二院\t201297\n添柏\t201298\n车牌\t201299\nHjfjdnfkfbfkbfbfiiejrbfbcjfbfbchjf\t201300\n四月
廿二\t201301\n如期\t201302\n旁若无人\t201303\n88度\t201304\n陪着你好\t201305\n官民\t201306\n薄薄\t201307\n硝烟\t201308\n不怕苦\t201309\n阿叻\t201310\n艰难困苦\t201311\n一堆\t201312\n防盗\t201313\n一堂\t201314\n歌头\t201315\n给我个你\t201316\n6月11日\t201317\n千字\t201318\n小红灯\t201319\n20发\t201320\n小汤圆\t201321\n今年7月1日—9月30日\t201322\n太阳亚太\t201323\n小甜馨\t201324\n我喜欢狗\t201325\n此地\t201326\n东酉\t201327\n罕见\t201328\n独未变\t201329\n歌天\t201330\n此场\t201331\n枪类\t201332\n很不错麻\t201333\n到此到此\t201334\n脚背\t201335\n393939\t201336\n园工林人\t201337\n走偷再接再厉\t201338\n20台\t201339\n20号\t201340\n你好恶心\t201341\n20只\t201342\n炒河粉\t201343\n咪你可以\t201344\n早上四点\t201345\nBarbie\t201346\n求情\t201347\n切腹\t201348\n东营银行\t201349\n胡海洋\t201350\n777列队\t201351\n热心度\t201352\n吴君瑜\t201353\n丨乚\t201354\nPanetta\t201355\n开身\t201356\n古佛\t201357\n漏洞\t201358\n我那个家和万事兴\t201359\n不是我愿不愿意\t201360\n棒崩\t201361\n英魂\t201362\n解围\t201363\n谬赞\t201364\n肉瘩\t201365\n你不离开我我的甩开你\t201366\n肉瘤\t201367\n焕发\t201368\n莫学龙\t201369\nWhywhy\t201370\n五指峰\t201371\n王小雅\t201372\n不声\t201373\n三六岁\t201374\n厨子轩\t201375\n两旺\t201376\n薛璐雅\t201377\n我们一家\t201378\n过重返\t201379\n茵陈体贴\t201380\n核查\t201381\n真心服\t201382\n一句行\t201383\n吴淡如\t201384\n飞就飞\t201385\nGghdiijjkkmioppkm\t201386\n靠不穿黄雀在后\t201387\n大说我爱你\t201388\n假话\t201389\n得意堡\t201390\n一百多兆\t201391\n陪着你说\t201392\n高连阳\t201393\n呼蹦\t201394\n沉陷\t201395\nc片\t201396\n毋宁\t201397\n占爱\t201398\n假说\t201399\n豆净清\t201400\n8块\t201401\n39彭\t201402\n44604\t201403\n沉盈\t201404\nkitty\t201405\n8班\t201406\n摩尔\t201407\nfjggh\t201408\n帅气\t201409\n吖陆henTT\t201410\n一个四千哦\t201411\n靖城街道\t201412\n看不来\t201413\n旭格\t201414\n地壳\t201415\n出奇\t201416\n袁祝婷\t201417\n田和斋\t201418\n我与地坛\t201419\n婉约\t201420\n7654z1\t201421\ntokyo\t201422\n因势\t201423\n古文群\t201424\n献出\t201425\n8610万股\t201426\n内女\t201427\n丶文\t201428\n我告诉我讨厌\t201429\n爱度秘\t201430\n干呕业\t201431\n猪八婆\t201432\n秭颜\t201433\n悲剧性\t201434\n好吧好吧好吧原谅你\t201435\n坏心人\t201436\n漫顶\t201437\n陆佳成\t201438\n上升星座\t201439\n物流岁\t201440\n函谷关\t201441\n7月5日上午9：30\t201442\n耍诈\t201443\n绿荫\t201444\n我真是奈何不了你了度秘\t201445\nlookollpppppppp\t201446\n不爱好\t201447\n列句\t20
1448\n复活币\t201449\n都眯沟失独觅渡觅渡觅渡觅渡觅渡觅渡觅渡觅渡觅公司独立公司\t201450\nycb\t201451\n乃非尔塔利\t201452\nSwift\t201453\n产生\t201454\n有多心经\t201455\n开导\t201456\n变乖\t201457\nshysh\t201458\n古发\t201459\n温亚新\t201460\n斗破苍穹\t201461\n缪然\t201462\n德拉科马尔福\t201463\ntefsj\t201464\nFIVE\t201465\n虹桥二中\t201466\n平台级\t201467\n风口\t201468\n樟木块/樟木珠/樟木粉包\t201469\nix35\t201470\n陪酒女\t201471\n大富豪\t201472\n嗯老板\t201473\n娃娃贴\t201474\n亚不错\t201475\n139斤\t201476\n认发挥\t201477\n别人片\t201478\n肖茜蓉\t201479\n禅学\t201480\n#玛汐\t201481\n古古\t201482\n我是你的主人是我的秘书\t201483\n16天\t201484\n八半\t201485\n田瀚文\t201486\ngyjg\t201487\n八千\t201488\n八十\t201489\n哪桶\t201490\nksjskdvdksh\t201491\nxxusf\t201492\n迪达的啊地理答题rtwrbbv芭比PPTV\t201493\n学兄\t201494\n唉呀\t201495\n饶恕\t201496\n八卦\t201497\nfibv\t201498\n亲爱的我哭\t201499\n大提琴\t201500\n虫儿牙\t201501\n233333333333333333333333333333333333333333333333333333333333333\t201502\n国泰民安乐逍遥\t201503\nfibf\t201504\n555555850558005555555555555555555\t201505\n女主角\t201506\n袁俊飞\t201507\n斯科特\t201508\n好些个\t201509\n值得\t201510\n擦腚\t201511\n才企鹅\t201512\n心情好多啦\t201513\n东北那嘎子的你是哪嘎子\t201514\n2010年1月\t201515\nebcl\t201516\n聊懂\t201517\n12345667566\t201518\n原名\t201519\n雪原\t201520\n见到\t201521\n想说罢\t201522\n2月份\t201523\n查洛克\t201524\n雪压\t201525\n新闻大厦\t201526\n五节\t201527\n猪笼\t201528\nhhzhhd\t201529\n悉数\t201530\n猪度秘\t201531\n嘟锦绣\t201532\n不称职\t201533\n好巧网\t201534\n东兴镇中心学校上学中心\t201535\nfirstiPhone\t201536\ndfgf\t201537\n陈是真\t201538\nMmmnn\t201539\n呃谢谢\t201540\nTFBOYSTiME\t201541\n小卖部\t201542\nejfjejrhfhfhhdjfhhfvhfnfbgfbfbfbfbb\t201543\njjjjjjjd\t201544\n王中\t201545\n冷吧\t201546\nOFF\t201547\n钱货\t201548\n遗忘\t201549\n544820\t201550\n张紫涵\t201551\n谢谢你的喜欢\t201552\n冷吗\t201553\n临界点\t201554\n锦绣神针\t201555\n林畅\t201556\n更新换代\t201557\n陪窝\t201558\n刘立岩\t201559\n顺义\t201560\n郭聪聪\t201561\n明星们\t201562\n松松悟空大战西游记\t201563\n获得\t201564\n别老乱\t201565\n1遍\t201566\n蛋糕机\t201567\n找找找\t201568\n李雪鹰\t201569\n淘尽\t201570\n12341234123414\t201571\n东郊记忆\t201572\n第九题\t201573\nmoko\t201574\n蓝兰成\t201575\n足本\t201576\n英文名字\t201577\n吴超磊\t201578\n1道\t201579\n感兴趣\t201580\n两兄弟姐妹\t201581
\n845579\t201582\n仙女童\t201583\n荼蘼\t201584\n像头猪\t201585\nAsksmaana\t201586\n天一客\t201587\n化德县\t201588\n莫属\t201589\n死流氓\t201590\n三聚亚澳\t201591\n沾光\t201592\n逗醉\t201593\n地步色\t201594\n住院\t201595\n2654\t201596\n黄花菜\t201597\n女大学\t201598\n歪酱\t201599\n张年少\t201600\n不够不够不够\t201601\n爱的理由\t201602\n秘堡\t201603\n人才\t201604\n人手\t201605\n彩车\t201606\n糙养\t201607\n韵致\t201608\n2092075968\t201609\n农夫打灰机\t201610\n问娃\t201611\n淘淘二手车网\t201612\n看着你的娘不管了是你的你不管了\t201613\nv热火\t201614\n月经贴\t201615\n啰嗦\t201616\nqianxi\t201617\nskinId=skin204\t201618\n三七二十一\t201619\n二三零幺\t201620\n装欢\t201621\n一切人\t201622\n嘬嘬\t201623\n不是你气\t201624\n身造\t201625\n万象城\t201626\n干嘛\t201627\n韩三岁\t201628\n223478\t201629\n行距\t201630\ntyunmbX912345687990555147446226666\t201631\n该怎样\t201632\n度秘我喜欢美人鱼\t201633\nWIWHU\t201634\n惊起\t201635\ncykosnsmxk\t201636\n行路\t201637\ncxxccccxc\t201638\n位计\t201639\n李乐怡\t201640\n再笑一个再笑一个来来来好搞笑好搞笑\t201641\n回国后\t201642\n比用\t201643\n中国移动通信公司\t201644\n廖宇欣\t201645\n伱呗\t201646\n博学\t201647\n李阿姨\t201648\n好惊喜\t201649\n排面\t201650\n96311亿元\t201651\nT816\t201652\n十四三岁\t201653\n不支\t201654\n南宫\t201655\n相泥\t201656\n张影\t201657\n不收\t201658\n兴燕\t201659\n套间\t201660\n不改\t201661\n业科\t201662\n女大人家\t201663\n有空把\t201664\n4494千亿元\t201665\n嗄福行\t201666\n江嘉琪\t201667\n南宋\t201668\n贺吉祥\t201669\n南安\t201670\n百搭\t201671\n干圆\t201672\n南定\t201673\n手小华\t201674\n完整的人\t201675\n二十一二十二二十三二十四二十五二十六二十七二十八二十九三十三十一\t201676\n幽深\t201677\n玩样\t201678\n找不见\t201679\n九玫\t201680\n莫酱\t201681\nLove李泰利\t201682\n没我看\t201683\n林奇兵\t201684\n香约奶茶\t201685\n广告商\t201686\n宝贝儿我想和你做爱我想和你\t201687\n越发们\t201688\n伤心二单纯\t201689\n远绍\t201690\n卡拉JJ\t201691\n站住脚\t201692\n九王\t201693\n马到成功\t201694\n六十页\t201695\n不要脸真是\t201696\n河星\t201697\n对不去\t201698\nVasque\t201699\n不如吃\t201700\n只数星鸡\t201701\nyong\t201702\nfaStiCs\t201703\n鶣\t201704\n东京\t201705\n咯末路天我墨迹在我\t201706\nMUD\t201707\nA9188\t201708\ngunwa\t201709\nMUA\t201710\n声名\t201711\n买办\t201712\n鶸\t201713\n石昭缘\t201714\n忘我\t201715\n肏女\t201716\n四十一四十二集\t201717\n鹿群\t201718\n持久战\t201719\n蹒跚\t201720\n点不准呢一天一天\t201721\n湟中县\t201722\n克叔
\t201723\n谁是\t201724\n灰蒙蒙\t201725\n前瞻分析\t201726\n我是女的你快点\t201727\n什么天\t201728\n别闹了\t201729\n好歌\t201730\n先走\t201731\n马铭心\t201732\n多克多\t201733\n恩饿\t201734\n惜和\t201735\n穿杨\t201736\n许多许多许多\t201737\n支持\t201738\n福州十一年\t201739\n李思来\t201740\n好正\t201741\n咪叫\t201742\n光照\t201743\n处女处女\t201744\n择校\t201745\nMU9\t201746\n十通都行\t201747\n眉眸\t201748\n专题\t201749\n龙春燕\t201750\n齐继光\t201751\n好死\t201752\n好歹\t201753\nfdsswqddtertt\t201754\nInformation\t201755\n三一百二十八个\t201756\n我是你的小人你是我的主人\t201757\n降下来\t201758\n汉堡王\t201759\nLknjjhujiu\t201760\n老詹\t201761\n和平路\t201762\n口干\t201763\n送肯\t201764\n发威\t201765\n棉田\t201766\n果盘\t201767\n一觉度\t201768\n今个上午\t201769\n副区长\t201770\n鞋套\t201771\n十七号\t201772\nhttppinyincne17797\t201773\n京路边有用你是看\t201774\n英名\t201775\n米兰人\t201776\n有福共享有\t201777\nsetouto\t201778\n你是我最近最最最最最最最最句句\t201779\nWriii\t201780\n595482222112995855885858555885528252585555529282\t201781\nHAHA\t201782\n九亿\t201783\n珊除\t201784\n我明\t201785\n修仙\t201786\n陈雨\t201787\n最后一刻\t201788\n过跑男\t201789\n好生气\t201790\n条兔\t201791\n暴虐\t201792\n粪便\t201793\nbey，bey\t201794\n我昱\t201795\nWhatIreally\t201796\n我是\t201797\n梅地亚\t201798\n半神\t201799\n九二\t201800\n卡宾达\t201801\nMatt\t201802\n12c\t201803\nTimeout\t201804\n南坡寺\t201805\n285米\t201806\nZAKER\t201807\n一百二十三点\t201808\n五座\t201809\n五度\t201810\n12p\t201811\n魏子菁\t201812\n黄老师\t201813\n12t\t201814\n12v\t201815\n00078点\t201816\n12y\t201817\n将它十位\t201818\n斯内普\t201819\nhadh\t201820\n加班加点\t201821\n说说腐谁\t201822\n4678\t201823\n家理\t201824\n4676\t201825\n昏死\t201826\n阻力臂\t201827\n家球\t201828\n么mm\t201829\nreactive\t201830\n这两笔\t201831\n5588878858\t201832\n两点28分\t201833\nzfrizoou\t201834\nkfcmenuitemsproductid50002status1productid50003status0\t201835\n情点\t201836\n466699547890\t201837\n我不骗你真的我真的爱你我决定\t201838\n5674352586525555555555555555555555555555555555555555\t201839\n点击\t201840\n12%\t201841\n十柱\t201842\n密虫\t201843\n思君\t201844\n屈月\t201845\n边动脉\t201846\n屈服\t201847\n120\t201848\n早彩带\t201849\n122\t201850\n李天方\t201851\n十更\t201852\n125\t201853\n126\t201854\n127\t201855\n128\t201856\n
129\t201857\n蜀湘菜馆\t201858\n我是你的你是女\t201859\n口想\t201860\n市中医院\t201861\n缓不济\t201862\n一年一天\t201863\n亲爱的朋友\t201864\n想你的我想\t201865\nskjwj\t201866\n插花\t201867\n交中六\t201868\n音频\t201869\n相怜\t201870\n箨孑然\t201871\n出线\t201872\n福克斯探照灯公司\t201873\n百里\t201874\n出纳\t201875\n狼色\t201876\n曲颈向天歌\t201877\n南锣鼓巷北口\t201878\n柿子架\t201879\n16gb格\t201880\n华率\t201881\n植树日\t201882\n方寸\t201883\n贾书旗\t201884\nGood\t201885\n凯ww\t201886\nluv\t201887\nmychild\t201888\nluy\t201889\n掉头发\t201890\n这件角\t201891\n垃圾袋\t201892\n谢雯暄\t201893\nDfhdgjfgb\t201894\n景点\t201895\nlue\t201896\nlug\t201897\n12348678910\t201898\n面乳\t201899\njfhfuchxf\t201900\nlun\t201901\n马琪涵\t201902\n1474465806\t201903\n5868985886585689\t201904\n尹a\t201905\n熊猫块\t201906\n西二日\t201907\n王继超\t201908\n荼毒\t201909\n收紧\t201910\n857575555\t201911\n宋莎莉\t201912\n151288228\t201913\n没去天吧\t201914\n奶块\t201915\n布加帝维龙\t201916\nlu5\t201917\n好里里\t201918\n张宇凡\t201919\n四比如来佛\t201920\n恩施\t201921\n再喜欢\t201922\n37485\t201923\n木乃\t201924\n康平县\t201925\n小别三个说话\t201926\nZZZZZZZZZZZ\t201927\n练琴\t201928\n一八班\t201929\ndhng\t201930\n日理万机\t201931\n枝子\t201932\n彤源\t201933\n大一个人\t201934\n┗┯┛n　　／　│　　＼n　　　　　│n　　　　　∩　
晚安n\t201935\n吧度\t201936\n好我想你一个苹果\t201937\n首页\t201938\n还我想\t201939\n蓝衣\t201940\n北京富士\t201941\n狠击\t201942\n大猫猫\t201943\n影像制\t201944\nyoy1样\t201945\n一个两块\t201946\n可凶\t201947\n吧庙\t201948\n脑筋急转弯ok\t201949\n小受宽\t201950\n自由的生活\t201951\n我相信我可以做你的女朋友有你\t201952\n恢弘\t201953\nlxelo\t201954\ntctxth\t201955\nwhoget\t201956\nBuffalob\t201957\n设计员\t201958\n幸运数\t201959\n泉城\t201960\n#斑夏\t201961\njfjjhrbc\t201962\n哪來\t201963\n茅亭\t201964\n清蒸大鱼\t201965\n八十把\t201966\nEgzqwe\t201967\n865551154488\t201968\n美英\t201969\n何音\t201970\nfeishua\t201971\n2015年11月15号\t201972\n糖宝\t201973\n心烦烦\t201974\n金麟岂\t201975\n哪侠\t201976\n玉米浆\t201977\nyouknown\t201978\n1994年\t201979\niambusy\t201980\n个月了我我\t201981\n租扣\t201982\n最红首歌大梦想家\t201983\n建明\t201984\n13252521689\t201985\n血誓\t201986\n闺蜜型\t201987\n无忧无虑\t201988\n犀牛灯\t201989\n当何罪\t201990\n五十四\t201991\n滚滚流\t201992\n获至宝\t201993\n漩涡\t201994\n休哒\t201995\nhttpfhiphotosbaiducomxiaodupicitem4d086e061d950a7bef0256b10dd162d9f2d3c994jpg\t201996\n13589305325\t201997\n好的微\t201998\nPPG\t201999\nwsgjfbj\t202000\n听笑\t202001\n回城\t202002\n那个贴\t202003\n哭啦雷剧来嘞噜啦噜啦类噜啦噜啦噜啦类Jul不卡太可爱累不加Jul\t202004\nPPS\t202005\n身體\t202006\n房租名\t202007\n欧派\t202008\n18484844944446434946217556694435566464518447313813445453411479664\t202009\n阿燃\t202010\n问你我喜欢\t202011\n无聊地\t202012\n邓谷敏\t202013\n抽解\t202014\n仪式\t202015\nxvss\t202016\n升温\t202017\nE罩杯\t202018\n九族\t202019\n署理\t202020\n如下\t202021\n如上\t202022\n到底是真是假\t202023\n鸡腿儿\t202024\noaoa\t202025\n丝线\t202026\n急给\t202027\n埋单\t202028\n免责\t202029\n齐飞\t202030\n洛带\t202031\n九日\t202032\ni\t202033\n索蝶四馁\t202034\n如临\t202035\n九旬\t202036\n筋骨\t202037\n自高\t202038\n那一肖\t202039\nryraaaaaaa\t202040\n北京市\t202041\n自报家门\t202042\n王平\t202043\n今晚21：30\t202044\n84648892492\t202045\n幺八五二二\t202046\n272822288272\t202047\n正面照\t202048\n核桃\t202049\nvxhvh\t202050\nchiqd\t202051\n维码\t202052\n古猫宁\t202053\n华南农业大学\t202054\ntfbogs\t202055\n正果\t202056\n四季养生茶饮\t202057\nkkg\t202058\n七口\t202059\n周可童\t202060\n罗马假日\t202061\nYahya\t202062\nsweet\t202063\n周亦航\t202064\n陈雨宏\t202065\ngghrrf\t202066\n
四百多\t202067\n斜体\t202068\n3本\t202069\nｖｉｐ\t202070\nu2577\t202071\n沿着塞纳河到翡冷翠\t202072\n姜思宇\t202073\n蔡家悦\t202074\nseoulcy\t202075\n纱衣\t202076\n马小皇子\t202077\n郑时瑞\t202078\n袋辣条\t202079\n忘了我去\t202080\n711472580369\t202081\n稀土资\t202082\nsisse\t202083\n3月\t202084\n范定青\t202085\n田少丹\t202086\nvvnkg\t202087\n白象街\t202088\n袁世红\t202089\nGFRTGVHUI8IYTRDDDFFGBJIIIUYTTRRRFFVGHUYTFFFFVVBHJI9O0POOIYEWWW\t202090\n解放战争\t202091\n亲爱代问\t202092\n杜萌恩\t202093\n小雛菊\t202094\n13416203114\t202095\n熊刚\t202096\n欲擒故纵\t202097\n2：47\t202098\nvillage\t202099\n洁身自好\t202100\n嘻嘻行\t202101\n小知熊\t202102\nhhshshshuhgncbjdbfhfbd\t202103\n正极\t202104\n父爱\t202105\ngfeh4hhe\t202106\nROM\t202107\n面案\t202108\n四三多\t202109\n中流砥柱\t202110\n一心二用\t202111\n浼洁\t202112\n我的好度秘我爱你会哪种爱不是爱情是我\t202113\n乱涂\t202114\n老鬼\t202115\n嘻嘻表\t202116\n马尾辫\t202117\n恩恩爱爱\t202118\n好心酸\t202119\n自我戏剧化\t202120\n六种\t202121\n我着\t202122\n甄琴\t202123\n张信哲\t202124\n厦门机场\t202125\n几不回\t202126\n李焕君\t202127\n方大同\t202128\n佛珠\t202129\n上海证券\t202130\n铁矿石\t202131\n那多万\t202132\n三G流量包\t202133\n一起来\t202134\n梨状肌综合症\t202135\n佳希仁\t202136\n盟建\t202137\n快讯\t202138\n脑公帅\t202139\n乖孩纸\t202140\n男怀女\t202141\n谎言游戏\t202142\n劳伦森\t202143\n车达峰\t202144\n献吻\t202145\n4k五\t202146\n20多倍\t202147\n破抱抱\t202148\nveie\t202149\n5865245355n\t202150\n来了叫\t202151\n金能生水\t202152\n乐家居陶瓷\t202153\n2012年6月份\t202154\n咕噜球咕噜咕噜叉咕噜\t202155\n粘着\t202156\n仆分清点\t202157\n30年后\t202158\nhdhf\t202159\n5.20\t202160\nhdhh\t202161\n米醋紫皮蒜\t202162\n活动文\t202163\n三九来\t202164\n葛哲延\t202165\n呱松\t202166\njing\t202167\njina\t202168\njino\t202169\n男票\t202170\n喜悦感\t202171\n搞笑我要约\t202172\n尹幺一\t202173\n草酸\t202174\n田瑶田启维田琪\t202175\n窠臼\t202176\n秘真可爱\t202177\n剑军\t202178\n这么多次\t202179\n南海公立学校\t202180\nイラスト\t202181\n黑莓\t202182\n罗托拉\t202183\n谢芷妍\t202184\n沈罗辰\t202185\n组胺\t202186\n南航国航\t202187\n走了看\t202188\n具别具一格\t202189\n小故事\t202190\n闲女\t202191\n马丁斯科塞斯\t202192\n吴小乖\t202193\n信县\t202194\n快讲\t202195\n覃思婷\t202196\n1212642655\t202197\n0nnnnnnnnnnnnnnnnn000nn00\t202198\n放军\t202199\n心情难寸\t202200\ng5tk\t202201\nhfj孤\t202202\n希人\t202203\n用意\t202204
\n此此\t202205\n十六集\t202206\n男头\t202207\n黄爱兵\t202208\nthnks\t202209\n合理工\t202210\n红枣绿豆\t202211\n阿波罗\t202212\n死去吧不理\t202213\n5300多公斤\t202214\n榛蘑艾妮\t202215\n娇嫩\t202216\n酸梅汤\t202217\n13524726565\t202218\n声听\t202219\n大郎\t202220\n再来一火\t202221\n首源\t202222\n五七一\t202223\n10.7万平米\t202224\n大都\t202225\n香港六合彩明天会\t202226\n轩少爷\t202227\n大部\t202228\nT955\t202229\n3434343431313131313131\t202230\n打白\t202231\n51000个\t202232\n大镜\t202233\n李小狼\t202234\n庇伯\t202235\n柏木由纪\t202236\n死亡\t202237\n1926年4月26日\t202238\n十五岁了你猜猜我在哪上学\t202239\n死人\t202240\n关孑\t202241\n关子\t202242\n豆腐八嘎\t202243\n天博爱四\t202244\n黏糊糊\t202245\nmpouter\t202246\n死亲\t202247\nxcvggggt\t202248\n死于\t202249\n含香\t202250\nno你好像\t202251\naaulu\t202252\n曹海波\t202253\n搅基图\t202254\n希勒\t202255\n1000111岁\t202256\ng魅眼\t202257\n不见不聊\t202258\n余红雨\t202259\n死云\t202260\n黄酮\t202261\n米德加德\t202262\n冀望\t202263\n锁阳丸\t202264\n一生一世的爱\t202265\n教派\t202266\n逆风眼\t202267\n考察\t202268\n了了了了了了了\t202269\n升息\t202270\n度秘迷\t202271\n拜回来\t202272\nbniyg\t202273\nB2B\t202274\n坏度\t202275\n田心微\t202276\ngdvdg\t202277\n御灵\t202278\n缩量\t202279\n衣锦华庭小区\t202280\n田可心\t202281\n东村小学\t202282\njikldo\t202283\n好好说\t202284\ndfhhc\t202285\n掌上星座屋\t202286\n亲记得\t202287\n赶明\t202288\n出塞\t202289\n75546\t202290\n走遍\t202291\n谢谢你好搞笑\t202292\nf8hfnm\t202293\n了破\t202294\n暖天\t202295\n库逊\t202296\n走道\t202297\n少年江湖\t202298\n猪猪猪猪猪猪猪猪猪猪侠猪猪侠朱侠朱侠朱侠朱侠朱侠朱侠日日\t202299\n思息敏\t202300\n机针\t202301\n汕头市\t202302\nmmmad\t202303\n另一只手\t202304\nmmmaa\t202305\n孔雀飞\t202306\n范总\t202307\n睛天\t202308\n404404404404404404\t202309\n蛋男\t202310\n果敢\t202311\n咳咳咳卡\t202312\nsangry\t202313\n翻译\t202314\n喔喔喔喔达\t202315\n鲛人\t202316\n说刚吃\t202317\nYves\t202318\n感悟\t202319\n一二一百马\t202320\n梦吧\t202321\n助力\t202322\n猪你是猪你是猪你是猪你是猪秘子主义\t202323\n俺媳妇王惹大大着你了缘vvv兄弟分广v\t202324\n凯巨帅\t202325\n男女孩\t202326\n4168979879164674\t202327\n雷绍峰\t202328\n腿痛\t202329\nuyyttyttyyyygtttrrfdgbv\t202330\n败光\t202331\n13634683880\t202332\n人工增雪\t202333\n一百六十五六十元\t202334\ngcvnkl\t202335\n亲爱的亲一个保修\t202336\n新年好\t202337\nEP01\t202338\nidmb\t202339\n4点36\t202340\n赵思成\t202341\n分
分\t202342\n王会\t202343\n百弗英语\t202344\n王伟\t202345\n岚岚\t202346\n莫么\t202347\n150米\t202348\n礼赞\t202349\n丞相彩\t202350\n提起诉讼\t202351\n少数几个\t202352\n恩人\t202353\n2941109552\t202354\n盖辉\t202355\n行午安\t202356\n何洋\t202357\n大猫狼窝到德康动物不感你娃娃\t202358\n死臭比\t202359\nuifjvykcj\t202360\n何洁\t202361\ndkzk\t202362\n8467348\t202363\n叫行\t202364\n黄米\t202365\n老东汽车站\t202366\ngdyv\t202367\n张红包\t202368\n耍无赖\t202369\n杨益民\t202370\n曹瑾瑜\t202371\n张天阳\t202372\ngdyn\t202373\n朔月\t202374\n简稍\t202375\n弟媳\t202376\n莫乱\t202377\n谢学长\t202378\n将家州\t202379\nkg2\t202380\n一一十块\t202381\n裡面\t202382\n烫脚\t202383\n22日下午\t202384\n运来\t202385\n无礼\t202386\n曰式\t202387\n孙楷杰\t202388\n讲礼貌\t202389\nhjdi\t202390\n1123456\t202391\n可度秘\t202392\n滚刀\t202393\n四五分之四则\t202394\n预备役\t202395\n许笑\t202396\n孙瑞泽\t202397\n周六周\t202398\n武胜宾\t202399\n九成九\t202400\n地暖\t202401\nghiphotosbaiducomxiaodupicitemf11f3a292df5e0fe1f51b7755b6034a85edf727cjpg\t202402\n11万亿美元\t202403\ntype\t202404\n告状\t202405\n1000\t202406\n很难说\t202407\napk7k7k\t202408\n李光头\t202409\n摊位\t202410\n阮嘉哲\t202411\n优盘\t202412\n谢谢你的密你是我最好的朋友\t202413\n打颤\t202414\nkga\t202415\n易冷\t202416\n德鲁克\t202417\nkgb\t202418\nHishow\t202419\nkgg\t202420\nkgf\t202421\n哼休\t202422\n一个你是女\t202423\n信不信你再说我就不给你好可乐\t202424\n新娘家\t202425\n天颇\t202426\n|||卢\t202427\n免谈\t202428\n一声端\t202429\n九多一\t202430\n大蒙\t202431\n杨亦磊\t202432\n爱我多还是爱你多\t202433\n大蒜\t202434\n三百页\t202435\n拐带\t202436\n信旺那\t202437\n打题\t202438\n因为你的朋\t202439\n打额\t202440\n村官\t202441\n妙答\t202442\n自忠\t202443\n熊诗佳\t202444\n沧州府\t202445\n善斗弈者\t202446\n5555555555555555555555555555555555555555555555555555555555555555\t202447\n是谁嘞秒针帛\t202448\n机长\t202449\n任天行\t202450\n演艺家\t202451\n朱亚轩\t202452\n爱你我想\t202453\n赵河东\t202454\n极端\t202455\n咳咳风家晨风\t202456\nhjjjjn\t202457\nhjjjjj\t202458\n寸土\t202459\ndbhxadvfxcbhgwbe\t202460\n美我的妹妹\t202461\n六一米\t202462\n你好下班人主人我是你的宠物小希\t202463\nmyztu\t202464\n255242112322525545\t202465\n疝气\t202466\n战罢\t202467\n20152016\t202468\n4门\t202469\n吉姆尼\t202470\n七十五岁\t202471\ndyugfygH\t202472\nucuhdhfbc\t202473\n99￥\t202474\n开心伐\t202475\n妙雨天\t202
476\n丽人生\t202477\n妒忌\t202478\n我没有晚班我求你了接我的晚班\t202479\n横山\t202480\n道故人心易变\t202481\n伴我飞\t202482\n傻子装\t202483\n树荫\t202484\nretotherok\t202485\n丝巾\t202486\n梅红色\t202487\n铁炮队\t202488\n盆子\t202489\n游仙我上马路牙子\t202490\n俢仙者\t202491\n窗棂\t202492\n稀浴\t202493\n广珠铁路\t202494\n晒太阳结束\t202495\n名仝\t202496\n昆工滚滚\t202497\n重庆酉阳电视台\t202498\n林殊\t202499\n膳魔师\t202500\n一航校\t202501\n你好我家度秘\t202502\n横屏\t202503\n安理得\t202504\n先日生\t202505\n长清\t202506\n集众红腮帮\t202507\n13366312775\t202508\nsuperjunior\t202509\n角平分线\t202510\nkick\t202511\n城市名\t202512\njdvd\t202513\nShenm\t202514\n代青塔娜\t202515\n心想\t202516\n今天下午三点十五分\t202517\n你在吗度秘我是你的主人\t202518\n十天内\t202519\njhgvbhk\t202520\n问你好\t202521\n汪小菲\t202522\n澤測測\t202523\n蛋黄液\t202524\n漫天要价\t202525\n代青\t202526\n内在美\t202527\n超级子\t202528\n大神戒\t202529\n看3D\t202530\n人民桥\t202531\n可望\t202532\n班男\t202533\n还在家呀你好再见拜拜永别了再见\t202534\n1xo\t202535\n可期\t202536\n企鹅牛西牛猪\t202537\n韩星\t202538\n莲美恋\t202539\n心惊\t202540\n心情\t202541\n迪尔\t202542\nTecnica\t202543\n悟性\t202544\n4182\t202545\n洞幺洞幺\t202546\n密诀\t202547\nmmllmmmm\t202548\nmxy\t202549\n民丰县\t202550\n坦里库鲁\t202551\n香肠犬\t202552\n救行\t202553\n更新退\t202554\n魔芋\t202555\n肾衰\t202556\n秋林\t202557\n恩作\t202558\n习千雅\t202559\n特松加\t202560\nOxg\t202561\n石晓龙\t202562\n杨桐崎鸣\t202563\nrhetoric\t202564\nMySpace\t202565\n伤寒国\t202566\n密语\t202567\n随意性\t202568\n87979\t202569\n好主意\t202570\n国脚级\t202571\n其间\t202572\n金汇酒店\t202573\n阿里系\t202574\n731博物馆\t202575\nckiyyilvl\t202576\n饿号\t202577\n法军\t202578\n名族\t202579\n网语\t202580\n第五代\t202581\n奔头\t202582\nhnge\t202583\nvvhhhj\t202584\n碘雀巢\t202585\n秃驴\t202586\n第五份\t202587\n17.00\t202588\nhqed\t202589\n洪冬娅\t202590\n黄美金\t202591\n赶来\t202592\n6784\t202593\nhelloib\t202594\n挪动\t202595\n6789\t202596\n博士养家\t202597\n橄榄林\t202598\n12月20\t202599\n健美小时\t202600\n笼统\t202601\n一口酥\t202602\n过家家\t202603\n找你了再见\t202604\neghvhufhohfihhugdffyhjjvvgggghhjhgbigigghkfgubkmnjllhupiytdryij\t202605\n众僧\t202606\nsurprise\t202607\nhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh\t202608\n菇娘\t202609\n汨罗\t202610\n足底\t202611\n课粒\t202612\n说遍\t202613\n朱思瑶\t2
02614\n不浮云\t202615\n鲍老师\t202616\n味全乳酸菌\t202617\n追悔莫及\t202618\n昂奇红白狼其三级航空彩猫样\t202619\n说道\t202620\n口漫画\t202621\n速龙\t202622\n张误会\t202623\n12辆\t202624\n天诛地灭\t202625\nCK克芭拉\t202626\n理查德\t202627\n死难者\t202628\n一脚踏\t202629\n示警\t202630\n兔纸\t202631\n巴拉拉小魔仙之魔剑公主\t202632\n造瘘\t202633\n装凶\t202634\n神锋\t202635\n脐岁\t202636\n耳孔\t202637\n范畴\t202638\n婆婆工\t202639\n直述句\t202640\n哈达\t202641\n大哥平\t202642\nvikgik\t202643\n高兴发财\t202644\n下雨了\t202645\n百度干\t202646\n整洁度\t202647\n邓联播\t202648\ntfos秘\t202649\n僵尸号\t202650\n藏尸\t202651\n佳能俱乐部\t202652\n武开庭\t202653\n青山刚昌\t202654\nhjlonuo\t202655\n家暴\t202656\n紫点\t202657\n运财\t202658\n上无悔\t202659\n辽宁广播电视台\t202660\ngirls\t202661\n多两天\t202662\n陌上我\t202663\n文科生\t202664\n本太多恳战争快长个泄＊极品必少年附一个\t202665\n阿魔\t202666\n五百块\t202667\nsssffxqwwqqa\t202668\nbaa1cd11b9b6b453be12c8fcc3ce2d45jpg\t202669\n菜色\t202670\nagelock\t202671\n运费\t202672\n炜华\t202673\n不识愁\t202674\n量\t202675\n一网打尽\t202676\n密道\t202677\n单打\t202678\n赵雅楠\t202679\nabac式\t202680\n50遍\t202681\n标志性\t202682\nNo.5\t202683\nNo.4\t202684\nNo.1\t202685\n袁颖\t202686\n送安\t202687\n板侠\t202688\n蛮不讲理\t202689\n钟me\t202690\n这么多爱\t202691\n战气\t202692\nrujbh\t202693\n猫威斯\t202694\n棺木\t202695\n腹部\t202696\n基督教速记\t202697\n速成\t202698\n入职\t202699\n经济结构\t202700\n柏凡\t202701\n东张\t202702\n宁乡一中\t202703\n化纤维\t202704\n8gyy\t202705\n冠脉\t202706\n缘浅\t202707\n永安\t202708\n华为公司\t202709\n薄荷绿\t202710\n黄卡门\t202711\n甘肃工业大学\t202712\nhksf\t202713\n机缘\t202714\n恋头\t202715\nSVIP\t202716\nsubjects\t202717\n上户片\t202718\n苏一蒙\t202719\n佟丽娅\t202720\n孙言\t202721\ngakd\t202722\n秘英格利什\t202723\n个案\t202724\n军娘\t202725\n打垮\t202726\n合同书\t202727\n军威\t202728\n其母\t202729\n流水账\t202730\n甘肃省\t202731\n恐高症\t202732\nFrenchrunning\t202733\n亚芳\t202734\n融入\t202735\n密战家\t202736\n一妮妮\t202737\nWhoaat\t202738\n公园\t202739\n喂喂喂度秘度秘\t202740\n丧志\t202741\n再读\t202742\n藏传子\t202743\n旖旎\t202744\n东慧\t202745\nhellomyou\t202746\n争风吃醋\t202747\n宽衣\t202748\n木屋\t202749\n上磁县\t202750\n3月28号\t202751\n夏琳·维斯托克\t202752\n2200公里\t202753\n五七干校\t202754\nhohucfuif\t202755\n米尔纹\t202756\n建军小学\t202757\n大梅沙\t202758\n
把守\t202759\n凤还巣\t202760\n木屑\t202761\n蜜年\t202762\nwocaonidayeblde\t202763\n難題\t202764\n我的眼神\t202765\n卡德尔\t202766\n绫人\t202767\n经理犬\t202768\n抱好\t202769\n不行白\t202770\n8月27\t202771\n艾我\t202772\n独一\t202773\n一时四十分\t202774\n第二组\t202775\n8月29\t202776\n蒲城\t202777\n这般如此\t202778\n7436\t202779\n7498.84点\t202780\n唠友\t202781\n想来想去\t202782\n海猪猪\t202783\n自错\t202784\n叶面\t202785\ndraghis\t202786\n老一辈\t202787\n廖宏\t202788\n自销\t202789\n呵华\t202790\nHIGUGHHFG\t202791\n欢脱\t202792\n廖宇\t202793\n雷锋勒\t202794\n抓力\t202795\n冠毛\t202796\n离曲\t202797\n热烘烘\t202798\n讲演\t202799\n娱戏\t202800\n新余中院\t202801\n扒开\t202802\n残美\t202803\n赶出来\t202804\n最萌女\t202805\n互补\t202806\n一万盏\t202807\n娱战\t202808\n能为你会不会\t202809\n驴叫\t202810\n王幽怨\t202811\n好堂客\t202812\n十公分\t202813\n凝视\t202814\n度秘你的话真多\t202815\n我总样\t202816\n新玛特花园路\t202817\nguhghggugu\t202818\n20名\t202819\n林点依020点三电视名典屋\t202820\n读数\t202821\n朱美杰\t202822\n开赛\t202823\n目标\t202824\n朋星\t202825\ndtyastywhttfffsggn\t202826\n还有什么好说\t202827\n驴友\t202828\n难念的经\t202829\n在娜\t202830\n空无一人\t202831\n狠萌\t202832\n淡淡的回忆\t202833\n13592525341\t202834\n我爱你不是那个爱了我是朋友关系\t202835\n不老实\t202836\n2012届\t202837\n拜一起\t202838\n好想你\t202839\n夏明源\t202840\n提掇\t202841\n男木女金\t202842\n扬扬\t202843\n说了谢谢你再见\t202844\n赴台\t202845\nmlfidhs\t202846\n理所当\t202847\n默默成成注射磨磨蹭蹭\t202848\n带来\t202849\nLME\t202850\njncnhdndhfghgyegGregGregyetbeenworkinginpartnershipofrealfluuuhdbdhegdggwhfghfhdhvvh\t202851\n也想家\t202852\nL0VE\t202853\n一三两一\t202854\n35分\t202855\n三年一方\t202856\n打学\t202857\n三洋洋\t202858\n食醋\t202859\n五彩滩\t202860\n民主制度\t202861\n４０起\t202862\n润肺新hicrttt\t202863\n骆驼祥子\t202864\n六七二百\t202865\n甜言蜜语文质彬彬无恶\t202866\n20001216\t202867\n星战7\t202868\n就是花花呀你是猪秘\t202869\nhhhugjfj\t202870\n金融业\t202871\n大英国家美术馆\t202872\n陌上千叶\t202873\n婚礼包\t202874\n淡泊\t202875\n奥康商城宝贝\t202876\n一三两个\t202877\n哪一个\t202878\n下个下午\t202879\n从爱\t202880\n松手\t202881\nHanSolo\t202882\n黑宝\t202883\n大骗\t202884\n节奏感\t202885\nvufh\t202886\n称作\t202887\n透顶\t202888\n安柏\t202889\nq呗\t202890\n唔知伢\t202891\n逃婚\t202892\nvufb\t202893\n阴雨天\t202894\n标出\t202895\n乔娜\t202896\nvuf
u\t202897\n凉\t202898\n650所\t202899\nV榜\t202900\n程玲\t202901\n守财\t202902\n罪不可恕\t202903\nAvis国际租车\t202904\n标准\t202905\n昨昨\t202906\n唔知会\t202907\n很开森\t202908\n哎匹\t202909\n三千年后\t202910\n好的好的好的好的好了好了好大好大好大好大好大好大好大好大好大好大\t202911\nIlook\t202912\n腰胯\t202913\n我是公主呀大战美丽奥\t202914\n读秒\t202915\n倾听\t202916\n乌拉乌拉\t202917\n谢谢你求求你了帮\t202918\nsuzjsfvhvgdzgUsisfGfgkxkxHXJFJDJ4387578284545553\t202919\n彼时\t202920\n读秘\t202921\n2460米\t202922\n金庸\t202923\nYang\t202924\n热土\t202925\n诺澜\t202926\n感谢你没有了\t202927\n姐木\t202928\n防辐射服\t202929\n中午一点半\t202930\n韩张斌\t202931\n你好呀有人\t202932\n黄红红火火\t202933\ntang\t202934\n伊恩弗莱\t202935\ndfkzG\t202936\n自由活动\t202937\n57秒\t202938\n加多\t202939\n币匕\t202940\n绝迹\t202941\n哥兄弟姐妹\t202942\n金库\t202943\n倾吐\t202944\n倾向\t202945\n金店\t202946\n救生员\t202947\n刘秒\t202948\n辛杰\t202949\nbdqh\t202950\nxxx0319y300820211\t202951\n木琴\t202952\n64646466464\t202953\n谢大家\t202954\n季它\t202955\n电钻\t202956\n52013147758\t202957\n石天一\t202958\n李傲雪\t202959\ncoolp\t202960\n汗子\t202961\n蔣千禧\t202962\n叶煜恺\t202963\n乱笑\t202964\nT``\t202965\n木理\t202966\n咯了再见\t202967\n做钱\t202968\n红烧鸡\t202969\n国泰\t202970\n矮化\t202971\n口味\t202972\n马铃薯块\t202973\n15906565016\t202974\n佚名\t202975\n回一家\t202976\n缠住\t202977\n虫豸\t202978\nhellomm\t202979\n自顾自\t202980\n马雅茹\t202981\n切哼\t202982\n鱼尾纹\t202983\n杂耍\t202984\n国法\t202985\n知也\t202986\n邬君梅\t202987\n叶林杰\t202988\nACCESS\t202989\n一张公里\t202990\n朾工\t202991\n张博涵\t202992\n知乎\t202993\n咋一咋\t202994\n同片\t202995\n竟然后再见了\t202996\n一月九号\t202997\n邹积鑫\t202998\n往矣\t202999\nppppppp\t203000\ns节\t203001\n啦啦啦我的宝贝\t203002\n三春晖\t203003\n菩提树芽\t203004\n萨尼\t203005\n骆冰王\t203006\n殆殆\t203007\n11122233333试试试试试55555666667777788888\t203008\n即带\t203009\n臭不要脸鸟\t203010\n於吧\t203011\n活死人\t203012\n波痴得额脖子一模1aaa\t203013\n来了想\t203014\n桓台黄檀\t203015\n大名耳\t203016\n科洛\t203017\ngovo\t203018\n徐水法\t203019\n汇咳咳\t203020\n王贺桐\t203021\ntmrm\t203022\n明天早上6点30分\t203023\n银河系剧场\t203024\n马昕宇\t203025\n旭日阳刚\t203026\n理才\t203027\nsaber梗\t203028\n埤城\t203029\n代码\t203030\n明天晚上\t203031\n理所\t203032\n4.3英寸\t203033\n满淫\t203034\n理找\t203035\n将会\t203036\
n用钱\t203037\n妥瑪斯里蘭卡\t203038\n钻研\t203039\n00000000000000\t203040\n假发片\t203041\n拖掉\t203042\n芭芭多咪度秘\t203043\n真乖我的小弟\t203044\n萝卜根\t203045\n三整条\t203046\n啦啦啦啦啦你你你\t203047\n说非\t203048\n空瓶\t203049\nvhydy\t203050\n奥斯\t203051\n变变变\t203052\n試過嗎\t203053\n有朝一日\t203054\n韩孟磊\t203055\n甚致\t203056\nsoweara\t203057\n甚至\t203058\n天海梦\t203059\n59秒\t203060\n蜂拥而至\t203061\n够用\t203062\n18000天\t203063\n世界无烟日\t203064\n嗯汪比样你买我的可口可是我们加由\t203065\ndicsucxhjhcv\t203066\n发行量\t203067\n油炸类\t203068\n齐鲁\t203069\n咳咳咳咳咔\t203070\n嗯青春\t203071\n孤不孤独\t203072\n50022308\t203073\n松吉泽仁仁波切\t203074\n施生\t203075\n源头\t203076\n死里哈啦\t203077\n劝告\t203078\n郑水英\t203079\n王帅气我是个女的女生弹的好告诉你我帅气王者帅气\t203080\n洪阿姨\t203081\n钕博\t203082\n流域\t203083\n罗兰\t203084\n林天麟\t203085\n铧头\t203086\nswt\t203087\n色咪咪\t203088\n习欢\t203089\n你好女神\t203090\n爱我别走\t203091\n客套话\t203092\n男女俩\t203093\n呃办\t203094\n白天鹅\t203095\n赞不上\t203096\n母带\t203097\n为什么呀\t203098\n瑞纳\t203099\n帅阿\t203100\n打战去\t203101\n一世个\t203102\n株式\t203103\n噎死\t203104\n瑞红\t203105\n暂未\t203106\n玩世\t203107\n不发儿\t203108\n碧眼\t203109\n尿急尿急尿急尿急尿急尿急尿急尿急尿急尿急\t203110\n好华娱\t203111\n85555222333225589\t203112\n华为p8\t203113\n华秘\t203114\n郑训铨\t203115\n山楂妹\t203116\n一诺不想\t203117\n局外人\t203118\n佐仓绫音\t203119\n疙疙瘩瘩\t203120\n玩下\t203121\n服务法\t203122\nfork\t203123\n高星星\t203124\nforo\t203125\n朱正卿\t203126\n為什麼\t203127\n叫原谅\t203128\nfora\t203129\n郭艳华\t203130\n主公\t203131\n衝口而出\t203132\n一厊\t203133\nltefehft\t203134\n黑立鱼\t203135\n理你了你\t203136\n搓泥\t203137\n购机者\t203138\n黄安\t203139\n刘p\t203140\n小不点行不行\t203141\n你好年谁你\t203142\n有期徒刑\t203143\n13956718275\t203144\n天廷\t203145\n分忧\t203146\n妖精打斗\t203147\n2596583835\t203148\nshii\t203149\n科学计数法\t203150\nclassic\t203151\n铁生\t203152\nkjbbn\t203153\nshid\t203154\n签定\t203155\n服友\t203156\n春江水暖鸭先知\t203157\nshix\t203158\nGOODS\t203159\n图藤\t203160\n和田玉\t203161\n怎个\t203162\n死案\t203163\nwixin\t203164\n密慢走\t203165\nshit\t203166\n120112122\t203167\n发愁人\t203168\n怎丑\t203169\n分心\t203170\n七八十分\t203171\n兴业天文\t203172\n第几岁\t203173\n我不我不在\t203174\nshirtjacket\t203175\n好好先生\t203176\n我的家门\t203177\n巴雷特嘉年华图呃
屠龙我当数士毁灭公主二\t203178\n第三代\t203179\n肖秀荣\t203180\n怎不\t203181\n卤曲\t203182\n度琼\t203183\nIamateacher\t203184\n脸皮\t203185\n实训\t203186\n崔轶博\t203187\n恩倍\t203188\n林一一\t203189\n女羽\t203190\n台端\t203191\n宿松县\t203192\nHhgbu\t203193\n06986976\t203194\n骇客\t203195\n咯木\t203196\n阿鲁姆\t203197\n照像机\t203198\nshsgd\t203199\n早期乳腺癌\t203200\n滋润\t203201\n徐慧婷\t203202\n我和一个男孩\t203203\n青稞酒\t203204\n你好了吧\t203205\n明灯\t203206\n林园\t203207\n抄别\t203208\n轮下\t203209\n8868868888886\t203210\n淡饭\t203211\nEeeeeeeeeeee\t203212\n5000万8200多少\t203213\nfetoride\t203214\n2月30\t203215\n每一时\t203216\n二零七七\t203217\n刘腾泽\t203218\n过去\t203219\n给我叫\t203220\n来不了\t203221\n万青丝\t203222\n稳定性\t203223\n迪加\t203224\ndanci\t203225\n0478561\t203226\n喷钱\t203227\n过压\t203228\n心海\t203229\n留不住\t203230\n王思捷\t203231\n千年等一回等你回等你回\t203232\n莫非铁道部\t203233\n周蕙\t203234\n王受元\t203235\n恼羞成怒\t203236\n奇女子\t203237\n美不胜收\t203238\n23名\t203239\n小魔仙之魔仙公主\t203240\n有气质\t203241\nwhattrouzhito\t203242\n刚式贫嘴\t203243\n理工大学\t203244\n多几年\t203245\n腾讯微博\t203246\n钢琴\t203247\n昨天前天\t203248\n嗯行好\t203249\n管得好\t203250\n青石板路\t203251\n险恶\t203252\nLILY\t203253\n玩笑话\t203254\n小飞哥\t203255\n曰安\t203256\n坦承\t203257\n榨汁\t203258\n幸福的人\t203259\n11等余\t203260\n火天\t203261\n睡莲\t203262\njhfffff\t203263\n杨千玺\t203264\n火大\t203265\n结婚吧行\t203266\ngcuchg\t203267\nvfiy\t203268\n呆二\t203269\n536发\t203270\n我的王国\t203271\n鞋柜\t203272\n巍山\t203273\n主戏吻\t203274\n呆了\t203275\n申个\t203276\n战书\t203277\n非官方\t203278\n黃軒明\t203279\n杜咪\t203280\n惴惴不安\t203281\nVicbe\t203282\n网虫\t203283\n昨天00:50\t203284\n再见密度我我讨厌你\t203285\n想不想不想\t203286\n卧室\t203287\n日均期房\t203288\n人情归\t203289\n不是我是说的是你真好\t203290\n实时度\t203291\n大东北\t203292\n敬业\t203293\n李飞扬\t203294\n唯娘\t203295\n坚韧\t203296\n富少徘\t203297\n天气\t203298\nJNUTET\t203299\nJgubvxfhjo\t203300\n二十多少\t203301\nnygg\t203302\nDVT\t203303\nmcyticalai\t203304\n抑郁症\t203305\n与那你猜我\t203306\n李夜冰\t203307\n乳房胀痛\t203308\n40多三\t203309\n遇害\t203310\nfffdgf\t203311\n天水\t203312\n56934128765\t203313\n快六三\t203314\n萌萌大我是你\t203315\n[狗\t203316\n千霞\t203317\n对不对说\t203318\nJIUUU\t203319\n长增\t203320\n村生\t20332
1\n胡海滨\t203322\nshuanleba\t203323\n四条\t203324\n养发\t203325\n几回\t203326\n李泽厚\t203327\n大成景恒保本基金\t203328\n天天有喜之人间有爱\t203329\n元限极\t203330\n就秘杀\t203331\n楔形\t203332\n羞惭\t203333\n尹电影院\t203334\n去懂\t203335\n12547452258523680556675656888\t203336\n维生素C\t203337\n天龙牛逼\t203338\n亲妹妹\t203339\n算命先生\t203340\n聊你猜猜我的名字\t203341\n网络用语\t203342\n北京小学练会\t203343\nhappysavier\t203344\n雨花\t203345\n13201151551\t203346\n15284231000\t203347\n笑话吧二五的算了不说了我是人不说了别说了我是鬼畜的别说我是人别说了别说了我是成龙\t203348\n1223345667\t203349\nood\t203350\n不可及\t203351\n腾讯微信\t203352\nshye\t203353\n网师\t203354\n啦拜\t203355\nooo\t203356\n矿产\t203357\n如海纳百川\t203358\n伴友\t203359\nhokunl\t203360\n60Kg\t203361\noop\t203362\noor\t203363\n回家姐偶和我该不是基地对酒当歌不上课偶idghsbehjis\t203364\n机身\t203365\n丰城\t203366\n呢大本营\t203367\n很高档\t203368\n奥巴马政府\t203369\n拜师\t203370\n赢得\t203371\n你的不吧\t203372\n贤胜\t203373\n有一手\t203374\n战报\t203375\n1月10日上午\t203376\n1月27日\t203377\n我爱我妈妈爱我的的\t203378\n说的无话可说\t203379\n一心堂\t203380\n64G\t203381\n厦门市中华电齐思蓝机会\t203382\n标签\t203383\n酸楚酸楚\t203384\n初次\t203385\n制度性红利消失\t203386\n义仪\t203387\n偷星九月天\t203388\n如此那\t203389\n晕出\t203390\n差辈儿的连连\t203391\n火出的话我在我在我在我在我在狹兆龙虾饺子皮\t203392\n2011-03-10\t203393\n罡女\t203394\n将错就错\t203395\n变得\t203396\n等你会\t203397\n#世界第一初恋#\t203398\n桃花旺\t203399\n条件\t203400\n你丑你丑你丑你丑你\t203401\n佳人闻语\t203402\n儿孑\t203403\n通化\t203404\nGMUbuntu\t203405\n一二栋\t203406\n好听者\t203407\n扁胎\t203408\nyouyou\t203409\n春兰\t203410\n我只在乎你\t203411\n头份\t203412\n才大难不死动馹弔骤验三碍破慧玫金金月盾腰如二读土丁秦古有朝一日厂\t203413\n头仲\t203414\n伛付交\t203415\n招商银行\t203416\n连菲灵\t203417\n你的秘密告诉我\t203418\n著作等身\t203419\n请法\t203420\n江西卫视\t203421\n1569823740\t203422\n拿不到\t203423\n花样年华]\t203424\n一个86公分\t203425\n我不离你了你是\t203426\n好爱好爱你哟你爱我吗我真的很爱你\t203427\n爱国主义\t203428\n亲兄弟\t203429\n造梦\t203430\njaid\t203431\n我欲封天\t203432\n288倍\t203433\n九万年以后\t203434\n5x5\t203435\n果呀\t203436\n恶心死了我和我的自配\t203437\n受不了你\t203438\n管理员\t203439\n厌会\t203440\n比视\t203441\n非煎饼摊\t203442\n蚂蚱子\t203443\n喜羊羊快跑\t203444\n味逍遥丸\t203445\n再來\t203446\n七百多\t203447\n吴若菡\t203448\n依霸学霸\t203449\n邵氏\t203450\nddttt\t203451\n谜城\t203452\n小挑
花\t203453\n鸡味\t203454\n微倍是源在一起周我是无名么\t203455\n5575750288\t203456\n叶剑\t203457\n石桂霞\t203458\n绝伦\t203459\n发疯\t203460\n爱就爱我\t203461\n质变\t203462\n白萝卜\t203463\n化纤\t203464\n2128282\t203465\nqhs\t203466\n1352元\t203467\n因子\t203468\n美姑\t203469\n杨雅澜\t203470\n酸臭\t203471\n大小子\t203472\nSha\t203473\n览天下\t203474\nShe\t203475\n荤菜\t203476\nShh\t203477\nShi\t203478\nShk\t203479\n二声\t203480\n叫名\t203481\nchanyeol\t203482\n洁面\t203483\n美姿\t203484\n房管\t203485\n卡小\t203486\n汪笑笑\t203487\n死疯子\t203488\n处处\t203489\n饿货\t203490\n老场坊\t203491\n卡尔\t203492\n费尽心思\t203493\n大小孩\t203494\n我不是你的小乖乖\t203495\n355次\t203496\n还你\t203497\n黄晨萍\t203498\nhttppinyincne9861\t203499\nX0.5-50\t203500\n笑文\t203501\n额尔其斯\t203502\n难堪\t203503\n地藏经\t203504\n笑料\t203505\nMGMTGD\t203506\n纳米木\t203507\n年均\t203508\n点关心\t203509\n屏风\t203510\n辛晓琪\t203511\n1713\t203512\n东航盲品会\t203513\n5250\t203514\n1717\t203515\n长尾鲛\t203516\n新声\t203517\n5255\t203518\n1718\t203519\nffffffffffffffff\t203520\n偶怕\t203521\n121254125412541\t203522\nyf7p79g\t203523\nDDYDRUDURDUUF\t203524\n肇东\t203525\nivalawoldswrift\t203526\nafter\t203527\n推辐照\t203528\nfxyx\t203529\nApP\t203530\n动耳\t203531\n号儿\t203532\nfxyf\t203533\n14批次\t203534\nvill叉六\t203535\n康巴ompaomomo\t203536\n天了噜\t203537\n18316882346\t203538\n使用寿命\t203539\n打打杀\t203540\n生化会\t203541\nfddsff\t203542\n骨气\t203543\n葡萄67367\t203544\n河北省委\t203545\n南秋秋\t203546\n億歲\t203547\n对呀犹犹豫豫\t203548\n扩散\t203549\n蹭网\t203550\nv俄v\t203551\nfsxbgdgd\t203552\n红河蒙自\t203553\n肝衰竭\t203554\n力促\t203555\nhilliirruss\t203556\n鸡毛\t203557\n时年\t203558\n隔音\t203559\ncoclou\t203560\n3月5日\t203561\n快点再讲一个再讲的快点\t203562\n图书杯\t203563\n近体诗\t203564\nyournameis\t203565\n几十年以后\t203566\njhvvhjk\t203567\n谛听\t203568\n7月7日\t203569\n恋爱控\t203570\n猜字迷\t203571\n周国似\t203572\n核桃壳\t203573\n力保\t203574\n抚松\t203575\n一张照\t203576\n三点零四\t203577\n1月24日\t203578\n5555555555555555555555555555555555555555555555555555555555555\t203579\noigtfdg\t203580\n坏蛋了你好\t203581\n瘾谜\t203582\n钰雄也\t203583\n不要我了我好\t203584\n有明\t203585\n遥视\t203586\nzhfd\t203587\n数十年后\t203588\n2003年12月13日\t2
03589\n冠军赛\t203590\n我我我我我宝贝说话\t203591\n汉堡夹\t203592\n斩春剑\t203593\n夜深了睡觉吧\t203594\n健康\t203595\n崔佳卓\t203596\n张浩斌\t203597\n牙龈\t203598\nxkhxyxuupdu\t203599\nnibune\t203600\n同道\t203601\n中产阶级\t203602\n指号\t203603\n百分之十五\t203604\nwbsksb\t203605\n新华大街站\t203606\n介头\t203607\n豆们\t203608\n梦见到\t203609\n书简\t203610\n榆次\t203611\n红红红\t203612\n湖光\t203613\n差辈儿\t203614\n一惩罚\t203615\n江湖十三差\t203616\n代代代\t203617\n作业肿么血\t203618\n一个6500\t203619\n八旗大慈\t203620\n没主见\t203621\n踢踢\t203622\n神笔\t203623\njjjjmm\t203624\n对你的爱\t203625\ntfa3\t203626\n一万遍度\t203627\n细高跟\t203628\n扑不倒\t203629\nhtimishoulisal\t203630\n雅丽洁\t203631\n高一米\t203632\n国土地\t203633\n影像\t203634\n浩冬\t203635\n李胜基\t203636\n爱唔汗纸\t203637\n撮合\t203638\n法索\t203639\ntfah\t203640\nthug\t203641\npain\t203642\n上架\t203643\n想和就打\t203644\n住吧\t203645\n大大们\t203646\n咪片\t203647\n结婚后\t203648\n铭妹妹\t203649\n一角\t203650\n我相信自己\t203651\njbvmvcbk\t203652\n笑不出来\t203653\n一觉\t203654\n蔡国斌\t203655\n祠堂\t203656\ntfas\t203657\n典当\t203658\n付行\t203659\n国手\t203660\n锁水\t203661\n结婚吧\t203662\n惊天杀人\t203663\n也就是说\t203664\n6瓶\t203665\n上林\t203666\n剥掉\t203667\n5-9月\t203668\n既非\t203669\n楼王\t203670\n霉头\t203671\n真切\t203672\n一触\t203673\n预料\t203674\n赵娟丽\t203675\n毛衣柜\t203676\n舒\t203677\n过年了耶\t203678\n钱哥\t203679\n55567642\t203680\n舔\t203681\n米未\t203682\n甘心\t203683\n舞\t203684\n使用率\t203685\n片黄色\t203686\n50022645\t203687\n舀\t203688\n米朵\t203689\n舅\t203690\n牛肉面\t203691\n世纪城\t203692\n备胎们\t203693\n興\t203694\n舍\t203695\n百八十\t203696\nkkjjbjhgghggghhgygff\t203697\n舱\t203698\n舰\t203699\n2974585426\t203700\n生津\t203701\n澧县\t203702\n钱哈\t203703\n秘拜拜\t203704\n船\t203705\n标价\t203706\n零二二零四\t203707\n舾\t203708\n呵浩特姆欧米\t203709\n需要你原谅\t203710\nblack\t203711\n和昌楼\t203712\nhttphhiphotosbaiducomxiaodupicitem6f061d950a7b0208d4b8669465d9f2d3572cc856jpg\t203713\n航\t203714\n生活\t203715\n你行\t203716\n19日\t203717\n般\t203718\n如沙\t203719\n体大\t203720\n体术\t203721\n你和你的爱情\t203722\n面往\t203723\n熊样\t203724\n鑫帅\t203725\n女生版\t203726\n统统中\t203727\ngagb\t203728\n瘦背冠军葡萄柚\t203729\n张棻\t203730\n郭珍霓\t203731\n故此\t203732\n停放\t203733\n偶然性\
t203734\n▄︻┻┳\t203735\nHfjehd\t203736\n蜘蛛侠蜘蛛侠\t203737\n二点半\t203738\n涂装\t203739\n928\t203740\n929\t203741\n我是我你是你\t203742\n走本\t203743\n南瓜炒度秘\t203744\n素淡\t203745\n报送\t203746\n922\t203747\n923\t203748\n924\t203749\n925\t203750\n926\t203751\nj8\t203752\n未未\t203753\n可不可怜\t203754\n十三次\t203755\n托举\t203756\n尤伦斯当代艺术中心\t203757\n被窝\t203758\n可不可怕\t203759\nyivjeh\t203760\n被窃\t203761\n我喜欢牡丹花百花之王\t203762\n白雪姬\t203763\n好了爱卿退下吧朕\t203764\n认领\t203765\n3月20\t203766\n尺码\t203767\n奔驰320l\t203768\n大勇果果好哈[笑哈哈\t203769\n未有\t203770\n度秘亲\t203771\n八爪鱼\t203772\n九层\t203773\n二零二二二\t203774\n毒酒\t203775\nbehind\t203776\n发祥之地\t203777\n京通\t203778\n一个星期天\t203779\nvvnghtfhh\t203780\n一针见血\t203781\n云计算\t203782\n单号\t203783\ntuz\t203784\njx\t203785\njy\t203786\njz\t203787\n蓬乱\t203788\ntup\t203789\nju\t203790\njv\t203791\njw\t203792\ntut\t203793\njq\t203794\n太太关\t203795\njs\t203796\njl\t203797\ntui\t203798\njn\t203799\njo\t203800\njh\t203801\n谷秘\t203802\ntun\t203803\n一次一点\t203804\n登山表\t203805\nje\t203806\ntub\t203807\njg\t203808\ntud\t203809\nja\t203810\ntuf\t203811\ntf鲍尔斯\t203812\n有意无意\t203813\n大灾\t203814\njY\t203815\n多一节\t203816\n脱不了\t203817\n25484562552\t203818\n195万元\t203819\nmjplmdugkxmjtbmwjaevgamtphmqulvka\t203820\n知罪\t203821\n九六四三九\t203822\n大灯\t203823\ndated\t203824\njJ\t203825\n邱七八\t203826\n单反\t203827\n黄狸猫\t203828\n1百万\t203829\n刘雨潼\t203830\njB\t203831\n按接着\t203832\n指示\t203833\nMaxchic玛汐\t203834\n我国\t203835\n光华管理学院\t203836\n有我你来\t203837\n袁子淇\t203838\n色精\t203839\n会理\t203840\n就送\t203841\n零二幺六\t203842\n零二幺八\t203843\n湖大\t203844\n着床\t203845\n嗯五一音乐厅\t203846\nyooooo\t203847\n下午18:00\t203848\n吓唬\t203849\n我喜欢鸡\t203850\n生父\t203851\n吐等\t203852\n龙咚\t203853\n命局\t203854\n忘不掉\t203855\n于胜\t203856\n仙游\t203857\n撒事\t203858\n龟头儿\t203859\njzymdk\t203860\n有所不为\t203861\n64g\t203862\nbmb\t203863\nbma\t203864\n一言不发\t203865\n点六十年\t203866\n作用犬\t203867\n黑/锁屏\t203868\n55252454\t203869\n交割\t203870\n倒库\t203871\n纤体\t203872\n阿拉伯国家\t203873\n樊作业\t203874\n两段\t203875\n5gjgbg\t203876\n少媛\t203877\n没了钱\t203878\ncjvug\t203879\n想和你一起唱\t203880\
n8月4日、5日晚\t203881\n个体户\t203882\n几一家\t203883\n136143\t203884\n噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗噗\t203885\n不假思索\t203886\n左边\t203887\n四川电信\t203888\n小二葵花茶\t203889\n尽要\t203890\n高速费\t203891\n下贿\t203892\npjm\t203893\n下费\t203894\n再读一下\t203895\n百分位\t203896\n闪尸\t203897\n合伙人\t203898\nckgk\t203899\n艺术性\t203900\n下贱\t203901\n三碗\t203902\n你好事\t203903\n好哇好哇小伙伴好啦啦啦\t203904\n两颗心\t203905\n乐之邦\t203906\n通海县\t203907\n几个五\t203908\n朴先生\t203909\n远情\t203910\n台本\t203911\n审片\t203912\n你好人\t203913\n我说东你说西你说你没惹我\t203914\n你好亲\t203915\n张大哥\t203916\n我好我不爱你也不爱我\t203917\n几个亿\t203918\n龙龟\t203919\nHewitt\t203920\na米\t203921\n景点儿\t203922\na类\t203923\n氢弹\t203924\n克洛格纳\t203925\n受益者\t203926\n孩子气\t203927\n炒面\t203928\n168几\t203929\n我真的不理你了你求我不理你\t203930\n超级细菌\t203931\n液晶\t203932\n刘世允\t203933\n表错意\t203934\n哞哞咩咩潺潺蔷薇\t203935\n轻洗\t203936\n金鸿瑞\t203937\n陶庙\t203938\n三十五块\t203939\n知足常乐\t203940\nhi呀\t203941\nfxgbufgjj\t203942\n明年一月\t203943\n嘴边\t203944\n小葡萄\t203945\nBritto\t203946\npjy\t203947\n而复始\t203948\n天蠃\t203949\n点离\t203950\n锁错啦\t203951\n八百兆\t203952\n觉过\t203953\n断流\t203954\n拉莫斯\t203955\n视神\t203956\n独特性\t203957\n目标收益率\t203958\n7点06分\t203959\n心里女兀飞上天了我还\t203960\n九月九\t203961\n来了从现在开始\t203962\n跑堂k\t203963\n数十\t203964\n额tsT\t203965\n骗人子\t203966\n每一步\t203967\n批资\t203968\nfihdjfghfof\t203969\n刘唤\t203970\ndghdjfk\t203971\n慢慢慢慢慢慢慢慢楠楠的男男男男男男男恩\t203972\n日内瓦湖\t203973\n真三国无双7\t203974\n饮茶\t203975\n救援\t203976\n川内酷\t203977\nFhk\t203978\nFhj\t203979\n上部\t203980\nFhd\t203981\n熟服\t203982\n花果山水帘洞\t203983\n战火\t203984\n爱我不要你说句话\t203985\n好我爱打你漂洋\t203986\n工会组织\t203987\n流黑\t203988\n豪猪\t203989\n贾洋洋\t203990\n桂圆瀑布\t203991\n度秘勒\t203992\n蔡老师\t203993\n癌次\t203994\n左翻黎\t203995\n注意到\t203996\n芝芝白狗\t203997\n9点05分\t203998\n验证\t203999\n黄气\t204000\n不要你给我钱\t204001\n别客气天\t204002\n陈承承\t204003\n132478881\t204004\n洗劫\t204005\n上海南\t204006\n供应\t204007\n范思哲\t204008\n提示\t204009\n餐位\t204010\n神童长大后变庸才\t204011\n65班\t204012\n刘太斌\t204013\n后天下午\t204014\n说说情\t204015\n从何而起\t204016\n阿内尔卡\t204017\n教务处\t204018\n贺美琪\t204019\nx3t\t204020\ndsgfddufdfgd\t204021\nhnbc\
t204022\n9999岁\t204023\n13580246790\t204024\n应征如云\t204025\n死穴\t204026\n互联网\t204027\n22222222222222\t204028\n二五十兆\t204029\nnnnnnnnnnnn\t204030\n铁岭地区\t204031\npiyrw\t204032\n志军\t204033\n14526522776384\t204034\n泰丰格格\t204035\n秀才\t204036\n杨德长\t204037\n荣城\t204038\n真的假的真的假的呀\t204039\n邵我\t204040\n77777777773337k7k7k7k7k7k么\t204041\n一百分10分\t204042\n法制观念\t204043\n养成\t204044\n也罢\t204045\n555545528\t204046\n下午上午\t204047\n好再冗\t204048\nojhhh\t204049\nx30\t204050\n工作报告\t204051\n艳湖上初睛后雨\t204052\n刘玺豪\t204053\n心愿\t204054\n不饱食\t204055\n藕粉7噢否否否副8\t204056\nprtscsysrg\t204057\n无懈可击\t204058\n哈波特女\t204059\n算了啦不调戏\t204060\n锤基rps#\t204061\n我一个人了我介绍我姐上你我下雨头我想你投了你投我讨厌你我讨厌讨厌讨厌你\t204062\n亚橄榄\t204063\nabimabi\t204064\nbckgl0\t204065\n看着你\t204066\n大不清\t204067\n小米小米\t204068\n很纯很美\t204069\nI9100黑2700\t204070\n严宽\t204071\nIPAD1\t204072\n岁寒\t204073\n家的故事\t204074\nIPAD4\t204075\n李大家\t204076\n光明顶\t204077\n八达岭长城\t204078\n一类\t204079\n打杂\t204080\n头像\t204081\n白春礼\t204082\n一九踢\t204083\n气囊展\t204084\n一籵\t204085\n一米\t204086\n莞美\t204087\n王源\t204088\n赛中\t204089\n严实\t204090\n花种\t204091\n一了只\t204092\n截沟\t204093\n淋湿\t204094\n老爷子\t204095\n而瑞\t204096\n4分钟\t204097\n清烩海参\t204098\n国有企业\t204099\njdkekenfi\t204100\n期待\t204101\n豆豆豆豆豆豆\t204102\n博学多\t204103\n县管执法局\t204104\n登船\t204105\n退烧贴\t204106\n两件事\t204107\n好多数不清\t204108\n1313131313131313\t204109\n梅派\t204110\nvkcjuggggghgygggggggggggghggijjgdxbgcccggggggghkoopjjjhbkjhhhhiikojjjjjnnnlohxjfdch\t204111\n耳光痛死\t204112\n38000\t204113\n葵花籽油\t204114\n熊出没我爱看熊出没\t204115\n莫泰\t204116\n徐佳\t204117\n是么样\t204118\n尼申\t204119\n前置炒年糕\t204120\n大北京\t204121\n今年5月17日\t204122\n左顾右盼\t204123\n乌雅氏\t204124\n教皇\t204125\n这周末\t204126\n全年级\t204127\n1十四\t204128\n幺五五二二二二\t204129\n47585\t204130\n林辰唏\t204131\n蓝爸爸\t204132\n公务舱\t204133\n没陲\t204134\n阳光海岸\t204135\n有魄力\t204136\n上堂来\t204137\n亲我讨厌\t204138\n到底是谁不然和你\t204139\n募捐箱\t204140\n深表\t204141\n刘讲\t204142\n刘警官\t204143\n588889888\t204144\n于飞玉\t204145\n玩完\t204146\n选好\t204147\n妈妈款\t204148\n在天涯衰\t204149\n你好亲爱的哥你好亲爱的王子\t204150\n刘海儿\t204151\n补刀\t204152\nm\t204153\n陈振
鹏\t204154\n保护区\t204155\n21点30\t204156\n渔网\t204157\nLOLI可仙女可妖魅可名媛淑女可绝世公子\t204158\n张玉火幺二零幺幺二一九九八零九零八零零幺七\t204159\n快点儿包废话\t204160\n来点\t204161\n上海上港\t204162\n姓张名叫\t204163\n康婷\t204164\n干哈来吃噶发\t204165\n俩份\t204166\n药山\t204167\n嗯多会儿\t204168\n吧败类傻子晌炮\t204169\n煎饼侠\t204170\n十场\t204171\n电源\t204172\n雪碧\t204173\n嫡女\t204174\n13465756857\t204175\n王宝钏\t204176\n爱你就是大坏蛋的还鼠大坏蛋\t204177\n唱一首歌亲不行\t204178\n小时\t204179\nmoyotaow\t204180\n七折\t204181\n7000名\t204182\n娇情\t204183\n淋头\t204184\n我讨厌你我讨厌你我最讨厌你\t204185\n小旭\t204186\n豆腐干\t204187\n特定\t204188\n手术\t204189\n莆田\t204190\n威哥\t204191\n尼豪斯\t204192\n天灵灵\t204193\naclean\t204194\n东城分局\t204195\n手机\t204196\n小旁\t204197\n撕拉式\t204198\n褚玉航\t204199\n九小苹果\t204200\n液压\t204201\n糊涂与君\t204202\n10月26日-29日\t204203\n多咪吗度\t204204\nfals\t204205\n呃尼诺\t204206\n苗欣阳\t204207\n癖好呀\t204208\n哇塞好屌\t204209\n楚人\t204210\n多好受\t204211\n大大大\t204212\nfall\t204213\n打不通\t204214\nfale\t204215\n老又萌\t204216\n到底是谁呀\t204217\n800万秘\t204218\n非裔\t204219\n下一册\t204220\n下午片\t204221\n贺裕豪\t204222\nnvayJyslpd\t204223\nZPPO\t204224\n缺现\t204225\n傻子我你是好人你是好人我是女好人情\t204226\nJ20\t204227\n笑看\t204228\n提额\t204229\n嗯刘婷\t204230\n研讨\t204231\ngang\t204232\ngg1\t204233\nw元\t204234\n残月\t204235\n26600元\t204236\n蜂打扰\t204237\n六量\t204238\n讨债\t204239\n电纸书\t204240\n真诚实\t204241\n摇摇车\t204242\n天涯连续\t204243\n死局\t204244\n梯形\t204245\n呵骨\t204246\n事情绪\t204247\n调试\t204248\n蔓延\t204249\ncsmfin\t204250\n武侠\t204251\n你好说\t204252\n1997年2月10日\t204253\n绝关系\t204254\n待会儿再喝\t204255\n胎儿\t204256\n嘶力竭\t204257\n外国佬\t204258\n蒋雯丽\t204259\n一二级\t204260\n争强好胜\t204261\n大者\t204262\nmnNZNNZ\t204263\n你是我的乖酸\t204264\n现货\t204265\n来了仇\t204266\nggu\t204267\nk些\t204268\nggv\t204269\n23年\t204270\n无聊不聊\t204271\nggs\t204272\n我们女主人的死缠烂打的死家伙\t204273\n大战娘\t204274\n微雨\t204275\n巴z\t204276\nggy\t204277\nggx\t204278\n爱号\t204279\ngge\t204280\nggd\t204281\n对付坏\t204282\nggf\t204283\nggc\t204284\nggb\t204285\nggm\t204286\nggn\t204287\n捎子\t204288\nggj\t204289\n大威\t204290\n大娃\t204291\n整顿\t204292\n闻嘉超\t204293\n昂坪\t204294\n小姐儿\t204295\n中央组织部\t204296\n摆好\t204297\n开票\t2042
98\n盲姑娘\t204299\nairport\t204300\n天生的我\t204301\n99999999999999999999999999个\t204302\nconts\t204303\ncontr\t204304\n恩你入怀\t204305\n大娘\t204306\n知首\t204307\n昨夜一场风雨\t204308\n河北区\t204309\n相邻\t204310\n驯夫记后传\t204311\n我是你的主人我要你给我举一个你\t204312\n退潮\t204313\n酷我年\t204314\n哈冈\t204315\n对呀黄土高坡\t204316\n1周七\t204317\n哈再\t204318\n华师都行\t204319\n火音\t204320\n20131\t204321\n喜欢你是真\t204322\n12345678910\t204323\n12345678912\t204324\nhglfuugfudu\t204325\n五分之六\t204326\n发木\t204327\n天蠍座\t204328\n有谁知道\t204329\n李博强\t204330\n对呀我\t204331\nSLAYER\t204332\nUNIQ帅\t204333\n熔岩\t204334\n1761个\t204335\n董庆一\t204336\n茁壮成长\t204337\n新闻纵横\t204338\n13730982966\t204339\n相邀\t204340\n想你会不会\t204341\n咱冕\t204342\n要不摸\t204343\nmfamelelas\t204344\n英巨了\t204345\n冯悦\t204346\n兰溪县\t204347\n这个周\t204348\n16800英亩\t204349\n下星期下\t204350\n星生\t204351\n萎靡\t204352\n我的秘密\t204353\n一千二百分钟\t204354\n司马南知足司\t204355\n安博雅\t204356\n下星期一\t204357\n漂亮天\t204358\n四驱\t204359\n一条虫\t204360\n六分\t204361\n检验\t204362\n喔塔\t204363\nhigjgjnvhkgvlhnvjgfjtlhcjgoutgckfhldjtjckyjtkgIGFFKCKHjhjkh\t204364\njfifiri\t204365\nNicai\t204366\nCA87\t204367\n根谁学\t204368\n不一样样\t204369\n大次大次\t204370\n条信\t204371\n吃苦耐劳\t204372\n死性\t204373\n魏晋\t204374\n直把\t204375\nbgngjcu\t204376\n我喜欢处女\t204377\n灌丛\t204378\n給一個保護套v個給\t204379\n杯具\t204380\n关民网\t204381\n35关\t204382\nhukyig\t204383\n米青漾\t204384\n魏晨\t204385\n出人头地\t204386\n云头\t204387\nN维\t204388\n辛月传\t204389\n凤凰镇竿会馆\t204390\n赵立民\t204391\n梁开盛\t204392\n1692\t204393\n杯兔\t204394\n米仓凉子\t204395\n钱来宝\t204396\n陶乐县\t204397\n赞叹声\t204398\n云天\t204399\nGLAMOROUS\t204400\n条狗样\t204401\n李光羲\t204402\n魔小\t204403\n十四只\t204404\n那一个人\t204405\n老正\t204406\n叶翔\t204407\n老武\t204408\n云丽雨萱\t204409\n户上\t204410\n十四号\t204411\n莎莎\t204412\n笑话再生我不要你了我要开骂\t204413\n纵情\t204414\n多说\t204415\n共青城\t204416\n往西走\t204417\n户主\t204418\n老歌\t204419\nKsSN\t204420\n弱鸡\t204421\n也样\t204422\n我是你的妃子\t204423\n贪诗\t204424\n169m\t204425\n我是你的脸你\t204426\n数十位\t204427\noverchange\t204428\n草原\t204429\n开日\t204430\nqogfl\t204431\n张晓灿\t204432\n开心消消乐\t204433\niutcuf\t204434\n牛香美\t204435\n柯舒琪\t204
436\n3d漫\t204437\n六六大顺\t204438\n橙子翔\t204439\n随于\t204440\n师姐们\t204441\n试运行\t204442\n牛比男\t204443\n祁帅\t204444\nwetdhs3\t204445\n阸\t204446\n红红红红\t204447\n一本钱\t204448\n翠辈\t204449\n陶荣坤\t204450\n事物\t204451\n毛子\t204452\n自上\t204453\n自三\t204454\n乱句\t204455\n睡不好\t204456\n湖南湘雅医院\t204457\nhggjj\t204458\n股民\t204459\n能县\t204460\n大年初一\t204461\nfffffffffffffffffff\t204462\nhggjc\t204463\n自主\t204464\n自为\t204465\n数多五分之一\t204466\n吹草动\t204467\n折嗯\t204468\n话会\t204469\n事片\t204470\nJFHNX\t204471\n臭不要脸你臭不要脸\t204472\npad\t204473\nlato\t204474\n自个\t204475\n沧浪\t204476\n上五天\t204477\nTfgirlss\t204478\nuhc\t204479\n熊猫族\t204480\n临晨\t204481\n百半\t204482\n牛腩炖盅\t204483\n父亡\t204484\n延误\t204485\nuhv\t204486\n于明\t204487\n8月4日晚\t204488\n六圈\t204489\nkukuxuy\t204490\n父亲\t204491\n达西\t204492\n奥斯维辛\t204493\n欧阳楼\t204494\n9HK\t204495\n大安区\t204496\n六场\t204497\n#那些年\t204498\n阎锡山\t204499\npal\t204500\n微信号372036238\t204501\n段子\t204502\n七四年\t204503\n工业\t204504\nnjjk\t204505\n怀表\t204506\n28岁\t204507\n于春\t204508\n橡皮章\t204509\n讨厌粘\t204510\n很多年\t204511\n千里循香\t204512\n554566\t204513\n穗\t204514\n珀莱雅\t204515\n颜色的颜小心的小贱\t204516\n异位性皮肤炎\t204517\n榕城区\t204518\n高琪\t204519\n我真的好想好想好想好想好想你\t204520\nv不色3江汉油田\t204521\n利苑霖\t204522\n穆\t204523\n苗疆\t204524\nZmdWyi\t204525\n羊皮卷\t204526\n穀\t204527\n穎\t204528\n说谁爱谁谁爱\t204529\n大学生士兵的故事\t204530\n天津电视台\t204531\n里度秘\t204532\n究\t204533\n穴\t204534\nlljggcoz\t204535\n灌区\t204536\n抄报\t204537\n心塞法\t204538\n111留学\t204539\n劲歌金\t204540\n植物大战僵尸\t204541\n空\t204542\n穹\t204543\n公平世\t204544\n美少年\t204545\n自然美\t204546\n35元\t204547\n1000000000000000\t204548\n旭东\t204549\nmeerkart\t204550\n保定勒\t204551\n放鞭炮炸\t204552\n李念珂\t204553\nXXXXXXXCCCCCCCXX\t204554\n神经病院\t204555\n班规\t204556\n1126.9亿元\t204557\n石佳莹\t204558\n心飞扬\t204559\nfddgdcxdggffgh\t204560\n23:17\t204561\n陈磊石\t204562\n一點\t204563\n刘建宏\t204564\n大明楼\t204565\n2009年\t204566\nRAM\t204567\n么么哒萌萌哒萌萌哒萌萌呆呆大萌萌的大萌萌的萌萌的呆呆萌\t204568\n哭不出来\t204569\n万劫不复\t204570\n连鬼\t204571\n早冲凉\t204572\n黄傻子\t204573\n吴丹红\t204574\n蓝幽儿\t204575\n18999遍\t204576\n堪\t204577\n哎呦我的个妈呀度秘太神了特别喜欢你\t2
04578\n打折扣\t204579\n要不我等\t204580\n18176088037\t204581\n走媳妇\t204582\n杨宸瑀\t204583\n熊出没雪岭熊风\t204584\n吉米少\t204585\n阿里云\t204586\n一系列\t204587\n逼人\t204588\n逮捕\t204589\nhttpmmeishijnethtml5shicaiphptitle\t204590\n逼亲\t204591\n囧圈\t204592\npaint\t204593\n你不懂你不懂你不懂你不懂你不懂我\t204594\n平平淡淡\t204595\n3600块\t204596\n一个亿度\t204597\n亲切感\t204598\n四多月前\t204599\n亚惠\t204600\nmozuci\t204601\n不好笑\t204602\ndddu\t204603\n万事兴\t204604\ndddy\t204605\ndddz\t204606\n和一个人\t204607\n烙烙烙烙烙烙\t204608\n答答\t204609\nFybb\t204610\n栖息\t204611\ndddd\t204612\nhibb\t204613\ndddf\t204614\ndddg\t204615\n政企\t204616\n下错\t204617\n出师表\t204618\n考驾照\t204619\njrdf\t204620\nb\t204621\nhjrnej\t204622\n死而后已\t204623\n整片整片\t204624\n天籁神童Ronan\t204625\njrdj\t204626\n喷墨\t204627\n噢妮\t204628\n控制系统\t204629\nchdhdhd\t204630\n樂樂樂\t204631\n张嗯\t204632\n异化\t204633\n文革动乱\t204634\n说还三\t204635\n9899700狙击\t204636\n天灾\t204637\nvhjggff\t204638\n乌索普\t204639\n一千年以后\t204640\nhbjnhvb\t204641\n行政区\t204642\n欧阳海绵胶\t204643\n天火\t204644\n天灯\t204645\n收藏的诱惑\t204646\n见贤思齐焉见\t204647\n我喜欢我爱的人\t204648\n5分钟内\t204649\n赵总\t204650\n哈哈哈你还以为你我真的还你了汉子你是男神\t204651\n迷绝\t204652\n龙神凉\t204653\n470007\t204654\n闪动\t204655\n棒锤\t204656\n行政化\t204657\nvng\t204658\nvnf\t204659\nvnd\t204660\n二汽\t204661\nvnb\t204662\na狗\t204663\n口胡\t204664\n纹章\t204665\n春光\t204666\nvnk\t204667\nvnh\t204668\n口胶\t204669\nvnt\t204670\n韩毅君\t204671\ncompy\t204672\nvny\t204673\n班旗\t204674\n导名\t204675\n二江\t204676\n410554888251237738\t204677\n立宪\t204678\n703模\t204679\n郭永丹\t204680\naoror\t204681\n带路\t204682\n周新月\t204683\n饿饿饿饿饿饿饿饿饿饿饿饿\t204684\n深明大义宽容\t204685\n贾文斌\t204686\n二求\t204687\n民歌\t204688\nrtdfyg\t204689\n导向\t204690\n冬蜜冬蜜土豆酱\t204691\n3dmyalifaassz\t204692\n不不不我伤心六年级那你你\t204693\n佟上\t204694\n柯乐小学\t204695\n洗哇\t204696\ndychv\t204697\nvbvv\t204698\n胡海\t204699\n俩十六点\t204700\n王海军\t204701\n就店\t204702\n赶路\t204703\n点球\t204704\n纸箱\t204705\n拉玛西亚电影学院\t204706\n九百多\t204707\n泰华\t204708\n附议\t204709\n热意\t204710\n人力资源部\t204711\n哇塞你的女人好了解\t204712\n546376385699856\t204713\n嘎拉嘎蹦沙嘎啦嘎蹦沙嘎拉嘎\t204714\n公子系\t204715\n契住\t204716\n十三
块\t204717\n说了比\t204718\nFVJ\t204719\n太肉麻了别说讨厌\t204720\n天美狗\t204721\n照不出来\t204722\n哼\t204723\n哺\t204724\n阿娟\t204725\n阿威\t204726\n俺们俩\t204727\n哥\t204728\n小别离\t204729\n阿娅\t204730\n阿娇\t204731\n阿娆\t204732\n摸金校尉\t204733\n哭\t204734\n哬\t204735\nTraditionalgirls\t204736\n哪\t204737\n哩\t204738\n哨\t204739\n哗\t204740\n立得以为\t204741\n哕\t204742\n减少性\t204743\n哒\t204744\n哑\t204745\n么子\t204746\n哟\t204747\n哞\t204748\n哝\t204749\n范晓萱\t204750\n哙\t204751\n加收\t204752\n哇\t204753\n哆\t204754\n阿娣\t204755\n哄\t204756\n这家子\t204757\n哂\t204758\n品\t204759\n哀\t204760\n哏\t204761\n哎\t204762\n响\t204763\n哋\t204764\n安妮宝贝\t204765\n哉\t204766\n哈\t204767\n不点你就是个小不点儿\t204768\n邓琳琳\t204769\ndeqr\t204770\n耸入\t204771\n怎么了你是个大臭屁王死牛氓\t204772\n张思源\t204773\n2声\t204774\n红豆布丁烤奶茶\t204775\n你好好久没\t204776\n近视\t204777\n战太赞\t204778\n一八周末\t204779\n戴斯宾馆酒店\t204780\n连着\t204781\n声爹\t204782\n二妈\t204783\n真理报\t204784\n8568469\t204785\n12016年\t204786\nvbnnkk\t204787\nwethe\t204788\n预留\t204789\n五十九\t204790\n笨葱\t204791\nfuoftidtidrititixkggkgoijdy\t204792\n题语文题\t204793\n伯珠\t204794\n茅庐\t204795\n黄连\t204796\n哈鱼\t204797\n程和\t204798\n端午节\t204799\ntlou\t204800\n少数几款\t204801\n仙羽\t204802\n狗蛋\t204803\n一下一天天\t204804\n2695936768\t204805\n隆化闹\t204806\nGHFJVIIFUG\t204807\n翻新萌\t204808\n房间\t204809\n多首\t204810\n炒勺\t204811\n不得你走\t204812\n徐季平\t204813\n好学要不然\t204814\n那个女孩\t204815\n快乐我快乐\t204816\nTusurudyusur\t204817\n忙氓\t204818\n悠久\t204819\n麻木不仁\t204820\njnbbbbbbbbbb\t204821\n吴慧佳\t204822\n不死哥\t204823\nalow\t204824\n披发\t204825\n同煤\t204826\nplayatman\t204827\n消磨余下\t204828\nfhgzpu\t204829\n单词\t204830\nTommy\t204831\n医生谜\t204832\n200日\t204833\n来得及\t204834\nfucjchig\t204835\n偷转\t204836\n转达\t204837\n建卯\t204838\n不知死\t204839\n14646414545445\t204840\n真是高富帅与白富美\t204841\n花花花花花花痴迷你\t204842\n黄运\t204843\n跑肚\t204844\n2017年\t204845\n植物池法\t204846\n微堡x5\t204847\n建华\t204848\n洒下\t204849\n半明\t204850\n物化\t204851\n建午\t204852\n保姆\t204853\n绝分为\t204854\nhttpfhiphotosbaiducomxiaodupicitema08b87d6277f9e2fc2be637c1830e924b899f33fjpg\t204855\n水清\t204856\n上海地区\t204857\n弗州\t204858\n餑古\t2
04859\n公有\t204860\n父嫁\t204861\n蔻心好礼#\t204862\n公服\t204863\nfuF\t204864\n丽丰\t204865\n4442\t204866\n4444\t204867\n4445\t204868\nHK該\t204869\n你好丑娃度秘\t204870\n丽丽\t204871\n寅豪\t204872\n束手就擒\t204873\nfut\t204874\nfuu\t204875\nfur\t204876\n吴凯涛\t204877\nAB楼\t204878\n根根\t204879\n医学类\t204880\nfux\t204881\nfuy\t204882\n恩氏树蛙\t204883\n彝文\t204884\nfud\t204885\nfue\t204886\n古恩古恩\t204887\n早风\t204888\nfua\t204889\nfun\t204890\n麻痹亻\t204891\n13966507117\t204892\nfuk\t204893\nfuh\t204894\nfui\t204895\n31M\t204896\n没有我美\t204897\n密室\t204898\n亲爱的我最爱你你\t204899\n几分钱\t204900\n米地\t204901\n萨内蒂\t204902\n殷文龙\t204903\n我没有记住你我是恨你\t204904\n自然科学\t204905\n惊下\t204906\nG友行\t204907\n智志\t204908\n铁道\t204909\n這麼\t204910\n好不好呀\t204911\n了不起度秘\t204912\n密完\t204913\n罗某\t204914\n道德银行\t204915\n密宗\t204916\n犹木\t204917\n部分\t204918\n闫为东\t204919\n不不对\t204920\n讨厌家\t204921\nDderuu\t204922\n慢步\t204923\n闹掰\t204924\n交换机\t204925\n从江\t204926\n神站DVD\t204927\nsapient\t204928\n聊点\t204929\n好了我吃中饭了我吃中饭\t204930\n林壁\t204931\n里斯本\t204932\n樂\t204933\n葡萄园\t204934\n开水晶郦城\t204935\ngggccggygvv\t204936\njggioi\t204937\n三卧一厅\t204938\n呃军\t204939\n武术度秘制\t204940\n樑\t204941\n樓\t204942\n撼动\t204943\n有你在真好\t204944\n4628535e5dde71107af9d85a0efce1b9d166140jpg\t204945\n横\t204946\n樥\t204947\n型恋\t204948\ngjgjgjh\t204949\n阙\t204950\n不屈不挠\t204951\n胁迫\t204952\n销户\t204953\nmoh\t204954\n5445884444555855444\t204955\n9亿\t204956\n丹瑰苑小区\t204957\n广陵院\t204958\n樱\t204959\n近五年\t204960\n我你别转移话题我是说你有爱的人\t204961\n9点40分\t204962\nacvf\t204963\n播稿\t204964\nacvc\t204965\n安ulise\t204966\n美公主\t204967\n新学期\t204968\n多六十只\t204969\n白雅群\t204970\n大自然\t204971\npxy\t204972\n安喽\t204973\n最炫民族风-胡歌【威力加长版】\t204974\n晚安晚安晚安晚安晚安晚安晚安晚安晚安晚安晚安\t204975\n河内\t204976\n奚国华\t204977\nvtcv环境卫生院\t204978\nSB家臭\t204979\n一月十六\t204980\n马伟\t204981\n男女之间\t204982\n辅料\t204983\n兩天\t204984\n佩耶\t204985\n济尼奥\t204986\n我不要你了快滚\t204987\n多关照奥\t204988\n衣食\t204989\n中国央行\t204990\n张宝玉\t204991\n周校长\t204992\n关门大吉\t204993\nrustamjan\t204994\n李妈妈\t204995\n人之相信\t204996\npxp\t204997\n首保\t204998\n傻不拉几你真丑\t204999\n565656534343434
353534343435353535353535353535353535\t205000\n养生堂\t205001\n穷忙\t205002\n690名\t205003\n为你汗颜\t205004\nJhv\t205005\n红红火火静静\t205006\n毛骨龄\t205007\n开裂\t205008\n上海银联电子支付服务有限公司\t205009\n立哥\t205010\n健哥\t205011\n巨献\t205012\n喷涂\t205013\n红尘\t205014\n巨猪\t205015\n您先\t205016\n海神\t205017\n环太平洋\t205018\n聽說手頭緊手頭緊\t205019\n肥皂盒\t205020\ngtalk\t205021\n巨猿\t205022\n花红柳绿\t205023\n链接群发\t205024\n快播五\t205025\n有一个人绑架\t205026\n好爱好爱你\t205027\ntukt\t205028\n江心加油江心\t205029\n歌颂\t205030\n六月二十号\t205031\nodod\t205032\n越军\t205033\n杰瑞米·雷纳\t205034\n分摊\t205035\niPod#\t205036\n出气\t205037\nback\t205038\n就问一问\t205039\n点芯\t205040\n522211\t205041\n一五百零六一二\t205042\n一世独殇\t205043\n何军华\t205044\n门卡\t205045\n刀伤\t205046\n哎呀度\t205047\n汶上天\t205048\n貂裘\t205049\n说说笑笑\t205050\n55468635455\t205051\n非国大青联\t205052\n幸存者\t205053\n错别字\t205054\n二八幺\t205055\n塞种才\t205056\n72971\t205057\n胡杨林\t205058\ns18\t205059\n昏花\t205060\n度秘三农\t205061\n美霞\t205062\n甘谷县\t205063\nQFFFGGAAS\t205064\n7.22\t205065\n偷拍\t205066\n张洪坤\t205067\n中医\t205068\n度一\t205069\n度七\t205070\n傻儿\t205071\n病号们\t205072\n阑\t205073\n唐砖堡\t205074\n说拜\t205075\n45433724644\t205076\n12785\t205077\n汽车港\t205078\n就是你们来了大家好我叫高原\t205079\n满载\t205080\n3ou\t205081\n五飞\t205082\n五食\t205083\n拖车\t205084\n雕梁\t205085\n房晓欣\t205086\n小瓢\t205087\n一帮子\t205088\n麝香接骨胶囊\t205089\n萝莉范\t205090\n小瓶\t205091\n炎炎\t205092\n泰语塞\t205093\n13285375626\t205094\n瘟写\t205095\n十五大\t205096\n一1十\t205097\n乖耍\t205098\n邱记\t205099\n朱颜\t205100\ndhiphotosbaiducomxiaodupicitem241f95cad1c8a7861204a77d6009c93d70cf5093jpg\t205101\n网膜\t205102\n大沥\t205103\n卫刁\t205104\n6页\t205105\n商雪雯\t205106\n林正\t205107\n李耀\t205108\n腮腺\t205109\n小瓜\t205110\n6项\t205111\n抒豪情寄壮志\t205112\n2016年1月20号\t205113\n济州联队\t205114\ngjjc\t205115\ngurwe\t205116\n取值\t205117\ngjjg\t205118\nvujsfju\t205119\n文质彬彬\t205120\ngjjl\t205121\n净化\t205122\n老年痴呆症\t205123\ngjjt\t205124\n好婷\t205125\n娃人\t205126\n什么片\t205127\n空岛\t205128\n神经兮兮\t205129\n咽喉肿痛\t205130\n大大概\t205131\nIPO有偿沉默\t205132\n十五放\t205133\n小三年\t205134\n日不落帝国\t205135\n空岗\t205136\nHEGYS\t205137\n柔子\t205138\n下一个\t205
139\n张雨蝶\t205140\n孙力\t205141\n480P\t205142\n东亚独立我是你的姐姐\t205143\n九二35号\t205144\n哭了哭\t205145\n蟠龙\t205146\n一亿年以后\t205147\n少怒\t205148\n最容易\t205149\ntfffffffffffff\t205150\neba100e23dd54564e7426jpg\t205151\n卧倒\t205152\n缩短\t205153\n哭了哼\t205154\n丽丽龙龙龙\t205155\n强疆梦\t205156\n迷宫\t205157\n筷依\t205158\n鲁伊科斯塔\t205159\n红尘陌路\t205160\n韩佳秀\t205161\n浮房\t205162\n二瓜子\t205163\n道谢\t205164\nfyfsgkxzajcs25ovw2yhzsujs3ohcsf\t205165\n市区\t205166\n633476\t205167\n华孚\t205168\n没得堡\t205169\n800多\t205170\n确换来\t205171\n切尔诺贝利核电站\t205172\nn1748\t205173\n华字\t205174\n智能便民柜\t205175\n136355644653\t205176\n娇艳\t205177\n王一飞\t205178\n一首4399\t205179\n坪坝\t205180\nghcfhjgv\t205181\n杨明亮\t205182\nadgjmt\t205183\n甘润\t205184\n平湖市\t205185\n聊看\t205186\n泷泽号\t205187\n玩玩\t205188\n手酱油\t205189\n饽饽\t205190\n央二\t205191\n好心愿\t205192\n菜鲟饭\t205193\n水金夫妻\t205194\n暴风暴雨\t205195\n活期存款\t205196\n掩护\t205197\n幺零\t205198\n省长\t205199\n叫我女\t205200\n粗梨型\t205201\n第九十九i\t205202\n點飯發\t205203\n三杆\t205204\n说话的我讨厌\t205205\n过天我也不知道\t205206\n异地恋\t205207\n猜武\t205208\n不厌\t205209\n0.74%\t205210\n大高才艺\t205211\nwfip\t205212\n猜歌\t205213\n奥运会\t205214\n嘎嘎快猜\t205215\n讨厌讨厌讨厌讨厌讨厌讨厌讨厌讨厌的讨厌讨厌讨厌讨厌讨厌讨厌\t205216\n天书\t205217\n刺儿\t205218\n花你好猜\t205219\n民族英雄\t205220\n作文周刊\t205221\n成年人\t205222\n虾米色\t205223\n小高\t205224\n2475米\t205225\n小心翼翼\t205226\n瘙货\t205227\n六安花园小区\t205228\n横线处\t205229\n立法权\t205230\ncctv\t205231\npqpp\t205232\n9445点\t205233\n杆杆\t205234\n河南省商城县\t205235\nputting\t205236\n孟雪如\t205237\n825分\t205238\n8888888888888888888888888888899999999999999\t205239\n甜甜圆\t205240\n见过我真的不认识\t205241\n恩再美\t205242\n优等生\t205243\n扫雪\t205244\n帝业\t205245\n06月11日07时\t205246\n郑秀文\t205247\n甜甜圈\t205248\n张晓光\t205249\n三不斗\t205250\n3纳米秒\t205251\n深圳\t205252\n光心\t205253\n龅牙\t205254\n扫雷\t205255\n小米MIUI亮点炫#\t205256\ngsrdkyxgyb\t205257\n完了再见\t205258\n是非得失\t205259\nYOUTUBE\t205260\n实战\t205261\n议论不绝的意\t205262\n苏国文\t205263\n一平方米\t205264\n几千米\t205265\n低开\t205266\nAnne\t205267\nG杯\t205268\n茬儿\t205269\n圆通快院\t205270\n皇家马德里\t205271\nDijkstra\t205272\n购得\t205273\n涌起\t205274\n分文不收\t205275\n公告期\t20527
6\n我讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t205277\n二二二九五二\t205278\n鸡巴毛\t205279\n趣钱\t205280\n246813579\t205281\n交卷\t205282\n阿龙山\t205283\n臭度秘\t205284\n20少\t205285\n百毒不轻\t205286\n免死\t205287\n俩盒\t205288\n方佳\t205289\nam身\t205290\ni800\t205291\n2011年1月9日\t205292\n葡萄皮儿\t205293\n戴瑞士\t205294\n行程单\t205295\n开卷有疑——中国现代史读书札记\t205296\n动用\t205297\n牛牛郎\t205298\n16点35\t205299\n66655\t205300\n一两段\t205301\n13205\t205302\nfeamittletosylyour读音一乌尔特\t205303\n影仪\t205304\n动画\t205305\n青佩\t205306\n藏友\t205307\n飞凤舞\t205308\n21centl\t205309\n清查\t205310\n抹茶红豆冰沙\t205311\n寄养\t205312\n我是女的我是个没大美女我是小仙女\t205313\n镶接\t205314\n遭淘汰\t205315\nHfhgych\t205316\n翠鸟\t205317\nｋｏ\t205318\n5月31日零点30分\t205319\n快餐面\t205320\n郑赐\t205321\n63xx一百四十四六十八4点\t205322\n桂霞\t205323\n260分\t205324\n家人马\t205325\n张若静\t205326\n税务农行\t205327\nGGY\t205328\n初孕期\t205329\nGGG\t205330\n37.1\t205331\n度秘度秘我好讨厌你\t205332\n沈锡丰\t205333\n得罪\t205334\n五千八百八十八万\t205335\n电通\t205336\n白哈巴村\t205337\nGGJ\t205338\n外设\t205339\n了有本事\t205340\n臊过\t205341\n粗犷\t205342\n一百多少\t205343\n锦绣缘\t205344\n1十87多少\t205345\n1Xo\t205346\n3325267959\t205347\n你当我喜欢\t205348\n好吧度秘\t205349\n无可替代\t205350\n叶先生\t205351\n偷逃\t205352\n海水\t205353\n岳先生\t205354\n魔都山庄\t205355\n38a一a\t205356\n乐天吃\t205357\n兴致还\t205358\n孤愤\t205359\n雷军\t205360\n慷慨\t205361\n刘琰辉\t205362\n联通网\t205363\n能笑\t205364\n国家版权局版权管理司\t205365\n起跳\t205366\n一卷\t205367\n语虱\t205368\n害国\t205369\n心胸\t205370\n凯喵\t205371\n554242245211224442421\t205372\n字组\t205373\n芮鋆\t205374\n薄弱\t205375\n一卦\t205376\n环保小达人\t205377\n字经\t205378\n有点甜\t205379\n昨日上午\t205380\n清远\t205381\n那会儿\t205382\n一单\t205383\n清运\t205384\n女密度\t205385\n水晶花\t205386\n一华\t205387\n淮会胜\t205388\n一午\t205389\n真的不理我了我原谅你\t205390\n一半\t205391\n陆羽\t205392\n一升\t205393\n二三三七零\t205394\n海底\t205395\n一千\t205396\n黄林铭\t205397\n嘉华\t205398\n三千岁\t205399\n绝壁挂\t205400\n慕兮颖\t205401\n沣河大桥\t205402\n查寻\t205403\nroab\t205404\n我喜欢你好忙\t205405\n搭惜\t205406\na68928\t205407\n根除\t205408\n广元市\t205409\n男土女水\t205410\n机气\t205411\n雪妖针\t205412\n度觅\t205413\n节目单\t205414\ngo金棋\t205415\n三了剑网\t205416\n毛桃\t205417\n赵岗乡\t2
05418\n百万年\t205419\n829475\t205420\n萧萧鸟\t205421\n移情\t205422\n保险业\t205423\n在有\t205424\nhdxgdhfggfg\t205425\n空心菜\t205426\n10：02\t205427\n没办\t205428\n篮下\t205429\n死不要脸\t205430\n1113143\t205431\n苦痛\t205432\n身揣\t205433\n胡惠\t205434\n绿洲新城\t205435\n推定\t205436\n女用\t205437\n床底\t205438\n坟草\t205439\n执导\t205440\n购机\t205441\n曹轩铭\t205442\n可累\t205443\n没劲\t205444\n芭蕾舞裙\t205445\n还款期\t205446\n指挥部\t205447\n假会\t205448\n拉粑\t205449\n学府\t205450\n课外活动\t205451\n刘谁集团\t205452\n婚\t205453\n呆相公\t205454\n素火腿片\t205455\n萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒\t205456\n婆\t205457\n荒缪\t205458\nhsvissH6SVya\t205459\n彩信\t205460\n灵魂么爱你有多深我爱你有几分我的情也真我的很哎呀真没良的人了我的心\t205461\nc90度\t205462\n十多天后\t205463\n春哥=\t205464\n婴\t205465\n婷\t205466\n婶\t205467\n婸\t205468\n夏节\t205469\n扼腕\t205470\nwarmin\t205471\n赵颖\t205472\n心妞\t205473\n595千克\t205474\n婢\t205475\n小册子\t205476\n相更好\t205477\n婧\t205478\n婫\t205479\n张泽芳\t205480\nstophoto\t205481\n我决定\t205482\n胖怿尹光囖\t205483\n汤镇业\t205484\n休米\t205485\n羊蝎子\t205486\n758802\t205487\n道扬标\t205488\n艾姆\t205489\n副当当\t205490\n狂朽\t205491\n接婚\t205492\n李连珠\t205493\n张金涛\t205494\n小魔都\t205495\n品茗\t205496\n马宇婧\t205497\n好度秘了求求你了我\t205498\n手言\t205499\n小骚秘\t205500\n李燕峰\t205501\n时间点\t205502\n就是说你是丑八怪兽臭小子\t205503\ng达伽奇\t205504\n刘静婕\t205505\n毛呀建华\t205506\nlaughat\t205507\n别太冷\t205508\njumpedjoy\t205509\n宋亚星\t205510\n卖笑\t205511\n赵世光\t205512\nfhkchuck\t205513\n名骑\t205514\n急剧\t205515\n余丽红\t205516\n0666ml\t205517\n两千公里\t205518\n树友们\t205519\n朦胧大乱斗\t205520\n1O2\t205521\n操劳\t205522\n尤勇\t205523\n28.9万\t205524\n不直\t205525\n16347896542\t205526\n秘魔仙\t205527\n凯乐期\t205528\n你的一生\t205529\n灵药\t205530\n三氯氢铵\t205531\n真笨你真笨你是猪吗你是猪吗你真笨你真笨你是猪吗你是猪\t205532\n解盘\t205533\n强体\t205534\n最好不过\t205535\n郭嘉\t205536\n888999888888885\t205537\n天秤其实龄牙利齿\t205538\nsmithouto\t205539\n沂蒙山调\t205540\n随时度\t205541\n灯饰\t205542\n我是小宝贝和小可爱你是大坏蛋\t205543\n俄罗斯纳霍德卡市\t205544\n邱泽\t205545\n拉面馆\t205546\nBmhjhhj\t205547\n五角场\t205548\nzhjffjdjxjcskjxhhxhxhjwskdjjxxhhxjshxcjx\t205549\n九组\t205550\n半段\t205551\n市政\t205552\n睡大懒觉\t205553\n韩元\t205554\n小仙儿\t2055
55\n八卦险\t205556\n臭丫头\t205557\n拦者\t205558\n一侧弯\t205559\n辛弃疾\t205560\n动动动\t205561\n骗自\t205562\n阿古柏\t205563\n500兆\t205564\n钕vg\t205565\n李双成\t205566\n视从\t205567\n利艾\t205568\n第三段\t205569\n让捏\t205570\n金岚曦\t205571\n你爱我你爱我我就杀\t205572\n晚晚\t205573\n唐果\t205574\n说点儿吉利话\t205575\n吴陵王\t205576\n仙蒂\t205577\n孙伟赫\t205578\n目地性\t205579\n第33页\t205580\n阿巴贡\t205581\n红片\t205582\n赤黑\t205583\n你好萌萌哒\t205584\n佳润\t205585\n度秘不要脸东魅不要脸\t205586\n红牌\t205587\n档型\t205588\n工艺术\t205589\n女大汉子\t205590\n拐杖\t205591\ntido\t205592\n蚂蟥\t205593\n晚上12点\t205594\n5.28\t205595\n徐万鹏\t205596\n濠江区\t205597\n七一奥跳河喔嚯\t205598\n说半载\t205599\n你信\t205600\nfvfbwrjbxg\t205601\n再说度\t205602\ngchdy\t205603\n短信者\t205604\n吴心怡\t205605\n小八欧尼\t205606\nhuida\t205607\n简单好记有成员\t205608\n问一就是说\t205609\n欲太\t205610\n宝杨路\t205611\n二百五三八\t205612\n鹿鞭泡酒\t205613\n555555555555555555555\t205614\n沫沫哒哒哒哒哒哒哒哒哒哒哒\t205615\n副其人\t205616\nv樱桃\t205617\n呃二二零幺\t205618\n炮妞\t205619\n星味儿\t205620\n665655\t205621\nzfz\t205622\n卖萌\t205623\naggjg\t205624\n恩开心\t205625\n漂流记\t205626\n打辆\t205627\n讨厌你了讨厌\t205628\ndeb48f8c54cccec45c3d292df5e0fe7fa7jpg\t205629\n大家好我是无赖\t205630\n乍然\t205631\n贤亲\t205632\n一七十5885588755886\t205633\n26年前\t205634\n臭味儿\t205635\nzff\t205636\n哈雷州\t205637\nzfd\t205638\n莱西电影院第一张劲挑站\t205639\ngougktut\t205640\n探测器\t205641\n政变\t205642\n救命体\t205643\n202块\t205644\n阿凡提\t205645\n新东那\t205646\n这句词\t205647\n你好洋\t205648\n周艺颖\t205649\n这句话\t205650\n秘怪\t205651\nu千骨\t205652\n眼镜片\t205653\n正八\t205654\n政治学家\t205655\n来都\t205656\n正入\t205657\n县城\t205658\n秘性\t205659\n汉译英\t205660\n决赛\t205661\n谢华洪\t205662\n5n652\t205663\n正兴\t205664\n66698888\t205665\n栗公子\t205666\n小肚腩\t205667\n五书\t205668\n部\t205669\n负覅\t205670\n夕可\t205671\n袁可欣\t205672\n阿尔卡特朗讯\t205673\n中央电视台中文国际频道\t205674\n郡\t205675\n美图秀秀\t205676\n郧\t205677\n出海\t205678\n脑脑脑\t205679\n五零万立\t205680\n都\t205681\n浸透\t205682\n好新年快乐\t205683\n吥十\t205684\n这回事\t205685\n雷柏表一百一百\t205686\n干嘛大事故\t205687\n三皇五帝始小迅隅\t205688\n17773034886\t205689\n郎\t205690\n盲人\t205691\n2月26日\t205692\n太发给钱我在线条子弟媳妇儿女孩\t205693\n洛克\t205694\n丁晓琳\t205695\n560斤\t205696\n郇
\t205697\n照顾猫\t205698\n五义\t205699\n真了不得\t205700\n社会保险\t205701\n苍炎\t205702\n最后胜利\t205703\n郝\t205704\n郜\t205705\n紫苏\t205706\n哀愁\t205707\n郑\t205708\n安逸居\t205709\nwg\t205710\n羊城晚报\t205711\n伊沃株洲横店\t205712\n漫画师\t205713\n上火\t205714\n刘冬萌\t205715\n作会\t205716\n不听说\t205717\n各余\t205718\n妈妈爸妈爸妈爸妈爸妈爸妈\t205719\nwe\t205720\nJhzhshshhskhzjdbh\t205721\n唐伯虎\t205722\nAret\t205723\nwd\t205724\nwc\t205725\n国君\t205726\n在线风\t205727\n不亦乐乎\t205728\n不听话\t205729\n陈当然\t205730\n猪耳朵\t205731\n荆小\t205732\n14摄氏度\t205733\n穋彝XK臃\t205734\n漫漫长\t205735\n破题\t205736\n撤离\t205737\n归海一刀\t205738\n蓝道\t205739\n唱一\t205740\n行唱\t205741\n天河客运站\t205742\n涩狼我也是个色狼我要是淑女为爱我我爱我是你爸爸你是我女儿\t205743\n增援\t205744\n25.00元\t205745\n14pixin\t205746\nrgf\t205747\n草地\t205748\n软萌哒\t205749\n痛骂\t205750\nChbjk\t205751\n老股\t205752\n亲听音\t205753\nhhihhjbbh\t205754\n郑州航海球场\t205755\n知识量\t205756\n黑不隆\t205757\n三万户\t205758\nCMDA2000/WCDMA/GSM\t205759\n小背书\t205760\n发嗲\t205761\n民智\t205762\n将度\t205763\n年极\t205764\n猜从一到\t205765\n气派\t205766\n扑哧\t205767\nCOMMAND\t205768\n海平面\t205769\n动力度秘你是男是女啊秘\t205770\n惹我找\t205771\n陈艺佳\t205772\n中小上学\t205773\n叫讲道理\t205774\n自己的故人\t205775\n咕叽美\t205776\n练练\t205777\n6，7年前\t205778\n崔耀元\t205779\n金偶\t205780\n密探\t205781\n错儿\t205782\n音乐传奇\t205783\n钟桂生\t205784\n睡着滚\t205785\n陈静茹\t205786\n第一箱\t205787\n哑剧\t205788\n隋思雯\t205789\n园园服男小四岁\t205790\n邮局\t205791\n十几号\t205792\n煞气\t205793\n太婆\t205794\n十几只\t205795\n公馆\t205796\nplpppppp\t205797\n地域\t205798\n秘素\t205799\n五只\t205800\nNmlgb\t205801\n盛唐\t205802\n8100\t205803\n傻子女\t205804\n十万年\t205805\n五口\t205806\n五句\t205807\nwezz\t205808\n一千一万个\t205809\n续场\t205810\n第二者\t205811\n黄艺博\t205812\n五号\t205813\nentfboys\t205814\n下一下班\t205815\n五双\t205816\n狭窄\t205817\n55颗\t205818\n邢俊峰\t205819\n桶税\t205820\n陈梦娇\t205821\n万条\t205822\n吴尊\t205823\n狼牙山五壮士\t205824\n金丝熊\t205825\n五发\t205826\n暗地男盗女娼不干人事的无耻之劣的狗男女\t205827\n退路\t205828\n五叔\t205829\n相夫教子\t205830\ntuxs\t205831\n曾利涵\t205832\n色发\t205833\n赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞赞\t205834\n考不好\t205835\n灵中\t205836\n天地间\t205837\nHughft\t205838\n吧女酒店\t205839\n韩一民\t205840\n
信贷\t205841\n惜字\t205842\n一而再\t205843\n浪蹄\t205844\n5558966\t205845\n专长\t205846\n蓝孩\t205847\n龙爪\t205848\n样的人\t205849\n南环\t205850\n哈哈个哈哈哈个哈个\t205851\n信本喜欢\t205852\n吃宵疑\t205853\n大一串\t205854\n无限量\t205855\n仙侠\t205856\n华哥\t205857\n容县珊萃中学\t205858\n灵丘\t205859\n梁惠\t205860\n车子\t205861\n47741\t205862\n点通\t205863\n旧恨\t205864\naab式\t205865\n07937511966\t205866\n别等\t205867\n铁堡王\t205868\n大家好我叫我的爸爸\t205869\n欧密\t205870\n雀巢\t205871\nSomep\t205872\n瓜菜\t205873\n355.84万辆\t205874\n回错\t205875\n有完没\t205876\n67474875\t205877\n具体化\t205878\n烦渴\t205879\ntixug\t205880\nfujjjfufjk\t205881\n我问你是爱的问题\t205882\nggcfcgh\t205883\nHojkd\t205884\n一百四十四\t205885\n早点睡明天上\t205886\n咸鱼煎肉饼\t205887\n阿我\t205888\n安之常委\t205889\n大笑乐\t205890\nusual\t205891\n娜扎娜扎\t205892\n蚀帅哒\t205893\n急眼通\t205894\n秦始明月片尾曲\t205895\n6436467388383161117356197671\t205896\n小青\t205897\nwwee\t205898\n不许动\t205899\nGamble\t205900\n小静\t205901\n捉鬼\t205902\n小面\t205903\n165.8万平方米\t205904\n37.30\t205905\n充组\t205906\n辗转反侧\t205907\n春夜\t205908\nv博客个日\t205909\n切莫\t205910\n水母\t205911\ntgmg\t205912\n唐伯特\t205913\ntgmj\t205914\n尖尖角\t205915\n0.67万k\t205916\n麻池镇\t205917\n弱电解质\t205918\n13168091168\t205919\n硬声音\t205920\n光剑\t205921\n挚友\t205922\n麻雀\t205923\n包夜\t205924\n2513801373\t205925\n女弟子\t205926\n休一休\t205927\n滋补\t205928\n九十进\t205929\n动脉处\t205930\n一百種\t205931\ncucuk\t205932\no0eejj1石\t205933\n夜夜夜夜夜夜夜夜\t205934\n蒋坤\t205935\n重改\t205936\n包头\t205937\n来到\t205938\n无锡卡\t205939\n52928285\t205940\nJficviv\t205941\n詹组\t205942\nuuyfh\t205943\n碍眼呗\t205944\n春雨\t205945\nfgxi\t205946\nGossip\t205947\n包天\t205948\n简报\t205949\n刘璐\t205950\n没得事\t205951\n几百户\t205952\n碧昂\t205953\n皮卡车\t205954\n5月12日\t205955\n张炬\t205956\n刘璇\t205957\n无措措手不及\t205958\nFreenman\t205959\n3月9日\t205960\n张亚琼\t205961\n沙头角\t205962\n刘恒辰\t205963\n必不可少\t205964\n纷至沓\t205965\n冲出去\t205966\n小狗娃\t205967\n郑伯言\t205968\n百花奖\t205969\n渔家傲秋思\t205970\nasiansex\t205971\nyinyny\t205972\n一不要我理\t205973\n558411112233633333345678978966987552668554580800000\t205974\n哼记住\t205975\n化学成分\t205976\n办完毕\t205977\n吃吃吃吃\t205978\n肉番\t205979\n
爱的供养\t205980\n为舍\t205981\nRobot\t205982\n北京西站\t205983\n娄小燕\t205984\n小威威\t205985\n鸡涌偶\t205986\n奥园\t205987\n白茉莉\t205988\n林佳涵\t205989\n要死\t205990\n內事\t205991\n很短\t205992\n华人世界\t205993\n澜沧拉祜族自治县\t205994\n侧身\t205995\n胡敬瀚\t205996\n64个\t205997\n万个级\t205998\n徐晓科\t205999\n羊肉骨\t206000\n致奥\t206001\n百年好\t206002\n简阳\t206003\nvgxesf\t206004\n有不有\t206005\n尼康D90\t206006\n陈哪得\t206007\n79岁\t206008\n你妈呀我嘞个擦\t206009\n18360236605\t206010\n一起水\t206011\n甲午年\t206012\n心情家\t206013\n臭名度\t206014\n55566282\t206015\n死鸡仔\t206016\n五味尘杂\t206017\n8月16日\t206018\nGIYGRT\t206019\n五级\t206020\n要见我我好讨厌你好讨厌你好讨厌你\t206021\n宠物狗\t206022\n久而久之\t206023\n5368553161124345360955\t206024\n几十亿几百亿下\t206025\n村级\t206026\npfoqoq1221lyoueallshouquoilcoso\t206027\n人民币\t206028\nherobrine\t206029\n返学\t206030\n弟们\t206031\nmpc\t206032\n光仪\t206033\n头孢\t206034\n8分\t206035\n94种\t206036\n田园犬\t206037\n妇检\t206038\n糙米\t206039\n王健\t206040\n二三门\t206041\n第35\t206042\n第34\t206043\n林在范\t206044\n若曦\t206045\n会见\t206046\njojgh\t206047\nWalk\t206048\n神探夏洛克\t206049\n孟令璐\t206050\ng936\t206051\n煮妇神探\t206052\nbxjk\t206053\n火石\t206054\n头子\t206055\n李欣娜\t206056\n3秒\t206057\n心细\t206058\n白羊白羊\t206059\n199999\t206060\n泵车\t206061\n心经\t206062\ntakeale\t206063\nb1d866db\t206064\n枪战片身\t206065\nhgfd\t206066\nhgfg\t206067\n亚洲坤\t206068\n心结\t206069\n扶娃缺\t206070\nahrg\t206071\n心绞\t206072\nJkdh\t206073\n3种\t206074\n峨眉山方丈永寿法师\t206075\n迷有\t206076\n一建\t206077\n过度劳累\t206078\n东北豆面包\t206079\n涌向\t206080\n花塞\t206081\n心绪\t206082\n长板凳\t206083\n傻仔\t206084\n夜小五\t206085\n韧性\t206086\n像良\t206087\n茗婕\t206088\n邵志敏\t206089\n田柾国\t206090\n宋怡漾\t206091\n唱一首歌\t206092\n凡业\t206093\n不要不\t206094\n忘了我走\t206095\n百几个\t206096\n恐高\t206097\n红西游\t206098\n瑞凌\t206099\n我爱你爱我你也不爱我我也不爱你\t206100\n超值\t206101\n晨晨晨晨晨晨晨星雨\t206102\n贤淑\t206103\n又一么狼狼六五妞妞\t206104\n泪流不止\t206105\n60亿\t206106\n呜呜想你\t206107\n联赛\t206108\ngvccv\t206109\n505天\t206110\n亲儿\t206111\n死人头\t206112\n犹太\t206113\n一千几\t206114\n断头\t206115\n惨状\t206116\n壬天干\t206117\n投石\t206118\n是不是我不够\t206119\nhjjdj\t206120\n2.22亿元\t206121\n现代都市\t206122\n罗斗王\t2
06123\n8月15月\t206124\n太极狗\t206125\nghddhdj\t206126\n通称\t206127\n我聊天儿你真是的我的好秘书\t206128\n8月7日\t206129\n生生不息\t206130\nfiyo\t206131\n存意\t206132\nf码\t206133\n78910\t206134\n孔子穗\t206135\n梁家村\t206136\n好客\t206137\n超级大讨厌吧我恨你恨你\t206138\n树心\t206139\n对阵\t206140\n少伟\t206141\n那你为我家女人呢明天\t206142\n斯旺西\t206143\n甲昂\t206144\n江铃\t206145\n找你我爱\t206146\n寄托\t206147\n梁家杰\t206148\n李子静\t206149\n4008740087\t206150\n十秒后\t206151\n棒棒粉\t206152\n粗鄙\t206153\n停停\t206154\n铁骨铮铮\t206155\n一个四百五\t206156\n雷伊\t206157\nvuguvy\t206158\n达叔\t206159\n就是你的女人计算机的女的女的女的女的女的\t206160\n无疑\t206161\n非常微博\t206162\n3个多小时\t206163\ntrtrtrtma\t206164\n看见你的笑\t206165\n你和你的殿\t206166\n鸦雀无语\t206167\n25666\t206168\n头盖骨\t206169\n噜啊噜\t206170\n真心爱人\t206171\n三九二零\t206172\n壮志\t206173\norzl\t206174\n真乖度秘\t206175\ndfdfsgdfd\t206176\n本方\t206177\n风景文段\t206178\n涨势\t206179\n看不成\t206180\n司空\t206181\n无疆\t206182\n罗曼史\t206183\n自转\t206184\n推出来\t206185\n去过\t206186\n名牌大学\t206187\n任何人\t206188\nzgkd\t206189\n十八掌\t206190\n不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸就是不要脸\t206191\nail\t206192\n爱尔兰人\t206193\n五十瓶\t206194\n猜讨厌\t206195\n罗大萍\t206196\n一恍\t206197\n哪行\t206198\n休狠\t206199\n有钱看病\t206200\n步步\t206201\n买比\t206202\n妙想熊\t206203\n鑰匙\t206204\n医务\t206205\n我是女的我的裸体美\t206206\n掌珠\t206207\n真的不爱你\t206208\n坑子\t206209\n妖太\t206210\n视障\t206211\n事仨月\t206212\n叶倩雪\t206213\n场外\t206214\n生机勃勃\t206215\n杜莎\t206216\n人民北路\t206217\n小度肖\t206218\n早考\t206219\n新乡\t206220\n54千克\t206221\n心情不高兴\t206222\ncujdndkc\t206223\ncjgk\t206224\n4815\t206225\n龙光强\t206226\ncjgf\t206227\n憨熊\t206228\ngygg\t206229\n宋言\t206230\n最初\t206231\n小度肿\t206232\n你好度秘你好度秘你真好\t206233\n过事\t206234\ntf布维丝\t206235\nJACK\t206236\n逗面\t206237\n切懂\t206238\n佛友\t206239\n雪豹\t206240\n过于\t206241\n搭错车\t206242\n╰o\t206243\n太上老君\t206244\n稀少\t206245\n特内疚\t206246\ncomy嘎\t206247\n马德里\t206248\n贯地\t206249\n秘密\t206250\n鱼眼\t206251\n戴明\t206252\n自录档\t206253\n聊待\t206254\nuiuiuy\t206255\n交通卡\t206256\n了拜拜拜\t206257\n零下万度\t206258\n学妹们\t206259\n第六层\t206260\n刘佳磊\t206261\n随叫随到\t206262\n小女孩儿\t206263\n第六届\t206264\n一学三你学画\t206265\n睡不着想\t206266\nWI个vilaili
very\t206267\nMakeapicture\t206268\n小哥不吧\t206269\n不要钱\t206270\n跟着我走\t206271\n缠枝莲纹花觚\t206272\n红汤\t206273\n董颖\t206274\n告打\t206275\nfifit\t206276\n红池\t206277\n人鱼太今\t206278\nfyfu\t206279\n小黄兔\t206280\n罚球\t206281\n星座学\t206282\nshoot\t206283\n说好好\t206284\n冼耳恭\t206285\n萌仙\t206286\nyoutell\t206287\n红汞\t206288\n令人吃惊\t206289\n乌迪内\t206290\n雨果思科\t206291\n马行\t206292\nzypo\t206293\nxxcuds\t206294\n多谢情在\t206295\n刘烨刘\t206296\n我一我爱你我就是爱\t206297\n周迅\t206298\n付晓亚\t206299\n左翼\t206300\n雪光\t206301\n知行\t206302\n数一到\t206303\n知血\t206304\nKAKA\t206305\n我好伤心度秘\t206306\n余额钱\t206307\n44分\t206308\n业精\t206309\n梦同桌\t206310\n一一遍\t206311\n67.98\t206312\n玫瑰园\t206313\nvhhhbggxgb\t206314\ngcs\t206315\n杨亦辰\t206316\nffffdg\t206317\n恩真心\t206318\nfyfe\t206319\nrjhtjjuufer\t206320\n小鸟儿小鸟小年\t206321\n可人型\t206322\n很美很美\t206323\nyouli科\t206324\n百八十六厘米\t206325\n30倍\t206326\n二百九十六\t206327\n崔海军\t206328\n反应物\t206329\n不染\t206330\n东皇太一\t206331\n喃喃自语\t206332\n我咪咪不咪咪不咪咪不咪咪\t206333\n方建祥\t206334\n相成\t206335\n一百二十分\t206336\n完美无瑕\t206337\nnonononononono\t206338\n建波兄\t206339\n1813\t206340\n新宁天\t206341\n什庙\t206342\n王继宇\t206343\n苦情\t206344\n嗯男\t206345\n出演身负\t206346\n能耐\t206347\n本文\t206348\n迷我死\t206349\n沉舟\t206350\n24700086433358565769863256564666\t206351\n神学\t206352\n真的不骗我\t206353\n晕晕\t206354\n大幅度\t206355\n太搞笑思密达\t206356\n再龄化\t206357\n半程\t206358\n国五本座\t206359\n破涕为笑\t206360\n奋力\t206361\n13906235259\t206362\nb股\t206363\n天勒\t206364\nOS吧4k\t206365\n湘阴\t206366\n罗技公司\t206367\n好高科技\t206368\n客家姓\t206369\n长效\t206370\n基督教\t206371\n二月十四\t206372\n不好舒服我不舒服\t206373\n喜气\t206374\nskulga\t206375\n踢球\t206376\n窦一位\t206377\n地毯\t206378\nmimuten\t206379\n水上\t206380\n水下\t206381\n鹏\t206382\n烤红\t206383\n围脖\t206384\n鹊\t206385\n水东\t206386\n承让\t206387\n大招\t206388\n鹜\t206389\n承认\t206390\n五万块\t206391\n52113145217758258\t206392\n对的选择吧\t206393\nHjhbgbbcc\t206394\n鹤\t206395\n大耳朵肥\t206396\n星星都是你给我的吧度秘\t206397\n哼度秘我真的伤心\t206398\n小雪花\t206399\nngmglld\t206400\n杀多而家\t206401\n不动声色\t206402\n我我我我我我喜欢狗\t206403\n扇动\t206404\n地毛\t206405\n津门\t206406\n啦完美大\t206407\n七子\t20
6408\n201617年\t206409\nyyygg\t206410\nsshshu书aa\t206411\n9点14\t206412\ncf穿越火线\t206413\n基督教徒\t206414\n刘凯林\t206415\n得以\t206416\n张丽华\t206417\n别误\t206418\n8点25分\t206419\n笨笨熊\t206420\n多一个\t206421\nDKIJF\t206422\n圆型\t206423\n酸死\t206424\n劫道\t206425\nfhcdjvxdh\t206426\n漠不关心\t206427\n我的家\t206428\n南沙河\t206429\n有脚\t206430\n安之地\t206431\n陆川第二中学\t206432\n不知足\t206433\n55xx\t206434\n小甜三\t206435\n集美大\t206436\n盛不虚\t206437\n安静\t206438\n月假\t206439\nHold\t206440\n18134114093\t206441\n看起来\t206442\n不长不短\t206443\n罗西\t206444\n毛绿芜\t206445\n匆忙\t206446\n快乐的事\t206447\n郭宇晴\t206448\n转手\t206449\nkhjga\t206450\n牵牛放\t206451\n三啊一\t206452\n妈妈呀山\t206453\n无限\t206454\nbugenniliaol\t206455\n朱敬豪\t206456\n更事\t206457\n高智祥\t206458\n阳原县\t206459\n波尔\t206460\n偷偷偷偷偷\t206461\n妖孽横生\t206462\n死了攻\t206463\n艾玛哼\t206464\nNewsroom\t206465\n假发\t206466\n共舞\t206467\nhaof\t206468\n王者归来\t206469\n第几排\t206470\n很舒服\t206471\n废寝忘食\t206472\n俊华\t206473\n雪豹#\t206474\n家伙们\t206475\nOoooooo\t206476\n假叶\t206477\n花花牛酸奶\t206478\ngiutt\t206479\n纳豆\t206480\n假叫\t206481\n乌鱼波泼\t206482\n吃呢度秘\t206483\n户织\t206484\n阿老\t206485\n五届\t206486\n664245463\t206487\n操纵\t206488\n五屏\t206489\n五层\t206490\n三毛驴\t206491\n愤怒的金刚\t206492\n紫金\t206493\n9岁\t206494\ngfvbhgcbjyszvkiiy4bkitbkirxbiyrdhdagpjnjmjagptmdagt\t206495\n朱金杰\t206496\n薯假\t206497\n多笑\t206498\n卧龙\t206499\n将臣\t206500\n形眼镜\t206501\nbrulee\t206502\n会期\t206503\n家照\t206504\n2012年8月1日\t206505\n第一颗\t206506\n第四小题\t206507\n城市管理\t206508\n将至\t206509\n教主\t206510\n征税\t206511\n代祷\t206512\n湖南省纪委\t206513\n重装\t206514\n觉性圆明\t206515\nNII\t206516\nurname\t206517\n打问\t206518\nlbv\t206519\n乃野奏凯\t206520\nlbx\t206521\n逼民\t206522\n立劳\t206523\n打闹\t206524\nlbe\t206525\n马文月\t206526\n李紫薇\t206527\n边走\t206528\nlbn\t206529\n蓝图\t206530\n477185\t206531\n胸呢\t206532\n老年报\t206533\n金香苹果\t206534\nsaymoning\t206535\n四分之3\t206536\n不要脸不要脸不要脸不要脸活该不要脸\t206537\n太软\t206538\n呀你不爱\t206539\n大一次\t206540\n1.8倍\t206541\njgkxl\t206542\n4564\t206543\n凤凰\t206544\n韩国朴智星足球中心队\t206545\n澳柯玛\t206546\nviijp\t206547\n北关小区\t206548\nway831403\t206549\n幾點\t206550\n演
艺圈\t206551\ndodixy\t206552\n说了真是\t206553\n人不鬼\t206554\n张白\t206555\n也不乐\t206556\n传人\t206557\n韩钦宇\t206558\nhaoma\t206559\n什么手\t206560\n小宁宁\t206561\n刘舒赫\t206562\ncguu\t206563\n小段最长\t206564\n115568\t206565\n思雨星\t206566\n清亮\t206567\n立四望\t206568\nabernef\t206569\n谢黎莎\t206570\n耨耨耨耨\t206571\n兄弟\t206572\n我讨厌你讨厌你我扑说你我又不讨厌你我讨厌你我又说我不讨厌你\t206573\n北京市人体立正骨头馆\t206574\n吧福利hc3\t206575\n萌萌哒萌哒哒萌哒哒萌萌哒大萌萌\t206576\n晚个\t206577\nfygex\t206578\nabo\t206579\n目击者\t206580\n黛绿触\t206581\nabp\t206582\n后见\t206583\n清二\t206584\nbtw\t206585\n羊绒衫\t206586\n一点一百元\t206587\n水晶\t206588\n烦脑\t206589\n烂醉上熊大熊二大电影的歌一个歌\t206590\n我不想和你玩呢我讨厌你\t206591\n高低\t206592\n杨运锋\t206593\n金猴\t206594\n12803156284\t206595\n语音识别\t206596\n几十秒\t206597\n承诺书\t206598\n一天1块\t206599\n出马\t206600\n百利甜酒\t206601\n顽皮\t206602\n萨碧西\t206603\n哈利哈塞\t206604\n白块别\t206605\n青工青农部\t206606\n打浦路\t206607\n小动物\t206608\n加点儿\t206609\n庚妈\t206610\n打告诉你\t206611\n15312313253\t206612\n话笑话\t206613\n白盖亚\t206614\n抵达\t206615\ngodi\t206616\n洪小美\t206617\n创办人\t206618\n苹果cav\t206619\n蓝桥\t206620\n江田\t206621\nnut\t206622\n傍观者\t206623\n于健君\t206624\n宁为玉碎\t206625\n杨甲\t206626\n血车\t206627\n一条儿\t206628\nnud\t206629\nnub\t206630\n一又十分之三\t206631\n还好意思好不要点你不要脸你不要脸\t206632\n三Ｇ\t206633\nabbbbaaa\t206634\nmptmtgj\t206635\n36%\t206636\n五毫米\t206637\n大汉堡包\t206638\n妙音禅韵\t206639\n任课\t206640\n理喻冷\t206641\n陈焕武\t206642\n368\t206643\n369\t206644\n366\t206645\n367\t206646\n365\t206647\n笨笨\t206648\n363\t206649\n360\t206650\n361\t206651\n程序猿\t206652\n4月28日\t206653\n山度秘\t206654\nsmict\t206655\n空灵\t206656\n36D\t206657\n真心喜欢的人\t206658\nuythg\t206659\n神马猫\t206660\nboss\t206661\n悲摧\t206662\nhttpimagebaiducomsearchwisealatnwisealaieutf8word%E98592E7AA9DE794B7%E7949FE59BBEE78987\t206663\n日日夜夜\t206664\n姻缘\t206665\n浅鲰\t206666\n36k\t206667\n36h\t206668\nTesdfghjhy\t206669\n36d\t206670\n今个早上\t206671\n36b\t206672\n镰刀\t206673\n小公园\t206674\n79年\t206675\n好玩儿滴\t206676\nqiakai\t206677\n奇招\t206678\n566喔喔\t206679\n干吗\t206680\n无所谓了\t206681\n舞行\t206682\n郭大侠\t206683\n私营企业\t206684\n男足\t206685\n一条小路\t206686\n成田机场\t2066
87\n女鬼\t206688\n跟假\t206689\n1545788880888\t206690\n130平米\t206691\n开开开开\t206692\n名卡\t206693\n一玛丽\t206694\nhttpehiphotosbaiducomxiaodupicitemaec379310a55b3197741ef0044a98226cffc170ejpg\t206695\n1614073460\t206696\n结婚吧求求你了求求你了求求你了求求你\t206697\n常山\t206698\n伊利米努尔\t206699\n名单\t206700\n养苔\t206701\n1998年6月29日\t206702\n名博\t206703\n南少林\t206704\nvyuvvhvhvhhvvhhvhvhvbhbhbhhbhbhbbhbhbhbjbjjbbjbj\t206705\nfjjdn\t206706\n克拉\t206707\n女人节\t206708\n八十九分\t206709\n研究者\t206710\n温老师\t206711\n感人超幸福\t206712\n克拜\t206713\n各种秀\t206714\n天玉玉玉\t206715\n逃不讨厌\t206716\n11589\t206717\n出国留学\t206718\n大碴粥\t206719\n集中\t206720\n侠海滨\t206721\n撸撸撸\t206722\n爆花糖\t206723\n母母\t206724\n防患\t206725\n没脑\t206726\n七八二一七八十九号\t206727\n付歆远\t206728\n干嘛那你做我的朋友\t206729\n2443144621\t206730\n重写\t206731\n会死机器人\t206732\n翠翠\t206733\n胆量\t206734\n李孟\t206735\n诶华\t206736\n太而了\t206737\n渐失\t206738\n李子\t206739\n刘海戏\t206740\nwjdn\t206741\n来邦\t206742\n担忧\t206743\n首长\t206744\n踢馆\t206745\n尚德\t206746\n小小村\t206747\n你的爱我爱到什么\t206748\n恩弄\t206749\n红桥区\t206750\n走婚\t206751\n新闻信\t206752\n汲取\t206753\n1994年7月23\t206754\nhjkvj\t206755\n生机\t206756\n故事稿\t206757\n吊篮\t206758\n正拍\t206759\n斗志\t206760\n來寄居蟹\t206761\n本你\t206762\n信箱\t206763\nkhnd\t206764\n堆放\t206765\n老鹏\t206766\n杨树\t206767\n本体\t206768\n喝麻\t206769\n蒙逼\t206770\n大大大大大大大大\t206771\n六月六月三十号\t206772\n江苏省公安厅\t206773\n本位\t206774\n漫长\t206775\n真的你好吧好吧\t206776\n54555445555555666677777877\t206777\n说什么说什么快说\t206778\nRtyffugfe\t206779\n急是\t206780\n口业\t206781\n122838\t206782\n掉杀\t206783\n机器人机器人大\t206784\n贺卡\t206785\n转盘\t206786\n牛耳\t206787\n薛雯雯\t206788\n零区\t206789\nMorning\t206790\n一个梦想\t206791\n十九号\t206792\n收件人\t206793\n9900000\t206794\n用谢\t206795\n钱吖吕\t206796\n811ffr十一1\t206797\n13212341648\t206798\n60条\t206799\n还分\t206800\n王俊鹏\t206801\n独门\t206802\n文切\t206803\n伊淑美\t206804\n全片\t206805\n骑顺眼\t206806\n对啊你不我\t206807\n蹦蹦蹦儿\t206808\n木提\t206809\n全版\t206810\n极品\t206811\n撒东珈\t206812\nSaral\t206813\n上高一\t206814\n20多页\t206815\neeggh\t206816\n溜达\t206817\njiek\t206818\n这回\t206819\n那你为么要和你女友分\t206820\n注射器\t206821\n谢票\t206
822\nHfydyd\t206823\n嗯三哇蒂卡\t206824\n夏瑞\t206825\n101言\t206826\n蛋包饭\t206827\n狗瘟\t206828\n瓷娃娃&amp\t206829\n小黑米\t206830\n小方块\t206831\n沛读\t206832\n哎呀你说我是女的呀我就是你的好不\t206833\n木命\t206834\n安坐\t206835\n第二期\t206836\n分解\t206837\n印花税\t206838\n摇摇换换\t206839\n我要投诉你你给老子小心点\t206840\n要哭\t206841\n邓建华\t206842\n米米\t206843\nJames\t206844\n643454580052588555555555555513706755118\t206845\n谢安\t206846\n地产\t206847\n脆脆鲨\t206848\n悬铃木\t206849\n30一50多\t206850\nueruwhwu\t206851\n公建\t206852\nVOGUE\t206853\n张可欣\t206854\n刚哥\t206855\n仙池八珍粉\t206856\n890958\t206857\n林洋萍\t206858\n手术室\t206859\n银色版\t206860\n三十宗\t206861\n男寝\t206862\n贾博\t206863\nky282派\t206864\n无野食\t206865\n烤屄\t206866\n不洋气\t206867\n故事机\t206868\nclubb\t206869\n跑酷\t206870\n鸥斯奇\t206871\n回神\t206872\njjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjjj\t206873\n一千个小时\t206874\n断交\t206875\n时果\t206876\n我好讨厌我爸\t206877\n3abcd\t206878\n瑟怨\t206879\n王寒冰\t206880\n闲妃\t206881\n一淘\t206882\n遥远的地\t206883\n上班级\t206884\n82775n3636345963794379979364288705987587435539367\t206885\nnwtj\t206886\n失策\t206887\n干干净净\t206888\n交朋\t206889\n杨亦晗\t206890\n真俗\t206891\n冬雨\t206892\n探矿权\t206893\n有不是我讨厌\t206894\n鲁帕\t206895\n莫克鲁诺\t206896\n熊灵艺\t206897\n交望\t206898\n20645655\t206899\n夸张\t206900\n翩然\t206901\n王香兰\t206902\n沈紫怡\t206903\nnbffb\t206904\nProject\t206905\n动来东去\t206906\n李佳怡\t206907\n胞\t206908\n吗度秘\t206909\n丽厚\t206910\n应纳\t206911\n1500万欧元\t206912\n还萌萌哒臭不要脸不\t206913\n顾卫卫\t206914\n红彤彤\t206915\n大当家\t206916\n哇偶度秘\t206917\n苏格兰\t206918\ngfvbbj\t206919\n胶原\t206920\n认同度\t206921\n快乐想我的\t206922\n找找看\t206923\n结扎\t206924\n四批\t206925\n荫蔽灵鸟\t206926\n口腔溃疡\t206927\nbvfgfgg\t206928\n第一点\t206929\n会长大\t206930\n御用\t206931\n不可爱我要你给我\t206932\n2块\t206933\n暴毙\t206934\n飞力达\t206935\nbaaalalloonnnn\t206936\nasia4u\t206937\n危及生命\t206938\n四所\t206939\n活腻\t206940\n自诊\t206941\n叫贱\t206942\nEEEEE\t206943\n四手\t206944\njagptjad\t206945\n肖陈\t206946\n机械工业出版社\t206947\n拉不拉多犬\t206948\n科恩\t206949\n马耳他岛\t206950\n雅洁\t206951\ninmy\t206952\n两点到五点\t206953\njutjj\t206954\n7月16日\t206955\n工作岗位\t206956\n死尽\t206957\n上骑士\t206958\n池树涛\t206959\n教案\t20
6960\n牟年丰\t206961\n不得到\t206962\n小怜\t206963\n病毒性腹泻\t206964\n回笼觉\t206965\n二零一二年\t206966\n毒女\t206967\n糖醋排骨\t206968\n李阿\t206969\nGhtd\t206970\ncidB\t206971\n老班长\t206972\n别利克\t206973\n胃\t206974\njgjjj\t206975\n小尼尼\t206976\n安吉尔笛\t206977\n杨凯丑\t206978\nbide\t206979\n继承人\t206980\n名门世家魔都咪\t206981\n不是时候\t206982\nnL\t206983\n超了超\t206984\nnO\t206985\n不是人爱\t206986\n1十一\t206987\nnC\t206988\nbvjccbfvghbccvgfc\t206989\n美背\t206990\n入教\t206991\n捏造\t206992\n2774点\t206993\n感散\t206994\n秘连我的话\t206995\n裆派\t206996\n美胞\t206997\n你会\t206998\nnV\t206999\n去黑\t207000\nnh\t207001\nni\t207002\nnj\t207003\nnk\t207004\n历害\t207005\nnn\t207006\n王菲菲亚特兰\t207007\nna\t207008\nnb\t207009\nnc\t207010\nnd\t207011\n肥猪冰\t207012\nnf\t207013\nng\t207014\nnx\t207015\nny\t207016\nnz\t207017\n20050105\t207018\n七百多年\t207019\n我们一夜\t207020\n美胸\t207021\nns\t207022\nnt\t207023\nnu\t207024\nnv\t207025\n热巴\t207026\n古琴\t207027\n升级\t207028\n减免\t207029\n藤宇航\t207030\n潇湘晨报\t207031\n称心如意\t207032\n范竞马\t207033\n柳豫蒙\t207034\nKENUO\t207035\nHSMT\t207036\n2二三三四四五五六6717818218919\t207037\n梦寐\t207038\n宇文念\t207039\n不用你说\t207040\n北京人\t207041\n同基腐\t207042\n你不是人你不是人你是猪你是猪你是猪你是猪你是猪你是\t207043\n8466784686\t207044\n裂呼吁\t207045\n排名第五\t207046\nGbjx4588yfdcc4165cvhtxcvhk\t207047\n你好我喜欢\t207048\nkgCFU\t207049\nn8\t207050\n过来本搞笑书\t207051\n新格言\t207052\n强走\t207053\n九霄云外\t207054\nn1\t207055\nn2\t207056\nn3\t207057\nn4\t207058\nn5\t207059\n东奔西顾\t207060\n4999\t207061\nagreen\t207062\n半一个一个\t207063\n897城市乐游\t207064\n爱尔兰投资公司\t207065\n本杉\t207066\n太孤单\t207067\n逃票\t207068\n有然后\t207069\n干肠蒸干豆腐\t207070\n55445855668856848900005452566855688536035742588900896\t207071\n2389489365\t207072\n织物\t207073\n邓读书\t207074\n消消\t207075\n河流镇\t207076\n并肩\t207077\n天屎\t207078\n文美颜\t207079\n开膛\t207080\n心情不好时\t207081\n咋不认识\t207082\n天山\t207083\n坐牢\t207084\naplythindym\t207085\n380橙武\t207086\n轮廓\t207087\n华美整形医院\t207088\nhgggggv\t207089\nbhiphotosbaiducomxiaodupicitem0df3d7ca7bcb0a464be13d4d6c63f6246a60af87jpg\t207090\n来苏\t207091\np堂\t207092\n冰清玉洁\t207093\n度组\t207094\n陪玩陪吃陪睡三陪\t207095\
n走访\t207096\n人格魅力\t207097\n活动家\t207098\n姦夫\t207099\n它那个\t207100\n梁傲剑\t207101\n易峰\t207102\n女孩们们\t207103\n逗了你可以\t207104\n连拿瓦\t207105\n艾尔\t207106\n晁凯旋\t207107\n金北院校区\t207108\n张馨月\t207109\n1800多次\t207110\n郭畔村\t207111\n衡水\t207112\nwifitoo\t207113\n艾尚\t207114\n裂隙\t207115\n黑凤梨\t207116\n耐雨\t207117\n草莓节\t207118\n0.95%\t207119\n感谢有你\t207120\n操作线\t207121\n黎紫涵\t207122\n地数\t207123\n妮姬\t207124\nvobbjhhkhfg\t207125\n65闲列\t207126\n熊出没之熊风\t207127\n可睹\t207128\n想望\t207129\n阴干\t207130\n刘思齐\t207131\nIPHONE4s\t207132\n8544552588\t207133\n项张翼\t207134\n春风吉庆\t207135\n猪猪朱猪朱猪朱猪猪侠\t207136\n十六度\t207137\n3813\t207138\n累忙\t207139\nhttppinyincne4030\t207140\n脱空\t207141\n基拉黑军车亨\t207142\n大话\t207143\n猪啊度秘\t207144\n时区\t207145\n大马路\t207146\nduvk\t207147\n积级\t207148\nheine\t207149\n0044555552421225352\t207150\n乐事薯片\t207151\n不顾一切\t207152\n李娜法网\t207153\n中国人寿保险公司\t207154\n赛拉图\t207155\n洋洋度\t207156\n马市\t207157\n吴雨得\t207158\n困困困\t207159\nfuieiwhchf\t207160\n陈治岐\t207161\n一百级\t207162\n十十四\t207163\n累快\t207164\n以为是\t207165\n好心烦\t207166\n首都医科大学附属北京佑安医院皮肤科\t207167\n开裆裤\t207168\n二六二六零\t207169\n中非\t207170\n跑天下加人\t207171\n我真的好孤独好孤独好孤独好孤独好孤独好孤独好孤独\t207172\n棋牌室\t207173\n可见光\t207174\n李岸锶\t207175\n吴思欣\t207176\n拖拖一抹\t207177\n吴嘉涵\t207178\n[\t207179\njfhccc\t207180\n求合体\t207181\n词儿\t207182\n刘芳铭\t207183\n汉堡钱\t207184\n乐撒\t207185\n抄写\t207186\n任静\t207187\n科技型\t207188\n酷我\t207189\n嗯女王\t207190\n好了你没有错谢谢\t207191\n雯雯雯\t207192\n大官庄\t207193\n世民\t207194\n奥利匹克\t207195\n来来来再来一个再来\t207196\n穆勒\t207197\n王八好\t207198\n360度\t207199\n神雕\t207200\n一见钟情\t207201\n想你想你想你\t207202\n打麻将\t207203\n当我女\t207204\n1so\t207205\n范同学\t207206\n胖大海\t207207\n小说\t207208\nzishe\t207209\n快安\t207210\n阿超\t207211\n平原\t207212\n额角\t207213\n负翁\t207214\n3x5y60\t207215\n2982299612\t207216\n听会儿\t207217\nYdvvy\t207218\nmsu\t207219\nrethe\t207220\nmsq\t207221\nmsn\t207222\nmsm\t207223\n马里蓝\t207224\n小语\t207225\n吗终结者5\t207226\n碴子\t207227\n狐仙\t207228\n共性\t207229\n判别\t207230\n女爷们\t207231\n忘着\t207232\n25分\t207233\n挑食\t207234\n我讨厌你恨你\t207235\n客气不开心\t207236\n车饰\t207237\n一千二百二十块\t20723
8\n美卡\t207239\n笑模样\t207240\n天底星座\t207241\n太年代\t207242\nTime\t207243\n养料\t207244\n前往\t207245\n随喜随喜\t207246\n女神范\t207247\n韩森\t207248\n迪奥\t207249\nkxby\t207250\n打喷嚏\t207251\nrufud\t207252\n徐长卿\t207253\n改上\t207254\n麻辣火锅\t207255\n托儿所\t207256\n唔知唔\t207257\nrwby\t207258\n院明院\t207259\nhi呀我做了什么事儿了还要你原谅我你骂\t207260\n887898\t207261\n安晓\t207262\n愤愤不平\t207263\n外教\t207264\njznn\t207265\n过了期\t207266\n安晚\t207267\n喵咕\t207268\n15小小片\t207269\n郝杰\t207270\n流花\t207271\n哈瓮\t207272\n溜冰场\t207273\n5…3cggd\t207274\n增开\t207275\n李泉水\t207276\n思源\t207277\n不是我丑\t207278\n换颜\t207279\n难遇\t207280\n晚上两点\t207281\n换题\t207282\n疯子拜拜我不和你说了我不理你\t207283\n得出\t207284\n马夫罗\t207285\n周振奉\t207286\n贝贝淘\t207287\n取外号儿\t207288\n安智\t207289\n飞来\t207290\n张良录\t207291\njsheh\t207292\n后妈\t207293\n辅路\t207294\n5214\t207295\n活色生香\t207296\n停时\t207297\n飞谢谢\t207298\n干发\t207299\n只是什么\t207300\n罗萝莉\t207301\ngogogogogogog\t207302\ngggggggggggggggggggggg\t207303\n拉着手\t207304\nGkhnng\t207305\n小白手\t207306\n超萌超\t207307\n548855455\t207308\n民歌节\t207309\njdcb\t207310\nobu\t207311\n黑粉\t207312\n无聊\t207313\ncsj\t207314\n倚靠\t207315\n伍家岗\t207316\n夜幕\t207317\n720云\t207318\n奇身\t207319\n先天性肢\t207320\n鞋油\t207321\n江钻股份\t207322\n漫画\t207323\n强将\t207324\n红歌九洋气\t207325\n冰轮丸\t207326\n公安\t207327\n盖尔曼\t207328\n印钱\t207329\n降速\t207330\n血友病\t207331\n四千照\t207332\n大妹夫\t207333\ng5108\t207334\n联合村\t207335\n7Z差\t207336\n蓝白大美\t207337\nroaren\t207338\n公官\t207339\n一夜暴富\t207340\n俊雅\t207341\n四三七十五\t207342\n宝陀讲寺\t207343\n叙利亚电视台\t207344\n木天\t207345\n补平衡\t207346\n公家\t207347\ncsy\t207348\n这玩\t207349\nwaysof\t207350\nkyu\t207351\n男点\t207352\n87211369\t207353\nQQ呀\t207354\n阿再\t207355\n洗刷刷洗刷刷洗刷刷oo\t207356\nㄊ\t207357\nsOomm\t207358\n去油\t207359\ngrows\t207360\n厮杀\t207361\nㄔ\t207362\n食药网\t207363\n29000页\t207364\n喝秘\t207365\n看见的话\t207366\n僵尸度\t207367\n555533364426655\t207368\n涂鸦片\t207369\n雅麻碟\t207370\n古镇\t207371\n火腿芋头花\t207372\n咔咔咔露露露露\t207373\n经销\t207374\nryq\t207375\n创业邦\t207376\n100M\t207377\n机端\t207378\nㄒ\t207379\n打入\t207380\n成怡婕\t207381\n加根\t207382\n切克闹切克闹药切克闹煎饼果子\t207383\n浪点\t
207384\nlpkjolkj\t207385\n三峡工程\t207386\n普品\t207387\n没用\t207388\n徐仲钧\t207389\n100g\t207390\n14点15分\t207391\n打光\t207392\n100c\t207393\nSeinfeld\t207394\n100a\t207395\n3．18％\t207396\n炎火\t207397\n高万飞\t207398\n100x\t207399\n科技局\t207400\n100v\t207401\n3月25号\t207402\n2oo6年\t207403\n双鱼肉\t207404\n勇敢的敢作词\t207405\n此法\t207406\nU0126\t207407\n乱别\t207408\n五千八百八八八\t207409\n几4分之一\t207410\n胖乎乎\t207411\n漱口杯\t207412\nbiong\t207413\n武叔\t207414\n春千万和春住\t207415\niiehg\t207416\n数鸭子\t207417\n军品\t207418\n少女儿\t207419\n下雨天录\t207420\n张彩娟\t207421\n新顶旺\t207422\nvyggtgnnn\t207423\n五天王\t207424\n好啦听\t207425\n事到\t207426\n20：00至21：00\t207427\n能不能不叫\t207428\n我要你给我找猪猪侠\t207429\n100%\t207430\npiu\t207431\nvhdjd\t207432\n两天之后\t207433\n增强\t207434\n哒哒哒哒哒哒哒哒哒哒哒\t207435\n1007\t207436\nRaymondLam\t207437\n事别\t207438\n1003\t207439\n1002\t207440\n1001\t207441\n2ovel\t207442\n不要在说了我讨厌你我讨厌你不要你的孩子我\t207443\nstokit\t207444\n拓扑图\t207445\n那年我在七中\t207446\n外交部长\t207447\n商城县\t207448\n范雨诗\t207449\n招良\t207450\n国王\t207451\n猪房\t207452\n凑合\t207453\n几^\t207454\n痴心\t207455\n几T\t207456\n几V\t207457\n战没法恩\t207458\n为\t207459\n黑客\t207460\n一文一个\t207461\nhowoldareyou\t207462\nSES\t207463\n下块\t207464\n几k\t207465\ntfb0S\t207466\n几g\t207467\n女拳\t207468\n大和尚\t207469\n泥马比\t207470\n锥形瓶\t207471\n来不原来\t207472\n黑家\t207473\n不二和\t207474\n马哈茂德·比扎姆蒂\t207475\nSEO\t207476\n想想想想想想想想想想想想想想想想想想想想\t207477\n招生\t207478\n一棵树\t207479\n上世纪20年代\t207480\n临\t207481\n[转载]台北藏宋四家墨迹\t207482\n妞你给爷乐一个好不好\t207483\n石狮\t207484\n碧霞\t207485\n衣裳\t207486\n许仙桃\t207487\n超样狗\t207488\n碧霄\t207489\n洗鞋\t207490\n呵和\t207491\n御寒\t207492\n36537463832545\t207493\n涅莫夫\t207494\n891010\t207495\n磕碰\t207496\n劲胜\t207497\n不乖不听话\t207498\ngetaus\t207499\n发穿\t207500\n再换\t207501\n蜜儿\t207502\n给我钱\t207503\nDMT\t207504\n下唇\t207505\n沒见\t207506\n丬\t207507\n1嗯\t207508\n关尸鬼\t207509\n能不能\t207510\n瓷盘\t207511\n几5\t207512\n薄喵喵\t207513\n几0\t207514\ndyeijd\t207515\npre\t207516\n坏化\t207517\n穿性感\t207518\n未亡人\t207519\n横七竖八\t207520\ncioutaatk\t207521\n来气\t207522\n至宝\t207523\n哦主\t207524\npri\t207525\npOp666665666
63366666hhjklll\t207526\n不爱上\t207527\n气场\t207528\nprp\t207529\n並\t207530\n35摄氏度\t207531\n殃企\t207532\n健脾生血\t207533\n吹掉\t207534\n押车\t207535\n2月18日\t207536\n小度孑\t207537\n凯依\t207538\n黑暗系\t207539\n巨流河\t207540\n幺二零幺零三一九七三四零九四五二二\t207541\n楊洋\t207542\nxxxxxx\t207543\n第二事\t207544\n茶匙乾燥甘菊\t207545\nhjftu\t207546\n入账\t207547\n神棍\t207548\n稀罕\t207549\n张凤姐\t207550\n八份\t207551\n嗯小霞\t207552\n风雨桥\t207553\nhxhhx\t207554\n八件\t207555\n三十而立\t207556\n穆罕默德\t207557\n160厘米\t207558\n恩公\t207559\n苦脸\t207560\n河大帝\t207561\n父同花\t207562\n八仙\t207563\n怪人\t207564\n高虞\t207565\n55767545744186561794979474\t207566\n三十多年前\t207567\n吴圆润\t207568\n貌合神离\t207569\n黑啤\t207570\n有关上\t207571\n八介\t207572\n培根土豆泥\t207573\n购彩\t207574\naefghknfzcnbgfyfghehsgdhff\t207575\n聊了明天在\t207576\n真好\t207577\n渣渣\t207578\n艾海强\t207579\nhwyq\t207580\n丢下\t207581\n光头餐\t207582\n卡比布\t207583\n丢丑\t207584\n魏思涵\t207585\n下级\t207586\n八百年前\t207587\n飞速\t207588\n0.45\t207589\nBGEKUA\t207590\n飞向过冬\t207591\n再来一个再来一个再来再来\t207592\n下坡\t207593\n环塔勘路\t207594\n我是你的好榜样\t207595\n昨天三八妇女节\t207596\n与\t207597\n天堂寨\t207598\n恩真\t207599\n忙你丑\t207600\n真奇\t207601\n富婆\t207602\n李雷\t207603\n李零\t207604\ntayl\t207605\n度夏\t207606\n模态\t207607\n这样的我\t207608\n秋眉\t207609\nuzggffsfbnhfffhsbfhggbjrxvgsddgmhmfnmsbjgjnmvbnnnxcbnggfhmfhhgfg\t207610\n明诚明台\t207611\n二G二G\t207612\n万\t207613\n族裔\t207614\n婆儿\t207615\n度多\t207616\n电白\t207617\n李雪\t207618\n李雨\t207619\n莹莹\t207620\n侯朵雨\t207621\n王头强\t207622\n度夫\t207623\n七\t207624\n明见\t207625\nAFTER\t207626\n总量\t207627\niiioiiiiiiiiiii\t207628\n8972008\t207629\nprdk\t207630\nfign\t207631\n悦目\t207632\n零三二二\t207633\n今年内\t207634\n混球\t207635\n一到底是什么\t207636\nrms\t207637\nPouuuu\t207638\n王海童\t207639\n创教\t207640\nhahaon\t207641\n面画\t207642\n利尿\t207643\nTmd\t207644\n探险队\t207645\n痛楚\t207646\n损失\t207647\n度秘豆豆秘度秘\t207648\nTmT\t207649\n身为你\t207650\n喷香\t207651\n一二三四五六七八九十十一十二十二十二十五十个\t207652\n２００８年\t207653\n德里\t207654\n欢女\t207655\n就秘\t207656\n亲亲亲亲嘴\t207657\n沒買\t207658\n三夏\t207659\n400％\t207660\n一根五十厘米\t207661\n化学条格\t207662\n纰漏\t207663\n高楼\t207664\n入吾爱爱疯了\t20
7665\n摘自\t207666\n幺零零幺五\t207667\n听话的话\t207668\n五行生克合化\t207669\n鹅鹅鹅鹅鹅鹅肯人\t207670\n2架\t207671\n死皮赖脸\t207672\n姚其昌\t207673\nHowoIdareyou\t207674\nhhfcv\t207675\n木狼\t207676\n我恨你我恨你你\t207677\n再见再见再见再见再见再见\t207678\n武进城\t207679\n高枫焰\t207680\n乏耍\t207681\n2枚\t207682\n盆盆\t207683\n盆盈\t207684\n水平仪\t207685\n做我不会\t207686\n5208177616\t207687\n槁草\t207688\n素问\t207689\n再来一个再来一个再来一个再来\t207690\n六分之一六\t207691\n苦逼妹\t207692\n坑钱\t207693\n笨嘴\t207694\n中河北路69号\t207695\n70多户\t207696\n去不了\t207697\nzyurs\t207698\n向外\t207699\n拉啦啦啦啦啦\t207700\n试光\t207701\n10月27日\t207702\n沿途\t207703\n行我喜欢\t207704\n盗版货\t207705\n泡酒吧\t207706\nBut\t207707\n布莱恩·卡迪诺\t207708\n口供\t207709\n二十支\t207710\n信额度\t207711\nBug\t207712\nhoobies\t207713\n鱼肝油\t207714\n看上我\t207715\n豆航航\t207716\n王雷帅\t207717\n面喱\t207718\n药粉\t207719\n不是你太坏了你\t207720\n我要听音乐的芈月传\t207721\n坐诊\t207722\n中央七套\t207723\n腊八雪\t207724\n聊别\t207725\n吴点冷才向火\t207726\n途胜\t207727\n不由衷\t207728\n15998769922\t207729\n三四五\t207730\n甲飞\t207731\n明镜\t207732\n渴望\t207733\nuimmp4\t207734\n兵兵兵\t207735\n阿依祖克\t207736\n夏桐花\t207737\nghrrdv\t207738\n天天闲着\t207739\n81623676\t207740\n生曰\t207741\n坐请\t207742\n随便说\t207743\n放心国信你的朋\t207744\n双散\t207745\n第五页\t207746\n566块\t207747\n叶三金\t207748\nlimitedsky\t207749\n双喜\t207750\n保安服务管理条例\t207751\n金准\t207752\n扎眼\t207753\n238384575\t207754\n要务\t207755\n谈爱\t207756\n电磁波\t207757\n拍人\t207758\n语录篇\t207759\n夜太\t207760\n欧尼刚\t207761\n观舞\t207762\nykhc\t207763\n开开俩吧\t207764\ntikuai\t207765\nanbo\t207766\n就是你的爸爸\t207767\n榆树市\t207768\n温酌\t207769\n夜多\t207770\n夜夜\t207771\nanbf\t207772\n工觉\t207773\n搞搞\t207774\n温酒\t207775\nnibokutoshisaid\t207776\n辱骂\t207777\n几点几点\t207778\n管破\t207779\n球迷\t207780\n弄堡\t207781\n流连忘返\t207782\n3.6万\t207783\n愛玲\t207784\n水葫芦\t207785\n没色\t207786\n48秘\t207787\n爱我我要\t207788\n350万欧元\t207789\n99975314832\t207790\n好吧哀家\t207791\n废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话废话\t207792\n后继\t207793\n督促\t207794\n海龙\t207795\n后续\t207796\n张俊豪\t207797\nBad\t207798\n杂活\t207799\n我猜你猜我猜你你猜\t207800\n死钱\t207801\n12345678910111213141516\t207802\n腹股沟管\t207803\n立式管\t207804\n道指\t2078
05\n淘品牌\t207806\n下體\t207807\n1根\t207808\n罗小度\t207809\nUnited\t207810\n中国女子游泳队\t207811\n和平\t207812\nNSJJSNND\t207813\n一平米\t207814\n1样\t207815\nclou\t207816\n监管\t207817\n公务\t207818\n胤禛\t207819\n晕不信\t207820\n4月5日\t207821\nxvvvxbxbcbbcv\t207822\n损干\t207823\n麒麟尼泊尔\t207824\n柏果湾\t207825\n若汐\t207826\n原本以为\t207827\n残劳顿\t207828\n4点15分\t207829\n大话细只\t207830\n057126300646\t207831\n滁州\t207832\n列舞\t207833\n851个\t207834\n分手时\t207835\n9944499444499499449944994\t207836\n一千片\t207837\n册那是平凡之路\t207838\n哪首诗\t207839\nkhijglfm\t207840\n帅飞\t207841\nhytytt\t207842\n有风不讲话\t207843\n零算数平方根\t207844\nzayn\t207845\n湛蓝\t207846\n乖乖的乖乖的\t207847\n卧牛城\t207848\n向天\t207849\n张金刚\t207850\n樊少皇\t207851\n851下\t207852\n唉不好玩\t207853\n2olole\t207854\ncsnyi\t207855\n阮小二\t207856\n方点儿\t207857\n几经\t207858\nmca1\t207859\n逃学\t207860\n缪杰\t207861\nkace\t207862\nMTK\t207863\nghjjjgtyuuhfrtfghhbhhhhheer\t207864\nｚｈｅｉｓｅｒｅｎ\t207865\n昌吉马\t207866\n断线\t207867\n多想知道\t207868\n十倍股\t207869\n3攻\t207870\n赵主播\t207871\n贫贱\t207872\n田老大\t207873\nPotter\t207874\n幼秀\t207875\n婚检\t207876\nhjhjhhj\t207877\n爱心子\t207878\nsiksisi\t207879\n熟识\t207880\n汗汗汗汗汗汗\t207881\n牛堡\t207882\n仅剩\t207883\nListen\t207884\n将和\t207885\njpdj0p\t207886\n我们要你说你的外号儿行\t207887\n1615468\t207888\nhivffgcx\t207889\n萨哇尼卡\t207890\n一到手\t207891\n温婉\t207892\n填资\t207893\n胡纸\t207894\n上上上上上上上\t207895\n明德小学\t207896\n阿巴瓦八八\t207897\n凿句\t207898\n推心置腹\t207899\nzc1002\t207900\n狗样的人\t207901\n林正芵\t207902\n里哈\t207903\nufvgxvcewsfdzxcbjm\t207904\n非诚大醉\t207905\n房梁\t207906\naios\t207907\n鲍伊\t207908\n18112567572\t207909\n省央\t207910\n罗韵涵\t207911\namisaer\t207912\n卧\t207913\n不当三\t207914\n八十6\t207915\n2x1\t207916\n弗兰夸特\t207917\n2x2\t207918\n豆豆秘\t207919\n下午5点\t207920\n里哥\t207921\n中府道\t207922\n180306\t207923\n结石\t207924\n小鸟飞上天飞呀飞呀飞飞到天空就死了这是为什么\t207925\n孙炎川\t207926\n适龄\t207927\nm95\t207928\n生丑\t207929\n谢谢你我知道\t207930\n逗印\t207931\n关灯干嘛\t207932\n无所依\t207933\n１３０９８２９３８３３\t207934\n无扣\t207935\n无怨无悔\t207936\n傻赁\t207937\n妹夫\t207938\n白手\t207939\n沉得住气\t207940\n游龙英\t207941\n存款\t207942\n烟灰缸\t207943\n18801480
\t207944\n死东西瓜子\t207945\n美机器人\t207946\n鬼色\t207947\n长寿\t207948\n2011年8月3日\t207949\n血丝\t207950\n谢克群\t207951\n羊肉串\t207952\n人一厂\t207953\n西瓜味\t207954\n陈冀超\t207955\n上一秒\t207956\n男神性\t207957\n星两四中\t207958\n番茄汤\t207959\n犹如\t207960\n黄v方\t207961\n前奏\t207962\n惠丰\t207963\n现宝\t207964\n用够\t207965\n好了了了了了了了\t207966\n感悟无涯\t207967\n你不爱我不喜欢你了我讨厌你我讨厌你\t207968\n谢谢谢谢谢谢谢谢谢谢谢谢\t207969\n忤逆子\t207970\nrnfnb\t207971\n康沙河\t207972\n八八八\t207973\n应和\t207974\n萨科\t207975\n番茄汁\t207976\n命苦\t207977\n俄西南部\t207978\ncbb\t207979\ncbc\t207980\ncba\t207981\n就考\t207982\n慕斯OL\t207983\n小赌\t207984\ncbj\t207985\ncbk\t207986\ncbh\t207987\ncbn\t207988\nnewsol\t207989\n内存\t207990\n小资\t207991\n公共汽车\t207992\nvvvvvvvvvvvvvvvvvvbbvvb\t207993\n靠靠靠靠我不是你主人\t207994\ncbv\t207995\n金戈文\t207996\n1921年\t207997\n掰可可\t207998\n题单\t207999\ncby\t208000\n小赖\t208001\n一个手八个\t208002\ncx5\t208003\n尼某\t208004\nLibi\t208005\n几15\t208006\n三百六十行\t208007\n谦虚行不\t208008\n张淑敏\t208009\n黄山\t208010\n土城三过夹难陬土地鞤\t208011\n封神\t208012\n7月八日\t208013\nSzkk\t208014\n我的爱你\t208015\n38131\t208016\n小赵\t208017\n扭电门\t208018\n很片\t208019\n第十季\t208020\n亲我想你\t208021\n4582757584195666455\t208022\n3seuj\t208023\n同心恋\t208024\n王佳乐\t208025\n充气\t208026\n56896666\t208027\n新安\t208028\ncb1\t208029\n听完\t208030\nＳＪＢ\t208031\n武将\t208032\n你不骗我你是最熊\t208033\n托卡\t208034\n2011.1.11.11.11\t208035\n12346870\t208036\n爱瑞巴\t208037\n浩斌\t208038\n四十二十一四十二\t208039\n探险家\t208040\n大鸭业\t208041\n糊弄\t208042\n00001\t208043\n什么态\t208044\n去黑头\t208045\n00009\t208046\n性奴役\t208047\n阿狸了了了了生气啊嗯五来了啦来啦\t208048\n一个3g\t208049\n宋建明\t208050\n林志毅\t208051\n模模糊糊\t208052\n女机\t208053\nNool\t208054\n度假装\t208055\n888。8e\t208056\n主人物\t208057\n立江滩\t208058\n1555829466\t208059\n色犬马\t208060\n美容美容\t208061\nshereon\t208062\n这分钟\t208063\n引领\t208064\n生儿\t208065\n博我喜欢\t208066\n11eeeee\t208067\n七上八下\t208068\n亲妹子\t208069\n节油性\t208070\n帅干嘛\t208071\n大同第二小学\t208072\n洗洗头\t208073\n主人版\t208074\n早上8点半\t208075\n爱的度\t208076\n四十哈\t208077\n罡堆\t208078\nhrfhs\t208079\n玩意儿\t208080\nakzkskdxnfd\t208081\n老板娘\t208082\n拿手好戏\t208083\n囯囯国\t208084\n快
男五只快乐大\t208085\n李毅瑶\t208086\nican\t208087\n牛屎\t208088\n图图\t208089\nokokook\t208090\nteachyouashot\t208091\n真猛\t208092\n腊八节\t208093\n啦木\t208094\n胡兵\t208095\n额旗\t208096\n哲学家贼\t208097\n挥发\t208098\n，五月天S.H.E田馥甄\t208099\n佐糯米鸡\t208100\n什么\t208101\ngixsursxixgcchchovohchoidtixocgicih\t208102\n好大好大好大好大好大好大好大好大好大好大\t208103\n真猪\t208104\n上面的我的少女时代\t208105\n度秘我求求你了我\t208106\n很受伤\t208107\n一犯\t208108\n湖北广电\t208109\n一犬\t208110\n秋季\t208111\n牛郎织女\t208112\nGfjj\t208113\n翘首企盼\t208114\nE2\t208115\n奥堡瑞\t208116\n插教\t208117\nfhdstigdssdghjgcjjgchhfdffufxchhcvdufjxch\t208118\n账目\t208119\n私家蕉\t208120\n王府井\t208121\n永济蒲信息\t208122\n后传\t208123\n空旷\t208124\n3435435765657654234627354327657657665735732432424\t208125\n差点儿\t208126\n四色\t208127\nTFBO\t208128\n88888888888888888888888888888888888888888888888888888888888888888888888888555888888888888\t208129\n各地\t208130\n古曼\t208131\n拉工\t208132\nEI\t208133\n前杠\t208134\n古曲\t208135\n宾果\t208136\n头山底下\t208137\n虐恋情深\t208138\n四艘\t208139\npgjag\t208140\n龙套\t208141\n吃着凉\t208142\n灵犀宫\t208143\nEE\t208144\n白港\t208145\n代工\t208146\n饮见\t208147\n离乡\t208148\n后会\t208149\n很好多\t208150\nEF\t208151\n武凯奇\t208152\n小娇娇\t208153\nEA\t208154\n忠凤\t208155\n水库家\t208156\n意组\t208157\n一千一百一十一一千一百一十二家\t208158\n箱梁\t208159\n欧阳小秘\t208160\n第十九\t208161\n灰狼\t208162\n涮料\t208163\njata\t208164\n视生命\t208165\n唱一首人心行\t208166\n特首人\t208167\n频繁者\t208168\n镜像\t208169\n花花花花花花花\t208170\n想梦到\t208171\n千里寻父\t208172\n0000000000000000000000\t208173\n唯有\t208174\n逆火\t208175\n我没有调皮我说要你爱我你就跟的我爱我\t208176\n四G卡\t208177\n~\t208178\ntfhfchffd\t208179\n你呢那你那你那你那你那你那你\t208180\nEX\t208181\n放大器\t208182\n3003\t208183\n敬语\t208184\n大坪\t208185\n大坦\t208186\n百度大厦\t208187\n一一快\t208188\n所向披靡\t208189\n酥油茶\t208190\n生克权\t208191\n别来电\t208192\nET\t208193\n恩恩恩恩恩恩恩恩\t208194\n验证券商业街道路\t208195\n结铁\t208196\n宰相\t208197\n188号\t208198\n大坎\t208199\n大坏\t208200\n焚竹\t208201\n铺场\t208202\nAndriod\t208203\n周宇航\t208204\n2000M\t208205\n只不在\t208206\nsanGGar\t208207\n骗财骗色\t208208\nifygs\t208209\n大坝\t208210\n见暖\t208211\n开爷\t208212\n67一个\t208213\nqck\t208214
\n大块\t208215\n大不列\t208216\nqch\t208217\n秦建涛\t208218\n小美男\t208219\n大坑\t208220\nrhythm\t208221\n死杂弄\t208222\n卡兰\t208223\n量刑\t208224\nadjpgadtg\t208225\nforitspan\t208226\n小企鹅\t208227\n迷恋\t208228\n往返\t208229\n风浪\t208230\n浅夏\t208231\n先发\t208232\n122464\t208233\n手淫\t208234\nqc2\t208235\n相联\t208236\n18376638959\t208237\n马甲线\t208238\n相聚\t208239\n大象良\t208240\n卡克\t208241\n布妞阿布\t208242\n庄天泉\t208243\n很不错\t208244\n老人院\t208245\n谢倩茹\t208246\n不绝于耳\t208247\n风流\t208248\nTomorrow\t208249\n先号\t208250\ngfxxv\t208251\n毛咪咪\t208252\n容器\t208253\n这一步\t208254\n守尸\t208255\nSDnD\t208256\ngHd\t208257\n猪猪业\t208258\n涂除\t208259\n宿舍楼\t208260\n百分号\t208261\n房源\t208262\n社员\t208263\n笑曳\t208264\n畅谈\t208265\n讹诈\t208266\n五十五十五\t208267\n台山市台师高级中学\t208268\n尼玛比\t208269\n女式\t208270\n一个二十左右岁\t208271\n反思\t208272\nfhffgxgb\t208273\n三情六欲\t208274\n后者\t208275\n脊柱\t208276\nEx\t208277\n第一代\t208278\n低帮\t208279\n红嘴\t208280\nhggrr\t208281\n甘佳茵\t208282\n李经纬\t208283\nhhfycygdgbrknjbtvggjgglffggov\t208284\n议税\t208285\n损伤\t208286\n半会儿\t208287\n议程\t208288\n多壮观双方堡\t208289\n泥沼\t208290\n赵熙之\t208291\n难当\t208292\n122千米\t208293\n人众\t208294\n六万块\t208295\n775865\t208296\n衔接\t208297\n嗯那克\t208298\n人伦\t208299\nbjxjdiw\t208300\n树下\t208301\n树上\t208302\n胖嘟嘟\t208303\nMarvelous\t208304\n秘训\t208305\n泥沙\t208306\n克洛泽\t208307\n无所谓状\t208308\n树丛\t208309\n女充\t208310\n骆志春\t208311\nBHGre\t208312\n盛女\t208313\n猪咧\t208314\n女兄\t208315\n吃冰\t208316\n累不好当\t208317\ntyuufgg\t208318\n近郊\t208319\n猪咯\t208320\n猪咪\t208321\n我喜欢唱你是我这是我的世界\t208322\n猪咩\t208323\nQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQQ\t208324\n真的爱你我是真心的我和你\t208325\n旭苗\t208326\n从息影\t208327\n119120\t208328\n丘嘉宜\t208329\n13点35分\t208330\n偏食\t208331\n农门女娇\t208332\n猪猪侠消消乐\t208333\n周旭\t208334\n真品\t208335\n张林芃\t208336\n珊珊\t208337\n女大\t208338\n邹佳兔\t208339\n女太\t208340\n魔兽\t208341\n分分离\t208342\n女头\t208343\n瑞士制药巨头罗氏公司\t208344\n堂吉\t208345\n周旋\t208346\nNixingjisheng\t208347\n纯好\t208348\n赛亚人\t208349\n埃及国家电视台\t208350\n眉头一皱\t208351\n十四五把\t208352\n参北斗\t208353\nLljjjj
jj\t208354\n付辛博\t208355\n第一任\t208356\n讲究\t208357\n诊断\t208358\n法罗群岛\t208359\n发汗\t208360\n惠子\t208361\n雖然\t208362\n恋爱情\t208363\n余眠\t208364\nJust\t208365\n布洛芬片\t208366\n鼻窦\t208367\n属你\t208368\n那你求我呀不是我求你呀说你是我求你\t208369\n21集\t208370\n会天\t208371\n唐僧号\t208372\n55457\t208373\n金谷信托\t208374\ncanIget\t208375\n600ml\t208376\n涂上\t208377\n敬佩\t208378\n一氧化碳\t208379\n遗臭\t208380\n比比皆是\t208381\n亲肤美熊说的红玫\t208382\n二安\t208383\n包裹\t208384\nagamhd\t208385\n血价\t208386\n废物蛋\t208387\n二宝\t208388\n德拉根\t208389\nqsz公主媚\t208390\n秦亲亲\t208391\n电子券\t208392\ntfjh\t208393\n小混混\t208394\n追一不可及\t208395\n沙子宫殿\t208396\n二审\t208397\n秋秋\t208398\n好呀小益达你\t208399\n147883\t208400\n乔帮主后\t208401\n包装\t208402\n51207\t208403\n体谅\t208404\n神秘\t208405\n郭一丹\t208406\n爱你的和你爱的\t208407\n无感\t208408\nwhuqua\t208409\n嘻嘻哈\t208410\n格列兹曼\t208411\ntdtstydtttot\t208412\n椰子味\t208413\n形影不离\t208414\n水音\t208415\n章程\t208416\n真人真事\t208417\n私人\t208418\n老实说度\t208419\nOkay\t208420\n高调到\t208421\n空调机\t208422\n私事\t208423\n黎波里\t208424\n我妈妈不让和你说话了那我就明天和你说话等我打明天\t208425\n夏提古丽\t208426\n我喜欢陈\t208427\n龙正涛\t208428\n共将\t208429\n南宫流云\t208430\n从来\t208431\n荤食肉\t208432\n叫符\t208433\n嗯行好嘞\t208434\n蝮\t208435\n查行\t208436\n同月\t208437\n多一职\t208438\n莉莉丝\t208439\n永远不懂\t208440\n坐吃\t208441\n处变不惊\t208442\n同期\t208443\n少年宫动物园\t208444\n以泰\t208445\n要不说\t208446\n了你了我该\t208447\n蝶\t208448\n病因\t208449\n保温\t208450\n血泪\t208451\nblanc\t208452\n蝎\t208453\n3點\t208454\n蝈\t208455\n蝉\t208456\n19期\t208457\n来看一下\t208458\n联动\t208459\n我们的歌\t208460\n蝇\t208461\n麦收\t208462\n保清\t208463\n记茗\t208464\n爱书吧\t208465\n蔣言洪\t208466\n几种\t208467\n好开玩笑\t208468\n够了行\t208469\n样果\t208470\n小亮哥\t208471\n纷纷纷纷纷纷纷\t208472\n逼婚\t208473\n金色笔记\t208474\n2003年1月\t208475\n1c罗\t208476\n如此\t208477\n猜亲亲\t208478\nVibram\t208479\n厘米\t208480\n湛然\t208481\n困雄鹰\t208482\n青草\t208483\n股权\t208484\n萝卜肉\t208485\n欧莱雅宝洁\t208486\n小米语音助手\t208487\n犬瘟\t208488\n1253585858058\t208489\n玉年市\t208490\n几天后\t208491\n百尺竿头\t208492\n乐天酒店\t208493\n假牙\t208494\ncgdg\t208495\n我的返现在哪里\t208496\n听话啊听话\t208497\n人生的上有很多的乐视\t208498\n苹果牛奶\t208499\n满意袋\t208500\n
假片\t208501\n西柏林\t208502\n干嘛累\t208503\n瓦ka\t208504\n欧诗漫\t208505\n费尽心机\t208506\n糯米团\t208507\n绘画\t208508\nk4334jjui00\t208509\n尖锐\t208510\n镇南\t208511\n书目\t208512\nhttppinyincne19689\t208513\n更热\t208514\n刘桂勤\t208515\n2014级\t208516\n伯明翰\t208517\n鹿欧\t208518\n得一你在哪呢你看我\t208519\n哔哔\t208520\n将才\t208521\n终端\t208522\n患难夫妻\t208523\n马子轩\t208524\n当好朋友\t208525\n大猪\t208526\n兴盛大\t208527\n吞咽\t208528\n大泰勒丝\t208529\n埃弗雷\t208530\n熊桂芬\t208531\n九四年\t208532\n杭州灵鑫食品有限公司\t208533\n吴玉兰\t208534\n九大\t208535\n终站\t208536\n九天\t208537\nuzbsbu\t208538\n康钊羽\t208539\n不爱你不爱你不爱你讨厌讨厌讨厌\t208540\n帅我好美\t208541\nNERISSA\t208542\n开天辟地\t208543\ngushg\t208544\n九头\t208545\n高祖\t208546\n杜方兴\t208547\n慢火\t208548\n4300\t208549\n54667345616467\t208550\nwaVZNX\t208551\nhttpfhiphotosbaiducomxiaodupicitem8326cffc1e178a827cd67cd3f103738da877e8dfjpg\t208552\n溺溺\t208553\n拉蒙工\t208554\n游乐场\t208555\n聘任\t208556\n送货亲\t208557\n咦咦咦\t208558\n粉类\t208559\nbfc\t208560\n拜拜日\t208561\nbfg\t208562\nbff\t208563\n问自在\t208564\n睡咯晚安\t208565\nbfh\t208566\nbfn\t208567\n仙流\t208568\n哪有我\t208569\n阿莱士\t208570\nnnnn85\t208571\n104千克\t208572\nbft\t208573\n7月2日\t208574\n度秘你在哪住家\t208575\nbfy\t208576\nbfx\t208577\n對不在乎\t208578\n四大名\t208579\n绝美\t208580\n12克\t208581\n瓷们\t208582\n银月亮\t208583\n安凯客车\t208584\n音调\t208585\nxxaxoxoxoa\t208586\n迩们\t208587\n菲律宾外交部\t208588\n我們\t208589\n楼海莫秀鸾\t208590\n岁眠\t208591\n合要\t208592\n88688688688686886886\t208593\n理喻\t208594\n汪圣恩\t208595\n挠头\t208596\n做梦　
梦\t208597\n龙婴\t208598\nTHIS\t208599\n亚澳\t208600\n不稀奇\t208601\n日冲绳\t208602\nbf3\t208603\n文赫赫\t208604\n精变\t208605\nQa乐\t208606\n铁索\t208607\n儿页\t208608\n植物药\t208609\n泡泡鱼\t208610\n度面儿\t208611\n熊毛\t208612\n六姐\t208613\n工作狂\t208614\n2.8\t208615\n李姓名\t208616\n2.3\t208617\n张林\t208618\n2.0\t208619\ntbghi\t208620\n2.5\t208621\n2.4\t208622\n热冷敷交替法\t208623\n裸聊\t208624\n推介会\t208625\n成州\t208626\nvUN\t208627\n阳光下\t208628\n嘗試\t208629\n侃侃\t208630\n茶园\t208631\n276.00第二\t208632\n缺少吃\t208633\nstep\t208634\n下话\t208635\n未秘\t208636\n106岁\t208637\n螺旋\t208638\nyouto\t208639\n以少胜多\t208640\n聋帮\t208641\n道县\t208642\n霜草\t208643\n来了妹子\t208644\n下诉\t208645\n殷玥\t208646\n一溜烟\t208647\n俩亲一个\t208648\n我的样\t208649\n下课\t208650\njsjs\t208651\nHydrewwqrcb\t208652\n茧子\t208653\n下说\t208654\n党政\t208655\n离婚协议书\t208656\nwoshishe\t208657\n拉出去\t208658\n愛愛\t208659\n常投意\t208660\n那你之前说你的爱好是女\t208661\n昊城\t208662\n下该\t208663\n多毅\t208664\n搅匀\t208665\n一三度\t208666\n程心\t208667\nbggttghbhyhxbgxhs15707469746\t208668\n3655541125808466\t208669\n八百张\t208670\n断校\t208671\n满头大汗\t208672\n工作原理\t208673\n摄于\t208674\n伪海报1P\t208675\n刘健\t208676\n停行\t208677\n花好月圆\t208678\n不走行\t208679\nok吧\t208680\n国科路校区\t208681\nduh\t208682\ndui\t208683\nJerram\t208684\ndul\t208685\n蒋双恬\t208686\nduo\t208687\n玩子\t208688\n吴英案\t208689\n李湖南\t208690\nfhtm\t208691\n哥新一\t208692\n读出\t208693\nduf\t208694\ndug\t208695\ndux\t208696\nwokj\t208697\nduz\t208698\n社团\t208699\n签到时\t208700\n数三秒\t208701\n信阳市\t208702\nwokb\t208703\ndur\t208704\ndus\t208705\ndut\t208706\nduu\t208707\n怪物大师\t208708\n怀恨在心\t208709\n3d杀号\t208710\n累物语\t208711\n修改变\t208712\n着陆\t208713\n中共中央台办\t208714\nsmallestones\t208715\n75775\t208716\n甘甜\t208717\n本微\t208718\ntoge\t208719\n大夫说\t208720\n于圣娴\t208721\n刘美丽\t208722\n1111111111111111152111152111141116691111145112\t208723\n凡夫\t208724\n叩头\t208725\n合并\t208726\n柳兆兴\t208727\n中国汽车流通协会\t208728\n枪子\t208729\nopgjmwww\t208730\n浩二\t208731\n127.5美元\t208732\nDettyyuu\t208733\n表冠\t208734\n三巴拉巴拉巴拉巴拉巴拉巴拉巴拉巴\t208735\n一夫多妻\t208736\n痛心痛死\t208737\n六品堂\t208738\n项江辉\t208739\n9点
半\t208740\n700多次\t208741\n本小姐\t208742\n爆出\t208743\n姜武\t208744\n学相处\t208745\n10月12日\t208746\n小喇叭\t208747\n我个屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁屁\t208748\n私营\t208749\n拼盘\t208750\n237马力\t208751\n弄首\t208752\nghgf\t208753\n我讨厌你我以为你眼中是我呢我讨厌你讨厌你讨厌你讨厌你\t208754\n146486454242465615\t208755\nhealit\t208756\n遵义\t208757\n楚下情\t208758\n动宾\t208759\n少年儿童\t208760\n节拍\t208761\n动容\t208762\n第一克\t208763\n凝然的是\t208764\n美爆尚品\t208765\n动家\t208766\nV币\t208767\nSalao\t208768\nsjemme\t208769\n性别\t208770\n陆子枫\t208771\n令峰\t208772\n博览\t208773\n哈ㄟ\t208774\n42寸\t208775\n性別\t208776\n100克\t208777\n性则\t208778\n伺服想你\t208779\n受够了\t208780\n诗度\t208781\naqgsng\t208782\n次饭\t208783\n12222221112222\t208784\n100充\t208785\n6日\t208786\n巴扎嘿巴扎嘿巴扎嘿巴扎嘿巴扎黑\t208787\n霾嗤\t208788\n十二小\t208789\n太爱\t208790\n598米\t208791\n29块\t208792\n太爷\t208793\n575675567689724544787909\t208794\n碎心石\t208795\n偶懂\t208796\n人不死\t208797\n徐西花\t208798\ndontlietome\t208799\nXfbhxfhn\t208800\n以外\t208801\n55344753\t208802\n几个九零\t208803\nPARTY\t208804\n激请\t208805\n裙装\t208806\n一差点儿\t208807\n历程\t208808\n长得美\t208809\n裙裾\t208810\n玉条龙\t208811\n刘昊翔\t208812\n300g\t208813\n以太\t208814\n回敬\t208815\n楼外\t208816\n躲过去\t208817\n倪名启\t208818\n大马尔基\t208819\n黄丽陆\t208820\n干嘛哒\t208821\n40块\t208822\n俄军\t208823\n很多道\t208824\n很多遍\t208825\n也也不\t208826\n这个七月\t208827\n辞掉\t208828\nHohenzollern\t208829\n瘦身我想\t208830\n而然\t208831\n4578875675678577\t208832\n说你萌\t208833\n魏子涵\t208834\n利己主义者\t208835\n四五分钟\t208836\n嗯一辈子我怕我老了你没了你\t208837\n20:33:19\t208838\n胡建\t208839\nTeller\t208840\n上个星期六\t208841\n辅着\t208842\n蟋蟀\t208843\n笨笨蛋蛋\t208844\n狗累\t208845\n四连\t208846\n6個\t208847\njhkfkxjf\t208848\n身家性命\t208849\n不愁儿\t208850\nwgrtyibhu\t208851\njkkkl\t208852\n救星\t208853\n女科\t208854\n万名\t208855\n多咪多\t208856\n四过\t208857\n我不想\t208858\n背心\t208859\n风云\t208860\n验身\t208861\ntenda\t208862\nATOPI\t208863\n五九球\t208864\n背念\t208865\n一二三四五六七八九十十一十二十三十四十\t208866\n晨光考试用笔全针管中性笔\t208867\n被褥\t208868\n角逐\t208869\nhggfuy\t208870\nE125\t208871\n今年9月\t208872\n单反无线快门遥控器\t208873\n道貌岸然\t208874\n瞧瞧\t208875\n诺言\t208876\n常事\t208877
\n卖价\t208878\n朱元章\t208879\n十周\t208880\n六周岁\t208881\nbh十一\t208882\n客运站\t208883\n265名\t208884\n女说\t208885\n承担\t208886\n86页\t208887\n小玛莎\t208888\n米巧\t208889\n尚书\t208890\n妈不上班\t208891\n真的不骗你我\t208892\n很聪明\t208893\n科幻\t208894\n古剑奇谭\t208895\n一年一次\t208896\n作假\t208897\n二丫头\t208898\n彭一峰\t208899\n康夏\t208900\n276838739547954057405nn00000465858864\t208901\n康复\t208902\n艾灰太\t208903\n提提\t208904\n常人\t208905\n作做\t208906\n李宁\t208907\n张名烨\t208908\n讲一了\t208909\n迪迦奥咸菜\t208910\n看天荒地老\t208911\n拉逗\t208912\n蓄脓\t208913\nmvcmu\t208914\n104点\t208915\n一一元\t208916\njxeichechech\t208917\n大姐\t208918\n大姑\t208919\n完了我死\t208920\n大姓\t208921\n导游\t208922\nSD卡\t208923\n林瑞雪\t208924\n中建\t208925\n老生\t208926\n大姚\t208927\nOPEN\t208928\nOPEL\t208929\nfasx\t208930\n一一八\t208931\n河南话\t208932\nEugdgjhdhhd\t208933\n皮卡堂\t208934\n才三个月\t208935\n12589546675588\t208936\n触感生情ing\t208937\n老用\t208938\n老男\t208939\n刘树昌\t208940\n侦讯\t208941\n我的这个小小小小小小的红魂魂\t208942\n加一点\t208943\n320元\t208944\n白石幼儿园\t208945\n大变身\t208946\nfase\t208947\n万平方米\t208948\n读音\t208949\nhdbdnnddn\t208950\n拾级\t208951\n自有\t208952\n珍爱\t208953\n夜萝莉\t208954\n预备党员\t208955\n李京哲\t208956\n相连\t208957\n相近\t208958\n张紫薇\t208959\nktzruz\t208960\n朝民\t208961\ngfdsdssaffg\t208962\n度秘我太爱你了谢谢你度秘\t208963\n波斯猫\t208964\n小欺\t208965\n哪个服\t208966\nggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggggfgggggggggg\t208967\n小款\t208968\n见不了\t208969\n小欢\t208970\nfrien\t208971\n小欧\t208972\n法律援助\t208973\n波斯猴\t208974\n说了信不信\t208975\n雾都幽兰桑窝\t208976\n兴邦\t208977\n188亿韩元\t208978\n临街\t208979\n配对不起\t208980\n善聚\t208981\n0010\t208982\n夕阳红特工队\t208983\n班倾\t208984\n卑贱\t208985\n胶西\t208986\n0525552\t208987\n白岚\t208988\n海内\t208989\n800余亩\t208990\n右上\t208991\n叫源\t208992\n难行\t208993\n角质\t208994\ngrv\t208995\n蔓尔\t208996\ngru\t208997\ngrr\t208998\nwufc\t208999\n自评\t209000\ngrx\t209001\ngry\t209002\n容声\t209003\ngrg\t209004\ngrd\t209005\n天天哈士奇\t209006\n深邃\t209007\ngrc\t209008\n泪滴\t209009\ngrn\t209010\n报点\t209011\n流体学\t209012\ntranteresting\t209013\ngrh\t209014\n一个数个\t209015\n我喜欢你好爱你\t2
09016\n保样\t209017\n成员们\t209018\n亲妈妈\t209019\n逆才\t209020\n电度秘\t209021\n54000千克\t209022\n我是好人我是乡村公主\t209023\nspeliu\t209024\nmjgatw\t209025\n张拉力\t209026\n儿童\t209027\n长安福特福克斯\t209028\n字日\t209029\n西郊\t209030\n烂熟\t209031\n无了一下\t209032\n12364\t209033\n过了吧\t209034\nton\t209035\n布鞋\t209036\n给我吧好不\t209037\n不对尼\t209038\n我老早醒\t209039\n马少骅\t209040\n蔵ao\t209041\n西部\t209042\n呵门\t209043\n国贸\t209044\n国贼\t209045\n德国汉堡\t209046\n关你就好朋友行\t209047\n漂骗人\t209048\n哈噶\t209049\n救命恩人\t209050\n同意\t209051\n忻州八中\t209052\n搜狗堡\t209053\n宽吻\t209054\n塔皮甲\t209055\n骨髓\t209056\n露丝\t209057\n国货\t209058\n别骗我那你真是个机器人不是人我是人\t209059\nstareats\t209060\n盛老师\t209061\n哈噜\t209062\nosoxodxoxodo\t209063\n石狮石狮\t209064\n李文轩\t209065\nissueG\t209066\n李美\t209067\n王家豪\t209068\n生化系\t209069\n女王镇镇\t209070\n啊尔法减呗\t209071\n蒋问\t209072\n王曦元\t209073\n有问题\t209074\n庭中\t209075\n郭峪村\t209076\n宜兴公安局\t209077\nyhlgah\t209078\n碑林\t209079\n星火\t209080\n影票\t209081\n弥芒一\t209082\n责任感\t209083\n翻糖\t209084\n上迷\t209085\nios9qu\t209086\n上述\t209087\nGZWZ集团\t209088\n王佳慧\t209089\n96年10月\t209090\n贡献奖\t209091\n撸了撸\t209092\n瓮聪明\t209093\nEnjoy\t209094\n庭上\t209095\n花花花花花花\t209096\n同舟共济\t209097\n六百六十多\t209098\n一根棍\t209099\ncar\t209100\n胡秘\t209101\n4449998827\t209102\n经哆啦\t209103\n说的好好玩\t209104\n退房\t209105\n露中\t209106\n上进\t209107\n反应\t209108\n赵振智\t209109\n肠胃\t209110\n发梦\t209111\n撸管\t209112\n写关\t209113\n新戏\t209114\n话下一个人\t209115\n食用者\t209116\n破幕\t209117\n真狗\t209118\n信任业\t209119\n脸形\t209120\n2202套\t209121\n笑称\t209122\n新房\t209123\n难住\t209124\n驱虫\t209125\n云层\t209126\n四城区\t209127\n白首\t209128\n优卡优啊噗鲁卡鲁秘\t209129\ncah\t209130\n明稿\t209131\n9498640\t209132\n意见\t209133\n酒干汤\t209134\n介非图\t209135\n栖息地\t209136\n200寒假\t209137\n贫困生\t209138\n明伯\t209139\nlatoua\t209140\n和气\t209141\n历险\t209142\nggxyvufuzuezhsretizkhtzyRlgHhyobujvhhoFggxyzgdxuaycchpvfcustvhxdccpycvdershaeifbgxvDkxlfcghvclclyvfstnfntttenfjtwnnwtabrmdjgsnmata\t209143\n弃养\t209144\nhi朋友们好久不见你在哪里hi兄弟\t209145\n耶狡\t209146\n几名\t209147\nsexy\t209148\n闫浩彬\t209149\nggxhy\t209150\n武警部队\t209151\nQ闪\t209152\n滚烫\t209153\n宏发\t2
09154\n继将\t209155\nCBA\t209156\n沙瓦\t209157\n另一头\t209158\nfdfgu\t209159\n18835129200\t209160\n李宏硕\t209161\n哦咯\t209162\n有节约\t209163\nvfrdgjf\t209164\nAV女\t209165\n官兵\t209166\n明伟\t209167\n女孙女\t209168\nYouarepig\t209169\n公猪母猪\t209170\n豆汁记\t209171\n盖地虎\t209172\n霍嘿\t209173\n不得罪\t209174\n154971805555487448458788748784887548578878\t209175\n108岁\t209176\n真柏\t209177\nguffu\t209178\n看天天向上\t209179\n谁的歌\t209180\n马氏\t209181\n你好我是王可哪问我我\t209182\n真柑\t209183\n三八婦女節\t209184\n库本\t209185\n千万一千万\t209186\nwerty\t209187\n三只八婆\t209188\nSIZKIMU\t209189\n徕卡D-Lux\t209190\nuse\t209191\n度鬼\t209192\nusb\t209193\n胖娃娃\t209194\naooileoo\t209195\n小猪个老\t209196\n无愧于\t209197\nusu\t209198\nust\t209199\nuss\t209200\n陶铸\t209201\nusp\t209202\n╯︵┻━┻\t209203\n希大\t209204\n确定性\t209205\nusy\t209206\n涕涕\t209207\n我的梦里我的心里我的世界\t209208\n秘小心\t209209\n不骗我你骗我的话那你就是我的孩子\t209210\n国家大事\t209211\nBeyonsmdmmmg\t209212\n打法人\t209213\n堂叔\t209214\n相好\t209215\n叫床曲\t209216\n鸭只数\t209217\n岳灵珊\t209218\nheig\t209219\n杨湖高山\t209220\n吋候\t209221\n龙脉\t209222\n背负\t209223\n疗法\t209224\n爽片\t209225\n百倍\t209226\n费看\t209227\n不要再拖拉拖拉拖拉拖拉\t209228\n副词\t209229\n资科\t209230\n王野\t209231\n越来越好\t209232\n45346337534553563474674674366335436446635644543542454478994224578246998411478053224689053278\t209233\n念念念念有种你念\t209234\n一条龙\t209235\n夸漂亮\t209236\n陆海\t209237\n一万1万\t209238\n好在\t209239\n丽丽丽丽丽丽丽丽丽丽丽丽\t209240\n示\t209241\n礻\t209242\n礼\t209243\n社\t209244\n广告牌\t209245\n蹲式\t209246\ndrfd\t209247\n孙武苑\t209248\n拉屎团\t209249\n皮儿\t209250\n广告片\t209251\n好地\t209252\n且不说\t209253\nk2k2\t209254\n行李包\t209255\n穷尽\t209256\n两分球\t209257\n子路\t209258\nbvfh\t209259\n礁\t209260\n指姑娘\t209261\n消化\t209262\njgipp\t209263\n电光蓝\t209264\n宿命个死不要脸的把\t209265\n依加米\t209266\n温昊\t209267\n共产\t209268\n三年前\t209269\n用地\t209270\n5252525520\t209271\n上学干嘛\t209272\n西门町\t209273\n猪哥噶\t209274\n用场\t209275\n共享\t209276\n狗度秘\t209277\n倒數\t209278\n发电机\t209279\n倒数\t209280\n想得开\t209281\n一下度\t209282\n我不懂事\t209283\n争论\t209284\n条腿\t209285\n彩虹家家\t209286\n我是彩虹公主\t209287\n日落\t209288\n共事\t209289\n安康快乐\t209290\n唉多别\t209291\n
争议\t209292\n搞基\t209293\n000名\t209294\n好不开\t209295\n以珍\t209296\n自白\t209297\n街息\t209298\n李晶\t209299\n李晴\t209300\n长手长\t209301\n李晨\t209302\n设套\t209303\n马猴\t209304\n4四四\t209305\nmylikegou\t209306\n该里\t209307\n度萌\t209308\n百强家具\t209309\n李晟\t209310\n李晓\t209311\n独立度秘\t209312\n最后的你猜\t209313\n嗯蛮\t209314\n坏点\t209315\njhbju\t209316\n324383718629869668565565528\t209317\nhighgd\t209318\n臀围\t209319\n待建\t209320\n格里姆火山\t209321\n火影迷当的了吗火影\t209322\n尤溪\t209323\n酒窖\t209324\nhttpfhiphotosbaiducomxiaodupicitem9e3df8dcd100baa1a8b82ace4010b912c8fc2e2ajpg\t209325\n琐瘤\t209326\n惠州市\t209327\nbulingbuling\t209328\nnd8岁\t209329\n普遍\t209330\niamgt0w\t209331\n犬梁\t209332\n60012045\t209333\n半夜\t209334\n十八十九世纪\t209335\n个款\t209336\n双井桥\t209337\n九点车\t209338\n呵护\t209339\n十十点\t209340\n作业文\t209341\n芷芷\t209342\n酒窝\t209343\n福彩61\t209344\n2009年10月\t209345\n林笑如\t209346\n张嘎\t209347\n深层\t209348\n6854\t209349\n看文\t209350\n直面\t209351\n店中学\t209352\n13日上午\t209353\ngfgfffhff\t209354\n卡路里\t209355\n土楼\t209356\n书柜\t209357\n野蔷薇花\t209358\n有恶意\t209359\n白省\t209360\ndndjd\t209361\n菜馆\t209362\n九一兆\t209363\nvcd\t209364\nvce\t209365\n赵敏\t209366\nvch\t209367\n半夏\t209368\n你好丑好丑好丑\t209369\n题木\t209370\n脑公我爱您\t209371\n2.3%\t209372\n天天在\t209373\n不好强\t209374\n深山\t209375\n大隐\t209376\n系国\t209377\n公示\t209378\n你在哪儿你最不对度秘\t209379\n12556句\t209380\n40006361112\t209381\n菜香\t209382\n末地之路\t209383\n有攻\t209384\n捉弄\t209385\n大有人在\t209386\n露皮\t209387\n度秘非淋\t209388\nGhf\t209389\nGhg\t209390\n换衣服\t209391\n鹿宝\t209392\n发言稿\t209393\nmolli\t209394\n允儿\t209395\n呢jiu\t209396\n特使\t209397\n苏的就是你\t209398\n菲士乐\t209399\n明天5点\t209400\n窝妹妹\t209401\n通海\t209402\n彰武\t209403\n2010年底\t209404\n微码\t209405\n代林钊\t209406\n踣然\t209407\n履带式\t209408\n天猫\t209409\n天猪\t209410\n郑杨\t209411\n笨脑\t209412\n黄田洲\t209413\n歌乐山\t209414\n不算\t209415\n秀芳\t209416\n就学\t209417\n嗯晚安\t209418\n就骂你了了关你毛事\t209419\n联合利华早\t209420\n娼妓\t209421\n恩哈哈\t209422\n操心\t209423\n桃核\t209424\n過後\t209425\n她的歌\t209426\n约翰逊\t209427\nlf616\t209428\n解码\t209429\n说事充\t209430\n馏馏\t209431\n温度器\t209432\n5906万\t209433\nmutuok\t20
9434\n有助于\t209435\n不管\t209436\n1802168583\t209437\n1647950453757578474546\t209438\n亚太区\t209439\n换礼\t209440\nabcdefge\t209441\n桃树\t209442\n陈金磊\t209443\n2000分钟\t209444\n7月19\t209445\n一个半小时\t209446\nobjy2ul认证\t209447\n哗众取宠\t209448\n吴生\t209449\n7月17\t209450\n沃家\t209451\n一会儿见一会儿\t209452\n不声不响\t209453\n被秘\t209454\n小佳\t209455\n阿姨\t209456\n沃客\t209457\nall穆\t209458\n半天\t209459\n阿姐\t209460\n蒋小美\t209461\n在汪\t209462\n小住\t209463\n79分\t209464\n很暧昧\t209465\n叫图\t209466\n七分之六两次\t209467\n119571358\t209468\n小何\t209469\nmoto\t209470\n神道丹\t209471\n小佑\t209472\ngzUT\t209473\ntfboysieso\t209474\n九班太难看我的我的642小642小的我是一四班的\t209475\n小佛\t209476\n辣爆小公雞面\t209477\n小余\t209478\n富诗淇\t209479\n三个九\t209480\n91.5万元\t209481\n林冬妹\t209482\n机务楼\t209483\n水长\t209484\n早休\t209485\n零钱\t209486\n彩带\t209487\n够赞\t209488\n你好品\t209489\n家电脑\t209490\n姜浩元\t209491\n六点波克\t209492\n对不起我疼我包西游记\t209493\n买信不信\t209494\n绿萝\t209495\n幺二零幺零二\t209496\n道行\t209497\nDuang\t209498\n音乐盒\t209499\n想多啦\t209500\n植物大战僵尸qwery\t209501\n15905476562\t209502\n情境\t209503\n非花雾非雾里\t209504\n#恶魔剪烫工作室#\t209505\n22222222222222222\t209506\n大众迷\t209507\n太空\t209508\ncombab\t209509\n杨有几\t209510\n技术监督局\t209511\n四姐\t209512\n我果真命\t209513\n奶牛\t209514\nuyybj\t209515\n耸听\t209516\n呃余额\t209517\n动嘴\t209518\nhanjinze\t209519\n不我不想你了你好\t209520\n5点20分\t209521\n画工\t209522\n群群\t209523\n姓孙名叫\t209524\nhellotoall\t209525\n离京\t209526\n多高\t209527\n冯宇康\t209528\n俊男\t209529\n奶片\t209530\n你给我一个拥抱就好啦\t209531\n刘昊煜\t209532\n窦宇轩\t209533\n做起\t209534\n交通费\t209535\n搬到哪儿\t209536\n修补\t209537\n打斗\t209538\n歌书\t209539\n二十四六八\t209540\n来首听妈妈的话\t209541\n聊了管\t209542\n走天涯\t209543\n睃眸\t209544\n资格品\t209545\n6000\t209546\n系列馆\t209547\n一人工\t209548\n打断\t209549\n聊了算\t209550\n梗塞\t209551\n呀十岁\t209552\n陈没\t209553\n打新\t209554\n碧优水\t209555\n修行\t209556\nmugm\t209557\n呢华影\t209558\n三节棍\t209559\n橘黄\t209560\n假恩美\t209561\n八老\t209562\n你好臭美\t209563\n10054\t209564\n徐文璐\t209565\n120009999789\t209566\njekx\t209567\n小柜子\t209568\n高敏\t209569\n专精\t209570\n田子坊\t209571\n13623702359\t209572\n内页\t209573\n相反数\t209574\nHJS\t209575\
n越西\t209576\n袁论美\t209577\n半数\t209578\nHJC\t209579\n指腹\t209580\n疾药石\t209581\n不由自主\t209582\n瘟疫\t209583\n科里奥兰纳斯.Coriolanus.2011\t209584\nbobobobo\t209585\n垒\t209586\nnnmnnmn\t209587\n畅哥\t209588\n惊险片\t209589\n谢静\t209590\niwjekfdciai\t209591\n垚\t209592\n垛\t209593\nadktakdqjm\t209594\n爆仓\t209595\n32层\t209596\n垂\t209597\n垃\t209598\n野兽\t209599\n旁落\t209600\n型\t209601\n作偈\t209602\npwpgdjtvxdgmatwapmmgwatmgjpmmmgpwgpmpmgtwjpwmgpmgtmjpxtptmqatmgpmpmajd\t209603\n徐浩焱\t209604\n晕船\t209605\naMall\t209606\n头走\t209607\n抹茶\t209608\n垸\t209609\n黄子璁帅还市\t209610\n么赞\t209611\n2点五\t209612\n邳州市\t209613\n4492\t209614\n伍陆柒捌玖\t209615\n刊登\t209616\n垫\t209617\n北北豆豆\t209618\n野兔\t209619\n潮\t209620\nALion\t209621\n潭\t209622\n微客\t209623\n踏进\t209624\n义乌西屋\t209625\n退神马东东喵\t209626\nhigfrb\t209627\n沙包\t209628\n挤出\t209629\nhobb\t209630\n多喝酒\t209631\n再说元\t209632\n我的爸妈\t209633\n恐怖袭击\t209634\n黄子华\t209635\n杏术\t209636\n这些子\t209637\n中南部\t209638\n丹顶鹤区\t209639\n躲过\t209640\ngfddsff\t209641\n麦豆\t209642\n纯元\t209643\n忙了忘\t209644\n潜\t209645\n躲远\t209646\n躲进\t209647\n北河湾\t209648\n潘\t209649\n潔\t209650\nvenajs\t209651\n阿斯本\t209652\n从前样\t209653\n孙静怡\t209654\nv友\t209655\n养而亲不待\t209656\n大大大大大大哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t209657\nbaoShoujihan\t209658\n说斗\t209659\n用好不犹豫\t209660\nCorba\t209661\n我讨厌你了有\t209662\n黎宝\t209663\n库伊乖\t209664\n官话\t209665\n宠儿\t209666\n葛青云\t209667\nv发\t209668\n想法儿\t209669\n鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼讨厌鬼\t209670\njvyfjyc\t209671\n吧主女\t209672\n额额\t209673\nJul\t209674\nv句\t209675\n黑黑的长发吧美丽的亲自月经红红\t209676\n曙光下村\t209677\n十年后\t209678\n痴样\t209679\n安子\t209680\n架设\t209681\nusuehudbh\t209682\n吃紧\t209683\ntimetot\t209684\n吃素\t209685\n买饭\t209686\nhmm\t209687\n3912万元\t209688\n恐怖点\t209689\n有几次\t209690\nmcxoma\t209691\n雪佛兰\t209692\nhmc\t209693\n王胜炜\t209694\n问法\t209695\n4408832000001253298\t209696\nbigdand\t209697\n我本人\t209698\n三间\t209699\n每桶\t209700\n朱一飞\t209701\n34度\t209702\n触电者\t209703\njbhjvbkjjvw\t209704\n覃雪琴\t209705\n嘿咻\t209706\n扁平\t209707\n3334645773\t209708\nthvju\t209709\n滋市\t209710\n唐泥\t209711\n脸不要脸\t209712\n漂亮点\t209713\na600亿40\t20
9714\ntjd\t209715\n想笑一笑\t209716\n60万元\t209717\n2百\t209718\ntjg\t209719\n66亿\t209720\n文章\t209721\n周春青\t209722\n坤根\t209723\n文竹\t209724\n用心跳\t209725\n恶心点\t209726\n角花\t209727\n分期\t209728\n逍遥和天1PK\t209729\n背道而驰\t209730\ntjj\t209731\n奥度\t209732\n入狱\t209733\nVuh\t209734\n吗丁啉\t209735\nbier9aici\t209736\n冒卖\t209737\n一半一号\t209738\nci1is\t209739\n满足\t209740\n萝卜炖蘑菇\t209741\np股\t209742\n日批\t209743\n你是我的好帮手\t209744\n行快\t209745\n砸烂\t209746\n欧榜\t209747\n衰老\t209748\n1954\t209749\nＹＯＵ\t209750\n10000000000000000000000000000000000000个\t209751\n黄晓艳\t209752\nushou\t209753\n睿迪\t209754\nGales\t209755\n据点\t209756\n俺姐我叫我叫静静今年我九岁我班级我爱你\t209757\n怨不起\t209758\n毛主生\t209759\n考验\t209760\n这月月\t209761\n就不\t209762\n波鲁\t209763\n德合一\t209764\n内科\t209765\n渴盼\t209766\ntjv\t209767\n晚礼服\t209768\n斯及\t209769\n4月25日\t209770\n给我回来\t209771\n俱佳\t209772\n就业\t209773\n热线\t209774\n没了秀\t209775\n呃公主美\t209776\n寿终正寝\t209777\n特长\t209778\n北京日报\t209779\n大吃\t209780\n有学问\t209781\n2322\t209782\n就中\t209783\n厚德姬\t209784\n8899174\t209785\n看过把吧兄弟\t209786\n飞毛腿\t209787\n要离\t209788\n探索\t209789\n123376765588645\t209790\n广电总局\t209791\n早八对晚八\t209792\njbod\t209793\nboxbos\t209794\n慧姐\t209795\n一模块\t209796\n数不胜数\t209797\n一秒三次\t209798\n合肥电视台影院频道\t209799\n太过分宝成人工费坛海v\t209800\n余三人\t209801\n前段\t209802\n一种感觉\t209803\n发裤\t209804\nT\t209805\n老程\t209806\n门匕\t209807\n提单\t209808\n净净\t209809\n米糯子\t209810\n店口\t209811\n门区\t209812\n朱门\t209813\n提升\t209814\n图源\t209815\n店号\t209816\n卫兵\t209817\n夏习\t209818\n卫兰\t209819\n四溜达\t209820\n红兔\t209821\n续集\t209822\n摄影集\t209823\n堂长\t209824\n保重\t209825\n保释\t209826\n好听了你好\t209827\n意外保险\t209828\n红光\t209829\n傻子侠\t209830\n千篇一律\t209831\n日本富士电视台\t209832\n马原貌\t209833\n竟是\t209834\n4张\t209835\n村偶cz\t209836\n额娘我可想你\t209837\n坐骑\t209838\n波阳\t209839\n范\t209840\n主客\t209841\n好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦好啦\t209842\n福尚福\t209843\n72mm\t209844\ntjT\t209845\n嗯宽\t209846\n华师\t209847\n吃不着\t209848\n递尿\t209849\nvsjcsa\t209850\n横看成林侧成峰\t209851\n45678\t209852\n还在家安\t209853\n康忙北鼻\t209854\n看不度\t209855\n嗯客\t209856\n12341个\t209857\n嗯官\t209858\
n张欣悦\t209859\n小蜜洞\t209860\n嗯安\t209861\n９日\t209862\n粘胶\t209863\n甘拜下风\t209864\nvegsh\t209865\n收听率\t209866\n无需要\t209867\n648740\t209868\n我爱您在今生今世\t209869\n好我跟你好\t209870\n麻省理工学院\t209871\nCjnjj\t209872\n多大岁\t209873\n皇权\t209874\n不僅\t209875\n篮球赛\t209876\n猪角\t209877\n田宁\t209878\n不像\t209879\n你了思密达我过去找你\t209880\n诚诺\t209881\n乖秘秘\t209882\n对流\t209883\n老夫聊\t209884\n没脸\t209885\n胖耶\t209886\n十二大贝柱\t209887\nccAc\t209888\nrrrrrrrrrrrrrrrrrrrrrrrrr\t209889\n突击检葡萄\t209890\n是不算\t209891\n要不然\t209892\n几110岁\t209893\n公诉团\t209894\n王八一\t209895\n后半段\t209896\n沈阳爱尔眼科医院\t209897\n王八东\t209898\n晋中宝\t209899\n谷歌地图\t209900\netalk\t209901\nx3虎\t209902\n五5月23日\t209903\n大湿兄\t209904\n金英敏\t209905\n凉拌沙拉\t209906\n脱帽\t209907\ngvdygfhhh\t209908\npqk7\t209909\n阶段\t209910\n吕微\t209911\nWIS\t209912\n厚明\t209913\n避免\t209914\n腔拿调\t209915\n20万吨\t209916\n舒服感\t209917\n退耕\t209918\n别开玩笑\t209919\n老太阳\t209920\n莹子\t209921\n好妇\t209922\n25000岁\t209923\n我是真的爱你是男女朋友之间\t209924\n不不不不不不是\t209925\n猿声\t209926\n安达曼海\t209927\n郁闷类\t209928\n大连盈致企业管理有限公司\t209929\n孙志\t209930\n实惠\t209931\n七月初七\t209932\n改一改\t209933\n好妞\t209934\n三十岁性\t209935\n麻辣鱼\t209936\n去秋来\t209937\n好妮\t209938\nPWPGM\t209939\n实情\t209940\n总的错币\t209941\n三十七四度\t209942\n张嘉倪\t209943\n888888866666\t209944\n北伦敦\t209945\n蓝颜知己\t209946\n塔雅丽达\t209947\n诶菲\t209948\njvds\t209949\n平淡无奇\t209950\n我不我不和你说了再\t209951\n2月29号\t209952\n灭火器\t209953\ndhiphotosbaiducomxiaodupicitemd058ccbf6c81800a7a96fd7ab63533fa828b4756jpg\t209954\n三巨头头兔兔\t209955\n一不小心翼翼机\t209956\n仰光\t209957\n瓜分\t209958\n逆反心理\t209959\n怯懦\t209960\n研究所\t209961\n一两滴\t209962\n知更鸟报春\t209963\n這麼遠\t209964\n猫狗叫花花\t209965\n会儿作业\t209966\n哩哩\t209967\ng6q\t209968\n等世界和平复出\t209969\nnh6yhhhhj\t209970\n衬裳\t209971\n金超\t209972\n耳洞\t209973\n港味\t209974\n郭纪耀\t209975\nlà\t209976\n1930\t209977\n南江楼\t209978\n衬裤\t209979\nqwsdvn\t209980\n不敢爱\t209981\n1998年6月13日\t209982\n九寨沟\t209983\n乐基\t209984\n二本命\t209985\n西安金翅鸟\t209986\n360手机助手\t209987\n金虹山\t209988\nDoes\t209989\n周青\t209990\n％4415\t209991\n一特\t209992\n周静\t209993\n高丽会\t209994\n合理\t209995\n邵国政\t209996\n海德哲基
儿\t209997\n思过\t209998\n机楼\t209999\n背地\t210000\nHHSISTS\t210001\n人民食堂\t210002\n是因为\t210003\ndusidid\t210004\n零几岁\t210005\n伊贝克\t210006\n白貴妃\t210007\nwjm\t210008\nwjj\t210009\nwjk\t210010\n12335456980669852239998866\t210011\n李快走\t210012\n星期四\t210013\nhougi\t210014\n通便\t210015\nwjc\t210016\n千里马\t210017\nwja\t210018\n3.8万亿\t210019\nBJD\t210020\n霓虹灯\t210021\nwjv\t210022\nwjw\t210023\n喜欢你我想你\t210024\n一个派\t210025\n多米块\t210026\n青年报\t210027\n12嗯\t210028\n大喜日子\t210029\n64一个\t210030\n两天半\t210031\n丝丽雅\t210032\n下雪天人\t210033\nbcbd\t210034\n歉吧\t210035\n高敏锐\t210036\nddfffdf\t210037\n陪片\t210038\n13990059001\t210039\nvgvvgbggbhbbgbgg\t210040\n笑克\t210041\n坐客\t210042\n包员\t210043\n你好烦人哪你\t210044\n见谅\t210045\nshuile\t210046\n高山流水\t210047\n驾辕\t210048\n张苏庆\t210049\n13654282582\t210050\n制片\t210051\n优多\t210052\n了孙着你\t210053\n题男\t210054\n神药\t210055\n怕我真的爱上\t210056\n工藤优\t210057\n羌村\t210058\n媚药\t210059\n黄莲苦\t210060\n叫苦连天\t210061\n题甲\t210062\n你好客气\t210063\n孙告迪\t210064\n八格牙\t210065\n伸直\t210066\n好我问你\t210067\n40多张\t210068\n二虎儿\t210069\nGRF\t210070\n一倍\t210071\n哎呦阿斗\t210072\n一個\t210073\n25个半小时\t210074\n百利滚利\t210075\n鏈西洪e6W\t210076\nGRO\t210077\n十三十四号\t210078\n不好救火\t210079\n左树生\t210080\n织布\t210081\n搞经\t210082\n陈六子\t210083\nByeBy\t210084\n一值\t210085\n源希子\t210086\n多玩剑网三##\t210087\n少数人\t210088\n新山\t210089\nuptillthen\t210090\n顺风车\t210091\n诗泌\t210092\n希尔顿Hilton酒店\t210093\n79134919379438913890\t210094\njzhaba\t210095\n受众群\t210096\n微笑的歌曲\t210097\n第2200\t210098\n2000多名\t210099\n心疼\t210100\n因素\t210101\n瞎鼓\t210102\n今生以后\t210103\n八九十分\t210104\n强酸环境\t210105\n冷少\t210106\n313431\t210107\n扛竿\t210108\n落叶归根\t210109\n近亲结婚\t210110\n平面媒体\t210111\npartypotter\t210112\n撒撒撒撒撒撒\t210113\n独角\t210114\n区医院\t210115\n非妈妈\t210116\n奥林匹克精神的遗忘\t210117\n3510\t210118\n幺零六五五\t210119\n每个人\t210120\nr度秘\t210121\n几十元\t210122\n不干反应哩\t210123\nhttpahiphotosbaiducomxiaodupicitem83025aafa\t210124\n饭日\t210125\n曹小咏\t210126\n学奸\t210127\n两分钟后\t210128\n学好\t210129\n喵喵喵喵喵喵喵喵喵喵喵喵喵喵\t210130\n用超\t210131\n张那\t210132\n白狼\t210133\n啦baby\t210134\n商旅\t210135\
n饿棍天使\t210136\nvjfnc\t210137\nwou\t210138\n0.89%\t210139\n乃么\t210140\n刘罗布森\t210141\n44881戏台生旦净丑\t210142\n张运豪\t210143\nwuahr\t210144\npoiuy\t210145\n15900132\t210146\n杨棋戊\t210147\n老虎不发威你当我是helloktr\t210148\n小老爸\t210149\n午休\t210150\n親親\t210151\nkuluzemoytkujkl结局tuku土wwloyu\t210152\n八字\t210153\n老子不想你受伤害我\t210154\n清风纸\t210155\n一百块\t210156\n八子\t210157\n从没想过\t210158\n掉头\t210159\njjxxjjxejgggixxxxhxgixixjjjxjjjjjxhiiiiiighisbboxkxbeexejxxxxxbbbebxkkbkbjjjjjjjjbbbbkkxjxxxxjxkbbkbe\t210160\n玩儿罗\t210161\n搞不死\t210162\n三百两三百\t210163\n举手投足\t210164\n电溜达\t210165\nbcsfbk\t210166\n绵中\t210167\n班郭哥\t210168\n德州扑克#\t210169\n好呀个\t210170\n5555555525355545375653564576866\t210171\n度米酒\t210172\n雷延\t210173\nc油雷特\t210174\n蕊蕊\t210175\n有车族\t210176\n故说\t210177\n杨庙镇\t210178\n哟土\t210179\n腥质\t210180\n大林主义\t210181\n6个月\t210182\n傻子类\t210183\n度一在\t210184\n你的天空\t210185\n传好\t210186\n15838372479\t210187\n臭豆腐\t210188\n5点30\t210189\n扳手\t210190\n叛变\t210191\n1259635741\t210192\n寄达\t210193\n上影\t210194\nHalaban\t210195\n2090米\t210196\n雨阳\t210197\n吸痕\t210198\n唐依\t210199\n乡村姑\t210200\n咯嘣夏卡拉卡\t210201\n报告器\t210202\n手软\t210203\n尹潇潇\t210204\n想见\t210205\n四分之几18\t210206\n像极了\t210207\n277245828587142838300869999545352428380047282938172858575057784805067446765577697594585575\t210208\n高声痛\t210209\n全聚德\t210210\n实验室\t210211\n鲜花\t210212\n576.5万美元\t210213\nxggxvhh\t210214\n山田\t210215\n小孩度\t210216\n荧幕\t210217\n张培伟\t210218\n做菜\t210219\n万众民\t210220\n赛睿\t210221\n四点54分\t210222\nvvjggs\t210223\n答非所\t210224\nOscar\t210225\n人行道\t210226\n荒诞不经\t210227\n啊猩\t210228\n38384388\t210229\nuuii\t210230\nsevenseventwo\t210231\n讨厌鬼谁说的鸟我告诉你\t210232\n虽小\t210233\n日本队中心\t210234\n径缩锁索索\t210235\n常灰\t210236\n聊了用\t210237\n小阴唇\t210238\n玩饭\t210239\n我辈子\t210240\n南县\t210241\n丽新系\t210242\neyifdbidu\t210243\n烛光晚餐\t210244\n别瞒我\t210245\n127808\t210246\n西蒂\t210247\n枯叶春风来千树万树梨花开\t210248\n心叭叭碎沬\t210249\n我恨你我恨你\t210250\n苍茫的天涯是我的爱\t210251\n王琬琪\t210252\n吴艳辉\t210253\n三只\t210254\n杨雨涵\t210255\n欧耶欧耶欧买噶\t210256\n小彭\t210257\n李佳苗\t210258\n十三颗\t210259\n讨钱\t210260\n三叶\t210261
\nfddfffffyyg\t210262\n舞烟\t210263\n理论\t210264\n财宝\t210265\n小影\t210266\n罗金宜\t210267\nguvf\t210268\n喰种\t210269\n526.6万元\t210270\n说好事\t210271\n道然\t210272\n猪猪\t210273\n猪猫\t210274\n102团有好的兽医院\t210275\n镜子中\t210276\n天马股份\t210277\n凯泽药厂\t210278\n1、2分钟\t210279\n4242212\t210280\nrukk\t210281\n小說\t210282\nrt\t210283\n说了陪\t210284\nrv\t210285\nrw\t210286\nrp\t210287\n想死十四死掉了的死东东\t210288\nrr\t210289\nrs\t210290\nk7k7\t210291\n高月\t210292\n楼香楼大酒店\t210293\nrx\t210294\nry\t210295\nGhvvvjvmhxgkbpLxjlxvnnxxhchvvccjvmvvHnnkjcb\t210296\nrd\t210297\n赵亚兰\t210298\nrf\t210299\nrg\t210300\nra\t210301\n夏桢茜\t210302\nrc\t210303\nrl\t210304\n男演员\t210305\nro\t210306\nrh\t210307\nri\t210308\nrj\t210309\nrk\t210310\n银饰品\t210311\nsummicron\t210312\nEhshssh\t210313\n陈爱平\t210314\n22nnnnn\t210315\n灬｀\t210316\n东京食尸鬼3\t210317\nCSOL\t210318\n个秀物语tt\t210319\n第三分钟\t210320\n南大街\t210321\n蛋白质\t210322\n西蒙\t210323\n密使\t210324\n好朋\t210325\nuUIuuu\t210326\n我没你丑你\t210327\n53676\t210328\ninimget\t210329\nr7\t210330\nr0\t210331\n一两天\t210332\n8.0\t210333\n泽东东\t210334\n挥鞭\t210335\n反证\t210336\n涂涂\t210337\nsj23\t210338\n三双\t210339\nSssdds\t210340\n李是起\t210341\n888十1\t210342\n乌冬\t210343\n虚拟世界\t210344\n骄阳\t210345\n4AAD\t210346\n三叉\t210347\n乖小乖\t210348\n好嘛闷\t210349\n聪明反被聪明误\t210350\n我港\t210351\n女宝\t210352\n坷怖片\t210353\nhuxux\t210354\n微波炉\t210355\nhsisb\t210356\n三叔\t210357\n老艾\t210358\n远洋沁山水\t210359\n正名\t210360\ndgis\t210361\n正向\t210362\n学神\t210363\nhdbdbd\t210364\n撒欢\t210365\n望京地区\t210366\n李给我\t210367\n银素\t210368\n眼力价\t210369\n魏明然\t210370\n一苇\t210371\n不在高不在远是\t210372\n世博演艺中心\t210373\n抱负\t210374\n结点\t210375\n多咪多咪咪\t210376\n戈麦斯\t210377\n老百姓们\t210378\n整天\t210379\n镉大米\t210380\n九七非礼\t210381\n泰迪那你喜\t210382\ntios\t210383\n几千几千天天\t210384\n几几月\t210385\n11334688677\t210386\nEcstasy\t210387\n张一雯\t210388\n里面\t210389\n文国庆\t210390\n1.3g5g\t210391\nmatlatouwr\t210392\nhhvjhhyygfttyd7uuygtiyuuuyu7y77uuyuyyuiijjhjjjuuuuu\t210393\n警察学校\t210394\n深化\t210395\n霸占\t210396\n张赫敏\t210397\n尕头\t210398\n地砖\t210399\n答非若问\t210400\n计划书\t210401\n备注暇\t210402\
n饭们\t210403\n啦桑娜\t210404\nsomes\t210405\n空好呀\t210406\n博卡\t210407\n罗比尼奥\t210408\n听见\t210409\n马眼棒\t210410\n深深不死\t210411\n听觉\t210412\n臭留\t210413\n2011年1月\t210414\n乌兰克行\t210415\n衎衎\t210416\n来去眼\t210417\n儿灵儿\t210418\ntftfehe\t210419\n淤戏\t210420\n发券\t210421\nVCC\t210422\nVCD\t210423\nVCF\t210424\n709049\t210425\n石芸\t210426\n洛轻狂\t210427\n568025412点\t210428\n度你不在\t210429\n置楼烦\t210430\nVCR\t210431\n发别\t210432\n16至21日\t210433\n沙龙\t210434\n发删\t210435\n女孩纸\t210436\n1536482882767949\t210437\n撸呵呵撸\t210438\n二零九\t210439\n1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\t210440\n不管单\t210441\n唐末\t210442\n一百万一千万\t210443\n某男\t210444\nhjnnjvbybhbhjhjp\t210445\n拉奥\t210446\n精心花骨朵五百强我的家我不会的的我一家我不知道结局我的歌\t210447\n互信\t210448\n王雨杉\t210449\n喜欢恩加\t210450\nmating\t210451\n王奕阳\t210452\n2650平方公里\t210453\n感恩寺\t210454\n粉状\t210455\n刘姥姥\t210456\n成昆铁路\t210457\n汤不香\t210458\n郁金香\t210459\n七八期\t210460\n哈苏坡\t210461\n糖盒\t210462\n海牙前南斯拉夫问题国际刑事法庭\t210463\n屁事\t210464\n屠宰\t210465\nTTTTTTTTTTTTT\t210466\n奢侈品\t210467\n霍尔格\t210468\n许一个\t210469\n昆凌\t210470\n3.Apptentive\t210471\n淘秘嫂\t210472\nweof\t210473\n六五天\t210474\n贵溪\t210475\n应有\t210476\n广百股份\t210477\n阿海\t210478\n李傻子\t210479\nラ/芳沢ロ\t210480\n龙魂\t210481\n哈哈看我如来佛掌\t210482\n六三\t210483\n三苏\t210484\n边桌\t210485\n学东\t210486\n六七\t210487\n积木我行我素\t210488\n六万\t210489\n吧﹁_﹁晟敏偶吧撒狼黑\t210490\nggyy\t210491\n学下\t210492\n没人爱\t210493\n前明\t210494\n津巴布韦\t210495\n万户\t210496\n从来都没\t210497\n怀里\t210498\n小绵羊\t210499\n六世\t210500\n阿基拉\t210501\n竖线\t210502\n六个\t210503\n枕套\t210504\n柚子\t210505\n呢人\t210506\n模子\t210507\n六两\t210508\n川川川\t210509\n黑巧克力\t210510\n禅画\t210511\n2011年11月11日\t210512\n303班\t210513\n预定\t210514\nHHHHHHHHH\t210515\n球事\t210516\n門\t210517\n漫步秘困\t210518\n鸿泽\t210519\n唐姐\t210520\nGDP增\t210521\n鸿泰\t210522\n内蒙了者\t210523\n亚历山\t210524\n開\t210525\n快剪利索\t210526\n创始人\t210527\n王霞\t210528\n閒\t210529\n快一点昃植物大战僵尸2\t210530\n煮妇神探是的星期几点几分星期几\t210531\n哀求\t210532\nAndroid\t210533\nPP群\t210534\n题目发\t210535\n立体声\t210536\n邀功\t210537\n煞调\t210538\nboomsakalaka
\t210539\n二九幺五九\t210540\n11月27日\t210541\n14561个\t210542\nGdsfg\t210543\n度秘你是处\t210544\n哇唔\t210545\n来看图\t210546\nxbbcvdddhhccghvcfggbbgghvffhbvvbbvhhhhjj\t210547\n百世汇通\t210548\n你在南方的艳阳里雪纷飞我在北方的寒夜里四季如春\t210549\n应具\t210550\n老鼠飞车\t210551\n嗯嘟\t210552\n度秘就是你就是我朱\t210553\n潮动\t210554\n放行\t210555\n名言\t210556\n放血\t210557\n奴家好想你\t210558\n咋不动你了你是人\t210559\n够朋友\t210560\n靠不可以\t210561\n七一堡\t210562\n鲁比\t210563\n得得问\t210564\nftgcdgcd\t210565\n方那\t210566\n抢购一空\t210567\n半厘米\t210568\n龙系\t210569\n点雨\t210570\n失亿\t210571\n乐清市人民检察院\t210572\n真的好想看一个九月了求你了不然的话\t210573\n点零\t210574\nku冫\t210575\nargue\t210576\n这个星期六\t210577\n1点到2点\t210578\n良良\t210579\n54米\t210580\n弹丸\t210581\n戴世兴\t210582\n丁佳小佳佳\t210583\nadpmp0mpjdjnujmw\t210584\n美婷\t210585\n一部分个\t210586\n恩别说话\t210587\n观哥\t210588\n差不多\t210589\n干好\t210590\n差不够\t210591\n求救\t210592\n干女\t210593\n前3个月\t210594\ndoro\t210595\n求教\t210596\n醉酒\t210597\n敲敲打打\t210598\n训秋\t210599\n亲老公\t210600\n希望\t210601\n虞美人蝴蝶花圖\t210602\n92岁\t210603\n进攻\t210604\nbfb\t210605\n和亲亲\t210606\n奥堡\t210607\nghhghj\t210608\nhelloyes\t210609\n程家骏\t210610\n吴家树\t210611\n7点05分\t210612\n2448052\t210613\n麻果\t210614\n313集\t210615\n9500\t210616\ncallllling\t210617\n太难看\t210618\n天亮离婚\t210619\n啵啵啵啵\t210620\n孙仔\t210621\n丙火戊土\t210622\n吕恶心\t210623\nviit\t210624\n雅培\t210625\nviiv\t210626\n檐下\t210627\n十二颗\t210628\n95年\t210629\nffuertr\t210630\ndidib\t210631\n哆拉A梦\t210632\nHarrison\t210633\n覃泽庆\t210634\n啵弟\t210635\n见方\t210636\ncoveredindust\t210637\n扇窗\t210638\n50000000000000000000000000000000000000000000000000000000000000\t210639\n剑网3图\t210640\n粗大\t210641\n华油\t210642\n假了看把你可不\t210643\n王秀楣\t210644\nchiphotosbaiducomxiaodupicitema2cc7cd98d1001e973f10b7bbf0e7bec55e797f0jpg\t210645\n60藏\t210646\n呢杀价\t210647\n世间隙\t210648\njjtjtktg\t210649\n烈空\t210650\n一只婚\t210651\n吧真凭实据\t210652\n厚过\t210653\nhupe\t210654\n刘燕\t210655\n汤汁\t210656\n延平区\t210657\n小金库\t210658\n一边子\t210659\nhttph5qzoneqqcombgstoreindexwv2098179uin2584373057fromuserhomepage2quaV1ANDSQ595288YYBDbid372clicktime\t210660\nbif\t210661\n891274\t2106
62\n乱说一通\t210663\nInalow\t210664\n续命\t210665\nmomis\t210666\n外婆\t210667\nDif\t210668\n有权\t210669\n海口火山公园\t210670\n可见一斑\t210671\n张真丑\t210672\n谢爪兰\t210673\n调制\t210674\n灰太郎灰\t210675\n住在天长\t210676\n儿意儿\t210677\n存钱\t210678\n债权\t210679\nxkju\t210680\n足球帽\t210681\n极速\t210682\n蔡文姬\t210683\nxkjs\t210684\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t210685\nbabashitiaorong\t210686\n装配\t210687\n770B6311E6955CA\t210688\n美小\t210689\nhelloim\t210690\n信雄\t210691\n枪种\t210692\n一个点儿\t210693\n拔掉\t210694\n罗姆尼头发干活妇姑荷箪食\t210695\n诛灭诛\t210696\n换换换换换换换换换换换\t210697\n113888\t210698\n作曲\t210699\n两百300400五百三六百700\t210700\ntuggdgreexufccd人个\t210701\n小蒋\t210702\n重然诺\t210703\n哀家先走\t210704\n895224\t210705\n僧尼\t210706\n乱斗\t210707\n芭蕾#\t210708\n那么多劲\t210709\n坏蛋坏蛋大坏蛋快点吧秦时明月\t210710\n赵伟\t210711\nbiv\t210712\n路秘\t210713\n2年级\t210714\n邵紫琪\t210715\n细数\t210716\n仙青云志\t210717\n不俗\t210718\n不喝水\t210719\n哪有你\t210720\nbiu\t210721\n好看我恨你\t210722\n不修\t210723\n蒽蒽蒽\t210724\n不信\t210725\n十九岁\t210726\n做贡献\t210727\n呕明天\t210728\n太荣幸\t210729\n#炫登机箱\t210730\n迟暮\t210731\n小蒿\t210732\n小事情\t210733\n幼嫩\t210734\n好嘞好嘞谢谢啊好嘞好嘞好嘞好嘞\t210735\n鱼尾\t210736\n196437\t210737\n产前\t210738\n根管治疗\t210739\n妈妈羊\t210740\n陈一年\t210741\n180下\t210742\n霸道总裁爱上你的内种\t210743\n啪啪怕怕\t210744\n钱幸福\t210745\n井格\t210746\n新君威\t210747\n关你的事了关你的十六个\t210748\n火爆\t210749\n斯仁诺\t210750\n金九\t210751\n凉酱\t210752\n苍月\t210753\n180个\t210754\n张明\t210755\n吴金贵夜\t210756\nCfjv\t210757\n6090点\t210758\n宪浩\t210759\n小少年\t210760\n242235348151620131513710121013351416238121312\t210761\n628米\t210762\n十六分之十五六三八\t210763\n长得帅\t210764\n不好我喜欢\t210765\njajbajs\t210766\n快使用税过户祸害\t210767\n3335212173\t210768\n萨尔茨堡\t210769\ntxt\t210770\n薛凯\t210771\n吾推\t210772\n两人\t210773\n上医科大学\t210774\n群情\t210775\n苏然\t210776\n不变成\t210777\n齐人稷\t210778\n猜別人\t210779\n校花组\t210780\n今日\t210781\n周所过\t210782\n不知所以然\t210783\n果壳网\t210784\n六胞胎\t210785\n搜窦性心明月汉\t210786\n油炸\t210787\n今早\t210788\nvvfgjbhwt\t210789\n双蛋\t210790\n陈宇婷\t210791\n我是你的不是人的日子我以为是你的儿子你死了你死了你死了你死\t210792\n亲口\t210793\n难同当\t210794\n翟大卫\t210795\n华侨路\t210796\n陈红艳\t210797\n王焕炫\t210
798\n棒棒\t210799\n一千分\t210800\n索尼十四号喔六世呃苏\t210801\n抢手\t210802\n放心\t210803\nCA1538\t210804\n亲友\t210805\n8.9\t210806\n朴允熙\t210807\n會昂\t210808\n馕坑\t210809\nHybrid\t210810\n是你的公主你是我的丸子\t210811\n很多聊天儿呗我还有一个好朋一友\t210812\n研友们\t210813\n旺夫\t210814\n一个七十多岁\t210815\n小肥羊\t210816\n下功夫\t210817\nf站\t210818\n炎水玉\t210819\n舞姿\t210820\n5665563655\t210821\n广式早茶\t210822\n白乔\t210823\n蜀山战纪剑侠传奇\t210824\nahrefhttpwww99lovegocomindexphpmoduseractregisterrec105556targetblank\t210825\n晚安了我的度秘宝贝我在想你是我的亲爱的机器人度秘\t210826\n掌纹\t210827\n10万亿\t210828\n家梦\t210829\n区位\t210830\ngroove\t210831\n丑心\t210832\n16度\t210833\nrenhua\t210834\n一滴香\t210835\nWAG3TD\t210836\n六字\t210837\n1358745515496\t210838\n环老不好\t210839\n来玩手\t210840\nbigPig\t210841\n评委会\t210842\nhttpdhiphotosbaiducomxiaodupicitemcf1b9d16fdfaaf51107812cb8b5494eef01f7a2fjpg\t210843\n张义哲\t210844\n霸王龙狼\t210845\n拉轰\t210846\n快快王道就喔烧烤鱼儿\t210847\n大我就\t210848\n堡狮王\t210849\n德州\t210850\nhdtbj\t210851\n好耍\t210852\npmpwg\t210853\n丽娘娘\t210854\n软饭链接\t210855\n289443\t210856\n勾搭\t210857\n5555555555555555555555555555555555555555555555555555555555555555555555555555\t210858\n天津西沽公园\t210859\n心痛如\t210860\n肥料\t210861\nJellyat\t210862\nfttrtiytififiityiyii\t210863\n88434Isnot\t210864\n呜\t210865\n生逝\t210866\n新竹\t210867\n包换\t210868\n担保品\t210869\n这个次\t210870\n我的妹妹个顶楼人工\t210871\n将就会\t210872\n呔子\t210873\nmkollp\t210874\n好耶\t210875\n地幔\t210876\n贵港\t210877\n过生日度\t210878\n李胜学\t210879\n串口\t210880\n榆木疙瘩\t210881\n天蝎女\t210882\n王爷\t210883\n接昒\t210884\n一百九十九一个\t210885\ncjjc\t210886\n澄城县\t210887\n角角落落\t210888\nDIVA\t210889\n04476253358585\t210890\n卷入\t210891\n12点39\t210892\n宙斯\t210893\n看不上眼\t210894\n丫块\t210895\n12点30\t210896\n跑骚\t210897\nwereany\t210898\n想约\t210899\n12点37\t210900\n12点36\t210901\n宾馆\t210902\n管教\t210903\n臭不要脸\t210904\n哪家\t210905\n难倒\t210906\n昨日18时46分\t210907\n叶家乐\t210908\n八天天\t210909\n烧钱\t210910\n顾明轮\t210911\n切术\t210912\n景睿星\t210913\n算了我真的不想\t210914\n河河\t210915\n减小\t210916\nokos\t210917\n1231578948\t210918\n归纳\t210919\n蹲点\t210920\n#新粉好礼欢乐送#\t210921\n离开我过\t210922\nwryioxzv
msfjkl\t210923\n篮子\t210924\n亮丽\t210925\n河沟\t210926\n1999年7月7日\t210927\nokok\t210928\n三策\t210929\n萌萌萌萌\t210930\n电视盒\t210931\n香水湾\t210932\n叙利亚\t210933\n工科男\t210934\n子子孙孙\t210935\n劳动法\t210936\n巨蟹男\t210937\n心情不好看\t210938\n考古\t210939\nk711\t210940\n44幅\t210941\n火烧云\t210942\n巴拉娘\t210943\n只有一次\t210944\n过人戏人\t210945\n考号\t210946\n陕西人\t210947\n水行\t210948\n肢体语言\t210949\n王世子\t210950\n一秒左右\t210951\n田浩言\t210952\n13450418305\t210953\n公母不分\t210954\n陈增辉\t210955\n蝇头\t210956\nSpacesend\t210957\n师兄等等我\t210958\n杨嘉欣欣\t210959\n7顿\t210960\n宋岩东馆中心\t210961\n簋街\t210962\n泳戏\t210963\n考取\t210964\n母语\t210965\n7顶\t210966\n水表\t210967\n七类\t210968\n贪念\t210969\n七米\t210970\n序章\t210971\n5.6次\t210972\n窑洞\t210973\n仁奎\t210974\nfromgvfdontgxhggsycgthgdtwffgvsgcfgbnvrbcbbobbcbhgffbfvbb\t210975\n见喜\t210976\nRkshdkflrkfk\t210977\n140厘米\t210978\ndabullla\t210979\n马晶影\t210980\n梓白\t210981\n雪嘞\t210982\n贾立磊\t210983\n从头再聊\t210984\n七周年\t210985\n48年\t210986\n谍影\t210987\n凶我好慢\t210988\n贪心\t210989\n信守\t210990\n母老虎\t210991\n是我讨厌\t210992\n低碳指数\t210993\njury\t210994\nhellotomeou\t210995\n那敏米\t210996\nCNTV\t210997\n243家\t210998\nhfskg\t210999\n按压\t211000\n争相\t211001\n5676\t211002\n5677\t211003\n5678\t211004\n5679\t211005\njurg\t211006\n樊性\t211007\n低音\t211008\n相杀\t211009\n漂亮\t211010\n还生\t211011\n揪扯\t211012\n红楼梦中有诗云质本洁来还洁去不觉深陷找你钟爱莲说一文\t211013\n17280371768\t211014\n天马行空的意\t211015\n益田假日广场\t211016\n仔裤\t211017\n臃嘲\t211018\n不相为谋\t211019\nＰＦ\t211020\n财长\t211021\n破罐破摔版\t211022\n宁静\t211023\n猪猪猪猪猪猪\t211024\n肚肚\t211025\n导航\t211026\n红领\t211027\n138777666\t211028\n漓儿\t211029\n红颜\t211030\n好句v个城Vuyhyyyyoiuotttirkzczoc\t211031\nNierstein\t211032\nbeud\t211033\n郑郑\t211034\n宜人层林尽染\t211035\n李佳玉\t211036\n布偶\t211037\n张巧凤\t211038\n肚肿\t211039\n丢弃\t211040\n远洋地产\t211041\n睡了明天见\t211042\nDPP\t211043\n四北师大\t211044\nj3nsfn\t211045\n波兰\t211046\n槐树\t211047\n骨灰\t211048\n习工\t211049\n接招\t211050\n7卡\t211051\n并重\t211052\n七郎\t211053\n45864885321\t211054\n13131002064\t211055\n無奈速八剛剛\t211056\n通联指数\t211057\n天军\t211058\n天内\t211059\n药丸\t211060\n爱浩\t211061\n躺下\t211062
\n大卖家\t211063\nhhghvvnjj\t211064\n旅行团\t211065\n1angelababy\t211066\n晚间九点\t211067\n表格\t211068\n呵诃\t211069\n欧曼雅\t211070\n天冷\t211071\n金正好\t211072\n7半\t211073\n棕黄色\t211074\n並追隨\t211075\n飒沓山东队\t211076\nLois\t211077\n年运\t211078\n歪哈\t211079\n幡然明白\t211080\n曹令毅\t211081\n凯蒂\t211082\n13483684618\t211083\n需知\t211084\nMETOO\t211085\n奶气\t211086\n雷克\t211087\n爷们们\t211088\n马想\t211089\n红椒\t211090\n亨得\t211091\n一笼\t211092\nintentIntentSK11714776652C56BA7F325981BB6DAE62A19EAAA5884937B223B272EB4E714CE624BFDA4F6Cend\t211093\n来了外星狗\t211094\n我猜我猜我猜我猜猜\t211095\n陆陆\t211096\n诚心诚意\t211097\n办公用品\t211098\n死丫头别给我老实说\t211099\n惊读\t211100\n二分之一天一顿\t211101\nOSEN\t211102\n做商\t211103\n我不爱你你在说你爱我\t211104\ndykg\t211105\n噜噜噜噜噜啦啦噜咕噜噜咕噜噜咕噜咕噜咕噜咕噜噜噜咕呱呱\t211106\n四十多\t211107\n玩不来\t211108\n党风廉政建设\t211109\n那种\t211110\nlide\t211111\n天蝎王\t211112\n有远\t211113\n海纳百川\t211114\n那秘\t211115\n访者\t211116\n铲球\t211117\n喜欢尼\t211118\n勇气\t211119\n属于你的女票\t211120\n雅马得\t211121\n没劲你不好玩儿\t211122\n问候歌\t211123\n我喜欢猫武士\t211124\no型\t211125\nfe11p\t211126\n别馋邓\t211127\n东宇厂\t211128\n还珠改\t211129\n抬头纹\t211130\n广东地区\t211131\n北郑小学\t211132\n维他命c\t211133\n马依婷\t211134\n12.70\t211135\nimethery1tym\t211136\n亚里士多德\t211137\n老头曼\t211138\na03\t211139\na01\t211140\n六。1\t211141\nStonewall\t211142\n最佳男主角\t211143\ndurdyeftetteetet\t211144\n大我想\t211145\n苏晚晴\t211146\n防静电\t211147\n无足\t211148\n费曼\t211149\n＿个\t211150\n奔儿\t211151\n流量行\t211152\n臭豆腐渣\t211153\n下雪露天\t211154\nqiaij\t211155\nettjijb\t211156\n国士令\t211157\najtwmdis\t211158\n石佛苏\t211159\n找底\t211160\n言言\t211161\n熊猫咪咪\t211162\n声源\t211163\n侵害\t211164\n1898万\t211165\n黑梦窦\t211166\n落锤\t211167\nGhhzg\t211168\n初级检验士\t211169\n重振旗鼓\t211170\n蚃\t211171\n中医院\t211172\n僵尸九\t211173\n轻声\t211174\nfrgthh\t211175\n今21点\t211176\n一笑\t211177\njgjkeb\t211178\n落锁\t211179\n掺和\t211180\n27790838645\t211181\n阿荣\t211182\n2016年初\t211183\n西门龙霆\t211184\n惹我生气\t211185\n咪啊\t211186\n重事\t211187\n一多半\t211188\n流程图\t211189\n重于\t211190\n叔叔\t211191\n该片\t211192\n一秘\t211193\n较快\t211194\ncobv\t211195\n的士\t211196\n208.6亿元\t211197\ns个断水\t211198\n田中哲司\t2
11199\n840千米\t211200\n一块一块\t211201\n龙人怕啊福田野绿世界摸头奥八点多到时候再说吧啊sloth红嫁衣\t211202\n瑷一\t211203\n兰溪\t211204\n度雅\t211205\n15983267668\t211206\n储伟康\t211207\n放风筝\t211208\n我不没有\t211209\n1139\t211210\n奥贝贝恩施\t211211\n1133\t211212\n1130\t211213\n吕和风\t211214\n这么多年\t211215\n文主任\t211216\n跳会\t211217\n方肘子\t211218\n偶那\t211219\n校服衣\t211220\n橙皮鸭\t211221\n张疑\t211222\n那你说句爱我的话\t211223\n哪重\t211224\n答应\t211225\n小米网\t211226\n1886年\t211227\n林达臻\t211228\n武汉体院\t211229\n剪刀堡\t211230\n奄假\t211231\n奥顿\t211232\n惊死\t211233\n钱都美莱\t211234\n工作学院\t211235\n乱说\t211236\n艺术馆\t211237\n和你在一起\t211238\n\t211239\n杰仔\t211240\n别这么不要脸ok\t211241\n张春贤\t211242\njuicell\t211243\n穆斯林的葬礼\t211244\n我告诉你我是我喜欢彩金就\t211245\n20个月\t211246\n武美含\t211247\n基友会\t211248\n999步\t211249\n田园杂兴\t211250\n迎香透\t211251\n垭跳舞\t211252\n好衷心\t211253\n电化学\t211254\n绿菜花\t211255\n路赛\t211256\nhovah\t211257\nhoogo\t211258\n绿白\t211259\n嗷嗷嗷\t211260\n朱陈涛\t211261\n几百年\t211262\n刘雅如\t211263\ndkylulug\t211264\n代代\t211265\n捕虫\t211266\n咯塔罗\t211267\nfrfed\t211268\n我是女的女的女的女的\t211269\n五幺六二七\t211270\n看偶多萌\t211271\n给我哭\t211272\n王栋\t211273\n女川核电站１号\t211274\n周小玲\t211275\n无根\t211276\na766\t211277\n化解\t211278\n11点14\t211279\n大乐那你试试冰\t211280\n九十九九九九九九九九九九一千一百一十一六级\t211281\n泼水死\t211282\n英格\t211283\nHIKP\t211284\n卧里托龙凤区\t211285\njxhxx\t211286\n某刻\t211287\n有我会\t211288\n芊娴\t211289\n我是我的姐姐来了hi我是三岁\t211290\n长颈鹿\t211291\n中新社\t211292\n叫付\t211293\n天新郎\t211294\n蓬\t211295\n3月22日\t211296\n同学家\t211297\n1\t211298\n夏可欣\t211299\n长空雄\t211300\n亚北\t211301\n仙台\t211302\n晚上7点40\t211303\n招摇过市\t211304\n泡椒\t211305\n适用者\t211306\n55665\t211307\n囖隆\t211308\nQAQ你不爱我了我想死\t211309\n鼠标\t211310\n第12岁\t211311\nggtfjfhyd\t211312\n32厘米\t211313\n前面两谁和你聊的天\t211314\n乖波波乖波波\t211315\n喜乐花园\t211316\n真是味\t211317\nhttpahiphotosbaiducomxiaodupicitem3b87e950352ac65cb067ea00fcf2b21193138a67jpg\t211318\n成山植物园\t211319\n随机\t211320\nU干\t211321\n王六台\t211322\n一不会\t211323\n132244765890\t211324\n五八七四\t211325\n随有\t211326\n3082531492\t211327\n╥╯\t211328\n2月19日\t211329\n时侯\t211330\n硬不好\t211331\n打徐\t211332\n深更半夜\t211333\n删别管\t2113
34\n汶川路过城三楼\t211335\n不懂你的话你\t211336\n60050\t211337\n新年头\t211338\n赵曦沫\t211339\n蹦蹦兔\t211340\n120203\t211341\n120202\t211342\n吉林省肖家乡第一中学\t211343\n还元\t211344\n键子\t211345\n语音输\t211346\n真真实实\t211347\n亲一个呗\t211348\n低中收入者\t211349\nP图\t211350\n新昌县\t211351\n多咪多咪你\t211352\n威尔希尔\t211353\n喔喔喔\t211354\n两万多\t211355\n甘美丽\t211356\n酷谷\t211357\n罗其然\t211358\n台佬\t211359\n村子\t211360\n腊四粥\t211361\n沈晓平\t211362\nIM体育讯库\t211363\n百转千声惶恐\t211364\nsdk\t211365\n满屏\t211366\n严老师\t211367\n你是我的好朋友我爱你\t211368\n蝁二万\t211369\n第二相\t211370\na181818\t211371\n黄氏女\t211372\n朱启常\t211373\n神猪\t211374\n戴亚\t211375\n游侠\t211376\n春花\t211377\n百八十天\t211378\n童一翡\t211379\n冒烟\t211380\n度步\t211381\n求你件事\t211382\nGAYO\t211383\n好笑事\t211384\n第二盘\t211385\n名嘴\t211386\n孟上\t211387\n陈昊天\t211388\n噱麵\t211389\n恩早\t211390\n排名前\t211391\n不看不想\t211392\n美丽的秘密\t211393\n咖啡杯\t211394\n晓岗站\t211395\n男孩儿\t211396\nez1\t211397\n杜宁\t211398\n刘惠\t211399\n照堡\t211400\n眼线笔\t211401\nDfffgcfgggghg\t211402\n干么事波斯猫\t211403\n叫价\t211404\n男篮\t211405\n不眠夜\t211406\n互动率\t211407\nhjej\t211408\n900#\t211409\n第一差\t211410\nks8o\t211411\n张月平\t211412\n花姑凉\t211413\n正思\t211414\n宠物小精灵\t211415\n思亲\t211416\n屋度\t211417\n售票机\t211418\n长隆动物园\t211419\nFifthnight\t211420\n滩\t211421\n剔除\t211422\n若七十\t211423\n杜宾\t211424\n沉醉\t211425\n听施\t211426\n1355645678923\t211427\n青藏高原东部\t211428\n夜urfe\t211429\n1515151819295\t211430\n大相迳庭\t211431\n没厉害\t211432\n400度\t211433\n洽谈会\t211434\n泰迪狗鸡\t211435\n黄细花\t211436\n荤食\t211437\n夜间中雨\t211438\nXiang\t211439\n庇佑\t211440\nk秦\t211441\n小洞泬\t211442\n春回大地\t211443\nblame\t211444\n0.99\t211445\n锁芯\t211446\nlfyou\t211447\n听斗\t211448\nghbbjbb\t211449\n放养\t211450\n放入\t211451\n锤子\t211452\n定投\t211453\n石小龙\t211454\n李贝尔\t211455\n十一个月\t211456\n进率\t211457\n萧雨\t211458\n放全\t211459\n这厢\t211460\n4点40\t211461\n4点46\t211462\n七分之一第二天\t211463\n香体\t211464\n孙舆论\t211465\n分身\t211466\n嗯我爱\t211467\n15章\t211468\n230包\t211469\n任玉诺\t211470\n惹我你惹\t211471\n霜儿\t211472\n东港市\t211473\n王佐良\t211474\n不代沟\t211475\n返上\t211476\n八二零二幺五零二六\t211477\nqwerthuiopa\t211478\n话顶\t211479\n高嵩\t211480\n电动三轮车
\t211481\n能说会儿\t211482\n华西\t211483\n300多分\t211484\nMarc\t211485\n憨豆先生\t211486\n胁肩\t211487\n圆珊珊\t211488\n岐山\t211489\n家庭生活\t211490\n长茄\t211491\nRoof\t211492\n最火爆\t211493\n46646191994646646464686866764\t211494\n忠实度\t211495\n快递公司\t211496\n订货\t211497\n先八中\t211498\n品牌\t211499\n照片人\t211500\n123654789654\t211501\n窣画\t211502\n咏池\t211503\n8.11%\t211504\n东村\t211505\nhudhe\t211506\n死神\t211507\n太平庄\t211508\n畅游\t211509\n乌台七\t211510\n电极\t211511\n议论纷纷\t211512\nABCdefg\t211513\n老娘我的最爱\t211514\n什公\t211515\n调灵王一竹\t211516\n群友\t211517\n群号\t211518\n查暗访\t211519\n长坂坡\t211520\n超碰在线\t211521\n浓莎蕾\t211522\n做我的女朋友\t211523\n瞌睡\t211524\n租子\t211525\n什元\t211526\n斑斑\t211527\n一报\t211528\n斑斓\t211529\n中国乐一\t211530\n松花\t211531\n谈拜\t211532\n你在朝阳你猜我猜你猜我猜你猜我猜你猜\t211533\n学画画\t211534\n美女们\t211535\n经济号\t211536\nhdry\t211537\njhikk\t211538\n超级超级无敌超级超市\t211539\n油渍\t211540\n黄图姐\t211541\n恁般\t211542\n甘都\t211543\n罗比威廉姆斯Robbie\t211544\n00090987874454449\t211545\n此言\t211546\nlistenxiah\t211547\n黄图姬\t211548\n470097\t211549\n必生\t211550\n登陆\t211551\n三昧禅院方丈广成法师\t211552\n油温\t211553\n30多万\t211554\n训智\t211555\n187333\t211556\n肖汉民\t211557\n锦绣江山\t211558\n大同县\t211559\n2008年4月30日\t211560\n第一友\t211561\n切尔西国米皇马\t211562\n罗女\t211563\n渐晚\t211564\n奥涛\t211565\n岸上\t211566\n鲍神\t211567\n死间计\t211568\n2星期\t211569\n王馨卉\t211570\n宝饭店\t211571\n很亲爱\t211572\n这年来\t211573\n崔俊军\t211574\n不行行\t211575\n上海局\t211576\n黄盖\t211577\n变本加厉\t211578\n扑亲亲\t211579\n第四条\t211580\n坏你是大坏蛋动力学大坏蛋\t211581\n叫连\t211582\n罗奇\t211583\n清帐\t211584\n升移\t211585\nRoller\t211586\n九五点\t211587\n纳闷那你晚上回不回家呀\t211588\n四月\t211589\n面目全非\t211590\n大锅饭\t211591\n毒害\t211592\n卸妆照\t211593\n涵哥\t211594\nk888zzzyozc\t211595\n海卟\t211596\n面十世\t211597\n第二帕\t211598\n想不开心\t211599\n四期\t211600\nhfhhf\t211601\n好吧快睡会\t211602\n源儿\t211603\n东站\t211604\n1秒\t211605\n1百23级\t211606\n一哥哥\t211607\n越缓\t211608\n电击呃\t211609\n一天两天\t211610\n东立\t211611\nyryi\t211612\n东京暗鸦\t211613\n韵达\t211614\n惹眼\t211615\n额佩兰\t211616\n知心呵呵\t211617\n捉拿\t211618\n新昌教育\t211619\n我不了了再来十五我再也不了了\t211620\n爱了\t211621\n就是你的生日\t211622\n爱亚\t211623\n赛
雷\t211624\nAPU\t211625\n笑你的笑\t211626\n爱云\t211627\n温辉成\t211628\n刘凯升\t211629\n米线\t211630\n儿歌儿\t211631\n我的亲亲小宝贝\t211632\n度秘秘哥\t211633\n你好哈哈你有啊你好哈哈哈\t211634\n伟东\t211635\nAPI\t211636\n爱人\t211637\n美莓\t211638\n三十年\t211639\n牛牛牛牛\t211640\nhbvhhghn\t211641\n史蒂芬·埃洛普\t211642\n三塞\t211643\nkakwjs\t211644\nTerg\t211645\npapa\t211646\n服部\t211647\n槟榔\t211648\n肖觉晓\t211649\n窝写\t211650\nggvbjuh\t211651\n12345678912345675913443156432335582358768726733572655555855855585555555555555555555555555555\t211652\n第四权\t211653\n陶回\t211654\n喜马拉雅山\t211655\n拉嘎\t211656\n克和克\t211657\n现于\t211658\nb2年\t211659\n走来走去\t211660\n天宫一号\t211661\n繁花\t211662\n唐易康\t211663\n小木屋\t211664\n坏机器\t211665\n徐佳和\t211666\nJeep\t211667\nFwfsbffh\t211668\n有必死\t211669\n资金链断裂\t211670\nyyourhatstido\t211671\nwrtyiol\t211672\n阳光小区\t211673\n阿猫6jqcqcq342\t211674\n起劲\t211675\n柳江\t211676\n蛙泳\t211677\n宫观\t211678\n二不二\t211679\n日本本州岛东部\t211680\n未约\t211681\n大冬\t211682\n流出\t211683\n诊治\t211684\n姜辰浩\t211685\n26块\t211686\nDUMI\t211687\n羊山石\t211688\n1386697978\t211689\n十一日\t211690\n希望岛\t211691\n保障性住房\t211692\n皮皮比\t211693\n陈瑞华\t211694\n横叉\t211695\n揭晓\t211696\n十一时\t211697\n133456288966\t211698\njnt囖苏\t211699\n生平头\t211700\n乐扣\t211701\n卞伯贤\t211702\n拉耙耙耙耙耙耙耙耙耙耙\t211703\n谢谢表\t211704\n佳洁士\t211705\n极少数\t211706\n贾秘\t211707\n欧百钢\t211708\n441家\t211709\n猪尾巴\t211710\n来者\t211711\n无线\t211712\n行政权\t211713\n贾种\t211714\nfI呃w呃疯c\t211715\n2051488803\t211716\n十几件\t211717\n耐架\t211718\n不动我你抱我抱\t211719\n天马主\t211720\n包彩霞\t211721\njhfyvfhk\t211722\n阿尔卡特朗讯(巴黎证交所\t211723\n艾孜\t211724\n夏嘉恒\t211725\n脸红\t211726\n妮妮米\t211727\nhunderQUFtYWduZXQ6P3h0PXVybjpidGloOjE2ZGVhYzU4MDIxMDA0MjliZGE5NGM1MGYzOTk3NmM0ODExYTRmZjhaWg\t211728\n快快快牙\t211729\n哎i\t211730\n哎n\t211731\n十千克\t211732\n音悦台\t211733\n驴炮\t211734\n想想爱\t211735\n中航精机\t211736\n咯素质syigsgcGgjjyd\t211737\n56666666999999999\t211738\n蛇果\t211739\n干锅鸭\t211740\n大冶\t211741\n元太\t211742\n晕发\t211743\n溢满\t211744\n好七六号\t211745\n号外\t211746\n氧气瓶\t211747\n顾安心\t211748\n願意\t211749\n我女的我女的我女\t211750\n香烟盒\t211751\n悲泣到天明\t211752\n大冰\t211753\n金箍棒\t2
11754\n艰难困苦啦啦啦啦啦啦啦啦爸爸擦擦擦擦擦擦擦擦擦擦擦擦擦恩卢卡库拉小年\t211755\no女\t211756\n玟玟\t211757\n威客\t211758\n比基尼\t211759\n感謝\t211760\n50公里\t211761\n华丽丽\t211762\n当即\t211763\n相留\t211764\n／／／／／／／／7\t211765\n铆钉\t211766\n崭新\t211767\n王靖凯\t211768\n搅动\t211769\n真的不骗\t211770\n度密度密度秘你好坏好坏好坏好坏\t211771\n言情书\t211772\n萝莉控\t211773\nfornews\t211774\n牙祭\t211775\nk2发\t211776\n不干嘛\t211777\n大忙\t211778\nouhh\t211779\n芳妞\t211780\n星际辛赵丽朵朵\t211781\n柔白\t211782\n不要说出彩\t211783\n一般\t211784\n全家福\t211785\n当午\t211786\n江名伟\t211787\n不达标\t211788\n蜘蛛痣\t211789\n敷衍爱\t211790\ntyfc\t211791\n9：58\t211792\n龟孙J\t211793\n冰河世纪4：大陆漂移\t211794\nsmt\t211795\n家院儿\t211796\n奇葩\t211797\n啦一个人\t211798\n抽头\t211799\n醉梦\t211800\n胡霍\t211801\n我问你我原谅你我爱你\t211802\n干花\t211803\nmnm\t211804\nsmp\t211805\n姚君宇\t211806\n疑幂\t211807\n不2Z了中下一T\t211808\n赵海朋\t211809\nghiphotosbaiducomxiaodupicitem0dd7912397dda144c33daec0b5b7d0a20cf48688jpg\t211810\n那拜拜你不要再回复我了我讨厌你\t211811\n你很哪呆着呆@爱的大\t211812\n江西\t211813\nimhimessther\t211814\n停机\t211815\n麦叽\t211816\n黑翼\t211817\n流流\t211818\n欣快点\t211819\n奇著\t211820\n8月8号上午\t211821\n杨海鹏\t211822\n逐一\t211823\n缙云\t211824\n3w点lhaome点com\t211825\n养晦\t211826\n王叔叔\t211827\n死要脸的玩儿\t211828\n心数\t211829\ndsf\t211830\neir\t211831\n交响曲\t211832\nbfbndkwo\t211833\n会说话不说\t211834\n1428个\t211835\n52222222222221\t211836\n一絲\t211837\n朱源峰\t211838\n52222222222222\t211839\n呢灯具\t211840\n心教\t211841\n充满信心\t211842\n太高楼\t211843\n民政部\t211844\n啊什\t211845\n1558899\t211846\n黑子哲\t211847\n大白猪\t211848\n豹子\t211849\n哈拜拜\t211850\n要害你\t211851\n飞抵\t211852\n颜姐\t211853\n得劲\t211854\n安纸乡\t211855\n辞镜\t211856\n大船\t211857\n耶模耶样\t211858\n七千年\t211859\n整誓\t211860\n46sdcom\t211861\n周华健\t211862\n姓红\t211863\n3000亿\t211864\nCandyDoll\t211865\n一四场\t211866\n朱再\t211867\n夏佐临\t211868\nfcfff\t211869\n太坏了求你\t211870\n外星\t211871\n杀进\t211872\n本月21日\t211873\n突然后\t211874\n郁山奈\t211875\n大舅\t211876\n死不吧\t211877\n朱军\t211878\n卡西拉\t211879\n想摸\t211880\n飞天猪样\t211881\n大舍\t211882\n窝荣\t211883\n真善美\t211884\n别无选择\t211885\n9151\t211886\njdld\t211887\n深信\t211888\n高山峻岭\t211889\n法娜\t211890\n度秘努力\t211891\n生畏\t2
11892\n呢次KC噶物理难化学易噶\t211893\n秦金兰\t211894\n法娃\t211895\n钱庄\t211896\nv公德心v\t211897\n碰巧\t211898\n价格战\t211899\n奔驰文化中心\t211900\n黄历\t211901\nsgdrxf\t211902\n关机\t211903\n就好好\t211904\n二五三零五三七六\t211905\n主女\t211906\n想不聊\t211907\n主奴\t211908\n王祖贤\t211909\n易米多\t211910\n1双三天\t211911\n嗯三里\t211912\n郝总\t211913\n删\t211914\n衣\t211915\n国际\t211916\n下场地\t211917\nyouarenot帅锅\t211918\n公休日\t211919\n补\t211920\n我不想不烦\t211921\n西安市\t211922\n10点钟\t211923\n封临\t211924\n聊聊好书把\t211925\n爽妞\t211926\n余佳怡\t211927\n史头\t211928\n乱差\t211929\n够不会\t211930\n型好丑\t211931\n好好好好好好好好\t211932\n枯石\t211933\n佩佩佩\t211934\n三七粉\t211935\n709155868545\t211936\nhomkui素恩\t211937\n品德\t211938\n护肤霜\t211939\n家气\t211940\n伦甲\t211941\n厚爱好人\t211942\n齐齐哈尔\t211943\n怪我没有\t211944\n申批\t211945\n尸母\t211946\n奇佳\t211947\n宫廷剧\t211948\n哦天呐度秘\t211949\n征信\t211950\nQQ儿\t211951\nwksn\t211952\n第100个\t211953\n搅拌站\t211954\n肿么着\t211955\n苏伊士\t211956\n伦生\t211957\n公子哥\t211958\n史云生浓醇\t211959\n给我好\t211960\n摩托\t211961\n润滑油\t211962\n度秘鬼\t211963\n王老五\t211964\n15995260049\t211965\n万能快乐\t211966\n皇冠慌乱\t211967\n王琦琦\t211968\n抢座\t211969\n憋不死\t211970\n企鹅\t211971\n噗刘欢\t211972\nsm3\t211973\n乖我\t211974\n表面积\t211975\n猪么电视\t211976\n调皮贝里沙c\t211977\nallatomalandalalalatiomatla\t211978\n周赞\t211979\n8777778点\t211980\n煤场\t211981\n2577007236\t211982\nklnme\t211983\nabc\t211984\n咕咕咕\t211985\n10栋\t211986\n沃佳婚冰卿\t211987\n林建\t211988\n李燕\t211989\n切尔诺贝利\t211990\n哪边长\t211991\n残阳\t211992\n灭亡\t211993\n一小半\t211994\nvippr\t211995\n單身\t211996\n1314520\t211997\n打分\t211998\n教室片\t211999\n唉分钟\t212000\ngfbfjcdh\t212001\n在秀\t212002\n这一夜\t212003\n下水道\t212004\n我爱你爱的甚甚\t212005\n幾個\t212006\n乌塔屋\t212007\n2011年6月30日5时15分\t212008\n陈梓童\t212009\n240x4\t212010\n打別\t212011\n事块\t212012\n纪检\t212013\n四五个\t212014\n丢手绢\t212015\n打别\t212016\n打制\t212017\n驶往\t212018\n成为忆苦\t212019\n这一天\t212020\nQkq\t212021\n栗鼠\t212022\n验货\t212023\n道远\t212024\n麼色\t212025\n徐雅文\t212026\nrfrd\t212027\n10四百三十四十三\t212028\n打非所\t212029\n那个年龄\t212030\n金三世\t212031\n南京大学金陵学院\t212032\n升初\t212033\n吉林省扶余市肖家乡第一中心小学\t212034\n5455555\t212035\n想难为你\t2
12036\nu俄uu诶诶物\t212037\n郁积\t212038\n我女\t212039\nFJGFTYIGGGGU\t212040\n算是\t212041\n高论\t212042\n我的新娘\t212043\n安卓拉贝比\t212044\n影音\t212045\n把接\t212046\nWitch\t212047\n端仇\t212048\n闭氏\t212049\n电话度\t212050\n工作压力\t212051\n周至县\t212052\n思索引擎\t212053\n33个\t212054\n23038470877\t212055\nlllcaheadratoutahitiaac\t212056\n8334\t212057\n一颗平淡的心\t212058\n拜匿\t212059\n锯片\t212060\n强直性脊柱炎\t212061\n9900\t212062\n嗯新浪\t212063\n莫名其妙不开心\t212064\n15809317643\t212065\n这个周末\t212066\n猪波\t212067\nKaton\t212068\nASSAssdss\t212069\nKiessliski\t212070\n雪小雅\t212071\nTuejhzh\t212072\n鸡屎\t212073\n一窝蜂\t212074\n1分\t212075\n蜜友\t212076\n71厘米\t212077\n永生我的世界\t212078\n5-6月份\t212079\n入迷\t212080\n院落\t212081\n今非昨\t212082\n陈若仪\t212083\n独一无二\t212084\n吹替\t212085\n再开玩笑\t212086\n背遍\t212087\n小鱼儿\t212088\n本月底\t212089\n交么问问\t212090\nM249\t212091\n大家庭\t212092\n它们\t212093\n勇敢的人\t212094\n撤编\t212095\njjtjj\t212096\n澳缔岚\t212097\n33万\t212098\n生大振\t212099\nSHS\t212100\n废秘\t212101\n爱马仕百达\t212102\n屋外\t212103\n可说错\t212104\n9096738\t212105\n零误差\t212106\n毅然\t212107\n摇身\t212108\nSHI\t212109\n宋祁\t212110\n神明\t212111\n好鸡友\t212112\nSHN\t212113\n整版\t212114\n1000l2\t212115\n两里\t212116\n仁寿县\t212117\n蠢材\t212118\n亦然\t212119\ntiquette\t212120\n整片\t212121\n百萬吻\t212122\n溪边\t212123\n西游记的在何方\t212124\n度秘度秘哦\t212125\n伯恩山\t212126\n拐不开\t212127\nFSDRR\t212128\n人字拖\t212129\n最后的遗言\t212130\n玳\t212131\ncczVc\t212132\nbjdndic\t212133\n再卖一\t212134\n瓦张\t212135\n土豆干\t212136\n么哪\t212137\n1549510950\t212138\n一块钱面条\t212139\n告诉你我我不会\t212140\n你好那天黑黑\t212141\n两tututikkklkouyu\t212142\n不不对呀\t212143\n九万家九万100\t212144\n你好依米花依米花花花\t212145\n摆谱\t212146\n生日常生\t212147\n天蝎\t212148\n小自相\t212149\n康强天\t212150\n新词儿\t212151\n性生活\t212152\n刘三毛\t212153\n小伙伴儿\t212154\nhelloshit\t212155\nCCP\t212156\ntats\t212157\nnlne\t212158\n13567808887\t212159\nsp型\t212160\n绿豆蝇\t212161\n张耀月\t212162\n邀请\t212163\n逼大出血\t212164\n布雷顿\t212165\n两多少\t212166\n一夜情笑\t212167\n约不约\t212168\n解不开\t212169\n柴老师\t212170\n养生\t212171\n宫佳宝\t212172\n玢\t212173\n偶偶\t212174\n以为而已\t212175\n度婆\t212176\n懂你喜欢句好吗度秘\t212177\n意大利\
t212178\n护板\t212179\n玡\t212180\n太平洋\t212181\n交通信号灯\t212182\n安耐秘\t212183\n汉化版\t212184\n一场空\t212185\n鲁西\t212186\n一梦燃\t212187\n猪仔\t212188\n鹅鹅鹅鹅鹅鹅鹅鹅鹅\t212189\n大牙活\t212190\ngpnsoka\t212191\n作出\t212192\n咸烟\t212193\n茵柳\t212194\n┃┏┃┃┃┏┛n┃━┛━┛┏┛n━┛━┛次\t212195\n两秒级\t212196\njgihjttfgyy\t212197\n邛崃\t212198\n主人的秘\t212199\n童第周中穴十七分\t212200\n毫无顾忌\t212201\ntaishangxing\t212202\n访友\t212203\n一万米\t212204\n今天早\t212205\n棉麻\t212206\nfhjcvngu\t212207\n2127463\t212208\n不恶心\t212209\n希望谷\t212210\n存款准备金率\t212211\n店小男人\t212212\n董法静\t212213\n想哪儿\t212214\n很致密\t212215\n中央政府\t212216\n文佳\t212217\n人们子\t212218\n增加\t212219\ncghc\t212220\n高耸\t212221\n中铝\t212222\n粑粑\t212223\n138182\t212224\n刘京伟\t212225\n泰白\t212226\n凌度\t212227\n洋帅\t212228\n蒋宏先\t212229\ndontstarvemod\t212230\n事必躬亲\t212231\n帮调\t212232\n彻悟\t212233\n洪文\t212234\n过年好\t212235\n半口\t212236\n文体\t212237\n真稀奇\t212238\nuiehdbdnndnsks\t212239\n半句\t212240\n半只\t212241\n引人注意\t212242\n了不对你好\t212243\n二一大一一一思念的你下班时\t212244\n斗迷\t212245\n张三零\t212246\n半叶\t212247\n撒隆\t212248\n156个\t212249\n认誓\t212250\ndjfb\t212251\n微博皮\t212252\n就是不敢不敢不敢你不敢你就不敢呀不敢不敢昂\t212253\n愤怒\t212254\ngolf\t212255\ngolg\t212256\n三点水\t212257\n8、9号\t212258\n423a\t212259\n3010626434\t212260\n大硬\t212261\nBnj\t212262\nBnn\t212263\n呆毛\t212264\n飘香\t212265\n辣条条\t212266\n米袋\t212267\n十来听\t212268\n音波\t212269\nCKRJDKeNKDMSKSJXKWoXNDJDJXiWoFJSNKWJXKKWMKLiWWeEKDJDNKKXNNSNJKK\t212270\n33班\t212271\nslala\t212272\n獲頗\t212273\n一个114J\t212274\n熟门\t212275\n竹石\t212276\n幺三二\t212277\n黏液\t212278\n五十年代\t212279\n正能量\t212280\n胡国防\t212281\n秘密就是你的命\t212282\n151817755855554557777…477747\t212283\nST宜纸\t212284\n属于我的女\t212285\n嫩照\t212286\n试吃\t212287\n玫亚天黑\t212288\n多米胖多米\t212289\n表舅\t212290\n存在表\t212291\n25级\t212292\n对还错\t212293\n感恩卡奖品\t212294\n诊病\t212295\n贴切\t212296\n南明乡\t212297\n2016年1月13\t212298\nherewithyou\t212299\n别抢\t212300\n灰常可爱的儿童画\t212301\n美术馆\t212302\n封丘\t212303\n韦瑞新\t212304\n乌不红\t212305\n试听\t212306\n瓦弟\t212307\nrfy\t212308\nrfx\t212309\n6月9日下午18时\t212310\nfufu\t212311\n30亿美元\t212312\nrfg\t212313\nrff\t212314\nrfd
\t212315\n葛城\t212316\n小菠萝\t212317\n甲然而止因材施教心驰神往训练有素雄姿英发粉妆遇砌当牛\t212318\n男朋\t212319\n地球公转\t212320\n汽油\t212321\n安份\t212322\n一千多度\t212323\n现金券\t212324\n国家审计署\t212325\n电焊\t212326\n皂皂皂皂皂皂皂皂皂皂皂皂皂皂皂皂\t212327\n鬼胎\t212328\n湖北卫视\t212329\n太古一快乐\t212330\n耶学姐\t212331\n贵都\t212332\n三思\t212333\n188160616\t212334\n猜猜我是谁\t212335\n燥\t212336\nCCD\t212337\n韩茜\t212338\n纸素\t212339\n51823840\t212340\n砂糖橘\t212341\nAlices\t212342\n19:40\t212343\n九顿半\t212344\n本楼\t212345\npickup\t212346\n智利\t212347\n锦霞\t212348\n万马奔腾\t212349\n13681120980\t212350\n湮灭\t212351\n江门地铁\t212352\n是不哪一家的啊不那靡家的的家\t212353\nMU5130\t212354\n一九七四零六零九\t212355\n区七中\t212356\n安跳舞\t212357\n芦岭镇\t212358\n中藏乙木\t212359\n不对马嘴\t212360\n智尼玛\t212361\n歌词词\t212362\niac200m\t212363\n纡绣场\t212364\n淑女范儿\t212365\nfuff\t212366\n千本宫\t212367\n年猪\t212368\n500ML\t212369\n才高八斗\t212370\niThone\t212371\n宾客\t212372\nteee\t212373\n大猪你是不是爱猪\t212374\n王子阳\t212375\n嗯有风\t212376\n看得见\t212377\n熊乃瑾\t212378\n老郭\t212379\n爹行\t212380\n吃光\t212381\n星期表\t212382\n18542369245\t212383\nv工程局\t212384\nscallop\t212385\n三十四十五十六十五厘米\t212386\n不沾边\t212387\nchurch\t212388\n五讲\t212389\n谦谦君子\t212390\n霸州\t212391\n74分\t212392\nmmmmmmmil\t212393\n王nnnnn\t212394\n支子晨\t212395\n赞誉\t212396\n母白\t212397\ng131\t212398\n惘然\t212399\n☎\t212400\nzadg\t212401\n奥兔兔\t212402\nzadd\t212403\n★\t212404\n自动豆腐成型机\t212405\n☆\t212406\n一块儿\t212407\n刘通\t212408\n一杆杆\t212409\n人和事\t212410\n师专\t212411\n七宗罪\t212412\n戴维\t212413\n巴西红耳龟\t212414\n小米小米小米\t212415\n凡此\t212416\n雷诗诗\t212417\n张硕强\t212418\n大思牛\t212419\n钢手\t212420\n苦茶\t212421\n暖心男\t212422\nvsetebeyia\t212423\n三三女\t212424\nYougotanose\t212425\njrj\t212426\n闫妮\t212427\n连忒斯\t212428\n乖了真讨厌\t212429\n偶后宫\t212430\nt568a\t212431\n益龙网络军团记者团\t212432\nMiss\t212433\n别发度秘\t212434\n九百零七五六零\t212435\n猪腰杜仲核桃仁\t212436\n地藏\t212437\n花架\t212438\n不老你\t212439\n度秘你的小女人\t212440\n61836292\t212441\n花枪\t212442\n慵懒\t212443\n530129199702161726\t212444\n真嘛好\t212445\n颐和堂\t212446\n泄漏\t212447\n怨怒\t212448\n日中\t212449\n毛茸茸\t212450\n花枝\t212451\n姑打\t212452\n奔走\t212453\n奔赴\t212454\n日丽\t212455\n比图笑
话了我是你\t212456\n张惠鹏\t212457\nednndhw\t212458\n棋牌游戏\t212459\n三点\t212460\n杨中天\t212461\n真不要脸真不要脸真不要脸\t212462\n遵循\t212463\n補将\t212464\n敏饭\t212465\n坡将\t212466\n她的一生\t212467\n八点到九点\t212468\n三炮\t212469\n一天八个\t212470\n萌诶数来宝器\t212471\n葡萄糖\t212472\n第一本\t212473\n噤若寒蝉\t212474\n主人行\t212475\n李有病\t212476\n984564676467557545455737664646464\t212477\n通红钻\t212478\n语重心长\t212479\n棵嫂\t212480\n登封少林寺\t212481\n武极\t212482\n尘子\t212483\n零三年\t212484\n第一月\t212485\n津津有味\t212486\n罗656556565656\t212487\nStupidass\t212488\n妇女的话\t212489\n继任\t212490\n解敏\t212491\n11194\t212492\n大橙橙橙子\t212493\nGK\t212494\n卸下\t212495\n33249\t212496\n下为\t212497\n指点点点点\t212498\n咋西黑f\t212499\n甜文\t212500\n议员们\t212501\nalaskatpon那尼GPS咯哈\t212502\n滥情过\t212503\n妹妹们\t212504\n下个\t212505\n延大\t212506\n下世\t212507\n糊巴\t212508\n十二四\t212509\n漫步\t212510\n12点01分\t212511\n刷漾\t212512\n十四套\t212513\n穿梭者\t212514\n老帽\t212515\n8u88i\t212516\n没诚意\t212517\n回我不想\t212518\n下一\t212519\n巧克\t212520\n磨穿\t212521\n下下\t212522\n撒泡\t212523\nfhuu\t212524\nwfuuiuuu\t212525\n街景\t212526\nabAD\t212527\n猪侠\t212528\n郑度秘\t212529\n醒酒\t212530\n伤心归来\t212531\n我的家就是你的家你的家就是我的家\t212532\nGPU\t212533\nbababbbb\t212534\n变着相\t212535\n猜不着\t212536\n伏魔\t212537\n333333元\t212538\n九亿九千九百九十九万九千九百九十九\t212539\n续约\t212540\n猿猴\t212541\n咕噜娃\t212542\n稳步增长\t212543\ngjv\t212544\n心太黑\t212545\n诵咒\t212546\nhfjbyvgir\t212547\n结用\t212548\n小辫\t212549\n千丈\t212550\n颐景水宫\t212551\n告诉你我要\t212552\n快一点半夜晚自习\t212553\n千七\t212554\n变态类\t212555\n峰尚美\t212556\n小达\t212557\nparate\t212558\n别怕苦\t212559\n小辰\t212560\nJsizbakshbxjs\t212561\nFriend\t212562\n周玉玲\t212563\n慢慢地\t212564\n有见好转\t212565\n诶呦hihihi\t212566\n侯马西站\t212567\n千个\t212568\n伤心处\t212569\n杨佳英\t212570\nIwant\t212571\n楼栋咪\t212572\n20100607\t212573\n刘繁敏\t212574\n朱常\t212575\n便签\t212576\nvnvdmc\t212577\n天之炽\t212578\n好丑丑八怪\t212579\n冲凉\t212580\ngjj\t212581\n几禽\t212582\n安慰霞\t212583\n传奇故事\t212584\n2010年10月\t212585\n还考\t212586\n三十来岁五十来岁\t212587\n和对\t212588\n女人心度秘你好吗你毛多肉少米你堂客\t212589\n40万\t212590\n死物\t212591\n张小珏\t212592\n上海季\t212593\n赵晓光\t212594\n康秀伟\t212595\n丰碑\t2
12596\nvjfhhghggh\t212597\n屈打成\t212598\n呵呵棒棒哒\t212599\n手人\t212600\njsbh\t212601\nftexbihr\t212602\njsbn\t212603\n40个\t212604\n250290382\t212605\nFofvrng\t212606\n王巴基\t212607\n退一步海宽\t212608\n红超波老拉\t212609\n杰拉德\t212610\n陶艺\t212611\n死牙\t212612\nsoho1族\t212613\n尼讨\t212614\n掉渣\t212615\nipsd\t212616\n週末嗎\t212617\n忍野\t212618\n厚\t212619\n玩具盒\t212620\n夏朝\t212621\n53条\t212622\n挪作\t212623\n算出\t212624\n信托公司\t212625\n专卖店\t212626\n赵锡俊\t212627\n藏面\t212628\n夏末\t212629\n说的恨\t212630\n箭尾\t212631\n秘文\t212632\n罗玉凤\t212633\n弯路\t212634\n个识\t212635\nci5\t212636\n脑瓜\t212637\n琴棋祖\t212638\n师院\t212639\n施加\t212640\n零六分\t212641\n最喜欢度\t212642\n差旅\t212643\n光棍儿老杨\t212644\n真烦\t212645\nTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTT恭喜TTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTTT\t212646\n一瑶\t212647\nposter\t212648\n假正经\t212649\n嘤嘤QAQ\t212650\n八路军来了猪猪猪猪猪猪猪\t212651\n语序\t212652\n唉一雪\t212653\n韩帅\t212654\ntoptheme\t212655\n韩币\t212656\n钻交所\t212657\njfggg\t212658\n五颜六色\t212659\n20km\t212660\n汉代\t212661\ncib\t212662\n后知后觉\t212663\ncig\t212664\ncif\t212665\n大杨扬\t212666\ncik\t212667\n疯疯癫癫跳\t212668\n力战\t212669\n说秘\t212670\ncis\t212671\ncir\t212672\n5.98%\t212673\nciu\t212674\ncit\t212675\n韩帮\t212676\n闹闹闹闹闹不可以\t212677\n再假\t212678\n气死我了真是\t212679\n郭毅\t212680\n搞垮\t212681\n成家吧\t212682\n白泽\t212683\n1000000000\t212684\ndyio\t212685\n黄山天都峰\t212686\n应城\t212687\n乐季\t212688\n转转转转转转转\t212689\n老老虎\t212690\ndrrfhhxdtddbf\t212691\nWii额\t212692\n最讨厌你\t212693\n六零二二\t212694\n前四\t212695\n西子淡妆浓抹总相宜\t212696\npreguntas\t212697\n说走就走\t212698\n尽兴\t212699\n年年十八你不抒你\t212700\n晒干\t212701\nTFOS\t212702\n18874172\t212703\n二和一个\t212704\n女生地球\t212705\n克撒曲\t212706\n春城\t212707\n温总理\t212708\n还会\t212709\n四脚\t212710\n千里眼\t212711\n短信银行\t212712\n六十八块\t212713\n51726545\t212714\n怕破\t212715\nwnnf\t212716\n主作\t212717\n真理道\t212718\n难题\t212719\n吕宏卓\t212720\n米雪\t212721\n分析家们\t212722\n天天处\t212723\n主体\t212724\n铜钱\t212725\n人间烟花\t212726\n蓄电池\t212727\n温斯坦公司\t212728\n占款\t212729\nsBSBSBSBREF\t212730\n桃花潭\t212731\n龚立华\t212732\n磨砺\t212733\nGhcjghhfyxhghdggdlgkdyyg
hf\t212734\n4月16\t212735\n4月14\t212736\n遥控飞机\t212737\n道玉\t212738\n米雅\t212739\n道王\t212740\n32gb\t212741\nA仔\t212742\n零点儿二七\t212743\n4月18\t212744\n威王枝\t212745\nfesf\t212746\n纪梵希\t212747\n十一分之一\t212748\n猫猫猫\t212749\n预后佳\t212750\n挨着\t212751\n我没有骗你我真的爱你爱妃别\t212752\nqvc\t212753\n五十年后\t212754\n于沙\t212755\n星期五中午\t212756\n徐明秋\t212757\nqvq\t212758\n長什麼樣\t212759\nCudo\t212760\n郭嘎嘎\t212761\n风果\t212762\n高价位\t212763\n群混茶群\t212764\n潘丽君\t212765\n蕊心\t212766\n白乐宁\t212767\n凯乐斯威夫特\t212768\n你老婆色\t212769\n13245689825\t212770\n揭幕\t212771\n一百九十九一百九十九页\t212772\nbb霜\t212773\n大刀\t212774\n乐不出来\t212775\n移动网\t212776\n10000个\t212777\n老翁\t212778\nudfore\t212779\n佳佳佳\t212780\n达无疆\t212781\n孤秀\t212782\nAseghuvdfvua\t212783\n哩阿兵\t212784\n小愣\t212785\n鸿忠\t212786\n大初\t212787\n替人\t212788\n下午6时18分\t212789\nuvibggggghu\t212790\n我我永远也不理你了我要告上法庭哼拜拜关门我走了我要告法庭\t212791\nqqqqqg\t212792\n大利\t212793\neva\t212794\nqqqqqk\t212795\nPang2\t212796\neve\t212797\nevf\t212798\nforane\t212799\nqqqqqq\t212800\n60058\t212801\n于雄\t212802\n韩2\t212803\n操蛋\t212804\nevs\t212805\nevu\t212806\n辣辣蜡地了了了了了咯啊好了了是了\t212807\n2006～2020年\t212808\n胡同\t212809\n领导人\t212810\n14平米\t212811\n雷神nthe\t212812\n鞭炮\t212813\n鸡翅撒子的吃的撒子\t212814\n才子点\t212815\n1月25日\t212816\n愤青\t212817\n伴奏\t212818\ntvn\t212819\n好可爱好卡哇一\t212820\ndizhfmopqrstutwx\t212821\n曰子\t212822\n肺部\t212823\neēeeēē\t212824\n植物大战僵尸三礼包\t212825\nxuie\t212826\n发哥仔\t212827\n果粒橙\t212828\n胡吧\t212829\n卡卡紊\t212830\n歌诗达\t212831\n大姐姐\t212832\n性趣\t212833\n变形金刚3\t212834\n五分之一第二天\t212835\n腊月十六\t212836\n那多不划算呀白干活儿\t212837\n里欧个\t212838\n柯南VS金田一\t212839\n善款\t212840\n优异\t212841\n许越南\t212842\n受益匪浅\t212843\n陈熙烨\t212844\n龙当代美术馆\t212845\n一逆风\t212846\n888888岁\t212847\n空欢喜\t212848\n零点\t212849\n双喜欢你好友谊\t212850\n报答\t212851\n688683\t212852\nw女\t212853\n李家鑫\t212854\n腼腆\t212855\n坐吃山空\t212856\n腼腼\t212857\n微博特\t212858\n演唱会独家大直击\t212859\n阿迪江\t212860\n双鱼雪\t212861\n三百张\t212862\n古仁格勒\t212863\n双簧\t212864\n奇偶小是喔\t212865\n新剧\t212866\n报父\t212867\n8点30分\t212868\n二零七\t212869\n马佐夫舍省\t212870\n报筹\t212871\n真没心没肺\t212872\n私聊\t212873\
n沃厂\t212874\n么么哒度秘我爱你\t212875\n的等\t212876\n动人\t212877\n一艘\t212878\nsgs\t212879\n共党\t212880\n嘀嗒\t212881\n了偶们\t212882\nsgj\t212883\n雷斯\t212884\n好呀阴阳\t212885\n王子亦\t212886\n一艋\t212887\n宇宙至尊达达\t212888\n悉尼美\t212889\n一色\t212890\n一百厘米\t212891\n张力军\t212892\nkncd\t212893\nnsbs\t212894\n第一天多\t212895\n调漂度\t212896\n死心有事\t212897\n魔君\t212898\n接不不会\t212899\n雪顿\t212900\n自然段\t212901\n魔吐\t212902\n我爱你我不爱你\t212903\n雷文\t212904\n干嘛呀安\t212905\n自警\t212906\n穿住\t212907\nquye\t212908\n充电池\t212909\n38%\t212910\n七七八八大\t212911\n2531142262\t212912\n伱会\t212913\nUk7\t212914\nEMCiswhoah\t212915\n1314251314\t212916\n落日似\t212917\n绿钻\t212918\n风吹雨打\t212919\nfffrf\t212920\n余江县\t212921\n边境牧羊犬\t212922\n嗯哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t212923\n财富值\t212924\n明哲保身\t212925\n魏忠贤\t212926\n一百零一块\t212927\n王菲\t212928\nsrd\t212929\n哈哈我真的你说的话\t212930\n等一等\t212931\n我真的一秒钟\t212932\n质检\t212933\n堂好\t212934\n华泰汽车\t212935\n幺零五\t212936\n金隅金隅\t212937\n析割\t212938\n度秘我你\t212939\njdmtg\t212940\n鼻祖\t212941\n余燕\t212942\n褶子\t212943\ngGn\t212944\n屈原\t212945\n大家好我是十二岁\t212946\n76斤\t212947\n蹦蹦蹦蹦\t212948\n坎儿\t212949\n18932332812\t212950\nxxxxzsw\t212951\n翼情\t212952\n极道\t212953\n雅蠛\t212954\n淑女单\t212955\n腐木村\t212956\n广深地区\t212957\n亲爱的嫁给我吧嫁给我吧嫁给我\t212958\nlegitghcb\t212959\n菜泥\t212960\nsrr\t212961\n叫劲\t212962\n喀左\t212963\n陈昌毅\t212964\n翠星石\t212965\n做不做饭\t212966\n连接词\t212967\n坏女孩\t212968\n午日节\t212969\n祝度\t212970\n程艮\t212971\n叫动\t212972\n探案剧\t212973\n基金申购\t212974\n一个五元\t212975\n男性别\t212976\nbdrucfhjnc\t212977\nEOOK\t212978\n人国\t212979\nkanitaanitkathithtrtrtrtrtrtrt\t212980\nplaststria\t212981\nlanal\t212982\n泽宏兄\t212983\n创业板\t212984\n名学名\t212985\n唉真是的你就是听不懂我说的话\t212986\n奇骏\t212987\n问題行\t212988\n章玄\t212989\n郑宇新\t212990\n小宝贝\t212991\n50万元\t212992\n电老鼠\t212993\n四五十\t212994\n属于你的一条路\t212995\nAwaits\t212996\n水军\t212997\n单身男\t212998\n闷酒\t212999\n削去\t213000\n滑困\t213001\n京太\t213002\nIPHONE4stompastour\t213003\n10000000000000000000000000000个\t213004\n澈影\t213005\n晚安晚安晚安晚安晚安晚安晚安晚安晚安晚安\t213006\n用工\t213007\n陈丽虹\t213008\n大病\t213009\nnnnnnnnnnnnnnnnnnnnn\t213010\n十一点10点11
点\t213011\n意味着\t213012\n太鬼\t213013\n非智能\t213014\n肉质品\t213015\n大痴\t213016\n叶良辰\t213017\n西峰区\t213018\n单员\t213019\n新举措\t213020\n下棋\t213021\n文咏珊\t213022\n水冬\t213023\n雪曦\t213024\nHaven\t213025\n秋雅\t213026\n情份\t213027\n泗洪\t213028\n啦咪\t213029\n李芹\t213030\n道题\t213031\n袋\t213032\nhttpfhiphotosbaiducomxiaodupicitemfcfaaf51f3deb48fc94bd2aaf71f3a292df578a7jpg\t213033\n袅\t213034\n袄\t213035\n别演\t213036\ngcv\t213037\n袁\t213038\n做多\t213039\nthisfruitisyandittsour\t213040\n称姐\t213041\n袜\t213042\n马拉\t213043\n荡秋千\t213044\n泗洲\t213045\n袖\t213046\n红腾腾\t213047\n践行者\t213048\n颇具\t213049\n被\t213050\n力图\t213051\n袤\t213052\n硒鼓\t213053\n氰化物\t213054\n秋雨\t213055\n9月\t213056\n酅吾\t213057\ntestdid\t213058\n性意味\t213059\n爱枫林晚\t213060\n暂存\t213061\n机妖\t213062\n回迁\t213063\n坎坎坷坷坎坎坷坷\t213064\n摩根大通\t213065\n啦啦啦啦啦啦我是卖报\t213066\n好笑笑笑笑\t213067\n莫诗音\t213068\n离臻\t213069\niamapig\t213070\n阴着阴的名嘴客\t213071\n彬哥\t213072\n事业型\t213073\n信不投诉\t213074\n陪小心\t213075\n套近乎\t213076\n黑黑猪\t213077\n白线\t213078\n白纸\t213079\n棠羽依\t213080\n叫亲\t213081\n白纱\t213082\n36级\t213083\n皆非善\t213084\n刘美含\t213085\n装扮演\t213086\n偷喝\t213087\n八零三二\t213088\n符合者\t213089\n猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜猜vv\t213090\n梅迷\t213091\n牵\t213092\n代课老师\t213093\n爱情侣\t213094\n鸭\t213095\n不见了你在\t213096\n谢谢你了谢谢\t213097\nhoviyiy\t213098\n王钰云\t213099\n四思\t213100\n远行\t213101\n特\t213102\n牸\t213103\n牧\t213104\n130多平方米\t213105\n嗳呵\t213106\n牢\t213107\n牡\t213108\n易建联\t213109\n真真真真真\t213110\n拿枪\t213111\n铁门\t213112\n物\t213113\n时尚传媒集团\t213114\n樱珠\t213115\n小期末考试\t213116\n牟\t213117\n牝\t213118\n小邋遢\t213119\n没有你在\t213120\n牙\t213121\n即刻\t213122\n片\t213123\n612\t213124\nABS\t213125\n有价证券\t213126\n空心字\t213127\nwpquwyk\t213128\n牌\t213129\n帮距\t213130\n勇士\t213131\n魂魄\t213132\n对呀我问你在你的\t213133\ngigigig\t213134\n三承幕\t213135\n二十四\t213136\n缉凶\t213137\n鹿兆\t213138\nProgram\t213139\n忍者摩托么\t213140\n66米\t213141\n1472536980\t213142\n122233445566\t213143\n亲我我的\t213144\n巴啦啦小魔仙巴啦啦小魔仙梦幻乐队的贝大师\t213145\n拜拜机\t213146\n伯乐\t213147\n考片\t213148\nznso4\t213149\n义气\t213150\n过界岭\t213151\nkgjjgbngdvkjh\t213
152\n588011\t213153\n水卡\t213154\n郭瑞雨\t213155\n信耶稣\t213156\n李治\t213157\n龙头\t213158\n突发\t213159\nshorts\t213160\n说到\t213161\n萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒萌萌哒\t213162\n突变\t213163\n妞笑一\t213164\n我喜欢的姑娘\t213165\n蓝米\t213166\n行感\t213167\n溺毙\t213168\n节育环后天就好5汉釜宫给几哦哦ii几句话\t213169\n10月14-19日\t213170\n行意\t213171\nkxjxk\t213172\nkaspendahohou\t213173\n感谢你的喜欢\t213174\n倾城我了我要死\t213175\n8月9日\t213176\n马具\t213177\n1255552452542255544555556225585522586658862233522236666655445598866665866654588899872558777756654455558852555896355586666655488999\t213178\n嗯xo\t213179\n剧组\t213180\n着色剂\t213181\n梵蒂冈\t213182\n偏要\t213183\n满版\t213184\n银家\t213185\n李沁\t213186\n想不在\t213187\n说分\t213188\n张扬\t213189\n八十八岁\t213190\nAngle\t213191\n滴落\t213192\n贫了拜拜\t213193\nPhotobook\t213194\n勤礼用\t213195\n四五十分钟\t213196\n别别我\t213197\n笑问\t213198\n张扣\t213199\n阿西巴\t213200\n7177177171777777777\t213201\n宝刀\t213202\nAngly\t213203\n涂色\t213204\n栽插\t213205\n诚信\t213206\n陆战队\t213207\n六张\t213208\n特警\t213209\n张才\t213210\nlaigeJ8tu\t213211\nduvvc\t213212\n青蒿\t213213\n曹静\t213214\nigohkgyato\t213215\n鸥\t213216\n理度秘\t213217\n矿山\t213218\n呐呐我爱\t213219\n9曲\t213220\n螺春\t213221\n什邡\t213222\n真漂\t213223\n进\t213224\n涡虫\t213225\n自上而下\t213226\n樱桃\t213227\n秘度屋\t213228\n主意\t213229\n夜书\t213230\n啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦芭芭拉\t213231\n紫娟\t213232\nyiucgbgmghgvvvvvcbxgvxgbccbvbbbvbbbbbbbvbvbbbbb\t213233\nkorea2\t213234\n12377898665311357\t213235\n小红点儿\t213236\n流浪\t213237\n周木子\t213238\n服务社\t213239\n转贴\t213240\n好大不大\t213241\n王振\t213242\n忽周\t213243\ndbk\t213244\n記\t213245\ndbn\t213246\n高傲冷都女\t213247\ndbd\t213248\n做官\t213249\n十二套\t213250\n吹牛吧\t213251\ndbz\t213252\n六畜\t213253\n邪皇欺\t213254\n连网\t213255\n穿小\t213256\ntrfdyjfkgt\t213257\nbisto\t213258\n多多多多多多多多多多多多多多多多多多多\t213259\n轻点\t213260\n来人说\t213261\nububobo\t213262\n到而驰\t213263\n感冒猪\t213264\n一千四百多兆\t213265\n林承通\t213266\n水果贩\t213267\n造孽\t213268\n有帅\t213269\n风韵\t213270\n博兴路\t213271\n说我帅\t213272\nhriuu\t213273\n点心情\t213274\n124577\t213275\n追求\t213276\n大
麻烦\t213277\n我是谁\t213278\n霍不单行\t213279\n15643241537\t213280\n13389412126\t213281\n三来八来\t213282\n会事\t213283\n你给我个好的回答行\t213284\nSand\t213285\nactivelyac\t213286\n山火扑救指挥部\t213287\n6497979679979489\t213288\n技术工人\t213289\n好完好\t213290\n你的耳朵全全全全\t213291\n23郑\t213292\nufbcvcggg\t213293\n我们八一可爱的勇叔\t213294\n四百零二\t213295\n秘密行\t213296\nalele\t213297\nuovylhvovpbplbpjcitclou\t213298\n1~2分钟\t213299\n148月\t213300\n刘度\t213301\n你好丑围度秘\t213302\n神马错\t213303\n修香\t213304\n蒋晓婷\t213305\n木呼朋唤伴\t213306\n潮牌\t213307\n161527\t213308\n跟瞬\t213309\n55岁\t213310\n安可\t213311\n完一礼拜\t213312\n用眼看\t213313\n五十多级\t213314\n克谦\t213315\n婚纱\t213316\n丟么帮\t213317\n理因\t213318\n阿伊\t213319\n嗯快\t213320\n十三不靠\t213321\n画门\t213322\n3566467\t213323\n英豪\t213324\n僵愧\t213325\n阿伟\t213326\n婚约\t213327\nSmys\t213328\n迢\t213329\n黄花岗\t213330\nWOjiaoxiaoxia\t213331\n怎嘛\t213332\n稀日常\t213333\n生命液\t213334\nsb2bdb\t213335\n厚禄\t213336\n0.25\t213337\n几个处\t213338\n徐四里\t213339\n咦度秘\t213340\n王雪颖\t213341\n6本\t213342\n十二节气\t213343\n码奴\t213344\n新理\t213345\n新球\t213346\n甘爽\t213347\n空虚\t213348\n迭\t213349\n脾胃\t213350\n迪\t213351\n春意\t213352\n姜涵\t213353\n赤子\t213354\n勤学\t213355\n康姆昂北鼻秃\t213356\nlyounilyou\t213357\n错啦好\t213358\n妮蛋\t213359\n3181352570\t213360\n十二米\t213361\n6月\t213362\n三戒\t213363\n三成\t213364\n一0\t213365\n一1\t213366\n来生\t213367\n一3\t213368\n一4\t213369\n入京\t213370\nHer\t213371\n截\t213372\n一8\t213373\n一9\t213374\n哇好\t213375\nHex\t213376\nHey\t213377\n减肥类\t213378\n小度秘多边形数\t213379\n入人\t213380\n武大帝\t213381\n朴荷儿\t213382\n秦汉\t213383\n嗯老婆\t213384\nHen\t213385\n蝙蝠\t213386\nHeh\t213387\nHei\t213388\n戳\t213389\n以对\t213390\n米强\t213391\n康舒acbel\t213392\n戏\t213393\n小芈月\t213394\n梁林艳\t213395\n张太可\t213396\n拜佛\t213397\n来电\t213398\n仙家一潜\t213399\n度秘你做一下自我介绍\t213400\n戆\t213401\n喻航\t213402\n戀\t213403\n你好呀萌萌哒\t213404\n刘彤\t213405\n战\t213406\n戛\t213407\nabcb式\t213408\n戕\t213409\n凌晨两点\t213410\n或\t213411\n我\t213412\n成\t213413\n蒋宇轩\t213414\n5554244425455\t213415\n跌幅\t213416\nvmbb\t213417\n抹奶\t213418\n集宁\t213419\n一w\t213420\n阅读器\t213421\n藏头诗\t213422\n琵琶音\t213423\
n随诊\t213424\nfsey\t213425\nsomesitth\t213426\n周佳\t213427\n剪子布\t213428\n一f\t213429\n耙耙\t213430\n专卖\t213431\n一l\t213432\n走珠\t213433\n一n\t213434\n一Q\t213435\nmicsmityawas\t213436\n负子\t213437\nc1证\t213438\n一Z\t213439\n一[\t213440\n6-1\t213441\n张可爱\t213442\n路虎\t213443\n碰撞\t213444\n说白\t213445\n一四度\t213446\n一G\t213447\n一I\t213448\n一M\t213449\nva\t213450\nvb\t213451\nvc\t213452\nvd\t213453\nve\t213454\nvf\t213455\nvg\t213456\nvh\t213457\nvi\t213458\nvj\t213459\nvk\t213460\nvl\t213461\nvm\t213462\n９岁\t213463\nvo\t213464\nvp\t213465\nvq\t213466\nvr\t213467\nvs\t213468\nvt\t213469\nvu\t213470\nvv\t213471\nvw\t213472\nvx\t213473\nvy\t213474\nvz\t213475\nwonttoo\t213476\n建仓\t213477\n搜襄垣一中\t213478\n张加奇\t213479\nvD\t213480\n林喜洋\t213481\nmmnlnjl\t213482\n搞大\t213483\n香泪\t213484\n阿刚\t213485\n改口\t213486\n打不赢\t213487\n奖度秘\t213488\noooooooooolk\t213489\n万块\t213490\n11.06元\t213491\n201619号\t213492\n40岁\t213493\nandroid\t213494\n晓静\t213495\n小太阳\t213496\ndan\t213497\nDanny\t213498\n王慧\t213499\n1000家\t213500\n三八二零\t213501\nv1\t213502\nv2\t213503\n猎豹\t213504\nv4\t213505\n哈哈度秘\t213506\nv7\t213507\ninfinite\t213508\nv9\t213509\n年青人\t213510\n时断时续\t213511\n幺幺零幺零\t213512\nwowo\t213513\n10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001\t213514\n舍不\t213515\n坏商人\t213516\n不成都在\t213517\n表音\t213518\n刚\t213519\n发不介意\t213520\n白雪公主的故事\t213521\n咿呀咿呀呀\t213522\n狭路相逢\t213523\ndaw\t213524\nadidas制作大队\t213525\n组织者\t213526\n昆德拉\t213527\n佳纳士\t213528\n疯了我讨厌\t213529\nEhhdjc\t213530\n三天多\t213531\ngxd\t213532\n修到\t213533\n暖男\t213534\n凡士林\t213535\ndas\t213536\n聚敛\t213537\n卖好\t213538\n地级报\t213539\n叠玫瑰花\t213540\n23只\t213541\n秘别太\t213542\n你妈个逼老子问你\t213543\nudbuah\t213544\n擦擦擦擦擦擦擦擦擦擦啊饿饿饿饿饿饿饿饿饿饿饿饿饿\t213545\n钨矿\t213546\n完美的人\t213547\n12357645376457\t213548\nCompany\t213549\n珊瑚绒\t213550\n44米\t213551\n披头一片\t213552\n看到你了你走\t213553\n爱屋\t213554\n224488632448963\t213555\n余雨轩\t213556\n麦克辛\t213557\n陈建和\t213558\n莲蓬\t213559\nROboXiaodu\t213560\n1732283927\t213561\n小桶\t213562\n2555
865\t213563\n船高\t213564\n120527\t213565\n野草莓\t213566\n小桥\t213567\n不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸不要脸\t213568\n福神\t213569\n帮到\t213570\n亲香亲大雄\t213571\n裁判权\t213572\nhhhpp\t213573\n玩酷跑\t213574\n4月24日\t213575\n我是女的你是个女的还是男\t213576\n林徽因传\t213577\nnelion\t213578\n团级\t213579\njhddn\t213580\n中文系\t213581\ngxv\t213582\njhddh\t213583\n帮创\t213584\n寻常\t213585\n舒兰二中\t213586\n小桃\t213587\n2258\t213588\n苏雅默\t213589\n猜猜我是你事儿\t213590\n2255\t213591\n许多许多\t213592\n2252\t213593\n朱jj\t213594\n而终\t213595\n政法\t213596\n魏斯\t213597\nwhhshduf\t213598\n清品菜根\t213599\naeigdcfittccfffffffffffffgoysdvsjhxjdhxhjhxxhdjhbjjhdbbxbxhnbjxjhjjjxhhhhxhs\t213600\n252\t213601\nfhcfu\t213602\n梅梅\t213603\n底下\t213604\n1551745\t213605\n李棉飞\t213606\n发生冲突\t213607\n面谈\t213608\n夜的的的嗯\t213609\n不懂才怪\t213610\n尼玫\t213611\n555555222333388888\t213612\n曹营\t213613\ngxy\t213614\n白姐\t213615\n尼玛学\t213616\n一道道\t213617\nKEVIN\t213618\n大白鲨\t213619\n崽子们\t213620\n級\t213621\n于聪聪\t213622\n大润发\t213623\n死回\t213624\n112叉\t213625\nCD控\t213626\n死囚\t213627\n布衣\t213628\n香河\t213629\n偶数数九零\t213630\n肖冬冬\t213631\n山二村\t213632\nbjjjii\t213633\n冰火两重天\t213634\n我无所谓\t213635\n这样子\t213636\n死国\t213637\n六十平\t213638\n对呀爸爸妈妈\t213639\n六十年\t213640\n永远永远永远永远\t213641\n娴静\t213642\n夜长长\t213643\n香油\t213644\n沈沉楷\t213645\n杨能\t213646\ngjchvdhhv\t213647\n支配权\t213648\n不着边际\t213649\n死因\t213650\n讯实\t213651\n小土豆\t213652\n自相\t213653\n老波\t213654\n木一切你不要和我一脚有想我吗吗游戏奥特曼\t213655\n眯\t213656\n职业生涯\t213657\n098765432112345788\t213658\n周丽颖\t213659\n张士腾\t213660\n十三时\t213661\nluna\t213662\n云霓裳\t213663\n一一次\t213664\n950米\t213665\n景博勋\t213666\n大增\t213667\n1238745880\t213668\n敢不认\t213669\n准能\t213670\n赴考\t213671\n安脚\t213672\n二四幺五\t213673\n509868\t213674\nkdbshyggbghjgdcvbgghgyhfhhgghhfSfghgffffgfgggffffdfcdfftfjgshaywgfjfhgjgfhhtgjrhgvcfjsufyxjfhfgh\t213675\n红米闹山\t213676\n贵在\t213677\n二四幺二\t213678\n死道友不死贫道\t213679\n新策略\t213680\n山鸡城\t213681\nstrilyzyourzhi\t213682\n只字\t213683\n泰巴萨\t213684\n虎皮蛋\t213685\n书照\t213686\n闭着眼\t213687\n怨男\t213688\nlxhx得瑟\t213689\n王俊泽\t213690\n千行\t213691\ngdfdcdgcgdttfyygg
gvgcccgjjgghtghfgcghbcvcvcvccvghgcccgvghhjhhhjbvvghjhhnnnmmnncchghhbbnnbbn\t213692\n江苏卫视\t213693\n屎棍\t213694\n解下\t213695\n我完了\t213696\ngpggh\t213697\n40度\t213698\n尼可\t213699\n阿2\t213700\n育儿\t213701\nCIB\t213702\n曲振杰\t213703\n多好呀\t213704\n无存\t213705\nint\t213706\ninv\t213707\n降雨伞\t213708\n钱宇航\t213709\n白银\t213710\niTV\t213711\ninn\t213712\n哈卡哈卡噶\t213713\nini\t213714\n叶姑娘\t213715\nind\t213716\n纵容\t213717\ning\t213718\n黑的白的一天到晚先登登\t213719\n25o\t213720\ndgk\t213721\n雁飞残月天\t213722\n陶友红\t213723\n三百五\t213724\n日中邦交\t213725\n征程\t213726\n扩\t213727\n痴拜\t213728\n潺潺潺潺潺\t213729\n嫁给\t213730\n非诚\t213731\n一石恶的小呀小\t213732\nseuv\t213733\n杀毒拍目的古铁雷斯人\t213734\n120棵\t213735\n法者\t213736\n朱枫\t213737\n炸N尔\t213738\nzcz\t213739\n170587020785868468536825085046839\t213740\n合同\t213741\n扮\t213742\n爱荷华州\t213743\n四碗\t213744\n我是死壮死壮死壮死壮死壮听懂\t213745\n24.95美元\t213746\n少校\t213747\n咯基亚\t213748\n山田源氏\t213749\n肛长\t213750\n6667666766\t213751\n无异\t213752\n唉行\t213753\n太看\t213754\n胡闹\t213755\n7dcdcd\t213756\n嗯好嘞行幺零零幺零六二\t213757\n写到\t213758\n能说着\t213759\n云县\t213760\n%99\t213761\n呀不\t213762\n阵法\t213763\n节号\t213764\n渐次\t213765\n问渠哪得清如许下半句\t213766\nWrd\t213767\n一遇\t213768\n呃飙车\t213769\n一遍\t213770\n学者\t213771\n学考\t213772\n全集\t213773\n不育\t213774\n需谨慎\t213775\n变身器\t213776\nTtftft\t213777\n一道\t213778\n学耗\t213779\n白骨塔\t213780\n游桂莲\t213781\nWrx\t213782\n为观\t213783\n抓好\t213784\n汪嶒\t213785\n出货\t213786\n陶醉\t213787\n一遭\t213788\nK厅\t213789\ndtrtg\t213790\n赚钱\t213791\n阿德雷\t213792\n深受其害\t213793\n那你慢慢木马木马木马\t213794\n扫视\t213795\n女童鞋\t213796\n是我儿\t213797\n37.05%\t213798\nfijifjd\t213799\n毛囊\t213800\n好等高\t213801\n薄雅慧\t213802\nhifyyyutrsrg7zty\t213803\nue4\t213804\n惹老娘小心\t213805\n释德扬\t213806\n罐身\t213807\n毛囚\t213808\n毛四\t213809\n联合液化石油气公司\t213810\n双十二\t213811\n覃诗涵\t213812\n35431\t213813\nG8额\t213814\n租赁\t213815\n馁\t213816\n当当当当当当当当当\t213817\nhed5\t213818\n秀屿区司法局\t213819\n手机银行\t213820\n49340384384939439\t213821\n4仟多万元\t213822\n嘞一\t213823\n度秘我求求你了我真钥匙\t213824\n55588\t213825\n一点五G\t213826\n学无止境\t213827\n完美的手\t213828\n羡慕嫉妒恨\t213829\n扐\t2
13830\n霍哥\t213831\n刚腹\t213832\n天真可爱\t213833\n发茶\t213834\n坏话\t213835\n我爱的人她不爱我\t213836\nchef\t213837\n李昌连\t213838\n好好朋友\t213839\n纳纹\t213840\n发茨\t213841\n包钢\t213842\n眼明手\t213843\n丨nn\t213844\n反对票\t213845\nGAgoGS\t213846\n输出\t213847\n首夺\t213848\n攻打\t213849\n聪明白\t213850\n12646872684\t213851\n广角\t213852\n篱边\t213853\n李222222\t213854\n顾不得\t213855\n刘默涵\t213856\nFhnffbvffbf\t213857\n糕\t213858\n糗\t213859\n糖\t213860\n糙\t213861\n日腿\t213862\n糜\t213863\n糟\t213864\n22个\t213865\n淆霖风\t213866\n个字\t213867\n糊\t213868\n个子\t213869\n幽灵小猫历险记\t213870\n闫伟伟\t213871\n22万\t213872\n小学块\t213873\nDrgyj\t213874\n个孩\t213875\n糸\t213876\n系\t213877\n份妮\t213878\nvdcrhucscyxwe一星\t213879\n心心点灯\t213880\n22下\t213881\n糠\t213882\n糢\t213883\nsjcsd\t213884\n多手\t213885\n33435363738\t213886\n布拉格布拉格广场\t213887\n在家不我和你\t213888\n不然的话\t213889\n泥城\t213890\n海南人光\t213891\n一百辈子\t213892\n3位\t213893\n尚无\t213894\n相信爱情\t213895\n要不会话\t213896\n罗嗦鬼\t213897\n刘彦瑞\t213898\n大孝\t213899\n老姑娘\t213900\nivgigig\t213901\n煎堆\t213902\n死丢皮\t213903\n听好不\t213904\nhhhhdudid\t213905\n绣像\t213906\n啊儿\t213907\n星系咪\t213908\n恐龙\t213909\n大仲马\t213910\n太贱了钱\t213911\ngvfghjn\t213912\nSECRET\t213913\n看不碰\t213914\n组稿\t213915\n四个四百\t213916\n骨en\t213917\n2951587101\t213918\noneME\t213919\nloveyou\t213920\njngfg\t213921\nshaguadalao\t213922\n43qy\t213923\n样子\t213924\n呵恩\t213925\n一口囗x9口\t213926\n规律\t213927\n两岁半\t213928\nhgathjg\t213929\n林伟\t213930\nKotsiopoulos\t213931\n李愁\t213932\n你是谁呀23\t213933\n肉脯\t213934\n江南水乡\t213935\n不听来\t213936\n幸运之神\t213937\n3200\t213938\n180多块\t213939\n忆起\t213940\n2005年12月18\t213941\n得着\t213942\n蔡苒苒\t213943\n2000年\t213944\n演艺公司\t213945\n港行\t213946\n哈哈哈你是我女神了咯度秘爱来丁么么哒\t213947\n十九分之四\t213948\n小矮子呀小矮子矮子矮子小矮子\t213949\n工运史\t213950\n本月十一号\t213951\n看着我好\t213952\n快手日韩剧\t213953\n背背\t213954\n绝望感\t213955\n争光\t213956\n达菲\t213957\n百川\t213958\n被接\t213959\n酸味\t213960\n588588888888\t213961\n82.38%\t213962\n麼辦\t213963\nGgs\t213964\njjjjjjjjjjjjja\t213965\n背胶\t213966\n顾什么失\t213967\n火焰真\t213968\n鸿星尔克\t213969\nGgg\t213970\nGgf\t213971\n认识了你\t213972\n炉石传说
有图\t213973\n龙葵\t213974\npizza\t213975\n份暖\t213976\n雅鼎\t213977\nvgi股\t213978\n甩一甩\t213979\n天相\t213980\n835号\t213981\n有点累\t213982\n蔡佳琪\t213983\n长治安\t213984\n不还十八好你小子\t213985\n叮叮\t213986\nWiggy\t213987\n两百多晚安\t213988\n天目\t213989\n二月几日\t213990\n延时丹图单词度绕科\t213991\n天盖\t213992\n一一号\t213993\n窝伐\t213994\n长江三峡\t213995\n信达证券\t213996\n销售量\t213997\n有我可以\t213998\n好叭度秘你喜欢我\t213999\n侠士\t214000\n信号源\t214001\n笑容\t214002\n范博\t214003\n八九百\t214004\n噩额\t214005\n墨鱼骨\t214006\n哈利波特凉\t214007\nS40\t214008\n草花头\t214009\n吧男\t214010\n我不想说\t214011\n袁后果\t214012\n拜见\t214013\n52年\t214014\n累赘\t214015\nneta\t214016\n冲锋枪王有种想当班多少瓜子好吃好湍\t214017\n蒙kd61\t214018\n诉说\t214019\nxuaohua\t214020\n冷的天\t214021\n段佩雯\t214022\n李曦雯\t214023\n丙数\t214024\n阿姑姑\t214025\n诉诸\t214026\n不结\t214027\n胡海生\t214028\n月荷\t214029\n借手\t214030\n8784755\t214031\n狗药\t214032\n不给\t214033\n我在说英语stmetou\t214034\n诉诉\t214035\n品位\t214036\n15寸\t214037\n51点\t214038\n鹿晨光\t214039\n江美丽\t214040\n死丑八怪\t214041\n哈里\t214042\n乍春\t214043\n不经\t214044\n华晨宝马\t214045\n侵略\t214046\n转业\t214047\nmgmg\t214048\n啾啾啾啾啾啾啾\t214049\n亚不生\t214050\nudhftyid\t214051\n转世\t214052\n解绍\t214053\n转下\t214054\n解结\t214055\n解绑\t214056\n番薯条纹路\t214057\n全港\t214058\n80年代\t214059\n郑我\t214060\n三八33\t214061\n所\t214062\n转为\t214063\n致幻剂\t214064\n傲世我错了不是你错\t214065\n田颖\t214066\n桃浦\t214067\ndhs\t214068\n绯情\t214069\n一战壕\t214070\n就堡\t214071\n褒义思汗\t214072\n日日织博好\t214073\n他一定很爱你\t214074\n祸国殃民\t214075\n张乐斌\t214076\n吴正明\t214077\n朱宝怡\t214078\n第九名\t214079\n很腹黑\t214080\nd40\t214081\nyom\t214082\nxgn\t214083\nlogt\t214084\n檢查\t214085\nxgj\t214086\nxgk\t214087\nxgh\t214088\ndhdhdydyydhdyeyydyryydyrryydyryfyryfyrggsdy\t214089\n152816638332\t214090\n张博\t214091\n吴玉美\t214092\n亲子装\t214093\nxgc\t214094\n八八兔兔\t214095\n悄然\t214096\n弄鬼先人\t214097\n枯木\t214098\n13405339823\t214099\n小气包\t214100\nlogo\t214101\n手机病毒\t214102\n歌手\t214103\n张华\t214104\n张立鹏\t214105\nsfsq\t214106\n369958\t214107\n冇麦基嘅\t214108\n驱除\t214109\n小流氓\t214110\n陈幽\t214111\n陈平\t214112\n上我家吧\t214113\n陈年\t214114\n跑粗\t214115\n张卡\t214116\n蚂子\t214117\n美桓\t214118\
n丞相\t214119\n事业主\t214120\n资本\t214121\n我和我的笑话\t214122\n青阳\t214123\ngcggc\t214124\n空话\t214125\n踩踏堡\t214126\n大哥哥哥哥\t214127\n找片\t214128\n南京政府\t214129\n二月十二日\t214130\n百视通\t214131\n买买买买\t214132\n打工仔\t214133\n党派\t214134\n笑话笑\t214135\n针筒\t214136\n郎世宁\t214137\nJHHOjnp\t214138\n廖碧\t214139\n5月15日下午4点半\t214140\n丁亮茹\t214141\n狐臭\t214142\n晋南\t214143\n程梓函\t214144\n13958762305\t214145\n朱子涵\t214146\n丹辉\t214147\n冷血冷血\t214148\n晋升\t214149\n藤缠楼\t214150\nYs仢\t214151\n画面感\t214152\n内测版\t214153\n挂失\t214154\n呃弄\t214155\n八轮\t214156\n涌\t214157\n呃开\t214158\n打拢\t214159\n涉\t214160\n消\t214161\n涅\t214162\nzhenranm\t214163\n涂\t214164\n涝\t214165\n50年代\t214166\n打拳\t214167\n涚\t214168\n门儿性\t214169\n涂晨\t214170\n涕\t214171\n打拼\t214172\n涓\t214173\n囊鑫\t214174\n省电\t214175\n你好世界\t214176\n涮\t214177\n成都铁路局\t214178\n大坏旦\t214179\n涩\t214180\n服装厂\t214181\n涧\t214182\n润\t214183\n女大漾\t214184\n涡\t214185\nseityou\t214186\njhshjd\t214187\n打拐\t214188\n涵\t214189\n麦茶\t214190\n液\t214191\nkgddtk\t214192\ngifxxoo\t214193\n虽至\t214194\n还好呀\t214195\n1993年4月25日\t214196\n930吨\t214197\n999999999999999999999999999999999999999999999999999\t214198\n钵钵鸡\t214199\n成治\t214200\nwbuu\t214201\n度秘你是好样\t214202\n建工\t214203\n大一届\t214204\n邹凯\t214205\n肖像轩\t214206\n超男\t214207\n杨佳瑜\t214208\n乖小度\t214209\n张歆艺\t214210\n甜茶\t214211\n豁批\t214212\nk7k了会\t214213\n收车\t214214\n西格托呗\t214215\n聪明的一个极格叽格叽格叽格叽\t214216\n不哭闹\t214217\n垫款\t214218\n怀神秀\t214219\n剰\t214220\n割\t214221\n不和你策\t214222\n小公狗\t214223\n力压\t214224\n剿\t214225\n10899888\t214226\nvv不\t214227\n像素\t214228\n剧\t214229\n难上\t214230\n六安明天天晴\t214231\nHds\t214232\n剪\t214233\n好劲道\t214234\n副\t214235\n剑\t214236\n在编\t214237\n说承办\t214238\nKOBKHUNKA\t214239\n剔\t214240\n李笑薇\t214241\n加澳\t214242\n10085\t214243\n10086\t214244\n黄斑鱼\t214245\n崔均海\t214246\n坠毁\t214247\nyfjjgfhi\t214248\n剁\t214249\n超短\t214250\n剃\t214251\n剂\t214252\n湖边\t214253\n則\t214254\n拍发\t214255\n37周\t214256\n剋\t214257\n削\t214258\n前\t214259\n酬狗\t214260\n转帧\t214261\n烘焙\t214262\n稀有度秘\t214263\n益智仁\t214264\n领英\t214265\n5元\t214266\n请原谅\t214267\n惹不得\t214268\n方金悦\t
214269\n5克\t214270\n赐死\t214271\n百八十万\t214272\n太杉\t214273\n110111\t214274\n慢懂\t214275\n说明星\t214276\nkcjvghftccdsshjifcthvghygfd\t214277\n萨萨\t214278\n真丑我真没\t214279\nuuohgu\t214280\n尼伙枢纽个呵欠你\t214281\nFxcfxsfs\t214282\n晚班\t214283\n海淘的暗雪山孤城遥望玉门关黄沙百战穿金甲\t214284\n韩饭\t214285\n泥状\t214286\n说什么呀嘟咪\t214287\n摩登年代\t214288\n受孕\t214289\nchiphotosbaiducomxiaodupicitem8c1001e93901213fea5b5fa853e736d12f2e955djpg\t214290\nufygvvvhvhvhgufyfufguyuvxhxudhewioqpxhiwd\t214291\n徐钰沂\t214292\n傻咧\t214293\n五牛牛\t214294\n568元\t214295\n单萧颖\t214296\n回耒\t214297\nga1a1a1b\t214298\n执纪\t214299\n凤凰你猜的呗拉你有男妇幼\t214300\n思维能力\t214301\n李赫\t214302\n晚年\t214303\n任性\t214304\n拜拜度\t214305\n其實還沒\t214306\n撕掉\t214307\n随风\t214308\nhrz\t214309\n金榜\t214310\nhrr\t214311\n苦圈\t214312\nhrh\t214313\nhrk\t214314\n卖国\t214315\n好拜拜\t214316\nhrd\t214317\nhrg\t214318\nhrf\t214319\n40多\t214320\nhrb\t214321\n88556585888\t214322\n失陷\t214323\n关元穴\t214324\n工会\t214325\n分封制\t214326\n周贵龙\t214327\n曾经不远千里\t214328\n吟咏\t214329\nffghhvc\t214330\n失陪\t214331\n飞天\t214332\n粉晶\t214333\n天上错错错\t214334\n就是麻烦我要吃你了了我从无限来猪猪猪猪猪猪猪猪猪全都\t214335\n恩2\t214336\nwahy\t214337\n俄国\t214338\nDecf\t214339\n硬哥\t214340\n周彦\t214341\n腋臭\t214342\n周彤\t214343\n透明人\t214344\n200亿元\t214345\n好啦别那么老公我爱你行\t214346\n1388\t214347\n梦享\t214348\n飞复\t214349\n别这样子\t214350\n1382\t214351\n最后的日子\t214352\n新鲜事\t214353\n654764665\t214354\n蜗牛与黄鹂鸟\t214355\n千纸鹤\t214356\n二百遍\t214357\n天行者\t214358\n不可能的事我不喜欢你我讨厌你度秘\t214359\n力哥哥\t214360\n11月26日\t214361\n呕吐\t214362\n悲壮\t214363\n小喜\t214364\n1572505558\t214365\n球探讨厌讨厌讨厌讨厌讨厌讨厌讨厌\t214366\n66888\t214367\n2012年伦敦奥运会\t214368\n我不我不要\t214369\n达娜\t214370\n才叫做\t214371\n沧海汤\t214372\n自以为是\t214373\n撒缺\t214374\n单田芳\t214375\n周廷欣\t214376\n33只\t214377\n两一晚\t214378\nGgdfffffeuggj\t214379\nmkkkkkkkk\t214380\n南浦广场公园\t214381\n小喵\t214382\n礼包\t214383\n我和你的女神\t214384\n我不是兄弟我不是你的兄弟我没有你这样的兄弟我是鬼\t214385\nlover\t214386\n预告片\t214387\n五号下午\t214388\n一道通\t214389\n4255585\t214390\n唐澈\t214391\n十二星龙\t214392\n我好咳嗽\t214393\n哼狂\t214394\n孑然一身歹轼甘曳步舞\t214395\n万里那\t214396\n思雅特\t214397\n淮滨\t21439
8\n从不间断\t214399\n臭美\t214400\n蛮荒\t214401\n行崔\t214402\n184829282727660281844850514\t214403\n了okokokokokokokokokkkkkkk\t214404\n57541968484894678464946448496494694949564946494646\t214405\n猜一手三\t214406\n刘波英\t214407\n深深深\t214408\n逐个\t214409\nucomakass\t214410\n阿b\t214411\n启航\t214412\n名胜\t214413\n豁达\t214414\nGrigjgjg\t214415\njdjdnixndjcifbehksbxjcncr\t214416\nviyay9yaivy\t214417\n山真险啊危峰兀立怪石嶙峋\t214418\nseoula\t214419\n你我\t214420\n百度贷款\t214421\nｈｏｕｆ\t214422\n夺帅\t214423\n孙铭阳\t214424\ndddssssssssssss\t214425\n幺八六二二四六七九幺三\t214426\n成命\t214427\n拔腿\t214428\n姨裤\t214429\n当日\t214430\navou音乐avou音乐二安evinuo\t214431\n新世纪杯\t214432\n8丝\t214433\n十二耿饼\t214434\n王圳龙\t214435\n杭淑怡\t214436\n西安\t214437\n自然法则\t214438\n拂面\t214439\n月收入\t214440\npapikathreas3\t214441\n朱霞\t214442\n11111110000000000999元\t214443\n灰腐病\t214444\nUC思密达\t214445\n守全\t214446\n有不就\t214447\n钻机\t214448\n美尔雅\t214449\n分曹射\t214450\n王梦雪\t214451\n隐性过敏\t214452\n汗水\t214453\nWozaigengni\t214454\nopulamyou\t214455\n糖类\t214456\n美丽唱\t214457\n焦恩俊\t214458\n嬲坨\t214459\n陆冰儿\t214460\n双鱼座主宰星\t214461\n西安城东客运站\t214462\n喜欢你的理由\t214463\n问清\t214464\n15030985922\t214465\n星彤\t214466\n红小队侠\t214467\n文我\t214468\n畸\t214469\n杨新欣\t214470\n方萌荻\t214471\nyor\t214472\n张梓祎\t214473\n真的你好\t214474\n买钱\t214475\n问游\t214476\n天台二女\t214477\n环保主义者\t214478\ncaffe\t214479\n去不呀\t214480\n云南神经病\t214481\n避开\t214482\n舒舒服服\t214483\n断桥阵\t214484\n没难过\t214485\n2下\t214486\n崔翔\t214487\n哼打\t214488\n哈太郎\t214489\n青天白日旗\t214490\n默默摸摸\t214491\nccccccccc\t214492\n畀\t214493\n赵俊\t214494\n找一人工\t214495\n卡哈比\t214496\n畆\t214497\nYhhf\t214498\n97.8%\t214499\n月抛\t214500\n2个\t214501\n道听途说\t214502\n2两\t214503\nUfjjffjd\t214504\n一暮\t214505\n引水\t214506\n巨人机\t214507\n晚安刘诗诗#\t214508\n雅照\t214509\n陆植\t214510\n口码\t214511\n3214569875\t214512\ncjvjjdfjnlhgswxfchjvlljlgsdtdhvkhl\t214513\n碓\t214514\njukuput\t214515\nhrhjfuchuujhhhhhhhhhhhhhhhhhhhhh\t214516\nWDC\t214517\nKISS\t214518\n风起苍岚\t214519\n小曦曦\t214520\n促销品\t214521\n护肤\t214522\n3321\t214523\n咬人\t214524\n唔行\t214525\n碟\t214526\n2年后\t214527\n啦啦啦啦啦啦啦啦我是唱歌的小女
孩\t214528\n范星光\t214529\n界\t214530\n护肩\t214531\n植物大战僵尸吧主\t214532\nhhff\t214533\nhhfg\t214534\nhhfd\t214535\nhhfb\t214536\nsosthtoso\t214537\n匪夷所思\t214538\n好我我我好我好\t214539\n虎狮\t214540\n11PM\t214541\n电信3G\t214542\nhhfi\t214543\n放丝\t214544\n何艳花\t214545\n万山拉\t214546\nhhfp\t214547\n沫姐\t214548\nlt撒花\t214549\n虎狼\t214550\n玩蛋\t214551\n不要不说\t214552\n不发短信\t214553\n96784213540798643134867318186537816\t214554\n九百米\t214555\n音太\t214556\nygugyvybhgyvrtybgufthobuvhbojvuvunyvubtdtbjgufjhfjgdhjvg\t214557\n怕害\t214558\n安十四\t214559\n一百毫升\t214560\n噩噩噩噩噩噩噩\t214561\n窦异位\t214562\n望所\t214563\n卷走\t214564\n8个\t214565\n远近\t214566\n完整性\t214567\n還用\t214568\n解决\t214569\n不可方物\t214570\n一53岁段\t214571\n王炸汗\t214572\n可以你好美呀看超级飞侠\t214573\n解冻\t214574\n言和\t214575\n猜懂\t214576\n远远\t214577\n囖摸囖\t214578\n势歉睫\t214579\n三六元\t214580\n面對現實\t214581\nu49\t214582\niu21\t214583\nvdkdjdj\t214584\n米求你了麽咋哪呢狡诈找\t214585\n刘先瑞\t214586\nsisl\t214587\n佛言\t214588\n咯娄\t214589\n一贯能\t214590\n沙底\t214591\n敬明\t214592\nhbgf\t214593\n海乡\t214594\n远进\t214595\n扇死\t214596\njviv\t214597\n错天\t214598\n不能不能不能\t214599\n3002.6万台\t214600\nxuanhao\t214601\n错大\t214602\n寸有\t214603\n赵博雅\t214604\n拜脱\t214605\n海乌\t214606\n氨基酸模\t214607\n错失\t214608\n350厘米\t214609\n碳\t214610\n睡呗\t214611\n298.75M\t214612\n100100\t214613\n燃煤\t214614\n艘船\t214615\n感真\t214616\n亲戚们\t214617\npose\t214618\n走就走呗\t214619\n食啊华\t214620\n睡员\t214621\n碾\t214622\n1997年\t214623\n周振翔\t214624\n碧游\t214625\npost\t214626\n微访谈\t214627\n192020\t214628\n睡告\t214629\n六合皇\t214630\n弹球\t214631\n李到我\t214632\n大太搞笑\t214633\n帅脚\t214634\n仁恒\t214635\n碱度\t214636\n奔跑吧兄弟打\t214637\n扰乱\t214638\n明员\t214639\n1476200179\t214640\n中篇\t214641\n你的他的时候\t214642\n车源\t214643\n以后\t214644\n死人儿\t214645\n接触\t214646\n贤旭\t214647\n拾\t214648\n凤凰兄弟\t214649\n宋文杰\t214650\nIpad2\t214651\n你在哪儿啊我想和你约会\t214652\n波音驻莫斯科代表处\t214653\n斯文\t214654\nefsxer\t214655\naaoo\t214656\n凶狠的人\t214657\n可不可\t214658\n生物猪\t214659\n亲太\t214660\nwan\t214661\n疯言疯语\t214662\n累试试\t214663\nwah\t214664\n亲够\t214665\nwac\t214666\n俊朗\t214667\nwaa\t214668\n虚像\t214669\nFCC\t2
14670\n烦乱\t214671\nwaz\t214672\nway\t214673\n死亡事件\t214674\nwau\t214675\n郭俊辰\t214676\nwas\t214677\nwar\t214678\n优惠码\t214679\n第4\t214680\n第七期\t214681\n194779632\t214682\n这一项\t214683\n1ssw22ewwwwweeeeeeeeeefggtrrxsdeeerrreeeeeeeeeew\t214684\n弥陀佛弥陀佛弥陀佛ometometometoyoumaitfoyometishiometfoompado\t214685\n袁媛\t214686\n得秘\t214687\n纳粹蛋蛋\t214688\n丁老头\t214689\npomaju\t214690\n统战部长\t214691\nmlllkn\t214692\n653619\t214693\n屌嘞\t214694\n乐读网\t214695\n一百五十五\t214696\n过程度\t214697\n给我的话\t214698\n2012年6月2日\t214699\n六明天\t214700\n像我一样\t214701\nabbbb\t214702\n合照\t214703\n火星来的我是钢铁爆破丸子\t214704\n8528\t214705\n讨薪聚众\t214706\n事实实事\t214707\n吴志萍\t214708\n风骨\t214709\n朱易涵\t214710\n打错了吧\t214711\n第一百篇\t214712\n攻占\t214713\n杨嘉怡\t214714\n能不能说\t214715\n古代的我是现代的我是新世纪\t214716\n老坑\t214717\n老梁\t214718\n认识于\t214719\n温马克\t214720\n我不我不应该\t214721\n笔钱\t214722\n认识了\t214723\n王佳琪\t214724\n想会不会\t214725\n审议\t214726\n加雅\t214727\n复吸\t214728\n常来\t214729\n审计\t214730\n欧洲冠军联赛\t214731\n我是爱你的我想你\t214732\n登封\t214733\n安茄\t214734\n3.4级\t214735\n王海干\t214736\n攻博\t214737\n小不点儿在不在小不点才不在\t214738\n常杰\t214739\n第三版\t214740\n举不胜举\t214741\n弃子走\t214742\n一国\t214743\n停费\t214744\nNK01\t214745\n一围\t214746\n心目\t214747\n花种花\t214748\n创意性\t214749\njhbj\t214750\n为吉g1\t214751\n一囧\t214752\n韩美\t214753\n一团\t214754\n鄂州\t214755\n一回\t214756\n顺颊\t214757\n235555553554236555336655\t214758\n一四\t214759\n科尔马\t214760\n韩羽\t214761\n早上8点\t214762\nfos\t214763\n小龙绿园\t214764\n贵翠\t214765\n起行\t214766\n助人为乐\t214767\nglugl\t214768\n好巴登\t214769\n咪表\t214770\n铠口\t214771\n染拉\t214772\n两院院士\t214773\n郑静萍\t214774\n王系统\t214775\n上林正英\t214776\n35.3\t214777\nGibb\t214778\n上初\t214779\n四四十个\t214780\n290余所\t214781\n病员\t214782\n实时时\t214783\nfoz\t214784\n游记\t214785\n上分\t214786\n我们么么哒萌萌哒萌萌哒度秘萌萌\t214787\n花痴症\t214788\n好你好好\t214789\n上到\t214790\n福山\t214791\n284倍\t214792\n王祖\t214793\n刘金富\t214794\n跳屏\t214795\nqixi\t214796\nyvg\t214797\n孙帅超\t214798\n袒护\t214799\n小纳子\t214800\n高刑\t214801\n这个手\t214802\nyvh\t214803\n前钱\t214804\n尔曼\t214805\n太难听\t214806\n林紫嫣\t214807\n变身\t214808\n刘导\t214809\n汪汪汪\t214810\n
没同\t214811\n燃气\t214812\n不知你是\t214813\n专横\t214814\n黑糖群\t214815\n温氏鹊\t214816\n郝安娜\t214817\n一四二四三四四四五十六岁\t214818\n没吧\t214819\n没听\t214820\n解惑\t214821\n拜拜我不想陪你聊了求你了你不要再打扰我了拜拜\t214822\n普贤\t214823\n推崇\t214824\n免税\t214825\n五月初十\t214826\n没吻\t214827\nlfromegish\t214828\n芭拉芭\t214829\nfenhuang\t214830\n雾化\t214831\n说真话\t214832\n彼得巴菲特\t214833\n实话说的话\t214834\n6986\t214835\n损毁\t214836\n10.0公里\t214837\n像匆匆忙忙\t214838\n湖景\t214839\n西数\t214840\n博罗人\t214841\n损比\t214842\n六榕\t214843\n脸缩句\t214844\n黄涛\t214845\nschoolwere\t214846\nghhhbbbfbtljtxmj1jtzTJTAD0JT\t214847\n烟玖哲\t214848\ndutshl\t214849\n老糖\t214850\n王者荣耀\t214851\n一凡人\t214852\nanIta\t214853\n香材\t214854\n笔袋\t214855\n一遍个\t214856\n揪心\t214857\n松辉路\t214858\n默默默\t214859\nghfhchyb\t214860\n千绫\t214861\n汉强\t214862\n我离开\t214863\nBrilliant\t214864\n晋州\t214865\n仲裁\t214866\n西乐葆\t214867\n互相亲\t214868\n手足\t214869\n大咖秀\t214870\n波段\t214871\n宽松\t214872\n可怜天下父母心\t214873\n当当糖\t214874\n千细\t214875\n一二三二十六七万二十六七八\t214876\n小树\t214877\n确想\t214878\n奥特秘\t214879\nfzjkvu\t214880\n华人\t214881\n我的妈妈梦和你\t214882\n十二G流量\t214883\n吃饱非\t214884\nING\t214885\n1194027181\t214886\n马啸\t214887\n敦化市\t214888\n8uy\t214889\n主旋律\t214890\n弥合\t214891\nuurh\t214892\n九点八点\t214893\n嗯累\t214894\n更加\t214895\n0878\t214896\n一八二十四亿\t214897\n轻点儿\t214898\n吴林超\t214899\n录音果\t214900\n中国银监会\t214901\n0876\t214902\nhhyhgxhovjfnjxvxxvxvcb\t214903\n男人块\t214904\n乙城\t214905\n12月2号\t214906\n心有所爱如此美好\t214907\n双人浴\t214908\n绕圈\t214909\n835325\t214910\n蛮横\t214911\n一九六五\t214912\n鱼缸\t214913\nF\t214914\n脚丫子\t214915\nbatbal\t214916\n大东西\t214917\n评判\t214918\n句子行\t214919\n三五来\t214920\n停不下\t214921\nilljaa\t214922\n奥一\t214923\n佛寺\t214924\n有限公司\t214925\n季微然\t214926\n幼童\t214927\n啦啦247\t214928\n雄鹿\t214929\n门红\t214930\nnishidali\t214931\n王德燕\t214932\nPrin\t214933\n启恩\t214934\n美侣\t214935\n山空\t214936\n评分\t214937\n佳贝犬\t214938\n中国滑冰协会\t214939\n、、\t214940\n整我不够短\t214941\n复飞\t214942\n真分数\t214943\n理赔\t214944\n二郎神\t214945\n潘云岚\t214946\n蒋懿案\t214947\n奇娜\t214948\n陈棉山\t214949\n善如缘\t214950\n职来职往\t214951\n阿长\t214952\niPhone9\t214953\niP
hone6\t214954\n60jjijin斤\t214955\n德古斯曼\t214956\n舔阴\t214957\n仰恨\t214958\n女界\t214959\n八喜\t214960\n卡萨布兰卡\t214961\n学名\t214962\n学吃\t214963\n翻空\t214964\n圣息焉公墓\t214965\n小糖丶\t214966\n赚了钱\t214967\n豫金刚石设立子公司\t214968\n相刑相着\t214969\n屁屁丫\t214970\n立锥\t214971\n产业园\t214972\n生猛\t214973\n德化\t214974\n临汾百汇\t214975\n第61届\t214976\n我不是猪我是人你是狗\t214977\n零四幺二\t214978\n学吧\t214979\n陶文\t214980\n开玩笑见\t214981\n威斯康星大学\t214982\n嗯先啵\t214983\n敢假传\t214984\n伤君\t214985\n一个小时\t214986\n19800\t214987\nghuuh\t214988\n五和罗\t214989\njuuggbjjgv\t214990\n那一个月\t214991\n照怪\t214992\n没得病\t214993\n高芯烨\t214994\n5555424252245\t214995\n名牌化\t214996\n我的错号\t214997\n我的孩子你是真不要脸的孩子呀你是真不要脸的孩子呀你\t214998\n贪吃\t214999\n弦弓\t215000\n红石\t215001\n古古古古五\t215002\n值哭\t215003\n格叽格叽格叽格叽格叽格叽格叽\t215004\nkvdy\t215005\nttucfhrryh\t215006\n开心感\t215007\n圣诞礼物\t215008\n你臭臭我是王\t215009\n东方财富\t215010\n身高姐\t215011\n产业\t215012\n蒂亚\t215013\nshhsbd\t215014\n第五天\t215015\n真样子\t215016\n第五大\t215017\n歼-20战斗机\t215018\n民法\t215019\n金服\t215020\n就是不是\t215021\n俩一档\t215022\nwixy\t215023\n发影\t215024\n殷苠蔚\t215025\nt英\t215026\n发彩\t215027\n金木\t215028\n青春\t215029\n怠慢\t215030\n好美美\t215031\n句法\t215032\n求职\t215033\n狠狠狠狠\t215034\n过来玩\t215035\n利用\t215036\n三李正西区\t215037\n夜泊\t215038\n生不入死\t215039\n救火\t215040\nOkdchin\t215041\n半爱\t215042\n2014年6月15日\t215043\n圣诞大战#\t215044\ngkfj\t215045\n你好果\t215046\n秘秘红\t215047\n使跟\t215048\n卡贴\t215049\n磁悬浮\t215050\nYOGO\t215051\n找我行\t215052\n桑帕吉塔天然气田\t215053\n超级超级超级超级超级超级爱\t215054\n宁德市\t215055\n2013年1月14日\t215056\n圆润\t215057\n利生\t215058\n白婆娘\t215059\n龙纹身的女孩\t215060\n多一袋\t215061\n问无答\t215062\n咪霸\t215063\n小年夜北\t215064\n麻粉\t215065\n睡死\t215066\n哦麦\t215067\n知商\t215068\n大你的每一起过吧行\t215069\n头疼\t215070\n李晴茹\t215071\n不动产登记\t215072\n金润区\t215073\n侯丝风\t215074\n水猴\t215075\n净空\t215076\n煎熬期\t215077\n抵毁\t215078\nkoug\t215079\n酸菜鱼\t215080\nssgjo\t215081\n大家好我是某某某\t215082\n军警\t215083\n二二五\t215084\n两千年八月14号\t215085\n我疯\t215086\n牛角\t215087\n本宫请安吧小肚子\t215088\n7k7k7k7k7k7k7k7k7k7k\t215089\n吕睿三\t215090\n二二二\t215091\n满口龃齿&amp\t215092\n发飚\t215093\nKnitI\t215094\n嗯嗯我好寂寞\t
215095\n靠再说\t215096\n一百四一百三十二号\t215097\n陪同\t215098\n包头市\t215099\n智袋\t215100\n四小\t215101\n12345671889\t215102\n陪吃\t215103\n想度\t215104\n扭回\t215105\n垃圾车\t215106\n马宇生\t215107\n1222\t215108\n1221\t215109\nwedo\t215110\n阿梦\t215111\n1229\t215112\n1228\t215113\nmNNEJTJTK\t215114\n说否则\t215115\n凝肌玉肤\t215116\n表介意\t215117\n革命\t215118\n电商\t215119\n年快乐\t215120\n吴收购\t215121\n滚球的吧求求求求求求求气球\t215122\n谱曲\t215123\n神人畅\t215124\nBeginsStore\t215125\n斩首\t215126\nggrg\t215127\n雀子\t215128\nghiphotosbaiducomxiaodupicitem562c11dfa9ec8a13c5e81f93f003918fa1ecc0d7jpg\t215129\n峰局\t215130\n很喜欢我希望\t215131\n守住\t215132\ndjjrjfjgngjffkf\t215133\n苗苗阿里\t215134\n熟客\t215135\n过来人\t215136\n公鸡\t215137\n铁警\t215138\n凤姐\t215139\n喜兄\t215140\n过来亲\t215141\n3d块\t215142\n27场\t215143\n我爱的眼花了就在你和玫瑰花给你\t215144\njodie\t215145\n二百九十一一米\t215146\n孟郊\t215147\n严谨\t215148\n孟郁\t215149\n22载\t215150\n植物大战僵尸游戏\t215151\n炮兵\t215152\n直气壮咯\t215153\n樊天阳\t215154\n到一拖再拖\t215155\nrdocucckb\t215156\n过来了\t215157\n3756167\t215158\n2种\t215159\n答案眼\t215160\n陈彩梅\t215161\n陡\t215162\n尹士\t215163\n我也好爱好爱你二\t215164\n院\t215165\n陬\t215166\n险\t215167\nslas\t215168\n陪\t215169\n陵\t215170\n陷\t215171\nkqkf\t215172\n5点21分\t215173\n选民\t215174\nCA4\t215175\n鹿狗\t215176\n这意思\t215177\n韩秋尘\t215178\n我投诉你\t215179\n际\t215180\n附\t215181\n屠主席\t215182\n陆\t215183\n哇塞\t215184\n降\t215185\n绳子\t215186\n毗卢观音像\t215187\n齐心\t215188\n陈\t215189\n顾淑栋\t215190\n陕\t215191\n陖\t215192\nFhfhjgucgj\t215193\n限\t215194\n成泰文\t215195\n13937565449\t215196\n不完美小孩\t215197\n翟泽豪\t215198\n唐帅\t215199\n我喜欢我现在这样的生\t215200\n李丹\t215201\n穿戴\t215202\n李丽\t215203\n梁晓\t215204\n悦悦\t215205\n5127568555\t215206\n取胜\t215207\n82525254853815686596855555565555562261555545555455465555555\t215208\n框框\t215209\n李个\t215210\n旺仔泡芙\t215211\n我是谁把\t215212\nhellokhellok\t215213\n东莞东城海雅百货\t215214\n一来子\t215215\n12358694752\t215216\n票漂\t215217\n李东\t215218\n封闭式\t215219\n3月11号\t215220\n唐嫣儿\t215221\n李下\t215222\nVader\t215223\n李三\t215224\n李不\t215225\njglgagaj\t215226\n米真小\t215227\n冯小宝\t215228\n李一\t215229\n中国海关\t215230\n6天\t215231\n
铠甲勇士之帝皇侠\t215232\n好奇术\t215233\n柔软\t215234\ndonot\t215235\nabcabc\t215236\n两百米\t215237\n做我女\t215238\n小武\t215239\n两座\t215240\n两度\t215241\n感动\t215242\n2098\t215243\n一条天\t215244\n杨素\t215245\nbjxs\t215246\n数\t215247\n杨紫\t215248\nUsid\t215249\nthfhfhmjugffddddddhgagjhvnlgnjfdjkhdhoj\t215250\n阿璃\t215251\n18-21日\t215252\n马丁吉他\t215253\n嫑莪\t215254\n侯亚鹏\t215255\n鳓鱼\t215256\nCamp\t215257\n度教秘\t215258\n郭涵希\t215259\n962464\t215260\n才神\t215261\n才鬼\t215262\n干将\t215263\n王艳丽\t215264\n藤席\t215265\n淘粪\t215266\n这自恋狂\t215267\n负度秘\t215268\nffq\t215269\n小山沟\t215270\nuvw\t215271\ntcjat\t215272\n九次\t215273\n國旗\t215274\n43度\t215275\n敧\t215276\n斯比得\t215277\n音舞诗\t215278\n淘粉\t215279\n菲海军\t215280\n孕妇词\t215281\n紫卡\t215282\nffk\t215283\n盛云锋\t215284\n九款\t215285\n洛桑\t215286\n五周\t215287\n铺天盖地\t215288\n比努\t215289\nGlfhflxkxkc\t215290\n为一\t215291\nffh\t215292\n八皮\t215293\nrourp\t215294\nCrew\t215295\n黄五杠&amp\t215296\n性烈\t215297\n五味\t215298\n一个钟\t215299\nffe\t215300\n鸭梨山\t215301\nDurmaz\t215302\n辛苦你了秘术\t215303\n恩加卑鄙\t215304\ny2\t215305\n大乐\t215306\n刘凡义\t215307\n羽宁\t215308\n为为\t215309\n为主\t215310\n孙伟翌\t215311\n董冰羽\t215312\n度秘你还在么饮料好你说的美不美\t215313\n少年饭店\t215314\nTheshow\t215315\n边塞\t215316\n27680\t215317\n你猜我猜你猜不猜我猜你猜\t215318\n老实可爱\t215319\n吐下去\t215320\n医疗事故\t215321\n此念\t215322\n图灵测试\t215323\n上两天\t215324\n9点到10点\t215325\n传出\t215326\n嗯嗯男\t215327\n一8月19日\t215328\n外型\t215329\n伊份\t215330\n庄咏澜\t215331\n点度\t215332\n匪徒\t215333\n笨猪\t215334\n笨猫\t215335\nteentop\t215336\nmone\t215337\n爱上你了你可以\t215338\n田女士\t215339\n知心姐姐\t215340\nmono\t215341\n帅达\t215342\n归茫\t215343\n夏县\t215344\n十排\t215345\n魔法西亚\t215346\n一天一天一天一天\t215347\n风凉话\t215348\n此心\t215349\n宝贝块\t215350\n脅孕\t215351\n乱想\t215352\n老陈\t215353\n我离\t215354\nwowooy\t215355\n度老弟尼嚎\t215356\nPET\t215357\nk178\t215358\n对啊你坏\t215359\n敛\t215360\n文佰文\t215361\n吕佳妮\t215362\n7日早上\t215363\n油翁\t215364\n9400万美元\t215365\n到达\t215366\n陆家村\t215367\n咋师长\t215368\n今报\t215369\n陈佳明\t215370\nplpl\t215371\n突然间\t215372\n古娘\t215373\n案发后\t215374\n端粒酶\t215375\n仕港\t215376\n呼欲强\t215377\n681816158\t21
5378\n千叶玫瑰\t215379\n故\t215380\n古斯塔沃\t215381\nchiphotosbaiducomxiaodupicitem0bd162d9f2d3572c2b604fa88d13632763d0c3c0jpg\t215382\n柳杭臣\t215383\n解锁\t215384\n慷慨激昂\t215385\n爱屋写诗\t215386\nkeepofff\t215387\n悟\t215388\n张开嘴\t215389\n杨朝丽\t215390\n五十秒\t215391\n真搞笑我想问你我想问你你爱我\t215392\n萌萌哒哒哒萌的度秘你好\t215393\n女主人\t215394\n金俊龙\t215395\n毛晓彤\t215396\n五个工作日\t215397\n36.4度\t215398\n论文明\t215399\nbpmf\t215400\n2851142313\t215401\nFTISLAND\t215402\n两倍子\t215403\n141125221125\t215404\n选修\t215405\n13名\t215406\nyi\t215407\n热不热\t215408\n吞吞吐吐跳跳糖\t215409\n该学\t215410\n我懂为什\t215411\nyk\t215412\n哦美丽的秘美丽的秘密\t215413\npprereafiming\t215414\n看一探\t215415\n宇文拓\t215416\n蒋庆兰\t215417\nFcvbjygcc\t215418\n丽江市\t215419\n元元\t215420\nym\t215421\n为何处\t215422\n长相思色\t215423\n鹅肠\t215424\n好端端\t215425\n萌样\t215426\n王大雷\t215427\n比亚\t215428\n苏珊\t215429\n宋福成\t215430\n歌神十爱奇\t215431\n一个宝\t215432\n累好呀\t215433\nghhjbjn\t215434\n335252525525253522542257567675757\t215435\nhellokstati\t215436\n唉不懂你了拜拜\t215437\n代理费\t215438\n没精打\t215439\n160年\t215440\n情深深雨宫天你要是\t215441\n爱都面\t215442\n不写完\t215443\n好店\t215444\n1555556545563555\t215445\n顿感\t215446\nuv4\t215447\n徐家大院\t215448\n公共事业管理专业\t215449\nftd\t215450\n一天上一次\t215451\n徐州开元名都酒店\t215452\n清健康\t215453\n好啦一会再来找你聊哈别孤单\t215454\nakakakkatie\t215455\n师笔录\t215456\nyf\t215457\n副的\t215458\n背向外\t215459\n度秘度秘度秘度秘你是不是爱你\t215460\n15192867571\t215461\nbondage\t215462\n家人家\t215463\n张郭千惠\t215464\n中泉\t215465\n789勹\t215466\nTBAZ\t215467\n防线\t215468\n巧丽\t215469\nflaggot\t215470\n华娱卫视\t215471\n粉个\t215472\n太白星\t215473\n3214569776996311789314709224\t215474\n悸\t215475\n5555566336\t215476\nwyfgvdivr\t215477\n你好坏你好坏哦哼哼\t215478\n容丸\t215479\nftf\t215480\n日夜相处\t215481\n狗贩子\t215482\n化妆包\t215483\n嘉士伯\t215484\n护獬力\t215485\n瓦里\t215486\n管控\t215487\n一千一百四十七\t215488\n38路\t215489\nv次果皇果\t215490\n秋老虎\t215491\n萨里\t215492\n啦啦啦我是你的小呀小苹果\t215493\n不信者无你啊不信由你\t215494\nHhjokjot\t215495\n谢了结束了\t215496\n悦\t215497\n彭山峰\t215498\n穆里尼奥\t215499\n叫鸡\t215500\n赵博琨\t215501\n刊载\t215502\nnshchdcbbdbhsn\t215503\n11111111111100000000000赞\t215504\
n了除了\t215505\nviuuut\t215506\n电场\t215507\n给水\t215508\nhgxr\t215509\n重新板\t215510\n洗面奶\t215511\n选装\t215512\nifs\t215513\n丑娃\t215514\n作饭\t215515\nydifishkcufigiarfifmbofkn\t215516\n0188\t215517\n四角裤\t215518\n1月4日\t215519\n百度皮夹\t215520\n谁来了你的事\t215521\n海军型\t215522\n名子\t215523\n名孑\t215524\n自己走\t215525\n明天旅途愉快\t215526\nStkf\t215527\n建国\t215528\nifx\t215529\n在梦\t215530\n陈燕银\t215531\n说的好搞笑\t215532\n洛基\t215533\nyG\t215534\n8点10分\t215535\n刑事责任\t215536\n救急\t215537\nnn18242102295\t215538\n庸庸碌碌\t215539\n开门见山\t215540\n啦啦oao\t215541\n感情儿\t215542\n郑小洪\t215543\nxzkx\t215544\n王子硕\t215545\n爱狗\t215546\n宛溪沙\t215547\nｉｑｆｑ\t215548\n睡觉吧\t215549\nSmart\t215550\n名学\t215551\nQQ338144eme\t215552\n连不到\t215553\n循序渐进\t215554\n段子手\t215555\n父母理\t215556\n一日\t215557\n挥一挥手\t215558\n一旦\t215559\n完事儿了行\t215560\n人外有人\t215561\n王九旦\t215562\n谢谢谢谢谢谢你好乖\t215563\n察县\t215564\n一早\t215565\n#刘诗诗拓跋玉儿#\t215566\n吗鸣\t215567\n一时\t215568\n庶女\t215569\n看了笑\t215570\n八戒八戒\t215571\n麻烦度\t215572\n一族\t215573\n10颗\t215574\n跑跑\t215575\n医大\t215576\n医美贷款\t215577\n医太\t215578\n细腻\t215579\n秘神\t215580\n快易典\t215581\n杨凌\t215582\n矮度秘\t215583\n归类\t215584\nzdfc\t215585\n离经\t215586\n大觉吧\t215587\nfjvh\t215588\n操作者\t215589\n樱雪落\t215590\n还片\t215591\nfjvb\t215592\n相撞\t215593\njohohofoohfuog\t215594\n吸管\t215595\n27名\t215596\n黑太狼\t215597\n绵阳佛教协会\t215598\ndasfzdasfz\t215599\n今日上午\t215600\n24665et6tduduuryddufjfhcngiglhpgrfecsf\t215601\nsudjju\t215602\n掠夺式\t215603\n付亚男\t215604\n累九九\t215605\n计生委\t215606\n撸管撸\t215607\n划得\t215608\n凭啷\t215609\n洗口\t215610\n汤面\t215611\nuuuuu\t215612\n早班\t215613\n糯米鸡\t215614\n秘偷\t215615\nGhbnnkk\t215616\n哪点水\t215617\ncoolit\t215618\n雏蜂\t215619\n如花\t215620\n零七零六零零二八\t215621\n舞影零乱\t215622\n青青草地花厅\t215623\n固网\t215624\n2月3号\t215625\n大宗商品\t215626\n心如止水\t215627\np波斯\t215628\n滴和\t215629\n学画\t215630\n黄海波\t215631\n童月\t215632\n挨觉\t215633\n兰陵缭乱\t215634\n921号\t215635\n朱丽珊\t215636\n法律条文\t215637\nc朗\t215638\n不用话\t215639\n不要脸度秘不要脸不要脸度秘不要脸秘\t215640\n搜狐空\t215641\n阴阳眼\t215642\n耀莱\t215643\n各地区县市\t215644\n嵌甲\t215645\n没那么简单\t215646\n郑树乐\t215647
\n中国矿业联合会\t215648\n6161854\t215649\n抉择\t215650\n不用说\t215651\nwmdmg\t215652\n度秘我爱你你爱不爱我\t215653\n5580339088555580866258099\t215654\n郎奔\t215655\njuih\t215656\n暨南大学\t215657\n芳芳芳芳\t215658\n8额\t215659\nc本\t215660\nta3\t215661\n刺股\t215662\n造访\t215663\n宽畅\t215664\n主人形\t215665\n便利店\t215666\n好犯贱\t215667\n挥发期\t215668\n竞拍\t215669\npeak\t215670\n一毫米\t215671\n佟林\t215672\n我想好晚上好晚上好度秘\t215673\n蓝莓味\t215674\n台海皮\t215675\n2077239657\t215676\n用法\t215677\n2b2bybyb\t215678\n1525275371\t215679\n别有用心\t215680\n任伯川\t215681\n第一轮\t215682\n中国书协\t215683\nMKJ\t215684\n时间段\t215685\n美好媚\t215686\n七级\t215687\n华阴市移民局\t215688\n庭胎\t215689\n王进\t215690\n邱头小学\t215691\n吴亚君\t215692\n好好右拐\t215693\n送行\t215694\n飘渺\t215695\n南保\t215696\n鼓舞飞扬1994\t215697\n什么管\t215698\n董子贤\t215699\n劈开\t215700\nhttpghiphotosbaiducomxiaodupicitem77094b36acaf2edd3d39ae9d8a1001e9380193fejpg\t215701\n竹篮打水\t215702\n1459464958\t215703\n湖北丝宝股份有限公司致东软\t215704\n炀帝\t215705\n容景\t215706\n御泥坊\t215707\n移动电视\t215708\n845161804\t215709\n坏银\t215710\n十八十八十八十八\t215711\n打工楼\t215712\n警戒\t215713\n2010年8月14日\t215714\n谢训\t215715\n真棒遍\t215716\n加西亚\t215717\n两种\t215718\n靠得住\t215719\n天命\t215720\n唉遍体\t215721\nhjjgcddjccvbgfbv\t215722\n抽死\t215723\n故土\t215724\n窗户\t215725\nlggfs\t215726\n靠靠靠靠靠靠靠靠靠靠靠靠靠\t215727\n测试期\t215728\n斩手\t215729\n流行病\t215730\n切磋\t215731\n神准\t215732\n切普菲尔德\t215733\n神出\t215734\n天呗\t215735\n天呐\t215736\n流弊\t215737\n曹锁明\t215738\ngizi\t215739\n红烧牛肉\t215740\n凯健鬼\t215741\n孔文\t215742\n三峡谷\t215743\n多瑙河畔\t215744\n大耳图图之彩色世界\t215745\nCDMA2000+GSM\t215746\n长情\t215747\n精液\t215748\n故地\t215749\n丑丑丑丑丑丑丑丑\t215750\n亚马得\t215751\n接班人\t215752\nGgfjcf\t215753\n度秘乖我去\t215754\n妈妈装\t215755\nfbefbe\t215756\n独有\t215757\n丸子秀英\t215758\n人模\t215759\n烦语气\t215760\nWEArefamil\t215761\n胡老大\t215762\n两笔\t215763\n芒果吧\t215764\n摩西摩西\t215765\npf1\t215766\nRyeyff\t215767\nvabcv\t215768\n临桌\t215769\n伤感的歌\t215770\n135428976\t215771\n好呐\t215772\n并茂\t215773\ncufzgdlhfhkxffcgkgkgkd\t215774\n凤飞飞\t215775\n独木\t215776\n温床戏\t215777\nbogupggte\t215778\n绥\t215779\n我是证人\t215780\n概门CS局\t215781\n格尔\
t215782\n大师级\t215783\n罗慧珍\t215784\n好呗\t215785\n节节败退\t215786\n开弓\t215787\n顾云昌\t215788\n示数\t215789\nkzn\t215790\n淑英\t215791\n给我的第\t215792\n正月初四\t215793\n开张\t215794\n一一篇\t215795\n几十世纪\t215796\n好久不见了我想\t215797\n拉普拉斯\t215798\n25.5％\t215799\n刘承东\t215800\n抗倭\t215801\n怒江\t215802\n五分之一\t215803\n我的心窝\t215804\n奥西撒\t215805\n元勋\t215806\n五分之三\t215807\n刺绣机\t215808\n46千米\t215809\n晕头\t215810\nliaoluhan\t215811\n跑带\t215812\n无下限\t215813\n我是那么难你会\t215814\n人来人往\t215815\n不离不离\t215816\n拿来看\t215817\n萌萌达之狙机器人度秘\t215818\n发作\t215819\n郑洪涛\t215820\n鸣鹤古镇\t215821\n话马\t215822\n工匠\t215823\n张烈\t215824\n360千米\t215825\n绳\t215826\n吃了说\t215827\n一段儿\t215828\n旧友\t215829\n3月1日凌晨1时\t215830\n答卷\t215831\n余吨\t215832\n白娜娜\t215833\n猪好嘛\t215834\n蘑菇了你我的了\t215835\n有一个人教\t215836\n红酒们\t215837\nfjrrk\t215838\n太白了你好\t215839\nfyivivogf\t215840\n寝室\t215841\n处女破处\t215842\nwtutzhjghlvovlhlhkkclvkhckfHizgzhjchkclvkvvlvbkl\t215843\nddsdders\t215844\n八嘎的八嘎\t215845\n雨夹雪\t215846\n颏颖\t215847\n纳入\t215848\n36.9万\t215849\n老头儿\t215850\nexcuse\t215851\n135856644763\t215852\n绊\t215853\n萌萌哒乖乖哒\t215854\n千万亿\t215855\n好命\t215856\n亲爱的你表个情我看看\t215857\n红妆\t215858\n纳兰\t215859\n无衬\t215860\n乱伦\t215861\n交道\t215862\n无补\t215863\n戒律\t215864\n涵盖\t215865\n另一款\t215866\n四四岁\t215867\n冯俊莹\t215868\n这场秀\t215869\n死亡悬铃木\t215870\n洛杉矶时报\t215871\n练\t215872\n巴拉望岛\t215873\n齐芷涵\t215874\n胁众从\t215875\n大杀\t215876\n大权\t215877\n2011年度\t215878\ncowa\t215879\n七几\t215880\n经期\t215881\n生肉\t215882\n改了叫\t215883\n454511221422\t215884\n生肖\t215885\n4254354\t215886\n官相\t215887\n噗瓮\t215888\n爱悄\t215889\n永福\t215890\n犬湖南寻找爱的冒险\t215891\n过早\t215892\n大来\t215893\n给\t215894\n理想化\t215895\n短头发\t215896\n二百五四三八\t215897\n明敏明明\t215898\n836排\t215899\n人眼\t215900\n大杯\t215901\n稳定是人心所向\t215902\nfggvbn\t215903\n周笔畅天声一队#\t215904\n爱您\t215905\n非洲人\t215906\n生育\t215907\n绝\t215908\n大板\t215909\n大松\t215910\n搭乘\t215911\nkung\t215912\n较少\t215913\n呵以\t215914\n连职\t215915\nzhtng\t215916\n陪护\t215917\n开阔\t215918\n拮据\t215919\n意思卷\t215920\n三七亿美元\t215921\n下半区\t215922\n是了说\t215923\n谷桃子白宫\t215924\n狼吞虎咽\t215925\n云紫峰\t215
926\n长眠\t215927\n不难看\t215928\n6gzz\t215929\n任事者\t215930\n气概\t215931\nmatla\t215932\nHghg\t215933\n自唱\t215934\n茅台酒\t215935\n333134313283133\t215936\n外流\t215937\n狂三五\t215938\n陕西检察查办\t215939\n吴广\t215940\nufgfy\t215941\n卧蚕播放\t215942\n甜心超人\t215943\n我可能不会爱你\t215944\n随手\t215945\n蔡灿\t215946\n一国之母\t215947\n非常好\t215948\n唉我得多来\t215949\n把臭\t215950\n惠河南\t215951\n背把弓\t215952\n会好好\t215953\n一等一\t215954\n杜宇麒\t215955\n未闻\t215956\nfbgfvjggnhftujsjsgiysjnhXhfngjshjsjsmehgnczngsisthsjhk\t215957\n吗秘\t215958\nSfxsyjbz\t215959\n两倍\t215960\n好重要好\t215961\n威廉·克莱因\t215962\n鹿嫂\t215963\n世豳\t215964\n金立jn\t215965\n霞霞霞姐\t215966\n打女\t215967\n甜圈\t215968\n方便\t215969\n油锅\t215970\n弄空\t215971\n安装类\t215972\n杨丛旭\t215973\n9787564810252\t215974\n才针\t215975\n唱吧秘\t215976\n艾米\t215977\n王仁埸\t215978\n哪地\t215979\n奥特不可\t215980\n行尸走肉\t215981\ncupcake\t215982\n玩儿吧等\t215983\n节制\t215984\n客氣\t215985\n零二二零幺七五四幺九四\t215986\n充分利用\t215987\n寓言巷战\t215988\n扩弓\t215989\n公历\t215990\n杭徽\t215991\n褔乐康宁\t215992\n兄妹\t215993\n蒸蟹\t215994\n疲惫不堪\t215995\n宇峰\t215996\n客气\t215997\nrrrrrrrrrrrrrrrrrrrttttttttttttttttttttyyyyyyyyyyyyuuuuu\t215998\nwelearned\t215999\ndrvxesy\t216000\n公厕\t216001\n才钱\t216002\n必见\t216003\n走不走\t216004\n持股\t216005\n顾忌\t216006\n幸不幸福\t216007\n77777777\t216008\n9娃\t216009\n老友粉\t216010\n包曦然\t216011\n手撕包\t216012\n初级经理人\t216013\n巴新娘\t216014\n阉割版\t216015\n光是\t216016\n今晚7点20\t216017\nvhhhcjjjdhh\t216018\n猛猛\t216019\n睡觉版\t216020\n象是\t216021\n围殴\t216022\n何超盈\t216023\n黄蓉\t216024\n毛票\t216025\n几会\t216026\nQQ邮箱\t216027\n光昂\t216028\n知不知\t216029\n爱心小子\t216030\n盲妹\t216031\n150000000万\t216032\n北京丽晶酒店\t216033\n那个啥\t216034\nreco\t216035\n辛德瑞拉的眼泪那我需要\t216036\n天涯海角离我这离\t216037\n摊贩\t216038\n12345698岁\t216039\n龙且\t216040\n没素\t216041\n田园风\t216042\n龙三\t216043\n车配\t216044\n保佑我逢考\t216045\n有天\t216046\n贝吉尔\t216047\n停机恩平\t216048\n以万法为师\t216049\n白夜玲珑\t216050\nion\t216051\n哈我叫\t216052\n快乐的时光\t216053\n纵向\t216054\n呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱呱\t216055\n冒讲\t216056\n错错错就\t216057\n耿帅豪\t216058\n来风云\t216059\n欺人太甚欺人太甚欺人太甚欺人太甚欺人太甚\t216060\n苹果\t216061\n别蓉\
t216062\n特步男\t216063\n擦拔\t216064\nioi\t216065\n欲望之屋2\t216066\n内膜\t216067\n今天4点\t216068\nBonamana\t216069\n独道\t216070\n德赛\t216071\n好嘞个逼我操\t216072\n漏噶\t216073\nlokouopas\t216074\n分儿\t216075\n韩天培\t216076\n叶玉\t216077\noleoo\t216078\n2119\t216079\n3100元\t216080\n过年回家\t216081\n干把\t216082\n王勃羽\t216083\n斗鸡\t216084\n2110\t216085\n2112\t216086\n2113\t216087\n再来一炮\t216088\n不是牛哥到我是王\t216089\n没哪是\t216090\n什么那你那你那你那你好忙\t216091\n多变\t216092\n美琪美雪游乐\t216093\n逃逸\t216094\n9uglum、fmjjgrggvghphh5485033585tgghgjvvvhyi08400250／／／＼＼＼n｜｜nd\t216095\n多发\t216096\n等式\t216097\n弹孔\t216098\n杨紫璐\t216099\n算法\t216100\n贬义词\t216101\n清真\t216102\nvcgbvv\t216103\n携\t216104\n1235564997\t216105\n多号\t216106\n恩我\t216107\nyobsrchyy\t216108\n施洋\t216109\nbanana\t216110\n呢儿\t216111\n严震涛\t216112\n多只\t216113\ngugc1\t216114\nICCID\t216115\n颜宁月\t216116\n克桥\t216117\n姨语重心长\t216118\n口蹄疫\t216119\n赵彩燕\t216120\n冒牌仙侣\t216121\n感人肺腑\t216122\netgfdrjrhdbeehdygdvddhudhegdddd\t216123\n歼灭\t216124\n极端伊斯兰主义\t216125\n校车\t216126\n聪明我输\t216127\n38555556\t216128\n一个嘛\t216129\n大卫-阿波罗\t216130\n吧四\t216131\n亨德森\t216132\n一个三十五百兆\t216133\n墨幽冥\t216134\n刘涛\t216135\n度秘部\t216136\n补课\t216137\n小飞猪\t216138\n一魔幻你\t216139\n火枪手三个火枪手的故事梗概\t216140\n刘涌\t216141\n吧图\t216142\nvnwsgxswabhhhhhhhhhg\t216143\n好钱\t216144\n一个零二十\t216145\n08745\t216146\n筒子们\t216147\n屋顶\t216148\n谁着你\t216149\n李财明\t216150\n快诈\t216151\n1234567852143691589333662\t216152\n男人节\t216153\n善于\t216154\n海里那边\t216155\n善事\t216156\n均馆\t216157\n亲爱的你好\t216158\n逫一\t216159\n最佳听众\t216160\n曲处\t216161\n10年后\t216162\n孙俪俪\t216163\n叮当\t216164\n送祝福\t216165\n没完烦\t216166\n日内\t216167\n刻字\t216168\n刘承奥\t216169\n连编号\t216170\n厅堂\t216171\nmaroyato\t216172\n讨厌讨厌讨厌讨厌讨厌讨厌\t216173\n杨澜\t216174\n鲜切花\t216175\n日军\t216176\n二鬼哥\t216177\n骗跑题\t216178\nstmetoto\t216179\n故木胜\t216180\n快说\t216181\n善人\t216182\n贵港市\t216183\n老哥哥\t216184\n宏ucgucxzrxfxgiciycgcihc\t216185\nNote2\t216186\n顿顿饭\t216187\n维不可以\t216188\n镇江\t216189\n咯iOS\t216190\n齐读\t216191\n谢谢片\t216192\n巨化\t216193\n周美青\t216194\nQQ星儿童成长奶\t216195\nGhyghu\t216196\n放眼\t216197\n爱韬
\t216198\n做饭\t216199\nwobulengenliliap\t216200\n2014年7月50\t216201\n扯东扯西\t216202\n王乐川\t216203\n王心清\t216204\n恺威店\t216205\n详细\t216206\n电磁灶\t216207\n矢口道\t216208\n3点41分\t216209\n二货\t216210\n胡言乱语\t216211\n爱多\t216212\n风流系\t216213\n天涯海角\t216214\n故土难离\t216215\n博汇园\t216216\n包东北\t216217\n咋头儿\t216218\n含金\t216219\n下十二时\t216220\n01年\t216221\n依赖性\t216222\n可怜的娃\t216223\n啦啦啦啦啦啦我是真的小行家\t216224\n闽侯\t216225\n裴佳琪\t216226\nzxxx\t216227\n院墙\t216228\n己婚\t216229\n合集贴\t216230\n饮鸩止渴\t216231\nguldrudmgclj\t216232\n含量\t216233\ngyffhxgdhf\t216234\n易烊千玺\t216235\ndoali\t216236\n5.8折\t216237\n惹我我告诉你\t216238\n刚子\t216239\n好吧子\t216240\n么得\t216241\n那个时代\t216242\n哼老子\t216243\n土山哭狡兔三窟\t216244\n刚签\t216245\n子一\t216246\n口宇\t216247\n死马\t216248\n试纸\t216249\n度星\t216250\n我记\t216251\n泡妞儿\t216252\n九小米\t216253\n汝锋\t216254\n爱世界\t216255\n塑料花\t216256\n塞心呐\t216257\n办公楼\t216258\n度明\t216259\n御史\t216260\n制内人\t216261\n嘉旺\t216262\n我讨\t216263\n１３４６\t216264\n防洪\t216265\nmskbq\t216266\nggggaaaayk\t216267\n我骗你了我是地球的\t216268\n287391\t216269\n抄写写\t216270\ntf博艾\t216271\n武则天\t216272\n回到现实\t216273\n1年多\t216274\n听歌听\t216275\nplanta\t216276\n好好好好好好好好好好\t216277\n小晖\t216278\n佳欣\t216279\ndhssugdtdysttytdytfydtyttsttrryrrudttsstdtytytúrtddtuds\t216280\n安阳市\t216281\n小淘\t216282\n孤儿院\t216283\n飞羽\t216284\ngiyghrud\t216285\n眼瞎\t216286\n2B2B2B2B\t216287\n穷小子\t216288\n也需\t216289\n闫星星\t216290\n我是你的主人我吴如真我\t216291\n油泡\t216292\n1833058275\t216293\n嗯在在在在在在在在在在在在一直在\t216294\n什厶\t216295\n周克华\t216296\n喽太郎\t216297\n斯佛教\t216298\n萌照\t216299\n昔秦皇汉武\t216300\n一千位\t216301\n最讨厌你了你个度\t216302\n投案\t216303\n诚东车场\t216304\n1000万大\t216305\n想说和你说\t216306\n石竹\t216307\n日日令仔电到妳\t216308\nAllShare\t216309\n仅仅\t216310\n娜姐\t216311\n议员\t216312\n1趟\t216313\nlake\t216314\n真象\t216315\n听书\t216316\n魔仙奥特曼\t216317\n展览板\t216318\nM巾\t216319\n张峰集团\t216320\n后盾\t216321\n空管\t216322\n连验证\t216323\n每个人都会\t216324\n三万里河东入海五千人说上摩天\t216325\n秃秃\t216326\n老上海百业指南\t216327\n氯化钡\t216328\n后盖\t216329\nTMd\t216330\n前俯\t216331\n郝润玉\t216332\n逼摸吊\t216333\nlaowaisb\t216334\n任思念\t216335\n叫肉\t216336\n领当当\t216
337\n自扇\t216338\nj媚眼]#赤西仁74生日快乐#\t216339\n粩囵\t216340\n妙妙屋紫苏\t216341\n第二季\t216342\n至沓来\t216343\n天生萌\t216344\n上午茶\t216345\n手厅\t216346\n矿英语\t216347\n韵亚\t216348\n急死女娃儿\t216349\n5165265\t216350\n太阳底下\t216351\n窗口\t216352\n许佳琪\t216353\n林洁文\t216354\n不以为\t216355\n不冷不热\t216356\n快点度秘\t216357\n兔头\t216358\n一单元圆\t216359\n电影象\t216360\n搜狐西宁呢再和你好你还好有分年后片\t216361\n美誉度\t216362\n几年级\t216363\n不懂不懂不懂不懂不懂不懂不懂不懂不懂\t216364\n清关\t216365\n习惯\t216366\n而已茄子\t216367\n588556665565565555565622\t216368\nfyc\t216369\n借不到\t216370\n名才\t216371\nhull城\t216372\n老城隍庙\t216373\n一万小时\t216374\nHighchi\t216375\n不分给\t216376\n颇为\t216377\n剖析\t216378\n买卖单\t216379\n楼烦关北口\t216380\n大红大紫\t216381\n121275850909\t216382\n颇丰\t216383\nTeen\t216384\n综合楼\t216385\n顶至\t216386\n作品展\t216387\n沉水\t216388\n官途\t216389\n名扬\t216390\n地守望者\t216391\n重阳\t216392\n凶好不好\t216393\n百雀羚水嫩倍现\t216394\n东方\t216395\n米秋\t216396\n一张照片\t216397\n塔吉克斯坦\t216398\n东门町广场1楼\t216399\n米秀\t216400\n屏海绵神庙\t216401\n康你的算数\t216402\n新南悦汇总\t216403\n戢度\t216404\n板芙镇\t216405\n杨蕊\t216406\n134346864656864648\t216407\n等闲\t216408\n主阿求你可怜我歌谱\t216409\ntmet\t216410\n等问\t216411\n扑克牌人\t216412\n猜行\t216413\n理你了恨\t216414\n太浪漫\t216415\n白羊小说好\t216416\n生愉快\t216417\n时常\t216418\n安凌宇\t216419\n阿巴巴拉巴拉\t216420\n寻见了\t216421\n袭来\t216422\n科隆VS狼堡\t216423\n一二三四五六七八九十十一十三十四\t216424\n天马广场\t216425\n斗牛\t216426\n金姐\t216427\n快一点行\t216428\nFUVN\t216429\n纱布\t216430\n五门儿\t216431\njjgtm\t216432\n鹿山新区\t216433\n两界\t216434\nime秘\t216435\n直干\t216436\n4675566666666666\t216437\n浩浩偶\t216438\n宝乐婶\t216439\n谢雨\t216440\n最喜欢\t216441\n新戏秘\t216442\n湖中\t216443\n李惠澎\t216444\n尹艺璇\t216445\n胡安然\t216446\n玉石\t216447\n三星GALAXY\t216448\n司马氏\t216449\n拍了照\t216450\n草儿\t216451\n用心就好\t216452\n7787224545455452252221122\t216453\n纸尿裤\t216454\n纽约港\t216455\n行凶者\t216456\n气死人\t216457\n贴上\t216458\n贴下\t216459\n下工\t216460\n罗义芬\t216461\nsijwiiwiw\t216462\n海岸\t216463\n王昶棕\t216464\nlovo\t216465\n下巴\t216466\ni4gogzcjgj\t216467\nwings\t216468\n海岩\t216469\n蓝圆\t216470\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒嘟嘟嘟\t216471\n蛟头\t216472\n两个元\t216473\n许可\t216474\n新能源车\t216475\n
博阿腾\t216476\n延保\t216477\n两个克\t216478\n海岛\t216479\n流动\t216480\nnkrdmec\t216481\n曲师大\t216482\n纯熟\t216483\nFinn\t216484\nctty\t216485\n几万千米\t216486\nFind\t216487\n不准说\t216488\n1笔\t216489\n第四页\t216490\n早八蛋\t216491\nndkd\t216492\n宫锁心玉\t216493\n粕粕\t216494\n瓜兮花果\t216495\n性阻遏创新\t216496\n劲点\t216497\n曹梓晴\t216498\n小王人\t216499\n瞻礼佛牙舍利\t216500\n2372355\t216501\nMSM8660\t216502\n几步\t216503\n悔罪\t216504\n劲炫\t216505\n科曼\t216506\n现现在\t216507\n公职人员\t216508\n就视\t216509\n八里桥\t216510\n缴费\t216511\n3dkey\t216512\n鼓手\t216513\n向前\t216514\n消除掉\t216515\n器女\t216516\n猪呀度秘\t216517\nikureszi\t216518\n小东瓜\t216519\nSIRT1\t216520\nboujest\t216521\n事先走\t216522\n钻石\t216523\nexoriamamn\t216524\nweqwerter\t216525\n用量\t216526\nGo#\t216527\n闪电战\t216528\n废材\t216529\ncqhi皮波斯得青鱼hi皮\t216530\n高性能\t216531\n凤尾菊\t216532\n进藏\t216533\n直急\t216534\n1580万\t216535\n白蛇传\t216536\n叫来\t216537\n好凉\t216538\nletus\t216539\n2米\t216540\n累好\t216541\nnnnnnn\t216542\n死后代\t216543\n400万\t216544\n至仁\t216545\nyoutobe\t216546\n半坡遗址\t216547\n496291619\t216548\n枝桠\t216549\n不匹配\t216550\n至今\t216551\n17度\t216552\n八二年\t216553\nyaozun\t216554\n喔偶\t216555\n1至10分钟\t216556\n话行\t216557\n444度\t216558\n王博洋\t216559\n发育\t216560\nwithit\t216561\n不可说话\t216562\n尴尬\t216563\nkdj指标\t216564\nboss2\t216565\n150台\t216566\n嘉\t216567\n你是我们最好的朋友吧你了我忘啦\t216568\n停手\t216569\n足先\t216570\n复数lesson\t216571\n大红袍\t216572\n熊出没之心熊归来\t216573\nWQFGDGHU\t216574\n丘能锤\t216575\n静海\t216576\n锦上添花\t216577\n健康状况\t216578\n大白白\t216579\nhmjy\t216580\n十钟\t216581\n方程式\t216582\n牛剑锋\t216583\n长可\t216584\naqy\t216585\n排便\t216586\n多米多米多米\t216587\n嘞\t216588\n长号\t216589\n百两百\t216590\n嘟\t216591\n长叹\t216592\n558385685685685285\t216593\n887.1万台\t216594\n多咪多咪请告诉我那个那个庙好声音\t216595\n没大没小\t216596\n纸币值\t216597\n夏天雨\t216598\n长发\t216599\n2.0%\t216600\n窝狗\t216601\n永远讨厌\t216602\n昏昏然\t216603\n晨雪\t216604\nheentit\t216605\n河口瑶族自治县\t216606\n太和县宫小村\t216607\n1516朱猪朱猪朱猪\t216608\n钱江\t216609\n三九\t216610\n1898\t216611\nEnergyassociation\t216612\nfcugvvcggb\t216613\n见背\t216614\n1890\t216615\n易若萱\t216616\n1893\t216617\
n科技书故事书\t216618\n0.31元\t216619\n还没有\t216620\n三乘\t216621\n圆煎堆\t216622\n艾妮\t216623\n小黄秘\t216624\n烩面\t216625\n教版\t216626\n植物大战僵尸太空版\t216627\n辞退\t216628\n将就看\t216629\n绵阳佛协\t216630\n创造人\t216631\n别再说笑话\t216632\n倚天曲\t216633\nif呃呃\t216634\n节省\t216635\n五十毫克\t216636\n24枚\t216637\n肖文成\t216638\n白油菜\t216639\n大赞\t216640\n是哪儿的呀咧\t216641\n大赛\t216642\n堡碉\t216643\nWhatel\t216644\n蓝精灵\t216645\n阿拉巴背吧背呗背呗\t216646\n嗯萌萌\t216647\n知道知道知道知道知道\t216648\n26晚上\t216649\n大赏\t216650\n解放军队福建\t216651\n大赌\t216652\n189q\t216653\n海口美兰机场\t216654\n外援\t216655\n那你喂喂喂喂喂喂喂喂喂喂\t216656\n警惕\t216657\n直尺\t216658\n１０日１７时\t216659\nidjdjjdjdjd度\t216660\n安泰\t216661\n2.00\t216662\n月牙儿\t216663\n妈呀呀\t216664\n里约大冒险\t216665\n哈等\t216666\n170991223878978\t216667\n今天10点50分\t216668\n邮政卡\t216669\n刘广义\t216670\n三之\t216671\n零售业\t216672\n萌马丁蛋\t216673\n咄咄逼人\t216674\n蒋坚永\t216675\n不好爱\t216676\n汗衫\t216677\n为人正派\t216678\n于江博\t216679\nhhhohkbgovjcfkckdc\t216680\n瘦金行楷\t216681\n睡觉吧明天在\t216682\n介样滴惊吓\t216683\n床前明月光我是郭德纲\t216684\n215334890\t216685\n林家曦\t216686\nkate\t216687\n有助\t216688\nsjid\t216689\n汗血\t216690\n成蜜\t216691\n小肚子\t216692\nrhrsc\t216693\n月话费\t216694\njdn\t216695\n美妞美美哒\t216696\n200套\t216697\n孙色\t216698\n事妮\t216699\n1292945915#\t216700\n李阳一\t216701\ngdjjdjfjy\t216702\n家滴\t216703\n朱天文\t216704\n公布\t216705\n李晨儿\t216706\n施展\t216707\n嗯先微小子\t216708\n你你你你你你你你你\t216709\n望极天涯不是归路的断肠人\t216710\n个股\t216711\n公帑\t216712\n属于我和你\t216713\n天天酷跑斗战胜佛\t216714\n卸装\t216715\n苟康义\t216716\n拿破\t216717\n算了骂\t216718\n把号\t216719\n国史书\t216720\n经济体制\t216721\n秦代\t216722\n度秘吏卒\t216723\n座灵剑山\t216724\n装机容量\t216725\n磊落\t216726\nasdsgd\t216727\n白水悠\t216728\n把叫\t216729\nzl\t216730\nzm\t216731\nzn\t216732\nzo\t216733\nzh\t216734\nzi\t216735\nzj\t216736\nzk\t216737\n555555555555555555555555555555555555555555555555555\t216738\nze\t216739\n明珠\t216740\nzg\t216741\nza\t216742\nzb\t216743\nzc\t216744\n跳绳\t216745\n三十九\t216746\nzx\t216747\nzy\t216748\nzz\t216749\n睡秘方\t216750\nzt\t216751\nzu\t216752\nzv\t216753\nzw\t216754\nzq\t216755\n半女半\t216756\nzs\t216757\n虾米人\t216758\n黄西\
t216759\n念佛号\t216760\n凯蒂佩里\t216761\n1511100161318\t216762\n救者\t216763\n伊军\t216764\n纸画\t216765\nzF\t216766\n刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚刚\t216767\nokalez\t216768\n唐家三少\t216769\n明真\t216770\n安悦溪\t216771\nncis\t216772\n洛克王国\t216773\nzZ\t216774\n凌绍章\t216775\n跳给\t216776\n夏达喽\t216777\n雄风惊五岳，\t216778\n梅瓶\t216779\n葛红坤\t216780\n榆社\t216781\n给我一个人发\t216782\n撒大长今\t216783\n韩承羽\t216784\n邓永峰\t216785\n懒得美容\t216786\n戴安琪\t216787\n花真美\t216788\n12.48km\t216789\n冠西\t216790\n不明飞行物\t216791\nz4\t216792\nz5\t216793\n早妹子\t216794\nz1\t216795\nz2\t216796\n菊花猪猪\t216797\n杨野何塞米\t216798\n14英寸\t216799\n没玩\t216800\n刀山\t216801\n操b\t216802\nhost\t216803\n李爱\t216804\n好爱好爱你一直都不知道我追你好久了我好爱你\t216805\n九车站\t216806\n白明\t216807\n亚西\t216808\nφ\t216809\n雄达人\t216810\n三乡\t216811\n小米加步枪犬\t216812\n还有没有\t216813\n拉链\t216814\n僵尸僵尸\t216815\n炒年糕\t216816\n年初\t216817\n瞅一瞅\t216818\n繁星\t216819\n克而阴见\t216820\n利达我\t216821\n东远\t216822\n巴拉拉\t216823\n思铂r8\t216824\n推测\t216825\n嗯深奥\t216826\n洋湖中心小学\t216827\n吴总酷\t216828\n我不要你了我嫌弃\t216829\nASPH\t216830\n油肉\t216831\n张舟逸\t216832\n业态\t216833\n靳超军\t216834\njshshd\t216835\nnmnn\t216836\n浑圆\t216837\n俺\t216838\n饭友\t216839\n孔婉宇\t216840\n俾\t216841\n俽\t216842\n欢子\t216843\n猪妞妞你是不育课你是猪妞妞你是浮云\t216844\n俶\t216845\n哦们\t216846\n俪\t216847\n夹生\t216848\n俩\t216849\n修\t216850\n俯\t216851\n信\t216852\n香林\t216853\n省水\t216854\n度媽\t216855\n俘\t216856\n保\t216857\nｂｌ\t216858\n辛词\t216859\n七彩电动车\t216860\n俗\t216861\n度秘奥度秘\t216862\n忙于\t216863\n俊\t216864\n俈\t216865\n1万多名\t216866\n中企投资集团\t216867\n俏\t216868\n係\t216869\n促\t216870\n去年10月\t216871\n斯塔\t216872\n俄\t216873\nmwtdng\t216874\n北地本\t216875\n从善\t216876\n雨过\t216877\n咖啡王子一号店\t216878\n尿道\t216879\n小二班\t216880\n喔喔科\t216881\n羞涩\t216882\n5月20号\t216883\n阔能\t216884\n我等会儿\t216885\n拉伊\t216886\n王雯天\t216887\n天之咪\t216888\n包辣条\t216889\n哎呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t216890\n秋千\t216891\n擅长\t216892\n哈快\t216893\n秋华\t216894\n江湖奇侠传\t216895\n擅長\t216896\n数了一到\t216897\n巴基斯\t216898\n四五百\t216899\n翻番\t216900\n朱菁仙\t216901\nyouspphile\t216902\n生女石BB\t216903\nhcgi\t216904\n次\t216905\n躲让\t216906\n真不容易\t216907\n私房\t216
908\n到了吧\t216909\nJoey\t216910\n两条杠\t216911\n战时\t216912\n宜家\t216913\n欢\t216914\n良宵\t216915\n二十九岁\t216916\n有一聊\t216917\n我可是你的帅\t216918\n宜宾\t216919\nckhfxutdutditxktditxigdigfiyfITDUFZITDITDIGDOHCIGFKGIDYIYDIYXITXIDFITDITITDYDGIGIDITDITITDITIFITDGIDCKGJXJFXFMGCUTDUT\t216920\n铁素颜\t216921\n郑恺波\t216922\n回南天\t216923\n宁水花园\t216924\n山梁\t216925\nWho\t216926\n凌子\t216927\ndfddfdf\t216928\n心绞痛\t216929\n凌孜\t216930\n五百六七十亿\t216931\n金朱元\t216932\n桃园三结义\t216933\n拿出\t216934\n糖果鞋\t216935\n师树恒\t216936\n李有几\t216937\n高泠\t216938\n龙夜爵\t216939\n高波\t216940\n矮子\t216941\n田力蓝\t216942\n摩尔冲凉\t216943\n冰柱\t216944\n全价\t216945\n山梨\t216946\n葫芦侠美人鱼\t216947\nh8747474774454555500\t216948\n对呀我很温柔\t216949\n黔驴技穷\t216950\n可乖\t216951\n啧\t216952\nhgggii\t216953\n151万\t216954\n1000000000000000000000000000000000000000000000000000000000000000000000\t216955\n图哈\t216956\n丁工先\t216957\nEXOcp\t216958\n大气温室效应\t216959\n第一下\t216960\n涤梦\t216961\n说懂\t216962\n走势图\t216963\n12345672234567\t216964\n清理山\t216965\n吴琼\t216966\n吴琳\t216967\n一杯羹\t216968\n鲁达\t216969\n胜利在望\t216970\n女儿\t216971\n当交\t216972\n楼盘\t216973\n第一个\t216974\n有若无\t216975\n爱爱过\t216976\n国政\t216977\n刻意\t216978\n典别\t216979\n300年前\t216980\n色狼\t216981\n哦元\t216982\n燃气灶\t216983\n琅琊镇\t216984\n多不多\t216985\n今天以后\t216986\n133767872675\t216987\n年青力壮\t216988\n铅笔画\t216989\n发烟\t216990\n709\t216991\n乐色\t216992\n一日子\t216993\n704\t216994\n705\t216995\n706\t216996\n707\t216997\n700\t216998\n701\t216999\n姚瑞\t217000\n罢托\t217001\n少先队\t217002\n小照\t217003\n送杯\t217004\n15768953494272929265438954611n2451242153753251626464543154554555255545842\t217005\n逗逗哈\t217006\n一下一个个\t217007\n秋雅来\t217008\n李云龙\t217009\n送来\t217010\n舒顺\t217011\n昆沉痛再三\t217012\n说出发\t217013\n先富\t217014\n余总\t217015\n酿级\t217016\n猪头国栋\t217017\n外地堡\t217018\n哇拉\t217019\n化肥厂\t217020\n发热\t217021\nbbknl\t217022\n冻旭兴\t217023\n悠扬\t217024\n义心\t217025\n发烧\t217026\n国行\t217027\n我就问你现在我在哪儿\t217028\n有有有有有有有有\t217029\n呼哧\t217030\n红枣北芪炖鲈鱼\t217031\n一个你可以\t217032\n美化\t217033\n呼哩\t217034\n卡套\t217035\n吃好片\t217036\n现代汽车\t217037\ntfbosa\t217038\n江北观音桥
\t217039\n拍一拍一拍\t217040\n不和你好\t217041\n叮醒\t217042\n呼哼\t217043\n各界\t217044\n讯n1\t217045\n板寸\t217046\n2963．5亿元\t217047\ncx0049\t217048\n洋子\t217049\n痛波\t217050\n近似值约率\t217051\nfhjry\t217052\n打工谣\t217053\n二厅\t217054\n杨群生\t217055\n呼哈\t217056\n卡奴\t217057\n锐力米付钱\t217058\n22337\t217059\n晨星\t217060\n說一\t217061\n蓝色情人节\t217062\n马丽\t217063\n果图案\t217064\n12点27分\t217065\n卡奥\t217066\n哎呦呦呵欧\t217067\n奋笔\t217068\n金药箱&amp\t217069\nX光片\t217070\n魏高鸿\t217071\n很烂\t217072\n决战\t217073\n夸错\t217074\n球衣\t217075\nyhjklbghio\t217076\n不利于\t217077\n一百一多斤\t217078\n镇子\t217079\n王然咧利\t217080\n299个\t217081\n肉丝榨菜\t217082\n无聊真无聊在家的说真无聊真\t217083\n店家\t217084\n缓不过\t217085\n前男友\t217086\n张小可\t217087\n致幻\t217088\n真苦\t217089\nwthc\t217090\n牧伯修\t217091\n恩真可惜\t217092\nxvideo\t217093\nsbzmcl\t217094\n发+8\t217095\n物距\t217096\n停车费\t217097\nrsd\t217098\n大笔\t217099\n新搞作\t217100\n大笑\t217101\n783990891\t217102\nbecauhechat1i11cauaa\t217103\n上三天三夜\t217104\n听天由命\t217105\n快穿\t217106\n乒hjgh\t217107\n杨伏辉\t217108\n三百六六\t217109\n很正常\t217110\nrsr\t217111\n烈钢\t217112\n歧焉吾\t217113\nsndnskkenelsksn\t217114\n王老师\t217115\n益达\t217116\n顺着片\t217117\n安光\t217118\n大笪\t217119\n经营性\t217120\n大笨\t217121\n不以物喜\t217122\n聊斋志异\t217123\nrsS\t217124\n我讨厌你我要投诉你\t217125\n逆战四大灰太里是大坏蛋度秘\t217126\nrkdk\t217127\n楼房\t217128\n舞曲\t217129\n自虐\t217130\n56秒\t217131\n想笑死\t217132\n兔兔兔兔突突突\t217133\n来来信息\t217134\n六八六七九二六七\t217135\n敦桥\t217136\n闹鬼\t217137\n为为你是天的小美人\t217138\n唔妹妹\t217139\n八分之五和\t217140\n清潭路口东北\t217141\n胆敢\t217142\n度咪\t217143\n安全感\t217144\n方美\t217145\njkkjk\t217146\nBillie\t217147\nsbdvd\t217148\n诸城\t217149\n张兵\t217150\n三万多\t217151\n亲爱的快说\t217152\n惠南\t217153\n度秘度秘我是你的主人\t217154\n被动型\t217155\n张粑粑\t217156\n一千多元\t217157\n7小时\t217158\n一千多兆\t217159\n讲片\t217160\n安养\t217161\n求求你\t217162\noiujk\t217163\n李东海\t217164\n#品藏论道#\t217165\n27点\t217166\n英皇国际\t217167\n06米\t217168\n高哲琦\t217169\n吗战\t217170\n痛苦\t217171\n张家港\t217172\n搞掂\t217173\n秘爱秘\t217174\nwww摩丝摩丝willastihousebrilevotabcdefghjklmnopqyourstuvwuknonnnnouseaaaasemaabc\t217175\n陈二狗\t217176\n喷油\t217177\n七十二\t2
17178\n王海菊\t217179\n股系\t217180\n啊拉瓦\t217181\n失失不再来\t217182\n刘明\t217183\n匣子\t217184\n小东门\t217185\n具晓雁\t217186\n吃吧\t217187\n15893979003\t217188\n凯撒卡卡\t217189\n幺四二零幺三\t217190\n刘长昊\t217191\n哩次系\t217192\n牌场\t217193\nVol.24\t217194\n紧抓住\t217195\n2200年\t217196\n咱们么样\t217197\n卧槽基\t217198\n吃吃\t217199\n一个九九名\t217200\n萌颖\t217201\nKANAN\t217202\nkono\t217203\n哨子\t217204\n花千骨琅琊榜云中歌\t217205\n血流如注\t217206\n考试息\t217207\njmmnktp\t217208\n机器工业\t217209\nkhun\t217210\n贫富差距\t217211\n无情无义\t217212\nfdfff\t217213\n降低\t217214\n周音乐\t217215\n墩墩\t217216\n来比一比\t217217\n中交集团\t217218\n报批\t217219\n风神H30\t217220\n除些\t217221\n═\t217222\n╗\t217223\n╔\t217224\ntbv\t217225\n早睡觉\t217226\n啱\t217227\n割字\t217228\n希思罗机场\t217229\n麦梅尔特\t217230\ntbg\t217231\n王立新\t217232\n钟馗传说\t217233\ninthis\t217234\n没用完\t217235\n一颗什熟之美最完美的是一片漆黑你是谁为什么上飞声音\t217236\n嘎嘎嘎嘎嘎叽叽叽叽叽叽\t217237\n╳\t217238\n午马\t217239\n╱\t217240\nghdgftfgf\t217241\n╮\t217242\n╯\t217243\n╬\t217244\n╭\t217245\n夏个\t217246\n姚有\t217247\n执法者\t217248\n杨朝琳\t217249\n入队\t217250\n╥\t217251\n袋装\t217252\n耿俊杰\t217253\n一点也不懂不懂不懂不懂不懂\t217254\n清唱\t217255\n第五集\t217256\nobabyl\t217257\nYesyes\t217258\n第八届\t217259\n西固\t217260\n甘夜\t217261\n0O667755ss\t217262\n君悦\t217263\n非得子\t217264\n当然了我喜欢你我想和你\t217265\n摸摸摸摸摸摸\t217266\n岐江河\t217267\n纠缠结\t217268\n甘大\t217269\n钢笔谈彩\t217270\n沙漠之鹰\t217271\n哥家\t217272\n因为我喜欢一个人\t217273\n1992年10月24日\t217274\n献血\t217275\n男扮\t217276\n叫试试\t217277\n鬼大我愿意\t217278\n六十厘米\t217279\n有药用\t217280\n呜呼\t217281\n冥王星\t217282\n太平祥\t217283\njmmpj\t217284\nwwNNg\t217285\n尖嘴\t217286\nS5660黑1250\t217287\n冲洗\t217288\n手快\t217289\n温哥\t217290\n刘晓漫\t217291\n拉贝里\t217292\n佝偻\t217293\n蝴蝶养餐养鸟\t217294\n勒芒\t217295\n门佳\t217296\n始源\t217297\n口稀\t217298\n呜呜\t217299\n4500几\t217300\n张澄\t217301\n花楹\t217302\n红米魅\t217303\n40第二次\t217304\ndfffgjgggjj\t217305\n哩三元\t217306\n定论\t217307\n洋妞\t217308\n好容易样\t217309\n火像\t217310\n傅西路\t217311\n你男的女的你爱我\t217312\n乖乖宝贝\t217313\n治家\t217314\nhoho\t217315\n服侍\t217316\n陈颖颖\t217317\n一个星期\t217318\n1111111个\t217319\n麻豆\t217320\n死生\t217321\n国家\t217322\n太稳\t21732
3\n阻击枪\t217324\n百百分之七\t217325\n阿里旺旺\t217326\n陆主任\t217327\n了了与了了饿路路通\t217328\n殺青\t217329\n呢度蜜\t217330\n阴火\t217331\n高大\t217332\n治安\t217333\n第一套\t217334\n行家\t217335\n富达超市\t217336\n滑盖\t217337\n喜羊羊与灰太狼过猴年\t217338\n层次\t217339\n巧力\t217340\n识破\t217341\n腺炎\t217342\n保乱\t217343\n画舫\t217344\n小笼包\t217345\nopelor\t217346\n没打算\t217347\nM9.0级\t217348\n酷我音乐\t217349\n训话\t217350\n喜欢你我想\t217351\n防行人\t217352\n6000元\t217353\n11147\t217354\n无法\t217355\n一点二不厉害\t217356\n训诫\t217357\n排列五\t217358\n一万1千九百九十九0\t217359\nsecrets\t217360\n金厉旭\t217361\n十二启\t217362\n摄影部\t217363\n那莫悦\t217364\n跑友\t217365\nu偶\t217366\n黄思颖\t217367\njatat\t217368\n寻龙诀我去\t217369\n刘佳音\t217370\n腰斩猪\t217371\n威武\t217372\n李木木\t217373\n行宫\t217374\n迟腾\t217375\n饿饭\t217376\nsmovie\t217377\n喳喳喳\t217378\n36岁\t217379\n饿饿\t217380\n潘绮凡\t217381\n遥控直升机\t217382\n蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋蛋ggggggggggg\t217383\nBeat\t217384\n真的你快点\t217385\nladyorgentleman\t217386\n宠辱不惊\t217387\n0OKk\t217388\ndlhNGG\t217389\n几等\t217390\n陈诉我\t217391\n午安午安午安\t217392\n凡是\t217393\n意思我妈\t217394\n小背\t217395\n交大\t217396\n36小时\t217397\n两注\t217398\n两泡\t217399\n行哪吃红我和你好\t217400\n即然\t217401\n小型机\t217402\nwryh\t217403\nsgdddrsr\t217404\nwryl\t217405\n早唐\t217406\n就是要\t217407\n好像素\t217408\n楚国\t217409\n冲刷\t217410\n冲刺\t217411\n小胡\t217412\n刘紫陌\t217413\n甄勇\t217414\n诺一虱\t217415\n小胸\t217416\nOPGYH\t217417\n战士们\t217418\nhjiufd\t217419\n高空之间\t217420\n这句话你说的好对呀\t217421\n好听话\t217422\nthkop\t217423\n安安安安\t217424\n葫妹子\t217425\n大智若明\t217426\n近一年后\t217427\n15534179754\t217428\nnnnol\t217429\neuduw\t217430\n湖南临武县公安局\t217431\n122031\t217432\n国宝\t217433\n卡巴斯基\t217434\n来来来啦啦啦啦啦啦啦那首哪逆袭\t217435\n老北京铝板\t217436\n证实\t217437\n好听说\t217438\n盗猎\t217439\n足量\t217440\nksk\t217441\n18962066525\t217442\n气死讨厌\t217443\n彭德贵\t217444\n足金\t217445\n浩瀚\t217446\n养家\t217447\n托托\t217448\n形美\t217449\nkoutan\t217450\n熊棋涵\t217451\nVcfhfxf\t217452\n断章取意\t217453\n坚持用\t217454\ncgx\t217455\n死雪雪\t217456\n孙恋\t217457\n7n￥\t217458\n我跟你世界城的三\t217459\n申鹭达卫浴\t217460\n脑瘤\t217461\n好真\t217462\n125年\t217463\n测定\t217464\n大乐错嘛不过家离\t217465\n新陈代谢\t21
7466\n掉泪\t217467\n脑瘫\t217468\n母爱\t217469\nERU\t217470\n谁跟你学的我的风影风影\t217471\nstarre\t217472\n二十七块\t217473\n8306888\t217474\n光头强真\t217475\n不山野\t217476\n二百二十四\t217477\n4mm\t217478\n沈北新区\t217479\n上座率\t217480\n活路\t217481\n15835619395\t217482\nWay\t217483\n百果园\t217484\n王静\t217485\n忧知\t217486\n方头方\t217487\n王青\t217488\n武装船\t217489\n心不動\t217490\n峨眉山市\t217491\n四系军团\t217492\n摸摸摸摸摸摸摸摸摸摸摸摸摸\t217493\n萨特\t217494\nyyxxo\t217495\n抛弃\t217496\n17k我爱你个发熊出没开心我想\t217497\n刚正\t217498\n半夜三点\t217499\n敌方\t217500\n品德生活\t217501\n活跃\t217502\n满街\t217503\n熟手\t217504\nLeibovitz\t217505\n晓龙\t217506\n7890112234\t217507\nAMON\t217508\n过寿\t217509\n一下句\t217510\n?\t217511\n彝人古镇\t217512\n两位数\t217513\nghsk\t217514\n期权\t217515\n四十公斤\t217516\n今年7月1日起\t217517\n短摆裙\t217518\nghsd\t217519\n熊二\t217520\nghsf\t217521\n黄板\t217522\n而成\t217523\n而我\t217524\n小姑娘们\t217525\nghss\t217526\n而战\t217527\n方不爱级\t217528\n马国帆你个臭不要脸\t217529\n黄杰\t217530\n前倾\t217531\n古汉\t217532\n罹患率\t217533\n肾宝片\t217534\n几多五六分之\t217535\n饥饿鲨\t217536\n2222121212\t217537\n给予\t217538\n伤心透\t217539\n给于\t217540\n隹丽\t217541\nFth\t217542\n给亚\t217543\n葛剑雄\t217544\n黄村\t217545\n3分钟\t217546\n13252977592\t217547\n潜伏\t217548\n倒迷人一三撒西啦到咪啊沙溪了都谜瑞纺洗衣龙\t217549\nthesameas\t217550\n献身\t217551\n压床\t217552\n月费\t217553\n薛俊妮\t217554\n鸡肉味\t217555\n讨巧\t217556\n受我\t217557\n尾兽\t217558\n干嘛号\t217559\n几分之几2nt\t217560\n新一城\t217561\n爱啦啦阿拉拉\t217562\n新家园\t217563\n中国巨力\t217564\n葱花辣椒\t217565\n八千八百八十八万\t217566\n挨球\t217567\n各自个\t217568\nmgjwdt\t217569\n斯大林格勒\t217570\n弱受\t217571\n你好踩\t217572\n菲瑞\t217573\n匈牙利\t217574\n大凡\t217575\n杂唠\t217576\n老乡\t217577\n罗本来\t217578\n间歇性\t217579\n老累\t217580\n疯狂电视台\t217581\n心眼口\t217582\n就是了哈\t217583\neaf\t217584\n超级棒棒糖\t217585\n大凶\t217586\n秦若伊\t217587\n3171\t217588\n大凹\t217589\near\t217590\n妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈妈\t217591\n嗯嗯噜噜噜噜噜噜噜\t217592\n天天一号吧\t217593\n五六百二十\t217594\n菁江度\t217595\nTF晗\t217596\n诚信999\t217597\n颁发\t217598\n向远方\t217599\n温泉池\t217600\n多一昂\t217601\n狗子\t217602\n一一套\t217603\n张士猫\t217604\n周口市\t217605\nWomen\t217606\n剪出\t217607\nLeg公司\t217608\n张起赫\t217609
\n瞿颖\t217610\n相间\t217611\n路费\t217612\n说笑\t217613\n负债\t217614\nmtoo\t217615\n想说的确认\t217616\n毒舌乔\t217617\n就是我不想\t217618\n王兴佳\t217619\n嘴炮\t217620\ngjcjbc\t217621\n严峻\t217622\n对的错\t217623\n喂喂喂喂喂\t217624\n乳一雪\t217625\n郭静榕\t217626\n邹伟琛\t217627\n伴娘\t217628\n求求求求\t217629\n一礼\t217630\n王母\t217631\n一社\t217632\n醋酸嗯嗯嗯\t217633\n阴见阳\t217634\n简笔画枪的简笔画大全\t217635\n765526562752\t217636\n地雷\t217637\n花球\t217638\nC+斯迦\t217639\n王军训\t217640\n蔡伟豪\t217641\n别扭傲娇受\t217642\n邗江区\t217643\n首唱\t217644\n笑浩\t217645\n罗拉啊朵\t217646\n黑家妹\t217647\n春丽\t217648\n变换\t217649\n代款\t217650\nRuguo\t217651\n会员们\t217652\n合区\t217653\njbkb\t217654\nfriends\t217655\n再见再见再见再见明天见明天见\t217656\n河软东\t217657\n徐光启\t217658\n想多了\t217659\n陈虚\t217660\n白和\t217661\n一头儿车儿\t217662\n药王殿\t217663\n汪正\t217664\n两极分化\t217665\n春丛\t217666\n竹叶青\t217667\n羊水量\t217668\n好vv\t217669\n好vh\t217670\n一片一片一片一片\t217671\n万能看猩猩归来\t217672\n春一\t217673\n中后\t217674\n合化\t217675\n嫌弃\t217676\nknnn\t217677\nknnj\t217678\n陆世涛\t217679\n有轨\t217680\n三堡\t217681\n狠抓\t217682\n加奥斯\t217683\n美女人\t217684\n怎怎\t217685\n闭口\t217686\nkoutyjkhovocdtsydhhiyiuifutdiyd\t217687\n128十64\t217688\n美女亲\t217689\n回一声\t217690\n烧瑞\t217691\n传奇史\t217692\n电子手铐\t217693\n1218\t217694\n早味\t217695\n拼活\t217696\n不废话\t217697\nRichebourg\t217698\n的我饿我饿\t217699\n民间人士\t217700\n束河\t217701\n進議場\t217702\nhecry\t217703\n撰色\t217704\n早呢\t217705\nJYJ雇用公司\t217706\n吴夏萍\t217707\n写好\t217708\n骗奸\t217709\nGG嘎嘎嘎嘎GG\t217710\n中金在线\t217711\n12345678910111213141516171819202122232526272829\t217712\n万岁万万岁\t217713\ncifft\t217714\n俨然\t217715\nhcj\t217716\n娇啼娇\t217717\n一寸金\t217718\n今天中午\t217719\n回道\t217720\n八百六\t217721\n这么说什么\t217722\n袁冰妍\t217723\nhci\t217724\n迷路姬\t217725\n我的话再见吧度秘再见吧度为好在一间8度秘再见吧度度度度秘再见再见再再再再见再见吧再见么读\t217726\n肏鄙\t217727\n鸡排\t217728\n小熊熊\t217729\n热单\t217730\n花博会\t217731\n季节改委\t217732\n王耀\t217733\n王老\t217734\n今晚8点整\t217735\n魔导\t217736\nhcl\t217737\n王海明\t217738\n就是就是就是你是男孩女孩\t217739\n多八少个\t217740\n3000份\t217741\n金环\t217742\n乙辰丑\t217743\n发张照片儿\t217744\n哪贞子\t217745\n1112111111\t217746\n醉头\t217747\n归档\t217748\n唉呀爱\t2
17749\n要不不对\t217750\n关我不想\t217751\n哈尔\t217752\n14722668509\t217753\n国防\t217754\nfugu\t217755\n美国证券交易会\t217756\n肉身\t217757\n怎用\t217758\nqqcom\t217759\n4646566865686864\t217760\n亲个嘴儿\t217761\n嗯干嘛\t217762\n小指\t217763\nfugd\t217764\n你好说你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌\t217765\n豆面相了我不我\t217766\n跳蛛\t217767\n亲亲哒\t217768\n包蕴\t217769\n发问\t217770\n这个世\t217771\n徐贤\t217772\n床上用品\t217773\n北京天则经济研究所\t217774\n表情行\t217775\n乳制品\t217776\neypgzxbj\t217777\n不服试\t217778\n太哈哈\t217779\n建桥史\t217780\n发闷\t217781\n亚青寺\t217782\n584乘\t217783\n红包群\t217784\n164467376818358688\t217785\nsjs\t217786\n二两把\t217787\n浓抹\t217788\nUvh\t217789\n自古以来\t217790\n长草\t217791\n杨湘晗\t217792\n九颗\t217793\n哈东面\t217794\nsjd\t217795\n舅子\t217796\n焐\t217797\n焕\t217798\n焖\t217799\n10005\t217800\n九型\t217801\n焚\t217802\n69.85%\t217803\n冰雁\t217804\n6个\t217805\n全民\t217806\n雪怪\t217807\n焉\t217808\n告诉我的朋友\t217809\nhttprdwzcn76a61c2\t217810\n6两\t217811\n刘士毅\t217812\n焱\t217813\n代替词\t217814\nwtaxbip\t217815\n冰雪\t217816\n然\t217817\n冰雨\t217818\n反贪\t217819\n畅玩包\t217820\n從化\t217821\n無\t217822\n印证\t217823\n焥\t217824\n拿拿\t217825\n你是谁你是谁你是谁你是谁你是谁你是谁呀你是谁\t217826\n肉火腿肠\t217827\n高盖\t217828\n6一\t217829\n盖尔\t217830\n温温\t217831\n6万\t217832\n罗雄生\t217833\n焯\t217834\n路\t217835\n跪\t217836\n翻眼\t217837\n跨\t217838\n等等等等等等等等等等等等等等等等等等等等等等等等等等等等\t217839\n做寿\t217840\n15867\t217841\n西欧分配二外0017w020\t217842\n病危\t217843\nhrifas\t217844\n跿\t217845\nG10咖\t217846\n跺\t217847\n跻\t217848\n定陶市\t217849\n跷\t217850\n为蚀\t217851\n践\t217852\n下文\t217853\n跳\t217854\n博瑞\t217855\n跌\t217856\nTFB0ys\t217857\n透开\t217858\nCbd\t217859\n太坏了\t217860\n跄\t217861\n美团团\t217862\n削平\t217863\n豆豆豆豆豆豆豆豆豆豆豆子豆子\t217864\n跟\t217865\n距\t217866\n一杯个\t217867\n罗带峰\t217868\n白胡椒河虾=麦虾虾\t217869\n40个工作日\t217870\n咯kk\t217871\n跑\t217872\n1位\t217873\n来了来了浏览量浏览量\t217874\n备孕\t217875\n格格格格\t217876\n炎黄春秋\t217877\n陈培龙\t217878\n浮不躁\t217879\n祸水吧安吉多话说\t217880\n收听\t217881\n好不好不好\t217882\nsocusodoit\t217883\n汁液\t217884\nready\t217885\n冠军花\t217886\n宅女神探桂香\t217887\n会厌\t217888\n土话\t217889\n尤梓涵\t217890\n原生\t217
891\n黄锡麟\t217892\n努\t217893\n赶工加班\t217894\n免疫\t217895\n沈从文\t217896\n洪某\t217897\n15904571909\t217898\n7月16号十点\t217899\n叫守候\t217900\n舞步\t217901\n原画\t217902\n愿不愿意\t217903\najmg\t217904\n卡住\t217905\nvggggggy\t217906\n奋斗\t217907\njrirjid\t217908\n滚滚滚滚滚\t217909\n准准会\t217910\nststrsau\t217911\n指定\t217912\n金韩彬\t217913\nstattsetto\t217914\n陆克文\t217915\n孙去\t217916\nUckg\t217917\n1326\t217918\n隔夜仇\t217919\n卡佩\t217920\nGF\t217921\n今儿良\t217922\n相仰\t217923\n被片\t217924\nGD\t217925\n我的秘度\t217926\n古今贤文劝学篇\t217927\n10000000000000000000000000000\t217928\n55555215500007744417777777777777777777777777777777\t217929\nGB\t217930\n腐动\t217931\n王默\t217932\nGA\t217933\n盗帅欧\t217934\n非对称性\t217935\ngjfb\t217936\n坏姐姐\t217937\n阵仗\t217938\n201600\t217939\n201602\t217940\n躺枪\t217941\n情事\t217942\n台州市\t217943\n休闲类\t217944\n斗地主呢呗\t217945\n情人\t217946\n4855535555\t217947\n超级薇雅\t217948\n找惹你\t217949\n黑眼圈\t217950\n熬夜\t217951\n嗯嗯军克十里\t217952\n52406724\t217953\n牛頭挂角\t217954\n猜然\t217955\n林夕旋\t217956\n慧眼\t217957\n咿呀那鲁\t217958\n板死\t217959\n生空\t217960\n嗯莹\t217961\n弥虎\t217962\n捆绑式\t217963\n3.3亿元\t217964\ndumidumi\t217965\n劈\t217966\n宁死\t217967\n强化班\t217968\n用字\t217969\n大家你的你\t217970\n鸭粉丝汤\t217971\n外来论\t217972\n六千四百二十六万四千四百一十八人次\t217973\n龙口一中\t217974\n博客群\t217975\n领跑者\t217976\n布兰科\t217977\nsidn\t217978\n穿山甲\t217979\n讲一说\t217980\nside\t217981\n发酵乳\t217982\n24g\t217983\n说不见的意思\t217984\n厌慌\t217985\n玩具车\t217986\n24k\t217987\nfhfdhddgdgtttefgdcgdcbfadtt\t217988\n钱少\t217989\nhfgjgfitghohvvssshltgihrituoouuopo\t217990\n爱过你\t217991\n走出我的世离\t217992\n我读书少你真的别骗我\t217993\nuggfbughhhuhhj\t217994\n试金石\t217995\n张瑞城\t217996\n理非\t217997\n要不再来\t217998\n吗庶\t217999\n程序化\t218000\n受得\t218001\n一七五\t218002\n不乖不乖不乖不乖不乖你不贵不贵不贵度贵不鬼步鬼步\t218003\n陈公公\t218004\n来你可以\t218005\n五2\t218006\n李轩\t218007\n五0\t218008\nkied\t218009\n对话栓\t218010\n李车\t218011\n上班大人\t218012\n扎西卓玛\t218013\n狛枝凪\t218014\n6558765756754665476069\t218015\nDLC\t218016\nFFGHJL\t218017\n猪猪猪侠猪\t218018\n力\t218019\n更早\t218020\n鸿运羊\t218021\n245\t218022\n244\t218023\n247\t218024\n246\t21802
5\n241\t218026\n240\t218027\n243\t218028\n242\t218029\nintentIntentSK1171477665AA74E8940FA2E8E9EB6D6AECB24B90F4E2C36261DA0DE4A40CE8695BA850A3E5end\t218030\n黑敢\t218031\n249\t218032\n248\t218033\n笑颜\t218034\n酸辣牛肉面\t218035\nshopforfood\t218036\n话费你太帅了我真爱你\t218037\n耶斯区\t218038\n于海\t218039\n分页\t218040\n哎呦呦呦\t218041\n萌噠噠\t218042\n何怒\t218043\n于浩\t218044\n收起\t218045\n乌市猫\t218046\n想你了有\t218047\n24只\t218048\nrurmrkeku\t218049\n吃香\t218050\n谢谢你在干什\t218051\n蹦沙卡拉卡\t218052\nckdkdbchd\t218053\n九阴白骨爪\t218054\n他些\t218055\n惊呆\t218056\nfh48\t218057\n阴极铜\t218058\n呗布\t218059\nHoare\t218060\n出路\t218061\n瓜类\t218062\n一起快乐玩耍\t218063\n吸血管家\t218064\n风针\t218065\nzhetitoc\t218066\n黄鳝\t218067\n惊呼\t218068\n第一哦\t218069\n健殆夺金志\t218070\n博艾\t218071\n三小子\t218072\n准信\t218073\n太疼\t218074\n了了办\t218075\n超级恨你讨厌你\t218076\n陈凯歌\t218077\n红旗南路\t218078\n刘文淼\t218079\n刘亦菲\t218080\n熊熊出没之熊心归来\t218081\n15.1\t218082\n大灾难\t218083\n追溯\t218084\n黄视频\t218085\n哈鬼\t218086\n青石板\t218087\n数英\t218088\n刘工\t218089\n披头\t218090\nnunur\t218091\njhjh\t218092\n迈克\t218093\n吴骆明\t218094\nmyfather\t218095\nABCC行式\t218096\n珠峰\t218097\n漫谈气候变暖\t218098\n不要不要不要不要不\t218099\n心心正则\t218100\n灶公灶婆\t218101\n35858n55686hnbk\t218102\n程兴\t218103\n非人类\t218104\n八个字\t218105\n适口性\t218106\n酒泡辣椒\t218107\n恋爱办\t218108\n纪检组\t218109\n海珠秘\t218110\n稿件\t218111\n养乐\t218112\n统帅\t218113\n李湘\t218114\n是的我喜欢\t218115\nergnnnnnnnnrrrrnnnrrdrffffnffffffffffnf\t218116\n中国共产党\t218117\n穷人乍富\t218118\n十二点菜\t218119\n不灵通\t218120\n竹海\t218121\n通话员\t218122\nvvvvg\t218123\n6cm\t218124\n窨井盖\t218125\n转念\t218126\n垫毯\t218127\n日趋\t218128\n煎面\t218129\n7年前\t218130\n曹佳妮\t218131\n十月份\t218132\n满眼\t218133\n鲁一雪\t218134\n宋洋洋\t218135\n西学院\t218136\n摩洛哥人\t218137\n宋家你好听歌\t218138\n1点30\t218139\n霎时间\t218140\n1点36\t218141\n555444522\t218142\n一月多少\t218143\n两手\t218144\n偿\t218145\n要不群殴\t218146\n鹏子\t218147\n梅祖芬\t218148\n张帅\t218149\n张帆\t218150\n李升基\t218151\n洗文媚\t218152\ndoh\t218153\n不要说得过去\t218154\ndon\t218155\ndoo\t218156\ndom\t218157\nPoekekwwknje\t218158\n孟买\t218159\ndof\t218160\ndog\t218161\ndod\t218162\n周文丽
\t218163\n请记住\t218164\ndoy\t218165\n361个\t218166\n大眼花\t218167\n两批\t218168\ndor\t218169\ndos\t218170\nhfrvbttfgttgdhuuytfergufdhihedeuhfrvvffhbfefyhceegbdfyvddhbccgjr\t218171\n祝捷\t218172\ndow\t218173\ndou\t218174\n不V\t218175\nCLFFQ\t218176\nluohoa\t218177\n真田幸村\t218178\n胶水\t218179\n一日游记\t218180\n不[\t218181\n那你西游\t218182\n作图\t218183\n恐怖再来片\t218184\n段祥涛\t218185\nFgthhuu\t218186\n4xy18\t218187\nsowb\t218188\nfcvhj\t218189\n终有\t218190\n乌鲁克\t218191\n白百讲\t218192\n4xy10\t218193\n不v\t218194\n大本命\t218195\n米塞\t218196\n医技\t218197\n依晨\t218198\n戍人\t218199\n不x\t218200\n泗岁岁岁数钱\t218201\n不n\t218202\n铁电话\t218203\n不l\t218204\n童静\t218205\n李\t218206\n杏\t218207\n请面\t218208\n习惯就好\t218209\n杈\t218210\n杉\t218211\n杆\t218212\n15872414063\t218213\n杂\t218214\n权\t218215\n杀\t218216\n我看完\t218217\n90度\t218218\n杜\t218219\n杝\t218220\n杘\t218221\n激起\t218222\n杖\t218223\n到时候\t218224\n在家无\t218225\n材\t218226\n村\t218227\n杯\t218228\n杭\t218229\n饭堂\t218230\n不1\t218231\n杩\t218232\n徐龙杰\t218233\n藕塘村\t218234\n打消\t218235\n样片\t218236\n条\t218237\n松\t218238\n板\t218239\n杼\t218240\n了如\t218241\n小蜜你敢不理我我开除你\t218242\n金硕\t218243\n杵\t218244\n杲\t218245\n攀升\t218246\n冬娃儿\t218247\n東\t218248\n萨达特\t218249\n便民\t218250\n爱喽爱喽\t218251\n侯伯伯\t218252\n着重\t218253\n哈提款机哈他俱乐部天拉拉队\t218254\n押题\t218255\n胡明月\t218256\n咒度\t218257\n过分风风光光\t218258\n赛点\t218259\n狼变\t218260\n邱钦\t218261\n了不得不\t218262\n英方\t218263\n衤貌\t218264\n点燃\t218265\n枪击\t218266\n英文\t218267\n36个\t218268\n幻们\t218269\n收敛\t218270\n再见了我还\t218271\n榻榻米\t218272\n杨志\t218273\n附着力\t218274\n斯波\t218275\n130831191\t218276\n辣椒酱\t218277\n就却\t218278\n欠戏\t218279\n银监会\t218280\n旧房\t218281\n胡宗宪\t218282\n香樟\t218283\n条街\t218284\n子鼠丑牛寅虎\t218285\n40套\t218286\n粑比足维\t218287\n一块糖\t218288\n唔系边度\t218289\n镯子\t218290\n十三分\t218291\n几秒钟\t218292\nazan\t218293\n你的偶像是谁呀嘟咪嘟咪味\t218294\n呼啦汤\t218295\n约炮行\t218296\n小卖家\t218297\nfnt\t218298\nfnr\t218299\nfnq\t218300\n别饿死老秘你在干嘛\t218301\n不约\t218302\n段建钢\t218303\nfng\t218304\n阿凡\t218305\n宿便\t218306\n阿凤\t218307\nAgaya\t218308\nfnn\t218309\n九七九九九九九九九九九千\t218310\n一千二千三三个\t218311\nfnk
\t218312\nfnj\t218313\nfni\t218314\n阿凯\t218315\n货品\t218316\n瘦脸针\t218317\nq7k\t218318\nbakalahoula八思科mola乒乓cencom\t218319\n口音\t218320\n国际花卉节\t218321\n韩啸天\t218322\n呃移机\t218323\n2524398534\t218324\n付恐\t218325\n旭升\t218326\n闰秒\t218327\n中心城\t218328\n艾莱依\t218329\n么么\t218330\nhgznu\t218331\n哎呀奥\t218332\n塞翁白羽\t218333\n万美元\t218334\n陕西队\t218335\n天津市\t218336\n雄健\t218337\namrazhentl\t218338\n致癌物\t218339\n碰了哪儿\t218340\n估价\t218341\n欢乐基\t218342\nSober\t218343\n递减\t218344\n宝鸡\t218345\n卡萨诺\t218346\n世博园\t218347\n钻关\t218348\nkots\t218349\n我的我的样子\t218350\n152岁\t218351\n尕面片\t218352\nvghhjjjj\t218353\nhhachgkm\t218354\nbmn\t218355\n画纸\t218356\n画线\t218357\n苦工\t218358\n元斌\t218359\nCCTV—4\t218360\n凌辱\t218361\n陈燕玲\t218362\nsoeCddueb\t218363\n挺忙\t218364\n龟宝\t218365\n申诉表\t218366\n修真机器人\t218367\n京剧票友节\t218368\n告诉我知道\t218369\n中塔\t218370\n吊袜\t218371\n翻天\t218372\nTOP4\t218373\n于德\t218374\n发烧友\t218375\n基中\t218376\n新东西\t218377\n衬托\t218378\n得色\t218379\n拉屎爽\t218380\n朱琳\t218381\n王晓\t218382\n如皋\t218383\n高价股\t218384\n花姑娘\t218385\n跑进\t218386\n很多集\t218387\n库特\t218388\nStewart#\t218389\n2011年5月11日\t218390\n营业执照儿\t218391\nmake\t218392\n怂色\t218393\n基三\t218394\nmaka\t218395\n王晴\t218396\n何帅哥\t218397\n王晶\t218398\nfhfttfiou\t218399\n三四点\t218400\n屄里来\t218401\nx00…admwtpgjqgjthjulvktl\t218402\n点字峰\t218403\n陈诚\t218404\n又爱\t218405\n牙塔雷斯\t218406\nim2.0\t218407\nImnot\t218408\n黄佳一\t218409\n救美\t218410\n搞不好\t218411\nvvv你在哪里呀唉唉唉我在幼儿园\t218412\n9.6万亿\t218413\n中国保监会\t218414\n市面\t218415\nwouldyou\t218416\n放人\t218417\n陈诺\t218418\n6464649485859858587878488494949494949948494487\t218419\n黄佳丽\t218420\n喵喵喵喵喵喵喵\t218421\n六一个\t218422\nuuuuuu\t218423\n可臭\t218424\nmjdgfysiu\t218425\n亲情篇\t218426\nzkax\t218427\n小日本\t218428\n条幅\t218429\n宜春市\t218430\n也想\t218431\n日光浴\t218432\n灰太赞\t218433\nabouen\t218434\n15966885123\t218435\n上海国金中心\t218436\n九斤四两\t218437\nMmmmmmm\t218438\n摩羯哦\t218439\nrdhjl\t218440\n米开郎基罗\t218441\n潘飞\t218442\n经营类\t218443\n吐舌\t218444\n首tfboys\t218445\n阴物\t218446\n推辞\t218447\n少杰\t218448\n转链接\t218449\n耶瑟\t218450\n胡萝卜片\t218451\n不惑
之年\t218452\n少来\t218453\n代考\t218454\n过步\t218455\n小云云\t218456\n发改委//\t218457\n11944\t218458\n578217556369422884\t218459\nreuse\t218460\n那轩铭\t218461\n胡思睿\t218462\n魏晨灿\t218463\n栖杍\t218464\n老和尚意\t218465\n参考文献\t218466\ngangzen\t218467\n艺名\t218468\n姓李\t218469\n东路\t218470\n自燃\t218471\n掌权者\t218472\nggfgg\t218473\n储蒙\t218474\n咋扇\t218475\n宋江棋\t218476\n81岁\t218477\n人资费\t218478\n掳走\t218479\n肆意\t218480\n我爱你我要欧耶欧耶\t218481\n吴胜焕\t218482\n本草\t218483\n哈姆\t218484\n长沙市中心医院\t218485\n委屈块\t218486\npoto\t218487\n毛绒\t218488\n安危\t218489\n複複\t218490\n妍熙\t218491\n朱厚照\t218492\n我不我不理你了我不\t218493\niuyou\t218494\n王梦圆\t218495\n小涛\t218496\n裸捐\t218497\ngcjn\t218498\n参杂\t218499\n降龙十八掌\t218500\n三毛五\t218501\n替罪\t218502\n目十行\t218503\n不和\t218504\nmfdgkce\t218505\n老夫老妻浪漫回忆史\t218506\n小涵\t218507\n855555588888888898888\t218508\n州界\t218509\n安南\t218510\n安单\t218511\n安博\t218512\n凤殿p份\t218513\n艾迪牛\t218514\n求求你了我缺爱\t218515\n3q43q\t218516\n汇川技术\t218517\n哈牛比\t218518\n朱敏\t218519\n累特\t218520\n小涨\t218521\niiiroyoyrxhystcxhfffrj777rghxugyfb\t218522\n机场\t218523\n勃不\t218524\n朵丽丝o波\t218525\n陈鑫\t218526\n贵阳市修文中学\t218527\n云团\t218528\n有一想一想\t218529\n参评\t218530\n不二家糖\t218531\n挖开\t218532\n#黑狐#\t218533\n综艺节目\t218534\n告告\t218535\n毛瑟\t218536\n5百\t218537\n波鸟\t218538\n优惠\t218539\n啦啦啦啦啦啦啦啦啦啦啦啦啦不够不够不够不够不够不够不够不够不够不够\t218540\n苟富贵\t218541\nbb粉\t218542\n第一笔\t218543\nluan\t218544\n培根肉\t218545\n巨绳\t218546\n凯里\t218547\n贴心\t218548\nCA1753\t218549\n度秘你感觉你是我的闺蜜\t218550\n收心\t218551\n一点五ｇ\t218552\n团伙\t218553\n新兴市场\t218554\nWWDC\t218555\n一露\t218556\n大一百岁\t218557\n你是我隔壁老王\t218558\n13459763128\t218559\n本格格\t218560\n做不惯\t218561\n天天快点\t218562\n起码\t218563\n一霎\t218564\n瓜娃\t218565\n9658418686624896\t218566\n人造\t218567\n一只35mm\t218568\n玛斯\t218569\n就样\t218570\n走吧一起\t218571\n感谢信\t218572\n哈维·阿隆索\t218573\n文艺复兴\t218574\n眯瞪\t218575\n75578\t218576\n动荡不定\t218577\n梦梦\t218578\nugjgjgigjhegfjgjguyfugyfbbbjhhhhhh\t218579\n没有关\t218580\nhttpfhiphotosbaiducomxiaodupicitem472309f790529822574b1eb7d0ca7bcb0a46d44djpg\t218581\nrdtp\t218582\n广告语\t218583\n宅院\t218584\n薰衣草\t218585
\n2333333333333333333333333333333333333333333333333333333333333\t218586\n本基切\t218587\n亦兴\t218588\n付建波\t218589\n无异味\t218590\n许多人\t218591\n广告词\t218592\n县官\t218593\n陈森祥\t218594\n楼楼\t218595\n小熙\t218596\n狂欢武\t218597\n相食\t218598\n猜猜\t218599\n1984年\t218600\nSounio\t218601\n安斯源\t218602\nwjsbbdb\t218603\n问一声\t218604\n1090594374\t218605\nzeng\t218606\n专心\t218607\n波折\t218608\n小熊\t218609\n宾得\t218610\n583806\t218611\n喵星\t218612\n#鹿晗luhan#\t218613\n廵卜骨丝邕业业黯黯记\t218614\nigtthfo\t218615\n64354886\t218616\n喀什\t218617\nhwangbin\t218618\n出包王女\t218619\nutouto\t218620\n三八房\t218621\n明天天\t218622\n14点半\t218623\nnaive\t218624\n爱吾爱\t218625\n好冤\t218626\n发胖\t218627\n天天说\t218628\n郎成\t218629\n15084416152\t218630\n个帮\t218631\n好冰\t218632\n依然\t218633\n张雨舟\t218634\ntf博艾斯\t218635\n别着急慢慢来\t218636\nyou别瑞\t218637\n好冻\t218638\n电话单\t218639\n流下\t218640\n好机敏\t218641\n一分钟分钟\t218642\n漂亮关\t218643\n经济权\t218644\n发胶\t218645\nskyaxce\t218646\n电话卡\t218647\n哑口无言\t218648\n长版\t218649\nWii\t218650\n火花\t218651\n说比方说\t218652\n刘龙头\t218653\n我看跑男跨年跨年\t218654\n27000家\t218655\n好写\t218656\n长牙\t218657\n111个\t218658\nWix\t218659\n祺频\t218660\n18888\t218661\n清肺\t218662\n快乐生日快乐\t218663\n戴佳柱\t218664\n18883\t218665\n文艺范\t218666\nTfuf\t218667\n天福\t218668\n如其来\t218669\n恶魔谷\t218670\n韩版\t218671\n清肝\t218672\n佳能1D\t218673\n智能机\t218674\n千花\t218675\n铠甲瓜\t218676\n收据\t218677\n回吻\t218678\n秦时明月\t218679\n三、五\t218680\n欧珀\t218681\n日月潭\t218682\n强大\t218683\nIamAnna\t218684\n网磅\t218685\n苹果6sss\t218686\n嗯行\t218687\n里木\t218688\n旭光\t218689\n第57个\t218690\n别闹关\t218691\n禾驟\t218692\n弄好不好\t218693\n大城中村\t218694\n二纬路\t218695\n615米\t218696\n嘎嘎嗝\t218697\n575464\t218698\n摊手\t218699\n韩冰冰\t218700\n童佳倩\t218701\n带离者\t218702\n价格\t218703\n上饶\t218704\n咋我\t218705\n十一黄金周\t218706\n201点儿\t218707\n4267426071190759845985\t218708\n商务部\t218709\n亮灯\t218710\n3000名\t218711\n薇竹\t218712\n765522345\t218713\nAVKoKoCOM\t218714\n丐洳\t218715\n牛拜\t218716\n乳沟\t218717\n受舍\t218718\n得月\t218719\n我爱我不爱你你爱我\t218720\nfhjfdvgddgj\t218721\n杨玉良\t218722\n国产车\t218723\n代表人物\t218724\n橡皮犬\t218725\n呦呵pig\t21
8726\n默不作声\t218727\n二点钟\t218728\nbds\t218729\n徐姐\t218730\n对数\t218731\n众位\t218732\n推说\t218733\nurieikdjdnheihduiknfhi9Doqplndd\t218734\n受制\t218735\n要不我不在\t218736\n2012年6月3日\t218737\n相忘\t218738\n受到\t218739\n吕惠娴\t218740\n身历其境\t218741\n受别\t218742\n照找\t218743\n猜气\t218744\n蔓宁美那\t218745\n儿娃娃\t218746\n曼不在\t218747\n20光年\t218748\n碧天\t218749\n西双版纳\t218750\nXBTQ\t218751\n苍岩\t218752\n临清\t218753\n不敢情景泰和平\t218754\n八一摇头灯\t218755\n泷泽萝拉\t218756\n划拜\t218757\n飒然\t218758\n帅太\t218759\n没有了啦\t218760\n18453606466\t218761\nycyycuuu\t218762\n五杀\t218763\n五权\t218764\n笑好\t218765\n平安安\t218766\n划拉\t218767\n开心树\t218768\nshifF\t218769\n蜚嫠\t218770\n幅度\t218771\n干鸟\t218772\n收视调查\t218773\n划拳\t218774\n中色\t218775\n公粮\t218776\n温州省\t218777\n好骗\t218778\n五条\t218779\nu个\t218780\n酞六五\t218781\n苍岚\t218782\n划拨\t218783\n648484848\t218784\n苍岗\t218785\n官方版\t218786\n好啦谢你\t218787\n死起\t218788\n2倍\t218789\njururuffur\t218790\n密不可\t218791\n峻\t218792\n过会儿见\t218793\n佳潞潞\t218794\n人文600嗯日日日日日日\t218795\n如图有我的11his\t218796\n我是你的陌生人妈呀你\t218797\n良木有\t218798\n唐涵\t218799\n领航\t218800\n蛇度秘\t218801\n奥太廊曼\t218802\n23十5\t218803\nhhuhj\t218804\n我的歌声里\t218805\n家境\t218806\n弓子\t218807\n董玉超\t218808\n长春市\t218809\n只个\t218810\n581d\t218811\n八十家里阿狸啦咖喱吧八吧爸12爸八必经\t218812\n上海浦东国际机场\t218813\n鲜奶酸牛奶\t218814\n五百七十块\t218815\n红糖冰\t218816\n32天\t218817\n15822008995\t218818\n一两个\t218819\n另一张照片\t218820\n四三几点\t218821\n673分之二\t218822\n卖给\t218823\n软件儿\t218824\n独自\t218825\n多手司里尜\t218826\nfrvj\t218827\njdURLywy\t218828\n季终\t218829\n芭芭拉小魔仙\t218830\n哈搞笑\t218831\n你好度秘你好度秘\t218832\n非告诉我\t218833\n来和来\t218834\n丑中丑\t218835\n四王子的复仇记\t218836\n借枪\t218837\n黑EXO\t218838\n来星美\t218839\n钱钱\t218840\n亨氏\t218841\n桃源\t218842\nCnbgnhf\t218843\n八六三二二三八二\t218844\n會不會\t218845\n戏音乐\t218846\n校人\t218847\n板神\t218848\n一五儿8577而宝乐儿\t218849\n降杂\t218850\n懂礼让\t218851\n陈夸\t218852\nyangziya\t218853\n158598855885566899\t218854\n美攵\t218855\n哆咪\t218856\n稀奇\t218857\n还没有你\t218858\nfhtnchn\t218859\n陈天\t218860\n村落\t218861\n陈大\t218862\n鹿书颖\t218863\n丹灶\t218864\n参议\t218865\n环王\t218866\n小黎铭\t218867\n入口\t2188
68\n我一天\t218869\n央视春晚摩术师\t218870\n主动权\t218871\n3萬\t218872\n港货\t218873\n双UGG\t218874\n时计\t218875\n500多年\t218876\n参访\t218877\n病毒性心肌炎\t218878\n程咸\t218879\n时讯\t218880\n宝贝宝贝宝贝宝贝你爸白白\t218881\n驹过隙\t218882\n北京科技\t218883\ns2222\t218884\nuuuutY\t218885\n西校\t218886\n腐剧\t218887\n自然力\t218888\n周柏豪\t218889\n富坚老贼\t218890\nhglgvnkxvjgv\t218891\n臭子\t218892\n你好关\t218893\n上周五\t218894\n家乐福麦当劳\t218895\n抢镜王波什\t218896\n好吧谢谢你我爱你\t218897\n１１座\t218898\n老三有性\t218899\nonouse\t218900\n报说\t218901\n媛媛\t218902\n报请\t218903\n化合价\t218904\nproposal\t218905\n杜拉拉\t218906\njeuk\t218907\n你在说什么英雄传普通话\t218908\n不可爱\t218909\n欺负妮\t218910\n多米庙\t218911\n红糖\t218912\n鬼医\t218913\n130\t218914\n517分之二\t218915\n作天\t218916\n翠柳\t218917\n金榜题名\t218918\n黄金时间\t218919\n换洗\t218920\n达数真的可\t218921\n考察团\t218922\n一个七八百\t218923\n汉堡包雅\t218924\n翠柏\t218925\n有钱任性\t218926\n起步\t218927\n起死\t218928\n伟大张\t218929\n按说\t218930\n8分之七小时\t218931\n126554851\t218932\n格叽格叽格叽格叽格叽\t218933\n字帖\t218934\n爱色\t218935\nmoderme\t218936\n涂料\t218937\n更好\t218938\n十8\t218939\n打怕\t218940\n甚微\t218941\ngffdx\t218942\n入八儿\t218943\n调味料\t218944\n蓝绿色\t218945\n并秘\t218946\n默哀\t218947\n鬼死猪\t218948\n海菊东篱下\t218949\nlxiao000\t218950\n大家好我是人\t218951\n0点儿\t218952\nxla\t218953\n尊可好\t218954\n金盾\t218955\n欢颜好爱我的\t218956\nxjsbckzbzke\t218957\n镀膜\t218958\n有雨没\t218959\n做词\t218960\n良母\t218961\n汊\t218962\n没心\t218963\n温家宝\t218964\n九百元\t218965\n一月十八号\t218966\n找你你是\t218967\n山下\t218968\n山上\t218969\n幸运地\t218970\n省省力\t218971\n路边社\t218972\n21搬家\t218973\n圆周角\t218974\n山丘\t218975\n鬼哭狼嚎\t218976\n山东\t218977\n对头对头对头对头对头醉酒醉酒醉酒\t218978\ngjtd\t218979\n严打\t218980\n累不回头\t218981\nowjrbtbgnckospiehcbco\t218982\n0点2x0点\t218983\n说你\t218984\nEDDIE\t218985\n省略\t218986\n雄辩\t218987\n不要不要不要不要不要不要不要不要不要不要不要不要不\t218988\n1901377546421\t218989\n110201朴特\t218990\n西乌皮\t218991\n崖\t218992\n陈颖\t218993\n崔\t218994\n加995995\t218995\n服了服\t218996\n福建队\t218997\n像片\t218998\n亲人\t218999\n好玩儿跨\t219000\n庭里\t219001\n洪水猛兽\t219002\n具备\t219003\n印子\t219004\n最后一次年\t219005\n阿龙\t219006\nrobut\t219007\n阿龟\t219008\n崴\t219009\ndtukv\t219010\n
Being\t219011\n呼噜噜噜噜\t219012\nSCE\t219013\n唱序幕\t219014\n崽\t219015\n透漏\t219016\n青山区\t219017\n冯庄路\t219018\n老山\t219019\n崩\t219020\n纸屑\t219021\n大海边儿\t219022\n1P\t219023\n1S\t219024\n李欣格\t219025\n1W\t219026\n度秘你是我的好秘秘\t219027\n国光\t219028\n和和和和和和和和和成亿万\t219029\n角架\t219030\nDontno\t219031\n翼城现代汉语\t219032\n创造力\t219033\n车技\t219034\n早点元\t219035\n想不是\t219036\n1D\t219037\n王土阴门普科恩\t219038\n1F\t219039\n1I\t219040\n1H\t219041\n昨天上午\t219042\n犯罪嫌疑人\t219043\n1M\t219044\n遇见\t219045\n1O\t219046\n捧花\t219047\n1q\t219048\n1p\t219049\n1s\t219050\n1t\t219051\n1v\t219052\n清零\t219053\n响声\t219054\n李苗莉琳\t219055\n男孩儿女孩儿\t219056\n1a\t219057\n口马\t219058\n小型犬\t219059\n1b\t219060\n美食\t219061\n1d\t219062\n1g\t219063\n1f\t219064\n诡辩\t219065\n1k\t219066\nv狗\t219067\n1m\t219068\n1l\t219069\n嗯错\t219070\n够力\t219071\n任杜\t219072\nbuta\t219073\n三六九等\t219074\n不懂不着\t219075\n河女\t219076\n不慌不忙\t219077\n周恩祺\t219078\n端杯\t219079\nbuto\t219080\n13646635465\t219081\n冀东\t219082\n黑曜石\t219083\n春水初平\t219084\n外星鼠\t219085\n啦啦啦啦啦啦你是个粑粑粑粑粑粑\t219086\nv呗\t219087\n11\t219088\n怀着\t219089\n13\t219090\n12\t219091\n15\t219092\n14\t219093\nzsr\t219094\n16\t219095\n19\t219096\n18\t219097\n酷女们\t219098\n阿妈\t219099\n防警\t219100\nzsg\t219101\n1%\t219102\n余佩\t219103\n朱丽叶\t219104\nzsl\t219105\n型男\t219106\n赵常庄\t219107\nzsh\t219108\n想你了我想你\t219109\n恐怖片子\t219110\n次品\t219111\n岳父\t219112\nu影\t219113\n撸撸撸wwwwww\t219114\n扣分\t219115\n帕瓦罗蒂\t219116\n小蒂亚\t219117\n打鸡刺激\t219118\n计生\t219119\n话话费\t219120\n小坏\t219121\n鲜妍\t219122\nchiphotosbaiducomxiaodupicitem314e251f95cad1c8cde68153783e6709c93d5160jpg\t219123\n小坛\t219124\n猪了猪了猪\t219125\ngjgjgjgjptptptptptputptptptptp\t219126\n进律\t219127\n洋葱头\t219128\n吧s\t219129\n小坐\t219130\n小块\t219131\nqqww\t219132\n蒙昧\t219133\nf1314520397\t219134\n悲喜\t219135\n依靠\t219136\n埋汰\t219137\n零下胰胰度\t219138\n开机宝开\t219139\n嚼碎\t219140\n路遥知马力\t219141\n托起\t219142\ntdv\t219143\n汳\t219144\n福斯电影公司\t219145\n8888\t219146\nqqwm\t219147\n猎犬\t219148\n汰\t219149\n嘴臭不臭\t219150\n干嘛联系人家\t219151\n江一斌\t219152\n数少四\t219153\n八秒钟\t219154\n依静\t219155\n不扭扣
\t219156\nuffbjffvjcv\t219157\n喵喵酱\t219158\n十二个\t219159\np1厅\t219160\n说话时\t219161\n啦啦啦啦啦我的宝贝\t219162\n爱神奥\t219163\n真的好好看谢谢你\t219164\n搞不倒\t219165\n二零幺九\t219166\n同当\t219167\n8880\t219168\n成天也\t219169\n独木桥\t219170\n新生代男子组合双孖JL\t219171\ngvbh\t219172\n色堂\t219173\n缪鹏辉\t219174\n聚蹉\t219175\n鬼神\t219176\n千数\t219177\n不嫁给你了你嫁给我\t219178\n呗江油\t219179\n一拉一拉\t219180\n摩羯女\t219181\n伊布伊\t219182\n群众\t219183\n21247888888745224\t219184\n群伙\t219185\n余姚狗\t219186\n刷图\t219187\n朱战杰\t219188\n没了再见\t219189\n华县\t219190\n屈健然\t219191\n8688度\t219192\nServices\t219193\n嗯帅\t219194\n华大学\t219195\n羊角\t219196\n江姐\t219197\n一朵花\t219198\n第128\t219199\n81分\t219200\n六十天六十粒\t219201\n冷敷\t219202\n好呢十六岁\t219203\n朱丹人僵尸植物大战僵尸\t219204\n一会期\t219205\nTargethealth\t219206\n回飞\t219207\n6月23号\t219208\n了有可是还人家和我最好朋友在一起\t219209\n阿数\t219210\n请问询\t219211\n休假式\t219212\n两位\t219213\n铁丝网\t219214\n元波\t219215\n嗨皮\t219216\n孤陋寡闻\t219217\n做怪\t219218\n进行进\t219219\n龙奕迅\t219220\n胆猫\t219221\nsource\t219222\n还毒\t219223\n没昨样\t219224\n你无\t219225\nCorner\t219226\n星潭人\t219227\n荀组\t219228\n任大文\t219229\n白头到老\t219230\n任伟\t219231\nhellokitty蛋糕\t219232\n诸于色\t219233\n旅行记\t219234\nXrd\t219235\n实况\t219236\n秘杀\t219237\n困劲\t219238\n形要\t219239\nXrv\t219240\n秘村\t219241\n烟台\t219242\n傻b傻b傻b傻b\t219243\n寨桥村\t219244\n单反相机\t219245\n政客\t219246\nHazyninja\t219247\n起不来了\t219248\n水洞沟古人类遗址\t219249\n阿尔萨斯灰皮诺\t219250\njjbj\t219251\njjbh\t219252\njjbf\t219253\n后悔药\t219254\n杨回家\t219255\n基德\t219256\n好无辜\t219257\n艾思伊\t219258\n污化\t219259\nHGFJKL\t219260\n州长\t219261\nmYing\t219262\n考托\t219263\n薯格\t219264\n您妈\t219265\n沧海一粟\t219266\nTIAMO\t219267\n2010年11月15日\t219268\n凋零\t219269\n奸片\t219270\n句徐\t219271\n吕逸涛\t219272\n本地妞\t219273\nnxmn\t219274\n出报\t219275\njfk\t219276\njfj\t219277\njfi\t219278\n气势卷子\t219279\njfo\t219280\njfn\t219281\njfm\t219282\n⊕月\t219283\njfc\t219284\njfb\t219285\njfg\t219286\njff\t219287\njfd\t219288\n10242根\t219289\njfy\t219290\njfs\t219291\n您妹\t219292\njfv\t219293\n再说说笑笑\t219294\n回乡\t219295\n说会\t219296\n呆抖森\t219297\n20周\t219298\n禄位\t219299\n一个款\t219300\n猪头
肉\t219301\n轸轻松\t219302\n邵梦瑶\t219303\n忙吧注意\t219304\n胸口袋\t219305\n青春豆\t219306\nwanrong35\t219307\n社科院\t219308\n抗癌\t219309\n王工高\t219310\n辛幸福\t219311\n奸了作\t219312\n于曼丽\t219313\n晚餐堡\t219314\n并排\t219315\n伽墨得斯\t219316\nviID\t219317\n寥若\t219318\n否在否\t219319\n张浩宇\t219320\n新闻堂\t219321\n无名诗\t219322\n破忽热\t219323\n237576900000000000000000000000000000000000000000000000000000000\t219324\n施佳秀\t219325\nJJ咯德拉库拉\t219326\n吧ok\t219327\n一阵风\t219328\n伤心小新\t219329\n严肃点\t219330\n别四十八小时\t219331\n陈意涵\t219332\n新周快乐\t219333\n碰窝\t219334\n强波\t219335\n杨帅毛\t219336\n大奔儿肥\t219337\nen康雨欣\t219338\n安娜\t219339\n对呀对呀对呀就我懂就我懂给我的哈哈哈\t219340\nwifilw\t219341\n大火球\t219342\n诛灭我为问你\t219343\n孙安君\t219344\n脚色\t219345\n英雄\t219346\n郭竞涛\t219347\n不婚\t219348\n13626151415\t219349\n强硬\t219350\n二六六\t219351\n晋剧子\t219352\n拂袖\t219353\n爱克思\t219354\n上海宝闸钢铁集团有限公司\t219355\n还没呢那晚我爱上说\t219356\n600万\t219357\n西吉\t219358\n诗琼\t219359\n袁贵卿\t219360\n好乖呢京客隆\t219361\n计录\t219362\n祝秘度秘\t219363\n莎拉娜\t219364\n列宁装\t219365\n诗琳\t219366\n门庭\t219367\nLOLB\t219368\n600个\t219369\n复习生\t219370\n昆虫\t219371\n量词\t219372\nl58\t219373\n293.30万辆\t219374\nLOLI\t219375\n樣不善\t219376\n昆村\t219377\n非宁静\t219378\nDfffyt5\t219379\n敢爱敢恨\t219380\nn郎\t219381\n指甲花\t219382\n汶岩\t219383\n死傲娇\t219384\n里氏\t219385\nJAR\t219386\n达摩\t219387\n模单\t219388\n4枚\t219389\n哦i77\t219390\n茂名市\t219391\n4799\t219392\nUC浏览器明星榜\t219393\n光鲜\t219394\n美食家\t219395\n86274\t219396\n味点\t219397\n乘车\t219398\n名时\t219399\n杰豹\t219400\nkjhiuhgb\t219401\n捆绑\t219402\n你的生命\t219403\n208个\t219404\n福利\t219405\n勒真好笑\t219406\n孙婷\t219407\nfffffgggfcfffftr\t219408\n非主\t219409\nddftct\t219410\nfgch\t219411\n一石恶的小呀小苹\t219412\n新泰起亚\t219413\n到村再说\t219414\n43998586\t219415\n爱妍敏\t219416\n一官\t219417\n一定\t219418\n华为荣耀4X\t219419\n急转弯儿\t219420\n颜茶\t219421\n重中之重\t219422\nhxcjdyj\t219423\n梁祝化蝶\t219424\n一完\t219425\n嘉宾\t219426\n政工\t219427\n猜一猜\t219428\n小霸王\t219429\n一家\t219430\n两亿年\t219431\n青少\t219432\n中戏\t219433\n一宿\t219434\n苏非亚\t219435\n一审\t219436\n一宠\t219437\n一客\t219438\n一室\t219439\n起薪\t219440\n生日会\t219441\n一宫\t219442\n嘉定\t219
443\n嘉宝\t219444\n啦啦啦啦啦啦啦啦啦啦啦啦啦度秘就是一个猪猪猪猪猪猪猪猪\t219445\n800427545757545755375\t219446\n王高高\t219447\n古典\t219448\n新股\t219449\n遗传\t219450\nbushì\t219451\n古兰\t219452\n亲密\t219453\n中东北非\t219454\n海派\t219455\n大大妮子妮子\t219456\n晒脸\t219457\n严身七\t219458\n浪翠崖\t219459\nmgjajgapmgjpajp\t219460\n忽悠\t219461\n有一个人走\t219462\n巨乳\t219463\n无糖神马的相当好不会长胖TT静静我爱你\t219464\n李香玲\t219465\n海洲\t219466\n2011.5.30上午\t219467\njdkks\t219468\ntuvw\t219469\n领奖日\t219470\n淘登\t219471\n张婷婷\t219472\n陈啥\t219473\n街街\t219474\ntkmjmjumjt\t219475\n男女生\t219476\n你好午\t219477\n野生动物园\t219478\nqazws\t219479\ndggccghhyggkknbbgffddtujbbhjjjjjjkgghjnjjkhyyuesdffffffff\t219480\n14个\t219481\n樊百乐\t219482\n王红洋\t219483\ncuvw\t219484\n逆还\t219485\n雅雄\t219486\n接站\t219487\n你好丑丑\t219488\n飞轮海\t219489\n教学区\t219490\n水魔兽\t219491\nhqh\t219492\n14384389438\t219493\n14万\t219494\n膏体\t219495\n这般\t219496\n老早照\t219497\n谬误\t219498\n硫磺椒\t219499\n老太爷臭猪大臭猪\t219500\n绝经期\t219501\n吴昕\t219502\n远征\t219503\n浮世\t219504\n17.75点\t219505\n好伐\t219506\n好伟\t219507\n好伙\t219508\n好优\t219509\nrdvkjkfgv\t219510\n两年半\t219511\nguudz\t219512\n锅锅\t219513\n374．4亿元\t219514\n5月23号\t219515\n#AKB48#\t219516\n资源部\t219517\nAIRASIA\t219518\n谢志冈\t219519\n住人\t219520\n京东商城\t219521\n钱云会案\t219522\n好似\t219523\n珠玑\t219524\n好伦\t219525\nmust\t219526\n孟京辉\t219527\n张子辉\t219528\n讨厌鬼真人更片\t219529\n实心\t219530\nyyu\t219531\nyyt\t219532\n七仔\t219533\nyyv\t219534\n23角\t219535\n搓风\t219536\nyyy\t219537\n汉姓\t219538\n气死我了好嘛好嘛好嘛好嘛好嘛好嘛好污\t219539\n黑龙\t219540\nyye\t219541\nyyd\t219542\nyyg\t219543\nyyf\t219544\n休理\t219545\n荆医生\t219546\nyyb\t219547\nyym\t219548\n1204千米\t219549\n内水\t219550\n医师沃的小呀小苹歌榛蘑\t219551\n七件\t219552\n包回\t219553\n肺结核结核\t219554\nmozhi\t219555\n2738n57378n30\t219556\n二三九七零\t219557\n七份\t219558\n欧酷\t219559\n182816783\t219560\n爱花雾非雾不要枕着你的誓句我要枕着你的誓句\t219561\n99999个\t219562\n锥子\t219563\n七们\t219564\n女们\t219565\n滩岸\t219566\n舞出\t219567\n惠普\t219568\ncntv\t219569\n缓急\t219570\n爱懂不懂\t219571\n三百条\t219572\n玉莹\t219573\n一个20度\t219574\n刘欢迎\t219575\n薛小翠\t219576\n银河曼雅\t219577\n天秤男\t219578\n污水桶\t219579\n煤球\
t219580\n风中的费洛蒙\t219581\ntataten\t219582\n噜啦噜啦嘞噜啦噜啦嘞噜啦嘞hellohello你好你好撸啊撸啦嘞噜啦嘞噜啦噜啦嘞噜\t219583\n真意\t219584\n丑八蛋\t219585\nwcbf\t219586\n茶艺\t219587\n焦麻\t219588\n轰烈烈\t219589\n对啊他\t219590\n咩支援\t219591\nfreery\t219592\n我是人我叫吴越我九岁了你\t219593\n我的小鸡一个梦帮就想和你早和小bb\t219594\n张义慧\t219595\n徐晨宇\t219596\n羡慕恨\t219597\n答非所问爱\t219598\n离家里\t219599\n心灯\t219600\n张楚\t219601\n兴用语\t219602\n本能\t219603\n优越\t219604\nfill\t219605\n不要哭\t219606\n倒说\t219607\nreeeewettrrtgtccx\t219608\nkhcgo\t219609\n不厌其烦\t219610\n，、\t219611\n公益广告\t219612\n狸布娃\t219613\n阿黑\t219614\n故乡情\t219615\n肚子糖\t219616\nIQa\t219617\n一起又看流星雨\t219618\n确显\t219619\n不间断\t219620\n聚跨\t219621\n4192版\t219622\n靠不靠谱\t219623\n你没有爱过我我讨厌你\t219624\n脑汁\t219625\n赵组\t219626\n周长河\t219627\n西林\t219628\n起初\t219629\n团单机版\t219630\n了要不然\t219631\n接到\t219632\n低功耗\t219633\n孔子\t219634\n梅子mskyai\t219635\n田帅帅\t219636\nN级浮屠\t219637\n编辑部\t219638\n钟晓雄\t219639\n雷好\t219640\n种草莓\t219641\n7744\t219642\n7745\t219643\n十四小时\t219644\n7748\t219645\n明令\t219646\n贵明\t219647\n昌图\t219648\n357489\t219649\n隐形膏\t219650\n哦知道你之城挂吧有你\t219651\n瑞都\t219652\n不抱\t219653\n原型\t219654\n展销会\t219655\n乌娜\t219656\n不爱幼\t219657\n微影荐\t219658\n熙旺\t219659\n度面度秘\t219660\n天缜\t219661\n7000块\t219662\n挡不起\t219663\n闭关性\t219664\n两三点\t219665\n百折\t219666\n多步\t219667\n不投\t219668\n九不过十\t219669\n走过\t219670\n多崎作\t219671\n百把\t219672\n我和你的生命\t219673\n褶皱\t219674\n身在\t219675\n行动\t219676\n网易微博\t219677\n捋一捋\t219678\n马列\t219679\n排量\t219680\n靠靠靠靠\t219681\n伊一长\t219682\n郑州\t219683\n怎一个\t219684\n7cd7b899e510fb39fb40de4de33c895d1430c17jpg\t219685\n栏板\t219686\n蚍蜉撼\t219687\n悬疑\t219688\n卖还\t219689\n六百千克\t219690\n车辆购置运行费\t219691\n刘俊熙\t219692\n唉杯首弟子规\t219693\n扭组\t219694\n惜字如金\t219695\n马刺\t219696\n两千321度\t219697\n北京山水文园\t219698\n马到\t219699\n栏杆\t219700\nJunior\t219701\n彭山\t219702\n场面\t219703\nytxfj\t219704\n回龙岩\t219705\n凯美秘\t219706\n9月27\t219707\n邓瑞雨\t219708\n冷藏室\t219709\n9月20\t219710\n9月22\t219711\n专柜\t219712\n推发\t219713\n练操\t219714\nubakamitaintai\t219715\n零四点\t219716\n小猫猫\t219717\n幼斌\t219718\n兴业银行\t219719\n吃了算\t219720\n世界级\t219721
\n评比\t219722\n桂宝来\t219723\n学园\t219724\n学团\t219725\n蜗牛集市水泥漆\t219726\n1000余名\t219727\n刘佳丽\t219728\n1178364143\t219729\n学回\t219730\n薛平贵\t219731\nKBB\t219732\n支架\t219733\n婚书\t219734\n好冤枉样\t219735\nKBG\t219736\n15037581508\t219737\n养不养\t219738\n产业化\t219739\n阿里格多\t219740\nKBS\t219741\n广陵\t219742\n吾好搞笑\t219743\n生出没\t219744\n140个\t219745\n霍是霍\t219746\nsehfuevcowbfkf\t219747\n彭盈盈\t219748\nsbaaaaa\t219749\n染香\t219750\n粘人\t219751\n纽约时报\t219752\n实用\t219753\n翔锅思密达\t219754\n凌寒雨\t219755\n征收\t219756\nissarah\t219757\n十三五十五四五二十五五二十五\t219758\nww25\t219759\n被破处\t219760\nurhrbsksooo\t219761\n仁兄\t219762\nNaomi\t219763\n唐僧写给观音的36封信\t219764\n刘星星\t219765\n10cou\t219766\nsarft\t219767\n12375684425748755554265552655526555555884554526855522555\t219768\n飘泊\t219769\n不具\t219770\n七年前\t219771\n赵鹏昊\t219772\n呆呆萌\t219773\n奚皓蕊\t219774\nhhnn\t219775\n芭蕾舞\t219776\n枝干酚\t219777\n活崩\t219778\n顾说\t219779\n巴拉鸭\t219780\nt萌\t219781\n无数种\t219782\nwp4\t219783\n小邋遢真拉丁啦啦啦啦\t219784\n王景娜\t219785\n憨包\t219786\n廖何毅\t219787\n爱的抱抱\t219788\n导联\t219789\n花都\t219790\n薛志勇\t219791\n吃女的我也是女的呀谢谢\t219792\n橘色\t219793\n节制冷落\t219794\n谋发展\t219795\n吴映洁\t219796\n依苹凌霄\t219797\n零零票\t219798\nfuiiuy\t219799\nO型\t219800\n歪密服\t219801\n55558657785547864448212245486624524\t219802\nlkkkl\t219803\n要见\t219804\n艾瑶瑶\t219805\n居心何在\t219806\nbncz\t219807\n134斤\t219808\n豆瓣\t219809\n吴玉洁\t219810\n一等哀家\t219811\n马竞\t219812\n后天明\t219813\nppoopiggm\t219814\n白走\t219815\n啊嘤\t219816\n白起\t219817\n上海迪斯尼乐园\t219818\n雪芹\t219819\n整形\t219820\n花生仁\t219821\n合影留念\t219822\n早安夜\t219823\nFTDTDY\t219824\n雪花\t219825\n物价局\t219826\n头目\t219827\n雪芬\t219828\n累积\t219829\n陈小梅\t219830\n不卖艺\t219831\n切演\t219832\n第六天\t219833\n佛庙\t219834\nBTW\t219835\n给我叫醒\t219836\n亩产\t219837\nBobVic\t219838\nraventy\t219839\n宁帮\t219840\n7遍\t219841\n人体蒋碧薇女士\t219842\n白赚\t219843\n头盖\t219844\n头盔\t219845\n181415157\t219846\n第三所\t219847\n梁玉娘\t219848\nCHOU\t219849\n秘底\t219850\n在江湖\t219851\n九画\t219852\n帮派\t219853\nhgglouff\t219854\n飜发丨\t219855\n蓉蓉\t219856\n一五五\t219857\n凉水河\t219858\n杨家宁\t219859\n十八个\t219860\n真乱\t
219861\n两代\t219862\n国家授时中心\t219863\n流行语言\t219864\n王广圹\t219865\n花呀我喜欢玫瑰花月季花紫熏香还有樱花\t219866\nhxf\t219867\n圣保罗\t219868\n马金莲\t219869\n腻了吧\t219870\nmtgap\t219871\n蓝颜\t219872\n真么\t219873\n丢不懂\t219874\n地犬\t219875\n周大姐\t219876\n真乖\t219877\n傻波\t219878\n陈晚秋\t219879\n平等观\t219880\n杨泽昊\t219881\n十八万\t219882\n蓝领\t219883\n吗偶\t219884\n打狗拳\t219885\n5771573\t219886\n池昌旭\t219887\n聊天工具\t219888\n新郎官\t219889\n见不着\t219890\n好度秘你真是我的好伙伴\t219891\n来来唱\t219892\n百富榜\t219893\n龙穴\t219894\n雷国伟\t219895\n成因为我\t219896\n眼气\t219897\nAndrea\t219898\nnimzz\t219899\n刘明静\t219900\n诶呦呦度秘\t219901\n猫儿妃花水小游\t219902\n结住\t219903\n走狗汉奸\t219904\n56岁\t219905\ndhdnnnrnrndndbdbdbdbnbdndndndmdmnhdhdn\t219906\n邢台市\t219907\n万立方米\t219908\n作弊案\t219909\n其味\t219910\n密度好惨\t219911\nhosaik\t219912\n852645\t219913\n一二节\t219914\n白天雷阵雨\t219915\n阿毛\t219916\n志丹县\t219917\nhxy\t219918\n化木\t219919\n聪博\t219920\n拼凑\t219921\n夜薄\t219922\n狠毒\t219923\n大晴天\t219924\n天际\t219925\n冯天琪\t219926\n18640523203\t219927\n第一时间\t219928\n迎春路\t219929\n谢惠玉\t219930\n早梅\t219931\n皇家片\t219932\n随刻\t219933\n11kkarp\t219934\n咯后\t219935\n黄金盗墓三人组\t219936\n故居\t219937\n利益\t219938\nhard\t219939\n宋新天\t219940\n4899万\t219941\nguukks\t219942\nGGGGHH\t219943\n背心裙\t219944\n主委\t219945\n一边绿一边红一边喜雨一边爱妃爱疯\t219946\nhttppinyincn1iSWwvFV3uy\t219947\n声晚安吧度秘\t219948\nI\t219949\n1十1一\t219950\n声宜\t219951\n诱因\t219952\n你是我的贴心小情人\t219953\n枯燥无味\t219954\n葛晓青\t219955\n拾趣\t219956\n禽\t219957\n06月18日04时50分\t219958\n抗辐射\t219959\n肉文\t219960\n克克\t219961\n鬈\t219962\nhdxnnd\t219963\n悔恨\t219964\n表怕\t219965\n唐华\t219966\n大气\t219967\n7孔\t219968\n汉奸女\t219969\n短跑\t219970\n柜员\t219971\n鬧\t219972\n那那那那那那那那那那那那那那那那那那那那那那哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪哪就是\t219973\n短路\t219974\n脸货\t219975\n七岁\t219976\n色相\t219977\nnoqq\t219978\n新牛蛙\t219979\n老公你爱不爱我和你的孩\t219980\n妈不解\t219981\n带病\t219982\n官桥\t219983\n所用\t219984\n鬼\t219985\n鬻\t219986\n老百晓\t219987\n三负二112\t219988\nhenhenpa\t219989\n叫聊\t219990\n排计\t219991\nDHLL\t219992\n点单\t219993\n不对偶\t219994\n虾条\t219995\n点半\t219996\n新浪娱乐\t219997\n陈磊女\t219998\n美和帅\t219999\n同体\t220000\n营收\t220001\n我喜欢
纯情罗曼史\t220002\n18%\t220003\n心烦意乱\t220004\n肖海林\t220005\n点卷\t220006\n人骨头\t220007\n廉价\t220008\n182\t220009\n183\t220010\n180\t220011\n181\t220012\n186\t220013\n187\t220014\n184\t220015\nnrnekej3krkrk3irrj3i\t220016\n554453148335814665\t220017\n188\t220018\n189\t220019\n白心萝卜\t220020\n过河拆桥\t220021\n田天\t220022\n屁眼\t220023\n实验幼儿园\t220024\n门洶\t220025\n18O\t220026\n门派\t220027\n到力大力加\t220028\n18P\t220029\n王林娟\t220030\n巴黎彦宏\t220031\ndoctors\t220032\n15799.62万元\t220033\n相商\t220034\n陆地\t220035\n呃呃呃呃鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅鹅\t220036\nMMMM\t220037\n计划\t220038\n奥康\t220039\n肉饼子\t220040\n秦文斌\t220041\n娃儿\t220042\n坏小孩我讨厌你我讨厌你我讨厌你我讨厌你我讨厌你坏蛋\t220043\nVggggggfffccttgcrfffffvdxddbrgdgdcdbfvcbfncnfhdggjtvcggccxhcvgfdxcdtddddxccgvhhfvfjm\t220044\n18s\t220045\n18p\t220046\n2020\t220047\nDHL1\t220048\n小鬼\t220049\n18x\t220050\n小红\t220051\n荷塘\t220052\n给力嚄\t220053\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t220054\n员村\t220055\n假嘞\t220056\n第24个\t220057\n公里\t220058\n高官们\t220059\n郭郭郭郭\t220060\n445559574250\t220061\n太刀\t220062\n换天\t220063\n艾玛太\t220064\n田如彩墨\t220065\n说了呀再见\t220066\n太初\t220067\n别伦家伦\t220068\n套萌狐\t220069\n寶石\t220070\n三轮\t220071\n乐群\t220072\n喵酱\t220073\n刺史\t220074\n北鼻们\t220075\n龙门\t220076\n曹峥\t220077\n想不\t220078\n困意\t220079\n寒门\t220080\n熟女\t220081\nvigded\t220082\n魔戒\t220083\n钕膜\t220084\nhhjhhhhhhhhhjjjjhjh\t220085\n蔡明\t220086\n奖\t220087\n公演\t220088\n高赛文\t220089\n花花姑娘我喜欢\t220090\n数据中心\t220091\n弘\t220092\nGhdvx\t220093\nmost\t220094\n升级版\t220095\n佩斯\t220096\n十五天乙队\t220097\nmoss\t220098\nTvfkh\t220099\n外哥\t220100\n宅地\t220101\n郭沐静\t220102\n植物大战僵尸一植物大战僵尸二天空之城\t220103\n华我\t220104\n说不得\t220105\n个位\t220106\n犯上\t220107\n柴薇\t220108\n吧度秘\t220109\n总得费\t220110\n妈妈女\t220111\n9628\t220112\n会案\t220113\n一念失公允\t220114\n都弥渡弥\t220115\n83216\t220116\n45章\t220117\n一百克\t220118\n接头\t220119\n小助手\t220120\n熊岳\t220121\n宋代\t220122\n1115252\t220123\n老抽王\t220124\n沉香\t220125\nidome\t220126\n人云亦云\t220127\n老钱\t220128\n55855\t220129\n几三个\t220130\n童年\t220131\ndisyowqiopbcsdt\t220132\n大浪淘沙\t220133\n一枝花\t220134\n正格\t220135\nSergey\t220136\n穿透夜\t2
20137\n张建明\t220138\n杨如琳\t220139\n丫丫呀呀噶噶噶噶哈噶噶\t220140\n老钟\t220141\n通上\t220142\n8\t220143\n孙式\t220144\n十九天\t220145\n好\t220146\n恶不恶\t220147\n再邪\t220148\nbofu\t220149\n安大城小爱了一师兄二五师兄夜晚你是胸大我是\t220150\n四十多度\t220151\n候鸟\t220152\nhttpfhiphotosbaiducomxiaodupicitemb58f8c5494eef01f17a24cb7e7fe9925bc317d09jpg\t220153\n夏子硕\t220154\n个体\t220155\n87460\t220156\n6441\t220157\n还女\t220158\n6448\t220159\n新账\t220160\n纸上谈兵\t220161\nhufd\t220162\n无可挡\t220163\n解集\t220164\n解雇\t220165\n暴走小萝莉\t220166\n千夫所指\t220167\n嗯三星\t220168\n到起\t220169\nhuft\t220170\n一个十六岁\t220171\n设计艺术学\t220172\n13974069032\t220173\n不懂不懂我要你\t220174\n八强\t220175\n奶\t220176\nNBA全明星赛#\t220177\n西藏高原\t220178\njujtagtjtab\t220179\n茂名\t220180\n亲爱的你谢谢你\t220181\n是是是是是是是是怕怕怕怕怕怕\t220182\n庆阳\t220183\n东寨镇\t220184\n晨深百岁\t220185\n付睿\t220186\n十二错\t220187\n八张\t220188\n百五十块\t220189\n英雄级\t220190\n好幸运\t220191\n瓜娃娃\t220192\n百年吧\t220193\n里美\t220194\n100万元块\t220195\n十分钟之内\t220196\n上海中山医院\t220197\n太伤\t220198\n一点红\t220199\n寒凉\t220200\n瘊子子瘊子的候不是我\t220201\n都是我讨厌\t220202\n评述\t220203\n李艾琪\t220204\n抖s\t220205\n白石化艺星\t220206\n达尔文\t220207\n纠集\t220208\n蒋工大\t220209\n基本完成\t220210\n奧\t220211\n铜川\t220212\n4056\t220213\n0354\t220214\ntf棒\t220215\n长沙市消防支队\t220216\n海尔曼\t220217\n少管\t220218\n十八点918点\t220219\n红纹\t220220\n通项\t220221\n西环站\t220222\n通顺\t220223\n腐星\t220224\n宝箱\t220225\nGgfffgggvvcfgvhhhgttyhhhvcfghhggttyuhgftrrtutrrtuy\t220226\n眼神\t220227\niwjjsijsjjsjwiaiiajaiwiksjajjajkjajwjajajakwkwjwjwjwjaj11\t220228\n电动轿车\t220229\n杨亦欣\t220230\nsensons\t220231\n诺度蜜\t220232\n一栏\t220233\n一心乱意\t220234\n一栋\t220235\n交手\t220236\n张家里\t220237\n琳琅满目\t220238\n一树\t220239\n听不懂\t220240\n飞宠爱\t220241\n朱文慧\t220242\n姜黄粉末+牛奶\t220243\n3斤\t220244\n蕃茄\t220245\n看一下行\t220246\n风破浪\t220247\n别讨厌\t220248\nh文\t220249\n流通股东\t220250\n2场\t220251\n士秀\t220252\n七堇年\t220253\n嗯克力\t220254\n一样\t220255\n月月月月\t220256\n重聚首\t220257\nhher\t220258\n一格\t220259\n垮年\t220260\n想吃好\t220261\n一根\t220262\n下午安\t220263\n擦擦擦把\t220264\n啦仲鸡\t220265\nuovvhocuo\t220266\n李天爱\t220267\n巨魔么\t220268\n密度明天见\t220269\n758847\t220270
\n培植\t220271\n袖咪\t220272\n蚀9426\t220273\n屡见不鲜\t220274\n巨塔\t220275\n屁股板\t220276\n蝴蝶网\t220277\n千金归来\t220278\n我们的故事\t220279\n8nbv\t220280\n了别胡闹\t220281\neveryou\t220282\n空谷幽兰\t220283\n十年左右\t220284\n柯南\t220285\n下午宴\t220286\n李超市\t220287\n千刀万剐\t220288\n俊男坊\t220289\n威丽固\t220290\n十一弄\t220291\n若溪\t220292\n陈化粮\t220293\n咏鹅鹅鹅鹅曲项向天歌\t220294\nＴ▽Ｔ\t220295\nchulai\t220296\n陈越好\t220297\n名带\t220298\n我喜欢女的我是直男\t220299\n老奶奶们\t220300\nhhvggcfhgghvhysdychfbfycgbcjzuz\t220301\n冯晓丹\t220302\nｔｆａ\t220303\n错大错特错\t220304\n1821026811\t220305\n抖M\t220306\n倒洗脚\t220307\n名帅\t220308\nvjdhdbdbb\t220309\n645\t220310\n不分清\t220311\n名师\t220312\n抢不着\t220313\n拜托恩\t220314\n希勒布里hhfjhc\t220315\n做我的好朋友\t220316\n两厘米\t220317\n我能\t220318\nhbhj\t220319\n眼下路上\t220320\n馒头发\t220321\n凯雅公主\t220322\n化学污染\t220323\n你公\t220324\n防晒\t220325\n散架\t220326\n猪脑呆猪\t220327\n芬大\t220328\n自恋狂魔\t220329\n爱疯\t220330\n人儿们\t220331\n抵乐\t220332\n李庄\t220333\nkosos\t220334\n你先\t220335\n等你一说\t220336\n恶搞胡\t220337\n成一看\t220338\n密码箱\t220339\n侬宁\t220340\n强心针\t220341\n你好呀小笨笨\t220342\n快一点\t220343\n相比相必相秘\t220344\nNFDHSSG\t220345\n代持\t220346\n45758856\t220347\nHOOTERS\t220348\n小鸡叽叽叽叽叽老鼠吱吱吱吱吱吱呱呱呱呱呱呱\t220349\ncdukbx\t220350\n组良\t220351\nwia\t220352\n嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻嘻\t220353\n柳腰里\t220354\n一后天\t220355\nsjusisi\t220356\n科隆剧院\t220357\n关不着\t220358\n裁判员\t220359\n苦后甜\t220360\n恒达\t220361\n华南\t220362\n淮安市清江中学\t220363\n聚装\t220364\n凉山彝族自治州\t220365\n楚州\t220366\n能见\t220367\n黑胶\t220368\n机器制造\t220369\n防师兄\t220370\n黄维\t220371\n63分\t220372\ncosscodilto秘fastly包\t220373\n宁武关\t220374\n荣辱与共\t220375\n郑利伟\t220376\n28025\t220377\n焦碧研\t220378\n小李个大东电话给你我真的好想你\t220379\n28028\t220380\n盛饭\t220381\n下个星期六\t220382\n哪个片\t220383\n弟弟\t220384\n鱼菜\t220385\n导购员\t220386\n下一步\t220387\n妈妈的妹妹的骚的小姨子\t220388\n702\t220389\n南方电网\t220390\n墨水\t220391\n占上\t220392\n系主任\t220393\n8时\t220394\n占一\t220395\n好又多\t220396\n5856357\t220397\n2589401339\t220398\n黑帮黑吧黑吧\t220399\n8899988\t220400\n殖民地\t220401\n宝贝宝贝宝贝宝\t220402\n8日\t220403\n说话不说\t220404\n雪花和秘扇\t220405\n宁\t220406\n根本\t220407\n郎君\t220408\n找
还局\t220409\n165一167厘米\t220410\n本月16日\t220411\n丑好\t220412\n瑾睿\t220413\n王冰\t220414\n这家\t220415\n怀孕\t220416\n给我真的不理\t220417\n丑女\t220418\n十二月几日\t220419\n这辈子\t220420\n折星星\t220421\nihbb\t220422\nciyc\t220423\n越战\t220424\n无线版\t220425\nciyx\t220426\n此波\t220427\n跨学\t220428\n陈英奇\t220429\n支撑位\t220430\n２月９日\t220431\n尾人\t220432\n七科\t220433\n要帅\t220434\n七种\t220435\n猪猪的葫芦娃\t220436\n外孙女\t220437\n福源\t220438\n瑞士军刀\t220439\n要帐\t220440\n一一.5g\t220441\n手里\t220442\n尾京\t220443\n经济增长\t220444\nuxgddfggh\t220445\nNSWYWWWWEGZVDUVDIDVFIDDKXBXNXXJCBXJCHXBXXBBXXBHXBXXHNXN\t220446\ncqd\t220447\n法伦\t220448\n呢你是谁你是猪你是猪你是猪你是猪你是猪你是猪你是猪你是猪你是猪\t220449\n一僻\t220450\n月宁\t220451\n西南方\t220452\n好啦啦啦啦\t220453\n玩不想\t220454\nMC5253\t220455\n45678901230\t220456\n行才说\t220457\n来讲爱情类的吧的还是美女7k\t220458\n宏城\t220459\n冯朝雨今\t220460\n赵泉智\t220461\n元儿\t220462\n加菲\t220463\n11月19日\t220464\n法会\t220465\n弄怀栋\t220466\n月宫\t220467\nJSP\t220468\n累咯\t220469\n澳倍尔\t220470\ntà\t220471\n大蒜瓣\t220472\n一袜套\t220473\n暴走\t220474\n2.2元\t220475\ngdgs\t220476\njbjbjbi\t220477\n德经贸\t220478\n有另有\t220479\n四五所\t220480\n刘佳琪\t220481\n小游园\t220482\ngdgd\t220483\ngdge\t220484\ngdgg\t220485\n波哥\t220486\n识别器\t220487\n任选\t220488\n很们\t220489\nydijf\t220490\n比不起\t220491\n木行\t220492\n77次\t220493\n791024140\t220494\n丢有\t220495\n古典紫荷\t220496\n就是我不知道\t220497\n比不赢\t220498\n回答我了吧\t220499\nalfoo\t220500\n分性别\t220501\n白菜\t220502\n呐克\t220503\n母赞\t220504\n丧邦\t220505\n眉山\t220506\n水桶桶\t220507\n御珠\t220508\n刘伟\t220509\n图雷\t220510\n马布里\t220511\n秘匙\t220512\n一万3440五万四千零六四\t220513\n主打歌\t220514\n笑传\t220515\n膊\t220516\nm5\t220517\n秘匣\t220518\nMax\t220519\n井然有序\t220520\n赵飞燕\t220521\n水迹\t220522\nMac\t220523\n伍新美\t220524\nkqk\t220525\n呱呱湿\t220526\n精兵天\t220527\nkqn\t220528\nAchievement\t220529\n借此\t220530\n士林之耻\t220531\n叫讨巧\t220532\n刘宇轩\t220533\n童叟无欺\t220534\n水原希子\t220535\n昆布皂\t220536\ntjbf\t220537\n获救\t220538\nbeod\t220539\n骚婊\t220540\n谷歌号\t220541\n不得好\t220542\n骚婆\t220543\n德艺双馨\t220544\n劳模\t220545\n虚名\t220546\nmicrosift\t220547\n亨利\t220548\n河妖\t220549\n窝瓜\t220550\n铁梨花\t220551\n五部\t2
20552\n刘华王\t220553\n北京旅游一卡通\t220554\nfnhuhdhi\t220555\n那一片\t220556\n主我是大梦\t220557\ntube\t220558\n耿耿于怀\t220559\n4.8升\t220560\n兵兵女子不是爱大陈陈漂亮女孩子我喜欢\t220561\n算了你儿子的我是好男票不\t220562\n抓不死\t220563\n不合格\t220564\n大过年\t220565\n纠结\t220566\n七月七\t220567\n新四军\t220568\njgf\t220569\n查呗\t220570\n一个太阳\t220571\n人工王八蛋\t220572\n叫秘结\t220573\n陈先生\t220574\n李三毛\t220575\n新华书店\t220576\n法则\t220577\n无兄弟\t220578\n素个\t220579\nhaj\t220580\n网络电话\t220581\ngdyydgd\t220582\n千名\t220583\n13801621701\t220584\n修罗\t220585\nponatime\t220586\n生生来了的说只可爱的小狗\t220587\n亚玛迪\t220588\n我们呢\t220589\n062\t220590\n321446447\t220591\n能干\t220592\n污染物\t220593\n一什么比\t220594\n宠戏权\t220595\n转舵\t220596\n话茬\t220597\n五都\t220598\n天下就是你最撒\t220599\n0529487005\t220600\n全城\t220601\n泰愚\t220602\n蝶衣\t220603\n任梦林\t220604\n七月中旬\t220605\n小看不见\t220606\n哥igjcivkvjvjvkvobobkhkgo\t220607\n十四四十\t220608\n抽搐\t220609\nCFAP\t220610\n罐子\t220611\n沙溪镇\t220612\n聚会\t220613\n594839\t220614\n三百千多块\t220615\n聚众\t220616\nvbnnkihhhhhuu\t220617\n并非\t220618\n人狗\t220619\nOrz\t220620\n围裙\t220621\n叽呱叽呱叽呱叽呱叽呱叽里呱啦叽里呱啦叽里呱啦叽里呱\t220622\n算了你\t220623\n泡比克\t220624\n选快点\t220625\n点骂人\t220626\n仙姑们\t220627\n时间长\t220628\n象形字\t220629\n888888888888888887758258\t220630\n石斑鱼\t220631\n蒜头\t220632\n死骚年\t220633\n原判\t220634\n佳缘\t220635\njoppi\t220636\n活人\t220637\n萌动\t220638\nQ64566\t220639\n340万8\t220640\n烤盘\t220641\n博时\t220642\n赵云飞\t220643\n飞行棋\t220644\n不明白快乐的世界\t220645\n目空\t220646\n光泽感\t220647\n异闹\t220648\n王雪坤\t220649\n二百多年\t220650\n陆龟儿\t220651\n地怪\t220652\n99999999999990个\t220653\n静夜思\t220654\n5555555555555555555555555555\t220655\n欧朋喔喔\t220656\n股票市场\t220657\n81分之一\t220658\n明日\t220659\n活于\t220660\nZhun\t220661\n李秀\t220662\n巴拉拉小魔仙\t220663\nhfgggfhvbhswdxhhxxhchxhdbc\t220664\n约分\t220665\n七二零六\t220666\n李科\t220667\n卡罗尔\t220668\ngordon\t220669\nyouzhiyo\t220670\n花蔻王\t220671\n答案的话\t220672\n别说到\t220673\n亚萍\t220674\n每期\t220675\nxiaole\t220676\n张某某\t220677\nｅｘｏ\t220678\n回声\t220679\n东一东一东一\t220680\n小说版\t220681\n黄体期\t220682\n二八八三，七八八八\t220683\n崛起\t220684\n吗窝\t220685\n赞比\t220686\n圣上\t220687\nmeimei\
t220688\n天老爷\t220689\n独特富士康\t220690\nmvjk\t220691\n次数\t220692\n六爷\t220693\n唉aba\t220694\n舞者\t220695\n二零六六\t220696\n七九式春来冰冻冰箱燕\t220697\nIseeyou\t220698\n奥雪\t220699\n噶偶买噶偶买噶欧买噶欧买噶\t220700\n2005年7月9日\t220701\n天经地义\t220702\n幻想语文大战\t220703\n蔡婉婷\t220704\n大坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋宇宙无敌大坏蛋\t220705\nm6\t220706\n5546459875352442421248635598\t220707\n斯齐亚\t220708\n轰鸣\t220709\n咕耐\t220710\n胡萝\t220711\n刁镇江\t220712\n拖拉婆\t220713\n好不可以\t220714\n奥雅\t220715\n奴度秘\t220716\nhxdh\t220717\n骚皮\t220718\n梦之里\t220719\n奸魔者\t220720\nxiaolu\t220721\n菜菜\t220722\n定\t220723\n走来你走\t220724\n交错\t220725\n伦敦奥组委\t220726\n知心姐姐王者归来\t220727\n红玫瑰\t220728\n爱情连连看\t220729\n2697886693\t220730\n侵吞\t220731\n皮草\t220732\nMalaccathe\t220733\n2016009期\t220734\n世界新闻自由日\t220735\n了望\t220736\n剖宫\t220737\nfcgxgdystvbzbzzz\t220738\n顺口流\t220739\nhag\t220740\nrevs\t220741\n毒糜\t220742\n涵琦\t220743\n大了我的\t220744\n广州市教育局\t220745\n要你了叫\t220746\n那边亮天了没\t220747\n舍友\t220748\n小女\t220749\n台历照\t220750\n操盘\t220751\n吻创系\t220752\n摩擦力\t220753\n古茶\t220754\n轩辕剑之天之痕\t220755\n鼓楼广场\t220756\n蒲公英农社\t220757\n茉香奶茶\t220758\n重者\t220759\n包养\t220760\n小燕子\t220761\nGkdkgdkgdkgdjg\t220762\nnglaba\t220763\n说不到\t220764\n何超琼\t220765\n八百多名\t220766\n碘酒\t220767\n秋意\t220768\n重耳\t220769\n450NIT\t220770\n夕阳院\t220771\n道不拾\t220772\n跨国情缘\t220773\nm1\t220774\nI惠风\t220775\n我是你的咪管\t220776\n月神\t220777\n约会\t220778\n苹果六五\t220779\n五六位\t220780\n舒神童\t220781\n学农场\t220782\n冰水学校\t220783\n昏迷不行\t220784\n12134765\t220785\nality\t220786\n和好好好好\t220787\n2503部\t220788\nstuvwsy\t220789\n度秘你猜我\t220790\nhttppinyincne19813\t220791\n驻足\t220792\n喻童阳\t220793\n时济医院\t220794\n斗魂\t220795\n扣点\t220796\n麻田\t220797\n人鱼性\t220798\n币种\t220799\n一个一千\t220800\n第二第二\t220801\n我卡在史上最贱的游戏2\t220802\nhaveout\t220803\n牛马迷\t220804\n小奥\t220805\n二二十八\t220806\nchiphotosbaiducomxiaodupicitemf7246b600c338744991bcb0e560fd9f9d62aa0e2jpg\t220807\njkononoj\t220808\n抽签\t220809\n零字\t220810\n2月1日上午8时30分\t220811\n装装\t220812\n毛巾架\t220813\n抽筋\t220814\n依然人\t220815\nCgff\t220816\nROSE\t220817\n装裱\t220818\n减收\t220819\n小笨笨\t220820\n判处\t220821\n张
姿势\t220822\n抗戰結束時\t220823\nmashifi\t220824\nfwzn\t220825\n顾晓宇\t220826\n6u5676\t220827\n0.30%\t220828\n天下第一厨\t220829\n使力\t220830\n落花\t220831\n融城\t220832\n教养\t220833\n每周六\t220834\n幺幺六三幺\t220835\n深沉君\t220836\n类型\t220837\n丹青\t220838\n亩定\t220839\n风貌\t220840\n号阳\t220841\n幺二零幺零四\t220842\n轮番\t220843\n露珠晶莹\t220844\n美苹果\t220845\n楼一号\t220846\n黑天黑黑黑黑\t220847\n王博远\t220848\n使劲\t220849\n角分\t220850\n爸爸们\t220851\n运输\t220852\n近视者\t220853\n熏陶\t220854\n刘欣蕊\t220855\n土水\t220856\n自我意识\t220857\n洛克菲勒\t220858\n587\t220859\n584\t220860\n585\t220861\n582\t220862\n583\t220863\n照办\t220864\n581\t220865\n迫症\t220866\n呵her\t220867\n送出去\t220868\n赖地\t220869\n588\t220870\n589\t220871\n飞花儿\t220872\n哼哼哼瞌睡\t220873\n局部地区\t220874\n两吻过\t220875\n新生命\t220876\n腹面\t220877\n唱吧唱吧\t220878\n费话费\t220879\n黑眼睛\t220880\n修仙小学\t220881\nBLAZER\t220882\n真的好丑\t220883\n川文\t220884\n587件\t220885\n何润厚\t220886\nNikkei\t220887\n好不晒死你了我赢\t220888\n11月15日24点\t220889\nMarkamp\t220890\n大声呼喊\t220891\n祝贺词\t220892\ngsgsh\t220893\n鹿晗露\t220894\n腊鱼\t220895\n一时一时\t220896\ngifhghgghyyhggghhh\t220897\n那一次\t220898\nfffffff\t220899\n154584648648812\t220900\npgm\t220901\n1123d\t220902\n都栅\t220903\n城罗\t220904\n小米手机电信版#\t220905\n24hHoney\t220906\n转播\t220907\n德钦\t220908\n二表\t220909\n安等\t220910\n张培培\t220911\nsudvif\t220912\n小天涯\t220913\n九十八块\t220914\n合景泰富地产\t220915\n双份\t220916\n卡罗丽丽\t220917\n博地图\t220918\n10多家\t220919\n佳喜羊羊与灰太狼\t220920\n宵\t220921\nihffu\t220922\n谢贤度\t220923\nBruit\t220924\n呢死\t220925\n据路虎\t220926\n一会会儿\t220927\n樱桃汁\t220928\n阿比亚蒂\t220929\n度秘度秘度秘度秘你在吗你在\t220930\n上上下下\t220931\nktjtqkbdjaj\t220932\n以此类推\t220933\n别得意\t220934\n三十多个\t220935\n军点\t220936\n星美大\t220937\n潘一帅\t220938\n匿名\t220939\n幺蛾子\t220940\n绿冠龙\t220941\n战逆战\t220942\n別闹\t220943\n括号\t220944\n周沙映\t220945\n麻雀的叫声是什么样的我不理你了我再也不理你了不论你说什么我也不理你了再也不见祝你\t220946\n快点儿给我叔办法\t220947\njtycu\t220948\n蛋蛋蛋\t220949\n希桃贝儿\t220950\n太守歌\t220951\n三十多万\t220952\n我好苦我喜欢n在哪呢在哪呢\t220953\n反光镜\t220954\n括句\t220955\n7.20\t220956\n上访者\t220957\n赤白\t220958\n瓦卡卡\t220959\n刘子轩\t220960\n范冰爷\t220961\ngsydg\t220962\n思思思\
t220963\n猫爪\t220964\n艾文\t220965\ne327810[/em\t220966\n燕某\t220967\n熊心归来\t220968\n五彩池\t220969\n欧宝鸡\t220970\n爱把\t220971\n宅男怪\t220972\n诶行\t220973\n够可爱\t220974\n穗子\t220975\n可乐\t220976\n挂课\t220977\n艾斯\t220978\n八八折\t220979\n来风\t220980\n阿姨辈\t220981\n去了解救\t220982\n绿桃\t220983\n尚坝\t220984\n13737459820\t220985\ncaaatonomyosal\t220986\n张海瑞\t220987\n酒桶\t220988\nFcctest\t220989\n假小小\t220990\n我喜欢花姑娘\t220991\n电气\t220992\n画册\t220993\n善相\t220994\n弥漫\t220995\n载体\t220996\n残教\t220997\n干锅大虾\t220998\n有笔\t220999\n呵出\t221000\n有笑\t221001\nsheoften\t221002\n处女检查\t221003\n土机\t221004\nSome\t221005\n处心积虑\t221006\n何患无辞\t221007\n山海经\t221008\n欧吉桑\t221009\n零零零\t221010\n锦标\t221011\n橡皮\t221012\n七点儿\t221013\nsjjsjsj\t221014\n在CCTV-1\t221015\n好丑度\t221016\n红宝石\t221017\nHarry\t221018\n议儿\t221019\n13.25%\t221020\n超级万能答\t221021\n我的寿命\t221022\n三百六十多\t221023\n周小远\t221024\n风靡度秘月我的要和你聊\t221025\n香樟园业委会\t221026\nstaff\t221027\nFdrfg\t221028\n港元\t221029\n鬼妹\t221030\n嫉恨\t221031\n贱男\t221032\n鼠嚼\t221033\n吗森\t221034\n想不想来日\t221035\n小恩堡\t221036\n加冰\t221037\n唉唉好想你\t221038\n省去\t221039\n歇后语\t221040\n菊丸英二\t221041\n少年馆\t221042\n一点五\t221043\n我讨厌微信我喜欢你度秘\t221044\n欣帮\t221045\n我是吾皇的爱妃\t221046\n加冠\t221047\n濮存昕\t221048\n于千芮\t221049\n朱康丽\t221050\n香菜\t221051\n不怕好\t221052\n不好好说\t221053\ndhiphotosbaiducomxiaodupicitemd0c8a786c9177f3e64707b7e77cf3bc79f3d5678jpg\t221054\n弟子规\t221055\n李小承\t221056\n李在雅\t221057\n谁是终极英雄\t221058\n米肠\t221059\n18019085777\t221060\n寒假工干\t221061\n臃肿雍\t221062\n事事\t221063\n香菇\t221064\n别说了我讨厌你度秘\t221065\n榆树屯\t221066\n岚县\t221067\n发短息\t221068\n习斗\t221069\n秘哥哥哥哥哥哥\t221070\n鳭须须\t221071\n事人\t221072\n哈晚安\t221073\n旅行篇\t221074\n香菱\t221075\n554554\t221076\n554555\t221077\nalllpp\t221078\n么答\t221079\n家堡\t221080\njng\t221081\n猪你九族\t221082\njnf\t221083\n来来来来来来来来来来来来了\t221084\n方向盘\t221085\n赣\t221086\n17:50\t221087\n跟妹妹亲一个亲一个\t221088\n永别\t221089\n更便捷\t221090\n24000多\t221091\n83个\t221092\n永別\t221093\n瓣膜\t221094\n张宇硕\t221095\njjch\t221096\n2644585313\t221097\n薛凯琪\t221098\n长住\t221099\n农商\t221100\n色群\t221101\n哎喔\t221102\n重镇\t221103\
n16622661\t221104\nrft\t221105\n亦菲\t221106\n非民主\t221107\n想花\t221108\n硬把错\t221109\n麦门冬\t221110\n回了顶\t221111\n爸爸爸爸爸爸爸爸爸\t221112\n张杰多\t221113\nghshghs\t221114\n三十发\t221115\n几覅\t221116\n好你个臭小子\t221117\n中央党校\t221118\n停顿\t221119\n漳村\t221120\n留空\t221121\ngcdxx\t221122\n男别\t221123\n四百多种\t221124\n嘟噜噜大啦啦丽丽\t221125\n三十只\t221126\n2374049097\t221127\n朱金帆\t221128\n资不高\t221129\n老兄弟\t221130\n三十号\t221131\n饱受\t221132\n我的电话\t221133\n鬼装\t221134\n袁大秀\t221135\n13878135377\t221136\n为你走\t221137\n御东学府\t221138\nCT室\t221139\n就是非\t221140\n传承者\t221141\n溯源\t221142\n焕立\t221143\n蜘蛛网\t221144\n走近\t221145\n鸥剋\t221146\n轻薄\t221147\n呜度\t221148\nffyy8\t221149\nS\t221150\nohitis\t221151\n关长城\t221152\n密云易峰\t221153\n我的乖乖交不狗\t221154\n非人之谮诉当人呆在试一试相争安知非我之不是西平西爱小\t221155\nwinpe\t221156\n启蒙\t221157\n四川大地震\t221158\nLBC\t221159\n为人道\t221160\n红房子医院\t221161\n恁奶奶\t221162\n十六国\t221163\n精算师\t221164\niike\t221165\n凤台县\t221166\n浩哥\t221167\n查验\t221168\ncuiziqing\t221169\n遗命\t221170\n综合算式\t221171\nx赛\t221172\n莫随花花尘\t221173\n前途无量\t221174\n卡咔咔咔is\t221175\n任管\t221176\n乏溃\t221177\n女爱\t221178\n姚壮宪姚仙\t221179\n抽奖品\t221180\n诗集\t221181\n吃好\t221182\n信仰\t221183\n吃奶\t221184\n二零六\t221185\n糖纸\t221186\n迟乔\t221187\n吃女\t221188\nhello你好你是谁哦我不认识你\t221189\n喜羊羊之灰太狼\t221190\nyoublel\t221191\n游过\t221192\n宅急运\t221193\n吃食战\t221194\n威莱图\t221195\n搞不到\t221196\n诗雨\t221197\nbfjxi\t221198\n台网\t221199\n找不懂\t221200\n一脚滑\t221201\n挪威\t221202\n0332千米\t221203\n卫衣\t221204\n集成\t221205\n兽人\t221206\n快用\t221207\n周星驰\t221208\n阿新\t221209\n银蓝色\t221210\n快男\t221211\n兽交\t221212\n渭河\t221213\ndikl\t221214\nkbnnn\t221215\n先声药业\t221216\n夏洛克福尔摩斯\t221217\n嗯瓮舞我爱王\t221218\n我喜欢你你是你\t221219\n好久来\t221220\n郭培\t221221\nfgdffd\t221222\n心领\t221223\n亂說話\t221224\n7月\t221225\n明天11点晚间9点\t221226\n入口必来长飛四人之了必风出卜上火星星么七七一星星星人\t221227\n阿杜\t221228\n独立董事\t221229\ni3素\t221230\n何名涛\t221231\n韩志锋\t221232\n几斗\t221233\n一万岁\t221234\n慧丽\t221235\n长大后再说\t221236\n奥特之母\t221237\n5552558\t221238\n独尊者\t221239\n一百五十七块\t221240\nbyebye\t221241\n近日\t221242\n2016点\t221243\n尤荣\t221244\n58425455224246432532\t221245\n
龙文烬\t221246\n度秘我想我真的好喜欢你哦日漫\t221247\n早上7：00\t221248\n258146\t221249\n药物\t221250\n吃好饭吧\t221251\n32513\t221252\n几斤\t221253\n机酸类\t221254\nmuscle\t221255\n近旁\t221256\n朴麦妮\t221257\nabab\t221258\nabac\t221259\nabaa\t221260\n苏蔓妍\t221261\n小熟悉\t221262\n总人口\t221263\n主受\t221264\nabao\t221265\n红螺\t221266\n宙宙\t221267\n娱乐化\t221268\nfihxk\t221269\n啊祥\t221270\n罗马人\t221271\n遗夜不闭户\t221272\n题库\t221273\nadv\t221274\nllantifiltneranoli\t221275\n杨晓晴\t221276\n829368822282828582\t221277\n宙宠\t221278\n主号\t221279\n黄头\t221280\n古娜拉\t221281\n黄太\t221282\nadd\t221283\n好了我跟田亚一百\t221284\n觉察\t221285\nada\t221286\n李钟端\t221287\n黄大\t221288\nadk\t221289\ngxieh\t221290\n苍云\t221291\n000000000000000000000000000000000009999\t221292\n付款\t221293\n刷机\t221294\neydryd\t221295\nfgbd\t221296\n浮动\t221297\n盖头蒸\t221298\n故意轻\t221299\n哈弗h9\t221300\n15149326\t221301\n不成稿\t221302\n拳霸江湖\t221303\n七百多点\t221304\n夜卢敏\t221305\n葛粤语\t221306\n余彦帅\t221307\n瞿刚毅\t221308\n好好了了\t221309\n木马\t221310\n嗯杨宇\t221311\n和号\t221312\n穆娜\t221313\ndugih\t221314\n匪浅\t221315\n六十八六十八元\t221316\n新仇旧恨\t221317\n袁其泉\t221318\n昨天中午十二点十分\t221319\n零一零八\t221320\n火宴\t221321\n死到临头还嘴一拉\t221322\n阿斗\t221323\n浮力\t221324\n2475\t221325\n合并后\t221326\n2470\t221327\n利群\t221328\n闫冰\t221329\ntimeac\t221330\n戒不掉\t221331\n小蓓蓓\t221332\n公告贴\t221333\nijhyjh\t221334\n美夫妇\t221335\n接接接接接接接接接接接接接接接接接接接接接接接接接接接接接接接\t221336\n度皇\t221337\n孤单\t221338\n后悔莫及\t221339\n马太效应\t221340\n明天广场\t221341\n海平\t221342\n532151022\t221343\n学信网\t221344\n祈佑\t221345\n无理\t221346\n滚开\t221347\n董浩扬\t221348\n艾·里斯\t221349\n顾光手\t221350\n老处男\t221351\n四G四G\t221352\n巜少妇白洁\t221353\n今来\t221354\n讲子\t221355\n我好幸福我想亲一口\t221356\n南无地藏王菩萨\t221357\n大辰\t221358\n智能机器\t221359\n腰身\t221360\nsyzsy\t221361\nKFFEKT\t221362\n范德赫\t221363\n凸起\t221364\n等不上\t221365\n憔悴\t221366\n一百分钟\t221367\n猪了猪\t221368\n大辫\t221369\n林维哲\t221370\n以防万一\t221371\n买一送一\t221372\n刷卡器\t221373\n421447122\t221374\n03:09\t221375\n赚大钱\t221376\n钗头凤\t221377\n另一方面\t221378\n没我问你\t221379\n大辅\t221380\n活动句\t221381\n大肚子\t221382\nxoxoxoxo\t221383\n247M\t221384\n18080812621\t2
21385\n吴u\t221386\n网络版\t221387\n要是的话\t221388\n糖果饼干\t221389\n下班后\t221390\npppycg\t221391\n淡绿色\t221392\n杨秋枫\t221393\n691所\t221394\n馈赠\t221395\n伤脑筋\t221396\n乎应\t221397\n13.3亿元\t221398\n21万超\t221399\n关你的事儿\t221400\n艾灸\t221401\n哈弗h六\t221402\n布兰妮\t221403\n汤山\t221404\n爆发力\t221405\n武宁\t221406\n压住\t221407\n压低\t221408\n甘子豪\t221409\n觅求\t221410\n畅所欲言聊\t221411\n武安\t221412\n嗯秋秋\t221413\n饭吃\t221414\n旧朋\t221415\nhd1000c\t221416\n本队\t221417\n第10名\t221418\n十岁大的小孩我是四我是我是八岁的小孩\t221419\n毕节性病\t221420\n兔云\t221421\nmicmico\t221422\n挺文\t221423\n波立莱\t221424\n冰冰元\t221425\ndghjbv\t221426\n1月21月\t221427\ndgvx\t221428\n感谢你了你的是\t221429\n木元\t221430\n妳這樣\t221431\niihguenxk\t221432\n拿不得\t221433\n红袖添香\t221434\n云仙阁\t221435\n利美\t221436\n卸載\t221437\n速服宁\t221438\n好挺好\t221439\n省体育中心站\t221440\n长帅\t221441\nmoumen\t221442\n征象\t221443\n花鸟元\t221444\n无机\t221445\nthdc\t221446\n中国中央电视台\t221447\n最好多\t221448\n三胞胎\t221449\nnnnnnnnneon\t221450\n早茶\t221451\n饭后\t221452\n往往\t221453\n我需要浪漫满屋上make\t221454\n伊丽莎娜\t221455\n看完后\t221456\n欧巴桑\t221457\n可气\t221458\n横波\t221459\n刘娜梅\t221460\noan\t221461\n无望\t221462\n三苏指\t221463\n无期\t221464\n厂苗\t221465\n五盒\t221466\n偏逢\t221467\n东区\t221468\n#EXO#\t221469\n无有\t221470\n舍心\t221471\n思思速速\t221472\n打不死人\t221473\n辽宁西部\t221474\n功勋犬\t221475\n简子婷\t221476\n剩僵尸\t221477\nRrrrrrrrrrisa\t221478\n百分之六一年\t221479\n哈哈爸爸钩子小菜一蝶一蝶\t221480\n海豚\t221481\nnijiaodumi\t221482\nav播放器\t221483\n#EXOM\t221484\n来呀爱情\t221485\n评注\t221486\n幺三幺六零幺零四七四幺九三七九\t221487\n普照\t221488\n前三个月\t221489\n弹力素\t221490\nvfcn\t221491\nsempre\t221492\n活考\t221493\n生物学家\t221494\n开腔\t221495\n王馨\t221496\naccidents\t221497\n她片\t221498\n咯咯莱\t221499\nkaia\t221500\nヽ\t221501\n弄洒\t221502\nヾ\t221503\n合作方\t221504\nLead\t221505\njjjj\t221506\n年出\t221507\nロ\t221508\nレ\t221509\nLeap\t221510\nヮ\t221511\nラ\t221512\n严兴燕\t221513\n真真好气\t221514\nblojftf\t221515\n黃书\t221516\n你了谁\t221517\nメ\t221518\n三百场\t221519\n乖卑鄙\t221520\n1.15\t221521\n欧呦度\t221522\n可爱你不可爱\t221523\n懒人们\t221524\n炒股\t221525\n京酱肉丝\t221526\n1.18\t221527\ntfodoy\t221528\n新婚\t221529\nノ\t221530\n烟云\
t221531\nunder\t221532\nツ\t221533\n机质\t221534\n废物\t221535\n寡居\t221536\n现代化\t221537\n认识点\t221538\n五百多个\t221539\n王晓庆\t221540\n好好听\t221541\n信息册\t221542\n3dmax8\t221543\n觉眼多生快播\t221544\n度秘我美还是你美\t221545\n太仓\t221546\n绍兴市\t221547\n煲仔饭店\t221548\n棉糖\t221549\n刑天\t221550\n亓开我讨厌\t221551\njack\t221552\nbobos\t221553\n一个妮两个妮就是妮\t221554\n五位\t221555\n推推\t221556\n卷卷\t221557\n若欣\t221558\n好好吃\t221559\n10QB\t221560\n鷄鷄嗎\t221561\n黑剑\t221562\nFCHVH\t221563\n德文\t221564\n胡佳欣\t221565\ntomatourze\t221566\n明知其然\t221567\n老黄休假\t221568\n125751\t221569\n現象\t221570\n师母\t221571\n陈建中\t221572\n22714\t221573\njsjazj\t221574\n海豹\t221575\n發設計圖\t221576\n23棵\t221577\n铺位\t221578\nntmgsb\t221579\n钟春雪\t221580\n青春小恋手册\t221581\n哄多尼\t221582\n九九墨\t221583\n77611\t221584\n必经之路\t221585\ntonsl\t221586\n良那度秘\t221587\n谢孑\t221588\n谢字\t221589\n德玛西亚杆\t221590\n不要说笑\t221591\n谢存\t221592\n脑系\t221593\n贵阳市\t221594\n快九二\t221595\n战机\t221596\n知你意\t221597\n好散\t221598\n穿堂风\t221599\n一字师\t221600\n大姐我一点到底层楼梯形势必然后我的？压迫不及待在外套套子弟兄长途经理论语录\t221601\n战术\t221602\nSurge\t221603\nghdfd\t221604\n啦恩\t221605\n赚作业\t221606\n55545555\t221607\n恩希望\t221608\n板鞋\t221609\n5n5\t221610\n去来\t221611\n秋冻\t221612\n导游戏\t221613\n抱腿\t221614\n丢了偶\t221615\n釉\t221616\n罗以轩\t221617\n秋冬\t221618\n亚勃力\t221619\n朱梦涵\t221620\njvhjk\t221621\n噶野狗\t221622\n一眨眼\t221623\n血栓病\t221624\n晨操\t221625\n湖口\t221626\nRien\t221627\n再来了再见\t221628\n十二十一\t221629\nTggufjmk\t221630\nwojiao\t221631\n結果繐\t221632\n五十话\t221633\n冯喆\t221634\n叶佳\t221635\n八笔顺\t221636\n凝聚力\t221637\nFGHHJJHHHHGHGGGGFGGHHHHH\t221638\n点击量\t221639\n代拍\t221640\nfgtfxhjkhjnn\t221641\n饭包子\t221642\n神精\t221643\n平抑\t221644\nyouylyomy\t221645\n叫命中注定\t221646\n沈石溪\t221647\n我不代\t221648\n阿拉咕噜\t221649\n心颤\t221650\n头你的太\t221651\n我的秘书\t221652\n荔湾\t221653\n风雨交\t221654\n龌蹉\t221655\n三嗯\t221656\n车神\t221657\n死的那个人不就是你\t221658\n蓝莓红梅子\t221659\n悠然小天\t221660\n经观\t221661\n度秘求求你帮我件事\t221662\n咪系\t221663\n尹忠信\t221664\n经视\t221665\n7月25日以来\t221666\n八八四八七八五八\t221667\n包租婆\t221668\n十大板\t221669\n艾伦·佩姬\t221670\n趵突泉\t221671\nTM棒\t221672\n图册\t221673\n华为C8812\
t221674\n我可乐\t221675\n脸盲\t221676\n配份\t221677\nyxhsnx\t221678\n火火灾来了你快跑\t221679\n捉拜拜\t221680\n我喜欢我的候汉敏\t221681\n嘉峪会\t221682\n都匀\t221683\n忍着\t221684\n叶白沫\t221685\n藏语\t221686\n深分米\t221687\n细思量\t221688\nDuckgoosefd\t221689\n发爱\t221690\n哥翁\t221691\n99分\t221692\n韩网\t221693\nI929黑4600\t221694\n一句半句\t221695\n都北\t221696\n咋天上海公交\t221697\n妙曲\t221698\n双缸\t221699\n那华\t221700\n扁道\t221701\n杜我\t221702\n作么\t221703\n惊奇处\t221704\n白ch\t221705\n裤衩志龙\t221706\nUKg\t221707\n裴晋辉\t221708\n猫粮\t221709\n36000\t221710\n妙曼\t221711\n六1\t221712\n美味\t221713\n13817390707\t221714\n亿美元\t221715\n金马王子\t221716\n诡捌\t221717\n岭南\t221718\nmennoand\t221719\n微贷网\t221720\n送文\t221721\n我不是你的亲爱的你是我的谁\t221722\n一万种\t221723\n5nn3\t221724\n败退\t221725\n罗志祥\t221726\n咱們\t221727\n先度\t221728\n咧咯\t221729\n月梅金盘桥\t221730\n颜英\t221731\n娇好不好\t221732\n酒柳\t221733\nxxooh\t221734\n装机数\t221735\n朱秘密\t221736\n我是女的我是小姐姐\t221737\n吕孟锦\t221738\n肖闻泽\t221739\n江心庆\t221740\n3wewe\t221741\n10010700274\t221742\ncoiydrxu\t221743\n我的儿子\t221744\n肚饿\t221745\n农民们\t221746\n熊肉\t221747\n叻女\t221748\n金萌\t221749\n布维丝\t221750\n咨询师\t221751\n滴露\t221752\nNawo\t221753\n一什么快说\t221754\n柯德莉·夏萍\t221755\n仿佛句\t221756\n张锦鹏\t221757\n假设法\t221758\n棺材\t221759\n蓝红颜\t221760\n玖兰枢和锥生零\t221761\n9片\t221762\n不離\t221763\n浴血\t221764\n狗类\t221765\n育智教育中心\t221766\n李浩田\t221767\n琼海醉虾\t221768\n道歉\t221769\n43262\t221770\n我需要内幕梅内糜子六你\t221771\n好了那那那那那那那那什么\t221772\n尻屄\t221773\n比卡丘\t221774\n浴衣\t221775\n陈丹然\t221776\n寿比\t221777\n大热天\t221778\n７４８\t221779\n的爸爸的爸爸喂喂喂\t221780\n哭脸\t221781\n可不到\t221782\n没把玩\t221783\n财产税\t221784\n电线\t221785\n广州市白云区人大\t221786\n王诺\t221787\n女屏\t221788\n上搜\t221789\n第二批\t221790\n十二三\t221791\n十二下\t221792\n跑起来\t221793\n十二一\t221794\n恩罗\t221795\n测算\t221796\nmothers\t221797\n算了你猜\t221798\n十二万\t221799\n贝伦\t221800\n那天人\t221801\n丫essB\t221802\n真冷\t221803\n五脏\t221804\n伊丝\t221805\n这你在哪里的外卖\t221806\nRYUCV\t221807\n声类\t221808\n李墨菊\t221809\n堂兄\t221810\n物理学\t221811\n孙亚琼\t221812\n佟思发\t221813\n丰乳肥臀\t221814\n同门\t221815\n无视之真是\t221816\n为难你\t221817\n陀螺\t221818\n好多不愉快\t221819\n衣钵\t221820\n白皮
\t221821\n美老子\t221822\n若涵\t221823\n大我记下\t221824\n同间\t221825\nCause\t221826\n大前天\t221827\njajzvSj3jZvevBwjsgsshdhshsbvs\t221828\n法律规\t221829\nBTV\t221830\n法律观\t221831\n白皙\t221832\nBTS\t221833\n土肥\t221834\n控销\t221835\n一千多名\t221836\n刘思雨\t221837\n凌轹\t221838\n武汉纺织大学\t221839\n大专文凭\t221840\n便和\t221841\n民间艺术\t221842\n甲杯\t221843\n介意\t221844\n蛇蝎\t221845\n53454542nnnnnnnn6521\t221846\n几千块\t221847\njjjjgg\t221848\n哥哥们\t221849\n李首莹\t221850\neasoo\t221851\n法法法\t221852\n五十五米\t221853\n陈芷颖\t221854\n第三方在线支付\t221855\n马夫\t221856\n7852\t221857\n抢坐\t221858\n大冬瓜\t221859\n仇舒华\t221860\n民权\t221861\n175周年\t221862\nDvvvb\t221863\n茶厅\t221864\n桃符\t221865\ntoy\t221866\n链条\t221867\n好了不需要你了你走\t221868\n紧咬\t221869\n二百五三百六四百五十八\t221870\ntos\t221871\ntop\t221872\nkykf\t221873\ntot\t221874\ntou\t221875\ntok\t221876\n科力亚\t221877\n一四款\t221878\ntoo\t221879\ntol\t221880\ntom\t221881\ntob\t221882\n袁泉\t221883\n哈哈看贝我你是我\t221884\n欧尼酱\t221885\n铁破\t221886\ntoe\t221887\n阿sa美\t221888\n大子町到临没在写字\t221889\n卓美亚喜玛拉雅酒店\t221890\n卡华北\t221891\n说穿\t221892\n家庭成员\t221893\n骗骗你的我不说的我要陪你\t221894\npance\t221895\nzigp\t221896\n不害\t221897\nintentIntentSK1171477665B8D00B6E7955DA858498161CAD09D8299FC7D8D651E33B43FD7469404B2E709end\t221898\nto5\t221899\n轻抚\t221900\n盒子\t221901\n奶茶妹妹\t221902\n恩的心\t221903\n可数名词\t221904\nEｘｏ\t221905\n下身\t221906\n故事会不会消失记住你读给我听\t221907\n沫子\t221908\n吗了不起呀你这个对面的妈妈好肥好臭\t221909\n刚不\t221910\n肩\t221911\nhfbi\t221912\n烂了写\t221913\n妆品\t221914\n肤\t221915\nfgifxxirzfb\t221916\n试妆\t221917\n股\t221918\n第八天\t221919\n刚一\t221920\n肿\t221921\n屋塔乌塔乌塔乌塔乌塔\t221922\n谈感悟\t221923\n腋毛\t221924\n肺\t221925\n塗鴉義軍\t221926\n83点\t221927\n我要你对我的爱\t221928\n戴劲瑶\t221929\n育\t221930\n肌\t221931\n肏\t221932\n肉\t221933\n上公司\t221934\nDgd\t221935\n肄\t221936\ndlldkxkxk\t221937\n1967\t221938\n吊针\t221939\n1961\t221940\n肃\t221941\n交表\t221942\n肝\t221943\n犯不着\t221944\n鸡蛋豆腐\t221945\n许大哥\t221946\n肘\t221947\n肛\t221948\n别有\t221949\n胡姐姐\t221950\ngtnx\t221951\n李欣远\t221952\n湛江计件\t221953\n张珈硕\t221954\nbsy\t221955\nbsv\t221956\n气温\t221957\n赵静漪\t221958\n副乳\t221959\n
结合部\t221960\n歃血思路\t221961\n蒜子\t221962\n一千五百二十\t221963\n顶角\t221964\n扭伤\t221965\n王卫红\t221966\n三冠\t221967\nesity\t221968\nlift\t221969\n一手包办\t221970\n如渔\t221971\n海拉尔\t221972\n拖梁\t221973\n爱了用\t221974\n逗逼样\t221975\n秦清清\t221976\n呢贾\t221977\nqwwwq\t221978\n郭梦丽\t221979\n樱花虾\t221980\nfhkf\t221981\n148888\t221982\n得天独厚\t221983\n吃一坠\t221984\n雷滴嘎嘎\t221985\n谈天说地说完\t221986\n明天早上六点钟\t221987\n张桂荣\t221988\n词汇\t221989\n王俊剀\t221990\n报应\t221991\n福岛第一核电站\t221992\n高志\t221993\n报废\t221994\n喝大喽\t221995\nmaopqr\t221996\n姜文\t221997\n贝斯\t221998\n高心\t221999\n28728399939\t222000\n姜莹莹\t222001\n機\t222002\n18796888899\t222003\n闹媒体\t222004\n死魔术\t222005\nwatch\t222006\n令天\t222007\n仙女湖\t222008\n命格\t222009\n荣其\t222010\n牵出\t222011\n人生中\t222012\n通讯\t222013\n太罗\t222014\n信不信我在你的手机吧\t222015\ndjzgHVAC\t222016\n小格拉条\t222017\n夏宇涵\t222018\n荣光\t222019\n莫奈\t222020\nHowsthew\t222021\n金泰亨kjvjjgjggfqwhdmgdngdbgdhtsgfssthtdbsfbdnjtsbfsbtdjstbdabdjegdavdndynfsbfsndfsbdgjtdh\t222022\n错爱\t222023\napm\t222024\n快把\t222025\n这老\t222026\n集璇\t222027\n宋妇联\t222028\ndBD\t222029\n吴宇坤\t222030\n9分裤\t222031\n博古通\t222032\n古往今来\t222033\n不谢不谢\t222034\n总起句\t222035\n一万个\t222036\n王小二\t222037\n几瘦\t222038\n宁明\t222039\n聂红梅\t222040\n文明人\t222041\n翻来\t222042\n一九八八零\t222043\n召集令\t222044\n无数个\t222045\n564942\t222046\nXtbk\t222047\n快报\t222048\n坑坑\t222049\nnnnnnnnnnnnnnnnnnnn\t222050\n艾博\t222051\n想不开\t222052\n哑光\t222053\n十兄弟\t222054\n一万万\t222055\n一万三\t222056\n三军\t222057\n马坤\t222058\n天大家\t222059\n喜基\t222060\n经营者\t222061\n歼-7\t222062\n成效\t222063\n有意者\t222064\n纯蒙古语学院\t222065\n自作多情吧\t222066\n左脸\t222067\n爱新觉罗\t222068\nhimew\t222069\nhours\t222070\n哼嘻嘻\t222071\n自来水管\t222072\ndjxjidkmd\t222073\n张博尔\t222074\nrdjtc\t222075\nvV型\t222076\n龙儿\t222077\n亏欠\t222078\n知不知道\t222079\n翔翔翔翔翔翔翔翔翔翔翔翔翔翔翔\t222080\n植物大战僵尸舞\t222081\n家你是谁呀\t222082\n主观\t222083\n主见\t222084\n弯下腰\t222085\n左脚\t222086\n聊聊游戏吧冬蜜\t222087\n小唐子\t222088\n黑天鹅\t222089\n颐和盛世2期\t222090\n度秘呗\t222091\n隆冬呛叮咯咙咚呛啊恰恰恰\t222092\n医保卡\t222093\n殷昊\t222094\nzngd\t222095\n杨雨欣\t222096\n焦圈\t222097\n从何来\t222098\n零二二零四幺二\t22
2099\n囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz囧rz\t222100\n呆待\t222101\n短小\t222102\nkk歌\t222103\n轻描淡写\t222104\n成教\t222105\n深藏\t222106\n說句話\t222107\n寿者\t222108\n换色\t222109\n一千亿美元\t222110\nexpected\t222111\n减来\t222112\n一针一针\t222113\ntmPG\t222114\n命运交响曲\t222115\n七零三零\t222116\n近5个月\t222117\n女身\t222118\n如果这就是爱情\t222119\n宾阳西里24号楼\t222120\n一那咱们结婚吧\t222121\n呆萌度\t222122\n卸妆\t222123\n100000000000000000000000000000000000块\t222124\n老师旦\t222125\n怡口蓮\t222126\n小仲马\t222127\n来素\t222128\n折雄\t222129\n周亚宁\t222130\n田大学\t222131\nle叻\t222132\n赠汪伦别\t222133\n厨神\t222134\n扫黄\t222135\n九十九亿遍\t222136\n想你我真的好想\t222137\n上海市人民政府\t222138\n聪聪明\t222139\n科室\t222140\n誰來\t222141\n赵凤荣\t222142\n本才\t222143\n米lol\t222144\n董良昊\t222145\n接骨\t222146\n傻bhzlm\t222147\n一下儿\t222148\n请问儿\t222149\n麦麦麦片面包\t222150\n瓦拉尼\t222151\n有口无心\t222152\n一千朵\t222153\nMcGeer\t222154\n假毛\t222155\n一千本\t222156\n起兴进\t222157\n猪猪猪你就是猪是你是你就是你\t222158\n克拉克\t222159\n也称\t222160\n再起\t222161\n标本\t222162\n考期末\t222163\n段落\t222164\n衣兜\t222165\n金球奖\t222166\n肚密秘\t222167\n认招\t222168\n亲家里\t222169\n数像\t222170\n极恶非道\t222171\n8月29日\t222172\n实改\t222173\n000c響O\t222174\n190个\t222175\n代售\t222176\n399273303\t222177\n噶噶噶\t222178\n不知心\t222179\n应山\t222180\n2002418\t222181\n八马骏\t222182\n张北平\t222183\n石荆希\t222184\n靠普\t222185\n北魏\t222186\n东海龙王\t222187\nAptamil\t222188\n嗯象\t222189\n白净\t222190\n5[\t222191\n雪精灵\t222192\n167955560859\t222193\n冯琳惠\t222194\n算说话\t222195\n5P\t222196\n5S\t222197\n5R\t222198\n安东尼\t222199\n俩个九位\t222200\n刘子三\t222201\n淑星男\t222202\n一百九\t222203\n甲片\t222204\n梦龙\t222205\n中关村图书大厦\t222206\n梁家维\t222207\n你好赖\t222208\n5u\t222209\nEverything\t222210\n5v\t222211\n5公里\t222212\n12453678905\t222213\n你好赞\t222214\n5m\t222215\n幺鸡汤\t222216\n谢乐霖\t222217\n38岁时\t222218\nwendongmian\t222219\n拆看\t222220\n5g\t222221\n5a\t222222\n国民党\t222223\ngaadc\t222224\n我的知道在哪儿\t222225\n压锅菜\t222226\n二十五乘\t222227\n娄艺潇\t222228\n本焕示寂\t222229\n耶斯\t222230\n嗯那多\t222231\n老火\t222232\n意会\t222233\n程锦空\t222234\n贾樟宜\t222235\n电视头\t222236\n锲而别\t222237\n六舍\t222238\n从头再来\t222239\ngodie\t2
22240\n拿不上\t222241\n买不下\t222242\n收留\t222243\n59\t222244\n58\t222245\n白水公主\t222246\n54\t222247\n播放你是我的罗密欧\t222248\n56\t222249\n51\t222250\n50\t222251\n53\t222252\n大宝组合\t222253\n西柚\t222254\n甜蜜蜜\t222255\n一片人\t222256\n5$\t222257\n二百张\t222258\n养廉\t222259\n如花有\t222260\n蝇蛆\t222261\n劲脖子\t222262\n高雪人\t222263\n盘后\t222264\n生奉\t222265\n开阳桥\t222266\n吧儿子\t222267\n一章\t222268\n伯伯伯牙\t222269\n小龙女\t222270\n没了没了\t222271\n黑骑士\t222272\ntutorial\t222273\n一端\t222274\n不行走\t222275\n外部\t222276\n地锅\t222277\n一竖\t222278\n揍我\t222279\n一站\t222280\n看懂\t222281\nkdjg\t222282\n楚华朝\t222283\n多简单\t222284\n张嘉懿\t222285\n生女\t222286\nfafa\t222287\naachi\t222288\n伊甸园之东\t222289\n854\t222290\n王洁\t222291\ngghh\t222292\n严处\t222293\n王洋\t222294\n勒个\t222295\n耍梦\t222296\nyoumeyou\t222297\n谷破晓\t222298\n零幺\t222299\nPO主\t222300\n魏晨偶\t222301\n麦门冬汁\t222302\n摘取\t222303\nhvds\t222304\n9987373\t222305\n61个\t222306\n南风窗\t222307\n特性\t222308\njyj\t222309\n牛太阳\t222310\nhellomyamechndi\t222311\n林雪\t222312\n善断\t222313\n林雨\t222314\n林雷\t222315\n20110114\t222316\n杨奎松\t222317\n神马鬼\t222318\n双成双\t222319\njyi\t222320\n生路\t222321\n王洼\t222322\n富士X100\t222323\n有急事等一下\t222324\n人偶\t222325\n臭不要脸的傻冒就是你咯度秘\t222326\n立意\t222327\n发送\t222328\n文正\t222329\n失恋者\t222330\n治官\t222331\n九分之5\t222332\n203fb80e7\t222333\nxxbcf\t222334\n一声节\t222335\n双喜临门\t222336\ngiu\t222337\ngit\t222338\ngir\t222339\n北青网\t222340\n贵族\t222341\n日新月异\t222342\n青椒炒肉饭\t222343\ngix\t222344\n好学姊\t222345\ngif\t222346\ngid\t222347\ngic\t222348\n云宝\t222349\n我的确闲\t222350\n黄文清\t222351\n我喜欢太子妃升职记\t222352\ngin\t222353\n龙龙欣欣\t222354\n太二区\t222355\n3114137724\t222356\ngii\t222357\ngih\t222358\n4800块\t222359\nADS\t222360\n藤本\t222361\n开来\t222362\n八点吧\t222363\n闭合\t222364\n耳朵草\t222365\n广州地铁\t222366\n王琳芳\t222367\n酬谢\t222368\nxxooccppvvjjbbkknnllmm\t222369\n小灰飞喜羊羊\t222370\nv杯\t222371\n暃昌\t222372\n揉揉\t222373\n载客量\t222374\n开杉\t222375\n三季\t222376\n三人一条心\t222377\nabbc式\t222378\n唐婉\t222379\n大灯笼\t222380\n惹你\t222381\n900元\t222382\n尿流\t222383\n北束\t222384\nasdgkl\t222385\nfurt\t222386\n不减仔不必了\t222387\nfurx\t22
2388\n451046\t222389\n北村\t222390\n爱呗\t222391\n阎泽宇\t222392\n刻花\t222393\n李红光\t222394\n一一君\t222395\n几零\t222396\n我不讨厌\t222397\n农活\t222398\n家破人亡\t222399\n笨老虎\t222400\n胶条\t222401\n大了\t222402\n导数\t222403\n打错了你说\t222404\n错话\t222405\n杜洛克\t222406\n几集\t222407\n胡奕楠\t222408\nGhirardelli\t222409\n不紧不慢\t222410\n力王\t222411\n王念辉\t222412\n校验码\t222413\n别碰\t222414\n丹丹丹丹\t222415\n干爹队\t222416\n吕爹\t222417\n啦牛\t222418\nsqx\t222419\nsqq\t222420\nsqv\t222421\n哈罗嗦\t222422\n上冲\t222423\n墙皮\t222424\nsqn\t222425\n587458868855505\t222426\n881140330\t222427\n匹沙\t222428\n小恋\t222429\n想劲儿\t222430\n刀剑神域\t222431\n啦特\t222432\n77784765\t222433\n国乐\t222434\n媒体们\t222435\n别梦里去\t222436\n小息\t222437\n小恩\t222438\n五家渠\t222439\n不好我投诉你我投诉你我投诉你\t222440\n好悠\t222441\n瓷\t222442\n瓶\t222443\n九八\t222444\n新开间\t222445\n有种人\t222446\n礼貌不礼貌不礼貌不礼貌\t222447\n瓢\t222448\n瓤\t222449\n东莞后街\t222450\n李胜晟\t222451\n屁股\t222452\nreampa\t222453\n瓯\t222454\n瓮\t222455\n喂养勺\t222456\n九元\t222457\n个半\t222458\n哈斯木霉菌塞点\t222459\n瓝\t222460\n瓜\t222461\n奥蜜奥良\t222462\nNN次\t222463\n我相信你行\t222464\n热热闹闹\t222465\ngopao\t222466\n毛曾半\t222467\n杨顿\t222468\nnowhg\t222469\n19522.6万元\t222470\n女乳\t222471\n擎天柱\t222472\n女书\t222473\n一页\t222474\n一顶\t222475\n别小说了拜拜\t222476\n比如说\t222477\n卢卡斯\t222478\n国家税收入\t222479\n人魔\t222480\n一顿\t222481\n河马河马\t222482\n一项\t222483\n一顺\t222484\n下来\t222485\n10i五十\t222486\n一顆\t222487\n海信空调\t222488\n风风光\t222489\n兜里\t222490\nsenabce\t222491\n化作电闪雷鸣\t222492\nHvdjyghkh\t222493\n卡忙北鼻来死狗\t222494\n水吧\t222495\n安天喜\t222496\n296g919\t222497\n偷人贼\t222498\n几百几百\t222499\nteachfeeling\t222500\n国英\t222501\n小达令\t222502\nareturn\t222503\n周涛\t222504\n4363474\t222505\n6666655554422\t222506\n美兰\t222507\n徐子航\t222508\n10千米\t222509\nqfhiscigrg\t222510\n赵何娟\t222511\n你好我好寂寞\t222512\ngxgxv\t222513\ndianshi\t222514\n三分地\t222515\n避险\t222516\n银河堡\t222517\n卡兹秘高屋了是的天气预报\t222518\n原盐\t222519\n田恩瑛\t222520\n周润\t222521\n轻音乐\t222522\njkbmbjkkknnhjjkgkklhbkbbn\t222523\n九县一区\t222524\n精神病人\t222525\na多\t222526\n库克\t222527\n四银\t222528\n微女郎\t222529\n77mua\t222530\n300个\t222531\n颠婆\t22
2532\n粪尿\t222533\n泡妞\t222534\n扫荡\t222535\n雅增林\t222536\n欢你是\t222537\n里约热内卢\t222538\n300万\t222539\n坏菜\t222540\n蓝边儿\t222541\n嘉爷\t222542\n灰领\t222543\nujdatata\t222544\n专务\t222545\nshenmi\t222546\n(\t222547\nshenme\t222548\n大i秘小布苇扒近毋廿撇杪杂草遴牛甘歹兀\t222549\n大个子\t222550\ndoris\t222551\n日喀则\t222552\n傻子枫\t222553\n足以\t222554\n增肥\t222555\n未然\t222556\n样衣\t222557\n灰相互瞬间亮挥泪无人间天\t222558\n学听话\t222559\n负心人\t222560\n伱受我攻\t222561\n早盘\t222562\n数据处理\t222563\nfjwmc\t222564\n张原单\t222565\n粗暴\t222566\n帅呢\t222567\n11田\t222568\n煩不煩\t222569\n2011年元旦\t222570\n直选\t222571\n桑娜\t222572\n让我明白\t222573\n达语\t222574\nsports\t222575\n西塔\t222576\n呕喽\t222577\n织织\t222578\ntirp\t222579\n乱说谎\t222580\n早上七点四十\t222581\n直通\t222582\n短片\t222583\n墙纸\t222584\n西塞\t222585\n西塘\t222586\n7556\t222587\n那京\t222588\n桃园小学\t222589\n打错字\t222590\niiimkt\t222591\n證人\t222592\n千岛湖\t222593\nEva\t222594\n口色\t222595\nu娜目ne\t222596\nEve\t222597\n早上10点\t222598\n无聊鬼\t222599\nEvy\t222600\nfuidg\t222601\n回家等\t222602\nGFTT\t222603\n如此而已\t222604\n二零二零三\t222605\nn467\t222606\n乖飒飒\t222607\n100位\t222608\n看见\t222609\n那事\t222610\nhello目的天天开心\t222611\n记入\t222612\nfansto\t222613\n腊梅\t222614\n内面\t222615\nyujnlnjmo\t222616\n3118414905287383\t222617\n那亚\t222618\n那些\t222619\n桥豆麻得\t222620\n9963251\t222621\n啊处\t222622\n望江楼公园\t222623\nqqqlgy\t222624\n付建昌\t222625\n你老惹我\t222626\n呵态\t222627\n马楚\t222628\n較\t222629\n多四分之一\t222630\n缪家亮\t222631\n以撒\t222632\n真我\t222633\n别来说\t222634\nhn拨片\t222635\n輕\t222636\n乌拉山\t222637\n春纪公司\t222638\nhanigetou\t222639\n教育制度\t222640\n裘皮\t222641\nNBA2k\t222642\n光羽·光语\t222643\n輦\t222644\n会儿一回\t222645\n遗像\t222646\n水水山山处处明明秀秀晴晴雨雨时时好好奇奇\t222647\n谢传惜\t222648\n太亏了你\t222649\n新能\t222650\n傅启辉\t222651\n可心午\t222652\n益智游戏\t222653\n5024平方厘米\t222654\n折半\t222655\n付艳芝\t222656\nhihi\t222657\n中药味\t222658\n拐角处\t222659\n3.34%\t222660\nandritw\t222661\nxort\t222662\n寻欢作乐\t222663\n陈飘逸\t222664\n同胞\t222665\n花jiK\t222666\n查号\t222667\n了峰峰先上你好呀\t222668\n大众创业万众创新\t222669\n费米\t222670\n罗茂之\t222671\n刘晨曦\t222672\n发条橙\t222673\n\t222674\n二六二零\t222675\nFxjbwh\t2226
76\n走样\t222677\npredia\t222678\n让一让\t222679\n韵母\t222680\n哇纪元\t222681\n难懂\t222682\n背装\t222683\n争奇\t222684\n牛津\t222685\n六再\t222686\njrnd\t222687\n不良品\t222688\n报考\t222689\n埠头\t222690\n46秒\t222691\n112点\t222692\n多半载\t222693\nDugjudge\t222694\n小夜端\t222695\n不诓\t222696\n华翠多\t222697\n接转\t222698\n大风车\t222699\n彩印\t222700\n不识\t222701\n猫咪\t222702\n低落\t222703\n狗猫魔\t222704\n云飞\t222705\n不课\t222706\nvtd\t222707\n云飘\t222708\n不说\t222709\n86岁\t222710\n不请\t222711\n不忍心忍心忍心\t222712\n不语\t222713\n大漠谣\t222714\n不误\t222715\n上集\t222716\n不该\t222717\n不详\t222718\n很方便\t222719\n够不着\t222720\n张武龙\t222721\n胡建龙\t222722\n囧态\t222723\n总结性\t222724\n瓜笑\t222725\n国际社会\t222726\n转了装\t222727\n宋老师\t222728\nwofade\t222729\n出生率\t222730\n朱思颖\t222731\n一试份\t222732\n不上不下\t222733\n秘我喜欢\t222734\n熟视无睹\t222735\n黑鸭大战\t222736\nuhhyou\t222737\n就好当然\t222738\n缺载\t222739\nvt5\t222740\n啦啦啦啦啦啦啦啦啦啦666666\t222741\n穆斯林\t222742\n高新仪\t222743\n港口小学\t222744\n赵在\t222745\ngusv\t222746\nBagayalu\t222747\n摒弃\t222748\n渡过\t222749\n单重\t222750\n064267458\t222751\n生缘\t222752\n洗泊\t222753\n要贵\t222754\n下来临走\t222755\n亚晨\t222756\n解手\t222757\n街机份\t222758\n泡蛋\t222759\n800米\t222760\n洗泽\t222761\n以纯\t222762\nq秘秘\t222763\n黄鸿浩\t222764\n藕叶\t222765\nyourenglishissostupid\t222766\n对内\t222767\n二滴\t222768\n黄韦\t222769\n九汉语\t222770\n三个多月\t222771\n好那我不和你说话了我不相信你\t222772\n日剧\t222773\n二百丑大度秘\t222774\n闹人\t222775\n咪咪迷咯啵\t222776\n678990\t222777\nzhanji\t222778\n对冉\t222779\n100亿元\t222780\n688282838233828\t222781\n郑龙\t222782\ngdethbjmnnj\t222783\n别闹了吧\t222784\n58吗四十八三十八二十八十八\t222785\n好多啦\t222786\nRAIN\t222787\n发挥好\t222788\n一千几记\t222789\n波尔多\t222790\n200334455667788\t222791\n张子\t222792\nimyx\t222793\n太极旗\t222794\n24365\t222795\n张如涛\t222796\n有人在\t222797\n青鸟\t222798\n阿皮片\t222799\n姑囖\t222800\nplace\t222801\n好想你的说\t222802\n吕慧\t222803\n01095055\t222804\n大本儿\t222805\n自然而然\t222806\n节点\t222807\n提款\t222808\n十二盆花\t222809\n18735250284\t222810\n淫虫\t222811\n在其不来\t222812\n马拉多纳配\t222813\nf拉多\t222814\n两千3004千600\t222815\n三亿六千万\t222816\n一起块\t222817\nios4\t222818\n别说话了我讨厌你\t222819\nios6\t222
820\n老爷机\t222821\n巧虎岛\t222822\n仗义\t222823\nMC吧吧主\t222824\n胡萝卜馅\t222825\n抱嘻嘻\t222826\n几期\t222827\n成立新\t222828\n置换\t222829\n圣托里尼岛\t222830\n多多多多\t222831\n哼杂\t222832\n妖棒\t222833\n一来看\t222834\n密封\t222835\n请重说\t222836\n坐标\t222837\n斯\t222838\n蔡焱辉\t222839\n断\t222840\n一比八多\t222841\nLoomis\t222842\n同龄人\t222843\n阜阳\t222844\n揭竿而起\t222845\niosa\t222846\n从容秘\t222847\n斤\t222848\nCanyouswim\t222849\nalan\t222850\n哈哈哈者\t222851\n施\t222852\n於\t222853\n侬点\t222854\n以免\t222855\n好我爱死\t222856\n蹦蹦伯伯伯伯伯伯伯伯伯伯伯伯伯\t222857\n10月份\t222858\nallyouhan\t222859\nhiehe\t222860\n新\t222861\n巴掌\t222862\n不不不不不不我作业\t222863\n斋\t222864\n4556655555463\t222865\n文\t222866\n斃\t222867\n以其\t222868\n白石桥\t222869\n斟\t222870\n集合\t222871\n睡伦\t222872\n料\t222873\n真博爱\t222874\n十尾\t222875\n崩盘\t222876\n轻体\t222877\n斑\t222878\n凡凡\t222879\n720点\t222880\n让领导先走\t222881\n雄雄雄\t222882\n酒钢\t222883\n点状\t222884\n脚兔\t222885\n魏应交\t222886\n天天一天\t222887\n周裕淳\t222888\n任一凡\t222889\n古乐拜\t222890\ngyvghvj\t222891\nhoolyo\t222892\n黄汪汪\t222893\n微妙\t222894\n非常成年\t222895\n血晶\t222896\n洗剂\t222897\n清单\t222898\n七天之后\t222899\n玩去\t222900\n渡迷\t222901\n锅木\t222902\n黑金\t222903\n84835020\t222904\n西真笨呐\t222905\n曾晊琪\t222906\n刘堡\t222907\n倒计\t222908\n国营实业\t222909\nbabox5\t222910\n万工\t222911\nfct\t222912\n陌陌度秘\t222913\n德克仕\t222914\n我爸当然是你了你好帅\t222915\n阿咿\t222916\n弄口\t222917\n颜子卿\t222918\nfcd\t222919\nfcf\t222920\nfcg\t222921\n阿咪\t222922\n闹事\t222923\n房县\t222924\n杨刚\t222925\n杨家将\t222926\n帽子女\t222927\nbiifgi\t222928\n欠条\t222929\n12665479888\t222930\n画江湖之灵主\t222931\n赛特\t222932\n上北镇\t222933\n13150300792\t222934\n二六三幺零六零二\t222935\n火箭筒\t222936\n日本颜氏\t222937\n万州\t222938\nftyymhu\t222939\nBC卜\t222940\n扁头\t222941\nWhose\t222942\n分拣\t222943\n患难与共\t222944\n唐振中\t222945\nhhhjh\t222946\n分拨\t222947\n好疼\t222948\n嘎嘎芭芭拉kai\t222949\n夸我美\t222950\n文真\t222951\n天津卡\t222952\n四心欧尼\t222953\n对着干\t222954\n一个七八岁\t222955\n杜少威\t222956\n分担\t222957\n分拆\t222958\n洗耳恭听\t222959\nABBY玲\t222960\n匠心独运\t222961\n花落谁家\t222962\n万源路\t222963\nkoyo\t222964\n配送员\t222965\n莎娃迪卡靠靠靠\t222966\n入眼\t222967\n谢师宴\t222968\
n百合网\t222969\n中核集团\t222970\nfffgfggggc\t222971\nCaifanggao\t222972\n替代品\t222973\n2月12日起\t222974\n下周点\t222975\nzhaopian\t222976\n地缝\t222977\nhoye\t222978\n欧普\t222979\n靜秋狀態\t222980\n康曼\t222981\n男事\t222982\n入眠\t222983\n9-10月\t222984\n伯家\t222985\n知华\t222986\n艳红\t222987\n男人\t222988\nxcFFXGHSR\t222989\n好好好好好\t222990\n梁米\t222991\n6kk\t222992\n重庆建设工业集团\t222993\n毛驴子\t222994\n春节春节\t222995\n6ka\t222996\n西亚杯\t222997\n儿猪\t222998\n巡航\t222999\n哎呦冰炮\t223000\n虹兴社区\t223001\n戏迷\t223002\n是因为你是\t223003\n再结晶\t223004\n2800名\t223005\nvivl\t223006\nuzywye\t223007\n纯要\t223008\n防骗\t223009\ndtrs\t223010\n风风火火\t223011\n爱我哪里\t223012\nuucuutcoc6cccccuvecue\t223013\n注试试\t223014\n真没想到\t223015\n娶亲\t223016\n五七岁\t223017\n偷跑\t223018\n死扑街\t223019\n童稚\t223020\n田晨蕊\t223021\n一线天\t223022\n刺死\t223023\n陌路\t223024\n马群里\t223025\n9胡\t223026\n说短\t223027\n啦啦啦啦啦啦啦啦我是卖报\t223028\n太好啦\t223029\naqwrhu\t223030\n杨兆曜\t223031\n鸦片战争\t223032\nk钱\t223033\n细胞分裂\t223034\n度秘我想和你\t223035\n大猪侠\t223036\n好不好我没有病\t223037\n宪武\t223038\n5655685598889001\t223039\n缅甸\t223040\n弊视\t223041\n小灰兔\t223042\n叫曲\t223043\n申彩曦\t223044\ntouyou\t223045\n葡萄牙语\t223046\n杜村杜村\t223047\n101本\t223048\n手印\t223049\n24日晚\t223050\n涛涛涛涛涛\t223051\nwyx\t223052\n透考扣\t223053\n六年六年\t223054\n七十多天\t223055\n卷尺\t223056\n杜密兹\t223057\n聊狗\t223058\n唉苦\t223059\n谷攻\t223060\n广告节\t223061\n0844658469745855848075n4846\t223062\n76496349834938957\t223063\n楚玉静\t223064\n罗丽\t223065\n文虎\t223066\n亲着您哈么么么么么么么么么么么晚安辣老公\t223067\n我的丑算了行么我把你的单什么\t223068\n脸型\t223069\n模特\t223070\n口水\t223071\n4几\t223072\n伦家家\t223073\n耐人\t223074\n翻来覆去\t223075\n四爷\t223076\n罗一\t223077\n触电\t223078\ng1只\t223079\nyh\t223080\n这的人\t223081\n悖论\t223082\n勤俭节约\t223083\n87分\t223084\n欢乐关\t223085\n00PS\t223086\n金雕\t223087\n别具\t223088\n三一起来吧度\t223089\n萧条\t223090\n生下女播放\t223091\n落樱\t223092\n换汤不换药\t223093\n爬楼梯\t223094\nkill啦啦阿拉\t223095\nwimask\t223096\n很纯\t223097\n刑期\t223098\n伤员\t223099\n堵住\t223100\n开森\t223101\n1a1a1a1d1a1a\t223102\n美辰\t223103\n最后一场比赛\t223104\n杜密克\t223105\n8JJ\t223106\n度秘你有爱你的人\t223107\n鬼父\t223108\n说自话\t223109\n那千
玺\t223110\n张桔犬\t223111\n辉燕\t223112\n感觉不到\t223113\nassasss\t223114\n阿库拉\t223115\nor欧诺\t223116\n稀饭\t223117\n二百三百四百八\t223118\n腚眼\t223119\n啦啦咯\t223120\n蒋雅欣\t223121\n小太子妃升职记\t223122\nsasiz\t223123\n美国地质勘探局\t223124\n洪荣伟\t223125\n吴会\t223126\n香樟园小区\t223127\n诅咒\t223128\n市区外\t223129\nfffdggd\t223130\n代省长\t223131\n一个十二二十一岁\t223132\nXhd\t223133\n病种\t223134\n请儿\t223135\nv度\t223136\n女生节\t223137\n不是我爱\t223138\n酷听听\t223139\n撒旦莉莉丝\t223140\n几顶\t223141\n出安安\t223142\n一个千呼万\t223143\n仁杰\t223144\n麂皮\t223145\n几顿\t223146\n3月14号\t223147\n熬夜我萌萌哒萌萌哒萌萌哒萌萌哒\t223148\n零四分之五\t223149\n金克丝\t223150\n安倒\t223151\n不要这么自我热情吧我是受不鸟\t223152\n闲游\t223153\n二八0点\t223154\n管辖\t223155\npo主\t223156\nlute\t223157\n联网\t223158\n安倍\t223159\n中国环境保护基金会\t223160\n小葵花\t223161\nMonster\t223162\n儿子们\t223163\n一如\t223164\n炙爱\t223165\n巴拉巴拉小魔仙贝\t223166\n朱怡\t223167\n罗梦雅\t223168\nrddfg\t223169\n喷漆\t223170\n种族\t223171\nMeganmay\t223172\n一个一会儿\t223173\nnftndnfnngifcngjxkfnfnxfkmfnfndkfkgjv\t223174\n整发际线\t223175\n八十几分\t223176\n浑水\t223177\n甲城\t223178\n副州长\t223179\n据场\t223180\n个工\t223181\n涓涓细语\t223182\ndidjbsiwg\t223183\n甲基\t223184\n1100万条\t223185\nmmmmmmmmmmmmmmmmmmmmmm\t223186\n大战灰跳舞的脚有娃娃么美妙魔仙女那在哪呀在吗吗hi魅影hi\t223187\n出众\t223188\n文库\t223189\n孙儿\t223190\n立表\t223191\n米奎尔\t223192\n颜郗\t223193\n打脸\t223194\n黄岛开发区\t223195\n说了再见\t223196\n唱一首\t223197\n克什克腾旗\t223198\n二十二十好多\t223199\nhjfgg\t223200\n长方形用\t223201\n学美容\t223202\n入党\t223203\n3.6%\t223204\n夏含\t223205\n20度\t223206\nwqpgmevo\t223207\n胡言了事\t223208\n神声音\t223209\n隶属\t223210\n菠萝蜜\t223211\n入入\t223212\nFDDFCSgetTexas\t223213\n一闪\t223214\nnishinandehaishinvde\t223215\n姓李妹妹\t223216\n舒我\t223217\n中班\t223218\n承德市\t223219\nvsaejvse\t223220\n寻妈记\t223221\nrich\t223222\n易怒\t223223\n泳衣\t223224\n20612\t223225\n歌坛\t223226\n忙行\t223227\n哈庙\t223228\n小白\t223229\n小百\t223230\n回回家\t223231\n冰峰\t223232\n六一点\t223233\n万特制药\t223234\nwdjp\t223235\n11000斤\t223236\n無慮\t223237\n王俊智\t223238\n开筹拷\t223239\n哼臭豆腐秘\t223240\n元春\t223241\nppppooooooooijhhvvrt77uhfdtyuuhhggfyyuujhffty788iuhgfew234578iiiijhhfftyuhfe34579ih\t223242\n具
体型\t223243\n做作业\t223244\na咪\t223245\n多一句\t223246\n大一统号\t223247\nccjk\t223248\nDouble\t223249\n冷清\t223250\n只差\t223251\n哈度\t223252\n阿凡达\t223253\n第二十棵\t223254\n說實話\t223255\nyryfjhfhffyg\t223256\n张励智\t223257\n柳哥\t223258\n神马意思\t223259\n琉克\t223260\n奸夫\t223261\n隋菲菲\t223262\nD君\t223263\n东北\t223264\n二二二八\t223265\n二二二六\t223266\n字幕版\t223267\n小调皮\t223268\n3d肉蒲团\t223269\n火腿\t223270\nueufhhwf\t223271\n卖豆腐\t223272\n35.35\t223273\n湖南宁乡一中\t223274\nyyyyuy\t223275\n海伦市\t223276\n公与媳\t223277\n踩点\t223278\n11787.84点\t223279\n数六十分种\t223280\nPrayer\t223281\n不敢相信你是\t223282\n他么\t223283\n总总\t223284\nhi吐会兔\t223285\n宋慧乔\t223286\n妈呀\t223287\n林嘉欣\t223288\n44QKQK\t223289\n朵咪\t223290\n抗议\t223291\n置之死地而后生\t223292\n再见再见再见亲爱的度秘再见\t223293\n496469438846738\t223294\n88克\t223295\n齿状线\t223296\n好奇心\t223297\n税务\t223298\n你好劲智能机器人\t223299\n哈哈好好玩乐c某法好贴不\t223300\n阿洁君\t223301\n三十分节\t223302\n嗯心里\t223303\n段思雨\t223304\n异常\t223305\n豆脑\t223306\n邱欣悦\t223307\n世界储备货币危机\t223308\n堡膜\t223309\n莫尔斯电码\t223310\n对啊咱们俩\t223311\n灵鹫\t223312\n十六本\t223313\n捶背\t223314\n普达\t223315\n上海逛田子坊苹果店\t223316\n易贤\t223317\n啊咖\t223318\nELF\t223319\n陪笑\t223320\n辩护\t223321\n王庆利\t223322\n葛承笑\t223323\n易贷\t223324\n发输\t223325\n雨寒\t223326\n栓住\t223327\ntobahither\t223328\n我的经济\t223329\n别给\t223330\n175页\t223331\n别泻药\t223332\n天八宁\t223333\n嗯啵\t223334\n桃美\t223335\n青铜座\t223336\n退出来\t223337\n4秒钟\t223338\n你好呀你好呀啦啦啦啦啦啦\t223339\ntffangco\t223340\n坐飞船\t223341\nvxiab38n68618n83187684384618318761784607n757401473\t223342\n妹妹儿\t223343\n支架式\t223344\nSS501\t223345\n金山桥婚姻登记处\t223346\nbeltlrc\t223347\n21日15时55分\t223348\n博阿滕\t223349\n嗯大约\t223350\n钱一平\t223351\n不唯\t223352\n来了谁告诉你\t223353\n保育箱\t223354\n金龙鱼\t223355\n不唱\t223356\n吕金毛\t223357\n销魂脚\t223358\n猪身\t223359\n一马当先\t223360\n一定要记住\t223361\n上上上上上上上上上上上上上上上上上上\t223362\n冰度秘\t223363\n我告诉你\t223364\ndguhhf\t223365\n20届\t223366\n矫姿器\t223367\n不唉\t223368\n王一婷\t223369\n翠花翠花\t223370\n孙nn\t223371\n广东塑料厂\t223372\n8677\t223373\n酷酷鲁\t223374\nChcchbxb\t223375\n爱可\t223376\nhi皮鲁2b21212\t223377\n石头们\t223378\ngfffffff\t223379\n曾聪敏\t223380\n孟给我看
\t223381\n20万元\t223382\n垃坷\t223383\n你最萌\t223384\n正负\t223385\n嘛丁\t223386\n正财\t223387\n一血斑款\t223388\n阝乐石\t223389\n五折\t223390\n深喉\t223391\n作文化\t223392\n我喜欢我喜欢\t223393\n明天宁\t223394\n疯了疯了疯了\t223395\n没有关系\t223396\n竞然\t223397\n电儿\t223398\n五把\t223399\n针织\t223400\n阿泰\t223401\ndist\t223402\n偿命\t223403\n陈雪瑶\t223404\nJhbh\t223405\n猫娘\t223406\nvisitstistis\t223407\nbjkhmm\t223408\n在逃\t223409\ngheryh\t223410\n缩水\t223411\n对啊二\t223412\n铲铲\t223413\n迷你\t223414\ndish\t223415\ndisi\t223416\n雷同\t223417\n雷名\t223418\n雷吊\t223419\n160千克\t223420\n第２５届\t223421\n沈魔\t223422\n99点\t223423\n搞不清\t223424\nDgbbxjxj\t223425\n东涌街\t223426\n水蜜桃利口酒\t223427\n迷住\t223428\n西海\t223429\n1o岁\t223430\n难民署\t223431\n第四组\t223432\n加速器\t223433\n操场\t223434\nGIF\t223435\n不定期\t223436\n落网\t223437\n伯牙绝弦\t223438\n塞曼\t223439\n杨徒儿\t223440\nV5地理课啊KTV了里咯莫非\t223441\n八月六\t223442\n姚晓存\t223443\n金花菜\t223444\n九岁半\t223445\n零零二二零幺五七七幺\t223446\n大大方方\t223447\n水果刀\t223448\n34.7%\t223449\n十三十六十七十八十九十六\t223450\n二本儿\t223451\n一下子方\t223452\n乐儿\t223453\n张莉莉\t223454\n舒凡\t223455\n3000元\t223456\n废品\t223457\nhewroteallof\t223458\n遗漏\t223459\n花面猫\t223460\nPride\t223461\n喜欢迎\t223462\n24公斤\t223463\n新钢分宜铁坑矿招待所\t223464\n门关\t223465\n10P\t223466\n物理作业\t223467\n四五天\t223468\n合苏木\t223469\n六五幺\t223470\n设在\t223471\nifiatimpli\t223472\n没关系\t223473\n新世纪国旅\t223474\nnene\t223475\n午盘\t223476\nnenf\t223477\n徐逊\t223478\n2333号\t223479\n李启兵\t223480\nhnbygyc\t223481\n美拍\t223482\n挑一选\t223483\n小妹儿们\t223484\n4月4日00时54分\t223485\n如初\t223486\n新片子\t223487\n谢谢好久不见\t223488\n顶绕\t223489\n红尘客栈\t223490\n欢我会\t223491\n我的康慈，我的慕仪\t223492\n恰恰\t223493\n苍空井\t223494\n陈哥\t223495\n17358525\t223496\n大变态\t223497\ngsdccf\t223498\n陶涛\t223499\n高梦梦\t223500\n神舟九号\t223501\n患难与共是\t223502\n华云翔\t223503\n15861745200\t223504\n度秘我不要你了你\t223505\n我讨厌你讨厌你讨厌你动力我不要你当我的助手\t223506\n挥一挥\t223507\n貢兰\t223508\n张丽鑫\t223509\n｀ノノ\t223510\n慢摇\t223511\n推压\t223512\n梅城\t223513\n歌熊猫\t223514\n宋宇鹏\t223515\n偶有鸭梨\t223516\n高青县\t223517\n喔家森\t223518\n东东麻\t223519\n为啥\t223520\n46858688588\t223521\n这么说吗\t223522\n秦暮言\t223523\n1月10号\t22
3524\n斜阳暮草\t223525\n咋侯\t223526\nknows\t223527\n72年前\t223528\n差回过号\t223529\n会不好意思\t223530\nKONKA\t223531\n闹梦\t223532\nM3皮卡\t223533\n饭行\t223534\n字子\t223535\n字字\t223536\nglad\t223537\n介素\t223538\n使过\t223539\n八月份\t223540\n为你好呀\t223541\n一人名\t223542\n李冠学\t223543\n186067\t223544\n大总攻行\t223545\n朱培媛\t223546\n林思源\t223547\n15193198924\t223548\nInn\t223549\n7月6日\t223550\n经潼关县政府办公室\t223551\n少小妇\t223552\n夏雨\t223553\n夏雪\t223554\n灰指\t223555\nbunui\t223556\n果断\t223557\n蒋子阳\t223558\n核动力\t223559\n胡扯\t223560\n贝尔格里尔斯\t223561\n安其坦\t223562\n饱满\t223563\n85549795\t223564\n美廉美私密\t223565\n55秒内\t223566\n37.6%\t223567\n教师证\t223568\n百hx\t223569\n还有你好无聊\t223570\n黄羿宁\t223571\n宠王\t223572\n1119999\t223573\n佛佛\t223574\n农历五月十七\t223575\n跳周\t223576\n北海路\t223577\nK145次\t223578\n糖豆豆\t223579\n对手戏\t223580\n良渚\t223581\n保护人\t223582\n轮轮回\t223583\n一百二十三块\t223584\n身教\t223585\n信你可以\t223586\n619\t223587\n忶\t223588\n念\t223589\n不c不c\t223590\n613\t223591\n610\t223592\n忸\t223593\n616\t223594\n星期八\t223595\n忽\t223596\n一百分\t223597\n忠\t223598\n忧\t223599\n茜姐\t223600\n快\t223601\n头花\t223602\n忭\t223603\n宋一夫\t223604\n忒\t223605\n床尾\t223606\n志\t223607\n忙\t223608\n忘\t223609\n恩纳尼哈赛唷\t223610\n鼻青脸肿\t223611\n非常风风光光\t223612\n5151828151480\t223613\n1493\t223614\nHRNFS\t223615\n茗茶\t223616\n必\t223617\n207k线\t223618\n迹象\t223619\n秘书\t223620\n忉\t223621\n懂秘\t223622\n珍珠你给我珍珠\t223623\n忌\t223624\n这声响\t223625\n蒙毅\t223626\nokder\t223627\n爱拉不优\t223628\n好笑\t223629\n嗯顿\t223630\n拜拜我喜欢\t223631\n童谣\t223632\n柏高家\t223633\n百四元\t223634\n仙脚\t223635\nfuygt\t223636\n万象天\t223637\n唑片\t223638\n隹\t223639\n下雪勒\t223640\n哪都米\t223641\n535880\t223642\n王谦虚\t223643\ninthatnoon\t223644\n明年2月\t223645\n3D打印\t223646\n杨家婊\t223647\n乖度秘我现在不\t223648\n清透\t223649\n香皂花\t223650\n看我多帅\t223651\n缴获\t223652\n冯玉苗\t223653\nhhbvcft6fDrtyryuhgcyhdr5stgjjh\t223654\n宏耕\t223655\n尝鲜\t223656\n鬼饭\t223657\n东北城市\t223658\ncalm\t223659\n122333333\t223660\nREKW2AKEENVWQJ2YJT2Ityeietqlywr11iYWU525I22OT4W6EIe7lirk\t223661\n71厂\t223662\n丑丑丑\t223663\n拜拜天\t223664\n开学我来了你在\t223665\n来事儿\t223666\n成二黄\t223667
\n傻帽\t223668\n模拟车\t223669\n送元二\t223670\n箱盖\t223671\n一多岁\t223672\n43224256\t223673\n问一下行\t223674\n迪菲亚我小七岁\t223675\n发块\t223676\n秘男篮球\t223677\n我真的好想和你\t223678\n4755632555555\t223679\n发坏\t223680\n葡式\t223681\ngvgfvvffc\t223682\n望远镜\t223683\n十七岁\t223684\nfidd\t223685\n劈叉\t223686\n国美电器董事局\t223687\n樱之雨\t223688\n古龙\t223689\n蒋裕福\t223690\n莫门口\t223691\n你的要求\t223692\n39天\t223693\n讲乜\t223694\n笨蛋挞\t223695\n很劲\t223696\n去吖仔\t223697\n机制点\t223698\n绕开\t223699\n随\t223700\nWere\t223701\n好气\t223702\ncatic\t223703\n葛伟林\t223704\n唱精忠报国\t223705\n鸿水\t223706\n判断力\t223707\n废话费\t223708\n内落差\t223709\n自由军\t223710\n杨俊翔\t223711\n绕弯\t223712\n吝啬鬼\t223713\n程灵烨\t223714\n风湿病\t223715\n伱好美\t223716\n宝贝豹子是你\t223717\n度迷度秘\t223718\n石棉老老桥\t223719\n文波\t223720\n话语权\t223721\n岳翔\t223722\n哎呀度秘你真你家真的在我隔壁\t223723\n神仙\t223724\n作文集\t223725\n王家卫\t223726\nkthckhcifxgjjghcckkjgjuuuun\t223727\n万仞\t223728\n颗粒\t223729\n李学奇\t223730\n帅帅哥\t223731\n吧度柲\t223732\n度秘我爱你我们结婚吧\t223733\n唧燥\t223734\n怪梦\t223735\n4544\t223736\n15932586482\t223737\n尾市\t223738\n孙雨婷\t223739\nvffggf\t223740\n切切有你是谁\t223741\n踏踏踏\t223742\n万代\t223743\n呼兰区\t223744\n于子昂\t223745\n小堡\t223746\n风尘仆仆\t223747\n水笔\t223748\n被骗了吧\t223749\n亲兄托\t223750\n沃保\t223751\n23.7亿元\t223752\n娃衣\t223753\n胜德村\t223754\n再发短信\t223755\n曹其新\t223756\n好星\t223757\n好昕\t223758\n苑伯宁\t223759\n鸽鸽\t223760\n太岁u\t223761\n42千米\t223762\n有我\t223763\n名车\t223764\nReggae\t223765\n高蛋白\t223766\n堕胎文\t223767\nAV51次\t223768\nclease\t223769\n李孜柔\t223770\n喉舌报\t223771\n最最最最\t223772\n好是\t223773\n乌篷船\t223774\n烈性\t223775\n小升初\t223776\n尿泡\t223777\n嘉玲\t223778\n奔驰中国总部\t223779\n猜一到\t223780\n周笔畅新城记\t223781\n野鸭湖\t223782\n休战\t223783\n巴拉迪\t223784\n3555555\t223785\n长不短\t223786\n婚否\t223787\n放辞\t223788\n累死人你告诉我\t223789\nvvggggg\t223790\n西天取经\t223791\n欢乐豆\t223792\n案头\t223793\n信达\t223794\n申泽禾\t223795\n深奥吗一点都不深奥我相信你是个\t223796\nstyoumeto\t223797\n⊕Θ\t223798\n木兰诗\t223799\n欧啂\t223800\n绝壁\t223801\n嗯好度秘\t223802\n嗯卡\t223803\n艳舞\t223804\n生计\t223805\n燙畫嗎\t223806\n嗯卧\t223807\ncucolo\t223808\n婚后\t223809\n百花\t223810\n小念妹妹\t223811\n老龄化\t223812\n帝
旺\t223813\n2589432580\t223814\n秘束\t223815\n奥斯卡奖\t223816\n没事了钱\t223817\nlatljme\t223818\n六畜奴作满成行\t223819\n匡时秋\t223820\n素食主义\t223821\n张国伟\t223822\n吧的的吧的缰绳想的多一个僵化的哇呱株式\t223823\nhidjis\t223824\n喜鹊\t223825\n荷包网\t223826\n蛇精病\t223827\n5db\t223828\n啦啦娃\t223829\n擦擦擦vv不不不vv\t223830\n看着我好想吃\t223831\n求行\t223832\n河岳\t223833\n亲湿\t223834\n佩贾\t223835\n马化腾\t223836\n河岸\t223837\n八盒\t223838\n成程\t223839\n好评论\t223840\n消肿\t223841\n04月26日06时49分\t223842\n三不三个\t223843\n政商\t223844\n汇龙\t223845\n我真的爱你我一辈子爱你为了你\t223846\njsd\t223847\n雷波一\t223848\njsf\t223849\njsb\t223850\n一心\t223851\njsl\t223852\n秘书说声小的在\t223853\njsn\t223854\n物证学\t223855\n汪岚岚\t223856\njsi\t223857\no几个\t223858\njsk\t223859\njst\t223860\njsv\t223861\n低年级\t223862\n为由\t223863\njss\t223864\nnothing\t223865\nhsgskwn\t223866\njjurd\t223867\n14朱\t223868\n星学院星学院\t223869\njsy\t223870\n招商\t223871\n利索\t223872\n饿哦若凝妇女节吃口饭机动科从\t223873\n希特勒\t223874\n讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t223875\n可比性\t223876\n叶赫那拉\t223877\n刘世罗\t223878\n悟道\t223879\n覃育\t223880\n县关\t223881\n为生\t223882\nnnnnnnnnm\t223883\n小苗子\t223884\nSpeed\t223885\n毒舌\t223886\n好好想\t223887\ntftgghygcedeb\t223888\n刘希望\t223889\n问好不\t223890\n耻心\t223891\n爆头战\t223892\n超级大国\t223893\nhoyze\t223894\nnnnnnnnnn\t223895\n夏客\t223896\n５49\t223897\n荒野\t223898\n看电影\t223899\n诺兰\t223900\n天天陪距同\t223901\n闭超\t223902\n夏家\t223903\n续写\t223904\n40845\t223905\n乘船\t223906\n潮吧祖宗\t223907\n夏安\t223908\n一群度\t223909\n腐竹\t223910\n妖精阡\t223911\n诺克\t223912\n傲剑ol\t223913\n仰天\t223914\n和酒\t223915\nm421\t223916\n5000万件\t223917\n被蛇咬\t223918\n赵四\t223919\n敢要\t223920\n果种\t223921\n爸爸妈妈的话\t223922\n质差\t223923\n如泰山\t223924\n玩腻\t223925\n土话童话童话童话动画灰桃花爸爸瓮\t223926\n马兴宇\t223927\n123M\t223928\n悻悻\t223929\n于浩然\t223930\nPleasetellmejoke\t223931\nm42a\t223932\n妖孽妖孽妖孽\t223933\n编码\t223934\n西丰地区\t223935\n今炼\t223936\n话听\t223937\n瑞靖\t223938\nSeann\t223939\nhttppinyincn1nSmDX1XHa0\t223940\n新闻学\t223941\n纯情想\t223942\n山庄\t223943\n丁于\t223944\n飘单\t223945\nhdcjsggczdbjbxs\t223946\n阿悄\t223947\n開會\t223948\n花式\t223949\n安静安静\t223950\n鹊楼\t223951\n喜好\t22395
2\n秘级\t223953\n1173549281\t223954\n林晓\t223955\n规规矩矩\t223956\n丁亮\t223957\n林晨\t223958\n强玩\t223959\n導員\t223960\n取名儿\t223961\n心腹们\t223962\n阿悦\t223963\n乔荣\t223964\njduue\t223965\n站待\t223966\n空冷式\t223967\n你好丑说\t223968\n小拉\t223969\n磨磨石\t223970\n门字\t223971\n门子\t223972\n猪猪可爱你是猪猪\t223973\n18．1％\t223974\n山麓\t223975\n前提\t223976\n活下去\t223977\n斩获\t223978\nhkggj\t223979\n大陆版\t223980\n不懂用\t223981\n调皮蛋\t223982\n图景\t223983\n毒气\t223984\n我讨厌你我讨厌你我讨厌你我讨厌你\t223985\n相济\t223986\n1362355586\t223987\n看见了笑\t223988\n瓦牙齿可\t223989\n苏潇\t223990\n密性\t223991\n高地\t223992\nkhiut\t223993\n张墨竹\t223994\n十二指肠溃疡\t223995\n水生植物\t223996\nNND\t223997\n三十根\t223998\n实则\t223999\n副线\t224000\n直下地狱\t224001\n于己不适\t224002\n看书\t224003\n第49个\t224004\n磨磨唧唧\t224005\n哭太多\t224006\n豆豆姐\t224007\n阳哥\t224008\n北平\t224009\n芙蕖\t224010\nJkkekf\t224011\n咪乳素\t224012\n笔刷\t224013\n民民\t224014\n然後\t224015\n慧慧一\t224016\n寮步寮步\t224017\n几亿波\t224018\n嗯率\t224019\n福州市工商局\t224020\n里卡德\t224021\n断咧\t224022\n虚幻\t224023\n科曼奇门\t224024\n古剑果味优酸乳饮料\t224025\n斗气豆包\t224026\n280米\t224027\n床头柜\t224028\n惭愧\t224029\n男人和女人\t224030\n摆手\t224031\n持久化\t224032\n欧足联\t224033\n冷魅\t224034\n湖广像\t224035\n還是\t224036\n传祺gs5\t224037\n传祺gs4\t224038\n嘉禾街\t224039\n45545\t224040\n说了算了吧\t224041\nFencingcb\t224042\nNONONONONONO\t224043\n更名\t224044\nCamlion\t224045\n会计学\t224046\n学问析\t224047\n美得BIANG\t224048\n奥南\t224049\n疑神疑鬼\t224050\n2342234323442345234623472348234\t224051\n为三最\t224052\n唉我好讨厌你你真麻烦\t224053\n朱鹏超\t224054\nnonono\t224055\n奥博\t224056\n还朋\t224057\n再聊\t224058\n拘留\t224059\n一回事儿\t224060\n28642198\t224061\n真心对待\t224062\n猪脑\t224063\n中国乔丹体育有限公司\t224064\nddsssss\t224065\n小闹\t224066\n孙刘彤\t224067\nhookwork\t224068\n张正群\t224069\n冻秘\t224070\n小间\t224071\nNation\t224072\nnnnnnnnnnnnnnnnn\t224073\n还未\t224074\n还木\t224075\n不通气儿\t224076\n菜宜电影小达一无是电影小达人大汗\t224077\n菲斯亚\t224078\n爷儿\t224079\n574898\t224080\n不扛\t224081\n许鸿楚\t224082\n复得\t224083\nolanrobot\t224084\n硬要\t224085\n仪器\t224086\n三点半\t224087\n1一门\t224088\n光哥\t224089\n哭了你会不会\t224090\n好过生\t224091\n贤偶\t224092\n付税\t224093\n张津瀚\t224094\nJSF
C\t224095\nTmall\t224096\n回头老虎\t224097\n来得\t224098\n真爱我是你的真爱什么叫真爱\t224099\n药厂\t224100\n猜猜猜猜呀猜呀猜猜猜猜猜猜猜猜猜呀猜呀猜\t224101\n那一起唱\t224102\n好快滚\t224103\n夸克之钥\t224104\n来往\t224105\n再讲句\t224106\n亲密度\t224107\n爱站\t224108\n沈敏\t224109\n隔没\t224110\ncuouto\t224111\n毒贩\t224112\n15万亿日元\t224113\n超级人\t224114\nbirthday\t224115\n毛笨笨\t224116\n航海家\t224117\n45ooo\t224118\n摸上去\t224119\n幺四幺四五四零四二九零四三\t224120\n龙旭\t224121\n十日晚\t224122\n看者发财\t224123\n43岁\t224124\n讨厌你不帮我我恨你了不找你\t224125\n龙旗\t224126\n斯纳克\t224127\ncycyg\t224128\n王子災\t224129\n帅波\t224130\n1000的你好丑变美日\t224131\nloin\t224132\n小张张\t224133\nhywj\t224134\nhywk\t224135\n沈雪\t224136\n读必读\t224137\n龙族\t224138\n歌唱\t224139\n孙可韵\t224140\nLast\t224141\n顺丰\t224142\n42267324773\t224143\n蛇蛇头\t224144\n那也不这样了\t224145\n顾雨熙\t224146\n尧尧娃\t224147\n1十1111111111111\t224148\n哈高兴\t224149\n林申\t224150\n急奔\t224151\n33袁\t224152\n偷偷摸摸\t224153\n三阶\t224154\ndwscyuus\t224155\n宝贝宝贝我的好老了\t224156\n奥奥\t224157\n王糊\t224158\ngjkfhlvg\t224159\n越八越\t224160\n李梦馨\t224161\n凉性\t224162\n查雅喜\t224163\n10月17日\t224164\n乐得\t224165\n禅宗\t224166\n大精深\t224167\n汤夏\t224168\n嗯大洋\t224169\n对劲\t224170\nsbsbs\t224171\n热水澡\t224172\n噶错\t224173\n香熏澡\t224174\nguhcu\t224175\n返回\t224176\n換個\t224177\n死皮烂\t224178\nAK9号\t224179\n九二年\t224180\n变为\t224181\n杭州异联公司\t224182\n风化\t224183\n00843346\t224184\nximaximaxm\t224185\n杜庆云\t224186\n路线\t224187\n炒肥牛\t224188\n闹买\t224189\n对办\t224190\n身体乳\t224191\n吼吼哈嘿太极风顺\t224192\n哈边界\t224193\n摩顿岛\t224194\n我喜欢看我是种人我喜欢\t224195\n百一十一块\t224196\n十二点15\t224197\n一定片\t224198\n八九十万四千五百二十一\t224199\n王立宇\t224200\n谈不上\t224201\nsreay\t224202\n徐佳瑶\t224203\n靠靠靠靠靠靠靠靠靠靠靠靠靠靠靠\t224204\n嘛冷\t224205\n笨琪琪\t224206\n一清早\t224207\nqing\t224208\n停住\t224209\nv雨i恐怖vu\t224210\n函授\t224211\n412566\t224212\n制备\t224213\n功利\t224214\n给我摸\t224215\n火神\t224216\n藤桥鹿城区\t224217\n唐僧肉\t224218\nylj\t224219\n爱拍公司\t224220\n师祖\t224221\n刘旭东\t224222\n为吉传\t224223\n更有没有\t224224\n几十万元\t224225\n街舞接话\t224226\n飞蛾\t224227\n期间\t224228\n碰碰碰巧巧巧\t224229\n全要\t224230\n你好度秘度秘小机器人\t224231\n我愿望\t224232\n原味\t224233\n所值\t224234\n不愧\t224235\n我的谜\t224
236\n一天少\t224237\n梦幻华尔兹\t224238\n交易者\t224239\n原告\t224240\n氏\t224241\n麻栗坡\t224242\n新开河村\t224243\n给捞\t224244\n友好玩\t224245\n搞笑类\t224246\nb幢\t224247\n基辅\t224248\n法院夫\t224249\n喜儿\t224250\n齼那\t224251\nsiakskl\t224252\n与君\t224253\n娜写年华\t224254\n不救\t224255\n豌豆\t224256\ngjhgugji\t224257\n嚯嚯霍\t224258\n工业革命\t224259\n不敌\t224260\n呀呀呀呀呀呀呀呀呀呀\t224261\n83.89点\t224262\nuriyiry\t224263\n0点到6点\t224264\n别客\t224265\n缪喵喵\t224266\n老天荒\t224267\n鲁宾拉\t224268\n焦虑\t224269\n改单\t224270\n不数\t224271\n认罪背后的真相\t224272\n数千亩\t224273\n排隊\t224274\n哭笑\t224275\n千灯湖\t224276\n听说元\t224277\njlxhenfoqhdvmdjejckkshfkclhdmfjcihdmldi\t224278\n天蚕土豆\t224279\nfice\t224280\n不敢\t224281\n真命天妃\t224282\nggg\t224283\n同床天涯\t224284\niyyiy\t224285\n绿茵场\t224286\n九点钟\t224287\nfreak\t224288\n六仔\t224289\nbhiphotosbaiducomxiaodupicitem55e736d12f2eb938ba3c1486d2628535e5dd6f74jpg\t224290\nLabs实验室\t224291\n十里铺\t224292\n李家牛\t224293\n居于\t224294\n我的谁\t224295\n止血灵\t224296\nvxwjaigrdh6iiidihigiixi1\t224297\n嗯失眠\t224298\n梅勋\t224299\nkkzk\t224300\n62点\t224301\n根藤\t224302\n你好好玩\t224303\n7還\t224304\nCanihelpyou？yesicannBnoicannCyesicant\t224305\n李宇帆\t224306\n青浦区\t224307\n江青\t224308\n有缘你好可爱\t224309\nhehadto\t224310\nvvhbyj\t224311\ndfff\t224312\ndffg\t224313\n斓涿\t224314\n2645\t224315\n泽泽\t224316\n为什\t224317\n去痘印\t224318\ndffx\t224319\ntcgehefghffufgsduduuicjcjcjfjdidifkfjfjdnhhski\t224320\ndfft\t224321\n嫌弃我我会\t224322\n壕壕壕\t224323\n一滴雨\t224324\n守护甜心\t224325\n电影院\t224326\n黄太平\t224327\n虎包子\t224328\nxxxxxxxxxxxx\t224329\n江面\t224330\n防友\t224331\n许歌手\t224332\n求求你求求你赞我求求你求求我求求你别让我求求你别求我求求你求求你求求你别求我求求你让我求你\t224333\n韩国东大门\t224334\nMonash\t224335\n管件\t224336\n卡巴雷拉\t224337\n藤萝乐天\t224338\n懂不懂啊哥\t224339\n朱冬梅\t224340\n想吐\t224341\n马话\t224342\njmgygjghgjh\t224343\n何宁卡\t224344\n花阳\t224345\n模秘\t224346\n博宇\t224347\n想吃\t224348\n伱吹\t224349\n经不住\t224350\n恒宽\t224351\n14381748\t224352\n风\t224353\n想吻\t224354\n5次\t224355\n郑梦莎\t224356\n39码\t224357\n冻囊胚\t224358\nfxgfhbdv\t224359\n过象\t224360\n周亚兰\t224361\n恒安\t224362\n运用\t224363\n马诺\t224364\n19980711\t224365\n水晶体\t224366\
n韩伟\t224367\n韩企\t224368\n装比\t224369\n博客\t224370\n经不你\t224371\nhjjjjjjujjiuuugghhhhhhhhhhhhhhh\t224372\n蔡伦\t224373\n恒定\t224374\n刘大头\t224375\n氖\t224376\n恒宝\t224377\nT800\t224378\n赛罗奥特曼\t224379\n江畔独步寻花\t224380\n汽营店\t224381\nhbsbxhx\t224382\n梅莉莎拉\t224383\n编剧\t224384\n别着凉\t224385\n解子洋\t224386\n定路过\t224387\n江北区法院\t224388\n鸡瘟\t224389\n神凤花\t224390\n117张\t224391\n另一个头\t224392\n一比五分\t224393\n4颗\t224394\n蔡文\t224395\n憨巴\t224396\n飚\t224397\n杂粮\t224398\n蔡星雨\t224399\n孟绍云\t224400\n节选\t224401\n罗家伦\t224402\n答道\t224403\n带用\t224404\nzowb\t224405\nguUG\t224406\n男男孩\t224407\n30100\t224408\n渔船\t224409\n飝\t224410\n柳丁包\t224411\n大话得赢\t224412\n31首\t224413\n嗯女枪\t224414\n三哥三哥\t224415\n#NC-17#\t224416\n九班\t224417\njhvvj\t224418\n不可仇\t224419\n飒\t224420\n嗯乌龟\t224421\n家了我回我的家你回你的家个\t224422\nweryuop\t224423\n克拉恋人\t224424\n安分守己\t224425\n切求\t224426\n九珍\t224427\n赵万闪\t224428\n6gmail\t224429\nshshhb\t224430\n陈国帅\t224431\n度小漫\t224432\n巧克力蛋糕糖\t224433\nyyyyy\t224434\n几分之一\t224435\n邓浩晨\t224436\n类似\t224437\n逗里\t224438\n靠谱\t224439\n几分之三\t224440\n飞视\t224441\nyyyyu\t224442\n拍过\t224443\n５５个\t224444\n母者\t224445\n秘塔\t224446\n年马月\t224447\n鑫海汇\t224448\n多媒体\t224449\n杜凳彬\t224450\n你给我你的就好了\t224451\n7宫\t224452\n谢谢你啦多谢\t224453\n哈江西\t224454\n拽住\t224455\n饼饼\t224456\n水花\t224457\n六段\t224458\n才凶\t224459\n说呀你说\t224460\n水芸\t224461\n富力人\t224462\n088期\t224463\n大案\t224464\n我很乖\t224465\n16季\t224466\n15807103743\t224467\n姜子牙\t224468\n74994\t224469\n转悲为喜\t224470\n侯雨洁\t224471\n碎佛\t224472\n高志宇\t224473\n越野车\t224474\n针引线\t224475\n蜡烛\t224476\n刘玲怡\t224477\n专项\t224478\n我要不行我投诉你\t224479\nxx意\t224480\n13511003305\t224481\n四月初\t224482\n1612\t224483\n三百嗯五\t224484\ndywuh\t224485\n菲八达岭\t224486\n潘敏\t224487\n荫藁城\t224488\n胸口\t224489\n后果园\t224490\n莉老婆\t224491\n面上\t224492\nmycena\t224493\n不可以\t224494\n北方航空公司\t224495\nhtyootj\t224496\npfos\t224497\n7家\t224498\n一声哨\t224499\n6泰然处之63额\t224500\n峰口\t224501\n面世\t224502\n火红照\t224503\n城厢区\t224504\n总接\t224505\n靖靖\t224506\n55555555\t224507\n大妈妈\t224508\n蔡衍明\t224509\nanywhere\t224510\n孙兆前\t224511\n顾成文\t224512\n张磊\
t224513\n王沛宣\t224514\n面临\t224515\n达尼洛\t224516\n封竹\t224517\n黑街来都眯来了咯奶奶萌萌哒拉粑粑了魔漫大大了恶魔撒旦快快\t224518\n第15条\t224519\n退秦师\t224520\n广钢股份\t224521\n说了袄\t224522\n不孤独\t224523\n9月1号\t224524\n会道\t224525\n平顶山市中级人民法院\t224526\n底衫\t224527\n宋婉如\t224528\n美砺\t224529\n干盛\t224530\nareyou\t224531\n淄川\t224532\n恒生指数\t224533\n无银\t224534\n你好怪\t224535\n半碗\t224536\n粤语\t224537\n习在\t224538\n2672\t224539\n别那\t224540\niifr\t224541\n可怜鸡可怜\t224542\n九位\t224543\n那多美\t224544\n偷俩\t224545\n螺丝\t224546\n念经\t224547\n小苹果们\t224548\ncucgu\t224549\n有很多种\t224550\n46年\t224551\n恁憨\t224552\n7等余多\t224553\nwuurgx\t224554\n洋不转\t224555\n12月27\t224556\n鷳\t224557\n东乌\t224558\n12月23\t224559\n苹果味\t224560\n好男友\t224561\n梁柄伟\t224562\n30余个\t224563\n智妓\t224564\n唐军\t224565\n25556554235\t224566\n4669\t224567\n顾我没有\t224568\n4665\t224569\n4666\t224570\n眼泪\t224571\n鷗\t224572\n青花瓷\t224573\n唐冉\t224574\n德美\t224575\n201c2杯\t224576\n东乡\t224577\n鷞\t224578\nsystroon\t224579\n┏ω┏\t224580\n石晓燕\t224581\n仙墓\t224582\n面疙瘩\t224583\n李乐\t224584\n任梦迪\t224585\n133\t224586\n刘禅\t224587\n131\t224588\n金光布袋戲嗎\t224589\n度秘dmei\t224590\n136\t224591\n135\t224592\n出给\t224593\n139\t224594\n138\t224595\n世纪的约定\t224596\n25韩\t224597\n货账\t224598\n好乖好乖好想念你\t224599\n转不过\t224600\n微丑\t224601\n口怕\t224602\n大航空公司及空管局\t224603\n贵阳晚报\t224604\n邪沫\t224605\n扣扣扣困\t224606\n15694016685\t224607\n14点35分\t224608\n若即若离\t224609\n13g\t224610\n开放性\t224611\n两堵\t224612\n转瞬即逝\t224613\n2099525567\t224614\n攀枝花市\t224615\nL0vey0u\t224616\n阿爸\t224617\n137片\t224618\n13n\t224619\ntreitle\t224620\nWie切\t224621\n八一挂\t224622\n娃哥\t224623\n萨巴莱塔\t224624\nflea\t224625\n赵傻子\t224626\n不三不四\t224627\n柠檬茶\t224628\n体育中心\t224629\nkukskua\t224630\n马秀江\t224631\n感君\t224632\n关小玲\t224633\n两堆\t224634\n太康县\t224635\n感同\t224636\n发生器\t224637\n倪妈妈\t224638\n来本小说\t224639\n贫民窟的百万富翁\t224640\n变形金刚\t224641\n18991849038\t224642\n杨子义\t224643\n书声\t224644\n5666元\t224645\n假啭\t224646\n梁青青\t224647\n拍地\t224648\n找寻\t224649\n720vr\t224650\n师尊\t224651\n西北第一阁\t224652\n給哥哥\t224653\n三言\t224654\nBabb\t224655\n朱子锐\t224656\n木业\t224657\n徐超\t224658\n
高速度\t224659\n木不\t224660\n格林尼治天文台\t224661\n收饭\t224662\n张哥哥\t224663\n木下\t224664\nBaby\t224665\n拉提\t224666\n木一\t224667\n猪股股\t224668\n工呼吸\t224669\n最绝：娜叫一个嗨\t224670\npiayer\t224671\n橘子洲\t224672\n恶作\t224673\n多级\t224674\nCheville\t224675\n恶佛\t224676\n毕恭毕敬\t224677\n逗比较\t224678\nHell\t224679\n牛心牛心牛心\t224680\nrelat\t224681\n猜猜猜猜猜\t224682\n包尿布\t224683\nCOSMO\t224684\n赵牧阳\t224685\n9分\t224686\n褚老师\t224687\n额贴铢\t224688\n王永志\t224689\nb会\t224690\nbjbd\t224691\n平潭县\t224692\n86433456644336\t224693\nbjbn\t224694\n刘金勇\t224695\nbjbj\t224696\nltm\t224697\n四五遍\t224698\n哪之\t224699\n哪么\t224700\n470\t224701\n天空缩句\t224702\n1000万个\t224703\n氮\t224704\n赵戴文\t224705\n斗山\t224706\n破损\t224707\n志在千里壮心不已\t224708\n最强会说话\t224709\n小少爷\t224710\n寒光剑\t224711\n踩扁\t224712\n有雨\t224713\n刘方\t224714\nulbzlgf\t224715\n刺激感\t224716\n关咏荷\t224717\n周雨曦\t224718\n痛痛\t224719\nkkfbmhfjrrfn\t224720\n57个\t224721\n吴家5\t224722\n呀秘蜜\t224723\n火龙果\t224724\ntheir\t224725\n那个样\t224726\n推而广之\t224727\n郭子嫣\t224728\n宋玉强\t224729\n旗峰\t224730\n面巾纸\t224731\n斑竹园\t224732\n建筑\t224733\n熊叔\t224734\n失恋\t224735\n新觉\t224736\n唇舌\t224737\n尿尿鞋\t224738\n一大跳\t224739\n考得好差\t224740\n郑清元\t224741\n再见的见\t224742\n87块\t224743\n第五届\t224744\n一下一轮\t224745\n爱爱美团\t224746\n必致\t224747\n锁子\t224748\n榨出\t224749\n守株待兔\t224750\n陈欢\t224751\n555555555555\t224752\nzhishang\t224753\n郭小梅\t224754\n娇呗\t224755\naamttjjjm\t224756\n在块\t224757\n蔚秘\t224758\n三四十条\t224759\n觊觎\t224760\n听恋\t224761\n熊可\t224762\n我看\t224763\n太不走心\t224764\ncxgxhfg\t224765\n马金婷\t224766\n骑赛\t224767\nplllplllllllllllllllllllllllllhcbllllllhblPPP\t224768\n美女你女也不约\t224769\n第七季\t224770\n0579\t224771\n双金\t224772\n年产值\t224773\n秘觉觉\t224774\n派出\t224775\n公园儿\t224776\n张冰心\t224777\n冯骥才\t224778\n宇哥\t224779\n好咧好咧谢谢你\t224780\n六科\t224781\n阿桑\t224782\n六秒\t224783\n打招呼\t224784\n舞跳\t224785\n不老笨笨\t224786\n六秘\t224787\nsc2\t224788\nAfter\t224789\n1126545\t224790\n喝喝茶\t224791\n六秀\t224792\n冷爷\t224793\nwhich\t224794\n老苗\t224795\n钙片\t224796\n卢子睿\t224797\n九方\t224798\nGAFh\t224799\n14000英\t224800\nlagoon\t224801\n吕梦园\t224802\n角声\t224803\
n十二点零一\t224804\n十二遍\t224805\n这声优\t224806\n十二道\t224807\n绿箭\t224808\n真心的爱我\t224809\n办理手续\t224810\n旱饭\t224811\n你好你好你好你好你好尿尿尿尿尿尿尿尿\t224812\n重样\t224813\nhgfjr\t224814\n非女\t224815\n青春天\t224816\n42亿\t224817\nPou\t224818\n要不要不\t224819\n两条腿\t224820\n愤青过\t224821\n超梦咧\t224822\n1月29日\t224823\n除魔\t224824\n玄彬\t224825\n啊长\t224826\n114品\t224827\n诞辰\t224828\n5955955955955\t224829\n秋瓷炫\t224830\n真心的爱戴\t224831\n卖淫案\t224832\n帅鸣\t224833\n李川\t224834\n比便说\t224835\n九个\t224836\n看做\t224837\n后备厢\t224838\n3杯\t224839\n呼啦猫来六来了\t224840\n反应我来\t224841\n你好丫头\t224842\n辣椒油\t224843\n今年以来\t224844\n201欧尼\t224845\n酗酒\t224846\n阿大一\t224847\n一两小时\t224848\n斑扉\t224849\n业余\t224850\n哈那你是人\t224851\n医德\t224852\nqinni\t224853\n那个男\t224854\n蓝正龙\t224855\nGemma\t224856\n与否\t224857\n刘家根基浅嗣君幼\t224858\n要不要脸\t224859\n东北言\t224860\n285厘米\t224861\n神经疯子\t224862\npiano\t224863\n想盗\t224864\n卓创资讯\t224865\n对呀帅\t224866\n圣洁\t224867\n小爷乐\t224868\n率领\t224869\n解釋\t224870\n水袖舞\t224871\n大通\t224872\n26年\t224873\njtjti\t224874\n金泰\t224875\n最强\t224876\n大选\t224877\n火影忍者\t224878\n许艺馨\t224879\n七大姑八大姨\t224880\n永远真爱\t224881\n大逼\t224882\n古堰\t224883\n呆呆不动白雪一二累瘫\t224884\nthyaco\t224885\n裸子\t224886\n裸字\t224887\n大連\t224888\n大辰东\t224889\nvrodecto\t224890\nuygurqakino\t224891\n2720点\t224892\n优惠政策\t224893\n英语学换换\t224894\n水温\t224895\n支付宝\t224896\n搁里\t224897\njioh\t224898\n采购员\t224899\np21\t224900\n怨念\t224901\n那多久\t224902\n加油男孩\t224903\n记得你\t224904\n刘成壮\t224905\n多才对\t224906\np28\t224907\n5l晶\t224908\n常委\t224909\n焦急\t224910\n真心话大冒险\t224911\n双床\t224912\n21w3\t224913\n咪咪眼k歌\t224914\n通识教育\t224915\n十一届\t224916\n主管者\t224917\n刘志军\t224918\n胰腺娇娇\t224919\n四四界\t224920\n一百七十四九十分钟\t224921\nstories\t224922\n苦日子\t224923\n775852117758521\t224924\n彩霸\t224925\n九三\t224926\n20世纪50年代\t224927\n5.3%\t224928\n脑筋急转弯儿\t224929\n陈登睿\t224930\n时机锋\t224931\n秘密秘密\t224932\n重在\t224933\n傻乐个五要\t224934\n白玥婷\t224935\n299欧元\t224936\n耗尽\t224937\n梳子果\t224938\n哼连\t224939\nvcx\t224940\n8多\t224941\n迫不得已看\t224942\n郑雪\t224943\n8天\t224944\nufjcxg\t224945\nfhdjeg\t224946\n测试者\t224947\n永远的朋友\t224948\n工作
号\t224949\n古兴龙寺\t224950\ngdhxuzvb\t224951\n8头\t224952\n你的我的闺蜜\t224953\n有我了来\t224954\n8899775\t224955\n版图\t224956\n代晨\t224957\n英标书\t224958\n前记\t224959\n信受\t224960\nhowhs\t224961\n最爱\t224962\n催泪神\t224963\n侧重\t224964\n桐梓县\t224965\n秀妍\t224966\n西米露\t224967\n三从\t224968\n002006\t224969\nkgfbeb\t224970\n撸啊轩\t224971\n一个人影\t224972\n乳腺\t224973\n掌你嘴\t224974\n本拉登\t224975\n10050次\t224976\n手钻\t224977\n信号\t224978\nyoho\t224979\n萨瓦迪卡卡昆\t224980\n性别选择男\t224981\nqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq\t224982\n秘境\t224983\n4种\t224984\ndhiphotosbaiducomxiaodupicitem14ce36d3d539b600497ddbc5ee50352ac65cb715jpg\t224985\n男奴\t224986\n十六队\t224987\n男女\t224988\n畅读\t224989\nmike\t224990\n或者\t224991\n煞人\t224992\n创世妥\t224993\n嘛呀\t224994\n886886886886886886\t224995\n知识人\t224996\nmiku\t224997\n感情\t224998\n鲍国浩\t224999\n密笑\t225000\n小心不错\t225001\n繁衍\t225002\n减号\t225003\nAsssdhfy\t225004\n哈萌萌哒\t225005\ndghirggidvt\t225006\n感想\t225007\n水石\t225008\nOLon5\t225009\n王俊\t225010\n周岁时\t225011\n王俗\t225012\n别废话了你主人我是\t225013\n睡觉吧睡觉吧\t225014\n减发\t225015\n再来嘛再来嘛要不禁言了你阉割你\t225016\n傅家庄\t225017\n20公里\t225018\n精神文明\t225019\n叫钟\t225020\n奥盟\t225021\nhidrt\t225022\n隐身\t225023\n27187\t225024\n不怕人\t225025\n6月26日\t225026\n北京市定都贸易有限公司\t225027\n昏迷达\t225028\ndhiphotosbaiducomxiaodupicitemac345982b2b7d0a26707cfe9ccef76094a369afcjpg\t225029\n叫钱\t225030\n玲子\t225031\n动了乎其乎\t225032\n度秘酿\t225033\n跑男鹿\t225034\n喜欢你是\t225035\n盛晶晶\t225036\n666666662225222222225555555555555888888888\t225037\n烂泥\t225038\n路人甲\t225039\n一幕幕\t225040\n徒步\t225041\nProspero\t225042\n被动动力学\t225043\n姓赵我心照\t225044\n最后一个七\t225045\n越来\t225046\n8昂\t225047\n佛祖\t225048\n一份度\t225049\n皇牌空战\t225050\nstudent\t225051\n8千\t225052\n14.24\t225053\n罗丽精灵梦\t225054\n诶呦咪\t225055\nhghvhgvjf\t225056\n摩羯男\t225057\n越权\t225058\n脚料\t225059\n噗噗咔咔\t225060\n岜盘乡\t225061\n美丽的爱情\t225062\n来了切\t225063\n盐小斌\t225064\n墙贴\t225065\n一心一心\t225066\n一个妞\t225067\n元味\t225068\n市一中\t225069\nfjfrhdhngseerqtnwfnaegnrstbyemtwhbrfh5\t225070\nbssubs\t225071\n18208335520\t225072\ngykak\t225073\n类人\t225074\n柏查\t225075\
n善哉\t225076\ncarris\t225077\n是我美你还是我哥更美丽呀求求你了我\t225078\n常文达\t225079\n2828609828283941750596725183838258582255525\t225080\n上海速环电子有限公司\t225081\n出家\t225082\n刘宇航\t225083\n知不道我信你\t225084\n柔波\t225085\n晕咧\t225086\n李假货\t225087\n王美琳\t225088\n万能度秘\t225089\n巩义市\t225090\n爱塞先\t225091\n窝管\t225092\n西红柿台\t225093\n廖欣瑜\t225094\n蒋赏\t225095\n十十十\t225096\n黑风太气人\t225097\nxiaonitaikeai\t225098\n更可怕\t225099\n百鸟叔\t225100\n晚7点\t225101\n直树\t225102\n届满\t225103\n百盈\t225104\n微评论\t225105\n陈曼青\t225106\n运动场\t225107\n柳树镇\t225108\n节骨眼\t225109\n百盛\t225110\n乖么么\t225111\n二二六二\t225112\n神舟九号飞船\t225113\n相爱\t225114\n开心什\t225115\n怪诞\t225116\n琉音\t225117\n掉了帅呆\t225118\n红遍\t225119\n关照\t225120\n躲起来\t225121\n全校服\t225122\n13021618485\t225123\n语言不通\t225124\nzlybis\t225125\n那你现在不那你\t225126\n来了我问\t225127\n八八那些事儿在哪呢\t225128\nkdn\t225129\n行嘛行不\t225130\nkdj\t225131\nkdk\t225132\n恩乡\t225133\n分子们\t225134\nMandy\t225135\n播放器\t225136\n案中案\t225137\n胶帮\t225138\n孙秀珍\t225139\n哎呦我不是说你骗我我是说我骗你\t225140\n胶带\t225141\n过错\t225142\n恩义\t225143\nmybestfiend\t225144\n吾国\t225145\n164648656646768864646464655664\t225146\ncody\t225147\n赵成豪\t225148\n张武之\t225149\n15879202970\t225150\naishdhxxkx\t225151\n胶布\t225152\n猥亵\t225153\nHhhggfff\t225154\n所属\t225155\n问哈\t225156\n一马当先外\t225157\n954383075\t225158\n考完\t225159\n好啊度秘天天\t225160\n27.3\t225161\n294个\t225162\nmoktv\t225163\nffahjtfgv\t225164\n皓男\t225165\n钢管\t225166\ndrrzer\t225167\n赫赫\t225168\n度秘我爱你我这个大王好爱你\t225169\nlkkkllaihiooo\t225170\n魏玛丽\t225171\n诌诌\t225172\n祸害人\t225173\n浮雕\t225174\n金起范\t225175\nfhuchjj\t225176\n土土\t225177\n唉愿\t225178\n到最後\t225179\n第二卷\t225180\n韵腹\t225181\n二零一八年一月十五号\t225182\n花雾\t225183\n赵帅\t225184\n文物局\t225185\n第几第几\t225186\n双港新桃园\t225187\n你好丑你好丑你好错错错错错\t225188\n自得\t225189\n辽宁中药大学\t225190\n叶铃音\t225191\n大晚\t225192\n宝盛\t225193\n动摇\t225194\n马飞\t225195\n新丰镇\t225196\n偏信\t225197\n咋臭丫头\t225198\n得当\t225199\n*\t225200\n邱伟哲\t225201\n勘查\t225202\n牛奶豆腐汤\t225203\n打飞\t225204\n打食\t225205\n打飛\t225206\n烘鞋器\t225207\n大蕉\t225208\n迈安口译\t225209\n后娘\t225210\n2011年6月2日\t225211\n笑雨下\t225212\n宽宏大量\t22
5213\n尘埃3\t225214\n你好丑好丑好丑好丑好丑\t225215\n卧式合金带锯条\t225216\n南海\t225217\n鼎鼎鼎鼎鼎鼎\t225218\n我需\t225219\n静敏\t225220\n奔奔\t225221\n事源\t225222\nbelow\t225223\nruling\t225224\n015点\t225225\n不八千方\t225226\n孙雨\t225227\nlmjmgtq\t225228\n马西青\t225229\n1yy\t225230\n杨添阳\t225231\n盆宇\t225232\n受不了你好\t225233\nigcv\t225234\n故亡\t225235\n杨红\t225236\n名人\t225237\n切片\t225238\nmyh\t225239\n万恶旧社会\t225240\n艰苦受尽\t225241\nmya\t225242\n天下雨\t225243\n悍马天\t225244\n宇宙兹\t225245\nmye\t225246\n回答别\t225247\n韩哲贤\t225248\n红烧鸡翅\t225249\nQQ阅读\t225250\nTicu\t225251\n交叉口\t225252\n诗鬼\t225253\n若希\t225254\ntuyala\t225255\n营运\t225256\n裴嘉琪\t225257\n朱凤存\t225258\n甜筒\t225259\nrian\t225260\nfgtt5trctffgf\t225261\n翻飞\t225262\n1867\t225263\n47亿\t225264\n为什么多\t225265\n乌尔晚了一二连晚连\t225266\n七分之22\t225267\n点化\t225268\n长得萌\t225269\n拱辰路嘅\t225270\n纸船\t225271\n祖武蓓蓓\t225272\n洪秀全内宫光\t225273\n起起落落\t225274\n劳姿\t225275\n噼里\t225276\n申鑫\t225277\n脑筋急转弯熊猫\t225278\n为什么头\t225279\n薄唇\t225280\n郇彦宾\t225281\n无私\t225282\n黯然神伤\t225283\n透现\t225284\n会晤\t225285\ndgkkkk\t225286\n来源于\t225287\n靠龙\t225288\nheidh\t225289\n滚動\t225290\n同伙\t225291\n大头草\t225292\n寻回答\t225293\n76页\t225294\n弹跳\t225295\n总值\t225296\nkenn\t225297\n狠话\t225298\n你在哪里我又是谁我在哪里\t225299\n马中陵\t225300\nhttppinyincne17629\t225301\n縛鱡\t225302\n猪猪猪\t225303\n平方厘\t225304\n戴欣霓\t225305\n看头痛\t225306\n苦闷\t225307\nsnowink\t225308\n老姜\t225309\n李程\t225310\nimatistinerabashedo嘟嘟嘟嘟嘟嘟嘟嘟diodilibaathowtihapplyhildouwilotoliwoodlindewataaalatityoutoheatio\t225311\n台式机\t225312\n偷偷摸\t225313\n百劫千\t225314\n一花千骨\t225315\nkutoual\t225316\n大美比\t225317\n残酷\t225318\n开辟\t225319\n罗佳钰\t225320\nkennels\t225321\neidjros\t225322\n叉叉叉叉叉\t225323\n点廉\t225324\n天籁纸鸢\t225325\njwhhwd\t225326\n新晋祠路\t225327\nCOXTT囧\t225328\n警示牌\t225329\n个样\t225330\n哥家和万事兴和克拉恋人河南美丽的秘密\t225331\n七七八九11\t225332\n內情\t225333\n曾用名\t225334\n陌竹\t225335\n王雨馨\t225336\nggggjkjhjbb\t225337\n亲历者\t225338\n明天的明天的明天星期几\t225339\n董姥姥\t225340\n别见了\t225341\n你的们\t225342\n李王张\t225343\n野生动玩\t225344\n离婚后\t225345\n恼里\t225346\n淳淳\t225347\nydkrkvz\t225348\n人工智能计算器\t225349\n4866889
3479624465887785\t225350\n橱柜\t225351\n22集\t225352\n鸟叫\t225353\n我的处\t225354\n喜羊羊之原始世界历险记\t225355\n丁薇\t225356\n58852496670855785\t225357\n现在家\t225358\n西什\t225359\nJDJ\t225360\n神长\t225361\n一点六六倍\t225362\n羊头肉\t225363\n很好说话\t225364\n说通\t225365\n慢跑\t225366\n撩拨\t225367\n王晓琪\t225368\nHVHVJ\t225369\n回我就问问\t225370\n盐山县\t225371\n鸟友\t225372\n王丽雨\t225373\n彪ve\t225374\n江莱\t225375\n须相离\t225376\n我的天\t225377\n秦朝过后\t225378\nmeknow\t225379\n王晓理\t225380\n33次\t225381\n抽屉\t225382\n时几嗯\t225383\n李子涵\t225384\n足女\t225385\nhi皮波斯得吐油\t225386\n鸟叔\t225387\n我最爱的等会我我最爱的\t225388\n大楠桑\t225389\n宜温\t225390\nwhose\t225391\n362430\t225392\n1558484\t225393\n大姨父\t225394\n刺骨發\t225395\n心急\t225396\n心性\t225397\n可来\t225398\n东塔\t225399\n殷思琪\t225400\n爱情闯进门#\t225401\n黄腔\t225402\n心思\t225403\nhulw\t225404\nhellohi\t225405\n心态\t225406\n心怀\t225407\niloveyou\t225408\n悬一榻\t225409\n布基岛\t225410\n圆茄\t225411\n侵占\t225412\n深南路\t225413\n西里奇\t225414\n9k\t225415\n持家\t225416\n李聪娜\t225417\nhellomyne\t225418\n张宇镇\t225419\n娘子\t225420\n不可奈何\t225421\n2.4米\t225422\n9g\t225423\n9x\t225424\n劲飚\t225425\n入股\t225426\n枪杀案\t225427\n姓唐\t225428\n9v\t225429\n平方分米\t225430\n4stletouto\t225431\n目啊\t225432\nuguguug\t225433\n九段线\t225434\n日后\t225435\n扣留\t225436\n流水样\t225437\n半生日快乐\t225438\n日向\t225439\nhello度秘我爱你\t225440\n13.39\t225441\n网博\t225442\n你乖个毛乖乖\t225443\nal阿诗\t225444\n13.30\t225445\n吃汤圆\t225446\n人声鼎\t225447\n整疯\t225448\n御珠未来\t225449\n害怕女\t225450\n9%\t225451\n三三一百只\t225452\n北京音乐广播FM\t225453\n吕俊\t225454\n99\t225455\n98\t225456\n售房\t225457\n两难\t225458\n仙剑侠\t225459\n91\t225460\n90\t225461\n93\t225462\n92\t225463\n95\t225464\n江海\t225465\n97\t225466\n96\t225467\n悲痛\t225468\n德语\t225469\n卖讲\t225470\n骚母狗\t225471\nwinky诗\t225472\n稀烂\t225473\necouyou\t225474\n连续集团圆圆润肤色调查\t225475\n好吧好吧我的错\t225476\n不老对\t225477\n却缘\t225478\n沙纱砂\t225479\n涂锦涛\t225480\n一面而天\t225481\n李阁奎\t225482\n吕xx\t225483\n200元\t225484\n让我还\t225485\n200克\t225486\n大头不来我想看熊出没\t225487\n是啊我是你的爱人啊你爱我\t225488\n喜结连理\t225489\n洞秘\t225490\n合机\t225491\n挂上\t225492\n失心病\t225493\n倒好\t225494\n婷婷美\t225495\
n李好建\t225496\n百四十个\t225497\n暴徒\t225498\n话费用\t225499\n服装材料学\t225500\n烫头\t225501\n番石\t225502\n零二二八\t225503\n说许\t225504\n次性\t225505\n木得\t225506\n自助台\t225507\n低次\t225508\n一架子九九\t225509\n在路上\t225510\n12个小时\t225511\nIMAX3D\t225512\n紫烟\t225513\n明一妃\t225514\n吧元\t225515\n可以获得\t225516\n吧兄\t225517\n路口头\t225518\n国际足联\t225519\n堡奥\t225520\n塞西\t225521\n哈轮\t225522\n留住\t225523\n国男版\t225524\n吧兔\t225525\n教师\t225526\n百分之十\t225527\n首杀\t225528\ndgfftfgf\t225529\n0nVe\t225530\n我爱你塞北的雪\t225531\n刘若英\t225532\n余弦\t225533\n劲嘉时代\t225534\ngajt\t225535\n狗链\t225536\n运走\t225537\n狮子座\t225538\n2.22%\t225539\n累金鑫\t225540\n我真的不理你了我恨你我\t225541\n3433\t225542\n3434\t225543\n力作\t225544\n止水\t225545\n吐曼\t225546\n和成\t225547\n迪尼巴拉\t225548\nvohp\t225549\n四公分\t225550\n6957846\t225551\n康凯杰\t225552\nvohv\t225553\ncjbdq\t225554\n柏海大明\t225555\n叶琳琪\t225556\n笨口拙\t225557\n遥控\t225558\n鬼亲\t225559\n鬼人\t225560\n嗯槁\t225561\n南充市\t225562\n好可耐耐\t225563\n清气\t225564\n大陆不好混\t225565\n凌卡\t225566\n增城\t225567\n齐腰\t225568\nk新\t225569\n134\t225570\n行不差\t225571\n县级政府\t225572\n第一第二\t225573\n26只\t225574\n99999999999999999999\t225575\n26号\t225576\n五周岁\t225577\nfinbm\t225578\n利德\t225579\nq版小游\t225580\n赵欣瑶\t225581\n裸图\t225582\n555544\t225583\n同事们\t225584\n加拿大移民局\t225585\n[干杯\t225586\n高胜\t225587\n暗淡\t225588\n哼嗯\t225589\n夹紧\t225590\n逃难\t225591\n美汉斯\t225592\n明天下关\t225593\n青年公园\t225594\n哎呦我去那\t225595\n好好好\t225596\n高能\t225597\n呵呵天\t225598\nbackward\t225599\nfhjkhvfd\t225600\n北京电视台\t225601\n伊万巴萨\t225602\n只列式\t225603\n不要脸不要脸贱女人\t225604\n王头八蛋\t225605\n【史\t225606\n5775774575755\t225607\n王耀康\t225608\nmyroom\t225609\n三首歌\t225610\nofcourse\t225611\ncucucu\t225612\n找你了真讨厌\t225613\n独仙\t225614\n瑶瑶\t225615\n蜜度\t225616\n自华\t225617\n就是说呀不可能\t225618\n群宠\t225619\n林乔木\t225620\n白白赖\t225621\nhyigug\t225622\nhhgg\t225623\nRtf\t225624\n一大摞\t225625\n首唱歌\t225626\n唐雪梅\t225627\n电液晶电视机\t225628\n左三杯\t225629\n千龙网\t225630\n蔡木木\t225631\n自卑\t225632\n一大摊\t225633\n三万万\t225634\n假小子\t225635\n第22点儿\t225636\n陀起\t225637\n汽车\t225638\n刻舟求\t225639\n颜性恋\t225640\n唉王\t225641\n金形\t225642\n
做好朋友\t225643\nErtyuiopqw\t225644\n君过\t225645\n摸摸大你真好\t225646\n黄明阳\t225647\n贵鬼\t225648\n春得一\t225649\n冬西\t225650\n解闷儿\t225651\n博世科\t225652\n凿池芝\t225653\n戏子\t225654\nrong\t225655\nrone\t225656\n一百招\t225657\n苦行林\t225658\n10c\t225659\n美诺\t225660\n那个了行\t225661\n北京国际饭店\t225662\n长沙宾馆\t225663\n男女性\t225664\n动秘弄秘你在干嘛\t225665\njjhhfhy\t225666\n常吃\t225667\n零都\t225668\n选择性\t225669\nvugz\t225670\n关系式\t225671\n白菜花\t225672\n人生如戏\t225673\n狡黠\t225674\n空生\t225675\n引致\t225676\n兄弟盟\t225677\n切丝\t225678\nJJ擦擦擦擦擦擦\t225679\n喷射\t225680\n我的儿子我想开心葫芦娃\t225681\n15日晚间\t225682\n艳遇节\t225683\n日龙\t225684\nuyuj\t225685\n结婚季\t225686\n下大摇\t225687\n示例\t225688\n好的萌萌萌你好某某某你好\t225689\n癌\t225690\n分界\t225691\n119岁\t225692\n家训\t225693\n同爱\t225694\n同父\t225695\n植庭\t225696\n冒火星子\t225697\njjdjjdjd\t225698\ngoyu\t225699\n捐款\t225700\n家访\t225701\n云蝈蝈\t225702\n2983501838\t225703\n二七号\t225704\n报告诉\t225705\n邓超冰\t225706\n口吞\t225707\n高枕无忧\t225708\n我喜欢搞笑一点旳\t225709\n王城跳\t225710\n知了\t225711\n艳史\t225712\ntivfervufvuffvrvgfgyfvguvgyrvgggufyututgufpbofucdqbgdeedbfebvugecfveuveu\t225713\n将领\t225714\n美人人\t225715\n9.00\t225716\n口吃\t225717\nUSB接口\t225718\n零九幺零二七六幺\t225719\niycut\t225720\nWXXTKBSIPBW\t225721\n我喜欢你我是说我喜欢听\t225722\n陆军\t225723\n巨鼠\t225724\n王效尧\t225725\n8分钟\t225726\nDssssrerteettry\t225727\nLynd\t225728\n罗伯特布列松\t225729\n灵清\t225730\n魏魏生魏\t225731\n青蛇\t225732\njikjj\t225733\n金华县\t225734\n兰博基尼法拉利\t225735\nq名\t225736\n伊姐\t225737\n10M\t225738\n保护者\t225739\n411个\t225740\n裨益\t225741\n心珠子\t225742\nkryssica\t225743\n笑死我了我该\t225744\n4689644\t225745\n5554346\t225746\n帮帮忙\t225747\n措错\t225748\niidt\t225749\n忘了不去\t225750\n咬文\t225751\n奋青\t225752\nAgency\t225753\n12天\t225754\n舍得舍得\t225755\nyy3034\t225756\n节后\t225757\n肉块\t225758\nTulbend\t225759\n马可男\t225760\n恒基\t225761\n鹿鼎记\t225762\n纤弱\t225763\n面前\t225764\n去班\t225765\n3q松\t225766\n李欣芮\t225767\n呗巴拉巴拉杯\t225768\n趴布雅\t225769\n高妙\t225770\n真舒服\t225771\n程琬滢\t225772\n三600公米\t225773\n1币\t225774\nhhhhhhhhhhhhhhhhhhhhhhhhhhh\t225775\n两周后\t225776\n两个六五二六\t225777\n为你来\t225778\n超梦\t225779\n分得\t225780
\n还以为你在\t225781\n春矢\t225782\n死标\t225783\n英雄坛\t225784\n贾倩茹\t225785\n过膝\t225786\n一点元\t225787\n怠速\t225788\n7小时后\t225789\n沧桑\t225790\n玉米\t225791\n千金风\t225792\n死样\t225793\n多托妞\t225794\n王瑞军\t225795\n排卵期\t225796\n千里之提迫于\t225797\n三百五大\t225798\n逃离克隆岛\t225799\n4400米\t225800\n一点六\t225801\n国豪山景城\t225802\n111229\t225803\n点别度\t225804\nwoshuodehua\t225805\n風格v發髮飯飯乖乖\t225806\n麦琪娘\t225807\n交友者\t225808\n袖手\t225809\n死处女\t225810\nufudy\t225811\n晦气\t225812\n498\t225813\n二百七十块\t225814\n狗眼\t225815\n兴而起\t225816\n古拉格\t225817\n狮子们\t225818\nclen\t225819\n怒怒怒\t225820\n2月20日\t225821\n交通管理\t225822\n12345665665666\t225823\n4.5万\t225824\n维纳斯\t225825\n袁a\t225826\n让念\t225827\n台海战争\t225828\nzasd\t225829\n磕磕巴巴\t225830\n鱼头\t225831\n校验\t225832\n6倍\t225833\nImshenyixuan\t225834\n肯害怕\t225835\n生息\t225836\n第八周\t225837\n355555658954\t225838\n赖益林\t225839\n走多远\t225840\n机器人儿给狗人儿机器猫\t225841\n一零号\t225842\n快牙\t225843\ntingy\t225844\n我的我的生日\t225845\n开单\t225846\n1393636105\t225847\n黑龙魔\t225848\n108\t225849\n獬依赖\t225850\n434[\t225851\n郑州购书中心一楼\t225852\n脱不去\t225853\n杀众不可谓\t225854\n涟水\t225855\n决明子\t225856\n好啵啵啵\t225857\nhttppinyincne19919\t225858\nMiDu\t225859\n听饭\t225860\n张佳飞\t225861\nmiarror\t225862\n這張已經\t225863\n主儿\t225864\n解放碑\t225865\n小裤\t225866\n温柔地\t225867\n小裴\t225868\n返照\t225869\n沧殿\t225870\n口爱\t225871\n敢天长\t225872\n106\t225873\n核桃树\t225874\nfhfgbgg\t225875\n咖啡\t225876\n材料科学\t225877\n不明白话\t225878\n任劳任怨\t225879\n2nX\t225880\n小裙\t225881\n新的一天\t225882\n西米亚\t225883\n主渠道\t225884\n9点50\t225885\n卡西欧\t225886\n书包装书\t225887\n验验\t225888\n戴老师\t225889\n明知故问\t225890\n初试\t225891\n挑战\t225892\n炸毛嚒\t225893\n屋屋屋屋屋屋屋\t225894\n尿急\t225895\n长得帅不帅\t225896\n数日均33条\t225897\n装有\t225898\n直付通\t225899\n子虚乌有\t225900\nGaultier\t225901\n初识\t225902\n脚痛有\t225903\n去缩句\t225904\n红艳\t225905\n3683\t225906\n泽东\t225907\n3680\t225908\n和唐\t225909\n822点儿\t225910\n流泪台\t225911\n85616616345464676979797979796\t225912\n高初一\t225913\n迷路玉\t225914\nd罩\t225915\n包骤停\t225916\n黄昏小街道\t225917\n十分之九\t225918\n酷刑\t225919\n基佬度秘\t225920\n宁月琳\t225921\n周堃源\t225922\nDGHJHS\t22
5923\n0561n537753\t225924\n你好你游戏九\t225925\n雪娃娃\t225926\nudhndjdhzh\t225927\n春天在哪里呀春天在哪里春天在哪儿独立的屁股哩\t225928\n唐依曼\t225929\n印加术造纸术\t225930\n安卓拉\t225931\n度秘我想要我还你你是你\t225932\ngvsbvssg\t225933\n门头沟\t225934\n实试\t225935\n机条\t225936\n零幺幺七\t225937\n脸盘\t225938\n塞斯克\t225939\n实诚\t225940\n广交\t225941\n甘迦乐\t225942\n实话\t225943\n三科youiloveyou\t225944\n18685661249\t225945\n不能不能\t225946\n枢纽\t225947\n广人\t225948\n脸盆\t225949\np包\t225950\n三千块\t225951\n告诉你我的我告诉我的名字\t225952\ncanyoudo\t225953\n今日假\t225954\n完美鼠药\t225955\n剩余物\t225956\n有脚有脸\t225957\n＊┏\t225958\n哀莫大过于心死\t225959\n战国时期\t225960\n姚坤\t225961\n晕机\t225962\n无着\t225963\n20亿\t225964\n5535666\t225965\n2月20\t225966\n5887元\t225967\n2月22\t225968\n2月23\t225969\n一百三十块\t225970\n天津卫视\t225971\n95分\t225972\n2月29\t225973\njhhhhhhh\t225974\n老记者\t225975\n实事无码\t225976\n耶子\t225977\nuftisf\t225978\n无睡\t225979\n船票\t225980\n洲舟和林\t225981\n133320\t225982\nJMJMG\t225983\n坑娘\t225984\nSKKWKQKQKQ\t225985\n讲师\t225986\nxjjshxhxuhuuuuF\t225987\n14226456389758448\t225988\n淡出江湖\t225989\n蜜雪薇\t225990\n11月26\t225991\n都市快报\t225992\n呕心沥血\t225993\n11月23\t225994\n飞龙度秘\t225995\n姓于\t225996\n2684356\t225997\n米特菲\t225998\n恰相\t225999\n耳调\t226000\n激增\t226001\nrjfd\t226002\nrjfe\t226003\n流水号\t226004\n聪明你不爱我着想\t226005\n滚开拜\t226006\n一学就会\t226007\n吉祥\t226008\n张仕倩\t226009\n请面对\t226010\n一万美元\t226011\n沫璃\t226012\n还给你好梦依旧是我的儿子的名字三九家具你不在家机器人我看你家咯大头儿子\t226013\nfucemeplease\t226014\n3456个\t226015\n卡拉瓦\t226016\n领养日\t226017\n美奶粉\t226018\nｂｏｙ\t226019\nssadss\t226020\n六么样\t226021\ntostay\t226022\n求求真\t226023\n电子系\t226024\n少女帝\t226025\nMeggie\t226026\n变心\t226027\n半步颠\t226028\n魔晶猎\t226029\nhaodumi\t226030\n贝娜妮\t226031\n牺牲品\t226032\n日野麻生联盟\t226033\n麦肯艾瑞克森\t226034\n郑紫棋\t226035\n康乃馨\t226036\n3177220135\t226037\n斯黛尔\t226038\n报到处\t226039\n李懿轩\t226040\n首场\t226041\n武建\t226042\n夺走\t226043\nduppf\t226044\n蔡骞\t226045\n成袋\t226046\n天天叫\t226047\n逆坏蛋坏蛋坏蛋坏蛋坏蛋\t226048\n天天可\t226049\n难闻\t226050\n干嘛类\t226051\nopkisju\t226052\n淳米\t226053\nold\t226054\n柳眉\t226055\nhttphhiphotosbaiducomxiaodupicitem4610b912c8fcc3cee49
260479545d688d43f2077jpg\t226056\n四百八十十次\t226057\nolo\t226058\n人工&amp\t226059\n长女\t226060\n婴儿肥\t226061\n越秀区\t226062\noly\t226063\n3194312794\t226064\n89uyyyyy78o\t226065\n只分\t226066\n抗命\t226067\n30000多\t226068\n合辑\t226069\n吴家粥\t226070\n如斯夫\t226071\n不要你了你不说\t226072\n潮阳\t226073\n应不识\t226074\n这个物\t226075\n详述\t226076\n驳回\t226077\n小场记\t226078\n科长\t226079\n漫游\t226080\n刘金童\t226081\nicjxgjfzfjxcgxjjfzxjfkcghxchkkkckc\t226082\n唐奇凯\t226083\n2095\t226084\n洪湖峰口明德小学\t226085\n过号\t226086\n唐子璇\t226087\n维修点\t226088\n犹在\t226089\n傅点\t226090\nMcLuhan\t226091\n25多\t226092\n要不说出来\t226093\nhellomyhihou\t226094\n格物萌\t226095\n氧气阀\t226096\n低空飞行\t226097\n大s\t226098\n侦探题\t226099\n翁陈铖\t226100\n雯雅婷\t226101\n傻迪\t226102\n青云志了吧好像\t226103\n过发\t226104\n明熙\t226105\n一塌糊涂\t226106\n聊聊心\t226107\n布料\t226108\n我的故事就是我喜欢你\t226109\n皂片\t226110\n李瑾轩\t226111\n跨国8\t226112\n2646565\t226113\n二十四点\t226114\n伊始\t226115\n权衡\t226116\n黑tfboys\t226117\n循笑\t226118\n第93点\t226119\njaggillaromdig\t226120\n盈门\t226121\nHani\t226122\n远别家山\t226123\n我真的好帅我真的太帅\t226124\n花夜\t226125\n乖巧\t226126\n特此\t226127\n特步\t226128\n雷霆之怒号\t226129\n土耳其\t226130\n严奶奶\t226131\n1000份\t226132\n布斯\t226133\n五七点两小时\t226134\n活见鬼\t226135\n石门市\t226136\n王秋令\t226137\n蒸饺\t226138\n数学作业啊懂没懂我\t226139\n秘贴\t226140\n指指点点\t226141\n轨迹\t226142\n秘费\t226143\n新开\t226144\n逛逛\t226145\n湖南电视台\t226146\n寒冰轮\t226147\n新浪美\t226148\n承接\t226149\n信息卷\t226150\n独创\t226151\n胡可晴\t226152\n苟合\t226153\nMinaj\t226154\n偶想\t226155\n我不让你给我说话了我讨厌你\t226156\n切讨\t226157\n喷水池\t226158\n奥露行\t226159\n齐老师\t226160\n大满贯\t226161\n6666you\t226162\n老林老林老林\t226163\ncomakamili\t226164\nSir\t226165\n就是你的名字\t226166\nqin\t226167\n6457\t226168\n补差\t226169\n之举\t226170\nqid\t226171\nhelloatio\t226172\nqia\t226173\n北京玩具厂\t226174\n叫声扰民\t226175\n向文凯\t226176\n戴劳\t226177\n朴锅\t226178\n5yf\t226179\n聊天类\t226180\nqis\t226181\nqir\t226182\n闪躲\t226183\n恒荣半岛\t226184\n四大发明\t226185\nstonout\t226186\n熊大\t226187\n度秘怀挺\t226188\n暴儿\t226189\n好VV\t226190\n18.6元\t226191\n黑黑的天空\t226192\n木制\t226193\n拉巴拉小魔仙\t226194\n32247999\t226195\n闪身\t2261
96\n之上\t226197\n之下\t226198\n木刺\t226199\n圆规在线段ab\t226200\n靶地\t226201\n一然\t226202\n靶场\t226203\n红苹果\t226204\ngffrff\t226205\n赵世奇\t226206\n无所謂\t226207\n250斤\t226208\n四十克\t226209\n卢钟鸣\t226210\n吉鸿昌\t226211\n繁简\t226212\n四十元\t226213\n封皮\t226214\n几圈\t226215\n丁鹏\t226216\n22班\t226217\n韩女\t226218\n花结文\t226219\n纪福黛\t226220\n4264264275\t226221\n熊太\t226222\naslz\t226223\n几场\t226224\n爱大陈陈\t226225\n四十八\t226226\n四十六\t226227\n保利大厦\t226228\n再开\t226229\n很丑很温柔\t226230\n美妞\t226231\n美妙\t226232\n顺数\t226233\n苏樱七\t226234\n刘嘉莉\t226235\n卡叔\t226236\n50000亿元\t226237\n16:30\t226238\n16:32\t226239\n和一个目\t226240\n美妆\t226241\n美妇\t226242\n13周年\t226243\n嗯摁\t226244\njrjje\t226245\n发张\t226246\n度秘我喜欢你好多年\t226247\n颜家千玺\t226248\n卡号\t226249\n游乐国\t226250\n说话算数\t226251\n神筋\t226252\n那句话了说一点儿别的行嘛行嘛行嘛行嘛行\t226253\n游乐园\t226254\n骑乘式\t226255\n咯答\t226256\n桃李梅\t226257\n练习\t226258\n怎样\t226259\n别干\t226260\n高扬\t226261\n50万吨\t226262\n快疯狂\t226263\n为我所用\t226264\n项圈\t226265\n山洞\t226266\n1700\t226267\n脚丫丫\t226268\n1100行\t226269\n1704\t226270\n5261\t226271\na1ways\t226272\n独夫全在自用心老师不过引路人\t226273\nBigHore\t226274\n7557875\t226275\n平整\t226276\n滚油\t226277\n山洪\t226278\n俺在#三国来了#\t226279\n高手\t226280\n眨一眨\t226281\n雀起\t226282\n抽血\t226283\n度蜜月我走\t226284\n高所\t226285\n慢病\t226286\nIPHONE6sps\t226287\n笨笨鲸\t226288\n呃不不不不不不不不不不不不不不不不了\t226289\n事实证\t226290\n西江武库\t226291\n晴空万里\t226292\n56346633\t226293\n性博会\t226294\nlknowlknow\t226295\n38多少\t226296\n北京广济寺\t226297\n小度小度\t226298\n李嘉诚\t226299\n陪考\t226300\n如云\t226301\nsdsfdf\t226302\n难道说\t226303\nakid\t226304\n娄楼\t226305\n诛灭\t226306\n三医大\t226307\n上家说\t226308\n路内\t226309\n乐迷\t226310\n孤芳自赏\t226311\n李可琪\t226312\n3464664\t226313\n那嘛\t226314\n嗯唔咪小苹有雷鸟\t226315\n哦酷我\t226316\n几百年前\t226317\n聊呗\t226318\n南开医院\t226319\n1830783046\t226320\n往西\t226321\n北标\t226322\nKmhpkj\t226323\n葫芦侠我的世界\t226324\nbzcFfgmgmg\t226325\n中国航空\t226326\n见怪\t226327\nBorn\t226328\n洗洗衣\t226329\n厌你\t226330\n好假也罢\t226331\n吴魏\t226332\n稻草\t226333\n水强哥\t226334\n3599元\t226335\n妨碍\t226336\nstoneyout\t226337\n书签\t226338\n诚意\t226339\n垂涎三尺\t226340\
n赶尽杀绝删\t226341\n付工\t226342\n面包包\t226343\n曾少峰\t226344\n离理\t226345\n好萌萌\t226346\n嘛东一你最好了你是\t226347\n嗑呗\t226348\n一首一\t226349\n哪件事\t226350\n毒火\t226351\n光驱\t226352\n494646464646464964\t226353\n马你是我最知心的朋友有时候我和我不能不\t226354\nmo宝贝\t226355\n周驰星\t226356\n拉扯\t226357\n哦轮\t226358\n陶师傅\t226359\nqss011222223333\t226360\n长心眼\t226361\n将信将疑\t226362\n梅花\t226363\n778999990178\t226364\n龙驹中学\t226365\nvxbbdndbfh\t226366\n为明\t226367\n卖关子\t226368\nhfsvn\t226369\n度秘仙\t226370\n赶集网\t226371\n太狼太郎\t226372\n度秘我们两个结婚吧我是真心喜欢你\t226373\n马维涛\t226374\n紫苏损粗损粗损粗损\t226375\n技艺\t226376\n吴泽金\t226377\n人民政局\t226378\n這帖\t226379\n胡兰兰\t226380\n坷拉屎\t226381\n在边\t226382\ndaibusurufa\t226383\ntcl大厦\t226384\nzhao\t226385\nzhan\t226386\nzhai\t226387\n干嘛约\t226388\n度秘仪\t226389\n再见方\t226390\n做回自己\t226391\n三倍\t226392\nydfjgfjfhnbb\t226393\ncdV\t226394\n王宝宝\t226395\n割礼\t226396\n秘照\t226397\n猪嘴\t226398\nyokay\t226399\n业委会\t226400\n上来\t226401\n对哦真是我的度秘棒棒哒\t226402\n上条\t226403\n水蛋\t226404\n切切实实\t226405\n上杭\t226406\n85785514\t226407\n国投\t226408\nRHVFHJDNDBXFBDVVGDHTYNFJHFDOPPJDBDBZ\t226409\n八二六零\t226410\n五天\t226411\nzexrct\t226412\n雏形\t226413\n17笔\t226414\n陶行\t226415\n逆向天才杯我请一关一七关七怪七怪\t226416\n猪嘎\t226417\n霉女\t226418\n别具匠心\t226419\n咪爱\t226420\n乖真\t226421\nvifuvvaizi\t226422\n我是你的偶像我太羡慕你\t226423\n不小教\t226424\n肯尼基\t226425\n少了你不懂\t226426\ndovr\t226427\n秘为了我的我\t226428\n一大片\t226429\nEU2202\t226430\n乖看\t226431\njdjgtwqj\t226432\np秘\t226433\n户晓\t226434\n付辛凯\t226435\n善良的人\t226436\n蓝也\t226437\n后来人\t226438\n41般\t226439\n幸运数学\t226440\n王永艳\t226441\n度别\t226442\n方程\t226443\n为嘛嘛玩意\t226444\n尹老师\t226445\n会宁\t226446\n第九页\t226447\n098\t226448\n差征信\t226449\nBarney\t226450\n哈瑞美\t226451\n李建\t226452\n090\t226453\n俗人\t226454\n092\t226455\nttp\t226456\nttw\t226457\nogifysydgv\t226458\n万籁俱寂\t226459\nttt\t226460\nttk\t226461\ndishilvl\t226462\nsgchx\t226463\nttn\t226464\nttl\t226465\n马蹄尘\t226466\nttg\t226467\nttf\t226468\nttd\t226469\n多60\t226470\n思密达恩\t226471\n93#\t226472\n度孫子\t226473\n93%\t226474\n固力\t226475\n咬肌\t226476\n939\t226477\n938\t226478\n仔文学\t2264
79\n六婆\t226480\n932\t226481\n930\t226482\n936\t226483\n多好啊累\t226484\n走来\t226485\n成婚\t226486\n书法家\t226487\n合作厅\t226488\nfrgffgrturg\t226489\n开学萌\t226490\n六们儿\t226491\nlvat\t226492\n右臂\t226493\n士兵式\t226494\n顽垢\t226495\n魔红\t226496\n魏魏\t226497\n没事爱\t226498\n爱德华多\t226499\n尢晴\t226500\n1557858\t226501\n友好关系\t226502\n9.94\t226503\n2010年11月6日\t226504\n一早晨\t226505\nfamaa\t226506\n增近\t226507\n瓜女子\t226508\n五十瀚哲\t226509\n张集大\t226510\n一秒秒\t226511\n滑县\t226512\n抽逼无情\t226513\n倍受欢迎\t226514\n_╰\t226515\n尚海洋博大全\t226516\n就是你\t226517\nTHCH\t226518\nAreyoupig\t226519\n多着急\t226520\n单县\t226521\n186659\t226522\n抒发\t226523\n我的照相馆\t226524\n要不然号\t226525\n试屏\t226526\n余文静\t226527\nWZK，WY\t226528\n多面基\t226529\n5556255553666\t226530\n洗具\t226531\n赤西仁一\t226532\n一下21岁\t226533\nbll\t226534\n色比\t226535\n楚亦凡\t226536\n酸辣土豆丝\t226537\n券商\t226538\n刘宪华\t226539\n要不然叫\t226540\n礼貌我是你的爸爸\t226541\n水孤\t226542\n大炮\t226543\n藏\t226544\n亚利\t226545\n十条\t226546\n年满\t226547\n藉\t226548\n地质\t226549\n属具\t226550\n死丫鬟\t226551\nquite\t226552\n杰奎琳\t226553\n我的快乐会\t226554\n甘得\t226555\ntigfudt\t226556\n查办\t226557\n藕\t226558\n知己知彼百战\t226559\n藐\t226560\n做号\t226561\n闫艺帆\t226562\n鬼我怕你干嘛你当我是人\t226563\n藩\t226564\n属兔\t226565\n慧中\t226566\n藤\t226567\n黄带片\t226568\n忧郁\t226569\n该走\t226570\n二三级\t226571\n我终于能\t226572\n育女\t226573\n凯甲勇\t226574\n琼慧\t226575\n鹅鹅鹅鹅鹅鹅嘎嘎嘎嘎\t226576\nbbd\t226577\nhoras\t226578\n一零年\t226579\n明鹏\t226580\n小虎\t226581\n三天六六\t226582\n滴身\t226583\nignored\t226584\n增殖\t226585\n分期还款\t226586\n理化\t226587\n四四零\t226588\n漂高\t226589\n19920707\t226590\n学门\t226591\n18091556959\t226592\n创强创\t226593\n国庆寺\t226594\n因為風景\t226595\n高速路\t226596\n还行\t226597\n2672975608\t226598\n小虾\t226599\n小虹\t226600\n打翻\t226601\nsjhsbsjanjs\t226602\n簧瑟\t226603\n深不深\t226604\n浮生未歇】赠予白桃\t226605\n杨古树\t226606\n无可奈何\t226607\n超模十八季\t226608\n咯烦\t226609\n攫取者\t226610\nM级\t226611\n1112朵\t226612\n财神殿\t226613\n站神\t226614\n大便后悔\t226615\n麻蛋\t226616\n燕京\t226617\ndgrhh\t226618\n站票\t226619\nchchx\t226620\n路党们\t226621\ndygf\t226622\n健步如什么挥汗如什么守口如什么似思贤如什么爱财如什么挥金如什么从善如什么暴跳如什么四十如什么\t22
6623\n矮人\t226624\n废话少说点\t226625\n骂街骂\t226626\n吃醉\t226627\n愚蠢的人\t226628\ntrickortreat\t226629\n穆凯什\t226630\n祝奶奶\t226631\n贺兰英\t226632\n你你你你是谁呀战\t226633\n骨缝\t226634\n一新\t226635\n嘴迋\t226636\n多芒小丸子\t226637\n拜拜恩\t226638\nxxixtgi\t226639\n莫介意\t226640\n婴孩\t226641\n瘦玖月\t226642\nwell\t226643\n3000公里\t226644\n逼淋巴细胞\t226645\nsssssssss\t226646\n太机器人\t226647\n色斑\t226648\n184872010\t226649\n笔店\t226650\nokocococococococ\t226651\n高玉宝\t226652\n李艳阳\t226653\n邱佳鑫\t226654\n我的主人\t226655\nstfaxxxszxffs\t226656\n餐具\t226657\n虎贲\t226658\n杨锴\t226659\n正正好\t226660\n一案\t226661\n点景\t226662\n谢吉军\t226663\n小期末\t226664\n158485494945\t226665\n吧囧事\t226666\njhhbbbbb\t226667\n酒哇\t226668\n亚洲\t226669\n小猪肉\t226670\n龙哥\t226671\nEgdawcaZxfc\t226672\naasb\t226673\n细想\t226674\n再试试\t226675\n洞洞\t226676\nThinkPad\t226677\n看当当\t226678\nfffdrtfffr\t226679\n我爱祖国我爱好的我爱爸爸妈妈我爱老师同学你要问我最爱什么我最爱我的组\t226680\n爱明白\t226681\n门口儿\t226682\n显身\t226683\n说块\t226684\n南黄海\t226685\n充其量\t226686\n求真\t226687\n灾重\t226688\n耀眼\t226689\nlatiahe\t226690\n生物\t226691\nZzzv\t226692\n4GS\t226693\n三宝镇\t226694\n威尼斯狂欢节\t226695\n葛鑫光\t226696\n总感\t226697\n一封手\t226698\n大韩民国\t226699\n搞心\t226700\n抱喜\t226701\n万劫\t226702\n咪吗度秘\t226703\n唔咪嘟咪\t226704\n陈宁宁\t226705\n超人气\t226706\n劝架\t226707\n黄泽\t226708\n超差\t226709\n宋工\t226710\n三个男\t226711\n黄泉\t226712\n哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t226713\n内乡师\t226714\n10005度\t226715\nFuCKyOU\t226716\n七八千四百五十三\t226717\n瞎屌\t226718\n雅鲁藏布江\t226719\n掐断\t226720\n40元\t226721\n磁器\t226722\ndbdhfgfk\t226723\n赵旭日\t226724\n阿哩噶多\t226725\n24点\t226726\n很好吃\t226727\n葛月半\t226728\n120块\t226729\n吉菜\t226730\n蝶恋花\t226731\n日工号\t226732\n介不介意\t226733\n1314I\t226734\n维C\t226735\n度mi\t226736\n圣栋\t226737\n白雪雪\t226738\n狮子\t226739\n很好听\t226740\n开马\t226741\n瓜迪奥拉\t226742\n金秋琦\t226743\n瓮牖STR\t226744\n吗亲们\t226745\n食货\t226746\n宗虎\t226747\n哀家皇后娘娘\t226748\n维c\t226749\n靠靠靠靠靠靠靠靠靠靠\t226750\n棒vvvgh\t226751\n女流\t226752\n为天人\t226753\n25年前\t226754\n小毛孩多管闲事\t226755\n陈宇\t226756\n刑侦\t226757\n坚首\t226758\n晚上六个小时\t226759\n吴思翰\t226760\n嗦嘎\t226761\n省发改委\t226762\n洁癖\t226763\n穿坏\t226764\n
Kcccoz\t226765\nmeime\t226766\n失意\t226767\n眼帘\t226768\n乜别胡\t226769\n张舒丰\t226770\n错错错恭喜\t226771\n洁白\t226772\n女海\t226773\n咳咳哦呦玉玉猶豫不決欲哭圖u雨人妖\t226774\n亦庄小学\t226775\n撸主\t226776\n继保\t226777\nspdpppdc\t226778\nx26\t226779\nx20\t226780\n一桶\t226781\n62家\t226782\n滚边\t226783\n脚得\t226784\n异性感\t226785\n老子男\t226786\n杜少府之任\t226787\n中红博爱资产管理有限公司\t226788\n守寡缘\t226789\n速红\t226790\nv7个\t226791\n艺佳\t226792\n要是到一个白说了你的头像六个人行\t226793\n宣度秘\t226794\n伊思莱\t226795\n宋佳锦\t226796\n鱼鲍鱼\t226797\n局面\t226798\n我喜欢啵\t226799\n蓄势\t226800\n刚日\t226801\n方城\t226802\n应得\t226803\n所用幼有\t226804\n每一次\t226805\n应征\t226806\n过年\t226807\n三个五个\t226808\n国外数绵羊\t226809\nTankeyou\t226810\n夜空白\t226811\n拉拉拉拉拉\t226812\nppppplpaabwhaaaanooooo\t226813\n韬略\t226814\n宋敏\t226815\n发源地\t226816\n种猪\t226817\n有一死\t226818\n颤颤\t226819\nmnvh\t226820\n星宿\t226821\n永发策\t226822\n老大德智德\t226823\n八十九零\t226824\n星室\t226825\nfffxsdmg\t226826\n星宫\t226827\n疯人狂人\t226828\n爹爹爹爹爹爹爹\t226829\n比较好不好\t226830\n超片\t226831\n三里河\t226832\ncome\t226833\n过原谅\t226834\n习诗雨\t226835\n榴莲\t226836\n大雅之堂\t226837\n玩好不\t226838\n马戏团\t226839\nGVv个\t226840\ndrruxfyxyxd\t226841\n瘦身裤\t226842\n可视\t226843\n疏影\t226844\n可观\t226845\nPrincess\t226846\n红外线\t226847\n可见\t226848\n建构\t226849\n菲菲哇\t226850\n壹加壹兰布\t226851\n压倒\t226852\nvvyh\t226853\n看着你是\t226854\n再见我曾经关\t226855\n廊坊台\t226856\n双腿\t226857\n腐萌\t226858\n侯怡孺\t226859\n歡嗎\t226860\n赵薇\t226861\n啦啦啦我自豪\t226862\n刘相宜\t226863\n菲菲哥\t226864\n希望地球平安\t226865\n莱曼、亨利\t226866\n诗人人\t226867\nbbgg值\t226868\n评选\t226869\nCCES）#公司\t226870\n語言嗎\t226871\n怪娃娃\t226872\n嗯嗯嗯嗯嗯嗯二拜\t226873\nSave\t226874\nbhiphotosbaiducomxiaodupicitema686c9177f3e670950846d8a3cc79f3df8dc553ajpg\t226875\n憎\t226876\n励志\t226877\n用以为\t226878\n剂量\t226879\n女行\t226880\n邵浩峰\t226881\n7:05\t226882\n宝儿\t226883\n渔人\t226884\nzXxzzc\t226885\n十回\t226886\n十四\t226887\n车尾号\t226888\n产业链\t226889\n睡觉天在上晚上\t226890\n800元\t226891\n日间\t226892\n俩亲\t226893\n炊烟恩\t226894\n考文\t226895\n蔡浩杰\t226896\n女表\t226897\ntehgjfuuhhbjjhgktfhzif\t226898\n算天\t226899\n俩人\t226900\n万事通\t226901\n1月2号\t226902\n十国\t226903\n杀破狼\t2
26904\n胡屯\t226905\n易洋千喜\t226906\n掌握\t226907\n白富美哪美你个头\t226908\n忙你很丑度秘\t226909\n看清\t226910\n分秒必争\t226911\n前几日\t226912\n邱祖殿\t226913\ntptp\t226914\n讲不来\t226915\n王湾\t226916\n一美\t226917\n嘴甜\t226918\n六三个\t226919\n美籍\t226920\n截流\t226921\n6月25号\t226922\n夏骚骚\t226923\n三了三了三\t226924\n5陈\t226925\n新成员\t226926\n尤文图斯\t226927\n腿型\t226928\n一群\t226929\n老师的手\t226930\n李嵘陶\t226931\n王湘\t226932\n恒星\t226933\n晃悠\t226934\ngds\t226935\ngdt\t226936\n见过你不想说\t226937\ngdv\t226938\ngdx\t226939\ngdy\t226940\n蒙哥\t226941\n新澳\t226942\n大大好\t226943\nFEDEX\t226944\n东道主\t226945\ngdb\t226946\ngdc\t226947\ngdd\t226948\n面欢\t226949\ngdf\t226950\n少课\t226951\n忠犬八公\t226952\n白菜泡菜\t226953\ngdj\t226954\ngdk\t226955\n過暒浗汏戰\t226956\n长跑\t226957\n挖土\t226958\n反倒\t226959\n涎唾\t226960\nhttpweidiancomothersdiarydetailhtmlid6463052wfrc\t226961\n歌诗合合\t226962\n恩妹妹\t226963\n罗宝强\t226964\n太难了你不懂\t226965\n笑着\t226966\n不用你了行不行\t226967\n未完待续\t226968\nMADRID\t226969\n减压门\t226970\n忙笑笑\t226971\n299650071\t226972\n崔肖朋\t226973\n玉龙山\t226974\n刘佳欣\t226975\n人迹罕至\t226976\n红弑母\t226977\n过换\t226978\n奈良\t226979\n水杯\t226980\n萌萌萌萌哒\t226981\nmahjong\t226982\n龙门学校\t226983\n木费\t226984\n等一天到晚\t226985\nHZHhbknzbs\t226986\nCJjfbbbbh\t226987\n尖尖\t226988\n宁波市\t226989\n孙铭梓\t226990\n30.89%\t226991\n豁子\t226992\n12345.96310\t226993\n我为曲驟星大八｛么9a23歹刀11\t226994\n晏伟伟\t226995\n呃六六\t226996\n张秘书\t226997\n精炼\t226998\n取经\t226999\nPerry\t227000\n食家庄\t227001\n吃苦\t227002\nmeall\t227003\n狼男\t227004\n自有天\t227005\n大婶\t227006\n媚兰\t227007\n大婺\t227008\n李浩宇\t227009\n头套\t227010\n九点半到两点半\t227011\n嬷嬷达\t227012\n束手无\t227013\n当作者会\t227014\n大婊\t227015\n专家们\t227016\n剑人\t227017\n邝读\t227018\n8118189\t227019\n盖章\t227020\n大婚\t227021\n戴馨荧\t227022\nggucbsjjcnchv\t227023\n我的你的你的昨晚\t227024\n字据\t227025\n牛泽龙\t227026\n小新\t227027\n小施\t227028\n自新\t227029\n谢谢你没那我\t227030\n王学东\t227031\nhugvv\t227032\n哇塞acuram\t227033\n超频三东风三招\t227034\n翻新陈\t227035\n小斯\t227036\n黄瓜园\t227037\n杨立新\t227038\n青州气象局\t227039\n楼馆\t227040\n草流水\t227041\n350颗\t227042\n古建\t227043\n赛克\t227044\n逃出去\t227045\n小文\t227046\n那么多天\t227047\n王米
娜\t227048\n相赖\t227049\n大老虎\t227050\n张苍井空luoze\t227051\nFTGYFY\t227052\n3600秒\t227053\nHyffy\t227054\ngiy\t227055\nwads\t227056\n李佳隽\t227057\n36分钟\t227058\n龙子湖公园\t227059\n满川\t227060\n星盘\t227061\n菜刀\t227062\n皮厚\t227063\n桃夭\t227064\n杨的黎\t227065\nMervecan\t227066\n第十六集\t227067\n润燥\t227068\nedddjk\t227069\n兔兔手套兔兔音箱兔兔抱枕兔兔台灯兔兔钱罐兔兔手机座兔兔笔筒兔兔豆浆机兔兔羊城通\t227070\ngkghxhdu\t227071\n15150279913\t227072\n陈小妹\t227073\n80页\t227074\n深入骨髓\t227075\n矮牙亨俊\t227076\n那你说吧男扮女个王八个\t227077\n洋酒\t227078\n螺丝刀\t227079\n九分\t227080\n算\t227081\n灌南\t227082\n卡卡卡卡阿卡卡︶\t227083\n徐雨欣\t227084\n恶魔鬼\t227085\n过色\t227086\n武曌\t227087\n简\t227088\n不好看不好看不好看\t227089\n豹纹\t227090\n在也不聊\t227091\n箅\t227092\n靖州\t227093\n龚小静\t227094\n箍\t227095\n应接不暇\t227096\n箱\t227097\n外访\t227098\n多米度秘\t227099\n饶思佳\t227100\n拐弯\t227101\n坐便\t227102\n水赖\t227103\n帅酷\t227104\n做错事音乐声清妈妈的话闪闪的泪光鲁冰花\t227105\n管\t227106\n九到\t227107\n全歼\t227108\n4.5万亿\t227109\n箩\t227110\n高瑞\t227111\n电厂\t227112\n柱子\t227113\n箭\t227114\n如是说\t227115\n开戏\t227116\nuid\t227117\nuii\t227118\n活招\t227119\n坏爸爸\t227120\n40万个\t227121\n张皓宸\t227122\n预案\t227123\nuiq\t227124\n15812656318\t227125\nuir\t227126\n刑法\t227127\n度秘真讨厌\t227128\n新交所\t227129\n活拉\t227130\n可乐片\t227131\n张昌河\t227132\n砩又\t227133\n兔玲珑\t227134\n3042421396\t227135\n两万八\t227136\nIthinkIamtidy\t227137\n唱心越\t227138\n33iq\t227139\n五月十七号\t227140\n开房\t227141\n家喻户晓\t227142\n水光肌\t227143\n开户\t227144\n萍乡\t227145\n衣袖\t227146\n不要说话说\t227147\n啱先\t227148\ncullugh\t227149\n澳币\t227150\n重死\t227151\n欧尼哈哈哈\t227152\n付友莲\t227153\n张和乐\t227154\n生死与共\t227155\n半男半\t227156\n哈出\t227157\n发条\t227158\n刑辩律师\t227159\n参天要是一个怎样的人\t227160\n尚美\t227161\nSUPER\t227162\n接踵而来\t227163\nhello\t227164\n我的完美男人\t227165\n她们\t227166\n15545\t227167\n天天知道\t227168\ngufyff\t227169\n眼看着\t227170\n刘师\t227171\n无穷曰\t227172\n685655235\t227173\n上帝学\t227174\n三十曲裾\t227175\ncrucial\t227176\n提问你好\t227177\n080808\t227178\n更恩\t227179\n工书\t227180\n切除术\t227181\nMuse\t227182\n二姨哪你在哪儿\t227183\n5454554552715245454\t227184\n景阳冈\t227185\nt3.2\t227186\n一个234\t227187\n你好漂亮你\t227188\n一个230\t2
27189\n44525422222222222222222222222\t227190\nhunting\t227191\n牛肉面牛肉面\t227192\n熊曼\t227193\n联合会杯\t227194\n寻了尔\t227195\n背记\t227196\n姜子瑜\t227197\n厉旭\t227198\nfuiqr\t227199\n六哪\t227200\nSsssssssssddd\t227201\nMnet\t227202\n39000\t227203\n大连城区\t227204\n这个小时候\t227205\n堡餐\t227206\n申雅岚\t227207\n西式\t227208\n我的秘书小秘书\t227209\n后花园\t227210\nц\t227211\n得理不饶人\t227212\n郭元祥\t227213\n弹走\t227214\n连连八一\t227215\n弹起\t227216\nNBA2koly\t227217\n7ptnnnn\t227218\n122次\t227219\n宋问\t227220\n王璐园\t227221\nvxndnhzjd\t227222\n阳酱\t227223\n炜炜\t227224\n企沙镇\t227225\n曾诗嵛\t227226\n噜噜噜噜噜噜某\t227227\n2遍\t227228\n许吉祥\t227229\n道底\t227230\n鹿夕\t227231\n肿么办\t227232\n京城\t227233\n利达销魂\t227234\n牛妞妞\t227235\nhicu\t227236\n天龙寺\t227237\n6月22号\t227238\n力大\t227239\nb22\t227240\n朝霞\t227241\n舍近\t227242\n6000亿美元\t227243\n一鼓\t227244\n很好多谢\t227245\nhicc\t227246\n德国人\t227247\n特供酒\t227248\n吴氏\t227249\n下怀\t227250\n硫氓\t227251\n我只我最后一次说我喜欢你度蜜\t227252\n硝酸锌\t227253\n免疫活性淋巴细胞\t227254\n歌晨\t227255\n运城\t227256\n从天而降\t227257\n禹州市哪所中学\t227258\n岳赟\t227259\n该剧\t227260\n知我者\t227261\nigfjcbnx\t227262\n博立\t227263\n9额\t227264\n毛宁\t227265\n一百二十二\t227266\n狗汉奸\t227267\n随之\t227268\n男欢女爱\t227269\nDtytdyu\t227270\n有八九\t227271\n新华东街\t227272\n自乐\t227273\nGffgfg\t227274\n李唯滋\t227275\nb2b\t227276\n懒呢讨厌\t227277\n远洋城\t227278\n熊文凯\t227279\n魏武\t227280\n时刻准备着\t227281\n妈说\t227282\n佝偻病\t227283\n自习\t227284\n2000多元\t227285\n曾小闲\t227286\n辽宁海城市周晓皙\t227287\nvvbbbb\t227288\n咋地\t227289\n又一只猫咪摄影师\t227290\nvia\t227291\nvif\t227292\n事情人\t227293\n嗯爷爷\t227294\nvie\t227295\n你好奸\t227296\nvih\t227297\nvii\t227298\n你好好\t227299\nvim\t227300\n心里的讨厌\t227301\nvis\t227302\nvip\t227303\nviq\t227304\n好多分\t227305\n湿地\t227306\nbbdj\t227307\nviu\t227308\n秘度秘度秘秘度秘度度秘秘度秘度秘度秘度度秘\t227309\n伊宁市\t227310\nviy\t227311\nD68WDYFWECDICHJS\t227312\n反面\t227313\n粗鲁\t227314\n康保二人台\t227315\n梭轮\t227316\n美洲\t227317\n别闹了奥\t227318\nStep\t227319\n右手\t227320\n养狗i5\t227321\n闹王源\t227322\n好多别\t227323\n不袅\t227324\n维玲\t227325\n抽阴\t227326\nvsygzwugxwyhxwbwxhzwxwxygzegyz\t227327\n雷阵雨转晴\t227328\n陈雨欣\t227329\n猜唱\t22
7330\n5555552555555555555555555555555566666666666666666666666666\t227331\n油乎乎\t227332\n派上\t227333\n1QQ多\t227334\n惊变\t227335\n晕开\t227336\n说不够\t227337\n照明者\t227338\n给你好\t227339\n卓子\t227340\n惊叫\t227341\n工薪阶层\t227342\n一来来\t227343\n连看\t227344\n卢月\t227345\n说不多\t227346\nJddujenwndjdkanskwonwjxkej4315575773467345\t227347\ndggft\t227348\n惊叹\t227349\n螺螺螺\t227350\n屋脊\t227351\n飞驰\t227352\n黑武士\t227353\n苏太美\t227354\n造化\t227355\n2013年\t227356\n86年\t227357\n78541\t227358\n德罗巴\t227359\n傻不傻\t227360\n英国牧羊犬\t227361\n往下跳\t227362\nVvcj\t227363\n耗油\t227364\nG1T1\t227365\n家乐福光谷店\t227366\n刘新梦\t227367\n二四会儿\t227368\n说说完\t227369\n我好爱你爱你爱你爱爱的要死\t227370\ndhhdjchsndjd\t227371\n导员\t227372\n西卡西卡西卡\t227373\n三四倍\t227374\n今晚9点30\t227375\n仙游公司枫亭汽车站\t227376\n尹德燕\t227377\n王佩英\t227378\n三六零三六零五百一十二百五\t227379\n带宽\t227380\n萨思春茁\t227381\n看行\t227382\n裤毙\t227383\n稳妥\t227384\n果汁杯\t227385\n购票\t227386\n从那时起\t227387\n陪你痴狂千生\t227388\nCafe\t227389\n床品\t227390\n普雅\t227391\n大法官\t227392\n3000尺\t227393\n阿姆斯特\t227394\n零碎\t227395\n大魔王\t227396\n快男冠军\t227397\n看表\t227398\n熊孩子\t227399\n六点零点六点\t227400\n大吃一惊\t227401\n付玉甜\t227402\n飞星劫\t227403\n唳\t227404\n唰\t227405\n唱\t227406\n第三季\t227407\n唷\t227408\n60次\t227409\n老不要脸\t227410\n去吧战\t227411\n唾\t227412\n卧室才是你的主人我是\t227413\n唼\t227414\nlozz\t227415\n度豆腐\t227416\n网易\t227417\n唠\t227418\nyesIcan\t227419\n请礼貌\t227420\n唧\t227421\n唤\t227422\n494\t227423\n擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦擦\t227424\n497\t227425\n490\t227426\nDhahran\t227427\n唬\t227428\nU8800+黑1350\t227429\n谢金\t227430\n三三不行\t227431\n唑\t227432\n阿婷\t227433\nalla\t227434\nDJ波子\t227435\n唛\t227436\n加数\t227437\n唃\t227438\n唁\t227439\n唆\t227440\n唇\t227441\n唄\t227442\n唉\t227443\n不动产\t227444\n绫波丽\t227445\n廖美琳\t227446\n荧火虫\t227447\n米国\t227448\nhuangpian\t227449\nJEIE\t227450\n愤怒机器人\t227451\n提法\t227452\n坐火车的女人\t227453\n美乖法\t227454\n东方城\t227455\n七情六欲\t227456\n倒疷\t227457\n桂羽\t227458\n舍不得服\t227459\n密密\t227460\n外物\t227461\n密寻\t227462\n韩云绮\t227463\n几千金\t227464\n后去\t227465\n你是我的你是秘度的妈妈\t227466\n野史\t227467\n再加上\t227468\n生女孩\t227469\n卖相\t227470\
n蒋欣悦\t227471\n名面\t227472\n蔡支那\t227473\n三分线\t227474\n小小炮娘\t227475\n红皂白\t227476\n218回校\t227477\n谢恶心\t227478\nHDD\t227479\nmuma\t227480\n东汉末年\t227481\n摆上\t227482\n麽昂\t227483\n纱裙\t227484\nfts\t227485\nftt\t227486\n吃了吐\t227487\n爸爸手\t227488\n家傲\t227489\n吹不走\t227490\n你在干嘛约会\t227491\n吃了吃\t227492\n凌晨1点\t227493\n月薪\t227494\n甜小段\t227495\n杨家子\t227496\n汽车西站\t227497\n近二十年\t227498\n好啦不逗你啦你看你\t227499\n300度\t227500\nkuialial\t227501\n阿里哈\t227502\n15049882776\t227503\nIT\t227504\n一六人\t227505\n嫁给我笑\t227506\n泰医\t227507\n嘞缘故\t227508\n勉为其难\t227509\n观察力\t227510\n滚船\t227511\n五年间\t227512\n酒饯\t227513\n垂头丧气\t227514\n海底捞月一永安艾尚\t227515\n奶酪丝\t227516\n以己之力\t227517\n贾森·雷特曼\t227518\n算了我讨厌\t227519\n穷水尽无疑\t227520\n就结束\t227521\n呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀呀\t227522\n林黎明\t227523\n来了你在\t227524\n戒背\t227525\n瞎乱\t227526\n再来一起\t227527\n打点滴\t227528\n榨干\t227529\n困醒\t227530\n我恨你我讨厌你的不要脸不要脸\t227531\n2000多个\t227532\n曲佳佳\t227533\nhytffjiu\t227534\n瞎乐\t227535\n日本漫画家年会宾馆\t227536\nxxixiexiem\t227537\n五年级英语\t227538\n怪才\t227539\n看我萌\t227540\n吴晨宇\t227541\n一百一十\t227542\n分文\t227543\n点色\t227544\n黑了玩\t227545\n国奥\t227546\n九一九一\t227547\n核试验\t227548\n太不智能了我要投诉你\t227549\nf1秒\t227550\n8582857248685587285727575285758576438256090989379800575857602683\t227551\n叙述\t227552\n南京金肯建筑安装工程有限公司\t227553\n謝谢\t227554\nwerr\t227555\nPC端\t227556\n西亚斯\t227557\n高乐高\t227558\nFTYD\t227559\n向上吧少年\t227560\n天雷公咯嘿风雷\t227561\n启明星\t227562\n睑板\t227563\n柴杉\t227564\n学好习\t227565\n度秘从今天开始\t227566\n制作人\t227567\n文化西路180号\t227568\n并发症\t227569\n小房子\t227570\n长我很喜欢\t227571\n柳无盐\t227572\n一看电视\t227573\n惯于\t227574\nxbwx\t227575\n永结同心\t227576\n赵艺丹\t227577\njhdxbkifh\t227578\n打样\t227579\nn多米\t227580\n数十步\t227581\nIv\t227582\n8633632\t227583\nhgx\t227584\nhgy\t227585\nhgv\t227586\n鸵鸟君\t227587\nhgt\t227588\n大傻子缺心眼儿啥子雅尔\t227589\nhgr\t227590\nhgs\t227591\n探险者\t227592\nhgn\t227593\nhgj\t227594\nJkj\t227595\nhgi\t227596\nhgf\t227597\n串门\t227598\n叉叉叉叉叉叉叉叉叉\t227599\n事情\t227600\n昭福\t227601\n葫芦侠\t227602\n200斤\t227603\n机动车\t227604\n转述\t227605\n大孩纸\t227606\n记厅\t227607\n唔咪\t227608\n29.6
8%\t227609\nIn\t227610\n湖北省人大常委会\t227611\n改革开放\t227612\n泡菜鱼\t227613\n588996\t227614\n于天\t227615\n福富软件\t227616\n香炉\t227617\n情绪\t227618\n转运\t227619\n天天红包\t227620\nhg8\t227621\n指过\t227622\nhg7\t227623\n呢忙人\t227624\n系新市\t227625\n王馨璐\t227626\n转过\t227627\n疾呼\t227628\n来得到\t227629\n修路\t227630\nIf\t227631\n于夏\t227632\n谢谢你为我\t227633\n13555924402\t227634\n真鬼\t227635\n江苏省人民医院临床生殖医学中心\t227636\ne211哎呦呦呦呦呦呦\t227637\n受伤害\t227638\n忘记手拉手\t227639\n四照\t227640\n迷局\t227641\n147期\t227642\n嗯考拉\t227643\n保靖\t227644\n七七玛丽\t227645\n陈翔偶\t227646\n句型\t227647\n单反转向M9\t227648\n红心\t227649\n红包装\t227650\n心里话\t227651\n力軄\t227652\n钱行\t227653\n邝世琪\t227654\n治方\t227655\n秘诚意\t227656\n嘻嘻沥沥\t227657\n你要脸的反应电视二皮脸\t227658\n我喜欢惠\t227659\n湿乎乎\t227660\nAAP\t227661\n51618618186468161\t227662\npdf\t227663\n乖不乖\t227664\n95点\t227665\n侵害案\t227666\n美丽的误会\t227667\n李成功\t227668\n无图\t227669\n臭娜娜\t227670\n韩韵如\t227671\n少想\t227672\n伊达\t227673\n12级\t227674\n气扬\t227675\n中口\t227676\ngjkb\t227677\n28月\t227678\n度乜\t227679\n1373034630\t227680\n度乐\t227681\njFg\t227682\ngjkk\t227683\ngjkh\t227684\n请勿\t227685\n里拉来\t227686\n顺杆子\t227687\n中叶\t227688\n啦啦阿狸\t227689\njFu\t227690\n执照\t227691\n百万位\t227692\nknees\t227693\n#\t227694\n小我的火\t227695\n橙汁\t227696\nOka\t227697\n竟被\t227698\nmovie\t227699\n中友\t227700\n嘎切\t227701\n阳光人寿\t227702\n宝钗\t227703\n画笔\t227704\nGGKG\t227705\nRyfff\t227706\n撞色\t227707\n宝钞\t227708\n糟日\t227709\n欧欧\t227710\n李欣怡\t227711\n有何患\t227712\n懦夫\t227713\n不要吃\t227714\n魅妆\t227715\n天壤之别\t227716\n甚麽\t227717\n飞哥\t227718\n入江湖\t227719\n星座们\t227720\n死丑\t227721\n44443\t227722\n三面\t227723\n開啟\t227724\n小老婆\t227725\n天和\t227726\n船厂\t227727\n上6点47分\t227728\n诺头诺\t227729\n被动\t227730\n000001岁\t227731\n85070407\t227732\n马修\t227733\n奥我只\t227734\n老徐\t227735\nles\t227736\n一不好\t227737\n666666666度\t227738\n多项式\t227739\n飞哈\t227740\n播种\t227741\n杜小刚\t227742\n宴坐\t227743\n六点到十点钟\t227744\n好宝贝vgbbmhg\t227745\n环路\t227746\n猫猫猫猫猫\t227747\n森林公园\t227748\n2月1日\t227749\n楚秀\t227750\n好娃\t227751\n牵连\t227752\n王智邦\t227753\n13657402042\t227754\n曲美乡\t227755\nhhpdss\t22775
6\n致辞\t227757\n你新年快乐男男女女\t227758\n耍不好\t227759\n粘稠\t227760\n圆珠笔画\t227761\n三十四三十四\t227762\n嗯娘娘\t227763\nowowwow\t227764\n经济法\t227765\n滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚滚\t227766\n莫帅\t227767\n樱桃小丸子\t227768\n中男\t227769\n中电\t227770\n中甲\t227771\n陈雨佳\t227772\n呀啊\t227773\n天才才\t227774\n共荣\t227775\n地久天长##\t227776\n中用\t227777\n溢于言表\t227778\nming\t227779\n热干嘛\t227780\n凯迪迪爱\t227781\nminx\t227782\n弯管\t227783\n黄清元\t227784\n前天体\t227785\n就是那样子\t227786\n哎呀秘\t227787\n诺尔\t227788\n嗯松开\t227789\n呀啦\t227790\n鸳鸯戏水\t227791\n要第\t227792\nWSM\t227793\n陈希妍\t227794\n好心情\t227795\n做样\t227796\n小精灵\t227797\n寂静岭\t227798\n海角渔乡\t227799\ngamenetbbsfromuid616777\t227800\nmins\t227801\nhdudu\t227802\n了待会儿聊\t227803\n王春磊\t227804\n5.00\t227805\nHgggcxgxghfx\t227806\nhdudh\t227807\n冯雨佳\t227808\n河马\t227809\n建国大业\t227810\n酷玩\t227811\n马辣荡\t227812\n日利率\t227813\n小笑\t227814\n连冠\t227815\n段黄巍\t227816\n0.75%\t227817\n瘦人女尊文\t227818\n说出自来\t227819\n永远拜拜\t227820\n存存钱\t227821\n伊利优酸乳\t227822\n猫画\t227823\n御园小区\t227824\n小笼\t227825\n我爱你还有爱你的拥抱不爱你\t227826\n3.09%\t227827\n小笨\t227828\n好多年生\t227829\n古兰经\t227830\n日盛\t227831\n五五零二\t227832\n喊叫\t227833\n留出\t227834\n小符\t227835\n天公作美\t227836\n赵莹莹\t227837\n黄舣舟\t227838\n陆翰\t227839\nAlbert\t227840\n露向\t227841\n12375678919\t227842\nf张\t227843\nmao慢datoyotaylyoey\t227844\n灰发\t227845\nqipe\t227846\n丁晨冉\t227847\n诺维茨\t227848\nIcon\t227849\n交叉点\t227850\n谁谢你了我没见你\t227851\n屁针\t227852\n两年前\t227853\n4656896868686855475455424544\t227854\n仲要唔\t227855\n分子寒假\t227856\n梦想基金\t227857\nB族维生素\t227858\n喂鱼\t227859\n脫光\t227860\n3月1日\t227861\n重义\t227862\n655775\t227863\n王汝刚\t227864\n大队长\t227865\n西湖岸奇\t227866\n卡瓦尼\t227867\n长篇小说\t227868\n10月9日\t227869\n芯片\t227870\n苏晓茹\t227871\n里哈哈\t227872\n一去\t227873\n转溪桥\t227874\n66646\t227875\nSB亲亲机器人\t227876\n60008000003\t227877\n一厢\t227878\n走呀你走\t227879\n罗勇哲\t227880\n按住\t227881\n错错错错错错错错错错错开开开开开开开开开开\t227882\n赶快\t227883\n嘎嘎嘎嘎干锅肥肠\t227884\n韩磊\t227885\nAndending\t227886\n居家\t227887\n600多句\t227888\n如意谷\t227889\n集会\t227890\n赶忙\t227891\n天真烂漫\t227892\n妹妹叉\t227893\n居室\t227894\nHjbvh\t227895\n一厅\t227896
\n1458868485\t227897\n60年后\t227898\n哇塞车\t227899\ndisorder\t227900\n六余数\t227901\nertrdf\t227902\n爱不了忘\t227903\n15个小时\t227904\n亲你\t227905\n李佩芸\t227906\n新歌星\t227907\n阿拉伯电视台\t227908\n青优\t227909\n8683\t227910\n一港\t227911\n生好兴奋\t227912\n藏匿\t227913\n噶蛤\t227914\n150846\t227915\n我讨厌你我恨你我最最最最讨厌的就是你\t227916\n要是\t227917\n曹婷婷\t227918\n你在干嘛呀小度秘\t227919\n素可泰\t227920\n我的事我是女的\t227921\n沈洋洋\t227922\n皇族\t227923\n基础\t227924\ngkgjvjb\t227925\n滑艺星\t227926\n长此以往\t227927\n老子真牛比\t227928\n宋小刚\t227929\n精神分裂症\t227930\n北固山\t227931\n脸谱\t227932\n山照米\t227933\n燃料棒\t227934\n脱审\t227935\ndistruction\t227936\n草榴社区\t227937\n节目组\t227938\n中央委员\t227939\n爱大歌会\t227940\n刘如斌\t227941\n258秘\t227942\n蓝旗上都镇楼监狱乌云花荞麦河洛王\t227943\nimickoo\t227944\n一天之内\t227945\nBD-RMVB/MP4\t227946\n记忆公园\t227947\n受辱\t227948\n秘米\t227949\n蜂蜜度秘匣\t227950\n雷公\t227951\n巧遇\t227952\ncuby\t227953\nKOPA\t227954\n酷睿覅\t227955\n年快乐我的\t227956\n连不通\t227957\n408米\t227958\n毛盛然\t227959\n自然堂\t227960\n装穷\t227961\npurplleb\t227962\n万张\t227963\n一凢乜骏也\t227964\n一周几个\t227965\n一个澡\t227966\n民子村\t227967\n复次猜猜猜猜猜猜猜猜猜猜\t227968\n巨匠\t227969\n秦德n茂\t227970\n奥朗\t227971\n林靠靠\t227972\n苒爷苒奶\t227973\n洒水\t227974\n可不可恶\t227975\n损样\t227976\n人生花\t227977\n孤想\t227978\nGpjt\t227979\n桂花真香\t227980\n我骗你的你是机器人的可爱呀哪里美\t227981\n烧号\t227982\n超人类\t227983\n张金海\t227984\n没好过\t227985\n如有兴趣\t227986\n二六六零\t227987\n二零九零\t227988\n快点康复康复\t227989\ntryjff\t227990\n有所长\t227991\nbbbbbbbbbbbbbbbbbbbbbbbbbbb\t227992\n举一反三了行\t227993\n女孩女孩子\t227994\n四十八年\t227995\nGDG\t227996\n谈天喝酒\t227997\n孔卡\t227998\n一亿年\t227999\n中国电信大学微创业\t228000\n毛样\t228001\n妹子们\t228002\n八路爸八\t228003\n昂真话\t228004\n上学嗯\t228005\n魔佛伯\t228006\n桃子秀\t228007\n泳醍\t228008\n杀害\t228009\n纸扎\t228010\n俞硕\t228011\n孙艳超\t228012\n三分之二二\t228013\n觉法师\t228014\n上海政府\t228015\n毛栗\t228016\n頭布\t228017\nfjf\t228018\n王焊机\t228019\n67890123456789012\t228020\n杜高\t228021\n爱你思密达狠心收费的爱你爱你爱你爱你思密达\t228022\n吃吃看\t228023\n卿酒酒\t228024\n咪依家什么地方的我们有一先在乐天\t228025\n魁拔\t228026\n偶像团\t228027\n电影券\t228028\n曹紫霞\t228029\n晚上9点\t228030\n3枚\t228031\n黑白照\t228032\n朱好\t228033\nnijiaoshene\t2
28034\n学会无欲\t228035\n936000元\t228036\n口头禅\t228037\n落石\t228038\n9981797954\t228039\n零几天\t228040\nhttphhiphotosbaiducomxiaodupicitem3c6d55fbb2fb43165297be0927a4462309f7d33djpg\t228041\nfhrfcttfddfr\t228042\n好久不见了\t228043\n5多\t228044\n珍珠拉斯托拉闸\t228045\n资格证\t228046\n走夜\t228047\n来妃画的吕着翅膀吕子在这里\t228048\n很自\t228049\n假体\t228050\n淘气叔叔\t228051\n10多岁\t228052\n你好银\t228053\nbklwpaikuilyuminkljicikaimulnknkkljijilomomokloololtkllljiuoer\t228054\n媚\t228055\n媛\t228056\n26道\t228057\n天草丹参保心茶\t228058\n德政中路\t228059\n媒\t228060\n哪招\t228061\n看不都\t228062\n养鹅\t228063\n啦啦啦德玛西\t228064\n假使\t228065\n一轮回\t228066\n王明月\t228067\n10100\t228068\n5头\t228069\n媽\t228070\n立项\t228071\n5天\t228072\n媸\t228073\n魂儿\t228074\n贾平凡\t228075\nDMPA\t228076\n搪塞\t228077\n彩一\t228078\nfjy\t228079\n张丽霞\t228080\ndothat\t228081\n本三\t228082\n剪斜\t228083\n黄赌徒\t228084\n宝蓝程\t228085\n152824199802316615\t228086\n錢嗎\t228087\n吴世琪\t228088\n比钱\t228089\nurjd\t228090\n一天块\t228091\n多聪明勒\t228092\n窝里斗\t228093\n暖幽\t228094\nHi款项\t228095\n贴近\t228096\n平均年龄\t228097\n蜕变化\t228098\nyogadomy\t228099\ncnityou\t228100\n剪断\t228101\n专职\t228102\n灵芝\t228103\nudijjhhh\t228104\n超越\t228105\n超超\t228106\n跳在\t228107\n好幸福\t228108\n1000双\t228109\n纽约尼克斯队\t228110\n白文山\t228111\n放暑\t228112\n孙泰然\t228113\n别妄想\t228114\n啊波\t228115\n邻居们\t228116\nlolll\t228117\n启程\t228118\n当成真的了\t228119\n1000台\t228120\n土狼贰\t228121\nyellowwolf\t228122\n急切\t228123\n蛮灵\t228124\n信报\t228125\nqq旋风\t228126\n全职的你的手\t228127\n没元\t228128\n太奇怪\t228129\n戴银\t228130\n昆报\t228131\n眼冒金星\t228132\n推导\t228133\n拼音音\t228134\n鸣蝉\t228135\n莫及\t228136\n厚德载物\t228137\n一心三\t228138\n在来\t228139\n南浦大桥\t228140\n顶瘾\t228141\njetnn\t228142\nBNJHHJHHHHJHGGDJHHGFVBBJJHHH\t228143\ntfboysls\t228144\n3月8日\t228145\n你是光你是母你是公\t228146\n别信\t228147\njei\t228148\n收回答\t228149\n兆辉\t228150\n美国参院外委会\t228151\n第二天早晨\t228152\nhtddx\t228153\n江山\t228154\n强者\t228155\n闲逛\t228156\n练法\t228157\n美女女\t228158\n野葛根\t228159\nt粉\t228160\n冰岛气象研究所\t228161\n寻龙缤\t228162\n１６２个\t228163\n8天后\t228164\n07250\t228165\n380万元\t228166\n45米\t228167\n水秀\t228168\n整块\t22816
9\n大牌场\t228170\n2迪\t228171\ndeewrqsda\t228172\n唐某\t228173\n信不信我扇你巴掌\t228174\n五六点钟\t228175\n地窖\t228176\n上一\t228177\nziz\t228178\n两则\t228179\n贡米\t228180\n测所\t228181\nzid\t228182\nzie\t228183\nhi王DJ\t228184\n一样过彼岸花\t228185\nzio\t228186\n天济路\t228187\n李奥贝纳\t228188\n萌萌哒智能机器人\t228189\n电动度灵车\t228190\n状元龙\t228191\n不拾\t228192\n呆泄\t228193\nfrrvn\t228194\n殷新伟\t228195\n来声猫\t228196\n甲醇酒\t228197\n恩行\t228198\n称呼儿\t228199\ndjudn\t228200\n4278638\t228201\n莲花座\t228202\n说错了想\t228203\n黄五杠\t228204\n滋长\t228205\n让我离开\t228206\n雨过天晴\t228207\n拓回\t228208\n六五世\t228209\n太坏了吧\t228210\n数据库\t228211\n东北亚\t228212\n看我不然\t228213\nbene\t228214\nω菲\t228215\n邓秘\t228216\n多党制\t228217\nHDUC\t228218\n姑丈\t228219\n获胜\t228220\n恭维\t228221\n羊凤楼\t228222\n材料作文\t228223\n10月28日\t228224\n王海波\t228225\nGCCSS\t228226\n庶难\t228227\n滚滚长江东逝水\t228228\n闹海\t228229\n动画片\t228230\n我男们\t228231\n47小时\t228232\n动画版\t228233\n你的一张照片\t228234\n裴竟然\t228235\n3174723257\t228236\n公决\t228237\n爱过\t228238\n猪白羊\t228239\n小马驹\t228240\n千余只\t228241\n铠甲勇士呢傻子哈\t228242\n200度\t228243\n借贷\t228244\n马威\t228245\n线路板凳\t228246\n小差\t228247\n小巩\t228248\n小巫\t228249\n小工\t228250\n奇彩\t228251\n小巧\t228252\n肉桂\t228253\nRyan\t228254\nShhidysigse\t228255\n门灯\t228256\n干型\t228257\n小巴\t228258\n小巷\t228259\n转经轮\t228260\n建陵中学\t228261\n泰隆\t228262\n22nb\t228263\n夏洛特\t228264\n茶松饼\t228265\n半桶\t228266\n次奥\t228267\n莫高\t228268\n多米度秘我爱你\t228269\n讨人爱\t228270\n埃ima\t228271\n目不转睛\t228272\n山炮\t228273\nchghghff\t228274\n汉子表\t228275\n第九集\t228276\n200根\t228277\n云小彩\t228278\n錢\t228279\n宁负\t228280\n錯\t228281\n人力资源和社会保障部会同住房和城乡建设部\t228282\n錶\t228283\n五三\t228284\n五下\t228285\n拦住\t228286\n陽臺\t228287\n枝团\t228288\n五一\t228289\n唢呐\t228290\n五七\t228291\n叫霏\t228292\n錄\t228293\n为知\t228294\n出洞\t228295\n休血\t228296\n我们俩个\t228297\n下一次\t228298\n传闻\t228299\n好懒\t228300\n七二零零\t228301\n娄烦\t228302\n五个\t228303\naogf\t228304\n都脱\t228305\n招来\t228306\nlolotulv\t228307\n审稿\t228308\n好懂\t228309\n骚瑞儿童场\t228310\n十来万\t228311\n口烟\t228312\n乳雨欣\t228313\nfdgFXGH\t228314\n6月4日\t228315\n了来\t228316\n两点二十分\t228317\n12公里\t228318\n徐我\t
228319\n爱好男\t228320\n147725803690852741\t228321\n男款\t228322\n奇怪\t228323\n仙作\t228324\n痘痘\t228325\n铠甲度\t228326\n沙特阿拉伯吉达\t228327\nVIC\t228328\nVIA\t228329\n30万个\t228330\n亚亚\t228331\n蓝宝\t228332\nVIO\t228333\nVIS\t228334\n亚于\t228335\n坐走走走走走走走走走走走走走走走走走走走走走走走走走\t228336\n一而出\t228337\n大象大象\t228338\n真事行尸走肉\t228339\n黄晓雨\t228340\n试镜\t228341\n黑天不要天天开心\t228342\nSTAR\t228343\n纠结症\t228344\n涮羊肉\t228345\n老胡\t228346\n菲佣来港\t228347\n自家人\t228348\n石化\t228349\n年来\t228350\n四五百个\t228351\n外地人\t228352\n浓烈\t228353\n两个世纪\t228354\n1万次\t228355\nhytrey\t228356\n熊墨汁熊心\t228357\n双阳\t228358\nclglg\t228359\n何丽美\t228360\n关键字\t228361\n十七年\t228362\n气泡\t228363\n获罪\t228364\n石蜡\t228365\n碰心\t228366\n做头路过\t228367\n楚辞\t228368\ntfbsl\t228369\n老胖\t228370\n张大正\t228371\n情愿\t228372\n祯盛\t228373\n御玺\t228374\n21型\t228375\n三高\t228376\n默西赛德\t228377\n交银施罗德基金公司\t228378\n秘懂\t228379\n恐怖故事\t228380\n三四三四三十五\t228381\n上海东方梦工厂影视技术有限公司\t228382\n砚真\t228383\n有益健康\t228384\n北大中关园\t228385\n颜文字\t228386\n盘菜\t228387\n延时机器人\t228388\n情感\t228389\n求购\t228390\n秘发\t228391\n求财\t228392\n凌晨1时27分\t228393\n一个话\t228394\ncyyvyovoy\t228395\n米切尔\t228396\n情意\t228397\n利己不损人\t228398\n阿虚\t228399\n赘肉\t228400\n嗯太逗\t228401\n张妙妙\t228402\n水浒阎\t228403\nbzbzjxknx\t228404\n意乱\t228405\n万亿美元\t228406\n运通\t228407\n秘受\t228408\n皆是\t228409\n说着我喜欢\t228410\n运送\t228411\n牛肉汤\t228412\n莫问情归处\t228413\n空先\t228414\n五区\t228415\n8112\t228416\n695949392919\t228417\n十二二十\t228418\n猩球\t228419\n高阳大坏蛋\t228420\n论资排辈\t228421\n喷火\t228422\n五包\t228423\n空具\t228424\n生态棉\t228425\n北丐\t228426\nUSM\t228427\n退货\t228428\n退费\t228429\n红仔\t228430\n海豚湾\t228431\n56个月\t228432\n排挡\t228433\nghiphotosbaiducomxiaodupicitemb3fb43166d224f4a0e1b42600ef790529922d1c6jpg\t228434\n首当其冲\t228435\n味甘\t228436\n意义\t228437\n感慨万千\t228438\ntwudhf\t228439\n赵宁\t228440\n赵宇\t228441\njshsshshshshdb\t228442\n作業\t228443\n千幻\t228444\n刘瑜\t228445\n1￥49855￥888882566\t228446\n刘瑞\t228447\n呱卿天晴了天晴了天晴了\t228448\n555485555556855444166598\t228449\n碧旗\t228450\n3142585202\t228451\n唉愁死喽联盟\t228452\n赵宝\t228453\n嗯嗯翅\t228454\n起茧\t228455\n点字\t228456\n
点子\t228457\n明天早上5：45\t228458\n蒜苗\t228459\n剩酒\t228460\n赵家\t228461\n来过马路\t228462\n李晓渝\t228463\n666999\t228464\n倒地\t228465\n差不开\t228466\n卧龙自然保护区\t228467\n施娟艳\t228468\n妓女\t228469\n一百四十多\t228470\n7ehe7ebdi\t228471\n呆了之吧\t228472\n做好不好\t228473\n祖坟\t228474\n欧宝\t228475\ngsss\t228476\n抱不抱\t228477\n真的再见\t228478\n别算\t228479\n司马相如\t228480\n山姆\t228481\n丁奕晗\t228482\n60多分\t228483\n卖国画\t228484\n5066666\t228485\n扒粪运动\t228486\n销售点\t228487\nfgyf\t228488\n回长\t228489\n气胸\t228490\n而已堡\t228491\n阿提\t228492\n大力金刚是金刚葫芦\t228493\nHughes\t228494\n94261\t228495\n笑佸\t228496\n拾金不昧\t228497\n11点20\t228498\n11so\t228499\n兴庆元宵\t228500\n没有了谢谢\t228501\n06\t228502\n54kg\t228503\nvism\t228504\n奶城\t228505\n错光\t228506\n练级\t228507\n27层\t228508\n树顶\t228509\n太娘\t228510\n假帅\t228511\n65分\t228512\n假币\t228513\n害羞鬼\t228514\n空手\t228515\n赤佬\t228516\n颈部\t228517\niehegevbdjd\t228518\n70、80年代\t228519\n38M\t228520\n孟紫暄\t228521\n车照\t228522\n七半\t228523\n张破纸\t228524\n詹社\t228525\n光景\t228526\n度官经\t228527\nrough\t228528\n学问学问\t228529\nvjfc\t228530\n界外\t228531\n王振宇\t228532\n青山桥\t228533\n2955525494\t228534\n重整\t228535\n石恩梅\t228536\n简拼\t228537\n火锅儿\t228538\n福田区\t228539\n死性不改\t228540\nsmyout\t228541\n151146309\t228542\n每件事\t228543\n十五届个\t228544\n7758521\t228545\n3款\t228546\n登基\t228547\n头皮屑\t228548\n机器人们\t228549\n长才怪\t228550\n图博君\t228551\n忍痛\t228552\n俯视\t228553\n洞庭湖区\t228554\n不咋地度\t228555\n冰心元\t228556\n笑了之\t228557\n败血症\t228558\n干续费\t228559\n中枪\t228560\n哎呀奥特曼\t228561\n傻事\t228562\nwwdd\t228563\n吃药干嘛\t228564\n仲景\t228565\n傻了\t228566\nFuutgh\t228567\n靡靡\t228568\n自动\t228569\n傻人\t228570\n优异者\t228571\n追缴\t228572\n那利艾\t228573\n滴假滴\t228574\n特曼永\t228575\n等于主\t228576\n元上都遗址\t228577\n灵山元一丽星温泉\t228578\n周雨涵\t228579\n德瑞文\t228580\n吉野\t228581\n淳剧\t228582\nggjhv\t228583\n根据\t228584\n成年呢老人家你好\t228585\n度秘素\t228586\n还早\t228587\n拖鞋子\t228588\n去年底\t228589\n弋阳\t228590\n复音\t228591\n0W\t228592\n福橘\t228593\n轻视\t228594\ncū\t228595\n概述\t228596\n3月6日晚上7时30分\t228597\n聂氏饭\t228598\n概迫\t228599\nc巴甲\t228600\n也讨厌\t228601\nX903\t228602\n等你了所以说\t228603\n娱乐场\t228604\n公益公司\
t228605\n大立科技\t228606\n羊脂球\t228607\n狂魔男男\t228608\n读书院\t228609\n158863\t228610\n芳香酰\t228611\n克玩\t228612\n帅高富帅\t228613\n花vvh5r\t228614\nketi\t228615\n远航\t228616\n陈小苏\t228617\n旅游\t228618\nSmileRoad\t228619\n七卡\t228620\nSpecial\t228621\n43寸\t228622\n克玉\t228623\n超爆笑\t228624\n扬声器\t228625\nChrist\t228626\nDr.JIN\t228627\nMV感\t228628\n844717\t228629\n糖貌\t228630\n完结版\t228631\n姜海心\t228632\n15188797660\t228633\n一一一个\t228634\n389\t228635\n守护石\t228636\n一来一往\t228637\n万能锁\t228638\n王秀芳\t228639\n逛去\t228640\n1420\t228641\n张筱雨\t228642\n1425\t228643\n920568\t228644\n1426\t228645\n好莉\t228646\n会要\t228647\nHULQFJ\t228648\n度秘你不要脸变态变态色狼\t228649\n杜江\t228650\n奶昔臭坏蛋\t228651\n灰原哀\t228652\n九个臭\t228653\n比比\t228654\n一一一一\t228655\n马鞍\t228656\n膜炎\t228657\n准确数\t228658\n一性\t228659\n人生果\t228660\n髭须\t228661\nxvzvffcv\t228662\n较为\t228663\n黑锅\t228664\n一怪\t228665\n国际组织\t228666\n155655\t228667\nnicaimeisuzhi\t228668\n龙月\t228669\n0w\t228670\n142p\t228671\n56373853\t228672\n一怀\t228673\n去边\t228674\n你女\t228675\n章鱼哥\t228676\n加把劲\t228677\n凉拖\t228678\n莲下镇\t228679\n八八号\t228680\n许文融\t228681\n28.93点\t228682\n截做\t228683\n注册\t228684\n陈柳度\t228685\n竞有\t228686\n考古队魁北克信息部\t228687\n省份\t228688\n烤串\t228689\n想笑\t228690\n往下跳楼\t228691\n252728\t228692\n158元。11元\t228693\n生下\t228694\n星游击\t228695\n肉制品\t228696\n喜欢修\t228697\n选修课\t228698\n万家岭社区\t228699\n我系\t228700\n老虎座\t228701\n乐&amp\t228702\n步步惊心里十四帮\t228703\n就怕我走\t228704\n危在\t228705\nanurse\t228706\n秦岭大山\t228707\n過什\t228708\n袁剑\t228709\n5137756237537550657317\t228710\n油田\t228711\n明天六面儿华\t228712\n贺亚东\t228713\n古哒\t228714\n油画\t228715\n一起废话\t228716\n刘亚楠\t228717\n宁avz\t228718\n我讨厌你讨厌你讨厌你真的很讨厌你讨厌你讨厌你讨厌你讨厌你讨厌你\t228719\nAppleID\t228720\n#寒更#\t228721\n明我想\t228722\n阿凡牙\t228723\nbbjg66666666999999998888\t228724\n爆笑\t228725\nhggk\t228726\n家音\t228727\n25678\t228728\n0d\t228729\n陈靖仇\t228730\n利基亚热门\t228731\n九九龙洞穴\t228732\n值不值钱\t228733\n25677\t228734\nhggg\t228735\nhggd\t228736\n占用\t228737\nhggz\t228738\n0j\t228739\n常务\t228740\nfhgcfxcfvfjfytxyartfyfttjxtgddg7hbbuhtfxrivvbhjhhuhugujgcuudggggdlvgngxr
dffj\t228741\nS5830i黑1700\t228742\n娱乐圈\t228743\n大四驭\t228744\n豪放\t228745\n俺哥\t228746\nhggv\t228747\n天涯海\t228748\n真好听\t228749\n团腿\t228750\n嘛卒\t228751\n幼稚园\t228752\n咸阳市\t228753\n一个十来度\t228754\nstomarannomatlamat慢anononotkindlinanchentialaamationononononostoliominononometotatalyotatomatotomyaloitamaaaaaatlandianlatodationaletontialittontomatotalasttohololemytome\t228755\narizk\t228756\nsoga\t228757\n五经\t228758\n倒腾\t228759\n2.5元\t228760\n度将秘\t228761\n59516423907\t228762\n呀千金\t228763\n打碎\t228764\n球海\t228765\n看一下\t228766\n099栋\t228767\n奶奶sehimyamahihi嘿嘿ystandoannealittleastononohotecoxcomestandhiostria慢thandalaxyl\t228768\n你好魔\t228769\nniè\t228770\n雷住\t228771\nkdjdhd\t228772\n战巨\t228773\n见地\t228774\n了解到\t228775\n赵飞扬\t228776\n吃擦擦擦擦擦擦擦擦擦擦\t228777\n受委屈\t228778\n宣传部\t228779\ncjdh\t228780\n罗丹妮\t228781\n火影个\t228782\n达利\t228783\n坏处\t228784\n0912\t228785\n盘鼓\t228786\n达到\t228787\n看出\t228788\n100990\t228789\n一个九零\t228790\n周日行\t228791\n说话说\t228792\n咬咬\t228793\nkpp\t228794\n死皇陵\t228795\n爱好女\t228796\n宛容\t228797\n大铁\t228798\n累卡机\t228799\n前胸\t228800\n为什么不奖励\t228801\n姐妹子\t228802\nghihfgihddtcb\t228803\n632567\t228804\nfive\t228805\n诸葛\t228806\n中产\t228807\n最赞\t228808\n中亚\t228809\n大家菲\t228810\n魔镜\t228811\n会好好过日\t228812\n收尋\t228813\n还不掉\t228814\n滚开滚开滚开我讨厌\t228815\n帆布包\t228816\n#ER0\t228817\n中二\t228818\n嚜嘫\t228819\n宏斌\t228820\n家伙伴侣\t228821\n腾身\t228822\n38节\t228823\nzzzzz\t228824\n妖姬\t228825\n新丰\t228826\n2011年5月7日上午11时20分\t228827\nvghhhhh\t228828\n537508017\t228829\n新中\t228830\n鉴别\t228831\n夜购\t228832\n免疫力\t228833\n贯注\t228834\nABAB式\t228835\n走下去\t228836\n聚会山\t228837\n最前\t228838\n新业\t228839\n崔丽香\t228840\n够格\t228841\n新民晚报\t228842\n小度胡\t228843\n新专\t228844\n问您说\t228845\n崇文\t228846\nllllyl\t228847\n李悦茹\t228848\n跑遍\t228849\n团建鹏\t228850\n青蛙王子那你不就是\t228851\n工号\t228852\n新一\t228853\n58866\t228854\n丑八\t228855\n科学家师\t228856\n钟嘉成\t228857\n第六题\t228858\n工台\t228859\n朱若璇\t228860\n必读\t228861\n919789\t228862\n叫魅\t228863\n刻划\t228864\n毁弃\t228865\n480p\t228866\n大梦想家真\t228867\n亚太股份\t228868\n拉猫爪\t228869\n突突突突突突突\t2288
70\n顾日新\t228871\n说来不来\t228872\n1月1日\t228873\n罗子奇\t228874\n不告不告\t228875\n小度兔\t228876\n压扁\t228877\nchiphotosbaiducomxiaodupicitem6159252dd42a283426df397f5cb5c9ea15cebf4cjpg\t228878\n撞豆腐死\t228879\n再见再见再见okok\t228880\n考假\t228881\n六麦郎\t228882\npp板\t228883\n森林里\t228884\n过身\t228885\nmeri\t228886\n忙了真是\t228887\nfunkof\t228888\n蒸萝卜糕\t228889\n用时非\t228890\npixi+\t228891\n一般性转移支付\t228892\n呢哼\t228893\nspotw\t228894\n就是说你走\t228895\n人潮\t228896\n医美\t228897\n防水型\t228898\n平均数\t228899\n度秘度秘我最爱\t228900\n万荣\t228901\nmkrmksaz\t228902\n贾娜刚\t228903\nhjx\t228904\n川藏\t228905\n花朵朵香\t228906\npixiv\t228907\n我相信你五帮我选一个\t228908\n思科\t228909\nsuits\t228910\n小仓兰香阳光我的梦\t228911\n一一级\t228912\n董事长们\t228913\n我一只小小小小鸟飞呀飞却飞也飞不高咪咪咪咪咪咪真美\t228914\n敦煌\t228915\n加课\t228916\n成一团麻\t228917\n13776377763\t228918\n虚浮\t228919\n新店面\t228920\n花非花雾非雾吧我等你看完了再来找我\t228921\n朱家红美\t228922\n智能\t228923\n冷笑话精选\t228924\n毛梦璇\t228925\n先天性\t228926\n乍一看\t228927\n辽鬯\t228928\n得事\t228929\n出国\t228930\n龙之爬山\t228931\n70多斤\t228932\n275分\t228933\njiade\t228934\n红霉\t228935\n不v知道二中分校\t228936\n光人肉裁判\t228937\n15386\t228938\n恋爱被\t228939\n15389\t228940\n度秘不靠谱\t228941\n小说书\t228942\n云诗彤\t228943\n绩优易控\t228944\n贾靖峰\t228945\n很好想\t228946\n黎太阳\t228947\n共同体验收\t228948\n场里\t228949\n赵思涵\t228950\n65112263\t228951\n6855555852453245153222332269840045862655\t228952\n背叛灵魂\t228953\n二二六八\t228954\n沛儿\t228955\n这个生\t228956\n阿森纳北京行\t228957\n臭肚明\t228958\n有意识\t228959\n跳台\t228960\n好啦好啦告诉你吧我是夏末\t228961\n易拉罐\t228962\n动画系\t228963\n刘思希\t228964\n跳出去\t228965\n论不到\t228966\n白说\t228967\nnonono幸福像花开一样\t228968\n王燕如\t228969\n张莹\t228970\n斗鸡海珠开心\t228971\n丝毫\t228972\n法鲁红\t228973\n52111\t228974\n五岁半\t228975\n時候\t228976\n1385104660\t228977\n李雨聪\t228978\n一闪而过\t228979\n丁修喜\t228980\n劲爆点\t228981\n０５秒\t228982\n信息觉\t228983\n停产\t228984\n白话\t228985\n告戒\t228986\n背包包\t228987\n床位\t228988\n飞谝\t228989\nSylsjgohln\t228990\n琴瑟\t228991\nttef\t228992\nVIXX\t228993\n杨靖宇\t228994\n47847877478\t228995\n九点\t228996\n线位\t228997\n1500多条\t228998\nCSXsdads\t228999\n肥家\t229000\n223knn798huihbihiioh6q113\t229001\n永不磨灭\t
229002\n食算\t229003\n弹奏\t229004\n为难说\t229005\n放了找\t229006\n麒\t229007\nsnhboth\t229008\n水仙\t229009\n地核\t229010\n框架\t229011\n45所\t229012\n大报\t229013\n麤\t229014\n麦\t229015\n机忘兼觉梦中闲\t229016\n大抵\t229017\n中曹司\t229018\n臭蛋\t229019\n焊瘤\t229020\n麷\t229021\n忘本\t229022\n线体\t229023\n麻\t229024\n麼\t229025\n麽\t229026\n食管\t229027\n王崇楠\t229028\n勤快\t229029\n菲律宾国防部\t229030\n很单纯\t229031\n我的天呀好多也有你\t229032\n说说话\t229033\n味道\t229034\nWhwbww\t229035\n好大一棵树\t229036\nYggff\t229037\n南津街\t229038\n恣意\t229039\n花觉\t229040\n裤衩\t229041\n连贯\t229042\n耍我得\t229043\n说说说\t229044\n广岛\t229045\n曹丹丹\t229046\n拿走好\t229047\n赌车\t229048\n400次\t229049\n打错\t229050\n老故事\t229051\nMAG\t229052\ngfxchui\t229053\n哎好嘞\t229054\n幸福孩\t229055\nMAN\t229056\n无阻\t229057\n梵凯\t229058\n囚犯\t229059\n23万条\t229060\n乘务室\t229061\n阖家幸福\t229062\n妻儿\t229063\n急训\t229064\n亲密的话\t229065\n粗发集团\t229066\n到好不好\t229067\n755466\t229068\n警棍\t229069\n孩小宝贝\t229070\n蒋导\t229071\nsjilporigogyq\t229072\n吴雨露\t229073\nnakhw\t229074\n神漫漫\t229075\n美等\t229076\n惨不忍睹\t229077\n一条线\t229078\n二二幺八\t229079\n熊绵绵\t229080\nYA乂\t229081\n抽水\t229082\n世嘉5\t229083\niphone6s\t229084\n长方\t229085\n神咒\t229086\n省纪委\t229087\n仓促\t229088\n2百5\t229089\n埋下\t229090\n声儿\t229091\n不想说\t229092\n答案呢的呢的呢的呢的呢的\t229093\n烦人再见\t229094\nfffffffftffffgF\t229095\n何孟怀\t229096\n孙佳豪\t229097\n中环希尔顿酒店\t229098\n叶勇\t229099\n精武\t229100\n浓重\t229101\n剪丕\t229102\n傲世分钟\t229103\n纪元巨\t229104\n本月28日\t229105\n天包\t229106\n长文\t229107\n触手\t229108\n摸弄\t229109\n很想死\t229110\n傻子色\t229111\n闪灯\t229112\n半空\t229113\n暖味\t229114\nhji\t229115\n人工通话\t229116\n好不好阿独立\t229117\n多篇\t229118\n去年\t229119\n牡丹纹\t229120\n出箱\t229121\n终审\t229122\nTrees\t229123\n外能\t229124\n作业班\t229125\n五月黄金周\t229126\n照片女\t229127\n值得一提\t229128\n原音\t229129\n红蟹\t229130\n我的哦的哦的哦的\t229131\n操练\t229132\ntfduv\t229133\n你是谁你是谁你是真的小妹妹\t229134\n郭纪欣\t229135\nAveadanms\t229136\n第1期\t229137\n山泉水\t229138\n名俊晖\t229139\nghfdd\t229140\n把岁\t229141\nQ3247756735\t229142\n看到你了你不懂\t229143\nhaha\t229144\n那你猜猜猜我是女的还\t229145\n泼墨\t229146\n好呀我宅\t229147\n告栏\t229148\n手恒温\t229149\n汉子哥\t229150
\n点图\t229151\n虾饺\t229152\n686554588hxgdg\t229153\nORZ\t229154\n德生\t229155\n李师太\t229156\n750423380746\t229157\n李师大\t229158\n度蜜度秘你好美\t229159\n尤筱薇\t229160\n广州香格里拉酒店\t229161\n直辖市\t229162\n姑影院\t229163\n娘泡\t229164\n加菲猫\t229165\n江畔\t229166\n夜长梦多\t229167\n胶合物\t229168\n上天电视网\t229169\n五月蕾雅\t229170\ntfbosofitos\t229171\ngoooh\t229172\n粮囤\t229173\n鲁中浩\t229174\n华新菜园\t229175\n有几多愁\t229176\n西游记的三打白骨精\t229177\n认知道\t229178\n世界地球日\t229179\n黄小三\t229180\n横过\t229181\naaah\t229182\n乐烧鹅\t229183\naaaj\t229184\n鼻塞\t229185\n度秘你个不要脸的混蛋混蛋混蛋混蛋混蛋混蛋不要脸不要脸不要脸\t229186\n1199\t229187\n跑哄\t229188\n余几\t229189\n七五年\t229190\n好结果\t229191\n1190\t229192\n5452\t229193\n1192\t229194\n5454\t229195\n5455\t229196\n十小时子\t229197\n第四军医大学\t229198\nfftcghvh\t229199\n应外合\t229200\nlal\t229201\n杂物\t229202\nlan\t229203\nlao\t229204\nlai\t229205\n阿胶\t229206\nlad\t229207\nlaa\t229208\n把把关\t229209\n三角\t229210\nlax\t229211\n够不到\t229212\n曹子\t229213\n孟颖\t229214\n垂下\t229215\n给我喝\t229216\n省略句\t229217\n弹牙\t229218\nlaB\t229219\n冭八\t229220\nlaY\t229221\n19.52亿元\t229222\ncoadie\t229223\n这招\t229224\n省略号\t229225\n改新\t229226\ncdefjehyzkaroon9omaljk\t229227\n泉哥\t229228\n一个双\t229229\n厌倦\t229230\n欧呀斯米\t229231\n装嫩\t229232\n吧宝\t229233\n冲破\t229234\n37P\t229235\nzjzx\t229236\n2c十三\t229237\n一个只\t229238\n唉累\t229239\n2016年春节\t229240\n活下来\t229241\ngyfugcj\t229242\n小泰国\t229243\n给我见\t229244\n伊顿\t229245\n一个句\t229246\n肺,\t229247\n二十三号\t229248\nlijdks\t229249\n蓝轩\t229250\n其貌不扬\t229251\n扮酷\t229252\n一个号\t229253\n吧家\t229254\n上镜率\t229255\n老鸭\t229256\n老鸨\t229257\nhttppinyincne20367\t229258\n3914\t229259\ng7062\t229260\n逗狗\t229261\nbreak\t229262\n做好准\t229263\n辅导\t229264\n罗波罗\t229265\nyesssss\t229266\n放哨\t229267\n庚心\t229268\n本会\t229269\nAAA级片\t229270\n杨桃\t229271\n37%\t229272\nThomasE\t229273\n铭刻\t229274\n80多万方\t229275\njdkmkmjgj\t229276\n老鸟\t229277\n你是我我是你你\t229278\n鱿鱼\t229279\n杨桐\t229280\n370\t229281\n373\t229282\n372\t229283\n375\t229284\n374\t229285\n376\t229286\n=丫丫丫丫丫丫\t229287\n弄瞎\t229288\nxhch\t229289\n阿糙\t229290\n两回\t229291\n找到了\t229292\n爱立汀rlboy了没\t2
29293\n心情不棒\t229294\nffyyrugk\t229295\n笙歌\t229296\n不必了吧\t229297\n秀山\t229298\n开放日\t229299\n航空\t229300\n梦梦琪琪\t229301\n两国\t229302\n赞一个在在一个再一个再一个再一个在\t229303\n背背背\t229304\n4tyou\t229305\n螺旋状\t229306\n干呕\t229307\n王孙\t229308\n日日日日\t229309\n隋思琪\t229310\n王孜\t229311\ne武\t229312\n水星\t229313\n平跟\t229314\n奥黛\t229315\n完出\t229316\n倒吧\t229317\n我的密度你在哪儿\t229318\n俾仔仔\t229319\n第七名\t229320\n水明\t229321\n越策\t229322\nntj\t229323\nntm\t229324\n了解放生\t229325\n陇东\t229326\n来呀我等\t229327\n信永\t229328\n军头\t229329\n20000000003000000\t229330\nNational\t229331\nhuninilki\t229332\njtttt\t229333\n林丰正\t229334\n狂呼烂叫\t229335\n玉皇大帝\t229336\n张红魔\t229337\n清仓\t229338\n久存\t229339\n拆卸\t229340\n房地产公司\t229341\n徐荣菊\t229342\n刘海扬\t229343\n褚琦\t229344\n同床共枕\t229345\n你的的月\t229346\n戚木子\t229347\n松谦\t229348\n触觉\t229349\n徐海星\t229350\n触角\t229351\n何紫薇\t229352\n阳新\t229353\n尼比华少\t229354\n一秒钟\t229355\n度泌\t229356\n真空袋\t229357\n灵丹妙药\t229358\n重做\t229359\n小到大我是姐姐\t229360\n喲喲\t229361\n靠太搞笑\t229362\n射线接触史\t229363\n作息们\t229364\n123456778971\t229365\ncvddvfe\t229366\n芭夯你是曼朱格什六十\t229367\n一一点\t229368\n神大虾\t229369\n度秘你干什么你不是我的小秘书\t229370\n9400\t229371\n群元\t229372\n心灰意懒\t229373\n你在哪了我找你去\t229374\n零售股\t229375\n大间\t229376\ndvjcbkjc\t229377\n30集\t229378\nleastgutgcdftfghvchyusgfkfhhjcjfhxhyfgyiggjhhydyytbhhghfftabHGHhlhjiuojkkghh\t229379\n行快点\t229380\n交棒\t229381\n一涌\t229382\n客队\t229383\n国民英雄\t229384\n投放\t229385\nMymother\t229386\n紫原敦\t229387\n偶轩\t229388\n10道\t229389\n268735\t229390\n哈谢娜\t229391\n大门\t229392\nxj\t229393\n纸业\t229394\n考证\t229395\n2010年5月24日\t229396\n陆粉英\t229397\n长盛路\t229398\n欢生活\t229399\n太胖\t229400\n罗贝塔女\t229401\n眼狼\t229402\n六五一八\t229403\n戏说乾隆\t229404\n娜扎\t229405\n前呼后拥\t229406\n亲乒乒乓乓\t229407\n舍不忠\t229408\n梅西梅\t229409\n文可\t229410\n机座机\t229411\n就是我可以\t229412\nu盘\t229413\nnnjkjmjkn\t229414\n曹艳玲\t229415\n乐陵\t229416\n小笨猪\t229417\n回形\t229418\n腥气\t229419\n戴宝娟\t229420\n头一年\t229421\n常藏法师\t229422\n王颜秘\t229423\n听厌\t229424\n次日7点\t229425\n文史\t229426\n四十多名\t229427\n六点下午三点半\t229428\n熊掌者\t229429\n1088万\t229430\n直爽\t229431\n要不我才是\t229432\npsmc\t
229433\n救咳\t229434\n工级\t229435\n哭冒\t229436\n原以为\t229437\n真的好搞笑\t229438\n卡布裆\t229439\n天光明\t229440\n顾晓燕\t229441\n陆金瓶\t229442\n二零五五\t229443\n度秘我爱你爱你\t229444\nqqgjjfngq\t229445\nindefile\t229446\nzanche\t229447\nugvvfh\t229448\n恩杰\t229449\n60公斤\t229450\nkhom\t229451\nxn\t229452\n苏灵羽\t229453\n第200个\t229454\n楠楠生\t229455\n名厨\t229456\n十四老\t229457\n和就是\t229458\n很傻傻\t229459\n家饰\t229460\n會該\t229461\nhjkv\t229462\n新兴县\t229463\n鲍浩然\t229464\n13768604528\t229465\n姬帅帅\t229466\n妖狐堡\t229467\n雷蛇守备\t229468\n酱琳\t229469\nhjkf\t229470\n扰佳为\t229471\nhjkk\t229472\n班委\t229473\n稻香\t229474\n效忠\t229475\n家饭\t229476\n4g网速\t229477\n地价\t229478\n华诞\t229479\n来了像\t229480\n2006年\t229481\n商业化\t229482\n长太吵\t229483\n多看点\t229484\n永威\t229485\n很桑心耶\t229486\n邓天卓\t229487\n汽车片\t229488\n七睡\t229489\nhddt\t229490\n没结\t229491\nnaocan\t229492\n7q五\t229493\n男娃\t229494\n地仙\t229495\nhddd\t229496\n纠结结\t229497\nhddg\t229498\n手术学\t229499\n你位\t229500\n华语\t229501\n白志龙\t229502\n杨老\t229503\n期市镇\t229504\nhddi\t229505\n担忧银行再融资\t229506\n警视厅\t229507\n慢羊羊\t229508\n留片\t229509\n冒不喜\t229510\n通用汽车\t229511\n彭理事\t229512\n异乡人\t229513\n赛道\t229514\nOMG\t229515\n蒙氏蒙\t229516\n签名照\t229517\n美脚\t229518\n粤剧\t229519\nLA独家图片#\t229520\n等速\t229521\n滨江\t229522\n版家\t229523\n助农\t229524\n尿了玩\t229525\n面面面\t229526\n历山\t229527\n六枝度\t229528\n名侦探柯南动你家哪的派\t229529\n山中宝\t229530\n超神记号给我一个\t229531\n计生办\t229532\n小牡眯\t229533\n熹贵妃\t229534\n海龙电子商场\t229535\ntnebrhh\t229536\n三十三百一十五棵\t229537\n冷爱公主VS风云四王子\t229538\n太阳人\t229539\n泰语\t229540\n434653464228787257245\t229541\n泰诗\t229542\n尔康\t229543\n永胜县\t229544\n塔兰特\t229545\n冰冰美\t229546\n26天\t229547\n凤凰传奇凤凰传奇凤凰传奇\t229548\n杨子干\t229549\n叫声\t229550\n死亡飞车2\t229551\n爽倍儿爽\t229552\n想起我来\t229553\n识别率\t229554\n文化创意产业\t229555\n订订\t229556\n220年\t229557\n谢离\t229558\n张韶涵\t229559\n纪香\t229560\n麻子帅比\t229561\nfuyhgbvvgggffdxfcx\t229562\n条码\t229563\n用政\t229564\n选股\t229565\n不懂样\t229566\n体投地\t229567\n赞帅帅\t229568\n丕少\t229569\n头痛眼痛\t229570\n流窜\t229571\n隋阳\t229572\n消失的爱\t229573\n不是我知道我知道我\t229574\n对的你可以\t229575\n团长\t229576\n腧穴日敏化艾灸新疗法\t229577\nspspo
rts\t229578\nvcyskfx\t229579\n几涸\t229580\n最亲的朋友\t229581\nbxxvd\t229582\n晓晨\t229583\n大姨太\t229584\n大姨夫\t229585\n王老实\t229586\n王小茜\t229587\n云里雾\t229588\n烧碱\t229589\n热天堂\t229590\n胡萝卜丝\t229591\nHDUSJD\t229592\n晓晓\t229593\n显灵\t229594\n润滑剂\t229595\n我的钱多多炼爱记社保中心医院\t229596\n千叮万嘱\t229597\n教校\t229598\n我诺胜\t229599\n滨城\t229600\n山东教育出版社\t229601\nqipidouteshuchunpu\t229602\n今天下午5点\t229603\nhohft6cvnvvjjcxbhzhkmkgxgijbk97t4xokVXHjjvvcgh5vvvcj5cvnjvfcbgkhvfvcbkgfjfxcxcjhzbooxhkl9hhyhvchjgchjjvghhhiihgooppjfkihhoi8cckiirtrfhhhhhhvxddxbkyszzchhutxxxxxcghuuhuhhggggvchggvvggtffhghggggcbbvbbjj56vbbj8icfcjihghggggxjgjcgbfxgg\t229604\n2723822283\t229605\n元媛\t229606\n撇嘴\t229607\n福北人\t229608\n没有心没有\t229609\n只等\t229610\n梅雷莱斯\t229611\n你是我的流星落\t229612\n柏树\t229613\n冻咪\t229614\n相同点\t229615\n8575\t229616\n汪书羽\t229617\n黄瓜汁\t229618\n和田铁路\t229619\n中国技术研发中心\t229620\nhxnf\t229621\n模特公司\t229622\n刘紫沫\t229623\n535分\t229624\n好心眼\t229625\n啦啦啦着推优\t229626\n名利尊严健康\t229627\n4点15\t229628\n1400毫安\t229629\n扭断\t229630\n吾吾\t229631\n消费水平\t229632\n俊头\t229633\n13：04\t229634\n13：02\t229635\n2ne一\t229636\n弟妹\t229637\n变更\t229638\ndjgdn\t229639\n兄个\t229640\n吴心语\t229641\n黄海大战\t229642\n25384236842\t229643\n麻垌高中\t229644\n〖夫\t229645\nIPHONE5c\t229646\n刺杀\t229647\n二羊羊\t229648\n张若昀\t229649\n摄线\t229650\n做我女朋友\t229651\n丰衣足食\t229652\n老子乱\t229653\n伦家伦\t229654\nygksutktyf\t229655\n胶南\t229656\n论文家\t229657\n阴霾\t229658\n仓井\t229659\n四斤\t229660\n冻手冻脚\t229661\n四方\t229662\n喵喵喵喵喵喵喵喵喵喵喵\t229663\n万家段\t229664\nXperia\t229665\n好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好吧好啊好啊好吧\t229666\nhi场\t229667\n胶卷\t229668\n战平\t229669\n不含铅\t229670\n安尧尧\t229671\n哎替\t229672\n纪纪\t229673\n鲤鱼红豆\t229674\n孝敬宪\t229675\nqoie\t229676\n大西轰\t229677\n充值得\t229678\n稀土\t229679\n叫走\t229680\n杰拉德华莱士\t229681\nIDOL\t229682\ntcxrzezez\t229683\n法自自然\t229684\n走读\t229685\nvshow\t229686\n真的不想和你\t229687\nmyozis200costsheltoundataboser\t229688\n酒杯犬\t229689\n悦耳\t229690\n可真\t229691\n爱的回忆\t229692\n不妥协直到\t229693\n查禁\t229694\n针尖\t229695\n伸出\t229696\n551G\t229697\n妮妮\t229698\n所人\t229699\nk630\t229700\nhjfk\
t229701\n2月2日\t229702\n宠爱你好\t229703\n马龙\t229704\n餐巾纸\t229705\n给你吃没v女女女美女\t229706\n牙板\t229707\n嘻什么嘻什么嘻\t229708\n曹雨欣\t229709\n金福硕\t229710\nfr韩博\t229711\n陈申\t229712\nAlice\t229713\n购物卡\t229714\n杜玉\t229715\n安昂\t229716\n38282\t229717\n日会\t229718\n乐斌\t229719\n痛彻\t229720\ncbdbrjCBshen\t229721\n死于非命\t229722\n55578866\t229723\n小琳琳\t229724\n1981年\t229725\n飞朳\t229726\n饥饿男\t229727\n于建嵘\t229728\nzrdx\t229729\n金莲川\t229730\nyoseooorfifimalimalifoustom\t229731\n借借\t229732\n潮州市\t229733\n顺藤摸瓜\t229734\n飞木\t229735\n法和\t229736\n爸爸小爸爸黑带我来的黑\t229737\nW呃点幺零幺零\t229738\nv反反复复v得瑟饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿饿\t229739\n毒素\t229740\n008741517\t229741\n干秘\t229742\n海龙王\t229743\n下弦销路广物美\t229744\n1月19日\t229745\nTina\t229746\n我一心爱民几时宅\t229747\nhumor\t229748\n丽红\t229749\n词典\t229750\n堡先走\t229751\n小琪琪\t229752\n聪明泵\t229753\n蒙西\t229754\n加进\t229755\n我今孤女猫疯\t229756\n白日梦\t229757\n陈功勋\t229758\n家票\t229759\n转任\t229760\n神经紊乱\t229761\n330美元\t229762\nhYssF\t229763\n6490364734380\t229764\n一介个\t229765\n党工委\t229766\ncabbb\t229767\n容麽麽\t229768\n庵男\t229769\n毕老爷\t229770\n窦帅\t229771\n喜拐\t229772\n纯白\t229773\n来花\t229774\n莫虚\t229775\nonthe\t229776\n度红\t229777\n红星美凯龙集团\t229778\njnin\t229779\n逃离\t229780\n12kg\t229781\nBNNb\t229782\n丫头头儿\t229783\n莫留遗憾\t229784\n还有我在\t229785\n等明天不信\t229786\n岛屿\t229787\nxiayi\t229788\ncgggggu\t229789\n干露露\t229790\n152059626\t229791\n天影\t229792\n噶积分卡v\t229793\n绝不放弃\t229794\n所到之处\t229795\n第一集\t229796\n通货膨胀\t229797\n油炸机器人\t229798\nhdgdgs\t229799\n一百万个\t229800\n去取\t229801\n纷扰\t229802\n南湖公园\t229803\n四十八小时之内\t229804\nusbn接口\t229805\n2月22日\t229806\n25发\t229807\n四月君嘘\t229808\n科幻片\t229809\n奥拉夫\t229810\nHFPA\t229811\n心援\t229812\n庄雨凡\t229813\ntjtajttatata\t229814\nBinern\t229815\n照片版\t229816\n零花钱\t229817\n菲律宾能源部\t229818\n25号\t229819\n25台\t229820\n25史\t229821\njdbx\t229822\nq个\t229823\njdbz\t229824\n林子\t229825\n5000万\t229826\n李着\t229827\n失道寡助\t229828\n总归\t229829\n赵安妮\t229830\n151.16亿元\t229831\n15票\t229832\n废柴\t229833\n另一个乘数\t229834\n自知之明\t229835\n草率\t229836\n或多或少\t229837\n三昧真火\t229838\n自食其力\t229839\nmtr\t229840\nmtp\t229
841\n101100\t229842\n闺名\t229843\n千人之诺诺\t229844\n东南气\t229845\n53858\t229846\nRYKIEL\t229847\n420683200012284214\t229848\n有么不废话\t229849\n标示\t229850\n创世928\t229851\n李睿\t229852\nmtg\t229853\nyrertedugrfugRGY4475377325555555\t229854\njvyv7\t229855\n不应当\t229856\n中间商\t229857\n黑米\t229858\n最后一道门\t229859\n油盐酱醋\t229860\n偶叶子\t229861\n大红色\t229862\n理你了谁\t229863\n猫蚤\t229864\n六四四零零五\t229865\nPigitPiget\t229866\n奔布\t229867\n静静地\t229868\n激活码\t229869\n全塞\t229870\n圭仔\t229871\n我来了我来了\t229872\ncigic\t229873\n谢宇洋\t229874\n事海\t229875\nttrdh\t229876\n6月16日\t229877\n城市化\t229878\n豪利士牛排\t229879\n死血神\t229880\n干脆\t229881\nQ足v\t229882\n千骨\t229883\n朝令夕改\t229884\n家祖\t229885\n飞飞CK\t229886\nTcyf\t229887\n精神片\t229888\n长嘴\t229889\n天授\t229890\n德国央行\t229891\n驾照不然的话\t229892\n晚安了先这样\t229893\n同种\t229894\n赤笑\t229895\n200多户\t229896\njshjs\t229897\n王若宇\t229898\n神阿\t229899\n小金鱼\t229900\n猪脸皮\t229901\n昂胶\t229902\n不要不要不要不\t229903\n药师\t229904\nwmmmm\t229905\netgcvxbfu\t229906\n李铭秋\t229907\n邓子\t229908\n中广流行网-娱乐E世代\t229909\n8985765465456654656567\t229910\n老不正经\t229911\n十余个\t229912\n武慧霞\t229913\n三棵树\t229914\n闯黄闪\t229915\n康佳\t229916\n朱燕\t229917\n傲娇受\t229918\n地标\t229919\n很幸福\t229920\n抵发\t229921\n圆珠笔\t229922\n4起\t229923\n装点\t229924\n农炮\t229925\n国防科技大学\t229926\n借种\t229927\n厌恶恶心死\t229928\n追唱\t229929\n月月月\t229930\n四五点\t229931\n奈斯兔咪\t229932\n中国美院\t229933\n名度\t229934\n3ab\t229935\n60日线\t229936\n张林焕\t229937\n家卫国\t229938\n岁度秘\t229939\n深林\t229940\n顺地\t229941\n北谷丰村\t229942\nnanofiltration\t229943\n嗯一圈一青铜龙\t229944\ndgjjhgff\t229945\n评审员\t229946\n杨某\t229947\njwalle\t229948\n烟月传\t229949\n好开心哈哈\t229950\n变态啊你是女人我是男人\t229951\n染色体\t229952\nQQB\t229953\n呋呋呋呋\t229954\nTATAT\t229955\n好我最喜欢\t229956\n韩版vvv吧nhggbhdvbnccnnjfjjxfjfjfnfnfnmgmcnvnvncncncncnvnvmvnvnvmnvnvbmnvnnnvvncnvn\t229957\n刘广毫\t229958\nDown\t229959\n贾佳妮\t229960\nQQQ\t229961\n杨柳\t229962\n杨洁怡\t229963\n比利比利\t229964\n拿破仑\t229965\n人人工老老老\t229966\n阿家哥\t229967\nwhatthesquarerootof99\t229968\n群名\t229969\n性功能\t229970\n罐頭扇貝\t229971\n举杯同庆\t229972\n妹控\t229973\n不过不去\t229974\n匯额额\t229975\n
Orange\t229976\n大度万\t229977\n广东区\t229978\n点点点点点点点点点点点点点点点点点点点点点点点点点点点点\t229979\nQAQ缺爷\t229980\n丁子玉\t229981\n晚后\t229982\n灶头\t229983\n洞窟\t229984\n事勤\t229985\n321种\t229986\n1019\t229987\n22000元\t229988\n家规\t229989\n1016\t229990\n1010\t229991\n1011\t229992\n1012\t229993\n1013\t229994\n水在背我在飞行\t229995\n公寅\t229996\nn333\t229997\n我的郎\t229998\n交管\t229999\n斗嘴嘞\t230000\n飞iron\t230001\n软文\t230002\n鬼丑八怪\t230003\nxxccccx\t230004\n缺锌\t230005\n玛瑞琳莱福特\t230006\n公寓\t230007\n和玩\t230008\n畜生\t230009\n我的部\t230010\n3818807\t230011\n贤妻良母\t230012\n袋鼠妈妈\t230013\n吃货\t230014\n我求你了动脉\t230015\n鸶骨\t230016\n木女\t230017\n真的不想\t230018\n类别\t230019\n紫玉\t230020\n这班\t230021\n74874874488\t230022\n毕业了你说\t230023\n巾帼\t230024\n姜敏京\t230025\n刷脸认证\t230026\n呵呜\t230027\n宁进\t230028\n自选\t230029\n有系\t230030\n北工大\t230031\n曲姐\t230032\n聊弄\t230033\n腰部\t230034\n摊开\t230035\n最亲的人\t230036\n指标性\t230037\n小马虎\t230038\n迪塔茶\t230039\n有皮\t230040\n什山\t230041\n心神不宁\t230042\n狠便当\t230043\nhiloe\t230044\n杨宏斌\t230045\n乒乓球\t230046\n发稿\t230047\n几125家\t230048\n看家狗\t230049\n呵呵\t230050\n外校\t230051\n自造\t230052\n上之战\t230053\n陈江容\t230054\n举世瞩目\t230055\n三轮车\t230056\n乖乖宝\t230057\n读书手\t230058\n衣袋\t230059\n库里\t230060\n网题\t230061\n增值\t230062\n陈晗\t230063\n多一块\t230064\nfinis\t230065\n季度\t230066\n臧运梅\t230067\n陈晟\t230068\n24276n74256\t230069\nadvx\t230070\n校路\t230071\n847454448458\t230072\n曼沙珠华\t230073\nSBn\t230074\n潮洲\t230075\nSBS\t230076\n立刻\t230077\n海飞丝\t230078\n你好讨厌讨厌讨厌讨厌\t230079\n24谢\t230080\n西绪炎\t230081\n魔幻类\t230082\n内里\t230083\n度密咱么\t230084\n同步盘\t230085\n章文鑫\t230086\nSBD\t230087\n肚皮\t230088\nYoudied\t230089\n陈晨\t230090\n网上支付\t230091\n多肉少\t230092\n多家\t230093\n怪乖\t230094\n120506\t230095\n八丰\t230096\n国家会议中心\t230097\n八中\t230098\nChiggeryo\t230099\n5辆\t230100\n体统\t230101\n天份\t230102\n八个\t230103\n八两\t230104\n天价\t230105\n神弑神你是谁之你是谁的喜酒\t230106\n天今\t230107\n南方科技大学\t230108\n饥饿感\t230109\n秘蛛\t230110\n贵人类\t230111\n多宝\t230112\nsawrtyiop\t230113\nwrite\t230114\n稀缺\t230115\n贱奴\t230116\n贱女\t230117\n金佳星\t230118\n八下\t230119\n35张\t230120\n八万\t230121\n千韵\t230122\n八一\t
230123\n八七\t230124\ntnnnnntlnntntpp\t230125\n某地\t230126\nohce\t230127\n0.7%\t230128\n冷眼句\t230129\npqp\t230130\npqr\t230131\n台红色\t230132\n了一里不爱\t230133\n受不了了摸摸\t230134\n不说呀\t230135\n不是了和你的四本书红领巾的故事\t230136\nTeti\t230137\n密米得\t230138\n逃顶\t230139\n神你是我的什么人你到底是谁你是谁你是谁你是谁你\t230140\n猪扒\t230141\n安知薇薇\t230142\n5.7\t230143\n深海\t230144\n粉笔屑\t230145\n塞利西\t230146\n過神秘小鎮大冒險嗎\t230147\n东方之子\t230148\n孵化\t230149\n年术\t230150\n猪手\t230151\n自负\t230152\n鞋面\t230153\n亲一个*^_^*\t230154\n石溪\t230155\n笃民\t230156\n几七几几\t230157\n呜呜祖\t230158\n度姑\t230159\n度姐\t230160\n冫冫厂卜壮\t230161\n俄莪\t230162\n固混合物\t230163\n有没\t230164\n他的话\t230165\n小八路\t230166\n社科\t230167\n两辈子\t230168\n自费\t230169\n真好我真好我真美女个小兔子小兔子小兔子小兔子小兔子小兔子小兔子\t230170\nvchgxfgffugdt\t230171\n沧桑石\t230172\n哈子大歌星要不是哥一呀一呀一呀一呀\t230173\n明要\t230174\n乖积\t230175\nJfkkxnj\t230176\n光滚滚滚滚滚滚\t230177\n聚集地\t230178\nthgnd\t230179\n自贡\t230180\n好笑我心情不好\t230181\n索溪谷\t230182\nhttppinyincne19715\t230183\n世袭\t230184\n碰伤\t230185\nhttppinyincne19711\t230186\n脑袋号\t230187\nhttppinyincne19719\t230188\n省市\t230189\n叶绿\t230190\n八百斤\t230191\n裸主\t230192\n体内\t230193\n终南别业王维\t230194\n抛洒\t230195\n恶风\t230196\n桓玄\t230197\n恰版=\t230198\njshsjjshsode\t230199\n约份\t230200\n永遇乐京口北固亭怀古\t230201\n1942年\t230202\n红彤新一比柯南要好\t230203\n真讨厌你\t230204\n肖福全\t230205\n大家一起萌萌哒\t230206\n楼八楼\t230207\nHchfngfjfgkfgfhfhvhdhfghdjhvgfugjhghgghfgfggggvjvhb\t230208\n2条\t230209\n六零二\t230210\n喋喋不休\t230211\nyvhvb\t230212\n讨教\t230213\n早晨八点\t230214\n塔里木河\t230215\n2杯\t230216\n湖密\t230217\n维尔马\t230218\n天天有喜二\t230219\n二雅\t230220\nRCDXFB\t230221\n东港区\t230222\n胃溃疡\t230223\n2板\t230224\nQajke\t230225\n影子\t230226\n王俊达\t230227\n歇题\t230228\n木然\t230229\n王秘诡\t230230\n无穷没有\t230231\n山景\t230232\n六六无穷运财水晶挂饰\t230233\n王菁萱\t230234\nqososd\t230235\nh1njry\t230236\n彬果子\t230237\n爸妈不在家\t230238\n479361\t230239\n西门口\t230240\n花木兰\t230241\n打听\t230242\n叠加\t230243\n琦弟\t230244\n惨烈\t230245\n内存气死你\t230246\n色球\t230247\nmuamgh\t230248\ntiggf\t230249\nrlz\t230250\n脑筋急转弯猜一猜\t230251\nrein\t230252\n天荒地\t230253\n女告诉我\t230254\n慰问品\t230255\nq兔\t230256\n班长\t2
30257\n攻击队\t230258\n疯狂化\t230259\n20多亿\t230260\n尉氏\t230261\n宣传单\t230262\n名模\t230263\n横笛\t230264\n利金刚\t230265\n效仿\t230266\n好过\t230267\n八十八十兆\t230268\nhappybees\t230269\n林正曲\t230270\nDKDDIDK\t230271\n破现\t230272\n满满\t230273\n夜包\t230274\n03171821223212\t230275\n自体\t230276\n利落\t230277\n精华液\t230278\n許多腦殘\t230279\n禁锢\t230280\n扎菲\t230281\n空置\t230282\ngork\t230283\n代笔\t230284\n让画\t230285\n脖子处\t230286\n威加海内兮归故乡，\t230287\n坐讲\t230288\n入选\t230289\n亚当\t230290\n留恋\t230291\nhighschoolloveon\t230292\n服药\t230293\n055685585585588688988889\t230294\n真我真生气\t230295\n死死\t230296\n传到\t230297\nLOL啦\t230298\n花戏\t230299\n野生动物\t230300\n茶室\t230301\n形声字\t230302\n干嘛呀到快九啊我喜欢你\t230303\n毛择东\t230304\n四我喜欢\t230305\niya\t230306\n性空谈\t230307\nhelloks\t230308\n12年\t230309\n哇好帅\t230310\n杜蜜芽\t230311\n王瑞娥\t230312\n99多少\t230313\nxxccv\t230314\n晚安晚安晚安晚安晚安晚安晚安晚安\t230315\n黏性\t230316\ndevery\t230317\n限额\t230318\n破嘴\t230319\n第20届\t230320\n引出\t230321\n换一换\t230322\n和一个\t230323\n2222222222222222222222\t230324\n嫡与庶\t230325\nAnyholeteen\t230326\n15000人次\t230327\n吕学义\t230328\n西红吉娃\t230329\n恋三正\t230330\n兔兔\t230331\n老不好\t230332\n叶罗丽精灵梦\t230333\n和一下\t230334\n去年以来\t230335\n低管\t230336\n丢卒\t230337\n佛山市粮有油食品公司\t230338\nfrrgdd\t230339\n清彩\t230340\n鄘矀躤\t230341\n你的爸爸\t230342\n傻子瓜子\t230343\n三千六一十二万五\t230344\n神州行\t230345\n1瓶\t230346\n不可数名词\t230347\n东燕\t230348\n甘肃三院\t230349\n一不一样\t230350\n钟意\t230351\nisdg318g5s\t230352\n一千一百一十九\t230353\n后爸\t230354\n竖九隆冬\t230355\ncfhgdfhjj\t230356\n八七投诉你\t230357\n能不能不周\t230358\n今前\t230359\nhttphhiphotosbaiducomxiaodupicitemc8177f3e6709c93d07eb7b84983df8dcd10054a3jpg\t230360\n酒师\t230361\n看不再想你\t230362\n嘛多维\t230363\n好啦好啦好啦我\t230364\ncctv5\t230365\n吐痰\t230366\n唔知首歌\t230367\n度秘我爱你我喜欢\t230368\n老师门\t230369\n還不如\t230370\n646461\t230371\n偷狗\t230372\n民办高校\t230373\n酒席\t230374\n五十五\t230375\n小东东\t230376\n一礼拜\t230377\n桂花树\t230378\n系统软件\t230379\n曹守东\t230380\n5多少\t230381\n爱玩植物大战僵尸\t230382\n数数\t230383\nIC25n66年级妮妮的芭蕾811666溜溜444445嘎嘎\t230384\n锦绣\t230385\n帅鬼\t230386\n安好好好好好好好好好好好好干哈年好好刚啊鸟鬼\t230387\n后勤集团\t230388\n黄岛\t230
389\n黄岗\t230390\n真心诚意\t230391\n急急忙忙\t230392\n4800亿美元\t230393\n小贝\t230394\n让座\t230395\n245200819\t230396\nujbmjjl\t230397\n新疆沙湾\t230398\nyighy\t230399\n秘斯密斯\t230400\n小贩\t230401\n我没在上我\t230402\n小贤\t230403\n镍币菲\t230404\n陈沛鑫\t230405\n雷主任\t230406\n腐男or\t230407\n卧墙\t230408\n小贼\t230409\n天遣\t230410\n小贸\t230411\n小费\t230412\n34227462446564424466724\t230413\n拘束\t230414\n婴舒宝\t230415\n小贱\t230416\n木断义绝\t230417\n蝴蝶金可可\t230418\n皇后大道\t230419\n死翘翘我的高考来了\t230420\n4700\t230421\nktcam\t230422\n错愕\t230423\n挂件\t230424\n图版\t230425\n颠趴\t230426\n错意\t230427\n三省\t230428\n纲醒\t230429\n书剑恩仇录\t230430\n图片\t230431\nMeme\t230432\n图特\t230433\n老四方\t230434\n三眼\t230435\n束河古镇\t230436\nMemo\t230437\n好运来酒楼\t230438\n空位\t230439\nzdyzhfcgdjtkyfykaehsjtfyldkgxh\t230440\n老件\t230441\n菲律宾\t230442\n另类\t230443\n空余\t230444\nhigogie\t230445\n真的假的和你好讨厌\t230446\n微博控\t230447\n和讯网\t230448\n巴适\t230449\nhakuhkujad\t230450\n一片片\t230451\n独秘\t230452\n纯种\t230453\n1桶\t230454\n慢堡\t230455\n给我儿\t230456\n第二三\t230457\n天道\t230458\n注意注意\t230459\n第二一\t230460\n240万块\t230461\n花千度\t230462\n4秘\t230463\n院士\t230464\n王丽王玉\t230465\n常州精品酒店\t230466\ncxzdedg\t230467\n蟒蛇\t230468\n蓝眼睛\t230469\n金智仁\t230470\n吃了饭师大\t230471\n海哥\t230472\nhwjdx\t230473\n告不告不告\t230474\n马克币\t230475\n第二个\t230476\np嗯\t230477\n度猪\t230478\n13474887330\t230479\n陆海鑫\t230480\n裤子女\t230481\n本田菊\t230482\n天觉\t230483\n中雨雪\t230484\n毒经\t230485\n苏亦欣\t230486\nOPPOU3\t230487\nHU48\t230488\n副我\t230489\n几再来\t230490\n是非非\t230491\n严昕\t230492\n奸诈\t230493\n太年么老子不要\t230494\n丑仙\t230495\n严明\t230496\n凸面\t230497\n东北大碴\t230498\n住秘我叫\t230499\n可爱最好是\t230500\n常文珠\t230501\n我喜欢安\t230502\n妖猴什马\t230503\n44668800\t230504\n屉匕\t230505\ncares\t230506\n翻天覆地\t230507\n像话\t230508\n我喜欢宅\t230509\n13918700720\t230510\n矮小症\t230511\n我的征信\t230512\n维维饭食\t230513\nSWYG\t230514\n重庆国际小姐选美三强\t230515\n盒体\t230516\n47米\t230517\n叫秘\t230518\n通常\t230519\n后作\t230520\n迥睡觉吧\t230521\nfvvc\t230522\n离世\t230523\njshe\t230524\njshd\t230525\njshz\t230526\n三二三个\t230527\n冯亚林\t230528\n空斩\t230529\n谢丽敏\t230530\n十五名\t230531\n侍卫\t230532\nyntfboys\t23
0533\n广播员\t230534\n阿龙纳斯\t230535\n诈马\t230536\n对伐\t230537\nF6F\t230538\n只有你\t230539\n七点整\t230540\n四节\t230541\n要钱吗\t230542\n就叫萌\t230543\nCjfjdj\t230544\n比天过\t230545\n库洛牌\t230546\nmmhao\t230547\n虹桥火车站\t230548\n心西游记\t230549\n梦想天\t230550\n奸淫小說\t230551\n咯素\t230552\n治国\t230553\nwaveyang\t230554\n自恋\t230555\n姜佳琦\t230556\n雅鲁藏布\t230557\n倩女销魂\t230558\n15945960586\t230559\n黄凯军\t230560\n接丁\t230561\n今天早晨五点\t230562\nCccv\t230563\nprota\t230564\ncc1\t230565\n诺顿\t230566\nwiki式\t230567\n相生相克\t230568\nZAFIRA\t230569\n婉君\t230570\n唉唉秘秘社\t230571\n彩电\t230572\n日线\t230573\n哟了\t230574\n芊芊玉\t230575\n干洗店\t230576\n呦呦咝\t230577\n靠哼\t230578\n逃下\t230579\n七百多个\t230580\n无憾\t230581\nx3\t230582\n长安\t230583\n鸳鸯蝴蝶\t230584\n红狼\t230585\nMV女\t230586\n嘎克拉拉\t230587\n东南角\t230588\n公益性\t230589\n长官\t230590\n暂缓\t230591\n800亿\t230592\n爱着我\t230593\n二三岁\t230594\n晕乎乎\t230595\n横扫\t230596\n忍住在\t230597\nccw\t230598\nccv\t230599\n摄影奇才PETER\t230600\nccl\t230601\n指一算是\t230602\ncck\t230603\nccj\t230604\n好我相信你\t230605\nccd\t230606\n横批\t230607\n长宽\t230608\n南加巴瓦\t230609\nccc\t230610\n性嗜酒\t230611\n增高鞋\t230612\n责任\t230613\n责问\t230614\n女兔子\t230615\n猪猪份\t230616\n力枛\t230617\nwgjmp\t230618\n艾斯托\t230619\n小初成\t230620\n我的母老虎\t230621\n一个22\t230622\n花瑜\t230623\n汤豪\t230624\n郑磊\t230625\n责令\t230626\n新年快樂\t230627\nHimmod\t230628\n胡冰\t230629\n想你说\t230630\n污水量\t230631\n痛痛痛\t230632\n一张嘴\t230633\n害我\t230634\n机器𠂉\t230635\n矫柔\t230636\n邓小平文选\t230637\n小游nnu\t230638\nQQ横会黑会说棍\t230639\n花苗\t230640\n胡军\t230641\nLnnn\t230642\n哈气人\t230643\n汪清\t230644\n不能不可理喻\t230645\n东林\t230646\n吁请\t230647\nghnn\t230648\n钱昊然\t230649\n一月1月26\t230650\ntoudon\t230651\n守己\t230652\n这一次\t230653\n变方\t230654\n神经衰弱\t230655\n新工\t230656\n剪卡器\t230657\n太阳照\t230658\n575\t230659\n寻访\t230660\n154场\t230661\n我没我的好基友萌\t230662\n沁影\t230663\n81156778\t230664\n朱情谊\t230665\nq飞乐\t230666\nnlln￥nn\t230667\n反恐\t230668\n偶有\t230669\n三一次\t230670\n写错\t230671\n老公行\t230672\n投诉行\t230673\n瑞士达沃斯论坛\t230674\nghhvfhjnb\t230675\n946466843\t230676\n怕羞\t230677\n80194\t230678\n模模\t230679\n角斗士\t230680\n860家\t230681\n葵
年\t230682\ncq呗\t230683\n秋月\t230684\n古装\t230685\n无可爹\t230686\n周伟奇\t230687\n很好奇\t230688\n偶什么\t230689\n竹颂\t230690\n第四节\t230691\nisHi\t230692\n养护\t230693\n教授木\t230694\n你好女人\t230695\n制度性\t230696\n善舞\t230697\n复生\t230698\n南车道\t230699\n兽场\t230700\n松滋\t230701\n姜姥爷\t230702\n耀君\t230703\n周珈慧\t230704\n爆轮\t230705\n嗯诺女\t230706\n总台\t230707\n弹夹车\t230708\n金松红\t230709\n宫洁丸\t230710\n22点\t230711\n片点\t230712\n没有然后\t230713\n呐秘\t230714\n了摸摸\t230715\n阳老乡\t230716\n重塑\t230717\n男男男男男嗯嗯嗯嗯嗯嗯\t230718\n吴和\t230719\n男块\t230720\n呼呼公公\t230721\n铃声片\t230722\n韩宁\t230723\n影线\t230724\n借代\t230725\n黑魔仙吧\t230726\n国酒\t230727\n量变\t230728\n洪湖中学\t230729\n想吐玩\t230730\n农历腊月初二\t230731\n企业家们\t230732\n信不信我不要你做我的秘书\t230733\n叫圣\t230734\nAda\t230735\n生中\t230736\n20多人次\t230737\n劲堡\t230738\nsemaa\t230739\n任宇芳\t230740\ncate\t230741\n胡泽宇\t230742\ncati\t230743\n维密好美\t230744\n映照\t230745\n芙蓉路\t230746\n臭狗逼\t230747\n银鹭花生牛奶i唱i音乐城市歌会\t230748\n1950年\t230749\n包袱\t230750\n充腹采薪薇\t230751\n布泣泥\t230752\n乌梢岭\t230753\n远见卓识\t230754\n张爷爷\t230755\n绝技\t230756\n老狐狸\t230757\n七爷\t230758\n230个\t230759\n庞妍\t230760\n木有英\t230761\n我不骂你了好吗别偷我\t230762\nexugec\t230763\ntfbys\t230764\n念思\t230765\n海洋馆\t230766\n雀躍\t230767\n桃花我恨你\t230768\n成一套\t230769\n印度尼西亚\t230770\n二对\t230771\n狮虎摩\t230772\n水耶\t230773\n聊不聊\t230774\n过来我等你\t230775\n布米\t230776\n888888666666\t230777\n电子产品\t230778\n夹克衫\t230779\n王鑫城\t230780\n浓咖啡\t230781\n34秒\t230782\nebgxb\t230783\n万顷碧绿草原\t230784\nggggggggg\t230785\ntometoutou\t230786\nhhjgbhjjuob\t230787\n大肆\t230788\n775875\t230789\n启齿\t230790\nMonica\t230791\nFT吧亚巡香港场\t230792\n糗百\t230793\n异星战场\t230794\n呵瓜瓜\t230795\n22m\t230796\n秘衣\t230797\nmyn\t230798\n么斯塔\t230799\n我是你谁你是我谁\t230800\n承滢\t230801\n国际华城\t230802\n推政\t230803\n盘古\t230804\n无聊来高二爬爬楼梯\t230805\n大咧\t230806\n512355858050880\t230807\n盘口\t230808\n大咪\t230809\n妙妙妙\t230810\n骗伦\t230811\n视听\t230812\n那哥\t230813\n牡丹江睡觉吧\t230814\n马齿苋\t230815\n见板\t230816\n吧照牛\t230817\n王美元\t230818\n太错了错\t230819\n心裡還有個小小的心願\t230820\n中雪乡\t230821\n活猪\t230822\n35877527776356454242548742466446\t230823\n射灯\t230824\n第三个\t230825
\n开班\t230826\n小巧依人\t230827\n9801\t230828\n9800\t230829\n自带\t230830\n二零八三十\t230831\n珍珠港\t230832\n大咖\t230833\n哈眼\t230834\n沒騙\t230835\n郭佳\t230836\n有感\t230837\n余着\t230838\n有别说说\t230839\n三万元\t230840\n没么\t230841\n疯狂世界巡演#\t230842\n213213213213\t230843\n2月1日下午16时45分\t230844\n有意\t230845\n工业品\t230846\n白猫\t230847\n黄兰洋\t230848\n从一块\t230849\n腰腿\t230850\n有愁\t230851\n火柴首歌\t230852\n波光粼粼\t230853\n小船儿\t230854\n刻录机\t230855\n会好\t230856\n抚摩\t230857\n石怡\t230858\nIDXC\t230859\n临颖\t230860\n198美元\t230861\n傅于川\t230862\nhellomyanasecom\t230863\n免疫球蛋白\t230864\n嗯六合\t230865\n戁\t230866\n秦晓\t230867\n抚摸\t230868\n章宦晓\t230869\n2x3y1\t230870\n胡不字\t230871\n汉堡嘞\t230872\n换来换去\t230873\n律动pu\t230874\n六小龄童\t230875\n定子\t230876\n二位\t230877\nxinninmen\t230878\n导诺索\t230879\n小便宜\t230880\n均瑶集团\t230881\n热人三\t230882\n镇区\t230883\n亲眼目睹\t230884\n凌源\t230885\n广西话\t230886\n假爱\t230887\n二部\t230888\n感本\t230889\n书皮\t230890\n皮鞋店\t230891\n袁聪明\t230892\n螺\t230893\n山青\t230894\n呵里巴嗦\t230895\n林心加\t230896\n梦过\t230897\n定存\t230898\n發脾氣\t230899\n9999步\t230900\n侧卧\t230901\ngxhf\t230902\n南河镇\t230903\n惊险\t230904\n喝片\t230905\n怎敌\t230906\nnicetomeetyuotoo\t230907\ngxhs\t230908\ngxhr\t230909\n痞气\t230910\n温瑞\t230911\n吴喵喵\t230912\nEgan\t230913\n业余爱好者\t230914\n154556555555551\t230915\n篱笆\t230916\n数据结构\t230917\n三升\t230918\n天空公猪\t230919\n花朵们\t230920\n张敬奎\t230921\n官人\t230922\n套装\t230923\n三十\t230924\n套裙\t230925\n日本TBS电视台\t230926\n边伯贤\t230927\n村村\t230928\n娃儿们\t230929\n男戒心\t230930\n斗罗大陆二绝世唐门\t230931\n七楷模\t230932\n南轩北\t230933\n杨雨珊\t230934\n小不晓得\t230935\n蛋价\t230936\n住处\t230937\n说不乖\t230938\n大大大哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒哒\t230939\n真土\t230940\n全能型\t230941\n慈善家\t230942\n十四平\t230943\n不要爱\t230944\n十四年\t230945\nVeryCD资源收藏#\t230946\n预报\t230947\n洛阳市区\t230948\n哇高铁\t230949\n赚点\t230950\n销售人员\t230951\nhnBln\t230952\n吴雨倩\t230953\n周文\t230954\n米兰圈\t230955\n苏丹红咸蛋\t230956\n周斌\t230957\ntfmv\t230958\n赌注\t230959\n杨春梅\t230960\n星期六阿\t230961\n虫身\t230962\n青茶\t230963\n搞一搞\t230964\n2690886\t230965\n来自来识我了你知道我在哪吗\t230966\nshigaosunine\t230967\n噫嘘兮\t230968\n大好爽\t230969\n囧凸凸└o┘\t2309
70\n爱丽大\t230971\n里边\t230972\n翘起\t230973\n你走\t230974\n赛尔嗯\t230975\n豆肝\t230976\n翘走\t230977\nhjuygcedcjitdc\t230978\n静比\t230979\n李萱\t230980\n创强\t230981\n挂脚\t230982\n爸爸爱喜禾：十万个是什么\t230983\n你好毒铭记还多咪\t230984\n龟梨和美\t230985\n王艺涵\t230986\n嗯长风\t230987\n宋秉\t230988\n老李家门口楼台摩托木\t230989\n两袋\t230990\n微笑号\t230991\n差错\t230992\ntetok\t230993\n100千克\t230994\n梅花落而是梅花烙\t230995\n人头脑武装\t230996\n两袖\t230997\n军把手\t230998\n给度秘\t230999\n宝贝宝贝我是你的大叔\t231000\n董旭阳\t231001\n张某\t231002\n毛奶奶\t231003\n咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪咪\t231004\n五个小时\t231005\n收肯\t231006\n狠久\t231007\n如诗如画般\t231008\n六妹\t231009\n阿楼\t231010\n贫苦\t231011\n久痒\t231012\n富国天惠\t231013\n麦基麦基麦基\t231014\n粑粑粑粑粑粑粑粑粑粑粑粑\t231015\n不够诚意\t231016\n水族馆\t231017\n骂丫\t231018\n百上千人\t231019\n心小三\t231020\n写短\t231021\n闫颖颖\t231022\n铁青\t231023\n流器\t231024\n九套\t231025\n博览群书\t231026\n下班吧赶快\t231027\n只虫\t231028\n九九号求你了行不行\t231029\n岩茶\t231030\n抓狂\t231031\n张李彤\t231032\n士郎\t231033\nDYZ\t231034\n特别系\t231035\n除氧剂\t231036\n天天有喜之人间有爱白雪公主\t231037\n好不正经\t231038\nDYD\t231039\n不可言喻\t231040\nDYG\t231041\n4310\t231042\n血氨\t231043\n甚美\t231044\n金宝屯\t231045\n刘继来\t231046\n班草\t231047\ngbgd\t231048\njxhddo\t231049\n陈明磊\t231050\n蜢\t231051\n蜡\t231052\n多88咖啡\t231053\n马慢\t231054\n别闹腾\t231055\n夏霁月\t231056\n蜷\t231057\n菲政府\t231058\n八分之515分之四括\t231059\n血水\t231060\n儿电影\t231061\n三零七二二零\t231062\n砸死\t231063\n藏猫猫\t231064\n留洞\t231065\n1944866776\t231066\n过来来来来来来\t231067\n卜小心\t231068\n蜀\t231069\n蜜\t231070\n哲人\t231071\n鬼画符\t231072\n魔币精灵\t231073\n琼宁\t231074\n血气\t231075\n该原谅\t231076\n亲爱的你慢慢飞吃吧宝贝儿\t231077\n拖场\t231078\n好呀亲一个\t231079\n高山流水听\t231080\n美女了会\t231081\n你是谁你是谁你是谁呀你是谁\t231082\n晩上\t231083\nBbrr\t231084\n颗粒状\t231085\n19905年\t231086\n岐山县\t231087\n愿得一人心\t231088\n我友\t231089\n结缔组织结缔组织\t231090\n点撑\t231091\n一六元\t231092\n龙安\t231093\n横征暴敛\t231094\n咯扣阴门\t231095\n一寒假\t231096\n只有一个人\t231097\n点播\t231098\n剩剑\t231099\n我司\t231100\n别怕乱说\t231101\n12万多\t231102\n朱晓励\t231103\n啵啵啵啵啵啵啵啵啵啵\t231104\n会说话说\t231105\n我可\t231106\n精原\t231
107\n尼壕\t231108\n高兴\t231109\n红颜知己\t231110\n定活盈利滚利\t231111\n像个\t231112\n冰结\t231113\n不菲\t231114\n无聊人\t231115\n姜欣\t231116\needfee\t231117\n斗一斗\t231118\n壤土\t231119\n十八一十八100\t231120\n老爷爷\t231121\n傲世堡\t231122\n呃原来你是女的我知道了你是女的\t231123\n爆冷\t231124\n说了秘\t231125\n旁观者清\t231126\n喀山\t231127\n巨行\t231128\n杜宇航\t231129\n烟草公司\t231130\n1008612435677890\t231131\n麻生佑\t231132\n植物大战僵尸超轻粘土\t231133\n狗血淋头\t231134\n得着我\t231135\nCS439\t231136\n无聊了\t231137\n很刚刚好\t231138\n幺零幺零\t231139\n最寒冷的冬天\t231140\n高光\t231141\n浙江\t231142\n吃豆\t231143\n多款\t231144\n乘除\t231145\n魔神\t231146\n鹿哥\t231147\n六盒\t231148\nCKEB\t231149\nVVWWW\t231150\n闪开\t231151\ntodd\t231152\n够用力\t231153\ntodo\t231154\n老爷们儿\t231155\n芷柔\t231156\ndusj\t231157\n原装\t231158\nphl\t231159\n霜花\t231160\n可说故\t231161\n1234428282733733\t231162\n赛尔度秘\t231163\n螯\t231164\n一两名\t231165\nADAF\t231166\n阿丽江龙飞\t231167\n哥俩\t231168\n孔子上-访图\t231169\n背扣\t231170\n说了算是\t231171\n会焦\t231172\n三四十分\t231173\n0次\t231174\n侯宜萱\t231175\n15369387817\t231176\n啊秘就是我我就是爱秘\t231177\n好啦好啦啦好啦好啦好啦\t231178\n源源\t231179\n洗擦\t231180\n甲贴\t231181\n嫁该\t231182\n荆棘丛\t231183\n机关枪\t231184\n车大厦\t231185\n15978296706\t231186\n1时\t231187\nsrrdtd\t231188\n可丽\t231189\n瑞贝\t231190\nwpm\t231191\n吓倒\t231192\n生真\t231193\nvTV\t231194\n心碎心碎心碎心碎心碎心碎心碎心碎心碎心碎心碎心碎心碎\t231195\n1日\t231196\n好了不说了好不求你了求你\t231197\n灾害\t231198\n4月21日下午14时\t231199\nhjsgkjv\t231200\nTnbvfhv15945283664\t231201\n退回来\t231202\n18079935807556\t231203\n系照\t231204\n朱一\t231205\n盛世洋彩话玲珑\t231206\n抱团\t231207\n捡活该\t231208\n性高潮\t231209\n511套\t231210\nsheep\t231211\n储存\t231212\n几百亿\t231213\n杨威\t231214\n焦煤\t231215\n假牙疼\t231216\n8tcogigutdiggiv\t231217\n红烧茄子\t231218\n朱丽\t231219\n春种秋收\t231220\nyffhui\t231221\n杨娜\t231222\nLaiyige\t231223\n东南大学\t231224\n调皮捣蛋\t231225\n眨眼间\t231226\n遵\t231227\n福安奥的包zantistantrayoujoraahartistshel\t231228\n薪火\t231229\n大帅哥\t231230\n误喝\t231231\n占星\t231232\n受你好\t231233\n中饭店\t231234\na分之九\t231235\n度秘心\t231236\n怪列\t231237\n546十234\t231238\ndtn\t231239\ndtm\t231240\ndtc\t231241\n小黑微\t231242\n升级换代\t231243\nwojs\t231244\n返港\t231245\
ndtf\t231246\n益华\t231247\ndtd\t231248\n告诉我讨\t231249\ndty\t231250\n礼貌美\t231251\n北川县\t231252\n流畅版\t231253\ndts\t231254\ndtr\t231255\n益博\t231256\ndtt\t231257\n了了啦\t231258\n低调\t231259\n荣哥\t231260\n俊诶\t231261\n去而复返\t231262\n女模\t231263\n换班\t231264\n梁家宁\t231265\njjsdjjehcgfvgbecfgsbyjwkshfjrkqvxuawdhfipwgfulhwdfliwdqidwcvioqwdvcioqwdhgchdbwvbs\t231266\n两点四十五\t231267\n就是你的主人我叫静静\t231268\n第五元素\t231269\n驼孥\t231270\n陆子杭\t231271\n金融饿里拉\t231272\n王艳君\t231273\n太狗\t231274\n说你好帅\t231275\n6斗\t231276\n侥\t231277\n踏麻\t231278\n黑帮老\t231279\n驭动力\t231280\n戒网\t231281\n求不将就\t231282\n说的我喜欢才怪\t231283\n沙拉\t231284\n太狠\t231285\n太狼\t231286\n晨曦公主\t231287\n品冠\t231288\n低谷\t231289\n乘客们\t231290\n我去那也别老姨了我求你忘\t231291\n曹艳微\t231292\n信众\t231293\n刘红丽\t231294\n挑出\t231295\n四颗粒\t231296\n2007年度\t231297\n程帅\t231298\n好等会聊\t231299\n天平秤\t231300\n拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉拉\t231301\n断桥\t231302\n玩家\t231303\n敲敲\t231304\npdgd\t231305\n给我捏你的吧跟班本性\t231306\nChagall\t231307\n披层\t231308\n刘倩\t231309\n业度秘\t231310\n秦晴\t231311\n赵兄托\t231312\n忍就\t231313\n7350\t231314\n王小吉\t231315\n独断独行\t231316\n胡广余\t231317\ndifkg\t231318\n要不我再来\t231319\n寡话\t231320\n那你爱不爱看天天有喜之人间有爱\t231321\n接通\t231322\n王彦然\t231323\nintentIntentSK1171477665F9E19E2DB2670E1DF16521089485AD7C686F934B35677513CA035662348A95end\t231324\n僵尸眼睛\t231325\n为我心\t231326\n男3女\t231327\n杂志九月刊\t231328\n难道说不是你说的真讨厌我讨厌你讨厌你讨厌你\t231329\n给哦困没的所\t231330\n得世界\t231331\nBB一句\t231332\n13940258142\t231333\n上海东方艺术中心\t231334\nfhgggghhughhhhhnhjhhhhhhyhhhhhhhhugnhhjjghhuhvgbhhhbbhhb\t231335\n有益菌\t231336\n善思\t231337\n2011年3月2日\t231338\n翩翩\t231339\n12点17分\t231340\n叶凌萱\t231341\n张文爽\t231342\n善性\t231343\nPTUF\t231344\n君相诀绝\t231345\n日本本州岛\t231346\n自酿酱油\t231347\n不发我投诉你我告你\t231348\n八连\t231349\n鳌柱\t231350\n杨红樱\t231351\n九小游女的撒盐有\t231352\n出埋阿\t231353\n嘴硬\t231354\n三四百\t231355\n3200091\t231356\n给我的微笑\t231357\n无家可会\t231358\n中州\t231359\n无聊无聊\t231360\n嘎嘎嘎嘎嘎嘎嘎姑\t231361\n天造地设\t231362\n难要\t231363\n学科学\t231364\n指手\t231365\n猪不是你你是猪你是猪你是猪\t231366\n张甜甜\t231367\n劝阻\t231368\n嘻哈尝试店\t231369\n嘎嘎嘎嘎嘎嘎嘎嘎嘎\t231370\n死字\t231371\n九溪座
\t231372\n死子\t231373\nissorry\t231374\ngela\t231375\n12分之一\t231376\n维尼散饭\t231377\nquirrel\t231378\nx\t231379\n婴幼儿童\t231380\n把眼前\t231381\n豪豪\t231382\n流氓史\t231383\n夏荷\t231384\n呐喊助\t231385\n错过来\t231386\n土形\t231387\n东方天骄幼儿园\t231388\n无花果\t231389\n信教\t231390\n十吨\t231391\n社群\t231392\n昏迷不醒\t231393\n来玩\t231394\n找你去\t231395\n罪机\t231396\n非常好听\t231397\n诗\t231398\n没有人爱\t231399\n提挺\t231400\n充饥\t231401\n十名\t231402\n王艺潼\t231403\n充饱\t231404\n品鉴\t231405\n大伏天\t231406\n提振\t231407\n1044\t231408\n萧爷\t231409\nhajhb1\t231410\n64615454464616557542634448436454315455444545543\t231411\n潮人\t231412\n核燃料\t231413\nrgdthsh\t231414\n1335906092\t231415\n就诊\t231416\n看死\t231417\n7月6\t231418\n7月7\t231419\n硅谷大街\t231420\nBAZAAR\t231421\n情了拜拜\t231422\ndigitalboartg\t231423\nS60\t231424\n焖肉面\t231425\n彭紫红\t231426\n回放\t231427\n滚动条\t231428\n13166467976646767\t231429\n考武\t231430\n宝哥\t231431\n回收\t231432\n16158\t231433\n要不\t231434\n夏琼寺格桑达杰\t231435\n高梅婷\t231436\n赵洪煜\t231437\nassdfgghhkopxxvbnqwtyuoop\t231438\n哈拉屎\t231439\n度秘太壞\t231440\n小生活\t231441\n一会儿咱一会儿\t231442\n盈利\t231443\n归怒\t231444\n马鹿塘\t231445\n了行不行\t231446\n自杀\t231447\n法布尔\t231448\nt个\t231449\n我想你样不老实也不怎麽不过我姓张一心度我和你一样\t231450\n芦溪县\t231451\n恒驰\t231452\n刚刚刚刚\t231453\n窝记得\t231454\n发索要\t231455\n朱树\t231456\n迪克孙\t231457\n殿后\t231458\n自来\t231459\n0004\t231460\n圹云\t231461\nzuiniu\t231462\n0007\t231463\n0000\t231464\n0001\t231465\n难产\t231466\n新招\t231467\n驱蚊\t231468\n马大饼\t231469\nfan们\t231470\n唱别歌\t231471\n第一千条\t231472\n人工\t231473\n难人\t231474\n地址类\t231475\n无他路\t231476\n长袍\t231477\n我的惹火\t231478\n难于\t231479\n笑破\t231480\n难事\t231481\n梳妆\t231482\ntfboysorexo\t231483\n肠肠\t231484\nnijiang\t231485\n保鲜\t231486\ngwhs\t231487\n燕市\t231488\n曲屏\t231489\n重庆火锅\t231490\n676834353867\t231491\n内裤儿\t231492\n北关楼\t231493\nhellon\t231494\n甘让\t231495\nhellol\t231496\nhellom\t231497\nhellok\t231498\n有疑问\t231499\n赵美霞\t231500\n下半生\t231501\n你好度秘你好度秘你好度秘你好度秘你好丢脸\t231502\n去额\t231503\n总局\t231504\n双百\t231505\n是你走\t231506\n明天7点\t231507\nhellow\t231508\n姐妹们\t231509\n不见不鲜\t231510\n娘娘娘娘娘娘娘娘娘娘娘娘娘\t23151
1\n果味\t231512\n弱堡\t231513\n舛舛\t231514\ngs4\t231515\n想不想息\t231516\n半挂车头\t231517\n陈春林\t231518\n12359\t231519\n窝心\t231520\n十五分之十一\t231521\n120亿\t231522\n撞伤\t231523\n蒙娜丽莎之约\t231524\n海峡都市报\t231525\n看相\t231526\n趾高气扬\t231527\n兰德\t231528\n威凯\t231529\ndhcjhvhg\t231530\n探帅气\t231531\n女咯V5\t231532\n王祖蓝\t231533\n大外\t231534\n中庸\t231535\n26面\t231536\nghiphotosbaiducomxiaodupicitem7acb0a46f21fbe098db8ae616c600c338644ade5jpg\t231537\nyuan\t231538\n我看你调皮吧真是的你给我领养一只小龙犬吧求求你了小米度\t231539\n找回自我\t231540\n大多\t231541\nxisu\t231542\n大大\t231543\ngsr\t231544\n看盘\t231545\n焱梅\t231546\n爱建\t231547\n认作\t231548\n中庆\t231549\n定理\t231550\n七极\t231551\n大天\t231552\n几遍\t231553\n大夫\t231554\n大太\t231555\n大头\t231556\ngsc\t231557\n几道\t231558\n说案\t231559\n着信\t231560\ngsf\t231561\n色情男女\t231562\ngsh\t231563\ngsk\t231564\n单招\t231565\n喜欢自劲\t231566\n板栗红烧肉\t231567\n凭借\t231568\n结束样\t231569\n英式\t231570\n401334213\t231571\n色鞋\t231572\n打压\t231573\n十来分钟\t231574\n狡兔\t231575\nCCF\t231576\n笼屉\t231577\n上边\t231578\n厦门旅行社\t231579\n明显地\t231580\n诱仙\t231581\n加小\t231582\n打击乐\t231583\n上辈\t231584\n13004\t231585\n秸\t231586\n粪\t231587\n叮叮咚咚终点站万通\t231588\n特卡准\t231589\n历尽沧桑\t231590\n1330987653133089133\t231591\n支持者\t231592\nHFIDHF\t231593\n开博\t231594\n朗读者\t231595\n祷\t231596\n你在干嘛啊你是谁我是谁\t231597\n有难说\t231598\n祺\t231599\n祸\t231600\n好哭\t231601\n早起来\t231602\n陈伟鸿\t231603\n祢\t231604\n武断\t231605\n祠\t231606\nfafafa\t231607\nkbja5t1ga\t231608\n羊羊唉羊羊\t231609\n聚合\t231610\n划算\t231611\n票\t231612\nhellomyda\t231613\n祭\t231614\n一泻\t231615\n好哇\t231616\nanstoan\t231617\n祛\t231618\n卡梅隆·迪亚兹\t231619\n好哈\t231620\n张永秀\t231621\n神\t231622\n祝\t231623\n最好的\t231624\n何子聪\t231625\n好哒\t231626\n凯叔讲故事\t231627\n卡卡堡\t231628\n祋\t231629\n加封\t231630\n爱骂\t231631\n往右转\t231632\n梁锦勿\t231633\n嗯栗子\t231634\n肖瑶\t231635\n普法\t231636\n我和你的电话\t231637\nGuinness\t231638\n薄片\t231639\nggfffffcf\t231640\n热酒\t231641\n黛色\t231642\n吕振辉\t231643\n葫冰卿\t231644\n7年后\t231645\n浮沉\t231646\n45452522422\t231647\n00000005552589633\t231648\n实用性\t231649\n金黄\t231650\n贾超颖\t231651\nchdhshlsti\t231652\n错位点\t23165
3\n老k\t231654\n银虎\t231655\n好我好累\t231656\n脚上\t231657\n脚下\t231658\n亥时\t231659\n6点\t231660\n磨炼\t231661\n话梅\t231662\n881888\t231663\nidx\t231664\nSJ-M台北FM\t231665\n着装\t231666\nidu\t231667\n艺兴\t231668\n芳畦\t231669\nduda\t231670\nidl\t231671\n老湿\t231672\n老K\t231673\nidh\t231674\nidf\t231675\n18792783665\t231676\n要盗\t231677\n脚丫\t231678\n骨骼\t231679\n国资\t231680\n天地一号\t231681\n抽三抽筋\t231682\n克星期五\t231683\n牛粑粑\t231684\n和你说的话\t231685\n得手\t231686\n上二年\t231687\n不逊不逊\t231688\n吴思思\t231689\n111111111111111111111111111111\t231690\n关内\t231691\n十五点儿\t231692\nseasons\t231693\nhellomyaaaasaaaa\t231694\nststf\t231695\n写见\t231696\n六千多块\t231697\n貼連\t231698\n脱兔\t231699\n不好当\t231700\n推脱\t231701\n天下无贼\t231702\n屠宰场\t231703\n脸蛋\t231704\n童日\t231705\n效益\t231706\n5588566\t231707\nut9\t231708\n泰国人\t231709\ndaaaa\t231710\n庾花花\t231711\n黄嘉利\t231712\n略同\t231713\n早小时\t231714\n带薪者\t231715\nhgghhhgghcg\t231716\n芜湖县\t231717\nhuugggobmkxfuhcjjjnvccxxeigupjjnnbghhh\t231718\n百脑汇\t231719\n调验\t231720\nHiandl\t231721\n人生路上\t231722\n帅凯\t231723\n题材\t231724\n幺八五二二零四三八八六\t231725\n齐大妈\t231726\n一地茄子\t231727\n超洋快餐\t231728\nayw\t231729\n帅凡\t231730\nbkjkhkkkkk\t231731\n四副\t231732\ndudu\t231733\n爱我你就夸夸我爱我\t231734\n韦梦婷\t231735\nutl\t231736\n乱窜\t231737\n2集\t231738\n此花\t231739\n临死\t231740\n桑径\t231741\n哇這\t231742\n侄\t231743\n纳杰人才网\t231744\n才秘\t231745\n531条\t231746\n只见\t231747\n塞维\t231748\n范思琦\t231749\n53场\t231750\n16：45\t231751\nsually\t231752\n胆大胆小\t231753\n我喜欢我女闺蜜\t231754\n年轻人\t231755\n越前龙马\t231756\n恐吓\t231757\nA1\t231758\n邮政局\t231759\nA2\t231760\nA4\t231761\n瓜州\t231762\n找三十四五\t231763\nA8\t231764\n4断井张\t231765\n雪毯\t231766\n三和医院\t231767\n韩庚庚\t231768\nAA\t231769\nAC\t231770\nAB\t231771\nAD\t231772\nAG\t231773\nAF\t231774\nAI\t231775\nAH\t231776\nAK\t231777\nAJ\t231778\n莽莽榛榛\t231779\nAP\t231780\nAS\t231781\n照本\t231782\nAW\t231783\nggff5t\t231784\ndram\t231785\nOutput\t231786\n博美\t231787\n支车\t231788\nthatI\t231789\n漆包线\t231790\nAf\t231791\nAi\t231792\nAh\t231793\nAj\t231794\nAm\t231795\niNli\t231796\nAn\t231797\n姐姐度\t231798\nAr\t2
31799\nAu\t231800\n复习\t231801\n冰棍\t231802\nchiphotosbaiducomxiaodupicitemdc54564e9258d109215490c7d658ccbf6c814d0bjpg\t231803\nxidkf\t231804\nchsh\t231805\n动线\t231806\n渲染\t231807\n王大陈\t231808\n折扇\t231809\n聪明好乖\t231810\n更萌哒\t231811\n560442327674\t231812\n非理性\t231813\n张国\t231814\n张图\t231815\n陈峰\t231816\n一财年\t231817\n冲落式\t231818\n呵非\t231819\n张囡\t231820\nconcener\t231821\n郝小闲\t231822\n3136522985\t231823\n才臣\t231824\n合肥市\t231825\n牛头不对马嘴我说的是你度秘\t231826\n2021264169\t231827\n折扣\t231828\nBarbiedolls\t231829\n动气\t231830\ngnkbdk\t231831\n资料\t231832\n七零二\t231833\n徐若妍\t231834\n经期秘\t231835\n星球人\t231836\n我一家人\t231837\n王蕊婷\t231838\n章林\t231839\n田心行\t231840\n臭嘴\t231841\n6门\t231842\n智利南部\t231843\n教须\t231844\n玉碧瑶\t231845\n日本总务省\t231846\n你好名\t231847\n你好吊\t231848\n另一个\t231849\n6842\t231850\n白蛋白\t231851\n一七块\t231852\n58525655365\t231853\nAqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq\t231854\n谢朗动\t231855\nwwsfgv\t231856\n不要吗好我原谅你\t231857\n延长\t231858\n染色\t231859\n2.2%\t231860\n萨粉\t231861\n世界杯\t231862\n肺结核\t231863\n耳塞\t231864\n五果鸡\t231865\n幺五五二二六九七八五七\t231866\n遏制\t231867\n佟禹濛\t231868\n田茂恒\t231869\n破折号\t231870\n希特拉拉希尔\t231871\n六不路过\t231872\n婉婷\t231873\n放牛娃\t231874\n维多特利姆\t231875\n萝莉娃娃宋\t231876\n367万吨\t231877\n伊丝泊c3\t231878\n110些\t231879\n何处在\t231880\nmgwg\t231881\n舞载\t231882\n咳嗽药\t231883\n想你想你\t231884\n百岛\t231885\n交谊舞\t231886\n最女人：娜一吻\t231887\n渣土车\t231888\n哈卡哈卡哈卡\t231889\n盼盼\t231890\n音乐节\t231891\n麼美\t231892\n染坊\t231893\n没本\t231894\n76分\t231895\n机枪\t231896\n78米\t231897\n疯爬过\t231898\nvbc\t231899\nvbb\t231900\nvbd\t231901\nJdapmj\t231902\nvbj\t231903\n霍文兰\t231904\nvbh\t231905\nvbn\t231906\nvbm\t231907\nvbs\t231908\n八八场\t231909\n保险单号\t231910\n机构\t231911\n没月\t231912\nvby\t231913\n为你一样\t231914\n饭碗\t231915\n博牌九帅\t231916\n榜单\t231917\n小优\t231918\n小伙\t231919\n小会\t231920\n伯克希尔\t231921\n该你了你谁啊你\t231922\n小伟\t231923\n真岛太一\t231924\n牧师\t231925\n滋滋\t231926\n小众\t231927\n亦舒\t231928\n阿大\t231929\n小伍\t231930\n你好王子天下\t231931\n阿天\t231932\n阿太\t231933\n良是\t231934\n18818818888\t231935\n彩叶林\t231936\n阿夜\t231937\nkgmgjgm\t231938\n加成\t2
31939\n阿多\t231940\n反超\t231941\n们再有一个月\t231942\n转一圈面\t231943\n小伤\t231944\n理髮\t231945\n看不过不着\t231946\n高门\t231947\n晕菜\t231948\n猪猪侠之百变\t231949\n機器人\t231950\n埔\t231951\n优哉游哉\t231952\nimsm个\t231953\n栗色\t231954\n混蛋辣蟹\t231955\n嗯靠靠靠\t231956\n埃\t231957\n芝麻分\t231958\n埋\t231959\n心别\t231960\n谢左先\t231961\n13526590902\t231962\nJigjaccoz\t231963\n疏风\t231964\n埲\t231965\nhggvgvxhcjb\t231966\n基\t231967\n培\t231968\n165米\t231969\nhdiis\t231970\nMARK\t231971\n難受\t231972\n太狠不哭不哭我心疼你猜我现在想干嘛太狠不哭不哭我心疼你猜我信\t231973\n心切\t231974\nyooni\t231975\n婚礼秀\t231976\n2020年\t231977\n家口\t231978\n伊還\t231979\n造假\t231980\n团体操\t231981\n红裤衩\t231982\n黏嗽\t231983\n185元\t231984\n高估值\t231985\nxinxi\t231986\n家号\t231987\n丸子\t231988\nhttpll\t231989\n家史\t231990\n如意集团\t231991\n问你的什么\t231992\ngvvf\t231993\n刚才\t231994\n知春路\t231995\nSUPERMAN#\t231996\n丁宣雅\t231997\nHIV\t231998\n薛一\t231999\n什小补牢\t232000\n我喜欢侳\t232001\n哎噜\t232002\n发祥\t232003\n们母町丸\t232004\n微长\t232005\nNnnn57\t232006\nHIH\t232007\n刘庆龙\t232008\n1128亿\t232009\n天边\t232010\n曾正泽\t232011\n置身\t232012\n八百元\t232013\n啵啵我和秘蛛一石恶我静静不游戏\t232014\n上官林\t232015\n我太不开心亲亲你\t232016\nRedscarfortherobot\t232017\n刘文斐\t232018\n做爱\t232019\n刘文斌\t232020\n塔防\t232021\n早上七点半\t232022\n宁谧\t232023\n神中学\t232024\n现代教育出版社\t232025\n发哈发\t232026\n1周年\t232027\n讨厌可乐\t232028\n1539501628\t232029\n400一500元\t232030\nUUHHGHH\t232031\n懒散\t232032\n欧美政府\t232033\n就守\t232034\n非人\t232035\n狐朋狗友\t232036\n孙q\t232037\n单裤\t232038\n奥美干\t232039\nUUIOO\t232040\n大了很好听\t232041\n三杯五杯\t232042\n糖人\t232043\njqloh\t232044\navgggx\t232045\n小倩\t232046\n威斯克呗元云烟\t232047\n33年\t232048\n慌爱我告诉你我没\t232049\n33平\t232050\n陈顺友\t232051\n小倾\t232052\n昌绪\t232053\n进入\t232054\nWITH\t232055\n你是我的小小苹果榛蘑\t232056\nEX秘\t232057\n中国人学西学\t232058\n手工\t232059\n钟硕部\t232060\n手巧\t232061\n蔡浴久\t232062\n林雪云\t232063\n冲动型\t232064\n胃狠\t232065\n更聪明的人\t232066\n佤族\t232067\n泰饭\t232068\n魏和尚\t232069\n翻出\t232070\n手巾\t232071\n生活中的苦涩与甘美\t232072\n半支\t232073\n耷拉耷拉耷拉达拉达达达哈\t232074\n唐洵\t232075\n徐艺涵\t232076\n斯莱顿\t232077\n嘿哼\t232078\n草菅\t232079\n201601082015\t232080\n差下\t232081\n原
地踏步\t232082\n练习方法\t232083\n周青青\t232084\n孙梦宇\t232085\n差不\t232086\n天动DD介\t232087\n弄清楚\t232088\n变异体\t232089\n5888858888888888888\t232090\n好猜\t232091\n听歌里喔喔哦你是我的好朋友\t232092\n奴隶\t232093\n嘿哟\t232094\n乔布斯\t232095\n贡献\t232096\n嘿哈\t232097\n2287年\t232098\n4488\t232099\n杨学习\t232100\n漩\t232101\n有为\t232102\n漫\t232103\n漪\t232104\n梦三国\t232105\n邹晓梦\t232106\n虎皮蛋糕\t232107\n沙参\t232108\n日本保安厅\t232109\nu6yyu\t232110\n三十九块\t232111\nhll\t232112\nhlk\t232113\nJtj\t232114\n昏昏你叫\t232115\n养我喜欢\t232116\n漳\t232117\n漲\t232118\n乡长\t232119\n作废\t232120\nKhun\t232121\n有业\t232122\n三字经\t232123\n裤兜处\t232124\n漍\t232125\n乡镇\t232126\n漏\t232127\n召唤\t232128\n饮水\t232129\n不是我这我这我\t232130\n漂\t232131\n卷心菜\t232132\n漆\t232133\n嗯嗯嗯你好宁波列\t232134\n钱婆Ke$ha\t232135\nhellostyou\t232136\n光头强快跑\t232137\n有一\t232138\n嗯并木\t232139\n演\t232140\ngghcvetjnyfgyjgtjfxysufhfygfihfhfyfbfbghgyhggRRtdt\t232141\n西瓜汁\t232142\nufgtrdru\t232143\n老气\t232144\n画风\t232145\n13211253080\t232146\n号码岁\t232147\nd罩杯\t232148\n做贼\t232149\n饥饿难耐\t232150\n小笼包子\t232151\n老原则\t232152\nfncv\t232153\n蜜洞\t232154\n瘦祥\t232155\n大魔头\t232156\nogux\t232157\n要造\t232158\nmaa\t232159\nyucdtdryfhk\t232160\n游戏方\t232161\n一批八位\t232162\nvpug\t232163\n观战\t232164\n第4名\t232165\n刘昊然\t232166\n甘拜\t232167\nleuos\t232168\n戏本\t232169\n扎实瓦达\t232170\n睡不着吖\t232171\n河贞恩\t232172\n小懒猪\t232173\n城墙\t232174\n8888654777888555445556698774\t232175\nblogger\t232176\n8dm\t232177\nndn\t232178\n八经\t232179\n啦啦啦啦啦开心\t232180\nVtr\t232181\n木吉厨\t232182\n乖很多\t232183\n回去吧\t232184\n娃娃领\t232185\n额字\t232186\n八组\t232187\n孝亲\t232188\nー\t232189\n献上\t232190\n八绷\t232191\n药家鑫\t232192\n东北啊我和我的东北我在家在东北啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦啦\t232193\n信服\t232194\n66552555808555658665866666666999696969999666666666666666666665555666666666666\t232195\n楼兰之死\t232196\n献丑\t232197\n太自恋男女的小事手\t232198\n赶稿\t232199\n他样\t232200\n你爱你爱你爱你爱你耐你耐你耐你\t232201\n曹景波\t232202\n市值\t232203\n金泰亨\t232204\n医生之错\t232205\n满洲龙\t232206\n五百三十三兆\t232207\n数百根\t232208\n陈龙\t232209\nHvfxj\t232210\n戒\t232211\njilalotiglkafenljlu\t232212\n安拉\t232213\n淇淇\t232214\n出柜\t232215\n墀头\t2
32216\n开交\t232217\n脸儿\t232218\n餐管\t232219\n22d23个\t232220\n海登\t232221\n各色\t232222\n火气\t232223\n十字架\t232224\n吴佩锡\t232225\n琅琊山\t232226\n薛均瑶\t232227\n斯玛什\t232228\n口龙\t232229\n行径\t232230\n70余名\t232231\n想不想\t232232\n蛮多空镜头\t232233\n卖命\t232234\n好心情不好\t232235\n13459595160\t232236\n大瓶女\t232237\n心灵深处\t232238\n冲昏\t232239\nmetball\t232240\n穴手\t232241\n分儿臭不要脸\t232242\n了处\t232243\n天佑法莱拉\t232244\n150㏄\t232245\njttttttwt\t232246\n36.60\t232247\n历法\t232248\n王凯俊王\t232249\narm\t232250\n桃老\t232251\n偶像化\t232252\n四个月\t232253\nHOLD不住\t232254\n2016年1月12日\t232255\n手指头\t232256\n宁月公主\t232257\n北京市佛教协会\t232258\n宝陀\t232259\n鸟语春天\t232260\n12种\t232261\n19553\t232262\nbbobo\t232263\n泻药\t232264\n每样\t232265\niwwi\t232266\n瞌睡把\t232267\n55555554455\t232268\n看等\t232269\n不远中\t232270\n大陆之外\t232271\n轻舟\t232272\n售后\t232273\n大燕玲\t232274\n2332\t232275\n8899147\t232276\n气枪\t232277\n说时\t232278\n石峰\t232279\ndhiphotosbaiducomxiaodupicitem71cf3bc79f3df8dc023252e5ca11728b471028b3jpg\t232280\n上半场\t232281\n大荒\t232282\n嗯新闻变太\t232283\n北陶镇\t232284\n李逸群\t232285\nshuohua\t232286\n沈雅妮\t232287\n小卖\t232288\n跌价\t232289\n喜降\t232290\n造职\t232291\n执迷不悟\t232292\n卓有成效\t232293\n张志恒隆\t232294\n庚饭们\t232295\n吴泽华\t232296\n唻\t232297\n滞留\t232298\nipdilen\t232299\nStallone\t232300\n丹江口水库\t232301\njehd\t232302\n痴迷\t232303\nloll默默now\t232304\n酷热\t232305\nHAMMERSENG\t232306\n门卫\t232307\n韩山镇\t232308\n男十女\t232309\n魔法师\t232310\n清蒸山药\t232311\n笨丫头\t232312\n43678\t232313\n可也\t232314\n440811199906080613\t232315\n麻坛\t232316\npp88\t232317\nzozoxo\t232318\n提包\t232319\n我叫你了你给我\t232320\nP0or\t232321\nhi个p话\t232322\n好奇\t232323\n实木\t232324\n一万111111111嘿嘿嘿111\t232325\n苏束\t232326\n一二百万\t232327\n在哪里度\t232328\netouto\t232329\n阿斯琳\t232330\n弟弟裤\t232331\n够了你\t232332\n呀别\t232333\n少明了\t232334\nhttpehiphotosbaiducomxiaodupicitemb219ebc4b74543a91606acb919178a82b90114a1jpg\t232335\n疯团\t232336\n福克斯电视台\t232337\ngjvd\t232338\ngjvg\t232339\n数五秒钟\t232340\n马思涵\t232341\n英国剑桥大学\t232342\n日会哦瓮\t232343\n哪着\t232344\n嗔恚\t232345\n阿钰\t232346\n联合国宪章\t232347\n再见再见再见我不想\t232348\n8十8\t23
2349\n再见了拜拜\t232350\n2976555936\t232351\n同喜\t232352\n佛嗖风哦热风Joe\t232353\n下周末\t232354\nshake\t232355\n并存\t232356\n8394616178\t232357\nthata\t232358\n不不不我就喜欢\t232359\n上八好\t232360\n嗯子\t232361\n崇礼\t232362\n佐助\t232363\n向阳\t232364\n零四个月\t232365\n无线网\t232366\n冥轩\t232367\n骂名\t232368\n冰迪\t232369\n我喜欢春\t232370\n15911180138\t232371\n几莱阳市\t232372\n一国两制\t232373\n哥哥哥哥哥哥哥哥哥哥不呃\t232374\n季軍\t232375\n冰奸\t232376\n骗人的你别骗我你说好里狐狸\t232377\n冤枉死\t232378\nrewelcome\t232379\n冰女\t232380\nllshappy\t232381\n世界通\t232382\nVersace米兰\t232383\n我爱你我不想卫龙我就爱你\t232384\nddnfdsmx\t232385\n11357841\t232386\n坏银软绵绵\t232387\n赠出\t232388\n1月19号\t232389\n捍来\t232390\nHIT-5组合\t232391\n做敢\t232392\n六十ｃｍ\t232393\n心越乐\t232394\n#3000\t232395\n孔露露\t232396\n掏粪男孩么多面\t232397\n小秋\t232398\n火锅\t232399\n小种\t232400\n小秀\t232401\n不淘气\t232402\n0201\t232403\n垃圾食品\t232404\n啧了\t232405\n堆别\t232406\n真心的爱\t232407\njctdyixdjnu\t232408\n说好朋友\t232409\n卫东\t232410\n罗尼\t232411\n神茶\t232412\n音律\t232413\n二版\t232414\n短时间\t232415\n酸奶\t232416\n耽搁\t232417\n复垦\t232418\n皮尔洛\t232419\n生石花\t232420\n里巴里卡\t232421\n屋塔房王世\t232422\n王佳玲\t232423\n家用型\t232424\n星报\t232425\n陈小超\t232426\n逆花\t232427\nFATIMA\t232428\n德纲体\t232429\n站肚子饿\t232430\n王佳玮\t232431\n马丁蛋\t232432\n曲不离\t232433\n吸油汤\t232434\n56页\t232435\n香港运输及房屋局\t232436\n老叭叭\t232437\nex1一XX\t232438\n意愿\t232439\n每天然\t232440\n收款\t232441\n倾尽天下承君欢\t232442\n艾梦露\t232443\n围困\t232444\n棱角\t232445\n呼啦啦呼啦啦呼啦啦啦啦啦啦啦魔法收发首发吧啦啦魔法收发手法\t232446\n我爱你我坏蛋你的孩子\t232447\n照应该\t232448\n邓p\t232449\n那五那我叫你度秘行\t232450\nndash\t232451\n几日成\t232452\n手不释卷\t232453\n就是你的秘书\t232454\nyliomonorime\t232455\n26768964828203\t232456\n1518\t232457\nWIGUF\t232458\n1515\t232459\n车程\t232460\n1516\t232461\n1511\t232462\n笑傲\t232463\n1513\t232464\n1512\t232465\nwkk\t232466\n狒\t232467\nwkm\t232468\n坑人\t232469\n敏一晚\t232470\nShanghai\t232471\nwke\t232472\nwkd\t232473\n明净\t232474\nutygg\t232475\n鼠药\t232476\n15645207864\t232477\n和和和闻\t232478\n诺奖\t232479\n这么贵\t232480\nYou\t232481\n起火\t232482\n安四安馨\t232483\n13694667578\t232484\n太磨叽\t232485\n早上好\t232486\nhellostomit\t2324
87\nYoY\t232488\n过瘾\t232489\nQ2060740964\t232490\n铸铁\t232491\n捣\t232492\n科幻迷\t232493\n邝永川\t232494\n刺刺\t232495\n天花园\t232496\n振振有词\t232497\nxixixixixixixixixixixixixixixixixis\t232498\nPETA\t232499\n射手女\t232500\n山羊媚\t232501\n王民杰\t232502\n春惊天\t232503\n第十二\t232504\n22215\t232505\n刺刀\t232506\n22213\t232507\n早晨4点\t232508\nFuke\t232509\n五咱们俩\t232510\n妄自匪薄\t232511\ncucc\t232512\nGST\t232513\ncuch\t232514\n梭子蟹\t232515\n话务员\t232516\n感兴趣话\t232517\nHdfjcxbsnxjk\t232518\nec2d5628535e5dd374521ab71c6a7efce1b625djpg\t232519\n討厭\t232520\n香茶\t232521\n通信\t232522\n属于你的一片天地\t232523\nVhxjxl\t232524\n明天四周年\t232525\nGSL\t232526\n穿满服\t232527\n石柯石柯\t232528\n通俗\t232529\n公义\t232530\n公么\t232531\n摆聊\t232532\n叶子楣\t232533\n六龄\t232534\n咋个办\t232535\n零二六零\t232536\n太极拳\t232537\n陈六\t232538\n密度秒\t232539\n石头城\t232540\n范丞丞\t232541\n海椒\t232542\n一百多处\t232543\nhyhutdgj\t232544\n难言之隐\t232545\n密度秘\t232546\nHDJJBFHU\t232547\nfdbd\t232548\n一国人\t232549\n1十12\t232550\n13942943092\t232551\nfffffdu\t232552\n大哀易失颜,\t232553\n端康\t232554\n一团糟\t232555\n北京大厦\t232556\n耳熟能详\t232557\n赖自强\t232558\n成园温泉山庄\t232559\n0.4\t232560\n犯愁\t232561\n阴七三\t232562\nBedboy\t232563\n老四&amp\t232564\n沈默\t232565\n二十二天\t232566\n一亿岁\t232567\n淑瑞\t232568\n一嘻嘻嘻嘻\t232569\n赵四海\t232570\n七轮\t232571\n好遗憾\t232572\n米花\t232573\n唐俊\t232574\n孔吉\t232575\n治理\t232576\n散散步\t232577\n婉留\t232578\n好坏话\t232579\n一个零二二\t232580\n耳闻\t232581\n一两点\t232582\n霹雳贝贝\t232583\n我是你的小主人\t232584\n57468\t232585\n爱而生\t232586\n浙江泽国四中\t232587\nS8\t232588\n忽闪\t232589\n宽带距\t232590\n呃都市\t232591\n好久有\t232592\n颜觉\t232593\n卷发\t232594\n教米奇头杂梳\t232595\n淡色\t232596\nwovaokan\t232597\nxxxww\t232598\n查哈\t232599\n徐粗服\t232600\n徐梦倩\t232601\n肿木\t232602\nhttpehiphotosbaiducomxiaodupicitem3bf33a87e950352a8299047a5443fbf2b3118bc3jpg\t232603\nexxmqq\t232604\n和浩特\t232605\n米粉茶\t232606\ndtccn\t232607\n百灵鸟\t232608\n知何日\t232609\n一句三任\t232610\n袁大\t232611\n方晓东\t232612\n哈弗阿富汗猎犬\t232613\n送错\t232614\n孔庙\t232615\n3500\t232616\n吉文\t232617\nxhjfg\t232618\n登山包\t232619\n捡\t232620\n3506\t232621\n好啦好啦原谅你\t232622\nV5T裤\t23262
3\n环保部\t232624\n曼古王\t232625\n花花花花花花花花花花花花花花花花花花\t232626\n蟹谗广第\t232627\n猪你是猪你就是一头猪朱猪朱猪朱猪朱猪朱猪朱猪\t232628\n吗呀\t232629\n等你来\t232630\nIDP\t232631\n新能大战\t232632\ncggdgcchcjcjcjcjcjcjcjfhggcjggkfjfjfjchcjhjzhxhbg\t232633\n鸟毛\t232634\n李宇嘉\t232635\n涉黄\t232636\n东密渡觅渡觅地头克\t232637\nIDC\t232638\n凤梅\t232639\n多得多\t232640\ngjmwtjmwt\t232641\n专治\t232642\n标兵\t232643\n说说清楚\t232644\n8885555222212\t232645\nmuseum\t232646\n韩江\t232647\n该\t232648\n凤梨\t232649\n程艳丽\t232650\n一个月亮\t232651\nHISggjhj\t232652\n半干半\t232653\n秘才怪\t232654\n919727484494\t232655\n孤家寡人一个吗哼不理你了你骗我\t232656\n实版\t232657\nHbmhgmgdvjfvv\t232658\n金孟伟\t232659\n你好醉\t232660\n哪怕\t232661\n独家奥\t232662\n不大雾大\t232663\nstimatyou\t232664\n伤害\t232665\n植物大战僵尸博士二\t232666\nmythome\t232667\n银河美\t232668\n污点\t232669\n球星\t232670\n实物\t232671\n2011.05.03\t232672\n防对\t232673\n一轮儿\t232674\n七二零\t232675\n52公斤\t232676\n无孔不入\t232677\n希思黎\t232678\n拐打\t232679\n猥琐\t232680\n死螃\t232681\n文荒\t232682\n金桔\t232683\n这条路\t232684\n坐具\t232685\n憋屈\t232686\n比重\t232687\n比里\t232688\n做事情\t232689\n海洋总社\t232690\nblingbling\t232691\n大峰峰\t232692\n此生\t232693\n咯施施施施\t232694\n再给\t232695\n叫偶天使\t232696\n蓝媒公益】浙江卫视\t232697\n金丝鱼\t232698\n四联\t232699\n张伟丽\t232700\n狄\t232701\n王港闸\t232702\n春哥帅\t232703\n坏东西\t232704\nkklll\t232705\n上岛\t232706\n戴涵涵\t232707\n此画\t232708\n配料员\t232709\n林巧\t232710\n在家你在哪里了江红你在哪里聊聊你在哪里\t232711\n吹响\t232712\n想你了我想\t232713\n洪易爆暗强和你的\t232714\n457899\t232715\n乖亲亲\t232716\n动会\t232717\nvvpp\t232718\n那不勒斯\t232719\n三件套\t232720\n才红雪\t232721\n陈,\t232722\n喉咙\t232723\nETFIBBcchiphop个FD点石成金何须ugoNBC\t232724\n灯火阑珊处\t232725\n安道理\t232726\nbaba哥alibayou\t232727\nDRF\t232728\n愣儿\t232729\n犄角\t232730\n猴赛雷\t232731\n一只4\t232732\n金刚狼\t232733\n黑水河村\t232734\nyoutle\t232735\n来自己去\t232736\nweedold\t232737\n我我我不急\t232738\n莫名\t232739\n龙历险记\t232740\ngxuoyg\t232741\n张兆和\t232742\n啊不\t232743\n水纹\t232744\n巫妖\t232745\nNvcg\t232746\n京报\t232747\nffftrrt\t232748\nMeowstarpeople\t232749\n哒哈\t232750\nSunshine\t232751\n见了再见\t232752\n哈哈哈大爱\t232753\n小庞\t232754\n嗯津南\t232755\n财富\t232756\n小庚\t232757\n哒哒\t23275
8\n小店\t232759\n李学勇\t232760\n奇妙\t232761\n反应真不愧\t232762\n一三版\t232763\nloli\t232764\n就是我的爸爸\t232765\nwioodscnndxm\t232766\nKYA\t232767\n王八蛋中影\t232768\n悠之空\t232769\n走换\t232770\n报表鲁\t232771\n小康\t232772\n徒刑\t232773\n独立战争\t232774\n陈子琪\t232775\n薛颜丽\t232776\n快点儿不好意思我的和妈妈说话\t232777\n想尽\t232778\n滋阴\t232779\n胸摸\t232780\n应者\t232781\n博县\t232782\n解忧公主\t232783\n融洽\t232784\n靳儿\t232785\nbbvvh\t232786\n烤漆\t232787\n乔赛\t232788\nwocwocwwoc\t232789\n庙会\t232790\n冯志硕\t232791\nuuhm\t232792\n测试器\t232793\n赵丽贤\t232794\n莲蓉\t232795\n很反反复\t232796\n单数\t232797\nRTX\t232798\n度秘米s9岁\t232799\n气死才甘心\t232800\nGfgdgv\t232801\n你好哦你是我的爱\t232802\n六百分\t232803\n陈轩翰\t232804\n不浬\t232805\n桑德斯\t232806\n咬奶\t232807\n紧吻\t232808\n一天后\t232809\n乌兰\t232810\n郝海东\t232811\n200余名\t232812\n吉他\t232813\nbnj\t232814\n敢不打\t232815\n香香喷喷\t232816\n重庆凉粉\t232817\n不测\t232818\n不济\t232819\n不流\t232820\nkiss神马\t232821\n容量\t232822\n涛涛\t232823\nTap2\t232824\n苦死\t232825\n天冷不冷\t232826\n根深蒂固\t232827\n对翅\t232828\n老妖怪\t232829\n尤雅\t232830\nOPPOR九\t232831\n尤集\t232832\n7198件\t232833\n警示\t232834\n1cccc\t232835\n大运\t232836\n嗯红\t232837\nguuy\t232838\nguux\t232839\n别去\t232840\n没错听\t232841\n马基\t232842\n回头看来\t232843\n牛爷儿\t232844\njfgc\t232845\n诶妈呀\t232846\njfgg\t232847\n嗯纳\t232848\n鲜克\t232849\n一错再错不原谅\t232850\nn886\t232851\n尤雪\t232852\nguuf\t232853\n爽歪歪\t232854\n未得\t232855\n利率\t232856\nVBF\t232857\nnishsheiya\t232858\n灵机一动\t232859\n一个谜\t232860\n劲椎\t232861\n交说\t232862\n小怡猪\t232863\n一个调\t232864\n陆亚倩\t232865\n迷笛音乐节\t232866\n天津泰达\t232867\n冯磊\t232868\n福布斯\t232869\n辅亚星\t232870\nrix\t232871\n1500万\t232872\nmLearning\t232873\n洒家\t232874\n24566\t232875\n敦柱柱\t232876\n天天的么样\t232877\nFmes\t232878\nchic\t232879\n外挂\t232880\n火把\t232881\n东北婆婆\t232882\n怕雷劈\t232883\n十件事\t232884\n1111111122222333654477899\t232885\n關\t232886\n咪害\t232887\n梦中情人\t232888\n吖二二\t232889\n大额贷款\t232890\n条衫\t232891\n平固分\t232892\n闯\t232893\n问\t232894\n闭\t232895\n闫\t232896\n法大\t232897\n咕噜噜噜噜咕噜噜咕噜噜\t232898\n闷\t232899\ndgfc\t232900\n贴春联\t232901\nVB6\t232902\n闰\t232903\n我的惊喜风干的玫瑰下载你给我的情书\t232904\n闽\t23290
5\n闻\t232906\n闺\t232907\n闹\t232908\n闸\t232909\n期望值\t232910\nOzzz\t232911\n石英\t232912\nggxf\t232913\n夏令营\t232914\n宫心妍\t232915\n宪华\t232916\nHGGJF\t232917\n立即\t232918\n128G\t232919\n张丽姐\t232920\n赛尔瑞\t232921\n真面包\t232922\n13062121186\t232923\n那么说的话\t232924\n花鸟\t232925\n128K\t232926\n拜拜去\t232927\n没话讲\t232928\n3200114942\t232929\n逗圈子\t232930\n大野洋子\t232931\n金灿灿\t232932\n吃鸡排\t232933\n吴肖彤\t232934\n二苦\t232935\n小偏高\t232936\n断头台\t232937\n１3门\t232938\n霞光\t232939\n说死陆\t232940\ndoopboy\t232941\n天天有喜之人间有爱看过没\t232942\n高矮\t232943\nlllarin\t232944\n金币\t232945\n调皮\t232946\nCONVERSE\t232947\n朴会长\t232948\n金市\t232949\n2457787644443\t232950\n天天打手虫\t232951\n要芭\t232952\n旗号\t232953\n行吧度秘\t232954\n讲话\t232955\n筋疲力竭\t232956\n老薛\t232957\n你好样\t232958\njeegg\t232959\n809家\t232960\n二十几号\t232961\n读一编\t232962\n讲评\t232963\nggggsssk\t232964\n成群\t232965\n私房照\t232966\n126.0度\t232967\n我不爱的人可爱我\t232968\n收受\t232969\n1285\t232970\n朱清华\t232971\n1281\t232972\n1280\t232973\n1283\t232974\n林晓燕\t232975\n晃晃荡荡\t232976\n亿倍\t232977\n咩都糸\t232978\n讲课\t232979\n黑豹乐队\t232980\n年报\t232981\n伐沒\t232982\nXsok\t232983\n消费\t232984\n辣椒枪\t232985\ntigzjcxkkclfgkj\t232986\n妖化\t232987\nBlue\t232988\n光明正大\t232989\n密暗\t232990\n前晚\t232991\n匡文\t232992\n有恐惧症\t232993\ndiddd\t232994\n拉毛\t232995\n六九\t232996\n六乘\t232997\n断绝\t232998\n鸡蛋黄\t232999\n雷轰\t233000\njhjjj\t233001\njhjjh\t233002\n花旷世\t233003\n很喜感\t233004\n拉比\t233005\n错别\t233006\n三十分米\t233007\n40.2万\t233008\n志妹子\t233009\n干嘛推\t233010\n公鲤\t233011\n下到此\t233012\n冰迅\t233013\n魔教\t233014\n一八年后\t233015\n99行\t233016\n曹冲\t233017\n110366584\t233018\n不不不不不我想和你\t233019\n学信\t233020\n摄入量\t233021\n出目\t233022\n主将\t233023\n巴巴\t233024\n疯子\t233025\n平顺\t233026\n谢楠\t233027\n五四\t233028\nyyuui\t233029\n平顶\t233030\n再不消\t233031\nwenm\t233032\n破相\t233033\n不管算\t233034\n剿灭\t233035\naob\t233036\n弘法寺\t233037\n彦哥哥\t233038\n12345678910125566669870\t233039\n五国\t233040\n差快说\t233041\n勇于\t233042\n一月六\t233043\nwent\t233044\nfgvhhn\t233045\n其它地区\t233046\n耽美\t233047\n不够很难看\t233048\n栀雪\t233049\n存心\t233050\n余香\t233051\n就是我們\t2330
52\n千呼万唤\t233053\n8888888888888888\t233054\n车展\t233055\n南皇\t233056\n三生三世十里桃花将是我\t233057\n碰瓷儿\t233058\n龚伟华\t233059\n道罗战神\t233060\n信言\t233061\n傻傻我说的是你说的你知道芈月传\t233062\n83棵\t233063\n弘诗\t233064\n制胜\t233065\n一年两年\t233066\n黄曼灵\t233067\n136771树\t233068\nQ23a\t233069\nvifg\t233070\nvifi\t233071\n南皮\t233072\n嗯嗯\t233073\n15954415563\t233074\nSNS\t233075\n夜车\t233076\n我喜欢英\t233077\n毛芷晴\t233078\n阿洋\t233079\n讲面\t233080\n夜轮\t233081\n玉树\t233082\n三排座\t233083\n法彩\t233084\n为你早\t233085\n称得上\t233086\n视距\t233087\n蕭帮\t233088\n人世\t233089\n塞智\t233090\n鲜鱼片\t233091\n刘嘉雷\t233092\n好丑啊好丑啊好丑好不开心不开心不开心\t233093\n加油度\t233094\n蕴诺丝\t233095\n十三年后\t233096\n加特林\t233097\n振荡不宁\t233098\n气室\t233099\n画画家\t233100\n陀螺王\t233101\n咕噜咕噜咕噜\t233102\n秘码\t233103\n34位\t233104\n十几回\t233105\n王王甲\t233106\n认出\t233107\n陪你好\t233108\n煮妇神探全集\t233109\n视路\t233110\n八月涛\t233111\n安吻\t233112\n许许多多\t233113\nxhxhkigdf\t233114\n亲爱的们\t233115\n厚比\t233116\n非典\t233117\n京广桥西北角瀚海文化\t233118\n受挫感\t233119\n十二页\t233120\n哦衣\t233121\nhh文\t233122\nTUGG\t233123\n霉运\t233124\n戏说\t233125\n大天蝎\t233126\n复复\t233127\n度度度度\t233128\n5561225585852148咯\t233129\n多图摸头煤炭局\t233130\n不赞成\t233131\n任达华\t233132\n函数\t233133\n外公咪\t233134\n朔州\t233135\nｍｙｍｉｓｅｒｅｎ\t233136\n洋芋粑\t233137\n尊老爱幼\t233138\n大巴\t233139\nBaBy\t233140\n八个月牙\t233141\n4月19日\t233142\n呃秘花园\t233143\n今人\t233144\n南和县\t233145\n10倍\t233146\n火腿肠\t233147\n八百八八八八十八\t233148\n长大好帅\t233149\n1899485718994857\t233150\n查问\t233151\nsilly\t233152\nUdmfkk\t233153\npgtjgpapa\t233154\n高秀梅\t233155\n話費\t233156\n87654321\t233157\n一个819\t233158\n13.5元\t233159\noopppp\t233160\n月率\t233161\n调号\t233162\n林老板\t233163\n碗儿\t233164\n五十度\t233165\nkajmapjtjpktmt\t233166\n恩恩恩恩恩恩\t233167\n14毫秒\t233168\n吃肉不吧唧嘴\t233169\n375ml\t233170\n咋你\t233171\n牧童\t233172\n偏分\t233173\n爷府\t233174\n笑话了我想你\t233175\n调取\t233176\ns7\t233177\nPOS\t233178\n掏粪工\t233179\n亲亲你的爱亲情我的爱\t233180\n456家\t233181\n肉包子\t233182\n坛坛罐罐\t233183\n你好背压\t233184\n一七八岁\t233185\n当仁不让\t233186\n歌单\t233187\nghxetfde\t233188\nQhwahajjwvsjhwjwbgsjhwjwbgsjhugsuhgfjgbuddieshubbubbjjbgghb\t233189\
n宜兴可爱滴喜羊\t233190\n张星雅\t233191\n有多非\t233192\n黄云\t233193\n施美\t233194\n上不了然后\t233195\n大波比\t233196\n石家庄动物园\t233197\n光假照\t233198\n找茬\t233199\n干妈\t233200\n腾空而起\t233201\n度秘灵\t233202\n姑奈zzz\t233203\n黄亮\t233204\n抢必看\t233205\n金融街\t233206\nfityou\t233207\n这话\t233208\n迷失回\t233209\n王喜\t233210\n大胖秘\t233211\n名声\t233212\n斧头帮\t233213\n时时刻刻\t233214\n一起演\t233215\n从何说起\t233216\n小玩奥特曼\t233217\nending\t233218\n嘛嘛\t233219\n常准\t233220\n旅射程\t233221\n恋人絮语\t233222\n海波cv\t233223\n火狼\t233224\n鱼山\t233225\n70040000\t233226\n病菌播散机\t233227\n背吧儿歌\t233228\n你是谁你是谁的儿子是谁\t233229\n黄漫\t233230\n熊大之\t233231\n79点\t233232\n火狐\t233233\n明宏德\t233234\ngotbored\t233235\n五星级\t233236\n官术\t233237\n狗疯\t233238\n地带\t233239\n好聊\t233240\n侵略者\t233241\n生意经\t233242\n亚利朗\t233243\n潘朵\t233244\n谍战\t233245\n13781394728\t233246\n一中年\t233247\n据老\t233248\n粉蒸肉\t233249\n迹部景吾\t233250\n地市\t233251\n张松\t233252\n度秘你是个猪猪猪猪猪猪猪猪猪猪\t233253\n张娃娃\t233254\n爱奇唉\t233255\n细菌\t233256\nhttphhiphotosbaiducomxiaodupicitemd\t233257\n硫酸枪\t233258\n银泰\t233259\n你岁\t233260\n沙井\t233261\n爱火\t233262\n黑道\t233263\n赵凤阳\t233264\n若隐若现\t233265\n拉一想过\t233266\n陶渊明\t233267\n黑撒者\t233268\ncntscheba\t233269\n此项\t233270\n需用钱\t233271\n截下\t233272\nammjjtatja\t233273\n亲吻\t233274\n1620968614\t233275\n井大宝\t233276\n轮回\t233277\n铭心\t233278\n走哈\t233279\n改头换面\t233280\n新衣\t233281\njianniyao\t233282\n打桩\t233283\ngjgkga\t233284\n课税\t233285\nhgrk\t233286\n家年夜饭\t233287\n两三倍\t233288\n东阿阿胶金丝枣\t233289\nDemery\t233290\nIYSGFWFRGTTT\t233291\n新街\t233292\nikzk\t233293\nwmlgmjajj0000358888696\t233294\n新行\t233295\nCA1525\t233296\n斗命\t233297\n盘点日积月累\t233298\n暂无\t233299\n四九0点\t233300\n真心想你\t233301\n再除\t233302\n李心芮\t233303\n七千八百九十九二十七\t233304\n我草我草我草\t233305\n午门乃一拉组特\t233306\n天寒地冻\t233307\n不会\t233308\n不众\t233309\n百多360\t233310\n不休\t233311\n对堆\t233312\n余味无穷\t233313\n家庭作业\t233314\n马萍\t233315\n杀力威\t233316\n十四五八四十到\t233317\n1778万\t233318\n不似\t233319\n四三八\t233320\nkja56k\t233321\n物美卓\t233322\n上月\t233323\n张知为\t233324\n家猪\t233325\n豪华\t233326\n18258910762\t233327\nhttpehiphotosbaiducomxiaodupicitem0824ab18972
bd407092178897c899e510eb309dbjpg\t233328\n98444\t233329\n第二遍\t233330\n阿紫仪\t233331\n败足\t233332\n飞来飞\t233333\n吴淑瑜\t233334\n唉师\t233335\n第二道\t233336\n钢琴系\t233337\n见花开\t233338\n混养\t233339\n538065238456885455566555333511458234863332255555558841123369\t233340\n奥罗威\t233341\nAreyouOK\t233342\n大男孩\t233343\n刑事\t233344\n遮挡\t233345\n官僚主义\t233346\nstometro\t233347\n桂花\t233348\n二十二五\t233349\n桂芳\t233350\n绫缎\t233351\n米肉麻\t233352\n24468852157998\t233353\n严成命\t233354\n审判天神诺亚\t233355\n铅笔\t233356\n黑说\t233357\n虎皮猫\t233358\n最讨\t233359\n劳工\t233360\n要好\t233361\nkpvhxy\t233362\n11年春节\t233363\n雄浑\t233364\n乃克\t233365\nhnnnnlkl\t233366\n是大非\t233367\n卖国贼\t233368\n要女\t233369\n化妆水\t233370\n黑话\t233371\n放泼\t233372\n小十岁\t233373\n不晚安不爱我要和你聊天度秘别晚安\t233374\n30多首\t233375\n桃谷仙\t233376\n回流比\t233377\n学生公寓\t233378\n囚绿记\t233379\nb哥\t233380\n律动\t233381\n己任\t233382\n毒死\t233383\n一杆\t233384\n测谎仪\t233385\n同人\t233386\n送机#吴亦凡##Kris#\t233387\n想吃吃\t233388\n热心人\t233389\n0.32\t233390\n忠烈杨家将\t233391\n对呀嘴\t233392\n王林美\t233393\n一束\t233394\nhellomymmit\t233395\nhfhjbghjn\t233396\n看好不好\t233397\n斗蕞\t233398\n永远不我和你绝交\t233399\n天上一个太阳\t233400\n一条\t233401\n一杠\t233402\n一杯\t233403\nAV75\t233404\n朝令\t233405\n冯纪超\t233406\n刚刚湖\t233407\n朝代\t233408\n釅肖申克\t233409\nHusk\t233410\n打老虎\t233411\n王片\t233412\n讨厌啊长\t233413\n一松\t233414\n王牌\t233415\n三十次\t233416\n十点零一\t233417\n上官警我上官警我\t233418\n梁凉\t233419\n18896952689669\t233420\n混入\t233421\n诗漫\t233422\n16.92%\t233423\nQQ我游戏屋恶嘟嘟\t233424\novera\t233425\n关键盘\t233426\n有利\t233427\n事等一下\t233428\n奥维\t233429\n差不起\t233430\n哦替\t233431\n爱你的和你爱的你\t233432\n袖子\t233433\n母亲节i由5s2弓3幺\t233434\n田囖囖\t233435\n我真的好爱好爱\t233436\nakks\t233437\n听不会\t233438\n大批大批\t233439\n笼中\t233440\n舞天\t233441\n有我是说学唱歌唱几句行不行\t233442\n15937892998\t233443\n我俩个那bb了行不行我求你\t233444\n我是女我是男\t233445\nk700\t233446\n大嶼山\t233447\n栖\t233448\n才孙\t233449\njust\t233450\n永春\t233451\n张赛\t233452\n才子\t233453\n新闻大厦东塔\t233454\n茂门口\t233455\njusd\t233456\n别再三\t233457\n秘回\t233458\n四棵\t233459\n第六元\t233460\n飞轮\t233461\n王宇涵\t233462\n篷勃\t233463\n商飞\t233464\n纺织机器人\t23
3465\n巴西左路\t233466\n反侮\t233467\n我是大你了再也不找你\t233468\nKcj\t233469\n佩宁\t233470\n学民\t233471\nuuihqhUAHAUah\t233472\n大福源\t233473\n钱塘湖\t233474\n金劳\t233475\n张赫\t233476\n才学\t233477\n1389837790\t233478\n腾讯视频\t233479\ngireengines\t233480\n一生一世虎\t233481\nyyghu\t233482\n出去\t233483\n活嗨\t233484\n扔出\t233485\n加德满都\t233486\n开动\t233487\n咖咖\t233488\n张晨瑞\t233489\n度秘蜜\t233490\n运算\t233491\n异思\t233492\n裂嘴\t233493\n打错了你打错了\t233494\n一回千秋\t233495\n对儿红\t233496\n爬上岸\t233497\n以身相\t233498\n开脱\t233499\n马起\t233500\n中南元\t233501\n杉杉\t233502\nh8weeggfbexxievurr\t233503\n异性\t233504\n对呀牙\t233505\n出厂\t233506\n川股\t233507\n红楼\t233508\n真的爱我错\t233509\n开办\t233510\n崔健\t233511\n受宠若惊\t233512\nNuckuckucumtsotsMURMURKURUsrmUUDMULKyr6rksitzluktzKRU\t233513\n等等文本我不是你的女神\t233514\n咕嘟咕\t233515\nrongly\t233516\n25696387\t233517\n迪拜\t233518\n完成时\t233519\n相机\t233520\n牵扯\t233521\n38条\t233522\n5608\t233523\n保真\t233524\n桂圆\t233525\n细兄\t233526\n幸福片\t233527\n爱你看我\t233528\n郑振华\t233529\n翅利川利川\t233530\n过日子\t233531\n22913场\t233532\n彭向\t233533\n188732093\t233534\n相望\t233535\n相期\t233536\n牵手\t233537\n广场\t233538\n所有格\t233539\n堆雪人\t233540\n冤囚\t233541\n舞阳\t233542\n知心怪\t233543\n身心\t233544\n民主政府\t233545\n大尊额\t233546\nttjg\t233547\n廖雨洁\t233548\n顷刻\t233549\n艾雷斯\t233550\n佛国\t233551\n侯飞凡\t233552\nUgugfnj\t233553\n练刚\t233554\n几千几百遍\t233555\n李世丽\t233556\n杨丞琳\t233557\n苦无\t233558\n今年清明\t233559\n谢睿寒\t233560\n大日\t233561\n生财\t233562\n谍照\t233563\n大无\t233564\n长晚\t233565\n尹嗯\t233566\n八十多分\t233567\n皮优\t233568\n杜米\t233569\nb超\t233570\n小度秘猪猪\t233571\n7匹\t233572\n天凌\t233573\n没事骂\t233574\n天凉\t233575\n表框\t233576\n潘以瑶\t233577\n手续性\t233578\n45545557\t233579\n琦少\t233580\n大旗\t233581\n恋老网\t233582\n天凤\t233583\n克夫\t233584\n塑像\t233585\n必需品\t233586\n咖喱鸡\t233587\n武侠小说家\t233588\n神秘的人\t233589\n刀尖\t233590\n陈韵柔\t233591\ngnm0\t233592\n拉开序幕\t233593\n号数\t233594\nberv\t233595\n200几年\t233596\n082.5度\t233597\n程思思\t233598\n萌萌哒加美美哒\t233599\n幺不死\t233600\n出了问题\t233601\n三亿多\t233602\n咪唑\t233603\n蛤蟆\t233604\n好失落\t233605\n陈梓萌\t233606\n大观园\t233607\n十八厘米\t233608\n填完\t233609\n臭老\t233610\
n不是你女\t233611\n6吨\t233612\n七周\t233613\n辣罗卜\t233614\n胸女\t233615\n2134213089\t233616\n周很久\t233617\n22222400421022\t233618\n赫也\t233619\n连连\t233620\n在一起吧谢谢你的喜欢\t233621\n那个么\t233622\n周正一\t233623\n仆从\t233624\n该玩\t233625\n重任\t233626\n借道\t233627\n憨子\t233628\n合订\t233629\n合计\t233630\nbyebyebye\t233631\n瓦塔拉\t233632\n合训\t233633\nHDJffgfhxGDGXfcxgFFFC\t233634\n杨思\t233635\n邓认筹\t233636\n1nnnnnnn不能nnnnnn呵呵nnnnnn刷屏nnnnnnn嘻嘻nnnnnnnnnnnvv\t233637\n膏药\t233638\n我也不想\t233639\n三里庄\t233640\n你的爷爷\t233641\nS.M公司\t233642\nstammy\t233643\nwftaty\t233644\nffjd\t233645\nHrhe\t233646\n哈吉美\t233647\n2456262365\t233648\n海皮克斯\t233649\n高教行\t233650\n小丫小苹果\t233651\nud7tdlhhdwo90hbxjfgqidukoit5voojewtovjpbu8r2545jx\t233652\n句变比\t233653\n鸦雀无声\t233654\nvciyrkhf\t233655\nghjkjfgh\t233656\n162357\t233657\n九百九十六九一二九九九\t233658\na15\t233659\n458855\t233660\n三二栋\t233661\na11\t233662\na10\t233663\n发生率\t233664\n小人\t233665\n宋冰倩\t233666\n东直门站\t233667\n痴情人\t233668\nKARRY\t233669\n园林\t233670\n五更天\t233671\n调研\t233672\n121345487\t233673\n好无奈\t233674\n急行\t233675\n44043500\t233676\n霸\t233677\nhiheto\t233678\n城南家园\t233679\nIiiuibiji\t233680\n体力\t233681\n循循\t233682\n博格坎普\t233683\n好车速\t233684\n18：18\t233685\n天天有喜之人间有情\t233686\n赋\t233687\n话说清楚\t233688\n科学探索者\t233689\n张掖美\t233690\n罗董\t233691\n1877639258\t233692\n秦淑哲\t233693\n毒瘤\t233694\n眼疲劳\t233695\n98名\t233696\n钱文静\t233697\n男妹子\t233698\n行骗\t233699\n我要你给我讲尸兄的故事\t233700\n王颖鸿\t233701\n无依无靠\t233702\nTrekkies\t233703\n滑翔伞\t233704\n橘子梦\t233705\n该不该\t233706\n毒瘾\t233707\nsoule\t233708\n那你现在在干嘛啊我只你问你\t233709\n张摩擦\t233710\n连发\t233711\n这感\t233712\n五点23分\t233713\n家片\t233714\n四四强\t233715\n阿茶\t233716\n1308143831518364616375\t233717\n连句\t233718\ndá\t233719\n温温柔柔\t233720\n7万\t233721\n7一\t233722\n天高\t233723\n宣扬\t233724\n7三\t233725\n惊天动地\t233726\n老大爷\t233727\n田园\t233728\nu想揭瓦\t233729\n毛毛丝\t233730\n葱油味\t233731\n妃夕颜\t233732\n看海\t233733\n申亚\t233734\n信n2萤火虫n3桃李梅n4银河渡口n5双喜临门n6久雨初晴n7海中绿洲n8持久和平n9巨轮出港n10大家笑你n11夕阳西下n12遍地鲜花n13相差无几n14江淮河汇n15东南北n16春水碧如蓝n17\t233735\n勇敢的心\t233736\ntimor\t233737\n200450500\t
233738\n工商店\t233739\njhs\t233740\n董雨欣\t233741\nygdbcheiahdbbchd\t233742\n外边\t233743\n惹我不高兴\t233744\n会性\t233745\nksnbhdbc\t233746\n微波\t233747\n则弟\t233748\n18日\t233749\n1120\t233750\n唱完\t233751\n1122\t233752\n1124\t233753\n1125\t233754\n1255365868\t233755\njttgjt\t233756\n祖平平\t233757\n18时\t233758\n后判\t233759\n牧野花园\t233760\n唱宝\t233761\n跳进水\t233762\n说听歌\t233763\n火山爆发\t233764\n加场\t233765\n谁和你约\t233766\n方儿\t233767\n电视单身\t233768\n3650\t233769\n赫尔城队\t233770\n二月二十六日\t233771\n郭宏伟\t233772\n圣痕\t233773\n最美的时光\t233774\nRapaport\t233775\n北者\t233776\n欣涵\t233777\n度密茶\t233778\n擞场\t233779\n敲打\t233780\n喜观\t233781\nDpdpdo\t233782\n可持续\t233783\n天下父母\t233784\n辛兴\t233785\n有资格\t233786\nzanrevyn\t233787\nuehjdsjwi\t233788\n孟昊东\t233789\n防子弹\t233790\n18888888888888岁\t233791\n塞毛\t233792\nhinor\t233793\n全力\t233794\n712点\t233795\n毁人\t233796\n全办\t233797\n张志菲\t233798\n我是歌手第四季\t233799\nfududh\t233800\n中华电影\t233801\n一整年\t233802\n壮族\t233803\n稀罕我\t233804\n格鲁吉亚\t233805\n三代机\t233806\n还死不承认\t233807\n暖意\t233808\ndhGd\t233809\n唐豆豆\t233810\n484555545555\t233811\n彼岸花\t233812\n闫亚燊\t233813\n东火山\t233814\nWhen\t233815\n可吐\t233816\n盾妮\t233817\n给我听\t233818\n使坏\t233819\n王文豪\t233820\n吧女\t233821\narehereasa\t233822\n倒入\t233823\n哪里块\t233824\n再见吧好喜欢\t233825\n脑溢血\t233826\n覃雨飞\t233827\n没不可行\t233828\n朗绪\t233829\n美俗\t233830\n瞑目\t233831\n朗维\t233832\n可否\t233833\n洗衣机密\t233834\n微薄\t233835\n平蹚\t233836\n细声细语\t233837\n幕僚包机\t233838\n瑶族人士兵王国家级数码港\t233839\nG18黑2750\t233840\n精神食粮\t233841\n屌屌而丽\t233842\n468495\t233843\nTVBS\t233844\n什么快说\t233845\n度秘武\t233846\n哭不哭\t233847\n非公堂\t233848\n不算话\t233849\n7k7k7k7k\t233850\n不是我心爱的\t233851\n沉降\t233852\ndyffhcghc\t233853\n什氏\t233854\nandiculi\t233855\n酒节\t233856\n锣鼓喧天\t233857\n3579464\t233858\n又话\t233859\nBath\t233860\n赵县\t233861\n陈梦凡\t233862\n八十棵\t233863\n睡駿\t233864\ngjpK\t233865\n追平\t233866\n别体\t233867\n泽宝宝\t233868\njjdjf\t233869\n偏往\t233870\njjdjd\t233871\n合家欢年卡\t233872\n安卓cookhotmysendmecomkomathe慢ifoilailaao\t233873\n别作\t233874\n2011年2月21日\t233875\n父不认儿\t233876\n110226\t233877\n300多平方米\t2338
78\njjdjx\t233879\n别住\t233880\n闺女\t233881\n爱好不好嘛\t233882\n云南盈江电子商务\t233883\nｙｏｕ\t233884\n一脸一点\t233885\n#热博\t233886\n#SS501#\t233887\n苏之口\t233888\n奇花\t233889\nhdhbrbf\t233890\n林多美\t233891\n阿美\t233892\n星期一下午\t233893\n沥沥沥\t233894\n八路车八路车\t233895\n欧式p\t233896\nhttppinyincne6209\t233897\n120210\t233898\n丝瓜\t233899\n打就打\t233900\n一束阳光\t233901\n乱讲\t233902\n好吃丽\t233903\n下设\t233904\n宋健凯\t233905\n会听话\t233906\n早出b2\t233907\n乱认\t233908\n佐由纪\t233909\n西溪数码港\t233910\n单纯皇室小公主的樱花爱恋\t233911\n陈新泽春心\t233912\n王堡\t233913\n个性\t233914\n陈诗妍\t233915\n早朋哥儿\t233916\nq\t233917\ncihikoby\t233918\n武平县\t233919\n5558588555\t233920\n丝瓦\t233921\n绝对不换\t233922\nnearly\t233923\n心情不顺\t233924\n鸽子王\t233925\n段叫\t233926\n绿洲\t233927\n度殷\t233928\nyghhj\t233929\n说对其一\t233930\n彩画\t233931\nzaikaizhi\t233932\n春英\t233933\n隆污\t233934\n猜歌之王\t233935\n高标\t233936\n丽萨\t233937\n一百万亿\t233938\n恭迎\t233939\n首项\t233940\n難退\t233941\n梁继贤\t233942\n恩老公\t233943\n塞s\t233944\n降魔\t233945\nfijndtt\t233946\n百家之言\t233947\n解郁\t233948\nparner\t233949\ndtsgwerfdfgsffg\t233950\n下回\t233951\n丽萍\t233952\n植物大战僵尸草原\t233953\n名器\t233954\n重力\t233955\n卡里只\t233956\n时报\t233957\n潮阳熏鸭\t233958\nBHHn\t233959\n冬闲\t233960\n唐冰莹\t233961\n15663324457\t233962\n杏田\t233963\n什儿\t233964\n梦江花园\t233965\n嘻嘻行不行\t233966\n哦亲\t233967\n苔藓\t233968\n[中字]120503朱炳镇的Talk\t233969\n电板\t233970\n几十条\t233971\n10亿\t233972\n张夏天\t233973\n原始\t233974\n知道片\t233975\n丧生\t233976\n123412345\t233977\n睡虫\t233978\n灭水\t233979\n杨洋\t233980\n爱奇呀堡\t233981\n新店镇\t233982\n长姿势\t233983\n爹炮\t233984\n杨洁\t233985\n护理人员\t233986\n负英冠德比\t233987\nyhHHHJKO\t233988\n杨洛\t233989\n过一会\t233990\n进班\t233991\n听旨\t233992\ncetc\t233993\n身贴\t233994\n复活赛\t233995\n普渡众生\t233996\n赴宴\t233997\n侯\t233998\n附性\t233999\nRTF\t234000\n1.5亿多平方米\t234001\n小齐娜\t234002\nA片\t234003\n侨\t234004\nixds\t234005\n身负\t234006\n郭奶奶\t234007\n如鱼在案\t234008\n行凶\t234009\n子女朋友\t234010\n杀玲\t234011\n1593568\t234012\n失眠药\t234013\n20多年\t234014\n靖钦茹\t234015\n食饭\t234016\n蔡世友\t234017\n非杀\t234018\n弹壳\t234019\n九州同句\t234020\n马屁经\t234021\nhjdg\t234022\n临沂市政法委\t234023\n老
行不行\t234024\nfgyuj\t234025\n当场上\t234026\nhjdn\t234027\n精神\t234028\nJsugdkddbs\t234029\n烦呀请问\t234030\n睡作业\t234031\n第168章\t234032\n李一点\t234033\n201016年\t234034\n桂花糕\t234035\n睡风\t234036\n依\t234037\n乳穈\t234038\n上海体育场\t234039\nla7687a\t234040\n多假\t234041\n摩蟹男\t234042\n沈炎肃\t234043\n两千万\t234044\nP了\t234045\n东本\t234046\n药水\t234047\n飞骂\t234048\n忙秘\t234049\n我和他\t234050\n李在不起\t234051\nFjbfhktbkgjgkrdajdjkgidticegkouytqweertyuioplkjhgfdsazxcvbnm\t234052\n辉哥\t234053\n支树平\t234054\n同程\t234055\nhttphhiphotosbaiducomxiaodupicitembd315c6034a85edf9a12b4564e540923dd54752cjpg\t234056\n多咪多咪多咪多咪多咪多咪多咪多咪多咪\t234057\nletmelaugh\t234058\n张连志\t234059\n外来包\t234060\noudokei\t234061\nxuifp\t234062\n分包\t234063\n口尿\t234064\n永州\t234065\naremy\t234066\ngghjvlv\t234067\n嗯无中生\t234068\n堡奥特曼\t234069\n东朝\t234070\n福晋\t234071\n彭晨儿\t234072\ngta\t234073\nlgagb1b1aja\t234074\n亲父兄\t234075\n额曲项向天歌白毛浮绿水红掌拨清波\t234076\n6948735300061\t234077\n名旦\t234078\n玩转私人PARYT\t234079\ncffftc\t234080\n米罗\t234081\n零落群\t234082\n3000万个\t234083\n习惯弹\t234084\n恍然\t234085\n黑虎\t234086\n金钟仁\t234087\n王国生\t234088\n高夕然\t234089\n和田站\t234090\n輩子\t234091\n男哒\t234092\n噜噜噜噜噜啦啦xo\t234093\n代泊\t234094\n嗯瓮天下\t234095\n你好想想你\t234096\n男哥\t234097\n无性别医德林子角度秘那你的朋友\t234098\n小井\t234099\n354\t234100\n想过\t234101\n老大的幸福\t234102\n辛德勒\t234103\ngytgg\t234104\n黑商\t234105\n联合国儿童基金会\t234106\n3900\t234107\n弯骂\t234108\n1n一\t234109\n460英里\t234110\n大骚\t234111\nredfdfdgfdd\t234112\n仇慧珍\t234113\n大骂\t234114\n嗯嘴\t234115\n上洞\t234116\n想入非非\t234117\nTGV\t234118\n心情无比\t234119\n东北银呐我\t234120\n张永昌\t234121\n自渴\t234122\n充实\t234123\nColorWare\t234124\nTGC\t234125\n定居\t234126\n15234385659\t234127\n自动化\t234128\n其他\t234129\n大骨\t234130\n修盖\t234131\n返乡\t234132\nshenggie\t234133\n来人去过\t234134\n三级个\t234135\n15816348479\t234136\n捋\t234137\n霜凍\t234138\n话题\t234139\n钱废话\t234140\n0998876\t234141\n丁香度\t234142\n肆无忌惮\t234143\n刺眼代购\t234144\n橘皮糖茶\t234145\nCOMAC\t234146\n2一个\t234147\n瑞米\t234148\n咯体\t234149\n团队\t234150\n震\t234151\n真心真意\t234152\n显眼\t234153\n你的秘度\t234154\n完美女孩\t234155\nlovelove\t234156
\n智霖旅社\t234157\noueapou12\t234158\n为我那呐喊助威\t234159\n四杆\t234160\n苦逼脸\t234161\n呢讨厌鬼\t234162\n7棵\t234163\n四束\t234164\n风流人物\t234165\nTest\t234166\n基本费\t234167\n今天\t234168\n16位\t234169\n肚壁\t234170\n2156123856\t234171\n恩负义\t234172\n易思娟\t234173\n四杯\t234174\n今夕\t234175\n今多\t234176\n善子\t234177\n六普拉斯\t234178\n100000059464946494646594646496494个\t234179\n今夜\t234180\n11.40\t234181\n南宫流风女主\t234182\n瞬时\t234183\n照锥心\t234184\n今夏\t234185\n栗震亚\t234186\n明天早上\t234187\ngtj\t234188\n513264\t234189\n儿子\t234190\n白富美你是黑穷丑\t234191\n舨\t234192\n失利\t234193\n38亿\t234194\n大说话\t234195\n儿孙\t234196\n卡普\t234197\n求事\t234198\n山茶花\t234199\n碍手\t234200\n哪几个\t234201\nihssaseeere\t234202\nAV正网\t234203\n科教\t234204\n阿尔\t234205\nmjjjjjwtmmjjjmmpjjjgmmmjjjptmmmmjjjjjjjjjmmmmptt\t234206\n几滴\t234207\n求人\t234208\n法国路易斯·嘉玛发廊\t234209\n翻越\t234210\n327274\t234211\n萨呦\t234212\n周一行\t234213\n天水市\t234214\n幺点\t234215\n枫树\t234216\n我没办法\t234217\n粑粑粑粑粑粑粑粑粑粑粑粑粑粑粑粑粑\t234218\n坐腿\t234219\n腐乳通菜\t234220\n求交\t234221\n好心痛\t234222\n北西\t234223\n巴勒斯坦\t234224\n投诚\t234225\n现代\t234226\n墨镜\t234227\n一会儿等会儿\t234228\n魔羯座\t234229\nseochuanmei\t234230\n现任\t234231\n壹零年\t234232\n投诉\t234233\n不揣气\t234234\n现价\t234235\n才大才\t234236\n现今\t234237\n2008年下半年\t234238\n一百九十九个\t234239\n好想死\t234240\n喜欢你了吧\t234241\n看着我\t234242\n啼笑皆非\t234243\n萌达萌萌哒\t234244\n190分钟\t234245\n9958258\t234246\n李小洲\t234247\n小额贷款\t234248\n睡硬\t234249\nparent\t234250\n我做你是我的最爱\t234251\nFOOD\t234252\n23333333\t234253\n黄华华\t234254\n下面\t234255\n美的回忆\t234256\n导火索\t234257\n老婆餐\t234258\n总评\t234259\n东风乡\t234260\n异世神\t234261\n有趣你的摸摸我特别梦\t234262\n武广高铁\t234263\n造假者\t234264\n爱我你就个丑八比我大也不理\t234265\n台机\t234266\n嗯许一世\t234267\n出版商\t234268\n15510583145\t234269\n26个月\t234270\n就是你是\t234271\n沉甸甸和\t234272\n二二二二幺六\t234273\n贫困地区\t234274\n5189340\t234275\n国立武汉水族馆\t234276\n玛德\t234277\nyy网\t234278\n一生声\t234279\nTFB0Y\t234280\n无罪\t234281\n阴嵩\t234282\n蒋了拜拜\t234283\n算一卦\t234284\n乔上人\t234285\n善缘\t234286\n软膏\t234287\n3000c\t234288\n下订\t234289\n广东省邵阳商会\t234290\nfegg\t234291\n慢我爱你爱我\t234292\n独眼\t234293\n滴滴滴滴滴滴啦啦滴滴滴滴滴滴滴滴滴
滴\t234294\n君宇\t234295\n叶傻子\t234296\n气娃娃\t234297\n六月五号\t234298\n二十九足岁\t234299\n一朵花儿\t234300\n沈星\t234301\n随我\t234302\ntygd\t234303\n尘埃去你的我\t234304\ndocket\t234305\n塌下来\t234306\n253828768\t234307\n30007\t234308\n杜子\t234309\n二月二十八号\t234310\n被刑拘\t234311\n飞扬\t234312\n朱八\t234313\n纽约证交所\t234314\nid吧601318\t234315\n耶卜\t234316\n几户\t234317\nbhjbo\t234318\n小黄网\t234319\n陈家程\t234320\n金穗\t234321\n颜如\t234322\n大英\t234323\n外景\t234324\n思涵\t234325\n三百多分\t234326\n勘测\t234327\n莫名其妙受冤枉\t234328\n新生病\t234329\n218526875\t234330\n得勒\t234331\n姐妹情深\t234332\n四御殿\t234333\n风华正茂\t234334\n朱元\t234335\n情迷\t234336\n自由自在\t234337\n广州日报\t234338\n牛郎\t234339\n飞扑\t234340\n健膀\t234341\n慢牛不想\t234342\n将太\t234343\n王刚睿\t234344\n笑恬\t234345\n落后者\t234346\n增生\t234347\n好想哭\t234348\n杀人333333\t234349\n前张庄\t234350\n人神共愤\t234351\n二二九零\t234352\n要不叫\t234353\n主队\t234354\n抒葫芦侠客\t234355\n蔡锷路一座大厦\t234356\n战甲\t234357\nyfygcydvkvhxcjbudhvhdtqtckvkkudgkfjfjfjchfjfjfhdjfcjfjfjvjvkvjf\t234358\nlastio\t234359\n自助游\t234360\n滴水穿石\t234361\né哼\t234362\n无果\t234363\n将夜\t234364\n1tcf\t234365\n炸鸡k77\t234366\n战斗\t234367\n19：30—20：10\t234368\n环保+摇号+竞价\t234369\n孙思琪\t234370\n徐海峰\t234371\n窒息\t234372\n李泰利\t234373\n腊月十一\t234374\n邓州市\t234375\n丝场\t234376\n抄底\t234377\n慨叹\t234378\n没有病\t234379\n摇摇摇摇\t234380\nGkvjgck\t234381\nJptmd\t234382\n高崎隆\t234383\nhffffxx\t234384\n鸡贼=low\t234385\n秘道\t234386\n鳌苑\t234387\n2b三杯六八\t234388\n两回事\t234389\n代言人\t234390\nkissor\t234391\n相得益彰\t234392\n亲爱的我很爱你\t234393\ngwas\t234394\n陶爸爸\t234395\n神情\t234396\n1634862486\t234397\n18242347227\t234398\nOVER\t234399\n惊悚片\t234400\n惊声尖笑\t234401\nffychhcfd\t234402\n陆其桐\t234403\n三二块\t234404\ntheu\t234405\n一五年\t234406\n谢谢错\t234407\n前壁\t234408\n啊了\t234409\n巫婆\t234410\nthdoy\t234411\n爱不挠\t234412\n拍张照\t234413\n唉hi\t234414\n呀嘿\t234415\n海lovey\t234416\n长兴\t234417\n阿123\t234418\n尹曼\t234419\n可懂\t234420\n了了了里了里了\t234421\n偶月个法国第二代从\t234422\nlu2356\t234423\n花帽子\t234424\nmoy\t234425\n迪迦\t234426\n救蚂\t234427\nmos\t234428\n雷德利·斯科特\t234429\n6摄氏度\t234430\n乖打\t234431\n5276838388335\t234432\n奔跑巴\t234433\nmo
o\t234434\n帅你丑\t234435\nmom\t234436\nXcvv\t234437\nmoa\t234438\nmog\t234439\n天一块儿\t234440\n佳话\t234441\n太傻逼\t234442\n荆萌\t234443\nstoko\t234444\n純潔\t234445\n以防\t234446\n泰版\t234447\n等式a2x2aaax10\t234448\n一听完\t234449\n3565446778\t234450\n斗牛去看小游玩具\t234451\nvk超短裤vke\t234452\n1GB\t234453\n变异\t234454\n打暗黑\t234455\n脱俗\t234456\n平起平坐\t234457\n吼吼吼\t234458\n就好多\t234459\n孔庆生\t234460\n2807点\t234461\n果断了\t234462\n腊月初八凌晨\t234463\n这个半年\t234464\n楚霸天下俱乐部梦想物语\t234465\n唯爱SJ13]110921艺声TW\t234466\n曾今务\t234467\n64元\t234468\n赵金红\t234469\n梦想成真\t234470\n琴艺\t234471\n三百块\t234472\n音乐\t234473\n极速老梁观世界咯菌腈\t234474\n我喜欢性取向\t234475\n黄台\t234476\n后街\t234477\n黄号\t234478\neiiydjjl\t234479\n黄可\t234480\n1001100家\t234481\n说话的呢你好\t234482\n团友\t234483\n赵村朵布莎\t234484\n八里冲\t234485\n先星星\t234486\n张恒远\t234487\n国家有意钾布加帝维龙\t234488\n2319点\t234489\n玉山保利\t234490\n复查\t234491\n晋亨乐\t234492\n分额\t234493\n黑美\t234494\n管得着\t234495\n快度\t234496\n见了\t234497\n似有若无\t234498\n神马夜盗\t234499\n干色\t234500\n上子虚\t234501\n王令航\t234502\n执著\t234503\n胡静\t234504\n我一言难尽\t234505\n效能\t234506\n诠释者\t234507\n闽南\t234508\n第2款\t234509\nhpeasixb\t234510\n早出晚归\t234511\n无药\t234512\n范鲁扬\t234513\n1142132198\t234514\n第2次\t234515\n惠达\t234516\n黑羽\t234517\n式\t234518\n一点四\t234519\n2781.32点\t234520\n好肉糟肉\t234521\n就那尼\t234522\njjso\t234523\n新中源陶瓷\t234524\n帮了忙\t234525\n呃李多\t234526\nkwee\t234527\n无限极\t234528\n几张\t234529\n32421380\t234530\n周里池\t234531\nNmjvgggyhhhuuuuhhhuuhhdsrgaagjlliyeqryu\t234532\njtkrri\t234533\nQQ元\t234534\n素火腿\t234535\n速度快一点\t234536\n铿锵\t234537\n38吨\t234538\ncqy\t234539\n荣耀6a\t234540\n路见不平\t234541\n羊台山二十号\t234542\n鸭子呱呱呱\t234543\naa式\t234544\n80万辆\t234545\n风言风语\t234546\n摩我\t234547\n米泉乱\t234548\n八月世界突出\t234549\n呱呱呱呱呱呱呱呱\t234550\n日度\t234551\nvjkxsuhfngn\t234552\n寇准\t234553\n不赖烦\t234554\n老天爷\t234555\n沉住\t234556\n想你我想\t234557\n累醉\t234558\n吃清楚\t234559\n大小便\t234560\nvrjvthgg\t234561\nbdbbx\t234562\n54005\t234563\nduchi\t234564\n性药\t234565\nTrailer\t234566\n古道\t234567\n单样\t234568\n舍身法\t234569\n火年华\t234570\n怎我\t234571\n哈勒度秘\t234572\n晨会\t234573\n朱秘书\t234574\n
陈你\t234575\n英资百货玛莎\t234576\n九百里\t234577\n这一套\t234578\n营养不良\t234579\n四五久\t234580\n孙迪\t234581\n事场\t234582\n陈佳\t234583\nJOUTGH\t234584\nLets\t234585\n加多宝亮\t234586\n白钰\t234587\n绛紫\t234588\n受用\t234589\n损益\t234590\nlongzhong\t234591\nTMMD\t234592\n新民市\t234593\n春夏\t234594\n小叔子\t234595\n卸货\t234596\n分尸\t234597\nx86\t234598\n滚边弄我是你的大起大落\t234599\n防弹衣\t234600\n135吨\t234601\n唯恐\t234602\nhi博弈\t234603\n七七脑科\t234604\n孙进\t234605\n护工\t234606\n撒哈拉沙漠\t234607\n吴奇隆\t234608\n菩提迦雅\t234609\nHVGG\t234610\n我恨你恨你恨你恨你的日子不会你和你一辈子\t234611\n爆场\t234612\n截载\t234613\n失之毫厘\t234614\n孤豆\t234615\n可以吗\t234616\n施瓦辛戈\t234617\n伊犁河谷\t234618\n万岁万岁\t234619\n姜柔柔\t234620\n辛金\t234621\n史奴\t234622\nhelloime\t234623\n等等等等等等等等等等等等等等等等等等等等\t234624\n能屈能伸\t234625\n普易乔\t234626\nYeah\t234627\n原谅我，红尘颠倒\t234628\n客服\t234629\n牦牛\t234630\n破土\t234631\n等您来\t234632\nsidjeen\t234633\n扩列\t234634\n了里了了了\t234635\n纯钛\t234636\n陶校兴\t234637\n洒热血\t234638\n1勺\t234639\n企业家\t234640\n松糕\t234641\nMook\t234642\n一九岁\t234643\n张礼明\t234644\n谷延辉\t234645\n刘老根儿\t234646\n8300\t234647\n照过来\t234648\n谷嘉诚\t234649\n麻辣面\t234650\nSIH\t234651\n热尔维尼奥\t234652\nSeeYou\t234653\n两百米多五分之一\t234654\n真雷锋\t234655\n猫空\t234656\n李银娜\t234657\n乌漆麻黑\t234658\n是因为我更看不懂\t234659\n厕km\t234660\n果岭\t234661\n王彦芬\t234662\n弱不禁风\t234663\n精神分子\t234664\n怡和\t234665\namberlynn\t234666\njoppainuv\t234667\n不领情\t234668\n啊答答答答答答答答答答答答答答答答答\t234669\n何处\t234670\n气象预报\t234671\njsst\t234672\n首都医药卫生协调委员会\t234673\n范哈勒\t234674\n真的好好\t234675\n别提前\t234676\n正直\t234677\n没肖\t234678\n买买买\t234679\n轵乡\t234680\n圖兔兔\t234681\n刷贴十来\t234682\n睡觉吧白\t234683\n流水线\t234684\n应纳税额\t234685\n有班\t234686\n离意\t234687\n屄屄\t234688\n美军\t234689\n李犯贱\t234690\n离愁\t234691\n做头\t234692\n喝爽歪歪歪\t234693\n乖乖女\t234694\n福利来\t234695\n舰载\t234696\n笨妞儿\t234697\n季晗博\t234698\n后记\t234699\nhggggv\t234700\n殷切\t234701\n知不道当\t234702\n數落\t234703\n与是破\t234704\n水管子\t234705\nhggggf\t234706\n头骨\t234707\nhggggh\t234708\n诈尸\t234709\n12oreoorealen\t234710\n郭巳玮\t234711\n小憩不得\t234712\n德克士\t234713\n老词儿\t234714\n侧子\t234715\n冰霜\t234716\n繁文\t234717\n物证\t234718\nohno\
t234719\njssd\t234720\n李卡卡\t234721\n酸涩\t234722\n3128198\t234723\nijjiki\t234724\n年岁\t234725\n赫尔普公司\t234726\n谁辈子\t234727\n军歌\t234728\n景色\t234729\n去年末\t234730\n尼马哲\t234731\n三千个\t234732\n过家\t234733\n刘畅畅\t234734\n2838480051\t234735\ngjag2\t234736\n三千两\t234737\n致电\t234738\n闭上\t234739\n晤士\t234740\n打非我\t234741\n4800亿\t234742\n试不出来\t234743\n低压\t234744\n流不起的挑战你在干嘛\t234745\n三千万\t234746\ntuwonon\t234747\n古装片\t234748\n胡杨犬\t234749\n澄粉澄粉\t234750\n载客\t234751\n无机盐\t234752\n跳支\t234753\n真婚\t234754\n青花元素\t234755\n对打小游\t234756\n不好说话\t234757\nididididi\t234758\n886886886886886886886\t234759\n134860\t234760\n庄上\t234761\n第二月\t234762\n11１２\t234763\n四十五百分百八十\t234764\n必经\t234765\n庄主\t234766\n领巾\t234767\n爱爱我我\t234768\n恩爱\t234769\n女劰\t234770\n血糖\t234771\n恩爵\t234772\n庄严\t234773\n逃债\t234774\n度娘\t234775\n第二本\t234776\nWolfReallyLetego\t234777\n度娜\t234778\n短斤少两\t234779\n辣肠\t234780\n金哥\t234781\ngggggyy\t234782\n第五餐\t234783\n安亲\t234784\nAmella\t234785\n越野e族\t234786\n今年底\t234787\n一百条\t234788\n18221127089\t234789\n一百来\t234790\nmz才么\t234791\n介入\t234792\n一百杯\t234793\n30袋\t234794\n282828282882822828282828822828282828282828282828\t234795\n复制化\t234796\n掉眼泪\t234797\n金哈\t234798\n石武\t234799\n交易商协会\t234800\n亲一样\t234801\n碗蒸糕\t234802\ntaum\t234803\n丁程鑫\t234804\n每夜\t234805\n安于\t234806\n快快哒\t234807\n今天天\t234808\n200310271120\t234809\n院婵\t234810\n找先风\t234811\n曲调\t234812\nGoldkills\t234813\n半分\t234814\nhamburger\t234815\n0849\t234816\n15964742655\t234817\n怎么多\t234818\n二倍\t234819\n镇里\t234820\n张百度\t234821\n二者\t234822\n段奕宏\t234823\n北七家\t234824\n湖北美尔雅集团有限公司\t234825\n给我回话\t234826\nryo\t234827\nryl\t234828\n好笑一点\t234829\nhahasb\t234830\n添砖加瓦\t234831\nryd\t234832\n中午1点\t234833\n525566\t234834\n二一个\t234835\n中钢\t234836\n三五和\t234837\n生菜们\t234838\nTyy\t234839\nryy\t234840\n平流\t234841\n武侠大片#四大名捕#\t234842\n票贩子\t234843\n擦卡\t234844\nv钢\t234845\n签名\t234846\n我几天\t234847\n绿珍草\t234848\n今生和你相依；你是我的生命之魂\t234849\n东莞市\t234850\n一百片\t234851\n戚雨\t234852\n距话\t234853\n害羞惹的祸\t234854\n露肉\t234855\n狂战\t234856\n炮恩\t234857\n不景气\t234858\n加班费\t234859\n
掉下来\t234860\n710\t234861\n食品\t234862\n劉鑫浩\t234863\n明天上午三点钟\t234864\n700万\t234865\n现钱\t234866\n起诉书\t234867\n做大\t234868\n图子\t234869\n吴兄\t234870\n梁惠王\t234871\n生日\t234872\n园园丁\t234873\nMCOj\t234874\n塔利斯\t234875\n诊疗\t234876\n化妆\t234877\n每一月\t234878\n别找\t234879\n彭晓云\t234880\n臭鱼\t234881\n4229\t234882\n先球\t234883\n登山者\t234884\n陈建芬\t234885\n别扭\t234886\n青春期\t234887\n别扯\t234888\n位居市中心\t234889\n别打\t234890\n梁宇\t234891\n就是啊呀呀呀呀呀呀\t234892\n拉贾帕克萨\t234893\n股权转让协议\t234894\n881122334455669977\t234895\n幺二零\t234896\n味觉\t234897\n永远不死\t234898\n589家\t234899\n拌饭\t234900\n外置\t234901\nH色\t234902\n医道人\t234903\n别扎\t234904\n乖乖的快点\t234905\ngood\t234906\n子公司\t234907\n拯救世界\t234908\npecclass\t234909\n财户\t234910\n满怀\t234911\nBay\t234912\n词不达意\t234913\n标定\t234914\n摸吴忙\t234915\n傻白\t234916\n面基\t234917\ngooy\t234918\n地产群\t234919\n羊肉片\t234920\n12岁\t234921\n满怨\t234922\n牟徐翠\t234923\n牛肉麻\t234924\nhttpahiphotosbaiducomxiaodupicitem9\t234925\n雅美酒\t234926\n4.03亿元\t234927\nhisjaj\t234928\n斗拔\t234929\n闺密qq\t234930\n麻栗坡县\t234931\nstdrt\t234932\n奥黛丽赫本\t234933\n白薯片\t234934\n默然\t234935\n孑嫩\t234936\n最前线\t234937\n卓哥哥\t234938\n1545148354184614351\t234939\nge1部\t234940\n18000块\t234941\n求知欲\t234942\n曾亚苹\t234943\nhellomynanal\t234944\n36005700\t234945\n布偶猫\t234946\n皇家\t234947\nE#\t234948\n皇宫\t234949\n110610\t234950\n共途愉\t234951\n成保\t234952\nsdll\t234953\n这份爱\t234954\nE1\t234955\nE3\t234956\n皇室\t234957\n看记录\t234958\n900米\t234959\n一把一把一把\t234960\n琐\t234961\n霜降\t234962\n大金海港\t234963\nED\t234964\n金手镯\t234965\nAK火麒麟度秘\t234966\n名古屋鲸八\t234967\n西校区\t234968\n外罩\t234969\n辞世\t234970\nstanlandianotetouchanceouncomomv\t234971\n塔莱斯\t234972\n本州岛\t234973\n3219830488\t234974\n瞎死\t234975\nEQ\t234976\nEP\t234977\n做好事\t234978\nEm\t234979\n酸酸辣辣\t234980\nEn\t234981\n588886666855703\t234982\n大碗\t234983\nEe\t234984\nEd\t234985\n维修\t234986\n大碟\t234987\n105项\t234988\n非我所然\t234989\n白芊芊\t234990\nEt\t234991\n大碍\t234992\n狗熊\t234993\n全站\t234994\n我喜欢院\t234995\n好飞尚\t234996\n免费\t234997\n汪沟镇\t234998\n你好美文华\t234999\n王上王\t235000\n就顾\t235001\n哇谢\t235002\n下巴人\t
235003\n笨鸡\t235004\n581568236842563\t235005\n133590966666666\t235006\n阿拉蕾高速\t235007\n二句\t235008\n几秘\t235009\n华美\t235010\n五分之四则\t235011\n女主人翁\t235012\n15022889459\t235013\n中音\t235014\n洋篮子\t235015\n查血\t235016\n666538864863483437\t235017\n几科\t235018\n几秒\t235019\n说分手\t235020\n坐镇\t235021\n别马丁\t235022\n笨鸟\t235023\n不不不！\t235024\nddc8d1222e81bcacf8a636a587ee041raw200gif\t235025\ncfg\t235026\nPadFone#\t235027\n来了拜拜\t235028\n1次\t235029\nsexxxxxxxxxxxxxxx\t235030\n苦药\t235031\n冲冲\t235032\n糸台式\t235033\n三分之五\t235034\n天开\t235035\n线报\t235036\n美勒\t235037\n抱抱\t235038\nyishengdegui\t235039\n冲冠\t235040\n千九\t235041\n求缘\t235042\nxxgmljux\t235043\n何塞亚\t235044\n不得不死\t235045\n不改良\t235046\n陈赫森\t235047\n知我\t235048\n朱快\t235049\n寻隐者\t235050\n陈泽锋\t235051\n拿分\t235052\n不不不我要\t235053\n15:10到16:50\t235054\n例会\t235055\n萨尔测\t235056\n卢雪慧\t235057\n用话说\t235058\n叫嫌\t235059\n我爱你那你爱我\t235060\n内幕\t235061\n三灿\t235062\ngeuuh\t235063\nIT嗯\t235064\n卡哇伊美\t235065\n不锤\t235066\n排球儿\t235067\n第一杯\t235068\nB\t235069\n度秘军康\t235070\nweet\t235071\n错啦错\t235072\n铁罐\t235073\n第一条\t235074\n草尖\t235075\n第一束\t235076\npabbt\t235077\ndancing\t235078\n不销\t235079\n亮晶晶满天\t235080\n一次五块\t235081\n发球员\t235082\nrrr\t235083\n不错\t235084\n公安话\t235085\n思南路\t235086\n张沛硕\t235087\n到则\t235088\n嗯破处\t235089\n外网\t235090\n4级\t235091\nseayou\t235092\n告诉我了我不会\t235093\n左手下\t235094\n北京福慧慈缘会馆\t235095\n王彦惠\t235096\n18548168429\t235097\n纵使\t235098\n适宜\t235099\n巴雷\t235100\n连日十二岁\t235101\n吃力\t235102\n五百人公司\t235103\n魔皂\t235104\n3套\t235105\n18日上午\t235106\n卵形\t235107\n左手个\t235108\n万豪\t235109\nffy\t235110\n尽孝\t235111\n海军\t235112\n下勒\t235113\n赵心畅\t235114\n60起\t235115\n断断\t235116\n350名\t235117\n克洛格\t235118\n早词\t235119\n五年后\t235120\n鬼才怪错\t235121\n敌机\t235122\n李华英\t235123\nGgdjgd\t235124\n坏小孩\t235125\nok度\t235126\n王金\t235127\njsck\t235128\n快乐祟\t235129\nm50\t235130\njscc\t235131\n揉躏\t235132\n匀称\t235133\n愤怒666\t235134\n童年时\t235135\nfvcq\t235136\n猪信\t235137\n早课\t235138\n早读\t235139\n水平座\t235140\n熠熠\t235141\n系稳\t235142\n金文静\t235143\n白活\t235144\n烟罗\t235145\n一百二一百二一百二一百一\t235
146\n乌桕\t235147\n凭栏\t235148\n1月12日\t235149\n活血\t235150\n筱密\t235151\n乖维塔斯\t235152\n霍金虎\t235153\n流露\t235154\n陈文龙\t235155\n白洞\t235156\n你了你好好看我走了\t235157\n居无定所\t235158\nsewgf\t235159\n快快火速\t235160\n完美恋人\t235161\n必桑婩\t235162\n不存于心\t235163\nababc\t235164\n白洁\t235165\n圣雅叫我女王殿下\t235166\n轻重\t235167\n死犟\t235168\n刀锋和告你诉你我的妈妈的妈妈的爸爸的妈妈的妈妈的妈妈的妈妈的妈妈的妈妈的怕\t235169\n62177171\t235170\n李熹蕾\t235171\n好我亲\t235172\n二月多\t235173\n邓文西\t235174\n张派岚\t235175\nwhaturam\t235176\n交定\t235177\n上玩\t235178\n佟湘玉\t235179\n杜鹃杜鹃杜鹃小学\t235180\njhfx\t235181\n长呢大巴证\t235182\n七分之三六\t235183\n极度重罪\t235184\n新科\t235185\n王沙咀\t235186\n坐北朝南\t235187\n新种\t235188\n星星纸\t235189\n上海安\t235190\n和安\t235191\n新秀\t235192\n刘太香\t235193\n觉到\t235194\n吴均洲公园\t235195\n琼仁\t235196\nhgfthv\t235197\n下乡\t235198\n树条\t235199\n巧儿\t235200\ncgbjk\t235201\n灰色的天\t235202\n咯五ull\t235203\n二百五十块\t235204\n植物大战僵尸52345\t235205\n很快乐\t235206\n脂鲤\t235207\n树杈\t235208\n瑞尔斯\t235209\n朝夕相处\t235210\naAgaol\t235211\ndeeer\t235212\n保安科\t235213\n友谊\t235214\n下乘\t235215\n回事儿\t235216\ndeeee\t235217\n卸任\t235218\n吴姓\t235219\n誊权\t235220\n超高\t235221\n七寸\t235222\nAK－01\t235223\n华阴市财政局\t235224\n一瞥\t235225\n橙光\t235226\n歪芳烃\t235227\n折颜\t235228\n一瞬\t235229\nghyk\t235230\n看明\t235231\nghyg\t235232\n包覆\t235233\n小姑儿\t235234\n地道\t235235\n再再\t235236\n不要说出来\t235237\n细节目\t235238\n卢森堡\t235239\n37狂四十四四百四十四\t235240\n啦啦啦清宫图三餐你是偶像\t235241\nghyu\t235242\n好吖好吖\t235243\n新牛肉面\t235244\n四十寸\t235245\n杰弗里弗尔特曼\t235246\nら\t235247\n看不麻烦\t235248\n15055435520\t235249\n9534272550586088058686\t235250\n丁文通\t235251\n性跟\t235252\n有一搭\t235253\n女管\t235254\n阮令梅\t235255\n大姐夫\t235256\n256M\t235257\ndudhgg\t235258\n领导们\t235259\n额米豆腐\t235260\n在干嘛\t235261\nん\t235262\n一个八婆\t235263\n下掉\t235264\nbbsbs\t235265\n底面\t235266\n玉穷\t235267\n特曼\t235268\n蹿升\t235269\nmoutihhhh\t235270\nyeyekyekyeke6kkw6w6kek6ek6e6ke6k\t235271\n我的如意见说见该巴\t235272\n医疗费用\t235273\n卒子\t235274\n哎呀呀都堡\t235275\n妙偶象\t235276\n顾漫客\t235277\n铜铃\t235278\n刘青云\t235279\n静默\t235280\nruhssykvf\t235281\n换货\t235282\n有个妹\t235283\n心肌炎\t235284\n萌萌大雅\t235285\n呆呆\t235286\n使命
召唤版\t235287\newc\t235288\njuckonkkjn\t235289\n瓮奇兵\t235290\newd\t235291\n深耕\t235292\n可长\t235293\n变女\t235294\nMrmarveltmovedon\t235295\n哭了我不在\t235296\n天天好\t235297\n改天在\t235298\n草莓布丁\t235299\n那不吗\t235300\n太赞了我最爱看夏落特烦恼\t235301\n对不起了错\t235302\n行忽闻\t235303\nv嘎嘎嘎\t235304\n遁词\t235305\nyesterday\t235306\nkill\t235307\nDoyous\t235308\n何华\t235309\n互相帮助\t235310\n早衰\t235311\n夜宵节\t235312\n吓用\t235313\n瑕疵\t235314\n讲解员\t235315\ngjggjkjhd\t235316\n孙思辉\t235317\n两性情爱\t235318\n邓玉玲\t235319\n7864元\t235320\n在这里块\t235321\n中午12点\t235322\n桥阵\t235323\n多任务\t235324\n黑压\t235325\nsomee\t235326\n电视酿\t235327\n二话不说\t235328\n邮件片\t235329\n爱情可怕配枪梦\t235330\n99吨\t235331\n九百九十三九九九九九\t235332\n滑溜\t235333\n视坏\t235334\n女神\t235335\n美容店\t235336\n0个\t235337\nvv猜猜\t235338\nsju\t235339\n裡大統領已經\t235340\n老子长\t235341\n徐靖轩\t235342\n二二九幺零\t235343\n56552458246682566\t235344\napple\t235345\n钱慧\t235346\n张文东\t235347\n狼累\t235348\njakz\t235349\n忙邱\t235350\n雨花台\t235351\n格物致知\t235352\n度秘摩擦\t235353\n艾利ro\t235354\n好你牛\t235355\n吴嘉豪\t235356\n威弟\t235357\n手滑\t235358\n躯体\t235359\n黑市度秘\t235360\n百分之三十\t235361\nBENSLY\t235362\n混杂\t235363\nqwf\t235364\nqwe\t235365\n美少\t235366\n二百克\t235367\nvtec\t235368\n无法控制\t235369\nqww\t235370\n陈馨怡\t235371\ndullug\t235372\nqwr\t235373\nqwq\t235374\nqwp\t235375\nNATE暴强\t235376\n塔莱布\t235377\n平方厘米\t235378\n杨永美\t235379\nlan言\t235380\n想象力\t235381\n恭帝\t235382\n从政\t235383\n荔枝\t235384\n快件眼\t235385\n独立\t235386\n行车热在线\t235387\n忽地\t235388\n我敢你\t235389\n莫斯利安\t235390\n晓溪恋\t235391\nghif\t235392\n新功\t235393\n繪製\t235394\nzajeni\t235395\n张小强\t235396\nCoastalflat\t235397\n笑毛\t235398\n钱桂芹\t235399\n18520126779\t235400\n话痨\t235401\n三了一个\t235402\n山形\t235403\n谢国忠\t235404\n边塞诗\t235405\nxjxixu\t235406\n写都\t235407\nxfhv\t235408\n寂枣寞璐\t235409\n特权者\t235410\n语文学\t235411\n反派\t235412\n老公费\t235413\n混洞\t235414\nffftyyyf\t235415\n掉转\t235416\n檐下燕/\t235417\n死不赖账\t235418\n壬手\t235419\n赵庄片\t235420\n62247074\t235421\n8909867\t235422\n哎嘟嘟\t235423\n豆面丸子汤\t235424\n狗味\t235425\n闯背\t235426\n上级领导\t235427\n肺瘤\t235428\n陪陪陪陪陪你\t235429\n刑具\t235430\n智囊团\
t235431\n烦心事马\t235432\n号子\t235433\n暖手宝\t235434\n火和和\t235435\n师弟\t235436\n剪卡\t235437\n复读生\t235438\n7887\t235439\n信宏小达人\t235440\n远扬\t235441\n猪猪猪猪猪猪宝贝猪猪\t235442\n224米\t235443\n党媒\t235444\n二三零零\t235445\nsd8\t235446\n一个三倍\t235447\n五观\t235448\n扭距\t235449\n欧卖糕\t235450\nfuie\t235451\n0857449\t235452\n邵媛媛\t235453\n王莉\t235454\nAngelababy\t235455\n五角\t235456\n方点\t235457\nshya\t235458\n黄天武\t235459\n睡不了\t235460\navcybok\t235461\n尊严\t235462\n25千米\t235463\nnsmsmwa\t235464\n韩文版\t235465\nsjc\t235466\n安吉尔\t235467\n幸灾乐祸\t235468\n便当\t235469\n找我也不知道\t235470\nsdv\t235471\n叫仙\t235472\nsds\t235473\nZABA\t235474\n我想知道你是真人莫\t235475\n2009年期间\t235476\n叫仕\t235477\nsdd\t235478\nsdf\t235479\nsdg\t235480\n#新歌速递#\t235481\n明的儿子\t235482\n蓝这么\t235483\n断网\t235484\n胃苏\t235485\nsdo\t235486\nsdh\t235487\n那么爱呀你\t235488\nsdj\t235489\nserik\t235490\njmjjd1\t235491\n料酒\t235492\n好吧鲨鱼\t235493\n142857\t235494\n笑话儿\t235495\n我的问题小熊维尼海报\t235496\n许久龙\t235497\n临夏\t235498\n离群\t235499\n好想吐\t235500\n许舰\t235501\n容军院\t235502\n相笑话\t235503\n情情\t235504\n攸关者\t235505\n1111111111111111111111111111111111111211111111111111111111111111\t235506\n经济发\t235507\n逗行\t235508\n安定门\t235509\n聚汇\t235510\n覅更\t235511\n好好听话\t235512\n顾家明\t235513\n推翻\t235514\nyang1\t235515\nyang2\t235516\n经管度\t235517\n刚需\t235518\n找你玩\t235519\n张俊\t235520\n倾斜\t235521\niPhone5\t235522\n早回\t235523\n噬魂天\t235524\n7562586\t235525\n第三节\t235526\n拉呱金牌痴呆\t235527\n原有\t235528\n1176726503099\t235529\n辣椒炒鸡蛋\t235530\nKBS电视台\t235531\n天天网\t235532\n张信\t235533\n刘红谦\t235534\n2046公里\t235535\n傻蛋儿\t235536\n神秘的样\t235537\n三一大道\t235538\n告诉你了吧\t235539\nRises\t235540\n经脉\t235541\n那句话我不要你的我的秘书\t235542\n俺娘克\t235543\n傲气\t235544\n拉响\t235545\n那场戏\t235546\n中央六套\t235547\n黑白灰太猫\t235548\n14:00\t235549\n连汤\t235550\n副队长\t235551\n15283324151\t235552\n片式\t235553\n数度秘\t235554\ndededeeeeeeeeeeeeeeeeeeeeeeeee\t235555\n用信婆媳心狠提莫管理局佛具HK\t235556\n24:00\t235557\n古景贡\t235558\n易知\t235559\n55斤\t235560\n猪妞\t235561\n非白金卡\t235562\n结合体\t235563\n闭关\t235564\n禁曲\t235565\n猪妖\t235566\nsl肥婆婆妈妈的吻\t235567\n真理\t235568\n带带\t2355
69\n连江\t235570\n八点六\t235571\n好呀别加\t235572\ntieba\t235573\n铱姑\t235574\npighc\t235575\n千金贼\t235576\n行\t235577\n急用钱\t235578\n徐连长\t235579\n血\t235580\n多特蒙德\t235581\n衄\t235582\n衅\t235583\n李英\t235584\n张可芯\t235585\n歌神\t235586\n高欣冉\t235587\n下渠县\t235588\n李若\t235589\n衔\t235590\n吧老虎\t235591\ncy300\t235592\n街\t235593\n表\t235594\n麦叔\t235595\n李苜\t235596\n衫\t235597\n衬\t235598\n衮\t235599\n马啸宇\t235600\n刘梦\t235601\nKHGNF\t235602\n乔鹏玲\t235603\n雷神之怒\t235604\n你跟你你你你你你个猪\t235605\n忘仙逆\t235606\n55天\t235607\n钱库\t235608\n做好\t235609\n17081608673\t235610\n做女\t235611\n冬作\t235612\n辣白菜犬\t235613\n科学技术\t235614\njga1aja1b1a1a1b1b1c\t235615\n离龙\t235616\n琢磨\t235617\nCreperie#\t235618\n24000个\t235619\n堂堂正正\t235620\n壹个\t235621\n持有量\t235622\n痛痛快\t235623\n520131486877744\t235624\n155545544551522\t235625\n231584105446554465\t235626\n今生缘\t235627\n发回重审\t235628\n美国中央情报局\t235629\n踏上\t235630\n给洋洋\t235631\ntitiry\t235632\n走在街上\t235633\n壹一\t235634\nF22\t235635\n青葱\t235636\nchiphotosbaiducomxiaodupicitem7e3e6709c93d70cf09631978ffdcd100baa12b2djpg\t235637\n净心\t235638\n傻笑\t235639\n天不错\t235640\n嚣张我看我不禁\t235641\n12:50:00\t235642\n玲\t235643\n由小到大\t235644\n现\t235645\n10棵\t235646\n杨昌凤\t235647\n不chi\t235648\n玺\t235649\n有为夜\t235650\n温油\t235651\n董永\t235652\n我喜欢你我想吻你\t235653\n郑聪聪\t235654\n维秘\t235655\nNUMIZAIMAMG\t235656\n玥\t235657\napsalakas\t235658\n玫\t235659\n玩\t235660\n玮\t235661\n环\t235662\n葵乡\t235663\n缆车\t235664\n杜钲阳\t235665\n你的照\t235666\ncghf\t235667\ncghg\t235668\n冰镇\t235669\n角度秘\t235670\n玛\t235671\n把早\t235672\n269999999\t235673\n力气氛\t235674\n埃及\t235675\n蝶翼\t235676\n嗯ys\t235677\n喝下\t235678\n102580156159357123586248826\t235679\n亿千万亿千万个\t235680\n玄\t235681\n妈妈开门比特母行吗我行\t235682\n嗯yy\t235683\n王\t235684\n玈\t235685\n玉\t235686\n夕之\t235687\n死机人\t235688\n下栽\t235689\n谢西园车马客\t235690\nsinx\t235691\n放风机\t235692\n一轮轮\t235693\n不要脸节奏\t235694\nsinr\t235695\n纽三三\t235696\nothers\t235697\nsing\t235698\n水准\t235699\n13970130222335\t235700\n福建师范大学\t235701\nsina\t235702\n关系密切\t235703\n单名\t235704\n罗威纳\t235705\n发改局\t235706\n发言人\t235707\n一百一十六\t23
5708\n马甸魔\t235709\nhltjqgjt\t235710\n舱位\t235711\n靠拢\t235712\n大瘦\t235713\n反正总之快点回答我\t235714\n零八零三\t235715\n陈秀才\t235716\n单向\t235717\n接缝别克君越\t235718\n策比\t235719\n龙女\t235720\n无遮无挡\t235721\n陈思麒\t235722\n傻傻真是\t235723\n正统额额\t235724\n可饿\t235725\nDTFlY\t235726\n复印件\t235727\n打洗液\t235728\ntaw\t235729\n大拐点\t235730\nqnm\t235731\n亚梦\t235732\ntap\t235733\n李波\t235734\n好不咯\t235735\n兽药\t235736\n旷世\t235737\n8MJUWK\t235738\n会班\t235739\n撒丁岛\t235740\nBB酒\t235741\n散出\t235742\n蔡宝梁\t235743\n符合\t235744\n怎么样\t235745\n吊坠\t235746\n￢▽￢\t235747\n亏损\t235748\n静距离\t235749\nDrg\t235750\n艺术家\t235751\n0.0\t235752\n伞们\t235753\n亚梅\t235754\n0.3\t235755\n明晚恒天\t235756\n阿信\t235757\n峨鳝丝\t235758\n喜欢你了我再也不在这里\t235759\n0.5\t235760\n逗黑村\t235761\n阿修\t235762\nybbubbuhuj\t235763\n学习机\t235764\n双归\t235765\n俄恩\t235766\n我日你妈了个逼的我日\t235767\n怨相\t235768\n祖奶奶\t235769\nxxoocom\t235770\n女生文\t235771\nMarketplace\t235772\n林妙可\t235773\n海兴县\t235774\n菇菇古分飞燕菇玉玉\t235775\n永新股份\t235776\n伏尔加DV\t235777\ntag\t235778\n了解脱\t235779\n赛格\t235780\n棘手\t235781\n荷拉\t235782\n文新\t235783\n综治办\t235784\nvIC\t235785\n自作多情\t235786\ncgggfijcx\t235787\n不骂\t235788\n冷若冰\t235789\n汤成\t235790\n调配\t235791\n信用金\t235792\n一二四五六七八九十\t235793\n缘着色\t235794\n早期锁\t235795\n早熟\t235796\n成像\t235797\n铁盒\t235798\n9本\t235799\nyesoiso\t235800\n不骗\t235801\n寒假作业\t235802\n1217898563\t235803\n郝刘佳\t235804\nsolisol\t235805\n斯里兰卡\t235806\n26号下午\t235807\n好超拽\t235808\ntai\t235809\n残影\t235810\n彬彬\t235811\n12点半\t235812\n锁臭美\t235813\n科威了酒济上学吧\t235814\n额额额额曲项向天歌白毛浮绿水红掌拨清波\t235815\ntrqwytyuour\t235816\n如叶公好龙\t235817\n香港广西防城港市同乡联谊会龙舟队\t235818\nfhvvxhj\t235819\n婠婠\t235820\n李嘟嘟\t235821\n你的儿子\t235822\n8899238\t235823\n爸爸的假期天天图\t235824\n心胸狭隘\t235825\n龙大师\t235826\n门塔纳\t235827\n齐天大圣\t235828\n邦定\t235829\n清洁工\t235830\n久雨不晴\t235831\n练习场\t235832\n找类有\t235833\n红萝卜\t235834\n想开荒\t235835\n波音777\t235836\n书籍\t235837\n听声\t235838\n就园\t235839\nmiui71\t235840\n酒酣\t235841\n真我出\t235842\ndy\t235843\n块哩\t235844\nNONON\t235845\n你好生猛\t235846\n福士\t235847\n度秘戏\t235848\n修饰\t235849\ndw\t235850\n比较性\t235851\n攀高\t235852\n老公
心\t235853\n筛选\t235854\n会使\t235855\n爱彼得\t235856\n银都\t235857\n曾蒙\t235858\n伪装\t235859\n制模\t235860\n袁九点半\t235861\n林海峰\t235862\n劳克来斯\t235863\n刘彬\t235864\n135480\t235865\ndad\t235866\n两米\t235867\nHhhg\t235868\ndaa\t235869\n迅猛\t235870\ndam\t235871\n网卡\t235872\ndao\t235873\ndai\t235874\ndaj\t235875\ndat\t235876\n床被\t235877\n工作日\t235878\ndap\t235879\nfhvb\t235880\n无聊的再也不见\t235881\n饺子馆\t235882\n川岳山\t235883\n高记唔\t235884\nday\t235885\n％兔兔图阴雨天旅途图图一物降一物兔兔图啊be藏在藏在藏在藏在藏在藏\t235886\n10111\t235887\n流浪在\t235888\n建交\t235889\n天宇生态家园\t235890\n选取\t235891\n雅加达\t235892\n上海东宫\t235893\n张板窑\t235894\n搞好\t235895\nyhbby\t235896\n新瑞\t235897\n潮品\t235898\n895分\t235899\nfsfc\t235900\n蜜月游\t235901\n小地主\t235902\nnirangwoyule\t235903\n云阳\t235904\n冯美容\t235905\n高吃\t235906\n马云\t235907\n低迷\t235908\n万l\t235909\n冲进\t235910\n金手指\t235911\n东东西\t235912\n四五六七八九十十一十二十三\t235913\n三扇\t235914\n三手\t235915\n三才\t235916\n不肯\t235917\n重焕\t235918\n抓奸\t235919\n钟家壕\t235920\n温饱\t235921\n卓卓\t235922\n吻风\t235923\n找一比\t235924\n春子\t235925\n在此之前\t235926\n万2\t235927\n一步错真好那你看\t235928\n还差笨笨笨吭哧吭哧\t235929\n乍死\t235930\n劲舞大赛\t235931\n巴子\t235932\n处处处处处处\t235933\n轻烟\t235934\n博文&amp\t235935\n盲盲\t235936\n空投\t235937\n不屑一顾\t235938\njuiamM\t235939\n这个元\t235940\n风霜\t235941\n十二多\t235942\n有志\t235943\n分叉\t235944\n同人类\t235945\n女森\t235946\n萍场\t235947\n谦和\t235948\n十二天\t235949\n有忙\t235950\nHoeee\t235951\n有必\t235952\n有心\t235953\n月超\t235954\nFolo\t235955\n性张\t235956\n微校\t235957\n周会\t235958\n周伟\t235959\n菊开\t235960\n谭玉竹\t235961\n包谷\t235962\n走环\t235963\n施主任\t235964\n偷袭\t235965\n夜猫\t235966\n追悼会\t235967\n152896321785\t235968\n丁泰翡\t235969\n2.47元\t235970\n贬义\t235971\n一四年\t235972\n专类园\t235973\n记完\t235974\n记得你了我没有\t235975\n玩罢\t235976\nqaddbcdsghhjmb\t235977\n举足轻重\t235978\n髋关节\t235979\nHdv\t235980\nMMlTestll\t235981\n吃折\t235982\n叮咚叮咚叮咚叮咚\t235983\n八芭比\t235984\n招人\t235985\n沙发板\t235986\n一下片儿\t235987\nhdhdjdj\t235988\n阡陌\t235989\n吃把\t235990\n临走\t235991\nvoffv\t235992\n28517143960\t235993\nHdh\t235994\nipad\t235995\n臣\t235996\nDucx\t235997\n离奇\t235998\n肝胆相照\t235999\n美食秀\t23
6000\nhistr\t236001\n定于\t236002\nvrjifnl\t236003\n笑口常开\t236004\nNICHENG\t236005\n混血小王子上马到我爱的足球踢毽子\t236006\n背弃\t236007\nfjshfdhf\t236008\n好菊菊\t236009\n我是真心爱你我走了来对你说杜比\t236010\n好我烦人\t236011\n白妞\t236012\n建设局\t236013\n天天鬼\t236014\n魏嫁\t236015\n十好\t236016\n霍比特星\t236017\n共存\t236018\n美国洛克希德-马丁公司\t236019\n咿呀咿呀咿\t236020\n放手吧讨厌你\t236021\n雷帕霉素\t236022\n障碍\t236023\n王zewei\t236024\npppppppppppppp\t236025\n8万\t236026\npleaseabaseball\t236027\n朱杞纪\t236028\n见小女\t236029\n孙秘\t236030\n李佳轩\t236031\n书屋\t236032\n罗去唐\t236033\n性服务行业\t236034\n十套\t236035\n想想办法\t236036\n密友\t236037\n小心军\t236038\n扁体\t236039\n邵兴道\t236040\n诺贝\t236041\n中山装\t236042\n拨\t236043\n择\t236044\n拫\t236045\n括\t236046\n拭\t236047\n6556726\t236048\n莉莉秘\t236049\n拢\t236050\n拣\t236051\n以吗\t236052\n陈玥彤\t236053\n拦\t236054\n拧\t236055\n西塘李村\t236056\n5月4日\t236057\n拼\t236058\n拽\t236059\n请问\t236060\n拿\t236061\n做得一踏涂\t236062\n拱\t236063\n回执\t236064\n拳\t236065\n拴\t236066\n苦元份\t236067\n回扣\t236068\nhhtjgogyhfniffkgkgk\t236069\n拉\t236070\n遇险\t236071\n孙浩\t236072\n拌\t236073\n拍\t236074\n拎\t236075\n拀\t236076\n决择\t236077\n拂\t236078\n帝王蟹\t236079\n拄\t236080\n担\t236081\n拆\t236082\n拇\t236083\n拘\t236084\n拙\t236085\n拚\t236086\n招\t236087\n拜\t236088\n烟台市区\t236089\n规矩矩\t236090\n拟\t236091\n拐\t236092\n醉水浒传\t236093\n拒\t236094\n呆滞\t236095\n拔\t236096\n乔大厦\t236097\n拖\t236098\n下任\t236099\n小样\t236100\n好想睡\t236101\n力所能\t236102\n无颜以对\t236103\n小顽皮们\t236104\n小格\t236105\n来时\t236106\n6464646464\t236107\n小根\t236108\n忍笑\t236109\n颜控\t236110\nBUT\t236111\n小校\t236112\n官腔\t236113\n功过\t236114\n自性\t236115\n历史感\t236116\n58636666\t236117\n威力\t236118\n海协\t236119\n自怜\t236120\n帮友\t236121\nk栖霞\t236122\n作业神\t236123\n你和秘度\t236124\n马鞍山\t236125\n林芝\t236126\n芭比娃娃x机器人儿\t236127\n彭诗琪\t236128\n嚄穑\t236129\n115356883\t236130\n萌萌万岁萌萌万岁萌萌万岁\t236131\n自态\t236132\n帮发\t236133\n包明宇\t236134\n算了我死\t236135\n从来不\t236136\n检察\t236137\n没有孕\t236138\n1199个\t236139\n啦额恶\t236140\n错位\t236141\n隔着\t236142\n朱某\t236143\n尼玛快\t236144\nJdhzysebdh\t236145\n是为谁\t236146\n1199万\t236147\n依依\t236148\n叁味\t236149\n发风\t236150\n瞎想
和你闹想和你差就好我陪你去看天荒\t236151\n冰小多多多\t236152\n米苏拉塔\t236153\n发飙\t236154\nhGG\t236155\n碧桂园\t236156\n恩队\t236157\n新意\t236158\nvddg\t236159\n西门\t236160\ngjjfhnsng9ar\t236161\n王陶\t236162\n矮挫\t236163\n112分\t236164\n2秒\t236165\n砸向\t236166\nBIBE\t236167\n流时娜\t236168\n悍警\t236169\n┍\t236170\n官屯\t236171\n天空的男个事\t236172\nvdds\t236173\n┌\t236174\nawaearatayauai\t236175\nJJNN\t236176\nvddy\t236177\n一次堡\t236178\n水钻\t236179\nNfgxfhd\t236180\nHappyBdayRW#\t236181\n胶合板\t236182\n雄心\t236183\n摇摇gge\t236184\nyujg\t236185\n只要你\t236186\n10万一平方\t236187\n虽一笔一划\t236188\n宋伟硕\t236189\n心理医生\t236190\nAvi\t236191\n新塘路58号\t236192\n好丑好丑好丑好丑好丑\t236193\n赵超房\t236194\nxifn\t236195\n13414042023\t236196\n上不上班\t236197\n谁说了我是你的一生我是的主人\t236198\n回走\t236199\n堡利斯\t236200\n师傅\t236201\n轻欣\t236202\n凶残\t236203\n正面图\t236204\n车立狗\t236205\n1212122121220038721\t236206\n字棋\t236207\n回赠\t236208\nxifu\t236209\n大噗\t236210\n周广琳\t236211\n掌鸭掌鸭\t236212\n珊瑚礁\t236213\n纷纭\t236214\n名画\t236215\n纷纷\t236216\n度儿\t236217\nGriggstitgr\t236218\n完美报道\t236219\n15180288202\t236220\n正政丁沟\t236221\n大师们\t236222\n一乌鱼\t236223\n哈孝\t236224\n兰陵\t236225\n小猿\t236226\n修炼\t236227\n残骸物\t236228\n度秘度秘你真好度秘度秘你真好\t236229\n哈子\t236230\n小猴\t236231\n小猫\t236232\n小猪\t236233\nvdg\t236234\n有度\t236235\n王彩霞\t236236\n田晓轩\t236237\n小猜\t236238\n供应量\t236239\n余杭区\t236240\n大面积\t236241\n潘永琪\t236242\n泥娃娃\t236243\n上苍\t236244\n串珠\t236245\n小猃\t236246\n钱谷姑娘\t236247\n我喜欢度秘\t236248\n布斯克茨\t236249\n童话梦幻世界美\t236250\n狗头猪欣\t236251\n没一\t236252\n罗雄\t236253\n水煮鸡蛋\t236254\nmdkfn\t236255\n没不\t236256\n禾禾苗杞县\t236257\n那麽\t236258\n主持词\t236259\n3万元\t236260\n144788\t236261\nbyg\t236262\n嗯笨笨笨笨\t236263\n郎悦\t236264\ntan2a\t236265\n一丝不苟\t236266\n大考\t236267\n喔嚯\t236268\nbye\t236269\nfreshit\t236270\n不不不我问\t236271\n光明小区二号楼\t236272\n不要脸的你\t236273\n羊羊\t236274\n爱唱老情歌\t236275\nukonclea\t236276\nAmour\t236277\n坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋坏蛋\t236278\n赵丽穎\t236279\n真爱观\t236280\n过时\t236281\n扁荷花\t236282\n云升\t236283\n早孕\t236284\nidutuutxugt\t236285\n滚珠\t236286\n幺六幺二\t236287\n15883426387\t236288\n过日\t236289\n非常规\t236290\n迁出\t236291\n找
死\t236292\n云南\t236293\n死机子\t236294\n薛杉杉\t236295\n好人大眼萌\t236296\n啦啦啦啦啦我真的唱歌别打扰我\t236297\n是不是你的爱\t236298\n敲定\t236299\n移动一根火柴\t236300\na44度\t236301\n呀什\t236302\n互利\t236303\n妈祖\t236304\n找歌\t236305\n分类似\t236306\n一五得五\t236307\n官员\t236308\n2011年3月10日\t236309\n422555533\t236310\n傻子度秘\t236311\n商妹\t236312\nios\t236313\nior\t236314\n懒腰\t236315\niop\t236316\nioo\t236317\n老派\t236318\ngjtmtw\t236319\nLcydgdxgxlgdydygdfjggghknclyfktlzgzfzgzfmxfkfkfkzfzfkzfkzffzzfghxgltossodurjhddkgdtosfsootttszkydkaWrkaalairwkfkkrkwlhdsaktdtxrsktseiatkdldygkatfoareudhfhlgzstyffsjgsjfajfsirwtawyo55hdlsyldylslufhtdglFszfllar\t236320\n六味地黄丸\t236321\n哼萌萌哒\t236322\n250p三\t236323\n仇总\t236324\n猪肉馅\t236325\n冠岩\t236326\n您说\t236327\n无悔\t236328\n网特\t236329\n百万巨鳄\t236330\n鹞子\t236331\n摆场\t236332\n我的身体\t236333\n剑眉\t236334\n郭雨涵\t236335\nsetv\t236336\n验货号\t236337\n阴云\t236338\nsets\t236339\n呼吸包\t236340\n混纺\t236341\n东街\t236342\n我喜欢森\t236343\n96年\t236344\n电视书\t236345\n郭鹏觉\t236346\n过业\t236347\n一家网\t236348\n你是哪里人\t236349\n梦洁\t236350\n思量\t236351\n飒漫画乐绘馆\t236352\n徐婷\t236353\n行玉观\t236354\n谷穗\t236355\n赵连成\t236356\n方成\t236357\n我没在气我不晕你把我不在你骗我的好\t236358\n乡鸡血石\t236359\n一百个小时\t236360\n36层\t236361\n麻辣鸡\t236362\n顺当\t236363\n华fly\t236364\nGhfhghi\t236365\n挥动\t236366\n真心罐头\t236367\n九万九千九百九十九一99999亿\t236368\n1807865316\t236369\n得了咧\t236370\nagjdpm\t236371\ntilrotifrom\t236372\n9gf\t236373\n888888\t236374\n三七年\t236375\n菜钱\t236376\n相残杀\t236377\n踢走\t236378\n五日\t236379\n17160613585\t236380\n麦子\t236381\n第五十几位\t236382\nmadein\t236383\n数多一个\t236384\n威宁威\t236385\n特有\t236386\n夜后页\t236387\nddreeretseettg\t236388\n曾汝馨\t236389\n我很自信我晒我晒你最丑\t236390\n国钊哥\t236391\n不好别\t236392\n好刀\t236393\n465450054245565666893864546456598￥\t236394\n闫莹莹\t236395\n3333333333333\t236396\n唱歌\t236397\n迪厅\t236398\n要害怕\t236399\n糸未\t236400\n下一个男人\t236401\n再发\t236402\n好刚\t236403\n真朋友\t236404\n一那\t236405\n我是你的小秘书度秘\t236406\n十针\t236407\n项丽梅\t236408\nsich\t236409\n五岳山\t236410\nkbjba\t236411\n簫\t236412\n终觉\t236413\n好别\t236414\n坦塔维\t236415\n簧\t236416\n簡\t236417\n常浩冉\t236418\n莫
来\t236419\n簽\t236420\n簿\t236421\n好其实\t236422\n颇感\t236423\n疗养\t236424\n巴索\t236425\nchfo\t236426\n紧干\t236427\nbeatbox\t236428\n美了我美\t236429\n先先先爱上仙\t236430\nchff\t236431\n好赖话\t236432\n全队\t236433\nchfx\t236434\n中漫有心事给我乐乐\t236435\nzzzzzzzzZzzzzzzzzzZZZZZZZzzzzzzzzzzzzz\t236436\n用刑\t236437\n周秦\t236438\n周秘\t236439\n霹雳鲸\t236440\n太不错\t236441\n福音小学\t236442\n小意\t236443\n王葆心\t236444\n等你爱我\t236445\n2x4y4\t236446\nj一徙\t236447\n用到\t236448\n多米尼克\t236449\n小小声\t236450\n执业\t236451\nEBAY\t236452\n少儿版\t236453\ncomplishit\t236454\n第2部\t236455\n人公司\t236456\n阙春\t236457\n小Sandy\t236458\n真的好办\t236459\n多米多米我跟你江山老鼠\t236460\n秘秘姐\t236461\n複雜\t236462\n没懂\t236463\n7月10号\t236464\n陈夫妇\t236465\npianzu\t236466\n3217\t236467\n3210\t236468\n说出口\t236469\n3218\t236470\n基础教育\t236471\n小和山\t236472\n马琳\t236473\n正荣金额财富中心\t236474\n李慧\t236475\n来自星星的你2\t236476\nGRAD\t236477\n盎然\t236478\n张北\t236479\n从及\t236480\n圣堂\t236481\n快疯啦\t236482\n报表\t236483\n美林\t236484\n杨紫帆\t236485\n15072045501\t236486\n十啪啪嗯\t236487\n吴艳径\t236488\n维多利亚西\t236489\n九十根\t236490\n美极\t236491\n夹头\t236492\n从发\t236493\n嫁一等\t236494\n青院\t236495\n圣堡\t236496\n曹华贵\t236497\nfcn\t236498\n从句\t236499\nj\t236500\n从口\t236501\n冬米\t236502\n7点30\t236503\ngtxhuvuugichlxjajffghkxgkxlxnkkkteunkcglgqkpjhenp\t236504\n倾盆大雨\t236505\n小山村\t236506\n兵力\t236507\n郏贞瑞\t236508\n围堰\t236509\n默默达\t236510\n围堵\t236511\n电视房\t236512\n贺英杰\t236513\nyiigg\t236514\n我的生命\t236515\n四Ｇ网\t236516\n空头支票\t236517\n盐鸡粉\t236518\n陋室铭\t236519\n屠洪刚\t236520\n尔虞我诈\t236521\n耗电量\t236522\n切读书\t236523\n张龙丑\t236524\nandights\t236525\ndreampa\t236526\n奶头子\t236527\nlobmd\t236528\n河段\t236529\n补卡\t236530\n扣费\t236531\n诗知了\t236532\n孟玉玲\t236533\n忘了不该\t236534\n不要酱紫\t236535\n眼力当然\t236536\n外星猫\t236537\n桃红柳绿四个字\t236538\n泥土\t236539\n雨嚒\t236540\n屌事\t236541\n梦到我爸要害我\t236542\n强宠\t236543\n你了行\t236544\n刘彦琪\t236545\n山底村\t236546\njhbbjjjbh\t236547\nGdy\t236548\njigfn\t236549\nyely\t236550\n熊兆玉\t236551\n以权谋私\t236552\n76元\t236553\n晕晕的好乖\t236554\n歌歌\t236555\n南新南\t236556\n桃园小区\t236557\n胡梦\t236558\n惠威采\t236559\n团成\t236560\nlijiaol\t2
36561\njlgadt\t236562\nhhhuuuuooydv8rweruolpouygbbjhgbhhgbjhfvhytertuioooobnkghjjhjjhghhhhbgghjkkkoourss\t236563\n5月31日晚\t236564\n团战\t236565\n没女\t236566\n独栋别墅\t236567\n宁强\t236568\n白班级\t236569\n扣人\t236570\n麻仁\t236571\n零七三零\t236572\n李彩燕\t236573\nfeeding\t236574\n二二六二六三八零七\t236575\n八二二零\t236576\nparis\t236577\n写阅\t236578\n罗萌骊\t236579\n大事者\t236580\n好臭好臭好臭好臭一股鸭蛋子臭\t236581\n左右轮\t236582\n解缺\t236583\n弄完\t236584\n动\t236585\n助\t236586\n兔兔兔兔他天天\t236587\n劫\t236588\n十一十二十三十四十五十六十七十八十九二十\t236589\n气度\t236590\n加\t236591\n奥格斯堡VS弗赖堡\t236592\n势\t236593\n白云山\t236594\n劵\t236595\n劰\t236596\n孙雨涛\t236597\n劲\t236598\n劳\t236599\n吗度\t236600\n学爱版\t236601\n阿峰\t236602\nCampaign\t236603\n劋\t236604\n找再见\t236605\n谢你好\t236606\n劇\t236607\nsdchjh\t236608\n祷告文\t236609\n自我\t236610\n劝\t236611\n办\t236612\n功\t236613\n武理\t236614\n阿玛个\t236615\n不要嘛轻点\t236616\n反转\t236617\n熬余\t236618\n头衔\t236619\n捞出\t236620\n球器\t236621\n61005句\t236622\nffffffffffffff\t236623\n2.68%\t236624\n套镇\t236625\n2座\t236626\n2度\t236627\n干嘛结巴\t236628\n音乐科\t236629\n碟子\t236630\n销毁\t236631\nvvvvh\t236632\n疯了片\t236633\n仲咪\t236634\n小孩们\t236635\n五千卷\t236636\n15034114695\t236637\n不不不对\t236638\n单位\t236639\n童颜巨屌\t236640\n绿茵\t236641\n绿茶\t236642\n恶贯之蛮\t236643\n栋梁\t236644\n黄教\t236645\n13687091536\t236646\n不就可\t236647\n不能不礼貌\t236648\n培诺\t236649\nSystem\t236650\n家帅\t236651\n嗯份\t236652\n活动房\t236653\n存心利亚\t236654\nxfc\t236655\nWeicoPintu#\t236656\n蝴湾三\t236657\n有假\t236658\n么源\t236659\n嗯们\t236660\n不级\t236661\n哈阿\t236662\n五六天\t236663\n不咋地\t236664\nnn1分\t236665\n昨天前\t236666\n猖狂\t236667\n诉讼\t236668\nrtdg\t236669\n一阵钱\t236670\n魔轰神这图\t236671\n超薄\t236672\n吴文俊\t236673\n1241410\t236674\n后脑\t236675\n123份\t236676\n多音\t236677\n多伦多市\t236678\n哈队\t236679\ngcghg\t236680\n家帽\t236681\n家常\t236682\n绳割\t236683\nQQ斗地主\t236684\n有偶\t236685\n不包\t236686\n杜瓦特\t236687\nTrusty\t236688\n01米\t236689\n这笔下\t236690\n指甲坐标\t236691\nipling\t236692\n杨佳琪\t236693\n小仙女\t236694\n聪明的的你\t236695\n刮刮\t236696\n后脚\t236697\n342万头\t236698\n乘热打铁\t236699\n地瓜叶\t236700\n靠你好无耻\t236701\neì\t236702\n史料\t236703\n
次元\t236704\nFffoi\t236705\n孤身圣战敌天下\t236706\n就无\t236707\nvfffgggfygffe\t236708\n弊者\t236709\n无理想主义\t236710\n修蝶\t236711\n相等量\t236712\n到渠成\t236713\ndeurh\t236714\n寒流\t236715\n度度你不爱我\t236716\n卖聊聊天\t236717\n牌坊\t236718\n短相思兮\t236719\n九年级化学\t236720\n撒网\t236721\n哗哼\t236722\n粉紅海灘\t236723\n硬生生\t236724\n20120707\t236725\n束缚\t236726\n王乐乐\t236727\n伍灿\t236728\n进尺\t236729\n快书\t236730\n欲绝交\t236731\n原型机\t236732\n哗哗\t236733\n相信你会\t236734\n尸鬼封尽\t236735\n面色\t236736\n粉妆玉\t236737\n嗯荣耀x5\t236738\n5月30日\t236739\n美丽哪哪哪哪哪哪\t236740\n陈安富\t236741\n就是这样子\t236742\n好不好度\t236743\n反告\t236744\n度秘花香\t236745\n倡议书\t236746\n高骈\t236747\n红稣\t236748\n铺我输\t236749\n秘僵尸僵尸\t236750\njsjsndb\t236751\n击中\t236752\n老本事\t236753\n19451855673\t236754\n几千面\t236755\n死赖\t236756\n4.3万亿日元\t236757\n心关\t236758\n一天晚上\t236759\n杰里·斯隆\t236760\nDell\t236761\nGilagtjg\t236762\n多国央行\t236763\n未来工作后\t236764\n哎呀你混了吗我原谅你了亲亲哒\t236765\n领舞\t236766\n黄雨珊\t236767\n燕窝\t236768\n市场价\t236769\n修真功\t236770\n打折\t236771\n汋\t236772\n汉\t236773\n名侦探柯南剧场版天空的遇难船\t236774\n求\t236775\n刘佳\t236776\n汁\t236777\n汆\t236778\n汇\t236779\nDJ版\t236780\n神马神马为谢谢你\t236781\n巴桑\t236782\n汞\t236783\n江\t236784\n汝\t236785\n要不告诉\t236786\n两类\t236787\n汐\t236788\n汗\t236789\naremore\t236790\n汕\t236791\n汪\t236792\n汨\t236793\n蒸水小学\t236794\n池\t236795\n污\t236796\n多谢啦\t236797\n时点\t236798\n汤\t236799\n汥\t236800\n決\t236801\n汻\t236802\n汹\t236803\nfirstflll\t236804\n汽\t236805\n九九元\t236806\n陈小沫沫\t236807\n率\t236808\n使真讨厌\t236809\n温柔体贴\t236810\n一个iu个i一个iu个\t236811\n63条\t236812\n我是我不是哪哪不见我是我实习哇无解\t236813\n飞女\t236814\n业余爱好\t236815\n玻利维亚\t236816\n唐关屯\t236817\n周庄\t236818\n一杯茶\t236819\n8686\t236820\n学经\t236821\n呵quq\t236822\n周庚\t236823\n吃了饭礼拜\t236824\n政审\t236825\n春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚春晚\t236826\n飞奔\t236827\n53级\t236828\nbizts\t236829\n说呀嗯\t236830\n安塞\t236831\n再见明\t236832\n酥胸\t236833\n累不讨厌\t236834\n回家吧\t236835\n斑马群\t236836\n1370\t236837\n欧洲冠军杯\t236838\n1376\t236839\n万安琪\t236840\n附子\t236841\n海猫\t236842\n临静R18\t236843\n5等\t236844\n发病率\t23
6845\n县市\t236846\nJAY\t236847\n安静的秘\t236848\n一个菜\t236849\n语音机\t236850\nuejjhwc\t236851\nJAM\t236852\n银赫太\t236853\n3。y\t236854\n巨痛\t236855\nyochiago\t236856\n恶魔果实\t236857\n最看我有没空你我\t236858\n领卷\t236859\n海蜇\t236860\n贡贡阳\t236861\nK2\t236862\nhqp\t236863\n说不过\t236864\n依旧家\t236865\nFFFFFFFFFF\t236866\nhqw\t236867\n海猛\t236868\n哈火罗\t236869\n愚见\t236870\n狂妄自\t236871\n紧急关头\t236872\n说不还\t236873\n潘田\t236874\n异世其他人的秘书对吗一是任何人的秘书对\t236875\n混合型\t236876\n文才\t236877\n诡谲\t236878\n候文志\t236879\n山西省\t236880\n胳\t236881\n托词\t236882\n土豆地址\t236883\n猪喽\t236884\n凶神恶勃\t236885\nQQ斗\t236886\n古朗月行\t236887\n学而无\t236888\n冒味\t236889\n翟富贵之子\t236890\n妈呀你\t236891\n心驰\t236892\n梁琼\t236893\n北京国安\t236894\n谢泥\t236895\n我了我金杯有心计你了我永远不会被你的\t236896\n好米\t236897\n吉他谱\t236898\n哇奇遇记啊洞房花烛夜\t236899\n苗头\t236900\n灵宝\t236901\n纹族鱼\t236902\n威胁者\t236903\n我们的平台着火了\t236904\n十四张\t236905\n八几个\t236906\n美伊美\t236907\n胸\t236908\n富有\t236909\neeee2\t236910\n不要害\t236911\n勇敢\t236912\n眼子\t236913\n李叉叉\t236914\n爱我你就说我爱你\t236915\n一八千八百九十三\t236916\nViP\t236917\nONONONn\t236918\n是亲骂\t236919\n能\t236920\n并存有\t236921\n上碟\t236922\n100天\t236923\nVia\t236924\n呃零\t236925\n别错用\t236926\n8ii\t236927\n8ik\t236928\n斑驳\t236929\n勤自省\t236930\n几千年\t236931\n削脸\t236932\n斑马\t236933\n卖场\t236934\n美国伊利诺斯大学公共卫生学院\t236935\n红红我开开心\t236936\n榆次明天几点天亮\t236937\n189几年\t236938\n儿日不落的歌\t236939\n卖地\t236940\nstomatoutonou\t236941\n带利\t236942\nppwtjjajggjwwmdjtwjjj\t236943\n你好丑喔\t236944\n胥\t236945\n刘菲\t236946\n距去\t236947\n朱雯\t236948\n一069\t236949\n独尊\t236950\n认娘\t236951\n营业机器人儿张\t236952\nFxhknvcfghkkjgccfghhhhhhhhh\t236953\n一心彝二龙戏珠三肖\t236954\n小纹\t236955\n茫睿\t236956\n喜庆\t236957\n小纲\t236958\n留影\t236959\n苦大仇深滴\t236960\n李晓光\t236961\n食神王\t236962\n小约\t236963\n说话机\t236964\n毕业时\t236965\n朱雀\t236966\n哪头儿\t236967\n海丽\t236968\n养老金\t236969\n一枚一枚\t236970\nk不k\t236971\nMini\t236972\n吴健梅\t236973\nLimitedsky\t236974\n海丰\t236975\n海中\t236976\n提假\t236977\n第五遍\t236978\n有机生活\t236979\n有毛骂\t236980\n隐晦\t236981\n亮化\t236982\njjjgt\t236983\n海东\t236984\n麻利\t236985\n捕猎\t236986\n改弦易辙\t236987\n素心腊梅\t236988\n
空套\t236989\n摸扎\t236990\n现实版\t236991\n竖排\t236992\n八五折\t236993\n宝林\t236994\n远走他乡\t236995\n说什么呢听\t236996\n海上\t236997\n第二行\t236998\n图段\t236999\n万沙\t237000\nbibigsn\t237001\n自以为\t237002\n祝晨梦初\t237003\n过人想人\t237004\n500度\t237005\nfistfas\t237006\nths\t237007\n4399赛尔号\t237008\nhhew\t237009\n竹杆\t237010\n金湾区\t237011\nnandisiga\t237012\n隋子辉\t237013\nKg\t237014\n娄佳怡\t237015\n2代\t237016\n5527587\t237017\n公立医院\t237018\n王燕霞\t237019\n付峻楠\t237020\nchxbxbdh\t237021\n你好我的名字\t237022\n2们\t237023\n张子健\t237024\n被剥夺\t237025\n上古之神\t237026\n2件\t237027\n竹杠\t237028\n创立\t237029\n深刻节\t237030\n首跟\t237031\n植物大战僵尸天空之城\t237032\n想说的话\t237033\n迷朦点\t237034\n百过\t237035\nu54\t237036\n一千一一一万一百亿\t237037\n高低杠\t237038\nyfuusf\t237039\n王阿喵\t237040\n3357\t237041\nLUMIA800C\t237042\n真片\t237043\n巧有\t237044\n组合\t237045\n组名\t237046\n有意思考\t237047\n很气愤\t237048\n真心伟业驾校\t237049\n龊货\t237050\n差据\t237051\n爷体\t237052\n章鱼性\t237053\nthc\t237054\n晕了你\t237055\n昌根阿瑞仁波切\t237056\n夺心\t237057\n想的了有\t237058\n别杀\t237059\n快点哇快点\t237060\ntha\t237061\n夺志\t237062\nEXTERNALELECTRIC\t237063\n一个九岁\t237064\n电流表框\t237065\n120锅头\t237066\n小邓\t237067\nqueel\t237068\n一⊕\t237069\npopo\t237070\npopm\t237071\n余篇\t237072\n阶梯\t237073\n约翰梦里都\t237074\n八仙过海\t237075\n累笑\t237076\n一起来玩\t237077\n做为\t237078\n王稼祥\t237079\n鳄鱼\t237080\n王佳瑞\t237081\n小邱\t237082\nYSL\t237083\ntwtyoshj\t237084\n小邪\t237085\n15822591007\t237086\n乳推\t237087\n小那\t237088\n随心就好\t237089\n我讨厌你夺命我不爱你你爱我\t237090\ntltdkujat\t237091\n马以南\t237092\n临漳小学图书馆\t237093\n元哥\t237094\ndgggjhhhjgdffg\t237095\n雾仁\t237096\n王武双全\t237097\nfhhdfb\t237098\n各種\t237099\n度酱不哭\t237100\nDeepClear\t237101\n星巴克世界城\t237102\n友邦\t237103\ng58258\t237104\n整人\t237105\n作文字\t237106\n美卡弹\t237107\n徒步者\t237108\n呀不过\t237109\n量堡证\t237110\n击毙\t237111\n包公\t237112\n迷踪\t237113\nQusy\t237114\n没你可\t237115\n左腿\t237116\n圆寂\t237117\n180度\t237118\n杜近芳\t237119\n头昏\t237120\n邻近\t237121\n四百位\t237122\n我是好孩子你是坏孩\t237123\n热血类\t237124\nHFICDF\t237125\n誓宝贝坦克了我想去呢礼佛酒吧\t237126\n逆天神\t237127\n陆星期六\t237128\n你老你\t237129\nKH\t237130\n六一纪事\t237131\n睡吁\t23
7132\n换班儿\t237133\nwangge\t237134\n我的名字不要说你有我讨厌你好别扭\t237135\n花开花落无寻处\t237136\n陈亚红\t237137\n田阳\t237138\n中箭\t237139\n睡听\t237140\n为你不懂\t237141\n几盘\t237142\n哎呀片\t237143\n下落者\t237144\n很早就\t237145\n10月20号\t237146\n睡吼\t237147\n和平安宁\t237148\n我是李一杯鹤顶红\t237149\n实时\t237150\n一阵儿\t237151\n我相信你是\t237152\nLOVE\t237153\n傻子嘛说嘛你可了发的你不说你是电影小达怡曼\t237154\n基石\t237155\nKT\t237156\n福岛一号站4号\t237157\nxfdfdffd\t237158\nZARA\t237159\n08MKMF\t237160\n皇恩\t237161\n穆科宇\t237162\n不差\t237163\n#新京报观剧\t237164\n而为\t237165\n不巧\t237166\n骇客狗\t237167\n77585\t237168\n玩蚀\t237169\n俩点\t237170\neygxhjbsRYUIIGFFHHtyuccRTYXD425563555\t237171\n耳根\t237172\n而且\t237173\n落差\t237174\n金estaysmy\t237175\n这么说\t237176\n瓜皮炒珍珠螺\t237177\n乐讯\t237178\n湖北恩施\t237179\n家庭暴力\t237180\n黄晓春\t237181\n第24集\t237182\n裤叉\t237183\n免费的女频\t237184\n奥迪q5\t237185\n一下子天\t237186\n52521\t237187\nkaren\t237188\n50分\t237189\n015年\t237190\n你好我叫毛毛毛毛萌萌萌萌萌\t237191\nfvjhfb\t237192\nbkgb\t237193\n黄晓明\t237194\n肢解\t237195\n二日早\t237196\ncodonline\t237197\n网易云阅读的订阅源\t237198\n完美图\t237199\n4位\t237200\n主要\t237201\n人的本性\t237202\n902\t237203\nYRJ\t237204\n房地产业\t237205\n嘛子\t237206\n三生有幸\t237207\n威旺\t237208\n禤大林\t237209\n斗士\t237210\n德保\t237211\nkoulgjhf\t237212\n陈老太\t237213\n红纹网\t237214\n游泳\t237215\n张子沫\t237216\n混了\t237217\n少年度\t237218\n907\t237219\n糖蒜\t237220\n海景\t237221\nhethereor\t237222\n保花\t237223\n山东华\t237224\nwfc\t237225\nwff\t237226\n婆蔓\t237227\nwfd\t237228\nWash\t237229\n须眉\t237230\nshijunmai\t237231\n冰火\t237232\n5585256963663968658968696666666636656663\t237233\nwfv\t237234\n游法\t237235\n各履其职\t237236\n坏人人\t237237\n一唱\t237238\n15864516398\t237239\n002319\t237240\nRoslin\t237241\n期数\t237242\n十二十三\t237243\n赶好\t237244\n妹妹女\t237245\njiushia\t237246\n古矿中学\t237247\n午辰溪\t237248\n8386708460468440648347138138438733453478387384\t237249\n配件\t237250\nbjx\t237251\n勰圈\t237252\n12312312345677897898989899\t237253\n一唔\t237254\n十二十个\t237255\ngygggygv\t237256\n车祸\t237257\n还有我爸\t237258\nTHSNKS\t237259\n2012年4月\t237260\nhuideyinweininzhizaishineiruhekandaoishfhbdhchfhdhdhdhdhhfne\t2
37261\nHFVCGGGGCG\t237262\n过高\t237263\n车票\t237264\n愤怒的那么多你最骗人鬼公民\t237265\n新年大吉\t237266\n唉天呐\t237267\n金正日\t237268\n麋鹿\t237269\n丁家宜\t237270\n王基铃\t237271\n亲奥\t237272\n伊卡莉\t237273\nBcccoz\t237274\n对啊哦\t237275\n亲女\t237276\ntffy\t237277\n舒畅\t237278\n俩辆\t237279\n普法栏目剧\t237280\n上学前\t237281\n特价\t237282\n毛料\t237283\n凉白\t237284\n12324567777990\t237285\n蝴蝶泉\t237286\n金刚网厂\t237287\n小虫虫\t237288\nUGCCV\t237289\nhfcy\t237290\n沒乐子\t237291\n害羞\t237292\n啦啦群\t237293\n雷精欣\t237294\n没没有\t237295\n新山三\t237296\n男鬼\t237297\n上田村\t237298\nhjbne\t237299\n合欢花呗\t237300\n城市发展\t237301\n也不见了\t237302\n苹果秘码\t237303\n三一班\t237304\ndlighting\t237305\n九十四十分\t237306\nwhatI福\t237307\n1884年11月1日\t237308\n没哪\t237309\n白瑞涵\t237310\n起搏器\t237311\n387场\t237312\n问一不好\t237313\n累了累\t237314\n学员\t237315\n瑞沃\t237316\n菜菜菜\t237317\n慢慢学吧知刀口号外的啊你好吗啡｝\t237318\n懂懂\t237319\n一十个\t237320\n热气腾腾\t237321\n25886355\t237322\n污物\t237323\n赵友宁\t237324\nyfyyfhff\t237325\n濮阳堡\t237326\n张龙\t237327\n五十六块\t237328\n上牙二十安息和望洞庭\t237329\nyww\t237330\n上法庭\t237331\nyws\t237332\n星月\t237333\n、、、、\t237334\n德华\t237335\n内容度\t237336\n拉登\t237337\n复旦新闻学院\t237338\n蜷缩\t237339\n叶老\t237340\n律令\t237341\n雨朵\t237342\n明日14号\t237343\n灌灌\t237344\n我真的很讨厌你讨厌讨厌讨厌讨厌就是你\t237345\n神智会\t237346\n张大中\t237347\n馅儿\t237348\n八八九七九零四八\t237349\n碳酸\t237350\n原上\t237351\n别忘了我是公主\t237352\n猪猪侠最\t237353\n华乖\t237354\n重型地中海贫血\t237355\n司麦尔\t237356\n亲一起来\t237357\n王曼昱\t237358\nyuygyyggvy伴l人厂厂\t237359\n右脸\t237360\ng1mgm\t237361\n纵横交错\t237362\n小雪露\t237363\nnfjf\t237364\n好讨\t237365\nbagbang\t237366\n猪猪侠机\t237367\n对腿\t237368\n哼度秘\t237369\n好记\t237370\n尿腿\t237371\n轻雾\t237372\n忽忽悠悠\t237373\n安顺市\t237374\n微信息\t237375\n第888位\t237376\n白小胖\t237377\n冷天\t237378\n发短信\t237379\n上前\t237380\n1933年\t237381\n紫藤萝体照\t237382\n这个战\t237383\n多功能\t237384\n冷夜\t237385\n万能软件\t237386\nhttpahiphotosbaiducomxiaodupicitem1b4c510fd9f9d72a70881d68d32a2834359bbbd2jpg\t237387\nskfc\t237388\n凯瑞·穆里根\t237389\n街头地画\t237390\n浓艳\t237391\n饱餐\t237392\n认识度\t237393\n最最最最最最\t237394\n美俄\t237395\n几本儿\t237396\n拜仁兄\t237397\nwifimasis\t237398\n肉阿修罗
\t237399\n莱佛藤\t237400\n老子不想和你废话\t237401\n星心\t237402\n错就错\t237403\n未经历\t237404\n控制不了\t237405\n市人大常委会\t237406\n快快乐乐乐乐乐\t237407\n宜章\t237408\nejhehe\t237409\ngbuf\t237410\nbnbgfjr\t237411\n参军\t237412\n武林风\t237413\n九碗\t237414\ngcjfhfnffxxxgjmlnbzxz\t237415\n梦梦梦梦\t237416\n不好不好笑\t237417\n肝硬\t237418\n胜负欲\t237419\n赵高\t237420\n566555588\t237421\nxyoyxyoyx\t237422\n艾马\t237423\n经略\t237424\n砍价\t237425\n伪善\t237426\n8995899\t237427\n樱桃小丸子第二季\t237428\nuuuu\t237429\n人杰\t237430\n保佑我科科\t237431\n妇保\t237432\n知唉\t237433\n我不爱你就是说你也不爱我\t237434\n九牛一毛\t237435\n无名指\t237436\n朱梦倩\t237437\ncandle\t237438\n35码\t237439\n人权\t237440\n有一个人肉\t237441\n08900089995855545221336\t237442\n王依帆\t237443\n地缘\t237444\nTop6\t237445\n接瓦\t237446\n百五块\t237447\n飞落\t237448\nsjdbshshsh\t237449\n行装\t237450\narrnent\t237451\n大家说\t237452\n死苏\t237453\n凶相\t237454\n重庆南路308号\t237455\n小河东\t237456\n第81篇\t237457\n6.5元\t237458\n托伊夫\t237459\n起息\t237460\n不明白快乐的事\t237461\n魏浩龙\t237462\n泥浆\t237463\n直来直去\t237464\n呀不错\t237465\n麻辣味\t237466\n茫然状\t237467\n一事无成\t237468\n李洪海\t237469\n大台\t237470\n伤命\t237471\n叉代\t237472\n二个小时\t237473\n過電影什麼那樣麻反\t237474\n苏人衣\t237475\n2112222211212\t237476\n歪路\t237477\n16com\t237478\n天使投资人\t237479\nww点7575\t237480\nPoker\t237481\n不怕\t237482\n梵摩\t237483\ngujh\t237484\n止祸\t237485\n70年\t237486\n公有制\t237487\n不怂\t237488\n一条河\t237489\n大号\t237490\ngujg\t237491\n追梦者\t237492\n卖艺\t237493\n充炒鸡蛋\t237494\ncou\t237495\n重庆卫视\t237496\n9996554456年\t237497\n泰戈尔\t237498\n熙来\t237499\n小孔雀\t237500\n受嘛人家\t237501\n不怨\t237502\n林音汝\t237503\n不急\t237504\n奶粉\t237505\n1344343\t237506\n王八旦\t237507\n20排\t237508\n荣令鑫\t237509\n伏虎拳\t237510\n雷竹\t237511\n下蹲\t237512\n我的女朋友加家鹿鹿我的女朋友\t237513\n己所不欲\t237514\nad4733\t237515\n凤霞\t237516\n正土\t237517\n成片\t237518\neufc\t237519\n潜规\t237520\n线圈\t237521\n但愿者\t237522\nˉェ\t237523\n真不贪心\t237524\n正在\t237525\n未婚\t237526\n颖宝\t237527\n不是我在后的装类\t237528\n我的们\t237529\n24512二15\t237530\n546464785761848455764878254549675545346￥\t237531\n房主任\t237532\n锄地\t237533\n好湿\t237534\n隧\t237535\n2850\t237536\nBGM\t237537\n高抛低吸\t237538
\n骗鬼\t237539\n愧疚\t237540\n食玩\t237541\n75565246\t237542\n28536\t237543\n避光\t237544\n隻\t237545\n隼\t237546\n难\t237547\n鹿犬\t237548\n隅\t237549\n隆\t237550\n模拟刀\t237551\n露妍\t237552\n65057848543\t237553\n小狗咪\t237554\n700多少\t237555\n隐\t237556\n葫\t237557\n隔\t237558\n更广泛\t237559\n1克\t237560\n隙\t237561\nSUE\t237562\n赶快奥特曼\t237563\n吴教练\t237564\n亲故们\t237565\na5纸\t237566\nsays\t237567\n刷伞\t237568\n红掌\t237569\n一损俱损\t237570\n250250250250250258258975897576\t237571\n麦特\t237572\n宏观\t237573\n说不理\t237574\n大山田湾\t237575\n不是晚\t237576\n吴总\t237577\n董怡卓\t237578\n2773669875\t237579\n下水\t237580\n三十多块\t237581\n幻幻\t237582\n123h\t237583\nsayi\t237584\n装清\t237585\n董\t237586\nJYJ3\t237587\n伱先\t237588\n云朵\t237589\n聊聊天好不\t237590\n麦片\t237591\n涵义\t237592\n骨肉\t237593\n999成\t237594\n妙手\t237595\n勇勇\t237596\n1234\t237597\n1235\t237598\n1230\t237599\n1231\t237600\n拜拜元\t237601\n1233\t237602\n朱红涛\t237603\ncvdd\t237604\n1238\t237605\n名言行\t237606\n压力裤\t237607\n乌龟\t237608\n乌龙\t237609\n五六户\t237610\n就叫秘\t237611\n李宏伟\t237612\nsheisinbig\t237613\n苦行\t237614\n↘\t237615\n金杰\t237616\n爸爸爸爸爸爸\t237617\n今天清晨\t237618\n伊尔\t237619\n明天\t237620\n#复仇者联盟#\t237621\n1369870\t237622\n奥提奥\t237623\n金条\t237624\n浦三路\t237625\n水落石出\t237626\n金杯\t237627\n金杨\t237628\n秘盒\t237629\n金范振\t237630\n上海商铺\t237631\n假容\t237632\n5.83GHz\t237633\n芗\t237634\n前仆后继\t237635\n一石恶的小游苹果榛蘑\t237636\nnefteddsyhhfq\t237637\nt单\t237638\n行那你来我家\t237639\n太傻\t237640\n费案\t237641\n著\t237642\n办男呢永狂\t237643\n累了把心靠岸\t237644\n太傲\t237645\n盈者\t237646\n王浩年\t237647\na赛高\t237648\n假定\t237649\n张大胆\t237650\n秘直\t237651\n拼命\t237652\n太傅\t237653\n爆照\t237654\n水之源\t237655\n相差无几\t237656\n十几天\t237657\nXFX\t237658\n65岁\t237659\n十几头\t237660\n蛋蛋\t237661\n冷太阳\t237662\n伊瓜因\t237663\n大爷\t237664\n还挺好\t237665\n7喔喔76\t237666\n刘大夫\t237667\n屠户\t237668\n百度缓存\t237669\n销售类\t237670\n周小攀\t237671\n格纹\t237672\n格纸\t237673\n4242454642\t237674\n三清宫传度大典\t237675\n数不吧\t237676\nbeila\t237677\nFacebook\t237678\n向天歌\t237679\n二七六二\t237680\n味美\t237681\n大爺\t237682\n了不骗你们我真的你是机器人\t237683\nwoxiang\t237684\n敖汉\t237685\n唐月琳\t2
37686\n萝卜丝\t237687\n崇高\t237688\n步步嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟嘟猪肚\t237689\n花流水头\t237690\n聯係\t237691\n不是告诉你我的帅帅\t237692\n为份\t237693\n黑念生\t237694\n太冷漠漠\t237695\n9块\t237696\nferry\t237697\n奶婆\t237698\nqweert\t237699\n座谈\t237700\n一眼\t237701\n车度\t237702\n车座\t237703\n蛟龙\t237704\n我相信你的啦\t237705\n我喜欢傲娇受\t237706\n佳世客\t237707\n良腿\t237708\n体鳞伤\t237709\n害娃娃\t237710\nDHFJ\t237711\n一鸣惊人\t237712\n备注\t237713\n晨和巩\t237714\n姑父\t237715\n谢晓纯\t237716\n羊千喜\t237717\n990\t237718\n北方人\t237719\n十二指\t237720\n外用洗吹\t237721\n嫁给我吧宝贝\t237722\n眼假\t237723\nsuperjyounior\t237724\n纹绣\t237725\n好你的宝样\t237726\n唱歌歌\t237727\n2008年4月1日\t237728\n车库\t237729\n芊\t237730\nhttpchiphotosbaiducomxiaodupicitem503d269759ee3d6de7e9c9bf44166d224f4ade\t237731\nH漫\t237732\n黄灰赡\t237733\n若只\t237734\n了别说\t237735\n反应器\t237736\n随喜\t237737\njdidfkfnjgjd\t237738\njkiolmnopqrstuvwsyz\t237739\n咩vjanhw\t237740\n玩罢女\t237741\n激灵\t237742\n徐鼓励\t237743\n可贝尔\t237744\n奥兄弟\t237745\n啵啵啵啵啵啵啵啵啵啵啵啵啵啵啵\t237746\n2088\t237747\n1x2次a1x24\t237748\n2658275575\t237749\n迈克伟\t237750\n行不得\t237751\n龙台镇\t237752\n2086\t237753\ngvvcvvfhvvcvnnnngchcv\t237754\n老票\t237755\n介词\t237756\nnnnnn\t237757\n客车\t237758\n不舒心\t237759\n情不自禁\t237760\n不要气\t237761\n查一查\t237762\n油商\t237763\n余姚\t237764\n土豪女\t237765\n555555555555555556\t237766\nwords\t237767\n芹\t237768\n度尼西欢\t237769\nrfrrr\t237770\n王华\t237771\nffzcvhhr\t237772\n王单\t237773\n王博\t237774\n亲亲堡\t237775\nMnbnnNo\t237776\nfatis\t237777\n水渠\t237778\n1700185971808\t237779\n感谢你的我的信任\t237780\n计件\t237781\n露珠万竿\t237782\n夏国秀\t237783\n冷石\t237784\n最爱吃\t237785\n何等\t237786\naabb式\t237787\n干山\t237788\n练口语\t237789\n机告\t237790\n俩一块\t237791\n比价\t237792\n金俊绵\t237793\n上次片\t237794\n达瓦\t237795\n属下\t237796\n供热费\t237797\n非帅\t237798\nKcab\t237799\nmomn\t237800\n数绵羊\t237801\n山咔咔里德\t237802\n唉甲\t237803\n马大帅\t237804\n爬墙\t237805\n漲狼\t237806\n巧巧巧\t237807\n开关门\t237808\n接着装\t237809\n抽象性\t237810\n非常\t237811\n风信子\t237812\n比什\t237813\n蔡文章\t237814\n御龙\t237815\n周21\t237816\n教辅公司\t237817\n设限\t237818\n傻子王八\t237819\n￥￥￥￥￥\t237820\n记星\t237821\n杂货铺\t237822\nf0\
t237823\n迟小小小\t237824\n李国锦\t237825\n十虚岁\t237826\n千古\t237827\n前蹄\t237828\ngtfh\t237829\n99条\t237830\n千口\t237831\n三四十二\t237832\n连辑多米多米\t237833\n死不达意\t237834\n五十只\t237835\n蒙蒙哥\t237836\n千只\t237837\n帅轩\t237838\n族群\t237839\n千发\t237840\nbkb\t237841\n汉阳\t237842\nbkc\t237843\n南浔\t237844\n田宇\t237845\n谷草人\t237846\n忧子\t237847\n休息堡\t237848\n三露一岁\t237849\nDoc\t237850\n忧孙\t237851\n作揖\t237852\n锦江区\t237853\nhi萝卜头\t237854\n900余名\t237855\nkgatjg\t237856\n顺达汽车\t237857\n两建\t237858\n政府诚信最要紧\t237859\n有气无力\t237860\ngtff\t237861\n侯庄\t237862\n功夫熊猫\t237863\n阿瓦\t237864\n度秘玩\t237865\n夜三更\t237866\n上一层\t237867\n徜徉\t237868\n米帅\t237869\n吴敏貌\t237870\n核磁共振\t237871\n细碎\t237872\n起义捐\t237873\n诞节\t237874\n探亲\t237875\n阿瓜\t237876\n大头猪\t237877\n度秘王\t237878\n咔咔咔咔咔咔咔\t237879\nqqq\t237880\nGirls\t237881\n哭妆\t237882\n仓颉\t237883\n王八王\t237884\n满金粉\t237885\n你意\t237886\n始终\t237887\n王八玒\t237888\n若晨love晓涵\t237889\n酵面\t237890\n我喜欢淘气包马小跳\t237891\n一把把\t237892\nfight\t237893\n残余\t237894\n刘培丽\t237895\n隋唐\t237896\nhi省\t237897\nshuipesai\t237898\n詹焕杰\t237899\n女狼\t237900\n给以\t237901\n我喜欢偶\t237902\n中一个人\t237903\n呢忙\t237904\n是的我是小小\t237905\nwhite\t237906\n金属态\t237907\n马良\t237908\n青银川\t237909\n胡川柯南\t237910\n师焉\t237911\n百余\t237912\n时间\t237913\n上门\t237914\n毛新宇\t237915\n白牌\t237916\n邓俊浩\t237917\n名宇\t237918\n014点\t237919\n排名声\t237920\n金不换\t237921\nPDF\t237922\n火焚\t237923\n渐入\t237924\n名安\t237925\n我是智能机器人我是智能机器人我是我是智能机器人\t237926\n洗衣裳哩亲\t237927\n置业\t237928\n名家\t237929\n三角区\t237930\n均场\t237931\n徐文涛\t237932\n火焰\t237933\n吃吃吃不急\t237934\n高博炎\t237935\n途我睿\t237936\n花水\t237937\n第六名\t237938\n囍囍\t237939\n喝雪\t237940\n涌出\t237941\n陋习\t237942\n中国兵器装备集团\t237943\n魔境仙踪\t237944\n怕谁吧\t237945\n白牙\t237946\n余闻之\t237947\nuiop\t237948\n绞丝旁\t237949\n鲤鱼何时跃龙门\t237950\n舜天\t237951\n甄嬛传\t237952\n白牛\t237953\n火影龙\t237954\n听之任之\t237955\n闫笑\t237956\n姚可茹\t237957\ndhtphicro\t237958\n在眼前\t237959\n72瓶\t237960\n逆天武\t237961\n稀稀拉拉\t237962\n杰奶奶\t237963\n6点20分\t237964\n平安银行\t237965\n黄土高坡呀我的家乡个好\t237966\n卢怡萌\t237967\n十九家\t237968\n20美元\t237969\n各式\t237970\n初性本\t237971\n这会心\t23797
2\n说说实话\t237973\n七龙珠\t237974\n对接机\t237975\n舒磊\t237976\n娇儿\t237977\n太原知心知心\t237978\nboiling\t237979\n前台\t237980\n奥卡卡吉\t237981\n从商经营\t237982\n小天鹅\t237983\n劲钥匙\t237984\n两三块\t237985\n老子才\t237986\n九一个\t237987\n着眼\t237988\n3286883\t237989\n陈可爱\t237990\n20：06\t237991\n投稿\t237992\n勇川埔心辺\t237993\n大得劲\t237994\nghcj\t237995\n张正毅\t237996\n茶性\t237997\n江苏省泗阳县致远小学\t237998\n简述\t237999\n文合呒\t238000\n辣片\t238001\n白妞妞\t238002\njizz\t238003\n李小翠\t238004\n176集\t238005\nu笑哈哈\t238006\n眉间\t238007\n太k\t238008\n张梦茹\t238009\n网络体\t238010\n叫野\t238011\n问道\t238012\n九千九百九十九万九千九百\t238013\n变胖\t238014\n1445584585794969054\t238015\n生命值\t238016\n21米\t238017\n8455849546525556\t238018\n优麻\t238019\n七深深雨\t238020\n民间曲调\t238021\n喜羊羊羊\t238022\n原来一样\t238023\n鹰钩鼻\t238024\n今天是你的生日\t238025\n微词\t238026\n郑喜欢\t238027\n孙晓璐\t238028\n佰烧\t238029\n龚浩泽\t238030\n15137163690\t238031\n18点45分\t238032\n美兆nnnnnnn骊秘丑8寻马8o\t238033\n有口福\t238034\n初定\t238035\n波浪纹\t238036\n点点滴滴\t238037\n陈诗漪\t238038\n毒水\t238039\n291页\t238040\n19690424\t238041\n等时间\t238042\n刘丹加\t238043\n金变木\t238044\n54693576\t238045\n一颗一颗\t238046\ngxvfgvc\t238047\n281281\t238048\n性别女爱好男\t238049\nsboy\t238050\n刘欣如\t238051\nc板\t238052\n谢谢谢\t238053\n经过低\t238054\nfhguvnfh\t238055\n张靓相\t238056\n好嘞好嘞好嘞麻烦你\t238057\n雾雾\t238058\n投资点\t238059\n一斤\t238060\n陈家方\t238061\n杜你猜\t238062\n侯珠\t238063\n利息\t238064\n嗯噜噜噜\t238065\nl一\t238066\n杨错\t238067\nnnnnnnnnnnnnnnnnnnnnnnnnnn\t238068\n中王\t238069\nhuangehuatiba\t238070\n10页\t238071\n移民输出国\t238072\n入住率\t238073\n青年节\t238074\nuueie8\t238075\n女斗士\t238076\n一方\t238077\n第四十页\t238078\n一文\t238079\nii版\t238080\n资产重组\t238081\n体制性\t238082\n墨无极\t238083\n无性生殖\t238084\n回熙\t238085\n格调\t238086\n1234669\t238087\nUN3333333\t238088\n你可\t238089\n更聪明\t238090\n134267580\t238091\n交情\t238092\n大彪子\t238093\n巨差\t238094\n舒雅\t238095\n塞坦\t238096\n巨巨\t238097\n张召龙\t238098\n想你为我想你\t238099\n好晚上\t238100\n8.48\t238101\n十六九十六万\t238102\npinpin\t238103\n石子怡\t238104\naddid\t238105\n真人和你一样\t238106\nf4\t238107\n领工\t238108\n郭哥哥\t238109\nto林\t238110\nigxig\t238111\n永远永世\t238112\n领导层\
t238113\n一说下\t238114\n畅通\t238115\n绿灯侠\t238116\n伯爵奶茶司康饼布丁维多利亚奶茶下午茶双人套餐锡兰茶aftertoon奶茶玫瑰露\t238117\n充假\t238118\n埃马努尔森\t238119\n此段\t238120\n华政凯\t238121\n一路\t238122\n蒋达能\t238123\nbhhbbh\t238124\n如若\t238125\n恰逢\t238126\nFUCIXZU\t238127\n酋长\t238128\n梦想化\t238129\n8.7万\t238130\n多丑\t238131\n王泽锐\t238132\n好你个头哇好你个王八\t238133\n余嘉钰\t238134\n触梦\t238135\n赌咪\t238136\n切搞\t238137\n好笑呵呵\t238138\n古籍\t238139\n吴清扬\t238140\n我美你不帅你美\t238141\n才堡\t238142\n改良\t238143\n亲亲一亲\t238144\n优点\t238145\n凤练\t238146\n止跌\t238147\n放任自流\t238148\n好我你\t238149\nNmnnbbbbbnnb\t238150\n苏雪冰\t238151\n水资\t238152\n我是猎人你是猪\t238153\n刘顾楠\t238154\n八千岁\t238155\n白虎\t238156\n朔戏\t238157\n猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪猪\t238158\n武汉漫展\t238159\n周大哥\t238160\n366555588856\t238161\n秘tt\t238162\n锄禾日当土汗滴禾\t238163\n甘肃农大\t238164\n高州\t238165\n习礼仪\t238166\n捣捣\t238167\n一二处\t238168\n吹吹风格\t238169\n撸撸啊撸\t238170\n轻松无忧\t238171\n早上五点\t238172\nnoidont\t238173\n葛天文\t238174\n杨不羁\t238175\nknowwwww\t238176\n君子\t238177\n吕文萍\t238178\n10000000002\t238179\n好啦明天见宝贝\t238180\n10000000000\t238181\n三屈\t238182\n静静你的嘴唇么么的吻你的脖子\t238183\n一二天\t238184\n开心\t238185\n雷石宝贝\t238186\nTuT\t238187\n家常菜\t238188\n黑袜\t238189\n一毛一\t238190\n2U\t238191\n那天开市\t238192\n一展歌喉\t238193\nhihellohi\t238194\n了了了看\t238195\n13655846952\t238196\n竞技\t238197\n见模拟\t238198\n用水\t238199\nFjgxgncbvgb\t238200\n抱子成\t238201\n一毛两\t238202\n预览版\t238203\n辽宁海城师直属侦察连\t238204\n洛麦\t238205\n劳动\t238206\nHikigahic\t238207\n人蛋\t238208\n哈尼玛币\t238209\n劳务\t238210\nI4\t238211\nsheusualy\t238212\n六朝古都\t238213\n罪人\t238214\nfffggggggggfff\t238215\n高枕\t238216\n587129955\t238217\n喔个\t238218\nIY\t238219\n深奥吧\t238220\n投资额\t238221\n不外乎\t238222\n冯彩虹\t238223\n哥叫叫\t238224\n氯化铝vallum裤头\t238225\n慢慢慢慢\t238226\n陶虹\t238227\nIS\t238228\n呆头\t238229\nIU\t238230\n堡兄弟\t238231\nIV\t238232\nII\t238233\nIH\t238234\nsfvzt\t238235\nIM\t238236\nIL\t238237\nIO\t238238\nIN\t238239\n屈海昆\t238240\nIC\t238241\nIE\t238242\nID\t238243\nIF\t238244\n别来了四天了了了谢了了了了了了了了了了\t238245\nIx\t238246\n蔡先生\t238247\n还爷\t238248
\nIs\t238249\n86556542687536754124\t238250\n胡小闹日记\t238251\nIt\t238252\n脱贫\t238253\ngdm1\t238254\nIi\t238255\n伸臂\t238256\n2G\t238257\nIm\t238258\njufg\t238259\n两居室\t238260\njufh\t238261\nIc\t238262\nIb\t238263\nId\t238264\n蒋粉美\t238265\n读洞察\t238266\n天咪\t238267\n爱国\t238268\n乏不差\t238269\n听我不认字\t238270\n爬山虎\t238271\n王宝军\t238272\n112117119￥112113\t238273\n每亩\t238274\n不想走\t238275\n密信\t238276\n神儿\t238277\n2O\t238278\n奴家好喜欢尔\t238279\n孔敏\t238280\n111585877541556998548945965455588146189997\t238281\n我不要你度秘度秘我恨你\t238282\n度秘度秘度秘度秘tmetmetime\t238283\n厄贝\t238284\n零二零零\t238285\n铜锣\t238286\n斯图加特\t238287\n考拉＝树袋熊\t238288\n诱导\t238289\n時間\t238290\nCARY\t238291\n8888085846765\t238292\n官皮\t238293\n分列\t238294\n蒲莹莹\t238295\n对啊好无聊\t238296\n猫腻\t238297\n明杰\t238298\n绿头\t238299\n大月\t238300\n大有\t238301\n大朗\t238302\n的已\t238303\n凯莉\t238304\n产出比\t238305\n多个\t238306\n大期\t238307\n爱恋\t238308\n大朝\t238309\n贺佐敦\t238310\n7k7k霸王by\t238311\n纪念册\t238312\n奖项\t238313\n大本\t238314\n带给\t238315\n大柳树\t238316\n大朵\t238317\n八八牙\t238318\n发型感\t238319\n爱恨\t238320\n嵩长切\t238321\n1234566788990\t238322\n经济学者\t238323\n吉他吧\t238324\n格局\t238325\n好说话\t238326\n恩不让\t238327\n吴因为\t238328\n那年\t238329\n一体\t238330\n亚莎奇\t238331\n亵玩焉\t238332\n邦德\t238333\n土豪败金女QQ\t238334\n紫外线\t238335\n想事\t238336\n三坊七巷管委会\t238337\n贝念赑\t238338\n答题\t238339\n抗台\t238340\nfchshuw\t238341\n女儿河\t238342\n元力\t238343\nSHOW\t238344\n警力\t238345\n客来了我先走\t238346\n我在梦里哪你快点啊战龙\t238347\n12yx\t238348\n15分钟\t238349\n切才怪\t238350\n怒气\t238351\n空乘MM\t238352\n说话算话\t238353\n24802880\t238354\n152705970\t238355\n49岁\t238356\n6点46点\t238357\ní￥\t238358\n卧春\t238359\nbbn\t238360\n号律不乱\t238361\n最新\t238362\n苏仪\t238363\n周播\t238364\nwhata\t238365\n容易\t238366\n姨性\t238367\n2m\t238368\nGhgdfff\t238369\n极短\t238370\nbbj\t238371\n到不得了\t238372\nhvceyi\t238373\n猪婆\t238374\nmylistial\t238375\n干嘛啦啦小魔仙\t238376\n炎宁丸\t238377\n2k\t238378\n酸甜\t238379\n飞度仔\t238380\n我的心\t238381\ncnssm\t238382\n两万瓶\t238383\n1lx\t238384\n勉字\t238385\nbbc\t238386\n拥吻\t238387\n熊出没有熊大熊二还有红还有光头强你\t238388\n从生\t238389\nformeand\t238390
\nbbb\t238391\n山穷水复疑无柳暗花明又一村\t238392\n壮有\t238393\n奉旨\t238394\n何丽\t238395\ndetermine\t238396\n无袖\t238397\n全假\t238398\nfayear\t238399\n诱杀\t238400\n杨过会\t238401\n戒心\t238402\n何不\t238403\n刘家豪\t238404\nFcf\t238405\n粑粑粑粑小仙番\t238406\n长太难看\t238407\nwhatn\t238408\nODH\t238409\n骄傲\t238410\n三七气\t238411\n纳力\t238412\n李小丽\t238413\n5973288576\t238414\nisjcjf\t238415\n咖啡屋\t238416\n我我我我我我我我\t238417\n3wwv\t238418\n刃料+切割液\t238419\nniguy\t238420\n刘四喜\t238421\n家羊\t238422\n1月3号\t238423\n360块\t238424\n为宝贵重如山\t238425\n拇指狗\t238426\ngyjvDf7h9\t238427\n嗯呐爱你爱你\t238428\n天龙\t238429\n无得行\t238430\n品格\t238431\n外話\t238432\n曹曦月\t238433\n黑色幽默\t238434\n家群\t238435\n港霉\t238436\n骚猪\t238437\n千山一碧\t238438\n健康教育\t238439\n高瞻远瞩\t238440\n求假\t238441\n52538800000000000\t238442\n想错恭喜\t238443\n数一亿元\t238444\n超能力\t238445\n多音字\t238446\n甲袋\t238447\n真怜香惜玉\t238448\n搅拌棒\t238449\n花草\t238450\n心里的人\t238451\n真好吃\t238452\n50块\t238453\n动感\t238454\n踩蝴蝶他是你的人\t238455\nlaghanaghigghhguuhu\t238456\n肉战\t238457\nyyvgj\t238458\n号明\t238459\n蟹蟹蟹\t238460\n业员\t238461\n因为\t238462\n举家\t238463\n度秘大头儿子小头爸爸主题曲\t238464\nwrrty\t238465\n哎呀我的天全世界\t238466\n15851623237\t238467\n米芝莲\t238468\n4.8几\t238469\n开雨\t238470\n中恒集团\t238471\n训练\t238472\n小星星\t238473\n零五趟\t238474\n工司\t238475\n室内乐\t238476\nKatty\t238477\n西兰\t238478\n笑一会\t238479\n只有一个你\t238480\n事数\t238481\n三比九\t238482\niuviv\t238483\n溪舟\t238484\n后元\t238485\n正骨\t238486\n始边\t238487\n警务\t238488\n腿短手\t238489\n工口\t238490\n一季度\t238491\n挫逼\t238492\n事故\t238493\n油壶\t238494\n李春林\t238495\n好吧继续收回片\t238496\n眼大\t238497\n郝云鹏\t238498\n嗯成\t238499\n逛商场\t238500\n工友\t238501\n咻齿\t238502\n静怡\t238503\n北京八一儿童医院\t238504\n索爱\t238505\n永不相见\t238506\n惨忍\t238507\n富足\t238508\n好像样\t238509\n树菇\t238510\n小马宝莉之采虹英镑还要么\t238511\n四一零\t238512\ngitnfhsyfm\t238513\n公号\t238514\n描画\t238515\n530323199505180728\t238516\n书人\t238517\n公司\t238518\n黄健翔\t238519\n孙丹华\t238520\n下一秒钟\t238521\n董露美\t238522\n男孩你的女你的声音是女孩\t238523\ndoskdn\t238524\n风舞\t238525\n男女裙\t238526\n说困\t238527\n男女装\t238528\n阿拉索刚\t238529\n秦广媛\t238530\n惨念\t238531\n儿媳妇儿媳妇\t238532\n洪中\t238533
\n刘淼\t238534\n地球连环震\t238535\n过不去\t238536\n先森\t238537\n输得起\t238538\n12355688987152005424556\t238539\n辨识度\t238540\n閉嘴\t238541\n乖肉麻\t238542\n熊猫海来\t238543\n奴仆\t238544\n出逃\t238545\n有余而力不足\t238546\nxdan\t238547\n189189\t238548\n追崇\t238549\n姑妄言\t238550\n5658785698\t238551\n植物大战僵尸初三\t238552\n联谊赛\t238553\n有害羞\t238554\n你是猪吗你是猪么猪猪猪猪猪猪猪\t238555\n厕十八变\t238556\nhbhnj\t238557\n千纸喔鸟\t238558\nyfuyh\t238559\n剖鱼\t238560\n走一步儿\t238561\n四分钟\t238562\n小招\t238563\n汉子女\t238564\nniganmane\t238565\n没心没\t238566\n音速\t238567\n说出事\t238568\nnbc\t238569\n脾气暴\t238570\nnba\t238571\nlbcl\t238572\nnbd\t238573\n珞珈山\t238574\n笑靥\t238575\n三百六十二块\t238576\n收藏家\t238577\n回归\t238578\n唉真是的不要脸的度秘\t238579\n669u9个\t238580\n特别那你面\t238581\n115SM\t238582\n土包子\t238583\n无义怪\t238584\n心里美滋滋\t238585\nbhiphotosbaiducomxiaodupicitema686c9177f3e6709c0d5dd8b3cc79f3df9dc55e8jpg\t238586\n李媛\t238587\nzjlc\t238588\n德小闹\t238589\n15347987826\t238590\n龟骨\t238591\n苟子\t238592\n亚瑟\t238593\n第几版\t238594\n这个人家\t238595\n雖然其實\t238596\n阿穎\t238597\n油门\t238598\n嗯不瞒\t238599\n卡翅\t238600\n悄悄悄悄悄悄悄\t238601\nDSLR\t238602\nsbc\t238603\n受给\t238604\nhxnd\t238605\nvyfvkfsj\t238606\n明天十二点\t238607\n关我吊\t238608\n卡西龙\t238609\n八多岁\t238610\n演一出\t238611\n24时二时四十\t238612\n激扬\t238613\n亚比迪\t238614\n别闹偶\t238615\n到家\t238616\n十一名\t238617\nsbd\t238618\nchiphotosbaiducomxiaodupicitem8c1001e93901213f9a192fa853e736d12f2e959fjpg\t238619\nhi波斯得\t238620\nvovoyou\t238621\n逃避\t238622\nnolonger\t238623\n三十五页\t238624\n尼码\t238625\n水浒在点球\t238626\n神番\t238627\n齐家者\t238628\n秋水仙素\t238629\n恩拜\t238630\n咯果酱\t238631\n装模作样\t238632\n洪晓涵\t238633\n吴哈哈哈\t238634\n春辉\t238635\n而的\t238636\n身亡\t238637\n胡一\t238638\n重心\t238639\n喔弥陀佛\t238640\n琳嫣\t238641\njictrfh\t238642\njgzgzv\t238643\n何惧来\t238644\n吗生希\t238645\n余海\t238646\n再次\t238647\n皇室片\t238648\n我诺\t238649\n无病\t238650\n朴宝剑\t238651\n电报\t238652\n10月15\t238653\n10月14\t238654\n约架\t238655\n10月12\t238656\n10月11\t238657\n泉州火车站\t238658\ncsfenaxnfensffen\t238659\nBryan\t238660\n24155995\t238661\n御医\t238662\nensksosk\t238663\n会儿三十\t238664\n不动步\t238665\n你好多美
电视\t238666\n沈城\t238667\n赌石\t238668\n余浅\t238669\n说来说\t238670\nupwda\t238671\n受伤了妹妹\t238672\n#檬\t238673\n请别这样子\t238674\n试练\t238675\n吴阳阳\t238676\n我的天我比\t238677\n美美人\t238678\n洛子峰\t238679\n4466798\t238680\nhello小乖乖\t238681\n爱人不乱\t238682\n正月\t238683\n青联\t238684\n爱好者\t238685\n劲锐\t238686\nｂｅｂｅｉｌ\t238687\n日出\t238688\na4函数ykxb\t238689\n银河奥特曼哟我是我是大闹\t238690\n5.1\t238691\n5.2\t238692\n多米还度秘\t238693\n喝饭\t238694\n5.8\t238695\n偏偏我骗骗你的我没有\t238696\nPornhub\t238697\n任艰巨\t238698\n逗硬\t238699\n善价\t238700\n1950年代以后\t238701\n有麻\t238702\n最出色\t238703\nlttl\t238704\n机器类\t238705\n瓜烤地瓜土豆炖豆角猪肉排骨\t238706\n丁俊豪\t238707\n别蒙\t238708\n荟诗\t238709\n当你可塑性高蛋白\t238710\n神猫窝\t238711\nRenner\t238712\n白凯博\t238713\n摄口\t238714\n漏嘴\t238715\nwwwqq\t238716\n喂猪\t238717\n中集集团\t238718\n不饶\t238719\n精美\t238720\n旷滺\t238721\n一个十M\t238722\n3698521470\t238723\ngjjgjp\t238724\n拳师\t238725\n我真的美梦度\t238726\n施荟城\t238727\n孔夫子\t238728\n18711209511\t238729\n五万6800几\t238730\n六万多\t238731\n陈赫帅\t238732\n272元\t238733\n闵彩媛\t238734\n愚弄\t238735\n泗门\t238736\n员工\t238737\naratooa\t238738\n铁公鸡\t238739\n声乐\t238740\n冥婚\t238741\n口孜\t238742\n天见\t238743\nnannie\t238744\n献\t238745\n爱的释放嗯的闹\t238746\n多啦多啦\t238747\n口字\t238748\n口子\t238749\n索赔\t238750\n886888663\t238751\n秘我不想\t238752\n最大一\t238753\n急功近利\t238754\n沧海一粟梦红尘，人间半世解情缘\t238755\n这样气\t238756\n天红网\t238757\nreturn\t238758\nllloeneny早742336\t238759\n黄满云\t238760\n依萍\t238761\n蓬南\t238762\n孙来也\t238763\n西湖边\t238764\n礼貌相对\t238765\n平安证券\t238766\nx岁\t238767\n古巨顾\t238768\n0438\t238769\nRae\t238770\n扭扭捏捏那你呢那你呢那那那\t238771\n下一个周日\t238772\nGO0dbye\t238773\n以假乱真\t238774\n拉丁吧\t238775\n人民大会党\t238776\n诸葛亮\t238777\nRas\t238778\n刘亚琦\t238779\n狮男\t238780\n西边\t238781\n北极圈\t238782\n1310.1\t238783\n探花\t238784\nwooall\t238785\n猾\t238786\n瑟曼\t238787\n1000万套\t238788\n飞翼\t238789\n鲁士\t238790\n大顺\t238791\n不不不我喜欢\t238792\n紫苏丝\t238793\nWhatyournema\t238794\n呢阮\t238795\n王琳瑶\t238796\n韶关市\t238797\n交流\t238798\n在哪快\t238799\n呃幺八六二二九三二二八幺\t238800\n故事情\t238801\n玉州区\t238802\n飞翔\t238803\n控制\t238804\n寂寞空亭春欲烷\t238805\n董存瑞\t238806\n死床\t238807\n
有线\t238808\n北岸花园球馆\t238809\n互赞\t238810\n有约\t238811\n聪明结合\t238812\n拉米雷斯\t238813\n到时\t238814\ndonn\t238815\n吗部\t238816\n十城市\t238817\n法克法克法克法克\t238818\n19:33\t238819\nhelloiuok\t238820\nZachar\t238821\n相亲宴\t238822\n自然规律\t238823\nvjvkj\t238824\n仁王雨婷\t238825\n杨佳曼\t238826\n六一岁\t238827\n打扮演\t238828\n黄石人民广播电台新闻广播\t238829\n烤肉卷饼\t238830\n爱霞\t238831\n巨卡\t238832\n金白金\t238833\n九喽\t238834\n感冒慢慢\t238835\nSync\t238836\n了不起的盖茨比\t238837\nOscurasecond\t238838\n二跤\t238839\n朱诸珠\t238840\n王朝阳\t238841\n文若薄\t238842\n关口\t238843\n七七大\t238844\n81912\t238845\nweuux\t238846\n京湘登山队\t238847\n暴政\t238848\nndbdhsjd\t238849\naunce\t238850\npmgm\t238851\n巴蜀\t238852\n度大沥\t238853\n和码\t238854\nBBQ\t238855\nSOHO小报\t238856\n四众\t238857\n自卑感\t238858\nconditionometintfifa秘cysty\t238859\n习性\t238860\n没问\t238861\n俗子\t238862\n19540400\t238863\n时代科技\t238864\n只人\t238865\n分析员\t238866\n老總\t238867\n东秘\t238868\n几款\t238869\n子能\t238870\ngogiystdkg\t238871\n柱面\t238872\n药瓶\t238873\n么司\t238874\n几次\t238875\n三年间\t238876\nMatthews\t238877\n毒库\t238878\n阳仔\t238879\n老了大\t238880\n就要\t238881\n201岁\t238882\n#扬子\t238883\n小胖子\t238884\n中心点\t238885\n汇商联\t238886\n毒度\t238887\n善道\t238888\n黄圣依\t238889\n小斯尼奥\t238890\n什粪\t238891\n独尊节\t238892\n荔湾区三元坊小学\t238893\n只了\t238894\n7斤\t238895\n必修课\t238896\n哇塞尔豪\t238897\n软绵面\t238898\n秘不知道\t238899\n扫地机器人\t238900\n缸子\t238901\nlkbbjjvxfyuio\t238902\n腕表\t238903\n私车\t238904\njgfv\t238905\n投身\t238906\n免留恋\t238907\n红懂了你是\t238908\nsrgfd\t238909\n宋静怡\t238910\n助威\t238911\n西密道\t238912\n金妍\t238913\n人味儿\t238914\n34期\t238915\n加尔\t238916\n久有\t238917\n冬夜我是你的主人\t238918\n留系\t238919\nqosh\t238920\nin机\t238921\n2234项\t238922\n鬼车\t238923\n周只菩\t238924\n叹号\t238925\n婆女\t238926\n都市华\t238927\n聊血\t238928\n超能力者\t238929\n有你有我\t238930\n前世\t238931\n聊行\t238932\n很累\t238933\n粵语\t238934\n李有染\t238935\n氯化铜\t238936\n136713936009\t238937\n孩童\t238938\n美梦成真\t238939\nl几天\t238940\n一个78块\t238941\n聊补\t238942\n芦浦\t238943\n几箩谷\t238944\nsheacar\t238945\n真谛\t238946\n认同感\t238947\n喝喝\t238948\n聊表\t238949\n谢别客气\t238950\n香蕉\t238951\n素海\t238952\n夜凯\t238953\n
回家作业\t238954\n古丽娜扎\t238955\n我不喜欢你我讨厌你我喜欢你\t238956\n爱通透\t238957\n想着\t238958\nfong\t238959\n海峡\t238960\nDUE\t238961\n轮值\t238962\n二零四\t238963\n传了吗\t238964\n三千尺\t238965\n电杆\t238966\n勇闯马路\t238967\n誓死如归\t238968\n蝦米沒\t238969\nl8888\t238970\n枣树\t238971\n大跌眼镜\t238972\n我不陪你了你真\t238973\n纳税厘米\t238974\n恼火\t238975\n好啦好啦好啦爱你\t238976\n嘉宾们\t238977\n三分之五十\t238978\n枣核\t238979\n3分\t238980\n萦绕\t238981\n层层叠叠娇\t238982\n9一分\t238983\n讲声\t238984\n哭了解\t238985\ncocdg\t238986\n阴历\t238987\n854387250\t238988\n我喜欢侠\t238989\n耶嘿\t238990\ncrrtbyb\t238991\n小工嘉豪\t238992\n圣贤\t238993\n二十九个月\t238994\n88额\t238995\n兴迷\t238996\n大皮\t238997\n亲医创\t238998\n客串\t238999\n高管\t239000\n我是你家花花\t239001\n忠邦\t239002\n视作\t239003\n企业管理者\t239004\nars\t239005\nart\t239006\n实体店\t239007\n六一班\t239008\n至人\t239009\n郭小红\t239010\nara\t239011\n至交\t239012\n艺术家们\t239013\nard\t239014\nare\t239015\n江西省\t239016\narh\t239017\nari\t239018\narj\t239019\nmatio\t239020\n兰菀\t239021\ndumi\t239022\n竟无\t239023\n12500\t239024\n背一背呗背呗背呗背呗背\t239025\n谢谢你好度秘我很高兴认识你\t239026\n213454\t239027\n地地覆天翻\t239028\n何冰蜀黍偶\t239029\n去皮\t239030\n齐心协力\t239031\n开场舞\t239032\n恩加海\t239033\n体力不支\t239034\n至于\t239035\n扬言\t239036\n地沟油炸油条\t239037\n同住\t239038\n中信信托\t239039\n疱耳风\t239040\nar1\t239041\n棟樑原來\t239042\n安这啦\t239043\n99999999999999999999999999999999999\t239044\n时候儿\t239045\n最后一战\t239046\n一个一个多\t239047\n刁民\t239048\nndjj\t239049\n度秘谢谢\t239050\n运行商\t239051\n容易相爱难\t239052\n一情\t239053\n└o┘\t239054\n炸鸡翅\t239055\natry\t239056\n片里\t239057\n江苏省纪委\t239058\n杀人的快点啊双色球\t239059\nc\t239060\n133035412\t239061\n15.8%\t239062\n第十五\t239063\n名片盒\t239064\n盆友\t239065\n为你说\t239066\n要死了我不还\t239067\n15.83\t239068\n不秘度秘度秘\t239069\n酸碱\t239070\n九十九六六八\t239071\n哦张\t239072\n思一婆\t239073\n洗话\t239074\n雷暴\t239075\nzmrs\t239076\n方言\t239077\n刘鸿生\t239078\n来校\t239079\n端上来\t239080\nviguuffigiff\t239081\n大失所望\t239082\n乐去了你好呀你好呀\t239083\n廉耻\t239084\n闪闪红\t239085\n征询\t239086\n人境\t239087\n二三线\t239088\njgmjgj\t239089\n蘸\t239090\n风风雨雨\t239091\n平江医院\t239092\nshixind\t239093\njtagd\t239094\n25届\t239095\n狼友\t239096\n
52966655665\t239097\n王鸿飞\t239098\n25层\t239099\n屡教不改\t239100\n拟女\t239101\n随性而出\t239102\n帅人\t239103\n哈意思\t239104\n3556834418753389\t239105\n感文lj斤欠芸\t239106\n三十三\t239107\n三十下\t239108\n中共借抗日擴編八路軍與新四軍\t239109\n會放過\t239110\n我我我我我我我哦\t239111\n三十一\t239112\npknbv\t239113\n1860\t239114\nr感觉吧痴汉\t239115\n111cc\t239116\n伊凡\t239117\n董小丽\t239118\n桑羽婷\t239119\n1868\t239120\n秋游\t239121\n一只1岁\t239122\n熊正超\t239123\ndwwwwwwwwww\t239124\n三十个\t239125\n明月y\t239126\n把酒整\t239127\nabok\t239128\n生物学名\t239129\n动画化\t239130\n林州\t239131\n老跑调\t239132\nskk\t239133\nbehavior\t239134\n的们\t239135\n卡夏尔\t239136\n14888380\t239137\n佳贝\t239138\n同里湖渡假村\t239139\n鲤鱼是说爱我要达\t239140\n傲之雷神\t239141\n未完\t239142\n神雕吃大四栋\t239143\n主干\t239144\n真的爱\t239145\n好的我好\t239146\n安沛\t239147\n大贝\t239148\n王霞儿\t239149\n姓王\t239150\n公元138年\t239151\n亮晶晶南天\t239152\n飞涨\t239153\n北京地方法院检察署\t239154\n朱咪\t239155\n64名\t239156\n郝洲\t239157\n沉睡\t239158\n吴建恒\t239159\n无可争辩\t239160\n俞嘉欢\t239161\n大货\t239162\n保定市佛教协会\t239163\nJiffy\t239164\n13072458886\t239165\n大贼\t239166\n刘华\t239167\n椹川大道南\t239168\n思捷\t239169\n孟伟\t239170\n大贵\t239171\n是不是那样子\t239172\nfhozu\t239173\nSTYLE\t239174\n银铃\t239175\n盱眙\t239176\ndthkkkkkk\t239177\n亚麻籽\t239178\nTdyrhrte\t239179\n过庭\t239180\n段段\t239181\n4698685554766247566\t239182\n因特网\t239183\n说人话滚蛋\t239184\n敬业额\t239185\n缩话\t239186\n夏日熊\t239187\n胡逸\t239188\nfcrg\t239189\n东北银\t239190\n驯鹿\t239191\n胡适\t239192\n杨艺晴\t239193\n扣鼻子\t239194\n600107\t239195\n神似夜寂寥\t239196\n泰民\t239197\n江豚\t239198\n指甲痛\t239199\n这么辈子\t239200\n菠萝\t239201\n半醉\t239202\n尔豪\t239203\n下身体\t239204\nwks\t239205\n死狗熊\t239206\nhoojhjmmk\t239207\n刘咱\t239208\n不倒班\t239209\n凤巢58p\t239210\n三十万元\t239211\n溜之大吉\t239212\n秦亚\t239213\nｌｅａｌｙｓｔｏ\t239214\n脑筋\t239215\n祝巴萨\t239216\nchisquw\t239217\nJWJW\t239218\n质量技术监督局\t239219\n3G视频监控系统\t239220\ngjdydti\t239221\n小麻豆\t239222\n在笑\t239223\nVER#\t239224\n刘博\t239225\n崔依珂兰锦钢\t239226\n好徒儿\t239227\n公平\t239228\nayte\t239229\n新生字\t239230\n阔耐\t239231\n病毒集团\t239232\n1300块\t239233\n虾米勒\t239234\n44蔡\t239235\n伍皓\t239236\n阿更廷\t239237\n美容美发\t239238
\n李娜娜\t239239\n牛吧\t239240\n完人\t239241\n谢工\t239242\n几巴\t239243\n完了\t239244\n奥批发部vv要不然和哦欺骗1在哪呢想你\t239245\n董亚阿良\t239246\n完事\t239247\n爱我你就陪陪我爱我\t239248\nshabi\t239249\n27秒钟\t239250\n2.二\t239251\n真够意思\t239252\n一千支\t239253\n雨辰\t239254\n再嫁\t239255\n厦门\t239256\n红化石\t239257\n说死人\t239258\n穷困\t239259\nsk8\t239260\n素爱\t239261\n李狗\t239262\n利是\t239263\n苏永超\t239264\n猪头型\t239265\n你是谁四大饼\t239266\n双沟\t239267\n10次\t239268\nhrhherf\t239269\n白晗\t239270\npoiu\t239271\n张元凤\t239272\n说死了\t239273\n10款\t239274\nB君\t239275\n中等生\t239276\n太黑心\t239277\n惟有\t239278\n奥立克\t239279\n心力衰竭\t239280\nfuguingchipopp\t239281\n宇宝贝\t239282\n3200700371\t239283\n侯悦悦\t239284\n军力\t239285\n然那\t239286\n中央统战部\t239287\n张亚璇\t239288\n我说东你说西真是牛头不对马尾\t239289\n中央一套\t239290\n少数幸运儿\t239291\n变动\t239292\nficyou\t239293\n话术\t239294\n始建于\t239295\n松花江\t239296\n兰布拉斯大道\t239297\nTFBo丫s\t239298\nQwerty\t239299\n武堡\t239300\n張永星\t239301\n小不点机器人\t239302\nvre\t239303\n色龙\t239304\n赋读\t239305\n侵\t239306\n侶\t239307\n侰\t239308\n必杀\t239309\n住世\t239310\n便\t239311\n迥然不同\t239312\n一对个\t239313\n侧\t239314\n侦\t239315\n那行我问你\t239316\n侠\t239317\n侣\t239318\n5dj\t239319\n侬\t239320\n哥类\t239321\n16十4\t239322\n5dm\t239323\n５７年\t239324\n住为\t239325\nhcds\t239326\n果孑\t239327\n体色\t239328\n圣城\t239329\n供\t239330\n旅行家\t239331\nThyshufuo\t239332\n來\t239333\n客套\t239334\n一百吨\t239335\n侃\t239336\n侍\t239337\n寒门年\t239338\n一对一\t239339\n臭臭鼠\t239340\n还子虚巫\t239341\n例\t239342\n僵尸道\t239343\n好啦好啦好啦好\t239344\n送机\t239345\n小组合\t239346\n5155\t239347\n伤不七国\t239348\n别骗我行不行\t239349\n鄂州火车站\t239350\n聊办\t239351\n发炎\t239352\n拜倒\t239353\n尿床\t239354\n贾俊良\t239355\n趋直入\t239356\n几多岁\t239357\n徐赫\t239358\n18岁\t239359\n张金蛋\t239360\n发点\t239361\n二零三六\t239362\nLeon\t239363\n北京）科技有限公司\t239364\njcjcj\t239365\n给他吧\t239366\n尿座\t239367\nwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwewwwwwwwwwwwwwwwwww\t239368\n猪婆猪婆中枢佛市中心\t239369\ncalled\t239370\n1张\t239371\n肯尼亚\t239372\n浪浪\t239373\n看见了解到\t239374\n赵内容\t239375\nfegf\t239376\n长肥\t239377\nfegd\t239378\n日屌\t239379\n追不上\t239380\n手抓饼\t239381\n菜牙\t239382\nstuvw\t
239383\n读完\t239384\n万元\t239385\n顺势\t239386\n潮讽\t239387\n蒙城\t239388\n澳大利亚大学\t239389\n权贵们\t239390\n汪昱\t239391\nformein\t239392\n跨国企业\t239393\n22224545485\t239394\n大姨家恩\t239395\n心灰意冷\t239396\n黄的头啊子你个头啊靠你的头\t239397\n黄冈高速\t239398\n真心狠闲\t239399\nueudjdjzjsjxhshxuxudhxhxusjxjhcuzidjhxjxx\t239400\n郁金\t239401\n谷天宇\t239402\n舍不舍不能\t239403\n不若\t239404\nroboadvisers\t239405\n神秘的秘\t239406\n黑嗯\t239407\n覆盖式\t239408\n阿格朗\t239409\n李淑冰\t239410\n想睡\t239411\n不再寂寞\t239412\novpj\t239413\n对调\t239414\n新春佳节\t239415\n熊出没果\t239416\n2863698934\t239417\n一点半点\t239418\n18195639563\t239419\n秘聊\t239420\n医生们\t239421\n400多万\t239422\n赵丹丹\t239423\n魔晶猎人\t239424\n甄寰\t239425\n别闹闹\t239426\n药片\t239427\n4g网络\t239428\n囲一\t239429\n我想没有人了陪我聊会天\t239430\n富含\t239431\n再说说\t239432\n路由器\t239433\n纯度\t239434\n悟天\t239435\n吵架\t239436\n做生意换来\t239437\n圈票\t239438\n欧鼻屎\t239439\n二三零六八二一九七二\t239440\n天葬\t239441\nnote2\t239442\nnote4\t239443\n嘉賓\t239444\nAV16b\t239445\n100集\t239446\n雷李梅\t239447\n二零学校\t239448\n刷票\t239449\nustin\t239450\n启铭\t239451\n719\t239452\n718\t239453\n717\t239454\n诺木\t239455\n715\t239456\n714\t239457\n713\t239458\n712\t239459\n字成句\t239460\n拐角\t239461\n保单\t239462\n尝试\t239463\n37个\t239464\n辈\t239465\n泰勒斯威哥\t239466\n信达期货\t239467\n强强联手\t239468\n错就好\t239469\n凤鸣山\t239470\n新一版\t239471\n保卫\t239472\n熊熊燃烧\t239473\n风风光光\t239474\n机器纹\t239475\n甲天下\t239476\nhgggjm\t239477\n方罢\t239478\n日臻\t239479\n谱点\t239480\n埃尔\t239481\nhejejhf\t239482\n蕴含\t239483\ntygffy\t239484\n晨晨\t239485\nsouolu\t239486\nrri\t239487\nsdfflckckcf\t239488\n鲁莽\t239489\n天快了了了了给你\t239490\n二只\t239491\nrrd\t239492\nrrb\t239493\n报复\t239494\n二台\t239495\n二史\t239496\ntmjtkw\t239497\n二号\t239498\n呼吁\t239499\n第23次\t239500\n驴子\t239501\n中韩\t239502\n三国演义\t239503\n金源国际\t239504\n呼吸\t239505\n李子峰\t239506\n电脑吧\t239507\n破烂度\t239508\n二叉\t239509\n7-10号\t239510\n天也热\t239511\n挫莫\t239512\n二发\t239513\n二叔\t239514\n天天向上快乐大本营\t239515\n13303年\t239516\n官大\t239517\ncubux\t239518\nddfgh\t239519\nxiū\t239520\n适不适宜\t239521\n冰期\t239522\n圣洛利亚\t239523\n不苟\t239524\n慈禧+八国联军\t239525\n从没\t239526
\n断脚\t239527\n冰月\t239528\n山棒\t239529\n小米娘\t239530\nshesJOhns\t239531\n22.7亿元\t239532\n145yg一\t239533\n哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥哥\t239534\nvicma\t239535\n高沙\t239536\n第九点\t239537\n摘记\t239538\nGONNA\t239539\n刘禹博\t239540\n131711972860\t239541\n全交\t239542\n怨骂\t239543\n孙艺铭\t239544\n晏阳\t239545\n殒落\t239546\n夏煊\t239547\n似情\t239548\n萌萌哒我好开心\t239549\n斯密花\t239550\n所以说呢\t239551\n陈俊扌\t239552\n走走开走\t239553\n饭来张口衣来\t239554\n大等\t239555\nfusjdldkfpdhsoanrodnfbgldjrlrqmtpxrkdnekfkfjdkgdishnykfubflfhsjdhmfobyhsujyfdueieejculfmydkf\t239556\nCcg\t239557\n猎叫\t239558\n半个小时\t239559\n大筵\t239560\n归入\t239561\n突然之间\t239562\n贴甲\t239563\n英格兰剑桥郡\t239564\n所以说呀\t239565\n发小久违\t239566\n物质\t239567\n接过来\t239568\nME722\t239569\n十来天\t239570\n大总受\t239571\n五小福变四小福\t239572\n联婚\t239573\n施益农\t239574\n骏哥\t239575\nfggshsi\t239576\n羞羞\t239577\n辣子鸡丁\t239578\n叭婆\t239579\n一千万一百万十一万\t239580\n煎包\t239581\n谢谢修\t239582\n死生之地\t239583\n一向前\t239584\nsowkc\t239585\n碘液\t239586\n麻利麻利\t239587\n扶贫\t239588\n凝心\t239589\nq1748\t239590\ngetting\t239591\n马天宇\t239592\n出货量\t239593\n主题词\t239594\n李冰寒\t239595\n扎西\t239596\n马万岁\t239597\n三千四百五十百八十\t239598\ntat\t239599\n几百回\t239600\n我的夫人\t239601\n笑诒\t239602\n1827557946\t239603\n二号儿\t239604\ntar\t239605\n瓦在东\t239606\n笑诘\t239607\n一跳\t239608\n笑话\t239609\n阴谋论\t239610\ntaz\t239611\n廖言岩\t239612\nsdgg\t239613\n剩牛女\t239614\ntam\t239615\n黄土地坡\t239616\ntao\t239617\n郝文宇\t239618\n想着我\t239619\n笑诰\t239620\n农行ABC\t239621\n笑说\t239622\n2008年7月28日\t239623\n罗比凯\t239624\n好朋羽\t239625\n工作服\t239626\n金地\t239627\n饶舒格\t239628\nfudd\t239629\n好极了\t239630\nPearls\t239631\n好的我的\t239632\n便利\t239633\n10月15日\t239634\nOnnow\t239635\n金圾\t239636\n反句话\t239637\nta4\t239638\n12121213\t239639\n12121211\t239640\n乐大猫猫猫\t239641\n遗留物\t239642\nffyribh\t239643\n23号晚上\t239644\n有些事\t239645\nta9\t239646\n双色饭\t239647\nBANG\t239648\n提莫桓宇\t239649\n桥头\t239650\n总结句\t239651\n萨耶摩\t239652\n一个十五岁\t239653\n大华饭店\t239654\n13579\t239655\nwt\t239656\n桑园小学\t239657\n一盘棋\t239658\n那个了我是你姐\t239659\n低语\t239660\n道具\t239661\n三普\t239662\nlookIit\t239663\n
cccccc\t239664\n限令自首通告\t239665\n中心城市绕城高速公路\t239666\n仨岁\t239667\n引子\t239668\n没谈\t239669\n人人生\t239670\n快本\t239671\n31个\t239672\n┑\t239673\n┐\t239674\ngttg\t239675\ngttj\t239676\n┘\t239677\n列表\t239678\n五月节\t239679\n度秘我不要你了我不喜欢你\t239680\n━\t239681\n─\t239682\n┃\t239683\n3岁\t239684\n碧莲\t239685\n海卓\t239686\n┏\t239687\n矮油\t239688\n海南\t239689\n暴炸\t239690\nvcsdx\t239691\n鹿城区\t239692\n正常化\t239693\n┰\t239694\nbyy\t239695\n过金华\t239696\n嘭嘭嘭\t239697\n笔变\t239698\n简而言之\t239699\n1417525\t239700\n快期\t239701\n二十块\t239702\n删不嘛\t239703\n过不过\t239704\n4米\t239705\n笔友\t239706\n006千克\t239707\nfblk\t239708\n还债还债\t239709\n上次手足口\t239710\n方针\t239711\nbya\t239712\n雪\t239713\n18849655889\t239714\n小师妹\t239715\n熊廷挺\t239716\n看了待\t239717\n点沿\t239718\n碍眼\t239719\n茶节\t239720\nSummer\t239721\n阵儿\t239722\njloh\t239723\n扣过\t239724\n怪话传\t239725\n奢华\t239726\nh彩\t239727\n阴谋\t239728\n几那吧vddfggggggvbmmmnbbbbbnnnnnnxxvbbjjmjnnnf\t239729\n来这儿\t239730\n两股\t239731\n术刀\t239732\n1314521527311652233482\t239733\n单飞\t239734\n2AM\t239735\n对联池\t239736\n2AV\t239737\n519709\t239738\n哎呀妈呀你真的太丑了丑到没朋友\t239739\n一万8880ne\t239740\n锅巴肉\t239741\n足\t239742\n没位\t239743\n不着及\t239744\n宣城市\t239745\n上海青\t239746\n8万条\t239747\n你好多度秘\t239748\nhbkno\t239749\n冲浪\t239750\n肉质\t239751\n65998886898888888965789655\t239752\n单子\t239753\n快快快快快快快\t239754\n3200吨\t239755\n24岁\t239756\n杨雨彤\t239757\n棒爽\t239758\n城市别墅\t239759\n幽默波\t239760\n温和\t239761\n涂运\t239762\ntuvifc\t239763\n第一战\t239764\n不過\t239765\n贺州\t239766\n香酥\t239767\n姜帅\t239768\n过人机\t239769\n不遇\t239770\n黄澄澄\t239771\n吃味\t239772\n屁滚尿流\t239773\n黑衣人\t239774\n桌上\t239775\n不道\t239776\nMP5哈喽\t239777\n好时而\t239778\nhggf\t239779\nAIVA\t239780\n吃呀\t239781\n莫子强\t239782\n专线儿\t239783\n好孕\t239784\n李浩楠\t239785\n”们\t239786\n乾坤\t239787\n图象\t239788\n永嘉县\t239789\n李晓丹\t239790\n本赛季\t239791\n几十倍\t239792\n下耍\t239793\n尒行\t239794\n秀敏\t239795\n嘟嘟咪\t239796\n阿姐撸老K\t239797\n适量\t239798\n九月十号\t239799\n傲娇\t239800\n女戒\t239801\n100多\t239802\n阳见阳\t239803\n树树\t239804\n一千一百九十九一千一百九十九\t239805\n番110房改空开那大黑分开\t239806\n哪婚\t239807\n来枫\t2398
08\n11157\t239809\n腊月廿四\t239810\n啦啦\t239811\n裙角\t239812\n论语十二章\t239813\n浴池\t239814\n卖太阳\t239815\n十二只\t239816\n砸伤\t239817\n明天顺便直一下刘海嗯\t239818\n十二号\t239819\n树根\t239820\n阶级斗争\t239821\n有很有意思\t239822\n3月1日4时\t239823\n冬荫功\t239824\n勋勋\t239825\n扭秧歌\t239826\n5点半到6点\t239827\n大不理\t239828\n伊芙丽\t239829\n九毛五\t239830\n诊\t239831\n500885\t239832\n想了想说\t239833\n腊月十五日\t239834\ngcyxufhcgcg\t239835\n童伴\t239836\n明秘\t239837\n18452527080\t239838\n九毛二\t239839\n霸兵\t239840\n度秘你在这块儿陪我聊会\t239841\n我喜欢师\t239842\n算了算了我去上学啦拜拜度秘\t239843\n一跃\t239844\n碰见人\t239845\n拟定\t239846\n错乐事\t239847\n莫干山路\t239848\n竞猜\t239849\n忽然\t239850\n手旁\t239851\n好子\t239852\n楼层\t239853\n冰灾\t239854\n修理部\t239855\n至多\t239856\n小肆\t239857\n泡酒\t239858\n冰灵\t239859\n潮种\t239860\n线柄\t239861\n诞\t239862\n小肚\t239863\n威威\t239864\n夜猫子\t239865\n小肖\t239866\n王八创\t239867\n王八刚\t239868\n朱千曲\t239869\n客家族\t239870\nmesomeEnglish\t239871\n爱的午餐\t239872\n诚\t239873\n小肥\t239874\n陈宇杰\t239875\n小肠\t239876\n杨欣洁\t239877\nufvod\t239878\n遇到\t239879\n第一天\t239880\n黄河长江\t239881\n囤烟\t239882\n王兆坤\t239883\n圆桌\t239884\n一坛\t239885\n边门\t239886\nuygy\t239887\n刘主人\t239888\n更健康\t239889\n六三零二\t239890\n恫伥\t239891\n泛舟\t239892\ngjjd\t239893\n呢边\t239894\nNsnsns9\t239895\n嫉妒窝\t239896\n高好\t239897\n里香\t239898\n同里古镇\t239899\n厚爱\t239900\nghj\t239901\nnn嗯嗯\t239902\n我求你了爱我一辈子\t239903\njuohy\t239904\n烟花棒\t239905\n最大化\t239906\n钼\t239907\n邓小花\t239908\n23日\t239909\n赛过\t239910\nCha\t239911\n我求求你不要再给我一个儿\t239912\n傲娇小萌受\t239913\n母牛\t239914\nnnnn1\t239915\n秘样\t239916\n非主流\t239917\n亚内容\t239918\n酷圣石\t239919\nwugffh\t239920\n诨\t239921\n几十年前\t239922\n23时\t239923\nUC生活百事通\t239924\n杨苏文\t239925\n瑶柱白粥\t239926\n麦嘎\t239927\n恨不张\t239928\n撒就打\t239929\n两件套\t239930\nhistomity\t239931\n丑丑喽\t239932\n往事\t239933\nfans\t239934\n嘀咕富贵fj\t239935\n吴娟\t239936\n学飞\t239937\n吴威\t239938\ng娃\t239939\n珠光\t239940\nGHFK\t239941\n罗汉果\t239942\ng527529dk\t239943\nzhuowen\t239944\n了没呀\t239945\n姑爷\t239946\nghhvb\t239947\nnnnnz\t239948\n682684\t239949\n花瓣\t239950\n分校\t239951\n三二零\t239952\n垂涎不止\t239953\n像风一样\t239954\n靠靠靠靠几呀\t239955
\n派上用场\t239956\nghhvv\t239957\n77niu\t239958\n万国\t239959\nIrving\t239960\n554844595855855656\t239961\nnnnna\t239962\n路路\t239963\n花瓶\t239964\n懂礼貌知道\t239965\nfddreewrtd\t239966\n楼凤\t239967\n帅就好\t239968\n涡螺\t239969\n械投\t239970\n士林\t239971\n2e二\t239972\n黄丽鸟\t239973\n颞叶\t239974\n唯爱SJ13]110522利特TW\t239975\n说灯\t239976\n谢谢你好爱\t239977\n唱歌会\t239978\n一票\t239979\n阿韦洛亚\t239980\n吼吼腻味你是狗\t239981\n你好哇你好哇好哇好哇好哇好哇好哇\t239982\nebe\t239983\n白毛\t239984\n丰田路\t239985\n宋我\t239986\n席上\t239987\n零三零幺\t239988\n残缺\t239989\n黄本\t239990\n给们\t239991\n木匣子\t239992\nChx\t239993\n王复者\t239994\n熊仔\t239995\n胡静文\t239996\n缅怀\t239997\n7789888808708\t239998\neby\t239999\n去年12月份\t240000\n小米度\t240001\n漆黑\t240002\n3174708832\t240003\n张嘉译\t240004\n887415157\t240005\n第几部\t240006\n厨卫\t240007\n郑维维\t240008\n属于我一个人\t240009\n亏本心有\t240010\n给他\t240011\n放空自已\t240012\n徐媛康\t240013\n鳞片\t240014\n给付\t240015\n冯钟璞\t240016\nGKVKKV\t240017\nsfghhnbhhhghhjjhfgggf\t240018\n错从新\t240019\n13033966485\t240020\n命行\t240021\n说跟\t240022\n狮尸\t240023\n渔火\t240024\n孟凯\t240025\n负土\t240026\n善辩\t240027\n坏孩子\t240028\n蒙福临\t240029\nshiyigeren\t240030\n电子琴\t240031\n明天东山总得天\t240032\n南科大\t240033\n追完\t240034\n音箱\t240035\n癶燃\t240036\n湖南话\t240037\n扶养\t240038\n康马\t240039\n谢依静\t240040\njiDNd\t240041\n扶助\t240042\n好爱好爱好死\t240043\n十全十美\t240044\n老狗\t240045\n忙音\t240046\n纵观\t240047\n庄禹\t240048\njbjj\t240049\n2011111\t240050\n大儿\t240051\n纵览\t240052\n瑞恩\t240053\n不要嘛轻一个我去\t240054\n萌大萌大萌大萌哒哒哒\t240055\n马甲醛固酮\t240056\n段佑斯\t240057\n老狼\t240058\n嗯howareyou\t240059\n一会儿大一会儿\t240060\n狂片\t240061\nDDF\t240062\n说我女\t240063\nBeatBox\t240064\npahket\t240065\n老狮\t240066\n说不过去\t240067\n大儒\t240068\n慕尼黑雍和宫\t240069\n男生女\t240070\nfffffffff\t240071\n我是可爱的代表你是猪\t240072\n尿酸血\t240073\n２４岁\t240074\n跳蚤\t240075\n小拽\t240076\n国雅\t240077\n当当当\t240078\n浙江万里学院\t240079\n3kyou\t240080\n资产者\t240081\n党团\t240082\n许个愿\t240083\n错行\t240084\n99ri\t240085\n张艳雪\t240086\n住宅用地出让金\t240087\n相融\t240088\n无头骑士异闻录\t240089\n没哪儿\t240090\n运紧\t240091\n潘微微\t240092\n小拘\t240093\n搜即\t240094\n独裁者\t240095\n3000万\t240096\n提一下\t
240097\n癫痫病\t240098\n红星美凯龙\t240099\n蒙在鼓里\t240100\n狗屎体诗\t240101\n优则\t240102\n四分五裂\t240103\n歌腿\t240104\nhttpdhiphotosbaiducomxiaodupicitem279759ee3d6d55fb917248536a224f4a20a4dd0ajpg\t240105\n念轟\t240106\n各项\t240107\n甲烷\t240108\n衍阳\t240109\n兰博基尼\t240110\n春乡\t240111\n陆路\t240112\n路表\t240113\n钮大姐姐\t240114\njxay\t240115\nhttppinyincne6865\t240116\nvintage\t240117\n新址\t240118\nrockface\t240119\n中港台\t240120\n白块\t240121\n2箱\t240122\n反比\t240123\n笑洛\t240124\ntuce\t240125\n震撼\t240126\n瑜卤允浩\t240127\n石军良\t240128\n小区型\t240129\n第43章\t240130\n迫近\t240131\n墨捌枫\t240132\n芒果台\t240133\npmgd\t240134\n四会\t240135\n把持\t240136\n很相似\t240137\n卫生费\t240138\njtgjuqg\t240139\n占有\t240140\n候机楼\t240141\n孟浪\t240142\n背部\t240143\nzodzno\t240144\n西红小大人\t240145\n受扣\t240146\n鸥群\t240147\nyyfufufyt\t240148\n天涯论\t240149\n1n18\t240150\nARA\t240151\n虚情假意\t240152\n犬肉\t240153\n麦牛奶\t240154\n再也不见\t240155\n5月26日\t240156\n二百八十九块\t240157\n对花\t240158\n周样\t240159\n灯池\t240160\n哦开\t240161\n朝阳规划艺术馆\t240162\n哦弄\t240163\n四二分\t240164\nHasffffffffd\t240165\n无失\t240166\n巧独家\t240167\n78787878787817058702078787878\t240168\nwerear\t240169\n超级明星\t240170\n活说一句\t240171\nskt\t240172\n4d7r55r\t240173\nskp\t240174\nsks\t240175\n独家公主限量爱的\t240176\n王胜\t240177\nobdaaa\t240178\n有空打\t240179\nsky\t240180\n77777\t240181\nskd\t240182\nskg\t240183\n买噶\t240184\n满书君\t240185\n15939435266\t240186\nskb\t240187\n135775677\t240188\n盛和\t240189\nskn\t240190\n手撕包菜\t240191\n嗯妈妈妈妈妈妈妈妈们啊妈妈哈哈哈萌萌萌萌萌萌萌萌萌萌萌萌萌孟太太太太太太慢哦你的声音\t240192\n波浪\t240193\n滚滚滚你妈蛋\t240194\n切实实在\t240195\n看我不认识\t240196\n18万\t240197\n不耙\t240198\n几厘米\t240199\nknon\t240200\n宽屏\t240201\n危险的味道\t240202\n和气生财和气生财钱日子不了问题的\t240203\n饭饭后\t240204\n多想你\t240205\n翟家庄\t240206\nknow\t240207\n会友\t240208\n张文城\t240209\nwqerr\t240210\n严子陵\t240211\n吧哥们\t240212\n大可爱\t240213\nsk2\t240214\n君太百货义卖\t240215\n18个\t240216\n天干物\t240217\n7月21号\t240218\n代养\t240219\n但是\t240220\ndancer\t240221\n死心塌鸡蛋\t240222\nioplk\t240223\n沙砾\t240224\n差一点\t240225\n文雨非\t240226\n同飞\t240227\n山东省\t240228\n拿把\t240229\n遥远的森林\t240230\nnigga\t240231\n白鹿\t240
232\n逗哦草\t240233\n5151\t240234\n5150\t240235\n酱紫酱紫\t240236\n温主任\t240237\n和平区\t240238\n标点符号们\t240239\n不耍\t240240\n唱试\t240241\n收款人\t240242\n订位\t240243\n魏秋月\t240244\n11个月\t240245\n西大街\t240246\n灰脚\t240247\n祝余列刚\t240248\n白鹤\t240249\n13577719972\t240250\nllblvkc\t240251\n十三姨\t240252\n我爱你了你\t240253\n这个价\t240254\nbbbbbbp\t240255\n唱诵\t240256\n下大雪来\t240257\nleader\t240258\n白鹅\t240259\n4.7万亿\t240260\n方便片\t240261\nGOAOEO\t240262\n热饿\t240263\n节奏\t240264\n一腳\t240265\n汉传佛教\t240266\n陈心怡\t240267\n一腾\t240268\n一腹\t240269\n干活类\t240270\n巴尼德\t240271\nZHRK\t240272\n热饭\t240273\n三声\t240274\n热饮\t240275\n奖本\t240276\n官头\t240277\nCcf\t240278\n国际惯例\t240279\n一腔\t240280\nstercastlcohoubiot\t240281\n人艺\t240282\nhsh\t240283\n刘佳情\t240284\n晚8\t240285\nfudy\t240286\n人色\t240287\n赵公明\t240288\nfude\t240289\n坠入\t240290\n索溪峪\t240291\n学熊\t240292\n沙粒\t240293\n43254325\t240294\n责任心\t240295\n病区\t240296\n剧种\t240297\n趣\t240298\n2010年10月23日\t240299\n免疫性\t240300\n劲英会\t240301\n人者\t240302\n六倍积\t240303\n趱\t240304\n单字\t240305\n盛产\t240306\n趴\t240307\n混用\t240308\ncxq65\t240309\n环卫工人\t240310\nLLPPOOIO\t240311\n越\t240312\n做小\t240313\n懂你我心打面小仙\t240314\n趁\t240315\n超\t240316\n一泻火\t240317\n挺好不过\t240318\n如有神\t240319\n盛事\t240320\nUNIQLO喔5错5错\t240321\n白土人家\t240322\n瓜子脸\t240323\n趟\t240324\n金汤匙\t240325\n抬举\t240326\n下日\t240327\n凯吉小东队\t240328\n50244点\t240329\n植物大战僵尸三十\t240330\n时髦\t240331\n总书记\t240332\n111111112222222223333444455\t240333\n86091111\t240334\nqpkbf\t240335\naa0201\t240336\n个改天\t240337\n慧娴\t240338\n江福荣\t240339\n淬神劫\t240340\n犬类\t240341\n威海市\t240342\n优你\t240343\n而叫\t240344\n嫂娘\t240345\n规模\t240346\n涨肚白酒\t240347\n泰安努\t240348\n对话框\t240349\n方楼\t240350\n菠萝炒凤梨\t240351\n這麽\t240352\nBchfh\t240353\nRUTJG\t240354\n秘一个人\t240355\n煸\t240356\n错了你是号门不是好难你是好萌\t240357\n关不让\t240358\n相争\t240359\n相于\t240360\nvans\t240361\n包干区\t240362\n相互\t240363\n拔头筹\t240364\n煽\t240365\n1947年\t240366\n迷你猜猜我\t240367\n刘子杭\t240368\nhttpehiphotosbaiducomxiaodupicitem0dd7912397dda144c1f4a0c1b5b7d0a20df486c0jpg\t240369\n伴米\t240370\n相交\t240371\n十三排\t240372\n丁茂山\t2403
73\nzvcn\t240374\n夏诗涵\t240375\n八五零二三\t240376\n万女士\t240377\n南方电网公司\t240378\n相亲\t240379\n皮囊\t240380\n5774688744563\t240381\nsick\t240382\n铁都\t240383\n杨伟琴\t240384\n黄金小区\t240385\n贵人\t240386\n刘子杰\t240387\n水势\t240388\n错的错\t240389\n吧个儿\t240390\ndeb48fbbb8c0abf71f3a292cf578f5jpg\t240391\n收储\t240392\n印记\t240393\n6455655566\t240394\n小游戏\t240395\n煙\t240396\n煤\t240397\n这些你\t240398\ngar\t240399\n跪地\t240400\n201611\t240401\n201616\t240402\n合法性\t240403\n兴趣\t240404\n有形\t240405\n煎\t240406\n水务\t240407\n煲\t240408\n唱一手歌\t240409\n水力\t240410\n见识见识见识见识见识见识\t240411\n温湿\t240412\n盖屋\t240413\n真样\t240414\n热血青年\t240415\n38km\t240416\n金饰\t240417\n照\t240418\n清明旅社\t240419\n张清雨\t240420\n表皮\t240421\n煩\t240422\n38kg\t240423\n鲅鱼圈开发区\t240424\n煮\t240425\n18c\t240426\n[PAD]\t240427\n"
  },
  {
    "path": "modules/text/sentiment_analysis/emotion_detection_textcnn/module.py",
    "content": "import math\nimport os\n\nimport paddlehub as hub\nfrom .processor import load_vocab\nfrom .processor import postprocess\nfrom .processor import preprocess\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"emotion_detection_textcnn\",\n            version=\"1.3.0\",\n            summary=\"Baidu's open-source Emotion Detection Model(TextCNN).\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass EmotionDetectionTextCNN(hub.NLPPredictionModule):\n\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.emotion_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def emotion_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the emotion prediction results with the texts as input\n        Args:\n             texts(list): the input texts to be predicted; required if data is not provided\n             data(dict): key must be 'text', value is the list of texts to be predicted; used when texts is empty\n             use_gpu(bool): whether to use GPU for prediction\n             batch_size(int): number of texts processed in one batch\n        Returns:\n             results(list): the emotion prediction results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.vocab, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"positive\": 2, \"negative\": 0, \"neutral\": 1}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/emotion_detection_textcnn/processor.py",
    "content": "import io\n\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as fin:\n        for line in fin:\n            data = line.strip().split(\"\\t\")\n            if len(data) == 1:\n                wstr = ''\n                vocab[wstr] = int(data[0])\n                continue\n            else:\n                wstr = data[0]\n            vocab[wstr] = int(data[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef get_predict_label(probs):\n    \"\"\"\n    map a probability vector to its label id and label name\n    \"\"\"\n    label = int(np.argmax(probs))\n    if label == 0:\n        key = \"negative\"\n    elif label == 2:\n        key = \"positive\"\n    else:\n        key = \"neutral\"\n    return label, key\n\n\ndef preprocess(lac, predicted_data, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    segment the input texts with the lac module and map each word to its id in the vocabulary\n    \"\"\"\n    result = []\n    data_dict = {\"text\": predicted_data}\n    processed = lac.lexical_analysis(data=data_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = predicted_data[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(prediction, texts):\n    \"\"\"\n    convert the raw network output into labeled prediction results\n    \"\"\"\n    result = []\n    pred = prediction.as_ndarray()\n    for index in range(len(texts)):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label, key = get_predict_label(pred[index])\n        result_i['emotion_label'] = label\n        result_i['emotion_key'] = key\n        result_i['positive_probs'] = float('%.4f' % pred[index, 2])\n        result_i['negative_probs'] = float('%.4f' % pred[index, 0])\n        result_i['neutral_probs'] = float('%.4f' % pred[index, 1])\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/README.md",
    "content": "# ernie_skep_sentiment_analysis\n|模型名称|ernie_skep_sentiment_analysis|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|SKEP|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|2.4G|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - SKEP（Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis）是百度研究团队在2020年提出的基于情感知识增强的情感预训练算法，此算法采用无监督方法自动挖掘情感知识，然后利用情感知识构建预训练目标，从而让机器学会理解情感语义，在14项中英情感分析典型任务上全面超越SOTA，相关工作已经被ACL 2020录用。SKEP为各类情感分析任务提供统一且强大的情感语义表示。ernie_skep_sentiment_analysis Module可用于句子级情感分析任务预测。其在预训练时使用ERNIE 1.0 large预训练参数作为其网络参数初始化继续预训练。\n\n<p align=\"center\">\n<img src=\"https://bj.bcebos.com/paddlehub/model/nlp/skep.png\" width='600' hspace='10'/> <br />\n</p>\n\n  - 更多详情参考ACL 2020论文：[SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis](https://arxiv.org/abs/2005.05635)\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.7.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_skep_sentiment_analysis\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie_skep_sentiment_analysis --input_text='虽然小明很努力，但是他还是没有考100分'\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # Load ernie_skep_sentiment_analysis module.\n    module = hub.Module(name=\"ernie_skep_sentiment_analysis\")\n\n    # Predict sentiment label\n    test_texts = ['你不是不聪明，而是不认真', '虽然小明很努力，但是他还是没有考100分']\n    results = module.predict_sentiment(test_texts, use_gpu=False)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 你不是不聪明，而是不认真 
negative 0.10738129168748856 0.8926186561584473\n    # 虽然小明很努力，但是他还是没有考100分 negative 0.05391530692577362 0.94608473777771\n    ```\n\n- ### 3、API\n\n  - ```python\n    def predict_sentiment(texts=[], use_gpu=False)\n    ```\n    - 预测API，分类输入文本的情感极性。\n\n    - **参数**\n\n      - texts (list\\[str\\]): 待预测文本；\n      - use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n\n    - **返回**\n\n      - res (list\\[dict\\]): 情感分类结果的列表，列表中每一个元素为 dict，各字段为：\n        - text(str): 输入预测文本\n        - sentiment_label(str): 情感分类结果，取值为positive或negative\n        - positive_probs: 输入预测文本情感极性属于positive的概率\n        - negative_probs: 输入预测文本情感极性属于negative的概率\n\n\n  - ```python\n    def get_embedding(texts, use_gpu=False, batch_size=1)\n    ```\n\n    - 用于获取输入文本的句子粒度特征与字粒度特征\n\n    - **参数**\n\n      - texts(list)：输入文本列表，格式为\\[\\[sample\\_a\\_text\\_a, sample\\_a\\_text\\_b\\], \\[sample\\_b\\_text\\_a, sample\\_b\\_text\\_b\\],…,\\]，其中每个元素都是一个样例，每个样例可以包含text\\_a与text\\_b。\n      - use_gpu(bool)：是否使用gpu，默认为False。对于GPU用户，建议开启use_gpu。**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n      - batch_size(int)：批处理大小，默认为1。\n\n    - **返回**\n\n      - results(list): embedding特征，格式为\\[\\[sample\\_a\\_pooled\\_feature, sample\\_a\\_seq\\_feature\\], \\[sample\\_b\\_pooled\\_feature, sample\\_b\\_seq\\_feature\\],…,\\]，其中每个元素都是对应样例的特征输出，每个样例都有句子粒度特征pooled\\_feature与字粒度特征seq\\_feature。\n\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个情感分析的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m ernie_skep_sentiment_analysis\n    ```\n\n  - 这样就完成了一个情感分析的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端之后，运行以下数行代码即可发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 发送HTTP请求\n    data = {'texts':['你不是不聪明，而是不认真', '虽然小明很努力，但是他还是没有考100分']}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_skep_sentiment_analysis\"\n    r = requests.post(url=url, 
headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(r.json()[\"results\"])\n    ```\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install ernie_skep_sentiment_analysis==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/assets/ernie_1.0_large_ch.config.json",
    "content": "{\n  \"attention_probs_dropout_prob\": 0.1,\n  \"hidden_act\": \"relu\",\n  \"hidden_dropout_prob\": 0.1,\n  \"hidden_size\": 1024,\n  \"initializer_range\": 0.02,\n  \"max_position_embeddings\": 512,\n  \"num_attention_heads\": 16,\n  \"num_hidden_layers\": 24,\n  \"sent_type_vocab_size\": 4,\n  \"task_type_vocab_size\": 16,\n  \"vocab_size\": 12800,\n  \"use_task_id\": false\n}\n"
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/assets/ernie_1.0_large_ch.vocab.txt",
    "content": "[PAD]\t0\n[CLS]\t1\n[SEP]\t2\n[MASK]\t3\n，\t4\n的\t5\n、\t6\n一\t7\n人\t8\n有\t9\n是\t10\n在\t11\n中\t12\n为\t13\n和\t14\n了\t15\n不\t16\n年\t17\n学\t18\n大\t19\n国\t20\n生\t21\n以\t22\n“\t23\n”\t24\n作\t25\n业\t26\n个\t27\n上\t28\n用\t29\n,\t30\n地\t31\n会\t32\n成\t33\n发\t34\n工\t35\n时\t36\n于\t37\n理\t38\n出\t39\n行\t40\n要\t41\n.\t42\n等\t43\n他\t44\n到\t45\n之\t46\n这\t47\n可\t48\n后\t49\n家\t50\n对\t51\n能\t52\n公\t53\n与\t54\n》\t55\n《\t56\n主\t57\n方\t58\n分\t59\n经\t60\n来\t61\n全\t62\n其\t63\n部\t64\n多\t65\n产\t66\n自\t67\n文\t68\n高\t69\n动\t70\n进\t71\n法\t72\n化\t73\n：\t74\n我\t75\n面\t76\n）\t77\n（\t78\n实\t79\n教\t80\n建\t81\n体\t82\n而\t83\n长\t84\n子\t85\n下\t86\n现\t87\n开\t88\n本\t89\n力\t90\n定\t91\n性\t92\n过\t93\n设\t94\n合\t95\n小\t96\n同\t97\n机\t98\n市\t99\n品\t100\n水\t101\n新\t102\n内\t103\n事\t104\n也\t105\n种\t106\n及\t107\n制\t108\n入\t109\n所\t110\n心\t111\n务\t112\n就\t113\n管\t114\n们\t115\n得\t116\n展\t117\n重\t118\n民\t119\n加\t120\n区\t121\n物\t122\n者\t123\n通\t124\n天\t125\n政\t126\n三\t127\n电\t128\n关\t129\n度\t130\n第\t131\n名\t132\n术\t133\n最\t134\n系\t135\n月\t136\n外\t137\n资\t138\n日\t139\n代\t140\n员\t141\n如\t142\n间\t143\n位\t144\n并\t145\n书\t146\n科\t147\n村\t148\n应\t149\n量\t150\n道\t151\n前\t152\n当\t153\n无\t154\n里\t155\n相\t156\n平\t157\n从\t158\n计\t159\n提\t160\n保\t161\n任\t162\n程\t163\n技\t164\n都\t165\n研\t166\n十\t167\n基\t168\n特\t169\n好\t170\n被\t171\n或\t172\n目\t173\n将\t174\n使\t175\n山\t176\n二\t177\n说\t178\n数\t179\n点\t180\n明\t181\n情\t182\n元\t183\n着\t184\n收\t185\n组\t186\n然\t187\n美\t188\n各\t189\n由\t190\n场\t191\n金\t192\n形\t193\n农\t194\n期\t195\n因\t196\n表\t197\n此\t198\n色\t199\n起\t200\n还\t201\n立\t202\n世\t203\n安\t204\n活\t205\n专\t206\n质\t207\n1\t208\n规\t209\n社\t210\n万\t211\n信\t212\n西\t213\n统\t214\n结\t215\n路\t216\n利\t217\n次\t218\n南\t219\n式\t220\n意\t221\n级\t222\n常\t223\n师\t224\n校\t225\n你\t226\n育\t227\n果\t228\n究\t229\n司\t230\n服\t231\n门\t232\n海\t233\n导\t234\n流\t235\n项\t236\n她\t237\n总\t238\n处\t239\n两\t240\n传\t241\n东\t242\n正\t243\n省\t244\n院\t245\n户\t246\n手\t247\n具\t248\n2\t249\n原\t250\n强\t251\n北\t252\n向\t253\n先\t254\n但\t255\n米\t256\n城\t257\n企\t258\n件\t25
9\n风\t260\n军\t261\n身\t262\n更\t263\n知\t264\n已\t265\n气\t266\n战\t267\n至\t268\n单\t269\n口\t270\n集\t271\n创\t272\n解\t273\n四\t274\n标\t275\n交\t276\n比\t277\n商\t278\n论\t279\n界\t280\n题\t281\n变\t282\n花\t283\n3\t284\n改\t285\n类\t286\n运\t287\n指\t288\n型\t289\n调\t290\n女\t291\n神\t292\n接\t293\n造\t294\n受\t295\n广\t296\n只\t297\n委\t298\n去\t299\n共\t300\n治\t301\n达\t302\n持\t303\n条\t304\n网\t305\n头\t306\n构\t307\n县\t308\n些\t309\n该\t310\n又\t311\n那\t312\n想\t313\n样\t314\n办\t315\n济\t316\n5\t317\n格\t318\n责\t319\n车\t320\n很\t321\n施\t322\n求\t323\n己\t324\n光\t325\n精\t326\n林\t327\n完\t328\n爱\t329\n线\t330\n参\t331\n少\t332\n积\t333\n清\t334\n看\t335\n优\t336\n报\t337\n王\t338\n直\t339\n没\t340\n每\t341\n据\t342\n游\t343\n效\t344\n感\t345\n五\t346\n影\t347\n别\t348\n获\t349\n领\t350\n称\t351\n选\t352\n供\t353\n乐\t354\n老\t355\n么\t356\n台\t357\n问\t358\n划\t359\n带\t360\n器\t361\n源\t362\n织\t363\n放\t364\n深\t365\n备\t366\n视\t367\n白\t368\n功\t369\n取\t370\n装\t371\n营\t372\n见\t373\n记\t374\n环\t375\n队\t376\n节\t377\n准\t378\n石\t379\n它\t380\n回\t381\n历\t382\n负\t383\n真\t384\n增\t385\n医\t386\n联\t387\n做\t388\n职\t389\n容\t390\n士\t391\n包\t392\n义\t393\n观\t394\n团\t395\n病\t396\n4\t397\n府\t398\n息\t399\n则\t400\n考\t401\n料\t402\n华\t403\n州\t404\n语\t405\n证\t406\n整\t407\n让\t408\n江\t409\n史\t410\n空\t411\n验\t412\n需\t413\n支\t414\n命\t415\n给\t416\n离\t417\n认\t418\n艺\t419\n较\t420\n土\t421\n古\t422\n养\t423\n才\t424\n境\t425\n推\t426\n把\t427\n均\t428\n图\t429\n际\t430\n斯\t431\n近\t432\n片\t433\n局\t434\n修\t435\n字\t436\n德\t437\n权\t438\n步\t439\n始\t440\n复\t441\n转\t442\n协\t443\n即\t444\n打\t445\n画\t446\n投\t447\n决\t448\n何\t449\n约\t450\n反\t451\nquot\t452\n费\t453\n议\t454\n护\t455\n极\t456\n河\t457\n房\t458\n查\t459\n布\t460\n思\t461\n干\t462\n价\t463\n儿\t464\n非\t465\n马\t466\n党\t467\n奖\t468\n模\t469\n故\t470\n编\t471\n音\t472\n范\t473\n识\t474\n率\t475\n存\t476\n引\t477\n客\t478\n属\t479\n评\t480\n采\t481\n尔\t482\n配\t483\n镇\t484\n室\t485\n再\t486\n案\t487\n监\t488\n习\t489\n注\t490\n根\t491\n克\t492\n演\t493\n食\t494\n族\t495\n示\t496\n球\t497\n状\t498\n青\t499\n号\t500\n张\t501\n百\t502\n素\t503\n首\t504\n易\t505\n热\t506\n阳\t507\n今\t508\n园\
t509\n防\t510\n版\t511\n太\t512\n乡\t513\n英\t514\n6\t515\n材\t516\n列\t517\n便\t518\n写\t519\n住\t520\n置\t521\n层\t522\n助\t523\n确\t524\n试\t525\n难\t526\n承\t527\n象\t528\n居\t529\n10\t530\n黄\t531\n快\t532\n断\t533\n维\t534\n却\t535\n红\t536\n速\t537\n连\t538\n众\t539\n0\t540\n细\t541\n态\t542\n话\t543\n周\t544\n言\t545\n药\t546\n培\t547\n血\t548\n亩\t549\n龙\t550\n越\t551\n值\t552\n几\t553\n边\t554\n读\t555\n未\t556\n曾\t557\n测\t558\n算\t559\n京\t560\n景\t561\n余\t562\n站\t563\n低\t564\n温\t565\n消\t566\n必\t567\n切\t568\n依\t569\n随\t570\n且\t571\n志\t572\n卫\t573\n域\t574\n照\t575\n许\t576\n限\t577\n著\t578\n销\t579\n落\t580\n足\t581\n适\t582\n争\t583\n策\t584\n8\t585\n控\t586\n武\t587\n按\t588\n7\t589\n初\t590\n角\t591\n核\t592\n死\t593\n检\t594\n富\t595\n满\t596\n显\t597\n审\t598\n除\t599\n致\t600\n亲\t601\n占\t602\n失\t603\n星\t604\n章\t605\n善\t606\n续\t607\n千\t608\n叶\t609\n火\t610\n副\t611\n告\t612\n段\t613\n什\t614\n声\t615\n终\t616\n况\t617\n走\t618\n木\t619\n益\t620\n戏\t621\n独\t622\n纪\t623\n植\t624\n财\t625\n群\t626\n六\t627\n赛\t628\n远\t629\n拉\t630\n亚\t631\n密\t632\n排\t633\n超\t634\n像\t635\n课\t636\n围\t637\n往\t638\n响\t639\n击\t640\n疗\t641\n念\t642\n八\t643\n云\t644\n险\t645\n律\t646\n请\t647\n革\t648\n诗\t649\n批\t650\n底\t651\n压\t652\n双\t653\n男\t654\n训\t655\n例\t656\n汉\t657\n升\t658\n拥\t659\n势\t660\n酒\t661\n眼\t662\n官\t663\n牌\t664\n油\t665\n曲\t666\n友\t667\n望\t668\n黑\t669\n歌\t670\n筑\t671\n础\t672\n香\t673\n仅\t674\n担\t675\n括\t676\n湖\t677\n严\t678\n秀\t679\n剧\t680\n九\t681\n举\t682\n执\t683\n充\t684\n兴\t685\n督\t686\n博\t687\n草\t688\n般\t689\n李\t690\n健\t691\n喜\t692\n授\t693\n普\t694\n预\t695\n灵\t696\n突\t697\n良\t698\n款\t699\n罗\t700\n9\t701\n微\t702\n七\t703\n录\t704\n朝\t705\n飞\t706\n宝\t707\n令\t708\n轻\t709\n劳\t710\n距\t711\n异\t712\n简\t713\n兵\t714\n树\t715\n序\t716\n候\t717\n含\t718\n福\t719\n尽\t720\n留\t721\n20\t722\n丰\t723\n旅\t724\n征\t725\n临\t726\n破\t727\n移\t728\n篇\t729\n抗\t730\n典\t731\n端\t732\n苏\t733\n奇\t734\n止\t735\n康\t736\n店\t737\n毛\t738\n觉\t739\n春\t740\n售\t741\n络\t742\n降\t743\n板\t744\n坚\t745\n母\t746\n讲\t747\n早\t748\n印\t749\n略\t750\n孩\t751\n夫\t752\n藏\t753\n铁\t754\n害\t755\n互\t756\n帝\t757\n田\t758\n
融\t759\n皮\t760\n宗\t761\n岁\t762\n载\t763\n析\t764\n斗\t765\n须\t766\n伤\t767\n12\t768\n介\t769\n另\t770\n00\t771\n半\t772\n班\t773\n馆\t774\n味\t775\n楼\t776\n卡\t777\n射\t778\n述\t779\n杀\t780\n波\t781\n绿\t782\n免\t783\n兰\t784\n绝\t785\n刻\t786\n短\t787\n察\t788\n输\t789\n择\t790\n综\t791\n杂\t792\n份\t793\n纳\t794\n父\t795\n词\t796\n银\t797\n送\t798\n座\t799\n左\t800\n继\t801\n固\t802\n宣\t803\n厂\t804\n肉\t805\n换\t806\n补\t807\n税\t808\n派\t809\n套\t810\n欢\t811\n播\t812\n吸\t813\n圆\t814\n攻\t815\n阿\t816\n购\t817\n听\t818\n右\t819\n减\t820\n激\t821\n巴\t822\n背\t823\n够\t824\n遇\t825\n智\t826\n玉\t827\n找\t828\n宽\t829\n陈\t830\n练\t831\n追\t832\n毕\t833\n彩\t834\n软\t835\n帮\t836\n股\t837\n荣\t838\n托\t839\n予\t840\n佛\t841\n堂\t842\n障\t843\n皇\t844\n若\t845\n守\t846\n似\t847\n届\t848\n待\t849\n货\t850\n散\t851\n额\t852\n30\t853\n尚\t854\n穿\t855\n丽\t856\n骨\t857\n享\t858\n差\t859\n针\t860\n索\t861\n稳\t862\n宁\t863\n贵\t864\n酸\t865\n液\t866\n唐\t867\n操\t868\n探\t869\n玩\t870\n促\t871\n笔\t872\n库\t873\n救\t874\n虽\t875\n久\t876\n闻\t877\n顶\t878\n床\t879\n港\t880\n鱼\t881\n亿\t882\n登\t883\n11\t884\n永\t885\n毒\t886\n桥\t887\n冷\t888\n魔\t889\n秘\t890\n陆\t891\n您\t892\n童\t893\n归\t894\n侧\t895\n沙\t896\n染\t897\n封\t898\n紧\t899\n松\t900\n川\t901\n刘\t902\n15\t903\n雄\t904\n希\t905\n毫\t906\n卷\t907\n某\t908\n季\t909\n菜\t910\n庭\t911\n附\t912\n逐\t913\n夜\t914\n宫\t915\n洲\t916\n退\t917\n顾\t918\n尼\t919\n胜\t920\n剂\t921\n纯\t922\n舞\t923\n遗\t924\n苦\t925\n梦\t926\n挥\t927\n航\t928\n愿\t929\n街\t930\n招\t931\n矿\t932\n夏\t933\n盖\t934\n献\t935\n怎\t936\n茶\t937\n申\t938\n39\t939\n吧\t940\n脑\t941\n亦\t942\n吃\t943\n频\t944\n宋\t945\n央\t946\n威\t947\n厚\t948\n块\t949\n冲\t950\n叫\t951\n熟\t952\n礼\t953\n厅\t954\n否\t955\n渐\t956\n笑\t957\n钱\t958\n钟\t959\n甚\t960\n牛\t961\n丝\t962\n靠\t963\n岛\t964\n绍\t965\n盘\t966\n缘\t967\n聚\t968\n静\t969\n雨\t970\n氏\t971\n圣\t972\n顺\t973\n唱\t974\n刊\t975\n阶\t976\n困\t977\n急\t978\n饰\t979\n弹\t980\n庄\t981\n既\t982\n野\t983\n阴\t984\n混\t985\n饮\t986\n损\t987\n齐\t988\n末\t989\n错\t990\n轮\t991\n宜\t992\n鲜\t993\n兼\t994\n敌\t995\n粉\t996\n祖\t997\n延\t998\n100\t999\n钢\t1000\n辑\t1001\n欧\t1002\n硬\t1003\n甲\t1004\n诉\t1005\n册\t1006\n痛
\t1007\n订\t1008\n缺\t1009\n晚\t1010\n衣\t1011\n佳\t1012\n脉\t1013\ngt\t1014\n盛\t1015\n乎\t1016\n拟\t1017\n贸\t1018\n扩\t1019\n船\t1020\n仪\t1021\n谁\t1022\n警\t1023\n50\t1024\n停\t1025\n席\t1026\n竞\t1027\n释\t1028\n庆\t1029\n汽\t1030\n仍\t1031\n掌\t1032\n诸\t1033\n仙\t1034\n弟\t1035\n吉\t1036\n洋\t1037\n奥\t1038\n票\t1039\n危\t1040\n架\t1041\n买\t1042\n径\t1043\n塔\t1044\n休\t1045\n付\t1046\n恶\t1047\n雷\t1048\n怀\t1049\n秋\t1050\n借\t1051\n巨\t1052\n透\t1053\n誉\t1054\n厘\t1055\n句\t1056\n跟\t1057\n胞\t1058\n婚\t1059\n幼\t1060\n烈\t1061\n峰\t1062\n寻\t1063\n君\t1064\n汇\t1065\n趣\t1066\n纸\t1067\n假\t1068\n肥\t1069\n患\t1070\n杨\t1071\n雅\t1072\n罪\t1073\n谓\t1074\n亮\t1075\n脱\t1076\n寺\t1077\n烟\t1078\n判\t1079\n绩\t1080\n乱\t1081\n刚\t1082\n摄\t1083\n洞\t1084\n践\t1085\n码\t1086\n启\t1087\n励\t1088\n呈\t1089\n曰\t1090\n呢\t1091\n符\t1092\n哥\t1093\n媒\t1094\n疾\t1095\n坐\t1096\n雪\t1097\n孔\t1098\n倒\t1099\n旧\t1100\n菌\t1101\n岩\t1102\n鼓\t1103\n亡\t1104\n访\t1105\n症\t1106\n暗\t1107\n湾\t1108\n幸\t1109\n池\t1110\n讨\t1111\n努\t1112\n露\t1113\n吗\t1114\n繁\t1115\n途\t1116\n殖\t1117\n败\t1118\n蛋\t1119\n握\t1120\n刺\t1121\n耕\t1122\n洗\t1123\n沉\t1124\n概\t1125\n哈\t1126\n泛\t1127\n凡\t1128\n残\t1129\n隐\t1130\n虫\t1131\n朋\t1132\n虚\t1133\n餐\t1134\n殊\t1135\n慢\t1136\n询\t1137\n蒙\t1138\n孙\t1139\n谈\t1140\n鲁\t1141\n裂\t1142\n贴\t1143\n污\t1144\n漫\t1145\n谷\t1146\n违\t1147\n泉\t1148\n拿\t1149\n森\t1150\n横\t1151\n扬\t1152\n键\t1153\n膜\t1154\n迁\t1155\n尤\t1156\n涉\t1157\n净\t1158\n诚\t1159\n折\t1160\n冰\t1161\n械\t1162\n拍\t1163\n梁\t1164\n沿\t1165\n避\t1166\n吴\t1167\n惊\t1168\n犯\t1169\n灭\t1170\n湿\t1171\n迷\t1172\n姓\t1173\n阅\t1174\n灯\t1175\n妇\t1176\n触\t1177\n冠\t1178\n答\t1179\n俗\t1180\n档\t1181\n尊\t1182\n谢\t1183\n措\t1184\n筹\t1185\n竟\t1186\n韩\t1187\n签\t1188\n剑\t1189\n鉴\t1190\n灾\t1191\n贯\t1192\n迹\t1193\n洛\t1194\n沟\t1195\n束\t1196\n翻\t1197\n巧\t1198\n坏\t1199\n弱\t1200\n零\t1201\n壁\t1202\n枝\t1203\n映\t1204\n恩\t1205\n抓\t1206\n屋\t1207\n呼\t1208\n脚\t1209\n绘\t1210\n40\t1211\n淡\t1212\n辖\t1213\n2010\t1214\n伊\t1215\n粒\t1216\n欲\t1217\n震\t1218\n伯\t1219\n私\t1220\n蓝\t1221\n甘\t1222\n储\t1223\n胡\t1224\n卖\t1225\n梅\t1226\n16\t1227\n耳\t12
28\n疑\t1229\n润\t1230\n伴\t1231\n泽\t1232\n牧\t1233\n烧\t1234\n尾\t1235\n累\t1236\n糖\t1237\n怪\t1238\n唯\t1239\n莫\t1240\n粮\t1241\n柱\t1242\n18\t1243\n竹\t1244\n灰\t1245\n岸\t1246\n缩\t1247\n井\t1248\n伦\t1249\n柔\t1250\n盟\t1251\n珠\t1252\n丹\t1253\namp\t1254\n皆\t1255\n哪\t1256\n迎\t1257\n颜\t1258\n衡\t1259\n啊\t1260\n塑\t1261\n寒\t1262\n13\t1263\n紫\t1264\n镜\t1265\n25\t1266\n氧\t1267\n误\t1268\n伍\t1269\n彻\t1270\n刀\t1271\n览\t1272\n炎\t1273\n津\t1274\n耐\t1275\n秦\t1276\n尖\t1277\n潮\t1278\n描\t1279\n浓\t1280\n召\t1281\n禁\t1282\n阻\t1283\n胶\t1284\n译\t1285\n腹\t1286\n泰\t1287\n乃\t1288\n盐\t1289\n潜\t1290\n鸡\t1291\n诺\t1292\n遍\t1293\n2000\t1294\n纹\t1295\n冬\t1296\n牙\t1297\n麻\t1298\n辅\t1299\n猪\t1300\n弃\t1301\n楚\t1302\n羊\t1303\n晋\t1304\n14\t1305\n鸟\t1306\n赵\t1307\n洁\t1308\n谋\t1309\n隆\t1310\n滑\t1311\n60\t1312\n2008\t1313\n籍\t1314\n臣\t1315\n朱\t1316\n泥\t1317\n墨\t1318\n辆\t1319\n墙\t1320\n浪\t1321\n姐\t1322\n赏\t1323\n纵\t1324\n2006\t1325\n拔\t1326\n倍\t1327\n纷\t1328\n摩\t1329\n壮\t1330\n苗\t1331\n偏\t1332\n塞\t1333\n贡\t1334\n仁\t1335\n宇\t1336\n卵\t1337\n瓦\t1338\n枪\t1339\n覆\t1340\n殿\t1341\n刑\t1342\n贫\t1343\n妈\t1344\n幅\t1345\n幕\t1346\n忆\t1347\n丁\t1348\n估\t1349\n废\t1350\n萨\t1351\n舍\t1352\n详\t1353\n旗\t1354\n岗\t1355\n洪\t1356\n80\t1357\n贝\t1358\n2009\t1359\n迅\t1360\n凭\t1361\n勇\t1362\n雕\t1363\n奏\t1364\n旋\t1365\n杰\t1366\n煤\t1367\n阵\t1368\n乘\t1369\n溪\t1370\n奉\t1371\n畜\t1372\n挑\t1373\n昌\t1374\n硕\t1375\n庙\t1376\n惠\t1377\n薄\t1378\n逃\t1379\n爆\t1380\n哲\t1381\n浙\t1382\n珍\t1383\n炼\t1384\n栏\t1385\n暴\t1386\n币\t1387\n隔\t1388\n吨\t1389\n倾\t1390\n嘉\t1391\n址\t1392\n陶\t1393\n绕\t1394\n诊\t1395\n遭\t1396\n桃\t1397\n魂\t1398\n兽\t1399\n豆\t1400\n闲\t1401\n箱\t1402\n拓\t1403\n燃\t1404\n裁\t1405\n晶\t1406\n掉\t1407\n脂\t1408\n溶\t1409\n顿\t1410\n肤\t1411\n虑\t1412\n鬼\t1413\n2007\t1414\n灌\t1415\n徐\t1416\n龄\t1417\n陵\t1418\n恋\t1419\n侵\t1420\n坡\t1421\n寿\t1422\n勤\t1423\n磨\t1424\n妹\t1425\n瑞\t1426\n缓\t1427\n轴\t1428\n麦\t1429\n羽\t1430\n咨\t1431\n凝\t1432\n默\t1433\n驻\t1434\n敢\t1435\n债\t1436\n17\t1437\n浮\t1438\n幻\t1439\n株\t1440\n浅\t1441\n敬\t1442\n敏\t1443\n陷\t1444\n凤\t1445\n坛\t1446\n虎\t1447\n乌
\t1448\n铜\t1449\n御\t1450\n乳\t1451\n讯\t1452\n循\t1453\n圈\t1454\n肌\t1455\n妙\t1456\n奋\t1457\n忘\t1458\n闭\t1459\n墓\t1460\n21\t1461\n汤\t1462\n忠\t1463\n2005\t1464\n跨\t1465\n怕\t1466\n振\t1467\n宾\t1468\n跑\t1469\n屏\t1470\n坦\t1471\n粗\t1472\n租\t1473\n悲\t1474\n伟\t1475\n拜\t1476\n24\t1477\n妻\t1478\n赞\t1479\n兄\t1480\n宿\t1481\n碑\t1482\n貌\t1483\n勒\t1484\n罚\t1485\n夺\t1486\n偶\t1487\n截\t1488\n纤\t1489\n2011\t1490\n齿\t1491\n郑\t1492\n聘\t1493\n偿\t1494\n扶\t1495\n豪\t1496\n慧\t1497\n跳\t1498\nthe\t1499\n疏\t1500\n莱\t1501\n腐\t1502\n插\t1503\n恐\t1504\n郎\t1505\n辞\t1506\n挂\t1507\n娘\t1508\n肿\t1509\n徒\t1510\n伏\t1511\n磁\t1512\n杯\t1513\n丛\t1514\n旨\t1515\n琴\t1516\n19\t1517\n炮\t1518\n醒\t1519\n砖\t1520\n替\t1521\n辛\t1522\n暖\t1523\n锁\t1524\n杜\t1525\n肠\t1526\n孤\t1527\n饭\t1528\n脸\t1529\n邮\t1530\n贷\t1531\nlt\t1532\n俄\t1533\n毁\t1534\n荷\t1535\n谐\t1536\n荒\t1537\n肝\t1538\n链\t1539\n2004\t1540\n2012\t1541\n尺\t1542\n尘\t1543\n援\t1544\na\t1545\n疫\t1546\n崇\t1547\n恢\t1548\n扎\t1549\n伸\t1550\n幽\t1551\n抵\t1552\n胸\t1553\n谱\t1554\n舒\t1555\n迫\t1556\n200\t1557\n畅\t1558\n泡\t1559\n岭\t1560\n喷\t1561\n70\t1562\n窗\t1563\n捷\t1564\n宏\t1565\n肯\t1566\n90\t1567\n狂\t1568\n铺\t1569\n骑\t1570\n抽\t1571\n券\t1572\n俱\t1573\n徽\t1574\n胆\t1575\n碎\t1576\n邀\t1577\n褐\t1578\n斤\t1579\n涂\t1580\n赋\t1581\n署\t1582\n颗\t1583\n2003\t1584\n渠\t1585\n仿\t1586\n迪\t1587\n炉\t1588\n辉\t1589\n涵\t1590\n耗\t1591\n22\t1592\n返\t1593\n邻\t1594\n斑\t1595\n董\t1596\n魏\t1597\n午\t1598\n娱\t1599\n浴\t1600\n尿\t1601\n曼\t1602\n锅\t1603\n柳\t1604\n舰\t1605\n搭\t1606\n旁\t1607\n宅\t1608\n趋\t1609\nof\t1610\n凉\t1611\n赢\t1612\n伙\t1613\n爷\t1614\n廷\t1615\n戴\t1616\n壤\t1617\n奶\t1618\n页\t1619\n玄\t1620\n驾\t1621\n阔\t1622\n轨\t1623\n朗\t1624\n捕\t1625\n肾\t1626\n稿\t1627\n惯\t1628\n侯\t1629\n乙\t1630\n渡\t1631\n稍\t1632\n恨\t1633\n脏\t1634\n2002\t1635\n姆\t1636\n腔\t1637\n抱\t1638\n杆\t1639\n垂\t1640\n赴\t1641\n赶\t1642\n莲\t1643\n辽\t1644\n荐\t1645\n旦\t1646\n妖\t1647\n2013\t1648\n稀\t1649\n驱\t1650\n沈\t1651\n役\t1652\n晓\t1653\n亭\t1654\n仲\t1655\n澳\t1656\n500\t1657\n炸\t1658\n绪\t1659\n28\t1660\n陕\t1661\nand\t1662\n23\t1663\n恒\t1664\n堡\t1665\n
纠\t1666\n仇\t1667\n懂\t1668\n焦\t1669\n搜\t1670\ns\t1671\n忍\t1672\n贤\t1673\n添\t1674\ni\t1675\n艾\t1676\n赤\t1677\n犹\t1678\n尝\t1679\n锦\t1680\n稻\t1681\n撰\t1682\n填\t1683\n衰\t1684\n栽\t1685\n邪\t1686\n粘\t1687\n跃\t1688\n桌\t1689\n胃\t1690\n悬\t1691\nc\t1692\n翼\t1693\n彼\t1694\n睡\t1695\n曹\t1696\n刷\t1697\n摆\t1698\n悉\t1699\n锋\t1700\n26\t1701\n摇\t1702\n抢\t1703\n乏\t1704\n廉\t1705\n鼠\t1706\n盾\t1707\n瓷\t1708\n抑\t1709\n埃\t1710\n邦\t1711\n遂\t1712\n寸\t1713\n渔\t1714\n祥\t1715\n胎\t1716\n牵\t1717\n壳\t1718\n甜\t1719\n卓\t1720\n瓜\t1721\n袭\t1722\n遵\t1723\n巡\t1724\n逆\t1725\n玛\t1726\n韵\t1727\n2001\t1728\n桑\t1729\n酷\t1730\n赖\t1731\n桂\t1732\n郡\t1733\n肃\t1734\n仓\t1735\n寄\t1736\n塘\t1737\n瘤\t1738\n300\t1739\n碳\t1740\n搞\t1741\n燕\t1742\n蒸\t1743\n允\t1744\n忽\t1745\n斜\t1746\n穷\t1747\n郁\t1748\n囊\t1749\n奔\t1750\n昆\t1751\n盆\t1752\n愈\t1753\n递\t1754\n1000\t1755\n黎\t1756\n祭\t1757\n怒\t1758\n辈\t1759\n腺\t1760\n滚\t1761\n暂\t1762\n郭\t1763\n璃\t1764\n踪\t1765\n芳\t1766\n碍\t1767\n肺\t1768\n狱\t1769\n冒\t1770\n阁\t1771\n砂\t1772\n35\t1773\n苍\t1774\n揭\t1775\n踏\t1776\n颇\t1777\n柄\t1778\n闪\t1779\n孝\t1780\n葡\t1781\n腾\t1782\n茎\t1783\n鸣\t1784\n撤\t1785\n仰\t1786\n伐\t1787\n丘\t1788\n於\t1789\n泪\t1790\n荡\t1791\n扰\t1792\n纲\t1793\n拼\t1794\n欣\t1795\n纽\t1796\n癌\t1797\n堆\t1798\n27\t1799\n菲\t1800\nb\t1801\n披\t1802\n挖\t1803\n寓\t1804\n履\t1805\n捐\t1806\n悟\t1807\n乾\t1808\n嘴\t1809\n钻\t1810\n拳\t1811\n吹\t1812\n柏\t1813\n遥\t1814\n抚\t1815\n忧\t1816\n赠\t1817\n霸\t1818\n艰\t1819\n淋\t1820\n猫\t1821\n帅\t1822\n奈\t1823\n寨\t1824\n滴\t1825\n鼻\t1826\n掘\t1827\n狗\t1828\n驶\t1829\n朴\t1830\n拆\t1831\n惜\t1832\n玻\t1833\n扣\t1834\n萄\t1835\n蔬\t1836\n宠\t1837\n2014\t1838\n缴\t1839\n赫\t1840\n凯\t1841\n滨\t1842\n乔\t1843\n腰\t1844\n葬\t1845\n孟\t1846\n吾\t1847\n枚\t1848\n圳\t1849\n忙\t1850\n扫\t1851\n杭\t1852\n凌\t1853\n1998\t1854\n梯\t1855\n丈\t1856\n隶\t1857\n1999\t1858\n剪\t1859\n盗\t1860\n擅\t1861\n疆\t1862\n弯\t1863\n携\t1864\n拒\t1865\n秒\t1866\n颁\t1867\n醇\t1868\n割\t1869\n浆\t1870\n姑\t1871\n爸\t1872\n螺\t1873\n穗\t1874\n缝\t1875\n慈\t1876\n喝\t1877\n瓶\t1878\n漏\t1879\n悠\t1880\n猎\t1881\n番\t1882\n孕\t1883\n伪\t1884\n漂\t1885\n
腿\t1886\n吐\t1887\n坝\t1888\n滤\t1889\n函\t1890\n匀\t1891\n偷\t1892\n浩\t1893\n矛\t1894\n僧\t1895\n辨\t1896\n俊\t1897\n棉\t1898\n铸\t1899\n29\t1900\n诞\t1901\n丧\t1902\n夹\t1903\nto\t1904\n姿\t1905\n睛\t1906\n淮\t1907\n阀\t1908\n姜\t1909\n45\t1910\n尸\t1911\n猛\t1912\n1997\t1913\n芽\t1914\n账\t1915\n旱\t1916\n醉\t1917\n弄\t1918\n坊\t1919\n烤\t1920\n萧\t1921\n矣\t1922\n雾\t1923\n倡\t1924\n榜\t1925\n弗\t1926\n氨\t1927\n朵\t1928\n锡\t1929\n袋\t1930\n拨\t1931\n湘\t1932\n岳\t1933\n烦\t1934\n肩\t1935\n熙\t1936\n炭\t1937\n婆\t1938\n棋\t1939\n禅\t1940\n穴\t1941\n宙\t1942\n汗\t1943\n艳\t1944\n儒\t1945\n叙\t1946\n晨\t1947\n颈\t1948\n峡\t1949\n拖\t1950\n烂\t1951\n茂\t1952\n戒\t1953\n飘\t1954\n氛\t1955\n蒂\t1956\n撞\t1957\n瓣\t1958\n箭\t1959\n叛\t1960\n1996\t1961\n31\t1962\n鞋\t1963\n劲\t1964\n祝\t1965\n娜\t1966\n饲\t1967\n侍\t1968\n诱\t1969\n叹\t1970\n卢\t1971\n弥\t1972\n32\t1973\n鼎\t1974\n厦\t1975\n屈\t1976\n慕\t1977\n魅\t1978\nm\t1979\n厨\t1980\n嫁\t1981\n绵\t1982\n逼\t1983\n扮\t1984\n叔\t1985\n酶\t1986\n燥\t1987\n狼\t1988\n滋\t1989\n汁\t1990\n辐\t1991\n怨\t1992\n翅\t1993\n佩\t1994\n坑\t1995\n旬\t1996\n沃\t1997\n剩\t1998\n蛇\t1999\n颖\t2000\n篮\t2001\n锐\t2002\n侠\t2003\n匹\t2004\n唤\t2005\n熊\t2006\n漠\t2007\n迟\t2008\n敦\t2009\n雌\t2010\n谨\t2011\n婴\t2012\n浸\t2013\n磷\t2014\n筒\t2015\n2015\t2016\n滩\t2017\n埋\t2018\n框\t2019\n弘\t2020\n吕\t2021\n碰\t2022\n纺\t2023\n硫\t2024\n堪\t2025\n契\t2026\n蜜\t2027\n蓄\t2028\n1995\t2029\n阐\t2030\napos\t2031\n傲\t2032\n碱\t2033\n晰\t2034\n狭\t2035\n撑\t2036\n叉\t2037\n卧\t2038\n劫\t2039\n闹\t2040\n赐\t2041\n邓\t2042\n奴\t2043\n溉\t2044\n浦\t2045\n蹈\t2046\n辣\t2047\n遣\t2048\n耀\t2049\n耶\t2050\n翠\t2051\nt\t2052\n叠\t2053\n迈\t2054\n霍\t2055\n碧\t2056\n恰\t2057\n脊\t2058\n昭\t2059\n摸\t2060\n饱\t2061\n赔\t2062\n泄\t2063\n哭\t2064\n讼\t2065\n逝\t2066\n逻\t2067\n廊\t2068\n擦\t2069\n渗\t2070\n彰\t2071\nyou\t2072\n卿\t2073\n旺\t2074\n宪\t2075\n36\t2076\n顷\t2077\n妆\t2078\n陪\t2079\n葛\t2080\n仔\t2081\n淀\t2082\n翰\t2083\n悦\t2084\n穆\t2085\n煮\t2086\n辩\t2087\n弦\t2088\nin\t2089\n串\t2090\n押\t2091\n蚀\t2092\n逢\t2093\n贺\t2094\n焊\t2095\n煌\t2096\n缔\t2097\n惑\t2098\n鹿\t2099\n袁\t2100\n糊\t2101\n逸\t2102\n舟\t2103\n勃\t2104\n侦\t21
05\n涯\t2106\n蔡\t2107\n辟\t2108\n涌\t2109\n枯\t2110\n痕\t2111\n疼\t2112\n莉\t2113\n柴\t2114\n1993\t2115\n眉\t2116\n1992\t2117\n罢\t2118\n催\t2119\n衔\t2120\n秉\t2121\n妃\t2122\n鸿\t2123\n傅\t2124\n400\t2125\n辰\t2126\n聪\t2127\n咸\t2128\n1994\t2129\n扇\t2130\n盈\t2131\n勘\t2132\n佐\t2133\n泊\t2134\n抛\t2135\n搬\t2136\n牢\t2137\n宴\t2138\n牲\t2139\n贾\t2140\n摘\t2141\n姻\t2142\n慎\t2143\n帕\t2144\n忌\t2145\n卒\t2146\n夕\t2147\n卜\t2148\n惟\t2149\n挺\t2150\n崖\t2151\n炒\t2152\n爵\t2153\n冻\t2154\n椒\t2155\n鳞\t2156\n祸\t2157\n潭\t2158\n腊\t2159\n蒋\t2160\n缠\t2161\n寂\t2162\n眠\t2163\n冯\t2164\n芯\t2165\n槽\t2166\n吊\t2167\n33\t2168\n150\t2169\n聊\t2170\n梗\t2171\n嫩\t2172\n凶\t2173\n铭\t2174\n爽\t2175\n筋\t2176\n韦\t2177\n脾\t2178\n铝\t2179\n肢\t2180\n栋\t2181\n勾\t2182\n萌\t2183\n渊\t2184\n掩\t2185\n狮\t2186\n撒\t2187\n漆\t2188\n骗\t2189\n禽\t2190\n38\t2191\n蕴\t2192\n坪\t2193\n洒\t2194\n冶\t2195\n兹\t2196\n椭\t2197\n喻\t2198\n泵\t2199\n哀\t2200\n翔\t2201\n1990\t2202\n棒\t2203\n芝\t2204\nx\t2205\n扑\t2206\n3000\t2207\n毅\t2208\n衍\t2209\n惨\t2210\n疯\t2211\n欺\t2212\n贼\t2213\n肖\t2214\n轰\t2215\n巢\t2216\n臂\t2217\n轩\t2218\n扁\t2219\n淘\t2220\n犬\t2221\n宰\t2222\n祠\t2223\n挡\t2224\n厌\t2225\n帐\t2226\n蜂\t2227\n狐\t2228\n垃\t2229\n昂\t2230\n圾\t2231\n秩\t2232\n芬\t2233\n瞬\t2234\n枢\t2235\n舌\t2236\n唇\t2237\n棕\t2238\n1984\t2239\n霞\t2240\n霜\t2241\n艇\t2242\n侨\t2243\n鹤\t2244\n硅\t2245\n靖\t2246\n哦\t2247\n削\t2248\n泌\t2249\n奠\t2250\nd\t2251\n吏\t2252\n夷\t2253\n咖\t2254\n彭\t2255\n窑\t2256\n胁\t2257\n肪\t2258\n120\t2259\n贞\t2260\n劝\t2261\n钙\t2262\n柜\t2263\n鸭\t2264\n75\t2265\n庞\t2266\n兔\t2267\n荆\t2268\n丙\t2269\n纱\t2270\n34\t2271\n戈\t2272\n藤\t2273\n矩\t2274\n泳\t2275\n惧\t2276\n铃\t2277\n渴\t2278\n胀\t2279\n袖\t2280\n丸\t2281\n狠\t2282\n豫\t2283\n茫\t2284\n1985\t2285\n浇\t2286\n菩\t2287\n氯\t2288\n啡\t2289\n1988\t2290\n葱\t2291\n37\t2292\n梨\t2293\n霉\t2294\n脆\t2295\n氢\t2296\n巷\t2297\n丑\t2298\n娃\t2299\n锻\t2300\n愤\t2301\n贪\t2302\n蝶\t2303\n1991\t2304\n厉\t2305\n闽\t2306\n浑\t2307\n斩\t2308\n栖\t2309\nl\t2310\n茅\t2311\n昏\t2312\n龟\t2313\n碗\t2314\n棚\t2315\n滞\t2316\n慰\t2317\n600\t2318\n2016\t2319\n斋\t2320\n虹\t2321\n屯\t2322\n
萝\t2323\n饼\t2324\n窄\t2325\n潘\t2326\n绣\t2327\n丢\t2328\n芦\t2329\n鳍\t2330\n42\t2331\n裕\t2332\n誓\t2333\n腻\t2334\n48\t2335\n95\t2336\n锈\t2337\n吞\t2338\n蜀\t2339\n啦\t2340\n扭\t2341\n5000\t2342\n巩\t2343\n髓\t2344\n1987\t2345\n劣\t2346\n拌\t2347\n谊\t2348\n涛\t2349\n勋\t2350\n郊\t2351\n莎\t2352\n痴\t2353\n窝\t2354\n驰\t2355\n1986\t2356\n跌\t2357\n笼\t2358\n挤\t2359\n溢\t2360\n1989\t2361\n隙\t2362\n55\t2363\n鹰\t2364\n诏\t2365\n帽\t2366\n65\t2367\n芒\t2368\n爬\t2369\n凸\t2370\n牺\t2371\n熔\t2372\n吻\t2373\n竭\t2374\n瘦\t2375\n冥\t2376\n800\t2377\n搏\t2378\n屡\t2379\n昔\t2380\n萼\t2381\n愁\t2382\n捉\t2383\n翁\t2384\n怖\t2385\n汪\t2386\n烯\t2387\n疲\t2388\n缸\t2389\n溃\t2390\n85\t2391\n泼\t2392\n剖\t2393\n涨\t2394\n橡\t2395\n谜\t2396\n悔\t2397\n嫌\t2398\n盒\t2399\n苯\t2400\n凹\t2401\n绳\t2402\n畏\t2403\n罐\t2404\n虾\t2405\n柯\t2406\n邑\t2407\n馨\t2408\n兆\t2409\n帖\t2410\n陌\t2411\n禄\t2412\n垫\t2413\n壶\t2414\n逊\t2415\n骤\t2416\n祀\t2417\n晴\t2418\n蓬\t2419\ne\t2420\n苞\t2421\n煎\t2422\n菊\t2423\n堤\t2424\n甫\t2425\n拱\t2426\n氮\t2427\n罕\t2428\n舶\t2429\n伞\t2430\n姚\t2431\n弓\t2432\n嵌\t2433\n1983\t2434\n1982\t2435\n馈\t2436\n琼\t2437\n噪\t2438\n雀\t2439\n呵\t2440\n汝\t2441\n焉\t2442\n陀\t2443\n胺\t2444\n惩\t2445\n沼\t2446\n枣\t2447\n桐\t2448\n酱\t2449\n遮\t2450\n孢\t2451\n钝\t2452\n呀\t2453\n锥\t2454\n妥\t2455\n酿\t2456\n巫\t2457\n闯\t2458\n沧\t2459\n崩\t2460\n蕊\t2461\n酬\t2462\n匠\t2463\n躲\t2464\n43\t2465\n喊\t2466\n98\t2467\n琳\t2468\n46\t2469\n绎\t2470\n喉\t2471\n凰\t2472\n抬\t2473\n93\t2474\n膨\t2475\n盲\t2476\n剥\t2477\n喂\t2478\n庸\t2479\n奸\t2480\nn\t2481\n钩\t2482\n冈\t2483\n募\t2484\n苑\t2485\n杏\t2486\n杉\t2487\n辱\t2488\n隋\t2489\n薪\t2490\n绒\t2491\n1980\t2492\n99\t2493\n欠\t2494\n尉\t2495\nr\t2496\n攀\t2497\n抹\t2498\n巾\t2499\n1958\t2500\n渣\t2501\n苹\t2502\n猴\t2503\n悄\t2504\n屠\t2505\n41\t2506\n颂\t2507\n湛\t2508\n魄\t2509\n颠\t2510\n1949\t2511\n呆\t2512\n粤\t2513\n岂\t2514\n娇\t2515\n暑\t2516\n44\t2517\n56\t2518\n52\t2519\n鹅\t2520\n筛\t2521\n膏\t2522\n樱\t2523\np\t2524\n缆\t2525\n襄\t2526\n瑟\t2527\n恭\t2528\n泻\t2529\n匪\t2530\n兮\t2531\n恼\t2532\n吟\t2533\n仕\t2534\n蔽\t2535\n骄\t2536\n蚕\t2537\n斥\t2538\n椅\t2539\n姬\t
2540\n谦\t2541\nfor\t2542\n椎\t2543\n搅\t2544\n卸\t2545\n沫\t2546\n怜\t2547\n坎\t2548\n瑰\t2549\n1978\t2550\n钦\t2551\nh\t2552\n拾\t2553\n厕\t2554\n後\t2555\n逾\t2556\n薯\t2557\n衬\t2558\n钾\t2559\n崔\t2560\n稽\t2561\n蛮\t2562\n殷\t2563\n晒\t2564\n47\t2565\n菇\t2566\n臭\t2567\n弧\t2568\n擎\t2569\n粹\t2570\n纬\t2571\n1500\t2572\n焰\t2573\n玲\t2574\n竣\t2575\n咒\t2576\n歇\t2577\n糕\t2578\n诵\t2579\n茨\t2580\n妮\t2581\n酯\t2582\n麟\t2583\n卑\t2584\n浏\t2585\n咽\t2586\n罩\t2587\n舱\t2588\n酵\t2589\n晕\t2590\n顽\t2591\n赁\t2592\n咬\t2593\n枫\t2594\n冀\t2595\n贮\t2596\n艘\t2597\n亏\t2598\n薛\t2599\n瀑\t2600\n篆\t2601\n膀\t2602\n沸\t2603\n雍\t2604\n咳\t2605\n尹\t2606\n愉\t2607\n烹\t2608\n坠\t2609\n勿\t2610\n钠\t2611\n64\t2612\n坤\t2613\n甸\t2614\n墅\t2615\n闸\t2616\n藻\t2617\n韧\t2618\n鄂\t2619\n58\t2620\n51\t2621\n91\t2622\nj\t2623\n瑶\t2624\n舆\t2625\n夸\t2626\n54\t2627\n蕾\t2628\n栗\t2629\n咏\t2630\n丞\t2631\n抄\t2632\n鹏\t2633\n弊\t2634\n檐\t2635\n骂\t2636\n仆\t2637\n峻\t2638\n爪\t2639\n赚\t2640\n帆\t2641\n娶\t2642\n嘛\t2643\n钓\t2644\n澄\t2645\n猜\t2646\n1979\t2647\n裔\t2648\n抒\t2649\n铅\t2650\n卉\t2651\n彦\t2652\nf\t2653\n删\t2654\n衷\t2655\n禹\t2656\n寡\t2657\n蒲\t2658\n砌\t2659\non\t2660\n棱\t2661\n72\t2662\n拘\t2663\n堵\t2664\n雁\t2665\n仄\t2666\n荫\t2667\n53\t2668\nk\t2669\n1981\t2670\n祈\t2671\n49\t2672\n奢\t2673\n赌\t2674\n寇\t2675\n3d\t2676\n隧\t2677\n摊\t2678\n雇\t2679\n卦\t2680\n婉\t2681\n敲\t2682\n挣\t2683\n皱\t2684\n虞\t2685\n亨\t2686\n懈\t2687\n挽\t2688\n珊\t2689\n饶\t2690\n滥\t2691\n锯\t2692\n闷\t2693\nit\t2694\n酮\t2695\n虐\t2696\n兑\t2697\n僵\t2698\n傻\t2699\n62\t2700\n沦\t2701\n巅\t2702\n鞭\t2703\n梳\t2704\n赣\t2705\n锌\t2706\n庐\t2707\n薇\t2708\n庵\t2709\n57\t2710\n96\t2711\n慨\t2712\n肚\t2713\n妄\t2714\ng\t2715\n仗\t2716\n绑\t2717\n2017\t2718\n枕\t2719\n牡\t2720\n000\t2721\n胖\t2722\n沪\t2723\n垒\t2724\n捞\t2725\n捧\t2726\n竖\t2727\n蜡\t2728\n桩\t2729\n厢\t2730\n孵\t2731\n黏\t2732\n拯\t2733\n63\t2734\n谭\t2735\n68\t2736\n诈\t2737\n灿\t2738\n釉\t2739\n1956\t2740\n裹\t2741\n钮\t2742\n俩\t2743\no\t2744\n灶\t2745\n彝\t2746\n蟹\t2747\n涩\t2748\n醋\t2749\n110\t2750\n匙\t2751\n歧\t2752\n刹\t2753\n玫\t2754\n棘\t2755\n橙\t2756\n凑\t2757\n
桶\t2758\n刃\t2759\n伽\t2760\n4000\t2761\n硝\t2762\n怡\t2763\n籽\t2764\n敞\t2765\n淳\t2766\n矮\t2767\n镶\t2768\n戚\t2769\n幢\t2770\n涡\t2771\n66\t2772\n尧\t2773\n膝\t2774\nis\t2775\n哉\t2776\n肆\t2777\n畔\t2778\n溯\t2779\n97\t2780\n媚\t2781\n烘\t2782\n01\t2783\n67\t2784\n窃\t2785\n焚\t2786\n澜\t2787\n愚\t2788\n棵\t2789\n乞\t2790\n86\t2791\n78\t2792\n佑\t2793\n76\t2794\niphone\t2795\n暨\t2796\n敷\t2797\n饥\t2798\n俯\t2799\n蔓\t2800\nv\t2801\n05\t2802\n88\t2803\n暮\t2804\n砍\t2805\n邵\t2806\n仑\t2807\n毗\t2808\n剿\t2809\n馀\t2810\n180\t2811\n锤\t2812\n刮\t2813\n1950\t2814\n梭\t2815\n摧\t2816\n250\t2817\n掠\t2818\n躯\t2819\n诡\t2820\n匈\t2821\n侣\t2822\n胚\t2823\n疮\t2824\n59\t2825\n裙\t2826\nwindows\t2827\n裸\t2828\n08\t2829\n塌\t2830\n吓\t2831\n俘\t2832\n糙\t2833\n藩\t2834\n楷\t2835\n羞\t2836\nwith\t2837\n鲍\t2838\n帘\t2839\n裤\t2840\n宛\t2841\n憾\t2842\n桓\t2843\n痰\t2844\n寞\t2845\n骚\t2846\n惹\t2847\n笋\t2848\n萃\t2849\n92\t2850\n栓\t2851\n61\t2852\n挫\t2853\n矢\t2854\n垦\t2855\n09\t2856\n垄\t2857\n绸\t2858\n凄\t2859\nyour\t2860\n镀\t2861\n熏\t2862\n钉\t2863\n1945\t2864\nled\t2865\n粪\t2866\n缅\t2867\n洽\t2868\n鞘\t2869\n蔗\t2870\n82\t2871\n迄\t2872\n沐\t2873\n凿\t2874\n勉\t2875\n昨\t2876\n喘\t2877\n700\t2878\n爹\t2879\n屑\t2880\n耻\t2881\n沥\t2882\n庶\t2883\n涅\t2884\n腕\t2885\n袍\t2886\n懒\t2887\n阜\t2888\n嗜\t2889\n朔\t2890\n1200\t2891\n蒜\t2892\n沛\t2893\n坟\t2894\n轿\t2895\n喀\t2896\n笛\t2897\n狄\t2898\n饿\t2899\n蓉\t2900\n泣\t2901\n窟\t2902\n130\t2903\n豹\t2904\n屿\t2905\n73\t2906\n崛\t2907\n迦\t2908\n诠\t2909\n贬\t2910\n腥\t2911\n83\t2912\n钥\t2913\n嗣\t2914\n瑜\t2915\n07\t2916\n倦\t2917\n萎\t2918\n拦\t2919\n冤\t2920\n讽\t2921\n潇\t2922\n谣\t2923\n趁\t2924\n1960\t2925\n妨\t2926\n84\t2927\n贩\t2928\n74\t2929\n萍\t2930\n窦\t2931\n纂\t2932\n缀\t2933\n矫\t2934\n淑\t2935\n墩\t2936\n梵\t2937\n沾\t2938\n淫\t2939\n乖\t2940\n汰\t2941\n莞\t2942\n81\t2943\n旷\t2944\n浊\t2945\n挚\t2946\n撼\t2947\n69\t2948\n87\t2949\n氟\t2950\n焕\t2951\n06\t2952\n庚\t2953\n掀\t2954\n诀\t2955\nkg\t2956\n盼\t2957\n71\t2958\n疹\t2959\n窖\t2960\n匆\t2961\n厥\t2962\n轧\t2963\n89\t2964\n淹\t2965\n94\t2966\n160\t2967\n亥\t2968\n鸦\t2969\n棍\t2970\n谅\t2971\n歼
\t2972\n汕\t2973\n挪\t2974\n蚁\t2975\n敛\t2976\n魁\t2977\n畴\t2978\n炫\t2979\n丫\t2980\n奎\t2981\n菱\t2982\n沂\t2983\n撕\t2984\n阎\t2985\n詹\t2986\n03\t2987\n蛛\t2988\n77\t2989\n靡\t2990\n瞻\t2991\n咱\t2992\n愧\t2993\n烷\t2994\n畸\t2995\n灸\t2996\n眸\t2997\nthat\t2998\n觅\t2999\n芜\t3000\n1955\t3001\n廓\t3002\n斌\t3003\n躁\t3004\n麓\t3005\n摔\t3006\n1970\t3007\n烛\t3008\n睹\t3009\n孜\t3010\n缚\t3011\n堕\t3012\n昼\t3013\n睿\t3014\n琪\t3015\n琉\t3016\n贱\t3017\n6000\t3018\n渝\t3019\n跋\t3020\n1959\t3021\n茄\t3022\n1957\t3023\n舜\t3024\n1976\t3025\n诛\t3026\n1952\t3027\n捣\t3028\n芙\t3029\n04\t3030\n1961\t3031\n倚\t3032\n1938\t3033\n酰\t3034\n澈\t3035\n慌\t3036\n帜\t3037\n颤\t3038\n陇\t3039\n1962\t3040\n02\t3041\n颌\t3042\n昧\t3043\n佣\t3044\n眷\t3045\n徙\t3046\n禾\t3047\n逮\t3048\n1948\t3049\n79\t3050\n莹\t3051\n碟\t3052\n梢\t3053\n朽\t3054\n粥\t3055\n喇\t3056\n1964\t3057\n榆\t3058\n驳\t3059\n楔\t3060\n1965\t3061\n啸\t3062\n肋\t3063\ndna\t3064\n踢\t3065\n1975\t3066\n1937\t3067\nu\t3068\n傍\t3069\n桔\t3070\n肴\t3071\n呕\t3072\n旭\t3073\n埠\t3074\n贿\t3075\n曝\t3076\n杖\t3077\n俭\t3078\n栩\t3079\n1953\t3080\n斧\t3081\n镁\t3082\n匾\t3083\n踩\t3084\n橘\t3085\n颅\t3086\n1963\t3087\n囚\t3088\n蛙\t3089\n1946\t3090\n膳\t3091\n坞\t3092\n琐\t3093\n荧\t3094\n瘟\t3095\n涤\t3096\n胰\t3097\n衫\t3098\n噬\t3099\n皖\t3100\n邱\t3101\n埔\t3102\n汀\t3103\n羡\t3104\n睐\t3105\n葵\t3106\n耿\t3107\n糟\t3108\n厄\t3109\n秧\t3110\n黔\t3111\n蹄\t3112\n140\t3113\n漳\t3114\n鞍\t3115\n谏\t3116\n腋\t3117\n簇\t3118\n梧\t3119\n戎\t3120\n1977\t3121\n榴\t3122\n诣\t3123\n宦\t3124\n苔\t3125\n揽\t3126\n簧\t3127\n狸\t3128\n阙\t3129\n扯\t3130\n耍\t3131\n棠\t3132\n脓\t3133\n烫\t3134\n翘\t3135\n芭\t3136\n躺\t3137\n羁\t3138\n藉\t3139\n拐\t3140\n1966\t3141\n陡\t3142\n1954\t3143\n漓\t3144\n棺\t3145\n钧\t3146\n琅\t3147\n扔\t3148\n寝\t3149\n绚\t3150\n熬\t3151\n驿\t3152\n邹\t3153\n杠\t3154\n1972\t3155\nw\t3156\n绥\t3157\n窥\t3158\n晃\t3159\n渭\t3160\n1947\t3161\n樊\t3162\n鑫\t3163\n祁\t3164\n陋\t3165\n哺\t3166\n堰\t3167\n祛\t3168\ny\t3169\n梓\t3170\n崎\t3171\n1968\t3172\n孽\t3173\n蝴\t3174\n蔚\t3175\n抖\t3176\n苟\t3177\n肇\t3178\n溜\t3179\n绅\t3180\n妾\t3181\n1940\t3182\n跪\t3183\n沁\t318
4\nq\t3185\n1973\t3186\n莽\t3187\n虏\t3188\nbe\t3189\n瞄\t3190\n砸\t3191\n稚\t3192\n僚\t3193\n崭\t3194\n迭\t3195\n皂\t3196\n彬\t3197\n雏\t3198\nip\t3199\n羲\t3200\n缕\t3201\n绞\t3202\n俞\t3203\n簿\t3204\n耸\t3205\n廖\t3206\n嘲\t3207\ncan\t3208\n1969\t3209\n翌\t3210\n榄\t3211\n裴\t3212\n槐\t3213\n1939\t3214\n洼\t3215\n睁\t3216\n1951\t3217\n灼\t3218\n啤\t3219\n臀\t3220\n啥\t3221\n濒\t3222\n醛\t3223\n峨\t3224\n葫\t3225\n悍\t3226\n笨\t3227\n嘱\t3228\n1935\t3229\n稠\t3230\n360\t3231\n韶\t3232\n1941\t3233\n陛\t3234\n峭\t3235\n1974\t3236\n酚\t3237\n翩\t3238\n舅\t3239\n8000\t3240\n寅\t3241\n1936\t3242\n蕉\t3243\n阮\t3244\n垣\t3245\n戮\t3246\nme\t3247\n趾\t3248\n犀\t3249\n巍\t3250\nre\t3251\n霄\t3252\n1942\t3253\n1930\t3254\n饪\t3255\nsci\t3256\n秆\t3257\n朕\t3258\n驼\t3259\n肛\t3260\n揉\t3261\nipad\t3262\n楠\t3263\n岚\t3264\n疡\t3265\n帧\t3266\n柑\t3267\niso9001\t3268\n赎\t3269\n逍\t3270\n滇\t3271\n璋\t3272\n礁\t3273\n黛\t3274\n钞\t3275\n邢\t3276\n涧\t3277\n劈\t3278\n瞳\t3279\n砚\t3280\n驴\t3281\n1944\t3282\n锣\t3283\n恳\t3284\n栅\t3285\n吵\t3286\n牟\t3287\n沌\t3288\n瞩\t3289\n咪\t3290\n毯\t3291\n炳\t3292\n淤\t3293\n盯\t3294\n芋\t3295\n粟\t3296\n350\t3297\n栈\t3298\n戊\t3299\n盏\t3300\n峪\t3301\n拂\t3302\n暇\t3303\n酥\t3304\n汛\t3305\n900\t3306\npc\t3307\n嚣\t3308\n2500\t3309\n轼\t3310\n妒\t3311\n匿\t3312\n1934\t3313\n鸽\t3314\n蝉\t3315\ncd\t3316\n痒\t3317\n宵\t3318\n瘫\t3319\n1927\t3320\n1943\t3321\n璧\t3322\n汲\t3323\n1971\t3324\n冢\t3325\n碌\t3326\n琢\t3327\n磅\t3328\n卤\t3329\n105\t3330\n剔\t3331\n谎\t3332\n圩\t3333\n酌\t3334\n捏\t3335\n渺\t3336\n媳\t3337\n1933\t3338\n穹\t3339\n谥\t3340\n骏\t3341\n哨\t3342\n骆\t3343\n乒\t3344\n10000\t3345\n摹\t3346\n兜\t3347\n柿\t3348\n喧\t3349\n呜\t3350\n捡\t3351\n橄\t3352\n逗\t3353\n瑚\t3354\n呐\t3355\n檀\t3356\n辜\t3357\n妊\t3358\n祯\t3359\n1931\t3360\n苷\t3361\ndon\t3362\n衙\t3363\n笃\t3364\n芸\t3365\n霖\t3366\n荔\t3367\n闺\t3368\n羌\t3369\n芹\t3370\ndvd\t3371\n哼\t3372\n糯\t3373\n吼\t3374\n蕃\t3375\n嵩\t3376\n矶\t3377\n绽\t3378\n坯\t3379\n娠\t3380\n1928\t3381\n祷\t3382\n锰\t3383\nqq\t3384\nby\t3385\n瘀\t3386\n108\t3387\n岐\t3388\n1932\t3389\n茵\t3390\n筝\t3391\n斐\t3392\n肽\t3393\n歉\t3394\n1929\
t3395\n嗽\t3396\n恤\t3397\n汶\t3398\n聂\t3399\n樟\t3400\n擒\t3401\n鹃\t3402\n拙\t3403\n鲤\t3404\n絮\t3405\n鄙\t3406\n彪\t3407\nipod\t3408\nz\t3409\n嗓\t3410\n墟\t3411\n骼\t3412\n渤\t3413\n僻\t3414\n豁\t3415\n谕\t3416\n荟\t3417\n姨\t3418\n婷\t3419\n挠\t3420\n哇\t3421\n炙\t3422\n220\t3423\n诅\t3424\n娥\t3425\n哑\t3426\n阱\t3427\n嫉\t3428\n圭\t3429\n乓\t3430\n橱\t3431\n歪\t3432\n禧\t3433\n甩\t3434\n坷\t3435\n晏\t3436\n驯\t3437\n讳\t3438\n泗\t3439\n煞\t3440\nmy\t3441\n淄\t3442\n倪\t3443\n妓\t3444\n窍\t3445\n竿\t3446\n襟\t3447\n匡\t3448\n钛\t3449\n侈\t3450\nll\t3451\n侄\t3452\n铲\t3453\n哮\t3454\n厩\t3455\n1967\t3456\n亢\t3457\n101\t3458\n辕\t3459\n瘾\t3460\n辊\t3461\n狩\t3462\n掷\t3463\n潍\t3464\n240\t3465\n伺\t3466\n嘿\t3467\n弈\t3468\n嘎\t3469\n陨\t3470\n娅\t3471\n1800\t3472\n昊\t3473\n犁\t3474\n屁\t3475\n蜘\t3476\n170\t3477\n寥\t3478\n滕\t3479\n毙\t3480\nas\t3481\n涝\t3482\n谛\t3483\nall\t3484\n郝\t3485\n痹\t3486\n溺\t3487\n汾\t3488\n脐\t3489\n馅\t3490\n蠢\t3491\n珀\t3492\n腌\t3493\n扼\t3494\n敕\t3495\n莓\t3496\n峦\t3497\n铬\t3498\n谍\t3499\n炬\t3500\n龚\t3501\n麒\t3502\n睦\t3503\n磺\t3504\n吁\t3505\n掺\t3506\n烁\t3507\n靶\t3508\nor\t3509\n圃\t3510\n饵\t3511\n褶\t3512\n娟\t3513\n滔\t3514\n挨\t3515\nandroid\t3516\n褒\t3517\n胱\t3518\ncpu\t3519\n晖\t3520\n脖\t3521\n垢\t3522\n抉\t3523\n冉\t3524\n茧\t3525\nfrom\t3526\n渲\t3527\n癫\t3528\n125\t3529\nde\t3530\n悼\t3531\n嫂\t3532\n瞒\t3533\n纶\t3534\n肘\t3535\n炖\t3536\n瀚\t3537\n皋\t3538\n姊\t3539\n颐\t3540\n1600\t3541\n俏\t3542\n颊\t3543\ngps\t3544\n讶\t3545\n札\t3546\n奕\t3547\n磊\t3548\n镖\t3549\n遐\t3550\n眺\t3551\n腑\t3552\nboss\t3553\n琦\t3554\n蚊\t3555\n窜\t3556\n渍\t3557\n嗯\t3558\n102\t3559\n1926\t3560\ntouch\t3561\n夯\t3562\n1300\t3563\n笙\t3564\n蘑\t3565\n翡\t3566\n碘\t3567\n卯\t3568\n啼\t3569\n靓\t3570\n辍\t3571\n莺\t3572\n躬\t3573\n猿\t3574\n杞\t3575\n眩\t3576\n虔\t3577\n凋\t3578\n遁\t3579\n泾\t3580\n岔\t3581\n羟\t3582\n弛\t3583\n娄\t3584\n茸\t3585\n皓\t3586\n峙\t3587\n逅\t3588\n邂\t3589\n苇\t3590\n楹\t3591\n蹲\t3592\n拢\t3593\n甄\t3594\n鳃\t3595\n104\t3596\n邯\t3597\n捆\t3598\n勺\t3599\n450\t3600\n酉\t3601\n荚\t3602\n唑\t3603\n臻\t3604\n辗\t3605\n绰\t3606\n徊\t3607\n榨\t3608\n苛\t3609\n赦\t361
0\n盔\t3611\n壬\t3612\n恍\t3613\n缉\t3614\n2020\t3615\n熨\t3616\n7000\t3617\n澡\t3618\n桨\t3619\n匣\t3620\n兢\t3621\n106\t3622\n驭\t3623\nx1\t3624\n镍\t3625\n孰\t3626\n绮\t3627\n馏\t3628\n蝇\t3629\n佼\t3630\n鲸\t3631\n128\t3632\n哎\t3633\n裳\t3634\n蜕\t3635\n嚼\t3636\n嘻\t3637\nweb\t3638\n庇\t3639\n绢\t3640\n倩\t3641\n钵\t3642\nii\t3643\n恪\t3644\n帷\t3645\n莆\t3646\n柠\t3647\n藕\t3648\n砾\t3649\n115\t3650\n绊\t3651\n喙\t3652\n坂\t3653\n徘\t3654\n荀\t3655\n瞧\t3656\n蛾\t3657\n1925\t3658\n晦\t3659\nph\t3660\nmm\t3661\n铎\t3662\n107\t3663\n紊\t3664\n锚\t3665\n酪\t3666\n稷\t3667\n聋\t3668\n闵\t3669\n熹\t3670\n冕\t3671\n诫\t3672\n珑\t3673\n曦\t3674\n篷\t3675\n320\t3676\n迥\t3677\n蘖\t3678\n胤\t3679\n103\t3680\n檬\t3681\n瑾\t3682\n钳\t3683\n遏\t3684\n辄\t3685\n嬉\t3686\n隅\t3687\nps\t3688\n秃\t3689\n112\t3690\n帛\t3691\n聆\t3692\n芥\t3693\n诬\t3694\n1100\t3695\n挟\t3696\n宕\t3697\n2018\t3698\n鹊\t3699\n琶\t3700\n膛\t3701\nmv\t3702\n兀\t3703\ngb\t3704\n懿\t3705\n碾\t3706\n叮\t3707\n863\t3708\n蠕\t3709\n譬\t3710\n缮\t3711\n烽\t3712\n妍\t3713\n榕\t3714\n260\t3715\n1920\t3716\n邃\t3717\n焙\t3718\n倘\t3719\n210\t3720\n戌\t3721\n茹\t3722\n豚\t3723\n晾\t3724\n浒\t3725\n玺\t3726\n醚\t3727\n祐\t3728\n炽\t3729\nthis\t3730\n缪\t3731\n凛\t3732\n噩\t3733\n溅\t3734\n毋\t3735\n槛\t3736\nei\t3737\nare\t3738\n嫡\t3739\n蝠\t3740\n娴\t3741\n稣\t3742\n禀\t3743\n壑\t3744\n殆\t3745\n敖\t3746\ncm\t3747\nios\t3748\n倭\t3749\n挛\t3750\n侃\t3751\n蚌\t3752\n咀\t3753\n盎\t3754\n殉\t3755\n岑\t3756\n浚\t3757\n谬\t3758\n狡\t3759\n1924\t3760\n癸\t3761\n280\t3762\n逛\t3763\n耽\t3764\n俺\t3765\n璨\t3766\n巳\t3767\n茜\t3768\n郸\t3769\n蒴\t3770\n琵\t3771\nwe\t3772\n230\t3773\n叩\t3774\n泸\t3775\n塾\t3776\none\t3777\n稼\t3778\nreg\t3779\n侮\t3780\n锂\t3781\n曙\t3782\n3500\t3783\nup\t3784\n薰\t3785\n婿\t3786\n惶\t3787\n拭\t3788\n篱\t3789\n恬\t3790\n淌\t3791\n烙\t3792\n袜\t3793\n徵\t3794\n慷\t3795\n夭\t3796\n噶\t3797\n莘\t3798\n135\t3799\n鸳\t3800\n殡\t3801\n蚂\t3802\n1900\t3803\n憎\t3804\n喃\t3805\n佚\t3806\n龛\t3807\n潢\t3808\n烃\t3809\nat\t3810\n岱\t3811\n潺\t3812\n109\t3813\n衢\t3814\n璀\t3815\n5cm\t3816\n1400\t3817\n鹭\t3818\n揣\t3819\n痢\t3820\nknow\t3821\n厮\t3822\n氓
\t3823\n怠\t3824\nno\t3825\nnbsp\t3826\n痘\t3827\n硒\t3828\n镌\t3829\n乍\t3830\n咯\t3831\n惬\t3832\nnot\t3833\n桦\t3834\n骇\t3835\n枉\t3836\n蜗\t3837\n睾\t3838\n淇\t3839\n耘\t3840\n娓\t3841\n弼\t3842\n鳌\t3843\n嗅\t3844\ngdp\t3845\n狙\t3846\n箫\t3847\n朦\t3848\n椰\t3849\n胥\t3850\n丐\t3851\n陂\t3852\n唾\t3853\n鳄\t3854\n柚\t3855\n谒\t3856\njournal\t3857\n戍\t3858\n1912\t3859\n刁\t3860\n鸾\t3861\n缭\t3862\n骸\t3863\n铣\t3864\n酋\t3865\n蝎\t3866\n掏\t3867\n耦\t3868\n怯\t3869\n娲\t3870\n拇\t3871\n汹\t3872\n胧\t3873\n疤\t3874\n118\t3875\n硼\t3876\n恕\t3877\n哗\t3878\n眶\t3879\n痫\t3880\n凳\t3881\n鲨\t3882\n擢\t3883\n歹\t3884\n樵\t3885\n瘠\t3886\napp\t3887\n茗\t3888\n翟\t3889\n黯\t3890\n蜒\t3891\n壹\t3892\n殇\t3893\n伶\t3894\n辙\t3895\nan\t3896\n瑕\t3897\n町\t3898\n孚\t3899\n痉\t3900\n铵\t3901\n搁\t3902\n漾\t3903\n戟\t3904\n镰\t3905\n鸯\t3906\n猩\t3907\n190\t3908\n蔷\t3909\n缤\t3910\n叭\t3911\n垩\t3912\n113\t3913\n曳\t3914\nusb\t3915\n奚\t3916\n毓\t3917\nibm\t3918\n颓\t3919\n汐\t3920\n靴\t3921\nchina\t3922\n傣\t3923\n尬\t3924\n濮\t3925\n赂\t3926\n媛\t3927\n懦\t3928\n扦\t3929\n111\t3930\n韬\t3931\nlike\t3932\n戳\t3933\njava\t3934\n雯\t3935\n114\t3936\n蜿\t3937\n116\t3938\n1923\t3939\n笺\t3940\n裘\t3941\n尴\t3942\n侗\t3943\nmba\t3944\n3g\t3945\n钨\t3946\n1919\t3947\n苓\t3948\n1922\t3949\n寰\t3950\n蛊\t3951\n扳\t3952\n搓\t3953\n涟\t3954\n睫\t3955\n淬\t3956\n5mm\t3957\n123\t3958\nve\t3959\n121\t3960\n赈\t3961\n恺\t3962\n瞎\t3963\n蝙\t3964\n1921\t3965\n枸\t3966\n萱\t3967\n颚\t3968\n憩\t3969\n秽\t3970\n秸\t3971\n拷\t3972\n阑\t3973\n貂\t3974\n粱\t3975\n煲\t3976\n隘\t3977\n暧\t3978\n惕\t3979\n沽\t3980\ntime\t3981\n菠\t3982\n1911\t3983\n趟\t3984\n磋\t3985\n偕\t3986\n涕\t3987\n邸\t3988\nso\t3989\n踞\t3990\n惫\t3991\n122\t3992\n阪\t3993\n鞠\t3994\n饺\t3995\n汞\t3996\n颍\t3997\n氰\t3998\n屹\t3999\n蛟\t4000\n跻\t4001\n哟\t4002\nhave\t4003\n126\t4004\n臼\t4005\n熄\t4006\n绛\t4007\n弩\t4008\n褪\t4009\n117\t4010\n渎\t4011\n亟\t4012\n匮\t4013\n撇\t4014\ninternet\t4015\n霆\t4016\n攒\t4017\n舵\t4018\n扛\t4019\n彤\t4020\nnba\t4021\n蛤\t4022\n婢\t4023\n偃\t4024\n胫\t4025\n姥\t4026\n睑\t4027\nlove\t4028\niso\t4029\npk\t4030\n诙\t4031\nwhat\t4032\n诲\t4033\n
锭\t4034\n悚\t4035\n扒\t4036\n洱\t4037\n劾\t4038\n惰\t4039\n篡\t4040\n瓯\t4041\n徇\t4042\n铀\t4043\n骋\t4044\nflash\t4045\n1918\t4046\nout\t4047\n筷\t4048\n渚\t4049\n踵\t4050\n俨\t4051\nceo\t4052\n榻\t4053\n糜\t4054\n捻\t4055\n釜\t4056\n哩\t4057\n萤\t4058\n270\t4059\n蛹\t4060\n隽\t4061\n垮\t4062\n鸠\t4063\n鸥\t4064\n漕\t4065\n瑙\t4066\n礴\t4067\n憧\t4068\n殴\t4069\n潼\t4070\n悯\t4071\n砺\t4072\n拽\t4073\n钗\t4074\nct\t4075\n酣\t4076\n镂\t4077\nmp3\t4078\n膺\t4079\n楞\t4080\n竺\t4081\n迂\t4082\n嫣\t4083\n忱\t4084\ncad\t4085\n哄\t4086\n疣\t4087\n鹦\t4088\n1700\t4089\n枭\t4090\n憬\t4091\n疱\t4092\nwill\t4093\n婪\t4094\n沮\t4095\n1914\t4096\n怅\t4097\n119\t4098\n筱\t4099\n扉\t4100\n瞰\t4101\nlinux\t4102\n旌\t4103\n蔑\t4104\n铠\t4105\n瀛\t4106\nvip\t4107\n琥\t4108\n750\t4109\n127\t4110\n懵\t4111\n谴\t4112\n捍\t4113\n蟾\t4114\n漩\t4115\n1913\t4116\n拣\t4117\n汴\t4118\nuniversity\t4119\n刨\t4120\n叱\t4121\n曜\t4122\n妞\t4123\n澎\t4124\n镑\t4125\n翎\t4126\n瞪\t4127\nsh\t4128\n倔\t4129\n芍\t4130\n璞\t4131\n瓮\t4132\n驹\t4133\n芷\t4134\n寐\t4135\n擂\t4136\n丕\t4137\n蟠\t4138\n诃\t4139\n悸\t4140\n亘\t4141\n溴\t4142\n宸\t4143\n廿\t4144\n恃\t4145\n棣\t4146\n1917\t4147\n荼\t4148\n筠\t4149\n羚\t4150\n慑\t4151\n唉\t4152\n纣\t4153\n麼\t4154\n蹦\t4155\n锄\t4156\n145\t4157\ninternational\t4158\n124\t4159\n淆\t4160\n甙\t4161\n132\t4162\n蚜\t4163\n椿\t4164\n禺\t4165\n绯\t4166\n冗\t4167\n168\t4168\n葩\t4169\n厝\t4170\n媲\t4171\n蒿\t4172\n痪\t4173\n650\t4174\n菁\t4175\n炊\t4176\nwifi\t4177\n俑\t4178\nnew\t4179\n讥\t4180\nmin\t4181\n桀\t4182\n祺\t4183\n129\t4184\n吡\t4185\n迩\t4186\ndo\t4187\njohn\t4188\n箔\t4189\n皿\t4190\n缎\t4191\n萦\t4192\n剃\t4193\n霓\t4194\n酝\t4195\nmg\t4196\n诰\t4197\n茉\t4198\njust\t4199\nget\t4200\n飙\t4201\n湍\t4202\n蜥\t4203\n箕\t4204\n蘸\t4205\n550\t4206\n4500\t4207\n柬\t4208\n韭\t4209\n溥\t4210\nbut\t4211\n熠\t4212\n鹉\t4213\n咐\t4214\n剌\t4215\n138\t4216\n悖\t4217\n瞿\t4218\n槟\t4219\n娩\t4220\n闾\t4221\npvc\t4222\n遴\t4223\n咫\t4224\n20000\t4225\n孺\t4226\n彷\t4227\n茬\t4228\n211\t4229\n蓟\t4230\nli\t4231\nif\t4232\n憨\t4233\n袅\t4234\n佬\t4235\n炯\t4236\nerp\t4237\n1910\t4238\n啶\t4239\n昙\t4240\n蚩\t4241\n136\t424
2\n痔\t4243\n蕨\t4244\n瓢\t4245\n夔\t4246\n毡\t4247\n赃\t4248\n鳖\t4249\n沅\t4250\nwang\t4251\ngo\t4252\n饷\t4253\n165\t4254\n臧\t4255\n掖\t4256\n褚\t4257\n羹\t4258\nic\t4259\n勐\t4260\ntv\t4261\n谚\t4262\n畦\t4263\n眨\t4264\n贻\t4265\n攸\t4266\n涎\t4267\n弑\t4268\n咎\t4269\n铂\t4270\n瑛\t4271\n1905\t4272\n矗\t4273\n虱\t4274\nmore\t4275\n133\t4276\n秤\t4277\n谟\t4278\n漱\t4279\n俸\t4280\n夙\t4281\n1915\t4282\nbr\t4283\ngame\t4284\n雉\t4285\n螨\t4286\n恣\t4287\n斛\t4288\n175\t4289\n谙\t4290\n隍\t4291\n131\t4292\n奄\t4293\n480\t4294\nyy\t4295\n1916\t4296\n壕\t4297\n髻\t4298\n155\t4299\n鄱\t4300\n嘶\t4301\n磕\t4302\n濡\t4303\n赘\t4304\n荞\t4305\n讹\t4306\n猕\t4307\n痞\t4308\n鬓\t4309\n铮\t4310\n腱\t4311\n幡\t4312\n榭\t4313\n爻\t4314\n5m\t4315\n涓\t4316\n晤\t4317\n咕\t4318\n惭\t4319\n钼\t4320\n匕\t4321\nok\t4322\n撮\t4323\n庾\t4324\n笠\t4325\n窘\t4326\n癖\t4327\n365\t4328\n垛\t4329\n窒\t4330\n畲\t4331\n甬\t4332\n彗\t4333\n缨\t4334\n湮\t4335\n寮\t4336\net\t4337\n衅\t4338\n谪\t4339\n156\t4340\n绫\t4341\n9000\t4342\n152\t4343\n兖\t4344\n疽\t4345\n磐\t4346\n380\t4347\n菏\t4348\n沱\t4349\n骁\t4350\n嫔\t4351\n盂\t4352\n娆\t4353\n钊\t4354\n蟒\t4355\n忏\t4356\n谤\t4357\n148\t4358\n137\t4359\nserver\t4360\n2200\t4361\n晟\t4362\nng\t4363\n15000\t4364\ngoogle\t4365\n痈\t4366\n耆\t4367\n谧\t4368\n簪\t4369\n134\t4370\nml\t4371\n疟\t4372\n扈\t4373\n脍\t4374\n琛\t4375\n咋\t4376\n胄\t4377\n142\t4378\n144\t4379\n葆\t4380\n轶\t4381\n桢\t4382\n973\t4383\n攘\t4384\nwas\t4385\n邕\t4386\n拧\t4387\n茯\t4388\n205\t4389\n摒\t4390\n1908\t4391\nintel\t4392\n傀\t4393\n祚\t4394\n嘟\t4395\n帼\t4396\n1906\t4397\nwto\t4398\n筵\t4399\nwhen\t4400\n馒\t4401\n疚\t4402\n璇\t4403\n砧\t4404\nmerge\t4405\n槃\t4406\nmicrosoft\t4407\n犷\t4408\nexe\t4409\n腓\t4410\n煜\t4411\n弋\t4412\n疸\t4413\n濑\t4414\n310\t4415\n201\t4416\n麝\t4417\n嗟\t4418\n忻\t4419\n愣\t4420\nfacebook\t4421\n斓\t4422\n吝\t4423\n咧\t4424\n矾\t4425\n愫\t4426\n151\t4427\n158\t4428\n漪\t4429\n珂\t4430\nrna\t4431\n逞\t4432\n146\t4433\n206\t4434\n糠\t4435\n璐\t4436\n藓\t4437\n昕\t4438\n妩\t4439\n屌\t4440\n疵\t4441\nexcel\t4442\n嘘\t4443\nhe\t4444\nplc\t4445\n袂\t4446\n2400\t4447\n139\t444
8\n稃\t4449\n剁\t4450\n侏\t4451\n掐\t4452\n猾\t4453\n匍\t4454\n2800\t4455\n坳\t4456\n黜\t4457\n邺\t4458\n闫\t4459\n猥\t4460\n湃\t4461\n斟\t4462\n癣\t4463\n1904\t4464\n185\t4465\n匐\t4466\n粳\t4467\nsql\t4468\n330\t4469\n141\t4470\ncp\t4471\n1909\t4472\n叟\t4473\n俾\t4474\n儡\t4475\n莒\t4476\n12000\t4477\n骥\t4478\n跤\t4479\n耙\t4480\n矜\t4481\n翱\t4482\nzhang\t4483\nms\t4484\n赡\t4485\n1907\t4486\n浣\t4487\n栾\t4488\n拈\t4489\nscience\t4490\n420\t4491\n螟\t4492\naaa\t4493\n桧\t4494\n坍\t4495\n睢\t4496\n趴\t4497\nid\t4498\n伎\t4499\n2100\t4500\n婺\t4501\n霹\t4502\n痊\t4503\n膊\t4504\n眯\t4505\n豌\t4506\n202\t4507\n驮\t4508\n骈\t4509\n850\t4510\niii\t4511\n嶂\t4512\n淞\t4513\n143\t4514\n腮\t4515\n髅\t4516\n炀\t4517\n啄\t4518\n亳\t4519\n麾\t4520\n147\t4521\n筐\t4522\n叨\t4523\n徨\t4524\n跷\t4525\nac\t4526\n楂\t4527\n郴\t4528\n绶\t4529\nhp\t4530\n羔\t4531\nxp\t4532\nieee\t4533\n咤\t4534\nnow\t4535\nthere\t4536\n靳\t4537\nthey\t4538\n屎\t4539\n雳\t4540\n瘘\t4541\n蹬\t4542\n2300\t4543\n惮\t4544\nacid\t4545\n涪\t4546\n阖\t4547\n煽\t4548\n蹊\t4549\n225\t4550\n栉\t4551\n153\t4552\n俟\t4553\n涸\t4554\n辫\t4555\n锢\t4556\n佟\t4557\n176\t4558\n皎\t4559\ncctv\t4560\n啮\t4561\n钰\t4562\n螂\t4563\ndc\t4564\n啪\t4565\n绷\t4566\n204\t4567\n闰\t4568\n畿\t4569\n2d\t4570\n覃\t4571\n2600\t4572\n惘\t4573\n贰\t4574\n154\t4575\n碉\t4576\n卞\t4577\n酐\t4578\n枷\t4579\n葺\t4580\n芪\t4581\n207\t4582\n蕙\t4583\n192\t4584\n咚\t4585\n籁\t4586\npro\t4587\n钴\t4588\n162\t4589\n冽\t4590\n玮\t4591\n骷\t4592\n啃\t4593\n焖\t4594\n猝\t4595\n榈\t4596\n滁\t4597\n拮\t4598\n跗\t4599\n讷\t4600\n蝗\t4601\n208\t4602\n蠡\t4603\nworld\t4604\n烨\t4605\nbeen\t4606\nhd\t4607\ngmp\t4608\n256\t4609\n脯\t4610\n歙\t4611\n泠\t4612\n刍\t4613\n掳\t4614\npe\t4615\nhis\t4616\n僳\t4617\n340\t4618\n1902\t4619\n螯\t4620\n胳\t4621\n髦\t4622\n粽\t4623\n戾\t4624\n祜\t4625\n178\t4626\n186\t4627\n岷\t4628\n懋\t4629\n馥\t4630\n昵\t4631\n踊\t4632\n湄\t4633\n郢\t4634\n斡\t4635\n迢\t4636\nce\t4637\nphotoshop\t4638\n嗪\t4639\nabout\t4640\n裨\t4641\n1903\t4642\n羧\t4643\n膈\t4644\n翊\t4645\nlcd\t4646\n鲫\t4647\n163\t4648\n螃\t4649\n沓\t4650\n疝\t4651\n笈\t4652\nktv\t4653\n榔\t
4654\n157\t4655\n诘\t4656\nautocad\t4657\n195\t4658\n颉\t4659\n蛀\t4660\n鸢\t4661\n焯\t4662\n囧\t4663\nmake\t4664\n梆\t4665\nnpc\t4666\n潞\t4667\n戛\t4668\nsee\t4669\nsystem\t4670\n149\t4671\n佗\t4672\n艮\t4673\nchinese\t4674\nlet\t4675\n霾\t4676\n鬟\t4677\n215\t4678\nnet\t4679\n玖\t4680\n1898\t4681\n腭\t4682\n喔\t4683\n172\t4684\n罔\t4685\n佥\t4686\n粑\t4687\nvisual\t4688\n舷\t4689\n泯\t4690\nm2\t4691\n198\t4692\nhas\t4693\n203\t4694\nsd\t4695\n泓\t4696\n炜\t4697\n谗\t4698\n烬\t4699\n跆\t4700\nrpg\t4701\n傩\t4702\n飓\t4703\n浔\t4704\n钤\t4705\n惚\t4706\n胭\t4707\n踝\t4708\n镯\t4709\nep\t4710\n221\t4711\n臆\t4712\n196\t4713\n蜚\t4714\n揪\t4715\n觞\t4716\n皈\t4717\ndj\t4718\n183\t4719\napi\t4720\n迸\t4721\n匝\t4722\n筏\t4723\n167\t4724\n醴\t4725\n黍\t4726\n洮\t4727\n滦\t4728\n侬\t4729\n甾\t4730\n290\t4731\nway\t4732\n3200\t4733\n188\t4734\ndiy\t4735\n2cm\t4736\ncom\t4737\n澧\t4738\n阈\t4739\n袱\t4740\n迤\t4741\n衮\t4742\n166\t4743\n濂\t4744\n娑\t4745\n砥\t4746\n砷\t4747\n铨\t4748\n缜\t4749\n箴\t4750\n30000\t4751\n逵\t4752\n猖\t4753\n159\t4754\n蛰\t4755\n箍\t4756\n侥\t4757\n2mm\t4758\n搂\t4759\n纨\t4760\n裱\t4761\n枋\t4762\n嫦\t4763\n敝\t4764\n挝\t4765\n贲\t4766\n潦\t4767\n235\t4768\n撩\t4769\n惺\t4770\n铰\t4771\nf1\t4772\n忒\t4773\n咆\t4774\n哆\t4775\n莅\t4776\n164\t4777\n炕\t4778\n抨\t4779\n涿\t4780\n龈\t4781\n猷\t4782\ngot\t4783\nb1\t4784\n182\t4785\n2m\t4786\n212\t4787\n遒\t4788\n缥\t4789\nvs\t4790\n捂\t4791\n俐\t4792\nla\t4793\n瘙\t4794\n搐\t4795\n牍\t4796\nisbn\t4797\n馍\t4798\nour\t4799\n痿\t4800\n袤\t4801\n峥\t4802\n184\t4803\n栎\t4804\n罹\t4805\n燎\t4806\n喵\t4807\n209\t4808\n1901\t4809\n璜\t4810\n飒\t4811\n蔼\t4812\n珞\t4813\n澹\t4814\n奘\t4815\n岖\t4816\n芡\t4817\n簸\t4818\n杵\t4819\n甥\t4820\n骊\t4821\n216\t4822\n悴\t4823\n173\t4824\n惆\t4825\n5mg\t4826\n殃\t4827\n1895\t4828\n呃\t4829\n161\t4830\n5g\t4831\n祗\t4832\n3600\t4833\n髋\t4834\n169\t4835\nliu\t4836\nwho\t4837\n幔\t4838\ndown\t4839\n榛\t4840\n犊\t4841\n霁\t4842\n芮\t4843\n520\t4844\n牒\t4845\n佰\t4846\nher\t4847\n狈\t4848\n薨\t4849\nco\t4850\n吩\t4851\n鳝\t4852\n嵘\t4853\n濠\t4854\n呤\t4855\n纫\t4856\n3mm\t4857\n檄\t4858\n214\t48
59\n浜\t4860\n370\t4861\n189\t4862\n缙\t4863\n缢\t4864\n煦\t4865\n蓦\t4866\n揖\t4867\n拴\t4868\n缈\t4869\n218\t4870\n褥\t4871\n铿\t4872\n312\t4873\n燮\t4874\nlife\t4875\n锵\t4876\n174\t4877\n荥\t4878\n187\t4879\n忿\t4880\n4s\t4881\n僖\t4882\n婶\t4883\n171\t4884\nchen\t4885\n芾\t4886\n镐\t4887\n痣\t4888\nresearch\t4889\n眈\t4890\n460\t4891\n祇\t4892\n邈\t4893\n翳\t4894\n碣\t4895\n遨\t4896\n鳗\t4897\n诂\t4898\nnever\t4899\n岫\t4900\n焘\t4901\n3cm\t4902\nco2\t4903\n茱\t4904\ntcp\t4905\nonly\t4906\n255\t4907\ngsm\t4908\nsay\t4909\n洵\t4910\n晁\t4911\nright\t4912\n噢\t4913\nshe\t4914\nover\t4915\n偈\t4916\n旖\t4917\ndavid\t4918\n181\t4919\n232\t4920\n蚓\t4921\n柘\t4922\n珐\t4923\n遽\t4924\n岌\t4925\n桅\t4926\n213\t4927\n唔\t4928\n222\t4929\n鄞\t4930\n雹\t4931\nmichael\t4932\n驸\t4933\n苻\t4934\n恻\t4935\n鬃\t4936\n玑\t4937\n磬\t4938\n崂\t4939\n304\t4940\n祉\t4941\n荤\t4942\n淼\t4943\n560\t4944\n264\t4945\n肱\t4946\n呗\t4947\npp\t4948\nb2\t4949\n骡\t4950\n囱\t4951\n10cm\t4952\n佞\t4953\nback\t4954\n1890\t4955\n226\t4956\n耒\t4957\n伫\t4958\n嚷\t4959\n粼\t4960\naa\t4961\n歆\t4962\n佃\t4963\n旎\t4964\n惋\t4965\n殁\t4966\n杳\t4967\ntheir\t4968\n阡\t4969\nred\t4970\n畈\t4971\n蔺\t4972\nos\t4973\n177\t4974\nmap\t4975\n巽\t4976\ncbd\t4977\n昱\t4978\n啰\t4979\n吠\t4980\n179\t4981\n199\t4982\n嗔\t4983\n涮\t4984\n238\t4985\n奂\t4986\n1896\t4987\n撷\t4988\n301\t4989\n袒\t4990\n720\t4991\n爰\t4992\n捶\t4993\n赭\t4994\n蜓\t4995\n姗\t4996\n蔻\t4997\n垠\t4998\n193\t4999\ngis\t5000\n噻\t5001\nab\t5002\n峒\t5003\n皙\t5004\nwant\t5005\n245\t5006\n憔\t5007\n帚\t5008\noffice\t5009\nxx\t5010\n杷\t5011\n蟆\t5012\niso14001\t5013\n觐\t5014\n钒\t5015\n岙\t5016\n2700\t5017\n1899\t5018\n栀\t5019\n幄\t5020\n啧\t5021\n癜\t5022\n擀\t5023\n轲\t5024\n铆\t5025\nthem\t5026\n讴\t5027\n樽\t5028\n霏\t5029\nmtv\t5030\n肮\t5031\n枳\t5032\n骞\t5033\n诧\t5034\n瘢\t5035\n虬\t5036\n拗\t5037\nplay\t5038\n219\t5039\n蕲\t5040\n316\t5041\n茁\t5042\n唆\t5043\ntechnology\t5044\nword\t5045\n沭\t5046\n毂\t5047\n蛎\t5048\n芊\t5049\n銮\t5050\n瞥\t5051\n呱\t5052\n223\t5053\n羿\t5054\n吒\t5055\n傥\t5056\n髯\t5057\n濯\t5058\n蜻\t5059\n皴\t5060\n802\t5061\
n430\t5062\n邳\t5063\n燧\t5064\n1860\t5065\n獭\t5066\n垭\t5067\n祟\t5068\n217\t5069\n虢\t5070\nhow\t5071\n枇\t5072\nabs\t5073\n鹫\t5074\n194\t5075\n颞\t5076\n1894\t5077\n333\t5078\n皑\t5079\n脲\t5080\n197\t5081\n舔\t5082\n魇\t5083\n霭\t5084\norg\t5085\n坨\t5086\n郧\t5087\nbaby\t5088\n椽\t5089\n舫\t5090\n228\t5091\noh\t5092\n305\t5093\n荠\t5094\n琊\t5095\n溟\t5096\n1897\t5097\n煨\t5098\n265\t5099\n谯\t5100\n粲\t5101\n罂\t5102\ngonna\t5103\n屉\t5104\n佯\t5105\n郦\t5106\n亵\t5107\n诽\t5108\n芩\t5109\n嵇\t5110\n蚤\t5111\n哒\t5112\n315\t5113\n啬\t5114\nain\t5115\n嚎\t5116\n玥\t5117\ntwitter\t5118\n191\t5119\n隼\t5120\n唢\t5121\n铛\t5122\ncause\t5123\n壅\t5124\n藜\t5125\nwon\t5126\n吱\t5127\nrom\t5128\n楣\t5129\n璟\t5130\n锆\t5131\n憋\t5132\n罡\t5133\nal\t5134\n咙\t5135\n1850\t5136\n腈\t5137\noslash\t5138\njob\t5139\n233\t5140\n廪\t5141\n堑\t5142\ninto\t5143\n诩\t5144\nb2c\t5145\n溧\t5146\n鹑\t5147\n讫\t5148\n哌\t5149\n铢\t5150\n蜴\t5151\n1ml\t5152\n稹\t5153\n噜\t5154\n镉\t5155\n224\t5156\n愕\t5157\n桁\t5158\n晔\t5159\n琰\t5160\n陲\t5161\n疙\t5162\n667\t5163\n崮\t5164\nneed\t5165\n540\t5166\n8mm\t5167\nhtml\t5168\n颛\t5169\nthrough\t5170\nasp\t5171\n桡\t5172\n钜\t5173\n580\t5174\ntake\t5175\n谑\t5176\n仞\t5177\n咦\t5178\n珪\t5179\n揍\t5180\n鱿\t5181\n阉\t5182\n3800\t5183\n瘩\t5184\n410\t5185\n槌\t5186\n滓\t5187\n茴\t5188\ntft\t5189\n泮\t5190\n涣\t5191\natm\t5192\npci\t5193\n柞\t5194\n渥\t5195\n飨\t5196\n孪\t5197\n沔\t5198\n谲\t5199\n桉\t5200\nvcd\t5201\n慵\t5202\n318\t5203\noem\t5204\nother\t5205\n俚\t5206\npaul\t5207\n跖\t5208\n纭\t5209\n恙\t5210\nwhich\t5211\nfi\t5212\n佘\t5213\n236\t5214\n荃\t5215\n咄\t5216\n鞅\t5217\n叁\t5218\njames\t5219\n恽\t5220\nm3\t5221\n253\t5222\n炔\t5223\n萘\t5224\n钺\t5225\n6500\t5226\n1880\t5227\nccd\t5228\n楫\t5229\n塬\t5230\n钡\t5231\n琮\t5232\n苄\t5233\n950\t5234\n325\t5235\n275\t5236\n1g\t5237\nday\t5238\no2o\t5239\n960\t5240\nmusic\t5241\n骰\t5242\n偎\t5243\n粕\t5244\namd\t5245\n咔\t5246\n鹄\t5247\n瓒\t5248\n阆\t5249\n捅\t5250\n嬴\t5251\nadobe\t5252\n箨\t5253\nname\t5254\n390\t5255\n680\t5256\n640\t5257\n氦\t5258\n倜\t5259\nb2b\t5260\n觊\t5261\nxml\t5262\n婕\
t5263\n229\t5264\njar\t5265\n锑\t5266\n撬\t5267\nchem\t5268\n掰\t5269\n嗷\t5270\n5500\t5271\n1cm\t5272\n饯\t5273\n蓓\t5274\n234\t5275\ngood\t5276\n鼬\t5277\nspa\t5278\n佤\t5279\n5a\t5280\nss\t5281\n蚯\t5282\n挞\t5283\n臾\t5284\nwhere\t5285\natp\t5286\n227\t5287\n嶙\t5288\n幂\t5289\n饬\t5290\n闱\t5291\nlive\t5292\nhigh\t5293\n煅\t5294\n嘧\t5295\n1mm\t5296\n蹭\t5297\nsun\t5298\nabc\t5299\n瞭\t5300\n顼\t5301\n箐\t5302\nhere\t5303\n徉\t5304\n231\t5305\n骜\t5306\n302\t5307\n嗨\t5308\n邛\t5309\n庑\t5310\n柩\t5311\n饕\t5312\n俎\t5313\n4mm\t5314\n15g\t5315\n嘌\t5316\n50000\t5317\n颏\t5318\ncssci\t5319\n椁\t5320\n崧\t5321\n锉\t5322\n籼\t5323\n1870\t5324\n狞\t5325\n弁\t5326\n6mm\t5327\n羯\t5328\n踹\t5329\n糅\t5330\n248\t5331\n1840\t5332\n砼\t5333\n263\t5334\n嫖\t5335\ntmp\t5336\n252\t5337\nmac\t5338\n285\t5339\n豉\t5340\n啉\t5341\n榷\t5342\n嘈\t5343\nen\t5344\n俪\t5345\n痂\t5346\n308\t5347\ninf\t5348\n630\t5349\n儋\t5350\n4a\t5351\n芎\t5352\nai\t5353\nman\t5354\n繇\t5355\n1889\t5356\nbt\t5357\n239\t5358\nmeta\t5359\n蹇\t5360\n242\t5361\n530\t5362\n诋\t5363\nbbc\t5364\n煸\t5365\n峋\t5366\n淙\t5367\n324\t5368\nmanagement\t5369\n1885\t5370\n泱\t5371\n徜\t5372\ncrm\t5373\n4cm\t5374\nfree\t5375\n汩\t5376\n纥\t5377\n246\t5378\n蝼\t5379\n囿\t5380\nuv\t5381\n暹\t5382\n谆\t5383\n蹂\t5384\n鞣\t5385\n3c\t5386\nmr\t5387\n螳\t5388\ncs\t5389\n馗\t5390\n幺\t5391\n鞑\t5392\n贽\t5393\n268\t5394\nistp\t5395\n243\t5396\n漯\t5397\n237\t5398\n牦\t5399\n淖\t5400\nengineering\t5401\ndr\t5402\n囤\t5403\nthan\t5404\ngprs\t5405\nsp\t5406\n440\t5407\n晗\t5408\n1888\t5409\n258\t5410\n忡\t5411\n懊\t5412\n呋\t5413\n埂\t5414\npcb\t5415\n307\t5416\nfirst\t5417\n321\t5418\nrobert\t5419\n鲈\t5420\nsup2\t5421\n阕\t5422\n3m\t5423\n幌\t5424\ncg\t5425\n303\t5426\n鳅\t5427\n勰\t5428\nfind\t5429\n8cm\t5430\n萸\t5431\n剽\t5432\n蚝\t5433\nwi\t5434\n绔\t5435\npdf\t5436\n1250\t5437\n262\t5438\nphp\t5439\n辇\t5440\n10mg\t5441\nuse\t5442\nie\t5443\n麋\t5444\n1884\t5445\n陟\t5446\n宥\t5447\noracle\t5448\n锺\t5449\n喽\t5450\n620\t5451\n1892\t5452\n1893\t5453\n淅\t5454\n熵\t5455\n荨\t5456\n247\t5457\n忤\t5458\namerican\t
5459\n266\t5460\nseo\t5461\n轭\t5462\n嗦\t5463\n荪\t5464\nalso\t5465\n骠\t5466\n鹘\t5467\np2p\t5468\n4g\t5469\n聿\t5470\n绾\t5471\n诶\t5472\n985\t5473\n怆\t5474\n244\t5475\n喋\t5476\n恸\t5477\n湟\t5478\n睨\t5479\n翦\t5480\nfe\t5481\n蜈\t5482\n1875\t5483\n褂\t5484\n娼\t5485\n1886\t5486\n羸\t5487\n觎\t5488\n470\t5489\n瘁\t5490\n306\t5491\n蚣\t5492\n呻\t5493\n241\t5494\n1882\t5495\n昶\t5496\n谶\t5497\n猬\t5498\n荻\t5499\nschool\t5500\n286\t5501\n酗\t5502\nunit\t5503\n肄\t5504\n躏\t5505\n膑\t5506\n288\t5507\n2g\t5508\n嗡\t5509\n273\t5510\niv\t5511\ncam\t5512\n510\t5513\n庠\t5514\n崽\t5515\n254\t5516\n搪\t5517\npcr\t5518\n胯\t5519\n309\t5520\n铉\t5521\n峤\t5522\n郯\t5523\n藐\t5524\n舂\t5525\ncome\t5526\n蓼\t5527\nsome\t5528\n薏\t5529\n窿\t5530\n羣\t5531\n氽\t5532\n徕\t5533\n冼\t5534\nrs\t5535\n阂\t5536\n欤\t5537\n殒\t5538\n窈\t5539\n脘\t5540\n780\t5541\n篝\t5542\nyang\t5543\n1861\t5544\n3300\t5545\niso9000\t5546\n麸\t5547\n砭\t5548\nmax\t5549\n砰\t5550\n骶\t5551\n豺\t5552\nlg\t5553\n窠\t5554\n獒\t5555\nthink\t5556\n腴\t5557\n苕\t5558\nany\t5559\nits\t5560\n缇\t5561\n骅\t5562\n劭\t5563\ncollege\t5564\n卅\t5565\nups\t5566\n揆\t5567\n垅\t5568\nna\t5569\n6cm\t5570\n琏\t5571\n镗\t5572\n苜\t5573\n胛\t5574\n1881\t5575\nblack\t5576\n珏\t5577\n吮\t5578\n抠\t5579\n搔\t5580\n276\t5581\nrock\t5582\n251\t5583\n槎\t5584\n4200\t5585\n323\t5586\n掣\t5587\npet\t5588\n1887\t5589\nap\t5590\n琨\t5591\n餮\t5592\n375\t5593\n舛\t5594\ngive\t5595\nsi\t5596\n痤\t5597\nus\t5598\n311\t5599\n278\t5600\n埭\t5601\nenglish\t5602\npeter\t5603\n1891\t5604\n820\t5605\n胪\t5606\n喹\t5607\n妲\t5608\n婀\t5609\n帙\t5610\n10g\t5611\noa\t5612\n7500\t5613\n箩\t5614\n灏\t5615\n霎\t5616\nlogo\t5617\n袄\t5618\ndsp\t5619\nbl\t5620\n镭\t5621\n蓿\t5622\npower\t5623\nlong\t5624\n墉\t5625\ntoo\t5626\n嵊\t5627\n1862\t5628\ngirl\t5629\n堇\t5630\nking\t5631\n蟋\t5632\n610\t5633\n叽\t5634\n249\t5635\n钎\t5636\n30cm\t5637\nfm\t5638\n録\t5639\ngroup\t5640\n1883\t5641\n郓\t5642\n瘴\t5643\nvol\t5644\n丶\t5645\n呦\t5646\n邬\t5647\n頫\t5648\n272\t5649\n馁\t5650\nhiv\t5651\n鄢\t5652\n257\t5653\n1876\t5654\nordm\t5655\n蛭\t5656\n322\t5657
\n愍\t5658\n锲\t5659\n槿\t5660\n珈\t5661\nbest\t5662\n4800\t5663\nmri\t5664\n1080\t5665\nfda\t5666\n10mm\t5667\n261\t5668\nnt\t5669\n660\t5670\nsuper\t5671\n1m\t5672\ncenter\t5673\nui\t5674\n335\t5675\n蜃\t5676\n298\t5677\n拎\t5678\n鎏\t5679\n裟\t5680\n沏\t5681\nnp\t5682\n螭\t5683\n7mm\t5684\n觑\t5685\n墒\t5686\n捺\t5687\n轸\t5688\nmicro\t5689\n榫\t5690\nbased\t5691\n319\t5692\n怔\t5693\nram\t5694\n618\t5695\n昀\t5696\neven\t5697\n泷\t5698\n1864\t5699\nca\t5700\n凫\t5701\n唠\t5702\n狰\t5703\n鲛\t5704\n氐\t5705\n呛\t5706\n绀\t5707\n碛\t5708\n茏\t5709\n盅\t5710\n蟀\t5711\n洙\t5712\noff\t5713\n訇\t5714\n蠹\t5715\nauml\t5716\ndos\t5717\n20cm\t5718\n267\t5719\n棂\t5720\n18000\t5721\n蚴\t5722\n篾\t5723\ntwo\t5724\n靛\t5725\n暄\t5726\nshow\t5727\n1868\t5728\n泞\t5729\ncdma\t5730\nmark\t5731\nvc\t5732\n洄\t5733\n赓\t5734\n麽\t5735\n25000\t5736\n篓\t5737\n孑\t5738\n860\t5739\n烩\t5740\n980\t5741\ndesign\t5742\n颢\t5743\n钣\t5744\nvar\t5745\n髂\t5746\n蹴\t5747\nwanna\t5748\n筮\t5749\n蝌\t5750\n醮\t5751\nhome\t5752\n菖\t5753\nfun\t5754\ncmos\t5755\n獗\t5756\nfriends\t5757\nbusiness\t5758\n岘\t5759\n570\t5760\n鼐\t5761\n1865\t5762\n姣\t5763\nnational\t5764\n1874\t5765\n蟑\t5766\n袈\t5767\n葶\t5768\n掬\t5769\nmost\t5770\nvga\t5771\nemba\t5772\n躇\t5773\n30g\t5774\n鹌\t5775\ncity\t5776\n踌\t5777\n282\t5778\n钹\t5779\n蚪\t5780\n颧\t5781\n001\t5782\n13000\t5783\n鹳\t5784\n274\t5785\nkm\t5786\n345\t5787\n1050\t5788\nstop\t5789\n328\t5790\nthen\t5791\n鲲\t5792\n驷\t5793\n潴\t5794\n295\t5795\n386\t5796\n焱\t5797\n稔\t5798\n悌\t5799\nmpeg\t5800\nst\t5801\nsuv\t5802\nvista\t5803\na1\t5804\nvi\t5805\n283\t5806\nhelp\t5807\nbasic\t5808\n唏\t5809\n11000\t5810\n苒\t5811\n蹙\t5812\nhouse\t5813\nheart\t5814\nouml\t5815\n281\t5816\n氩\t5817\nbug\t5818\nmobile\t5819\n宓\t5820\nservice\t5821\ndll\t5822\n綦\t5823\n苎\t5824\napplication\t5825\n疃\t5826\nmethyl\t5827\n攫\t5828\nrfid\t5829\n100g\t5830\n287\t5831\n掾\t5832\n1871\t5833\n徭\t5834\n490\t5835\n舀\t5836\n逶\t5837\n嗤\t5838\n760\t5839\n0m\t5840\nge\t5841\n1872\t5842\npeople\t5843\nhr\t5844\n蜷\t5845\n茔\t5846\n512\t5847\n疳\t58
48\n迳\t5849\n罄\t5850\n瓠\t5851\n100mg\t5852\n讪\t5853\npsp\t5854\nav\t5855\n傈\t5856\nppp\t5857\n杲\t5858\n灞\t5859\n氲\t5860\n鬲\t5861\n獠\t5862\n柒\t5863\n骧\t5864\n1848\t5865\naway\t5866\nwilliam\t5867\n326\t5868\n搀\t5869\n珩\t5870\n绦\t5871\n1879\t5872\n嚏\t5873\n710\t5874\n镛\t5875\n喱\t5876\n倏\t5877\n馋\t5878\n茭\t5879\n擘\t5880\n斫\t5881\n284\t5882\n1mg\t5883\n怂\t5884\nhdmi\t5885\n唧\t5886\n犍\t5887\n谩\t5888\n赊\t5889\n317\t5890\n271\t5891\nwu\t5892\n鬻\t5893\n禛\t5894\n15cm\t5895\n259\t5896\n840\t5897\nfeel\t5898\n485\t5899\n圻\t5900\n10m\t5901\n蹶\t5902\n5kg\t5903\n1877\t5904\n1873\t5905\n缄\t5906\n瘿\t5907\n黠\t5908\n甑\t5909\n矸\t5910\n嘀\t5911\nil\t5912\n蹼\t5913\njack\t5914\nlee\t5915\n269\t5916\n叼\t5917\ndi\t5918\n313\t5919\n旻\t5920\nauc\t5921\n502\t5922\n1350\t5923\n鹜\t5924\n289\t5925\nfc\t5926\n稗\t5927\n336\t5928\n999\t5929\nassociation\t5930\nmany\t5931\n293\t5932\n雒\t5933\ngeorge\t5934\ntd\t5935\n赉\t5936\nstyle\t5937\n馔\t5938\n颦\t5939\nul\t5940\nld50\t5941\n1867\t5942\n颔\t5943\n掇\t5944\n1863\t5945\neach\t5946\n赅\t5947\n桎\t5948\ninc\t5949\n痧\t5950\ndv\t5951\n谄\t5952\n孛\t5953\n笆\t5954\n鲶\t5955\n铳\t5956\n3100\t5957\nmc\t5958\ntell\t5959\n4m\t5960\nblue\t5961\n327\t5962\n299\t5963\nbios\t5964\n龋\t5965\n385\t5966\n盱\t5967\n笏\t5968\n2030\t5969\n窕\t5970\n苴\t5971\n314\t5972\nbig\t5973\n1866\t5974\n296\t5975\n萋\t5976\n355\t5977\n辘\t5978\n琬\t5979\ncu\t5980\n梏\t5981\nmuch\t5982\n蚧\t5983\n3400\t5984\n1280\t5985\n镳\t5986\n24h\t5987\nown\t5988\n670\t5989\nstudio\t5990\n瞅\t5991\nkeep\t5992\n6g\t5993\nppt\t5994\nconference\t5995\naround\t5996\ninformation\t5997\n睬\t5998\n1878\t5999\nclass\t6000\n偌\t6001\n鲵\t6002\n惦\t6003\n1830\t6004\n蜍\t6005\nmp4\t6006\nwhy\t6007\n靼\t6008\n1851\t6009\n332\t6010\n阗\t6011\n菟\t6012\n黝\t6013\n1650\t6014\ncontrol\t6015\n挈\t6016\n嵴\t6017\n剡\t6018\n358\t6019\n楸\t6020\ndha\t6021\n氤\t6022\nm1\t6023\nvr\t6024\n呎\t6025\n珲\t6026\n5ml\t6027\n馄\t6028\n滂\t6029\n338\t6030\n蹉\t6031\n蓑\t6032\n锷\t6033\n297\t6034\n279\t6035\n啜\t6036\n1644\t6037\nsm\t6038\n婵\t6039\nwell\t6040\n鬣\t604
1\n7cm\t6042\n钿\t6043\nbbs\t6044\n晌\t6045\n蛆\t6046\n隗\t6047\n酞\t6048\n枞\t6049\n352\t6050\nwork\t6051\nalways\t6052\n9g\t6053\n戬\t6054\n獾\t6055\n镕\t6056\nstar\t6057\neasy\t6058\n饨\t6059\n娣\t6060\n缰\t6061\n邾\t6062\n334\t6063\n8m\t6064\nni\t6065\n鹗\t6066\n277\t6067\n425\t6068\nend\t6069\nhad\t6070\n嗒\t6071\n苋\t6072\n薮\t6073\n棹\t6074\ntype\t6075\nrichard\t6076\n880\t6077\n6m\t6078\n拄\t6079\nair\t6080\n埕\t6081\n勖\t6082\n鹞\t6083\n殚\t6084\n鲢\t6085\npop\t6086\na4\t6087\n1750\t6088\nftp\t6089\n16000\t6090\n啖\t6091\nad\t6092\n沣\t6093\n501\t6094\n靥\t6095\n葭\t6096\n诿\t6097\nhtc\t6098\n鸪\t6099\n007\t6100\n饴\t6101\nt1\t6102\n疖\t6103\n抟\t6104\n睽\t6105\n770\t6106\naccess\t6107\ntcl\t6108\n稞\t6109\n吋\t6110\n谀\t6111\n澍\t6112\n杈\t6113\n妤\t6114\nsata\t6115\npart\t6116\n峄\t6117\nsystems\t6118\n漉\t6119\n40000\t6120\never\t6121\n気\t6122\n368\t6123\n咲\t6124\nqs\t6125\nta\t6126\n璘\t6127\nltd\t6128\nmol\t6129\nmedia\t6130\n萜\t6131\n僭\t6132\n朐\t6133\n742\t6134\n1855\t6135\ncc\t6136\n圜\t6137\n癞\t6138\n藿\t6139\n555\t6140\n珉\t6141\nisp\t6142\nset\t6143\n1450\t6144\n陉\t6145\nhim\t6146\n僮\t6147\n292\t6148\n膻\t6149\n1853\t6150\n薹\t6151\n810\t6152\n汊\t6153\nstill\t6154\n锗\t6155\n昉\t6156\npvp\t6157\n猗\t6158\nhttp\t6159\n1859\t6160\n3700\t6161\nstrong\t6162\n3a\t6163\n锶\t6164\nreal\t6165\n跛\t6166\nart\t6167\n1869\t6168\n331\t6169\n1368\t6170\n嘹\t6171\n337\t6172\n瓤\t6173\n402\t6174\n衄\t6175\n1856\t6176\n1820\t6177\n1150\t6178\nmatlab\t6179\n豕\t6180\n吆\t6181\n腆\t6182\nthomas\t6183\na2\t6184\n294\t6185\nle\t6186\n366\t6187\nusing\t6188\n356\t6189\nbb\t6190\n喆\t6191\nsmith\t6192\ndifferent\t6193\n莴\t6194\n401\t6195\n谌\t6196\nci\t6197\n珙\t6198\n疥\t6199\nkw\t6200\n鲑\t6201\n405\t6202\n玷\t6203\n蛔\t6204\n砀\t6205\n361\t6206\nzh\t6207\nnasa\t6208\nmaterials\t6209\n329\t6210\nnature\t6211\n1h\t6212\n谔\t6213\n睥\t6214\nch\t6215\n20mg\t6216\n2mg\t6217\ndu\t6218\nmail\t6219\ndata\t6220\nevery\t6221\n蹑\t6222\n诒\t6223\n逋\t6224\n372\t6225\nwhile\t6226\n姝\t6227\n刈\t6228\n婧\t6229\ngoing\t6230\n喳\t6231\n镞\t6232\n铌\t6233\n2
91\t6234\n712\t6235\n辎\t6236\n鹧\t6237\n檩\t6238\n740\t6239\n扪\t6240\n10ml\t6241\n霰\t6242\nar\t6243\n裆\t6244\nol\t6245\n嬷\t6246\n0mm\t6247\nufo\t6248\ncharles\t6249\n20mm\t6250\ntvb\t6251\napple\t6252\n刎\t6253\niec\t6254\nproject\t6255\nsbs\t6256\n嵋\t6257\n342\t6258\n690\t6259\n悱\t6260\n920\t6261\n嘤\t6262\njean\t6263\n篁\t6264\n荸\t6265\n瞑\t6266\n殓\t6267\n搽\t6268\n50mg\t6269\n343\t6270\n橇\t6271\ninclude\t6272\neva\t6273\n雎\t6274\n弭\t6275\n獐\t6276\nhaccp\t6277\n恿\t6278\nvideo\t6279\ncf\t6280\nvpn\t6281\nsociety\t6282\n眦\t6283\n730\t6284\n铐\t6285\nsong\t6286\n尕\t6287\n捎\t6288\n诟\t6289\ninstitute\t6290\n痨\t6291\ncn\t6292\n369\t6293\n笞\t6294\n756\t6295\nversion\t6296\ndes\t6297\nsns\t6298\n趺\t6299\n590\t6300\naward\t6301\n唬\t6302\n苣\t6303\ncss\t6304\nlte\t6305\nxu\t6306\nfbi\t6307\n啾\t6308\n瘪\t6309\n垸\t6310\n357\t6311\n橹\t6312\nafter\t6313\n濛\t6314\n曷\t6315\nlevel\t6316\n樾\t6317\nvery\t6318\n汨\t6319\n仟\t6320\n姒\t6321\n1858\t6322\nagain\t6323\n怦\t6324\n荏\t6325\ntom\t6326\n诤\t6327\n苡\t6328\n吭\t6329\n830\t6330\ndm\t6331\nbefore\t6332\n406\t6333\n崆\t6334\n氡\t6335\nyoung\t6336\n脩\t6337\nlan\t6338\n胝\t6339\n钏\t6340\n3ds\t6341\ncr\t6342\narm\t6343\npos\t6344\nnight\t6345\n屐\t6346\n395\t6347\n忐\t6348\n彧\t6349\n拚\t6350\n鏖\t6351\n344\t6352\n100ml\t6353\n525\t6354\n孳\t6355\n1024\t6356\nyu\t6357\n忑\t6358\n384\t6359\n邝\t6360\n穰\t6361\n403\t6362\n摈\t6363\n庖\t6364\n351\t6365\n鸵\t6366\n398\t6367\nhello\t6368\n矽\t6369\n354\t6370\n鲟\t6371\nsaid\t6372\n381\t6373\n768\t6374\n発\t6375\n762\t6376\nsap\t6377\n1854\t6378\nmsn\t6379\n菅\t6380\nbook\t6381\n353\t6382\ntrue\t6383\n339\t6384\njavascript\t6385\n348\t6386\n2900\t6387\n圪\t6388\n蹋\t6389\n衾\t6390\n簋\t6391\n璎\t6392\n367\t6393\n噎\t6394\n911\t6395\n嬗\t6396\n346\t6397\n肼\t6398\n362\t6399\n359\t6400\n跎\t6401\n滟\t6402\nlittle\t6403\n4300\t6404\n701\t6405\n戦\t6406\n嵬\t6407\nlook\t6408\n仝\t6409\nphys\t6410\nclub\t6411\n惇\t6412\n纾\t6413\ntimes\t6414\n14000\t6415\n炁\t6416\n382\t6417\nxyz\t6418\nnumber\t6419\nak\t6420\nmind\t6421\nhuang\t6422\n闳\t6423\n
骐\t6424\n秣\t6425\n眙\t6426\n谘\t6427\n碓\t6428\niso9002\t6429\n疔\t6430\n412\t6431\n恂\t6432\nam\t6433\ntop\t6434\nmaster\t6435\n鳕\t6436\ngreen\t6437\n鸱\t6438\nint\t6439\n爨\t6440\n镊\t6441\n404\t6442\nwere\t6443\n4600\t6444\nem\t6445\nbetter\t6446\n钯\t6447\n圮\t6448\n楽\t6449\n堀\t6450\n1852\t6451\n408\t6452\nsat\t6453\n1857\t6454\n378\t6455\n422\t6456\n膘\t6457\n705\t6458\n噗\t6459\n347\t6460\nstart\t6461\n486\t6462\n锹\t6463\n505\t6464\n杼\t6465\n酊\t6466\nsame\t6467\n376\t6468\nwhite\t6469\n挎\t6470\n箸\t6471\n郗\t6472\n垌\t6473\nsa\t6474\n溏\t6475\nmartin\t6476\n蔫\t6477\n偻\t6478\n364\t6479\n妫\t6480\n飚\t6481\n625\t6482\n601\t6483\n辔\t6484\n濬\t6485\n666\t6486\nds\t6487\n瑄\t6488\n621\t6489\n觚\t6490\n5600\t6491\nnhk\t6492\n415\t6493\nexpress\t6494\n铍\t6495\nbit\t6496\n跚\t6497\n9mm\t6498\n翕\t6499\n煊\t6500\nthese\t6501\n50mm\t6502\ngpu\t6503\nb6\t6504\nhip\t6505\n耄\t6506\n铋\t6507\n篦\t6508\nzhou\t6509\n阇\t6510\n骛\t6511\nnvidia\t6512\n莪\t6513\n吲\t6514\nyoutube\t6515\n唁\t6516\n870\t6517\n箧\t6518\n503\t6519\ntm\t6520\n8500\t6521\nreally\t6522\n珅\t6523\n潋\t6524\n迨\t6525\n哽\t6526\nwithout\t6527\n砦\t6528\nmodel\t6529\n缗\t6530\nhey\t6531\n謇\t6532\n呸\t6533\nmrna\t6534\n垓\t6535\n糍\t6536\npark\t6537\nwap\t6538\n璠\t6539\n妣\t6540\n狎\t6541\n攥\t6542\n396\t6543\n闇\t6544\nyork\t6545\n蛉\t6546\n瑁\t6547\njoe\t6548\n腼\t6549\n蹒\t6550\ngreat\t6551\nreview\t6552\n200mg\t6553\nchris\t6554\nwww\t6555\n嶷\t6556\nonline\t6557\n莠\t6558\n沤\t6559\n哚\t6560\n475\t6561\n遑\t6562\nv1\t6563\nsuch\t6564\n跺\t6565\n膦\t6566\n蹿\t6567\nunix\t6568\nhard\t6569\n40cm\t6570\n50cm\t6571\nnothing\t6572\n郫\t6573\nzhao\t6574\n玳\t6575\nma\t6576\nboy\t6577\n埚\t6578\nurl\t6579\n432\t6580\nnetwork\t6581\naaaa\t6582\n衿\t6583\n371\t6584\ntry\t6585\n醪\t6586\nfull\t6587\n挹\t6588\nraid\t6589\nbg\t6590\n绡\t6591\n汜\t6592\ndigital\t6593\nmb\t6594\nc1\t6595\n坩\t6596\nccc\t6597\n旃\t6598\n5200\t6599\n607\t6600\nitunes\t6601\npowerpoint\t6602\n鸨\t6603\nbetween\t6604\n407\t6605\n翈\t6606\n1842\t6607\n1844\t6608\n435\t6609\n838\t6610\n抡\t6611\nchemistry\t6
612\nteam\t6613\nparty\t6614\ndie\t6615\n晞\t6616\nplace\t6617\ncare\t6618\n盥\t6619\n藁\t6620\n蓖\t6621\n383\t6622\ncv\t6623\n臊\t6624\nmade\t6625\nstate\t6626\n465\t6627\n羰\t6628\n388\t6629\n1620\t6630\nsas\t6631\n楝\t6632\n噱\t6633\nji\t6634\n饽\t6635\n苌\t6636\nsoho\t6637\n褓\t6638\n佶\t6639\nmp\t6640\n581\t6641\nyears\t6642\n1260\t6643\n1680\t6644\nhop\t6645\n稜\t6646\n瞠\t6647\n仡\t6648\n25mm\t6649\n605\t6650\n423\t6651\n341\t6652\n363\t6653\n374\t6654\n627\t6655\ntext\t6656\ndevelopment\t6657\n518\t6658\n伉\t6659\n襁\t6660\nug\t6661\nchange\t6662\n713\t6663\n涞\t6664\n1849\t6665\n蜇\t6666\n抿\t6667\n瑗\t6668\npda\t6669\n418\t6670\nun\t6671\nline\t6672\n958\t6673\n孱\t6674\n懑\t6675\n416\t6676\nvon\t6677\n373\t6678\n淦\t6679\n赝\t6680\ncore\t6681\ndns\t6682\n747\t6683\n427\t6684\n387\t6685\nwould\t6686\nipo\t6687\n醌\t6688\n551\t6689\n缫\t6690\n蠲\t6691\nalt\t6692\n嚓\t6693\n鲷\t6694\n湫\t6695\n捋\t6696\n1845\t6697\n咩\t6698\n裏\t6699\navi\t6700\n犒\t6701\n2050\t6702\n墀\t6703\nyeah\t6704\ngod\t6705\n445\t6706\nlesson\t6707\n硐\t6708\n蔸\t6709\n399\t6710\n758\t6711\npu\t6712\ncomputer\t6713\n456\t6714\n钽\t6715\n1847\t6716\n麂\t6717\nbrown\t6718\nstore\t6719\n蒡\t6720\n鼹\t6721\n绻\t6722\n1821\t6723\n錾\t6724\n仃\t6725\n515\t6726\n篙\t6727\n蕤\t6728\n589\t6729\napplied\t6730\n737\t6731\n930\t6732\nc3\t6733\n1841\t6734\n铤\t6735\nbillboard\t6736\napec\t6737\n槁\t6738\n牖\t6739\n螈\t6740\nmary\t6741\n俦\t6742\nfamily\t6743\n笄\t6744\ncolor\t6745\n啻\t6746\n対\t6747\njsp\t6748\n郤\t6749\nnext\t6750\niq\t6751\n645\t6752\n506\t6753\nhbv\t6754\n闼\t6755\na3\t6756\n349\t6757\nvalue\t6758\n413\t6759\nigg\t6760\n411\t6761\n426\t6762\n醺\t6763\n赍\t6764\n檗\t6765\nusa\t6766\n裾\t6767\nhead\t6768\n噫\t6769\n掸\t6770\nmike\t6771\n箓\t6772\nusb2\t6773\nthings\t6774\n5800\t6775\n5v\t6776\no2\t6777\n妪\t6778\n乂\t6779\n蝈\t6780\n砻\t6781\n胍\t6782\n220v\t6783\n392\t6784\ncba\t6785\n397\t6786\n535\t6787\nidc\t6788\nanalysis\t6789\n25mg\t6790\n蜱\t6791\nti\t6792\n2h\t6793\n聃\t6794\n雠\t6795\n碚\t6796\n椤\t6797\n缯\t6798\n昴\t6799\n890\t6800\n缱\t6801
\n祎\t6802\nder\t6803\n缬\t6804\nex\t6805\n508\t6806\n铙\t6807\ncnc\t6808\npentium\t6809\n孀\t6810\n533\t6811\nadvanced\t6812\nmpa\t6813\nyl\t6814\n笳\t6815\n蘇\t6816\n愆\t6817\n685\t6818\n榉\t6819\nold\t6820\n氙\t6821\ncall\t6822\nalex\t6823\n燹\t6824\n撂\t6825\n菽\t6826\n583\t6827\n箬\t6828\n蛄\t6829\n瘸\t6830\n嬛\t6831\n495\t6832\n橐\t6833\ncould\t6834\n60000\t6835\nsomething\t6836\n纡\t6837\n刽\t6838\n辂\t6839\nhong\t6840\n377\t6841\nlaw\t6842\n蒯\t6843\n邨\t6844\n1846\t6845\n1550\t6846\nr2\t6847\n1837\t6848\n赀\t6849\nplayer\t6850\n414\t6851\n跸\t6852\nphone\t6853\n邙\t6854\nhold\t6855\nrgb\t6856\n421\t6857\nhenry\t6858\n2025\t6859\n黟\t6860\n409\t6861\n磴\t6862\n1815\t6863\nmode\t6864\n1843\t6865\n闿\t6866\n504\t6867\nletters\t6868\n1780\t6869\n428\t6870\n垟\t6871\n389\t6872\nt2\t6873\nlondon\t6874\n528\t6875\njpeg\t6876\n嵯\t6877\n钚\t6878\nsteve\t6879\n跄\t6880\n30min\t6881\n527\t6882\n潸\t6883\nh2\t6884\n35000\t6885\n崴\t6886\neric\t6887\n379\t6888\nrun\t6889\nthree\t6890\nrf\t6891\nleft\t6892\n455\t6893\n恁\t6894\nopen\t6895\n楮\t6896\n556\t6897\nbc\t6898\n476\t6899\n腧\t6900\n458\t6901\nplus\t6902\n1812\t6903\n1839\t6904\n胨\t6905\nb12\t6906\n4d\t6907\n芫\t6908\namerica\t6909\nest\t6910\ndream\t6911\n碴\t6912\n隰\t6913\n杓\t6914\nmd\t6915\nya\t6916\nglobal\t6917\n436\t6918\n15mm\t6919\n2ml\t6920\n貉\t6921\n欹\t6922\nsup3\t6923\n侑\t6924\nea\t6925\n鳜\t6926\n910\t6927\nben\t6928\n铄\t6929\n椴\t6930\n昇\t6931\n醍\t6932\n1020\t6933\n798\t6934\nmidi\t6935\n肓\t6936\nfeatures\t6937\nlc\t6938\nbrian\t6939\nakb48\t6940\n缂\t6941\n1835\t6942\ntest\t6943\n铡\t6944\nlight\t6945\n978\t6946\ns1\t6947\n1799\t6948\nkey\t6949\nsim\t6950\n1795\t6951\nsimple\t6952\nenergy\t6953\n蹠\t6954\n徂\t6955\nwest\t6956\n725\t6957\nbody\t6958\n豢\t6959\n424\t6960\nface\t6961\n蒽\t6962\nlin\t6963\n805\t6964\n1120\t6965\n479\t6966\n菡\t6967\nbill\t6968\n433\t6969\n衲\t6970\n阚\t6971\nbelieve\t6972\nbrt\t6973\npa\t6974\nlast\t6975\n芗\t6976\nhu\t6977\nsam\t6978\nwei\t6979\nadsl\t6980\n602\t6981\nmk\t6982\n痍\t6983\n玠\t6984\n1832\t6985\n523\t6986\
n晷\t6987\n604\t6988\njj\t6989\n468\t6990\n淝\t6991\n1560\t6992\n鄯\t6993\nck\t6994\n473\t6995\n糗\t6996\n耨\t6997\n榧\t6998\n394\t6999\n940\t7000\neq\t7001\n498\t7002\nused\t7003\nsc\t7004\n胴\t7005\nc2\t7006\n蕈\t7007\nscreen\t7008\n镬\t7009\n635\t7010\n鼾\t7011\n431\t7012\neducation\t7013\nwwe\t7014\n摭\t7015\n鸮\t7016\ncl\t7017\n5400\t7018\nfpga\t7019\n恚\t7020\n419\t7021\n実\t7022\nasia\t7023\n534\t7024\n552\t7025\n砝\t7026\n100mm\t7027\npid\t7028\n741\t7029\n珣\t7030\nunder\t7031\n603\t7032\n寤\t7033\n埙\t7034\nmbc\t7035\ntc\t7036\nxxx\t7037\ndidn\t7038\n478\t7039\nmn\t7040\np1\t7041\n锏\t7042\nsimon\t7043\nansi\t7044\n438\t7045\nhi\t7046\n615\t7047\n喟\t7048\n蘅\t7049\n骺\t7050\ncell\t7051\n捭\t7052\nstudy\t7053\n586\t7054\n393\t7055\n莜\t7056\nshould\t7057\nxi\t7058\n缶\t7059\nf2\t7060\ngames\t7061\n0g\t7062\n1760\t7063\nmini\t7064\njohnson\t7065\njones\t7066\nyes\t7067\n锟\t7068\n1825\t7069\n叵\t7070\ncm3\t7071\n炷\t7072\n1580\t7073\nstay\t7074\n675\t7075\nanother\t7076\n6800\t7077\n鲧\t7078\n1736\t7079\nps2\t7080\n胼\t7081\n517\t7082\n査\t7083\n岬\t7084\n2019\t7085\n1640\t7086\nrose\t7087\n鹂\t7088\n牯\t7089\n珥\t7090\nentertainment\t7091\n448\t7092\nund\t7093\n496\t7094\n莼\t7095\nsoftware\t7096\n970\t7097\n邠\t7098\n5300\t7099\nh1n1\t7100\n488\t7101\nda\t7102\n眇\t7103\n卟\t7104\n変\t7105\n20m\t7106\nmay\t7107\n417\t7108\nlady\t7109\ngalaxy\t7110\n4100\t7111\n惴\t7112\n1789\t7113\n846\t7114\n801\t7115\n渑\t7116\n907\t7117\nput\t7118\n蚱\t7119\ngone\t7120\n606\t7121\nt3\t7122\ncompany\t7123\n632\t7124\n454\t7125\n516\t7126\n998\t7127\n548\t7128\n391\t7129\n4700\t7130\n瞌\t7131\nide\t7132\n瘰\t7133\n7200\t7134\n佝\t7135\ntogether\t7136\nstreet\t7137\n旸\t7138\n626\t7139\n衽\t7140\n郅\t7141\n奁\t7142\n731\t7143\n30mg\t7144\nmvp\t7145\n1370\t7146\n60cm\t7147\n12cm\t7148\n魑\t7149\n1828\t7150\n628\t7151\neverything\t7152\n612\t7153\nsan\t7154\n937\t7155\n缛\t7156\n2gb\t7157\nlu\t7158\nangel\t7159\n20ml\t7160\n576\t7161\n颙\t7162\nsony\t7163\n790\t7164\npress\t7165\n镫\t7166\nhall\t7167\n簌\t7168\nbeautiful\t7169\
n豇\t7170\n711\t7171\n453\t7172\npm\t7173\n姹\t7174\nthing\t7175\n442\t7176\n邋\t7177\nalpha\t7178\nleave\t7179\n暝\t7180\n441\t7181\n30mm\t7182\nchapter\t7183\n507\t7184\n100000\t7185\n526\t7186\ndirectx\t7187\n511\t7188\n9cm\t7189\nwords\t7190\n釐\t7191\n619\t7192\n洹\t7193\n444\t7194\nfrank\t7195\n咿\t7196\neyes\t7197\n483\t7198\n俳\t7199\n522\t7200\n蜊\t7201\n醐\t7202\n541\t7203\nwater\t7204\n499\t7205\n聩\t7206\nnon\t7207\nbob\t7208\n坻\t7209\n532\t7210\n757\t7211\n545\t7212\n毽\t7213\noo\t7214\n喾\t7215\nalone\t7216\nscott\t7217\n744\t7218\n辋\t7219\nriver\t7220\nzhu\t7221\n倌\t7222\n媪\t7223\n蛳\t7224\n滹\t7225\n哙\t7226\nnc\t7227\n20g\t7228\n阊\t7229\ngs\t7230\nqueen\t7231\n趸\t7232\n1130\t7233\n1645\t7234\n祢\t7235\n4mg\t7236\n1814\t7237\ngirls\t7238\n544\t7239\ne1\t7240\n籀\t7241\n1210\t7242\n1573\t7243\n徼\t7244\nipv6\t7245\n訾\t7246\n髁\t7247\n1a\t7248\njackson\t7249\n砜\t7250\n1836\t7251\nles\t7252\n4gb\t7253\n撸\t7254\n瓘\t7255\n1790\t7256\n缁\t7257\n镓\t7258\nsars\t7259\neps\t7260\n519\t7261\nsod\t7262\nbp\t7263\n1810\t7264\nyear\t7265\n縻\t7266\nsound\t7267\n617\t7268\n菀\t7269\n1125\t7270\n598\t7271\n酢\t7272\n桠\t7273\n466\t7274\nemc\t7275\n撵\t7276\n怏\t7277\n429\t7278\n1838\t7279\nready\t7280\n渌\t7281\n546\t7282\ntaylor\t7283\n452\t7284\nnews\t7285\n1180\t7286\n568\t7287\n2a\t7288\naf\t7289\n538\t7290\nlist\t7291\nhot\t7292\n1380\t7293\netc\t7294\n1796\t7295\n摞\t7296\nmo\t7297\n槲\t7298\nlevels\t7299\nht\t7300\n浠\t7301\n诜\t7302\n魉\t7303\n韫\t7304\ndaniel\t7305\n亓\t7306\n盤\t7307\npv\t7308\n瑭\t7309\n魍\t7310\n1831\t7311\nemi\t7312\n襞\t7313\nsocial\t7314\ndreamweaver\t7315\n爿\t7316\nkbs\t7317\n565\t7318\n613\t7319\n990\t7320\n浃\t7321\n樯\t7322\njb\t7323\n讵\t7324\n揩\t7325\nphysics\t7326\n耋\t7327\n帏\t7328\nlng\t7329\n崃\t7330\nbs\t7331\n457\t7332\nenough\t7333\nshy\t7334\n521\t7335\n596\t7336\nec\t7337\n451\t7338\n鸩\t7339\n遢\t7340\nturn\t7341\n臃\t7342\navailable\t7343\n4400\t7344\n585\t7345\n粿\t7346\n1010\t7347\n禳\t7348\nhand\t7349\n439\t7350\n536\t7351\n桫\t7352\nlink\t7353\nside\t7354\nearth\
t7355\nmx\t7356\n髹\t7357\n7m\t7358\n482\t7359\n诳\t7360\n472\t7361\n1140\t7362\n707\t7363\n622\t7364\nwcdma\t7365\n513\t7366\nmust\t7367\n492\t7368\n462\t7369\n踉\t7370\n40mg\t7371\n948\t7372\ncmax\t7373\n郃\t7374\n1320\t7375\nv2\t7376\n542\t7377\nemail\t7378\n493\t7379\n嗖\t7380\nsup\t7381\n讧\t7382\ncnn\t7383\n446\t7384\n碁\t7385\n17000\t7386\n湎\t7387\n30m\t7388\n529\t7389\n653\t7390\n531\t7391\n575\t7392\n阏\t7393\nsr\t7394\nunited\t7395\npm2\t7396\nmt\t7397\n媾\t7398\n443\t7399\n様\t7400\naac\t7401\n806\t7402\n哔\t7403\n舸\t7404\nvb\t7405\n611\t7406\n曩\t7407\n821\t7408\ngre\t7409\ngl\t7410\ncisco\t7411\n忝\t7412\n峁\t7413\n掂\t7414\n464\t7415\n葳\t7416\n487\t7417\n437\t7418\nincluding\t7419\n715\t7420\n鄄\t7421\n558\t7422\nboth\t7423\n谵\t7424\n463\t7425\njim\t7426\n608\t7427\nm4\t7428\n5100\t7429\n彊\t7430\n锴\t7431\nwar\t7432\n郜\t7433\nmoney\t7434\n481\t7435\n葖\t7436\n1824\t7437\ntnt\t7438\n蓇\t7439\n瓴\t7440\n鳟\t7441\n橼\t7442\n5s\t7443\nlouis\t7444\n434\t7445\n鲇\t7446\n邗\t7447\nel\t7448\n犄\t7449\n秭\t7450\n3900\t7451\nrecords\t7452\nview\t7453\nchemical\t7454\n1001\t7455\n1mol\t7456\ndance\t7457\n668\t7458\ndl\t7459\n槭\t7460\n缵\t7461\nque\t7462\n624\t7463\nrt\t7464\n1823\t7465\n1805\t7466\n005\t7467\n1826\t7468\n巯\t7469\nsgs\t7470\nuser\t7471\n龊\t7472\nqc\t7473\n狍\t7474\nisland\t7475\nlanguage\t7476\nspace\t7477\n擞\t7478\nsaint\t7479\n2n\t7480\npt\t7481\nshare\t7482\n瞽\t7483\nhotel\t7484\nchristian\t7485\n557\t7486\n栲\t7487\n撅\t7488\n2b\t7489\n1801\t7490\n447\t7491\n1822\t7492\n瑀\t7493\nsmt\t7494\nhk\t7495\n1834\t7496\n戢\t7497\n825\t7498\n50ml\t7499\n朓\t7500\n逖\t7501\ngeneral\t7502\n椹\t7503\nnm\t7504\n洺\t7505\ncae\t7506\n484\t7507\n艏\t7508\nwma\t7509\nzn\t7510\n苁\t7511\nsingle\t7512\n599\t7513\nc4\t7514\n滘\t7515\n777\t7516\n铧\t7517\n侪\t7518\nocirc\t7519\n1kg\t7520\n684\t7521\n豳\t7522\nskf\t7523\n12mm\t7524\n489\t7525\nhla\t7526\n竦\t7527\n貔\t7528\nld\t7529\nbeing\t7530\n562\t7531\n圄\t7532\nvan\t7533\ngm\t7534\n688\t7535\n655\t7536\nspecial\t7537\n呷\t7538\nedition\t7539\n1s\t7540\nj
iang\t7541\n131108\t7542\n514\t7543\n1792\t7544\nncaa\t7545\n1833\t7546\n旄\t7547\n遛\t7548\njr\t7549\nprogram\t7550\n656\t7551\n467\t7552\ning\t7553\n901\t7554\n755\t7555\n509\t7556\n芈\t7557\nkong\t7558\nrp\t7559\n砣\t7560\n桷\t7561\naudio\t7562\nicp\t7563\nhappy\t7564\n龌\t7565\ndone\t7566\n疬\t7567\njapan\t7568\nts\t7569\nmit\t7570\np2\t7571\n524\t7572\nlooking\t7573\nmiss\t7574\n缟\t7575\n582\t7576\n洌\t7577\n35mm\t7578\n494\t7579\ngrand\t7580\n跏\t7581\nthose\t7582\njoseph\t7583\nctrl\t7584\n547\t7585\n1040\t7586\n686\t7587\n蝮\t7588\nlp\t7589\ncod\t7590\n菰\t7591\nsio2\t7592\ntxt\t7593\n1770\t7594\n1060\t7595\n帑\t7596\n767\t7597\nnorth\t7598\nfcc\t7599\n怙\t7600\nester\t7601\n718\t7602\nstory\t7603\nedi\t7604\n634\t7605\n1360\t7606\n豸\t7607\n1660\t7608\nlh\t7609\n雩\t7610\n1230\t7611\nmagic\t7612\n誊\t7613\n549\t7614\n臬\t7615\n4k\t7616\nop\t7617\n1662\t7618\n651\t7619\n镣\t7620\n箇\t7621\n616\t7622\ntitle\t7623\nsciences\t7624\n25cm\t7625\n踱\t7626\ns2\t7627\nt4\t7628\n钍\t7629\n648\t7630\n100m\t7631\n543\t7632\n588\t7633\n苫\t7634\n554\t7635\n蝽\t7636\nr1\t7637\n3mg\t7638\namino\t7639\n1776\t7640\n浯\t7641\n609\t7642\n772\t7643\nca2\t7644\nvlan\t7645\n469\t7646\n500mg\t7647\n単\t7648\nroad\t7649\n亶\t7650\n636\t7651\nmetal\t7652\ndevice\t7653\n40mm\t7654\n囹\t7655\n穑\t7656\n1730\t7657\n佻\t7658\n1818\t7659\n绌\t7660\n12g\t7661\n537\t7662\n诔\t7663\npve\t7664\nautodesk\t7665\n477\t7666\nv8\t7667\nray\t7668\ngp\t7669\nspan\t7670\ngc\t7671\nsize\t7672\n716\t7673\n鹬\t7674\nssl\t7675\ncrt\t7676\n1670\t7677\n925\t7678\n髌\t7679\npn\t7680\n1127\t7681\n702\t7682\n658\t7683\nservices\t7684\nsupport\t7685\n1802\t7686\n蒌\t7687\ncoming\t7688\nexperience\t7689\nnbc\t7690\n鳏\t7691\n631\t7692\n638\t7693\nace\t7694\n0cm\t7695\nems\t7696\n9001\t7697\n殄\t7698\nyen\t7699\nsoc\t7700\nethyl\t7701\n怛\t7702\ntf\t7703\n筌\t7704\n刳\t7705\nstudies\t7706\ntheory\t7707\n1030\t7708\n578\t7709\nradio\t7710\n翮\t7711\n卍\t7712\n畹\t7713\n471\t7714\n704\t7715\nbecause\t7716\n1610\t7717\n箜\t7718\nsave\t7719\n燔\t7720\n赳\t77
21\n553\t7722\n1809\t7723\n篌\t7724\n窨\t7725\n翥\t7726\n785\t7727\n炅\t7728\n钕\t7729\nlett\t7730\n803\t7731\n1827\t7732\nacademy\t7733\ned\t7734\n629\t7735\nsf\t7736\npr\t7737\nhill\t7738\nexplorer\t7739\nfuture\t7740\nfood\t7741\n莳\t7742\n662\t7743\n567\t7744\ndcs\t7745\n忖\t7746\n戡\t7747\n1086\t7748\n1190\t7749\n1829\t7750\nbad\t7751\nes\t7752\n15m\t7753\norder\t7754\nspring\t7755\n沢\t7756\nsouth\t7757\n497\t7758\n025\t7759\nmove\t7760\n狒\t7761\n1630\t7762\n圉\t7763\nabb\t7764\n449\t7765\nlearn\t7766\nl0\t7767\nd2\t7768\n5d\t7769\nwav\t7770\n琯\t7771\n邰\t7772\ncis\t7773\nquality\t7774\nodm\t7775\n926\t7776\nacta\t7777\nroot\t7778\nsmart\t7779\n1661\t7780\n苾\t7781\ncm2\t7782\nphotos\t7783\nl2\t7784\nvia\t7785\nsk\t7786\n犸\t7787\n623\t7788\n邡\t7789\nfeeling\t7790\n572\t7791\n郏\t7792\n襦\t7793\npython\t7794\nbmw\t7795\n888\t7796\nguo\t7797\nepa\t7798\nwilliams\t7799\n沆\t7800\n813\t7801\nbot\t7802\nread\t7803\nfunction\t7804\nwilson\t7805\n1723\t7806\nenterprise\t7807\n玟\t7808\n50hz\t7809\ns26\t7810\nfire\t7811\nengineer\t7812\ntony\t7813\n1819\t7814\n濉\t7815\nrh\t7816\n洎\t7817\n莨\t7818\n氘\t7819\npb\t7820\n咛\t7821\n1720\t7822\n佺\t7823\n1460\t7824\n815\t7825\ncbs\t7826\n腩\t7827\nbeta\t7828\n鳔\t7829\n1735\t7830\nyan\t7831\n1gb\t7832\nx2\t7833\n剜\t7834\n秕\t7835\n牝\t7836\n芨\t7837\ndin\t7838\n関\t7839\ndel\t7840\nsms\t7841\n649\t7842\npal\t7843\n1369\t7844\nfar\t7845\nmaya\t7846\n654\t7847\n拊\t7848\n812\t7849\n595\t7850\n竑\t7851\n50m\t7852\n圹\t7853\nclose\t7854\neos\t7855\n颡\t7856\n1420\t7857\n6300\t7858\n1816\t7859\nwrong\t7860\nbreak\t7861\n573\t7862\n765\t7863\nfile\t7864\nfriend\t7865\n002\t7866\n摺\t7867\n683\t7868\nnx\t7869\n沩\t7870\n蜉\t7871\nplease\t7872\n1170\t7873\nro\t7874\n6400\t7875\n筚\t7876\nnick\t7877\nacm\t7878\n愔\t7879\nati\t7880\npoint\t7881\n肟\t7882\n766\t7883\n俶\t7884\nfast\t7885\nata\t7886\nd1\t7887\n678\t7888\ngeforce\t7889\n1710\t7890\nyahoo\t7891\n堃\t7892\n绉\t7893\nmysql\t7894\n1793\t7895\n奭\t7896\ngap\t7897\niso14000\t7898\nuk\t7899\nastm\t7900\nh2o\t7901\nn
2\t7902\nfilm\t7903\nmethod\t7904\n1804\t7905\n罅\t7906\nso2\t7907\n嗳\t7908\n665\t7909\nadam\t7910\nuc\t7911\n蜢\t7912\n1806\t7913\n1775\t7914\nphoto\t7915\n疠\t7916\n474\t7917\nimage\t7918\n200mm\t7919\nsure\t7920\n561\t7921\n帔\t7922\n髡\t7923\n643\t7924\n黥\t7925\n1813\t7926\nproceedings\t7927\n褛\t7928\n柰\t7929\nbeyond\t7930\nroyal\t7931\nelse\t7932\neda\t7933\n808\t7934\nddr\t7935\ngif\t7936\n鏊\t7937\nl1\t7938\n痼\t7939\n571\t7940\nwaiting\t7941\n堞\t7942\ncode\t7943\n652\t7944\nrss\t7945\nlearning\t7946\n嗝\t7947\n461\t7948\nbeijing\t7949\n娉\t7950\n566\t7951\n577\t7952\n708\t7953\n1520\t7954\n689\t7955\nkevin\t7956\nhuman\t7957\n661\t7958\n539\t7959\n875\t7960\n1811\t7961\nssci\t7962\n6600\t7963\n戕\t7964\n587\t7965\n735\t7966\n3s\t7967\n铱\t7968\n耜\t7969\n觥\t7970\n867\t7971\n镒\t7972\n584\t7973\n呓\t7974\n1522\t7975\n904\t7976\ncase\t7977\n1101\t7978\n491\t7979\n1080p\t7980\nhistory\t7981\n蒹\t7982\n栱\t7983\nim\t7984\n564\t7985\nf4\t7986\n卮\t7987\n琚\t7988\nsalt\t7989\njason\t7990\nrohs\t7991\n12v\t7992\nhydroxy\t7993\n逦\t7994\nmodem\t7995\nfont\t7996\n酩\t7997\n蓍\t7998\ncry\t7999\n65536\t8000\nhealth\t8001\n虺\t8002\n1798\t8003\ntonight\t8004\nsmall\t8005\n谠\t8006\n1570\t8007\n1220\t8008\njane\t8009\nagainst\t8010\n597\t8011\n751\t8012\n459\t8013\nbd\t8014\n鼋\t8015\n焗\t8016\nudp\t8017\nprocess\t8018\n1070\t8019\n1807\t8020\nchildren\t8021\n8g\t8022\neb\t8023\n62mm\t8024\n22000\t8025\nadd\t8026\n1440\t8027\n褴\t8028\nrm\t8029\n25g\t8030\nccedil\t8031\n706\t8032\n714\t8033\n5l\t8034\n砒\t8035\n赧\t8036\n蛏\t8037\n709\t8038\n蚬\t8039\n1530\t8040\n瘕\t8041\n5h\t8042\n559\t8043\njay\t8044\niga\t8045\n020\t8046\nfall\t8047\nscsi\t8048\n顗\t8049\nisdn\t8050\ndeath\t8051\n563\t8052\ntoday\t8053\n愠\t8054\ndvi\t8055\n勣\t8056\nwait\t8057\n1642\t8058\n飕\t8059\n徳\t8060\n滢\t8061\n琇\t8062\n鳙\t8063\ndb\t8064\n瞟\t8065\n尻\t8066\nforce\t8067\n400mg\t8068\n澶\t8069\n荽\t8070\n舐\t8071\narts\t8072\nha\t8073\neast\t8074\nlost\t8075\neffects\t8076\n1628\t8077\nalbum\t8078\nharry\t8079\n633\t8080\ndark\t8081\
npublic\t8082\n2250\t8083\nsoul\t8084\n826\t8085\n659\t8086\nexo\t8087\n侂\t8088\n733\t8089\nse\t8090\n黼\t8091\nicu\t8092\n4h\t8093\nmarket\t8094\n潟\t8095\n7800\t8096\n绂\t8097\n瘗\t8098\nngc\t8099\n1794\t8100\ncrazy\t8101\n蓥\t8102\n竽\t8103\n濞\t8104\nigm\t8105\nscdma\t8106\n6200\t8107\ncb\t8108\n835\t8109\n699\t8110\n骖\t8111\n偁\t8112\nbmp\t8113\n809\t8114\n1270\t8115\noled\t8116\n応\t8117\n1160\t8118\n1621\t8119\n锜\t8120\ng3\t8121\nova\t8122\ncheng\t8123\n614\t8124\n匏\t8125\nthinkpad\t8126\n赑\t8127\nfps\t8128\ncreate\t8129\nkim\t8130\n讦\t8131\n1480\t8132\n诨\t8133\n1540\t8134\nrev\t8135\n1v1\t8136\n罘\t8137\nfans\t8138\n巖\t8139\n1740\t8140\nag\t8141\n嫘\t8142\n1649\t8143\nps3\t8144\n908\t8145\n颀\t8146\ng1\t8147\n703\t8148\n岿\t8149\nv3\t8150\n虻\t8151\n936\t8152\nfl\t8153\nc2c\t8154\n罴\t8155\nenvironmental\t8156\nparis\t8157\n594\t8158\nhear\t8159\n囗\t8160\njump\t8161\ncommunications\t8162\n溆\t8163\ntalk\t8164\n噤\t8165\n824\t8166\n骝\t8167\n003\t8168\n咂\t8169\n695\t8170\n728\t8171\ne2\t8172\nnec\t8173\niptv\t8174\n1797\t8175\nkelly\t8176\n500ml\t8177\n锛\t8178\n721\t8179\nrc\t8180\n1808\t8181\nldl\t8182\n1240\t8183\n槊\t8184\nradeon\t8185\n676\t8186\n啕\t8187\ntang\t8188\nplant\t8189\n50g\t8190\n驽\t8191\nprofessional\t8192\n凇\t8193\n698\t8194\ns36\t8195\nlord\t8196\nsearch\t8197\nalan\t8198\n籴\t8199\npd\t8200\n1403\t8201\n硖\t8202\n1791\t8203\n816\t8204\n1636\t8205\n3h\t8206\ngsp\t8207\n811\t8208\nsky\t8209\n1632\t8210\n铯\t8211\nchristmas\t8212\n怿\t8213\n笥\t8214\nmatter\t8215\n574\t8216\n噙\t8217\n倨\t8218\neffect\t8219\n647\t8220\n779\t8221\n1803\t8222\n657\t8223\nsorry\t8224\nawards\t8225\nigbt\t8226\npwm\t8227\n坭\t8228\n醅\t8229\nsos\t8230\n976\t8231\n592\t8232\n滏\t8233\n10min\t8234\n682\t8235\ncs3\t8236\n悻\t8237\ndid\t8238\nmater\t8239\n579\t8240\n聒\t8241\n1724\t8242\nfeng\t8243\nlow\t8244\nmhz\t8245\n836\t8246\n722\t8247\n枥\t8248\n726\t8249\n昺\t8250\nbank\t8251\nmemory\t8252\nrap\t8253\n975\t8254\n663\t8255\nips\t8256\n酆\t8257\n2kg\t8258\n787\t8259\n簟\t8260\n睇\t8261\n轫\t8262\n溱
\t8263\n骢\t8264\n榘\t8265\n642\t8266\n珺\t8267\n跹\t8268\n677\t8269\nseries\t8270\nnlp\t8271\nraquo\t8272\n蚶\t8273\nstone\t8274\n1672\t8275\n1817\t8276\n1646\t8277\n827\t8278\n驺\t8279\nko\t8280\nsecurity\t8281\nperfect\t8282\nalexander\t8283\n746\t8284\ntt\t8285\ncheck\t8286\n804\t8287\n饧\t8288\n15mg\t8289\nsir\t8290\nmoon\t8291\ndoesn\t8292\n591\t8293\ninside\t8294\ntim\t8295\n672\t8296\n641\t8297\n噼\t8298\n儆\t8299\n1w\t8300\n氚\t8301\n646\t8302\n哧\t8303\n1783\t8304\n旒\t8305\n鸬\t8306\n1648\t8307\n夥\t8308\nev\t8309\n1688\t8310\nscore\t8311\nstandard\t8312\n玦\t8313\n723\t8314\n貅\t8315\n揄\t8316\n戗\t8317\nfx\t8318\n938\t8319\n璩\t8320\nfu\t8321\n1654\t8322\n剐\t8323\n010\t8324\ncpi\t8325\n垴\t8326\n蘼\t8327\nhz\t8328\n1521\t8329\n1067\t8330\n727\t8331\nah\t8332\nlv\t8333\n916\t8334\n裒\t8335\n639\t8336\nhan\t8337\n躅\t8338\n1715\t8339\n唳\t8340\nform\t8341\nsecond\t8342\n嗑\t8343\n荦\t8344\n674\t8345\n霈\t8346\njin\t8347\n缦\t8348\n啭\t8349\npi\t8350\n1788\t8351\nrx\t8352\n隈\t8353\ngao\t8354\nsdk\t8355\nzheng\t8356\n悫\t8357\n745\t8358\nhref\t8359\n593\t8360\nngo\t8361\nmulti\t8362\nd3\t8363\n彀\t8364\n637\t8365\n1276\t8366\n悭\t8367\nfound\t8368\njis\t8369\n5700\t8370\n焓\t8371\n1234\t8372\n80cm\t8373\n磔\t8374\naim\t8375\n1778\t8376\n蓊\t8377\nact\t8378\n569\t8379\nxiao\t8380\n郾\t8381\n717\t8382\n786\t8383\nreturn\t8384\n5min\t8385\n1582\t8386\netf\t8387\n1590\t8388\naction\t8389\n1625\t8390\nsarah\t8391\nyourself\t8392\n枧\t8393\n鹚\t8394\n10kg\t8395\n80000\t8396\n検\t8397\n775\t8398\n818\t8399\nstephen\t8400\ngui\t8401\n屃\t8402\n644\t8403\n9500\t8404\nv6\t8405\n馑\t8406\nwlan\t8407\nhs\t8408\n2048\t8409\narea\t8410\n1616\t8411\nandrew\t8412\n8226\t8413\n6mg\t8414\n1567\t8415\n1763\t8416\n1470\t8417\n嗲\t8418\npps\t8419\n铟\t8420\nrca\t8421\npierre\t8422\n687\t8423\nnull\t8424\nmanager\t8425\n738\t8426\nsdh\t8427\n828\t8428\n薤\t8429\n60g\t8430\n300mg\t8431\njun\t8432\n1685\t8433\nfavorite\t8434\nmaking\t8435\nplaying\t8436\nsummer\t8437\n754\t8438\n692\t8439\n涔\t8440\n樗\t8441\n664\t8442\n忾\t84
43\n収\t8444\n绺\t8445\n945\t8446\nh2s\t8447\nbis\t8448\nself\t8449\n300mm\t8450\n烊\t8451\nopengl\t8452\n912\t8453\nacute\t8454\n螫\t8455\n黩\t8456\n996\t8457\nmagazine\t8458\nedward\t8459\nsu\t8460\nelisa\t8461\nhdl\t8462\ncyp3a4\t8463\n鞫\t8464\nfoundation\t8465\nalice\t8466\nddr3\t8467\n915\t8468\n923\t8469\ntbs\t8470\nandy\t8471\nfield\t8472\ndate\t8473\ntransactions\t8474\nlimited\t8475\nduring\t8476\n1126\t8477\n鲠\t8478\n1057\t8479\nfan\t8480\n嘭\t8481\n缣\t8482\n845\t8483\n681\t8484\nrw\t8485\nmean\t8486\n1566\t8487\nbecome\t8488\neconomic\t8489\n852\t8490\njohnny\t8491\n蒺\t8492\nunique\t8493\n黒\t8494\ntu\t8495\nboys\t8496\n1330\t8497\n885\t8498\ngetting\t8499\ncj\t8500\n1072\t8501\nnh\t8502\nne\t8503\nband\t8504\ncool\t8505\n724\t8506\n771\t8507\n骘\t8508\n氖\t8509\ncontent\t8510\n842\t8511\n镝\t8512\n俅\t8513\n谮\t8514\nte\t8515\n9600\t8516\ndrive\t8517\nphenyl\t8518\n1275\t8519\n屦\t8520\ncao\t8521\nmenu\t8522\n823\t8523\n摁\t8524\n氪\t8525\n蘧\t8526\nactive\t8527\nsb\t8528\nappl\t8529\n988\t8530\n1622\t8531\n伝\t8532\n1725\t8533\nzero\t8534\n1008\t8535\n3kg\t8536\n腠\t8537\n叡\t8538\nhit\t8539\n鲂\t8540\nmi\t8541\n0kg\t8542\n748\t8543\nlite\t8544\nenjoy\t8545\nlocal\t8546\n789\t8547\n続\t8548\n1506\t8549\nseen\t8550\ns3\t8551\n1765\t8552\neuropean\t8553\n讣\t8554\ngold\t8555\n1279\t8556\n736\t8557\n965\t8558\npl\t8559\nbutton\t8560\n耷\t8561\n1430\t8562\n986\t8563\n763\t8564\ntoefl\t8565\n燊\t8566\n鸷\t8567\njimmy\t8568\ndota\t8569\n955\t8570\n861\t8571\n猊\t8572\n732\t8573\nxbox\t8574\ndays\t8575\ndan\t8576\n673\t8577\n833\t8578\n囡\t8579\n崤\t8580\n4c\t8581\neconomics\t8582\n23000\t8583\nagent\t8584\nhtml5\t8585\npoints\t8586\nryan\t8587\nshi\t8588\n砬\t8589\n湜\t8590\nreading\t8591\n918\t8592\nmine\t8593\nadc\t8594\n917\t8595\n1592\t8596\n1781\t8597\n翚\t8598\n峯\t8599\n909\t8600\nonce\t8601\nexchange\t8602\nchoose\t8603\ncurrent\t8604\nsymbian\t8605\nts16949\t8606\ndave\t8607\nmachine\t8608\n鲎\t8609\nqos\t8610\n蕖\t8611\n1785\t8612\n9m\t8613\ncia\t8614\nuntil\t8615\ncs4\t8616\n759\t8
617\nf3\t8618\n903\t8619\n24000\t8620\n968\t8621\n8mg\t8622\nlewis\t8623\n鹈\t8624\n凼\t8625\nsnh48\t8626\n866\t8627\n泫\t8628\n荑\t8629\n黻\t8630\n牂\t8631\n1722\t8632\n鄣\t8633\n篑\t8634\nho\t8635\n1110\t8636\n1784\t8637\n髭\t8638\n陬\t8639\n寔\t8640\ndt\t8641\nshanghai\t8642\n疴\t8643\n邽\t8644\n987\t8645\n45000\t8646\n1042\t8647\n喏\t8648\n彖\t8649\nsl\t8650\nsaas\t8651\n814\t8652\n28000\t8653\na5\t8654\n彘\t8655\n赟\t8656\n819\t8657\nfoxpro\t8658\nshit\t8659\n822\t8660\n盹\t8661\n诮\t8662\n鸫\t8663\nper\t8664\ndoes\t8665\n150mm\t8666\nproducts\t8667\ncamp\t8668\nselect\t8669\ncapital\t8670\n茕\t8671\ncorporation\t8672\n26000\t8673\n铖\t8674\n954\t8675\ndd\t8676\n闩\t8677\nstring\t8678\npage\t8679\nba\t8680\n671\t8681\n読\t8682\n782\t8683\n鄜\t8684\n漈\t8685\n盍\t8686\ndlp\t8687\n729\t8688\n甭\t8689\n愎\t8690\noutlook\t8691\nwii\t8692\nue\t8693\n1787\t8694\nfestival\t8695\ncommunication\t8696\nchannel\t8697\ngary\t8698\n1755\t8699\n1774\t8700\n8600\t8701\ncopy\t8702\n150mg\t8703\n魃\t8704\ndragon\t8705\n1056\t8706\nc5\t8707\n炆\t8708\ntrack\t8709\nhdpe\t8710\nliang\t8711\n鍊\t8712\n1800mhz\t8713\n1619\t8714\n蛐\t8715\n995\t8716\n21000\t8717\n薜\t8718\nwin\t8719\n1394\t8720\n1786\t8721\nrain\t8722\n楯\t8723\ntable\t8724\n鲀\t8725\n逡\t8726\nitu\t8727\napplications\t8728\nmmorpg\t8729\n嘞\t8730\ns7\t8731\n696\t8732\n侔\t8733\n1069\t8734\n觇\t8735\nlbs\t8736\n0mg\t8737\ncar\t8738\nwave\t8739\n糸\t8740\n踮\t8741\n狷\t8742\n1552\t8743\n1627\t8744\nlatest\t8745\nstep\t8746\n886\t8747\n761\t8748\n菘\t8749\n783\t8750\n寳\t8751\nesp\t8752\n扃\t8753\n865\t8754\njazz\t8755\nk1\t8756\nfine\t8757\nchild\t8758\nkind\t8759\nanna\t8760\n60mg\t8761\n997\t8762\nmaria\t8763\nnk\t8764\n792\t8765\nraw\t8766\nlate\t8767\nsoa\t8768\n905\t8769\ncai\t8770\nttl\t8771\ndelphi\t8772\nprince\t8773\n1340\t8774\n禊\t8775\nsynthesis\t8776\n喑\t8777\nrmb\t8778\nmiller\t8779\npatrick\t8780\n933\t8781\nrunning\t8782\n50kg\t8783\n1398\t8784\nast\t8785\n752\t8786\nlocation\t8787\ndead\t8788\n塍\t8789\nchateau\t8790\nallows\t8791\nforget\t8792\ntg
\t8793\n921\t8794\n栝\t8795\n5w\t8796\nkiss\t8797\n1690\t8798\n691\t8799\narthur\t8800\n瓿\t8801\nindex\t8802\ncsa\t8803\nrmvb\t8804\nmsc\t8805\n廨\t8806\ncas\t8807\nknown\t8808\nh1\t8809\ntj\t8810\nj2ee\t8811\nasian\t8812\n841\t8813\n1227\t8814\ng20\t8815\ncross\t8816\ncos\t8817\nntilde\t8818\n719\t8819\n貘\t8820\ndnf\t8821\ncalifornia\t8822\nfrance\t8823\nmodern\t8824\npacific\t8825\n769\t8826\n1066\t8827\nturbo\t8828\n753\t8829\n795\t8830\n669\t8831\n1764\t8832\n868\t8833\n馕\t8834\n僰\t8835\nunion\t8836\n1772\t8837\n2150\t8838\n1063\t8839\n哏\t8840\ndouble\t8841\nfight\t8842\n858\t8843\nmath\t8844\nbo\t8845\n瑷\t8846\nmen\t8847\nsea\t8848\n6700\t8849\nsem\t8850\n697\t8851\n疎\t8852\n882\t8853\nnote\t8854\nqi\t8855\numl\t8856\n902\t8857\n1637\t8858\ntp\t8859\n1290\t8860\n1085\t8861\n776\t8862\n蝣\t8863\n怵\t8864\n阃\t8865\ndps\t8866\n1687\t8867\n弢\t8868\n镲\t8869\nhcl\t8870\nal2o3\t8871\njs\t8872\nauto\t8873\n螅\t8874\n1683\t8875\nv5\t8876\nculture\t8877\n935\t8878\n吖\t8879\nedge\t8880\n碲\t8881\nvoice\t8882\n1007\t8883\nbridge\t8884\n855\t8885\n008\t8886\n夼\t8887\n茌\t8888\nbattle\t8889\n嗬\t8890\n靺\t8891\ndp\t8892\nae\t8893\n1090\t8894\n895\t8895\n1012\t8896\n1162\t8897\nbi\t8898\n778\t8899\n髀\t8900\n1575\t8901\npcm\t8902\n15min\t8903\n1598\t8904\n铊\t8905\nsecret\t8906\n739\t8907\n200m\t8908\n6h\t8909\nmatt\t8910\n谡\t8911\ncard\t8912\nmic\t8913\n癔\t8914\necu\t8915\n16mm\t8916\n984\t8917\n镠\t8918\n5km\t8919\ndhcp\t8920\n1753\t8921\n巻\t8922\n秾\t8923\nliving\t8924\ngn\t8925\n1643\t8926\nframework\t8927\n菪\t8928\n679\t8929\n赜\t8930\n1782\t8931\nfour\t8932\n铈\t8933\n1777\t8934\nbritish\t8935\nshell\t8936\nsanta\t8937\nyuan\t8938\n20ma\t8939\nfly\t8940\n927\t8941\nqu\t8942\nnds\t8943\nqaq\t8944\nbar\t8945\n髙\t8946\narp\t8947\n1667\t8948\n1773\t8949\n693\t8950\nmain\t8951\n鲳\t8952\n1510\t8953\n1002\t8954\n2022\t8955\ncdna\t8956\nbox\t8957\n珰\t8958\n100km\t8959\n004\t8960\n畋\t8961\nbring\t8962\n泅\t8963\n959\t8964\nhpv\t8965\nmakes\t8966\ncmv\t8967\n鲅\t8968\ntmd\t8969\n1762\t8970\n854\t
8971\n泚\t8972\nghost\t8973\nshort\t8974\nmcu\t8975\n1768\t8976\ncat\t8977\n963\t8978\n1757\t8979\n1206\t8980\n1207\t8981\npuzzle\t8982\n793\t8983\ncentral\t8984\n859\t8985\n飏\t8986\nwalter\t8987\n60hz\t8988\nanderson\t8989\n1727\t8990\nthought\t8991\n屍\t8992\n仨\t8993\n864\t8994\nmolecular\t8995\n856\t8996\ndong\t8997\nfinancial\t8998\n1728\t8999\nsurface\t9000\ng2\t9001\nmf\t9002\n葚\t9003\n叻\t9004\nsolidworks\t9005\nres\t9006\nspeed\t9007\n1195\t9008\n咻\t9009\nascii\t9010\n1404\t9011\n784\t9012\njeff\t9013\n衩\t9014\n1371\t9015\nland\t9016\nbiology\t9017\n1655\t9018\n郄\t9019\notc\t9020\nsio\t9021\n1310\t9022\n1605\t9023\n蹩\t9024\nmems\t9025\n1618\t9026\nm16\t9027\ncomplete\t9028\nindustrial\t9029\nacs\t9030\n1603\t9031\nkids\t9032\ntour\t9033\nu2\t9034\nallen\t9035\n1756\t9036\n743\t9037\n嬖\t9038\n踽\t9039\ndavis\t9040\n柽\t9041\n鞨\t9042\n65279\t9043\n7600\t9044\n30ml\t9045\n957\t9046\n0l\t9047\n734\t9048\np450\t9049\n956\t9050\nir\t9051\n麴\t9052\n500mm\t9053\ncasio\t9054\n1038\t9055\nroger\t9056\nlibrary\t9057\n015\t9058\n1652\t9059\n薙\t9060\nwithin\t9061\nhands\t9062\n874\t9063\nntsc\t9064\n钇\t9065\nwhole\t9066\njq\t9067\n氵\t9068\n垆\t9069\npost\t9070\nsweet\t9071\nwall\t9072\n898\t9073\ncs5\t9074\nfeo\t9075\n9800\t9076\ncms\t9077\n1390\t9078\nsince\t9079\nmedical\t9080\n犟\t9081\n1492\t9082\n罍\t9083\nstand\t9084\njustin\t9085\nlake\t9086\ni5\t9087\n1729\t9088\nbell\t9089\nruby\t9090\nimportant\t9091\nbout\t9092\nimages\t9093\nlab\t9094\n962\t9095\n1759\t9096\nrj\t9097\ncache\t9098\nnb\t9099\nproduction\t9100\n経\t9101\n807\t9102\n1771\t9103\ndoing\t9104\n粜\t9105\ntnf\t9106\nws\t9107\nguide\t9108\nbim\t9109\nevents\t9110\n1626\t9111\n1016\t9112\n焜\t9113\nperformance\t9114\nra\t9115\nzl\t9116\n牀\t9117\n1568\t9118\n1647\t9119\n埝\t9120\n洧\t9121\n1615\t9122\nshift\t9123\n788\t9124\nshen\t9125\n1588\t9126\n60mm\t9127\n覧\t9128\ntuv\t9129\n1673\t9130\nelectronic\t9131\nmos\t9132\n蓣\t9133\n8kg\t9134\n862\t9135\necho\t9136\n1572\t9137\nsection\t9138\n981\t9139\n甯\t9140\nsg\t9141
\n1664\t9142\nunderstand\t9143\nhsk\t9144\ndelta\t9145\nx86\t9146\neap\t9147\nblock\t9148\n1578\t9149\ner\t9150\nxl\t9151\n蒐\t9152\n馐\t9153\nnox\t9154\n畑\t9155\nib\t9156\ntrying\t9157\nann\t9158\n1635\t9159\napache\t9160\nnaoh\t9161\n12345\t9162\n缑\t9163\n礽\t9164\n1624\t9165\n694\t9166\n瞋\t9167\n1601\t9168\n浍\t9169\n983\t9170\n773\t9171\n1000m\t9172\nsomeone\t9173\n15kg\t9174\n25m\t9175\n847\t9176\n袢\t9177\n桕\t9178\n1037\t9179\njerry\t9180\n843\t9181\npicture\t9182\n919\t9183\ne3\t9184\nprintf\t9185\n3gs\t9186\nmarie\t9187\n853\t9188\nrj45\t9189\n侩\t9190\n913\t9191\n896\t9192\nlose\t9193\nunicode\t9194\n100cm\t9195\n1711\t9196\ncharlie\t9197\n詈\t9198\n戸\t9199\n1689\t9200\nroom\t9201\n烝\t9202\nbeat\t9203\n堌\t9204\n伋\t9205\nhplc\t9206\n9300\t9207\n110kv\t9208\nnfc\t9209\n倬\t9210\n764\t9211\niis\t9212\n圯\t9213\nsolo\t9214\n碇\t9215\nef\t9216\nround\t9217\nchang\t9218\n1366\t9219\n781\t9220\n1585\t9221\n982\t9222\nsocket\t9223\ndf\t9224\n892\t9225\n1536\t9226\n831\t9227\nren\t9228\n6kg\t9229\n4900\t9230\n纰\t9231\nobject\t9232\nforever\t9233\n832\t9234\n951\t9235\nqr\t9236\n1023\t9237\n8800\t9238\n4kg\t9239\n磾\t9240\n泔\t9241\n1131\t9242\n纮\t9243\n蓁\t9244\n971\t9245\nbuilding\t9246\n1021\t9247\n铗\t9248\n939\t9249\n弇\t9250\n挲\t9251\ncrystal\t9252\n艉\t9253\nsmtp\t9254\n鱬\t9255\ncims\t9256\nfang\t9257\n1265\t9258\ntrans\t9259\npan\t9260\n1745\t9261\n1604\t9262\n泺\t9263\n橛\t9264\n817\t9265\n796\t9266\n袴\t9267\ncosplay\t9268\n1154\t9269\n1189\t9270\n749\t9271\n794\t9272\n1068\t9273\n881\t9274\nhc\t9275\nhope\t9276\n1410\t9277\ncouldn\t9278\n1638\t9279\n992\t9280\nalong\t9281\nage\t9282\n250mg\t9283\nclear\t9284\naps\t9285\n1631\t9286\n1011\t9287\nprovides\t9288\n1123\t9289\n1701\t9290\n36000\t9291\ncsf\t9292\n韪\t9293\nn1\t9294\nworks\t9295\n籓\t9296\n967\t9297\nptc\t9298\n贶\t9299\n1111\t9300\n1651\t9301\n棰\t9302\n1726\t9303\nsar\t9304\n1666\t9305\nqvga\t9306\nhf\t9307\ncoreldraw\t9308\npossible\t9309\n趵\t9310\n1629\t9311\n943\t9312\nmarc\t9313\nluo\t9314\n樨\t9315\n848\t9316\ncou
nty\t9317\n944\t9318\ntb\t9319\ndts\t9320\njunior\t9321\nvba\t9322\nlot\t9323\n傕\t9324\n玕\t9325\n毎\t9326\ndirect\t9327\n839\t9328\n繸\t9329\n2350\t9330\n774\t9331\n劵\t9332\nfsh\t9333\nwmv\t9334\n镧\t9335\n秫\t9336\n1094\t9337\nosi\t9338\n1602\t9339\n邶\t9340\n猞\t9341\ndior\t9342\n1766\t9343\n1623\t9344\n廛\t9345\n栌\t9346\n钲\t9347\n镦\t9348\n1607\t9349\npsa\t9350\nspss\t9351\nxy\t9352\n1769\t9353\ncells\t9354\n1465\t9355\n1577\t9356\ngon\t9357\nsend\t9358\nvision\t9359\nthinking\t9360\nimf\t9361\n嘏\t9362\ncarl\t9363\n蝰\t9364\n32000\t9365\nbay\t9366\n928\t9367\nis09001\t9368\n镏\t9369\n20kg\t9370\n淠\t9371\nimax\t9372\nnovel\t9373\nqt\t9374\n1684\t9375\n荇\t9376\n逄\t9377\nau\t9378\nauthor\t9379\nmod\t9380\n80mm\t9381\n1748\t9382\n849\t9383\n1612\t9384\nyet\t9385\n嘅\t9386\n929\t9387\n6l\t9388\nkarl\t9389\n6100\t9390\nstudents\t9391\ngmat\t9392\nmyself\t9393\nkate\t9394\njpg\t9395\n979\t9396\n1752\t9397\n829\t9398\n2450\t9399\n914\t9400\n876\t9401\n祕\t9402\n瑠\t9403\n48h\t9404\nmpv\t9405\n1734\t9406\nmis\t9407\n1565\t9408\nwalk\t9409\n941\t9410\n1075\t9411\n1235\t9412\nnatural\t9413\nk2\t9414\n977\t9415\n炝\t9416\n杪\t9417\n4050\t9418\n1669\t9419\np3\t9420\n1004\t9421\nfn\t9422\n埴\t9423\n1555\t9424\nvmware\t9425\nchloride\t9426\n942\t9427\nsteven\t9428\n1078\t9429\n獬\t9430\n966\t9431\n1135\t9432\ncountry\t9433\n947\t9434\n柢\t9435\n捱\t9436\n跣\t9437\n887\t9438\n涑\t9439\n75mm\t9440\n1278\t9441\n1583\t9442\nwestern\t9443\nwatch\t9444\n撃\t9445\n伢\t9446\n堠\t9447\n1045\t9448\n12m\t9449\nmuseum\t9450\n1215\t9451\ndocument\t9452\nmarketing\t9453\n952\t9454\n卽\t9455\n猁\t9456\nusb3\t9457\n906\t9458\n厣\t9459\nphysical\t9460\n辏\t9461\n1668\t9462\n旆\t9463\nagp\t9464\n茆\t9465\n1488\t9466\npg\t9467\n乜\t9468\ndeep\t9469\n1082\t9470\n961\t9471\n踯\t9472\n1526\t9473\n#\t9474\n[\t9475\nyam\t9476\nlofter\t9477\n##s\t9478\n##0\t9479\n##a\t9480\n##2\t9481\n##1\t9482\n##3\t9483\n##e\t9484\n##8\t9485\n##5\t9486\n##6\t9487\n##4\t9488\n##9\t9489\n##7\t9490\n##t\t9491\n##o\t9492\n##d\t9493\n##i\t9494\n##n\t949
5\n##m\t9496\n##c\t9497\n##l\t9498\n##y\t9499\n##r\t9500\n##g\t9501\n##p\t9502\n##f\t9503\npixnet\t9504\ncookies\t9505\ntripadvisor\t9506\n##er\t9507\n##k\t9508\n##h\t9509\n##b\t9510\n##x\t9511\n##u\t9512\n##w\t9513\n##ing\t9514\nctrip\t9515\n##on\t9516\n##v\t9517\nllc\t9518\n##an\t9519\n##z\t9520\nblogthis\t9521\n##le\t9522\n##in\t9523\n##mm\t9524\n##00\t9525\nig\t9526\n##ng\t9527\n##us\t9528\n##te\t9529\n##ed\t9530\nncc\t9531\nblog\t9532\n##10\t9533\n##al\t9534\n##ic\t9535\n##ia\t9536\n##q\t9537\n##ce\t9538\n##en\t9539\n##is\t9540\n##ra\t9541\n##es\t9542\n##j\t9543\n##cm\t9544\ntw\t9545\n##ne\t9546\n##re\t9547\n##tion\t9548\npony\t9549\n##2017\t9550\n##ch\t9551\n##or\t9552\n##na\t9553\ncafe\t9554\npinterest\t9555\npixstyleme3c\t9556\n##ta\t9557\n##2016\t9558\n##ll\t9559\n##20\t9560\n##ie\t9561\n##ma\t9562\n##17\t9563\n##ion\t9564\n##th\t9565\n##st\t9566\n##se\t9567\n##et\t9568\n##ck\t9569\n##ly\t9570\nweb885\t9571\n##ge\t9572\nxd\t9573\n##ry\t9574\n##11\t9575\n0fork\t9576\n##12\t9577\n##ter\t9578\n##ar\t9579\n##la\t9580\n##os\t9581\n##30\t9582\n##el\t9583\n##50\t9584\n##ml\t9585\ntue\t9586\nposted\t9587\n##at\t9588\n##man\t9589\n##15\t9590\nago\t9591\n##it\t9592\n##me\t9593\n##de\t9594\n##nt\t9595\n##mb\t9596\n##16\t9597\n##ve\t9598\n##da\t9599\n##ps\t9600\n##to\t9601\nhttps\t9602\nmomo\t9603\n##son\t9604\n##ke\t9605\n##80\t9606\nebd\t9607\napk\t9608\n##88\t9609\n##um\t9610\nwiki\t9611\nbrake\t9612\nmon\t9613\npo\t9614\njune\t9615\n##ss\t9616\nfb\t9617\n##as\t9618\nleonardo\t9619\nsafari\t9620\n##60\t9621\nwed\t9622\nwin7\t9623\nkiehl\t9624\n##co\t9625\n##go\t9626\nvfm\t9627\nkanye\t9628\n##90\t9629\n##2015\t9630\n##id\t9631\n##ey\t9632\n##sa\t9633\n##ro\t9634\n##am\t9635\n##no\t9636\nthu\t9637\nfri\t9638\n##sh\t9639\n##ki\t9640\ncomments\t9641\n##pe\t9642\n##ine\t9643\nuber\t9644\n##mi\t9645\n##ton\t9646\nwordpress\t9647\n##ment\t9648\nwin10\t9649\n##ld\t9650\n##li\t9651\ngmail\t9652\n##rs\t9653\n##ri\t9654\n##rd\t9655\n##21\t9656\n##io\t9657\n##99\t9658\npaypal\
t9659\npolicy\t9660\n##40\t9661\n##ty\t9662\n##18\t9663\n##01\t9664\n##ba\t9665\ntaiwan\t9666\n##ga\t9667\nprivacy\t9668\nagoda\t9669\n##13\t9670\n##ny\t9671\n##24\t9672\n##22\t9673\n##by\t9674\n##ur\t9675\n##hz\t9676\n##ang\t9677\ncookie\t9678\nnetscape\t9679\n##ka\t9680\n##ad\t9681\nnike\t9682\nsurvey\t9683\n##016\t9684\nwikia\t9685\n##32\t9686\n##017\t9687\ncbc\t9688\n##tor\t9689\n##kg\t9690\n##rt\t9691\n##14\t9692\ncampaign\t9693\n##ct\t9694\n##ts\t9695\n##ns\t9696\n##ao\t9697\n##nd\t9698\n##70\t9699\n##ya\t9700\n##il\t9701\n##25\t9702\n0020\t9703\n897\t9704\n##23\t9705\nhotels\t9706\n##ian\t9707\n6606\t9708\n##ers\t9709\n##26\t9710\n##day\t9711\n##ay\t9712\n##line\t9713\n##be\t9714\ntalk2yam\t9715\nyamservice\t9716\ncoco\t9717\n##dy\t9718\n##ies\t9719\n##ha\t9720\ninstagram\t9721\n##ot\t9722\n##va\t9723\n##mo\t9724\n##land\t9725\nltxsw\t9726\n##ation\t9727\n##pa\t9728\n##ol\t9729\ntag\t9730\n##ue\t9731\n##31\t9732\noppo\t9733\n##ca\t9734\n##om\t9735\nchrome\t9736\n##ure\t9737\nlol\t9738\n##19\t9739\n##bo\t9740\n##100\t9741\n##way\t9742\n##ko\t9743\n##do\t9744\n##un\t9745\n##ni\t9746\nherme\t9747\n##28\t9748\n##up\t9749\n##06\t9750\n##ds\t9751\nadmin\t9752\n##48\t9753\n##015\t9754\n##35\t9755\n##ee\t9756\ntpp\t9757\n##ive\t9758\n##cc\t9759\n##ble\t9760\n##ity\t9761\n##ex\t9762\n##ler\t9763\n##ap\t9764\n##book\t9765\n##ice\t9766\n##km\t9767\n##mg\t9768\n##ms\t9769\nebay\t9770\n##29\t9771\nubuntu\t9772\n##cy\t9773\n##view\t9774\n##lo\t9775\n##oo\t9776\n##02\t9777\nstep1\t9778\njuly\t9779\n##net\t9780\n##ls\t9781\n##ii\t9782\n##05\t9783\n##33\t9784\nstep2\t9785\nios9\t9786\n##box\t9787\n##ley\t9788\nsamsung\t9789\npokemon\t9790\n##ent\t9791\n##les\t9792\ns8\t9793\natom\t9794\n##said\t9795\n##55\t9796\n##2014\t9797\n##66\t9798\nadidas\t9799\namazon\t9800\n##ber\t9801\n##ner\t9802\nvisa\t9803\n##77\t9804\n##der\t9805\nconnectivity\t9806\n##hi\t9807\nfirefox\t9808\nskip\t9809\n##27\t9810\n##ir\t9811\n##61\t9812\n##ai\t9813\n##ver\t9814\ncafe2017\t9815\n##ron\t9816\n##
ster\t9817\n##sk\t9818\n##ft\t9819\nlongchamp\t9820\nssd\t9821\n##ti\t9822\nreply\t9823\n##my\t9824\napr\t9825\n##ker\t9826\nsource\t9827\n##one\t9828\n##2013\t9829\n##ow\t9830\ngoods\t9831\n##lin\t9832\n##ip\t9833\n##ics\t9834\n##45\t9835\n##03\t9836\n##ff\t9837\n##47\t9838\nganji\t9839\n##nce\t9840\n##per\t9841\nfaq\t9842\ncomment\t9843\n##ock\t9844\n##bs\t9845\n##ah\t9846\n##lv\t9847\n##mp\t9848\n##000\t9849\nmelody\t9850\n17life\t9851\n##au\t9852\n##71\t9853\n##04\t9854\n##95\t9855\n##age\t9856\ntips\t9857\n##68\t9858\n##ting\t9859\n##ung\t9860\nwonderland\t9861\n##ction\t9862\nmar\t9863\narticle\t9864\n##db\t9865\n##07\t9866\n##ore\t9867\n##op\t9868\n##78\t9869\n##38\t9870\n##ong\t9871\n##73\t9872\n##08\t9873\n##ica\t9874\n##36\t9875\n##wa\t9876\n##64\t9877\nhomemesh\t9878\n##85\t9879\n##tv\t9880\n##di\t9881\nmacbook\t9882\n##ier\t9883\n##si\t9884\n##75\t9885\n##ok\t9886\ngoris\t9887\nlock\t9888\n##ut\t9889\ncarol\t9890\n##vi\t9891\n##ac\t9892\nanti\t9893\njan\t9894\ntags\t9895\n##98\t9896\n##51\t9897\naugust\t9898\n##86\t9899\n##fs\t9900\n##sion\t9901\njordan\t9902\n##tt\t9903\n##lt\t9904\n##42\t9905\n##bc\t9906\nvivi\t9907\n##rry\t9908\n##ted\t9909\n##rn\t9910\nusd\t9911\n##t00\t9912\n##58\t9913\n##09\t9914\n##34\t9915\ngoo\t9916\n##ui\t9917\n##ary\t9918\nitem\t9919\n##pm\t9920\n##41\t9921\n##za\t9922\n##2012\t9923\nblogabstract\t9924\n##ger\t9925\n##62\t9926\n##44\t9927\ngr2\t9928\nasus\t9929\ncindy\t9930\n##hd\t9931\nesc\t9932\n##od\t9933\nbooking\t9934\n##53\t9935\nfed\t9936\n##81\t9937\n##ina\t9938\nchan\t9939\ndistribution\t9940\nsteam\t9941\npk10\t9942\n##ix\t9943\n##65\t9944\n##91\t9945\ndec\t9946\n##ana\t9947\nicecat\t9948\n00z\t9949\n##46\t9950\n##ji\t9951\n##ard\t9952\noct\t9953\n##ain\t9954\njp\t9955\n##ze\t9956\n##bi\t9957\ncio\t9958\n##56\t9959\nh5\t9960\n##39\t9961\n##port\t9962\ncurve\t9963\n##nm\t9964\n##dia\t9965\nutc\t9966\n12345678910\t9967\n##52\t9968\nchanel\t9969\n##and\t9970\n##im\t9971\n##63\t9972\nvera\t9973\nvivo\t9974\n##ei\t9975\n2
756\t9976\n##69\t9977\nmsci\t9978\n##po\t9979\n##89\t9980\n##bit\t9981\n##out\t9982\n##zz\t9983\n##97\t9984\n##67\t9985\nopec\t9986\n##96\t9987\n##tes\t9988\n##ast\t9989\n##ling\t9990\n##ory\t9991\n##ical\t9992\nkitty\t9993\n##43\t9994\nstep3\t9995\n##cn\t9996\nwin8\t9997\niphone7\t9998\nbeauty\t9999\n##87\t10000\ndollars\t10001\n##ys\t10002\n##oc\t10003\npay\t10004\n##2011\t10005\n##lly\t10006\n##ks\t10007\ndownload\t10008\nsep\t10009\n##board\t10010\n##37\t10011\n##lan\t10012\nwinrar\t10013\n##que\t10014\n##ua\t10015\n##com\t10016\nettoday\t10017\n##54\t10018\n##ren\t10019\n##via\t10020\n##72\t10021\n##79\t10022\n##tch\t10023\n##49\t10024\n##ial\t10025\n##nn\t10026\nstep4\t10027\n2765\t10028\ngov\t10029\n##xx\t10030\nmandy\t10031\n##ser\t10032\ncopyright\t10033\nfashion\t10034\n##ist\t10035\n##art\t10036\n##lm\t10037\n##ek\t10038\n##ning\t10039\n##if\t10040\n##ite\t10041\niot\t10042\n##84\t10043\n##2010\t10044\n##ku\t10045\noctober\t10046\n##ux\t10047\ntrump\t10048\n##hs\t10049\n##ide\t10050\n##ins\t10051\napril\t10052\n##ight\t10053\n##83\t10054\nprotected\t10055\n##fe\t10056\n##ho\t10057\nofo\t10058\ngomaji\t10059\nmarch\t10060\n##lla\t10061\n##pp\t10062\n##ec\t10063\n6s\t10064\n720p\t10065\n##rm\t10066\n##ham\t10067\n##92\t10068\nfandom\t10069\n##ell\t10070\ninfo\t10071\n##82\t10072\nsina\t10073\n4066\t10074\n##able\t10075\n##ctor\t10076\nrights\t10077\njul\t10078\n##76\t10079\nmall\t10080\n##59\t10081\ndonald\t10082\nsodu\t10083\n##light\t10084\nreserved\t10085\nhtm\t10086\n##han\t10087\n##57\t10088\n##ise\t10089\n##tions\t10090\n##shi\t10091\ndoc\t10092\n055\t10093\n##ram\t10094\nshopping\t10095\naug\t10096\n##pi\t10097\n##well\t10098\nwam\t10099\n##hu\t10100\n##gb\t10101\n##93\t10102\nmix\t10103\n##ef\t10104\n##uan\t10105\nbwl\t10106\n##plus\t10107\n##res\t10108\n##ess\t10109\ntea\t10110\nhktvmall\t10111\n##ate\t10112\n##ese\t10113\nfeb\t10114\ninn\t10115\nnov\t10116\n##ci\t10117\npass\t10118\n##bet\t10119\n##nk\t10120\ncoffee\t10121\nairbnb\t10122\n##ute\t1
0123\nwoshipm\t10124\nskype\t10125\n##fc\t10126\n##www\t10127\n##94\t10128\n##ght\t10129\n##gs\t10130\n##ile\t10131\n##wood\t10132\n##uo\t10133\nicon\t10134\n##em\t10135\nsays\t10136\n##king\t10137\n##tive\t10138\nblogger\t10139\n##74\t10140\n##ox\t10141\n##zy\t10142\n##red\t10143\n##ium\t10144\n##lf\t10145\nnokia\t10146\nclaire\t10147\n##ding\t10148\nnovember\t10149\nlohas\t10150\n##500\t10151\n##tic\t10152\n##cs\t10153\n##che\t10154\n##ire\t10155\n##gy\t10156\n##ult\t10157\njanuary\t10158\nptt\t10159\n##fa\t10160\n##mer\t10161\npchome\t10162\nudn\t10163\n##time\t10164\n##tte\t10165\ngarden\t10166\neleven\t10167\n309b\t10168\nbat\t10169\n##123\t10170\n##tra\t10171\nkindle\t10172\n##ern\t10173\nxperia\t10174\nces\t10175\ntravel\t10176\n##ous\t10177\n##int\t10178\nedu\t10179\ncho\t10180\n##car\t10181\n##our\t10182\n##ant\t10183\nrends\t10184\n##jo\t10185\nmastercard\t10186\n##2000\t10187\nkb\t10188\n##min\t10189\n##ino\t10190\n##ris\t10191\n##ud\t10192\n##set\t10193\n##her\t10194\n##ou\t10195\ntaipei\t10196\n##fi\t10197\n##ill\t10198\naphojoy\t10199\ndecember\t10200\nmeiki\t10201\n##ick\t10202\ntweet\t10203\n##av\t10204\niphone6\t10205\n##dd\t10206\nviews\t10207\n##mark\t10208\n##ash\t10209\n##ome\t10210\nkoreanmall\t10211\n##ak\t10212\nq2\t10213\n##200\t10214\nmlb\t10215\n##lle\t10216\n##watch\t10217\n##und\t10218\n##tal\t10219\n##less\t10220\n4399\t10221\n##rl\t10222\nupdate\t10223\nshop\t10224\n##mhz\t10225\n##house\t10226\n##key\t10227\n##001\t10228\n##hy\t10229\n##web\t10230\n##2009\t10231\n##gg\t10232\n##wan\t10233\n##val\t10234\n2021\t10235\n##ons\t10236\ndoi\t10237\ntrivago\t10238\noverdope\t10239\n##ance\t10240\n573032185\t10241\nwx17house\t10242\n##so\t10243\naudi\t10244\n##he\t10245\n##rp\t10246\n##ake\t10247\nbeach\t10248\ncfa\t10249\nps4\t10250\n##800\t10251\n##link\t10252\n##hp\t10253\nferragamo\t10254\n##eng\t10255\n##style\t10256\n##gi\t10257\ni7\t10258\n##ray\t10259\n##max\t10260\n##pc\t10261\nseptember\t10262\n##ace\t10263\nvps\t10264\nfebruary\t102
65\npantos\t10266\nwp\t10267\nlisa\t10268\njquery\t10269\noffer\t10270\n##berg\t10271\n##news\t10272\nfks\t10273\n##all\t10274\n##rus\t10275\n##888\t10276\n##works\t10277\nblogtitle\t10278\nloftpermalink\t10279\nling\t10280\n##ja\t10281\noutlet\t10282\n##ea\t10283\n##top\t10284\n##ness\t10285\nsalvatore\t10286\n##lu\t10287\nswift\t10288\n##ul\t10289\nweek\t10290\n##ean\t10291\n##300\t10292\n##gle\t10293\n##back\t10294\npowered\t10295\n##tan\t10296\n##nes\t10297\ncanon\t10298\n##zi\t10299\n##las\t10300\n##oe\t10301\n##sd\t10302\n##bot\t10303\n##world\t10304\n##zo\t10305\ntop100\t10306\npmi\t10307\n##vr\t10308\nball\t10309\nvogue\t10310\nofweek\t10311\n##list\t10312\n##ort\t10313\n##lon\t10314\n##tc\t10315\n##of\t10316\n##bus\t10317\n##gen\t10318\nnas\t10319\n##lie\t10320\n##ria\t10321\n##coin\t10322\n##bt\t10323\nnata\t10324\nvive\t10325\ncup\t10326\n##ook\t10327\n##sy\t10328\nmsg\t10329\n3ce\t10330\n##word\t10331\nebooks\t10332\nr8\t10333\nnice\t10334\nmonths\t10335\nrewards\t10336\n##ther\t10337\n0800\t10338\n##xi\t10339\n##sc\t10340\ngg\t10341\nblogfp\t10342\ndaily\t10343\n##bb\t10344\n##tar\t10345\n##ky\t10346\nanthony\t10347\n##yo\t10348\n##ara\t10349\n##aa\t10350\n##rc\t10351\n##tz\t10352\n##ston\t10353\ngear\t10354\n##eo\t10355\n##ade\t10356\n##win\t10357\n##ura\t10358\n##den\t10359\n##ita\t10360\n##sm\t10361\npng\t10362\nrakuten\t10363\nwhatsapp\t10364\n##use\t10365\npad\t10366\ngucci\t10367\n##ode\t10368\n##fo\t10369\nchicago\t10370\n##hone\t10371\nio\t10372\nsogo\t10373\nbe2\t10374\n##ology\t10375\ncloud\t10376\n##con\t10377\n##ford\t10378\n##joy\t10379\n##kb\t10380\n##rade\t10381\n##ach\t10382\ndocker\t10383\n##ful\t10384\n##ase\t10385\nford\t10386\n##star\t10387\nedited\t10388\n##are\t10389\n##mc\t10390\nsiri\t10391\n##ella\t10392\nbloomberg\t10393\n##read\t10394\npizza\t10395\n##ison\t10396\n##vm\t10397\nnode\t10398\n18k\t10399\n##play\t10400\n##cer\t10401\n##yu\t10402\n##ings\t10403\nasr\t10404\n##lia\t10405\nstep5\t10406\n##cd\t10407\npixstyleme\t10408
\n##600\t10409\n##tus\t10410\ntokyo\t10411\n##rial\t10412\n##life\t10413\n##ae\t10414\ntcs\t10415\n##rk\t10416\n##wang\t10417\n##sp\t10418\n##ving\t10419\npremium\t10420\nnetflix\t10421\n##lton\t10422\n##ple\t10423\n##cal\t10424\n021\t10425\n##sen\t10426\n##ville\t10427\nnexus\t10428\n##ius\t10429\n##mah\t10430\ntila\t10431\n##tin\t10432\nresort\t10433\n##ws\t10434\np10\t10435\nreport\t10436\n##360\t10437\n##ru\t10438\nbus\t10439\nvans\t10440\n##est\t10441\nlinks\t10442\nrebecca\t10443\n##dm\t10444\nazure\t10445\n##365\t10446\n##mon\t10447\nmoto\t10448\n##eam\t10449\nblogspot\t10450\n##ments\t10451\n##ik\t10452\n##kw\t10453\n##bin\t10454\n##ata\t10455\n##vin\t10456\n##tu\t10457\n##ula\t10458\nstation\t10459\n##ature\t10460\nfiles\t10461\nzara\t10462\nhdr\t10463\ntop10\t10464\ns6\t10465\nmarriott\t10466\navira\t10467\ntab\t10468\n##ran\t10469\n##home\t10470\noculus\t10471\n##ral\t10472\nrosie\t10473\n##force\t10474\n##ini\t10475\nice\t10476\n##bert\t10477\n##nder\t10478\n##mber\t10479\nplurk\t10480\n##sis\t10481\n00kg\t10482\n##ence\t10483\n##nc\t10484\n##name\t10485\nlog\t10486\nikea\t10487\nmalaysia\t10488\n##ncy\t10489\n##nie\t10490\n##ye\t10491\n##oid\t10492\n##chi\t10493\nxuehai\t10494\n##1000\t10495\n##orm\t10496\n##rf\t10497\n##ware\t10498\n##pro\t10499\n##era\t10500\n##ub\t10501\n##2008\t10502\n8891\t10503\nscp\t10504\n##zen\t10505\nqvod\t10506\njcb\t10507\n##hr\t10508\nweibo\t10509\n##row\t10510\n##ish\t10511\ngithub\t10512\nmate\t10513\n##lot\t10514\n##ane\t10515\n##tina\t10516\ned2k\t10517\n##vel\t10518\n##900\t10519\nfinal\t10520\nns\t10521\nbytes\t10522\n##ene\t10523\n##cker\t10524\n##2007\t10525\n##px\t10526\ntopapp\t10527\nhelpapp\t10528\n14k\t10529\ng4g\t10530\nldquo\t10531\n##fork\t10532\n##gan\t10533\n##zon\t10534\n##qq\t10535\n##google\t10536\n##ism\t10537\n##zer\t10538\ntoyota\t10539\ncategory\t10540\n##labels\t10541\nrestaurant\t10542\n##md\t10543\nposts\t10544\n##ico\t10545\nangelababy\t10546\n123456\t10547\nsports\t10548\ncandy\t10549\n##new\t1
0550\n##here\t10551\nswissinfo\t10552\ndram\t10553\n##ual\t10554\n##vice\t10555\n##wer\t10556\nsport\t10557\nq1\t10558\nios10\t10559\n##mll\t10560\nwan\t10561\n##uk\t10562\nx3\t10563\n0t\t10564\n##ming\t10565\ne5\t10566\n##3d\t10567\nh7n9\t10568\nworldcat\t10569\n##vo\t10570\n##led\t10571\n##580\t10572\n##ax\t10573\n##ert\t10574\npolo\t10575\n##lr\t10576\n##hing\t10577\n##chat\t10578\n##ule\t10579\nhotmail\t10580\n##pad\t10581\nbbq\t10582\n##ring\t10583\nwali\t10584\n2k\t10585\ncostco\t10586\nswitch\t10587\n##city\t10588\nphilips\t10589\n##mann\t10590\npanasonic\t10591\n##cl\t10592\n##vd\t10593\n##ping\t10594\n##rge\t10595\n##lk\t10596\ncss3\t10597\n##ney\t10598\n##ular\t10599\n##400\t10600\n##tter\t10601\nlz\t10602\n##tm\t10603\n##yan\t10604\n##let\t10605\ncoach\t10606\n##pt\t10607\na8\t10608\nfollow\t10609\n##berry\t10610\n##ew\t10611\n##wn\t10612\n##og\t10613\n##code\t10614\n##rid\t10615\nvilla\t10616\ngit\t10617\nr11\t10618\n##cket\t10619\nerror\t10620\n##anonymoussaid\t10621\n##ag\t10622\n##ame\t10623\n##gc\t10624\nqa\t10625\n##lis\t10626\n##gin\t10627\nvmalife\t10628\n##cher\t10629\nwedding\t10630\n##tis\t10631\ndemo\t10632\nbye\t10633\n##rant\t10634\norz\t10635\nacer\t10636\n##ats\t10637\n##ven\t10638\nmacd\t10639\nyougou\t10640\n##dn\t10641\n##ano\t10642\n##urt\t10643\n##rent\t10644\ncontinue\t10645\nscript\t10646\n##wen\t10647\n##ect\t10648\npaper\t10649\n##chel\t10650\n##cat\t10651\nx5\t10652\nfox\t10653\n##blog\t10654\nloading\t10655\n##yn\t10656\n##tp\t10657\nkuso\t10658\n799\t10659\nvdc\t10660\nforest\t10661\nprime\t10662\nultra\t10663\n##rmb\t10664\nsquare\t10665\n##field\t10666\n##reen\t10667\n##ors\t10668\n##ju\t10669\n##air\t10670\n##map\t10671\ncdn\t10672\n##wo\t10673\nm8\t10674\n##get\t10675\nopera\t10676\n##base\t10677\n##ood\t10678\nvsa\t10679\n##aw\t10680\n##ail\t10681\ncount\t10682\n##een\t10683\n##gp\t10684\nvsc\t10685\ntree\t10686\n##eg\t10687\n##ose\t10688\n##ories\t10689\n##shop\t10690\nalphago\t10691\nv4\t10692\nfluke62max\t10693\nzip\t10
694\n##sta\t10695\nbas\t10696\n##yer\t10697\nhadoop\t10698\n##ube\t10699\n##wi\t10700\n0755\t10701\nhola\t10702\n##low\t10703\ncentre\t10704\n##fer\t10705\n##750\t10706\n##media\t10707\n##san\t10708\n##bank\t10709\nq3\t10710\n##nge\t10711\n##mail\t10712\n##lp\t10713\nclient\t10714\nevent\t10715\nvincent\t10716\n##nse\t10717\nsui\t10718\nadchoice\t10719\n##stry\t10720\n##zone\t10721\nga\t10722\napps\t10723\n##ab\t10724\n##rner\t10725\nkymco\t10726\n##care\t10727\n##pu\t10728\n##yi\t10729\nminkoff\t10730\nannie\t10731\ncollection\t10732\nkpi\t10733\nplaystation\t10734\nbh\t10735\n##bar\t10736\narmani\t10737\n##xy\t10738\niherb\t10739\n##ery\t10740\n##share\t10741\n##ob\t10742\nvolvo\t10743\n##ball\t10744\n##hk\t10745\n##cp\t10746\n##rie\t10747\n##ona\t10748\n##sl\t10749\ngtx\t10750\nrdquo\t10751\njayz\t10752\n##lex\t10753\n##rum\t10754\nnamespace\t10755\n##ale\t10756\n##atic\t10757\n##erson\t10758\n##ql\t10759\n##ves\t10760\n##type\t10761\nenter\t10762\n##168\t10763\n##mix\t10764\n##bian\t10765\na9\t10766\nky\t10767\n##lc\t10768\nmovie\t10769\n##hc\t10770\ntower\t10771\n##ration\t10772\n##mit\t10773\n##nch\t10774\nua\t10775\ntel\t10776\nprefix\t10777\n##o2\t10778\n##point\t10779\nott\t10780\n##http\t10781\n##ury\t10782\nbaidu\t10783\n##ink\t10784\nmember\t10785\n##logy\t10786\nbigbang\t10787\nnownews\t10788\n##js\t10789\n##shot\t10790\n##tb\t10791\neba\t10792\n##tics\t10793\n##lus\t10794\nspark\t10795\n##ama\t10796\n##ions\t10797\n##lls\t10798\n##down\t10799\n##ress\t10800\nburberry\t10801\nday2\t10802\n##kv\t10803\nrelated\t10804\nedit\t10805\n##ark\t10806\ncx\t10807\n32gb\t10808\ng9\t10809\n##ans\t10810\n##tty\t10811\ns5\t10812\n##bee\t10813\nthread\t10814\nxr\t10815\nbuy\t10816\nspotify\t10817\n##ari\t10818\n##verse\t10819\n7headlines\t10820\nnego\t10821\nsunny\t10822\ndom\t10823\npositioning\t10824\nfit\t10825\n##tton\t10826\nalexa\t10827\n##ties\t10828\n##llow\t10829\namy\t10830\n##du\t10831\n##rth\t10832\n##lar\t10833\n2345\t10834\n##des\t10835\nsidebar\t10836\n
site\t10837\n##cky\t10838\n##kit\t10839\n##ime\t10840\n##009\t10841\nseason\t10842\n##fun\t10843\ngogoro\t10844\na7\t10845\nlily\t10846\ntwd600\t10847\n##vis\t10848\n##cture\t10849\nfriday\t10850\nyi\t10851\n##tta\t10852\n##tel\t10853\n##lock\t10854\neconomy\t10855\ntinker\t10856\n8gb\t10857\n##app\t10858\noops\t10859\n##right\t10860\nedm\t10861\n##cent\t10862\nsupreme\t10863\n##its\t10864\n##asia\t10865\ndropbox\t10866\n##tti\t10867\nbooks\t10868\n##tle\t10869\n##ller\t10870\n##ken\t10871\n##more\t10872\n##boy\t10873\nsex\t10874\n##dom\t10875\n##ider\t10876\n##unch\t10877\n##put\t10878\n##gh\t10879\nka\t10880\namoled\t10881\ndiv\t10882\n##tr\t10883\n##n1\t10884\nport\t10885\nhoward\t10886\n##tags\t10887\nken\t10888\n##nus\t10889\nadsense\t10890\nbuff\t10891\nthunder\t10892\n##town\t10893\n##ique\t10894\n##body\t10895\npin\t10896\n##erry\t10897\ntee\t10898\n##the\t10899\n##013\t10900\nudnbkk\t10901\n16gb\t10902\n##mic\t10903\nmiui\t10904\n##tro\t10905\n##alk\t10906\n##nity\t10907\ns4\t10908\n##oa\t10909\ndocomo\t10910\n##tf\t10911\n##ack\t10912\nfc2\t10913\n##ded\t10914\n##sco\t10915\n##014\t10916\n##rite\t10917\nlinkedin\t10918\n##ada\t10919\n##now\t10920\n##ndy\t10921\nucbug\t10922\nsputniknews\t10923\nlegalminer\t10924\n##ika\t10925\n##xp\t10926\n##bu\t10927\nq10\t10928\n##rman\t10929\ncheese\t10930\nming\t10931\nmaker\t10932\n##gm\t10933\nnikon\t10934\n##fig\t10935\nppi\t10936\njchere\t10937\nted\t10938\nfgo\t10939\ntech\t10940\n##tto\t10941\n##gl\t10942\n##len\t10943\nhair\t10944\nimg\t10945\n##pper\t10946\n##a1\t10947\nacca\t10948\n##ition\t10949\n##ference\t10950\nsuite\t10951\n##ig\t10952\n##mond\t10953\n##cation\t10954\n##pr\t10955\n101vip\t10956\n##999\t10957\n64gb\t10958\nairport\t10959\n##over\t10960\n##ith\t10961\n##su\t10962\ntown\t10963\npiece\t10964\n##llo\t10965\nno1\t10966\n##qi\t10967\nfocus\t10968\nreader\t10969\n##admin\t10970\n##ora\t10971\nfalse\t10972\n##log\t10973\n##ces\t10974\n##ume\t10975\nmotel\t10976\n##oper\t10977\nflickr\t10978\nnetco
mponents\t10979\n##af\t10980\npose\t10981\n##ound\t10982\n##cg\t10983\n##site\t10984\n##iko\t10985\ncon\t10986\n##ath\t10987\n##hip\t10988\n##rey\t10989\ncream\t10990\n##cks\t10991\n012\t10992\n##dp\t10993\nfacebooktwitterpinterestgoogle\t10994\nsso\t10995\nshtml\t10996\nswiss\t10997\n##mw\t10998\nlumia\t10999\nxdd\t11000\ntiffany\t11001\ninsee\t11002\nrussell\t11003\ndell\t11004\n##ations\t11005\ncamera\t11006\n##vs\t11007\n##flow\t11008\n##late\t11009\nclassic\t11010\n##nter\t11011\n##ever\t11012\n##lab\t11013\n##nger\t11014\nqe\t11015\n##cing\t11016\neditor\t11017\n##nap\t11018\nsunday\t11019\n##ens\t11020\n##700\t11021\n##bra\t11022\nacg\t11023\nsofascore\t11024\nmkv\t11025\n##ign\t11026\njonathan\t11027\nbuild\t11028\nlabels\t11029\n##oto\t11030\ntesla\t11031\nmoba\t11032\ngohappy\t11033\najax\t11034\n##test\t11035\n##urs\t11036\nwps\t11037\nfedora\t11038\n##ich\t11039\nmozilla\t11040\n##480\t11041\n##dr\t11042\nurn\t11043\n##lina\t11044\ngrace\t11045\n##die\t11046\n##try\t11047\n##ader\t11048\nelle\t11049\n##chen\t11050\nprice\t11051\n##ten\t11052\nuhz\t11053\n##ough\t11054\n##hen\t11055\nstates\t11056\npush\t11057\nsession\t11058\nbalance\t11059\nwow\t11060\n##cus\t11061\n##py\t11062\n##ward\t11063\n##ep\t11064\n34e\t11065\nwong\t11066\nprada\t11067\n##cle\t11068\n##ree\t11069\nq4\t11070\n##ctive\t11071\n##ool\t11072\n##ira\t11073\n##163\t11074\nrq\t11075\nbuffet\t11076\ne6\t11077\n##ez\t11078\n##card\t11079\n##cha\t11080\nday3\t11081\neye\t11082\n##end\t11083\nadi\t11084\ntvbs\t11085\n##ala\t11086\nnova\t11087\n##tail\t11088\n##ries\t11089\n##ved\t11090\nbase\t11091\n##ways\t11092\nhero\t11093\nhgih\t11094\nprofile\t11095\nfish\t11096\nmu\t11097\nssh\t11098\n##wd\t11099\nclick\t11100\ncake\t11101\n##ond\t11102\npre\t11103\n##tom\t11104\nkic\t11105\npixel\t11106\n##ov\t11107\n##fl\t11108\nproduct\t11109\n6a\t11110\n##pd\t11111\ndear\t11112\n##gate\t11113\nyumi\t11114\n##sky\t11115\nbin\t11116\n##ture\t11117\n##ape\t11118\nisis\t11119\nnand\t11120\n##101\t1112
1\n##load\t11122\n##ream\t11123\na6\t11124\n##post\t11125\n##we\t11126\nzenfone\t11127\n##ike\t11128\ngd\t11129\nforum\t11130\njessica\t11131\n##ould\t11132\n##ious\t11133\nlohasthree\t11134\n##gar\t11135\n##ggle\t11136\n##ric\t11137\n##own\t11138\neclipse\t11139\n##side\t11140\n061\t11141\n##other\t11142\n##tech\t11143\n##ator\t11144\nengine\t11145\n##ged\t11146\nplaza\t11147\n##fit\t11148\nwestbrook\t11149\nreuters\t11150\n##ily\t11151\ncontextlink\t11152\n##hn\t11153\n##cil\t11154\n##cel\t11155\ncambridge\t11156\n##ize\t11157\n##aid\t11158\n##data\t11159\nfrm\t11160\n##head\t11161\nbutler\t11162\n##sun\t11163\n##mar\t11164\npuma\t11165\npmid\t11166\nkitchen\t11167\n##lic\t11168\nday1\t11169\n##text\t11170\n##page\t11171\n##rris\t11172\npm1\t11173\n##ket\t11174\ntrackback\t11175\n##hai\t11176\ndisplay\t11177\n##hl\t11178\nidea\t11179\n##sent\t11180\nairmail\t11181\n##ug\t11182\n##men\t11183\n028\t11184\n##lution\t11185\nschemas\t11186\nasics\t11187\nwikipedia\t11188\n##tional\t11189\n##vy\t11190\n##dget\t11191\n##ein\t11192\ncontact\t11193\npepper\t11194\n##uel\t11195\n##ument\t11196\n##hang\t11197\nq5\t11198\n##sue\t11199\n##ndi\t11200\nswatch\t11201\n##cept\t11202\npopular\t11203\n##ste\t11204\n##tag\t11205\ntrc\t11206\n##west\t11207\n##live\t11208\nhonda\t11209\nping\t11210\nmessenger\t11211\n##rap\t11212\nv9\t11213\nunity\t11214\nappqq\t11215\nleo\t11216\n##tone\t11217\n##ass\t11218\nuniqlo\t11219\n##010\t11220\nmoneydj\t11221\n##tical\t11222\n12306\t11223\n##m2\t11224\ncoc\t11225\nmiacare\t11226\n##mn\t11227\ntmt\t11228\n##core\t11229\nvim\t11230\nkk\t11231\n##may\t11232\ntarget\t11233\n##2c\t11234\n##ope\t11235\nomega\t11236\npinkoi\t11237\n##rain\t11238\n##ement\t11239\np9\t11240\nrd\t11241\n##tier\t11242\n##vic\t11243\nzone\t11244\nisofix\t11245\ncpa\t11246\nkimi\t11247\n##lay\t11248\nlulu\t11249\n##uck\t11250\n050\t11251\nweeks\t11252\n##hop\t11253\n##ear\t11254\neia\t11255\n##fly\t11256\nkorea\t11257\nboost\t11258\n##ship\t11259\neur\t11260\nvalley\t1126
1\n##iel\t11262\n##ude\t11263\nrn\t11264\n##ena\t11265\nfeed\t11266\n5757\t11267\nqqmei\t11268\n##thing\t11269\naws\t11270\npink\t11271\n##ters\t11272\n##kin\t11273\nboard\t11274\n##vertisement\t11275\nwine\t11276\n##ien\t11277\n##dge\t11278\n##tant\t11279\n##twitter\t11280\n##3c\t11281\ncool1\t11282\n##012\t11283\n##150\t11284\n##fu\t11285\n##iner\t11286\ngooglemsn\t11287\npixnetfacebookyahoo\t11288\nx7\t11289\n##uce\t11290\nsao\t11291\n##ev\t11292\n##file\t11293\n9678\t11294\nxddd\t11295\nshirt\t11296\n##rio\t11297\n##hat\t11298\ngivenchy\t11299\nbang\t11300\n##lio\t11301\nmonday\t11302\n##abc\t11303\nubuntuforumwikilinuxpastechat\t11304\n##vc\t11305\n##rity\t11306\n7866\t11307\n##ost\t11308\nimsean\t11309\ntiger\t11310\n##fet\t11311\ndji\t11312\n##come\t11313\n##beth\t11314\n##aft\t11315\n##don\t11316\n3p\t11317\nemma\t11318\n##khz\t11319\nx6\t11320\n##face\t11321\npptv\t11322\nx4\t11323\n##mate\t11324\nsophie\t11325\n##jing\t11326\nfifa\t11327\n##mand\t11328\nsale\t11329\ninwedding\t11330\n##gn\t11331\n##mmy\t11332\n##pmlast\t11333\nnana\t11334\n##wu\t11335\nnote7\t11336\n##340\t11337\n##bel\t11338\nwindow\t11339\n##dio\t11340\n##ht\t11341\n##ivity\t11342\ndomain\t11343\nneo\t11344\n##isa\t11345\n##lter\t11346\n5k\t11347\nf5\t11348\n##cts\t11349\nft\t11350\nzol\t11351\n##act\t11352\nmwc\t11353\nnbapop\t11354\neds\t11355\n##room\t11356\nprevious\t11357\ntomtom\t11358\n##ets\t11359\n5t\t11360\nchi\t11361\n##hg\t11362\nfairmont\t11363\ngay\t11364\n1b\t11365\n##raph\t11366\n##ils\t11367\ni3\t11368\navenue\t11369\n##host\t11370\n##bon\t11371\n##tsu\t11372\nmessage\t11373\nnavigation\t11374\nfintech\t11375\nh6\t11376\n##ject\t11377\n##vas\t11378\n##firm\t11379\ncredit\t11380\n##wf\t11381\nxxxx\t11382\n##nor\t11383\n##space\t11384\nhuawei\t11385\nplan\t11386\njson\t11387\nsbl\t11388\n##dc\t11389\nwish\t11390\n##120\t11391\n##sol\t11392\nwindows7\t11393\nwashington\t11394\n##nsis\t11395\nlo\t11396\n##sio\t11397\n##ym\t11398\n##bor\t11399\nplanet\t11400\n##wt\t11401\ngpa
\t11402\n##tw\t11403\n##oka\t11404\nconnect\t11405\n##rss\t11406\n##work\t11407\n##atus\t11408\nchicken\t11409\n##times\t11410\nfa\t11411\n##ather\t11412\n##cord\t11413\n009\t11414\n##eep\t11415\nhitachi\t11416\n##pan\t11417\ndisney\t11418\n##press\t11419\nwind\t11420\nfrigidaire\t11421\n##tl\t11422\nhsu\t11423\n##ull\t11424\nexpedia\t11425\narchives\t11426\n##wei\t11427\ncut\t11428\nins\t11429\n6gb\t11430\nbrand\t11431\ncf1\t11432\n##rip\t11433\n##nis\t11434\n128gb\t11435\n3t\t11436\n##oon\t11437\nquick\t11438\n15058\t11439\nwing\t11440\n##bug\t11441\n##cms\t11442\n##dar\t11443\n##oh\t11444\nzoom\t11445\ntrip\t11446\n##nba\t11447\nrcep\t11448\naspx\t11449\n080\t11450\ngnu\t11451\n##count\t11452\n##url\t11453\n##ging\t11454\n8591\t11455\nam09\t11456\nshadow\t11457\n##cia\t11458\nemily\t11459\n##tation\t11460\nhost\t11461\nff\t11462\ntechorz\t11463\n##mini\t11464\n##mporary\t11465\n##ering\t11466\n##next\t11467\ncma\t11468\n##mbps\t11469\n##gas\t11470\n##ift\t11471\n##dot\t11472\namana\t11473\n##ros\t11474\n##eet\t11475\n##ible\t11476\n##aka\t11477\n##lor\t11478\nmaggie\t11479\n##011\t11480\n##iu\t11481\n##gt\t11482\n1tb\t11483\narticles\t11484\n##burg\t11485\n##iki\t11486\ndatabase\t11487\nfantasy\t11488\n##rex\t11489\n##cam\t11490\ndlc\t11491\ndean\t11492\n##you\t11493\npath\t11494\ngaming\t11495\nvictoria\t11496\nmaps\t11497\n##lee\t11498\n##itor\t11499\noverchicstoretvhome\t11500\n##xt\t11501\n##nan\t11502\nx9\t11503\ninstall\t11504\n##ann\t11505\n##ph\t11506\n##rcle\t11507\n##nic\t11508\n##nar\t11509\nmetro\t11510\nchocolate\t11511\n##rian\t11512\n##table\t11513\nskin\t11514\n##sn\t11515\nmountain\t11516\n##0mm\t11517\ninparadise\t11518\n7x24\t11519\n##jia\t11520\neeworld\t11521\ncreative\t11522\ng5\t11523\nparker\t11524\necfa\t11525\nvillage\t11526\nsylvia\t11527\nhbl\t11528\n##ques\t11529\n##onsored\t11530\n##x2\t11531\n##v4\t11532\n##tein\t11533\nie6\t11534\n##stack\t11535\nver\t11536\n##ads\t11537\n##baby\t11538\nbbe\t11539\n##110\t11540\n##lone\t11541\n##ui
d\t11542\nads\t11543\n022\t11544\ngundam\t11545\n006\t11546\nscrum\t11547\nmatch\t11548\n##ave\t11549\n##470\t11550\n##oy\t11551\n##talk\t11552\nglass\t11553\nlamigo\t11554\n##eme\t11555\n##a5\t11556\nwade\t11557\nkde\t11558\n##lace\t11559\nocean\t11560\ntvg\t11561\n##covery\t11562\n##r3\t11563\n##ners\t11564\n##rea\t11565\n##aine\t11566\ncover\t11567\n##ision\t11568\n##sia\t11569\n##bow\t11570\nmsi\t11571\n##love\t11572\nsoft\t11573\nz2\t11574\n##pl\t11575\nmobil\t11576\n##uy\t11577\nnginx\t11578\n##oi\t11579\n##rr\t11580\n6221\t11581\n##mple\t11582\n##sson\t11583\n##nts\t11584\n91tv\t11585\ncomhd\t11586\ncrv3000\t11587\n##uard\t11588\ngallery\t11589\n##bia\t11590\nrate\t11591\nspf\t11592\nredis\t11593\ntraction\t11594\nicloud\t11595\n011\t11596\njose\t11597\n##tory\t11598\nsohu\t11599\n899\t11600\nkicstart2\t11601\n##hia\t11602\n##sit\t11603\n##walk\t11604\n##xure\t11605\n500g\t11606\n##pact\t11607\nxa\t11608\ncarlo\t11609\n##250\t11610\n##walker\t11611\n##can\t11612\ncto\t11613\ngigi\t11614\npen\t11615\n##hoo\t11616\nob\t11617\n##yy\t11618\n13913459\t11619\n##iti\t11620\nmango\t11621\n##bbs\t11622\nsense\t11623\noxford\t11624\nwalker\t11625\njennifer\t11626\n##ola\t11627\ncourse\t11628\n##bre\t11629\n##pus\t11630\n##rder\t11631\nlucky\t11632\n075\t11633\nivy\t11634\n##nia\t11635\nsotheby\t11636\n##ugh\t11637\njoy\t11638\n##orage\t11639\n##ush\t11640\n##bat\t11641\n##dt\t11642\nr9\t11643\n##2d\t11644\n##gio\t11645\nwear\t11646\n##lax\t11647\n##moon\t11648\nseven\t11649\nlonzo\t11650\n8k\t11651\nevolution\t11652\n##kk\t11653\nkd\t11654\narduino\t11655\n##lux\t11656\narpg\t11657\n##rdon\t11658\ncook\t11659\n##x5\t11660\nfive\t11661\n##als\t11662\n##ida\t11663\nsign\t11664\n##nda\t11665\n##posted\t11666\nfresh\t11667\n##mine\t11668\n##skip\t11669\n##form\t11670\n##ssion\t11671\n##tee\t11672\ndyson\t11673\nstage\t11674\n##jie\t11675\n##night\t11676\nepson\t11677\npack\t11678\n##ppy\t11679\nwd\t11680\n##eh\t11681\n##rence\t11682\n##lvin\t11683\ngolden\t11684\ndiscovery
\t11685\n##trix\t11686\n##n2\t11687\nloft\t11688\n##uch\t11689\n##dra\t11690\n##sse\t11691\n1mdb\t11692\nwelcome\t11693\n##urn\t11694\ngaga\t11695\n##lmer\t11696\nteddy\t11697\n##160\t11698\n##f2016\t11699\n##sha\t11700\nrar\t11701\nholiday\t11702\n074\t11703\n##vg\t11704\n##nos\t11705\n##rail\t11706\ngartner\t11707\ngi\t11708\n6p\t11709\n##dium\t11710\nkit\t11711\nb3\t11712\neco\t11713\nsean\t11714\n##stone\t11715\nnu\t11716\n##np\t11717\nf16\t11718\nwrite\t11719\n029\t11720\nm5\t11721\n##ias\t11722\n##dk\t11723\nfsm\t11724\n52kb\t11725\n##xxx\t11726\n##cake\t11727\nlim\t11728\nru\t11729\n1v\t11730\n##ification\t11731\npublished\t11732\nangela\t11733\n16g\t11734\nanalytics\t11735\n##nel\t11736\ngmt\t11737\n##icon\t11738\n##bby\t11739\nios11\t11740\nwaze\t11741\n9985\t11742\n##ust\t11743\n##007\t11744\ndelete\t11745\n52sykb\t11746\nwwdc\t11747\n027\t11748\n##fw\t11749\n1389\t11750\n##xon\t11751\nbrandt\t11752\n##ses\t11753\n##dragon\t11754\nvetements\t11755\nanne\t11756\nmonte\t11757\nofficial\t11758\n##ere\t11759\n##nne\t11760\n##oud\t11761\netnews\t11762\n##a2\t11763\n##graphy\t11764\n##rtex\t11765\n##gma\t11766\nmount\t11767\narchive\t11768\nmorning\t11769\ntan\t11770\nddos\t11771\ne7\t11772\nday4\t11773\nfactory\t11774\nbruce\t11775\n##ito\t11776\nguest\t11777\n##lling\t11778\nn3\t11779\nmega\t11780\nwomen\t11781\ndac\t11782\nchurch\t11783\n##jun\t11784\nsingapore\t11785\n##facebook\t11786\n6991\t11787\nstarbucks\t11788\n##tos\t11789\n##stin\t11790\n##shine\t11791\nzen\t11792\n##mu\t11793\ntina\t11794\nrequest\t11795\n##gence\t11796\nq7\t11797\n##zzi\t11798\ndiary\t11799\n##tore\t11800\n##ead\t11801\ncst\t11802\n##osa\t11803\ncanada\t11804\nva\t11805\n##jiang\t11806\n##lam\t11807\n##nix\t11808\n##sday\t11809\ng6\t11810\n##master\t11811\nbing\t11812\n##zl\t11813\nnb40\t11814\nthai\t11815\nln284ct\t11816\n##itz\t11817\n##2f\t11818\nbonnie\t11819\n##food\t11820\n##lent\t11821\noriginals\t11822\n##stro\t11823\n##lts\t11824\n##bscribe\t11825\nntd\t11826\nyesstyle\t11
827\nhmv\t11828\n##tment\t11829\nd5\t11830\n##pn\t11831\ntopios9\t11832\nlifestyle\t11833\nvirtual\t11834\n##ague\t11835\nxz\t11836\n##deo\t11837\nmuji\t11838\n024\t11839\nunt\t11840\n##nnis\t11841\nfaq1\t11842\n##ette\t11843\ncurry\t11844\n##pop\t11845\nrelease\t11846\n##cast\t11847\n073\t11848\n##ews\t11849\n5c\t11850\n##stle\t11851\nios7\t11852\n##ima\t11853\ndog\t11854\nlenovo\t11855\n##r4\t11856\n013\t11857\nvornado\t11858\n##desk\t11859\n##ald\t11860\n9595\t11861\n##van\t11862\noil\t11863\ncommon\t11864\n##jy\t11865\n##lines\t11866\ng7\t11867\ntwice\t11868\nella\t11869\nnano\t11870\nbelle\t11871\n##mes\t11872\n##self\t11873\n##note\t11874\nbenz\t11875\n##ova\t11876\n##wing\t11877\nkai\t11878\n##hua\t11879\n##rect\t11880\nrainer\t11881\n##unge\t11882\n##0m\t11883\nguestname\t11884\n##uma\t11885\n##kins\t11886\n##zu\t11887\ntokichoi\t11888\n##price\t11889\n##med\t11890\n##mus\t11891\nrmk\t11892\naddress\t11893\nvm\t11894\nopenload\t11895\n##group\t11896\n##hin\t11897\n##iginal\t11898\namg\t11899\nurban\t11900\n##oz\t11901\njobs\t11902\n##public\t11903\n##sch\t11904\n##dden\t11905\n##bell\t11906\nhostel\t11907\n##drive\t11908\n##rmin\t11909\nboot\t11910\n##370\t11911\n##fx\t11912\n##nome\t11913\n##ctionary\t11914\n##oman\t11915\n##lish\t11916\n##cr\t11917\n##hm\t11918\n##how\t11919\nfrancis\t11920\nc919\t11921\nb5\t11922\nevernote\t11923\n##uc\t11924\n##3000\t11925\ncoupe\t11926\n##urg\t11927\n##cca\t11928\n##uality\t11929\n019\t11930\n##ett\t11931\n##ani\t11932\n##tax\t11933\n##rma\t11934\nleonnhurt\t11935\n##jin\t11936\nict\t11937\nbird\t11938\nnotes\t11939\n##dical\t11940\n##lli\t11941\nresult\t11942\niu\t11943\nee\t11944\nsmap\t11945\ngopro\t11946\n##last\t11947\nyin\t11948\npure\t11949\n32g\t11950\n##dan\t11951\n##rame\t11952\nmama\t11953\n##oot\t11954\nbean\t11955\n##hur\t11956\n2l\t11957\nbella\t11958\nsync\t11959\nxuite\t11960\n##ground\t11961\ndiscuz\t11962\n##getrelax\t11963\n##ince\t11964\n##bay\t11965\n##5s\t11966\napt\t11967\n##pass\t11968\njing\t119
69\n##rix\t11970\nrich\t11971\nniusnews\t11972\n##ello\t11973\nbag\t11974\n##eting\t11975\n##mobile\t11976\n##ience\t11977\ndetails\t11978\nuniversal\t11979\nsilver\t11980\ndit\t11981\nprivate\t11982\nddd\t11983\nu11\t11984\nkanshu\t11985\n##ified\t11986\nfung\t11987\n##nny\t11988\ndx\t11989\n##520\t11990\ntai\t11991\n023\t11992\n##fr\t11993\n##lean\t11994\n##pin\t11995\n##rin\t11996\nly\t11997\nrick\t11998\n##bility\t11999\nbanner\t12000\n##baru\t12001\n##gion\t12002\nvdf\t12003\nqualcomm\t12004\nbear\t12005\noldid\t12006\nian\t12007\njo\t12008\n##tors\t12009\npopulation\t12010\n##ernel\t12011\n##mv\t12012\n##bike\t12013\nww\t12014\n##ager\t12015\nexhibition\t12016\n##del\t12017\n##pods\t12018\nfpx\t12019\nstructure\t12020\n##free\t12021\n##tings\t12022\nkl\t12023\n##rley\t12024\n##copyright\t12025\n##mma\t12026\norange\t12027\nyoga\t12028\n4l\t12029\ncanmake\t12030\nhoney\t12031\n##anda\t12032\nnikkie\t12033\ndhl\t12034\npublishing\t12035\n##mall\t12036\n##gnet\t12037\ne88\t12038\n##dog\t12039\nfishbase\t12040\n###\t12041\n##[\t12042\n。\t12043\n！\t12044\n？\t12045\n!\t12046\n?\t12047\n；\t12048\n:\t12049\n;\t12050\n-\t12051\n(\t12052\n)\t12053\n/\t12054\n+\t12055\n\"\t12056\n_\t12057\n…\t12058\n~\t12059\n=\t12060\n'\t12061\n%\t12062\n&\t12063\n·\t12064\n*\t12065\n@\t12066\n\\\t12067\n]\t12068\n—\t12069\n～\t12070\n^\t12071\n>\t12072\n丨\t12073\n|\t12074\n<\t12075\n】\t12076\nの\t12077\n【\t12078\n〔\t12079\n〕\t12080\nー\t12081\n★\t12082\n’\t12083\n$\t12084\n{\t12085\n}\t12086\n‘\t12087\n[UNK]\t12088\n"
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/model/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/model/ernie.py",
    "content": "# -*- coding:utf-8 -**\n# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"ERNIE\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport json\nimport logging\n\nimport six\n\n\nclass ErnieConfig(object):\n    \"\"\"parse ernie config\"\"\"\n\n    def __init__(self, config_path):\n        \"\"\"\n        :param config_path:\n        \"\"\"\n        self._config_dict = self._parse(config_path)\n\n    def _parse(self, config_path):\n        \"\"\"\n        :param config_path:\n        :return:\n        \"\"\"\n        try:\n            with open(config_path, 'r') as json_file:\n                config_dict = json.load(json_file)\n        except Exception:\n            raise IOError(\"Error in parsing Ernie model config file '%s'\" % config_path)\n        else:\n            return config_dict\n\n    def __getitem__(self, key):\n        \"\"\"\n        :param key:\n        :return:\n        \"\"\"\n        return self._config_dict.get(key, None)\n\n    def has(self, key):\n        \"\"\"\n        :param key:\n        :return:\n        \"\"\"\n        if key in self._config_dict:\n            return True\n        return False\n\n    def get(self, key, default_value):\n        \"\"\"\n        :param key,default_value:\n        :retrun:\n        \"\"\"\n        if key in 
self._config_dict:\n            return self._config_dict[key]\n        else:\n            return default_value\n\n    def print_config(self):\n        \"\"\"\n        :return:\n        \"\"\"\n        for arg, value in sorted(six.iteritems(self._config_dict)):\n            logging.info('%s: %s' % (arg, value))\n        logging.info('------------------------------------------------')\n"
  },
  {
    "path": "modules/text/sentiment_analysis/ernie_skep_sentiment_analysis/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport os\n\nimport numpy as np\nfrom ernie_skep_sentiment_analysis.model.ernie import ErnieConfig\nfrom paddle.framework import core\n\nfrom paddlehub import TransformerModule\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.reader.batching import pad_batch_data\nfrom paddlehub.reader.tokenization import convert_to_unicode\nfrom paddlehub.reader.tokenization import FullTokenizer\n\n\n@moduleinfo(\n    name=\"ernie_skep_sentiment_analysis\",\n    version=\"1.0.1\",\n    summary=\n    \"SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis. Ernie_skep_sentiment_analysis module is initialize with enie_1.0_chn_large when pretraining. This module is finetuned on ChnSentiCorp dataset to do sentiment claasification. 
It can do sentiment analysis prediction directly, label as positive or negative.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/sentiment_analysis\",\n)\nclass ErnieSkepSentimentAnalysis(TransformerModule):\n    \"\"\"\n    Ernie_skep_sentiment_analysis module is initialize with enie_1.0_chn_large when pretraining.\n    This module is finetuned on ChnSentiCorp dataset to do sentiment claasification.\n    It can do sentiment analysis prediction directly, label as positive or negative.\n    \"\"\"\n\n    def _initialize(self):\n        ernie_config_path = os.path.join(self.directory, \"assets\", \"ernie_1.0_large_ch.config.json\")\n        self.ernie_config = ErnieConfig(ernie_config_path)\n        self.MAX_SEQ_LEN = 512\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"ernie_1.0_large_ch.vocab.txt\")\n        self.params_path = os.path.join(self.directory, \"assets\", \"params\")\n\n        self.infer_model_path = os.path.join(self.directory, \"assets\", \"inference_step_601\")\n        self.tokenizer = FullTokenizer(vocab_file=self.vocab_path)\n\n        self.vocab = self.tokenizer.vocab\n        self.pad_id = self.vocab[\"[PAD]\"]\n        self.label_map = {0: 'negative', 1: 'positive'}\n\n        self._set_config()\n\n    def _set_config(self):\n        \"\"\"\n        predictor config setting\n        \"\"\"\n        model_file_path = os.path.join(self.infer_model_path, 'model')\n        params_file_path = os.path.join(self.infer_model_path, 'params')\n\n        config = core.AnalysisConfig(model_file_path, params_file_path)\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n\n        if use_gpu:\n            config.enable_use_gpu(8000, 0)\n        else:\n            config.disable_gpu()\n\n        config.disable_glog_info()\n\n        self.predictor = core.create_paddle_predictor(config)\n\n    
def array2tensor(self, arr_data):\n        \"\"\"\n        convert numpy array to PaddleTensor\n        \"\"\"\n        tensor_data = core.PaddleTensor(arr_data)\n        return tensor_data\n\n    @serving\n    def predict_sentiment(self, texts=[], use_gpu=False):\n        \"\"\"\n        Get the sentiment label for the predicted texts. It will be classified as positive and negative.\n        Args:\n            texts (list(str)): the data to be predicted.\n            use_gpu (bool): Whether to use gpu or not.\n        Returns:\n            res (list): The result of sentiment label and probabilties.\n        \"\"\"\n\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you wanna use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        results = []\n        feature_list = []\n        for text in texts:\n            # feature.shape: [1, 512, 1]\n            # batch on the first dimension\n            feature = self._convert_text_to_feature(text)\n            feature_list.append(feature)\n\n        feature_batch = [\n            np.concatenate([feature[0] for feature in feature_list], axis=0),\n            np.concatenate([feature[1] for feature in feature_list], axis=0),\n            np.concatenate([feature[2] for feature in feature_list], axis=0),\n            np.concatenate([feature[3] for feature in feature_list], axis=0),\n            np.concatenate([feature[4] for feature in feature_list], axis=0),\n        ]\n\n        inputs = [self.array2tensor(ndarray) for ndarray in feature_batch]\n        output = self.predictor.run(inputs)\n        probilities_list = np.array(output[0].data.float_data())\n        probilities_list = probilities_list.reshape((-1, 2))\n        for i, probilities in 
enumerate(probilities_list):\n            label = self.label_map[np.argmax(probilities)]\n            result = {\n                'text': texts[i],\n                'sentiment_label': label,\n                'positive_probs': probilities[1],\n                'negative_probs': probilities[0]\n            }\n            results.append(result)\n\n        return results\n\n    def _convert_text_to_feature(self, text):\n        \"\"\"\n        Convert the raw text to feature which is needed to run program (feed_vars).\n        \"\"\"\n        text_a = convert_to_unicode(text)\n        tokens_a = self.tokenizer.tokenize(text_a)\n        max_seq_len = 512\n\n        # Account for [CLS] and [SEP] with \"- 2\"\n        if len(tokens_a) > max_seq_len - 2:\n            tokens_a = tokens_a[0:(max_seq_len - 2)]\n\n        tokens = []\n        text_type_ids = []\n        tokens.append(\"[CLS]\")\n        text_type_ids.append(0)\n        for token in tokens_a:\n            tokens.append(token)\n            text_type_ids.append(0)\n        tokens.append(\"[SEP]\")\n        text_type_ids.append(0)\n\n        token_ids = self.tokenizer.convert_tokens_to_ids(tokens)\n        position_ids = list(range(len(token_ids)))\n        task_ids = [0] * len(token_ids)\n\n        padded_token_ids, input_mask = pad_batch_data([token_ids],\n                                                      max_seq_len=max_seq_len,\n                                                      pad_idx=self.pad_id,\n                                                      return_input_mask=True)\n        padded_text_type_ids = pad_batch_data([text_type_ids], max_seq_len=max_seq_len, pad_idx=self.pad_id)\n        padded_position_ids = pad_batch_data([position_ids], max_seq_len=max_seq_len, pad_idx=self.pad_id)\n        padded_task_ids = pad_batch_data([task_ids], max_seq_len=max_seq_len, pad_idx=self.pad_id)\n\n        feature = [padded_token_ids, padded_position_ids, padded_text_type_ids, input_mask, padded_task_ids]\n     
   return feature\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the %s module.\" % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n        results = self.predict_sentiment(texts=[args.input_text], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help=\"whether use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options\n        \"\"\"\n        self.arg_input_group.add_argument('--input_text', type=str, default=None, help=\"data to be predicted\")\n\n\nif __name__ == '__main__':\n    test_module = ErnieSkepSentimentAnalysis()\n    test_texts = ['你不是不聪明，而是不认真', '虽然小明很努力，但是他还是没有考100分']\n    results = test_module.predict_sentiment(test_texts, use_gpu=False)\n    print(results)\n    test_module.context(max_seq_len=128)\n    print(test_module.get_embedding(texts=[['你不是不聪明，而是不认真']]))\n    print(test_module.get_params_layer())\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bilstm/README.md",
    "content": "# senta_bilstm\n|模型名称|senta_bilstm|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|BiLSTM|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|690MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和危机舆情监控，为企业提供有利的决策支持。该模型基于一个双向LSTM结构，情感类型分为积极、消极。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install senta_bilstm\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run senta_bilstm --input_text \"这家餐厅很好吃\"\n    ```\n    或者\n  - ```shell\n    $ hub run senta_bilstm --input_file test.txt\n    ```  \n    - test.txt 存放待预测文本， 如：\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_bilstm\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n    results = senta.sentiment_classify(texts=test_text,\n                                       use_gpu=False,\n                                       batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 这家餐厅很好吃 1 positive 0.9407 0.0593\n    # 这部电影真的很差劲 0 negative 0.02 0.98\n    ```\n\n- ### 3、API\n\n  - ```python\n    def sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - 
senta_bilstm预测接口，预测输入句子的情感分类(二分类，积极/消极）\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 预测数据，key必须为text，value是带预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 情感分类结果\n\n\n  - ```python\n    def get_labels()\n    ```\n    - 获取senta_bilstm的类别\n\n    - **返回**\n\n      - labels(dict): senta_bilstm的类别(二分类，积极/消极)\n\n  - ```python\n    def get_vocab_path()\n    ```\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线情感分析服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m senta_bilstm  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n  - ```shell\n    Loading senta_bilstm successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # 设置运行配置\n    # 对应本地预测senta_bilstm.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为senta_bilstm并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/senta_bilstm\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  词汇表升级\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub 
install senta_bilstm==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bilstm/README_en.md",
    "content": "# senta_bilstm\n\n|     Module Name      |  senta_bilstm  |\n|  :------------------ | :------------: |\n|       Category       | text-sentiment_analysis  |\n|         Network      |      BiLSTM      |\n|         Dataset      | Dataset built by Baidu |\n| Fine-tuning supported or not |      No       |\n|     Module Size      |       690M       |\n| Latest update date   |   2021-02-26   |\n|   Data indicators    |       -        |\n\n\n## I. Basic Information of Module\n\n- ### Module Introduction\n\n  - Sentiment Classification (Senta for short) can automatically judge the emotional polarity category of Chinese texts with subjective description and give corresponding confidence, which can help enterprises understand users' consumption habits, analyze hot topics and crisis public opinion monitoring, and provide favorable decision support for enterprises. The model is based on a bidirectional LSTM structure, with positive and negative emotion types.\n\n\n\n## II. Installation\n\n- ### 1、Environmental dependence\n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install senta_bilstm\n    ```\n  - If you have problems during installation, please refer to:[windows_quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [linux_quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [mac_quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## III. 
Module API and Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run senta_bilstm --input_text \"这家餐厅很好吃\"\n    ```\n    or\n  - ```shell\n    $ hub run senta_bilstm --input_file test.txt\n    ```  \n    - test.txt stores the text to be predicted, for example:\n\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n    - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command line instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_bilstm\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n    results = senta.sentiment_classify(texts=test_text,\n                                       use_gpu=False,\n                                       batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 这家餐厅很好吃 1 positive 0.9407 0.0593\n    # 这部电影真的很差劲 0 negative 0.02 0.98\n    ```\n\n- ### 3、API\n\n  - ```python\n    def sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - senta_bilstm predicting interfaces, predicting sentiment classification of input sentences (dichotomies, positive/negative)\n\n    - **Parameter**\n\n      - texts(list):  data to be predicted, if texts parameter is used, there is no need to pass in data parameter. You can use any of the two parameters.\n      - data(dict): predicted data , key must be text，value is data to be predicted. if data parameter is used, there is no need to pass in texts parameter. You can use any of the two parameters. It is suggested to use texts parameter, and data parameter will be discarded later.\n      - use_gpu(bool): use GPU or not. 
If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before prediction; otherwise, it need not be set.\n      - batch_size(int): batch size\n\n    - **Return**\n\n      - results(list): result of sentiment classification\n\n\n  - ```python\n    def get_labels()\n    ```\n    - get the category of senta_bilstm\n\n    - **Return**\n\n      - labels(dict): the category of senta_bilstm (binary classification, positive/negative)\n\n  - ```python\n    def get_vocab_path()\n    ```\n    - Get the vocabulary used for pre-training\n\n    - **Return**\n\n      - vocab_path(str): Vocabulary path\n\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online sentiment analysis service and you can use this interface for online Web applications.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m senta_bilstm  \n    ```\n\n  - The model loading process is displayed on startup. After the startup is successful, the following information is displayed:\n  - ```shell\n    Loading senta_bilstm successful.\n    ```\n\n  - The service API is now deployed and the default port number is 8866.\n\n   - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before starting the service; otherwise, it need not be set.\n\n- ### Step 2: Send a prediction request\n\n  - After configuring the server, the following lines of code can be used to send the prediction request and obtain the prediction result\n\n  - ```python\n    import requests\n    import json\n\n    # data to be predicted\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # Set the running configuration\n    # Corresponding to local prediction senta_bilstm.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # set the prediction method to senta_bilstm and send a POST request, content-type should be set to json\n    # HOST_IP is the IP address of the server\n    url = \"http://HOST_IP:8866/predict/senta_bilstm\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print prediction result\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - For more information about PaddleHub Serving, please refer to: [Serving Deployment](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Vocabulary upgrade\n\n* 1.1.0\n\n  Significantly improve predictive performance\n\n* 1.2.0\n\n  Model upgrade: support transfer learning for tasks such as text classification and text matching\n\n* 1.2.1\n\n  Remove fluid api\n\n  - ```shell\n    $ hub install senta_bilstm==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bilstm/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bilstm/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport json\nimport math\nimport os\n\nimport six\nfrom senta_bilstm.processor import load_vocab\nfrom senta_bilstm.processor import postprocess\nfrom senta_bilstm.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"senta_bilstm\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source Sentiment Classification System.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SentaBiLSTM(hub.NLPPredictionModule):\n\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets/vocab.txt\")\n        self.word_dict = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.sentiment_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def sentiment_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the sentiment prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             
batch_size(int): the number of texts processed in one batch\n\n        Returns:\n             results(list): the sentiment classification results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.word_dict, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"positive\": 1, \"negative\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bilstm/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport io\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        wid = 0\n        for line in f:\n            parts = line.rstrip().split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef preprocess(lac, texts, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    firstly, the predicted texts are segmented by lac module\n    then, the word segmention results input into senta\n    \"\"\"\n    result = []\n    input_dict = {'text': texts}\n    processed = lac.lexical_analysis(data=input_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = texts[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to sentiment label\n    \"\"\"\n    predict_out = predict_out.as_ndarray()\n    batch_size = len(texts)\n    result = []\n    for index in range(batch_size):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'negative'\n        else:\n            key = 'positive'\n        result_i['sentiment_label'] = label\n        result_i['sentiment_key'] = key\n        result_i['positive_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['negative_probs'] = float('%.4f' % (1 - predict_out[index, 1]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bow/README.md",
    "content": "# senta_bow\n\n|模型名称|senta_bow|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|BOW|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|637MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和危机舆情监控，为企业提供有利的决策支持。该模型基于一个BOW结构，情感类型分为积极、消极。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install senta_bow\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run senta_bow --input_text \"这家餐厅很好吃\"\n    ```\n  - 或者\n  - ```shell\n    $ hub run senta_bow --input_file test.txt\n    ```  \n    - test.txt 存放待预测文本， 如：\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_bow\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    results = senta.sentiment_classify(texts=test_text, use_gpu=False, batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 这家餐厅很好吃 1 positive 0.9782 0.0218\n    # 这部电影真的很差劲 0 negative 0.0124 0.9876\n    ```\n\n\n- ### 3、API\n\n  - ```python\n    def sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n    - senta_bow预测接口，预测输入句子的情感分类(二分类，积极/消极）\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - 
data(dict): 预测数据，key必须为text，value是带预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 情感分类结果\n\n\n  - ```python\n    def get_labels()\n    ```\n\n    - 获取senta_bow的类别\n\n    - **返回**\n\n      - labels(dict): senta_bow的类别(二分类，积极/消极)\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线情感分析服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m senta_bow  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n    ```shell\n    Loading senta_bow successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # 设置运行配置\n    # 对应本地预测senta_bow.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为senta_bow并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/senta_bow\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  词汇表升级\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install senta_bow==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bow/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bow/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport json\nimport math\nimport os\n\nimport six\nfrom senta_bow.processor import load_vocab\nfrom senta_bow.processor import postprocess\nfrom senta_bow.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"senta_bow\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source Sentiment Classification System.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SentaBow(hub.NLPPredictionModule):\n\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets/vocab.txt\")\n        self.word_dict = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.sentiment_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def sentiment_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the sentiment prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             batch_size(int): 
the number of texts processed in one batch\n\n        Returns:\n             results(list): the sentiment classification results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.word_dict, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"positive\": 1, \"negative\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_bow/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport io\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        wid = 0\n        for line in f:\n            parts = line.rstrip().split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef preprocess(lac, texts, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    firstly, the predicted texts are segmented by lac module\n    then, the word segmention results input into senta\n    \"\"\"\n    result = []\n    input_dict = {\"text\": texts}\n    processed = lac.lexical_analysis(data=input_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = texts[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to sentiment label\n    \"\"\"\n    predict_out = predict_out.as_ndarray()\n    batch_size = len(texts)\n    result = []\n    for index in range(batch_size):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'negative'\n        else:\n            key = 'positive'\n        result_i['sentiment_label'] = label\n        result_i['sentiment_key'] = key\n        result_i['positive_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['negative_probs'] = float('%.4f' % (1 - predict_out[index, 1]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_cnn/README.md",
    "content": "# senta_cnn\n|模型名称|senta_cnn|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|CNN|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|637MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和危机舆情监控，为企业提供有利的决策支持。该模型基于一个CNN结构，情感类型分为积极、消极。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install senta_cnn\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run senta_cnn --input_text \"这家餐厅很好吃\"\n    ```\n    或者\n  - ```shell\n    $ hub run senta_cnn --input_file test.txt\n    ```\n    - test.txt 存放待预测文本， 如：\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见：[PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_cnn\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n    results = senta.sentiment_classify(texts=test_text,\n                                       use_gpu=False,\n                                       batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 这家餐厅很好吃 1 positive 0.7902 0.2098\n    # 这部电影真的很差劲 0 negative 0.0343 0.9657\n    ```\n\n- ### 3、API\n\n  - ```python\n    def sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - senta_cnn预测接口，预测输入句子的情感分类(二分类，积极/消极）\n\n    - **参数**\n\n      - 
texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 预测数据，key必须为text，value是带预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 情感分类结果\n\n\n  - ```python\n    def get_labels()\n    ```\n\n    - 获取senta_cnn的类别\n\n    - **返回**\n\n      - labels(dict): senta_cnn的类别(二分类，积极/消极)\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线情感分析服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m senta_cnn  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n    ```shell\n    Loading senta_cnn successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # 设置运行配置\n    # 对应本地预测senta_cnn.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为senta_cnn并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/senta_cnn\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## 五、更新历史\n\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  词汇表升级\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install senta_cnn==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_cnn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/senta_cnn/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport json\nimport math\nimport os\n\nimport six\nfrom senta_cnn.processor import load_vocab\nfrom senta_cnn.processor import postprocess\nfrom senta_cnn.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"senta_cnn\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source Sentiment Classification System.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SentaCNN(hub.NLPPredictionModule):\n\n    def _initialize(self, user_dict=None):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets/vocab.txt\")\n        self.word_dict = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.sentiment_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def sentiment_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the sentiment prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             
batch_size(int): the number of texts processed in one batch\n\n        Returns:\n             results(list): the sentiment classification results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use GPU, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.word_dict, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"positive\": 1, \"negative\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_cnn/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport io\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        wid = 0\n        for line in f:\n            parts = line.rstrip().split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef preprocess(lac, texts, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    firstly, the predicted texts are segmented by lac module\n    then, the word segmention results input into senta\n    \"\"\"\n    result = []\n    input_dict = {'text': texts}\n    processed = lac.lexical_analysis(data=input_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = texts[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to sentiment label\n    \"\"\"\n    predict_out = predict_out.as_ndarray()\n    batch_size = len(texts)\n    result = []\n    for index in range(batch_size):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'negative'\n        else:\n            key = 'positive'\n        result_i['sentiment_label'] = label\n        result_i['sentiment_key'] = key\n        result_i['positive_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['negative_probs'] = float('%.4f' % (1 - predict_out[index, 1]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_gru/README.md",
    "content": "# senta_gru\n\n|模型名称|senta_gru|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|GRU|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|637MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和危机舆情监控，为企业提供有利的决策支持。该模型基于一个GRU结构，情感类型分为积极、消极。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install senta_gru\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run senta_gru --input_text \"今天天气真好\"\n    ```\n    或者\n  - ```shell\n    $ hub run senta_gru --input_file test.txt\n    ```\n    - test.txt 存放待预测文本， 如：\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_gru\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    results = senta.sentiment_classify(texts=test_text, use_gpu=False, batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n    # 这家餐厅很好吃 1 positive 0.9607 0.0393\n    # 这部电影真的很差劲 0 negative 0.0187 0.9813\n    ```\n\n- ### 3、API\n\n  - ```python\n    def sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n    - senta_gru预测接口，预测输入句子的情感分类(二分类，积极/消极）\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 
预测数据，key必须为text，value是带预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 情感分类结果\n\n\n  - ```python\n    def get_labels()\n    ```\n\n    - 获取senta_gru的类别\n\n    - **返回**\n\n      - labels(dict): senta_gru的类别(二分类，积极/消极)\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线情感分析服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m senta_gru\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n    ```shell\n    Loading senta_gru successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # 设置运行配置\n    # 对应本地预测senta_gru.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为senta_gru并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/senta_gru\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  词汇表升级\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install senta_gru==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_gru/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/senta_gru/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport json\nimport math\nimport os\n\nimport six\nfrom senta_gru.processor import load_vocab\nfrom senta_gru.processor import postprocess\nfrom senta_gru.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"senta_gru\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source Sentiment Classification System.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SentaGRU(hub.NLPPredictionModule):\n\n    def _initialize(self, user_dict=None):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets/vocab.txt\")\n        self.word_dict = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.sentiment_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def sentiment_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the sentiment prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             
batch_size(int): the number of texts processed in one batch\n\n        Returns:\n             results(list): the sentiment classification results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.word_dict, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used in pretraining\n        Returns:\n             self.labels(dict)\n     
   \"\"\"\n        self.labels = {\"positive\": 1, \"negative\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_gru/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport io\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for line in f:\n            parts = line.rstrip().split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef preprocess(lac, texts, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    first, the input texts are segmented by the lac module;\n    then, the word segmentation results are fed into senta\n    \"\"\"\n    result = []\n    input_dict = {'text': texts}\n    processed = lac.lexical_analysis(data=input_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = texts[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert the model's output tensor to sentiment labels\n    \"\"\"\n    predict_out = predict_out.as_ndarray()\n    batch_size = len(texts)\n    result = []\n    for index in range(batch_size):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'negative'\n        else:\n            key = 'positive'\n        result_i['sentiment_label'] = label\n        result_i['sentiment_key'] = key\n        result_i['positive_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['negative_probs'] = float('%.4f' % (1 - predict_out[index, 1]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_lstm/README.md",
    "content": "# senta_lstm\n|模型名称|senta_lstm|\n| :--- | :---: |\n|类别|文本-情感分析|\n|网络|LSTM|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|637MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 情感倾向分析（Sentiment Classification，简称Senta）针对带有主观描述的中文文本，可自动判断该文本的情感极性类别并给出相应的置信度，能够帮助企业理解用户消费习惯、分析热点话题和监控危机舆情，为企业提供有利的决策支持。该模型基于一个LSTM结构，情感类型分为积极、消极。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.8.0\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install senta_lstm\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run senta_lstm --input_text \"这家餐厅很好吃\"\n    ```\n    或者\n  - ```shell\n    $ hub run senta_lstm --input_file test.txt\n    ```\n    - test.txt 存放待预测文本， 如：\n      > 这家餐厅很好吃\n\n      > 这部电影真的很差劲\n\n  - 通过命令行方式实现情感分析模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    senta = hub.Module(name=\"senta_lstm\")\n    test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n    results = senta.sentiment_classify(texts=test_text,\n                                       use_gpu=False,\n                                       batch_size=1)\n\n    for result in results:\n        print(result['text'])\n        print(result['sentiment_label'])\n        print(result['sentiment_key'])\n        print(result['positive_probs'])\n        print(result['negative_probs'])\n\n    # 这家餐厅很好吃 1 positive 0.9285 0.0715\n    # 这部电影真的很差劲 0 negative 0.0187 0.9813\n    ```\n\n- ### 3、API\n\n  - ```python\n    sentiment_classify(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - senta_lstm预测接口，预测输入句子的情感分类（二分类，积极/消极）\n\n    - **参数**\n\n    
  - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 预测数据，key必须为text，value是待预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测。如使用GPU预测，则需在预测之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 情感分类结果\n\n\n  - ```python\n    get_labels()\n    ```\n    - 获取senta_lstm的类别\n\n    - **返回**\n\n      - labels(dict): senta_lstm的类别(二分类，积极/消极)\n\n  - ```python\n    get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线情感分析服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m senta_lstm  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n  - ```shell\n    Loading senta_lstm successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n\n    # 设置运行配置\n    # 对应本地预测senta_lstm.sentiment_classify(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为senta_lstm并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/senta_lstm\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  词汇表升级\n\n* 1.1.0\n\n  大幅提升预测性能\n\n* 1.2.0\n\n  模型升级，支持用于文本分类，文本匹配等各种任务迁移学习\n\n* 1.2.1\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install senta_lstm==1.2.1\n    ```\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_lstm/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/sentiment_analysis/senta_lstm/module.py",
    "content": "# -*- coding:utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport json\nimport math\nimport os\n\nimport six\nfrom senta_lstm.processor import load_vocab\nfrom senta_lstm.processor import postprocess\nfrom senta_lstm.processor import preprocess\n\nimport paddlehub as hub\nfrom paddlehub.common.paddle_helper import add_vars_prefix\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"senta_lstm\",\n            version=\"1.2.1\",\n            summary=\"Baidu's open-source Sentiment Classification System.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass SentaLSTM(hub.NLPPredictionModule):\n\n    def _initialize(self, user_dict=None):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.word_dict = load_vocab(self.vocab_path)\n        self._word_seg_module = None\n\n        self.predict = self.sentiment_classify\n\n        self._set_config()\n\n    @property\n    def word_seg_module(self):\n        \"\"\"\n        lac module\n        \"\"\"\n        if not self._word_seg_module:\n            self._word_seg_module = hub.Module(name=\"lac\")\n        return self._word_seg_module\n\n    @serving\n    def sentiment_classify(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the sentiment prediction results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, used when data is not set\n             data(dict): key must be 'text', value is the texts to be predicted, used when texts is not set\n             use_gpu(bool): whether to use gpu for prediction or not\n      
       batch_size(int): the number of texts processed in one batch\n\n        Returns:\n             results(list): the sentiment classification results\n        \"\"\"\n        if use_gpu:\n            try:\n                _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n                int(_places[0])\n            except Exception:\n                raise RuntimeError(\n                    \"Environment Variable CUDA_VISIBLE_DEVICES is not set correctly. If you want to use gpu, please set CUDA_VISIBLE_DEVICES as cuda_device_id.\"\n                )\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(self.word_seg_module, batch_data, self.word_dict, use_gpu, batch_size)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which were used in pretraining\n        Returns:\n             
self.labels(dict)\n        \"\"\"\n        self.labels = {\"positive\": 1, \"negative\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/sentiment_analysis/senta_lstm/processor.py",
    "content": "# -*- coding:utf-8 -*-\nimport io\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for line in f:\n            parts = line.rstrip().split('\\t')\n            vocab[parts[0]] = int(parts[1])\n    vocab[\"<unk>\"] = len(vocab)\n    return vocab\n\n\ndef preprocess(lac, texts, word_dict, use_gpu=False, batch_size=1):\n    \"\"\"\n    first, the input texts are segmented by the lac module;\n    then, the word segmentation results are fed into senta\n    \"\"\"\n    result = []\n    input_dict = {'text': texts}\n    processed = lac.lexical_analysis(data=input_dict, use_gpu=use_gpu, batch_size=batch_size)\n    unk_id = word_dict[\"<unk>\"]\n    for index, data in enumerate(processed):\n        result_i = {'processed': []}\n        result_i['origin'] = texts[index]\n        for word in data['word']:\n            if word in word_dict:\n                _index = word_dict[word]\n            else:\n                _index = unk_id\n            result_i['processed'].append(_index)\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert the model's output tensor to sentiment labels\n    \"\"\"\n    predict_out = predict_out.as_ndarray()\n    batch_size = len(texts)\n    result = []\n    for index in range(batch_size):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'negative'\n        else:\n            key = 'positive'\n        result_i['sentiment_label'] = label\n        result_i['sentiment_key'] = key\n        result_i['positive_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['negative_probs'] = float('%.4f' % (1 - predict_out[index, 1]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/README.md",
    "content": "# transformer_nist_wait_1\n|模型名称|transformer_nist_wait_1|\n| :--- | :---: | \n|类别|同声传译|\n|网络|transformer|\n|数据集|NIST 2008-中英翻译数据集|\n|是否支持Fine-tuning|否|\n|模型大小|377MB|\n|最新更新日期|2021-09-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 同声传译（Simultaneous Translation），即在句子完成之前进行翻译，其目标是实现同声传译的自动化：翻译与源语言输入同步进行，延迟时间只有几秒钟。\n    STACL 是论文 [STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency using Prefix-to-Prefix Framework](https://www.aclweb.org/anthology/P19-1289/) 中针对同传提出的适用于所有同传场景的翻译架构。\n    - STACL 主要具有以下优势：\n\n    - Prefix-to-Prefix架构拥有预测能力，即在未看到源词的情况下仍然可以翻译出对应的目标词，克服了SOV→SVO等词序差异\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133761990-13e55d0f-5c3a-476c-8865-5808d13cba97.png\"> <br />\n    </p>\n     与传统机器翻译模型的主要区别在于翻译时是否需要利用全句的源句。上图中，Seq2Seq模型需要等到全句的源句（1-5）全部输入Encoder后，Decoder才开始解码进行翻译；而STACL架构采用了Wait-k（图中Wait-2）的策略，当源句只有两个词（1和2）输入到Encoder后，Decoder即可开始解码预测目标句的第一个词。\n\n    - Wait-k策略可以不需要全句的源句，直接预测目标句，可以实现任意的字级延迟，同时保持较高的翻译质量。\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133762098-6ea6f3ca-0d70-4a0a-981d-0fcc6f3cd96b.png\"> <br />\n    </p>\n     Wait-k策略首先等待源句单词，然后与源句的其余部分同时翻译，即输出总是滞后于输入。这是受到同声传译人员的启发，同声传译人员通常会在几秒钟内开始翻译演讲者的演讲，在演讲者结束几秒钟后完成。例如，如果k=2，第一个目标词使用前2个源词预测，第二个目标词使用前3个源词预测，以此类推。上图中，(a)simultaneous: our wait-2 等到\"布什\"和\"总统\"输入后就开始解码预测\"pres.\"，而(b) non-simultaneous baseline 为传统的翻译模型，需要等到整句\"布什 总统 在 莫斯科 与 普京 会晤\"才开始解码预测。\n    \n  - 该PaddleHub Module基于transformer网络结构，采用wait-1策略进行中文到英文的翻译。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_nist_wait_1\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name=\"transformer_nist_wait_1\")\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = model.translate(t)\n        print(\"model output: {}\\n\".format(result))\n\n    # input: 他\n    # model output: he\n    #\n    # input: 他还\n    # model output: he also\n    #\n    # input: 他还说\n    # model output: he also said\n    #\n    # input: 他还说现在\n    # model output: he also said that\n    #\n    # input: 他还说现在正在\n    # model output: he also said that he\n    #\n    # input: 他还说现在正在为\n    # model output: he also said that he is\n    #\n    # input: 他还说现在正在为这\n    # model output: he also said that he is making\n    #\n    # input: 他还说现在正在为这一\n    # model output: he also said that he is making preparations\n    #\n    # input: 他还说现在正在为这一会议\n    # model output: he also said that he is making preparations for\n    #\n    # input: 他还说现在正在为这一会议作出\n    # model output: he also said that he is making preparations for this\n    #\n    # input: 他还说现在正在为这一会议作出安排\n    # model output: he also said that he is making preparations for this meeting\n    #\n    # input: 他还说现在正在为这一会议作出安排。\n    # model output: he also said that he is making preparations for this meeting .\n    ```\n\n- ### 2、 API\n\n    - ```python\n      __init__(max_length=256, max_out_len=256)\n      ```\n\n        - 初始化module， 可配置模型的输入文本的最大长度\n\n        - **参数**\n\n            - max_length(int): 输入文本的最大长度，默认值为256。\n            - max_out_len(int): 
输出文本的最大解码长度，超过最大解码长度时会截断句子的后半部分，默认值为256。\n\n    - ```python\n      translate(text, use_gpu=False)\n      ```\n\n        - 预测API，输入源语言的文本（模拟同传语音输入），解码后输出翻译后的目标语言文本。\n\n        - **参数**\n\n            - text(str): 输入源语言的文本，数据类型为str\n            - use_gpu(bool): 是否使用gpu进行预测，默认为False\n\n        - **返回**\n\n            - result(str): 翻译后的目标语言文本。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线同声传译服务(需要用户配置一个语音转文本应用预先将语音输入转为中文文字)，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_nist_wait_1\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading transformer_nist_wait_1 successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    # 指定预测方法为transformer_nist_wait_1并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/transformer_nist_wait_1\"\n    headers = {\"Content-Type\": \"application/json\"}\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = requests.post(url=url, headers=headers, data=json.dumps({\"text\": t}))\n        # 打印预测结果\n        print(\"model output: {}\\n\".format(result.json()['results']))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n    初始发布\n    ```shell\n    hub install transformer_nist_wait_1==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\n\n\nclass DecoderLayer(nn.TransformerDecoderLayer):\n    def __init__(self, *args, **kwargs):\n        super(DecoderLayer, self).__init__(*args, **kwargs)\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm1(tgt)\n        if cache is None:\n            tgt = self.self_attn(tgt, tgt, tgt, tgt_mask, None)\n        else:\n            tgt, incremental_cache = self.self_attn(tgt, tgt, tgt, tgt_mask,\n                                                    cache[0])\n        tgt = residual + self.dropout1(tgt)\n        if not self.normalize_before:\n            tgt = self.norm1(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm2(tgt)\n        if len(memory) == 1:\n            # Full sent\n            tgt = self.cross_attn(tgt, memory[0], memory[0], memory_mask, None)\n        else:\n            # Wait-k policy\n            cross_attn_outputs = []\n            for i in range(tgt.shape[1]):\n                q = tgt[:, i:i + 1, :]\n                if i >= len(memory):\n                    e = memory[-1]\n                
else:\n                    e = memory[i]\n                cross_attn_outputs.append(\n                    self.cross_attn(q, e, e, memory_mask[:, :, i:i + 1, :\n                                                         e.shape[1]], None))\n            tgt = paddle.concat(cross_attn_outputs, axis=1)\n        tgt = residual + self.dropout2(tgt)\n        if not self.normalize_before:\n            tgt = self.norm2(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm3(tgt)\n        tgt = self.linear2(self.dropout(self.activation(self.linear1(tgt))))\n        tgt = residual + self.dropout3(tgt)\n        if not self.normalize_before:\n            tgt = self.norm3(tgt)\n        return tgt if cache is None else (tgt, (incremental_cache, ))\n\n\nclass Decoder(nn.TransformerDecoder):\n    \"\"\"\n    PaddlePaddle 2.1 casts memory_mask.dtype to memory.dtype, but in STACL,\n    type of memory is list, having no dtype attribute.\n    \"\"\"\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        output = tgt\n        new_caches = []\n        for i, mod in enumerate(self.layers):\n            if cache is None:\n                output = mod(output,\n                             memory,\n                             tgt_mask=tgt_mask,\n                             memory_mask=memory_mask,\n                             cache=None)\n            else:\n                output, new_cache = mod(output,\n                                        memory,\n                                        tgt_mask=tgt_mask,\n                                        memory_mask=memory_mask,\n                                        cache=cache[i])\n                new_caches.append(new_cache)\n\n        if self.norm is not None:\n            output = self.norm(output)\n\n        return output if cache is None else (output, new_caches)\n\n\nclass SimultaneousTransformer(nn.Layer):\n    \"\"\"\n    model\n    \"\"\"\n    def 
__init__(self,\n                 src_vocab_size,\n                 trg_vocab_size,\n                 max_length=256,\n                 n_layer=6,\n                 n_head=8,\n                 d_model=512,\n                 d_inner_hid=2048,\n                 dropout=0.1,\n                 weight_sharing=False,\n                 bos_id=0,\n                 eos_id=1,\n                 waitk=-1):\n        super(SimultaneousTransformer, self).__init__()\n        self.trg_vocab_size = trg_vocab_size\n        self.emb_dim = d_model\n        self.bos_id = bos_id\n        self.eos_id = eos_id\n        self.dropout = dropout\n        self.waitk = waitk\n        self.n_layer = n_layer\n        self.n_head = n_head\n        self.d_model = d_model\n\n        self.src_word_embedding = WordEmbedding(\n            vocab_size=src_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n        self.src_pos_embedding = PositionalEmbedding(\n            emb_dim=d_model, max_length=max_length+1)\n        if weight_sharing:\n            assert src_vocab_size == trg_vocab_size, (\n                \"Vocabularies in source and target should be same for weight sharing.\"\n            )\n            self.trg_word_embedding = self.src_word_embedding\n            self.trg_pos_embedding = self.src_pos_embedding\n        else:\n            self.trg_word_embedding = WordEmbedding(\n                vocab_size=trg_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n            self.trg_pos_embedding = PositionalEmbedding(\n                emb_dim=d_model, max_length=max_length+1)\n\n        encoder_layer = nn.TransformerEncoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, True])\n        encoder_norm = nn.LayerNorm(d_model)\n        self.encoder = nn.TransformerEncoder(\n            encoder_layer=encoder_layer, 
num_layers=n_layer, norm=encoder_norm)\n\n        decoder_layer = DecoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, False, True])\n        decoder_norm = nn.LayerNorm(d_model)\n        self.decoder = Decoder(\n            decoder_layer=decoder_layer, num_layers=n_layer, norm=decoder_norm)\n\n        if weight_sharing:\n            self.linear = lambda x: paddle.matmul(\n                x=x, y=self.trg_word_embedding.word_embedding.weight, transpose_y=True)\n        else:\n            self.linear = nn.Linear(\n                in_features=d_model,\n                out_features=trg_vocab_size,\n                bias_attr=False)\n\n    def forward(self, src_word, trg_word):\n        src_max_len = paddle.shape(src_word)[-1]\n        trg_max_len = paddle.shape(trg_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_slf_attn_bias = paddle.tensor.triu(\n            (paddle.ones(\n                (trg_max_len, trg_max_len),\n                dtype=paddle.get_default_dtype()) * -np.inf),\n            1)\n        trg_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, trg_max_len, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        trg_pos = paddle.cast(\n            trg_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=trg_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        
enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        with paddle.static.amp.fp16_guard():\n            if self.waitk >= src_max_len or self.waitk == -1:\n                # Full sentence\n                enc_outputs = [\n                    self.encoder(\n                        enc_input, src_mask=src_slf_attn_bias)\n                ]\n            else:\n                # Wait-k policy\n                enc_outputs = []\n                for i in range(self.waitk, src_max_len + 1):\n                    enc_output = self.encoder(\n                        enc_input[:, :i, :],\n                        src_mask=src_slf_attn_bias[:, :, :, :i])\n                    enc_outputs.append(enc_output)\n\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n            dec_output = self.decoder(\n                dec_input,\n                enc_outputs,\n                tgt_mask=trg_slf_attn_bias,\n                memory_mask=trg_src_attn_bias)\n\n            predict = self.linear(dec_output)\n\n        return predict\n\n    def beam_search(self, src_word, beam_size=4, max_len=256, waitk=-1):\n        # TODO: \"Speculative Beam Search for Simultaneous Translation\"\n        raise NotImplementedError\n\n    def greedy_search(self,\n                      src_word,\n                      max_len=256,\n                      waitk=-1,\n                      caches=None,\n                      bos_id=None):\n        \"\"\"\n        greedy_search uses streaming reader. 
It doesn't need to call the\n        encoder many times; a sub-sentence only needs to call the encoder once.\n        So it needs the previous states (caches) and the id of the last\n        token generated in the previous step.\n        \"\"\"\n        src_max_len = paddle.shape(src_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, 1, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        enc_outputs = [self.encoder(enc_input, src_mask=src_slf_attn_bias)]\n\n        # constant number\n        batch_size = enc_outputs[-1].shape[0]\n        max_len = (\n            enc_outputs[-1].shape[1] + 20) if max_len is None else max_len\n        end_token_tensor = paddle.full(\n            shape=[batch_size, 1], fill_value=self.eos_id, dtype=\"int64\")\n\n        predict_ids = []\n        log_probs = paddle.full(\n            shape=[batch_size, 1], fill_value=0, dtype=\"float32\")\n        if not bos_id:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=self.bos_id, dtype=\"int64\")\n        else:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=bos_id, dtype=\"int64\")\n\n        # init states (caches) for transformer\n        if not caches:\n            caches = self.decoder.gen_cache(enc_outputs[-1], do_zip=False)\n\n        for i in range(max_len):\n            trg_pos = 
paddle.full(\n                shape=trg_word.shape, fill_value=i, dtype=\"int64\")\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n\n            if waitk < 0 or i >= len(enc_outputs):\n                # Full-sentence decoding, or the decoder step has moved past\n                # all encoded source prefixes: attend to the whole source.\n                _e = enc_outputs[-1]\n            else:\n                _e = enc_outputs[i]\n            dec_output, caches = self.decoder(\n                dec_input, [_e], None,\n                trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n\n            dec_output = paddle.reshape(\n                dec_output, shape=[-1, dec_output.shape[-1]])\n\n            logits = self.linear(dec_output)\n            step_log_probs = paddle.log(F.softmax(logits, axis=-1))\n            log_probs = paddle.add(x=step_log_probs, y=log_probs)\n            scores = log_probs\n            topk_scores, topk_indices = paddle.topk(x=scores, k=1)\n\n            finished = paddle.equal(topk_indices, end_token_tensor)\n            trg_word = topk_indices\n            log_probs = topk_scores\n\n            predict_ids.append(topk_indices)\n\n            if paddle.all(finished).numpy():\n                break\n\n        predict_ids = paddle.stack(predict_ids, axis=0)\n        finished_seq = paddle.transpose(predict_ids, [1, 2, 0])\n        finished_scores = topk_scores\n\n        return finished_seq, finished_scores, caches"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport jieba\nimport paddle\nfrom paddlenlp.transformers import position_encoding_init\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom transformer_nist_wait_1.model import SimultaneousTransformer\nfrom transformer_nist_wait_1.processor import STACLTokenizer, predict\n\n\n@moduleinfo(\n    name=\"transformer_nist_wait_1\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/simultaneous_translation\",\n)\nclass STTransformer():\n    \"\"\"\n    Transformer model for simultaneous translation.\n    \"\"\"\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"n_layer\": 6,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n    }\n\n    def __init__(self, \n                 max_length=256,\n                 max_out_len=256,\n                 ):\n        super(STTransformer, self).__init__()\n        
bpe_codes_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_1\", \"assets\", \"2M.zh2en.dict4bpe.zh\")\n        src_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_1\", \"assets\", \"nist.20k.zh.vocab\")\n        trg_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_1\", \"assets\", \"nist.10k.en.vocab\")\n        params_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_1\", \"assets\", \"transformer.pdparams\")\n        self.max_length = max_length\n        self.max_out_len = max_out_len\n        self.tokenizer = STACLTokenizer(\n            bpe_codes_fpath,\n            src_vocab_fpath,\n            trg_vocab_fpath,\n        )\n        src_vocab_size = self.tokenizer.src_vocab_size\n        trg_vocab_size = self.tokenizer.trg_vocab_size\n        self.transformer = SimultaneousTransformer(\n            src_vocab_size,\n            trg_vocab_size,\n            max_length=self.max_length,\n            n_layer=self.model_config['n_layer'],\n            n_head=self.model_config['n_head'],\n            d_model=self.model_config['d_model'],\n        )\n        model_dict = paddle.load(params_fpath)\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        model_dict[\"src_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        model_dict[\"trg_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        self.transformer.load_dict(model_dict)\n\n    @serving\n    def translate(self, text, use_gpu=False):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        # Word segmentation\n        text = ' '.join(jieba.cut(text))\n        # For decoding max length\n        decoder_max_length = 1\n        # For decoding cache\n        cache = None\n        # For decoding start token id\n        
bos_id = None\n        # Index of the current source token\n        i = 0\n        # Whether the sentence is complete; if True, decode up to max_out_len\n        is_last = False\n        # Token ids of the source prefix read so far\n        user_input_tokenized = []\n        # Store the translation\n        result = []\n\n        bpe_str, tokenized_src = self.tokenizer.tokenize(text)\n        while i < len(tokenized_src):\n            user_input_tokenized.append(tokenized_src[i])\n            if bpe_str[i] in ['。', '？', '！']:\n                is_last = True\n            result, cache, bos_id = predict(\n                user_input_tokenized,\n                decoder_max_length,\n                is_last,\n                cache,\n                bos_id,\n                result,\n                self.tokenizer,\n                self.transformer,\n                max_out_len=self.max_out_len)\n            i += 1\n        return \" \".join(result)"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nfrom paddlenlp.data import Vocab\nfrom subword_nmt import subword_nmt\n\n\nclass STACLTokenizer:\n    \"\"\"\n    Jieba+BPE, and convert tokens to ids.\n    \"\"\"\n\n    def __init__(self,\n                 bpe_codes_fpath,\n                 src_vocab_fpath,\n                 trg_vocab_fpath,\n                 special_token=[\"<s>\", \"<e>\", \"<unk>\"]):\n        bpe_parser = subword_nmt.create_apply_bpe_parser()\n        bpe_args = bpe_parser.parse_args(args=['-c', bpe_codes_fpath])\n        bpe_args.codes.close()\n        bpe_args.codes = open(bpe_codes_fpath, 'r', encoding='utf-8')\n        self.bpe = subword_nmt.BPE(bpe_args.codes, bpe_args.merges,\n                                   bpe_args.separator, None,\n                                   bpe_args.glossaries)\n\n        self.src_vocab = Vocab.load_vocabulary(\n            src_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.trg_vocab = Vocab.load_vocabulary(\n            trg_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.src_vocab_size = len(self.src_vocab)\n        self.trg_vocab_size = len(self.trg_vocab)\n\n    def tokenize(self, text):\n        
bpe_str = self.bpe.process_line(text)\n        tokens = bpe_str.split()\n        ids = self.src_vocab.to_indices(tokens)\n        return tokens, ids\n\n\ndef post_process_seq(seq,\n                     bos_idx=0,\n                     eos_idx=1,\n                     output_bos=False,\n                     output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence: truncate it at the first <eos> and\n    drop <bos>/<eos> tokens unless they are explicitly requested.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [\n        idx for idx in seq[:eos_pos + 1]\n        if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)\n    ]\n    return seq\n\n\ndef predict(tokenized_src,\n            decoder_max_length,\n            is_last,\n            cache,\n            bos_id,\n            result,\n            tokenizer,\n            transformer,\n            n_best=1,\n            max_out_len=256,\n            eos_idx=1,\n            waitk=1):\n    # Set evaluate mode\n    transformer.eval()\n\n    if len(tokenized_src) < waitk:\n        return result, cache, bos_id\n\n    with paddle.no_grad():\n        paddle.disable_static()\n        # Copy so that appending <eos> does not mutate the caller's token list\n        input_src = list(tokenized_src)\n        if is_last:\n            decoder_max_length = max_out_len\n            input_src += [eos_idx]\n        src_word = paddle.to_tensor(input_src).unsqueeze(axis=0)\n        finished_seq, finished_scores, cache = transformer.greedy_search(\n            src_word,\n            max_len=decoder_max_length,\n            waitk=waitk,\n            caches=cache,\n            bos_id=bos_id)\n        finished_seq = finished_seq.numpy()\n        for beam_idx, beam in enumerate(finished_seq[0]):\n            if beam_idx >= n_best:\n                break\n            id_list = post_process_seq(beam)\n            if len(id_list) == 0:\n                continue\n            bos_id = id_list[-1]\n            word_list = tokenizer.trg_vocab.to_tokens(id_list)\n            
result.extend(word_list)\n        paddle.enable_static()\n    return result, cache, bos_id"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_1/requirements.txt",
    "content": "jieba==0.42.1\nsubword-nmt==0.3.7\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/README.md",
    "content": "# transformer_nist_wait_3\n|模型名称|transformer_nist_wait_3|\n| :--- | :---: | \n|类别|同声传译|\n|网络|transformer|\n|数据集|NIST 2008-中英翻译数据集|\n|是否支持Fine-tuning|否|\n|模型大小|377MB|\n|最新更新日期|2021-09-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 同声传译（Simultaneous Translation），即在句子完成之前进行翻译，其目标是实现自动化的同声传译：系统可以与源语言输入同步进行翻译，延迟仅为几秒钟。\n    STACL 是论文 [STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency using Prefix-to-Prefix Framework](https://www.aclweb.org/anthology/P19-1289/) 中针对同传提出的适用于所有同传场景的翻译架构。\n    - STACL 主要具有以下优势：\n\n    - Prefix-to-Prefix架构拥有预测能力，即在未看到源词的情况下仍然可以翻译出对应的目标词，克服了SOV→SVO等词序差异\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133761990-13e55d0f-5c3a-476c-8865-5808d13cba97.png\"> <br />\n    </p>\n     STACL和传统的机器翻译模型主要的区别在于翻译时是否需要利用全句的源句。上图中，Seq2Seq模型需要等到全句的源句（1-5）全部输入Encoder后，Decoder才开始解码进行翻译；而STACL架构采用了Wait-k（图中Wait-2）的策略，当源句只有两个词（1和2）输入到Encoder后，Decoder即可开始解码预测目标句的第一个词。\n\n    - Wait-k策略可以不需要全句的源句，直接预测目标句，可以实现任意的字级延迟，同时保持较高的翻译质量。\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133762098-6ea6f3ca-0d70-4a0a-981d-0fcc6f3cd96b.png\"> <br />\n    </p>\n     Wait-k策略首先等待源句单词，然后与源句的其余部分同时翻译，即输出总是落后于输入。这是受到同声传译人员的启发，同声传译人员通常会在几秒钟内开始翻译演讲者的演讲，在演讲者结束几秒钟后完成。例如，如果k=2，第一个目标词使用前2个源词预测，第二个目标词使用前3个源词预测，以此类推。上图中，(a)simultaneous: our wait-2 等到\"布什\"和\"总统\"输入后就开始解码预测\"pres.\"，而(b) non-simultaneous baseline 为传统的翻译模型，需要等到整句\"布什 总统 在 莫斯科 与 普京 会晤\"才开始解码预测。\n    \n  - 该PaddleHub Module基于transformer网络结构，采用wait-3策略进行中文到英文的翻译。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_nist_wait_3\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name=\"transformer_nist_wait_3\")\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = model.translate(t)\n        print(\"model output: {}\\n\".format(result))\n\n    # input: 他\n    # model output: \n    #\n    # input: 他还\n    # model output: \n    #\n    # input: 他还说\n    # model output: he\n    #\n    # input: 他还说现在\n    # model output: he also\n    #\n    # input: 他还说现在正在\n    # model output: he also said\n    #\n    # input: 他还说现在正在为\n    # model output: he also said that\n    #\n    # input: 他还说现在正在为这\n    # model output: he also said that he\n    #\n    # input: 他还说现在正在为这一\n    # model output: he also said that he is\n    #\n    # input: 他还说现在正在为这一会议\n    # model output: he also said that he is making\n    #\n    # input: 他还说现在正在为这一会议作出\n    # model output: he also said that he is making preparations\n    #\n    # input: 他还说现在正在为这一会议作出安排\n    # model output: he also said that he is making preparations for\n    #\n    # input: 他还说现在正在为这一会议作出安排。\n    # model output: he also said that he is making preparations for this meeting .\n    ```\n\n- ### 2、 API\n\n    - ```python\n      __init__(max_length=256, max_out_len=256)\n      ```\n\n        - 初始化module， 可配置模型的输入文本的最大长度\n\n        - **参数**\n\n            - max_length(int): 输入文本的最大长度，默认值为256。\n            - max_out_len(int): 输出文本的最大解码长度，超过最大解码长度时会截断句子的后半部分，默认值为256。\n\n    - ```python\n      translate(text, use_gpu=False)\n      ```\n\n        - 
预测API，输入源语言的文本（模拟同传语音输入），解码后输出翻译后的目标语言文本。\n\n        - **参数**\n\n            - text(str): 输入源语言的文本，数据类型为str\n            - use_gpu(bool): 是否使用gpu进行预测，默认为False\n\n        - **返回**\n\n            - result(str): 翻译后的目标语言文本。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线同声传译服务(需要用户配置一个语音转文本应用预先将语音输入转为中文文字)，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_nist_wait_3\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading transformer_nist_wait_3 successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    # 指定预测方法为transformer_nist_wait_3并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/transformer_nist_wait_3\"\n    headers = {\"Content-Type\": \"application/json\"}\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = requests.post(url=url, headers=headers, data=json.dumps({\"text\": t}))\n        # 打印预测结果\n        print(\"model output: {}\\n\".format(result.json()['results']))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n    初始发布\n    ```shell\n    hub install transformer_nist_wait_3==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\n\n\nclass DecoderLayer(nn.TransformerDecoderLayer):\n    def __init__(self, *args, **kwargs):\n        super(DecoderLayer, self).__init__(*args, **kwargs)\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm1(tgt)\n        if cache is None:\n            tgt = self.self_attn(tgt, tgt, tgt, tgt_mask, None)\n        else:\n            tgt, incremental_cache = self.self_attn(tgt, tgt, tgt, tgt_mask,\n                                                    cache[0])\n        tgt = residual + self.dropout1(tgt)\n        if not self.normalize_before:\n            tgt = self.norm1(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm2(tgt)\n        if len(memory) == 1:\n            # Full sent\n            tgt = self.cross_attn(tgt, memory[0], memory[0], memory_mask, None)\n        else:\n            # Wait-k policy\n            cross_attn_outputs = []\n            for i in range(tgt.shape[1]):\n                q = tgt[:, i:i + 1, :]\n                if i >= len(memory):\n                    e = memory[-1]\n                
else:\n                    e = memory[i]\n                cross_attn_outputs.append(\n                    self.cross_attn(q, e, e, memory_mask[:, :, i:i + 1, :\n                                                         e.shape[1]], None))\n            tgt = paddle.concat(cross_attn_outputs, axis=1)\n        tgt = residual + self.dropout2(tgt)\n        if not self.normalize_before:\n            tgt = self.norm2(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm3(tgt)\n        tgt = self.linear2(self.dropout(self.activation(self.linear1(tgt))))\n        tgt = residual + self.dropout3(tgt)\n        if not self.normalize_before:\n            tgt = self.norm3(tgt)\n        return tgt if cache is None else (tgt, (incremental_cache, ))\n\n\nclass Decoder(nn.TransformerDecoder):\n    \"\"\"\n    PaddlePaddle 2.1 casts memory_mask.dtype to memory.dtype, but in STACL\n    the memory is a list, which has no dtype attribute.\n    \"\"\"\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        output = tgt\n        new_caches = []\n        for i, mod in enumerate(self.layers):\n            if cache is None:\n                output = mod(output,\n                             memory,\n                             tgt_mask=tgt_mask,\n                             memory_mask=memory_mask,\n                             cache=None)\n            else:\n                output, new_cache = mod(output,\n                                        memory,\n                                        tgt_mask=tgt_mask,\n                                        memory_mask=memory_mask,\n                                        cache=cache[i])\n                new_caches.append(new_cache)\n\n        if self.norm is not None:\n            output = self.norm(output)\n\n        return output if cache is None else (output, new_caches)\n\n\nclass SimultaneousTransformer(nn.Layer):\n    \"\"\"\n    Simultaneous Transformer network supporting the wait-k policy for\n    simultaneous translation.\n    \"\"\"\n    def 
__init__(self,\n                 src_vocab_size,\n                 trg_vocab_size,\n                 max_length=256,\n                 n_layer=6,\n                 n_head=8,\n                 d_model=512,\n                 d_inner_hid=2048,\n                 dropout=0.1,\n                 weight_sharing=False,\n                 bos_id=0,\n                 eos_id=1,\n                 waitk=-1):\n        super(SimultaneousTransformer, self).__init__()\n        self.trg_vocab_size = trg_vocab_size\n        self.emb_dim = d_model\n        self.bos_id = bos_id\n        self.eos_id = eos_id\n        self.dropout = dropout\n        self.waitk = waitk\n        self.n_layer = n_layer\n        self.n_head = n_head\n        self.d_model = d_model\n\n        self.src_word_embedding = WordEmbedding(\n            vocab_size=src_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n        self.src_pos_embedding = PositionalEmbedding(\n            emb_dim=d_model, max_length=max_length+1)\n        if weight_sharing:\n            assert src_vocab_size == trg_vocab_size, (\n                \"Vocabularies in source and target should be same for weight sharing.\"\n            )\n            self.trg_word_embedding = self.src_word_embedding\n            self.trg_pos_embedding = self.src_pos_embedding\n        else:\n            self.trg_word_embedding = WordEmbedding(\n                vocab_size=trg_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n            self.trg_pos_embedding = PositionalEmbedding(\n                emb_dim=d_model, max_length=max_length+1)\n\n        encoder_layer = nn.TransformerEncoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, True])\n        encoder_norm = nn.LayerNorm(d_model)\n        self.encoder = nn.TransformerEncoder(\n            encoder_layer=encoder_layer, 
num_layers=n_layer, norm=encoder_norm)\n\n        decoder_layer = DecoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, False, True])\n        decoder_norm = nn.LayerNorm(d_model)\n        self.decoder = Decoder(\n            decoder_layer=decoder_layer, num_layers=n_layer, norm=decoder_norm)\n\n        if weight_sharing:\n            self.linear = lambda x: paddle.matmul(\n                x=x, y=self.trg_word_embedding.word_embedding.weight, transpose_y=True)\n        else:\n            self.linear = nn.Linear(\n                in_features=d_model,\n                out_features=trg_vocab_size,\n                bias_attr=False)\n\n    def forward(self, src_word, trg_word):\n        src_max_len = paddle.shape(src_word)[-1]\n        trg_max_len = paddle.shape(trg_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_slf_attn_bias = paddle.tensor.triu(\n            (paddle.ones(\n                (trg_max_len, trg_max_len),\n                dtype=paddle.get_default_dtype()) * -np.inf),\n            1)\n        trg_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, trg_max_len, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        trg_pos = paddle.cast(\n            trg_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=trg_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        
enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        with paddle.static.amp.fp16_guard():\n            if self.waitk >= src_max_len or self.waitk == -1:\n                # Full sentence\n                enc_outputs = [\n                    self.encoder(\n                        enc_input, src_mask=src_slf_attn_bias)\n                ]\n            else:\n                # Wait-k policy\n                enc_outputs = []\n                for i in range(self.waitk, src_max_len + 1):\n                    enc_output = self.encoder(\n                        enc_input[:, :i, :],\n                        src_mask=src_slf_attn_bias[:, :, :, :i])\n                    enc_outputs.append(enc_output)\n\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n            dec_output = self.decoder(\n                dec_input,\n                enc_outputs,\n                tgt_mask=trg_slf_attn_bias,\n                memory_mask=trg_src_attn_bias)\n\n            predict = self.linear(dec_output)\n\n        return predict\n\n    def beam_search(self, src_word, beam_size=4, max_len=256, waitk=-1):\n        # TODO: \"Speculative Beam Search for Simultaneous Translation\"\n        raise NotImplementedError\n\n    def greedy_search(self,\n                      src_word,\n                      max_len=256,\n                      waitk=-1,\n                      caches=None,\n                      bos_id=None):\n        \"\"\"\n        greedy_search uses streaming reader. 
It does not need to call the\n        encoder many times; each new sub-sentence only calls the encoder once.\n        Therefore it requires the previous decoder states (caches) and the\n        last generated token id from the previous step.\n        \"\"\"\n        src_max_len = paddle.shape(src_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, 1, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        enc_outputs = [self.encoder(enc_input, src_mask=src_slf_attn_bias)]\n\n        # constant number\n        batch_size = enc_outputs[-1].shape[0]\n        max_len = (\n            enc_outputs[-1].shape[1] + 20) if max_len is None else max_len\n        end_token_tensor = paddle.full(\n            shape=[batch_size, 1], fill_value=self.eos_id, dtype=\"int64\")\n\n        predict_ids = []\n        log_probs = paddle.full(\n            shape=[batch_size, 1], fill_value=0, dtype=\"float32\")\n        if not bos_id:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=self.bos_id, dtype=\"int64\")\n        else:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=bos_id, dtype=\"int64\")\n\n        # init states (caches) for transformer\n        if not caches:\n            caches = self.decoder.gen_cache(enc_outputs[-1], do_zip=False)\n\n        for i in range(max_len):\n            trg_pos = 
paddle.full(\n                shape=trg_word.shape, fill_value=i, dtype=\"int64\")\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n\n            if waitk < 0 or i >= len(enc_outputs):\n                # Full-sentence decoding, or the decoder step has moved past\n                # all encoded source prefixes: attend to the whole source.\n                _e = enc_outputs[-1]\n            else:\n                _e = enc_outputs[i]\n            dec_output, caches = self.decoder(\n                dec_input, [_e], None,\n                trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n\n            dec_output = paddle.reshape(\n                dec_output, shape=[-1, dec_output.shape[-1]])\n\n            logits = self.linear(dec_output)\n            step_log_probs = paddle.log(F.softmax(logits, axis=-1))\n            log_probs = paddle.add(x=step_log_probs, y=log_probs)\n            scores = log_probs\n            topk_scores, topk_indices = paddle.topk(x=scores, k=1)\n\n            finished = paddle.equal(topk_indices, end_token_tensor)\n            trg_word = topk_indices\n            log_probs = topk_scores\n\n            predict_ids.append(topk_indices)\n\n            if paddle.all(finished).numpy():\n                break\n\n        predict_ids = paddle.stack(predict_ids, axis=0)\n        finished_seq = paddle.transpose(predict_ids, [1, 2, 0])\n        finished_scores = topk_scores\n\n        return finished_seq, finished_scores, caches"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport jieba\nimport paddle\nfrom paddlenlp.transformers import position_encoding_init\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom transformer_nist_wait_3.model import SimultaneousTransformer\nfrom transformer_nist_wait_3.processor import STACLTokenizer, predict\n\n\n@moduleinfo(\n    name=\"transformer_nist_wait_3\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/simultaneous_translation\",\n)\nclass STTransformer():\n    \"\"\"\n    Transformer model for simultaneous translation.\n    \"\"\"\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"n_layer\": 6,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n    }\n\n    def __init__(self, \n                 max_length=256,\n                 max_out_len=256,\n                 ):\n        super(STTransformer, self).__init__()\n        
bpe_codes_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_3\", \"assets\", \"2M.zh2en.dict4bpe.zh\")\n        src_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_3\", \"assets\", \"nist.20k.zh.vocab\")\n        trg_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_3\", \"assets\", \"nist.10k.en.vocab\")\n        params_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_3\", \"assets\", \"transformer.pdparams\")\n        self.max_length = max_length\n        self.max_out_len = max_out_len\n        self.tokenizer = STACLTokenizer(\n            bpe_codes_fpath,\n            src_vocab_fpath,\n            trg_vocab_fpath,\n        )\n        src_vocab_size = self.tokenizer.src_vocab_size\n        trg_vocab_size = self.tokenizer.trg_vocab_size\n        self.transformer = SimultaneousTransformer(\n            src_vocab_size,\n            trg_vocab_size,\n            max_length=self.max_length,\n            n_layer=self.model_config['n_layer'],\n            n_head=self.model_config['n_head'],\n            d_model=self.model_config['d_model'],\n        )\n        model_dict = paddle.load(params_fpath)\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        model_dict[\"src_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        model_dict[\"trg_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        self.transformer.load_dict(model_dict)\n\n    @serving\n    def translate(self, text, use_gpu=False):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        # Word segmentation\n        text = ' '.join(jieba.cut(text))\n        # For decoding max length\n        decoder_max_length = 1\n        # For decoding cache\n        cache = None\n        # For decoding start token id\n        
bos_id = None\n        # Current source word index\n        i = 0\n        # For decoding: is_last=True, max_len=256\n        is_last = False\n        # Tokenized id\n        user_input_tokenized = []\n        # Store the translation\n        result = []\n\n        bpe_str, tokenized_src = self.tokenizer.tokenize(text)\n        while i < len(tokenized_src):\n            user_input_tokenized.append(tokenized_src[i])\n            if bpe_str[i] in ['。', '？', '！']:\n                is_last = True\n            result, cache, bos_id = predict(\n                user_input_tokenized, \n                decoder_max_length,\n                is_last, \n                cache, \n                bos_id, \n                result,\n                self.tokenizer, \n                self.transformer,\n                max_out_len=self.max_out_len)\n            i += 1    \n        return \" \".join(result)"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nfrom paddlenlp.data import Vocab\nfrom subword_nmt import subword_nmt\n\n\nclass STACLTokenizer:\n    \"\"\"\n    Jieba+BPE, and convert tokens to ids.\n    \"\"\"\n\n    def __init__(self,\n                 bpe_codes_fpath,\n                 src_vocab_fpath,\n                 trg_vocab_fpath,\n                 special_token=[\"<s>\", \"<e>\", \"<unk>\"]):\n        bpe_parser = subword_nmt.create_apply_bpe_parser()\n        bpe_args = bpe_parser.parse_args(args=['-c', bpe_codes_fpath])\n        bpe_args.codes.close()\n        bpe_args.codes = open(bpe_codes_fpath, 'r', encoding='utf-8')\n        self.bpe = subword_nmt.BPE(bpe_args.codes, bpe_args.merges,\n                                   bpe_args.separator, None,\n                                   bpe_args.glossaries)\n\n        self.src_vocab = Vocab.load_vocabulary(\n            src_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.trg_vocab = Vocab.load_vocabulary(\n            trg_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.src_vocab_size = len(self.src_vocab)\n        self.trg_vocab_size = len(self.trg_vocab)\n\n    def tokenize(self, text):\n        
bpe_str = self.bpe.process_line(text)\n        ids = self.src_vocab.to_indices(bpe_str.split())\n        return bpe_str.split(), ids\n\n\ndef post_process_seq(seq, \n                     bos_idx=0, \n                     eos_idx=1, \n                     output_bos=False, \n                     output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [\n        idx for idx in seq[:eos_pos + 1]\n        if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)\n    ]\n    return seq\n\n\ndef predict(tokenized_src, \n              decoder_max_length, \n              is_last, \n              cache, \n              bos_id, \n              result,\n              tokenizer, \n              transformer,\n              n_best=1,\n              max_out_len=256,\n              eos_idx=1,\n              waitk=3,\n    ):\n    # Set evaluate mode\n    transformer.eval()\n\n    if len(tokenized_src) < waitk:\n        return result, cache, bos_id\n\n    with paddle.no_grad():\n        paddle.disable_static()\n        input_src = tokenized_src\n        if is_last:\n            decoder_max_length = max_out_len\n            input_src += [eos_idx]\n        src_word = paddle.to_tensor(input_src).unsqueeze(axis=0)\n        finished_seq, finished_scores, cache = transformer.greedy_search(\n            src_word,\n            max_len=decoder_max_length,\n            waitk=waitk,\n            caches=cache,\n            bos_id=bos_id)\n        finished_seq = finished_seq.numpy()\n        for beam_idx, beam in enumerate(finished_seq[0]):\n            if beam_idx >= n_best:\n                break\n            id_list = post_process_seq(beam)\n            if len(id_list) == 0:\n                continue\n            bos_id = id_list[-1]\n            word_list = tokenizer.trg_vocab.to_tokens(id_list)\n            
result.extend(word_list)\n        paddle.enable_static()\n    return result, cache, bos_id"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_3/requirements.txt",
    "content": "jieba==0.42.1 \nsubword-nmt==0.3.7\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/README.md",
    "content": "# transformer_nist_wait_5\n|模型名称|transformer_nist_wait_5|\n| :--- | :---: | \n|类别|同声传译|\n|网络|transformer|\n|数据集|NIST 2008-中英翻译数据集|\n|是否支持Fine-tuning|否|\n|模型大小|377MB|\n|最新更新日期|2021-09-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 同声传译（Simultaneous Translation），即在句子完成之前进行翻译，同声传译的目标是实现同声传译的自动化，它可以与源语言同时翻译，延迟时间只有几秒钟。\n    STACL 是论文 [STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency using Prefix-to-Prefix Framework](https://www.aclweb.org/anthology/P19-1289/) 中针对同传提出的适用于所有同传场景的翻译架构。\n    - STACL 主要具有以下优势：\n\n    - Prefix-to-Prefix架构拥有预测能力，即在未看到源词的情况下仍然可以翻译出对应的目标词，克服了SOV→SVO等词序差异\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133761990-13e55d0f-5c3a-476c-8865-5808d13cba97.png\"> <br />\n    </p>\n     和传统的机器翻译模型主要的区别在于翻译时是否需要利用全句的源句。上图中，Seq2Seq模型需要等到全句的源句（1-5）全部输入Encoder后，Decoder才开始解码进行翻译；而STACL架构采用了Wait-k（图中Wait-2）的策略，当源句只有两个词（1和2）输入到Encoder后，Decoder即可开始解码预测目标句的第一个词。\n\n    - Wait-k策略可以不需要全句的源句，直接预测目标句，可以实现任意的字级延迟，同时保持较高的翻译质量。\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133762098-6ea6f3ca-0d70-4a0a-981d-0fcc6f3cd96b.png\"> <br />\n    </p>\n     Wait-k策略首先等待源句单词，然后与源句的其余部分同时翻译，即输出总是隐藏在输入后面。这是受到同声传译人员的启发，同声传译人员通常会在几秒钟内开始翻译演讲者的演讲，在演讲者结束几秒钟后完成。例如，如果k=2，第一个目标词使用前2个源词预测，第二个目标词使用前3个源词预测，以此类推。上图中，(a)simultaneous: our wait-2 等到\"布什\"和\"总统\"输入后就开始解码预测\"pres.\"，而(b) non-simultaneous baseline 为传统的翻译模型，需要等到整句\"布什 总统 在 莫斯科 与 普京 会晤\"才开始解码预测。\n    \n  - 该PaddleHub Module基于transformer网络结构，采用wait-5策略进行中文到英文的翻译。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_nist_wait_5\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name=\"transformer_nist_wait_5\")\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = model.translate(t)\n        print(\"model output: {}\\n\".format(result))\n\n    # input: 他\n    # model output: \n    #\n    # input: 他还\n    # model output: \n    #\n    # input: 他还说\n    # model output:\n    #\n    # input: 他还说现在\n    # model output:\n    #\n    # input: 他还说现在正在\n    # model output: he\n    #\n    # input: 他还说现在正在为\n    # model output: he also\n    #\n    # input: 他还说现在正在为这\n    # model output: he also said\n    #\n    # input: 他还说现在正在为这一\n    # model output: he also said that\n    #\n    # input: 他还说现在正在为这一会议\n    # model output: he also said that he\n    #\n    # input: 他还说现在正在为这一会议作出\n    # model output: he also said that he was\n    #\n    # input: 他还说现在正在为这一会议作出安排\n    # model output: he also said that he was making\n    #\n    # input: 他还说现在正在为这一会议作出安排。\n    # model output: he also said that he was making arrangements for this meeting . 
\n    ```\n\n- ### 2、API\n\n    - ```python\n      __init__(max_length=256, max_out_len=256)\n      ```\n\n        - 初始化module，可配置模型输入文本的最大长度\n\n        - **参数**\n\n            - max_length(int): 输入文本的最大长度，默认值为256。\n            - max_out_len(int): 输出文本的最大解码长度，超过最大解码长度时会截断句子的后半部分，默认值为256。\n\n    - ```python\n      translate(text, use_gpu=False)\n      ```\n\n        - 预测API，输入源语言的文本（模拟同传语音输入），解码后输出翻译后的目标语言文本。\n\n        - **参数**\n\n            - text(str): 输入源语言的文本，数据类型为str\n            - use_gpu(bool): 是否使用gpu进行预测，默认为False\n\n        - **返回**\n\n            - result(str): 翻译后的目标语言文本。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线同声传译服务（需要用户配置一个语音转文本应用，预先将语音输入转为中文文字），可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_nist_wait_5\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading transformer_nist_wait_5 successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\",\n        \"他还\",\n        \"他还说\",\n        \"他还说现在\",\n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",\n    ]\n\n    # 指定预测方法为transformer_nist_wait_5并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/transformer_nist_wait_5\"\n    headers = {\"Content-Type\": \"application/json\"}\n    for t in text:\n        print(\"input: {}\".format(t))\n        r = requests.post(url=url, headers=headers, data=json.dumps({\"text\": t}))\n        # 打印预测结果\n        print(\"model output: {}\\n\".format(r.json()['results']))\n    ```\n\n  - 关于PaddleHub 
Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n    初始发布\n    ```shell\n    hub install transformer_nist_wait_5==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\n\n\nclass DecoderLayer(nn.TransformerDecoderLayer):\n    def __init__(self, *args, **kwargs):\n        super(DecoderLayer, self).__init__(*args, **kwargs)\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm1(tgt)\n        if cache is None:\n            tgt = self.self_attn(tgt, tgt, tgt, tgt_mask, None)\n        else:\n            tgt, incremental_cache = self.self_attn(tgt, tgt, tgt, tgt_mask,\n                                                    cache[0])\n        tgt = residual + self.dropout1(tgt)\n        if not self.normalize_before:\n            tgt = self.norm1(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm2(tgt)\n        if len(memory) == 1:\n            # Full sent\n            tgt = self.cross_attn(tgt, memory[0], memory[0], memory_mask, None)\n        else:\n            # Wait-k policy\n            cross_attn_outputs = []\n            for i in range(tgt.shape[1]):\n                q = tgt[:, i:i + 1, :]\n                if i >= len(memory):\n                    e = memory[-1]\n                
else:\n                    e = memory[i]\n                cross_attn_outputs.append(\n                    self.cross_attn(q, e, e, memory_mask[:, :, i:i + 1, :\n                                                         e.shape[1]], None))\n            tgt = paddle.concat(cross_attn_outputs, axis=1)\n        tgt = residual + self.dropout2(tgt)\n        if not self.normalize_before:\n            tgt = self.norm2(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm3(tgt)\n        tgt = self.linear2(self.dropout(self.activation(self.linear1(tgt))))\n        tgt = residual + self.dropout3(tgt)\n        if not self.normalize_before:\n            tgt = self.norm3(tgt)\n        return tgt if cache is None else (tgt, (incremental_cache, ))\n\n\nclass Decoder(nn.TransformerDecoder):\n    \"\"\"\n    PaddlePaddle 2.1 casts memory_mask.dtype to memory.dtype, but in STACL,\n    type of memory is list, having no dtype attribute.\n    \"\"\"\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        output = tgt\n        new_caches = []\n        for i, mod in enumerate(self.layers):\n            if cache is None:\n                output = mod(output,\n                             memory,\n                             tgt_mask=tgt_mask,\n                             memory_mask=memory_mask,\n                             cache=None)\n            else:\n                output, new_cache = mod(output,\n                                        memory,\n                                        tgt_mask=tgt_mask,\n                                        memory_mask=memory_mask,\n                                        cache=cache[i])\n                new_caches.append(new_cache)\n\n        if self.norm is not None:\n            output = self.norm(output)\n\n        return output if cache is None else (output, new_caches)\n\n\nclass SimultaneousTransformer(nn.Layer):\n    \"\"\"\n    model\n    \"\"\"\n    def 
__init__(self,\n                 src_vocab_size,\n                 trg_vocab_size,\n                 max_length=256,\n                 n_layer=6,\n                 n_head=8,\n                 d_model=512,\n                 d_inner_hid=2048,\n                 dropout=0.1,\n                 weight_sharing=False,\n                 bos_id=0,\n                 eos_id=1,\n                 waitk=-1):\n        super(SimultaneousTransformer, self).__init__()\n        self.trg_vocab_size = trg_vocab_size\n        self.emb_dim = d_model\n        self.bos_id = bos_id\n        self.eos_id = eos_id\n        self.dropout = dropout\n        self.waitk = waitk\n        self.n_layer = n_layer\n        self.n_head = n_head\n        self.d_model = d_model\n\n        self.src_word_embedding = WordEmbedding(\n            vocab_size=src_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n        self.src_pos_embedding = PositionalEmbedding(\n            emb_dim=d_model, max_length=max_length+1)\n        if weight_sharing:\n            assert src_vocab_size == trg_vocab_size, (\n                \"Vocabularies in source and target should be same for weight sharing.\"\n            )\n            self.trg_word_embedding = self.src_word_embedding\n            self.trg_pos_embedding = self.src_pos_embedding\n        else:\n            self.trg_word_embedding = WordEmbedding(\n                vocab_size=trg_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n            self.trg_pos_embedding = PositionalEmbedding(\n                emb_dim=d_model, max_length=max_length+1)\n\n        encoder_layer = nn.TransformerEncoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, True])\n        encoder_norm = nn.LayerNorm(d_model)\n        self.encoder = nn.TransformerEncoder(\n            encoder_layer=encoder_layer, 
num_layers=n_layer, norm=encoder_norm)\n\n        decoder_layer = DecoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, False, True])\n        decoder_norm = nn.LayerNorm(d_model)\n        self.decoder = Decoder(\n            decoder_layer=decoder_layer, num_layers=n_layer, norm=decoder_norm)\n\n        if weight_sharing:\n            self.linear = lambda x: paddle.matmul(\n                x=x, y=self.trg_word_embedding.word_embedding.weight, transpose_y=True)\n        else:\n            self.linear = nn.Linear(\n                in_features=d_model,\n                out_features=trg_vocab_size,\n                bias_attr=False)\n\n    def forward(self, src_word, trg_word):\n        src_max_len = paddle.shape(src_word)[-1]\n        trg_max_len = paddle.shape(trg_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_slf_attn_bias = paddle.tensor.triu(\n            (paddle.ones(\n                (trg_max_len, trg_max_len),\n                dtype=paddle.get_default_dtype()) * -np.inf),\n            1)\n        trg_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, trg_max_len, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        trg_pos = paddle.cast(\n            trg_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=trg_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        
enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        with paddle.static.amp.fp16_guard():\n            if self.waitk >= src_max_len or self.waitk == -1:\n                # Full sentence\n                enc_outputs = [\n                    self.encoder(\n                        enc_input, src_mask=src_slf_attn_bias)\n                ]\n            else:\n                # Wait-k policy\n                enc_outputs = []\n                for i in range(self.waitk, src_max_len + 1):\n                    enc_output = self.encoder(\n                        enc_input[:, :i, :],\n                        src_mask=src_slf_attn_bias[:, :, :, :i])\n                    enc_outputs.append(enc_output)\n\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n            dec_output = self.decoder(\n                dec_input,\n                enc_outputs,\n                tgt_mask=trg_slf_attn_bias,\n                memory_mask=trg_src_attn_bias)\n\n            predict = self.linear(dec_output)\n\n        return predict\n\n    def beam_search(self, src_word, beam_size=4, max_len=256, waitk=-1):\n        # TODO: \"Speculative Beam Search for Simultaneous Translation\"\n        raise NotImplementedError\n\n    def greedy_search(self,\n                      src_word,\n                      max_len=256,\n                      waitk=-1,\n                      caches=None,\n                      bos_id=None):\n        \"\"\"\n        greedy_search uses streaming reader. 
It does not need to call the\n        encoder many times; each sub-sentence only calls the encoder once.\n        So it needs the previous states (caches) and the id of the last\n        token generated at the previous step.\n        \"\"\"\n        src_max_len = paddle.shape(src_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, 1, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        enc_outputs = [self.encoder(enc_input, src_mask=src_slf_attn_bias)]\n\n        # Constants used during decoding\n        batch_size = enc_outputs[-1].shape[0]\n        max_len = (\n            enc_outputs[-1].shape[1] + 20) if max_len is None else max_len\n        end_token_tensor = paddle.full(\n            shape=[batch_size, 1], fill_value=self.eos_id, dtype=\"int64\")\n\n        predict_ids = []\n        log_probs = paddle.full(\n            shape=[batch_size, 1], fill_value=0, dtype=\"float32\")\n        if not bos_id:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=self.bos_id, dtype=\"int64\")\n        else:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=bos_id, dtype=\"int64\")\n\n        # Init states (caches) for the transformer decoder\n        if not caches:\n            caches = self.decoder.gen_cache(enc_outputs[-1], do_zip=False)\n\n        for i in range(max_len):\n            trg_pos = 
paddle.full(\n                shape=trg_word.shape, fill_value=i, dtype=\"int64\")\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n\n            if waitk < 0 or i >= len(enc_outputs):\n                # if the decoder step is full sent or longer than all source\n                # step, then read the whole src\n                _e = enc_outputs[-1]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n            else:\n                _e = enc_outputs[i]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n\n            dec_output = paddle.reshape(\n                dec_output, shape=[-1, dec_output.shape[-1]])\n\n            logits = self.linear(dec_output)\n            step_log_probs = paddle.log(F.softmax(logits, axis=-1))\n            log_probs = paddle.add(x=step_log_probs, y=log_probs)\n            scores = log_probs\n            topk_scores, topk_indices = paddle.topk(x=scores, k=1)\n\n            finished = paddle.equal(topk_indices, end_token_tensor)\n            trg_word = topk_indices\n            log_probs = topk_scores\n\n            predict_ids.append(topk_indices)\n\n            if paddle.all(finished).numpy():\n                break\n\n        predict_ids = paddle.stack(predict_ids, axis=0)\n        finished_seq = paddle.transpose(predict_ids, [1, 2, 0])\n        finished_scores = topk_scores\n\n        return finished_seq, finished_scores, caches"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport jieba\nimport paddle\nfrom paddlenlp.transformers import position_encoding_init\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom transformer_nist_wait_5.model import SimultaneousTransformer\nfrom transformer_nist_wait_5.processor import STACLTokenizer, predict\n\n\n@moduleinfo(\n    name=\"transformer_nist_wait_5\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/simultaneous_translation\",\n)\nclass STTransformer():\n    \"\"\"\n    Transformer model for simultaneous translation.\n    \"\"\"\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"n_layer\": 6,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n    }\n\n    def __init__(self, \n                 max_length=256,\n                 max_out_len=256,\n                 ):\n        super(STTransformer, self).__init__()\n        
bpe_codes_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_5\", \"assets\", \"2M.zh2en.dict4bpe.zh\")\n        src_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_5\", \"assets\", \"nist.20k.zh.vocab\")\n        trg_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_5\", \"assets\", \"nist.10k.en.vocab\")\n        params_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_5\", \"assets\", \"transformer.pdparams\")\n        self.max_length = max_length\n        self.max_out_len = max_out_len\n        self.tokenizer = STACLTokenizer(\n            bpe_codes_fpath,\n            src_vocab_fpath,\n            trg_vocab_fpath,\n        )\n        src_vocab_size = self.tokenizer.src_vocab_size\n        trg_vocab_size = self.tokenizer.trg_vocab_size\n        self.transformer = SimultaneousTransformer(\n            src_vocab_size,\n            trg_vocab_size,\n            max_length=self.max_length,\n            n_layer=self.model_config['n_layer'],\n            n_head=self.model_config['n_head'],\n            d_model=self.model_config['d_model'],\n        )\n        model_dict = paddle.load(params_fpath)\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        model_dict[\"src_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        model_dict[\"trg_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        self.transformer.load_dict(model_dict)\n\n    @serving\n    def translate(self, text, use_gpu=False):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        # Word segmentation\n        text = ' '.join(jieba.cut(text))\n        # For decoding max length\n        decoder_max_length = 1\n        # For decoding cache\n        cache = None\n        # For decoding start token id\n        
bos_id = None\n        # Current source word index\n        i = 0\n        # For decoding: is_last=True, max_len=256\n        is_last = False\n        # Tokenized id\n        user_input_tokenized = []\n        # Store the translation\n        result = []\n\n        bpe_str, tokenized_src = self.tokenizer.tokenize(text)\n        while i < len(tokenized_src):\n            user_input_tokenized.append(tokenized_src[i])\n            if bpe_str[i] in ['。', '？', '！']:\n                is_last = True\n            result, cache, bos_id = predict(\n                user_input_tokenized, \n                decoder_max_length,\n                is_last, \n                cache, \n                bos_id, \n                result,\n                self.tokenizer, \n                self.transformer,\n                max_out_len=self.max_out_len)\n            i += 1    \n        return \" \".join(result)"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nfrom paddlenlp.data import Vocab\nfrom subword_nmt import subword_nmt\n\n\nclass STACLTokenizer:\n    \"\"\"\n    Jieba+BPE, and convert tokens to ids.\n    \"\"\"\n\n    def __init__(self,\n                 bpe_codes_fpath,\n                 src_vocab_fpath,\n                 trg_vocab_fpath,\n                 special_token=[\"<s>\", \"<e>\", \"<unk>\"]):\n        bpe_parser = subword_nmt.create_apply_bpe_parser()\n        bpe_args = bpe_parser.parse_args(args=['-c', bpe_codes_fpath])\n        bpe_args.codes.close()\n        bpe_args.codes = open(bpe_codes_fpath, 'r', encoding='utf-8')\n        self.bpe = subword_nmt.BPE(bpe_args.codes, bpe_args.merges,\n                                   bpe_args.separator, None,\n                                   bpe_args.glossaries)\n\n        self.src_vocab = Vocab.load_vocabulary(\n            src_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.trg_vocab = Vocab.load_vocabulary(\n            trg_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.src_vocab_size = len(self.src_vocab)\n        self.trg_vocab_size = len(self.trg_vocab)\n\n    def tokenize(self, text):\n        
bpe_str = self.bpe.process_line(text)\n        ids = self.src_vocab.to_indices(bpe_str.split())\n        return bpe_str.split(), ids\n\n\ndef post_process_seq(seq, \n                     bos_idx=0, \n                     eos_idx=1, \n                     output_bos=False, \n                     output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [\n        idx for idx in seq[:eos_pos + 1]\n        if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)\n    ]\n    return seq\n\n\ndef predict(tokenized_src, \n              decoder_max_length, \n              is_last, \n              cache, \n              bos_id, \n              result,\n              tokenizer, \n              transformer,\n              n_best=1,\n              max_out_len=256,\n              eos_idx=1,\n              waitk=5,\n    ):\n    # Set evaluate mode\n    transformer.eval()\n\n    if len(tokenized_src) < waitk:\n        return result, cache, bos_id\n\n    with paddle.no_grad():\n        paddle.disable_static()\n        input_src = tokenized_src\n        if is_last:\n            decoder_max_length = max_out_len\n            input_src += [eos_idx]\n        src_word = paddle.to_tensor(input_src).unsqueeze(axis=0)\n        finished_seq, finished_scores, cache = transformer.greedy_search(\n            src_word,\n            max_len=decoder_max_length,\n            waitk=waitk,\n            caches=cache,\n            bos_id=bos_id)\n        finished_seq = finished_seq.numpy()\n        for beam_idx, beam in enumerate(finished_seq[0]):\n            if beam_idx >= n_best:\n                break\n            id_list = post_process_seq(beam)\n            if len(id_list) == 0:\n                continue\n            bos_id = id_list[-1]\n            word_list = tokenizer.trg_vocab.to_tokens(id_list)\n            
for word in word_list:\n                result.append(word)\n        paddle.enable_static()\n    return result, cache, bos_id"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_5/requirements.txt",
    "content": "jieba==0.42.1\nsubword-nmt==0.3.7\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/README.md",
    "content": "# transformer_nist_wait_7\n|模型名称|transformer_nist_wait_7|\n| :--- | :---: | \n|类别|同声传译|\n|网络|transformer|\n|数据集|NIST 2008-中英翻译数据集|\n|是否支持Fine-tuning|否|\n|模型大小|377MB|\n|最新更新日期|2021-09-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 同声传译（Simultaneous Translation），即在句子完成之前进行翻译，同声传译的目标是实现同声传译的自动化，它可以与源语言同时翻译，延迟时间只有几秒钟。\n    STACL 是论文 [STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency using Prefix-to-Prefix Framework](https://www.aclweb.org/anthology/P19-1289/) 中针对同传提出的适用于所有同传场景的翻译架构。\n    - STACL 主要具有以下优势：\n\n    - Prefix-to-Prefix架构拥有预测能力，即在未看到源词的情况下仍然可以翻译出对应的目标词，克服了SOV→SVO等词序差异\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133761990-13e55d0f-5c3a-476c-8865-5808d13cba97.png\"> <br />\n    </p>\n     和传统的机器翻译模型主要的区别在于翻译时是否需要利用全句的源句。上图中，Seq2Seq模型需要等到全句的源句（1-5）全部输入Encoder后，Decoder才开始解码进行翻译；而STACL架构采用了Wait-k（图中Wait-2）的策略，当源句只有两个词（1和2）输入到Encoder后，Decoder即可开始解码预测目标句的第一个词。\n\n    - Wait-k策略可以不需要全句的源句，直接预测目标句，可以实现任意的字级延迟，同时保持较高的翻译质量。\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133762098-6ea6f3ca-0d70-4a0a-981d-0fcc6f3cd96b.png\"> <br />\n    </p>\n     Wait-k策略首先等待源句单词，然后与源句的其余部分同时翻译，即输出总是隐藏在输入后面。这是受到同声传译人员的启发，同声传译人员通常会在几秒钟内开始翻译演讲者的演讲，在演讲者结束几秒钟后完成。例如，如果k=2，第一个目标词使用前2个源词预测，第二个目标词使用前3个源词预测，以此类推。上图中，(a)simultaneous: our wait-2 等到\"布什\"和\"总统\"输入后就开始解码预测\"pres.\"，而(b) non-simultaneous baseline 为传统的翻译模型，需要等到整句\"布什 总统 在 莫斯科 与 普京 会晤\"才开始解码预测。\n    \n  - 该PaddleHub Module基于transformer网络结构，采用wait-7策略进行中文到英文的翻译。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_nist_wait_7\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name=\"transformer_nist_wait_7\")\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = model.translate(t)\n        print(\"model output: {}\\n\".format(result))\n\n    # input: 他\n    # model output: \n    #\n    # input: 他还\n    # model output: \n    #\n    # input: 他还说\n    # model output:\n    #\n    # input: 他还说现在\n    # model output:\n    #\n    # input: 他还说现在正在\n    # model output:\n    #\n    # input: 他还说现在正在为\n    # model output:\n    #\n    # input: 他还说现在正在为这\n    # model output: he\n    #\n    # input: 他还说现在正在为这一\n    # model output: he also\n    #\n    # input: 他还说现在正在为这一会议\n    # model output: he also said\n    #\n    # input: 他还说现在正在为这一会议作出\n    # model output: he also said that\n    #\n    # input: 他还说现在正在为这一会议作出安排\n    # model output: he also said that arrangements\n    #\n    # input: 他还说现在正在为这一会议作出安排。\n    # model output: he also said that arrangements are now being made for this meeting .\n    ```\n\n- ### 2、 API\n\n    - ```python\n      __init__(max_length=256, max_out_len=256)\n      ```\n\n        - 初始化module， 可配置模型的输入文本的最大长度\n\n        - **参数**\n\n            - max_length(int): 输入文本的最大长度，默认值为256。\n            - max_out_len(int): 输出文本的最大解码长度，超过最大解码长度时会截断句子的后半部分，默认值为256。\n\n    - ```python\n      translate(text, use_gpu=False)\n      ```\n\n        - 预测API，输入源语言的文本（模拟同传语音输入），解码后输出翻译后的目标语言文本。\n\n        - **参数**\n\n            - text(str): 输入源语言的文本，数据类型为str\n            - use_gpu(bool): 是否使用gpu进行预测，默认为False\n\n    
    - **返回**\n\n            - result(str): 翻译后的目标语言文本。\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线同声传译服务(需要用户配置一个语音转文本应用预先将语音输入转为中文文字)，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_nist_wait_7\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading transformer_nist_wait_7 successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\",\n        \"他还\",\n        \"他还说\",\n        \"他还说现在\",\n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",\n    ]\n\n    # 指定预测方法为transformer_nist_wait_7并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/transformer_nist_wait_7\"\n    headers = {\"Content-Type\": \"application/json\"}\n    for t in text:\n        print(\"input: {}\".format(t))\n        r = requests.post(url=url, headers=headers, data=json.dumps({\"text\": t}))\n        # 打印预测结果\n        print(\"model output: {}\\n\".format(r.json()['results']))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n    初始发布\n    ```shell\n    hub install transformer_nist_wait_7==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\n\n\nclass DecoderLayer(nn.TransformerDecoderLayer):\n    def __init__(self, *args, **kwargs):\n        super(DecoderLayer, self).__init__(*args, **kwargs)\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm1(tgt)\n        if cache is None:\n            tgt = self.self_attn(tgt, tgt, tgt, tgt_mask, None)\n        else:\n            tgt, incremental_cache = self.self_attn(tgt, tgt, tgt, tgt_mask,\n                                                    cache[0])\n        tgt = residual + self.dropout1(tgt)\n        if not self.normalize_before:\n            tgt = self.norm1(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm2(tgt)\n        if len(memory) == 1:\n            # Full sent\n            tgt = self.cross_attn(tgt, memory[0], memory[0], memory_mask, None)\n        else:\n            # Wait-k policy\n            cross_attn_outputs = []\n            for i in range(tgt.shape[1]):\n                q = tgt[:, i:i + 1, :]\n                if i >= len(memory):\n                    e = memory[-1]\n                
else:\n                    e = memory[i]\n                cross_attn_outputs.append(\n                    self.cross_attn(q, e, e, memory_mask[:, :, i:i + 1, :\n                                                         e.shape[1]], None))\n            tgt = paddle.concat(cross_attn_outputs, axis=1)\n        tgt = residual + self.dropout2(tgt)\n        if not self.normalize_before:\n            tgt = self.norm2(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm3(tgt)\n        tgt = self.linear2(self.dropout(self.activation(self.linear1(tgt))))\n        tgt = residual + self.dropout3(tgt)\n        if not self.normalize_before:\n            tgt = self.norm3(tgt)\n        return tgt if cache is None else (tgt, (incremental_cache, ))\n\n\nclass Decoder(nn.TransformerDecoder):\n    \"\"\"\n    PaddlePaddle 2.1 casts memory_mask.dtype to memory.dtype, but in STACL\n    the memory is a list, which has no dtype attribute.\n    \"\"\"\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        output = tgt\n        new_caches = []\n        for i, mod in enumerate(self.layers):\n            if cache is None:\n                output = mod(output,\n                             memory,\n                             tgt_mask=tgt_mask,\n                             memory_mask=memory_mask,\n                             cache=None)\n            else:\n                output, new_cache = mod(output,\n                                        memory,\n                                        tgt_mask=tgt_mask,\n                                        memory_mask=memory_mask,\n                                        cache=cache[i])\n                new_caches.append(new_cache)\n\n        if self.norm is not None:\n            output = self.norm(output)\n\n        return output if cache is None else (output, new_caches)\n\n\nclass SimultaneousTransformer(nn.Layer):\n    \"\"\"\n    Simultaneous translation Transformer with wait-k policy.\n    \"\"\"\n    def 
__init__(self,\n                 src_vocab_size,\n                 trg_vocab_size,\n                 max_length=256,\n                 n_layer=6,\n                 n_head=8,\n                 d_model=512,\n                 d_inner_hid=2048,\n                 dropout=0.1,\n                 weight_sharing=False,\n                 bos_id=0,\n                 eos_id=1,\n                 waitk=-1):\n        super(SimultaneousTransformer, self).__init__()\n        self.trg_vocab_size = trg_vocab_size\n        self.emb_dim = d_model\n        self.bos_id = bos_id\n        self.eos_id = eos_id\n        self.dropout = dropout\n        self.waitk = waitk\n        self.n_layer = n_layer\n        self.n_head = n_head\n        self.d_model = d_model\n\n        self.src_word_embedding = WordEmbedding(\n            vocab_size=src_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n        self.src_pos_embedding = PositionalEmbedding(\n            emb_dim=d_model, max_length=max_length+1)\n        if weight_sharing:\n            assert src_vocab_size == trg_vocab_size, (\n                \"Vocabularies in source and target should be same for weight sharing.\"\n            )\n            self.trg_word_embedding = self.src_word_embedding\n            self.trg_pos_embedding = self.src_pos_embedding\n        else:\n            self.trg_word_embedding = WordEmbedding(\n                vocab_size=trg_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n            self.trg_pos_embedding = PositionalEmbedding(\n                emb_dim=d_model, max_length=max_length+1)\n\n        encoder_layer = nn.TransformerEncoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, True])\n        encoder_norm = nn.LayerNorm(d_model)\n        self.encoder = nn.TransformerEncoder(\n            encoder_layer=encoder_layer, 
num_layers=n_layer, norm=encoder_norm)\n\n        decoder_layer = DecoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, False, True])\n        decoder_norm = nn.LayerNorm(d_model)\n        self.decoder = Decoder(\n            decoder_layer=decoder_layer, num_layers=n_layer, norm=decoder_norm)\n\n        if weight_sharing:\n            self.linear = lambda x: paddle.matmul(\n                x=x, y=self.trg_word_embedding.word_embedding.weight, transpose_y=True)\n        else:\n            self.linear = nn.Linear(\n                in_features=d_model,\n                out_features=trg_vocab_size,\n                bias_attr=False)\n\n    def forward(self, src_word, trg_word):\n        src_max_len = paddle.shape(src_word)[-1]\n        trg_max_len = paddle.shape(trg_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_slf_attn_bias = paddle.tensor.triu(\n            (paddle.ones(\n                (trg_max_len, trg_max_len),\n                dtype=paddle.get_default_dtype()) * -np.inf),\n            1)\n        trg_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, trg_max_len, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        trg_pos = paddle.cast(\n            trg_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=trg_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        
enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        with paddle.static.amp.fp16_guard():\n            if self.waitk >= src_max_len or self.waitk == -1:\n                # Full sentence\n                enc_outputs = [\n                    self.encoder(\n                        enc_input, src_mask=src_slf_attn_bias)\n                ]\n            else:\n                # Wait-k policy\n                enc_outputs = []\n                for i in range(self.waitk, src_max_len + 1):\n                    enc_output = self.encoder(\n                        enc_input[:, :i, :],\n                        src_mask=src_slf_attn_bias[:, :, :, :i])\n                    enc_outputs.append(enc_output)\n\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n            dec_output = self.decoder(\n                dec_input,\n                enc_outputs,\n                tgt_mask=trg_slf_attn_bias,\n                memory_mask=trg_src_attn_bias)\n\n            predict = self.linear(dec_output)\n\n        return predict\n\n    def beam_search(self, src_word, beam_size=4, max_len=256, waitk=-1):\n        # TODO: \"Speculative Beam Search for Simultaneous Translation\"\n        raise NotImplementedError\n\n    def greedy_search(self,\n                      src_word,\n                      max_len=256,\n                      waitk=-1,\n                      caches=None,\n                      bos_id=None):\n        \"\"\"\n        greedy_search uses streaming reader. 
It doesn't need to call the\n        encoder many times; a sub-sentence only needs to call the encoder once.\n        So it needs the previous states (caches) and the id of the last token\n        generated in the previous step.\n        \"\"\"\n        src_max_len = paddle.shape(src_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, 1, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        enc_outputs = [self.encoder(enc_input, src_mask=src_slf_attn_bias)]\n\n        # constant number\n        batch_size = enc_outputs[-1].shape[0]\n        max_len = (\n            enc_outputs[-1].shape[1] + 20) if max_len is None else max_len\n        end_token_tensor = paddle.full(\n            shape=[batch_size, 1], fill_value=self.eos_id, dtype=\"int64\")\n\n        predict_ids = []\n        log_probs = paddle.full(\n            shape=[batch_size, 1], fill_value=0, dtype=\"float32\")\n        if not bos_id:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=self.bos_id, dtype=\"int64\")\n        else:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=bos_id, dtype=\"int64\")\n\n        # init states (caches) for transformer\n        if not caches:\n            caches = self.decoder.gen_cache(enc_outputs[-1], do_zip=False)\n\n        for i in range(max_len):\n            trg_pos = 
paddle.full(\n                shape=trg_word.shape, fill_value=i, dtype=\"int64\")\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n\n            if waitk < 0 or i >= len(enc_outputs):\n                # if the decoder step is full sent or longer than all source\n                # step, then read the whole src\n                _e = enc_outputs[-1]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n            else:\n                _e = enc_outputs[i]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n\n            dec_output = paddle.reshape(\n                dec_output, shape=[-1, dec_output.shape[-1]])\n\n            logits = self.linear(dec_output)\n            step_log_probs = paddle.log(F.softmax(logits, axis=-1))\n            log_probs = paddle.add(x=step_log_probs, y=log_probs)\n            scores = log_probs\n            topk_scores, topk_indices = paddle.topk(x=scores, k=1)\n\n            finished = paddle.equal(topk_indices, end_token_tensor)\n            trg_word = topk_indices\n            log_probs = topk_scores\n\n            predict_ids.append(topk_indices)\n\n            if paddle.all(finished).numpy():\n                break\n\n        predict_ids = paddle.stack(predict_ids, axis=0)\n        finished_seq = paddle.transpose(predict_ids, [1, 2, 0])\n        finished_scores = topk_scores\n\n        return finished_seq, finished_scores, caches"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport jieba\nimport paddle\nfrom paddlenlp.transformers import position_encoding_init\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom transformer_nist_wait_7.model import SimultaneousTransformer\nfrom transformer_nist_wait_7.processor import STACLTokenizer, predict\n\n\n@moduleinfo(\n    name=\"transformer_nist_wait_7\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/simultaneous_translation\",\n)\nclass STTransformer():\n    \"\"\"\n    Transformer model for simultaneous translation.\n    \"\"\"\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"n_layer\": 6,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n    }\n\n    def __init__(self, \n                 max_length=256,\n                 max_out_len=256,\n                 ):\n        super(STTransformer, self).__init__()\n        
bpe_codes_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_7\", \"assets\", \"2M.zh2en.dict4bpe.zh\")\n        src_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_7\", \"assets\", \"nist.20k.zh.vocab\")\n        trg_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_7\", \"assets\", \"nist.10k.en.vocab\")\n        params_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_7\", \"assets\", \"transformer.pdparams\")\n        self.max_length = max_length\n        self.max_out_len = max_out_len\n        self.tokenizer = STACLTokenizer(\n            bpe_codes_fpath,\n            src_vocab_fpath,\n            trg_vocab_fpath,\n        )\n        src_vocab_size = self.tokenizer.src_vocab_size\n        trg_vocab_size = self.tokenizer.trg_vocab_size\n        self.transformer = SimultaneousTransformer(\n            src_vocab_size,\n            trg_vocab_size,\n            max_length=self.max_length,\n            n_layer=self.model_config['n_layer'],\n            n_head=self.model_config['n_head'],\n            d_model=self.model_config['d_model'],\n        )\n        model_dict = paddle.load(params_fpath)\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        model_dict[\"src_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        model_dict[\"trg_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        self.transformer.load_dict(model_dict)\n\n    @serving\n    def translate(self, text, use_gpu=False):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        # Word segmentation\n        text = ' '.join(jieba.cut(text))\n        # For decoding max length\n        decoder_max_length = 1\n        # For decoding cache\n        cache = None\n        # For decoding start token id\n        
bos_id = None\n        # Current source word index\n        i = 0\n        # Whether the sentence is complete (ends with sentence-final punctuation)\n        is_last = False\n        # Token ids of the source prefix read so far\n        user_input_tokenized = []\n        # Store the translation\n        result = []\n\n        bpe_str, tokenized_src = self.tokenizer.tokenize(text)\n        while i < len(tokenized_src):\n            user_input_tokenized.append(tokenized_src[i])\n            if bpe_str[i] in ['。', '？', '！']:\n                is_last = True\n            result, cache, bos_id = predict(\n                user_input_tokenized,\n                decoder_max_length,\n                is_last,\n                cache,\n                bos_id,\n                result,\n                self.tokenizer,\n                self.transformer,\n                max_out_len=self.max_out_len)\n            i += 1\n        return \" \".join(result)"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nfrom paddlenlp.data import Vocab\nfrom subword_nmt import subword_nmt\n\n\nclass STACLTokenizer:\n    \"\"\"\n    Jieba+BPE, and convert tokens to ids.\n    \"\"\"\n\n    def __init__(self,\n                 bpe_codes_fpath,\n                 src_vocab_fpath,\n                 trg_vocab_fpath,\n                 special_token=[\"<s>\", \"<e>\", \"<unk>\"]):\n        bpe_parser = subword_nmt.create_apply_bpe_parser()\n        bpe_args = bpe_parser.parse_args(args=['-c', bpe_codes_fpath])\n        bpe_args.codes.close()\n        bpe_args.codes = open(bpe_codes_fpath, 'r', encoding='utf-8')\n        self.bpe = subword_nmt.BPE(bpe_args.codes, bpe_args.merges,\n                                   bpe_args.separator, None,\n                                   bpe_args.glossaries)\n\n        self.src_vocab = Vocab.load_vocabulary(\n            src_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.trg_vocab = Vocab.load_vocabulary(\n            trg_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.src_vocab_size = len(self.src_vocab)\n        self.trg_vocab_size = len(self.trg_vocab)\n\n    def tokenize(self, text):\n        
bpe_str = self.bpe.process_line(text)\n        ids = self.src_vocab.to_indices(bpe_str.split())\n        return bpe_str.split(), ids\n\n\ndef post_process_seq(seq, \n                     bos_idx=0, \n                     eos_idx=1, \n                     output_bos=False, \n                     output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [\n        idx for idx in seq[:eos_pos + 1]\n        if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)\n    ]\n    return seq\n\n\ndef predict(tokenized_src, \n              decoder_max_length, \n              is_last, \n              cache, \n              bos_id, \n              result,\n              tokenizer, \n              transformer,\n              n_best=1,\n              max_out_len=256,\n              eos_idx=1,\n              waitk=7,\n    ):\n    # Set evaluate mode\n    transformer.eval()\n\n    if len(tokenized_src) < waitk:\n        return result, cache, bos_id\n\n    with paddle.no_grad():\n        paddle.disable_static()\n        input_src = tokenized_src\n        if is_last:\n            decoder_max_length = max_out_len\n            input_src += [eos_idx]\n        src_word = paddle.to_tensor(input_src).unsqueeze(axis=0)\n        finished_seq, finished_scores, cache = transformer.greedy_search(\n            src_word,\n            max_len=decoder_max_length,\n            waitk=waitk,\n            caches=cache,\n            bos_id=bos_id)\n        finished_seq = finished_seq.numpy()\n        for beam_idx, beam in enumerate(finished_seq[0]):\n            if beam_idx >= n_best:\n                break\n            id_list = post_process_seq(beam)\n            if len(id_list) == 0:\n                continue\n            bos_id = id_list[-1]\n            word_list = tokenizer.trg_vocab.to_tokens(id_list)\n            
for word in word_list:\n                result.append(word)\n            res = ' '.join(word_list).replace('@@ ', '')\n        paddle.enable_static()\n    return result, cache, bos_id"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_7/requirements.txt",
    "content": "jieba==0.42.1 \nsubword-nmt==0.3.7\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/README.md",
    "content": "# transformer_nist_wait_all\n|模型名称|transformer_nist_wait_all|\n| :--- | :---: | \n|类别|同声传译|\n|网络|transformer|\n|数据集|NIST 2008-中英翻译数据集|\n|是否支持Fine-tuning|否|\n|模型大小|377MB|\n|最新更新日期|2021-09-17|\n|数据指标|-|\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 同声传译（Simultaneous Translation），即在句子完成之前进行翻译，同声传译的目标是实现同声传译的自动化，它可以与源语言同时翻译，延迟时间只有几秒钟。\n    STACL 是论文 [STACL: Simultaneous Translation with Implicit Anticipation and Controllable Latency using Prefix-to-Prefix Framework](https://www.aclweb.org/anthology/P19-1289/) 中针对同传提出的适用于所有同传场景的翻译架构。\n    - STACL 主要具有以下优势：\n\n    - Prefix-to-Prefix架构拥有预测能力，即在未看到源词的情况下仍然可以翻译出对应的目标词，克服了SOV→SVO等词序差异\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133761990-13e55d0f-5c3a-476c-8865-5808d13cba97.png\"> <br />\n    </p>\n     和传统的机器翻译模型主要的区别在于翻译时是否需要利用全句的源句。上图中，Seq2Seq模型需要等到全句的源句（1-5）全部输入Encoder后，Decoder才开始解码进行翻译；而STACL架构采用了Wait-k（图中Wait-2）的策略，当源句只有两个词（1和2）输入到Encoder后，Decoder即可开始解码预测目标句的第一个词。\n\n    - Wait-k策略可以不需要全句的源句，直接预测目标句，可以实现任意的字级延迟，同时保持较高的翻译质量。\n    <p align=\"center\">\n    <img src=\"https://user-images.githubusercontent.com/40840292/133762098-6ea6f3ca-0d70-4a0a-981d-0fcc6f3cd96b.png\"> <br />\n    </p>\n     Wait-k策略首先等待源句单词，然后与源句的其余部分同时翻译，即输出总是隐藏在输入后面。这是受到同声传译人员的启发，同声传译人员通常会在几秒钟内开始翻译演讲者的演讲，在演讲者结束几秒钟后完成。例如，如果k=2，第一个目标词使用前2个源词预测，第二个目标词使用前3个源词预测，以此类推。上图中，(a)simultaneous: our wait-2 等到\"布什\"和\"总统\"输入后就开始解码预测\"pres.\"，而(b) non-simultaneous baseline 为传统的翻译模型，需要等到整句\"布什 总统 在 莫斯科 与 普京 会晤\"才开始解码预测。\n    \n  - 该PaddleHub Module基于transformer网络结构，采用的策略是等到全句结束再进行中文到英文的翻译，即waitk=-1。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install transformer_nist_wait_all\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | 
[零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    model = hub.Module(name=\"transformer_nist_wait_all\")\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",      \n    ]\n\n    for t in text:\n        print(\"input: {}\".format(t))\n        result = model.translate(t)\n        print(\"model output: {}\\n\".format(result))\n\n    # input: 他\n    # model output: \n    #\n    # input: 他还\n    # model output: \n    #\n    # input: 他还说\n    # model output:\n    #\n    # input: 他还说现在\n    # model output:\n    #\n    # input: 他还说现在正在\n    # model output:\n    #\n    # input: 他还说现在正在为\n    # model output:\n    #\n    # input: 他还说现在正在为这\n    # model output:\n    #\n    # input: 他还说现在正在为这一\n    # model output:\n    #\n    # input: 他还说现在正在为这一会议\n    # model output:\n    #\n    # input: 他还说现在正在为这一会议作出\n    # model output:\n    #\n    # input: 他还说现在正在为这一会议作出安排\n    # model output:\n    #\n    # input: 他还说现在正在为这一会议作出安排。\n    # model output: he also said that arrangements are now being made for this meeting .\n    ```\n\n- ### 2、 API\n\n    - ```python\n      __init__(max_length=256, max_out_len=256)\n      ```\n\n        - 初始化module， 可配置模型的输入文本的最大长度\n\n        - **参数**\n\n            - max_length(int): 输入文本的最大长度，默认值为256。\n            - max_out_len(int): 输出文本的最大解码长度，超过最大解码长度时会截断句子的后半部分，默认值为256。\n\n    - ```python\n      translate(text, use_gpu=False)\n      ```\n\n        - 预测API，输入源语言的文本（模拟同传语音输入），解码后输出翻译后的目标语言文本。\n\n        - **参数**\n\n            - text(str): 输入源语言的文本，数据类型为str\n            - use_gpu(bool): 是否使用gpu进行预测，默认为False\n\n        - **返回**\n\n            - result(str): 翻译后的目标语言文本。\n\n## 
四、服务部署\n\n- PaddleHub Serving可以部署一个在线同声传译服务(需要用户配置一个语音转文本应用预先将语音输入转为中文文字)，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n\n  - ```shell\n    $ hub serving start -m transformer_nist_wait_all\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n\n  - ```shell\n    Loading transformer_nist_wait_all successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据（模拟同声传译实时输入）\n    text = [\n        \"他\", \n        \"他还\", \n        \"他还说\", \n        \"他还说现在\", \n        \"他还说现在正在\",\n        \"他还说现在正在为\",\n        \"他还说现在正在为这\",\n        \"他还说现在正在为这一\",\n        \"他还说现在正在为这一会议\",\n        \"他还说现在正在为这一会议作出\",\n        \"他还说现在正在为这一会议作出安排\",\n        \"他还说现在正在为这一会议作出安排。\",\n    ]\n\n    # 指定预测方法为transformer_nist_wait_all并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/transformer_nist_wait_all\"\n    headers = {\"Content-Type\": \"application/json\"}\n    for t in text:\n        print(\"input: {}\".format(t))\n        r = requests.post(url=url, headers=headers, data=json.dumps({\"text\": t}))\n        # 打印预测结果\n        print(\"model output: {}\\n\".format(r.json()['results']))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n    初始发布\n    ```shell\n    hub install transformer_nist_wait_all==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/model.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport numpy as np\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\n\n\nclass DecoderLayer(nn.TransformerDecoderLayer):\n    def __init__(self, *args, **kwargs):\n        super(DecoderLayer, self).__init__(*args, **kwargs)\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm1(tgt)\n        if cache is None:\n            tgt = self.self_attn(tgt, tgt, tgt, tgt_mask, None)\n        else:\n            tgt, incremental_cache = self.self_attn(tgt, tgt, tgt, tgt_mask,\n                                                    cache[0])\n        tgt = residual + self.dropout1(tgt)\n        if not self.normalize_before:\n            tgt = self.norm1(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm2(tgt)\n        if len(memory) == 1:\n            # Full sent\n            tgt = self.cross_attn(tgt, memory[0], memory[0], memory_mask, None)\n        else:\n            # Wait-k policy\n            cross_attn_outputs = []\n            for i in range(tgt.shape[1]):\n                q = tgt[:, i:i + 1, :]\n                if i >= len(memory):\n                    e = memory[-1]\n                
else:\n                    e = memory[i]\n                cross_attn_outputs.append(\n                    self.cross_attn(q, e, e, memory_mask[:, :, i:i + 1, :\n                                                         e.shape[1]], None))\n            tgt = paddle.concat(cross_attn_outputs, axis=1)\n        tgt = residual + self.dropout2(tgt)\n        if not self.normalize_before:\n            tgt = self.norm2(tgt)\n\n        residual = tgt\n        if self.normalize_before:\n            tgt = self.norm3(tgt)\n        tgt = self.linear2(self.dropout(self.activation(self.linear1(tgt))))\n        tgt = residual + self.dropout3(tgt)\n        if not self.normalize_before:\n            tgt = self.norm3(tgt)\n        return tgt if cache is None else (tgt, (incremental_cache, ))\n\n\nclass Decoder(nn.TransformerDecoder):\n    \"\"\"\n    PaddlePaddle 2.1 casts memory_mask.dtype to memory.dtype, but in STACL,\n    type of memory is list, having no dtype attribute.\n    \"\"\"\n\n    def forward(self, tgt, memory, tgt_mask=None, memory_mask=None, cache=None):\n        output = tgt\n        new_caches = []\n        for i, mod in enumerate(self.layers):\n            if cache is None:\n                output = mod(output,\n                             memory,\n                             tgt_mask=tgt_mask,\n                             memory_mask=memory_mask,\n                             cache=None)\n            else:\n                output, new_cache = mod(output,\n                                        memory,\n                                        tgt_mask=tgt_mask,\n                                        memory_mask=memory_mask,\n                                        cache=cache[i])\n                new_caches.append(new_cache)\n\n        if self.norm is not None:\n            output = self.norm(output)\n\n        return output if cache is None else (output, new_caches)\n\n\nclass SimultaneousTransformer(nn.Layer):\n    \"\"\"\n    model\n    \"\"\"\n    def 
__init__(self,\n                 src_vocab_size,\n                 trg_vocab_size,\n                 max_length=256,\n                 n_layer=6,\n                 n_head=8,\n                 d_model=512,\n                 d_inner_hid=2048,\n                 dropout=0.1,\n                 weight_sharing=False,\n                 bos_id=0,\n                 eos_id=1,\n                 waitk=-1):\n        super(SimultaneousTransformer, self).__init__()\n        self.trg_vocab_size = trg_vocab_size\n        self.emb_dim = d_model\n        self.bos_id = bos_id\n        self.eos_id = eos_id\n        self.dropout = dropout\n        self.waitk = waitk\n        self.n_layer = n_layer\n        self.n_head = n_head\n        self.d_model = d_model\n\n        self.src_word_embedding = WordEmbedding(\n            vocab_size=src_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n        self.src_pos_embedding = PositionalEmbedding(\n            emb_dim=d_model, max_length=max_length+1)\n        if weight_sharing:\n            assert src_vocab_size == trg_vocab_size, (\n                \"Vocabularies in source and target should be same for weight sharing.\"\n            )\n            self.trg_word_embedding = self.src_word_embedding\n            self.trg_pos_embedding = self.src_pos_embedding\n        else:\n            self.trg_word_embedding = WordEmbedding(\n                vocab_size=trg_vocab_size, emb_dim=d_model, bos_id=self.bos_id)\n            self.trg_pos_embedding = PositionalEmbedding(\n                emb_dim=d_model, max_length=max_length+1)\n\n        encoder_layer = nn.TransformerEncoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, True])\n        encoder_norm = nn.LayerNorm(d_model)\n        self.encoder = nn.TransformerEncoder(\n            encoder_layer=encoder_layer, 
num_layers=n_layer, norm=encoder_norm)\n\n        decoder_layer = DecoderLayer(\n            d_model=d_model,\n            nhead=n_head,\n            dim_feedforward=d_inner_hid,\n            dropout=dropout,\n            activation='relu',\n            normalize_before=True,\n            bias_attr=[False, False, True])\n        decoder_norm = nn.LayerNorm(d_model)\n        self.decoder = Decoder(\n            decoder_layer=decoder_layer, num_layers=n_layer, norm=decoder_norm)\n\n        if weight_sharing:\n            self.linear = lambda x: paddle.matmul(\n                x=x, y=self.trg_word_embedding.word_embedding.weight, transpose_y=True)\n        else:\n            self.linear = nn.Linear(\n                in_features=d_model,\n                out_features=trg_vocab_size,\n                bias_attr=False)\n\n    def forward(self, src_word, trg_word):\n        src_max_len = paddle.shape(src_word)[-1]\n        trg_max_len = paddle.shape(trg_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_slf_attn_bias = paddle.tensor.triu(\n            (paddle.ones(\n                (trg_max_len, trg_max_len),\n                dtype=paddle.get_default_dtype()) * -np.inf),\n            1)\n        trg_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, trg_max_len, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        trg_pos = paddle.cast(\n            trg_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=trg_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        
enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        with paddle.static.amp.fp16_guard():\n            if self.waitk >= src_max_len or self.waitk == -1:\n                # Full sentence\n                enc_outputs = [\n                    self.encoder(\n                        enc_input, src_mask=src_slf_attn_bias)\n                ]\n            else:\n                # Wait-k policy\n                enc_outputs = []\n                for i in range(self.waitk, src_max_len + 1):\n                    enc_output = self.encoder(\n                        enc_input[:, :i, :],\n                        src_mask=src_slf_attn_bias[:, :, :, :i])\n                    enc_outputs.append(enc_output)\n\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n            dec_output = self.decoder(\n                dec_input,\n                enc_outputs,\n                tgt_mask=trg_slf_attn_bias,\n                memory_mask=trg_src_attn_bias)\n\n            predict = self.linear(dec_output)\n\n        return predict\n\n    def beam_search(self, src_word, beam_size=4, max_len=256, waitk=-1):\n        # TODO: \"Speculative Beam Search for Simultaneous Translation\"\n        raise NotImplementedError\n\n    def greedy_search(self,\n                      src_word,\n                      max_len=256,\n                      waitk=-1,\n                      caches=None,\n                      bos_id=None):\n        \"\"\"\n        greedy_search uses streaming reader. 
It does not need to call the\n        encoder many times; each sub-sentence only needs one encoder call.\n        So it needs the previous states (caches) and the id of the last\n        token generated in the previous call.\n        \"\"\"\n        src_max_len = paddle.shape(src_word)[-1]\n        base_attn_bias = paddle.cast(\n            src_word == self.bos_id,\n            dtype=paddle.get_default_dtype()).unsqueeze([1, 2]) * -1e9\n        src_slf_attn_bias = base_attn_bias\n        src_slf_attn_bias.stop_gradient = True\n        trg_src_attn_bias = paddle.tile(base_attn_bias, [1, 1, 1, 1])\n        src_pos = paddle.cast(\n            src_word != self.bos_id, dtype=\"int64\") * paddle.arange(\n                start=0, end=src_max_len)\n        src_emb = self.src_word_embedding(src_word)\n        src_pos_emb = self.src_pos_embedding(src_pos)\n        src_emb = src_emb + src_pos_emb\n        enc_input = F.dropout(\n            src_emb, p=self.dropout,\n            training=self.training) if self.dropout else src_emb\n        enc_outputs = [self.encoder(enc_input, src_mask=src_slf_attn_bias)]\n\n        # constant number\n        batch_size = enc_outputs[-1].shape[0]\n        max_len = (\n            enc_outputs[-1].shape[1] + 20) if max_len is None else max_len\n        end_token_tensor = paddle.full(\n            shape=[batch_size, 1], fill_value=self.eos_id, dtype=\"int64\")\n\n        predict_ids = []\n        log_probs = paddle.full(\n            shape=[batch_size, 1], fill_value=0, dtype=\"float32\")\n        if not bos_id:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=self.bos_id, dtype=\"int64\")\n        else:\n            trg_word = paddle.full(\n                shape=[batch_size, 1], fill_value=bos_id, dtype=\"int64\")\n\n        # init states (caches) for transformer\n        if not caches:\n            caches = self.decoder.gen_cache(enc_outputs[-1], do_zip=False)\n\n        for i in range(max_len):\n            trg_pos = 
paddle.full(\n                shape=trg_word.shape, fill_value=i, dtype=\"int64\")\n            trg_emb = self.trg_word_embedding(trg_word)\n            trg_pos_emb = self.trg_pos_embedding(trg_pos)\n            trg_emb = trg_emb + trg_pos_emb\n            dec_input = F.dropout(\n                trg_emb, p=self.dropout,\n                training=self.training) if self.dropout else trg_emb\n\n            if waitk < 0 or i >= len(enc_outputs):\n                # if the decoder step is full sent or longer than all source\n                # step, then read the whole src\n                _e = enc_outputs[-1]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n            else:\n                _e = enc_outputs[i]\n                dec_output, caches = self.decoder(\n                    dec_input, [_e], None,\n                    trg_src_attn_bias[:, :, :, :_e.shape[1]], caches)\n\n            dec_output = paddle.reshape(\n                dec_output, shape=[-1, dec_output.shape[-1]])\n\n            logits = self.linear(dec_output)\n            step_log_probs = paddle.log(F.softmax(logits, axis=-1))\n            log_probs = paddle.add(x=step_log_probs, y=log_probs)\n            scores = log_probs\n            topk_scores, topk_indices = paddle.topk(x=scores, k=1)\n\n            finished = paddle.equal(topk_indices, end_token_tensor)\n            trg_word = topk_indices\n            log_probs = topk_scores\n\n            predict_ids.append(topk_indices)\n\n            if paddle.all(finished).numpy():\n                break\n\n        predict_ids = paddle.stack(predict_ids, axis=0)\n        finished_seq = paddle.transpose(predict_ids, [1, 2, 0])\n        finished_scores = topk_scores\n\n        return finished_seq, finished_scores, caches"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport jieba\nimport paddle\nfrom paddlenlp.transformers import position_encoding_init\nfrom paddlenlp.transformers import WordEmbedding, PositionalEmbedding\nfrom paddlehub.env import MODULE_HOME\nfrom paddlehub.module.module import moduleinfo, serving\n\nfrom transformer_nist_wait_all.model import SimultaneousTransformer\nfrom transformer_nist_wait_all.processor import STACLTokenizer, predict\n\n\n@moduleinfo(\n    name=\"transformer_nist_wait_all\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"PaddlePaddle\",\n    author_email=\"\",\n    type=\"nlp/simultaneous_translation\",\n)\nclass STTransformer():\n    \"\"\"\n    Transformer model for simultaneous translation.\n    \"\"\"\n\n    # Model config\n    model_config = {\n        # Number of head used in multi-head attention.\n        \"n_head\": 8,\n        # Number of sub-layers to be stacked in the encoder and decoder.\n        \"n_layer\": 6,\n        # The dimension for word embeddings, which is also the last dimension of\n        # the input and output of multi-head attention, position-wise feed-forward\n        # networks, encoder and decoder.\n        \"d_model\": 512,\n    }\n\n    def __init__(self, \n                 max_length=256,\n                 max_out_len=256,\n                 ):\n        super(STTransformer, self).__init__()\n        
bpe_codes_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_all\", \"assets\", \"2M.zh2en.dict4bpe.zh\")\n        src_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_all\", \"assets\", \"nist.20k.zh.vocab\")\n        trg_vocab_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_all\", \"assets\", \"nist.10k.en.vocab\")\n        params_fpath = os.path.join(MODULE_HOME, \"transformer_nist_wait_all\", \"assets\", \"transformer.pdparams\")\n        self.max_length = max_length\n        self.max_out_len = max_out_len\n        self.tokenizer = STACLTokenizer(\n            bpe_codes_fpath,\n            src_vocab_fpath,\n            trg_vocab_fpath,\n        )\n        src_vocab_size = self.tokenizer.src_vocab_size\n        trg_vocab_size = self.tokenizer.trg_vocab_size\n        self.transformer = SimultaneousTransformer(\n            src_vocab_size,\n            trg_vocab_size,\n            max_length=self.max_length,\n            n_layer=self.model_config['n_layer'],\n            n_head=self.model_config['n_head'],\n            d_model=self.model_config['d_model'],\n        )\n        model_dict = paddle.load(params_fpath)\n        # To avoid a longer length than training, reset the size of position\n        # encoding to max_length\n        model_dict[\"src_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        model_dict[\"trg_pos_embedding.pos_encoder.weight\"] = position_encoding_init(\n            self.max_length + 1, self.model_config['d_model'])\n        self.transformer.load_dict(model_dict)\n\n    @serving\n    def translate(self, text, use_gpu=False):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        # Word segmentation\n        text = ' '.join(jieba.cut(text))\n        # For decoding max length\n        decoder_max_length = 1\n        # For decoding cache\n        cache = None\n        # For decoding start token id\n 
       bos_id = None\n        # Current source word index\n        i = 0\n        # For decoding: is_last=True, max_len=256\n        is_last = False\n        # Tokenized id\n        user_input_tokenized = []\n        # Store the translation\n        result = []\n\n        bpe_str, tokenized_src = self.tokenizer.tokenize(text)\n        while i < len(tokenized_src):\n            user_input_tokenized.append(tokenized_src[i])\n            if bpe_str[i] in ['。', '？', '！']:\n                is_last = True\n            result, cache, bos_id = predict(\n                user_input_tokenized, \n                decoder_max_length,\n                is_last, \n                cache, \n                bos_id, \n                result,\n                self.tokenizer, \n                self.transformer,\n                max_out_len=self.max_out_len)\n            i += 1    \n        return \" \".join(result)"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/processor.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nfrom paddlenlp.data import Vocab\nfrom subword_nmt import subword_nmt\n\n\nclass STACLTokenizer:\n    \"\"\"\n    Jieba+BPE, and convert tokens to ids.\n    \"\"\"\n\n    def __init__(self,\n                 bpe_codes_fpath,\n                 src_vocab_fpath,\n                 trg_vocab_fpath,\n                 special_token=[\"<s>\", \"<e>\", \"<unk>\"]):\n        bpe_parser = subword_nmt.create_apply_bpe_parser()\n        bpe_args = bpe_parser.parse_args(args=['-c', bpe_codes_fpath])\n        bpe_args.codes.close()\n        bpe_args.codes = open(bpe_codes_fpath, 'r', encoding='utf-8')\n        self.bpe = subword_nmt.BPE(bpe_args.codes, bpe_args.merges,\n                                   bpe_args.separator, None,\n                                   bpe_args.glossaries)\n\n        self.src_vocab = Vocab.load_vocabulary(\n            src_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.trg_vocab = Vocab.load_vocabulary(\n            trg_vocab_fpath,\n            bos_token=special_token[0],\n            eos_token=special_token[1],\n            unk_token=special_token[2])\n\n        self.src_vocab_size = len(self.src_vocab)\n        self.trg_vocab_size = len(self.trg_vocab)\n\n    def tokenize(self, text):\n        
bpe_str = self.bpe.process_line(text)\n        ids = self.src_vocab.to_indices(bpe_str.split())\n        return bpe_str.split(), ids\n\n\ndef post_process_seq(seq, \n                     bos_idx=0, \n                     eos_idx=1, \n                     output_bos=False, \n                     output_eos=False):\n    \"\"\"\n    Post-process the decoded sequence.\n    \"\"\"\n    eos_pos = len(seq) - 1\n    for i, idx in enumerate(seq):\n        if idx == eos_idx:\n            eos_pos = i\n            break\n    seq = [\n        idx for idx in seq[:eos_pos + 1]\n        if (output_bos or idx != bos_idx) and (output_eos or idx != eos_idx)\n    ]\n    return seq\n\n\ndef predict(tokenized_src, \n              decoder_max_length, \n              is_last, \n              cache, \n              bos_id, \n              result,\n              tokenizer, \n              transformer,\n              n_best=1,\n              max_out_len=256,\n              eos_idx=1,\n              waitk=-1,\n    ):\n    # Set evaluate mode\n    transformer.eval()\n\n    if not is_last:\n        return result, cache, bos_id\n\n    with paddle.no_grad():\n        paddle.disable_static()\n        input_src = tokenized_src\n        if is_last:\n            decoder_max_length = max_out_len\n            input_src += [eos_idx]\n        src_word = paddle.to_tensor(input_src).unsqueeze(axis=0)\n        finished_seq, finished_scores, cache = transformer.greedy_search(\n            src_word,\n            max_len=decoder_max_length,\n            waitk=waitk,\n            caches=cache,\n            bos_id=bos_id)\n        finished_seq = finished_seq.numpy()\n        for beam_idx, beam in enumerate(finished_seq[0]):\n            if beam_idx >= n_best:\n                break\n            id_list = post_process_seq(beam)\n            if len(id_list) == 0:\n                continue\n            bos_id = id_list[-1]\n            word_list = tokenizer.trg_vocab.to_tokens(id_list)\n            for word in 
word_list:\n                result.append(word)\n            res = ' '.join(word_list).replace('@@ ', '')\n        paddle.enable_static()\n    return result, cache, bos_id"
  },
  {
    "path": "modules/text/simultaneous_translation/stacl/transformer_nist_wait_all/requirements.txt",
    "content": "jieba==0.42.1 \nsubword-nmt==0.3.7\n"
  },
  {
    "path": "modules/text/syntactic_analysis/DDParser/README.md",
    "content": "# DDParser\n\n|模型名称|DDParser|\n| :--- | :---: | \n|类别|文本-句法分析|\n|网络|Deep Biaffine Attention|\n|数据集|搜索query、网页文本、语音输入等数据|\n|是否支持Fine-tuning|否|\n|模型大小|61MB|\n|最新更新日期|2021-10-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - DDParser（Baidu Dependency Parser）是百度NLP基于大规模标注数据和深度学习平台飞桨研发的中文依存句法分析工具，可帮助用户直接获取输入文本中的关联词对、长距离依赖词对等。\n  \n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n  \n  - paddlenlp >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ddparser\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ddparser --input_text=\"百度是一家高科技公司\"\n    ```\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import cv2\n    import paddlehub as hub\n\n    # Load ddparser\n    module = hub.Module(name=\"ddparser\")\n\n    # String input\n    results = module.parse(\"百度是一家高科技公司\")\n    print(results)\n    # [{'word': ['百度', '是', '一家', '高科技', '公司'], 'head': [2, 0, 5, 5, 2], 'deprel': ['SBV', 'HED', 'ATT', 'ATT', 'VOB']}]\n\n    # List input\n    results = module.parse([\"百度是一家高科技公司\", \"他送了一本书\"])\n    print(results)\n    # [{'word': ['百度', '是', '一家', '高科技', '公司'], 'head': [2, 0, 5, 5, 2], 'deprel': ['SBV', 'HED', 'ATT', 'ATT', 'VOB']}, {'word': ['他', '送', '了', '一本', '书'], 'head': [2, 0, 2, 5, 2], 'deprel': ['SBV', 'HED', 'MT', 'ATT', 'VOB']}]\n\n    # Use POS Tag and probability\n    module = hub.Module(name=\"ddparser\", prob=True, use_pos=True)\n    results = module.parse(\"百度是一家高科技公司\")\n    print(results)\n    # [{'word': ['百度', '是', '一家', '高科技', '公司'], 'head': [2, 0, 5, 5, 2], 'deprel': ['SBV', 'HED', 
'ATT', 'ATT', 'VOB'], 'postag': ['ORG', 'v', 'm', 'n', 'n'], 'prob': [1.0, 1.0, 1.0, 1.0, 1.0]}]\n\n    # Visualization mode\n    module = hub.Module(name=\"ddparser\", return_visual=True)\n    data = module.visualize(\"百度是一家高科技公司\")\n    cv2.imwrite('test.jpg', data)\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def __init__(\n      tree=True,\n      prob=False,\n      use_pos=False,\n      batch_size=1,\n      return_visual=False)\n    ```\n    - 模块初始化。\n    \n    - **参数**\n\n      - tree(bool): 输出结果是否需要满足树状结构，默认为True。\n      - prob(bool): 是否输出概率值，默认为False。\n      - use_pos(bool): 是否输出词性标签，默认为False。\n      - batch_size(int): 批大小，默认为1。\n      - return_visual(bool): 是否返回可视化结果（需配合visualize api使用），默认为False。\n\n  - ```python\n    def parse(texts)\n    ```\n    - 依存分析接口，输入文本，输出依存关系。\n\n    - **参数**\n\n      - texts(str or list\\[str\\]): 待预测数据。\n\n    - **返回**\n\n      - results(list\\[dict\\]): 依存分析结果。每个元素都是dict类型，包含以下信息：  \n     \n            {\n                'word': list[str], 分词结果。\n                'head': list[int], 当前成分的支配者id。\n                'deprel': list[str], 当前成分与支配者的依存关系。\n                'prob': list[float], 从属者和支配者依存的概率。\n                'postag': list[str], 词性标签，只有当texts的元素是未分词的字符串时包含这个键。\n                'visual': numpy.ndarray, 图像数组，可以使用cv2.imshow显示图像或cv2.imwrite保存图像。\n            }\n      \n\n  - ```python\n    def visualize(text)\n    ```\n\n    - 可视化接口，输入文本信息，输出依存图形数组。\n\n    - **参数**\n\n      - text(str): 输入文本，支持string格式的单条文本输入。\n\n    - **返回**\n\n      - data(numpy.ndarray): 图像数组。可以使用cv2.imshow显示图像或cv2.imwrite保存图像。\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线句法分析服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m ddparser\n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n    ```shell\n    Loading ddparser successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 
配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据(input string)\n    text = [\"百度是一家公司\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n    \n    # 指定预测方法为ddparser并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/ddparser\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    print(r.json())\n    # {'msg': '', 'results': [{'deprel': ['SBV', 'HED', 'ATT', 'VOB'], 'head': ['2', '0', '4', '2'], 'word': ['百度', '是', '一家', '公司']}], 'status': '000'}\n\n    # 待预测数据(input list)\n    text = [\"百度是一家公司\", \"他送了一本书\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    # {'msg': '', 'results': [{'deprel': ['SBV', 'HED', 'ATT', 'VOB'], 'head': ['2', '0', '4', '2'], 'word': ['百度', '是', '一家', '公司']}, {'deprel': ['SBV', 'HED', 'MT', 'ATT', 'VOB'], 'head': ['2', '0', '2', '5', '2'], 'word': ['他', '送', '了', '一本', '书']}], 'status': '000'}\n    \n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  适配paddlepaddle 2.1版本\n\n  - ```shell\n    $ hub install ddparser==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/syntactic_analysis/DDParser/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/syntactic_analysis/DDParser/module.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport argparse\n\nimport paddlehub as hub\nfrom paddlehub.module.module import serving, moduleinfo, runnable\nfrom paddlenlp import Taskflow\n\n\n@moduleinfo(\n    name=\"ddparser\",\n    version=\"1.1.0\",\n    summary=\"Baidu's open-source DDParser model.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/syntactic_analysis\")\nclass ddparser(hub.NLPPredictionModule):\n    def __init__(self,\n                 tree=True,\n                 prob=False,\n                 use_pos=False,\n                 batch_size=1,\n                 return_visual=False,\n                 ):\n        self.ddp = Taskflow(\n            \"dependency_parsing\",\n            tree=tree,\n            prob=prob,\n            use_pos=use_pos,\n            batch_size=batch_size,\n            return_visual=return_visual)\n\n    @serving\n    def serving_parse(self, texts):\n        results = self.parse(texts)\n        for i in range(len(results)):\n            org_list = results[i][\"head\"]\n            results[i][\"head\"] = [str(x) for x in org_list]\n        return results\n\n    def parse(self, texts):\n        \"\"\"\n        Parse the dependency.\n\n        Args:\n            texts(str or list[str]): the input texts to be parsed.\n\n        Returns:\n            results(list[dict]): a list, with elements corresponding to each of the elements in texts. The element is a dictionary of shape:\n                {\n                    'word': list[str], the tokenized words.\n                    'head': list[int], the head ids.\n                    'deprel': list[str], the dependency relation.\n                    'prob': list[float], the prediction probability of the dependency relation.\n                    'postag': list[str], the POS tags. If an element of texts is an already tokenized list, the key 'postag' will not be returned.\n                    'visual': numpy.ndarray, the dependency visualization. Use cv2.imshow to show it or cv2.imwrite to save it. If return_visual=False, this key will not be returned.\n                }\n        \"\"\"\n        return self.ddp(texts)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        input_data = self.check_input_data(args)\n\n        results = self.parse(texts=input_data)\n\n        return results\n\n    def visualize(self, text):\n        \"\"\"\n        Visualize the dependency.\n\n        Args:\n            text(str): input text.\n\n        Returns:\n            data(numpy.ndarray): a numpy array, use cv2.imshow to show it or cv2.imwrite to save it.\n        \"\"\"\n\n        if isinstance(text, str):\n            result = self.ddp(text)[0]['visual']\n            return result\n        else:\n            raise TypeError(\n                \"Invalid input, the input text should be str, but type {} was found!\".format(type(text))\n            )\n"
  },
  {
    "path": "modules/text/syntactic_analysis/DDParser/requirements.txt",
    "content": "paddlenlp>=2.1.1\nLAC>=2.1.2\n"
  },
  {
    "path": "modules/text/syntactic_analysis/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【句法分析】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 句法分析\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [句法分析-DDParser](https://www.paddlepaddle.org.cn/hubdetail?name=ddparser&en_category=SyntacticAnalysis) | DDParser（Baidu Dependency Parser）是百度NLP基于大规模标注数据和深度学习平台飞桨研发的中文依存句法分析工具，可帮助用户直接获取输入文本中的关联词对、长距离依赖词对等。 |\n"
  },
  {
    "path": "modules/text/syntactic_analysis/README_en.md",
    "content": "## **For better user experience, refer to the Web official document ->  [Syntactic Analysis](https://www.paddlepaddle.org.cn/hublist)**\n\n### Syntactic Analysis\n\nDependency syntactic analysis identifies the grammatical structure of a sentence by linking each word to its head word and labeling the dependency relation between them, helping applications directly obtain related word pairs and long-distance dependent word pairs in input texts.\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Syntactic Analysis - DDParser](https://www.paddlepaddle.org.cn/hubdetail?name=ddparser&en_category=SyntacticAnalysis) | Baidu Dependency Parser (DDParser) is a Chinese dependency syntactic analysis tool developed by Baidu NLP based on large-scale annotation data and deep learning platform Paddle. This can help users directly obtain related word pairs and long-distance dependent word pairs in input texts. |\n"
  },
  {
    "path": "modules/text/text_correction/ernie-csc/README.md",
    "content": "# ERNIE-CSC\n\n|模型名称|ERNIE-CSC|\n| :--- | :---: | \n|类别|文本-文本纠错|\n|网络|ERNIE-CSC|\n|数据集|SIGHAN|\n|是否支持Fine-tuning|否|\n|模型大小|436MB|\n|最新更新日期|2021-12-10|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - 中文文本纠错任务是一项NLP基础任务，其输入是一个可能含有语法错误的中文句子，输出是一个正确的中文句子。语法错误类型很多，有多字、少字、错别字等，目前最常见的错误类型是错别字。大部分研究工作围绕错别字这一类型进行研究。本文实现了百度在ACL 2021上提出结合拼音特征的Softmask策略的中文错别字纠错的下游任务网络，并提供预训练模型，模型结构如下：\n\n  <p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/40840292/146150468-9168651a-1fa0-4d60-9871-69e494d1d370.png\" hspace='10'/> <br />\n  </p>\n\n  - 更多详情请[参考论文](https://aclanthology.org/2021.findings-acl.198.pdf)\n\n  - 注：论文中暂未开源融合字音特征的预训练模型参数(即MLM-phonetics)，所以本文提供的纠错模型是在ERNIE-1.0的参数上进行Finetune，纠错模型结构与论文保持一致。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n  \n  - paddlenlp >= 2.2.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie-csc\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie-csc --input_text=\"遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。\"\n    ```\n  - 通过命令行方式实现文本纠错ernie-csc模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # Load ernie-csc\n    module = hub.Module(name=\"ernie-csc\")\n\n    # String input\n    results = module.predict(\"遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。\")\n    print(results)\n    # [{'source': '遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', 'target': '遇到逆境时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', 'errors': [{'position': 3, 'correction': {'竟': '境'}}]}]\n\n    # List input\n    results = module.predict(['遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', 
'人生就是如此，经过磨练才能让自己更加拙壮，才能使自己更加乐观。'])\n    print(results)\n    # [{'source': '遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', 'target': '遇到逆境时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', 'errors': [{'position': 3, 'correction': {'竟': '境'}}]}, {'source': '人生就是如此，经过磨练才能让自己更加拙壮，才能使自己更加乐观。', 'target': '人生就是如此，经过磨练才能让自己更加茁壮，才能使自己更加乐观。', 'errors': [{'position': 18, 'correction': {'拙': '茁'}}]}]\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def __init__(batch_size=32)\n    ```\n\n    - **参数**\n\n      - batch_size(int): 每个预测批次的样本数目，默认为32。\n\n  - ```python\n    def predict(texts)\n    ```\n    - 预测接口，输入文本，输出文本纠错结果。\n\n    - **参数**\n\n      - texts(str or list\\[str\\]): 待预测数据。\n\n    - **返回**\n\n      - results(list\\[dict\\]): 输出结果。每个元素都是dict类型，包含以下信息：  \n\n            {\n                'source': str, 输入文本。\n                'target': str, 模型预测结果。\n                'errors': list[dict], 错误字符的详细信息，包含如下信息:\n                    {\n                        'position': int, 错误字符的位置。\n                        'correction': dict, 错误字符及其对应的校正结果。\n                    }\n            }\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文本纠错服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m ernie-csc\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据(input string)\n    text = [\"遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n    \n    # 指定预测方法为ernie-csc并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/ernie-csc\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n\n    # 待预测数据(input list)\n    text = ['遇到逆竟时，我们必须勇于面对，而且要愈挫愈勇，这样我们才能朝著成功之路前进。', '人生就是如此，经过磨练才能让自己更加拙壮，才能使自己更加乐观。']\n\n    # 
设置运行配置\n    data = {\"texts\": text}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install ernie-csc==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/text_correction/ernie-csc/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_correction/ernie-csc/module.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport argparse\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import serving, moduleinfo, runnable\nfrom paddlenlp import Taskflow\n\n\n@moduleinfo(\n    name=\"ernie-csc\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"Baidu\",\n    author_email=\"\",\n    type=\"nlp/text_correction\",\n    meta=hub.NLPPredictionModule)\nclass Ernie_CSC(paddle.nn.Layer):\n    def __init__(self,\n                 batch_size=32):\n        self.corrector = Taskflow(\"text_correction\", batch_size=batch_size)\n\n    @serving\n    def predict(self, texts):\n        \"\"\"\n        The prediction interface for ernie-csc.\n\n        Args:\n            texts(str or list[str]): the input texts to be predicted.\n\n        Returns:\n            results(list[dict]): inference results. Each element is a dictionary consisting of:\n                {\n                    'source': str, the input text.\n                    'target': str, the predicted correct text.\n                    'errors': list[dict], detail information of errors, each element is a dictionary consisting of:\n                        {\n                            'position': int, the index of the wrong character.\n                            'correction': dict, the original character and the predicted correct character.\n                        }\n                }\n        \"\"\"\n        return self.corrector(texts)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        input_data = self.check_input_data(args)\n\n        results = self.predict(texts=input_data)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_correction/ernie-csc/requirements.txt",
    "content": "paddlenlp>=2.2.0\n"
  },
  {
    "path": "modules/text/text_generation/CPM_LM/readme.md",
    "content": "# CPM_LM\n\n| 模型名称            |    CPM_LM     |\n| :------------------ | :-----------: |\n| 类别                | 文本-文本生成 |\n| 网络                |     GPT-2     |\n| 数据集              |  自建数据集   |\n| 是否支持Fine-tuning |      否       |\n| 模型大小            |     5.31G     |\n| 最新更新日期        |  2021-02-26   |\n| 数据指标            |       -       |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - CPM-LM 是一个基于 GPT-2 的预训练生成模型，参数规模达 26 亿，预训练中文数据规模 100 GB，使用了 64 块 V100 GPU，训练时间约为 3 周。能够在多种自然语言处理任务上进行零次学习或少次学习，并达到较好的效果。基于给定上文，模型可以续写出一致性高、可读性强的文本，达到现有中文生成模型的领先效果。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  \n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  \n  - sentencepiece==0.1.92\n  \n    - **注意本模型对sentencepiece版本要求严格，在使用前请确认您所使用的版本是正确的**\n    \n    - ```shell\n      $ pip install sentencepiece==0.1.92\n      ```\n  \n- ### 2、安装\n\n  - ```shell\n    $ hub install CPM_LM\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    model = hub.Module(name='CPM_LM')  # 加载模型\n    ```\n  - Note：模型参数转换至官方开源项目，由于模型较大，推荐在GPU环境下运行，并且请确保运行环境的内存大于20G且显卡显存大于12G，否则可能无法正常运行 \n     \n  - 使用 Greedy Search 生成文本：\n\n    - ```python\n      inputs = '''默写古诗:\n      日照香炉生紫烟，遥看瀑布挂前川。\n      飞流直下三千尺，'''\n      outputs = model.predict(inputs, max_len=10, end_word='\\n')\n      print(inputs+outputs)\n      \n      # 默写古诗:\n      # 日照香炉生紫烟，遥看瀑布挂前川。\n      # 飞流直下三千尺，疑是银河落九天。\n      ```\n\n    - ```python\n      inputs = '''问题：西游记是谁写的？\n      答案：'''\n      outputs = model.predict(inputs, max_len=10, end_word='\\n')\n      print(inputs+outputs)\n      \n      # 问题:西游记是谁写的?\n      # 答案:吴承恩。\n      ```\n\n    - ```python\n      inputs = '''小明决定去吃饭，小红继续写作业\n      
问题：去吃饭的人是谁？\n      答案：'''\n      outputs = model.predict(inputs, max_len=10, end_word='\\n')\n      print(inputs+outputs)\n      \n      # 小明决定去吃饭,小红继续写作业\n      # 问题:去吃饭的人是谁?\n      # 答案:小明\n      ```\n\n    - ```python\n      inputs = '''默写英文：\n      狗：dog\n      猫：'''\n      outputs = model.predict(inputs, max_len=10, end_word='\\n')\n      print(inputs+outputs)\n      \n      # 默写英文:\n      # 狗:dog\n      # 猫:cat\n      ```\n\n- ### 2、API\n\n  - ```python\n    def predict(text, max_len=32, end_word=None):\n    ```\n\n    - 预测 API ，根据输入的文字进行文本生成，使用 Greedy Search 进行解码。\n    - **参数**\n      - text (str) : 输入文本\n      - max_len (int) : 生成文本的最大长度\n      - end_word (str or None) : 终止生成的标志词\n    - **返回**\n      - results (str): 生成的文本\n    \n  - ```python\n    def tokenizer.encode(text):\n    ```\n  \n    - 编码 API\n    - **参数**\n      - text (str) : 输入文本\n    - **返回**\n      - results (list\\[int\\]) : 输出编码\n    \n  - ```python\n    def tokenizer.decode(ids):\n    ```\n  \n    - 解码 API\n    - **参数**\n      - ids (list\\[int\\]) : 输入编码\n    - **返回**\n      - results (str) : 输出文本\n  \n  - ```python\n    def model(x, kv_cache=None, use_cache=False):\n    ```\n  \n    - 模型前向计算 API\n    - **参数**\n      - x (tensor) : 输入编码\n      - kv_cache (tensor) : 输入的缓存\n      - use_cache (bool) : 是否使用缓存\n    - **返回**\n      - results (tensor) : 模型输出\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文本生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start --modules GPT2_CPM_LM -p 8866\n    ```\n\n  - 这样就完成了一个文本生成服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    text = \"今天是个好日子\"\n    data = {\n        \"text\": text,\n        \"mode\": \"sample\", # 'search' or 'sample'\n        # 可以根据需要设置上述 API 中提到的其他参数\n    }\n    url = \"http://127.0.0.1:8866/predict/GPT2_CPM_LM\"\n    headers = {\"Content-Type\": \"application/json\"}\n    \n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    ```\n    \n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/text_generation/GPT2_Base_CN/README.md",
    "content": "## 概述\nGPT2_Base_CN 是一个预训练生成模型，是 PaddleNLP 中的内置模型。\n\n## API\n```python\ndef greedy_search(\n    text,\n    max_len=32,\n    end_word=None):\n```\n文本生成 API ，根据输入的文字进行文本生成，使用 Greedy Search 进行解码，生成的文本单一且确定，适合于问答类的文本生成。\n\n**参数**\n* text (str) : 输入文本\n* max_len (int) : 生成文本的最大长度\n* end_word (str or None) : 终止生成的标志词\n\n**返回**\n* results (str): 生成的文本\n\n```python\ndef nucleus_sample(\n    text,\n    max_len=32,\n    end_word=None,\n    repitition_penalty=1.0,\n    temperature=1.0,\n    top_k=0,\n    top_p=1.0):\n```\n文本生成 API ，根据输入的文字进行文本生成，使用采样的方式进行解码，生成的文本比较多样，适合于文章类的文本生成。\n\n**参数**\n* text (str) : 输入文本\n* max_len (int) : 生成文本的最大长度\n* end_word (str or None) : 终止生成的标志词\n* repitition_penalty (float) : 重复词抑制率，大于1抑制，小于1提高\n* temperature (float) ：较低的temperature可以让模型对最佳选择越来越有信息，大于1，则会降低，0则相当于 argmax/max ，inf则相当于均匀采样\n* top_k (int) : 抑制小于 Top K 的输出，大于0时有效\n* top_p (float) : 抑制小于 Top P 的输出，小于1.0时有效\n\n**返回**\n* results (str): 生成的文本\n\n**代码示例**\n* 加载模型：\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='GPT2_Base_CN')\n```\n* 使用 Greedy Search 生成文本：\n```python\ninputs = '''默写古诗:\n日照香炉生紫烟，遥看瀑布挂前川。\n飞流直下三千尺，'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    默写古诗:\n    日照香炉生紫烟,遥看瀑布挂前川。\n    飞流直下三千尺,疑是银河落九天。\n```python\ninputs = '''问题：西游记是谁写的？\n答案：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    问题:西游记是谁写的?\n    答案:吴承恩。\n```python\ninputs = '''小明决定去吃饭，小红继续写作业\n问题：去吃饭的人是谁？\n答案：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    小明决定去吃饭,小红继续写作业\n    问题:去吃饭的人是谁?\n    答案:小明\n```python\ninputs = '''默写英文：\n狗：dog\n猫：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    默写英文:\n    狗:dog\n    猫:cat\n\n* 使用采样方式生成文本：\n\n```python\ninputs = '''在此处输入文本的开头'''\n\noutputs = model.nucleus_sample(\n    inputs,\n    max_len=32,\n    end_word='。',\n    repitition_penalty=1.0,\n  
  temperature=1.0,\n    top_k=5,\n    top_p=1.0\n)\n\nprint(outputs)\n```\n    在此处输入文本的开头字母。\n\n\n```python\ninputs = '''《乡土中国》是费孝通先生在社区研究的基础上从宏观角度探讨中国社会结构的著作，'''\n\noutputs = model.nucleus_sample(\n    inputs,\n    max_len=32,\n    end_word='。',\n    repitition_penalty=1.0,\n    temperature=1.0,\n    top_k=3000,\n    top_p=1.0\n)\n\nprint(outputs)\n```\n    《乡土中国》是费孝通先生在社区研究的基础上从宏观角度探讨中国社会结构的著作,肯定了集体所有制在提升中国中低山地区农村生活水平方面所起的积极作用。\n\n## 服务部署\n\nPaddleHub Serving 可以部署一个在线文本生成服务。\n\n## 第一步：启动PaddleHub Serving\n\n运行启动命令：\n```shell\n$ hub serving start --modules GPT2_Base_CN\n```\n\n这样就完成了一个文本生成的在线服务API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n## 第二步：发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\ntext = \"今天是个好日子\"\ndata = {\n    \"text\": text,\n    \"mode\": \"sample\", # 'search' or 'sample'\n    # 可以根据需要设置上述 API 中提到的其他参数\n}\nurl = \"http://127.0.0.1:8866/predict/GPT2_Base_CN\"\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\n```\n\n## 查看代码\nhttps://github.com/PaddlePaddle/PaddleNLP\n\n## 依赖\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\nsentencepiece==0.1.92\n"
  },
  {
    "path": "modules/text/text_generation/GPT2_Base_CN/module.py",
    "content": "import paddle\nimport numpy as np\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import GPT2ForPretraining, GPT2ChineseTokenizer\n\n\n@moduleinfo(\n    name=\"GPT2_Base_CN\",  # 模型名称\n    type=\"NLP/NLG\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"GPT2_Base_CN\",  # 模型介绍\n    version=\"1.0.0\"  # 版本号\n)\nclass GPT2_Base_CN(nn.Layer):\n    def __init__(self):\n        super(GPT2_Base_CN, self).__init__()\n        # 加载 PaddleNLP 自带的预训练中文 GPT2 模型\n        self.model = GPT2ForPretraining.from_pretrained('gpt2-base-cn')\n\n        # 设置模型为评估状态\n        self.model.eval()\n\n        # 加载编解码器\n        self.tokenizer = GPT2ChineseTokenizer.from_pretrained('gpt2-base-cn')\n\n        # 初始化编码器\n        _ = self.tokenizer.encode('_')\n\n    # Greedy Search\n    def greedy_search(self, text, max_len=32, end_word=None):\n        with paddle.no_grad():\n            # # 终止标志\n            if end_word is not None:\n                stop_id = self.tokenizer.encode(end_word)\n                length = len(stop_id)\n\n            # 初始预测\n            ids = self.tokenizer.encode(text)\n            input_id = paddle.to_tensor(np.array(ids).reshape(1, -1).astype('int64'))\n            output, cached_kvs = self.model(input_id, use_cache=True)\n            next_token = int(np.argmax(output[0, -1].numpy()))\n            ids.append(next_token)\n\n            # 使用缓存进行继续预测\n            for i in range(max_len - 1):\n                input_id = paddle.to_tensor(np.array([next_token]).reshape(1, -1).astype('int64'))\n                output, cached_kvs = self.model(input_id, use_cache=True, cache=cached_kvs)\n                next_token = int(np.argmax(output[0, -1].numpy()))\n                ids.append(next_token)\n\n                # 根据终止标志停止预测\n                if (end_word is not None) and (ids[-length:] == stop_id):\n                    break\n\n            return 
self.tokenizer.decode(ids)\n\n    @staticmethod\n    def top_k_top_p_filtering(logits, top_k=0, top_p=1.0, filter_value=-float('Inf')):\n        \"\"\" Filter a distribution of logits using top-k and/or nucleus (top-p) filtering\n            Args:\n                logits: logits distribution shape (vocabulary size)\n                top_k > 0: keep only top k tokens with highest probability (top-k filtering).\n                top_p > 0.0: keep the top tokens with cumulative probability >= top_p (nucleus filtering).\n                    Nucleus filtering is described in Holtzman et al. (http://arxiv.org/abs/1904.09751)\n            From: https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317\n        \"\"\"\n        top_k = min(top_k, logits.shape[-1])  # Safety check\n        logits_np = logits.numpy()\n        if top_k > 0:\n            # Remove all tokens with a probability less than the last token of the top-k\n            indices_to_remove = logits_np < np.sort(logits_np)[-top_k]\n            logits_np[indices_to_remove] = filter_value\n\n        if top_p < 1.0:\n            sorted_logits = paddle.sort(logits, descending=True)\n            sorted_indices = paddle.argsort(logits, descending=True).numpy()\n            cumulative_probs = paddle.cumsum(paddle.nn.functional.softmax(sorted_logits, axis=-1), axis=-1).numpy()\n\n            # Remove tokens with cumulative probability above the threshold\n            sorted_indices_to_remove = cumulative_probs > top_p\n            # Shift the indices to the right to keep also the first token above the threshold\n            sorted_indices_to_remove[..., 1:] = sorted_indices_to_remove[..., :-1]\n            sorted_indices_to_remove[..., 0] = 0\n\n            indices_to_remove = sorted_indices[sorted_indices_to_remove]\n            logits_np[indices_to_remove] = filter_value\n\n        return paddle.to_tensor(logits_np)\n\n    def nucleus_sample(self,\n                       text,\n                       
max_len=32,\n                       end_word=None,\n                       repitition_penalty=1.0,\n                       temperature=1.0,\n                       top_k=0,\n                       top_p=1.0):\n        with paddle.no_grad():\n            # 终止标志\n            if end_word is not None:\n                stop_id = self.tokenizer.encode(end_word)\n                length = len(stop_id)\n\n            # 初始预测\n            ids = self.tokenizer.encode(text)\n            input_id = paddle.to_tensor(np.array(ids).reshape(1, -1).astype('int64'))\n            output, cached_kvs = self.model(input_id, use_cache=True)\n            next_token_logits = output[0, -1, :]\n            for id in set(ids):\n                next_token_logits[id] /= repitition_penalty\n            next_token_logits = next_token_logits / temperature\n            filtered_logits = self.top_k_top_p_filtering(next_token_logits, top_k=top_k, top_p=top_p)\n            next_token = paddle.multinomial(\n                paddle.nn.functional.softmax(filtered_logits, axis=-1), num_samples=1).numpy()\n            ids += [int(next_token)]\n\n            # 使用缓存进行继续预测\n            for i in range(max_len - 1):\n                input_id = paddle.to_tensor(np.array([next_token]).reshape(1, -1).astype('int64'))\n                output, cached_kvs = self.model(input_id, use_cache=True, cache=cached_kvs)\n                next_token_logits = output[0, -1, :]\n                for id in set(ids):\n                    next_token_logits[id] /= repitition_penalty\n                next_token_logits = next_token_logits / temperature\n                filtered_logits = self.top_k_top_p_filtering(next_token_logits, top_k=top_k, top_p=top_p)\n                next_token = paddle.multinomial(\n                    paddle.nn.functional.softmax(filtered_logits, axis=-1), num_samples=1).numpy()\n                ids += [int(next_token)]\n\n                # 根据终止标志停止预测\n                if (end_word is not None) and (ids[-length:] == 
stop_id):\n                    break\n\n            return self.tokenizer.decode(ids)\n\n    # Hub Serving\n    @serving\n    def serving_method(self, text, mode='search', **kwargs):\n        if mode == 'search':\n            results = self.greedy_search(text, **kwargs)\n        else:\n            results = self.nucleus_sample(text, **kwargs)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/GPT2_CPM_LM/README.md",
    "content": "## 概述\nCPM-LM 是一个基于 GPT-2 的预训练生成模型，参数规模达 26 亿，预训练中文数据规模 100 GB，使用了 64 块 V100 GPU，训练时间约为 3 周。能够在多种自然语言处理任务上进行零次学习或少次学习，并达到较好的效果。基于给定上文，模型可以续写出一致性高、可读性强的文本，达到现有中文生成模型的领先效果。\n\n模型参数转换至官方开源项目，由于模型较大，推荐在GPU环境下运行，并且请确保运行环境的内存大于20G且显卡显存大于12G，否则可能无法正常运行\n\n更多详情参考[清源CPM官网](https://cpm.baai.ac.cn)及其[Github项目主页](https://github.com/TsinghuaAI/CPM-Generate)\n\n## API\n```python\ndef greedy_search(\n    text,\n    max_len=32,\n    end_word=None):\n```\n文本生成 API ，根据输入的文字进行文本生成，使用 Greedy Search 进行解码，生成的文本单一且确定，适合于问答类的文本生成。\n\n**参数**\n* text (str) : 输入文本\n* max_len (int) : 生成文本的最大长度\n* end_word (str or None) : 终止生成的标志词\n\n**返回**\n* results (str): 生成的文本\n\n```python\ndef nucleus_sample(\n    text,\n    max_len=32,\n    end_word=None,\n    repitition_penalty=1.0,\n    temperature=1.0,\n    top_k=0,\n    top_p=1.0):\n```\n文本生成 API ，根据输入的文字进行文本生成，使用采样的方式进行解码，生成的文本比较多样，适合于文章类的文本生成。\n\n**参数**\n* text (str) : 输入文本\n* max_len (int) : 生成文本的最大长度\n* end_word (str or None) : 终止生成的标志词\n* repitition_penalty (float) : 重复词抑制率，大于1抑制，小于1提高\n* temperature (float) ：较低的temperature可以让模型对最佳选择越来越有信息，大于1，则会降低，0则相当于 argmax/max ，inf则相当于均匀采样\n* top_k (int) : 抑制小于 Top K 的输出，大于0时有效\n* top_p (float) : 抑制小于 Top P 的输出，小于1.0时有效\n\n**返回**\n* results (str): 生成的文本\n\n**代码示例**\n* 加载模型：\n```python\nimport paddlehub as hub\n\nmodel = hub.Module(name='GPT2_CPM_LM')\n```\n* 使用 Greedy Search 生成文本：\n```python\ninputs = '''默写古诗:\n日照香炉生紫烟，遥看瀑布挂前川。\n飞流直下三千尺，'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    默写古诗:\n    日照香炉生紫烟,遥看瀑布挂前川。\n    飞流直下三千尺,疑是银河落九天。\n```python\ninputs = '''问题：西游记是谁写的？\n答案：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    问题:西游记是谁写的?\n    答案:吴承恩。\n```python\ninputs = '''小明决定去吃饭，小红继续写作业\n问题：去吃饭的人是谁？\n答案：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    小明决定去吃饭,小红继续写作业\n    问题:去吃饭的人是谁?\n    答案:小明\n```python\ninputs = 
'''默写英文：\n狗：dog\n猫：'''\n\noutputs = model.greedy_search(inputs, max_len=10, end_word='\\n')\n\nprint(outputs)\n```\n    默写英文:\n    狗:dog\n    猫:cat\n\n* 使用采样方式生成文本：\n\n```python\ninputs = '''在此处输入文本的开头'''\n\noutputs = model.nucleus_sample(\n    inputs,\n    max_len=32,\n    end_word='。',\n    repitition_penalty=1.0,\n    temperature=1.0,\n    top_k=5,\n    top_p=1.0\n)\n\nprint(outputs)\n```\n    在此处输入文本的开头、结尾或中间部分,然后按下回车键。\n\n\n```python\ninputs = '''《乡土中国》是费孝通先生在社区研究的基础上从宏观角度探讨中国社会结构的著作，'''\n\noutputs = model.nucleus_sample(\n    inputs,\n    max_len=32,\n    end_word='。',\n    repitition_penalty=1.0,\n    temperature=1.0,\n    top_k=3000,\n    top_p=1.0\n)\n\nprint(outputs)\n```\n    《乡土中国》是费孝通先生在社区研究的基础上从宏观角度探讨中国社会结构的著作,在书中,他试图向读者说明,中国社会的结构是如何形成、演变的,家庭伦理是如何形成、\n\n\n\n## 服务部署\n\nPaddleHub Serving 可以部署一个在线文本生成服务。\n\n## 第一步：启动PaddleHub Serving\n\n运行启动命令：\n```shell\n$ hub serving start --modules GPT2_CPM_LM\n```\n\n这样就完成了一个文本生成的在线服务API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n## 第二步：发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\ntext = \"今天是个好日子\"\ndata = {\n    \"text\": text,\n    \"mode\": \"sample\", # 'search' or 'sample'\n    # 可以根据需要设置上述 API 中提到的其他参数\n}\nurl = \"http://127.0.0.1:8866/predict/GPT2_CPM_LM\"\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\n```\n\n\n## 查看代码\nhttps://github.com/jm12138/CPM-Generate-Paddle\n\n## 依赖\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.0.0\n\nsentencepiece==0.1.92\n"
  },
  {
    "path": "modules/text/text_generation/GPT2_CPM_LM/module.py",
    "content": "import os\nimport paddle\nimport numpy as np\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import GPT2ForPretraining, GPT2ChineseTokenizer, GPT2Model\n\n\n@moduleinfo(\n    name=\"GPT2_CPM_LM\",  # 模型名称\n    type=\"NLP/NLG\",  # 模型类型\n    author=\"jm12138\",  # 作者名称\n    author_email=\"jm12138@qq.com\",  # 作者邮箱\n    summary=\"GPT2_CPM_LM\",  # 模型介绍\n    version=\"1.0.0\"  # 版本号\n)\nclass GPT2_CPM_LM(nn.Layer):\n    def __init__(self):\n        super(GPT2_CPM_LM, self).__init__()\n        # 实例化模型\n        gpt2 = GPT2Model(\n            vocab_size=30000,\n            hidden_size=2560,\n            num_hidden_layers=32,\n            num_attention_heads=32,\n            intermediate_size=10240,\n            hidden_act=\"gelu\",\n            hidden_dropout_prob=0.1,\n            attention_probs_dropout_prob=0.1,\n            max_position_embeddings=1024,\n            type_vocab_size=1,\n            initializer_range=0.02,\n            pad_token_id=0)\n        self.model = GPT2ForPretraining(gpt2)\n\n        # 读取CPM-LM模型参数(FP16)\n        state_dict = paddle.load(os.path.join(self.directory, 'CPM-LM.pdparams'))\n\n        # FP16 -> FP32\n        for param in state_dict:\n            state_dict[param] = state_dict[param].astype('float32')\n\n        # 设置模型参数\n        self.model.set_dict(state_dict)\n\n        # 将模型设置为评估状态\n        self.model.eval()\n\n        # 加载编解码器\n        self.tokenizer = GPT2ChineseTokenizer(\n            vocab_file=os.path.join(self.directory, 'vocab.json'),\n            model_file=os.path.join(self.directory, 'chinese_vocab.model'))\n\n        # 初始化编码器\n        _ = self.tokenizer.encode('_')\n\n    # Greedy Search\n    def greedy_search(self, text, max_len=32, end_word=None):\n        with paddle.no_grad():\n            # # 终止标志\n            if end_word is not None:\n                stop_id = self.tokenizer.encode(end_word)\n                length = len(stop_id)\n\n     
       # 初始预测\n            ids = self.tokenizer.encode(text)\n            input_id = paddle.to_tensor(np.array(ids).reshape(1, -1).astype('int64'))\n            output, cached_kvs = self.model(input_id, use_cache=True)\n            next_token = int(np.argmax(output[0, -1].numpy()))\n            ids.append(next_token)\n\n            # 使用缓存进行继续预测\n            for i in range(max_len - 1):\n                input_id = paddle.to_tensor(np.array([next_token]).reshape(1, -1).astype('int64'))\n                output, cached_kvs = self.model(input_id, use_cache=True, cache=cached_kvs)\n                next_token = int(np.argmax(output[0, -1].numpy()))\n                ids.append(next_token)\n\n                # 根据终止标志停止预测\n                if (end_word is not None) and (ids[-length:] == stop_id):\n                    break\n\n            return self.tokenizer.decode(ids)\n\n    @staticmethod\n    def top_k_top_p_filtering(logits, top_k=0, top_p=1.0, filter_value=-float('Inf')):\n        \"\"\" Filter a distribution of logits using top-k and/or nucleus (top-p) filtering\n            Args:\n                logits: logits distribution shape (vocabulary size)\n                top_k > 0: keep only top k tokens with highest probability (top-k filtering).\n                top_p > 0.0: keep the top tokens with cumulative probability >= top_p (nucleus filtering).\n                    Nucleus filtering is described in Holtzman et al. 
(http://arxiv.org/abs/1904.09751)\n            From: https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317\n        \"\"\"\n        top_k = min(top_k, logits.shape[-1])  # Safety check\n        logits_np = logits.numpy()\n        if top_k > 0:\n            # Remove all tokens with a probability less than the last token of the top-k\n            indices_to_remove = logits_np < np.sort(logits_np)[-top_k]\n            logits_np[indices_to_remove] = filter_value\n\n        if top_p < 1.0:\n            sorted_logits = paddle.sort(logits, descending=True)\n            sorted_indices = paddle.argsort(logits, descending=True).numpy()\n            cumulative_probs = paddle.cumsum(paddle.nn.functional.softmax(sorted_logits, axis=-1), axis=-1).numpy()\n\n            # Remove tokens with cumulative probability above the threshold\n            sorted_indices_to_remove = cumulative_probs > top_p\n            # Shift the indices to the right to keep also the first token above the threshold\n            sorted_indices_to_remove[..., 1:] = sorted_indices_to_remove[..., :-1]\n            sorted_indices_to_remove[..., 0] = 0\n\n            indices_to_remove = sorted_indices[sorted_indices_to_remove]\n            logits_np[indices_to_remove] = filter_value\n\n        return paddle.to_tensor(logits_np)\n\n    def nucleus_sample(self,\n                       text,\n                       max_len=32,\n                       end_word=None,\n                       repitition_penalty=1.0,\n                       temperature=1.0,\n                       top_k=0,\n                       top_p=1.0):\n        with paddle.no_grad():\n            # 终止标志\n            if end_word is not None:\n                stop_id = self.tokenizer.encode(end_word)\n                length = len(stop_id)\n\n            # 初始预测\n            ids = self.tokenizer.encode(text)\n            input_id = paddle.to_tensor(np.array(ids).reshape(1, -1).astype('int64'))\n            output, cached_kvs = 
self.model(input_id, use_cache=True)\n            next_token_logits = output[0, -1, :]\n            for id in set(ids):\n                next_token_logits[id] /= repitition_penalty\n            next_token_logits = next_token_logits / temperature\n            next_token_logits[self.tokenizer.encoder['<unk>']] = -float('Inf')\n            filtered_logits = self.top_k_top_p_filtering(next_token_logits, top_k=top_k, top_p=top_p)\n            next_token = paddle.multinomial(\n                paddle.nn.functional.softmax(filtered_logits, axis=-1), num_samples=1).numpy()\n            ids += [int(next_token)]\n\n            # 使用缓存进行继续预测\n            for i in range(max_len - 1):\n                input_id = paddle.to_tensor(np.array([next_token]).reshape(1, -1).astype('int64'))\n                output, cached_kvs = self.model(input_id, use_cache=True, cache=cached_kvs)\n                next_token_logits = output[0, -1, :]\n                for id in set(ids):\n                    next_token_logits[id] /= repitition_penalty\n                next_token_logits = next_token_logits / temperature\n                next_token_logits[self.tokenizer.encoder['<unk>']] = -float('Inf')\n                filtered_logits = self.top_k_top_p_filtering(next_token_logits, top_k=top_k, top_p=top_p)\n                next_token = paddle.multinomial(\n                    paddle.nn.functional.softmax(filtered_logits, axis=-1), num_samples=1).numpy()\n                ids += [int(next_token)]\n\n                # 根据终止标志停止预测\n                if (end_word is not None) and (ids[-length:] == stop_id):\n                    break\n\n            return self.tokenizer.decode(ids)\n\n    # Hub Serving\n    @serving\n    def serving_method(self, text, mode='search', **kwargs):\n        if mode == 'search':\n            results = self.greedy_search(text, **kwargs)\n        else:\n            results = self.nucleus_sample(text, **kwargs)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【文本生成】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 文本生成\n\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [AI写诗](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_poetry&en_category=TextGeneration) |根据输入上句内容，自动生成后面的多句诗歌 |\n| [AI情话](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_lover_words&en_category=TextGeneration) |根据输入上句内容，自动生成后面的多句情话。 |\n| [生成式对话](https://www.paddlepaddle.org.cn/hubdetail?name=plato2_en_large&en_category=TextGeneration) |交互式多轮问答，目前仅支持英文 |\n| [看图写诗](https://www.paddlepaddle.org.cn/hubdetail?name=reading_pictures_writing_poems&en_category=TextGeneration)|该模型可自动根据图像生成古诗词|\n.\n"
  },
  {
    "path": "modules/text/text_generation/README_en.md",
    "content": "## **For better user experience, refer to the Web official document -> [Text Generation]https://www.paddlepaddle.org.cn/hublist)**\n\n### Text Generation\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [AI Poem Writing](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_poetry&en_category=TextGeneration) | Generate the next lines of poetry automatically according to the input of the previous line. |\n| [AI Love Word](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_gen_lover_words&en_category=TextGeneration) | Generate the next lines of poetry automatically according to the input of the previous line. |\n| [Generative Dialogues](https://www.paddlepaddle.org.cn/hubdetail?name=plato2_en_large&en_category=TextGeneration) | Interactive multi-round question and answer. It currently supports only English. |\n| [Poem Writing by Watching an Image](https://www.paddlepaddle.org.cn/hubdetail?name=reading_pictures_writing_poems&en_category=TextGeneration) | The model can automatically generate an ancient poem according to an images. |\n"
  },
  {
    "path": "modules/text/text_generation/Rumor_prediction/README.md",
    "content": "## 概述\n\n\nRumor_prediction是预测语句是否为谣言的模型。\n\n## 命令行预测\n\n```shell\n$ hub run Rumor_prediction --input_text='兴仁县今天抢小孩没抢走，把孩子母亲捅了一刀，看见这车的注意了，真事，车牌号辽HFM055！！！！！赶紧散播！ 都别带孩子出去瞎转悠了 尤其别让老人自己带孩子出去 太危险了 注意了！！！！辽HFM055北京现代朗动，在各学校门口抢小孩！！！110已经 证实！！全市通缉！！'\n```\n\n## API\n\n```python\ndef Rumor(texts, use_gpu=False):\n```\n\n预测API，预测语句是否为谣言。\n\n**参数**\n\n* texts (list\\[str\\]): 想要预测是否为谣言的语句；\n* use\\_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA\\_VISIBLE\\_DEVICES环境变量**；\n\n**返回**\n\n* results (list[dict]): 预测结果的列表，列表中每一个元素为 dict，各字段为：\n\n    - content(str):输入文本内容\n    - prediction(str):预测结果\n    - probability(float):预测结果概率\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\nmodule = hub.Module(name=\"Rumor_prediction\")\n\ntest_texts = ['兴仁县今天抢小孩没抢走，把孩子母亲捅了一刀，看见这车的注意了，真事，车牌号辽HFM055！！！！！赶紧散播！ 都别带孩子出去瞎转悠了 尤其别让老人自己带孩子出去 太危险了 注意了！！！！辽HFM055北京现代朗动，在各学校门口抢小孩！！！110已经 证实！！全市通缉！！']\nresults = module.Rumor(texts=test_texts, use_gpu=True)\nprint(results)\n```\n\n\n### 依赖\n\npaddlepaddle >= 2.0.0rc1\n\npaddlehub >= 2.0.0rc0\n"
  },
  {
    "path": "modules/text/text_generation/Rumor_prediction/dict.txt",
    "content": "{'寒': 0, '煲': 1, '升': 2, '耳': 3, '孜': 4, '矶': 5, '惑': 6, '谩': 7, '奉': 8, '坛': 9, '嘗': 10, '索': 11, '所': 12, '藏': 13, '阜': 14, '孟': 15, '久': 16, '）': 17, '散': 18, '真': 19, '肇': 20, '保': 21, '蜜': 22, '丁': 23, '玉': 24, '伏': 25, '次': 26, '隽': 27, '囯': 28, '浊': 29, '沥': 30, '豪': 31, '果': 32, '卢': 33, '夏': 34, '朦': 35, '墓': 36, '圖': 37, '躬': 38, '铃': 39, '浇': 40, '反': 41, '瑩': 42, '慕': 43, '練': 44, '抨': 45, '喃': 46, '滑': 47, '亇': 48, '紅': 49, '拙': 50, '侍': 51, '卤': 52, '摄': 53, '〗': 54, '谤': 55, '跟': 56, '⒑': 57, '备': 58, '躺': 59, '稳': 60, '九': 61, '歉': 62, '味': 63, '莎': 64, '黍': 65, '涎': 66, '想': 67, '鳍': 68, '籠': 69, '臨': 70, '纶': 71, '性': 72, '推': 73, '殉': 74, '平': 75, '倍': 76, '洽': 77, '浸': 78, '裔': 79, '鹤': 80, '破': 81, '軟': 82, '尚': 83, '肃': 84, '凱': 85, '呼': 86, '踊': 87, '编': 88, '輯': 89, '病': 90, '勤': 91, '婴': 92, '枯': 93, '邦': 94, '隨': 95, '級': 96, '〝': 97, '奸': 98, '愧': 99, '团': 100, '济': 101, '董': 102, '艺': 103, '赢': 104, '泄': 105, '蜂': 106, '东': 107, '荆': 108, '汶': 109, '痰': 110, '溅': 111, '湾': 112, '咚': 113, '異': 114, '省': 115, '互': 116, '亂': 117, '耙': 118, '棒': 119, '判': 120, '绘': 121, '呐': 122, '掷': 123, '匿': 124, '韵': 125, '低': 126, '演': 127, '做': 128, '榕': 129, '郡': 130, '明': 131, '吞': 132, '7': 133, '侣': 134, '曼': 135, '炭': 136, '淘': 137, '當': 138, '寨': 139, '餘': 140, '力': 141, '覽': 142, '坏': 143, '肩': 144, '宿': 145, '舟': 146, '嘉': 147, '妹': 148, '各': 149, '著': 150, '归': 151, '遗': 152, '表': 153, '勋': 154, '》': 155, '拦': 156, '瞬': 157, '運': 158, '挖': 159, '谊': 160, '乒': 161, '忽': 162, 'お': 163, '伞': 164, '粤': 165, '曾': 166, '糍': 167, '墨': 168, '设': 169, '滞': 170, '踩': 171, '沛': 172, '盗': 173, '尢': 174, '慌': 175, 'w': 176, '币': 177, 'O': 178, '份': 179, '晨': 180, '菌': 181, '药': 182, '颅': 183, '碍': 184, '桐': 185, '驱': 186, '险': 187, '焖': 188, '仕': 189, '牒': 190, '功': 191, '万': 192, '恼': 193, '囤': 194, '狐': 195, '诸': 196, '憨': 197, '戈': 198, '雀': 199, '筆': 200, '咆': 201, '郅': 202, '残': 203, '刷': 204, '茄': 205, '垄': 206, '眾': 207, '偿': 208, 
'求': 209, '０': 210, 'g': 211, '荩': 212, '帳': 213, '襲': 214, '庞': 215, '逅': 216, '杆': 217, '埃': 218, '俊': 219, '缺': 220, '爭': 221, '坨': 222, '秃': 223, '遐': 224, '是': 225, '玮': 226, '邀': 227, '监': 228, '呢': 229, '曦': 230, '紹': 231, '惋': 232, '揣': 233, '铺': 234, '篇': 235, '獨': 236, '哀': 237, '趣': 238, '咩': 239, '澳': 240, '坪': 241, '冰': 242, '婶': 243, '烟': 244, '像': 245, '👍': 246, '庸': 247, '舞': 248, '父': 249, '\\ue415': 250, '貨': 251, '稠': 252, '锣': 253, '憶': 254, '鹅': 255, '苕': 256, '宋': 257, '机': 258, '.': 259, '危': 260, '鳝': 261, '御': 262, '隶': 263, '锥': 264, '失': 265, '第': 266, '座': 267, '★': 268, '宥': 269, '鞭': 270, '才': 271, '弃': 272, '憬': 273, '帝': 274, '\\ue021': 275, '睡': 276, '凿': 277, '瀟': 278, '帥': 279, '渢': 280, '说': 281, '疚': 282, '墀': 283, '榨': 284, '哑': 285, '吼': 286, '意': 287, '드': 288, '–': 289, '耍': 290, '劝': 291, '話': 292, '親': 293, '桩': 294, \"'\": 295, '酚': 296, '干': 297, '国': 298, '歼': 299, '蕴': 300, '酿': 301, '叠': 302, '派': 303, '嬛': 304, '韩': 305, '宫': 306, '仁': 307, '臭': 308, '牌': 309, '說': 310, '棕': 311, '舍': 312, '伊': 313, '卿': 314, '抱': 315, '蔚': 316, '遛': 317, '/': 318, '腰': 319, '違': 320, '纱': 321, '溯': 322, '\\u2029': 323, '怯': 324, '哎': 325, '曝': 326, '终': 327, '丨': 328, '逺': 329, '哩': 330, '警': 331, '捷': 332, '宙': 333, '峻': 334, '原': 335, '觀': 336, '蓋': 337, '竹': 338, '戴': 339, '聽': 340, '桓': 341, '沫': 342, '忐': 343, '杰': 344, '执': 345, '利': 346, '帽': 347, '嗷': 348, '枳': 349, '沪': 350, '率': 351, '雾': 352, '嚣': 353, '啸': 354, '乎': 355, '饮': 356, '独': 357, '添': 358, '走': 359, '涉': 360, '怪': 361, '羔': 362, '巾': 363, '盼': 364, '繁': 365, '呦': 366, '舌': 367, '斐': 368, '使': 369, '坐': 370, '依': 371, '啊': 372, '电': 373, '幺': 374, '沿': 375, '內': 376, '汪': 377, '称': 378, '妈': 379, '宏': 380, '柜': 381, '盲': 382, '蹒': 383, '開': 384, '稼': 385, '诈': 386, '瞰': 387, 'ㅋ': 388, '∩': 389, '嫉': 390, '泮': 391, '起': 392, '资': 393, '仍': 394, '憎': 395, '美': 396, '。': 397, '傈': 398, '裴': 399, '棺': 400, '弱': 401, '匪': 402, '箱': 403, '相': 404, '更': 405, '没': 406, 
'聚': 407, '跨': 408, '訴': 409, '龙': 410, '施': 411, '厌': 412, '梓': 413, '莺': 414, '阶': 415, '棋': 416, '专': 417, '挤': 418, '禮': 419, 'る': 420, '\\ue10c': 421, '巡': 422, '遥': 423, '日': 424, '岗': 425, '勝': 426, '殡': 427, '痴': 428, '措': 429, '狸': 430, '＃': 431, '歷': 432, '趁': 433, '殆': 434, '只': 435, '鼓': 436, '亞': 437, ' ': 438, '流': 439, '悲': 440, '噬': 441, '裤': 442, '拐': 443, '😠': 444, '狂': 445, '山': 446, '镇': 447, '稍': 448, '染': 449, '-': 450, '瑾': 451, '账': 452, 'l': 453, '誌': 454, '赡': 455, '지': 456, 'キ': 457, '谅': 458, '聘': 459, '绎': 460, '词': 461, '血': 462, '墙': 463, '℃': 464, '嫖': 465, '尺': 466, '活': 467, '脍': 468, '担': 469, '男': 470, '掉': 471, '咒': 472, '吸': 473, '痞': 474, '根': 475, '晏': 476, '仨': 477, '急': 478, '怠': 479, '履': 480, '洼': 481, '唾': 482, '懷': 483, '妆': 484, '单': 485, '肾': 486, '奧': 487, '薪': 488, '皂': 489, '参': 490, '朔': 491, '甲': 492, '钉': 493, '雖': 494, '希': 495, '冬': 496, '摩': 497, '谎': 498, '铂': 499, '蹄': 500, '壮': 501, '纺': 502, '岛': 503, '伴': 504, '贱': 505, '柯': 506, '拒': 507, '鲑': 508, '童': 509, '怡': 510, '績': 511, 'で': 512, '邻': 513, '班': 514, '藉': 515, '锐': 516, '鄙': 517, '蛰': 518, '告': 519, '⒒': 520, '浙': 521, '近': 522, '屈': 523, '喝': 524, '呛': 525, '痛': 526, '甚': 527, '铜': 528, '巅': 529, '盾': 530, '爵': 531, '段': 532, '貓': 533, '紀': 534, '臂': 535, '載': 536, '扁': 537, '😜': 538, '焚': 539, '厕': 540, '︰': 541, '谭': 542, '粱': 543, '殒': 544, '睐': 545, '夫': 546, '淞': 547, '骚': 548, '凳': 549, '洪': 550, '碎': 551, 'C': 552, '全': 553, '以': 554, '霉': 555, '放': 556, '觅': 557, '磕': 558, '励': 559, '搜': 560, '膊': 561, '畫': 562, '熊': 563, '罐': 564, '闸': 565, '歆': 566, '虹': 567, '估': 568, '落': 569, '經': 570, '拼': 571, '挺': 572, '糙': 573, '鉴': 574, '豁': 575, '捆': 576, '比': 577, '濛': 578, '初': 579, '属': 580, '寫': 581, '候': 582, '參': 583, '碳': 584, '哟': 585, '姜': 586, '垢': 587, '券': 588, '慑': 589, '点': 590, '己': 591, '霞': 592, '纸': 593, '哥': 594, '赎': 595, '妞': 596, '勲': 597, '刁': 598, '胃': 599, '韭': 600, '註': 601, '詐': 602, '燮': 603, '群': 604, '庙': 605, 
'來': 606, '仗': 607, '9': 608, '探': 609, '蝶': 610, '傅': 611, '徽': 612, '缤': 613, '^': 614, '堡': 615, '赏': 616, '蛆': 617, '烩': 618, '準': 619, '朵': 620, '吃': 621, '嘴': 622, '典': 623, '端': 624, '連': 625, '趟': 626, '欲': 627, '『': 628, '馒': 629, '神': 630, '拯': 631, '芸': 632, '防': 633, '竣': 634, '时': 635, '輕': 636, '却': 637, '泳': 638, '陡': 639, '冒': 640, '💖': 641, '托': 642, '鹫': 643, '姊': 644, '嘲': 645, '枸': 646, '总': 647, '绿': 648, '症': 649, '练': 650, '耕': 651, '野': 652, '强': 653, '匆': 654, '🙏': 655, '吶': 656, 'o': 657, '包': 658, '幣': 659, '央': 660, '惮': 661, '險': 662, '爬': 663, '猪': 664, '邯': 665, '妖': 666, '挣': 667, '世': 668, '登': 669, '女': 670, '佐': 671, '笙': 672, '×': 673, '你': 674, '肆': 675, '池': 676, '鳄': 677, '蒂': 678, '腕': 679, '囡': 680, '娅': 681, '°': 682, '徇': 683, '沱': 684, '恢': 685, '“': 686, 'I': 687, '恭': 688, '缝': 689, '肮': 690, '就': 691, '眶': 692, '席': 693, '據': 694, '剂': 695, '哄': 696, '谈': 697, '岔': 698, '瞒': 699, '坦': 700, '忑': 701, '赈': 702, '雷': 703, '辰': 704, 'e': 705, '荥': 706, '闯': 707, '純': 708, '揽': 709, '林': 710, '巴': 711, '逞': 712, '串': 713, '璨': 714, '聊': 715, '偌': 716, '斑': 717, '暄': 718, '计': 719, '会': 720, '琪': 721, '⒊': 722, '吹': 723, '碟': 724, '胚': 725, '陣': 726, '饭': 727, '🔴': 728, '友': 729, '招': 730, '扯': 731, '武': 732, '錄': 733, '後': 734, '敖': 735, '审': 736, '鸟': 737, '筑': 738, '稽': 739, '吵': 740, '制': 741, '俄': 742, '逮': 743, '毙': 744, '摘': 745, '巫': 746, '姣': 747, '從': 748, '瑰': 749, '闻': 750, '队': 751, '汲': 752, '听': 753, '邓': 754, '逆': 755, '隔': 756, '袒': 757, '芮': 758, '肺': 759, '汗': 760, '权': 761, '注': 762, '华': 763, '技': 764, '肓': 765, '”': 766, '愚': 767, '奠': 768, '呃': 769, '壹': 770, '搽': 771, '榜': 772, '莫': 773, '邮': 774, '狱': 775, '镑': 776, '雁': 777, '殊': 778, '貌': 779, '两': 780, '璃': 781, '关': 782, '吻': 783, '悉': 784, '惊': 785, '靴': 786, '手': 787, '姨': 788, '朴': 789, '修': 790, '谄': 791, '必': 792, '熱': 793, '煞': 794, '煜': 795, '廉': 796, '炅': 797, '照': 798, '睿': 799, 'う': 800, '呀': 801, '甜': 802, '珞': 803, '攬': 804, '简': 805, 
'牧': 806, '漳': 807, '狼': 808, '契': 809, '焉': 810, '糨': 811, '賤': 812, '庄': 813, '於': 814, '\\u3000': 815, '慨': 816, '吧': 817, '交': 818, '赴': 819, '薰': 820, '磋': 821, '囗': 822, '诺': 823, '龜': 824, '孀': 825, '绝': 826, '旧': 827, '擀': 828, '録': 829, '秉': 830, '淋': 831, '料': 832, '碗': 833, '七': 834, '降': 835, '乾': 836, '叨': 837, '確': 838, '韧': 839, '廳': 840, '胖': 841, '階': 842, '肿': 843, '断': 844, '汹': 845, '伪': 846, '且': 847, '烧': 848, '銀': 849, '蚌': 850, '翼': 851, '纳': 852, '斌': 853, '侃': 854, '规': 855, '款': 856, '路': 857, '拧': 858, '别': 859, '协': 860, '矮': 861, '悬': 862, '場': 863, '•': 864, '寺': 865, '昨': 866, '尘': 867, '藕': 868, '能': 869, '講': 870, '蛮': 871, '곤': 872, '澡': 873, '炫': 874, '写': 875, '够': 876, '胞': 877, '藩': 878, '赦': 879, '鈞': 880, '〖': 881, '迁': 882, '灿': 883, '桦': 884, '瞎': 885, '戲': 886, '迦': 887, '楷': 888, '玄': 889, '哮': 890, '古': 891, 'N': 892, '配': 893, '弄': 894, '太': 895, '都': 896, '盯': 897, '邹': 898, '隻': 899, '🎯': 900, '靠': 901, '谱': 902, '任': 903, '应': 904, '約': 905, '攸': 906, '恨': 907, '邵': 908, '尿': 909, '岖': 910, '煮': 911, '柄': 912, '珀': 913, '还': 914, '削': 915, '輸': 916, '诿': 917, '秩': 918, '\\xa0': 919, '喽': 920, '吳': 921, '説': 922, 'E': 923, '勃': 924, '紫': 925, '补': 926, '痨': 927, '卷': 928, '巢': 929, '拢': 930, '對': 931, '浮': 932, '期': 933, '兰': 934, '勁': 935, '死': 936, '传': 937, '備': 938, '篡': 939, '瓤': 940, '醇': 941, '錢': 942, '強': 943, '狰': 944, '蛀': 945, '健': 946, '键': 947, '圳': 948, '丧': 949, '拳': 950, '沈': 951, '捉': 952, '浆': 953, '金': 954, '品': 955, '悚': 956, '佈': 957, '愫': 958, '株': 959, '陀': 960, '廣': 961, '斤': 962, '烛': 963, '连': 964, '癌': 965, '晤': 966, '诛': 967, '倫': 968, '→': 969, '梧': 970, '瀬': 971, '蜗': 972, '刨': 973, '叮': 974, '戰': 975, '界': 976, '婷': 977, '拷': 978, '飙': 979, '绷': 980, '开': 981, '還': 982, '蚝': 983, '暗': 984, '焦': 985, '右': 986, '<': 987, '脑': 988, '攀': 989, '蹋': 990, '源': 991, '热': 992, '引': 993, '圓': 994, '咂': 995, '乌': 996, '塚': 997, '银': 998, '館': 999, '范': 1000, '乍': 1001, '均': 1002, '圣': 1003, '舱': 
1004, '凑': 1005, '青': 1006, '寂': 1007, '馅': 1008, '惫': 1009, '😂': 1010, '曰': 1011, '戮': 1012, '砸': 1013, '逐': 1014, '⚠': 1015, '奚': 1016, '榄': 1017, '屉': 1018, '炮': 1019, '統': 1020, '樟': 1021, '谙': 1022, '肉': 1023, '蝴': 1024, '4': 1025, '栽': 1026, '葡': 1027, '诞': 1028, '嚏': 1029, '无': 1030, '沢': 1031, '夸': 1032, '娆': 1033, '限': 1034, '跷': 1035, '样': 1036, '势': 1037, '虫': 1038, '频': 1039, '裙': 1040, '糗': 1041, '涵': 1042, '禽': 1043, '終': 1044, '搏': 1045, '勇': 1046, '秦': 1047, 'θ': 1048, '#': 1049, '＆': 1050, '抠': 1051, '磅': 1052, '垃': 1053, '耀': 1054, '律': 1055, '适': 1056, '究': 1057, '杂': 1058, '堵': 1059, '迷': 1060, '钻': 1061, '缆': 1062, '职': 1063, '共': 1064, '濃': 1065, '滋': 1066, '張': 1067, '剔': 1068, '层': 1069, '媽': 1070, '恕': 1071, '细': 1072, '體': 1073, '麒': 1074, '刊': 1075, '俏': 1076, '傻': 1077, '莱': 1078, '策': 1079, '浓': 1080, '离': 1081, '鸭': 1082, 'c': 1083, '釜': 1084, '蛩': 1085, '本': 1086, '龄': 1087, '忌': 1088, '载': 1089, '訪': 1090, '泥': 1091, '朽': 1092, '叶': 1093, '字': 1094, '盐': 1095, '争': 1096, '尹': 1097, '扣': 1098, '场': 1099, '螺': 1100, '文': 1101, '挨': 1102, '炎': 1103, '竿': 1104, '恃': 1105, '贡': 1106, '堰': 1107, '栖': 1108, '捏': 1109, '≪': 1110, '腊': 1111, '杖': 1112, '肚': 1113, '幾': 1114, '＜': 1115, '饥': 1116, '醒': 1117, '掼': 1118, '束': 1119, '再': 1120, '叫': 1121, '湯': 1122, '扇': 1123, '緯': 1124, '亊': 1125, '撤': 1126, '５': 1127, '室': 1128, '離': 1129, '严': 1130, '压': 1131, '霖': 1132, '魅': 1133, '改': 1134, '樽': 1135, '腥': 1136, '歲': 1137, '谜': 1138, '優': 1139, '矩': 1140, '顏': 1141, '喔': 1142, '旁': 1143, '聂': 1144, '缓': 1145, '勾': 1146, '寄': 1147, '棠': 1148, '纹': 1149, '轿': 1150, '触': 1151, '先': 1152, '投': 1153, '⒍': 1154, '傑': 1155, '鹰': 1156, '趴': 1157, '霜': 1158, '酬': 1159, '⒔': 1160, '拎': 1161, '澜': 1162, '盎': 1163, '蚁': 1164, '南': 1165, '焱': 1166, '飏': 1167, '讯': 1168, '胡': 1169, '谦': 1170, '篪': 1171, '按': 1172, '恵': 1173, '辽': 1174, '寓': 1175, '祷': 1176, '峯': 1177, '档': 1178, '尸': 1179, '‘': 1180, '牛': 1181, '遨': 1182, '匣': 1183, '拭': 1184, '赶': 1185, 
'润': 1186, '捧': 1187, '薦': 1188, '桢': 1189, '踮': 1190, '祈': 1191, '洞': 1192, '疱': 1193, '杞': 1194, '侬': 1195, '则': 1196, '圭': 1197, '痔': 1198, '认': 1199, '泡': 1200, '宪': 1201, '抉': 1202, '衙': 1203, '欧': 1204, '擁': 1205, '哈': 1206, '砣': 1207, '膳': 1208, '科': 1209, '睬': 1210, '買': 1211, '藥': 1212, '缠': 1213, '永': 1214, '啲': 1215, '我': 1216, '捞': 1217, '杏': 1218, '敬': 1219, '持': 1220, '牺': 1221, '陂': 1222, '辛': 1223, '慧': 1224, '傳': 1225, '汽': 1226, '雉': 1227, '饪': 1228, '打': 1229, '分': 1230, '姑': 1231, '竟': 1232, '娜': 1233, '筋': 1234, '殴': 1235, '乳': 1236, '朋': 1237, '负': 1238, '靓': 1239, '潮': 1240, '织': 1241, '洋': 1242, '揉': 1243, '象': 1244, '齊': 1245, '顺': 1246, '漉': 1247, '⒉': 1248, '挡': 1249, '冧': 1250, '咔': 1251, '角': 1252, '网': 1253, '遍': 1254, '尤': 1255, '茉': 1256, '搀': 1257, '\\u200a': 1258, '豚': 1259, '绑': 1260, '绵': 1261, '實': 1262, '骇': 1263, '滩': 1264, '彼': 1265, '桔': 1266, '槟': 1267, '哆': 1268, '头': 1269, '旭': 1270, '芳': 1271, '喉': 1272, '又': 1273, '脏': 1274, '几': 1275, '羽': 1276, '鑫': 1277, '沧': 1278, '「': 1279, '净': 1280, '驰': 1281, '帘': 1282, '企': 1283, '绯': 1284, '啪': 1285, '献': 1286, '掌': 1287, '赫': 1288, '癫': 1289, '诉': 1290, '承': 1291, '列': 1292, '緣': 1293, '复': 1294, '天': 1295, '丈': 1296, '元': 1297, '货': 1298, '辱': 1299, '糕': 1300, '咽': 1301, '厥': 1302, '地': 1303, '伶': 1304, '谨': 1305, '魄': 1306, '識': 1307, '孕': 1308, '負': 1309, '存': 1310, '⑥': 1311, '宁': 1312, '闺': 1313, '个': 1314, '虏': 1315, '暖': 1316, '冤': 1317, '母': 1318, '组': 1319, '燃': 1320, '憋': 1321, '厨': 1322, '咸': 1323, '贿': 1324, '捶': 1325, '租': 1326, '毒': 1327, '炳': 1328, '熔': 1329, '澄': 1330, '抑': 1331, '領': 1332, '惭': 1333, '满': 1334, '菇': 1335, '另': 1336, '旋': 1337, '柏': 1338, '些': 1339, '质': 1340, '撇': 1341, '恰': 1342, '臣': 1343, '丛': 1344, '沇': 1345, '远': 1346, '烂': 1347, '债': 1348, '批': 1349, '菊': 1350, '夜': 1351, '锻': 1352, '嚓': 1353, '傍': 1354, '邡': 1355, '晓': 1356, '岸': 1357, '爱': 1358, '毕': 1359, '漓': 1360, '锡': 1361, '⒕': 1362, '访': 1363, '豆': 1364, '沾': 1365, '牢': 1366, 
'惠': 1367, '豹': 1368, '念': 1369, '唤': 1370, '扭': 1371, '網': 1372, '爷': 1373, '錯': 1374, '旅': 1375, '休': 1376, '桶': 1377, '疼': 1378, '📢': 1379, '铁': 1380, '叙': 1381, '楼': 1382, '辟': 1383, '搞': 1384, 'て': 1385, '台': 1386, '炽': 1387, '侯': 1388, '霓': 1389, '粹': 1390, '卦': 1391, '煎': 1392, '枪': 1393, '高': 1394, '叟': 1395, '巧': 1396, '桥': 1397, '跪': 1398, '萝': 1399, '唇': 1400, '苑': 1401, '旗': 1402, '渊': 1403, '葩': 1404, '晾': 1405, '伦': 1406, '受': 1407, '椒': 1408, '姚': 1409, '梗': 1410, '尬': 1411, '局': 1412, '庝': 1413, '兲': 1414, '竞': 1415, '被': 1416, '雞': 1417, '覺': 1418, '攪': 1419, '惘': 1420, '丘': 1421, '闷': 1422, '擦': 1423, '沟': 1424, '皮': 1425, '炼': 1426, '礦': 1427, '叹': 1428, '检': 1429, '陈': 1430, '胎': 1431, '👏': 1432, '甘': 1433, '颍': 1434, '萬': 1435, '部': 1436, '楚': 1437, '隋': 1438, '燈': 1439, '客': 1440, '⒓': 1441, '襟': 1442, '悠': 1443, '葫': 1444, '着': 1445, '徹': 1446, '撅': 1447, '弘': 1448, '琅': 1449, '怨': 1450, '＋': 1451, '披': 1452, '筠': 1453, '习': 1454, '停': 1455, '翻': 1456, '寿': 1457, '寝': 1458, '维': 1459, '漏': 1460, '程': 1461, '向': 1462, '=': 1463, '拘': 1464, '乙': 1465, '將': 1466, '姥': 1467, '柳': 1468, '冯': 1469, '搖': 1470, '吠': 1471, '上': 1472, '蹈': 1473, 'M': 1474, '倔': 1475, '痤': 1476, '腺': 1477, '须': 1478, '秤': 1479, '姿': 1480, '逛': 1481, 'S': 1482, '窈': 1483, '彰': 1484, '黎': 1485, '帷': 1486, '+': 1487, '县': 1488, '釧': 1489, '觊': 1490, '扒': 1491, '幼': 1492, '崖': 1493, '多': 1494, '峡': 1495, '动': 1496, '溃': 1497, '翠': 1498, '液': 1499, '抗': 1500, '拋': 1501, '管': 1502, 'K': 1503, '睛': 1504, '案': 1505, '宅': 1506, '鲲': 1507, '扬': 1508, '折': 1509, '珍': 1510, '幫': 1511, '届': 1512, '節': 1513, '嚷': 1514, '問': 1515, '虞': 1516, '校': 1517, '造': 1518, '憧': 1519, '退': 1520, '祎': 1521, '溜': 1522, '役': 1523, '逼': 1524, '➊': 1525, '語': 1526, '超': 1527, '辜': 1528, '４': 1529, '奋': 1530, '虚': 1531, '卑': 1532, '袁': 1533, '\\ue00e': 1534, '嘅': 1535, '骸': 1536, 'サ': 1537, '僳': 1538, '芦': 1539, '股': 1540, '舰': 1541, '奕': 1542, '撞': 1543, '癢': 1544, '膨': 1545, '攫': 1546, '伤': 1547, 
'枭': 1548, '诅': 1549, '哨': 1550, '荡': 1551, '膛': 1552, '爸': 1553, '沉': 1554, '悟': 1555, '蹦': 1556, '陳': 1557, '弯': 1558, '梨': 1559, '脉': 1560, '烈': 1561, '蘇': 1562, '肘': 1563, '确': 1564, '漆': 1565, '8': 1566, '钊': 1567, '获': 1568, '噱': 1569, '刺': 1570, '丽': 1571, '扩': 1572, '领': 1573, '潇': 1574, '即': 1575, '把': 1576, '撕': 1577, ',': 1578, '吟': 1579, '饨': 1580, '隘': 1581, 'i': 1582, '夠': 1583, '郝': 1584, '者': 1585, '渠': 1586, '淄': 1587, '嵌': 1588, '幻': 1589, '鸣': 1590, '兑': 1591, 'ャ': 1592, '脊': 1593, '和': 1594, '柒': 1595, '簿': 1596, '匀': 1597, '缩': 1598, '井': 1599, '隆': 1600, '龍': 1601, '寸': 1602, '浴': 1603, '将': 1604, '徙': 1605, '塔': 1606, '定': 1607, '營': 1608, '⒖': 1609, '評': 1610, '或': 1611, '鸡': 1612, '轉': 1613, '崩': 1614, '矢': 1615, '甄': 1616, '晒': 1617, '喵': 1618, '窦': 1619, '⒌': 1620, '環': 1621, '姗': 1622, '❤': 1623, '齿': 1624, '阱': 1625, '北': 1626, '抵': 1627, '眈': 1628, '舅': 1629, '伙': 1630, '陷': 1631, '剥': 1632, '淀': 1633, '恍': 1634, '蔥': 1635, '宛': 1636, '卻': 1637, '览': 1638, '應': 1639, '動': 1640, '顿': 1641, '义': 1642, '炜': 1643, '奖': 1644, '琍': 1645, '啬': 1646, '匡': 1647, '狄': 1648, '欢': 1649, '阖': 1650, '方': 1651, '↓': 1652, '劑': 1653, '占': 1654, '贬': 1655, '观': 1656, '弧': 1657, '口': 1658, '蘋': 1659, '封': 1660, '拽': 1661, '哇': 1662, '船': 1663, '畜': 1664, '洗': 1665, '嘟': 1666, '忡': 1667, '佑': 1668, '贞': 1669, '俩': 1670, '它': 1671, '埋': 1672, '／': 1673, '殺': 1674, '窘': 1675, '兹': 1676, '纬': 1677, '桑': 1678, '迭': 1679, '卖': 1680, '➋': 1681, '躲': 1682, '驻': 1683, '阀': 1684, '穎': 1685, '嗨': 1686, '簸': 1687, '腔': 1688, '🔲': 1689, '努': 1690, '剁': 1691, '擅': 1692, '欺': 1693, '⒐': 1694, '唔': 1695, '们': 1696, '逝': 1697, '斓': 1698, '积': 1699, '烨': 1700, 'R': 1701, '陸': 1702, '悔': 1703, '非': 1704, '耗': 1705, '园': 1706, '嘎': 1707, '蝎': 1708, '咙': 1709, '侨': 1710, '痘': 1711, '曹': 1712, '侥': 1713, '接': 1714, '咖': 1715, '９': 1716, '住': 1717, '玛': 1718, '鞠': 1719, '脾': 1720, '撼': 1721, '火': 1722, '剩': 1723, '牙': 1724, '酋': 1725, '韶': 1726, '目': 1727, '论': 1728, '环': 
1729, '６': 1730, '祛': 1731, '喊': 1732, '娘': 1733, '抄': 1734, '构': 1735, '嗲': 1736, '缮': 1737, '贤': 1738, '遣': 1739, '竺': 1740, '缙': 1741, '雅': 1742, '摇': 1743, '间': 1744, '刀': 1745, '拍': 1746, '（': 1747, '庐': 1748, '胺': 1749, '携': 1750, '价': 1751, '合': 1752, '益': 1753, '溝': 1754, '電': 1755, '佢': 1756, '黑': 1757, '骗': 1758, '亿': 1759, '阉': 1760, '坼': 1761, '趋': 1762, '蕉': 1763, '侠': 1764, '昌': 1765, '素': 1766, '飯': 1767, '僧': 1768, '逻': 1769, '赌': 1770, '尊': 1771, '紋': 1772, '彬': 1773, '庆': 1774, '找': 1775, '讲': 1776, '…': 1777, '雇': 1778, '纪': 1779, 'J': 1780, '」': 1781, '杯': 1782, '獎': 1783, '吕': 1784, '皓': 1785, '沁': 1786, '椽': 1787, '出': 1788, '邱': 1789, '咗': 1790, '？': 1791, '充': 1792, '阳': 1793, '\\ue141': 1794, '扶': 1795, '亢': 1796, '逃': 1797, '河': 1798, '治': 1799, '愿': 1800, '际': 1801, '图': 1802, '拔': 1803, '祸': 1804, '墟': 1805, '横': 1806, '啦': 1807, '炒': 1808, '首': 1809, '證': 1810, '丢': 1811, '芜': 1812, '少': 1813, '敞': 1814, '诫': 1815, '陆': 1816, '`': 1817, '旬': 1818, '刑': 1819, '行': 1820, '．': 1821, 'é': 1822, '删': 1823, '犬': 1824, '邪': 1825, '亨': 1826, '*': 1827, '巳': 1828, '虑': 1829, '灵': 1830, '箭': 1831, '倡': 1832, '隧': 1833, '懒': 1834, '疡': 1835, '已': 1836, '摔': 1837, '谋': 1838, '讼': 1839, '衡': 1840, '妥': 1841, '鞋': 1842, '区': 1843, '仲': 1844, '盘': 1845, '腚': 1846, '沒': 1847, '拌': 1848, '蒸': 1849, '侵': 1850, '迹': 1851, '守': 1852, '湿': 1853, '達': 1854, '骏': 1855, '萧': 1856, '硝': 1857, '麻': 1858, '颗': 1859, '柔': 1860, '昧': 1861, '堪': 1862, '晟': 1863, '衔': 1864, '杠': 1865, '啖': 1866, '戟': 1867, '睹': 1868, '异': 1869, 'h': 1870, '┭': 1871, '迢': 1872, '蕾': 1873, '怜': 1874, '缴': 1875, '印': 1876, '醫': 1877, '袍': 1878, '妊': 1879, '录': 1880, '嘈': 1881, '蕭': 1882, '闹': 1883, '支': 1884, '唐': 1885, '星': 1886, '订': 1887, '烦': 1888, '齒': 1889, '甫': 1890, '既': 1891, '疮': 1892, '绪': 1893, '皇': 1894, '莲': 1895, '志': 1896, '涡': 1897, '偎': 1898, '胁': 1899, '疹': 1900, '勺': 1901, '因': 1902, '杜': 1903, '宠': 1904, '渎': 1905, '贯': 1906, '瓦': 1907, '衅': 1908, '叩': 1909, '瘀': 
1910, '直': 1911, '肥': 1912, '许': 1913, '京': 1914, '敲': 1915, '褶': 1916, '沸': 1917, '毁': 1918, '讨': 1919, '屿': 1920, '值': 1921, '蹭': 1922, '芩': 1923, '街': 1924, '馨': 1925, '髦': 1926, '湧': 1927, '粵': 1928, '玻': 1929, '朱': 1930, '凌': 1931, '汕': 1932, '絕': 1933, '謝': 1934, '完': 1935, '函': 1936, '龚': 1937, '飽': 1938, '檐': 1939, '猫': 1940, '坍': 1941, '微': 1942, '跌': 1943, '奏': 1944, '仙': 1945, '站': 1946, '彪': 1947, '尔': 1948, '迈': 1949, '节': 1950, '尽': 1951, '诠': 1952, '乏': 1953, '犯': 1954, '研': 1955, '宰': 1956, '厮': 1957, '項': 1958, '搬': 1959, '忘': 1960, '当': 1961, '怀': 1962, '冲': 1963, '侄': 1964, '骤': 1965, '況': 1966, '會': 1967, '卸': 1968, '泾': 1969, '毯': 1970, '剑': 1971, '见': 1972, '蔗': 1973, '輩': 1974, '季': 1975, '珊': 1976, '嚕': 1977, '稣': 1978, '建': 1979, '误': 1980, '询': 1981, '茂': 1982, '獠': 1983, '潘': 1984, '舆': 1985, '嫁': 1986, '砂': 1987, '係': 1988, '仅': 1989, '茫': 1990, '酥': 1991, '茎': 1992, '汾': 1993, '﹣': 1994, '凶': 1995, '居': 1996, '喂': 1997, '搅': 1998, '璋': 1999, '羁': 2000, '挥': 2001, '回': 2002, '囊': 2003, '赞': 2004, '揪': 2005, '浦': 2006, '椰': 2007, '衷': 2008, '：': 2009, '汤': 2010, '編': 2011, '裏': 2012, '续': 2013, '广': 2014, '靡': 2015, '困': 2016, '選': 2017, '今': 2018, '垫': 2019, '崴': 2020, '车': 2021, '择': 2022, '饼': 2023, '炬': 2024, '傲': 2025, '組': 2026, '若': 2027, '敌': 2028, '疽': 2029, '骄': 2030, '誓': 2031, '温': 2032, '攝': 2033, '忻': 2034, '千': 2035, '綠': 2036, '辑': 2037, '佯': 2038, '傾': 2039, '桃': 2040, '抿': 2041, '踏': 2042, '豫': 2043, '态': 2044, '❌': 2045, '抹': 2046, '懈': 2047, '员': 2048, '对': 2049, '圾': 2050, '潭': 2051, '孔': 2052, '看': 2053, '鬼': 2054, '假': 2055, '呱': 2056, '號': 2057, '鍾': 2058, 'も': 2059, '疗': 2060, '谷': 2061, '彗': 2062, '丝': 2063, '之': 2064, '阪': 2065, '帮': 2066, '侧': 2067, '付': 2068, '祀': 2069, '苯': 2070, '诚': 2071, '歪': 2072, '举': 2073, '加': 2074, '婺': 2075, '窃': 2076, '👽': 2077, '容': 2078, '切': 2079, '锦': 2080, '唉': 2081, '弊': 2082, '及': 2083, '寻': 2084, '式': 2085, '页': 2086, '随': 2087, '钟': 2088, '炙': 2089, '颐': 2090, '瘦': 2091, 
'肤': 2092, '２': 2093, '絮': 2094, '畔': 2095, '娟': 2096, '⑤': 2097, '晰': 2098, '馆': 2099, '疏': 2100, '砧': 2101, '挂': 2102, '視': 2103, '浔': 2104, '丫': 2105, '１': 2106, '纷': 2107, '掏': 2108, '释': 2109, '惟': 2110, '家': 2111, '芥': 2112, '侮': 2113, '挝': 2114, '狠': 2115, '畸': 2116, 'A': 2117, '殃': 2118, '鲁': 2119, '琴': 2120, '枉': 2121, '佳': 2122, '菲': 2123, 'ン': 2124, '甩': 2125, '唱': 2126, '糟': 2127, '徨': 2128, '进': 2129, '忆': 2130, '蚂': 2131, '氣': 2132, '諾': 2133, '敦': 2134, '叭': 2135, '梳': 2136, '庇': 2137, '球': 2138, '饺': 2139, 'V': 2140, '增': 2141, '《': 2142, '亏': 2143, '匹': 2144, '楠': 2145, '畅': 2146, '暮': 2147, '物': 2148, '屠': 2149, '税': 2150, '魏': 2151, '碰': 2152, '［': 2153, '鲜': 2154, '蟹': 2155, '縛': 2156, '基': 2157, '蔡': 2158, '爽': 2159, '導': 2160, '级': 2161, '赛': 2162, '项': 2163, '寞': 2164, '湘': 2165, '渴': 2166, '么': 2167, '稚': 2168, '冷': 2169, '轩': 2170, '\\ue419': 2171, '教': 2172, '爪': 2173, '淆': 2174, '轻': 2175, '靈': 2176, '融': 2177, '衩': 2178, '結': 2179, '喱': 2180, '曉': 2181, '贴': 2182, '云': 2183, '尝': 2184, '紧': 2185, '慘': 2186, '线': 2187, '笋': 2188, '暴': 2189, '數': 2190, '不': 2191, '拖': 2192, '滤': 2193, '秀': 2194, '蜀': 2195, '愤': 2196, '易': 2197, '导': 2198, '玲': 2199, '蛇': 2200, '奂': 2201, '挫': 2202, '嘛': 2203, '腻': 2204, '雯': 2205, '阔': 2206, '实': 2207, '蛊': 2208, '叼': 2209, '经': 2210, '廊': 2211, '拓': 2212, '达': 2213, '混': 2214, '仆': 2215, '痕': 2216, '较': 2217, '信': 2218, '镌': 2219, '荣': 2220, '羊': 2221, '吴': 2222, '苟': 2223, '借': 2224, '郑': 2225, '祠': 2226, '喜': 2227, '歌': 2228, '况': 2229, '桉': 2230, '笔': 2231, '聆': 2232, '树': 2233, '啃': 2234, '飞': 2235, '从': 2236, '門': 2237, 'G': 2238, '仓': 2239, '位': 2240, '欣': 2241, '音': 2242, '扑': 2243, '❗': 2244, '透': 2245, '述': 2246, '報': 2247, '咎': 2248, '肌': 2249, '吊': 2250, '了': 2251, '贾': 2252, '半': 2253, '截': 2254, '‼': 2255, '允': 2256, '瞄': 2257, '奴': 2258, '鹿': 2259, '蓆': 2260, 'め': 2261, '故': 2262, '革': 2263, '循': 2264, '诩': 2265, '拉': 2266, '\\ue112': 2267, '〜': 2268, '粘': 2269, '眨': 2270, '垮': 2271, '⒋': 
2272, '≧': 2273, '呸': 2274, '量': 2275, '氰': 2276, '涩': 2277, '吁': 2278, '瑜': 2279, '有': 2280, '罚': 2281, '邢': 2282, '英': 2283, '鼠': 2284, '蜘': 2285, '⑦': 2286, '別': 2287, '際': 2288, '记': 2289, '麼': 2290, '城': 2291, '邊': 2292, '哉': 2293, '茹': 2294, '矣': 2295, '聞': 2296, '航': 2297, '瘙': 2298, '椅': 2299, '泰': 2300, '屬': 2301, '蹂': 2302, '咁': 2303, '躁': 2304, '|': 2305, '变': 2306, '胜': 2307, '调': 2308, '疆': 2309, '该': 2310, '亡': 2311, '晔': 2312, '窒': 2313, '罡': 2314, '核': 2315, '·': 2316, '糠': 2317, '旨': 2318, '钱': 2319, '凰': 2320, '民': 2321, '祥': 2322, '洒': 2323, '锅': 2324, '悄': 2325, '迂': 2326, '器': 2327, '戳': 2328, '蒲': 2329, '诙': 2330, '喳': 2331, '為': 2332, '雨': 2333, '旻': 2334, '灼': 2335, '肝': 2336, '匠': 2337, '土': 2338, '琳': 2339, '惩': 2340, '・': 2341, '姐': 2342, '彩': 2343, '障': 2344, '進': 2345, '劵': 2346, '理': 2347, '沏': 2348, '外': 2349, '佛': 2350, 'か': 2351, '裝': 2352, '皙': 2353, '颇': 2354, '肪': 2355, '崔': 2356, '嚼': 2357, '讳': 2358, '救': 2359, '淮': 2360, '烁': 2361, '搂': 2362, '⒎': 2363, '臀': 2364, '💗': 2365, '诀': 2366, '踪': 2367, '辆': 2368, '殇': 2369, '岁': 2370, '猥': 2371, '墩': 2372, '晃': 2373, '渔': 2374, '腐': 2375, '觉': 2376, '吨': 2377, '芙': 2378, '🇸': 2379, '服': 2380, '需': 2381, 't': 2382, '琨': 2383, '丐': 2384, '昼': 2385, '兜': 2386, '事': 2387, '谬': 2388, '氛': 2389, '菠': 2390, '介': 2391, '径': 2392, '俐': 2393, '黯': 2394, '3': 2395, '陕': 2396, '➍': 2397, '蝙': 2398, '岐': 2399, '藝': 2400, '黏': 2401, '蓉': 2402, '陶': 2403, '准': 2404, '追': 2405, '衝': 2406, '雌': 2407, '沃': 2408, '關': 2409, '贝': 2410, 'd': 2411, '博': 2412, '速': 2413, '洁': 2414, '珐': 2415, '督': 2416, '瑞': 2417, '步': 2418, '嗯': 2419, '贸': 2420, '喀': 2421, '拟': 2422, '件': 2423, '💓': 2424, '生': 2425, '钨': 2426, '！': 2427, '機': 2428, '\\ue41d': 2429, '皱': 2430, '族': 2431, '僭': 2432, '镐': 2433, '精': 2434, '艘': 2435, '镖': 2436, '曙': 2437, '扔': 2438, '😚': 2439, '勉': 2440, '疯': 2441, '赋': 2442, '騙': 2443, '徐': 2444, '塑': 2445, '凭': 2446, '人': 2447, '川': 2448, '\\ue333': 2449, '弈': 2450, '賀': 2451, '党': 2452, 
'始': 2453, 'v': 2454, '腋': 2455, '致': 2456, '隊': 2457, '丸': 2458, '😭': 2459, '格': 2460, '幸': 2461, '與': 2462, '淌': 2463, '掩': 2464, '待': 2465, '于': 2466, '悍': 2467, '蹲': 2468, '难': 2469, '禺': 2470, '可': 2471, '義': 2472, '䄂': 2473, '谢': 2474, '咕': 2475, '毬': 2476, '喇': 2477, '戸': 2478, '魚': 2479, '娠': 2480, '圈': 2481, '弓': 2482, '蒋': 2483, '掘': 2484, '滾': 2485, '谶': 2486, '孱': 2487, '購': 2488, '躏': 2489, '呵': 2490, '焯': 2491, '\\ue418': 2492, '仰': 2493, '密': 2494, '苗': 2495, '纠': 2496, '霆': 2497, '臥': 2498, '灬': 2499, '願': 2500, '荐': 2501, '惧': 2502, '兽': 2503, '渡': 2504, '酷': 2505, '森': 2506, '厘': 2507, '食': 2508, '办': 2509, '俞': 2510, '训': 2511, '灭': 2512, '婕': 2513, '袜': 2514, '罢': 2515, '旺': 2516, '瞥': 2517, '寧': 2518, '笨': 2519, '筷': 2520, '睦': 2521, '迪': 2522, '种': 2523, '題': 2524, '纲': 2525, '預': 2526, '螂': 2527, '醉': 2528, '息': 2529, '胭': 2530, '昕': 2531, '鲨': 2532, '衰': 2533, '逸': 2534, '享': 2535, '士': 2536, '纵': 2537, '莓': 2538, '顾': 2539, '孩': 2540, '拨': 2541, '乓': 2542, '吐': 2543, '显': 2544, '難': 2545, '泌': 2546, '舉': 2547, '剃': 2548, '∕': 2549, '無': 2550, '叔': 2551, '俗': 2552, '裕': 2553, '～': 2554, '讓': 2555, '卜': 2556, '奔': 2557, '凤': 2558, '畏': 2559, '6': 2560, '虐': 2561, '婆': 2562, '骆': 2563, '霧': 2564, '最': 2565, '缨': 2566, 'z': 2567, '晶': 2568, '粑': 2569, '觑': 2570, '砷': 2571, '劣': 2572, '濡': 2573, '骁': 2574, '附': 2575, '鱼': 2576, '综': 2577, '敷': 2578, '粟': 2579, 'x': 2580, '恩': 2581, '迫': 2582, 'з': 2583, '予': 2584, '谟': 2585, '辍': 2586, '螨': 2587, '幽': 2588, '讥': 2589, '填': 2590, '專': 2591, '报': 2592, '驴': 2593, '促': 2594, '语': 2595, '辣': 2596, '棵': 2597, '峙': 2598, '崎': 2599, '珑': 2600, '左': 2601, '東': 2602, '琥': 2603, '厢': 2604, '悦': 2605, '心': 2606, '莞': 2607, '☞': 2608, '阎': 2609, '琼': 2610, '赔': 2611, '厦': 2612, '瞑': 2613, '邃': 2614, '苍': 2615, '炉': 2616, '朗': 2617, '视': 2618, '劲': 2619, '臾': 2620, '颖': 2621, '哋': 2622, '堆': 2623, '课': 2624, '咪': 2625, '缘': 2626, '屍': 2627, '恻': 2628, '裹': 2629, '市': 2630, '魯': 2631, '卵': 2632, '扎': 2633, 
'钞': 2634, '禀': 2635, '瘋': 2636, '窿': 2637, '差': 2638, '脂': 2639, '化': 2640, '掺': 2641, '菩': 2642, '溟': 2643, '焰': 2644, '淳': 2645, '逢': 2646, '铎': 2647, '訂': 2648, '鬣': 2649, '括': 2650, '启': 2651, '吾': 2652, '输': 2653, '芽': 2654, '昆': 2655, '旦': 2656, '套': 2657, '韦': 2658, '姻': 2659, '弗': 2660, '戒': 2661, '遁': 2662, 'B': 2663, '蔬': 2664, '俠': 2665, '读': 2666, '早': 2667, '并': 2668, '三': 2669, '剿': 2670, '颈': 2671, '渭': 2672, '罒': 2673, '亭': 2674, '湛': 2675, '铛': 2676, '嗜': 2677, '巍': 2678, '讣': 2679, '恋': 2680, '酒': 2681, '蔓': 2682, '冠': 2683, '绚': 2684, '碉': 2685, '減': 2686, '抓': 2687, '眠': 2688, '％': 2689, 'q': 2690, '婚': 2691, '肛': 2692, '让': 2693, '梦': 2694, '李': 2695, '得': 2696, '乞': 2697, '赂': 2698, '圆': 2699, '擎': 2700, 'F': 2701, '务': 2702, '＝': 2703, '解': 2704, '宴': 2705, '名': 2706, '鹂': 2707, '碑': 2708, '篮': 2709, '带': 2710, '议': 2711, '鲍': 2712, '慰': 2713, '舊': 2714, '感': 2715, '煥': 2716, '饰': 2717, '爆': 2718, '梁': 2719, '副': 2720, '米': 2721, '腹': 2722, '🐵': 2723, '耻': 2724, '赵': 2725, '蛛': 2726, '羯': 2727, '瑚': 2728, '忏': 2729, '箴': 2730, '驚': 2731, '除': 2732, '娃': 2733, '链': 2734, '嬉': 2735, '袱': 2736, '㎡': 2737, '噜': 2738, '中': 2739, '谐': 2740, '识': 2741, '禅': 2742, '秽': 2743, '眩': 2744, '彦': 2745, '塞': 2746, '摒': 2747, '魂': 2748, '秋': 2749, '铭': 2750, '\\\\': 2751, '泱': 2752, '胶': 2753, '樣': 2754, '妃': 2755, '厄': 2756, '尅': 2757, '术': 2758, '转': 2759, '途': 2760, '灯': 2761, '爹': 2762, '喻': 2763, '痒': 2764, '栎': 2765, '馬': 2766, '訓': 2767, '囂': 2768, '▽': 2769, '联': 2770, '熄': 2771, '周': 2772, '殷': 2773, '整': 2774, '睇': 2775, '便': 2776, '蜷': 2777, '硕': 2778, '彻': 2779, '试': 2780, '傭': 2781, '冼': 2782, '避': 2783, 'ノ': 2784, '镜': 2785, '瓣': 2786, '噤': 2787, '耐': 2788, '炸': 2789, '疾': 2790, '商': 2791, '愁': 2792, '腑': 2793, '吏': 2794, '贷': 2795, '算': 2796, '瞧': 2797, '孰': 2798, '婪': 2799, '氧': 2800, '详': 2801, '崛': 2802, '福': 2803, '营': 2804, '姓': 2805, '霾': 2806, '奈': 2807, '潜': 2808, '✨': 2809, '铱': 2810, '妝': 2811, '裸': 2812, '递': 2813, '番': 2814, '薇': 
2815, '瑟': 2816, '挚': 2817, '默': 2818, '妍': 2819, '诽': 2820, '忠': 2821, '欠': 2822, '诋': 2823, '秘': 2824, '栗': 2825, '风': 2826, '跋': 2827, '師': 2828, '取': 2829, '灾': 2830, '瑪': 2831, '遏': 2832, '彝': 2833, '侦': 2834, '妩': 2835, '\"': 2836, '院': 2837, '础': 2838, '藍': 2839, '也': 2840, '此': 2841, '灌': 2842, '兴': 2843, '覆': 2844, '馍': 2845, '公': 2846, '怎': 2847, '亚': 2848, '跳': 2849, '肠': 2850, '歡': 2851, '坡': 2852, '邂': 2853, '凹': 2854, '谁': 2855, '插': 2856, '荷': 2857, '琵': 2858, '兒': 2859, '槃': 2860, '芒': 2861, 'k': 2862, '豢': 2863, '她': 2864, '穿': 2865, '劈': 2866, '尴': 2867, '击': 2868, '滴': 2869, '茜': 2870, '募': 2871, '烙': 2872, '柱': 2873, '嘘': 2874, '夙': 2875, '】': 2876, '擇': 2877, '肢': 2878, '璐': 2879, '粮': 2880, '阻': 2881, '绞': 2882, '赤': 2883, '捂': 2884, '泵': 2885, '圃': 2886, '蓬': 2887, '赖': 2888, '悯': 2889, '底': 2890, '岩': 2891, '淤': 2892, '闲': 2893, '慶': 2894, '媛': 2895, '惕': 2896, '岂': 2897, '为': 2898, '贩': 2899, '田': 2900, '勒': 2901, '捅': 2902, '业': 2903, '黃': 2904, '话': 2905, '愛': 2906, '徒': 2907, '什': 2908, '屁': 2909, '孝': 2910, '胳': 2911, '闭': 2912, '雕': 2913, 'し': 2914, '卧': 2915, '农': 2916, '奥': 2917, '伟': 2918, '轰': 2919, '昏': 2920, '馥': 2921, '戚': 2922, '戶': 2923, '饿': 2924, '糸': 2925, '入': 2926, '逗': 2927, '豬': 2928, '波': 2929, '尋': 2930, '颠': 2931, '堂': 2932, '枚': 2933, '枝': 2934, '珉': 2935, '送': 2936, '脖': 2937, '成': 2938, '咬': 2939, '鲟': 2940, '抚': 2941, '与': 2942, '茬': 2943, '拱': 2944, '学': 2945, '?': 2946, '摸': 2947, '腌': 2948, '怒': 2949, '哗': 2950, '选': 2951, '眼': 2952, '芬': 2953, '罕': 2954, '创': 2955, '涂': 2956, '稻': 2957, '大': 2958, '腱': 2959, '辈': 2960, '億': 2961, '猴': 2962, '新': 2963, 'y': 2964, '射': 2965, '概': 2966, '娇': 2967, '败': 2968, '辞': 2969, '裱': 2970, '個': 2971, '额': 2972, '帖': 2973, '遂': 2974, '質': 2975, '頭': 2976, '绕': 2977, '噢': 2978, '래': 2979, '房': 2980, '丹': 2981, '条': 2982, '苒': 2983, '捐': 2984, '顶': 2985, '檬': 2986, '災': 2987, '返': 2988, '史': 2989, '逊': 2990, '糜': 2991, '题': 2992, '嫌': 2993, '蓝': 2994, '饲': 2995, '沙': 2996, 
'蘑': 2997, '雪': 2998, '材': 2999, '媚': 3000, '』': 3001, '葵': 3002, '妄': 3003, '穷': 3004, '贈': 3005, '焕': 3006, '嘱': 3007, '播': 3008, '援': 3009, '脸': 3010, '废': 3011, '菜': 3012, '糯': 3013, '－': 3014, '蘭': 3015, '!': 3016, '四': 3017, '临': 3018, '苹': 3019, '缕': 3020, '迄': 3021, '窗': 3022, '孤': 3023, '罹': 3024, '萄': 3025, '莹': 3026, '蜕': 3027, '遵': 3028, '橄': 3029, '乘': 3030, '那': 3031, '仿': 3032, '絲': 3033, '\\ue109': 3034, '扫': 3035, '贫': 3036, '隅': 3037, '觎': 3038, '雲': 3039, '洛': 3040, '踢': 3041, '抛': 3042, '磁': 3043, '穆': 3044, '涛': 3045, 'H': 3046, '贼': 3047, '噩': 3048, '昭': 3049, '蝠': 3050, '墅': 3051, '屹': 3052, '堕': 3053, '祇': 3054, '靜': 3055, '禄': 3056, '购': 3057, '瑶': 3058, 'à': 3059, '言': 3060, '泽': 3061, '揚': 3062, '宣': 3063, '瀑': 3064, '书': 3065, '澈': 3066, '玑': 3067, '违': 3068, '劳': 3069, '較': 3070, '指': 3071, '詩': 3072, '纤': 3073, '笑': 3074, '華': 3075, '诗': 3076, '袂': 3077, '倪': 3078, '羞': 3079, '拾': 3080, '小': 3081, '￥': 3082, '轮': 3083, '纽': 3084, '蹬': 3085, '惯': 3086, '➌': 3087, '下': 3088, '宽': 3089, '好': 3090, '店': 3091, '芝': 3092, '藻': 3093, '暑': 3094, '跑': 3095, '褐': 3096, '響': 3097, '、': 3098, '☑': 3099, '短': 3100, '晚': 3101, '挪': 3102, '⒏': 3103, '哕': 3104, '形': 3105, '陪': 3106, '芭': 3107, '枣': 3108, '總': 3109, '〞': 3110, '涅': 3111, '但': 3112, '影': 3113, '据': 3114, '笫': 3115, '港': 3116, '月': 3117, '版': 3118, '彷': 3119, '柴': 3120, '阿': 3121, '玩': 3122, '损': 3123, '结': 3124, '虎': 3125, '殖': 3126, '韓': 3127, '鯉': 3128, '歇': 3129, '屯': 3130, '句': 3131, '坊': 3132, '酸': 3133, '某': 3134, '屏': 3135, '養': 3136, '迟': 3137, '萌': 3138, '产': 3139, '减': 3140, '嘍': 3141, '颚': 3142, '遇': 3143, '倦': 3144, '嘶': 3145, '獻': 3146, '枫': 3147, '置': 3148, '钗': 3149, '响': 3150, '奘': 3151, '现': 3152, '➏': 3153, '消': 3154, '屋': 3155, '粗': 3156, '痊': 3157, '狈': 3158, '海': 3159, '卓': 3160, '郭': 3161, '帛': 3162, '过': 3163, '坤': 3164, '晗': 3165, '杨': 3166, '賓': 3167, '岼': 3168, '嘿': 3169, '辉': 3170, '蜡': 3171, '愣': 3172, '伐': 3173, '张': 3174, '帆': 3175, '龈': 3176, '害': 3177, 
'團': 3178, '重': 3179, '自': 3180, '剧': 3181, '骂': 3182, '亲': 3183, '践': 3184, '寡': 3185, '荫': 3186, '用': 3187, '系': 3188, '\\u200b': 3189, '橙': 3190, '愉': 3191, '缉': 3192, '哦': 3193, '窟': 3194, '砖': 3195, '鴻': 3196, '体': 3197, '空': 3198, '汉': 3199, '阅': 3200, '淡': 3201, '祭': 3202, '痈': 3203, '映': 3204, '卡': 3205, '牠': 3206, '夕': 3207, '财': 3208, '豊': 3209, '麟': 3210, '贵': 3211, 'X': 3212, '驼': 3213, '脱': 3214, '¥': 3215, '@': 3216, '(': 3217, '矛': 3218, '瓷': 3219, '汨': 3220, '框': 3221, '悱': 3222, '竖': 3223, '宾': 3224, '霸': 3225, '坟': 3226, '栋': 3227, 'a': 3228, '同': 3229, '正': 3230, '片': 3231, 'b': 3232, '边': 3233, '樱': 3234, '畑': 3235, '要': 3236, '斯': 3237, '咯': 3238, '的': 3239, '亦': 3240, '摊': 3241, '赁': 3242, '續': 3243, '呻': 3244, '司': 3245, '摆': 3246, '绳': 3247, '唠': 3248, '嬷': 3249, '煌': 3250, '章': 3251, '翅': 3252, '＼': 3253, '腿': 3254, '棘': 3255, '老': 3256, '{': 3257, '姬': 3258, '惶': 3259, '晴': 3260, '兮': 3261, '咏': 3262, '号': 3263, '漠': 3264, '厅': 3265, '匙': 3266, '議': 3267, '滥': 3268, '飆': 3269, '锤': 3270, '屎': 3271, '幕': 3272, '祝': 3273, '阴': 3274, '盟': 3275, '壤': 3276, '胸': 3277, '妓': 3278, '囉': 3279, '瑕': 3280, '阮': 3281, '㎝': 3282, '峰': 3283, '溧': 3284, '轺': 3285, '止': 3286, '浩': 3287, '趕': 3288, '衛': 3289, '遷': 3290, '奶': 3291, '供': 3292, '这': 3293, '現': 3294, '塌': 3295, '慎': 3296, '提': 3297, '良': 3298, '津': 3299, '威': 3300, '州': 3301, '售': 3302, '筒': 3303, '┮': 3304, '🇺': 3305, ')': 3306, '溺': 3307, '春': 3308, '鳥': 3309, '驳': 3310, '辖': 3311, '苛': 3312, '赘': 3313, '敏': 3314, '飘': 3315, '筹': 3316, '激': 3317, '毫': 3318, '掀': 3319, '宇': 3320, '稿': 3321, '瘪': 3322, '誕': 3323, '✅': 3324, '赐': 3325, '恳': 3326, '岭': 3327, '白': 3328, '声': 3329, '村': 3330, '頁': 3331, '淚': 3332, '鲵': 3333, '恪': 3334, '错': 3335, '香': 3336, '靶': 3337, '骨': 3338, '雄': 3339, '萍': 3340, '昊': 3341, 'リ': 3342, '五': 3343, '挟': 3344, '鉛': 3345, '滨': 3346, '漱': 3347, '喷': 3348, '油': 3349, '状': 3350, '髓': 3351, '丰': 3352, '培': 3353, '裁': 3354, '繹': 3355, '蔑': 3356, '棉': 3357, '泼': 3358, 
'③': 3359, '掐': 3360, '喺': 3361, '克': 3362, '硬': 3363, '闪': 3364, '伺': 3365, '褪': 3366, '猬': 3367, '哭': 3368, '費': 3369, '薛': 3370, '淫': 3371, '矜': 3372, '丑': 3373, '清': 3374, '馋': 3375, '伍': 3376, '预': 3377, '駿': 3378, '丶': 3379, '其': 3380, '潸': 3381, '辗': 3382, '妮': 3383, '未': 3384, '疑': 3385, '盖': 3386, '刻': 3387, '悼': 3388, '◆': 3389, '评': 3390, '籍': 3391, '巨': 3392, '迅': 3393, '秒': 3394, '斩': 3395, '◇': 3396, '胀': 3397, '杀': 3398, '杭': 3399, '萨': 3400, '鑿': 3401, '該': 3402, '郁': 3403, '换': 3404, '距': 3405, '茨': 3406, '搁': 3407, '歹': 3408, '帕': 3409, '劉': 3410, '缔': 3411, '漢': 3412, '裡': 3413, '屡': 3414, '[': 3415, '毛': 3416, '誉': 3417, '涯': 3418, '儿': 3419, '躯': 3420, '驶': 3421, '荼': 3422, '啫': 3423, '彤': 3424, '烤': 3425, '收': 3426, '瓜': 3427, '侈': 3428, '斗': 3429, '里': 3430, '辩': 3431, '熙': 3432, '采': 3433, '忧': 3434, '穴': 3435, '符': 3436, '免': 3437, '握': 3438, '請': 3439, '鸠': 3440, '慈': 3441, '廈': 3442, '抬': 3443, '嚴': 3444, '身': 3445, '虔': 3446, '然': 3447, '斋': 3448, '控': 3449, '患': 3450, '飛': 3451, '赃': 3452, '撵': 3453, '燥': 3454, '舜': 3455, '國': 3456, '膝': 3457, '羅': 3458, '葱': 3459, '汀': 3460, '乖': 3461, '蛟': 3462, '露': 3463, '梆': 3464, '麽': 3465, '医': 3466, '條': 3467, '板': 3468, '割': 3469, '祖': 3470, '钢': 3471, '渺': 3472, '點': 3473, '惰': 3474, '戏': 3475, '具': 3476, '延': 3477, '刹': 3478, '塘': 3479, '铅': 3480, '诊': 3481, '凝': 3482, '綸': 3483, '☆': 3484, '壶': 3485, '計': 3486, '锋': 3487, '在': 3488, '颤': 3489, '伯': 3490, '固': 3491, '①': 3492, '游': 3493, '囚': 3494, '帼': 3495, '每': 3496, '亮': 3497, '蚊': 3498, '而': 3499, 'Q': 3500, '奢': 3501, '赠': 3502, '檔': 3503, '含': 3504, '继': 3505, '蛙': 3506, '顷': 3507, '艰': 3508, '撮': 3509, '｀': 3510, '怕': 3511, '夺': 3512, '咳': 3513, '認': 3514, '隐': 3515, '⒈': 3516, '②': 3517, '蜃': 3518, '衬': 3519, '喬': 3520, '牲': 3521, '淇': 3522, '私': 3523, '哲': 3524, '雙': 3525, '痪': 3526, '嵘': 3527, '晕': 3528, '撒': 3529, '莉': 3530, '霍': 3531, '園': 3532, '摧': 3533, '➎': 3534, '艱': 3535, '🍀': 3536, '姆': 3537, '谍': 3538, '军': 3539, '越': 
3540, '撰': 3541, '双': 3542, '唯': 3543, '嘻': 3544, '狗': 3545, '襄': 3546, '］': 3547, '脚': 3548, '貴': 3549, '湊': 3550, '懊': 3551, '斜': 3552, '，': 3553, '智': 3554, '蠢': 3555, '幅': 3556, '惨': 3557, '俺': 3558, '膀': 3559, '年': 3560, '震': 3561, '禁': 3562, '桌': 3563, '⋯': 3564, '厂': 3565, 'と': 3566, '翁': 3567, '瓯': 3568, '花': 3569, '詞': 3570, 'j': 3571, '战': 3572, '魇': 3573, '舒': 3574, '雹': 3575, '主': 3576, '鄉': 3577, '❀': 3578, '惹': 3579, '扰': 3580, '棍': 3581, '啥': 3582, '柿': 3583, '坠': 3584, '译': 3585, '泓': 3586, '否': 3587, '粒': 3588, '酝': 3589, '敗': 3590, '猿': 3591, '跃': 3592, '泉': 3593, '饕': 3594, '狮': 3595, '浪': 3596, '背': 3597, '至': 3598, '罂': 3599, '岚': 3600, '骑': 3601, '苏': 3602, '测': 3603, '仔': 3604, '＞': 3605, '}': 3606, '毅': 3607, '突': 3608, '数': 3609, '齐': 3610, 'n': 3611, '丙': 3612, '敢': 3613, '掠': 3614, '犀': 3615, '码': 3616, '盒': 3617, '雜': 3618, '析': 3619, '乔': 3620, '🐒': 3621, '蒜': 3622, '♪': 3623, '架': 3624, '脐': 3625, '倩': 3626, '刘': 3627, '馄': 3628, '扳': 3629, '销': 3630, '彈': 3631, '滚': 3632, ']': 3633, '豌': 3634, '規': 3635, '羡': 3636, '佣': 3637, '讶': 3638, '代': 3639, '裳': 3640, '疤': 3641, '哪': 3642, '何': 3643, '聋': 3644, '绩': 3645, '發': 3646, '振': 3647, '鎮': 3648, '户': 3649, '亟': 3650, '虾': 3651, '沦': 3652, '泛': 3653, '淑': 3654, '寰': 3655, '黛': 3656, '溫': 3657, '粽': 3658, '溢': 3659, '蠻': 3660, '廿': 3661, '類': 3662, '椎': 3663, '扼': 3664, '😱': 3665, 'Z': 3666, '麦': 3667, '西': 3668, '卫': 3669, '瞻': 3670, '舵': 3671, '2': 3672, '富': 3673, '暹': 3674, '道': 3675, '渣': 3676, '查': 3677, '命': 3678, '噗': 3679, '令': 3680, '请': 3681, '腾': 3682, '决': 3683, '搡': 3684, '帶': 3685, '娉': 3686, '膏': 3687, '展': 3688, '累': 3689, '眉': 3690, '壁': 3691, '剎': 3692, '睾': 3693, '很': 3694, '八': 3695, '蟒': 3696, '茶': 3697, '朩': 3698, '銳': 3699, '描': 3700, '快': 3701, '嫂': 3702, '厚': 3703, '④': 3704, '≫': 3705, '陵': 3706, '签': 3707, '诬': 3708, '由': 3709, '马': 3710, '昂': 3711, '溪': 3712, '石': 3713, '暂': 3714, 's': 3715, '橡': 3716, '运': 3717, '漫': 3718, '刮': 3719, '呗': 3720, '綦': 3721, 
'勘': 3722, '亩': 3723, '布': 3724, '盈': 3725, '谛': 3726, '嗽': 3727, '罗': 3728, '宝': 3729, '痺': 3730, '漂': 3731, 'Y': 3732, '凉': 3733, '胆': 3734, '․': 3735, '婉': 3736, '艇': 3737, '鳗': 3738, '幹': 3739, '碧': 3740, '們': 3741, '催': 3742, '´': 3743, '讹': 3744, '隣': 3745, 'T': 3746, '骼': 3747, '颁': 3748, '罄': 3749, '木': 3750, '慢': 3751, '腫': 3752, '度': 3753, '恐': 3754, '百': 3755, '鹏': 3756, 'u': 3757, '往': 3758, ':': 3759, '模': 3760, '魔': 3761, '十': 3762, '郎': 3763, '讽': 3764, '婀': 3765, '揭': 3766, '耽': 3767, '栏': 3768, '绣': 3769, '頻': 3770, '拥': 3771, '層': 3772, '面': 3773, '酱': 3774, '😲': 3775, '書': 3776, '睽': 3777, '偷': 3778, '兔': 3779, '叛': 3780, '肯': 3781, '衫': 3782, '集': 3783, '络': 3784, '类': 3785, '翰': 3786, '磊': 3787, '牡': 3788, '氯': 3789, '特': 3790, '标': 3791, 'W': 3792, '妨': 3793, '效': 3794, '冀': 3795, '召': 3796, '政': 3797, '囧': 3798, '惜': 3799, '讪': 3800, '磨': 3801, '深': 3802, '璧': 3803, '犹': 3804, '瘤': 3805, '餐': 3806, '挽': 3807, '吉': 3808, '廷': 3809, '呲': 3810, '訊': 3811, '酗': 3812, '佬': 3813, '酶': 3814, '轨': 3815, '型': 3816, '偕': 3817, '诵': 3818, '漯': 3819, '似': 3820, '嗦': 3821, '乃': 3822, '梅': 3823, '⑧': 3824, '靖': 3825, '票': 3826, '滿': 3827, '色': 3828, '址': 3829, 'r': 3830, '屑': 3831, '衣': 3832, '%': 3833, '咋': 3834, '棚': 3835, '_': 3836, '帅': 3837, '娑': 3838, '窕': 3839, '拜': 3840, '酵': 3841, '埔': 3842, '茅': 3843, '他': 3844, '見': 3845, '操': 3846, '等': 3847, '境': 3848, '叉': 3849, '遭': 3850, '札': 3851, '来': 3852, '水': 3853, '鄭': 3854, '历': 3855, '劫': 3856, '署': 3857, '孙': 3858, '红': 3859, '养': 3860, '壳': 3861, '艳': 3862, '捣': 3863, '饶': 3864, '恤': 3865, '醋': 3866, '憐': 3867, '植': 3868, '翱': 3869, '辅': 3870, '蛋': 3871, '鄂': 3872, '媳': 3873, '泣': 3874, '替': 3875, '猎': 3876, '憔': 3877, '晋': 3878, '韌': 3879, '统': 3880, '雍': 3881, '翡': 3882, '偶': 3883, '弥': 3884, '兩': 3885, '戀': 3886, '嗎': 3887, '≦': 3888, '烫': 3889, '😢': 3890, '聪': 3891, '﹏': 3892, '佟': 3893, '厉': 3894, '甸': 3895, '普': 3896, '轴': 3897, '寅': 3898, '优': 3899, '坑': 3900, '哼': 3901, '拆': 3902, '验': 
3903, '内': 3904, 'U': 3905, '婵': 3906, '搭': 3907, '時': 3908, 'D': 3909, '颜': 3910, '繼': 3911, '坞': 3912, '斷': 3913, '咱': 3914, '諒': 3915, '郸': 3916, '康': 3917, '六': 3918, '娶': 3919, '獸': 3920, '巩': 3921, '睁': 3922, '奇': 3923, '汁': 3924, '拿': 3925, '黔': 3926, '捍': 3927, '溶': 3928, '瓢': 3929, '阁': 3930, '阂': 3931, '蟑': 3932, '瑋': 3933, '谣': 3934, '去': 3935, '悸': 3936, '麥': 3937, '創': 3938, '袋': 3939, '立': 3940, '册': 3941, '榴': 3942, '荏': 3943, '乱': 3944, '常': 3945, '淹': 3946, '育': 3947, '藤': 3948, '汰': 3949, '缢': 3950, '倒': 3951, '偏': 3952, '瘫': 3953, '凡': 3954, ';': 3955, '辐': 3956, '诱': 3957, '忙': 3958, '熟': 3959, '零': 3960, '荒': 3961, '庵': 3962, '江': 3963, '逍': 3964, '煽': 3965, '佩': 3966, '凸': 3967, '泊': 3968, '巷': 3969, '凯': 3970, '丞': 3971, '學': 3972, '騰': 3973, '碾': 3974, '萱': 3975, '钓': 3976, '勿': 3977, '煤': 3978, '扈': 3979, '灰': 3980, '烹': 3981, '磐': 3982, '冻': 3983, '围': 3984, '筝': 3985, '嫡': 3986, '耶': 3987, '矫': 3988, '鼻': 3989, '粉': 3990, '踹': 3991, '捡': 3992, '赚': 3993, '绍': 3994, '泪': 3995, '善': 3996, '弟': 3997, '萃': 3998, '诶': 3999, '試': 4000, '垂': 4001, '庭': 4002, '费': 4003, '乡': 4004, '礁': 4005, '申': 4006, '呜': 4007, '坷': 4008, '坝': 4009, '飒': 4010, '证': 4011, '扮': 4012, '痿': 4013, '阐': 4014, '庚': 4015, '1': 4016, '问': 4017, '5': 4018, '俱': 4019, '祺': 4020, '嫩': 4021, '礼': 4022, '琶': 4023, '疫': 4024, '针': 4025, '盡': 4026, '汇': 4027, '暧': 4028, '乐': 4029, '尾': 4030, '德': 4031, '膜': 4032, '湖': 4033, '缪': 4034, '极': 4035, '☎': 4036, '獒': 4037, '恶': 4038, '熹': 4039, '谠': 4040, '凄': 4041, '买': 4042, '午': 4043, '狞': 4044, '伸': 4045, '贪': 4046, '兵': 4047, '唁': 4048, '察': 4049, '燕': 4050, '浏': 4051, '剛': 4052, '龟': 4053, '浅': 4054, '橇': 4055, '艹': 4056, '薄': 4057, '扛': 4058, '绛': 4059, '委': 4060, '勢': 4061, '憾': 4062, '污': 4063, '螃': 4064, '郊': 4065, '＂': 4066, '官': 4067, '虽': 4068, '啤': 4069, '诲': 4070, '蓄': 4071, '喘': 4072, '软': 4073, '排': 4074, '遠': 4075, '彭': 4076, '倾': 4077, '授': 4078, '眸': 4079, 'p': 4080, '遮': 4081, '恒': 4082, '师': 4083, '崇': 4084, 
'般': 4085, '琐': 4086, '责': 4087, '宗': 4088, '呆': 4089, '鳌': 4090, '处': 4091, '攻': 4092, '钥': 4093, '松': 4094, '醺': 4095, '鼎': 4096, '储': 4097, '陌': 4098, '咲': 4099, '３': 4100, '幂': 4101, '恣': 4102, '谓': 4103, '過': 4104, '緊': 4105, '咨': 4106, '宵': 4107, '抖': 4108, '鑑': 4109, '到': 4110, '盔': 4111, '望': 4112, '浑': 4113, '给': 4114, '剪': 4115, '妙': 4116, '僵': 4117, '饱': 4118, '岳': 4119, '髮': 4120, '怺': 4121, '工': 4122, '鸦': 4123, '渐': 4124, '驾': 4125, '娛': 4126, '葛': 4127, '風': 4128, '愈': 4129, '糊': 4130, '週': 4131, '洲': 4132, '颂': 4133, '曲': 4134, '助': 4135, '懂': 4136, '王': 4137, '妻': 4138, '俚': 4139, '肋': 4140, '潼': 4141, '氓': 4142, '袭': 4143, '&': 4144, '🇨': 4145, '草': 4146, '広': 4147, '子': 4148, '🌟': 4149, '呈': 4150, '景': 4151, '二': 4152, '捕': 4153, '绒': 4154, '忍': 4155, '迎': 4156, '礴': 4157, '瘾': 4158, '序': 4159, '７': 4160, '胧': 4161, '锢': 4162, 'f': 4163, '掇': 4164, '咻': 4165, '吝': 4166, '寶': 4167, '氏': 4168, '窝': 4169, '阵': 4170, '坚': 4171, '疲': 4172, '兼': 4173, '皆': 4174, '攒': 4175, '酣': 4176, '仪': 4177, '變': 4178, '桂': 4179, '兆': 4180, '昶': 4181, '装': 4182, '尖': 4183, 'L': 4184, '瓶': 4185, '稀': 4186, '诡': 4187, '妒': 4188, '裂': 4189, '弦': 4190, '翔': 4191, '葬': 4192, '馈': 4193, '扉': 4194, '囔': 4195, '喧': 4196, '盛': 4197, '笛': 4198, '態': 4199, '町': 4200, '餮': 4201, '钛': 4202, '🍁': 4203, '灣': 4204, '鬥': 4205, '嵯': 4206, '粥': 4207, '慵': 4208, '如': 4209, '葆': 4210, '記': 4211, '足': 4212, '约': 4213, '屌': 4214, '移': 4215, '门': 4216, '詹': 4217, '價': 4218, '闽': 4219, '屆': 4220, '碱': 4221, '袖': 4222, '長': 4223, '画': 4224, '余': 4225, '琢': 4226, '帐': 4227, '嚎': 4228, '留': 4229, '跚': 4230, '床': 4231, '刚': 4232, '哒': 4233, '鸽': 4234, '知': 4235, '块': 4236, '杉': 4237, '尼': 4238, '’': 4239, '敛': 4240, '涨': 4241, '橫': 4242, '思': 4243, '媒': 4244, '朝': 4245, '輝': 4246, '例': 4247, '押': 4248, '槽': 4249, '挑': 4250, '狭': 4251, '間': 4252, '前': 4253, '考': 4254, '娱': 4255, '械': 4256, '✈': 4257, '嗓': 4258, '斥': 4259, '【': 4260, '紐': 4261, '罪': 4262, '皈': 4263, '长': 4264, '仇': 4265, '捭': 
4266, '猜': 4267, 'm': 4268, '罩': 4269, '逾': 4270, '宜': 4271, '光': 4272, '后': 4273, '撑': 4274, '剖': 4275, '盆': 4276, '️': 4277, '峭': 4278, '牵': 4279, '砍': 4280, '沂': 4281, 'れ': 4282, '樊': 4283, '贺': 4284, '略': 4285, '🇳': 4286, '—': 4287, '吓': 4288, '拣': 4289, '亵': 4290, '静': 4291, '谴': 4292, '鬧': 4293, '論': 4294, '耿': 4295, '护': 4296, '苦': 4297, '艾': 4298, '∠': 4299, '猝': 4300, 'P': 4301, '黄': 4302, '君': 4303, 'こ': 4304, '弛': 4305, '恙': 4306, '笼': 4307, '柬': 4308, '猛': 4309, '酯': 4310, '划': 4311, '肖': 4312, '撬': 4313, '郫': 4314, '~': 4315, '缸': 4316, '種': 4317, '崭': 4318, '毗': 4319, '薯': 4320, '粪': 4321, '俭': 4322, '篷': 4323, '萤': 4324, '標': 4325, '糖': 4326, '裆': 4327, '熬': 4328, '一': 4329, '库': 4330, '▲': 4331, '冥': 4332, '锁': 4333, '俘': 4334, '抢': 4335, '征': 4336, '玫': 4337, '厲': 4338, '芯': 4339, '众': 4340, '吗': 4341, '歧': 4342, '楊': 4343, '篱': 4344, '夹': 4345, '悴': 4346, '；': 4347, '菁': 4348, '示': 4349, '衍': 4350, '抽': 4351, '纯': 4352, '您': 4353, '答': 4354, '法': 4355, '>': 4356, '窜': 4357, '坎': 4358, '柠': 4359, 'ら': 4360, '給': 4361, '♥': 4362, '噪': 4363, '⚫': 4364, '枕': 4365, '榆': 4366, '樂': 4367, '气': 4368, '末': 4369, '這': 4370, '矿': 4371, '員': 4372, '蚤': 4373, '梯': 4374, '通': 4375, '脆': 4376, '聲': 4377, '0': 4378, '弹': 4379, '怖': 4380, '俨': 4381, '域': 4382, '冉': 4383, '痹': 4384, '府': 4385, '啡': 4386, '绽': 4387, '頒': 4388, '辦': 4389, '发': 4390, '碌': 4391, '社': 4392, '🚬': 4393, '渗': 4394, '珠': 4395, '兄': 4396, '鸿': 4397, '哺': 4398, '俯': 4399, '妇': 4400, '蒙': 4401, '幢': 4402, '叽': 4403, '幡': 4404, '鎖': 4405, '安': 4406, '作': 4407, '情': 4408, '<unk>': 4409}\n"
  },
  {
    "path": "modules/text/text_generation/Rumor_prediction/module.py",
    "content": "# coding:utf-8\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\"\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\nimport argparse\r\nimport ast\r\nimport os\r\nimport math\r\nimport six\r\nimport time\r\nfrom pathlib import Path\r\n\r\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\r\nfrom paddlehub.module.module import runnable, serving, moduleinfo\r\nfrom paddlehub.io.parser import txt_parser\r\nfrom paddlehub.compat.module.nlp_module import DataFormatError\r\nimport numpy as np\r\nimport paddle\r\nimport paddlehub as hub\r\n\r\n\r\n@moduleinfo(\r\n    name=\"Rumor_prediction\",\r\n    version=\"1.0.0\",\r\n    type=\"nlp/semantic_model\",\r\n    summary=\"Is the input text prediction a rumor\",\r\n    author=\"彭兆帅，郑博培\",\r\n    author_email=\"1084667371@qq.com，2733821739@qq.com\")\r\nclass Rumorprediction(hub.Module):\r\n    def _initialize(self):\r\n        \"\"\"\r\n        Initialize with the necessary elements\r\n        \"\"\"\r\n        # 加载模型路径\r\n        self.default_pretrained_model_path = os.path.join(self.directory, \"infer_model\")\r\n\r\n    def Rumor(self, texts, use_gpu=False):\r\n        \"\"\"\r\n        Get the input and program of the infer model\r\n\r\n        Args:\r\n             image (list(numpy.ndarray)): images data, shape of each is [H, W, C], the color space is BGR.\r\n             use_gpu(bool): Weather to use gpu\r\n        \"\"\"\r\n\r\n        # 获取数据\r\n        def get_data(sentence):\r\n            
# 读取数据字典\r\n            with open(self.directory + '/dict.txt', 'r', encoding='utf-8') as f_data:\r\n                dict_txt = eval(f_data.readlines()[0])\r\n            dict_txt = dict(dict_txt)\r\n            # 把字符串数据转换成列表数据\r\n            keys = dict_txt.keys()\r\n            data = []\r\n            for s in sentence:\r\n                # 判断是否存在未知字符\r\n                if not s in keys:\r\n                    s = '<unk>'\r\n                data.append(int(dict_txt[s]))\r\n            return data\r\n\r\n        data = []\r\n        for text in texts:\r\n            text = get_data(text)\r\n            data.append(text)\r\n        base_shape = [[len(c) for c in data]]\r\n        paddle.enable_static()\r\n        place = paddle.CUDAPlace(0) if use_gpu else paddle.CPUPlace()\r\n        exe = paddle.static.Executor(place)\r\n        exe.run(paddle.static.default_startup_program())\r\n        [infer_program, feeded_var_names, target_var] = paddle.fluid.io.load_inference_model(\r\n            dirname=self.default_pretrained_model_path, executor=exe)\r\n        # 生成预测数据\r\n        tensor_words = paddle.fluid.create_lod_tensor(data, base_shape, place)\r\n        # 执行预测\r\n        result = exe.run(program=infer_program, feed={feeded_var_names[0]: tensor_words}, fetch_list=target_var)\r\n        # 分类名称\r\n        names = ['谣言', '非谣言']\r\n\r\n        results = []\r\n\r\n        # 获取结果概率最大的label\r\n        for i in range(len(data)):\r\n            content = texts[i]\r\n            lab = np.argsort(result)[0][i][-1]\r\n\r\n            alltext = {'content': content, 'prediction': names[lab], 'probability': result[0][i][lab]}\r\n            alltext = [alltext]\r\n            results = results + alltext\r\n\r\n        return results\r\n\r\n    def add_module_config_arg(self):\r\n        \"\"\"\r\n        Add the command config options\r\n        \"\"\"\r\n        self.arg_config_group.add_argument(\r\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether 
use GPU for prediction\")\r\n\r\n    def add_module_input_arg(self):\r\n        \"\"\"\r\n        Add the command input options\r\n        \"\"\"\r\n        self.arg_input_group.add_argument('--input_text', type=str, default=None, help=\"input_text is str\")\r\n\r\n    @runnable\r\n    def run_cmd(self, argvs):\r\n        \"\"\"\r\n        Run as a command\r\n        \"\"\"\r\n        self.parser = argparse.ArgumentParser(\r\n            description='Run the %s module.' % self.name,\r\n            prog='hub run %s' % self.name,\r\n            usage='%(prog)s',\r\n            add_help=True)\r\n\r\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\r\n        self.arg_config_group = self.parser.add_argument_group(\r\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\r\n\r\n        self.add_module_config_arg()\r\n        self.add_module_input_arg()\r\n\r\n        args = self.parser.parse_args(argvs)\r\n        input_text = [args.input_text]\r\n        results = self.Rumor(texts=input_text, use_gpu=args.use_gpu)\r\n\r\n        return results\r\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/README.md",
    "content": "# ernie_gen\n\n| 模型名称            |   ernie_gen   |\n| :------------------ | :-----------: |\n| 类别                | 文本-文本生成 |\n| 网络                |   ERNIE-GEN   |\n| 数据集              |       -       |\n| 是否支持Fine-tuning |      是       |\n| 模型大小            |      85K      |\n| 最新更新日期        |  2021-07-20   |\n| 数据指标            |       -       |\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen module是一个具备微调功能的module，可以快速完成特定场景module的制作。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请查看：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - paddlenlp >= 2.0.0\t\t\t\t\t                                \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ernie_gen需要**先针对特定数据集fine-tune**，才能使用\n  - 文本生成任务有很多种，ernie_gen仅提供了生成文本的基本参数，需要针对特定任务的数据集做fine-tune才能使用\n  - paddlehub提供了简单的fine-tune数据集：[train.txt](./test_data/train.txt), [dev.txt](./test_data/dev.txt)\n  - paddlehub也提供了多个fine-tune好的预训练模型：[对联生成](../ernie_gen_couplet/)，[情话生成](../ernie_gen_lover_words/)，[诗歌生成](../ernie_gen_poetry/)等\n\n### 1、Fine-tune并封装\n\n- #### Fine-tune代码实例\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"ernie_gen\")\n    \n    result 
= module.finetune(\n        train_path='train.txt',\n        dev_path='dev.txt',\n        max_steps=300,\n        batch_size=2\n    )\n    \n    module.export(params_path=result['last_save_path'], module_name=\"ernie_gen_test\", author=\"test\")\n    ```\n\n- #### API说明\n\n  - ```python\n    def finetune(train_path,\n                 dev_path=None,\n                 save_dir=\"ernie_gen_result\",\n                 init_ckpt_path=None,\n                 use_gpu=True,\n                 max_steps=500,\n                 batch_size=8,\n                 max_encode_len=15,\n                 max_decode_len=15,\n                 learning_rate=5e-5,\n                 warmup_proportion=0.1,\n                 weight_decay=0.1,\n                 noise_prob=0,\n                 label_smooth=0,\n                 beam_width=5,\n                 length_penalty=1.0,\n                 log_interval=100,\n                 save_interval=200):\n    ```\n    \n    - 微调模型参数的API\n    - **参数**\n      - train_path(str): 训练集路径。训练集的格式应为：\"序号\\t输入文本\\t标签\"，例如：\"1\\t床前明月光\\t疑是地上霜\"，注意\\t不能使用空格替代\n      - dev_path(str): 验证集路径。验证集的格式应为：\"序号\\t输入文本\\t标签\"，例如：\"1\\t举头望明月\\t低头思故乡\"，注意\\t不能使用空格替代\n      - save_dir(str): 模型保存以及验证集预测输出路径。\n      - init_ckpt_path(str): 模型初始化加载路径，可实现增量训练。\n      - use_gpu(bool): 是否使用GPU。\n      - max_steps(int): 最大训练步数。\n      - batch_size(int): 训练时的batch大小。\n      - max_encode_len(int): 最长编码长度。\n      - max_decode_len(int): 最长解码长度。\n      - learning_rate(float): 学习率大小。\n      - warmup_proportion(float): 学习率warmup比例。\n      - weight_decay(float): 权值衰减大小。\n      - noise_prob(float): 噪声概率，详见ernie gen论文。\n      - label_smooth(float): 标签平滑权重。\n      - beam_width(int): 验证集预测时的beam大小。\n      - length_penalty(float): 验证集预测时的长度惩罚权重。\n      - log_interval(int): 训练时的日志打印间隔步数。\n      - save_interval(int): 训练时的模型保存间隔步数。验证集将在模型保存完毕后进行预测。\n    - **返回**\n      - result(dict): 运行结果。包含2个键:\n        - last_save_path(str): 训练结束时的模型保存路径。\n        - last_ppl(float): 训练结束时的模型困惑度。\n    \n  - 
```python\n    def export(\n      params_path,\n      module_name,\n      author,\n      version=\"1.0.0\",\n      summary=\"\",\n      author_email=\"\",\n      export_path=\".\"):\n    ```\n    \n    - module导出API，通过此API可以一键将训练参数打包为hub module。\n    - **参数**\n      - params_path(str): 模型参数路径。\n      - module_name(str): module名称，例如\"ernie_gen_couplet\"。\n      - author(str): 作者名称。\n      - max_encode_len(int): 最大编码长度。\n      - max_decode_len(int): 最大解码长度。\n      - version(str): 版本号。\n      - summary(str): module的英文简介。\n      - author_email(str): 作者的邮箱地址。\n      - export_path(str): module的导出路径。\n\n### 2、模型预测\n\n- **定义`$module_name`为export指定的module_name**\n\n- 模型转换完毕之后，通过`hub install $module_name`安装该模型，即可通过以下2种方式调用自制module：\n\n- #### 法1：命令行预测\n\n  - ```shell\n    $ hub run $module_name --input_text=\"输入文本\" --use_gpu True --beam_width 5\n    ```\n    \n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- #### 法2：API预测\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"$module_name\")\n    \n    test_texts = [\"输入文本1\", \"输入文本2\"]\n    # generate包含3个参数，texts为输入文本列表，use_gpu指定是否使用gpu，beam_width指定beam search宽度。\n    results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n    for result in results:\n        print(result)\n    ```\n\n- 您也可以将`$module_name`文件夹打包为tar.gz压缩包并联系PaddleHub工作人员上传至PaddleHub模型仓库，这样更多的用户可以通过一键安装的方式使用您的模型。PaddleHub非常欢迎您的贡献，共同推动开源社区成长。\n\n## 四、服务部署\n\n- PaddleHub Serving 可以部署一个文本生成的在线服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m $module_name -p 8866\n    ```\n\n  - 这样就完成了一个文本生成的服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量；否则无需设置。\n\n- ### 第二步：发送预测请求\n\n  - 客户端通过以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    # 发送HTTP请求\n    \n    data = {'texts':[\"输入文本1\", \"输入文本2\"],\n            'use_gpu':True, 'beam_width':5}\n    headers 
= {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/$module_name\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    ```\n  \n- **NOTE:** 上述`$module_name`为export指定的module_name\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复模型导出bug\n\n* 1.0.2\n\n   修复windows运行中的bug\n\n* 1.1.0\n\n   接入PaddleNLP\n   \n   - ```shell\n     $ hub install ernie_gen==1.1.0\n     ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/README_en.md",
    "content": "# ernie_gen\n\n| 模型名称            |   ernie_gen   |\n| :------------------ | :-----------: |\n| 类别                | 文本-文本生成 |\n| 网络                |   ERNIE-GEN   |\n| 数据集              |       -       |\n| 是否支持Fine-tuning |      是       |\n| 模型大小            |      85K      |\n| 最新更新日期        |  2021-07-20   |\n| 数据指标            |       -       |\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen module是一个具备微调功能的module，可以快速完成特定场景module的制作。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请查看：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.0.0   | [如何安装paddlehub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - paddlenlp >= 2.0.0\t\t\t\t\t                                \n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ernie_gen can be used **only if it is first targeted at the specific dataset fine-tune**\n  - There are many types of text generation tasks, ernie_gen only provides the basic parameters for text generation, which can only be used after fine-tuning the dataset for a specific task\n  - Paddlehub provides a simple fine-tune dataset:[train.txt](./test_data/train.txt), [dev.txt](./test_data/dev.txt)\n  - Paddlehub also offers multiple fine-tune pre-training models that work well:[Couplet 
generation](../ernie_gen_couplet/), [love-words generation](../ernie_gen_lover_words/), [poetry generation](../ernie_gen_poetry/), etc.\n\n### 1、Fine-tune and encapsulation\n\n- #### Fine-tune Code Example\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"ernie_gen\")\n    \n    result = module.finetune(\n        train_path='train.txt',\n        dev_path='dev.txt',\n        max_steps=300,\n        batch_size=2\n    )\n    \n    module.export(params_path=result['last_save_path'], module_name=\"ernie_gen_test\", author=\"test\")\n    ```\n\n- #### API Instruction\n\n  - ```python\n    def finetune(train_path,\n                 dev_path=None,\n                 save_dir=\"ernie_gen_result\",\n                 init_ckpt_path=None,\n                 use_gpu=True,\n                 max_steps=500,\n                 batch_size=8,\n                 max_encode_len=15,\n                 max_decode_len=15,\n                 learning_rate=5e-5,\n                 warmup_proportion=0.1,\n                 weight_decay=0.1,\n                 noise_prob=0,\n                 label_smooth=0,\n                 beam_width=5,\n                 length_penalty=1.0,\n                 log_interval=100,\n                 save_interval=200):\n    ```\n    \n    - API for fine-tuning the model parameters\n    - **Parameters**\n      - train_path(str): Training set path. The format of the training set should be: \"serial number\\tinput text\\tlabel\", such as \"1\\t床前明月光\\t疑是地上霜\"; note that \\t cannot be replaced by spaces\n      - dev_path(str): Validation set path. 
The format of the validation set should be: \"serial number\\tinput text\\tlabel\", such as \"1\\t举头望明月\\t低头思故乡\"; note that \\t cannot be replaced by spaces\n      - save_dir(str): Directory for saving the model and for the validation-set prediction output.\n      - init_ckpt_path(str): Checkpoint path used to initialize the model, which enables incremental training.\n      - use_gpu(bool): Whether to use the GPU.\n      - max_steps(int): Maximum number of training steps.\n      - batch_size(int): Batch size during training.\n      - max_encode_len(int): Maximum encoding length.\n      - max_decode_len(int): Maximum decoding length.\n      - learning_rate(float): Learning rate.\n      - warmup_proportion(float): Learning rate warmup proportion.\n      - weight_decay(float): Weight decay.\n      - noise_prob(float): Noise probability; see the ERNIE-GEN paper for details.\n      - label_smooth(float): Label smoothing weight.\n      - beam_width(int): Beam width used when predicting on the validation set.\n      - length_penalty(float): Length penalty weight used when predicting on the validation set.\n      - log_interval(int): Interval, in steps, between training log prints.\n      - save_interval(int): Model save interval, in steps, during training. The validation set is evaluated after each save.\n    - **Return**\n      - result(dict): Run result. 
Contains 2 keys:\n        - last_save_path(str): Save path of the model at the end of training.\n        - last_ppl(float): Model perplexity at the end of training.\n    \n  - ```python\n    def export(\n      params_path,\n      module_name,\n      author,\n      version=\"1.0.0\",\n      summary=\"\",\n      author_email=\"\",\n      export_path=\".\"):\n    ```\n    \n    - Module export API, which packages the trained parameters into a Hub Module in one step.\n    - **Parameters**\n      - params_path(str): Model parameter path.\n      - module_name(str): Module name, such as \"ernie_gen_couplet\".\n      - author(str): Author name.\n      - max_encode_len(int): Maximum encoding length.\n      - max_decode_len(int): Maximum decoding length.\n      - version(str): The version number.\n      - summary(str): English introduction to the module.\n      - author_email(str): Email address of the author.\n      - export_path(str): Module export path.\n\n### 2、Model Prediction\n\n- **Define `$module_name` as the module_name specified in export**\n\n- After the module is converted, install it with `hub install $module_name`; the custom module can then be invoked in the following 2 ways:\n\n- #### Method 1: Command line prediction\n\n  - ```shell\n    $ hub run $module_name --input_text=\"input text\" --use_gpu True --beam_width 5\n    ```\n    \n  - For more on invoking hub modules from the command line, see [PaddleHub command line instructions](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- #### Method 2: API prediction\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"$module_name\")\n    \n    test_texts = [\"input text 1\", \"input text 2\"]\n    # generate takes 3 arguments: texts is the list of input texts, use_gpu specifies whether to use the GPU, and beam_width sets the beam search width.\n    results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n    for result in results:\n        print(result)\n    ```\n\n- You can also package the `$module_name` folder into a tar.gz archive and contact the PaddleHub team to upload it to the PaddleHub module repository, so that more users can use your module through one-click installation. PaddleHub warmly welcomes your contribution to the growth of the open-source community.\n\n## 四、Service Deployment\n\n- PaddleHub Serving can deploy an online text generation service.\n\n- ### Step 1: Start PaddleHub Serving\n\n  - Run the start command:\n  - ```shell\n    $ hub serving start -m $module_name -p 8866\n 
   ```\n\n  - This completes the deployment of a text generation service API, with the default port number 8866.\n\n  - **NOTE:** To use GPU prediction, the CUDA\\_VISIBLE\\_DEVICES environment variable needs to be set before starting the service; otherwise it does not need to be set.\n\n- ### Step 2: Send a prediction request\n\n  - With the few lines of code below, the client can send a prediction request and obtain the prediction result\n\n  - ```python\n    import requests\n    import json\n    \n    # Send an HTTP request\n    \n    data = {'texts':[\"input text 1\", \"input text 2\"],\n            'use_gpu':True, 'beam_width':5}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/$module_name\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # Save the results\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    ```\n  \n- **NOTE:** `$module_name` above is the module_name specified in export\n\n## 五、Update History\n\n* 1.0.0\n\n  First release\n\n* 1.0.1\n\n  Fixed a model export bug\n\n* 1.0.2\n\n   Fixed a bug on Windows\n\n* 1.1.0\n\n   Integrated PaddleNLP\n   \n   - ```shell\n     $ hub install ernie_gen==1.1.0\n     ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen/decode.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nimport numpy as np\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz,decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=np.bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(np.bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only consiter beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  #[gather new beam state according to new beam id]\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n  
      ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  #[:, :, 0]  #pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_patten = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_patten.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/encode.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom copy import deepcopy\n\nimport numpy as np\n\n\ndef convert_example(tokenizer,\n                    attn_id,\n                    tgt_type_id=3,\n                    max_encode_len=512,\n                    max_decode_len=128,\n                    is_test=False,\n                    noise_prob=0.,\n                    use_random_noice=False):\n    def warpper(example):\n        \"\"\"convert an example into necessary features\"\"\"\n        tokens = example['tokens']\n        labels = example['labels']\n        encoded_src = tokenizer(tokens, max_seq_len=max_encode_len, pad_to_max_seq_len=False)\n        src_ids, src_sids = encoded_src[\"input_ids\"], encoded_src[\"token_type_ids\"]\n        src_pids = np.arange(len(src_ids))\n\n        if not is_test:\n            encoded_tgt = tokenizer(labels, max_seq_len=max_decode_len, pad_to_max_seq_len=False)\n            tgt_ids, tgt_sids = encoded_tgt[\"input_ids\"], encoded_tgt[\"token_type_ids\"]\n            tgt_ids = np.array(tgt_ids)\n            tgt_sids = np.array(tgt_sids) + tgt_type_id\n            tgt_pids = np.arange(len(tgt_ids)) + len(src_ids)\n\n        attn_ids = np.ones_like(tgt_ids) * attn_id\n        if noise_prob > 0.:\n            tgt_labels = deepcopy(tgt_ids)\n            if use_random_noice:\n                noice_ids = np.random.randint(1, 
len(tokenizer.vocab), size=tgt_ids.shape)\n            else:\n                noice_ids = np.ones_like(tgt_ids) * tokenizer.vocab['[NOISE]']\n            pos, = np.where(np.ones_like(tgt_ids))\n            np.random.shuffle(pos)\n            pos = pos[:int(noise_prob * len(pos))]\n            tgt_ids[pos, ] = noice_ids[pos, ]\n        else:\n            tgt_labels = tgt_ids\n\n        return [np.asarray(item, dtype=np.int64) for item \\\n            in [src_ids, src_pids, src_sids, tgt_ids, tgt_pids, tgt_sids, attn_ids, tgt_labels]]\n\n    return warpper\n\n\ndef gen_mask(batch_ids, mask_type='bidi', query_len=None, pad_value=0):\n    if query_len is None:\n        query_len = batch_ids.shape[1]\n    if mask_type != 'empty':\n        mask = (batch_ids != pad_value).astype(np.float32)\n        mask = np.tile(np.expand_dims(mask, 1), [1, query_len, 1])\n        if mask_type == 'causal':\n            assert query_len == batch_ids.shape[1]\n            mask = np.tril(mask)\n        elif mask_type == 'causal_without_diag':\n            assert query_len == batch_ids.shape[1]\n            mask = np.tril(mask, -1)\n        elif mask_type == 'diag':\n            assert query_len == batch_ids.shape[1]\n            # import pdb; pdb.set_trace()\n            mask = np.stack([np.diag(np.diag(m)) for m in mask], 0)\n\n    else:\n        mask_type == 'empty'\n        mask = np.zeros_like(batch_ids).astype(np.float32)\n        mask = np.tile(np.expand_dims(mask, 1), [1, query_len, 1])\n    return mask\n\n\ndef after_padding(args):\n    '''\n    attention mask:\n    ***  src,  tgt, attn\n    src  00,   01,   11\n    tgt  10,   11,   12\n    attn 20,   21,   22\n\n    ***   s1, s2 | t1 t2 t3| attn1 attn2 attn3\n    s1    1,  1  | 0, 0, 0,| 0,    0,    0,\n    s2    1,  1  | 0, 0, 0,| 0,    0,    0,\n    -\n    t1    1,  1, | 1, 0, 0,| 0,    0,    0,\n    t2    1,  1, | 1, 1, 0,| 0,    0,    0,\n    t3    1,  1, | 1, 1, 1,| 0,    0,    0,\n    -\n    attn1 1,  1, | 0, 0, 0,| 1,    0, 
   0,\n    attn2 1,  1, | 1, 0, 0,| 0,    1,    0,\n    attn3 1,  1, | 1, 1, 0,| 0,    0,    1,\n\n    for details, see Fig3. https://arxiv.org/abs/2001.11314\n    '''\n    src_ids, src_pids, src_sids, tgt_ids, tgt_pids, tgt_sids, attn_ids, tgt_labels = args\n    src_len = src_ids.shape[1]\n    tgt_len = tgt_ids.shape[1]\n    mask_00 = gen_mask(src_ids, 'bidi', query_len=src_len)\n    mask_01 = gen_mask(tgt_ids, 'empty', query_len=src_len)\n    mask_02 = gen_mask(attn_ids, 'empty', query_len=src_len)\n\n    mask_10 = gen_mask(src_ids, 'bidi', query_len=tgt_len)\n    mask_11 = gen_mask(tgt_ids, 'causal', query_len=tgt_len)\n    mask_12 = gen_mask(attn_ids, 'empty', query_len=tgt_len)\n\n    mask_20 = gen_mask(src_ids, 'bidi', query_len=tgt_len)\n    mask_21 = gen_mask(tgt_ids, 'causal_without_diag', query_len=tgt_len)\n    mask_22 = gen_mask(attn_ids, 'diag', query_len=tgt_len)\n\n    mask_src_2_src = mask_00\n    mask_tgt_2_srctgt = np.concatenate([mask_10, mask_11], 2)\n    mask_attn_2_srctgtattn = np.concatenate([mask_20, mask_21, mask_22], 2)\n\n    raw_tgt_labels = deepcopy(tgt_labels)\n    tgt_labels = tgt_labels[np.where(tgt_labels != 0)]\n    return (src_ids, src_sids, src_pids, tgt_ids, tgt_sids, tgt_pids, attn_ids, mask_src_2_src, mask_tgt_2_srctgt,\n            mask_attn_2_srctgtattn, tgt_labels, raw_tgt_labels)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/model.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\n\n\nclass StackModel(nn.Layer):\n    def __init__(self, model):\n        super().__init__()\n        self.model = model\n\n    def forward(self, src_ids, src_sids, src_pids, tgt_ids, tgt_sids, tgt_pids, attn_ids, mask_src_2_src,\n                mask_tgt_2_srctgt, mask_attn_2_srctgtattn, tgt_labels, tgt_pos):\n        _, __, info = self.model(src_ids,\n                                 sent_ids=src_sids,\n                                 pos_ids=src_pids,\n                                 attn_bias=mask_src_2_src,\n                                 encode_only=True)\n        cached_k, cached_v = info['caches']\n        _, __, info = self.model(tgt_ids,\n                                 sent_ids=tgt_sids,\n                                 pos_ids=tgt_pids,\n                                 attn_bias=mask_tgt_2_srctgt,\n                                 past_cache=(cached_k, cached_v),\n                                 encode_only=True)\n        cached_k2, cached_v2 = info['caches']\n        past_cache_k = [paddle.concat([k, k2], 1) for k, k2 in zip(cached_k, cached_k2)]\n        past_cache_v = [paddle.concat([v, v2], 1) for v, v2 in zip(cached_v, cached_v2)]\n        loss, _, __ = self.model(attn_ids,\n                                 sent_ids=tgt_sids,\n                       
          pos_ids=tgt_pids,\n                                 attn_bias=mask_attn_2_srctgtattn,\n                                 past_cache=(past_cache_k, past_cache_v),\n                                 tgt_labels=tgt_labels,\n                                 tgt_pos=tgt_pos)\n        loss = loss.mean()\n        return loss\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport sys\nimport shutil\nfrom copy import deepcopy\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddle.io import DataLoader\nimport paddlehub as hub\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo\nfrom paddlenlp.datasets import MapDataset\nfrom paddlenlp.data import Stack, Tuple, Pad\nfrom paddlenlp.metrics import Rouge1, Rouge2\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration, LinearDecayWithWarmup\n\nfrom .encode import convert_example, after_padding\nfrom .decode import post_process, beam_search_infilling\nfrom .model import StackModel\n\n\n@moduleinfo(\n    name=\"ernie_gen\",\n    version=\"1.1.0\",\n    summary=\"ERNIE-GEN is a multi-flow language generation framework for both pre-training and fine-tuning.\",\n    author=\"baidu\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen():\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n        self._model = None\n\n    @property\n    def model(self):\n        if not self._model:\n           
 self._model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        return self._model\n\n    def finetune(\n        self,\n        train_path,\n        dev_path=None,\n        save_dir=\"ernie_gen_result\",\n        init_ckpt_path=None,\n        use_gpu=True,\n        max_steps=500,\n        batch_size=8,\n        max_encode_len=50,\n        max_decode_len=50,\n        learning_rate=5e-5,\n        warmup_proportion=0.1,\n        weight_decay=0.1,\n        noise_prob=0,\n        label_smooth=0,\n        beam_width=5,\n        length_penalty=1.0,\n        log_interval=100,\n        save_interval=200,\n    ):\n        \"\"\"\n        finetune with the specified dataset.\n\n        Args:\n            train_path(str): the train dataset path.\n            dev_path(str): the dev dataset path.\n            save_dir(str): the model params and dev dataset predict result save path.\n            init_ckpt_path(str): incremental training load path.\n            use_gpu(bool): use gpu or not.\n            max_steps(int): max training steps.\n            batch_size(int): the batch size.\n            max_encode_len(int): the max encode length.\n            max_decode_len(int): the max decode length.\n            learning_rate(float): the learning rate.\n            warmup_proportion(float): the warmup proportion.\n            weight_decay(float): the weight decay magnitude.\n            noise_prob(float): the noise probability. see the ernie gen paper for details.\n            label_smooth(float): the label smooth magnitude.\n            beam_width(int): the beam size during evaluating the dev dataset.\n            length_penalty(float): the length penalty during evaluating the dev dataset.\n            log_interval(int): the log interval.\n            save_interval(int): the save interval. 
dev set will be evaluated after saving.\n\n        Return:\n            result(dict): A Dictionary of shape::\n                {\n                    last_save_path(str): last model save path.\n                    last_ppl(float): last model ppl.\n                }\n        \"\"\"\n        paddle.disable_static()\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        if init_ckpt_path is not None:\n            logger.info('loading checkpoint from %s' % init_ckpt_path)\n            sd = paddle.load(init_ckpt_path)\n            self.model.set_state_dict(sd)\n\n        train_dataset = self._load_dataset(train_path)\n        attn_id = self.tokenizer.vocab['[MASK]']\n        trans_func = convert_example(tokenizer=self.tokenizer,\n                                     attn_id=attn_id,\n                                     tgt_type_id=1,\n                                     max_encode_len=max_encode_len,\n                                     max_decode_len=max_decode_len,\n                                     noise_prob=noise_prob)\n\n        train_dataset = train_dataset.map(trans_func)\n        train_batch_sampler = paddle.io.BatchSampler(train_dataset, batch_size=batch_size, shuffle=True)\n        batchify_fn = lambda samples, fn=Tuple(\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # src_ids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # src_pids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_type_id),  # src_tids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # tgt_ids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # tgt_pids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_type_id),  # tgt_tids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # attn_ids\n            Pad(axis=0, pad_val=self.tokenizer.pad_token_id),  # tgt_labels\n        ): after_padding(fn(samples))\n        train_data_loader = DataLoader(dataset=train_dataset,\n   
                                    batch_sampler=train_batch_sampler,\n                                       collate_fn=batchify_fn,\n                                       num_workers=0,\n                                       return_list=True)\n\n        if dev_path:\n            dev_dataset = self._load_dataset(dev_path)\n            dev_dataset = dev_dataset.map(trans_func)\n            dev_data_loader = DataLoader(dataset=dev_dataset,\n                                         batch_size=batch_size,\n                                         collate_fn=batchify_fn,\n                                         num_workers=0,\n                                         return_list=True)\n\n        label_num = self.model.word_emb.weight.shape[0]\n        train_model = StackModel(self.model)\n        lr_scheduler = LinearDecayWithWarmup(learning_rate, max_steps, warmup_proportion)\n        # Generate parameter names needed to perform weight decay.\n        # All bias and LayerNorm parameters are excluded.\n        decay_params = [p.name for n, p in self.model.named_parameters() if not any(nd in n for nd in [\"bias\", \"norm\"])]\n        optimizer = paddle.optimizer.AdamW(learning_rate=lr_scheduler,\n                                           parameters=self.model.parameters(),\n                                           weight_decay=weight_decay,\n                                           grad_clip=nn.ClipGradByGlobalNorm(1.0),\n                                           apply_decay_param_fun=lambda x: x in decay_params)\n\n        rouge1 = Rouge1()\n        rouge2 = Rouge2()\n        global_step = 1\n        if save_dir and not os.path.exists(save_dir):\n            os.makedirs(save_dir)\n        while True:\n            for batch in train_data_loader:\n                (src_ids, src_tids, src_pids, tgt_ids, tgt_tids, tgt_pids, attn_ids, mask_src_2_src, mask_tgt_2_srctgt,\n                 mask_attn_2_srctgtattn, tgt_labels, _) = batch\n                if 
label_smooth > 0.:\n                    tgt_labels = nn.functional.label_smooth(nn.functional.one_hot(tgt_labels, label_num),\n                                                            epsilon=label_smooth)\n\n                tgt_pos = paddle.nonzero(attn_ids == attn_id)\n                loss = train_model(src_ids, src_tids, src_pids, tgt_ids, tgt_tids, tgt_pids, attn_ids, mask_src_2_src,\n                                   mask_tgt_2_srctgt, mask_attn_2_srctgtattn, tgt_labels, tgt_pos)\n\n                loss.backward()\n                optimizer.step()\n                lr_scheduler.step()\n                optimizer.clear_grad()\n\n                if global_step % log_interval == 0 and paddle.distributed.get_rank() == 0:\n                    loss_np = loss.numpy()\n                    ppl = np.exp(loss_np)\n                    logger.info('[step %d / %d]train loss %.5f, ppl %.5f, elr %.3e' %\n                                (global_step, max_steps, loss_np, ppl, lr_scheduler.get_lr()))\n                if save_dir and global_step % save_interval == 0 and global_step > 0:\n                    loss_np = loss.numpy()\n                    ppl = np.exp(loss_np)\n                    save_name = \"step_%s_ppl_%.5f.pdparams\" % (global_step, ppl)\n                    save_path = os.path.join(save_dir, save_name)\n                    logger.info(\"save the model in %s\" % save_path)\n                    paddle.save(self.model.state_dict(), save_path)\n\n                    if dev_path:\n                        self._evaluate(self.model, dev_data_loader, self.tokenizer, rouge1, rouge2, attn_id,\n                                       max_decode_len, max_encode_len, beam_width, length_penalty)\n\n                if global_step >= max_steps:\n                    break\n                global_step += 1\n\n            if global_step >= max_steps:\n                break\n\n        if global_step % save_interval != 0:\n            loss_np = loss.numpy()\n            ppl = 
np.exp(loss_np)\n            logger.info('[final step %d]train loss %.5f, ppl %.5f, elr %.3e' %\n                        (global_step, loss_np, ppl, lr_scheduler.get_lr()))\n            if save_dir:\n                save_name = \"step_%s_ppl_%.5f.pdparams\" % (global_step, ppl)\n                save_path = os.path.join(save_dir, save_name)\n                logger.info(\"save the model in %s\" % save_path)\n                paddle.save(self.model.state_dict(), save_path)\n\n                if dev_path:\n                    self._evaluate(self.model, dev_data_loader, self.tokenizer, rouge1, rouge2, attn_id, max_decode_len,\n                                   max_encode_len, beam_width, length_penalty)\n\n        result = {\n            \"last_save_path\": \"%s\" % save_path,\n            \"last_ppl\": ppl[0],\n        }\n\n        return result\n\n    def export(self,\n               params_path,\n               module_name,\n               author,\n               max_encode_len=50,\n               max_decode_len=50,\n               version=\"1.0.0\",\n               summary=\"\",\n               author_email=\"\",\n               export_path=\".\"):\n        \"\"\"\n        export the model saved in the params_path to a hub module.\n\n        Args:\n            params_path(str): the model params save path.\n            module_name(str): the module name.\n            author(str): the author name.\n            max_encode_len(int): the max encode length.\n            max_decode_len(int): the max decode length.\n            version(str): the version information.\n            summary(str): the module brief introduction.\n            author_email(str): the author email address.\n            export_path(str): the module export path.\n        \"\"\"\n        if not os.path.exists(params_path):\n            raise FileNotFoundError(\"The path %s does not exist.\" % params_path)\n        export_module_path = os.path.join(export_path, module_name)\n        if not 
os.path.exists(export_module_path):\n            os.makedirs(export_module_path)\n        logger.info(\"Begin export the model save in %s ...\" % params_path)\n\n        assets_path = os.path.join(self.directory, \"template\", \"assets\")\n        init_path = os.path.join(self.directory, \"template\", \"__init__.py\")\n        decode_path = os.path.join(self.directory, \"template\", \"decode.py\")\n        module_temp_path = os.path.join(self.directory, \"template\", \"module.temp\")\n\n        export_assets_path = os.path.join(export_module_path, \"assets\")\n        export_params_path = os.path.join(export_module_path, \"assets\", \"ernie_gen.pdparams\")\n        export_init_path = os.path.join(export_module_path, \"__init__.py\")\n        export_decode_path = os.path.join(export_module_path, \"decode.py\")\n\n        if not os.path.exists(export_assets_path):\n            os.makedirs(export_assets_path)\n        shutil.copyfile(init_path, export_init_path)\n        shutil.copyfile(params_path, export_params_path)\n        shutil.copyfile(decode_path, export_decode_path)\n\n        module_path = os.path.join(export_module_path, \"module.py\")\n        with open(module_temp_path, encoding=\"utf8\") as ftemp, open(module_path, \"w\") as fmodule:\n            content = ftemp.read().replace(r\"{module_name}\", module_name).replace(r\"{author}\", author).replace(\n                r\"{version}\", version).replace(r\"{summary}\", summary).replace(r\"{author_email}\", author_email).replace(\n                    r\"{max_encode_len}\", str(max_encode_len)).replace(r\"{max_decode_len}\", str(max_decode_len))\n            fmodule.write(content)\n\n        logger.info(\"The module has exported to %s\" % os.path.abspath(export_module_path))\n\n    def _evaluate(self, model, data_loader, tokenizer, rouge1, rouge2, attn_id, max_decode_len, max_encode_len,\n                  beam_width, length_penalty):\n        paddle.disable_static()\n        model.eval()\n\n        vocab = 
tokenizer.vocab\n        eos_id = vocab[tokenizer.sep_token]\n        sos_id = vocab[tokenizer.cls_token]\n        pad_id = vocab[tokenizer.pad_token]\n        unk_id = vocab[tokenizer.unk_token]\n        vocab_size = len(vocab)\n        evaluated_sentences_ids = []\n        reference_sentences_ids = []\n        logger.info(\"Evaluating...\")\n        for data in data_loader:\n            (src_ids, src_tids, src_pids, _, _, _, _, _, _, _, _, raw_tgt_labels) = data  # never use target when infer\n            # Use greedy_search_infilling or beam_search_infilling to get predictions\n            output_ids = beam_search_infilling(model,\n                                               src_ids,\n                                               src_tids,\n                                               eos_id=eos_id,\n                                               sos_id=sos_id,\n                                               attn_id=attn_id,\n                                               pad_id=pad_id,\n                                               unk_id=unk_id,\n                                               vocab_size=vocab_size,\n                                               max_decode_len=max_decode_len,\n                                               max_encode_len=max_encode_len,\n                                               beam_width=beam_width,\n                                               length_penalty=length_penalty,\n                                               tgt_type_id=1)\n\n            # output_ids: [batch, beam, len]; keep the best beam, then truncate at eos\n            for beam_ids in output_ids.tolist():\n                ids = beam_ids[0]\n                if eos_id in ids:\n                    ids = ids[:ids.index(eos_id)]\n                evaluated_sentences_ids.append(ids)\n\n            for ids in raw_tgt_labels.numpy().tolist():\n                ids = ids[:ids.index(eos_id)]\n                reference_sentences_ids.append(ids)\n\n        score1 = rouge1.score(evaluated_sentences_ids, reference_sentences_ids)\n        score2 = 
rouge2.score(evaluated_sentences_ids, reference_sentences_ids)\n\n        logger.info(\"Rouge-1: %.5f ,Rouge-2: %.5f\" % (score1 * 100, score2 * 100))\n\n        evaluated_sentences = []\n        reference_sentences = []\n        for ids in reference_sentences_ids[:3]:\n            reference_sentences.append(''.join(map(post_process, vocab.to_tokens(ids))))\n        for ids in evaluated_sentences_ids[:3]:\n            evaluated_sentences.append(''.join(map(post_process, vocab.to_tokens(ids))))\n        logger.debug(reference_sentences)\n        logger.debug(evaluated_sentences)\n\n        model.train()\n\n    def _load_dataset(self, datafiles):\n        def read(data_path):\n            with open(data_path, 'r', encoding='utf-8') as fp:\n                for line in fp.readlines():\n                    order, words, labels = line.strip('\\n').split('\\t')\n                    yield {'tokens': words, 'labels': labels}\n\n        if isinstance(datafiles, str):\n            return MapDataset(list(read(datafiles)))\n        elif isinstance(datafiles, list) or isinstance(datafiles, tuple):\n            return [MapDataset(list(read(datafile))) for datafile in datafiles]\n\n\nif __name__ == \"__main__\":\n    module = ErnieGen()\n    result = module.finetune(train_path='test_data/train.txt',\n                             dev_path='test_data/dev.txt',\n                             max_steps=30,\n                             batch_size=2,\n                             log_interval=10,\n                             save_interval=20)\n    module.export(params_path=result['last_save_path'], module_name=\"ernie_gen_test\", author=\"test\")\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/template/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen/template/decode.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nimport numpy as np\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
 / attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz,decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only consiter beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  #[gather new beam state according to new beam id]\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n  
      ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  #[:, :, 0]  #pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_patten = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_patten.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/template/module.temp",
    "content": "# coding:utf-8\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration\n\nfrom .decode import beam_search_infilling\n\n\n@moduleinfo(\n    name=\"{module_name}\",\n    version=\"{version}\",\n    summary=\n    \"{summary}\",\n    author=\"{author}\",\n    author_email=\"{author_email}\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen.pdparams\")\n        self.model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        model_state = paddle.load(gen_checkpoint_path)\n        self.model.set_dict(model_state)\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_dict[self.tokenizer.vocab['[PAD]']] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.vocab['[UNK]']] = ''  # replace [UNK]\n        
self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Get the predict result from the input texts.\n\n        Args:\n             texts(list): the input texts.\n             use_gpu(bool): whether use gpu to predict or not\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the predict result.\n        \"\"\"\n        paddle.disable_static()\n\n        if texts and isinstance(texts, list) and all(texts) and all(\n            [isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\n                \"The input texts should be a list with nonempty string elements.\"\n            )\n\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        \n        self.model.eval()\n        results = []\n        for text in predicted_data:\n            sample_results = []\n            encode_text = self.tokenizer.encode(text)\n            src_ids = paddle.to_tensor(encode_text['input_ids']).unsqueeze(0)\n            src_sids = paddle.to_tensor(encode_text['token_type_ids']).unsqueeze(0)\n            output_ids = beam_search_infilling(\n                self.model,\n                src_ids,\n                src_sids,\n                eos_id=self.tokenizer.vocab['[SEP]'],\n                sos_id=self.tokenizer.vocab['[CLS]'],\n                attn_id=self.tokenizer.vocab['[MASK]'],\n                pad_id=self.tokenizer.vocab['[PAD]'],\n                unk_id=self.tokenizer.vocab['[UNK]'],\n                vocab_size=len(self.tokenizer.vocab),\n                
max_decode_len={max_decode_len},\n                max_encode_len={max_encode_len},\n                beam_width=beam_width,\n                tgt_type_id=1)\n            output_str = self.rev_lookup(output_ids[0])\n\n            for ostr in output_str.tolist():\n                if '[SEP]' in ostr:\n                    ostr = ostr[:ostr.index('[SEP]')]\n                sample_results.append(\"\".join(ostr))\n            results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=ast.literal_eval,\n            default=False,\n            help=\"whether use GPU for prediction\")\n\n        self.arg_config_group.add_argument(\n            '--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(\n            texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/test_data/dev.txt",
    "content": "1\t入林不动草，入水不动波。\t镬汤无冷处，合眼跳黄河。\n2\t画师端为谁，钓者亦安在。\t我不识若人，相望大千内。\n3\t何必关山远，凉风在殿西。\t箫声犹袅袅，舞袖忽凄凄。\n4\t不记门前路，门前一尺深。\t梅花如有语，参透老逋心。\n5\t石桥跨两岫，野叟尝远蹠。\t旁有枰棋处，云是仙人奕。\n6\t顾渚吴商绝，蒙山蜀信稀。\t千丛因此始，含霞紫英肥。\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen/test_data/train.txt",
    "content": "1\t落叶频惊鹿，连峰欲映雕。\t此生诗病苦，此病更萧条。\n2\t汴水夹榆柳，今留胡马踪。\t如何进贤路，只是见青松。\n3\t夜行无月时，古路多荒榛。\t山鬼摇把火，自照不照人。\n4\t春到村居好，园林兴味长。\t蚕贪桑眼出，蜂趁蜜脾忙。\n5\t麀鹿同呦呦，山林风雨秋。\t姑苏台上月，子胥曾约游。\n6\t谢家庭下玉，化此青琅玕。\t风标敻不俗，谁谓骨相寒。\n7\t团团青枫阴，绰绰万间屋。\t下有避俗翁，扫石叠两足。\n8\t万壑摇苍烟，百滩度流水。\t下有骑馿人，萧萧吹冻耳。\n9\t太平蜀雀异，仍映碧桃间。\t一秀三千岁，高枝永共攀。\n10\t壁带非烟润，金铺霁景鲜。\t绣功添采缕，和气入繁弦。\n11\t苔寒两不借，对面宁尔劳。\t欲语二三子，卑之毋甚高。\n12\t肌细分红脉，香浓破紫苞。\t无因留得翫，争忍折来抛。\n13\t流水难穷目，斜阳易断肠。\t谁同砑光帽，一曲舞山香。\n14\t孙儿正啼哭，母言来与金。\t捻他黄叶把，便是正声音。\n15\t多病苦虚羸，晴明强展眉。\t读书心绪少，闲卧日长时。\n16\t去国投兹土，编茅隐旧踪。\t年年秋水上，独对数株松。\n17\t易觉春风老，偏知夏日长。\t四山新笋出，一涧野花香。\n18\t门拥千峰翠，溪无一点尘。\t松风清入耳，山月白随人。\n19\t款款穿芳径，双双度短墙。\t不知身是幻，抵死恋花香。\n20\t游目贝叶书，究竟华严境。\t当年寓名心，观者要深省。\n21\t声求不可求，见迹不寻牛。\t迹在牛还在，不求何自休。\n22\t学道如钻火，逢烟未可休。\t直待金星现，曹门取郑州。\n23\t修证彼何人，有国号众香。\t此境了不殊，沉檀蔼飞扬。\n24\t山中砖塔闭，松下影堂新。\t恨不生前识，今朝礼画身。\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_acrostic_poetry/README.md",
"content": "# ernie_gen_acrostic_poetry\n\n| 模型名称            | ernie_gen_acrostic_poetry |\n| :------------------ | :-----------------------: |\n| 类别                |       文本-文本生成       |\n| 网络                |         ERNIE-GEN         |\n| 数据集              |      开源诗歌数据集       |\n| 是否支持Fine-tuning |            否             |\n| 模型大小            |           1.64G           |\n| 最新更新日期        |        2021-02-26         |\n| 数据指标            |             -             |\n\n## 一、基本信息\n\n- ### 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen_acrostic_poetry采用开源诗歌数据集进行微调，可用于生成藏头诗。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请参考：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  - paddlenlp >= 2.0.0\n  \n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen_acrostic_poetry\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie_gen_acrostic_poetry --input_text=\"我喜欢你\" --use_gpu True --beam_width 5\n    ```\n    \n    - **NOTE:** 命令行预测的方式只支持七言绝句，如需输出五言绝句、五言律诗、七言律诗，请使用API。\n    - **参数**\n      - input_text: 诗歌的藏头，长度不应超过4，否则将被截断；\n      - use_gpu: 是否采用GPU进行预测；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n      - beam_width: beam search宽度，决定每个藏头输出的下文数目。\n    \n  - 通过命令行方式实现文本生成模型的调用，更多请见 
[PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    # 在模型定义时，可以通过设置line=4或8指定输出绝句或律诗，设置word=5或7指定输出五言或七言。\n    # 默认line=4, word=7 即输出七言绝句。\n    module = hub.Module(name=\"ernie_gen_acrostic_poetry\", line=4, word=7)\n    \n    test_texts = ['我喜欢你']\n    results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n    for result in results:\n        print(result)\n        \n    # ['我方治地种秋芳，喜见新花照眼黄。欢友相逢头白尽，你缘何事得先尝。', '我今解此如意珠，喜汝为我返魂无。欢声百里镇如席，你若来时我自有。', '我今解此如意珠，喜汝为我返魂无。欢声百里镇如席，你若来时我自孤。', '我今解此如意珠，喜汝为我返魂无。欢声百里镇如席，你若来时我自如。', '我方治地种秋芳，喜见新花照眼黄。欢友相逢头白尽，你缘何事苦生凉。']\n    ```\n\n- ### 3、API\n\n  - ```python\n    hub.Module(name=\"ernie_gen_acrostic_poetry\", line=4, word=7)\n    ```\n\n    - 在模型定义时，可以通过设置line=4或8指定输出绝句或律诗，设置word=5或7指定输出五言或七言。\n    - 默认line=4, word=7 即输出七言绝句。\n\n  - ```python\n    def generate(texts, use_gpu=False, beam_width=5):\n    ```\n\n    - 预测API，输入诗歌藏头，输出诗歌下文。\n    - **参数**\n      - texts (list[str]): 诗歌的藏头；\n      - use_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n      - beam_width: beam search宽度，决定每个藏头输出的下文数目。\n    - **返回**\n      - results (list[list]\\[str]): 诗歌下文，每个诗歌藏头会生成beam_width个下文。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线诗歌生成服务，可以将此接口用于在线web应用。\n\n- **NOTE:** 服务部署的方式只支持七言绝句，如需输出五言绝句、五言律诗、七言律诗，请使用API。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_gen_acrostic_poetry -p 8866\n    ```\n\n  - 这样就完成了一个服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    # 发送HTTP请求\n    \n    data = {'texts':['我喜欢你'],\n            'use_gpu':True, 'beam_width':5}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_gen_acrostic_poetry\"\n    r = requests.post(url=url, 
headers=headers, data=json.dumps(data))\n    \n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    \n    # serving运行结果同本地运行结果（见上）\n    ```\n    \n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  完善API的输入文本检查\n\n* 1.1.0\n\n  接入PaddleNLP\n\n  - ```shell\n    $ hub install ernie_gen_acrostic_poetry==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_acrostic_poetry/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen_acrostic_poetry/decode.py",
"content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz,decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only consider beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  #[gather new beam state according to new beam id]\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n  
      ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  #[:, :, 0]  #pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_patten = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_patten.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_acrostic_poetry/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration\n\nfrom ernie_gen_acrostic_poetry.decode import beam_search_infilling\n\n\n@moduleinfo(\n    name=\"ernie_gen_acrostic_poetry\",\n    version=\"1.1.0\",\n    summary=\n    \"ERNIE-GEN is a multi-flow language generation framework for both pre-training and fine-tuning. 
This module has been fine-tuned for the poetry generation task.\",\n    author=\"adaxiadaxi\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def __init__(self, line=4, word=7):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        if line not in [4, 8]:\n            raise ValueError(\"The line could only be 4 or 8.\")\n        if word not in [5, 7]:\n            raise ValueError(\"The word could only be 5 or 7.\")\n\n        self.line = line\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen_acrostic_poetry_L%sW%s.pdparams\" % (line, word))\n        self.model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        model_state = paddle.load(gen_checkpoint_path)\n        self.model.set_dict(model_state)\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_dict[self.tokenizer.vocab['[PAD]']] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.vocab['[UNK]']] = ''  # replace [UNK]\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Get the continuation of the input poetry.\n\n        Args:\n             texts(list): the front part of the poetry.\n             use_gpu(bool): whether to use gpu to predict or not\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the poetry continuations.\n        \"\"\"\n        paddle.disable_static()\n\n        if texts and isinstance(texts, list) and all(texts) and all([isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input texts should be a list with nonempty string elements.\")\n        for i, text in 
enumerate(texts):\n            if len(text) > self.line:\n                logger.warning(\n                    'The input text: %s, contains more than %i characters, which will be cut off' % (text, self.line))\n                texts[i] = text[:self.line]\n\n            for char in text:\n                if not '\\u4e00' <= char <= '\\u9fff':\n                    logger.warning(\n                        'The input text: %s, contains non-Chinese characters, which may result in magic output' % text)\n                    break\n\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        self.model.eval()\n        results = []\n        for text in predicted_data:\n            sample_results = []\n            encode_text = self.tokenizer.encode(text)\n            src_ids = paddle.to_tensor(encode_text['input_ids']).unsqueeze(0)\n            src_sids = paddle.to_tensor(encode_text['token_type_ids']).unsqueeze(0)\n            output_ids = beam_search_infilling(\n                self.model,\n                src_ids,\n                src_sids,\n                eos_id=self.tokenizer.vocab['[SEP]'],\n                sos_id=self.tokenizer.vocab['[CLS]'],\n                attn_id=self.tokenizer.vocab['[MASK]'],\n                pad_id=self.tokenizer.vocab['[PAD]'],\n                unk_id=self.tokenizer.vocab['[UNK]'],\n                vocab_size=len(self.tokenizer.vocab),\n                max_decode_len=80,\n                max_encode_len=20,\n                beam_width=beam_width,\n                tgt_type_id=1)\n            output_str = self.rev_lookup(output_ids[0])\n\n            for ostr in output_str.tolist():\n                if '[SEP]' in ostr:\n                    
ostr = ostr[:ostr.index('[SEP]')]\n                sample_results.append(\"\".join(ostr))\n            results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n\n\nif __name__ == \"__main__\":\n    module = ErnieGen()\n    for result in module.generate(['夏雨荷', '我喜欢你'], beam_width=5):\n        print(result)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_couplet/README.md",
"content": "# ernie_gen_couplet\n\n| 模型名称            | ernie_gen_couplet |\n| :------------------ | :---------------: |\n| 类别                |   文本-文本生成   |\n| 网络                |     ERNIE-GEN     |\n| 数据集              |  开源对联数据集   |\n| 是否支持Fine-tuning |        否         |\n| 模型大小            |       421M        |\n| 最新更新日期        |    2021-02-26     |\n| 数据指标            |         -         |\n\n## 一、模型基本信息\n\n- 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen_couplet采用开源对联数据集进行微调，输入上联，可生成下联。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请参考：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  - paddlenlp >= 2.0.0\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen_couplet\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie_gen_couplet --input_text=\"人增福寿年增岁\" --use_gpu True --beam_width 5\n    ```\n    \n    - input_text: 上联文本\n    - use_gpu: 是否采用GPU进行预测\n    - beam_width: beam search宽度，决定每个上联输出的下联数量。\n    \n  - 通过命令行方式实现文本生成模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"ernie_gen_couplet\")\n    \n    test_texts = [\"人增福寿年增岁\", \"风吹云乱天垂泪\"]\n    
results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n    for result in results:\n        print(result)\n        \n    # ['春满乾坤喜满门', '竹报平安梅报春', '春满乾坤福满门', '春满乾坤酒满樽', '春满乾坤喜满家']\n    # ['雨打花残地痛心', '雨打花残地皱眉', '雨打花残地动容', '雨打霜欺地动容', '雨打花残地洒愁']\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(texts, use_gpu=False, beam_width=5):\n    ```\n\n    - 预测API，由上联生成下联。\n    - **参数**\n      - texts (list[str]): 上联文本；\n      - use_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n      - beam_width: beam search宽度，决定每个上联输出的下联数量。\n    - **返回**\n      - results (list[list]\\[str]): 下联文本，每个上联会生成beam_width个下联。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线对联生成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_gen_couplet -p 8866\n    ```\n\n  - 这样就完成了一个服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    # 发送HTTP请求\n    \n    data = {'texts':[\"人增福寿年增岁\", \"风吹云乱天垂泪\"],\n            'use_gpu':False, 'beam_width':5}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_gen_couplet\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    \n    # ['春满乾坤喜满门', '竹报平安梅报春', '春满乾坤福满门', '春满乾坤酒满樽', '春满乾坤喜满家']\n    # ['雨打花残地痛心', '雨打花残地皱眉', '雨打花残地动容', '雨打霜欺地动容', '雨打花残地洒愁']\n    ```\n\n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复windows中的编码问题\n\n* 1.0.2\n\n  完善API的输入文本检查\n  \n* 1.1.0\n\n  修复兼容性问题\n\n  * ```shell\n    $ hub install ernie_gen_couplet==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_couplet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen_couplet/decode.py",
"content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz,decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only considers beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  # gather new beam state according to new beam id\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n  
      ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  #[:, :, 0]  #pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_pattern = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_pattern.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_couplet/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration\n\nfrom ernie_gen_couplet.decode import beam_search_infilling\n\n\n@moduleinfo(\n    name=\"ernie_gen_couplet\",\n    version=\"1.1.0\",\n    summary=\n    \"ERNIE-GEN is a multi-flow language generation framework for both pre-training and fine-tuning. 
This module is fine-tuned for the couplet generation task.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen_couplet.pdparams\")\n        self.model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        model_state = paddle.load(gen_checkpoint_path)\n        self.model.set_dict(model_state)\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_dict[self.tokenizer.vocab['[PAD]']] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.vocab['[UNK]']] = ''  # replace [UNK]\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Generate the right (second) lines of couplets from the given left (first) lines.\n\n        Args:\n             texts(list): the left (first) lines of the couplets.\n             use_gpu(bool): whether to use gpu for prediction\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the generated right (second) lines.\n        \"\"\"\n        paddle.disable_static()\n\n        if texts and isinstance(texts, list) and all(texts) and all([isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input texts should be a list with nonempty string elements.\")\n        for i, text in enumerate(texts):\n            for char in text:\n                if not '\\u4e00' <= char <= '\\u9fff':\n                    logger.warning(\n                        'The input text: %s, contains non-Chinese characters, which may result in unexpected output' % text)\n                    break\n\n        if 
use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        self.model.eval()\n        results = []\n        for text in predicted_data:\n            sample_results = []\n            encode_text = self.tokenizer.encode(text)\n            src_ids = paddle.to_tensor(encode_text['input_ids']).unsqueeze(0)\n            src_sids = paddle.to_tensor(encode_text['token_type_ids']).unsqueeze(0)\n            output_ids = beam_search_infilling(\n                self.model,\n                src_ids,\n                src_sids,\n                eos_id=self.tokenizer.vocab['[SEP]'],\n                sos_id=self.tokenizer.vocab['[CLS]'],\n                attn_id=self.tokenizer.vocab['[MASK]'],\n                pad_id=self.tokenizer.vocab['[PAD]'],\n                unk_id=self.tokenizer.vocab['[UNK]'],\n                vocab_size=len(self.tokenizer.vocab),\n                max_decode_len=20,\n                max_encode_len=20,\n                beam_width=beam_width,\n                tgt_type_id=1)\n            output_str = self.rev_lookup(output_ids[0])\n\n            for ostr in output_str.tolist():\n                if '[SEP]' in ostr:\n                    ostr = ostr[:ostr.index('[SEP]')]\n                sample_results.append(\"\".join(ostr))\n            results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    
@runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n\n\nif __name__ == \"__main__\":\n    module = ErnieGen()\n    for result in module.generate(['人增福寿年增岁', '上海自来水来自海上', '风吹云乱天垂泪'], beam_width=5, use_gpu=True):\n        print(result)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_lover_words/README.md",
    "content": "# ernie_gen_lover_words\n\n| 模型名称            | ernie_gen_lover_words |\n| :------------------ | :-------------------: |\n| 类别                |     文本-文本生成     |\n| 网络                |       ERNIE-GEN       |\n| 数据集              |  网络情诗、情话数据   |\n| 是否支持Fine-tuning |          否           |\n| 模型大小            |         420M          |\n| 最新更新日期        |      2021-02-26       |\n| 数据指标            |           -           |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen_lover_words采用网络搜集的情诗、情话数据微调，可用于生成情话。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请参考：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.8.2\n  \n  - paddlehub >= 1.7.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen_lover_words\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie_gen_lover_words --input_text \"情人节\" --use_gpu True --beam_width 5\n    ```\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n  \n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"ernie_gen_lover_words\")\n    \n    test_texts = ['情人节', '故乡', '小编带大家了解一下程序员情人节']\n    results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n 
   for result in results:\n        print(result)\n        \n    # '情人节，我愿做一条鱼，任你红烧、白煮、清蒸，然后躺在你温柔的胃里。', '情人节，对你的思念太重，压断了电话线，烧坏了手机卡，掏尽了钱包袋，吃光了安眠药，哎!可我还是思念你。', '情人节，对你的思念太重，压断了电话线，烧坏了手机卡，掏尽了钱包袋，吃光了安眠药，哎!可我还是思念你，祝你情人节快乐!', '情人节，对你的思念太重，压断了电话线，烧坏了手机卡，掏尽了钱包袋，吃光了安眠药，唉!可我还是思念你，祝你情人节快乐!', '情人节，对你的思念太重，压断了电话线，烧坏了手机卡，掏尽了钱包袋，吃光了安眠药，哎!可是我还是思念你。']\n    # ['故乡，是深秋的雨，云雾缭绕，夏日的阳光照耀下，像一只只翅膀，那就是思念。', '故乡，是深秋的雨，是诗人们吟咏的乡村序曲，但愿天下有情人，一定难忘。', '故乡，是深秋的雨，是诗人们吟咏的一篇美丽的诗章，但愿天下有情人，都一定走进了蒙蒙细雨中。', '故乡，是深秋的雨，是诗人们吟咏的一篇美丽的诗章，但愿天下有情人，都一定走进了蒙蒙的细雨，纷纷而来。', '故乡，是深秋的雨，是诗人们吟咏的一篇美丽的诗章，但愿天下有情人，都一定走进了蒙蒙的细雨中。']\n    # ['小编带大家了解一下程序员情人节，没有人会悄悄的下载数据库，没有人会升级!希望程序可以好好的工作!', '小编带大家了解一下程序员情人节，没有人会悄悄的下载数据库，没有人会升级!希望程序可以重新拥有!', '小编带大家了解一下程序员情人节，没有人会悄悄的下载数据库，没有人会升级!希望程序可以好好的工作。', '小编带大家了解一下程序员情人节，没有人会悄悄的下载数据库，没有人会升级!希望程序可以重新把我们送上。', '小编带大家了解一下程序员情人节，没有人会悄悄的下载数据库，没有人会升级!希望程序可以重新把我们送上!']\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(texts, use_gpu=False, beam_width=5):\n    ```\n    \n    - 预测API，输入情话开头，输出情话下文。\n\n    - **参数**\n      - texts(list[str]):情话的开头\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n      - beam_width: beam search宽度，决定每个情话开头输出的下文数目\n    \n    - **返回**\n      - results(list[list]\\[str]): 情话下文，每个情话开头会生成beam_width个下文\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文本生成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_gen_lover_words -p 8866  \n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n  \n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    # 发送HTTP请求\n    \n    data = {'texts':['情人节', '故乡', '小编带大家了解一下程序员情人节'],\n            'use_gpu':False, 'beam_width':5}\n    headers = {\"Content-type\": \"application/json\"}\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/ernie_gen_lover_words\"\n    r = requests.post(url=url, headers=headers, 
data=json.dumps(data))\n    \n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    ```\n    \n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  完善API的输入文本检查\n  \n  - ```shell\n    $ hub install ernie_gen_lover_words==1.0.1\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_lover_words/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen_lover_words/decode.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nimport numpy as np\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz, decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=bool)  # np.bool is removed in recent NumPy; use the builtin bool\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only considers beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  # gather new beam state according to new beam id\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n        if logits.shape[-1] > vocab_size:\n            logits[:, :, vocab_size:] = 0\n        logits[:, :, pad_id] = 0\n        logits[:, :, unk_id] = 0\n        logits[:, :, attn_id] = 0\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n  
      ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  #[:, :, 0]  #pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_pattern = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_pattern.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_lover_words/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration\n\nfrom ernie_gen_lover_words.decode import beam_search_infilling\n\n\n@moduleinfo(\n    name=\"ernie_gen_lover_words\",\n    version=\"1.1.0\",\n    summary=\n    \"ERNIE-GEN is a multi-flow language generation framework for both pre-training and fine-tuning. 
This module has been fine-tuned for the lover's words generation task.\",\n    author=\"adaxiadaxi\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen_lover_words.pdparams\")\n        self.model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        model_state = paddle.load(gen_checkpoint_path)\n        self.model.set_dict(model_state)\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_dict[self.tokenizer.vocab['[PAD]']] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.vocab['[UNK]']] = ''  # replace [UNK]\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Get the continuation of the input texts.\n\n        Args:\n             texts(list): the beginning of the lover's words.\n             use_gpu(bool): whether to use gpu for prediction.\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the generated continuations.\n        \"\"\"\n        paddle.disable_static()\n\n        if texts and isinstance(texts, list) and all(texts) and all([isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input texts should be a list with nonempty string elements.\")\n\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set to False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n        
paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        self.model.eval()\n        results = []\n        for text in predicted_data:\n            sample_results = []\n            encode_text = self.tokenizer.encode(text)\n            src_ids = paddle.to_tensor(encode_text['input_ids']).unsqueeze(0)\n            src_sids = paddle.to_tensor(encode_text['token_type_ids']).unsqueeze(0)\n            output_ids = beam_search_infilling(\n                self.model,\n                src_ids,\n                src_sids,\n                eos_id=self.tokenizer.vocab['[SEP]'],\n                sos_id=self.tokenizer.vocab['[CLS]'],\n                attn_id=self.tokenizer.vocab['[MASK]'],\n                pad_id=self.tokenizer.vocab['[PAD]'],\n                unk_id=self.tokenizer.vocab['[UNK]'],\n                vocab_size=len(self.tokenizer.vocab),\n                max_decode_len=80,\n                max_encode_len=20,\n                beam_width=beam_width,\n                tgt_type_id=1)\n            output_str = self.rev_lookup(output_ids[0])\n\n            for ostr in output_str.tolist():\n                if '[SEP]' in ostr:\n                    ostr = ostr[:ostr.index('[SEP]')]\n                sample_results.append(\"\".join(ostr))\n            results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' 
% self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n\n\nif __name__ == \"__main__\":\n    module = ErnieGen()\n    for result in module.generate(['情人节', '故乡', '小编带大家了解一下程序员情人节', '昔年旅南服，始识王荆州。', '高名出汉阴，禅阁跨香岑。'], beam_width=5):\n        print(result)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_poetry/README.md",
    "content": "# ernie_gen_poetry\n\n| 模型名称            | ernie_gen_poetry |\n| :------------------ | :--------------: |\n| 类别                |  文本-文本生成   |\n| 网络                |    ERNIE-GEN     |\n| 数据集              |  开源诗歌数据集  |\n| 是否支持Fine-tuning |        否        |\n| 模型大小            |       422M       |\n| 最新更新日期        |    2021-02-26    |\n| 数据指标            |        -         |\n\n## 一、模型基本信息\n\n- 模型介绍\n  - ERNIE-GEN 是面向生成任务的预训练-微调框架，首次在预训练阶段加入span-by-span 生成任务，让模型每次能够生成一个语义完整的片段。在预训练和微调中通过填充式生成机制和噪声感知机制来缓解曝光偏差问题。此外, ERNIE-GEN 采样多片段-多粒度目标文本采样策略, 增强源文本和目标文本的关联性，加强了编码器和解码器的交互。\n  - ernie_gen_poetry采用开源诗歌数据集进行微调，可用于生成诗歌。\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133191670-8eb1c542-f8e8-4715-adb2-6346b976fab1.png\"  width=\"600\" hspace='10'/>\n</p>\n\n- 更多详情请参考：[ERNIE-GEN:An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation](https://arxiv.org/abs/2001.11314)\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  - paddlenlp >= 2.0.0\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_gen_poetry\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run ernie_gen_poetry --input_text=\"昔年旅南服，始识王荆州。\" --use_gpu True --beam_width 5\n    ```\n    \n    - input_text: 诗歌的开头。\n    - use_gpu: 是否采用GPU进行预测。\n    - beam_width: beam search宽度，决定每个诗歌开头输出的下文数目。\n    \n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n    \n    module = hub.Module(name=\"ernie_gen_poetry\")\n    \n    test_texts = ['昔年旅南服，始识王荆州。', '高名出汉阴，禅阁跨香岑。']\n    
results = module.generate(texts=test_texts, use_gpu=True, beam_width=5)\n    for result in results:\n        print(result)\n    \n    # ['一见便倾盖，论交更绸缪。别来二十年，日月如奔流。人生会合难，俯仰成春秋。', '一见便倾盖，论交更绸缪。别来二十年，日月如奔流。人生会合难，况乃岁月遒。君家富文史，我老无田畴。相逢不相识，各在天一陬。人生百年内，聚散如浮沤。况我与夫子，相逢', '一见便倾盖，论交更绸缪。别来二十年，日月如奔流。人生会合难，况乃岁月遒。君家富文史，我老无田畴。相逢不相识，各在天一陬。人生百年内，聚散如浮沤。况我与君别，飘零', '一见便倾盖，论交更绸缪。别来二十年，日月如奔流。人生会合难，况乃岁月遒。君家富文史，我老无田畴。相逢不相识，各在天一陬。人生百年内，聚散如浮沤。况复各异乡，各在', '一见便倾盖，论交更绸缪。别来二十年，日月如奔流。人生会合难，况乃岁月遒。君家富文史，我老无田畴。相逢不相识，各在天一陬。人生百年内，聚散如浮沤。况复各异乡，风雨']\n    # ['地僻无尘到，山高见水深。钟声传远寺，塔影落前林。欲问西来意，庭前柏树林。', '地僻无尘到，山高见水深。钟声传远寺，塔影落前林。欲问西来意，庭前柏树阴。', '地僻无尘到，山高见水深。钟声传远寺，塔影落前林。欲问西来意，庭前有桂林。', '地僻无尘到，山高见水深。钟声传远寺，塔影落前林。欲问西来意，庭前柏正森。', '地僻无尘到，山高见水深。钟声传远寺，塔影落前林。欲问西来意，庭前有桂阴。']\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(texts, use_gpu=False, beam_width=5):\n    ```\n\n    - 预测API，输入诗歌开头，输出诗歌下文。\n    - **参数**\n      - texts (list[str]): 诗歌的开头；\n      - use_gpu (bool): 是否使用 GPU；**若使用GPU，请先设置CUDA_VISIBLE_DEVICES环境变量**；\n      - beam_width: beam search宽度，决定每个诗歌开头输出的下文数目。\n    - **返回**\n      - results (list[list]\\[str]): 诗歌下文，每个诗歌开头会生成beam_width个下文。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线诗歌生成服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_gen_poetry -p 8866\n    ```\n\n  - 这样就完成了一个服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    # 发送HTTP请求\n    \n    data = {'texts':['昔年旅南服，始识王荆州。', '高名出汉阴，禅阁跨香岑。'],\n            'use_gpu':False, 'beam_width':5}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_gen_poetry\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    \n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    \n    # serving运行结果同本地运行结果（见上）\n    
```\n    \n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.0.1\n\n  修复windows中的编码问题\n\n* 1.0.2\n\n  完善API的输入文本检查\n\n- 1.1.0\n\n  修复兼容性问题\n\n  - ```shell\n    $ hub install ernie_gen_poetry==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_poetry/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_gen_poetry/decode.py",
    "content": "#   Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport sys\nimport re\nimport argparse\nimport logging\nimport json\nimport numpy as np\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport numpy as np\nfrom paddlenlp.utils.log import logger\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    encoder_bsz, encoder_seqlen = encoder_inputs.shape[:2]\n    attn_bias = paddle.reshape(paddle.arange(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = paddle.cast((paddle.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                               'float32')  #[1, decoderlen, decoderlen]\n    encoder_bias = paddle.unsqueeze(paddle.cast(paddle.ones_like(encoder_inputs), 'float32'),\n                                    [1])  #[bsz, 1, encoderlen]\n    encoder_bias = paddle.expand(encoder_bias,\n                                 [encoder_bsz, decoder_seqlen, encoder_seqlen])  #[bsz, decoderlen, encoderlen]\n    decoder_bias = paddle.expand(decoder_bias,\n                                 [decoder_bsz, decoder_seqlen, decoder_seqlen])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = paddle.concat(\n            [encoder_bias, paddle.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = paddle.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@paddle.no_grad()\ndef greedy_search_infilling(model,\n                            token_ids,\n                            token_type_ids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            pad_id,\n                            unk_id,\n                            vocab_size,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    _, logits, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n    has_stopped = np.zeros([d_batch], dtype=bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = paddle.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch], dtype='int64') * attn_id\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = 
paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        gen_ids = paddle.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [paddle.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [paddle.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = paddle.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = paddle.cast(paddle.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - paddle.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = paddle.pow((5. 
+ paddle.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    _, vocab_size = logits.shape\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = paddle.cast(nn.functional.one_hot(paddle.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = paddle.log(nn.functional.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = paddle.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - paddle.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = paddle.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = paddle.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = paddle.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = paddle.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only consider beam 0\n    scores, idx = paddle.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = paddle.concat([paddle.nonzero(idx != -1)[:, :1], paddle.reshape(idx, [-1, 1])], 1)\n    next_probs = paddle.reshape(paddle.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = paddle.reshape(paddle.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = paddle.concat([paddle.nonzero(next_beam_id != -1)[:, :1], paddle.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = paddle.reshape(paddle.gather_nd(state.finished, gather_idx),\n                                   state.finished.shape)  #[gather new beam state according to new beam id]\n\n    next_finished += paddle.cast(next_word_id == eos_id, 'int64')\n    
next_finished = paddle.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@paddle.no_grad()\ndef beam_search_infilling(model,\n                          token_ids,\n                          token_type_ids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          pad_id,\n                          unk_id,\n                          vocab_size,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    _, __, info = model(token_ids, token_type_ids)\n    d_batch, d_seqlen = token_ids.shape\n\n    state = BeamSearchState(log_probs=paddle.zeros([d_batch, beam_width], 'float32'),\n                            lengths=paddle.zeros([d_batch, beam_width], 'int64'),\n                            finished=paddle.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = paddle.nonzero(parent_id != -1)[:, 0] * beam_width + paddle.reshape(parent_id, [-1])\n        t = paddle.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        new_shape = [t.shape[0], times] + list(t.shape[1:])\n        ret = paddle.reshape(paddle.expand(paddle.unsqueeze(t, [1]), new_shape), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    token_ids = 
tile_(token_ids, beam_width)\n    seqlen = paddle.sum(paddle.cast(token_ids != 0, 'int64'), 1, keepdim=True)\n\n    cls_ids = paddle.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = paddle.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = paddle.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(token_ids, ids, step)\n        pos_ids = paddle.to_tensor(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(ids,\n                                paddle.ones_like(ids) * tgt_type_id,\n                                pos_ids=pos_ids,\n                                attn_bias=bias,\n                                past_cache=past_cache)\n\n        output, state = beam_search_step(state,\n                                         logits[:, 1],\n                                         eos_id=eos_id,\n                                         beam_width=beam_width,\n                                         is_first_step=(step == 0),\n                                         length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(paddle.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids)\n            for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(paddle.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids)\n            for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = paddle.reshape(output.predicted_ids, [d_batch * beam_width])\n        ids = paddle.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = paddle.stack([o.predicted_ids for o in outputs], 
0)\n    final_parent_ids = paddle.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = nn.functional.gather_tree(final_ids, final_parent_ids)  # pick best beam\n    final_ids = paddle.transpose(paddle.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n\n    return final_ids.numpy()\n\n\nen_pattern = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    elif token in ['[CLS]', '[SEP]', '[PAD]']:\n        ret = ''\n    else:\n        if en_pattern.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/ernie_gen_poetry/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\nimport argparse\nimport os\n\nimport numpy as np\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.transformers import ErnieTokenizer, ErnieForGeneration\n\nfrom ernie_gen_poetry.decode import beam_search_infilling\n\n\n@moduleinfo(\n    name=\"ernie_gen_poetry\",\n    version=\"1.1.0\",\n    summary=\n    \"ERNIE-GEN is a multi-flow language generation framework for both pre-training and fine-tuning. 
This module has been fine-tuned for the poetry generation task.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen_poetry.pdparams\")\n        self.model = ErnieForGeneration.from_pretrained(\"ernie-1.0\")\n        model_state = paddle.load(gen_checkpoint_path)\n        self.model.set_dict(model_state)\n        self.tokenizer = ErnieTokenizer.from_pretrained(\"ernie-1.0\")\n        self.rev_dict = self.tokenizer.vocab.idx_to_token\n        self.rev_dict[self.tokenizer.vocab['[PAD]']] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.vocab['[UNK]']] = ''  # replace [UNK]\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Get the continuation of the input poem.\n\n        Args:\n             texts(list): the beginning of a poem.\n             use_gpu(bool): whether to use gpu for prediction.\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the poem continuations.\n        \"\"\"\n        paddle.disable_static()\n\n        if texts and isinstance(texts, list) and all(texts) and all([isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input texts should be a list with nonempty string elements.\")\n        for i, text in enumerate(texts):\n            if '，' not in text or '。' not in text:\n                logger.warning(\n                    \"The input text: %s, does not contain '，' or '。', which is not a complete verse and may result in unexpected output\"\n                    % text)\n            
else:\n                parts = text[:-1].split('，')\n                if len(parts) == 2 and len(parts[0]) != len(parts[1]):\n                    logger.warning(\n                        \"The input text: %s, is not an antithetical couplet, which may result in unexpected output\" % text)\n\n            for char in text:\n                if not '\\u4e00' <= char <= '\\u9fff' and char not in ['，', '。']:\n                    logger.warning(\n                        \"The input text: %s, contains characters that are neither Chinese nor '，' '。', which may result in unexpected output\"\n                        % text)\n                    break\n\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set to False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        self.model.eval()\n        results = []\n        for text in predicted_data:\n            sample_results = []\n            encode_text = self.tokenizer.encode(text)\n            src_ids = paddle.to_tensor(encode_text['input_ids']).unsqueeze(0)\n            src_sids = paddle.to_tensor(encode_text['token_type_ids']).unsqueeze(0)\n            output_ids = beam_search_infilling(\n                self.model,\n                src_ids,\n                src_sids,\n                eos_id=self.tokenizer.vocab['[SEP]'],\n                sos_id=self.tokenizer.vocab['[CLS]'],\n                attn_id=self.tokenizer.vocab['[MASK]'],\n                pad_id=self.tokenizer.vocab['[PAD]'],\n                unk_id=self.tokenizer.vocab['[UNK]'],\n                vocab_size=len(self.tokenizer.vocab),\n                max_decode_len=80,\n                max_encode_len=20,\n                beam_width=beam_width,\n                tgt_type_id=1)\n            output_str = self.rev_lookup(output_ids[0])\n\n            for ostr in 
output_str.tolist():\n                if '[SEP]' in ostr:\n                    ostr = ostr[:ostr.index('[SEP]')]\n                sample_results.append(\"\".join(ostr))\n            results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n\n\nif __name__ == \"__main__\":\n    module = ErnieGen()\n    for result in module.generate(['昔年旅南服，始识王荆州。', '高名出汉阴，禅阁跨香岑。'], beam_width=5, use_gpu=True):\n        print(result)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_tiny_couplet/README.md",
    "content": "ernie_tiny_couplet是一个对联生成模型，它由ernie_tiny预训练模型经PaddleHub TextGenerationTask微调而来，仅支持预测，如需进一步微调请参考PaddleHub text_generation demo。\n\n```shell\n$ hub install ernie_tiny_couplet==1.0.0\n```\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/paddlehub-img%2Fernie_tiny_framework.PNG\" hspace='10'/> <br />\n</p>\n\n本预测module系ernie_tiny预训练模型经由TextGenerationTask微调而来，有关ernie\\_tiny的介绍请参考[ernie_tiny module](https://www.paddlepaddle.org.cn/hubdetail?name=ernie_tiny&en_category=SemanticModel)，微调方式请参考[text_generation demo](https://github.com/PaddlePaddle/PaddleHub/tree/release/v1.8/demo/text_generation)，预训练模型转换成预测module的转换方式请参考[Fine-tune保存的模型如何转化为一个PaddleHub Module](https://github.com/PaddlePaddle/PaddleHub/blob/develop/docs/tutorial/finetuned_model_to_module.md)\n\n## 命令行预测\n\n```shell\n$ hub run ernie_tiny_couplet --input_text '风吹云乱天垂泪'\n```\n命令行预测只支持使用CPU预测，如需使用GPU，请使用API方式预测。\n\n## API\n```python\ndef generate(texts)\n```\n\n对联预测接口，输入上联文本，输出下联文本。该接口封装了上联文本使用`hub.BertTokenizer`编码的过程，因此它的调用方式比demo中提供的[predcit接口](https://github.com/PaddlePaddle/PaddleHub/blob/develop/demo/text_generation/predict.py#L83)简单。\n\n**参数**\n\n> texts(list\\[str\\])： 上联文本。\n\n**返回**\n\n> result(list\\[str\\]): 下联文本。每个上联会对应输出10个下联。\n\n**代码示例**\n\n```python\nimport paddlehub as hub\n\n# Load ernie pretrained model\nmodule = hub.Module(name=\"ernie_tiny_couplet\", use_gpu=True)\nresults = module.generate([\"风吹云乱天垂泪\", \"若有经心风过耳\"])\nfor result in results:\n    print(result)\n```\n\n## 服务部署\n\nPaddleHub Serving 可以部署在线服务。\n\n### 第一步：启动PaddleHub Serving\n\n运行启动命令：\n```shell\n$ hub serving start -m ernie_tiny_couplet\n```\n\n这样就完成了一个服务化API的部署，默认端口号为8866。\n\n**NOTE:** 服务部署只支持使用CPU，如需使用GPU，请使用API方式预测。\n\n### 第二步：发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\n# 发送HTTP请求\n\ndata = {'texts':[\"风吹云乱天垂泪\", \"若有经心风过耳\"]}\nheaders = {\"Content-type\": \"application/json\"}\nurl = \"http://127.0.0.1:8866/predict/ernie_tiny_couplet\"\nr = 
requests.post(url=url, headers=headers, data=json.dumps(data))\n\n# 保存结果\nresults = r.json()[\"results\"]\nprint(results)\n```\n\n##   查看代码\n\nhttps://github.com/PaddlePaddle/PaddleHub/blob/develop/demo/text_generation\n\n\n## 依赖\n\npaddlepaddle >= 1.8.2\n\npaddlehub >= 1.8.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布。\n"
  },
  {
    "path": "modules/text/text_generation/ernie_tiny_couplet/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_tiny_couplet/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport ast\nimport argparse\n\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, serving, runnable\nfrom paddlehub.module.nlp_module import DataFormatError\n\n\n@moduleinfo(\n    name=\"ernie_tiny_couplet\",\n    version=\"1.0.0\",\n    summary=\"couplet generation model fine-tuned with ernie_tiny module\",\n    author=\"paddlehub\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieTinyCouplet(hub.NLPPredictionModule):\n    def _initialize(self, use_gpu=False):\n        # Load Paddlehub ERNIE Tiny pretrained model\n        self.module = hub.Module(name=\"ernie_tiny\")\n        inputs, outputs, program = self.module.context(trainable=True, max_seq_len=128)\n\n        # Download dataset and get its label list and label num\n        # If you just want labels information, you can omit its tokenizer parameter to avoid preprocessing the train set.\n        dataset = hub.dataset.Couplet()\n        self.label_list = dataset.get_labels()\n\n        # Setup RunConfig for PaddleHub Fine-tune API\n        config = hub.RunConfig(\n            use_data_parallel=False,\n            use_cuda=use_gpu,\n            batch_size=1,\n            checkpoint_dir=os.path.join(self.directory, \"assets\", \"ckpt\"),\n            strategy=hub.AdamWeightDecayStrategy())\n\n        # 
Construct transfer learning network\n        # Use \"pooled_output\" for classification tasks on an entire sentence.\n        # Use \"sequence_output\" for token-level output.\n        pooled_output = outputs[\"pooled_output\"]\n        sequence_output = outputs[\"sequence_output\"]\n\n        # Define a text generation fine-tune task by PaddleHub's API\n        self.gen_task = hub.TextGenerationTask(\n            feature=pooled_output,\n            token_feature=sequence_output,\n            max_seq_len=128,\n            num_classes=dataset.num_labels,\n            config=config,\n            metrics_choices=[\"bleu\"])\n\n    def generate(self, texts):\n        # Add 0x02 between characters to match the format of training data,\n        # otherwise the length of prediction results will not match the input string\n        # if the input string contains non-Chinese characters.\n        formatted_text_a = list(map(\"\\002\".join, texts))\n\n        # Use the appropriate tokenizer to preprocess the data\n        # ernie_tiny uses BertTokenizer as well.\n        tokenizer = hub.BertTokenizer(vocab_file=self.module.get_vocab_path())\n        encoded_data = [tokenizer.encode(text=text, max_seq_len=128) for text in formatted_text_a]\n        results = self.gen_task.predict(data=encoded_data, label_list=self.label_list, accelerate_mode=False)\n        results = [[\"\".join(sample_result) for sample_result in sample_results] for sample_results in results]\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' 
% self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data)\n\n        return results\n\n    @serving\n    def serving_method(self, texts):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        results = self.generate(texts)\n        return results\n\n\nif __name__ == '__main__':\n    module = ErnieTinyCouplet()\n    results = module.generate([\"风吹云乱天垂泪\", \"若有经心风过耳\"])\n    for result in results:\n        print(result)\n"
  },
  {
    "path": "modules/text/text_generation/ernie_zeus/README.md",
    "content": "# ernie_zeus\n\n|模型名称|ernie_zeus|\n| :--- | :---: |\n|类别|文本-文本生成|\n|网络|-|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|-|\n|最新更新日期|2022-08-16|\n|数据指标|-|\n\n## 一、模型基本信息\n### 应用效果展示\n- 作文创作：\n    - 作文标题：诚以养德，信以修身\n\n    - 作文：翻开我的书橱，展现在眼前的就是《曾国藩家书》。每当读起这些充满哲理的内容时，心里总会不禁佩服他。他虽出生于官宦之家，但并没有因此而骄傲自大，从小养成了平淡做人、踏实肯干的好品质，最后更赢得了属下和朋友们对他的一致认同和赞赏。由此可见，只要平时注意锻炼自己，处事脚踏实地，定能收获一番丰硕的成果！记得有句话叫“以诚待人”。我觉得曾国藩就是始终把做到真诚与诚信作为修身立业的准则和美德。\n\n- 文案创作：\n    - 产品描述：芍药香氛的沐浴乳\n\n    - 文案：使用多种纯天然草本植物精华，泡沫细腻绵密，丰富的维他命及矿物质滋养皮肤。成分温和安全，适合干性、中性肌肤或敏感性肌肤使用！\n\n### 模型介绍\nERNIE 3.0 Zeus 是 ERNIE 3.0 系列模型的最新升级。其除了对无标注数据和知识图谱的学习之外，还通过持续学习对百余种不同形式的任务数据学习。实现了任务知识增强，显著提升了模型的零样本/小样本学习能力。\n\n更多详情参考 [文心大模型官网](https://wenxin.baidu.com/wenxin) 及 [ERNIE 3.0 Zeus 项目主页](https://wenxin.baidu.com/wenxin/modelbasedetail/ernie3_zeus)。\n\n## 二、安装\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n\n  - paddlehub >= 2.2.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install ernie_zeus\n    ```\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n- ### 3. 使用申请（可选）\n  - 请前往 [文心旸谷社区](https://wenxin.baidu.com/moduleApi/key) 申请使用本模型所需的 API key 和 Secret Key。\n\n\n## 三、模型 API 预测\n- ### 1. 命令行预测\n\n  - ```bash\n    # 作文创作\n    # 请设置 '--ak' 和 '--sk' 参数\n    # 或者设置 'WENXIN_AK' 和 'WENXIN_SK' 环境变量\n    # 更多细节参考下方 API 说明\n    $ hub run ernie_zeus \\\n        --task composition_generation \\\n        --text '诚以养德，信以修身'\n    ```\n\n    - **参数**\n      - --task(str): 指定任务名称，与 API 名称保持一直\n      - --text(str): 根据不同的任务输入所需的文本。\n      - 其他参数请参考后续 API 章节。\n\n- ### 2. 
预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # 请设置 'ak' 和 'sk' 参数\n    # 或者设置 'WENXIN_AK' 和 'WENXIN_SK' 环境变量\n    # 更多细节参考下方 API 说明\n    model = hub.Module(name='ernie_zeus')\n\n    # 作文创作\n    result = model.composition_generation(\n        text='诚以养德，信以修身'\n    )\n\n    print(result)\n    ```\n\n- ### 3. API\n  - ```python\n    def __init__(\n        ak: Optional[str] = None,\n        sk: Optional[str] = None\n    ) -> None\n    ```\n\n    - 初始化 API\n\n    - **参数**\n\n      - ak(Optional[str]): 文心 API AK，默认为 None，即从环境变量 'WENXIN_AK' 中获取；\n      - sk(Optional[str]): 文心 API SK，默认为 None，即从环境变量 'WENXIN_SK' 中获取。\n\n  - ```python\n    def custom_generation(\n        text: str,\n        min_dec_len: int = 1,\n        seq_len: int = 128,\n        topp: float = 1.0,\n        penalty_score: float = 1.0,\n        stop_token: str = '',\n        task_prompt: str = '',\n        penalty_text: str = '',\n        choice_text: str = '',\n        is_unidirectional: bool = False,\n        min_dec_penalty_text: str = '',\n        logits_bias: int = -10000,\n        mask_type: str = 'word'\n    ) -> str\n    ```\n    - 自定义文本生成 API\n\n    - **参数**\n      - text(str): 模型的输入文本, 为 prompt 形式的输入。文本长度 [1, 1000]。注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512。\n      - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n      - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n      - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n      - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n      - stop_token(str): 预测结果解析时使用的结束字符串, 碰到对应字符串则直接截断并返回。可以通过设置该值, 过滤掉 few-shot 等场景下模型重复的 cases。\n      - task_prompt(str): 指定预置的任务模板, 效果更好。\n        PARAGRAPH: 引导模型生成一段文章; SENT: 引导模型生成一句话; ENTITY: 引导模型生成词组;\n        Summarization: 摘要; MT: 翻译; Text2Annotation: 抽取; Correction: 纠错;\n        QA_MRC: 阅读理解; 
Dialogue: 对话; QA_Closed_book: 闭卷问答; QA_Multi_Choice: 多选问答;\n        QuestionGeneration: 问题生成; Paraphrasing: 复述; NLI: 文本蕴含识别; SemanticMatching: 匹配;\n        Text2SQL: 文本描述转SQL; TextClassification: 文本分类; SentimentClassification: 情感分析;\n        zuowen: 写作文; adtext: 写文案; couplet: 对对联; novel: 写小说; cloze: 文本补全; Misc: 其它任务。\n      - penalty_text(str): 模型会惩罚该字符串中的 token。通过设置该值, 可以减少某些冗余与异常字符的生成。\n      - choice_text(str): 模型只能生成该字符串中的 token 的组合。通过设置该值, 可以对某些抽取式任务进行定向调优。\n      - is_unidirectional(bool): False 表示模型为双向生成, True 表示模型为单向生成。续写与 few-shot 等通用场景建议采用单向生成方式, 而完形填空等任务相关场景建议采用双向生成方式。\n      - min_dec_penalty_text(str): 与最小生成长度搭配使用, 可以在 min_dec_len 步前不让模型生成该字符串中的 tokens。\n      - logits_bias(int): 配合 penalty_text 使用, 对给定的 penalty_text 中的 token 增加一个 logits_bias, 可以通过设置该值屏蔽某些 token 生成的概率。\n      - mask_type(str): 设置该值可以控制模型生成粒度。可选参数为 word, sentence, paragraph。\n\n    - **返回**\n      - text(str): 生成的文本。\n\n  - ```python\n     def text_cloze(\n         text: str,\n         min_dec_len: int = 1,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.0\n     ) -> str\n     ```\n\n     - 完形填空 API\n\n     - **参数**\n       - text(str): 文字段落。使用 [MASK] 标记待补全文字。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 补全词语。\n\n  - ```python\n     def composition_generation(\n         text: str,\n         min_dec_len: int = 128,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.2\n     ) -> str\n     ```\n     - 作文创作 API\n\n     - **参数**\n       - text(str): 作文题目。\n       - 
min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 作文内容。\n\n  - ```python\n     def answer_generation(\n         text: str,\n         min_dec_len: int = 2,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.2\n     ) -> str\n     ```\n     - 自由问答 API\n\n     - **参数**\n       - text(str): 问题内容。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 问题答案。\n\n\n   - ```python\n     def couplet_continuation(\n         text: str,\n         min_dec_len: int = 2,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.0\n     ) -> str\n     ```\n     - 对联续写 API\n\n     - **参数**\n       - text(str): 对联上联。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - 
text(str): 对联下联。\n\n   - ```python\n     def copywriting_generation(\n         text: str,\n         min_dec_len: int = 32,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.2\n     ) -> str\n     ```\n     - 文案创作 API\n\n     - **参数**\n       - text(str): 产品描述。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 产品文案。\n\n  - ```python\n     def novel_continuation(\n         text: str,\n         min_dec_len: int = 2,\n         seq_len: int = 512,\n         topp: float = 0.9,\n         penalty_score: float = 1.2\n     ) -> str\n     ```\n     - 小说续写 API\n\n     - **参数**\n       - text(str): 小说上文。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 小说下文。\n\n  - ```python\n     def text_summarization(\n         text: str,\n         min_dec_len: int = 4,\n         seq_len: int = 512,\n         topp: float = 0.0,\n         penalty_score: float = 1.0\n     ) -> str\n     ```\n     - 文本摘要 API\n\n     - **参数**\n       - text(str): 文本段落。\n       - min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n       - seq_len(int): 输出结果的最大长度, 
因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n       - topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n       - penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n\n     - **返回**\n       - text(str): 段落摘要。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线文本生成服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m ernie_zeus\n    ```\n\n  - 这样就完成了一个文本生成的在线服务API的部署，默认端口号为8866。\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果。\n\n  - ```python\n    import requests\n    import json\n\n    # 发送HTTP请求\n    # 参数参考自定义文本生成接口\n    data = {'text': '巨大的白色城堡'}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/ernie_zeus\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 获取返回结果\n    print(r.json()[\"results\"])\n    ```\n\n- ### gradio app 支持\n  从paddlehub 2.3.1开始支持使用链接 http://127.0.0.1:8866/gradio/ernie_zeus 在浏览器中访问ernie_zeus的gradio app。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除默认 AK 和 SK\n\n* 1.2.0\n\n  添加 Serving 和 Gradio APP\n\n  ```shell\n  $ hub install ernie_zeus==1.2.0\n  ```\n"
  },
  {
    "path": "modules/text/text_generation/ernie_zeus/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/ernie_zeus/module.py",
    "content": "import argparse\nimport json\nimport os\n\nimport requests\n\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\n\ndef get_access_token(ak: str = None, sk: str = None) -> str:\n    '''\n    Get Access Token\n\n    Params:\n        ak(str): API Key\n        sk(str): Secret Key\n\n    Return:\n        access_token(str): Access Token\n    '''\n    ak = ak if ak else os.getenv('WENXIN_AK')\n    sk = sk if sk else os.getenv('WENXIN_SK')\n\n    assert ak and sk, RuntimeError(\n        'Please go to the wenxin official website to apply for AK and SK and set the parameters “ak” and “sk” correctly, or set the environment variables “WENXIN_AK” and “WENXIN_SK”.'\n    )\n\n    url = 'https://wenxin.baidu.com/younger/portal/api/oauth/token'\n    headers = {'Content-Type': 'application/x-www-form-urlencoded'}\n    datas = {'grant_type': 'client_credentials', 'client_id': ak, 'client_secret': sk}\n\n    responses = requests.post(url, datas, headers=headers)\n\n    assert responses.status_code == 200, f\"Network Error {responses.status_code}.\"\n\n    results = json.loads(responses.text)\n\n    assert results['msg'] == 'success', f\"Error message: '{results['msg']}'. 
Please check the ak and sk.\"\n\n    return results['data']\n\n\n@moduleinfo(name='ernie_zeus',\n            type='nlp/text_generation',\n            author='paddlepaddle',\n            author_email='',\n            summary='ernie_zeus',\n            version='1.2.0')\nclass ERNIEZeus:\n\n    def __init__(self, ak: str = None, sk: str = None) -> None:\n        self.access_token = get_access_token(ak, sk)\n\n    @serving\n    def custom_generation(self,\n                          text: str,\n                          min_dec_len: int = 1,\n                          seq_len: int = 128,\n                          topp: float = 1.0,\n                          penalty_score: float = 1.0,\n                          stop_token: str = '',\n                          task_prompt: str = '',\n                          penalty_text: str = '',\n                          choice_text: str = '',\n                          is_unidirectional: bool = False,\n                          min_dec_penalty_text: str = '',\n                          logits_bias: int = -10000,\n                          mask_type: str = 'word') -> str:\n        '''\n        ERNIE 3.0 Zeus 自定义接口\n\n        Params:\n            text(str): 模型的输入文本, 为 prompt 形式的输入。文本长度 [1, 1000]。注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512。\n            min_dec_len(int): 输出结果的最小长度, 避免因模型生成 END 或者遇到用户指定的 stop_token 而生成长度过短的情况,与 seq_len 结合使用来设置生成文本的长度范围 [1, seq_len]。\n            seq_len(int): 输出结果的最大长度, 因模型生成 END 或者遇到用户指定的 stop_token, 实际返回结果可能会小于这个长度, 与 min_dec_len 结合使用来控制生成文本的长度范围 [1, 1000]。(注: ERNIE 3.0-1.5B 模型取值范围 ≤ 512)\n            topp(float): 影响输出文本的多样性, 取值越大, 生成文本的多样性越强。取值范围 [0.0, 1.0]。\n            penalty_score(float): 通过对已生成的 token 增加惩罚, 减少重复生成的现象。值越大表示惩罚越大。取值范围 [1.0, 2.0]。\n            stop_token(str): 预测结果解析时使用的结束字符串, 碰到对应字符串则直接截断并返回。可以通过设置该值, 过滤掉 few-shot 等场景下模型重复的 cases。\n            task_prompt(str): 指定预置的任务模板, 效果更好。\n                              PARAGRAPH: 引导模型生成一段文章; SENT: 引导模型生成一句话; ENTITY: 引导模型生成词组;\n                          
    Summarization: 摘要; MT: 翻译; Text2Annotation: 抽取; Correction: 纠错;\n                              QA_MRC: 阅读理解; Dialogue: 对话; QA_Closed_book: 闭卷问答; QA_Multi_Choice: 多选问答;\n                              QuestionGeneration: 问题生成; Paraphrasing: 复述; NLI: 文本蕴含识别; SemanticMatching: 匹配;\n                              Text2SQL: 文本描述转SQL; TextClassification: 文本分类; SentimentClassification: 情感分析;\n                              zuowen: 写作文; adtext: 写文案; couplet: 对对联; novel: 写小说; cloze: 文本补全; Misc: 其它任务。\n            penalty_text(str): 模型会惩罚该字符串中的 token。通过设置该值, 可以减少某些冗余与异常字符的生成。\n            choice_text(str): 模型只能生成该字符串中的 token 的组合。通过设置该值, 可以对某些抽取式任务进行定向调优。\n            is_unidirectional(bool): False 表示模型为双向生成, True 表示模型为单向生成。续写与 few-shot 等通用场景建议采用单向生成方式, 而完形填空等任务相关场景建议采用双向生成方式。\n            min_dec_penalty_text(str): 与最小生成长度搭配使用, 可以在 min_dec_len 步前不让模型生成该字符串中的 tokens。\n            logits_bias(int): 配合 penalty_text 使用, 对给定的 penalty_text 中的 token 增加一个 logits_bias, 可以通过设置该值屏蔽某些 token 生成的概率。\n            mask_type(str): 设置该值可以控制模型生成粒度。可选参数为 word, sentence, paragraph。\n\n        Return:\n            text(str): 生成的文本\n        '''\n        url = 'https://wenxin.baidu.com/moduleApi/portal/api/rest/1.0/ernie/3.0.28/zeus?from=paddlehub'\n        access_token = self.access_token\n        headers = {'Content-Type': 'application/x-www-form-urlencoded'}\n        datas = {\n            'access_token': access_token,\n            'text': text,\n            'min_dec_len': min_dec_len,\n            'seq_len': seq_len,\n            'topp': topp,\n            'penalty_score': penalty_score,\n            'stop_token': stop_token,\n            'task_prompt': task_prompt,\n            'penalty_text': penalty_text,\n            'choice_text': choice_text,\n            'is_unidirectional': int(is_unidirectional),\n            'min_dec_penalty_text': min_dec_penalty_text,\n            'logits_bias': logits_bias,\n            'mask_type': mask_type,\n        }\n\n        responses = requests.post(url, 
datas, headers=headers)\n\n        assert responses.status_code == 200, f\"Network Error {responses.status_code}.\"\n\n        results = json.loads(responses.text)\n\n        assert results['code'] == 0, f\"Error message: '{results['msg']}'.\"\n\n        return results['data']['result']\n\n    def text_generation(self,\n                        text: str,\n                        min_dec_len: int = 4,\n                        seq_len: int = 512,\n                        topp: float = 0.9,\n                        penalty_score: float = 1.2) -> str:\n        '''\n        文本生成\n        '''\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='PARAGRAPH',\n                                      penalty_text='[{[gEND]',\n                                      choice_text='',\n                                      is_unidirectional=True,\n                                      min_dec_penalty_text='。？：！[<S>]',\n                                      logits_bias=-10,\n                                      mask_type='paragraph')\n\n    def text_summarization(self,\n                           text: str,\n                           min_dec_len: int = 4,\n                           seq_len: int = 512,\n                           topp: float = 0.0,\n                           penalty_score: float = 1.0) -> str:\n        '''\n        摘要生成\n        '''\n        text = \"文章：{} 摘要：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n     
                                 task_prompt='Summarization',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=False,\n                                      min_dec_penalty_text='',\n                                      logits_bias=-10000,\n                                      mask_type='word')\n\n    def copywriting_generation(self,\n                               text: str,\n                               min_dec_len: int = 32,\n                               seq_len: int = 512,\n                               topp: float = 0.9,\n                               penalty_score: float = 1.2) -> str:\n        '''\n        文案生成\n        '''\n        text = \"标题：{} 文案：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='adtext',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=False,\n                                      min_dec_penalty_text='',\n                                      logits_bias=-10000,\n                                      mask_type='word')\n\n    def novel_continuation(self,\n                           text: str,\n                           min_dec_len: int = 2,\n                           seq_len: int = 512,\n                           topp: float = 0.9,\n                           penalty_score: float = 1.2) -> str:\n        '''\n        小说续写\n        '''\n        text = \"上文：{} 下文：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n            
                          seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='gPARAGRAPH',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=True,\n                                      min_dec_penalty_text='。？：！[<S>]',\n                                      logits_bias=-5,\n                                      mask_type='paragraph')\n\n    def answer_generation(self,\n                          text: str,\n                          min_dec_len: int = 2,\n                          seq_len: int = 512,\n                          topp: float = 0.9,\n                          penalty_score: float = 1.2) -> str:\n        '''\n        自由问答\n        '''\n        text = \"问题：{} 回答：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='qa',\n                                      penalty_text='[gEND]',\n                                      choice_text='',\n                                      is_unidirectional=True,\n                                      min_dec_penalty_text='。？：！[<S>]',\n                                      logits_bias=-5,\n                                      mask_type='paragraph')\n\n    def couplet_continuation(self,\n                             text: str,\n                             min_dec_len: int = 2,\n                             seq_len: int = 512,\n                             topp: float = 0.9,\n                             penalty_score: float = 1.0) -> 
str:\n        '''\n        对联续写\n        '''\n        text = \"上联：{} 下联：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='couplet',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=False,\n                                      min_dec_penalty_text='',\n                                      logits_bias=-10000,\n                                      mask_type='word')\n\n    def composition_generation(self,\n                               text: str,\n                               min_dec_len: int = 128,\n                               seq_len: int = 512,\n                               topp: float = 0.9,\n                               penalty_score: float = 1.2) -> str:\n        '''\n        作文创作\n        '''\n        text = \"作文题目：{} 正文：\".format(text)\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='zuowen',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=False,\n                                      min_dec_penalty_text='',\n                                      logits_bias=-10000,\n                                      mask_type='word')\n\n    def text_cloze(self,\n                   text: str,\n                   
min_dec_len: int = 1,\n                   seq_len: int = 512,\n                   topp: float = 0.9,\n                   penalty_score: float = 1.0) -> str:\n        '''\n        完形填空\n        '''\n        return self.custom_generation(text,\n                                      min_dec_len,\n                                      seq_len,\n                                      topp,\n                                      penalty_score,\n                                      stop_token='',\n                                      task_prompt='cloze',\n                                      penalty_text='',\n                                      choice_text='',\n                                      is_unidirectional=False,\n                                      min_dec_penalty_text='',\n                                      logits_bias=-10000,\n                                      mask_type='word')\n\n    @runnable\n    def cmd(self, argvs):\n        parser = argparse.ArgumentParser(description=\"Run the {}\".format(self.name),\n                                         prog=\"hub run {}\".format(self.name),\n                                         usage='%(prog)s',\n                                         add_help=True)\n\n        parser.add_argument('--text', type=str, required=True)\n        parser.add_argument('--min_dec_len', type=int, default=1)\n        parser.add_argument('--seq_len', type=int, default=128)\n        parser.add_argument('--topp', type=float, default=1.0)\n        parser.add_argument('--penalty_score', type=float, default=1.0)\n        parser.add_argument('--stop_token', type=str, default='')\n        parser.add_argument('--task_prompt', type=str, default='')\n        parser.add_argument('--penalty_text', type=str, default='')\n        parser.add_argument('--choice_text', type=str, default='')\n        # argparse's type=bool treats any non-empty string as True, so parse the flag explicitly\n        parser.add_argument('--is_unidirectional', type=lambda v: str(v).lower() in ('true', '1'), default=False)\n        parser.add_argument('--min_dec_penalty_text', type=str, 
default='')\n        parser.add_argument('--logits_bias', type=int, default=-10000)\n        parser.add_argument('--mask_type', type=str, default='word')\n        parser.add_argument('--ak', type=str, default='')\n        parser.add_argument('--sk', type=str, default='')\n        parser.add_argument('--task', type=str, default='custom_generation')\n\n        args = parser.parse_args(argvs)\n\n        func = getattr(self, args.task)\n\n        if (args.ak != '') and (args.sk != ''):\n            self.access_token = get_access_token(args.ak, args.sk)\n\n        kwargs = vars(args)\n        if kwargs['task'] not in ['custom_generation']:\n            kwargs.pop('stop_token')\n            kwargs.pop('task_prompt')\n            kwargs.pop('penalty_text')\n            kwargs.pop('choice_text')\n            kwargs.pop('is_unidirectional')\n            kwargs.pop('min_dec_penalty_text')\n            kwargs.pop('logits_bias')\n            kwargs.pop('mask_type')\n            default_kwargs = {'min_dec_len': 1, 'seq_len': 128, 'topp': 1.0, 'penalty_score': 1.0}\n        else:\n            default_kwargs = {\n                'min_dec_len': 1,\n                'seq_len': 128,\n                'topp': 1.0,\n                'penalty_score': 1.0,\n                'stop_token': '',\n                'task_prompt': '',\n                'penalty_text': '',\n                'choice_text': '',\n                'is_unidirectional': False,\n                'min_dec_penalty_text': '',\n                'logits_bias': -10000,\n                'mask_type': 'word'\n            }\n        kwargs.pop('task')\n        kwargs.pop('ak')\n        kwargs.pop('sk')\n\n        for k in default_kwargs.keys():\n            if kwargs[k] == default_kwargs[k]:\n                kwargs.pop(k)\n\n        return func(**kwargs)\n\n    def create_gradio_app(self):\n        import gradio as gr\n\n        def inference(task: str,\n                      text: str,\n                      min_dec_len: int = 2,\n      
                seq_len: int = 512,\n                      topp: float = 0.9,\n                      penalty_score: float = 1.0):\n\n            func = getattr(self, task)\n            try:\n                result = func(text, min_dec_len, seq_len, topp, penalty_score)\n                return result\n            except Exception as error:\n                return str(error)\n\n        examples = [\n            [\n                'text_summarization',\n                '外媒7月18日报道，阿联酋政府当日证实该国将建设首个核电站，以应对不断上涨的用电需求。分析称阿联酋作为世界第三大石油出口国，更愿意将该能源用于出口，而非发电。首座核反应堆预计在2017年运行。cntv李婉然编译报道',\n                4, 512, 0.3, 1.0\n            ],\n            ['copywriting_generation', '芍药香氛的沐浴乳', 8, 512, 0.9, 1.2],\n            ['novel_continuation', '昆仑山可以说是天下龙脉的根源，所有的山脉都可以看作是昆仑的分支。这些分出来的枝枝杈杈，都可以看作是一条条独立的龙脉。', 2, 512, 0.9, 1.2],\n            ['answer_generation', '做生意的基本原则是什么？', 2, 512, 0.5, 1.2],\n            ['couplet_continuation', '天增岁月人增寿', 2, 512, 1.0, 1.0],\n            ['composition_generation', '拔河比赛', 128, 512, 0.9, 1.2],\n            ['text_cloze', '她有着一双[MASK]的眼眸。', 1, 512, 0.3, 1.2],\n        ]\n\n        text = gr.Textbox(\n            label=\"input_text\",\n            placeholder=\"Please enter Chinese text.\",\n        )\n        task = gr.Dropdown(label=\"task\",\n                           choices=[\n                               'text_summarization', 'copywriting_generation', 'novel_continuation',\n                               'answer_generation', 'couplet_continuation', 'composition_generation', 'text_cloze'\n                           ],\n                           value='text_summarization')\n\n        min_dec_len = gr.Slider(minimum=1, maximum=511, value=1, label=\"min_dec_len\", step=1, interactive=True)\n        seq_len = gr.Slider(minimum=2, maximum=512, value=128, label=\"seq_len\", step=1, interactive=True)\n        topp = gr.Slider(minimum=0.0, maximum=1.0, value=1.0, label=\"topp\", step=0.01, interactive=True)\n        penalty_score = 
gr.Slider(minimum=1.0,\n                                  maximum=2.0,\n                                  value=1.0,\n                                  label=\"penalty_score\",\n                                  step=0.01,\n                                  interactive=True)\n        text_gen = gr.Text(label=\"generated_text\")\n        interface = gr.Interface(inference, [task, text, min_dec_len, seq_len, topp, penalty_score], [text_gen],\n                                 examples=examples,\n                                 allow_flagging='never',\n                                 title='ERNIE-Zeus')\n        return interface\n"
  },
  {
    "path": "modules/text/text_generation/ernie_zeus/requirements.txt",
    "content": "requests\n"
  },
  {
    "path": "modules/text/text_generation/ernie_zeus/test.py",
    "content": "import unittest\n\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n\n    @classmethod\n    def setUpClass(cls) -> None:\n        cls.module = hub.Module(name='ernie_zeus')\n\n    def test_custom_generation(self):\n        results = self.module.custom_generation('你好，')\n        self.assertIsInstance(results, str)\n\n    def test_text_generation(self):\n        results = self.module.text_generation('给宠物猫起一些可爱的名字。名字：')\n        self.assertIsInstance(results, str)\n\n    def test_text_summarization(self):\n        results = self.module.text_summarization(\n            '在芬兰、瑞典提交“入约”申请近一个月来，北约成员国内部尚未对此达成一致意见。与此同时，俄罗斯方面也多次对北约“第六轮扩张”发出警告。据北约官网显示，北约秘书长斯托尔滕贝格将于本月12日至13日出访瑞典和芬兰，并将分别与两国领导人进行会晤。'\n        )\n        self.assertIsInstance(results, str)\n\n    def test_copywriting_generation(self):\n        results = self.module.copywriting_generation('芍药香氛的沐浴乳')\n        self.assertIsInstance(results, str)\n\n    def test_modulenovel_continuation(self):\n        results = self.module.novel_continuation('昆仑山可以说是天下龙脉的根源，所有的山脉都可以看作是昆仑的分支。这些分出来的枝枝杈杈，都可以看作是一条条独立的龙脉。')\n        self.assertIsInstance(results, str)\n\n    def test_answer_generation(self):\n        results = self.module.answer_generation('交朋友的原则是什么？')\n        self.assertIsInstance(results, str)\n\n    def test_couplet_continuation(self):\n        results = self.module.couplet_continuation('五湖四海皆春色')\n        self.assertIsInstance(results, str)\n\n    def test_composition_generation(self):\n        results = self.module.composition_generation('诚以养德，信以修身')\n        self.assertIsInstance(results, str)\n\n    def test_text_cloze(self):\n        results = self.module.text_cloze('她有着一双[MASK]的眼眸。')\n        self.assertIsInstance(results, str)\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  },
  {
    "path": "modules/text/text_generation/plato-mini/README.md",
    "content": "# plato-mini\n\n| 模型名称            |       plato-mini       |\n| :------------------ | :--------------------: |\n| 类别                |     文本-文本生成      |\n| 网络                |  Unified Transformer   |\n| 数据集              | 十亿级别的中文对话数据 |\n| 是否支持Fine-tuning |           否           |\n| 模型大小            |         5.28K          |\n| 最新更新日期        |       2021-06-30       |\n| 数据指标            |           -            |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - [UnifiedTransformer](https://arxiv.org/abs/2006.16779)以[Transformer](https://arxiv.org/abs/1706.03762) 编码器为网络基本组件，采用灵活的注意力机制，十分适合文本生成任务，并在模型输入中加入了标识不同对话技能的special token，使得模型能同时支持闲聊对话、推荐对话和知识对话。\n该模型在十亿级别的中文对话数据上进行预训练，通过PaddleHub加载后可直接用于对话任务，仅支持中文对话。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n  \n- ### 2、安装\n\n  - ```shell\n    $ hub install plato-mini\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- plato-mini不支持一行预测，仅支持python代码预测\n\n- ### 1、预测代码示例\n\n  - ```python\n    # 非交互模式\n    import paddlehub as hub\n    \n    model = hub.Module(name='plato-mini')\n    data = [[\"你是谁？\"], [\"你好啊。\", \"吃饭了吗？\",]]\n    result = model.predict(data)\n    print(result)\n    \n    # ['我是一个小角色,我是在玩游戏', '吃过了呢,你吃了没?']\n    # 每次的运行结果可能有所不同\n    ```\n    \n  - ```python\n    # 交互模式\n    # 使用命令行与机器人对话\n    import paddlehub as hub\n    import readline\n    \n    model = hub.Module(name='plato-mini')\n    with model.interactive_mode(max_turn=3):\n        while True:\n            human_utterance = input(\"[Human]: \").strip()\n            robot_utterance = model.predict(human_utterance)[0]\n            print(\"[Bot]: %s\"%robot_utterance)\n    ```\n\n- ### 2、API\n\n  - ```python\n    def predict(data, 
max_seq_len=512, batch_size=1, use_gpu=False, **kwargs):\n    ```\n\n    - 预测API，输入对话上下文，输出机器回复。\n    - **参数**\n      - data(Union[List[List[str]], str]): 在非交互模式中，数据类型为List[List[str]]，每个样本是一个List[str]，表示对话内容\n      - max_seq_len(int): 每个样本的最大文本长度\n      - batch_size(int): 进行预测的batch_size\n      - use_gpu(bool): 是否使用gpu执行预测\n      - kwargs: 预测时传给模型的额外参数，以keyword方式传递。其余的参数详情请查看[UnifiedTransformer](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/dialogue/unified_transformer)。\n    - **返回**\n      - results(List[str]): 每个元素为相应对话中模型的新回复\n    \n  - ```python\n    def interactive_mode(max_turn=3):\n    ```\n  \n    - 配置交互模式并进入。\n    - **参数**\n      - max_turn(int): 模型能记忆的对话轮次，当max_turn为1时，模型只能记住当前对话，无法获知之前的对话内容。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线对话机器人服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m plato-mini -p 8866\n    ```\n\n  - 这样就完成了一个对话机器人服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，请在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU则无需设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端后，以下几行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n    \n    texts = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\n    data = {\"data\": texts}\n    # 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\n    url = \"http://127.0.0.1:8866/predict/plato_mini\"\n    # 指定post请求的headers为application/json方式\n    headers = {\"Content-Type\": \"application/json\"}\n    \n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    \n    # {'msg': '', 'results': ['是个好日子啊!', '下雨就不出门了,在家宅着吧'], 'status': '000'}\n    # 每次的运行结果可能有所不同\n    ```\n    \n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $ hub install plato-mini==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/plato-mini/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/plato-mini/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nfrom collections import deque\nfrom typing import List, Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.data import Pad\nfrom paddlenlp.transformers import UnifiedTransformerLMHeadModel, UnifiedTransformerTokenizer\n\nfrom plato_mini.utils import select_response\n\n\n@moduleinfo(\n    name=\"plato-mini\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass UnifiedTransformer(nn.Layer):\n    def __init__(self):\n        super(UnifiedTransformer, self).__init__()\n\n        self.model = UnifiedTransformerLMHeadModel.from_pretrained('plato-mini')\n        self.tokenizer = UnifiedTransformerTokenizer.from_pretrained('plato-mini')\n        self._interactive_mode = False\n\n    def _convert_text_to_input(self, texts: List[str], max_seq_len: int):\n        \"\"\"\n        Convert input strings to tokens.\n        \"\"\"\n        return self.tokenizer.dialogue_encode(\n            texts, max_seq_len=max_seq_len, add_start_token_as_response=True, is_split_into_words=False)\n\n    def _batchify(self, data: List[List[str]], max_seq_len: int, batch_size: int):\n        \"\"\"\n        Generate input batches.\n        \"\"\"\n        padding = 
False if batch_size == 1 else True\n        pad_func = Pad(pad_val=self.tokenizer.pad_token_id, pad_right=False, dtype=np.int64)\n\n        def pad_mask(batch_attention_mask):\n            batch_size = len(batch_attention_mask)\n            max_len = max(map(len, batch_attention_mask))\n            attention_mask = np.ones((batch_size, max_len, max_len), dtype='float32') * -1e9\n            for i, mask_data in enumerate(attention_mask):\n                seq_len = len(batch_attention_mask[i])\n                mask_data[-seq_len:, -seq_len:] = np.array(batch_attention_mask[i], dtype='float32')\n            # In order to ensure the correct broadcasting mechanism, expand one\n            # dimension to the second dimension (n_head of Transformer).\n            attention_mask = np.expand_dims(attention_mask, axis=1)\n            return attention_mask\n\n        def _parse_batch(batch_examples):\n            if padding:\n                input_ids = pad_func([example['input_ids'] for example in batch_examples])\n                token_type_ids = pad_func([example['token_type_ids'] for example in batch_examples])\n                position_ids = pad_func([example['position_ids'] for example in batch_examples])\n                attention_mask = pad_mask([example['attention_mask'] for example in batch_examples])\n            else:\n                input_ids = np.asarray([example['input_ids'] for example in batch_examples], dtype=np.int64)\n                token_type_ids = np.asarray([example['token_type_ids'] for example in batch_examples], dtype=np.int64)\n                position_ids = np.asarray([example['position_ids'] for example in batch_examples], dtype=np.int64)\n                attention_mask = np.asarray([example['attention_mask'] for example in batch_examples])\n                attention_mask = np.expand_dims(attention_mask, 0)\n\n            return input_ids, token_type_ids, position_ids, attention_mask\n\n        examples = []\n        for texts in data:\n         
   examples.append(self._convert_text_to_input(texts, max_seq_len))\n\n        # Separates the data into batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    @contextlib.contextmanager\n    def interactive_mode(self, max_turn=3):\n        \"\"\"\n        Enter the interactive mode.\n        \"\"\"\n        self._interactive_mode = True\n        self.max_turn = max_turn\n        self.context = deque(maxlen=self.max_turn)\n        yield\n        self.context.clear()\n        self._interactive_mode = False\n\n    def forward(self,\n                input_ids,\n                token_type_ids,\n                position_ids,\n                attention_mask,\n                max_length=64,\n                min_length=1,\n                decode_strategy='sampling',\n                temperature=1.0,\n                top_k=5,\n                top_p=1.0,\n                num_beams=0,\n                length_penalty=1.0,\n                early_stopping=False,\n                num_return_sequences=1):\n\n        ids, scores = self.model.generate(\n            input_ids=input_ids,\n            token_type_ids=token_type_ids,\n            position_ids=position_ids,\n            attention_mask=attention_mask,\n            max_length=max_length,\n            min_length=min_length,\n            decode_strategy=decode_strategy,\n            temperature=temperature,\n            top_k=top_k,\n            top_p=top_p,\n            num_beams=num_beams,\n            length_penalty=length_penalty,\n            early_stopping=early_stopping,\n            num_return_sequences=num_return_sequences)\n\n        return ids, scores\n\n    @serving\n    def predict(self,\n                data: Union[List[List[str]], str],\n                max_seq_len: 
int = 512,\n                batch_size: int = 1,\n                use_gpu: bool = False,\n                **kwargs):\n\n        if self._interactive_mode:\n            if isinstance(data, str):\n                self.context.append(data.strip())\n                data = [list(self.context)]\n            else:\n                raise ValueError(\"In the interactive mode, the input data should be a string.\")\n        elif not isinstance(data, list):\n            raise ValueError(\"If not in the interactive mode, the input data should be a list.\")\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, max_seq_len, batch_size)\n\n        results = []\n        self.eval()\n        for batch in batches:\n            input_ids, token_type_ids, position_ids, attention_mask = map(paddle.to_tensor, batch)\n            ids, scores = self(input_ids, token_type_ids, position_ids, attention_mask, **kwargs)\n            num_return_sequences = 1 if 'num_return_sequences' not in kwargs\\\n                else kwargs['num_return_sequences']\n            results.extend(\n                select_response(\n                    ids, scores, self.tokenizer, num_return_sequences=num_return_sequences, keep_space=False))\n\n        if self._interactive_mode:\n            self.context.append(results[0].strip())\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/plato-mini/requirements.txt",
    "content": "sentencepiece\n"
  },
  {
    "path": "modules/text/text_generation/plato-mini/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\n\ndef post_process_response(token_ids: List[int], tokenizer):\n    '''\n    Post-process the decoded sequence. Truncate from the first <eos>.\n    '''\n    eos_pos = len(token_ids)\n    for i, tok_id in enumerate(token_ids):\n        if tok_id == tokenizer.sep_token_id:\n            eos_pos = i\n            break\n    token_ids = token_ids[:eos_pos]\n    tokens = tokenizer.convert_ids_to_tokens(token_ids)\n    tokens = tokenizer.merge_subword(tokens)\n    return token_ids, tokens\n\n\ndef get_in_turn_repetition(pred: List[str], is_cn: bool = False):\n    '''\n    Get in-turn repetition.\n    '''\n    if len(pred) == 0:\n        return 1.0\n    if isinstance(pred[0], str):\n        pred = [tok.lower() for tok in pred]\n        if is_cn:\n            pred = \"\".join(pred)\n    tri_grams = set()\n    for i in range(len(pred) - 2):\n        tri_gram = tuple(pred[i:i + 3])\n        if tri_gram in tri_grams:\n            return True\n        tri_grams.add(tri_gram)\n    return False\n\n\ndef select_response(ids,\n                    scores: List[float],\n                    tokenizer,\n                    max_dec_len: int = None,\n                    num_return_sequences: int = 1,\n                    keep_space: bool = True):\n    '''\n    Select response with the highest score.\n    '''\n    ids = 
ids.numpy().tolist()\n    scores = scores.numpy()\n\n    if len(ids) != len(scores) or (len(ids) % num_return_sequences) != 0:\n        raise ValueError(\"the length of `ids` is {}, but the `num_return_sequences` is {}\".format(\n            len(ids), num_return_sequences))\n\n    group = []\n    tmp = []\n    for pred, score in zip(ids, scores):\n        pred_token_ids, pred_tokens = post_process_response(pred, tokenizer)\n        num_token = len(pred_token_ids)\n        if keep_space:\n            response = \" \".join(pred_tokens)\n        else:\n            response = \"\".join(pred_tokens)\n\n        in_turn_repetition = get_in_turn_repetition(pred_tokens, True) or get_in_turn_repetition(pred_token_ids)\n        # not ending\n        if max_dec_len is not None and num_token >= max_dec_len:\n            score -= 1e3\n        elif in_turn_repetition:\n            score -= 1e3\n\n        tmp.append([response, score])\n        if len(tmp) == num_return_sequences:\n            group.append(tmp)\n            tmp = []\n\n    results = []\n    for preds in group:\n        preds = sorted(preds, key=lambda x: -x[1])\n        results.append(preds[0][0])\n    return results\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/README.md",
    "content": "# plato2_en_base\n\n| 模型名称            |       plato2_en_base       |\n| :------------------ | :--------------------: |\n| 类别                |     文本-文本生成      |\n| 网络                |  PLATO2   |\n| 数据集              | 大规模开放域英文数据集 |\n| 是否支持Fine-tuning |           否           |\n| 模型大小            |         3.5 GB      |\n| 最新更新日期        |       2022-11-05       |\n| 数据指标            |           -            |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - PLATO2 是一个超大规模生成式对话系统模型。它承袭了 PLATO 隐变量进行回复多样化生成的特性，能够就开放域话题进行流畅深入的聊天。据公开数据，其效果超越了 Google 于 2020 年 2 月份发布的 Meena 和 Facebook AI Research 于2020 年 4 月份发布的 Blender 的效果。plato2_en_base 包含 310M 参数，可用于一键预测对话回复。由于该 Module 参数量较多，推荐使用GPU预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install plato2_en_base\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```bash\n    $ hub run plato2_en_base --input_text=\"Hello, how are you\"\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"plato2_en_base\")\n\n    test_texts = [\"Hello\",\"Hello\\thi, nice to meet you\\tnice to meet you\"]\n    results = module.generate(texts=test_texts)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(texts):\n    ```\n\n    - 预测API，输入对话上下文，输出机器回复。\n    - **参数**\n      - texts (list\\[str\\] or str): 如果不在交互模式中，texts应为list，每个元素为一次对话的上下文，上下文应包含人类和机器人的对话内容，不同角色之间的聊天用分隔符\"\\t\"进行分割；例如[[\"Hello\\thi, nice to meet you\\tnice to meet you\"]]。这个输入中包含1次对话，机器人回复了\"hi, nice to meet you\"后人类回复“nice to meet you”，现在轮到机器人回复了。如果在交互模式中，texts应为str，模型将自动构建它的上下文。\n\n  - 
```python\n    def interactive_mode(max_turn=6):\n    ```\n\n    - 进入交互模式。交互模式中，generate接口的texts将支持字符串类型。\n    - **参数**\n      - max_turn (int): 模型能记忆的对话轮次，当max_turn = 1时，模型只能记住当前对话，无法获知之前的对话内容。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线对话机器人服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m plato2_en_base -p 8866\n    ```\n\n  - 这样就完成了一个对话机器人服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    data = {'texts':[\"Hello\",\"Hello\\thi, nice to meet you\\tnice to meet you\"]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/plato2_en_base\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    ```\n\n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install plato2_en_base==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/model.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef post_process_context(token_ids, reader, merge=True):\n    \"\"\"Post-process the context sequence.\"\"\"\n    context = []\n    utt = []\n    for tok_id in token_ids[1:]:\n        if tok_id == reader.eos_id:\n            utt = reader.tokenizer.convert_ids_to_tokens(utt)\n            if merge:\n                utt = reader.tokenizer.merge_subword(utt)\n            context.append(utt)\n            utt = []\n        else:\n            utt.append(tok_id)\n    return context\n\n\ndef post_process_response(token_ids, reader, merge=True):\n    \"\"\"\n    Post-process the decoded sequence. 
Truncate from the first\n    <eos> and remove the <bos> and <eos> tokens currently.\n    \"\"\"\n    eos_pos = len(token_ids)\n    for i, tok_id in enumerate(token_ids):\n        if tok_id == reader.eos_id:\n            eos_pos = i\n            break\n    token_ids = token_ids[1:eos_pos]\n    response = reader.tokenizer.convert_ids_to_tokens(token_ids)\n    if merge:\n        response = reader.tokenizer.merge_subword(response)\n    return token_ids, response\n\n\ndef get_cross_turn_repetition(context, pred_tokens, eos_idx, is_cn=False):\n    \"\"\"Get cross-turn repetition.\"\"\"\n    if len(pred_tokens) == 0:\n        return 1.0\n    if is_cn:\n        context = [\"\".join(utt) for utt in context]\n        pred_tokens = \"\".join(pred_tokens)\n\n    pred_tri_grams = set()\n    for i in range(len(pred_tokens) - 2):\n        tri_gram = tuple(pred_tokens[i:i + 3])\n        pred_tri_grams.add(tri_gram)\n    for utt in context:\n        for i in range(len(utt) - 2):\n            tri_gram = tuple(utt[i:i + 3])\n            if tri_gram in pred_tri_grams:\n                return 1.0\n    return 0.0\n\n\ndef get_in_turn_repetition(pred, is_cn=False):\n    \"\"\"Get in-turn repetition.\"\"\"\n    if len(pred) == 0:\n        return 1.0\n    if isinstance(pred[0], str):\n        pred = [tok.lower() for tok in pred]\n        if is_cn:\n            pred = \"\".join(pred)\n    tri_grams = set()\n    for i in range(len(pred) - 2):\n        tri_gram = tuple(pred[i:i + 3])\n        if tri_gram in tri_grams:\n            return 1.0\n        tri_grams.add(tri_gram)\n    return 0.0\n\n\nclass Plato2EncoderLayer(nn.Layer):\n\n    def __init__(self, n_head, hidden_size, attn_dropout, act_dropout):\n        super(Plato2EncoderLayer, self).__init__()\n\n        self.self_attn = nn.MultiHeadAttention(hidden_size, n_head, attn_dropout)\n        self.pre_norm_layer = nn.LayerNorm(hidden_size)\n        self.post_norm_layer = nn.LayerNorm(hidden_size)\n        self.fc1 = nn.Linear(hidden_size, 
hidden_size * 4)\n        self.fc2 = nn.Linear(hidden_size * 4, hidden_size)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n        self.gelu_layer = nn.GELU()\n\n    def forward(self, x, attn_mask, cache):\n        query = self.pre_norm_layer(x)\n        attn_output, new_cache = self.self_attn(query, None, None, attn_mask, cache)\n        attn_output = self.dropout_layer(attn_output)\n        attn_output = attn_output + x\n        ffd_input = self.post_norm_layer(attn_output)\n\n        ffd_output = self.fc1(ffd_input)\n        ffd_output = self.gelu_layer(ffd_output)\n        ffd_output = self.dropout_layer(ffd_output)\n\n        ffd_output = self.fc2(ffd_output)\n        ffd_output = self.dropout_layer(ffd_output)\n        out = ffd_output + attn_output\n\n        return out, new_cache\n\n    def gen_cache(self, key):\n        return self.self_attn.gen_cache(key)\n\n\nclass Plato2Encoder(nn.Layer):\n\n    def __init__(self, vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size, attn_dropout,\n                 act_dropout):\n        super(Plato2Encoder, self).__init__()\n\n        self.n_head = n_head\n\n        self.word_embedding_layer = nn.Embedding(vocab_size, hidden_size)\n        self.sent_embedding_layer = nn.Embedding(type_size, hidden_size)\n        self.pos_embedding_layer = nn.Embedding(max_position_seq_len, hidden_size)\n\n        self.encoder_layers = []\n        for i in range(num_layers):\n            encoder_layer = Plato2EncoderLayer(n_head, hidden_size, attn_dropout, act_dropout)\n            self.encoder_layers.append(encoder_layer)\n            self.add_sublayer('layers.' 
+ str(i), encoder_layer)\n        self.post_encoder_layer_norm = nn.LayerNorm(hidden_size)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n\n    def forward(self, caches, token_ids, type_ids, pos_ids, generation_mask, aux_emb=None):\n        out, self_attn_mask = self.gen_input(token_ids, type_ids, pos_ids, generation_mask, aux_emb)\n\n        new_caches = []\n        for i, encoder_layer in enumerate(self.encoder_layers):\n            out, new_cache = encoder_layer(out, self_attn_mask, caches[i])\n            new_caches.append(new_cache)\n\n        enc_output = self.post_encoder_layer_norm(out)\n        return enc_output, new_caches\n\n    def gen_input(self, token_ids, type_ids, pos_ids, input_mask, aux_emb=None):\n        token_emb_out = self.word_embedding_layer(token_ids)\n        type_emb_out = self.sent_embedding_layer(type_ids)\n        pos_emb_out = self.pos_embedding_layer(pos_ids)\n        emb_out = token_emb_out + type_emb_out + pos_emb_out\n\n        # auxiliary memory embeddings\n        if aux_emb is not None:\n            emb_out = paddle.concat([aux_emb, emb_out], axis=1)\n\n        emb_out = self.dropout_layer(emb_out)\n\n        # generate n-head self-attention mask\n        self_attn_mask = input_mask\n        self_attn_mask = paddle.scale(x=self_attn_mask, scale=1e4, bias=-1.0, bias_after_scale=False)\n        n_head_self_attn_mask = paddle.stack(x=[self_attn_mask] * self.n_head, axis=1)\n        n_head_self_attn_mask.stop_gradient = True\n\n        return emb_out, n_head_self_attn_mask\n\n    def gen_caches(self, key):\n        caches = [encoder_layer.gen_cache(key) for encoder_layer in self.encoder_layers]\n        return caches\n\n\nclass NSP(nn.Layer):\n\n    def __init__(self, vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size, attn_dropout,\n                 act_dropout):\n        super(NSP, self).__init__()\n\n        self.n_head = n_head\n        self.hidden_size = hidden_size\n\n        
self.word_embedding_layer = nn.Embedding(vocab_size, hidden_size)\n        self.sent_embedding_layer = nn.Embedding(type_size, hidden_size)\n        self.pos_embedding_layer = nn.Embedding(max_position_seq_len, hidden_size)\n\n        encoder_layer = nn.TransformerEncoderLayer(hidden_size, n_head, hidden_size * 4, act_dropout, 'gelu',\n                                                   attn_dropout, act_dropout, True)\n        encoder_norm = nn.LayerNorm(hidden_size)\n        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers, encoder_norm)\n        self.fc1 = nn.Linear(hidden_size, hidden_size)\n        self.fc2 = nn.Linear(hidden_size, 2)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n        self.tanh_layer = nn.Tanh()\n        self.softmax = nn.Softmax()\n\n    def forward(self, inputs):\n        token_ids = inputs['token_ids']\n        type_ids = inputs['type_ids']\n        pos_ids = inputs['pos_ids']\n        attention_mask = inputs['attention_mask']\n        label_pos = inputs[\"label_pos\"]\n\n        out, self_attn_mask = self.gen_input(token_ids, type_ids, pos_ids, attention_mask)\n        # [-1, seq_len, hidden_size]\n        enc_out = self.encoder(out, self_attn_mask)\n\n        enc_out = paddle.reshape(enc_out, [-1, self.hidden_size])\n        label_pos = paddle.cast(label_pos, 'int64')\n        out = paddle.gather(enc_out, label_pos)\n        pooled_out = self.fc1(out)\n        pooled_out = self.tanh_layer(pooled_out)\n\n        # [-1, 2]\n        logits = self.fc2(pooled_out)\n        probs = self.softmax(logits)\n\n        return probs\n\n    def gen_input(self, token_ids, type_ids, pos_ids, input_mask, aux_emb=None):\n        token_emb_out = self.word_embedding_layer(token_ids)\n        type_emb_out = self.sent_embedding_layer(type_ids)\n        pos_emb_out = self.pos_embedding_layer(pos_ids)\n        emb_out = token_emb_out + type_emb_out + pos_emb_out\n\n        # auxiliary memory embeddings\n        if aux_emb is not None:\n            emb_out = paddle.concat([aux_emb, emb_out], axis=1)\n\n        emb_out = self.dropout_layer(emb_out)\n\n        # generate n-head self-attention mask\n        self_attn_mask = input_mask\n        self_attn_mask = paddle.scale(x=self_attn_mask, scale=1e4, bias=-1.0, bias_after_scale=False)\n        n_head_self_attn_mask = paddle.stack(x=[self_attn_mask] * self.n_head, axis=1)\n        n_head_self_attn_mask.stop_gradient = True\n\n        return emb_out, n_head_self_attn_mask\n\n\nclass Plato2InferModel(nn.Layer):\n\n    def __init__(self,\n                 nsp_reader,\n                 num_layers,\n                 n_head,\n                 hidden_size,\n                 vocab_size=8001,\n                 type_size=2,\n                 latent_type_size=20,\n                 max_position_seq_len=256,\n                 act_dropout=0.1,\n                 attn_dropout=0.1,\n                 max_dec_len=64,\n                 min_dec_len=1,\n                 topk=10):\n        super(Plato2InferModel, self).__init__()\n\n        self.nsp_reader = nsp_reader\n        self.num_layers = num_layers\n        self.latent_type_size = latent_type_size\n        self.max_dec_len = max_dec_len\n        self.min_dec_len = min_dec_len\n        self.topk = topk\n        self.unk_id = 0\n        self.bos_id = 1\n        self.eos_id = 2\n        self.mask_id = 8000\n        self.after_eos = paddle.ones([vocab_size]) * -1e9\n        self.after_eos[self.eos_id] = 0\n        self.is_cn = False\n        self.batch_size = 1\n\n        self.latent_weight = paddle.create_parameter([hidden_size, latent_type_size], 'float32')\n\n        self.plato2_encoder = Plato2Encoder(vocab_size, type_size, max_position_seq_len, num_layers, n_head,\n                                            hidden_size, attn_dropout, act_dropout)\n\n        self.logits_fc_layer = nn.Linear(hidden_size, hidden_size)\n        self.logits_layer_norm = nn.LayerNorm(hidden_size)\n        self.logits_bias = 
paddle.create_parameter([vocab_size], 'float32', is_bias=True)\n\n        self.nsp_predictor = NSP(vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size,\n                                 attn_dropout, act_dropout)\n\n        self.gelu_layer = nn.GELU()\n        self.softmax = nn.Softmax()\n\n    @paddle.no_grad()\n    def forward(self, inputs):\n        token_ids = inputs['token_ids']\n        type_ids = inputs['type_ids']\n        pos_ids = inputs['pos_ids']\n        generation_mask = inputs['generation_mask']\n        latent_id = inputs['latent_id']\n        data_id = inputs['data_id']\n\n        # [-1, 1, latent_type_size]\n        latent_id = F.one_hot(latent_id, self.latent_type_size)\n        # [-1, 1, hidden_size]\n        latent_emb = paddle.matmul(latent_id, self.latent_weight, transpose_y=True)\n\n        caches = self.plato2_encoder.gen_caches(token_ids)\n\n        # [-1, seq_len + 1, hidden_size]\n        enc_out, new_caches = self.plato2_encoder(caches, token_ids, type_ids, pos_ids, generation_mask, latent_emb)\n\n        pred_ids = self.decode(inputs, new_caches)\n\n        nsp_inputs = self.gen_nsp_input(token_ids, pred_ids)\n        # [-1, 2]\n        probs = self.nsp_predictor(nsp_inputs)\n\n        return self.get_results(data_id, token_ids, pred_ids, probs)\n\n    def decode(self, inputs, caches):\n        tgt_ids = inputs['tgt_ids']\n        tgt_pos = inputs['tgt_pos']\n        tgt_generation_mask = inputs['tgt_generation_mask']\n        predictions = tgt_ids\n\n        # TODO\n        step = 0\n        while step < self.max_dec_len:\n            # [-1, 1]\n            append_mask = paddle.cast(tgt_ids != self.eos_id, dtype=tgt_generation_mask.dtype)\n            tgt_generation_mask = paddle.concat([tgt_generation_mask, paddle.unsqueeze(append_mask, 1)], axis=-1)\n            tgt_sent = paddle.ones([tgt_generation_mask.shape[0], 1], dtype=tgt_ids.dtype)\n\n            # [-1, 1, hidden_size]\n            out, caches = 
self.plato2_encoder(caches, tgt_ids, tgt_sent, tgt_pos, tgt_generation_mask)\n            out = paddle.squeeze(out, axis=1)\n\n            # [-1, hidden_size]\n            trans = self.logits_fc_layer(out)\n            trans = self.gelu_layer(trans)\n            trans = self.logits_layer_norm(trans)\n\n            # [-1, vocab_size]\n            logits = paddle.matmul(trans, self.plato2_encoder.word_embedding_layer.weight,\n                                   transpose_y=True) + self.logits_bias\n            logits[:, self.unk_id] = -1e9\n            logits[:, self.bos_id] = -1e9\n            logits[:, self.mask_id] = -1e9\n            if step < self.min_dec_len:\n                logits[:, self.eos_id] = -1e9\n            logits = logits * append_mask + (1 - append_mask) * self.after_eos\n            probs = self.softmax(logits)\n\n            # [-1, topk]\n            topk_probs, _ = paddle.topk(probs, k=self.topk)\n            mask = paddle.cast(probs >= topk_probs[:, -1:], 'float32')\n            sums = paddle.sum(topk_probs, axis=-1, keepdim=True)\n            new_probs = probs * mask / sums\n            # [-1, 1]\n            sampling_ids = paddle.multinomial(new_probs)\n\n            step = step + 1\n            tgt_ids = sampling_ids\n            tgt_pos = tgt_pos + 1\n            predictions = paddle.concat([predictions, tgt_ids], axis=1)\n        return predictions\n\n    def gen_nsp_input(self, token_ids, pred_ids):\n        token_ids = token_ids.numpy()\n        pred_ids = pred_ids.numpy()\n\n        def __reader__():\n            headers = [\"src\", \"tgt\", \"data_id\"]\n\n            Example = namedtuple(\"Example\", headers)\n\n            for i, (raw, pred) in enumerate(zip(token_ids, pred_ids)):\n                context = post_process_context(raw, self.nsp_reader, merge=False)\n                _, response = post_process_response(pred, self.nsp_reader, merge=False)\n                context_tokenized_input = \" [SEP] \".join(\" \".join(utt) for utt in 
context)\n                response_tokenized_input = \" \".join(response)\n                example = Example(src=context_tokenized_input, tgt=response_tokenized_input, data_id=i)\n                data = self.nsp_reader._convert_example_to_record(example, is_infer=True)\n                yield data\n            return\n\n        generator = self.nsp_reader.data_generator(\n            reader=__reader__,\n            is_infer=True,\n            phase=\"test\",\n        )\n        inputs = next(generator())\n\n        #print('\\nnsp_inputs:')\n        for key in inputs:\n            inputs[key] = paddle.to_tensor(inputs[key])\n            if key in ['token_ids', 'type_ids', 'pos_ids']:\n                inputs[key] = paddle.squeeze(inputs[key], axis=-1)\n            #print(key, inputs[key].shape)\n            #print(inputs[key])\n        return inputs\n\n    def get_results(self, data_id, token_ids, pred_ids, probs):\n        data_id = data_id.numpy()\n        token_ids = token_ids.numpy()\n        pred_ids = pred_ids.numpy()\n        probs = probs.numpy()\n\n        infos = []\n        for raw, pred, prob in zip(token_ids, pred_ids, probs):\n            tokens = post_process_context(raw, self.nsp_reader)\n            pred_token_ids, pred_tokens = post_process_response(pred, self.nsp_reader)\n            info = {}\n            info['response'] = ' '.join(pred_tokens)\n            cross_turn_repetition = get_cross_turn_repetition(tokens, pred_tokens, self.nsp_reader.eos_id, self.is_cn)\n            in_turn_repetition = max(get_in_turn_repetition(pred_tokens, self.is_cn),\n                                     get_in_turn_repetition(pred_token_ids))\n\n            info['score'] = float(prob[1])\n            if len(pred_token_ids) >= self.max_dec_len:\n                info['score'] -= 1e3\n            elif cross_turn_repetition > 0:\n                info['score'] -= 1e3\n            elif in_turn_repetition > 0:\n                info['score'] -= 1e3\n            
infos.append(info)\n\n        results = []\n        pre_idx = 0\n        sample = []\n        for idx, info in zip(data_id, infos):\n            if idx != pre_idx:\n                sample = sorted(sample, key=lambda info: -info[\"score\"])\n                result = sample[0]\n                result['data_id'] = pre_idx\n                results.append(result)\n                sample = []\n                pre_idx = idx\n            sample.append(info)\n        if sample:\n            sample = sorted(sample, key=lambda info: -info[\"score\"])\n            result = sample[0]\n            result['data_id'] = pre_idx\n            results.append(result)\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/module.py",
"content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport contextlib\nimport os\nimport sys\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\n\nimport paddlehub as hub\nfrom .model import Plato2InferModel\nfrom .readers.nsp_reader import NSPReader\nfrom .readers.plato_reader import PlatoReader\nfrom .utils import gen_inputs\nfrom .utils.args import parse_args\nfrom .utils.args import str2bool\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.module.nlp_module import DataFormatError\n\n\n@moduleinfo(\n    name=\"plato2_en_base\",\n    version=\"1.1.0\",\n    summary=\n    \"A novel pre-training model for dialogue generation, incorporating latent discrete variables for one-to-many relationship modeling.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass Plato2(nn.Layer, hub.NLPPredictionModule):\n\n    def __init__(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        super(Plato2, self).__init__()\n        args = self.setup_args()\n\n        if args.num_layers == 24:\n            n_head = 16\n            hidden_size = 1024\n        elif args.num_layers == 32:\n            n_head = 32\n            hidden_size = 2048\n        else:\n  
          raise ValueError('The pre-trained model only supports 24 or 32 layers, '\n                             'but received num_layers=%d.' % args.num_layers)\n\n        self.plato_reader = PlatoReader(args)\n        nsp_reader = NSPReader(args)\n        self.model = Plato2InferModel(nsp_reader, args.num_layers, n_head, hidden_size)\n        state_dict = paddle.load(args.init_from_ckpt)\n        self.model.set_state_dict(state_dict)\n        self.model.eval()\n        self.Example = namedtuple(\"Example\", [\"src\", \"data_id\"])\n        self.latent_type_size = args.latent_type_size\n        self._interactive_mode = False\n\n    def setup_args(self):\n        \"\"\"\n        Setup arguments.\n        \"\"\"\n        ckpt_path = os.path.join(self.directory, 'assets', '24L.pdparams')\n        vocab_path = os.path.join(self.directory, 'assets', 'vocab.txt')\n        spm_model_file = os.path.join(self.directory, 'assets', 'spm.model')\n\n        # ArgumentParser.parse_args reads sys.argv[1:], dropping the first element, so a placeholder occupies sys.argv[0]\n        sys.argv = [\n            \"--empty\",\n            \"--spm_model_file\",\n            \"%s\" % spm_model_file,\n            \"--vocab_path\",\n            \"%s\" % vocab_path,\n        ]\n\n        parser = argparse.ArgumentParser()\n        group = parser.add_argument_group(\"Model\")\n        group.add_argument(\"--init_from_ckpt\", type=str, default=ckpt_path)\n        group.add_argument(\"--vocab_size\", type=int, default=8001)\n        group.add_argument(\"--latent_type_size\", type=int, default=20)\n        group.add_argument(\"--num_layers\", type=int, default=24)\n\n        group = parser.add_argument_group(\"Task\")\n        group.add_argument(\"--is_cn\", type=str2bool, default=False)\n\n        NSPReader.add_cmdline_args(parser)\n\n        args = parse_args(parser)\n        args.batch_size *= args.latent_type_size\n\n        return args\n\n    @serving\n    @paddle.no_grad()\n    def 
generate(self, texts):\n        \"\"\"\n        Get the robot responses of the input texts.\n\n        Args:\n             texts(list or str): If not in the interactive mode, texts should be a list in which every element is the chat context separated with '\\t'.\n                                 Otherwise, texts should be one sentence. The module can get the context automatically.\n\n        Returns:\n             results(list): the robot responses.\n        \"\"\"\n        if not texts:\n            return []\n        if self._interactive_mode:\n            if isinstance(texts, str):\n                self.context.append(texts.strip())\n                texts = [\" [SEP] \".join(self.context[-self.max_turn:])]\n            else:\n                raise ValueError(\"In the interactive mode, the input data should be a string.\")\n        elif not isinstance(texts, list):\n            raise ValueError(\"If not in the interactive mode, the input data should be a list.\")\n\n        bot_responses = []\n        for i, text in enumerate(texts):\n            example = self.Example(src=text.replace(\"\\t\", \" [SEP] \"), data_id=0)\n            record = self.plato_reader._convert_example_to_record(example, is_infer=True)\n            data = self.plato_reader._pad_batch_records([record], is_infer=True)\n            inputs = gen_inputs(data, self.latent_type_size)\n            inputs['tgt_ids'] = inputs['tgt_ids'].astype('int64')\n            pred = self.model(inputs)[0]  # batch_size is 1\n            bot_response = pred[\"response\"]  # ignore data_id and score\n            bot_responses.append(bot_response)\n\n        if self._interactive_mode:\n            self.context.append(bot_responses[0].strip())\n        return bot_responses\n\n    @contextlib.contextmanager\n    def interactive_mode(self, max_turn=6):\n        \"\"\"\n        Enter the interactive mode.\n\n        Args:\n            max_turn(int): the max dialogue turns. 
max_turn = 1 means the robot can only remember the last utterance you have said.\n        \"\"\"\n        self._interactive_mode = True\n        self.max_turn = max_turn\n        self.context = []\n        yield\n        self.context = []\n        self._interactive_mode = False\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description='Run the %s module.' % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/readers/dialog_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Dialogue Reader.\"\"\"\n\nimport csv\nfrom collections import namedtuple\nfrom contextlib import contextmanager\nimport gzip\n\nimport numpy as np\n\nfrom ..utils import pad_batch_data\nfrom ..utils.args import str2bool\nfrom ..utils.masking import mask\nfrom ..utils import tokenization\n\n\nclass DialogReader(object):\n    \"\"\"The implement of DialogReader.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline argurments.\"\"\"\n        group = parser.add_argument_group(\"Reader\")\n        group.add_argument(\"--max_src_len\", type=int, default=128)\n        group.add_argument(\"--max_tgt_len\", type=int, default=128)\n        group.add_argument(\"--truncate_first_turn\",\n                           type=str2bool,\n                           default=False)\n        group.add_argument(\"--file_format\",\n                           type=str,\n                           default=\"file\",\n                           choices=[\"file\", \"filelist\"])\n        group.add_argument(\"--data_format\",\n                           type=str,\n                           default=\"raw\",\n                           choices=[\"raw\", \"tokenized\", \"numerical\"])\n        group.add_argument(\"--in_tokens\", type=str2bool, default=False)\n        group.add_argument(\"--batch_size\", type=int, 
default=16)\n        group.add_argument(\"--continuous_position\", type=str2bool, default=True)\n        group.add_argument(\"--random_seed\", type=int, default=11)\n        group.add_argument(\"--sort_pool_size\", type=int, default=2**16)\n\n        group = parser.add_argument_group(\"Tokenizer\")\n        group.add_argument(\"--tokenizer\",\n                           type=str,\n                           default=\"SentencePieceTokenizer\")\n        args, _ = parser.parse_known_args()\n        tokenizer_cls = getattr(tokenization, args.tokenizer)\n        tokenizer_cls.add_cmdline_args(parser)\n        return group\n\n    def __init__(self, args):\n        tokenizer_cls = getattr(tokenization, args.tokenizer)\n        self.tokenizer = tokenizer_cls(args)\n        self.vocab = self.tokenizer.vocab\n        self.pad_id = args.pad_id = self.vocab[\"[PAD]\"]\n        self.bos_id = args.bos_id = self.vocab[\"[CLS]\"]\n        self.eos_id = args.eos_id = self.vocab[\"[SEP]\"]\n        self.unk_id = args.unk_id = self.vocab[\"[UNK]\"]\n        self.mask_id = args.mask_id = self.vocab[\"[MASK]\"]\n        self.vocab_size = args.get(\"vocab_size\", 0)\n        self.max_src_len = args.max_src_len\n        self.max_tgt_len = args.max_tgt_len\n        self.truncate_first_turn = args.truncate_first_turn\n        self.file_format = args.file_format\n        self.data_format = args.data_format\n        self.in_tokens = args.in_tokens\n        self.batch_size = args.batch_size\n        self.continuous_position = args.continuous_position\n        self.sort_pool_size = args.sort_pool_size\n\n        # random_seed must be set for data slicing when using multi-gpu\n        self.global_rng = np.random.RandomState(args.random_seed)\n\n        # training progress\n        self.current_example = 0\n        self.current_epoch = 0\n        self.num_examples = 0\n\n        # model related\n\n        self.fields = [\"token_ids\", \"type_ids\", \"pos_ids\"]\n        self.num_numerical_fields 
= len(self.fields)\n        self.fields += [\"tgt_start_idx\", \"data_id\"]\n        self.sort_key = lambda record: [len(record.token_ids)]\n\n        self.Record = namedtuple(\"Record\",\n                                 self.fields,\n                                 defaults=(None, ) * len(self.fields))\n\n        self.features = {}\n        return\n\n    def get_train_progress(self):\n        \"\"\"Gets progress for training phase.\"\"\"\n        return self.current_epoch, self.current_file_index, self.total_file\n\n    def _convert_example_to_record(self, example, is_infer):\n        # process src\n        src_token_ids = []\n        src_pos_ids = []\n\n        # tokenize src\n        s_token_ids_list = []\n        for s in example.src.split(\"[SEP]\"):\n            s = tokenization.convert_to_unicode(s).strip()\n\n            if self.data_format == \"tokenized\":\n                s_tokens = s.split(\" \")\n            else:\n                s_tokens = self.tokenizer.tokenize(s)\n\n            s_token_ids = self.tokenizer.convert_tokens_to_ids(s_tokens) + [\n                self.eos_id\n            ]\n            s_token_ids_list.append(s_token_ids)\n\n        # trim src\n        idx = len(s_token_ids_list) - 1\n        total_token_num = 1\n        while idx >= 0:\n            total_token_num += len(s_token_ids_list[idx])\n            if total_token_num > self.max_src_len:\n                if self.truncate_first_turn and idx == 0:\n                    truncated_ids = s_token_ids_list[idx][:self.max_src_len -\n                                                          total_token_num]\n                    if len(truncated_ids) > 1:\n                        s_token_ids_list[idx] = truncated_ids[:-1] + [\n                            self.eos_id\n                        ]\n                        idx -= 1\n                break\n            idx -= 1\n\n        for i, s_token_ids in enumerate(s_token_ids_list[idx + 1:], idx + 1):\n            src_token_ids += 
s_token_ids\n            src_pos_ids += list(range(1, len(s_token_ids) + 1))\n\n        src_token_ids = [self.bos_id] + src_token_ids\n        src_type_ids = [0] * len(src_token_ids)\n        src_pos_ids = [0] + src_pos_ids\n        assert len(src_token_ids) == len(src_type_ids) == len(src_pos_ids), \\\n            \"not len(src_token_ids) == len(src_type_ids) == len(src_pos_ids)\"\n\n        token_ids = src_token_ids\n        type_ids = src_type_ids\n        pos_ids = src_pos_ids\n        tgt_start_idx = len(token_ids)\n\n        if not is_infer:\n            # process tgt\n            # tokenize tgt\n            tgt = tokenization.convert_to_unicode(example.tgt).strip()\n            if self.data_format == \"tokenized\":\n                tgt_tokens = tgt.split(\" \")\n            else:\n                tgt_tokens = self.tokenizer.tokenize(tgt)\n\n            tgt_token_ids = self.tokenizer.convert_tokens_to_ids(tgt_tokens)\n            tgt_token_ids.append(self.eos_id)\n\n            # trim tgt\n            if len(tgt_token_ids) > self.max_tgt_len - 1:\n                tgt_token_ids = tgt_token_ids[:self.max_tgt_len - 1]\n\n            tgt_token_ids = [self.bos_id] + tgt_token_ids\n            tgt_type_ids = [1] * len(tgt_token_ids)\n            tgt_pos_ids = list(range(1, len(tgt_token_ids) + 1))\n            assert len(tgt_token_ids) == len(tgt_type_ids) == len(tgt_pos_ids), \\\n                \"not len(tgt_token_ids) == len(tgt_type_ids) == len(tgt_pos_ids)\"\n\n            token_ids += tgt_token_ids\n            type_ids += tgt_type_ids\n            pos_ids += tgt_pos_ids\n\n        assert len(token_ids) == len(type_ids) == len(pos_ids), \\\n            \"not len(token_ids) == len(type_ids) == len(pos_ids)\"\n\n        if self.continuous_position:\n            src_pos_ids = list(range(len(src_token_ids)))\n            if not is_infer:\n                tgt_pos_ids = list(range(len(tgt_token_ids)))\n            pos_ids = list(range(len(token_ids)))\n\n        
field_values = {\n            \"token_ids\": src_token_ids,\n            \"type_ids\": src_type_ids,\n            \"pos_ids\": src_pos_ids\n        }\n        field_values[\"tgt_start_idx\"] = tgt_start_idx\n        field_values[\"data_id\"] = example.data_id\n\n        record = self.Record(**field_values)\n        return record\n\n    def _read_tsv(self, fp, phase, is_infer, delimiter=\"\\t\", quotechar=None):\n        \"\"\"Reads a tab separated value file.\"\"\"\n        csv.field_size_limit(2**20)\n        reader = csv.reader(fp, delimiter=delimiter, quotechar=quotechar)\n        headers = next(reader)\n        headers.append(\"data_id\")\n        Example = namedtuple(\"Example\", headers)\n\n        for i, line in enumerate(reader):\n            example = Example(*line, data_id=i)\n            if is_infer or phase.endswith(\"test\"):\n                self.features[phase][i] = example\n            record = self._convert_example_to_record(example, is_infer)\n            yield record\n\n    def _read_numerical_file(self, fp, delimiter=\";\"):\n        for i, line in enumerate(fp):\n            cols = tokenization.convert_to_unicode(line).strip().split(\n                delimiter)\n            cols = list(map(lambda x: list(map(int, x.split(\" \"))), cols))\n            if len(cols) > self.num_numerical_fields:\n                cols = cols[:self.num_numerical_fields]\n            tgt_start_idx = cols[0].index(self.bos_id, 1)\n            record = self.Record(*cols, tgt_start_idx=tgt_start_idx, data_id=i)\n            yield record\n\n    def _read_file(self, input_file, phase, is_infer):\n\n        def __wrapper__():\n            with open_file(input_file) as fp:\n                if self.data_format == \"numerical\":\n                    records = self._read_numerical_file(fp)\n                else:\n                    records = self._read_tsv(fp, phase, is_infer)\n                for record in records:\n                    yield record\n\n        return 
__wrapper__\n\n    def _read_files(self, filelist, phase, is_infer, shuffle_files):\n        input_files = open(filelist).readlines()\n\n        def __wrapper__():\n            if shuffle_files:\n                self.global_rng.shuffle(input_files)\n\n            if phase == \"train\":\n                self.total_file = len(input_files)\n            for file_index, input_file in enumerate(input_files, 1):\n                if phase == \"train\":\n                    self.current_file_index = file_index\n                    self.current_file = input_file\n                file_reader = self._read_file(input_file.strip(), phase,\n                                              is_infer)\n                for record in file_reader():\n                    yield record\n\n        return __wrapper__\n\n    def _batch_reader(self,\n                      reader,\n                      phase=None,\n                      is_infer=False,\n                      sort_pool_size=2**16):\n        \"\"\"Construct a batch reader.\"\"\"\n\n        def update_max_lens(max_lens, record):\n            \"\"\"Update max_lens.\"\"\"\n            if max_lens is None:\n                return self.sort_key(record)\n            else:\n                return [\n                    max(max_len, l)\n                    for max_len, l in zip(max_lens, self.sort_key(record))\n                ]\n\n        def get_batch(reader):\n            \"\"\"Generate batches from reader.\"\"\"\n            batch, max_lens = [], None\n            for record in reader():\n                if record is None:\n                    yield batch\n                    batch, max_lens = [], None\n                    continue\n\n                self.current_example += 1\n                max_lens = update_max_lens(max_lens, record)\n                if self.in_tokens:\n                    to_append = (len(batch) +\n                                 1) * sum(max_lens) <= self.batch_size\n                else:\n                    
to_append = len(batch) < self.batch_size\n                if to_append:\n                    batch.append(record)\n                else:\n                    yield batch\n                    batch, max_lens = [record], self.sort_key(record)\n\n            if len(batch) > 0:\n                yield batch\n\n        def get_sorted_batch(pool):\n            \"\"\"Generate sorted batches from pool.\"\"\"\n            pool = sorted(pool, key=self.sort_key)\n            batches = []\n            batch, max_lens = [], None\n            for record in pool:\n                self.current_example += 1\n                max_lens = update_max_lens(max_lens, record)\n                if self.in_tokens:\n                    to_append = (len(batch) +\n                                 1) * sum(max_lens) <= self.batch_size\n                else:\n                    to_append = len(batch) < self.batch_size\n                if to_append:\n                    batch.append(record)\n                else:\n                    batches.append(batch)\n                    batch, max_lens = [record], self.sort_key(record)\n\n            if len(batch) > 0:\n                batches.append(batch)\n            self.global_rng.shuffle(batches)\n\n            for batch in batches:\n                yield batch\n\n        def __wrapper__():\n            if sort_pool_size > 0:\n                pool = []\n                for record in reader():\n                    pool.append(record)\n                    if len(pool) == sort_pool_size:\n                        for batch in get_sorted_batch(pool):\n                            yield batch\n                        pool = []\n                if len(pool) > 0:\n                    for batch in get_sorted_batch(pool):\n                        yield batch\n            else:\n                for batch in get_batch(reader):\n                    yield batch\n\n        return __wrapper__\n\n    def _distributed_batch_reader(self,\n                                  
batch_reader,\n                                  num_part,\n                                  part_id,\n                                  is_test=False):\n\n        def __wrapper__():\n            batches = []\n            for batch in batch_reader():\n                batches.append(batch)\n                if len(batches) == num_part:\n                    yield batches[part_id]\n                    batches = []\n            if is_test and 0 <= part_id < len(batches):\n                yield batches[part_id]\n            return\n\n        return __wrapper__\n\n    def data_generator(self,\n                       input_file=None,\n                       reader=None,\n                       num_epochs=1,\n                       num_part=1,\n                       part_id=0,\n                       phase=None,\n                       is_infer=False):\n        \"\"\"Data generator.\"\"\"\n\n        def __wrapper__():\n            if is_infer or phase.endswith(\"test\"):\n                self.features[phase] = {}\n\n            nonlocal reader\n            if reader is None:\n                if self.file_format == \"filelist\":\n                    reader = self._read_files(input_file, phase, is_infer,\n                                              not phase.endswith(\"test\"))\n                else:\n                    if phase == \"train\":\n                        self.total_file = 1\n                        self.current_file_index = 1\n                        self.current_file = input_file\n                    reader = self._read_file(input_file, phase, is_infer)\n\n            batch_reader = self._batch_reader(\n                reader,\n                phase,\n                is_infer,\n                sort_pool_size=self.sort_pool_size if not is_infer else 0)\n            if phase == \"train\":\n                batch_reader = self._distributed_batch_reader(\n                    batch_reader, num_part, part_id)\n            elif phase.startswith(\"distributed\"):\n  
              batch_reader = self._distributed_batch_reader(batch_reader,\n                                                              num_part,\n                                                              part_id,\n                                                              is_test=True)\n\n            for epoch_index in range(num_epochs):\n                if phase == \"train\":\n                    self.current_example = 0\n                    self.current_epoch = epoch_index + 1\n                for batch in batch_reader():\n                    yield self._pad_batch_records(batch, is_infer)\n\n        return __wrapper__\n\n    def _gen_self_attn_mask(self,\n                            batch_token_ids,\n                            batch_tgt_start_idx=None,\n                            is_unidirectional=True,\n                            shift_len=0):\n        max_len = max(map(len, batch_token_ids))\n        input_mask_data = np.zeros(\n            (len(batch_token_ids), max_len + shift_len, max_len + shift_len))\n        if is_unidirectional:\n            for index, mask_data in enumerate(input_mask_data):\n                start = 0 if batch_tgt_start_idx is None else batch_tgt_start_idx[\n                    index]\n                end = len(batch_token_ids[index])\n                mask_data[:end + shift_len, :start + shift_len] = 1.0\n                # Generate the lower triangular matrix using the slice of matrix\n                b = np.tril(np.ones([end - start, end - start]), 0)\n                mask_data[start + shift_len:end + shift_len,\n                          start + shift_len:end + shift_len] = b\n        else:\n            for index, token_ids in enumerate(batch_token_ids):\n                input_mask_data[index, :len(token_ids) +\n                                shift_len, :len(token_ids) + shift_len] = 1.0\n        return input_mask_data.astype(\"float32\")\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n      
  Padding batch records and construct model's inputs.\n        \"\"\"\n        batch_size = len(batch_records)\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n        batch[\"token_ids\"] = pad_batch_data(batch_token_ids, pad_id=self.pad_id)\n        batch[\"type_ids\"] = pad_batch_data(batch_type_ids, pad_id=self.pad_id)\n        batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n        batch[\"generation_mask\"] = self._gen_self_attn_mask(\n            batch_token_ids, batch_tgt_start_idx=batch_tgt_start_idx)\n\n        if is_infer:\n            tgt_ids = np.array([[[self.bos_id]]] * len(batch_token_ids),\n                               dtype=\"int64\")\n            if self.continuous_position:\n                tgt_pos = np.array(batch_tgt_start_idx, dtype=\"int64\")\n            else:\n                tgt_pos = np.zeros_like(batch_tgt_start_idx, dtype=\"int64\")\n            tgt_pos = tgt_pos.reshape(-1, 1, 1)\n            batch[\"init_score\"] = np.zeros_like(\n                tgt_ids, dtype=\"float32\").reshape(-1, 1).tolist()\n            batch[\"tgt_ids\"] = tgt_ids.tolist()\n            batch[\"tgt_pos\"] = tgt_pos.tolist()\n\n            batch[\"tgt_generation_mask\"] = batch[\n                \"generation_mask\"][:, 0:1, :].astype(\"float32\")\n        else:\n            batch[\"tgt_label\"], batch[\"tgt_pos\"] = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                sent_b_starts=batch_tgt_start_idx,\n                is_unidirectional=True)\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            
[-1, 1])\n        return batch\n\n\n@contextmanager\ndef open_file(filename):\n    \"\"\"Open a plain or gzip file, closing it even if the consumer raises.\"\"\"\n    if filename.endswith(\".gz\"):\n        fp = gzip.open(filename, \"rt\")\n    else:\n        fp = open(filename)\n    try:\n        yield fp\n    finally:\n        fp.close()\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/readers/nsp_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"NSP Reader.\"\"\"\n\nfrom collections import namedtuple\n\nimport numpy as np\n\nfrom .dialog_reader import DialogReader\nfrom ..utils import pad_batch_data\nfrom ..utils.args import str2bool\nfrom ..utils.masking import mask\n\n\nclass NSPReader(DialogReader):\n    \"\"\"NSP Reader.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline arguments.\"\"\"\n        group = DialogReader.add_cmdline_args(parser)\n        group.add_argument(\"--attention_style\",\n                           type=str,\n                           default=\"bidirectional\",\n                           choices=[\"bidirectional\", \"unidirectional\"])\n        group.add_argument(\"--mix_negative_sample\",\n                           type=str2bool,\n                           default=False)\n        return group\n\n    def __init__(self, args):\n        super(NSPReader, self).__init__(args)\n        self.fields.append(\"label\")\n        self.Record = namedtuple(\"Record\",\n                                 self.fields,\n                                 defaults=(None, ) * len(self.fields))\n\n        self.attention_style = args.attention_style\n        self.mix_negative_sample = args.mix_negative_sample\n        return\n\n    def _convert_example_to_record(self, example, is_infer):\n        record = 
super(NSPReader,\n                       self)._convert_example_to_record(example, False)\n        if \"label\" in example._fields:\n            record = record._replace(label=int(example.label))\n        return record\n\n    def _mix_negative_sample(self, reader, neg_pool_size=2**16):\n\n        def gen_from_pool(pool):\n            num_samples = len(pool)\n            if num_samples == 1:\n                # only one sample: it is impossible to generate negative sample\n                yield pool[0]._replace(label=1)\n                return\n            self.global_rng.shuffle(pool)\n            for i in range(num_samples):\n                pool[i] = pool[i]._replace(label=1)\n                j = (i + 1) % num_samples\n                idx_i = pool[i].tgt_start_idx\n                idx_j = pool[j].tgt_start_idx\n                field_values = {}\n                field_values[\"token_ids\"] = pool[i].token_ids[:idx_i] + pool[\n                    j].token_ids[idx_j:]\n                field_values[\"type_ids\"] = pool[i].type_ids[:idx_i] + pool[\n                    j].type_ids[idx_j:]\n                field_values[\"pos_ids\"] = list(\n                    range(len(field_values[\"token_ids\"])))\n                neg_record = self.Record(**field_values,\n                                         tgt_start_idx=idx_i,\n                                         data_id=-1,\n                                         label=0)\n                pool.append(neg_record)\n                assert len(neg_record.token_ids) <= self.max_seq_len\n            self.global_rng.shuffle(pool)\n            for record in pool:\n                yield record\n\n        def __wrapper__():\n            pool = []\n            for record in reader():\n                pool.append(record)\n                if len(pool) == neg_pool_size:\n                    for record in gen_from_pool(pool):\n                        yield record\n                    pool = []\n            if len(pool) > 0:\n           
     for record in gen_from_pool(pool):\n                    yield record\n\n        return __wrapper__\n\n    def _batch_reader(self,\n                      reader,\n                      phase=None,\n                      is_infer=False,\n                      sort_pool_size=2**16):\n        if self.mix_negative_sample:\n            reader = self._mix_negative_sample(reader)\n        return super(NSPReader,\n                     self)._batch_reader(reader,\n                                         phase=phase,\n                                         is_infer=is_infer,\n                                         sort_pool_size=sort_pool_size)\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n        Padding batch records and construct model's inputs.\n        \"\"\"\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n        batch_label = [record.label for record in batch_records]\n\n        if self.attention_style == \"unidirectional\":\n            batch[\"token_ids\"] = pad_batch_data(batch_token_ids,\n                                                pad_id=self.pad_id)\n            batch[\"type_ids\"] = pad_batch_data(batch_type_ids,\n                                               pad_id=self.pad_id)\n            batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n            tgt_label, tgt_pos, label_pos = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                bos_id=self.bos_id,\n                sent_b_starts=batch_tgt_start_idx,\n                labels=batch_label,\n                is_unidirectional=True)\n            attention_mask = self._gen_self_attn_mask(batch_token_ids,\n             
                                         batch_tgt_start_idx)\n        else:\n            batch_mask_token_ids, tgt_label, tgt_pos, label_pos = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                bos_id=self.bos_id,\n                eos_id=self.eos_id,\n                mask_id=self.mask_id,\n                sent_b_starts=batch_tgt_start_idx,\n                labels=batch_label,\n                is_unidirectional=False)\n            if not is_infer:\n                batch_token_ids = batch_mask_token_ids\n            batch[\"token_ids\"] = pad_batch_data(batch_token_ids,\n                                                pad_id=self.pad_id)\n            batch[\"type_ids\"] = pad_batch_data(batch_type_ids,\n                                               pad_id=self.pad_id)\n            batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n            attention_mask = self._gen_self_attn_mask(batch_token_ids,\n                                                      is_unidirectional=False)\n\n        batch[\"attention_mask\"] = attention_mask\n        batch[\"label_pos\"] = label_pos\n\n        if not is_infer:\n            batch_label = np.array(batch_label).astype(\"int64\").reshape([-1, 1])\n            batch[\"label\"] = batch_label\n            batch[\"tgt_label\"] = tgt_label\n            batch[\"tgt_pos\"] = tgt_pos\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            [-1, 1])\n        return batch\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/readers/plato_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Plato Reader.\"\"\"\n\nimport numpy as np\n\nfrom .dialog_reader import DialogReader\nfrom ..utils import pad_batch_data\nfrom ..utils.masking import mask\n\n\nclass PlatoReader(DialogReader):\n    \"\"\"The implementation of PlatoReader.\"\"\"\n\n    def __init__(self, args):\n        super(PlatoReader, self).__init__(args)\n        self.latent_type_size = args.latent_type_size\n        self.use_bow = args.use_bow\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n        Padding batch records and construct model's inputs.\n        \"\"\"\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n\n        batch_size = len(batch_token_ids)\n\n        # padding\n        batch[\"token_ids\"] = pad_batch_data(batch_token_ids, pad_id=self.pad_id)\n        batch[\"type_ids\"] = pad_batch_data(batch_type_ids, pad_id=self.pad_id)\n        batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n\n        batch[\"generation_mask\"] = self._gen_self_attn_mask(\n            batch_token_ids,\n            
batch_tgt_start_idx=batch_tgt_start_idx,\n            is_unidirectional=True,\n            shift_len=1)\n        if not is_infer:\n            batch[\"recognition_mask\"] = self._gen_self_attn_mask(\n                batch_token_ids, is_unidirectional=False, shift_len=1)\n\n        if is_infer:\n            tgt_ids = np.array([[[self.bos_id]]] * batch_size, dtype=\"int64\")\n            if self.continuous_position:\n                tgt_pos = np.array(batch_tgt_start_idx, dtype=\"int64\")\n            else:\n                tgt_pos = np.zeros_like(batch_tgt_start_idx, dtype=\"int64\")\n            tgt_pos = tgt_pos.reshape(-1, 1, 1)\n            batch[\"init_score\"] = np.zeros_like(\n                tgt_ids, dtype=\"float32\").reshape(-1, 1).tolist()\n            batch[\"tgt_ids\"] = tgt_ids.tolist()\n            batch[\"tgt_pos\"] = tgt_pos.tolist()\n            batch[\"parent_idx\"] = np.array(range(batch_size), dtype=\"int32\")\n\n            batch[\"tgt_generation_mask\"] = batch[\n                \"generation_mask\"][:, 0:1, :].astype(\"float32\")\n        else:\n            mask_return_list = mask(batch_tokens=batch_token_ids,\n                                    vocab_size=self.vocab_size,\n                                    sent_b_starts=batch_tgt_start_idx,\n                                    is_unidirectional=True,\n                                    use_latent=True,\n                                    use_bow=self.use_bow)\n            batch[\"tgt_label\"] = mask_return_list[0]\n            batch[\"tgt_pos\"] = mask_return_list[1]\n            if self.use_bow:\n                batch[\"bow_label\"] = mask_return_list[2]\n                batch[\"bow_pos\"] = mask_return_list[3]\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            [-1, 1])\n        return batch\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/utils/__init__.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Utils.\"\"\"\n\nfrom itertools import chain\nimport numpy as np\nimport paddle\n\n\ndef repeat_array(array, times):\n    \"\"\"Repeat a numpy array or list.\"\"\"\n    if isinstance(array, list):\n        return list(chain(*([array] * times)))\n    else:\n        return np.concatenate([array] * times, axis=0)\n\n\ndef gen_inputs(inputs, latent_type_size):\n    batch_size = len(inputs[\"data_id\"])\n    new_bsz = batch_size * latent_type_size\n    inputs = {\n        name: repeat_array(array, latent_type_size)\n        for name, array in inputs.items()\n    }\n    # Add latent_id\n    inputs[\"latent_id\"] = np.array(\n        [i for i in range(latent_type_size) for _ in range(batch_size)],\n        dtype=\"int64\").reshape([-1, 1])\n\n    #print('\\nplato_inputs:')\n    for key in inputs:\n        inputs[key] = paddle.to_tensor(inputs[key])\n        if key in [\n                'token_ids', 'type_ids', 'pos_ids', 'tgt_ids', 'tgt_pos',\n                'data_id'\n        ]:\n            inputs[key] = paddle.squeeze(inputs[key], axis=-1)\n        #print(key, inputs[key].shape, inputs[key].dtype)\n    return inputs\n\n\ndef pad_batch_data(insts, pad_id=0):\n    \"\"\"Pad the instances to the max sequence length in batch. 
\"\"\"\n    max_len = max(map(len, insts))\n    inst_data = np.array(\n        [list(inst) + [pad_id] * (max_len - len(inst)) for inst in insts])\n    return inst_data.astype(\"int64\").reshape([-1, max_len, 1])\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/utils/args.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Parse argument.\"\"\"\n\nimport argparse\nimport json\n\n\ndef str2bool(v):\n    \"\"\" Support bool type for argparse. \"\"\"\n    if v.lower() in (\"yes\", \"true\", \"t\", \"y\", \"1\"):\n        return True\n    elif v.lower() in (\"no\", \"false\", \"f\", \"n\", \"0\"):\n        return False\n    else:\n        raise argparse.ArgumentTypeError(\"Unsupported value encountered.\")\n\n\nclass Args(dict):\n    \"\"\" Arguments class\n\n    Store arguments in training / infer / ... 
scripts.\n    \"\"\"\n\n    def __getattr__(self, name):\n        if name in self.keys():\n            return self[name]\n        for v in self.values():\n            if isinstance(v, Args):\n                if name in v:\n                    return v[name]\n        return None\n\n    def get(self, key, default_value=None):\n        \"\"\"Get the value of corresponding key.\"\"\"\n        if key in self.keys():\n            return self[key]\n        for v in self.values():\n            if isinstance(v, Args):\n                if key in v:\n                    return v[key]\n        return default_value\n\n    def __setattr__(self, name, value):\n        self[name] = value\n\n    def save(self, filename):\n        with open(filename, \"w\") as fp:\n            json.dump(self, fp, ensure_ascii=False, indent=4, sort_keys=False)\n\n    def load(self, filename, group_name=None):\n        if group_name is not None:\n            if group_name not in self:\n                self[group_name] = Args()\n            self[group_name].load(filename)\n            return\n        with open(filename, \"r\") as fp:\n            params_dict = json.load(fp)\n        for k, v in params_dict.items():\n            if isinstance(v, dict):\n                self[k].update(Args(v))\n            else:\n                self[k] = v\n\n\ndef parse_args(parser: argparse.ArgumentParser, allow_unknown=False) -> Args:\n    \"\"\" Parse hyper-parameters from cmdline. 
\"\"\"\n    if allow_unknown:\n        parsed, _ = parser.parse_known_args()\n    else:\n        parsed = parser.parse_args()\n    args = Args()\n    optional_args = parser._action_groups[1]\n    for action in optional_args._group_actions[1:]:\n        arg_name = action.dest\n        args[arg_name] = getattr(parsed, arg_name)\n    for group in parser._action_groups[2:]:\n        group_args = Args()\n        for action in group._group_actions:\n            arg_name = action.dest\n            group_args[arg_name] = getattr(parsed, arg_name)\n        if len(group_args) > 0:\n            if group.title in args:\n                args[group.title].update(group_args)\n            else:\n                args[group.title] = group_args\n    return args\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/utils/masking.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Reader utils.\"\"\"\n\nimport numpy as np\n\n\ndef mask(batch_tokens,\n         vocab_size,\n         bos_id=1,\n         eos_id=2,\n         mask_id=3,\n         sent_b_starts=None,\n         labels=None,\n         is_unidirectional=False,\n         use_latent=False,\n         use_bow=False):\n    \"\"\"\n    Add masks to batch_tokens and return out, mask_label, mask_pos.\n    Note: mask_pos corresponds to batch_tokens after padding.\n    \"\"\"\n    batch_tokens = np.copy(batch_tokens)\n    max_len = max(map(len, batch_tokens))\n    mask_label = []\n    mask_pos = []\n    if labels is not None:\n        label_pos = []\n\n    if is_unidirectional:\n        # unidirectional language model\n        if use_latent:\n            max_len += 1\n            shift_len = 1\n        else:\n            shift_len = 0\n        for sent_index, sent in enumerate(batch_tokens):\n            sent_b_index = sent_b_starts[\n                sent_index] if sent_b_starts is not None else 0\n            need_cal = True\n            if labels is not None:\n                label_pos.append(sent_index * max_len + len(sent) - 1 +\n                                 shift_len)\n                if labels[sent_index] == 0:\n                    need_cal = False\n            mask_label.extend(sent[sent_b_index + 1:])\n            mask_pos.extend([\n                
sent_index * max_len + i + shift_len\n                for i in range(sent_b_index,\n                               len(sent) - 1)\n            ])\n        mask_label = np.array(mask_label).astype(\"int64\").reshape([-1, 1])\n        mask_pos = np.array(mask_pos).astype(\"int64\").reshape([-1, 1])\n        return_list = [mask_label, mask_pos]\n\n        # latent related (bow label and pos)\n        if use_latent and use_bow:\n            bow_label = []\n            bow_pos = []\n            for sent_index, sent in enumerate(batch_tokens):\n                sent_b_index = sent_b_starts[\n                    sent_index] if sent_b_starts is not None else 0\n\n                def __filter__(tok_id):\n                    # TODO: exclude [EOS] from bow loss\n                    return True\n\n                bow_pos.extend([\n                    sent_index for i in range(sent_b_index + 1, len(sent))\n                    if __filter__(sent[i])\n                ])\n                bow_label.extend([\n                    sent[i] for i in range(sent_b_index + 1, len(sent))\n                    if __filter__(sent[i])\n                ])\n            bow_label = np.array(bow_label).astype(\"int64\").reshape([-1, 1])\n            bow_pos = np.array(bow_pos).astype(\"int64\").reshape([-1, 1])\n            return_list += [bow_label, bow_pos]\n    else:\n        # bidirectional mask language model\n        total_token_num = sum(map(len, batch_tokens))\n        prob_mask = np.random.rand(total_token_num)\n        # TODO: fix replace_ids, include [UNK]\n        replace_ids = np.random.randint(3,\n                                        high=vocab_size,\n                                        size=total_token_num)\n        prob_index = 0\n        for sent_index, sent in enumerate(batch_tokens):\n            # add pair label position\n            if labels is not None:\n                label_pos.append(sent_index * max_len)\n\n            # add mask label and position\n            for 
token_index, token in enumerate(sent):\n                if token == eos_id or token == bos_id:\n                    continue\n                prob = prob_mask[prob_index + token_index]\n                if prob > 0.15:\n                    continue\n                elif 0.03 < prob <= 0.15:\n                    # mask\n                    mask_label.append(sent[token_index])\n                    sent[token_index] = mask_id\n                    mask_pos.append(sent_index * max_len + token_index)\n                elif 0.015 < prob <= 0.03:\n                    # random replace\n                    mask_label.append(sent[token_index])\n                    sent[token_index] = replace_ids[prob_index + token_index]\n                    mask_pos.append(sent_index * max_len + token_index)\n                else:\n                    # keep the original token\n                    mask_label.append(sent[token_index])\n                    mask_pos.append(sent_index * max_len + token_index)\n\n            prob_index += len(sent)\n\n        mask_label = np.array(mask_label).astype(\"int64\").reshape([-1, 1])\n        mask_pos = np.array(mask_pos).astype(\"int64\").reshape([-1, 1])\n        return_list = [batch_tokens, mask_label, mask_pos]\n\n    if labels is not None:\n        label_pos = np.array(label_pos).astype(\"int64\").reshape([-1, 1])\n        assert len(labels) == len(label_pos)\n        return_list.append(label_pos)\n    return return_list\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_base/utils/tokenization.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Tokenization classes.\"\"\"\n\nimport collections\nimport sentencepiece as spm\nimport unicodedata\n\nfrom .args import str2bool\n\n\ndef clean_text(text):\n    \"\"\"Performs invalid character removal and whitespace cleanup on text.\"\"\"\n    text = text.replace(u\"“\", u'\"')\\\n        .replace(u'”', u'\"')\\\n        .replace(u'‘', \"'\")\\\n        .replace(u'’', u\"'\")\\\n        .replace(u'—', u'-')\n\n    output = []\n    for char in text:\n        if _is_control(char):\n            continue\n        if _is_whitespace(char):\n            output.append(\" \")\n        else:\n            output.append(char)\n    return \"\".join(output)\n\n\ndef preprocess_text(inputs, remove_space=True, lower=False):\n    \"\"\"preprocess data by removing extra space and normalize data.\"\"\"\n    outputs = inputs\n    if remove_space:\n        outputs = \" \".join(inputs.strip().split())\n\n    outputs = unicodedata.normalize(\"NFKD\", outputs)\n    outputs = \"\".join([c for c in outputs if not unicodedata.combining(c)])\n    if lower:\n        outputs = outputs.lower()\n\n    return outputs\n\n\ndef encode_pieces(spm_model, text, return_unicode=True, sample=False):\n    \"\"\"turn sentences into word pieces.\"\"\"\n    # liujiaxiang: add for ernie-albert, mainly consider for “/”/‘/’/— causing too many unk\n    text = 
clean_text(text)\n\n    if not sample:\n        pieces = spm_model.EncodeAsPieces(text)\n    else:\n        pieces = spm_model.SampleEncodeAsPieces(text, 64, 0.1)\n\n    return pieces\n\n\ndef encode_ids(spm_model, text, sample=False):\n    \"\"\"turn sentences into word piece ids.\"\"\"\n    pieces = encode_pieces(spm_model, text, return_unicode=False, sample=sample)\n    ids = [spm_model.PieceToId(piece) for piece in pieces]\n    return ids\n\n\ndef convert_to_unicode(text):\n    \"\"\"Converts `text` to Unicode (if it's not already), assuming utf-8 input.\"\"\"\n    if isinstance(text, str):\n        return text\n    elif isinstance(text, bytes):\n        return text.decode(\"utf-8\", \"ignore\")\n    else:\n        raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n\n\ndef load_vocab(vocab_file):\n    \"\"\"Loads a vocabulary file into a dictionary.\"\"\"\n    vocab = collections.OrderedDict()\n    with open(vocab_file, 'r', encoding=\"UTF-8\") as fin:\n        for num, line in enumerate(fin):\n            items = convert_to_unicode(line.rstrip()).split(\"\\t\")\n            if len(items) > 2:\n                break\n            token = items[0]\n            index = items[1] if len(items) == 2 else num\n            token = token.strip()\n            vocab[token] = int(index)\n    return vocab\n\n\ndef convert_by_vocab(vocab, items):\n    \"\"\"Converts a sequence of [tokens|ids] using the vocab.\"\"\"\n    output = []\n    for item in items:\n        output.append(vocab[item])\n    return output\n\n\nclass SentencePieceTokenizer(object):\n    \"\"\"Runs end-to-end tokenization.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline arguments.\"\"\"\n        group = parser.add_argument_group(\"Tokenizer\")\n        group.add_argument(\"--vocab_path\", type=str, required=True)\n        group.add_argument(\"--do_lower_case\", type=str2bool, default=False)\n        group.add_argument(\"--spm_model_file\", type=str, required=True)\n        return group\n\n    def __init__(self, args):\n        self.spm_model = spm.SentencePieceProcessor()\n        self.spm_model.Load(args.spm_model_file)\n        self.vocab = load_vocab(args.vocab_path)\n        self.do_lower_case = args.do_lower_case\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text.\"\"\"\n        text = preprocess_text(text, lower=self.do_lower_case)\n        return encode_pieces(self.spm_model, text, return_unicode=True)\n\n    def convert_tokens_to_ids(self, tokens):\n        \"\"\"Convert tokens to ids.\"\"\"\n        ret = []\n        unk_id = self.vocab[\"<unk>\"]\n        for token in tokens:\n            if token in self.vocab:\n                ret.append(self.vocab[token])\n            else:\n                ret.append(unk_id)\n        return ret\n\n    def convert_ids_to_tokens(self, ids):\n        \"\"\"Convert ids to tokens.\"\"\"\n        return convert_by_vocab(self.inv_vocab, ids)\n\n    def merge_subword(self, tokens):\n        \"\"\"Merge subword.\"\"\"\n        ret = []\n        for token in tokens:\n            if token.startswith(u\"▁\"):\n                ret.append(token[1:])\n            else:\n                if len(ret):\n                    ret[-1] += token\n                else:\n                    ret.append(token)\n\n        ret = [token for token in ret if token]\n        return ret\n\n    def convert_ids_to_str(self, ids):\n        \"\"\"Convert ids to string.\"\"\"\n        tokens = self.convert_ids_to_tokens(ids)\n        tokens = self.merge_subword(tokens)\n        res = \" \".join(tokens).replace(\"<s>\", \"\")\n        res = res.replace(\"</s>\", \"\\n\").replace(\"\\n \", \"\\n\").strip()\n        return res\n\n\ndef _is_whitespace(char):\n    \"\"\"Checks whether `char` is a whitespace character.\"\"\"\n    # \\t, \\n, and \\r are technically control characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == \" \" or char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return True\n    cat = unicodedata.category(char)\n    if cat == \"Zs\":\n        return True\n    return False\n\n\ndef _is_control(char):\n    \"\"\"Checks whether `char` is a control character.\"\"\"\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return False\n    cat = unicodedata.category(char)\n    if cat.startswith(\"C\"):\n        return True\n    return False\n"
  },
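The `merge_subword` step in the tokenizer above can be exercised standalone: SentencePiece marks the first piece of each word with the `▁` (U+2581) prefix, and a piece lacking it is glued onto the previous word. A minimal sketch of the same logic (plain Python, no SentencePiece model required):

```python
def merge_subword(tokens):
    """Merge SentencePiece pieces back into whole words.

    A piece starting with the U+2581 marker begins a new word (the marker is
    stripped); any other piece is appended to the word before it.
    """
    ret = []
    for token in tokens:
        if token.startswith(u"\u2581"):
            ret.append(token[1:])
        elif ret:
            ret[-1] += token
        else:
            ret.append(token)
    # Drop empty strings left by a bare "▁" marker.
    return [t for t in ret if t]


pieces = [u"\u2581Hel", "lo", u"\u2581world"]
print(merge_subword(pieces))  # ['Hello', 'world']
```

A piece without the marker at the very start of the sequence is kept as-is, matching the tokenizer's behavior.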
  {
    "path": "modules/text/text_generation/plato2_en_large/README.md",
    "content": "# plato2_en_large\n\n| 模型名称            |       plato2_en_large       |\n| :------------------ | :--------------------: |\n| 类别                |     文本-文本生成      |\n| 网络                |  PLATO2   |\n| 数据集              | 大规模开放域英文数据集 |\n| 是否支持Fine-tuning |           否           |\n| 模型大小            |         19.3 GB       |\n| 最新更新日期        |       2022-11-05       |\n| 数据指标            |           -            |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - PLATO2 是一个超大规模生成式对话系统模型。它承袭了 PLATO 隐变量进行回复多样化生成的特性，能够就开放域话题进行流畅深入的聊天。据公开数据，其效果超越了 Google 于 2020 年 2 月份发布的 Meena 和 Facebook AI Research 于2020 年 4 月份发布的 Blender 的效果。plato2_en_large 包含 1.6B 参数，可用于一键预测对话回复。由于该 Module 参数量较多，推荐使用GPU预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 2.0.0\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install plato2_en_large\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```bash\n    $ hub run plato2_en_large --input_text=\"Hello, how are you\"\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    module = hub.Module(name=\"plato2_en_large\")\n\n    test_texts = [\"Hello\",\"Hello\\thi, nice to meet you\\tnice to meet you\"]\n    results = module.generate(texts=test_texts)\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def generate(texts):\n    ```\n\n    - 预测API，输入对话上下文，输出机器回复。\n    - **参数**\n      - texts (list\\[str\\] or str): 如果不在交互模式中，texts应为list，每个元素为一次对话的上下文，上下文应包含人类和机器人的对话内容，不同角色之间的聊天用分隔符\"\\t\"进行分割；例如[[\"Hello\\thi, nice to meet you\\tnice to meet you\"]]。这个输入中包含1次对话，机器人回复了\"hi, nice to meet you\"后人类回复“nice to meet you”，现在轮到机器人回复了。如果在交互模式中，texts应为str，模型将自动构建它的上下文。\n\n  - 
```python\n    def interactive_mode(max_turn=6):\n    ```\n\n    - 进入交互模式。交互模式中，generate接口的texts将支持字符串类型。\n    - **参数**\n      - max_turn (int): 模型能记忆的对话轮次，当max_turn = 1时，模型只能记住当前对话，无法获知之前的对话内容。\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线对话机器人服务。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m plato2_en_large -p 8866\n    ```\n\n  - 这样就完成了一个对话机器人服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    data = {'texts':[\"Hello\",\"Hello\\thi, nice to meet you\\tnice to meet you\"]}\n    headers = {\"Content-type\": \"application/json\"}\n    url = \"http://127.0.0.1:8866/predict/plato2_en_large\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 保存结果\n    results = r.json()[\"results\"]\n    for result in results:\n        print(result)\n    ```\n\n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install plato==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/model.py",
    "content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\nimport paddle.nn.functional as F\n\n\ndef post_process_context(token_ids, reader, merge=True):\n    \"\"\"Post-process the context sequence.\"\"\"\n    context = []\n    utt = []\n    for tok_id in token_ids[1:]:\n        if tok_id == reader.eos_id:\n            utt = reader.tokenizer.convert_ids_to_tokens(utt)\n            if merge:\n                utt = reader.tokenizer.merge_subword(utt)\n            context.append(utt)\n            utt = []\n        else:\n            utt.append(tok_id)\n    return context\n\n\ndef post_process_response(token_ids, reader, merge=True):\n    \"\"\"\n    Post-process the decoded sequence. 
Truncate from the first\n    <eos> and remove the <bos> and <eos> tokens currently.\n    \"\"\"\n    eos_pos = len(token_ids)\n    for i, tok_id in enumerate(token_ids):\n        if tok_id == reader.eos_id:\n            eos_pos = i\n            break\n    token_ids = token_ids[1:eos_pos]\n    response = reader.tokenizer.convert_ids_to_tokens(token_ids)\n    if merge:\n        response = reader.tokenizer.merge_subword(response)\n    return token_ids, response\n\n\ndef get_cross_turn_repetition(context, pred_tokens, eos_idx, is_cn=False):\n    \"\"\"Get cross-turn repetition.\"\"\"\n    if len(pred_tokens) == 0:\n        return 1.0\n    if is_cn:\n        context = [\"\".join(utt) for utt in context]\n        pred_tokens = \"\".join(pred_tokens)\n\n    pred_tri_grams = set()\n    for i in range(len(pred_tokens) - 2):\n        tri_gram = tuple(pred_tokens[i:i + 3])\n        pred_tri_grams.add(tri_gram)\n    for utt in context:\n        for i in range(len(utt) - 2):\n            tri_gram = tuple(utt[i:i + 3])\n            if tri_gram in pred_tri_grams:\n                return 1.0\n    return 0.0\n\n\ndef get_in_turn_repetition(pred, is_cn=False):\n    \"\"\"Get in-turn repetition.\"\"\"\n    if len(pred) == 0:\n        return 1.0\n    if isinstance(pred[0], str):\n        pred = [tok.lower() for tok in pred]\n        if is_cn:\n            pred = \"\".join(pred)\n    tri_grams = set()\n    for i in range(len(pred) - 2):\n        tri_gram = tuple(pred[i:i + 3])\n        if tri_gram in tri_grams:\n            return 1.0\n        tri_grams.add(tri_gram)\n    return 0.0\n\n\nclass Plato2EncoderLayer(nn.Layer):\n\n    def __init__(self, n_head, hidden_size, attn_dropout, act_dropout):\n        super(Plato2EncoderLayer, self).__init__()\n\n        self.self_attn = nn.MultiHeadAttention(hidden_size, n_head, attn_dropout)\n        self.pre_norm_layer = nn.LayerNorm(hidden_size)\n        self.post_norm_layer = nn.LayerNorm(hidden_size)\n        self.fc1 = nn.Linear(hidden_size, 
hidden_size * 4)\n        self.fc2 = nn.Linear(hidden_size * 4, hidden_size)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n        self.gelu_layer = nn.GELU()\n\n    def forward(self, x, attn_mask, cache):\n        query = self.pre_norm_layer(x)\n        attn_output, new_cache = self.self_attn(query, None, None, attn_mask, cache)\n        attn_output = self.dropout_layer(attn_output)\n        attn_output = attn_output + x\n        ffd_input = self.post_norm_layer(attn_output)\n\n        ffd_output = self.fc1(ffd_input)\n        ffd_output = self.gelu_layer(ffd_output)\n        ffd_output = self.dropout_layer(ffd_output)\n\n        ffd_output = self.fc2(ffd_output)\n        ffd_output = self.dropout_layer(ffd_output)\n        out = ffd_output + attn_output\n\n        return out, new_cache\n\n    def gen_cache(self, key):\n        return self.self_attn.gen_cache(key)\n\n\nclass Plato2Encoder(nn.Layer):\n\n    def __init__(self, vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size, attn_dropout,\n                 act_dropout):\n        super(Plato2Encoder, self).__init__()\n\n        self.n_head = n_head\n\n        self.word_embedding_layer = nn.Embedding(vocab_size, hidden_size)\n        self.sent_embedding_layer = nn.Embedding(type_size, hidden_size)\n        self.pos_embedding_layer = nn.Embedding(max_position_seq_len, hidden_size)\n\n        self.encoder_layers = []\n        for i in range(num_layers):\n            encoder_layer = Plato2EncoderLayer(n_head, hidden_size, attn_dropout, act_dropout)\n            self.encoder_layers.append(encoder_layer)\n            self.add_sublayer('layers.' 
+ str(i), encoder_layer)\n        self.post_encoder_layer_norm = nn.LayerNorm(hidden_size)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n\n    def forward(self, caches, token_ids, type_ids, pos_ids, generation_mask, aux_emb=None):\n        out, self_attn_mask = self.gen_input(token_ids, type_ids, pos_ids, generation_mask, aux_emb)\n\n        new_caches = []\n        for i, encoder_layer in enumerate(self.encoder_layers):\n            out, new_cache = encoder_layer(out, self_attn_mask, caches[i])\n            new_caches.append(new_cache)\n\n        enc_output = self.post_encoder_layer_norm(out)\n        return enc_output, new_caches\n\n    def gen_input(self, token_ids, type_ids, pos_ids, input_mask, aux_emb=None):\n        token_emb_out = self.word_embedding_layer(token_ids)\n        type_emb_out = self.sent_embedding_layer(type_ids)\n        pos_emb_out = self.pos_embedding_layer(pos_ids)\n        emb_out = token_emb_out + type_emb_out + pos_emb_out\n\n        # auxiliary memory embeddings\n        if aux_emb is not None:\n            emb_out = paddle.concat([aux_emb, emb_out], axis=1)\n\n        emb_out = self.dropout_layer(emb_out)\n\n        # generate n-head self-attention mask\n        self_attn_mask = input_mask\n        self_attn_mask = paddle.scale(x=self_attn_mask, scale=1e4, bias=-1.0, bias_after_scale=False)\n        n_head_self_attn_mask = paddle.stack(x=[self_attn_mask] * self.n_head, axis=1)\n        n_head_self_attn_mask.stop_gradient = True\n\n        return emb_out, n_head_self_attn_mask\n\n    def gen_caches(self, key):\n        caches = [encoder_layer.gen_cache(key) for encoder_layer in self.encoder_layers]\n        return caches\n\n\nclass NSP(nn.Layer):\n\n    def __init__(self, vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size, attn_dropout,\n                 act_dropout):\n        super(NSP, self).__init__()\n\n        self.n_head = n_head\n        self.hidden_size = hidden_size\n\n        
self.word_embedding_layer = nn.Embedding(vocab_size, hidden_size)\n        self.sent_embedding_layer = nn.Embedding(type_size, hidden_size)\n        self.pos_embedding_layer = nn.Embedding(max_position_seq_len, hidden_size)\n\n        encoder_layer = nn.TransformerEncoderLayer(hidden_size, n_head, hidden_size * 4, act_dropout, 'gelu',\n                                                   attn_dropout, act_dropout, 'True')\n        encoder_norm = nn.LayerNorm(hidden_size)\n        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers, encoder_norm)\n        self.fc1 = nn.Linear(hidden_size, hidden_size)\n        self.fc2 = nn.Linear(hidden_size, 2)\n\n        self.dropout_layer = nn.Dropout(act_dropout)\n        self.tanh_layer = nn.Tanh()\n        self.softmax = nn.Softmax()\n\n    def forward(self, inputs):\n        token_ids = inputs['token_ids']\n        type_ids = inputs['type_ids']\n        pos_ids = inputs['pos_ids']\n        attention_mask = inputs['attention_mask']\n        label_pos = inputs[\"label_pos\"]\n\n        out, self_attn_mask = self.gen_input(token_ids, type_ids, pos_ids, attention_mask)\n        # [-1, seq_len, hidden_size]\n        enc_out = self.encoder(out, self_attn_mask)\n\n        enc_out = paddle.reshape(enc_out, [-1, self.hidden_size])\n        label_pos = paddle.cast(label_pos, 'int64')\n        out = paddle.gather(enc_out, label_pos)\n        pooled_out = self.fc1(out)\n        pooled_out = self.tanh_layer(pooled_out)\n\n        # [-1, 2]\n        logits = self.fc2(pooled_out)\n        probs = self.softmax(logits)\n\n        return probs\n\n    def gen_input(self, token_ids, type_ids, pos_ids, input_mask, aux_emb=None):\n        token_emb_out = self.word_embedding_layer(token_ids)\n        type_emb_out = self.sent_embedding_layer(type_ids)\n        pos_emb_out = self.pos_embedding_layer(pos_ids)\n        emb_out = token_emb_out + type_emb_out + pos_emb_out\n\n        # auxiliary memory embeddings\n        if aux_emb is not 
None:\n            emb_out = paddle.concat([aux_emb, emb_out], axis=1)\n\n        emb_out = self.dropout_layer(emb_out)\n\n        # generate n-head self-attention mask\n        self_attn_mask = input_mask\n        self_attn_mask = paddle.scale(x=self_attn_mask, scale=1e4, bias=-1.0, bias_after_scale=False)\n        n_head_self_attn_mask = paddle.stack(x=[self_attn_mask] * self.n_head, axis=1)\n        n_head_self_attn_mask.stop_gradient = True\n\n        return emb_out, n_head_self_attn_mask\n\n\nclass Plato2InferModel(nn.Layer):\n\n    def __init__(self,\n                 nsp_reader,\n                 num_layers,\n                 n_head,\n                 hidden_size,\n                 vocab_size=8001,\n                 type_size=2,\n                 latent_type_size=20,\n                 max_position_seq_len=256,\n                 act_dropout=0.1,\n                 attn_dropout=0.1,\n                 max_dec_len=64,\n                 min_dec_len=1,\n                 topk=10):\n        super(Plato2InferModel, self).__init__()\n\n        self.nsp_reader = nsp_reader\n        self.num_layers = num_layers\n        self.latent_type_size = latent_type_size\n        self.max_dec_len = max_dec_len\n        self.min_dec_len = min_dec_len\n        self.topk = topk\n        self.unk_id = 0\n        self.bos_id = 1\n        self.eos_id = 2\n        self.mask_id = 8000\n        self.after_eos = paddle.ones([vocab_size]) * -1e9\n        self.after_eos[self.eos_id] = 0\n        self.is_cn = False\n        self.batch_size = 1\n\n        self.latent_weight = paddle.create_parameter([hidden_size, latent_type_size], 'float32')\n\n        self.plato2_encoder = Plato2Encoder(vocab_size, type_size, max_position_seq_len, num_layers, n_head,\n                                            hidden_size, attn_dropout, act_dropout)\n\n        self.logits_fc_layer = nn.Linear(hidden_size, hidden_size)\n        self.logits_layer_norm = nn.LayerNorm(hidden_size)\n        self.logits_bias = 
paddle.create_parameter([vocab_size], 'float32', is_bias=True)\n\n        self.nsp_predictor = NSP(vocab_size, type_size, max_position_seq_len, num_layers, n_head, hidden_size,\n                                 attn_dropout, act_dropout)\n\n        self.gelu_layer = nn.GELU()\n        self.softmax = nn.Softmax()\n\n    @paddle.no_grad()\n    def forward(self, inputs):\n        token_ids = inputs['token_ids']\n        type_ids = inputs['type_ids']\n        pos_ids = inputs['pos_ids']\n        generation_mask = inputs['generation_mask']\n        latent_id = inputs['latent_id']\n        data_id = inputs['data_id']\n\n        # [-1, 1, latent_type_size]\n        latent_id = F.one_hot(latent_id, self.latent_type_size)\n        # [-1, 1, hidden_size]\n        latent_emb = paddle.matmul(latent_id, self.latent_weight, transpose_y=True)\n\n        caches = self.plato2_encoder.gen_caches(token_ids)\n\n        # [-1, seq_len + 1, hidden_size]\n        enc_out, new_caches = self.plato2_encoder(caches, token_ids, type_ids, pos_ids, generation_mask, latent_emb)\n\n        pred_ids = self.decode(inputs, new_caches)\n\n        nsp_inputs = self.gen_nsp_input(token_ids, pred_ids)\n        # [-1, 2]\n        probs = self.nsp_predictor(nsp_inputs)\n\n        return self.get_results(data_id, token_ids, pred_ids, probs)\n\n    def decode(self, inputs, caches):\n        tgt_ids = inputs['tgt_ids']\n        tgt_pos = inputs['tgt_pos']\n        tgt_generation_mask = inputs['tgt_generation_mask']\n        predictions = tgt_ids\n\n        # TODO\n        step = 0\n        while step < self.max_dec_len:\n            # [-1, 1]\n            append_mask = paddle.cast(tgt_ids != self.eos_id, dtype=tgt_generation_mask.dtype)\n            tgt_generation_mask = paddle.concat([tgt_generation_mask, paddle.unsqueeze(append_mask, 1)], axis=-1)\n            tgt_sent = paddle.ones([tgt_generation_mask.shape[0], 1], dtype=tgt_ids.dtype)\n\n            # [-1, 1, hidden_size]\n            out, caches = 
self.plato2_encoder(caches, tgt_ids, tgt_sent, tgt_pos, tgt_generation_mask)\n            out = paddle.squeeze(out, axis=1)\n\n            # [-1, hidden_size]\n            trans = self.logits_fc_layer(out)\n            trans = self.gelu_layer(trans)\n            trans = self.logits_layer_norm(trans)\n\n            # [-1, vocab_size]\n            logits = paddle.matmul(trans, self.plato2_encoder.word_embedding_layer.weight,\n                                   transpose_y=True) + self.logits_bias\n            logits[:, self.unk_id] = -1e9\n            logits[:, self.bos_id] = -1e9\n            logits[:, self.mask_id] = -1e9\n            if step < self.min_dec_len:\n                logits[:, self.eos_id] = -1e9\n            logits = logits * append_mask + (1 - append_mask) * self.after_eos\n            probs = self.softmax(logits)\n\n            # [-1, topk]\n            topk_probs, _ = paddle.topk(probs, k=self.topk)\n            mask = paddle.cast(probs >= topk_probs[:, -1:], 'float32')\n            sums = paddle.sum(topk_probs, axis=-1, keepdim=True)\n            new_probs = probs * mask / sums\n            # [-1, 1]\n            sampling_ids = paddle.multinomial(new_probs)\n\n            step = step + 1\n            tgt_ids = sampling_ids\n            tgt_pos = tgt_pos + 1\n            predictions = paddle.concat([predictions, tgt_ids], axis=1)\n        return predictions\n\n    def gen_nsp_input(self, token_ids, pred_ids):\n        token_ids = token_ids.numpy()\n        pred_ids = pred_ids.numpy()\n\n        def __reader__():\n            headers = [\"src\", \"tgt\", \"data_id\"]\n\n            Example = namedtuple(\"Example\", headers)\n\n            for i, (raw, pred) in enumerate(zip(token_ids, pred_ids)):\n                context = post_process_context(raw, self.nsp_reader, merge=False)\n                _, response = post_process_response(pred, self.nsp_reader, merge=False)\n                context_tokenized_input = \" [SEP] \".join(\" \".join(utt) for utt in 
context)\n                response_tokenized_input = \" \".join(response)\n                example = Example(src=context_tokenized_input, tgt=response_tokenized_input, data_id=i)\n                data = self.nsp_reader._convert_example_to_record(example, is_infer=True)\n                yield data\n            return\n\n        generator = self.nsp_reader.data_generator(\n            reader=__reader__,\n            is_infer=True,\n            phase=\"test\",\n        )\n        inputs = next(generator())\n\n        #print('\\nnsp_inputs:')\n        for key in inputs:\n            inputs[key] = paddle.to_tensor(inputs[key])\n            if key in ['token_ids', 'type_ids', 'pos_ids']:\n                inputs[key] = paddle.squeeze(inputs[key], axis=-1)\n            #print(key, inputs[key].shape)\n            #print(inputs[key])\n        return inputs\n\n    def get_results(self, data_id, token_ids, pred_ids, probs):\n        data_id = data_id.numpy()\n        token_ids = token_ids.numpy()\n        pred_ids = pred_ids.numpy()\n        probs = probs.numpy()\n\n        infos = []\n        for raw, pred, prob in zip(token_ids, pred_ids, probs):\n            tokens = post_process_context(raw, self.nsp_reader)\n            pred_token_ids, pred_tokens = post_process_response(pred, self.nsp_reader)\n            info = {}\n            info['response'] = ' '.join(pred_tokens)\n            cross_turn_repetition = get_cross_turn_repetition(tokens, pred_tokens, self.nsp_reader.eos_id, self.is_cn)\n            in_turn_repetition = max(get_in_turn_repetition(pred_tokens, self.is_cn),\n                                     get_in_turn_repetition(pred_token_ids))\n\n            info['score'] = float(prob[1])\n            if len(pred_token_ids) >= self.max_dec_len:\n                info['score'] -= 1e3\n            elif cross_turn_repetition > 0:\n                info['score'] -= 1e3\n            elif in_turn_repetition > 0:\n                info['score'] -= 1e3\n            
infos.append(info)\n\n        results = []\n        pre_idx = 0\n        sample = []\n        for idx, info in zip(data_id, infos):\n            if idx != pre_idx:\n                sample = sorted(sample, key=lambda info: -info[\"score\"])\n                result = sample[0]\n                result['data_id'] = pre_idx\n                results.append(result)\n                sample = []\n                pre_idx = idx\n            sample.append(info)\n        if sample:\n            sample = sorted(sample, key=lambda info: -info[\"score\"])\n            result = sample[0]\n            result['data_id'] = pre_idx\n            results.append(result)\n        return results\n"
  },
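The decoding loop in `Plato2InferModel.decode` samples each next token from a top-k-filtered distribution: probabilities below the k-th largest are zeroed, the survivors are renormalized by the mass of the top-k values, and `paddle.multinomial` draws the next id. A NumPy sketch of just that filtering step (the helper name `topk_filter` is ours, for illustration):

```python
import numpy as np


def topk_filter(probs, k):
    # For each row, keep only the k highest-probability entries.
    topk = np.sort(probs, axis=-1)[:, -k:]   # the k largest values per row
    threshold = topk[:, :1]                  # smallest of the top-k
    mask = (probs >= threshold).astype(probs.dtype)
    # Renormalize by the mass of the top-k values, as the decode loop does.
    return probs * mask / topk.sum(axis=-1, keepdims=True)


probs = np.array([[0.1, 0.2, 0.3, 0.4]])
filtered = topk_filter(probs, k=2)
print(filtered)  # only the 0.3 and 0.4 entries survive, scaled to sum to 1
```

Sampling then proceeds from `filtered` instead of `probs`, which bounds the tail of unlikely tokens the model can emit at each step.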
  {
    "path": "modules/text/text_generation/plato2_en_large/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport contextlib\nimport os\nimport sys\nfrom collections import namedtuple\n\nimport paddle\nimport paddle.nn as nn\n\nimport paddlehub as hub\nfrom .model import Plato2InferModel\nfrom .readers.nsp_reader import NSPReader\nfrom .readers.plato_reader import PlatoReader\nfrom .utils import gen_inputs\nfrom .utils.args import parse_args\nfrom .utils.args import str2bool\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\nfrom paddlehub.module.nlp_module import DataFormatError\n\n\n@moduleinfo(\n    name=\"plato2_en_large\",\n    version=\"1.1.0\",\n    summary=\n    \"A novel pre-training model for dialogue generation, incorporated with latent discrete variables for one-to-many relationship modeling.\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass Plato2(nn.Layer, hub.NLPPredictionModule):\n\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        super(Plato2, self).__init__()\n        args = self.setup_args()\n\n        if args.num_layers == 24:\n            n_head = 16\n            hidden_size = 1024\n        elif args.num_layers == 32:\n            n_head = 32\n            hidden_size = 2048\n        else:\n 
           raise ValueError('The pre-trained model only support 24 or 32 layers, '\n                             'but received num_layers=%d.' % args.num_layers)\n\n        self.plato_reader = PlatoReader(args)\n        nsp_reader = NSPReader(args)\n        self.model = Plato2InferModel(nsp_reader, args.num_layers, n_head, hidden_size)\n        state_dict = paddle.load(args.init_from_ckpt)\n        self.model.set_state_dict(state_dict)\n        self.model.eval()\n        self.Example = namedtuple(\"Example\", [\"src\", \"data_id\"])\n        self.latent_type_size = args.latent_type_size\n        self._interactive_mode = False\n\n    def setup_args(self):\n        \"\"\"\n        Setup arguments.\n        \"\"\"\n        ckpt_path = os.path.join(self.directory, 'assets', '32L.pdparams')\n        vocab_path = os.path.join(self.directory, 'assets', 'vocab.txt')\n        spm_model_file = os.path.join(self.directory, 'assets', 'spm.model')\n\n        # ArgumentParser.parse_args use argv[1:], it will drop the first one arg, so the first one in sys.argv should be \"\"\n        sys.argv = [\n            \"--empty\",\n            \"--spm_model_file\",\n            \"%s\" % spm_model_file,\n            \"--vocab_path\",\n            \"%s\" % vocab_path,\n        ]\n\n        parser = argparse.ArgumentParser()\n        group = parser.add_argument_group(\"Model\")\n        group.add_argument(\"--init_from_ckpt\", type=str, default=ckpt_path)\n        group.add_argument(\"--vocab_size\", type=int, default=8001)\n        group.add_argument(\"--latent_type_size\", type=int, default=20)\n        group.add_argument(\"--num_layers\", type=int, default=32)\n\n        group = parser.add_argument_group(\"Task\")\n        group.add_argument(\"--is_cn\", type=str2bool, default=False)\n\n        NSPReader.add_cmdline_args(parser)\n\n        args = parse_args(parser)\n        args.batch_size *= args.latent_type_size\n\n        return args\n\n    @serving\n    @paddle.no_grad()\n    def 
generate(self, texts):\n        \"\"\"\n        Get the robot responses of the input texts.\n\n        Args:\n             texts(list or str): If not in the interactive mode, texts should be a list in which every element is the chat context separated with '\\t'.\n                                 Otherwise, texts shoule be one sentence. The module can get the context automatically.\n\n        Returns:\n             results(list): the robot responses.\n        \"\"\"\n        if not texts:\n            return []\n        if self._interactive_mode:\n            if isinstance(texts, str):\n                self.context.append(texts.strip())\n                texts = [\" [SEP] \".join(self.context[-self.max_turn:])]\n            else:\n                raise ValueError(\"In the interactive mode, the input data should be a string.\")\n        elif not isinstance(texts, list):\n            raise ValueError(\"If not in the interactive mode, the input data should be a list.\")\n\n        bot_responses = []\n        for i, text in enumerate(texts):\n            example = self.Example(src=text.replace(\"\\t\", \" [SEP] \"), data_id=0)\n            record = self.plato_reader._convert_example_to_record(example, is_infer=True)\n            data = self.plato_reader._pad_batch_records([record], is_infer=True)\n            inputs = gen_inputs(data, self.latent_type_size)\n            inputs['tgt_ids'] = inputs['tgt_ids'].astype('int64')\n            pred = self.model(inputs)[0]  # batch_size is 1\n            bot_response = pred[\"response\"]  # ignore data_id and score\n            bot_responses.append(bot_response)\n\n        if self._interactive_mode:\n            self.context.append(bot_responses[0].strip())\n        return bot_responses\n\n    @contextlib.contextmanager\n    def interactive_mode(self, max_turn=6):\n        \"\"\"\n        Enter the interactive mode.\n\n        Args:\n            max_turn(int): the max dialogue turns. 
max_turn = 1 means the robot can only remember the last one utterance you have said.\n        \"\"\"\n        self._interactive_mode = True\n        self.max_turn = max_turn\n        self.context = []\n        yield\n        self.context = []\n        self._interactive_mode = False\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description='Run the %s module.' % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except DataFormatError and RuntimeError:\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data)\n\n        return results\n"
  },
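In interactive mode, `generate` keeps a rolling window of the last `max_turn` utterances and joins them with \" [SEP] \" before handing the string to the reader. The bookkeeping can be sketched without the model (the `ContextWindow` class is hypothetical; the real module stores the list directly on itself):

```python
class ContextWindow:
    """Minimal sketch of the interactive-mode context handling."""

    def __init__(self, max_turn=6):
        self.max_turn = max_turn
        self.context = []

    def build_input(self, human_utterance):
        # Append the human turn, then keep only the last max_turn utterances.
        self.context.append(human_utterance.strip())
        return " [SEP] ".join(self.context[-self.max_turn:])

    def record_reply(self, bot_utterance):
        # The bot's reply is appended so it is part of the next context.
        self.context.append(bot_utterance.strip())


w = ContextWindow(max_turn=2)
print(w.build_input("Hello"))             # Hello
w.record_reply("hi, nice to meet you")
print(w.build_input("nice to meet you"))  # hi, nice to meet you [SEP] nice to meet you
```

With `max_turn=1` the window only ever contains the current human utterance, which matches the README's description of the parameter.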
  {
    "path": "modules/text/text_generation/plato2_en_large/readers/dialog_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Dialogue Reader.\"\"\"\n\nimport csv\nfrom collections import namedtuple\nfrom contextlib import contextmanager\nimport gzip\n\nimport numpy as np\n\nfrom ..utils import pad_batch_data\nfrom ..utils.args import str2bool\nfrom ..utils.masking import mask\nfrom ..utils import tokenization\n\n\nclass DialogReader(object):\n    \"\"\"The implement of DialogReader.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline argurments.\"\"\"\n        group = parser.add_argument_group(\"Reader\")\n        group.add_argument(\"--max_src_len\", type=int, default=128)\n        group.add_argument(\"--max_tgt_len\", type=int, default=128)\n        group.add_argument(\"--truncate_first_turn\",\n                           type=str2bool,\n                           default=False)\n        group.add_argument(\"--file_format\",\n                           type=str,\n                           default=\"file\",\n                           choices=[\"file\", \"filelist\"])\n        group.add_argument(\"--data_format\",\n                           type=str,\n                           default=\"raw\",\n                           choices=[\"raw\", \"tokenized\", \"numerical\"])\n        group.add_argument(\"--in_tokens\", type=str2bool, default=False)\n        group.add_argument(\"--batch_size\", type=int, 
default=16)\n        group.add_argument(\"--continuous_position\", type=str2bool, default=True)\n        group.add_argument(\"--random_seed\", type=int, default=11)\n        group.add_argument(\"--sort_pool_size\", type=int, default=2**16)\n\n        group = parser.add_argument_group(\"Tokenizer\")\n        group.add_argument(\"--tokenizer\",\n                           type=str,\n                           default=\"SentencePieceTokenizer\")\n        args, _ = parser.parse_known_args()\n        tokenizer_cls = getattr(tokenization, args.tokenizer)\n        tokenizer_cls.add_cmdline_args(parser)\n        return group\n\n    def __init__(self, args):\n        tokenizer_cls = getattr(tokenization, args.tokenizer)\n        self.tokenizer = tokenizer_cls(args)\n        self.vocab = self.tokenizer.vocab\n        self.pad_id = args.pad_id = self.vocab[\"[PAD]\"]\n        self.bos_id = args.bos_id = self.vocab[\"[CLS]\"]\n        self.eos_id = args.eos_id = self.vocab[\"[SEP]\"]\n        self.unk_id = args.unk_id = self.vocab[\"[UNK]\"]\n        self.mask_id = args.mask_id = self.vocab[\"[MASK]\"]\n        self.vocab_size = args.get(\"vocab_size\", 0)\n        self.max_src_len = args.max_src_len\n        self.max_tgt_len = args.max_tgt_len\n        self.truncate_first_turn = args.truncate_first_turn\n        self.file_format = args.file_format\n        self.data_format = args.data_format\n        self.in_tokens = args.in_tokens\n        self.batch_size = args.batch_size\n        self.continuous_position = args.continuous_position\n        self.sort_pool_size = args.sort_pool_size\n\n        # random_seed must be set for data slicing when using multi-gpu\n        self.global_rng = np.random.RandomState(args.random_seed)\n\n        # training progress\n        self.current_example = 0\n        self.current_epoch = 0\n        self.num_examples = 0\n\n        # model related\n\n        self.fields = [\"token_ids\", \"type_ids\", \"pos_ids\"]\n        self.num_numerical_fields 
= len(self.fields)\n        self.fields += [\"tgt_start_idx\", \"data_id\"]\n        self.sort_key = lambda record: [len(record.token_ids)]\n\n        self.Record = namedtuple(\"Record\",\n                                 self.fields,\n                                 defaults=(None, ) * len(self.fields))\n\n        self.features = {}\n        return\n\n    def get_train_progress(self):\n        \"\"\"Gets progress for training phase.\"\"\"\n        return self.current_epoch, self.current_file_index, self.total_file\n\n    def _convert_example_to_record(self, example, is_infer):\n        # process src\n        src_token_ids = []\n        src_pos_ids = []\n\n        # tokenize src\n        s_token_ids_list = []\n        for s in example.src.split(\"[SEP]\"):\n            s = tokenization.convert_to_unicode(s).strip()\n\n            if self.data_format == \"tokenized\":\n                s_tokens = s.split(\" \")\n            else:\n                s_tokens = self.tokenizer.tokenize(s)\n\n            s_token_ids = self.tokenizer.convert_tokens_to_ids(s_tokens) + [\n                self.eos_id\n            ]\n            s_token_ids_list.append(s_token_ids)\n\n        # trim src\n        idx = len(s_token_ids_list) - 1\n        total_token_num = 1\n        while idx >= 0:\n            total_token_num += len(s_token_ids_list[idx])\n            if total_token_num > self.max_src_len:\n                if self.truncate_first_turn and idx == 0:\n                    truncated_ids = s_token_ids_list[idx][:self.max_src_len -\n                                                          total_token_num]\n                    if len(truncated_ids) > 1:\n                        s_token_ids_list[idx] = truncated_ids[:-1] + [\n                            self.eos_id\n                        ]\n                        idx -= 1\n                break\n            idx -= 1\n\n        for i, s_token_ids in enumerate(s_token_ids_list[idx + 1:], idx + 1):\n            src_token_ids += 
s_token_ids\n            src_pos_ids += list(range(1, len(s_token_ids) + 1))\n\n        src_token_ids = [self.bos_id] + src_token_ids\n        src_type_ids = [0] * len(src_token_ids)\n        src_pos_ids = [0] + src_pos_ids\n        assert len(src_token_ids) == len(src_type_ids) == len(src_pos_ids), \\\n            \"not len(src_token_ids) == len(src_type_ids) == len(src_pos_ids)\"\n\n        token_ids = src_token_ids\n        type_ids = src_type_ids\n        pos_ids = src_pos_ids\n        tgt_start_idx = len(token_ids)\n\n        if not is_infer:\n            # process tgt\n            # tokenize tgt\n            tgt = tokenization.convert_to_unicode(example.tgt).strip()\n            if self.data_format == \"tokenized\":\n                tgt_tokens = tgt.split(\" \")\n            else:\n                tgt_tokens = self.tokenizer.tokenize(tgt)\n\n            tgt_token_ids = self.tokenizer.convert_tokens_to_ids(tgt_tokens)\n            tgt_token_ids.append(self.eos_id)\n\n            # trim tgt\n            if len(tgt_token_ids) > self.max_tgt_len - 1:\n                tgt_token_ids = tgt_token_ids[:self.max_tgt_len - 1]\n\n            tgt_token_ids = [self.bos_id] + tgt_token_ids\n            tgt_type_ids = [1] * len(tgt_token_ids)\n            tgt_pos_ids = list(range(1, len(tgt_token_ids) + 1))\n            assert len(tgt_token_ids) == len(tgt_type_ids) == len(tgt_pos_ids), \\\n                \"not len(tgt_token_ids) == len(tgt_type_ids) == len(tgt_pos_ids)\"\n\n            token_ids += tgt_token_ids\n            type_ids += tgt_type_ids\n            pos_ids += tgt_pos_ids\n\n        assert len(token_ids) == len(type_ids) == len(pos_ids), \\\n            \"not len(token_ids) == len(type_ids) == len(pos_ids)\"\n\n        if self.continuous_position:\n            src_pos_ids = list(range(len(src_token_ids)))\n            if not is_infer:\n                tgt_pos_ids = list(range(len(tgt_token_ids)))\n            pos_ids = list(range(len(token_ids)))\n\n        
field_values = {\n            \"token_ids\": src_token_ids,\n            \"type_ids\": src_type_ids,\n            \"pos_ids\": src_pos_ids\n        }\n        field_values[\"tgt_start_idx\"] = tgt_start_idx\n        field_values[\"data_id\"] = example.data_id\n\n        record = self.Record(**field_values)\n        return record\n\n    def _read_tsv(self, fp, phase, is_infer, delimiter=\"\\t\", quotechar=None):\n        \"\"\"Reads a tab separated value file.\"\"\"\n        csv.field_size_limit(2**20)\n        reader = csv.reader(fp, delimiter=delimiter, quotechar=quotechar)\n        headers = next(reader)\n        headers.append(\"data_id\")\n        Example = namedtuple(\"Example\", headers)\n\n        for i, line in enumerate(reader):\n            example = Example(*line, data_id=i)\n            if is_infer or phase.endswith(\"test\"):\n                self.features[phase][i] = example\n            record = self._convert_example_to_record(example, is_infer)\n            yield record\n\n    def _read_numerical_file(self, fp, delimiter=\";\"):\n        for i, line in enumerate(fp):\n            cols = tokenization.convert_to_unicode(line).strip().split(\n                delimiter)\n            cols = list(map(lambda x: list(map(int, x.split(\" \"))), cols))\n            if len(cols) > self.num_numerical_fields:\n                cols = cols[:self.num_numerical_fields]\n            tgt_start_idx = cols[0].index(self.bos_id, 1)\n            record = self.Record(*cols, tgt_start_idx=tgt_start_idx, data_id=i)\n            yield record\n\n    def _read_file(self, input_file, phase, is_infer):\n\n        def __wrapper__():\n            with open_file(input_file) as fp:\n                if self.data_format == \"numerical\":\n                    records = self._read_numerical_file(fp)\n                else:\n                    records = self._read_tsv(fp, phase, is_infer)\n                for record in records:\n                    yield record\n\n        return 
__wrapper__\n\n    def _read_files(self, filelist, phase, is_infer, shuffle_files):\n        input_files = open(filelist).readlines()\n\n        def __wrapper__():\n            if shuffle_files:\n                self.global_rng.shuffle(input_files)\n\n            if phase == \"train\":\n                self.total_file = len(input_files)\n            for file_index, input_file in enumerate(input_files, 1):\n                if phase == \"train\":\n                    self.current_file_index = file_index\n                    self.current_file = input_file\n                file_reader = self._read_file(input_file.strip(), phase,\n                                              is_infer)\n                for record in file_reader():\n                    yield record\n\n        return __wrapper__\n\n    def _batch_reader(self,\n                      reader,\n                      phase=None,\n                      is_infer=False,\n                      sort_pool_size=2**16):\n        \"\"\"Construct a batch reader.\"\"\"\n\n        def update_max_lens(max_lens, record):\n            \"\"\"Update max_lens.\"\"\"\n            if max_lens is None:\n                return self.sort_key(record)\n            else:\n                return [\n                    max(max_len, l)\n                    for max_len, l in zip(max_lens, self.sort_key(record))\n                ]\n\n        def get_batch(reader):\n            \"\"\"Generate batches from reader.\"\"\"\n            batch, max_lens = [], None\n            for record in reader():\n                if record is None:\n                    yield batch\n                    batch, max_lens = [], None\n                    continue\n\n                self.current_example += 1\n                max_lens = update_max_lens(max_lens, record)\n                if self.in_tokens:\n                    to_append = (len(batch) +\n                                 1) * sum(max_lens) <= self.batch_size\n                else:\n                    
to_append = len(batch) < self.batch_size\n                if to_append:\n                    batch.append(record)\n                else:\n                    yield batch\n                    batch, max_lens = [record], self.sort_key(record)\n\n            if len(batch) > 0:\n                yield batch\n\n        def get_sorted_batch(pool):\n            \"\"\"Generate sorted batches from pool.\"\"\"\n            pool = sorted(pool, key=self.sort_key)\n            batches = []\n            batch, max_lens = [], None\n            for record in pool:\n                self.current_example += 1\n                max_lens = update_max_lens(max_lens, record)\n                if self.in_tokens:\n                    to_append = (len(batch) +\n                                 1) * sum(max_lens) <= self.batch_size\n                else:\n                    to_append = len(batch) < self.batch_size\n                if to_append:\n                    batch.append(record)\n                else:\n                    batches.append(batch)\n                    batch, max_lens = [record], self.sort_key(record)\n\n            if len(batch) > 0:\n                batches.append(batch)\n            self.global_rng.shuffle(batches)\n\n            for batch in batches:\n                yield batch\n\n        def __wrapper__():\n            if sort_pool_size > 0:\n                pool = []\n                for record in reader():\n                    pool.append(record)\n                    if len(pool) == sort_pool_size:\n                        for batch in get_sorted_batch(pool):\n                            yield batch\n                        pool = []\n                if len(pool) > 0:\n                    for batch in get_sorted_batch(pool):\n                        yield batch\n            else:\n                for batch in get_batch(reader):\n                    yield batch\n\n        return __wrapper__\n\n    def _distributed_batch_reader(self,\n                                  
batch_reader,\n                                  num_part,\n                                  part_id,\n                                  is_test=False):\n\n        def __wrapper__():\n            batches = []\n            for batch in batch_reader():\n                batches.append(batch)\n                if len(batches) == num_part:\n                    yield batches[part_id]\n                    batches = []\n            if is_test and 0 <= part_id < len(batches):\n                yield batches[part_id]\n            return\n\n        return __wrapper__\n\n    def data_generator(self,\n                       input_file=None,\n                       reader=None,\n                       num_epochs=1,\n                       num_part=1,\n                       part_id=0,\n                       phase=None,\n                       is_infer=False):\n        \"\"\"Data generator.\"\"\"\n\n        def __wrapper__():\n            if is_infer or phase.endswith(\"test\"):\n                self.features[phase] = {}\n\n            nonlocal reader\n            if reader is None:\n                if self.file_format == \"filelist\":\n                    reader = self._read_files(input_file, phase, is_infer,\n                                              not phase.endswith(\"test\"))\n                else:\n                    if phase == \"train\":\n                        self.total_file = 1\n                        self.current_file_index = 1\n                        self.current_file = input_file\n                    reader = self._read_file(input_file, phase, is_infer)\n\n            batch_reader = self._batch_reader(\n                reader,\n                phase,\n                is_infer,\n                sort_pool_size=self.sort_pool_size if not is_infer else 0)\n            if phase == \"train\":\n                batch_reader = self._distributed_batch_reader(\n                    batch_reader, num_part, part_id)\n            elif phase.startswith(\"distributed\"):\n  
              batch_reader = self._distributed_batch_reader(batch_reader,\n                                                              num_part,\n                                                              part_id,\n                                                              is_test=True)\n\n            for epoch_index in range(num_epochs):\n                if phase == \"train\":\n                    self.current_example = 0\n                    self.current_epoch = epoch_index + 1\n                for batch in batch_reader():\n                    yield self._pad_batch_records(batch, is_infer)\n\n        return __wrapper__\n\n    def _gen_self_attn_mask(self,\n                            batch_token_ids,\n                            batch_tgt_start_idx=None,\n                            is_unidirectional=True,\n                            shift_len=0):\n        max_len = max(map(len, batch_token_ids))\n        input_mask_data = np.zeros(\n            (len(batch_token_ids), max_len + shift_len, max_len + shift_len))\n        if is_unidirectional:\n            for index, mask_data in enumerate(input_mask_data):\n                start = 0 if batch_tgt_start_idx is None else batch_tgt_start_idx[\n                    index]\n                end = len(batch_token_ids[index])\n                mask_data[:end + shift_len, :start + shift_len] = 1.0\n                # Generate the lower triangular matrix using the slice of matrix\n                b = np.tril(np.ones([end - start, end - start]), 0)\n                mask_data[start + shift_len:end + shift_len,\n                          start + shift_len:end + shift_len] = b\n        else:\n            for index, token_ids in enumerate(batch_token_ids):\n                input_mask_data[index, :len(token_ids) +\n                                shift_len, :len(token_ids) + shift_len] = 1.0\n        return input_mask_data.astype(\"float32\")\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n      
  Padding batch records and construct model's inputs.\n        \"\"\"\n        batch_size = len(batch_records)\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n        batch[\"token_ids\"] = pad_batch_data(batch_token_ids, pad_id=self.pad_id)\n        batch[\"type_ids\"] = pad_batch_data(batch_type_ids, pad_id=self.pad_id)\n        batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n        batch[\"generation_mask\"] = self._gen_self_attn_mask(\n            batch_token_ids, batch_tgt_start_idx=batch_tgt_start_idx)\n\n        if is_infer:\n            tgt_ids = np.array([[[self.bos_id]]] * len(batch_token_ids),\n                               dtype=\"int64\")\n            if self.continuous_position:\n                tgt_pos = np.array(batch_tgt_start_idx, dtype=\"int64\")\n            else:\n                tgt_pos = np.zeros_like(batch_tgt_start_idx, dtype=\"int64\")\n            tgt_pos = tgt_pos.reshape(-1, 1, 1)\n            batch[\"init_score\"] = np.zeros_like(\n                tgt_ids, dtype=\"float32\").reshape(-1, 1).tolist()\n            batch[\"tgt_ids\"] = tgt_ids.tolist()\n            batch[\"tgt_pos\"] = tgt_pos.tolist()\n\n            batch[\"tgt_generation_mask\"] = batch[\n                \"generation_mask\"][:, 0:1, :].astype(\"float32\")\n        else:\n            batch[\"tgt_label\"], batch[\"tgt_pos\"] = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                sent_b_starts=batch_tgt_start_idx,\n                is_unidirectional=True)\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            
[-1, 1])\n        return batch\n\n\n@contextmanager\ndef open_file(filename):\n    \"\"\"Open a plain or gzip text file, ensuring it is closed on exit.\"\"\"\n    if filename.endswith(\".gz\"):\n        fp = gzip.open(filename, \"rt\")\n    else:\n        fp = open(filename)\n    try:\n        yield fp\n    finally:\n        fp.close()\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/readers/nsp_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"NSP Reader.\"\"\"\n\nfrom collections import namedtuple\n\nimport numpy as np\n\nfrom .dialog_reader import DialogReader\nfrom ..utils import pad_batch_data\nfrom ..utils.args import str2bool\nfrom ..utils.masking import mask\n\n\nclass NSPReader(DialogReader):\n    \"\"\"NSP Reader.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline arguments.\"\"\"\n        group = DialogReader.add_cmdline_args(parser)\n        group.add_argument(\"--attention_style\",\n                           type=str,\n                           default=\"bidirectional\",\n                           choices=[\"bidirectional\", \"unidirectional\"])\n        group.add_argument(\"--mix_negative_sample\",\n                           type=str2bool,\n                           default=False)\n        return group\n\n    def __init__(self, args):\n        super(NSPReader, self).__init__(args)\n        self.fields.append(\"label\")\n        self.Record = namedtuple(\"Record\",\n                                 self.fields,\n                                 defaults=(None, ) * len(self.fields))\n\n        self.attention_style = args.attention_style\n        self.mix_negative_sample = args.mix_negative_sample\n        return\n\n    def _convert_example_to_record(self, example, is_infer):\n        record = 
super(NSPReader,\n                       self)._convert_example_to_record(example, False)\n        if \"label\" in example._fields:\n            record = record._replace(label=int(example.label))\n        return record\n\n    def _mix_negative_sample(self, reader, neg_pool_size=2**16):\n\n        def gen_from_pool(pool):\n            num_samples = len(pool)\n            if num_samples == 1:\n                # only one sample: it is impossible to generate negative sample\n                yield pool[0]._replace(label=1)\n                return\n            self.global_rng.shuffle(pool)\n            for i in range(num_samples):\n                pool[i] = pool[i]._replace(label=1)\n                j = (i + 1) % num_samples\n                idx_i = pool[i].tgt_start_idx\n                idx_j = pool[j].tgt_start_idx\n                field_values = {}\n                field_values[\"token_ids\"] = pool[i].token_ids[:idx_i] + pool[\n                    j].token_ids[idx_j:]\n                field_values[\"type_ids\"] = pool[i].type_ids[:idx_i] + pool[\n                    j].type_ids[idx_j:]\n                field_values[\"pos_ids\"] = list(\n                    range(len(field_values[\"token_ids\"])))\n                neg_record = self.Record(**field_values,\n                                         tgt_start_idx=idx_i,\n                                         data_id=-1,\n                                         label=0)\n                pool.append(neg_record)\n                assert len(neg_record.token_ids) <= self.max_seq_len\n            self.global_rng.shuffle(pool)\n            for record in pool:\n                yield record\n\n        def __wrapper__():\n            pool = []\n            for record in reader():\n                pool.append(record)\n                if len(pool) == neg_pool_size:\n                    for record in gen_from_pool(pool):\n                        yield record\n                    pool = []\n            if len(pool) > 0:\n           
     for record in gen_from_pool(pool):\n                    yield record\n\n        return __wrapper__\n\n    def _batch_reader(self,\n                      reader,\n                      phase=None,\n                      is_infer=False,\n                      sort_pool_size=2**16):\n        if self.mix_negative_sample:\n            reader = self._mix_negative_sample(reader)\n        return super(NSPReader,\n                     self)._batch_reader(reader,\n                                         phase=phase,\n                                         is_infer=is_infer,\n                                         sort_pool_size=sort_pool_size)\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n        Padding batch records and construct model's inputs.\n        \"\"\"\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n        batch_label = [record.label for record in batch_records]\n\n        if self.attention_style == \"unidirectional\":\n            batch[\"token_ids\"] = pad_batch_data(batch_token_ids,\n                                                pad_id=self.pad_id)\n            batch[\"type_ids\"] = pad_batch_data(batch_type_ids,\n                                               pad_id=self.pad_id)\n            batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n            tgt_label, tgt_pos, label_pos = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                bos_id=self.bos_id,\n                sent_b_starts=batch_tgt_start_idx,\n                labels=batch_label,\n                is_unidirectional=True)\n            attention_mask = self._gen_self_attn_mask(batch_token_ids,\n             
                                         batch_tgt_start_idx)\n        else:\n            batch_mask_token_ids, tgt_label, tgt_pos, label_pos = mask(\n                batch_tokens=batch_token_ids,\n                vocab_size=self.vocab_size,\n                bos_id=self.bos_id,\n                eos_id=self.eos_id,\n                mask_id=self.mask_id,\n                sent_b_starts=batch_tgt_start_idx,\n                labels=batch_label,\n                is_unidirectional=False)\n            if not is_infer:\n                batch_token_ids = batch_mask_token_ids\n            batch[\"token_ids\"] = pad_batch_data(batch_token_ids,\n                                                pad_id=self.pad_id)\n            batch[\"type_ids\"] = pad_batch_data(batch_type_ids,\n                                               pad_id=self.pad_id)\n            batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n            attention_mask = self._gen_self_attn_mask(batch_token_ids,\n                                                      is_unidirectional=False)\n\n        batch[\"attention_mask\"] = attention_mask\n        batch[\"label_pos\"] = label_pos\n\n        if not is_infer:\n            batch_label = np.array(batch_label).astype(\"int64\").reshape([-1, 1])\n            batch[\"label\"] = batch_label\n            batch[\"tgt_label\"] = tgt_label\n            batch[\"tgt_pos\"] = tgt_pos\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            [-1, 1])\n        return batch\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/readers/plato_reader.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Plato Reader.\"\"\"\n\nimport numpy as np\n\nfrom .dialog_reader import DialogReader\nfrom ..utils import pad_batch_data\nfrom ..utils.masking import mask\n\n\nclass PlatoReader(DialogReader):\n    \"\"\"The implement of PlatoReader\"\"\"\n\n    def __init__(self, args):\n        super(PlatoReader, self).__init__(args)\n        self.latent_type_size = args.latent_type_size\n        self.use_bow = args.use_bow\n\n    def _pad_batch_records(self, batch_records, is_infer):\n        \"\"\"\n        Padding batch records and construct model's inputs.\n        \"\"\"\n        batch = {}\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_type_ids = [record.type_ids for record in batch_records]\n        batch_pos_ids = [record.pos_ids for record in batch_records]\n\n        batch_tgt_start_idx = [record.tgt_start_idx for record in batch_records]\n\n        batch_size = len(batch_token_ids)\n\n        # padding\n        batch[\"token_ids\"] = pad_batch_data(batch_token_ids, pad_id=self.pad_id)\n        batch[\"type_ids\"] = pad_batch_data(batch_type_ids, pad_id=self.pad_id)\n        batch[\"pos_ids\"] = pad_batch_data(batch_pos_ids, pad_id=self.pad_id)\n\n        batch[\"generation_mask\"] = self._gen_self_attn_mask(\n            batch_token_ids,\n            
batch_tgt_start_idx=batch_tgt_start_idx,\n            is_unidirectional=True,\n            shift_len=1)\n        if not is_infer:\n            batch[\"recognition_mask\"] = self._gen_self_attn_mask(\n                batch_token_ids, is_unidirectional=False, shift_len=1)\n\n        if is_infer:\n            tgt_ids = np.array([[[self.bos_id]]] * batch_size, dtype=\"int64\")\n            if self.continuous_position:\n                tgt_pos = np.array(batch_tgt_start_idx, dtype=\"int64\")\n            else:\n                tgt_pos = np.zeros_like(batch_tgt_start_idx, dtype=\"int64\")\n            tgt_pos = tgt_pos.reshape(-1, 1, 1)\n            batch[\"init_score\"] = np.zeros_like(\n                tgt_ids, dtype=\"float32\").reshape(-1, 1).tolist()\n            batch[\"tgt_ids\"] = tgt_ids.tolist()\n            batch[\"tgt_pos\"] = tgt_pos.tolist()\n            batch[\"parent_idx\"] = np.array(range(batch_size), dtype=\"int32\")\n\n            batch[\"tgt_generation_mask\"] = batch[\n                \"generation_mask\"][:, 0:1, :].astype(\"float32\")\n        else:\n            mask_return_list = mask(batch_tokens=batch_token_ids,\n                                    vocab_size=self.vocab_size,\n                                    sent_b_starts=batch_tgt_start_idx,\n                                    is_unidirectional=True,\n                                    use_latent=True,\n                                    use_bow=self.use_bow)\n            batch[\"tgt_label\"] = mask_return_list[0]\n            batch[\"tgt_pos\"] = mask_return_list[1]\n            if self.use_bow:\n                batch[\"bow_label\"] = mask_return_list[2]\n                batch[\"bow_pos\"] = mask_return_list[3]\n\n        batch_data_id = [record.data_id for record in batch_records]\n        batch[\"data_id\"] = np.array(batch_data_id).astype(\"int64\").reshape(\n            [-1, 1])\n        return batch\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/utils/__init__.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Utils.\"\"\"\n\nfrom itertools import chain\nimport numpy as np\nimport paddle\n\n\ndef repeat_array(array, times):\n    \"\"\"Repeat a numpy array or list.\"\"\"\n    if isinstance(array, list):\n        return list(chain(*([array] * times)))\n    else:\n        return np.concatenate([array] * times, axis=0)\n\n\ndef gen_inputs(inputs, latent_type_size):\n    batch_size = len(inputs[\"data_id\"])\n    new_bsz = batch_size * latent_type_size\n    inputs = {\n        name: repeat_array(array, latent_type_size)\n        for name, array in inputs.items()\n    }\n    # Add latent_id\n    inputs[\"latent_id\"] = np.array(\n        [i for i in range(latent_type_size) for _ in range(batch_size)],\n        dtype=\"int64\").reshape([-1, 1])\n\n    #print('\\nplato_inputs:')\n    for key in inputs:\n        inputs[key] = paddle.to_tensor(inputs[key])\n        if key in [\n                'token_ids', 'type_ids', 'pos_ids', 'tgt_ids', 'tgt_pos',\n                'data_id'\n        ]:\n            inputs[key] = paddle.squeeze(inputs[key], axis=-1)\n        #print(key, inputs[key].shape, inputs[key].dtype)\n    return inputs\n\n\ndef pad_batch_data(insts, pad_id=0):\n    \"\"\"Pad the instances to the max sequence length in batch. 
\"\"\"\n    max_len = max(map(len, insts))\n    inst_data = np.array(\n        [list(inst) + [pad_id] * (max_len - len(inst)) for inst in insts])\n    return inst_data.astype(\"int64\").reshape([-1, max_len, 1])\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/utils/args.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Parse argument.\"\"\"\n\nimport argparse\nimport json\n\n\ndef str2bool(v):\n    \"\"\" Support bool type for argparse. \"\"\"\n    if v.lower() in (\"yes\", \"true\", \"t\", \"y\", \"1\"):\n        return True\n    elif v.lower() in (\"no\", \"false\", \"f\", \"n\", \"0\"):\n        return False\n    else:\n        raise argparse.ArgumentTypeError(\"Unsupported value encountered.\")\n\n\nclass Args(dict):\n    \"\"\" Arguments class\n\n    Store arguments in training / infer / ... 
scripts.\n    \"\"\"\n\n    def __getattr__(self, name):\n        if name in self.keys():\n            return self[name]\n        for v in self.values():\n            if isinstance(v, Args):\n                if name in v:\n                    return v[name]\n        return None\n\n    def get(self, key, default_value=None):\n        \"\"\"Get the value of corresponding key.\"\"\"\n        if key in self.keys():\n            return self[key]\n        for v in self.values():\n            if isinstance(v, Args):\n                if key in v:\n                    return v[key]\n        return default_value\n\n    def __setattr__(self, name, value):\n        self[name] = value\n\n    def save(self, filename):\n        with open(filename, \"w\") as fp:\n            json.dump(self, fp, ensure_ascii=False, indent=4, sort_keys=False)\n\n    def load(self, filename, group_name=None):\n        if group_name is not None:\n            if group_name not in self:\n                self[group_name] = Args()\n            self[group_name].load(filename)\n            return\n        with open(filename, \"r\") as fp:\n            params_dict = json.load(fp)\n        for k, v in params_dict.items():\n            if isinstance(v, dict):\n                self[k].update(Args(v))\n            else:\n                self[k] = v\n\n\ndef parse_args(parser: argparse.ArgumentParser, allow_unknown=False) -> Args:\n    \"\"\" Parse hyper-parameters from cmdline. 
\"\"\"\n    if allow_unknown:\n        parsed, _ = parser.parse_known_args()\n    else:\n        parsed = parser.parse_args()\n    args = Args()\n    optional_args = parser._action_groups[1]\n    for action in optional_args._group_actions[1:]:\n        arg_name = action.dest\n        args[arg_name] = getattr(parsed, arg_name)\n    for group in parser._action_groups[2:]:\n        group_args = Args()\n        for action in group._group_actions:\n            arg_name = action.dest\n            group_args[arg_name] = getattr(parsed, arg_name)\n        if len(group_args) > 0:\n            if group.title in args:\n                args[group.title].update(group_args)\n            else:\n                args[group.title] = group_args\n    return args\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/utils/masking.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Reader utils.\"\"\"\n\nimport numpy as np\n\n\ndef mask(batch_tokens,\n         vocab_size,\n         bos_id=1,\n         eos_id=2,\n         mask_id=3,\n         sent_b_starts=None,\n         labels=None,\n         is_unidirectional=False,\n         use_latent=False,\n         use_bow=False):\n    \"\"\"\n    Add mask for batch_tokens, return out, mask_label, mask_pos;\n    Note: mask_pos responding the batch_tokens after padded;\n    \"\"\"\n    batch_tokens = np.copy(batch_tokens)\n    max_len = max(map(len, batch_tokens))\n    mask_label = []\n    mask_pos = []\n    if labels is not None:\n        label_pos = []\n\n    if is_unidirectional:\n        # unidirectional language model\n        if use_latent:\n            max_len += 1\n            shift_len = 1\n        else:\n            shift_len = 0\n        for sent_index, sent in enumerate(batch_tokens):\n            sent_b_index = sent_b_starts[\n                sent_index] if sent_b_starts is not None else 0\n            need_cal = True\n            if labels is not None:\n                label_pos.append(sent_index * max_len + len(sent) - 1 +\n                                 shift_len)\n                if labels[sent_index] == 0:\n                    need_cal = False\n            mask_label.extend(sent[sent_b_index + 1:])\n            mask_pos.extend([\n                
sent_index * max_len + i + shift_len\n                for i in range(sent_b_index,\n                               len(sent) - 1)\n            ])\n        mask_label = np.array(mask_label).astype(\"int64\").reshape([-1, 1])\n        mask_pos = np.array(mask_pos).astype(\"int64\").reshape([-1, 1])\n        return_list = [mask_label, mask_pos]\n\n        # latent related (bow label and pos)\n        if use_latent and use_bow:\n            bow_label = []\n            bow_pos = []\n            for sent_index, sent in enumerate(batch_tokens):\n                sent_b_index = sent_b_starts[\n                    sent_index] if sent_b_starts is not None else 0\n\n                def __filter__(tok_id):\n                    # TODO: exclude [EOS] from bow loss\n                    return True\n\n                bow_pos.extend([\n                    sent_index for i in range(sent_b_index + 1, len(sent))\n                    if __filter__(sent[i])\n                ])\n                bow_label.extend([\n                    sent[i] for i in range(sent_b_index + 1, len(sent))\n                    if __filter__(sent[i])\n                ])\n            bow_label = np.array(bow_label).astype(\"int64\").reshape([-1, 1])\n            bow_pos = np.array(bow_pos).astype(\"int64\").reshape([-1, 1])\n            return_list += [bow_label, bow_pos]\n    else:\n        # bidirectional mask language model\n        total_token_num = sum(map(len, batch_tokens))\n        prob_mask = np.random.rand(total_token_num)\n        # TODO: fix replace_ids, include [UNK]\n        replace_ids = np.random.randint(3,\n                                        high=vocab_size,\n                                        size=total_token_num)\n        prob_index = 0\n        for sent_index, sent in enumerate(batch_tokens):\n            # add pair label position\n            if labels is not None:\n                label_pos.append(sent_index * max_len)\n\n            # add mask label and position\n            for 
token_index, token in enumerate(sent):\n                if token == eos_id or token == bos_id:\n                    continue\n                prob = prob_mask[prob_index + token_index]\n                if prob > 0.15:\n                    continue\n                elif 0.03 < prob <= 0.15:\n                    # mask\n                    mask_label.append(sent[token_index])\n                    sent[token_index] = mask_id\n                    mask_pos.append(sent_index * max_len + token_index)\n                elif 0.015 < prob <= 0.03:\n                    # random replace\n                    mask_label.append(sent[token_index])\n                    sent[token_index] = replace_ids[prob_index + token_index]\n                    mask_pos.append(sent_index * max_len + token_index)\n                else:\n                    # keep the original token\n                    mask_label.append(sent[token_index])\n                    mask_pos.append(sent_index * max_len + token_index)\n\n            prob_index += len(sent)\n\n        mask_label = np.array(mask_label).astype(\"int64\").reshape([-1, 1])\n        mask_pos = np.array(mask_pos).astype(\"int64\").reshape([-1, 1])\n        return_list = [batch_tokens, mask_label, mask_pos]\n\n    if labels is not None:\n        label_pos = np.array(label_pos).astype(\"int64\").reshape([-1, 1])\n        assert len(labels) == len(label_pos)\n        return_list.append(label_pos)\n    return return_list\n"
  },
  {
    "path": "modules/text/text_generation/plato2_en_large/utils/tokenization.py",
    "content": "#   Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Tokenization classes.\"\"\"\n\nimport collections\nimport sentencepiece as spm\nimport unicodedata\n\nfrom .args import str2bool\n\n\ndef clean_text(text):\n    \"\"\"Performs invalid character removal and whitespace cleanup on text.\"\"\"\n    text = text.replace(u\"“\", u'\"')\\\n        .replace(u'”', u'\"')\\\n        .replace(u'‘', \"'\")\\\n        .replace(u'’', u\"'\")\\\n        .replace(u'—', u'-')\n\n    output = []\n    for char in text:\n        if _is_control(char):\n            continue\n        if _is_whitespace(char):\n            output.append(\" \")\n        else:\n            output.append(char)\n    return \"\".join(output)\n\n\ndef preprocess_text(inputs, remove_space=True, lower=False):\n    \"\"\"preprocess data by removing extra space and normalize data.\"\"\"\n    outputs = inputs\n    if remove_space:\n        outputs = \" \".join(inputs.strip().split())\n\n    outputs = unicodedata.normalize(\"NFKD\", outputs)\n    outputs = \"\".join([c for c in outputs if not unicodedata.combining(c)])\n    if lower:\n        outputs = outputs.lower()\n\n    return outputs\n\n\ndef encode_pieces(spm_model, text, return_unicode=True, sample=False):\n    \"\"\"turn sentences into word pieces.\"\"\"\n    # liujiaxiang: add for ernie-albert, mainly consider for “/”/‘/’/— causing too many unk\n    text = 
clean_text(text)\n\n    if not sample:\n        pieces = spm_model.EncodeAsPieces(text)\n    else:\n        pieces = spm_model.SampleEncodeAsPieces(text, 64, 0.1)\n\n    return pieces\n\n\ndef encode_ids(spm_model, text, sample=False):\n    \"\"\"turn sentences into word piece ids.\"\"\"\n    pieces = encode_pieces(spm_model, text, return_unicode=False, sample=sample)\n    ids = [spm_model.PieceToId(piece) for piece in pieces]\n    return ids\n\n\ndef convert_to_unicode(text):\n    \"\"\"Converts `text` to Unicode (if it's not already), assuming utf-8 input.\"\"\"\n    if isinstance(text, str):\n        return text\n    elif isinstance(text, bytes):\n        return text.decode(\"utf-8\", \"ignore\")\n    else:\n        raise ValueError(\"Unsupported string type: %s\" % (type(text)))\n\n\ndef load_vocab(vocab_file):\n    \"\"\"Loads a vocabulary file into a dictionary.\"\"\"\n    vocab = collections.OrderedDict()\n    fin = open(vocab_file, 'r', encoding=\"UTF-8\")\n    for num, line in enumerate(fin):\n        items = convert_to_unicode(line.rstrip()).split(\"\\t\")\n        if len(items) > 2:\n            break\n        token = items[0]\n        index = items[1] if len(items) == 2 else num\n        token = token.strip()\n        vocab[token] = int(index)\n    return vocab\n\n\ndef convert_by_vocab(vocab, items):\n    \"\"\"Converts a sequence of [tokens|ids] using the vocab.\"\"\"\n    output = []\n    for item in items:\n        output.append(vocab[item])\n    return output\n\n\nclass SentencePieceTokenizer(object):\n    \"\"\"Runs end-to-end tokenization.\"\"\"\n\n    @classmethod\n    def add_cmdline_args(cls, parser):\n        \"\"\"Add cmdline arguments.\"\"\"\n        group = parser.add_argument_group(\"Tokenizer\")\n        group.add_argument(\"--vocab_path\", type=str, required=True)\n        group.add_argument(\"--do_lower_case\", type=str2bool, default=False)\n        group.add_argument(\"--spm_model_file\", type=str, required=True)\n        return group\n\n    def __init__(self, args):\n        self.spm_model = spm.SentencePieceProcessor()\n        self.spm_model.Load(args.spm_model_file)\n        self.vocab = load_vocab(args.vocab_path)\n        self.do_lower_case = args.do_lower_case\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n\n    def tokenize(self, text):\n        \"\"\"Tokenizes a piece of text.\"\"\"\n        text = preprocess_text(text, lower=self.do_lower_case)\n        return encode_pieces(self.spm_model, text, return_unicode=True)\n\n    def convert_tokens_to_ids(self, tokens):\n        \"\"\"Convert tokens to ids.\"\"\"\n        ret = []\n        unk_id = self.vocab[\"<unk>\"]\n        for token in tokens:\n            if token in self.vocab:\n                ret.append(self.vocab[token])\n            else:\n                ret.append(unk_id)\n        return ret\n\n    def convert_ids_to_tokens(self, ids):\n        \"\"\"Convert ids to tokens.\"\"\"\n        return convert_by_vocab(self.inv_vocab, ids)\n\n    def merge_subword(self, tokens):\n        \"\"\"Merge subword.\"\"\"\n        ret = []\n        for token in tokens:\n            if token.startswith(u\"▁\"):\n                ret.append(token[1:])\n            else:\n                if len(ret):\n                    ret[-1] += token\n                else:\n                    ret.append(token)\n\n        ret = [token for token in ret if token]\n        return ret\n\n    def convert_ids_to_str(self, ids):\n        \"\"\"Convert ids to string.\"\"\"\n        tokens = self.convert_ids_to_tokens(ids)\n        tokens = self.merge_subword(tokens)\n        res = \" \".join(tokens).replace(\"<s>\", \"\")\n        res = res.replace(\"</s>\", \"\\n\").replace(\"\\n \", \"\\n\").strip()\n        return res\n\n\ndef _is_whitespace(char):\n    \"\"\"Checks whether `char` is a whitespace character.\"\"\"\n    # \\t, \\n, and \\r are technically control characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == \" \" or char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return True\n    cat = unicodedata.category(char)\n    if cat == \"Zs\":\n        return True\n    return False\n\n\ndef _is_control(char):\n    \"\"\"Checks whether `char` is a control character.\"\"\"\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == \"\\t\" or char == \"\\n\" or char == \"\\r\":\n        return False\n    cat = unicodedata.category(char)\n    if cat.startswith(\"C\"):\n        return True\n    return False\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems/module.py",
    "content": "import argparse\r\nimport ast\r\nimport os\r\n\r\nfrom translate import Translator\r\n\r\nimport paddlehub as hub\r\nfrom paddlehub.module.module import moduleinfo\r\nfrom paddlehub.module.module import runnable\r\n\r\n\r\n@moduleinfo(name=\"reading_pictures_writing_poems\",\r\n            version=\"1.1.0\",\r\n            summary=\"Just for test\",\r\n            author=\"Mr.郑先生_\",\r\n            author_email=\"2733821739@qq.com\",\r\n            type=\"nlp/text_generation\")\r\nclass ReadingPicturesWritingPoems:\r\n\r\n    def __init__(self):\r\n        \"\"\"\r\n        Initialize with the necessary elements\r\n        \"\"\"\r\n        self.module_image = hub.Module(name=\"efficientnetb4_imagenet\")  # 调用图像分类的模型\r\n        self.module_similar = hub.Module(name=\"ernie_gen_couplet\")  # 调用对联生成的模型\r\n        self.module_poem = hub.Module(name=\"ernie_gen_poetry\")  # 调用古诗生成的模型\r\n\r\n    def is_chinese(self, string):\r\n        \"\"\"\r\n        检查整个字符串是否为中文\r\n        Args:\r\n            string (str): 需要检查的字符串,包含空格也是False\r\n        Return\r\n            bool\r\n        \"\"\"\r\n        if (len(string) <= 1):  # 去除只有单个字或者为空的字符串\r\n            return False\r\n\r\n        for chart in string:  # 把除了中文的所有字母、数字、符号去除\r\n            if (chart < u'\\u4e00' or chart > u'\\u9fff'):\r\n                return False\r\n\r\n        return True\r\n\r\n    def WritingPoem(self, image, use_gpu=False):\r\n        results_image = self.module_image.classification(paths=[image])\r\n        PictureClassification = list(results_image[0].keys())[0]\r\n        translator = Translator(to_lang=\"chinese\")\r\n        PictureClassification_ch = translator.translate(\"{}\".format(PictureClassification))\r\n        texts = [\"{}\".format(PictureClassification_ch)]\r\n        results_keywords = self.module_similar.generate(texts=texts, use_gpu=use_gpu, beam_width=20)\r\n        Words = []  # 将符合标准的近义词保存在这里（标准：字符串为中文且长度大于1）\r\n        for item in range(20):\r\n            
if (self.is_chinese(results_keywords[0][item])):\r\n                Words.append(results_keywords[0][item])\r\n        # 古诗的一句可以拆分成许多词语，因此这里先找到能合成古诗的词语\r\n        FirstWord = Words[0] + Words[1]\r\n        SecondWord = Words[2] + Words[3]\r\n        ThirdWord = Words[4] + Words[5]\r\n        FourthWord = Words[6] + Words[7]\r\n        # 出句和对句，也可以理解为上下句（专业讲法是出句和对句，古诗词是中国传统文化，出句和对句的英文翻译即拼音）\r\n        ChuJu = FirstWord + SecondWord  # 出句\r\n        DuiJu = ThirdWord + FourthWord  # 对句\r\n        FirstPoetry = [\"{:.5}，{:.5}。\".format(ChuJu, DuiJu)]  # 古诗词的上阕\r\n        results = self.module_poem.generate(texts=FirstPoetry, use_gpu=use_gpu, beam_width=5)\r\n        SecondPoetry = [\"{:.12}\".format(results[0][0])]\r\n        poetry = []\r\n        poetry.append(FirstPoetry)\r\n        poetry.append(SecondPoetry)\r\n        print(\"根据图片生成的古诗词：\")\r\n        print(\"{}\".format(poetry[0][0]))\r\n        print(\"{}\".format(poetry[1][0]))\r\n        results = [{'image': image, 'poetry': \"{}\".format(poetry[0][0] + poetry[1][0])}]\r\n\r\n        return results\r\n\r\n    @runnable\r\n    def run_cmd(self, argvs):\r\n        \"\"\"\r\n        Run as a command.\r\n        \"\"\"\r\n        self.parser = argparse.ArgumentParser(description='Run the %s module.' % self.name,\r\n                                              prog='hub run %s' % self.name,\r\n                                              usage='%(prog)s',\r\n                                              add_help=True)\r\n\r\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\r\n        self.arg_config_group = self.parser.add_argument_group(\r\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\r\n\r\n        self.add_module_config_arg()\r\n        self.add_module_input_arg()\r\n\r\n        args = self.parser.parse_args(argvs)\r\n\r\n        try:\r\n            input_data = self.check_input_data(args)\r\n        except RuntimeError:\r\n            self.parser.print_help()\r\n            return None\r\n\r\n        results = self.WritingPoem(input_data, use_gpu=args.use_gpu)\r\n\r\n        return results\r\n\r\n    def add_module_config_arg(self):\r\n        \"\"\"\r\n        Add the command config options.\r\n        \"\"\"\r\n        self.arg_config_group.add_argument('--use_gpu',\r\n                                           type=ast.literal_eval,\r\n                                           default=False,\r\n                                           help=\"whether to use GPU for prediction\")\r\n\r\n    def add_module_input_arg(self):\r\n        \"\"\"\r\n        Add the command input options.\r\n        \"\"\"\r\n        self.arg_input_group.add_argument('--input_image', type=str, default=None, help=\"Pictures to write poetry\")\r\n\r\n    def check_input_data(self, args):\r\n        input_data = []\r\n        if args.input_image:\r\n            if not os.path.exists(args.input_image):\r\n                raise RuntimeError(\"File %s does not exist.\" % args.input_image)\r\n            else:\r\n                input_data = args.input_image\r\n\r\n        if input_data == []:\r\n            raise RuntimeError(\"The input data is inconsistent with expectations.\")\r\n\r\n        return input_data\r\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems/readme.md",
"content": "# reading_pictures_writing_poems\n\n| 模型名称            | reading_pictures_writing_poems |\n| :------------------ | :----------------------------: |\n| 类别                |         文本-文本生成          |\n| 网络                |           多网络级联           |\n| 数据集              |               -                |\n| 是否支持Fine-tuning |               否               |\n| 模型大小            |             3.16K              |\n| 最新更新日期        |           2021-04-26           |\n| 数据指标            |               -                |\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n<p align=\"center\">\n<img src=\"https://user-images.githubusercontent.com/76040149/133189274-ff86801f-017f-460e-adb0-1d381d74aff6.jpg\" width=\"300\">\n</p>\n\n  - 输入以上图片生成的古诗是：\n\n     - 蕾蕾海河海，岳峰岳麓蔓。\n     - 不萌枝上春，自结心中线。\n\n- ### 模型介绍\n  - 看图写诗（reading_pictures_writing_poems），该模型可自动根据图像生成古诗词。该PaddleHub Module支持预测。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.8.2\n\n  - paddlehub >= 1.8.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n  - translate\n\n    - ```shell\n      $ pip install translate\n      ```\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install reading_pictures_writing_poems\n    ```\n\n    - 本模型还需要用到efficientnetb4_imagenet, ernie_gen_couplet, ernie_gen_poetry这3个模型\n    - 若您未安装这3个模型，代码运行时会自动帮您下载\n\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n   | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run reading_pictures_writing_poems --input_image \"/PATH/TO/IMAGE\"\n    ```\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    readingPicturesWritingPoems = hub.Module(name=\"reading_pictures_writing_poems\")\n    results = readingPicturesWritingPoems.WritingPoem(image=\"/PATH/TO/IMAGE\", use_gpu=False)\n\n    for result in results:\n        print(result)\n    ```\n\n- ### 3、API\n\n  - ```python\n    def WritingPoem(image, use_gpu=False):\n    ```\n\n     - 看图写诗预测接口，预测输入一张图像，输出一首古诗词\n     - **参数**\n         - image(str): 待检测的图片路径\n         - use_gpu (bool): 是否使用 GPU\n     - **返回**\n         - results (list\\[dict\\]): 识别结果的列表，列表中每一个元素为 dict，关键字有 image, poetry\n           - image: 为原输入图片的路径\n           - poetry: 字段为输出的古诗词\n\n## 四、服务部署\n\n- 本模型不支持hub serving\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除 Fluid API，更换分类模型\n\n  - ```shell\n    $ hub install reading_pictures_writing_poems==1.1.0\n    ```\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems/requirements.txt",
    "content": "translate\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnDetection/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnDetection/module.py",
"content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport ast\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name='MidAutumnDetection',\n    type='CV',\n    author='彭兆帅，郑博培',\n    author_email='1084667371@qq.com，2733821739@qq.com',\n    summary='',\n    version='1.0.0')\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path, **kwargs)\n\n    def predict(self, images=None, paths=None, data=None, batch_size=1, use_gpu=False, **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n        res = []\n        for iter_id in range(loop_num):\n            batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                for key, value in result.items():\n                    if isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = value.item()\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = value.item()\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/model/decode.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport re\nimport numpy as np\nfrom collections import namedtuple\n\nimport paddle.fluid as F\nimport paddle.fluid.layers as L\nimport paddle.fluid.dygraph as D\n\n\ndef gen_bias(encoder_inputs, decoder_inputs, step):\n    decoder_bsz, decoder_seqlen = decoder_inputs.shape[:2]\n    attn_bias = L.reshape(L.range(0, decoder_seqlen, 1, dtype='float32') + 1, [1, -1, 1])\n    decoder_bias = L.cast((L.matmul(attn_bias, 1. 
/ attn_bias, transpose_y=True) >= 1.),\n                          'float32')  #[1, 1, decoderlen, decoderlen]\n    encoder_bias = L.unsqueeze(L.cast(L.ones_like(encoder_inputs), 'float32'), [1])  #[bsz, 1, encoderlen]\n    encoder_bias = L.expand(encoder_bias, [1, decoder_seqlen, 1])  #[bsz,decoderlen, encoderlen]\n    decoder_bias = L.expand(decoder_bias, [decoder_bsz, 1, 1])  #[bsz, decoderlen, decoderlen]\n    if step > 0:\n        bias = L.concat([encoder_bias, L.ones([decoder_bsz, decoder_seqlen, step], 'float32'), decoder_bias], -1)\n    else:\n        bias = L.concat([encoder_bias, decoder_bias], -1)\n    return bias\n\n\n@D.no_grad\ndef greedy_search_infilling(model,\n                            q_ids,\n                            q_sids,\n                            sos_id,\n                            eos_id,\n                            attn_id,\n                            max_encode_len=640,\n                            max_decode_len=100,\n                            tgt_type_id=3):\n    model.eval()\n    _, logits, info = model(q_ids, q_sids)\n    gen_ids = L.argmax(logits, -1)\n    d_batch, d_seqlen = q_ids.shape\n    seqlen = L.reduce_sum(L.cast(q_ids != 0, 'int64'), 1, keep_dim=True)\n    has_stopped = np.zeros([d_batch], dtype=np.bool)\n    gen_seq_len = np.zeros([d_batch], dtype=np.int64)\n    output_ids = []\n\n    past_cache = info['caches']\n\n    cls_ids = L.ones([d_batch], dtype='int64') * sos_id\n    attn_ids = L.ones([d_batch], dtype='int64') * attn_id\n    ids = L.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(q_ids, ids, step)\n        pos_ids = D.to_variable(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch, 1]))\n        pos_ids += seqlen\n        _, logits, info = model(\n            ids, L.ones_like(ids) * tgt_type_id, pos_ids=pos_ids, attn_bias=bias, past_cache=past_cache)\n        gen_ids = L.argmax(logits, -1)\n\n        past_cached_k, past_cached_v = past_cache\n      
  cached_k, cached_v = info['caches']\n        cached_k = [L.concat([pk, k[:, :1, :]], 1) for pk, k in zip(past_cached_k, cached_k)]  # concat cached\n        cached_v = [L.concat([pv, v[:, :1, :]], 1) for pv, v in zip(past_cached_v, cached_v)]\n        past_cache = (cached_k, cached_v)\n\n        gen_ids = gen_ids[:, 1]\n        ids = L.stack([gen_ids, attn_ids], 1)\n\n        gen_ids = gen_ids.numpy()\n        has_stopped |= (gen_ids == eos_id).astype(np.bool)\n        gen_seq_len += (1 - has_stopped.astype(np.int64))\n        output_ids.append(gen_ids.tolist())\n        if has_stopped.all():\n            break\n    output_ids = np.array(output_ids).transpose([1, 0])\n    return output_ids\n\n\nBeamSearchState = namedtuple('BeamSearchState', ['log_probs', 'lengths', 'finished'])\nBeamSearchOutput = namedtuple('BeamSearchOutput', ['scores', 'predicted_ids', 'beam_parent_ids'])\n\n\ndef log_softmax(x):\n    e_x = np.exp(x - np.max(x))\n    return np.log(e_x / e_x.sum())\n\n\ndef mask_prob(p, onehot_eos, finished):\n    is_finished = L.cast(L.reshape(finished, [-1, 1]) != 0, 'float32')\n    p = is_finished * (1. - L.cast(onehot_eos, 'float32')) * -9999. + (1. - is_finished) * p\n    return p\n\n\ndef hyp_score(log_probs, length, length_penalty):\n    lp = L.pow((5. + L.cast(length, 'float32')) / 6., length_penalty)\n    return log_probs / lp\n\n\ndef beam_search_step(state, logits, eos_id, beam_width, is_first_step, length_penalty):\n    \"\"\"logits.shape == [B*W, V]\"\"\"\n    beam_size, vocab_size = logits.shape  # as batch size=1 in this hub module. 
the first dim means bsz * beam_size equals beam_size\n    logits_np = logits.numpy()\n    for i in range(beam_size):\n        logits_np[i][17963] = 0  # make [UNK] prob = 0\n    logits = D.to_variable(logits_np)\n\n    bsz, beam_width = state.log_probs.shape\n    onehot_eos = L.cast(F.one_hot(L.ones([1], 'int64') * eos_id, vocab_size), 'int64')  #[1, V]\n\n    probs = L.log(L.softmax(logits))  #[B*W, V]\n    probs = mask_prob(probs, onehot_eos, state.finished)  #[B*W, V]\n    allprobs = L.reshape(state.log_probs, [-1, 1]) + probs  #[B*W, V]\n\n    not_finished = 1 - L.reshape(state.finished, [-1, 1])  #[B*W,1]\n    not_eos = 1 - onehot_eos\n    length_to_add = not_finished * not_eos  #[B*W,V]\n    alllen = L.reshape(state.lengths, [-1, 1]) + length_to_add\n\n    allprobs = L.reshape(allprobs, [-1, beam_width * vocab_size])\n    alllen = L.reshape(alllen, [-1, beam_width * vocab_size])\n    allscore = hyp_score(allprobs, alllen, length_penalty)\n    if is_first_step:\n        allscore = L.reshape(allscore, [bsz, beam_width, -1])[:, 0, :]  # first step only consiter beam 0\n    scores, idx = L.topk(allscore, k=beam_width)  #[B, W]\n    next_beam_id = idx // vocab_size  #[B, W]\n    next_word_id = idx % vocab_size\n\n    gather_idx = L.concat([L.where(idx != -1)[:, :1], L.reshape(idx, [-1, 1])], 1)\n    next_probs = L.reshape(L.gather_nd(allprobs, gather_idx), idx.shape)\n    next_len = L.reshape(L.gather_nd(alllen, gather_idx), idx.shape)\n\n    gather_idx = L.concat([L.where(next_beam_id != -1)[:, :1], L.reshape(next_beam_id, [-1, 1])], 1)\n    next_finished = L.reshape(L.gather_nd(state.finished, gather_idx),\n                              state.finished.shape)  #[gather new beam state according to new beam id]\n\n    next_finished += L.cast(next_word_id == eos_id, 'int64')\n    next_finished = L.cast(next_finished > 0, 'int64')\n\n    next_state = BeamSearchState(log_probs=next_probs, lengths=next_len, finished=next_finished)\n    output = 
BeamSearchOutput(scores=scores, predicted_ids=next_word_id, beam_parent_ids=next_beam_id)\n\n    return output, next_state\n\n\n@D.no_grad\ndef beam_search_infilling(model,\n                          q_ids,\n                          q_sids,\n                          sos_id,\n                          eos_id,\n                          attn_id,\n                          max_encode_len=640,\n                          max_decode_len=100,\n                          beam_width=5,\n                          tgt_type_id=3,\n                          length_penalty=1.0):\n    model.eval()\n    _, __, info = model(q_ids, q_sids)\n    d_batch, d_seqlen = q_ids.shape\n\n    state = BeamSearchState(\n        log_probs=L.zeros([d_batch, beam_width], 'float32'),\n        lengths=L.zeros([d_batch, beam_width], 'int64'),\n        finished=L.zeros([d_batch, beam_width], 'int64'))\n    outputs = []\n\n    def reorder_(t, parent_id):\n        \"\"\"reorder cache according to parent beam id\"\"\"\n        gather_idx = L.where(parent_id != -1)[:, 0] * beam_width + L.reshape(parent_id, [-1])\n        t = L.gather(t, gather_idx)\n        return t\n\n    def tile_(t, times):\n        _shapes = list(t.shape[1:])\n        ret = L.reshape(L.expand(L.unsqueeze(t, [1]), [\n            1,\n            times,\n        ] + [\n            1,\n        ] * len(_shapes)), [\n            -1,\n        ] + _shapes)\n        return ret\n\n    cached_k, cached_v = info['caches']\n    cached_k = [tile_(k, beam_width) for k in cached_k]\n    cached_v = [tile_(v, beam_width) for v in cached_v]\n    past_cache = (cached_k, cached_v)\n\n    q_ids = tile_(q_ids, beam_width)\n    seqlen = L.reduce_sum(L.cast(q_ids != 0, 'int64'), 1, keep_dim=True)\n\n    cls_ids = L.ones([d_batch * beam_width], dtype='int64') * sos_id\n    attn_ids = L.ones([d_batch * beam_width], dtype='int64') * attn_id  # SOS\n    ids = L.stack([cls_ids, attn_ids], -1)\n    for step in range(max_decode_len):\n        bias = gen_bias(q_ids, 
ids, step)\n        pos_ids = D.to_variable(np.tile(np.array([[step, step + 1]], dtype=np.int64), [d_batch * beam_width, 1]))\n        pos_ids += seqlen\n\n        _, logits, info = model(\n            ids, L.ones_like(ids) * tgt_type_id, pos_ids=pos_ids, attn_bias=bias, past_cache=past_cache)\n\n        output, state = beam_search_step(\n            state,\n            logits[:, 1],\n            eos_id=eos_id,\n            beam_width=beam_width,\n            is_first_step=(step == 0),\n            length_penalty=length_penalty)\n        outputs.append(output)\n\n        past_cached_k, past_cached_v = past_cache\n        cached_k, cached_v = info['caches']\n        cached_k = [\n            reorder_(L.concat([pk, k[:, :1, :]], 1), output.beam_parent_ids) for pk, k in zip(past_cached_k, cached_k)\n        ]  # concat cached\n        cached_v = [\n            reorder_(L.concat([pv, v[:, :1, :]], 1), output.beam_parent_ids) for pv, v in zip(past_cached_v, cached_v)\n        ]\n        past_cache = (cached_k, cached_v)\n\n        pred_ids_flatten = L.reshape(output.predicted_ids, [d_batch * beam_width])\n        ids = L.stack([pred_ids_flatten, attn_ids], 1)\n\n        if state.finished.numpy().all():\n            break\n\n    final_ids = L.stack([o.predicted_ids for o in outputs], 0)\n    final_parent_ids = L.stack([o.beam_parent_ids for o in outputs], 0)\n    final_ids = L.gather_tree(final_ids, final_parent_ids)  #[:, :,\n    #0]  #pick best beam\n    final_ids = L.transpose(L.reshape(final_ids, [-1, d_batch * 1, beam_width]), [1, 2, 0])\n    return final_ids\n\n\nen_patten = re.compile(r'^[a-zA-Z0-9]*$')\n\n\ndef post_process(token):\n    if token.startswith('##'):\n        ret = token[2:]\n    else:\n        if en_patten.match(token):\n            ret = ' ' + token\n        else:\n            ret = token\n    return ret\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/model/file_utils.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nfrom tqdm import tqdm\nfrom paddlehub.common.logger import logger\nfrom paddlehub.common.dir import MODULE_HOME\n\n\ndef _fetch_from_remote(url, force_download=False):\n    import tempfile, requests, tarfile\n    cached_dir = os.path.join(MODULE_HOME, \"ernie_for_gen\")\n    if force_download or not os.path.exists(cached_dir):\n        with tempfile.NamedTemporaryFile() as f:\n            #url = 'https://ernie.bj.bcebos.com/ERNIE_stable.tgz'\n            r = requests.get(url, stream=True)\n            total_len = int(r.headers.get('content-length'))\n            for chunk in tqdm(\n                    r.iter_content(chunk_size=1024), total=total_len // 1024, desc='downloading %s' % url, unit='KB'):\n                if chunk:\n                    f.write(chunk)\n                    f.flush()\n            logger.debug('extracting... 
to %s' % f.name)\n            with tarfile.open(f.name) as tf:\n                def is_within_directory(directory, target):\n                    abs_directory = os.path.abspath(directory)\n                    abs_target = os.path.abspath(target)\n                    prefix = os.path.commonprefix([abs_directory, abs_target])\n                    return prefix == abs_directory\n\n                def safe_extract(tar, path=\".\", members=None, *, numeric_owner=False):\n                    for member in tar.getmembers():\n                        member_path = os.path.join(path, member.name)\n                        if not is_within_directory(path, member_path):\n                            raise Exception(\"Attempted Path Traversal in Tar File\")\n                    tar.extractall(path, members, numeric_owner=numeric_owner)\n\n                safe_extract(tf, path=cached_dir)\n    logger.debug('%s cached in %s' % (url, cached_dir))\n    return cached_dir\n\n\ndef add_docstring(doc):\n    def func(f):\n        f.__doc__ += ('\\n====== other docs from super class ======\\n%s' % doc)\n        return f\n\n    return func\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/model/modeling_ernie.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import division\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport logging\n\nimport paddle.fluid.dygraph as D\nimport paddle.fluid as F\nimport paddle.fluid.layers as L\n\nlog = logging.getLogger(__name__)\n\n\ndef _build_linear(n_in, n_out, name, init, act=None):\n    return D.Linear(\n        n_in,\n        n_out,\n        param_attr=F.ParamAttr(name='%s.w_0' % name if name is not None else None, initializer=init),\n        bias_attr='%s.b_0' % name if name is not None else None,\n        act=act)\n\n\ndef _build_ln(n_in, name):\n    return D.LayerNorm(\n        normalized_shape=n_in,\n        param_attr=F.ParamAttr(\n            name='%s_layer_norm_scale' % name if name is not None else None, initializer=F.initializer.Constant(1.)),\n        bias_attr=F.ParamAttr(\n            name='%s_layer_norm_bias' % name if name is not None else None, initializer=F.initializer.Constant(1.)),\n    )\n\n\ndef append_name(name, postfix):\n    if name is None:\n        return None\n    elif name == '':\n        return postfix\n    else:\n        return '%s_%s' % (name, postfix)\n\n\nclass AttentionLayer(D.Layer):\n    def __init__(self, cfg, name=None):\n        super(AttentionLayer, self).__init__()\n        initializer = 
F.initializer.TruncatedNormal(scale=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        n_head = cfg['num_attention_heads']\n        assert d_model % n_head == 0\n        d_model_q = cfg.get('query_hidden_size_per_head', d_model // n_head) * n_head\n        d_model_v = cfg.get('value_hidden_size_per_head', d_model // n_head) * n_head\n        self.n_head = n_head\n        self.d_key = d_model_q // n_head\n        self.q = _build_linear(d_model, d_model_q, append_name(name, 'query_fc'), initializer)\n        self.k = _build_linear(d_model, d_model_q, append_name(name, 'key_fc'), initializer)\n        self.v = _build_linear(d_model, d_model_v, append_name(name, 'value_fc'), initializer)\n        self.o = _build_linear(d_model_v, d_model, append_name(name, 'output_fc'), initializer)\n        self.dropout = lambda i: L.dropout(\n            i,\n            dropout_prob=cfg['attention_probs_dropout_prob'],\n            dropout_implementation=\"upscale_in_train\",\n        ) if self.training else i\n\n    def forward(self, queries, keys, values, attn_bias, past_cache):\n        assert len(queries.shape) == len(keys.shape) == len(values.shape) == 3\n\n        q = self.q(queries)\n        k = self.k(keys)\n        v = self.v(values)\n\n        cache = (k, v)\n        if past_cache is not None:\n            cached_k, cached_v = past_cache\n            k = L.concat([cached_k, k], 1)\n            v = L.concat([cached_v, v], 1)\n\n        q = L.transpose(L.reshape(q, [0, 0, self.n_head, q.shape[-1] // self.n_head]),\n                        [0, 2, 1, 3])  #[batch, head, seq, dim]\n        k = L.transpose(L.reshape(k, [0, 0, self.n_head, k.shape[-1] // self.n_head]),\n                        [0, 2, 1, 3])  #[batch, head, seq, dim]\n        v = L.transpose(L.reshape(v, [0, 0, self.n_head, v.shape[-1] // self.n_head]),\n                        [0, 2, 1, 3])  #[batch, head, seq, dim]\n\n        q = L.scale(q, scale=self.d_key**-0.5)\n        score = 
L.matmul(q, k, transpose_y=True)\n        if attn_bias is not None:\n            score += attn_bias\n        score = L.softmax(score, use_cudnn=True)\n        score = self.dropout(score)\n\n        out = L.matmul(score, v)\n        out = L.transpose(out, [0, 2, 1, 3])\n        out = L.reshape(out, [0, 0, out.shape[2] * out.shape[3]])\n\n        out = self.o(out)\n        return out, cache\n\n\nclass PositionwiseFeedForwardLayer(D.Layer):\n    def __init__(self, cfg, name=None):\n        super(PositionwiseFeedForwardLayer, self).__init__()\n        initializer = F.initializer.TruncatedNormal(scale=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_ffn = cfg.get('intermediate_size', 4 * d_model)\n        assert cfg['hidden_act'] in ['relu', 'gelu']\n        self.i = _build_linear(d_model, d_ffn, append_name(name, 'fc_0'), initializer, act=cfg['hidden_act'])\n        self.o = _build_linear(d_ffn, d_model, append_name(name, 'fc_1'), initializer)\n        prob = cfg.get('intermediate_dropout_prob', 0.)\n        self.dropout = lambda i: L.dropout(\n            i,\n            dropout_prob=prob,\n            dropout_implementation=\"upscale_in_train\",\n        ) if self.training else i\n\n    def forward(self, inputs):\n        hidden = self.i(inputs)\n        hidden = self.dropout(hidden)\n        out = self.o(hidden)\n        return out\n\n\nclass ErnieBlock(D.Layer):\n    def __init__(self, cfg, name=None):\n        super(ErnieBlock, self).__init__()\n        d_model = cfg['hidden_size']\n        initializer = F.initializer.TruncatedNormal(scale=cfg['initializer_range'])\n\n        self.attn = AttentionLayer(cfg, name=append_name(name, 'multi_head_att'))\n        self.ln1 = _build_ln(d_model, name=append_name(name, 'post_att'))\n        self.ffn = PositionwiseFeedForwardLayer(cfg, name=append_name(name, 'ffn'))\n        self.ln2 = _build_ln(d_model, name=append_name(name, 'post_ffn'))\n        prob = cfg.get('intermediate_dropout_prob', 
cfg['hidden_dropout_prob'])\n        self.dropout = lambda i: L.dropout(\n            i,\n            dropout_prob=prob,\n            dropout_implementation=\"upscale_in_train\",\n        ) if self.training else i\n\n    def forward(self, inputs, attn_bias=None, past_cache=None):\n        attn_out, cache = self.attn(inputs, inputs, inputs, attn_bias, past_cache=past_cache)  #self attn\n        attn_out = self.dropout(attn_out)\n        hidden = attn_out + inputs\n        hidden = self.ln1(hidden)  # dropout/ add/ norm\n\n        ffn_out = self.ffn(hidden)\n        ffn_out = self.dropout(ffn_out)\n        hidden = ffn_out + hidden\n        hidden = self.ln2(hidden)\n        return hidden, cache\n\n\nclass ErnieEncoderStack(D.Layer):\n    def __init__(self, cfg, name=None):\n        super(ErnieEncoderStack, self).__init__()\n        n_layers = cfg['num_hidden_layers']\n        self.block = D.LayerList([ErnieBlock(cfg, append_name(name, 'layer_%d' % i)) for i in range(n_layers)])\n\n    def forward(self, inputs, attn_bias=None, past_cache=None):\n        if past_cache is not None:\n            assert isinstance(\n                past_cache,\n                tuple), 'unknown type of `past_cache`, expect tuple or list. 
got %s' % repr(type(past_cache))\n            past_cache = list(zip(*past_cache))\n        else:\n            past_cache = [None] * len(self.block)\n        cache_list_k, cache_list_v, hidden_list = [], [], [inputs]\n\n        for b, p in zip(self.block, past_cache):\n            inputs, cache = b(inputs, attn_bias=attn_bias, past_cache=p)\n            cache_k, cache_v = cache\n            cache_list_k.append(cache_k)\n            cache_list_v.append(cache_v)\n            hidden_list.append(inputs)\n\n        return inputs, hidden_list, (cache_list_k, cache_list_v)\n\n\nclass ErnieModel(D.Layer):\n    def __init__(self, cfg, name=None):\n        \"\"\"\n        Fundamental pretrained Ernie model\n        \"\"\"\n        log.debug('init ErnieModel with config: %s' % repr(cfg))\n        D.Layer.__init__(self)\n        d_model = cfg['hidden_size']\n        d_emb = cfg.get('emb_size', cfg['hidden_size'])\n        d_vocab = cfg['vocab_size']\n        d_pos = cfg['max_position_embeddings']\n        d_sent = cfg.get(\"sent_type_vocab_size\") or cfg['type_vocab_size']\n        self.n_head = cfg['num_attention_heads']\n        self.return_additional_info = cfg.get('return_additional_info', False)\n        initializer = F.initializer.TruncatedNormal(scale=cfg['initializer_range'])\n\n        self.ln = _build_ln(d_model, name=append_name(name, 'pre_encoder'))\n        self.word_emb = D.Embedding([d_vocab, d_emb],\n                                    param_attr=F.ParamAttr(\n                                        name=append_name(name, 'word_embedding'), initializer=initializer))\n        self.pos_emb = D.Embedding([d_pos, d_emb],\n                                   param_attr=F.ParamAttr(\n                                       name=append_name(name, 'pos_embedding'), initializer=initializer))\n        self.sent_emb = D.Embedding([d_sent, d_emb],\n                                    param_attr=F.ParamAttr(\n                                        name=append_name(name, 
'sent_embedding'), initializer=initializer))\n        prob = cfg['hidden_dropout_prob']\n        self.dropout = lambda i: L.dropout(\n            i,\n            dropout_prob=prob,\n            dropout_implementation=\"upscale_in_train\",\n        ) if self.training else i\n\n        self.encoder_stack = ErnieEncoderStack(cfg, append_name(name, 'encoder'))\n        if cfg.get('has_pooler', True):\n            self.pooler = _build_linear(\n                cfg['hidden_size'], cfg['hidden_size'], append_name(name, 'pooled_fc'), initializer, act='tanh')\n        else:\n            self.pooler = None\n        self.train()\n\n    def eval(self):\n        if F.in_dygraph_mode():\n            super(ErnieModel, self).eval()\n        self.training = False\n        for l in self.sublayers():\n            l.training = False\n\n    def train(self):\n        if F.in_dygraph_mode():\n            super(ErnieModel, self).train()\n        self.training = True\n        for l in self.sublayers():\n            l.training = True\n\n    def forward(self,\n                src_ids,\n                sent_ids=None,\n                pos_ids=None,\n                input_mask=None,\n                attn_bias=None,\n                past_cache=None,\n                use_causal_mask=False):\n        \"\"\"\n        Args:\n            src_ids (`Variable` of shape `[batch_size, seq_len]`):\n                Indices of input sequence tokens in the vocabulary.\n            sent_ids (optional, `Variable` of shape `[batch_size, seq_len]`):\n                aka token_type_ids, Segment token indices to indicate first and second portions of the inputs.\n                if None, assume all tokens come from `segment_a`\n            pos_ids(optional, `Variable` of shape `[batch_size, seq_len]`):\n                Indices of positions of each input sequence tokens in the position embeddings.\n            input_mask(optional `Variable` of shape `[batch_size, seq_len]`):\n                Mask to avoid performing 
attention on the padding token indices of the encoder input.\n            attn_bias(optional, `Variable` of shape `[batch_size, seq_len, seq_len] or False`):\n                3D version of `input_mask`; if set, it overrides `input_mask`; if set to False, no attention mask is applied\n            past_cache(optional, tuple of two lists: cached key and cached value,\n                each is a list of `Variable`s of shape `[batch_size, seq_len, hidden_size]`):\n                cached key/value tensors that will be concatenated with the generated key/value when performing self attention.\n                if set, `attn_bias` should not be None.\n\n        Returns:\n            pooled (`Variable` of shape `[batch_size, hidden_size]`):\n                output logits of pooler classifier\n            encoded(`Variable` of shape `[batch_size, seq_len, hidden_size]`):\n                output logits of transformer stack\n        \"\"\"\n        assert len(src_ids.shape) == 2, 'expect src_ids.shape = [batch, sequence], got %s' % (repr(src_ids.shape))\n        assert attn_bias is not None if past_cache else True, 'if `past_cache` is specified, attn_bias should not be None'\n        d_batch = L.shape(src_ids)[0]\n        d_seqlen = L.shape(src_ids)[1]\n        if pos_ids is None:\n            pos_ids = L.reshape(L.range(0, d_seqlen, 1, dtype='int32'), [1, -1])\n            pos_ids = L.cast(pos_ids, 'int64')\n        if attn_bias is None:\n            if input_mask is None:\n                input_mask = L.cast(src_ids != 0, 'float32')\n            assert len(input_mask.shape) == 2\n            input_mask = L.unsqueeze(input_mask, axes=[-1])\n            attn_bias = L.matmul(input_mask, input_mask, transpose_y=True)\n            if use_causal_mask:\n                sequence = L.reshape(L.range(0, d_seqlen, 1, dtype='float32') + 1., [1, 1, -1, 1])\n                causal_mask = L.cast((L.matmul(sequence, 1. 
/ sequence, transpose_y=True) >= 1.), 'float32')\n                attn_bias *= causal_mask\n        else:\n            assert len(attn_bias.shape) == 3, 'expect attn_bias to be rank 3, got %r' % attn_bias.shape\n        attn_bias = (1. - attn_bias) * -10000.0\n        attn_bias = L.unsqueeze(attn_bias, [1])\n        attn_bias = L.expand(attn_bias, [1, self.n_head, 1, 1])  # avoid broadcast =_=\n        attn_bias.stop_gradient = True\n\n        if sent_ids is None:\n            sent_ids = L.zeros_like(src_ids)\n\n        src_embedded = self.word_emb(src_ids)\n        pos_embedded = self.pos_emb(pos_ids)\n        sent_embedded = self.sent_emb(sent_ids)\n        embedded = src_embedded + pos_embedded + sent_embedded\n\n        embedded = self.dropout(self.ln(embedded))\n\n        encoded, hidden_list, cache_list = self.encoder_stack(embedded, attn_bias, past_cache=past_cache)\n        if self.pooler is not None:\n            pooled = self.pooler(encoded[:, 0, :])\n        else:\n            pooled = None\n\n        additional_info = {\n            'hiddens': hidden_list,\n            'caches': cache_list,\n        }\n\n        if self.return_additional_info:\n            return pooled, encoded, additional_info\n        else:\n            return pooled, encoded\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/model/modeling_ernie_gen.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport paddle.fluid as F\nimport paddle.fluid.layers as L\n\nfrom .modeling_ernie import ErnieModel\nfrom .modeling_ernie import _build_linear, _build_ln, append_name\n\n\nclass ErnieModelForGeneration(ErnieModel):\n    def __init__(self, cfg, name=None):\n        cfg['return_additional_info'] = True\n        cfg['has_pooler'] = False\n        super(ErnieModelForGeneration, self).__init__(cfg, name=name)\n        initializer = F.initializer.TruncatedNormal(scale=cfg['initializer_range'])\n        d_model = cfg['hidden_size']\n        d_vocab = cfg['vocab_size']\n\n        self.mlm = _build_linear(\n            d_model, d_model, append_name(name, 'mask_lm_trans_fc'), initializer, act=cfg['hidden_act'])\n        self.mlm_ln = _build_ln(d_model, name=append_name(name, 'mask_lm_trans'))\n        self.mlm_bias = L.create_parameter(\n            dtype='float32',\n            shape=[d_vocab],\n            attr=F.ParamAttr(\n                name=append_name(name, 'mask_lm_out_fc.b_0'), initializer=F.initializer.Constant(value=0.0)),\n            is_bias=True,\n        )\n\n    def forward(self, src_ids, *args, **kwargs):\n        tgt_labels = kwargs.pop('tgt_labels', None)\n        tgt_pos = kwargs.pop('tgt_pos', None)\n        encode_only = kwargs.pop('encode_only', False)\n        _, encoded, info = ErnieModel.forward(self, src_ids, 
*args, **kwargs)\n        if encode_only:\n            return None, None, info\n        elif tgt_labels is None:\n            encoded = self.mlm(encoded)\n            encoded = self.mlm_ln(encoded)\n            logits = L.matmul(encoded, self.word_emb.weight, transpose_y=True) + self.mlm_bias\n            output_ids = L.argmax(logits, -1)\n            return output_ids, logits, info\n        else:\n            encoded_2d = L.gather_nd(encoded, tgt_pos)\n            encoded_2d = self.mlm(encoded_2d)\n            encoded_2d = self.mlm_ln(encoded_2d)\n            logits_2d = L.matmul(encoded_2d, self.word_emb.weight, transpose_y=True) + self.mlm_bias\n            if len(tgt_labels.shape) == 1:\n                tgt_labels = L.reshape(tgt_labels, [-1, 1])\n\n            loss = L.reduce_mean(\n                L.softmax_with_cross_entropy(logits_2d, tgt_labels, soft_label=(tgt_labels.shape[-1] != 1)))\n            return loss, logits_2d, info\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/model/tokenizing_ernie.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport six\nimport re\nimport logging\nfrom functools import partial\n\nimport numpy as np\n\nimport io\n\nopen = partial(io.open, encoding='utf8')\n\nlog = logging.getLogger(__name__)\n\n_max_input_chars_per_word = 100\n\n\ndef _wordpiece(token, vocab, unk_token, prefix='##', sentencepiece_prefix=''):\n    \"\"\" wordpiece: helloworld => [hello, ##world] \"\"\"\n    chars = list(token)\n    if len(chars) > _max_input_chars_per_word:\n        return [unk_token], [(0, len(chars))]\n\n    is_bad = False\n    start = 0\n    sub_tokens = []\n    sub_pos = []\n    while start < len(chars):\n        end = len(chars)\n        cur_substr = None\n        while start < end:\n            substr = \"\".join(chars[start:end])\n            if start == 0:\n                substr = sentencepiece_prefix + substr\n            if start > 0:\n                substr = prefix + substr\n            if substr in vocab:\n                cur_substr = substr\n                break\n            end -= 1\n        if cur_substr is None:\n            is_bad = True\n            break\n        sub_tokens.append(cur_substr)\n        sub_pos.append((start, end))\n        start = end\n    if is_bad:\n        return [unk_token], [(0, len(chars))]\n    else:\n        return sub_tokens, sub_pos\n\n\nclass ErnieTokenizer(object):\n    def __init__(self,\n                
 vocab,\n                 unk_token='[UNK]',\n                 sep_token='[SEP]',\n                 cls_token='[CLS]',\n                 pad_token='[PAD]',\n                 mask_token='[MASK]',\n                 wordpiece_prefix='##',\n                 sentencepiece_prefix='',\n                 lower=True,\n                 encoding='utf8',\n                 special_token_list=[]):\n        if not isinstance(vocab, dict):\n            raise ValueError('expect `vocab` to be instance of dict, got %s' % type(vocab))\n        self.vocab = vocab\n        self.lower = lower\n        self.prefix = wordpiece_prefix\n        self.sentencepiece_prefix = sentencepiece_prefix\n        self.pad_id = self.vocab[pad_token]\n        self.cls_id = cls_token and self.vocab[cls_token]\n        self.sep_id = sep_token and self.vocab[sep_token]\n        self.unk_id = unk_token and self.vocab[unk_token]\n        self.mask_id = mask_token and self.vocab[mask_token]\n        self.unk_token = unk_token\n        special_tokens = {pad_token, cls_token, sep_token, unk_token, mask_token} | set(special_token_list)\n        pat_str = ''\n        for t in special_tokens:\n            if t is None:\n                continue\n            pat_str += '(%s)|' % re.escape(t)\n        pat_str += r'([a-zA-Z0-9]+|\\S)'\n        log.debug('regex: %s' % pat_str)\n        self.pat = re.compile(pat_str)\n        self.encoding = encoding\n\n    def tokenize(self, text):\n        if len(text) == 0:\n            return []\n        if six.PY3 and not isinstance(text, six.string_types):\n            text = text.decode(self.encoding)\n        if six.PY2 and isinstance(text, str):\n            text = text.decode(self.encoding)\n\n        res = []\n        for match in self.pat.finditer(text):\n            match_group = match.group(0)\n            if match.groups()[-1]:\n                if self.lower:\n                    match_group = match_group.lower()\n                words, _ = _wordpiece(\n                    
match_group,\n                    vocab=self.vocab,\n                    unk_token=self.unk_token,\n                    prefix=self.prefix,\n                    sentencepiece_prefix=self.sentencepiece_prefix)\n            else:\n                words = [match_group]\n            res += words\n        return res\n\n    def convert_tokens_to_ids(self, tokens):\n        return [self.vocab.get(t, self.unk_id) for t in tokens]\n\n    def truncate(self, id1, id2, seqlen):\n        len1 = len(id1)\n        len2 = len(id2)\n        half = seqlen // 2\n        if len1 > len2:\n            len1_truncated, len2_truncated = max(half, seqlen - len2), min(half, len2)\n        else:\n            len1_truncated, len2_truncated = min(half, seqlen - len1), max(half, seqlen - len1)\n        return id1[:len1_truncated], id2[:len2_truncated]\n\n    def build_for_ernie(self, text_id, pair_id=[]):\n        \"\"\"build sentence type id, add [CLS] [SEP]\"\"\"\n        text_id_type = np.zeros_like(text_id, dtype=np.int64)\n        ret_id = np.concatenate([[self.cls_id], text_id, [self.sep_id]], 0)\n        ret_id_type = np.concatenate([[0], text_id_type, [0]], 0)\n\n        if len(pair_id):\n            pair_id_type = np.ones_like(pair_id, dtype=np.int64)\n            ret_id = np.concatenate([ret_id, pair_id, [self.sep_id]], 0)\n            ret_id_type = np.concatenate([ret_id_type, pair_id_type, [1]], 0)\n        return ret_id, ret_id_type\n\n    def encode(self, text, pair=None, truncate_to=None):\n        text_id = np.array(self.convert_tokens_to_ids(self.tokenize(text)), dtype=np.int64)\n        text_id_type = np.zeros_like(text_id, dtype=np.int64)\n        if pair is not None:\n            pair_id = np.array(self.convert_tokens_to_ids(self.tokenize(pair)), dtype=np.int64)\n        else:\n            pair_id = []\n        if truncate_to is not None:\n            text_id, pair_id = self.truncate(text_id, [] if pair_id is None else pair_id, truncate_to)\n\n        ret_id, ret_id_type = 
self.build_for_ernie(text_id, pair_id)\n        return ret_id, ret_id_type\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry/module.py",
    "content": "# coding:utf-8\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport json\n\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.nlp_module import DataFormatError\nfrom paddlehub.common.logger import logger\nfrom paddlehub.module.module import moduleinfo, serving\n\nimport argparse\nimport os\nimport numpy as np\n\nimport paddle.fluid.dygraph as D\n\nfrom reading_pictures_writing_poems_for_midautumn.MidAutumnPoetry.model.tokenizing_ernie import ErnieTokenizer\nfrom reading_pictures_writing_poems_for_midautumn.MidAutumnPoetry.model.decode import beam_search_infilling\nfrom reading_pictures_writing_poems_for_midautumn.MidAutumnPoetry.model.modeling_ernie_gen import ErnieModelForGeneration\n\n\n@moduleinfo(\n    name=\"MidAutumnPoetry\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"郑博培，彭兆帅\",\n    author_email=\"2733821739@qq.com，1084667371@qq.com\",\n    type=\"nlp/text_generation\",\n)\nclass ErnieGen(hub.NLPPredictionModule):\n    def _initialize(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        assets_path = os.path.join(self.directory, \"assets\")\n        gen_checkpoint_path = os.path.join(assets_path, \"ernie_gen\")\n        ernie_cfg_path = os.path.join(assets_path, 'ernie_config.json')\n        with open(ernie_cfg_path, encoding='utf8') as ernie_cfg_file:\n            ernie_cfg = 
dict(json.loads(ernie_cfg_file.read()))\n        ernie_vocab_path = os.path.join(assets_path, 'vocab.txt')\n        with open(ernie_vocab_path, encoding='utf8') as ernie_vocab_file:\n            ernie_vocab = {j.strip().split('\\t')[0]: i for i, j in enumerate(ernie_vocab_file.readlines())}\n\n        with fluid.dygraph.guard(fluid.CPUPlace()):\n            with fluid.unique_name.guard():\n                self.model = ErnieModelForGeneration(ernie_cfg)\n                finetuned_states, _ = D.load_dygraph(gen_checkpoint_path)\n                self.model.set_dict(finetuned_states)\n\n        self.tokenizer = ErnieTokenizer(ernie_vocab)\n        self.rev_dict = {v: k for k, v in self.tokenizer.vocab.items()}\n        self.rev_dict[self.tokenizer.pad_id] = ''  # replace [PAD]\n        self.rev_dict[self.tokenizer.unk_id] = ''  # replace [UNK]\n        self.rev_lookup = np.vectorize(lambda i: self.rev_dict[i])\n\n    @serving\n    def generate(self, texts, use_gpu=False, beam_width=5):\n        \"\"\"\n        Get the predict result from the input texts.\n\n        Args:\n             texts(list): the input texts.\n             use_gpu(bool): whether to use GPU for prediction.\n             beam_width(int): the beam search width.\n\n        Returns:\n             results(list): the predict result.\n        \"\"\"\n        if texts and isinstance(texts, list) and all(texts) and all([isinstance(text, str) for text in texts]):\n            predicted_data = texts\n        else:\n            raise ValueError(\"The input texts should be a list with nonempty string elements.\")\n\n        if use_gpu and \"CUDA_VISIBLE_DEVICES\" not in os.environ:\n            use_gpu = False\n            logger.warning(\n                \"use_gpu has been set False as you didn't set the environment variable CUDA_VISIBLE_DEVICES while using use_gpu=True\"\n            )\n        if use_gpu:\n            place = fluid.CUDAPlace(0)\n        else:\n            place = fluid.CPUPlace()\n\n        with fluid.dygraph.guard(place):\n            self.model.eval()\n            results = []\n            for text in predicted_data:\n                sample_results = []\n                ids, sids = self.tokenizer.encode(text)\n                src_ids = D.to_variable(np.expand_dims(ids, 0))\n                src_sids = D.to_variable(np.expand_dims(sids, 0))\n                output_ids = beam_search_infilling(\n                    self.model,\n                    src_ids,\n                    src_sids,\n                    eos_id=self.tokenizer.sep_id,\n                    sos_id=self.tokenizer.cls_id,\n                    attn_id=self.tokenizer.vocab['[MASK]'],\n                    max_decode_len=50,\n                    max_encode_len=50,\n                    beam_width=beam_width,\n                    tgt_type_id=1)\n                output_str = self.rev_lookup(output_ids[0].numpy())\n\n                for ostr in output_str.tolist():\n                    if '[SEP]' in ostr:\n                        ostr = ostr[:ostr.index('[SEP]')]\n                    sample_results.append(\"\".join(ostr))\n                results.append(sample_results)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\n\n        self.arg_config_group.add_argument('--beam_width', type=int, default=5, help=\"the beam search width\")\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' % self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, optional.\")\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.generate(texts=input_data, use_gpu=args.use_gpu, beam_width=args.beam_width)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/reading_pictures_writing_poems_for_midautumn/module.py",
    "content": "import argparse\r\nimport ast\r\nimport os\r\nimport math\r\nimport six\r\n\r\nfrom paddle.fluid.core import PaddleTensor, AnalysisConfig, create_paddle_predictor\r\nfrom paddlehub.module.module import runnable, serving, moduleinfo\r\nfrom paddlehub.io.parser import txt_parser\r\nimport numpy as np\r\nimport paddle.fluid as fluid\r\nimport paddlehub as hub\r\nfrom translate import Translator\r\nimport reading_pictures_writing_poems_for_midautumn.MidAutumnDetection.module as MidAutumnDetection\r\nimport reading_pictures_writing_poems_for_midautumn.MidAutumnPoetry.module as MidAutumnPoetry\r\n\r\n\r\n@moduleinfo(\r\n    name=\"reading_pictures_writing_poems_for_midautumn\",\r\n    version=\"1.0.0\",\r\n    summary=\"Reading Pictures And Writing Poems For MidAutumn\",\r\n    author=\"郑博培，彭兆帅\",\r\n    author_email=\"2733821739@qq.com，1084667371@qq.com\",\r\n    type=\"nlp/text_generation\")\r\nclass ReadingPicturesWritingPoems(hub.Module):\r\n    def _initialize(self):\r\n        \"\"\"\r\n        Initialize with the necessary elements\r\n        \"\"\"\r\n        self.pretrained_model_path = os.path.join(self.directory, \"assets\", \"infer_model\")\r\n        self.module_image = MidAutumnDetection.MODULE(\r\n            directory=\"reading_pictures_writing_poems_for_midautumn/MidAutumnDetection\")  # 调用目标检测的模型\r\n        self.module_similar = MidAutumnPoetry.ErnieGen(\r\n            directory='reading_pictures_writing_poems_for_midautumn/MidAutumnPoetry')  # 调用根据关键词生成古诗上阕的模型\r\n        self.module_poem = hub.Module(name=\"ernie_gen_poetry\")  # 调用古诗生成的模型\r\n\r\n    def WritingPoem(self, images, use_gpu=False):\r\n        # 目标检测，输入图片，输入得分最高的标签\r\n        results_image = self.module_image.predict(images=images)\r\n        best = {'score': 0, 'category': 'none'}\r\n        for item in results_image:\r\n            for items in item:\r\n                if (items['score'] > best['score']):\r\n                    best['score'], best['category'] = 
items['score'], items['category']\r\n        if best['category'] == 'MoonCake':\r\n            objects = ['月饼']\r\n        elif best['category'] == 'moon':\r\n            objects = ['月亮']\r\n        elif best['category'] == 'lantern':\r\n            objects = ['灯笼']\r\n        elif best['category'] == 'rabbit':\r\n            objects = ['兔子']\r\n        else:\r\n            objects = ['中秋节']\r\n        # Generate the first half of the poem from the detected keyword\r\n        FirstPoetrys = self.module_similar.generate(texts=objects, use_gpu=use_gpu, beam_width=5)\r\n        FirstPoetry = [FirstPoetrys[0][0]]\r\n        # Use the poem generation model to produce the second half from the first\r\n        SecondPoetry = self.module_poem.generate(texts=FirstPoetry, use_gpu=use_gpu, beam_width=5)\r\n        Poetrys = []\r\n        Poetrys.append(FirstPoetry[0])\r\n        Poetrys.append(SecondPoetry[0][0])\r\n        results = [{'images': images, 'Poetrys': \"{}\".format(Poetrys[0] + Poetrys[1])}]\r\n\r\n        return results\r\n\r\n    @runnable\r\n    def run_cmd(self, argvs):\r\n        \"\"\"\r\n        Run as a command.\r\n        \"\"\"\r\n        self.parser = argparse.ArgumentParser(\r\n            description='Run the %s module.' % self.name,\r\n            prog='hub run %s' % self.name,\r\n            usage='%(prog)s',\r\n            add_help=True)\r\n\r\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\r\n        self.arg_config_group = self.parser.add_argument_group(\r\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\r\n\r\n        self.add_module_config_arg()\r\n        self.add_module_input_arg()\r\n\r\n        args = self.parser.parse_args(argvs)\r\n\r\n        try:\r\n            input_data = self.check_input_data(args)\r\n        except RuntimeError:\r\n            self.parser.print_help()\r\n            return None\r\n\r\n        results = self.WritingPoem(input_data, use_gpu=args.use_gpu)\r\n\r\n        return results\r\n\r\n    def add_module_config_arg(self):\r\n        \"\"\"\r\n        Add the command config options.\r\n        \"\"\"\r\n        self.arg_config_group.add_argument(\r\n            '--use_gpu', type=ast.literal_eval, default=False, help=\"whether to use GPU for prediction\")\r\n\r\n    def add_module_input_arg(self):\r\n        \"\"\"\r\n        Add the command input options.\r\n        \"\"\"\r\n        self.arg_input_group.add_argument('--input_image', type=str, default=None, help=\"Pictures to write poetry\")\r\n\r\n    def check_input_data(self, args):\r\n        input_data = []\r\n        if args.input_image:\r\n            if not os.path.exists(args.input_image):\r\n                raise RuntimeError(\"File %s does not exist.\" % args.input_image)\r\n            else:\r\n                input_data = args.input_image\r\n\r\n        if input_data == []:\r\n            raise RuntimeError(\"The input data is inconsistent with expectations.\")\r\n\r\n        return input_data\r\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn/README.md",
    "content": "```shell\n$ hub install unified_transformer_12L_cn==1.0.0\n```\n\n## 概述\n\n近年来，人机对话系统受到了学术界和产业界的广泛关注并取得了不错的发展。开放域对话系统旨在建立一个开放域的多轮对话系统，使得机器可以流畅自然地与人进行语言交互，既可以进行日常问候类的闲聊，又可以完成特定功能，以使得开放域对话系统具有实际应用价值。具体的说，开放域对话可以继续拆分为支持不同功能的对话形式，例如对话式推荐，知识对话技术等，如何解决并有效融合以上多个技能面临诸多挑战。\n\n[UnifiedTransformer](https://arxiv.org/abs/2006.16779)以[Transformer](https://arxiv.org/abs/1706.03762) 编码器为网络基本组件，采用灵活的注意力机制，十分适合文本生成任务，并在模型输入中加入了标识不同对话技能的special token，使得模型能同时支持闲聊对话、推荐对话和知识对话。\n\nunified_transformer_12L_cn包含12层的transformer结构，头数为12，隐藏层参数为768，参数量为132M。该预训练模型使用了样本量为60M的文本数据和20M的对话数据的大型中文对话数据集进行预训练，具体训练详情可以查看[LUGE-Dialogue](https://github.com/PaddlePaddle/Knover/tree/luge-dialogue/luge-dialogue)。\n\n## API\n\n```python\ndef predict(data: Union[List[List[str]], str],\n            max_seq_len: int = 512,\n            batch_size: int = 1,\n            use_gpu: bool = False,\n            **kwargs):\n```\n预测API，输入对话上下文，输出机器回复。\n\n**参数**\n- `data`(Union[List[List[str]], str]): 在非交互模式中，数据类型为List[List[str]]，每个样本是一个List[str]，表示为对话内容\n- `max_seq_len`(int): 每个样本的最大文本长度\n- `batch_size`(int): 进行预测的batch_size\n- `use_gpu`(bool): 是否使用gpu执行预测\n- `kwargs`: 预测时传给模型的额外参数，以keyword方式传递。其余的参数详情请查看[UnifiedTransformer](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/dialogue/unified_transformer)。\n\n**返回**\n* `results`(List[str]): 每个元素为相应对话中模型的新回复\n\n```python\ndef interactive_mode(max_turn=3)\n```\n进入交互模式。交互模式中，predict接口的data将支持字符串类型。\n\n**参数**\n- `max_turn`(int): 模型能记忆的对话轮次，当`max_turn`为1时，模型只能记住当前对话，无法获知之前的对话内容。\n\n\n**代码示例**\n\n```python\n# 非交互模式\nimport paddlehub as hub\n\nmodel = hub.Module(name='unified_transformer_12L_cn')\ndata = [[\"你是谁？\"], [\"你好啊。\", \"吃饭了吗？\",]]\nresult = model.predict(data)\n```\n\n```python\n# 交互模式\nimport paddlehub as hub\n\nmodel = hub.Module(name='unified_transformer_12L_cn')\nwith model.interactive_mode(max_turn=3):\n    while True:\n        human_utterance = input(\"[Human]: \").strip()\n        robot_utterance = 
model.predict(human_utterance)[0]\n        print(\"[Bot]: %s\"%robot_utterance)\n```\n\n## 服务部署\n\nPaddleHub Serving可以部署在线服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m unified_transformer_12L_cn\n```\n\n这样就完成了一个对话机器人服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\ntexts = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\ndata = {\"data\": texts}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/unified_transformer_12L_cn\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n## 查看代码\n\nhttps://github.com/PaddlePaddle/Knover\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.1.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nfrom collections import deque\nfrom typing import List, Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.data import Pad\nfrom paddlenlp.transformers import UnifiedTransformerLMHeadModel, UnifiedTransformerTokenizer\n\nfrom unified_transformer_12L_cn.utils import select_response\n\n\n@moduleinfo(\n    name=\"unified_transformer_12L_cn\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass UnifiedTransformer(nn.Layer):\n    def __init__(self):\n        super(UnifiedTransformer, self).__init__()\n\n        self.model = UnifiedTransformerLMHeadModel.from_pretrained('unified_transformer-12L-cn')\n        self.tokenizer = UnifiedTransformerTokenizer.from_pretrained('unified_transformer-12L-cn')\n        self._interactive_mode = False\n\n    def _convert_text_to_input(self, texts: List[str], max_seq_len: int):\n        \"\"\"\n        Convert input strings to tokens.\n        \"\"\"\n        return self.tokenizer.dialogue_encode(\n            texts, max_seq_len=max_seq_len, add_start_token_as_response=True, is_split_into_words=False)\n\n    def _batchify(self, data: List[List[str]], max_seq_len: int, batch_size: int):\n        \"\"\"\n        
Generate input batches.\n        \"\"\"\n        padding = False if batch_size == 1 else True\n        pad_func = Pad(pad_val=self.tokenizer.pad_token_id, pad_right=False, dtype=np.int64)\n\n        def pad_mask(batch_attention_mask):\n            batch_size = len(batch_attention_mask)\n            max_len = max(map(len, batch_attention_mask))\n            attention_mask = np.ones((batch_size, max_len, max_len), dtype='float32') * -1e9\n            for i, mask_data in enumerate(attention_mask):\n                seq_len = len(batch_attention_mask[i])\n                mask_data[-seq_len:, -seq_len:] = np.array(batch_attention_mask[i], dtype='float32')\n            # In order to ensure the correct broadcasting mechanism, expand one\n            # dimension to the second dimension (n_head of Transformer).\n            attention_mask = np.expand_dims(attention_mask, axis=1)\n            return attention_mask\n\n        def _parse_batch(batch_examples):\n            if padding:\n                input_ids = pad_func([example['input_ids'] for example in batch_examples])\n                token_type_ids = pad_func([example['token_type_ids'] for example in batch_examples])\n                position_ids = pad_func([example['position_ids'] for example in batch_examples])\n                attention_mask = pad_mask([example['attention_mask'] for example in batch_examples])\n            else:\n                input_ids = np.asarray([example['input_ids'] for example in batch_examples], dtype=np.int64)\n                token_type_ids = np.asarray([example['token_type_ids'] for example in batch_examples], dtype=np.int64)\n                position_ids = np.asarray([example['position_ids'] for example in batch_examples], dtype=np.int64)\n                attention_mask = np.asarray([example['attention_mask'] for example in batch_examples])\n                attention_mask = np.expand_dims(attention_mask, 0)\n\n            return input_ids, token_type_ids, position_ids, attention_mask\n\n 
       examples = []\n        for texts in data:\n            examples.append(self._convert_text_to_input(texts, max_seq_len))\n\n        # Separate the data into batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    @contextlib.contextmanager\n    def interactive_mode(self, max_turn=3):\n        \"\"\"\n        Enter the interactive mode.\n        \"\"\"\n        self._interactive_mode = True\n        self.max_turn = max_turn\n        self.context = deque(maxlen=self.max_turn)\n        yield\n        self.context.clear()\n        self._interactive_mode = False\n\n    def forward(self,\n                input_ids,\n                token_type_ids,\n                position_ids,\n                attention_mask,\n                max_length=64,\n                min_length=1,\n                decode_strategy='sampling',\n                temperature=1.0,\n                top_k=5,\n                top_p=1.0,\n                num_beams=0,\n                length_penalty=1.0,\n                early_stopping=False,\n                num_return_sequences=1):\n\n        ids, scores = self.model.generate(\n            input_ids=input_ids,\n            token_type_ids=token_type_ids,\n            position_ids=position_ids,\n            attention_mask=attention_mask,\n            max_length=max_length,\n            min_length=min_length,\n            decode_strategy=decode_strategy,\n            temperature=temperature,\n            top_k=top_k,\n            top_p=top_p,\n            num_beams=num_beams,\n            length_penalty=length_penalty,\n            early_stopping=early_stopping,\n            num_return_sequences=num_return_sequences)\n\n        return ids, scores\n\n    @serving\n    def predict(self,\n                data: 
Union[List[List[str]], str],\n                max_seq_len: int = 512,\n                batch_size: int = 1,\n                use_gpu: bool = False,\n                **kwargs):\n\n        if self._interactive_mode:\n            if isinstance(data, str):\n                self.context.append(data.strip())\n                data = [list(self.context)]\n            else:\n                raise ValueError(\"In the interactive mode, the input data should be a string.\")\n        elif not isinstance(data, list):\n            raise ValueError(\"If not in the interactive mode, the input data should be a list.\")\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, max_seq_len, batch_size)\n\n        results = []\n        self.eval()\n        for batch in batches:\n            input_ids, token_type_ids, position_ids, attention_mask = map(paddle.to_tensor, batch)\n            ids, scores = self(input_ids, token_type_ids, position_ids, attention_mask, **kwargs)\n            num_return_sequences = 1 if 'num_return_sequences' not in kwargs\\\n                else kwargs['num_return_sequences']\n            results.extend(\n                select_response(\n                    ids, scores, self.tokenizer, num_return_sequences=num_return_sequences, keep_space=False))\n\n        if self._interactive_mode:\n            self.context.append(results[0].strip())\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn/requirements.txt",
    "content": "sentencepiece\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\n\ndef post_process_response(token_ids: List[int], tokenizer):\n    '''\n    Post-process the decoded sequence. Truncate from the first <eos>.\n    '''\n    eos_pos = len(token_ids)\n    for i, tok_id in enumerate(token_ids):\n        if tok_id == tokenizer.sep_token_id:\n            eos_pos = i\n            break\n    token_ids = token_ids[:eos_pos]\n    tokens = tokenizer.convert_ids_to_tokens(token_ids)\n    tokens = tokenizer.merge_subword(tokens)\n    return token_ids, tokens\n\n\ndef get_in_turn_repetition(pred: List[str], is_cn: bool = False):\n    '''\n    Get in-turn repetition.\n    '''\n    if len(pred) == 0:\n        return 1.0\n    if isinstance(pred[0], str):\n        pred = [tok.lower() for tok in pred]\n        if is_cn:\n            pred = \"\".join(pred)\n    tri_grams = set()\n    for i in range(len(pred) - 2):\n        tri_gram = tuple(pred[i:i + 3])\n        if tri_gram in tri_grams:\n            return True\n        tri_grams.add(tri_gram)\n    return False\n\n\ndef select_response(ids,\n                    scores: List[float],\n                    tokenizer,\n                    max_dec_len: int = None,\n                    num_return_sequences: int = 1,\n                    keep_space: bool = True):\n    '''\n    Select response with the highest score.\n    '''\n    ids = 
ids.numpy().tolist()\n    scores = scores.numpy()\n\n    if len(ids) != len(scores) or (len(ids) % num_return_sequences) != 0:\n        raise ValueError(\"the length of `ids` is {}, but the `num_return_sequences` is {}\".format(\n            len(ids), num_return_sequences))\n\n    group = []\n    tmp = []\n    for pred, score in zip(ids, scores):\n        pred_token_ids, pred_tokens = post_process_response(pred, tokenizer)\n        num_token = len(pred_token_ids)\n        if keep_space:\n            response = \" \".join(pred_tokens)\n        else:\n            response = \"\".join(pred_tokens)\n\n        in_turn_repetition = get_in_turn_repetition(pred_tokens, True) or get_in_turn_repetition(pred_token_ids)\n        # not ending\n        if max_dec_len is not None and num_token >= max_dec_len:\n            score -= 1e3\n        elif in_turn_repetition:\n            score -= 1e3\n\n        tmp.append([response, score])\n        if len(tmp) == num_return_sequences:\n            group.append(tmp)\n            tmp = []\n\n    results = []\n    for preds in group:\n        preds = sorted(preds, key=lambda x: -x[1])\n        results.append(preds[0][0])\n    return results\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn-luge/README.md",
    "content": "```shell\n$ hub install unified_transformer_12L_cn_luge==1.0.0\n```\n\n## 概述\n\n近年来，人机对话系统受到了学术界和产业界的广泛关注并取得了不错的发展。开放域对话系统旨在建立一个开放域的多轮对话系统，使得机器可以流畅自然地与人进行语言交互，既可以进行日常问候类的闲聊，又可以完成特定功能，以使得开放域对话系统具有实际应用价值。具体的说，开放域对话可以继续拆分为支持不同功能的对话形式，例如对话式推荐，知识对话技术等，如何解决并有效融合以上多个技能面临诸多挑战。\n\n[UnifiedTransformer](https://arxiv.org/abs/2006.16779)以[Transformer](https://arxiv.org/abs/1706.03762) 编码器为网络基本组件，采用灵活的注意力机制，十分适合文本生成任务，并在模型输入中加入了标识不同对话技能的special token，使得模型能同时支持闲聊对话、推荐对话和知识对话。\n\nunified_transformer_12L_cn_luge包含12层的transformer结构，头数为12，隐藏层参数为768，参数量为132M。该预训练模型使用了样本量为60M的文本数据和20M的对话数据的大型中文对话数据集进行预训练，然后在luge-dialogue的训练集合上进行finetune，具体训练详情可以查看[LUGE-Dialogue](https://github.com/PaddlePaddle/Knover/tree/luge-dialogue/luge-dialogue)。\n\n## API\n\n```python\ndef predict(data: Union[List[List[str]], str],\n            max_seq_len: int = 512,\n            batch_size: int = 1,\n            use_gpu: bool = False,\n            **kwargs):\n```\n预测API，输入对话上下文，输出机器回复。\n\n**参数**\n- `data`(Union[List[List[str]], str]): 在非交互模式中，数据类型为List[List[str]]，每个样本是一个List[str]，表示为对话内容\n- `max_seq_len`(int): 每个样本的最大文本长度\n- `batch_size`(int): 进行预测的batch_size\n- `use_gpu`(bool): 是否使用gpu执行预测\n- `kwargs`: 预测时传给模型的额外参数，以keyword方式传递。其余的参数详情请查看[UnifiedTransformer](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/dialogue/unified_transformer)。\n\n**返回**\n* `results`(List[str]): 每个元素为相应对话中模型的新回复\n\n```python\ndef interactive_mode(max_turn=3)\n```\n进入交互模式。交互模式中，predict接口的data将支持字符串类型。\n\n**参数**\n- `max_turn`(int): 模型能记忆的对话轮次，当`max_turn`为1时，模型只能记住当前对话，无法获知之前的对话内容。\n\n\n**代码示例**\n\n```python\n# 非交互模式\nimport paddlehub as hub\n\nmodel = hub.Module(name='unified_transformer_12L_cn_luge')\ndata = [[\"你是谁？\"], [\"你好啊。\", \"吃饭了吗？\",]]\nresult = model.predict(data)\n```\n\n```python\n# 交互模式\nimport paddlehub as hub\n\nmodel = hub.Module(name='unified_transformer_12L_cn_luge')\nwith model.interactive_mode(max_turn=3):\n    while True:\n        human_utterance = input(\"[Human]: \").strip()\n      
  robot_utterance = model.predict(human_utterance)[0]\n        print(\"[Bot]: %s\"%robot_utterance)\n```\n\n## 服务部署\n\nPaddleHub Serving可以部署在线服务。\n\n### Step1: 启动PaddleHub Serving\n\n运行启动命令：\n\n```shell\n$ hub serving start -m unified_transformer_12L_cn_luge\n```\n\n这样就完成了一个对话机器人服务化API的部署，默认端口号为8866。\n\n**NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n### Step2: 发送预测请求\n\n配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n```python\nimport requests\nimport json\n\ntexts = [[\"今天是个好日子\"], [\"天气预报说今天要下雨\"]]\ndata = {\"data\": texts}\n# 发送post请求，content-type类型应指定json方式，url中的ip地址需改为对应机器的ip\nurl = \"http://127.0.0.1:8866/predict/unified_transformer_12L_cn_luge\"\n# 指定post请求的headers为application/json方式\nheaders = {\"Content-Type\": \"application/json\"}\n\nr = requests.post(url=url, headers=headers, data=json.dumps(data))\nprint(r.json())\n```\n\n## 查看代码\n\nhttps://github.com/PaddlePaddle/Knover\n\n## 依赖\n\npaddlepaddle >= 2.0.0\n\npaddlehub >= 2.1.0\n\n## 更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn-luge/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn-luge/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nfrom collections import deque\nfrom typing import List, Union\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddlehub.module.module import moduleinfo, serving\nfrom paddlenlp.data import Pad\nfrom paddlenlp.transformers import UnifiedTransformerLMHeadModel, UnifiedTransformerTokenizer\n\nfrom unified_transformer_12L_cn_luge.utils import select_response\n\n\n@moduleinfo(\n    name=\"unified_transformer_12L_cn_luge\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"paddlepaddle\",\n    author_email=\"\",\n    type=\"nlp/text_generation\",\n)\nclass UnifiedTransformer(nn.Layer):\n    def __init__(self):\n        super(UnifiedTransformer, self).__init__()\n\n        self.model = UnifiedTransformerLMHeadModel.from_pretrained('unified_transformer-12L-cn-luge')\n        self.tokenizer = UnifiedTransformerTokenizer.from_pretrained('unified_transformer-12L-cn-luge')\n        self._interactive_mode = False\n\n    def _convert_text_to_input(self, texts: List[str], max_seq_len: int):\n        \"\"\"\n        Convert input strings to tokens.\n        \"\"\"\n        return self.tokenizer.dialogue_encode(\n            texts, max_seq_len=max_seq_len, add_start_token_as_response=True, is_split_into_words=False)\n\n    def _batchify(self, data: List[List[str]], max_seq_len: int, batch_size: int):\n    
    \"\"\"\n        Generate input batches.\n        \"\"\"\n        padding = False if batch_size == 1 else True\n        pad_func = Pad(pad_val=self.tokenizer.pad_token_id, pad_right=False, dtype=np.int64)\n\n        def pad_mask(batch_attention_mask):\n            batch_size = len(batch_attention_mask)\n            max_len = max(map(len, batch_attention_mask))\n            attention_mask = np.ones((batch_size, max_len, max_len), dtype='float32') * -1e9\n            for i, mask_data in enumerate(attention_mask):\n                seq_len = len(batch_attention_mask[i])\n                mask_data[-seq_len:, -seq_len:] = np.array(batch_attention_mask[i], dtype='float32')\n            # In order to ensure the correct broadcasting mechanism, expand one\n            # dimension to the second dimension (n_head of Transformer).\n            attention_mask = np.expand_dims(attention_mask, axis=1)\n            return attention_mask\n\n        def _parse_batch(batch_examples):\n            if padding:\n                input_ids = pad_func([example['input_ids'] for example in batch_examples])\n                token_type_ids = pad_func([example['token_type_ids'] for example in batch_examples])\n                position_ids = pad_func([example['position_ids'] for example in batch_examples])\n                attention_mask = pad_mask([example['attention_mask'] for example in batch_examples])\n            else:\n                input_ids = np.asarray([example['input_ids'] for example in batch_examples], dtype=np.int64)\n                token_type_ids = np.asarray([example['token_type_ids'] for example in batch_examples], dtype=np.int64)\n                position_ids = np.asarray([example['position_ids'] for example in batch_examples], dtype=np.int64)\n                attention_mask = np.asarray([example['attention_mask'] for example in batch_examples])\n                attention_mask = np.expand_dims(attention_mask, 0)\n\n            return input_ids, token_type_ids, 
position_ids, attention_mask\n\n        examples = []\n        for texts in data:\n            examples.append(self._convert_text_to_input(texts, max_seq_len))\n\n        # Separate the data into batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            yield _parse_batch(one_batch)\n\n    @contextlib.contextmanager\n    def interactive_mode(self, max_turn=3):\n        \"\"\"\n        Enter the interactive mode.\n        \"\"\"\n        self._interactive_mode = True\n        self.max_turn = max_turn\n        self.context = deque(maxlen=self.max_turn)\n        yield\n        self.context.clear()\n        self._interactive_mode = False\n\n    def forward(self,\n                input_ids,\n                token_type_ids,\n                position_ids,\n                attention_mask,\n                max_length=64,\n                min_length=1,\n                decode_strategy='sampling',\n                temperature=1.0,\n                top_k=5,\n                top_p=1.0,\n                num_beams=0,\n                length_penalty=1.0,\n                early_stopping=False,\n                num_return_sequences=1):\n\n        ids, scores = self.model.generate(\n            input_ids=input_ids,\n            token_type_ids=token_type_ids,\n            position_ids=position_ids,\n            attention_mask=attention_mask,\n            max_length=max_length,\n            min_length=min_length,\n            decode_strategy=decode_strategy,\n            temperature=temperature,\n            top_k=top_k,\n            top_p=top_p,\n            num_beams=num_beams,\n            length_penalty=length_penalty,\n            early_stopping=early_stopping,\n            num_return_sequences=num_return_sequences)\n\n        return ids, scores\n\n    @serving\n    def predict(self,\n                data: Union[List[List[str]], str],\n                max_seq_len: int = 512,\n                batch_size: int = 1,\n                use_gpu: bool = False,\n                **kwargs):\n\n        if self._interactive_mode:\n            if isinstance(data, str):\n                self.context.append(data.strip())\n                data = [list(self.context)]\n            else:\n                raise ValueError(\"In the interactive mode, the input data should be a string.\")\n        elif not isinstance(data, list):\n            raise ValueError(\"If not in the interactive mode, the input data should be a list.\")\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, max_seq_len, batch_size)\n\n        results = []\n        self.eval()\n        for batch in batches:\n            input_ids, token_type_ids, position_ids, attention_mask = map(paddle.to_tensor, batch)\n            ids, scores = self(input_ids, token_type_ids, position_ids, attention_mask, **kwargs)\n            num_return_sequences = 1 if 'num_return_sequences' not in kwargs\\\n                else kwargs['num_return_sequences']\n            results.extend(\n                select_response(\n                    ids, scores, self.tokenizer, num_return_sequences=num_return_sequences, keep_space=False))\n\n        if self._interactive_mode:\n            self.context.append(results[0].strip())\n\n        return results\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn-luge/requirements.txt",
    "content": "sentencepiece\n"
  },
  {
    "path": "modules/text/text_generation/unified_transformer-12L-cn-luge/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\n\ndef post_process_response(token_ids: List[int], tokenizer):\n    '''\n    Post-process the decoded sequence. Truncate from the first <eos>.\n    '''\n    eos_pos = len(token_ids)\n    for i, tok_id in enumerate(token_ids):\n        if tok_id == tokenizer.sep_token_id:\n            eos_pos = i\n            break\n    token_ids = token_ids[:eos_pos]\n    tokens = tokenizer.convert_ids_to_tokens(token_ids)\n    tokens = tokenizer.merge_subword(tokens)\n    return token_ids, tokens\n\n\ndef get_in_turn_repetition(pred: List[str], is_cn: bool = False):\n    '''\n    Check whether the prediction contains an in-turn (tri-gram) repetition.\n    '''\n    if len(pred) == 0:\n        return True\n    if isinstance(pred[0], str):\n        pred = [tok.lower() for tok in pred]\n        if is_cn:\n            pred = \"\".join(pred)\n    tri_grams = set()\n    for i in range(len(pred) - 2):\n        tri_gram = tuple(pred[i:i + 3])\n        if tri_gram in tri_grams:\n            return True\n        tri_grams.add(tri_gram)\n    return False\n\n\ndef select_response(ids,\n                    scores: List[float],\n                    tokenizer,\n                    max_dec_len: int = None,\n                    num_return_sequences: int = 1,\n                    keep_space: bool = True):\n    '''\n    Select the response with the highest score.\n    '''\n    ids = ids.numpy().tolist()\n    scores = scores.numpy()\n\n    if len(ids) != len(scores) or (len(ids) % num_return_sequences) != 0:\n        raise ValueError(\"`ids` has length {} and `scores` has length {}; they must be equal and divisible by `num_return_sequences` ({})\".format(\n            len(ids), len(scores), num_return_sequences))\n\n    group = []\n    tmp = []\n    for pred, score in zip(ids, scores):\n        pred_token_ids, pred_tokens = post_process_response(pred, tokenizer)\n        num_token = len(pred_token_ids)\n        if keep_space:\n            response = \" \".join(pred_tokens)\n        else:\n            response = \"\".join(pred_tokens)\n\n        in_turn_repetition = get_in_turn_repetition(pred_tokens, True) or get_in_turn_repetition(pred_token_ids)\n        # Penalize responses that hit max_dec_len (unfinished) or repeat in turn.\n        if max_dec_len is not None and num_token >= max_dec_len:\n            score -= 1e3\n        elif in_turn_repetition:\n            score -= 1e3\n\n        tmp.append([response, score])\n        if len(tmp) == num_return_sequences:\n            group.append(tmp)\n            tmp = []\n\n    results = []\n    for preds in group:\n        preds = sorted(preds, key=lambda x: -x[1])\n        results.append(preds[0][0])\n    return results\n"
  },
  {
    "path": "modules/text/text_review/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【文本审核】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 文本审核\n\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [色情识别-LSTM](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_lstm&en_category=TextCensorship) | 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文爱进行识别，核心采用LSTM网络结构并按字粒度进行切词。 |\n| [色情识别-GRU](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_gru&en_category=TextCensorship) | 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文爱进行识别，核心采用GRU网络结构并按字粒度进行切词。 |\n| [色情识别-CNN](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_cnn&en_category=TextCensorship) | 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文爱进行识别，核心采用CNN网络结构并按字粒度进行切词。|\n"
  },
  {
    "path": "modules/text/text_review/README_en.md",
    "content": "## **For a better user experience, refer to the official web documentation -> [Text Review](https://www.paddlepaddle.org.cn/hublist)**\n\n### Text Review\n\n- Recommended Models\n\n| Model Name                                                   | Model Introduction                                           |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [Porn recognition - LSTM](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_lstm&en_category=TextCensorship) | The porn detection model can automatically determine whether a text is pornographic and give the corresponding confidence level, identifying pornographic descriptions, vulgar dating, and obscene content in the text. The core uses the LSTM network structure and performs word segmentation at character granularity. |\n| [Porn recognition - GRU](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_gru&en_category=TextCensorship) | The porn detection model can automatically determine whether a text is pornographic and give the corresponding confidence level, identifying pornographic descriptions, vulgar dating, and obscene content in the text. The core uses the GRU network structure and performs word segmentation at character granularity. |\n| [Porn recognition - CNN](https://www.paddlepaddle.org.cn/hubdetail?name=porn_detection_cnn&en_category=TextCensorship) | The porn detection model can automatically determine whether a text is pornographic and give the corresponding confidence level, identifying pornographic descriptions, vulgar dating, and obscene content in the text. The core uses the CNN network structure and performs word segmentation at character granularity. |\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/README.md",
    "content": "# porn_detection_cnn\n\n| 模型名称            |  porn_detection_cnn  |\n| :------------------ | :------------: |\n| 类别                | 文本-文本审核  |\n| 网络                |      CNN     |\n| 数据集              | 百度自建数据集 |\n| 是否支持Fine-tuning |       否       |\n| 模型大小            |       20M       |\n| 最新更新日期        |   2021-02-26   |\n| 数据指标            |       -        |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别。\n  - porn_detection_cnn采用CNN网络结构并按字粒度进行切词，具有较高的预测速度。该模型最大句子长度为256字，仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install porn_detection_cnn\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run porn_detection_cnn --input_text \"黄片下载\"\n    ```\n\n  - 或者\n\n  - ```shell\n    $ hub run porn_detection_cnn --input_file test.txt\n    ```\n\n    - 其中test.txt存放待审查文本，每行仅放置一段待审核文本\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    porn_detection_cnn = hub.Module(name=\"porn_detection_cnn\")\n\n    test_text = [\"黄片下载\", \"打击黄牛党\"]\n\n    results = porn_detection_cnn.detection(texts=test_text, use_gpu=True, batch_size=1)\n\n    for index, text in enumerate(test_text):\n        results[index][\"text\"] = text\n    for index, result in enumerate(results):\n        print(results[index])\n\n    # 输出结果如下：\n    # {'text': '黄片下载', 'porn_detection_label': 1, 'porn_detection_key': 'porn', 'porn_probs': 0.9324, 'not_porn_probs': 0.0676}\n    # {'text': '打击黄牛党', 'porn_detection_label': 0, 
'porn_detection_key': 'not_porn', 'porn_probs': 0.0004, 'not_porn_probs': 0.9996}\n    ```\n\n\n- ### 3、API\n\n  - ```python\n    def detection(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - porn_detection_cnn预测接口，鉴定输入句子是否包含色情文案\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n\n      - data(dict): 预测数据，key必须为text，value是待预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n\n      - use_gpu(bool): 是否使用GPU预测；如果使用GPU预测，则在预测之前请设置CUDA_VISIBLE_DEVICES环境变量，否则无需设置\n\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 鉴定结果\n\n\n  - ```python\n    def get_labels()\n    ```\n    - 获取porn_detection_cnn的类别\n\n    - **返回**\n\n      - labels(dict): porn_detection_cnn的类别（二分类，是/不是）\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线色情文案检测服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m porn_detection_cnn  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n  - ```shell\n    Loading porn_detection_cnn successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量；如不使用GPU，则无需设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"黄片下载\", \"打击黄牛党\"]\n\n    # 设置运行配置\n    # 对应本地预测porn_detection_cnn.detection(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为porn_detection_cnn并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/porn_detection_cnn\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub 
Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  大幅提升预测性能，同时简化接口使用\n\n* 1.2.0\n\n  移除 fluid api\n\n  - ```shell\n    $ hub install porn_detection_cnn==1.2.0\n    ```\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/assets/params.txt",
    "content": "@HUB_porn_detection_cnn@layer_norm_1.b_0\n@HUB_porn_detection_cnn@fc_0.w_0\n@HUB_porn_detection_cnn@embedding_0.w_0\n@HUB_porn_detection_cnn@fc_1.w_0\n@HUB_porn_detection_cnn@layer_norm_1.w_0\n@HUB_porn_detection_cnn@layer_norm_0.w_0\n@HUB_porn_detection_cnn@fc_1.b_0\n@HUB_porn_detection_cnn@sequence_conv_0.b_0\n@HUB_porn_detection_cnn@sequence_conv_0.w_0\n@HUB_porn_detection_cnn@layer_norm_0.b_0\n@HUB_porn_detection_cnn@fc_0.b_0\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/assets/vocab.txt",
    "content": "[PAD]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[unused99]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n<S>\n<T>\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\n[\n\\\n]\n^\n_\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n£\n¤\n¥\n§\n©\n«\n®\n°\n±\n²\n³\nµ\n·\n¹\nº\n»\n¼\n×\nß\næ\n÷\nø\nđ\nŋ\nɔ\nə\nɡ\nʰ\nˇ\nˈ\nˊ\nˋ\nˍ\nː\n˙\n˚\nˢ\nα\nβ\nγ\nδ\nε\nη\nθ\nι\nκ\nλ\nμ\nν\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nа\nб\nв\nг\nд\nе\nж\nз\nи\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nы\nь\nя\nі\nا\nب\nة\nت\nد\nر\nس\nع\nل\nم\nن\nه\nو\nي\n۩\nก\nง\nน\nม\nย\nร\nอ\nา\nเ\n๑\n་\nღ\nᄀ\nᄁ\nᄂ\nᄃ\nᄅ\nᄆ\nᄇ\nᄈ\nᄉ\nᄋ\nᄌ\nᄎ\nᄏ\nᄐ\nᄑ\nᄒ\nᅡ\nᅢ\nᅣ\nᅥ\nᅦ\nᅧ\nᅨ\nᅩ\nᅪ\nᅬ\nᅭ\nᅮ\nᅯ\nᅲ\nᅳ\nᅴ\nᅵ\nᆨ\nᆫ\nᆯ\nᆷ\nᆸ\nᆺ\nᆻ\nᆼ\nᗜ\nᵃ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵘ\n‖\n„\n†\n•\n‥\n‧\n 
\n‰\n′\n″\n‹\n›\n※\n‿\n⁄\nⁱ\n⁺\nⁿ\n₁\n₂\n₃\n₄\n€\n℃\n№\n™\nⅰ\nⅱ\nⅲ\nⅳ\nⅴ\n←\n↑\n→\n↓\n↔\n↗\n↘\n⇒\n∀\n−\n∕\n∙\n√\n∞\n∟\n∠\n∣\n∥\n∩\n∮\n∶\n∼\n∽\n≈\n≒\n≡\n≤\n≥\n≦\n≧\n≪\n≫\n⊙\n⋅\n⋈\n⋯\n⌒\n①\n②\n③\n④\n⑤\n⑥\n⑦\n⑧\n⑨\n⑩\n⑴\n⑵\n⑶\n⑷\n⑸\n⒈\n⒉\n⒊\n⒋\nⓒ\nⓔ\nⓘ\n─\n━\n│\n┃\n┅\n┆\n┊\n┌\n└\n├\n┣\n═\n║\n╚\n╞\n╠\n╭\n╮\n╯\n╰\n╱\n╳\n▂\n▃\n▅\n▇\n█\n▉\n▋\n▌\n▍\n▎\n■\n□\n▪\n▫\n▬\n▲\n△\n▶\n►\n▼\n▽\n◆\n◇\n○\n◎\n●\n◕\n◠\n◢\n◤\n☀\n★\n☆\n☕\n☞\n☺\n☼\n♀\n♂\n♠\n♡\n♣\n♥\n♦\n♪\n♫\n♬\n✈\n✔\n✕\n✖\n✦\n✨\n✪\n✰\n✿\n❀\n❤\n➜\n➤\n⦿\n、\n。\n〃\n々\n〇\n〈\n〉\n《\n》\n「\n」\n『\n』\n【\n】\n〓\n〔\n〕\n〖\n〗\n〜\n〝\n〞\nぁ\nあ\nぃ\nい\nう\nぇ\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nっ\nつ\nて\nと\nな\nに\nぬ\nね\nの\nは\nひ\nふ\nへ\nほ\nま\nみ\nむ\nめ\nも\nゃ\nや\nゅ\nゆ\nょ\nよ\nら\nり\nる\nれ\nろ\nわ\nを\nん\n゜\nゝ\nァ\nア\nィ\nイ\nゥ\nウ\nェ\nエ\nォ\nオ\nカ\nキ\nク\nケ\nコ\nサ\nシ\nス\nセ\nソ\nタ\nチ\nッ\nツ\nテ\nト\nナ\nニ\nヌ\nネ\nノ\nハ\nヒ\nフ\nヘ\nホ\nマ\nミ\nム\nメ\nモ\nャ\nヤ\nュ\nユ\nョ\nヨ\nラ\nリ\nル\nレ\nロ\nワ\nヲ\nン\nヶ\n・\nー\nヽ\nㄅ\nㄆ\nㄇ\nㄉ\nㄋ\nㄌ\nㄍ\nㄎ\nㄏ\nㄒ\nㄚ\nㄛ\nㄞ\nㄟ\nㄢ\nㄤ\nㄥ\nㄧ\nㄨ\nㆍ\n㈦\n㊣\n㎡\n㗎\n一\n丁\n七\n万\n丈\n三\n上\n下\n不\n与\n丐\n丑\n专\n且\n丕\n世\n丘\n丙\n业\n丛\n东\n丝\n丞\n丟\n両\n丢\n两\n严\n並\n丧\n丨\n个\n丫\n中\n丰\n串\n临\n丶\n丸\n丹\n为\n主\n丼\n丽\n举\n丿\n乂\n乃\n久\n么\n义\n之\n乌\n乍\n乎\n乏\n乐\n乒\n乓\n乔\n乖\n乗\n乘\n乙\n乜\n九\n乞\n也\n习\n乡\n书\n乩\n买\n乱\n乳\n乾\n亀\n亂\n了\n予\n争\n事\n二\n于\n亏\n云\n互\n五\n井\n亘\n亙\n亚\n些\n亜\n亞\n亟\n亡\n亢\n交\n亥\n亦\n产\n亨\n亩\n享\n京\n亭\n亮\n亲\n亳\n亵\n人\n亿\n什\n仁\n仃\n仄\n仅\n仆\n仇\n今\n介\n仍\n从\n仏\n仑\n仓\n仔\n仕\n他\n仗\n付\n仙\n仝\n仞\n仟\n代\n令\n以\n仨\n仪\n们\n仮\n仰\n仲\n件\n价\n任\n份\n仿\n企\n伉\n伊\n伍\n伎\n伏\n伐\n休\n伕\n众\n优\n伙\n会\n伝\n伞\n伟\n传\n伢\n伤\n伦\n伪\n伫\n伯\n估\n伴\n伶\n伸\n伺\n似\n伽\n佃\n但\n佇\n佈\n位\n低\n住\n佐\n佑\n体\n佔\n何\n佗\n佘\n余\n佚\n佛\n作\n佝\n佞\n佟\n你\n佢\n佣\n佤\n佥\n佩\n佬\n佯\n佰\n佳\n併\n佶\n佻\n佼\n使\n侃\n侄\n來\n侈\n例\n侍\n侏\n侑\n侖\n侗\n供\n依\n侠\n価\n侣\n侥\n侦\n侧\n侨\n侬\n侮\n侯\n侵\n侶\n侷\n便\n係\n促\n俄\n俊\n俎\n俏\n俐\n俑\n俗\n俘\n俚\n保\n俞\n俟\n俠\n信\n俨\n俩\n俪\n俬\n俭\n修\n俯\n俱\n俳\n俸\n俺\n俾\n倆\n倉\n個\n倌\n倍\n倏\n們\n倒\n倔\n倖\n倘\n候\n倚\n倜\n借\n倡\n値\n倦\n倩\n倪\n倫\n倬\n倭\n倶\n债\n值\n倾\n偃\n假\n偈\n偉\n偌\n偎\n偏\n偕\n做\n停\n健\n側\n偵\n偶\n偷\n偻\n偽\n偿\n傀\n傅\n傍\n傑\n傘\n備\n傚\n傢\n傣\n傥\n储\n傩\n催\n傭\n傲\n傳\n債\n傷\n傻\n傾\n僅\n働\n像\n僑\n
僕\n僖\n僚\n僥\n僧\n僭\n僮\n僱\n僵\n價\n僻\n儀\n儂\n億\n儆\n儉\n儋\n儒\n儕\n儘\n償\n儡\n優\n儲\n儷\n儼\n儿\n兀\n允\n元\n兄\n充\n兆\n兇\n先\n光\n克\n兌\n免\n児\n兑\n兒\n兔\n兖\n党\n兜\n兢\n入\n內\n全\n兩\n八\n公\n六\n兮\n兰\n共\n兲\n关\n兴\n兵\n其\n具\n典\n兹\n养\n兼\n兽\n冀\n内\n円\n冇\n冈\n冉\n冊\n册\n再\n冏\n冒\n冕\n冗\n写\n军\n农\n冠\n冢\n冤\n冥\n冨\n冪\n冬\n冯\n冰\n冲\n决\n况\n冶\n冷\n冻\n冼\n冽\n冾\n净\n凄\n准\n凇\n凈\n凉\n凋\n凌\n凍\n减\n凑\n凛\n凜\n凝\n几\n凡\n凤\n処\n凪\n凭\n凯\n凰\n凱\n凳\n凶\n凸\n凹\n出\n击\n函\n凿\n刀\n刁\n刃\n分\n切\n刈\n刊\n刍\n刎\n刑\n划\n列\n刘\n则\n刚\n创\n初\n删\n判\n別\n刨\n利\n刪\n别\n刮\n到\n制\n刷\n券\n刹\n刺\n刻\n刽\n剁\n剂\n剃\n則\n剉\n削\n剋\n剌\n前\n剎\n剐\n剑\n剔\n剖\n剛\n剜\n剝\n剣\n剤\n剥\n剧\n剩\n剪\n副\n割\n創\n剷\n剽\n剿\n劃\n劇\n劈\n劉\n劊\n劍\n劏\n劑\n力\n劝\n办\n功\n加\n务\n劣\n动\n助\n努\n劫\n劭\n励\n劲\n劳\n労\n劵\n効\n劾\n势\n勁\n勃\n勇\n勉\n勋\n勐\n勒\n動\n勖\n勘\n務\n勛\n勝\n勞\n募\n勢\n勤\n勧\n勳\n勵\n勸\n勺\n勻\n勾\n勿\n匀\n包\n匆\n匈\n匍\n匐\n匕\n化\n北\n匙\n匝\n匠\n匡\n匣\n匪\n匮\n匯\n匱\n匹\n区\n医\n匾\n匿\n區\n十\n千\n卅\n升\n午\n卉\n半\n卍\n华\n协\n卑\n卒\n卓\n協\n单\n卖\n南\n単\n博\n卜\n卞\n卟\n占\n卡\n卢\n卤\n卦\n卧\n卫\n卮\n卯\n印\n危\n即\n却\n卵\n卷\n卸\n卻\n卿\n厂\n厄\n厅\n历\n厉\n压\n厌\n厕\n厘\n厚\n厝\n原\n厢\n厥\n厦\n厨\n厩\n厭\n厮\n厲\n厳\n去\n县\n叁\n参\n參\n又\n叉\n及\n友\n双\n反\n収\n发\n叔\n取\n受\n变\n叙\n叛\n叟\n叠\n叡\n叢\n口\n古\n句\n另\n叨\n叩\n只\n叫\n召\n叭\n叮\n可\n台\n叱\n史\n右\n叵\n叶\n号\n司\n叹\n叻\n叼\n叽\n吁\n吃\n各\n吆\n合\n吉\n吊\n吋\n同\n名\n后\n吏\n吐\n向\n吒\n吓\n吕\n吖\n吗\n君\n吝\n吞\n吟\n吠\n吡\n否\n吧\n吨\n吩\n含\n听\n吭\n吮\n启\n吱\n吳\n吴\n吵\n吶\n吸\n吹\n吻\n吼\n吽\n吾\n呀\n呂\n呃\n呆\n呈\n告\n呋\n呎\n呐\n呓\n呕\n呗\n员\n呛\n呜\n呢\n呤\n呦\n周\n呱\n呲\n味\n呵\n呷\n呸\n呻\n呼\n命\n咀\n咁\n咂\n咄\n咆\n咋\n和\n咎\n咏\n咐\n咒\n咔\n咕\n咖\n咗\n咘\n咙\n咚\n咛\n咣\n咤\n咦\n咧\n咨\n咩\n咪\n咫\n咬\n咭\n咯\n咱\n咲\n咳\n咸\n咻\n咽\n咿\n哀\n品\n哂\n哄\n哆\n哇\n哈\n哉\n哋\n哌\n响\n哎\n哏\n哐\n哑\n哒\n哔\n哗\n哟\n員\n哥\n哦\n哧\n哨\n哩\n哪\n哭\n哮\n哲\n哺\n哼\n哽\n唁\n唄\n唆\n唇\n唉\n唏\n唐\n唑\n唔\n唠\n唤\n唧\n唬\n售\n唯\n唰\n唱\n唳\n唷\n唸\n唾\n啃\n啄\n商\n啉\n啊\n問\n啓\n啕\n啖\n啜\n啞\n啟\n啡\n啤\n啥\n啦\n啧\n啪\n啫\n啬\n啮\n啰\n啱\n啲\n啵\n啶\n啷\n啸\n啻\n啼\n啾\n喀\n喂\n喃\n善\n喆\n喇\n喉\n喊\n喋\n喎\n喏\n喔\n喘\n喙\n喚\n喜\n喝\n喟\n喧\n喪\n喫\n喬\n單\n喰\n喱\n喲\n喳\n喵\n営\n喷\n喹\n喺\n喻\n喽\n嗅\n嗆\n嗇\n嗎\n嗑\n嗒\n嗓\n嗔\n嗖\n嗚\n嗜\n嗝\n嗟\n嗡\n嗣\n嗤\n嗦\n嗨\n嗪\n嗬\n嗯\n嗰\n嗲\n嗳\n嗶\n嗷\n嗽\n嘀\n嘅\n嘆\n嘈\n嘉\n嘌\n嘍\n嘎\n嘔\n嘖\n嘗\n嘘\n嘚\n嘛\n嘜\n嘞\n嘟\n嘢\n嘣\n嘤\n嘧\n嘩\n嘭\n嘮\n嘯\n嘰\n嘱\n嘲\n嘴\n嘶\n嘸\n嘹\
n嘻\n嘿\n噁\n噌\n噎\n噓\n噔\n噗\n噙\n噜\n噠\n噢\n噤\n器\n噩\n噪\n噬\n噱\n噴\n噶\n噸\n噹\n噻\n噼\n嚀\n嚇\n嚎\n嚏\n嚐\n嚓\n嚕\n嚟\n嚣\n嚥\n嚨\n嚮\n嚴\n嚷\n嚼\n囂\n囉\n囊\n囍\n囑\n囔\n囗\n囚\n四\n囝\n回\n囟\n因\n囡\n团\n団\n囤\n囧\n囪\n囫\n园\n困\n囱\n囲\n図\n围\n囹\n固\n国\n图\n囿\n圃\n圄\n圆\n圈\n國\n圍\n圏\n園\n圓\n圖\n團\n圜\n土\n圣\n圧\n在\n圩\n圭\n地\n圳\n场\n圻\n圾\n址\n坂\n均\n坊\n坍\n坎\n坏\n坐\n坑\n块\n坚\n坛\n坝\n坞\n坟\n坠\n坡\n坤\n坦\n坨\n坪\n坯\n坳\n坵\n坷\n垂\n垃\n垄\n型\n垒\n垚\n垛\n垠\n垢\n垣\n垦\n垩\n垫\n垭\n垮\n垵\n埂\n埃\n埋\n城\n埔\n埕\n埗\n域\n埠\n埤\n埵\n執\n埸\n培\n基\n埼\n堀\n堂\n堃\n堅\n堆\n堇\n堑\n堕\n堙\n堡\n堤\n堪\n堯\n堰\n報\n場\n堵\n堺\n堿\n塊\n塌\n塑\n塔\n塗\n塘\n塚\n塞\n塢\n塩\n填\n塬\n塭\n塵\n塾\n墀\n境\n墅\n墉\n墊\n墒\n墓\n増\n墘\n墙\n墜\n增\n墟\n墨\n墩\n墮\n墳\n墻\n墾\n壁\n壅\n壆\n壇\n壊\n壑\n壓\n壕\n壘\n壞\n壟\n壢\n壤\n壩\n士\n壬\n壮\n壯\n声\n売\n壳\n壶\n壹\n壺\n壽\n处\n备\n変\n复\n夏\n夔\n夕\n外\n夙\n多\n夜\n够\n夠\n夢\n夥\n大\n天\n太\n夫\n夭\n央\n夯\n失\n头\n夷\n夸\n夹\n夺\n夾\n奂\n奄\n奇\n奈\n奉\n奋\n奎\n奏\n奐\n契\n奔\n奕\n奖\n套\n奘\n奚\n奠\n奢\n奥\n奧\n奪\n奬\n奮\n女\n奴\n奶\n奸\n她\n好\n如\n妃\n妄\n妆\n妇\n妈\n妊\n妍\n妒\n妓\n妖\n妘\n妙\n妝\n妞\n妣\n妤\n妥\n妨\n妩\n妪\n妮\n妲\n妳\n妹\n妻\n妾\n姆\n姉\n姊\n始\n姍\n姐\n姑\n姒\n姓\n委\n姗\n姚\n姜\n姝\n姣\n姥\n姦\n姨\n姪\n姫\n姬\n姹\n姻\n姿\n威\n娃\n娄\n娅\n娆\n娇\n娉\n娑\n娓\n娘\n娛\n娜\n娟\n娠\n娣\n娥\n娩\n娱\n娲\n娴\n娶\n娼\n婀\n婁\n婆\n婉\n婊\n婕\n婚\n婢\n婦\n婧\n婪\n婭\n婴\n婵\n婶\n婷\n婺\n婿\n媒\n媚\n媛\n媞\n媧\n媲\n媳\n媽\n媾\n嫁\n嫂\n嫉\n嫌\n嫑\n嫔\n嫖\n嫘\n嫚\n嫡\n嫣\n嫦\n嫩\n嫲\n嫵\n嫻\n嬅\n嬉\n嬌\n嬗\n嬛\n嬢\n嬤\n嬪\n嬰\n嬴\n嬷\n嬸\n嬿\n孀\n孃\n子\n孑\n孔\n孕\n孖\n字\n存\n孙\n孚\n孛\n孜\n孝\n孟\n孢\n季\n孤\n学\n孩\n孪\n孫\n孬\n孰\n孱\n孳\n孵\n學\n孺\n孽\n孿\n宁\n它\n宅\n宇\n守\n安\n宋\n完\n宏\n宓\n宕\n宗\n官\n宙\n定\n宛\n宜\n宝\n实\n実\n宠\n审\n客\n宣\n室\n宥\n宦\n宪\n宫\n宮\n宰\n害\n宴\n宵\n家\n宸\n容\n宽\n宾\n宿\n寂\n寄\n寅\n密\n寇\n富\n寐\n寒\n寓\n寛\n寝\n寞\n察\n寡\n寢\n寥\n實\n寧\n寨\n審\n寫\n寬\n寮\n寰\n寵\n寶\n寸\n对\n寺\n寻\n导\n対\n寿\n封\n専\n射\n将\n將\n專\n尉\n尊\n尋\n對\n導\n小\n少\n尔\n尕\n尖\n尘\n尚\n尝\n尤\n尧\n尬\n就\n尴\n尷\n尸\n尹\n尺\n尻\n尼\n尽\n尾\n尿\n局\n屁\n层\n屄\n居\n屆\n屈\n屉\n届\n屋\n屌\n屍\n屎\n屏\n屐\n屑\n展\n屜\n属\n屠\n屡\n屢\n層\n履\n屬\n屯\n山\n屹\n屿\n岀\n岁\n岂\n岌\n岐\n岑\n岔\n岖\n岗\n岘\n岙\n岚\n岛\n岡\n岩\n岫\n岬\n岭\n岱\n岳\n岷\n岸\n峇\n峋\n峒\n峙\n峡\n峤\n峥\n峦\n峨\n峪\n峭\n峯\n峰\n峴\n島\n峻\n峽\n崁\n崂\n崆\n崇\n崎\n崑\n崔\n崖\n崗\n崙\n崛\n崧\n崩\n崭\n崴\n崽\n嵇\n嵊\n嵋\n嵌\n嵐\n嵘\n嵩\n嵬\n嵯\n嶂\n嶄\n嶇\n嶋\n嶙\n嶺\n嶼\n嶽\n巅\n巍\n巒\n巔\n巖\n川\n州\n巡\n巢\n工\n左\n巧\n巨\n巩
\n巫\n差\n己\n已\n巳\n巴\n巷\n巻\n巽\n巾\n巿\n币\n市\n布\n帅\n帆\n师\n希\n帐\n帑\n帕\n帖\n帘\n帚\n帛\n帜\n帝\n帥\n带\n帧\n師\n席\n帮\n帯\n帰\n帳\n帶\n帷\n常\n帼\n帽\n幀\n幂\n幄\n幅\n幌\n幔\n幕\n幟\n幡\n幢\n幣\n幫\n干\n平\n年\n并\n幸\n幹\n幺\n幻\n幼\n幽\n幾\n广\n庁\n広\n庄\n庆\n庇\n床\n序\n庐\n库\n应\n底\n庖\n店\n庙\n庚\n府\n庞\n废\n庠\n度\n座\n庫\n庭\n庵\n庶\n康\n庸\n庹\n庾\n廁\n廂\n廃\n廈\n廉\n廊\n廓\n廖\n廚\n廝\n廟\n廠\n廢\n廣\n廬\n廳\n延\n廷\n建\n廿\n开\n弁\n异\n弃\n弄\n弈\n弊\n弋\n式\n弑\n弒\n弓\n弔\n引\n弗\n弘\n弛\n弟\n张\n弥\n弦\n弧\n弩\n弭\n弯\n弱\n張\n強\n弹\n强\n弼\n弾\n彅\n彆\n彈\n彌\n彎\n归\n当\n录\n彗\n彙\n彝\n形\n彤\n彥\n彦\n彧\n彩\n彪\n彫\n彬\n彭\n彰\n影\n彷\n役\n彻\n彼\n彿\n往\n征\n径\n待\n徇\n很\n徉\n徊\n律\n後\n徐\n徑\n徒\n従\n徕\n得\n徘\n徙\n徜\n從\n徠\n御\n徨\n復\n循\n徬\n微\n徳\n徴\n徵\n德\n徹\n徼\n徽\n心\n必\n忆\n忌\n忍\n忏\n忐\n忑\n忒\n忖\n志\n忘\n忙\n応\n忠\n忡\n忤\n忧\n忪\n快\n忱\n念\n忻\n忽\n忿\n怀\n态\n怂\n怅\n怆\n怎\n怏\n怒\n怔\n怕\n怖\n怙\n怜\n思\n怠\n怡\n急\n怦\n性\n怨\n怪\n怯\n怵\n总\n怼\n恁\n恃\n恆\n恋\n恍\n恐\n恒\n恕\n恙\n恚\n恢\n恣\n恤\n恥\n恨\n恩\n恪\n恫\n恬\n恭\n息\n恰\n恳\n恵\n恶\n恸\n恺\n恻\n恼\n恿\n悄\n悅\n悉\n悌\n悍\n悔\n悖\n悚\n悟\n悠\n患\n悦\n您\n悩\n悪\n悬\n悯\n悱\n悲\n悴\n悵\n悶\n悸\n悻\n悼\n悽\n情\n惆\n惇\n惊\n惋\n惑\n惕\n惘\n惚\n惜\n惟\n惠\n惡\n惦\n惧\n惨\n惩\n惫\n惬\n惭\n惮\n惯\n惰\n惱\n想\n惴\n惶\n惹\n惺\n愁\n愆\n愈\n愉\n愍\n意\n愕\n愚\n愛\n愜\n感\n愣\n愤\n愧\n愫\n愷\n愿\n慄\n慈\n態\n慌\n慎\n慑\n慕\n慘\n慚\n慟\n慢\n慣\n慧\n慨\n慫\n慮\n慰\n慳\n慵\n慶\n慷\n慾\n憂\n憊\n憋\n憎\n憐\n憑\n憔\n憚\n憤\n憧\n憨\n憩\n憫\n憬\n憲\n憶\n憾\n懂\n懇\n懈\n應\n懊\n懋\n懑\n懒\n懦\n懲\n懵\n懶\n懷\n懸\n懺\n懼\n懾\n懿\n戀\n戈\n戊\n戌\n戍\n戎\n戏\n成\n我\n戒\n戕\n或\n战\n戚\n戛\n戟\n戡\n戦\n截\n戬\n戮\n戰\n戲\n戳\n戴\n戶\n户\n戸\n戻\n戾\n房\n所\n扁\n扇\n扈\n扉\n手\n才\n扎\n扑\n扒\n打\n扔\n払\n托\n扛\n扣\n扦\n执\n扩\n扪\n扫\n扬\n扭\n扮\n扯\n扰\n扱\n扳\n扶\n批\n扼\n找\n承\n技\n抄\n抉\n把\n抑\n抒\n抓\n投\n抖\n抗\n折\n抚\n抛\n抜\n択\n抟\n抠\n抡\n抢\n护\n报\n抨\n披\n抬\n抱\n抵\n抹\n押\n抽\n抿\n拂\n拄\n担\n拆\n拇\n拈\n拉\n拋\n拌\n拍\n拎\n拐\n拒\n拓\n拔\n拖\n拗\n拘\n拙\n拚\n招\n拜\n拟\n拡\n拢\n拣\n拥\n拦\n拧\n拨\n择\n括\n拭\n拮\n拯\n拱\n拳\n拴\n拷\n拼\n拽\n拾\n拿\n持\n挂\n指\n挈\n按\n挎\n挑\n挖\n挙\n挚\n挛\n挝\n挞\n挟\n挠\n挡\n挣\n挤\n挥\n挨\n挪\n挫\n振\n挲\n挹\n挺\n挽\n挾\n捂\n捅\n捆\n捉\n捋\n捌\n捍\n捎\n捏\n捐\n捕\n捞\n损\n捡\n换\n捣\n捧\n捨\n捩\n据\n捱\n捲\n捶\n捷\n捺\n捻\n掀\n掂\n掃\n掇\n授\n掉\n掌\n掏\n掐\n排\n掖\n掘\n掙\n掛\n掠\n採\n探\n掣\n接\n控\n推\n掩\n措\n掬\n掰\n掲\n掳\n掴\n掷\n掸\n掺\n揀\n揃\n揄\n揆\n揉\n揍\n描\n提\n插\n揖\n揚\n換\n握\n揣\n揩\n揪\n揭\n揮\n援\n揶\n揸\n揹\n揽\n搀\n搁\n搂\n搅\n
損\n搏\n搐\n搓\n搔\n搖\n搗\n搜\n搞\n搡\n搪\n搬\n搭\n搵\n搶\n携\n搽\n摀\n摁\n摄\n摆\n摇\n摈\n摊\n摒\n摔\n摘\n摞\n摟\n摧\n摩\n摯\n摳\n摸\n摹\n摺\n摻\n撂\n撃\n撅\n撇\n撈\n撐\n撑\n撒\n撓\n撕\n撚\n撞\n撤\n撥\n撩\n撫\n撬\n播\n撮\n撰\n撲\n撵\n撷\n撸\n撻\n撼\n撿\n擀\n擁\n擂\n擄\n擅\n擇\n擊\n擋\n操\n擎\n擒\n擔\n擘\n據\n擞\n擠\n擡\n擢\n擦\n擬\n擰\n擱\n擲\n擴\n擷\n擺\n擼\n擾\n攀\n攏\n攒\n攔\n攘\n攙\n攜\n攝\n攞\n攢\n攣\n攤\n攥\n攪\n攫\n攬\n支\n收\n攸\n改\n攻\n放\n政\n故\n效\n敌\n敍\n敎\n敏\n救\n敕\n敖\n敗\n敘\n教\n敛\n敝\n敞\n敢\n散\n敦\n敬\n数\n敲\n整\n敵\n敷\n數\n斂\n斃\n文\n斋\n斌\n斎\n斐\n斑\n斓\n斗\n料\n斛\n斜\n斟\n斡\n斤\n斥\n斧\n斩\n斫\n斬\n断\n斯\n新\n斷\n方\n於\n施\n旁\n旃\n旅\n旋\n旌\n旎\n族\n旖\n旗\n无\n既\n日\n旦\n旧\n旨\n早\n旬\n旭\n旮\n旱\n时\n旷\n旺\n旻\n昀\n昂\n昆\n昇\n昉\n昊\n昌\n明\n昏\n易\n昔\n昕\n昙\n星\n映\n春\n昧\n昨\n昭\n是\n昱\n昴\n昵\n昶\n昼\n显\n晁\n時\n晃\n晉\n晋\n晌\n晏\n晒\n晓\n晔\n晕\n晖\n晗\n晚\n晝\n晞\n晟\n晤\n晦\n晨\n晩\n普\n景\n晰\n晴\n晶\n晷\n智\n晾\n暂\n暄\n暇\n暈\n暉\n暌\n暐\n暑\n暖\n暗\n暝\n暢\n暧\n暨\n暫\n暮\n暱\n暴\n暸\n暹\n曄\n曆\n曇\n曉\n曖\n曙\n曜\n曝\n曠\n曦\n曬\n曰\n曲\n曳\n更\n書\n曹\n曼\n曾\n替\n最\n會\n月\n有\n朋\n服\n朐\n朔\n朕\n朗\n望\n朝\n期\n朦\n朧\n木\n未\n末\n本\n札\n朮\n术\n朱\n朴\n朵\n机\n朽\n杀\n杂\n权\n杆\n杈\n杉\n李\n杏\n材\n村\n杓\n杖\n杜\n杞\n束\n杠\n条\n来\n杨\n杭\n杯\n杰\n東\n杳\n杵\n杷\n杼\n松\n板\n极\n构\n枇\n枉\n枋\n析\n枕\n林\n枚\n果\n枝\n枢\n枣\n枪\n枫\n枭\n枯\n枰\n枱\n枳\n架\n枷\n枸\n柄\n柏\n某\n柑\n柒\n染\n柔\n柘\n柚\n柜\n柞\n柠\n柢\n查\n柩\n柬\n柯\n柱\n柳\n柴\n柵\n査\n柿\n栀\n栃\n栄\n栅\n标\n栈\n栉\n栋\n栎\n栏\n树\n栓\n栖\n栗\n校\n栩\n株\n样\n核\n根\n格\n栽\n栾\n桀\n桁\n桂\n桃\n桅\n框\n案\n桉\n桌\n桎\n桐\n桑\n桓\n桔\n桜\n桠\n桡\n桢\n档\n桥\n桦\n桧\n桨\n桩\n桶\n桿\n梁\n梅\n梆\n梏\n梓\n梗\n條\n梟\n梢\n梦\n梧\n梨\n梭\n梯\n械\n梳\n梵\n梶\n检\n棂\n棄\n棉\n棋\n棍\n棒\n棕\n棗\n棘\n棚\n棟\n棠\n棣\n棧\n森\n棱\n棲\n棵\n棹\n棺\n椁\n椅\n椋\n植\n椎\n椒\n検\n椪\n椭\n椰\n椹\n椽\n椿\n楂\n楊\n楓\n楔\n楚\n楝\n楞\n楠\n楣\n楨\n楫\n業\n楮\n極\n楷\n楸\n楹\n楼\n楽\n概\n榄\n榆\n榈\n榉\n榔\n榕\n榖\n榛\n榜\n榨\n榫\n榭\n榮\n榱\n榴\n榷\n榻\n槁\n槃\n構\n槌\n槍\n槎\n槐\n槓\n様\n槛\n槟\n槤\n槭\n槲\n槳\n槻\n槽\n槿\n樁\n樂\n樊\n樑\n樓\n標\n樞\n樟\n模\n樣\n権\n横\n樫\n樯\n樱\n樵\n樸\n樹\n樺\n樽\n樾\n橄\n橇\n橋\n橐\n橘\n橙\n機\n橡\n橢\n橫\n橱\n橹\n橼\n檀\n檄\n檎\n檐\n檔\n檗\n檜\n檢\n檬\n檯\n檳\n檸\n檻\n櫃\n櫚\n櫛\n櫥\n櫸\n櫻\n欄\n權\n欒\n欖\n欠\n次\n欢\n欣\n欧\n欲\n欸\n欺\n欽\n款\n歆\n歇\n歉\n歌\n歎\n歐\n歓\n歙\n歛\n歡\n止\n正\n此\n步\n武\n歧\n歩\n歪\n歯\n歲\n歳\n歴\n歷\n歸\n歹\n死\n歼\n殁\n殃\n殆\n殇\n殉\n殊\n残\n殒\n殓\n殖\n殘\n殞\n殡\n殤\n殭\n殯\n殲\n殴\n段\n殷\n殺\n殼\n殿\n毀\n毁\n毂\n毅\n毆\
n毋\n母\n毎\n每\n毒\n毓\n比\n毕\n毗\n毘\n毙\n毛\n毡\n毫\n毯\n毽\n氈\n氏\n氐\n民\n氓\n气\n氖\n気\n氙\n氛\n氟\n氡\n氢\n氣\n氤\n氦\n氧\n氨\n氪\n氫\n氮\n氯\n氰\n氲\n水\n氷\n永\n氹\n氾\n汀\n汁\n求\n汆\n汇\n汉\n汎\n汐\n汕\n汗\n汙\n汛\n汝\n汞\n江\n池\n污\n汤\n汨\n汩\n汪\n汰\n汲\n汴\n汶\n汹\n決\n汽\n汾\n沁\n沂\n沃\n沅\n沈\n沉\n沌\n沏\n沐\n沒\n沓\n沖\n沙\n沛\n沟\n没\n沢\n沣\n沥\n沦\n沧\n沪\n沫\n沭\n沮\n沱\n河\n沸\n油\n治\n沼\n沽\n沾\n沿\n況\n泄\n泉\n泊\n泌\n泓\n法\n泗\n泛\n泞\n泠\n泡\n波\n泣\n泥\n注\n泪\n泫\n泮\n泯\n泰\n泱\n泳\n泵\n泷\n泸\n泻\n泼\n泽\n泾\n洁\n洄\n洋\n洒\n洗\n洙\n洛\n洞\n津\n洩\n洪\n洮\n洱\n洲\n洵\n洶\n洸\n洹\n活\n洼\n洽\n派\n流\n浃\n浄\n浅\n浆\n浇\n浊\n测\n济\n浏\n浑\n浒\n浓\n浔\n浙\n浚\n浜\n浣\n浦\n浩\n浪\n浬\n浮\n浯\n浴\n海\n浸\n涂\n涅\n涇\n消\n涉\n涌\n涎\n涓\n涔\n涕\n涙\n涛\n涝\n涞\n涟\n涠\n涡\n涣\n涤\n润\n涧\n涨\n涩\n涪\n涮\n涯\n液\n涵\n涸\n涼\n涿\n淀\n淄\n淅\n淆\n淇\n淋\n淌\n淑\n淒\n淖\n淘\n淙\n淚\n淞\n淡\n淤\n淦\n淨\n淩\n淪\n淫\n淬\n淮\n深\n淳\n淵\n混\n淹\n淺\n添\n淼\n清\n済\n渉\n渊\n渋\n渍\n渎\n渐\n渔\n渗\n渙\n渚\n減\n渝\n渠\n渡\n渣\n渤\n渥\n渦\n温\n測\n渭\n港\n渲\n渴\n游\n渺\n渾\n湃\n湄\n湊\n湍\n湖\n湘\n湛\n湟\n湧\n湫\n湮\n湯\n湳\n湾\n湿\n満\n溃\n溅\n溉\n溏\n源\n準\n溜\n溝\n溟\n溢\n溥\n溧\n溪\n溫\n溯\n溱\n溴\n溶\n溺\n溼\n滁\n滂\n滄\n滅\n滇\n滋\n滌\n滑\n滓\n滔\n滕\n滙\n滚\n滝\n滞\n滟\n满\n滢\n滤\n滥\n滦\n滨\n滩\n滬\n滯\n滲\n滴\n滷\n滸\n滾\n滿\n漁\n漂\n漆\n漉\n漏\n漓\n演\n漕\n漠\n漢\n漣\n漩\n漪\n漫\n漬\n漯\n漱\n漲\n漳\n漸\n漾\n漿\n潆\n潇\n潋\n潍\n潑\n潔\n潘\n潛\n潜\n潞\n潟\n潢\n潤\n潦\n潧\n潭\n潮\n潰\n潴\n潸\n潺\n潼\n澀\n澄\n澆\n澈\n澍\n澎\n澗\n澜\n澡\n澤\n澧\n澱\n澳\n澹\n激\n濁\n濂\n濃\n濑\n濒\n濕\n濘\n濛\n濟\n濠\n濡\n濤\n濫\n濬\n濮\n濯\n濱\n濺\n濾\n瀅\n瀆\n瀉\n瀋\n瀏\n瀑\n瀕\n瀘\n瀚\n瀛\n瀝\n瀞\n瀟\n瀧\n瀨\n瀬\n瀰\n瀾\n灌\n灏\n灑\n灘\n灝\n灞\n灣\n火\n灬\n灭\n灯\n灰\n灵\n灶\n灸\n灼\n災\n灾\n灿\n炀\n炁\n炅\n炉\n炊\n炎\n炒\n炔\n炕\n炖\n炙\n炜\n炫\n炬\n炭\n炮\n炯\n炳\n炷\n炸\n点\n為\n炼\n炽\n烁\n烂\n烃\n烈\n烊\n烏\n烘\n烙\n烛\n烟\n烤\n烦\n烧\n烨\n烩\n烫\n烬\n热\n烯\n烷\n烹\n烽\n焉\n焊\n焕\n焖\n焗\n焘\n焙\n焚\n焜\n無\n焦\n焯\n焰\n焱\n然\n焼\n煅\n煉\n煊\n煌\n煎\n煒\n煖\n煙\n煜\n煞\n煤\n煥\n煦\n照\n煨\n煩\n煮\n煲\n煸\n煽\n熄\n熊\n熏\n熒\n熔\n熙\n熟\n熠\n熨\n熬\n熱\n熵\n熹\n熾\n燁\n燃\n燄\n燈\n燉\n燊\n燎\n燒\n燔\n燕\n燙\n燜\n營\n燥\n燦\n燧\n燭\n燮\n燴\n燻\n燼\n燿\n爆\n爍\n爐\n爛\n爪\n爬\n爭\n爰\n爱\n爲\n爵\n父\n爷\n爸\n爹\n爺\n爻\n爽\n爾\n牆\n片\n版\n牌\n牍\n牒\n牙\n牛\n牝\n牟\n牠\n牡\n牢\n牦\n牧\n物\n牯\n牲\n牴\n牵\n特\n牺\n牽\n犀\n犁\n犄\n犊\n犍\n犒\n犢\n犧\n犬\n犯\n状\n犷\n犸\n犹\n狀\n狂\n狄\n狈\n狎\n狐\n狒\n狗\n狙\n狞\n狠\n狡\n狩\n独\n狭\n狮\n狰\n狱\n狸\n狹\n狼\n狽\n猎\n猕\n猖\n猗\n猙\n猛\n猜\n猝\n猥\n猩\n猪
\n猫\n猬\n献\n猴\n猶\n猷\n猾\n猿\n獄\n獅\n獎\n獐\n獒\n獗\n獠\n獣\n獨\n獭\n獰\n獲\n獵\n獷\n獸\n獺\n獻\n獼\n獾\n玄\n率\n玉\n王\n玑\n玖\n玛\n玟\n玠\n玥\n玩\n玫\n玮\n环\n现\n玲\n玳\n玷\n玺\n玻\n珀\n珂\n珅\n珈\n珉\n珊\n珍\n珏\n珐\n珑\n珙\n珞\n珠\n珣\n珥\n珩\n珪\n班\n珮\n珲\n珺\n現\n球\n琅\n理\n琇\n琉\n琊\n琍\n琏\n琐\n琛\n琢\n琥\n琦\n琨\n琪\n琬\n琮\n琰\n琲\n琳\n琴\n琵\n琶\n琺\n琼\n瑀\n瑁\n瑄\n瑋\n瑕\n瑗\n瑙\n瑚\n瑛\n瑜\n瑞\n瑟\n瑠\n瑣\n瑤\n瑩\n瑪\n瑯\n瑰\n瑶\n瑾\n璀\n璁\n璃\n璇\n璉\n璋\n璎\n璐\n璜\n璞\n璟\n璧\n璨\n環\n璽\n璿\n瓊\n瓏\n瓒\n瓜\n瓢\n瓣\n瓤\n瓦\n瓮\n瓯\n瓴\n瓶\n瓷\n甄\n甌\n甕\n甘\n甙\n甚\n甜\n生\n產\n産\n甥\n甦\n用\n甩\n甫\n甬\n甭\n甯\n田\n由\n甲\n申\n电\n男\n甸\n町\n画\n甾\n畀\n畅\n界\n畏\n畑\n畔\n留\n畜\n畝\n畢\n略\n畦\n番\n畫\n異\n畲\n畳\n畴\n當\n畸\n畹\n畿\n疆\n疇\n疊\n疏\n疑\n疔\n疖\n疗\n疙\n疚\n疝\n疟\n疡\n疣\n疤\n疥\n疫\n疮\n疯\n疱\n疲\n疳\n疵\n疸\n疹\n疼\n疽\n疾\n痂\n病\n症\n痈\n痉\n痊\n痍\n痒\n痔\n痕\n痘\n痙\n痛\n痞\n痠\n痢\n痣\n痤\n痧\n痨\n痪\n痫\n痰\n痱\n痴\n痹\n痺\n痼\n痿\n瘀\n瘁\n瘋\n瘍\n瘓\n瘘\n瘙\n瘟\n瘠\n瘡\n瘢\n瘤\n瘦\n瘧\n瘩\n瘪\n瘫\n瘴\n瘸\n瘾\n療\n癇\n癌\n癒\n癖\n癜\n癞\n癡\n癢\n癣\n癥\n癫\n癬\n癮\n癱\n癲\n癸\n発\n登\n發\n白\n百\n皂\n的\n皆\n皇\n皈\n皋\n皎\n皑\n皓\n皖\n皙\n皚\n皮\n皰\n皱\n皴\n皺\n皿\n盂\n盃\n盅\n盆\n盈\n益\n盎\n盏\n盐\n监\n盒\n盔\n盖\n盗\n盘\n盛\n盜\n盞\n盟\n盡\n監\n盤\n盥\n盧\n盪\n目\n盯\n盱\n盲\n直\n相\n盹\n盼\n盾\n省\n眈\n眉\n看\n県\n眙\n眞\n真\n眠\n眦\n眨\n眩\n眯\n眶\n眷\n眸\n眺\n眼\n眾\n着\n睁\n睇\n睏\n睐\n睑\n睛\n睜\n睞\n睡\n睢\n督\n睥\n睦\n睨\n睪\n睫\n睬\n睹\n睽\n睾\n睿\n瞄\n瞅\n瞇\n瞋\n瞌\n瞎\n瞑\n瞒\n瞓\n瞞\n瞟\n瞠\n瞥\n瞧\n瞩\n瞪\n瞬\n瞭\n瞰\n瞳\n瞻\n瞼\n瞿\n矇\n矍\n矗\n矚\n矛\n矜\n矢\n矣\n知\n矩\n矫\n短\n矮\n矯\n石\n矶\n矽\n矾\n矿\n码\n砂\n砌\n砍\n砒\n研\n砖\n砗\n砚\n砝\n砣\n砥\n砧\n砭\n砰\n砲\n破\n砷\n砸\n砺\n砼\n砾\n础\n硅\n硐\n硒\n硕\n硝\n硫\n硬\n确\n硯\n硼\n碁\n碇\n碉\n碌\n碍\n碎\n碑\n碓\n碗\n碘\n碚\n碛\n碟\n碣\n碧\n碩\n碰\n碱\n碳\n碴\n確\n碼\n碾\n磁\n磅\n磊\n磋\n磐\n磕\n磚\n磡\n磨\n磬\n磯\n磲\n磷\n磺\n礁\n礎\n礙\n礡\n礦\n礪\n礫\n礴\n示\n礼\n社\n祀\n祁\n祂\n祇\n祈\n祉\n祎\n祐\n祕\n祖\n祗\n祚\n祛\n祜\n祝\n神\n祟\n祠\n祢\n祥\n票\n祭\n祯\n祷\n祸\n祺\n祿\n禀\n禁\n禄\n禅\n禍\n禎\n福\n禛\n禦\n禧\n禪\n禮\n禱\n禹\n禺\n离\n禽\n禾\n禿\n秀\n私\n秃\n秆\n秉\n秋\n种\n科\n秒\n秘\n租\n秣\n秤\n秦\n秧\n秩\n秭\n积\n称\n秸\n移\n秽\n稀\n稅\n程\n稍\n税\n稔\n稗\n稚\n稜\n稞\n稟\n稠\n稣\n種\n稱\n稲\n稳\n稷\n稹\n稻\n稼\n稽\n稿\n穀\n穂\n穆\n穌\n積\n穎\n穗\n穢\n穩\n穫\n穴\n究\n穷\n穹\n空\n穿\n突\n窃\n窄\n窈\n窍\n窑\n窒\n窓\n窕\n窖\n窗\n窘\n窜\n窝\n窟\n窠\n窥\n窦\n窨\n窩\n窪\n窮\n窯\n窺\n窿\n竄\n竅\n竇\n竊\n立\n竖\n站\n竜\n竞\n竟\n章\n竣\n童\n竭\n端\n競\n竹\n竺\n竽\n竿\n笃\n笆\n笈\n笋\n笏\n
笑\n笔\n笙\n笛\n笞\n笠\n符\n笨\n第\n笹\n笺\n笼\n筆\n等\n筊\n筋\n筍\n筏\n筐\n筑\n筒\n答\n策\n筛\n筝\n筠\n筱\n筲\n筵\n筷\n筹\n签\n简\n箇\n箋\n箍\n箏\n箐\n箔\n箕\n算\n箝\n管\n箩\n箫\n箭\n箱\n箴\n箸\n節\n篁\n範\n篆\n篇\n築\n篑\n篓\n篙\n篝\n篠\n篡\n篤\n篩\n篪\n篮\n篱\n篷\n簇\n簌\n簍\n簡\n簦\n簧\n簪\n簫\n簷\n簸\n簽\n簾\n簿\n籁\n籃\n籌\n籍\n籐\n籟\n籠\n籤\n籬\n籮\n籲\n米\n类\n籼\n籽\n粄\n粉\n粑\n粒\n粕\n粗\n粘\n粟\n粤\n粥\n粧\n粪\n粮\n粱\n粲\n粳\n粵\n粹\n粼\n粽\n精\n粿\n糅\n糊\n糍\n糕\n糖\n糗\n糙\n糜\n糞\n糟\n糠\n糧\n糬\n糯\n糰\n糸\n系\n糾\n紀\n紂\n約\n紅\n紉\n紊\n紋\n納\n紐\n紓\n純\n紗\n紘\n紙\n級\n紛\n紜\n素\n紡\n索\n紧\n紫\n紮\n累\n細\n紳\n紹\n紺\n終\n絃\n組\n絆\n経\n結\n絕\n絞\n絡\n絢\n給\n絨\n絮\n統\n絲\n絳\n絵\n絶\n絹\n綁\n綏\n綑\n經\n継\n続\n綜\n綠\n綢\n綦\n綫\n綬\n維\n綱\n網\n綴\n綵\n綸\n綺\n綻\n綽\n綾\n綿\n緊\n緋\n総\n緑\n緒\n緘\n線\n緝\n緞\n締\n緣\n編\n緩\n緬\n緯\n練\n緹\n緻\n縁\n縄\n縈\n縛\n縝\n縣\n縫\n縮\n縱\n縴\n縷\n總\n績\n繁\n繃\n繆\n繇\n繋\n織\n繕\n繚\n繞\n繡\n繩\n繪\n繫\n繭\n繳\n繹\n繼\n繽\n纂\n續\n纍\n纏\n纓\n纔\n纖\n纜\n纠\n红\n纣\n纤\n约\n级\n纨\n纪\n纫\n纬\n纭\n纯\n纰\n纱\n纲\n纳\n纵\n纶\n纷\n纸\n纹\n纺\n纽\n纾\n线\n绀\n练\n组\n绅\n细\n织\n终\n绊\n绍\n绎\n经\n绑\n绒\n结\n绔\n绕\n绘\n给\n绚\n绛\n络\n绝\n绞\n统\n绡\n绢\n绣\n绥\n绦\n继\n绩\n绪\n绫\n续\n绮\n绯\n绰\n绳\n维\n绵\n绶\n绷\n绸\n绻\n综\n绽\n绾\n绿\n缀\n缄\n缅\n缆\n缇\n缈\n缉\n缎\n缓\n缔\n缕\n编\n缘\n缙\n缚\n缜\n缝\n缠\n缢\n缤\n缥\n缨\n缩\n缪\n缭\n缮\n缰\n缱\n缴\n缸\n缺\n缽\n罂\n罄\n罌\n罐\n网\n罔\n罕\n罗\n罚\n罡\n罢\n罩\n罪\n置\n罰\n署\n罵\n罷\n罹\n羁\n羅\n羈\n羊\n羌\n美\n羔\n羚\n羞\n羟\n羡\n羣\n群\n羥\n羧\n羨\n義\n羯\n羲\n羸\n羹\n羽\n羿\n翁\n翅\n翊\n翌\n翎\n習\n翔\n翘\n翟\n翠\n翡\n翦\n翩\n翰\n翱\n翳\n翹\n翻\n翼\n耀\n老\n考\n耄\n者\n耆\n耋\n而\n耍\n耐\n耒\n耕\n耗\n耘\n耙\n耦\n耨\n耳\n耶\n耷\n耸\n耻\n耽\n耿\n聂\n聆\n聊\n聋\n职\n聒\n联\n聖\n聘\n聚\n聞\n聪\n聯\n聰\n聲\n聳\n聴\n聶\n職\n聽\n聾\n聿\n肃\n肄\n肅\n肆\n肇\n肉\n肋\n肌\n肏\n肓\n肖\n肘\n肚\n肛\n肝\n肠\n股\n肢\n肤\n肥\n肩\n肪\n肮\n肯\n肱\n育\n肴\n肺\n肽\n肾\n肿\n胀\n胁\n胃\n胄\n胆\n背\n胍\n胎\n胖\n胚\n胛\n胜\n胝\n胞\n胡\n胤\n胥\n胧\n胫\n胭\n胯\n胰\n胱\n胳\n胴\n胶\n胸\n胺\n能\n脂\n脅\n脆\n脇\n脈\n脉\n脊\n脍\n脏\n脐\n脑\n脓\n脖\n脘\n脚\n脛\n脣\n脩\n脫\n脯\n脱\n脲\n脳\n脸\n脹\n脾\n腆\n腈\n腊\n腋\n腌\n腎\n腐\n腑\n腓\n腔\n腕\n腥\n腦\n腩\n腫\n腭\n腮\n腰\n腱\n腳\n腴\n腸\n腹\n腺\n腻\n腼\n腾\n腿\n膀\n膈\n膊\n膏\n膑\n膘\n膚\n膛\n膜\n膝\n膠\n膦\n膨\n膩\n膳\n膺\n膻\n膽\n膾\n膿\n臀\n臂\n臃\n臆\n臉\n臊\n臍\n臓\n臘\n臟\n臣\n臥\n臧\n臨\n自\n臬\n臭\n至\n致\n臺\n臻\n臼\n臾\n舀\n舂\n舅\n舆\n與\n興\n舉\n舊\n舌\n舍\n舎\n舐\n舒\n舔\n舖\n舗\n舛\n舜\n舞\n舟\n航\n舫\n般\n舰\n舱\n舵\n舶\n舷\n舸\n船\n舺\n舾\n艇\n艋\n艘\n艙\
n艦\n艮\n良\n艰\n艱\n色\n艳\n艷\n艹\n艺\n艾\n节\n芃\n芈\n芊\n芋\n芍\n芎\n芒\n芙\n芜\n芝\n芡\n芥\n芦\n芩\n芪\n芫\n芬\n芭\n芮\n芯\n花\n芳\n芷\n芸\n芹\n芻\n芽\n芾\n苁\n苄\n苇\n苋\n苍\n苏\n苑\n苒\n苓\n苔\n苕\n苗\n苛\n苜\n苞\n苟\n苡\n苣\n若\n苦\n苫\n苯\n英\n苷\n苹\n苻\n茁\n茂\n范\n茄\n茅\n茉\n茎\n茏\n茗\n茜\n茧\n茨\n茫\n茬\n茭\n茯\n茱\n茲\n茴\n茵\n茶\n茸\n茹\n茼\n荀\n荃\n荆\n草\n荊\n荏\n荐\n荒\n荔\n荖\n荘\n荚\n荞\n荟\n荠\n荡\n荣\n荤\n荥\n荧\n荨\n荪\n荫\n药\n荳\n荷\n荸\n荻\n荼\n荽\n莅\n莆\n莉\n莊\n莎\n莒\n莓\n莖\n莘\n莞\n莠\n莢\n莧\n莪\n莫\n莱\n莲\n莴\n获\n莹\n莺\n莽\n莿\n菀\n菁\n菅\n菇\n菈\n菊\n菌\n菏\n菓\n菖\n菘\n菜\n菟\n菠\n菡\n菩\n華\n菱\n菲\n菸\n菽\n萁\n萃\n萄\n萊\n萋\n萌\n萍\n萎\n萘\n萝\n萤\n营\n萦\n萧\n萨\n萩\n萬\n萱\n萵\n萸\n萼\n落\n葆\n葉\n著\n葚\n葛\n葡\n董\n葦\n葩\n葫\n葬\n葭\n葯\n葱\n葳\n葵\n葷\n葺\n蒂\n蒋\n蒐\n蒔\n蒙\n蒜\n蒞\n蒟\n蒡\n蒨\n蒲\n蒸\n蒹\n蒻\n蒼\n蒿\n蓁\n蓄\n蓆\n蓉\n蓋\n蓑\n蓓\n蓖\n蓝\n蓟\n蓦\n蓬\n蓮\n蓼\n蓿\n蔑\n蔓\n蔔\n蔗\n蔘\n蔚\n蔡\n蔣\n蔥\n蔫\n蔬\n蔭\n蔵\n蔷\n蔺\n蔻\n蔼\n蔽\n蕁\n蕃\n蕈\n蕉\n蕊\n蕎\n蕙\n蕤\n蕨\n蕩\n蕪\n蕭\n蕲\n蕴\n蕻\n蕾\n薄\n薅\n薇\n薈\n薊\n薏\n薑\n薔\n薙\n薛\n薦\n薨\n薩\n薪\n薬\n薯\n薰\n薹\n藉\n藍\n藏\n藐\n藓\n藕\n藜\n藝\n藤\n藥\n藩\n藹\n藻\n藿\n蘆\n蘇\n蘊\n蘋\n蘑\n蘚\n蘭\n蘸\n蘼\n蘿\n虎\n虏\n虐\n虑\n虔\n處\n虚\n虛\n虜\n虞\n號\n虢\n虧\n虫\n虬\n虱\n虹\n虻\n虽\n虾\n蚀\n蚁\n蚂\n蚊\n蚌\n蚓\n蚕\n蚜\n蚝\n蚣\n蚤\n蚩\n蚪\n蚯\n蚱\n蚵\n蛀\n蛆\n蛇\n蛊\n蛋\n蛎\n蛐\n蛔\n蛙\n蛛\n蛟\n蛤\n蛭\n蛮\n蛰\n蛳\n蛹\n蛻\n蛾\n蜀\n蜂\n蜃\n蜆\n蜇\n蜈\n蜊\n蜍\n蜒\n蜓\n蜕\n蜗\n蜘\n蜚\n蜜\n蜡\n蜢\n蜥\n蜱\n蜴\n蜷\n蜻\n蜿\n蝇\n蝈\n蝉\n蝌\n蝎\n蝕\n蝗\n蝙\n蝟\n蝠\n蝦\n蝨\n蝴\n蝶\n蝸\n蝼\n螂\n螃\n融\n螞\n螢\n螨\n螯\n螳\n螺\n蟀\n蟄\n蟆\n蟋\n蟎\n蟑\n蟒\n蟠\n蟬\n蟲\n蟹\n蟻\n蟾\n蠅\n蠍\n蠔\n蠕\n蠛\n蠟\n蠡\n蠢\n蠣\n蠱\n蠶\n蠹\n蠻\n血\n衄\n衅\n衆\n行\n衍\n術\n衔\n街\n衙\n衛\n衝\n衞\n衡\n衢\n衣\n补\n表\n衩\n衫\n衬\n衮\n衰\n衲\n衷\n衹\n衾\n衿\n袁\n袂\n袄\n袅\n袈\n袋\n袍\n袒\n袖\n袜\n袞\n袤\n袪\n被\n袭\n袱\n裁\n裂\n装\n裆\n裊\n裏\n裔\n裕\n裘\n裙\n補\n裝\n裟\n裡\n裤\n裨\n裱\n裳\n裴\n裸\n裹\n製\n裾\n褂\n複\n褐\n褒\n褓\n褔\n褚\n褥\n褪\n褫\n褲\n褶\n褻\n襁\n襄\n襟\n襠\n襪\n襬\n襯\n襲\n西\n要\n覃\n覆\n覇\n見\n規\n覓\n視\n覚\n覦\n覧\n親\n覬\n観\n覷\n覺\n覽\n觀\n见\n观\n规\n觅\n视\n览\n觉\n觊\n觎\n觐\n觑\n角\n觞\n解\n觥\n触\n觸\n言\n訂\n計\n訊\n討\n訓\n訕\n訖\n託\n記\n訛\n訝\n訟\n訣\n訥\n訪\n設\n許\n訳\n訴\n訶\n診\n註\n証\n詆\n詐\n詔\n評\n詛\n詞\n詠\n詡\n詢\n詣\n試\n詩\n詫\n詬\n詭\n詮\n詰\n話\n該\n詳\n詹\n詼\n誅\n誇\n誉\n誌\n認\n誓\n誕\n誘\n語\n誠\n誡\n誣\n誤\n誥\n誦\n誨\n說\n説\n読\n誰\n課\n誹\n誼\n調\n諄\n談\n請\n諏\n諒\n論\n諗\n諜\n諡\n諦\n諧\n諫\n諭\n諮\n諱\n諳\n諷\n諸\n諺\n諾\n謀\n謁\n謂\n謄\n謊\n謎\n謐\n謔\n謗\n謙\n講\n謝\n謠
\n謨\n謬\n謹\n謾\n譁\n證\n譎\n譏\n識\n譙\n譚\n譜\n警\n譬\n譯\n議\n譲\n譴\n護\n譽\n讀\n變\n讓\n讚\n讞\n计\n订\n认\n讥\n讧\n讨\n让\n讪\n讫\n训\n议\n讯\n记\n讲\n讳\n讴\n讶\n讷\n许\n讹\n论\n讼\n讽\n设\n访\n诀\n证\n诃\n评\n诅\n识\n诈\n诉\n诊\n诋\n词\n诏\n译\n试\n诗\n诘\n诙\n诚\n诛\n话\n诞\n诟\n诠\n诡\n询\n诣\n诤\n该\n详\n诧\n诩\n诫\n诬\n语\n误\n诰\n诱\n诲\n说\n诵\n诶\n请\n诸\n诺\n读\n诽\n课\n诿\n谀\n谁\n调\n谄\n谅\n谆\n谈\n谊\n谋\n谌\n谍\n谎\n谏\n谐\n谑\n谒\n谓\n谔\n谕\n谗\n谘\n谙\n谚\n谛\n谜\n谟\n谢\n谣\n谤\n谥\n谦\n谧\n谨\n谩\n谪\n谬\n谭\n谯\n谱\n谲\n谴\n谶\n谷\n豁\n豆\n豇\n豈\n豉\n豊\n豌\n豎\n豐\n豔\n豚\n象\n豢\n豪\n豫\n豬\n豹\n豺\n貂\n貅\n貌\n貓\n貔\n貘\n貝\n貞\n負\n財\n貢\n貧\n貨\n販\n貪\n貫\n責\n貯\n貰\n貳\n貴\n貶\n買\n貸\n費\n貼\n貽\n貿\n賀\n賁\n賂\n賃\n賄\n資\n賈\n賊\n賑\n賓\n賜\n賞\n賠\n賡\n賢\n賣\n賤\n賦\n質\n賬\n賭\n賴\n賺\n購\n賽\n贅\n贈\n贊\n贍\n贏\n贓\n贖\n贛\n贝\n贞\n负\n贡\n财\n责\n贤\n败\n账\n货\n质\n贩\n贪\n贫\n贬\n购\n贮\n贯\n贰\n贱\n贲\n贴\n贵\n贷\n贸\n费\n贺\n贻\n贼\n贾\n贿\n赁\n赂\n赃\n资\n赅\n赈\n赊\n赋\n赌\n赎\n赏\n赐\n赓\n赔\n赖\n赘\n赚\n赛\n赝\n赞\n赠\n赡\n赢\n赣\n赤\n赦\n赧\n赫\n赭\n走\n赳\n赴\n赵\n赶\n起\n趁\n超\n越\n趋\n趕\n趙\n趟\n趣\n趨\n足\n趴\n趵\n趸\n趺\n趾\n跃\n跄\n跆\n跋\n跌\n跎\n跑\n跖\n跚\n跛\n距\n跟\n跡\n跤\n跨\n跩\n跪\n路\n跳\n践\n跷\n跹\n跺\n跻\n踉\n踊\n踌\n踏\n踐\n踝\n踞\n踟\n踢\n踩\n踪\n踮\n踱\n踴\n踵\n踹\n蹂\n蹄\n蹇\n蹈\n蹉\n蹊\n蹋\n蹑\n蹒\n蹙\n蹟\n蹣\n蹤\n蹦\n蹩\n蹬\n蹭\n蹲\n蹴\n蹶\n蹺\n蹼\n蹿\n躁\n躇\n躉\n躊\n躋\n躍\n躏\n躪\n身\n躬\n躯\n躲\n躺\n軀\n車\n軋\n軌\n軍\n軒\n軟\n転\n軸\n軼\n軽\n軾\n較\n載\n輒\n輓\n輔\n輕\n輛\n輝\n輟\n輩\n輪\n輯\n輸\n輻\n輾\n輿\n轄\n轅\n轆\n轉\n轍\n轎\n轟\n车\n轧\n轨\n轩\n转\n轭\n轮\n软\n轰\n轲\n轴\n轶\n轻\n轼\n载\n轿\n较\n辄\n辅\n辆\n辇\n辈\n辉\n辊\n辍\n辐\n辑\n输\n辕\n辖\n辗\n辘\n辙\n辛\n辜\n辞\n辟\n辣\n辦\n辨\n辩\n辫\n辭\n辮\n辯\n辰\n辱\n農\n边\n辺\n辻\n込\n辽\n达\n迁\n迂\n迄\n迅\n过\n迈\n迎\n运\n近\n返\n还\n这\n进\n远\n违\n连\n迟\n迢\n迤\n迥\n迦\n迩\n迪\n迫\n迭\n述\n迴\n迷\n迸\n迹\n迺\n追\n退\n送\n适\n逃\n逅\n逆\n选\n逊\n逍\n透\n逐\n递\n途\n逕\n逗\n這\n通\n逛\n逝\n逞\n速\n造\n逢\n連\n逮\n週\n進\n逵\n逶\n逸\n逻\n逼\n逾\n遁\n遂\n遅\n遇\n遊\n運\n遍\n過\n遏\n遐\n遑\n遒\n道\n達\n違\n遗\n遙\n遛\n遜\n遞\n遠\n遢\n遣\n遥\n遨\n適\n遭\n遮\n遲\n遴\n遵\n遶\n遷\n選\n遺\n遼\n遽\n避\n邀\n邁\n邂\n邃\n還\n邇\n邈\n邊\n邋\n邏\n邑\n邓\n邕\n邛\n邝\n邢\n那\n邦\n邨\n邪\n邬\n邮\n邯\n邰\n邱\n邳\n邵\n邸\n邹\n邺\n邻\n郁\n郅\n郊\n郎\n郑\n郜\n郝\n郡\n郢\n郤\n郦\n郧\n部\n郫\n郭\n郴\n郵\n郷\n郸\n都\n鄂\n鄉\n鄒\n鄔\n鄙\n鄞\n鄢\n鄧\n鄭\n鄰\n鄱\n鄲\n鄺\n酉\n酊\n酋\n酌\n配\n酐\n酒\n酗\n酚\n酝\n酢\n酣\n酥\n酩\n酪\n酬\n酮\n酯\n酰\n酱\n酵\n酶\n酷\n酸\n酿\n醃\n醇\n醉\n醋\n醍\n醐\n醒\n醚\n
醛\n醜\n醞\n醣\n醪\n醫\n醬\n醮\n醯\n醴\n醺\n釀\n釁\n采\n釉\n释\n釋\n里\n重\n野\n量\n釐\n金\n釗\n釘\n釜\n針\n釣\n釦\n釧\n釵\n鈀\n鈉\n鈍\n鈎\n鈔\n鈕\n鈞\n鈣\n鈦\n鈪\n鈴\n鈺\n鈾\n鉀\n鉄\n鉅\n鉉\n鉑\n鉗\n鉚\n鉛\n鉤\n鉴\n鉻\n銀\n銃\n銅\n銑\n銓\n銖\n銘\n銜\n銬\n銭\n銮\n銳\n銷\n銹\n鋁\n鋅\n鋒\n鋤\n鋪\n鋰\n鋸\n鋼\n錄\n錐\n錘\n錚\n錠\n錢\n錦\n錨\n錫\n錮\n錯\n録\n錳\n錶\n鍊\n鍋\n鍍\n鍛\n鍥\n鍰\n鍵\n鍺\n鍾\n鎂\n鎊\n鎌\n鎏\n鎔\n鎖\n鎗\n鎚\n鎧\n鎬\n鎮\n鎳\n鏈\n鏖\n鏗\n鏘\n鏞\n鏟\n鏡\n鏢\n鏤\n鏽\n鐘\n鐮\n鐲\n鐳\n鐵\n鐸\n鐺\n鑄\n鑊\n鑑\n鑒\n鑣\n鑫\n鑰\n鑲\n鑼\n鑽\n鑾\n鑿\n针\n钉\n钊\n钎\n钏\n钒\n钓\n钗\n钙\n钛\n钜\n钝\n钞\n钟\n钠\n钡\n钢\n钣\n钤\n钥\n钦\n钧\n钨\n钩\n钮\n钯\n钰\n钱\n钳\n钴\n钵\n钺\n钻\n钼\n钾\n钿\n铀\n铁\n铂\n铃\n铄\n铅\n铆\n铉\n铎\n铐\n铛\n铜\n铝\n铠\n铡\n铢\n铣\n铤\n铨\n铩\n铬\n铭\n铮\n铰\n铲\n铵\n银\n铸\n铺\n链\n铿\n销\n锁\n锂\n锄\n锅\n锆\n锈\n锉\n锋\n锌\n锏\n锐\n锑\n错\n锚\n锟\n锡\n锢\n锣\n锤\n锥\n锦\n锭\n键\n锯\n锰\n锲\n锵\n锹\n锺\n锻\n镀\n镁\n镂\n镇\n镉\n镌\n镍\n镐\n镑\n镕\n镖\n镗\n镛\n镜\n镣\n镭\n镯\n镰\n镳\n镶\n長\n长\n門\n閃\n閉\n開\n閎\n閏\n閑\n閒\n間\n閔\n閘\n閡\n関\n閣\n閥\n閨\n閩\n閱\n閲\n閹\n閻\n閾\n闆\n闇\n闊\n闌\n闍\n闔\n闕\n闖\n闘\n關\n闡\n闢\n门\n闪\n闫\n闭\n问\n闯\n闰\n闲\n间\n闵\n闷\n闸\n闹\n闺\n闻\n闽\n闾\n阀\n阁\n阂\n阅\n阆\n阇\n阈\n阉\n阎\n阐\n阑\n阔\n阕\n阖\n阙\n阚\n阜\n队\n阡\n阪\n阮\n阱\n防\n阳\n阴\n阵\n阶\n阻\n阿\n陀\n陂\n附\n际\n陆\n陇\n陈\n陋\n陌\n降\n限\n陕\n陛\n陝\n陞\n陟\n陡\n院\n陣\n除\n陨\n险\n陪\n陰\n陲\n陳\n陵\n陶\n陷\n陸\n険\n陽\n隅\n隆\n隈\n隊\n隋\n隍\n階\n随\n隐\n隔\n隕\n隘\n隙\n際\n障\n隠\n隣\n隧\n隨\n險\n隱\n隴\n隶\n隸\n隻\n隼\n隽\n难\n雀\n雁\n雄\n雅\n集\n雇\n雉\n雋\n雌\n雍\n雎\n雏\n雑\n雒\n雕\n雖\n雙\n雛\n雜\n雞\n離\n難\n雨\n雪\n雯\n雰\n雲\n雳\n零\n雷\n雹\n電\n雾\n需\n霁\n霄\n霆\n震\n霈\n霉\n霊\n霍\n霎\n霏\n霑\n霓\n霖\n霜\n霞\n霧\n霭\n霰\n露\n霸\n霹\n霽\n霾\n靂\n靄\n靈\n青\n靓\n靖\n静\n靚\n靛\n靜\n非\n靠\n靡\n面\n靥\n靦\n革\n靳\n靴\n靶\n靼\n鞅\n鞋\n鞍\n鞏\n鞑\n鞘\n鞠\n鞣\n鞦\n鞭\n韆\n韋\n韌\n韓\n韜\n韦\n韧\n韩\n韬\n韭\n音\n韵\n韶\n韻\n響\n頁\n頂\n頃\n項\n順\n須\n頌\n預\n頑\n頒\n頓\n頗\n領\n頜\n頡\n頤\n頫\n頭\n頰\n頷\n頸\n頹\n頻\n頼\n顆\n題\n額\n顎\n顏\n顔\n願\n顛\n類\n顧\n顫\n顯\n顱\n顴\n页\n顶\n顷\n项\n顺\n须\n顼\n顽\n顾\n顿\n颁\n颂\n预\n颅\n领\n颇\n颈\n颉\n颊\n颌\n颍\n颐\n频\n颓\n颔\n颖\n颗\n题\n颚\n颛\n颜\n额\n颞\n颠\n颡\n颢\n颤\n颦\n颧\n風\n颯\n颱\n颳\n颶\n颼\n飄\n飆\n风\n飒\n飓\n飕\n飘\n飙\n飚\n飛\n飞\n食\n飢\n飨\n飩\n飪\n飯\n飲\n飼\n飽\n飾\n餃\n餅\n餉\n養\n餌\n餐\n餒\n餓\n餘\n餚\n餛\n餞\n餡\n館\n餮\n餵\n餾\n饅\n饈\n饋\n饌\n饍\n饑\n饒\n饕\n饗\n饞\n饥\n饨\n饪\n饬\n饭\n饮\n饯\n饰\n饱\n饲\n饴\n饵\n饶\n饷\n饺\n饼\n饽\n饿\n馀\n馁\n馄\n馅\n馆\n馈\n馋\n馍\n馏\n馒\n馔\n首\n馗\n香\
n馥\n馨\n馬\n馭\n馮\n馳\n馴\n駁\n駄\n駅\n駆\n駐\n駒\n駕\n駛\n駝\n駭\n駱\n駿\n騁\n騎\n騏\n験\n騙\n騨\n騰\n騷\n驀\n驅\n驊\n驍\n驒\n驕\n驗\n驚\n驛\n驟\n驢\n驥\n马\n驭\n驮\n驯\n驰\n驱\n驳\n驴\n驶\n驷\n驸\n驹\n驻\n驼\n驾\n驿\n骁\n骂\n骄\n骅\n骆\n骇\n骈\n骊\n骋\n验\n骏\n骐\n骑\n骗\n骚\n骛\n骜\n骞\n骠\n骡\n骤\n骥\n骧\n骨\n骯\n骰\n骶\n骷\n骸\n骼\n髂\n髅\n髋\n髏\n髒\n髓\n體\n髖\n高\n髦\n髪\n髮\n髯\n髻\n鬃\n鬆\n鬍\n鬓\n鬚\n鬟\n鬢\n鬣\n鬥\n鬧\n鬱\n鬼\n魁\n魂\n魄\n魅\n魇\n魍\n魏\n魔\n魘\n魚\n魯\n魷\n鮑\n鮨\n鮪\n鮭\n鮮\n鯉\n鯊\n鯖\n鯛\n鯨\n鯰\n鯽\n鰍\n鰓\n鰭\n鰲\n鰻\n鰾\n鱈\n鱉\n鱔\n鱗\n鱷\n鱸\n鱼\n鱿\n鲁\n鲈\n鲍\n鲑\n鲛\n鲜\n鲟\n鲢\n鲤\n鲨\n鲫\n鲱\n鲲\n鲶\n鲷\n鲸\n鳃\n鳄\n鳅\n鳌\n鳍\n鳕\n鳖\n鳗\n鳝\n鳞\n鳥\n鳩\n鳳\n鳴\n鳶\n鴉\n鴕\n鴛\n鴦\n鴨\n鴻\n鴿\n鵑\n鵜\n鵝\n鵡\n鵬\n鵰\n鵲\n鶘\n鶩\n鶯\n鶴\n鷗\n鷲\n鷹\n鷺\n鸚\n鸞\n鸟\n鸠\n鸡\n鸢\n鸣\n鸥\n鸦\n鸨\n鸪\n鸭\n鸯\n鸳\n鸵\n鸽\n鸾\n鸿\n鹂\n鹃\n鹄\n鹅\n鹈\n鹉\n鹊\n鹌\n鹏\n鹑\n鹕\n鹘\n鹜\n鹞\n鹤\n鹦\n鹧\n鹫\n鹭\n鹰\n鹳\n鹵\n鹹\n鹼\n鹽\n鹿\n麂\n麋\n麒\n麓\n麗\n麝\n麟\n麥\n麦\n麩\n麴\n麵\n麸\n麺\n麻\n麼\n麽\n麾\n黃\n黄\n黍\n黎\n黏\n黑\n黒\n黔\n默\n黛\n黜\n黝\n點\n黠\n黨\n黯\n黴\n鼋\n鼎\n鼐\n鼓\n鼠\n鼬\n鼹\n鼻\n鼾\n齁\n齊\n齋\n齐\n齒\n齡\n齢\n齣\n齦\n齿\n龄\n龅\n龈\n龊\n龋\n龌\n龍\n龐\n龔\n龕\n龙\n龚\n龛\n龜\n龟\n︰\n︱\n︶\n︿\n﹁\n﹂\n﹍\n﹏\n﹐\n﹑\n﹒\n﹔\n﹕\n﹖\n﹗\n﹙\n﹚\n﹝\n﹞\n﹡\n﹣\n！\n＂\n＃\n＄\n％\n＆\n＇\n（\n）\n＊\n＋\n，\n－\n．\n／\n０\n１\n２\n３\n４\n５\n６\n７\n８\n９\n：\n；\n＜\n＝\n＞\n？\n＠\n［\n＼\n］\n＾\n＿\n｀\nａ\nｂ\nｃ\nｄ\nｅ\nｆ\nｇ\nｈ\nｉ\nｊ\nｋ\nｌ\nｍ\nｎ\nｏ\nｐ\nｑ\nｒ\nｓ\nｔ\nｕ\nｖ\nｗ\nｘ\nｙ\nｚ\n｛\n｜\n｝\n～\n｡\n｢\n｣\n､\n･\nｯ\nｰ\nｲ\nｸ\nｼ\nｽ\nﾄ\nﾉ\nﾌ\nﾗ\nﾙ\nﾝ\nﾞ\nﾟ\n￣\n￥\n👍\n🔥\n😂\n😎\n...\nyam\n10\n2017\n12\n11\n2016\n20\n30\n15\n06\nlofter\n##s\n2015\nby\n16\n14\n18\n13\n24\n17\n2014\n21\n##0\n22\n19\n25\n23\ncom\n100\n00\n05\n2013\n##a\n03\n09\n08\n28\n##2\n50\n01\n04\n##1\n27\n02\n2012\n##3\n26\n##e\n07\n##8\n##5\n##6\n##4\n##9\n##7\n29\n2011\n40\n##t\n2010\n##o\n##d\n##i\n2009\n##n\napp\nwww\nthe\n##m\n31\n##c\n##l\n##y\n##r\n##g\n2008\n60\nhttp\n200\nqq\n##p\n80\n##f\ngoogle\npixnet\n90\ncookies\ntripadvisor\n500\n##er\n##k\n35\n##h\nfacebook\n2007\n2000\n70\n##b\nof\n##x\n##u\n45\n300\niphone\n32\n1000\n2006\n48\nip\n36\nin\n38\n3d\n##w\n##ing\n55\nctrip\n##on\n##v\n33\n##の\nto\n34\n400\nid\n2005\nit\n37\nwindows\nllc\ntop\n99\n42\n39\n000\nled\nat\n##an\n41\n51\n52\n46\n49\n43
\n53\n44\n##z\nandroid\n58\nand\n59\n2004\n56\nvr\n##か\n5000\n2003\n47\nblogthis\ntwitter\n54\n##le\n150\nok\n2018\n57\n75\ncn\nno\nios\n##in\n##mm\n##00\n800\non\nte\n3000\n65\n2001\n360\n95\nig\nlv\n120\n##ng\n##を\n##us\n##に\npc\nてす\n──\n600\n##te\n85\n2002\n88\n##ed\nhtml\nncc\nwifi\nemail\n64\nblog\nis\n##10\n##て\nmail\nonline\n##al\ndvd\n##ic\nstudio\n##は\n##℃\n##ia\n##と\nline\nvip\n72\n##q\n98\n##ce\n##en\nfor\n##is\n##ra\n##es\n##j\nusb\nnet\ncp\n1999\nasia\n4g\n##cm\ndiy\nnew\n3c\n##お\nta\n66\nlanguage\nvs\napple\ntw\n86\nweb\n##ne\nipad\n62\nyou\n##re\n101\n68\n##tion\nps\nde\nbt\npony\natm\n##2017\n1998\n67\n##ch\nceo\n##or\ngo\n##na\nav\npro\ncafe\n96\npinterest\n97\n63\npixstyleme3c\n##ta\nmore\nsaid\n##2016\n1997\nmp3\n700\n##ll\nnba\njun\n##20\n92\ntv\n1995\npm\n61\n76\nnbsp\n250\n##ie\nlinux\n##ma\ncd\n110\nhd\n##17\n78\n##ion\n77\n6000\nam\n##th\n##st\n94\n##se\n##et\n69\n180\ngdp\nmy\n105\n81\nabc\n89\nflash\n79\none\n93\n1990\n1996\n##ck\ngps\n##も\n##ly\nweb885\n106\n2020\n91\n##ge\n4000\n1500\nxd\nboss\nisbn\n1994\norg\n##ry\nme\nlove\n##11\n0fork\n73\n##12\n3g\n##ter\n##ar\n71\n82\n##la\nhotel\n130\n1970\npk\n83\n87\n140\nie\n##os\n##30\n##el\n74\n##50\nseo\ncpu\n##ml\np2p\n84\nmay\n##る\nsun\ntue\ninternet\ncc\nposted\nyoutube\n##at\n##ン\n##man\nii\n##ル\n##15\nabs\nnt\npdf\nyahoo\nago\n1980\n##it\nnews\nmac\n104\n##てす\n##me\n##り\njava\n1992\nspa\n##de\n##nt\nhk\nall\nplus\nla\n1993\n##mb\n##16\n##ve\nwest\n##da\n160\nair\n##い\n##ps\nから\n##to\n1989\nlogo\nhtc\nphp\nhttps\nfi\nmomo\n##son\nsat\n##ke\n##80\nebd\nsuv\nwi\nday\napk\n##88\n##um\nmv\ngalaxy\nwiki\nor\nbrake\n##ス\n1200\nする\nthis\n1991\nmon\n##こ\n❤2017\npo\n##ない\njavascript\nlife\nhome\njune\n##ss\nsystem\n900\n##ー\n##０\npp\n1988\nworld\nfb\n4k\nbr\n##as\nic\nai\nleonardo\nsafari\n##60\nlive\nfree\nxx\nwed\nwin7\nkiehl\n##co\nlg\no2o\n##go\nus\n235\n1949\nmm\nしい\nvfm\nkanye\n##90\n##2015\n##id\njr\n##ey\n123\nrss\n##sa\n##ro\n##am\n##no\nthu\nfri\n350\n##sh\n##ki\n103\ncomments\nname\n##の
て\n##pe\n##ine\nmax\n1987\n8000\nuber\n##mi\n##ton\nwordpress\noffice\n1986\n1985\n##ment\n107\nbd\nwin10\n##ld\n##li\ngmail\nbb\ndior\n##rs\n##ri\n##rd\n##ます\nup\ncad\n##®\ndr\nして\nread\n##21\nをお\n##io\n##99\nurl\n1984\npvc\npaypal\nshow\npolicy\n##40\n##ty\n##18\nwith\n##★\n##01\ntxt\n102\n##ba\ndna\nfrom\npost\nmini\nar\ntaiwan\njohn\n##ga\nprivacy\nagoda\n##13\n##ny\nword\n##24\n##22\n##by\n##ur\n##hz\n1982\n##ang\n265\ncookie\nnetscape\n108\n##ka\n##～\n##ad\nhouse\nshare\nnote\nibm\ncode\nhello\nnike\nsim\nsurvey\n##016\n1979\n1950\nwikia\n##32\n##017\n5g\ncbc\n##tor\n##kg\n1983\n##rt\n##14\ncampaign\nstore\n2500\nos\n##ct\n##ts\n##°\n170\napi\n##ns\n365\nexcel\n##な\n##ao\n##ら\n##し\n～～\n##nd\nuniversity\n163\nには\n518\n##70\n##ya\n##il\n##25\npierre\nipo\n0020\n897\n##23\nhotels\n##ian\nのお\n125\nyears\n6606\n##ers\n##26\nhigh\n##day\ntime\n##ay\nbug\n##line\n##く\n##す\n##be\nxp\ntalk2yam\nyamservice\n10000\ncoco\n##dy\nsony\n##ies\n1978\nmicrosoft\ndavid\npeople\n##ha\n1960\ninstagram\nintel\nその\n##ot\niso\n1981\n##va\n115\n##mo\n##land\nxxx\nman\nco\nltxsw\n##ation\nbaby\n220\n##pa\n##ol\n1945\n7000\ntag\n450\n##ue\nmsn\n##31\noppo\n##ト\n##ca\ncontrol\n##om\nst\nchrome\n##ure\n##ん\nbe\n##き\nlol\n##19\nした\n##bo\n240\nlady\n##100\n##way\n##から\n4600\n##ko\n##do\n##un\n4s\ncorporation\n168\n##ni\nherme\n##28\nｃｐ\n978\n##up\n##06\nui\n##ds\nppt\nadmin\nthree\nします\nbbc\nre\n128\n##48\nca\n##015\n##35\nhp\n##ee\ntpp\n##た\n##ive\n××\nroot\n##cc\n##ました\n##ble\n##ity\nadobe\npark\n114\net\noled\ncity\n##ex\n##ler\n##ap\nchina\n##book\n20000\nview\n##ice\nglobal\n##km\nyour\nhong\n##mg\nout\n##ms\nng\nebay\n##29\nmenu\nubuntu\n##cy\nrom\n##view\nopen\nktv\ndo\nserver\n##lo\nif\nenglish\n##ね\n##５\n##oo\n1600\n##02\nstep1\nkong\nclub\n135\njuly\ninc\n1976\nmr\nhi\n##net\ntouch\n##ls\n##ii\nmichael\nlcd\n##05\n##33\nphone\njames\nstep2\n1300\nios9\n##box\ndc\n##２\n##ley\nsamsung\n111\n280\npokemon\ncss\n##ent\n##les\nいいえ\n##１\ns8\natom\nplay\nbmw\n##said\nsa\netf\nctrl\n♥yoyo
♥\n##55\n2025\n##2014\n##66\nadidas\namazon\n1958\n##ber\n##ner\nvisa\n##77\n##der\n1800\nconnectivity\n##hi\nfirefox\n109\n118\nhr\nso\nstyle\nmark\npop\nol\nskip\n1975\nas\n##27\n##ir\n##61\n190\nmba\n##う\n##ai\nle\n##ver\n1900\ncafe2017\nlte\nsuper\n113\n129\n##ron\namd\nlike\n##☆\nare\n##ster\nwe\n##sk\npaul\ndata\ninternational\n##ft\nlongchamp\nssd\ngood\n##ート\n##ti\nreply\n##my\n↓↓↓\napr\nstar\n##ker\nsource\n136\njs\n112\nget\nforce\nphoto\n##one\n126\n##2013\n##ow\nlink\nbbs\n1972\ngoods\n##lin\npython\n119\n##ip\ngame\n##ics\n##ません\nblue\n##●\n520\n##45\npage\nitunes\n##03\n1955\n260\n1968\ngt\ngif\n618\n##ff\n##47\ngroup\nくたさい\nabout\nbar\nganji\n##nce\nmusic\nlee\nnot\n1977\n1971\n1973\n##per\nan\nfaq\ncomment\n##って\ndays\n##ock\n116\n##bs\n1974\n1969\nv1\nplayer\n1956\nxbox\nsql\nfm\nf1\n139\n##ah\n210\n##lv\n##mp\n##000\nmelody\n1957\n##３\n550\n17life\n199\n1966\nxml\nmarket\n##au\n##71\n999\n##04\nwhat\ngl\n##95\n##age\ntips\n##68\nbook\n##ting\nmysql\ncan\n1959\n230\n##ung\nwonderland\nwatch\n10℃\n##ction\n9000\nmar\nmobile\n1946\n1962\narticle\n##db\npart\n▲top\nparty\nって\n1967\n1964\n1948\n##07\n##ore\n##op\nこの\ndj\n##78\n##38\n010\nmain\n225\n1965\n##ong\nart\n320\nad\n134\n020\n##73\n117\npm2\njapan\n228\n##08\nts\n1963\n##ica\nder\nsm\n##36\n2019\n##wa\nct\n##７\n##や\n##64\n1937\nhomemesh\nsearch\n##85\n##れは\n##tv\n##di\nmacbook\n##９\n##くたさい\nservice\n##♥\ntype\nった\n750\n##ier\n##si\n##75\n##います\n##ok\nbest\n##ット\ngoris\nlock\n##った\ncf\n3m\nbig\n##ut\nftp\ncarol\n##vi\n１０\n1961\nhappy\nsd\n##ac\n122\nanti\npe\ncnn\niii\n1920\n138\n##ラ\n1940\nesp\njan\ntags\n##98\n##51\naugust\nvol\n##86\n154\n##™\n##fs\n##れ\n##sion\ndesign\nac\n##ム\npress\njordan\nppp\nthat\nkey\ncheck\n##６\n##tt\n##㎡\n1080p\n##lt\npower\n##42\n1952\n##bc\nvivi\n##ック\nhe\n133\n121\njpg\n##rry\n201\n175\n3500\n1947\nnb\n##ted\n##rn\nしています\n1954\nusd\n##t00\nmaster\n##ンク\n001\nmodel\n##58\nal\n##09\n1953\n##34\nram\ngoo\nても\n##ui\n127\n1930\nred\n##ary\nrpg\nitem\n##pm\n##41\n270\n
##za\nproject\n##2012\nhot\ntd\nblogabstract\n##ger\n##62\n650\n##44\ngr2\n##します\n##ｍ\nblack\nelectronic\nnfc\nyear\nasus\nまた\nhtml5\ncindy\n##hd\nm3\n132\nesc\n##od\nbooking\n##53\nfed\ntvb\n##81\n##ina\nmit\n165\n##いる\nchan\n192\ndistribution\nnext\nになる\npeter\nbios\nsteam\ncm\n1941\nにも\npk10\n##ix\n##65\n##91\ndec\nnasa\n##ana\nicecat\n00z\nb1\nwill\n##46\nli\nse\n##ji\n##み\n##ard\noct\n##ain\njp\n##ze\n##bi\ncio\n##56\nsmart\nh5\n##39\n##port\ncurve\nvpn\n##nm\n##dia\nutc\n##あり\n12345678910\n##52\nrmvb\nchanel\na4\nmiss\n##and\n##im\nmedia\nwho\n##63\nshe\ngirl\n5s\n124\nvera\n##して\nclass\nvivo\nking\n##フ\n##ei\nnational\nab\n1951\n5cm\n888\n145\nipod\nap\n1100\n5mm\n211\nms\n2756\n##69\nmp4\nmsci\n##po\n##89\n131\nmg\nindex\n380\n##bit\n##out\n##zz\n##97\n##67\n158\napec\n##８\nphotoshop\nopec\n￥799\nては\n##96\n##tes\n##ast\n2g\n○○\n##ール\n￥2899\n##ling\n##よ\n##ory\n1938\n##ical\nkitty\ncontent\n##43\nstep3\n##cn\nwin8\n155\nvc\n1400\niphone7\nrobert\n##した\ntcl\n137\nbeauty\n##87\nen\ndollars\n##ys\n##oc\nstep\npay\nyy\na1\n##2011\n##lly\n##ks\n##♪\n1939\n188\ndownload\n1944\nsep\nexe\nph\nいます\nschool\ngb\ncenter\npr\nstreet\n##board\nuv\n##37\n##lan\nwinrar\n##que\n##ua\n##com\n1942\n1936\n480\ngpu\n##４\nettoday\nfu\ntom\n##54\n##ren\n##via\n149\n##72\nb2b\n144\n##79\n##tch\nrose\narm\nmb\n##49\n##ial\n##nn\nnvidia\nstep4\nmvp\n00㎡\nyork\n156\n##イ\nhow\ncpi\n591\n2765\ngov\nkg\njoe\n##xx\nmandy\npa\n##ser\ncopyright\nfashion\n1935\ndon\n##け\necu\n##ist\n##art\nerp\nwap\nhave\n##lm\ntalk\n##ek\n##ning\n##if\nch\n##ite\nvideo\n1943\ncs\nsan\niot\nlook\n##84\n##2010\n##ku\noctober\n##ux\ntrump\n##hs\n##ide\nbox\n141\nfirst\n##ins\napril\n##ight\n##83\n185\nangel\nprotected\naa\n151\n162\nx1\nm2\n##fe\n##×\n##ho\nsize\n143\nmin\nofo\nfun\ngomaji\nex\nhdmi\nfood\ndns\nmarch\nchris\nkevin\n##のか\n##lla\n##pp\n##ec\nag\nems\n6s\n720p\n##rm\n##ham\noff\n##92\nasp\nteam\nfandom\ned\n299\n▌♥\n##ell\ninfo\nされています\n##82\nsina\n4066\n161\n##able\n##ctor\n330\n399\n315\ndll\nri
ghts\nltd\nidc\njul\n3kg\n1927\n142\nma\nsurface\n##76\n##ク\n～～～\n304\nmall\neps\n146\ngreen\n##59\nmap\nspace\ndonald\nv2\nsodu\n##light\n1931\n148\n1700\nまて\n310\nreserved\nhtm\n##han\n##57\n2d\n178\nmod\n##ise\n##tions\n152\nti\n##shi\ndoc\n1933\nicp\n055\nwang\n##ram\nshopping\naug\n##pi\n##well\nnow\nwam\nb2\nからお\n##hu\n236\n1928\n##gb\n266\nf2\n##93\n153\nmix\n##ef\n##uan\nbwl\n##plus\n##res\ncore\n##ess\ntea\n5℃\nhktvmall\nnhk\n##ate\nlist\n##ese\n301\nfeb\n4m\ninn\nての\nnov\n159\n12345\ndaniel\n##ci\npass\n##bet\n##nk\ncoffee\n202\nssl\nairbnb\n##ute\nfbi\nwoshipm\nskype\nea\ncg\nsp\n##fc\n##www\nyes\nedge\nalt\n007\n##94\nfpga\n##ght\n##gs\niso9001\nさい\n##ile\n##wood\n##uo\nimage\nlin\nicon\namerican\n##em\n1932\nset\nsays\n##king\n##tive\nblogger\n##74\nなと\n256\n147\n##ox\n##zy\n##red\n##ium\n##lf\nnokia\nclaire\n##リ\n##ding\nnovember\nlohas\n##500\n##tic\n##マ\n##cs\n##ある\n##che\n##ire\n##gy\n##ult\ndb\njanuary\nwin\n##カ\n166\nroad\nptt\n##ま\n##つ\n198\n##fa\n##mer\nanna\npchome\nはい\nudn\nef\n420\n##time\n##tte\n2030\n##ア\ng20\nwhite\nかかります\n1929\n308\ngarden\neleven\ndi\n##おります\nchen\n309b\n777\n172\nyoung\ncosplay\nちてない\n4500\nbat\n##123\n##tra\n##ては\nkindle\nnpc\nsteve\netc\n##ern\n##｜\ncall\nxperia\nces\ntravel\nsk\ns7\n##ous\n1934\n##int\nみいたたけます\n183\nedu\nfile\ncho\nqr\n##car\n##our\n186\n##ant\n##ｄ\neric\n1914\nrends\n##jo\n##する\nmastercard\n##2000\nkb\n##min\n290\n##ino\nvista\n##ris\n##ud\njack\n2400\n##set\n169\npos\n1912\n##her\n##ou\ntaipei\nしく\n205\nbeta\n##ませんか\n232\n##fi\nexpress\n255\nbody\n##ill\naphojoy\nuser\ndecember\nmeiki\n##ick\ntweet\nrichard\n##av\n##ᆫ\niphone6\n##dd\nちてすか\nviews\n##mark\n321\npd\n##００\ntimes\n##▲\nlevel\n##ash\n10g\npoint\n5l\n##ome\n208\nkoreanmall\n##ak\ngeorge\nq2\n206\nwma\ntcp\n##200\nスタッフ\nfull\nmlb\n##lle\n##watch\ntm\nrun\n179\n911\nsmith\nbusiness\n##und\n1919\ncolor\n##tal\n222\n171\n##less\nmoon\n4399\n##rl\nupdate\npcb\nshop\n499\n157\nlittle\nなし\nend\n##mhz\nvan\ndsp\neasy\n660\n##house\n##key\nhistory
\n##ｏ\noh\n##001\n##hy\n##web\noem\nlet\nwas\n##2009\n##gg\nreview\n##wan\n182\n##°c\n203\nuc\ntitle\n##val\nunited\n233\n2021\n##ons\ndoi\ntrivago\noverdope\nsbs\n##ance\n##ち\ngrand\nspecial\n573032185\nimf\n216\nwx17house\n##so\n##ーム\naudi\n##he\nlondon\nwilliam\n##rp\n##ake\nscience\nbeach\ncfa\namp\nps4\n880\n##800\n##link\n##hp\ncrm\nferragamo\nbell\nmake\n##eng\n195\nunder\nzh\nphotos\n2300\n##style\n##ント\nvia\n176\nda\n##gi\ncompany\ni7\n##ray\nthomas\n370\nufo\ni5\n##max\nplc\nben\nback\nresearch\n8g\n173\nmike\n##pc\n##ッフ\nseptember\n189\n##ace\nvps\nfebruary\n167\npantos\nwp\nlisa\n1921\n★★\njquery\nnight\nlong\noffer\n##berg\n##news\n1911\n##いて\nray\nfks\nwto\nせます\nover\n164\n340\n##all\n##rus\n1924\n##888\n##works\nblogtitle\nloftpermalink\n##→\n187\nmartin\ntest\nling\nkm\n##め\n15000\nfda\nv3\n##ja\n##ロ\nｗedding\nかある\noutlet\nfamily\n##ea\nをこ\n##top\nstory\n##ness\nsalvatore\n##lu\n204\nswift\n215\nroom\nしている\noracle\n##ul\n1925\nsam\nb2c\nweek\npi\nrock\n##のは\n##ａ\n##けと\n##ean\n##300\n##gle\ncctv\nafter\nchinese\n##back\npowered\nx2\n##tan\n1918\n##nes\n##イン\ncanon\nonly\n181\n##zi\n##las\nsay\n##oe\n184\n##sd\n221\n##bot\n##world\n##zo\nsky\nmade\ntop100\njust\n1926\npmi\n802\n234\ngap\n##vr\n177\nles\n174\n▲topoct\nball\nvogue\nvi\ning\nofweek\ncos\n##list\n##ort\n▲topmay\n##なら\n##lon\nとして\nlast\n##tc\n##of\n##bus\n##gen\nreal\neva\n##コ\na3\nnas\n##lie\n##ria\n##coin\n##bt\n▲topapr\nhis\n212\ncat\nnata\nvive\nhealth\n⋯⋯\ndrive\nsir\n▲topmar\ndu\ncup\n##カー\n##ook\n##よう\n##sy\nalex\nmsg\ntour\nしました\n3ce\n##word\n193\nebooks\nr8\nblock\n318\n##より\n2200\nnice\npvp\n207\nmonths\n1905\nrewards\n##ther\n1917\n0800\n##xi\n##チ\n##sc\nmicro\n850\ngg\nblogfp\nop\n1922\ndaily\nm1\n264\ntrue\n##bb\nml\n##tar\n##のお\n##ky\nanthony\n196\n253\n##yo\nstate\n218\n##ara\n##aa\n##rc\n##tz\n##ston\nより\ngear\n##eo\n##ade\nge\nsee\n1923\n##win\n##ura\nss\nheart\n##den\n##ita\ndown\n##sm\nel\npng\n2100\n610\nrakuten\nwhatsapp\nbay\ndream\nadd\n##use\n680\n311\npad\ngucci\nmp
v\n##ode\n##fo\nisland\n▲topjun\n##▼\n223\njason\n214\nchicago\n##❤\nしの\n##hone\nio\n##れる\n##ことか\nsogo\nbe2\n##ology\n990\ncloud\nvcd\n##con\n2～3\n##ford\n##joy\n##kb\n##こさいます\n##rade\nbut\n##ach\ndocker\n##ful\nrfid\nul\n##ase\nhit\nford\n##star\n580\n##○\n１１\na2\nsdk\nreading\nedited\n##are\ncmos\n##mc\n238\nsiri\nlight\n##ella\n##ため\nbloomberg\n##read\npizza\n##ison\njimmy\n##vm\ncollege\nnode\njournal\nba\n18k\n##play\n245\n##cer\n２０\nmagic\n##yu\n191\njump\n288\ntt\n##ings\nasr\n##lia\n3200\nstep5\nnetwork\n##cd\nmc\nいします\n1234\npixstyleme\n273\n##600\n2800\nmoney\n★★★★★\n1280\n１２\n430\nbl\nみの\nact\n##tus\ntokyo\n##rial\n##life\nemba\n##ae\nsaas\ntcs\n##rk\n##wang\nsummer\n##sp\nko\n##ving\n390\npremium\n##その\nnetflix\n##ヒ\nuk\nmt\n##lton\nright\nfrank\ntwo\n209\nえる\n##ple\n##cal\n021\n##んな\n##sen\n##ville\nhold\nnexus\ndd\n##ius\nてお\n##mah\n##なく\ntila\nzero\n820\nce\n##tin\nresort\n##ws\ncharles\nold\np10\n5d\nreport\n##360\n##ru\n##には\nbus\nvans\nlt\n##est\npv\n##レ\nlinks\nrebecca\n##ツ\n##dm\nazure\n##365\nきな\nlimited\nbit\n4gb\n##mon\n1910\nmoto\n##eam\n213\n1913\nvar\neos\nなとの\n226\nblogspot\nされた\n699\ne3\ndos\ndm\nfc\n##ments\n##ik\n##kw\nboy\n##bin\n##ata\n960\ner\n##せ\n219\n##vin\n##tu\n##ula\n194\n##∥\nstation\n##ろ\n##ature\n835\nfiles\nzara\nhdr\ntop10\nnature\n950\nmagazine\ns6\nmarriott\n##シ\navira\ncase\n##っと\ntab\n##ran\ntony\n##home\noculus\nim\n##ral\njean\nsaint\ncry\n307\nrosie\n##force\n##ini\nice\n##bert\nのある\n##nder\n##mber\npet\n2600\n##◆\nplurk\n▲topdec\n##sis\n00kg\n▲topnov\n720\n##ence\ntim\n##ω\n##nc\n##ても\n##name\nlog\nips\ngreat\nikea\nmalaysia\nunix\n##イト\n3600\n##ncy\n##nie\n12000\nakb48\n##ye\n##oid\n404\n##chi\n##いた\noa\nxuehai\n##1000\n##orm\n##rf\n275\nさん\n##ware\n##リー\n980\nho\n##pro\ntext\n##era\n560\nbob\n227\n##ub\n##2008\n8891\nscp\navi\n##zen\n2022\nmi\nwu\nmuseum\nqvod\napache\nlake\njcb\n▲topaug\n★★★\nni\n##hr\nhill\n302\nne\nweibo\n490\nruby\n##ーシ\n##ヶ\n##row\n4d\n▲topjul\niv\n##ish\ngithub\n306\nmate\n312\n##スト\n##lot\
n##ane\nandrew\nのハイト\n##tina\nt1\nrf\ned2k\n##vel\n##900\nway\nfinal\nりの\nns\n5a\n705\n197\n##メ\nsweet\nbytes\n##ene\n▲topjan\n231\n##cker\n##2007\n##px\n100g\ntopapp\n229\nhelpapp\nrs\nlow\n14k\ng4g\ncare\n630\nldquo\nあり\n##fork\nleave\nrm\nedition\n##gan\n##zon\n##qq\n▲topsep\n##google\n##ism\ngold\n224\nexplorer\n##zer\ntoyota\ncategory\nselect\nvisual\n##labels\nrestaurant\n##md\nposts\ns1\n##ico\nもっと\nangelababy\n123456\n217\nsports\ns3\nmbc\n1915\nしてくたさい\nshell\nx86\ncandy\n##new\nkbs\nface\nxl\n470\n##here\n4a\nswissinfo\nv8\n▲topfeb\ndram\n##ual\n##vice\n3a\n##wer\nsport\nq1\nios10\npublic\nint\ncard\n##ｃ\nep\nau\nrt\n##れた\n1080\nbill\n##mll\nkim\n３０\n460\nwan\n##uk\n##ミ\nx3\n298\n0t\nscott\n##ming\n239\ne5\n##3d\nh7n9\nworldcat\nbrown\n##あります\n##vo\n##led\n##580\n##ax\n249\n410\n##ert\nparis\n##～6\npolo\n925\n##lr\n599\n##ナ\ncapital\n##hing\nbank\ncv\n1g\n##chat\n##ｓ\n##たい\nadc\n##ule\n2m\n##ｅ\ndigital\nhotmail\n268\n##pad\n870\nbbq\nquot\n##ring\nbefore\nwali\n##まて\nmcu\n2k\n2b\nという\ncostco\n316\nnorth\n333\nswitch\n##city\n##ｐ\nphilips\n##mann\nmanagement\npanasonic\n##cl\n##vd\n##ping\n##rge\nalice\n##lk\n##ましょう\ncss3\n##ney\nvision\nalpha\n##ular\n##400\n##tter\nlz\nにお\n##ありません\nmode\ngre\n1916\npci\n##tm\n237\n1～2\n##yan\n##そ\nについて\n##let\n##キ\nwork\nwar\ncoach\nah\nmary\n##ᅵ\nhuang\n##pt\na8\npt\nfollow\n##berry\n1895\n##ew\na5\nghost\n##ション\n##wn\n##og\nsouth\n##code\ngirls\n##rid\naction\nvilla\ngit\nr11\ntable\ngames\n##cket\nerror\n##anonymoussaid\n##ag\nhere\n##ame\n##gc\nqa\n##■\n##lis\ngmp\n##gin\nvmalife\n##cher\nyu\nwedding\n##tis\ndemo\ndragon\n530\nsoho\nsocial\nbye\n##rant\nriver\norz\nacer\n325\n##↑\n##ース\n##ats\n261\ndel\n##ven\n440\nups\n##ように\n##ター\n305\nvalue\nmacd\nyougou\n##dn\n661\n##ano\nll\n##urt\n##rent\ncontinue\nscript\n##wen\n##ect\npaper\n263\n319\nshift\n##chel\n##フト\n##cat\n258\nx5\nfox\n243\n##さん\ncar\naaa\n##blog\nloading\n##yn\n##tp\nkuso\n799\nsi\nsns\nイカせるテンマ\nヒンクテンマ3\nrmb\nvdc\nforest\ncentral\nprime\nhelp\nultra\n##
rmb\n##ような\n241\nsquare\n688\n##しい\nのないフロクに\n##field\n##reen\n##ors\n##ju\nc1\nstart\n510\n##air\n##map\ncdn\n##wo\ncba\nstephen\nm8\n100km\n##get\nopera\n##base\n##ood\nvsa\ncom™\n##aw\n##ail\n251\nなのて\ncount\nt2\n##ᅡ\n##een\n2700\nhop\n##gp\nvsc\ntree\n##eg\n##ose\n816\n285\n##ories\n##shop\nalphago\nv4\n1909\nsimon\n##ᆼ\nfluke62max\nzip\nスホンサー\n##sta\nlouis\ncr\nbas\n##～10\nbc\n##yer\nhadoop\n##ube\n##wi\n1906\n0755\nhola\n##low\nplace\ncentre\n5v\nd3\n##fer\n252\n##750\n##media\n281\n540\n0l\nexchange\n262\nseries\n##ハー\n##san\neb\n##bank\n##ｋ\nq3\n##nge\n##mail\ntake\n##lp\n259\n1888\nclient\neast\ncache\nevent\nvincent\n##ールを\nきを\n##nse\nsui\n855\nadchoice\n##и\n##stry\n##なたの\n246\n##zone\nga\napps\nsea\n##ab\n248\ncisco\n##タ\n##rner\nkymco\n##care\ndha\n##pu\n##yi\nminkoff\nroyal\np1\nへの\nannie\n269\ncollection\nkpi\nplaystation\n257\nになります\n866\nbh\n##bar\nqueen\n505\nradio\n1904\nandy\narmani\n##xy\nmanager\niherb\n##ery\n##share\nspring\nraid\njohnson\n1908\n##ob\nvolvo\nhall\n##ball\nv6\nour\ntaylor\n##hk\nbi\n242\n##cp\nkate\nbo\nwater\ntechnology\n##rie\nサイトは\n277\n##ona\n##sl\nhpv\n303\ngtx\nhip\nrdquo\njayz\nstone\n##lex\n##rum\nnamespace\n##やり\n620\n##ale\n##atic\ndes\n##erson\n##ql\n##ves\n##type\nenter\n##この\n##てきます\nd2\n##168\n##mix\n##bian\nとの\na9\njj\nky\n##lc\naccess\nmovie\n##hc\nリストに\ntower\n##ration\n##mit\nます\n##nch\nua\ntel\nprefix\n##o2\n1907\n##point\n1901\nott\n～10\n##http\n##ury\nbaidu\n##ink\nmember\n##logy\nbigbang\nnownews\n##js\n##shot\n##tb\n##こと\n247\neba\n##tics\n##lus\nける\nv5\nspark\n##ama\nthere\n##ions\ngod\n##lls\n##down\nhiv\n##ress\nburberry\nday2\n##kv\n◆◆\njeff\nrelated\nfilm\nedit\njoseph\n283\n##ark\ncx\n32gb\norder\ng9\n30000\n##ans\n##tty\ns5\n##bee\nかあります\nthread\nxr\nbuy\nsh\n005\nland\nspotify\nmx\n##ari\n276\n##verse\n×email\nsf\nwhy\n##ことて\n244\n7headlines\nnego\nsunny\ndom\nexo\n401\n666\npositioning\nfit\nrgb\n##tton\n278\nkiss\nalexa\nadam\nlp\nみリストを\n##ｇ\nmp\n##ties\n##llow\namy\n##du\nnp\n002\ninstitute\n27
1\n##rth\n##lar\n2345\n590\n##des\nsidebar\n１５\nimax\nsite\n##cky\n##kit\n##ime\n##009\nseason\n323\n##fun\n##ンター\n##ひ\ngogoro\na7\npu\nlily\nfire\ntwd600\n##ッセーシを\nいて\n##vis\n30ml\n##cture\n##をお\ninformation\n##オ\nclose\nfriday\n##くれる\nyi\nnick\nてすか\n##tta\n##tel\n6500\n##lock\ncbd\neconomy\n254\nかお\n267\ntinker\ndouble\n375\n8gb\nvoice\n##app\noops\nchannel\ntoday\n985\n##right\nraw\nxyz\n##＋\njim\nedm\n##cent\n7500\nsupreme\n814\nds\n##its\n##asia\ndropbox\n##てすか\n##tti\nbooks\n272\n100ml\n##tle\n##ller\n##ken\n##more\n##boy\nsex\n309\n##dom\nt3\n##ider\n##なります\n##unch\n1903\n810\nfeel\n5500\n##かった\n##put\nにより\ns2\nmo\n##gh\nmen\nka\namoled\ndiv\n##tr\n##n1\nport\nhoward\n##tags\nken\ndnf\n##nus\nadsense\n##а\nide\n##へ\nbuff\nthunder\n##town\n##ique\nhas\n##body\nauto\npin\n##erry\ntee\nてした\n295\nnumber\n##the\n##013\nobject\npsp\ncool\nudnbkk\n16gb\n##mic\nmiui\n##tro\nmost\nr2\n##alk\n##nity\n1880\n±0\n##いました\n428\ns4\nlaw\nversion\n##oa\nn1\nsgs\ndocomo\n##tf\n##ack\nhenry\nfc2\n##ded\n##sco\n##014\n##rite\n286\n0mm\nlinkedin\n##ada\n##now\nwii\n##ndy\nucbug\n##◎\nsputniknews\nlegalminer\n##ika\n##xp\n2gb\n##bu\nq10\noo\nb6\ncome\n##rman\ncheese\nming\nmaker\n##gm\nnikon\n##fig\nppi\nkelly\n##ります\njchere\nてきます\nted\nmd\n003\nfgo\ntech\n##tto\ndan\nsoc\n##gl\n##len\nhair\nearth\n640\n521\nimg\n##pper\n##a1\n##てきる\n##ロク\nacca\n##ition\n##ference\nsuite\n##ig\noutlook\n##mond\n##cation\n398\n##pr\n279\n101vip\n358\n##999\n282\n64gb\n3800\n345\nairport\n##over\n284\n##おり\njones\n##ith\nlab\n##su\n##いるのて\nco2\ntown\npiece\n##llo\nno1\nvmware\n24h\n##qi\nfocus\nreader\n##admin\n##ora\ntb\nfalse\n##log\n1898\nknow\nlan\n838\n##ces\nf4\n##ume\nmotel\nstop\n##oper\nna\nflickr\nnetcomponents\n##af\n##─\npose\nwilliams\nlocal\n##ound\n##cg\n##site\n##iko\nいお\n274\n5m\ngsm\ncon\n##ath\n1902\nfriends\n##hip\ncell\n317\n##rey\n780\ncream\n##cks\n012\n##dp\nfacebooktwitterpinterestgoogle\nsso\n324\nshtml\nsong\nswiss\n##mw\n##キンク\nlumia\nxdd\nstring\ntiffany\n522\nmarc\nられた\
ninsee\nrussell\nsc\ndell\n##ations\nｏｋ\ncamera\n289\n##vs\n##flow\n##late\nclassic\n287\n##nter\nstay\ng1\nmtv\n512\n##ever\n##lab\n##nger\nqe\nsata\nryan\nd1\n50ml\ncms\n##cing\nsu\n292\n3300\neditor\n296\n##nap\nsecurity\nsunday\nassociation\n##ens\n##700\n##bra\nacg\n##かり\nsofascore\nとは\nmkv\n##ign\njonathan\ngary\nbuild\nlabels\n##oto\ntesla\nmoba\nqi\ngohappy\ngeneral\najax\n1024\n##かる\nサイト\nsociety\n##test\n##urs\nwps\nfedora\n##ich\nmozilla\n328\n##480\n##dr\nusa\nurn\n##lina\n##ｒ\ngrace\n##die\n##try\n##ader\n1250\n##なり\nelle\n570\n##chen\n##ᆯ\nprice\n##ten\nuhz\n##ough\neq\n##hen\nstates\npush\nsession\nbalance\nwow\n506\n##cus\n##py\nwhen\n##ward\n##ep\n34e\nwong\nlibrary\nprada\n##サイト\n##cle\nrunning\n##ree\n313\nck\ndate\nq4\n##ctive\n##ool\n##＞\nmk\n##ira\n##163\n388\ndie\nsecret\nrq\ndota\nbuffet\nは１ヶ\ne6\n##ez\npan\n368\nha\n##card\n##cha\n2a\n##さ\nalan\nday3\neye\nf3\n##end\nfrance\nkeep\nadi\nrna\ntvbs\n##ala\nsolo\nnova\n##え\n##tail\n##ょう\nsupport\n##ries\n##なる\n##ved\nbase\ncopy\niis\nfps\n##ways\nhero\nhgih\nprofile\nfish\nmu\nssh\nentertainment\nchang\n##wd\nclick\ncake\n##ond\npre\n##tom\nkic\npixel\n##ov\n##fl\nproduct\n6a\n##pd\ndear\n##gate\nes\nyumi\naudio\n##²\n##sky\necho\nbin\nwhere\n##ture\n329\n##ape\nfind\nsap\nisis\n##なと\nnand\n##101\n##load\n##ream\nband\na6\n525\nnever\n##post\nfestival\n50cm\n##we\n555\nguide\n314\nzenfone\n##ike\n335\ngd\nforum\njessica\nstrong\nalexander\n##ould\nsoftware\nallen\n##ious\nprogram\n360°\nelse\nlohasthree\n##gar\nすることかてきます\nplease\n##れます\nrc\n##ggle\n##ric\nbim\n50000\n##own\neclipse\n355\nbrian\n3ds\n##side\n061\n361\n##other\n##ける\n##tech\n##ator\n485\nengine\n##ged\n##ｔ\nplaza\n##fit\ncia\nngo\nwestbrook\nshi\ntbs\n50mm\n##みませんか\nsci\n291\nreuters\n##ily\ncontextlink\n##hn\naf\n##cil\nbridge\nvery\n##cel\n1890\ncambridge\n##ize\n15g\n##aid\n##data\n790\nfrm\n##head\naward\nbutler\n##sun\nmeta\n##mar\namerica\nps3\npuma\npmid\n##すか\nlc\n670\nkitchen\n##lic\nオーフン5\nきなしソフトサーヒス\nそして\nday1\nfuture\n
★★★★\n##text\n##page\n##rris\npm1\n##ket\nfans\n##っています\n1001\nchristian\nbot\nkids\ntrackback\n##hai\nc3\ndisplay\n##hl\nn2\n1896\nidea\nさんも\n##sent\nairmail\n##ug\n##men\npwm\nけます\n028\n##lution\n369\n852\nawards\nschemas\n354\nasics\nwikipedia\nfont\n##tional\n##vy\nc2\n293\n##れている\n##dget\n##ein\nっている\ncontact\npepper\nスキル\n339\n##～5\n294\n##uel\n##ument\n730\n##hang\nみてす\nq5\n##sue\nrain\n##ndi\nwei\nswatch\n##cept\nわせ\n331\npopular\n##ste\n##tag\np2\n501\ntrc\n1899\n##west\n##live\njustin\nhonda\nping\nmessenger\n##rap\nv9\n543\n##とは\nunity\nappqq\nはすへて\n025\nleo\n##tone\n##テ\n##ass\nuniqlo\n##010\n502\nher\njane\nmemory\nmoneydj\n##tical\nhuman\n12306\nしていると\n##m2\ncoc\nmiacare\n##mn\ntmt\n##core\nvim\nkk\n##may\nfan\ntarget\nuse\ntoo\n338\n435\n2050\n867\n737\nfast\n##2c\nservices\n##ope\nomega\nenergy\n##わ\npinkoi\n1a\n##なから\n##rain\njackson\n##ement\n##シャンルの\n374\n366\nそんな\np9\nrd\n##ᆨ\n1111\n##tier\n##vic\nzone\n##│\n385\n690\ndl\nisofix\ncpa\nm4\n322\nkimi\nめて\ndavis\n##lay\nlulu\n##uck\n050\nweeks\nqs\n##hop\n920\n##ｎ\nae\n##ear\n～5\neia\n405\n##fly\nkorea\njpeg\nboost\n##ship\nsmall\n##リア\n1860\neur\n297\n425\nvalley\n##iel\nsimple\n##ude\nrn\nk2\n##ena\nされます\nnon\npatrick\nしているから\n##ナー\nfeed\n5757\n30g\nprocess\nwell\nqqmei\n##thing\nthey\naws\nlu\npink\n##ters\n##kin\nまたは\nboard\n##vertisement\nwine\n##ien\nunicode\n##dge\nr1\n359\n##tant\nいを\n##twitter\n##3c\ncool1\nされる\n##れて\n##ｌ\nisp\n##012\nstandard\n45㎡2\n402\n##150\nmatt\n##fu\n326\n##iner\ngooglemsn\npixnetfacebookyahoo\n##ラン\nx7\n886\n##uce\nメーカー\nsao\n##ev\n##きました\n##file\n9678\n403\nxddd\nshirt\n6l\n##rio\n##hat\n3mm\ngivenchy\nya\nbang\n##lio\nmonday\ncrystal\nロクイン\n##abc\n336\nhead\n890\nubuntuforumwikilinuxpastechat\n##vc\n##～20\n##rity\ncnc\n7866\nipv6\nnull\n1897\n##ost\nyang\nimsean\ntiger\n##fet\n##ンス\n352\n##＝\ndji\n327\nji\nmaria\n##come\n##んて\nfoundation\n3100\n##beth\n##なった\n1m\n601\nactive\n##aft\n##don\n3p\nsr\n349\nemma\n##khz\nliving\n415\n353\n1889\n341\n709\n457\nsas\nx6\n#
#face\npptv\nx4\n##mate\nhan\nsophie\n##jing\n337\nfifa\n##mand\nother\nsale\ninwedding\n##gn\nてきちゃいます\n##mmy\n##pmlast\nbad\nnana\nnbc\nしてみてくたさいね\nなとはお\n##wu\n##かあります\n##あ\nnote7\nsingle\n##340\nせからこ\nしてくたさい♪この\nしにはとんとんワークケートを\nするとあなたにもっとマッチした\nならワークケートへ\nもみつかっちゃうかも\nワークケートの\n##bel\nwindow\n##dio\n##ht\nunion\nage\n382\n１４\n##ivity\n##ｙ\nコメント\ndomain\nneo\n##isa\n##lter\n5k\nf5\nsteven\n##cts\npowerpoint\ntft\nself\ng2\nft\n##テル\nzol\n##act\nmwc\n381\n343\nもう\nnbapop\n408\nてある\neds\nace\n##room\nprevious\nauthor\ntomtom\nil\n##ets\nhu\nfinancial\n☆☆☆\nっています\nbp\n5t\nchi\n1gb\n##hg\nfairmont\ncross\n008\ngay\nh2\nfunction\n##けて\n356\nalso\n1b\n625\n##ータ\n##raph\n1894\n3～5\n##ils\ni3\n334\navenue\n##host\nによる\n##bon\n##tsu\nmessage\nnavigation\n50g\nfintech\nh6\n##ことを\n8cm\n##ject\n##vas\n##firm\ncredit\n##wf\nxxxx\nform\n##nor\n##space\nhuawei\nplan\njson\nsbl\n##dc\nmachine\n921\n392\nwish\n##120\n##sol\nwindows7\nedward\n##ために\ndevelopment\nwashington\n##nsis\nlo\n818\n##sio\n##ym\n##bor\nplanet\n##～8\n##wt\nieee\ngpa\n##めて\ncamp\nann\ngm\n##tw\n##oka\nconnect\n##rss\n##work\n##atus\nwall\nchicken\nsoul\n2mm\n##times\nfa\n##ather\n##cord\n009\n##eep\nhitachi\ngui\nharry\n##pan\ne1\ndisney\n##press\n##ーション\nwind\n386\nfrigidaire\n##tl\nliu\nhsu\n332\nbasic\nvon\nev\nいた\nてきる\nスホンサーサイト\nlearning\n##ull\nexpedia\narchives\nchange\n##wei\nsanta\ncut\nins\n6gb\nturbo\nbrand\ncf1\n508\n004\nreturn\n747\n##rip\nh1\n##nis\n##をこ\n128gb\n##にお\n3t\napplication\nしており\nemc\nrx\n##oon\n384\nquick\n412\n15058\nwilson\nwing\nchapter\n##bug\nbeyond\n##cms\n##dar\n##oh\nzoom\ne2\ntrip\nsb\n##nba\nrcep\n342\naspx\nci\n080\ngc\ngnu\nめる\n##count\nadvanced\ndance\ndv\n##url\n##ging\n367\n8591\nam09\nshadow\nbattle\n346\n##ｉ\n##cia\n##という\nemily\n##のてす\n##tation\nhost\nff\ntechorz\nsars\n##mini\n##mporary\n##ering\nnc\n4200\n798\n##next\ncma\n##mbps\n##gas\n##ift\n##dot\n##ィ\n455\n##～17\namana\n##りの\n426\n##ros\nir\n00㎡1\n##eet\n##ible\n##↓\n710\nˋ▽ˊ\n##aka\ndcs\niq\n##ｖ\nl1\n##lor\nmagg
ie\n##011\n##iu\n588\n##～1\n830\n##gt\n1tb\narticles\ncreate\n##burg\n##iki\ndatabase\nfantasy\n##rex\n##cam\ndlc\ndean\n##you\nhard\npath\ngaming\nvictoria\nmaps\ncb\n##lee\n##itor\noverchicstoretvhome\nsystems\n##xt\n416\np3\nsarah\n760\n##nan\n407\n486\nx9\ninstall\nsecond\n626\n##ann\n##ph\n##rcle\n##nic\n860\n##nar\nec\n##とう\n768\nmetro\nchocolate\n##rian\n～4\n##table\n##しています\nskin\n##sn\n395\nmountain\n##0mm\ninparadise\n6m\n7x24\nib\n4800\n##jia\neeworld\ncreative\ng5\ng3\n357\nparker\necfa\nvillage\nからの\n18000\nsylvia\nサーヒス\nhbl\n##ques\n##onsored\n##x2\n##きます\n##v4\n##tein\nie6\n383\n##stack\n389\nver\n##ads\n##baby\nsound\nbbe\n##110\n##lone\n##uid\nads\n022\ngundam\n351\nthinkpad\n006\nscrum\nmatch\n##ave\nmems\n##470\n##oy\n##なりました\n##talk\nglass\nlamigo\nspan\n##eme\njob\n##a5\njay\nwade\nkde\n498\n##lace\nocean\ntvg\n##covery\n##r3\n##ners\n##rea\njunior\nthink\n##aine\ncover\n##ision\n##sia\n↓↓\n##bow\nmsi\n413\n458\n406\n##love\n711\n801\nsoft\nz2\n##pl\n456\n1840\nmobil\nmind\n##uy\n427\nnginx\n##oi\nめた\n##rr\n6221\n##mple\n##sson\n##ーシてす\n371\n##nts\n91tv\ncomhd\ncrv3000\n##uard\n1868\n397\ndeep\nlost\nfield\ngallery\n##bia\nrate\nspf\nredis\ntraction\n930\nicloud\n011\nなら\nfe\njose\n372\n##tory\ninto\nsohu\nfx\n899\n379\nkicstart2\n##hia\nすく\n##～3\n##sit\nra\n２４\n##walk\n##xure\n500g\n##pact\npacific\nxa\nnatural\ncarlo\n##250\n##walker\n1850\n##can\ncto\ngigi\n516\n##サー\npen\n##hoo\nob\nmatlab\n##ｂ\n##yy\n13913459\n##iti\nmango\n##bbs\nsense\nc5\noxford\n##ニア\nwalker\njennifer\n##ola\ncourse\n##bre\n701\n##pus\n##rder\nlucky\n075\n##ぁ\nivy\nなお\n##nia\nsotheby\nside\n##ugh\njoy\n##orage\n##ush\n##bat\n##dt\n364\nr9\n##2d\n##gio\n511\ncountry\nwear\n##lax\n##～7\n##moon\n393\nseven\nstudy\n411\n348\nlonzo\n8k\n##ェ\nevolution\n##イフ\n##kk\ngs\nkd\n##レス\narduino\n344\nb12\n##lux\narpg\n##rdon\ncook\n##x5\ndark\nfive\n##als\n##ida\nとても\nsign\n362\n##ちの\nsomething\n20mm\n##nda\n387\n##posted\nfresh\ntf\n1870\n422\ncam\n##mine\n##skip\n##form\n##ssion\ne
ducation\n394\n##tee\ndyson\nstage\n##jie\nwant\n##night\nepson\npack\nあります\n##ppy\nテリヘル\n##█\nwd\n##eh\n##rence\nleft\n##lvin\ngolden\nmhz\ndiscovery\n##trix\n##n2\nloft\n##uch\n##dra\n##sse\nspeed\n～1\n1mdb\nsorry\nwelcome\n##urn\nwave\ngaga\n##lmer\nteddy\n##160\nトラックハック\nせよ\n611\n##f2016\n378\nrp\n##sha\nrar\n##あなたに\n##きた\n840\nholiday\n##ュー\n373\n074\n##vg\n##nos\n##rail\ngartner\ngi\n6p\n##dium\nkit\n488\nb3\neco\n##ろう\n20g\nsean\n##stone\nautocad\nnu\n##np\nf16\nwrite\n029\nm5\n##ias\nimages\natp\n##dk\nfsm\n504\n1350\nve\n52kb\n##xxx\n##のに\n##cake\n414\nunit\nlim\nru\n1v\n##ification\npublished\nangela\n16g\nanalytics\nak\n##ｑ\n##nel\ngmt\n##icon\nagain\n##₂\n##bby\nios11\n445\nかこさいます\nwaze\nいてす\n##ハ\n9985\n##ust\n##ティー\nframework\n##007\niptv\ndelete\n52sykb\ncl\nwwdc\n027\n30cm\n##fw\n##ての\n1389\n##xon\nbrandt\n##ses\n##dragon\ntc\nvetements\nanne\nmonte\nmodern\nofficial\n##へて\n##ere\n##nne\n##oud\nもちろん\n５０\netnews\n##a2\n##graphy\n421\n863\n##ちゃん\n444\n##rtex\n##てお\nl2\n##gma\nmount\nccd\nたと\narchive\nmorning\ntan\nddos\ne7\n##ホ\nday4\n##ウ\ngis\n453\nits\n495\nfactory\nbruce\npg\n##ito\nってくたさい\nguest\ncdma\n##lling\n536\nn3\nしかし\n3～4\nmega\neyes\nro\n１３\nwomen\ndac\nchurch\n##jun\nsingapore\n##facebook\n6991\nstarbucks\n##tos\n##stin\n##shine\nzen\n##mu\ntina\n20℃\n1893\n##たけて\n503\n465\nrequest\n##gence\nqt\n##っ\n1886\n347\n363\nq7\n##zzi\ndiary\n##tore\n409\n##ead\n468\ncst\n##osa\ncanada\nagent\nva\n##jiang\n##ちは\n##ーク\n##lam\nsg\n##nix\n##sday\n##よって\ng6\n##master\nbing\n##zl\ncharlie\n１６\n8mm\nnb40\n##ーン\nthai\n##ルフ\nln284ct\n##itz\n##2f\nbonnie\n##food\n##lent\noriginals\n##stro\n##lts\n418\n∟∣\n##bscribe\nchildren\nntd\nyesstyle\n##かも\nhmv\n##tment\nd5\n2cm\narts\nsms\n##pn\n##я\n##いい\ntopios9\n539\nlifestyle\nvirtual\n##ague\nxz\n##deo\nmuji\n024\nunt\n##nnis\n##ᅩ\nfaq1\n1884\n396\n##ette\nfly\n64㎡\nはしめまして\n441\ncurry\n##pop\nのこ\nrelease\n##←\n##◆◆\n##cast\n073\nありな\n500ml\n##ews\n5c\n##stle\nios7\n##ima\n787\ndog\nlenovo\n##r4\nroger\n013\ncbs\n
vornado\n100m\n417\n##desk\n##クok\n##ald\n1867\n9595\n2900\n##van\noil\n##ｘ\nsome\nbreak\ncommon\n##jy\n##lines\ng7\ntwice\n419\nella\nnano\nbelle\nにこ\n##mes\n##self\n##note\njb\n##ことかてきます\nbenz\n##との\n##ova\n451\nsave\n##wing\n##ますのて\nkai\nりは\n##hua\n##rect\nrainer\n##unge\n448\n##0m\nadsl\n##かな\nguestname\n##uma\n##kins\n##zu\ntokichoi\n##price\ncounty\n##med\n##mus\nrmk\n391\naddress\nvm\nえて\nopenload\n##group\n##hin\n##iginal\namg\nurban\n##oz\njobs\nemi\n##public\nbeautiful\n##sch\nalbum\n##dden\n##bell\njerry\nworks\nhostel\nmiller\n##drive\n##rmin\n##１０\n376\nboot\n828\n##370\n##fx\n##cm～\n1885\n##nome\n##ctionary\n##oman\n##lish\n##cr\n##hm\n433\n##how\n432\nfrancis\nxi\nc919\nb5\nevernote\n##uc\nvga\n##3000\ncoupe\n##urg\n##cca\n##uality\n019\n6g\nれる\nmulti\n##また\n##ett\nem\nhey\n##ani\n##tax\n##rma\ninside\nthan\n740\nleonnhurt\n##jin\nict\nれた\nbird\nnotes\n200mm\nくの\n##dical\n##lli\nresult\n442\niu\nee\n438\nsmap\ngopro\n##last\nyin\npure\n998\n32g\nけた\n5kg\n##dan\n##rame\nmama\n##oot\nbean\nmarketing\n##hur\n2l\nbella\nsync\nxuite\n##ground\n515\ndiscuz\n##getrelax\n##ince\n##bay\n##5s\ncj\n##イス\ngmat\napt\n##pass\njing\n##rix\nc4\nrich\n##とても\nniusnews\n##ello\nbag\n770\n##eting\n##mobile\n１８\nculture\n015\n##のてすか\n377\n1020\narea\n##ience\n616\ndetails\ngp\nuniversal\nsilver\ndit\nはお\nprivate\nddd\nu11\nkanshu\n##ified\nfung\n##nny\ndx\n##520\ntai\n475\n023\n##fr\n##lean\n3s\n##pin\n429\n##rin\n25000\nly\nrick\n##bility\nusb3\nbanner\n##baru\n##gion\nmetal\ndt\nvdf\n1871\nkarl\nqualcomm\nbear\n1010\noldid\nian\njo\n##tors\npopulation\n##ernel\n1882\nmmorpg\n##mv\n##bike\n603\n##©\nww\nfriend\n##ager\nexhibition\n##del\n##pods\nfpx\nstructure\n##free\n##tings\nkl\n##rley\n##copyright\n##mma\ncalifornia\n3400\norange\nyoga\n4l\ncanmake\nhoney\n##anda\n##コメント\n595\nnikkie\n##ルハイト\ndhl\npublishing\n##mall\n##gnet\n20cm\n513\n##クセス\n##┅\ne88\n970\n##dog\nfishbase\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=
\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##{\n##|\n##}\n##~\n##£\n##¤\n##¥\n##§\n##«\n##±\n##³\n##µ\n##·\n##¹\n##º\n##»\n##¼\n##ß\n##æ\n##÷\n##ø\n##đ\n##ŋ\n##ɔ\n##ə\n##ɡ\n##ʰ\n##ˇ\n##ˈ\n##ˊ\n##ˋ\n##ˍ\n##ː\n##˙\n##˚\n##ˢ\n##α\n##β\n##γ\n##δ\n##ε\n##η\n##θ\n##ι\n##κ\n##λ\n##μ\n##ν\n##ο\n##π\n##ρ\n##ς\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##б\n##в\n##г\n##д\n##е\n##ж\n##з\n##к\n##л\n##м\n##н\n##о\n##п\n##р\n##с\n##т\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##ы\n##ь\n##і\n##ا\n##ب\n##ة\n##ت\n##د\n##ر\n##س\n##ع\n##ل\n##م\n##ن\n##ه\n##و\n##ي\n##۩\n##ก\n##ง\n##น\n##ม\n##ย\n##ร\n##อ\n##า\n##เ\n##๑\n##་\n##ღ\n##ᄀ\n##ᄁ\n##ᄂ\n##ᄃ\n##ᄅ\n##ᄆ\n##ᄇ\n##ᄈ\n##ᄉ\n##ᄋ\n##ᄌ\n##ᄎ\n##ᄏ\n##ᄐ\n##ᄑ\n##ᄒ\n##ᅢ\n##ᅣ\n##ᅥ\n##ᅦ\n##ᅧ\n##ᅨ\n##ᅪ\n##ᅬ\n##ᅭ\n##ᅮ\n##ᅯ\n##ᅲ\n##ᅳ\n##ᅴ\n##ᆷ\n##ᆸ\n##ᆺ\n##ᆻ\n##ᗜ\n##ᵃ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵘ\n##‖\n##„\n##†\n##•\n##‥\n##‧\n## \n##‰\n##′\n##″\n##‹\n##›\n##※\n##‿\n##⁄\n##ⁱ\n##⁺\n##ⁿ\n##₁\n##₃\n##₄\n##€\n##№\n##ⅰ\n##ⅱ\n##ⅲ\n##ⅳ\n##ⅴ\n##↔\n##↗\n##↘\n##⇒\n##∀\n##−\n##∕\n##∙\n##√\n##∞\n##∟\n##∠\n##∣\n##∩\n##∮\n##∶\n##∼\n##∽\n##≈\n##≒\n##≡\n##≤\n##≥\n##≦\n##≧\n##≪\n##≫\n##⊙\n##⋅\n##⋈\n##⋯\n##⌒\n##①\n##②\n##③\n##④\n##⑤\n##⑥\n##⑦\n##⑧\n##⑨\n##⑩\n##⑴\n##⑵\n##⑶\n##⑷\n##⑸\n##⒈\n##⒉\n##⒊\n##⒋\n##ⓒ\n##ⓔ\n##ⓘ\n##━\n##┃\n##┆\n##┊\n##┌\n##└\n##├\n##┣\n##═\n##║\n##╚\n##╞\n##╠\n##╭\n##╮\n##╯\n##╰\n##╱\n##╳\n##▂\n##▃\n##▅\n##▇\n##▉\n##▋\n##▌\n##▍\n##▎\n##□\n##▪\n##▫\n##▬\n##△\n##▶\n##►\n##▽\n##◇\n##◕\n##◠\n##◢\n##◤\n##☀\n##☕\n##☞\n##☺\n##☼\n##♀\n##♂\n##♠\n##♡\n##♣\n##♦\n##♫\n##♬\n##✈\n##✔\n##✕\n##✖\n##✦\n##✨\n##✪\n##✰\n##✿\n##❀\n##➜\n##➤\n##⦿\n##、\n##。\n##〃\n##々\n##〇\n##〈\n##〉\n##《\n##》\n##「\n##」\n##『\n##』\n##【\n##】\n##〓\n##〔\n##〕\n##〖\n##〗\n##〜\n##〝\n##〞\n##ぃ\n##ぇ\n##ぬ\n##ふ\n##ほ\n##む\n##ゃ\n##ゅ\n##ゆ\n##ょ\n##゜\n##ゝ\n##ァ\n##ゥ\n##エ\n##ォ\n##ケ\n##サ\n##セ\n##ソ\n##ッ\n##ニ\n##ヌ\n##ネ\n##ノ\n##ヘ\n##モ\n##ャ\n##ヤ\n##ュ\n##ユ\n##ョ\n##ヨ\n##ワ\n##ヲ\n##・\n##ヽ\n##ㄅ\n##ㄆ\n##ㄇ\n##ㄉ\n##ㄋ\n##ㄌ\n##ㄍ\n##ㄎ\n##ㄏ\n##ㄒ\n##ㄚ\n##ㄛ\n##ㄞ\n##ㄟ\n##ㄢ\n##ㄤ\n##ㄥ\n##ㄧ\n##ㄨ\n##ㆍ\n##㈦\n##㊣\n##㗎\n##一\n##丁\n##七\n##万\n##丈\n##三\n##上\n##
下\n##不\n##与\n##丐\n##丑\n##专\n##且\n##丕\n##世\n##丘\n##丙\n##业\n##丛\n##东\n##丝\n##丞\n##丟\n##両\n##丢\n##两\n##严\n##並\n##丧\n##丨\n##个\n##丫\n##中\n##丰\n##串\n##临\n##丶\n##丸\n##丹\n##为\n##主\n##丼\n##丽\n##举\n##丿\n##乂\n##乃\n##久\n##么\n##义\n##之\n##乌\n##乍\n##乎\n##乏\n##乐\n##乒\n##乓\n##乔\n##乖\n##乗\n##乘\n##乙\n##乜\n##九\n##乞\n##也\n##习\n##乡\n##书\n##乩\n##买\n##乱\n##乳\n##乾\n##亀\n##亂\n##了\n##予\n##争\n##事\n##二\n##于\n##亏\n##云\n##互\n##五\n##井\n##亘\n##亙\n##亚\n##些\n##亜\n##亞\n##亟\n##亡\n##亢\n##交\n##亥\n##亦\n##产\n##亨\n##亩\n##享\n##京\n##亭\n##亮\n##亲\n##亳\n##亵\n##人\n##亿\n##什\n##仁\n##仃\n##仄\n##仅\n##仆\n##仇\n##今\n##介\n##仍\n##从\n##仏\n##仑\n##仓\n##仔\n##仕\n##他\n##仗\n##付\n##仙\n##仝\n##仞\n##仟\n##代\n##令\n##以\n##仨\n##仪\n##们\n##仮\n##仰\n##仲\n##件\n##价\n##任\n##份\n##仿\n##企\n##伉\n##伊\n##伍\n##伎\n##伏\n##伐\n##休\n##伕\n##众\n##优\n##伙\n##会\n##伝\n##伞\n##伟\n##传\n##伢\n##伤\n##伦\n##伪\n##伫\n##伯\n##估\n##伴\n##伶\n##伸\n##伺\n##似\n##伽\n##佃\n##但\n##佇\n##佈\n##位\n##低\n##住\n##佐\n##佑\n##体\n##佔\n##何\n##佗\n##佘\n##余\n##佚\n##佛\n##作\n##佝\n##佞\n##佟\n##你\n##佢\n##佣\n##佤\n##佥\n##佩\n##佬\n##佯\n##佰\n##佳\n##併\n##佶\n##佻\n##佼\n##使\n##侃\n##侄\n##來\n##侈\n##例\n##侍\n##侏\n##侑\n##侖\n##侗\n##供\n##依\n##侠\n##価\n##侣\n##侥\n##侦\n##侧\n##侨\n##侬\n##侮\n##侯\n##侵\n##侶\n##侷\n##便\n##係\n##促\n##俄\n##俊\n##俎\n##俏\n##俐\n##俑\n##俗\n##俘\n##俚\n##保\n##俞\n##俟\n##俠\n##信\n##俨\n##俩\n##俪\n##俬\n##俭\n##修\n##俯\n##俱\n##俳\n##俸\n##俺\n##俾\n##倆\n##倉\n##個\n##倌\n##倍\n##倏\n##們\n##倒\n##倔\n##倖\n##倘\n##候\n##倚\n##倜\n##借\n##倡\n##値\n##倦\n##倩\n##倪\n##倫\n##倬\n##倭\n##倶\n##债\n##值\n##倾\n##偃\n##假\n##偈\n##偉\n##偌\n##偎\n##偏\n##偕\n##做\n##停\n##健\n##側\n##偵\n##偶\n##偷\n##偻\n##偽\n##偿\n##傀\n##傅\n##傍\n##傑\n##傘\n##備\n##傚\n##傢\n##傣\n##傥\n##储\n##傩\n##催\n##傭\n##傲\n##傳\n##債\n##傷\n##傻\n##傾\n##僅\n##働\n##像\n##僑\n##僕\n##僖\n##僚\n##僥\n##僧\n##僭\n##僮\n##僱\n##僵\n##價\n##僻\n##儀\n##儂\n##億\n##儆\n##儉\n##儋\n##儒\n##儕\n##儘\n##償\n##儡\n##優\n##儲\n##儷\n##儼\n##儿\n##兀\n##允\n##元\n##兄\n##充\n##兆\n##兇\n##先\n##光\n##克\n##兌\n##免\n##児\n##兑\n##兒\n##兔\n##兖\n##党\n##兜\n##兢\n##入\n##內\n##全\n##兩\n##八\n##公\n##六\n##兮\n##兰\n##共\n##兲\n##关\n##兴\n##兵\n##其\n##具\n##典\n##兹\n##养\n##兼\n##兽\n##
冀\n##内\n##円\n##冇\n##冈\n##冉\n##冊\n##册\n##再\n##冏\n##冒\n##冕\n##冗\n##写\n##军\n##农\n##冠\n##冢\n##冤\n##冥\n##冨\n##冪\n##冬\n##冯\n##冰\n##冲\n##决\n##况\n##冶\n##冷\n##冻\n##冼\n##冽\n##冾\n##净\n##凄\n##准\n##凇\n##凈\n##凉\n##凋\n##凌\n##凍\n##减\n##凑\n##凛\n##凜\n##凝\n##几\n##凡\n##凤\n##処\n##凪\n##凭\n##凯\n##凰\n##凱\n##凳\n##凶\n##凸\n##凹\n##出\n##击\n##函\n##凿\n##刀\n##刁\n##刃\n##分\n##切\n##刈\n##刊\n##刍\n##刎\n##刑\n##划\n##列\n##刘\n##则\n##刚\n##创\n##初\n##删\n##判\n##別\n##刨\n##利\n##刪\n##别\n##刮\n##到\n##制\n##刷\n##券\n##刹\n##刺\n##刻\n##刽\n##剁\n##剂\n##剃\n##則\n##剉\n##削\n##剋\n##剌\n##前\n##剎\n##剐\n##剑\n##剔\n##剖\n##剛\n##剜\n##剝\n##剣\n##剤\n##剥\n##剧\n##剩\n##剪\n##副\n##割\n##創\n##剷\n##剽\n##剿\n##劃\n##劇\n##劈\n##劉\n##劊\n##劍\n##劏\n##劑\n##力\n##劝\n##办\n##功\n##加\n##务\n##劣\n##动\n##助\n##努\n##劫\n##劭\n##励\n##劲\n##劳\n##労\n##劵\n##効\n##劾\n##势\n##勁\n##勃\n##勇\n##勉\n##勋\n##勐\n##勒\n##動\n##勖\n##勘\n##務\n##勛\n##勝\n##勞\n##募\n##勢\n##勤\n##勧\n##勳\n##勵\n##勸\n##勺\n##勻\n##勾\n##勿\n##匀\n##包\n##匆\n##匈\n##匍\n##匐\n##匕\n##化\n##北\n##匙\n##匝\n##匠\n##匡\n##匣\n##匪\n##匮\n##匯\n##匱\n##匹\n##区\n##医\n##匾\n##匿\n##區\n##十\n##千\n##卅\n##升\n##午\n##卉\n##半\n##卍\n##华\n##协\n##卑\n##卒\n##卓\n##協\n##单\n##卖\n##南\n##単\n##博\n##卜\n##卞\n##卟\n##占\n##卡\n##卢\n##卤\n##卦\n##卧\n##卫\n##卮\n##卯\n##印\n##危\n##即\n##却\n##卵\n##卷\n##卸\n##卻\n##卿\n##厂\n##厄\n##厅\n##历\n##厉\n##压\n##厌\n##厕\n##厘\n##厚\n##厝\n##原\n##厢\n##厥\n##厦\n##厨\n##厩\n##厭\n##厮\n##厲\n##厳\n##去\n##县\n##叁\n##参\n##參\n##又\n##叉\n##及\n##友\n##双\n##反\n##収\n##发\n##叔\n##取\n##受\n##变\n##叙\n##叛\n##叟\n##叠\n##叡\n##叢\n##口\n##古\n##句\n##另\n##叨\n##叩\n##只\n##叫\n##召\n##叭\n##叮\n##可\n##台\n##叱\n##史\n##右\n##叵\n##叶\n##号\n##司\n##叹\n##叻\n##叼\n##叽\n##吁\n##吃\n##各\n##吆\n##合\n##吉\n##吊\n##吋\n##同\n##名\n##后\n##吏\n##吐\n##向\n##吒\n##吓\n##吕\n##吖\n##吗\n##君\n##吝\n##吞\n##吟\n##吠\n##吡\n##否\n##吧\n##吨\n##吩\n##含\n##听\n##吭\n##吮\n##启\n##吱\n##吳\n##吴\n##吵\n##吶\n##吸\n##吹\n##吻\n##吼\n##吽\n##吾\n##呀\n##呂\n##呃\n##呆\n##呈\n##告\n##呋\n##呎\n##呐\n##呓\n##呕\n##呗\n##员\n##呛\n##呜\n##呢\n##呤\n##呦\n##周\n##呱\n##呲\n##味\n##呵\n##呷\n##呸\n##呻\n##呼\n##命\n##咀\n##咁\n##咂\n##咄\n##咆\n##咋\n##和\n##咎\n##咏\n##咐\n##咒\n##咔\n##咕\n##咖\n##咗\n##
咘\n##咙\n##咚\n##咛\n##咣\n##咤\n##咦\n##咧\n##咨\n##咩\n##咪\n##咫\n##咬\n##咭\n##咯\n##咱\n##咲\n##咳\n##咸\n##咻\n##咽\n##咿\n##哀\n##品\n##哂\n##哄\n##哆\n##哇\n##哈\n##哉\n##哋\n##哌\n##响\n##哎\n##哏\n##哐\n##哑\n##哒\n##哔\n##哗\n##哟\n##員\n##哥\n##哦\n##哧\n##哨\n##哩\n##哪\n##哭\n##哮\n##哲\n##哺\n##哼\n##哽\n##唁\n##唄\n##唆\n##唇\n##唉\n##唏\n##唐\n##唑\n##唔\n##唠\n##唤\n##唧\n##唬\n##售\n##唯\n##唰\n##唱\n##唳\n##唷\n##唸\n##唾\n##啃\n##啄\n##商\n##啉\n##啊\n##問\n##啓\n##啕\n##啖\n##啜\n##啞\n##啟\n##啡\n##啤\n##啥\n##啦\n##啧\n##啪\n##啫\n##啬\n##啮\n##啰\n##啱\n##啲\n##啵\n##啶\n##啷\n##啸\n##啻\n##啼\n##啾\n##喀\n##喂\n##喃\n##善\n##喆\n##喇\n##喉\n##喊\n##喋\n##喎\n##喏\n##喔\n##喘\n##喙\n##喚\n##喜\n##喝\n##喟\n##喧\n##喪\n##喫\n##喬\n##單\n##喰\n##喱\n##喲\n##喳\n##喵\n##営\n##喷\n##喹\n##喺\n##喻\n##喽\n##嗅\n##嗆\n##嗇\n##嗎\n##嗑\n##嗒\n##嗓\n##嗔\n##嗖\n##嗚\n##嗜\n##嗝\n##嗟\n##嗡\n##嗣\n##嗤\n##嗦\n##嗨\n##嗪\n##嗬\n##嗯\n##嗰\n##嗲\n##嗳\n##嗶\n##嗷\n##嗽\n##嘀\n##嘅\n##嘆\n##嘈\n##嘉\n##嘌\n##嘍\n##嘎\n##嘔\n##嘖\n##嘗\n##嘘\n##嘚\n##嘛\n##嘜\n##嘞\n##嘟\n##嘢\n##嘣\n##嘤\n##嘧\n##嘩\n##嘭\n##嘮\n##嘯\n##嘰\n##嘱\n##嘲\n##嘴\n##嘶\n##嘸\n##嘹\n##嘻\n##嘿\n##噁\n##噌\n##噎\n##噓\n##噔\n##噗\n##噙\n##噜\n##噠\n##噢\n##噤\n##器\n##噩\n##噪\n##噬\n##噱\n##噴\n##噶\n##噸\n##噹\n##噻\n##噼\n##嚀\n##嚇\n##嚎\n##嚏\n##嚐\n##嚓\n##嚕\n##嚟\n##嚣\n##嚥\n##嚨\n##嚮\n##嚴\n##嚷\n##嚼\n##囂\n##囉\n##囊\n##囍\n##囑\n##囔\n##囗\n##囚\n##四\n##囝\n##回\n##囟\n##因\n##囡\n##团\n##団\n##囤\n##囧\n##囪\n##囫\n##园\n##困\n##囱\n##囲\n##図\n##围\n##囹\n##固\n##国\n##图\n##囿\n##圃\n##圄\n##圆\n##圈\n##國\n##圍\n##圏\n##園\n##圓\n##圖\n##團\n##圜\n##土\n##圣\n##圧\n##在\n##圩\n##圭\n##地\n##圳\n##场\n##圻\n##圾\n##址\n##坂\n##均\n##坊\n##坍\n##坎\n##坏\n##坐\n##坑\n##块\n##坚\n##坛\n##坝\n##坞\n##坟\n##坠\n##坡\n##坤\n##坦\n##坨\n##坪\n##坯\n##坳\n##坵\n##坷\n##垂\n##垃\n##垄\n##型\n##垒\n##垚\n##垛\n##垠\n##垢\n##垣\n##垦\n##垩\n##垫\n##垭\n##垮\n##垵\n##埂\n##埃\n##埋\n##城\n##埔\n##埕\n##埗\n##域\n##埠\n##埤\n##埵\n##執\n##埸\n##培\n##基\n##埼\n##堀\n##堂\n##堃\n##堅\n##堆\n##堇\n##堑\n##堕\n##堙\n##堡\n##堤\n##堪\n##堯\n##堰\n##報\n##場\n##堵\n##堺\n##堿\n##塊\n##塌\n##塑\n##塔\n##塗\n##塘\n##塚\n##塞\n##塢\n##塩\n##填\n##塬\n##塭\n##塵\n##塾\n##墀\n##境\n##墅\n##墉\n##墊\n##墒\n##墓\n##増\n##墘\n##墙\n##墜\n##增\n##墟\n##墨\n##墩\n##墮\n##墳\n##
墻\n##墾\n##壁\n##壅\n##壆\n##壇\n##壊\n##壑\n##壓\n##壕\n##壘\n##壞\n##壟\n##壢\n##壤\n##壩\n##士\n##壬\n##壮\n##壯\n##声\n##売\n##壳\n##壶\n##壹\n##壺\n##壽\n##处\n##备\n##変\n##复\n##夏\n##夔\n##夕\n##外\n##夙\n##多\n##夜\n##够\n##夠\n##夢\n##夥\n##大\n##天\n##太\n##夫\n##夭\n##央\n##夯\n##失\n##头\n##夷\n##夸\n##夹\n##夺\n##夾\n##奂\n##奄\n##奇\n##奈\n##奉\n##奋\n##奎\n##奏\n##奐\n##契\n##奔\n##奕\n##奖\n##套\n##奘\n##奚\n##奠\n##奢\n##奥\n##奧\n##奪\n##奬\n##奮\n##女\n##奴\n##奶\n##奸\n##她\n##好\n##如\n##妃\n##妄\n##妆\n##妇\n##妈\n##妊\n##妍\n##妒\n##妓\n##妖\n##妘\n##妙\n##妝\n##妞\n##妣\n##妤\n##妥\n##妨\n##妩\n##妪\n##妮\n##妲\n##妳\n##妹\n##妻\n##妾\n##姆\n##姉\n##姊\n##始\n##姍\n##姐\n##姑\n##姒\n##姓\n##委\n##姗\n##姚\n##姜\n##姝\n##姣\n##姥\n##姦\n##姨\n##姪\n##姫\n##姬\n##姹\n##姻\n##姿\n##威\n##娃\n##娄\n##娅\n##娆\n##娇\n##娉\n##娑\n##娓\n##娘\n##娛\n##娜\n##娟\n##娠\n##娣\n##娥\n##娩\n##娱\n##娲\n##娴\n##娶\n##娼\n##婀\n##婁\n##婆\n##婉\n##婊\n##婕\n##婚\n##婢\n##婦\n##婧\n##婪\n##婭\n##婴\n##婵\n##婶\n##婷\n##婺\n##婿\n##媒\n##媚\n##媛\n##媞\n##媧\n##媲\n##媳\n##媽\n##媾\n##嫁\n##嫂\n##嫉\n##嫌\n##嫑\n##嫔\n##嫖\n##嫘\n##嫚\n##嫡\n##嫣\n##嫦\n##嫩\n##嫲\n##嫵\n##嫻\n##嬅\n##嬉\n##嬌\n##嬗\n##嬛\n##嬢\n##嬤\n##嬪\n##嬰\n##嬴\n##嬷\n##嬸\n##嬿\n##孀\n##孃\n##子\n##孑\n##孔\n##孕\n##孖\n##字\n##存\n##孙\n##孚\n##孛\n##孜\n##孝\n##孟\n##孢\n##季\n##孤\n##学\n##孩\n##孪\n##孫\n##孬\n##孰\n##孱\n##孳\n##孵\n##學\n##孺\n##孽\n##孿\n##宁\n##它\n##宅\n##宇\n##守\n##安\n##宋\n##完\n##宏\n##宓\n##宕\n##宗\n##官\n##宙\n##定\n##宛\n##宜\n##宝\n##实\n##実\n##宠\n##审\n##客\n##宣\n##室\n##宥\n##宦\n##宪\n##宫\n##宮\n##宰\n##害\n##宴\n##宵\n##家\n##宸\n##容\n##宽\n##宾\n##宿\n##寂\n##寄\n##寅\n##密\n##寇\n##富\n##寐\n##寒\n##寓\n##寛\n##寝\n##寞\n##察\n##寡\n##寢\n##寥\n##實\n##寧\n##寨\n##審\n##寫\n##寬\n##寮\n##寰\n##寵\n##寶\n##寸\n##对\n##寺\n##寻\n##导\n##対\n##寿\n##封\n##専\n##射\n##将\n##將\n##專\n##尉\n##尊\n##尋\n##對\n##導\n##小\n##少\n##尔\n##尕\n##尖\n##尘\n##尚\n##尝\n##尤\n##尧\n##尬\n##就\n##尴\n##尷\n##尸\n##尹\n##尺\n##尻\n##尼\n##尽\n##尾\n##尿\n##局\n##屁\n##层\n##屄\n##居\n##屆\n##屈\n##屉\n##届\n##屋\n##屌\n##屍\n##屎\n##屏\n##屐\n##屑\n##展\n##屜\n##属\n##屠\n##屡\n##屢\n##層\n##履\n##屬\n##屯\n##山\n##屹\n##屿\n##岀\n##岁\n##岂\n##岌\n##岐\n##岑\n##岔\n##岖\n##岗\n##岘\n##岙\n##岚\n##岛\n##岡\n##岩\n##岫\n##岬\n##岭\n##岱\n##岳\n##
岷\n##岸\n##峇\n##峋\n##峒\n##峙\n##峡\n##峤\n##峥\n##峦\n##峨\n##峪\n##峭\n##峯\n##峰\n##峴\n##島\n##峻\n##峽\n##崁\n##崂\n##崆\n##崇\n##崎\n##崑\n##崔\n##崖\n##崗\n##崙\n##崛\n##崧\n##崩\n##崭\n##崴\n##崽\n##嵇\n##嵊\n##嵋\n##嵌\n##嵐\n##嵘\n##嵩\n##嵬\n##嵯\n##嶂\n##嶄\n##嶇\n##嶋\n##嶙\n##嶺\n##嶼\n##嶽\n##巅\n##巍\n##巒\n##巔\n##巖\n##川\n##州\n##巡\n##巢\n##工\n##左\n##巧\n##巨\n##巩\n##巫\n##差\n##己\n##已\n##巳\n##巴\n##巷\n##巻\n##巽\n##巾\n##巿\n##币\n##市\n##布\n##帅\n##帆\n##师\n##希\n##帐\n##帑\n##帕\n##帖\n##帘\n##帚\n##帛\n##帜\n##帝\n##帥\n##带\n##帧\n##師\n##席\n##帮\n##帯\n##帰\n##帳\n##帶\n##帷\n##常\n##帼\n##帽\n##幀\n##幂\n##幄\n##幅\n##幌\n##幔\n##幕\n##幟\n##幡\n##幢\n##幣\n##幫\n##干\n##平\n##年\n##并\n##幸\n##幹\n##幺\n##幻\n##幼\n##幽\n##幾\n##广\n##庁\n##広\n##庄\n##庆\n##庇\n##床\n##序\n##庐\n##库\n##应\n##底\n##庖\n##店\n##庙\n##庚\n##府\n##庞\n##废\n##庠\n##度\n##座\n##庫\n##庭\n##庵\n##庶\n##康\n##庸\n##庹\n##庾\n##廁\n##廂\n##廃\n##廈\n##廉\n##廊\n##廓\n##廖\n##廚\n##廝\n##廟\n##廠\n##廢\n##廣\n##廬\n##廳\n##延\n##廷\n##建\n##廿\n##开\n##弁\n##异\n##弃\n##弄\n##弈\n##弊\n##弋\n##式\n##弑\n##弒\n##弓\n##弔\n##引\n##弗\n##弘\n##弛\n##弟\n##张\n##弥\n##弦\n##弧\n##弩\n##弭\n##弯\n##弱\n##張\n##強\n##弹\n##强\n##弼\n##弾\n##彅\n##彆\n##彈\n##彌\n##彎\n##归\n##当\n##录\n##彗\n##彙\n##彝\n##形\n##彤\n##彥\n##彦\n##彧\n##彩\n##彪\n##彫\n##彬\n##彭\n##彰\n##影\n##彷\n##役\n##彻\n##彼\n##彿\n##往\n##征\n##径\n##待\n##徇\n##很\n##徉\n##徊\n##律\n##後\n##徐\n##徑\n##徒\n##従\n##徕\n##得\n##徘\n##徙\n##徜\n##從\n##徠\n##御\n##徨\n##復\n##循\n##徬\n##微\n##徳\n##徴\n##徵\n##德\n##徹\n##徼\n##徽\n##心\n##必\n##忆\n##忌\n##忍\n##忏\n##忐\n##忑\n##忒\n##忖\n##志\n##忘\n##忙\n##応\n##忠\n##忡\n##忤\n##忧\n##忪\n##快\n##忱\n##念\n##忻\n##忽\n##忿\n##怀\n##态\n##怂\n##怅\n##怆\n##怎\n##怏\n##怒\n##怔\n##怕\n##怖\n##怙\n##怜\n##思\n##怠\n##怡\n##急\n##怦\n##性\n##怨\n##怪\n##怯\n##怵\n##总\n##怼\n##恁\n##恃\n##恆\n##恋\n##恍\n##恐\n##恒\n##恕\n##恙\n##恚\n##恢\n##恣\n##恤\n##恥\n##恨\n##恩\n##恪\n##恫\n##恬\n##恭\n##息\n##恰\n##恳\n##恵\n##恶\n##恸\n##恺\n##恻\n##恼\n##恿\n##悄\n##悅\n##悉\n##悌\n##悍\n##悔\n##悖\n##悚\n##悟\n##悠\n##患\n##悦\n##您\n##悩\n##悪\n##悬\n##悯\n##悱\n##悲\n##悴\n##悵\n##悶\n##悸\n##悻\n##悼\n##悽\n##情\n##惆\n##惇\n##惊\n##惋\n##惑\n##惕\n##惘\n##惚\n##惜\n##惟\n##惠\n##惡\n##惦\n##惧\n##惨\n##惩\n##惫\n##惬\n##惭\n##
惮\n##惯\n##惰\n##惱\n##想\n##惴\n##惶\n##惹\n##惺\n##愁\n##愆\n##愈\n##愉\n##愍\n##意\n##愕\n##愚\n##愛\n##愜\n##感\n##愣\n##愤\n##愧\n##愫\n##愷\n##愿\n##慄\n##慈\n##態\n##慌\n##慎\n##慑\n##慕\n##慘\n##慚\n##慟\n##慢\n##慣\n##慧\n##慨\n##慫\n##慮\n##慰\n##慳\n##慵\n##慶\n##慷\n##慾\n##憂\n##憊\n##憋\n##憎\n##憐\n##憑\n##憔\n##憚\n##憤\n##憧\n##憨\n##憩\n##憫\n##憬\n##憲\n##憶\n##憾\n##懂\n##懇\n##懈\n##應\n##懊\n##懋\n##懑\n##懒\n##懦\n##懲\n##懵\n##懶\n##懷\n##懸\n##懺\n##懼\n##懾\n##懿\n##戀\n##戈\n##戊\n##戌\n##戍\n##戎\n##戏\n##成\n##我\n##戒\n##戕\n##或\n##战\n##戚\n##戛\n##戟\n##戡\n##戦\n##截\n##戬\n##戮\n##戰\n##戲\n##戳\n##戴\n##戶\n##户\n##戸\n##戻\n##戾\n##房\n##所\n##扁\n##扇\n##扈\n##扉\n##手\n##才\n##扎\n##扑\n##扒\n##打\n##扔\n##払\n##托\n##扛\n##扣\n##扦\n##执\n##扩\n##扪\n##扫\n##扬\n##扭\n##扮\n##扯\n##扰\n##扱\n##扳\n##扶\n##批\n##扼\n##找\n##承\n##技\n##抄\n##抉\n##把\n##抑\n##抒\n##抓\n##投\n##抖\n##抗\n##折\n##抚\n##抛\n##抜\n##択\n##抟\n##抠\n##抡\n##抢\n##护\n##报\n##抨\n##披\n##抬\n##抱\n##抵\n##抹\n##押\n##抽\n##抿\n##拂\n##拄\n##担\n##拆\n##拇\n##拈\n##拉\n##拋\n##拌\n##拍\n##拎\n##拐\n##拒\n##拓\n##拔\n##拖\n##拗\n##拘\n##拙\n##拚\n##招\n##拜\n##拟\n##拡\n##拢\n##拣\n##拥\n##拦\n##拧\n##拨\n##择\n##括\n##拭\n##拮\n##拯\n##拱\n##拳\n##拴\n##拷\n##拼\n##拽\n##拾\n##拿\n##持\n##挂\n##指\n##挈\n##按\n##挎\n##挑\n##挖\n##挙\n##挚\n##挛\n##挝\n##挞\n##挟\n##挠\n##挡\n##挣\n##挤\n##挥\n##挨\n##挪\n##挫\n##振\n##挲\n##挹\n##挺\n##挽\n##挾\n##捂\n##捅\n##捆\n##捉\n##捋\n##捌\n##捍\n##捎\n##捏\n##捐\n##捕\n##捞\n##损\n##捡\n##换\n##捣\n##捧\n##捨\n##捩\n##据\n##捱\n##捲\n##捶\n##捷\n##捺\n##捻\n##掀\n##掂\n##掃\n##掇\n##授\n##掉\n##掌\n##掏\n##掐\n##排\n##掖\n##掘\n##掙\n##掛\n##掠\n##採\n##探\n##掣\n##接\n##控\n##推\n##掩\n##措\n##掬\n##掰\n##掲\n##掳\n##掴\n##掷\n##掸\n##掺\n##揀\n##揃\n##揄\n##揆\n##揉\n##揍\n##描\n##提\n##插\n##揖\n##揚\n##換\n##握\n##揣\n##揩\n##揪\n##揭\n##揮\n##援\n##揶\n##揸\n##揹\n##揽\n##搀\n##搁\n##搂\n##搅\n##損\n##搏\n##搐\n##搓\n##搔\n##搖\n##搗\n##搜\n##搞\n##搡\n##搪\n##搬\n##搭\n##搵\n##搶\n##携\n##搽\n##摀\n##摁\n##摄\n##摆\n##摇\n##摈\n##摊\n##摒\n##摔\n##摘\n##摞\n##摟\n##摧\n##摩\n##摯\n##摳\n##摸\n##摹\n##摺\n##摻\n##撂\n##撃\n##撅\n##撇\n##撈\n##撐\n##撑\n##撒\n##撓\n##撕\n##撚\n##撞\n##撤\n##撥\n##撩\n##撫\n##撬\n##播\n##撮\n##撰\n##撲\n##撵\n##撷\n##撸\n##撻\n##撼\n##撿\n##擀\n##擁\n##擂\n##擄\n##
擅\n##擇\n##擊\n##擋\n##操\n##擎\n##擒\n##擔\n##擘\n##據\n##擞\n##擠\n##擡\n##擢\n##擦\n##擬\n##擰\n##擱\n##擲\n##擴\n##擷\n##擺\n##擼\n##擾\n##攀\n##攏\n##攒\n##攔\n##攘\n##攙\n##攜\n##攝\n##攞\n##攢\n##攣\n##攤\n##攥\n##攪\n##攫\n##攬\n##支\n##收\n##攸\n##改\n##攻\n##放\n##政\n##故\n##效\n##敌\n##敍\n##敎\n##敏\n##救\n##敕\n##敖\n##敗\n##敘\n##教\n##敛\n##敝\n##敞\n##敢\n##散\n##敦\n##敬\n##数\n##敲\n##整\n##敵\n##敷\n##數\n##斂\n##斃\n##文\n##斋\n##斌\n##斎\n##斐\n##斑\n##斓\n##斗\n##料\n##斛\n##斜\n##斟\n##斡\n##斤\n##斥\n##斧\n##斩\n##斫\n##斬\n##断\n##斯\n##新\n##斷\n##方\n##於\n##施\n##旁\n##旃\n##旅\n##旋\n##旌\n##旎\n##族\n##旖\n##旗\n##无\n##既\n##日\n##旦\n##旧\n##旨\n##早\n##旬\n##旭\n##旮\n##旱\n##时\n##旷\n##旺\n##旻\n##昀\n##昂\n##昆\n##昇\n##昉\n##昊\n##昌\n##明\n##昏\n##易\n##昔\n##昕\n##昙\n##星\n##映\n##春\n##昧\n##昨\n##昭\n##是\n##昱\n##昴\n##昵\n##昶\n##昼\n##显\n##晁\n##時\n##晃\n##晉\n##晋\n##晌\n##晏\n##晒\n##晓\n##晔\n##晕\n##晖\n##晗\n##晚\n##晝\n##晞\n##晟\n##晤\n##晦\n##晨\n##晩\n##普\n##景\n##晰\n##晴\n##晶\n##晷\n##智\n##晾\n##暂\n##暄\n##暇\n##暈\n##暉\n##暌\n##暐\n##暑\n##暖\n##暗\n##暝\n##暢\n##暧\n##暨\n##暫\n##暮\n##暱\n##暴\n##暸\n##暹\n##曄\n##曆\n##曇\n##曉\n##曖\n##曙\n##曜\n##曝\n##曠\n##曦\n##曬\n##曰\n##曲\n##曳\n##更\n##書\n##曹\n##曼\n##曾\n##替\n##最\n##會\n##月\n##有\n##朋\n##服\n##朐\n##朔\n##朕\n##朗\n##望\n##朝\n##期\n##朦\n##朧\n##木\n##未\n##末\n##本\n##札\n##朮\n##术\n##朱\n##朴\n##朵\n##机\n##朽\n##杀\n##杂\n##权\n##杆\n##杈\n##杉\n##李\n##杏\n##材\n##村\n##杓\n##杖\n##杜\n##杞\n##束\n##杠\n##条\n##来\n##杨\n##杭\n##杯\n##杰\n##東\n##杳\n##杵\n##杷\n##杼\n##松\n##板\n##极\n##构\n##枇\n##枉\n##枋\n##析\n##枕\n##林\n##枚\n##果\n##枝\n##枢\n##枣\n##枪\n##枫\n##枭\n##枯\n##枰\n##枱\n##枳\n##架\n##枷\n##枸\n##柄\n##柏\n##某\n##柑\n##柒\n##染\n##柔\n##柘\n##柚\n##柜\n##柞\n##柠\n##柢\n##查\n##柩\n##柬\n##柯\n##柱\n##柳\n##柴\n##柵\n##査\n##柿\n##栀\n##栃\n##栄\n##栅\n##标\n##栈\n##栉\n##栋\n##栎\n##栏\n##树\n##栓\n##栖\n##栗\n##校\n##栩\n##株\n##样\n##核\n##根\n##格\n##栽\n##栾\n##桀\n##桁\n##桂\n##桃\n##桅\n##框\n##案\n##桉\n##桌\n##桎\n##桐\n##桑\n##桓\n##桔\n##桜\n##桠\n##桡\n##桢\n##档\n##桥\n##桦\n##桧\n##桨\n##桩\n##桶\n##桿\n##梁\n##梅\n##梆\n##梏\n##梓\n##梗\n##條\n##梟\n##梢\n##梦\n##梧\n##梨\n##梭\n##梯\n##械\n##梳\n##梵\n##梶\n##检\n##棂\n##棄\n##棉\n##棋\n##棍\n##棒\n##棕\n##棗\n##棘\n##棚\n##棟\n##
棠\n##棣\n##棧\n##森\n##棱\n##棲\n##棵\n##棹\n##棺\n##椁\n##椅\n##椋\n##植\n##椎\n##椒\n##検\n##椪\n##椭\n##椰\n##椹\n##椽\n##椿\n##楂\n##楊\n##楓\n##楔\n##楚\n##楝\n##楞\n##楠\n##楣\n##楨\n##楫\n##業\n##楮\n##極\n##楷\n##楸\n##楹\n##楼\n##楽\n##概\n##榄\n##榆\n##榈\n##榉\n##榔\n##榕\n##榖\n##榛\n##榜\n##榨\n##榫\n##榭\n##榮\n##榱\n##榴\n##榷\n##榻\n##槁\n##槃\n##構\n##槌\n##槍\n##槎\n##槐\n##槓\n##様\n##槛\n##槟\n##槤\n##槭\n##槲\n##槳\n##槻\n##槽\n##槿\n##樁\n##樂\n##樊\n##樑\n##樓\n##標\n##樞\n##樟\n##模\n##樣\n##権\n##横\n##樫\n##樯\n##樱\n##樵\n##樸\n##樹\n##樺\n##樽\n##樾\n##橄\n##橇\n##橋\n##橐\n##橘\n##橙\n##機\n##橡\n##橢\n##橫\n##橱\n##橹\n##橼\n##檀\n##檄\n##檎\n##檐\n##檔\n##檗\n##檜\n##檢\n##檬\n##檯\n##檳\n##檸\n##檻\n##櫃\n##櫚\n##櫛\n##櫥\n##櫸\n##櫻\n##欄\n##權\n##欒\n##欖\n##欠\n##次\n##欢\n##欣\n##欧\n##欲\n##欸\n##欺\n##欽\n##款\n##歆\n##歇\n##歉\n##歌\n##歎\n##歐\n##歓\n##歙\n##歛\n##歡\n##止\n##正\n##此\n##步\n##武\n##歧\n##歩\n##歪\n##歯\n##歲\n##歳\n##歴\n##歷\n##歸\n##歹\n##死\n##歼\n##殁\n##殃\n##殆\n##殇\n##殉\n##殊\n##残\n##殒\n##殓\n##殖\n##殘\n##殞\n##殡\n##殤\n##殭\n##殯\n##殲\n##殴\n##段\n##殷\n##殺\n##殼\n##殿\n##毀\n##毁\n##毂\n##毅\n##毆\n##毋\n##母\n##毎\n##每\n##毒\n##毓\n##比\n##毕\n##毗\n##毘\n##毙\n##毛\n##毡\n##毫\n##毯\n##毽\n##氈\n##氏\n##氐\n##民\n##氓\n##气\n##氖\n##気\n##氙\n##氛\n##氟\n##氡\n##氢\n##氣\n##氤\n##氦\n##氧\n##氨\n##氪\n##氫\n##氮\n##氯\n##氰\n##氲\n##水\n##氷\n##永\n##氹\n##氾\n##汀\n##汁\n##求\n##汆\n##汇\n##汉\n##汎\n##汐\n##汕\n##汗\n##汙\n##汛\n##汝\n##汞\n##江\n##池\n##污\n##汤\n##汨\n##汩\n##汪\n##汰\n##汲\n##汴\n##汶\n##汹\n##決\n##汽\n##汾\n##沁\n##沂\n##沃\n##沅\n##沈\n##沉\n##沌\n##沏\n##沐\n##沒\n##沓\n##沖\n##沙\n##沛\n##沟\n##没\n##沢\n##沣\n##沥\n##沦\n##沧\n##沪\n##沫\n##沭\n##沮\n##沱\n##河\n##沸\n##油\n##治\n##沼\n##沽\n##沾\n##沿\n##況\n##泄\n##泉\n##泊\n##泌\n##泓\n##法\n##泗\n##泛\n##泞\n##泠\n##泡\n##波\n##泣\n##泥\n##注\n##泪\n##泫\n##泮\n##泯\n##泰\n##泱\n##泳\n##泵\n##泷\n##泸\n##泻\n##泼\n##泽\n##泾\n##洁\n##洄\n##洋\n##洒\n##洗\n##洙\n##洛\n##洞\n##津\n##洩\n##洪\n##洮\n##洱\n##洲\n##洵\n##洶\n##洸\n##洹\n##活\n##洼\n##洽\n##派\n##流\n##浃\n##浄\n##浅\n##浆\n##浇\n##浊\n##测\n##济\n##浏\n##浑\n##浒\n##浓\n##浔\n##浙\n##浚\n##浜\n##浣\n##浦\n##浩\n##浪\n##浬\n##浮\n##浯\n##浴\n##海\n##浸\n##涂\n##涅\n##涇\n##消\n##涉\n##涌\n##涎\n##涓\n##涔\n##涕\n##涙\n##涛\n##涝\n##涞\n##
涟\n##涠\n##涡\n##涣\n##涤\n##润\n##涧\n##涨\n##涩\n##涪\n##涮\n##涯\n##液\n##涵\n##涸\n##涼\n##涿\n##淀\n##淄\n##淅\n##淆\n##淇\n##淋\n##淌\n##淑\n##淒\n##淖\n##淘\n##淙\n##淚\n##淞\n##淡\n##淤\n##淦\n##淨\n##淩\n##淪\n##淫\n##淬\n##淮\n##深\n##淳\n##淵\n##混\n##淹\n##淺\n##添\n##淼\n##清\n##済\n##渉\n##渊\n##渋\n##渍\n##渎\n##渐\n##渔\n##渗\n##渙\n##渚\n##減\n##渝\n##渠\n##渡\n##渣\n##渤\n##渥\n##渦\n##温\n##測\n##渭\n##港\n##渲\n##渴\n##游\n##渺\n##渾\n##湃\n##湄\n##湊\n##湍\n##湖\n##湘\n##湛\n##湟\n##湧\n##湫\n##湮\n##湯\n##湳\n##湾\n##湿\n##満\n##溃\n##溅\n##溉\n##溏\n##源\n##準\n##溜\n##溝\n##溟\n##溢\n##溥\n##溧\n##溪\n##溫\n##溯\n##溱\n##溴\n##溶\n##溺\n##溼\n##滁\n##滂\n##滄\n##滅\n##滇\n##滋\n##滌\n##滑\n##滓\n##滔\n##滕\n##滙\n##滚\n##滝\n##滞\n##滟\n##满\n##滢\n##滤\n##滥\n##滦\n##滨\n##滩\n##滬\n##滯\n##滲\n##滴\n##滷\n##滸\n##滾\n##滿\n##漁\n##漂\n##漆\n##漉\n##漏\n##漓\n##演\n##漕\n##漠\n##漢\n##漣\n##漩\n##漪\n##漫\n##漬\n##漯\n##漱\n##漲\n##漳\n##漸\n##漾\n##漿\n##潆\n##潇\n##潋\n##潍\n##潑\n##潔\n##潘\n##潛\n##潜\n##潞\n##潟\n##潢\n##潤\n##潦\n##潧\n##潭\n##潮\n##潰\n##潴\n##潸\n##潺\n##潼\n##澀\n##澄\n##澆\n##澈\n##澍\n##澎\n##澗\n##澜\n##澡\n##澤\n##澧\n##澱\n##澳\n##澹\n##激\n##濁\n##濂\n##濃\n##濑\n##濒\n##濕\n##濘\n##濛\n##濟\n##濠\n##濡\n##濤\n##濫\n##濬\n##濮\n##濯\n##濱\n##濺\n##濾\n##瀅\n##瀆\n##瀉\n##瀋\n##瀏\n##瀑\n##瀕\n##瀘\n##瀚\n##瀛\n##瀝\n##瀞\n##瀟\n##瀧\n##瀨\n##瀬\n##瀰\n##瀾\n##灌\n##灏\n##灑\n##灘\n##灝\n##灞\n##灣\n##火\n##灬\n##灭\n##灯\n##灰\n##灵\n##灶\n##灸\n##灼\n##災\n##灾\n##灿\n##炀\n##炁\n##炅\n##炉\n##炊\n##炎\n##炒\n##炔\n##炕\n##炖\n##炙\n##炜\n##炫\n##炬\n##炭\n##炮\n##炯\n##炳\n##炷\n##炸\n##点\n##為\n##炼\n##炽\n##烁\n##烂\n##烃\n##烈\n##烊\n##烏\n##烘\n##烙\n##烛\n##烟\n##烤\n##烦\n##烧\n##烨\n##烩\n##烫\n##烬\n##热\n##烯\n##烷\n##烹\n##烽\n##焉\n##焊\n##焕\n##焖\n##焗\n##焘\n##焙\n##焚\n##焜\n##無\n##焦\n##焯\n##焰\n##焱\n##然\n##焼\n##煅\n##煉\n##煊\n##煌\n##煎\n##煒\n##煖\n##煙\n##煜\n##煞\n##煤\n##煥\n##煦\n##照\n##煨\n##煩\n##煮\n##煲\n##煸\n##煽\n##熄\n##熊\n##熏\n##熒\n##熔\n##熙\n##熟\n##熠\n##熨\n##熬\n##熱\n##熵\n##熹\n##熾\n##燁\n##燃\n##燄\n##燈\n##燉\n##燊\n##燎\n##燒\n##燔\n##燕\n##燙\n##燜\n##營\n##燥\n##燦\n##燧\n##燭\n##燮\n##燴\n##燻\n##燼\n##燿\n##爆\n##爍\n##爐\n##爛\n##爪\n##爬\n##爭\n##爰\n##爱\n##爲\n##爵\n##父\n##爷\n##爸\n##爹\n##爺\n##爻\n##爽\n##爾\n##牆\n##片\n##版\n##牌\n##
牍\n##牒\n##牙\n##牛\n##牝\n##牟\n##牠\n##牡\n##牢\n##牦\n##牧\n##物\n##牯\n##牲\n##牴\n##牵\n##特\n##牺\n##牽\n##犀\n##犁\n##犄\n##犊\n##犍\n##犒\n##犢\n##犧\n##犬\n##犯\n##状\n##犷\n##犸\n##犹\n##狀\n##狂\n##狄\n##狈\n##狎\n##狐\n##狒\n##狗\n##狙\n##狞\n##狠\n##狡\n##狩\n##独\n##狭\n##狮\n##狰\n##狱\n##狸\n##狹\n##狼\n##狽\n##猎\n##猕\n##猖\n##猗\n##猙\n##猛\n##猜\n##猝\n##猥\n##猩\n##猪\n##猫\n##猬\n##献\n##猴\n##猶\n##猷\n##猾\n##猿\n##獄\n##獅\n##獎\n##獐\n##獒\n##獗\n##獠\n##獣\n##獨\n##獭\n##獰\n##獲\n##獵\n##獷\n##獸\n##獺\n##獻\n##獼\n##獾\n##玄\n##率\n##玉\n##王\n##玑\n##玖\n##玛\n##玟\n##玠\n##玥\n##玩\n##玫\n##玮\n##环\n##现\n##玲\n##玳\n##玷\n##玺\n##玻\n##珀\n##珂\n##珅\n##珈\n##珉\n##珊\n##珍\n##珏\n##珐\n##珑\n##珙\n##珞\n##珠\n##珣\n##珥\n##珩\n##珪\n##班\n##珮\n##珲\n##珺\n##現\n##球\n##琅\n##理\n##琇\n##琉\n##琊\n##琍\n##琏\n##琐\n##琛\n##琢\n##琥\n##琦\n##琨\n##琪\n##琬\n##琮\n##琰\n##琲\n##琳\n##琴\n##琵\n##琶\n##琺\n##琼\n##瑀\n##瑁\n##瑄\n##瑋\n##瑕\n##瑗\n##瑙\n##瑚\n##瑛\n##瑜\n##瑞\n##瑟\n##瑠\n##瑣\n##瑤\n##瑩\n##瑪\n##瑯\n##瑰\n##瑶\n##瑾\n##璀\n##璁\n##璃\n##璇\n##璉\n##璋\n##璎\n##璐\n##璜\n##璞\n##璟\n##璧\n##璨\n##環\n##璽\n##璿\n##瓊\n##瓏\n##瓒\n##瓜\n##瓢\n##瓣\n##瓤\n##瓦\n##瓮\n##瓯\n##瓴\n##瓶\n##瓷\n##甄\n##甌\n##甕\n##甘\n##甙\n##甚\n##甜\n##生\n##產\n##産\n##甥\n##甦\n##用\n##甩\n##甫\n##甬\n##甭\n##甯\n##田\n##由\n##甲\n##申\n##电\n##男\n##甸\n##町\n##画\n##甾\n##畀\n##畅\n##界\n##畏\n##畑\n##畔\n##留\n##畜\n##畝\n##畢\n##略\n##畦\n##番\n##畫\n##異\n##畲\n##畳\n##畴\n##當\n##畸\n##畹\n##畿\n##疆\n##疇\n##疊\n##疏\n##疑\n##疔\n##疖\n##疗\n##疙\n##疚\n##疝\n##疟\n##疡\n##疣\n##疤\n##疥\n##疫\n##疮\n##疯\n##疱\n##疲\n##疳\n##疵\n##疸\n##疹\n##疼\n##疽\n##疾\n##痂\n##病\n##症\n##痈\n##痉\n##痊\n##痍\n##痒\n##痔\n##痕\n##痘\n##痙\n##痛\n##痞\n##痠\n##痢\n##痣\n##痤\n##痧\n##痨\n##痪\n##痫\n##痰\n##痱\n##痴\n##痹\n##痺\n##痼\n##痿\n##瘀\n##瘁\n##瘋\n##瘍\n##瘓\n##瘘\n##瘙\n##瘟\n##瘠\n##瘡\n##瘢\n##瘤\n##瘦\n##瘧\n##瘩\n##瘪\n##瘫\n##瘴\n##瘸\n##瘾\n##療\n##癇\n##癌\n##癒\n##癖\n##癜\n##癞\n##癡\n##癢\n##癣\n##癥\n##癫\n##癬\n##癮\n##癱\n##癲\n##癸\n##発\n##登\n##發\n##白\n##百\n##皂\n##的\n##皆\n##皇\n##皈\n##皋\n##皎\n##皑\n##皓\n##皖\n##皙\n##皚\n##皮\n##皰\n##皱\n##皴\n##皺\n##皿\n##盂\n##盃\n##盅\n##盆\n##盈\n##益\n##盎\n##盏\n##盐\n##监\n##盒\n##盔\n##盖\n##盗\n##盘\n##盛\n##盜\n##盞\n##盟\n##盡\n##監\n##盤\n##盥\n##
盧\n##盪\n##目\n##盯\n##盱\n##盲\n##直\n##相\n##盹\n##盼\n##盾\n##省\n##眈\n##眉\n##看\n##県\n##眙\n##眞\n##真\n##眠\n##眦\n##眨\n##眩\n##眯\n##眶\n##眷\n##眸\n##眺\n##眼\n##眾\n##着\n##睁\n##睇\n##睏\n##睐\n##睑\n##睛\n##睜\n##睞\n##睡\n##睢\n##督\n##睥\n##睦\n##睨\n##睪\n##睫\n##睬\n##睹\n##睽\n##睾\n##睿\n##瞄\n##瞅\n##瞇\n##瞋\n##瞌\n##瞎\n##瞑\n##瞒\n##瞓\n##瞞\n##瞟\n##瞠\n##瞥\n##瞧\n##瞩\n##瞪\n##瞬\n##瞭\n##瞰\n##瞳\n##瞻\n##瞼\n##瞿\n##矇\n##矍\n##矗\n##矚\n##矛\n##矜\n##矢\n##矣\n##知\n##矩\n##矫\n##短\n##矮\n##矯\n##石\n##矶\n##矽\n##矾\n##矿\n##码\n##砂\n##砌\n##砍\n##砒\n##研\n##砖\n##砗\n##砚\n##砝\n##砣\n##砥\n##砧\n##砭\n##砰\n##砲\n##破\n##砷\n##砸\n##砺\n##砼\n##砾\n##础\n##硅\n##硐\n##硒\n##硕\n##硝\n##硫\n##硬\n##确\n##硯\n##硼\n##碁\n##碇\n##碉\n##碌\n##碍\n##碎\n##碑\n##碓\n##碗\n##碘\n##碚\n##碛\n##碟\n##碣\n##碧\n##碩\n##碰\n##碱\n##碳\n##碴\n##確\n##碼\n##碾\n##磁\n##磅\n##磊\n##磋\n##磐\n##磕\n##磚\n##磡\n##磨\n##磬\n##磯\n##磲\n##磷\n##磺\n##礁\n##礎\n##礙\n##礡\n##礦\n##礪\n##礫\n##礴\n##示\n##礼\n##社\n##祀\n##祁\n##祂\n##祇\n##祈\n##祉\n##祎\n##祐\n##祕\n##祖\n##祗\n##祚\n##祛\n##祜\n##祝\n##神\n##祟\n##祠\n##祢\n##祥\n##票\n##祭\n##祯\n##祷\n##祸\n##祺\n##祿\n##禀\n##禁\n##禄\n##禅\n##禍\n##禎\n##福\n##禛\n##禦\n##禧\n##禪\n##禮\n##禱\n##禹\n##禺\n##离\n##禽\n##禾\n##禿\n##秀\n##私\n##秃\n##秆\n##秉\n##秋\n##种\n##科\n##秒\n##秘\n##租\n##秣\n##秤\n##秦\n##秧\n##秩\n##秭\n##积\n##称\n##秸\n##移\n##秽\n##稀\n##稅\n##程\n##稍\n##税\n##稔\n##稗\n##稚\n##稜\n##稞\n##稟\n##稠\n##稣\n##種\n##稱\n##稲\n##稳\n##稷\n##稹\n##稻\n##稼\n##稽\n##稿\n##穀\n##穂\n##穆\n##穌\n##積\n##穎\n##穗\n##穢\n##穩\n##穫\n##穴\n##究\n##穷\n##穹\n##空\n##穿\n##突\n##窃\n##窄\n##窈\n##窍\n##窑\n##窒\n##窓\n##窕\n##窖\n##窗\n##窘\n##窜\n##窝\n##窟\n##窠\n##窥\n##窦\n##窨\n##窩\n##窪\n##窮\n##窯\n##窺\n##窿\n##竄\n##竅\n##竇\n##竊\n##立\n##竖\n##站\n##竜\n##竞\n##竟\n##章\n##竣\n##童\n##竭\n##端\n##競\n##竹\n##竺\n##竽\n##竿\n##笃\n##笆\n##笈\n##笋\n##笏\n##笑\n##笔\n##笙\n##笛\n##笞\n##笠\n##符\n##笨\n##第\n##笹\n##笺\n##笼\n##筆\n##等\n##筊\n##筋\n##筍\n##筏\n##筐\n##筑\n##筒\n##答\n##策\n##筛\n##筝\n##筠\n##筱\n##筲\n##筵\n##筷\n##筹\n##签\n##简\n##箇\n##箋\n##箍\n##箏\n##箐\n##箔\n##箕\n##算\n##箝\n##管\n##箩\n##箫\n##箭\n##箱\n##箴\n##箸\n##節\n##篁\n##範\n##篆\n##篇\n##築\n##篑\n##篓\n##篙\n##篝\n##篠\n##篡\n##篤\n##篩\n##篪\n##篮\n##篱\n##篷\n##簇\n##
簌\n##簍\n##簡\n##簦\n##簧\n##簪\n##簫\n##簷\n##簸\n##簽\n##簾\n##簿\n##籁\n##籃\n##籌\n##籍\n##籐\n##籟\n##籠\n##籤\n##籬\n##籮\n##籲\n##米\n##类\n##籼\n##籽\n##粄\n##粉\n##粑\n##粒\n##粕\n##粗\n##粘\n##粟\n##粤\n##粥\n##粧\n##粪\n##粮\n##粱\n##粲\n##粳\n##粵\n##粹\n##粼\n##粽\n##精\n##粿\n##糅\n##糊\n##糍\n##糕\n##糖\n##糗\n##糙\n##糜\n##糞\n##糟\n##糠\n##糧\n##糬\n##糯\n##糰\n##糸\n##系\n##糾\n##紀\n##紂\n##約\n##紅\n##紉\n##紊\n##紋\n##納\n##紐\n##紓\n##純\n##紗\n##紘\n##紙\n##級\n##紛\n##紜\n##素\n##紡\n##索\n##紧\n##紫\n##紮\n##累\n##細\n##紳\n##紹\n##紺\n##終\n##絃\n##組\n##絆\n##経\n##結\n##絕\n##絞\n##絡\n##絢\n##給\n##絨\n##絮\n##統\n##絲\n##絳\n##絵\n##絶\n##絹\n##綁\n##綏\n##綑\n##經\n##継\n##続\n##綜\n##綠\n##綢\n##綦\n##綫\n##綬\n##維\n##綱\n##網\n##綴\n##綵\n##綸\n##綺\n##綻\n##綽\n##綾\n##綿\n##緊\n##緋\n##総\n##緑\n##緒\n##緘\n##線\n##緝\n##緞\n##締\n##緣\n##編\n##緩\n##緬\n##緯\n##練\n##緹\n##緻\n##縁\n##縄\n##縈\n##縛\n##縝\n##縣\n##縫\n##縮\n##縱\n##縴\n##縷\n##總\n##績\n##繁\n##繃\n##繆\n##繇\n##繋\n##織\n##繕\n##繚\n##繞\n##繡\n##繩\n##繪\n##繫\n##繭\n##繳\n##繹\n##繼\n##繽\n##纂\n##續\n##纍\n##纏\n##纓\n##纔\n##纖\n##纜\n##纠\n##红\n##纣\n##纤\n##约\n##级\n##纨\n##纪\n##纫\n##纬\n##纭\n##纯\n##纰\n##纱\n##纲\n##纳\n##纵\n##纶\n##纷\n##纸\n##纹\n##纺\n##纽\n##纾\n##线\n##绀\n##练\n##组\n##绅\n##细\n##织\n##终\n##绊\n##绍\n##绎\n##经\n##绑\n##绒\n##结\n##绔\n##绕\n##绘\n##给\n##绚\n##绛\n##络\n##绝\n##绞\n##统\n##绡\n##绢\n##绣\n##绥\n##绦\n##继\n##绩\n##绪\n##绫\n##续\n##绮\n##绯\n##绰\n##绳\n##维\n##绵\n##绶\n##绷\n##绸\n##绻\n##综\n##绽\n##绾\n##绿\n##缀\n##缄\n##缅\n##缆\n##缇\n##缈\n##缉\n##缎\n##缓\n##缔\n##缕\n##编\n##缘\n##缙\n##缚\n##缜\n##缝\n##缠\n##缢\n##缤\n##缥\n##缨\n##缩\n##缪\n##缭\n##缮\n##缰\n##缱\n##缴\n##缸\n##缺\n##缽\n##罂\n##罄\n##罌\n##罐\n##网\n##罔\n##罕\n##罗\n##罚\n##罡\n##罢\n##罩\n##罪\n##置\n##罰\n##署\n##罵\n##罷\n##罹\n##羁\n##羅\n##羈\n##羊\n##羌\n##美\n##羔\n##羚\n##羞\n##羟\n##羡\n##羣\n##群\n##羥\n##羧\n##羨\n##義\n##羯\n##羲\n##羸\n##羹\n##羽\n##羿\n##翁\n##翅\n##翊\n##翌\n##翎\n##習\n##翔\n##翘\n##翟\n##翠\n##翡\n##翦\n##翩\n##翰\n##翱\n##翳\n##翹\n##翻\n##翼\n##耀\n##老\n##考\n##耄\n##者\n##耆\n##耋\n##而\n##耍\n##耐\n##耒\n##耕\n##耗\n##耘\n##耙\n##耦\n##耨\n##耳\n##耶\n##耷\n##耸\n##耻\n##耽\n##耿\n##聂\n##聆\n##聊\n##聋\n##职\n##聒\n##联\n##聖\n##聘\n##聚\n##聞\n##聪\n##聯\n##聰\n##聲\n##聳\n##
聴\n##聶\n##職\n##聽\n##聾\n##聿\n##肃\n##肄\n##肅\n##肆\n##肇\n##肉\n##肋\n##肌\n##肏\n##肓\n##肖\n##肘\n##肚\n##肛\n##肝\n##肠\n##股\n##肢\n##肤\n##肥\n##肩\n##肪\n##肮\n##肯\n##肱\n##育\n##肴\n##肺\n##肽\n##肾\n##肿\n##胀\n##胁\n##胃\n##胄\n##胆\n##背\n##胍\n##胎\n##胖\n##胚\n##胛\n##胜\n##胝\n##胞\n##胡\n##胤\n##胥\n##胧\n##胫\n##胭\n##胯\n##胰\n##胱\n##胳\n##胴\n##胶\n##胸\n##胺\n##能\n##脂\n##脅\n##脆\n##脇\n##脈\n##脉\n##脊\n##脍\n##脏\n##脐\n##脑\n##脓\n##脖\n##脘\n##脚\n##脛\n##脣\n##脩\n##脫\n##脯\n##脱\n##脲\n##脳\n##脸\n##脹\n##脾\n##腆\n##腈\n##腊\n##腋\n##腌\n##腎\n##腐\n##腑\n##腓\n##腔\n##腕\n##腥\n##腦\n##腩\n##腫\n##腭\n##腮\n##腰\n##腱\n##腳\n##腴\n##腸\n##腹\n##腺\n##腻\n##腼\n##腾\n##腿\n##膀\n##膈\n##膊\n##膏\n##膑\n##膘\n##膚\n##膛\n##膜\n##膝\n##膠\n##膦\n##膨\n##膩\n##膳\n##膺\n##膻\n##膽\n##膾\n##膿\n##臀\n##臂\n##臃\n##臆\n##臉\n##臊\n##臍\n##臓\n##臘\n##臟\n##臣\n##臥\n##臧\n##臨\n##自\n##臬\n##臭\n##至\n##致\n##臺\n##臻\n##臼\n##臾\n##舀\n##舂\n##舅\n##舆\n##與\n##興\n##舉\n##舊\n##舌\n##舍\n##舎\n##舐\n##舒\n##舔\n##舖\n##舗\n##舛\n##舜\n##舞\n##舟\n##航\n##舫\n##般\n##舰\n##舱\n##舵\n##舶\n##舷\n##舸\n##船\n##舺\n##舾\n##艇\n##艋\n##艘\n##艙\n##艦\n##艮\n##良\n##艰\n##艱\n##色\n##艳\n##艷\n##艹\n##艺\n##艾\n##节\n##芃\n##芈\n##芊\n##芋\n##芍\n##芎\n##芒\n##芙\n##芜\n##芝\n##芡\n##芥\n##芦\n##芩\n##芪\n##芫\n##芬\n##芭\n##芮\n##芯\n##花\n##芳\n##芷\n##芸\n##芹\n##芻\n##芽\n##芾\n##苁\n##苄\n##苇\n##苋\n##苍\n##苏\n##苑\n##苒\n##苓\n##苔\n##苕\n##苗\n##苛\n##苜\n##苞\n##苟\n##苡\n##苣\n##若\n##苦\n##苫\n##苯\n##英\n##苷\n##苹\n##苻\n##茁\n##茂\n##范\n##茄\n##茅\n##茉\n##茎\n##茏\n##茗\n##茜\n##茧\n##茨\n##茫\n##茬\n##茭\n##茯\n##茱\n##茲\n##茴\n##茵\n##茶\n##茸\n##茹\n##茼\n##荀\n##荃\n##荆\n##草\n##荊\n##荏\n##荐\n##荒\n##荔\n##荖\n##荘\n##荚\n##荞\n##荟\n##荠\n##荡\n##荣\n##荤\n##荥\n##荧\n##荨\n##荪\n##荫\n##药\n##荳\n##荷\n##荸\n##荻\n##荼\n##荽\n##莅\n##莆\n##莉\n##莊\n##莎\n##莒\n##莓\n##莖\n##莘\n##莞\n##莠\n##莢\n##莧\n##莪\n##莫\n##莱\n##莲\n##莴\n##获\n##莹\n##莺\n##莽\n##莿\n##菀\n##菁\n##菅\n##菇\n##菈\n##菊\n##菌\n##菏\n##菓\n##菖\n##菘\n##菜\n##菟\n##菠\n##菡\n##菩\n##華\n##菱\n##菲\n##菸\n##菽\n##萁\n##萃\n##萄\n##萊\n##萋\n##萌\n##萍\n##萎\n##萘\n##萝\n##萤\n##营\n##萦\n##萧\n##萨\n##萩\n##萬\n##萱\n##萵\n##萸\n##萼\n##落\n##葆\n##葉\n##著\n##葚\n##葛\n##葡\n##董\n##葦\n##葩\n##葫\n##葬\n##葭\n##葯\n##葱\n##葳\n##
葵\n##葷\n##葺\n##蒂\n##蒋\n##蒐\n##蒔\n##蒙\n##蒜\n##蒞\n##蒟\n##蒡\n##蒨\n##蒲\n##蒸\n##蒹\n##蒻\n##蒼\n##蒿\n##蓁\n##蓄\n##蓆\n##蓉\n##蓋\n##蓑\n##蓓\n##蓖\n##蓝\n##蓟\n##蓦\n##蓬\n##蓮\n##蓼\n##蓿\n##蔑\n##蔓\n##蔔\n##蔗\n##蔘\n##蔚\n##蔡\n##蔣\n##蔥\n##蔫\n##蔬\n##蔭\n##蔵\n##蔷\n##蔺\n##蔻\n##蔼\n##蔽\n##蕁\n##蕃\n##蕈\n##蕉\n##蕊\n##蕎\n##蕙\n##蕤\n##蕨\n##蕩\n##蕪\n##蕭\n##蕲\n##蕴\n##蕻\n##蕾\n##薄\n##薅\n##薇\n##薈\n##薊\n##薏\n##薑\n##薔\n##薙\n##薛\n##薦\n##薨\n##薩\n##薪\n##薬\n##薯\n##薰\n##薹\n##藉\n##藍\n##藏\n##藐\n##藓\n##藕\n##藜\n##藝\n##藤\n##藥\n##藩\n##藹\n##藻\n##藿\n##蘆\n##蘇\n##蘊\n##蘋\n##蘑\n##蘚\n##蘭\n##蘸\n##蘼\n##蘿\n##虎\n##虏\n##虐\n##虑\n##虔\n##處\n##虚\n##虛\n##虜\n##虞\n##號\n##虢\n##虧\n##虫\n##虬\n##虱\n##虹\n##虻\n##虽\n##虾\n##蚀\n##蚁\n##蚂\n##蚊\n##蚌\n##蚓\n##蚕\n##蚜\n##蚝\n##蚣\n##蚤\n##蚩\n##蚪\n##蚯\n##蚱\n##蚵\n##蛀\n##蛆\n##蛇\n##蛊\n##蛋\n##蛎\n##蛐\n##蛔\n##蛙\n##蛛\n##蛟\n##蛤\n##蛭\n##蛮\n##蛰\n##蛳\n##蛹\n##蛻\n##蛾\n##蜀\n##蜂\n##蜃\n##蜆\n##蜇\n##蜈\n##蜊\n##蜍\n##蜒\n##蜓\n##蜕\n##蜗\n##蜘\n##蜚\n##蜜\n##蜡\n##蜢\n##蜥\n##蜱\n##蜴\n##蜷\n##蜻\n##蜿\n##蝇\n##蝈\n##蝉\n##蝌\n##蝎\n##蝕\n##蝗\n##蝙\n##蝟\n##蝠\n##蝦\n##蝨\n##蝴\n##蝶\n##蝸\n##蝼\n##螂\n##螃\n##融\n##螞\n##螢\n##螨\n##螯\n##螳\n##螺\n##蟀\n##蟄\n##蟆\n##蟋\n##蟎\n##蟑\n##蟒\n##蟠\n##蟬\n##蟲\n##蟹\n##蟻\n##蟾\n##蠅\n##蠍\n##蠔\n##蠕\n##蠛\n##蠟\n##蠡\n##蠢\n##蠣\n##蠱\n##蠶\n##蠹\n##蠻\n##血\n##衄\n##衅\n##衆\n##行\n##衍\n##術\n##衔\n##街\n##衙\n##衛\n##衝\n##衞\n##衡\n##衢\n##衣\n##补\n##表\n##衩\n##衫\n##衬\n##衮\n##衰\n##衲\n##衷\n##衹\n##衾\n##衿\n##袁\n##袂\n##袄\n##袅\n##袈\n##袋\n##袍\n##袒\n##袖\n##袜\n##袞\n##袤\n##袪\n##被\n##袭\n##袱\n##裁\n##裂\n##装\n##裆\n##裊\n##裏\n##裔\n##裕\n##裘\n##裙\n##補\n##裝\n##裟\n##裡\n##裤\n##裨\n##裱\n##裳\n##裴\n##裸\n##裹\n##製\n##裾\n##褂\n##複\n##褐\n##褒\n##褓\n##褔\n##褚\n##褥\n##褪\n##褫\n##褲\n##褶\n##褻\n##襁\n##襄\n##襟\n##襠\n##襪\n##襬\n##襯\n##襲\n##西\n##要\n##覃\n##覆\n##覇\n##見\n##規\n##覓\n##視\n##覚\n##覦\n##覧\n##親\n##覬\n##観\n##覷\n##覺\n##覽\n##觀\n##见\n##观\n##规\n##觅\n##视\n##览\n##觉\n##觊\n##觎\n##觐\n##觑\n##角\n##觞\n##解\n##觥\n##触\n##觸\n##言\n##訂\n##計\n##訊\n##討\n##訓\n##訕\n##訖\n##託\n##記\n##訛\n##訝\n##訟\n##訣\n##訥\n##訪\n##設\n##許\n##訳\n##訴\n##訶\n##診\n##註\n##証\n##詆\n##詐\n##詔\n##評\n##詛\n##詞\n##詠\n##詡\n##詢\n##詣\n##試\n##詩\n##詫\n##
詬\n##詭\n##詮\n##詰\n##話\n##該\n##詳\n##詹\n##詼\n##誅\n##誇\n##誉\n##誌\n##認\n##誓\n##誕\n##誘\n##語\n##誠\n##誡\n##誣\n##誤\n##誥\n##誦\n##誨\n##說\n##説\n##読\n##誰\n##課\n##誹\n##誼\n##調\n##諄\n##談\n##請\n##諏\n##諒\n##論\n##諗\n##諜\n##諡\n##諦\n##諧\n##諫\n##諭\n##諮\n##諱\n##諳\n##諷\n##諸\n##諺\n##諾\n##謀\n##謁\n##謂\n##謄\n##謊\n##謎\n##謐\n##謔\n##謗\n##謙\n##講\n##謝\n##謠\n##謨\n##謬\n##謹\n##謾\n##譁\n##證\n##譎\n##譏\n##識\n##譙\n##譚\n##譜\n##警\n##譬\n##譯\n##議\n##譲\n##譴\n##護\n##譽\n##讀\n##變\n##讓\n##讚\n##讞\n##计\n##订\n##认\n##讥\n##讧\n##讨\n##让\n##讪\n##讫\n##训\n##议\n##讯\n##记\n##讲\n##讳\n##讴\n##讶\n##讷\n##许\n##讹\n##论\n##讼\n##讽\n##设\n##访\n##诀\n##证\n##诃\n##评\n##诅\n##识\n##诈\n##诉\n##诊\n##诋\n##词\n##诏\n##译\n##试\n##诗\n##诘\n##诙\n##诚\n##诛\n##话\n##诞\n##诟\n##诠\n##诡\n##询\n##诣\n##诤\n##该\n##详\n##诧\n##诩\n##诫\n##诬\n##语\n##误\n##诰\n##诱\n##诲\n##说\n##诵\n##诶\n##请\n##诸\n##诺\n##读\n##诽\n##课\n##诿\n##谀\n##谁\n##调\n##谄\n##谅\n##谆\n##谈\n##谊\n##谋\n##谌\n##谍\n##谎\n##谏\n##谐\n##谑\n##谒\n##谓\n##谔\n##谕\n##谗\n##谘\n##谙\n##谚\n##谛\n##谜\n##谟\n##谢\n##谣\n##谤\n##谥\n##谦\n##谧\n##谨\n##谩\n##谪\n##谬\n##谭\n##谯\n##谱\n##谲\n##谴\n##谶\n##谷\n##豁\n##豆\n##豇\n##豈\n##豉\n##豊\n##豌\n##豎\n##豐\n##豔\n##豚\n##象\n##豢\n##豪\n##豫\n##豬\n##豹\n##豺\n##貂\n##貅\n##貌\n##貓\n##貔\n##貘\n##貝\n##貞\n##負\n##財\n##貢\n##貧\n##貨\n##販\n##貪\n##貫\n##責\n##貯\n##貰\n##貳\n##貴\n##貶\n##買\n##貸\n##費\n##貼\n##貽\n##貿\n##賀\n##賁\n##賂\n##賃\n##賄\n##資\n##賈\n##賊\n##賑\n##賓\n##賜\n##賞\n##賠\n##賡\n##賢\n##賣\n##賤\n##賦\n##質\n##賬\n##賭\n##賴\n##賺\n##購\n##賽\n##贅\n##贈\n##贊\n##贍\n##贏\n##贓\n##贖\n##贛\n##贝\n##贞\n##负\n##贡\n##财\n##责\n##贤\n##败\n##账\n##货\n##质\n##贩\n##贪\n##贫\n##贬\n##购\n##贮\n##贯\n##贰\n##贱\n##贲\n##贴\n##贵\n##贷\n##贸\n##费\n##贺\n##贻\n##贼\n##贾\n##贿\n##赁\n##赂\n##赃\n##资\n##赅\n##赈\n##赊\n##赋\n##赌\n##赎\n##赏\n##赐\n##赓\n##赔\n##赖\n##赘\n##赚\n##赛\n##赝\n##赞\n##赠\n##赡\n##赢\n##赣\n##赤\n##赦\n##赧\n##赫\n##赭\n##走\n##赳\n##赴\n##赵\n##赶\n##起\n##趁\n##超\n##越\n##趋\n##趕\n##趙\n##趟\n##趣\n##趨\n##足\n##趴\n##趵\n##趸\n##趺\n##趾\n##跃\n##跄\n##跆\n##跋\n##跌\n##跎\n##跑\n##跖\n##跚\n##跛\n##距\n##跟\n##跡\n##跤\n##跨\n##跩\n##跪\n##路\n##跳\n##践\n##跷\n##跹\n##跺\n##跻\n##踉\n##踊\n##踌\n##踏\n##踐\n##踝\n##踞\n##踟\n##踢\n##
踩\n##踪\n##踮\n##踱\n##踴\n##踵\n##踹\n##蹂\n##蹄\n##蹇\n##蹈\n##蹉\n##蹊\n##蹋\n##蹑\n##蹒\n##蹙\n##蹟\n##蹣\n##蹤\n##蹦\n##蹩\n##蹬\n##蹭\n##蹲\n##蹴\n##蹶\n##蹺\n##蹼\n##蹿\n##躁\n##躇\n##躉\n##躊\n##躋\n##躍\n##躏\n##躪\n##身\n##躬\n##躯\n##躲\n##躺\n##軀\n##車\n##軋\n##軌\n##軍\n##軒\n##軟\n##転\n##軸\n##軼\n##軽\n##軾\n##較\n##載\n##輒\n##輓\n##輔\n##輕\n##輛\n##輝\n##輟\n##輩\n##輪\n##輯\n##輸\n##輻\n##輾\n##輿\n##轄\n##轅\n##轆\n##轉\n##轍\n##轎\n##轟\n##车\n##轧\n##轨\n##轩\n##转\n##轭\n##轮\n##软\n##轰\n##轲\n##轴\n##轶\n##轻\n##轼\n##载\n##轿\n##较\n##辄\n##辅\n##辆\n##辇\n##辈\n##辉\n##辊\n##辍\n##辐\n##辑\n##输\n##辕\n##辖\n##辗\n##辘\n##辙\n##辛\n##辜\n##辞\n##辟\n##辣\n##辦\n##辨\n##辩\n##辫\n##辭\n##辮\n##辯\n##辰\n##辱\n##農\n##边\n##辺\n##辻\n##込\n##辽\n##达\n##迁\n##迂\n##迄\n##迅\n##过\n##迈\n##迎\n##运\n##近\n##返\n##还\n##这\n##进\n##远\n##违\n##连\n##迟\n##迢\n##迤\n##迥\n##迦\n##迩\n##迪\n##迫\n##迭\n##述\n##迴\n##迷\n##迸\n##迹\n##迺\n##追\n##退\n##送\n##适\n##逃\n##逅\n##逆\n##选\n##逊\n##逍\n##透\n##逐\n##递\n##途\n##逕\n##逗\n##這\n##通\n##逛\n##逝\n##逞\n##速\n##造\n##逢\n##連\n##逮\n##週\n##進\n##逵\n##逶\n##逸\n##逻\n##逼\n##逾\n##遁\n##遂\n##遅\n##遇\n##遊\n##運\n##遍\n##過\n##遏\n##遐\n##遑\n##遒\n##道\n##達\n##違\n##遗\n##遙\n##遛\n##遜\n##遞\n##遠\n##遢\n##遣\n##遥\n##遨\n##適\n##遭\n##遮\n##遲\n##遴\n##遵\n##遶\n##遷\n##選\n##遺\n##遼\n##遽\n##避\n##邀\n##邁\n##邂\n##邃\n##還\n##邇\n##邈\n##邊\n##邋\n##邏\n##邑\n##邓\n##邕\n##邛\n##邝\n##邢\n##那\n##邦\n##邨\n##邪\n##邬\n##邮\n##邯\n##邰\n##邱\n##邳\n##邵\n##邸\n##邹\n##邺\n##邻\n##郁\n##郅\n##郊\n##郎\n##郑\n##郜\n##郝\n##郡\n##郢\n##郤\n##郦\n##郧\n##部\n##郫\n##郭\n##郴\n##郵\n##郷\n##郸\n##都\n##鄂\n##鄉\n##鄒\n##鄔\n##鄙\n##鄞\n##鄢\n##鄧\n##鄭\n##鄰\n##鄱\n##鄲\n##鄺\n##酉\n##酊\n##酋\n##酌\n##配\n##酐\n##酒\n##酗\n##酚\n##酝\n##酢\n##酣\n##酥\n##酩\n##酪\n##酬\n##酮\n##酯\n##酰\n##酱\n##酵\n##酶\n##酷\n##酸\n##酿\n##醃\n##醇\n##醉\n##醋\n##醍\n##醐\n##醒\n##醚\n##醛\n##醜\n##醞\n##醣\n##醪\n##醫\n##醬\n##醮\n##醯\n##醴\n##醺\n##釀\n##釁\n##采\n##釉\n##释\n##釋\n##里\n##重\n##野\n##量\n##釐\n##金\n##釗\n##釘\n##釜\n##針\n##釣\n##釦\n##釧\n##釵\n##鈀\n##鈉\n##鈍\n##鈎\n##鈔\n##鈕\n##鈞\n##鈣\n##鈦\n##鈪\n##鈴\n##鈺\n##鈾\n##鉀\n##鉄\n##鉅\n##鉉\n##鉑\n##鉗\n##鉚\n##鉛\n##鉤\n##鉴\n##鉻\n##銀\n##銃\n##銅\n##銑\n##銓\n##銖\n##銘\n##銜\n##銬\n##銭\n##銮\n##銳\n##銷\n##
銹\n##鋁\n##鋅\n##鋒\n##鋤\n##鋪\n##鋰\n##鋸\n##鋼\n##錄\n##錐\n##錘\n##錚\n##錠\n##錢\n##錦\n##錨\n##錫\n##錮\n##錯\n##録\n##錳\n##錶\n##鍊\n##鍋\n##鍍\n##鍛\n##鍥\n##鍰\n##鍵\n##鍺\n##鍾\n##鎂\n##鎊\n##鎌\n##鎏\n##鎔\n##鎖\n##鎗\n##鎚\n##鎧\n##鎬\n##鎮\n##鎳\n##鏈\n##鏖\n##鏗\n##鏘\n##鏞\n##鏟\n##鏡\n##鏢\n##鏤\n##鏽\n##鐘\n##鐮\n##鐲\n##鐳\n##鐵\n##鐸\n##鐺\n##鑄\n##鑊\n##鑑\n##鑒\n##鑣\n##鑫\n##鑰\n##鑲\n##鑼\n##鑽\n##鑾\n##鑿\n##针\n##钉\n##钊\n##钎\n##钏\n##钒\n##钓\n##钗\n##钙\n##钛\n##钜\n##钝\n##钞\n##钟\n##钠\n##钡\n##钢\n##钣\n##钤\n##钥\n##钦\n##钧\n##钨\n##钩\n##钮\n##钯\n##钰\n##钱\n##钳\n##钴\n##钵\n##钺\n##钻\n##钼\n##钾\n##钿\n##铀\n##铁\n##铂\n##铃\n##铄\n##铅\n##铆\n##铉\n##铎\n##铐\n##铛\n##铜\n##铝\n##铠\n##铡\n##铢\n##铣\n##铤\n##铨\n##铩\n##铬\n##铭\n##铮\n##铰\n##铲\n##铵\n##银\n##铸\n##铺\n##链\n##铿\n##销\n##锁\n##锂\n##锄\n##锅\n##锆\n##锈\n##锉\n##锋\n##锌\n##锏\n##锐\n##锑\n##错\n##锚\n##锟\n##锡\n##锢\n##锣\n##锤\n##锥\n##锦\n##锭\n##键\n##锯\n##锰\n##锲\n##锵\n##锹\n##锺\n##锻\n##镀\n##镁\n##镂\n##镇\n##镉\n##镌\n##镍\n##镐\n##镑\n##镕\n##镖\n##镗\n##镛\n##镜\n##镣\n##镭\n##镯\n##镰\n##镳\n##镶\n##長\n##长\n##門\n##閃\n##閉\n##開\n##閎\n##閏\n##閑\n##閒\n##間\n##閔\n##閘\n##閡\n##関\n##閣\n##閥\n##閨\n##閩\n##閱\n##閲\n##閹\n##閻\n##閾\n##闆\n##闇\n##闊\n##闌\n##闍\n##闔\n##闕\n##闖\n##闘\n##關\n##闡\n##闢\n##门\n##闪\n##闫\n##闭\n##问\n##闯\n##闰\n##闲\n##间\n##闵\n##闷\n##闸\n##闹\n##闺\n##闻\n##闽\n##闾\n##阀\n##阁\n##阂\n##阅\n##阆\n##阇\n##阈\n##阉\n##阎\n##阐\n##阑\n##阔\n##阕\n##阖\n##阙\n##阚\n##阜\n##队\n##阡\n##阪\n##阮\n##阱\n##防\n##阳\n##阴\n##阵\n##阶\n##阻\n##阿\n##陀\n##陂\n##附\n##际\n##陆\n##陇\n##陈\n##陋\n##陌\n##降\n##限\n##陕\n##陛\n##陝\n##陞\n##陟\n##陡\n##院\n##陣\n##除\n##陨\n##险\n##陪\n##陰\n##陲\n##陳\n##陵\n##陶\n##陷\n##陸\n##険\n##陽\n##隅\n##隆\n##隈\n##隊\n##隋\n##隍\n##階\n##随\n##隐\n##隔\n##隕\n##隘\n##隙\n##際\n##障\n##隠\n##隣\n##隧\n##隨\n##險\n##隱\n##隴\n##隶\n##隸\n##隻\n##隼\n##隽\n##难\n##雀\n##雁\n##雄\n##雅\n##集\n##雇\n##雉\n##雋\n##雌\n##雍\n##雎\n##雏\n##雑\n##雒\n##雕\n##雖\n##雙\n##雛\n##雜\n##雞\n##離\n##難\n##雨\n##雪\n##雯\n##雰\n##雲\n##雳\n##零\n##雷\n##雹\n##電\n##雾\n##需\n##霁\n##霄\n##霆\n##震\n##霈\n##霉\n##霊\n##霍\n##霎\n##霏\n##霑\n##霓\n##霖\n##霜\n##霞\n##霧\n##霭\n##霰\n##露\n##霸\n##霹\n##霽\n##霾\n##靂\n##靄\n##靈\n##青\n##靓\n##靖\n##静\n##靚\n##靛\n##靜\n##
非\n##靠\n##靡\n##面\n##靥\n##靦\n##革\n##靳\n##靴\n##靶\n##靼\n##鞅\n##鞋\n##鞍\n##鞏\n##鞑\n##鞘\n##鞠\n##鞣\n##鞦\n##鞭\n##韆\n##韋\n##韌\n##韓\n##韜\n##韦\n##韧\n##韩\n##韬\n##韭\n##音\n##韵\n##韶\n##韻\n##響\n##頁\n##頂\n##頃\n##項\n##順\n##須\n##頌\n##預\n##頑\n##頒\n##頓\n##頗\n##領\n##頜\n##頡\n##頤\n##頫\n##頭\n##頰\n##頷\n##頸\n##頹\n##頻\n##頼\n##顆\n##題\n##額\n##顎\n##顏\n##顔\n##願\n##顛\n##類\n##顧\n##顫\n##顯\n##顱\n##顴\n##页\n##顶\n##顷\n##项\n##顺\n##须\n##顼\n##顽\n##顾\n##顿\n##颁\n##颂\n##预\n##颅\n##领\n##颇\n##颈\n##颉\n##颊\n##颌\n##颍\n##颐\n##频\n##颓\n##颔\n##颖\n##颗\n##题\n##颚\n##颛\n##颜\n##额\n##颞\n##颠\n##颡\n##颢\n##颤\n##颦\n##颧\n##風\n##颯\n##颱\n##颳\n##颶\n##颼\n##飄\n##飆\n##风\n##飒\n##飓\n##飕\n##飘\n##飙\n##飚\n##飛\n##飞\n##食\n##飢\n##飨\n##飩\n##飪\n##飯\n##飲\n##飼\n##飽\n##飾\n##餃\n##餅\n##餉\n##養\n##餌\n##餐\n##餒\n##餓\n##餘\n##餚\n##餛\n##餞\n##餡\n##館\n##餮\n##餵\n##餾\n##饅\n##饈\n##饋\n##饌\n##饍\n##饑\n##饒\n##饕\n##饗\n##饞\n##饥\n##饨\n##饪\n##饬\n##饭\n##饮\n##饯\n##饰\n##饱\n##饲\n##饴\n##饵\n##饶\n##饷\n##饺\n##饼\n##饽\n##饿\n##馀\n##馁\n##馄\n##馅\n##馆\n##馈\n##馋\n##馍\n##馏\n##馒\n##馔\n##首\n##馗\n##香\n##馥\n##馨\n##馬\n##馭\n##馮\n##馳\n##馴\n##駁\n##駄\n##駅\n##駆\n##駐\n##駒\n##駕\n##駛\n##駝\n##駭\n##駱\n##駿\n##騁\n##騎\n##騏\n##験\n##騙\n##騨\n##騰\n##騷\n##驀\n##驅\n##驊\n##驍\n##驒\n##驕\n##驗\n##驚\n##驛\n##驟\n##驢\n##驥\n##马\n##驭\n##驮\n##驯\n##驰\n##驱\n##驳\n##驴\n##驶\n##驷\n##驸\n##驹\n##驻\n##驼\n##驾\n##驿\n##骁\n##骂\n##骄\n##骅\n##骆\n##骇\n##骈\n##骊\n##骋\n##验\n##骏\n##骐\n##骑\n##骗\n##骚\n##骛\n##骜\n##骞\n##骠\n##骡\n##骤\n##骥\n##骧\n##骨\n##骯\n##骰\n##骶\n##骷\n##骸\n##骼\n##髂\n##髅\n##髋\n##髏\n##髒\n##髓\n##體\n##髖\n##高\n##髦\n##髪\n##髮\n##髯\n##髻\n##鬃\n##鬆\n##鬍\n##鬓\n##鬚\n##鬟\n##鬢\n##鬣\n##鬥\n##鬧\n##鬱\n##鬼\n##魁\n##魂\n##魄\n##魅\n##魇\n##魍\n##魏\n##魔\n##魘\n##魚\n##魯\n##魷\n##鮑\n##鮨\n##鮪\n##鮭\n##鮮\n##鯉\n##鯊\n##鯖\n##鯛\n##鯨\n##鯰\n##鯽\n##鰍\n##鰓\n##鰭\n##鰲\n##鰻\n##鰾\n##鱈\n##鱉\n##鱔\n##鱗\n##鱷\n##鱸\n##鱼\n##鱿\n##鲁\n##鲈\n##鲍\n##鲑\n##鲛\n##鲜\n##鲟\n##鲢\n##鲤\n##鲨\n##鲫\n##鲱\n##鲲\n##鲶\n##鲷\n##鲸\n##鳃\n##鳄\n##鳅\n##鳌\n##鳍\n##鳕\n##鳖\n##鳗\n##鳝\n##鳞\n##鳥\n##鳩\n##鳳\n##鳴\n##鳶\n##鴉\n##鴕\n##鴛\n##鴦\n##鴨\n##鴻\n##鴿\n##鵑\n##鵜\n##鵝\n##鵡\n##鵬\n##鵰\n##鵲\n##鶘\n##鶩\n##鶯\n##鶴\n##鷗\n##鷲\n##鷹\n##
鷺\n##鸚\n##鸞\n##鸟\n##鸠\n##鸡\n##鸢\n##鸣\n##鸥\n##鸦\n##鸨\n##鸪\n##鸭\n##鸯\n##鸳\n##鸵\n##鸽\n##鸾\n##鸿\n##鹂\n##鹃\n##鹄\n##鹅\n##鹈\n##鹉\n##鹊\n##鹌\n##鹏\n##鹑\n##鹕\n##鹘\n##鹜\n##鹞\n##鹤\n##鹦\n##鹧\n##鹫\n##鹭\n##鹰\n##鹳\n##鹵\n##鹹\n##鹼\n##鹽\n##鹿\n##麂\n##麋\n##麒\n##麓\n##麗\n##麝\n##麟\n##麥\n##麦\n##麩\n##麴\n##麵\n##麸\n##麺\n##麻\n##麼\n##麽\n##麾\n##黃\n##黄\n##黍\n##黎\n##黏\n##黑\n##黒\n##黔\n##默\n##黛\n##黜\n##黝\n##點\n##黠\n##黨\n##黯\n##黴\n##鼋\n##鼎\n##鼐\n##鼓\n##鼠\n##鼬\n##鼹\n##鼻\n##鼾\n##齁\n##齊\n##齋\n##齐\n##齒\n##齡\n##齢\n##齣\n##齦\n##齿\n##龄\n##龅\n##龈\n##龊\n##龋\n##龌\n##龍\n##龐\n##龔\n##龕\n##龙\n##龚\n##龛\n##龜\n##龟\n##︰\n##︱\n##︶\n##︿\n##﹁\n##﹂\n##﹍\n##﹏\n##﹐\n##﹑\n##﹒\n##﹔\n##﹕\n##﹖\n##﹗\n##﹙\n##﹚\n##﹝\n##﹞\n##﹡\n##﹣\n##！\n##＂\n##＃\n##＄\n##％\n##＆\n##＇\n##（\n##）\n##＊\n##，\n##－\n##．\n##／\n##：\n##；\n##＜\n##？\n##＠\n##［\n##＼\n##］\n##＾\n##＿\n##｀\n##ｆ\n##ｈ\n##ｊ\n##ｕ\n##ｗ\n##ｚ\n##｛\n##｝\n##｡\n##｢\n##｣\n##､\n##･\n##ｯ\n##ｰ\n##ｲ\n##ｸ\n##ｼ\n##ｽ\n##ﾄ\n##ﾉ\n##ﾌ\n##ﾗ\n##ﾙ\n##ﾝ\n##ﾞ\n##ﾟ\n##￣\n##￥\n##👍\n##🔥\n##😂\n##😎\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/assets/word_dict.txt",
    "content": "<PAD>\n<UNK>\n，\n的\n[UNK]\n。\n一\n了\n是\n不\n我\n在\n人\n这\n有\n他\n着\n来\n上\n你\n个\n她\n大\n到\n道\n就\n下\n那\n说\n子\n出\n地\n：\n也\n小\n中\n时\n！\n要\n么\n们\n得\n然\n看\n自\n里\n？\n好\n身\n过\n手\n起\n后\n为\n可\n去\n没\n会\n心\n和\n之\n还\n都\n天\n、\n以\n头\n能\n对\n想\n而\n开\n只\n面\n己\n女\n生\n动\n样\n如\n发\n声\n无\n多\n用\n已\n又\n前\n被\n把\n家\n事\n经\n点\n两\n知\n情\n老\n眼\n力\n现\n于\n年\n感\n什\n很\n儿\n让\n但\n意\n间\n方\n成\n些\n笑\n体\n进\n长\n高\n从\n将\n话\n美\n见\n啊\n住\n当\n气\n所\n真\n回\n白\n口\n作\n再\n此\n明\n给\n觉\n正\n妈\n边\n色\n才\n行\n快\n公\n种\n分\n三\n几\n全\n法\n却\n次\n.\n本\n轻\n走\n国\n最\n同\n打\n十\n向\n实\n神\n定\n学\n张\n水\n光\n门\n内\n紧\n主\n更\n其\n入\n直\n外\n听\n问\n脸\n少\n肉\n因\n死\n做\n部\n双\n处\n与\n男\n便\n王\n太\n等\n怎\n像\n吧\n性\n红\n第\n二\n候\n理\n接\n放\n满\n阴\n果\n别\n比\n重\n相\n林\n受\n位\n「\n叫\n玉\n何\n龙\n强\n爱\n日\n应\n花\n清\n微\n加\n安\n机\n完\n连\n名\n者\n房\n李\n刚\n姐\n东\n带\n月\n西\n师\n亲\n文\n风\n服\n并\n难\n山\n深\n指\n先\n关\n常\n干\n马\n跟\n由\n」\n目\n变\n四\n使\n嘴\n平\n合\n立\n吗\n精\n物\n车\n流\n火\n呢\n巴\n股\n望\n军\n空\n任\n金\n衣\n似\n工\n乳\n始\n越\n电\n原\n战\n新\n般\n黑\n今\n飞\n海\n转\n插\n至\n淫\n通\n表\n腿\n雪\n度\n\"\n思\n根\n信\n场\n业\n随\n吃\n虽\n许\n反\n阵\n书\n乎\n慢\n世\n往\n云\n路\n解\n特\n离\n算\n哥\n青\n半\n命\n音\n竟\n,\n交\n非\n拉\n教\n化\n容\n娇\n条\n夫\n刻\n活\n丝\n《\n》\n万\n量\n坐\n司\n且\n传\n管\n该\n热\n南\n找\n周\n五\n惊\n突\n早\n片\n落\n数\n它\n欢\n阳\n总\n极\n结\n步\n棒\n整\n城\n激\n孩\n制\n喜\n倒\n鸡\n远\n穴\n床\n冷\n怕\n市\n香\n母\n及\n抽\n；\n言\n息\n每\n件\n阿\n提\n百\n血\n记\n利\n断\n绝\n停\n弄\n（\n失\n站\n宝\n）\n影\n露\n静\n语\n即\n士\n近\n弟\n尔\n尽\n屁\n若\n欲\n柔\n系\n杀\n形\n细\n斯\n穿\n兴\n剑\n灵\n武\n吸\n认\n收\n星\n足\n底\n办\n石\n急\n求\n代\n脚\n黄\n保\n击\n功\n江\n丽\n娘\n拿\n令\n终\n华\n陈\n杨\n未\n冲\n晚\n或\n乐\n低\n哪\n修\n奇\n楚\n挺\n显\n刺\n必\n品\n员\n暗\n视\n友\n叶\n谁\n痛\n嫩\n则\n玩\n告\n千\n妹\n射\n包\n计\n烈\n温\n界\n抱\n字\n唇\n产\n胸\n民\n准\n魔\n够\n罗\n液\n摸\n势\n帮\n沉\n各\n程\n软\n忍\n乱\n单\n*\n酒\n达\n怪\n久\n忙\n敢\n哈\n兵\n顶\n伤\n切\n留\n著\n依\n速\n背\n题\n德\n压\n-\n象\n伸\n呼\n害\n送\n父\n区\n元\n续\n北\n队\n夜\n建\n官\n识\n具\n院\n请\n备\n刘\n格\n装\n顾\n睛\n摇\n亮\n缓\n克\n八\n苦\n持\n宫\n章\n裤\n脑\n消\n舒\n梦\n布\n务\n易\n响\n朝\n错\n钱\n期\n破\n答\n怀\n展\n照\n报\n领\n客\n术\n腰\n室\n巨\n妇\n待\n线\n调\n复\n苏\n渐\n论\n滑\n秦\n掌\n狂\n古\n异\n猛\n皇\n味\n角\n确\n兰\n睡\n雨\n决\n句\n式\n仙\n运\n吟\n脱\n顿\n推\n狠\n冰\n羞\n众\n另\n政\n台\n抓\n居\n爷\n六\n游\n英\n甚\n七\n木\n嗯\n迷\n器\n网\n团\n注\n淡\n春\n
级\n硬\n峰\n设\n毛\n旁\n喝\n忽\n舌\n左\n铁\n科\n商\n皮\n按\n族\n曾\n图\n首\n资\n奶\n简\n米\n故\n继\n取\n掉\n爸\n除\n丰\n密\n况\n呀\n类\n赵\n谢\n右\n态\n段\n围\n楼\n基\n约\n粗\n潮\n需\n技\n佛\n跳\n九\n充\n荡\n顺\n差\n引\n姑\n艳\n愿\n浪\n耳\n熟\n兄\n湿\n唐\n您\n义\n赶\n念\n护\n观\n药\n松\n隐\n跑\n试\n君\n圆\n局\n节\n散\n诉\n闪\n止\n颤\n限\n刀\n支\n粉\n鬼\n禁\n医\n洞\n齐\n含\n集\n帝\n秀\n握\n臀\n初\n造\n吻\n料\n婆\n号\n导\n麻\n招\n抬\n质\n眉\n坚\n套\n希\n疑\n拍\n波\n村\n球\n社\n怒\n肯\n毫\n透\n雅\n份\n抚\n翻\n退\n专\n河\n奋\n户\n校\n考\n醒\n示\n雷\n究\n景\n圣\n胡\n病\n摆\n汉\n钟\n威\n统\n模\n哦\n暴\n爽\n举\n恶\n靠\n称\n卫\n弹\n警\n尖\n选\n既\n层\n攻\n扭\n承\n呻\n存\n余\n斗\n仍\n魂\n舔\n素\n改\n夏\n群\n府\n沙\n境\n置\n联\n板\n饭\n孙\n志\n·\n陆\n历\n骚\n秘\n呵\n抖\n1\n属\n州\n闻\n查\n副\n致\n屄\n操\n亚\n龟\n仅\n派\n证\n默\n换\n妻\n块\n莫\n写\n组\n凌\n坏\n贵\n助\n宁\n媚\n恐\n排\n状\n雄\n毕\n紫\n卡\n省\n福\n短\n追\n据\n食\n岁\n规\n杰\n萧\n寒\n环\n广\n枪\n蜜\n谈\n费\n骨\n座\n委\n丁\n宇\n画\n超\n蛋\n礼\n严\n草\n戏\n配\n朋\n端\n共\n治\n妙\n察\n较\n研\n恩\n咬\n伙\n型\n俊\n店\n独\n版\n润\n宗\n震\n揉\n议\n讲\n柳\n价\n树\n京\n投\n哼\n朱\n敏\n挑\n扬\n浑\n争\n梅\n习\n丹\n擦\n银\n营\n维\n幸\n洗\n劲\n野\n假\n偷\n忘\n宋\n源\n索\n土\n守\n虚\n混\n洁\n闭\n救\n付\n控\n叹\n摩\n练\n徐\n烟\n裸\n划\n县\n秋\n喘\n堂\n吴\n买\n敌\n参\n姨\n施\n标\n狗\n否\n权\n夹\n厉\n担\n遇\n险\n毒\n触\n虎\n负\n纪\n效\n际\n巧\n养\n晓\n哭\n欣\n须\n尚\n肩\n胜\n演\n贝\n享\n曲\n项\n升\n躺\n啦\n案\n藏\n咱\n珠\n嘿\n防\n韩\n婚\n屋\n凤\n材\n挥\n适\n2\n娜\n创\n迎\n诗\n仿\n优\n班\n责\n喊\n史\n泪\n探\n芳\n剧\n牙\n湖\n茎\n贴\n赤\n验\n临\n智\n盘\n撞\n休\n磨\n舞\n镇\n蓝\n妖\n逃\n疯\n牛\n鲜\n永\n田\n凡\n范\n抗\n捏\n呆\n职\n普\n恨\n喷\n诱\n叔\n园\n桌\n侧\n疼\n印\n康\n奸\n移\n肏\n裙\n尊\n兽\n~\n缩\n羽\n盖\n丈\n瞬\n育\n略\n伟\n(\n徒\n幽\n挂\n凝\n)\n3\n寻\n旧\n袋\n滚\n船\n伯\n躯\n良\n?\n尼\n某\n庭\n肥\n颜\n富\n弱\n拥\n塞\n洛\n碰\n厅\n庄\n吐\n鱼\n肤\n托\n谷\n狼\n卖\n姿\n臂\n灯\n沈\n封\n壁\n辈\n玄\n梁\n采\n浓\n油\n免\n诺\n燕\n晃\n轮\n惑\n值\n涌\n款\n嘛\n尤\n杯\n玲\n鼻\n艺\n纯\n痒\n．\n洋\n┅\n犹\n莉\n圈\n肌\n炼\n杂\n隔\n喔\n积\n邪\n增\n读\n逸\n奴\n扑\n佳\n腹\n慕\n骑\n释\n趣\n晶\n货\n罪\n获\n拔\n腾\n笔\n菜\n爆\n农\n歌\n败\n伏\n汗\n蒙\n烧\n薄\n善\n午\n稍\n吓\n泄\n伊\n载\n架\n厚\n惜\n窗\n蓉\n飘\n骂\n扎\n/\n抵\n墙\n归\n颗\n袜\n束\n企\n番\n纤\n茶\n列\n烦\n横\n占\n私\n赛\n滴\n折\n街\n济\n借\n避\n亦\n糊\n牌\n净\n袁\n吞\n傅\n奈\n皱\n;\n伦\n吮\n尸\n绿\n遍\n挤\n供\n鼓\n搞\n乡\n5\n##┅\n郎\n桃\n灭\n讨\n介\n缝\n偏\n彩\n党\n纳\n卷\n罩\n互\n敬\n懂\n慧\n诸\n犯\n菲\n扶\n织\n壮\n纷\n替\n裂\n逼\n缠\n律\n征\n缘\n俩\n残\n纸\n甲\n距\n婷\n豪\n率\n盯\n监\n幻\n镜\n迹\n怜\n降\n泽\n迫\n舍\n登\n甜\n碎\n
悄\n呜\n乔\n4\n吉\n欧\n罢\n祖\n森\n勇\n拳\n销\n郑\n昨\n冒\n财\n彻\n凉\n琳\n岳\n危\n奔\n搂\n择\n姓\n墨\n勾\n殿\n酥\n测\n爬\n陪\n鸣\n莲\n醉\n训\n滋\n预\n健\n漂\n弃\n概\n唯\n暖\n域\n馆\n肚\n盛\n尘\n闹\n來\n仰\n仔\n撑\n耐\n郭\n晕\n鲁\n聚\n睁\n荣\n瞧\n挣\n:\n戴\n博\n董\n逐\n凶\n劳\n蕾\n躲\n垂\n陷\n浮\n拼\n川\n翘\n迅\n幕\n构\n庆\n胆\n累\n唔\n蒂\n啪\n跪\n朵\n宜\n附\n餐\n乖\n补\n涨\n後\n编\n忌\n昏\n麽\n惨\n奥\n杜\n翼\n炎\n仇\n孔\n亡\n岛\n童\n枫\n词\n稳\n仪\n谓\n旋\n竹\n剩\n臣\n恋\n遥\n绕\n吹\n凭\n融\n尝\n逗\n执\n拜\n扯\n阻\n琴\n纵\n虑\n瑶\n盈\n貌\n侯\n耸\n挡\n赏\n闲\n扫\n耻\n珍\n例\n愣\n傲\n漫\n辰\n旅\n莹\n丢\n悲\n促\n掩\n宽\n脉\n困\n启\n酸\n策\n宣\n袭\n庞\n额\n录\n络\n乌\n樱\n沟\n缺\n遗\n趴\n灰\n裹\n浴\n伴\n拒\n傻\n误\n塔\n&\n核\n努\n弯\n脖\n妃\n艾\n乃\n瞪\n妮\n迟\n甘\n池\n洪\n搓\n检\n辱\n惠\n蛇\n哎\n课\n慌\n诚\n婉\n!\n批\n淋\n益\n闷\n阶\n协\n输\n拨\n慰\n聊\n抹\n寸\n腻\n愈\n胀\n鞋\n苍\n6\n饱\n丫\n氏\n愤\n猜\n曼\n辞\n赞\n孤\n尾\n眸\n胯\n评\n芒\n抢\n浩\n怨\n碧\n豆\n均\n薇\n贱\n爹\n锁\n泛\n惯\n侍\n忆\n晴\n夺\n膀\n烫\n绪\n仁\n典\n隆\n轩\n固\n菊\n绵\n7\n嘻\n辉\n搭\n俏\n针\n炸\n冯\n锋\n汁\n痕\n席\n延\n哀\n忠\n炮\n恢\n椅\n辛\n埋\n悦\n汤\n堪\n迪\n浅\n钻\n嫣\n述\n8\n遭\n泰\n轰\n肖\n码\n硕\n孟\n咪\n姬\n屈\n帅\n胖\n括\n扣\n津\n吼\n泉\n贼\n频\n嘉\n10\n悠\n径\n季\n授\n彼\n晨\n沿\n倩\n悉\n荒\n暂\n唱\n染\n瑞\n判\n箭\n凑\n倾\n溜\n琪\n卧\n勒\n陌\n邓\n脏\n跃\n谋\n勃\n辣\n估\n佩\n旦\n侠\n吕\n绍\n渴\n侵\n鸟\n熊\n韵\n辆\n沾\n瓶\n惧\n坦\n忧\n筑\n朗\n雾\n齿\n燃\n拖\n瓣\n焦\n饰\n霞\n涂\n奉\n旗\n豫\n遮\n丸\n尿\n霸\n烂\n涛\n岩\n芸\n御\n减\n票\n怡\n莱\n匆\n牢\n疾\n递\n霜\n啸\n喃\n革\n猪\n厌\n衫\n皆\n脆\n尺\n途\n审\n废\n瓜\n姆\n损\n堆\n颈\n偶\n踏\n潜\n猫\n零\n奖\n毁\n颖\n魏\n贞\n培\n於\n踪\n末\n喉\n韦\n涩\n讶\n曰\n莎\n厂\n欺\n央\n骗\n郁\n悟\n臭\n娃\n薛\n凯\n锐\n趁\n库\n颊\n玛\n倍\n嫂\n萨\n钢\n奏\n泡\n乘\n泥\n购\n汽\n袍\n桥\n萍\n卿\n痴\n盟\n肛\n翠\n符\n眯\n窄\n洲\n冬\n曹\n岂\n悔\n疗\n袖\n淑\n截\n狐\n港\n羊\n撕\n舅\n刑\n楠\n绑\n惹\n霍\n驾\n汇\n唤\n怔\n饮\n敲\n咽\n陶\n%\n牵\n献\n糖\n柱\n枝\n售\n覆\n虫\n映\n箱\n珊\n拾\n卓\n撒\n葛\n铺\n阁\n恭\n颇\n瓦\n瘦\n屏\n俗\n帐\n幅\n幼\n篇\n振\n畅\n播\n扔\n宿\n铃\n茹\n寂\n耀\n劫\n航\n雕\n抑\n谦\n穷\n黎\n纹\n宾\n汪\n肃\n刹\n焰\n绳\n晋\n饶\n舟\n喂\n肢\n恼\n督\n摄\n抛\n予\n跨\n肠\n狱\n赫\n蛮\n腔\n岸\n扮\n昊\n雯\n聪\n井\n肆\n蒋\n桑\n9\n允\n贪\n禾\n梯\n疲\n丑\n斩\n斜\n戒\n呈\n廷\n陵\n析\n劝\n苗\n序\n冠\n俱\n锦\n亿\n#\n伍\n殊\n掏\n芬\n姊\n拦\n亏\n愧\n扰\n膝\n丛\n召\n麦\n巾\n龄\n贤\n蹲\n乾\n璃\n恰\n漠\n碗\n催\n懒\n衬\n洒\n噢\n眨\n稀\n悬\n吊\n嫁\n慎\n鹏\n瑕\n肿\n纠\n灌\n慈\n键\n赖\n熙\n纱\n柏\n寺\n晰\n污\n粒\n棍\n冥\n12\n魄\n贺\n贾\n茫\n丧\n墓\n植\n昂\n這\n链\n粘\n唉\n～\n罚\n秒\n噗\n叉\n牧\n申\n柜\n扩\n僵\n涵\n弥\n
贯\n询\n虹\n魅\n俯\n鞭\n撩\n琼\n措\n厨\n勉\n佐\n吁\n访\n披\n舰\n毅\n踢\n签\n咯\n溪\n孕\n堡\n澡\n祭\n挖\n爵\n卢\n贫\n赌\n弓\n甩\n溢\n扇\n茵\n搜\n讯\n怖\n芝\n捧\n笼\n窝\n盗\n浸\n驶\n胴\n藤\n繁\n祝\n猎\n20\n馨\n∶\n帆\n衡\n添\n咕\n棠\n蝶\n匹\n僧\n租\n泣\n跌\n捉\n筋\n柄\n详\n勤\n【\n逍\n娟\n返\n仲\n##s\n咳\n瞳\n潘\n砸\n违\n础\n鸿\n掠\n竞\n铜\n啥\n[\n愉\n籍\n誉\n】\n萌\n耗\n赢\n－\n]\n11\n埃\n歉\n迈\n脂\n铭\n麟\n刷\n祸\n障\n祥\n锅\n鹰\n邦\n宠\n摔\n凸\n搬\n崩\n骄\n斥\n尬\n酷\n碍\n芙\nt\n捕\n虐\n膜\n尴\n援\n闯\n殷\n耶\n妍\n咐\n夸\n饿\n宛\n腕\n昌\n『\n仗\n囊\n宏\n瑟\n逆\n阅\n蕴\n膛\n驱\n钰\n描\n亭\n叛\n哇\n寿\n媳\n崇\n嫌\n姚\n丘\n糟\n渡\n串\n胁\n币\n愁\n爪\n婴\n册\n患\n盒\n欠\n吾\n煞\n绷\n吵\n浆\n兼\n巫\n弗\n崔\n岚\n歇\n妳\n俄\n漆\n掀\n挨\n姜\n割\n盾\n枚\n胳\n疏\n撤\n宅\n穆\n裁\n邵\n绩\n嗔\n』\n膊\n涉\n档\n刮\n励\n0\n驰\n靖\n蔡\n荷\n黏\n灾\n粮\n哟\n诡\n帽\n孝\n劈\n恒\n搅\n桂\n咒\n订\n氛\n凄\n赐\n昭\n蕊\n驻\n灼\n踩\n矿\n矮\n捷\n湾\n玻\n捂\n仆\n巡\n帕\n媒\n哲\n械\n隶\n庙\n恍\n妩\n腐\n刃\n##t\n鼎\n萝\n症\n妆\n渊\n颠\n隙\n昆\n堵\n30\n妓\n仓\n漉\n逢\n郡\n瘫\n鼠\n哑\n歪\n综\n杆\n蠢\n坡\n斑\n阔\n媛\n坛\n邀\n～～\n翔\n辑\n叠\n瞒\n廊\n咙\n溃\n卑\n誓\n瑗\n宴\n彭\n##2\n珑\n搐\n艰\n夕\n拢\n婶\n炉\n眠\n蠕\n吩\n邻\n税\n蹭\n剂\n玫\n嘶\n兔\n漏\n##0\n缕\n盆\n畏\n矛\n崖\n皙\n胎\n捅\n帘\n啲\n乏\n旺\n贸\n壶\n筒\n陛\n脾\n骤\n犬\n沒\n烁\n迁\n湘\n戈\n芷\n岭\n枕\n伐\n##9\n拱\n禹\n沐\n趟\n饥\n契\n腥\n怯\n谅\n砍\n赚\n狭\n鹿\n骇\n乜\n坠\n役\n時\n澜\n矩\n遂\n挪\n枯\n谨\n渗\n拟\n笨\n璇\n霄\n阮\n辩\n剥\n說\n菁\na\n甫\n宙\n寨\n悍\n寄\n摊\n塑\n棉\n嘟\n匪\n捆\n嘲\n屠\n竖\n届\n黛\n妥\n靡\n猴\n彤\n15\n偿\n鉴\n挽\n斤\n亵\n储\n沫\n哄\n侄\n瞎\n鹤\n咖\n拐\n娶\n盼\n溅\n页\n嗓\n筹\n攀\n卜\n兜\n妾\n桐\n雀\n填\n痉\n躁\n噬\n辅\n砰\n挛\n聂\n萱\n桶\n婢\n##g\n棋\n嗦\n胞\n雁\n牲\n哗\n蜂\n啧\n坑\n咧\n胶\n啡\n兮\n谭\n##1\n葬\n厢\n锟\n彷\n翎\n滩\n窜\n译\n淌\n##3\n胃\n亩\n趾\n尉\n腴\n憋\n塌\n岗\n篷\n坤\n账\n柴\n缚\n旨\n杏\n睿\n绽\n蓬\n削\n诧\n柯\n轨\n掐\n镖\n绯\n辨\n##n\n绘\n竭\n朴\n扒\n娴\n磊\n伪\n厕\n茂\n览\n赋\n筱\n拓\n弦\n漓\n膨\n妄\n陡\n盐\n循\n殖\n琛\n栗\n戚\n##d\n遣\n耍\n吱\n屑\n肝\n##r\n娅\n愕\n舐\n诀\n戳\n剪\n伺\n拂\n啼\n燥\n邱\n朕\n棺\n壳\n拽\n凰\n羡\n栋\n儒\n姻\n揭\n翅\n##k\n瞥\n匙\n焚\n翰\n绣\n烛\n冤\n脯\n霖\n寇\n17\n拆\n寡\n##5\n##y\n泳\n旭\n墅\n哧\n惩\n##4\n潇\n址\n叮\n尹\n蔚\n逝\n蟒\n蔓\n糕\n窥\n※\n摘\n婪\n禅\n厮\n昔\n狞\n褪\n晌\n鄙\n泌\n脊\n慨\n狮\n赴\n辽\nb\n秽\n骆\n垫\n唾\n赔\n艘\n撅\n妨\n逊\n淼\n谎\n宰\n茜\n谱\n嚷\n沸\n捣\n婊\n昧\n轴\n烤\n撼\n搏\n妒\n汹\n栏\n滥\n熬\n陀\n##the\n寞\n撇\n瑜\n翁\n廉\n芯\n豹\n##6\n荆\n兆\n瑰\n荐\n遵\n钥\n愚\n狄\n杖\n##a\n哩\n捡\n兹\n82\n+\n浙\n债\n凛\n＊\n泼\n過\n娱\n漾\n哆\n##i\n裴\n
c\n夷\n亢\n冉\n矣\n钉\n炽\n佣\n冻\n##8\n苹\n灿\n徽\n寓\n耿\nquot\n擅\n敞\n歹\n押\n闺\n践\n嗤\n凹\n阜\n##x\n猥\n畜\n诊\n泊\n擒\n舱\n贡\n摧\n##b\n憾\n蓄\n疆\n讽\n惟\n##w\n趋\n瞄\n轿\n炒\n##12\n16\n##e\n匀\n蓦\n嘱\n呃\n沃\n稠\n姥\n倦\n杭\n躬\n逛\n2010\n惶\n姗\n乙\n氓\n罕\n100\n璟\n裳\n①\n彪\n瘾\n郝\n妞\n50\n##7\n篮\n署\n巷\n揽\n唧\n乞\n呐\n牺\n梨\n巍\n嗅\n彦\n椒\n甬\n巅\n携\n俞\n衷\n梳\n##c\n肺\n18\n胤\n碑\n衰\n辜\n俺\n钩\n弘\n绮\n──\n拧\n00\n叙\n耕\n倏\n稚\n拭\n邮\n葡\n贷\n倪\n琦\n咸\n丐\n刊\n饼\n圳\n朦\n##m\n瓷\n庸\n裡\n浇\n挠\n胧\n坊\n纽\n曦\n缸\n秃\n搁\n卵\n倚\n藉\n醋\n芦\n黯\n滨\n滞\n侦\n13\ns\n蒸\n钦\nx\n祈\n窃\n萄\n沧\n毯\n2011\n涯\nxi\n14\n蚁\n诞\n蚀\n嗡\n匠\n浊\n荫\n驳\n裆\n芊\n榜\n敛\n裕\n蹦\n稿\n霉\n襄\n蔽\n衍\n##th\ncom\n潭\n煮\n履\n魁\n蹂\n棵\n蝉\n03\n砖\n斌\n惫\n抠\n琐\n抄\n锤\n##he\n眩\n彬\n嘘\n侣\n碌\n寝\n幺\n=\n顽\n躏\n甸\n狸\n娥\n檀\n劣\n暇\n桩\n孽\n焉\n卒\n耽\n聘\n弧\n恕\n_\n暮\n2009\n栽\n葱\n##o\n茅\n蹙\n哨\n拇\n咋\n24\nwww\n揪\n斧\n罡\n喻\n##of\n狡\n嫉\n鲍\n厦\n對\n狰\n卦\n扁\n敦\n阎\n睹\n擎\n绒\n旷\n衙\n嚎\n##u\n忖\n菱\n腮\n蛛\n遁\n朽\n菌\n苞\n盏\n璐\n厘\n蹄\n湛\n陋\n抿\n鹅\n仑\n株\n婿\n發\n拘\n們\n##l\n扳\nk\n辖\n禀\n襟\n咦\n25\n坟\n巢\n蹬\n渔\n辟\n2012\n淘\n2008\n尧\n倘\n淮\n蓓\n##h\n蹈\n沁\n祁\n磁\n蜡\n粹\n屌\n囚\n艇\n疚\n绸\n赠\n佑\n澈\n蛊\n麒\n經\n蜀\n扛\n淹\n绞\n榻\n恳\n皓\n筝\n煤\n暧\n驴\n帖\n嚣\n谐\n垃\n蹊\n琢\n沌\n捞\n宵\n圾\n嘎\n晔\n芽\n郊\n2018\n晒\n渣\n40\n雍\n峡\n罐\n嘤\n21\n堕\n沦\n狈\n咨\n逻\n甄\n淳\n钧\n磕\n窒\n2000\n诈\n掘\n栈\n酬\n禽\n塘\n纲\n惚\n渠\n匕\n\\\n叨\n挟\n惮\n猿\n箍\n筷\n眶\n涡\n螺\n19\n宪\n鸭\n嘀\n迭\n喇\n詹\n伽\n##ra\n鳞\n婧\n奢\n锡\n嗽\n滔\n棚\n矜\n咚\n靴\n靥\n煌\n莺\n凳\n吏\n泻\n‥\n兀\n煎\n蝴\n唷\n翩\n亨\n斋\n剔\n##sh\n伞\n掰\n傍\n梭\n##er\n澄\n60\n媽\n##ao\n褐\n锻\n腊\n禄\n虞\n鸦\n枉\n陕\n俘\n嵌\n##f\n砂\n恬\n2007\n坎\n呕\n叽\n啜\n哮\n窟\nm\n勿\n拎\n畔\n颅\n卸\n苑\n诏\n22\n氧\n铸\n黝\n膏\n梵\n骏\n酱\n窍\n堤\n雇\n毙\n辕\nandroid\n菩\nh\n虾\n噜\n侮\n瞅\n笛\n霆\n鸾\n廖\n贩\n潺\n券\n咏\n绰\n衔\namp\n薪\n2006\n踝\n茸\n疤\n渺\n##ng\n粥\n谊\n槽\n樊\n拚\nj\n顷\n乍\n伶\n冈\n芭\n熏\n矢\n喧\n谜\n稻\n##to\n髓\n梓\n2005\n惕\n叱\n溶\n鞠\n##ａ\n恤\n莽\n##st\n撸\n咆\n哉\n吭\n##ing\n霎\n2013\n丞\n糙\n戮\n函\n驼\n##ed\n睫\n敖\n斐\n卉\n浦\n熄\n##hi\n啃\n迦\n箫\nf\nr\n疫\n诛\n蒲\n掷\n惭\n楞\n僻\n挫\n39\n梆\n澳\n嬉\nd\n勋\n搔\n啤\n23\n##on\n80\n臻\n苟\nu\n炫\n奕\n|\n框\n虏\n滕\n##ha\n弩\n绫\n奎\n>\n葫\ng\n蛟\ne\n嚼\n廓\n汝\n褶\n##uo\n钓\n蜷\n睾\n韧\n踹\n岔\n梢\n琉\n##v\n逞\n##～\n2014\n厥\n蚂\n怦\n##z\n盲\n汩\n祷\n叩\n～～～\n現\n犀\n盔\n拙\n讪\n茨\n
26\nw\n瞟\n筠\n偎\n炙\n暑\n骸\n揍\nse\n桓\n旬\n峻\n肘\n'\n侃\n窦\n喽\n28\n缴\n##in\n漱\n噩\n瑾\n峨\n悸\n絮\n棱\n镶\n蔑\n##p\n豁\n闵\n碟\n##xt\n迸\n叭\n垮\n琅\n溺\n薰\n菇\n##le\n屎\n捐\n缎\n稽\n##and\n2017\n渍\n瀚\n蝎\n炭\n坪\n捻\n跋\n90\n屡\n勺\n跄\n芮\n焕\n##be\n2003\n肾\n##ro\n2004\n沛\n葵\n②\n踉\n呛\n鞘\nn\nv\n瑛\n酿\n腋\n烘\n敷\n邹\n诵\n##q\n逮\n揣\n呸\n<\nthe\n涎\n脐\n貂\n嗷\n冶\n200\nl\n垒\n咛\n##re\n崎\n霈\n壤\n邢\n慑\no\n眷\n##an\n拌\n27\n喀\n窘\n笙\n瀑\n钮\n70\n梧\n刁\n讥\n2015\n觑\n猩\n##ck\n癌\n邃\n隋\n涟\n捶\n澎\n萎\n漩\n茄\n##wa\n庵\n濡\n缀\n蚊\n骷\n夭\n髅\ni\n狙\n肮\n藩\n呦\n幢\n〔\n咔\n谣\n壑\n旱\n吨\n涅\n穹\n铮\n陨\n暄\n翡\n嗜\n兑\n呗\n牡\n蚕\n棕\n讳\np\n兢\n##wi\n坝\n##is\n诅\n雌\n攥\n〕\n肋\n瘤\n媾\n##me\n簇\n匿\n铐\n懵\n傀\n500\n祀\n杉\n挲\n绅\n烙\n猾\n橡\n跺\n芹\n2016\n１\n秩\n怠\n##at\n枣\n##×\n枢\n铠\n粪\n鑫\n蜘\n29\n钗\n倔\n31\n懈\n涕\n醇\n阀\n娆\n﹐\n##sa\n颂\n秉\n栖\n颓\n匣\n2002\n茉\n炯\n哽\n璧\n槐\n##co\n彰\n厄\n##fo\n腺\n晖\n慾\n龚\n熔\n棘\n捺\n驯\n曳\nz\n屯\n湃\n阱\n##ho\n搀\n榨\n檐\n鸽\n蟹\n辙\n##ｏ\n憨\n叼\n藕\n霓\n歧\n岑\n褥\n衅\n袅\n沮\n巩\n恣\n##j\n纺\n麼\n颔\n馒\n贲\n哒\n寥\n晏\n佬\n忿\n剖\n仕\n##di\n疙\n颁\n矫\n##ly\n35\n祟\n01\n婕\n裘\n蒜\n缭\n蔬\n漪\n儡\n##ma\n唬\n逾\n叟\n遏\n觅\n300\ntxt\n蕙\n懊\n瘩\n寰\n浣\n嗒\n##al\n忐\n忑\n莞\n甥\n橙\n##00\n剿\n啰\n皂\n昕\n嬷\n擂\n戟\n铲\n鬓\n穗\n奚\n娣\ny\n2001\n剃\n晦\n徊\n纬\n炕\n##se\n##el\n斟\n嚓\n1000\n嗣\n馀\n虔\n庐\n癫\n悚\n庚\n慵\n禧\n嗨\n弊\n蝇\n##mm\n聋\n丙\n绊\n朔\n痹\n竿\n绢\n##nt\n赦\n嗖\n08\n憎\n昼\n##un\n啾\n嗲\n臊\n３\n喳\n##you\n鬟\n茬\n邑\n徘\n榴\n姣\n宸\n舵\n窑\n咿\n##pa\n屉\n凿\n##de\n梗\n藻\n##os\n歼\n飙\n糜\n锥\n胭\n珂\n桦\n删\n赎\n鞍\n悯\n缅\n谕\n懿\n戬\n橱\n淇\n酣\n枭\nnbsp\n##te\n垢\n曝\n##ar\n雏\n冀\n猝\n钞\n椎\n惺\n蹑\n驭\n沼\nro\n墟\n舆\n腑\n霹\n##or\n攒\n##li\nand\n苔\n僚\n##mo\n笠\n磅\n##ｎ\nyin\n辫\n##go\n##℃\n##et\n噪\n灶\n笋\n挚\n贬\n峙\n##la\n谍\n##ot\n佟\n45\n囔\n##us\n勘\n36\n钝\n##so\n弛\n昵\n＂\n04\n拯\n^\n赧\n橘\n邸\n##no\n##ch\n##es\n珏\n撰\n泞\n##na\n璋\n戎\n蔷\n悴\n嵩\n##da\n碾\n##ｅ\n铅\n##en\n蛤\n杠\n雳\n09\n個\n渎\n栅\n谬\n捋\n峭\n##ne\n32\n嫖\n靓\n踱\n崽\n遐\n俐\n萤\n蝠\n辗\n锈\n憔\n簌\n叁\n癸\n２\n瑄\n##ｍ\n庇\n赘\n嫡\n##ss\n##ta\n剌\n奄\nq\n1999\n咂\n蛙\n铛\n悻\n桀\n@\n孜\n拣\n##９\n鲸\n##fi\n##10\n缪\n咻\n鸳\n募\n倡\n06\n碳\n05\n怅\n裔\n③\n恃\nunt\n揖\n慷\n唰\n##om\n侥\n杵\n膳\n咫\n苇\n懦\n鸠\n寐\n##ting\n阐\n吒\n绛\n玖\n缉\n蹿\n匈\n茧\n钳\n07\n庶\n##si\n##it\n掺\n烨\n為\n婵\n卯\n癖\n寅\n惬\n淀\
n筛\n骋\n咀\n183\n砚\n嬴\n绎\n璞\n芜\n饲\n戾\n肴\n##as\n蕉\n1998\n敝\n闸\n镯\n##ol\n惦\n婀\n殆\n燎\n笃\n峦\n殇\n屿\n聆\n桔\n瞻\n##ver\n粟\nhttp\nch\n##ve\n幡\n##do\n匡\nａ\n绚\n髻\n##pe\n滤\n栓\n沥\n奠\n##０\n疮\n辐\n飒\n巳\n洽\n鸯\n祠\n倭\n幂\n熠\n跷\n馈\n偌\n讼\n掳\n摁\n沪\n噼\n祯\n##ll\n＿\n冽\n##int\n瘪\n髯\n舜\n撮\n唠\n衩\n宦\n蚩\n鹃\n##20\n苓\n圭\n02\n##fr\n##ｉ\n袒\n韶\n诫\n洼\n拴\n驿\n烹\n馋\n牟\nit\n璨\n麾\n##rs\n煽\nps\n翟\n汐\n浏\n谏\n1997\n诣\n崛\n炖\n蘑\n焱\n120\ngt\n嫔\n颐\n##ti\n璀\n陇\n［\n袱\n苛\n扉\n##mi\n亥\n瑚\n簸\n眺\n禛\n簿\n锯\n唆\n汀\n煜\n磷\n1996\n褂\n隧\n恙\n侬\n睬\n##ge\n眈\n55\n娼\n缰\n蹋\n紊\n珈\n粤\n珀\n侈\n崭\n菡\n馅\n##ｔ\n頭\n墩\nji\n妤\n##nd\n##ce\n肪\n窈\n栩\nqq\n34\n75\n抡\n☆\n贿\n％\n噎\n毓\n羹\n嫦\n睽\n阙\n150\nno\n拗\n炳\n骼\n窖\n殴\n##ri\n鄂\n瞿\n鲤\n##po\n1995\n隻\n呓\n33\n砌\n##ba\n##ｘ\n玥\n礁\n羁\n哺\n蛾\n咄\n##rt\n囡\n蔼\n槛\n坂\n##sp\n##lo\n譬\n阖\n##fa\n扼\n晟\n##ｄ\nhe\n锣\n衲\nchang\n茗\n酌\n涧\n窕\n蚌\n躇\n滢\n帜\n祺\n踌\n薯\nｃ\n娄\n##ea\n缔\n鳌\n荃\n拷\n扈\n汰\n靳\n荧\n蜒\n榆\n開\n46\n##ad\n1994\n溯\n淤\n褚\n##ｒ\n##ter\n羲\n呱\n##fe\nsm\n彿\n38\n69\n︰\n痞\n萦\n莓\n汶\nst\n47\n］\n瓢\n啄\n酋\n牝\n狩\nbr\n37\n48\n冢\n佘\n拈\n烬\n妲\n讷\n镰\n玺\n吝\n惘\n珉\n萋\n饺\n42\n嘈\n400\n〖\n蕃\n〗\n1993\n俨\n恻\n忡\n谑\n竺\n毗\n##ｙ\n嘭\n羔\n氨\n還\n矗\n##ke\n飕\n垣\n夙\n撬\n##ic\n陰\n焊\n蝙\n盎\n65\n弈\n##ul\n侨\n毋\n##11\n噔\n惋\n玮\n渝\n呲\n铎\n玑\n馥\n札\n1992\nre\n俭\n吆\n##cm\n簪\n苒\n##ry\n揩\n##ｕ\nｔ\n##ia\n忒\n荀\n##ut\n瘙\n琬\n##cr\n坞\n铝\n43\n睨\n骡\n##su\n颉\n纨\n學\n疵\n鲨\n咎\n垦\n曙\n锢\n##ca\n##ur\n冕\n窿\n##vi\n珺\n佯\n動\n瘟\n56\n##ci\n洵\n靶\n柿\n3d\n52\n绾\n##ther\n濒\n帛\n硝\n碱\n蜿\n萃\n##＝\n醺\n赁\n##up\n痪\n53\n楷\n簧\n##man\n##ui\n帷\n稼\n嵋\n４\n虬\n珣\n屹\n##ga\n畴\n颌\n锵\n钵\n掣\n噙\n黠\n柬\n掂\n泓\n硫\n邬\n蜈\n##am\n袄\n／\n霏\n58\n荔\n腆\n歆\n飚\n皎\n##ｓ\n琶\n渭\n谴\n##ｗ\ncat\n##oo\n汲\n##hu\n胚\n殃\n怂\n○\n镂\n鳄\n谧\n戊\n驸\n##ec\n##con\n憧\n3000\n##gi\n□\n瘸\n﹗\n##ds\n骁\n驹\n1985\n##ir\n篱\n鹊\n喋\n④\n琥\n娲\n##°\n##il\n涓\n饵\n枷\n缆\n晾\n99\n獒\n攘\n●\n1990\n惴\n95\n澹\n62\n踮\n缇\n54\n遽\n##ab\njing\n##one\n斓\n##ｈ\nsk\n萼\n悼\n苯\nca\nsa\n51\n龌\n64\n喵\nfa\n蜕\n卞\n殉\n1991\n诬\nｓ\n會\n篆\n600\n孰\nhu\n壬\n##bu\n1988\n搽\n110\n44\n##ou\n糯\n芥\n5000\n淬\n1989\n痊\n睦\n##ns\nsh\n##ac\n##wo\n魇\n##ｃ\n涤\n籁\n57\n##op\n1984\nso\n犁\n72\n旖\n##ie\n秤\n1986\n檬\n耷\nme\n闽\n鹫\n泵
\n##ｌ\n##50\n##her\n樣\n俪\n岐\nru\n壹\nic\n##ld\n1987\n##ft\n鹭\n笺\n##26\n憬\n甭\n琵\n抒\n赃\n259\n800\n昱\n稣\n##per\n荤\n##um\n槟\nshe\n龊\nan\n噤\n##tion\n媲\n礴\n亀\n酝\n迢\n毡\n阉\n68\n玷\n49\nios\n壕\n##５\n炬\n##pro\n椭\n稷\n##18\n啮\n汨\n１０\n诶\n##22\n涔\n缨\n炊\n辄\n兒\n##hat\n蟆\nmi\n瞩\n鸢\n##tr\n180\n韬\n##bo\n伎\n1982\n榄\nco\n覃\n涸\n５\n诲\n爲\n##we\n孚\nsu\n痰\n##pi\n硅\n將\n酵\n饷\n剁\n##tu\n髦\n踞\n##out\n筐\n91\n锺\n蔻\n88\n忏\n镀\n荼\nmm\ndi\nboss\n酉\n85\n兩\nwin\n麓\n桨\n铿\n旎\n##mb\n##du\n##od\n浚\n##ph\n６\n睑\n迂\n恪\n##chi\n泱\n徵\n##dt\n闫\n啵\n椰\n肇\n佼\n##ct\n瀛\nma\n##７\n勐\n鹦\n蘸\n濮\n袂\n湮\n晤\n##19\n##my\n柠\n98\n59\n曜\n粱\n褒\n戌\n63\n41\n隅\n畸\n刨\n暨\n##art\n鳖\n##sc\n锄\n乒\n吠\n羌\n##http\n##id\n渲\n蟾\n##son\n##ｐ\n鹉\n體\n##ita\n辘\n坍\n66\n瞠\n姝\n##ine\n##bi\n獠\n胄\n蚣\n乓\napp\n癞\nav\n##21\n茴\n湄\n##cl\n##３\n蔺\n颚\nｍ\n芋\n掖\n猖\nde\nok\n凋\n##ni\npc\n钙\n唏\n##ys\n迄\n聲\n邯\n迥\n##２\n麝\n珞\nangela\n蟠\n瓮\n喏\n诃\n轧\n孺\n舷\n61\n##au\n##rd\n颦\n溉\n妪\nbe\n磋\n##６\n霭\n幌\n矶\n娑\n圃\n##rea\nna\n##cha\n橄\n嗳\n蜥\n##32\n筏\n76\njb\n杞\n怆\n炜\nal\n谛\n烽\n俟\n##sm\n##oc\n驮\n脓\n內\nbut\n1980\n熨\n67\n##14\n##im\n樵\n隘\n##dr\n##ts\n撂\n溟\nxx\n卤\n##pr\n##ru\n78\n##rn\n娓\n##13\n娉\n睐\n跤\nta\n籽\n伫\n戛\n纶\nla\n氯\npa\n羚\n莘\n##８\n笈\n螂\nto\n進\n##ag\n##～1\n沓\n##va\n##lu\n贮\n##ee\n筵\n侏\n璎\n亘\n赊\nbi\n酪\n酮\nas\n##em\n罂\n埔\n腼\n汴\n涣\n##ent\n恺\n讓\n##mp\n##all\n1978\n鸨\n鞑\n氢\n皋\n缤\n胱\n酶\n旌\n匐\n##ff\n棣\n##by\n惆\n##ang\n攸\n無\n##ist\n匍\n##dd\n73\n##san\n沅\n##ion\n##ment\n莪\n峥\n105\n##30\n##pl\n埠\n椿\n##ay\n峒\n谒\n绥\n81\n##ja\n盅\n96\n腌\n胥\n鸥\n1983\nlu\nin\n啬\n71\n##sta\n赣\nbo\n##vo\n隽\nvi\n捎\n剐\n霁\n##ka\n弼\n蝗\n阑\n篝\n蜻\n痣\n挞\n晗\n##now\n缈\n膺\n徙\n##60\n挎\n瞑\n##02\n裏\n霾\n殡\n87\nsi\n幔\n抉\n##40\n74\n##tt\n鼾\n昀\n92\nsuv\nｂ\n緊\ndr\n##ls\n耘\n〉\nmo\n94\n沂\n##23\nli\n##15\n煦\n撵\n犊\n楣\n镑\n##com\n##ex\n弑\n##wer\n缥\n##ow\n1979\n160\n瘴\n祛\n##ey\n偕\n##㎡\n蜓\n佻\n##za\n磐\n踵\n臾\n敕\n滇\n垄\n烯\n##xi\n昙\n跚\n槌\n77\n##rr\n洩\n##ef\n谙\n84\n130\n夥\n阚\n邊\n匾\n泯\n123\n祇\nda\n偃\n鲲\n間\n##ki\n⑤\n纣\n97\n〈\nlt\n豚\n谄\n憩\n啷\n煲\n101\n##80\n##ran\n1963\n爻\n##are\n镣\n柚\nha\n拄\n彝\nの\nai\n淅\n×\n胺\n赂\n蹒\n茱\n蜗\n豔\n攫\n##ue\n觊\n##16\n1981\n熹
\n７\n93\n觎\n舶\n##１\n嚏\ncd\n飓\n##han\nｊ\n痘\n##hen\n##ap\n86\nac\n##４\n覺\nyu\nga\nzh\n978\n##ya\n泾\n弋\n擞\n##und\n瑙\n##ny\n##ob\n瞰\n诠\n灸\n粼\n濑\n篡\n骰\n##yo\n釉\n８\nｘ\n泠\n揄\n##70\n汕\n芍\n啕\n篓\n聿\n锭\n砾\n##if\nle\n噶\n##sl\n250\n1964\n淄\n诩\n轲\n##lt\n1958\n║\n鲛\n137\n葩\n沖\n帚\n畹\n舫\n菈\n玟\nshi\n瞌\nyou\nid\n绔\n孵\n##ant\n1949\n摹\n２０\n桅\n從\n狎\nnba\n##jo\n##ina\n##31\n##tra\n雙\n##fl\n氲\nvip\n倜\n##17\ngui\n題\n##24\n坷\nap\n##ok\n掬\n當\n##ome\n83\n000\n1970\n姒\n鳅\nra\n翌\n氤\n蚓\n戍\nsc\n4000\n##ud\n恿\n##ms\nad\n##ty\n搡\n瘀\n皖\nko\n赳\n##ins\n搪\n沽\n##io\n岱\n##rc\n##db\n滟\n##ps\n坯\n1969\n79\n咣\n濛\n點\n黔\n與\n##ua\n㎡\n89\ndna\n渤\npe\n堇\n揶\n##ep\n輕\n掸\n炀\n廿\n羿\n蚯\n##ong\n酯\n##ｖ\n葆\n贻\n##ev\n1962\n##ive\ndo\n岌\n樟\n呤\n##ew\n泗\niphone\n##ers\n诽\n捱\n170\n■\n丕\n##rl\n阂\n嗬\n漳\n汾\n徨\nｋ\n1500\n殒\n##ip\n1956\n悖\n##ah\n##can\n3p\n##ko\n##pp\n##51\n##km\n##ye\n骜\n臆\n##mu\n##uan\n##ig\n喟\n痿\n潞\n##og\nis\n俸\n蝼\n祗\nip\n吋\n衮\n185\n榭\n##aw\n峪\n捍\n摒\n鬣\n骥\n忪\n##ian\nar\n##dy\nｗ\n猬\n轶\n##35\nba\nge\nat\n９\ngo\n1968\n燮\n愫\n琊\nka\n垛\n煊\n锚\n郸\n崆\n菠\n惰\n##45\n##ｋ\n125\n馁\n潼\nel\n102\n〞\n##fu\nｄ\nled\n}\n栾\n谤\n秧\n骊\nti\n劾\n榔\n嗝\n旻\n釜\n邝\n讹\n$\nfu\n##ub\n##90\n宓\n##ven\n涮\n見\nceo\n螃\ncl\n逵\n昶\n##ju\n1976\n##lli\nktv\n1965\n坳\n##03\nsp\n钏\n→\n鼬\n蜃\nｏ\n##33\n##des\n粽\n摞\n匝\n傥\n恁\n186\n##cc\n晁\n巽\n営\n颍\n##ai\n××\n砺\n仞\n115\n##ain\n##thing\n岖\n##xp\n罔\n韭\n187\n潦\nni\n孪\n1966\n##ill\n饪\n翊\n怼\n﹑\n舀\n##25\n##ber\n蔫\n俑\n恸\nph\n溥\n蜴\n##ation\n##ide\n##dn\n##200\n竣\n杳\n姦\n堰\n佈\n##pt\n垩\n處\n隼\n1974\n蛰\n##hy\n##der\n##est\n1960\n1977\n愛\n鳍\n′\n琰\n夔\n桢\n1957\n潍\n##lm\n怏\nwe\n##lan\n坨\n盹\n⑥\n111\n嬿\n##ugh\n##ii\n##rp\n##here\n##ast\nte\ngb\n140\n##ei\n##gh\n184\n噌\n滿\n##zi\n##91\n倌\n##sha\n##─\n108\n嚐\n泫\n1950\n1975\njohn\n##ste\n螳\n哂\n顼\n髮\n##ｂ\n1972\nam\n##75\n##eg\n钊\n##ear\n遛\n皑\npi\n臼\nya\n跛\n潢\n牠\n1200\n##ser\n沏\n─\n##61\n##71\n168\n榕\n104\n##our\n##res\n106\n帧\n洱\n##ight\n灏\n##sy\n忤\n##01\n鬃\n109\n锏\n話\n亟\n##kg\nvs\n##98\n##28\n121\n箕\n360\nhi\nnt\n阪\n垠\n逅\n掇\n##mar\nｇ\n葳\nol\n摺\n犷\n骧\npro\n缄\n侗\n##ere\n##eo\n紅\n忻\n哐\n纂\n181\n宕
\n袈\n烷\ncia\n##ore\nor\nab\n粑\nne\n邂\n炷\n##nn\n##ron\nmy\n113\ncr\n##ton\n蔗\n裟\n##tw\n##win\n糗\n1973\n蓟\npo\n槿\n馍\n1971\n##day\n鞅\n##41\n##ric\n豺\n182\n##les\n佔\n##99\n##era\n給\n﹖\n103\n湍\njj\n仄\n1945\n鬆\n1938\ntop\n122\n秆\n臃\n1955\n盂\n##42\n蹶\n糠\n##af\n邋\nss\n邺\n##red\n##ris\n锌\nwindows\n斛\n##mon\nm1\n6000\n虱\n偻\n1961\nqi\n128\n钠\n蛆\n诘\n##che\n##88\n##way\n112\n##nce\n1937\n長\n##ass\n1959\n淆\n雞\n琏\nyy\n闇\n鹂\n翱\nce\n龛\n町\n氟\n闾\n钺\n桧\n辇\n2020\n##55\n##72\n##ate\n##78\n谲\n孢\n##lle\n107\n卻\n磺\n瞭\n雉\n銮\n氣\n仨\n別\n##gen\n焖\n汛\n1954\n##land\n##sn\n1967\n##av\n門\n蒿\n漕\n##yi\n嶙\nmr\n翦\n呷\n##form\n##000\n怵\n##tan\n##dan\n蚤\n忱\n##age\ncpu\n119\n蠡\n##ith\n##ure\n##use\n175\n##ov\n瞇\nｐ\n##ell\n匮\n翳\nｎ\n胰\n牒\n枸\n傑\n遢\n珩\nlin\n##ze\n##66\n幾\n##04\n冗\n##don\nfor\n##08\n##ym\n{\n胛\n茁\n3g\n↓\n囱\n##ana\n痢\n##53\n171\n剽\n##ous\n##sk\n種\n##tal\n##min\n##gn\n張\n伢\n親\n疹\n妊\ncon\n##bs\n骞\n＃\n##06\n##shi\n##cal\nx1\n##rie\n傢\nisbn\n锹\n剜\noh\n绡\n喙\n臧\n谥\nkb\nｆ\n##yn\n##ble\nmar\n尻\n##００\n##nm\n珪\n囤\n谗\n178\n##ted\n##34\n蹩\n##62\n彗\n荪\n##tin\n1952\n衾\n跆\n##ity\n##get\n##pu\n118\n##top\n聽\n藐\n##hr\n荟\n##lla\n##43\n缜\nlo\n700\n127\n僮\n##nc\n##oi\n佝\n##ess\n146\n耙\n峋\n衝\n##64\n172\nho\n椁\n##68\n泸\n##rm\n240\n嗟\n##ction\n辍\n162\n136\n皿\n孬\n饕\n谆\nall\nfe\nintel\n165\n嗑\n遑\n##81\n柒\ned\n岷\n強\ndvd\nmen\n笆\n畿\n##ach\n##gg\n⊙\n##sd\n163\nex\n臉\n##vis\nbl\n蟑\n嘹\n126\n114\n116\n衢\n羯\n155\n129\n8000\n氮\nif\n##65\n1600\n箩\n##ｇ\n1946\n##47\n1948\n##ren\n##nes\n粲\n##oa\n179\n##07\n156\nnet\n##95\n・\n131\n娠\ndv\n##27\n浒\n厩\nㄚ\n##ten\n１２\ndb\n##ever\n##52\n##ern\n##ite\n問\nfi\n##gl\n##38\n１８\n##ica\ncn\n莆\n##48\n##ak\n叵\n荻\n夯\n##05\n##cy\n##29\n钾\n郢\n##rth\n瘁\n﹕\n##tel\n##ree\n並\nflash\naa\n淞\n東\nen\nmac\n娩\n##que\n贰\n姹\n##ji\n139\n##dm\n镁\n##44\ngl\n1953\n##ath\n##ect\n1951\n佞\n柩\nes\n135\n##tro\n衿\n嬛\n柑\n铢\n耆\n荚\n犒\n##iti\n001\ncc\n900\n##dc\n吖\n158\n##cs\n←\n##ame\n颧\n骠\n188\n铬\n##tor\n124\n151\n孱\n咤\n164\n##mg\n176\n##ave\nｈ\n缮\n绶\n1942\n350\n嵘\n堑\n睇\n1300\n##uc\n##ae\nbb\n##39\n1944\n珅\n鹞\n碴\n剎\n##den\n#
#mer\n##36\n##ard\n##76\n◆\n##lc\n##car\n镐\n潋\n220\n餮\n##73\n##ens\n浃\n渥\n岫\nｖ\npk\nlove\n饨\nsd\n眦\n刽\n169\n楔\n掴\ndon\n谩\n##ata\n崂\n##ust\ntom\ndu\n1947\n##end\n117\n奂\n橇\n庾\n##isa\n1940\n鳗\n★\n##ene\n##eh\n轼\n##94\n##82\n钛\n##kb\n230\nmu\n镳\ndd\n鱿\nci\n##vel\n##led\n遨\nim\n##49\n161\nint\n##ken\n撷\n緻\n隨\n３０\n##hin\n觞\n##lp\n##rk\n丨\nms\ntm\n豌\n##63\n嗎\n##85\n##ort\n颢\ncm\n##king\n雹\n##qi\n璜\n##54\n202\n##tom\n俎\n##men\n1941\n嘞\n馗\n##ger\n##ice\n沱\n142\n##word\n谟\ndan\n祎\n##zz\n##ek\n##ini\n##77\n渚\n##time\nwifi\n##let\n##lor\n134\n138\n10000\n##ert\n琨\n152\n##bb\n##ml\n##eng\n141\n##dp\nev\n##del\n盥\n##its\nman\n绻\njack\n##□\n蜚\n祚\n馄\n##93\nct\n1931\n##ace\n##ake\n##ans\n谀\n##hz\n鹄\n132\n嘣\n##ros\n1100\n帶\n##mc\n稔\n苁\n睥\n鸵\n##92\n擢\n秸\n酰\n##gt\n##97\n赈\n##oh\n##other\n##ani\n懑\n1933\n##len\n210\n疡\n亳\n箐\n樽\n1934\ndavid\n袤\n▲\n砥\npu\n##bra\npr\n133\n醛\n145\n##ful\n師\n##sch\n哔\n佚\n##over\n159\n迤\n斡\n##67\n剛\n1936\n镌\n崴\n##how\n##09\n##ws\n枳\n碉\n##love\n##ple\n166\n##46\n⑦\nthey\n囧\n苻\n瘠\n跻\n诋\n##ies\n177\n##96\n##lon\n淙\n##tle\n173\n##lin\n皈\n佗\n##ien\nsam\n塾\n榈\n##ings\n1800\n2500\n##69\n##rin\n##dia\n濠\n1939\n準\n麗\n聒\n##aa\nak\n卟\n##100\n艮\n蕨\n2019\nc1\n宥\ngi\n##gan\nmon\n陣\n##net\n荨\n##wan\n昇\nthere\n铣\n帼\nchapter\n胫\n觐\n##56\njo\n144\n邰\nii\n##world\n##ough\n饯\n##ute\n##86\n嶽\n丶\n谪\n##ire\n1943\n鄞\n鹘\ncs\n鏖\n鲫\nnpc\n##row\n##oy\n##gb\nhp\n##yu\n鳝\n祐\npp\n##89\n殓\n143\nⅱ\nmp\n裨\n￥\n##oe\n弭\n##port\n##ima\n5g\n兖\n##lic\n##tm\n槃\n##uch\n##zo\n膻\n蕩\n##sse\n佃\n##yan\n杈\nwhat\n變\n幹\n褲\n０\n迩\n１１\n167\n诰\n154\nwang\ncar\nsaid\n谚\n闆\n蛹\n##ram\n##ker\n玳\nibm\n藜\n##rf\n##cam\n茯\nchina\n瓒\n纰\n谯\nva\n實\n褓\n骛\njason\n1927\n悽\n豢\n##tb\n##las\n５０\n260\n碘\nmp3\n##ari\nweb\n##nk\n4s\nop\n抨\n##ane\n148\n##wen\n則\n1900\n##sco\n弁\n啻\n焜\n##lv\n##74\nｉ\n##ler\n##fc\nusb\nea\n讫\n鹜\n##fs\n榫\n锂\n绦\n彧\n檄\ngps\n篑\n赡\ngdp\n149\n##mus\n裾\n##back\n铂\n##bee\n##ley\n诙\n踊\n蟋\n##tos\n藓\n##bor\n1935\n##ug\nem\n闰\n7000\ntv\n1930\n##ise\n浔\n1932\n腳\n204\n157\n##val\n莅\nｒ\n##rus\n##van\n缱\n裱\n##cp\n斫\
n連\nep\n##84\n##79\n俾\n##ces\nvol\n熱\nll\n砧\n罹\n1920\nag\n陲\n147\n擡\n##x2\nwi\net\n##ku\n##more\n１５\n痔\n##pan\n##ase\n##ena\n##ash\n211\n##lina\n##tch\n幄\n禺\n320\n##ora\n邛\n難\n骈\nil\ntt\n嬌\n174\n365\n囗\nyes\n##ona\n猗\nking\nhd\n##ose\n750\n蛭\n##ura\n俚\n驷\n滦\n##ula\n國\n隍\n◎\nos\n##ves\n##ale\n##83\n棂\n萸\n§\n##pm\n蚝\n##57\n炅\n箴\n侑\nmv\n##87\n##gs\n涝\n##cer\n##urn\n153\n薨\n##sen\n##ala\n##2000\n屍\n酚\n鄱\ner\n襁\n3500\njames\n陽\noppo\n##ano\n蚱\n##tar\n淩\nsan\n1928\n##cent\n##tto\nig\n##ral\n槁\n茭\n酗\n荥\n##tes\n##ster\n1929\n##∶\n##ical\n##low\npeter\nxp\nben\n##37\nipad\n腓\n256\n##hd\n砣\n烩\ngoogle\nｙ\n蹴\n圩\n##bar\n##nan\n痫\n##gar\n##mes\n焙\n＞\n應\ncp\nns\n##nic\n猕\n１６\n黍\n##ism\n＋\n睢\neric\n##ish\n係\n鹑\n犄\n##bet\n##ult\n焯\nｌ\n▼\n劭\nbar\n楹\n扪\n##lly\none\n嫚\n##58\n蟀\n淖\n纾\nexe\nbt\n黜\n##less\n僖\n擀\n钜\n##ade\n##tic\n寮\nｅ\n##urs\n##rid\n嵇\n##plus\n轉\noo\n##als\n##lie\n##ons\ntw\n##act\n蛐\n##mic\n潸\n孑\n##ics\n##tl\n柢\n蓑\nmba\nを\n##ime\n滓\n箸\n懋\n189\nmc\n蘇\n烊\n##ン\n##sis\n郦\n荏\na4\n##fer\nds\n苡\n饬\n##work\nlan\n##ara\nken\nthat\n徇\nspa\n殁\n铤\nβ\n腩\n喱\n##gin\n##lf\n氾\n##kin\nkk\n讚\n栀\n笞\n菅\n興\n##cle\n##sit\n锲\n##ks\n嶂\n虢\n##ile\n##sion\n濯\n″\n嬅\nt1\n耦\nts\n泷\ndl\n挝\n1926\n##ving\n##lia\n##59\n1400\n##tc\n201\n髒\n饽\n##zu\n龈\n邕\n##ー\n條\n琮\n複\n##の\n##ama\n囫\n铆\n##ood\n腱\n##np\n榛\niso\n箔\n唑\nau\n##bus\n##bre\n蛀\n212\n##ner\n樾\nah\n##life\ncan\n##hn\n糸\n##cd\ncindy\n沣\n骐\n##ory\n##ling\n##bit\nlg\nbook\n雖\n##tp\n##app\n##chel\ndc\n190\n##pc\n缢\n##anda\n⑧\n颛\n##ost\n##ool\n##new\n##orm\n##ture\n谌\n瀞\n##down\n恫\n噁\n採\nmk\n203\n獭\n陂\n##live\n暹\n##own\n##lam\n##ink\n狒\n280\nc2\n##come\n刎\n蜍\nlv\n佰\n晞\n301\n##house\n膘\n篙\nか\n擘\nir\n##ark\nhigh\n谶\n##ial\n##lar\nmiss\n##een\n##die\n絲\nbc\n##night\njava\ndx\n儆\n紮\n450\n纫\n镭\n菖\ntim\n咭\n焘\n##rley\n##ctor\n罄\n##dra\n##600\n睪\n##nar\n##ich\n##ward\n毂\n##ier\n##light\n##ual\n悱\nexcel\n吡\n##ean\n峤\n##book\n0t\n栎\n##ock\nhr\n瑠\n##kw\n1912\nbaby\n圜\n205\n##ght\n拋\n##001\n##bert\nscott\n##ox\n⑨\n孀\n##mhz\n##line\nmichael\nㄧ\n##hs\nfacebook\n羟\nsb\nv
e\n痨\n犸\non\n菏\n##ola\n##set\n##hl\n鄢\nbug\n珲\n榷\n206\n##star\n##500\nlol\nchi\n钨\nm2\n飨\n1917\n##vin\n豉\n滂\n撫\n獗\n酊\n##～3\n冼\n刍\n漸\n鳃\n##tf\nbill\n邈\n苷\n斷\nabc\n歡\n##ida\n°\n關\n醚\n##ann\n佥\n濕\n##ance\nipo\n徉\n羸\n##air\n雎\n桎\n祢\n##llo\nmax\n缙\n##ond\n##key\n細\n偈\n梏\nng\n龍\n脫\n伉\n徕\n002\np2p\ngod\n磬\n1919\n##nge\npaul\nby\n唳\n馬\n楂\n魍\n纭\n##bot\n肽\n##ence\nは\n涿\nvr\n##hing\n270\nnew\n龜\n祜\n##uy\n##た\n5m\n艷\n##ign\ns1\n##med\nfan\n##cia\n208\n##ead\nに\n栉\n##ky\n鎏\n歙\n氰\n##well\n##side\n樂\n纔\n##force\n##～20\n痂\nwho\n207\n衹\n##か\n仃\n锉\n雒\n##ein\n歳\n##ses\n##ik\n##face\n＜\n##jing\n萘\n##oto\n##nder\n恥\n蛎\n209\n燧\n1m\n虻\n##ht\n##mit\n甯\n啖\ndj\nb1\n##tions\n風\n##ould\n##bin\n瓯\n車\n##nter\n##ding\n徜\n蹉\n##lis\n##mn\n##ick\n##ese\nsr\n桠\ncf\n##hai\n薏\n桉\n##а\n癣\naf\n钿\n##ndi\nwhen\n##put\nles\n阕\nxxx\nmobile\n拮\n脍\n涪\n咲\nα\nins\n##eme\n臬\npart\nlisa\n##☆\n20000\npre\n許\n軟\n鸪\n僭\n##ｊ\n##zy\n##ino\n離\n庖\n##tte\n##ning\n婺\n蹇\n鹌\n##dar\nvcd\n赝\n1700\n癡\n##ope\na1\n汞\n濂\n熵\n1914\n##lee\n橹\n咩\n##ery\n213\nchris\n##boy\n矾\n221\nf1\nmb\n225\n226\n##lls\n獐\n##ode\n##ring\npen\nly\n##tv\n鹧\n##play\n##llow\n矍\n218\n蝌\n##nch\nm3\n奮\njoe\n墉\n##vd\n##uk\n｛\n1924\nae\n222\nmt\n靼\n##nis\n電\n瑁\n##mark\nthis\n酩\n｝\n荞\n##nda\n##し\n##ラ\n##rge\na6\n澍\n卅\n歎\n豐\n勖\n##pin\n１４\nmd\n##ats\nmin\n##ket\n214\n噻\n猷\n1925\n◇\n芪\n溴\n啶\n祉\n520\n##view\n370\n##ｆ\n铄\n245\n##lk\n澧\n書\n##gle\n##joy\nｑ\n232\n##750\n##vs\nhow\n255\n摀\nvivo\n##wn\nfbi\n1918\n##cel\n216\n##ium\n饴\n糅\n##bel\n228\n##jia\n231\n235\n〇\n铡\n珐\nwu\n##ham\n##ray\nred\nparty\n頂\n##ress\n4g\nsimon\n1921\n滁\nwin7\nsex\n##ル\n##kk\n硒\n硼\nww\n讧\n##ound\nvista\n戶\n淒\n芩\nie\n觥\n##ions\naaa\n242\n251\n1911\nⅰ\n330\n淨\n旮\n##gy\ndiy\nbest\n007\n髋\n2200\n1922\n莖\n噱\n麵\n秣\n302\n蜇\n肱\n215\nnot\n##nor\nlily\n##0m\n##400\n##able\nup\n##said\n4500\nfont\nlogo\n##と\n痈\n1909\n1923\n##800\n##bc\n〝\n鲈\nwith\n唁\n復\n溫\nyi\n‧\n##v4\n極\nhan\n##μ\nfc\n芈\nrpg\n檎\n999\n傳\nt2\n葭\n227\n##nia\n谔\n牍\n莒\nkg\n##home\nrobert\n##tter\n埂\n##deo\n視\n##sol\n媞\n310\n##sun\n锰\n##vi
ce\n##ook\nphotoshop\njs\n268\n512\n##ull\n赭\n眞\n醐\n195\n粳\n诟\n##ux\n蚪\n2400\n氐\n遒\n機\n煨\n##bt\nv1\n盡\n絕\n惇\ncv\nyang\n##x5\n诿\n馏\n铨\n##ments\n蕲\nofo\n膑\n⑩\n##dio\n靜\n沢\ncho\n##bell\n##cat\n650\n##ters\n钎\n醍\n##count\nplay\nsci\n##lay\n##ike\n##right\n醴\n蒨\n##ung\n##mate\n1915\n螨\n##ison\n阆\n吶\nmg\nnu\n鲑\n續\n##ume\n##900\n灞\n##ett\n疟\n镍\n##ix\nwill\n﹡\n480\n##→\n疽\n##bat\n##last\n4gb\n鐘\nib\n##hc\n飢\n阡\n笏\n312\n264\n303\nrock\n屐\n##～10\n##cky\n牦\n219\n241\n##xx\n砝\niv\n１３\n蝈\n##log\n郴\n##fun\n緩\n##ico\n##ness\n325\n##ors\n##nne\n△\n340\n荠\n##ved\n膈\n380\n霰\nhello\ncj\n箝\n赓\n##hop\n##ス\n##hm\n##berg\n鳕\n1906\n##rect\n1910\n壓\n##ez\n##ller\n遴\n##wood\n##イ\nrn\n253\n##rry\n厝\n##lr\ncos\nb2\n仝\n290\n終\n溧\n〃\n##tech\nlet\n骅\n##bon\n##gp\ntb\n餘\n珮\n##tti\n##n2\n306\ngp\nsg\n229\n2100\ncad\n葺\n##xxx\n##kv\ndm\n宮\n苋\n洙\n亂\n該\n○○\na3\n##rio\nmvp\n##и\ngame\n騷\n｜\n脣\n蘭\n總\n黃\n鲶\n720\n##cn\n邳\n##free\n苕\nxa\n1916\n010\n233\nus\nbp\n##tz\n铉\nnote\nart\n蕤\n##aft\n铀\n##ule\n椽\nff\ncbd\n艹\n##wu\nof\npd\n15000\n嵬\n腭\n＆\n305\n##ク\n##eep\n稞\n窠\nback\nvivi\n850\n##lio\n2m\n眾\n##fig\n##ince\nmay\n##ary\n##300\n##reen\n跖\nfm\nsf\n224\n##nts\n##ada\n234\n诤\nlee\npt\n223\n236\n238\nvc\niso9001\n仟\n304\nlp\n牆\n265\n##a2\nui\n傩\n昴\n焗\n269\n產\n##wei\n藍\n##mw\n蓼\nrs\n佶\n##gm\n308\n##chen\n悌\n3c\n赅\nkevin\n##く\n２４\nsat\n麋\n疣\n##ize\ns2\n挈\n##●\nrichard\n333\n##0mm\nnon\n1905\n綦\n＝\nedward\n##m2\n##ape\n踟\n馔\n##cher\n槎\n##lab\n稹\n##rian\n掛\n靛\n430\nsao\n217\n252\n##lus\n蛳\n舂\n1908\n##～6\n12000\njane\n##head\n語\n##cks\n##erry\n##vc\n場\ntd\nい\n##フ\n199\nmark\nｚ\n237\n旃\n邨\n痍\n鹈\n8gb\n託\n恚\n##hua\nsir\nnp\nhis\n1913\n2gb\ngm\n299\n冇\nb2c\nmatt\nml\n##700\nword\ngre\n##self\n琇\nhappy\n痺\nlc\n550\n##120\n⌒\n262\n##hip\n驚\n順\n##py\n##ways\n##い\ninternet\n##lot\n##ria\n數\ncome\n婦\nrt\n243\n246\nⅲ\n跎\n瓤\nsuper\n410\n記\n暝\nmini\n020\n摈\n夾\n##aid\nppt\n##wt\n##oon\ncenter\n261\n292\n賤\n530\n1907\n352\n420\n蛔\n249\n##ius\n杷\n識\n##tis\n315\ntf\nlive\n263\n結\n詩\n##hang\nwas\nmaggie\nver\n##mber\n傣\n##dom\n#
#pass\nangel\n##①\n##rap\n2600\n##ious\n272\nmicro\nbit\nautocad\n碣\n##lock\n311\n##cord\n##uel\n##oid\n芡\nwei\n顯\n耒\n2g\n達\n##eet\n288\n郧\n莴\n##kit\n歲\n2300\n##read\ndel\nｕ\n揆\n≥\n蠹\n##き\n態\n囹\n9000\n囿\n1902\n＠\nhk\n##urt\n##osa\n##rip\n##cts\n楮\n##シ\nnow\n芫\n##е\n440\n##に\n製\n##ｑ\n279\n618\n湫\n196\n钴\n雲\n278\n溼\nits\n響\n##ques\n361\n瑀\n##md\n##ude\nmaria\n##rial\n##▼\n2d\n316\no2o\n##ack\n##tional\n##test\ngsm\nsim\n##point\n244\n寫\n煸\nfree\nleo\n##gc\n##123\nher\nnvidia\ngs\n560\n510\n潤\n～1\n##rain\nec\n##う\n##a1\n塬\ncctv\n顫\n##mand\nft\n壅\nsun\nalex\n愆\n275\n##press\nalbum\n##ded\n搖\n3m\n##body\n觸\nandy\n0l\n铰\n##о\n畦\n5mm\n580\nasp\n248\n247\n666\nm4\n##010\nend\n##max\n辊\n粕\n005\n奘\n擺\n嚥\npop\n阇\n##ject\nf4\n捌\nwto\n砒\nmail\nipod\n岬\n198\n玠\n篪\n##｜\n273\na2\n##★\ngood\n276\n##3d\n321\n309\n倬\n戕\n肓\n307\n浜\n##city\ngif\n271\n##レ\n680\n911\nx3\n296\n##tee\n牯\nrp\n煅\n迴\nget\n192\n祕\n繼\n##ular\n##tore\n喆\n266\n##ア\n疱\nimg\nchen\n擊\n捨\nx5\n##tus\n##gas\n055\n噹\nt3\n劉\n##r3\n郤\n##ible\n193\n螯\nset\n欸\n苣\n323\n苜\n283\n##ira\n菘\n嘚\n橐\n5l\n820\n1903\n##н\n##2010\n##ford\n##nse\n薹\nhtml\n裝\n圻\n##ら\n辦\n繇\n191\nshow\n抟\n圓\n285\nvan\nmx\n請\n爰\n##oz\n4m\n瞋\n##oot\n枰\nthinkpad\n芃\n##logy\n##ations\n1898\n##tive\n挹\n钤\n##テ\n鬍\n2700\n##vr\n##ssion\n孛\n##may\n003\nbig\n##nel\n尕\n322\n篁\n腦\nv2\n砷\nfrom\n##cture\n枇\nbob\nreal\n佢\n濬\nlook\n##ming\n亞\n##ん\n##works\n龋\nwhy\nabs\n菟\n圄\n##room\n##tta\n齊\ncam\n318\nuv\n鼐\npv\nout\n樓\n幫\npan\n≤\n##wang\n##ova\n2800\n噴\nbat\noffice\ng1\namy\n疸\n1895\n342\n砭\n貼\n楫\n##sky\ndream\n儘\nrom\n洄\n∩\n##ping\n780\n1a\n萬\n＇\n鲷\nbbs\nkiss\n飯\n##て\ncarol\n540\n羧\n##mann\n30000\nalt\n197\n堅\n##late\ngay\nlinux\n##vic\nと\n##nos\nｏｋ\n1904\n癜\n讴\ncba\n飛\n710\npet\ning\n庠\ncall\n080\n蚜\n5t\np2\n##○\n##～5\n980\n4a\n267\n寶\n254\n論\n逶\npg\n6p\nvisual\n##ush\nunit\n008\n舛\n##box\n犍\n##ball\n陳\n##リ\n柞\n950\n醮\n372\n##raph\n##eam\n282\n286\n328\n##な\n##xy\n遠\n墀\n1901\n##rey\n194\n##hp\n##nap\nmsn\n3200\namd\n339\n##cept\n##list\n##fit\n洮\nf2\n##cker\n985\n##ath
er\nmix\n##hur\nadam\nerp\nef\n##は\n貅\n閒\n嗪\n408\ne1\n331\nua\n281\n沭\ns3\n調\n##ware\nb2b\n287\n##lish\nray\n跹\n##rite\n298\nmicrosoft\n隱\n011\n##ering\n縫\n盪\n##￣\n畫\n291\n疥\nthomas\n竜\n堀\n314\n863\n##rent\n##hoo\n##fw\n6m\nob\n誘\nwhere\n##times\n##ト\n蓿\n薙\n佤\n乩\n柘\n茏\n##name\n趺\n钼\n390\n铩\n366\ncosplay\n竽\ncb\nesp\n##110\n單\n##も\n獾\n##alk\ntc\n460\n##ets\n妣\n401\n華\njust\n阈\napple\n1897\nio\n##oud\n芎\n鼹\n楊\n莠\n##к\nnc\n﹍\nsome\n↑\n##ない\nき\n##ube\n050\niii\njan\n約\n淦\nbonnie\njun\nkey\n##101\n1888\nplc\ns6\n棹\n郅\nul\n##ald\n勢\n5a\n晷\nlike\n006\n6l\nし\np1\n呎\n3600\nmate\nrc\n啉\npvc\nhtc\n##ax\n##pd\nice\na8\n烃\n鼋\n疝\n362\n289\n漲\n菀\n##stone\n##＋\n##ration\n##fly\n##mini\n##hone\n1～2\n##ᅡ\n335\n##rant\n##ail\nalice\ndes\n##iel\n2～3\nblue\nc4\ntony\ncx\ndt\n257\n彈\n琍\n1250\n炁\n鹕\ndec\n355\n369\n孳\n##ito\n412\n喹\n##uck\n鐵\n陟\n襬\n##nger\n##↓\n趸\n338\n##field\nfrank\nkl\n囉\n##mah\n470\n愍\nc3\nwilliam\n##flow\n324\n耄\n640\n绀\ntcl\n##～8\n##n1\n1894\n嬢\nwell\nover\n郜\n嫻\n284\nnb\n##160\n##lace\n##り\n##mond\n313\n﹔\n295\npos\n239\n##web\n蘼\n##zl\n##bank\n閉\n277\n膚\n咁\njr\ninc\nthree\n楝\n##type\npm\n332\n週\n夠\n##base\n818\n##р\n叻\n1890\n##す\n##ま\narm\n勁\n羨\nh2\n換\n##ily\n351\n402\n##ience\n##cus\n確\n镗\n630\n怙\ndoc\nbuff\n##cing\n桿\n杼\n釐\n5℃\n縮\n004\ndota\n铵\n##mv\n025\n##group\n##cil\n329\n洹\n桁\n蒹\nbbc\n廝\n##o2\nact\n258\n霑\n##tone\n6500\njournal\n860\nsmart\n稗\n##2007\ndnf\nr1\nmacd\n375\n適\n326\ncore\n醪\nく\n603\n榉\npose\n蜊\n021\n334\n￣\n358\nsky\nsunny\n294\n345\n龅\nbd\n5cm\nstar\n卍\n異\nsata\nnext\n##stin\n鹳\n##2c\n##с\n岘\n甾\n388\n凈\n##ream\n##る\nrain\ntan\n##ged\n錯\nopen\ntel\nuc\n懷\n薅\n##jin\ns7\n啱\n癢\nhome\n##ition\nsoho\n368\n镛\n##cast\n##uma\n274\ndie\nrose\n##card\ni7\n015\n601\n009\n319\n備\n萁\n##hg\n##iu\nare\n苫\ntvb\n570\ngd\n瘢\nair\n跡\n##try\neva\na5\nhttps\n鈺\n1024\noa\n##480\npower\n##talk\nalan\n線\ngigi\n徳\n##jun\n##vas\n鳳\n##█\n##＜\n垭\nフ\n##west\n谘\n##◇\nmaster\n1g\n359\n##board\n試\n405\njean\n崧\n1gb\n100g\ntype\n386\n畲\n狀\n##～7\nmodel\n##mall\nq5\n##dge\n稜\n漯\n
810\n1840\n嵊\n##520\n660\nzero\n012\n義\n願\n徼\n730\n##cg\n杓\nrmb\n刈\n顆\n慄\n555\n##250\n960\n±\n418\n3a\n327\n415\n﹝\nseo\n013\n##ロ\n麂\nsoc\n1899\n芾\n297\n##nie\nfans\n620\n##oper\n##tag\n##cation\n341\n穀\n領\n##mine\nqs\n4l\n聞\n382\n湟\n##vm\n趕\ngeorge\n##pop\neos\nγ\n蜢\n##ias\n501\n網\n##care\nxxxx\n較\n認\n293\n鍊\n﹞\n嵯\nblack\nemail\nh6\nfirst\n888\n椋\nsteve\ncharles\ntwo\n##ette\n倆\n訴\n瘋\nltd\n028\n##コ\neb\nwar\n##み\n353\n貔\n撐\n羅\n##wd\n觀\n哏\n840\ngc\n##さ\n矇\n昉\nx2\n圍\n##ument\n燊\n##150\n502\n瑤\n濃\n##в\njim\n##rade\n##nsis\nhong\nlz\n##data\n鞣\nella\n痤\n枋\n384\nsas\n錢\nmike\n1884\n422\n760\n357\nmod\n##lent\nafter\n385\n019\n##core\n##カ\n##zon\n1860\n398\n疖\n890\nv8\nvar\n儋\nsong\nchan\n##dden\napi\nhold\n苄\n317\nn1\n楸\n1880\ngalaxy\ntina\n碛\n##360\n##zen\nmarc\n1850\n5500\n432\nck\nlady\n##ology\n##stro\n354\n珙\n珥\nyour\n##■\n7500\n1886\nauto\n2030\n##ᆫ\n樑\ni5\ne3\n##a5\nkim\n♀\njpg\n##note\nな\nmary\navi\n戰\neq\njp\ndos\n367\n610\n傷\n##site\n襪\n3300\n＾\ncg\n俳\n##pad\n郫\n蓁\n##beth\n##oman\n痧\n2b\n##⌒\n##wing\npass\nて\nた\n瘘\nufo\ndaniel\n談\n##ney\n376\n363\n513\niq\n022\npolo\n渾\n翹\n2022\n343\nexo\nh5\n344\nder\nあ\ndll\nh1\n1280\n1080\n菸\n##coin\n蓋\nmit\nonline\n394\n488\n##tsu\nebay\n20℃\n‖\n軀\n顏\n##itor\n##ｚ\n讀\n甙\nthan\n‰\njul\n##║\n術\n##file\n碁\n##media\n##hk\nacer\nㄛ\n##つ\n4k\n1001\nshirt\n518\n##aka\ngg\n氪\n桡\n##gma\n麸\n690\n348\n378\n##erson\n379\n惡\n##walk\n襠\n##burg\n盱\n8mm\n##ppy\n##lling\n##rma\n##ello\n謝\nrx\n701\n1893\n##tein\n丟\n442\n##3c\n##ィ\n500g\nつ\n啟\ntft\n爺\n哌\n鑽\n2a\nbas\n炔\nliu\nky\n誰\nmake\n課\n411\n381\nkelly\n★★★\n##999\nbot\n脘\n##watch\n##2008\n肄\n##こ\n藥\neasy\n衆\n347\nnova\nstart\n##ヒ\nシ\n##via\n399\nsea\nf3\n堃\n3800\n456\n軍\npci\ncase\n346\nqa\n##ference\n1896\npsp\n407\ninfo\n##zer\ne2\n409\n625\n##ground\n802\n##ury\nmp4\n##news\n︶\ns8\n維\n389\nwind\n414\n6a\n掙\n029\ncoco\n節\nrd\n##mma\n##fx\n泮\n##ella\n027\n1v\n潔\n併\n頓\n嬗\n##ウ\n垚\n##ries\nangelababy\nrna\nrm\n酢\n櫻\nり\n##を\n404\nkd\nm5\n##uid\ns5\n18000\n395\n##table\nア\n337\n371\n繩\npdf\nl1
\nphp\n盤\nhotel\n務\neur\n##°c\n670\n737\ndata\n##マ\n報\n扦\n況\n522\n钯\n戡\n1870\nさ\natm\n##nny\n##lex\n##1000\n024\nrf\n364\n朐\n##ᅵ\n##uard\n壞\nuber\n100m\n25000\nstep\n##タ\n捲\nwap\n钣\nxl\n歷\n##②\nkm\n##drive\n503\n##〇\nfly\nbrian\n頫\nlow\n356\nhip\n毽\n決\n##tina\n遊\n490\n吳\nlog\n##━\n##ハ\nuk\n嗚\nφ\nmatlab\n##wf\n▪\n##uce\nfind\nsee\n劍\n##け\nharry\n458\nⅳ\n##rman\n燄\n373\n賞\ni3\n钒\ng3\n貝\n聳\n590\n癒\n336\ntwitter\n稱\n帑\nlink\n419\nmartin\n##tical\n396\nsql\n413\n2025\nhero\n顧\n970\n##load\n438\n鮮\n50000\n儀\nspace\n023\nlcd\n臥\nemba\nlevel\n廳\n葉\n倫\netc\n##r4\nけ\n菽\n740\n416\n##2009\npcb\n纏\n##ᄋ\ncup\n426\n433\nallen\n聖\n脹\n830\n靈\n陞\n藿\n運\n349\n377\ntcp\n1350\n504\n暈\njay\nlong\nwhite\n##ement\n##ᅩ\n殭\nv3\n374\nmoto\n##oka\n優\nnhk\n##pus\n嘌\n倖\nalpha\nlouis\n轭\nright\n講\ngis\n註\n##１０\n##α\n##キ\n鹹\n##tics\n戲\n##л\n祂\n☆☆☆\narea\n專\nsorry\ndiv\n1889\n疔\n##atic\nfb\np3\nsize\n籐\n##page\n6g\n##￥\n455\nvideo\nhenry\nwest\n孃\n燔\n##tant\ngreen\n計\n黒\n淚\n425\nhot\n摟\n1867\n3ds\nwow\nadc\n435\n##ties\n495\n夢\n釦\n465\nう\n##⊙\npin\n##shop\n綾\n伕\nω\n嚇\n1885\nmpv\n##vy\n##ads\n＼\n蟲\n瓏\nvga\nping\n蓖\n##ift\n516\n億\nm8\nxd\nups\n##ship\n##nus\ntime\n##ils\n392\nmtv\n##zone\nzen\n768\n##vg\n1234\n涞\n躪\n際\n簡\na7\n擠\nrq\nieee\ng5\n##2d\nwindows7\n##rum\n蕈\nhas\n##ナ\n葚\n515\nqueen\n##2012\nace\nmore\n⑴\n880\ncmos\nc5\nlife\nadobe\n##bow\nd3\n30g\nbye\nb5\nmap\n卮\n##mple\n飽\njeff\n溱\nlight\n荽\n猶\n紀\ntake\n##www\n##lone\nform\n蔘\n##ika\n涼\nusa\ng2\nㄟ\nlab\n10℃\n511\n451\n##ctive\n唸\n8g\n盧\n397\nfun\nemi\nbmw\n##yer\nstyle\n1882\nbang\n403\nclass\n##ェ\nhard\norg\nd1\nimage\nsay\nnews\njpeg\n禮\n##stry\n##mix\n買\n湧\n##ique\n32gb\n2900\n姫\n100km\n##cket\n777\n##mobile\n747\n##town\n豇\n##ⅱ\n##±\n##chat\n槭\n氦\n##link\n姪\noem\n骶\n##ader\nイ\ng6\n##dog\nide\nnever\n痠\n★★★★\n員\nhdmi\n呋\n輪\n##ひ\n8cm\n499\n387\n號\n##tax\n醫\npost\n褫\n427\nanna\n謂\nfile\n##gio\nb6\n##2011\nk2\n##そ\nidc\n508\n506\n撥\n##3000\n##host\n##next\ncpi\ncandy\n##ider\n##ω\n樺\n舉\n1010\n417\ninternational\nann\n##code\ngmp\n##sent\n2
k\n査\n趙\n脩\n521\n##space\n荸\ntoo\n漢\nq2\n巖\n##dk\n##yy\n##lines\n棗\n頸\n543\ntech\neco\nboy\nppp\ntop10\n閃\n｀\npage\n##baby\n##ncy\n吽\napps\n簾\n##＄\n##sia\n10g\n2cm\n453\n4800\n##168\n酐\n槲\ng7\n2mm\n屬\nπ\n##iko\n##370\n乂\n驗\namg\n429\n禪\n嘖\n##rmin\nfire\n##rris\n##cca\nwp\neps\n厲\n燒\n505\nseven\n##eting\n##asia\n嫘\n蜱\n镕\n暸\n鲟\n##urg\nworld\n耋\n##rder\n16gb\n##qq\n##tty\n##ville\nnas\n468\n##ツ\n061\nparis\n##т\nd2\n颞\n痼\ngreat\ndesign\n耨\n##5s\n870\ndom\n770\n樯\n醯\n485\n##map\n須\nanne\n073\n##tment\ngrace\n養\nvia\nmall\ncdma\nsweet\ngpu\nト\nmandy\n肅\n406\nv6\noracle\ncss\nyoutube\n籼\n##011\n448\njerry\nrfid\nxml\n3100\n705\n殘\nplease\nコ\n##moon\nmsi\n綑\naccess\n##tton\n##text\nれ\n腈\n##↑\n辺\ngucci\nmagic\nitunes\ninn\n##ql\n岙\n##ews\ncrm\n歸\n391\nport\nram\n##あ\n##れ\n##url\n20g\n鲢\n##mbps\n475\n嘆\n1868\nquick\nonly\nold\n626\n##rner\n闕\n虛\nee\n側\n##tation\n##lution\n##nba\n1871\n##ニ\n##ᅮ\n1111\nneo\n綁\n勝\n蘋\n⑵\nrick\n595\n##dget\n##pper\nway\nbus\n癮\nplan\n##っ\n2050\n層\n15g\n檗\n繫\netf\nx6\n糍\ngtx\njennifer\n燈\n叡\n瀰\nカ\n428\n戦\n711\n滾\n599\nhere\n藝\n##ston\n588\n835\n##fet\nproject\n業\n璁\nunder\n褔\n##888\n##ature\n纖\n質\n繞\n##ち\nsony\n純\nみ\n##pact\njoseph\n##gate\n##у\ngov\n8k\neleven\n3～5\n##012\n##uality\n##google\n##м\nmusic\nq7\n##tail\ndemo\nる\nひ\n枱\n##ager\n498\n##verse\n塚\npad\n钡\nblock\n獣\nchinese\n尷\n##д\n##ging\n矽\n褻\nthink\n4200\nlock\nエ\nbenz\n##bay\n÷\nread\ngary\nbh\namazon\n瑩\nmoon\npvp\n嚕\n##lts\n眙\nsteam\nrun\n##iki\n##dot\nn2\nfx\n##ット\n資\n燙\n﹒\n笹\npaper\n3400\nハ\n5v\n畀\nccd\n﹚\n3t\nselect\n##365\nplus\n##ᆷ\n喫\n嘧\n獨\n##よ\n##や\nこ\n##pn\n688\n禦\n鏡\n1080p\nwan\natp\n##ᆨ\n℃\n瓴\n↓↓↓\noffer\n産\n1020\nbad\n##ator\nμ\nhave\nabout\n##リー\n團\n##aine\nδ\n##bility\nl2\n##js\ns4\n5k\n##≤\n臨\nkate\n##モ\nsummer\n★★★★★\n槍\n锑\nfox\n彎\n##ール\n##え\n##hia\n##007\n2021\n曉\n碇\n習\nfda\ndha\n##＞\nrich\nclub\n锆\n皴\nland\nbell\n環\n﹙\n荳\n5s\nname\n壯\n##tier\n1b\njustin\n801\n5d\n383\n##mail\n隠\ndns\n塊\n氖\n廣\nま\n##わ\n睏\n##food\n擁\nbox\n920\n秭\ndown\nidea\n525\n樹\nlist\ne6\nlaw\nocea
n\n##unch\nctrl\n##ᆼ\n痱\n罷\n##ホ\nfri\n寧\n##tory\n墒\n骯\n##せ\n##sson\ncharlie\n膩\n798\n##メ\ng9\nlittle\n##gion\n##lter\nnana\nday\ntue\ncare\n氙\n445\n頰\ndoi\nス\n##nix\njoy\nengine\nす\nら\n▽\npay\nco2\n##ᄉ\n##013\nnasa\n蒡\nって\ntouch\n421\n啓\n##ners\nq1\n園\n##ories\nmysql\nhop\n790\n灬\n錄\nnice\n討\n855\n徹\n嚴\n暱\n##jie\n##＾\ntai\n喲\noled\n舊\n朮\n镉\n##ッ\ngirl\n貴\n銀\nbase\n3～4\n脲\nク\n睜\n##チ\nkbs\n50g\n##ふ\ntips\n選\n負\n##◆\nedu\n鬚\namerican\n457\nλ\n碩\nbreak\nuniversity\n##2013\n##iner\nloft\n舸\ncross\n611\n蹼\n##ɡ\n##bian\nmost\nb12\n綿\n類\nming\nサ\nhoney\nldquo\nnike\n恆\n075\nesc\n##gnet\n慘\n##ンク\n鎖\n925\n舖\n璿\n##～17\n標\nε\n靦\n##jiang\n##ㄟ\nyoung\nforum\ntimes\n獸\nwater\n偉\n爭\nheart\n##580\noff\n筆\n930\n##bia\n##サ\nreview\n鑑\nstring\n窨\nも\n臺\ninstagram\n緣\ncnn\n393\ndsp\nqt\n蚵\nspring\npython\na9\n傾\n罵\n湊\n胍\nb3\n##khz\nfrance\n661\nturbo\n867\n參\n憐\n兇\n##お\nhuang\n暌\nbella\nhey\nvisa\n暢\n顔\nandrew\n廚\nnick\nlast\nisp\n##public\nmulti\n##п\n襲\n##イト\n##dium\n20mm\n441\nほ\nzip\n##009\nict\n5kg\n諸\n∠\n碚\n瓊\ncnc\n溏\n##ndy\n埕\n膦\nline\nadi\n##ゆ\nsolo\n畢\nfull\njohnson\n##ゃ\nmade\n證\ngrand\nroger\n##px\nhonda\n観\n##ック\n椹\n僕\n練\nsouth\nstephen\njump\n##2015\n##ント\n揸\njonathan\ntalk\nhit\n塗\nnight\nsns\nmain\n699\nsmall\n444\n##ᄃ\nbrand\n##icon\nkai\n##ャ\nxbox\nwin8\npony\n砼\n##ほ\n##エ\ncool\nknow\n賽\nadmin\n琲\njimmy\n載\n釋\n##nity\n##ート\n##ム\nprofile\ntext\nphoto\n隊\n##share\n汙\nhead\n衛\n##015\n##я\nс\nroom\n敗\n##ᅳ\n4d\n┃\nedge\nmacbook\n591\nsmith\nhiv\n髂\n爾\n##ᅧ\ntrue\nemc\nsap\n##sio\n啫\ncheck\ncloud\n616\n##rame\n##ろ\ndigital\nghost\nや\nwma\n709\nお\nblog\n獲\n│\nsurface\n貨\n饒\n##itz\nsms\nmega\nteam\nvery\ncry\n蓮\n##ソ\n楽\n撲\nwd\n##rity\n劇\nkeep\ncpa\n積\n##ˋ\n尋\nわ\n餵\nryan\n豊\n##ᅥ\npack\n徬\nrebecca\n縣\niu\njuly\n##ㄧ\nstrong\n韻\nnokia\n疳\nalexander\ncity\n厭\ntest\nsound\n##rex\n腫\ncentral\nhall\n胝\n##lean\nv5\n莊\nface\nsdk\n〓\n##ᄌ\n權\n雜\n##rdon\n圖\nftp\n隈\nwed\n閱\nlucky\n紙\ncma\n##ュ\nsarah\nfifa\n##shot\n貘\n膽\nnov\nq3\ncount\n豈\n煙\npeople\nband\ncbs\n##ᄀ\n##б\ndrive\nbin\n棄\n詠\n##ᄅ\nr
dquo\nレ\n碓\n輩\n蒐\n539\n074\n斎\n3mm\n慣\n720p\n蔔\ndell\n20cm\ngold\neye\nanti\n錶\n##kins\nhelp\nps4\nprime\n戀\nbbq\n橫\n叢\ngoo\n隸\nx7\n⑶\n議\nelle\n詞\n畑\n##master\n彙\ncolor\nted\n硐\nsaas\n##mmy\nstudio\n990\n颡\n##014\nr8\nstay\n3s\n##sue\n據\nf5\n汆\nテ\n導\n##lton\n814\nstop\n腸\nr2\n##ㄚ\n誌\nえ\n##から\ncamp\n項\nmmorpg\n皺\n360°\n1tb\n緋\n6gb\n##post\n鬱\n籤\nasia\nwing\n##style\n536\necho\n921\n瀟\nshare\nsars\n蕻\n486\n浬\n創\n##covery\nwear\n貪\n喪\nrgb\nvalue\n2l\n##ァ\nqe\n絶\nlim\nオ\nㄋ\ndeep\ntbs\nbing\nagent\n潴\n規\npatrick\n區\nwork\n針\nrss\nタ\n責\n甦\n紹\napt\ntaylor\nラ\npierre\n##ター\nsystem\n庹\n##め\nへ\n悅\n4399\nruby\nsoul\nsale\nroot\nhouse\ntee\nyahoo\n鎚\ng20\n堙\nσ\n##ision\nother\nミ\nvictoria\n攝\nmode\nour\n##470\n★★\n僅\nnorth\nᄋ\nkitty\nvilla\n劃\n##ケ\n気\ncut\nchristian\n##г\n12345\n饑\nfast\n～5\ncanon\nces\n憶\n##ague\nbeyond\nshopping\n嫵\nリ\n呂\n烏\n擴\n寬\ncarlo\nthu\n撚\nlocal\n鄉\nipv6\n787\nrar\n統\n跩\nして\n##して\n##って\nった\n0mm\n828\nd5\nvm\nadd\n襯\nbruce\nian\nwall\n韋\n迺\nv4\ncontrol\ndlc\nnata\ngeneral\n##2016\n剋\nⅴ\n騰\n##tings\nnational\n設\n838\nx4\n##ᄒ\n00㎡\nopec\npng\n齁\n臟\n艋\ndog\n4600\n孫\nkarl\ndark\nxr\n營\nindex\n涠\n鈎\nィ\n##tors\nいて\nling\ncoc\n掃\n詳\n煖\n##º\n敵\n##ネ\n奧\n陸\nwilliams\n##016\ndouble\n嘗\n##xon\n蹟\nems\n賣\nsaint\nfamily\nannie\nsbs\nhtml5\nkic\n納\n鲱\nnull\nview\n200mm\n128gb\ncfa\n##オ\n々\nssd\nせ\nめ\ndragon\nalexa\nmoney\n蠛\n##◆◆\nfive\nqr\ngroup\n歐\nヒ\n##③\npark\nage\npush\nxyz\n##iginal\n溝\nurl\nhdr\nホ\n縱\n嬤\n護\n揚\n濤\n##ヘ\n囟\nbay\nfeel\nfps\n##lux\n##ч\n埼\n險\n譯\n0020\n檯\n##graphy\ncrystal\nfed\njapan\n##rmb\n鴻\n##ミ\n澀\n##atus\n煩\nskin\nunix\nfish\n＄\nedm\n巿\nmetal\n級\nnetflix\n魚\nロ\ndior\n囝\n織\n##firm\n##desk\n費\nimf\nbusiness\n擼\n齒\n補\n﹏\n##ⅲ\n遺\n橼\nmoba\naction\n⒉\nholiday\n鬥\n揮\ntea\n##rail\n899\ntiffany\nfrancis\nstudy\nscience\nanthony\nacg\n濫\n##ょう\nbios\nfinal\nsoft\n龐\n123456\nactive\n袪\nvoice\n橋\n榖\n##ね\n誠\n砲\nenter\nclose\nlost\nprada\npoint\n##ggle\ntop100\n¤\nhpv\n##セ\ntag\n職\n6s\n繃\ne5\nadsl\nа\nnumber\n箇\neia\n輸\nacca\nhuman\n##dragon\nicon\n998\n##jy\n
マ\n麴\n証\n##▲\nspeed\n##した\n⑸\nads\n##2017\njavascript\ncisco\n816\nraid\nウ\ncard\n♂\n##♀\n##ョ\n5c\nstandard\n64gb\n##ハー\n##ㄨ\nchrome\n##trix\n館\n##いる\n##いい\n##れた\n◆◆\nopera\nshadow\nyam\nproduct\nbaidu\nv9\napec\n##ˊ\n897\n殺\nこの\n辻\n攏\n##ノ\n##lmer\nbrown\n∽\n～4\n劑\n##ᅢ\n騎\nキ\n##rence\nsteven\n萊\n淺\njessica\n飲\nuse\n##lax\nsamsung\n##bike\nskip\npizza\n鄰\necu\nmedia\n鎗\nshell\n埤\npowerpoint\n劵\nsign\ntab\n︱\nvmware\n擔\nθ\n##ο\npm2\n##った\njose\nakb48\n凇\ndear\nfield\nм\nspecial\n茼\nclaire\n穩\nsso\n慶\n##2014\nstate\n##ワ\n銅\nelectronic\n##berry\ntokyo\nz2\n鬧\n涙\n麥\n##こと\n##rcle\n⒈\npmi\n18k\nraw\niis\n32g\n•\n##ろう\npath\nした\nいた\n##この\n##ified\nusd\n瀬\n恵\nvon\n憂\noutlook\ncover\nddd\nwish\n鶴\npure\n≡\nshop\njackson\noct\n##λ\np9\nmatch\nrmvb\nshift\nmeta\niot\n##163\nivy\n説\n悶\n##へ\n～10\niphone7\nnano\n⑷\nψ\n擋\n監\n頌\n█\n##340\nvincent\nふ\n━\nalso\n擇\n滅\n韓\n∞\nadvanced\nswitch\nチ\n##ь\n獄\n##stle\nservice\n##nnis\nsep\nメ\neast\n濟\n##△\nwave\n減\nultra\n飄\n鵰\nbeta\nsomething\nmsci\nglobal\n銷\ntable\n##κ\nsimple\napr\nn3\n799\nlondon\nphone\n##ι\n##スト\n##walker\nmind\nsecond\nvision\nbattle\n靚\n##←\nbasic\n##っと\nしい\nん\nsays\nriver\n懸\n簷\npaypal\ndonald\n餚\n牽\nhtm\n##ーム\n桜\n34e\n嚨\ntrip\n爛\nmenu\nbody\n籠\nbeautiful\nsean\n実\naugust\n摳\ngolden\n##ーン\n訝\n⋯⋯\n廢\n##ぬ\n朧\n##イス\n濁\n剝\n彆\n##bug\n##bbs\nresearch\ndac\nchanel\n紛\nfit\n≈\n遲\n鮨\n測\nsina\n頗\nmbc\ndance\nago\n50mm\n孖\n衄\n√\nzoom\nち\n##ょ\n寛\nxperia\n堺\nnbc\n憤\nwatch\n峯\nvpn\nside\n嬰\nв\n綠\nauthor\nsui\n穫\nmama\n碼\nbuy\npink\nnetwork\n##ᅦ\n埵\ncell\nmiui\n##orage\nschool\nˇ\n粿\n紋\nherme\nasr\nspf\n##ᆯ\n︿\n##®\nnfc\n##β\n∕\n勞\n##ⅰ\n##のか\n発\nmarket\nppi\nleft\n潆\nnature\n3kg\njob\n##2f\n醃\n飾\n##れて\n##たい\n↓↓\nkimi\nァ\n100ml\n30ml\npixel\njordan\nfalse\n№\n粧\n##≥\nposted\n階\naug\n趵\nmarch\nearth\n頻\nprivate\n煉\n檢\nbear\n蠍\n曆\nott\n訊\n縛\n覽\n墻\nrussell\n執\n攪\n慮\n##ゅ\n絃\nwelcome\n額\n賦\nboot\n聯\ndaily\nhealth\n##bby\njones\n竇\n伝\n##ッフ\nヽ\nps3\n500ml\n喬\n##イン\n緒\n哋\n擾\norder\n毎\n╮\n廁\n歩\nbeauty\nexpress\nsilver\nᄀ\nsecret\nscript\n萩\n
鄭\n魯\n遞\nsingle\n偽\ncio\nセ\nsoftware\nitem\n##ォ\nmetro\nそ\n##▽\n穎\n##price\n獻\ntravel\nprice\nfpga\nф\n嵐\n捩\nノ\n癱\nssh\n様\nmcu\n撿\n燿\n箏\n2345\ncapital\n浯\nyear\nicp\n礙\nball\n##abc\nvogue\n價\n##rss\n闊\nllc\n##fork\nヘ\n痙\n攣\n24h\nkong\nsave\n##mporary\n侖\nゆ\n鎮\nbim\n賓\nroad\nmomo\n蓆\nburberry\n穢\nsiri\n島\nsport\nfocus\n丿\n##ㄋ\n##とう\nт\nhost\n墮\ngmail\nevent\n##ification\n886\n繪\n帯\nself\n躍\n##sday\ntvbs\n##zzi\n晩\n##ᄆ\ndean\nkpi\natom\nbefore\n財\n囍\n暐\naspx\npublic\nzara\n撃\nfriends\nroyal\n誇\nbigbang\narpg\nsupport\n纍\nprogram\n訓\n組\n咘\nuniversal\n嘩\n弔\n蕭\n貫\n狹\n##stack\najax\ne7\n葯\n##ション\n┌\nserver\ncalifornia\nprocess\n##んな\n##また\n湯\nworks\ngmat\n誤\nbag\n譁\n闖\n闍\np10\n##rtex\ncontinue\n##∕\n閣\nネ\nsunday\n##リア\nunity\n##④\n勻\naudio\ngpa\n惱\n##017\n戸\n輝\n曖\nnikon\nclick\n賊\n灘\ncode\nsource\n範\n樫\ntoday\n╰\ncoffee\nplace\npuma\ncookie\nreading\n磡\n対\nfantasy\n舾\n##ラン\n##ivity\nから\ntour\nvera\ngit\n預\nchildren\n鳴\n╯\nnatural\n値\n蔣\nsite\n崁\nkuso\ncreative\n奪\nsocial\nhill\ncanada\nnexus\ndram\n筲\nwindow\n槓\nchannel\npm1\n16g\n梶\nhadoop\n賁\n誦\n盃\n墊\n埸\nᄆ\nradio\neyes\n852\nsanta\ndate\n##shine\nedit\n紘\nno1\ncto\nwrite\nx9\nえて\nえる\nめる\n##かり\nmonte\nrunning\norange\n醜\n##｀\n##のは\nford\nя\nflickr\ndit\n寢\n縴\n勸\nhoward\n鳥\nfollow\n砗\nenglish\n蝟\nк\ntpp\n舺\n廟\n##いた\nける\n淪\n##む\nisland\n鹽\n866\n##│\n##ース\n賢\norz\njcb\nuser\nenergy\nˊ\nwiki\nisis\nthunder\n傭\nmodern\nfriend\n崑\nguest\n咗\nfood\n##∼\ngear\n50ml\n壺\nmount\n櫃\n##ある\nlake\nsophie\nstep3\n搶\n潟\n##する\n▍\n垵\nstone\n療\nomega\nleave\n喚\n鏽\n審\npwm\ntechnology\n##lvin\n暫\n30cm\n剣\n##з\n≧\nsports\n澆\n濺\nssl\n贊\n##カー\ncdn\nぃ\nlte\n##cake\nutc\n##ヤ\n##ß\n塵\nelse\n憑\nー\n篠\n匯\nkit\n〜\naws\nyumi\nㄆ\namoled\n##ᄇ\ngap\n貓\nд\n##ㄢ\nwii\nips\n嘅\n疊\n##ちの\n##かな\ntown\nunited\n峇\nᄉ\n攤\n##◎\nする\nswift\n##ました\ninstitute\nthread\nㄌ\nemily\n徑\ntiger\n俠\n滝\n漿\nよ\n圏\n攞\n齋\n綵\nebd\nobject\nˋ\n氡\n弾\n埗\nfpx\nfintech\napril\n蠻\n顛\niphone6\n##δ\n##ρ\nmemory\n円\n搵\nbank\n闔\nkindle\n冊\n##れは\n榮\n経\nwps\n鏤\n⒊\n罰\n鋼\n災\ncountry\nл\n1389\nredis\n╭\n
紐\n懼\nれる\n窮\n黨\nsogo\nusb3\n窺\n槻\n綺\ngirls\nemma\n辭\nspark\n飆\nmorning\n##τ\niptv\napk\nlearning\nmachine\ncream\napache\n関\nconnect\n嚮\n揹\nvirtual\n廂\n驀\n##unge\ntarget\n甕\n##ㄥ\n餓\nspan\nlenovo\n燻\nユ\ntaiwan\nᄂ\nても\n12306\n鉄\n巔\n##ⅳ\n贏\n##÷\nmiller\n##↗\nparker\ngoods\n##ように\nれた\n##まて\n##²\n潛\nη\ndownload\nplaystation\nstory\n礡\ngithub\nskype\n▋\n竄\n紗\n鍵\n燉\nngo\n##ᄑ\n啞\n嘢\ntower\nwin10\npowered\nｃｐ\n霊\nfresh\nr9\nstreet\n聰\n貞\n児\ncoach\nbuild\nfuture\n変\n羣\n##dical\nmelody\nswatch\n樸\n瑯\ndelete\nevolution\n勳\n敎\nてある\npacific\n殼\n徴\n鍾\n臘\n攬\n##くたさい\ntrump\nboard\n勵\nchange\n滲\n簽\n瑪\n##х\nwilson\nニ\nimax\n##なと\nmountain\n##きた\nnode\n歛\nsquare\n賀\nォ\nfriday\n噓\ninto\n##⑥\nfashion\n嚀\n録\njune\nbridge\n爐\n鏈\n訪\n##ける\neds\n獵\ncollege\ncommon\ntree\n懶\n墘\n辯\n轟\n霧\n脈\n銬\nnand\n犧\n閑\naudi\n郷\n売\ndiscovery\nerror\ntoyota\nliving\n##ㄞ\nwant\ncopy\n曠\n懲\ncontent\nρ\n齡\nbird\n檔\n覧\nsupreme\nツ\n##ᄂ\n灑\n樁\n騙\nн\n##ヨ\n##baru\nq4\n˙\n##ς\n壇\n棲\n##あなたに\nq10\ncreate\n簦\n鐲\n驕\n闢\n##その\nまて\n##なく\n込\n蘿\nκ\n賜\n諷\ncoupe\n簫\n遷\n##ても\njquery\n▌\ncms\nモ\nswiss\n輛\n##blog\nᄒ\n潰\n擱\nupdate\n14k\ncontact\n郵\n澤\nchicago\n##ータ\ncamera\nios11\ngmt\n擬\n糞\n##ᅪ\nzone\n評\nз\n侷\n##〓\n##ルフ\n##ц\nvive\nsync\nrate\n∥\n閨\n稟\n癲\n##ーション\n氷\n豎\n##ンス\nе\n##ш\n違\n遙\nてす\nル\n逕\n燭\nр\nviews\ncst\nstation\nf16\n糾\n巻\n禍\narmani\nナ\nvim\n獅\n氹\n##ø\nケ\nsystems\n##∩\n冏\n鰍\ncostco\n薩\n兲\n悪\nnovember\nagain\nvsc\nmozilla\n渉\n団\ngivenchy\nrelease\nㄏ\n##ㄛ\n◢\nヶ\n##ᄎ\n蒼\nthai\n轎\nι\n彫\nunion\nfrm\n湳\nи\nclassic\n広\n##♂\nassociation\n##なら\n貧\nkanye\nχ\n##ν\n##nome\n賭\nlanguage\ninside\n##サー\nㄒ\ncompany\n壽\n図\n丼\n灝\nsense\n糬\n##╰\nг\n鏢\n穂\n蹤\n僱\nvans\n彌\n頑\nп\nワ\n##りの\n膠\n絵\n滯\n##ы\nreturn\nrequest\n綸\n##∥\nmems\nbooking\n##ㄏ\n喰\n##ᅲ\nstep1\n##ーシ\n螢\n##よう\n搗\n##cms\n##rix\nairbnb\nculture\nまた\nphilips\n協\n##ˇ\n##↘\n┊\n鑼\n続\n##かる\ndays\ntrc\n##π\n靂\nwomen\ncafe\nyoga\n嬪\n寵\n筍\n##ㄤ\n##ㄉ\n凪\nsearch\ncheese\nとして\n閔\n##∠\n邁\n駛\ndavis\n洸\n績\nddos\n綻\n##んて\n喺\ndisplay\n籬\n摯\n晝\n##χ\n潑\n蒟\n絳\nmuji\n鈕\nbeach\n倉\nы\ndcs\n##ちゃん\nseason
\n驅\n訂\n糧\ndomain\nglass\nlumia\nfeed\n撈\n⒋\n貶\n##ə\n侶\nios7\n牴\n帳\nソ\npress\npinterest\nrights\n弒\n艱\n挾\nlimited\nptt\narticle\ncambridge\n##≈\nleonardo\n覓\n虧\n毀\n浄\n妘\n駕\n嬸\n窩\neba\nㄨ\n##いて\n済\n鈴\n熾\nっ\nunicode\neducation\n##ます\nwordpress\nfinancial\nvolvo\nadidas\nhistory\n轅\n籲\n単\n賴\n榱\n##θ\n転\nfiles\n##なり\nstep2\ngarden\n™\n棟\n噠\n俬\ngaga\n##ж\n総\ntitle\n##∞\ncss3\nぁ\n販\n嘰\n欄\nurban\n##™\n築\nicloud\n##ュー\nmarketing\n##admin\nmkv\n≦\n##ᄐ\n##ᄏ\ndevelopment\nvps\n敘\n鉅\n##γ\noctober\nо\nかある\n®\n抜\npepper\n##ては\n脅\n満\n錦\n筊\n##イフ\nbean\nο\nすく\n唄\n傘\nboost\nexchange\n##ᅬ\n##ㄇ\n0755\n−\n##ᅴ\n竊\nㄥ\n##˚\nyork\n衞\narchive\nedition\nfilm\nmuseum\n嶋\nofficial\ninformation\naward\nseries\ngundam\nstage\nsociety\ngames\nentertainment\nversion\nmovie\ncollection\nforce\nreport\nlibrary\nguide\nbooks\noxford\ncentre\nchurch\nmagazine\nfestival\nstates\nawards\ndatabase\nplanet\namerica\njunior\npublishing\nfoundation\n歴\nstore\n脇\n毘\narts\nkorea\nmanagement\nfunction\nplayer\n専\ncounty\nyears\napplication\njanuary\nfebruary\ncorporation\n県\neditor\nweek\nwashington\n亜\nforest\nsingapore\n##ctionary\nseptember\ndecember\n姉\nsecurity\nfirefox\nkids\nmaps\npublished\nsurvey\nmhz\nphotos\n蔵\nservices\ncategory\nstructure\npiece\n増\ngnu\nwong\nresult\n収\n栃\navenue\ntaipei\nlamigo\nplaza\npremium\ncook\nmalaysia\nmember\ngallery\n舎\nfactory\n稲\nedited\nwalker\nairport\nx86\nvalley\nnotes\nsmap\nvillage\nchicken\npolicy\nimages\ntwice\ncourse\nwikipedia\narchives\npopulation\nlinks\ndisney\nframework\n塭\nexplorer\nmaker\nkde\nmessage\n読\nsuite\n##gence\neclipse\n闘\ndistribution\nfung\n髪\ncampaign\n遶\n敍\n仮\n駅\ndetails\nmanager\nncc\n隣\nbelle\nsession\narticles\n菓\n驒\nexhibition\n齢\n険\noil\n乗\nazure\ninstall\neconomy\nubuntu\ncomment\n渋\n塩\nfeb\n応\naddress\n栄\n権\nhsu\npopular\nwine\nnetscape\nrelated\n頼\n帰\n験\nnownews\n効\n薬\n雑\nkitchen\nreader\n壆\nsafari\nsbl\n壊\n縄\nlulu\ndiary\n莿\n検\nteddy\nresort\n覇\n両\n荖\nhair\ncbc\nqualcomm\ndocomo\n覚\n##ernel\nettoday\n倶\nscp\n掲\n冨\nbalance\n駄\nmobi
l\nmlb\nalphago\npmid\n訳\nprevious\n坵\nrestaurant\n処\nwonderland\nwedding\n扱\npanasonic\nhmv\n継\n庁\ncredit\nmessenger\nclient\n聴\ncurve\n従\nmonday\n働\n譲\n荘\nhotels\n廃\nscrum\nbutler\ncopyright\n縁\n嫲\nchocolate\nweeks\n焼\n岀\n挙\ncache\nsalvatore\n労\necfa\n傚\nshtml\n麺\ngaming\nasus\nfedora\nr11\n厳\nmonths\nhbl\n仏\nurn\nlinkedin\njson\n舗\noops\nzenfone\nfaq\nbytes\nxz\ntrivago\nc919\n嚟\ncake\nwam\nposts\ntesla\n彅\ned2k\nnego\nmwc\nbanner\nreuters\nh7n9\nbrandt\nzol\nevernote\njobs\nmsg\nrosie\nsylvia\nbrake\npchome\n駆\ngogoro\nrmk\nloading\ncomments\nspotify\n蒻\nie6\n圧\nikea\nrewards\n囲\nnavigation\n0800\n悩\nuniqlo\nwhatsapp\n軽\ndollars\nudn\nwali\n脳\ncookies\nlifestyle\narduino\npokemon\noutlet\ngartner\n択\nrainer\ndji\nwade\n窓\ncurry\nferragamo\nplurk\ntcs\n繋\noriginals\nbloomberg\ndhl\n磲\ncf1\nsgs\nmango\nlohas\ntinker\n##pods\nvsa\nprotected\nanalytics\n騨\n歯\nprivacy\n臓\nblogspot\n拡\ntila\n払\nfc2\npptv\nmarriott\noculus\npixnet\n価\n銭\nhitachi\n瞓\nntd\nwikia\nnamespace\nagoda\npositioning\nsohu\n##bscribe\nhola\nmastercard\n9985\nblogger\nexpedia\nwestbrook\nbbe\n畳\ndyson\nrcep\n粄\nweibo\ntraction\n##onsored\nfandom\nday1\nstarbucks\nwwdc\nbuffet\n剉\ntags\n2765\nebooks\n##skip\ntweet\ne88\nhostel\n8591\n4066\nmotel\n1mdb\nprefix\nlabels\n椪\n勧\ninsee\n##vertisement\n##tags\nday3\nu11\n戻\n##t00\nepson\ntripadvisor\ntmt\nfishbase\nbe2\nvdc\ngopro\n歓\nreply\nhktvmall\ngr2\n##facebook\npinkoi\nmoneydj\ngoris\nrends\ntomtom\nnginx\nfsm\n9595\n6221\nios10\n堿\nrakuten\nxuite\nadsense\nday4\n00z\nconnectivity\nworldcat\n##copyright\nhuawei\nreserved\ntvg\ndropbox\n9678\nlongchamp\n50cm\namana\n12345678910\nlofter\n5757\n6606\n遅\nvfm\n17life\n<eos>\n<unk>\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/module.py",
    "content": "import math\nimport os\n\nimport paddlehub as hub\nfrom .processor import load_vocab\nfrom .processor import postprocess\nfrom .processor import preprocess\nfrom paddlehub.compat.task import tokenization\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"porn_detection_cnn\",\n            version=\"1.2.0\",\n            summary=\"Baidu's open-source Porn Detection Model.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass PornDetectionCNN(hub.NLPPredictionModule):\n\n    def __init__(self):\n        \"\"\"\n        Initialize with the necessary elements.\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"infer_model\")\n        self.tokenizer_vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"word_dict.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n        self.sequence_max_len = 256\n        self.tokenizer = tokenization.FullTokenizer(self.tokenizer_vocab_path)\n\n        self.param_file = os.path.join(self.directory, \"assets\", \"params.txt\")\n\n        self.predict = self.detection\n\n        self._set_config()\n\n    @serving\n    def detection(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the porn detection results with the texts as input.\n\n        Args:\n             texts(list): the input texts to be predicted; used when data is not provided\n             data(dict): key must be 'text', value is the list of texts to be predicted; used when texts is not provided\n             use_gpu(bool): whether to use gpu for prediction\n             batch_size(int): the number of texts processed in each batch\n\n        Returns:\n             results(list): the porn detection results\n        \"\"\"\n        # Fall back to CPU when CUDA_VISIBLE_DEVICES is unset or invalid.\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n            int(_places[0])\n        except Exception:\n            use_gpu = False\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(batch_data, self.tokenizer, self.vocab, self.sequence_max_len)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels that were used during pretraining.\n\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"porn\": 1, \"not_porn\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_cnn/processor.py",
    "content": "import io\n\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for i, line in enumerate(f):\n            vocab[line.rstrip()] = int(i)\n    return vocab\n\n\ndef get_predict_label(pos_prob):\n    \"\"\"\n    Convert the prediction probabilities to label\n    \"\"\"\n    # threshold should be (1, 0.5)\n    neu_threshold = 0.5\n    if pos_prob >= neu_threshold:\n        label, key = 1, \"porn\"\n    else:\n        label, key = 0, \"not_porn\"\n    return label, key\n\n\ndef preprocess(predicted_data, tokenizer, vocab, sequence_max_len=256):\n    \"\"\"\n    Convert the word str to word id and pad the text\n    \"\"\"\n    result = []\n    padding = vocab['<PAD>']\n    unknown = vocab['<UNK>']\n    for index, text in enumerate(predicted_data):\n        data_arr = tokenizer.tokenize(''.join(text.split()))\n        wids = [vocab.get(w, unknown) for w in data_arr[:sequence_max_len]]\n        if len(wids) < sequence_max_len:\n            wids = wids + [padding] * (sequence_max_len - len(wids))\n\n        result_i = {'processed': []}\n        result_i['origin'] = predicted_data[index]\n        result_i['processed'] += wids\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to pornography label\n    \"\"\"\n    result = []\n    predict_out = predict_out.as_ndarray()\n    for index in range(len(texts)):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'not_porn'\n        else:\n            key = 'porn'\n        result_i['porn_detection_label'] = label\n        result_i['porn_detection_key'] = key\n        result_i['porn_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['not_porn_probs'] = float('%.4f' 
% (predict_out[index, 0]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/README.md",
    "content": "# porn_detection_gru\n\n| 模型名称            |  porn_detection_gru  |\n| :------------------ | :------------: |\n| 类别                | 文本-文本审核  |\n| 网络                |      GRU  |\n| 数据集              | 百度自建数据集 |\n| 是否支持Fine-tuning |       否       |\n| 模型大小            |       20M       |\n| 最新更新日期        |   2021-02-26   |\n| 数据指标            |       -        |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别。\n  - porn_detection_gru采用GRU网络结构并按字粒度进行切词，具有较高的预测速度。该模型最大句子长度为256字，仅支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install porn_detection_gru\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run porn_detection_gru --input_text \"黄片下载\"\n    ```\n\n  - 或者\n\n  - ```shell\n    $ hub run porn_detection_gru --input_file test.txt\n    ```\n\n    - 其中test.txt存放待审查文本，每行仅放置一段待审核文本\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    porn_detection_gru = hub.Module(name=\"porn_detection_gru\")\n\n    test_text = [\"黄片下载\", \"打击黄牛党\"]\n\n    results = porn_detection_gru.detection(texts=test_text, use_gpu=True, batch_size=1)   # 如不使用GPU，请修改为use_gpu=False\n\n    for index, text in enumerate(test_text):\n        results[index][\"text\"] = text\n    for index, result in enumerate(results):\n        print(results[index])\n\n    # 输出结果如下：\n    # {'text': '黄片下载', 'porn_detection_label': 1, 'porn_detection_key': 'porn', 'porn_probs': 0.9324, 'not_porn_probs': 0.0676}\n    # {'text': '打击黄牛党', 
'porn_detection_label': 0, 'porn_detection_key': 'not_porn', 'porn_probs': 0.0004, 'not_porn_probs': 0.9996}\n    ```\n\n\n- ### 3、API\n\n  - ```python\n    def detection(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - porn_detection_gru预测接口，鉴定输入句子是否包含色情文案\n\n    - **参数**\n\n      - texts(list): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n\n      - data(dict): 预测数据，key必须为text，value是待预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n\n      - use_gpu(bool): 是否使用GPU预测，如果使用GPU预测，则在预测之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置\n\n      - batch_size(int): 批处理大小\n\n    - **返回**\n\n      - results(list): 鉴定结果\n\n\n  - ```python\n    def get_labels()\n    ```\n    - 获取porn_detection_gru的类别\n\n    - **返回**\n\n      - labels(dict): porn_detection_gru的类别（二分类，是/不是）\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n\n      - vocab_path(str): 词汇表路径\n\n\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线色情文案检测服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m porn_detection_gru  \n    ```\n\n  - 启动时会显示加载模型过程，启动成功后显示\n  - ```shell\n    Loading porn_detection_gru successful.\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"黄片下载\", \"打击黄牛党\"]\n\n    # 设置运行配置\n    # 对应本地预测porn_detection_gru.detection(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\": True}\n\n    # 指定预测方法为porn_detection_gru并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/porn_detection_gru\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 
关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  大幅提升预测性能，同时简化接口使用\n\n* 1.2.0\n\n  移除 Fluid API\n\n  - ```shell\n    $ hub install porn_detection_gru==1.2.0\n    ```\n\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/README_en.md",
    "content": "# porn_detection_gru\n\n|     Module Name      |  porn_detection_gru  |\n|  :------------------ | :------------: |\n|       Category       | text-text_review  |\n|         Network      |      GRU      |\n|         Dataset      | Dataset built by Baidu |\n| Fine-tuning supported or not |      No       |\n|     Module Size      |       20M       |\n| Latest update date   |   2021-02-26   |\n|   Data indicators    |       -        |\n\n## I. Basic Information of Module\n\n- ### Module Introduction\n  - Pornography detection model can automatically distinguish whether the text is pornographic or not and give the corresponding confidence, and identify the pornographic description, vulgar communication and filthy text in the text.\n  - porn_detection_gru adopts GRU network structure and cuts words according to word granularity, which has high prediction speed. The maximum sentence length of this model is 256 words, and only prediction is supported.\n\n\n## II. Installation\n\n- ### 1、Environmental dependence\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [How to install PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、Installation\n\n  - ```shell\n    $ hub install porn_detection_gru\n    ```\n  - If you have problems during installation, please refer to:[windows_quickstart](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [linux_quickstart](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [mac_quickstart](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n## III. Module API and Prediction\n\n- ### 1、Command line Prediction\n\n  - ```shell\n    $ hub run porn_detection_gru --input_text \"黄片下载\"\n    ```\n\n  - or\n\n  - ```shell\n    $ hub run porn_detection_gru --input_file test.txt\n    ```\n\n    - test.txt stores the text to be reviewed. 
Each line contains only one text.\n\n  - If you want to call the Hub module through the command line, please refer to: [PaddleHub Command line instruction](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、Prediction Code Example\n\n  - ```python\n    import paddlehub as hub\n\n    porn_detection_gru = hub.Module(name=\"porn_detection_gru\")\n\n    test_text = [\"黄片下载\", \"打击黄牛党\"]\n\n    results = porn_detection_gru.detection(texts=test_text, use_gpu=True, batch_size=1)   # If you do not use GPU, please set use_gpu=False\n\n    for index, text in enumerate(test_text):\n        results[index][\"text\"] = text\n    for index, result in enumerate(results):\n        print(results[index])\n\n    # The output:\n    # {'text': '黄片下载', 'porn_detection_label': 1, 'porn_detection_key': 'porn', 'porn_probs': 0.9324, 'not_porn_probs': 0.0676}\n    # {'text': '打击黄牛党', 'porn_detection_label': 0, 'porn_detection_key': 'not_porn', 'porn_probs': 0.0004, 'not_porn_probs': 0.9996}\n    ```\n\n\n- ### 3、API\n\n  - ```python\n    def detection(texts=[], data={}, use_gpu=False, batch_size=1)\n    ```\n\n    - Prediction API of porn_detection_gru, identifying whether the input sentences contain pornography\n\n    - **Parameter**\n\n      - texts(list): data to be predicted. If the texts parameter is used, there is no need to pass in the data parameter; use either one of the two.\n\n      - data(dict): data to be predicted, key must be text, value is the data to be predicted. If the data parameter is used, there is no need to pass in the texts parameter; use either one of the two. It is suggested to use the texts parameter, as the data parameter will be deprecated later.\n\n      - use_gpu(bool): whether to use GPU. If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before prediction. 
Otherwise, it does not need to be set.\n\n      - batch_size(int): batch size for prediction\n\n    - **Return**\n\n      - results(list): prediction result\n\n\n  - ```python\n    def get_labels()\n    ```\n    - Get the categories of porn_detection_gru\n\n    - **Return**\n\n      - labels(dict): the categories of porn_detection_gru (binary classification, yes/no)\n\n  - ```python\n    def get_vocab_path()\n    ```\n\n    - Get the vocabulary used in pretraining\n\n    - **Return**\n\n      - vocab_path(str): Vocabulary path\n\n\n\n## IV. Server Deployment\n\n- PaddleHub Serving can deploy an online pornography detection service, and you can use this interface for online Web applications.\n\n- ## Step 1: Start PaddleHub Serving\n\n  - Run the startup command:\n  - ```shell\n    $ hub serving start -m porn_detection_gru  \n    ```\n\n  - The model loading process is displayed on startup. After the startup is successful, the following information is displayed:\n  - ```shell\n    Loading porn_detection_gru successful.\n    ```\n\n  - The serving API is now deployed, and the default port number is 8866.\n\n  - **NOTE:** If GPU is used for prediction, set the CUDA_VISIBLE_DEVICES environment variable before prediction. 
Otherwise, it does not need to be set.\n\n\n- ## Step 2: Send a prediction request\n\n  - After configuring the server, the following lines of code can be used to send the prediction request and obtain the prediction result\n  - ```python\n    import requests\n    import json\n\n    # data to be predicted\n    text = [\"黄片下载\", \"打击黄牛党\"]\n\n    # Set the running configuration\n    # Equivalent to the local prediction call porn_detection_gru.detection(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\": True}\n\n    # Set the prediction method to porn_detection_gru and send a POST request; the content-type should be set to json\n    # HOST_IP is the IP address of the server\n    url = \"http://HOST_IP:8866/predict/porn_detection_gru\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # print the prediction result\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - For more information about PaddleHub Serving, please refer to: [Serving Deployment](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n\n\n## V. Release Note\n\n* 1.0.0\n\n  First release\n\n* 1.1.0\n\n  Improves prediction performance and simplifies interface usage\n\n* 1.2.0\n\n  Remove the Fluid API\n\n  - ```shell\n    $ hub install porn_detection_gru==1.2.0\n    ```\n\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/assets/params.txt",
    "content": "@HUB_porn_detection_gru@fc_0.w_0\n@HUB_porn_detection_gru@embedding_0.w_0\n@HUB_porn_detection_gru@fc_1.w_0\n@HUB_porn_detection_gru@gru_0.b_0\n@HUB_porn_detection_gru@layer_norm_0.w_0\n@HUB_porn_detection_gru@fc_1.b_0\n@HUB_porn_detection_gru@gru_0.w_0\n@HUB_porn_detection_gru@layer_norm_0.b_0\n@HUB_porn_detection_gru@fc_0.b_0\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/assets/vocab.txt",
    "content": "[PAD]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[unused99]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n<S>\n<T>\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\n[\n\\\n]\n^\n_\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n£\n¤\n¥\n§\n©\n«\n®\n°\n±\n²\n³\nµ\n·\n¹\nº\n»\n¼\n×\nß\næ\n÷\nø\nđ\nŋ\nɔ\nə\nɡ\nʰ\nˇ\nˈ\nˊ\nˋ\nˍ\nː\n˙\n˚\nˢ\nα\nβ\nγ\nδ\nε\nη\nθ\nι\nκ\nλ\nμ\nν\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nа\nб\nв\nг\nд\nе\nж\nз\nи\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nы\nь\nя\nі\nا\nب\nة\nت\nد\nر\nس\nع\nل\nم\nن\nه\nو\nي\n۩\nก\nง\nน\nม\nย\nร\nอ\nา\nเ\n๑\n་\nღ\nᄀ\nᄁ\nᄂ\nᄃ\nᄅ\nᄆ\nᄇ\nᄈ\nᄉ\nᄋ\nᄌ\nᄎ\nᄏ\nᄐ\nᄑ\nᄒ\nᅡ\nᅢ\nᅣ\nᅥ\nᅦ\nᅧ\nᅨ\nᅩ\nᅪ\nᅬ\nᅭ\nᅮ\nᅯ\nᅲ\nᅳ\nᅴ\nᅵ\nᆨ\nᆫ\nᆯ\nᆷ\nᆸ\nᆺ\nᆻ\nᆼ\nᗜ\nᵃ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵘ\n‖\n„\n†\n•\n‥\n‧\n 
\n‰\n′\n″\n‹\n›\n※\n‿\n⁄\nⁱ\n⁺\nⁿ\n₁\n₂\n₃\n₄\n€\n℃\n№\n™\nⅰ\nⅱ\nⅲ\nⅳ\nⅴ\n←\n↑\n→\n↓\n↔\n↗\n↘\n⇒\n∀\n−\n∕\n∙\n√\n∞\n∟\n∠\n∣\n∥\n∩\n∮\n∶\n∼\n∽\n≈\n≒\n≡\n≤\n≥\n≦\n≧\n≪\n≫\n⊙\n⋅\n⋈\n⋯\n⌒\n①\n②\n③\n④\n⑤\n⑥\n⑦\n⑧\n⑨\n⑩\n⑴\n⑵\n⑶\n⑷\n⑸\n⒈\n⒉\n⒊\n⒋\nⓒ\nⓔ\nⓘ\n─\n━\n│\n┃\n┅\n┆\n┊\n┌\n└\n├\n┣\n═\n║\n╚\n╞\n╠\n╭\n╮\n╯\n╰\n╱\n╳\n▂\n▃\n▅\n▇\n█\n▉\n▋\n▌\n▍\n▎\n■\n□\n▪\n▫\n▬\n▲\n△\n▶\n►\n▼\n▽\n◆\n◇\n○\n◎\n●\n◕\n◠\n◢\n◤\n☀\n★\n☆\n☕\n☞\n☺\n☼\n♀\n♂\n♠\n♡\n♣\n♥\n♦\n♪\n♫\n♬\n✈\n✔\n✕\n✖\n✦\n✨\n✪\n✰\n✿\n❀\n❤\n➜\n➤\n⦿\n、\n。\n〃\n々\n〇\n〈\n〉\n《\n》\n「\n」\n『\n』\n【\n】\n〓\n〔\n〕\n〖\n〗\n〜\n〝\n〞\nぁ\nあ\nぃ\nい\nう\nぇ\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nっ\nつ\nて\nと\nな\nに\nぬ\nね\nの\nは\nひ\nふ\nへ\nほ\nま\nみ\nむ\nめ\nも\nゃ\nや\nゅ\nゆ\nょ\nよ\nら\nり\nる\nれ\nろ\nわ\nを\nん\n゜\nゝ\nァ\nア\nィ\nイ\nゥ\nウ\nェ\nエ\nォ\nオ\nカ\nキ\nク\nケ\nコ\nサ\nシ\nス\nセ\nソ\nタ\nチ\nッ\nツ\nテ\nト\nナ\nニ\nヌ\nネ\nノ\nハ\nヒ\nフ\nヘ\nホ\nマ\nミ\nム\nメ\nモ\nャ\nヤ\nュ\nユ\nョ\nヨ\nラ\nリ\nル\nレ\nロ\nワ\nヲ\nン\nヶ\n・\nー\nヽ\nㄅ\nㄆ\nㄇ\nㄉ\nㄋ\nㄌ\nㄍ\nㄎ\nㄏ\nㄒ\nㄚ\nㄛ\nㄞ\nㄟ\nㄢ\nㄤ\nㄥ\nㄧ\nㄨ\nㆍ\n㈦\n㊣\n㎡\n㗎\n一\n丁\n七\n万\n丈\n三\n上\n下\n不\n与\n丐\n丑\n专\n且\n丕\n世\n丘\n丙\n业\n丛\n东\n丝\n丞\n丟\n両\n丢\n两\n严\n並\n丧\n丨\n个\n丫\n中\n丰\n串\n临\n丶\n丸\n丹\n为\n主\n丼\n丽\n举\n丿\n乂\n乃\n久\n么\n义\n之\n乌\n乍\n乎\n乏\n乐\n乒\n乓\n乔\n乖\n乗\n乘\n乙\n乜\n九\n乞\n也\n习\n乡\n书\n乩\n买\n乱\n乳\n乾\n亀\n亂\n了\n予\n争\n事\n二\n于\n亏\n云\n互\n五\n井\n亘\n亙\n亚\n些\n亜\n亞\n亟\n亡\n亢\n交\n亥\n亦\n产\n亨\n亩\n享\n京\n亭\n亮\n亲\n亳\n亵\n人\n亿\n什\n仁\n仃\n仄\n仅\n仆\n仇\n今\n介\n仍\n从\n仏\n仑\n仓\n仔\n仕\n他\n仗\n付\n仙\n仝\n仞\n仟\n代\n令\n以\n仨\n仪\n们\n仮\n仰\n仲\n件\n价\n任\n份\n仿\n企\n伉\n伊\n伍\n伎\n伏\n伐\n休\n伕\n众\n优\n伙\n会\n伝\n伞\n伟\n传\n伢\n伤\n伦\n伪\n伫\n伯\n估\n伴\n伶\n伸\n伺\n似\n伽\n佃\n但\n佇\n佈\n位\n低\n住\n佐\n佑\n体\n佔\n何\n佗\n佘\n余\n佚\n佛\n作\n佝\n佞\n佟\n你\n佢\n佣\n佤\n佥\n佩\n佬\n佯\n佰\n佳\n併\n佶\n佻\n佼\n使\n侃\n侄\n來\n侈\n例\n侍\n侏\n侑\n侖\n侗\n供\n依\n侠\n価\n侣\n侥\n侦\n侧\n侨\n侬\n侮\n侯\n侵\n侶\n侷\n便\n係\n促\n俄\n俊\n俎\n俏\n俐\n俑\n俗\n俘\n俚\n保\n俞\n俟\n俠\n信\n俨\n俩\n俪\n俬\n俭\n修\n俯\n俱\n俳\n俸\n俺\n俾\n倆\n倉\n個\n倌\n倍\n倏\n們\n倒\n倔\n倖\n倘\n候\n倚\n倜\n借\n倡\n値\n倦\n倩\n倪\n倫\n倬\n倭\n倶\n债\n值\n倾\n偃\n假\n偈\n偉\n偌\n偎\n偏\n偕\n做\n停\n健\n側\n偵\n偶\n偷\n偻\n偽\n偿\n傀\n傅\n傍\n傑\n傘\n備\n傚\n傢\n傣\n傥\n储\n傩\n催\n傭\n傲\n傳\n債\n傷\n傻\n傾\n僅\n働\n像\n僑\n
僕\n僖\n僚\n僥\n僧\n僭\n僮\n僱\n僵\n價\n僻\n儀\n儂\n億\n儆\n儉\n儋\n儒\n儕\n儘\n償\n儡\n優\n儲\n儷\n儼\n儿\n兀\n允\n元\n兄\n充\n兆\n兇\n先\n光\n克\n兌\n免\n児\n兑\n兒\n兔\n兖\n党\n兜\n兢\n入\n內\n全\n兩\n八\n公\n六\n兮\n兰\n共\n兲\n关\n兴\n兵\n其\n具\n典\n兹\n养\n兼\n兽\n冀\n内\n円\n冇\n冈\n冉\n冊\n册\n再\n冏\n冒\n冕\n冗\n写\n军\n农\n冠\n冢\n冤\n冥\n冨\n冪\n冬\n冯\n冰\n冲\n决\n况\n冶\n冷\n冻\n冼\n冽\n冾\n净\n凄\n准\n凇\n凈\n凉\n凋\n凌\n凍\n减\n凑\n凛\n凜\n凝\n几\n凡\n凤\n処\n凪\n凭\n凯\n凰\n凱\n凳\n凶\n凸\n凹\n出\n击\n函\n凿\n刀\n刁\n刃\n分\n切\n刈\n刊\n刍\n刎\n刑\n划\n列\n刘\n则\n刚\n创\n初\n删\n判\n別\n刨\n利\n刪\n别\n刮\n到\n制\n刷\n券\n刹\n刺\n刻\n刽\n剁\n剂\n剃\n則\n剉\n削\n剋\n剌\n前\n剎\n剐\n剑\n剔\n剖\n剛\n剜\n剝\n剣\n剤\n剥\n剧\n剩\n剪\n副\n割\n創\n剷\n剽\n剿\n劃\n劇\n劈\n劉\n劊\n劍\n劏\n劑\n力\n劝\n办\n功\n加\n务\n劣\n动\n助\n努\n劫\n劭\n励\n劲\n劳\n労\n劵\n効\n劾\n势\n勁\n勃\n勇\n勉\n勋\n勐\n勒\n動\n勖\n勘\n務\n勛\n勝\n勞\n募\n勢\n勤\n勧\n勳\n勵\n勸\n勺\n勻\n勾\n勿\n匀\n包\n匆\n匈\n匍\n匐\n匕\n化\n北\n匙\n匝\n匠\n匡\n匣\n匪\n匮\n匯\n匱\n匹\n区\n医\n匾\n匿\n區\n十\n千\n卅\n升\n午\n卉\n半\n卍\n华\n协\n卑\n卒\n卓\n協\n单\n卖\n南\n単\n博\n卜\n卞\n卟\n占\n卡\n卢\n卤\n卦\n卧\n卫\n卮\n卯\n印\n危\n即\n却\n卵\n卷\n卸\n卻\n卿\n厂\n厄\n厅\n历\n厉\n压\n厌\n厕\n厘\n厚\n厝\n原\n厢\n厥\n厦\n厨\n厩\n厭\n厮\n厲\n厳\n去\n县\n叁\n参\n參\n又\n叉\n及\n友\n双\n反\n収\n发\n叔\n取\n受\n变\n叙\n叛\n叟\n叠\n叡\n叢\n口\n古\n句\n另\n叨\n叩\n只\n叫\n召\n叭\n叮\n可\n台\n叱\n史\n右\n叵\n叶\n号\n司\n叹\n叻\n叼\n叽\n吁\n吃\n各\n吆\n合\n吉\n吊\n吋\n同\n名\n后\n吏\n吐\n向\n吒\n吓\n吕\n吖\n吗\n君\n吝\n吞\n吟\n吠\n吡\n否\n吧\n吨\n吩\n含\n听\n吭\n吮\n启\n吱\n吳\n吴\n吵\n吶\n吸\n吹\n吻\n吼\n吽\n吾\n呀\n呂\n呃\n呆\n呈\n告\n呋\n呎\n呐\n呓\n呕\n呗\n员\n呛\n呜\n呢\n呤\n呦\n周\n呱\n呲\n味\n呵\n呷\n呸\n呻\n呼\n命\n咀\n咁\n咂\n咄\n咆\n咋\n和\n咎\n咏\n咐\n咒\n咔\n咕\n咖\n咗\n咘\n咙\n咚\n咛\n咣\n咤\n咦\n咧\n咨\n咩\n咪\n咫\n咬\n咭\n咯\n咱\n咲\n咳\n咸\n咻\n咽\n咿\n哀\n品\n哂\n哄\n哆\n哇\n哈\n哉\n哋\n哌\n响\n哎\n哏\n哐\n哑\n哒\n哔\n哗\n哟\n員\n哥\n哦\n哧\n哨\n哩\n哪\n哭\n哮\n哲\n哺\n哼\n哽\n唁\n唄\n唆\n唇\n唉\n唏\n唐\n唑\n唔\n唠\n唤\n唧\n唬\n售\n唯\n唰\n唱\n唳\n唷\n唸\n唾\n啃\n啄\n商\n啉\n啊\n問\n啓\n啕\n啖\n啜\n啞\n啟\n啡\n啤\n啥\n啦\n啧\n啪\n啫\n啬\n啮\n啰\n啱\n啲\n啵\n啶\n啷\n啸\n啻\n啼\n啾\n喀\n喂\n喃\n善\n喆\n喇\n喉\n喊\n喋\n喎\n喏\n喔\n喘\n喙\n喚\n喜\n喝\n喟\n喧\n喪\n喫\n喬\n單\n喰\n喱\n喲\n喳\n喵\n営\n喷\n喹\n喺\n喻\n喽\n嗅\n嗆\n嗇\n嗎\n嗑\n嗒\n嗓\n嗔\n嗖\n嗚\n嗜\n嗝\n嗟\n嗡\n嗣\n嗤\n嗦\n嗨\n嗪\n嗬\n嗯\n嗰\n嗲\n嗳\n嗶\n嗷\n嗽\n嘀\n嘅\n嘆\n嘈\n嘉\n嘌\n嘍\n嘎\n嘔\n嘖\n嘗\n嘘\n嘚\n嘛\n嘜\n嘞\n嘟\n嘢\n嘣\n嘤\n嘧\n嘩\n嘭\n嘮\n嘯\n嘰\n嘱\n嘲\n嘴\n嘶\n嘸\n嘹\
n嘻\n嘿\n噁\n噌\n噎\n噓\n噔\n噗\n噙\n噜\n噠\n噢\n噤\n器\n噩\n噪\n噬\n噱\n噴\n噶\n噸\n噹\n噻\n噼\n嚀\n嚇\n嚎\n嚏\n嚐\n嚓\n嚕\n嚟\n嚣\n嚥\n嚨\n嚮\n嚴\n嚷\n嚼\n囂\n囉\n囊\n囍\n囑\n囔\n囗\n囚\n四\n囝\n回\n囟\n因\n囡\n团\n団\n囤\n囧\n囪\n囫\n园\n困\n囱\n囲\n図\n围\n囹\n固\n国\n图\n囿\n圃\n圄\n圆\n圈\n國\n圍\n圏\n園\n圓\n圖\n團\n圜\n土\n圣\n圧\n在\n圩\n圭\n地\n圳\n场\n圻\n圾\n址\n坂\n均\n坊\n坍\n坎\n坏\n坐\n坑\n块\n坚\n坛\n坝\n坞\n坟\n坠\n坡\n坤\n坦\n坨\n坪\n坯\n坳\n坵\n坷\n垂\n垃\n垄\n型\n垒\n垚\n垛\n垠\n垢\n垣\n垦\n垩\n垫\n垭\n垮\n垵\n埂\n埃\n埋\n城\n埔\n埕\n埗\n域\n埠\n埤\n埵\n執\n埸\n培\n基\n埼\n堀\n堂\n堃\n堅\n堆\n堇\n堑\n堕\n堙\n堡\n堤\n堪\n堯\n堰\n報\n場\n堵\n堺\n堿\n塊\n塌\n塑\n塔\n塗\n塘\n塚\n塞\n塢\n塩\n填\n塬\n塭\n塵\n塾\n墀\n境\n墅\n墉\n墊\n墒\n墓\n増\n墘\n墙\n墜\n增\n墟\n墨\n墩\n墮\n墳\n墻\n墾\n壁\n壅\n壆\n壇\n壊\n壑\n壓\n壕\n壘\n壞\n壟\n壢\n壤\n壩\n士\n壬\n壮\n壯\n声\n売\n壳\n壶\n壹\n壺\n壽\n处\n备\n変\n复\n夏\n夔\n夕\n外\n夙\n多\n夜\n够\n夠\n夢\n夥\n大\n天\n太\n夫\n夭\n央\n夯\n失\n头\n夷\n夸\n夹\n夺\n夾\n奂\n奄\n奇\n奈\n奉\n奋\n奎\n奏\n奐\n契\n奔\n奕\n奖\n套\n奘\n奚\n奠\n奢\n奥\n奧\n奪\n奬\n奮\n女\n奴\n奶\n奸\n她\n好\n如\n妃\n妄\n妆\n妇\n妈\n妊\n妍\n妒\n妓\n妖\n妘\n妙\n妝\n妞\n妣\n妤\n妥\n妨\n妩\n妪\n妮\n妲\n妳\n妹\n妻\n妾\n姆\n姉\n姊\n始\n姍\n姐\n姑\n姒\n姓\n委\n姗\n姚\n姜\n姝\n姣\n姥\n姦\n姨\n姪\n姫\n姬\n姹\n姻\n姿\n威\n娃\n娄\n娅\n娆\n娇\n娉\n娑\n娓\n娘\n娛\n娜\n娟\n娠\n娣\n娥\n娩\n娱\n娲\n娴\n娶\n娼\n婀\n婁\n婆\n婉\n婊\n婕\n婚\n婢\n婦\n婧\n婪\n婭\n婴\n婵\n婶\n婷\n婺\n婿\n媒\n媚\n媛\n媞\n媧\n媲\n媳\n媽\n媾\n嫁\n嫂\n嫉\n嫌\n嫑\n嫔\n嫖\n嫘\n嫚\n嫡\n嫣\n嫦\n嫩\n嫲\n嫵\n嫻\n嬅\n嬉\n嬌\n嬗\n嬛\n嬢\n嬤\n嬪\n嬰\n嬴\n嬷\n嬸\n嬿\n孀\n孃\n子\n孑\n孔\n孕\n孖\n字\n存\n孙\n孚\n孛\n孜\n孝\n孟\n孢\n季\n孤\n学\n孩\n孪\n孫\n孬\n孰\n孱\n孳\n孵\n學\n孺\n孽\n孿\n宁\n它\n宅\n宇\n守\n安\n宋\n完\n宏\n宓\n宕\n宗\n官\n宙\n定\n宛\n宜\n宝\n实\n実\n宠\n审\n客\n宣\n室\n宥\n宦\n宪\n宫\n宮\n宰\n害\n宴\n宵\n家\n宸\n容\n宽\n宾\n宿\n寂\n寄\n寅\n密\n寇\n富\n寐\n寒\n寓\n寛\n寝\n寞\n察\n寡\n寢\n寥\n實\n寧\n寨\n審\n寫\n寬\n寮\n寰\n寵\n寶\n寸\n对\n寺\n寻\n导\n対\n寿\n封\n専\n射\n将\n將\n專\n尉\n尊\n尋\n對\n導\n小\n少\n尔\n尕\n尖\n尘\n尚\n尝\n尤\n尧\n尬\n就\n尴\n尷\n尸\n尹\n尺\n尻\n尼\n尽\n尾\n尿\n局\n屁\n层\n屄\n居\n屆\n屈\n屉\n届\n屋\n屌\n屍\n屎\n屏\n屐\n屑\n展\n屜\n属\n屠\n屡\n屢\n層\n履\n屬\n屯\n山\n屹\n屿\n岀\n岁\n岂\n岌\n岐\n岑\n岔\n岖\n岗\n岘\n岙\n岚\n岛\n岡\n岩\n岫\n岬\n岭\n岱\n岳\n岷\n岸\n峇\n峋\n峒\n峙\n峡\n峤\n峥\n峦\n峨\n峪\n峭\n峯\n峰\n峴\n島\n峻\n峽\n崁\n崂\n崆\n崇\n崎\n崑\n崔\n崖\n崗\n崙\n崛\n崧\n崩\n崭\n崴\n崽\n嵇\n嵊\n嵋\n嵌\n嵐\n嵘\n嵩\n嵬\n嵯\n嶂\n嶄\n嶇\n嶋\n嶙\n嶺\n嶼\n嶽\n巅\n巍\n巒\n巔\n巖\n川\n州\n巡\n巢\n工\n左\n巧\n巨\n巩
\n巫\n差\n己\n已\n巳\n巴\n巷\n巻\n巽\n巾\n巿\n币\n市\n布\n帅\n帆\n师\n希\n帐\n帑\n帕\n帖\n帘\n帚\n帛\n帜\n帝\n帥\n带\n帧\n師\n席\n帮\n帯\n帰\n帳\n帶\n帷\n常\n帼\n帽\n幀\n幂\n幄\n幅\n幌\n幔\n幕\n幟\n幡\n幢\n幣\n幫\n干\n平\n年\n并\n幸\n幹\n幺\n幻\n幼\n幽\n幾\n广\n庁\n広\n庄\n庆\n庇\n床\n序\n庐\n库\n应\n底\n庖\n店\n庙\n庚\n府\n庞\n废\n庠\n度\n座\n庫\n庭\n庵\n庶\n康\n庸\n庹\n庾\n廁\n廂\n廃\n廈\n廉\n廊\n廓\n廖\n廚\n廝\n廟\n廠\n廢\n廣\n廬\n廳\n延\n廷\n建\n廿\n开\n弁\n异\n弃\n弄\n弈\n弊\n弋\n式\n弑\n弒\n弓\n弔\n引\n弗\n弘\n弛\n弟\n张\n弥\n弦\n弧\n弩\n弭\n弯\n弱\n張\n強\n弹\n强\n弼\n弾\n彅\n彆\n彈\n彌\n彎\n归\n当\n录\n彗\n彙\n彝\n形\n彤\n彥\n彦\n彧\n彩\n彪\n彫\n彬\n彭\n彰\n影\n彷\n役\n彻\n彼\n彿\n往\n征\n径\n待\n徇\n很\n徉\n徊\n律\n後\n徐\n徑\n徒\n従\n徕\n得\n徘\n徙\n徜\n從\n徠\n御\n徨\n復\n循\n徬\n微\n徳\n徴\n徵\n德\n徹\n徼\n徽\n心\n必\n忆\n忌\n忍\n忏\n忐\n忑\n忒\n忖\n志\n忘\n忙\n応\n忠\n忡\n忤\n忧\n忪\n快\n忱\n念\n忻\n忽\n忿\n怀\n态\n怂\n怅\n怆\n怎\n怏\n怒\n怔\n怕\n怖\n怙\n怜\n思\n怠\n怡\n急\n怦\n性\n怨\n怪\n怯\n怵\n总\n怼\n恁\n恃\n恆\n恋\n恍\n恐\n恒\n恕\n恙\n恚\n恢\n恣\n恤\n恥\n恨\n恩\n恪\n恫\n恬\n恭\n息\n恰\n恳\n恵\n恶\n恸\n恺\n恻\n恼\n恿\n悄\n悅\n悉\n悌\n悍\n悔\n悖\n悚\n悟\n悠\n患\n悦\n您\n悩\n悪\n悬\n悯\n悱\n悲\n悴\n悵\n悶\n悸\n悻\n悼\n悽\n情\n惆\n惇\n惊\n惋\n惑\n惕\n惘\n惚\n惜\n惟\n惠\n惡\n惦\n惧\n惨\n惩\n惫\n惬\n惭\n惮\n惯\n惰\n惱\n想\n惴\n惶\n惹\n惺\n愁\n愆\n愈\n愉\n愍\n意\n愕\n愚\n愛\n愜\n感\n愣\n愤\n愧\n愫\n愷\n愿\n慄\n慈\n態\n慌\n慎\n慑\n慕\n慘\n慚\n慟\n慢\n慣\n慧\n慨\n慫\n慮\n慰\n慳\n慵\n慶\n慷\n慾\n憂\n憊\n憋\n憎\n憐\n憑\n憔\n憚\n憤\n憧\n憨\n憩\n憫\n憬\n憲\n憶\n憾\n懂\n懇\n懈\n應\n懊\n懋\n懑\n懒\n懦\n懲\n懵\n懶\n懷\n懸\n懺\n懼\n懾\n懿\n戀\n戈\n戊\n戌\n戍\n戎\n戏\n成\n我\n戒\n戕\n或\n战\n戚\n戛\n戟\n戡\n戦\n截\n戬\n戮\n戰\n戲\n戳\n戴\n戶\n户\n戸\n戻\n戾\n房\n所\n扁\n扇\n扈\n扉\n手\n才\n扎\n扑\n扒\n打\n扔\n払\n托\n扛\n扣\n扦\n执\n扩\n扪\n扫\n扬\n扭\n扮\n扯\n扰\n扱\n扳\n扶\n批\n扼\n找\n承\n技\n抄\n抉\n把\n抑\n抒\n抓\n投\n抖\n抗\n折\n抚\n抛\n抜\n択\n抟\n抠\n抡\n抢\n护\n报\n抨\n披\n抬\n抱\n抵\n抹\n押\n抽\n抿\n拂\n拄\n担\n拆\n拇\n拈\n拉\n拋\n拌\n拍\n拎\n拐\n拒\n拓\n拔\n拖\n拗\n拘\n拙\n拚\n招\n拜\n拟\n拡\n拢\n拣\n拥\n拦\n拧\n拨\n择\n括\n拭\n拮\n拯\n拱\n拳\n拴\n拷\n拼\n拽\n拾\n拿\n持\n挂\n指\n挈\n按\n挎\n挑\n挖\n挙\n挚\n挛\n挝\n挞\n挟\n挠\n挡\n挣\n挤\n挥\n挨\n挪\n挫\n振\n挲\n挹\n挺\n挽\n挾\n捂\n捅\n捆\n捉\n捋\n捌\n捍\n捎\n捏\n捐\n捕\n捞\n损\n捡\n换\n捣\n捧\n捨\n捩\n据\n捱\n捲\n捶\n捷\n捺\n捻\n掀\n掂\n掃\n掇\n授\n掉\n掌\n掏\n掐\n排\n掖\n掘\n掙\n掛\n掠\n採\n探\n掣\n接\n控\n推\n掩\n措\n掬\n掰\n掲\n掳\n掴\n掷\n掸\n掺\n揀\n揃\n揄\n揆\n揉\n揍\n描\n提\n插\n揖\n揚\n換\n握\n揣\n揩\n揪\n揭\n揮\n援\n揶\n揸\n揹\n揽\n搀\n搁\n搂\n搅\n
損\n搏\n搐\n搓\n搔\n搖\n搗\n搜\n搞\n搡\n搪\n搬\n搭\n搵\n搶\n携\n搽\n摀\n摁\n摄\n摆\n摇\n摈\n摊\n摒\n摔\n摘\n摞\n摟\n摧\n摩\n摯\n摳\n摸\n摹\n摺\n摻\n撂\n撃\n撅\n撇\n撈\n撐\n撑\n撒\n撓\n撕\n撚\n撞\n撤\n撥\n撩\n撫\n撬\n播\n撮\n撰\n撲\n撵\n撷\n撸\n撻\n撼\n撿\n擀\n擁\n擂\n擄\n擅\n擇\n擊\n擋\n操\n擎\n擒\n擔\n擘\n據\n擞\n擠\n擡\n擢\n擦\n擬\n擰\n擱\n擲\n擴\n擷\n擺\n擼\n擾\n攀\n攏\n攒\n攔\n攘\n攙\n攜\n攝\n攞\n攢\n攣\n攤\n攥\n攪\n攫\n攬\n支\n收\n攸\n改\n攻\n放\n政\n故\n效\n敌\n敍\n敎\n敏\n救\n敕\n敖\n敗\n敘\n教\n敛\n敝\n敞\n敢\n散\n敦\n敬\n数\n敲\n整\n敵\n敷\n數\n斂\n斃\n文\n斋\n斌\n斎\n斐\n斑\n斓\n斗\n料\n斛\n斜\n斟\n斡\n斤\n斥\n斧\n斩\n斫\n斬\n断\n斯\n新\n斷\n方\n於\n施\n旁\n旃\n旅\n旋\n旌\n旎\n族\n旖\n旗\n无\n既\n日\n旦\n旧\n旨\n早\n旬\n旭\n旮\n旱\n时\n旷\n旺\n旻\n昀\n昂\n昆\n昇\n昉\n昊\n昌\n明\n昏\n易\n昔\n昕\n昙\n星\n映\n春\n昧\n昨\n昭\n是\n昱\n昴\n昵\n昶\n昼\n显\n晁\n時\n晃\n晉\n晋\n晌\n晏\n晒\n晓\n晔\n晕\n晖\n晗\n晚\n晝\n晞\n晟\n晤\n晦\n晨\n晩\n普\n景\n晰\n晴\n晶\n晷\n智\n晾\n暂\n暄\n暇\n暈\n暉\n暌\n暐\n暑\n暖\n暗\n暝\n暢\n暧\n暨\n暫\n暮\n暱\n暴\n暸\n暹\n曄\n曆\n曇\n曉\n曖\n曙\n曜\n曝\n曠\n曦\n曬\n曰\n曲\n曳\n更\n書\n曹\n曼\n曾\n替\n最\n會\n月\n有\n朋\n服\n朐\n朔\n朕\n朗\n望\n朝\n期\n朦\n朧\n木\n未\n末\n本\n札\n朮\n术\n朱\n朴\n朵\n机\n朽\n杀\n杂\n权\n杆\n杈\n杉\n李\n杏\n材\n村\n杓\n杖\n杜\n杞\n束\n杠\n条\n来\n杨\n杭\n杯\n杰\n東\n杳\n杵\n杷\n杼\n松\n板\n极\n构\n枇\n枉\n枋\n析\n枕\n林\n枚\n果\n枝\n枢\n枣\n枪\n枫\n枭\n枯\n枰\n枱\n枳\n架\n枷\n枸\n柄\n柏\n某\n柑\n柒\n染\n柔\n柘\n柚\n柜\n柞\n柠\n柢\n查\n柩\n柬\n柯\n柱\n柳\n柴\n柵\n査\n柿\n栀\n栃\n栄\n栅\n标\n栈\n栉\n栋\n栎\n栏\n树\n栓\n栖\n栗\n校\n栩\n株\n样\n核\n根\n格\n栽\n栾\n桀\n桁\n桂\n桃\n桅\n框\n案\n桉\n桌\n桎\n桐\n桑\n桓\n桔\n桜\n桠\n桡\n桢\n档\n桥\n桦\n桧\n桨\n桩\n桶\n桿\n梁\n梅\n梆\n梏\n梓\n梗\n條\n梟\n梢\n梦\n梧\n梨\n梭\n梯\n械\n梳\n梵\n梶\n检\n棂\n棄\n棉\n棋\n棍\n棒\n棕\n棗\n棘\n棚\n棟\n棠\n棣\n棧\n森\n棱\n棲\n棵\n棹\n棺\n椁\n椅\n椋\n植\n椎\n椒\n検\n椪\n椭\n椰\n椹\n椽\n椿\n楂\n楊\n楓\n楔\n楚\n楝\n楞\n楠\n楣\n楨\n楫\n業\n楮\n極\n楷\n楸\n楹\n楼\n楽\n概\n榄\n榆\n榈\n榉\n榔\n榕\n榖\n榛\n榜\n榨\n榫\n榭\n榮\n榱\n榴\n榷\n榻\n槁\n槃\n構\n槌\n槍\n槎\n槐\n槓\n様\n槛\n槟\n槤\n槭\n槲\n槳\n槻\n槽\n槿\n樁\n樂\n樊\n樑\n樓\n標\n樞\n樟\n模\n樣\n権\n横\n樫\n樯\n樱\n樵\n樸\n樹\n樺\n樽\n樾\n橄\n橇\n橋\n橐\n橘\n橙\n機\n橡\n橢\n橫\n橱\n橹\n橼\n檀\n檄\n檎\n檐\n檔\n檗\n檜\n檢\n檬\n檯\n檳\n檸\n檻\n櫃\n櫚\n櫛\n櫥\n櫸\n櫻\n欄\n權\n欒\n欖\n欠\n次\n欢\n欣\n欧\n欲\n欸\n欺\n欽\n款\n歆\n歇\n歉\n歌\n歎\n歐\n歓\n歙\n歛\n歡\n止\n正\n此\n步\n武\n歧\n歩\n歪\n歯\n歲\n歳\n歴\n歷\n歸\n歹\n死\n歼\n殁\n殃\n殆\n殇\n殉\n殊\n残\n殒\n殓\n殖\n殘\n殞\n殡\n殤\n殭\n殯\n殲\n殴\n段\n殷\n殺\n殼\n殿\n毀\n毁\n毂\n毅\n毆\
n毋\n母\n毎\n每\n毒\n毓\n比\n毕\n毗\n毘\n毙\n毛\n毡\n毫\n毯\n毽\n氈\n氏\n氐\n民\n氓\n气\n氖\n気\n氙\n氛\n氟\n氡\n氢\n氣\n氤\n氦\n氧\n氨\n氪\n氫\n氮\n氯\n氰\n氲\n水\n氷\n永\n氹\n氾\n汀\n汁\n求\n汆\n汇\n汉\n汎\n汐\n汕\n汗\n汙\n汛\n汝\n汞\n江\n池\n污\n汤\n汨\n汩\n汪\n汰\n汲\n汴\n汶\n汹\n決\n汽\n汾\n沁\n沂\n沃\n沅\n沈\n沉\n沌\n沏\n沐\n沒\n沓\n沖\n沙\n沛\n沟\n没\n沢\n沣\n沥\n沦\n沧\n沪\n沫\n沭\n沮\n沱\n河\n沸\n油\n治\n沼\n沽\n沾\n沿\n況\n泄\n泉\n泊\n泌\n泓\n法\n泗\n泛\n泞\n泠\n泡\n波\n泣\n泥\n注\n泪\n泫\n泮\n泯\n泰\n泱\n泳\n泵\n泷\n泸\n泻\n泼\n泽\n泾\n洁\n洄\n洋\n洒\n洗\n洙\n洛\n洞\n津\n洩\n洪\n洮\n洱\n洲\n洵\n洶\n洸\n洹\n活\n洼\n洽\n派\n流\n浃\n浄\n浅\n浆\n浇\n浊\n测\n济\n浏\n浑\n浒\n浓\n浔\n浙\n浚\n浜\n浣\n浦\n浩\n浪\n浬\n浮\n浯\n浴\n海\n浸\n涂\n涅\n涇\n消\n涉\n涌\n涎\n涓\n涔\n涕\n涙\n涛\n涝\n涞\n涟\n涠\n涡\n涣\n涤\n润\n涧\n涨\n涩\n涪\n涮\n涯\n液\n涵\n涸\n涼\n涿\n淀\n淄\n淅\n淆\n淇\n淋\n淌\n淑\n淒\n淖\n淘\n淙\n淚\n淞\n淡\n淤\n淦\n淨\n淩\n淪\n淫\n淬\n淮\n深\n淳\n淵\n混\n淹\n淺\n添\n淼\n清\n済\n渉\n渊\n渋\n渍\n渎\n渐\n渔\n渗\n渙\n渚\n減\n渝\n渠\n渡\n渣\n渤\n渥\n渦\n温\n測\n渭\n港\n渲\n渴\n游\n渺\n渾\n湃\n湄\n湊\n湍\n湖\n湘\n湛\n湟\n湧\n湫\n湮\n湯\n湳\n湾\n湿\n満\n溃\n溅\n溉\n溏\n源\n準\n溜\n溝\n溟\n溢\n溥\n溧\n溪\n溫\n溯\n溱\n溴\n溶\n溺\n溼\n滁\n滂\n滄\n滅\n滇\n滋\n滌\n滑\n滓\n滔\n滕\n滙\n滚\n滝\n滞\n滟\n满\n滢\n滤\n滥\n滦\n滨\n滩\n滬\n滯\n滲\n滴\n滷\n滸\n滾\n滿\n漁\n漂\n漆\n漉\n漏\n漓\n演\n漕\n漠\n漢\n漣\n漩\n漪\n漫\n漬\n漯\n漱\n漲\n漳\n漸\n漾\n漿\n潆\n潇\n潋\n潍\n潑\n潔\n潘\n潛\n潜\n潞\n潟\n潢\n潤\n潦\n潧\n潭\n潮\n潰\n潴\n潸\n潺\n潼\n澀\n澄\n澆\n澈\n澍\n澎\n澗\n澜\n澡\n澤\n澧\n澱\n澳\n澹\n激\n濁\n濂\n濃\n濑\n濒\n濕\n濘\n濛\n濟\n濠\n濡\n濤\n濫\n濬\n濮\n濯\n濱\n濺\n濾\n瀅\n瀆\n瀉\n瀋\n瀏\n瀑\n瀕\n瀘\n瀚\n瀛\n瀝\n瀞\n瀟\n瀧\n瀨\n瀬\n瀰\n瀾\n灌\n灏\n灑\n灘\n灝\n灞\n灣\n火\n灬\n灭\n灯\n灰\n灵\n灶\n灸\n灼\n災\n灾\n灿\n炀\n炁\n炅\n炉\n炊\n炎\n炒\n炔\n炕\n炖\n炙\n炜\n炫\n炬\n炭\n炮\n炯\n炳\n炷\n炸\n点\n為\n炼\n炽\n烁\n烂\n烃\n烈\n烊\n烏\n烘\n烙\n烛\n烟\n烤\n烦\n烧\n烨\n烩\n烫\n烬\n热\n烯\n烷\n烹\n烽\n焉\n焊\n焕\n焖\n焗\n焘\n焙\n焚\n焜\n無\n焦\n焯\n焰\n焱\n然\n焼\n煅\n煉\n煊\n煌\n煎\n煒\n煖\n煙\n煜\n煞\n煤\n煥\n煦\n照\n煨\n煩\n煮\n煲\n煸\n煽\n熄\n熊\n熏\n熒\n熔\n熙\n熟\n熠\n熨\n熬\n熱\n熵\n熹\n熾\n燁\n燃\n燄\n燈\n燉\n燊\n燎\n燒\n燔\n燕\n燙\n燜\n營\n燥\n燦\n燧\n燭\n燮\n燴\n燻\n燼\n燿\n爆\n爍\n爐\n爛\n爪\n爬\n爭\n爰\n爱\n爲\n爵\n父\n爷\n爸\n爹\n爺\n爻\n爽\n爾\n牆\n片\n版\n牌\n牍\n牒\n牙\n牛\n牝\n牟\n牠\n牡\n牢\n牦\n牧\n物\n牯\n牲\n牴\n牵\n特\n牺\n牽\n犀\n犁\n犄\n犊\n犍\n犒\n犢\n犧\n犬\n犯\n状\n犷\n犸\n犹\n狀\n狂\n狄\n狈\n狎\n狐\n狒\n狗\n狙\n狞\n狠\n狡\n狩\n独\n狭\n狮\n狰\n狱\n狸\n狹\n狼\n狽\n猎\n猕\n猖\n猗\n猙\n猛\n猜\n猝\n猥\n猩\n猪
\n猫\n猬\n献\n猴\n猶\n猷\n猾\n猿\n獄\n獅\n獎\n獐\n獒\n獗\n獠\n獣\n獨\n獭\n獰\n獲\n獵\n獷\n獸\n獺\n獻\n獼\n獾\n玄\n率\n玉\n王\n玑\n玖\n玛\n玟\n玠\n玥\n玩\n玫\n玮\n环\n现\n玲\n玳\n玷\n玺\n玻\n珀\n珂\n珅\n珈\n珉\n珊\n珍\n珏\n珐\n珑\n珙\n珞\n珠\n珣\n珥\n珩\n珪\n班\n珮\n珲\n珺\n現\n球\n琅\n理\n琇\n琉\n琊\n琍\n琏\n琐\n琛\n琢\n琥\n琦\n琨\n琪\n琬\n琮\n琰\n琲\n琳\n琴\n琵\n琶\n琺\n琼\n瑀\n瑁\n瑄\n瑋\n瑕\n瑗\n瑙\n瑚\n瑛\n瑜\n瑞\n瑟\n瑠\n瑣\n瑤\n瑩\n瑪\n瑯\n瑰\n瑶\n瑾\n璀\n璁\n璃\n璇\n璉\n璋\n璎\n璐\n璜\n璞\n璟\n璧\n璨\n環\n璽\n璿\n瓊\n瓏\n瓒\n瓜\n瓢\n瓣\n瓤\n瓦\n瓮\n瓯\n瓴\n瓶\n瓷\n甄\n甌\n甕\n甘\n甙\n甚\n甜\n生\n產\n産\n甥\n甦\n用\n甩\n甫\n甬\n甭\n甯\n田\n由\n甲\n申\n电\n男\n甸\n町\n画\n甾\n畀\n畅\n界\n畏\n畑\n畔\n留\n畜\n畝\n畢\n略\n畦\n番\n畫\n異\n畲\n畳\n畴\n當\n畸\n畹\n畿\n疆\n疇\n疊\n疏\n疑\n疔\n疖\n疗\n疙\n疚\n疝\n疟\n疡\n疣\n疤\n疥\n疫\n疮\n疯\n疱\n疲\n疳\n疵\n疸\n疹\n疼\n疽\n疾\n痂\n病\n症\n痈\n痉\n痊\n痍\n痒\n痔\n痕\n痘\n痙\n痛\n痞\n痠\n痢\n痣\n痤\n痧\n痨\n痪\n痫\n痰\n痱\n痴\n痹\n痺\n痼\n痿\n瘀\n瘁\n瘋\n瘍\n瘓\n瘘\n瘙\n瘟\n瘠\n瘡\n瘢\n瘤\n瘦\n瘧\n瘩\n瘪\n瘫\n瘴\n瘸\n瘾\n療\n癇\n癌\n癒\n癖\n癜\n癞\n癡\n癢\n癣\n癥\n癫\n癬\n癮\n癱\n癲\n癸\n発\n登\n發\n白\n百\n皂\n的\n皆\n皇\n皈\n皋\n皎\n皑\n皓\n皖\n皙\n皚\n皮\n皰\n皱\n皴\n皺\n皿\n盂\n盃\n盅\n盆\n盈\n益\n盎\n盏\n盐\n监\n盒\n盔\n盖\n盗\n盘\n盛\n盜\n盞\n盟\n盡\n監\n盤\n盥\n盧\n盪\n目\n盯\n盱\n盲\n直\n相\n盹\n盼\n盾\n省\n眈\n眉\n看\n県\n眙\n眞\n真\n眠\n眦\n眨\n眩\n眯\n眶\n眷\n眸\n眺\n眼\n眾\n着\n睁\n睇\n睏\n睐\n睑\n睛\n睜\n睞\n睡\n睢\n督\n睥\n睦\n睨\n睪\n睫\n睬\n睹\n睽\n睾\n睿\n瞄\n瞅\n瞇\n瞋\n瞌\n瞎\n瞑\n瞒\n瞓\n瞞\n瞟\n瞠\n瞥\n瞧\n瞩\n瞪\n瞬\n瞭\n瞰\n瞳\n瞻\n瞼\n瞿\n矇\n矍\n矗\n矚\n矛\n矜\n矢\n矣\n知\n矩\n矫\n短\n矮\n矯\n石\n矶\n矽\n矾\n矿\n码\n砂\n砌\n砍\n砒\n研\n砖\n砗\n砚\n砝\n砣\n砥\n砧\n砭\n砰\n砲\n破\n砷\n砸\n砺\n砼\n砾\n础\n硅\n硐\n硒\n硕\n硝\n硫\n硬\n确\n硯\n硼\n碁\n碇\n碉\n碌\n碍\n碎\n碑\n碓\n碗\n碘\n碚\n碛\n碟\n碣\n碧\n碩\n碰\n碱\n碳\n碴\n確\n碼\n碾\n磁\n磅\n磊\n磋\n磐\n磕\n磚\n磡\n磨\n磬\n磯\n磲\n磷\n磺\n礁\n礎\n礙\n礡\n礦\n礪\n礫\n礴\n示\n礼\n社\n祀\n祁\n祂\n祇\n祈\n祉\n祎\n祐\n祕\n祖\n祗\n祚\n祛\n祜\n祝\n神\n祟\n祠\n祢\n祥\n票\n祭\n祯\n祷\n祸\n祺\n祿\n禀\n禁\n禄\n禅\n禍\n禎\n福\n禛\n禦\n禧\n禪\n禮\n禱\n禹\n禺\n离\n禽\n禾\n禿\n秀\n私\n秃\n秆\n秉\n秋\n种\n科\n秒\n秘\n租\n秣\n秤\n秦\n秧\n秩\n秭\n积\n称\n秸\n移\n秽\n稀\n稅\n程\n稍\n税\n稔\n稗\n稚\n稜\n稞\n稟\n稠\n稣\n種\n稱\n稲\n稳\n稷\n稹\n稻\n稼\n稽\n稿\n穀\n穂\n穆\n穌\n積\n穎\n穗\n穢\n穩\n穫\n穴\n究\n穷\n穹\n空\n穿\n突\n窃\n窄\n窈\n窍\n窑\n窒\n窓\n窕\n窖\n窗\n窘\n窜\n窝\n窟\n窠\n窥\n窦\n窨\n窩\n窪\n窮\n窯\n窺\n窿\n竄\n竅\n竇\n竊\n立\n竖\n站\n竜\n竞\n竟\n章\n竣\n童\n竭\n端\n競\n竹\n竺\n竽\n竿\n笃\n笆\n笈\n笋\n笏\n
笑\n笔\n笙\n笛\n笞\n笠\n符\n笨\n第\n笹\n笺\n笼\n筆\n等\n筊\n筋\n筍\n筏\n筐\n筑\n筒\n答\n策\n筛\n筝\n筠\n筱\n筲\n筵\n筷\n筹\n签\n简\n箇\n箋\n箍\n箏\n箐\n箔\n箕\n算\n箝\n管\n箩\n箫\n箭\n箱\n箴\n箸\n節\n篁\n範\n篆\n篇\n築\n篑\n篓\n篙\n篝\n篠\n篡\n篤\n篩\n篪\n篮\n篱\n篷\n簇\n簌\n簍\n簡\n簦\n簧\n簪\n簫\n簷\n簸\n簽\n簾\n簿\n籁\n籃\n籌\n籍\n籐\n籟\n籠\n籤\n籬\n籮\n籲\n米\n类\n籼\n籽\n粄\n粉\n粑\n粒\n粕\n粗\n粘\n粟\n粤\n粥\n粧\n粪\n粮\n粱\n粲\n粳\n粵\n粹\n粼\n粽\n精\n粿\n糅\n糊\n糍\n糕\n糖\n糗\n糙\n糜\n糞\n糟\n糠\n糧\n糬\n糯\n糰\n糸\n系\n糾\n紀\n紂\n約\n紅\n紉\n紊\n紋\n納\n紐\n紓\n純\n紗\n紘\n紙\n級\n紛\n紜\n素\n紡\n索\n紧\n紫\n紮\n累\n細\n紳\n紹\n紺\n終\n絃\n組\n絆\n経\n結\n絕\n絞\n絡\n絢\n給\n絨\n絮\n統\n絲\n絳\n絵\n絶\n絹\n綁\n綏\n綑\n經\n継\n続\n綜\n綠\n綢\n綦\n綫\n綬\n維\n綱\n網\n綴\n綵\n綸\n綺\n綻\n綽\n綾\n綿\n緊\n緋\n総\n緑\n緒\n緘\n線\n緝\n緞\n締\n緣\n編\n緩\n緬\n緯\n練\n緹\n緻\n縁\n縄\n縈\n縛\n縝\n縣\n縫\n縮\n縱\n縴\n縷\n總\n績\n繁\n繃\n繆\n繇\n繋\n織\n繕\n繚\n繞\n繡\n繩\n繪\n繫\n繭\n繳\n繹\n繼\n繽\n纂\n續\n纍\n纏\n纓\n纔\n纖\n纜\n纠\n红\n纣\n纤\n约\n级\n纨\n纪\n纫\n纬\n纭\n纯\n纰\n纱\n纲\n纳\n纵\n纶\n纷\n纸\n纹\n纺\n纽\n纾\n线\n绀\n练\n组\n绅\n细\n织\n终\n绊\n绍\n绎\n经\n绑\n绒\n结\n绔\n绕\n绘\n给\n绚\n绛\n络\n绝\n绞\n统\n绡\n绢\n绣\n绥\n绦\n继\n绩\n绪\n绫\n续\n绮\n绯\n绰\n绳\n维\n绵\n绶\n绷\n绸\n绻\n综\n绽\n绾\n绿\n缀\n缄\n缅\n缆\n缇\n缈\n缉\n缎\n缓\n缔\n缕\n编\n缘\n缙\n缚\n缜\n缝\n缠\n缢\n缤\n缥\n缨\n缩\n缪\n缭\n缮\n缰\n缱\n缴\n缸\n缺\n缽\n罂\n罄\n罌\n罐\n网\n罔\n罕\n罗\n罚\n罡\n罢\n罩\n罪\n置\n罰\n署\n罵\n罷\n罹\n羁\n羅\n羈\n羊\n羌\n美\n羔\n羚\n羞\n羟\n羡\n羣\n群\n羥\n羧\n羨\n義\n羯\n羲\n羸\n羹\n羽\n羿\n翁\n翅\n翊\n翌\n翎\n習\n翔\n翘\n翟\n翠\n翡\n翦\n翩\n翰\n翱\n翳\n翹\n翻\n翼\n耀\n老\n考\n耄\n者\n耆\n耋\n而\n耍\n耐\n耒\n耕\n耗\n耘\n耙\n耦\n耨\n耳\n耶\n耷\n耸\n耻\n耽\n耿\n聂\n聆\n聊\n聋\n职\n聒\n联\n聖\n聘\n聚\n聞\n聪\n聯\n聰\n聲\n聳\n聴\n聶\n職\n聽\n聾\n聿\n肃\n肄\n肅\n肆\n肇\n肉\n肋\n肌\n肏\n肓\n肖\n肘\n肚\n肛\n肝\n肠\n股\n肢\n肤\n肥\n肩\n肪\n肮\n肯\n肱\n育\n肴\n肺\n肽\n肾\n肿\n胀\n胁\n胃\n胄\n胆\n背\n胍\n胎\n胖\n胚\n胛\n胜\n胝\n胞\n胡\n胤\n胥\n胧\n胫\n胭\n胯\n胰\n胱\n胳\n胴\n胶\n胸\n胺\n能\n脂\n脅\n脆\n脇\n脈\n脉\n脊\n脍\n脏\n脐\n脑\n脓\n脖\n脘\n脚\n脛\n脣\n脩\n脫\n脯\n脱\n脲\n脳\n脸\n脹\n脾\n腆\n腈\n腊\n腋\n腌\n腎\n腐\n腑\n腓\n腔\n腕\n腥\n腦\n腩\n腫\n腭\n腮\n腰\n腱\n腳\n腴\n腸\n腹\n腺\n腻\n腼\n腾\n腿\n膀\n膈\n膊\n膏\n膑\n膘\n膚\n膛\n膜\n膝\n膠\n膦\n膨\n膩\n膳\n膺\n膻\n膽\n膾\n膿\n臀\n臂\n臃\n臆\n臉\n臊\n臍\n臓\n臘\n臟\n臣\n臥\n臧\n臨\n自\n臬\n臭\n至\n致\n臺\n臻\n臼\n臾\n舀\n舂\n舅\n舆\n與\n興\n舉\n舊\n舌\n舍\n舎\n舐\n舒\n舔\n舖\n舗\n舛\n舜\n舞\n舟\n航\n舫\n般\n舰\n舱\n舵\n舶\n舷\n舸\n船\n舺\n舾\n艇\n艋\n艘\n艙\
n艦\n艮\n良\n艰\n艱\n色\n艳\n艷\n艹\n艺\n艾\n节\n芃\n芈\n芊\n芋\n芍\n芎\n芒\n芙\n芜\n芝\n芡\n芥\n芦\n芩\n芪\n芫\n芬\n芭\n芮\n芯\n花\n芳\n芷\n芸\n芹\n芻\n芽\n芾\n苁\n苄\n苇\n苋\n苍\n苏\n苑\n苒\n苓\n苔\n苕\n苗\n苛\n苜\n苞\n苟\n苡\n苣\n若\n苦\n苫\n苯\n英\n苷\n苹\n苻\n茁\n茂\n范\n茄\n茅\n茉\n茎\n茏\n茗\n茜\n茧\n茨\n茫\n茬\n茭\n茯\n茱\n茲\n茴\n茵\n茶\n茸\n茹\n茼\n荀\n荃\n荆\n草\n荊\n荏\n荐\n荒\n荔\n荖\n荘\n荚\n荞\n荟\n荠\n荡\n荣\n荤\n荥\n荧\n荨\n荪\n荫\n药\n荳\n荷\n荸\n荻\n荼\n荽\n莅\n莆\n莉\n莊\n莎\n莒\n莓\n莖\n莘\n莞\n莠\n莢\n莧\n莪\n莫\n莱\n莲\n莴\n获\n莹\n莺\n莽\n莿\n菀\n菁\n菅\n菇\n菈\n菊\n菌\n菏\n菓\n菖\n菘\n菜\n菟\n菠\n菡\n菩\n華\n菱\n菲\n菸\n菽\n萁\n萃\n萄\n萊\n萋\n萌\n萍\n萎\n萘\n萝\n萤\n营\n萦\n萧\n萨\n萩\n萬\n萱\n萵\n萸\n萼\n落\n葆\n葉\n著\n葚\n葛\n葡\n董\n葦\n葩\n葫\n葬\n葭\n葯\n葱\n葳\n葵\n葷\n葺\n蒂\n蒋\n蒐\n蒔\n蒙\n蒜\n蒞\n蒟\n蒡\n蒨\n蒲\n蒸\n蒹\n蒻\n蒼\n蒿\n蓁\n蓄\n蓆\n蓉\n蓋\n蓑\n蓓\n蓖\n蓝\n蓟\n蓦\n蓬\n蓮\n蓼\n蓿\n蔑\n蔓\n蔔\n蔗\n蔘\n蔚\n蔡\n蔣\n蔥\n蔫\n蔬\n蔭\n蔵\n蔷\n蔺\n蔻\n蔼\n蔽\n蕁\n蕃\n蕈\n蕉\n蕊\n蕎\n蕙\n蕤\n蕨\n蕩\n蕪\n蕭\n蕲\n蕴\n蕻\n蕾\n薄\n薅\n薇\n薈\n薊\n薏\n薑\n薔\n薙\n薛\n薦\n薨\n薩\n薪\n薬\n薯\n薰\n薹\n藉\n藍\n藏\n藐\n藓\n藕\n藜\n藝\n藤\n藥\n藩\n藹\n藻\n藿\n蘆\n蘇\n蘊\n蘋\n蘑\n蘚\n蘭\n蘸\n蘼\n蘿\n虎\n虏\n虐\n虑\n虔\n處\n虚\n虛\n虜\n虞\n號\n虢\n虧\n虫\n虬\n虱\n虹\n虻\n虽\n虾\n蚀\n蚁\n蚂\n蚊\n蚌\n蚓\n蚕\n蚜\n蚝\n蚣\n蚤\n蚩\n蚪\n蚯\n蚱\n蚵\n蛀\n蛆\n蛇\n蛊\n蛋\n蛎\n蛐\n蛔\n蛙\n蛛\n蛟\n蛤\n蛭\n蛮\n蛰\n蛳\n蛹\n蛻\n蛾\n蜀\n蜂\n蜃\n蜆\n蜇\n蜈\n蜊\n蜍\n蜒\n蜓\n蜕\n蜗\n蜘\n蜚\n蜜\n蜡\n蜢\n蜥\n蜱\n蜴\n蜷\n蜻\n蜿\n蝇\n蝈\n蝉\n蝌\n蝎\n蝕\n蝗\n蝙\n蝟\n蝠\n蝦\n蝨\n蝴\n蝶\n蝸\n蝼\n螂\n螃\n融\n螞\n螢\n螨\n螯\n螳\n螺\n蟀\n蟄\n蟆\n蟋\n蟎\n蟑\n蟒\n蟠\n蟬\n蟲\n蟹\n蟻\n蟾\n蠅\n蠍\n蠔\n蠕\n蠛\n蠟\n蠡\n蠢\n蠣\n蠱\n蠶\n蠹\n蠻\n血\n衄\n衅\n衆\n行\n衍\n術\n衔\n街\n衙\n衛\n衝\n衞\n衡\n衢\n衣\n补\n表\n衩\n衫\n衬\n衮\n衰\n衲\n衷\n衹\n衾\n衿\n袁\n袂\n袄\n袅\n袈\n袋\n袍\n袒\n袖\n袜\n袞\n袤\n袪\n被\n袭\n袱\n裁\n裂\n装\n裆\n裊\n裏\n裔\n裕\n裘\n裙\n補\n裝\n裟\n裡\n裤\n裨\n裱\n裳\n裴\n裸\n裹\n製\n裾\n褂\n複\n褐\n褒\n褓\n褔\n褚\n褥\n褪\n褫\n褲\n褶\n褻\n襁\n襄\n襟\n襠\n襪\n襬\n襯\n襲\n西\n要\n覃\n覆\n覇\n見\n規\n覓\n視\n覚\n覦\n覧\n親\n覬\n観\n覷\n覺\n覽\n觀\n见\n观\n规\n觅\n视\n览\n觉\n觊\n觎\n觐\n觑\n角\n觞\n解\n觥\n触\n觸\n言\n訂\n計\n訊\n討\n訓\n訕\n訖\n託\n記\n訛\n訝\n訟\n訣\n訥\n訪\n設\n許\n訳\n訴\n訶\n診\n註\n証\n詆\n詐\n詔\n評\n詛\n詞\n詠\n詡\n詢\n詣\n試\n詩\n詫\n詬\n詭\n詮\n詰\n話\n該\n詳\n詹\n詼\n誅\n誇\n誉\n誌\n認\n誓\n誕\n誘\n語\n誠\n誡\n誣\n誤\n誥\n誦\n誨\n說\n説\n読\n誰\n課\n誹\n誼\n調\n諄\n談\n請\n諏\n諒\n論\n諗\n諜\n諡\n諦\n諧\n諫\n諭\n諮\n諱\n諳\n諷\n諸\n諺\n諾\n謀\n謁\n謂\n謄\n謊\n謎\n謐\n謔\n謗\n謙\n講\n謝\n謠
\n謨\n謬\n謹\n謾\n譁\n證\n譎\n譏\n識\n譙\n譚\n譜\n警\n譬\n譯\n議\n譲\n譴\n護\n譽\n讀\n變\n讓\n讚\n讞\n计\n订\n认\n讥\n讧\n讨\n让\n讪\n讫\n训\n议\n讯\n记\n讲\n讳\n讴\n讶\n讷\n许\n讹\n论\n讼\n讽\n设\n访\n诀\n证\n诃\n评\n诅\n识\n诈\n诉\n诊\n诋\n词\n诏\n译\n试\n诗\n诘\n诙\n诚\n诛\n话\n诞\n诟\n诠\n诡\n询\n诣\n诤\n该\n详\n诧\n诩\n诫\n诬\n语\n误\n诰\n诱\n诲\n说\n诵\n诶\n请\n诸\n诺\n读\n诽\n课\n诿\n谀\n谁\n调\n谄\n谅\n谆\n谈\n谊\n谋\n谌\n谍\n谎\n谏\n谐\n谑\n谒\n谓\n谔\n谕\n谗\n谘\n谙\n谚\n谛\n谜\n谟\n谢\n谣\n谤\n谥\n谦\n谧\n谨\n谩\n谪\n谬\n谭\n谯\n谱\n谲\n谴\n谶\n谷\n豁\n豆\n豇\n豈\n豉\n豊\n豌\n豎\n豐\n豔\n豚\n象\n豢\n豪\n豫\n豬\n豹\n豺\n貂\n貅\n貌\n貓\n貔\n貘\n貝\n貞\n負\n財\n貢\n貧\n貨\n販\n貪\n貫\n責\n貯\n貰\n貳\n貴\n貶\n買\n貸\n費\n貼\n貽\n貿\n賀\n賁\n賂\n賃\n賄\n資\n賈\n賊\n賑\n賓\n賜\n賞\n賠\n賡\n賢\n賣\n賤\n賦\n質\n賬\n賭\n賴\n賺\n購\n賽\n贅\n贈\n贊\n贍\n贏\n贓\n贖\n贛\n贝\n贞\n负\n贡\n财\n责\n贤\n败\n账\n货\n质\n贩\n贪\n贫\n贬\n购\n贮\n贯\n贰\n贱\n贲\n贴\n贵\n贷\n贸\n费\n贺\n贻\n贼\n贾\n贿\n赁\n赂\n赃\n资\n赅\n赈\n赊\n赋\n赌\n赎\n赏\n赐\n赓\n赔\n赖\n赘\n赚\n赛\n赝\n赞\n赠\n赡\n赢\n赣\n赤\n赦\n赧\n赫\n赭\n走\n赳\n赴\n赵\n赶\n起\n趁\n超\n越\n趋\n趕\n趙\n趟\n趣\n趨\n足\n趴\n趵\n趸\n趺\n趾\n跃\n跄\n跆\n跋\n跌\n跎\n跑\n跖\n跚\n跛\n距\n跟\n跡\n跤\n跨\n跩\n跪\n路\n跳\n践\n跷\n跹\n跺\n跻\n踉\n踊\n踌\n踏\n踐\n踝\n踞\n踟\n踢\n踩\n踪\n踮\n踱\n踴\n踵\n踹\n蹂\n蹄\n蹇\n蹈\n蹉\n蹊\n蹋\n蹑\n蹒\n蹙\n蹟\n蹣\n蹤\n蹦\n蹩\n蹬\n蹭\n蹲\n蹴\n蹶\n蹺\n蹼\n蹿\n躁\n躇\n躉\n躊\n躋\n躍\n躏\n躪\n身\n躬\n躯\n躲\n躺\n軀\n車\n軋\n軌\n軍\n軒\n軟\n転\n軸\n軼\n軽\n軾\n較\n載\n輒\n輓\n輔\n輕\n輛\n輝\n輟\n輩\n輪\n輯\n輸\n輻\n輾\n輿\n轄\n轅\n轆\n轉\n轍\n轎\n轟\n车\n轧\n轨\n轩\n转\n轭\n轮\n软\n轰\n轲\n轴\n轶\n轻\n轼\n载\n轿\n较\n辄\n辅\n辆\n辇\n辈\n辉\n辊\n辍\n辐\n辑\n输\n辕\n辖\n辗\n辘\n辙\n辛\n辜\n辞\n辟\n辣\n辦\n辨\n辩\n辫\n辭\n辮\n辯\n辰\n辱\n農\n边\n辺\n辻\n込\n辽\n达\n迁\n迂\n迄\n迅\n过\n迈\n迎\n运\n近\n返\n还\n这\n进\n远\n违\n连\n迟\n迢\n迤\n迥\n迦\n迩\n迪\n迫\n迭\n述\n迴\n迷\n迸\n迹\n迺\n追\n退\n送\n适\n逃\n逅\n逆\n选\n逊\n逍\n透\n逐\n递\n途\n逕\n逗\n這\n通\n逛\n逝\n逞\n速\n造\n逢\n連\n逮\n週\n進\n逵\n逶\n逸\n逻\n逼\n逾\n遁\n遂\n遅\n遇\n遊\n運\n遍\n過\n遏\n遐\n遑\n遒\n道\n達\n違\n遗\n遙\n遛\n遜\n遞\n遠\n遢\n遣\n遥\n遨\n適\n遭\n遮\n遲\n遴\n遵\n遶\n遷\n選\n遺\n遼\n遽\n避\n邀\n邁\n邂\n邃\n還\n邇\n邈\n邊\n邋\n邏\n邑\n邓\n邕\n邛\n邝\n邢\n那\n邦\n邨\n邪\n邬\n邮\n邯\n邰\n邱\n邳\n邵\n邸\n邹\n邺\n邻\n郁\n郅\n郊\n郎\n郑\n郜\n郝\n郡\n郢\n郤\n郦\n郧\n部\n郫\n郭\n郴\n郵\n郷\n郸\n都\n鄂\n鄉\n鄒\n鄔\n鄙\n鄞\n鄢\n鄧\n鄭\n鄰\n鄱\n鄲\n鄺\n酉\n酊\n酋\n酌\n配\n酐\n酒\n酗\n酚\n酝\n酢\n酣\n酥\n酩\n酪\n酬\n酮\n酯\n酰\n酱\n酵\n酶\n酷\n酸\n酿\n醃\n醇\n醉\n醋\n醍\n醐\n醒\n醚\n
醛\n醜\n醞\n醣\n醪\n醫\n醬\n醮\n醯\n醴\n醺\n釀\n釁\n采\n釉\n释\n釋\n里\n重\n野\n量\n釐\n金\n釗\n釘\n釜\n針\n釣\n釦\n釧\n釵\n鈀\n鈉\n鈍\n鈎\n鈔\n鈕\n鈞\n鈣\n鈦\n鈪\n鈴\n鈺\n鈾\n鉀\n鉄\n鉅\n鉉\n鉑\n鉗\n鉚\n鉛\n鉤\n鉴\n鉻\n銀\n銃\n銅\n銑\n銓\n銖\n銘\n銜\n銬\n銭\n銮\n銳\n銷\n銹\n鋁\n鋅\n鋒\n鋤\n鋪\n鋰\n鋸\n鋼\n錄\n錐\n錘\n錚\n錠\n錢\n錦\n錨\n錫\n錮\n錯\n録\n錳\n錶\n鍊\n鍋\n鍍\n鍛\n鍥\n鍰\n鍵\n鍺\n鍾\n鎂\n鎊\n鎌\n鎏\n鎔\n鎖\n鎗\n鎚\n鎧\n鎬\n鎮\n鎳\n鏈\n鏖\n鏗\n鏘\n鏞\n鏟\n鏡\n鏢\n鏤\n鏽\n鐘\n鐮\n鐲\n鐳\n鐵\n鐸\n鐺\n鑄\n鑊\n鑑\n鑒\n鑣\n鑫\n鑰\n鑲\n鑼\n鑽\n鑾\n鑿\n针\n钉\n钊\n钎\n钏\n钒\n钓\n钗\n钙\n钛\n钜\n钝\n钞\n钟\n钠\n钡\n钢\n钣\n钤\n钥\n钦\n钧\n钨\n钩\n钮\n钯\n钰\n钱\n钳\n钴\n钵\n钺\n钻\n钼\n钾\n钿\n铀\n铁\n铂\n铃\n铄\n铅\n铆\n铉\n铎\n铐\n铛\n铜\n铝\n铠\n铡\n铢\n铣\n铤\n铨\n铩\n铬\n铭\n铮\n铰\n铲\n铵\n银\n铸\n铺\n链\n铿\n销\n锁\n锂\n锄\n锅\n锆\n锈\n锉\n锋\n锌\n锏\n锐\n锑\n错\n锚\n锟\n锡\n锢\n锣\n锤\n锥\n锦\n锭\n键\n锯\n锰\n锲\n锵\n锹\n锺\n锻\n镀\n镁\n镂\n镇\n镉\n镌\n镍\n镐\n镑\n镕\n镖\n镗\n镛\n镜\n镣\n镭\n镯\n镰\n镳\n镶\n長\n长\n門\n閃\n閉\n開\n閎\n閏\n閑\n閒\n間\n閔\n閘\n閡\n関\n閣\n閥\n閨\n閩\n閱\n閲\n閹\n閻\n閾\n闆\n闇\n闊\n闌\n闍\n闔\n闕\n闖\n闘\n關\n闡\n闢\n门\n闪\n闫\n闭\n问\n闯\n闰\n闲\n间\n闵\n闷\n闸\n闹\n闺\n闻\n闽\n闾\n阀\n阁\n阂\n阅\n阆\n阇\n阈\n阉\n阎\n阐\n阑\n阔\n阕\n阖\n阙\n阚\n阜\n队\n阡\n阪\n阮\n阱\n防\n阳\n阴\n阵\n阶\n阻\n阿\n陀\n陂\n附\n际\n陆\n陇\n陈\n陋\n陌\n降\n限\n陕\n陛\n陝\n陞\n陟\n陡\n院\n陣\n除\n陨\n险\n陪\n陰\n陲\n陳\n陵\n陶\n陷\n陸\n険\n陽\n隅\n隆\n隈\n隊\n隋\n隍\n階\n随\n隐\n隔\n隕\n隘\n隙\n際\n障\n隠\n隣\n隧\n隨\n險\n隱\n隴\n隶\n隸\n隻\n隼\n隽\n难\n雀\n雁\n雄\n雅\n集\n雇\n雉\n雋\n雌\n雍\n雎\n雏\n雑\n雒\n雕\n雖\n雙\n雛\n雜\n雞\n離\n難\n雨\n雪\n雯\n雰\n雲\n雳\n零\n雷\n雹\n電\n雾\n需\n霁\n霄\n霆\n震\n霈\n霉\n霊\n霍\n霎\n霏\n霑\n霓\n霖\n霜\n霞\n霧\n霭\n霰\n露\n霸\n霹\n霽\n霾\n靂\n靄\n靈\n青\n靓\n靖\n静\n靚\n靛\n靜\n非\n靠\n靡\n面\n靥\n靦\n革\n靳\n靴\n靶\n靼\n鞅\n鞋\n鞍\n鞏\n鞑\n鞘\n鞠\n鞣\n鞦\n鞭\n韆\n韋\n韌\n韓\n韜\n韦\n韧\n韩\n韬\n韭\n音\n韵\n韶\n韻\n響\n頁\n頂\n頃\n項\n順\n須\n頌\n預\n頑\n頒\n頓\n頗\n領\n頜\n頡\n頤\n頫\n頭\n頰\n頷\n頸\n頹\n頻\n頼\n顆\n題\n額\n顎\n顏\n顔\n願\n顛\n類\n顧\n顫\n顯\n顱\n顴\n页\n顶\n顷\n项\n顺\n须\n顼\n顽\n顾\n顿\n颁\n颂\n预\n颅\n领\n颇\n颈\n颉\n颊\n颌\n颍\n颐\n频\n颓\n颔\n颖\n颗\n题\n颚\n颛\n颜\n额\n颞\n颠\n颡\n颢\n颤\n颦\n颧\n風\n颯\n颱\n颳\n颶\n颼\n飄\n飆\n风\n飒\n飓\n飕\n飘\n飙\n飚\n飛\n飞\n食\n飢\n飨\n飩\n飪\n飯\n飲\n飼\n飽\n飾\n餃\n餅\n餉\n養\n餌\n餐\n餒\n餓\n餘\n餚\n餛\n餞\n餡\n館\n餮\n餵\n餾\n饅\n饈\n饋\n饌\n饍\n饑\n饒\n饕\n饗\n饞\n饥\n饨\n饪\n饬\n饭\n饮\n饯\n饰\n饱\n饲\n饴\n饵\n饶\n饷\n饺\n饼\n饽\n饿\n馀\n馁\n馄\n馅\n馆\n馈\n馋\n馍\n馏\n馒\n馔\n首\n馗\n香\
n馥\n馨\n馬\n馭\n馮\n馳\n馴\n駁\n駄\n駅\n駆\n駐\n駒\n駕\n駛\n駝\n駭\n駱\n駿\n騁\n騎\n騏\n験\n騙\n騨\n騰\n騷\n驀\n驅\n驊\n驍\n驒\n驕\n驗\n驚\n驛\n驟\n驢\n驥\n马\n驭\n驮\n驯\n驰\n驱\n驳\n驴\n驶\n驷\n驸\n驹\n驻\n驼\n驾\n驿\n骁\n骂\n骄\n骅\n骆\n骇\n骈\n骊\n骋\n验\n骏\n骐\n骑\n骗\n骚\n骛\n骜\n骞\n骠\n骡\n骤\n骥\n骧\n骨\n骯\n骰\n骶\n骷\n骸\n骼\n髂\n髅\n髋\n髏\n髒\n髓\n體\n髖\n高\n髦\n髪\n髮\n髯\n髻\n鬃\n鬆\n鬍\n鬓\n鬚\n鬟\n鬢\n鬣\n鬥\n鬧\n鬱\n鬼\n魁\n魂\n魄\n魅\n魇\n魍\n魏\n魔\n魘\n魚\n魯\n魷\n鮑\n鮨\n鮪\n鮭\n鮮\n鯉\n鯊\n鯖\n鯛\n鯨\n鯰\n鯽\n鰍\n鰓\n鰭\n鰲\n鰻\n鰾\n鱈\n鱉\n鱔\n鱗\n鱷\n鱸\n鱼\n鱿\n鲁\n鲈\n鲍\n鲑\n鲛\n鲜\n鲟\n鲢\n鲤\n鲨\n鲫\n鲱\n鲲\n鲶\n鲷\n鲸\n鳃\n鳄\n鳅\n鳌\n鳍\n鳕\n鳖\n鳗\n鳝\n鳞\n鳥\n鳩\n鳳\n鳴\n鳶\n鴉\n鴕\n鴛\n鴦\n鴨\n鴻\n鴿\n鵑\n鵜\n鵝\n鵡\n鵬\n鵰\n鵲\n鶘\n鶩\n鶯\n鶴\n鷗\n鷲\n鷹\n鷺\n鸚\n鸞\n鸟\n鸠\n鸡\n鸢\n鸣\n鸥\n鸦\n鸨\n鸪\n鸭\n鸯\n鸳\n鸵\n鸽\n鸾\n鸿\n鹂\n鹃\n鹄\n鹅\n鹈\n鹉\n鹊\n鹌\n鹏\n鹑\n鹕\n鹘\n鹜\n鹞\n鹤\n鹦\n鹧\n鹫\n鹭\n鹰\n鹳\n鹵\n鹹\n鹼\n鹽\n鹿\n麂\n麋\n麒\n麓\n麗\n麝\n麟\n麥\n麦\n麩\n麴\n麵\n麸\n麺\n麻\n麼\n麽\n麾\n黃\n黄\n黍\n黎\n黏\n黑\n黒\n黔\n默\n黛\n黜\n黝\n點\n黠\n黨\n黯\n黴\n鼋\n鼎\n鼐\n鼓\n鼠\n鼬\n鼹\n鼻\n鼾\n齁\n齊\n齋\n齐\n齒\n齡\n齢\n齣\n齦\n齿\n龄\n龅\n龈\n龊\n龋\n龌\n龍\n龐\n龔\n龕\n龙\n龚\n龛\n龜\n龟\n︰\n︱\n︶\n︿\n﹁\n﹂\n﹍\n﹏\n﹐\n﹑\n﹒\n﹔\n﹕\n﹖\n﹗\n﹙\n﹚\n﹝\n﹞\n﹡\n﹣\n！\n＂\n＃\n＄\n％\n＆\n＇\n（\n）\n＊\n＋\n，\n－\n．\n／\n０\n１\n２\n３\n４\n５\n６\n７\n８\n９\n：\n；\n＜\n＝\n＞\n？\n＠\n［\n＼\n］\n＾\n＿\n｀\nａ\nｂ\nｃ\nｄ\nｅ\nｆ\nｇ\nｈ\nｉ\nｊ\nｋ\nｌ\nｍ\nｎ\nｏ\nｐ\nｑ\nｒ\nｓ\nｔ\nｕ\nｖ\nｗ\nｘ\nｙ\nｚ\n｛\n｜\n｝\n～\n｡\n｢\n｣\n､\n･\nｯ\nｰ\nｲ\nｸ\nｼ\nｽ\nﾄ\nﾉ\nﾌ\nﾗ\nﾙ\nﾝ\nﾞ\nﾟ\n￣\n￥\n👍\n🔥\n😂\n😎\n...\nyam\n10\n2017\n12\n11\n2016\n20\n30\n15\n06\nlofter\n##s\n2015\nby\n16\n14\n18\n13\n24\n17\n2014\n21\n##0\n22\n19\n25\n23\ncom\n100\n00\n05\n2013\n##a\n03\n09\n08\n28\n##2\n50\n01\n04\n##1\n27\n02\n2012\n##3\n26\n##e\n07\n##8\n##5\n##6\n##4\n##9\n##7\n29\n2011\n40\n##t\n2010\n##o\n##d\n##i\n2009\n##n\napp\nwww\nthe\n##m\n31\n##c\n##l\n##y\n##r\n##g\n2008\n60\nhttp\n200\nqq\n##p\n80\n##f\ngoogle\npixnet\n90\ncookies\ntripadvisor\n500\n##er\n##k\n35\n##h\nfacebook\n2007\n2000\n70\n##b\nof\n##x\n##u\n45\n300\niphone\n32\n1000\n2006\n48\nip\n36\nin\n38\n3d\n##w\n##ing\n55\nctrip\n##on\n##v\n33\n##の\nto\n34\n400\nid\n2005\nit\n37\nwindows\nllc\ntop\n99\n42\n39\n000\nled\nat\n##an\n41\n51\n52\n46\n49\n43
\n53\n44\n##z\nandroid\n58\nand\n59\n2004\n56\nvr\n##か\n5000\n2003\n47\nblogthis\ntwitter\n54\n##le\n150\nok\n2018\n57\n75\ncn\nno\nios\n##in\n##mm\n##00\n800\non\nte\n3000\n65\n2001\n360\n95\nig\nlv\n120\n##ng\n##を\n##us\n##に\npc\nてす\n──\n600\n##te\n85\n2002\n88\n##ed\nhtml\nncc\nwifi\nemail\n64\nblog\nis\n##10\n##て\nmail\nonline\n##al\ndvd\n##ic\nstudio\n##は\n##℃\n##ia\n##と\nline\nvip\n72\n##q\n98\n##ce\n##en\nfor\n##is\n##ra\n##es\n##j\nusb\nnet\ncp\n1999\nasia\n4g\n##cm\ndiy\nnew\n3c\n##お\nta\n66\nlanguage\nvs\napple\ntw\n86\nweb\n##ne\nipad\n62\nyou\n##re\n101\n68\n##tion\nps\nde\nbt\npony\natm\n##2017\n1998\n67\n##ch\nceo\n##or\ngo\n##na\nav\npro\ncafe\n96\npinterest\n97\n63\npixstyleme3c\n##ta\nmore\nsaid\n##2016\n1997\nmp3\n700\n##ll\nnba\njun\n##20\n92\ntv\n1995\npm\n61\n76\nnbsp\n250\n##ie\nlinux\n##ma\ncd\n110\nhd\n##17\n78\n##ion\n77\n6000\nam\n##th\n##st\n94\n##se\n##et\n69\n180\ngdp\nmy\n105\n81\nabc\n89\nflash\n79\none\n93\n1990\n1996\n##ck\ngps\n##も\n##ly\nweb885\n106\n2020\n91\n##ge\n4000\n1500\nxd\nboss\nisbn\n1994\norg\n##ry\nme\nlove\n##11\n0fork\n73\n##12\n3g\n##ter\n##ar\n71\n82\n##la\nhotel\n130\n1970\npk\n83\n87\n140\nie\n##os\n##30\n##el\n74\n##50\nseo\ncpu\n##ml\np2p\n84\nmay\n##る\nsun\ntue\ninternet\ncc\nposted\nyoutube\n##at\n##ン\n##man\nii\n##ル\n##15\nabs\nnt\npdf\nyahoo\nago\n1980\n##it\nnews\nmac\n104\n##てす\n##me\n##り\njava\n1992\nspa\n##de\n##nt\nhk\nall\nplus\nla\n1993\n##mb\n##16\n##ve\nwest\n##da\n160\nair\n##い\n##ps\nから\n##to\n1989\nlogo\nhtc\nphp\nhttps\nfi\nmomo\n##son\nsat\n##ke\n##80\nebd\nsuv\nwi\nday\napk\n##88\n##um\nmv\ngalaxy\nwiki\nor\nbrake\n##ス\n1200\nする\nthis\n1991\nmon\n##こ\n❤2017\npo\n##ない\njavascript\nlife\nhome\njune\n##ss\nsystem\n900\n##ー\n##０\npp\n1988\nworld\nfb\n4k\nbr\n##as\nic\nai\nleonardo\nsafari\n##60\nlive\nfree\nxx\nwed\nwin7\nkiehl\n##co\nlg\no2o\n##go\nus\n235\n1949\nmm\nしい\nvfm\nkanye\n##90\n##2015\n##id\njr\n##ey\n123\nrss\n##sa\n##ro\n##am\n##no\nthu\nfri\n350\n##sh\n##ki\n103\ncomments\nname\n##の
て\n##pe\n##ine\nmax\n1987\n8000\nuber\n##mi\n##ton\nwordpress\noffice\n1986\n1985\n##ment\n107\nbd\nwin10\n##ld\n##li\ngmail\nbb\ndior\n##rs\n##ri\n##rd\n##ます\nup\ncad\n##®\ndr\nして\nread\n##21\nをお\n##io\n##99\nurl\n1984\npvc\npaypal\nshow\npolicy\n##40\n##ty\n##18\nwith\n##★\n##01\ntxt\n102\n##ba\ndna\nfrom\npost\nmini\nar\ntaiwan\njohn\n##ga\nprivacy\nagoda\n##13\n##ny\nword\n##24\n##22\n##by\n##ur\n##hz\n1982\n##ang\n265\ncookie\nnetscape\n108\n##ka\n##～\n##ad\nhouse\nshare\nnote\nibm\ncode\nhello\nnike\nsim\nsurvey\n##016\n1979\n1950\nwikia\n##32\n##017\n5g\ncbc\n##tor\n##kg\n1983\n##rt\n##14\ncampaign\nstore\n2500\nos\n##ct\n##ts\n##°\n170\napi\n##ns\n365\nexcel\n##な\n##ao\n##ら\n##し\n～～\n##nd\nuniversity\n163\nには\n518\n##70\n##ya\n##il\n##25\npierre\nipo\n0020\n897\n##23\nhotels\n##ian\nのお\n125\nyears\n6606\n##ers\n##26\nhigh\n##day\ntime\n##ay\nbug\n##line\n##く\n##す\n##be\nxp\ntalk2yam\nyamservice\n10000\ncoco\n##dy\nsony\n##ies\n1978\nmicrosoft\ndavid\npeople\n##ha\n1960\ninstagram\nintel\nその\n##ot\niso\n1981\n##va\n115\n##mo\n##land\nxxx\nman\nco\nltxsw\n##ation\nbaby\n220\n##pa\n##ol\n1945\n7000\ntag\n450\n##ue\nmsn\n##31\noppo\n##ト\n##ca\ncontrol\n##om\nst\nchrome\n##ure\n##ん\nbe\n##き\nlol\n##19\nした\n##bo\n240\nlady\n##100\n##way\n##から\n4600\n##ko\n##do\n##un\n4s\ncorporation\n168\n##ni\nherme\n##28\nｃｐ\n978\n##up\n##06\nui\n##ds\nppt\nadmin\nthree\nします\nbbc\nre\n128\n##48\nca\n##015\n##35\nhp\n##ee\ntpp\n##た\n##ive\n××\nroot\n##cc\n##ました\n##ble\n##ity\nadobe\npark\n114\net\noled\ncity\n##ex\n##ler\n##ap\nchina\n##book\n20000\nview\n##ice\nglobal\n##km\nyour\nhong\n##mg\nout\n##ms\nng\nebay\n##29\nmenu\nubuntu\n##cy\nrom\n##view\nopen\nktv\ndo\nserver\n##lo\nif\nenglish\n##ね\n##５\n##oo\n1600\n##02\nstep1\nkong\nclub\n135\njuly\ninc\n1976\nmr\nhi\n##net\ntouch\n##ls\n##ii\nmichael\nlcd\n##05\n##33\nphone\njames\nstep2\n1300\nios9\n##box\ndc\n##２\n##ley\nsamsung\n111\n280\npokemon\ncss\n##ent\n##les\nいいえ\n##１\ns8\natom\nplay\nbmw\n##said\nsa\netf\nctrl\n♥yoyo
♥\n##55\n2025\n##2014\n##66\nadidas\namazon\n1958\n##ber\n##ner\nvisa\n##77\n##der\n1800\nconnectivity\n##hi\nfirefox\n109\n118\nhr\nso\nstyle\nmark\npop\nol\nskip\n1975\nas\n##27\n##ir\n##61\n190\nmba\n##う\n##ai\nle\n##ver\n1900\ncafe2017\nlte\nsuper\n113\n129\n##ron\namd\nlike\n##☆\nare\n##ster\nwe\n##sk\npaul\ndata\ninternational\n##ft\nlongchamp\nssd\ngood\n##ート\n##ti\nreply\n##my\n↓↓↓\napr\nstar\n##ker\nsource\n136\njs\n112\nget\nforce\nphoto\n##one\n126\n##2013\n##ow\nlink\nbbs\n1972\ngoods\n##lin\npython\n119\n##ip\ngame\n##ics\n##ません\nblue\n##●\n520\n##45\npage\nitunes\n##03\n1955\n260\n1968\ngt\ngif\n618\n##ff\n##47\ngroup\nくたさい\nabout\nbar\nganji\n##nce\nmusic\nlee\nnot\n1977\n1971\n1973\n##per\nan\nfaq\ncomment\n##って\ndays\n##ock\n116\n##bs\n1974\n1969\nv1\nplayer\n1956\nxbox\nsql\nfm\nf1\n139\n##ah\n210\n##lv\n##mp\n##000\nmelody\n1957\n##３\n550\n17life\n199\n1966\nxml\nmarket\n##au\n##71\n999\n##04\nwhat\ngl\n##95\n##age\ntips\n##68\nbook\n##ting\nmysql\ncan\n1959\n230\n##ung\nwonderland\nwatch\n10℃\n##ction\n9000\nmar\nmobile\n1946\n1962\narticle\n##db\npart\n▲top\nparty\nって\n1967\n1964\n1948\n##07\n##ore\n##op\nこの\ndj\n##78\n##38\n010\nmain\n225\n1965\n##ong\nart\n320\nad\n134\n020\n##73\n117\npm2\njapan\n228\n##08\nts\n1963\n##ica\nder\nsm\n##36\n2019\n##wa\nct\n##７\n##や\n##64\n1937\nhomemesh\nsearch\n##85\n##れは\n##tv\n##di\nmacbook\n##９\n##くたさい\nservice\n##♥\ntype\nった\n750\n##ier\n##si\n##75\n##います\n##ok\nbest\n##ット\ngoris\nlock\n##った\ncf\n3m\nbig\n##ut\nftp\ncarol\n##vi\n１０\n1961\nhappy\nsd\n##ac\n122\nanti\npe\ncnn\niii\n1920\n138\n##ラ\n1940\nesp\njan\ntags\n##98\n##51\naugust\nvol\n##86\n154\n##™\n##fs\n##れ\n##sion\ndesign\nac\n##ム\npress\njordan\nppp\nthat\nkey\ncheck\n##６\n##tt\n##㎡\n1080p\n##lt\npower\n##42\n1952\n##bc\nvivi\n##ック\nhe\n133\n121\njpg\n##rry\n201\n175\n3500\n1947\nnb\n##ted\n##rn\nしています\n1954\nusd\n##t00\nmaster\n##ンク\n001\nmodel\n##58\nal\n##09\n1953\n##34\nram\ngoo\nても\n##ui\n127\n1930\nred\n##ary\nrpg\nitem\n##pm\n##41\n270\n
##za\nproject\n##2012\nhot\ntd\nblogabstract\n##ger\n##62\n650\n##44\ngr2\n##します\n##ｍ\nblack\nelectronic\nnfc\nyear\nasus\nまた\nhtml5\ncindy\n##hd\nm3\n132\nesc\n##od\nbooking\n##53\nfed\ntvb\n##81\n##ina\nmit\n165\n##いる\nchan\n192\ndistribution\nnext\nになる\npeter\nbios\nsteam\ncm\n1941\nにも\npk10\n##ix\n##65\n##91\ndec\nnasa\n##ana\nicecat\n00z\nb1\nwill\n##46\nli\nse\n##ji\n##み\n##ard\noct\n##ain\njp\n##ze\n##bi\ncio\n##56\nsmart\nh5\n##39\n##port\ncurve\nvpn\n##nm\n##dia\nutc\n##あり\n12345678910\n##52\nrmvb\nchanel\na4\nmiss\n##and\n##im\nmedia\nwho\n##63\nshe\ngirl\n5s\n124\nvera\n##して\nclass\nvivo\nking\n##フ\n##ei\nnational\nab\n1951\n5cm\n888\n145\nipod\nap\n1100\n5mm\n211\nms\n2756\n##69\nmp4\nmsci\n##po\n##89\n131\nmg\nindex\n380\n##bit\n##out\n##zz\n##97\n##67\n158\napec\n##８\nphotoshop\nopec\n￥799\nては\n##96\n##tes\n##ast\n2g\n○○\n##ール\n￥2899\n##ling\n##よ\n##ory\n1938\n##ical\nkitty\ncontent\n##43\nstep3\n##cn\nwin8\n155\nvc\n1400\niphone7\nrobert\n##した\ntcl\n137\nbeauty\n##87\nen\ndollars\n##ys\n##oc\nstep\npay\nyy\na1\n##2011\n##lly\n##ks\n##♪\n1939\n188\ndownload\n1944\nsep\nexe\nph\nいます\nschool\ngb\ncenter\npr\nstreet\n##board\nuv\n##37\n##lan\nwinrar\n##que\n##ua\n##com\n1942\n1936\n480\ngpu\n##４\nettoday\nfu\ntom\n##54\n##ren\n##via\n149\n##72\nb2b\n144\n##79\n##tch\nrose\narm\nmb\n##49\n##ial\n##nn\nnvidia\nstep4\nmvp\n00㎡\nyork\n156\n##イ\nhow\ncpi\n591\n2765\ngov\nkg\njoe\n##xx\nmandy\npa\n##ser\ncopyright\nfashion\n1935\ndon\n##け\necu\n##ist\n##art\nerp\nwap\nhave\n##lm\ntalk\n##ek\n##ning\n##if\nch\n##ite\nvideo\n1943\ncs\nsan\niot\nlook\n##84\n##2010\n##ku\noctober\n##ux\ntrump\n##hs\n##ide\nbox\n141\nfirst\n##ins\napril\n##ight\n##83\n185\nangel\nprotected\naa\n151\n162\nx1\nm2\n##fe\n##×\n##ho\nsize\n143\nmin\nofo\nfun\ngomaji\nex\nhdmi\nfood\ndns\nmarch\nchris\nkevin\n##のか\n##lla\n##pp\n##ec\nag\nems\n6s\n720p\n##rm\n##ham\noff\n##92\nasp\nteam\nfandom\ned\n299\n▌♥\n##ell\ninfo\nされています\n##82\nsina\n4066\n161\n##able\n##ctor\n330\n399\n315\ndll\nri
ghts\nltd\nidc\njul\n3kg\n1927\n142\nma\nsurface\n##76\n##ク\n～～～\n304\nmall\neps\n146\ngreen\n##59\nmap\nspace\ndonald\nv2\nsodu\n##light\n1931\n148\n1700\nまて\n310\nreserved\nhtm\n##han\n##57\n2d\n178\nmod\n##ise\n##tions\n152\nti\n##shi\ndoc\n1933\nicp\n055\nwang\n##ram\nshopping\naug\n##pi\n##well\nnow\nwam\nb2\nからお\n##hu\n236\n1928\n##gb\n266\nf2\n##93\n153\nmix\n##ef\n##uan\nbwl\n##plus\n##res\ncore\n##ess\ntea\n5℃\nhktvmall\nnhk\n##ate\nlist\n##ese\n301\nfeb\n4m\ninn\nての\nnov\n159\n12345\ndaniel\n##ci\npass\n##bet\n##nk\ncoffee\n202\nssl\nairbnb\n##ute\nfbi\nwoshipm\nskype\nea\ncg\nsp\n##fc\n##www\nyes\nedge\nalt\n007\n##94\nfpga\n##ght\n##gs\niso9001\nさい\n##ile\n##wood\n##uo\nimage\nlin\nicon\namerican\n##em\n1932\nset\nsays\n##king\n##tive\nblogger\n##74\nなと\n256\n147\n##ox\n##zy\n##red\n##ium\n##lf\nnokia\nclaire\n##リ\n##ding\nnovember\nlohas\n##500\n##tic\n##マ\n##cs\n##ある\n##che\n##ire\n##gy\n##ult\ndb\njanuary\nwin\n##カ\n166\nroad\nptt\n##ま\n##つ\n198\n##fa\n##mer\nanna\npchome\nはい\nudn\nef\n420\n##time\n##tte\n2030\n##ア\ng20\nwhite\nかかります\n1929\n308\ngarden\neleven\ndi\n##おります\nchen\n309b\n777\n172\nyoung\ncosplay\nちてない\n4500\nbat\n##123\n##tra\n##ては\nkindle\nnpc\nsteve\netc\n##ern\n##｜\ncall\nxperia\nces\ntravel\nsk\ns7\n##ous\n1934\n##int\nみいたたけます\n183\nedu\nfile\ncho\nqr\n##car\n##our\n186\n##ant\n##ｄ\neric\n1914\nrends\n##jo\n##する\nmastercard\n##2000\nkb\n##min\n290\n##ino\nvista\n##ris\n##ud\njack\n2400\n##set\n169\npos\n1912\n##her\n##ou\ntaipei\nしく\n205\nbeta\n##ませんか\n232\n##fi\nexpress\n255\nbody\n##ill\naphojoy\nuser\ndecember\nmeiki\n##ick\ntweet\nrichard\n##av\n##ᆫ\niphone6\n##dd\nちてすか\nviews\n##mark\n321\npd\n##００\ntimes\n##▲\nlevel\n##ash\n10g\npoint\n5l\n##ome\n208\nkoreanmall\n##ak\ngeorge\nq2\n206\nwma\ntcp\n##200\nスタッフ\nfull\nmlb\n##lle\n##watch\ntm\nrun\n179\n911\nsmith\nbusiness\n##und\n1919\ncolor\n##tal\n222\n171\n##less\nmoon\n4399\n##rl\nupdate\npcb\nshop\n499\n157\nlittle\nなし\nend\n##mhz\nvan\ndsp\neasy\n660\n##house\n##key\nhistory
\n##ｏ\noh\n##001\n##hy\n##web\noem\nlet\nwas\n##2009\n##gg\nreview\n##wan\n182\n##°c\n203\nuc\ntitle\n##val\nunited\n233\n2021\n##ons\ndoi\ntrivago\noverdope\nsbs\n##ance\n##ち\ngrand\nspecial\n573032185\nimf\n216\nwx17house\n##so\n##ーム\naudi\n##he\nlondon\nwilliam\n##rp\n##ake\nscience\nbeach\ncfa\namp\nps4\n880\n##800\n##link\n##hp\ncrm\nferragamo\nbell\nmake\n##eng\n195\nunder\nzh\nphotos\n2300\n##style\n##ント\nvia\n176\nda\n##gi\ncompany\ni7\n##ray\nthomas\n370\nufo\ni5\n##max\nplc\nben\nback\nresearch\n8g\n173\nmike\n##pc\n##ッフ\nseptember\n189\n##ace\nvps\nfebruary\n167\npantos\nwp\nlisa\n1921\n★★\njquery\nnight\nlong\noffer\n##berg\n##news\n1911\n##いて\nray\nfks\nwto\nせます\nover\n164\n340\n##all\n##rus\n1924\n##888\n##works\nblogtitle\nloftpermalink\n##→\n187\nmartin\ntest\nling\nkm\n##め\n15000\nfda\nv3\n##ja\n##ロ\nｗedding\nかある\noutlet\nfamily\n##ea\nをこ\n##top\nstory\n##ness\nsalvatore\n##lu\n204\nswift\n215\nroom\nしている\noracle\n##ul\n1925\nsam\nb2c\nweek\npi\nrock\n##のは\n##ａ\n##けと\n##ean\n##300\n##gle\ncctv\nafter\nchinese\n##back\npowered\nx2\n##tan\n1918\n##nes\n##イン\ncanon\nonly\n181\n##zi\n##las\nsay\n##oe\n184\n##sd\n221\n##bot\n##world\n##zo\nsky\nmade\ntop100\njust\n1926\npmi\n802\n234\ngap\n##vr\n177\nles\n174\n▲topoct\nball\nvogue\nvi\ning\nofweek\ncos\n##list\n##ort\n▲topmay\n##なら\n##lon\nとして\nlast\n##tc\n##of\n##bus\n##gen\nreal\neva\n##コ\na3\nnas\n##lie\n##ria\n##coin\n##bt\n▲topapr\nhis\n212\ncat\nnata\nvive\nhealth\n⋯⋯\ndrive\nsir\n▲topmar\ndu\ncup\n##カー\n##ook\n##よう\n##sy\nalex\nmsg\ntour\nしました\n3ce\n##word\n193\nebooks\nr8\nblock\n318\n##より\n2200\nnice\npvp\n207\nmonths\n1905\nrewards\n##ther\n1917\n0800\n##xi\n##チ\n##sc\nmicro\n850\ngg\nblogfp\nop\n1922\ndaily\nm1\n264\ntrue\n##bb\nml\n##tar\n##のお\n##ky\nanthony\n196\n253\n##yo\nstate\n218\n##ara\n##aa\n##rc\n##tz\n##ston\nより\ngear\n##eo\n##ade\nge\nsee\n1923\n##win\n##ura\nss\nheart\n##den\n##ita\ndown\n##sm\nel\npng\n2100\n610\nrakuten\nwhatsapp\nbay\ndream\nadd\n##use\n680\n311\npad\ngucci\nmp
v\n##ode\n##fo\nisland\n▲topjun\n##▼\n223\njason\n214\nchicago\n##❤\nしの\n##hone\nio\n##れる\n##ことか\nsogo\nbe2\n##ology\n990\ncloud\nvcd\n##con\n2～3\n##ford\n##joy\n##kb\n##こさいます\n##rade\nbut\n##ach\ndocker\n##ful\nrfid\nul\n##ase\nhit\nford\n##star\n580\n##○\n１１\na2\nsdk\nreading\nedited\n##are\ncmos\n##mc\n238\nsiri\nlight\n##ella\n##ため\nbloomberg\n##read\npizza\n##ison\njimmy\n##vm\ncollege\nnode\njournal\nba\n18k\n##play\n245\n##cer\n２０\nmagic\n##yu\n191\njump\n288\ntt\n##ings\nasr\n##lia\n3200\nstep5\nnetwork\n##cd\nmc\nいします\n1234\npixstyleme\n273\n##600\n2800\nmoney\n★★★★★\n1280\n１２\n430\nbl\nみの\nact\n##tus\ntokyo\n##rial\n##life\nemba\n##ae\nsaas\ntcs\n##rk\n##wang\nsummer\n##sp\nko\n##ving\n390\npremium\n##その\nnetflix\n##ヒ\nuk\nmt\n##lton\nright\nfrank\ntwo\n209\nえる\n##ple\n##cal\n021\n##んな\n##sen\n##ville\nhold\nnexus\ndd\n##ius\nてお\n##mah\n##なく\ntila\nzero\n820\nce\n##tin\nresort\n##ws\ncharles\nold\np10\n5d\nreport\n##360\n##ru\n##には\nbus\nvans\nlt\n##est\npv\n##レ\nlinks\nrebecca\n##ツ\n##dm\nazure\n##365\nきな\nlimited\nbit\n4gb\n##mon\n1910\nmoto\n##eam\n213\n1913\nvar\neos\nなとの\n226\nblogspot\nされた\n699\ne3\ndos\ndm\nfc\n##ments\n##ik\n##kw\nboy\n##bin\n##ata\n960\ner\n##せ\n219\n##vin\n##tu\n##ula\n194\n##∥\nstation\n##ろ\n##ature\n835\nfiles\nzara\nhdr\ntop10\nnature\n950\nmagazine\ns6\nmarriott\n##シ\navira\ncase\n##っと\ntab\n##ran\ntony\n##home\noculus\nim\n##ral\njean\nsaint\ncry\n307\nrosie\n##force\n##ini\nice\n##bert\nのある\n##nder\n##mber\npet\n2600\n##◆\nplurk\n▲topdec\n##sis\n00kg\n▲topnov\n720\n##ence\ntim\n##ω\n##nc\n##ても\n##name\nlog\nips\ngreat\nikea\nmalaysia\nunix\n##イト\n3600\n##ncy\n##nie\n12000\nakb48\n##ye\n##oid\n404\n##chi\n##いた\noa\nxuehai\n##1000\n##orm\n##rf\n275\nさん\n##ware\n##リー\n980\nho\n##pro\ntext\n##era\n560\nbob\n227\n##ub\n##2008\n8891\nscp\navi\n##zen\n2022\nmi\nwu\nmuseum\nqvod\napache\nlake\njcb\n▲topaug\n★★★\nni\n##hr\nhill\n302\nne\nweibo\n490\nruby\n##ーシ\n##ヶ\n##row\n4d\n▲topjul\niv\n##ish\ngithub\n306\nmate\n312\n##スト\n##lot\
n##ane\nandrew\nのハイト\n##tina\nt1\nrf\ned2k\n##vel\n##900\nway\nfinal\nりの\nns\n5a\n705\n197\n##メ\nsweet\nbytes\n##ene\n▲topjan\n231\n##cker\n##2007\n##px\n100g\ntopapp\n229\nhelpapp\nrs\nlow\n14k\ng4g\ncare\n630\nldquo\nあり\n##fork\nleave\nrm\nedition\n##gan\n##zon\n##qq\n▲topsep\n##google\n##ism\ngold\n224\nexplorer\n##zer\ntoyota\ncategory\nselect\nvisual\n##labels\nrestaurant\n##md\nposts\ns1\n##ico\nもっと\nangelababy\n123456\n217\nsports\ns3\nmbc\n1915\nしてくたさい\nshell\nx86\ncandy\n##new\nkbs\nface\nxl\n470\n##here\n4a\nswissinfo\nv8\n▲topfeb\ndram\n##ual\n##vice\n3a\n##wer\nsport\nq1\nios10\npublic\nint\ncard\n##ｃ\nep\nau\nrt\n##れた\n1080\nbill\n##mll\nkim\n３０\n460\nwan\n##uk\n##ミ\nx3\n298\n0t\nscott\n##ming\n239\ne5\n##3d\nh7n9\nworldcat\nbrown\n##あります\n##vo\n##led\n##580\n##ax\n249\n410\n##ert\nparis\n##～6\npolo\n925\n##lr\n599\n##ナ\ncapital\n##hing\nbank\ncv\n1g\n##chat\n##ｓ\n##たい\nadc\n##ule\n2m\n##ｅ\ndigital\nhotmail\n268\n##pad\n870\nbbq\nquot\n##ring\nbefore\nwali\n##まて\nmcu\n2k\n2b\nという\ncostco\n316\nnorth\n333\nswitch\n##city\n##ｐ\nphilips\n##mann\nmanagement\npanasonic\n##cl\n##vd\n##ping\n##rge\nalice\n##lk\n##ましょう\ncss3\n##ney\nvision\nalpha\n##ular\n##400\n##tter\nlz\nにお\n##ありません\nmode\ngre\n1916\npci\n##tm\n237\n1～2\n##yan\n##そ\nについて\n##let\n##キ\nwork\nwar\ncoach\nah\nmary\n##ᅵ\nhuang\n##pt\na8\npt\nfollow\n##berry\n1895\n##ew\na5\nghost\n##ション\n##wn\n##og\nsouth\n##code\ngirls\n##rid\naction\nvilla\ngit\nr11\ntable\ngames\n##cket\nerror\n##anonymoussaid\n##ag\nhere\n##ame\n##gc\nqa\n##■\n##lis\ngmp\n##gin\nvmalife\n##cher\nyu\nwedding\n##tis\ndemo\ndragon\n530\nsoho\nsocial\nbye\n##rant\nriver\norz\nacer\n325\n##↑\n##ース\n##ats\n261\ndel\n##ven\n440\nups\n##ように\n##ター\n305\nvalue\nmacd\nyougou\n##dn\n661\n##ano\nll\n##urt\n##rent\ncontinue\nscript\n##wen\n##ect\npaper\n263\n319\nshift\n##chel\n##フト\n##cat\n258\nx5\nfox\n243\n##さん\ncar\naaa\n##blog\nloading\n##yn\n##tp\nkuso\n799\nsi\nsns\nイカせるテンマ\nヒンクテンマ3\nrmb\nvdc\nforest\ncentral\nprime\nhelp\nultra\n##
rmb\n##ような\n241\nsquare\n688\n##しい\nのないフロクに\n##field\n##reen\n##ors\n##ju\nc1\nstart\n510\n##air\n##map\ncdn\n##wo\ncba\nstephen\nm8\n100km\n##get\nopera\n##base\n##ood\nvsa\ncom™\n##aw\n##ail\n251\nなのて\ncount\nt2\n##ᅡ\n##een\n2700\nhop\n##gp\nvsc\ntree\n##eg\n##ose\n816\n285\n##ories\n##shop\nalphago\nv4\n1909\nsimon\n##ᆼ\nfluke62max\nzip\nスホンサー\n##sta\nlouis\ncr\nbas\n##～10\nbc\n##yer\nhadoop\n##ube\n##wi\n1906\n0755\nhola\n##low\nplace\ncentre\n5v\nd3\n##fer\n252\n##750\n##media\n281\n540\n0l\nexchange\n262\nseries\n##ハー\n##san\neb\n##bank\n##ｋ\nq3\n##nge\n##mail\ntake\n##lp\n259\n1888\nclient\neast\ncache\nevent\nvincent\n##ールを\nきを\n##nse\nsui\n855\nadchoice\n##и\n##stry\n##なたの\n246\n##zone\nga\napps\nsea\n##ab\n248\ncisco\n##タ\n##rner\nkymco\n##care\ndha\n##pu\n##yi\nminkoff\nroyal\np1\nへの\nannie\n269\ncollection\nkpi\nplaystation\n257\nになります\n866\nbh\n##bar\nqueen\n505\nradio\n1904\nandy\narmani\n##xy\nmanager\niherb\n##ery\n##share\nspring\nraid\njohnson\n1908\n##ob\nvolvo\nhall\n##ball\nv6\nour\ntaylor\n##hk\nbi\n242\n##cp\nkate\nbo\nwater\ntechnology\n##rie\nサイトは\n277\n##ona\n##sl\nhpv\n303\ngtx\nhip\nrdquo\njayz\nstone\n##lex\n##rum\nnamespace\n##やり\n620\n##ale\n##atic\ndes\n##erson\n##ql\n##ves\n##type\nenter\n##この\n##てきます\nd2\n##168\n##mix\n##bian\nとの\na9\njj\nky\n##lc\naccess\nmovie\n##hc\nリストに\ntower\n##ration\n##mit\nます\n##nch\nua\ntel\nprefix\n##o2\n1907\n##point\n1901\nott\n～10\n##http\n##ury\nbaidu\n##ink\nmember\n##logy\nbigbang\nnownews\n##js\n##shot\n##tb\n##こと\n247\neba\n##tics\n##lus\nける\nv5\nspark\n##ama\nthere\n##ions\ngod\n##lls\n##down\nhiv\n##ress\nburberry\nday2\n##kv\n◆◆\njeff\nrelated\nfilm\nedit\njoseph\n283\n##ark\ncx\n32gb\norder\ng9\n30000\n##ans\n##tty\ns5\n##bee\nかあります\nthread\nxr\nbuy\nsh\n005\nland\nspotify\nmx\n##ari\n276\n##verse\n×email\nsf\nwhy\n##ことて\n244\n7headlines\nnego\nsunny\ndom\nexo\n401\n666\npositioning\nfit\nrgb\n##tton\n278\nkiss\nalexa\nadam\nlp\nみリストを\n##ｇ\nmp\n##ties\n##llow\namy\n##du\nnp\n002\ninstitute\n27
1\n##rth\n##lar\n2345\n590\n##des\nsidebar\n１５\nimax\nsite\n##cky\n##kit\n##ime\n##009\nseason\n323\n##fun\n##ンター\n##ひ\ngogoro\na7\npu\nlily\nfire\ntwd600\n##ッセーシを\nいて\n##vis\n30ml\n##cture\n##をお\ninformation\n##オ\nclose\nfriday\n##くれる\nyi\nnick\nてすか\n##tta\n##tel\n6500\n##lock\ncbd\neconomy\n254\nかお\n267\ntinker\ndouble\n375\n8gb\nvoice\n##app\noops\nchannel\ntoday\n985\n##right\nraw\nxyz\n##＋\njim\nedm\n##cent\n7500\nsupreme\n814\nds\n##its\n##asia\ndropbox\n##てすか\n##tti\nbooks\n272\n100ml\n##tle\n##ller\n##ken\n##more\n##boy\nsex\n309\n##dom\nt3\n##ider\n##なります\n##unch\n1903\n810\nfeel\n5500\n##かった\n##put\nにより\ns2\nmo\n##gh\nmen\nka\namoled\ndiv\n##tr\n##n1\nport\nhoward\n##tags\nken\ndnf\n##nus\nadsense\n##а\nide\n##へ\nbuff\nthunder\n##town\n##ique\nhas\n##body\nauto\npin\n##erry\ntee\nてした\n295\nnumber\n##the\n##013\nobject\npsp\ncool\nudnbkk\n16gb\n##mic\nmiui\n##tro\nmost\nr2\n##alk\n##nity\n1880\n±0\n##いました\n428\ns4\nlaw\nversion\n##oa\nn1\nsgs\ndocomo\n##tf\n##ack\nhenry\nfc2\n##ded\n##sco\n##014\n##rite\n286\n0mm\nlinkedin\n##ada\n##now\nwii\n##ndy\nucbug\n##◎\nsputniknews\nlegalminer\n##ika\n##xp\n2gb\n##bu\nq10\noo\nb6\ncome\n##rman\ncheese\nming\nmaker\n##gm\nnikon\n##fig\nppi\nkelly\n##ります\njchere\nてきます\nted\nmd\n003\nfgo\ntech\n##tto\ndan\nsoc\n##gl\n##len\nhair\nearth\n640\n521\nimg\n##pper\n##a1\n##てきる\n##ロク\nacca\n##ition\n##ference\nsuite\n##ig\noutlook\n##mond\n##cation\n398\n##pr\n279\n101vip\n358\n##999\n282\n64gb\n3800\n345\nairport\n##over\n284\n##おり\njones\n##ith\nlab\n##su\n##いるのて\nco2\ntown\npiece\n##llo\nno1\nvmware\n24h\n##qi\nfocus\nreader\n##admin\n##ora\ntb\nfalse\n##log\n1898\nknow\nlan\n838\n##ces\nf4\n##ume\nmotel\nstop\n##oper\nna\nflickr\nnetcomponents\n##af\n##─\npose\nwilliams\nlocal\n##ound\n##cg\n##site\n##iko\nいお\n274\n5m\ngsm\ncon\n##ath\n1902\nfriends\n##hip\ncell\n317\n##rey\n780\ncream\n##cks\n012\n##dp\nfacebooktwitterpinterestgoogle\nsso\n324\nshtml\nsong\nswiss\n##mw\n##キンク\nlumia\nxdd\nstring\ntiffany\n522\nmarc\nられた\
ninsee\nrussell\nsc\ndell\n##ations\nｏｋ\ncamera\n289\n##vs\n##flow\n##late\nclassic\n287\n##nter\nstay\ng1\nmtv\n512\n##ever\n##lab\n##nger\nqe\nsata\nryan\nd1\n50ml\ncms\n##cing\nsu\n292\n3300\neditor\n296\n##nap\nsecurity\nsunday\nassociation\n##ens\n##700\n##bra\nacg\n##かり\nsofascore\nとは\nmkv\n##ign\njonathan\ngary\nbuild\nlabels\n##oto\ntesla\nmoba\nqi\ngohappy\ngeneral\najax\n1024\n##かる\nサイト\nsociety\n##test\n##urs\nwps\nfedora\n##ich\nmozilla\n328\n##480\n##dr\nusa\nurn\n##lina\n##ｒ\ngrace\n##die\n##try\n##ader\n1250\n##なり\nelle\n570\n##chen\n##ᆯ\nprice\n##ten\nuhz\n##ough\neq\n##hen\nstates\npush\nsession\nbalance\nwow\n506\n##cus\n##py\nwhen\n##ward\n##ep\n34e\nwong\nlibrary\nprada\n##サイト\n##cle\nrunning\n##ree\n313\nck\ndate\nq4\n##ctive\n##ool\n##＞\nmk\n##ira\n##163\n388\ndie\nsecret\nrq\ndota\nbuffet\nは１ヶ\ne6\n##ez\npan\n368\nha\n##card\n##cha\n2a\n##さ\nalan\nday3\neye\nf3\n##end\nfrance\nkeep\nadi\nrna\ntvbs\n##ala\nsolo\nnova\n##え\n##tail\n##ょう\nsupport\n##ries\n##なる\n##ved\nbase\ncopy\niis\nfps\n##ways\nhero\nhgih\nprofile\nfish\nmu\nssh\nentertainment\nchang\n##wd\nclick\ncake\n##ond\npre\n##tom\nkic\npixel\n##ov\n##fl\nproduct\n6a\n##pd\ndear\n##gate\nes\nyumi\naudio\n##²\n##sky\necho\nbin\nwhere\n##ture\n329\n##ape\nfind\nsap\nisis\n##なと\nnand\n##101\n##load\n##ream\nband\na6\n525\nnever\n##post\nfestival\n50cm\n##we\n555\nguide\n314\nzenfone\n##ike\n335\ngd\nforum\njessica\nstrong\nalexander\n##ould\nsoftware\nallen\n##ious\nprogram\n360°\nelse\nlohasthree\n##gar\nすることかてきます\nplease\n##れます\nrc\n##ggle\n##ric\nbim\n50000\n##own\neclipse\n355\nbrian\n3ds\n##side\n061\n361\n##other\n##ける\n##tech\n##ator\n485\nengine\n##ged\n##ｔ\nplaza\n##fit\ncia\nngo\nwestbrook\nshi\ntbs\n50mm\n##みませんか\nsci\n291\nreuters\n##ily\ncontextlink\n##hn\naf\n##cil\nbridge\nvery\n##cel\n1890\ncambridge\n##ize\n15g\n##aid\n##data\n790\nfrm\n##head\naward\nbutler\n##sun\nmeta\n##mar\namerica\nps3\npuma\npmid\n##すか\nlc\n670\nkitchen\n##lic\nオーフン5\nきなしソフトサーヒス\nそして\nday1\nfuture\n
★★★★\n##text\n##page\n##rris\npm1\n##ket\nfans\n##っています\n1001\nchristian\nbot\nkids\ntrackback\n##hai\nc3\ndisplay\n##hl\nn2\n1896\nidea\nさんも\n##sent\nairmail\n##ug\n##men\npwm\nけます\n028\n##lution\n369\n852\nawards\nschemas\n354\nasics\nwikipedia\nfont\n##tional\n##vy\nc2\n293\n##れている\n##dget\n##ein\nっている\ncontact\npepper\nスキル\n339\n##～5\n294\n##uel\n##ument\n730\n##hang\nみてす\nq5\n##sue\nrain\n##ndi\nwei\nswatch\n##cept\nわせ\n331\npopular\n##ste\n##tag\np2\n501\ntrc\n1899\n##west\n##live\njustin\nhonda\nping\nmessenger\n##rap\nv9\n543\n##とは\nunity\nappqq\nはすへて\n025\nleo\n##tone\n##テ\n##ass\nuniqlo\n##010\n502\nher\njane\nmemory\nmoneydj\n##tical\nhuman\n12306\nしていると\n##m2\ncoc\nmiacare\n##mn\ntmt\n##core\nvim\nkk\n##may\nfan\ntarget\nuse\ntoo\n338\n435\n2050\n867\n737\nfast\n##2c\nservices\n##ope\nomega\nenergy\n##わ\npinkoi\n1a\n##なから\n##rain\njackson\n##ement\n##シャンルの\n374\n366\nそんな\np9\nrd\n##ᆨ\n1111\n##tier\n##vic\nzone\n##│\n385\n690\ndl\nisofix\ncpa\nm4\n322\nkimi\nめて\ndavis\n##lay\nlulu\n##uck\n050\nweeks\nqs\n##hop\n920\n##ｎ\nae\n##ear\n～5\neia\n405\n##fly\nkorea\njpeg\nboost\n##ship\nsmall\n##リア\n1860\neur\n297\n425\nvalley\n##iel\nsimple\n##ude\nrn\nk2\n##ena\nされます\nnon\npatrick\nしているから\n##ナー\nfeed\n5757\n30g\nprocess\nwell\nqqmei\n##thing\nthey\naws\nlu\npink\n##ters\n##kin\nまたは\nboard\n##vertisement\nwine\n##ien\nunicode\n##dge\nr1\n359\n##tant\nいを\n##twitter\n##3c\ncool1\nされる\n##れて\n##ｌ\nisp\n##012\nstandard\n45㎡2\n402\n##150\nmatt\n##fu\n326\n##iner\ngooglemsn\npixnetfacebookyahoo\n##ラン\nx7\n886\n##uce\nメーカー\nsao\n##ev\n##きました\n##file\n9678\n403\nxddd\nshirt\n6l\n##rio\n##hat\n3mm\ngivenchy\nya\nbang\n##lio\nmonday\ncrystal\nロクイン\n##abc\n336\nhead\n890\nubuntuforumwikilinuxpastechat\n##vc\n##～20\n##rity\ncnc\n7866\nipv6\nnull\n1897\n##ost\nyang\nimsean\ntiger\n##fet\n##ンス\n352\n##＝\ndji\n327\nji\nmaria\n##come\n##んて\nfoundation\n3100\n##beth\n##なった\n1m\n601\nactive\n##aft\n##don\n3p\nsr\n349\nemma\n##khz\nliving\n415\n353\n1889\n341\n709\n457\nsas\nx6\n#
#face\npptv\nx4\n##mate\nhan\nsophie\n##jing\n337\nfifa\n##mand\nother\nsale\ninwedding\n##gn\nてきちゃいます\n##mmy\n##pmlast\nbad\nnana\nnbc\nしてみてくたさいね\nなとはお\n##wu\n##かあります\n##あ\nnote7\nsingle\n##340\nせからこ\nしてくたさい♪この\nしにはとんとんワークケートを\nするとあなたにもっとマッチした\nならワークケートへ\nもみつかっちゃうかも\nワークケートの\n##bel\nwindow\n##dio\n##ht\nunion\nage\n382\n１４\n##ivity\n##ｙ\nコメント\ndomain\nneo\n##isa\n##lter\n5k\nf5\nsteven\n##cts\npowerpoint\ntft\nself\ng2\nft\n##テル\nzol\n##act\nmwc\n381\n343\nもう\nnbapop\n408\nてある\neds\nace\n##room\nprevious\nauthor\ntomtom\nil\n##ets\nhu\nfinancial\n☆☆☆\nっています\nbp\n5t\nchi\n1gb\n##hg\nfairmont\ncross\n008\ngay\nh2\nfunction\n##けて\n356\nalso\n1b\n625\n##ータ\n##raph\n1894\n3～5\n##ils\ni3\n334\navenue\n##host\nによる\n##bon\n##tsu\nmessage\nnavigation\n50g\nfintech\nh6\n##ことを\n8cm\n##ject\n##vas\n##firm\ncredit\n##wf\nxxxx\nform\n##nor\n##space\nhuawei\nplan\njson\nsbl\n##dc\nmachine\n921\n392\nwish\n##120\n##sol\nwindows7\nedward\n##ために\ndevelopment\nwashington\n##nsis\nlo\n818\n##sio\n##ym\n##bor\nplanet\n##～8\n##wt\nieee\ngpa\n##めて\ncamp\nann\ngm\n##tw\n##oka\nconnect\n##rss\n##work\n##atus\nwall\nchicken\nsoul\n2mm\n##times\nfa\n##ather\n##cord\n009\n##eep\nhitachi\ngui\nharry\n##pan\ne1\ndisney\n##press\n##ーション\nwind\n386\nfrigidaire\n##tl\nliu\nhsu\n332\nbasic\nvon\nev\nいた\nてきる\nスホンサーサイト\nlearning\n##ull\nexpedia\narchives\nchange\n##wei\nsanta\ncut\nins\n6gb\nturbo\nbrand\ncf1\n508\n004\nreturn\n747\n##rip\nh1\n##nis\n##をこ\n128gb\n##にお\n3t\napplication\nしており\nemc\nrx\n##oon\n384\nquick\n412\n15058\nwilson\nwing\nchapter\n##bug\nbeyond\n##cms\n##dar\n##oh\nzoom\ne2\ntrip\nsb\n##nba\nrcep\n342\naspx\nci\n080\ngc\ngnu\nめる\n##count\nadvanced\ndance\ndv\n##url\n##ging\n367\n8591\nam09\nshadow\nbattle\n346\n##ｉ\n##cia\n##という\nemily\n##のてす\n##tation\nhost\nff\ntechorz\nsars\n##mini\n##mporary\n##ering\nnc\n4200\n798\n##next\ncma\n##mbps\n##gas\n##ift\n##dot\n##ィ\n455\n##～17\namana\n##りの\n426\n##ros\nir\n00㎡1\n##eet\n##ible\n##↓\n710\nˋ▽ˊ\n##aka\ndcs\niq\n##ｖ\nl1\n##lor\nmagg
ie\n##011\n##iu\n588\n##～1\n830\n##gt\n1tb\narticles\ncreate\n##burg\n##iki\ndatabase\nfantasy\n##rex\n##cam\ndlc\ndean\n##you\nhard\npath\ngaming\nvictoria\nmaps\ncb\n##lee\n##itor\noverchicstoretvhome\nsystems\n##xt\n416\np3\nsarah\n760\n##nan\n407\n486\nx9\ninstall\nsecond\n626\n##ann\n##ph\n##rcle\n##nic\n860\n##nar\nec\n##とう\n768\nmetro\nchocolate\n##rian\n～4\n##table\n##しています\nskin\n##sn\n395\nmountain\n##0mm\ninparadise\n6m\n7x24\nib\n4800\n##jia\neeworld\ncreative\ng5\ng3\n357\nparker\necfa\nvillage\nからの\n18000\nsylvia\nサーヒス\nhbl\n##ques\n##onsored\n##x2\n##きます\n##v4\n##tein\nie6\n383\n##stack\n389\nver\n##ads\n##baby\nsound\nbbe\n##110\n##lone\n##uid\nads\n022\ngundam\n351\nthinkpad\n006\nscrum\nmatch\n##ave\nmems\n##470\n##oy\n##なりました\n##talk\nglass\nlamigo\nspan\n##eme\njob\n##a5\njay\nwade\nkde\n498\n##lace\nocean\ntvg\n##covery\n##r3\n##ners\n##rea\njunior\nthink\n##aine\ncover\n##ision\n##sia\n↓↓\n##bow\nmsi\n413\n458\n406\n##love\n711\n801\nsoft\nz2\n##pl\n456\n1840\nmobil\nmind\n##uy\n427\nnginx\n##oi\nめた\n##rr\n6221\n##mple\n##sson\n##ーシてす\n371\n##nts\n91tv\ncomhd\ncrv3000\n##uard\n1868\n397\ndeep\nlost\nfield\ngallery\n##bia\nrate\nspf\nredis\ntraction\n930\nicloud\n011\nなら\nfe\njose\n372\n##tory\ninto\nsohu\nfx\n899\n379\nkicstart2\n##hia\nすく\n##～3\n##sit\nra\n２４\n##walk\n##xure\n500g\n##pact\npacific\nxa\nnatural\ncarlo\n##250\n##walker\n1850\n##can\ncto\ngigi\n516\n##サー\npen\n##hoo\nob\nmatlab\n##ｂ\n##yy\n13913459\n##iti\nmango\n##bbs\nsense\nc5\noxford\n##ニア\nwalker\njennifer\n##ola\ncourse\n##bre\n701\n##pus\n##rder\nlucky\n075\n##ぁ\nivy\nなお\n##nia\nsotheby\nside\n##ugh\njoy\n##orage\n##ush\n##bat\n##dt\n364\nr9\n##2d\n##gio\n511\ncountry\nwear\n##lax\n##～7\n##moon\n393\nseven\nstudy\n411\n348\nlonzo\n8k\n##ェ\nevolution\n##イフ\n##kk\ngs\nkd\n##レス\narduino\n344\nb12\n##lux\narpg\n##rdon\ncook\n##x5\ndark\nfive\n##als\n##ida\nとても\nsign\n362\n##ちの\nsomething\n20mm\n##nda\n387\n##posted\nfresh\ntf\n1870\n422\ncam\n##mine\n##skip\n##form\n##ssion\ne
ducation\n394\n##tee\ndyson\nstage\n##jie\nwant\n##night\nepson\npack\nあります\n##ppy\nテリヘル\n##█\nwd\n##eh\n##rence\nleft\n##lvin\ngolden\nmhz\ndiscovery\n##trix\n##n2\nloft\n##uch\n##dra\n##sse\nspeed\n～1\n1mdb\nsorry\nwelcome\n##urn\nwave\ngaga\n##lmer\nteddy\n##160\nトラックハック\nせよ\n611\n##f2016\n378\nrp\n##sha\nrar\n##あなたに\n##きた\n840\nholiday\n##ュー\n373\n074\n##vg\n##nos\n##rail\ngartner\ngi\n6p\n##dium\nkit\n488\nb3\neco\n##ろう\n20g\nsean\n##stone\nautocad\nnu\n##np\nf16\nwrite\n029\nm5\n##ias\nimages\natp\n##dk\nfsm\n504\n1350\nve\n52kb\n##xxx\n##のに\n##cake\n414\nunit\nlim\nru\n1v\n##ification\npublished\nangela\n16g\nanalytics\nak\n##ｑ\n##nel\ngmt\n##icon\nagain\n##₂\n##bby\nios11\n445\nかこさいます\nwaze\nいてす\n##ハ\n9985\n##ust\n##ティー\nframework\n##007\niptv\ndelete\n52sykb\ncl\nwwdc\n027\n30cm\n##fw\n##ての\n1389\n##xon\nbrandt\n##ses\n##dragon\ntc\nvetements\nanne\nmonte\nmodern\nofficial\n##へて\n##ere\n##nne\n##oud\nもちろん\n５０\netnews\n##a2\n##graphy\n421\n863\n##ちゃん\n444\n##rtex\n##てお\nl2\n##gma\nmount\nccd\nたと\narchive\nmorning\ntan\nddos\ne7\n##ホ\nday4\n##ウ\ngis\n453\nits\n495\nfactory\nbruce\npg\n##ito\nってくたさい\nguest\ncdma\n##lling\n536\nn3\nしかし\n3～4\nmega\neyes\nro\n１３\nwomen\ndac\nchurch\n##jun\nsingapore\n##facebook\n6991\nstarbucks\n##tos\n##stin\n##shine\nzen\n##mu\ntina\n20℃\n1893\n##たけて\n503\n465\nrequest\n##gence\nqt\n##っ\n1886\n347\n363\nq7\n##zzi\ndiary\n##tore\n409\n##ead\n468\ncst\n##osa\ncanada\nagent\nva\n##jiang\n##ちは\n##ーク\n##lam\nsg\n##nix\n##sday\n##よって\ng6\n##master\nbing\n##zl\ncharlie\n１６\n8mm\nnb40\n##ーン\nthai\n##ルフ\nln284ct\n##itz\n##2f\nbonnie\n##food\n##lent\noriginals\n##stro\n##lts\n418\n∟∣\n##bscribe\nchildren\nntd\nyesstyle\n##かも\nhmv\n##tment\nd5\n2cm\narts\nsms\n##pn\n##я\n##いい\ntopios9\n539\nlifestyle\nvirtual\n##ague\nxz\n##deo\nmuji\n024\nunt\n##nnis\n##ᅩ\nfaq1\n1884\n396\n##ette\nfly\n64㎡\nはしめまして\n441\ncurry\n##pop\nのこ\nrelease\n##←\n##◆◆\n##cast\n073\nありな\n500ml\n##ews\n5c\n##stle\nios7\n##ima\n787\ndog\nlenovo\n##r4\nroger\n013\ncbs\n
vornado\n100m\n417\n##desk\n##クok\n##ald\n1867\n9595\n2900\n##van\noil\n##ｘ\nsome\nbreak\ncommon\n##jy\n##lines\ng7\ntwice\n419\nella\nnano\nbelle\nにこ\n##mes\n##self\n##note\njb\n##ことかてきます\nbenz\n##との\n##ova\n451\nsave\n##wing\n##ますのて\nkai\nりは\n##hua\n##rect\nrainer\n##unge\n448\n##0m\nadsl\n##かな\nguestname\n##uma\n##kins\n##zu\ntokichoi\n##price\ncounty\n##med\n##mus\nrmk\n391\naddress\nvm\nえて\nopenload\n##group\n##hin\n##iginal\namg\nurban\n##oz\njobs\nemi\n##public\nbeautiful\n##sch\nalbum\n##dden\n##bell\njerry\nworks\nhostel\nmiller\n##drive\n##rmin\n##１０\n376\nboot\n828\n##370\n##fx\n##cm～\n1885\n##nome\n##ctionary\n##oman\n##lish\n##cr\n##hm\n433\n##how\n432\nfrancis\nxi\nc919\nb5\nevernote\n##uc\nvga\n##3000\ncoupe\n##urg\n##cca\n##uality\n019\n6g\nれる\nmulti\n##また\n##ett\nem\nhey\n##ani\n##tax\n##rma\ninside\nthan\n740\nleonnhurt\n##jin\nict\nれた\nbird\nnotes\n200mm\nくの\n##dical\n##lli\nresult\n442\niu\nee\n438\nsmap\ngopro\n##last\nyin\npure\n998\n32g\nけた\n5kg\n##dan\n##rame\nmama\n##oot\nbean\nmarketing\n##hur\n2l\nbella\nsync\nxuite\n##ground\n515\ndiscuz\n##getrelax\n##ince\n##bay\n##5s\ncj\n##イス\ngmat\napt\n##pass\njing\n##rix\nc4\nrich\n##とても\nniusnews\n##ello\nbag\n770\n##eting\n##mobile\n１８\nculture\n015\n##のてすか\n377\n1020\narea\n##ience\n616\ndetails\ngp\nuniversal\nsilver\ndit\nはお\nprivate\nddd\nu11\nkanshu\n##ified\nfung\n##nny\ndx\n##520\ntai\n475\n023\n##fr\n##lean\n3s\n##pin\n429\n##rin\n25000\nly\nrick\n##bility\nusb3\nbanner\n##baru\n##gion\nmetal\ndt\nvdf\n1871\nkarl\nqualcomm\nbear\n1010\noldid\nian\njo\n##tors\npopulation\n##ernel\n1882\nmmorpg\n##mv\n##bike\n603\n##©\nww\nfriend\n##ager\nexhibition\n##del\n##pods\nfpx\nstructure\n##free\n##tings\nkl\n##rley\n##copyright\n##mma\ncalifornia\n3400\norange\nyoga\n4l\ncanmake\nhoney\n##anda\n##コメント\n595\nnikkie\n##ルハイト\ndhl\npublishing\n##mall\n##gnet\n20cm\n513\n##クセス\n##┅\ne88\n970\n##dog\nfishbase\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=
\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##{\n##|\n##}\n##~\n##£\n##¤\n##¥\n##§\n##«\n##±\n##³\n##µ\n##·\n##¹\n##º\n##»\n##¼\n##ß\n##æ\n##÷\n##ø\n##đ\n##ŋ\n##ɔ\n##ə\n##ɡ\n##ʰ\n##ˇ\n##ˈ\n##ˊ\n##ˋ\n##ˍ\n##ː\n##˙\n##˚\n##ˢ\n##α\n##β\n##γ\n##δ\n##ε\n##η\n##θ\n##ι\n##κ\n##λ\n##μ\n##ν\n##ο\n##π\n##ρ\n##ς\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##б\n##в\n##г\n##д\n##е\n##ж\n##з\n##к\n##л\n##м\n##н\n##о\n##п\n##р\n##с\n##т\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##ы\n##ь\n##і\n##ا\n##ب\n##ة\n##ت\n##د\n##ر\n##س\n##ع\n##ل\n##م\n##ن\n##ه\n##و\n##ي\n##۩\n##ก\n##ง\n##น\n##ม\n##ย\n##ร\n##อ\n##า\n##เ\n##๑\n##་\n##ღ\n##ᄀ\n##ᄁ\n##ᄂ\n##ᄃ\n##ᄅ\n##ᄆ\n##ᄇ\n##ᄈ\n##ᄉ\n##ᄋ\n##ᄌ\n##ᄎ\n##ᄏ\n##ᄐ\n##ᄑ\n##ᄒ\n##ᅢ\n##ᅣ\n##ᅥ\n##ᅦ\n##ᅧ\n##ᅨ\n##ᅪ\n##ᅬ\n##ᅭ\n##ᅮ\n##ᅯ\n##ᅲ\n##ᅳ\n##ᅴ\n##ᆷ\n##ᆸ\n##ᆺ\n##ᆻ\n##ᗜ\n##ᵃ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵘ\n##‖\n##„\n##†\n##•\n##‥\n##‧\n## \n##‰\n##′\n##″\n##‹\n##›\n##※\n##‿\n##⁄\n##ⁱ\n##⁺\n##ⁿ\n##₁\n##₃\n##₄\n##€\n##№\n##ⅰ\n##ⅱ\n##ⅲ\n##ⅳ\n##ⅴ\n##↔\n##↗\n##↘\n##⇒\n##∀\n##−\n##∕\n##∙\n##√\n##∞\n##∟\n##∠\n##∣\n##∩\n##∮\n##∶\n##∼\n##∽\n##≈\n##≒\n##≡\n##≤\n##≥\n##≦\n##≧\n##≪\n##≫\n##⊙\n##⋅\n##⋈\n##⋯\n##⌒\n##①\n##②\n##③\n##④\n##⑤\n##⑥\n##⑦\n##⑧\n##⑨\n##⑩\n##⑴\n##⑵\n##⑶\n##⑷\n##⑸\n##⒈\n##⒉\n##⒊\n##⒋\n##ⓒ\n##ⓔ\n##ⓘ\n##━\n##┃\n##┆\n##┊\n##┌\n##└\n##├\n##┣\n##═\n##║\n##╚\n##╞\n##╠\n##╭\n##╮\n##╯\n##╰\n##╱\n##╳\n##▂\n##▃\n##▅\n##▇\n##▉\n##▋\n##▌\n##▍\n##▎\n##□\n##▪\n##▫\n##▬\n##△\n##▶\n##►\n##▽\n##◇\n##◕\n##◠\n##◢\n##◤\n##☀\n##☕\n##☞\n##☺\n##☼\n##♀\n##♂\n##♠\n##♡\n##♣\n##♦\n##♫\n##♬\n##✈\n##✔\n##✕\n##✖\n##✦\n##✨\n##✪\n##✰\n##✿\n##❀\n##➜\n##➤\n##⦿\n##、\n##。\n##〃\n##々\n##〇\n##〈\n##〉\n##《\n##》\n##「\n##」\n##『\n##』\n##【\n##】\n##〓\n##〔\n##〕\n##〖\n##〗\n##〜\n##〝\n##〞\n##ぃ\n##ぇ\n##ぬ\n##ふ\n##ほ\n##む\n##ゃ\n##ゅ\n##ゆ\n##ょ\n##゜\n##ゝ\n##ァ\n##ゥ\n##エ\n##ォ\n##ケ\n##サ\n##セ\n##ソ\n##ッ\n##ニ\n##ヌ\n##ネ\n##ノ\n##ヘ\n##モ\n##ャ\n##ヤ\n##ュ\n##ユ\n##ョ\n##ヨ\n##ワ\n##ヲ\n##・\n##ヽ\n##ㄅ\n##ㄆ\n##ㄇ\n##ㄉ\n##ㄋ\n##ㄌ\n##ㄍ\n##ㄎ\n##ㄏ\n##ㄒ\n##ㄚ\n##ㄛ\n##ㄞ\n##ㄟ\n##ㄢ\n##ㄤ\n##ㄥ\n##ㄧ\n##ㄨ\n##ㆍ\n##㈦\n##㊣\n##㗎\n##一\n##丁\n##七\n##万\n##丈\n##三\n##上\n##
下\n##不\n##与\n##丐\n##丑\n##专\n##且\n##丕\n##世\n##丘\n##丙\n##业\n##丛\n##东\n##丝\n##丞\n##丟\n##両\n##丢\n##两\n##严\n##並\n##丧\n##丨\n##个\n##丫\n##中\n##丰\n##串\n##临\n##丶\n##丸\n##丹\n##为\n##主\n##丼\n##丽\n##举\n##丿\n##乂\n##乃\n##久\n##么\n##义\n##之\n##乌\n##乍\n##乎\n##乏\n##乐\n##乒\n##乓\n##乔\n##乖\n##乗\n##乘\n##乙\n##乜\n##九\n##乞\n##也\n##习\n##乡\n##书\n##乩\n##买\n##乱\n##乳\n##乾\n##亀\n##亂\n##了\n##予\n##争\n##事\n##二\n##于\n##亏\n##云\n##互\n##五\n##井\n##亘\n##亙\n##亚\n##些\n##亜\n##亞\n##亟\n##亡\n##亢\n##交\n##亥\n##亦\n##产\n##亨\n##亩\n##享\n##京\n##亭\n##亮\n##亲\n##亳\n##亵\n##人\n##亿\n##什\n##仁\n##仃\n##仄\n##仅\n##仆\n##仇\n##今\n##介\n##仍\n##从\n##仏\n##仑\n##仓\n##仔\n##仕\n##他\n##仗\n##付\n##仙\n##仝\n##仞\n##仟\n##代\n##令\n##以\n##仨\n##仪\n##们\n##仮\n##仰\n##仲\n##件\n##价\n##任\n##份\n##仿\n##企\n##伉\n##伊\n##伍\n##伎\n##伏\n##伐\n##休\n##伕\n##众\n##优\n##伙\n##会\n##伝\n##伞\n##伟\n##传\n##伢\n##伤\n##伦\n##伪\n##伫\n##伯\n##估\n##伴\n##伶\n##伸\n##伺\n##似\n##伽\n##佃\n##但\n##佇\n##佈\n##位\n##低\n##住\n##佐\n##佑\n##体\n##佔\n##何\n##佗\n##佘\n##余\n##佚\n##佛\n##作\n##佝\n##佞\n##佟\n##你\n##佢\n##佣\n##佤\n##佥\n##佩\n##佬\n##佯\n##佰\n##佳\n##併\n##佶\n##佻\n##佼\n##使\n##侃\n##侄\n##來\n##侈\n##例\n##侍\n##侏\n##侑\n##侖\n##侗\n##供\n##依\n##侠\n##価\n##侣\n##侥\n##侦\n##侧\n##侨\n##侬\n##侮\n##侯\n##侵\n##侶\n##侷\n##便\n##係\n##促\n##俄\n##俊\n##俎\n##俏\n##俐\n##俑\n##俗\n##俘\n##俚\n##保\n##俞\n##俟\n##俠\n##信\n##俨\n##俩\n##俪\n##俬\n##俭\n##修\n##俯\n##俱\n##俳\n##俸\n##俺\n##俾\n##倆\n##倉\n##個\n##倌\n##倍\n##倏\n##們\n##倒\n##倔\n##倖\n##倘\n##候\n##倚\n##倜\n##借\n##倡\n##値\n##倦\n##倩\n##倪\n##倫\n##倬\n##倭\n##倶\n##债\n##值\n##倾\n##偃\n##假\n##偈\n##偉\n##偌\n##偎\n##偏\n##偕\n##做\n##停\n##健\n##側\n##偵\n##偶\n##偷\n##偻\n##偽\n##偿\n##傀\n##傅\n##傍\n##傑\n##傘\n##備\n##傚\n##傢\n##傣\n##傥\n##储\n##傩\n##催\n##傭\n##傲\n##傳\n##債\n##傷\n##傻\n##傾\n##僅\n##働\n##像\n##僑\n##僕\n##僖\n##僚\n##僥\n##僧\n##僭\n##僮\n##僱\n##僵\n##價\n##僻\n##儀\n##儂\n##億\n##儆\n##儉\n##儋\n##儒\n##儕\n##儘\n##償\n##儡\n##優\n##儲\n##儷\n##儼\n##儿\n##兀\n##允\n##元\n##兄\n##充\n##兆\n##兇\n##先\n##光\n##克\n##兌\n##免\n##児\n##兑\n##兒\n##兔\n##兖\n##党\n##兜\n##兢\n##入\n##內\n##全\n##兩\n##八\n##公\n##六\n##兮\n##兰\n##共\n##兲\n##关\n##兴\n##兵\n##其\n##具\n##典\n##兹\n##养\n##兼\n##兽\n##
冀\n##内\n##円\n##冇\n##冈\n##冉\n##冊\n##册\n##再\n##冏\n##冒\n##冕\n##冗\n##写\n##军\n##农\n##冠\n##冢\n##冤\n##冥\n##冨\n##冪\n##冬\n##冯\n##冰\n##冲\n##决\n##况\n##冶\n##冷\n##冻\n##冼\n##冽\n##冾\n##净\n##凄\n##准\n##凇\n##凈\n##凉\n##凋\n##凌\n##凍\n##减\n##凑\n##凛\n##凜\n##凝\n##几\n##凡\n##凤\n##処\n##凪\n##凭\n##凯\n##凰\n##凱\n##凳\n##凶\n##凸\n##凹\n##出\n##击\n##函\n##凿\n##刀\n##刁\n##刃\n##分\n##切\n##刈\n##刊\n##刍\n##刎\n##刑\n##划\n##列\n##刘\n##则\n##刚\n##创\n##初\n##删\n##判\n##別\n##刨\n##利\n##刪\n##别\n##刮\n##到\n##制\n##刷\n##券\n##刹\n##刺\n##刻\n##刽\n##剁\n##剂\n##剃\n##則\n##剉\n##削\n##剋\n##剌\n##前\n##剎\n##剐\n##剑\n##剔\n##剖\n##剛\n##剜\n##剝\n##剣\n##剤\n##剥\n##剧\n##剩\n##剪\n##副\n##割\n##創\n##剷\n##剽\n##剿\n##劃\n##劇\n##劈\n##劉\n##劊\n##劍\n##劏\n##劑\n##力\n##劝\n##办\n##功\n##加\n##务\n##劣\n##动\n##助\n##努\n##劫\n##劭\n##励\n##劲\n##劳\n##労\n##劵\n##効\n##劾\n##势\n##勁\n##勃\n##勇\n##勉\n##勋\n##勐\n##勒\n##動\n##勖\n##勘\n##務\n##勛\n##勝\n##勞\n##募\n##勢\n##勤\n##勧\n##勳\n##勵\n##勸\n##勺\n##勻\n##勾\n##勿\n##匀\n##包\n##匆\n##匈\n##匍\n##匐\n##匕\n##化\n##北\n##匙\n##匝\n##匠\n##匡\n##匣\n##匪\n##匮\n##匯\n##匱\n##匹\n##区\n##医\n##匾\n##匿\n##區\n##十\n##千\n##卅\n##升\n##午\n##卉\n##半\n##卍\n##华\n##协\n##卑\n##卒\n##卓\n##協\n##单\n##卖\n##南\n##単\n##博\n##卜\n##卞\n##卟\n##占\n##卡\n##卢\n##卤\n##卦\n##卧\n##卫\n##卮\n##卯\n##印\n##危\n##即\n##却\n##卵\n##卷\n##卸\n##卻\n##卿\n##厂\n##厄\n##厅\n##历\n##厉\n##压\n##厌\n##厕\n##厘\n##厚\n##厝\n##原\n##厢\n##厥\n##厦\n##厨\n##厩\n##厭\n##厮\n##厲\n##厳\n##去\n##县\n##叁\n##参\n##參\n##又\n##叉\n##及\n##友\n##双\n##反\n##収\n##发\n##叔\n##取\n##受\n##变\n##叙\n##叛\n##叟\n##叠\n##叡\n##叢\n##口\n##古\n##句\n##另\n##叨\n##叩\n##只\n##叫\n##召\n##叭\n##叮\n##可\n##台\n##叱\n##史\n##右\n##叵\n##叶\n##号\n##司\n##叹\n##叻\n##叼\n##叽\n##吁\n##吃\n##各\n##吆\n##合\n##吉\n##吊\n##吋\n##同\n##名\n##后\n##吏\n##吐\n##向\n##吒\n##吓\n##吕\n##吖\n##吗\n##君\n##吝\n##吞\n##吟\n##吠\n##吡\n##否\n##吧\n##吨\n##吩\n##含\n##听\n##吭\n##吮\n##启\n##吱\n##吳\n##吴\n##吵\n##吶\n##吸\n##吹\n##吻\n##吼\n##吽\n##吾\n##呀\n##呂\n##呃\n##呆\n##呈\n##告\n##呋\n##呎\n##呐\n##呓\n##呕\n##呗\n##员\n##呛\n##呜\n##呢\n##呤\n##呦\n##周\n##呱\n##呲\n##味\n##呵\n##呷\n##呸\n##呻\n##呼\n##命\n##咀\n##咁\n##咂\n##咄\n##咆\n##咋\n##和\n##咎\n##咏\n##咐\n##咒\n##咔\n##咕\n##咖\n##咗\n##
咘\n##咙\n##咚\n##咛\n##咣\n##咤\n##咦\n##咧\n##咨\n##咩\n##咪\n##咫\n##咬\n##咭\n##咯\n##咱\n##咲\n##咳\n##咸\n##咻\n##咽\n##咿\n##哀\n##品\n##哂\n##哄\n##哆\n##哇\n##哈\n##哉\n##哋\n##哌\n##响\n##哎\n##哏\n##哐\n##哑\n##哒\n##哔\n##哗\n##哟\n##員\n##哥\n##哦\n##哧\n##哨\n##哩\n##哪\n##哭\n##哮\n##哲\n##哺\n##哼\n##哽\n##唁\n##唄\n##唆\n##唇\n##唉\n##唏\n##唐\n##唑\n##唔\n##唠\n##唤\n##唧\n##唬\n##售\n##唯\n##唰\n##唱\n##唳\n##唷\n##唸\n##唾\n##啃\n##啄\n##商\n##啉\n##啊\n##問\n##啓\n##啕\n##啖\n##啜\n##啞\n##啟\n##啡\n##啤\n##啥\n##啦\n##啧\n##啪\n##啫\n##啬\n##啮\n##啰\n##啱\n##啲\n##啵\n##啶\n##啷\n##啸\n##啻\n##啼\n##啾\n##喀\n##喂\n##喃\n##善\n##喆\n##喇\n##喉\n##喊\n##喋\n##喎\n##喏\n##喔\n##喘\n##喙\n##喚\n##喜\n##喝\n##喟\n##喧\n##喪\n##喫\n##喬\n##單\n##喰\n##喱\n##喲\n##喳\n##喵\n##営\n##喷\n##喹\n##喺\n##喻\n##喽\n##嗅\n##嗆\n##嗇\n##嗎\n##嗑\n##嗒\n##嗓\n##嗔\n##嗖\n##嗚\n##嗜\n##嗝\n##嗟\n##嗡\n##嗣\n##嗤\n##嗦\n##嗨\n##嗪\n##嗬\n##嗯\n##嗰\n##嗲\n##嗳\n##嗶\n##嗷\n##嗽\n##嘀\n##嘅\n##嘆\n##嘈\n##嘉\n##嘌\n##嘍\n##嘎\n##嘔\n##嘖\n##嘗\n##嘘\n##嘚\n##嘛\n##嘜\n##嘞\n##嘟\n##嘢\n##嘣\n##嘤\n##嘧\n##嘩\n##嘭\n##嘮\n##嘯\n##嘰\n##嘱\n##嘲\n##嘴\n##嘶\n##嘸\n##嘹\n##嘻\n##嘿\n##噁\n##噌\n##噎\n##噓\n##噔\n##噗\n##噙\n##噜\n##噠\n##噢\n##噤\n##器\n##噩\n##噪\n##噬\n##噱\n##噴\n##噶\n##噸\n##噹\n##噻\n##噼\n##嚀\n##嚇\n##嚎\n##嚏\n##嚐\n##嚓\n##嚕\n##嚟\n##嚣\n##嚥\n##嚨\n##嚮\n##嚴\n##嚷\n##嚼\n##囂\n##囉\n##囊\n##囍\n##囑\n##囔\n##囗\n##囚\n##四\n##囝\n##回\n##囟\n##因\n##囡\n##团\n##団\n##囤\n##囧\n##囪\n##囫\n##园\n##困\n##囱\n##囲\n##図\n##围\n##囹\n##固\n##国\n##图\n##囿\n##圃\n##圄\n##圆\n##圈\n##國\n##圍\n##圏\n##園\n##圓\n##圖\n##團\n##圜\n##土\n##圣\n##圧\n##在\n##圩\n##圭\n##地\n##圳\n##场\n##圻\n##圾\n##址\n##坂\n##均\n##坊\n##坍\n##坎\n##坏\n##坐\n##坑\n##块\n##坚\n##坛\n##坝\n##坞\n##坟\n##坠\n##坡\n##坤\n##坦\n##坨\n##坪\n##坯\n##坳\n##坵\n##坷\n##垂\n##垃\n##垄\n##型\n##垒\n##垚\n##垛\n##垠\n##垢\n##垣\n##垦\n##垩\n##垫\n##垭\n##垮\n##垵\n##埂\n##埃\n##埋\n##城\n##埔\n##埕\n##埗\n##域\n##埠\n##埤\n##埵\n##執\n##埸\n##培\n##基\n##埼\n##堀\n##堂\n##堃\n##堅\n##堆\n##堇\n##堑\n##堕\n##堙\n##堡\n##堤\n##堪\n##堯\n##堰\n##報\n##場\n##堵\n##堺\n##堿\n##塊\n##塌\n##塑\n##塔\n##塗\n##塘\n##塚\n##塞\n##塢\n##塩\n##填\n##塬\n##塭\n##塵\n##塾\n##墀\n##境\n##墅\n##墉\n##墊\n##墒\n##墓\n##増\n##墘\n##墙\n##墜\n##增\n##墟\n##墨\n##墩\n##墮\n##墳\n##
墻\n##墾\n##壁\n##壅\n##壆\n##壇\n##壊\n##壑\n##壓\n##壕\n##壘\n##壞\n##壟\n##壢\n##壤\n##壩\n##士\n##壬\n##壮\n##壯\n##声\n##売\n##壳\n##壶\n##壹\n##壺\n##壽\n##处\n##备\n##変\n##复\n##夏\n##夔\n##夕\n##外\n##夙\n##多\n##夜\n##够\n##夠\n##夢\n##夥\n##大\n##天\n##太\n##夫\n##夭\n##央\n##夯\n##失\n##头\n##夷\n##夸\n##夹\n##夺\n##夾\n##奂\n##奄\n##奇\n##奈\n##奉\n##奋\n##奎\n##奏\n##奐\n##契\n##奔\n##奕\n##奖\n##套\n##奘\n##奚\n##奠\n##奢\n##奥\n##奧\n##奪\n##奬\n##奮\n##女\n##奴\n##奶\n##奸\n##她\n##好\n##如\n##妃\n##妄\n##妆\n##妇\n##妈\n##妊\n##妍\n##妒\n##妓\n##妖\n##妘\n##妙\n##妝\n##妞\n##妣\n##妤\n##妥\n##妨\n##妩\n##妪\n##妮\n##妲\n##妳\n##妹\n##妻\n##妾\n##姆\n##姉\n##姊\n##始\n##姍\n##姐\n##姑\n##姒\n##姓\n##委\n##姗\n##姚\n##姜\n##姝\n##姣\n##姥\n##姦\n##姨\n##姪\n##姫\n##姬\n##姹\n##姻\n##姿\n##威\n##娃\n##娄\n##娅\n##娆\n##娇\n##娉\n##娑\n##娓\n##娘\n##娛\n##娜\n##娟\n##娠\n##娣\n##娥\n##娩\n##娱\n##娲\n##娴\n##娶\n##娼\n##婀\n##婁\n##婆\n##婉\n##婊\n##婕\n##婚\n##婢\n##婦\n##婧\n##婪\n##婭\n##婴\n##婵\n##婶\n##婷\n##婺\n##婿\n##媒\n##媚\n##媛\n##媞\n##媧\n##媲\n##媳\n##媽\n##媾\n##嫁\n##嫂\n##嫉\n##嫌\n##嫑\n##嫔\n##嫖\n##嫘\n##嫚\n##嫡\n##嫣\n##嫦\n##嫩\n##嫲\n##嫵\n##嫻\n##嬅\n##嬉\n##嬌\n##嬗\n##嬛\n##嬢\n##嬤\n##嬪\n##嬰\n##嬴\n##嬷\n##嬸\n##嬿\n##孀\n##孃\n##子\n##孑\n##孔\n##孕\n##孖\n##字\n##存\n##孙\n##孚\n##孛\n##孜\n##孝\n##孟\n##孢\n##季\n##孤\n##学\n##孩\n##孪\n##孫\n##孬\n##孰\n##孱\n##孳\n##孵\n##學\n##孺\n##孽\n##孿\n##宁\n##它\n##宅\n##宇\n##守\n##安\n##宋\n##完\n##宏\n##宓\n##宕\n##宗\n##官\n##宙\n##定\n##宛\n##宜\n##宝\n##实\n##実\n##宠\n##审\n##客\n##宣\n##室\n##宥\n##宦\n##宪\n##宫\n##宮\n##宰\n##害\n##宴\n##宵\n##家\n##宸\n##容\n##宽\n##宾\n##宿\n##寂\n##寄\n##寅\n##密\n##寇\n##富\n##寐\n##寒\n##寓\n##寛\n##寝\n##寞\n##察\n##寡\n##寢\n##寥\n##實\n##寧\n##寨\n##審\n##寫\n##寬\n##寮\n##寰\n##寵\n##寶\n##寸\n##对\n##寺\n##寻\n##导\n##対\n##寿\n##封\n##専\n##射\n##将\n##將\n##專\n##尉\n##尊\n##尋\n##對\n##導\n##小\n##少\n##尔\n##尕\n##尖\n##尘\n##尚\n##尝\n##尤\n##尧\n##尬\n##就\n##尴\n##尷\n##尸\n##尹\n##尺\n##尻\n##尼\n##尽\n##尾\n##尿\n##局\n##屁\n##层\n##屄\n##居\n##屆\n##屈\n##屉\n##届\n##屋\n##屌\n##屍\n##屎\n##屏\n##屐\n##屑\n##展\n##屜\n##属\n##屠\n##屡\n##屢\n##層\n##履\n##屬\n##屯\n##山\n##屹\n##屿\n##岀\n##岁\n##岂\n##岌\n##岐\n##岑\n##岔\n##岖\n##岗\n##岘\n##岙\n##岚\n##岛\n##岡\n##岩\n##岫\n##岬\n##岭\n##岱\n##岳\n##
岷\n##岸\n##峇\n##峋\n##峒\n##峙\n##峡\n##峤\n##峥\n##峦\n##峨\n##峪\n##峭\n##峯\n##峰\n##峴\n##島\n##峻\n##峽\n##崁\n##崂\n##崆\n##崇\n##崎\n##崑\n##崔\n##崖\n##崗\n##崙\n##崛\n##崧\n##崩\n##崭\n##崴\n##崽\n##嵇\n##嵊\n##嵋\n##嵌\n##嵐\n##嵘\n##嵩\n##嵬\n##嵯\n##嶂\n##嶄\n##嶇\n##嶋\n##嶙\n##嶺\n##嶼\n##嶽\n##巅\n##巍\n##巒\n##巔\n##巖\n##川\n##州\n##巡\n##巢\n##工\n##左\n##巧\n##巨\n##巩\n##巫\n##差\n##己\n##已\n##巳\n##巴\n##巷\n##巻\n##巽\n##巾\n##巿\n##币\n##市\n##布\n##帅\n##帆\n##师\n##希\n##帐\n##帑\n##帕\n##帖\n##帘\n##帚\n##帛\n##帜\n##帝\n##帥\n##带\n##帧\n##師\n##席\n##帮\n##帯\n##帰\n##帳\n##帶\n##帷\n##常\n##帼\n##帽\n##幀\n##幂\n##幄\n##幅\n##幌\n##幔\n##幕\n##幟\n##幡\n##幢\n##幣\n##幫\n##干\n##平\n##年\n##并\n##幸\n##幹\n##幺\n##幻\n##幼\n##幽\n##幾\n##广\n##庁\n##広\n##庄\n##庆\n##庇\n##床\n##序\n##庐\n##库\n##应\n##底\n##庖\n##店\n##庙\n##庚\n##府\n##庞\n##废\n##庠\n##度\n##座\n##庫\n##庭\n##庵\n##庶\n##康\n##庸\n##庹\n##庾\n##廁\n##廂\n##廃\n##廈\n##廉\n##廊\n##廓\n##廖\n##廚\n##廝\n##廟\n##廠\n##廢\n##廣\n##廬\n##廳\n##延\n##廷\n##建\n##廿\n##开\n##弁\n##异\n##弃\n##弄\n##弈\n##弊\n##弋\n##式\n##弑\n##弒\n##弓\n##弔\n##引\n##弗\n##弘\n##弛\n##弟\n##张\n##弥\n##弦\n##弧\n##弩\n##弭\n##弯\n##弱\n##張\n##強\n##弹\n##强\n##弼\n##弾\n##彅\n##彆\n##彈\n##彌\n##彎\n##归\n##当\n##录\n##彗\n##彙\n##彝\n##形\n##彤\n##彥\n##彦\n##彧\n##彩\n##彪\n##彫\n##彬\n##彭\n##彰\n##影\n##彷\n##役\n##彻\n##彼\n##彿\n##往\n##征\n##径\n##待\n##徇\n##很\n##徉\n##徊\n##律\n##後\n##徐\n##徑\n##徒\n##従\n##徕\n##得\n##徘\n##徙\n##徜\n##從\n##徠\n##御\n##徨\n##復\n##循\n##徬\n##微\n##徳\n##徴\n##徵\n##德\n##徹\n##徼\n##徽\n##心\n##必\n##忆\n##忌\n##忍\n##忏\n##忐\n##忑\n##忒\n##忖\n##志\n##忘\n##忙\n##応\n##忠\n##忡\n##忤\n##忧\n##忪\n##快\n##忱\n##念\n##忻\n##忽\n##忿\n##怀\n##态\n##怂\n##怅\n##怆\n##怎\n##怏\n##怒\n##怔\n##怕\n##怖\n##怙\n##怜\n##思\n##怠\n##怡\n##急\n##怦\n##性\n##怨\n##怪\n##怯\n##怵\n##总\n##怼\n##恁\n##恃\n##恆\n##恋\n##恍\n##恐\n##恒\n##恕\n##恙\n##恚\n##恢\n##恣\n##恤\n##恥\n##恨\n##恩\n##恪\n##恫\n##恬\n##恭\n##息\n##恰\n##恳\n##恵\n##恶\n##恸\n##恺\n##恻\n##恼\n##恿\n##悄\n##悅\n##悉\n##悌\n##悍\n##悔\n##悖\n##悚\n##悟\n##悠\n##患\n##悦\n##您\n##悩\n##悪\n##悬\n##悯\n##悱\n##悲\n##悴\n##悵\n##悶\n##悸\n##悻\n##悼\n##悽\n##情\n##惆\n##惇\n##惊\n##惋\n##惑\n##惕\n##惘\n##惚\n##惜\n##惟\n##惠\n##惡\n##惦\n##惧\n##惨\n##惩\n##惫\n##惬\n##惭\n##
惮\n##惯\n##惰\n##惱\n##想\n##惴\n##惶\n##惹\n##惺\n##愁\n##愆\n##愈\n##愉\n##愍\n##意\n##愕\n##愚\n##愛\n##愜\n##感\n##愣\n##愤\n##愧\n##愫\n##愷\n##愿\n##慄\n##慈\n##態\n##慌\n##慎\n##慑\n##慕\n##慘\n##慚\n##慟\n##慢\n##慣\n##慧\n##慨\n##慫\n##慮\n##慰\n##慳\n##慵\n##慶\n##慷\n##慾\n##憂\n##憊\n##憋\n##憎\n##憐\n##憑\n##憔\n##憚\n##憤\n##憧\n##憨\n##憩\n##憫\n##憬\n##憲\n##憶\n##憾\n##懂\n##懇\n##懈\n##應\n##懊\n##懋\n##懑\n##懒\n##懦\n##懲\n##懵\n##懶\n##懷\n##懸\n##懺\n##懼\n##懾\n##懿\n##戀\n##戈\n##戊\n##戌\n##戍\n##戎\n##戏\n##成\n##我\n##戒\n##戕\n##或\n##战\n##戚\n##戛\n##戟\n##戡\n##戦\n##截\n##戬\n##戮\n##戰\n##戲\n##戳\n##戴\n##戶\n##户\n##戸\n##戻\n##戾\n##房\n##所\n##扁\n##扇\n##扈\n##扉\n##手\n##才\n##扎\n##扑\n##扒\n##打\n##扔\n##払\n##托\n##扛\n##扣\n##扦\n##执\n##扩\n##扪\n##扫\n##扬\n##扭\n##扮\n##扯\n##扰\n##扱\n##扳\n##扶\n##批\n##扼\n##找\n##承\n##技\n##抄\n##抉\n##把\n##抑\n##抒\n##抓\n##投\n##抖\n##抗\n##折\n##抚\n##抛\n##抜\n##択\n##抟\n##抠\n##抡\n##抢\n##护\n##报\n##抨\n##披\n##抬\n##抱\n##抵\n##抹\n##押\n##抽\n##抿\n##拂\n##拄\n##担\n##拆\n##拇\n##拈\n##拉\n##拋\n##拌\n##拍\n##拎\n##拐\n##拒\n##拓\n##拔\n##拖\n##拗\n##拘\n##拙\n##拚\n##招\n##拜\n##拟\n##拡\n##拢\n##拣\n##拥\n##拦\n##拧\n##拨\n##择\n##括\n##拭\n##拮\n##拯\n##拱\n##拳\n##拴\n##拷\n##拼\n##拽\n##拾\n##拿\n##持\n##挂\n##指\n##挈\n##按\n##挎\n##挑\n##挖\n##挙\n##挚\n##挛\n##挝\n##挞\n##挟\n##挠\n##挡\n##挣\n##挤\n##挥\n##挨\n##挪\n##挫\n##振\n##挲\n##挹\n##挺\n##挽\n##挾\n##捂\n##捅\n##捆\n##捉\n##捋\n##捌\n##捍\n##捎\n##捏\n##捐\n##捕\n##捞\n##损\n##捡\n##换\n##捣\n##捧\n##捨\n##捩\n##据\n##捱\n##捲\n##捶\n##捷\n##捺\n##捻\n##掀\n##掂\n##掃\n##掇\n##授\n##掉\n##掌\n##掏\n##掐\n##排\n##掖\n##掘\n##掙\n##掛\n##掠\n##採\n##探\n##掣\n##接\n##控\n##推\n##掩\n##措\n##掬\n##掰\n##掲\n##掳\n##掴\n##掷\n##掸\n##掺\n##揀\n##揃\n##揄\n##揆\n##揉\n##揍\n##描\n##提\n##插\n##揖\n##揚\n##換\n##握\n##揣\n##揩\n##揪\n##揭\n##揮\n##援\n##揶\n##揸\n##揹\n##揽\n##搀\n##搁\n##搂\n##搅\n##損\n##搏\n##搐\n##搓\n##搔\n##搖\n##搗\n##搜\n##搞\n##搡\n##搪\n##搬\n##搭\n##搵\n##搶\n##携\n##搽\n##摀\n##摁\n##摄\n##摆\n##摇\n##摈\n##摊\n##摒\n##摔\n##摘\n##摞\n##摟\n##摧\n##摩\n##摯\n##摳\n##摸\n##摹\n##摺\n##摻\n##撂\n##撃\n##撅\n##撇\n##撈\n##撐\n##撑\n##撒\n##撓\n##撕\n##撚\n##撞\n##撤\n##撥\n##撩\n##撫\n##撬\n##播\n##撮\n##撰\n##撲\n##撵\n##撷\n##撸\n##撻\n##撼\n##撿\n##擀\n##擁\n##擂\n##擄\n##
擅\n##擇\n##擊\n##擋\n##操\n##擎\n##擒\n##擔\n##擘\n##據\n##擞\n##擠\n##擡\n##擢\n##擦\n##擬\n##擰\n##擱\n##擲\n##擴\n##擷\n##擺\n##擼\n##擾\n##攀\n##攏\n##攒\n##攔\n##攘\n##攙\n##攜\n##攝\n##攞\n##攢\n##攣\n##攤\n##攥\n##攪\n##攫\n##攬\n##支\n##收\n##攸\n##改\n##攻\n##放\n##政\n##故\n##效\n##敌\n##敍\n##敎\n##敏\n##救\n##敕\n##敖\n##敗\n##敘\n##教\n##敛\n##敝\n##敞\n##敢\n##散\n##敦\n##敬\n##数\n##敲\n##整\n##敵\n##敷\n##數\n##斂\n##斃\n##文\n##斋\n##斌\n##斎\n##斐\n##斑\n##斓\n##斗\n##料\n##斛\n##斜\n##斟\n##斡\n##斤\n##斥\n##斧\n##斩\n##斫\n##斬\n##断\n##斯\n##新\n##斷\n##方\n##於\n##施\n##旁\n##旃\n##旅\n##旋\n##旌\n##旎\n##族\n##旖\n##旗\n##无\n##既\n##日\n##旦\n##旧\n##旨\n##早\n##旬\n##旭\n##旮\n##旱\n##时\n##旷\n##旺\n##旻\n##昀\n##昂\n##昆\n##昇\n##昉\n##昊\n##昌\n##明\n##昏\n##易\n##昔\n##昕\n##昙\n##星\n##映\n##春\n##昧\n##昨\n##昭\n##是\n##昱\n##昴\n##昵\n##昶\n##昼\n##显\n##晁\n##時\n##晃\n##晉\n##晋\n##晌\n##晏\n##晒\n##晓\n##晔\n##晕\n##晖\n##晗\n##晚\n##晝\n##晞\n##晟\n##晤\n##晦\n##晨\n##晩\n##普\n##景\n##晰\n##晴\n##晶\n##晷\n##智\n##晾\n##暂\n##暄\n##暇\n##暈\n##暉\n##暌\n##暐\n##暑\n##暖\n##暗\n##暝\n##暢\n##暧\n##暨\n##暫\n##暮\n##暱\n##暴\n##暸\n##暹\n##曄\n##曆\n##曇\n##曉\n##曖\n##曙\n##曜\n##曝\n##曠\n##曦\n##曬\n##曰\n##曲\n##曳\n##更\n##書\n##曹\n##曼\n##曾\n##替\n##最\n##會\n##月\n##有\n##朋\n##服\n##朐\n##朔\n##朕\n##朗\n##望\n##朝\n##期\n##朦\n##朧\n##木\n##未\n##末\n##本\n##札\n##朮\n##术\n##朱\n##朴\n##朵\n##机\n##朽\n##杀\n##杂\n##权\n##杆\n##杈\n##杉\n##李\n##杏\n##材\n##村\n##杓\n##杖\n##杜\n##杞\n##束\n##杠\n##条\n##来\n##杨\n##杭\n##杯\n##杰\n##東\n##杳\n##杵\n##杷\n##杼\n##松\n##板\n##极\n##构\n##枇\n##枉\n##枋\n##析\n##枕\n##林\n##枚\n##果\n##枝\n##枢\n##枣\n##枪\n##枫\n##枭\n##枯\n##枰\n##枱\n##枳\n##架\n##枷\n##枸\n##柄\n##柏\n##某\n##柑\n##柒\n##染\n##柔\n##柘\n##柚\n##柜\n##柞\n##柠\n##柢\n##查\n##柩\n##柬\n##柯\n##柱\n##柳\n##柴\n##柵\n##査\n##柿\n##栀\n##栃\n##栄\n##栅\n##标\n##栈\n##栉\n##栋\n##栎\n##栏\n##树\n##栓\n##栖\n##栗\n##校\n##栩\n##株\n##样\n##核\n##根\n##格\n##栽\n##栾\n##桀\n##桁\n##桂\n##桃\n##桅\n##框\n##案\n##桉\n##桌\n##桎\n##桐\n##桑\n##桓\n##桔\n##桜\n##桠\n##桡\n##桢\n##档\n##桥\n##桦\n##桧\n##桨\n##桩\n##桶\n##桿\n##梁\n##梅\n##梆\n##梏\n##梓\n##梗\n##條\n##梟\n##梢\n##梦\n##梧\n##梨\n##梭\n##梯\n##械\n##梳\n##梵\n##梶\n##检\n##棂\n##棄\n##棉\n##棋\n##棍\n##棒\n##棕\n##棗\n##棘\n##棚\n##棟\n##
棠\n##棣\n##棧\n##森\n##棱\n##棲\n##棵\n##棹\n##棺\n##椁\n##椅\n##椋\n##植\n##椎\n##椒\n##検\n##椪\n##椭\n##椰\n##椹\n##椽\n##椿\n##楂\n##楊\n##楓\n##楔\n##楚\n##楝\n##楞\n##楠\n##楣\n##楨\n##楫\n##業\n##楮\n##極\n##楷\n##楸\n##楹\n##楼\n##楽\n##概\n##榄\n##榆\n##榈\n##榉\n##榔\n##榕\n##榖\n##榛\n##榜\n##榨\n##榫\n##榭\n##榮\n##榱\n##榴\n##榷\n##榻\n##槁\n##槃\n##構\n##槌\n##槍\n##槎\n##槐\n##槓\n##様\n##槛\n##槟\n##槤\n##槭\n##槲\n##槳\n##槻\n##槽\n##槿\n##樁\n##樂\n##樊\n##樑\n##樓\n##標\n##樞\n##樟\n##模\n##樣\n##権\n##横\n##樫\n##樯\n##樱\n##樵\n##樸\n##樹\n##樺\n##樽\n##樾\n##橄\n##橇\n##橋\n##橐\n##橘\n##橙\n##機\n##橡\n##橢\n##橫\n##橱\n##橹\n##橼\n##檀\n##檄\n##檎\n##檐\n##檔\n##檗\n##檜\n##檢\n##檬\n##檯\n##檳\n##檸\n##檻\n##櫃\n##櫚\n##櫛\n##櫥\n##櫸\n##櫻\n##欄\n##權\n##欒\n##欖\n##欠\n##次\n##欢\n##欣\n##欧\n##欲\n##欸\n##欺\n##欽\n##款\n##歆\n##歇\n##歉\n##歌\n##歎\n##歐\n##歓\n##歙\n##歛\n##歡\n##止\n##正\n##此\n##步\n##武\n##歧\n##歩\n##歪\n##歯\n##歲\n##歳\n##歴\n##歷\n##歸\n##歹\n##死\n##歼\n##殁\n##殃\n##殆\n##殇\n##殉\n##殊\n##残\n##殒\n##殓\n##殖\n##殘\n##殞\n##殡\n##殤\n##殭\n##殯\n##殲\n##殴\n##段\n##殷\n##殺\n##殼\n##殿\n##毀\n##毁\n##毂\n##毅\n##毆\n##毋\n##母\n##毎\n##每\n##毒\n##毓\n##比\n##毕\n##毗\n##毘\n##毙\n##毛\n##毡\n##毫\n##毯\n##毽\n##氈\n##氏\n##氐\n##民\n##氓\n##气\n##氖\n##気\n##氙\n##氛\n##氟\n##氡\n##氢\n##氣\n##氤\n##氦\n##氧\n##氨\n##氪\n##氫\n##氮\n##氯\n##氰\n##氲\n##水\n##氷\n##永\n##氹\n##氾\n##汀\n##汁\n##求\n##汆\n##汇\n##汉\n##汎\n##汐\n##汕\n##汗\n##汙\n##汛\n##汝\n##汞\n##江\n##池\n##污\n##汤\n##汨\n##汩\n##汪\n##汰\n##汲\n##汴\n##汶\n##汹\n##決\n##汽\n##汾\n##沁\n##沂\n##沃\n##沅\n##沈\n##沉\n##沌\n##沏\n##沐\n##沒\n##沓\n##沖\n##沙\n##沛\n##沟\n##没\n##沢\n##沣\n##沥\n##沦\n##沧\n##沪\n##沫\n##沭\n##沮\n##沱\n##河\n##沸\n##油\n##治\n##沼\n##沽\n##沾\n##沿\n##況\n##泄\n##泉\n##泊\n##泌\n##泓\n##法\n##泗\n##泛\n##泞\n##泠\n##泡\n##波\n##泣\n##泥\n##注\n##泪\n##泫\n##泮\n##泯\n##泰\n##泱\n##泳\n##泵\n##泷\n##泸\n##泻\n##泼\n##泽\n##泾\n##洁\n##洄\n##洋\n##洒\n##洗\n##洙\n##洛\n##洞\n##津\n##洩\n##洪\n##洮\n##洱\n##洲\n##洵\n##洶\n##洸\n##洹\n##活\n##洼\n##洽\n##派\n##流\n##浃\n##浄\n##浅\n##浆\n##浇\n##浊\n##测\n##济\n##浏\n##浑\n##浒\n##浓\n##浔\n##浙\n##浚\n##浜\n##浣\n##浦\n##浩\n##浪\n##浬\n##浮\n##浯\n##浴\n##海\n##浸\n##涂\n##涅\n##涇\n##消\n##涉\n##涌\n##涎\n##涓\n##涔\n##涕\n##涙\n##涛\n##涝\n##涞\n##
涟\n##涠\n##涡\n##涣\n##涤\n##润\n##涧\n##涨\n##涩\n##涪\n##涮\n##涯\n##液\n##涵\n##涸\n##涼\n##涿\n##淀\n##淄\n##淅\n##淆\n##淇\n##淋\n##淌\n##淑\n##淒\n##淖\n##淘\n##淙\n##淚\n##淞\n##淡\n##淤\n##淦\n##淨\n##淩\n##淪\n##淫\n##淬\n##淮\n##深\n##淳\n##淵\n##混\n##淹\n##淺\n##添\n##淼\n##清\n##済\n##渉\n##渊\n##渋\n##渍\n##渎\n##渐\n##渔\n##渗\n##渙\n##渚\n##減\n##渝\n##渠\n##渡\n##渣\n##渤\n##渥\n##渦\n##温\n##測\n##渭\n##港\n##渲\n##渴\n##游\n##渺\n##渾\n##湃\n##湄\n##湊\n##湍\n##湖\n##湘\n##湛\n##湟\n##湧\n##湫\n##湮\n##湯\n##湳\n##湾\n##湿\n##満\n##溃\n##溅\n##溉\n##溏\n##源\n##準\n##溜\n##溝\n##溟\n##溢\n##溥\n##溧\n##溪\n##溫\n##溯\n##溱\n##溴\n##溶\n##溺\n##溼\n##滁\n##滂\n##滄\n##滅\n##滇\n##滋\n##滌\n##滑\n##滓\n##滔\n##滕\n##滙\n##滚\n##滝\n##滞\n##滟\n##满\n##滢\n##滤\n##滥\n##滦\n##滨\n##滩\n##滬\n##滯\n##滲\n##滴\n##滷\n##滸\n##滾\n##滿\n##漁\n##漂\n##漆\n##漉\n##漏\n##漓\n##演\n##漕\n##漠\n##漢\n##漣\n##漩\n##漪\n##漫\n##漬\n##漯\n##漱\n##漲\n##漳\n##漸\n##漾\n##漿\n##潆\n##潇\n##潋\n##潍\n##潑\n##潔\n##潘\n##潛\n##潜\n##潞\n##潟\n##潢\n##潤\n##潦\n##潧\n##潭\n##潮\n##潰\n##潴\n##潸\n##潺\n##潼\n##澀\n##澄\n##澆\n##澈\n##澍\n##澎\n##澗\n##澜\n##澡\n##澤\n##澧\n##澱\n##澳\n##澹\n##激\n##濁\n##濂\n##濃\n##濑\n##濒\n##濕\n##濘\n##濛\n##濟\n##濠\n##濡\n##濤\n##濫\n##濬\n##濮\n##濯\n##濱\n##濺\n##濾\n##瀅\n##瀆\n##瀉\n##瀋\n##瀏\n##瀑\n##瀕\n##瀘\n##瀚\n##瀛\n##瀝\n##瀞\n##瀟\n##瀧\n##瀨\n##瀬\n##瀰\n##瀾\n##灌\n##灏\n##灑\n##灘\n##灝\n##灞\n##灣\n##火\n##灬\n##灭\n##灯\n##灰\n##灵\n##灶\n##灸\n##灼\n##災\n##灾\n##灿\n##炀\n##炁\n##炅\n##炉\n##炊\n##炎\n##炒\n##炔\n##炕\n##炖\n##炙\n##炜\n##炫\n##炬\n##炭\n##炮\n##炯\n##炳\n##炷\n##炸\n##点\n##為\n##炼\n##炽\n##烁\n##烂\n##烃\n##烈\n##烊\n##烏\n##烘\n##烙\n##烛\n##烟\n##烤\n##烦\n##烧\n##烨\n##烩\n##烫\n##烬\n##热\n##烯\n##烷\n##烹\n##烽\n##焉\n##焊\n##焕\n##焖\n##焗\n##焘\n##焙\n##焚\n##焜\n##無\n##焦\n##焯\n##焰\n##焱\n##然\n##焼\n##煅\n##煉\n##煊\n##煌\n##煎\n##煒\n##煖\n##煙\n##煜\n##煞\n##煤\n##煥\n##煦\n##照\n##煨\n##煩\n##煮\n##煲\n##煸\n##煽\n##熄\n##熊\n##熏\n##熒\n##熔\n##熙\n##熟\n##熠\n##熨\n##熬\n##熱\n##熵\n##熹\n##熾\n##燁\n##燃\n##燄\n##燈\n##燉\n##燊\n##燎\n##燒\n##燔\n##燕\n##燙\n##燜\n##營\n##燥\n##燦\n##燧\n##燭\n##燮\n##燴\n##燻\n##燼\n##燿\n##爆\n##爍\n##爐\n##爛\n##爪\n##爬\n##爭\n##爰\n##爱\n##爲\n##爵\n##父\n##爷\n##爸\n##爹\n##爺\n##爻\n##爽\n##爾\n##牆\n##片\n##版\n##牌\n##
牍\n##牒\n##牙\n##牛\n##牝\n##牟\n##牠\n##牡\n##牢\n##牦\n##牧\n##物\n##牯\n##牲\n##牴\n##牵\n##特\n##牺\n##牽\n##犀\n##犁\n##犄\n##犊\n##犍\n##犒\n##犢\n##犧\n##犬\n##犯\n##状\n##犷\n##犸\n##犹\n##狀\n##狂\n##狄\n##狈\n##狎\n##狐\n##狒\n##狗\n##狙\n##狞\n##狠\n##狡\n##狩\n##独\n##狭\n##狮\n##狰\n##狱\n##狸\n##狹\n##狼\n##狽\n##猎\n##猕\n##猖\n##猗\n##猙\n##猛\n##猜\n##猝\n##猥\n##猩\n##猪\n##猫\n##猬\n##献\n##猴\n##猶\n##猷\n##猾\n##猿\n##獄\n##獅\n##獎\n##獐\n##獒\n##獗\n##獠\n##獣\n##獨\n##獭\n##獰\n##獲\n##獵\n##獷\n##獸\n##獺\n##獻\n##獼\n##獾\n##玄\n##率\n##玉\n##王\n##玑\n##玖\n##玛\n##玟\n##玠\n##玥\n##玩\n##玫\n##玮\n##环\n##现\n##玲\n##玳\n##玷\n##玺\n##玻\n##珀\n##珂\n##珅\n##珈\n##珉\n##珊\n##珍\n##珏\n##珐\n##珑\n##珙\n##珞\n##珠\n##珣\n##珥\n##珩\n##珪\n##班\n##珮\n##珲\n##珺\n##現\n##球\n##琅\n##理\n##琇\n##琉\n##琊\n##琍\n##琏\n##琐\n##琛\n##琢\n##琥\n##琦\n##琨\n##琪\n##琬\n##琮\n##琰\n##琲\n##琳\n##琴\n##琵\n##琶\n##琺\n##琼\n##瑀\n##瑁\n##瑄\n##瑋\n##瑕\n##瑗\n##瑙\n##瑚\n##瑛\n##瑜\n##瑞\n##瑟\n##瑠\n##瑣\n##瑤\n##瑩\n##瑪\n##瑯\n##瑰\n##瑶\n##瑾\n##璀\n##璁\n##璃\n##璇\n##璉\n##璋\n##璎\n##璐\n##璜\n##璞\n##璟\n##璧\n##璨\n##環\n##璽\n##璿\n##瓊\n##瓏\n##瓒\n##瓜\n##瓢\n##瓣\n##瓤\n##瓦\n##瓮\n##瓯\n##瓴\n##瓶\n##瓷\n##甄\n##甌\n##甕\n##甘\n##甙\n##甚\n##甜\n##生\n##產\n##産\n##甥\n##甦\n##用\n##甩\n##甫\n##甬\n##甭\n##甯\n##田\n##由\n##甲\n##申\n##电\n##男\n##甸\n##町\n##画\n##甾\n##畀\n##畅\n##界\n##畏\n##畑\n##畔\n##留\n##畜\n##畝\n##畢\n##略\n##畦\n##番\n##畫\n##異\n##畲\n##畳\n##畴\n##當\n##畸\n##畹\n##畿\n##疆\n##疇\n##疊\n##疏\n##疑\n##疔\n##疖\n##疗\n##疙\n##疚\n##疝\n##疟\n##疡\n##疣\n##疤\n##疥\n##疫\n##疮\n##疯\n##疱\n##疲\n##疳\n##疵\n##疸\n##疹\n##疼\n##疽\n##疾\n##痂\n##病\n##症\n##痈\n##痉\n##痊\n##痍\n##痒\n##痔\n##痕\n##痘\n##痙\n##痛\n##痞\n##痠\n##痢\n##痣\n##痤\n##痧\n##痨\n##痪\n##痫\n##痰\n##痱\n##痴\n##痹\n##痺\n##痼\n##痿\n##瘀\n##瘁\n##瘋\n##瘍\n##瘓\n##瘘\n##瘙\n##瘟\n##瘠\n##瘡\n##瘢\n##瘤\n##瘦\n##瘧\n##瘩\n##瘪\n##瘫\n##瘴\n##瘸\n##瘾\n##療\n##癇\n##癌\n##癒\n##癖\n##癜\n##癞\n##癡\n##癢\n##癣\n##癥\n##癫\n##癬\n##癮\n##癱\n##癲\n##癸\n##発\n##登\n##發\n##白\n##百\n##皂\n##的\n##皆\n##皇\n##皈\n##皋\n##皎\n##皑\n##皓\n##皖\n##皙\n##皚\n##皮\n##皰\n##皱\n##皴\n##皺\n##皿\n##盂\n##盃\n##盅\n##盆\n##盈\n##益\n##盎\n##盏\n##盐\n##监\n##盒\n##盔\n##盖\n##盗\n##盘\n##盛\n##盜\n##盞\n##盟\n##盡\n##監\n##盤\n##盥\n##
盧\n##盪\n##目\n##盯\n##盱\n##盲\n##直\n##相\n##盹\n##盼\n##盾\n##省\n##眈\n##眉\n##看\n##県\n##眙\n##眞\n##真\n##眠\n##眦\n##眨\n##眩\n##眯\n##眶\n##眷\n##眸\n##眺\n##眼\n##眾\n##着\n##睁\n##睇\n##睏\n##睐\n##睑\n##睛\n##睜\n##睞\n##睡\n##睢\n##督\n##睥\n##睦\n##睨\n##睪\n##睫\n##睬\n##睹\n##睽\n##睾\n##睿\n##瞄\n##瞅\n##瞇\n##瞋\n##瞌\n##瞎\n##瞑\n##瞒\n##瞓\n##瞞\n##瞟\n##瞠\n##瞥\n##瞧\n##瞩\n##瞪\n##瞬\n##瞭\n##瞰\n##瞳\n##瞻\n##瞼\n##瞿\n##矇\n##矍\n##矗\n##矚\n##矛\n##矜\n##矢\n##矣\n##知\n##矩\n##矫\n##短\n##矮\n##矯\n##石\n##矶\n##矽\n##矾\n##矿\n##码\n##砂\n##砌\n##砍\n##砒\n##研\n##砖\n##砗\n##砚\n##砝\n##砣\n##砥\n##砧\n##砭\n##砰\n##砲\n##破\n##砷\n##砸\n##砺\n##砼\n##砾\n##础\n##硅\n##硐\n##硒\n##硕\n##硝\n##硫\n##硬\n##确\n##硯\n##硼\n##碁\n##碇\n##碉\n##碌\n##碍\n##碎\n##碑\n##碓\n##碗\n##碘\n##碚\n##碛\n##碟\n##碣\n##碧\n##碩\n##碰\n##碱\n##碳\n##碴\n##確\n##碼\n##碾\n##磁\n##磅\n##磊\n##磋\n##磐\n##磕\n##磚\n##磡\n##磨\n##磬\n##磯\n##磲\n##磷\n##磺\n##礁\n##礎\n##礙\n##礡\n##礦\n##礪\n##礫\n##礴\n##示\n##礼\n##社\n##祀\n##祁\n##祂\n##祇\n##祈\n##祉\n##祎\n##祐\n##祕\n##祖\n##祗\n##祚\n##祛\n##祜\n##祝\n##神\n##祟\n##祠\n##祢\n##祥\n##票\n##祭\n##祯\n##祷\n##祸\n##祺\n##祿\n##禀\n##禁\n##禄\n##禅\n##禍\n##禎\n##福\n##禛\n##禦\n##禧\n##禪\n##禮\n##禱\n##禹\n##禺\n##离\n##禽\n##禾\n##禿\n##秀\n##私\n##秃\n##秆\n##秉\n##秋\n##种\n##科\n##秒\n##秘\n##租\n##秣\n##秤\n##秦\n##秧\n##秩\n##秭\n##积\n##称\n##秸\n##移\n##秽\n##稀\n##稅\n##程\n##稍\n##税\n##稔\n##稗\n##稚\n##稜\n##稞\n##稟\n##稠\n##稣\n##種\n##稱\n##稲\n##稳\n##稷\n##稹\n##稻\n##稼\n##稽\n##稿\n##穀\n##穂\n##穆\n##穌\n##積\n##穎\n##穗\n##穢\n##穩\n##穫\n##穴\n##究\n##穷\n##穹\n##空\n##穿\n##突\n##窃\n##窄\n##窈\n##窍\n##窑\n##窒\n##窓\n##窕\n##窖\n##窗\n##窘\n##窜\n##窝\n##窟\n##窠\n##窥\n##窦\n##窨\n##窩\n##窪\n##窮\n##窯\n##窺\n##窿\n##竄\n##竅\n##竇\n##竊\n##立\n##竖\n##站\n##竜\n##竞\n##竟\n##章\n##竣\n##童\n##竭\n##端\n##競\n##竹\n##竺\n##竽\n##竿\n##笃\n##笆\n##笈\n##笋\n##笏\n##笑\n##笔\n##笙\n##笛\n##笞\n##笠\n##符\n##笨\n##第\n##笹\n##笺\n##笼\n##筆\n##等\n##筊\n##筋\n##筍\n##筏\n##筐\n##筑\n##筒\n##答\n##策\n##筛\n##筝\n##筠\n##筱\n##筲\n##筵\n##筷\n##筹\n##签\n##简\n##箇\n##箋\n##箍\n##箏\n##箐\n##箔\n##箕\n##算\n##箝\n##管\n##箩\n##箫\n##箭\n##箱\n##箴\n##箸\n##節\n##篁\n##範\n##篆\n##篇\n##築\n##篑\n##篓\n##篙\n##篝\n##篠\n##篡\n##篤\n##篩\n##篪\n##篮\n##篱\n##篷\n##簇\n##
簌\n##簍\n##簡\n##簦\n##簧\n##簪\n##簫\n##簷\n##簸\n##簽\n##簾\n##簿\n##籁\n##籃\n##籌\n##籍\n##籐\n##籟\n##籠\n##籤\n##籬\n##籮\n##籲\n##米\n##类\n##籼\n##籽\n##粄\n##粉\n##粑\n##粒\n##粕\n##粗\n##粘\n##粟\n##粤\n##粥\n##粧\n##粪\n##粮\n##粱\n##粲\n##粳\n##粵\n##粹\n##粼\n##粽\n##精\n##粿\n##糅\n##糊\n##糍\n##糕\n##糖\n##糗\n##糙\n##糜\n##糞\n##糟\n##糠\n##糧\n##糬\n##糯\n##糰\n##糸\n##系\n##糾\n##紀\n##紂\n##約\n##紅\n##紉\n##紊\n##紋\n##納\n##紐\n##紓\n##純\n##紗\n##紘\n##紙\n##級\n##紛\n##紜\n##素\n##紡\n##索\n##紧\n##紫\n##紮\n##累\n##細\n##紳\n##紹\n##紺\n##終\n##絃\n##組\n##絆\n##経\n##結\n##絕\n##絞\n##絡\n##絢\n##給\n##絨\n##絮\n##統\n##絲\n##絳\n##絵\n##絶\n##絹\n##綁\n##綏\n##綑\n##經\n##継\n##続\n##綜\n##綠\n##綢\n##綦\n##綫\n##綬\n##維\n##綱\n##網\n##綴\n##綵\n##綸\n##綺\n##綻\n##綽\n##綾\n##綿\n##緊\n##緋\n##総\n##緑\n##緒\n##緘\n##線\n##緝\n##緞\n##締\n##緣\n##編\n##緩\n##緬\n##緯\n##練\n##緹\n##緻\n##縁\n##縄\n##縈\n##縛\n##縝\n##縣\n##縫\n##縮\n##縱\n##縴\n##縷\n##總\n##績\n##繁\n##繃\n##繆\n##繇\n##繋\n##織\n##繕\n##繚\n##繞\n##繡\n##繩\n##繪\n##繫\n##繭\n##繳\n##繹\n##繼\n##繽\n##纂\n##續\n##纍\n##纏\n##纓\n##纔\n##纖\n##纜\n##纠\n##红\n##纣\n##纤\n##约\n##级\n##纨\n##纪\n##纫\n##纬\n##纭\n##纯\n##纰\n##纱\n##纲\n##纳\n##纵\n##纶\n##纷\n##纸\n##纹\n##纺\n##纽\n##纾\n##线\n##绀\n##练\n##组\n##绅\n##细\n##织\n##终\n##绊\n##绍\n##绎\n##经\n##绑\n##绒\n##结\n##绔\n##绕\n##绘\n##给\n##绚\n##绛\n##络\n##绝\n##绞\n##统\n##绡\n##绢\n##绣\n##绥\n##绦\n##继\n##绩\n##绪\n##绫\n##续\n##绮\n##绯\n##绰\n##绳\n##维\n##绵\n##绶\n##绷\n##绸\n##绻\n##综\n##绽\n##绾\n##绿\n##缀\n##缄\n##缅\n##缆\n##缇\n##缈\n##缉\n##缎\n##缓\n##缔\n##缕\n##编\n##缘\n##缙\n##缚\n##缜\n##缝\n##缠\n##缢\n##缤\n##缥\n##缨\n##缩\n##缪\n##缭\n##缮\n##缰\n##缱\n##缴\n##缸\n##缺\n##缽\n##罂\n##罄\n##罌\n##罐\n##网\n##罔\n##罕\n##罗\n##罚\n##罡\n##罢\n##罩\n##罪\n##置\n##罰\n##署\n##罵\n##罷\n##罹\n##羁\n##羅\n##羈\n##羊\n##羌\n##美\n##羔\n##羚\n##羞\n##羟\n##羡\n##羣\n##群\n##羥\n##羧\n##羨\n##義\n##羯\n##羲\n##羸\n##羹\n##羽\n##羿\n##翁\n##翅\n##翊\n##翌\n##翎\n##習\n##翔\n##翘\n##翟\n##翠\n##翡\n##翦\n##翩\n##翰\n##翱\n##翳\n##翹\n##翻\n##翼\n##耀\n##老\n##考\n##耄\n##者\n##耆\n##耋\n##而\n##耍\n##耐\n##耒\n##耕\n##耗\n##耘\n##耙\n##耦\n##耨\n##耳\n##耶\n##耷\n##耸\n##耻\n##耽\n##耿\n##聂\n##聆\n##聊\n##聋\n##职\n##聒\n##联\n##聖\n##聘\n##聚\n##聞\n##聪\n##聯\n##聰\n##聲\n##聳\n##
聴\n##聶\n##職\n##聽\n##聾\n##聿\n##肃\n##肄\n##肅\n##肆\n##肇\n##肉\n##肋\n##肌\n##肏\n##肓\n##肖\n##肘\n##肚\n##肛\n##肝\n##肠\n##股\n##肢\n##肤\n##肥\n##肩\n##肪\n##肮\n##肯\n##肱\n##育\n##肴\n##肺\n##肽\n##肾\n##肿\n##胀\n##胁\n##胃\n##胄\n##胆\n##背\n##胍\n##胎\n##胖\n##胚\n##胛\n##胜\n##胝\n##胞\n##胡\n##胤\n##胥\n##胧\n##胫\n##胭\n##胯\n##胰\n##胱\n##胳\n##胴\n##胶\n##胸\n##胺\n##能\n##脂\n##脅\n##脆\n##脇\n##脈\n##脉\n##脊\n##脍\n##脏\n##脐\n##脑\n##脓\n##脖\n##脘\n##脚\n##脛\n##脣\n##脩\n##脫\n##脯\n##脱\n##脲\n##脳\n##脸\n##脹\n##脾\n##腆\n##腈\n##腊\n##腋\n##腌\n##腎\n##腐\n##腑\n##腓\n##腔\n##腕\n##腥\n##腦\n##腩\n##腫\n##腭\n##腮\n##腰\n##腱\n##腳\n##腴\n##腸\n##腹\n##腺\n##腻\n##腼\n##腾\n##腿\n##膀\n##膈\n##膊\n##膏\n##膑\n##膘\n##膚\n##膛\n##膜\n##膝\n##膠\n##膦\n##膨\n##膩\n##膳\n##膺\n##膻\n##膽\n##膾\n##膿\n##臀\n##臂\n##臃\n##臆\n##臉\n##臊\n##臍\n##臓\n##臘\n##臟\n##臣\n##臥\n##臧\n##臨\n##自\n##臬\n##臭\n##至\n##致\n##臺\n##臻\n##臼\n##臾\n##舀\n##舂\n##舅\n##舆\n##與\n##興\n##舉\n##舊\n##舌\n##舍\n##舎\n##舐\n##舒\n##舔\n##舖\n##舗\n##舛\n##舜\n##舞\n##舟\n##航\n##舫\n##般\n##舰\n##舱\n##舵\n##舶\n##舷\n##舸\n##船\n##舺\n##舾\n##艇\n##艋\n##艘\n##艙\n##艦\n##艮\n##良\n##艰\n##艱\n##色\n##艳\n##艷\n##艹\n##艺\n##艾\n##节\n##芃\n##芈\n##芊\n##芋\n##芍\n##芎\n##芒\n##芙\n##芜\n##芝\n##芡\n##芥\n##芦\n##芩\n##芪\n##芫\n##芬\n##芭\n##芮\n##芯\n##花\n##芳\n##芷\n##芸\n##芹\n##芻\n##芽\n##芾\n##苁\n##苄\n##苇\n##苋\n##苍\n##苏\n##苑\n##苒\n##苓\n##苔\n##苕\n##苗\n##苛\n##苜\n##苞\n##苟\n##苡\n##苣\n##若\n##苦\n##苫\n##苯\n##英\n##苷\n##苹\n##苻\n##茁\n##茂\n##范\n##茄\n##茅\n##茉\n##茎\n##茏\n##茗\n##茜\n##茧\n##茨\n##茫\n##茬\n##茭\n##茯\n##茱\n##茲\n##茴\n##茵\n##茶\n##茸\n##茹\n##茼\n##荀\n##荃\n##荆\n##草\n##荊\n##荏\n##荐\n##荒\n##荔\n##荖\n##荘\n##荚\n##荞\n##荟\n##荠\n##荡\n##荣\n##荤\n##荥\n##荧\n##荨\n##荪\n##荫\n##药\n##荳\n##荷\n##荸\n##荻\n##荼\n##荽\n##莅\n##莆\n##莉\n##莊\n##莎\n##莒\n##莓\n##莖\n##莘\n##莞\n##莠\n##莢\n##莧\n##莪\n##莫\n##莱\n##莲\n##莴\n##获\n##莹\n##莺\n##莽\n##莿\n##菀\n##菁\n##菅\n##菇\n##菈\n##菊\n##菌\n##菏\n##菓\n##菖\n##菘\n##菜\n##菟\n##菠\n##菡\n##菩\n##華\n##菱\n##菲\n##菸\n##菽\n##萁\n##萃\n##萄\n##萊\n##萋\n##萌\n##萍\n##萎\n##萘\n##萝\n##萤\n##营\n##萦\n##萧\n##萨\n##萩\n##萬\n##萱\n##萵\n##萸\n##萼\n##落\n##葆\n##葉\n##著\n##葚\n##葛\n##葡\n##董\n##葦\n##葩\n##葫\n##葬\n##葭\n##葯\n##葱\n##葳\n##
葵\n##葷\n##葺\n##蒂\n##蒋\n##蒐\n##蒔\n##蒙\n##蒜\n##蒞\n##蒟\n##蒡\n##蒨\n##蒲\n##蒸\n##蒹\n##蒻\n##蒼\n##蒿\n##蓁\n##蓄\n##蓆\n##蓉\n##蓋\n##蓑\n##蓓\n##蓖\n##蓝\n##蓟\n##蓦\n##蓬\n##蓮\n##蓼\n##蓿\n##蔑\n##蔓\n##蔔\n##蔗\n##蔘\n##蔚\n##蔡\n##蔣\n##蔥\n##蔫\n##蔬\n##蔭\n##蔵\n##蔷\n##蔺\n##蔻\n##蔼\n##蔽\n##蕁\n##蕃\n##蕈\n##蕉\n##蕊\n##蕎\n##蕙\n##蕤\n##蕨\n##蕩\n##蕪\n##蕭\n##蕲\n##蕴\n##蕻\n##蕾\n##薄\n##薅\n##薇\n##薈\n##薊\n##薏\n##薑\n##薔\n##薙\n##薛\n##薦\n##薨\n##薩\n##薪\n##薬\n##薯\n##薰\n##薹\n##藉\n##藍\n##藏\n##藐\n##藓\n##藕\n##藜\n##藝\n##藤\n##藥\n##藩\n##藹\n##藻\n##藿\n##蘆\n##蘇\n##蘊\n##蘋\n##蘑\n##蘚\n##蘭\n##蘸\n##蘼\n##蘿\n##虎\n##虏\n##虐\n##虑\n##虔\n##處\n##虚\n##虛\n##虜\n##虞\n##號\n##虢\n##虧\n##虫\n##虬\n##虱\n##虹\n##虻\n##虽\n##虾\n##蚀\n##蚁\n##蚂\n##蚊\n##蚌\n##蚓\n##蚕\n##蚜\n##蚝\n##蚣\n##蚤\n##蚩\n##蚪\n##蚯\n##蚱\n##蚵\n##蛀\n##蛆\n##蛇\n##蛊\n##蛋\n##蛎\n##蛐\n##蛔\n##蛙\n##蛛\n##蛟\n##蛤\n##蛭\n##蛮\n##蛰\n##蛳\n##蛹\n##蛻\n##蛾\n##蜀\n##蜂\n##蜃\n##蜆\n##蜇\n##蜈\n##蜊\n##蜍\n##蜒\n##蜓\n##蜕\n##蜗\n##蜘\n##蜚\n##蜜\n##蜡\n##蜢\n##蜥\n##蜱\n##蜴\n##蜷\n##蜻\n##蜿\n##蝇\n##蝈\n##蝉\n##蝌\n##蝎\n##蝕\n##蝗\n##蝙\n##蝟\n##蝠\n##蝦\n##蝨\n##蝴\n##蝶\n##蝸\n##蝼\n##螂\n##螃\n##融\n##螞\n##螢\n##螨\n##螯\n##螳\n##螺\n##蟀\n##蟄\n##蟆\n##蟋\n##蟎\n##蟑\n##蟒\n##蟠\n##蟬\n##蟲\n##蟹\n##蟻\n##蟾\n##蠅\n##蠍\n##蠔\n##蠕\n##蠛\n##蠟\n##蠡\n##蠢\n##蠣\n##蠱\n##蠶\n##蠹\n##蠻\n##血\n##衄\n##衅\n##衆\n##行\n##衍\n##術\n##衔\n##街\n##衙\n##衛\n##衝\n##衞\n##衡\n##衢\n##衣\n##补\n##表\n##衩\n##衫\n##衬\n##衮\n##衰\n##衲\n##衷\n##衹\n##衾\n##衿\n##袁\n##袂\n##袄\n##袅\n##袈\n##袋\n##袍\n##袒\n##袖\n##袜\n##袞\n##袤\n##袪\n##被\n##袭\n##袱\n##裁\n##裂\n##装\n##裆\n##裊\n##裏\n##裔\n##裕\n##裘\n##裙\n##補\n##裝\n##裟\n##裡\n##裤\n##裨\n##裱\n##裳\n##裴\n##裸\n##裹\n##製\n##裾\n##褂\n##複\n##褐\n##褒\n##褓\n##褔\n##褚\n##褥\n##褪\n##褫\n##褲\n##褶\n##褻\n##襁\n##襄\n##襟\n##襠\n##襪\n##襬\n##襯\n##襲\n##西\n##要\n##覃\n##覆\n##覇\n##見\n##規\n##覓\n##視\n##覚\n##覦\n##覧\n##親\n##覬\n##観\n##覷\n##覺\n##覽\n##觀\n##见\n##观\n##规\n##觅\n##视\n##览\n##觉\n##觊\n##觎\n##觐\n##觑\n##角\n##觞\n##解\n##觥\n##触\n##觸\n##言\n##訂\n##計\n##訊\n##討\n##訓\n##訕\n##訖\n##託\n##記\n##訛\n##訝\n##訟\n##訣\n##訥\n##訪\n##設\n##許\n##訳\n##訴\n##訶\n##診\n##註\n##証\n##詆\n##詐\n##詔\n##評\n##詛\n##詞\n##詠\n##詡\n##詢\n##詣\n##試\n##詩\n##詫\n##
詬\n##詭\n##詮\n##詰\n##話\n##該\n##詳\n##詹\n##詼\n##誅\n##誇\n##誉\n##誌\n##認\n##誓\n##誕\n##誘\n##語\n##誠\n##誡\n##誣\n##誤\n##誥\n##誦\n##誨\n##說\n##説\n##読\n##誰\n##課\n##誹\n##誼\n##調\n##諄\n##談\n##請\n##諏\n##諒\n##論\n##諗\n##諜\n##諡\n##諦\n##諧\n##諫\n##諭\n##諮\n##諱\n##諳\n##諷\n##諸\n##諺\n##諾\n##謀\n##謁\n##謂\n##謄\n##謊\n##謎\n##謐\n##謔\n##謗\n##謙\n##講\n##謝\n##謠\n##謨\n##謬\n##謹\n##謾\n##譁\n##證\n##譎\n##譏\n##識\n##譙\n##譚\n##譜\n##警\n##譬\n##譯\n##議\n##譲\n##譴\n##護\n##譽\n##讀\n##變\n##讓\n##讚\n##讞\n##计\n##订\n##认\n##讥\n##讧\n##讨\n##让\n##讪\n##讫\n##训\n##议\n##讯\n##记\n##讲\n##讳\n##讴\n##讶\n##讷\n##许\n##讹\n##论\n##讼\n##讽\n##设\n##访\n##诀\n##证\n##诃\n##评\n##诅\n##识\n##诈\n##诉\n##诊\n##诋\n##词\n##诏\n##译\n##试\n##诗\n##诘\n##诙\n##诚\n##诛\n##话\n##诞\n##诟\n##诠\n##诡\n##询\n##诣\n##诤\n##该\n##详\n##诧\n##诩\n##诫\n##诬\n##语\n##误\n##诰\n##诱\n##诲\n##说\n##诵\n##诶\n##请\n##诸\n##诺\n##读\n##诽\n##课\n##诿\n##谀\n##谁\n##调\n##谄\n##谅\n##谆\n##谈\n##谊\n##谋\n##谌\n##谍\n##谎\n##谏\n##谐\n##谑\n##谒\n##谓\n##谔\n##谕\n##谗\n##谘\n##谙\n##谚\n##谛\n##谜\n##谟\n##谢\n##谣\n##谤\n##谥\n##谦\n##谧\n##谨\n##谩\n##谪\n##谬\n##谭\n##谯\n##谱\n##谲\n##谴\n##谶\n##谷\n##豁\n##豆\n##豇\n##豈\n##豉\n##豊\n##豌\n##豎\n##豐\n##豔\n##豚\n##象\n##豢\n##豪\n##豫\n##豬\n##豹\n##豺\n##貂\n##貅\n##貌\n##貓\n##貔\n##貘\n##貝\n##貞\n##負\n##財\n##貢\n##貧\n##貨\n##販\n##貪\n##貫\n##責\n##貯\n##貰\n##貳\n##貴\n##貶\n##買\n##貸\n##費\n##貼\n##貽\n##貿\n##賀\n##賁\n##賂\n##賃\n##賄\n##資\n##賈\n##賊\n##賑\n##賓\n##賜\n##賞\n##賠\n##賡\n##賢\n##賣\n##賤\n##賦\n##質\n##賬\n##賭\n##賴\n##賺\n##購\n##賽\n##贅\n##贈\n##贊\n##贍\n##贏\n##贓\n##贖\n##贛\n##贝\n##贞\n##负\n##贡\n##财\n##责\n##贤\n##败\n##账\n##货\n##质\n##贩\n##贪\n##贫\n##贬\n##购\n##贮\n##贯\n##贰\n##贱\n##贲\n##贴\n##贵\n##贷\n##贸\n##费\n##贺\n##贻\n##贼\n##贾\n##贿\n##赁\n##赂\n##赃\n##资\n##赅\n##赈\n##赊\n##赋\n##赌\n##赎\n##赏\n##赐\n##赓\n##赔\n##赖\n##赘\n##赚\n##赛\n##赝\n##赞\n##赠\n##赡\n##赢\n##赣\n##赤\n##赦\n##赧\n##赫\n##赭\n##走\n##赳\n##赴\n##赵\n##赶\n##起\n##趁\n##超\n##越\n##趋\n##趕\n##趙\n##趟\n##趣\n##趨\n##足\n##趴\n##趵\n##趸\n##趺\n##趾\n##跃\n##跄\n##跆\n##跋\n##跌\n##跎\n##跑\n##跖\n##跚\n##跛\n##距\n##跟\n##跡\n##跤\n##跨\n##跩\n##跪\n##路\n##跳\n##践\n##跷\n##跹\n##跺\n##跻\n##踉\n##踊\n##踌\n##踏\n##踐\n##踝\n##踞\n##踟\n##踢\n##
踩\n##踪\n##踮\n##踱\n##踴\n##踵\n##踹\n##蹂\n##蹄\n##蹇\n##蹈\n##蹉\n##蹊\n##蹋\n##蹑\n##蹒\n##蹙\n##蹟\n##蹣\n##蹤\n##蹦\n##蹩\n##蹬\n##蹭\n##蹲\n##蹴\n##蹶\n##蹺\n##蹼\n##蹿\n##躁\n##躇\n##躉\n##躊\n##躋\n##躍\n##躏\n##躪\n##身\n##躬\n##躯\n##躲\n##躺\n##軀\n##車\n##軋\n##軌\n##軍\n##軒\n##軟\n##転\n##軸\n##軼\n##軽\n##軾\n##較\n##載\n##輒\n##輓\n##輔\n##輕\n##輛\n##輝\n##輟\n##輩\n##輪\n##輯\n##輸\n##輻\n##輾\n##輿\n##轄\n##轅\n##轆\n##轉\n##轍\n##轎\n##轟\n##车\n##轧\n##轨\n##轩\n##转\n##轭\n##轮\n##软\n##轰\n##轲\n##轴\n##轶\n##轻\n##轼\n##载\n##轿\n##较\n##辄\n##辅\n##辆\n##辇\n##辈\n##辉\n##辊\n##辍\n##辐\n##辑\n##输\n##辕\n##辖\n##辗\n##辘\n##辙\n##辛\n##辜\n##辞\n##辟\n##辣\n##辦\n##辨\n##辩\n##辫\n##辭\n##辮\n##辯\n##辰\n##辱\n##農\n##边\n##辺\n##辻\n##込\n##辽\n##达\n##迁\n##迂\n##迄\n##迅\n##过\n##迈\n##迎\n##运\n##近\n##返\n##还\n##这\n##进\n##远\n##违\n##连\n##迟\n##迢\n##迤\n##迥\n##迦\n##迩\n##迪\n##迫\n##迭\n##述\n##迴\n##迷\n##迸\n##迹\n##迺\n##追\n##退\n##送\n##适\n##逃\n##逅\n##逆\n##选\n##逊\n##逍\n##透\n##逐\n##递\n##途\n##逕\n##逗\n##這\n##通\n##逛\n##逝\n##逞\n##速\n##造\n##逢\n##連\n##逮\n##週\n##進\n##逵\n##逶\n##逸\n##逻\n##逼\n##逾\n##遁\n##遂\n##遅\n##遇\n##遊\n##運\n##遍\n##過\n##遏\n##遐\n##遑\n##遒\n##道\n##達\n##違\n##遗\n##遙\n##遛\n##遜\n##遞\n##遠\n##遢\n##遣\n##遥\n##遨\n##適\n##遭\n##遮\n##遲\n##遴\n##遵\n##遶\n##遷\n##選\n##遺\n##遼\n##遽\n##避\n##邀\n##邁\n##邂\n##邃\n##還\n##邇\n##邈\n##邊\n##邋\n##邏\n##邑\n##邓\n##邕\n##邛\n##邝\n##邢\n##那\n##邦\n##邨\n##邪\n##邬\n##邮\n##邯\n##邰\n##邱\n##邳\n##邵\n##邸\n##邹\n##邺\n##邻\n##郁\n##郅\n##郊\n##郎\n##郑\n##郜\n##郝\n##郡\n##郢\n##郤\n##郦\n##郧\n##部\n##郫\n##郭\n##郴\n##郵\n##郷\n##郸\n##都\n##鄂\n##鄉\n##鄒\n##鄔\n##鄙\n##鄞\n##鄢\n##鄧\n##鄭\n##鄰\n##鄱\n##鄲\n##鄺\n##酉\n##酊\n##酋\n##酌\n##配\n##酐\n##酒\n##酗\n##酚\n##酝\n##酢\n##酣\n##酥\n##酩\n##酪\n##酬\n##酮\n##酯\n##酰\n##酱\n##酵\n##酶\n##酷\n##酸\n##酿\n##醃\n##醇\n##醉\n##醋\n##醍\n##醐\n##醒\n##醚\n##醛\n##醜\n##醞\n##醣\n##醪\n##醫\n##醬\n##醮\n##醯\n##醴\n##醺\n##釀\n##釁\n##采\n##釉\n##释\n##釋\n##里\n##重\n##野\n##量\n##釐\n##金\n##釗\n##釘\n##釜\n##針\n##釣\n##釦\n##釧\n##釵\n##鈀\n##鈉\n##鈍\n##鈎\n##鈔\n##鈕\n##鈞\n##鈣\n##鈦\n##鈪\n##鈴\n##鈺\n##鈾\n##鉀\n##鉄\n##鉅\n##鉉\n##鉑\n##鉗\n##鉚\n##鉛\n##鉤\n##鉴\n##鉻\n##銀\n##銃\n##銅\n##銑\n##銓\n##銖\n##銘\n##銜\n##銬\n##銭\n##銮\n##銳\n##銷\n##
銹\n##鋁\n##鋅\n##鋒\n##鋤\n##鋪\n##鋰\n##鋸\n##鋼\n##錄\n##錐\n##錘\n##錚\n##錠\n##錢\n##錦\n##錨\n##錫\n##錮\n##錯\n##録\n##錳\n##錶\n##鍊\n##鍋\n##鍍\n##鍛\n##鍥\n##鍰\n##鍵\n##鍺\n##鍾\n##鎂\n##鎊\n##鎌\n##鎏\n##鎔\n##鎖\n##鎗\n##鎚\n##鎧\n##鎬\n##鎮\n##鎳\n##鏈\n##鏖\n##鏗\n##鏘\n##鏞\n##鏟\n##鏡\n##鏢\n##鏤\n##鏽\n##鐘\n##鐮\n##鐲\n##鐳\n##鐵\n##鐸\n##鐺\n##鑄\n##鑊\n##鑑\n##鑒\n##鑣\n##鑫\n##鑰\n##鑲\n##鑼\n##鑽\n##鑾\n##鑿\n##针\n##钉\n##钊\n##钎\n##钏\n##钒\n##钓\n##钗\n##钙\n##钛\n##钜\n##钝\n##钞\n##钟\n##钠\n##钡\n##钢\n##钣\n##钤\n##钥\n##钦\n##钧\n##钨\n##钩\n##钮\n##钯\n##钰\n##钱\n##钳\n##钴\n##钵\n##钺\n##钻\n##钼\n##钾\n##钿\n##铀\n##铁\n##铂\n##铃\n##铄\n##铅\n##铆\n##铉\n##铎\n##铐\n##铛\n##铜\n##铝\n##铠\n##铡\n##铢\n##铣\n##铤\n##铨\n##铩\n##铬\n##铭\n##铮\n##铰\n##铲\n##铵\n##银\n##铸\n##铺\n##链\n##铿\n##销\n##锁\n##锂\n##锄\n##锅\n##锆\n##锈\n##锉\n##锋\n##锌\n##锏\n##锐\n##锑\n##错\n##锚\n##锟\n##锡\n##锢\n##锣\n##锤\n##锥\n##锦\n##锭\n##键\n##锯\n##锰\n##锲\n##锵\n##锹\n##锺\n##锻\n##镀\n##镁\n##镂\n##镇\n##镉\n##镌\n##镍\n##镐\n##镑\n##镕\n##镖\n##镗\n##镛\n##镜\n##镣\n##镭\n##镯\n##镰\n##镳\n##镶\n##長\n##长\n##門\n##閃\n##閉\n##開\n##閎\n##閏\n##閑\n##閒\n##間\n##閔\n##閘\n##閡\n##関\n##閣\n##閥\n##閨\n##閩\n##閱\n##閲\n##閹\n##閻\n##閾\n##闆\n##闇\n##闊\n##闌\n##闍\n##闔\n##闕\n##闖\n##闘\n##關\n##闡\n##闢\n##门\n##闪\n##闫\n##闭\n##问\n##闯\n##闰\n##闲\n##间\n##闵\n##闷\n##闸\n##闹\n##闺\n##闻\n##闽\n##闾\n##阀\n##阁\n##阂\n##阅\n##阆\n##阇\n##阈\n##阉\n##阎\n##阐\n##阑\n##阔\n##阕\n##阖\n##阙\n##阚\n##阜\n##队\n##阡\n##阪\n##阮\n##阱\n##防\n##阳\n##阴\n##阵\n##阶\n##阻\n##阿\n##陀\n##陂\n##附\n##际\n##陆\n##陇\n##陈\n##陋\n##陌\n##降\n##限\n##陕\n##陛\n##陝\n##陞\n##陟\n##陡\n##院\n##陣\n##除\n##陨\n##险\n##陪\n##陰\n##陲\n##陳\n##陵\n##陶\n##陷\n##陸\n##険\n##陽\n##隅\n##隆\n##隈\n##隊\n##隋\n##隍\n##階\n##随\n##隐\n##隔\n##隕\n##隘\n##隙\n##際\n##障\n##隠\n##隣\n##隧\n##隨\n##險\n##隱\n##隴\n##隶\n##隸\n##隻\n##隼\n##隽\n##难\n##雀\n##雁\n##雄\n##雅\n##集\n##雇\n##雉\n##雋\n##雌\n##雍\n##雎\n##雏\n##雑\n##雒\n##雕\n##雖\n##雙\n##雛\n##雜\n##雞\n##離\n##難\n##雨\n##雪\n##雯\n##雰\n##雲\n##雳\n##零\n##雷\n##雹\n##電\n##雾\n##需\n##霁\n##霄\n##霆\n##震\n##霈\n##霉\n##霊\n##霍\n##霎\n##霏\n##霑\n##霓\n##霖\n##霜\n##霞\n##霧\n##霭\n##霰\n##露\n##霸\n##霹\n##霽\n##霾\n##靂\n##靄\n##靈\n##青\n##靓\n##靖\n##静\n##靚\n##靛\n##靜\n##
非\n##靠\n##靡\n##面\n##靥\n##靦\n##革\n##靳\n##靴\n##靶\n##靼\n##鞅\n##鞋\n##鞍\n##鞏\n##鞑\n##鞘\n##鞠\n##鞣\n##鞦\n##鞭\n##韆\n##韋\n##韌\n##韓\n##韜\n##韦\n##韧\n##韩\n##韬\n##韭\n##音\n##韵\n##韶\n##韻\n##響\n##頁\n##頂\n##頃\n##項\n##順\n##須\n##頌\n##預\n##頑\n##頒\n##頓\n##頗\n##領\n##頜\n##頡\n##頤\n##頫\n##頭\n##頰\n##頷\n##頸\n##頹\n##頻\n##頼\n##顆\n##題\n##額\n##顎\n##顏\n##顔\n##願\n##顛\n##類\n##顧\n##顫\n##顯\n##顱\n##顴\n##页\n##顶\n##顷\n##项\n##顺\n##须\n##顼\n##顽\n##顾\n##顿\n##颁\n##颂\n##预\n##颅\n##领\n##颇\n##颈\n##颉\n##颊\n##颌\n##颍\n##颐\n##频\n##颓\n##颔\n##颖\n##颗\n##题\n##颚\n##颛\n##颜\n##额\n##颞\n##颠\n##颡\n##颢\n##颤\n##颦\n##颧\n##風\n##颯\n##颱\n##颳\n##颶\n##颼\n##飄\n##飆\n##风\n##飒\n##飓\n##飕\n##飘\n##飙\n##飚\n##飛\n##飞\n##食\n##飢\n##飨\n##飩\n##飪\n##飯\n##飲\n##飼\n##飽\n##飾\n##餃\n##餅\n##餉\n##養\n##餌\n##餐\n##餒\n##餓\n##餘\n##餚\n##餛\n##餞\n##餡\n##館\n##餮\n##餵\n##餾\n##饅\n##饈\n##饋\n##饌\n##饍\n##饑\n##饒\n##饕\n##饗\n##饞\n##饥\n##饨\n##饪\n##饬\n##饭\n##饮\n##饯\n##饰\n##饱\n##饲\n##饴\n##饵\n##饶\n##饷\n##饺\n##饼\n##饽\n##饿\n##馀\n##馁\n##馄\n##馅\n##馆\n##馈\n##馋\n##馍\n##馏\n##馒\n##馔\n##首\n##馗\n##香\n##馥\n##馨\n##馬\n##馭\n##馮\n##馳\n##馴\n##駁\n##駄\n##駅\n##駆\n##駐\n##駒\n##駕\n##駛\n##駝\n##駭\n##駱\n##駿\n##騁\n##騎\n##騏\n##験\n##騙\n##騨\n##騰\n##騷\n##驀\n##驅\n##驊\n##驍\n##驒\n##驕\n##驗\n##驚\n##驛\n##驟\n##驢\n##驥\n##马\n##驭\n##驮\n##驯\n##驰\n##驱\n##驳\n##驴\n##驶\n##驷\n##驸\n##驹\n##驻\n##驼\n##驾\n##驿\n##骁\n##骂\n##骄\n##骅\n##骆\n##骇\n##骈\n##骊\n##骋\n##验\n##骏\n##骐\n##骑\n##骗\n##骚\n##骛\n##骜\n##骞\n##骠\n##骡\n##骤\n##骥\n##骧\n##骨\n##骯\n##骰\n##骶\n##骷\n##骸\n##骼\n##髂\n##髅\n##髋\n##髏\n##髒\n##髓\n##體\n##髖\n##高\n##髦\n##髪\n##髮\n##髯\n##髻\n##鬃\n##鬆\n##鬍\n##鬓\n##鬚\n##鬟\n##鬢\n##鬣\n##鬥\n##鬧\n##鬱\n##鬼\n##魁\n##魂\n##魄\n##魅\n##魇\n##魍\n##魏\n##魔\n##魘\n##魚\n##魯\n##魷\n##鮑\n##鮨\n##鮪\n##鮭\n##鮮\n##鯉\n##鯊\n##鯖\n##鯛\n##鯨\n##鯰\n##鯽\n##鰍\n##鰓\n##鰭\n##鰲\n##鰻\n##鰾\n##鱈\n##鱉\n##鱔\n##鱗\n##鱷\n##鱸\n##鱼\n##鱿\n##鲁\n##鲈\n##鲍\n##鲑\n##鲛\n##鲜\n##鲟\n##鲢\n##鲤\n##鲨\n##鲫\n##鲱\n##鲲\n##鲶\n##鲷\n##鲸\n##鳃\n##鳄\n##鳅\n##鳌\n##鳍\n##鳕\n##鳖\n##鳗\n##鳝\n##鳞\n##鳥\n##鳩\n##鳳\n##鳴\n##鳶\n##鴉\n##鴕\n##鴛\n##鴦\n##鴨\n##鴻\n##鴿\n##鵑\n##鵜\n##鵝\n##鵡\n##鵬\n##鵰\n##鵲\n##鶘\n##鶩\n##鶯\n##鶴\n##鷗\n##鷲\n##鷹\n##
鷺\n##鸚\n##鸞\n##鸟\n##鸠\n##鸡\n##鸢\n##鸣\n##鸥\n##鸦\n##鸨\n##鸪\n##鸭\n##鸯\n##鸳\n##鸵\n##鸽\n##鸾\n##鸿\n##鹂\n##鹃\n##鹄\n##鹅\n##鹈\n##鹉\n##鹊\n##鹌\n##鹏\n##鹑\n##鹕\n##鹘\n##鹜\n##鹞\n##鹤\n##鹦\n##鹧\n##鹫\n##鹭\n##鹰\n##鹳\n##鹵\n##鹹\n##鹼\n##鹽\n##鹿\n##麂\n##麋\n##麒\n##麓\n##麗\n##麝\n##麟\n##麥\n##麦\n##麩\n##麴\n##麵\n##麸\n##麺\n##麻\n##麼\n##麽\n##麾\n##黃\n##黄\n##黍\n##黎\n##黏\n##黑\n##黒\n##黔\n##默\n##黛\n##黜\n##黝\n##點\n##黠\n##黨\n##黯\n##黴\n##鼋\n##鼎\n##鼐\n##鼓\n##鼠\n##鼬\n##鼹\n##鼻\n##鼾\n##齁\n##齊\n##齋\n##齐\n##齒\n##齡\n##齢\n##齣\n##齦\n##齿\n##龄\n##龅\n##龈\n##龊\n##龋\n##龌\n##龍\n##龐\n##龔\n##龕\n##龙\n##龚\n##龛\n##龜\n##龟\n##︰\n##︱\n##︶\n##︿\n##﹁\n##﹂\n##﹍\n##﹏\n##﹐\n##﹑\n##﹒\n##﹔\n##﹕\n##﹖\n##﹗\n##﹙\n##﹚\n##﹝\n##﹞\n##﹡\n##﹣\n##！\n##＂\n##＃\n##＄\n##％\n##＆\n##＇\n##（\n##）\n##＊\n##，\n##－\n##．\n##／\n##：\n##；\n##＜\n##？\n##＠\n##［\n##＼\n##］\n##＾\n##＿\n##｀\n##ｆ\n##ｈ\n##ｊ\n##ｕ\n##ｗ\n##ｚ\n##｛\n##｝\n##｡\n##｢\n##｣\n##､\n##･\n##ｯ\n##ｰ\n##ｲ\n##ｸ\n##ｼ\n##ｽ\n##ﾄ\n##ﾉ\n##ﾌ\n##ﾗ\n##ﾙ\n##ﾝ\n##ﾞ\n##ﾟ\n##￣\n##￥\n##👍\n##🔥\n##😂\n##😎\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/assets/word_dict.txt",
    "content": "<PAD>\n<UNK>\n，\n的\n[UNK]\n。\n一\n了\n是\n不\n我\n在\n人\n这\n有\n他\n着\n来\n上\n你\n个\n她\n大\n到\n道\n就\n下\n那\n说\n子\n出\n地\n：\n也\n小\n中\n时\n！\n要\n么\n们\n得\n然\n看\n自\n里\n？\n好\n身\n过\n手\n起\n后\n为\n可\n去\n没\n会\n心\n和\n之\n还\n都\n天\n、\n以\n头\n能\n对\n想\n而\n开\n只\n面\n己\n女\n生\n动\n样\n如\n发\n声\n无\n多\n用\n已\n又\n前\n被\n把\n家\n事\n经\n点\n两\n知\n情\n老\n眼\n力\n现\n于\n年\n感\n什\n很\n儿\n让\n但\n意\n间\n方\n成\n些\n笑\n体\n进\n长\n高\n从\n将\n话\n美\n见\n啊\n住\n当\n气\n所\n真\n回\n白\n口\n作\n再\n此\n明\n给\n觉\n正\n妈\n边\n色\n才\n行\n快\n公\n种\n分\n三\n几\n全\n法\n却\n次\n.\n本\n轻\n走\n国\n最\n同\n打\n十\n向\n实\n神\n定\n学\n张\n水\n光\n门\n内\n紧\n主\n更\n其\n入\n直\n外\n听\n问\n脸\n少\n肉\n因\n死\n做\n部\n双\n处\n与\n男\n便\n王\n太\n等\n怎\n像\n吧\n性\n红\n第\n二\n候\n理\n接\n放\n满\n阴\n果\n别\n比\n重\n相\n林\n受\n位\n「\n叫\n玉\n何\n龙\n强\n爱\n日\n应\n花\n清\n微\n加\n安\n机\n完\n连\n名\n者\n房\n李\n刚\n姐\n东\n带\n月\n西\n师\n亲\n文\n风\n服\n并\n难\n山\n深\n指\n先\n关\n常\n干\n马\n跟\n由\n」\n目\n变\n四\n使\n嘴\n平\n合\n立\n吗\n精\n物\n车\n流\n火\n呢\n巴\n股\n望\n军\n空\n任\n金\n衣\n似\n工\n乳\n始\n越\n电\n原\n战\n新\n般\n黑\n今\n飞\n海\n转\n插\n至\n淫\n通\n表\n腿\n雪\n度\n\"\n思\n根\n信\n场\n业\n随\n吃\n虽\n许\n反\n阵\n书\n乎\n慢\n世\n往\n云\n路\n解\n特\n离\n算\n哥\n青\n半\n命\n音\n竟\n,\n交\n非\n拉\n教\n化\n容\n娇\n条\n夫\n刻\n活\n丝\n《\n》\n万\n量\n坐\n司\n且\n传\n管\n该\n热\n南\n找\n周\n五\n惊\n突\n早\n片\n落\n数\n它\n欢\n阳\n总\n极\n结\n步\n棒\n整\n城\n激\n孩\n制\n喜\n倒\n鸡\n远\n穴\n床\n冷\n怕\n市\n香\n母\n及\n抽\n；\n言\n息\n每\n件\n阿\n提\n百\n血\n记\n利\n断\n绝\n停\n弄\n（\n失\n站\n宝\n）\n影\n露\n静\n语\n即\n士\n近\n弟\n尔\n尽\n屁\n若\n欲\n柔\n系\n杀\n形\n细\n斯\n穿\n兴\n剑\n灵\n武\n吸\n认\n收\n星\n足\n底\n办\n石\n急\n求\n代\n脚\n黄\n保\n击\n功\n江\n丽\n娘\n拿\n令\n终\n华\n陈\n杨\n未\n冲\n晚\n或\n乐\n低\n哪\n修\n奇\n楚\n挺\n显\n刺\n必\n品\n员\n暗\n视\n友\n叶\n谁\n痛\n嫩\n则\n玩\n告\n千\n妹\n射\n包\n计\n烈\n温\n界\n抱\n字\n唇\n产\n胸\n民\n准\n魔\n够\n罗\n液\n摸\n势\n帮\n沉\n各\n程\n软\n忍\n乱\n单\n*\n酒\n达\n怪\n久\n忙\n敢\n哈\n兵\n顶\n伤\n切\n留\n著\n依\n速\n背\n题\n德\n压\n-\n象\n伸\n呼\n害\n送\n父\n区\n元\n续\n北\n队\n夜\n建\n官\n识\n具\n院\n请\n备\n刘\n格\n装\n顾\n睛\n摇\n亮\n缓\n克\n八\n苦\n持\n宫\n章\n裤\n脑\n消\n舒\n梦\n布\n务\n易\n响\n朝\n错\n钱\n期\n破\n答\n怀\n展\n照\n报\n领\n客\n术\n腰\n室\n巨\n妇\n待\n线\n调\n复\n苏\n渐\n论\n滑\n秦\n掌\n狂\n古\n异\n猛\n皇\n味\n角\n确\n兰\n睡\n雨\n决\n句\n式\n仙\n运\n吟\n脱\n顿\n推\n狠\n冰\n羞\n众\n另\n政\n台\n抓\n居\n爷\n六\n游\n英\n甚\n七\n木\n嗯\n迷\n器\n网\n团\n注\n淡\n春\n
级\n硬\n峰\n设\n毛\n旁\n喝\n忽\n舌\n左\n铁\n科\n商\n皮\n按\n族\n曾\n图\n首\n资\n奶\n简\n米\n故\n继\n取\n掉\n爸\n除\n丰\n密\n况\n呀\n类\n赵\n谢\n右\n态\n段\n围\n楼\n基\n约\n粗\n潮\n需\n技\n佛\n跳\n九\n充\n荡\n顺\n差\n引\n姑\n艳\n愿\n浪\n耳\n熟\n兄\n湿\n唐\n您\n义\n赶\n念\n护\n观\n药\n松\n隐\n跑\n试\n君\n圆\n局\n节\n散\n诉\n闪\n止\n颤\n限\n刀\n支\n粉\n鬼\n禁\n医\n洞\n齐\n含\n集\n帝\n秀\n握\n臀\n初\n造\n吻\n料\n婆\n号\n导\n麻\n招\n抬\n质\n眉\n坚\n套\n希\n疑\n拍\n波\n村\n球\n社\n怒\n肯\n毫\n透\n雅\n份\n抚\n翻\n退\n专\n河\n奋\n户\n校\n考\n醒\n示\n雷\n究\n景\n圣\n胡\n病\n摆\n汉\n钟\n威\n统\n模\n哦\n暴\n爽\n举\n恶\n靠\n称\n卫\n弹\n警\n尖\n选\n既\n层\n攻\n扭\n承\n呻\n存\n余\n斗\n仍\n魂\n舔\n素\n改\n夏\n群\n府\n沙\n境\n置\n联\n板\n饭\n孙\n志\n·\n陆\n历\n骚\n秘\n呵\n抖\n1\n属\n州\n闻\n查\n副\n致\n屄\n操\n亚\n龟\n仅\n派\n证\n默\n换\n妻\n块\n莫\n写\n组\n凌\n坏\n贵\n助\n宁\n媚\n恐\n排\n状\n雄\n毕\n紫\n卡\n省\n福\n短\n追\n据\n食\n岁\n规\n杰\n萧\n寒\n环\n广\n枪\n蜜\n谈\n费\n骨\n座\n委\n丁\n宇\n画\n超\n蛋\n礼\n严\n草\n戏\n配\n朋\n端\n共\n治\n妙\n察\n较\n研\n恩\n咬\n伙\n型\n俊\n店\n独\n版\n润\n宗\n震\n揉\n议\n讲\n柳\n价\n树\n京\n投\n哼\n朱\n敏\n挑\n扬\n浑\n争\n梅\n习\n丹\n擦\n银\n营\n维\n幸\n洗\n劲\n野\n假\n偷\n忘\n宋\n源\n索\n土\n守\n虚\n混\n洁\n闭\n救\n付\n控\n叹\n摩\n练\n徐\n烟\n裸\n划\n县\n秋\n喘\n堂\n吴\n买\n敌\n参\n姨\n施\n标\n狗\n否\n权\n夹\n厉\n担\n遇\n险\n毒\n触\n虎\n负\n纪\n效\n际\n巧\n养\n晓\n哭\n欣\n须\n尚\n肩\n胜\n演\n贝\n享\n曲\n项\n升\n躺\n啦\n案\n藏\n咱\n珠\n嘿\n防\n韩\n婚\n屋\n凤\n材\n挥\n适\n2\n娜\n创\n迎\n诗\n仿\n优\n班\n责\n喊\n史\n泪\n探\n芳\n剧\n牙\n湖\n茎\n贴\n赤\n验\n临\n智\n盘\n撞\n休\n磨\n舞\n镇\n蓝\n妖\n逃\n疯\n牛\n鲜\n永\n田\n凡\n范\n抗\n捏\n呆\n职\n普\n恨\n喷\n诱\n叔\n园\n桌\n侧\n疼\n印\n康\n奸\n移\n肏\n裙\n尊\n兽\n~\n缩\n羽\n盖\n丈\n瞬\n育\n略\n伟\n(\n徒\n幽\n挂\n凝\n)\n3\n寻\n旧\n袋\n滚\n船\n伯\n躯\n良\n?\n尼\n某\n庭\n肥\n颜\n富\n弱\n拥\n塞\n洛\n碰\n厅\n庄\n吐\n鱼\n肤\n托\n谷\n狼\n卖\n姿\n臂\n灯\n沈\n封\n壁\n辈\n玄\n梁\n采\n浓\n油\n免\n诺\n燕\n晃\n轮\n惑\n值\n涌\n款\n嘛\n尤\n杯\n玲\n鼻\n艺\n纯\n痒\n．\n洋\n┅\n犹\n莉\n圈\n肌\n炼\n杂\n隔\n喔\n积\n邪\n增\n读\n逸\n奴\n扑\n佳\n腹\n慕\n骑\n释\n趣\n晶\n货\n罪\n获\n拔\n腾\n笔\n菜\n爆\n农\n歌\n败\n伏\n汗\n蒙\n烧\n薄\n善\n午\n稍\n吓\n泄\n伊\n载\n架\n厚\n惜\n窗\n蓉\n飘\n骂\n扎\n/\n抵\n墙\n归\n颗\n袜\n束\n企\n番\n纤\n茶\n列\n烦\n横\n占\n私\n赛\n滴\n折\n街\n济\n借\n避\n亦\n糊\n牌\n净\n袁\n吞\n傅\n奈\n皱\n;\n伦\n吮\n尸\n绿\n遍\n挤\n供\n鼓\n搞\n乡\n5\n##┅\n郎\n桃\n灭\n讨\n介\n缝\n偏\n彩\n党\n纳\n卷\n罩\n互\n敬\n懂\n慧\n诸\n犯\n菲\n扶\n织\n壮\n纷\n替\n裂\n逼\n缠\n律\n征\n缘\n俩\n残\n纸\n甲\n距\n婷\n豪\n率\n盯\n监\n幻\n镜\n迹\n怜\n降\n泽\n迫\n舍\n登\n甜\n碎\n
悄\n呜\n乔\n4\n吉\n欧\n罢\n祖\n森\n勇\n拳\n销\n郑\n昨\n冒\n财\n彻\n凉\n琳\n岳\n危\n奔\n搂\n择\n姓\n墨\n勾\n殿\n酥\n测\n爬\n陪\n鸣\n莲\n醉\n训\n滋\n预\n健\n漂\n弃\n概\n唯\n暖\n域\n馆\n肚\n盛\n尘\n闹\n來\n仰\n仔\n撑\n耐\n郭\n晕\n鲁\n聚\n睁\n荣\n瞧\n挣\n:\n戴\n博\n董\n逐\n凶\n劳\n蕾\n躲\n垂\n陷\n浮\n拼\n川\n翘\n迅\n幕\n构\n庆\n胆\n累\n唔\n蒂\n啪\n跪\n朵\n宜\n附\n餐\n乖\n补\n涨\n後\n编\n忌\n昏\n麽\n惨\n奥\n杜\n翼\n炎\n仇\n孔\n亡\n岛\n童\n枫\n词\n稳\n仪\n谓\n旋\n竹\n剩\n臣\n恋\n遥\n绕\n吹\n凭\n融\n尝\n逗\n执\n拜\n扯\n阻\n琴\n纵\n虑\n瑶\n盈\n貌\n侯\n耸\n挡\n赏\n闲\n扫\n耻\n珍\n例\n愣\n傲\n漫\n辰\n旅\n莹\n丢\n悲\n促\n掩\n宽\n脉\n困\n启\n酸\n策\n宣\n袭\n庞\n额\n录\n络\n乌\n樱\n沟\n缺\n遗\n趴\n灰\n裹\n浴\n伴\n拒\n傻\n误\n塔\n&\n核\n努\n弯\n脖\n妃\n艾\n乃\n瞪\n妮\n迟\n甘\n池\n洪\n搓\n检\n辱\n惠\n蛇\n哎\n课\n慌\n诚\n婉\n!\n批\n淋\n益\n闷\n阶\n协\n输\n拨\n慰\n聊\n抹\n寸\n腻\n愈\n胀\n鞋\n苍\n6\n饱\n丫\n氏\n愤\n猜\n曼\n辞\n赞\n孤\n尾\n眸\n胯\n评\n芒\n抢\n浩\n怨\n碧\n豆\n均\n薇\n贱\n爹\n锁\n泛\n惯\n侍\n忆\n晴\n夺\n膀\n烫\n绪\n仁\n典\n隆\n轩\n固\n菊\n绵\n7\n嘻\n辉\n搭\n俏\n针\n炸\n冯\n锋\n汁\n痕\n席\n延\n哀\n忠\n炮\n恢\n椅\n辛\n埋\n悦\n汤\n堪\n迪\n浅\n钻\n嫣\n述\n8\n遭\n泰\n轰\n肖\n码\n硕\n孟\n咪\n姬\n屈\n帅\n胖\n括\n扣\n津\n吼\n泉\n贼\n频\n嘉\n10\n悠\n径\n季\n授\n彼\n晨\n沿\n倩\n悉\n荒\n暂\n唱\n染\n瑞\n判\n箭\n凑\n倾\n溜\n琪\n卧\n勒\n陌\n邓\n脏\n跃\n谋\n勃\n辣\n估\n佩\n旦\n侠\n吕\n绍\n渴\n侵\n鸟\n熊\n韵\n辆\n沾\n瓶\n惧\n坦\n忧\n筑\n朗\n雾\n齿\n燃\n拖\n瓣\n焦\n饰\n霞\n涂\n奉\n旗\n豫\n遮\n丸\n尿\n霸\n烂\n涛\n岩\n芸\n御\n减\n票\n怡\n莱\n匆\n牢\n疾\n递\n霜\n啸\n喃\n革\n猪\n厌\n衫\n皆\n脆\n尺\n途\n审\n废\n瓜\n姆\n损\n堆\n颈\n偶\n踏\n潜\n猫\n零\n奖\n毁\n颖\n魏\n贞\n培\n於\n踪\n末\n喉\n韦\n涩\n讶\n曰\n莎\n厂\n欺\n央\n骗\n郁\n悟\n臭\n娃\n薛\n凯\n锐\n趁\n库\n颊\n玛\n倍\n嫂\n萨\n钢\n奏\n泡\n乘\n泥\n购\n汽\n袍\n桥\n萍\n卿\n痴\n盟\n肛\n翠\n符\n眯\n窄\n洲\n冬\n曹\n岂\n悔\n疗\n袖\n淑\n截\n狐\n港\n羊\n撕\n舅\n刑\n楠\n绑\n惹\n霍\n驾\n汇\n唤\n怔\n饮\n敲\n咽\n陶\n%\n牵\n献\n糖\n柱\n枝\n售\n覆\n虫\n映\n箱\n珊\n拾\n卓\n撒\n葛\n铺\n阁\n恭\n颇\n瓦\n瘦\n屏\n俗\n帐\n幅\n幼\n篇\n振\n畅\n播\n扔\n宿\n铃\n茹\n寂\n耀\n劫\n航\n雕\n抑\n谦\n穷\n黎\n纹\n宾\n汪\n肃\n刹\n焰\n绳\n晋\n饶\n舟\n喂\n肢\n恼\n督\n摄\n抛\n予\n跨\n肠\n狱\n赫\n蛮\n腔\n岸\n扮\n昊\n雯\n聪\n井\n肆\n蒋\n桑\n9\n允\n贪\n禾\n梯\n疲\n丑\n斩\n斜\n戒\n呈\n廷\n陵\n析\n劝\n苗\n序\n冠\n俱\n锦\n亿\n#\n伍\n殊\n掏\n芬\n姊\n拦\n亏\n愧\n扰\n膝\n丛\n召\n麦\n巾\n龄\n贤\n蹲\n乾\n璃\n恰\n漠\n碗\n催\n懒\n衬\n洒\n噢\n眨\n稀\n悬\n吊\n嫁\n慎\n鹏\n瑕\n肿\n纠\n灌\n慈\n键\n赖\n熙\n纱\n柏\n寺\n晰\n污\n粒\n棍\n冥\n12\n魄\n贺\n贾\n茫\n丧\n墓\n植\n昂\n這\n链\n粘\n唉\n～\n罚\n秒\n噗\n叉\n牧\n申\n柜\n扩\n僵\n涵\n弥\n
贯\n询\n虹\n魅\n俯\n鞭\n撩\n琼\n措\n厨\n勉\n佐\n吁\n访\n披\n舰\n毅\n踢\n签\n咯\n溪\n孕\n堡\n澡\n祭\n挖\n爵\n卢\n贫\n赌\n弓\n甩\n溢\n扇\n茵\n搜\n讯\n怖\n芝\n捧\n笼\n窝\n盗\n浸\n驶\n胴\n藤\n繁\n祝\n猎\n20\n馨\n∶\n帆\n衡\n添\n咕\n棠\n蝶\n匹\n僧\n租\n泣\n跌\n捉\n筋\n柄\n详\n勤\n【\n逍\n娟\n返\n仲\n##s\n咳\n瞳\n潘\n砸\n违\n础\n鸿\n掠\n竞\n铜\n啥\n[\n愉\n籍\n誉\n】\n萌\n耗\n赢\n－\n]\n11\n埃\n歉\n迈\n脂\n铭\n麟\n刷\n祸\n障\n祥\n锅\n鹰\n邦\n宠\n摔\n凸\n搬\n崩\n骄\n斥\n尬\n酷\n碍\n芙\nt\n捕\n虐\n膜\n尴\n援\n闯\n殷\n耶\n妍\n咐\n夸\n饿\n宛\n腕\n昌\n『\n仗\n囊\n宏\n瑟\n逆\n阅\n蕴\n膛\n驱\n钰\n描\n亭\n叛\n哇\n寿\n媳\n崇\n嫌\n姚\n丘\n糟\n渡\n串\n胁\n币\n愁\n爪\n婴\n册\n患\n盒\n欠\n吾\n煞\n绷\n吵\n浆\n兼\n巫\n弗\n崔\n岚\n歇\n妳\n俄\n漆\n掀\n挨\n姜\n割\n盾\n枚\n胳\n疏\n撤\n宅\n穆\n裁\n邵\n绩\n嗔\n』\n膊\n涉\n档\n刮\n励\n0\n驰\n靖\n蔡\n荷\n黏\n灾\n粮\n哟\n诡\n帽\n孝\n劈\n恒\n搅\n桂\n咒\n订\n氛\n凄\n赐\n昭\n蕊\n驻\n灼\n踩\n矿\n矮\n捷\n湾\n玻\n捂\n仆\n巡\n帕\n媒\n哲\n械\n隶\n庙\n恍\n妩\n腐\n刃\n##t\n鼎\n萝\n症\n妆\n渊\n颠\n隙\n昆\n堵\n30\n妓\n仓\n漉\n逢\n郡\n瘫\n鼠\n哑\n歪\n综\n杆\n蠢\n坡\n斑\n阔\n媛\n坛\n邀\n～～\n翔\n辑\n叠\n瞒\n廊\n咙\n溃\n卑\n誓\n瑗\n宴\n彭\n##2\n珑\n搐\n艰\n夕\n拢\n婶\n炉\n眠\n蠕\n吩\n邻\n税\n蹭\n剂\n玫\n嘶\n兔\n漏\n##0\n缕\n盆\n畏\n矛\n崖\n皙\n胎\n捅\n帘\n啲\n乏\n旺\n贸\n壶\n筒\n陛\n脾\n骤\n犬\n沒\n烁\n迁\n湘\n戈\n芷\n岭\n枕\n伐\n##9\n拱\n禹\n沐\n趟\n饥\n契\n腥\n怯\n谅\n砍\n赚\n狭\n鹿\n骇\n乜\n坠\n役\n時\n澜\n矩\n遂\n挪\n枯\n谨\n渗\n拟\n笨\n璇\n霄\n阮\n辩\n剥\n說\n菁\na\n甫\n宙\n寨\n悍\n寄\n摊\n塑\n棉\n嘟\n匪\n捆\n嘲\n屠\n竖\n届\n黛\n妥\n靡\n猴\n彤\n15\n偿\n鉴\n挽\n斤\n亵\n储\n沫\n哄\n侄\n瞎\n鹤\n咖\n拐\n娶\n盼\n溅\n页\n嗓\n筹\n攀\n卜\n兜\n妾\n桐\n雀\n填\n痉\n躁\n噬\n辅\n砰\n挛\n聂\n萱\n桶\n婢\n##g\n棋\n嗦\n胞\n雁\n牲\n哗\n蜂\n啧\n坑\n咧\n胶\n啡\n兮\n谭\n##1\n葬\n厢\n锟\n彷\n翎\n滩\n窜\n译\n淌\n##3\n胃\n亩\n趾\n尉\n腴\n憋\n塌\n岗\n篷\n坤\n账\n柴\n缚\n旨\n杏\n睿\n绽\n蓬\n削\n诧\n柯\n轨\n掐\n镖\n绯\n辨\n##n\n绘\n竭\n朴\n扒\n娴\n磊\n伪\n厕\n茂\n览\n赋\n筱\n拓\n弦\n漓\n膨\n妄\n陡\n盐\n循\n殖\n琛\n栗\n戚\n##d\n遣\n耍\n吱\n屑\n肝\n##r\n娅\n愕\n舐\n诀\n戳\n剪\n伺\n拂\n啼\n燥\n邱\n朕\n棺\n壳\n拽\n凰\n羡\n栋\n儒\n姻\n揭\n翅\n##k\n瞥\n匙\n焚\n翰\n绣\n烛\n冤\n脯\n霖\n寇\n17\n拆\n寡\n##5\n##y\n泳\n旭\n墅\n哧\n惩\n##4\n潇\n址\n叮\n尹\n蔚\n逝\n蟒\n蔓\n糕\n窥\n※\n摘\n婪\n禅\n厮\n昔\n狞\n褪\n晌\n鄙\n泌\n脊\n慨\n狮\n赴\n辽\nb\n秽\n骆\n垫\n唾\n赔\n艘\n撅\n妨\n逊\n淼\n谎\n宰\n茜\n谱\n嚷\n沸\n捣\n婊\n昧\n轴\n烤\n撼\n搏\n妒\n汹\n栏\n滥\n熬\n陀\n##the\n寞\n撇\n瑜\n翁\n廉\n芯\n豹\n##6\n荆\n兆\n瑰\n荐\n遵\n钥\n愚\n狄\n杖\n##a\n哩\n捡\n兹\n82\n+\n浙\n债\n凛\n＊\n泼\n過\n娱\n漾\n哆\n##i\n裴\n
c\n夷\n亢\n冉\n矣\n钉\n炽\n佣\n冻\n##8\n苹\n灿\n徽\n寓\n耿\nquot\n擅\n敞\n歹\n押\n闺\n践\n嗤\n凹\n阜\n##x\n猥\n畜\n诊\n泊\n擒\n舱\n贡\n摧\n##b\n憾\n蓄\n疆\n讽\n惟\n##w\n趋\n瞄\n轿\n炒\n##12\n16\n##e\n匀\n蓦\n嘱\n呃\n沃\n稠\n姥\n倦\n杭\n躬\n逛\n2010\n惶\n姗\n乙\n氓\n罕\n100\n璟\n裳\n①\n彪\n瘾\n郝\n妞\n50\n##7\n篮\n署\n巷\n揽\n唧\n乞\n呐\n牺\n梨\n巍\n嗅\n彦\n椒\n甬\n巅\n携\n俞\n衷\n梳\n##c\n肺\n18\n胤\n碑\n衰\n辜\n俺\n钩\n弘\n绮\n──\n拧\n00\n叙\n耕\n倏\n稚\n拭\n邮\n葡\n贷\n倪\n琦\n咸\n丐\n刊\n饼\n圳\n朦\n##m\n瓷\n庸\n裡\n浇\n挠\n胧\n坊\n纽\n曦\n缸\n秃\n搁\n卵\n倚\n藉\n醋\n芦\n黯\n滨\n滞\n侦\n13\ns\n蒸\n钦\nx\n祈\n窃\n萄\n沧\n毯\n2011\n涯\nxi\n14\n蚁\n诞\n蚀\n嗡\n匠\n浊\n荫\n驳\n裆\n芊\n榜\n敛\n裕\n蹦\n稿\n霉\n襄\n蔽\n衍\n##th\ncom\n潭\n煮\n履\n魁\n蹂\n棵\n蝉\n03\n砖\n斌\n惫\n抠\n琐\n抄\n锤\n##he\n眩\n彬\n嘘\n侣\n碌\n寝\n幺\n=\n顽\n躏\n甸\n狸\n娥\n檀\n劣\n暇\n桩\n孽\n焉\n卒\n耽\n聘\n弧\n恕\n_\n暮\n2009\n栽\n葱\n##o\n茅\n蹙\n哨\n拇\n咋\n24\nwww\n揪\n斧\n罡\n喻\n##of\n狡\n嫉\n鲍\n厦\n對\n狰\n卦\n扁\n敦\n阎\n睹\n擎\n绒\n旷\n衙\n嚎\n##u\n忖\n菱\n腮\n蛛\n遁\n朽\n菌\n苞\n盏\n璐\n厘\n蹄\n湛\n陋\n抿\n鹅\n仑\n株\n婿\n發\n拘\n們\n##l\n扳\nk\n辖\n禀\n襟\n咦\n25\n坟\n巢\n蹬\n渔\n辟\n2012\n淘\n2008\n尧\n倘\n淮\n蓓\n##h\n蹈\n沁\n祁\n磁\n蜡\n粹\n屌\n囚\n艇\n疚\n绸\n赠\n佑\n澈\n蛊\n麒\n經\n蜀\n扛\n淹\n绞\n榻\n恳\n皓\n筝\n煤\n暧\n驴\n帖\n嚣\n谐\n垃\n蹊\n琢\n沌\n捞\n宵\n圾\n嘎\n晔\n芽\n郊\n2018\n晒\n渣\n40\n雍\n峡\n罐\n嘤\n21\n堕\n沦\n狈\n咨\n逻\n甄\n淳\n钧\n磕\n窒\n2000\n诈\n掘\n栈\n酬\n禽\n塘\n纲\n惚\n渠\n匕\n\\\n叨\n挟\n惮\n猿\n箍\n筷\n眶\n涡\n螺\n19\n宪\n鸭\n嘀\n迭\n喇\n詹\n伽\n##ra\n鳞\n婧\n奢\n锡\n嗽\n滔\n棚\n矜\n咚\n靴\n靥\n煌\n莺\n凳\n吏\n泻\n‥\n兀\n煎\n蝴\n唷\n翩\n亨\n斋\n剔\n##sh\n伞\n掰\n傍\n梭\n##er\n澄\n60\n媽\n##ao\n褐\n锻\n腊\n禄\n虞\n鸦\n枉\n陕\n俘\n嵌\n##f\n砂\n恬\n2007\n坎\n呕\n叽\n啜\n哮\n窟\nm\n勿\n拎\n畔\n颅\n卸\n苑\n诏\n22\n氧\n铸\n黝\n膏\n梵\n骏\n酱\n窍\n堤\n雇\n毙\n辕\nandroid\n菩\nh\n虾\n噜\n侮\n瞅\n笛\n霆\n鸾\n廖\n贩\n潺\n券\n咏\n绰\n衔\namp\n薪\n2006\n踝\n茸\n疤\n渺\n##ng\n粥\n谊\n槽\n樊\n拚\nj\n顷\n乍\n伶\n冈\n芭\n熏\n矢\n喧\n谜\n稻\n##to\n髓\n梓\n2005\n惕\n叱\n溶\n鞠\n##ａ\n恤\n莽\n##st\n撸\n咆\n哉\n吭\n##ing\n霎\n2013\n丞\n糙\n戮\n函\n驼\n##ed\n睫\n敖\n斐\n卉\n浦\n熄\n##hi\n啃\n迦\n箫\nf\nr\n疫\n诛\n蒲\n掷\n惭\n楞\n僻\n挫\n39\n梆\n澳\n嬉\nd\n勋\n搔\n啤\n23\n##on\n80\n臻\n苟\nu\n炫\n奕\n|\n框\n虏\n滕\n##ha\n弩\n绫\n奎\n>\n葫\ng\n蛟\ne\n嚼\n廓\n汝\n褶\n##uo\n钓\n蜷\n睾\n韧\n踹\n岔\n梢\n琉\n##v\n逞\n##～\n2014\n厥\n蚂\n怦\n##z\n盲\n汩\n祷\n叩\n～～～\n現\n犀\n盔\n拙\n讪\n茨\n
26\nw\n瞟\n筠\n偎\n炙\n暑\n骸\n揍\nse\n桓\n旬\n峻\n肘\n'\n侃\n窦\n喽\n28\n缴\n##in\n漱\n噩\n瑾\n峨\n悸\n絮\n棱\n镶\n蔑\n##p\n豁\n闵\n碟\n##xt\n迸\n叭\n垮\n琅\n溺\n薰\n菇\n##le\n屎\n捐\n缎\n稽\n##and\n2017\n渍\n瀚\n蝎\n炭\n坪\n捻\n跋\n90\n屡\n勺\n跄\n芮\n焕\n##be\n2003\n肾\n##ro\n2004\n沛\n葵\n②\n踉\n呛\n鞘\nn\nv\n瑛\n酿\n腋\n烘\n敷\n邹\n诵\n##q\n逮\n揣\n呸\n<\nthe\n涎\n脐\n貂\n嗷\n冶\n200\nl\n垒\n咛\n##re\n崎\n霈\n壤\n邢\n慑\no\n眷\n##an\n拌\n27\n喀\n窘\n笙\n瀑\n钮\n70\n梧\n刁\n讥\n2015\n觑\n猩\n##ck\n癌\n邃\n隋\n涟\n捶\n澎\n萎\n漩\n茄\n##wa\n庵\n濡\n缀\n蚊\n骷\n夭\n髅\ni\n狙\n肮\n藩\n呦\n幢\n〔\n咔\n谣\n壑\n旱\n吨\n涅\n穹\n铮\n陨\n暄\n翡\n嗜\n兑\n呗\n牡\n蚕\n棕\n讳\np\n兢\n##wi\n坝\n##is\n诅\n雌\n攥\n〕\n肋\n瘤\n媾\n##me\n簇\n匿\n铐\n懵\n傀\n500\n祀\n杉\n挲\n绅\n烙\n猾\n橡\n跺\n芹\n2016\n１\n秩\n怠\n##at\n枣\n##×\n枢\n铠\n粪\n鑫\n蜘\n29\n钗\n倔\n31\n懈\n涕\n醇\n阀\n娆\n﹐\n##sa\n颂\n秉\n栖\n颓\n匣\n2002\n茉\n炯\n哽\n璧\n槐\n##co\n彰\n厄\n##fo\n腺\n晖\n慾\n龚\n熔\n棘\n捺\n驯\n曳\nz\n屯\n湃\n阱\n##ho\n搀\n榨\n檐\n鸽\n蟹\n辙\n##ｏ\n憨\n叼\n藕\n霓\n歧\n岑\n褥\n衅\n袅\n沮\n巩\n恣\n##j\n纺\n麼\n颔\n馒\n贲\n哒\n寥\n晏\n佬\n忿\n剖\n仕\n##di\n疙\n颁\n矫\n##ly\n35\n祟\n01\n婕\n裘\n蒜\n缭\n蔬\n漪\n儡\n##ma\n唬\n逾\n叟\n遏\n觅\n300\ntxt\n蕙\n懊\n瘩\n寰\n浣\n嗒\n##al\n忐\n忑\n莞\n甥\n橙\n##00\n剿\n啰\n皂\n昕\n嬷\n擂\n戟\n铲\n鬓\n穗\n奚\n娣\ny\n2001\n剃\n晦\n徊\n纬\n炕\n##se\n##el\n斟\n嚓\n1000\n嗣\n馀\n虔\n庐\n癫\n悚\n庚\n慵\n禧\n嗨\n弊\n蝇\n##mm\n聋\n丙\n绊\n朔\n痹\n竿\n绢\n##nt\n赦\n嗖\n08\n憎\n昼\n##un\n啾\n嗲\n臊\n３\n喳\n##you\n鬟\n茬\n邑\n徘\n榴\n姣\n宸\n舵\n窑\n咿\n##pa\n屉\n凿\n##de\n梗\n藻\n##os\n歼\n飙\n糜\n锥\n胭\n珂\n桦\n删\n赎\n鞍\n悯\n缅\n谕\n懿\n戬\n橱\n淇\n酣\n枭\nnbsp\n##te\n垢\n曝\n##ar\n雏\n冀\n猝\n钞\n椎\n惺\n蹑\n驭\n沼\nro\n墟\n舆\n腑\n霹\n##or\n攒\n##li\nand\n苔\n僚\n##mo\n笠\n磅\n##ｎ\nyin\n辫\n##go\n##℃\n##et\n噪\n灶\n笋\n挚\n贬\n峙\n##la\n谍\n##ot\n佟\n45\n囔\n##us\n勘\n36\n钝\n##so\n弛\n昵\n＂\n04\n拯\n^\n赧\n橘\n邸\n##no\n##ch\n##es\n珏\n撰\n泞\n##na\n璋\n戎\n蔷\n悴\n嵩\n##da\n碾\n##ｅ\n铅\n##en\n蛤\n杠\n雳\n09\n個\n渎\n栅\n谬\n捋\n峭\n##ne\n32\n嫖\n靓\n踱\n崽\n遐\n俐\n萤\n蝠\n辗\n锈\n憔\n簌\n叁\n癸\n２\n瑄\n##ｍ\n庇\n赘\n嫡\n##ss\n##ta\n剌\n奄\nq\n1999\n咂\n蛙\n铛\n悻\n桀\n@\n孜\n拣\n##９\n鲸\n##fi\n##10\n缪\n咻\n鸳\n募\n倡\n06\n碳\n05\n怅\n裔\n③\n恃\nunt\n揖\n慷\n唰\n##om\n侥\n杵\n膳\n咫\n苇\n懦\n鸠\n寐\n##ting\n阐\n吒\n绛\n玖\n缉\n蹿\n匈\n茧\n钳\n07\n庶\n##si\n##it\n掺\n烨\n為\n婵\n卯\n癖\n寅\n惬\n淀\
n筛\n骋\n咀\n183\n砚\n嬴\n绎\n璞\n芜\n饲\n戾\n肴\n##as\n蕉\n1998\n敝\n闸\n镯\n##ol\n惦\n婀\n殆\n燎\n笃\n峦\n殇\n屿\n聆\n桔\n瞻\n##ver\n粟\nhttp\nch\n##ve\n幡\n##do\n匡\nａ\n绚\n髻\n##pe\n滤\n栓\n沥\n奠\n##０\n疮\n辐\n飒\n巳\n洽\n鸯\n祠\n倭\n幂\n熠\n跷\n馈\n偌\n讼\n掳\n摁\n沪\n噼\n祯\n##ll\n＿\n冽\n##int\n瘪\n髯\n舜\n撮\n唠\n衩\n宦\n蚩\n鹃\n##20\n苓\n圭\n02\n##fr\n##ｉ\n袒\n韶\n诫\n洼\n拴\n驿\n烹\n馋\n牟\nit\n璨\n麾\n##rs\n煽\nps\n翟\n汐\n浏\n谏\n1997\n诣\n崛\n炖\n蘑\n焱\n120\ngt\n嫔\n颐\n##ti\n璀\n陇\n［\n袱\n苛\n扉\n##mi\n亥\n瑚\n簸\n眺\n禛\n簿\n锯\n唆\n汀\n煜\n磷\n1996\n褂\n隧\n恙\n侬\n睬\n##ge\n眈\n55\n娼\n缰\n蹋\n紊\n珈\n粤\n珀\n侈\n崭\n菡\n馅\n##ｔ\n頭\n墩\nji\n妤\n##nd\n##ce\n肪\n窈\n栩\nqq\n34\n75\n抡\n☆\n贿\n％\n噎\n毓\n羹\n嫦\n睽\n阙\n150\nno\n拗\n炳\n骼\n窖\n殴\n##ri\n鄂\n瞿\n鲤\n##po\n1995\n隻\n呓\n33\n砌\n##ba\n##ｘ\n玥\n礁\n羁\n哺\n蛾\n咄\n##rt\n囡\n蔼\n槛\n坂\n##sp\n##lo\n譬\n阖\n##fa\n扼\n晟\n##ｄ\nhe\n锣\n衲\nchang\n茗\n酌\n涧\n窕\n蚌\n躇\n滢\n帜\n祺\n踌\n薯\nｃ\n娄\n##ea\n缔\n鳌\n荃\n拷\n扈\n汰\n靳\n荧\n蜒\n榆\n開\n46\n##ad\n1994\n溯\n淤\n褚\n##ｒ\n##ter\n羲\n呱\n##fe\nsm\n彿\n38\n69\n︰\n痞\n萦\n莓\n汶\nst\n47\n］\n瓢\n啄\n酋\n牝\n狩\nbr\n37\n48\n冢\n佘\n拈\n烬\n妲\n讷\n镰\n玺\n吝\n惘\n珉\n萋\n饺\n42\n嘈\n400\n〖\n蕃\n〗\n1993\n俨\n恻\n忡\n谑\n竺\n毗\n##ｙ\n嘭\n羔\n氨\n還\n矗\n##ke\n飕\n垣\n夙\n撬\n##ic\n陰\n焊\n蝙\n盎\n65\n弈\n##ul\n侨\n毋\n##11\n噔\n惋\n玮\n渝\n呲\n铎\n玑\n馥\n札\n1992\nre\n俭\n吆\n##cm\n簪\n苒\n##ry\n揩\n##ｕ\nｔ\n##ia\n忒\n荀\n##ut\n瘙\n琬\n##cr\n坞\n铝\n43\n睨\n骡\n##su\n颉\n纨\n學\n疵\n鲨\n咎\n垦\n曙\n锢\n##ca\n##ur\n冕\n窿\n##vi\n珺\n佯\n動\n瘟\n56\n##ci\n洵\n靶\n柿\n3d\n52\n绾\n##ther\n濒\n帛\n硝\n碱\n蜿\n萃\n##＝\n醺\n赁\n##up\n痪\n53\n楷\n簧\n##man\n##ui\n帷\n稼\n嵋\n４\n虬\n珣\n屹\n##ga\n畴\n颌\n锵\n钵\n掣\n噙\n黠\n柬\n掂\n泓\n硫\n邬\n蜈\n##am\n袄\n／\n霏\n58\n荔\n腆\n歆\n飚\n皎\n##ｓ\n琶\n渭\n谴\n##ｗ\ncat\n##oo\n汲\n##hu\n胚\n殃\n怂\n○\n镂\n鳄\n谧\n戊\n驸\n##ec\n##con\n憧\n3000\n##gi\n□\n瘸\n﹗\n##ds\n骁\n驹\n1985\n##ir\n篱\n鹊\n喋\n④\n琥\n娲\n##°\n##il\n涓\n饵\n枷\n缆\n晾\n99\n獒\n攘\n●\n1990\n惴\n95\n澹\n62\n踮\n缇\n54\n遽\n##ab\njing\n##one\n斓\n##ｈ\nsk\n萼\n悼\n苯\nca\nsa\n51\n龌\n64\n喵\nfa\n蜕\n卞\n殉\n1991\n诬\nｓ\n會\n篆\n600\n孰\nhu\n壬\n##bu\n1988\n搽\n110\n44\n##ou\n糯\n芥\n5000\n淬\n1989\n痊\n睦\n##ns\nsh\n##ac\n##wo\n魇\n##ｃ\n涤\n籁\n57\n##op\n1984\nso\n犁\n72\n旖\n##ie\n秤\n1986\n檬\n耷\nme\n闽\n鹫\n泵
\n##ｌ\n##50\n##her\n樣\n俪\n岐\nru\n壹\nic\n##ld\n1987\n##ft\n鹭\n笺\n##26\n憬\n甭\n琵\n抒\n赃\n259\n800\n昱\n稣\n##per\n荤\n##um\n槟\nshe\n龊\nan\n噤\n##tion\n媲\n礴\n亀\n酝\n迢\n毡\n阉\n68\n玷\n49\nios\n壕\n##５\n炬\n##pro\n椭\n稷\n##18\n啮\n汨\n１０\n诶\n##22\n涔\n缨\n炊\n辄\n兒\n##hat\n蟆\nmi\n瞩\n鸢\n##tr\n180\n韬\n##bo\n伎\n1982\n榄\nco\n覃\n涸\n５\n诲\n爲\n##we\n孚\nsu\n痰\n##pi\n硅\n將\n酵\n饷\n剁\n##tu\n髦\n踞\n##out\n筐\n91\n锺\n蔻\n88\n忏\n镀\n荼\nmm\ndi\nboss\n酉\n85\n兩\nwin\n麓\n桨\n铿\n旎\n##mb\n##du\n##od\n浚\n##ph\n６\n睑\n迂\n恪\n##chi\n泱\n徵\n##dt\n闫\n啵\n椰\n肇\n佼\n##ct\n瀛\nma\n##７\n勐\n鹦\n蘸\n濮\n袂\n湮\n晤\n##19\n##my\n柠\n98\n59\n曜\n粱\n褒\n戌\n63\n41\n隅\n畸\n刨\n暨\n##art\n鳖\n##sc\n锄\n乒\n吠\n羌\n##http\n##id\n渲\n蟾\n##son\n##ｐ\n鹉\n體\n##ita\n辘\n坍\n66\n瞠\n姝\n##ine\n##bi\n獠\n胄\n蚣\n乓\napp\n癞\nav\n##21\n茴\n湄\n##cl\n##３\n蔺\n颚\nｍ\n芋\n掖\n猖\nde\nok\n凋\n##ni\npc\n钙\n唏\n##ys\n迄\n聲\n邯\n迥\n##２\n麝\n珞\nangela\n蟠\n瓮\n喏\n诃\n轧\n孺\n舷\n61\n##au\n##rd\n颦\n溉\n妪\nbe\n磋\n##６\n霭\n幌\n矶\n娑\n圃\n##rea\nna\n##cha\n橄\n嗳\n蜥\n##32\n筏\n76\njb\n杞\n怆\n炜\nal\n谛\n烽\n俟\n##sm\n##oc\n驮\n脓\n內\nbut\n1980\n熨\n67\n##14\n##im\n樵\n隘\n##dr\n##ts\n撂\n溟\nxx\n卤\n##pr\n##ru\n78\n##rn\n娓\n##13\n娉\n睐\n跤\nta\n籽\n伫\n戛\n纶\nla\n氯\npa\n羚\n莘\n##８\n笈\n螂\nto\n進\n##ag\n##～1\n沓\n##va\n##lu\n贮\n##ee\n筵\n侏\n璎\n亘\n赊\nbi\n酪\n酮\nas\n##em\n罂\n埔\n腼\n汴\n涣\n##ent\n恺\n讓\n##mp\n##all\n1978\n鸨\n鞑\n氢\n皋\n缤\n胱\n酶\n旌\n匐\n##ff\n棣\n##by\n惆\n##ang\n攸\n無\n##ist\n匍\n##dd\n73\n##san\n沅\n##ion\n##ment\n莪\n峥\n105\n##30\n##pl\n埠\n椿\n##ay\n峒\n谒\n绥\n81\n##ja\n盅\n96\n腌\n胥\n鸥\n1983\nlu\nin\n啬\n71\n##sta\n赣\nbo\n##vo\n隽\nvi\n捎\n剐\n霁\n##ka\n弼\n蝗\n阑\n篝\n蜻\n痣\n挞\n晗\n##now\n缈\n膺\n徙\n##60\n挎\n瞑\n##02\n裏\n霾\n殡\n87\nsi\n幔\n抉\n##40\n74\n##tt\n鼾\n昀\n92\nsuv\nｂ\n緊\ndr\n##ls\n耘\n〉\nmo\n94\n沂\n##23\nli\n##15\n煦\n撵\n犊\n楣\n镑\n##com\n##ex\n弑\n##wer\n缥\n##ow\n1979\n160\n瘴\n祛\n##ey\n偕\n##㎡\n蜓\n佻\n##za\n磐\n踵\n臾\n敕\n滇\n垄\n烯\n##xi\n昙\n跚\n槌\n77\n##rr\n洩\n##ef\n谙\n84\n130\n夥\n阚\n邊\n匾\n泯\n123\n祇\nda\n偃\n鲲\n間\n##ki\n⑤\n纣\n97\n〈\nlt\n豚\n谄\n憩\n啷\n煲\n101\n##80\n##ran\n1963\n爻\n##are\n镣\n柚\nha\n拄\n彝\nの\nai\n淅\n×\n胺\n赂\n蹒\n茱\n蜗\n豔\n攫\n##ue\n觊\n##16\n1981\n熹
\n７\n93\n觎\n舶\n##１\n嚏\ncd\n飓\n##han\nｊ\n痘\n##hen\n##ap\n86\nac\n##４\n覺\nyu\nga\nzh\n978\n##ya\n泾\n弋\n擞\n##und\n瑙\n##ny\n##ob\n瞰\n诠\n灸\n粼\n濑\n篡\n骰\n##yo\n釉\n８\nｘ\n泠\n揄\n##70\n汕\n芍\n啕\n篓\n聿\n锭\n砾\n##if\nle\n噶\n##sl\n250\n1964\n淄\n诩\n轲\n##lt\n1958\n║\n鲛\n137\n葩\n沖\n帚\n畹\n舫\n菈\n玟\nshi\n瞌\nyou\nid\n绔\n孵\n##ant\n1949\n摹\n２０\n桅\n從\n狎\nnba\n##jo\n##ina\n##31\n##tra\n雙\n##fl\n氲\nvip\n倜\n##17\ngui\n題\n##24\n坷\nap\n##ok\n掬\n當\n##ome\n83\n000\n1970\n姒\n鳅\nra\n翌\n氤\n蚓\n戍\nsc\n4000\n##ud\n恿\n##ms\nad\n##ty\n搡\n瘀\n皖\nko\n赳\n##ins\n搪\n沽\n##io\n岱\n##rc\n##db\n滟\n##ps\n坯\n1969\n79\n咣\n濛\n點\n黔\n與\n##ua\n㎡\n89\ndna\n渤\npe\n堇\n揶\n##ep\n輕\n掸\n炀\n廿\n羿\n蚯\n##ong\n酯\n##ｖ\n葆\n贻\n##ev\n1962\n##ive\ndo\n岌\n樟\n呤\n##ew\n泗\niphone\n##ers\n诽\n捱\n170\n■\n丕\n##rl\n阂\n嗬\n漳\n汾\n徨\nｋ\n1500\n殒\n##ip\n1956\n悖\n##ah\n##can\n3p\n##ko\n##pp\n##51\n##km\n##ye\n骜\n臆\n##mu\n##uan\n##ig\n喟\n痿\n潞\n##og\nis\n俸\n蝼\n祗\nip\n吋\n衮\n185\n榭\n##aw\n峪\n捍\n摒\n鬣\n骥\n忪\n##ian\nar\n##dy\nｗ\n猬\n轶\n##35\nba\nge\nat\n９\ngo\n1968\n燮\n愫\n琊\nka\n垛\n煊\n锚\n郸\n崆\n菠\n惰\n##45\n##ｋ\n125\n馁\n潼\nel\n102\n〞\n##fu\nｄ\nled\n}\n栾\n谤\n秧\n骊\nti\n劾\n榔\n嗝\n旻\n釜\n邝\n讹\n$\nfu\n##ub\n##90\n宓\n##ven\n涮\n見\nceo\n螃\ncl\n逵\n昶\n##ju\n1976\n##lli\nktv\n1965\n坳\n##03\nsp\n钏\n→\n鼬\n蜃\nｏ\n##33\n##des\n粽\n摞\n匝\n傥\n恁\n186\n##cc\n晁\n巽\n営\n颍\n##ai\n××\n砺\n仞\n115\n##ain\n##thing\n岖\n##xp\n罔\n韭\n187\n潦\nni\n孪\n1966\n##ill\n饪\n翊\n怼\n﹑\n舀\n##25\n##ber\n蔫\n俑\n恸\nph\n溥\n蜴\n##ation\n##ide\n##dn\n##200\n竣\n杳\n姦\n堰\n佈\n##pt\n垩\n處\n隼\n1974\n蛰\n##hy\n##der\n##est\n1960\n1977\n愛\n鳍\n′\n琰\n夔\n桢\n1957\n潍\n##lm\n怏\nwe\n##lan\n坨\n盹\n⑥\n111\n嬿\n##ugh\n##ii\n##rp\n##here\n##ast\nte\ngb\n140\n##ei\n##gh\n184\n噌\n滿\n##zi\n##91\n倌\n##sha\n##─\n108\n嚐\n泫\n1950\n1975\njohn\n##ste\n螳\n哂\n顼\n髮\n##ｂ\n1972\nam\n##75\n##eg\n钊\n##ear\n遛\n皑\npi\n臼\nya\n跛\n潢\n牠\n1200\n##ser\n沏\n─\n##61\n##71\n168\n榕\n104\n##our\n##res\n106\n帧\n洱\n##ight\n灏\n##sy\n忤\n##01\n鬃\n109\n锏\n話\n亟\n##kg\nvs\n##98\n##28\n121\n箕\n360\nhi\nnt\n阪\n垠\n逅\n掇\n##mar\nｇ\n葳\nol\n摺\n犷\n骧\npro\n缄\n侗\n##ere\n##eo\n紅\n忻\n哐\n纂\n181\n宕
\n袈\n烷\ncia\n##ore\nor\nab\n粑\nne\n邂\n炷\n##nn\n##ron\nmy\n113\ncr\n##ton\n蔗\n裟\n##tw\n##win\n糗\n1973\n蓟\npo\n槿\n馍\n1971\n##day\n鞅\n##41\n##ric\n豺\n182\n##les\n佔\n##99\n##era\n給\n﹖\n103\n湍\njj\n仄\n1945\n鬆\n1938\ntop\n122\n秆\n臃\n1955\n盂\n##42\n蹶\n糠\n##af\n邋\nss\n邺\n##red\n##ris\n锌\nwindows\n斛\n##mon\nm1\n6000\n虱\n偻\n1961\nqi\n128\n钠\n蛆\n诘\n##che\n##88\n##way\n112\n##nce\n1937\n長\n##ass\n1959\n淆\n雞\n琏\nyy\n闇\n鹂\n翱\nce\n龛\n町\n氟\n闾\n钺\n桧\n辇\n2020\n##55\n##72\n##ate\n##78\n谲\n孢\n##lle\n107\n卻\n磺\n瞭\n雉\n銮\n氣\n仨\n別\n##gen\n焖\n汛\n1954\n##land\n##sn\n1967\n##av\n門\n蒿\n漕\n##yi\n嶙\nmr\n翦\n呷\n##form\n##000\n怵\n##tan\n##dan\n蚤\n忱\n##age\ncpu\n119\n蠡\n##ith\n##ure\n##use\n175\n##ov\n瞇\nｐ\n##ell\n匮\n翳\nｎ\n胰\n牒\n枸\n傑\n遢\n珩\nlin\n##ze\n##66\n幾\n##04\n冗\n##don\nfor\n##08\n##ym\n{\n胛\n茁\n3g\n↓\n囱\n##ana\n痢\n##53\n171\n剽\n##ous\n##sk\n種\n##tal\n##min\n##gn\n張\n伢\n親\n疹\n妊\ncon\n##bs\n骞\n＃\n##06\n##shi\n##cal\nx1\n##rie\n傢\nisbn\n锹\n剜\noh\n绡\n喙\n臧\n谥\nkb\nｆ\n##yn\n##ble\nmar\n尻\n##００\n##nm\n珪\n囤\n谗\n178\n##ted\n##34\n蹩\n##62\n彗\n荪\n##tin\n1952\n衾\n跆\n##ity\n##get\n##pu\n118\n##top\n聽\n藐\n##hr\n荟\n##lla\n##43\n缜\nlo\n700\n127\n僮\n##nc\n##oi\n佝\n##ess\n146\n耙\n峋\n衝\n##64\n172\nho\n椁\n##68\n泸\n##rm\n240\n嗟\n##ction\n辍\n162\n136\n皿\n孬\n饕\n谆\nall\nfe\nintel\n165\n嗑\n遑\n##81\n柒\ned\n岷\n強\ndvd\nmen\n笆\n畿\n##ach\n##gg\n⊙\n##sd\n163\nex\n臉\n##vis\nbl\n蟑\n嘹\n126\n114\n116\n衢\n羯\n155\n129\n8000\n氮\nif\n##65\n1600\n箩\n##ｇ\n1946\n##47\n1948\n##ren\n##nes\n粲\n##oa\n179\n##07\n156\nnet\n##95\n・\n131\n娠\ndv\n##27\n浒\n厩\nㄚ\n##ten\n１２\ndb\n##ever\n##52\n##ern\n##ite\n問\nfi\n##gl\n##38\n１８\n##ica\ncn\n莆\n##48\n##ak\n叵\n荻\n夯\n##05\n##cy\n##29\n钾\n郢\n##rth\n瘁\n﹕\n##tel\n##ree\n並\nflash\naa\n淞\n東\nen\nmac\n娩\n##que\n贰\n姹\n##ji\n139\n##dm\n镁\n##44\ngl\n1953\n##ath\n##ect\n1951\n佞\n柩\nes\n135\n##tro\n衿\n嬛\n柑\n铢\n耆\n荚\n犒\n##iti\n001\ncc\n900\n##dc\n吖\n158\n##cs\n←\n##ame\n颧\n骠\n188\n铬\n##tor\n124\n151\n孱\n咤\n164\n##mg\n176\n##ave\nｈ\n缮\n绶\n1942\n350\n嵘\n堑\n睇\n1300\n##uc\n##ae\nbb\n##39\n1944\n珅\n鹞\n碴\n剎\n##den\n#
#mer\n##36\n##ard\n##76\n◆\n##lc\n##car\n镐\n潋\n220\n餮\n##73\n##ens\n浃\n渥\n岫\nｖ\npk\nlove\n饨\nsd\n眦\n刽\n169\n楔\n掴\ndon\n谩\n##ata\n崂\n##ust\ntom\ndu\n1947\n##end\n117\n奂\n橇\n庾\n##isa\n1940\n鳗\n★\n##ene\n##eh\n轼\n##94\n##82\n钛\n##kb\n230\nmu\n镳\ndd\n鱿\nci\n##vel\n##led\n遨\nim\n##49\n161\nint\n##ken\n撷\n緻\n隨\n３０\n##hin\n觞\n##lp\n##rk\n丨\nms\ntm\n豌\n##63\n嗎\n##85\n##ort\n颢\ncm\n##king\n雹\n##qi\n璜\n##54\n202\n##tom\n俎\n##men\n1941\n嘞\n馗\n##ger\n##ice\n沱\n142\n##word\n谟\ndan\n祎\n##zz\n##ek\n##ini\n##77\n渚\n##time\nwifi\n##let\n##lor\n134\n138\n10000\n##ert\n琨\n152\n##bb\n##ml\n##eng\n141\n##dp\nev\n##del\n盥\n##its\nman\n绻\njack\n##□\n蜚\n祚\n馄\n##93\nct\n1931\n##ace\n##ake\n##ans\n谀\n##hz\n鹄\n132\n嘣\n##ros\n1100\n帶\n##mc\n稔\n苁\n睥\n鸵\n##92\n擢\n秸\n酰\n##gt\n##97\n赈\n##oh\n##other\n##ani\n懑\n1933\n##len\n210\n疡\n亳\n箐\n樽\n1934\ndavid\n袤\n▲\n砥\npu\n##bra\npr\n133\n醛\n145\n##ful\n師\n##sch\n哔\n佚\n##over\n159\n迤\n斡\n##67\n剛\n1936\n镌\n崴\n##how\n##09\n##ws\n枳\n碉\n##love\n##ple\n166\n##46\n⑦\nthey\n囧\n苻\n瘠\n跻\n诋\n##ies\n177\n##96\n##lon\n淙\n##tle\n173\n##lin\n皈\n佗\n##ien\nsam\n塾\n榈\n##ings\n1800\n2500\n##69\n##rin\n##dia\n濠\n1939\n準\n麗\n聒\n##aa\nak\n卟\n##100\n艮\n蕨\n2019\nc1\n宥\ngi\n##gan\nmon\n陣\n##net\n荨\n##wan\n昇\nthere\n铣\n帼\nchapter\n胫\n觐\n##56\njo\n144\n邰\nii\n##world\n##ough\n饯\n##ute\n##86\n嶽\n丶\n谪\n##ire\n1943\n鄞\n鹘\ncs\n鏖\n鲫\nnpc\n##row\n##oy\n##gb\nhp\n##yu\n鳝\n祐\npp\n##89\n殓\n143\nⅱ\nmp\n裨\n￥\n##oe\n弭\n##port\n##ima\n5g\n兖\n##lic\n##tm\n槃\n##uch\n##zo\n膻\n蕩\n##sse\n佃\n##yan\n杈\nwhat\n變\n幹\n褲\n０\n迩\n１１\n167\n诰\n154\nwang\ncar\nsaid\n谚\n闆\n蛹\n##ram\n##ker\n玳\nibm\n藜\n##rf\n##cam\n茯\nchina\n瓒\n纰\n谯\nva\n實\n褓\n骛\njason\n1927\n悽\n豢\n##tb\n##las\n５０\n260\n碘\nmp3\n##ari\nweb\n##nk\n4s\nop\n抨\n##ane\n148\n##wen\n則\n1900\n##sco\n弁\n啻\n焜\n##lv\n##74\nｉ\n##ler\n##fc\nusb\nea\n讫\n鹜\n##fs\n榫\n锂\n绦\n彧\n檄\ngps\n篑\n赡\ngdp\n149\n##mus\n裾\n##back\n铂\n##bee\n##ley\n诙\n踊\n蟋\n##tos\n藓\n##bor\n1935\n##ug\nem\n闰\n7000\ntv\n1930\n##ise\n浔\n1932\n腳\n204\n157\n##val\n莅\nｒ\n##rus\n##van\n缱\n裱\n##cp\n斫\
n連\nep\n##84\n##79\n俾\n##ces\nvol\n熱\nll\n砧\n罹\n1920\nag\n陲\n147\n擡\n##x2\nwi\net\n##ku\n##more\n１５\n痔\n##pan\n##ase\n##ena\n##ash\n211\n##lina\n##tch\n幄\n禺\n320\n##ora\n邛\n難\n骈\nil\ntt\n嬌\n174\n365\n囗\nyes\n##ona\n猗\nking\nhd\n##ose\n750\n蛭\n##ura\n俚\n驷\n滦\n##ula\n國\n隍\n◎\nos\n##ves\n##ale\n##83\n棂\n萸\n§\n##pm\n蚝\n##57\n炅\n箴\n侑\nmv\n##87\n##gs\n涝\n##cer\n##urn\n153\n薨\n##sen\n##ala\n##2000\n屍\n酚\n鄱\ner\n襁\n3500\njames\n陽\noppo\n##ano\n蚱\n##tar\n淩\nsan\n1928\n##cent\n##tto\nig\n##ral\n槁\n茭\n酗\n荥\n##tes\n##ster\n1929\n##∶\n##ical\n##low\npeter\nxp\nben\n##37\nipad\n腓\n256\n##hd\n砣\n烩\ngoogle\nｙ\n蹴\n圩\n##bar\n##nan\n痫\n##gar\n##mes\n焙\n＞\n應\ncp\nns\n##nic\n猕\n１６\n黍\n##ism\n＋\n睢\neric\n##ish\n係\n鹑\n犄\n##bet\n##ult\n焯\nｌ\n▼\n劭\nbar\n楹\n扪\n##lly\none\n嫚\n##58\n蟀\n淖\n纾\nexe\nbt\n黜\n##less\n僖\n擀\n钜\n##ade\n##tic\n寮\nｅ\n##urs\n##rid\n嵇\n##plus\n轉\noo\n##als\n##lie\n##ons\ntw\n##act\n蛐\n##mic\n潸\n孑\n##ics\n##tl\n柢\n蓑\nmba\nを\n##ime\n滓\n箸\n懋\n189\nmc\n蘇\n烊\n##ン\n##sis\n郦\n荏\na4\n##fer\nds\n苡\n饬\n##work\nlan\n##ara\nken\nthat\n徇\nspa\n殁\n铤\nβ\n腩\n喱\n##gin\n##lf\n氾\n##kin\nkk\n讚\n栀\n笞\n菅\n興\n##cle\n##sit\n锲\n##ks\n嶂\n虢\n##ile\n##sion\n濯\n″\n嬅\nt1\n耦\nts\n泷\ndl\n挝\n1926\n##ving\n##lia\n##59\n1400\n##tc\n201\n髒\n饽\n##zu\n龈\n邕\n##ー\n條\n琮\n複\n##の\n##ama\n囫\n铆\n##ood\n腱\n##np\n榛\niso\n箔\n唑\nau\n##bus\n##bre\n蛀\n212\n##ner\n樾\nah\n##life\ncan\n##hn\n糸\n##cd\ncindy\n沣\n骐\n##ory\n##ling\n##bit\nlg\nbook\n雖\n##tp\n##app\n##chel\ndc\n190\n##pc\n缢\n##anda\n⑧\n颛\n##ost\n##ool\n##new\n##orm\n##ture\n谌\n瀞\n##down\n恫\n噁\n採\nmk\n203\n獭\n陂\n##live\n暹\n##own\n##lam\n##ink\n狒\n280\nc2\n##come\n刎\n蜍\nlv\n佰\n晞\n301\n##house\n膘\n篙\nか\n擘\nir\n##ark\nhigh\n谶\n##ial\n##lar\nmiss\n##een\n##die\n絲\nbc\n##night\njava\ndx\n儆\n紮\n450\n纫\n镭\n菖\ntim\n咭\n焘\n##rley\n##ctor\n罄\n##dra\n##600\n睪\n##nar\n##ich\n##ward\n毂\n##ier\n##light\n##ual\n悱\nexcel\n吡\n##ean\n峤\n##book\n0t\n栎\n##ock\nhr\n瑠\n##kw\n1912\nbaby\n圜\n205\n##ght\n拋\n##001\n##bert\nscott\n##ox\n⑨\n孀\n##mhz\n##line\nmichael\nㄧ\n##hs\nfacebook\n羟\nsb\nv
e\n痨\n犸\non\n菏\n##ola\n##set\n##hl\n鄢\nbug\n珲\n榷\n206\n##star\n##500\nlol\nchi\n钨\nm2\n飨\n1917\n##vin\n豉\n滂\n撫\n獗\n酊\n##～3\n冼\n刍\n漸\n鳃\n##tf\nbill\n邈\n苷\n斷\nabc\n歡\n##ida\n°\n關\n醚\n##ann\n佥\n濕\n##ance\nipo\n徉\n羸\n##air\n雎\n桎\n祢\n##llo\nmax\n缙\n##ond\n##key\n細\n偈\n梏\nng\n龍\n脫\n伉\n徕\n002\np2p\ngod\n磬\n1919\n##nge\npaul\nby\n唳\n馬\n楂\n魍\n纭\n##bot\n肽\n##ence\nは\n涿\nvr\n##hing\n270\nnew\n龜\n祜\n##uy\n##た\n5m\n艷\n##ign\ns1\n##med\nfan\n##cia\n208\n##ead\nに\n栉\n##ky\n鎏\n歙\n氰\n##well\n##side\n樂\n纔\n##force\n##～20\n痂\nwho\n207\n衹\n##か\n仃\n锉\n雒\n##ein\n歳\n##ses\n##ik\n##face\n＜\n##jing\n萘\n##oto\n##nder\n恥\n蛎\n209\n燧\n1m\n虻\n##ht\n##mit\n甯\n啖\ndj\nb1\n##tions\n風\n##ould\n##bin\n瓯\n車\n##nter\n##ding\n徜\n蹉\n##lis\n##mn\n##ick\n##ese\nsr\n桠\ncf\n##hai\n薏\n桉\n##а\n癣\naf\n钿\n##ndi\nwhen\n##put\nles\n阕\nxxx\nmobile\n拮\n脍\n涪\n咲\nα\nins\n##eme\n臬\npart\nlisa\n##☆\n20000\npre\n許\n軟\n鸪\n僭\n##ｊ\n##zy\n##ino\n離\n庖\n##tte\n##ning\n婺\n蹇\n鹌\n##dar\nvcd\n赝\n1700\n癡\n##ope\na1\n汞\n濂\n熵\n1914\n##lee\n橹\n咩\n##ery\n213\nchris\n##boy\n矾\n221\nf1\nmb\n225\n226\n##lls\n獐\n##ode\n##ring\npen\nly\n##tv\n鹧\n##play\n##llow\n矍\n218\n蝌\n##nch\nm3\n奮\njoe\n墉\n##vd\n##uk\n｛\n1924\nae\n222\nmt\n靼\n##nis\n電\n瑁\n##mark\nthis\n酩\n｝\n荞\n##nda\n##し\n##ラ\n##rge\na6\n澍\n卅\n歎\n豐\n勖\n##pin\n１４\nmd\n##ats\nmin\n##ket\n214\n噻\n猷\n1925\n◇\n芪\n溴\n啶\n祉\n520\n##view\n370\n##ｆ\n铄\n245\n##lk\n澧\n書\n##gle\n##joy\nｑ\n232\n##750\n##vs\nhow\n255\n摀\nvivo\n##wn\nfbi\n1918\n##cel\n216\n##ium\n饴\n糅\n##bel\n228\n##jia\n231\n235\n〇\n铡\n珐\nwu\n##ham\n##ray\nred\nparty\n頂\n##ress\n4g\nsimon\n1921\n滁\nwin7\nsex\n##ル\n##kk\n硒\n硼\nww\n讧\n##ound\nvista\n戶\n淒\n芩\nie\n觥\n##ions\naaa\n242\n251\n1911\nⅰ\n330\n淨\n旮\n##gy\ndiy\nbest\n007\n髋\n2200\n1922\n莖\n噱\n麵\n秣\n302\n蜇\n肱\n215\nnot\n##nor\nlily\n##0m\n##400\n##able\nup\n##said\n4500\nfont\nlogo\n##と\n痈\n1909\n1923\n##800\n##bc\n〝\n鲈\nwith\n唁\n復\n溫\nyi\n‧\n##v4\n極\nhan\n##μ\nfc\n芈\nrpg\n檎\n999\n傳\nt2\n葭\n227\n##nia\n谔\n牍\n莒\nkg\n##home\nrobert\n##tter\n埂\n##deo\n視\n##sol\n媞\n310\n##sun\n锰\n##vi
ce\n##ook\nphotoshop\njs\n268\n512\n##ull\n赭\n眞\n醐\n195\n粳\n诟\n##ux\n蚪\n2400\n氐\n遒\n機\n煨\n##bt\nv1\n盡\n絕\n惇\ncv\nyang\n##x5\n诿\n馏\n铨\n##ments\n蕲\nofo\n膑\n⑩\n##dio\n靜\n沢\ncho\n##bell\n##cat\n650\n##ters\n钎\n醍\n##count\nplay\nsci\n##lay\n##ike\n##right\n醴\n蒨\n##ung\n##mate\n1915\n螨\n##ison\n阆\n吶\nmg\nnu\n鲑\n續\n##ume\n##900\n灞\n##ett\n疟\n镍\n##ix\nwill\n﹡\n480\n##→\n疽\n##bat\n##last\n4gb\n鐘\nib\n##hc\n飢\n阡\n笏\n312\n264\n303\nrock\n屐\n##～10\n##cky\n牦\n219\n241\n##xx\n砝\niv\n１３\n蝈\n##log\n郴\n##fun\n緩\n##ico\n##ness\n325\n##ors\n##nne\n△\n340\n荠\n##ved\n膈\n380\n霰\nhello\ncj\n箝\n赓\n##hop\n##ス\n##hm\n##berg\n鳕\n1906\n##rect\n1910\n壓\n##ez\n##ller\n遴\n##wood\n##イ\nrn\n253\n##rry\n厝\n##lr\ncos\nb2\n仝\n290\n終\n溧\n〃\n##tech\nlet\n骅\n##bon\n##gp\ntb\n餘\n珮\n##tti\n##n2\n306\ngp\nsg\n229\n2100\ncad\n葺\n##xxx\n##kv\ndm\n宮\n苋\n洙\n亂\n該\n○○\na3\n##rio\nmvp\n##и\ngame\n騷\n｜\n脣\n蘭\n總\n黃\n鲶\n720\n##cn\n邳\n##free\n苕\nxa\n1916\n010\n233\nus\nbp\n##tz\n铉\nnote\nart\n蕤\n##aft\n铀\n##ule\n椽\nff\ncbd\n艹\n##wu\nof\npd\n15000\n嵬\n腭\n＆\n305\n##ク\n##eep\n稞\n窠\nback\nvivi\n850\n##lio\n2m\n眾\n##fig\n##ince\nmay\n##ary\n##300\n##reen\n跖\nfm\nsf\n224\n##nts\n##ada\n234\n诤\nlee\npt\n223\n236\n238\nvc\niso9001\n仟\n304\nlp\n牆\n265\n##a2\nui\n傩\n昴\n焗\n269\n產\n##wei\n藍\n##mw\n蓼\nrs\n佶\n##gm\n308\n##chen\n悌\n3c\n赅\nkevin\n##く\n２４\nsat\n麋\n疣\n##ize\ns2\n挈\n##●\nrichard\n333\n##0mm\nnon\n1905\n綦\n＝\nedward\n##m2\n##ape\n踟\n馔\n##cher\n槎\n##lab\n稹\n##rian\n掛\n靛\n430\nsao\n217\n252\n##lus\n蛳\n舂\n1908\n##～6\n12000\njane\n##head\n語\n##cks\n##erry\n##vc\n場\ntd\nい\n##フ\n199\nmark\nｚ\n237\n旃\n邨\n痍\n鹈\n8gb\n託\n恚\n##hua\nsir\nnp\nhis\n1913\n2gb\ngm\n299\n冇\nb2c\nmatt\nml\n##700\nword\ngre\n##self\n琇\nhappy\n痺\nlc\n550\n##120\n⌒\n262\n##hip\n驚\n順\n##py\n##ways\n##い\ninternet\n##lot\n##ria\n數\ncome\n婦\nrt\n243\n246\nⅲ\n跎\n瓤\nsuper\n410\n記\n暝\nmini\n020\n摈\n夾\n##aid\nppt\n##wt\n##oon\ncenter\n261\n292\n賤\n530\n1907\n352\n420\n蛔\n249\n##ius\n杷\n識\n##tis\n315\ntf\nlive\n263\n結\n詩\n##hang\nwas\nmaggie\nver\n##mber\n傣\n##dom\n#
#pass\nangel\n##①\n##rap\n2600\n##ious\n272\nmicro\nbit\nautocad\n碣\n##lock\n311\n##cord\n##uel\n##oid\n芡\nwei\n顯\n耒\n2g\n達\n##eet\n288\n郧\n莴\n##kit\n歲\n2300\n##read\ndel\nｕ\n揆\n≥\n蠹\n##き\n態\n囹\n9000\n囿\n1902\n＠\nhk\n##urt\n##osa\n##rip\n##cts\n楮\n##シ\nnow\n芫\n##е\n440\n##に\n製\n##ｑ\n279\n618\n湫\n196\n钴\n雲\n278\n溼\nits\n響\n##ques\n361\n瑀\n##md\n##ude\nmaria\n##rial\n##▼\n2d\n316\no2o\n##ack\n##tional\n##test\ngsm\nsim\n##point\n244\n寫\n煸\nfree\nleo\n##gc\n##123\nher\nnvidia\ngs\n560\n510\n潤\n～1\n##rain\nec\n##う\n##a1\n塬\ncctv\n顫\n##mand\nft\n壅\nsun\nalex\n愆\n275\n##press\nalbum\n##ded\n搖\n3m\n##body\n觸\nandy\n0l\n铰\n##о\n畦\n5mm\n580\nasp\n248\n247\n666\nm4\n##010\nend\n##max\n辊\n粕\n005\n奘\n擺\n嚥\npop\n阇\n##ject\nf4\n捌\nwto\n砒\nmail\nipod\n岬\n198\n玠\n篪\n##｜\n273\na2\n##★\ngood\n276\n##3d\n321\n309\n倬\n戕\n肓\n307\n浜\n##city\ngif\n271\n##レ\n680\n911\nx3\n296\n##tee\n牯\nrp\n煅\n迴\nget\n192\n祕\n繼\n##ular\n##tore\n喆\n266\n##ア\n疱\nimg\nchen\n擊\n捨\nx5\n##tus\n##gas\n055\n噹\nt3\n劉\n##r3\n郤\n##ible\n193\n螯\nset\n欸\n苣\n323\n苜\n283\n##ira\n菘\n嘚\n橐\n5l\n820\n1903\n##н\n##2010\n##ford\n##nse\n薹\nhtml\n裝\n圻\n##ら\n辦\n繇\n191\nshow\n抟\n圓\n285\nvan\nmx\n請\n爰\n##oz\n4m\n瞋\n##oot\n枰\nthinkpad\n芃\n##logy\n##ations\n1898\n##tive\n挹\n钤\n##テ\n鬍\n2700\n##vr\n##ssion\n孛\n##may\n003\nbig\n##nel\n尕\n322\n篁\n腦\nv2\n砷\nfrom\n##cture\n枇\nbob\nreal\n佢\n濬\nlook\n##ming\n亞\n##ん\n##works\n龋\nwhy\nabs\n菟\n圄\n##room\n##tta\n齊\ncam\n318\nuv\n鼐\npv\nout\n樓\n幫\npan\n≤\n##wang\n##ova\n2800\n噴\nbat\noffice\ng1\namy\n疸\n1895\n342\n砭\n貼\n楫\n##sky\ndream\n儘\nrom\n洄\n∩\n##ping\n780\n1a\n萬\n＇\n鲷\nbbs\nkiss\n飯\n##て\ncarol\n540\n羧\n##mann\n30000\nalt\n197\n堅\n##late\ngay\nlinux\n##vic\nと\n##nos\nｏｋ\n1904\n癜\n讴\ncba\n飛\n710\npet\ning\n庠\ncall\n080\n蚜\n5t\np2\n##○\n##～5\n980\n4a\n267\n寶\n254\n論\n逶\npg\n6p\nvisual\n##ush\nunit\n008\n舛\n##box\n犍\n##ball\n陳\n##リ\n柞\n950\n醮\n372\n##raph\n##eam\n282\n286\n328\n##な\n##xy\n遠\n墀\n1901\n##rey\n194\n##hp\n##nap\nmsn\n3200\namd\n339\n##cept\n##list\n##fit\n洮\nf2\n##cker\n985\n##ath
er\nmix\n##hur\nadam\nerp\nef\n##は\n貅\n閒\n嗪\n408\ne1\n331\nua\n281\n沭\ns3\n調\n##ware\nb2b\n287\n##lish\nray\n跹\n##rite\n298\nmicrosoft\n隱\n011\n##ering\n縫\n盪\n##￣\n畫\n291\n疥\nthomas\n竜\n堀\n314\n863\n##rent\n##hoo\n##fw\n6m\nob\n誘\nwhere\n##times\n##ト\n蓿\n薙\n佤\n乩\n柘\n茏\n##name\n趺\n钼\n390\n铩\n366\ncosplay\n竽\ncb\nesp\n##110\n單\n##も\n獾\n##alk\ntc\n460\n##ets\n妣\n401\n華\njust\n阈\napple\n1897\nio\n##oud\n芎\n鼹\n楊\n莠\n##к\nnc\n﹍\nsome\n↑\n##ない\nき\n##ube\n050\niii\njan\n約\n淦\nbonnie\njun\nkey\n##101\n1888\nplc\ns6\n棹\n郅\nul\n##ald\n勢\n5a\n晷\nlike\n006\n6l\nし\np1\n呎\n3600\nmate\nrc\n啉\npvc\nhtc\n##ax\n##pd\nice\na8\n烃\n鼋\n疝\n362\n289\n漲\n菀\n##stone\n##＋\n##ration\n##fly\n##mini\n##hone\n1～2\n##ᅡ\n335\n##rant\n##ail\nalice\ndes\n##iel\n2～3\nblue\nc4\ntony\ncx\ndt\n257\n彈\n琍\n1250\n炁\n鹕\ndec\n355\n369\n孳\n##ito\n412\n喹\n##uck\n鐵\n陟\n襬\n##nger\n##↓\n趸\n338\n##field\nfrank\nkl\n囉\n##mah\n470\n愍\nc3\nwilliam\n##flow\n324\n耄\n640\n绀\ntcl\n##～8\n##n1\n1894\n嬢\nwell\nover\n郜\n嫻\n284\nnb\n##160\n##lace\n##り\n##mond\n313\n﹔\n295\npos\n239\n##web\n蘼\n##zl\n##bank\n閉\n277\n膚\n咁\njr\ninc\nthree\n楝\n##type\npm\n332\n週\n夠\n##base\n818\n##р\n叻\n1890\n##す\n##ま\narm\n勁\n羨\nh2\n換\n##ily\n351\n402\n##ience\n##cus\n確\n镗\n630\n怙\ndoc\nbuff\n##cing\n桿\n杼\n釐\n5℃\n縮\n004\ndota\n铵\n##mv\n025\n##group\n##cil\n329\n洹\n桁\n蒹\nbbc\n廝\n##o2\nact\n258\n霑\n##tone\n6500\njournal\n860\nsmart\n稗\n##2007\ndnf\nr1\nmacd\n375\n適\n326\ncore\n醪\nく\n603\n榉\npose\n蜊\n021\n334\n￣\n358\nsky\nsunny\n294\n345\n龅\nbd\n5cm\nstar\n卍\n異\nsata\nnext\n##stin\n鹳\n##2c\n##с\n岘\n甾\n388\n凈\n##ream\n##る\nrain\ntan\n##ged\n錯\nopen\ntel\nuc\n懷\n薅\n##jin\ns7\n啱\n癢\nhome\n##ition\nsoho\n368\n镛\n##cast\n##uma\n274\ndie\nrose\n##card\ni7\n015\n601\n009\n319\n備\n萁\n##hg\n##iu\nare\n苫\ntvb\n570\ngd\n瘢\nair\n跡\n##try\neva\na5\nhttps\n鈺\n1024\noa\n##480\npower\n##talk\nalan\n線\ngigi\n徳\n##jun\n##vas\n鳳\n##█\n##＜\n垭\nフ\n##west\n谘\n##◇\nmaster\n1g\n359\n##board\n試\n405\njean\n崧\n1gb\n100g\ntype\n386\n畲\n狀\n##～7\nmodel\n##mall\nq5\n##dge\n稜\n漯\n
810\n1840\n嵊\n##520\n660\nzero\n012\n義\n願\n徼\n730\n##cg\n杓\nrmb\n刈\n顆\n慄\n555\n##250\n960\n±\n418\n3a\n327\n415\n﹝\nseo\n013\n##ロ\n麂\nsoc\n1899\n芾\n297\n##nie\nfans\n620\n##oper\n##tag\n##cation\n341\n穀\n領\n##mine\nqs\n4l\n聞\n382\n湟\n##vm\n趕\ngeorge\n##pop\neos\nγ\n蜢\n##ias\n501\n網\n##care\nxxxx\n較\n認\n293\n鍊\n﹞\n嵯\nblack\nemail\nh6\nfirst\n888\n椋\nsteve\ncharles\ntwo\n##ette\n倆\n訴\n瘋\nltd\n028\n##コ\neb\nwar\n##み\n353\n貔\n撐\n羅\n##wd\n觀\n哏\n840\ngc\n##さ\n矇\n昉\nx2\n圍\n##ument\n燊\n##150\n502\n瑤\n濃\n##в\njim\n##rade\n##nsis\nhong\nlz\n##data\n鞣\nella\n痤\n枋\n384\nsas\n錢\nmike\n1884\n422\n760\n357\nmod\n##lent\nafter\n385\n019\n##core\n##カ\n##zon\n1860\n398\n疖\n890\nv8\nvar\n儋\nsong\nchan\n##dden\napi\nhold\n苄\n317\nn1\n楸\n1880\ngalaxy\ntina\n碛\n##360\n##zen\nmarc\n1850\n5500\n432\nck\nlady\n##ology\n##stro\n354\n珙\n珥\nyour\n##■\n7500\n1886\nauto\n2030\n##ᆫ\n樑\ni5\ne3\n##a5\nkim\n♀\njpg\n##note\nな\nmary\navi\n戰\neq\njp\ndos\n367\n610\n傷\n##site\n襪\n3300\n＾\ncg\n俳\n##pad\n郫\n蓁\n##beth\n##oman\n痧\n2b\n##⌒\n##wing\npass\nて\nた\n瘘\nufo\ndaniel\n談\n##ney\n376\n363\n513\niq\n022\npolo\n渾\n翹\n2022\n343\nexo\nh5\n344\nder\nあ\ndll\nh1\n1280\n1080\n菸\n##coin\n蓋\nmit\nonline\n394\n488\n##tsu\nebay\n20℃\n‖\n軀\n顏\n##itor\n##ｚ\n讀\n甙\nthan\n‰\njul\n##║\n術\n##file\n碁\n##media\n##hk\nacer\nㄛ\n##つ\n4k\n1001\nshirt\n518\n##aka\ngg\n氪\n桡\n##gma\n麸\n690\n348\n378\n##erson\n379\n惡\n##walk\n襠\n##burg\n盱\n8mm\n##ppy\n##lling\n##rma\n##ello\n謝\nrx\n701\n1893\n##tein\n丟\n442\n##3c\n##ィ\n500g\nつ\n啟\ntft\n爺\n哌\n鑽\n2a\nbas\n炔\nliu\nky\n誰\nmake\n課\n411\n381\nkelly\n★★★\n##999\nbot\n脘\n##watch\n##2008\n肄\n##こ\n藥\neasy\n衆\n347\nnova\nstart\n##ヒ\nシ\n##via\n399\nsea\nf3\n堃\n3800\n456\n軍\npci\ncase\n346\nqa\n##ference\n1896\npsp\n407\ninfo\n##zer\ne2\n409\n625\n##ground\n802\n##ury\nmp4\n##news\n︶\ns8\n維\n389\nwind\n414\n6a\n掙\n029\ncoco\n節\nrd\n##mma\n##fx\n泮\n##ella\n027\n1v\n潔\n併\n頓\n嬗\n##ウ\n垚\n##ries\nangelababy\nrna\nrm\n酢\n櫻\nり\n##を\n404\nkd\nm5\n##uid\ns5\n18000\n395\n##table\nア\n337\n371\n繩\npdf\nl1
\nphp\n盤\nhotel\n務\neur\n##°c\n670\n737\ndata\n##マ\n報\n扦\n況\n522\n钯\n戡\n1870\nさ\natm\n##nny\n##lex\n##1000\n024\nrf\n364\n朐\n##ᅵ\n##uard\n壞\nuber\n100m\n25000\nstep\n##タ\n捲\nwap\n钣\nxl\n歷\n##②\nkm\n##drive\n503\n##〇\nfly\nbrian\n頫\nlow\n356\nhip\n毽\n決\n##tina\n遊\n490\n吳\nlog\n##━\n##ハ\nuk\n嗚\nφ\nmatlab\n##wf\n▪\n##uce\nfind\nsee\n劍\n##け\nharry\n458\nⅳ\n##rman\n燄\n373\n賞\ni3\n钒\ng3\n貝\n聳\n590\n癒\n336\ntwitter\n稱\n帑\nlink\n419\nmartin\n##tical\n396\nsql\n413\n2025\nhero\n顧\n970\n##load\n438\n鮮\n50000\n儀\nspace\n023\nlcd\n臥\nemba\nlevel\n廳\n葉\n倫\netc\n##r4\nけ\n菽\n740\n416\n##2009\npcb\n纏\n##ᄋ\ncup\n426\n433\nallen\n聖\n脹\n830\n靈\n陞\n藿\n運\n349\n377\ntcp\n1350\n504\n暈\njay\nlong\nwhite\n##ement\n##ᅩ\n殭\nv3\n374\nmoto\n##oka\n優\nnhk\n##pus\n嘌\n倖\nalpha\nlouis\n轭\nright\n講\ngis\n註\n##１０\n##α\n##キ\n鹹\n##tics\n戲\n##л\n祂\n☆☆☆\narea\n專\nsorry\ndiv\n1889\n疔\n##atic\nfb\np3\nsize\n籐\n##page\n6g\n##￥\n455\nvideo\nhenry\nwest\n孃\n燔\n##tant\ngreen\n計\n黒\n淚\n425\nhot\n摟\n1867\n3ds\nwow\nadc\n435\n##ties\n495\n夢\n釦\n465\nう\n##⊙\npin\n##shop\n綾\n伕\nω\n嚇\n1885\nmpv\n##vy\n##ads\n＼\n蟲\n瓏\nvga\nping\n蓖\n##ift\n516\n億\nm8\nxd\nups\n##ship\n##nus\ntime\n##ils\n392\nmtv\n##zone\nzen\n768\n##vg\n1234\n涞\n躪\n際\n簡\na7\n擠\nrq\nieee\ng5\n##2d\nwindows7\n##rum\n蕈\nhas\n##ナ\n葚\n515\nqueen\n##2012\nace\nmore\n⑴\n880\ncmos\nc5\nlife\nadobe\n##bow\nd3\n30g\nbye\nb5\nmap\n卮\n##mple\n飽\njeff\n溱\nlight\n荽\n猶\n紀\ntake\n##www\n##lone\nform\n蔘\n##ika\n涼\nusa\ng2\nㄟ\nlab\n10℃\n511\n451\n##ctive\n唸\n8g\n盧\n397\nfun\nemi\nbmw\n##yer\nstyle\n1882\nbang\n403\nclass\n##ェ\nhard\norg\nd1\nimage\nsay\nnews\njpeg\n禮\n##stry\n##mix\n買\n湧\n##ique\n32gb\n2900\n姫\n100km\n##cket\n777\n##mobile\n747\n##town\n豇\n##ⅱ\n##±\n##chat\n槭\n氦\n##link\n姪\noem\n骶\n##ader\nイ\ng6\n##dog\nide\nnever\n痠\n★★★★\n員\nhdmi\n呋\n輪\n##ひ\n8cm\n499\n387\n號\n##tax\n醫\npost\n褫\n427\nanna\n謂\nfile\n##gio\nb6\n##2011\nk2\n##そ\nidc\n508\n506\n撥\n##3000\n##host\n##next\ncpi\ncandy\n##ider\n##ω\n樺\n舉\n1010\n417\ninternational\nann\n##code\ngmp\n##sent\n2
k\n査\n趙\n脩\n521\n##space\n荸\ntoo\n漢\nq2\n巖\n##dk\n##yy\n##lines\n棗\n頸\n543\ntech\neco\nboy\nppp\ntop10\n閃\n｀\npage\n##baby\n##ncy\n吽\napps\n簾\n##＄\n##sia\n10g\n2cm\n453\n4800\n##168\n酐\n槲\ng7\n2mm\n屬\nπ\n##iko\n##370\n乂\n驗\namg\n429\n禪\n嘖\n##rmin\nfire\n##rris\n##cca\nwp\neps\n厲\n燒\n505\nseven\n##eting\n##asia\n嫘\n蜱\n镕\n暸\n鲟\n##urg\nworld\n耋\n##rder\n16gb\n##qq\n##tty\n##ville\nnas\n468\n##ツ\n061\nparis\n##т\nd2\n颞\n痼\ngreat\ndesign\n耨\n##5s\n870\ndom\n770\n樯\n醯\n485\n##map\n須\nanne\n073\n##tment\ngrace\n養\nvia\nmall\ncdma\nsweet\ngpu\nト\nmandy\n肅\n406\nv6\noracle\ncss\nyoutube\n籼\n##011\n448\njerry\nrfid\nxml\n3100\n705\n殘\nplease\nコ\n##moon\nmsi\n綑\naccess\n##tton\n##text\nれ\n腈\n##↑\n辺\ngucci\nmagic\nitunes\ninn\n##ql\n岙\n##ews\ncrm\n歸\n391\nport\nram\n##あ\n##れ\n##url\n20g\n鲢\n##mbps\n475\n嘆\n1868\nquick\nonly\nold\n626\n##rner\n闕\n虛\nee\n側\n##tation\n##lution\n##nba\n1871\n##ニ\n##ᅮ\n1111\nneo\n綁\n勝\n蘋\n⑵\nrick\n595\n##dget\n##pper\nway\nbus\n癮\nplan\n##っ\n2050\n層\n15g\n檗\n繫\netf\nx6\n糍\ngtx\njennifer\n燈\n叡\n瀰\nカ\n428\n戦\n711\n滾\n599\nhere\n藝\n##ston\n588\n835\n##fet\nproject\n業\n璁\nunder\n褔\n##888\n##ature\n纖\n質\n繞\n##ち\nsony\n純\nみ\n##pact\njoseph\n##gate\n##у\ngov\n8k\neleven\n3～5\n##012\n##uality\n##google\n##м\nmusic\nq7\n##tail\ndemo\nる\nひ\n枱\n##ager\n498\n##verse\n塚\npad\n钡\nblock\n獣\nchinese\n尷\n##д\n##ging\n矽\n褻\nthink\n4200\nlock\nエ\nbenz\n##bay\n÷\nread\ngary\nbh\namazon\n瑩\nmoon\npvp\n嚕\n##lts\n眙\nsteam\nrun\n##iki\n##dot\nn2\nfx\n##ット\n資\n燙\n﹒\n笹\npaper\n3400\nハ\n5v\n畀\nccd\n﹚\n3t\nselect\n##365\nplus\n##ᆷ\n喫\n嘧\n獨\n##よ\n##や\nこ\n##pn\n688\n禦\n鏡\n1080p\nwan\natp\n##ᆨ\n℃\n瓴\n↓↓↓\noffer\n産\n1020\nbad\n##ator\nμ\nhave\nabout\n##リー\n團\n##aine\nδ\n##bility\nl2\n##js\ns4\n5k\n##≤\n臨\nkate\n##モ\nsummer\n★★★★★\n槍\n锑\nfox\n彎\n##ール\n##え\n##hia\n##007\n2021\n曉\n碇\n習\nfda\ndha\n##＞\nrich\nclub\n锆\n皴\nland\nbell\n環\n﹙\n荳\n5s\nname\n壯\n##tier\n1b\njustin\n801\n5d\n383\n##mail\n隠\ndns\n塊\n氖\n廣\nま\n##わ\n睏\n##food\n擁\nbox\n920\n秭\ndown\nidea\n525\n樹\nlist\ne6\nlaw\nocea
n\n##unch\nctrl\n##ᆼ\n痱\n罷\n##ホ\nfri\n寧\n##tory\n墒\n骯\n##せ\n##sson\ncharlie\n膩\n798\n##メ\ng9\nlittle\n##gion\n##lter\nnana\nday\ntue\ncare\n氙\n445\n頰\ndoi\nス\n##nix\njoy\nengine\nす\nら\n▽\npay\nco2\n##ᄉ\n##013\nnasa\n蒡\nって\ntouch\n421\n啓\n##ners\nq1\n園\n##ories\nmysql\nhop\n790\n灬\n錄\nnice\n討\n855\n徹\n嚴\n暱\n##jie\n##＾\ntai\n喲\noled\n舊\n朮\n镉\n##ッ\ngirl\n貴\n銀\nbase\n3～4\n脲\nク\n睜\n##チ\nkbs\n50g\n##ふ\ntips\n選\n負\n##◆\nedu\n鬚\namerican\n457\nλ\n碩\nbreak\nuniversity\n##2013\n##iner\nloft\n舸\ncross\n611\n蹼\n##ɡ\n##bian\nmost\nb12\n綿\n類\nming\nサ\nhoney\nldquo\nnike\n恆\n075\nesc\n##gnet\n慘\n##ンク\n鎖\n925\n舖\n璿\n##～17\n標\nε\n靦\n##jiang\n##ㄟ\nyoung\nforum\ntimes\n獸\nwater\n偉\n爭\nheart\n##580\noff\n筆\n930\n##bia\n##サ\nreview\n鑑\nstring\n窨\nも\n臺\ninstagram\n緣\ncnn\n393\ndsp\nqt\n蚵\nspring\npython\na9\n傾\n罵\n湊\n胍\nb3\n##khz\nfrance\n661\nturbo\n867\n參\n憐\n兇\n##お\nhuang\n暌\nbella\nhey\nvisa\n暢\n顔\nandrew\n廚\nnick\nlast\nisp\n##public\nmulti\n##п\n襲\n##イト\n##dium\n20mm\n441\nほ\nzip\n##009\nict\n5kg\n諸\n∠\n碚\n瓊\ncnc\n溏\n##ndy\n埕\n膦\nline\nadi\n##ゆ\nsolo\n畢\nfull\njohnson\n##ゃ\nmade\n證\ngrand\nroger\n##px\nhonda\n観\n##ック\n椹\n僕\n練\nsouth\nstephen\njump\n##2015\n##ント\n揸\njonathan\ntalk\nhit\n塗\nnight\nsns\nmain\n699\nsmall\n444\n##ᄃ\nbrand\n##icon\nkai\n##ャ\nxbox\nwin8\npony\n砼\n##ほ\n##エ\ncool\nknow\n賽\nadmin\n琲\njimmy\n載\n釋\n##nity\n##ート\n##ム\nprofile\ntext\nphoto\n隊\n##share\n汙\nhead\n衛\n##015\n##я\nс\nroom\n敗\n##ᅳ\n4d\n┃\nedge\nmacbook\n591\nsmith\nhiv\n髂\n爾\n##ᅧ\ntrue\nemc\nsap\n##sio\n啫\ncheck\ncloud\n616\n##rame\n##ろ\ndigital\nghost\nや\nwma\n709\nお\nblog\n獲\n│\nsurface\n貨\n饒\n##itz\nsms\nmega\nteam\nvery\ncry\n蓮\n##ソ\n楽\n撲\nwd\n##rity\n劇\nkeep\ncpa\n積\n##ˋ\n尋\nわ\n餵\nryan\n豊\n##ᅥ\npack\n徬\nrebecca\n縣\niu\njuly\n##ㄧ\nstrong\n韻\nnokia\n疳\nalexander\ncity\n厭\ntest\nsound\n##rex\n腫\ncentral\nhall\n胝\n##lean\nv5\n莊\nface\nsdk\n〓\n##ᄌ\n權\n雜\n##rdon\n圖\nftp\n隈\nwed\n閱\nlucky\n紙\ncma\n##ュ\nsarah\nfifa\n##shot\n貘\n膽\nnov\nq3\ncount\n豈\n煙\npeople\nband\ncbs\n##ᄀ\n##б\ndrive\nbin\n棄\n詠\n##ᄅ\nr
dquo\nレ\n碓\n輩\n蒐\n539\n074\n斎\n3mm\n慣\n720p\n蔔\ndell\n20cm\ngold\neye\nanti\n錶\n##kins\nhelp\nps4\nprime\n戀\nbbq\n橫\n叢\ngoo\n隸\nx7\n⑶\n議\nelle\n詞\n畑\n##master\n彙\ncolor\nted\n硐\nsaas\n##mmy\nstudio\n990\n颡\n##014\nr8\nstay\n3s\n##sue\n據\nf5\n汆\nテ\n導\n##lton\n814\nstop\n腸\nr2\n##ㄚ\n誌\nえ\n##から\ncamp\n項\nmmorpg\n皺\n360°\n1tb\n緋\n6gb\n##post\n鬱\n籤\nasia\nwing\n##style\n536\necho\n921\n瀟\nshare\nsars\n蕻\n486\n浬\n創\n##covery\nwear\n貪\n喪\nrgb\nvalue\n2l\n##ァ\nqe\n絶\nlim\nオ\nㄋ\ndeep\ntbs\nbing\nagent\n潴\n規\npatrick\n區\nwork\n針\nrss\nタ\n責\n甦\n紹\napt\ntaylor\nラ\npierre\n##ター\nsystem\n庹\n##め\nへ\n悅\n4399\nruby\nsoul\nsale\nroot\nhouse\ntee\nyahoo\n鎚\ng20\n堙\nσ\n##ision\nother\nミ\nvictoria\n攝\nmode\nour\n##470\n★★\n僅\nnorth\nᄋ\nkitty\nvilla\n劃\n##ケ\n気\ncut\nchristian\n##г\n12345\n饑\nfast\n～5\ncanon\nces\n憶\n##ague\nbeyond\nshopping\n嫵\nリ\n呂\n烏\n擴\n寬\ncarlo\nthu\n撚\nlocal\n鄉\nipv6\n787\nrar\n統\n跩\nして\n##して\n##って\nった\n0mm\n828\nd5\nvm\nadd\n襯\nbruce\nian\nwall\n韋\n迺\nv4\ncontrol\ndlc\nnata\ngeneral\n##2016\n剋\nⅴ\n騰\n##tings\nnational\n設\n838\nx4\n##ᄒ\n00㎡\nopec\npng\n齁\n臟\n艋\ndog\n4600\n孫\nkarl\ndark\nxr\n營\nindex\n涠\n鈎\nィ\n##tors\nいて\nling\ncoc\n掃\n詳\n煖\n##º\n敵\n##ネ\n奧\n陸\nwilliams\n##016\ndouble\n嘗\n##xon\n蹟\nems\n賣\nsaint\nfamily\nannie\nsbs\nhtml5\nkic\n納\n鲱\nnull\nview\n200mm\n128gb\ncfa\n##オ\n々\nssd\nせ\nめ\ndragon\nalexa\nmoney\n蠛\n##◆◆\nfive\nqr\ngroup\n歐\nヒ\n##③\npark\nage\npush\nxyz\n##iginal\n溝\nurl\nhdr\nホ\n縱\n嬤\n護\n揚\n濤\n##ヘ\n囟\nbay\nfeel\nfps\n##lux\n##ч\n埼\n險\n譯\n0020\n檯\n##graphy\ncrystal\nfed\njapan\n##rmb\n鴻\n##ミ\n澀\n##atus\n煩\nskin\nunix\nfish\n＄\nedm\n巿\nmetal\n級\nnetflix\n魚\nロ\ndior\n囝\n織\n##firm\n##desk\n費\nimf\nbusiness\n擼\n齒\n補\n﹏\n##ⅲ\n遺\n橼\nmoba\naction\n⒉\nholiday\n鬥\n揮\ntea\n##rail\n899\ntiffany\nfrancis\nstudy\nscience\nanthony\nacg\n濫\n##ょう\nbios\nfinal\nsoft\n龐\n123456\nactive\n袪\nvoice\n橋\n榖\n##ね\n誠\n砲\nenter\nclose\nlost\nprada\npoint\n##ggle\ntop100\n¤\nhpv\n##セ\ntag\n職\n6s\n繃\ne5\nadsl\nа\nnumber\n箇\neia\n輸\nacca\nhuman\n##dragon\nicon\n998\n##jy\n
マ\n麴\n証\n##▲\nspeed\n##した\n⑸\nads\n##2017\njavascript\ncisco\n816\nraid\nウ\ncard\n♂\n##♀\n##ョ\n5c\nstandard\n64gb\n##ハー\n##ㄨ\nchrome\n##trix\n館\n##いる\n##いい\n##れた\n◆◆\nopera\nshadow\nyam\nproduct\nbaidu\nv9\napec\n##ˊ\n897\n殺\nこの\n辻\n攏\n##ノ\n##lmer\nbrown\n∽\n～4\n劑\n##ᅢ\n騎\nキ\n##rence\nsteven\n萊\n淺\njessica\n飲\nuse\n##lax\nsamsung\n##bike\nskip\npizza\n鄰\necu\nmedia\n鎗\nshell\n埤\npowerpoint\n劵\nsign\ntab\n︱\nvmware\n擔\nθ\n##ο\npm2\n##った\njose\nakb48\n凇\ndear\nfield\nм\nspecial\n茼\nclaire\n穩\nsso\n慶\n##2014\nstate\n##ワ\n銅\nelectronic\n##berry\ntokyo\nz2\n鬧\n涙\n麥\n##こと\n##rcle\n⒈\npmi\n18k\nraw\niis\n32g\n•\n##ろう\npath\nした\nいた\n##この\n##ified\nusd\n瀬\n恵\nvon\n憂\noutlook\ncover\nddd\nwish\n鶴\npure\n≡\nshop\njackson\noct\n##λ\np9\nmatch\nrmvb\nshift\nmeta\niot\n##163\nivy\n説\n悶\n##へ\n～10\niphone7\nnano\n⑷\nψ\n擋\n監\n頌\n█\n##340\nvincent\nふ\n━\nalso\n擇\n滅\n韓\n∞\nadvanced\nswitch\nチ\n##ь\n獄\n##stle\nservice\n##nnis\nsep\nメ\neast\n濟\n##△\nwave\n減\nultra\n飄\n鵰\nbeta\nsomething\nmsci\nglobal\n銷\ntable\n##κ\nsimple\napr\nn3\n799\nlondon\nphone\n##ι\n##スト\n##walker\nmind\nsecond\nvision\nbattle\n靚\n##←\nbasic\n##っと\nしい\nん\nsays\nriver\n懸\n簷\npaypal\ndonald\n餚\n牽\nhtm\n##ーム\n桜\n34e\n嚨\ntrip\n爛\nmenu\nbody\n籠\nbeautiful\nsean\n実\naugust\n摳\ngolden\n##ーン\n訝\n⋯⋯\n廢\n##ぬ\n朧\n##イス\n濁\n剝\n彆\n##bug\n##bbs\nresearch\ndac\nchanel\n紛\nfit\n≈\n遲\n鮨\n測\nsina\n頗\nmbc\ndance\nago\n50mm\n孖\n衄\n√\nzoom\nち\n##ょ\n寛\nxperia\n堺\nnbc\n憤\nwatch\n峯\nvpn\nside\n嬰\nв\n綠\nauthor\nsui\n穫\nmama\n碼\nbuy\npink\nnetwork\n##ᅦ\n埵\ncell\nmiui\n##orage\nschool\nˇ\n粿\n紋\nherme\nasr\nspf\n##ᆯ\n︿\n##®\nnfc\n##β\n∕\n勞\n##ⅰ\n##のか\n発\nmarket\nppi\nleft\n潆\nnature\n3kg\njob\n##2f\n醃\n飾\n##れて\n##たい\n↓↓\nkimi\nァ\n100ml\n30ml\npixel\njordan\nfalse\n№\n粧\n##≥\nposted\n階\naug\n趵\nmarch\nearth\n頻\nprivate\n煉\n檢\nbear\n蠍\n曆\nott\n訊\n縛\n覽\n墻\nrussell\n執\n攪\n慮\n##ゅ\n絃\nwelcome\n額\n賦\nboot\n聯\ndaily\nhealth\n##bby\njones\n竇\n伝\n##ッフ\nヽ\nps3\n500ml\n喬\n##イン\n緒\n哋\n擾\norder\n毎\n╮\n廁\n歩\nbeauty\nexpress\nsilver\nᄀ\nsecret\nscript\n萩\n
鄭\n魯\n遞\nsingle\n偽\ncio\nセ\nsoftware\nitem\n##ォ\nmetro\nそ\n##▽\n穎\n##price\n獻\ntravel\nprice\nfpga\nф\n嵐\n捩\nノ\n癱\nssh\n様\nmcu\n撿\n燿\n箏\n2345\ncapital\n浯\nyear\nicp\n礙\nball\n##abc\nvogue\n價\n##rss\n闊\nllc\n##fork\nヘ\n痙\n攣\n24h\nkong\nsave\n##mporary\n侖\nゆ\n鎮\nbim\n賓\nroad\nmomo\n蓆\nburberry\n穢\nsiri\n島\nsport\nfocus\n丿\n##ㄋ\n##とう\nт\nhost\n墮\ngmail\nevent\n##ification\n886\n繪\n帯\nself\n躍\n##sday\ntvbs\n##zzi\n晩\n##ᄆ\ndean\nkpi\natom\nbefore\n財\n囍\n暐\naspx\npublic\nzara\n撃\nfriends\nroyal\n誇\nbigbang\narpg\nsupport\n纍\nprogram\n訓\n組\n咘\nuniversal\n嘩\n弔\n蕭\n貫\n狹\n##stack\najax\ne7\n葯\n##ション\n┌\nserver\ncalifornia\nprocess\n##んな\n##また\n湯\nworks\ngmat\n誤\nbag\n譁\n闖\n闍\np10\n##rtex\ncontinue\n##∕\n閣\nネ\nsunday\n##リア\nunity\n##④\n勻\naudio\ngpa\n惱\n##017\n戸\n輝\n曖\nnikon\nclick\n賊\n灘\ncode\nsource\n範\n樫\ntoday\n╰\ncoffee\nplace\npuma\ncookie\nreading\n磡\n対\nfantasy\n舾\n##ラン\n##ivity\nから\ntour\nvera\ngit\n預\nchildren\n鳴\n╯\nnatural\n値\n蔣\nsite\n崁\nkuso\ncreative\n奪\nsocial\nhill\ncanada\nnexus\ndram\n筲\nwindow\n槓\nchannel\npm1\n16g\n梶\nhadoop\n賁\n誦\n盃\n墊\n埸\nᄆ\nradio\neyes\n852\nsanta\ndate\n##shine\nedit\n紘\nno1\ncto\nwrite\nx9\nえて\nえる\nめる\n##かり\nmonte\nrunning\norange\n醜\n##｀\n##のは\nford\nя\nflickr\ndit\n寢\n縴\n勸\nhoward\n鳥\nfollow\n砗\nenglish\n蝟\nк\ntpp\n舺\n廟\n##いた\nける\n淪\n##む\nisland\n鹽\n866\n##│\n##ース\n賢\norz\njcb\nuser\nenergy\nˊ\nwiki\nisis\nthunder\n傭\nmodern\nfriend\n崑\nguest\n咗\nfood\n##∼\ngear\n50ml\n壺\nmount\n櫃\n##ある\nlake\nsophie\nstep3\n搶\n潟\n##する\n▍\n垵\nstone\n療\nomega\nleave\n喚\n鏽\n審\npwm\ntechnology\n##lvin\n暫\n30cm\n剣\n##з\n≧\nsports\n澆\n濺\nssl\n贊\n##カー\ncdn\nぃ\nlte\n##cake\nutc\n##ヤ\n##ß\n塵\nelse\n憑\nー\n篠\n匯\nkit\n〜\naws\nyumi\nㄆ\namoled\n##ᄇ\ngap\n貓\nд\n##ㄢ\nwii\nips\n嘅\n疊\n##ちの\n##かな\ntown\nunited\n峇\nᄉ\n攤\n##◎\nする\nswift\n##ました\ninstitute\nthread\nㄌ\nemily\n徑\ntiger\n俠\n滝\n漿\nよ\n圏\n攞\n齋\n綵\nebd\nobject\nˋ\n氡\n弾\n埗\nfpx\nfintech\napril\n蠻\n顛\niphone6\n##δ\n##ρ\nmemory\n円\n搵\nbank\n闔\nkindle\n冊\n##れは\n榮\n経\nwps\n鏤\n⒊\n罰\n鋼\n災\ncountry\nл\n1389\nredis\n╭\n
紐\n懼\nれる\n窮\n黨\nsogo\nusb3\n窺\n槻\n綺\ngirls\nemma\n辭\nspark\n飆\nmorning\n##τ\niptv\napk\nlearning\nmachine\ncream\napache\n関\nconnect\n嚮\n揹\nvirtual\n廂\n驀\n##unge\ntarget\n甕\n##ㄥ\n餓\nspan\nlenovo\n燻\nユ\ntaiwan\nᄂ\nても\n12306\n鉄\n巔\n##ⅳ\n贏\n##÷\nmiller\n##↗\nparker\ngoods\n##ように\nれた\n##まて\n##²\n潛\nη\ndownload\nplaystation\nstory\n礡\ngithub\nskype\n▋\n竄\n紗\n鍵\n燉\nngo\n##ᄑ\n啞\n嘢\ntower\nwin10\npowered\nｃｐ\n霊\nfresh\nr9\nstreet\n聰\n貞\n児\ncoach\nbuild\nfuture\n変\n羣\n##dical\nmelody\nswatch\n樸\n瑯\ndelete\nevolution\n勳\n敎\nてある\npacific\n殼\n徴\n鍾\n臘\n攬\n##くたさい\ntrump\nboard\n勵\nchange\n滲\n簽\n瑪\n##х\nwilson\nニ\nimax\n##なと\nmountain\n##きた\nnode\n歛\nsquare\n賀\nォ\nfriday\n噓\ninto\n##⑥\nfashion\n嚀\n録\njune\nbridge\n爐\n鏈\n訪\n##ける\neds\n獵\ncollege\ncommon\ntree\n懶\n墘\n辯\n轟\n霧\n脈\n銬\nnand\n犧\n閑\naudi\n郷\n売\ndiscovery\nerror\ntoyota\nliving\n##ㄞ\nwant\ncopy\n曠\n懲\ncontent\nρ\n齡\nbird\n檔\n覧\nsupreme\nツ\n##ᄂ\n灑\n樁\n騙\nн\n##ヨ\n##baru\nq4\n˙\n##ς\n壇\n棲\n##あなたに\nq10\ncreate\n簦\n鐲\n驕\n闢\n##その\nまて\n##なく\n込\n蘿\nκ\n賜\n諷\ncoupe\n簫\n遷\n##ても\njquery\n▌\ncms\nモ\nswiss\n輛\n##blog\nᄒ\n潰\n擱\nupdate\n14k\ncontact\n郵\n澤\nchicago\n##ータ\ncamera\nios11\ngmt\n擬\n糞\n##ᅪ\nzone\n評\nз\n侷\n##〓\n##ルフ\n##ц\nvive\nsync\nrate\n∥\n閨\n稟\n癲\n##ーション\n氷\n豎\n##ンス\nе\n##ш\n違\n遙\nてす\nル\n逕\n燭\nр\nviews\ncst\nstation\nf16\n糾\n巻\n禍\narmani\nナ\nvim\n獅\n氹\n##ø\nケ\nsystems\n##∩\n冏\n鰍\ncostco\n薩\n兲\n悪\nnovember\nagain\nvsc\nmozilla\n渉\n団\ngivenchy\nrelease\nㄏ\n##ㄛ\n◢\nヶ\n##ᄎ\n蒼\nthai\n轎\nι\n彫\nunion\nfrm\n湳\nи\nclassic\n広\n##♂\nassociation\n##なら\n貧\nkanye\nχ\n##ν\n##nome\n賭\nlanguage\ninside\n##サー\nㄒ\ncompany\n壽\n図\n丼\n灝\nsense\n糬\n##╰\nг\n鏢\n穂\n蹤\n僱\nvans\n彌\n頑\nп\nワ\n##りの\n膠\n絵\n滯\n##ы\nreturn\nrequest\n綸\n##∥\nmems\nbooking\n##ㄏ\n喰\n##ᅲ\nstep1\n##ーシ\n螢\n##よう\n搗\n##cms\n##rix\nairbnb\nculture\nまた\nphilips\n協\n##ˇ\n##↘\n┊\n鑼\n続\n##かる\ndays\ntrc\n##π\n靂\nwomen\ncafe\nyoga\n嬪\n寵\n筍\n##ㄤ\n##ㄉ\n凪\nsearch\ncheese\nとして\n閔\n##∠\n邁\n駛\ndavis\n洸\n績\nddos\n綻\n##んて\n喺\ndisplay\n籬\n摯\n晝\n##χ\n潑\n蒟\n絳\nmuji\n鈕\nbeach\n倉\nы\ndcs\n##ちゃん\nseason
\n驅\n訂\n糧\ndomain\nglass\nlumia\nfeed\n撈\n⒋\n貶\n##ə\n侶\nios7\n牴\n帳\nソ\npress\npinterest\nrights\n弒\n艱\n挾\nlimited\nptt\narticle\ncambridge\n##≈\nleonardo\n覓\n虧\n毀\n浄\n妘\n駕\n嬸\n窩\neba\nㄨ\n##いて\n済\n鈴\n熾\nっ\nunicode\neducation\n##ます\nwordpress\nfinancial\nvolvo\nadidas\nhistory\n轅\n籲\n単\n賴\n榱\n##θ\n転\nfiles\n##なり\nstep2\ngarden\n™\n棟\n噠\n俬\ngaga\n##ж\n総\ntitle\n##∞\ncss3\nぁ\n販\n嘰\n欄\nurban\n##™\n築\nicloud\n##ュー\nmarketing\n##admin\nmkv\n≦\n##ᄐ\n##ᄏ\ndevelopment\nvps\n敘\n鉅\n##γ\noctober\nо\nかある\n®\n抜\npepper\n##ては\n脅\n満\n錦\n筊\n##イフ\nbean\nο\nすく\n唄\n傘\nboost\nexchange\n##ᅬ\n##ㄇ\n0755\n−\n##ᅴ\n竊\nㄥ\n##˚\nyork\n衞\narchive\nedition\nfilm\nmuseum\n嶋\nofficial\ninformation\naward\nseries\ngundam\nstage\nsociety\ngames\nentertainment\nversion\nmovie\ncollection\nforce\nreport\nlibrary\nguide\nbooks\noxford\ncentre\nchurch\nmagazine\nfestival\nstates\nawards\ndatabase\nplanet\namerica\njunior\npublishing\nfoundation\n歴\nstore\n脇\n毘\narts\nkorea\nmanagement\nfunction\nplayer\n専\ncounty\nyears\napplication\njanuary\nfebruary\ncorporation\n県\neditor\nweek\nwashington\n亜\nforest\nsingapore\n##ctionary\nseptember\ndecember\n姉\nsecurity\nfirefox\nkids\nmaps\npublished\nsurvey\nmhz\nphotos\n蔵\nservices\ncategory\nstructure\npiece\n増\ngnu\nwong\nresult\n収\n栃\navenue\ntaipei\nlamigo\nplaza\npremium\ncook\nmalaysia\nmember\ngallery\n舎\nfactory\n稲\nedited\nwalker\nairport\nx86\nvalley\nnotes\nsmap\nvillage\nchicken\npolicy\nimages\ntwice\ncourse\nwikipedia\narchives\npopulation\nlinks\ndisney\nframework\n塭\nexplorer\nmaker\nkde\nmessage\n読\nsuite\n##gence\neclipse\n闘\ndistribution\nfung\n髪\ncampaign\n遶\n敍\n仮\n駅\ndetails\nmanager\nncc\n隣\nbelle\nsession\narticles\n菓\n驒\nexhibition\n齢\n険\noil\n乗\nazure\ninstall\neconomy\nubuntu\ncomment\n渋\n塩\nfeb\n応\naddress\n栄\n権\nhsu\npopular\nwine\nnetscape\nrelated\n頼\n帰\n験\nnownews\n効\n薬\n雑\nkitchen\nreader\n壆\nsafari\nsbl\n壊\n縄\nlulu\ndiary\n莿\n検\nteddy\nresort\n覇\n両\n荖\nhair\ncbc\nqualcomm\ndocomo\n覚\n##ernel\nettoday\n倶\nscp\n掲\n冨\nbalance\n駄\nmobi
l\nmlb\nalphago\npmid\n訳\nprevious\n坵\nrestaurant\n処\nwonderland\nwedding\n扱\npanasonic\nhmv\n継\n庁\ncredit\nmessenger\nclient\n聴\ncurve\n従\nmonday\n働\n譲\n荘\nhotels\n廃\nscrum\nbutler\ncopyright\n縁\n嫲\nchocolate\nweeks\n焼\n岀\n挙\ncache\nsalvatore\n労\necfa\n傚\nshtml\n麺\ngaming\nasus\nfedora\nr11\n厳\nmonths\nhbl\n仏\nurn\nlinkedin\njson\n舗\noops\nzenfone\nfaq\nbytes\nxz\ntrivago\nc919\n嚟\ncake\nwam\nposts\ntesla\n彅\ned2k\nnego\nmwc\nbanner\nreuters\nh7n9\nbrandt\nzol\nevernote\njobs\nmsg\nrosie\nsylvia\nbrake\npchome\n駆\ngogoro\nrmk\nloading\ncomments\nspotify\n蒻\nie6\n圧\nikea\nrewards\n囲\nnavigation\n0800\n悩\nuniqlo\nwhatsapp\n軽\ndollars\nudn\nwali\n脳\ncookies\nlifestyle\narduino\npokemon\noutlet\ngartner\n択\nrainer\ndji\nwade\n窓\ncurry\nferragamo\nplurk\ntcs\n繋\noriginals\nbloomberg\ndhl\n磲\ncf1\nsgs\nmango\nlohas\ntinker\n##pods\nvsa\nprotected\nanalytics\n騨\n歯\nprivacy\n臓\nblogspot\n拡\ntila\n払\nfc2\npptv\nmarriott\noculus\npixnet\n価\n銭\nhitachi\n瞓\nntd\nwikia\nnamespace\nagoda\npositioning\nsohu\n##bscribe\nhola\nmastercard\n9985\nblogger\nexpedia\nwestbrook\nbbe\n畳\ndyson\nrcep\n粄\nweibo\ntraction\n##onsored\nfandom\nday1\nstarbucks\nwwdc\nbuffet\n剉\ntags\n2765\nebooks\n##skip\ntweet\ne88\nhostel\n8591\n4066\nmotel\n1mdb\nprefix\nlabels\n椪\n勧\ninsee\n##vertisement\n##tags\nday3\nu11\n戻\n##t00\nepson\ntripadvisor\ntmt\nfishbase\nbe2\nvdc\ngopro\n歓\nreply\nhktvmall\ngr2\n##facebook\npinkoi\nmoneydj\ngoris\nrends\ntomtom\nnginx\nfsm\n9595\n6221\nios10\n堿\nrakuten\nxuite\nadsense\nday4\n00z\nconnectivity\nworldcat\n##copyright\nhuawei\nreserved\ntvg\ndropbox\n9678\nlongchamp\n50cm\namana\n12345678910\nlofter\n5757\n6606\n遅\nvfm\n17life\n<eos>\n<unk>\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/module.py",
    "content": "import math\nimport os\n\n\nimport paddlehub as hub\nfrom .processor import load_vocab\nfrom .processor import postprocess\nfrom .processor import preprocess\nfrom paddlehub.compat.task import tokenization\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"porn_detection_gru\",\n            version=\"1.2.0\",\n            summary=\"Baidu's open-source Porn Detection Model.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass PornDetectionGRU(hub.NLPPredictionModule):\n\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"infer_model\")\n        self.tokenizer_vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"word_dict.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n        self.sequence_max_len = 256\n        self.tokenizer = tokenization.FullTokenizer(self.tokenizer_vocab_path)\n\n        self.param_file = os.path.join(self.directory, \"assets\", \"params.txt\")\n\n        self.predict = self.detection\n\n        self._set_config()\n\n    @serving\n    def detection(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the porn prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             batch_size(int): the program deals once with one batch\n\n        Returns:\n             results(list): the porn prediction results\n        \"\"\"\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n       
     int(_places[0])\n        except:\n            use_gpu = False\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(batch_data, self.tokenizer, self.vocab, self.sequence_max_len)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which was used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"porn\": 1, \"not_porn\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_gru/processor.py",
    "content": "import io\n\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for i, line in enumerate(f):\n            vocab[line.rstrip()] = int(i)\n    return vocab\n\n\ndef get_predict_label(pos_prob):\n    \"\"\"\n    Convert the prediction probabilities to label\n    \"\"\"\n    # threshold should be (1, 0.5)\n    neu_threshold = 0.5\n    if pos_prob >= neu_threshold:\n        label, key = 1, \"porn\"\n    else:\n        label, key = 0, \"not_porn\"\n    return label, key\n\n\ndef preprocess(predicted_data, tokenizer, vocab, sequence_max_len=256):\n    \"\"\"\n    Convert the word str to word id and pad the text\n    \"\"\"\n    result = []\n    padding = vocab['<PAD>']\n    unknown = vocab['<UNK>']\n    for index, text in enumerate(predicted_data):\n        data_arr = tokenizer.tokenize(''.join(text.split()))\n        wids = [vocab.get(w, unknown) for w in data_arr[:sequence_max_len]]\n        if len(wids) < sequence_max_len:\n            wids = wids + [padding] * (sequence_max_len - len(wids))\n\n        result_i = {'processed': []}\n        result_i['origin'] = predicted_data[index]\n        result_i['processed'] += wids\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to pornography label\n    \"\"\"\n    result = []\n    predict_out = predict_out.as_ndarray()\n    for index in range(len(texts)):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'not_porn'\n        else:\n            key = 'porn'\n        result_i['porn_detection_label'] = label\n        result_i['porn_detection_key'] = key\n        result_i['porn_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['not_porn_probs'] = float('%.4f' 
% (predict_out[index, 0]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/README.md",
    "content": "# porn_detection_lstm\n\n| 模型名称            |  senta_bilstm  |\n| :------------------ | :------------: |\n| 类别                | 文本-文本审核  |\n| 网络                |      LSTM      |\n| 数据集              | 百度自建数据集 |\n| 是否支持Fine-tuning |       否       |\n| 模型大小            |       1M       |\n| 最新更新日期        |   2021-02-26   |\n| 数据指标            |       -        |\n\n## 一、模型基本信息\n\n- ### 模型介绍\n  - 色情检测模型可自动判别文本是否涉黄并给出相应的置信度，对文本中的色情描述、低俗交友、污秽文案进行识别。\n  - porn_detection_lstm采用LSTM网络结构并按字粒度进行切词，具有较高的分类精度。该模型最大句子长度为256字，仅支持预测。\n\n## 二、安装\n\n- ### 1、环境依赖\n\n  - paddlepaddle >= 1.6.2\n\n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install porn_detection_lstm\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run porn_detection_lstm --input_text \"黄片下载\"\n    ```\n\n  - 或者\n\n  - ```shell\n    $ hub run porn_detection_lstm --input_file test.txt\n    ```\n\n    - 其中test.txt存放待审查文本，每行仅放置一段待审核文本\n\n  - 通过命令行方式实现hub模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    porn_detection_lstm = hub.Module(name=\"porn_detection_lstm\")\n\n    test_text = [\"黄片下载\", \"打击黄牛党\"]\n\n    results = porn_detection_lstm.detection(texts=test_text, use_gpu=True, batch_size=1)\n\n    for index, text in enumerate(test_text):\n        results[index][\"text\"] = text\n    for index, result in enumerate(results):\n        print(results[index])\n\n    # 输出结果如下：\n    # {'text': '黄片下载', 'porn_detection_label': 1, 'porn_detection_key': 'porn', 'porn_probs': 0.9879, 'not_porn_probs': 0.0121}\n    # {'text': '打击黄牛党', 'porn_detection_label': 0, 
'porn_detection_key': 'not_porn', 'porn_probs': 0.0004, 'not_porn_probs': 0.9996}\n    ```\n\n- ### 3、API\n\n  - ```python\n    def detection(texts=[], data={}, use_gpu=False, batch_size=1):\n    ```\n\n    - porn_detection_lstm预测接口，鉴定输入句子是否为黄文\n\n    - **参数**\n      - texts(list[str]): 待预测数据，如果使用texts参数，则不用传入data参数，二选一即可\n      - data(dict): 预测数据，key必须为text，value是带预测数据。如果使用data参数，则不用传入texts参数，二选一即可。建议使用texts参数，data参数后续会废弃。\n      - use_gpu(bool): 是否使用GPU预测\n      - batch_size(int): 批处理大小\n\n    - **返回**\n      - results(list): 鉴定结果\n\n  - ```python\n    def get_labels():\n    ```\n    - 获取porn_detection_lstm的可识别的类别及其编号\n\n    - **返回**\n      - labels(dict): porn_detection_lstm的类别及其对应编号(二分类，是/不是)\n\n  - ```python\n    def get_vocab_path():\n    ```\n    - 获取预训练时使用的词汇表\n\n    - **返回**\n      - vocab_path(str): 词汇表路径\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线色情文案检测服务，可以将此接口用于在线web应用。\n\n- ### 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n  - ```shell\n    $ hub serving start -m porn_detection_lstm -p 8866\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前，请设置CUDA_VISIBLE_DEVICES环境变量，否则不用设置。\n\n\n- ### 第二步：发送预测请求\n\n  - 配置好服务端，以下数行代码即可实现发送预测请求，获取预测结果\n\n  - ```python\n    import requests\n    import json\n\n    # 待预测数据\n    text = [\"黄片下载\", \"打击黄牛党\"]\n\n    # 设置运行配置\n    # 对应本地预测porn_detection_lstm.detection(texts=text, batch_size=1, use_gpu=True)\n    data = {\"texts\": text, \"batch_size\": 1, \"use_gpu\":True}\n\n    # 指定预测方法为porn_detection_lstm并发送post请求，content-type类型应指定json方式\n    # HOST_IP为服务器IP\n    url = \"http://HOST_IP:8866/predict/porn_detection_lstm\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # 打印预测结果\n    print(json.dumps(r.json(), indent=4, ensure_ascii=False))\n    ```\n\n  - 关于PaddleHub Serving更多信息参考[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  大幅提升预测性能，同时简化接口使用\n\n* 1.2.0\n\n  移除 
Fluid API\n\n  - ```shell\n    $ hub install porn_detection_lstm==1.2.0\n    ```\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/assets/params.txt",
    "content": "@HUB_porn_detection_lstm@fc_0.w_0\n@HUB_porn_detection_lstm@lstm_0.w_0\n@HUB_porn_detection_lstm@embedding_0.w_0\n@HUB_porn_detection_lstm@fc_1.w_0\n@HUB_porn_detection_lstm@lstm_0.b_0\n@HUB_porn_detection_lstm@layer_norm_0.w_0\n@HUB_porn_detection_lstm@fc_1.b_0\n@HUB_porn_detection_lstm@layer_norm_0.b_0\n@HUB_porn_detection_lstm@fc_0.b_0\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/assets/vocab.txt",
    "content": "[PAD]\n[unused1]\n[unused2]\n[unused3]\n[unused4]\n[unused5]\n[unused6]\n[unused7]\n[unused8]\n[unused9]\n[unused10]\n[unused11]\n[unused12]\n[unused13]\n[unused14]\n[unused15]\n[unused16]\n[unused17]\n[unused18]\n[unused19]\n[unused20]\n[unused21]\n[unused22]\n[unused23]\n[unused24]\n[unused25]\n[unused26]\n[unused27]\n[unused28]\n[unused29]\n[unused30]\n[unused31]\n[unused32]\n[unused33]\n[unused34]\n[unused35]\n[unused36]\n[unused37]\n[unused38]\n[unused39]\n[unused40]\n[unused41]\n[unused42]\n[unused43]\n[unused44]\n[unused45]\n[unused46]\n[unused47]\n[unused48]\n[unused49]\n[unused50]\n[unused51]\n[unused52]\n[unused53]\n[unused54]\n[unused55]\n[unused56]\n[unused57]\n[unused58]\n[unused59]\n[unused60]\n[unused61]\n[unused62]\n[unused63]\n[unused64]\n[unused65]\n[unused66]\n[unused67]\n[unused68]\n[unused69]\n[unused70]\n[unused71]\n[unused72]\n[unused73]\n[unused74]\n[unused75]\n[unused76]\n[unused77]\n[unused78]\n[unused79]\n[unused80]\n[unused81]\n[unused82]\n[unused83]\n[unused84]\n[unused85]\n[unused86]\n[unused87]\n[unused88]\n[unused89]\n[unused90]\n[unused91]\n[unused92]\n[unused93]\n[unused94]\n[unused95]\n[unused96]\n[unused97]\n[unused98]\n[unused99]\n[UNK]\n[CLS]\n[SEP]\n[MASK]\n<S>\n<T>\n!\n\"\n#\n$\n%\n&\n'\n(\n)\n*\n+\n,\n-\n.\n/\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n:\n;\n<\n=\n>\n?\n@\n[\n\\\n]\n^\n_\na\nb\nc\nd\ne\nf\ng\nh\ni\nj\nk\nl\nm\nn\no\np\nq\nr\ns\nt\nu\nv\nw\nx\ny\nz\n{\n|\n}\n~\n£\n¤\n¥\n§\n©\n«\n®\n°\n±\n²\n³\nµ\n·\n¹\nº\n»\n¼\n×\nß\næ\n÷\nø\nđ\nŋ\nɔ\nə\nɡ\nʰ\nˇ\nˈ\nˊ\nˋ\nˍ\nː\n˙\n˚\nˢ\nα\nβ\nγ\nδ\nε\nη\nθ\nι\nκ\nλ\nμ\nν\nο\nπ\nρ\nς\nσ\nτ\nυ\nφ\nχ\nψ\nω\nа\nб\nв\nг\nд\nе\nж\nз\nи\nк\nл\nм\nн\nо\nп\nр\nс\nт\nу\nф\nх\nц\nч\nш\nы\nь\nя\nі\nا\nب\nة\nت\nد\nر\nس\nع\nل\nم\nن\nه\nو\nي\n۩\nก\nง\nน\nม\nย\nร\nอ\nา\nเ\n๑\n་\nღ\nᄀ\nᄁ\nᄂ\nᄃ\nᄅ\nᄆ\nᄇ\nᄈ\nᄉ\nᄋ\nᄌ\nᄎ\nᄏ\nᄐ\nᄑ\nᄒ\nᅡ\nᅢ\nᅣ\nᅥ\nᅦ\nᅧ\nᅨ\nᅩ\nᅪ\nᅬ\nᅭ\nᅮ\nᅯ\nᅲ\nᅳ\nᅴ\nᅵ\nᆨ\nᆫ\nᆯ\nᆷ\nᆸ\nᆺ\nᆻ\nᆼ\nᗜ\nᵃ\nᵉ\nᵍ\nᵏ\nᵐ\nᵒ\nᵘ\n‖\n„\n†\n•\n‥\n‧\n 
\n‰\n′\n″\n‹\n›\n※\n‿\n⁄\nⁱ\n⁺\nⁿ\n₁\n₂\n₃\n₄\n€\n℃\n№\n™\nⅰ\nⅱ\nⅲ\nⅳ\nⅴ\n←\n↑\n→\n↓\n↔\n↗\n↘\n⇒\n∀\n−\n∕\n∙\n√\n∞\n∟\n∠\n∣\n∥\n∩\n∮\n∶\n∼\n∽\n≈\n≒\n≡\n≤\n≥\n≦\n≧\n≪\n≫\n⊙\n⋅\n⋈\n⋯\n⌒\n①\n②\n③\n④\n⑤\n⑥\n⑦\n⑧\n⑨\n⑩\n⑴\n⑵\n⑶\n⑷\n⑸\n⒈\n⒉\n⒊\n⒋\nⓒ\nⓔ\nⓘ\n─\n━\n│\n┃\n┅\n┆\n┊\n┌\n└\n├\n┣\n═\n║\n╚\n╞\n╠\n╭\n╮\n╯\n╰\n╱\n╳\n▂\n▃\n▅\n▇\n█\n▉\n▋\n▌\n▍\n▎\n■\n□\n▪\n▫\n▬\n▲\n△\n▶\n►\n▼\n▽\n◆\n◇\n○\n◎\n●\n◕\n◠\n◢\n◤\n☀\n★\n☆\n☕\n☞\n☺\n☼\n♀\n♂\n♠\n♡\n♣\n♥\n♦\n♪\n♫\n♬\n✈\n✔\n✕\n✖\n✦\n✨\n✪\n✰\n✿\n❀\n❤\n➜\n➤\n⦿\n、\n。\n〃\n々\n〇\n〈\n〉\n《\n》\n「\n」\n『\n』\n【\n】\n〓\n〔\n〕\n〖\n〗\n〜\n〝\n〞\nぁ\nあ\nぃ\nい\nう\nぇ\nえ\nお\nか\nき\nく\nけ\nこ\nさ\nし\nす\nせ\nそ\nた\nち\nっ\nつ\nて\nと\nな\nに\nぬ\nね\nの\nは\nひ\nふ\nへ\nほ\nま\nみ\nむ\nめ\nも\nゃ\nや\nゅ\nゆ\nょ\nよ\nら\nり\nる\nれ\nろ\nわ\nを\nん\n゜\nゝ\nァ\nア\nィ\nイ\nゥ\nウ\nェ\nエ\nォ\nオ\nカ\nキ\nク\nケ\nコ\nサ\nシ\nス\nセ\nソ\nタ\nチ\nッ\nツ\nテ\nト\nナ\nニ\nヌ\nネ\nノ\nハ\nヒ\nフ\nヘ\nホ\nマ\nミ\nム\nメ\nモ\nャ\nヤ\nュ\nユ\nョ\nヨ\nラ\nリ\nル\nレ\nロ\nワ\nヲ\nン\nヶ\n・\nー\nヽ\nㄅ\nㄆ\nㄇ\nㄉ\nㄋ\nㄌ\nㄍ\nㄎ\nㄏ\nㄒ\nㄚ\nㄛ\nㄞ\nㄟ\nㄢ\nㄤ\nㄥ\nㄧ\nㄨ\nㆍ\n㈦\n㊣\n㎡\n㗎\n一\n丁\n七\n万\n丈\n三\n上\n下\n不\n与\n丐\n丑\n专\n且\n丕\n世\n丘\n丙\n业\n丛\n东\n丝\n丞\n丟\n両\n丢\n两\n严\n並\n丧\n丨\n个\n丫\n中\n丰\n串\n临\n丶\n丸\n丹\n为\n主\n丼\n丽\n举\n丿\n乂\n乃\n久\n么\n义\n之\n乌\n乍\n乎\n乏\n乐\n乒\n乓\n乔\n乖\n乗\n乘\n乙\n乜\n九\n乞\n也\n习\n乡\n书\n乩\n买\n乱\n乳\n乾\n亀\n亂\n了\n予\n争\n事\n二\n于\n亏\n云\n互\n五\n井\n亘\n亙\n亚\n些\n亜\n亞\n亟\n亡\n亢\n交\n亥\n亦\n产\n亨\n亩\n享\n京\n亭\n亮\n亲\n亳\n亵\n人\n亿\n什\n仁\n仃\n仄\n仅\n仆\n仇\n今\n介\n仍\n从\n仏\n仑\n仓\n仔\n仕\n他\n仗\n付\n仙\n仝\n仞\n仟\n代\n令\n以\n仨\n仪\n们\n仮\n仰\n仲\n件\n价\n任\n份\n仿\n企\n伉\n伊\n伍\n伎\n伏\n伐\n休\n伕\n众\n优\n伙\n会\n伝\n伞\n伟\n传\n伢\n伤\n伦\n伪\n伫\n伯\n估\n伴\n伶\n伸\n伺\n似\n伽\n佃\n但\n佇\n佈\n位\n低\n住\n佐\n佑\n体\n佔\n何\n佗\n佘\n余\n佚\n佛\n作\n佝\n佞\n佟\n你\n佢\n佣\n佤\n佥\n佩\n佬\n佯\n佰\n佳\n併\n佶\n佻\n佼\n使\n侃\n侄\n來\n侈\n例\n侍\n侏\n侑\n侖\n侗\n供\n依\n侠\n価\n侣\n侥\n侦\n侧\n侨\n侬\n侮\n侯\n侵\n侶\n侷\n便\n係\n促\n俄\n俊\n俎\n俏\n俐\n俑\n俗\n俘\n俚\n保\n俞\n俟\n俠\n信\n俨\n俩\n俪\n俬\n俭\n修\n俯\n俱\n俳\n俸\n俺\n俾\n倆\n倉\n個\n倌\n倍\n倏\n們\n倒\n倔\n倖\n倘\n候\n倚\n倜\n借\n倡\n値\n倦\n倩\n倪\n倫\n倬\n倭\n倶\n债\n值\n倾\n偃\n假\n偈\n偉\n偌\n偎\n偏\n偕\n做\n停\n健\n側\n偵\n偶\n偷\n偻\n偽\n偿\n傀\n傅\n傍\n傑\n傘\n備\n傚\n傢\n傣\n傥\n储\n傩\n催\n傭\n傲\n傳\n債\n傷\n傻\n傾\n僅\n働\n像\n僑\n
僕\n僖\n僚\n僥\n僧\n僭\n僮\n僱\n僵\n價\n僻\n儀\n儂\n億\n儆\n儉\n儋\n儒\n儕\n儘\n償\n儡\n優\n儲\n儷\n儼\n儿\n兀\n允\n元\n兄\n充\n兆\n兇\n先\n光\n克\n兌\n免\n児\n兑\n兒\n兔\n兖\n党\n兜\n兢\n入\n內\n全\n兩\n八\n公\n六\n兮\n兰\n共\n兲\n关\n兴\n兵\n其\n具\n典\n兹\n养\n兼\n兽\n冀\n内\n円\n冇\n冈\n冉\n冊\n册\n再\n冏\n冒\n冕\n冗\n写\n军\n农\n冠\n冢\n冤\n冥\n冨\n冪\n冬\n冯\n冰\n冲\n决\n况\n冶\n冷\n冻\n冼\n冽\n冾\n净\n凄\n准\n凇\n凈\n凉\n凋\n凌\n凍\n减\n凑\n凛\n凜\n凝\n几\n凡\n凤\n処\n凪\n凭\n凯\n凰\n凱\n凳\n凶\n凸\n凹\n出\n击\n函\n凿\n刀\n刁\n刃\n分\n切\n刈\n刊\n刍\n刎\n刑\n划\n列\n刘\n则\n刚\n创\n初\n删\n判\n別\n刨\n利\n刪\n别\n刮\n到\n制\n刷\n券\n刹\n刺\n刻\n刽\n剁\n剂\n剃\n則\n剉\n削\n剋\n剌\n前\n剎\n剐\n剑\n剔\n剖\n剛\n剜\n剝\n剣\n剤\n剥\n剧\n剩\n剪\n副\n割\n創\n剷\n剽\n剿\n劃\n劇\n劈\n劉\n劊\n劍\n劏\n劑\n力\n劝\n办\n功\n加\n务\n劣\n动\n助\n努\n劫\n劭\n励\n劲\n劳\n労\n劵\n効\n劾\n势\n勁\n勃\n勇\n勉\n勋\n勐\n勒\n動\n勖\n勘\n務\n勛\n勝\n勞\n募\n勢\n勤\n勧\n勳\n勵\n勸\n勺\n勻\n勾\n勿\n匀\n包\n匆\n匈\n匍\n匐\n匕\n化\n北\n匙\n匝\n匠\n匡\n匣\n匪\n匮\n匯\n匱\n匹\n区\n医\n匾\n匿\n區\n十\n千\n卅\n升\n午\n卉\n半\n卍\n华\n协\n卑\n卒\n卓\n協\n单\n卖\n南\n単\n博\n卜\n卞\n卟\n占\n卡\n卢\n卤\n卦\n卧\n卫\n卮\n卯\n印\n危\n即\n却\n卵\n卷\n卸\n卻\n卿\n厂\n厄\n厅\n历\n厉\n压\n厌\n厕\n厘\n厚\n厝\n原\n厢\n厥\n厦\n厨\n厩\n厭\n厮\n厲\n厳\n去\n县\n叁\n参\n參\n又\n叉\n及\n友\n双\n反\n収\n发\n叔\n取\n受\n变\n叙\n叛\n叟\n叠\n叡\n叢\n口\n古\n句\n另\n叨\n叩\n只\n叫\n召\n叭\n叮\n可\n台\n叱\n史\n右\n叵\n叶\n号\n司\n叹\n叻\n叼\n叽\n吁\n吃\n各\n吆\n合\n吉\n吊\n吋\n同\n名\n后\n吏\n吐\n向\n吒\n吓\n吕\n吖\n吗\n君\n吝\n吞\n吟\n吠\n吡\n否\n吧\n吨\n吩\n含\n听\n吭\n吮\n启\n吱\n吳\n吴\n吵\n吶\n吸\n吹\n吻\n吼\n吽\n吾\n呀\n呂\n呃\n呆\n呈\n告\n呋\n呎\n呐\n呓\n呕\n呗\n员\n呛\n呜\n呢\n呤\n呦\n周\n呱\n呲\n味\n呵\n呷\n呸\n呻\n呼\n命\n咀\n咁\n咂\n咄\n咆\n咋\n和\n咎\n咏\n咐\n咒\n咔\n咕\n咖\n咗\n咘\n咙\n咚\n咛\n咣\n咤\n咦\n咧\n咨\n咩\n咪\n咫\n咬\n咭\n咯\n咱\n咲\n咳\n咸\n咻\n咽\n咿\n哀\n品\n哂\n哄\n哆\n哇\n哈\n哉\n哋\n哌\n响\n哎\n哏\n哐\n哑\n哒\n哔\n哗\n哟\n員\n哥\n哦\n哧\n哨\n哩\n哪\n哭\n哮\n哲\n哺\n哼\n哽\n唁\n唄\n唆\n唇\n唉\n唏\n唐\n唑\n唔\n唠\n唤\n唧\n唬\n售\n唯\n唰\n唱\n唳\n唷\n唸\n唾\n啃\n啄\n商\n啉\n啊\n問\n啓\n啕\n啖\n啜\n啞\n啟\n啡\n啤\n啥\n啦\n啧\n啪\n啫\n啬\n啮\n啰\n啱\n啲\n啵\n啶\n啷\n啸\n啻\n啼\n啾\n喀\n喂\n喃\n善\n喆\n喇\n喉\n喊\n喋\n喎\n喏\n喔\n喘\n喙\n喚\n喜\n喝\n喟\n喧\n喪\n喫\n喬\n單\n喰\n喱\n喲\n喳\n喵\n営\n喷\n喹\n喺\n喻\n喽\n嗅\n嗆\n嗇\n嗎\n嗑\n嗒\n嗓\n嗔\n嗖\n嗚\n嗜\n嗝\n嗟\n嗡\n嗣\n嗤\n嗦\n嗨\n嗪\n嗬\n嗯\n嗰\n嗲\n嗳\n嗶\n嗷\n嗽\n嘀\n嘅\n嘆\n嘈\n嘉\n嘌\n嘍\n嘎\n嘔\n嘖\n嘗\n嘘\n嘚\n嘛\n嘜\n嘞\n嘟\n嘢\n嘣\n嘤\n嘧\n嘩\n嘭\n嘮\n嘯\n嘰\n嘱\n嘲\n嘴\n嘶\n嘸\n嘹\
n嘻\n嘿\n噁\n噌\n噎\n噓\n噔\n噗\n噙\n噜\n噠\n噢\n噤\n器\n噩\n噪\n噬\n噱\n噴\n噶\n噸\n噹\n噻\n噼\n嚀\n嚇\n嚎\n嚏\n嚐\n嚓\n嚕\n嚟\n嚣\n嚥\n嚨\n嚮\n嚴\n嚷\n嚼\n囂\n囉\n囊\n囍\n囑\n囔\n囗\n囚\n四\n囝\n回\n囟\n因\n囡\n团\n団\n囤\n囧\n囪\n囫\n园\n困\n囱\n囲\n図\n围\n囹\n固\n国\n图\n囿\n圃\n圄\n圆\n圈\n國\n圍\n圏\n園\n圓\n圖\n團\n圜\n土\n圣\n圧\n在\n圩\n圭\n地\n圳\n场\n圻\n圾\n址\n坂\n均\n坊\n坍\n坎\n坏\n坐\n坑\n块\n坚\n坛\n坝\n坞\n坟\n坠\n坡\n坤\n坦\n坨\n坪\n坯\n坳\n坵\n坷\n垂\n垃\n垄\n型\n垒\n垚\n垛\n垠\n垢\n垣\n垦\n垩\n垫\n垭\n垮\n垵\n埂\n埃\n埋\n城\n埔\n埕\n埗\n域\n埠\n埤\n埵\n執\n埸\n培\n基\n埼\n堀\n堂\n堃\n堅\n堆\n堇\n堑\n堕\n堙\n堡\n堤\n堪\n堯\n堰\n報\n場\n堵\n堺\n堿\n塊\n塌\n塑\n塔\n塗\n塘\n塚\n塞\n塢\n塩\n填\n塬\n塭\n塵\n塾\n墀\n境\n墅\n墉\n墊\n墒\n墓\n増\n墘\n墙\n墜\n增\n墟\n墨\n墩\n墮\n墳\n墻\n墾\n壁\n壅\n壆\n壇\n壊\n壑\n壓\n壕\n壘\n壞\n壟\n壢\n壤\n壩\n士\n壬\n壮\n壯\n声\n売\n壳\n壶\n壹\n壺\n壽\n处\n备\n変\n复\n夏\n夔\n夕\n外\n夙\n多\n夜\n够\n夠\n夢\n夥\n大\n天\n太\n夫\n夭\n央\n夯\n失\n头\n夷\n夸\n夹\n夺\n夾\n奂\n奄\n奇\n奈\n奉\n奋\n奎\n奏\n奐\n契\n奔\n奕\n奖\n套\n奘\n奚\n奠\n奢\n奥\n奧\n奪\n奬\n奮\n女\n奴\n奶\n奸\n她\n好\n如\n妃\n妄\n妆\n妇\n妈\n妊\n妍\n妒\n妓\n妖\n妘\n妙\n妝\n妞\n妣\n妤\n妥\n妨\n妩\n妪\n妮\n妲\n妳\n妹\n妻\n妾\n姆\n姉\n姊\n始\n姍\n姐\n姑\n姒\n姓\n委\n姗\n姚\n姜\n姝\n姣\n姥\n姦\n姨\n姪\n姫\n姬\n姹\n姻\n姿\n威\n娃\n娄\n娅\n娆\n娇\n娉\n娑\n娓\n娘\n娛\n娜\n娟\n娠\n娣\n娥\n娩\n娱\n娲\n娴\n娶\n娼\n婀\n婁\n婆\n婉\n婊\n婕\n婚\n婢\n婦\n婧\n婪\n婭\n婴\n婵\n婶\n婷\n婺\n婿\n媒\n媚\n媛\n媞\n媧\n媲\n媳\n媽\n媾\n嫁\n嫂\n嫉\n嫌\n嫑\n嫔\n嫖\n嫘\n嫚\n嫡\n嫣\n嫦\n嫩\n嫲\n嫵\n嫻\n嬅\n嬉\n嬌\n嬗\n嬛\n嬢\n嬤\n嬪\n嬰\n嬴\n嬷\n嬸\n嬿\n孀\n孃\n子\n孑\n孔\n孕\n孖\n字\n存\n孙\n孚\n孛\n孜\n孝\n孟\n孢\n季\n孤\n学\n孩\n孪\n孫\n孬\n孰\n孱\n孳\n孵\n學\n孺\n孽\n孿\n宁\n它\n宅\n宇\n守\n安\n宋\n完\n宏\n宓\n宕\n宗\n官\n宙\n定\n宛\n宜\n宝\n实\n実\n宠\n审\n客\n宣\n室\n宥\n宦\n宪\n宫\n宮\n宰\n害\n宴\n宵\n家\n宸\n容\n宽\n宾\n宿\n寂\n寄\n寅\n密\n寇\n富\n寐\n寒\n寓\n寛\n寝\n寞\n察\n寡\n寢\n寥\n實\n寧\n寨\n審\n寫\n寬\n寮\n寰\n寵\n寶\n寸\n对\n寺\n寻\n导\n対\n寿\n封\n専\n射\n将\n將\n專\n尉\n尊\n尋\n對\n導\n小\n少\n尔\n尕\n尖\n尘\n尚\n尝\n尤\n尧\n尬\n就\n尴\n尷\n尸\n尹\n尺\n尻\n尼\n尽\n尾\n尿\n局\n屁\n层\n屄\n居\n屆\n屈\n屉\n届\n屋\n屌\n屍\n屎\n屏\n屐\n屑\n展\n屜\n属\n屠\n屡\n屢\n層\n履\n屬\n屯\n山\n屹\n屿\n岀\n岁\n岂\n岌\n岐\n岑\n岔\n岖\n岗\n岘\n岙\n岚\n岛\n岡\n岩\n岫\n岬\n岭\n岱\n岳\n岷\n岸\n峇\n峋\n峒\n峙\n峡\n峤\n峥\n峦\n峨\n峪\n峭\n峯\n峰\n峴\n島\n峻\n峽\n崁\n崂\n崆\n崇\n崎\n崑\n崔\n崖\n崗\n崙\n崛\n崧\n崩\n崭\n崴\n崽\n嵇\n嵊\n嵋\n嵌\n嵐\n嵘\n嵩\n嵬\n嵯\n嶂\n嶄\n嶇\n嶋\n嶙\n嶺\n嶼\n嶽\n巅\n巍\n巒\n巔\n巖\n川\n州\n巡\n巢\n工\n左\n巧\n巨\n巩
\n巫\n差\n己\n已\n巳\n巴\n巷\n巻\n巽\n巾\n巿\n币\n市\n布\n帅\n帆\n师\n希\n帐\n帑\n帕\n帖\n帘\n帚\n帛\n帜\n帝\n帥\n带\n帧\n師\n席\n帮\n帯\n帰\n帳\n帶\n帷\n常\n帼\n帽\n幀\n幂\n幄\n幅\n幌\n幔\n幕\n幟\n幡\n幢\n幣\n幫\n干\n平\n年\n并\n幸\n幹\n幺\n幻\n幼\n幽\n幾\n广\n庁\n広\n庄\n庆\n庇\n床\n序\n庐\n库\n应\n底\n庖\n店\n庙\n庚\n府\n庞\n废\n庠\n度\n座\n庫\n庭\n庵\n庶\n康\n庸\n庹\n庾\n廁\n廂\n廃\n廈\n廉\n廊\n廓\n廖\n廚\n廝\n廟\n廠\n廢\n廣\n廬\n廳\n延\n廷\n建\n廿\n开\n弁\n异\n弃\n弄\n弈\n弊\n弋\n式\n弑\n弒\n弓\n弔\n引\n弗\n弘\n弛\n弟\n张\n弥\n弦\n弧\n弩\n弭\n弯\n弱\n張\n強\n弹\n强\n弼\n弾\n彅\n彆\n彈\n彌\n彎\n归\n当\n录\n彗\n彙\n彝\n形\n彤\n彥\n彦\n彧\n彩\n彪\n彫\n彬\n彭\n彰\n影\n彷\n役\n彻\n彼\n彿\n往\n征\n径\n待\n徇\n很\n徉\n徊\n律\n後\n徐\n徑\n徒\n従\n徕\n得\n徘\n徙\n徜\n從\n徠\n御\n徨\n復\n循\n徬\n微\n徳\n徴\n徵\n德\n徹\n徼\n徽\n心\n必\n忆\n忌\n忍\n忏\n忐\n忑\n忒\n忖\n志\n忘\n忙\n応\n忠\n忡\n忤\n忧\n忪\n快\n忱\n念\n忻\n忽\n忿\n怀\n态\n怂\n怅\n怆\n怎\n怏\n怒\n怔\n怕\n怖\n怙\n怜\n思\n怠\n怡\n急\n怦\n性\n怨\n怪\n怯\n怵\n总\n怼\n恁\n恃\n恆\n恋\n恍\n恐\n恒\n恕\n恙\n恚\n恢\n恣\n恤\n恥\n恨\n恩\n恪\n恫\n恬\n恭\n息\n恰\n恳\n恵\n恶\n恸\n恺\n恻\n恼\n恿\n悄\n悅\n悉\n悌\n悍\n悔\n悖\n悚\n悟\n悠\n患\n悦\n您\n悩\n悪\n悬\n悯\n悱\n悲\n悴\n悵\n悶\n悸\n悻\n悼\n悽\n情\n惆\n惇\n惊\n惋\n惑\n惕\n惘\n惚\n惜\n惟\n惠\n惡\n惦\n惧\n惨\n惩\n惫\n惬\n惭\n惮\n惯\n惰\n惱\n想\n惴\n惶\n惹\n惺\n愁\n愆\n愈\n愉\n愍\n意\n愕\n愚\n愛\n愜\n感\n愣\n愤\n愧\n愫\n愷\n愿\n慄\n慈\n態\n慌\n慎\n慑\n慕\n慘\n慚\n慟\n慢\n慣\n慧\n慨\n慫\n慮\n慰\n慳\n慵\n慶\n慷\n慾\n憂\n憊\n憋\n憎\n憐\n憑\n憔\n憚\n憤\n憧\n憨\n憩\n憫\n憬\n憲\n憶\n憾\n懂\n懇\n懈\n應\n懊\n懋\n懑\n懒\n懦\n懲\n懵\n懶\n懷\n懸\n懺\n懼\n懾\n懿\n戀\n戈\n戊\n戌\n戍\n戎\n戏\n成\n我\n戒\n戕\n或\n战\n戚\n戛\n戟\n戡\n戦\n截\n戬\n戮\n戰\n戲\n戳\n戴\n戶\n户\n戸\n戻\n戾\n房\n所\n扁\n扇\n扈\n扉\n手\n才\n扎\n扑\n扒\n打\n扔\n払\n托\n扛\n扣\n扦\n执\n扩\n扪\n扫\n扬\n扭\n扮\n扯\n扰\n扱\n扳\n扶\n批\n扼\n找\n承\n技\n抄\n抉\n把\n抑\n抒\n抓\n投\n抖\n抗\n折\n抚\n抛\n抜\n択\n抟\n抠\n抡\n抢\n护\n报\n抨\n披\n抬\n抱\n抵\n抹\n押\n抽\n抿\n拂\n拄\n担\n拆\n拇\n拈\n拉\n拋\n拌\n拍\n拎\n拐\n拒\n拓\n拔\n拖\n拗\n拘\n拙\n拚\n招\n拜\n拟\n拡\n拢\n拣\n拥\n拦\n拧\n拨\n择\n括\n拭\n拮\n拯\n拱\n拳\n拴\n拷\n拼\n拽\n拾\n拿\n持\n挂\n指\n挈\n按\n挎\n挑\n挖\n挙\n挚\n挛\n挝\n挞\n挟\n挠\n挡\n挣\n挤\n挥\n挨\n挪\n挫\n振\n挲\n挹\n挺\n挽\n挾\n捂\n捅\n捆\n捉\n捋\n捌\n捍\n捎\n捏\n捐\n捕\n捞\n损\n捡\n换\n捣\n捧\n捨\n捩\n据\n捱\n捲\n捶\n捷\n捺\n捻\n掀\n掂\n掃\n掇\n授\n掉\n掌\n掏\n掐\n排\n掖\n掘\n掙\n掛\n掠\n採\n探\n掣\n接\n控\n推\n掩\n措\n掬\n掰\n掲\n掳\n掴\n掷\n掸\n掺\n揀\n揃\n揄\n揆\n揉\n揍\n描\n提\n插\n揖\n揚\n換\n握\n揣\n揩\n揪\n揭\n揮\n援\n揶\n揸\n揹\n揽\n搀\n搁\n搂\n搅\n
損\n搏\n搐\n搓\n搔\n搖\n搗\n搜\n搞\n搡\n搪\n搬\n搭\n搵\n搶\n携\n搽\n摀\n摁\n摄\n摆\n摇\n摈\n摊\n摒\n摔\n摘\n摞\n摟\n摧\n摩\n摯\n摳\n摸\n摹\n摺\n摻\n撂\n撃\n撅\n撇\n撈\n撐\n撑\n撒\n撓\n撕\n撚\n撞\n撤\n撥\n撩\n撫\n撬\n播\n撮\n撰\n撲\n撵\n撷\n撸\n撻\n撼\n撿\n擀\n擁\n擂\n擄\n擅\n擇\n擊\n擋\n操\n擎\n擒\n擔\n擘\n據\n擞\n擠\n擡\n擢\n擦\n擬\n擰\n擱\n擲\n擴\n擷\n擺\n擼\n擾\n攀\n攏\n攒\n攔\n攘\n攙\n攜\n攝\n攞\n攢\n攣\n攤\n攥\n攪\n攫\n攬\n支\n收\n攸\n改\n攻\n放\n政\n故\n效\n敌\n敍\n敎\n敏\n救\n敕\n敖\n敗\n敘\n教\n敛\n敝\n敞\n敢\n散\n敦\n敬\n数\n敲\n整\n敵\n敷\n數\n斂\n斃\n文\n斋\n斌\n斎\n斐\n斑\n斓\n斗\n料\n斛\n斜\n斟\n斡\n斤\n斥\n斧\n斩\n斫\n斬\n断\n斯\n新\n斷\n方\n於\n施\n旁\n旃\n旅\n旋\n旌\n旎\n族\n旖\n旗\n无\n既\n日\n旦\n旧\n旨\n早\n旬\n旭\n旮\n旱\n时\n旷\n旺\n旻\n昀\n昂\n昆\n昇\n昉\n昊\n昌\n明\n昏\n易\n昔\n昕\n昙\n星\n映\n春\n昧\n昨\n昭\n是\n昱\n昴\n昵\n昶\n昼\n显\n晁\n時\n晃\n晉\n晋\n晌\n晏\n晒\n晓\n晔\n晕\n晖\n晗\n晚\n晝\n晞\n晟\n晤\n晦\n晨\n晩\n普\n景\n晰\n晴\n晶\n晷\n智\n晾\n暂\n暄\n暇\n暈\n暉\n暌\n暐\n暑\n暖\n暗\n暝\n暢\n暧\n暨\n暫\n暮\n暱\n暴\n暸\n暹\n曄\n曆\n曇\n曉\n曖\n曙\n曜\n曝\n曠\n曦\n曬\n曰\n曲\n曳\n更\n書\n曹\n曼\n曾\n替\n最\n會\n月\n有\n朋\n服\n朐\n朔\n朕\n朗\n望\n朝\n期\n朦\n朧\n木\n未\n末\n本\n札\n朮\n术\n朱\n朴\n朵\n机\n朽\n杀\n杂\n权\n杆\n杈\n杉\n李\n杏\n材\n村\n杓\n杖\n杜\n杞\n束\n杠\n条\n来\n杨\n杭\n杯\n杰\n東\n杳\n杵\n杷\n杼\n松\n板\n极\n构\n枇\n枉\n枋\n析\n枕\n林\n枚\n果\n枝\n枢\n枣\n枪\n枫\n枭\n枯\n枰\n枱\n枳\n架\n枷\n枸\n柄\n柏\n某\n柑\n柒\n染\n柔\n柘\n柚\n柜\n柞\n柠\n柢\n查\n柩\n柬\n柯\n柱\n柳\n柴\n柵\n査\n柿\n栀\n栃\n栄\n栅\n标\n栈\n栉\n栋\n栎\n栏\n树\n栓\n栖\n栗\n校\n栩\n株\n样\n核\n根\n格\n栽\n栾\n桀\n桁\n桂\n桃\n桅\n框\n案\n桉\n桌\n桎\n桐\n桑\n桓\n桔\n桜\n桠\n桡\n桢\n档\n桥\n桦\n桧\n桨\n桩\n桶\n桿\n梁\n梅\n梆\n梏\n梓\n梗\n條\n梟\n梢\n梦\n梧\n梨\n梭\n梯\n械\n梳\n梵\n梶\n检\n棂\n棄\n棉\n棋\n棍\n棒\n棕\n棗\n棘\n棚\n棟\n棠\n棣\n棧\n森\n棱\n棲\n棵\n棹\n棺\n椁\n椅\n椋\n植\n椎\n椒\n検\n椪\n椭\n椰\n椹\n椽\n椿\n楂\n楊\n楓\n楔\n楚\n楝\n楞\n楠\n楣\n楨\n楫\n業\n楮\n極\n楷\n楸\n楹\n楼\n楽\n概\n榄\n榆\n榈\n榉\n榔\n榕\n榖\n榛\n榜\n榨\n榫\n榭\n榮\n榱\n榴\n榷\n榻\n槁\n槃\n構\n槌\n槍\n槎\n槐\n槓\n様\n槛\n槟\n槤\n槭\n槲\n槳\n槻\n槽\n槿\n樁\n樂\n樊\n樑\n樓\n標\n樞\n樟\n模\n樣\n権\n横\n樫\n樯\n樱\n樵\n樸\n樹\n樺\n樽\n樾\n橄\n橇\n橋\n橐\n橘\n橙\n機\n橡\n橢\n橫\n橱\n橹\n橼\n檀\n檄\n檎\n檐\n檔\n檗\n檜\n檢\n檬\n檯\n檳\n檸\n檻\n櫃\n櫚\n櫛\n櫥\n櫸\n櫻\n欄\n權\n欒\n欖\n欠\n次\n欢\n欣\n欧\n欲\n欸\n欺\n欽\n款\n歆\n歇\n歉\n歌\n歎\n歐\n歓\n歙\n歛\n歡\n止\n正\n此\n步\n武\n歧\n歩\n歪\n歯\n歲\n歳\n歴\n歷\n歸\n歹\n死\n歼\n殁\n殃\n殆\n殇\n殉\n殊\n残\n殒\n殓\n殖\n殘\n殞\n殡\n殤\n殭\n殯\n殲\n殴\n段\n殷\n殺\n殼\n殿\n毀\n毁\n毂\n毅\n毆\
n毋\n母\n毎\n每\n毒\n毓\n比\n毕\n毗\n毘\n毙\n毛\n毡\n毫\n毯\n毽\n氈\n氏\n氐\n民\n氓\n气\n氖\n気\n氙\n氛\n氟\n氡\n氢\n氣\n氤\n氦\n氧\n氨\n氪\n氫\n氮\n氯\n氰\n氲\n水\n氷\n永\n氹\n氾\n汀\n汁\n求\n汆\n汇\n汉\n汎\n汐\n汕\n汗\n汙\n汛\n汝\n汞\n江\n池\n污\n汤\n汨\n汩\n汪\n汰\n汲\n汴\n汶\n汹\n決\n汽\n汾\n沁\n沂\n沃\n沅\n沈\n沉\n沌\n沏\n沐\n沒\n沓\n沖\n沙\n沛\n沟\n没\n沢\n沣\n沥\n沦\n沧\n沪\n沫\n沭\n沮\n沱\n河\n沸\n油\n治\n沼\n沽\n沾\n沿\n況\n泄\n泉\n泊\n泌\n泓\n法\n泗\n泛\n泞\n泠\n泡\n波\n泣\n泥\n注\n泪\n泫\n泮\n泯\n泰\n泱\n泳\n泵\n泷\n泸\n泻\n泼\n泽\n泾\n洁\n洄\n洋\n洒\n洗\n洙\n洛\n洞\n津\n洩\n洪\n洮\n洱\n洲\n洵\n洶\n洸\n洹\n活\n洼\n洽\n派\n流\n浃\n浄\n浅\n浆\n浇\n浊\n测\n济\n浏\n浑\n浒\n浓\n浔\n浙\n浚\n浜\n浣\n浦\n浩\n浪\n浬\n浮\n浯\n浴\n海\n浸\n涂\n涅\n涇\n消\n涉\n涌\n涎\n涓\n涔\n涕\n涙\n涛\n涝\n涞\n涟\n涠\n涡\n涣\n涤\n润\n涧\n涨\n涩\n涪\n涮\n涯\n液\n涵\n涸\n涼\n涿\n淀\n淄\n淅\n淆\n淇\n淋\n淌\n淑\n淒\n淖\n淘\n淙\n淚\n淞\n淡\n淤\n淦\n淨\n淩\n淪\n淫\n淬\n淮\n深\n淳\n淵\n混\n淹\n淺\n添\n淼\n清\n済\n渉\n渊\n渋\n渍\n渎\n渐\n渔\n渗\n渙\n渚\n減\n渝\n渠\n渡\n渣\n渤\n渥\n渦\n温\n測\n渭\n港\n渲\n渴\n游\n渺\n渾\n湃\n湄\n湊\n湍\n湖\n湘\n湛\n湟\n湧\n湫\n湮\n湯\n湳\n湾\n湿\n満\n溃\n溅\n溉\n溏\n源\n準\n溜\n溝\n溟\n溢\n溥\n溧\n溪\n溫\n溯\n溱\n溴\n溶\n溺\n溼\n滁\n滂\n滄\n滅\n滇\n滋\n滌\n滑\n滓\n滔\n滕\n滙\n滚\n滝\n滞\n滟\n满\n滢\n滤\n滥\n滦\n滨\n滩\n滬\n滯\n滲\n滴\n滷\n滸\n滾\n滿\n漁\n漂\n漆\n漉\n漏\n漓\n演\n漕\n漠\n漢\n漣\n漩\n漪\n漫\n漬\n漯\n漱\n漲\n漳\n漸\n漾\n漿\n潆\n潇\n潋\n潍\n潑\n潔\n潘\n潛\n潜\n潞\n潟\n潢\n潤\n潦\n潧\n潭\n潮\n潰\n潴\n潸\n潺\n潼\n澀\n澄\n澆\n澈\n澍\n澎\n澗\n澜\n澡\n澤\n澧\n澱\n澳\n澹\n激\n濁\n濂\n濃\n濑\n濒\n濕\n濘\n濛\n濟\n濠\n濡\n濤\n濫\n濬\n濮\n濯\n濱\n濺\n濾\n瀅\n瀆\n瀉\n瀋\n瀏\n瀑\n瀕\n瀘\n瀚\n瀛\n瀝\n瀞\n瀟\n瀧\n瀨\n瀬\n瀰\n瀾\n灌\n灏\n灑\n灘\n灝\n灞\n灣\n火\n灬\n灭\n灯\n灰\n灵\n灶\n灸\n灼\n災\n灾\n灿\n炀\n炁\n炅\n炉\n炊\n炎\n炒\n炔\n炕\n炖\n炙\n炜\n炫\n炬\n炭\n炮\n炯\n炳\n炷\n炸\n点\n為\n炼\n炽\n烁\n烂\n烃\n烈\n烊\n烏\n烘\n烙\n烛\n烟\n烤\n烦\n烧\n烨\n烩\n烫\n烬\n热\n烯\n烷\n烹\n烽\n焉\n焊\n焕\n焖\n焗\n焘\n焙\n焚\n焜\n無\n焦\n焯\n焰\n焱\n然\n焼\n煅\n煉\n煊\n煌\n煎\n煒\n煖\n煙\n煜\n煞\n煤\n煥\n煦\n照\n煨\n煩\n煮\n煲\n煸\n煽\n熄\n熊\n熏\n熒\n熔\n熙\n熟\n熠\n熨\n熬\n熱\n熵\n熹\n熾\n燁\n燃\n燄\n燈\n燉\n燊\n燎\n燒\n燔\n燕\n燙\n燜\n營\n燥\n燦\n燧\n燭\n燮\n燴\n燻\n燼\n燿\n爆\n爍\n爐\n爛\n爪\n爬\n爭\n爰\n爱\n爲\n爵\n父\n爷\n爸\n爹\n爺\n爻\n爽\n爾\n牆\n片\n版\n牌\n牍\n牒\n牙\n牛\n牝\n牟\n牠\n牡\n牢\n牦\n牧\n物\n牯\n牲\n牴\n牵\n特\n牺\n牽\n犀\n犁\n犄\n犊\n犍\n犒\n犢\n犧\n犬\n犯\n状\n犷\n犸\n犹\n狀\n狂\n狄\n狈\n狎\n狐\n狒\n狗\n狙\n狞\n狠\n狡\n狩\n独\n狭\n狮\n狰\n狱\n狸\n狹\n狼\n狽\n猎\n猕\n猖\n猗\n猙\n猛\n猜\n猝\n猥\n猩\n猪
\n猫\n猬\n献\n猴\n猶\n猷\n猾\n猿\n獄\n獅\n獎\n獐\n獒\n獗\n獠\n獣\n獨\n獭\n獰\n獲\n獵\n獷\n獸\n獺\n獻\n獼\n獾\n玄\n率\n玉\n王\n玑\n玖\n玛\n玟\n玠\n玥\n玩\n玫\n玮\n环\n现\n玲\n玳\n玷\n玺\n玻\n珀\n珂\n珅\n珈\n珉\n珊\n珍\n珏\n珐\n珑\n珙\n珞\n珠\n珣\n珥\n珩\n珪\n班\n珮\n珲\n珺\n現\n球\n琅\n理\n琇\n琉\n琊\n琍\n琏\n琐\n琛\n琢\n琥\n琦\n琨\n琪\n琬\n琮\n琰\n琲\n琳\n琴\n琵\n琶\n琺\n琼\n瑀\n瑁\n瑄\n瑋\n瑕\n瑗\n瑙\n瑚\n瑛\n瑜\n瑞\n瑟\n瑠\n瑣\n瑤\n瑩\n瑪\n瑯\n瑰\n瑶\n瑾\n璀\n璁\n璃\n璇\n璉\n璋\n璎\n璐\n璜\n璞\n璟\n璧\n璨\n環\n璽\n璿\n瓊\n瓏\n瓒\n瓜\n瓢\n瓣\n瓤\n瓦\n瓮\n瓯\n瓴\n瓶\n瓷\n甄\n甌\n甕\n甘\n甙\n甚\n甜\n生\n產\n産\n甥\n甦\n用\n甩\n甫\n甬\n甭\n甯\n田\n由\n甲\n申\n电\n男\n甸\n町\n画\n甾\n畀\n畅\n界\n畏\n畑\n畔\n留\n畜\n畝\n畢\n略\n畦\n番\n畫\n異\n畲\n畳\n畴\n當\n畸\n畹\n畿\n疆\n疇\n疊\n疏\n疑\n疔\n疖\n疗\n疙\n疚\n疝\n疟\n疡\n疣\n疤\n疥\n疫\n疮\n疯\n疱\n疲\n疳\n疵\n疸\n疹\n疼\n疽\n疾\n痂\n病\n症\n痈\n痉\n痊\n痍\n痒\n痔\n痕\n痘\n痙\n痛\n痞\n痠\n痢\n痣\n痤\n痧\n痨\n痪\n痫\n痰\n痱\n痴\n痹\n痺\n痼\n痿\n瘀\n瘁\n瘋\n瘍\n瘓\n瘘\n瘙\n瘟\n瘠\n瘡\n瘢\n瘤\n瘦\n瘧\n瘩\n瘪\n瘫\n瘴\n瘸\n瘾\n療\n癇\n癌\n癒\n癖\n癜\n癞\n癡\n癢\n癣\n癥\n癫\n癬\n癮\n癱\n癲\n癸\n発\n登\n發\n白\n百\n皂\n的\n皆\n皇\n皈\n皋\n皎\n皑\n皓\n皖\n皙\n皚\n皮\n皰\n皱\n皴\n皺\n皿\n盂\n盃\n盅\n盆\n盈\n益\n盎\n盏\n盐\n监\n盒\n盔\n盖\n盗\n盘\n盛\n盜\n盞\n盟\n盡\n監\n盤\n盥\n盧\n盪\n目\n盯\n盱\n盲\n直\n相\n盹\n盼\n盾\n省\n眈\n眉\n看\n県\n眙\n眞\n真\n眠\n眦\n眨\n眩\n眯\n眶\n眷\n眸\n眺\n眼\n眾\n着\n睁\n睇\n睏\n睐\n睑\n睛\n睜\n睞\n睡\n睢\n督\n睥\n睦\n睨\n睪\n睫\n睬\n睹\n睽\n睾\n睿\n瞄\n瞅\n瞇\n瞋\n瞌\n瞎\n瞑\n瞒\n瞓\n瞞\n瞟\n瞠\n瞥\n瞧\n瞩\n瞪\n瞬\n瞭\n瞰\n瞳\n瞻\n瞼\n瞿\n矇\n矍\n矗\n矚\n矛\n矜\n矢\n矣\n知\n矩\n矫\n短\n矮\n矯\n石\n矶\n矽\n矾\n矿\n码\n砂\n砌\n砍\n砒\n研\n砖\n砗\n砚\n砝\n砣\n砥\n砧\n砭\n砰\n砲\n破\n砷\n砸\n砺\n砼\n砾\n础\n硅\n硐\n硒\n硕\n硝\n硫\n硬\n确\n硯\n硼\n碁\n碇\n碉\n碌\n碍\n碎\n碑\n碓\n碗\n碘\n碚\n碛\n碟\n碣\n碧\n碩\n碰\n碱\n碳\n碴\n確\n碼\n碾\n磁\n磅\n磊\n磋\n磐\n磕\n磚\n磡\n磨\n磬\n磯\n磲\n磷\n磺\n礁\n礎\n礙\n礡\n礦\n礪\n礫\n礴\n示\n礼\n社\n祀\n祁\n祂\n祇\n祈\n祉\n祎\n祐\n祕\n祖\n祗\n祚\n祛\n祜\n祝\n神\n祟\n祠\n祢\n祥\n票\n祭\n祯\n祷\n祸\n祺\n祿\n禀\n禁\n禄\n禅\n禍\n禎\n福\n禛\n禦\n禧\n禪\n禮\n禱\n禹\n禺\n离\n禽\n禾\n禿\n秀\n私\n秃\n秆\n秉\n秋\n种\n科\n秒\n秘\n租\n秣\n秤\n秦\n秧\n秩\n秭\n积\n称\n秸\n移\n秽\n稀\n稅\n程\n稍\n税\n稔\n稗\n稚\n稜\n稞\n稟\n稠\n稣\n種\n稱\n稲\n稳\n稷\n稹\n稻\n稼\n稽\n稿\n穀\n穂\n穆\n穌\n積\n穎\n穗\n穢\n穩\n穫\n穴\n究\n穷\n穹\n空\n穿\n突\n窃\n窄\n窈\n窍\n窑\n窒\n窓\n窕\n窖\n窗\n窘\n窜\n窝\n窟\n窠\n窥\n窦\n窨\n窩\n窪\n窮\n窯\n窺\n窿\n竄\n竅\n竇\n竊\n立\n竖\n站\n竜\n竞\n竟\n章\n竣\n童\n竭\n端\n競\n竹\n竺\n竽\n竿\n笃\n笆\n笈\n笋\n笏\n
笑\n笔\n笙\n笛\n笞\n笠\n符\n笨\n第\n笹\n笺\n笼\n筆\n等\n筊\n筋\n筍\n筏\n筐\n筑\n筒\n答\n策\n筛\n筝\n筠\n筱\n筲\n筵\n筷\n筹\n签\n简\n箇\n箋\n箍\n箏\n箐\n箔\n箕\n算\n箝\n管\n箩\n箫\n箭\n箱\n箴\n箸\n節\n篁\n範\n篆\n篇\n築\n篑\n篓\n篙\n篝\n篠\n篡\n篤\n篩\n篪\n篮\n篱\n篷\n簇\n簌\n簍\n簡\n簦\n簧\n簪\n簫\n簷\n簸\n簽\n簾\n簿\n籁\n籃\n籌\n籍\n籐\n籟\n籠\n籤\n籬\n籮\n籲\n米\n类\n籼\n籽\n粄\n粉\n粑\n粒\n粕\n粗\n粘\n粟\n粤\n粥\n粧\n粪\n粮\n粱\n粲\n粳\n粵\n粹\n粼\n粽\n精\n粿\n糅\n糊\n糍\n糕\n糖\n糗\n糙\n糜\n糞\n糟\n糠\n糧\n糬\n糯\n糰\n糸\n系\n糾\n紀\n紂\n約\n紅\n紉\n紊\n紋\n納\n紐\n紓\n純\n紗\n紘\n紙\n級\n紛\n紜\n素\n紡\n索\n紧\n紫\n紮\n累\n細\n紳\n紹\n紺\n終\n絃\n組\n絆\n経\n結\n絕\n絞\n絡\n絢\n給\n絨\n絮\n統\n絲\n絳\n絵\n絶\n絹\n綁\n綏\n綑\n經\n継\n続\n綜\n綠\n綢\n綦\n綫\n綬\n維\n綱\n網\n綴\n綵\n綸\n綺\n綻\n綽\n綾\n綿\n緊\n緋\n総\n緑\n緒\n緘\n線\n緝\n緞\n締\n緣\n編\n緩\n緬\n緯\n練\n緹\n緻\n縁\n縄\n縈\n縛\n縝\n縣\n縫\n縮\n縱\n縴\n縷\n總\n績\n繁\n繃\n繆\n繇\n繋\n織\n繕\n繚\n繞\n繡\n繩\n繪\n繫\n繭\n繳\n繹\n繼\n繽\n纂\n續\n纍\n纏\n纓\n纔\n纖\n纜\n纠\n红\n纣\n纤\n约\n级\n纨\n纪\n纫\n纬\n纭\n纯\n纰\n纱\n纲\n纳\n纵\n纶\n纷\n纸\n纹\n纺\n纽\n纾\n线\n绀\n练\n组\n绅\n细\n织\n终\n绊\n绍\n绎\n经\n绑\n绒\n结\n绔\n绕\n绘\n给\n绚\n绛\n络\n绝\n绞\n统\n绡\n绢\n绣\n绥\n绦\n继\n绩\n绪\n绫\n续\n绮\n绯\n绰\n绳\n维\n绵\n绶\n绷\n绸\n绻\n综\n绽\n绾\n绿\n缀\n缄\n缅\n缆\n缇\n缈\n缉\n缎\n缓\n缔\n缕\n编\n缘\n缙\n缚\n缜\n缝\n缠\n缢\n缤\n缥\n缨\n缩\n缪\n缭\n缮\n缰\n缱\n缴\n缸\n缺\n缽\n罂\n罄\n罌\n罐\n网\n罔\n罕\n罗\n罚\n罡\n罢\n罩\n罪\n置\n罰\n署\n罵\n罷\n罹\n羁\n羅\n羈\n羊\n羌\n美\n羔\n羚\n羞\n羟\n羡\n羣\n群\n羥\n羧\n羨\n義\n羯\n羲\n羸\n羹\n羽\n羿\n翁\n翅\n翊\n翌\n翎\n習\n翔\n翘\n翟\n翠\n翡\n翦\n翩\n翰\n翱\n翳\n翹\n翻\n翼\n耀\n老\n考\n耄\n者\n耆\n耋\n而\n耍\n耐\n耒\n耕\n耗\n耘\n耙\n耦\n耨\n耳\n耶\n耷\n耸\n耻\n耽\n耿\n聂\n聆\n聊\n聋\n职\n聒\n联\n聖\n聘\n聚\n聞\n聪\n聯\n聰\n聲\n聳\n聴\n聶\n職\n聽\n聾\n聿\n肃\n肄\n肅\n肆\n肇\n肉\n肋\n肌\n肏\n肓\n肖\n肘\n肚\n肛\n肝\n肠\n股\n肢\n肤\n肥\n肩\n肪\n肮\n肯\n肱\n育\n肴\n肺\n肽\n肾\n肿\n胀\n胁\n胃\n胄\n胆\n背\n胍\n胎\n胖\n胚\n胛\n胜\n胝\n胞\n胡\n胤\n胥\n胧\n胫\n胭\n胯\n胰\n胱\n胳\n胴\n胶\n胸\n胺\n能\n脂\n脅\n脆\n脇\n脈\n脉\n脊\n脍\n脏\n脐\n脑\n脓\n脖\n脘\n脚\n脛\n脣\n脩\n脫\n脯\n脱\n脲\n脳\n脸\n脹\n脾\n腆\n腈\n腊\n腋\n腌\n腎\n腐\n腑\n腓\n腔\n腕\n腥\n腦\n腩\n腫\n腭\n腮\n腰\n腱\n腳\n腴\n腸\n腹\n腺\n腻\n腼\n腾\n腿\n膀\n膈\n膊\n膏\n膑\n膘\n膚\n膛\n膜\n膝\n膠\n膦\n膨\n膩\n膳\n膺\n膻\n膽\n膾\n膿\n臀\n臂\n臃\n臆\n臉\n臊\n臍\n臓\n臘\n臟\n臣\n臥\n臧\n臨\n自\n臬\n臭\n至\n致\n臺\n臻\n臼\n臾\n舀\n舂\n舅\n舆\n與\n興\n舉\n舊\n舌\n舍\n舎\n舐\n舒\n舔\n舖\n舗\n舛\n舜\n舞\n舟\n航\n舫\n般\n舰\n舱\n舵\n舶\n舷\n舸\n船\n舺\n舾\n艇\n艋\n艘\n艙\
n艦\n艮\n良\n艰\n艱\n色\n艳\n艷\n艹\n艺\n艾\n节\n芃\n芈\n芊\n芋\n芍\n芎\n芒\n芙\n芜\n芝\n芡\n芥\n芦\n芩\n芪\n芫\n芬\n芭\n芮\n芯\n花\n芳\n芷\n芸\n芹\n芻\n芽\n芾\n苁\n苄\n苇\n苋\n苍\n苏\n苑\n苒\n苓\n苔\n苕\n苗\n苛\n苜\n苞\n苟\n苡\n苣\n若\n苦\n苫\n苯\n英\n苷\n苹\n苻\n茁\n茂\n范\n茄\n茅\n茉\n茎\n茏\n茗\n茜\n茧\n茨\n茫\n茬\n茭\n茯\n茱\n茲\n茴\n茵\n茶\n茸\n茹\n茼\n荀\n荃\n荆\n草\n荊\n荏\n荐\n荒\n荔\n荖\n荘\n荚\n荞\n荟\n荠\n荡\n荣\n荤\n荥\n荧\n荨\n荪\n荫\n药\n荳\n荷\n荸\n荻\n荼\n荽\n莅\n莆\n莉\n莊\n莎\n莒\n莓\n莖\n莘\n莞\n莠\n莢\n莧\n莪\n莫\n莱\n莲\n莴\n获\n莹\n莺\n莽\n莿\n菀\n菁\n菅\n菇\n菈\n菊\n菌\n菏\n菓\n菖\n菘\n菜\n菟\n菠\n菡\n菩\n華\n菱\n菲\n菸\n菽\n萁\n萃\n萄\n萊\n萋\n萌\n萍\n萎\n萘\n萝\n萤\n营\n萦\n萧\n萨\n萩\n萬\n萱\n萵\n萸\n萼\n落\n葆\n葉\n著\n葚\n葛\n葡\n董\n葦\n葩\n葫\n葬\n葭\n葯\n葱\n葳\n葵\n葷\n葺\n蒂\n蒋\n蒐\n蒔\n蒙\n蒜\n蒞\n蒟\n蒡\n蒨\n蒲\n蒸\n蒹\n蒻\n蒼\n蒿\n蓁\n蓄\n蓆\n蓉\n蓋\n蓑\n蓓\n蓖\n蓝\n蓟\n蓦\n蓬\n蓮\n蓼\n蓿\n蔑\n蔓\n蔔\n蔗\n蔘\n蔚\n蔡\n蔣\n蔥\n蔫\n蔬\n蔭\n蔵\n蔷\n蔺\n蔻\n蔼\n蔽\n蕁\n蕃\n蕈\n蕉\n蕊\n蕎\n蕙\n蕤\n蕨\n蕩\n蕪\n蕭\n蕲\n蕴\n蕻\n蕾\n薄\n薅\n薇\n薈\n薊\n薏\n薑\n薔\n薙\n薛\n薦\n薨\n薩\n薪\n薬\n薯\n薰\n薹\n藉\n藍\n藏\n藐\n藓\n藕\n藜\n藝\n藤\n藥\n藩\n藹\n藻\n藿\n蘆\n蘇\n蘊\n蘋\n蘑\n蘚\n蘭\n蘸\n蘼\n蘿\n虎\n虏\n虐\n虑\n虔\n處\n虚\n虛\n虜\n虞\n號\n虢\n虧\n虫\n虬\n虱\n虹\n虻\n虽\n虾\n蚀\n蚁\n蚂\n蚊\n蚌\n蚓\n蚕\n蚜\n蚝\n蚣\n蚤\n蚩\n蚪\n蚯\n蚱\n蚵\n蛀\n蛆\n蛇\n蛊\n蛋\n蛎\n蛐\n蛔\n蛙\n蛛\n蛟\n蛤\n蛭\n蛮\n蛰\n蛳\n蛹\n蛻\n蛾\n蜀\n蜂\n蜃\n蜆\n蜇\n蜈\n蜊\n蜍\n蜒\n蜓\n蜕\n蜗\n蜘\n蜚\n蜜\n蜡\n蜢\n蜥\n蜱\n蜴\n蜷\n蜻\n蜿\n蝇\n蝈\n蝉\n蝌\n蝎\n蝕\n蝗\n蝙\n蝟\n蝠\n蝦\n蝨\n蝴\n蝶\n蝸\n蝼\n螂\n螃\n融\n螞\n螢\n螨\n螯\n螳\n螺\n蟀\n蟄\n蟆\n蟋\n蟎\n蟑\n蟒\n蟠\n蟬\n蟲\n蟹\n蟻\n蟾\n蠅\n蠍\n蠔\n蠕\n蠛\n蠟\n蠡\n蠢\n蠣\n蠱\n蠶\n蠹\n蠻\n血\n衄\n衅\n衆\n行\n衍\n術\n衔\n街\n衙\n衛\n衝\n衞\n衡\n衢\n衣\n补\n表\n衩\n衫\n衬\n衮\n衰\n衲\n衷\n衹\n衾\n衿\n袁\n袂\n袄\n袅\n袈\n袋\n袍\n袒\n袖\n袜\n袞\n袤\n袪\n被\n袭\n袱\n裁\n裂\n装\n裆\n裊\n裏\n裔\n裕\n裘\n裙\n補\n裝\n裟\n裡\n裤\n裨\n裱\n裳\n裴\n裸\n裹\n製\n裾\n褂\n複\n褐\n褒\n褓\n褔\n褚\n褥\n褪\n褫\n褲\n褶\n褻\n襁\n襄\n襟\n襠\n襪\n襬\n襯\n襲\n西\n要\n覃\n覆\n覇\n見\n規\n覓\n視\n覚\n覦\n覧\n親\n覬\n観\n覷\n覺\n覽\n觀\n见\n观\n规\n觅\n视\n览\n觉\n觊\n觎\n觐\n觑\n角\n觞\n解\n觥\n触\n觸\n言\n訂\n計\n訊\n討\n訓\n訕\n訖\n託\n記\n訛\n訝\n訟\n訣\n訥\n訪\n設\n許\n訳\n訴\n訶\n診\n註\n証\n詆\n詐\n詔\n評\n詛\n詞\n詠\n詡\n詢\n詣\n試\n詩\n詫\n詬\n詭\n詮\n詰\n話\n該\n詳\n詹\n詼\n誅\n誇\n誉\n誌\n認\n誓\n誕\n誘\n語\n誠\n誡\n誣\n誤\n誥\n誦\n誨\n說\n説\n読\n誰\n課\n誹\n誼\n調\n諄\n談\n請\n諏\n諒\n論\n諗\n諜\n諡\n諦\n諧\n諫\n諭\n諮\n諱\n諳\n諷\n諸\n諺\n諾\n謀\n謁\n謂\n謄\n謊\n謎\n謐\n謔\n謗\n謙\n講\n謝\n謠
\n謨\n謬\n謹\n謾\n譁\n證\n譎\n譏\n識\n譙\n譚\n譜\n警\n譬\n譯\n議\n譲\n譴\n護\n譽\n讀\n變\n讓\n讚\n讞\n计\n订\n认\n讥\n讧\n讨\n让\n讪\n讫\n训\n议\n讯\n记\n讲\n讳\n讴\n讶\n讷\n许\n讹\n论\n讼\n讽\n设\n访\n诀\n证\n诃\n评\n诅\n识\n诈\n诉\n诊\n诋\n词\n诏\n译\n试\n诗\n诘\n诙\n诚\n诛\n话\n诞\n诟\n诠\n诡\n询\n诣\n诤\n该\n详\n诧\n诩\n诫\n诬\n语\n误\n诰\n诱\n诲\n说\n诵\n诶\n请\n诸\n诺\n读\n诽\n课\n诿\n谀\n谁\n调\n谄\n谅\n谆\n谈\n谊\n谋\n谌\n谍\n谎\n谏\n谐\n谑\n谒\n谓\n谔\n谕\n谗\n谘\n谙\n谚\n谛\n谜\n谟\n谢\n谣\n谤\n谥\n谦\n谧\n谨\n谩\n谪\n谬\n谭\n谯\n谱\n谲\n谴\n谶\n谷\n豁\n豆\n豇\n豈\n豉\n豊\n豌\n豎\n豐\n豔\n豚\n象\n豢\n豪\n豫\n豬\n豹\n豺\n貂\n貅\n貌\n貓\n貔\n貘\n貝\n貞\n負\n財\n貢\n貧\n貨\n販\n貪\n貫\n責\n貯\n貰\n貳\n貴\n貶\n買\n貸\n費\n貼\n貽\n貿\n賀\n賁\n賂\n賃\n賄\n資\n賈\n賊\n賑\n賓\n賜\n賞\n賠\n賡\n賢\n賣\n賤\n賦\n質\n賬\n賭\n賴\n賺\n購\n賽\n贅\n贈\n贊\n贍\n贏\n贓\n贖\n贛\n贝\n贞\n负\n贡\n财\n责\n贤\n败\n账\n货\n质\n贩\n贪\n贫\n贬\n购\n贮\n贯\n贰\n贱\n贲\n贴\n贵\n贷\n贸\n费\n贺\n贻\n贼\n贾\n贿\n赁\n赂\n赃\n资\n赅\n赈\n赊\n赋\n赌\n赎\n赏\n赐\n赓\n赔\n赖\n赘\n赚\n赛\n赝\n赞\n赠\n赡\n赢\n赣\n赤\n赦\n赧\n赫\n赭\n走\n赳\n赴\n赵\n赶\n起\n趁\n超\n越\n趋\n趕\n趙\n趟\n趣\n趨\n足\n趴\n趵\n趸\n趺\n趾\n跃\n跄\n跆\n跋\n跌\n跎\n跑\n跖\n跚\n跛\n距\n跟\n跡\n跤\n跨\n跩\n跪\n路\n跳\n践\n跷\n跹\n跺\n跻\n踉\n踊\n踌\n踏\n踐\n踝\n踞\n踟\n踢\n踩\n踪\n踮\n踱\n踴\n踵\n踹\n蹂\n蹄\n蹇\n蹈\n蹉\n蹊\n蹋\n蹑\n蹒\n蹙\n蹟\n蹣\n蹤\n蹦\n蹩\n蹬\n蹭\n蹲\n蹴\n蹶\n蹺\n蹼\n蹿\n躁\n躇\n躉\n躊\n躋\n躍\n躏\n躪\n身\n躬\n躯\n躲\n躺\n軀\n車\n軋\n軌\n軍\n軒\n軟\n転\n軸\n軼\n軽\n軾\n較\n載\n輒\n輓\n輔\n輕\n輛\n輝\n輟\n輩\n輪\n輯\n輸\n輻\n輾\n輿\n轄\n轅\n轆\n轉\n轍\n轎\n轟\n车\n轧\n轨\n轩\n转\n轭\n轮\n软\n轰\n轲\n轴\n轶\n轻\n轼\n载\n轿\n较\n辄\n辅\n辆\n辇\n辈\n辉\n辊\n辍\n辐\n辑\n输\n辕\n辖\n辗\n辘\n辙\n辛\n辜\n辞\n辟\n辣\n辦\n辨\n辩\n辫\n辭\n辮\n辯\n辰\n辱\n農\n边\n辺\n辻\n込\n辽\n达\n迁\n迂\n迄\n迅\n过\n迈\n迎\n运\n近\n返\n还\n这\n进\n远\n违\n连\n迟\n迢\n迤\n迥\n迦\n迩\n迪\n迫\n迭\n述\n迴\n迷\n迸\n迹\n迺\n追\n退\n送\n适\n逃\n逅\n逆\n选\n逊\n逍\n透\n逐\n递\n途\n逕\n逗\n這\n通\n逛\n逝\n逞\n速\n造\n逢\n連\n逮\n週\n進\n逵\n逶\n逸\n逻\n逼\n逾\n遁\n遂\n遅\n遇\n遊\n運\n遍\n過\n遏\n遐\n遑\n遒\n道\n達\n違\n遗\n遙\n遛\n遜\n遞\n遠\n遢\n遣\n遥\n遨\n適\n遭\n遮\n遲\n遴\n遵\n遶\n遷\n選\n遺\n遼\n遽\n避\n邀\n邁\n邂\n邃\n還\n邇\n邈\n邊\n邋\n邏\n邑\n邓\n邕\n邛\n邝\n邢\n那\n邦\n邨\n邪\n邬\n邮\n邯\n邰\n邱\n邳\n邵\n邸\n邹\n邺\n邻\n郁\n郅\n郊\n郎\n郑\n郜\n郝\n郡\n郢\n郤\n郦\n郧\n部\n郫\n郭\n郴\n郵\n郷\n郸\n都\n鄂\n鄉\n鄒\n鄔\n鄙\n鄞\n鄢\n鄧\n鄭\n鄰\n鄱\n鄲\n鄺\n酉\n酊\n酋\n酌\n配\n酐\n酒\n酗\n酚\n酝\n酢\n酣\n酥\n酩\n酪\n酬\n酮\n酯\n酰\n酱\n酵\n酶\n酷\n酸\n酿\n醃\n醇\n醉\n醋\n醍\n醐\n醒\n醚\n
醛\n醜\n醞\n醣\n醪\n醫\n醬\n醮\n醯\n醴\n醺\n釀\n釁\n采\n釉\n释\n釋\n里\n重\n野\n量\n釐\n金\n釗\n釘\n釜\n針\n釣\n釦\n釧\n釵\n鈀\n鈉\n鈍\n鈎\n鈔\n鈕\n鈞\n鈣\n鈦\n鈪\n鈴\n鈺\n鈾\n鉀\n鉄\n鉅\n鉉\n鉑\n鉗\n鉚\n鉛\n鉤\n鉴\n鉻\n銀\n銃\n銅\n銑\n銓\n銖\n銘\n銜\n銬\n銭\n銮\n銳\n銷\n銹\n鋁\n鋅\n鋒\n鋤\n鋪\n鋰\n鋸\n鋼\n錄\n錐\n錘\n錚\n錠\n錢\n錦\n錨\n錫\n錮\n錯\n録\n錳\n錶\n鍊\n鍋\n鍍\n鍛\n鍥\n鍰\n鍵\n鍺\n鍾\n鎂\n鎊\n鎌\n鎏\n鎔\n鎖\n鎗\n鎚\n鎧\n鎬\n鎮\n鎳\n鏈\n鏖\n鏗\n鏘\n鏞\n鏟\n鏡\n鏢\n鏤\n鏽\n鐘\n鐮\n鐲\n鐳\n鐵\n鐸\n鐺\n鑄\n鑊\n鑑\n鑒\n鑣\n鑫\n鑰\n鑲\n鑼\n鑽\n鑾\n鑿\n针\n钉\n钊\n钎\n钏\n钒\n钓\n钗\n钙\n钛\n钜\n钝\n钞\n钟\n钠\n钡\n钢\n钣\n钤\n钥\n钦\n钧\n钨\n钩\n钮\n钯\n钰\n钱\n钳\n钴\n钵\n钺\n钻\n钼\n钾\n钿\n铀\n铁\n铂\n铃\n铄\n铅\n铆\n铉\n铎\n铐\n铛\n铜\n铝\n铠\n铡\n铢\n铣\n铤\n铨\n铩\n铬\n铭\n铮\n铰\n铲\n铵\n银\n铸\n铺\n链\n铿\n销\n锁\n锂\n锄\n锅\n锆\n锈\n锉\n锋\n锌\n锏\n锐\n锑\n错\n锚\n锟\n锡\n锢\n锣\n锤\n锥\n锦\n锭\n键\n锯\n锰\n锲\n锵\n锹\n锺\n锻\n镀\n镁\n镂\n镇\n镉\n镌\n镍\n镐\n镑\n镕\n镖\n镗\n镛\n镜\n镣\n镭\n镯\n镰\n镳\n镶\n長\n长\n門\n閃\n閉\n開\n閎\n閏\n閑\n閒\n間\n閔\n閘\n閡\n関\n閣\n閥\n閨\n閩\n閱\n閲\n閹\n閻\n閾\n闆\n闇\n闊\n闌\n闍\n闔\n闕\n闖\n闘\n關\n闡\n闢\n门\n闪\n闫\n闭\n问\n闯\n闰\n闲\n间\n闵\n闷\n闸\n闹\n闺\n闻\n闽\n闾\n阀\n阁\n阂\n阅\n阆\n阇\n阈\n阉\n阎\n阐\n阑\n阔\n阕\n阖\n阙\n阚\n阜\n队\n阡\n阪\n阮\n阱\n防\n阳\n阴\n阵\n阶\n阻\n阿\n陀\n陂\n附\n际\n陆\n陇\n陈\n陋\n陌\n降\n限\n陕\n陛\n陝\n陞\n陟\n陡\n院\n陣\n除\n陨\n险\n陪\n陰\n陲\n陳\n陵\n陶\n陷\n陸\n険\n陽\n隅\n隆\n隈\n隊\n隋\n隍\n階\n随\n隐\n隔\n隕\n隘\n隙\n際\n障\n隠\n隣\n隧\n隨\n險\n隱\n隴\n隶\n隸\n隻\n隼\n隽\n难\n雀\n雁\n雄\n雅\n集\n雇\n雉\n雋\n雌\n雍\n雎\n雏\n雑\n雒\n雕\n雖\n雙\n雛\n雜\n雞\n離\n難\n雨\n雪\n雯\n雰\n雲\n雳\n零\n雷\n雹\n電\n雾\n需\n霁\n霄\n霆\n震\n霈\n霉\n霊\n霍\n霎\n霏\n霑\n霓\n霖\n霜\n霞\n霧\n霭\n霰\n露\n霸\n霹\n霽\n霾\n靂\n靄\n靈\n青\n靓\n靖\n静\n靚\n靛\n靜\n非\n靠\n靡\n面\n靥\n靦\n革\n靳\n靴\n靶\n靼\n鞅\n鞋\n鞍\n鞏\n鞑\n鞘\n鞠\n鞣\n鞦\n鞭\n韆\n韋\n韌\n韓\n韜\n韦\n韧\n韩\n韬\n韭\n音\n韵\n韶\n韻\n響\n頁\n頂\n頃\n項\n順\n須\n頌\n預\n頑\n頒\n頓\n頗\n領\n頜\n頡\n頤\n頫\n頭\n頰\n頷\n頸\n頹\n頻\n頼\n顆\n題\n額\n顎\n顏\n顔\n願\n顛\n類\n顧\n顫\n顯\n顱\n顴\n页\n顶\n顷\n项\n顺\n须\n顼\n顽\n顾\n顿\n颁\n颂\n预\n颅\n领\n颇\n颈\n颉\n颊\n颌\n颍\n颐\n频\n颓\n颔\n颖\n颗\n题\n颚\n颛\n颜\n额\n颞\n颠\n颡\n颢\n颤\n颦\n颧\n風\n颯\n颱\n颳\n颶\n颼\n飄\n飆\n风\n飒\n飓\n飕\n飘\n飙\n飚\n飛\n飞\n食\n飢\n飨\n飩\n飪\n飯\n飲\n飼\n飽\n飾\n餃\n餅\n餉\n養\n餌\n餐\n餒\n餓\n餘\n餚\n餛\n餞\n餡\n館\n餮\n餵\n餾\n饅\n饈\n饋\n饌\n饍\n饑\n饒\n饕\n饗\n饞\n饥\n饨\n饪\n饬\n饭\n饮\n饯\n饰\n饱\n饲\n饴\n饵\n饶\n饷\n饺\n饼\n饽\n饿\n馀\n馁\n馄\n馅\n馆\n馈\n馋\n馍\n馏\n馒\n馔\n首\n馗\n香\
n馥\n馨\n馬\n馭\n馮\n馳\n馴\n駁\n駄\n駅\n駆\n駐\n駒\n駕\n駛\n駝\n駭\n駱\n駿\n騁\n騎\n騏\n験\n騙\n騨\n騰\n騷\n驀\n驅\n驊\n驍\n驒\n驕\n驗\n驚\n驛\n驟\n驢\n驥\n马\n驭\n驮\n驯\n驰\n驱\n驳\n驴\n驶\n驷\n驸\n驹\n驻\n驼\n驾\n驿\n骁\n骂\n骄\n骅\n骆\n骇\n骈\n骊\n骋\n验\n骏\n骐\n骑\n骗\n骚\n骛\n骜\n骞\n骠\n骡\n骤\n骥\n骧\n骨\n骯\n骰\n骶\n骷\n骸\n骼\n髂\n髅\n髋\n髏\n髒\n髓\n體\n髖\n高\n髦\n髪\n髮\n髯\n髻\n鬃\n鬆\n鬍\n鬓\n鬚\n鬟\n鬢\n鬣\n鬥\n鬧\n鬱\n鬼\n魁\n魂\n魄\n魅\n魇\n魍\n魏\n魔\n魘\n魚\n魯\n魷\n鮑\n鮨\n鮪\n鮭\n鮮\n鯉\n鯊\n鯖\n鯛\n鯨\n鯰\n鯽\n鰍\n鰓\n鰭\n鰲\n鰻\n鰾\n鱈\n鱉\n鱔\n鱗\n鱷\n鱸\n鱼\n鱿\n鲁\n鲈\n鲍\n鲑\n鲛\n鲜\n鲟\n鲢\n鲤\n鲨\n鲫\n鲱\n鲲\n鲶\n鲷\n鲸\n鳃\n鳄\n鳅\n鳌\n鳍\n鳕\n鳖\n鳗\n鳝\n鳞\n鳥\n鳩\n鳳\n鳴\n鳶\n鴉\n鴕\n鴛\n鴦\n鴨\n鴻\n鴿\n鵑\n鵜\n鵝\n鵡\n鵬\n鵰\n鵲\n鶘\n鶩\n鶯\n鶴\n鷗\n鷲\n鷹\n鷺\n鸚\n鸞\n鸟\n鸠\n鸡\n鸢\n鸣\n鸥\n鸦\n鸨\n鸪\n鸭\n鸯\n鸳\n鸵\n鸽\n鸾\n鸿\n鹂\n鹃\n鹄\n鹅\n鹈\n鹉\n鹊\n鹌\n鹏\n鹑\n鹕\n鹘\n鹜\n鹞\n鹤\n鹦\n鹧\n鹫\n鹭\n鹰\n鹳\n鹵\n鹹\n鹼\n鹽\n鹿\n麂\n麋\n麒\n麓\n麗\n麝\n麟\n麥\n麦\n麩\n麴\n麵\n麸\n麺\n麻\n麼\n麽\n麾\n黃\n黄\n黍\n黎\n黏\n黑\n黒\n黔\n默\n黛\n黜\n黝\n點\n黠\n黨\n黯\n黴\n鼋\n鼎\n鼐\n鼓\n鼠\n鼬\n鼹\n鼻\n鼾\n齁\n齊\n齋\n齐\n齒\n齡\n齢\n齣\n齦\n齿\n龄\n龅\n龈\n龊\n龋\n龌\n龍\n龐\n龔\n龕\n龙\n龚\n龛\n龜\n龟\n︰\n︱\n︶\n︿\n﹁\n﹂\n﹍\n﹏\n﹐\n﹑\n﹒\n﹔\n﹕\n﹖\n﹗\n﹙\n﹚\n﹝\n﹞\n﹡\n﹣\n！\n＂\n＃\n＄\n％\n＆\n＇\n（\n）\n＊\n＋\n，\n－\n．\n／\n０\n１\n２\n３\n４\n５\n６\n７\n８\n９\n：\n；\n＜\n＝\n＞\n？\n＠\n［\n＼\n］\n＾\n＿\n｀\nａ\nｂ\nｃ\nｄ\nｅ\nｆ\nｇ\nｈ\nｉ\nｊ\nｋ\nｌ\nｍ\nｎ\nｏ\nｐ\nｑ\nｒ\nｓ\nｔ\nｕ\nｖ\nｗ\nｘ\nｙ\nｚ\n｛\n｜\n｝\n～\n｡\n｢\n｣\n､\n･\nｯ\nｰ\nｲ\nｸ\nｼ\nｽ\nﾄ\nﾉ\nﾌ\nﾗ\nﾙ\nﾝ\nﾞ\nﾟ\n￣\n￥\n👍\n🔥\n😂\n😎\n...\nyam\n10\n2017\n12\n11\n2016\n20\n30\n15\n06\nlofter\n##s\n2015\nby\n16\n14\n18\n13\n24\n17\n2014\n21\n##0\n22\n19\n25\n23\ncom\n100\n00\n05\n2013\n##a\n03\n09\n08\n28\n##2\n50\n01\n04\n##1\n27\n02\n2012\n##3\n26\n##e\n07\n##8\n##5\n##6\n##4\n##9\n##7\n29\n2011\n40\n##t\n2010\n##o\n##d\n##i\n2009\n##n\napp\nwww\nthe\n##m\n31\n##c\n##l\n##y\n##r\n##g\n2008\n60\nhttp\n200\nqq\n##p\n80\n##f\ngoogle\npixnet\n90\ncookies\ntripadvisor\n500\n##er\n##k\n35\n##h\nfacebook\n2007\n2000\n70\n##b\nof\n##x\n##u\n45\n300\niphone\n32\n1000\n2006\n48\nip\n36\nin\n38\n3d\n##w\n##ing\n55\nctrip\n##on\n##v\n33\n##の\nto\n34\n400\nid\n2005\nit\n37\nwindows\nllc\ntop\n99\n42\n39\n000\nled\nat\n##an\n41\n51\n52\n46\n49\n43
\n53\n44\n##z\nandroid\n58\nand\n59\n2004\n56\nvr\n##か\n5000\n2003\n47\nblogthis\ntwitter\n54\n##le\n150\nok\n2018\n57\n75\ncn\nno\nios\n##in\n##mm\n##00\n800\non\nte\n3000\n65\n2001\n360\n95\nig\nlv\n120\n##ng\n##を\n##us\n##に\npc\nてす\n──\n600\n##te\n85\n2002\n88\n##ed\nhtml\nncc\nwifi\nemail\n64\nblog\nis\n##10\n##て\nmail\nonline\n##al\ndvd\n##ic\nstudio\n##は\n##℃\n##ia\n##と\nline\nvip\n72\n##q\n98\n##ce\n##en\nfor\n##is\n##ra\n##es\n##j\nusb\nnet\ncp\n1999\nasia\n4g\n##cm\ndiy\nnew\n3c\n##お\nta\n66\nlanguage\nvs\napple\ntw\n86\nweb\n##ne\nipad\n62\nyou\n##re\n101\n68\n##tion\nps\nde\nbt\npony\natm\n##2017\n1998\n67\n##ch\nceo\n##or\ngo\n##na\nav\npro\ncafe\n96\npinterest\n97\n63\npixstyleme3c\n##ta\nmore\nsaid\n##2016\n1997\nmp3\n700\n##ll\nnba\njun\n##20\n92\ntv\n1995\npm\n61\n76\nnbsp\n250\n##ie\nlinux\n##ma\ncd\n110\nhd\n##17\n78\n##ion\n77\n6000\nam\n##th\n##st\n94\n##se\n##et\n69\n180\ngdp\nmy\n105\n81\nabc\n89\nflash\n79\none\n93\n1990\n1996\n##ck\ngps\n##も\n##ly\nweb885\n106\n2020\n91\n##ge\n4000\n1500\nxd\nboss\nisbn\n1994\norg\n##ry\nme\nlove\n##11\n0fork\n73\n##12\n3g\n##ter\n##ar\n71\n82\n##la\nhotel\n130\n1970\npk\n83\n87\n140\nie\n##os\n##30\n##el\n74\n##50\nseo\ncpu\n##ml\np2p\n84\nmay\n##る\nsun\ntue\ninternet\ncc\nposted\nyoutube\n##at\n##ン\n##man\nii\n##ル\n##15\nabs\nnt\npdf\nyahoo\nago\n1980\n##it\nnews\nmac\n104\n##てす\n##me\n##り\njava\n1992\nspa\n##de\n##nt\nhk\nall\nplus\nla\n1993\n##mb\n##16\n##ve\nwest\n##da\n160\nair\n##い\n##ps\nから\n##to\n1989\nlogo\nhtc\nphp\nhttps\nfi\nmomo\n##son\nsat\n##ke\n##80\nebd\nsuv\nwi\nday\napk\n##88\n##um\nmv\ngalaxy\nwiki\nor\nbrake\n##ス\n1200\nする\nthis\n1991\nmon\n##こ\n❤2017\npo\n##ない\njavascript\nlife\nhome\njune\n##ss\nsystem\n900\n##ー\n##０\npp\n1988\nworld\nfb\n4k\nbr\n##as\nic\nai\nleonardo\nsafari\n##60\nlive\nfree\nxx\nwed\nwin7\nkiehl\n##co\nlg\no2o\n##go\nus\n235\n1949\nmm\nしい\nvfm\nkanye\n##90\n##2015\n##id\njr\n##ey\n123\nrss\n##sa\n##ro\n##am\n##no\nthu\nfri\n350\n##sh\n##ki\n103\ncomments\nname\n##の
て\n##pe\n##ine\nmax\n1987\n8000\nuber\n##mi\n##ton\nwordpress\noffice\n1986\n1985\n##ment\n107\nbd\nwin10\n##ld\n##li\ngmail\nbb\ndior\n##rs\n##ri\n##rd\n##ます\nup\ncad\n##®\ndr\nして\nread\n##21\nをお\n##io\n##99\nurl\n1984\npvc\npaypal\nshow\npolicy\n##40\n##ty\n##18\nwith\n##★\n##01\ntxt\n102\n##ba\ndna\nfrom\npost\nmini\nar\ntaiwan\njohn\n##ga\nprivacy\nagoda\n##13\n##ny\nword\n##24\n##22\n##by\n##ur\n##hz\n1982\n##ang\n265\ncookie\nnetscape\n108\n##ka\n##～\n##ad\nhouse\nshare\nnote\nibm\ncode\nhello\nnike\nsim\nsurvey\n##016\n1979\n1950\nwikia\n##32\n##017\n5g\ncbc\n##tor\n##kg\n1983\n##rt\n##14\ncampaign\nstore\n2500\nos\n##ct\n##ts\n##°\n170\napi\n##ns\n365\nexcel\n##な\n##ao\n##ら\n##し\n～～\n##nd\nuniversity\n163\nには\n518\n##70\n##ya\n##il\n##25\npierre\nipo\n0020\n897\n##23\nhotels\n##ian\nのお\n125\nyears\n6606\n##ers\n##26\nhigh\n##day\ntime\n##ay\nbug\n##line\n##く\n##す\n##be\nxp\ntalk2yam\nyamservice\n10000\ncoco\n##dy\nsony\n##ies\n1978\nmicrosoft\ndavid\npeople\n##ha\n1960\ninstagram\nintel\nその\n##ot\niso\n1981\n##va\n115\n##mo\n##land\nxxx\nman\nco\nltxsw\n##ation\nbaby\n220\n##pa\n##ol\n1945\n7000\ntag\n450\n##ue\nmsn\n##31\noppo\n##ト\n##ca\ncontrol\n##om\nst\nchrome\n##ure\n##ん\nbe\n##き\nlol\n##19\nした\n##bo\n240\nlady\n##100\n##way\n##から\n4600\n##ko\n##do\n##un\n4s\ncorporation\n168\n##ni\nherme\n##28\nｃｐ\n978\n##up\n##06\nui\n##ds\nppt\nadmin\nthree\nします\nbbc\nre\n128\n##48\nca\n##015\n##35\nhp\n##ee\ntpp\n##た\n##ive\n××\nroot\n##cc\n##ました\n##ble\n##ity\nadobe\npark\n114\net\noled\ncity\n##ex\n##ler\n##ap\nchina\n##book\n20000\nview\n##ice\nglobal\n##km\nyour\nhong\n##mg\nout\n##ms\nng\nebay\n##29\nmenu\nubuntu\n##cy\nrom\n##view\nopen\nktv\ndo\nserver\n##lo\nif\nenglish\n##ね\n##５\n##oo\n1600\n##02\nstep1\nkong\nclub\n135\njuly\ninc\n1976\nmr\nhi\n##net\ntouch\n##ls\n##ii\nmichael\nlcd\n##05\n##33\nphone\njames\nstep2\n1300\nios9\n##box\ndc\n##２\n##ley\nsamsung\n111\n280\npokemon\ncss\n##ent\n##les\nいいえ\n##１\ns8\natom\nplay\nbmw\n##said\nsa\netf\nctrl\n♥yoyo
♥\n##55\n2025\n##2014\n##66\nadidas\namazon\n1958\n##ber\n##ner\nvisa\n##77\n##der\n1800\nconnectivity\n##hi\nfirefox\n109\n118\nhr\nso\nstyle\nmark\npop\nol\nskip\n1975\nas\n##27\n##ir\n##61\n190\nmba\n##う\n##ai\nle\n##ver\n1900\ncafe2017\nlte\nsuper\n113\n129\n##ron\namd\nlike\n##☆\nare\n##ster\nwe\n##sk\npaul\ndata\ninternational\n##ft\nlongchamp\nssd\ngood\n##ート\n##ti\nreply\n##my\n↓↓↓\napr\nstar\n##ker\nsource\n136\njs\n112\nget\nforce\nphoto\n##one\n126\n##2013\n##ow\nlink\nbbs\n1972\ngoods\n##lin\npython\n119\n##ip\ngame\n##ics\n##ません\nblue\n##●\n520\n##45\npage\nitunes\n##03\n1955\n260\n1968\ngt\ngif\n618\n##ff\n##47\ngroup\nくたさい\nabout\nbar\nganji\n##nce\nmusic\nlee\nnot\n1977\n1971\n1973\n##per\nan\nfaq\ncomment\n##って\ndays\n##ock\n116\n##bs\n1974\n1969\nv1\nplayer\n1956\nxbox\nsql\nfm\nf1\n139\n##ah\n210\n##lv\n##mp\n##000\nmelody\n1957\n##３\n550\n17life\n199\n1966\nxml\nmarket\n##au\n##71\n999\n##04\nwhat\ngl\n##95\n##age\ntips\n##68\nbook\n##ting\nmysql\ncan\n1959\n230\n##ung\nwonderland\nwatch\n10℃\n##ction\n9000\nmar\nmobile\n1946\n1962\narticle\n##db\npart\n▲top\nparty\nって\n1967\n1964\n1948\n##07\n##ore\n##op\nこの\ndj\n##78\n##38\n010\nmain\n225\n1965\n##ong\nart\n320\nad\n134\n020\n##73\n117\npm2\njapan\n228\n##08\nts\n1963\n##ica\nder\nsm\n##36\n2019\n##wa\nct\n##７\n##や\n##64\n1937\nhomemesh\nsearch\n##85\n##れは\n##tv\n##di\nmacbook\n##９\n##くたさい\nservice\n##♥\ntype\nった\n750\n##ier\n##si\n##75\n##います\n##ok\nbest\n##ット\ngoris\nlock\n##った\ncf\n3m\nbig\n##ut\nftp\ncarol\n##vi\n１０\n1961\nhappy\nsd\n##ac\n122\nanti\npe\ncnn\niii\n1920\n138\n##ラ\n1940\nesp\njan\ntags\n##98\n##51\naugust\nvol\n##86\n154\n##™\n##fs\n##れ\n##sion\ndesign\nac\n##ム\npress\njordan\nppp\nthat\nkey\ncheck\n##６\n##tt\n##㎡\n1080p\n##lt\npower\n##42\n1952\n##bc\nvivi\n##ック\nhe\n133\n121\njpg\n##rry\n201\n175\n3500\n1947\nnb\n##ted\n##rn\nしています\n1954\nusd\n##t00\nmaster\n##ンク\n001\nmodel\n##58\nal\n##09\n1953\n##34\nram\ngoo\nても\n##ui\n127\n1930\nred\n##ary\nrpg\nitem\n##pm\n##41\n270\n
##za\nproject\n##2012\nhot\ntd\nblogabstract\n##ger\n##62\n650\n##44\ngr2\n##します\n##ｍ\nblack\nelectronic\nnfc\nyear\nasus\nまた\nhtml5\ncindy\n##hd\nm3\n132\nesc\n##od\nbooking\n##53\nfed\ntvb\n##81\n##ina\nmit\n165\n##いる\nchan\n192\ndistribution\nnext\nになる\npeter\nbios\nsteam\ncm\n1941\nにも\npk10\n##ix\n##65\n##91\ndec\nnasa\n##ana\nicecat\n00z\nb1\nwill\n##46\nli\nse\n##ji\n##み\n##ard\noct\n##ain\njp\n##ze\n##bi\ncio\n##56\nsmart\nh5\n##39\n##port\ncurve\nvpn\n##nm\n##dia\nutc\n##あり\n12345678910\n##52\nrmvb\nchanel\na4\nmiss\n##and\n##im\nmedia\nwho\n##63\nshe\ngirl\n5s\n124\nvera\n##して\nclass\nvivo\nking\n##フ\n##ei\nnational\nab\n1951\n5cm\n888\n145\nipod\nap\n1100\n5mm\n211\nms\n2756\n##69\nmp4\nmsci\n##po\n##89\n131\nmg\nindex\n380\n##bit\n##out\n##zz\n##97\n##67\n158\napec\n##８\nphotoshop\nopec\n￥799\nては\n##96\n##tes\n##ast\n2g\n○○\n##ール\n￥2899\n##ling\n##よ\n##ory\n1938\n##ical\nkitty\ncontent\n##43\nstep3\n##cn\nwin8\n155\nvc\n1400\niphone7\nrobert\n##した\ntcl\n137\nbeauty\n##87\nen\ndollars\n##ys\n##oc\nstep\npay\nyy\na1\n##2011\n##lly\n##ks\n##♪\n1939\n188\ndownload\n1944\nsep\nexe\nph\nいます\nschool\ngb\ncenter\npr\nstreet\n##board\nuv\n##37\n##lan\nwinrar\n##que\n##ua\n##com\n1942\n1936\n480\ngpu\n##４\nettoday\nfu\ntom\n##54\n##ren\n##via\n149\n##72\nb2b\n144\n##79\n##tch\nrose\narm\nmb\n##49\n##ial\n##nn\nnvidia\nstep4\nmvp\n00㎡\nyork\n156\n##イ\nhow\ncpi\n591\n2765\ngov\nkg\njoe\n##xx\nmandy\npa\n##ser\ncopyright\nfashion\n1935\ndon\n##け\necu\n##ist\n##art\nerp\nwap\nhave\n##lm\ntalk\n##ek\n##ning\n##if\nch\n##ite\nvideo\n1943\ncs\nsan\niot\nlook\n##84\n##2010\n##ku\noctober\n##ux\ntrump\n##hs\n##ide\nbox\n141\nfirst\n##ins\napril\n##ight\n##83\n185\nangel\nprotected\naa\n151\n162\nx1\nm2\n##fe\n##×\n##ho\nsize\n143\nmin\nofo\nfun\ngomaji\nex\nhdmi\nfood\ndns\nmarch\nchris\nkevin\n##のか\n##lla\n##pp\n##ec\nag\nems\n6s\n720p\n##rm\n##ham\noff\n##92\nasp\nteam\nfandom\ned\n299\n▌♥\n##ell\ninfo\nされています\n##82\nsina\n4066\n161\n##able\n##ctor\n330\n399\n315\ndll\nri
ghts\nltd\nidc\njul\n3kg\n1927\n142\nma\nsurface\n##76\n##ク\n～～～\n304\nmall\neps\n146\ngreen\n##59\nmap\nspace\ndonald\nv2\nsodu\n##light\n1931\n148\n1700\nまて\n310\nreserved\nhtm\n##han\n##57\n2d\n178\nmod\n##ise\n##tions\n152\nti\n##shi\ndoc\n1933\nicp\n055\nwang\n##ram\nshopping\naug\n##pi\n##well\nnow\nwam\nb2\nからお\n##hu\n236\n1928\n##gb\n266\nf2\n##93\n153\nmix\n##ef\n##uan\nbwl\n##plus\n##res\ncore\n##ess\ntea\n5℃\nhktvmall\nnhk\n##ate\nlist\n##ese\n301\nfeb\n4m\ninn\nての\nnov\n159\n12345\ndaniel\n##ci\npass\n##bet\n##nk\ncoffee\n202\nssl\nairbnb\n##ute\nfbi\nwoshipm\nskype\nea\ncg\nsp\n##fc\n##www\nyes\nedge\nalt\n007\n##94\nfpga\n##ght\n##gs\niso9001\nさい\n##ile\n##wood\n##uo\nimage\nlin\nicon\namerican\n##em\n1932\nset\nsays\n##king\n##tive\nblogger\n##74\nなと\n256\n147\n##ox\n##zy\n##red\n##ium\n##lf\nnokia\nclaire\n##リ\n##ding\nnovember\nlohas\n##500\n##tic\n##マ\n##cs\n##ある\n##che\n##ire\n##gy\n##ult\ndb\njanuary\nwin\n##カ\n166\nroad\nptt\n##ま\n##つ\n198\n##fa\n##mer\nanna\npchome\nはい\nudn\nef\n420\n##time\n##tte\n2030\n##ア\ng20\nwhite\nかかります\n1929\n308\ngarden\neleven\ndi\n##おります\nchen\n309b\n777\n172\nyoung\ncosplay\nちてない\n4500\nbat\n##123\n##tra\n##ては\nkindle\nnpc\nsteve\netc\n##ern\n##｜\ncall\nxperia\nces\ntravel\nsk\ns7\n##ous\n1934\n##int\nみいたたけます\n183\nedu\nfile\ncho\nqr\n##car\n##our\n186\n##ant\n##ｄ\neric\n1914\nrends\n##jo\n##する\nmastercard\n##2000\nkb\n##min\n290\n##ino\nvista\n##ris\n##ud\njack\n2400\n##set\n169\npos\n1912\n##her\n##ou\ntaipei\nしく\n205\nbeta\n##ませんか\n232\n##fi\nexpress\n255\nbody\n##ill\naphojoy\nuser\ndecember\nmeiki\n##ick\ntweet\nrichard\n##av\n##ᆫ\niphone6\n##dd\nちてすか\nviews\n##mark\n321\npd\n##００\ntimes\n##▲\nlevel\n##ash\n10g\npoint\n5l\n##ome\n208\nkoreanmall\n##ak\ngeorge\nq2\n206\nwma\ntcp\n##200\nスタッフ\nfull\nmlb\n##lle\n##watch\ntm\nrun\n179\n911\nsmith\nbusiness\n##und\n1919\ncolor\n##tal\n222\n171\n##less\nmoon\n4399\n##rl\nupdate\npcb\nshop\n499\n157\nlittle\nなし\nend\n##mhz\nvan\ndsp\neasy\n660\n##house\n##key\nhistory
\n##ｏ\noh\n##001\n##hy\n##web\noem\nlet\nwas\n##2009\n##gg\nreview\n##wan\n182\n##°c\n203\nuc\ntitle\n##val\nunited\n233\n2021\n##ons\ndoi\ntrivago\noverdope\nsbs\n##ance\n##ち\ngrand\nspecial\n573032185\nimf\n216\nwx17house\n##so\n##ーム\naudi\n##he\nlondon\nwilliam\n##rp\n##ake\nscience\nbeach\ncfa\namp\nps4\n880\n##800\n##link\n##hp\ncrm\nferragamo\nbell\nmake\n##eng\n195\nunder\nzh\nphotos\n2300\n##style\n##ント\nvia\n176\nda\n##gi\ncompany\ni7\n##ray\nthomas\n370\nufo\ni5\n##max\nplc\nben\nback\nresearch\n8g\n173\nmike\n##pc\n##ッフ\nseptember\n189\n##ace\nvps\nfebruary\n167\npantos\nwp\nlisa\n1921\n★★\njquery\nnight\nlong\noffer\n##berg\n##news\n1911\n##いて\nray\nfks\nwto\nせます\nover\n164\n340\n##all\n##rus\n1924\n##888\n##works\nblogtitle\nloftpermalink\n##→\n187\nmartin\ntest\nling\nkm\n##め\n15000\nfda\nv3\n##ja\n##ロ\nｗedding\nかある\noutlet\nfamily\n##ea\nをこ\n##top\nstory\n##ness\nsalvatore\n##lu\n204\nswift\n215\nroom\nしている\noracle\n##ul\n1925\nsam\nb2c\nweek\npi\nrock\n##のは\n##ａ\n##けと\n##ean\n##300\n##gle\ncctv\nafter\nchinese\n##back\npowered\nx2\n##tan\n1918\n##nes\n##イン\ncanon\nonly\n181\n##zi\n##las\nsay\n##oe\n184\n##sd\n221\n##bot\n##world\n##zo\nsky\nmade\ntop100\njust\n1926\npmi\n802\n234\ngap\n##vr\n177\nles\n174\n▲topoct\nball\nvogue\nvi\ning\nofweek\ncos\n##list\n##ort\n▲topmay\n##なら\n##lon\nとして\nlast\n##tc\n##of\n##bus\n##gen\nreal\neva\n##コ\na3\nnas\n##lie\n##ria\n##coin\n##bt\n▲topapr\nhis\n212\ncat\nnata\nvive\nhealth\n⋯⋯\ndrive\nsir\n▲topmar\ndu\ncup\n##カー\n##ook\n##よう\n##sy\nalex\nmsg\ntour\nしました\n3ce\n##word\n193\nebooks\nr8\nblock\n318\n##より\n2200\nnice\npvp\n207\nmonths\n1905\nrewards\n##ther\n1917\n0800\n##xi\n##チ\n##sc\nmicro\n850\ngg\nblogfp\nop\n1922\ndaily\nm1\n264\ntrue\n##bb\nml\n##tar\n##のお\n##ky\nanthony\n196\n253\n##yo\nstate\n218\n##ara\n##aa\n##rc\n##tz\n##ston\nより\ngear\n##eo\n##ade\nge\nsee\n1923\n##win\n##ura\nss\nheart\n##den\n##ita\ndown\n##sm\nel\npng\n2100\n610\nrakuten\nwhatsapp\nbay\ndream\nadd\n##use\n680\n311\npad\ngucci\nmp
v\n##ode\n##fo\nisland\n▲topjun\n##▼\n223\njason\n214\nchicago\n##❤\nしの\n##hone\nio\n##れる\n##ことか\nsogo\nbe2\n##ology\n990\ncloud\nvcd\n##con\n2～3\n##ford\n##joy\n##kb\n##こさいます\n##rade\nbut\n##ach\ndocker\n##ful\nrfid\nul\n##ase\nhit\nford\n##star\n580\n##○\n１１\na2\nsdk\nreading\nedited\n##are\ncmos\n##mc\n238\nsiri\nlight\n##ella\n##ため\nbloomberg\n##read\npizza\n##ison\njimmy\n##vm\ncollege\nnode\njournal\nba\n18k\n##play\n245\n##cer\n２０\nmagic\n##yu\n191\njump\n288\ntt\n##ings\nasr\n##lia\n3200\nstep5\nnetwork\n##cd\nmc\nいします\n1234\npixstyleme\n273\n##600\n2800\nmoney\n★★★★★\n1280\n１２\n430\nbl\nみの\nact\n##tus\ntokyo\n##rial\n##life\nemba\n##ae\nsaas\ntcs\n##rk\n##wang\nsummer\n##sp\nko\n##ving\n390\npremium\n##その\nnetflix\n##ヒ\nuk\nmt\n##lton\nright\nfrank\ntwo\n209\nえる\n##ple\n##cal\n021\n##んな\n##sen\n##ville\nhold\nnexus\ndd\n##ius\nてお\n##mah\n##なく\ntila\nzero\n820\nce\n##tin\nresort\n##ws\ncharles\nold\np10\n5d\nreport\n##360\n##ru\n##には\nbus\nvans\nlt\n##est\npv\n##レ\nlinks\nrebecca\n##ツ\n##dm\nazure\n##365\nきな\nlimited\nbit\n4gb\n##mon\n1910\nmoto\n##eam\n213\n1913\nvar\neos\nなとの\n226\nblogspot\nされた\n699\ne3\ndos\ndm\nfc\n##ments\n##ik\n##kw\nboy\n##bin\n##ata\n960\ner\n##せ\n219\n##vin\n##tu\n##ula\n194\n##∥\nstation\n##ろ\n##ature\n835\nfiles\nzara\nhdr\ntop10\nnature\n950\nmagazine\ns6\nmarriott\n##シ\navira\ncase\n##っと\ntab\n##ran\ntony\n##home\noculus\nim\n##ral\njean\nsaint\ncry\n307\nrosie\n##force\n##ini\nice\n##bert\nのある\n##nder\n##mber\npet\n2600\n##◆\nplurk\n▲topdec\n##sis\n00kg\n▲topnov\n720\n##ence\ntim\n##ω\n##nc\n##ても\n##name\nlog\nips\ngreat\nikea\nmalaysia\nunix\n##イト\n3600\n##ncy\n##nie\n12000\nakb48\n##ye\n##oid\n404\n##chi\n##いた\noa\nxuehai\n##1000\n##orm\n##rf\n275\nさん\n##ware\n##リー\n980\nho\n##pro\ntext\n##era\n560\nbob\n227\n##ub\n##2008\n8891\nscp\navi\n##zen\n2022\nmi\nwu\nmuseum\nqvod\napache\nlake\njcb\n▲topaug\n★★★\nni\n##hr\nhill\n302\nne\nweibo\n490\nruby\n##ーシ\n##ヶ\n##row\n4d\n▲topjul\niv\n##ish\ngithub\n306\nmate\n312\n##スト\n##lot\
n##ane\nandrew\nのハイト\n##tina\nt1\nrf\ned2k\n##vel\n##900\nway\nfinal\nりの\nns\n5a\n705\n197\n##メ\nsweet\nbytes\n##ene\n▲topjan\n231\n##cker\n##2007\n##px\n100g\ntopapp\n229\nhelpapp\nrs\nlow\n14k\ng4g\ncare\n630\nldquo\nあり\n##fork\nleave\nrm\nedition\n##gan\n##zon\n##qq\n▲topsep\n##google\n##ism\ngold\n224\nexplorer\n##zer\ntoyota\ncategory\nselect\nvisual\n##labels\nrestaurant\n##md\nposts\ns1\n##ico\nもっと\nangelababy\n123456\n217\nsports\ns3\nmbc\n1915\nしてくたさい\nshell\nx86\ncandy\n##new\nkbs\nface\nxl\n470\n##here\n4a\nswissinfo\nv8\n▲topfeb\ndram\n##ual\n##vice\n3a\n##wer\nsport\nq1\nios10\npublic\nint\ncard\n##ｃ\nep\nau\nrt\n##れた\n1080\nbill\n##mll\nkim\n３０\n460\nwan\n##uk\n##ミ\nx3\n298\n0t\nscott\n##ming\n239\ne5\n##3d\nh7n9\nworldcat\nbrown\n##あります\n##vo\n##led\n##580\n##ax\n249\n410\n##ert\nparis\n##～6\npolo\n925\n##lr\n599\n##ナ\ncapital\n##hing\nbank\ncv\n1g\n##chat\n##ｓ\n##たい\nadc\n##ule\n2m\n##ｅ\ndigital\nhotmail\n268\n##pad\n870\nbbq\nquot\n##ring\nbefore\nwali\n##まて\nmcu\n2k\n2b\nという\ncostco\n316\nnorth\n333\nswitch\n##city\n##ｐ\nphilips\n##mann\nmanagement\npanasonic\n##cl\n##vd\n##ping\n##rge\nalice\n##lk\n##ましょう\ncss3\n##ney\nvision\nalpha\n##ular\n##400\n##tter\nlz\nにお\n##ありません\nmode\ngre\n1916\npci\n##tm\n237\n1～2\n##yan\n##そ\nについて\n##let\n##キ\nwork\nwar\ncoach\nah\nmary\n##ᅵ\nhuang\n##pt\na8\npt\nfollow\n##berry\n1895\n##ew\na5\nghost\n##ション\n##wn\n##og\nsouth\n##code\ngirls\n##rid\naction\nvilla\ngit\nr11\ntable\ngames\n##cket\nerror\n##anonymoussaid\n##ag\nhere\n##ame\n##gc\nqa\n##■\n##lis\ngmp\n##gin\nvmalife\n##cher\nyu\nwedding\n##tis\ndemo\ndragon\n530\nsoho\nsocial\nbye\n##rant\nriver\norz\nacer\n325\n##↑\n##ース\n##ats\n261\ndel\n##ven\n440\nups\n##ように\n##ター\n305\nvalue\nmacd\nyougou\n##dn\n661\n##ano\nll\n##urt\n##rent\ncontinue\nscript\n##wen\n##ect\npaper\n263\n319\nshift\n##chel\n##フト\n##cat\n258\nx5\nfox\n243\n##さん\ncar\naaa\n##blog\nloading\n##yn\n##tp\nkuso\n799\nsi\nsns\nイカせるテンマ\nヒンクテンマ3\nrmb\nvdc\nforest\ncentral\nprime\nhelp\nultra\n##
rmb\n##ような\n241\nsquare\n688\n##しい\nのないフロクに\n##field\n##reen\n##ors\n##ju\nc1\nstart\n510\n##air\n##map\ncdn\n##wo\ncba\nstephen\nm8\n100km\n##get\nopera\n##base\n##ood\nvsa\ncom™\n##aw\n##ail\n251\nなのて\ncount\nt2\n##ᅡ\n##een\n2700\nhop\n##gp\nvsc\ntree\n##eg\n##ose\n816\n285\n##ories\n##shop\nalphago\nv4\n1909\nsimon\n##ᆼ\nfluke62max\nzip\nスホンサー\n##sta\nlouis\ncr\nbas\n##～10\nbc\n##yer\nhadoop\n##ube\n##wi\n1906\n0755\nhola\n##low\nplace\ncentre\n5v\nd3\n##fer\n252\n##750\n##media\n281\n540\n0l\nexchange\n262\nseries\n##ハー\n##san\neb\n##bank\n##ｋ\nq3\n##nge\n##mail\ntake\n##lp\n259\n1888\nclient\neast\ncache\nevent\nvincent\n##ールを\nきを\n##nse\nsui\n855\nadchoice\n##и\n##stry\n##なたの\n246\n##zone\nga\napps\nsea\n##ab\n248\ncisco\n##タ\n##rner\nkymco\n##care\ndha\n##pu\n##yi\nminkoff\nroyal\np1\nへの\nannie\n269\ncollection\nkpi\nplaystation\n257\nになります\n866\nbh\n##bar\nqueen\n505\nradio\n1904\nandy\narmani\n##xy\nmanager\niherb\n##ery\n##share\nspring\nraid\njohnson\n1908\n##ob\nvolvo\nhall\n##ball\nv6\nour\ntaylor\n##hk\nbi\n242\n##cp\nkate\nbo\nwater\ntechnology\n##rie\nサイトは\n277\n##ona\n##sl\nhpv\n303\ngtx\nhip\nrdquo\njayz\nstone\n##lex\n##rum\nnamespace\n##やり\n620\n##ale\n##atic\ndes\n##erson\n##ql\n##ves\n##type\nenter\n##この\n##てきます\nd2\n##168\n##mix\n##bian\nとの\na9\njj\nky\n##lc\naccess\nmovie\n##hc\nリストに\ntower\n##ration\n##mit\nます\n##nch\nua\ntel\nprefix\n##o2\n1907\n##point\n1901\nott\n～10\n##http\n##ury\nbaidu\n##ink\nmember\n##logy\nbigbang\nnownews\n##js\n##shot\n##tb\n##こと\n247\neba\n##tics\n##lus\nける\nv5\nspark\n##ama\nthere\n##ions\ngod\n##lls\n##down\nhiv\n##ress\nburberry\nday2\n##kv\n◆◆\njeff\nrelated\nfilm\nedit\njoseph\n283\n##ark\ncx\n32gb\norder\ng9\n30000\n##ans\n##tty\ns5\n##bee\nかあります\nthread\nxr\nbuy\nsh\n005\nland\nspotify\nmx\n##ari\n276\n##verse\n×email\nsf\nwhy\n##ことて\n244\n7headlines\nnego\nsunny\ndom\nexo\n401\n666\npositioning\nfit\nrgb\n##tton\n278\nkiss\nalexa\nadam\nlp\nみリストを\n##ｇ\nmp\n##ties\n##llow\namy\n##du\nnp\n002\ninstitute\n27
1\n##rth\n##lar\n2345\n590\n##des\nsidebar\n１５\nimax\nsite\n##cky\n##kit\n##ime\n##009\nseason\n323\n##fun\n##ンター\n##ひ\ngogoro\na7\npu\nlily\nfire\ntwd600\n##ッセーシを\nいて\n##vis\n30ml\n##cture\n##をお\ninformation\n##オ\nclose\nfriday\n##くれる\nyi\nnick\nてすか\n##tta\n##tel\n6500\n##lock\ncbd\neconomy\n254\nかお\n267\ntinker\ndouble\n375\n8gb\nvoice\n##app\noops\nchannel\ntoday\n985\n##right\nraw\nxyz\n##＋\njim\nedm\n##cent\n7500\nsupreme\n814\nds\n##its\n##asia\ndropbox\n##てすか\n##tti\nbooks\n272\n100ml\n##tle\n##ller\n##ken\n##more\n##boy\nsex\n309\n##dom\nt3\n##ider\n##なります\n##unch\n1903\n810\nfeel\n5500\n##かった\n##put\nにより\ns2\nmo\n##gh\nmen\nka\namoled\ndiv\n##tr\n##n1\nport\nhoward\n##tags\nken\ndnf\n##nus\nadsense\n##а\nide\n##へ\nbuff\nthunder\n##town\n##ique\nhas\n##body\nauto\npin\n##erry\ntee\nてした\n295\nnumber\n##the\n##013\nobject\npsp\ncool\nudnbkk\n16gb\n##mic\nmiui\n##tro\nmost\nr2\n##alk\n##nity\n1880\n±0\n##いました\n428\ns4\nlaw\nversion\n##oa\nn1\nsgs\ndocomo\n##tf\n##ack\nhenry\nfc2\n##ded\n##sco\n##014\n##rite\n286\n0mm\nlinkedin\n##ada\n##now\nwii\n##ndy\nucbug\n##◎\nsputniknews\nlegalminer\n##ika\n##xp\n2gb\n##bu\nq10\noo\nb6\ncome\n##rman\ncheese\nming\nmaker\n##gm\nnikon\n##fig\nppi\nkelly\n##ります\njchere\nてきます\nted\nmd\n003\nfgo\ntech\n##tto\ndan\nsoc\n##gl\n##len\nhair\nearth\n640\n521\nimg\n##pper\n##a1\n##てきる\n##ロク\nacca\n##ition\n##ference\nsuite\n##ig\noutlook\n##mond\n##cation\n398\n##pr\n279\n101vip\n358\n##999\n282\n64gb\n3800\n345\nairport\n##over\n284\n##おり\njones\n##ith\nlab\n##su\n##いるのて\nco2\ntown\npiece\n##llo\nno1\nvmware\n24h\n##qi\nfocus\nreader\n##admin\n##ora\ntb\nfalse\n##log\n1898\nknow\nlan\n838\n##ces\nf4\n##ume\nmotel\nstop\n##oper\nna\nflickr\nnetcomponents\n##af\n##─\npose\nwilliams\nlocal\n##ound\n##cg\n##site\n##iko\nいお\n274\n5m\ngsm\ncon\n##ath\n1902\nfriends\n##hip\ncell\n317\n##rey\n780\ncream\n##cks\n012\n##dp\nfacebooktwitterpinterestgoogle\nsso\n324\nshtml\nsong\nswiss\n##mw\n##キンク\nlumia\nxdd\nstring\ntiffany\n522\nmarc\nられた\
ninsee\nrussell\nsc\ndell\n##ations\nｏｋ\ncamera\n289\n##vs\n##flow\n##late\nclassic\n287\n##nter\nstay\ng1\nmtv\n512\n##ever\n##lab\n##nger\nqe\nsata\nryan\nd1\n50ml\ncms\n##cing\nsu\n292\n3300\neditor\n296\n##nap\nsecurity\nsunday\nassociation\n##ens\n##700\n##bra\nacg\n##かり\nsofascore\nとは\nmkv\n##ign\njonathan\ngary\nbuild\nlabels\n##oto\ntesla\nmoba\nqi\ngohappy\ngeneral\najax\n1024\n##かる\nサイト\nsociety\n##test\n##urs\nwps\nfedora\n##ich\nmozilla\n328\n##480\n##dr\nusa\nurn\n##lina\n##ｒ\ngrace\n##die\n##try\n##ader\n1250\n##なり\nelle\n570\n##chen\n##ᆯ\nprice\n##ten\nuhz\n##ough\neq\n##hen\nstates\npush\nsession\nbalance\nwow\n506\n##cus\n##py\nwhen\n##ward\n##ep\n34e\nwong\nlibrary\nprada\n##サイト\n##cle\nrunning\n##ree\n313\nck\ndate\nq4\n##ctive\n##ool\n##＞\nmk\n##ira\n##163\n388\ndie\nsecret\nrq\ndota\nbuffet\nは１ヶ\ne6\n##ez\npan\n368\nha\n##card\n##cha\n2a\n##さ\nalan\nday3\neye\nf3\n##end\nfrance\nkeep\nadi\nrna\ntvbs\n##ala\nsolo\nnova\n##え\n##tail\n##ょう\nsupport\n##ries\n##なる\n##ved\nbase\ncopy\niis\nfps\n##ways\nhero\nhgih\nprofile\nfish\nmu\nssh\nentertainment\nchang\n##wd\nclick\ncake\n##ond\npre\n##tom\nkic\npixel\n##ov\n##fl\nproduct\n6a\n##pd\ndear\n##gate\nes\nyumi\naudio\n##²\n##sky\necho\nbin\nwhere\n##ture\n329\n##ape\nfind\nsap\nisis\n##なと\nnand\n##101\n##load\n##ream\nband\na6\n525\nnever\n##post\nfestival\n50cm\n##we\n555\nguide\n314\nzenfone\n##ike\n335\ngd\nforum\njessica\nstrong\nalexander\n##ould\nsoftware\nallen\n##ious\nprogram\n360°\nelse\nlohasthree\n##gar\nすることかてきます\nplease\n##れます\nrc\n##ggle\n##ric\nbim\n50000\n##own\neclipse\n355\nbrian\n3ds\n##side\n061\n361\n##other\n##ける\n##tech\n##ator\n485\nengine\n##ged\n##ｔ\nplaza\n##fit\ncia\nngo\nwestbrook\nshi\ntbs\n50mm\n##みませんか\nsci\n291\nreuters\n##ily\ncontextlink\n##hn\naf\n##cil\nbridge\nvery\n##cel\n1890\ncambridge\n##ize\n15g\n##aid\n##data\n790\nfrm\n##head\naward\nbutler\n##sun\nmeta\n##mar\namerica\nps3\npuma\npmid\n##すか\nlc\n670\nkitchen\n##lic\nオーフン5\nきなしソフトサーヒス\nそして\nday1\nfuture\n
★★★★\n##text\n##page\n##rris\npm1\n##ket\nfans\n##っています\n1001\nchristian\nbot\nkids\ntrackback\n##hai\nc3\ndisplay\n##hl\nn2\n1896\nidea\nさんも\n##sent\nairmail\n##ug\n##men\npwm\nけます\n028\n##lution\n369\n852\nawards\nschemas\n354\nasics\nwikipedia\nfont\n##tional\n##vy\nc2\n293\n##れている\n##dget\n##ein\nっている\ncontact\npepper\nスキル\n339\n##～5\n294\n##uel\n##ument\n730\n##hang\nみてす\nq5\n##sue\nrain\n##ndi\nwei\nswatch\n##cept\nわせ\n331\npopular\n##ste\n##tag\np2\n501\ntrc\n1899\n##west\n##live\njustin\nhonda\nping\nmessenger\n##rap\nv9\n543\n##とは\nunity\nappqq\nはすへて\n025\nleo\n##tone\n##テ\n##ass\nuniqlo\n##010\n502\nher\njane\nmemory\nmoneydj\n##tical\nhuman\n12306\nしていると\n##m2\ncoc\nmiacare\n##mn\ntmt\n##core\nvim\nkk\n##may\nfan\ntarget\nuse\ntoo\n338\n435\n2050\n867\n737\nfast\n##2c\nservices\n##ope\nomega\nenergy\n##わ\npinkoi\n1a\n##なから\n##rain\njackson\n##ement\n##シャンルの\n374\n366\nそんな\np9\nrd\n##ᆨ\n1111\n##tier\n##vic\nzone\n##│\n385\n690\ndl\nisofix\ncpa\nm4\n322\nkimi\nめて\ndavis\n##lay\nlulu\n##uck\n050\nweeks\nqs\n##hop\n920\n##ｎ\nae\n##ear\n～5\neia\n405\n##fly\nkorea\njpeg\nboost\n##ship\nsmall\n##リア\n1860\neur\n297\n425\nvalley\n##iel\nsimple\n##ude\nrn\nk2\n##ena\nされます\nnon\npatrick\nしているから\n##ナー\nfeed\n5757\n30g\nprocess\nwell\nqqmei\n##thing\nthey\naws\nlu\npink\n##ters\n##kin\nまたは\nboard\n##vertisement\nwine\n##ien\nunicode\n##dge\nr1\n359\n##tant\nいを\n##twitter\n##3c\ncool1\nされる\n##れて\n##ｌ\nisp\n##012\nstandard\n45㎡2\n402\n##150\nmatt\n##fu\n326\n##iner\ngooglemsn\npixnetfacebookyahoo\n##ラン\nx7\n886\n##uce\nメーカー\nsao\n##ev\n##きました\n##file\n9678\n403\nxddd\nshirt\n6l\n##rio\n##hat\n3mm\ngivenchy\nya\nbang\n##lio\nmonday\ncrystal\nロクイン\n##abc\n336\nhead\n890\nubuntuforumwikilinuxpastechat\n##vc\n##～20\n##rity\ncnc\n7866\nipv6\nnull\n1897\n##ost\nyang\nimsean\ntiger\n##fet\n##ンス\n352\n##＝\ndji\n327\nji\nmaria\n##come\n##んて\nfoundation\n3100\n##beth\n##なった\n1m\n601\nactive\n##aft\n##don\n3p\nsr\n349\nemma\n##khz\nliving\n415\n353\n1889\n341\n709\n457\nsas\nx6\n#
#face\npptv\nx4\n##mate\nhan\nsophie\n##jing\n337\nfifa\n##mand\nother\nsale\ninwedding\n##gn\nてきちゃいます\n##mmy\n##pmlast\nbad\nnana\nnbc\nしてみてくたさいね\nなとはお\n##wu\n##かあります\n##あ\nnote7\nsingle\n##340\nせからこ\nしてくたさい♪この\nしにはとんとんワークケートを\nするとあなたにもっとマッチした\nならワークケートへ\nもみつかっちゃうかも\nワークケートの\n##bel\nwindow\n##dio\n##ht\nunion\nage\n382\n１４\n##ivity\n##ｙ\nコメント\ndomain\nneo\n##isa\n##lter\n5k\nf5\nsteven\n##cts\npowerpoint\ntft\nself\ng2\nft\n##テル\nzol\n##act\nmwc\n381\n343\nもう\nnbapop\n408\nてある\neds\nace\n##room\nprevious\nauthor\ntomtom\nil\n##ets\nhu\nfinancial\n☆☆☆\nっています\nbp\n5t\nchi\n1gb\n##hg\nfairmont\ncross\n008\ngay\nh2\nfunction\n##けて\n356\nalso\n1b\n625\n##ータ\n##raph\n1894\n3～5\n##ils\ni3\n334\navenue\n##host\nによる\n##bon\n##tsu\nmessage\nnavigation\n50g\nfintech\nh6\n##ことを\n8cm\n##ject\n##vas\n##firm\ncredit\n##wf\nxxxx\nform\n##nor\n##space\nhuawei\nplan\njson\nsbl\n##dc\nmachine\n921\n392\nwish\n##120\n##sol\nwindows7\nedward\n##ために\ndevelopment\nwashington\n##nsis\nlo\n818\n##sio\n##ym\n##bor\nplanet\n##～8\n##wt\nieee\ngpa\n##めて\ncamp\nann\ngm\n##tw\n##oka\nconnect\n##rss\n##work\n##atus\nwall\nchicken\nsoul\n2mm\n##times\nfa\n##ather\n##cord\n009\n##eep\nhitachi\ngui\nharry\n##pan\ne1\ndisney\n##press\n##ーション\nwind\n386\nfrigidaire\n##tl\nliu\nhsu\n332\nbasic\nvon\nev\nいた\nてきる\nスホンサーサイト\nlearning\n##ull\nexpedia\narchives\nchange\n##wei\nsanta\ncut\nins\n6gb\nturbo\nbrand\ncf1\n508\n004\nreturn\n747\n##rip\nh1\n##nis\n##をこ\n128gb\n##にお\n3t\napplication\nしており\nemc\nrx\n##oon\n384\nquick\n412\n15058\nwilson\nwing\nchapter\n##bug\nbeyond\n##cms\n##dar\n##oh\nzoom\ne2\ntrip\nsb\n##nba\nrcep\n342\naspx\nci\n080\ngc\ngnu\nめる\n##count\nadvanced\ndance\ndv\n##url\n##ging\n367\n8591\nam09\nshadow\nbattle\n346\n##ｉ\n##cia\n##という\nemily\n##のてす\n##tation\nhost\nff\ntechorz\nsars\n##mini\n##mporary\n##ering\nnc\n4200\n798\n##next\ncma\n##mbps\n##gas\n##ift\n##dot\n##ィ\n455\n##～17\namana\n##りの\n426\n##ros\nir\n00㎡1\n##eet\n##ible\n##↓\n710\nˋ▽ˊ\n##aka\ndcs\niq\n##ｖ\nl1\n##lor\nmagg
ie\n##011\n##iu\n588\n##～1\n830\n##gt\n1tb\narticles\ncreate\n##burg\n##iki\ndatabase\nfantasy\n##rex\n##cam\ndlc\ndean\n##you\nhard\npath\ngaming\nvictoria\nmaps\ncb\n##lee\n##itor\noverchicstoretvhome\nsystems\n##xt\n416\np3\nsarah\n760\n##nan\n407\n486\nx9\ninstall\nsecond\n626\n##ann\n##ph\n##rcle\n##nic\n860\n##nar\nec\n##とう\n768\nmetro\nchocolate\n##rian\n～4\n##table\n##しています\nskin\n##sn\n395\nmountain\n##0mm\ninparadise\n6m\n7x24\nib\n4800\n##jia\neeworld\ncreative\ng5\ng3\n357\nparker\necfa\nvillage\nからの\n18000\nsylvia\nサーヒス\nhbl\n##ques\n##onsored\n##x2\n##きます\n##v4\n##tein\nie6\n383\n##stack\n389\nver\n##ads\n##baby\nsound\nbbe\n##110\n##lone\n##uid\nads\n022\ngundam\n351\nthinkpad\n006\nscrum\nmatch\n##ave\nmems\n##470\n##oy\n##なりました\n##talk\nglass\nlamigo\nspan\n##eme\njob\n##a5\njay\nwade\nkde\n498\n##lace\nocean\ntvg\n##covery\n##r3\n##ners\n##rea\njunior\nthink\n##aine\ncover\n##ision\n##sia\n↓↓\n##bow\nmsi\n413\n458\n406\n##love\n711\n801\nsoft\nz2\n##pl\n456\n1840\nmobil\nmind\n##uy\n427\nnginx\n##oi\nめた\n##rr\n6221\n##mple\n##sson\n##ーシてす\n371\n##nts\n91tv\ncomhd\ncrv3000\n##uard\n1868\n397\ndeep\nlost\nfield\ngallery\n##bia\nrate\nspf\nredis\ntraction\n930\nicloud\n011\nなら\nfe\njose\n372\n##tory\ninto\nsohu\nfx\n899\n379\nkicstart2\n##hia\nすく\n##～3\n##sit\nra\n２４\n##walk\n##xure\n500g\n##pact\npacific\nxa\nnatural\ncarlo\n##250\n##walker\n1850\n##can\ncto\ngigi\n516\n##サー\npen\n##hoo\nob\nmatlab\n##ｂ\n##yy\n13913459\n##iti\nmango\n##bbs\nsense\nc5\noxford\n##ニア\nwalker\njennifer\n##ola\ncourse\n##bre\n701\n##pus\n##rder\nlucky\n075\n##ぁ\nivy\nなお\n##nia\nsotheby\nside\n##ugh\njoy\n##orage\n##ush\n##bat\n##dt\n364\nr9\n##2d\n##gio\n511\ncountry\nwear\n##lax\n##～7\n##moon\n393\nseven\nstudy\n411\n348\nlonzo\n8k\n##ェ\nevolution\n##イフ\n##kk\ngs\nkd\n##レス\narduino\n344\nb12\n##lux\narpg\n##rdon\ncook\n##x5\ndark\nfive\n##als\n##ida\nとても\nsign\n362\n##ちの\nsomething\n20mm\n##nda\n387\n##posted\nfresh\ntf\n1870\n422\ncam\n##mine\n##skip\n##form\n##ssion\ne
ducation\n394\n##tee\ndyson\nstage\n##jie\nwant\n##night\nepson\npack\nあります\n##ppy\nテリヘル\n##█\nwd\n##eh\n##rence\nleft\n##lvin\ngolden\nmhz\ndiscovery\n##trix\n##n2\nloft\n##uch\n##dra\n##sse\nspeed\n～1\n1mdb\nsorry\nwelcome\n##urn\nwave\ngaga\n##lmer\nteddy\n##160\nトラックハック\nせよ\n611\n##f2016\n378\nrp\n##sha\nrar\n##あなたに\n##きた\n840\nholiday\n##ュー\n373\n074\n##vg\n##nos\n##rail\ngartner\ngi\n6p\n##dium\nkit\n488\nb3\neco\n##ろう\n20g\nsean\n##stone\nautocad\nnu\n##np\nf16\nwrite\n029\nm5\n##ias\nimages\natp\n##dk\nfsm\n504\n1350\nve\n52kb\n##xxx\n##のに\n##cake\n414\nunit\nlim\nru\n1v\n##ification\npublished\nangela\n16g\nanalytics\nak\n##ｑ\n##nel\ngmt\n##icon\nagain\n##₂\n##bby\nios11\n445\nかこさいます\nwaze\nいてす\n##ハ\n9985\n##ust\n##ティー\nframework\n##007\niptv\ndelete\n52sykb\ncl\nwwdc\n027\n30cm\n##fw\n##ての\n1389\n##xon\nbrandt\n##ses\n##dragon\ntc\nvetements\nanne\nmonte\nmodern\nofficial\n##へて\n##ere\n##nne\n##oud\nもちろん\n５０\netnews\n##a2\n##graphy\n421\n863\n##ちゃん\n444\n##rtex\n##てお\nl2\n##gma\nmount\nccd\nたと\narchive\nmorning\ntan\nddos\ne7\n##ホ\nday4\n##ウ\ngis\n453\nits\n495\nfactory\nbruce\npg\n##ito\nってくたさい\nguest\ncdma\n##lling\n536\nn3\nしかし\n3～4\nmega\neyes\nro\n１３\nwomen\ndac\nchurch\n##jun\nsingapore\n##facebook\n6991\nstarbucks\n##tos\n##stin\n##shine\nzen\n##mu\ntina\n20℃\n1893\n##たけて\n503\n465\nrequest\n##gence\nqt\n##っ\n1886\n347\n363\nq7\n##zzi\ndiary\n##tore\n409\n##ead\n468\ncst\n##osa\ncanada\nagent\nva\n##jiang\n##ちは\n##ーク\n##lam\nsg\n##nix\n##sday\n##よって\ng6\n##master\nbing\n##zl\ncharlie\n１６\n8mm\nnb40\n##ーン\nthai\n##ルフ\nln284ct\n##itz\n##2f\nbonnie\n##food\n##lent\noriginals\n##stro\n##lts\n418\n∟∣\n##bscribe\nchildren\nntd\nyesstyle\n##かも\nhmv\n##tment\nd5\n2cm\narts\nsms\n##pn\n##я\n##いい\ntopios9\n539\nlifestyle\nvirtual\n##ague\nxz\n##deo\nmuji\n024\nunt\n##nnis\n##ᅩ\nfaq1\n1884\n396\n##ette\nfly\n64㎡\nはしめまして\n441\ncurry\n##pop\nのこ\nrelease\n##←\n##◆◆\n##cast\n073\nありな\n500ml\n##ews\n5c\n##stle\nios7\n##ima\n787\ndog\nlenovo\n##r4\nroger\n013\ncbs\n
vornado\n100m\n417\n##desk\n##クok\n##ald\n1867\n9595\n2900\n##van\noil\n##ｘ\nsome\nbreak\ncommon\n##jy\n##lines\ng7\ntwice\n419\nella\nnano\nbelle\nにこ\n##mes\n##self\n##note\njb\n##ことかてきます\nbenz\n##との\n##ova\n451\nsave\n##wing\n##ますのて\nkai\nりは\n##hua\n##rect\nrainer\n##unge\n448\n##0m\nadsl\n##かな\nguestname\n##uma\n##kins\n##zu\ntokichoi\n##price\ncounty\n##med\n##mus\nrmk\n391\naddress\nvm\nえて\nopenload\n##group\n##hin\n##iginal\namg\nurban\n##oz\njobs\nemi\n##public\nbeautiful\n##sch\nalbum\n##dden\n##bell\njerry\nworks\nhostel\nmiller\n##drive\n##rmin\n##１０\n376\nboot\n828\n##370\n##fx\n##cm～\n1885\n##nome\n##ctionary\n##oman\n##lish\n##cr\n##hm\n433\n##how\n432\nfrancis\nxi\nc919\nb5\nevernote\n##uc\nvga\n##3000\ncoupe\n##urg\n##cca\n##uality\n019\n6g\nれる\nmulti\n##また\n##ett\nem\nhey\n##ani\n##tax\n##rma\ninside\nthan\n740\nleonnhurt\n##jin\nict\nれた\nbird\nnotes\n200mm\nくの\n##dical\n##lli\nresult\n442\niu\nee\n438\nsmap\ngopro\n##last\nyin\npure\n998\n32g\nけた\n5kg\n##dan\n##rame\nmama\n##oot\nbean\nmarketing\n##hur\n2l\nbella\nsync\nxuite\n##ground\n515\ndiscuz\n##getrelax\n##ince\n##bay\n##5s\ncj\n##イス\ngmat\napt\n##pass\njing\n##rix\nc4\nrich\n##とても\nniusnews\n##ello\nbag\n770\n##eting\n##mobile\n１８\nculture\n015\n##のてすか\n377\n1020\narea\n##ience\n616\ndetails\ngp\nuniversal\nsilver\ndit\nはお\nprivate\nddd\nu11\nkanshu\n##ified\nfung\n##nny\ndx\n##520\ntai\n475\n023\n##fr\n##lean\n3s\n##pin\n429\n##rin\n25000\nly\nrick\n##bility\nusb3\nbanner\n##baru\n##gion\nmetal\ndt\nvdf\n1871\nkarl\nqualcomm\nbear\n1010\noldid\nian\njo\n##tors\npopulation\n##ernel\n1882\nmmorpg\n##mv\n##bike\n603\n##©\nww\nfriend\n##ager\nexhibition\n##del\n##pods\nfpx\nstructure\n##free\n##tings\nkl\n##rley\n##copyright\n##mma\ncalifornia\n3400\norange\nyoga\n4l\ncanmake\nhoney\n##anda\n##コメント\n595\nnikkie\n##ルハイト\ndhl\npublishing\n##mall\n##gnet\n20cm\n513\n##クセス\n##┅\ne88\n970\n##dog\nfishbase\n##!\n##\"\n###\n##$\n##%\n##&\n##'\n##(\n##)\n##*\n##+\n##,\n##-\n##.\n##/\n##:\n##;\n##<\n##=
\n##>\n##?\n##@\n##[\n##\\\n##]\n##^\n##_\n##{\n##|\n##}\n##~\n##£\n##¤\n##¥\n##§\n##«\n##±\n##³\n##µ\n##·\n##¹\n##º\n##»\n##¼\n##ß\n##æ\n##÷\n##ø\n##đ\n##ŋ\n##ɔ\n##ə\n##ɡ\n##ʰ\n##ˇ\n##ˈ\n##ˊ\n##ˋ\n##ˍ\n##ː\n##˙\n##˚\n##ˢ\n##α\n##β\n##γ\n##δ\n##ε\n##η\n##θ\n##ι\n##κ\n##λ\n##μ\n##ν\n##ο\n##π\n##ρ\n##ς\n##σ\n##τ\n##υ\n##φ\n##χ\n##ψ\n##б\n##в\n##г\n##д\n##е\n##ж\n##з\n##к\n##л\n##м\n##н\n##о\n##п\n##р\n##с\n##т\n##у\n##ф\n##х\n##ц\n##ч\n##ш\n##ы\n##ь\n##і\n##ا\n##ب\n##ة\n##ت\n##د\n##ر\n##س\n##ع\n##ل\n##م\n##ن\n##ه\n##و\n##ي\n##۩\n##ก\n##ง\n##น\n##ม\n##ย\n##ร\n##อ\n##า\n##เ\n##๑\n##་\n##ღ\n##ᄀ\n##ᄁ\n##ᄂ\n##ᄃ\n##ᄅ\n##ᄆ\n##ᄇ\n##ᄈ\n##ᄉ\n##ᄋ\n##ᄌ\n##ᄎ\n##ᄏ\n##ᄐ\n##ᄑ\n##ᄒ\n##ᅢ\n##ᅣ\n##ᅥ\n##ᅦ\n##ᅧ\n##ᅨ\n##ᅪ\n##ᅬ\n##ᅭ\n##ᅮ\n##ᅯ\n##ᅲ\n##ᅳ\n##ᅴ\n##ᆷ\n##ᆸ\n##ᆺ\n##ᆻ\n##ᗜ\n##ᵃ\n##ᵉ\n##ᵍ\n##ᵏ\n##ᵐ\n##ᵒ\n##ᵘ\n##‖\n##„\n##†\n##•\n##‥\n##‧\n## \n##‰\n##′\n##″\n##‹\n##›\n##※\n##‿\n##⁄\n##ⁱ\n##⁺\n##ⁿ\n##₁\n##₃\n##₄\n##€\n##№\n##ⅰ\n##ⅱ\n##ⅲ\n##ⅳ\n##ⅴ\n##↔\n##↗\n##↘\n##⇒\n##∀\n##−\n##∕\n##∙\n##√\n##∞\n##∟\n##∠\n##∣\n##∩\n##∮\n##∶\n##∼\n##∽\n##≈\n##≒\n##≡\n##≤\n##≥\n##≦\n##≧\n##≪\n##≫\n##⊙\n##⋅\n##⋈\n##⋯\n##⌒\n##①\n##②\n##③\n##④\n##⑤\n##⑥\n##⑦\n##⑧\n##⑨\n##⑩\n##⑴\n##⑵\n##⑶\n##⑷\n##⑸\n##⒈\n##⒉\n##⒊\n##⒋\n##ⓒ\n##ⓔ\n##ⓘ\n##━\n##┃\n##┆\n##┊\n##┌\n##└\n##├\n##┣\n##═\n##║\n##╚\n##╞\n##╠\n##╭\n##╮\n##╯\n##╰\n##╱\n##╳\n##▂\n##▃\n##▅\n##▇\n##▉\n##▋\n##▌\n##▍\n##▎\n##□\n##▪\n##▫\n##▬\n##△\n##▶\n##►\n##▽\n##◇\n##◕\n##◠\n##◢\n##◤\n##☀\n##☕\n##☞\n##☺\n##☼\n##♀\n##♂\n##♠\n##♡\n##♣\n##♦\n##♫\n##♬\n##✈\n##✔\n##✕\n##✖\n##✦\n##✨\n##✪\n##✰\n##✿\n##❀\n##➜\n##➤\n##⦿\n##、\n##。\n##〃\n##々\n##〇\n##〈\n##〉\n##《\n##》\n##「\n##」\n##『\n##』\n##【\n##】\n##〓\n##〔\n##〕\n##〖\n##〗\n##〜\n##〝\n##〞\n##ぃ\n##ぇ\n##ぬ\n##ふ\n##ほ\n##む\n##ゃ\n##ゅ\n##ゆ\n##ょ\n##゜\n##ゝ\n##ァ\n##ゥ\n##エ\n##ォ\n##ケ\n##サ\n##セ\n##ソ\n##ッ\n##ニ\n##ヌ\n##ネ\n##ノ\n##ヘ\n##モ\n##ャ\n##ヤ\n##ュ\n##ユ\n##ョ\n##ヨ\n##ワ\n##ヲ\n##・\n##ヽ\n##ㄅ\n##ㄆ\n##ㄇ\n##ㄉ\n##ㄋ\n##ㄌ\n##ㄍ\n##ㄎ\n##ㄏ\n##ㄒ\n##ㄚ\n##ㄛ\n##ㄞ\n##ㄟ\n##ㄢ\n##ㄤ\n##ㄥ\n##ㄧ\n##ㄨ\n##ㆍ\n##㈦\n##㊣\n##㗎\n##一\n##丁\n##七\n##万\n##丈\n##三\n##上\n##
下\n##不\n##与\n##丐\n##丑\n##专\n##且\n##丕\n##世\n##丘\n##丙\n##业\n##丛\n##东\n##丝\n##丞\n##丟\n##両\n##丢\n##两\n##严\n##並\n##丧\n##丨\n##个\n##丫\n##中\n##丰\n##串\n##临\n##丶\n##丸\n##丹\n##为\n##主\n##丼\n##丽\n##举\n##丿\n##乂\n##乃\n##久\n##么\n##义\n##之\n##乌\n##乍\n##乎\n##乏\n##乐\n##乒\n##乓\n##乔\n##乖\n##乗\n##乘\n##乙\n##乜\n##九\n##乞\n##也\n##习\n##乡\n##书\n##乩\n##买\n##乱\n##乳\n##乾\n##亀\n##亂\n##了\n##予\n##争\n##事\n##二\n##于\n##亏\n##云\n##互\n##五\n##井\n##亘\n##亙\n##亚\n##些\n##亜\n##亞\n##亟\n##亡\n##亢\n##交\n##亥\n##亦\n##产\n##亨\n##亩\n##享\n##京\n##亭\n##亮\n##亲\n##亳\n##亵\n##人\n##亿\n##什\n##仁\n##仃\n##仄\n##仅\n##仆\n##仇\n##今\n##介\n##仍\n##从\n##仏\n##仑\n##仓\n##仔\n##仕\n##他\n##仗\n##付\n##仙\n##仝\n##仞\n##仟\n##代\n##令\n##以\n##仨\n##仪\n##们\n##仮\n##仰\n##仲\n##件\n##价\n##任\n##份\n##仿\n##企\n##伉\n##伊\n##伍\n##伎\n##伏\n##伐\n##休\n##伕\n##众\n##优\n##伙\n##会\n##伝\n##伞\n##伟\n##传\n##伢\n##伤\n##伦\n##伪\n##伫\n##伯\n##估\n##伴\n##伶\n##伸\n##伺\n##似\n##伽\n##佃\n##但\n##佇\n##佈\n##位\n##低\n##住\n##佐\n##佑\n##体\n##佔\n##何\n##佗\n##佘\n##余\n##佚\n##佛\n##作\n##佝\n##佞\n##佟\n##你\n##佢\n##佣\n##佤\n##佥\n##佩\n##佬\n##佯\n##佰\n##佳\n##併\n##佶\n##佻\n##佼\n##使\n##侃\n##侄\n##來\n##侈\n##例\n##侍\n##侏\n##侑\n##侖\n##侗\n##供\n##依\n##侠\n##価\n##侣\n##侥\n##侦\n##侧\n##侨\n##侬\n##侮\n##侯\n##侵\n##侶\n##侷\n##便\n##係\n##促\n##俄\n##俊\n##俎\n##俏\n##俐\n##俑\n##俗\n##俘\n##俚\n##保\n##俞\n##俟\n##俠\n##信\n##俨\n##俩\n##俪\n##俬\n##俭\n##修\n##俯\n##俱\n##俳\n##俸\n##俺\n##俾\n##倆\n##倉\n##個\n##倌\n##倍\n##倏\n##們\n##倒\n##倔\n##倖\n##倘\n##候\n##倚\n##倜\n##借\n##倡\n##値\n##倦\n##倩\n##倪\n##倫\n##倬\n##倭\n##倶\n##债\n##值\n##倾\n##偃\n##假\n##偈\n##偉\n##偌\n##偎\n##偏\n##偕\n##做\n##停\n##健\n##側\n##偵\n##偶\n##偷\n##偻\n##偽\n##偿\n##傀\n##傅\n##傍\n##傑\n##傘\n##備\n##傚\n##傢\n##傣\n##傥\n##储\n##傩\n##催\n##傭\n##傲\n##傳\n##債\n##傷\n##傻\n##傾\n##僅\n##働\n##像\n##僑\n##僕\n##僖\n##僚\n##僥\n##僧\n##僭\n##僮\n##僱\n##僵\n##價\n##僻\n##儀\n##儂\n##億\n##儆\n##儉\n##儋\n##儒\n##儕\n##儘\n##償\n##儡\n##優\n##儲\n##儷\n##儼\n##儿\n##兀\n##允\n##元\n##兄\n##充\n##兆\n##兇\n##先\n##光\n##克\n##兌\n##免\n##児\n##兑\n##兒\n##兔\n##兖\n##党\n##兜\n##兢\n##入\n##內\n##全\n##兩\n##八\n##公\n##六\n##兮\n##兰\n##共\n##兲\n##关\n##兴\n##兵\n##其\n##具\n##典\n##兹\n##养\n##兼\n##兽\n##
冀\n##内\n##円\n##冇\n##冈\n##冉\n##冊\n##册\n##再\n##冏\n##冒\n##冕\n##冗\n##写\n##军\n##农\n##冠\n##冢\n##冤\n##冥\n##冨\n##冪\n##冬\n##冯\n##冰\n##冲\n##决\n##况\n##冶\n##冷\n##冻\n##冼\n##冽\n##冾\n##净\n##凄\n##准\n##凇\n##凈\n##凉\n##凋\n##凌\n##凍\n##减\n##凑\n##凛\n##凜\n##凝\n##几\n##凡\n##凤\n##処\n##凪\n##凭\n##凯\n##凰\n##凱\n##凳\n##凶\n##凸\n##凹\n##出\n##击\n##函\n##凿\n##刀\n##刁\n##刃\n##分\n##切\n##刈\n##刊\n##刍\n##刎\n##刑\n##划\n##列\n##刘\n##则\n##刚\n##创\n##初\n##删\n##判\n##別\n##刨\n##利\n##刪\n##别\n##刮\n##到\n##制\n##刷\n##券\n##刹\n##刺\n##刻\n##刽\n##剁\n##剂\n##剃\n##則\n##剉\n##削\n##剋\n##剌\n##前\n##剎\n##剐\n##剑\n##剔\n##剖\n##剛\n##剜\n##剝\n##剣\n##剤\n##剥\n##剧\n##剩\n##剪\n##副\n##割\n##創\n##剷\n##剽\n##剿\n##劃\n##劇\n##劈\n##劉\n##劊\n##劍\n##劏\n##劑\n##力\n##劝\n##办\n##功\n##加\n##务\n##劣\n##动\n##助\n##努\n##劫\n##劭\n##励\n##劲\n##劳\n##労\n##劵\n##効\n##劾\n##势\n##勁\n##勃\n##勇\n##勉\n##勋\n##勐\n##勒\n##動\n##勖\n##勘\n##務\n##勛\n##勝\n##勞\n##募\n##勢\n##勤\n##勧\n##勳\n##勵\n##勸\n##勺\n##勻\n##勾\n##勿\n##匀\n##包\n##匆\n##匈\n##匍\n##匐\n##匕\n##化\n##北\n##匙\n##匝\n##匠\n##匡\n##匣\n##匪\n##匮\n##匯\n##匱\n##匹\n##区\n##医\n##匾\n##匿\n##區\n##十\n##千\n##卅\n##升\n##午\n##卉\n##半\n##卍\n##华\n##协\n##卑\n##卒\n##卓\n##協\n##单\n##卖\n##南\n##単\n##博\n##卜\n##卞\n##卟\n##占\n##卡\n##卢\n##卤\n##卦\n##卧\n##卫\n##卮\n##卯\n##印\n##危\n##即\n##却\n##卵\n##卷\n##卸\n##卻\n##卿\n##厂\n##厄\n##厅\n##历\n##厉\n##压\n##厌\n##厕\n##厘\n##厚\n##厝\n##原\n##厢\n##厥\n##厦\n##厨\n##厩\n##厭\n##厮\n##厲\n##厳\n##去\n##县\n##叁\n##参\n##參\n##又\n##叉\n##及\n##友\n##双\n##反\n##収\n##发\n##叔\n##取\n##受\n##变\n##叙\n##叛\n##叟\n##叠\n##叡\n##叢\n##口\n##古\n##句\n##另\n##叨\n##叩\n##只\n##叫\n##召\n##叭\n##叮\n##可\n##台\n##叱\n##史\n##右\n##叵\n##叶\n##号\n##司\n##叹\n##叻\n##叼\n##叽\n##吁\n##吃\n##各\n##吆\n##合\n##吉\n##吊\n##吋\n##同\n##名\n##后\n##吏\n##吐\n##向\n##吒\n##吓\n##吕\n##吖\n##吗\n##君\n##吝\n##吞\n##吟\n##吠\n##吡\n##否\n##吧\n##吨\n##吩\n##含\n##听\n##吭\n##吮\n##启\n##吱\n##吳\n##吴\n##吵\n##吶\n##吸\n##吹\n##吻\n##吼\n##吽\n##吾\n##呀\n##呂\n##呃\n##呆\n##呈\n##告\n##呋\n##呎\n##呐\n##呓\n##呕\n##呗\n##员\n##呛\n##呜\n##呢\n##呤\n##呦\n##周\n##呱\n##呲\n##味\n##呵\n##呷\n##呸\n##呻\n##呼\n##命\n##咀\n##咁\n##咂\n##咄\n##咆\n##咋\n##和\n##咎\n##咏\n##咐\n##咒\n##咔\n##咕\n##咖\n##咗\n##
咘\n##咙\n##咚\n##咛\n##咣\n##咤\n##咦\n##咧\n##咨\n##咩\n##咪\n##咫\n##咬\n##咭\n##咯\n##咱\n##咲\n##咳\n##咸\n##咻\n##咽\n##咿\n##哀\n##品\n##哂\n##哄\n##哆\n##哇\n##哈\n##哉\n##哋\n##哌\n##响\n##哎\n##哏\n##哐\n##哑\n##哒\n##哔\n##哗\n##哟\n##員\n##哥\n##哦\n##哧\n##哨\n##哩\n##哪\n##哭\n##哮\n##哲\n##哺\n##哼\n##哽\n##唁\n##唄\n##唆\n##唇\n##唉\n##唏\n##唐\n##唑\n##唔\n##唠\n##唤\n##唧\n##唬\n##售\n##唯\n##唰\n##唱\n##唳\n##唷\n##唸\n##唾\n##啃\n##啄\n##商\n##啉\n##啊\n##問\n##啓\n##啕\n##啖\n##啜\n##啞\n##啟\n##啡\n##啤\n##啥\n##啦\n##啧\n##啪\n##啫\n##啬\n##啮\n##啰\n##啱\n##啲\n##啵\n##啶\n##啷\n##啸\n##啻\n##啼\n##啾\n##喀\n##喂\n##喃\n##善\n##喆\n##喇\n##喉\n##喊\n##喋\n##喎\n##喏\n##喔\n##喘\n##喙\n##喚\n##喜\n##喝\n##喟\n##喧\n##喪\n##喫\n##喬\n##單\n##喰\n##喱\n##喲\n##喳\n##喵\n##営\n##喷\n##喹\n##喺\n##喻\n##喽\n##嗅\n##嗆\n##嗇\n##嗎\n##嗑\n##嗒\n##嗓\n##嗔\n##嗖\n##嗚\n##嗜\n##嗝\n##嗟\n##嗡\n##嗣\n##嗤\n##嗦\n##嗨\n##嗪\n##嗬\n##嗯\n##嗰\n##嗲\n##嗳\n##嗶\n##嗷\n##嗽\n##嘀\n##嘅\n##嘆\n##嘈\n##嘉\n##嘌\n##嘍\n##嘎\n##嘔\n##嘖\n##嘗\n##嘘\n##嘚\n##嘛\n##嘜\n##嘞\n##嘟\n##嘢\n##嘣\n##嘤\n##嘧\n##嘩\n##嘭\n##嘮\n##嘯\n##嘰\n##嘱\n##嘲\n##嘴\n##嘶\n##嘸\n##嘹\n##嘻\n##嘿\n##噁\n##噌\n##噎\n##噓\n##噔\n##噗\n##噙\n##噜\n##噠\n##噢\n##噤\n##器\n##噩\n##噪\n##噬\n##噱\n##噴\n##噶\n##噸\n##噹\n##噻\n##噼\n##嚀\n##嚇\n##嚎\n##嚏\n##嚐\n##嚓\n##嚕\n##嚟\n##嚣\n##嚥\n##嚨\n##嚮\n##嚴\n##嚷\n##嚼\n##囂\n##囉\n##囊\n##囍\n##囑\n##囔\n##囗\n##囚\n##四\n##囝\n##回\n##囟\n##因\n##囡\n##团\n##団\n##囤\n##囧\n##囪\n##囫\n##园\n##困\n##囱\n##囲\n##図\n##围\n##囹\n##固\n##国\n##图\n##囿\n##圃\n##圄\n##圆\n##圈\n##國\n##圍\n##圏\n##園\n##圓\n##圖\n##團\n##圜\n##土\n##圣\n##圧\n##在\n##圩\n##圭\n##地\n##圳\n##场\n##圻\n##圾\n##址\n##坂\n##均\n##坊\n##坍\n##坎\n##坏\n##坐\n##坑\n##块\n##坚\n##坛\n##坝\n##坞\n##坟\n##坠\n##坡\n##坤\n##坦\n##坨\n##坪\n##坯\n##坳\n##坵\n##坷\n##垂\n##垃\n##垄\n##型\n##垒\n##垚\n##垛\n##垠\n##垢\n##垣\n##垦\n##垩\n##垫\n##垭\n##垮\n##垵\n##埂\n##埃\n##埋\n##城\n##埔\n##埕\n##埗\n##域\n##埠\n##埤\n##埵\n##執\n##埸\n##培\n##基\n##埼\n##堀\n##堂\n##堃\n##堅\n##堆\n##堇\n##堑\n##堕\n##堙\n##堡\n##堤\n##堪\n##堯\n##堰\n##報\n##場\n##堵\n##堺\n##堿\n##塊\n##塌\n##塑\n##塔\n##塗\n##塘\n##塚\n##塞\n##塢\n##塩\n##填\n##塬\n##塭\n##塵\n##塾\n##墀\n##境\n##墅\n##墉\n##墊\n##墒\n##墓\n##増\n##墘\n##墙\n##墜\n##增\n##墟\n##墨\n##墩\n##墮\n##墳\n##
墻\n##墾\n##壁\n##壅\n##壆\n##壇\n##壊\n##壑\n##壓\n##壕\n##壘\n##壞\n##壟\n##壢\n##壤\n##壩\n##士\n##壬\n##壮\n##壯\n##声\n##売\n##壳\n##壶\n##壹\n##壺\n##壽\n##处\n##备\n##変\n##复\n##夏\n##夔\n##夕\n##外\n##夙\n##多\n##夜\n##够\n##夠\n##夢\n##夥\n##大\n##天\n##太\n##夫\n##夭\n##央\n##夯\n##失\n##头\n##夷\n##夸\n##夹\n##夺\n##夾\n##奂\n##奄\n##奇\n##奈\n##奉\n##奋\n##奎\n##奏\n##奐\n##契\n##奔\n##奕\n##奖\n##套\n##奘\n##奚\n##奠\n##奢\n##奥\n##奧\n##奪\n##奬\n##奮\n##女\n##奴\n##奶\n##奸\n##她\n##好\n##如\n##妃\n##妄\n##妆\n##妇\n##妈\n##妊\n##妍\n##妒\n##妓\n##妖\n##妘\n##妙\n##妝\n##妞\n##妣\n##妤\n##妥\n##妨\n##妩\n##妪\n##妮\n##妲\n##妳\n##妹\n##妻\n##妾\n##姆\n##姉\n##姊\n##始\n##姍\n##姐\n##姑\n##姒\n##姓\n##委\n##姗\n##姚\n##姜\n##姝\n##姣\n##姥\n##姦\n##姨\n##姪\n##姫\n##姬\n##姹\n##姻\n##姿\n##威\n##娃\n##娄\n##娅\n##娆\n##娇\n##娉\n##娑\n##娓\n##娘\n##娛\n##娜\n##娟\n##娠\n##娣\n##娥\n##娩\n##娱\n##娲\n##娴\n##娶\n##娼\n##婀\n##婁\n##婆\n##婉\n##婊\n##婕\n##婚\n##婢\n##婦\n##婧\n##婪\n##婭\n##婴\n##婵\n##婶\n##婷\n##婺\n##婿\n##媒\n##媚\n##媛\n##媞\n##媧\n##媲\n##媳\n##媽\n##媾\n##嫁\n##嫂\n##嫉\n##嫌\n##嫑\n##嫔\n##嫖\n##嫘\n##嫚\n##嫡\n##嫣\n##嫦\n##嫩\n##嫲\n##嫵\n##嫻\n##嬅\n##嬉\n##嬌\n##嬗\n##嬛\n##嬢\n##嬤\n##嬪\n##嬰\n##嬴\n##嬷\n##嬸\n##嬿\n##孀\n##孃\n##子\n##孑\n##孔\n##孕\n##孖\n##字\n##存\n##孙\n##孚\n##孛\n##孜\n##孝\n##孟\n##孢\n##季\n##孤\n##学\n##孩\n##孪\n##孫\n##孬\n##孰\n##孱\n##孳\n##孵\n##學\n##孺\n##孽\n##孿\n##宁\n##它\n##宅\n##宇\n##守\n##安\n##宋\n##完\n##宏\n##宓\n##宕\n##宗\n##官\n##宙\n##定\n##宛\n##宜\n##宝\n##实\n##実\n##宠\n##审\n##客\n##宣\n##室\n##宥\n##宦\n##宪\n##宫\n##宮\n##宰\n##害\n##宴\n##宵\n##家\n##宸\n##容\n##宽\n##宾\n##宿\n##寂\n##寄\n##寅\n##密\n##寇\n##富\n##寐\n##寒\n##寓\n##寛\n##寝\n##寞\n##察\n##寡\n##寢\n##寥\n##實\n##寧\n##寨\n##審\n##寫\n##寬\n##寮\n##寰\n##寵\n##寶\n##寸\n##对\n##寺\n##寻\n##导\n##対\n##寿\n##封\n##専\n##射\n##将\n##將\n##專\n##尉\n##尊\n##尋\n##對\n##導\n##小\n##少\n##尔\n##尕\n##尖\n##尘\n##尚\n##尝\n##尤\n##尧\n##尬\n##就\n##尴\n##尷\n##尸\n##尹\n##尺\n##尻\n##尼\n##尽\n##尾\n##尿\n##局\n##屁\n##层\n##屄\n##居\n##屆\n##屈\n##屉\n##届\n##屋\n##屌\n##屍\n##屎\n##屏\n##屐\n##屑\n##展\n##屜\n##属\n##屠\n##屡\n##屢\n##層\n##履\n##屬\n##屯\n##山\n##屹\n##屿\n##岀\n##岁\n##岂\n##岌\n##岐\n##岑\n##岔\n##岖\n##岗\n##岘\n##岙\n##岚\n##岛\n##岡\n##岩\n##岫\n##岬\n##岭\n##岱\n##岳\n##
岷\n##岸\n##峇\n##峋\n##峒\n##峙\n##峡\n##峤\n##峥\n##峦\n##峨\n##峪\n##峭\n##峯\n##峰\n##峴\n##島\n##峻\n##峽\n##崁\n##崂\n##崆\n##崇\n##崎\n##崑\n##崔\n##崖\n##崗\n##崙\n##崛\n##崧\n##崩\n##崭\n##崴\n##崽\n##嵇\n##嵊\n##嵋\n##嵌\n##嵐\n##嵘\n##嵩\n##嵬\n##嵯\n##嶂\n##嶄\n##嶇\n##嶋\n##嶙\n##嶺\n##嶼\n##嶽\n##巅\n##巍\n##巒\n##巔\n##巖\n##川\n##州\n##巡\n##巢\n##工\n##左\n##巧\n##巨\n##巩\n##巫\n##差\n##己\n##已\n##巳\n##巴\n##巷\n##巻\n##巽\n##巾\n##巿\n##币\n##市\n##布\n##帅\n##帆\n##师\n##希\n##帐\n##帑\n##帕\n##帖\n##帘\n##帚\n##帛\n##帜\n##帝\n##帥\n##带\n##帧\n##師\n##席\n##帮\n##帯\n##帰\n##帳\n##帶\n##帷\n##常\n##帼\n##帽\n##幀\n##幂\n##幄\n##幅\n##幌\n##幔\n##幕\n##幟\n##幡\n##幢\n##幣\n##幫\n##干\n##平\n##年\n##并\n##幸\n##幹\n##幺\n##幻\n##幼\n##幽\n##幾\n##广\n##庁\n##広\n##庄\n##庆\n##庇\n##床\n##序\n##庐\n##库\n##应\n##底\n##庖\n##店\n##庙\n##庚\n##府\n##庞\n##废\n##庠\n##度\n##座\n##庫\n##庭\n##庵\n##庶\n##康\n##庸\n##庹\n##庾\n##廁\n##廂\n##廃\n##廈\n##廉\n##廊\n##廓\n##廖\n##廚\n##廝\n##廟\n##廠\n##廢\n##廣\n##廬\n##廳\n##延\n##廷\n##建\n##廿\n##开\n##弁\n##异\n##弃\n##弄\n##弈\n##弊\n##弋\n##式\n##弑\n##弒\n##弓\n##弔\n##引\n##弗\n##弘\n##弛\n##弟\n##张\n##弥\n##弦\n##弧\n##弩\n##弭\n##弯\n##弱\n##張\n##強\n##弹\n##强\n##弼\n##弾\n##彅\n##彆\n##彈\n##彌\n##彎\n##归\n##当\n##录\n##彗\n##彙\n##彝\n##形\n##彤\n##彥\n##彦\n##彧\n##彩\n##彪\n##彫\n##彬\n##彭\n##彰\n##影\n##彷\n##役\n##彻\n##彼\n##彿\n##往\n##征\n##径\n##待\n##徇\n##很\n##徉\n##徊\n##律\n##後\n##徐\n##徑\n##徒\n##従\n##徕\n##得\n##徘\n##徙\n##徜\n##從\n##徠\n##御\n##徨\n##復\n##循\n##徬\n##微\n##徳\n##徴\n##徵\n##德\n##徹\n##徼\n##徽\n##心\n##必\n##忆\n##忌\n##忍\n##忏\n##忐\n##忑\n##忒\n##忖\n##志\n##忘\n##忙\n##応\n##忠\n##忡\n##忤\n##忧\n##忪\n##快\n##忱\n##念\n##忻\n##忽\n##忿\n##怀\n##态\n##怂\n##怅\n##怆\n##怎\n##怏\n##怒\n##怔\n##怕\n##怖\n##怙\n##怜\n##思\n##怠\n##怡\n##急\n##怦\n##性\n##怨\n##怪\n##怯\n##怵\n##总\n##怼\n##恁\n##恃\n##恆\n##恋\n##恍\n##恐\n##恒\n##恕\n##恙\n##恚\n##恢\n##恣\n##恤\n##恥\n##恨\n##恩\n##恪\n##恫\n##恬\n##恭\n##息\n##恰\n##恳\n##恵\n##恶\n##恸\n##恺\n##恻\n##恼\n##恿\n##悄\n##悅\n##悉\n##悌\n##悍\n##悔\n##悖\n##悚\n##悟\n##悠\n##患\n##悦\n##您\n##悩\n##悪\n##悬\n##悯\n##悱\n##悲\n##悴\n##悵\n##悶\n##悸\n##悻\n##悼\n##悽\n##情\n##惆\n##惇\n##惊\n##惋\n##惑\n##惕\n##惘\n##惚\n##惜\n##惟\n##惠\n##惡\n##惦\n##惧\n##惨\n##惩\n##惫\n##惬\n##惭\n##
惮\n##惯\n##惰\n##惱\n##想\n##惴\n##惶\n##惹\n##惺\n##愁\n##愆\n##愈\n##愉\n##愍\n##意\n##愕\n##愚\n##愛\n##愜\n##感\n##愣\n##愤\n##愧\n##愫\n##愷\n##愿\n##慄\n##慈\n##態\n##慌\n##慎\n##慑\n##慕\n##慘\n##慚\n##慟\n##慢\n##慣\n##慧\n##慨\n##慫\n##慮\n##慰\n##慳\n##慵\n##慶\n##慷\n##慾\n##憂\n##憊\n##憋\n##憎\n##憐\n##憑\n##憔\n##憚\n##憤\n##憧\n##憨\n##憩\n##憫\n##憬\n##憲\n##憶\n##憾\n##懂\n##懇\n##懈\n##應\n##懊\n##懋\n##懑\n##懒\n##懦\n##懲\n##懵\n##懶\n##懷\n##懸\n##懺\n##懼\n##懾\n##懿\n##戀\n##戈\n##戊\n##戌\n##戍\n##戎\n##戏\n##成\n##我\n##戒\n##戕\n##或\n##战\n##戚\n##戛\n##戟\n##戡\n##戦\n##截\n##戬\n##戮\n##戰\n##戲\n##戳\n##戴\n##戶\n##户\n##戸\n##戻\n##戾\n##房\n##所\n##扁\n##扇\n##扈\n##扉\n##手\n##才\n##扎\n##扑\n##扒\n##打\n##扔\n##払\n##托\n##扛\n##扣\n##扦\n##执\n##扩\n##扪\n##扫\n##扬\n##扭\n##扮\n##扯\n##扰\n##扱\n##扳\n##扶\n##批\n##扼\n##找\n##承\n##技\n##抄\n##抉\n##把\n##抑\n##抒\n##抓\n##投\n##抖\n##抗\n##折\n##抚\n##抛\n##抜\n##択\n##抟\n##抠\n##抡\n##抢\n##护\n##报\n##抨\n##披\n##抬\n##抱\n##抵\n##抹\n##押\n##抽\n##抿\n##拂\n##拄\n##担\n##拆\n##拇\n##拈\n##拉\n##拋\n##拌\n##拍\n##拎\n##拐\n##拒\n##拓\n##拔\n##拖\n##拗\n##拘\n##拙\n##拚\n##招\n##拜\n##拟\n##拡\n##拢\n##拣\n##拥\n##拦\n##拧\n##拨\n##择\n##括\n##拭\n##拮\n##拯\n##拱\n##拳\n##拴\n##拷\n##拼\n##拽\n##拾\n##拿\n##持\n##挂\n##指\n##挈\n##按\n##挎\n##挑\n##挖\n##挙\n##挚\n##挛\n##挝\n##挞\n##挟\n##挠\n##挡\n##挣\n##挤\n##挥\n##挨\n##挪\n##挫\n##振\n##挲\n##挹\n##挺\n##挽\n##挾\n##捂\n##捅\n##捆\n##捉\n##捋\n##捌\n##捍\n##捎\n##捏\n##捐\n##捕\n##捞\n##损\n##捡\n##换\n##捣\n##捧\n##捨\n##捩\n##据\n##捱\n##捲\n##捶\n##捷\n##捺\n##捻\n##掀\n##掂\n##掃\n##掇\n##授\n##掉\n##掌\n##掏\n##掐\n##排\n##掖\n##掘\n##掙\n##掛\n##掠\n##採\n##探\n##掣\n##接\n##控\n##推\n##掩\n##措\n##掬\n##掰\n##掲\n##掳\n##掴\n##掷\n##掸\n##掺\n##揀\n##揃\n##揄\n##揆\n##揉\n##揍\n##描\n##提\n##插\n##揖\n##揚\n##換\n##握\n##揣\n##揩\n##揪\n##揭\n##揮\n##援\n##揶\n##揸\n##揹\n##揽\n##搀\n##搁\n##搂\n##搅\n##損\n##搏\n##搐\n##搓\n##搔\n##搖\n##搗\n##搜\n##搞\n##搡\n##搪\n##搬\n##搭\n##搵\n##搶\n##携\n##搽\n##摀\n##摁\n##摄\n##摆\n##摇\n##摈\n##摊\n##摒\n##摔\n##摘\n##摞\n##摟\n##摧\n##摩\n##摯\n##摳\n##摸\n##摹\n##摺\n##摻\n##撂\n##撃\n##撅\n##撇\n##撈\n##撐\n##撑\n##撒\n##撓\n##撕\n##撚\n##撞\n##撤\n##撥\n##撩\n##撫\n##撬\n##播\n##撮\n##撰\n##撲\n##撵\n##撷\n##撸\n##撻\n##撼\n##撿\n##擀\n##擁\n##擂\n##擄\n##
擅\n##擇\n##擊\n##擋\n##操\n##擎\n##擒\n##擔\n##擘\n##據\n##擞\n##擠\n##擡\n##擢\n##擦\n##擬\n##擰\n##擱\n##擲\n##擴\n##擷\n##擺\n##擼\n##擾\n##攀\n##攏\n##攒\n##攔\n##攘\n##攙\n##攜\n##攝\n##攞\n##攢\n##攣\n##攤\n##攥\n##攪\n##攫\n##攬\n##支\n##收\n##攸\n##改\n##攻\n##放\n##政\n##故\n##效\n##敌\n##敍\n##敎\n##敏\n##救\n##敕\n##敖\n##敗\n##敘\n##教\n##敛\n##敝\n##敞\n##敢\n##散\n##敦\n##敬\n##数\n##敲\n##整\n##敵\n##敷\n##數\n##斂\n##斃\n##文\n##斋\n##斌\n##斎\n##斐\n##斑\n##斓\n##斗\n##料\n##斛\n##斜\n##斟\n##斡\n##斤\n##斥\n##斧\n##斩\n##斫\n##斬\n##断\n##斯\n##新\n##斷\n##方\n##於\n##施\n##旁\n##旃\n##旅\n##旋\n##旌\n##旎\n##族\n##旖\n##旗\n##无\n##既\n##日\n##旦\n##旧\n##旨\n##早\n##旬\n##旭\n##旮\n##旱\n##时\n##旷\n##旺\n##旻\n##昀\n##昂\n##昆\n##昇\n##昉\n##昊\n##昌\n##明\n##昏\n##易\n##昔\n##昕\n##昙\n##星\n##映\n##春\n##昧\n##昨\n##昭\n##是\n##昱\n##昴\n##昵\n##昶\n##昼\n##显\n##晁\n##時\n##晃\n##晉\n##晋\n##晌\n##晏\n##晒\n##晓\n##晔\n##晕\n##晖\n##晗\n##晚\n##晝\n##晞\n##晟\n##晤\n##晦\n##晨\n##晩\n##普\n##景\n##晰\n##晴\n##晶\n##晷\n##智\n##晾\n##暂\n##暄\n##暇\n##暈\n##暉\n##暌\n##暐\n##暑\n##暖\n##暗\n##暝\n##暢\n##暧\n##暨\n##暫\n##暮\n##暱\n##暴\n##暸\n##暹\n##曄\n##曆\n##曇\n##曉\n##曖\n##曙\n##曜\n##曝\n##曠\n##曦\n##曬\n##曰\n##曲\n##曳\n##更\n##書\n##曹\n##曼\n##曾\n##替\n##最\n##會\n##月\n##有\n##朋\n##服\n##朐\n##朔\n##朕\n##朗\n##望\n##朝\n##期\n##朦\n##朧\n##木\n##未\n##末\n##本\n##札\n##朮\n##术\n##朱\n##朴\n##朵\n##机\n##朽\n##杀\n##杂\n##权\n##杆\n##杈\n##杉\n##李\n##杏\n##材\n##村\n##杓\n##杖\n##杜\n##杞\n##束\n##杠\n##条\n##来\n##杨\n##杭\n##杯\n##杰\n##東\n##杳\n##杵\n##杷\n##杼\n##松\n##板\n##极\n##构\n##枇\n##枉\n##枋\n##析\n##枕\n##林\n##枚\n##果\n##枝\n##枢\n##枣\n##枪\n##枫\n##枭\n##枯\n##枰\n##枱\n##枳\n##架\n##枷\n##枸\n##柄\n##柏\n##某\n##柑\n##柒\n##染\n##柔\n##柘\n##柚\n##柜\n##柞\n##柠\n##柢\n##查\n##柩\n##柬\n##柯\n##柱\n##柳\n##柴\n##柵\n##査\n##柿\n##栀\n##栃\n##栄\n##栅\n##标\n##栈\n##栉\n##栋\n##栎\n##栏\n##树\n##栓\n##栖\n##栗\n##校\n##栩\n##株\n##样\n##核\n##根\n##格\n##栽\n##栾\n##桀\n##桁\n##桂\n##桃\n##桅\n##框\n##案\n##桉\n##桌\n##桎\n##桐\n##桑\n##桓\n##桔\n##桜\n##桠\n##桡\n##桢\n##档\n##桥\n##桦\n##桧\n##桨\n##桩\n##桶\n##桿\n##梁\n##梅\n##梆\n##梏\n##梓\n##梗\n##條\n##梟\n##梢\n##梦\n##梧\n##梨\n##梭\n##梯\n##械\n##梳\n##梵\n##梶\n##检\n##棂\n##棄\n##棉\n##棋\n##棍\n##棒\n##棕\n##棗\n##棘\n##棚\n##棟\n##
棠\n##棣\n##棧\n##森\n##棱\n##棲\n##棵\n##棹\n##棺\n##椁\n##椅\n##椋\n##植\n##椎\n##椒\n##検\n##椪\n##椭\n##椰\n##椹\n##椽\n##椿\n##楂\n##楊\n##楓\n##楔\n##楚\n##楝\n##楞\n##楠\n##楣\n##楨\n##楫\n##業\n##楮\n##極\n##楷\n##楸\n##楹\n##楼\n##楽\n##概\n##榄\n##榆\n##榈\n##榉\n##榔\n##榕\n##榖\n##榛\n##榜\n##榨\n##榫\n##榭\n##榮\n##榱\n##榴\n##榷\n##榻\n##槁\n##槃\n##構\n##槌\n##槍\n##槎\n##槐\n##槓\n##様\n##槛\n##槟\n##槤\n##槭\n##槲\n##槳\n##槻\n##槽\n##槿\n##樁\n##樂\n##樊\n##樑\n##樓\n##標\n##樞\n##樟\n##模\n##樣\n##権\n##横\n##樫\n##樯\n##樱\n##樵\n##樸\n##樹\n##樺\n##樽\n##樾\n##橄\n##橇\n##橋\n##橐\n##橘\n##橙\n##機\n##橡\n##橢\n##橫\n##橱\n##橹\n##橼\n##檀\n##檄\n##檎\n##檐\n##檔\n##檗\n##檜\n##檢\n##檬\n##檯\n##檳\n##檸\n##檻\n##櫃\n##櫚\n##櫛\n##櫥\n##櫸\n##櫻\n##欄\n##權\n##欒\n##欖\n##欠\n##次\n##欢\n##欣\n##欧\n##欲\n##欸\n##欺\n##欽\n##款\n##歆\n##歇\n##歉\n##歌\n##歎\n##歐\n##歓\n##歙\n##歛\n##歡\n##止\n##正\n##此\n##步\n##武\n##歧\n##歩\n##歪\n##歯\n##歲\n##歳\n##歴\n##歷\n##歸\n##歹\n##死\n##歼\n##殁\n##殃\n##殆\n##殇\n##殉\n##殊\n##残\n##殒\n##殓\n##殖\n##殘\n##殞\n##殡\n##殤\n##殭\n##殯\n##殲\n##殴\n##段\n##殷\n##殺\n##殼\n##殿\n##毀\n##毁\n##毂\n##毅\n##毆\n##毋\n##母\n##毎\n##每\n##毒\n##毓\n##比\n##毕\n##毗\n##毘\n##毙\n##毛\n##毡\n##毫\n##毯\n##毽\n##氈\n##氏\n##氐\n##民\n##氓\n##气\n##氖\n##気\n##氙\n##氛\n##氟\n##氡\n##氢\n##氣\n##氤\n##氦\n##氧\n##氨\n##氪\n##氫\n##氮\n##氯\n##氰\n##氲\n##水\n##氷\n##永\n##氹\n##氾\n##汀\n##汁\n##求\n##汆\n##汇\n##汉\n##汎\n##汐\n##汕\n##汗\n##汙\n##汛\n##汝\n##汞\n##江\n##池\n##污\n##汤\n##汨\n##汩\n##汪\n##汰\n##汲\n##汴\n##汶\n##汹\n##決\n##汽\n##汾\n##沁\n##沂\n##沃\n##沅\n##沈\n##沉\n##沌\n##沏\n##沐\n##沒\n##沓\n##沖\n##沙\n##沛\n##沟\n##没\n##沢\n##沣\n##沥\n##沦\n##沧\n##沪\n##沫\n##沭\n##沮\n##沱\n##河\n##沸\n##油\n##治\n##沼\n##沽\n##沾\n##沿\n##況\n##泄\n##泉\n##泊\n##泌\n##泓\n##法\n##泗\n##泛\n##泞\n##泠\n##泡\n##波\n##泣\n##泥\n##注\n##泪\n##泫\n##泮\n##泯\n##泰\n##泱\n##泳\n##泵\n##泷\n##泸\n##泻\n##泼\n##泽\n##泾\n##洁\n##洄\n##洋\n##洒\n##洗\n##洙\n##洛\n##洞\n##津\n##洩\n##洪\n##洮\n##洱\n##洲\n##洵\n##洶\n##洸\n##洹\n##活\n##洼\n##洽\n##派\n##流\n##浃\n##浄\n##浅\n##浆\n##浇\n##浊\n##测\n##济\n##浏\n##浑\n##浒\n##浓\n##浔\n##浙\n##浚\n##浜\n##浣\n##浦\n##浩\n##浪\n##浬\n##浮\n##浯\n##浴\n##海\n##浸\n##涂\n##涅\n##涇\n##消\n##涉\n##涌\n##涎\n##涓\n##涔\n##涕\n##涙\n##涛\n##涝\n##涞\n##
涟\n##涠\n##涡\n##涣\n##涤\n##润\n##涧\n##涨\n##涩\n##涪\n##涮\n##涯\n##液\n##涵\n##涸\n##涼\n##涿\n##淀\n##淄\n##淅\n##淆\n##淇\n##淋\n##淌\n##淑\n##淒\n##淖\n##淘\n##淙\n##淚\n##淞\n##淡\n##淤\n##淦\n##淨\n##淩\n##淪\n##淫\n##淬\n##淮\n##深\n##淳\n##淵\n##混\n##淹\n##淺\n##添\n##淼\n##清\n##済\n##渉\n##渊\n##渋\n##渍\n##渎\n##渐\n##渔\n##渗\n##渙\n##渚\n##減\n##渝\n##渠\n##渡\n##渣\n##渤\n##渥\n##渦\n##温\n##測\n##渭\n##港\n##渲\n##渴\n##游\n##渺\n##渾\n##湃\n##湄\n##湊\n##湍\n##湖\n##湘\n##湛\n##湟\n##湧\n##湫\n##湮\n##湯\n##湳\n##湾\n##湿\n##満\n##溃\n##溅\n##溉\n##溏\n##源\n##準\n##溜\n##溝\n##溟\n##溢\n##溥\n##溧\n##溪\n##溫\n##溯\n##溱\n##溴\n##溶\n##溺\n##溼\n##滁\n##滂\n##滄\n##滅\n##滇\n##滋\n##滌\n##滑\n##滓\n##滔\n##滕\n##滙\n##滚\n##滝\n##滞\n##滟\n##满\n##滢\n##滤\n##滥\n##滦\n##滨\n##滩\n##滬\n##滯\n##滲\n##滴\n##滷\n##滸\n##滾\n##滿\n##漁\n##漂\n##漆\n##漉\n##漏\n##漓\n##演\n##漕\n##漠\n##漢\n##漣\n##漩\n##漪\n##漫\n##漬\n##漯\n##漱\n##漲\n##漳\n##漸\n##漾\n##漿\n##潆\n##潇\n##潋\n##潍\n##潑\n##潔\n##潘\n##潛\n##潜\n##潞\n##潟\n##潢\n##潤\n##潦\n##潧\n##潭\n##潮\n##潰\n##潴\n##潸\n##潺\n##潼\n##澀\n##澄\n##澆\n##澈\n##澍\n##澎\n##澗\n##澜\n##澡\n##澤\n##澧\n##澱\n##澳\n##澹\n##激\n##濁\n##濂\n##濃\n##濑\n##濒\n##濕\n##濘\n##濛\n##濟\n##濠\n##濡\n##濤\n##濫\n##濬\n##濮\n##濯\n##濱\n##濺\n##濾\n##瀅\n##瀆\n##瀉\n##瀋\n##瀏\n##瀑\n##瀕\n##瀘\n##瀚\n##瀛\n##瀝\n##瀞\n##瀟\n##瀧\n##瀨\n##瀬\n##瀰\n##瀾\n##灌\n##灏\n##灑\n##灘\n##灝\n##灞\n##灣\n##火\n##灬\n##灭\n##灯\n##灰\n##灵\n##灶\n##灸\n##灼\n##災\n##灾\n##灿\n##炀\n##炁\n##炅\n##炉\n##炊\n##炎\n##炒\n##炔\n##炕\n##炖\n##炙\n##炜\n##炫\n##炬\n##炭\n##炮\n##炯\n##炳\n##炷\n##炸\n##点\n##為\n##炼\n##炽\n##烁\n##烂\n##烃\n##烈\n##烊\n##烏\n##烘\n##烙\n##烛\n##烟\n##烤\n##烦\n##烧\n##烨\n##烩\n##烫\n##烬\n##热\n##烯\n##烷\n##烹\n##烽\n##焉\n##焊\n##焕\n##焖\n##焗\n##焘\n##焙\n##焚\n##焜\n##無\n##焦\n##焯\n##焰\n##焱\n##然\n##焼\n##煅\n##煉\n##煊\n##煌\n##煎\n##煒\n##煖\n##煙\n##煜\n##煞\n##煤\n##煥\n##煦\n##照\n##煨\n##煩\n##煮\n##煲\n##煸\n##煽\n##熄\n##熊\n##熏\n##熒\n##熔\n##熙\n##熟\n##熠\n##熨\n##熬\n##熱\n##熵\n##熹\n##熾\n##燁\n##燃\n##燄\n##燈\n##燉\n##燊\n##燎\n##燒\n##燔\n##燕\n##燙\n##燜\n##營\n##燥\n##燦\n##燧\n##燭\n##燮\n##燴\n##燻\n##燼\n##燿\n##爆\n##爍\n##爐\n##爛\n##爪\n##爬\n##爭\n##爰\n##爱\n##爲\n##爵\n##父\n##爷\n##爸\n##爹\n##爺\n##爻\n##爽\n##爾\n##牆\n##片\n##版\n##牌\n##
牍\n##牒\n##牙\n##牛\n##牝\n##牟\n##牠\n##牡\n##牢\n##牦\n##牧\n##物\n##牯\n##牲\n##牴\n##牵\n##特\n##牺\n##牽\n##犀\n##犁\n##犄\n##犊\n##犍\n##犒\n##犢\n##犧\n##犬\n##犯\n##状\n##犷\n##犸\n##犹\n##狀\n##狂\n##狄\n##狈\n##狎\n##狐\n##狒\n##狗\n##狙\n##狞\n##狠\n##狡\n##狩\n##独\n##狭\n##狮\n##狰\n##狱\n##狸\n##狹\n##狼\n##狽\n##猎\n##猕\n##猖\n##猗\n##猙\n##猛\n##猜\n##猝\n##猥\n##猩\n##猪\n##猫\n##猬\n##献\n##猴\n##猶\n##猷\n##猾\n##猿\n##獄\n##獅\n##獎\n##獐\n##獒\n##獗\n##獠\n##獣\n##獨\n##獭\n##獰\n##獲\n##獵\n##獷\n##獸\n##獺\n##獻\n##獼\n##獾\n##玄\n##率\n##玉\n##王\n##玑\n##玖\n##玛\n##玟\n##玠\n##玥\n##玩\n##玫\n##玮\n##环\n##现\n##玲\n##玳\n##玷\n##玺\n##玻\n##珀\n##珂\n##珅\n##珈\n##珉\n##珊\n##珍\n##珏\n##珐\n##珑\n##珙\n##珞\n##珠\n##珣\n##珥\n##珩\n##珪\n##班\n##珮\n##珲\n##珺\n##現\n##球\n##琅\n##理\n##琇\n##琉\n##琊\n##琍\n##琏\n##琐\n##琛\n##琢\n##琥\n##琦\n##琨\n##琪\n##琬\n##琮\n##琰\n##琲\n##琳\n##琴\n##琵\n##琶\n##琺\n##琼\n##瑀\n##瑁\n##瑄\n##瑋\n##瑕\n##瑗\n##瑙\n##瑚\n##瑛\n##瑜\n##瑞\n##瑟\n##瑠\n##瑣\n##瑤\n##瑩\n##瑪\n##瑯\n##瑰\n##瑶\n##瑾\n##璀\n##璁\n##璃\n##璇\n##璉\n##璋\n##璎\n##璐\n##璜\n##璞\n##璟\n##璧\n##璨\n##環\n##璽\n##璿\n##瓊\n##瓏\n##瓒\n##瓜\n##瓢\n##瓣\n##瓤\n##瓦\n##瓮\n##瓯\n##瓴\n##瓶\n##瓷\n##甄\n##甌\n##甕\n##甘\n##甙\n##甚\n##甜\n##生\n##產\n##産\n##甥\n##甦\n##用\n##甩\n##甫\n##甬\n##甭\n##甯\n##田\n##由\n##甲\n##申\n##电\n##男\n##甸\n##町\n##画\n##甾\n##畀\n##畅\n##界\n##畏\n##畑\n##畔\n##留\n##畜\n##畝\n##畢\n##略\n##畦\n##番\n##畫\n##異\n##畲\n##畳\n##畴\n##當\n##畸\n##畹\n##畿\n##疆\n##疇\n##疊\n##疏\n##疑\n##疔\n##疖\n##疗\n##疙\n##疚\n##疝\n##疟\n##疡\n##疣\n##疤\n##疥\n##疫\n##疮\n##疯\n##疱\n##疲\n##疳\n##疵\n##疸\n##疹\n##疼\n##疽\n##疾\n##痂\n##病\n##症\n##痈\n##痉\n##痊\n##痍\n##痒\n##痔\n##痕\n##痘\n##痙\n##痛\n##痞\n##痠\n##痢\n##痣\n##痤\n##痧\n##痨\n##痪\n##痫\n##痰\n##痱\n##痴\n##痹\n##痺\n##痼\n##痿\n##瘀\n##瘁\n##瘋\n##瘍\n##瘓\n##瘘\n##瘙\n##瘟\n##瘠\n##瘡\n##瘢\n##瘤\n##瘦\n##瘧\n##瘩\n##瘪\n##瘫\n##瘴\n##瘸\n##瘾\n##療\n##癇\n##癌\n##癒\n##癖\n##癜\n##癞\n##癡\n##癢\n##癣\n##癥\n##癫\n##癬\n##癮\n##癱\n##癲\n##癸\n##発\n##登\n##發\n##白\n##百\n##皂\n##的\n##皆\n##皇\n##皈\n##皋\n##皎\n##皑\n##皓\n##皖\n##皙\n##皚\n##皮\n##皰\n##皱\n##皴\n##皺\n##皿\n##盂\n##盃\n##盅\n##盆\n##盈\n##益\n##盎\n##盏\n##盐\n##监\n##盒\n##盔\n##盖\n##盗\n##盘\n##盛\n##盜\n##盞\n##盟\n##盡\n##監\n##盤\n##盥\n##
盧\n##盪\n##目\n##盯\n##盱\n##盲\n##直\n##相\n##盹\n##盼\n##盾\n##省\n##眈\n##眉\n##看\n##県\n##眙\n##眞\n##真\n##眠\n##眦\n##眨\n##眩\n##眯\n##眶\n##眷\n##眸\n##眺\n##眼\n##眾\n##着\n##睁\n##睇\n##睏\n##睐\n##睑\n##睛\n##睜\n##睞\n##睡\n##睢\n##督\n##睥\n##睦\n##睨\n##睪\n##睫\n##睬\n##睹\n##睽\n##睾\n##睿\n##瞄\n##瞅\n##瞇\n##瞋\n##瞌\n##瞎\n##瞑\n##瞒\n##瞓\n##瞞\n##瞟\n##瞠\n##瞥\n##瞧\n##瞩\n##瞪\n##瞬\n##瞭\n##瞰\n##瞳\n##瞻\n##瞼\n##瞿\n##矇\n##矍\n##矗\n##矚\n##矛\n##矜\n##矢\n##矣\n##知\n##矩\n##矫\n##短\n##矮\n##矯\n##石\n##矶\n##矽\n##矾\n##矿\n##码\n##砂\n##砌\n##砍\n##砒\n##研\n##砖\n##砗\n##砚\n##砝\n##砣\n##砥\n##砧\n##砭\n##砰\n##砲\n##破\n##砷\n##砸\n##砺\n##砼\n##砾\n##础\n##硅\n##硐\n##硒\n##硕\n##硝\n##硫\n##硬\n##确\n##硯\n##硼\n##碁\n##碇\n##碉\n##碌\n##碍\n##碎\n##碑\n##碓\n##碗\n##碘\n##碚\n##碛\n##碟\n##碣\n##碧\n##碩\n##碰\n##碱\n##碳\n##碴\n##確\n##碼\n##碾\n##磁\n##磅\n##磊\n##磋\n##磐\n##磕\n##磚\n##磡\n##磨\n##磬\n##磯\n##磲\n##磷\n##磺\n##礁\n##礎\n##礙\n##礡\n##礦\n##礪\n##礫\n##礴\n##示\n##礼\n##社\n##祀\n##祁\n##祂\n##祇\n##祈\n##祉\n##祎\n##祐\n##祕\n##祖\n##祗\n##祚\n##祛\n##祜\n##祝\n##神\n##祟\n##祠\n##祢\n##祥\n##票\n##祭\n##祯\n##祷\n##祸\n##祺\n##祿\n##禀\n##禁\n##禄\n##禅\n##禍\n##禎\n##福\n##禛\n##禦\n##禧\n##禪\n##禮\n##禱\n##禹\n##禺\n##离\n##禽\n##禾\n##禿\n##秀\n##私\n##秃\n##秆\n##秉\n##秋\n##种\n##科\n##秒\n##秘\n##租\n##秣\n##秤\n##秦\n##秧\n##秩\n##秭\n##积\n##称\n##秸\n##移\n##秽\n##稀\n##稅\n##程\n##稍\n##税\n##稔\n##稗\n##稚\n##稜\n##稞\n##稟\n##稠\n##稣\n##種\n##稱\n##稲\n##稳\n##稷\n##稹\n##稻\n##稼\n##稽\n##稿\n##穀\n##穂\n##穆\n##穌\n##積\n##穎\n##穗\n##穢\n##穩\n##穫\n##穴\n##究\n##穷\n##穹\n##空\n##穿\n##突\n##窃\n##窄\n##窈\n##窍\n##窑\n##窒\n##窓\n##窕\n##窖\n##窗\n##窘\n##窜\n##窝\n##窟\n##窠\n##窥\n##窦\n##窨\n##窩\n##窪\n##窮\n##窯\n##窺\n##窿\n##竄\n##竅\n##竇\n##竊\n##立\n##竖\n##站\n##竜\n##竞\n##竟\n##章\n##竣\n##童\n##竭\n##端\n##競\n##竹\n##竺\n##竽\n##竿\n##笃\n##笆\n##笈\n##笋\n##笏\n##笑\n##笔\n##笙\n##笛\n##笞\n##笠\n##符\n##笨\n##第\n##笹\n##笺\n##笼\n##筆\n##等\n##筊\n##筋\n##筍\n##筏\n##筐\n##筑\n##筒\n##答\n##策\n##筛\n##筝\n##筠\n##筱\n##筲\n##筵\n##筷\n##筹\n##签\n##简\n##箇\n##箋\n##箍\n##箏\n##箐\n##箔\n##箕\n##算\n##箝\n##管\n##箩\n##箫\n##箭\n##箱\n##箴\n##箸\n##節\n##篁\n##範\n##篆\n##篇\n##築\n##篑\n##篓\n##篙\n##篝\n##篠\n##篡\n##篤\n##篩\n##篪\n##篮\n##篱\n##篷\n##簇\n##
簌\n##簍\n##簡\n##簦\n##簧\n##簪\n##簫\n##簷\n##簸\n##簽\n##簾\n##簿\n##籁\n##籃\n##籌\n##籍\n##籐\n##籟\n##籠\n##籤\n##籬\n##籮\n##籲\n##米\n##类\n##籼\n##籽\n##粄\n##粉\n##粑\n##粒\n##粕\n##粗\n##粘\n##粟\n##粤\n##粥\n##粧\n##粪\n##粮\n##粱\n##粲\n##粳\n##粵\n##粹\n##粼\n##粽\n##精\n##粿\n##糅\n##糊\n##糍\n##糕\n##糖\n##糗\n##糙\n##糜\n##糞\n##糟\n##糠\n##糧\n##糬\n##糯\n##糰\n##糸\n##系\n##糾\n##紀\n##紂\n##約\n##紅\n##紉\n##紊\n##紋\n##納\n##紐\n##紓\n##純\n##紗\n##紘\n##紙\n##級\n##紛\n##紜\n##素\n##紡\n##索\n##紧\n##紫\n##紮\n##累\n##細\n##紳\n##紹\n##紺\n##終\n##絃\n##組\n##絆\n##経\n##結\n##絕\n##絞\n##絡\n##絢\n##給\n##絨\n##絮\n##統\n##絲\n##絳\n##絵\n##絶\n##絹\n##綁\n##綏\n##綑\n##經\n##継\n##続\n##綜\n##綠\n##綢\n##綦\n##綫\n##綬\n##維\n##綱\n##網\n##綴\n##綵\n##綸\n##綺\n##綻\n##綽\n##綾\n##綿\n##緊\n##緋\n##総\n##緑\n##緒\n##緘\n##線\n##緝\n##緞\n##締\n##緣\n##編\n##緩\n##緬\n##緯\n##練\n##緹\n##緻\n##縁\n##縄\n##縈\n##縛\n##縝\n##縣\n##縫\n##縮\n##縱\n##縴\n##縷\n##總\n##績\n##繁\n##繃\n##繆\n##繇\n##繋\n##織\n##繕\n##繚\n##繞\n##繡\n##繩\n##繪\n##繫\n##繭\n##繳\n##繹\n##繼\n##繽\n##纂\n##續\n##纍\n##纏\n##纓\n##纔\n##纖\n##纜\n##纠\n##红\n##纣\n##纤\n##约\n##级\n##纨\n##纪\n##纫\n##纬\n##纭\n##纯\n##纰\n##纱\n##纲\n##纳\n##纵\n##纶\n##纷\n##纸\n##纹\n##纺\n##纽\n##纾\n##线\n##绀\n##练\n##组\n##绅\n##细\n##织\n##终\n##绊\n##绍\n##绎\n##经\n##绑\n##绒\n##结\n##绔\n##绕\n##绘\n##给\n##绚\n##绛\n##络\n##绝\n##绞\n##统\n##绡\n##绢\n##绣\n##绥\n##绦\n##继\n##绩\n##绪\n##绫\n##续\n##绮\n##绯\n##绰\n##绳\n##维\n##绵\n##绶\n##绷\n##绸\n##绻\n##综\n##绽\n##绾\n##绿\n##缀\n##缄\n##缅\n##缆\n##缇\n##缈\n##缉\n##缎\n##缓\n##缔\n##缕\n##编\n##缘\n##缙\n##缚\n##缜\n##缝\n##缠\n##缢\n##缤\n##缥\n##缨\n##缩\n##缪\n##缭\n##缮\n##缰\n##缱\n##缴\n##缸\n##缺\n##缽\n##罂\n##罄\n##罌\n##罐\n##网\n##罔\n##罕\n##罗\n##罚\n##罡\n##罢\n##罩\n##罪\n##置\n##罰\n##署\n##罵\n##罷\n##罹\n##羁\n##羅\n##羈\n##羊\n##羌\n##美\n##羔\n##羚\n##羞\n##羟\n##羡\n##羣\n##群\n##羥\n##羧\n##羨\n##義\n##羯\n##羲\n##羸\n##羹\n##羽\n##羿\n##翁\n##翅\n##翊\n##翌\n##翎\n##習\n##翔\n##翘\n##翟\n##翠\n##翡\n##翦\n##翩\n##翰\n##翱\n##翳\n##翹\n##翻\n##翼\n##耀\n##老\n##考\n##耄\n##者\n##耆\n##耋\n##而\n##耍\n##耐\n##耒\n##耕\n##耗\n##耘\n##耙\n##耦\n##耨\n##耳\n##耶\n##耷\n##耸\n##耻\n##耽\n##耿\n##聂\n##聆\n##聊\n##聋\n##职\n##聒\n##联\n##聖\n##聘\n##聚\n##聞\n##聪\n##聯\n##聰\n##聲\n##聳\n##
聴\n##聶\n##職\n##聽\n##聾\n##聿\n##肃\n##肄\n##肅\n##肆\n##肇\n##肉\n##肋\n##肌\n##肏\n##肓\n##肖\n##肘\n##肚\n##肛\n##肝\n##肠\n##股\n##肢\n##肤\n##肥\n##肩\n##肪\n##肮\n##肯\n##肱\n##育\n##肴\n##肺\n##肽\n##肾\n##肿\n##胀\n##胁\n##胃\n##胄\n##胆\n##背\n##胍\n##胎\n##胖\n##胚\n##胛\n##胜\n##胝\n##胞\n##胡\n##胤\n##胥\n##胧\n##胫\n##胭\n##胯\n##胰\n##胱\n##胳\n##胴\n##胶\n##胸\n##胺\n##能\n##脂\n##脅\n##脆\n##脇\n##脈\n##脉\n##脊\n##脍\n##脏\n##脐\n##脑\n##脓\n##脖\n##脘\n##脚\n##脛\n##脣\n##脩\n##脫\n##脯\n##脱\n##脲\n##脳\n##脸\n##脹\n##脾\n##腆\n##腈\n##腊\n##腋\n##腌\n##腎\n##腐\n##腑\n##腓\n##腔\n##腕\n##腥\n##腦\n##腩\n##腫\n##腭\n##腮\n##腰\n##腱\n##腳\n##腴\n##腸\n##腹\n##腺\n##腻\n##腼\n##腾\n##腿\n##膀\n##膈\n##膊\n##膏\n##膑\n##膘\n##膚\n##膛\n##膜\n##膝\n##膠\n##膦\n##膨\n##膩\n##膳\n##膺\n##膻\n##膽\n##膾\n##膿\n##臀\n##臂\n##臃\n##臆\n##臉\n##臊\n##臍\n##臓\n##臘\n##臟\n##臣\n##臥\n##臧\n##臨\n##自\n##臬\n##臭\n##至\n##致\n##臺\n##臻\n##臼\n##臾\n##舀\n##舂\n##舅\n##舆\n##與\n##興\n##舉\n##舊\n##舌\n##舍\n##舎\n##舐\n##舒\n##舔\n##舖\n##舗\n##舛\n##舜\n##舞\n##舟\n##航\n##舫\n##般\n##舰\n##舱\n##舵\n##舶\n##舷\n##舸\n##船\n##舺\n##舾\n##艇\n##艋\n##艘\n##艙\n##艦\n##艮\n##良\n##艰\n##艱\n##色\n##艳\n##艷\n##艹\n##艺\n##艾\n##节\n##芃\n##芈\n##芊\n##芋\n##芍\n##芎\n##芒\n##芙\n##芜\n##芝\n##芡\n##芥\n##芦\n##芩\n##芪\n##芫\n##芬\n##芭\n##芮\n##芯\n##花\n##芳\n##芷\n##芸\n##芹\n##芻\n##芽\n##芾\n##苁\n##苄\n##苇\n##苋\n##苍\n##苏\n##苑\n##苒\n##苓\n##苔\n##苕\n##苗\n##苛\n##苜\n##苞\n##苟\n##苡\n##苣\n##若\n##苦\n##苫\n##苯\n##英\n##苷\n##苹\n##苻\n##茁\n##茂\n##范\n##茄\n##茅\n##茉\n##茎\n##茏\n##茗\n##茜\n##茧\n##茨\n##茫\n##茬\n##茭\n##茯\n##茱\n##茲\n##茴\n##茵\n##茶\n##茸\n##茹\n##茼\n##荀\n##荃\n##荆\n##草\n##荊\n##荏\n##荐\n##荒\n##荔\n##荖\n##荘\n##荚\n##荞\n##荟\n##荠\n##荡\n##荣\n##荤\n##荥\n##荧\n##荨\n##荪\n##荫\n##药\n##荳\n##荷\n##荸\n##荻\n##荼\n##荽\n##莅\n##莆\n##莉\n##莊\n##莎\n##莒\n##莓\n##莖\n##莘\n##莞\n##莠\n##莢\n##莧\n##莪\n##莫\n##莱\n##莲\n##莴\n##获\n##莹\n##莺\n##莽\n##莿\n##菀\n##菁\n##菅\n##菇\n##菈\n##菊\n##菌\n##菏\n##菓\n##菖\n##菘\n##菜\n##菟\n##菠\n##菡\n##菩\n##華\n##菱\n##菲\n##菸\n##菽\n##萁\n##萃\n##萄\n##萊\n##萋\n##萌\n##萍\n##萎\n##萘\n##萝\n##萤\n##营\n##萦\n##萧\n##萨\n##萩\n##萬\n##萱\n##萵\n##萸\n##萼\n##落\n##葆\n##葉\n##著\n##葚\n##葛\n##葡\n##董\n##葦\n##葩\n##葫\n##葬\n##葭\n##葯\n##葱\n##葳\n##
葵\n##葷\n##葺\n##蒂\n##蒋\n##蒐\n##蒔\n##蒙\n##蒜\n##蒞\n##蒟\n##蒡\n##蒨\n##蒲\n##蒸\n##蒹\n##蒻\n##蒼\n##蒿\n##蓁\n##蓄\n##蓆\n##蓉\n##蓋\n##蓑\n##蓓\n##蓖\n##蓝\n##蓟\n##蓦\n##蓬\n##蓮\n##蓼\n##蓿\n##蔑\n##蔓\n##蔔\n##蔗\n##蔘\n##蔚\n##蔡\n##蔣\n##蔥\n##蔫\n##蔬\n##蔭\n##蔵\n##蔷\n##蔺\n##蔻\n##蔼\n##蔽\n##蕁\n##蕃\n##蕈\n##蕉\n##蕊\n##蕎\n##蕙\n##蕤\n##蕨\n##蕩\n##蕪\n##蕭\n##蕲\n##蕴\n##蕻\n##蕾\n##薄\n##薅\n##薇\n##薈\n##薊\n##薏\n##薑\n##薔\n##薙\n##薛\n##薦\n##薨\n##薩\n##薪\n##薬\n##薯\n##薰\n##薹\n##藉\n##藍\n##藏\n##藐\n##藓\n##藕\n##藜\n##藝\n##藤\n##藥\n##藩\n##藹\n##藻\n##藿\n##蘆\n##蘇\n##蘊\n##蘋\n##蘑\n##蘚\n##蘭\n##蘸\n##蘼\n##蘿\n##虎\n##虏\n##虐\n##虑\n##虔\n##處\n##虚\n##虛\n##虜\n##虞\n##號\n##虢\n##虧\n##虫\n##虬\n##虱\n##虹\n##虻\n##虽\n##虾\n##蚀\n##蚁\n##蚂\n##蚊\n##蚌\n##蚓\n##蚕\n##蚜\n##蚝\n##蚣\n##蚤\n##蚩\n##蚪\n##蚯\n##蚱\n##蚵\n##蛀\n##蛆\n##蛇\n##蛊\n##蛋\n##蛎\n##蛐\n##蛔\n##蛙\n##蛛\n##蛟\n##蛤\n##蛭\n##蛮\n##蛰\n##蛳\n##蛹\n##蛻\n##蛾\n##蜀\n##蜂\n##蜃\n##蜆\n##蜇\n##蜈\n##蜊\n##蜍\n##蜒\n##蜓\n##蜕\n##蜗\n##蜘\n##蜚\n##蜜\n##蜡\n##蜢\n##蜥\n##蜱\n##蜴\n##蜷\n##蜻\n##蜿\n##蝇\n##蝈\n##蝉\n##蝌\n##蝎\n##蝕\n##蝗\n##蝙\n##蝟\n##蝠\n##蝦\n##蝨\n##蝴\n##蝶\n##蝸\n##蝼\n##螂\n##螃\n##融\n##螞\n##螢\n##螨\n##螯\n##螳\n##螺\n##蟀\n##蟄\n##蟆\n##蟋\n##蟎\n##蟑\n##蟒\n##蟠\n##蟬\n##蟲\n##蟹\n##蟻\n##蟾\n##蠅\n##蠍\n##蠔\n##蠕\n##蠛\n##蠟\n##蠡\n##蠢\n##蠣\n##蠱\n##蠶\n##蠹\n##蠻\n##血\n##衄\n##衅\n##衆\n##行\n##衍\n##術\n##衔\n##街\n##衙\n##衛\n##衝\n##衞\n##衡\n##衢\n##衣\n##补\n##表\n##衩\n##衫\n##衬\n##衮\n##衰\n##衲\n##衷\n##衹\n##衾\n##衿\n##袁\n##袂\n##袄\n##袅\n##袈\n##袋\n##袍\n##袒\n##袖\n##袜\n##袞\n##袤\n##袪\n##被\n##袭\n##袱\n##裁\n##裂\n##装\n##裆\n##裊\n##裏\n##裔\n##裕\n##裘\n##裙\n##補\n##裝\n##裟\n##裡\n##裤\n##裨\n##裱\n##裳\n##裴\n##裸\n##裹\n##製\n##裾\n##褂\n##複\n##褐\n##褒\n##褓\n##褔\n##褚\n##褥\n##褪\n##褫\n##褲\n##褶\n##褻\n##襁\n##襄\n##襟\n##襠\n##襪\n##襬\n##襯\n##襲\n##西\n##要\n##覃\n##覆\n##覇\n##見\n##規\n##覓\n##視\n##覚\n##覦\n##覧\n##親\n##覬\n##観\n##覷\n##覺\n##覽\n##觀\n##见\n##观\n##规\n##觅\n##视\n##览\n##觉\n##觊\n##觎\n##觐\n##觑\n##角\n##觞\n##解\n##觥\n##触\n##觸\n##言\n##訂\n##計\n##訊\n##討\n##訓\n##訕\n##訖\n##託\n##記\n##訛\n##訝\n##訟\n##訣\n##訥\n##訪\n##設\n##許\n##訳\n##訴\n##訶\n##診\n##註\n##証\n##詆\n##詐\n##詔\n##評\n##詛\n##詞\n##詠\n##詡\n##詢\n##詣\n##試\n##詩\n##詫\n##
詬\n##詭\n##詮\n##詰\n##話\n##該\n##詳\n##詹\n##詼\n##誅\n##誇\n##誉\n##誌\n##認\n##誓\n##誕\n##誘\n##語\n##誠\n##誡\n##誣\n##誤\n##誥\n##誦\n##誨\n##說\n##説\n##読\n##誰\n##課\n##誹\n##誼\n##調\n##諄\n##談\n##請\n##諏\n##諒\n##論\n##諗\n##諜\n##諡\n##諦\n##諧\n##諫\n##諭\n##諮\n##諱\n##諳\n##諷\n##諸\n##諺\n##諾\n##謀\n##謁\n##謂\n##謄\n##謊\n##謎\n##謐\n##謔\n##謗\n##謙\n##講\n##謝\n##謠\n##謨\n##謬\n##謹\n##謾\n##譁\n##證\n##譎\n##譏\n##識\n##譙\n##譚\n##譜\n##警\n##譬\n##譯\n##議\n##譲\n##譴\n##護\n##譽\n##讀\n##變\n##讓\n##讚\n##讞\n##计\n##订\n##认\n##讥\n##讧\n##讨\n##让\n##讪\n##讫\n##训\n##议\n##讯\n##记\n##讲\n##讳\n##讴\n##讶\n##讷\n##许\n##讹\n##论\n##讼\n##讽\n##设\n##访\n##诀\n##证\n##诃\n##评\n##诅\n##识\n##诈\n##诉\n##诊\n##诋\n##词\n##诏\n##译\n##试\n##诗\n##诘\n##诙\n##诚\n##诛\n##话\n##诞\n##诟\n##诠\n##诡\n##询\n##诣\n##诤\n##该\n##详\n##诧\n##诩\n##诫\n##诬\n##语\n##误\n##诰\n##诱\n##诲\n##说\n##诵\n##诶\n##请\n##诸\n##诺\n##读\n##诽\n##课\n##诿\n##谀\n##谁\n##调\n##谄\n##谅\n##谆\n##谈\n##谊\n##谋\n##谌\n##谍\n##谎\n##谏\n##谐\n##谑\n##谒\n##谓\n##谔\n##谕\n##谗\n##谘\n##谙\n##谚\n##谛\n##谜\n##谟\n##谢\n##谣\n##谤\n##谥\n##谦\n##谧\n##谨\n##谩\n##谪\n##谬\n##谭\n##谯\n##谱\n##谲\n##谴\n##谶\n##谷\n##豁\n##豆\n##豇\n##豈\n##豉\n##豊\n##豌\n##豎\n##豐\n##豔\n##豚\n##象\n##豢\n##豪\n##豫\n##豬\n##豹\n##豺\n##貂\n##貅\n##貌\n##貓\n##貔\n##貘\n##貝\n##貞\n##負\n##財\n##貢\n##貧\n##貨\n##販\n##貪\n##貫\n##責\n##貯\n##貰\n##貳\n##貴\n##貶\n##買\n##貸\n##費\n##貼\n##貽\n##貿\n##賀\n##賁\n##賂\n##賃\n##賄\n##資\n##賈\n##賊\n##賑\n##賓\n##賜\n##賞\n##賠\n##賡\n##賢\n##賣\n##賤\n##賦\n##質\n##賬\n##賭\n##賴\n##賺\n##購\n##賽\n##贅\n##贈\n##贊\n##贍\n##贏\n##贓\n##贖\n##贛\n##贝\n##贞\n##负\n##贡\n##财\n##责\n##贤\n##败\n##账\n##货\n##质\n##贩\n##贪\n##贫\n##贬\n##购\n##贮\n##贯\n##贰\n##贱\n##贲\n##贴\n##贵\n##贷\n##贸\n##费\n##贺\n##贻\n##贼\n##贾\n##贿\n##赁\n##赂\n##赃\n##资\n##赅\n##赈\n##赊\n##赋\n##赌\n##赎\n##赏\n##赐\n##赓\n##赔\n##赖\n##赘\n##赚\n##赛\n##赝\n##赞\n##赠\n##赡\n##赢\n##赣\n##赤\n##赦\n##赧\n##赫\n##赭\n##走\n##赳\n##赴\n##赵\n##赶\n##起\n##趁\n##超\n##越\n##趋\n##趕\n##趙\n##趟\n##趣\n##趨\n##足\n##趴\n##趵\n##趸\n##趺\n##趾\n##跃\n##跄\n##跆\n##跋\n##跌\n##跎\n##跑\n##跖\n##跚\n##跛\n##距\n##跟\n##跡\n##跤\n##跨\n##跩\n##跪\n##路\n##跳\n##践\n##跷\n##跹\n##跺\n##跻\n##踉\n##踊\n##踌\n##踏\n##踐\n##踝\n##踞\n##踟\n##踢\n##
踩\n##踪\n##踮\n##踱\n##踴\n##踵\n##踹\n##蹂\n##蹄\n##蹇\n##蹈\n##蹉\n##蹊\n##蹋\n##蹑\n##蹒\n##蹙\n##蹟\n##蹣\n##蹤\n##蹦\n##蹩\n##蹬\n##蹭\n##蹲\n##蹴\n##蹶\n##蹺\n##蹼\n##蹿\n##躁\n##躇\n##躉\n##躊\n##躋\n##躍\n##躏\n##躪\n##身\n##躬\n##躯\n##躲\n##躺\n##軀\n##車\n##軋\n##軌\n##軍\n##軒\n##軟\n##転\n##軸\n##軼\n##軽\n##軾\n##較\n##載\n##輒\n##輓\n##輔\n##輕\n##輛\n##輝\n##輟\n##輩\n##輪\n##輯\n##輸\n##輻\n##輾\n##輿\n##轄\n##轅\n##轆\n##轉\n##轍\n##轎\n##轟\n##车\n##轧\n##轨\n##轩\n##转\n##轭\n##轮\n##软\n##轰\n##轲\n##轴\n##轶\n##轻\n##轼\n##载\n##轿\n##较\n##辄\n##辅\n##辆\n##辇\n##辈\n##辉\n##辊\n##辍\n##辐\n##辑\n##输\n##辕\n##辖\n##辗\n##辘\n##辙\n##辛\n##辜\n##辞\n##辟\n##辣\n##辦\n##辨\n##辩\n##辫\n##辭\n##辮\n##辯\n##辰\n##辱\n##農\n##边\n##辺\n##辻\n##込\n##辽\n##达\n##迁\n##迂\n##迄\n##迅\n##过\n##迈\n##迎\n##运\n##近\n##返\n##还\n##这\n##进\n##远\n##违\n##连\n##迟\n##迢\n##迤\n##迥\n##迦\n##迩\n##迪\n##迫\n##迭\n##述\n##迴\n##迷\n##迸\n##迹\n##迺\n##追\n##退\n##送\n##适\n##逃\n##逅\n##逆\n##选\n##逊\n##逍\n##透\n##逐\n##递\n##途\n##逕\n##逗\n##這\n##通\n##逛\n##逝\n##逞\n##速\n##造\n##逢\n##連\n##逮\n##週\n##進\n##逵\n##逶\n##逸\n##逻\n##逼\n##逾\n##遁\n##遂\n##遅\n##遇\n##遊\n##運\n##遍\n##過\n##遏\n##遐\n##遑\n##遒\n##道\n##達\n##違\n##遗\n##遙\n##遛\n##遜\n##遞\n##遠\n##遢\n##遣\n##遥\n##遨\n##適\n##遭\n##遮\n##遲\n##遴\n##遵\n##遶\n##遷\n##選\n##遺\n##遼\n##遽\n##避\n##邀\n##邁\n##邂\n##邃\n##還\n##邇\n##邈\n##邊\n##邋\n##邏\n##邑\n##邓\n##邕\n##邛\n##邝\n##邢\n##那\n##邦\n##邨\n##邪\n##邬\n##邮\n##邯\n##邰\n##邱\n##邳\n##邵\n##邸\n##邹\n##邺\n##邻\n##郁\n##郅\n##郊\n##郎\n##郑\n##郜\n##郝\n##郡\n##郢\n##郤\n##郦\n##郧\n##部\n##郫\n##郭\n##郴\n##郵\n##郷\n##郸\n##都\n##鄂\n##鄉\n##鄒\n##鄔\n##鄙\n##鄞\n##鄢\n##鄧\n##鄭\n##鄰\n##鄱\n##鄲\n##鄺\n##酉\n##酊\n##酋\n##酌\n##配\n##酐\n##酒\n##酗\n##酚\n##酝\n##酢\n##酣\n##酥\n##酩\n##酪\n##酬\n##酮\n##酯\n##酰\n##酱\n##酵\n##酶\n##酷\n##酸\n##酿\n##醃\n##醇\n##醉\n##醋\n##醍\n##醐\n##醒\n##醚\n##醛\n##醜\n##醞\n##醣\n##醪\n##醫\n##醬\n##醮\n##醯\n##醴\n##醺\n##釀\n##釁\n##采\n##釉\n##释\n##釋\n##里\n##重\n##野\n##量\n##釐\n##金\n##釗\n##釘\n##釜\n##針\n##釣\n##釦\n##釧\n##釵\n##鈀\n##鈉\n##鈍\n##鈎\n##鈔\n##鈕\n##鈞\n##鈣\n##鈦\n##鈪\n##鈴\n##鈺\n##鈾\n##鉀\n##鉄\n##鉅\n##鉉\n##鉑\n##鉗\n##鉚\n##鉛\n##鉤\n##鉴\n##鉻\n##銀\n##銃\n##銅\n##銑\n##銓\n##銖\n##銘\n##銜\n##銬\n##銭\n##銮\n##銳\n##銷\n##
銹\n##鋁\n##鋅\n##鋒\n##鋤\n##鋪\n##鋰\n##鋸\n##鋼\n##錄\n##錐\n##錘\n##錚\n##錠\n##錢\n##錦\n##錨\n##錫\n##錮\n##錯\n##録\n##錳\n##錶\n##鍊\n##鍋\n##鍍\n##鍛\n##鍥\n##鍰\n##鍵\n##鍺\n##鍾\n##鎂\n##鎊\n##鎌\n##鎏\n##鎔\n##鎖\n##鎗\n##鎚\n##鎧\n##鎬\n##鎮\n##鎳\n##鏈\n##鏖\n##鏗\n##鏘\n##鏞\n##鏟\n##鏡\n##鏢\n##鏤\n##鏽\n##鐘\n##鐮\n##鐲\n##鐳\n##鐵\n##鐸\n##鐺\n##鑄\n##鑊\n##鑑\n##鑒\n##鑣\n##鑫\n##鑰\n##鑲\n##鑼\n##鑽\n##鑾\n##鑿\n##针\n##钉\n##钊\n##钎\n##钏\n##钒\n##钓\n##钗\n##钙\n##钛\n##钜\n##钝\n##钞\n##钟\n##钠\n##钡\n##钢\n##钣\n##钤\n##钥\n##钦\n##钧\n##钨\n##钩\n##钮\n##钯\n##钰\n##钱\n##钳\n##钴\n##钵\n##钺\n##钻\n##钼\n##钾\n##钿\n##铀\n##铁\n##铂\n##铃\n##铄\n##铅\n##铆\n##铉\n##铎\n##铐\n##铛\n##铜\n##铝\n##铠\n##铡\n##铢\n##铣\n##铤\n##铨\n##铩\n##铬\n##铭\n##铮\n##铰\n##铲\n##铵\n##银\n##铸\n##铺\n##链\n##铿\n##销\n##锁\n##锂\n##锄\n##锅\n##锆\n##锈\n##锉\n##锋\n##锌\n##锏\n##锐\n##锑\n##错\n##锚\n##锟\n##锡\n##锢\n##锣\n##锤\n##锥\n##锦\n##锭\n##键\n##锯\n##锰\n##锲\n##锵\n##锹\n##锺\n##锻\n##镀\n##镁\n##镂\n##镇\n##镉\n##镌\n##镍\n##镐\n##镑\n##镕\n##镖\n##镗\n##镛\n##镜\n##镣\n##镭\n##镯\n##镰\n##镳\n##镶\n##長\n##长\n##門\n##閃\n##閉\n##開\n##閎\n##閏\n##閑\n##閒\n##間\n##閔\n##閘\n##閡\n##関\n##閣\n##閥\n##閨\n##閩\n##閱\n##閲\n##閹\n##閻\n##閾\n##闆\n##闇\n##闊\n##闌\n##闍\n##闔\n##闕\n##闖\n##闘\n##關\n##闡\n##闢\n##门\n##闪\n##闫\n##闭\n##问\n##闯\n##闰\n##闲\n##间\n##闵\n##闷\n##闸\n##闹\n##闺\n##闻\n##闽\n##闾\n##阀\n##阁\n##阂\n##阅\n##阆\n##阇\n##阈\n##阉\n##阎\n##阐\n##阑\n##阔\n##阕\n##阖\n##阙\n##阚\n##阜\n##队\n##阡\n##阪\n##阮\n##阱\n##防\n##阳\n##阴\n##阵\n##阶\n##阻\n##阿\n##陀\n##陂\n##附\n##际\n##陆\n##陇\n##陈\n##陋\n##陌\n##降\n##限\n##陕\n##陛\n##陝\n##陞\n##陟\n##陡\n##院\n##陣\n##除\n##陨\n##险\n##陪\n##陰\n##陲\n##陳\n##陵\n##陶\n##陷\n##陸\n##険\n##陽\n##隅\n##隆\n##隈\n##隊\n##隋\n##隍\n##階\n##随\n##隐\n##隔\n##隕\n##隘\n##隙\n##際\n##障\n##隠\n##隣\n##隧\n##隨\n##險\n##隱\n##隴\n##隶\n##隸\n##隻\n##隼\n##隽\n##难\n##雀\n##雁\n##雄\n##雅\n##集\n##雇\n##雉\n##雋\n##雌\n##雍\n##雎\n##雏\n##雑\n##雒\n##雕\n##雖\n##雙\n##雛\n##雜\n##雞\n##離\n##難\n##雨\n##雪\n##雯\n##雰\n##雲\n##雳\n##零\n##雷\n##雹\n##電\n##雾\n##需\n##霁\n##霄\n##霆\n##震\n##霈\n##霉\n##霊\n##霍\n##霎\n##霏\n##霑\n##霓\n##霖\n##霜\n##霞\n##霧\n##霭\n##霰\n##露\n##霸\n##霹\n##霽\n##霾\n##靂\n##靄\n##靈\n##青\n##靓\n##靖\n##静\n##靚\n##靛\n##靜\n##
非\n##靠\n##靡\n##面\n##靥\n##靦\n##革\n##靳\n##靴\n##靶\n##靼\n##鞅\n##鞋\n##鞍\n##鞏\n##鞑\n##鞘\n##鞠\n##鞣\n##鞦\n##鞭\n##韆\n##韋\n##韌\n##韓\n##韜\n##韦\n##韧\n##韩\n##韬\n##韭\n##音\n##韵\n##韶\n##韻\n##響\n##頁\n##頂\n##頃\n##項\n##順\n##須\n##頌\n##預\n##頑\n##頒\n##頓\n##頗\n##領\n##頜\n##頡\n##頤\n##頫\n##頭\n##頰\n##頷\n##頸\n##頹\n##頻\n##頼\n##顆\n##題\n##額\n##顎\n##顏\n##顔\n##願\n##顛\n##類\n##顧\n##顫\n##顯\n##顱\n##顴\n##页\n##顶\n##顷\n##项\n##顺\n##须\n##顼\n##顽\n##顾\n##顿\n##颁\n##颂\n##预\n##颅\n##领\n##颇\n##颈\n##颉\n##颊\n##颌\n##颍\n##颐\n##频\n##颓\n##颔\n##颖\n##颗\n##题\n##颚\n##颛\n##颜\n##额\n##颞\n##颠\n##颡\n##颢\n##颤\n##颦\n##颧\n##風\n##颯\n##颱\n##颳\n##颶\n##颼\n##飄\n##飆\n##风\n##飒\n##飓\n##飕\n##飘\n##飙\n##飚\n##飛\n##飞\n##食\n##飢\n##飨\n##飩\n##飪\n##飯\n##飲\n##飼\n##飽\n##飾\n##餃\n##餅\n##餉\n##養\n##餌\n##餐\n##餒\n##餓\n##餘\n##餚\n##餛\n##餞\n##餡\n##館\n##餮\n##餵\n##餾\n##饅\n##饈\n##饋\n##饌\n##饍\n##饑\n##饒\n##饕\n##饗\n##饞\n##饥\n##饨\n##饪\n##饬\n##饭\n##饮\n##饯\n##饰\n##饱\n##饲\n##饴\n##饵\n##饶\n##饷\n##饺\n##饼\n##饽\n##饿\n##馀\n##馁\n##馄\n##馅\n##馆\n##馈\n##馋\n##馍\n##馏\n##馒\n##馔\n##首\n##馗\n##香\n##馥\n##馨\n##馬\n##馭\n##馮\n##馳\n##馴\n##駁\n##駄\n##駅\n##駆\n##駐\n##駒\n##駕\n##駛\n##駝\n##駭\n##駱\n##駿\n##騁\n##騎\n##騏\n##験\n##騙\n##騨\n##騰\n##騷\n##驀\n##驅\n##驊\n##驍\n##驒\n##驕\n##驗\n##驚\n##驛\n##驟\n##驢\n##驥\n##马\n##驭\n##驮\n##驯\n##驰\n##驱\n##驳\n##驴\n##驶\n##驷\n##驸\n##驹\n##驻\n##驼\n##驾\n##驿\n##骁\n##骂\n##骄\n##骅\n##骆\n##骇\n##骈\n##骊\n##骋\n##验\n##骏\n##骐\n##骑\n##骗\n##骚\n##骛\n##骜\n##骞\n##骠\n##骡\n##骤\n##骥\n##骧\n##骨\n##骯\n##骰\n##骶\n##骷\n##骸\n##骼\n##髂\n##髅\n##髋\n##髏\n##髒\n##髓\n##體\n##髖\n##高\n##髦\n##髪\n##髮\n##髯\n##髻\n##鬃\n##鬆\n##鬍\n##鬓\n##鬚\n##鬟\n##鬢\n##鬣\n##鬥\n##鬧\n##鬱\n##鬼\n##魁\n##魂\n##魄\n##魅\n##魇\n##魍\n##魏\n##魔\n##魘\n##魚\n##魯\n##魷\n##鮑\n##鮨\n##鮪\n##鮭\n##鮮\n##鯉\n##鯊\n##鯖\n##鯛\n##鯨\n##鯰\n##鯽\n##鰍\n##鰓\n##鰭\n##鰲\n##鰻\n##鰾\n##鱈\n##鱉\n##鱔\n##鱗\n##鱷\n##鱸\n##鱼\n##鱿\n##鲁\n##鲈\n##鲍\n##鲑\n##鲛\n##鲜\n##鲟\n##鲢\n##鲤\n##鲨\n##鲫\n##鲱\n##鲲\n##鲶\n##鲷\n##鲸\n##鳃\n##鳄\n##鳅\n##鳌\n##鳍\n##鳕\n##鳖\n##鳗\n##鳝\n##鳞\n##鳥\n##鳩\n##鳳\n##鳴\n##鳶\n##鴉\n##鴕\n##鴛\n##鴦\n##鴨\n##鴻\n##鴿\n##鵑\n##鵜\n##鵝\n##鵡\n##鵬\n##鵰\n##鵲\n##鶘\n##鶩\n##鶯\n##鶴\n##鷗\n##鷲\n##鷹\n##
鷺\n##鸚\n##鸞\n##鸟\n##鸠\n##鸡\n##鸢\n##鸣\n##鸥\n##鸦\n##鸨\n##鸪\n##鸭\n##鸯\n##鸳\n##鸵\n##鸽\n##鸾\n##鸿\n##鹂\n##鹃\n##鹄\n##鹅\n##鹈\n##鹉\n##鹊\n##鹌\n##鹏\n##鹑\n##鹕\n##鹘\n##鹜\n##鹞\n##鹤\n##鹦\n##鹧\n##鹫\n##鹭\n##鹰\n##鹳\n##鹵\n##鹹\n##鹼\n##鹽\n##鹿\n##麂\n##麋\n##麒\n##麓\n##麗\n##麝\n##麟\n##麥\n##麦\n##麩\n##麴\n##麵\n##麸\n##麺\n##麻\n##麼\n##麽\n##麾\n##黃\n##黄\n##黍\n##黎\n##黏\n##黑\n##黒\n##黔\n##默\n##黛\n##黜\n##黝\n##點\n##黠\n##黨\n##黯\n##黴\n##鼋\n##鼎\n##鼐\n##鼓\n##鼠\n##鼬\n##鼹\n##鼻\n##鼾\n##齁\n##齊\n##齋\n##齐\n##齒\n##齡\n##齢\n##齣\n##齦\n##齿\n##龄\n##龅\n##龈\n##龊\n##龋\n##龌\n##龍\n##龐\n##龔\n##龕\n##龙\n##龚\n##龛\n##龜\n##龟\n##︰\n##︱\n##︶\n##︿\n##﹁\n##﹂\n##﹍\n##﹏\n##﹐\n##﹑\n##﹒\n##﹔\n##﹕\n##﹖\n##﹗\n##﹙\n##﹚\n##﹝\n##﹞\n##﹡\n##﹣\n##！\n##＂\n##＃\n##＄\n##％\n##＆\n##＇\n##（\n##）\n##＊\n##，\n##－\n##．\n##／\n##：\n##；\n##＜\n##？\n##＠\n##［\n##＼\n##］\n##＾\n##＿\n##｀\n##ｆ\n##ｈ\n##ｊ\n##ｕ\n##ｗ\n##ｚ\n##｛\n##｝\n##｡\n##｢\n##｣\n##､\n##･\n##ｯ\n##ｰ\n##ｲ\n##ｸ\n##ｼ\n##ｽ\n##ﾄ\n##ﾉ\n##ﾌ\n##ﾗ\n##ﾙ\n##ﾝ\n##ﾞ\n##ﾟ\n##￣\n##￥\n##👍\n##🔥\n##😂\n##😎\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/assets/word_dict.txt",
    "content": "<PAD>\n<UNK>\n，\n的\n[UNK]\n。\n一\n了\n是\n不\n我\n在\n人\n这\n有\n他\n着\n来\n上\n你\n个\n她\n大\n到\n道\n就\n下\n那\n说\n子\n出\n地\n：\n也\n小\n中\n时\n！\n要\n么\n们\n得\n然\n看\n自\n里\n？\n好\n身\n过\n手\n起\n后\n为\n可\n去\n没\n会\n心\n和\n之\n还\n都\n天\n、\n以\n头\n能\n对\n想\n而\n开\n只\n面\n己\n女\n生\n动\n样\n如\n发\n声\n无\n多\n用\n已\n又\n前\n被\n把\n家\n事\n经\n点\n两\n知\n情\n老\n眼\n力\n现\n于\n年\n感\n什\n很\n儿\n让\n但\n意\n间\n方\n成\n些\n笑\n体\n进\n长\n高\n从\n将\n话\n美\n见\n啊\n住\n当\n气\n所\n真\n回\n白\n口\n作\n再\n此\n明\n给\n觉\n正\n妈\n边\n色\n才\n行\n快\n公\n种\n分\n三\n几\n全\n法\n却\n次\n.\n本\n轻\n走\n国\n最\n同\n打\n十\n向\n实\n神\n定\n学\n张\n水\n光\n门\n内\n紧\n主\n更\n其\n入\n直\n外\n听\n问\n脸\n少\n肉\n因\n死\n做\n部\n双\n处\n与\n男\n便\n王\n太\n等\n怎\n像\n吧\n性\n红\n第\n二\n候\n理\n接\n放\n满\n阴\n果\n别\n比\n重\n相\n林\n受\n位\n「\n叫\n玉\n何\n龙\n强\n爱\n日\n应\n花\n清\n微\n加\n安\n机\n完\n连\n名\n者\n房\n李\n刚\n姐\n东\n带\n月\n西\n师\n亲\n文\n风\n服\n并\n难\n山\n深\n指\n先\n关\n常\n干\n马\n跟\n由\n」\n目\n变\n四\n使\n嘴\n平\n合\n立\n吗\n精\n物\n车\n流\n火\n呢\n巴\n股\n望\n军\n空\n任\n金\n衣\n似\n工\n乳\n始\n越\n电\n原\n战\n新\n般\n黑\n今\n飞\n海\n转\n插\n至\n淫\n通\n表\n腿\n雪\n度\n\"\n思\n根\n信\n场\n业\n随\n吃\n虽\n许\n反\n阵\n书\n乎\n慢\n世\n往\n云\n路\n解\n特\n离\n算\n哥\n青\n半\n命\n音\n竟\n,\n交\n非\n拉\n教\n化\n容\n娇\n条\n夫\n刻\n活\n丝\n《\n》\n万\n量\n坐\n司\n且\n传\n管\n该\n热\n南\n找\n周\n五\n惊\n突\n早\n片\n落\n数\n它\n欢\n阳\n总\n极\n结\n步\n棒\n整\n城\n激\n孩\n制\n喜\n倒\n鸡\n远\n穴\n床\n冷\n怕\n市\n香\n母\n及\n抽\n；\n言\n息\n每\n件\n阿\n提\n百\n血\n记\n利\n断\n绝\n停\n弄\n（\n失\n站\n宝\n）\n影\n露\n静\n语\n即\n士\n近\n弟\n尔\n尽\n屁\n若\n欲\n柔\n系\n杀\n形\n细\n斯\n穿\n兴\n剑\n灵\n武\n吸\n认\n收\n星\n足\n底\n办\n石\n急\n求\n代\n脚\n黄\n保\n击\n功\n江\n丽\n娘\n拿\n令\n终\n华\n陈\n杨\n未\n冲\n晚\n或\n乐\n低\n哪\n修\n奇\n楚\n挺\n显\n刺\n必\n品\n员\n暗\n视\n友\n叶\n谁\n痛\n嫩\n则\n玩\n告\n千\n妹\n射\n包\n计\n烈\n温\n界\n抱\n字\n唇\n产\n胸\n民\n准\n魔\n够\n罗\n液\n摸\n势\n帮\n沉\n各\n程\n软\n忍\n乱\n单\n*\n酒\n达\n怪\n久\n忙\n敢\n哈\n兵\n顶\n伤\n切\n留\n著\n依\n速\n背\n题\n德\n压\n-\n象\n伸\n呼\n害\n送\n父\n区\n元\n续\n北\n队\n夜\n建\n官\n识\n具\n院\n请\n备\n刘\n格\n装\n顾\n睛\n摇\n亮\n缓\n克\n八\n苦\n持\n宫\n章\n裤\n脑\n消\n舒\n梦\n布\n务\n易\n响\n朝\n错\n钱\n期\n破\n答\n怀\n展\n照\n报\n领\n客\n术\n腰\n室\n巨\n妇\n待\n线\n调\n复\n苏\n渐\n论\n滑\n秦\n掌\n狂\n古\n异\n猛\n皇\n味\n角\n确\n兰\n睡\n雨\n决\n句\n式\n仙\n运\n吟\n脱\n顿\n推\n狠\n冰\n羞\n众\n另\n政\n台\n抓\n居\n爷\n六\n游\n英\n甚\n七\n木\n嗯\n迷\n器\n网\n团\n注\n淡\n春\n
级\n硬\n峰\n设\n毛\n旁\n喝\n忽\n舌\n左\n铁\n科\n商\n皮\n按\n族\n曾\n图\n首\n资\n奶\n简\n米\n故\n继\n取\n掉\n爸\n除\n丰\n密\n况\n呀\n类\n赵\n谢\n右\n态\n段\n围\n楼\n基\n约\n粗\n潮\n需\n技\n佛\n跳\n九\n充\n荡\n顺\n差\n引\n姑\n艳\n愿\n浪\n耳\n熟\n兄\n湿\n唐\n您\n义\n赶\n念\n护\n观\n药\n松\n隐\n跑\n试\n君\n圆\n局\n节\n散\n诉\n闪\n止\n颤\n限\n刀\n支\n粉\n鬼\n禁\n医\n洞\n齐\n含\n集\n帝\n秀\n握\n臀\n初\n造\n吻\n料\n婆\n号\n导\n麻\n招\n抬\n质\n眉\n坚\n套\n希\n疑\n拍\n波\n村\n球\n社\n怒\n肯\n毫\n透\n雅\n份\n抚\n翻\n退\n专\n河\n奋\n户\n校\n考\n醒\n示\n雷\n究\n景\n圣\n胡\n病\n摆\n汉\n钟\n威\n统\n模\n哦\n暴\n爽\n举\n恶\n靠\n称\n卫\n弹\n警\n尖\n选\n既\n层\n攻\n扭\n承\n呻\n存\n余\n斗\n仍\n魂\n舔\n素\n改\n夏\n群\n府\n沙\n境\n置\n联\n板\n饭\n孙\n志\n·\n陆\n历\n骚\n秘\n呵\n抖\n1\n属\n州\n闻\n查\n副\n致\n屄\n操\n亚\n龟\n仅\n派\n证\n默\n换\n妻\n块\n莫\n写\n组\n凌\n坏\n贵\n助\n宁\n媚\n恐\n排\n状\n雄\n毕\n紫\n卡\n省\n福\n短\n追\n据\n食\n岁\n规\n杰\n萧\n寒\n环\n广\n枪\n蜜\n谈\n费\n骨\n座\n委\n丁\n宇\n画\n超\n蛋\n礼\n严\n草\n戏\n配\n朋\n端\n共\n治\n妙\n察\n较\n研\n恩\n咬\n伙\n型\n俊\n店\n独\n版\n润\n宗\n震\n揉\n议\n讲\n柳\n价\n树\n京\n投\n哼\n朱\n敏\n挑\n扬\n浑\n争\n梅\n习\n丹\n擦\n银\n营\n维\n幸\n洗\n劲\n野\n假\n偷\n忘\n宋\n源\n索\n土\n守\n虚\n混\n洁\n闭\n救\n付\n控\n叹\n摩\n练\n徐\n烟\n裸\n划\n县\n秋\n喘\n堂\n吴\n买\n敌\n参\n姨\n施\n标\n狗\n否\n权\n夹\n厉\n担\n遇\n险\n毒\n触\n虎\n负\n纪\n效\n际\n巧\n养\n晓\n哭\n欣\n须\n尚\n肩\n胜\n演\n贝\n享\n曲\n项\n升\n躺\n啦\n案\n藏\n咱\n珠\n嘿\n防\n韩\n婚\n屋\n凤\n材\n挥\n适\n2\n娜\n创\n迎\n诗\n仿\n优\n班\n责\n喊\n史\n泪\n探\n芳\n剧\n牙\n湖\n茎\n贴\n赤\n验\n临\n智\n盘\n撞\n休\n磨\n舞\n镇\n蓝\n妖\n逃\n疯\n牛\n鲜\n永\n田\n凡\n范\n抗\n捏\n呆\n职\n普\n恨\n喷\n诱\n叔\n园\n桌\n侧\n疼\n印\n康\n奸\n移\n肏\n裙\n尊\n兽\n~\n缩\n羽\n盖\n丈\n瞬\n育\n略\n伟\n(\n徒\n幽\n挂\n凝\n)\n3\n寻\n旧\n袋\n滚\n船\n伯\n躯\n良\n?\n尼\n某\n庭\n肥\n颜\n富\n弱\n拥\n塞\n洛\n碰\n厅\n庄\n吐\n鱼\n肤\n托\n谷\n狼\n卖\n姿\n臂\n灯\n沈\n封\n壁\n辈\n玄\n梁\n采\n浓\n油\n免\n诺\n燕\n晃\n轮\n惑\n值\n涌\n款\n嘛\n尤\n杯\n玲\n鼻\n艺\n纯\n痒\n．\n洋\n┅\n犹\n莉\n圈\n肌\n炼\n杂\n隔\n喔\n积\n邪\n增\n读\n逸\n奴\n扑\n佳\n腹\n慕\n骑\n释\n趣\n晶\n货\n罪\n获\n拔\n腾\n笔\n菜\n爆\n农\n歌\n败\n伏\n汗\n蒙\n烧\n薄\n善\n午\n稍\n吓\n泄\n伊\n载\n架\n厚\n惜\n窗\n蓉\n飘\n骂\n扎\n/\n抵\n墙\n归\n颗\n袜\n束\n企\n番\n纤\n茶\n列\n烦\n横\n占\n私\n赛\n滴\n折\n街\n济\n借\n避\n亦\n糊\n牌\n净\n袁\n吞\n傅\n奈\n皱\n;\n伦\n吮\n尸\n绿\n遍\n挤\n供\n鼓\n搞\n乡\n5\n##┅\n郎\n桃\n灭\n讨\n介\n缝\n偏\n彩\n党\n纳\n卷\n罩\n互\n敬\n懂\n慧\n诸\n犯\n菲\n扶\n织\n壮\n纷\n替\n裂\n逼\n缠\n律\n征\n缘\n俩\n残\n纸\n甲\n距\n婷\n豪\n率\n盯\n监\n幻\n镜\n迹\n怜\n降\n泽\n迫\n舍\n登\n甜\n碎\n
悄\n呜\n乔\n4\n吉\n欧\n罢\n祖\n森\n勇\n拳\n销\n郑\n昨\n冒\n财\n彻\n凉\n琳\n岳\n危\n奔\n搂\n择\n姓\n墨\n勾\n殿\n酥\n测\n爬\n陪\n鸣\n莲\n醉\n训\n滋\n预\n健\n漂\n弃\n概\n唯\n暖\n域\n馆\n肚\n盛\n尘\n闹\n來\n仰\n仔\n撑\n耐\n郭\n晕\n鲁\n聚\n睁\n荣\n瞧\n挣\n:\n戴\n博\n董\n逐\n凶\n劳\n蕾\n躲\n垂\n陷\n浮\n拼\n川\n翘\n迅\n幕\n构\n庆\n胆\n累\n唔\n蒂\n啪\n跪\n朵\n宜\n附\n餐\n乖\n补\n涨\n後\n编\n忌\n昏\n麽\n惨\n奥\n杜\n翼\n炎\n仇\n孔\n亡\n岛\n童\n枫\n词\n稳\n仪\n谓\n旋\n竹\n剩\n臣\n恋\n遥\n绕\n吹\n凭\n融\n尝\n逗\n执\n拜\n扯\n阻\n琴\n纵\n虑\n瑶\n盈\n貌\n侯\n耸\n挡\n赏\n闲\n扫\n耻\n珍\n例\n愣\n傲\n漫\n辰\n旅\n莹\n丢\n悲\n促\n掩\n宽\n脉\n困\n启\n酸\n策\n宣\n袭\n庞\n额\n录\n络\n乌\n樱\n沟\n缺\n遗\n趴\n灰\n裹\n浴\n伴\n拒\n傻\n误\n塔\n&\n核\n努\n弯\n脖\n妃\n艾\n乃\n瞪\n妮\n迟\n甘\n池\n洪\n搓\n检\n辱\n惠\n蛇\n哎\n课\n慌\n诚\n婉\n!\n批\n淋\n益\n闷\n阶\n协\n输\n拨\n慰\n聊\n抹\n寸\n腻\n愈\n胀\n鞋\n苍\n6\n饱\n丫\n氏\n愤\n猜\n曼\n辞\n赞\n孤\n尾\n眸\n胯\n评\n芒\n抢\n浩\n怨\n碧\n豆\n均\n薇\n贱\n爹\n锁\n泛\n惯\n侍\n忆\n晴\n夺\n膀\n烫\n绪\n仁\n典\n隆\n轩\n固\n菊\n绵\n7\n嘻\n辉\n搭\n俏\n针\n炸\n冯\n锋\n汁\n痕\n席\n延\n哀\n忠\n炮\n恢\n椅\n辛\n埋\n悦\n汤\n堪\n迪\n浅\n钻\n嫣\n述\n8\n遭\n泰\n轰\n肖\n码\n硕\n孟\n咪\n姬\n屈\n帅\n胖\n括\n扣\n津\n吼\n泉\n贼\n频\n嘉\n10\n悠\n径\n季\n授\n彼\n晨\n沿\n倩\n悉\n荒\n暂\n唱\n染\n瑞\n判\n箭\n凑\n倾\n溜\n琪\n卧\n勒\n陌\n邓\n脏\n跃\n谋\n勃\n辣\n估\n佩\n旦\n侠\n吕\n绍\n渴\n侵\n鸟\n熊\n韵\n辆\n沾\n瓶\n惧\n坦\n忧\n筑\n朗\n雾\n齿\n燃\n拖\n瓣\n焦\n饰\n霞\n涂\n奉\n旗\n豫\n遮\n丸\n尿\n霸\n烂\n涛\n岩\n芸\n御\n减\n票\n怡\n莱\n匆\n牢\n疾\n递\n霜\n啸\n喃\n革\n猪\n厌\n衫\n皆\n脆\n尺\n途\n审\n废\n瓜\n姆\n损\n堆\n颈\n偶\n踏\n潜\n猫\n零\n奖\n毁\n颖\n魏\n贞\n培\n於\n踪\n末\n喉\n韦\n涩\n讶\n曰\n莎\n厂\n欺\n央\n骗\n郁\n悟\n臭\n娃\n薛\n凯\n锐\n趁\n库\n颊\n玛\n倍\n嫂\n萨\n钢\n奏\n泡\n乘\n泥\n购\n汽\n袍\n桥\n萍\n卿\n痴\n盟\n肛\n翠\n符\n眯\n窄\n洲\n冬\n曹\n岂\n悔\n疗\n袖\n淑\n截\n狐\n港\n羊\n撕\n舅\n刑\n楠\n绑\n惹\n霍\n驾\n汇\n唤\n怔\n饮\n敲\n咽\n陶\n%\n牵\n献\n糖\n柱\n枝\n售\n覆\n虫\n映\n箱\n珊\n拾\n卓\n撒\n葛\n铺\n阁\n恭\n颇\n瓦\n瘦\n屏\n俗\n帐\n幅\n幼\n篇\n振\n畅\n播\n扔\n宿\n铃\n茹\n寂\n耀\n劫\n航\n雕\n抑\n谦\n穷\n黎\n纹\n宾\n汪\n肃\n刹\n焰\n绳\n晋\n饶\n舟\n喂\n肢\n恼\n督\n摄\n抛\n予\n跨\n肠\n狱\n赫\n蛮\n腔\n岸\n扮\n昊\n雯\n聪\n井\n肆\n蒋\n桑\n9\n允\n贪\n禾\n梯\n疲\n丑\n斩\n斜\n戒\n呈\n廷\n陵\n析\n劝\n苗\n序\n冠\n俱\n锦\n亿\n#\n伍\n殊\n掏\n芬\n姊\n拦\n亏\n愧\n扰\n膝\n丛\n召\n麦\n巾\n龄\n贤\n蹲\n乾\n璃\n恰\n漠\n碗\n催\n懒\n衬\n洒\n噢\n眨\n稀\n悬\n吊\n嫁\n慎\n鹏\n瑕\n肿\n纠\n灌\n慈\n键\n赖\n熙\n纱\n柏\n寺\n晰\n污\n粒\n棍\n冥\n12\n魄\n贺\n贾\n茫\n丧\n墓\n植\n昂\n這\n链\n粘\n唉\n～\n罚\n秒\n噗\n叉\n牧\n申\n柜\n扩\n僵\n涵\n弥\n
贯\n询\n虹\n魅\n俯\n鞭\n撩\n琼\n措\n厨\n勉\n佐\n吁\n访\n披\n舰\n毅\n踢\n签\n咯\n溪\n孕\n堡\n澡\n祭\n挖\n爵\n卢\n贫\n赌\n弓\n甩\n溢\n扇\n茵\n搜\n讯\n怖\n芝\n捧\n笼\n窝\n盗\n浸\n驶\n胴\n藤\n繁\n祝\n猎\n20\n馨\n∶\n帆\n衡\n添\n咕\n棠\n蝶\n匹\n僧\n租\n泣\n跌\n捉\n筋\n柄\n详\n勤\n【\n逍\n娟\n返\n仲\n##s\n咳\n瞳\n潘\n砸\n违\n础\n鸿\n掠\n竞\n铜\n啥\n[\n愉\n籍\n誉\n】\n萌\n耗\n赢\n－\n]\n11\n埃\n歉\n迈\n脂\n铭\n麟\n刷\n祸\n障\n祥\n锅\n鹰\n邦\n宠\n摔\n凸\n搬\n崩\n骄\n斥\n尬\n酷\n碍\n芙\nt\n捕\n虐\n膜\n尴\n援\n闯\n殷\n耶\n妍\n咐\n夸\n饿\n宛\n腕\n昌\n『\n仗\n囊\n宏\n瑟\n逆\n阅\n蕴\n膛\n驱\n钰\n描\n亭\n叛\n哇\n寿\n媳\n崇\n嫌\n姚\n丘\n糟\n渡\n串\n胁\n币\n愁\n爪\n婴\n册\n患\n盒\n欠\n吾\n煞\n绷\n吵\n浆\n兼\n巫\n弗\n崔\n岚\n歇\n妳\n俄\n漆\n掀\n挨\n姜\n割\n盾\n枚\n胳\n疏\n撤\n宅\n穆\n裁\n邵\n绩\n嗔\n』\n膊\n涉\n档\n刮\n励\n0\n驰\n靖\n蔡\n荷\n黏\n灾\n粮\n哟\n诡\n帽\n孝\n劈\n恒\n搅\n桂\n咒\n订\n氛\n凄\n赐\n昭\n蕊\n驻\n灼\n踩\n矿\n矮\n捷\n湾\n玻\n捂\n仆\n巡\n帕\n媒\n哲\n械\n隶\n庙\n恍\n妩\n腐\n刃\n##t\n鼎\n萝\n症\n妆\n渊\n颠\n隙\n昆\n堵\n30\n妓\n仓\n漉\n逢\n郡\n瘫\n鼠\n哑\n歪\n综\n杆\n蠢\n坡\n斑\n阔\n媛\n坛\n邀\n～～\n翔\n辑\n叠\n瞒\n廊\n咙\n溃\n卑\n誓\n瑗\n宴\n彭\n##2\n珑\n搐\n艰\n夕\n拢\n婶\n炉\n眠\n蠕\n吩\n邻\n税\n蹭\n剂\n玫\n嘶\n兔\n漏\n##0\n缕\n盆\n畏\n矛\n崖\n皙\n胎\n捅\n帘\n啲\n乏\n旺\n贸\n壶\n筒\n陛\n脾\n骤\n犬\n沒\n烁\n迁\n湘\n戈\n芷\n岭\n枕\n伐\n##9\n拱\n禹\n沐\n趟\n饥\n契\n腥\n怯\n谅\n砍\n赚\n狭\n鹿\n骇\n乜\n坠\n役\n時\n澜\n矩\n遂\n挪\n枯\n谨\n渗\n拟\n笨\n璇\n霄\n阮\n辩\n剥\n說\n菁\na\n甫\n宙\n寨\n悍\n寄\n摊\n塑\n棉\n嘟\n匪\n捆\n嘲\n屠\n竖\n届\n黛\n妥\n靡\n猴\n彤\n15\n偿\n鉴\n挽\n斤\n亵\n储\n沫\n哄\n侄\n瞎\n鹤\n咖\n拐\n娶\n盼\n溅\n页\n嗓\n筹\n攀\n卜\n兜\n妾\n桐\n雀\n填\n痉\n躁\n噬\n辅\n砰\n挛\n聂\n萱\n桶\n婢\n##g\n棋\n嗦\n胞\n雁\n牲\n哗\n蜂\n啧\n坑\n咧\n胶\n啡\n兮\n谭\n##1\n葬\n厢\n锟\n彷\n翎\n滩\n窜\n译\n淌\n##3\n胃\n亩\n趾\n尉\n腴\n憋\n塌\n岗\n篷\n坤\n账\n柴\n缚\n旨\n杏\n睿\n绽\n蓬\n削\n诧\n柯\n轨\n掐\n镖\n绯\n辨\n##n\n绘\n竭\n朴\n扒\n娴\n磊\n伪\n厕\n茂\n览\n赋\n筱\n拓\n弦\n漓\n膨\n妄\n陡\n盐\n循\n殖\n琛\n栗\n戚\n##d\n遣\n耍\n吱\n屑\n肝\n##r\n娅\n愕\n舐\n诀\n戳\n剪\n伺\n拂\n啼\n燥\n邱\n朕\n棺\n壳\n拽\n凰\n羡\n栋\n儒\n姻\n揭\n翅\n##k\n瞥\n匙\n焚\n翰\n绣\n烛\n冤\n脯\n霖\n寇\n17\n拆\n寡\n##5\n##y\n泳\n旭\n墅\n哧\n惩\n##4\n潇\n址\n叮\n尹\n蔚\n逝\n蟒\n蔓\n糕\n窥\n※\n摘\n婪\n禅\n厮\n昔\n狞\n褪\n晌\n鄙\n泌\n脊\n慨\n狮\n赴\n辽\nb\n秽\n骆\n垫\n唾\n赔\n艘\n撅\n妨\n逊\n淼\n谎\n宰\n茜\n谱\n嚷\n沸\n捣\n婊\n昧\n轴\n烤\n撼\n搏\n妒\n汹\n栏\n滥\n熬\n陀\n##the\n寞\n撇\n瑜\n翁\n廉\n芯\n豹\n##6\n荆\n兆\n瑰\n荐\n遵\n钥\n愚\n狄\n杖\n##a\n哩\n捡\n兹\n82\n+\n浙\n债\n凛\n＊\n泼\n過\n娱\n漾\n哆\n##i\n裴\n
c\n夷\n亢\n冉\n矣\n钉\n炽\n佣\n冻\n##8\n苹\n灿\n徽\n寓\n耿\nquot\n擅\n敞\n歹\n押\n闺\n践\n嗤\n凹\n阜\n##x\n猥\n畜\n诊\n泊\n擒\n舱\n贡\n摧\n##b\n憾\n蓄\n疆\n讽\n惟\n##w\n趋\n瞄\n轿\n炒\n##12\n16\n##e\n匀\n蓦\n嘱\n呃\n沃\n稠\n姥\n倦\n杭\n躬\n逛\n2010\n惶\n姗\n乙\n氓\n罕\n100\n璟\n裳\n①\n彪\n瘾\n郝\n妞\n50\n##7\n篮\n署\n巷\n揽\n唧\n乞\n呐\n牺\n梨\n巍\n嗅\n彦\n椒\n甬\n巅\n携\n俞\n衷\n梳\n##c\n肺\n18\n胤\n碑\n衰\n辜\n俺\n钩\n弘\n绮\n──\n拧\n00\n叙\n耕\n倏\n稚\n拭\n邮\n葡\n贷\n倪\n琦\n咸\n丐\n刊\n饼\n圳\n朦\n##m\n瓷\n庸\n裡\n浇\n挠\n胧\n坊\n纽\n曦\n缸\n秃\n搁\n卵\n倚\n藉\n醋\n芦\n黯\n滨\n滞\n侦\n13\ns\n蒸\n钦\nx\n祈\n窃\n萄\n沧\n毯\n2011\n涯\nxi\n14\n蚁\n诞\n蚀\n嗡\n匠\n浊\n荫\n驳\n裆\n芊\n榜\n敛\n裕\n蹦\n稿\n霉\n襄\n蔽\n衍\n##th\ncom\n潭\n煮\n履\n魁\n蹂\n棵\n蝉\n03\n砖\n斌\n惫\n抠\n琐\n抄\n锤\n##he\n眩\n彬\n嘘\n侣\n碌\n寝\n幺\n=\n顽\n躏\n甸\n狸\n娥\n檀\n劣\n暇\n桩\n孽\n焉\n卒\n耽\n聘\n弧\n恕\n_\n暮\n2009\n栽\n葱\n##o\n茅\n蹙\n哨\n拇\n咋\n24\nwww\n揪\n斧\n罡\n喻\n##of\n狡\n嫉\n鲍\n厦\n對\n狰\n卦\n扁\n敦\n阎\n睹\n擎\n绒\n旷\n衙\n嚎\n##u\n忖\n菱\n腮\n蛛\n遁\n朽\n菌\n苞\n盏\n璐\n厘\n蹄\n湛\n陋\n抿\n鹅\n仑\n株\n婿\n發\n拘\n們\n##l\n扳\nk\n辖\n禀\n襟\n咦\n25\n坟\n巢\n蹬\n渔\n辟\n2012\n淘\n2008\n尧\n倘\n淮\n蓓\n##h\n蹈\n沁\n祁\n磁\n蜡\n粹\n屌\n囚\n艇\n疚\n绸\n赠\n佑\n澈\n蛊\n麒\n經\n蜀\n扛\n淹\n绞\n榻\n恳\n皓\n筝\n煤\n暧\n驴\n帖\n嚣\n谐\n垃\n蹊\n琢\n沌\n捞\n宵\n圾\n嘎\n晔\n芽\n郊\n2018\n晒\n渣\n40\n雍\n峡\n罐\n嘤\n21\n堕\n沦\n狈\n咨\n逻\n甄\n淳\n钧\n磕\n窒\n2000\n诈\n掘\n栈\n酬\n禽\n塘\n纲\n惚\n渠\n匕\n\\\n叨\n挟\n惮\n猿\n箍\n筷\n眶\n涡\n螺\n19\n宪\n鸭\n嘀\n迭\n喇\n詹\n伽\n##ra\n鳞\n婧\n奢\n锡\n嗽\n滔\n棚\n矜\n咚\n靴\n靥\n煌\n莺\n凳\n吏\n泻\n‥\n兀\n煎\n蝴\n唷\n翩\n亨\n斋\n剔\n##sh\n伞\n掰\n傍\n梭\n##er\n澄\n60\n媽\n##ao\n褐\n锻\n腊\n禄\n虞\n鸦\n枉\n陕\n俘\n嵌\n##f\n砂\n恬\n2007\n坎\n呕\n叽\n啜\n哮\n窟\nm\n勿\n拎\n畔\n颅\n卸\n苑\n诏\n22\n氧\n铸\n黝\n膏\n梵\n骏\n酱\n窍\n堤\n雇\n毙\n辕\nandroid\n菩\nh\n虾\n噜\n侮\n瞅\n笛\n霆\n鸾\n廖\n贩\n潺\n券\n咏\n绰\n衔\namp\n薪\n2006\n踝\n茸\n疤\n渺\n##ng\n粥\n谊\n槽\n樊\n拚\nj\n顷\n乍\n伶\n冈\n芭\n熏\n矢\n喧\n谜\n稻\n##to\n髓\n梓\n2005\n惕\n叱\n溶\n鞠\n##ａ\n恤\n莽\n##st\n撸\n咆\n哉\n吭\n##ing\n霎\n2013\n丞\n糙\n戮\n函\n驼\n##ed\n睫\n敖\n斐\n卉\n浦\n熄\n##hi\n啃\n迦\n箫\nf\nr\n疫\n诛\n蒲\n掷\n惭\n楞\n僻\n挫\n39\n梆\n澳\n嬉\nd\n勋\n搔\n啤\n23\n##on\n80\n臻\n苟\nu\n炫\n奕\n|\n框\n虏\n滕\n##ha\n弩\n绫\n奎\n>\n葫\ng\n蛟\ne\n嚼\n廓\n汝\n褶\n##uo\n钓\n蜷\n睾\n韧\n踹\n岔\n梢\n琉\n##v\n逞\n##～\n2014\n厥\n蚂\n怦\n##z\n盲\n汩\n祷\n叩\n～～～\n現\n犀\n盔\n拙\n讪\n茨\n
26\nw\n瞟\n筠\n偎\n炙\n暑\n骸\n揍\nse\n桓\n旬\n峻\n肘\n'\n侃\n窦\n喽\n28\n缴\n##in\n漱\n噩\n瑾\n峨\n悸\n絮\n棱\n镶\n蔑\n##p\n豁\n闵\n碟\n##xt\n迸\n叭\n垮\n琅\n溺\n薰\n菇\n##le\n屎\n捐\n缎\n稽\n##and\n2017\n渍\n瀚\n蝎\n炭\n坪\n捻\n跋\n90\n屡\n勺\n跄\n芮\n焕\n##be\n2003\n肾\n##ro\n2004\n沛\n葵\n②\n踉\n呛\n鞘\nn\nv\n瑛\n酿\n腋\n烘\n敷\n邹\n诵\n##q\n逮\n揣\n呸\n<\nthe\n涎\n脐\n貂\n嗷\n冶\n200\nl\n垒\n咛\n##re\n崎\n霈\n壤\n邢\n慑\no\n眷\n##an\n拌\n27\n喀\n窘\n笙\n瀑\n钮\n70\n梧\n刁\n讥\n2015\n觑\n猩\n##ck\n癌\n邃\n隋\n涟\n捶\n澎\n萎\n漩\n茄\n##wa\n庵\n濡\n缀\n蚊\n骷\n夭\n髅\ni\n狙\n肮\n藩\n呦\n幢\n〔\n咔\n谣\n壑\n旱\n吨\n涅\n穹\n铮\n陨\n暄\n翡\n嗜\n兑\n呗\n牡\n蚕\n棕\n讳\np\n兢\n##wi\n坝\n##is\n诅\n雌\n攥\n〕\n肋\n瘤\n媾\n##me\n簇\n匿\n铐\n懵\n傀\n500\n祀\n杉\n挲\n绅\n烙\n猾\n橡\n跺\n芹\n2016\n１\n秩\n怠\n##at\n枣\n##×\n枢\n铠\n粪\n鑫\n蜘\n29\n钗\n倔\n31\n懈\n涕\n醇\n阀\n娆\n﹐\n##sa\n颂\n秉\n栖\n颓\n匣\n2002\n茉\n炯\n哽\n璧\n槐\n##co\n彰\n厄\n##fo\n腺\n晖\n慾\n龚\n熔\n棘\n捺\n驯\n曳\nz\n屯\n湃\n阱\n##ho\n搀\n榨\n檐\n鸽\n蟹\n辙\n##ｏ\n憨\n叼\n藕\n霓\n歧\n岑\n褥\n衅\n袅\n沮\n巩\n恣\n##j\n纺\n麼\n颔\n馒\n贲\n哒\n寥\n晏\n佬\n忿\n剖\n仕\n##di\n疙\n颁\n矫\n##ly\n35\n祟\n01\n婕\n裘\n蒜\n缭\n蔬\n漪\n儡\n##ma\n唬\n逾\n叟\n遏\n觅\n300\ntxt\n蕙\n懊\n瘩\n寰\n浣\n嗒\n##al\n忐\n忑\n莞\n甥\n橙\n##00\n剿\n啰\n皂\n昕\n嬷\n擂\n戟\n铲\n鬓\n穗\n奚\n娣\ny\n2001\n剃\n晦\n徊\n纬\n炕\n##se\n##el\n斟\n嚓\n1000\n嗣\n馀\n虔\n庐\n癫\n悚\n庚\n慵\n禧\n嗨\n弊\n蝇\n##mm\n聋\n丙\n绊\n朔\n痹\n竿\n绢\n##nt\n赦\n嗖\n08\n憎\n昼\n##un\n啾\n嗲\n臊\n３\n喳\n##you\n鬟\n茬\n邑\n徘\n榴\n姣\n宸\n舵\n窑\n咿\n##pa\n屉\n凿\n##de\n梗\n藻\n##os\n歼\n飙\n糜\n锥\n胭\n珂\n桦\n删\n赎\n鞍\n悯\n缅\n谕\n懿\n戬\n橱\n淇\n酣\n枭\nnbsp\n##te\n垢\n曝\n##ar\n雏\n冀\n猝\n钞\n椎\n惺\n蹑\n驭\n沼\nro\n墟\n舆\n腑\n霹\n##or\n攒\n##li\nand\n苔\n僚\n##mo\n笠\n磅\n##ｎ\nyin\n辫\n##go\n##℃\n##et\n噪\n灶\n笋\n挚\n贬\n峙\n##la\n谍\n##ot\n佟\n45\n囔\n##us\n勘\n36\n钝\n##so\n弛\n昵\n＂\n04\n拯\n^\n赧\n橘\n邸\n##no\n##ch\n##es\n珏\n撰\n泞\n##na\n璋\n戎\n蔷\n悴\n嵩\n##da\n碾\n##ｅ\n铅\n##en\n蛤\n杠\n雳\n09\n個\n渎\n栅\n谬\n捋\n峭\n##ne\n32\n嫖\n靓\n踱\n崽\n遐\n俐\n萤\n蝠\n辗\n锈\n憔\n簌\n叁\n癸\n２\n瑄\n##ｍ\n庇\n赘\n嫡\n##ss\n##ta\n剌\n奄\nq\n1999\n咂\n蛙\n铛\n悻\n桀\n@\n孜\n拣\n##９\n鲸\n##fi\n##10\n缪\n咻\n鸳\n募\n倡\n06\n碳\n05\n怅\n裔\n③\n恃\nunt\n揖\n慷\n唰\n##om\n侥\n杵\n膳\n咫\n苇\n懦\n鸠\n寐\n##ting\n阐\n吒\n绛\n玖\n缉\n蹿\n匈\n茧\n钳\n07\n庶\n##si\n##it\n掺\n烨\n為\n婵\n卯\n癖\n寅\n惬\n淀\
n筛\n骋\n咀\n183\n砚\n嬴\n绎\n璞\n芜\n饲\n戾\n肴\n##as\n蕉\n1998\n敝\n闸\n镯\n##ol\n惦\n婀\n殆\n燎\n笃\n峦\n殇\n屿\n聆\n桔\n瞻\n##ver\n粟\nhttp\nch\n##ve\n幡\n##do\n匡\nａ\n绚\n髻\n##pe\n滤\n栓\n沥\n奠\n##０\n疮\n辐\n飒\n巳\n洽\n鸯\n祠\n倭\n幂\n熠\n跷\n馈\n偌\n讼\n掳\n摁\n沪\n噼\n祯\n##ll\n＿\n冽\n##int\n瘪\n髯\n舜\n撮\n唠\n衩\n宦\n蚩\n鹃\n##20\n苓\n圭\n02\n##fr\n##ｉ\n袒\n韶\n诫\n洼\n拴\n驿\n烹\n馋\n牟\nit\n璨\n麾\n##rs\n煽\nps\n翟\n汐\n浏\n谏\n1997\n诣\n崛\n炖\n蘑\n焱\n120\ngt\n嫔\n颐\n##ti\n璀\n陇\n［\n袱\n苛\n扉\n##mi\n亥\n瑚\n簸\n眺\n禛\n簿\n锯\n唆\n汀\n煜\n磷\n1996\n褂\n隧\n恙\n侬\n睬\n##ge\n眈\n55\n娼\n缰\n蹋\n紊\n珈\n粤\n珀\n侈\n崭\n菡\n馅\n##ｔ\n頭\n墩\nji\n妤\n##nd\n##ce\n肪\n窈\n栩\nqq\n34\n75\n抡\n☆\n贿\n％\n噎\n毓\n羹\n嫦\n睽\n阙\n150\nno\n拗\n炳\n骼\n窖\n殴\n##ri\n鄂\n瞿\n鲤\n##po\n1995\n隻\n呓\n33\n砌\n##ba\n##ｘ\n玥\n礁\n羁\n哺\n蛾\n咄\n##rt\n囡\n蔼\n槛\n坂\n##sp\n##lo\n譬\n阖\n##fa\n扼\n晟\n##ｄ\nhe\n锣\n衲\nchang\n茗\n酌\n涧\n窕\n蚌\n躇\n滢\n帜\n祺\n踌\n薯\nｃ\n娄\n##ea\n缔\n鳌\n荃\n拷\n扈\n汰\n靳\n荧\n蜒\n榆\n開\n46\n##ad\n1994\n溯\n淤\n褚\n##ｒ\n##ter\n羲\n呱\n##fe\nsm\n彿\n38\n69\n︰\n痞\n萦\n莓\n汶\nst\n47\n］\n瓢\n啄\n酋\n牝\n狩\nbr\n37\n48\n冢\n佘\n拈\n烬\n妲\n讷\n镰\n玺\n吝\n惘\n珉\n萋\n饺\n42\n嘈\n400\n〖\n蕃\n〗\n1993\n俨\n恻\n忡\n谑\n竺\n毗\n##ｙ\n嘭\n羔\n氨\n還\n矗\n##ke\n飕\n垣\n夙\n撬\n##ic\n陰\n焊\n蝙\n盎\n65\n弈\n##ul\n侨\n毋\n##11\n噔\n惋\n玮\n渝\n呲\n铎\n玑\n馥\n札\n1992\nre\n俭\n吆\n##cm\n簪\n苒\n##ry\n揩\n##ｕ\nｔ\n##ia\n忒\n荀\n##ut\n瘙\n琬\n##cr\n坞\n铝\n43\n睨\n骡\n##su\n颉\n纨\n學\n疵\n鲨\n咎\n垦\n曙\n锢\n##ca\n##ur\n冕\n窿\n##vi\n珺\n佯\n動\n瘟\n56\n##ci\n洵\n靶\n柿\n3d\n52\n绾\n##ther\n濒\n帛\n硝\n碱\n蜿\n萃\n##＝\n醺\n赁\n##up\n痪\n53\n楷\n簧\n##man\n##ui\n帷\n稼\n嵋\n４\n虬\n珣\n屹\n##ga\n畴\n颌\n锵\n钵\n掣\n噙\n黠\n柬\n掂\n泓\n硫\n邬\n蜈\n##am\n袄\n／\n霏\n58\n荔\n腆\n歆\n飚\n皎\n##ｓ\n琶\n渭\n谴\n##ｗ\ncat\n##oo\n汲\n##hu\n胚\n殃\n怂\n○\n镂\n鳄\n谧\n戊\n驸\n##ec\n##con\n憧\n3000\n##gi\n□\n瘸\n﹗\n##ds\n骁\n驹\n1985\n##ir\n篱\n鹊\n喋\n④\n琥\n娲\n##°\n##il\n涓\n饵\n枷\n缆\n晾\n99\n獒\n攘\n●\n1990\n惴\n95\n澹\n62\n踮\n缇\n54\n遽\n##ab\njing\n##one\n斓\n##ｈ\nsk\n萼\n悼\n苯\nca\nsa\n51\n龌\n64\n喵\nfa\n蜕\n卞\n殉\n1991\n诬\nｓ\n會\n篆\n600\n孰\nhu\n壬\n##bu\n1988\n搽\n110\n44\n##ou\n糯\n芥\n5000\n淬\n1989\n痊\n睦\n##ns\nsh\n##ac\n##wo\n魇\n##ｃ\n涤\n籁\n57\n##op\n1984\nso\n犁\n72\n旖\n##ie\n秤\n1986\n檬\n耷\nme\n闽\n鹫\n泵
\n##ｌ\n##50\n##her\n樣\n俪\n岐\nru\n壹\nic\n##ld\n1987\n##ft\n鹭\n笺\n##26\n憬\n甭\n琵\n抒\n赃\n259\n800\n昱\n稣\n##per\n荤\n##um\n槟\nshe\n龊\nan\n噤\n##tion\n媲\n礴\n亀\n酝\n迢\n毡\n阉\n68\n玷\n49\nios\n壕\n##５\n炬\n##pro\n椭\n稷\n##18\n啮\n汨\n１０\n诶\n##22\n涔\n缨\n炊\n辄\n兒\n##hat\n蟆\nmi\n瞩\n鸢\n##tr\n180\n韬\n##bo\n伎\n1982\n榄\nco\n覃\n涸\n５\n诲\n爲\n##we\n孚\nsu\n痰\n##pi\n硅\n將\n酵\n饷\n剁\n##tu\n髦\n踞\n##out\n筐\n91\n锺\n蔻\n88\n忏\n镀\n荼\nmm\ndi\nboss\n酉\n85\n兩\nwin\n麓\n桨\n铿\n旎\n##mb\n##du\n##od\n浚\n##ph\n６\n睑\n迂\n恪\n##chi\n泱\n徵\n##dt\n闫\n啵\n椰\n肇\n佼\n##ct\n瀛\nma\n##７\n勐\n鹦\n蘸\n濮\n袂\n湮\n晤\n##19\n##my\n柠\n98\n59\n曜\n粱\n褒\n戌\n63\n41\n隅\n畸\n刨\n暨\n##art\n鳖\n##sc\n锄\n乒\n吠\n羌\n##http\n##id\n渲\n蟾\n##son\n##ｐ\n鹉\n體\n##ita\n辘\n坍\n66\n瞠\n姝\n##ine\n##bi\n獠\n胄\n蚣\n乓\napp\n癞\nav\n##21\n茴\n湄\n##cl\n##３\n蔺\n颚\nｍ\n芋\n掖\n猖\nde\nok\n凋\n##ni\npc\n钙\n唏\n##ys\n迄\n聲\n邯\n迥\n##２\n麝\n珞\nangela\n蟠\n瓮\n喏\n诃\n轧\n孺\n舷\n61\n##au\n##rd\n颦\n溉\n妪\nbe\n磋\n##６\n霭\n幌\n矶\n娑\n圃\n##rea\nna\n##cha\n橄\n嗳\n蜥\n##32\n筏\n76\njb\n杞\n怆\n炜\nal\n谛\n烽\n俟\n##sm\n##oc\n驮\n脓\n內\nbut\n1980\n熨\n67\n##14\n##im\n樵\n隘\n##dr\n##ts\n撂\n溟\nxx\n卤\n##pr\n##ru\n78\n##rn\n娓\n##13\n娉\n睐\n跤\nta\n籽\n伫\n戛\n纶\nla\n氯\npa\n羚\n莘\n##８\n笈\n螂\nto\n進\n##ag\n##～1\n沓\n##va\n##lu\n贮\n##ee\n筵\n侏\n璎\n亘\n赊\nbi\n酪\n酮\nas\n##em\n罂\n埔\n腼\n汴\n涣\n##ent\n恺\n讓\n##mp\n##all\n1978\n鸨\n鞑\n氢\n皋\n缤\n胱\n酶\n旌\n匐\n##ff\n棣\n##by\n惆\n##ang\n攸\n無\n##ist\n匍\n##dd\n73\n##san\n沅\n##ion\n##ment\n莪\n峥\n105\n##30\n##pl\n埠\n椿\n##ay\n峒\n谒\n绥\n81\n##ja\n盅\n96\n腌\n胥\n鸥\n1983\nlu\nin\n啬\n71\n##sta\n赣\nbo\n##vo\n隽\nvi\n捎\n剐\n霁\n##ka\n弼\n蝗\n阑\n篝\n蜻\n痣\n挞\n晗\n##now\n缈\n膺\n徙\n##60\n挎\n瞑\n##02\n裏\n霾\n殡\n87\nsi\n幔\n抉\n##40\n74\n##tt\n鼾\n昀\n92\nsuv\nｂ\n緊\ndr\n##ls\n耘\n〉\nmo\n94\n沂\n##23\nli\n##15\n煦\n撵\n犊\n楣\n镑\n##com\n##ex\n弑\n##wer\n缥\n##ow\n1979\n160\n瘴\n祛\n##ey\n偕\n##㎡\n蜓\n佻\n##za\n磐\n踵\n臾\n敕\n滇\n垄\n烯\n##xi\n昙\n跚\n槌\n77\n##rr\n洩\n##ef\n谙\n84\n130\n夥\n阚\n邊\n匾\n泯\n123\n祇\nda\n偃\n鲲\n間\n##ki\n⑤\n纣\n97\n〈\nlt\n豚\n谄\n憩\n啷\n煲\n101\n##80\n##ran\n1963\n爻\n##are\n镣\n柚\nha\n拄\n彝\nの\nai\n淅\n×\n胺\n赂\n蹒\n茱\n蜗\n豔\n攫\n##ue\n觊\n##16\n1981\n熹
\n７\n93\n觎\n舶\n##１\n嚏\ncd\n飓\n##han\nｊ\n痘\n##hen\n##ap\n86\nac\n##４\n覺\nyu\nga\nzh\n978\n##ya\n泾\n弋\n擞\n##und\n瑙\n##ny\n##ob\n瞰\n诠\n灸\n粼\n濑\n篡\n骰\n##yo\n釉\n８\nｘ\n泠\n揄\n##70\n汕\n芍\n啕\n篓\n聿\n锭\n砾\n##if\nle\n噶\n##sl\n250\n1964\n淄\n诩\n轲\n##lt\n1958\n║\n鲛\n137\n葩\n沖\n帚\n畹\n舫\n菈\n玟\nshi\n瞌\nyou\nid\n绔\n孵\n##ant\n1949\n摹\n２０\n桅\n從\n狎\nnba\n##jo\n##ina\n##31\n##tra\n雙\n##fl\n氲\nvip\n倜\n##17\ngui\n題\n##24\n坷\nap\n##ok\n掬\n當\n##ome\n83\n000\n1970\n姒\n鳅\nra\n翌\n氤\n蚓\n戍\nsc\n4000\n##ud\n恿\n##ms\nad\n##ty\n搡\n瘀\n皖\nko\n赳\n##ins\n搪\n沽\n##io\n岱\n##rc\n##db\n滟\n##ps\n坯\n1969\n79\n咣\n濛\n點\n黔\n與\n##ua\n㎡\n89\ndna\n渤\npe\n堇\n揶\n##ep\n輕\n掸\n炀\n廿\n羿\n蚯\n##ong\n酯\n##ｖ\n葆\n贻\n##ev\n1962\n##ive\ndo\n岌\n樟\n呤\n##ew\n泗\niphone\n##ers\n诽\n捱\n170\n■\n丕\n##rl\n阂\n嗬\n漳\n汾\n徨\nｋ\n1500\n殒\n##ip\n1956\n悖\n##ah\n##can\n3p\n##ko\n##pp\n##51\n##km\n##ye\n骜\n臆\n##mu\n##uan\n##ig\n喟\n痿\n潞\n##og\nis\n俸\n蝼\n祗\nip\n吋\n衮\n185\n榭\n##aw\n峪\n捍\n摒\n鬣\n骥\n忪\n##ian\nar\n##dy\nｗ\n猬\n轶\n##35\nba\nge\nat\n９\ngo\n1968\n燮\n愫\n琊\nka\n垛\n煊\n锚\n郸\n崆\n菠\n惰\n##45\n##ｋ\n125\n馁\n潼\nel\n102\n〞\n##fu\nｄ\nled\n}\n栾\n谤\n秧\n骊\nti\n劾\n榔\n嗝\n旻\n釜\n邝\n讹\n$\nfu\n##ub\n##90\n宓\n##ven\n涮\n見\nceo\n螃\ncl\n逵\n昶\n##ju\n1976\n##lli\nktv\n1965\n坳\n##03\nsp\n钏\n→\n鼬\n蜃\nｏ\n##33\n##des\n粽\n摞\n匝\n傥\n恁\n186\n##cc\n晁\n巽\n営\n颍\n##ai\n××\n砺\n仞\n115\n##ain\n##thing\n岖\n##xp\n罔\n韭\n187\n潦\nni\n孪\n1966\n##ill\n饪\n翊\n怼\n﹑\n舀\n##25\n##ber\n蔫\n俑\n恸\nph\n溥\n蜴\n##ation\n##ide\n##dn\n##200\n竣\n杳\n姦\n堰\n佈\n##pt\n垩\n處\n隼\n1974\n蛰\n##hy\n##der\n##est\n1960\n1977\n愛\n鳍\n′\n琰\n夔\n桢\n1957\n潍\n##lm\n怏\nwe\n##lan\n坨\n盹\n⑥\n111\n嬿\n##ugh\n##ii\n##rp\n##here\n##ast\nte\ngb\n140\n##ei\n##gh\n184\n噌\n滿\n##zi\n##91\n倌\n##sha\n##─\n108\n嚐\n泫\n1950\n1975\njohn\n##ste\n螳\n哂\n顼\n髮\n##ｂ\n1972\nam\n##75\n##eg\n钊\n##ear\n遛\n皑\npi\n臼\nya\n跛\n潢\n牠\n1200\n##ser\n沏\n─\n##61\n##71\n168\n榕\n104\n##our\n##res\n106\n帧\n洱\n##ight\n灏\n##sy\n忤\n##01\n鬃\n109\n锏\n話\n亟\n##kg\nvs\n##98\n##28\n121\n箕\n360\nhi\nnt\n阪\n垠\n逅\n掇\n##mar\nｇ\n葳\nol\n摺\n犷\n骧\npro\n缄\n侗\n##ere\n##eo\n紅\n忻\n哐\n纂\n181\n宕
\n袈\n烷\ncia\n##ore\nor\nab\n粑\nne\n邂\n炷\n##nn\n##ron\nmy\n113\ncr\n##ton\n蔗\n裟\n##tw\n##win\n糗\n1973\n蓟\npo\n槿\n馍\n1971\n##day\n鞅\n##41\n##ric\n豺\n182\n##les\n佔\n##99\n##era\n給\n﹖\n103\n湍\njj\n仄\n1945\n鬆\n1938\ntop\n122\n秆\n臃\n1955\n盂\n##42\n蹶\n糠\n##af\n邋\nss\n邺\n##red\n##ris\n锌\nwindows\n斛\n##mon\nm1\n6000\n虱\n偻\n1961\nqi\n128\n钠\n蛆\n诘\n##che\n##88\n##way\n112\n##nce\n1937\n長\n##ass\n1959\n淆\n雞\n琏\nyy\n闇\n鹂\n翱\nce\n龛\n町\n氟\n闾\n钺\n桧\n辇\n2020\n##55\n##72\n##ate\n##78\n谲\n孢\n##lle\n107\n卻\n磺\n瞭\n雉\n銮\n氣\n仨\n別\n##gen\n焖\n汛\n1954\n##land\n##sn\n1967\n##av\n門\n蒿\n漕\n##yi\n嶙\nmr\n翦\n呷\n##form\n##000\n怵\n##tan\n##dan\n蚤\n忱\n##age\ncpu\n119\n蠡\n##ith\n##ure\n##use\n175\n##ov\n瞇\nｐ\n##ell\n匮\n翳\nｎ\n胰\n牒\n枸\n傑\n遢\n珩\nlin\n##ze\n##66\n幾\n##04\n冗\n##don\nfor\n##08\n##ym\n{\n胛\n茁\n3g\n↓\n囱\n##ana\n痢\n##53\n171\n剽\n##ous\n##sk\n種\n##tal\n##min\n##gn\n張\n伢\n親\n疹\n妊\ncon\n##bs\n骞\n＃\n##06\n##shi\n##cal\nx1\n##rie\n傢\nisbn\n锹\n剜\noh\n绡\n喙\n臧\n谥\nkb\nｆ\n##yn\n##ble\nmar\n尻\n##００\n##nm\n珪\n囤\n谗\n178\n##ted\n##34\n蹩\n##62\n彗\n荪\n##tin\n1952\n衾\n跆\n##ity\n##get\n##pu\n118\n##top\n聽\n藐\n##hr\n荟\n##lla\n##43\n缜\nlo\n700\n127\n僮\n##nc\n##oi\n佝\n##ess\n146\n耙\n峋\n衝\n##64\n172\nho\n椁\n##68\n泸\n##rm\n240\n嗟\n##ction\n辍\n162\n136\n皿\n孬\n饕\n谆\nall\nfe\nintel\n165\n嗑\n遑\n##81\n柒\ned\n岷\n強\ndvd\nmen\n笆\n畿\n##ach\n##gg\n⊙\n##sd\n163\nex\n臉\n##vis\nbl\n蟑\n嘹\n126\n114\n116\n衢\n羯\n155\n129\n8000\n氮\nif\n##65\n1600\n箩\n##ｇ\n1946\n##47\n1948\n##ren\n##nes\n粲\n##oa\n179\n##07\n156\nnet\n##95\n・\n131\n娠\ndv\n##27\n浒\n厩\nㄚ\n##ten\n１２\ndb\n##ever\n##52\n##ern\n##ite\n問\nfi\n##gl\n##38\n１８\n##ica\ncn\n莆\n##48\n##ak\n叵\n荻\n夯\n##05\n##cy\n##29\n钾\n郢\n##rth\n瘁\n﹕\n##tel\n##ree\n並\nflash\naa\n淞\n東\nen\nmac\n娩\n##que\n贰\n姹\n##ji\n139\n##dm\n镁\n##44\ngl\n1953\n##ath\n##ect\n1951\n佞\n柩\nes\n135\n##tro\n衿\n嬛\n柑\n铢\n耆\n荚\n犒\n##iti\n001\ncc\n900\n##dc\n吖\n158\n##cs\n←\n##ame\n颧\n骠\n188\n铬\n##tor\n124\n151\n孱\n咤\n164\n##mg\n176\n##ave\nｈ\n缮\n绶\n1942\n350\n嵘\n堑\n睇\n1300\n##uc\n##ae\nbb\n##39\n1944\n珅\n鹞\n碴\n剎\n##den\n#
#mer\n##36\n##ard\n##76\n◆\n##lc\n##car\n镐\n潋\n220\n餮\n##73\n##ens\n浃\n渥\n岫\nｖ\npk\nlove\n饨\nsd\n眦\n刽\n169\n楔\n掴\ndon\n谩\n##ata\n崂\n##ust\ntom\ndu\n1947\n##end\n117\n奂\n橇\n庾\n##isa\n1940\n鳗\n★\n##ene\n##eh\n轼\n##94\n##82\n钛\n##kb\n230\nmu\n镳\ndd\n鱿\nci\n##vel\n##led\n遨\nim\n##49\n161\nint\n##ken\n撷\n緻\n隨\n３０\n##hin\n觞\n##lp\n##rk\n丨\nms\ntm\n豌\n##63\n嗎\n##85\n##ort\n颢\ncm\n##king\n雹\n##qi\n璜\n##54\n202\n##tom\n俎\n##men\n1941\n嘞\n馗\n##ger\n##ice\n沱\n142\n##word\n谟\ndan\n祎\n##zz\n##ek\n##ini\n##77\n渚\n##time\nwifi\n##let\n##lor\n134\n138\n10000\n##ert\n琨\n152\n##bb\n##ml\n##eng\n141\n##dp\nev\n##del\n盥\n##its\nman\n绻\njack\n##□\n蜚\n祚\n馄\n##93\nct\n1931\n##ace\n##ake\n##ans\n谀\n##hz\n鹄\n132\n嘣\n##ros\n1100\n帶\n##mc\n稔\n苁\n睥\n鸵\n##92\n擢\n秸\n酰\n##gt\n##97\n赈\n##oh\n##other\n##ani\n懑\n1933\n##len\n210\n疡\n亳\n箐\n樽\n1934\ndavid\n袤\n▲\n砥\npu\n##bra\npr\n133\n醛\n145\n##ful\n師\n##sch\n哔\n佚\n##over\n159\n迤\n斡\n##67\n剛\n1936\n镌\n崴\n##how\n##09\n##ws\n枳\n碉\n##love\n##ple\n166\n##46\n⑦\nthey\n囧\n苻\n瘠\n跻\n诋\n##ies\n177\n##96\n##lon\n淙\n##tle\n173\n##lin\n皈\n佗\n##ien\nsam\n塾\n榈\n##ings\n1800\n2500\n##69\n##rin\n##dia\n濠\n1939\n準\n麗\n聒\n##aa\nak\n卟\n##100\n艮\n蕨\n2019\nc1\n宥\ngi\n##gan\nmon\n陣\n##net\n荨\n##wan\n昇\nthere\n铣\n帼\nchapter\n胫\n觐\n##56\njo\n144\n邰\nii\n##world\n##ough\n饯\n##ute\n##86\n嶽\n丶\n谪\n##ire\n1943\n鄞\n鹘\ncs\n鏖\n鲫\nnpc\n##row\n##oy\n##gb\nhp\n##yu\n鳝\n祐\npp\n##89\n殓\n143\nⅱ\nmp\n裨\n￥\n##oe\n弭\n##port\n##ima\n5g\n兖\n##lic\n##tm\n槃\n##uch\n##zo\n膻\n蕩\n##sse\n佃\n##yan\n杈\nwhat\n變\n幹\n褲\n０\n迩\n１１\n167\n诰\n154\nwang\ncar\nsaid\n谚\n闆\n蛹\n##ram\n##ker\n玳\nibm\n藜\n##rf\n##cam\n茯\nchina\n瓒\n纰\n谯\nva\n實\n褓\n骛\njason\n1927\n悽\n豢\n##tb\n##las\n５０\n260\n碘\nmp3\n##ari\nweb\n##nk\n4s\nop\n抨\n##ane\n148\n##wen\n則\n1900\n##sco\n弁\n啻\n焜\n##lv\n##74\nｉ\n##ler\n##fc\nusb\nea\n讫\n鹜\n##fs\n榫\n锂\n绦\n彧\n檄\ngps\n篑\n赡\ngdp\n149\n##mus\n裾\n##back\n铂\n##bee\n##ley\n诙\n踊\n蟋\n##tos\n藓\n##bor\n1935\n##ug\nem\n闰\n7000\ntv\n1930\n##ise\n浔\n1932\n腳\n204\n157\n##val\n莅\nｒ\n##rus\n##van\n缱\n裱\n##cp\n斫\
n連\nep\n##84\n##79\n俾\n##ces\nvol\n熱\nll\n砧\n罹\n1920\nag\n陲\n147\n擡\n##x2\nwi\net\n##ku\n##more\n１５\n痔\n##pan\n##ase\n##ena\n##ash\n211\n##lina\n##tch\n幄\n禺\n320\n##ora\n邛\n難\n骈\nil\ntt\n嬌\n174\n365\n囗\nyes\n##ona\n猗\nking\nhd\n##ose\n750\n蛭\n##ura\n俚\n驷\n滦\n##ula\n國\n隍\n◎\nos\n##ves\n##ale\n##83\n棂\n萸\n§\n##pm\n蚝\n##57\n炅\n箴\n侑\nmv\n##87\n##gs\n涝\n##cer\n##urn\n153\n薨\n##sen\n##ala\n##2000\n屍\n酚\n鄱\ner\n襁\n3500\njames\n陽\noppo\n##ano\n蚱\n##tar\n淩\nsan\n1928\n##cent\n##tto\nig\n##ral\n槁\n茭\n酗\n荥\n##tes\n##ster\n1929\n##∶\n##ical\n##low\npeter\nxp\nben\n##37\nipad\n腓\n256\n##hd\n砣\n烩\ngoogle\nｙ\n蹴\n圩\n##bar\n##nan\n痫\n##gar\n##mes\n焙\n＞\n應\ncp\nns\n##nic\n猕\n１６\n黍\n##ism\n＋\n睢\neric\n##ish\n係\n鹑\n犄\n##bet\n##ult\n焯\nｌ\n▼\n劭\nbar\n楹\n扪\n##lly\none\n嫚\n##58\n蟀\n淖\n纾\nexe\nbt\n黜\n##less\n僖\n擀\n钜\n##ade\n##tic\n寮\nｅ\n##urs\n##rid\n嵇\n##plus\n轉\noo\n##als\n##lie\n##ons\ntw\n##act\n蛐\n##mic\n潸\n孑\n##ics\n##tl\n柢\n蓑\nmba\nを\n##ime\n滓\n箸\n懋\n189\nmc\n蘇\n烊\n##ン\n##sis\n郦\n荏\na4\n##fer\nds\n苡\n饬\n##work\nlan\n##ara\nken\nthat\n徇\nspa\n殁\n铤\nβ\n腩\n喱\n##gin\n##lf\n氾\n##kin\nkk\n讚\n栀\n笞\n菅\n興\n##cle\n##sit\n锲\n##ks\n嶂\n虢\n##ile\n##sion\n濯\n″\n嬅\nt1\n耦\nts\n泷\ndl\n挝\n1926\n##ving\n##lia\n##59\n1400\n##tc\n201\n髒\n饽\n##zu\n龈\n邕\n##ー\n條\n琮\n複\n##の\n##ama\n囫\n铆\n##ood\n腱\n##np\n榛\niso\n箔\n唑\nau\n##bus\n##bre\n蛀\n212\n##ner\n樾\nah\n##life\ncan\n##hn\n糸\n##cd\ncindy\n沣\n骐\n##ory\n##ling\n##bit\nlg\nbook\n雖\n##tp\n##app\n##chel\ndc\n190\n##pc\n缢\n##anda\n⑧\n颛\n##ost\n##ool\n##new\n##orm\n##ture\n谌\n瀞\n##down\n恫\n噁\n採\nmk\n203\n獭\n陂\n##live\n暹\n##own\n##lam\n##ink\n狒\n280\nc2\n##come\n刎\n蜍\nlv\n佰\n晞\n301\n##house\n膘\n篙\nか\n擘\nir\n##ark\nhigh\n谶\n##ial\n##lar\nmiss\n##een\n##die\n絲\nbc\n##night\njava\ndx\n儆\n紮\n450\n纫\n镭\n菖\ntim\n咭\n焘\n##rley\n##ctor\n罄\n##dra\n##600\n睪\n##nar\n##ich\n##ward\n毂\n##ier\n##light\n##ual\n悱\nexcel\n吡\n##ean\n峤\n##book\n0t\n栎\n##ock\nhr\n瑠\n##kw\n1912\nbaby\n圜\n205\n##ght\n拋\n##001\n##bert\nscott\n##ox\n⑨\n孀\n##mhz\n##line\nmichael\nㄧ\n##hs\nfacebook\n羟\nsb\nv
e\n痨\n犸\non\n菏\n##ola\n##set\n##hl\n鄢\nbug\n珲\n榷\n206\n##star\n##500\nlol\nchi\n钨\nm2\n飨\n1917\n##vin\n豉\n滂\n撫\n獗\n酊\n##～3\n冼\n刍\n漸\n鳃\n##tf\nbill\n邈\n苷\n斷\nabc\n歡\n##ida\n°\n關\n醚\n##ann\n佥\n濕\n##ance\nipo\n徉\n羸\n##air\n雎\n桎\n祢\n##llo\nmax\n缙\n##ond\n##key\n細\n偈\n梏\nng\n龍\n脫\n伉\n徕\n002\np2p\ngod\n磬\n1919\n##nge\npaul\nby\n唳\n馬\n楂\n魍\n纭\n##bot\n肽\n##ence\nは\n涿\nvr\n##hing\n270\nnew\n龜\n祜\n##uy\n##た\n5m\n艷\n##ign\ns1\n##med\nfan\n##cia\n208\n##ead\nに\n栉\n##ky\n鎏\n歙\n氰\n##well\n##side\n樂\n纔\n##force\n##～20\n痂\nwho\n207\n衹\n##か\n仃\n锉\n雒\n##ein\n歳\n##ses\n##ik\n##face\n＜\n##jing\n萘\n##oto\n##nder\n恥\n蛎\n209\n燧\n1m\n虻\n##ht\n##mit\n甯\n啖\ndj\nb1\n##tions\n風\n##ould\n##bin\n瓯\n車\n##nter\n##ding\n徜\n蹉\n##lis\n##mn\n##ick\n##ese\nsr\n桠\ncf\n##hai\n薏\n桉\n##а\n癣\naf\n钿\n##ndi\nwhen\n##put\nles\n阕\nxxx\nmobile\n拮\n脍\n涪\n咲\nα\nins\n##eme\n臬\npart\nlisa\n##☆\n20000\npre\n許\n軟\n鸪\n僭\n##ｊ\n##zy\n##ino\n離\n庖\n##tte\n##ning\n婺\n蹇\n鹌\n##dar\nvcd\n赝\n1700\n癡\n##ope\na1\n汞\n濂\n熵\n1914\n##lee\n橹\n咩\n##ery\n213\nchris\n##boy\n矾\n221\nf1\nmb\n225\n226\n##lls\n獐\n##ode\n##ring\npen\nly\n##tv\n鹧\n##play\n##llow\n矍\n218\n蝌\n##nch\nm3\n奮\njoe\n墉\n##vd\n##uk\n｛\n1924\nae\n222\nmt\n靼\n##nis\n電\n瑁\n##mark\nthis\n酩\n｝\n荞\n##nda\n##し\n##ラ\n##rge\na6\n澍\n卅\n歎\n豐\n勖\n##pin\n１４\nmd\n##ats\nmin\n##ket\n214\n噻\n猷\n1925\n◇\n芪\n溴\n啶\n祉\n520\n##view\n370\n##ｆ\n铄\n245\n##lk\n澧\n書\n##gle\n##joy\nｑ\n232\n##750\n##vs\nhow\n255\n摀\nvivo\n##wn\nfbi\n1918\n##cel\n216\n##ium\n饴\n糅\n##bel\n228\n##jia\n231\n235\n〇\n铡\n珐\nwu\n##ham\n##ray\nred\nparty\n頂\n##ress\n4g\nsimon\n1921\n滁\nwin7\nsex\n##ル\n##kk\n硒\n硼\nww\n讧\n##ound\nvista\n戶\n淒\n芩\nie\n觥\n##ions\naaa\n242\n251\n1911\nⅰ\n330\n淨\n旮\n##gy\ndiy\nbest\n007\n髋\n2200\n1922\n莖\n噱\n麵\n秣\n302\n蜇\n肱\n215\nnot\n##nor\nlily\n##0m\n##400\n##able\nup\n##said\n4500\nfont\nlogo\n##と\n痈\n1909\n1923\n##800\n##bc\n〝\n鲈\nwith\n唁\n復\n溫\nyi\n‧\n##v4\n極\nhan\n##μ\nfc\n芈\nrpg\n檎\n999\n傳\nt2\n葭\n227\n##nia\n谔\n牍\n莒\nkg\n##home\nrobert\n##tter\n埂\n##deo\n視\n##sol\n媞\n310\n##sun\n锰\n##vi
ce\n##ook\nphotoshop\njs\n268\n512\n##ull\n赭\n眞\n醐\n195\n粳\n诟\n##ux\n蚪\n2400\n氐\n遒\n機\n煨\n##bt\nv1\n盡\n絕\n惇\ncv\nyang\n##x5\n诿\n馏\n铨\n##ments\n蕲\nofo\n膑\n⑩\n##dio\n靜\n沢\ncho\n##bell\n##cat\n650\n##ters\n钎\n醍\n##count\nplay\nsci\n##lay\n##ike\n##right\n醴\n蒨\n##ung\n##mate\n1915\n螨\n##ison\n阆\n吶\nmg\nnu\n鲑\n續\n##ume\n##900\n灞\n##ett\n疟\n镍\n##ix\nwill\n﹡\n480\n##→\n疽\n##bat\n##last\n4gb\n鐘\nib\n##hc\n飢\n阡\n笏\n312\n264\n303\nrock\n屐\n##～10\n##cky\n牦\n219\n241\n##xx\n砝\niv\n１３\n蝈\n##log\n郴\n##fun\n緩\n##ico\n##ness\n325\n##ors\n##nne\n△\n340\n荠\n##ved\n膈\n380\n霰\nhello\ncj\n箝\n赓\n##hop\n##ス\n##hm\n##berg\n鳕\n1906\n##rect\n1910\n壓\n##ez\n##ller\n遴\n##wood\n##イ\nrn\n253\n##rry\n厝\n##lr\ncos\nb2\n仝\n290\n終\n溧\n〃\n##tech\nlet\n骅\n##bon\n##gp\ntb\n餘\n珮\n##tti\n##n2\n306\ngp\nsg\n229\n2100\ncad\n葺\n##xxx\n##kv\ndm\n宮\n苋\n洙\n亂\n該\n○○\na3\n##rio\nmvp\n##и\ngame\n騷\n｜\n脣\n蘭\n總\n黃\n鲶\n720\n##cn\n邳\n##free\n苕\nxa\n1916\n010\n233\nus\nbp\n##tz\n铉\nnote\nart\n蕤\n##aft\n铀\n##ule\n椽\nff\ncbd\n艹\n##wu\nof\npd\n15000\n嵬\n腭\n＆\n305\n##ク\n##eep\n稞\n窠\nback\nvivi\n850\n##lio\n2m\n眾\n##fig\n##ince\nmay\n##ary\n##300\n##reen\n跖\nfm\nsf\n224\n##nts\n##ada\n234\n诤\nlee\npt\n223\n236\n238\nvc\niso9001\n仟\n304\nlp\n牆\n265\n##a2\nui\n傩\n昴\n焗\n269\n產\n##wei\n藍\n##mw\n蓼\nrs\n佶\n##gm\n308\n##chen\n悌\n3c\n赅\nkevin\n##く\n２４\nsat\n麋\n疣\n##ize\ns2\n挈\n##●\nrichard\n333\n##0mm\nnon\n1905\n綦\n＝\nedward\n##m2\n##ape\n踟\n馔\n##cher\n槎\n##lab\n稹\n##rian\n掛\n靛\n430\nsao\n217\n252\n##lus\n蛳\n舂\n1908\n##～6\n12000\njane\n##head\n語\n##cks\n##erry\n##vc\n場\ntd\nい\n##フ\n199\nmark\nｚ\n237\n旃\n邨\n痍\n鹈\n8gb\n託\n恚\n##hua\nsir\nnp\nhis\n1913\n2gb\ngm\n299\n冇\nb2c\nmatt\nml\n##700\nword\ngre\n##self\n琇\nhappy\n痺\nlc\n550\n##120\n⌒\n262\n##hip\n驚\n順\n##py\n##ways\n##い\ninternet\n##lot\n##ria\n數\ncome\n婦\nrt\n243\n246\nⅲ\n跎\n瓤\nsuper\n410\n記\n暝\nmini\n020\n摈\n夾\n##aid\nppt\n##wt\n##oon\ncenter\n261\n292\n賤\n530\n1907\n352\n420\n蛔\n249\n##ius\n杷\n識\n##tis\n315\ntf\nlive\n263\n結\n詩\n##hang\nwas\nmaggie\nver\n##mber\n傣\n##dom\n#
#pass\nangel\n##①\n##rap\n2600\n##ious\n272\nmicro\nbit\nautocad\n碣\n##lock\n311\n##cord\n##uel\n##oid\n芡\nwei\n顯\n耒\n2g\n達\n##eet\n288\n郧\n莴\n##kit\n歲\n2300\n##read\ndel\nｕ\n揆\n≥\n蠹\n##き\n態\n囹\n9000\n囿\n1902\n＠\nhk\n##urt\n##osa\n##rip\n##cts\n楮\n##シ\nnow\n芫\n##е\n440\n##に\n製\n##ｑ\n279\n618\n湫\n196\n钴\n雲\n278\n溼\nits\n響\n##ques\n361\n瑀\n##md\n##ude\nmaria\n##rial\n##▼\n2d\n316\no2o\n##ack\n##tional\n##test\ngsm\nsim\n##point\n244\n寫\n煸\nfree\nleo\n##gc\n##123\nher\nnvidia\ngs\n560\n510\n潤\n～1\n##rain\nec\n##う\n##a1\n塬\ncctv\n顫\n##mand\nft\n壅\nsun\nalex\n愆\n275\n##press\nalbum\n##ded\n搖\n3m\n##body\n觸\nandy\n0l\n铰\n##о\n畦\n5mm\n580\nasp\n248\n247\n666\nm4\n##010\nend\n##max\n辊\n粕\n005\n奘\n擺\n嚥\npop\n阇\n##ject\nf4\n捌\nwto\n砒\nmail\nipod\n岬\n198\n玠\n篪\n##｜\n273\na2\n##★\ngood\n276\n##3d\n321\n309\n倬\n戕\n肓\n307\n浜\n##city\ngif\n271\n##レ\n680\n911\nx3\n296\n##tee\n牯\nrp\n煅\n迴\nget\n192\n祕\n繼\n##ular\n##tore\n喆\n266\n##ア\n疱\nimg\nchen\n擊\n捨\nx5\n##tus\n##gas\n055\n噹\nt3\n劉\n##r3\n郤\n##ible\n193\n螯\nset\n欸\n苣\n323\n苜\n283\n##ira\n菘\n嘚\n橐\n5l\n820\n1903\n##н\n##2010\n##ford\n##nse\n薹\nhtml\n裝\n圻\n##ら\n辦\n繇\n191\nshow\n抟\n圓\n285\nvan\nmx\n請\n爰\n##oz\n4m\n瞋\n##oot\n枰\nthinkpad\n芃\n##logy\n##ations\n1898\n##tive\n挹\n钤\n##テ\n鬍\n2700\n##vr\n##ssion\n孛\n##may\n003\nbig\n##nel\n尕\n322\n篁\n腦\nv2\n砷\nfrom\n##cture\n枇\nbob\nreal\n佢\n濬\nlook\n##ming\n亞\n##ん\n##works\n龋\nwhy\nabs\n菟\n圄\n##room\n##tta\n齊\ncam\n318\nuv\n鼐\npv\nout\n樓\n幫\npan\n≤\n##wang\n##ova\n2800\n噴\nbat\noffice\ng1\namy\n疸\n1895\n342\n砭\n貼\n楫\n##sky\ndream\n儘\nrom\n洄\n∩\n##ping\n780\n1a\n萬\n＇\n鲷\nbbs\nkiss\n飯\n##て\ncarol\n540\n羧\n##mann\n30000\nalt\n197\n堅\n##late\ngay\nlinux\n##vic\nと\n##nos\nｏｋ\n1904\n癜\n讴\ncba\n飛\n710\npet\ning\n庠\ncall\n080\n蚜\n5t\np2\n##○\n##～5\n980\n4a\n267\n寶\n254\n論\n逶\npg\n6p\nvisual\n##ush\nunit\n008\n舛\n##box\n犍\n##ball\n陳\n##リ\n柞\n950\n醮\n372\n##raph\n##eam\n282\n286\n328\n##な\n##xy\n遠\n墀\n1901\n##rey\n194\n##hp\n##nap\nmsn\n3200\namd\n339\n##cept\n##list\n##fit\n洮\nf2\n##cker\n985\n##ath
er\nmix\n##hur\nadam\nerp\nef\n##は\n貅\n閒\n嗪\n408\ne1\n331\nua\n281\n沭\ns3\n調\n##ware\nb2b\n287\n##lish\nray\n跹\n##rite\n298\nmicrosoft\n隱\n011\n##ering\n縫\n盪\n##￣\n畫\n291\n疥\nthomas\n竜\n堀\n314\n863\n##rent\n##hoo\n##fw\n6m\nob\n誘\nwhere\n##times\n##ト\n蓿\n薙\n佤\n乩\n柘\n茏\n##name\n趺\n钼\n390\n铩\n366\ncosplay\n竽\ncb\nesp\n##110\n單\n##も\n獾\n##alk\ntc\n460\n##ets\n妣\n401\n華\njust\n阈\napple\n1897\nio\n##oud\n芎\n鼹\n楊\n莠\n##к\nnc\n﹍\nsome\n↑\n##ない\nき\n##ube\n050\niii\njan\n約\n淦\nbonnie\njun\nkey\n##101\n1888\nplc\ns6\n棹\n郅\nul\n##ald\n勢\n5a\n晷\nlike\n006\n6l\nし\np1\n呎\n3600\nmate\nrc\n啉\npvc\nhtc\n##ax\n##pd\nice\na8\n烃\n鼋\n疝\n362\n289\n漲\n菀\n##stone\n##＋\n##ration\n##fly\n##mini\n##hone\n1～2\n##ᅡ\n335\n##rant\n##ail\nalice\ndes\n##iel\n2～3\nblue\nc4\ntony\ncx\ndt\n257\n彈\n琍\n1250\n炁\n鹕\ndec\n355\n369\n孳\n##ito\n412\n喹\n##uck\n鐵\n陟\n襬\n##nger\n##↓\n趸\n338\n##field\nfrank\nkl\n囉\n##mah\n470\n愍\nc3\nwilliam\n##flow\n324\n耄\n640\n绀\ntcl\n##～8\n##n1\n1894\n嬢\nwell\nover\n郜\n嫻\n284\nnb\n##160\n##lace\n##り\n##mond\n313\n﹔\n295\npos\n239\n##web\n蘼\n##zl\n##bank\n閉\n277\n膚\n咁\njr\ninc\nthree\n楝\n##type\npm\n332\n週\n夠\n##base\n818\n##р\n叻\n1890\n##す\n##ま\narm\n勁\n羨\nh2\n換\n##ily\n351\n402\n##ience\n##cus\n確\n镗\n630\n怙\ndoc\nbuff\n##cing\n桿\n杼\n釐\n5℃\n縮\n004\ndota\n铵\n##mv\n025\n##group\n##cil\n329\n洹\n桁\n蒹\nbbc\n廝\n##o2\nact\n258\n霑\n##tone\n6500\njournal\n860\nsmart\n稗\n##2007\ndnf\nr1\nmacd\n375\n適\n326\ncore\n醪\nく\n603\n榉\npose\n蜊\n021\n334\n￣\n358\nsky\nsunny\n294\n345\n龅\nbd\n5cm\nstar\n卍\n異\nsata\nnext\n##stin\n鹳\n##2c\n##с\n岘\n甾\n388\n凈\n##ream\n##る\nrain\ntan\n##ged\n錯\nopen\ntel\nuc\n懷\n薅\n##jin\ns7\n啱\n癢\nhome\n##ition\nsoho\n368\n镛\n##cast\n##uma\n274\ndie\nrose\n##card\ni7\n015\n601\n009\n319\n備\n萁\n##hg\n##iu\nare\n苫\ntvb\n570\ngd\n瘢\nair\n跡\n##try\neva\na5\nhttps\n鈺\n1024\noa\n##480\npower\n##talk\nalan\n線\ngigi\n徳\n##jun\n##vas\n鳳\n##█\n##＜\n垭\nフ\n##west\n谘\n##◇\nmaster\n1g\n359\n##board\n試\n405\njean\n崧\n1gb\n100g\ntype\n386\n畲\n狀\n##～7\nmodel\n##mall\nq5\n##dge\n稜\n漯\n
810\n1840\n嵊\n##520\n660\nzero\n012\n義\n願\n徼\n730\n##cg\n杓\nrmb\n刈\n顆\n慄\n555\n##250\n960\n±\n418\n3a\n327\n415\n﹝\nseo\n013\n##ロ\n麂\nsoc\n1899\n芾\n297\n##nie\nfans\n620\n##oper\n##tag\n##cation\n341\n穀\n領\n##mine\nqs\n4l\n聞\n382\n湟\n##vm\n趕\ngeorge\n##pop\neos\nγ\n蜢\n##ias\n501\n網\n##care\nxxxx\n較\n認\n293\n鍊\n﹞\n嵯\nblack\nemail\nh6\nfirst\n888\n椋\nsteve\ncharles\ntwo\n##ette\n倆\n訴\n瘋\nltd\n028\n##コ\neb\nwar\n##み\n353\n貔\n撐\n羅\n##wd\n觀\n哏\n840\ngc\n##さ\n矇\n昉\nx2\n圍\n##ument\n燊\n##150\n502\n瑤\n濃\n##в\njim\n##rade\n##nsis\nhong\nlz\n##data\n鞣\nella\n痤\n枋\n384\nsas\n錢\nmike\n1884\n422\n760\n357\nmod\n##lent\nafter\n385\n019\n##core\n##カ\n##zon\n1860\n398\n疖\n890\nv8\nvar\n儋\nsong\nchan\n##dden\napi\nhold\n苄\n317\nn1\n楸\n1880\ngalaxy\ntina\n碛\n##360\n##zen\nmarc\n1850\n5500\n432\nck\nlady\n##ology\n##stro\n354\n珙\n珥\nyour\n##■\n7500\n1886\nauto\n2030\n##ᆫ\n樑\ni5\ne3\n##a5\nkim\n♀\njpg\n##note\nな\nmary\navi\n戰\neq\njp\ndos\n367\n610\n傷\n##site\n襪\n3300\n＾\ncg\n俳\n##pad\n郫\n蓁\n##beth\n##oman\n痧\n2b\n##⌒\n##wing\npass\nて\nた\n瘘\nufo\ndaniel\n談\n##ney\n376\n363\n513\niq\n022\npolo\n渾\n翹\n2022\n343\nexo\nh5\n344\nder\nあ\ndll\nh1\n1280\n1080\n菸\n##coin\n蓋\nmit\nonline\n394\n488\n##tsu\nebay\n20℃\n‖\n軀\n顏\n##itor\n##ｚ\n讀\n甙\nthan\n‰\njul\n##║\n術\n##file\n碁\n##media\n##hk\nacer\nㄛ\n##つ\n4k\n1001\nshirt\n518\n##aka\ngg\n氪\n桡\n##gma\n麸\n690\n348\n378\n##erson\n379\n惡\n##walk\n襠\n##burg\n盱\n8mm\n##ppy\n##lling\n##rma\n##ello\n謝\nrx\n701\n1893\n##tein\n丟\n442\n##3c\n##ィ\n500g\nつ\n啟\ntft\n爺\n哌\n鑽\n2a\nbas\n炔\nliu\nky\n誰\nmake\n課\n411\n381\nkelly\n★★★\n##999\nbot\n脘\n##watch\n##2008\n肄\n##こ\n藥\neasy\n衆\n347\nnova\nstart\n##ヒ\nシ\n##via\n399\nsea\nf3\n堃\n3800\n456\n軍\npci\ncase\n346\nqa\n##ference\n1896\npsp\n407\ninfo\n##zer\ne2\n409\n625\n##ground\n802\n##ury\nmp4\n##news\n︶\ns8\n維\n389\nwind\n414\n6a\n掙\n029\ncoco\n節\nrd\n##mma\n##fx\n泮\n##ella\n027\n1v\n潔\n併\n頓\n嬗\n##ウ\n垚\n##ries\nangelababy\nrna\nrm\n酢\n櫻\nり\n##を\n404\nkd\nm5\n##uid\ns5\n18000\n395\n##table\nア\n337\n371\n繩\npdf\nl1
\nphp\n盤\nhotel\n務\neur\n##°c\n670\n737\ndata\n##マ\n報\n扦\n況\n522\n钯\n戡\n1870\nさ\natm\n##nny\n##lex\n##1000\n024\nrf\n364\n朐\n##ᅵ\n##uard\n壞\nuber\n100m\n25000\nstep\n##タ\n捲\nwap\n钣\nxl\n歷\n##②\nkm\n##drive\n503\n##〇\nfly\nbrian\n頫\nlow\n356\nhip\n毽\n決\n##tina\n遊\n490\n吳\nlog\n##━\n##ハ\nuk\n嗚\nφ\nmatlab\n##wf\n▪\n##uce\nfind\nsee\n劍\n##け\nharry\n458\nⅳ\n##rman\n燄\n373\n賞\ni3\n钒\ng3\n貝\n聳\n590\n癒\n336\ntwitter\n稱\n帑\nlink\n419\nmartin\n##tical\n396\nsql\n413\n2025\nhero\n顧\n970\n##load\n438\n鮮\n50000\n儀\nspace\n023\nlcd\n臥\nemba\nlevel\n廳\n葉\n倫\netc\n##r4\nけ\n菽\n740\n416\n##2009\npcb\n纏\n##ᄋ\ncup\n426\n433\nallen\n聖\n脹\n830\n靈\n陞\n藿\n運\n349\n377\ntcp\n1350\n504\n暈\njay\nlong\nwhite\n##ement\n##ᅩ\n殭\nv3\n374\nmoto\n##oka\n優\nnhk\n##pus\n嘌\n倖\nalpha\nlouis\n轭\nright\n講\ngis\n註\n##１０\n##α\n##キ\n鹹\n##tics\n戲\n##л\n祂\n☆☆☆\narea\n專\nsorry\ndiv\n1889\n疔\n##atic\nfb\np3\nsize\n籐\n##page\n6g\n##￥\n455\nvideo\nhenry\nwest\n孃\n燔\n##tant\ngreen\n計\n黒\n淚\n425\nhot\n摟\n1867\n3ds\nwow\nadc\n435\n##ties\n495\n夢\n釦\n465\nう\n##⊙\npin\n##shop\n綾\n伕\nω\n嚇\n1885\nmpv\n##vy\n##ads\n＼\n蟲\n瓏\nvga\nping\n蓖\n##ift\n516\n億\nm8\nxd\nups\n##ship\n##nus\ntime\n##ils\n392\nmtv\n##zone\nzen\n768\n##vg\n1234\n涞\n躪\n際\n簡\na7\n擠\nrq\nieee\ng5\n##2d\nwindows7\n##rum\n蕈\nhas\n##ナ\n葚\n515\nqueen\n##2012\nace\nmore\n⑴\n880\ncmos\nc5\nlife\nadobe\n##bow\nd3\n30g\nbye\nb5\nmap\n卮\n##mple\n飽\njeff\n溱\nlight\n荽\n猶\n紀\ntake\n##www\n##lone\nform\n蔘\n##ika\n涼\nusa\ng2\nㄟ\nlab\n10℃\n511\n451\n##ctive\n唸\n8g\n盧\n397\nfun\nemi\nbmw\n##yer\nstyle\n1882\nbang\n403\nclass\n##ェ\nhard\norg\nd1\nimage\nsay\nnews\njpeg\n禮\n##stry\n##mix\n買\n湧\n##ique\n32gb\n2900\n姫\n100km\n##cket\n777\n##mobile\n747\n##town\n豇\n##ⅱ\n##±\n##chat\n槭\n氦\n##link\n姪\noem\n骶\n##ader\nイ\ng6\n##dog\nide\nnever\n痠\n★★★★\n員\nhdmi\n呋\n輪\n##ひ\n8cm\n499\n387\n號\n##tax\n醫\npost\n褫\n427\nanna\n謂\nfile\n##gio\nb6\n##2011\nk2\n##そ\nidc\n508\n506\n撥\n##3000\n##host\n##next\ncpi\ncandy\n##ider\n##ω\n樺\n舉\n1010\n417\ninternational\nann\n##code\ngmp\n##sent\n2
k\n査\n趙\n脩\n521\n##space\n荸\ntoo\n漢\nq2\n巖\n##dk\n##yy\n##lines\n棗\n頸\n543\ntech\neco\nboy\nppp\ntop10\n閃\n｀\npage\n##baby\n##ncy\n吽\napps\n簾\n##＄\n##sia\n10g\n2cm\n453\n4800\n##168\n酐\n槲\ng7\n2mm\n屬\nπ\n##iko\n##370\n乂\n驗\namg\n429\n禪\n嘖\n##rmin\nfire\n##rris\n##cca\nwp\neps\n厲\n燒\n505\nseven\n##eting\n##asia\n嫘\n蜱\n镕\n暸\n鲟\n##urg\nworld\n耋\n##rder\n16gb\n##qq\n##tty\n##ville\nnas\n468\n##ツ\n061\nparis\n##т\nd2\n颞\n痼\ngreat\ndesign\n耨\n##5s\n870\ndom\n770\n樯\n醯\n485\n##map\n須\nanne\n073\n##tment\ngrace\n養\nvia\nmall\ncdma\nsweet\ngpu\nト\nmandy\n肅\n406\nv6\noracle\ncss\nyoutube\n籼\n##011\n448\njerry\nrfid\nxml\n3100\n705\n殘\nplease\nコ\n##moon\nmsi\n綑\naccess\n##tton\n##text\nれ\n腈\n##↑\n辺\ngucci\nmagic\nitunes\ninn\n##ql\n岙\n##ews\ncrm\n歸\n391\nport\nram\n##あ\n##れ\n##url\n20g\n鲢\n##mbps\n475\n嘆\n1868\nquick\nonly\nold\n626\n##rner\n闕\n虛\nee\n側\n##tation\n##lution\n##nba\n1871\n##ニ\n##ᅮ\n1111\nneo\n綁\n勝\n蘋\n⑵\nrick\n595\n##dget\n##pper\nway\nbus\n癮\nplan\n##っ\n2050\n層\n15g\n檗\n繫\netf\nx6\n糍\ngtx\njennifer\n燈\n叡\n瀰\nカ\n428\n戦\n711\n滾\n599\nhere\n藝\n##ston\n588\n835\n##fet\nproject\n業\n璁\nunder\n褔\n##888\n##ature\n纖\n質\n繞\n##ち\nsony\n純\nみ\n##pact\njoseph\n##gate\n##у\ngov\n8k\neleven\n3～5\n##012\n##uality\n##google\n##м\nmusic\nq7\n##tail\ndemo\nる\nひ\n枱\n##ager\n498\n##verse\n塚\npad\n钡\nblock\n獣\nchinese\n尷\n##д\n##ging\n矽\n褻\nthink\n4200\nlock\nエ\nbenz\n##bay\n÷\nread\ngary\nbh\namazon\n瑩\nmoon\npvp\n嚕\n##lts\n眙\nsteam\nrun\n##iki\n##dot\nn2\nfx\n##ット\n資\n燙\n﹒\n笹\npaper\n3400\nハ\n5v\n畀\nccd\n﹚\n3t\nselect\n##365\nplus\n##ᆷ\n喫\n嘧\n獨\n##よ\n##や\nこ\n##pn\n688\n禦\n鏡\n1080p\nwan\natp\n##ᆨ\n℃\n瓴\n↓↓↓\noffer\n産\n1020\nbad\n##ator\nμ\nhave\nabout\n##リー\n團\n##aine\nδ\n##bility\nl2\n##js\ns4\n5k\n##≤\n臨\nkate\n##モ\nsummer\n★★★★★\n槍\n锑\nfox\n彎\n##ール\n##え\n##hia\n##007\n2021\n曉\n碇\n習\nfda\ndha\n##＞\nrich\nclub\n锆\n皴\nland\nbell\n環\n﹙\n荳\n5s\nname\n壯\n##tier\n1b\njustin\n801\n5d\n383\n##mail\n隠\ndns\n塊\n氖\n廣\nま\n##わ\n睏\n##food\n擁\nbox\n920\n秭\ndown\nidea\n525\n樹\nlist\ne6\nlaw\nocea
n\n##unch\nctrl\n##ᆼ\n痱\n罷\n##ホ\nfri\n寧\n##tory\n墒\n骯\n##せ\n##sson\ncharlie\n膩\n798\n##メ\ng9\nlittle\n##gion\n##lter\nnana\nday\ntue\ncare\n氙\n445\n頰\ndoi\nス\n##nix\njoy\nengine\nす\nら\n▽\npay\nco2\n##ᄉ\n##013\nnasa\n蒡\nって\ntouch\n421\n啓\n##ners\nq1\n園\n##ories\nmysql\nhop\n790\n灬\n錄\nnice\n討\n855\n徹\n嚴\n暱\n##jie\n##＾\ntai\n喲\noled\n舊\n朮\n镉\n##ッ\ngirl\n貴\n銀\nbase\n3～4\n脲\nク\n睜\n##チ\nkbs\n50g\n##ふ\ntips\n選\n負\n##◆\nedu\n鬚\namerican\n457\nλ\n碩\nbreak\nuniversity\n##2013\n##iner\nloft\n舸\ncross\n611\n蹼\n##ɡ\n##bian\nmost\nb12\n綿\n類\nming\nサ\nhoney\nldquo\nnike\n恆\n075\nesc\n##gnet\n慘\n##ンク\n鎖\n925\n舖\n璿\n##～17\n標\nε\n靦\n##jiang\n##ㄟ\nyoung\nforum\ntimes\n獸\nwater\n偉\n爭\nheart\n##580\noff\n筆\n930\n##bia\n##サ\nreview\n鑑\nstring\n窨\nも\n臺\ninstagram\n緣\ncnn\n393\ndsp\nqt\n蚵\nspring\npython\na9\n傾\n罵\n湊\n胍\nb3\n##khz\nfrance\n661\nturbo\n867\n參\n憐\n兇\n##お\nhuang\n暌\nbella\nhey\nvisa\n暢\n顔\nandrew\n廚\nnick\nlast\nisp\n##public\nmulti\n##п\n襲\n##イト\n##dium\n20mm\n441\nほ\nzip\n##009\nict\n5kg\n諸\n∠\n碚\n瓊\ncnc\n溏\n##ndy\n埕\n膦\nline\nadi\n##ゆ\nsolo\n畢\nfull\njohnson\n##ゃ\nmade\n證\ngrand\nroger\n##px\nhonda\n観\n##ック\n椹\n僕\n練\nsouth\nstephen\njump\n##2015\n##ント\n揸\njonathan\ntalk\nhit\n塗\nnight\nsns\nmain\n699\nsmall\n444\n##ᄃ\nbrand\n##icon\nkai\n##ャ\nxbox\nwin8\npony\n砼\n##ほ\n##エ\ncool\nknow\n賽\nadmin\n琲\njimmy\n載\n釋\n##nity\n##ート\n##ム\nprofile\ntext\nphoto\n隊\n##share\n汙\nhead\n衛\n##015\n##я\nс\nroom\n敗\n##ᅳ\n4d\n┃\nedge\nmacbook\n591\nsmith\nhiv\n髂\n爾\n##ᅧ\ntrue\nemc\nsap\n##sio\n啫\ncheck\ncloud\n616\n##rame\n##ろ\ndigital\nghost\nや\nwma\n709\nお\nblog\n獲\n│\nsurface\n貨\n饒\n##itz\nsms\nmega\nteam\nvery\ncry\n蓮\n##ソ\n楽\n撲\nwd\n##rity\n劇\nkeep\ncpa\n積\n##ˋ\n尋\nわ\n餵\nryan\n豊\n##ᅥ\npack\n徬\nrebecca\n縣\niu\njuly\n##ㄧ\nstrong\n韻\nnokia\n疳\nalexander\ncity\n厭\ntest\nsound\n##rex\n腫\ncentral\nhall\n胝\n##lean\nv5\n莊\nface\nsdk\n〓\n##ᄌ\n權\n雜\n##rdon\n圖\nftp\n隈\nwed\n閱\nlucky\n紙\ncma\n##ュ\nsarah\nfifa\n##shot\n貘\n膽\nnov\nq3\ncount\n豈\n煙\npeople\nband\ncbs\n##ᄀ\n##б\ndrive\nbin\n棄\n詠\n##ᄅ\nr
dquo\nレ\n碓\n輩\n蒐\n539\n074\n斎\n3mm\n慣\n720p\n蔔\ndell\n20cm\ngold\neye\nanti\n錶\n##kins\nhelp\nps4\nprime\n戀\nbbq\n橫\n叢\ngoo\n隸\nx7\n⑶\n議\nelle\n詞\n畑\n##master\n彙\ncolor\nted\n硐\nsaas\n##mmy\nstudio\n990\n颡\n##014\nr8\nstay\n3s\n##sue\n據\nf5\n汆\nテ\n導\n##lton\n814\nstop\n腸\nr2\n##ㄚ\n誌\nえ\n##から\ncamp\n項\nmmorpg\n皺\n360°\n1tb\n緋\n6gb\n##post\n鬱\n籤\nasia\nwing\n##style\n536\necho\n921\n瀟\nshare\nsars\n蕻\n486\n浬\n創\n##covery\nwear\n貪\n喪\nrgb\nvalue\n2l\n##ァ\nqe\n絶\nlim\nオ\nㄋ\ndeep\ntbs\nbing\nagent\n潴\n規\npatrick\n區\nwork\n針\nrss\nタ\n責\n甦\n紹\napt\ntaylor\nラ\npierre\n##ター\nsystem\n庹\n##め\nへ\n悅\n4399\nruby\nsoul\nsale\nroot\nhouse\ntee\nyahoo\n鎚\ng20\n堙\nσ\n##ision\nother\nミ\nvictoria\n攝\nmode\nour\n##470\n★★\n僅\nnorth\nᄋ\nkitty\nvilla\n劃\n##ケ\n気\ncut\nchristian\n##г\n12345\n饑\nfast\n～5\ncanon\nces\n憶\n##ague\nbeyond\nshopping\n嫵\nリ\n呂\n烏\n擴\n寬\ncarlo\nthu\n撚\nlocal\n鄉\nipv6\n787\nrar\n統\n跩\nして\n##して\n##って\nった\n0mm\n828\nd5\nvm\nadd\n襯\nbruce\nian\nwall\n韋\n迺\nv4\ncontrol\ndlc\nnata\ngeneral\n##2016\n剋\nⅴ\n騰\n##tings\nnational\n設\n838\nx4\n##ᄒ\n00㎡\nopec\npng\n齁\n臟\n艋\ndog\n4600\n孫\nkarl\ndark\nxr\n營\nindex\n涠\n鈎\nィ\n##tors\nいて\nling\ncoc\n掃\n詳\n煖\n##º\n敵\n##ネ\n奧\n陸\nwilliams\n##016\ndouble\n嘗\n##xon\n蹟\nems\n賣\nsaint\nfamily\nannie\nsbs\nhtml5\nkic\n納\n鲱\nnull\nview\n200mm\n128gb\ncfa\n##オ\n々\nssd\nせ\nめ\ndragon\nalexa\nmoney\n蠛\n##◆◆\nfive\nqr\ngroup\n歐\nヒ\n##③\npark\nage\npush\nxyz\n##iginal\n溝\nurl\nhdr\nホ\n縱\n嬤\n護\n揚\n濤\n##ヘ\n囟\nbay\nfeel\nfps\n##lux\n##ч\n埼\n險\n譯\n0020\n檯\n##graphy\ncrystal\nfed\njapan\n##rmb\n鴻\n##ミ\n澀\n##atus\n煩\nskin\nunix\nfish\n＄\nedm\n巿\nmetal\n級\nnetflix\n魚\nロ\ndior\n囝\n織\n##firm\n##desk\n費\nimf\nbusiness\n擼\n齒\n補\n﹏\n##ⅲ\n遺\n橼\nmoba\naction\n⒉\nholiday\n鬥\n揮\ntea\n##rail\n899\ntiffany\nfrancis\nstudy\nscience\nanthony\nacg\n濫\n##ょう\nbios\nfinal\nsoft\n龐\n123456\nactive\n袪\nvoice\n橋\n榖\n##ね\n誠\n砲\nenter\nclose\nlost\nprada\npoint\n##ggle\ntop100\n¤\nhpv\n##セ\ntag\n職\n6s\n繃\ne5\nadsl\nа\nnumber\n箇\neia\n輸\nacca\nhuman\n##dragon\nicon\n998\n##jy\n
マ\n麴\n証\n##▲\nspeed\n##した\n⑸\nads\n##2017\njavascript\ncisco\n816\nraid\nウ\ncard\n♂\n##♀\n##ョ\n5c\nstandard\n64gb\n##ハー\n##ㄨ\nchrome\n##trix\n館\n##いる\n##いい\n##れた\n◆◆\nopera\nshadow\nyam\nproduct\nbaidu\nv9\napec\n##ˊ\n897\n殺\nこの\n辻\n攏\n##ノ\n##lmer\nbrown\n∽\n～4\n劑\n##ᅢ\n騎\nキ\n##rence\nsteven\n萊\n淺\njessica\n飲\nuse\n##lax\nsamsung\n##bike\nskip\npizza\n鄰\necu\nmedia\n鎗\nshell\n埤\npowerpoint\n劵\nsign\ntab\n︱\nvmware\n擔\nθ\n##ο\npm2\n##った\njose\nakb48\n凇\ndear\nfield\nм\nspecial\n茼\nclaire\n穩\nsso\n慶\n##2014\nstate\n##ワ\n銅\nelectronic\n##berry\ntokyo\nz2\n鬧\n涙\n麥\n##こと\n##rcle\n⒈\npmi\n18k\nraw\niis\n32g\n•\n##ろう\npath\nした\nいた\n##この\n##ified\nusd\n瀬\n恵\nvon\n憂\noutlook\ncover\nddd\nwish\n鶴\npure\n≡\nshop\njackson\noct\n##λ\np9\nmatch\nrmvb\nshift\nmeta\niot\n##163\nivy\n説\n悶\n##へ\n～10\niphone7\nnano\n⑷\nψ\n擋\n監\n頌\n█\n##340\nvincent\nふ\n━\nalso\n擇\n滅\n韓\n∞\nadvanced\nswitch\nチ\n##ь\n獄\n##stle\nservice\n##nnis\nsep\nメ\neast\n濟\n##△\nwave\n減\nultra\n飄\n鵰\nbeta\nsomething\nmsci\nglobal\n銷\ntable\n##κ\nsimple\napr\nn3\n799\nlondon\nphone\n##ι\n##スト\n##walker\nmind\nsecond\nvision\nbattle\n靚\n##←\nbasic\n##っと\nしい\nん\nsays\nriver\n懸\n簷\npaypal\ndonald\n餚\n牽\nhtm\n##ーム\n桜\n34e\n嚨\ntrip\n爛\nmenu\nbody\n籠\nbeautiful\nsean\n実\naugust\n摳\ngolden\n##ーン\n訝\n⋯⋯\n廢\n##ぬ\n朧\n##イス\n濁\n剝\n彆\n##bug\n##bbs\nresearch\ndac\nchanel\n紛\nfit\n≈\n遲\n鮨\n測\nsina\n頗\nmbc\ndance\nago\n50mm\n孖\n衄\n√\nzoom\nち\n##ょ\n寛\nxperia\n堺\nnbc\n憤\nwatch\n峯\nvpn\nside\n嬰\nв\n綠\nauthor\nsui\n穫\nmama\n碼\nbuy\npink\nnetwork\n##ᅦ\n埵\ncell\nmiui\n##orage\nschool\nˇ\n粿\n紋\nherme\nasr\nspf\n##ᆯ\n︿\n##®\nnfc\n##β\n∕\n勞\n##ⅰ\n##のか\n発\nmarket\nppi\nleft\n潆\nnature\n3kg\njob\n##2f\n醃\n飾\n##れて\n##たい\n↓↓\nkimi\nァ\n100ml\n30ml\npixel\njordan\nfalse\n№\n粧\n##≥\nposted\n階\naug\n趵\nmarch\nearth\n頻\nprivate\n煉\n檢\nbear\n蠍\n曆\nott\n訊\n縛\n覽\n墻\nrussell\n執\n攪\n慮\n##ゅ\n絃\nwelcome\n額\n賦\nboot\n聯\ndaily\nhealth\n##bby\njones\n竇\n伝\n##ッフ\nヽ\nps3\n500ml\n喬\n##イン\n緒\n哋\n擾\norder\n毎\n╮\n廁\n歩\nbeauty\nexpress\nsilver\nᄀ\nsecret\nscript\n萩\n
鄭\n魯\n遞\nsingle\n偽\ncio\nセ\nsoftware\nitem\n##ォ\nmetro\nそ\n##▽\n穎\n##price\n獻\ntravel\nprice\nfpga\nф\n嵐\n捩\nノ\n癱\nssh\n様\nmcu\n撿\n燿\n箏\n2345\ncapital\n浯\nyear\nicp\n礙\nball\n##abc\nvogue\n價\n##rss\n闊\nllc\n##fork\nヘ\n痙\n攣\n24h\nkong\nsave\n##mporary\n侖\nゆ\n鎮\nbim\n賓\nroad\nmomo\n蓆\nburberry\n穢\nsiri\n島\nsport\nfocus\n丿\n##ㄋ\n##とう\nт\nhost\n墮\ngmail\nevent\n##ification\n886\n繪\n帯\nself\n躍\n##sday\ntvbs\n##zzi\n晩\n##ᄆ\ndean\nkpi\natom\nbefore\n財\n囍\n暐\naspx\npublic\nzara\n撃\nfriends\nroyal\n誇\nbigbang\narpg\nsupport\n纍\nprogram\n訓\n組\n咘\nuniversal\n嘩\n弔\n蕭\n貫\n狹\n##stack\najax\ne7\n葯\n##ション\n┌\nserver\ncalifornia\nprocess\n##んな\n##また\n湯\nworks\ngmat\n誤\nbag\n譁\n闖\n闍\np10\n##rtex\ncontinue\n##∕\n閣\nネ\nsunday\n##リア\nunity\n##④\n勻\naudio\ngpa\n惱\n##017\n戸\n輝\n曖\nnikon\nclick\n賊\n灘\ncode\nsource\n範\n樫\ntoday\n╰\ncoffee\nplace\npuma\ncookie\nreading\n磡\n対\nfantasy\n舾\n##ラン\n##ivity\nから\ntour\nvera\ngit\n預\nchildren\n鳴\n╯\nnatural\n値\n蔣\nsite\n崁\nkuso\ncreative\n奪\nsocial\nhill\ncanada\nnexus\ndram\n筲\nwindow\n槓\nchannel\npm1\n16g\n梶\nhadoop\n賁\n誦\n盃\n墊\n埸\nᄆ\nradio\neyes\n852\nsanta\ndate\n##shine\nedit\n紘\nno1\ncto\nwrite\nx9\nえて\nえる\nめる\n##かり\nmonte\nrunning\norange\n醜\n##｀\n##のは\nford\nя\nflickr\ndit\n寢\n縴\n勸\nhoward\n鳥\nfollow\n砗\nenglish\n蝟\nк\ntpp\n舺\n廟\n##いた\nける\n淪\n##む\nisland\n鹽\n866\n##│\n##ース\n賢\norz\njcb\nuser\nenergy\nˊ\nwiki\nisis\nthunder\n傭\nmodern\nfriend\n崑\nguest\n咗\nfood\n##∼\ngear\n50ml\n壺\nmount\n櫃\n##ある\nlake\nsophie\nstep3\n搶\n潟\n##する\n▍\n垵\nstone\n療\nomega\nleave\n喚\n鏽\n審\npwm\ntechnology\n##lvin\n暫\n30cm\n剣\n##з\n≧\nsports\n澆\n濺\nssl\n贊\n##カー\ncdn\nぃ\nlte\n##cake\nutc\n##ヤ\n##ß\n塵\nelse\n憑\nー\n篠\n匯\nkit\n〜\naws\nyumi\nㄆ\namoled\n##ᄇ\ngap\n貓\nд\n##ㄢ\nwii\nips\n嘅\n疊\n##ちの\n##かな\ntown\nunited\n峇\nᄉ\n攤\n##◎\nする\nswift\n##ました\ninstitute\nthread\nㄌ\nemily\n徑\ntiger\n俠\n滝\n漿\nよ\n圏\n攞\n齋\n綵\nebd\nobject\nˋ\n氡\n弾\n埗\nfpx\nfintech\napril\n蠻\n顛\niphone6\n##δ\n##ρ\nmemory\n円\n搵\nbank\n闔\nkindle\n冊\n##れは\n榮\n経\nwps\n鏤\n⒊\n罰\n鋼\n災\ncountry\nл\n1389\nredis\n╭\n
紐\n懼\nれる\n窮\n黨\nsogo\nusb3\n窺\n槻\n綺\ngirls\nemma\n辭\nspark\n飆\nmorning\n##τ\niptv\napk\nlearning\nmachine\ncream\napache\n関\nconnect\n嚮\n揹\nvirtual\n廂\n驀\n##unge\ntarget\n甕\n##ㄥ\n餓\nspan\nlenovo\n燻\nユ\ntaiwan\nᄂ\nても\n12306\n鉄\n巔\n##ⅳ\n贏\n##÷\nmiller\n##↗\nparker\ngoods\n##ように\nれた\n##まて\n##²\n潛\nη\ndownload\nplaystation\nstory\n礡\ngithub\nskype\n▋\n竄\n紗\n鍵\n燉\nngo\n##ᄑ\n啞\n嘢\ntower\nwin10\npowered\nｃｐ\n霊\nfresh\nr9\nstreet\n聰\n貞\n児\ncoach\nbuild\nfuture\n変\n羣\n##dical\nmelody\nswatch\n樸\n瑯\ndelete\nevolution\n勳\n敎\nてある\npacific\n殼\n徴\n鍾\n臘\n攬\n##くたさい\ntrump\nboard\n勵\nchange\n滲\n簽\n瑪\n##х\nwilson\nニ\nimax\n##なと\nmountain\n##きた\nnode\n歛\nsquare\n賀\nォ\nfriday\n噓\ninto\n##⑥\nfashion\n嚀\n録\njune\nbridge\n爐\n鏈\n訪\n##ける\neds\n獵\ncollege\ncommon\ntree\n懶\n墘\n辯\n轟\n霧\n脈\n銬\nnand\n犧\n閑\naudi\n郷\n売\ndiscovery\nerror\ntoyota\nliving\n##ㄞ\nwant\ncopy\n曠\n懲\ncontent\nρ\n齡\nbird\n檔\n覧\nsupreme\nツ\n##ᄂ\n灑\n樁\n騙\nн\n##ヨ\n##baru\nq4\n˙\n##ς\n壇\n棲\n##あなたに\nq10\ncreate\n簦\n鐲\n驕\n闢\n##その\nまて\n##なく\n込\n蘿\nκ\n賜\n諷\ncoupe\n簫\n遷\n##ても\njquery\n▌\ncms\nモ\nswiss\n輛\n##blog\nᄒ\n潰\n擱\nupdate\n14k\ncontact\n郵\n澤\nchicago\n##ータ\ncamera\nios11\ngmt\n擬\n糞\n##ᅪ\nzone\n評\nз\n侷\n##〓\n##ルフ\n##ц\nvive\nsync\nrate\n∥\n閨\n稟\n癲\n##ーション\n氷\n豎\n##ンス\nе\n##ш\n違\n遙\nてす\nル\n逕\n燭\nр\nviews\ncst\nstation\nf16\n糾\n巻\n禍\narmani\nナ\nvim\n獅\n氹\n##ø\nケ\nsystems\n##∩\n冏\n鰍\ncostco\n薩\n兲\n悪\nnovember\nagain\nvsc\nmozilla\n渉\n団\ngivenchy\nrelease\nㄏ\n##ㄛ\n◢\nヶ\n##ᄎ\n蒼\nthai\n轎\nι\n彫\nunion\nfrm\n湳\nи\nclassic\n広\n##♂\nassociation\n##なら\n貧\nkanye\nχ\n##ν\n##nome\n賭\nlanguage\ninside\n##サー\nㄒ\ncompany\n壽\n図\n丼\n灝\nsense\n糬\n##╰\nг\n鏢\n穂\n蹤\n僱\nvans\n彌\n頑\nп\nワ\n##りの\n膠\n絵\n滯\n##ы\nreturn\nrequest\n綸\n##∥\nmems\nbooking\n##ㄏ\n喰\n##ᅲ\nstep1\n##ーシ\n螢\n##よう\n搗\n##cms\n##rix\nairbnb\nculture\nまた\nphilips\n協\n##ˇ\n##↘\n┊\n鑼\n続\n##かる\ndays\ntrc\n##π\n靂\nwomen\ncafe\nyoga\n嬪\n寵\n筍\n##ㄤ\n##ㄉ\n凪\nsearch\ncheese\nとして\n閔\n##∠\n邁\n駛\ndavis\n洸\n績\nddos\n綻\n##んて\n喺\ndisplay\n籬\n摯\n晝\n##χ\n潑\n蒟\n絳\nmuji\n鈕\nbeach\n倉\nы\ndcs\n##ちゃん\nseason
\n驅\n訂\n糧\ndomain\nglass\nlumia\nfeed\n撈\n⒋\n貶\n##ə\n侶\nios7\n牴\n帳\nソ\npress\npinterest\nrights\n弒\n艱\n挾\nlimited\nptt\narticle\ncambridge\n##≈\nleonardo\n覓\n虧\n毀\n浄\n妘\n駕\n嬸\n窩\neba\nㄨ\n##いて\n済\n鈴\n熾\nっ\nunicode\neducation\n##ます\nwordpress\nfinancial\nvolvo\nadidas\nhistory\n轅\n籲\n単\n賴\n榱\n##θ\n転\nfiles\n##なり\nstep2\ngarden\n™\n棟\n噠\n俬\ngaga\n##ж\n総\ntitle\n##∞\ncss3\nぁ\n販\n嘰\n欄\nurban\n##™\n築\nicloud\n##ュー\nmarketing\n##admin\nmkv\n≦\n##ᄐ\n##ᄏ\ndevelopment\nvps\n敘\n鉅\n##γ\noctober\nо\nかある\n®\n抜\npepper\n##ては\n脅\n満\n錦\n筊\n##イフ\nbean\nο\nすく\n唄\n傘\nboost\nexchange\n##ᅬ\n##ㄇ\n0755\n−\n##ᅴ\n竊\nㄥ\n##˚\nyork\n衞\narchive\nedition\nfilm\nmuseum\n嶋\nofficial\ninformation\naward\nseries\ngundam\nstage\nsociety\ngames\nentertainment\nversion\nmovie\ncollection\nforce\nreport\nlibrary\nguide\nbooks\noxford\ncentre\nchurch\nmagazine\nfestival\nstates\nawards\ndatabase\nplanet\namerica\njunior\npublishing\nfoundation\n歴\nstore\n脇\n毘\narts\nkorea\nmanagement\nfunction\nplayer\n専\ncounty\nyears\napplication\njanuary\nfebruary\ncorporation\n県\neditor\nweek\nwashington\n亜\nforest\nsingapore\n##ctionary\nseptember\ndecember\n姉\nsecurity\nfirefox\nkids\nmaps\npublished\nsurvey\nmhz\nphotos\n蔵\nservices\ncategory\nstructure\npiece\n増\ngnu\nwong\nresult\n収\n栃\navenue\ntaipei\nlamigo\nplaza\npremium\ncook\nmalaysia\nmember\ngallery\n舎\nfactory\n稲\nedited\nwalker\nairport\nx86\nvalley\nnotes\nsmap\nvillage\nchicken\npolicy\nimages\ntwice\ncourse\nwikipedia\narchives\npopulation\nlinks\ndisney\nframework\n塭\nexplorer\nmaker\nkde\nmessage\n読\nsuite\n##gence\neclipse\n闘\ndistribution\nfung\n髪\ncampaign\n遶\n敍\n仮\n駅\ndetails\nmanager\nncc\n隣\nbelle\nsession\narticles\n菓\n驒\nexhibition\n齢\n険\noil\n乗\nazure\ninstall\neconomy\nubuntu\ncomment\n渋\n塩\nfeb\n応\naddress\n栄\n権\nhsu\npopular\nwine\nnetscape\nrelated\n頼\n帰\n験\nnownews\n効\n薬\n雑\nkitchen\nreader\n壆\nsafari\nsbl\n壊\n縄\nlulu\ndiary\n莿\n検\nteddy\nresort\n覇\n両\n荖\nhair\ncbc\nqualcomm\ndocomo\n覚\n##ernel\nettoday\n倶\nscp\n掲\n冨\nbalance\n駄\nmobi
l\nmlb\nalphago\npmid\n訳\nprevious\n坵\nrestaurant\n処\nwonderland\nwedding\n扱\npanasonic\nhmv\n継\n庁\ncredit\nmessenger\nclient\n聴\ncurve\n従\nmonday\n働\n譲\n荘\nhotels\n廃\nscrum\nbutler\ncopyright\n縁\n嫲\nchocolate\nweeks\n焼\n岀\n挙\ncache\nsalvatore\n労\necfa\n傚\nshtml\n麺\ngaming\nasus\nfedora\nr11\n厳\nmonths\nhbl\n仏\nurn\nlinkedin\njson\n舗\noops\nzenfone\nfaq\nbytes\nxz\ntrivago\nc919\n嚟\ncake\nwam\nposts\ntesla\n彅\ned2k\nnego\nmwc\nbanner\nreuters\nh7n9\nbrandt\nzol\nevernote\njobs\nmsg\nrosie\nsylvia\nbrake\npchome\n駆\ngogoro\nrmk\nloading\ncomments\nspotify\n蒻\nie6\n圧\nikea\nrewards\n囲\nnavigation\n0800\n悩\nuniqlo\nwhatsapp\n軽\ndollars\nudn\nwali\n脳\ncookies\nlifestyle\narduino\npokemon\noutlet\ngartner\n択\nrainer\ndji\nwade\n窓\ncurry\nferragamo\nplurk\ntcs\n繋\noriginals\nbloomberg\ndhl\n磲\ncf1\nsgs\nmango\nlohas\ntinker\n##pods\nvsa\nprotected\nanalytics\n騨\n歯\nprivacy\n臓\nblogspot\n拡\ntila\n払\nfc2\npptv\nmarriott\noculus\npixnet\n価\n銭\nhitachi\n瞓\nntd\nwikia\nnamespace\nagoda\npositioning\nsohu\n##bscribe\nhola\nmastercard\n9985\nblogger\nexpedia\nwestbrook\nbbe\n畳\ndyson\nrcep\n粄\nweibo\ntraction\n##onsored\nfandom\nday1\nstarbucks\nwwdc\nbuffet\n剉\ntags\n2765\nebooks\n##skip\ntweet\ne88\nhostel\n8591\n4066\nmotel\n1mdb\nprefix\nlabels\n椪\n勧\ninsee\n##vertisement\n##tags\nday3\nu11\n戻\n##t00\nepson\ntripadvisor\ntmt\nfishbase\nbe2\nvdc\ngopro\n歓\nreply\nhktvmall\ngr2\n##facebook\npinkoi\nmoneydj\ngoris\nrends\ntomtom\nnginx\nfsm\n9595\n6221\nios10\n堿\nrakuten\nxuite\nadsense\nday4\n00z\nconnectivity\nworldcat\n##copyright\nhuawei\nreserved\ntvg\ndropbox\n9678\nlongchamp\n50cm\namana\n12345678910\nlofter\n5757\n6606\n遅\nvfm\n17life\n<eos>\n<unk>\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/module.py",
    "content": "import math\nimport os\n\nimport paddlehub as hub\nfrom .processor import load_vocab\nfrom .processor import postprocess\nfrom .processor import preprocess\nfrom paddlehub.compat.task import tokenization\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import serving\n\n\n@moduleinfo(name=\"porn_detection_lstm\",\n            version=\"1.2.0\",\n            summary=\"Baidu's open-source Porn Detection Model.\",\n            author=\"baidu-nlp\",\n            author_email=\"\",\n            type=\"nlp/sentiment_analysis\")\nclass PornDetectionLSTM(hub.NLPPredictionModule):\n\n    def __init__(self):\n        \"\"\"\n        initialize with the necessary elements\n        \"\"\"\n        self.pretrained_model_path = os.path.join(self.directory, \"infer_model\")\n        self.tokenizer_vocab_path = os.path.join(self.directory, \"assets\", \"vocab.txt\")\n        self.vocab_path = os.path.join(self.directory, \"assets\", \"word_dict.txt\")\n        self.vocab = load_vocab(self.vocab_path)\n        self.sequence_max_len = 256\n        self.tokenizer = tokenization.FullTokenizer(self.tokenizer_vocab_path)\n\n        self.param_file = os.path.join(self.directory, \"assets\", \"params.txt\")\n\n        self.predict = self.detection\n\n        self._set_config()\n\n    @serving\n    def detection(self, texts=[], data={}, use_gpu=False, batch_size=1):\n        \"\"\"\n        Get the porn prediction results results with the texts as input\n\n        Args:\n             texts(list): the input texts to be predicted, if texts not data\n             data(dict): key must be 'text', value is the texts to be predicted, if data not texts\n             use_gpu(bool): whether use gpu to predict or not\n             batch_size(int): the program deals once with one batch\n\n        Returns:\n             results(list): the porn prediction results\n        \"\"\"\n        try:\n            _places = os.environ[\"CUDA_VISIBLE_DEVICES\"]\n       
     int(_places[0])\n        except:\n            use_gpu = False\n\n        if texts != [] and isinstance(texts, list) and data == {}:\n            predicted_data = texts\n        elif texts == [] and isinstance(data, dict) and isinstance(data.get('text', None), list) and data['text']:\n            predicted_data = data[\"text\"]\n        else:\n            raise ValueError(\"The input data is inconsistent with expectations.\")\n\n        predicted_data = self.to_unicode(predicted_data)\n        start_idx = 0\n        iteration = int(math.ceil(len(predicted_data) / batch_size))\n        results = []\n        for i in range(iteration):\n            if i < (iteration - 1):\n                batch_data = predicted_data[start_idx:(start_idx + batch_size)]\n            else:\n                batch_data = predicted_data[start_idx:]\n\n            start_idx = start_idx + batch_size\n            processed_results = preprocess(batch_data, self.tokenizer, self.vocab, self.sequence_max_len)\n            tensor_words = self.texts2tensor(processed_results)\n\n            if use_gpu:\n                batch_out = self.gpu_predictor.run([tensor_words])\n            else:\n                batch_out = self.cpu_predictor.run([tensor_words])\n            batch_result = postprocess(batch_out[0], processed_results)\n            results += batch_result\n        return results\n\n    def get_labels(self):\n        \"\"\"\n        Get the labels which was used when pretraining\n        Returns:\n             self.labels(dict)\n        \"\"\"\n        self.labels = {\"porn\": 1, \"not_porn\": 0}\n        return self.labels\n"
  },
  {
    "path": "modules/text/text_review/porn_detection_lstm/processor.py",
    "content": "import io\n\nimport numpy as np\n\n\ndef load_vocab(file_path):\n    \"\"\"\n    load the given vocabulary\n    \"\"\"\n    vocab = {}\n    with io.open(file_path, 'r', encoding='utf8') as f:\n        for i, line in enumerate(f):\n            vocab[line.rstrip()] = int(i)\n    return vocab\n\n\ndef get_predict_label(pos_prob):\n    \"\"\"\n    Convert the prediction probabilities to label\n    \"\"\"\n    # threshold should be (1, 0.5)\n    neu_threshold = 0.5\n    if pos_prob >= neu_threshold:\n        label, key = 1, \"porn\"\n    else:\n        label, key = 0, \"not_porn\"\n    return label, key\n\n\ndef preprocess(predicted_data, tokenizer, vocab, sequence_max_len=256):\n    \"\"\"\n    Convert the word str to word id and pad the text\n    \"\"\"\n    result = []\n    padding = vocab['<PAD>']\n    unknown = vocab['<UNK>']\n    for index, text in enumerate(predicted_data):\n        data_arr = tokenizer.tokenize(''.join(text.split()))\n        wids = [vocab.get(w, unknown) for w in data_arr[:sequence_max_len]]\n        if len(wids) < sequence_max_len:\n            wids = wids + [padding] * (sequence_max_len - len(wids))\n\n        result_i = {'processed': []}\n        result_i['origin'] = predicted_data[index]\n        result_i['processed'] += wids\n        result.append(result_i)\n    return result\n\n\ndef postprocess(predict_out, texts):\n    \"\"\"\n    Convert model's output tensor to pornography label\n    \"\"\"\n    result = []\n    predict_out = predict_out.as_ndarray()\n    for index in range(len(texts)):\n        result_i = {}\n        result_i['text'] = texts[index]['origin']\n        label = int(np.argmax(predict_out[index]))\n        if label == 0:\n            key = 'not_porn'\n        else:\n            key = 'porn'\n        result_i['porn_detection_label'] = label\n        result_i['porn_detection_key'] = key\n        result_i['porn_probs'] = float('%.4f' % predict_out[index, 1])\n        result_i['not_porn_probs'] = float('%.4f' 
% (predict_out[index, 0]))\n        result.append(result_i)\n    return result\n"
  },
  {
    "path": "modules/text/text_to_knowledge/nptag/README.md",
    "content": "# NPTag\n\n|模型名称|NPTag|\n| :--- | :---: | \n|类别|文本-文本知识关联|\n|网络|ERNIE-CTM|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|378MB|\n|最新更新日期|2021-12-10|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - NPTag（名词短语标注工具）是首个能够覆盖所有中文名词性词汇及短语的细粒度知识标注工具，旨在解决NLP中，名词性短语收录不足，导致的OOV（out-of-vocabulary，超出收录词表）问题。可直接应用构造知识特征，辅助NLP任务\n\n  - NPTag特点\n\n    - 包含2000+细粒度类别，覆盖所有中文名词性短语的词类体系，更丰富的知识标注结果\n        - NPTag试用的词类体系未覆盖所有中文名词性短语的词类体系，对所有类目做了更细类目的识别（如注射剂、鱼类、博物馆等），共包含2000+细粒度类别，且可以直接关联百科知识树。\n    - 可自由定制的分类框架\n        - NPTag开源版标注使用的词类体系是我们在实践中对**百科词条**分类应用较好的一个版本，用户可以自由定制自己的词类体系和训练样本，构建自己的NPTag，以获得更好的适配效果。例如，可按照自定义的类别构造训练样本，使用小学习率、短训练周期微调NPTag模型，即可获得自己定制的NPTag工具。\n\n  - 模型结构\n    - NPTag使用ERNIE-CTM+prompt训练而成，使用启发式搜索解码，保证分类结果都在标签体系之内。\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n  \n  - paddlenlp >= 2.2.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install nptag\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run nptag --input_text=\"糖醋排骨\"\n    ```\n  - 通过命令行方式实现NPTag模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # Load NPTag\n    module = hub.Module(name=\"nptag\")\n\n    # String input\n    results = module.predict(\"糖醋排骨\")\n    print(results)\n    # [{'text': '糖醋排骨', 'label': '菜品', 'category': '饮食类_菜品'}]\n\n    # List input\n    results = module.predict([\"糖醋排骨\", \"红曲霉菌\"])\n    print(results)\n    # [{'text': '糖醋排骨', 'label': '菜品', 'category': '饮食类_菜品'}, {'text': '红曲霉菌', 'label': '微生物', 'category': '生物类_微生物'}]\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def __init__(\n      batch_size=32,\n      
max_seq_length=128,\n      linking=True,\n    )\n    ```\n\n    - **参数**\n\n      - batch_size(int): 每个预测批次的样本数目，默认为32。\n      - max_seq_length(int): 最大句子长度，默认为128。\n      - linking(bool): 实现与WordTag类别标签的linking，默认为True。\n\n  - ```python\n    def predict(texts)\n    ```\n    - 预测接口，输入文本，输出名词短语标注结果。\n\n    - **参数**\n\n      - texts(str or list\\[str\\]): 待预测数据。\n\n    - **返回**\n\n      - results(list\\[dict\\]): 输出结果。每个元素都是dict类型，包含以下信息：  \n     \n            {\n                'text': str, 原始文本。\n                'label': str，预测结果。\n                'category'：str，对应的WordTag类别标签。\n            }\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线中文名词短语标注服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m nptag\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果。\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据(input string)\n    text = [\"糖醋排骨\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n    \n    # 指定预测方法为NPTag并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/nptag\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n\n    # 待预测数据(input list)\n    text = [\"糖醋排骨\", \"红曲霉菌\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install nptag==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/text_to_knowledge/nptag/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_to_knowledge/nptag/module.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport argparse\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import serving, moduleinfo, runnable\nfrom paddlenlp import Taskflow\n\n\n@moduleinfo(\n    name=\"nptag\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"Baidu\",\n    author_email=\"\",\n    type=\"nlp/text_to_knowledge\",\n    meta=hub.NLPPredictionModule)\nclass NPTag(paddle.nn.Layer):\n    def __init__(self, \n                 batch_size=32, \n                 max_seq_length=128,\n                 linking=True,\n                 ):\n        self.nptag = Taskflow(\"knowledge_mining\", model=\"nptag\", batch_size=batch_size, max_seq_length=max_seq_length, linking=linking)\n\n    @serving\n    def predict(self, texts):\n        \"\"\"\n        The prediction interface for nptag.\n\n        Args:\n            texts(str or list[str]): the input texts to be predict.\n\n        Returns:\n            results(list[dict]): inference results. The element is a dictionary consists of:\n                {\n                    'text': str, the input texts.\n                    'head': list[dict], tagging results, the element is a dictionary consists of:\n                        {\n                            'item': str, segmented word.\n                            'offset': int, the offset compared with the first character.\n                            'nptag_label':str, Part-Of-Speech label.\n                            'length': int, word length.\n                            'termid': str, link result with encyclopedia knowledge tree.\n                        }\n                }\n        \"\"\"\n        return self.nptag(texts)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' 
% self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        input_data = self.check_input_data(args)\n\n        results = self.predict(texts=input_data)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_to_knowledge/nptag/requirements.txt",
    "content": "paddlenlp>=2.2.0\n"
  },
  {
    "path": "modules/text/text_to_knowledge/wordtag/README.md",
    "content": "# WordTag\n\n|模型名称|WordTag|\n| :--- | :---: | \n|类别|文本-文本知识关联|\n|网络|ERNIE-CTM+CRF|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|549MB|\n|最新更新日期|2021-10-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - WordTag（中文词类知识标注工具）是首个能够覆盖所有中文词汇的词类知识标注工具，旨在为中文文本解析提供全面、丰富的知识标注结果，可以应用于模板（挖掘模板、解析模板）生成与匹配、知识挖掘(新词发现、关系挖掘)等自然语言处理任务中，提升文本解析与挖掘精度；也可以作为中文文本特征生成器，为各类机器学习模型提供文本特征。\n\n  <p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/40840292/137913774-186f72e9-c51b-469e-8356-b72bafc4d926.png\" hspace='10'/> <br />\n  </p>\n\n  - WordTag特点\n\n    - 覆盖所有中文词汇的词类体系，更丰富的知识标注结果\n      - WordTag使用的词类体系为覆盖所有中文词汇的词类体系，包括各类实体词与非实体词（如概念、实体/专名、语法词等）。WordTag开源版对部分类目（如组织机构等），做了更细类目的划分识别（如，医疗卫生机构、体育组织机构），对仅使用文本信息难以细分的类目（如人物类、作品类、品牌名等），不做更细粒度的词类识别。用户需要细粒度的词类识别时，可利用百科知识树的类别体系自行定制。\n    \n    - 整合百科知识树链接结果，获得更丰富的标注知识\n      - 如上图示例所示，各个切分标注结果中，除词类标注外，还整合了百科知识树的链接结果，用户可以结合百科知识树数据共同使用：如，利用百科知识树中的subtype获得更细的上位粒度，利用term的百科信息获得更加丰富的知识等。\n\n    - 可定制的词类序列标注框架\n      - WordTag开源版标注使用的词类体系是我们在实践中对百科文本解析应用较好的一个版本，不同类型文本（如，搜索query、新闻资讯）的词类分布不同，用户可以利用百科知识树定制自己的词类体系和训练样本，构建自己的WordTag应用版，以获得更好的适配效果。例如，可将自定义的词表按照百科知识树的字段定义好，挂接/整合到百科知识树上，即可使用自己的Term数据定制标注样本和标注任务。\n\n  - 模型结构\n    - 模型使用ERNIE-CTM+CRF训练而成，预测时使用viterbi解码，模型结构如下：\n\n  <p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/40840292/137915351-0ef2609c-1aab-4c7e-8634-8726f8ddb30d.png\" hspace='10'/> <br />\n  </p>\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 2.1.0\n  \n  - paddlenlp >= 2.1.0\n\n  - paddlehub >= 2.1.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install wordtag\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    $ hub run wordtag 
--input_text=\"《孤女》是2010年九州出版社出版的小说，作者是余兼羽。\"\n    ```\n  - 通过命令行方式实现WordTag模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    # Load WordTag\n    module = hub.Module(name=\"wordtag\")\n\n    # String input\n    results = module.predict(\"《孤女》是2010年九州出版社出版的小说，作者是余兼羽。\")\n    print(results)\n    # [{'text': '《孤女》是2010年九州出版社出版的小说，作者是余兼羽', 'items': [{'item': '《', 'offset': 0, 'wordtag_label': 'w', 'length': 1}, {'item': '孤女', 'offset': 1, 'wordtag_label': '作品类_实体', 'length': 2}, {'item': '》', 'offset': 3, 'wordtag_label': 'w', 'length': 1}, {'item': '是', 'offset': 4, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_是'}, {'item': '2010年', 'offset': 5, 'wordtag_label': '时间类', 'length': 5, 'termid': '时间阶段_cb_2010年'}, {'item': '九州出版社', 'offset': 10, 'wordtag_label': '组织机构类', 'length': 5, 'termid': '组织机构_eb_九州出版社'}, {'item': '出版', 'offset': 15, 'wordtag_label': '场景事件', 'length': 2, 'termid': '场景事件_cb_出版'}, {'item': '的', 'offset': 17, 'wordtag_label': '助词', 'length': 1, 'termid': '助词_cb_的'}, {'item': '小说', 'offset': 18, 'wordtag_label': '作品类_概念', 'length': 2, 'termid': '小说_cb_小说'}, {'item': '，', 'offset': 20, 'wordtag_label': 'w', 'length': 1}, {'item': '作者', 'offset': 21, 'wordtag_label': '人物类_概念', 'length': 2, 'termid': '人物_cb_作者'}, {'item': '是', 'offset': 23, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_是'}, {'item': '余兼羽', 'offset': 24, 'wordtag_label': '人物类_实体', 'length': 3}]}]\n\n    # List input\n    results = module.predict([\"热梅茶是一道以梅子为主要原料制作的茶饮\", \"《孤女》是2010年九州出版社出版的小说，作者是余兼羽\"])\n    print(results)\n    # [{'text': '热梅茶是一道以梅子为主要原料制作的茶饮', 'items': [{'item': '热梅茶', 'offset': 0, 'wordtag_label': '饮食类_饮品', 'length': 3}, {'item': '是', 'offset': 3, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_是'}, {'item': '一道', 'offset': 4, 'wordtag_label': '数量词', 'length': 2}, {'item': '以', 'offset': 6, 'wordtag_label': '介词', 'length': 1, 'termid': 
'介词_cb_以'}, {'item': '梅子', 'offset': 7, 'wordtag_label': '饮食类', 'length': 2, 'termid': '饮食_cb_梅'}, {'item': '为', 'offset': 9, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_为'}, {'item': '主要原料', 'offset': 10, 'wordtag_label': '物体类', 'length': 4, 'termid': '物品_cb_主要原料'}, {'item': '制作', 'offset': 14, 'wordtag_label': '场景事件', 'length': 2, 'termid': '场景事件_cb_制作'}, {'item': '的', 'offset': 16, 'wordtag_label': '助词', 'length': 1, 'termid': '助词_cb_的'}, {'item': '茶饮', 'offset': 17, 'wordtag_label': '饮食类_饮品', 'length': 2, 'termid': '饮品_cb_茶饮'}]}, {'text': '《孤女》是2010年九州出版社出版的小说，作者是余兼羽', 'items': [{'item': '《', 'offset': 0, 'wordtag_label': 'w', 'length': 1}, {'item': '孤女', 'offset': 1, 'wordtag_label': '作品类_实体', 'length': 2}, {'item': '》', 'offset': 3, 'wordtag_label': 'w', 'length': 1}, {'item': '是', 'offset': 4, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_是'}, {'item': '2010年', 'offset': 5, 'wordtag_label': '时间类', 'length': 5, 'termid': '时间阶段_cb_2010年'}, {'item': '九州出版社', 'offset': 10, 'wordtag_label': '组织机构类', 'length': 5, 'termid': '组织机构_eb_九州出版社'}, {'item': '出版', 'offset': 15, 'wordtag_label': '场景事件', 'length': 2, 'termid': '场景事件_cb_出版'}, {'item': '的', 'offset': 17, 'wordtag_label': '助词', 'length': 1, 'termid': '助词_cb_的'}, {'item': '小说', 'offset': 18, 'wordtag_label': '作品类_概念', 'length': 2, 'termid': '小说_cb_小说'}, {'item': '，', 'offset': 20, 'wordtag_label': 'w', 'length': 1}, {'item': '作者', 'offset': 21, 'wordtag_label': '人物类_概念', 'length': 2, 'termid': '人物_cb_作者'}, {'item': '是', 'offset': 23, 'wordtag_label': '肯定词', 'length': 1, 'termid': '肯定否定词_cb_是'}, {'item': '余兼羽', 'offset': 24, 'wordtag_label': '人物类_实体', 'length': 3}]}]\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def __init__(\n      batch_size=32,\n      max_seq_length=128,\n      linking=True,\n    )\n    ```\n\n    - **参数**\n\n      - batch_size(int): 每个预测批次的样本数目，默认为32。\n      - max_seq_length(int): 最大句子长度，默认为128。\n      - linking(bool): 是否返回百科知识树的链接结果，默认为True。\n\n  - ```python\n    
def predict(texts)\n    ```\n    - 预测接口，输入文本，输出词类标注结果以及百科知识树的链接结果。\n\n    - **参数**\n\n      - texts(str or list\\[str\\]): 待预测数据。\n\n    - **返回**\n\n      - results(list\\[dict\\]): 输出结果。每个元素都是dict类型，包含以下信息：  \n     \n            {\n                'text': str, 原始文本。\n                'items': list\\[dict\\], 标注结果，包含以下信息：\n                  {\n                    'item': str, 分词结果。\n                    'offset': int, 与输入文本首个字的偏移值。\n                    'wordtag_label': str, 词类知识标注结果。\n                    'length': int, 词汇长度。\n                    'termid': str, 与百科知识树的链接结果。\n                  }\n            }\n\n## 四、服务部署\n\n- PaddleHub Serving可以部署一个在线中文词类知识标注服务，可以将此接口用于在线web应用。\n\n- ## 第一步：启动PaddleHub Serving\n\n  - 运行启动命令：\n    ```shell\n    $ hub serving start -m wordtag\n    ```\n\n  - 这样就完成了服务化API的部署，默认端口号为8866。\n\n  - **NOTE:** 如使用GPU预测，则需要在启动服务之前设置CUDA\\_VISIBLE\\_DEVICES环境变量，否则不用设置。\n\n- ## 第二步：发送预测请求\n\n  - 配置好服务端后，以下数行代码即可实现发送预测请求，获取预测结果。\n\n    ```python\n    import requests\n    import json\n\n    # 待预测数据(input string)\n    text = [\"《孤女》是2010年九州出版社出版的小说，作者是余兼羽。\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n    \n    # 指定预测方法为WordTag并发送post请求，content-type类型应指定json方式\n    url = \"http://127.0.0.1:8866/predict/wordtag\"\n    headers = {\"Content-Type\": \"application/json\"}\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n\n    # 待预测数据(input list)\n    text = [\"热梅茶是一道以梅子为主要原料制作的茶饮\", \"《孤女》是2010年九州出版社出版的小说，作者是余兼羽\"]\n\n    # 设置运行配置\n    data = {\"texts\": text}\n\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n    print(r.json())\n    ```\n\n  - 关于PaddleHub Serving更多信息参考：[服务部署](../../../../docs/docs_ch/tutorial/serving.md)\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n\n  - ```shell\n    $ hub install wordtag==1.0.0\n    ```\n"
  },
  {
    "path": "modules/text/text_to_knowledge/wordtag/__init__.py",
    "content": ""
  },
  {
    "path": "modules/text/text_to_knowledge/wordtag/module.py",
    "content": "# -*- coding:utf-8 -*-\nimport os\nimport argparse\n\nimport paddle\nimport paddlehub as hub\nfrom paddlehub.module.module import serving, moduleinfo, runnable\nfrom paddlenlp import Taskflow\n\n\n@moduleinfo(\n    name=\"wordtag\",\n    version=\"1.0.0\",\n    summary=\"\",\n    author=\"baidu-nlp\",\n    author_email=\"\",\n    type=\"nlp/text_to_knowledge\",\n    meta=hub.NLPPredictionModule)\nclass WordTag(paddle.nn.Layer):\n    def __init__(self, \n                 batch_size=32, \n                 max_seq_length=128,\n                 linking=True,\n                 ):\n        self.wordtag = Taskflow(\"knowledge_mining\", batch_size=batch_size, max_seq_length=max_seq_length, linking=linking)\n\n    @serving\n    def predict(self, texts):\n        \"\"\"\n        The prediction interface for wordtag.\n\n        Args:\n            texts(str or list[str]): the input texts to be predict.\n\n        Returns:\n            results(list[dict]): inference results. The element is a dictionary consists of:\n                {\n                    'text': str, the input texts.\n                    'head': list[dict], tagging results, the element is a dictionary consists of:\n                        {\n                            'item': str, segmented word.\n                            'offset': int, the offset compared with the first character.\n                            'wordtag_label':str, Part-Of-Speech label.\n                            'length': int, word length.\n                            'termid': str, link result with encyclopedia knowledge tree.\n                        }\n                }\n       \"\"\"\n        return self.wordtag(texts)\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description='Run the %s module.' 
% self.name,\n            prog='hub run %s' % self.name,\n            usage='%(prog)s',\n            add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        input_data = self.check_input_data(args)\n\n        results = self.predict(texts=input_data)\n\n        return results\n"
  },
  {
    "path": "modules/text/text_to_knowledge/wordtag/requirements.txt",
    "content": "paddlenlp>=2.2.0\n"
  },
  {
    "path": "modules/video/README.md",
    "content": "## **更好用户体验，建议参考WEB端官方文档 -> [【视频分类】](https://www.paddlepaddle.org.cn/hublist)**\n\n### 视频分类\n视频数据包含语音、图像等多种信息，因此理解视频任务不仅需要处理语音和图像，还需要提取视频帧时间序列中的上下文信息，视频分类模型适用于各类短视频快速打标签等场景。\n- 推荐模型\n\n| 模型名称                                                     | 模型简介                                                     |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n| [视频分类](https://www.paddlepaddle.org.cn/hubdetail?name=videotag_tsn_lstm&en_category=VideoClassification) | 基于千万短视频预训练的视频分类模型，支持超过3000个短视频标签，在实际业务中取得89.9%的Top-1精度，同时具有良好的泛化能力，\n"
  },
  {
    "path": "modules/video/README_en.md",
    "content": "## **For better user experience, refer to the Web official document ->  [Video Classification](https://www.paddlepaddle.org.cn/hublist)**\n\n### Video Classification\n\nVideo data contains lots of information such as voice and image. For the video task, the voice and images are processed, and the contextual information is extracted from time series of video frames. The video classification model is suitable for quick tagging of various short videos.\n\n- Recommended Models\n\n| Model Name                                             | Model Introduction                                         |\n| ------------------------------------------------------------ | ------------------------------------------------------------ |\n\n| [Video Classification](https://www.paddlepaddle.org.cn/hubdetail?name=videotag_tsn_lstm&en_category=VideoClassification) | Based on the pre-trained video classification model for tens of millions of short videos, it supports more than 3000 short video tags, achieving 89.9% Top-1 accuracy in the actual business with good generalization ability.\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/README.md",
    "content": "# SkyAR\n\n|模型名称|SkyAR|\n| :--- | :---: |\n|类别|视频-视频编辑|\n|网络|UNet|\n|数据集|-|\n|是否支持Fine-tuning|否|\n|模型大小|206MB|\n|指标|-|\n|最新更新日期|2021-02-26|\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n\n    - 样例结果示例：\n        * 原始视频：\n\n            ![原始视频](https://img-blog.csdnimg.cn/20210126142046572.gif)\n\n        * 木星：\n\n            ![木星](https://img-blog.csdnimg.cn/20210125211435619.gif)\n        * 雨天：\n\n            ![雨天](https://img-blog.csdnimg.cn/2021012521152492.gif)\n        * 银河：\n\n            ![银河](https://img-blog.csdnimg.cn/20210125211523491.gif)\n        * 第九区飞船：\n\n            ![第九区飞船](https://img-blog.csdnimg.cn/20210125211520955.gif)\n        * 原始视频：\n\n            ![原始视频](https://img-blog.csdnimg.cn/20210126142038716.gif)\n        * 漂浮城堡：\n\n            ![漂浮城堡](https://img-blog.csdnimg.cn/20210125211514997.gif)\n        * 电闪雷鸣：\n\n            ![电闪雷鸣](https://img-blog.csdnimg.cn/20210125211433591.gif)\n        * 超级月亮：\n\n            ![超级月亮](https://img-blog.csdnimg.cn/20210125211417524.gif)\n\n- ### 模型介绍\n\n    - SkyAR是一种用于视频中天空置换与协调的视觉方法，主要由三个核心组成：天空抠图网络、运动估计和图像融合。\n\n    - 更多详情请参考：[SkyAR](https://github.com/jiupinjia/SkyAR)\n\n    - 参考论文：Zhengxia Zou. [Castle in the Sky: Dynamic Sky Replacement and Harmonization in Videos](https://arxiv.org/abs/2010.11800). 
CoRR, abs/2010.11800, 2020.\n\n## 二、安装\n\n- ### 1、环境依赖\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、安装\n\n    ```shell\n    $ hub install SkyAR\n    ```\n    - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n      | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n## 三、模型API预测\n\n- ### 1、预测代码示例\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='SkyAR')\n\n    model.MagicSky(\n        video_path=\"/PATH/TO/VIDEO\",\n        save_path=\"/PATH/TO/SAVE/RESULT\"\n    )\n    ```\n- ### 2、API\n\n    ```python\n    def MagicSky(\n            video_path, save_path, config='jupiter',\n            is_rainy=False, preview_frames_num=0, is_video_sky=False, is_show=False,\n            skybox_img=None, skybox_video=None, rain_cap_path=None,\n            halo_effect=True, auto_light_matching=False,\n            relighting_factor=0.8, recoloring_factor=0.5, skybox_center_crop=0.5\n        )\n    ```\n\n    - **参数**\n\n        * video_path(str)：输入视频路径\n        * save_path(str)：视频保存路径\n        * config(str): 预设 SkyBox 配置，所有预设配置如下，如果使用自定义 SkyBox，请设置为 None：\n        ```\n        [\n            'cloudy', 'district9ship', 'floatingcastle', 'galaxy', 'jupiter',\n            'rainy', 'sunny', 'sunset', 'supermoon', 'thunderstorm'\n        ]\n        ```\n        * skybox_img(str)：自定义的 SkyBox 图像路径\n        * skybox_video(str)：自定义的 SkyBox 视频路径\n        * is_video_sky(bool)：自定义 SkyBox 是否为视频\n        * rain_cap_path(str)：自定义下雨效果视频路径\n        * is_rainy(bool): 天空是否下雨\n        * halo_effect(bool)：是否开启 halo effect\n        * auto_light_matching(bool)：是否开启自动亮度匹配\n        * relighting_factor(float): Relighting factor\n        * recoloring_factor(float): Recoloring factor\n        * skybox_center_crop(float)：SkyBox center crop factor\n        * preview_frames_num(int)：设置预览帧数量，即只处理开头这几帧，设为 0，则为全部处理\n        * 
is_show(bool)：是否图形化预览\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/README_en.md",
    "content": "# SkyAR\n\n|Module Name|SkyAR|\n| :--- | :---: |\n|Category|Video editing|\n|Network|UNet|\n|Dataset|-|\n|Fine-tuning supported or not|No|\n|Module Size|206MB|\n|Data indicators|-|\n|Latest update date|2021-02-26|\n\n## I. Basic Information \n\n- ### Application Effect Display\n\n    - Sample results:\n        * Input video:\n\n            ![Input video](https://img-blog.csdnimg.cn/20210126142046572.gif)\n\n        * Jupiter:\n\n            ![Jupiter](https://img-blog.csdnimg.cn/20210125211435619.gif)\n        * Rainy day:\n\n            ![Rainy day](https://img-blog.csdnimg.cn/2021012521152492.gif)\n        * Galaxy:\n\n            ![Galaxy](https://img-blog.csdnimg.cn/20210125211523491.gif)\n        * Ninth area spacecraft:\n\n            ![Ninth area spacecraft](https://img-blog.csdnimg.cn/20210125211520955.gif)\n\n        * Input video:\n\n            ![Input video](https://img-blog.csdnimg.cn/20210126142038716.gif)\n        * Floating castle:\n\n            ![Floating castle](https://img-blog.csdnimg.cn/20210125211514997.gif)\n        * Thunder and lightning:\n\n            ![Thunder and lightning](https://img-blog.csdnimg.cn/20210125211433591.gif)\n\n        * Super moon:\n        \n            ![Super moon](https://img-blog.csdnimg.cn/20210125211417524.gif)\n\n- ### Module Introduction\n\n    - SkyAR is based on [Castle in the Sky: Dynamic Sky Replacement and Harmonization in Videos](https://arxiv.org/abs/2010.11800). It mainly consists of three parts: sky matting network, motion estimation and image fusion.\n\n    - For more information, please refer to:[SkyAR](https://github.com/jiupinjia/SkyAR)\n\n\n## II. 
Installation\n\n- ### 1、Environmental Dependence\n\n    - paddlepaddle >= 2.0.0\n\n    - paddlehub >= 2.0.0\n\n- ### 2、Installation\n\n    - ```shell\n      $ hub install SkyAR\n      ```\n    - In case of any problems during installation, please refer to: [Windows_Quickstart](../../../../docs/docs_en/get_start/windows_quickstart.md)\n    | [Linux_Quickstart](../../../../docs/docs_en/get_start/linux_quickstart.md) | [Mac_Quickstart](../../../../docs/docs_en/get_start/mac_quickstart.md)  \n\n## III. Module API Prediction\n\n- ### 1、Prediction Code Example\n\n    ```python\n    import paddlehub as hub\n\n    model = hub.Module(name='SkyAR')\n\n    model.MagicSky(\n        video_path=[path to input video path],\n        save_path=[path to save video path]\n    )\n    ```\n- ### 2、API\n\n    ```python\n    def MagicSky(\n            video_path, save_path, config='jupiter',\n            is_rainy=False, preview_frames_num=0, is_video_sky=False, is_show=False,\n            skybox_img=None, skybox_video=None, rain_cap_path=None,\n            halo_effect=True, auto_light_matching=False,\n            relighting_factor=0.8, recoloring_factor=0.5, skybox_center_crop=0.5\n        )\n    ```\n\n    - **Parameters**\n\n        * video_path(str)：input video path.\n        * save_path(str)：output video save path.\n        * config(str): SkyBox configuration, all preset configurations are as follows: `['cloudy', 'district9ship', 'floatingcastle', 'galaxy', 'jupiter',\n            'rainy', 'sunny', 'sunset', 'supermoon', 'thunderstorm'\n        ]`, if you use a custom SkyBox, please set it to None.\n    \n        * skybox_img(str)：custom SkyBox image path\n        * skybox_video(str)：custom SkyBox video path\n        * is_video_sky(bool)：whether the custom SkyBox is a video\n        * rain_cap_path(str)：custom rain effect video path\n        * is_rainy(bool): whether the sky is raining\n        * halo_effect(bool)：whether to enable the halo effect\n        * auto_light_matching(bool)：whether to enable automatic brightness matching\n        * relighting_factor(float): relighting factor\n        * recoloring_factor(float): recoloring factor\n        * skybox_center_crop(float)：SkyBox center crop factor\n        * preview_frames_num(int)：number of preview frames to process; set to 0 to process all frames\n        * is_show(bool)：whether to preview graphically\n\n\n## IV. Release Note\n\n- 1.0.0\n\n  First release\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/__init__.py",
    "content": ""
  },
  {
    "path": "modules/video/Video_editing/SkyAR/module.py",
    "content": "import os\nimport paddle.nn as nn\nfrom .skyfilter import SkyFilter\nfrom paddlehub.module.module import moduleinfo\n\n\n@moduleinfo(name=\"SkyAR\", type=\"CV/Video_editing\", author=\"jm12138\", author_email=\"\", summary=\"SkyAR\", version=\"1.0.0\")\nclass SkyAR(nn.Layer):\n    def __init__(self, model_path=None):\n        super(SkyAR, self).__init__()\n        self.imgs = [\n            'cloudy', 'district9ship', 'floatingcastle', 'galaxy', 'jupiter', 'rainy', 'sunny', 'sunset', 'supermoon'\n        ]\n        self.videos = ['thunderstorm']\n        if model_path:\n            self.model_path = model_path\n        else:\n            self.model_path = os.path.join(self.directory, './ResNet50FCN')\n\n    def MagicSky(self,\n                 video_path,\n                 save_path,\n                 config='jupiter',\n                 is_rainy=False,\n                 preview_frames_num=0,\n                 is_video_sky=False,\n                 is_show=False,\n                 skybox_img=None,\n                 skybox_video=None,\n                 rain_cap_path=None,\n                 halo_effect=True,\n                 auto_light_matching=False,\n                 relighting_factor=0.8,\n                 recoloring_factor=0.5,\n                 skybox_center_crop=0.5):\n        if config in self.imgs:\n            skybox_img = os.path.join(self.directory, 'skybox', '%s.jpg' % config)\n            skybox_video = None\n            is_video_sky = False\n        elif config in self.videos:\n            skybox_img = None\n            skybox_video = os.path.join(self.directory, 'skybox', '%s.mp4' % config)\n            is_video_sky = True\n        elif skybox_img:\n            is_video_sky = False\n            skybox_video = None\n        elif is_video_sky and skybox_video:\n            skybox_img = None\n        else:\n            raise 'please check your configs'\n\n        if not rain_cap_path:\n            rain_cap_path = os.path.join(self.directory, 
'rain_streaks', 'videoplayback.mp4')\n\n        skyfilter = SkyFilter(\n            model_path=self.model_path,\n            video_path=video_path,\n            save_path=save_path,\n            in_size=(384, 384),\n            halo_effect=halo_effect,\n            auto_light_matching=auto_light_matching,\n            relighting_factor=relighting_factor,\n            recoloring_factor=recoloring_factor,\n            skybox_center_crop=skybox_center_crop,\n            rain_cap_path=rain_cap_path,\n            skybox_img=skybox_img,\n            skybox_video=skybox_video,\n            is_video=is_video_sky,\n            is_rainy=is_rainy,\n            is_show=is_show)\n\n        skyfilter.run(preview_frames_num)\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/rain.py",
    "content": "import cv2\nimport numpy as np\n\n__all__ = ['Rain']\n\n\nclass Rain():\n    def __init__(self, rain_cap_path, rain_intensity=1.0, haze_intensity=4.0, gamma=2.0, light_correction=0.9):\n        self.rain_intensity = rain_intensity\n        self.haze_intensity = haze_intensity\n        self.gamma = gamma\n        self.light_correction = light_correction\n        self.frame_id = 1\n\n        self.cap = cv2.VideoCapture(rain_cap_path)\n\n    def _get_rain_layer(self):\n        ret, frame = self.cap.read()\n        if ret:\n            rain_layer = frame\n        else:  # if reach the last frame, read from the begining\n            self.cap.set(cv2.CAP_PROP_POS_FRAMES, 0)\n            ret, frame = self.cap.read()\n            rain_layer = frame\n\n        rain_layer = cv2.cvtColor(rain_layer, cv2.COLOR_BGR2RGB) / 255.0\n        rain_layer = np.array(rain_layer, dtype=np.float32)\n\n        return rain_layer\n\n    def _create_haze_layer(self, rain_layer):\n        return 0.1 * np.ones_like(rain_layer)\n\n    def forward(self, img):\n        # get input image size\n        h, w, c = img.shape\n\n        # create a rain layer\n        rain_layer = self._get_rain_layer()\n\n        rain_layer = cv2.resize(rain_layer, (w, h))\n        rain_layer = cv2.blur(rain_layer, (3, 3))\n        rain_layer = rain_layer * \\\n            (1 - cv2.boxFilter(img, -1, (int(w/10), int(h/10))))\n\n        # create a haze layer\n        haze_layer = self._create_haze_layer(rain_layer)\n\n        # combine the rain layer and haze layer together\n        rain_layer = self.rain_intensity*rain_layer + \\\n            self.haze_intensity*haze_layer\n\n        # synthesize an output image (screen blend)\n        img_out = 1 - (1 - rain_layer) * (1 - img)\n\n        # gamma and light correction\n        img_out = self.light_correction * (img_out**self.gamma)\n\n        # check boundary\n        img_out = np.clip(img_out, a_min=0, a_max=1.)\n\n        return img_out\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/skybox.py",
    "content": "import cv2\nimport numpy as np\n\nfrom .rain import Rain\nfrom .utils import build_transformation_matrix, update_transformation_matrix, estimate_partial_transform, removeOutliers, guidedfilter\n\n\nclass SkyBox():\n    def __init__(self, out_size, skybox_img, skybox_video, halo_effect, auto_light_matching, relighting_factor,\n                 recoloring_factor, skybox_center_crop, rain_cap_path, is_video, is_rainy):\n\n        self.out_size_w, self.out_size_h = out_size\n\n        self.skybox_img = skybox_img\n        self.skybox_video = skybox_video\n\n        self.is_rainy = is_rainy\n        self.is_video = is_video\n\n        self.halo_effect = halo_effect\n        self.auto_light_matching = auto_light_matching\n\n        self.relighting_factor = relighting_factor\n        self.recoloring_factor = recoloring_factor\n\n        self.skybox_center_crop = skybox_center_crop\n        self.load_skybox()\n        self.rainmodel = Rain(\n            rain_cap_path=rain_cap_path, rain_intensity=0.8, haze_intensity=0.0, gamma=1.0, light_correction=1.0)\n\n        # motion parameters\n        self.M = np.array([[1, 0, 0], [0, 1, 0]], dtype=np.float32)\n\n        self.frame_id = 0\n\n    def tile_skybox_img(self, imgtile):\n        screen_y1 = int(imgtile.shape[0] / 2 - self.out_size_h / 2)\n        screen_x1 = int(imgtile.shape[1] / 2 - self.out_size_w / 2)\n        imgtile = np.concatenate([imgtile[screen_y1:, :, :], imgtile[0:screen_y1, :, :]], axis=0)\n        imgtile = np.concatenate([imgtile[:, screen_x1:, :], imgtile[:, 0:screen_x1, :]], axis=1)\n\n        return imgtile\n\n    def load_skybox(self):\n        print('initialize skybox...')\n        if not self.is_video:\n            # static backgroud\n            skybox_img = cv2.imread(self.skybox_img, cv2.IMREAD_COLOR)\n            skybox_img = cv2.cvtColor(skybox_img, cv2.COLOR_BGR2RGB)\n\n            self.skybox_img = cv2.resize(skybox_img, (self.out_size_w, self.out_size_h))\n            cc = 1. 
/ self.skybox_center_crop\n            imgtile = cv2.resize(skybox_img, (int(cc * self.out_size_w), int(cc * self.out_size_h)))\n            self.skybox_imgx2 = self.tile_skybox_img(imgtile)\n            self.skybox_imgx2 = np.expand_dims(self.skybox_imgx2, axis=0)\n\n        else:\n            # video background\n            cap = cv2.VideoCapture(self.skybox_video)\n            m_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))\n            cc = 1. / self.skybox_center_crop\n            self.skybox_imgx2 = np.zeros([m_frames, int(cc * self.out_size_h), int(cc * self.out_size_w), 3], np.uint8)\n            for i in range(m_frames):\n                _, skybox_img = cap.read()\n                skybox_img = cv2.cvtColor(skybox_img, cv2.COLOR_BGR2RGB)\n                imgtile = cv2.resize(skybox_img, (int(cc * self.out_size_w), int(cc * self.out_size_h)))\n                skybox_imgx2 = self.tile_skybox_img(imgtile)\n                self.skybox_imgx2[i, :] = skybox_imgx2\n\n    def skymask_refinement(self, G_pred, img):\n        r, eps = 20, 0.01\n        refined_skymask = guidedfilter(img[:, :, 2], G_pred[:, :, 0], r, eps)\n\n        refined_skymask = np.stack([refined_skymask, refined_skymask, refined_skymask], axis=-1)\n\n        return np.clip(refined_skymask, a_min=0, a_max=1)\n\n    def get_skybg_from_box(self, m):\n        self.M = update_transformation_matrix(self.M, m)\n\n        nbgs, bgh, bgw, c = self.skybox_imgx2.shape\n        fetch_id = self.frame_id % nbgs\n        skybg_warp = cv2.warpAffine(\n            self.skybox_imgx2[fetch_id, :, :, :], self.M, (bgw, bgh), borderMode=cv2.BORDER_WRAP)\n\n        skybg = skybg_warp[0:self.out_size_h, 0:self.out_size_w, :]\n\n        self.frame_id += 1\n\n        return np.array(skybg, np.float32) / 255.\n\n    def skybox_tracking(self, frame, frame_prev, skymask):\n        if np.mean(skymask) < 0.05:\n            print('sky area is too small')\n            return np.array([[1, 0, 0], [0, 1, 0]], 
dtype=np.float32)\n\n        prev_gray = cv2.cvtColor(frame_prev, cv2.COLOR_RGB2GRAY)\n        prev_gray = np.array(255 * prev_gray, dtype=np.uint8)\n        curr_gray = cv2.cvtColor(frame, cv2.COLOR_RGB2GRAY)\n        curr_gray = np.array(255 * curr_gray, dtype=np.uint8)\n\n        mask = np.array(skymask[:, :, 0] > 0.99, dtype=np.uint8)\n\n        template_size = int(0.05 * mask.shape[0])\n        mask = cv2.erode(mask, np.ones([template_size, template_size]))\n\n        # ShiTomasi corner detection\n        prev_pts = cv2.goodFeaturesToTrack(\n            prev_gray, mask=mask, maxCorners=200, qualityLevel=0.01, minDistance=30, blockSize=3)\n\n        if prev_pts is None:\n            print('no feature point detected')\n            return np.array([[1, 0, 0], [0, 1, 0]], dtype=np.float32)\n\n        # Calculate optical flow (i.e. track feature points)\n        curr_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)\n        # Filter only valid points\n        idx = np.where(status == 1)[0]\n        if idx.size == 0:\n            print('no good point matched')\n            return np.array([[1, 0, 0], [0, 1, 0]], dtype=np.float32)\n\n        prev_pts, curr_pts = removeOutliers(prev_pts, curr_pts)\n\n        if curr_pts.shape[0] < 10:\n            print('no good point matched')\n            return np.array([[1, 0, 0], [0, 1, 0]], dtype=np.float32)\n\n        # limit the motion to translation + rotation\n        dxdyda = estimate_partial_transform((np.array(prev_pts), np.array(curr_pts)))\n        m = build_transformation_matrix(dxdyda)\n\n        return m\n\n    def relighting(self, img, skybg, skymask):\n        # color matching, reference: skybox_img\n        step = int(img.shape[0] / 20)\n        skybg_thumb = skybg[::step, ::step, :]\n        img_thumb = img[::step, ::step, :]\n        skymask_thumb = skymask[::step, ::step, :]\n        skybg_mean = np.mean(skybg_thumb, axis=(0, 1), keepdims=True)\n        img_mean = 
np.sum(img_thumb * (1-skymask_thumb), axis=(0, 1), keepdims=True) \\\n            / ((1-skymask_thumb).sum(axis=(0, 1), keepdims=True) + 1e-9)\n        diff = skybg_mean - img_mean\n        img_colortune = img + self.recoloring_factor * diff\n\n        if self.auto_light_matching:\n            img = img_colortune\n        else:\n            # keep foreground ambient light and manually adjust lighting\n            img = self.relighting_factor * \\\n                (img_colortune + (img.mean() - img_colortune.mean()))\n\n        return img\n\n    def halo(self, syneth, skybg, skymask):\n        # reflection\n        halo = 0.5 * cv2.blur(skybg * skymask, (int(self.out_size_w / 5), int(self.out_size_w / 5)))\n        # screen blend 1 - (1-a)(1-b)\n        syneth_with_halo = 1 - (1 - syneth) * (1 - halo)\n\n        return syneth_with_halo\n\n    def skyblend(self, img, img_prev, skymask):\n        m = self.skybox_tracking(img, img_prev, skymask)\n\n        skybg = self.get_skybg_from_box(m)\n\n        img = self.relighting(img, skybg, skymask)\n        syneth = img * (1 - skymask) + skybg * skymask\n\n        if self.halo_effect:\n            # halo effect brings better visual realism but will slow down the speed\n            syneth = self.halo(syneth, skybg, skymask)\n\n        if self.is_rainy:\n            syneth = self.rainmodel.forward(syneth)\n\n        return np.clip(syneth, a_min=0, a_max=1)\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/skyfilter.py",
    "content": "import os\nimport cv2\nimport paddle\nimport numpy as np\nfrom .skybox import SkyBox\n\n__all__ = ['SkyFilter']\n\n\nclass SkyFilter():\n    def __init__(self, model_path, video_path, save_path, in_size, halo_effect, auto_light_matching, relighting_factor,\n                 recoloring_factor, skybox_center_crop, rain_cap_path, skybox_img, skybox_video, is_video, is_rainy,\n                 is_show):\n        self.in_size = in_size\n        self.is_show = is_show\n        self.cap = cv2.VideoCapture(video_path)\n        self.m_frames = int(self.cap.get(cv2.CAP_PROP_FRAME_COUNT))\n        self.fps = self.cap.get(cv2.CAP_PROP_FPS)\n        self.out_size = int(self.cap.get(cv2.CAP_PROP_FRAME_WIDTH)), int(self.cap.get(cv2.CAP_PROP_FRAME_HEIGHT))\n\n        self.model = paddle.jit.load(model_path, model_filename='__model__', params_filename='__params__')\n        self.model.eval()\n\n        self.skyboxengine = SkyBox(\n            out_size=self.out_size,\n            skybox_img=skybox_img,\n            skybox_video=skybox_video,\n            halo_effect=halo_effect,\n            auto_light_matching=auto_light_matching,\n            relighting_factor=relighting_factor,\n            recoloring_factor=recoloring_factor,\n            skybox_center_crop=skybox_center_crop,\n            rain_cap_path=rain_cap_path,\n            is_video=is_video,\n            is_rainy=is_rainy)\n        path, _ = os.path.split(save_path)\n        if path == '':\n            path = '.'\n        if not os.path.exists(path):\n            os.mkdir(path)\n        self.video_writer = cv2.VideoWriter(save_path, cv2.VideoWriter_fourcc(*'MP4V'), self.fps, self.out_size)\n\n    def synthesize(self, img_HD, img_HD_prev):\n        h, w, _ = img_HD.shape\n\n        img = cv2.resize(img_HD, self.in_size)\n        img = np.array(img, dtype=np.float32)\n        img = img.transpose(2, 0, 1)\n        img = img[np.newaxis, ...]\n        img = paddle.to_tensor(img)\n\n        G_pred = 
self.model(img)\n        G_pred = paddle.nn.functional.interpolate(G_pred, (h, w), mode='bicubic', align_corners=False)\n        G_pred = G_pred[0, :].transpose([1, 2, 0])\n        G_pred = paddle.concat([G_pred, G_pred, G_pred], axis=-1)\n        G_pred = G_pred.detach().numpy()\n        G_pred = np.clip(G_pred, a_max=1.0, a_min=0.0)\n\n        skymask = self.skyboxengine.skymask_refinement(G_pred, img_HD)\n        syneth = self.skyboxengine.skyblend(img_HD, img_HD_prev, skymask)\n\n        return syneth, G_pred, skymask\n\n    def run(self, preview_frames_num=0):\n\n        img_HD_prev = None\n        frames_num = preview_frames_num if 0 < preview_frames_num < self.m_frames else self.m_frames\n\n        print('frames_num: %d, running evaluation...' % frames_num)\n        for idx in range(1, frames_num + 1):\n            ret, frame = self.cap.read()\n            if ret:\n                frame = cv2.resize(frame, self.out_size)\n                img_HD = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n                img_HD = np.array(img_HD / 255., dtype=np.float32)\n\n                if img_HD_prev is None:\n                    img_HD_prev = img_HD\n\n                syneth, _, _ = self.synthesize(img_HD, img_HD_prev)\n                result = np.array(255.0 * syneth[:, :, ::-1], dtype=np.uint8)\n                self.video_writer.write(result)\n                if self.is_show:\n                    show_img = np.concatenate([frame, result], 1)\n                    h, w = show_img.shape[:2]\n                    show_img = cv2.resize(show_img, (720, int(720 / w * h)))\n                    cv2.imshow('preview', show_img)\n                    k = cv2.waitKey(1)\n                    if (k == 27) or (cv2.getWindowProperty('preview', 0) == -1):\n                        self.video_writer.release()\n                        cv2.destroyAllWindows()\n                        break\n                print('processing: %d / %d ...' 
% (idx, frames_num))\n\n                img_HD_prev = img_HD\n\n            else:\n                self.video_writer.release()\n                cv2.destroyAllWindows()\n                break\n"
  },
  {
    "path": "modules/video/Video_editing/SkyAR/utils.py",
    "content": "import cv2\nimport numpy as np\nfrom sklearn.neighbors import KernelDensity\n\n__all__ = [\n    'build_transformation_matrix', 'update_transformation_matrix', 'estimate_partial_transform', 'removeOutliers',\n    'guidedfilter'\n]\n\n\ndef build_transformation_matrix(transform):\n    \"\"\"Convert transform list to transformation matrix\n\n    :param transform: transform list as [dx, dy, da]\n    :return: transform matrix as 2d (2, 3) numpy array\n    \"\"\"\n    transform_matrix = np.zeros((2, 3))\n\n    transform_matrix[0, 0] = np.cos(transform[2])\n    transform_matrix[0, 1] = -np.sin(transform[2])\n    transform_matrix[1, 0] = np.sin(transform[2])\n    transform_matrix[1, 1] = np.cos(transform[2])\n    transform_matrix[0, 2] = transform[0]\n    transform_matrix[1, 2] = transform[1]\n\n    return transform_matrix\n\n\ndef update_transformation_matrix(M, m):\n\n    # extend M and m to 3x3 by adding an [0,0,1] to their 3rd row\n    M_ = np.concatenate([M, np.zeros([1, 3])], axis=0)\n    M_[-1, -1] = 1\n    m_ = np.concatenate([m, np.zeros([1, 3])], axis=0)\n    m_[-1, -1] = 1\n\n    M_new = np.matmul(m_, M_)\n    return M_new[0:2, :]\n\n\ndef estimate_partial_transform(matched_keypoints):\n    \"\"\"Wrapper of cv2.estimateRigidTransform for convenience in vidstab process\n\n    :param matched_keypoints: output of match_keypoints util function; tuple of (cur_matched_kp, prev_matched_kp)\n    :return: transform as list of [dx, dy, da]\n    \"\"\"\n    prev_matched_kp, cur_matched_kp = matched_keypoints\n    transform = cv2.estimateAffinePartial2D(np.array(prev_matched_kp), np.array(cur_matched_kp))[0]\n\n    if transform is not None:\n        # translation x\n        dx = transform[0, 2]\n        # translation y\n        dy = transform[1, 2]\n        # rotation\n        da = np.arctan2(transform[1, 0], transform[0, 0])\n    else:\n        dx = dy = da = 0\n\n    return [dx, dy, da]\n\n\ndef removeOutliers(prev_pts, curr_pts):\n\n    d = 
np.sum((prev_pts - curr_pts)**2, axis=-1)**0.5\n\n    d_ = np.array(d).reshape(-1, 1)\n    kde = KernelDensity(kernel='gaussian', bandwidth=0.5).fit(d_)\n    density = np.exp(kde.score_samples(d_))\n\n    prev_pts = prev_pts[np.where((density >= 0.1))]\n    curr_pts = curr_pts[np.where((density >= 0.1))]\n\n    return prev_pts, curr_pts\n\n\ndef boxfilter(img, r):\n    (rows, cols) = img.shape\n    imDst = np.zeros_like(img)\n\n    imCum = np.cumsum(img, 0)\n    imDst[0:r + 1, :] = imCum[r:2 * r + 1, :]\n    imDst[r + 1:rows - r, :] = imCum[2 * r + 1:rows, :] - imCum[0:rows - 2 * r - 1, :]\n    imDst[rows - r:rows, :] = np.tile(imCum[rows - 1, :], [r, 1]) - imCum[rows - 2 * r - 1:rows - r - 1, :]\n\n    imCum = np.cumsum(imDst, 1)\n    imDst[:, 0:r + 1] = imCum[:, r:2 * r + 1]\n    imDst[:, r + 1:cols - r] = imCum[:, 2 * r + 1:cols] - imCum[:, 0:cols - 2 * r - 1]\n    imDst[:, cols - r:cols] = np.tile(imCum[:, cols - 1], [r, 1]).T - imCum[:, cols - 2 * r - 1:cols - r - 1]\n\n    return imDst\n\n\ndef guidedfilter(img, p, r, eps):\n    (rows, cols) = img.shape\n    N = boxfilter(np.ones([rows, cols]), r)\n\n    meanI = boxfilter(img, r) / N\n    meanP = boxfilter(p, r) / N\n    meanIp = boxfilter(img * p, r) / N\n    covIp = meanIp - meanI * meanP\n\n    meanII = boxfilter(img * img, r) / N\n    varI = meanII - meanI * meanI\n\n    a = covIp / (varI + eps)\n    b = meanP - a * meanI\n\n    meanA = boxfilter(a, r) / N\n    meanB = boxfilter(b, r) / N\n\n    q = meanA * img + meanB\n    return q\n"
  },
  {
    "path": "modules/video/classification/README.md",
    "content": ""
  },
  {
    "path": "modules/video/classification/nonlocal_kinetics400/README.md",
    "content": "# nonlocal_kinetics400\n\n|模型名称|nonlocal_kinetics400|\n| :--- | :---: | \n|类别|视频-视频分类|\n|网络|Non-local|\n|数据集|Kinetics-400|\n|是否支持Fine-tuning|否|\n|模型大小|129MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - Non-local Neural Networks是由Xiaolong Wang等研究者在2017年提出的模型，主要特点是通过引入Non-local操作来描述距离较远的像素点之间的关联关系。其借助于传统计算机视觉中的non-local mean的思想，并将该思想扩展到神经网络中，通过定义输出位置和所有输入位置之间的关联函数，建立全局关联特性。Non-local模型的训练数据采用由DeepMind公布的Kinetics-400动作识别数据集。该PaddleHub Module可支持预测。\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0\n  \n  - paddlehub >= 1.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install nonlocal_kinetics400\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    hub run nonlocal_kinetics400 --input_path \"/PATH/TO/VIDEO\" --use_gpu True\n    ```\n    \n    或者\n    \n  - ```shell\n    hub run nonlocal_kinetics400 --input_file test.txt --use_gpu True\n    ```    \n    \n  - test.txt 存放待分类视频的存放路径；\n  - Note: 该PaddleHub Module目前只支持在GPU环境下使用，在使用前，请使用下述命令指定GPU设备（设备ID请根据实际情况指定）\n  \n  - ```shell\n    export CUDA_VISIBLE_DEVICES=0\n    ```    \n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n\n    import paddlehub as hub\n\n    nonlocal = hub.Module(name=\"nonlocal_kinetics400\")\n\n    test_video_path = \"/PATH/TO/VIDEO\"\n\n    # set input dict\n    input_dict = {\"image\": [test_video_path]}\n\n    # execute predict and print the result\n    results = nonlocal.video_classification(data=input_dict)\n    for result in results:\n        print(result)\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def video_classification(data)\n    ```    
\n\n    - 用于视频分类预测\n    \n    - **参数**\n\n      - data(dict): dict类型，key为image，str类型；value为待分类的视频路径，list类型。\n\n\n    - **返回**\n\n      - result(list\\[dict\\]): list类型，每个元素为对应输入视频的预测结果。预测结果为dict类型，key为label，value为该label对应的概率值。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $  hub install nonlocal_kinetics400==1.0.0\n    ```\n"
  },
  {
    "path": "modules/video/classification/stnet_kinetics400/README.md",
    "content": "# stnet_kinetics400\n\n|模型名称|stnet_kinetics400|\n| :--- | :---: | \n|类别|视频-视频分类|\n|网络|StNet|\n|数据集|Kinetics-400|\n|是否支持Fine-tuning|否|\n|模型大小|129MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - StNet模型框架为ActivityNet Kinetics Challenge 2018中夺冠的基础网络框架，是基于ResNet50实现的。该模型提出super-image的概念，在super-image上进行2D卷积，建模视频中局部时空相关性。另外通过temporal modeling block建模视频的全局时空依赖，最后用一个temporal Xception block对抽取的特征序列进行长时序建模。StNet的训练数据采用由DeepMind公布的Kinetics-400动作识别数据集。该PaddleHub Module可支持预测。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0\n  \n  - paddlehub >= 1.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install stnet_kinetics400\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    hub run stnet_kinetics400 --input_path \"/PATH/TO/VIDEO\"\n    ```\n    \n    或者\n    \n  - ```shell\n    hub run stnet_kinetics400 --input_file test.txt \n    ```    \n    \n  - test.txt 存放待分类视频的存放路径\n\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n\n    import paddlehub as hub\n\n    stnet = hub.Module(name=\"stnet_kinetics400\")\n\n    test_video_path = \"/PATH/TO/VIDEO\"\n\n    # set input dict\n    input_dict = {\"image\": [test_video_path]}\n\n    # execute predict and print the result\n    results = stnet.video_classification(data=input_dict)\n    for result in results:\n        print(result)\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def video_classification(data)\n    ```    \n\n    - 用于视频分类预测\n    \n    - **参数**\n\n      - data(dict): dict类型，key为image，str类型；value为待分类的视频路径，list类型。\n\n\n    - **返回**\n\n      - result(list\\[dict\\]): 
list类型，每个元素为对应输入视频的预测结果。预测结果为dict类型，key为label，value为该label对应的概率值。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $ hub install stnet_kinetics400==1.0.0\n    ```\n"
  },
  {
    "path": "modules/video/classification/tsm_kinetics400/README.md",
    "content": "# tsm_kinetics400\n\n|模型名称|tsm_kinetics400|\n| :--- | :---: | \n|类别|视频-视频分类|\n|网络|TSM|\n|数据集|Kinetics-400|\n|是否支持Fine-tuning|否|\n|模型大小|95MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - TSM（Temporal Shift Module）是由MIT和IBM Watson AI Lab的JiLin，ChuangGan和SongHan等人提出的通过时间位移来提高网络视频理解能力的模块。TSM的训练数据采用由DeepMind公布的Kinetics-400动作识别数据集。该PaddleHub Module可支持预测。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0\n  \n  - paddlehub >= 1.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install tsm_kinetics400\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    hub run tsm_kinetics400 --input_path \"/PATH/TO/VIDEO\"\n    ```\n    \n    或者\n    \n  - ```shell\n    hub run tsm_kinetics400 --input_file test.txt \n    ```    \n    \n  - Note: test.txt 存放待分类视频的存放路径\n\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n\n    import paddlehub as hub\n\n    tsm = hub.Module(name=\"tsm_kinetics400\")\n\n    test_video_path = \"/PATH/TO/VIDEO\"\n\n    # set input dict\n    input_dict = {\"image\": [test_video_path]}\n\n    # execute predict and print the result\n    results = tsm.video_classification(data=input_dict)\n    for result in results:\n        print(result)\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def video_classification(data)\n    ```    \n\n    - 用于视频分类预测\n    \n    - **参数**\n\n      - data(dict): dict类型，key为image，str类型；value为待分类的视频路径，list类型。\n\n\n    - **返回**\n\n      - result(list\\[dict\\]): list类型，每个元素为对应输入视频的预测结果。预测结果为dict类型，key为label，value为该label对应的概率值。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $ hub 
install tsm_kinetics400==1.0.0\n    ```\n"
  },
  {
    "path": "modules/video/classification/tsn_kinetics400/README.md",
    "content": "# tsn_kinetics400\n\n|模型名称|tsn_kinetics400|\n| :--- | :---: | \n|类别|视频-视频分类|\n|网络|TSN|\n|数据集|Kinetics-400|\n|是否支持Fine-tuning|否|\n|模型大小|95MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - TSN（Temporal Segment Network）是视频分类领域经典的基于2D-CNN的解决方案。该方法主要解决视频的长时间行为判断问题，通过稀疏采样视频帧的方式代替稠密采样，既能捕获视频全局信息，也能去除冗余，降低计算量。最终将每帧特征平均融合后得到视频的整体特征，并用于分类。TSN的训练数据采用由DeepMind公布的Kinetics-400动作识别数据集。该PaddleHub Module可支持预测。\n\n  - 具体网络结构可参考论文：[TSN](https://arxiv.org/abs/1608.00859)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.4.0\n  \n  - paddlehub >= 1.0.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install tsn_kinetics400\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    hub run tsn_kinetics400 --input_path \"/PATH/TO/VIDEO\"\n    ```\n    \n    或者\n    \n  - ```shell\n    hub run tsn_kinetics400 --input_file test.txt \n    ```    \n    \n  - Note: test.txt 存放待分类视频的存放路径\n\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n\n    import paddlehub as hub\n\n    tsn = hub.Module(name=\"tsn_kinetics400\")\n\n    test_video_path = \"/PATH/TO/VIDEO\"\n\n    # set input dict\n    input_dict = {\"image\": [test_video_path]}\n\n    # execute predict and print the result\n    results = tsn.video_classification(data=input_dict)\n    for result in results:\n        print(result)\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def video_classification(data)\n    ```    \n\n    - 用于视频分类预测\n    \n    - **参数**\n\n      - data(dict): dict类型，key为image，str类型；value为待分类的视频路径，list类型。\n\n\n    - **返回**\n\n      - result(list\\[dict\\]): 
list类型，每个元素为对应输入视频的预测结果。预测结果为dict类型，key为label，value为该label对应的概率值。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $ hub install tsn_kinetics400==1.0.0\n    ```\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/README.md",
    "content": "# videotag_tsn_lstm\n\n|模型名称|videotag_tsn_lstm|\n| :--- | :---: | \n|类别|视频-视频分类|\n|网络|TSN + AttentionLSTM|\n|数据集|百度自建数据集|\n|是否支持Fine-tuning|否|\n|模型大小|409MB|\n|最新更新日期|2021-02-26|\n|数据指标|-|\n\n\n\n## 一、模型基本信息\n\n- ### 模型介绍\n\n  - videotag_tsn_lstm是一个基于千万短视频预训练的视频分类模型，可直接预测短视频的中文标签。模型分为视频特征抽取和序列建模两个阶段，前者使用TSN网络提取视频特征，后者基于前者输出使用AttentionLSTM网络进行序列建模实现分类。模型基于百度实际短视频场景中的大规模数据训练得到，在实际业务中取得89.9%的Top-1精度，同时具有良好的泛化能力，适用于多种短视频中文标签分类场景。该PaddleHub Module可支持预测。\n\n\n<p align=\"center\">\n<img src=\"https://paddlehub.bj.bcebos.com/model/video/video_classifcation/VideoTag_TSN_AttentionLSTM.png\"  width = \"450\"  hspace='10'/> <br />\n</p>\n\n  - 具体网络结构可参考论文：[TSN](https://arxiv.org/abs/1608.00859) 和 [AttentionLSTM](https://arxiv.org/abs/1503.08909)。\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddlepaddle >= 1.7.2\n  \n  - paddlehub >= 1.6.0    | [如何安装PaddleHub](../../../../docs/docs_ch/get_start/installation.rst)\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install videotag_tsn_lstm\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n\n\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    hub run videotag_tsn_lstm --input_path 1.mp4 --use_gpu False\n    ```\n    \n  - 示例文件下载：\n    - [1.mp4](https://paddlehub.bj.bcebos.com/model/video/video_classifcation/1.mp4)\n    - [2.mp4](https://paddlehub.bj.bcebos.com/model/video/video_classifcation/2.mp4)\n\n  - 通过命令行方式实现文字识别模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    videotag = hub.Module(name=\"videotag_tsn_lstm\")\n\n    # execute predict and print the result\n    results = videotag.classify(paths=[\"1.mp4\",\"2.mp4\"], use_gpu=False)  # 示例文件请在上方下载\n    print(results)\n    \n    #[{'path': '1.mp4', 'prediction': {'训练': 
0.9771281480789185, '蹲': 0.9389840960502625, '杠铃': 0.8554490804672241, '健身房': 0.8479971885681152}}, {'path': '2.mp4', 'prediction': {'舞蹈': 0.8504238724708557}}]\n\n\n    ```\n    \n- ### 3、API\n\n  - ```python\n    def classify(paths,\n                 use_gpu=False,\n                 threshold=0.5,\n                 top_k=10)\n    ```    \n\n    - 用于视频分类预测\n\n    - **参数**\n\n      - paths(list\\[str\\])：mp4文件路径\n      - use_gpu(bool)：是否使用GPU预测，默认为False\n      - threshold(float)：预测结果阈值，只有预测概率大于阈值的类别会被返回，默认为0.5\n      - top_k(int): 返回预测结果的前k个，默认为10\n\n    - **返回**\n\n      - results(list\\[dict\\]): result中的每个元素为对应输入的预测结果，预测单个mp4文件时仅有1个元素。每个预测结果为dict，包含mp4文件路径path及其分类概率。\n\n\n## 五、更新历史\n\n* 1.0.0\n\n  初始发布\n  \n  - ```shell\n    $ hub install videotag_tsn_lstm==1.0.0\n    ```\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/__init__.py",
    "content": ""
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport argparse\nimport ast\nimport os\n\nimport paddle.fluid as fluid\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable\nfrom paddlehub.common.logger import logger\n\nfrom videotag_tsn_lstm.resource.utils.config_utils import *\nimport videotag_tsn_lstm.resource.models as models\nfrom videotag_tsn_lstm.resource.reader import get_reader\nfrom videotag_tsn_lstm.resource.metrics import get_metrics\nfrom videotag_tsn_lstm.resource.utils.utility import check_cuda\nfrom videotag_tsn_lstm.resource.utils.utility import check_version\n\n\n@moduleinfo(\n    name=\"videotag_tsn_lstm\",\n    version=\"1.0.0\",\n    summary=\n    \"videotag_tsn_lstm is a video classification model, using TSN for feature extraction and AttentionLSTM for classification\",\n    author=\"paddlepaddle\",\n    author_email=\"paddle-dev@baidu.com\",\n    type=\"video/classification\",\n)\nclass VideoTag(hub.Module):\n    def _initialize(self):\n        # add arg parser\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the videotag_tsn_lstm module.\",\n            prog='hub run videotag_tsn_lstm',\n            usage='%(prog)s',\n            add_help=True)\n        
self.parser.add_argument('--use_gpu', type=ast.literal_eval, default=False, help='default use gpu.')\n        self.parser.add_argument('--input_path', type=str, default=None, help='path of video data, single video')\n        self._has_load = False\n\n    def _extractor(self, args, exe, place):\n        extractor_scope = fluid.Scope()\n        with fluid.scope_guard(extractor_scope):\n            extractor_startup_prog = fluid.Program()\n            extractor_main_prog = fluid.Program()\n            with fluid.program_guard(extractor_main_prog, extractor_startup_prog):\n                extractor_config = parse_config(args.extractor_config)\n                extractor_infer_config = merge_configs(extractor_config, 'infer', vars(args))\n\n                # build model\n                extractor_model = models.get_model(\"TSN\", extractor_infer_config, mode='infer')\n                extractor_model.build_input(use_dataloader=False)\n                extractor_model.build_model()\n                extractor_feeds = extractor_model.feeds()\n                extractor_fetch_list = extractor_model.fetches()\n\n                exe.run(extractor_startup_prog)\n\n                logger.info('load extractor weights from {}'.format(args.extractor_weights))\n                extractor_model.load_test_weights(exe, args.extractor_weights, extractor_main_prog)\n\n                extractor_feeder = fluid.DataFeeder(place=place, feed_list=extractor_feeds)\n        return extractor_main_prog, extractor_fetch_list, extractor_feeder, extractor_scope\n\n    def _predictor(self, args, exe, place):\n        predictor_scope = fluid.Scope()\n        with fluid.scope_guard(predictor_scope):\n            predictor_startup_prog = fluid.default_startup_program()\n            predictor_main_prog = fluid.default_main_program()\n            with fluid.program_guard(predictor_main_prog, predictor_startup_prog):\n                # parse config\n                predictor_config = 
parse_config(args.predictor_config)\n                predictor_infer_config = merge_configs(predictor_config, 'infer', vars(args))\n\n                predictor_model = models.get_model(\"AttentionLSTM\", predictor_infer_config, mode='infer')\n                predictor_model.build_input(use_dataloader=False)\n                predictor_model.build_model()\n                predictor_feeds = predictor_model.feeds()\n                predictor_outputs = predictor_model.outputs()\n\n                exe.run(predictor_startup_prog)\n\n                logger.info('load lstm weights from {}'.format(args.predictor_weights))\n                predictor_model.load_test_weights(exe, args.predictor_weights, predictor_main_prog)\n\n                predictor_feeder = fluid.DataFeeder(place=place, feed_list=predictor_feeds)\n                predictor_fetch_list = predictor_model.fetches()\n        return predictor_main_prog, predictor_fetch_list, predictor_feeder, predictor_scope\n\n    @runnable\n    def run_cmd(self, argsv):\n        args = self.parser.parse_args(argsv)\n        results = self.classify(paths=[args.input_path], use_gpu=args.use_gpu)\n        return results\n\n    def classify(self, paths, use_gpu=False, threshold=0.5, top_k=10):\n        \"\"\"\n        API of Classification.\n\n        Args:\n            paths (list[str]): the path of mp4s.\n            use_gpu (bool): whether to use gpu or not.\n            threshold (float): the result value >= threshold will be returned.\n            top_k (int): the top k result will be returned.\n\n        Returns:\n            results (list[dict]): every dict includes the mp4 file path and prediction.\n        \"\"\"\n        args = self.parser.parse_args([])\n        # config the args in videotag_tsn_lstm\n        args.use_gpu = use_gpu\n        args.filelist = paths\n        args.topk = top_k\n        args.threshold = threshold\n        args.extractor_config = os.path.join(self.directory, 'resource', 'configs', 'tsn.yaml')\n 
       args.predictor_config = os.path.join(self.directory, 'resource', 'configs', 'attention_lstm.yaml')\n        args.extractor_weights = os.path.join(self.directory, 'weights', 'tsn')\n        args.predictor_weights = os.path.join(self.directory, 'weights', 'attention_lstm')\n        args.label_file = os.path.join(self.directory, 'resource', 'label_3396.txt')\n\n        check_cuda(args.use_gpu)\n        check_version()\n\n        if not self._has_load:\n            self.place = fluid.CUDAPlace(0) if args.use_gpu else fluid.CPUPlace()\n            self.exe = fluid.Executor(self.place)\n            self.extractor_main_prog, self.extractor_fetch_list, self.extractor_feeder, self.extractor_scope = self._extractor(\n                args, self.exe, self.place)\n            self.predictor_main_prog, self.predictor_fetch_list, self.predictor_feeder, self.predictor_scope = self._predictor(\n                args, self.exe, self.place)\n            self._has_load = True\n\n        extractor_config = parse_config(args.extractor_config)\n        extractor_infer_config = merge_configs(extractor_config, 'infer', vars(args))\n        extractor_reader = get_reader(\"TSN\", 'infer', extractor_infer_config)\n        feature_list = []\n        file_list = []\n\n        for idx, data in enumerate(extractor_reader()):\n            file_id = [item[-1] for item in data]\n            feed_data = [item[:-1] for item in data]\n            feature_out = self.exe.run(\n                program=self.extractor_main_prog,\n                fetch_list=self.extractor_fetch_list,\n                feed=self.extractor_feeder.feed(feed_data),\n                scope=self.extractor_scope)\n            feature_list.append(feature_out)\n            file_list.append(file_id)\n            logger.info('========[Stage 1 Sample {} ] Tsn feature extractor finished======'.format(idx))\n\n        # get AttentionLSTM input from Tsn output\n        num_frames = 300\n        predictor_feed_list = []\n        for i 
in range(len(feature_list)):\n            feature_out = feature_list[i]\n            extractor_feature = feature_out[0]\n            predictor_feed_data = [[extractor_feature[0].astype(float)[0:num_frames, :]]]\n            predictor_feed_list.append((predictor_feed_data, file_list[i]))\n\n        metrics_config = parse_config(args.predictor_config)\n        metrics_config['MODEL']['topk'] = args.topk\n        metrics_config['MODEL']['threshold'] = args.threshold\n        predictor_metrics = get_metrics(\"AttentionLSTM\".upper(), 'infer', metrics_config)\n        predictor_metrics.reset()\n        for idx, data in enumerate(predictor_feed_list):\n            file_id = data[1]\n            predictor_feed_data = data[0]\n            final_outs = self.exe.run(\n                program=self.predictor_main_prog,\n                fetch_list=self.predictor_fetch_list,\n                feed=self.predictor_feeder.feed(predictor_feed_data, ),\n                scope=self.predictor_scope)\n            logger.info('=======[Stage 2 Sample {} ] AttentionLSTM predict finished========'.format(idx))\n            final_result_list = [item for item in final_outs] + [file_id]\n\n            predictor_metrics.accumulate(final_result_list)\n        results = predictor_metrics.finalize_and_log_out(label_file=args.label_file)\n        return results\n\n\nif __name__ == '__main__':\n    test_module = VideoTag()\n    print(test_module.run_cmd(argsv=['--input_path', \"1.mp4\", '--use_gpu', str(False)]))\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/__init__.py",
    "content": ""
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/configs/attention_lstm.yaml",
    "content": "MODEL:\n    name: \"AttentionLSTM\"\n    dataset: \"YouTube-8M\"\n    bone_nework: None\n    drop_rate: 0.5\n    feature_num: 2\n    feature_names: ['rgb']\n    feature_dims: [2048]\n    embedding_size: 1024\n    lstm_size: 512\n    num_classes: 3396\n    topk: 10\n\nINFER:\n    batch_size: 1\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/configs/tsn.yaml",
    "content": "MODEL:\n    name: \"TSN\"\n    format: \"mp4\"\n    num_classes: 400\n    seg_num: 3\n    seglen: 1\n    image_mean: [0.485, 0.456, 0.406]\n    image_std: [0.229, 0.224, 0.225]\n    num_layers: 50\n    topk: 5\n\nINFER:\n    seg_num: 300\n    short_size: 256\n    target_size: 224\n    num_reader_threads: 12\n    buf_size: 1024\n    batch_size: 1\n    kinetics_labels: \"./data/kinetics_labels.json\"\n    filelist: \"./data/tsn.list\"\n    video_path: \"./data/mp4/1.mp4\"\n    single_file: True\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/label_3396.txt",
    "content": "胶合板\n坠楼\n空手道\n弹奏\n直升机\n罗盘\n健身\n羽毛球拍\n龙与地下城\n漆\n混合器\n学生\n安全气囊\n法庭\n游泳池\n潜艇\n穆斯林头巾\n奇葩\n绞狐大冒险\n飞行器\n演出\n喷枪\n萝莉\n暗黑血统\n彭彭丁满历险记\n出生\n嫩模\n流星雨\n超市\nStepMania\n自动扶梯\n讲座\n缝纫机\n自助餐\n衣服\n翼装飞行\n手语\n可爱颂\n复合弓\n列车\n欧洲模拟卡车\n吃豆人\n队长\n僵尸\n猩红\n战争片\n通关攻略\n横梁\n机场\n引体向上\n暴力片\n橱柜\n卡车\n美人\n仙境传说\n格斗\n奇趣蛋\n健美\n新能源\n佳能\n电视\n喊麦\n信件\n双胞胎\n膳食补充剂\n胸部\n碟子\n女排\n地铁：最后的曙光\n牛肉\n激光照明\n毛巾\n面包店\n时空之轮\n泰迪\n吉他\n绿茶\n自驾游\n签名会\n酱\n抽屉\n山火\nT台\n喝醉\n马桶\n巴松管\n皇帝\n沙丘\n主播\n炖汤\n糖\n球球大作战\n彩票\n中暑\n雷达\n独木舟\n星座\n弓箭\n跑车\n大豆\n妖怪\n激光\n中秋节\n风景\n橡皮筋\n固体\n音乐会\n幽灵\n救生员\n彩虹\n政治\n眼线\n柴\n医疗\n购物中心\n舰载机\n空战\n服装\n钢模\n拖鞋\n教室\n羽毛球\n烤肉\n煎饼\n金星\n火箭\n婴儿车\n黑暗之魂\n夏目友人帐\n图像处理\n恐龙\n柔术\n剪刀\n冒险任务世界\n冰雹\n木工刨\n白金汉宫\n可丽饼\n绅士\n盖瑞模组\n滑板\n游戏网站\n套房\n动作教学\nDOTA\n海盗传说\n小马慢跑\n怪物中学\n快闪\n冠军\n手风琴\n工具\n进击的巨人\n怀孕\n停车场\n舌钉\n自行车运动\n飞檐走壁\n滑雪板\n保健\n大蒜\n门\n咏春\n热火吉他手\n筷子\n饮料罐\n拳无虚发\n糗事\n豆腐\n动物园大亨\n佛兰肯斯坦\n动漫\n机长\n脱发\n石英\n医生\n母婴\n数码\n螳螂\n加仑\n核电站\n老鹰\n哑铃\n成语故事\n情景剧\n小提琴\n熊猫\n泥石流\n贴花\n合唱团\n质量效应\n东京食尸鬼\n流行音乐\n犁\n帆\n监拍\n城市\n液氮\n扳手\n卫星\n跳伞\n三维\n美味\n特种部队\n名模\n手帕\n瀑布\n教师\n风铃\n爱丽丝梦游仙境\n风光\n通用电气公司\n逗比\n豹子\n石油\n仙乐传说\n晴天\n皮革\n露台·天井\n实验室\n口琴\n驾车\n枕头\n鸡\n遥控器\n铁路运输\n瓦片\n原子弹\n偶像剧\n闯关\n西游记\n吉他音箱\n车速表\n甜品\n电源供应器\n人行道\n疲劳驾驶\n房车\n量子\n民工\n薄暮传说\n节日\n连连看\n遥控\n科学探索\n银河\n雨水沟\n小丑\n建造\n鹅\n地毯\n赛车俱乐部\n超级飞侠\n美女与野兽\n克兰娜德\n中央处理器\n儿童故事\n口罩\n警匪片\n美女直播\n海洋\n睡衣\n忍者\n烧伤\n裙子\n剪影\n生活大爆炸\n麦田怪圈\n勺子\n狮子王\n床戏\n导管\n冰雪奇缘\n彩泥\n货物\n驼铃\n牙膏\n高铁\n古风\n新娘\n深空传说\n鹰\n鹿\n铲车\n星际战甲\n怪物猎人\n转蛋\n香奈儿\n醉驾\n坦克世界\n新能源汽车\n幻想传奇\n纺织品\n超级英雄\n谍战片\n起重机\n钥匙·按键\n苹果商店\n河粉\n名侦探柯南\n蜂窝\n演唱会\n喷泉\n比基尼\n面粉\n日本食玩\n王子\n画画\n激情戏\n中国队\n帆船\n电商\n消防员\n美腿\n侏罗纪\n吃饭\n锯木机\n烤面包机\n土星\n珠子\n大头儿子\n穴位\n旅客\n演员\n短信\n擂台\n东方永夜抄\n龙之谷\n马路\n袜子\n神秘岛\n勋章\n斑马\n攻壳特工队\n激流回旋\n路易鬼屋\n飞盘\n汽车\n走秀\n异度之刃\n奥利奥\n相声\n房屋\n三国无双\n猫和老鼠\n高校\n鬼片\n维修\n巢\n煎蛋\n哪吒\n排球\n人体穿孔\n核武器\n明星\n水底\n水库\n海军陆战队\n景区\n陀螺战士\n战斗公主西娜\n教学\n火花塞\n收费站\n风力\n马里奥派对\n操作系统\n灼眼的夏娜\n古罗马\n哈士奇\n气象\n神魔之塔\n锁定：现代空战\n球接头娃娃\n神鬼寓言\n幽灵战车\n战争前线\n骡子\n出游\n早餐\n华为\n房间\n现代片\n海报\n游戏王\n咳嗽\n金丝雀\n音乐剧\n根\n灯泡\n星界边境\n视频教学\n剥\n钢铁\n星之卡比\n试驾\n车技\n剑\n树\n茄子\n轨道\n坠毁\n面团\n玩具屋\n拳击\n音乐中心\n行李\n长江\n花絮\n纯情罗曼史\n地精\n铁铲\n公园\n杠铃\n旅游团\n特斯拉线圈\
n喷染术\n电子书\n猪猪侠\n骆驼\n假人挑战\n推杆\n图书馆\n洗澡\n耀西之岛\n武装突袭\n幼儿园\n印刷电路板\n头盔式相机\n金字塔\n双簧管\n养老院\n黎明杀机\n复活节兔子\n马棚\n枪杀\n二维码\n击杀\n刷子\n古筝\n财经\n武术\n影视周边\n游览车\n鳄鱼\n开箱\n水晶\n街头霸王\n恐怖袭击\n过生日\n陶瓷\n健身球\n慢镜头\n贝斯\n异形附身\n风扇\n时装秀\n海底\n奔驰小哥\n弹弓\n生化奇兵\n俱乐部\n人字拖\n推土机\n钞票\n救人\n派对\n土豆\n宿舍\n玉米\n乐动魔方\n国产剧\n柚子\n模子\n细菌\n背包\n婚礼\n菠菜\n遛狗\n东方红魔乡\n山口\n驴友\n偶像大师\n噬神者\n假面骑士\n瑞奇与叮当\n新郎\n坦克在线\n网吧\n酵母\n车手\n枪击\n杂志封面\n孩之宝\n猎人\n夜市\n黑岩射手\n王座\n雕塑粘土\n同人志\n浪客剑心\n车票\n重生娃娃\n驱逐舰\n反叛的鲁路修\n领带\n死亡空间\n幽默\n障碍技巧\n运输机\n铙钹\n条码\n采石场\n排骨\n壁橱\n高尔夫球\n恐怖主义\n圆号\n悠悠球\n科技奇趣\n陶轮\n石头\n枪战\n纸板\n斯诺克\n荒野大镖客\n吉祥物\n满月\n野蛮人柯南\n家电\n电子竞技\n但丁地狱\n天花板\n披萨\n车辆\n巨人\n风车\n高速公路\n婚房\n蛤蜊\n抢救\n兔子\n航展\n火山\n发动机\n装载机\n皮艇\n梳子\n维秘\n星际火狐\n嫦娥\n沼泽\n舞曲\n炒鸡蛋\n心灵杀手\n怪物\n中国风\n理发师\n悬崖\n铅笔\n博士\n海豚\n芥末\n磨刀\n卸妆\n黄牌\n魔法门\n飞行\n游泳\n羚羊\n自动售货机\n优惠券\n银行\n打车\n东北二人转\n演讲\n香槟酒\n油罐车\n海豹\n万智牌\n步枪\n造型师\n空间站\n大风\n鼻子\n外卖\nX战警\n田径\n外星人\n木材\n速度生活\n豪车\n鬼魂\n手榴弹\n海底隧道\n表演者\n木琴\n月饼\n活页乐谱\n红牛\n天才\n南瓜饼\n鸟\n离合器\n精灵复兴\n击倒\n农产品\n轰炸\n商家\n美貌\n狗粮\n绞盘\n虚构人物\n冰川\n怒之铁拳\n车祸\n星火\n陆战队\n太阳\n大学\n录音机\n全职猎人\n内衣\n赛车总动员\n同学会\n四重奏\n桨\n驾驶员\n健身房\n瓷器\n抢劫\n爆米花\n绿色\n蕾丝\n黑熊\n公主抱\n刀剑神域\n馒头\n圣诞礼物\n墙壁\n幼儿\n信用卡\n刀\n狂飙旧金山\n日历\n新生\n婚戒\n雪\n雨\n竹子\n美人鱼\n音乐键盘\n娃娃\n键盘\n动力火车\n骑兵·装甲兵\n立交桥\n散步\n成就\n荣誉勋章\n助攻\n沙滩\n蚯蚓\n动物\n汽车越野赛\n项链\n啤酒\n女装\n和尚\n乳清蛋白\n圣诞树\n手绘\n投篮\n大麦\n光头强\n工作会议\n苍蝇\n宝藏\n射击游戏\n粉笔\n杏仁\n碗\n神舟\n胭脂\n惊天动地\n马\n封面\n小学\n物联网\n沙子\n录音棚\n挖土机\n穿衣\n飞机\n大盘\n内涝\n恶魔\n鳄梨\n飞驰竞速\n西兰花\n实验\n录影机\n气球塔防\n跑酷\n交警\n熊\n桔梗\n解放军\n活动房屋\n相机\n数学\n特斯拉\n太空堡垒\n宅男女神\n安卓\n冰块\n鸡舍\n美妙天堂\n化石\n超时空要塞\n数字\n网球\n神秘海域\n艺考\n艺术节\n编织\n打字\n明星派\n二十一点\n护栏\n大海\n极光\n舞力全开\n广场\n神庙逃亡\n纽扣\n时装周\n西葫芦\n炊具和烤盘\n星巴克\n油炸\n划船\n创世纪\n摩托车越野赛\n星星\n金刚\n弹球\n美女\n三明治\n工艺\n冒险\n垃圾桶\n极限竞速\n加菲猫\n宝宝辅食\n首饰\n场地赛\n球\n幻想水浒\n生活剧\n希曼\n插图\n潜水\n秃鹫\n诺亚方舟\n少女\n比武\n糖果粉碎传奇\n拳皇\n墨水\n校园暴力\n引擎\n脱口秀\n路由·伐木\n牡蛎\n漂移\n熊出没\n校车\n牧羊人\n功夫\n植物大战僵尸\n朗诵\n娇妻\n镜框·画框\n百叶窗\n客流\n咖啡\n塑像\n生物学\n手电筒\n机器\n座位\n沙包·沙袋\n森林\n乐高主题公园\n视频制作\n充电器\n犬夜叉\n超级粉碎兄弟\n交通安全\n躲猫猫\n翼\n粘土动画\n山羊\n海王星\n导弹\n街头表演\n水獭\n访谈节目\n石榴\n讲解教学\n拥堵\n变形\n电饭煲\n星际公民\n猿\n头\n丝路传说\n极品飞车\n皮卡丘\n拍照\n化油器\n肥料\n鲨鱼\n星云\n冬奥会\n模拟器\nCD机\n中国梦\n捕食\n泰坦陨落\n白宫\n饺子\n光环\n火鸡\n男装\n火爆狂飙\n推钱机\n命令与征服\
n大金刚国度\n古琴\n食堂\n消防站\n愤怒的小鸟\n护士\n母亲\n暗杀\n美妙旋律\n芦笋\n荷花\n弓猎\n超车\n松下\n宙斯\n生活记录\n公路\n模拟合成器\n时尚\n宾馆\n难民\n立体声扬声器\n旋转\n杯子\n模型\n坦克\n生食\n波西杰克逊\n气球\n峡谷\n锁\n粉蜡笔画\n铅笔盒\n收藏\n激光笔\n智能家居\n翻筋斗\n烤面包\n生化危机\n演奏\n百货公司\n屁股\n锯\n车站\n瓜\n极速前进\n篮子\n蹦极\n纸片马里奥\n秦时明月\n全面战争\n游乐园\n最终幻想\n水手\n水上乐园\n尾巴\n鸡蛋\n相声演员\n坚果\n硬盘驱动器\n吃货\n望远镜\n夹克\n僧侣\n山洪\n打斗\n仓库\n独奏\n毁灭战士\n牵手\n普乐路路轨道\n天鹅\n旅行社\n柔道\n景观\n古墓丽影\n蓝龙\n甜美\n拍手\n酒店\n膝盖\n歌曲\n滑翔伞\n小马宝莉\n修道院\n滑板公园\n旅馆\n云朵\n麦片\n灾区\n水槽\n卧室\n避暑\n小熊维尼\n棒球帽\n拖车\n四大名助\n铜管乐器\n沙画\n外太空\n模拟人生\n健身教练\n数字电子\n公寓\n乐迪\n枪战片\n便秘\n姑娘\n大宅门\n猪蹄\n山峰\n三国志大战\n灯\n锅炉\n火\n气球造型\n面部\n光标\n动作片\n上网本\n汽艇\n棉花\n雪橇\n热泵\n装修\n记者\n女警\n恐怖\n龙\n夜景\n民警\n算命\n手里剑\n夜晚\n笑傲江湖\n精灵\n炮弹\n表情包\n刮刮卡\n三轮车\n护目镜\n墙纸\n洗头\n红包\n星系\n运动鞋\n菌类\n冰\n拔牙\n腿\n肿瘤\n先锋\n开心农场\n迪士尼\n山体滑坡\n表格\n文物\n眉毛\n刷牙\n绝命毒师\n电子宠物\n咖啡机\n流苏花边\n素描\n超级跑跑\n搏击\n司机\n卡通\n灰姑娘\n晨练\n记号笔\n心脏\n大提琴\n卫生巾\n受灾\n任天堂\n珠宝\n英雄连\n溜冰场\n青岛大姨\n大灰熊\n骑车\n基督\n道具\n料理\n甜菜根\n鱼饵\n车床\n反曲弓\n影视\n网络直播\n车库\n波斯王子\n船厂\n捕食者\n青铜\n橄榄\n污点·着色\n咖啡屋\n水稻\n改装车\n小正太\n烧烤\n卡布奇诺\n蝴蝶结\n桥梁\n邮件\n数码宝贝\n手臂\n炉子\n学校\n霸王龙\n山\n客车\n焊接\n小车\n分裂细胞\n管道\n爱情剧\n摇滚名人堂\n游行\n完美世界\n开枪\n微波炉\n中学\n东方大头条\n香菇\n虾\n双眼皮\n椅子\n格雷少年\n相亲节目\n称重秤\n香精油\n小路\n压力清洗\n木头\n水彩画\n土豆泥\n电脑\n方舟\n乐高好友\n球体\n冷空气\n大闸蟹\n帽子\n涂料\n手提包\n战争\n水球\n汤\n西红柿\n唇妆\n商铺\n王者之剑\n腕表\n藤蔓\n钱包\n刀工\n平衡车\n奥斯卡金像奖\n抗日剧\n导游\n行星边际\n泡沫\n任务栏\n中药\n死侍\n小小大星球\n自行车\n签名\n胸肌\n太极\n儿童安全座椅\n口哨\n罗技\n休闲\n汉堡\n德军司令部\n变压器\n考拉\n动物之森\n手势\n竖琴\n椰子\n大炮\n医保\n杂技\n电影摄像机\n表演艺术\n话剧\n工作室\n黄河\n吸毒\n黄油\n无限试驾\n高空\n冬天\n酒\n洞穴\n甘薯\n流星体\n手表\n救护车\n金牌\n麦迪逊广场花园\n特技演员\n饼干\n垃圾车\n服装搭配\n出租车\n暴力\n女王\n盗墓\n手提箱\n丝巾\n化学反应\n海贼王\n淋浴\n选秀\n成型\n童话故事\n麦克风\n黑客\n无尽传说\n羊\n狙击手\n小轮车\n夺宝奇兵\n美食\n食品\n肥皂泡\n骑牛\n辫子\n重型设备\n战队\n制服诱惑\n法官\n蝎子\n小屋\n酒精灯\n青鬼\n马赛克\n南方公园\n无人机\n调酒师\n万万没想到\n粉底\n捕鱼\n初音未来\n毒贩\n矮人\n好莱坞\n六孔哨\n棺材\n猜拳\n潜水服\n搞笑\n火星\n盗窃\nDJ\n沐浴类产品\n长颈鹿\n整蛊\n围攻\n教堂\n黑带\n浮桥\n单眼皮\n陷\n软件\n过山车大亨\n围巾\n幸存者\n情感剧\n洗剂\n拆除\n星际迷航\n浮子\n雪地\n安保\n黄金眼\n追尾\n岩石\n电视广告\n行窃\n会计\n鸭子\nVR显示器\n莱克斯卢瑟\n反恐精英\n蒸汽机\n球场\n游戏动漫\n玉米卷\n漫威传奇\n腾讯\n亚洲\n卫生间\n吸烟\n战争机器\n青蛙\n喜羊羊与灰太狼\n飞艇\n猎犬\n招式\n拉伸\n连帽衫\n欧美音乐\n恶魔岛\n拳击之夜\n车\n大型强子对撞机\n舰艇\n枫之谷\n真功夫\n轴\n飞碟\n生物\n魔兽争霸\n欧巴\n平底锅\n石膏\n钢琴\n海关\n剪纸\n坐垫\n镜子\n夏
令营\n战争之人\n简历\n彩排\n船\n真空管\n邮轮\n法制节目\n皇室战争\n小龙斯派罗\n博览会\n舞蹈革命\n生活\n圣诞贺卡\n拥抱\n飞飞全明星\n驾考\n卫生纸\n上市\n果酱\n儿子\n教会\n艺术团\n刷卡\n信封\n军阀\n军队\n黑塔利亚\n玉米饼\n滑雪\n猕猴桃\n提拉米苏\n航天\n芭蕾\n狮子\n跑步机\n杀出重围\n忍者龙剑传\n碰撞\n使命召唤\n自拍\n火柴\n火车站\n枫树\n咖啡师\n解说\n狒狒\n终极格斗冠军\n魔法禁书目录\n消防车\n极限运动\n电脑机箱\n兵\n家畜\n墨镜\n演技派\n大长腿\n功夫片\n梯子\n夏日\n排箫\n法师\n急救\n福尔摩斯\n农场\n发型\n决战之夜\n太子妃\n华夫饼\n刺猬索尼克\n赌博\n磨砂机\n办公室\n器官\n毕业\n军训\n带子\n治愈\n船长\n砂浆\n最游记\n绿野仙踪\n炉石传说\n数字录像机\n清洁\n喷气艇\n刺猬\n恒温器\n透视装\n黑执事\n基金\n守望者\nATM取款机\n干墙\n曲棍球\n双节棍\n明胶\n锤子\n婚宴\n街道\n甜饼怪\n上帝模式\n狂神国度\n烈火战车\n麻将\nX音素\n液压机\n水杯\n扭曲\n魔界战记\n车评\n独角兽\n特种兵\n诱饵\n活动\n面具\n九阴真经\n实况足球\n护肤品\n游戏工作室\n榴莲\n马戏团\n原油\n蚁类\n分娩\n钓鱼\n游戏手柄\n影评\n虚幻竞技场\n神枪手\n架线工\n无线遥控飞机\n轮滑\n排气系统\n水管\n电源\n星之海洋\n摄像机\n纪录片\n优雅\n闺蜜\n曼妥思\n作曲家\n锡罐\n骑行\n快递\n电影节\n车队\n犀牛\n肌肉\n纽约时代广场\n敌人\n英雄\n八路\n纹身\n留声机唱片\n家常菜\n影视原声\n撞车\n达人秀\n古玩\n吊坠手链\n旅游\n录节目\n竞技\n黄梅戏\n村民\n昆虫\n旅行车\n草原\n毛衣\n叉车\n决斗大师\n灌木\n手工\n神之浩劫\n广场舞\n工厂\n练习室\n智能硬件\n龙珠\n龙梦幻境\n模仿\n枪支\n加速处理单元\n皮卡\n踏板车\n卡丁车\n歹徒\n跳跃\n大屠杀\n阀\n霍比特人\n煤矿\n遥控车\n女仆\n眼镜\n遇难者\n足球\n英雄工厂\n种族\n武打\n皇牌空战\n曲奇饼\n蜡像\n衬衫\n平衡木\n火灾\n水果蜜饯\n孔雀\n头文字D\n战国\n正手击打\n港台剧\n空中巴士\n部队\n挡风玻璃刮水器\n楼梯\n无人驾驶\n写作\n塑料袋\n灯塔\n徒步旅行\n埃菲尔铁塔\n快餐\n丛林\n怪兽\n灌篮高手\n导航\n台球\n裤子\n包子\n绘图仪\n宠物\n冲浪板\n厕所\n龙虾\n寿司\n海蜇\n赛车游戏\n下午茶\n跨栏\n图像扫描仪\n王者荣耀\n钢琴弹奏\n润肤膏\n真人快打\n橡皮泥\n二胡\n新封印传说\n衣服熨斗\n红烧肉\n除毛\n变脸\n泡菜\n酸奶\n中文\n甘蔗\n拉丁\n萨克斯\n鼓\n炸弹人\n壁炉\n球员\n角斗士\n轮缘\n病毒\n洛基\n科技数码\n梦想俱乐部\n私房菜\n平板\n灯光\n圆筒\n工人\n音乐\n灯具\n探险\n相亲\n传送门\n互联网\n喝\n鼠\n齿轮\n油脂\n旗\n糖霜酥皮\n光学错觉\n数字音频工作站\n击球\n截拳道\n指环王\n高达\n网球王子\n瘦腿\n神秘博士\n自行火炮\n向日葵\n纤维\n电视台\n羊肉\n飞行员\n电车\n按摩\n射箭\n欧洲杯\n戒指\n英雄传说\n棋牌\n魔术\n电动车\n体操\n毁灭公爵\nT恤\n宗教\n豚鼠\n精彩剪辑\n卡拉OK\n护肤\n海盗\n染发\n名人采访\n锐化\n午夜俱乐部\n吃鱼\n飙车\n吸管\n肾脏\n焙烧\n跑步\n紫罗兰\n海岛奇兵\n东京喵喵\n阅兵\n偷窃\n奶茶\n辣条\n特战先锋\n蝙蝠侠\n孤岛危机\n魔法王国\n挖掘机\nU盘\n荧光棒\n图章\n女婴\n光晕\n礼品\n会议\n车展\n电音\n家具\n木雕\n台锯\n终极奇迹\n草坪\n模拟城市\n画眉\n淑女\n酒馆\n唇膏\n手机数码\n橄榄球\n锻造\n水疗\n音悦台\n反导系统\n动感\n第二人生\n星空\n园艺\n稻草人\n无头骑士\n盔甲\n舞会\n蛋\n高空抛物\n无敌浩克\n姜饼\n印刷\n帝国时代\n黄山\n鲁邦三世\n盲人\n蛇\n睡眠\n战舰世界\n蟑螂\n面包车\n缝纫针\n脂肪\n纸模型\n室内装潢\n恐怖分子\n客机\n欧美影视\n便利店\n核弹\n双面人\n厨师\n跑道\n计算机\n灾难片\n飞哥与小佛\n放牧\n文艺演出\n肖像\n红绿灯\n锥体\n喇叭\n赛道狂飙\n全家福\n麻辣烫\n包包\n身体护甲\n航空\n毒品\n天空\n针织\n魔杖\n猪肉\n砖\
n松糕\n圣诞装饰\n轰炸机\n无尽的任务\n摇滚史密斯\n网页\n汽车照明系统\n小镇\n巫师\n月球\n硬汉\n机车\n面食\n手术\n海鲜\n玩具熊的五夜后宫\n巧克力\n手机\nVox\n画法\n莫妮卡的团伙\n大米\n全金属狂潮\n随声听\n旋律\n放生\n操场\n窗户\n恐怖喜剧\n大力水手\n惩罚者\n木工\n悬疑\n长方形\n木片\n电子电路\n查理与巧克力工厂\n不锈钢\n苍翼默示录\n盒子\n耐力赛\n保龄球\n海啸\n舰队收藏\n死亡岛\n歌手\n电话\n感染：幸存者故事\n真人秀\n恶魔城\n五佳球\n机械\n马里奥与路易吉\n饲养员\n滑水\n龙舟\n大理石\n港片\n葫芦娃\n武装分子\n奶油烤菜\n吓人\n斧头\n正义联盟\n超凡双生\n蜜蜂\n游艇\n头骨\n道路\n神奇四侠\n弓道\n呼啦圈\n拍客\n航空母舰\n狂热节拍\n宇宙\n美景\n健身队\n武侠\n武林高手\n测评\n薄樱鬼\n人物专访\n颈椎\n皮带\n少年泰坦\n黑色\n交响乐\n震荡\n火炉\n光盘\n喝水\n守望先锋\n烹饪\n装甲车\n棒球\n网游\n黄蜂\n安全带\n泰坦\n巴掌\n指南\n复活节彩蛋\n餐馆\n樱花\n溜冰鞋\n机甲战士\n耐克\n命运石之门\n装扮\n山水画\n耀斑\n贺卡\n日本团子\n月亮\n黑人\n科普\n钥匙扣\n甜瓜\n垃圾\n美食猎人\n头巾\n无线电遥控船\n骨牌\n单挑\n上古世纪\n覆盆子\n绳子\n海绵\n超模\n香肠\n奇观\n直线加速赛\n菜园\n雨伞\n十二生肖\n奶油\n汽车修理\n大号\n倒霉熊\n音乐节目\n唇彩\n几何冲刺\n视频游戏厅\n射击\n鬼屋\n手套\n驾驶\n青蛙军曹\n鞍\n港口\n彩灯\n广播公司\n摄影\n鞋\n我的世界\n大发\n马甲线\n模式·图案\n干衣机\n机器人战斗\n人工呼吸\n华尔兹\n水族馆\n国庆\n领奖\n巫师之怒\n火影忍者\n马克杯\n战鹰\n年会\n垂钓\n摩天大楼\n炸酱面\n企鹅\n整形\n睫毛\n暴走大事件\n教程\n钢铁侠\n日出\n国家公园\n戏剧\n折纸\n花\n说唱史诗战\n白娘子\n头盔\n威浮球\n热血无赖\n眼球\n香烟\n抗战片\n小鲜肉\n音响\n武功\n场地自行车\n稻田\n真侍魂\n海战英豪\n火焰之纹章\n婚纱摄影\n发布会\n损伤\n下水道\n雕刻\n制服\n延时摄影\n凯蒂猫\n截屏\n奇幻森林\n舞台剧\n雪糕\n飞车手罗德\n我想当爷们\n肉丸\n短号\n炮兵\n孩子\n搞怪\n军事\n对决\n战神\n菜花\n欧冠\n冰壶\n蓝莓\n帐篷\n幸运星\n化妆\n激战\n方便面\n旋转木马\n人物\n磁带\n恐怖片\n梦幻龙族\n牙齿\n海滩\n猛鬼街\n鲸\n唱片公司\n露营\n松饼\n安妮\n百乐门\n圣诞\n扬琴\n棚子\n调解\n发射\n体育\n通心粉\n热可可\n二次元\n迷人\n宇航员\n运钞车\n行车记录仪\n官员\n奥数\n玉米地\n音乐人\n彗星\n颁奖典礼\n表演\n粉丝\n军人\n堂吉诃德\n狙击枪\n减脂\n古装\n游戏机\n饥饿游戏\n撒旦\n邮票\n理发店\n网络主播\n身材火辣\n棒球\n兔八哥\n大巴车\n耳环\n数码产品\n游民星空\n泰拳\n配音秀\n机器人\n盛装舞步\n玩具人\n袋鼠\n酒吧\n蘑菇\n死亡边境\n世界杯\n驾驶舱\n海藻\n乐高\n艺术\n龙之信条\n开关\n武警\n日蚀·月蚀\n手机评测\n诛仙\n行李箱\n恐龙世界\n天宫\n滑板\n青贮饲料\n摄像头\n工程车\n阀门·龙头\n石工\n孤岛惊魂\n胫骨\n砸车\n迷你人形\n超级玛丽\n生活技巧\n武打片\n胡子\n苹果\n橙色\n灾害\n猫\n翅膀\n吵架\n唱诗班\n雷神\n扑克\n史酷比\n魔龙骑士\n人体\n拾音器\n圆圈·循环\n地狱\n运球\n游轮\n疯狂动物城\n战舰\n核反应堆\n雾霾\n版画\n真正的家庭主妇\n海龟\n烘培\n电容器\n核试验\n寒潮\n垂死之光\n橡木\n游乐场\n养生\n杀手\n魔法\n台阶·门廊\n倒塌\n法院\n硬币\n拳击比赛\n弩\n可爱\n笔记本\n花卉设计\n僵尸末日\n闹钟\n调制解调器\n狗窝\n萌妹\n部落战争\n聚会\n乐器\n劫匪\n腹语\n电动工具\n头发\n地下城与勇士\n卡牌\n卡片\n别墅\n地球冒险\n暴风雪\n瑜伽\n海狸\n安检\n绘画\n沙拉\n浴缸\n毛绒玩具\n海狮\n琵琶\n肯得基\n口红\n娱乐\n魔戒\n婴儿\n烫发器\n狂飙\n积水\n机动车\n奖\n椰奶\n芦荟\n刺客\n拖拉机\n蒙娜丽莎\n牛仔\n葡萄酒\n猴子\n潜水员\n盘式制动器\n比赛\n吸尘器\n豌豆\n拍摄现场\n帆布\n
喜剧演员\n蜡笔小新\n香蕉\n全民健身\n牛排\n音响系统\n啦啦队\n街头采访\n视觉小说\n弹唱\n飞车\n装甲核心\n罐头\n哈利波特\n沉香\n举重\n纸\n拼图\n电视频道\n防护\n视频游戏\n家居\n平屋顶\n开车\n航拍\n特技\n杂货店\n拍卖\n薯条\n珍珠\n手指\n柔力球\n美少女战士\n游戏公司\n冰球\n天气预报\n充气船\n爆炒\n机油\n眼泪\n西区故事\n镶嵌\n仪表着陆系统\n鱼\n爆炸\n骑马\n礼服\n植物\n战地\n淘宝\n烟花\n求婚\n饮料\n蹲\n喜剧\n猎天使魔女\n潜行者\n船员\n汽油\n低音炮\n美甲\n无花果\n超级大金刚\n猩猩\n带锯\n国旗\n开幕式\n货运工具\n腹部\n泥潭\n秀逗魔导士\n交通\n小米\n钢琴家\n机票\n肉\n姜黄\n龙腾世纪\n杀戮空间\n婴儿吊带\n拿铁\n僵尸片\n孤儿院\n自爆\n马里奥赛车\n火锅\n冬季运动\n女巫\n大厦\n街头赛车\n快板\n驾校\n秀场\n侠盗猎车手\n杂志拍摄\n乌龟\n蜂蜜\n减肥操\n水上艇筏\n象\n播种\n单词\n偷车\n玻璃贴膜\n俄罗斯方块\n惊悚\n火车头托马斯\n净水器\n电影解说\n画家\n谷类\n机枪\n滑翔翼\n瓶子\n合唱\n超胆侠\n轮盘\n电气布线\n考古\n豆类\n集装箱\n异形\n洗碗机\n割草机\n茶\n计算器\n魔方\n宝莱坞\n辣妹\n军官\n牛人\n后备箱\n海边\n电磁线圈\n印度\n红酒\n食谱\n工地\n特技飞行\n家庭剧\n培乐多\n温泉\n钩针\n宫殿\n时装\n鹦鹉\n棕熊\n运动会\n空姐\n球星卡\n葱油饼\n洛奇\n女团\n老虎机\n记者会\n体育场\n票房\n无冬城\n浣熊\n洗衣服\n菜市场\n寂静岭\n肉汁\n大力士\n鼓棒\n金属加工\n壶铃\n德云社\n国际军事\n驾照\n面条\n手枪\n金条\n泰迪熊\n河马\n洗涤\n阁楼\n爆炸袭击\n桑拿\n踢打\n爱探险的朵拉\n葡萄园\n闪光\n妈妈\n骨头\n钓竿\n颜色\n摩托车头盔\n纱线\n驯鹿\n银魂\n独轮车\n虚拟玩家角色\n圣经\n毛笔字\n电影\n音乐影片\n西餐\n菠萝\n西湖\n清洁剂\n斗牛\n小红帽\n餐巾\n单杠\n地球\n爽肤水\n打印机\n吹风机\n记号笔\n小麦\n螺帽\n乐高都市\n白酒\n显卡\n都市\n画展\n光之美少女\n银行卡\n群星\n穿越火线\n古装剧\n单簧管\n网络\n洪水\n美容\n汤姆猫\n讲故事\n海底世界\n操作杆\n赛车方向盘\n倚天\n球赛\n海岸\n空调\n铁路\n怪物卡车大毁灭\n下巴\n票\n复仇者联盟\n新闻\n雪崩\n彩绘\n狂野飙车\n沙雕\n木偶\n轮椅\n文艺\n家电公司\n海岛\n苹果派\n降龙十八掌\n打结\n素食\n深渊传说\n骑士\n视频解说\n活塞\n小猪佩奇\n直播\n蟋蟀\n乘客\n英雄联盟\n大气污染\n硬石餐厅\n晶体管\n宝石\n奶酪\n图表\n鲜花\n背心\n反恐\n科学家\n种子\n喂食\n爪子\n火线精英\n体育用品\n照片\n军事武器\n直线\n电脑硬件\n开锁\n鼓手\n模型车\n航天器\n屏幕\n花生\n直排轮滑鞋\n军舰\n钻石\n橄榄油\n稻草\n蜡笔\n妆容\n杀手本能\n餐厅\n摔跤\n内裤\n蹦床\n樱兰高校男公关部\n跆拳道\n科幻\n豪宅\n停车\n冰淇淋\n钢盘·平底深锅\n大乱斗\n服装店\n千与千寻\n音标\n吉他英雄\n南瓜\n采访\n小吃\n漫画英雄\n最后生还者\n红薯\n镜之边缘\n燃脂\n葫芦丝\n篮球\n组装\n台球杆\n过滤器\n空翻\n壁画\n闪电\n海域\n红唇\n面试\n吊坠\n武侠剧\n睫毛膏\n香水\n舞蹈室\n资讯\n眼影\n军装\n躺骑车\n白色\n英魂之刃\n魔鬼\n饭团\n琴弦\n冰箱\n通灵王\n公交\n魔法之战\n泳装\n文本\n长号\n羊毛\n古诗\n马克思佩恩\n演习\n陀螺仪\n车牌\n静物写生\n木屋\n米饭\n萝卜\n高尔夫球\n散热器\n直播间\n星球大战\n黄金\n果汁\n疯狂橄榄球\n散打\n犰狳\n爱情故事\n决斗\n电动汽车\n缝纫\n餐饮\n魔兽世界\n设计师\n航班\n麻薯\n以撒的结合\n中提琴\n孢子\n说唱\n死神\n迷宫\n战斗\n警长\n手球\n睡袋\n镲片\n城堡\n性感\n酒精\n生化模式\n湖\n黑暗\n小小世界\n户外休闲\n球技\n同步带\n制动\n剧情片\n球鞋\n清纯\n聚餐\n刺绣\n减肥\n对唱\n睡美人\n儿童\n烤箱\n黄色\n干草\n神灵\n航空公司\n元素周期表\n电影院\n女神转生\n字典\n飞镖\n战锤\n失忆症\n死亡笔记\n亚马逊公司\n虐杀原形\n象棋\n虚
幻引擎\n烧烤架\n奶粉\n悉尼歌剧院\n伐木\n草莓\n爆破\n忍者神龟\n银\n四轮车\n鬼泣\n娱乐八卦\n浴室\n鸡肉\n胡萝卜\n胎儿\n液体\n收割机\n铜\n玩具世界\n一字马\n飞船\n修剪器\n煤炭\n简笔图\n网剧\n小品\n洋葱\n便当\n百事\n蜘蛛\n警车\n马车\n尼姑\n河流\n斗牛士\n染色\n黄瓜\n跳水\n音乐大师课\n蜗牛\n钢笔\n故宫\n公益片\n渔船\n蓝色\n卷发器\n超级快递\n鞭炮\n珊瑚\n实战\n跳绳\n滑冰\n小行星\n翻车\n博物馆\n欧元\n哆啦A梦\n乐乐天使娃娃\n空难\n阴阳师\n辣椒\n青之驱魔师\n鸿雁\nSaGa\n凝胶\n池塘\n节拍器\n亲子节目\n播放机\n打印\n歌迷\n荒野星球\n农业\n地震\n时政\n吴哥窟\n拉面\n音乐节\n甜甜圈\n藤球\n灾难意外\n骑马与砍杀\n柑橘\n不明飞行物\n软管\n相册\n触摸屏\n飞行表演\n圣杯神器\n紫色\n笛子\n存储卡\n鸽赛\n蔬菜\n山地自行车\n哑剧大师\n双簧\n长椅\n松弛熊\n官兵\n巧克力\n动画\n侦探\n溜冰\n拉链\n警察局\n工程师\n分屏\n牧师\n球拍\n馅饼\n马展\n蜡烛\n游戏\n舌头\n增压器\n泰拉瑞亚\n三国\n污染\n管带夹\n丫鬟\n歌剧魅影\n温室\n八卦\n晚会\n多米诺骨牌\n西瓜\n无主之地\n薯片\n降落伞\n家具装饰\n螃蟹\n模拟山羊\n麦当劳\n传感器\n粉扑\n太阳能\n裁判\n保卫萝卜\n地铁\n松鼠\n猫女\n课堂\n木星\n耳机\n耳朵\n医学\n尼尔机械纪元\n驾驶证\n婚车\n砂锅\n死海\n海绵宝宝\n模拟农场\n警官\n调酒\n龙战士\n动车\n老鼠\n辛普森一家\n蜥蜴\n和服\n女生\n影视混剪\n长毛绒\n广告牌\n撒娇\n炒锅\n萌宝\n自然\n指甲油\n灰泥\n火腿\n桌子\n月姬格斗\n塑料\n大脑\n接线盒\n攀岩\n水果忍者\n货币\n秋千\n销售\n卷轴\n化妆品\n包裹\n斑马线\n面包超人\n蛋糕\n肉桂\n寺庙\n书法\n团队套牛\n仙人掌\n餐饮\n火箭炮\n视频直播\n鬼娃回魂\n画线骑士\n宜家\n春晚\n步行\n日落\n袋子\n击剑\n理发\n地下室\n斗地主\n打针\n喝酒\n喷漆\n柯南时代\n锦鲤\n凝乳\n杀戮地带\n恶霸鲁尼\n奖牌\n猫头鹰\n赛道\n战士\n美照\n购物\n蝴蝶\n字母表\n客厅\n乌鸦\n唢呐\n反串\n潘多拉\n监控\n烤鸭\n明星大乱斗\n葡萄\n飓风\n病人\n吊车\n蝙蝠\n伪装\n益智玩具\n舞蹈\n合金装备\n跳楼\n勇者斗恶龙\n油\n网站\n厨师机\n凯恩的遗产\n钱\n食材\n外交部\n酒厂\n显示器\n主持\n羽绒服\n牛仔布\n车模\n盐\n芝麻\n痘痘\n股票\n微笑\n菜单\n地板\n烤鸡\n自动唱机\n雪貂\n涡轮\n扎染\n歌剧\n变形金刚\n失火\n门票\n雪山\n风筝\n长袍·礼服\n书柜\n家庭教师\n死亡之屋\nDarkOrbit\n粮食\n公益活动\n藏獒\n渔民\n下一站巨星\n彩虹手环\n苦瓜\n冲浪\n卷心菜\n珠饰\n西贡小姐\n地铁酷跑\n训练营\n运输\n磁铁\n健康\n床垫\n摇摆\n街头恶搞\n糕点\n拳王\n肋骨\n猫\n曲艺\n加油站\n凉宫春日\n妖怪手表\n动力伞\n墓地\n工程\n民房\n胶片\n色带\n主教\n樱桃小丸子\n鸡翅\n轮子\n牛\n邻里\n萌\n音乐制作\n洛克人\n芒果\n地图\n劈木机\n勇士\n火锅店\n电梯\n吻\n弹球盘\n三角形\n粘土\n鸡尾酒\n慈善\n天天酷跑\n唱片骑师\n结婚\n家庭\n手机壳\n航线\n职业摔跤\n肥皂\n竞技场\n丧钟\n摩天轮\n天使\n台面\n外汇市场\n肉搏\n求生之路\n铜牌\n泡面\n流亡黯道\n灯笼\n谜题\n婴儿室\n捕猎\n尿布袋\n鱼鹰\n雪犁\n方块世界\n斑鸠\n建筑\n电视剧\n堆肥\n细胞\n邪恶力量\n零食\n湾岸竞速\n太鼓达人\n赛车\n金枪鱼\n司令\n皮肤\n马拉松\n末日\n垒球\n涂鸦\n充气城堡\n十字架\n食疗\n早教\n速叠杯\n纸牌\n披肩\n躲避球\n柠檬\n打牌\n抗战\n绕口令\n美容院\n惠普\n情感节目\n永恒之塔\n电脑鼠标\n虚拟现实\n特警\n吊床\n货车\n飞绑\n可乐\n运动\n双重国度\n多功能工具\n妹子\n农村\n眼睛\n干冰\n果冻\n相声小品\n电线杆\n战友\n影视配音\n孤岛生存大乱斗\n奥运\n沃尔玛\n太空\n星际之门\n装饰\n灰色\n樱桃\n电锯\n手铃\n科幻片\n身份证\n古墓\n乒乓\n溪流\n手链\n野外生存\n天线\n玻璃\n营地\n庆典\n
玩具\n袭击事件\n美术\n橡皮\n加农\n镜头\n探测器\n洗发精\n彩虹岛\n武器\n装置艺术\n葱\n护理\n命运\n仓鼠\n碎石\n青蛙科密特\n螺旋桨\n七日杀\n整容\n行星\n小宝宝\n科技\n台风\n勇者前线\n皇家国教骑士团\n狂欢节\n热狗\n捉迷藏\n弦乐琴\n叶子\n床\n彼得潘\n写真\n托儿所\n设备\n冰桶挑战\n萌物\n变色龙\n花瓣\n伴郎\n打戏\n画报\n罪恶装备\n漫画\n瘫痪\n飞机失事\n奇闻趣事\n大选\n花瓶\n钢之炼金术师\n杂志\n鼠型车\n教育\n旺达与巨像\n插花\n城堡破坏者\n泵\n混音带\n字体\n超人\n倒计时\n恶作剧\n鹌鹑\n吸血鬼\n小朋友\n颤音琴\n符号\n调音台\n梦幻之星\n橘子\n奶昔\n面糊\n冬不拉\n北斗神拳\n越野\n灭火器\n水果\n婚纱\n上古卷轴\n007\n暮光之城\n蜘蛛侠\n冰沙\n下坡\n毡\n警察\n超市特工\n外套\n汉服\n女童\n筏流\n花园\n布丁\n花圈\n生菜\n新年\n清雪机\n气雾喷雾器\n暮蝉悲鸣时\n公主\n显微镜\n秋天\n模特\n收藏品\n咖喱\n空气净化器\n漫威宇宙\n混凝土\n育儿\n电子琴\n遮瑕膏\n火车\n芭比娃娃\n爵士\n音箱\n黑洞\n积木\n剑球\n奶爸\n监管\n美国队长\n爆笑\n闪电\n降世神通\n祷告\n家禽\n穿越时空\n分裂\n轮胎\n水坝\n索尼\n战斗机\n恶搞路人\n拍戏\n电池\n爆胎\n光棍\n俯卧撑\n摩斯\n饮用水\n狂热\n阅读器\n训练\n奥特曼\n王国之心\n学车\n快递员\n住宅\n袋狼大冒险\n悟空\n面包\n雷曼疯狂兔子\n杀手\n赛马\n啄木鸟伍迪\n国务院\n拖把\n壁虎\n铁拳\n高跟鞋\n动物园\n唱片\n金鹰节\n棒球公园\n宠物小精灵\n手游\n部落冲突\n兽人\n魔术师\n谷仓\n圣剑传说\n商场\n起火\n内饰\n暴龙\n鲸\n上课\n油画\n剧本\n武士\n村庄\n脖子\n卷饼\n蚊子\n狩猎\n保健品\n红毯\n总统\n塔罗牌\n偶像活动\n涂层\n合金弹头\n黑白\n沙漠\n白头鹰\n芝士\n宅男\n战利品\n军营\n围棋\n洗衣店\n教育部\n模糊\n国画\n菲比娃娃\n雕塑\n施工\n书呆子\n冬季\nF-Zero\n核桃\n狱警\n游戏人物\n旗袍\n笑话\n衣柜\n综艺\n迫击炮\n梨\n圣斗士\n媒体\n辩论\n健美操\n速降\n男团\n杀人\n圣诞老人\n圆顶\n海豚音\n特技表演\n耙\n探索\n僵尸围城\n银河战士\n长城\n雪人\n作画\n狼\n星际争霸\n立方体\n武装·装备\n被子\n自行车赛\n吃东西\n金属\n交易\n铲屎官\n培根\n档案\n飞去来器\n歌舞表演\n报纸\n仙女\n舞蹈中心\n亚瑟王传奇\n浏览器\n钟\n狗\n露营车\n艺术品\n洗衣机\n睡姿\n打野\n西装\n管风琴\n半机械人\nU型场地\n光\n鸽子\n窗帘\n练习生\n刺客信条\n黑道圣徒\n农民\n煤气灶\n播放器\n塞尔达传说\n消防\n黄铜\n胶带\n挡泥板\n越战越勇\n糖浆\n武装部队\n录像带\n倒车\n牛奶\n冰棍\n阳台\n饮品\n番茄\n灵异事件\n屋顶\n角色扮演\n大富翁\n饿狼传说\n玫瑰\n猪\n海马\n防汛抗洪\n水井\n书\n土地\n村长\n权力的游戏\n东方妖妖梦\n半条命\n国家队\n木瓜\n绿箭\n滑翔\n视频艺术\n人猿泰山\n国防部\n报警装置\n吉尼斯\n厢型布景\n突袭\n狐狸\n倒立\n搅拌机\n腹肌\n飙酷车神\n电子键盘\n惩罚\n失落的星球\n乐队\n丝绸\n冲突\n豆芽\n交通工具\n滑翔机\n亲子\n拳击手\n少儿\n厨房\n花栗鼠\n楼市\n卡通城\n夜店\n洗车\n广告\n饭店\n合气道\n雪地车\n留声机\n全民枪战\n毛皮\n迷你四驱车\n钻头\n生活常识\n少林\n校园\n拔河\n事故\n菊花\n小蛮腰\n过山车\n鸡腿\n暗黑破坏神\n炸鸡\n排版\n拼贴画\n制造业\n艺人\n选美\n猛兽\n英语\n手\n酥皮\n运动员\n卡士达酱\n内衣秀\n护照\n民航\n土匪\n监狱\n靴子\n积雪草\n沙发\n加勒比海盗\n咱们穿越吧\n极度恐慌\n拉力赛\n背部\n伴娘\n投影机\n面膜\n水\n玉·翡翠\n易拉罐\n度假村\n益智\n吻戏\n丈夫\n吊扇\n模具\n水泥\n火柴人\n公安部\n泥土\n地铁站\n打火机\n小小宠物店\n橙子\n子弹\n猴子岛\n闪电十一人\n雪碧\n指甲\n摩托车\n摄影师\n角色\n电人\n老虎\n音乐合奏\n塑料瓶\n发带\n标签·商标\n肉排\n桃子\n指板\n狼人\n分解动作\n读书\n志愿者\n灵魂能力\n星际
宝贝\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/__init__.py",
    "content": "from .metrics_util import get_metrics\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/metrics_util.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nfrom __future__ import absolute_import\nfrom __future__ import unicode_literals\nfrom __future__ import print_function\nfrom __future__ import division\n\nimport io\nimport logging\n\nimport numpy as np\n\nfrom videotag_tsn_lstm.resource.metrics.youtube8m import eval_util as youtube8m_metrics\n\nlogger = logging.getLogger(__name__)\n\n\nclass Metrics(object):\n    def __init__(self, name, mode, metrics_args):\n        \"\"\"Not implemented\"\"\"\n        pass\n\n    def calculate_and_log_out(self, fetch_list, info=''):\n        \"\"\"Not implemented\"\"\"\n        pass\n\n    def accumulate(self, fetch_list, info=''):\n        \"\"\"Not implemented\"\"\"\n        pass\n\n    def finalize_and_log_out(self, info='', savedir='./'):\n        \"\"\"Not implemented\"\"\"\n        pass\n\n    def reset(self):\n        \"\"\"Not implemented\"\"\"\n        pass\n\n\nclass Youtube8mMetrics(Metrics):\n    def __init__(self, name, mode, metrics_args):\n        self.name = name\n        self.mode = mode\n        self.num_classes = metrics_args['MODEL']['num_classes']\n        self.topk = metrics_args['MODEL']['topk']\n        self.threshold = metrics_args['MODEL']['threshold']\n\n        self.calculator = youtube8m_metrics.EvaluationMetrics(self.num_classes, self.topk)\n        if self.mode == 'infer':\n            self.infer_results = []\n\n    def 
calculate_and_log_out(self, fetch_list, info=''):\n        loss = np.mean(np.array(fetch_list[0]))\n        pred = np.array(fetch_list[1])\n        label = np.array(fetch_list[2])\n        hit_at_one = youtube8m_metrics.calculate_hit_at_one(pred, label)\n        perr = youtube8m_metrics.calculate_precision_at_equal_recall_rate(pred, label)\n        gap = youtube8m_metrics.calculate_gap(pred, label)\n        logger.info(info + ' , loss = {0}, Hit@1 = {1}, PERR = {2}, GAP = {3}'.format(\\\n                     '%.6f' % loss, '%.2f' % hit_at_one, '%.2f' % perr, '%.2f' % gap))\n\n    def accumulate(self, fetch_list, info=''):\n        if self.mode == 'infer':\n            predictions = np.array(fetch_list[0])\n            video_id = fetch_list[1]\n            for i in range(len(predictions)):\n                topk_inds = predictions[i].argsort()[0 - self.topk:]\n                topk_inds = topk_inds[::-1]\n                preds = predictions[i][topk_inds]\n                self.infer_results.append((video_id[i], topk_inds.tolist(), preds.tolist()))\n        else:\n            loss = np.array(fetch_list[0])\n            pred = np.array(fetch_list[1])\n            label = np.array(fetch_list[2])\n            self.calculator.accumulate(loss, pred, label)\n\n    def finalize_and_log_out(self, info='', label_file='./label_3396.txt'):\n        if self.mode == 'infer':\n            all_res_list = []\n            for index, item in enumerate(self.infer_results):\n                video_id = item[0]\n                f = io.open(label_file, \"r\", encoding=\"utf-8\")\n                fl = f.readlines()\n                res = {}\n                res[\"path\"] = video_id\n                res[\"prediction\"] = {}\n                for i in range(len(item[1])):\n                    class_id = item[1][i]\n                    class_prob = item[2][i]\n                    if class_prob < self.threshold:\n                        continue\n                    class_name = 
fl[class_id].split('\\n')[0]\n                res[\"prediction\"][class_name] = class_prob\n                if not res[\"prediction\"]:\n                    logger.warning(\"%s: No prediction exceeds the threshold = %s.\" % (video_id, self.threshold))\n                all_res_list.append(res)\n            return all_res_list\n        else:\n            epoch_info_dict = self.calculator.get()\n            logger.info(info + '\\tavg_hit_at_one: {0},\\tavg_perr: {1},\\tavg_loss :{2},\\taps: {3},\\tgap:{4}'\\\n                     .format(epoch_info_dict['avg_hit_at_one'], epoch_info_dict['avg_perr'], \\\n                             epoch_info_dict['avg_loss'], epoch_info_dict['aps'], epoch_info_dict['gap']))\n\n    def reset(self):\n        self.calculator.clear()\n        if self.mode == 'infer':\n            self.infer_results = []\n\n\nclass MetricsZoo(object):\n    def __init__(self):\n        self.metrics_zoo = {}\n\n    def regist(self, name, metrics):\n        assert metrics.__base__ == Metrics, \"Unknown metrics type {}\".format(type(metrics))\n        self.metrics_zoo[name] = metrics\n\n    def get(self, name, mode, cfg):\n        for k, v in self.metrics_zoo.items():\n            if k == name:\n                return v(name, mode, cfg)\n        raise KeyError(name, self.metrics_zoo.keys())\n\n\n# singleton metrics_zoo\nmetrics_zoo = MetricsZoo()\n\n\ndef regist_metrics(name, metrics):\n    metrics_zoo.regist(name, metrics)\n\n\ndef get_metrics(name, mode, cfg):\n    return metrics_zoo.get(name, mode, cfg)\n\n\n# sort by alphabet\nregist_metrics(\"ATTENTIONLSTM\", Youtube8mMetrics)\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/youtube8m/__init__.py",
    "content": ""
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/youtube8m/average_precision_calculator.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#      http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Calculate or keep track of the interpolated average precision.\n\nIt provides an interface for calculating interpolated average precision for an\nentire list or the top-n ranked items. For the definition of the\n(non-)interpolated average precision:\nhttp://trec.nist.gov/pubs/trec15/appendices/CE.MEASURES06.pdf\n\nExample usages:\n1) Use it as a static function call to directly calculate average precision for\na short ranked list in the memory.\n\n```\nimport random\n\np = np.array([random.random() for _ in xrange(10)])\na = np.array([random.choice([0, 1]) for _ in xrange(10)])\n\nap = average_precision_calculator.AveragePrecisionCalculator.ap(p, a)\n```\n\n2) Use it as an object for long ranked list that cannot be stored in memory or\nthe case where partial predictions can be observed at a time (Tensorflow\npredictions). In this case, we first call the function accumulate many times\nto process parts of the ranked list. 
After processing all the parts, we call\npeek_interpolated_ap_at_n.\n```\np1 = np.array([random.random() for _ in xrange(5)])\na1 = np.array([random.choice([0, 1]) for _ in xrange(5)])\np2 = np.array([random.random() for _ in xrange(5)])\na2 = np.array([random.choice([0, 1]) for _ in xrange(5)])\n\n# interpolated average precision at 10 using 1000 break points\ncalculator = average_precision_calculator.AveragePrecisionCalculator(10)\ncalculator.accumulate(p1, a1)\ncalculator.accumulate(p2, a2)\nap3 = calculator.peek_ap_at_n()\n```\n\"\"\"\n\nimport heapq\nimport random\nimport numbers\n\nimport numpy\n\n\nclass AveragePrecisionCalculator(object):\n    \"\"\"Calculate the average precision and average precision at n.\"\"\"\n\n    def __init__(self, top_n=None):\n        \"\"\"Construct an AveragePrecisionCalculator to calculate average precision.\n\n    This class is used to calculate the average precision for a single label.\n\n    Args:\n      top_n: A positive Integer specifying the average precision at n, or\n        None to use all provided data points.\n\n    Raises:\n      ValueError: An error occurred when the top_n is not a positive integer.\n    \"\"\"\n        if not ((isinstance(top_n, int) and top_n >= 0) or top_n is None):\n            raise ValueError(\"top_n must be a positive integer or None.\")\n\n        self._top_n = top_n  # average precision at n\n        self._total_positives = 0  # total number of positives have seen\n        self._heap = []  # max heap of (prediction, actual)\n\n    @property\n    def heap_size(self):\n        \"\"\"Gets the heap size maintained in the class.\"\"\"\n        return len(self._heap)\n\n    @property\n    def num_accumulated_positives(self):\n        \"\"\"Gets the number of positive samples that have been accumulated.\"\"\"\n        return self._total_positives\n\n    def accumulate(self, predictions, actuals, num_positives=None):\n        \"\"\"Accumulate the predictions and their ground truth labels.\n\n    
After the function call, we may call peek_ap_at_n to actually calculate\n    the average precision.\n    Note predictions and actuals must have the same shape.\n\n    Args:\n      predictions: a list storing the prediction scores.\n      actuals: a list storing the ground truth labels. Any value\n      larger than 0 will be treated as positives, otherwise as negatives.\n      num_positives: If the 'predictions' and 'actuals' inputs aren't complete,\n      then it's possible some true positives were missed in them. In that case,\n      you can provide 'num_positives' in order to accurately track recall.\n\n    Raises:\n      ValueError: An error occurred when the format of the input is not the\n      numpy 1-D array or the shape of predictions and actuals does not match.\n    \"\"\"\n        if len(predictions) != len(actuals):\n            raise ValueError(\"the shape of predictions and actuals does not match.\")\n\n        if num_positives is not None:\n            if not isinstance(num_positives, numbers.Number) or num_positives < 0:\n                raise ValueError(\"'num_positives' was provided but it wasn't a nonnegative number.\")\n\n        if num_positives is not None:\n            self._total_positives += num_positives\n        else:\n            self._total_positives += numpy.size(numpy.where(actuals > 0))\n        topk = self._top_n\n        heap = self._heap\n\n        for i in range(numpy.size(predictions)):\n            if topk is None or len(heap) < topk:\n                heapq.heappush(heap, (predictions[i], actuals[i]))\n            else:\n                if predictions[i] > heap[0][0]:  # heap[0] is the smallest\n                    heapq.heappop(heap)\n                    heapq.heappush(heap, (predictions[i], actuals[i]))\n\n    def clear(self):\n        \"\"\"Clear the accumulated predictions.\"\"\"\n        self._heap = []\n        self._total_positives = 0\n\n    def peek_ap_at_n(self):\n        \"\"\"Peek the non-interpolated average precision at n.\n\n    Returns:\n      The non-interpolated average precision at n (default 0).\n      If n is larger than the length of the ranked list,\n      the average precision will be returned.\n    \"\"\"\n        if self.heap_size <= 0:\n            return 0\n        predlists = numpy.array(list(zip(*self._heap)))\n\n        ap = self.ap_at_n(predlists[0], predlists[1], n=self._top_n, total_num_positives=self._total_positives)\n        return ap\n\n    @staticmethod\n    def ap(predictions, actuals):\n        \"\"\"Calculate the non-interpolated average precision.\n\n    Args:\n      predictions: a numpy 1-D array storing the sparse prediction scores.\n      actuals: a numpy 1-D array storing the ground truth labels. Any value\n      larger than 0 will be treated as positives, otherwise as negatives.\n\n    Returns:\n      The non-interpolated average precision at n.\n      If n is larger than the length of the ranked list,\n      the average precision will be returned.\n\n    Raises:\n      ValueError: An error occurred when the format of the input is not the\n      numpy 1-D array or the shape of predictions and actuals does not match.\n    \"\"\"\n        return AveragePrecisionCalculator.ap_at_n(predictions, actuals, n=None)\n\n    @staticmethod\n    def ap_at_n(predictions, actuals, n=20, total_num_positives=None):\n        \"\"\"Calculate the non-interpolated average precision.\n\n    Args:\n      predictions: a numpy 1-D array storing the sparse prediction scores.\n      actuals: a numpy 1-D array storing the ground truth labels. Any value\n      larger than 0 will be treated as positives, otherwise as negatives.\n      n: the top n items to be considered in ap@n.\n      total_num_positives: (optional) the total number of positives in the\n      list. If specified, it will be used in calculation.\n\n    Returns:\n      The non-interpolated average precision at n.\n      If n is larger than the length of the ranked list,\n      the average precision will be returned.\n\n    Raises:\n      ValueError: An error occurred when\n      1) the format of the input is not the numpy 1-D array;\n      2) the shape of predictions and actuals does not match;\n      3) the input n is not a positive integer.\n    \"\"\"\n        if len(predictions) != len(actuals):\n            raise ValueError(\"the shape of predictions and actuals does not match.\")\n\n        if n is not None:\n            if not isinstance(n, int) or n <= 0:\n                raise ValueError(\"n must be 'None' or a positive integer.\" \" It was '%s'.\" % n)\n\n        ap = 0.0\n\n        predictions = numpy.array(predictions)\n        actuals = numpy.array(actuals)\n\n        # add a shuffler to avoid overestimating the ap\n        predictions, actuals = AveragePrecisionCalculator._shuffle(predictions, actuals)\n        sortidx = sorted(range(len(predictions)), key=lambda k: predictions[k], reverse=True)\n\n        if total_num_positives is None:\n            numpos = numpy.size(numpy.where(actuals > 0))\n        else:\n            numpos = total_num_positives\n\n        if numpos == 0:\n            return 0\n\n        if n is not None:\n            numpos = min(numpos, n)\n        delta_recall = 1.0 / numpos\n        poscount = 0.0\n\n        # calculate the ap\n        r = len(sortidx)\n        if n is not None:\n            r = min(r, n)\n        for i in range(r):\n            if actuals[sortidx[i]] > 0:\n                poscount += 1\n                ap += poscount / (i + 1) * delta_recall\n        return ap\n\n    @staticmethod\n    def _shuffle(predictions, actuals):\n        random.seed(0)\n        suffidx = random.sample(range(len(predictions)), len(predictions))\n        predictions = predictions[suffidx]\n        actuals = actuals[suffidx]\n        return predictions, actuals\n\n    @staticmethod\n    def _zero_one_normalize(predictions, epsilon=1e-7):\n        \"\"\"Normalize the predictions to the range between 0.0 and 1.0.\n\n    For some predictions like SVM predictions, we need to normalize them before\n    calculating the interpolated average precision. The normalization will not\n    change the rank in the original list and thus won't change the average\n    precision.\n\n    Args:\n      predictions: a numpy 1-D array storing the sparse prediction scores.\n      epsilon: a small constant to avoid denominator being zero.\n\n    Returns:\n      The normalized prediction.\n    \"\"\"\n        denominator = numpy.max(predictions) - numpy.min(predictions)\n        ret = (predictions - numpy.min(predictions)) / numpy.max([denominator, epsilon])\n        return ret\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/youtube8m/eval_util.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#      http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Provides functions to help with evaluating models.\"\"\"\nimport datetime\nimport numpy\n\nfrom . import mean_average_precision_calculator as map_calculator\nfrom . import average_precision_calculator as ap_calculator\n\n\ndef flatten(l):\n    \"\"\" Merges a list of lists into a single list. \"\"\"\n    return [item for sublist in l for item in sublist]\n\n\ndef calculate_hit_at_one(predictions, actuals):\n    \"\"\"Performs a local (numpy) calculation of the hit at one.\n\n  Args:\n    predictions: Matrix containing the outputs of the model.\n      Dimensions are 'batch' x 'num_classes'.\n    actuals: Matrix containing the ground truth labels.\n      Dimensions are 'batch' x 'num_classes'.\n\n  Returns:\n    float: The average hit at one across the entire batch.\n  \"\"\"\n    top_prediction = numpy.argmax(predictions, 1)\n    hits = actuals[numpy.arange(actuals.shape[0]), top_prediction]\n    return numpy.average(hits)\n\n\ndef calculate_precision_at_equal_recall_rate(predictions, actuals):\n    \"\"\"Performs a local (numpy) calculation of the PERR.\n\n  Args:\n    predictions: Matrix containing the outputs of the model.\n      Dimensions are 'batch' x 'num_classes'.\n    actuals: Matrix containing the ground truth labels.\n      Dimensions are 'batch' x 'num_classes'.\n\n  Returns:\n    float: The average precision at equal recall rate 
across the entire batch.\n  \"\"\"\n    aggregated_precision = 0.0\n    num_videos = actuals.shape[0]\n    for row in numpy.arange(num_videos):\n        num_labels = int(numpy.sum(actuals[row]))\n        top_indices = numpy.argpartition(predictions[row], -num_labels)[-num_labels:]\n        item_precision = 0.0\n        for label_index in top_indices:\n            if predictions[row][label_index] > 0:\n                item_precision += actuals[row][label_index]\n        item_precision /= top_indices.size\n        aggregated_precision += item_precision\n    aggregated_precision /= num_videos\n    return aggregated_precision\n\n\ndef calculate_gap(predictions, actuals, top_k=20):\n    \"\"\"Performs a local (numpy) calculation of the global average precision.\n\n  Only the top_k predictions are taken for each of the videos.\n\n  Args:\n    predictions: Matrix containing the outputs of the model.\n      Dimensions are 'batch' x 'num_classes'.\n    actuals: Matrix containing the ground truth labels.\n      Dimensions are 'batch' x 'num_classes'.\n    top_k: How many predictions to use per video.\n\n  Returns:\n    float: The global average precision.\n  \"\"\"\n    gap_calculator = ap_calculator.AveragePrecisionCalculator()\n    sparse_predictions, sparse_labels, num_positives = top_k_by_class(predictions, actuals, top_k)\n    gap_calculator.accumulate(flatten(sparse_predictions), flatten(sparse_labels), sum(num_positives))\n    return gap_calculator.peek_ap_at_n()\n\n\ndef top_k_by_class(predictions, labels, k=20):\n    \"\"\"Extracts the top k predictions for each video, sorted by class.\n\n  Args:\n    predictions: A numpy matrix containing the outputs of the model.\n      Dimensions are 'batch' x 'num_classes'.\n    k: the top k non-zero entries to preserve in each prediction.\n\n  Returns:\n    A tuple (predictions,labels, true_positives). 'predictions' and 'labels'\n    are lists of lists of floats. 'true_positives' is a list of scalars. 
The\n    length of the lists are equal to the number of classes. The entries in the\n    predictions variable are probability predictions, and\n    the corresponding entries in the labels variable are the ground truth for\n    those predictions. The entries in 'true_positives' are the number of true\n    positives for each class in the ground truth.\n\n  Raises:\n    ValueError: An error occurred when the k is not a positive integer.\n  \"\"\"\n    if k <= 0:\n        raise ValueError(\"k must be a positive integer.\")\n    k = min(k, predictions.shape[1])\n    num_classes = predictions.shape[1]\n    prediction_triplets = []\n    for video_index in range(predictions.shape[0]):\n        prediction_triplets.extend(top_k_triplets(predictions[video_index], labels[video_index], k))\n    out_predictions = [[] for v in range(num_classes)]\n    out_labels = [[] for v in range(num_classes)]\n    for triplet in prediction_triplets:\n        out_predictions[triplet[0]].append(triplet[1])\n        out_labels[triplet[0]].append(triplet[2])\n    out_true_positives = [numpy.sum(labels[:, i]) for i in range(num_classes)]\n\n    return out_predictions, out_labels, out_true_positives\n\n\ndef top_k_triplets(predictions, labels, k=20):\n    \"\"\"Get the top_k for a 1-d numpy array. 
Returns a sparse list of tuples in\n  (class_index, prediction, label) format\"\"\"\n    m = len(predictions)\n    k = min(k, m)\n    indices = numpy.argpartition(predictions, -k)[-k:]\n    return [(index, predictions[index], labels[index]) for index in indices]\n\n\nclass EvaluationMetrics(object):\n    \"\"\"A class to store the evaluation metrics.\"\"\"\n\n    def __init__(self, num_class, top_k):\n        \"\"\"Construct an EvaluationMetrics object to store the evaluation metrics.\n\n    Args:\n      num_class: A positive integer specifying the number of classes.\n      top_k: A positive integer specifying how many predictions are considered per video.\n\n    Raises:\n      ValueError: An error occurred when MeanAveragePrecisionCalculator cannot\n        be constructed.\n    \"\"\"\n        self.sum_hit_at_one = 0.0\n        self.sum_perr = 0.0\n        self.sum_loss = 0.0\n        self.map_calculator = map_calculator.MeanAveragePrecisionCalculator(num_class)\n        self.global_ap_calculator = ap_calculator.AveragePrecisionCalculator()\n        self.top_k = top_k\n        self.num_examples = 0\n\n    def accumulate(self, loss, predictions, labels):\n        \"\"\"Accumulate the metrics calculated locally for this mini-batch.\n\n    Args:\n      loss: A numpy array containing the loss for each sample.\n      predictions: A numpy matrix containing the outputs of the model.\n        Dimensions are 'batch' x 'num_classes'.\n      labels: A numpy matrix containing the ground truth labels.\n        Dimensions are 'batch' x 'num_classes'.\n\n    Returns:\n      dictionary: A dictionary storing the metrics for the mini-batch.\n\n    Raises:\n      ValueError: An error occurred when the shape of predictions and actuals\n        does not match.\n    \"\"\"\n        batch_size = labels.shape[0]\n        mean_hit_at_one = calculate_hit_at_one(predictions, labels)\n        mean_perr = calculate_precision_at_equal_recall_rate(predictions, labels)\n        mean_loss = numpy.mean(loss)\n\n        # Take the top_k predictions.\n        sparse_predictions, sparse_labels, num_positives = top_k_by_class(predictions, labels, self.top_k)\n        self.map_calculator.accumulate(sparse_predictions, sparse_labels, num_positives)\n        self.global_ap_calculator.accumulate(flatten(sparse_predictions), flatten(sparse_labels), sum(num_positives))\n\n        self.num_examples += batch_size\n        self.sum_hit_at_one += mean_hit_at_one * batch_size\n        self.sum_perr += mean_perr * batch_size\n        self.sum_loss += mean_loss * batch_size\n\n        return {\"hit_at_one\": mean_hit_at_one, \"perr\": mean_perr, \"loss\": mean_loss}\n\n    def get(self):\n        \"\"\"Calculate the evaluation metrics for the whole epoch.\n\n    Raises:\n      ValueError: If no examples were accumulated.\n\n    Returns:\n      dictionary: a dictionary storing the evaluation metrics for the epoch. The\n        dictionary has the fields: avg_hit_at_one, avg_perr, avg_loss, aps, and\n        gap.\n    \"\"\"\n        if self.num_examples <= 0:\n            raise ValueError(\"total_sample must be positive.\")\n        avg_hit_at_one = self.sum_hit_at_one / self.num_examples\n        avg_perr = self.sum_perr / self.num_examples\n        avg_loss = self.sum_loss / self.num_examples\n\n        aps = self.map_calculator.peek_map_at_n()\n        gap = self.global_ap_calculator.peek_ap_at_n()\n\n        return {\"avg_hit_at_one\": avg_hit_at_one, \"avg_perr\": avg_perr, \"avg_loss\": avg_loss, \"aps\": aps, \"gap\": gap}\n\n    def clear(self):\n        \"\"\"Clear the evaluation metrics and reset the EvaluationMetrics object.\"\"\"\n        self.sum_hit_at_one = 0.0\n        self.sum_perr = 0.0\n        self.sum_loss = 0.0\n        self.map_calculator.clear()\n        self.global_ap_calculator.clear()\n        self.num_examples = 0\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/metrics/youtube8m/mean_average_precision_calculator.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#      http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Calculate the mean average precision.\n\nIt provides an interface for calculating mean average precision\nfor an entire list or the top-n ranked items.\n\nExample usages:\nWe first call the function accumulate many times to process parts of the ranked\nlist. After processing all the parts, we call peek_map_at_n\nto calculate the mean average precision.\n\n```\nimport random\n\np = np.array([[random.random() for _ in xrange(50)] for _ in xrange(1000)])\na = np.array([[random.choice([0, 1]) for _ in xrange(50)]\n     for _ in xrange(1000)])\n\n# mean average precision for 50 classes.\ncalculator = mean_average_precision_calculator.MeanAveragePrecisionCalculator(\n            num_class=50)\ncalculator.accumulate(p, a)\naps = calculator.peek_map_at_n()\n```\n\"\"\"\n\nimport numpy\nfrom . import average_precision_calculator\n\n\nclass MeanAveragePrecisionCalculator(object):\n    \"\"\"This class is to calculate mean average precision.\n  \"\"\"\n\n    def __init__(self, num_class):\n        \"\"\"Construct a calculator to calculate the (macro) average precision.\n\n    Args:\n      num_class: A positive Integer specifying the number of classes.\n      top_n_array: A list of positive integers specifying the top n for each\n      class. 
The top n in each class will be used to calculate its average\n      precision at n.\n      The size of the array must be num_class.\n\n    Raises:\n      ValueError: An error occurred when num_class is not a positive integer;\n      or the top_n_array is not a list of positive integers.\n    \"\"\"\n        if not isinstance(num_class, int) or num_class <= 1:\n            raise ValueError(\"num_class must be a positive integer.\")\n\n        self._ap_calculators = []  # member of AveragePrecisionCalculator\n        self._num_class = num_class  # total number of classes\n        for i in range(num_class):\n            self._ap_calculators.append(average_precision_calculator.AveragePrecisionCalculator())\n\n    def accumulate(self, predictions, actuals, num_positives=None):\n        \"\"\"Accumulate the predictions and their ground truth labels.\n\n    Args:\n      predictions: A list of lists storing the prediction scores. The outer\n      dimension corresponds to classes.\n      actuals: A list of lists storing the ground truth labels. The dimensions\n      should correspond to the predictions input. Any value\n      larger than 0 will be treated as positives, otherwise as negatives.\n      num_positives: If provided, it is a list of numbers representing the\n      number of true positives for each class. 
If not provided, the number of\n      true positives will be inferred from the 'actuals' array.\n\n    Raises:\n      ValueError: An error occurred when the shape of predictions and actuals\n      does not match.\n    \"\"\"\n        if not num_positives:\n            num_positives = [None for i in range(len(predictions))]\n\n        calculators = self._ap_calculators\n        for i in range(len(predictions)):\n            calculators[i].accumulate(predictions[i], actuals[i], num_positives[i])\n\n    def clear(self):\n        for calculator in self._ap_calculators:\n            calculator.clear()\n\n    def is_empty(self):\n        return ([calculator.heap_size for calculator in self._ap_calculators] == [0 for _ in range(self._num_class)])\n\n    def peek_map_at_n(self):\n        \"\"\"Peek the non-interpolated mean average precision at n.\n\n    Returns:\n      An array of non-interpolated average precision at n (default 0) for each\n      class.\n    \"\"\"\n        aps = [self._ap_calculators[i].peek_ap_at_n() for i in range(self._num_class)]\n        return aps\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/__init__.py",
    "content": "from .model import regist_model, get_model\nfrom .attention_lstm import AttentionLSTM\nfrom .tsn import TSN\n\n# regist models, sort by alphabet\nregist_model(\"AttentionLSTM\", AttentionLSTM)\nregist_model(\"TSN\", TSN)\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/attention_lstm/__init__.py",
    "content": "from .attention_lstm import *\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/attention_lstm/attention_lstm.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport logging\n\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\nfrom ..model import ModelBase\nfrom .lstm_attention import LSTMAttentionModel\n\n__all__ = [\"AttentionLSTM\"]\nlogger = logging.getLogger(__name__)\n\n\nclass AttentionLSTM(ModelBase):\n    def __init__(self, name, cfg, mode='train'):\n        super(AttentionLSTM, self).__init__(name, cfg, mode)\n        self.get_config()\n\n    def get_config(self):\n        # get model configs\n        self.feature_num = self.cfg.MODEL.feature_num\n        self.feature_names = self.cfg.MODEL.feature_names\n        self.feature_dims = self.cfg.MODEL.feature_dims\n        self.num_classes = self.cfg.MODEL.num_classes\n        self.embedding_size = self.cfg.MODEL.embedding_size\n        self.lstm_size = self.cfg.MODEL.lstm_size\n        self.drop_rate = self.cfg.MODEL.drop_rate\n\n        # get mode configs\n        self.batch_size = self.get_config_from_sec(self.mode, 'batch_size', 1)\n        self.num_gpus = self.get_config_from_sec(self.mode, 'num_gpus', 1)\n\n    def build_input(self, use_dataloader):\n        self.feature_input = []\n        for name, dim in zip(self.feature_names, self.feature_dims):\n            self.feature_input.append(fluid.data(shape=[None, dim], lod_level=1, dtype='float32', name=name))\n        if use_dataloader:\n            assert self.mode 
!= 'infer', \\\n                    'dataloader is not recommended in infer mode, please set use_dataloader to False.'\n            self.dataloader = fluid.io.DataLoader.from_generator(\n                feed_list=self.feature_input,  #+ [self.label_input],\n                capacity=8,\n                iterable=True)\n\n    def build_model(self):\n        att_outs = []\n        for i, (input_dim, feature) in enumerate(zip(self.feature_dims, self.feature_input)):\n            att = LSTMAttentionModel(input_dim, self.embedding_size, self.lstm_size, self.drop_rate)\n            att_out = att.forward(feature, is_training=(self.mode == 'train'))\n            att_outs.append(att_out)\n        if len(att_outs) > 1:\n            out = fluid.layers.concat(att_outs, axis=1)\n        else:\n            out = att_outs[0]\n\n        fc1 = fluid.layers.fc(\n            input=out,\n            size=8192,\n            act='relu',\n            bias_attr=ParamAttr(\n                regularizer=fluid.regularizer.L2Decay(0.0), initializer=fluid.initializer.NormalInitializer(scale=0.0)),\n            name='fc1')\n        fc2 = fluid.layers.fc(\n            input=fc1,\n            size=4096,\n            act='tanh',\n            bias_attr=ParamAttr(\n                regularizer=fluid.regularizer.L2Decay(0.0), initializer=fluid.initializer.NormalInitializer(scale=0.0)),\n            name='fc2')\n\n        self.logit = fluid.layers.fc(input=fc2, size=self.num_classes, act=None, \\\n                              bias_attr=ParamAttr(regularizer=fluid.regularizer.L2Decay(0.0),\n                                                  initializer=fluid.initializer.NormalInitializer(scale=0.0)),\n                              name='output')\n\n        self.output = fluid.layers.sigmoid(self.logit)\n\n    def optimizer(self):\n        assert self.mode == 'train', \"optimizer can only be obtained in train mode\"\n        values = [self.learning_rate * (self.decay_gamma**i) for i in range(len(self.decay_epochs) + 1)]\n        iter_per_epoch = self.num_samples / self.batch_size\n        boundaries = [e * iter_per_epoch for e in self.decay_epochs]\n        return fluid.optimizer.RMSProp(\n            learning_rate=fluid.layers.piecewise_decay(values=values, boundaries=boundaries),\n            centered=True,\n            regularization=fluid.regularizer.L2Decay(self.weight_decay))\n\n    def loss(self):\n        assert self.mode != 'infer', \"invalid loss calculation in infer mode\"\n        cost = fluid.layers.sigmoid_cross_entropy_with_logits(x=self.logit, label=self.label_input)\n        cost = fluid.layers.reduce_sum(cost, dim=-1)\n        sum_cost = fluid.layers.reduce_sum(cost)\n        self.loss_ = fluid.layers.scale(sum_cost, scale=self.num_gpus, bias_after_scale=False)\n        return self.loss_\n\n    def outputs(self):\n        return [self.output, self.logit]\n\n    def feeds(self):\n        return self.feature_input\n\n    def fetches(self):\n        fetch_list = [self.output]\n        return fetch_list\n\n    def weights_info(self):\n        return ('AttentionLSTM.pdparams',\n                'https://paddlemodels.bj.bcebos.com/video_classification/AttentionLSTM.pdparams')\n\n    def load_pretrain_params(self, exe, pretrain, prog, place):\n        logger.info(\"Load pretrain weights from {}, exclude fc layer.\".format(pretrain))\n\n        state_dict = fluid.load_program_state(pretrain)\n        dict_keys = list(state_dict.keys())\n        for name in dict_keys:\n            if \"fc_0\" in name:\n                del state_dict[name]\n                logger.info('Delete {} from pretrained parameters. Do not load it'.format(name))\n        fluid.set_program_state(prog, state_dict)\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/attention_lstm/lstm_attention.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\n\nclass LSTMAttentionModel(object):\n    \"\"\"LSTM Attention Model\"\"\"\n\n    def __init__(self, bias_attr, embedding_size=512, lstm_size=1024, drop_rate=0.5):\n        self.lstm_size = lstm_size\n        self.embedding_size = embedding_size\n        self.drop_rate = drop_rate\n\n    def forward(self, input, is_training):\n        input_fc = fluid.layers.fc(\n            input=input,\n            size=self.embedding_size,\n            act='tanh',\n            bias_attr=ParamAttr(\n                regularizer=fluid.regularizer.L2Decay(0.0), initializer=fluid.initializer.NormalInitializer(scale=0.0)),\n            name='rgb_fc')\n\n        lstm_forward_fc = fluid.layers.fc(\n            input=input_fc, size=self.lstm_size * 4, act=None, bias_attr=False, name='rgb_fc_forward')\n\n        lstm_forward, _ = fluid.layers.dynamic_lstm(\n            input=lstm_forward_fc, size=self.lstm_size * 4, is_reverse=False, name='rgb_lstm_forward')\n\n        lsmt_backward_fc = fluid.layers.fc(\n            input=input_fc, size=self.lstm_size * 4, act=None, bias_attr=False, name='rgb_fc_backward')\n\n        lstm_backward, _ = fluid.layers.dynamic_lstm(\n            input=lsmt_backward_fc, size=self.lstm_size * 4, is_reverse=True, name='rgb_lstm_backward')\n\n        lstm_concat = 
fluid.layers.concat(input=[lstm_forward, lstm_backward], axis=1)\n\n        lstm_dropout = fluid.layers.dropout(x=lstm_concat, dropout_prob=self.drop_rate, is_test=(not is_training))\n\n        lstm_weight = fluid.layers.fc(\n            input=lstm_dropout, size=1, act='sequence_softmax', bias_attr=False, name='rgb_weight')\n\n        scaled = fluid.layers.elementwise_mul(x=lstm_dropout, y=lstm_weight, axis=0)\n        lstm_pool = fluid.layers.sequence_pool(input=scaled, pool_type='sum')\n\n        return lstm_pool\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/model.py",
    "content": "#  Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport os\nimport logging\ntry:\n    from configparser import ConfigParser\nexcept:\n    from ConfigParser import ConfigParser\n\nimport paddle.fluid as fluid\n\nWEIGHT_DIR = os.path.join(os.path.expanduser('~'), '.paddle', 'weights')\n\nlogger = logging.getLogger(__name__)\n\n\ndef is_parameter(var):\n    return isinstance(var, fluid.framework.Parameter)\n\n\nclass NotImplementError(Exception):\n    \"Error: model function not implement\"\n\n    def __init__(self, model, function):\n        super(NotImplementError, self).__init__()\n        self.model = model.__class__.__name__\n        self.function = function.__name__\n\n    def __str__(self):\n        return \"Function {}() is not implemented in model {}\".format(self.function, self.model)\n\n\nclass ModelNotFoundError(Exception):\n    \"Error: model not found\"\n\n    def __init__(self, model_name, avail_models):\n        super(ModelNotFoundError, self).__init__()\n        self.model_name = model_name\n        self.avail_models = avail_models\n\n    def __str__(self):\n        msg = \"Model {} Not Found.\\nAvailiable models:\\n\".format(self.model_name)\n        for model in self.avail_models:\n            msg += \"  {}\\n\".format(model)\n        return msg\n\n\nclass ModelBase(object):\n    def __init__(self, name, cfg, mode='train'):\n        assert mode in ['train', 'valid', 'test', 
'infer'], \\\n                \"Unknown mode type {}\".format(mode)\n        self.name = name\n        self.is_training = (mode == 'train')\n        self.mode = mode\n        self.cfg = cfg\n        self.dataloader = None\n\n    def build_model(self):\n        \"build model structure\"\n        raise NotImplementError(self, self.build_model)\n\n    def build_input(self, use_dataloader):\n        \"build input Variable\"\n        raise NotImplementError(self, self.build_input)\n\n    def optimizer(self):\n        \"get model optimizer\"\n        raise NotImplementError(self, self.optimizer)\n\n    def outputs(self):\n        \"get output variable\"\n        raise NotImplementError(self, self.outputs)\n\n    def loss(self):\n        \"get loss variable\"\n        raise NotImplementError(self, self.loss)\n\n    def feeds(self):\n        \"get feed inputs list\"\n        raise NotImplementError(self, self.feeds)\n\n    def fetches(self):\n        \"get fetch list of model\"\n        raise NotImplementError(self, self.fetches)\n\n    def weights_info(self):\n        \"get model weight default path and download url\"\n        raise NotImplementError(self, self.weights_info)\n\n    def dataloader(self):\n        return self.dataloader\n\n    def epoch_num(self):\n        \"get train epoch num\"\n        return self.cfg.TRAIN.epoch\n\n    def pretrain_info(self):\n        \"get pretrain base model directory\"\n        return (None, None)\n\n    def load_pretrain_params(self, exe, pretrain, prog, place):\n        logger.info(\"Load pretrain weights from {}\".format(pretrain))\n        state_dict = fluid.load_program_state(pretrain)\n        fluid.set_program_state(prog, state_dict)\n\n    def load_test_weights(self, exe, weights, prog):\n        params_list = list(filter(is_parameter, prog.list_vars()))\n        fluid.load(prog, weights, executor=exe, var_list=params_list)\n\n    def get_config_from_sec(self, sec, item, default=None):\n        if sec.upper() not in 
self.cfg:\n            return default\n        return self.cfg[sec.upper()].get(item, default)\n\n\nclass ModelZoo(object):\n    def __init__(self):\n        self.model_zoo = {}\n\n    def regist(self, name, model):\n        assert model.__base__ == ModelBase, \"Unknown model type {}\".format(type(model))\n        self.model_zoo[name] = model\n\n    def get(self, name, cfg, mode='train'):\n        for k, v in self.model_zoo.items():\n            if k.upper() == name.upper():\n                return v(name, cfg, mode)\n        raise ModelNotFoundError(name, self.model_zoo.keys())\n\n\n# singleton model_zoo\nmodel_zoo = ModelZoo()\n\n\ndef regist_model(name, model):\n    model_zoo.regist(name, model)\n\n\ndef get_model(name, cfg, mode='train'):\n    return model_zoo.get(name, cfg, mode)\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/__init__.py",
    "content": "from .tsn import *\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/name.py",
    "content": "import json\n\ndepth = [3, 4, 23, 3]\nnum_filters = [64, 128, 256, 512]\n\nlayer_index = 1\ncaffe_param_list = []\n\nname_list = ['conv1']\nparams_list = []\nname = name_list[0]\nconv_w = name + '_weights'\ncaffe_conv_w = 'ConvNdBackward' + str(layer_index) + '_weights'\nparams_list.append(conv_w)\ncaffe_param_list.append(caffe_conv_w)\n\nlayer_index += 1\n\nbn_name = \"bn_\" + name\ncaffe_bn_name = 'BatchNormBackward' + str(layer_index) + '_bn'\nparams_list.append(bn_name + '_scale')\nparams_list.append(bn_name + '_offset')\nparams_list.append(bn_name + '_mean')\nparams_list.append(bn_name + '_variance')\n\ncaffe_param_list.append(caffe_bn_name + '_scale')\ncaffe_param_list.append(caffe_bn_name + '_offset')\ncaffe_param_list.append(caffe_bn_name + '_mean')\ncaffe_param_list.append(caffe_bn_name + '_variance')\n\nfilter_input = 64\n\nlayer_index += 3\n\nfor block in range(len(depth)):\n    for i in range(depth[block]):\n        if block == 2:\n            if i == 0:\n                name = \"res\" + str(block + 2) + \"a\"\n            else:\n                name = \"res\" + str(block + 2) + \"b\" + str(i)\n        else:\n            name = \"res\" + str(block + 2) + chr(97 + i)\n\n        name_list.append(name)\n\n        for item in ['a', 'b', 'c']:\n            name_branch = name + '_branch2' + item\n            bn_name = 'bn' + name_branch[3:]\n            params_list.append(name_branch + '_weights')\n            params_list.append(bn_name + '_scale')\n            params_list.append(bn_name + '_offset')\n            params_list.append(bn_name + '_mean')\n            params_list.append(bn_name + '_variance')\n\n            caffe_name_branch = 'ConvNdBackward' + str(layer_index)\n            caffe_param_list.append(caffe_name_branch + '_weights')\n\n            layer_index += 1\n            caffe_bn_name = 'BatchNormBackward' + str(layer_index) + '_bn'\n            caffe_param_list.append(caffe_bn_name + '_scale')\n            
caffe_param_list.append(caffe_bn_name + '_offset')\n            caffe_param_list.append(caffe_bn_name + '_mean')\n            caffe_param_list.append(caffe_bn_name + '_variance')\n\n            layer_index += 2\n\n        stride = 2 if i == 0 and block != 0 else 1\n        filter_num = num_filters[block]\n        filter_output = filter_num * 4\n\n        if (filter_output != filter_input) or (stride != 1):\n            name_branch = name + '_branch1'\n\n            print('filter_input {}, filter_output {}, stride {}, branch name {}'.format(\n                filter_input, filter_output, stride, name_branch))\n            bn_name = 'bn' + name_branch[3:]\n            params_list.append(name_branch + '_weights')\n            params_list.append(bn_name + '_scale')\n            params_list.append(bn_name + '_offset')\n            params_list.append(bn_name + '_mean')\n            params_list.append(bn_name + '_variance')\n\n            caffe_name_branch = 'ConvNdBackward' + str(layer_index)\n            caffe_param_list.append(caffe_name_branch + '_weights')\n\n            layer_index += 1\n            caffe_bn_name = 'BatchNormBackward' + str(layer_index) + '_bn'\n            caffe_param_list.append(caffe_bn_name + '_scale')\n            caffe_param_list.append(caffe_bn_name + '_offset')\n            caffe_param_list.append(caffe_bn_name + '_mean')\n            caffe_param_list.append(caffe_bn_name + '_variance')\n\n            layer_index += 3\n        else:\n            layer_index += 2\n\n        filter_input = filter_output\n\nmap_dict = {}\n\nfor i in range(len(params_list)):\n    print(params_list[i], caffe_param_list[i])\n    map_dict[params_list[i]] = caffe_param_list[i]\n\njson.dump(map_dict, open('name_map.json', 'w'))\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/name1",
    "content": "conv1_weights\nbn_conv1_scale\nbn_conv1_offset\nbn_conv1_mean\nbn_conv1_variance\nres2a_branch2a_weights\nbn2a_branch2a_scale\nbn2a_branch2a_offset\nbn2a_branch2a_mean\nbn2a_branch2a_variance\nres2a_branch2b_weights\nbn2a_branch2b_scale\nbn2a_branch2b_offset\nbn2a_branch2b_mean\nbn2a_branch2b_variance\nres2a_branch2c_weights\nbn2a_branch2c_scale\nbn2a_branch2c_offset\nbn2a_branch2c_mean\nbn2a_branch2c_variance\nres2a_branch1_weights\nbn2a_branch1_scale\nbn2a_branch1_offset\nbn2a_branch1_mean\nbn2a_branch1_variance\nres2b_branch2a_weights\nbn2b_branch2a_scale\nbn2b_branch2a_offset\nbn2b_branch2a_mean\nbn2b_branch2a_variance\nres2b_branch2b_weights\nbn2b_branch2b_scale\nbn2b_branch2b_offset\nbn2b_branch2b_mean\nbn2b_branch2b_variance\nres2b_branch2c_weights\nbn2b_branch2c_scale\nbn2b_branch2c_offset\nbn2b_branch2c_mean\nbn2b_branch2c_variance\nres2c_branch2a_weights\nbn2c_branch2a_scale\nbn2c_branch2a_offset\nbn2c_branch2a_mean\nbn2c_branch2a_variance\nres2c_branch2b_weights\nbn2c_branch2b_scale\nbn2c_branch2b_offset\nbn2c_branch2b_mean\nbn2c_branch2b_variance\nres2c_branch2c_weights\nbn2c_branch2c_scale\nbn2c_branch2c_offset\nbn2c_branch2c_mean\nbn2c_branch2c_variance\nres3a_branch2a_weights\nbn3a_branch2a_scale\nbn3a_branch2a_offset\nbn3a_branch2a_mean\nbn3a_branch2a_variance\nres3a_branch2b_weights\nbn3a_branch2b_scale\nbn3a_branch2b_offset\nbn3a_branch2b_mean\nbn3a_branch2b_variance\nres3a_branch2c_weights\nbn3a_branch2c_scale\nbn3a_branch2c_offset\nbn3a_branch2c_mean\nbn3a_branch2c_variance\nres3a_branch1_weights\nbn3a_branch1_scale\nbn3a_branch1_offset\nbn3a_branch1_mean\nbn3a_branch1_variance\nres3b_branch2a_weights\nbn3b_branch2a_scale\nbn3b_branch2a_offset\nbn3b_branch2a_mean\nbn3b_branch2a_variance\nres3b_branch2b_weights\nbn3b_branch2b_scale\nbn3b_branch2b_offset\nbn3b_branch2b_mean\nbn3b_branch2b_variance\nres3b_branch2c_weights\nbn3b_branch2c_scale\nbn3b_branch2c_offset\nbn3b_branch2c_mean\nbn3b_branch2c_variance\nres3c_branch2a_weights\nb
n3c_branch2a_scale\nbn3c_branch2a_offset\nbn3c_branch2a_mean\nbn3c_branch2a_variance\nres3c_branch2b_weights\nbn3c_branch2b_scale\nbn3c_branch2b_offset\nbn3c_branch2b_mean\nbn3c_branch2b_variance\nres3c_branch2c_weights\nbn3c_branch2c_scale\nbn3c_branch2c_offset\nbn3c_branch2c_mean\nbn3c_branch2c_variance\nres3d_branch2a_weights\nbn3d_branch2a_scale\nbn3d_branch2a_offset\nbn3d_branch2a_mean\nbn3d_branch2a_variance\nres3d_branch2b_weights\nbn3d_branch2b_scale\nbn3d_branch2b_offset\nbn3d_branch2b_mean\nbn3d_branch2b_variance\nres3d_branch2c_weights\nbn3d_branch2c_scale\nbn3d_branch2c_offset\nbn3d_branch2c_mean\nbn3d_branch2c_variance\nres4a_branch2a_weights\nbn4a_branch2a_scale\nbn4a_branch2a_offset\nbn4a_branch2a_mean\nbn4a_branch2a_variance\nres4a_branch2b_weights\nbn4a_branch2b_scale\nbn4a_branch2b_offset\nbn4a_branch2b_mean\nbn4a_branch2b_variance\nres4a_branch2c_weights\nbn4a_branch2c_scale\nbn4a_branch2c_offset\nbn4a_branch2c_mean\nbn4a_branch2c_variance\nres4a_branch1_weights\nbn4a_branch1_scale\nbn4a_branch1_offset\nbn4a_branch1_mean\nbn4a_branch1_variance\nres4b1_branch2a_weights\nbn4b1_branch2a_scale\nbn4b1_branch2a_offset\nbn4b1_branch2a_mean\nbn4b1_branch2a_variance\nres4b1_branch2b_weights\nbn4b1_branch2b_scale\nbn4b1_branch2b_offset\nbn4b1_branch2b_mean\nbn4b1_branch2b_variance\nres4b1_branch2c_weights\nbn4b1_branch2c_scale\nbn4b1_branch2c_offset\nbn4b1_branch2c_mean\nbn4b1_branch2c_variance\nres4b2_branch2a_weights\nbn4b2_branch2a_scale\nbn4b2_branch2a_offset\nbn4b2_branch2a_mean\nbn4b2_branch2a_variance\nres4b2_branch2b_weights\nbn4b2_branch2b_scale\nbn4b2_branch2b_offset\nbn4b2_branch2b_mean\nbn4b2_branch2b_variance\nres4b2_branch2c_weights\nbn4b2_branch2c_scale\nbn4b2_branch2c_offset\nbn4b2_branch2c_mean\nbn4b2_branch2c_variance\nres4b3_branch2a_weights\nbn4b3_branch2a_scale\nbn4b3_branch2a_offset\nbn4b3_branch2a_mean\nbn4b3_branch2a_variance\nres4b3_branch2b_weights\nbn4b3_branch2b_scale\nbn4b3_branch2b_offset\nbn4b3_branch2b_mean\nbn4b3_branch2b_va
riance\nres4b3_branch2c_weights\nbn4b3_branch2c_scale\nbn4b3_branch2c_offset\nbn4b3_branch2c_mean\nbn4b3_branch2c_variance\nres4b4_branch2a_weights\nbn4b4_branch2a_scale\nbn4b4_branch2a_offset\nbn4b4_branch2a_mean\nbn4b4_branch2a_variance\nres4b4_branch2b_weights\nbn4b4_branch2b_scale\nbn4b4_branch2b_offset\nbn4b4_branch2b_mean\nbn4b4_branch2b_variance\nres4b4_branch2c_weights\nbn4b4_branch2c_scale\nbn4b4_branch2c_offset\nbn4b4_branch2c_mean\nbn4b4_branch2c_variance\nres4b5_branch2a_weights\nbn4b5_branch2a_scale\nbn4b5_branch2a_offset\nbn4b5_branch2a_mean\nbn4b5_branch2a_variance\nres4b5_branch2b_weights\nbn4b5_branch2b_scale\nbn4b5_branch2b_offset\nbn4b5_branch2b_mean\nbn4b5_branch2b_variance\nres4b5_branch2c_weights\nbn4b5_branch2c_scale\nbn4b5_branch2c_offset\nbn4b5_branch2c_mean\nbn4b5_branch2c_variance\nres4b6_branch2a_weights\nbn4b6_branch2a_scale\nbn4b6_branch2a_offset\nbn4b6_branch2a_mean\nbn4b6_branch2a_variance\nres4b6_branch2b_weights\nbn4b6_branch2b_scale\nbn4b6_branch2b_offset\nbn4b6_branch2b_mean\nbn4b6_branch2b_variance\nres4b6_branch2c_weights\nbn4b6_branch2c_scale\nbn4b6_branch2c_offset\nbn4b6_branch2c_mean\nbn4b6_branch2c_variance\nres4b7_branch2a_weights\nbn4b7_branch2a_scale\nbn4b7_branch2a_offset\nbn4b7_branch2a_mean\nbn4b7_branch2a_variance\nres4b7_branch2b_weights\nbn4b7_branch2b_scale\nbn4b7_branch2b_offset\nbn4b7_branch2b_mean\nbn4b7_branch2b_variance\nres4b7_branch2c_weights\nbn4b7_branch2c_scale\nbn4b7_branch2c_offset\nbn4b7_branch2c_mean\nbn4b7_branch2c_variance\nres4b8_branch2a_weights\nbn4b8_branch2a_scale\nbn4b8_branch2a_offset\nbn4b8_branch2a_mean\nbn4b8_branch2a_variance\nres4b8_branch2b_weights\nbn4b8_branch2b_scale\nbn4b8_branch2b_offset\nbn4b8_branch2b_mean\nbn4b8_branch2b_variance\nres4b8_branch2c_weights\nbn4b8_branch2c_scale\nbn4b8_branch2c_offset\nbn4b8_branch2c_mean\nbn4b8_branch2c_variance\nres4b9_branch2a_weights\nbn4b9_branch2a_scale\nbn4b9_branch2a_offset\nbn4b9_branch2a_mean\nbn4b9_branch2a_variance\nres4b9_branch2b_weig
hts\nbn4b9_branch2b_scale\nbn4b9_branch2b_offset\nbn4b9_branch2b_mean\nbn4b9_branch2b_variance\nres4b9_branch2c_weights\nbn4b9_branch2c_scale\nbn4b9_branch2c_offset\nbn4b9_branch2c_mean\nbn4b9_branch2c_variance\nres4b10_branch2a_weights\nbn4b10_branch2a_scale\nbn4b10_branch2a_offset\nbn4b10_branch2a_mean\nbn4b10_branch2a_variance\nres4b10_branch2b_weights\nbn4b10_branch2b_scale\nbn4b10_branch2b_offset\nbn4b10_branch2b_mean\nbn4b10_branch2b_variance\nres4b10_branch2c_weights\nbn4b10_branch2c_scale\nbn4b10_branch2c_offset\nbn4b10_branch2c_mean\nbn4b10_branch2c_variance\nres4b11_branch2a_weights\nbn4b11_branch2a_scale\nbn4b11_branch2a_offset\nbn4b11_branch2a_mean\nbn4b11_branch2a_variance\nres4b11_branch2b_weights\nbn4b11_branch2b_scale\nbn4b11_branch2b_offset\nbn4b11_branch2b_mean\nbn4b11_branch2b_variance\nres4b11_branch2c_weights\nbn4b11_branch2c_scale\nbn4b11_branch2c_offset\nbn4b11_branch2c_mean\nbn4b11_branch2c_variance\nres4b12_branch2a_weights\nbn4b12_branch2a_scale\nbn4b12_branch2a_offset\nbn4b12_branch2a_mean\nbn4b12_branch2a_variance\nres4b12_branch2b_weights\nbn4b12_branch2b_scale\nbn4b12_branch2b_offset\nbn4b12_branch2b_mean\nbn4b12_branch2b_variance\nres4b12_branch2c_weights\nbn4b12_branch2c_scale\nbn4b12_branch2c_offset\nbn4b12_branch2c_mean\nbn4b12_branch2c_variance\nres4b13_branch2a_weights\nbn4b13_branch2a_scale\nbn4b13_branch2a_offset\nbn4b13_branch2a_mean\nbn4b13_branch2a_variance\nres4b13_branch2b_weights\nbn4b13_branch2b_scale\nbn4b13_branch2b_offset\nbn4b13_branch2b_mean\nbn4b13_branch2b_variance\nres4b13_branch2c_weights\nbn4b13_branch2c_scale\nbn4b13_branch2c_offset\nbn4b13_branch2c_mean\nbn4b13_branch2c_variance\nres4b14_branch2a_weights\nbn4b14_branch2a_scale\nbn4b14_branch2a_offset\nbn4b14_branch2a_mean\nbn4b14_branch2a_variance\nres4b14_branch2b_weights\nbn4b14_branch2b_scale\nbn4b14_branch2b_offset\nbn4b14_branch2b_mean\nbn4b14_branch2b_variance\nres4b14_branch2c_weights\nbn4b14_branch2c_scale\nbn4b14_branch2c_offset\nbn4b14_branch2c_mean\
nbn4b14_branch2c_variance\nres4b15_branch2a_weights\nbn4b15_branch2a_scale\nbn4b15_branch2a_offset\nbn4b15_branch2a_mean\nbn4b15_branch2a_variance\nres4b15_branch2b_weights\nbn4b15_branch2b_scale\nbn4b15_branch2b_offset\nbn4b15_branch2b_mean\nbn4b15_branch2b_variance\nres4b15_branch2c_weights\nbn4b15_branch2c_scale\nbn4b15_branch2c_offset\nbn4b15_branch2c_mean\nbn4b15_branch2c_variance\nres4b16_branch2a_weights\nbn4b16_branch2a_scale\nbn4b16_branch2a_offset\nbn4b16_branch2a_mean\nbn4b16_branch2a_variance\nres4b16_branch2b_weights\nbn4b16_branch2b_scale\nbn4b16_branch2b_offset\nbn4b16_branch2b_mean\nbn4b16_branch2b_variance\nres4b16_branch2c_weights\nbn4b16_branch2c_scale\nbn4b16_branch2c_offset\nbn4b16_branch2c_mean\nbn4b16_branch2c_variance\nres4b17_branch2a_weights\nbn4b17_branch2a_scale\nbn4b17_branch2a_offset\nbn4b17_branch2a_mean\nbn4b17_branch2a_variance\nres4b17_branch2b_weights\nbn4b17_branch2b_scale\nbn4b17_branch2b_offset\nbn4b17_branch2b_mean\nbn4b17_branch2b_variance\nres4b17_branch2c_weights\nbn4b17_branch2c_scale\nbn4b17_branch2c_offset\nbn4b17_branch2c_mean\nbn4b17_branch2c_variance\nres4b18_branch2a_weights\nbn4b18_branch2a_scale\nbn4b18_branch2a_offset\nbn4b18_branch2a_mean\nbn4b18_branch2a_variance\nres4b18_branch2b_weights\nbn4b18_branch2b_scale\nbn4b18_branch2b_offset\nbn4b18_branch2b_mean\nbn4b18_branch2b_variance\nres4b18_branch2c_weights\nbn4b18_branch2c_scale\nbn4b18_branch2c_offset\nbn4b18_branch2c_mean\nbn4b18_branch2c_variance\nres4b19_branch2a_weights\nbn4b19_branch2a_scale\nbn4b19_branch2a_offset\nbn4b19_branch2a_mean\nbn4b19_branch2a_variance\nres4b19_branch2b_weights\nbn4b19_branch2b_scale\nbn4b19_branch2b_offset\nbn4b19_branch2b_mean\nbn4b19_branch2b_variance\nres4b19_branch2c_weights\nbn4b19_branch2c_scale\nbn4b19_branch2c_offset\nbn4b19_branch2c_mean\nbn4b19_branch2c_variance\nres4b20_branch2a_weights\nbn4b20_branch2a_scale\nbn4b20_branch2a_offset\nbn4b20_branch2a_mean\nbn4b20_branch2a_variance\nres4b20_branch2b_weights\nbn4b20_bran
ch2b_scale\nbn4b20_branch2b_offset\nbn4b20_branch2b_mean\nbn4b20_branch2b_variance\nres4b20_branch2c_weights\nbn4b20_branch2c_scale\nbn4b20_branch2c_offset\nbn4b20_branch2c_mean\nbn4b20_branch2c_variance\nres4b21_branch2a_weights\nbn4b21_branch2a_scale\nbn4b21_branch2a_offset\nbn4b21_branch2a_mean\nbn4b21_branch2a_variance\nres4b21_branch2b_weights\nbn4b21_branch2b_scale\nbn4b21_branch2b_offset\nbn4b21_branch2b_mean\nbn4b21_branch2b_variance\nres4b21_branch2c_weights\nbn4b21_branch2c_scale\nbn4b21_branch2c_offset\nbn4b21_branch2c_mean\nbn4b21_branch2c_variance\nres4b22_branch2a_weights\nbn4b22_branch2a_scale\nbn4b22_branch2a_offset\nbn4b22_branch2a_mean\nbn4b22_branch2a_variance\nres4b22_branch2b_weights\nbn4b22_branch2b_scale\nbn4b22_branch2b_offset\nbn4b22_branch2b_mean\nbn4b22_branch2b_variance\nres4b22_branch2c_weights\nbn4b22_branch2c_scale\nbn4b22_branch2c_offset\nbn4b22_branch2c_mean\nbn4b22_branch2c_variance\nres5a_branch2a_weights\nbn5a_branch2a_scale\nbn5a_branch2a_offset\nbn5a_branch2a_mean\nbn5a_branch2a_variance\nres5a_branch2b_weights\nbn5a_branch2b_scale\nbn5a_branch2b_offset\nbn5a_branch2b_mean\nbn5a_branch2b_variance\nres5a_branch2c_weights\nbn5a_branch2c_scale\nbn5a_branch2c_offset\nbn5a_branch2c_mean\nbn5a_branch2c_variance\nres5a_branch1_weights\nbn5a_branch1_scale\nbn5a_branch1_offset\nbn5a_branch1_mean\nbn5a_branch1_variance\nres5b_branch2a_weights\nbn5b_branch2a_scale\nbn5b_branch2a_offset\nbn5b_branch2a_mean\nbn5b_branch2a_variance\nres5b_branch2b_weights\nbn5b_branch2b_scale\nbn5b_branch2b_offset\nbn5b_branch2b_mean\nbn5b_branch2b_variance\nres5b_branch2c_weights\nbn5b_branch2c_scale\nbn5b_branch2c_offset\nbn5b_branch2c_mean\nbn5b_branch2c_variance\nres5c_branch2a_weights\nbn5c_branch2a_scale\nbn5c_branch2a_offset\nbn5c_branch2a_mean\nbn5c_branch2a_variance\nres5c_branch2b_weights\nbn5c_branch2b_scale\nbn5c_branch2b_offset\nbn5c_branch2b_mean\nbn5c_branch2b_variance\nres5c_branch2c_weights\nbn5c_branch2c_scale\nbn5c_branch2c_offset\nbn5c_bra
nch2c_mean\nbn5c_branch2c_variance\n\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/name2",
    "content": "conv1_weights\nbn_conv1_scale\nbn_conv1_offset\nbn_conv1_mean\nbn_conv1_variance\nres2a_branch2a_weights\nbn2a_branch2a_scale\nbn2a_branch2a_offset\nbn2a_branch2a_mean\nbn2a_branch2a_variance\nres2a_branch2b_weights\nbn2a_branch2b_scale\nbn2a_branch2b_offset\nbn2a_branch2b_mean\nbn2a_branch2b_variance\nres2a_branch2c_weights\nbn2a_branch2c_scale\nbn2a_branch2c_offset\nbn2a_branch2c_mean\nbn2a_branch2c_variance\nres2a_branch1_weights\nbn2a_branch1_scale\nbn2a_branch1_offset\nbn2a_branch1_mean\nbn2a_branch1_variance\nres2b_branch2a_weights\nbn2b_branch2a_scale\nbn2b_branch2a_offset\nbn2b_branch2a_mean\nbn2b_branch2a_variance\nres2b_branch2b_weights\nbn2b_branch2b_scale\nbn2b_branch2b_offset\nbn2b_branch2b_mean\nbn2b_branch2b_variance\nres2b_branch2c_weights\nbn2b_branch2c_scale\nbn2b_branch2c_offset\nbn2b_branch2c_mean\nbn2b_branch2c_variance\nres2c_branch2a_weights\nbn2c_branch2a_scale\nbn2c_branch2a_offset\nbn2c_branch2a_mean\nbn2c_branch2a_variance\nres2c_branch2b_weights\nbn2c_branch2b_scale\nbn2c_branch2b_offset\nbn2c_branch2b_mean\nbn2c_branch2b_variance\nres2c_branch2c_weights\nbn2c_branch2c_scale\nbn2c_branch2c_offset\nbn2c_branch2c_mean\nbn2c_branch2c_variance\nres3a_branch2a_weights\nbn3a_branch2a_scale\nbn3a_branch2a_offset\nbn3a_branch2a_mean\nbn3a_branch2a_variance\nres3a_branch2b_weights\nbn3a_branch2b_scale\nbn3a_branch2b_offset\nbn3a_branch2b_mean\nbn3a_branch2b_variance\nres3a_branch2c_weights\nbn3a_branch2c_scale\nbn3a_branch2c_offset\nbn3a_branch2c_mean\nbn3a_branch2c_variance\nres3a_branch1_weights\nbn3a_branch1_scale\nbn3a_branch1_offset\nbn3a_branch1_mean\nbn3a_branch1_variance\nres3b_branch2a_weights\nbn3b_branch2a_scale\nbn3b_branch2a_offset\nbn3b_branch2a_mean\nbn3b_branch2a_variance\nres3b_branch2b_weights\nbn3b_branch2b_scale\nbn3b_branch2b_offset\nbn3b_branch2b_mean\nbn3b_branch2b_variance\nres3b_branch2c_weights\nbn3b_branch2c_scale\nbn3b_branch2c_offset\nbn3b_branch2c_mean\nbn3b_branch2c_variance\nres3c_branch2a_weights\nb
n3c_branch2a_scale\nbn3c_branch2a_offset\nbn3c_branch2a_mean\nbn3c_branch2a_variance\nres3c_branch2b_weights\nbn3c_branch2b_scale\nbn3c_branch2b_offset\nbn3c_branch2b_mean\nbn3c_branch2b_variance\nres3c_branch2c_weights\nbn3c_branch2c_scale\nbn3c_branch2c_offset\nbn3c_branch2c_mean\nbn3c_branch2c_variance\nres3d_branch2a_weights\nbn3d_branch2a_scale\nbn3d_branch2a_offset\nbn3d_branch2a_mean\nbn3d_branch2a_variance\nres3d_branch2b_weights\nbn3d_branch2b_scale\nbn3d_branch2b_offset\nbn3d_branch2b_mean\nbn3d_branch2b_variance\nres3d_branch2c_weights\nbn3d_branch2c_scale\nbn3d_branch2c_offset\nbn3d_branch2c_mean\nbn3d_branch2c_variance\nres4a_branch2a_weights\nbn4a_branch2a_scale\nbn4a_branch2a_offset\nbn4a_branch2a_mean\nbn4a_branch2a_variance\nres4a_branch2b_weights\nbn4a_branch2b_scale\nbn4a_branch2b_offset\nbn4a_branch2b_mean\nbn4a_branch2b_variance\nres4a_branch2c_weights\nbn4a_branch2c_scale\nbn4a_branch2c_offset\nbn4a_branch2c_mean\nbn4a_branch2c_variance\nres4a_branch1_weights\nbn4a_branch1_scale\nbn4a_branch1_offset\nbn4a_branch1_mean\nbn4a_branch1_variance\nres4b1_branch2a_weights\nbn4b1_branch2a_scale\nbn4b1_branch2a_offset\nbn4b1_branch2a_mean\nbn4b1_branch2a_variance\nres4b1_branch2b_weights\nbn4b1_branch2b_scale\nbn4b1_branch2b_offset\nbn4b1_branch2b_mean\nbn4b1_branch2b_variance\nres4b1_branch2c_weights\nbn4b1_branch2c_scale\nbn4b1_branch2c_offset\nbn4b1_branch2c_mean\nbn4b1_branch2c_variance\nres4b2_branch2a_weights\nbn4b2_branch2a_scale\nbn4b2_branch2a_offset\nbn4b2_branch2a_mean\nbn4b2_branch2a_variance\nres4b2_branch2b_weights\nbn4b2_branch2b_scale\nbn4b2_branch2b_offset\nbn4b2_branch2b_mean\nbn4b2_branch2b_variance\nres4b2_branch2c_weights\nbn4b2_branch2c_scale\nbn4b2_branch2c_offset\nbn4b2_branch2c_mean\nbn4b2_branch2c_variance\nres4b3_branch2a_weights\nbn4b3_branch2a_scale\nbn4b3_branch2a_offset\nbn4b3_branch2a_mean\nbn4b3_branch2a_variance\nres4b3_branch2b_weights\nbn4b3_branch2b_scale\nbn4b3_branch2b_offset\nbn4b3_branch2b_mean\nbn4b3_branch2b_va
riance\nres4b3_branch2c_weights\nbn4b3_branch2c_scale\nbn4b3_branch2c_offset\nbn4b3_branch2c_mean\nbn4b3_branch2c_variance\nres4b4_branch2a_weights\nbn4b4_branch2a_scale\nbn4b4_branch2a_offset\nbn4b4_branch2a_mean\nbn4b4_branch2a_variance\nres4b4_branch2b_weights\nbn4b4_branch2b_scale\nbn4b4_branch2b_offset\nbn4b4_branch2b_mean\nbn4b4_branch2b_variance\nres4b4_branch2c_weights\nbn4b4_branch2c_scale\nbn4b4_branch2c_offset\nbn4b4_branch2c_mean\nbn4b4_branch2c_variance\nres4b5_branch2a_weights\nbn4b5_branch2a_scale\nbn4b5_branch2a_offset\nbn4b5_branch2a_mean\nbn4b5_branch2a_variance\nres4b5_branch2b_weights\nbn4b5_branch2b_scale\nbn4b5_branch2b_offset\nbn4b5_branch2b_mean\nbn4b5_branch2b_variance\nres4b5_branch2c_weights\nbn4b5_branch2c_scale\nbn4b5_branch2c_offset\nbn4b5_branch2c_mean\nbn4b5_branch2c_variance\nres4b6_branch2a_weights\nbn4b6_branch2a_scale\nbn4b6_branch2a_offset\nbn4b6_branch2a_mean\nbn4b6_branch2a_variance\nres4b6_branch2b_weights\nbn4b6_branch2b_scale\nbn4b6_branch2b_offset\nbn4b6_branch2b_mean\nbn4b6_branch2b_variance\nres4b6_branch2c_weights\nbn4b6_branch2c_scale\nbn4b6_branch2c_offset\nbn4b6_branch2c_mean\nbn4b6_branch2c_variance\nres4b7_branch2a_weights\nbn4b7_branch2a_scale\nbn4b7_branch2a_offset\nbn4b7_branch2a_mean\nbn4b7_branch2a_variance\nres4b7_branch2b_weights\nbn4b7_branch2b_scale\nbn4b7_branch2b_offset\nbn4b7_branch2b_mean\nbn4b7_branch2b_variance\nres4b7_branch2c_weights\nbn4b7_branch2c_scale\nbn4b7_branch2c_offset\nbn4b7_branch2c_mean\nbn4b7_branch2c_variance\nres4b8_branch2a_weights\nbn4b8_branch2a_scale\nbn4b8_branch2a_offset\nbn4b8_branch2a_mean\nbn4b8_branch2a_variance\nres4b8_branch2b_weights\nbn4b8_branch2b_scale\nbn4b8_branch2b_offset\nbn4b8_branch2b_mean\nbn4b8_branch2b_variance\nres4b8_branch2c_weights\nbn4b8_branch2c_scale\nbn4b8_branch2c_offset\nbn4b8_branch2c_mean\nbn4b8_branch2c_variance\nres4b9_branch2a_weights\nbn4b9_branch2a_scale\nbn4b9_branch2a_offset\nbn4b9_branch2a_mean\nbn4b9_branch2a_variance\nres4b9_branch2b_weig
hts\nbn4b9_branch2b_scale\nbn4b9_branch2b_offset\nbn4b9_branch2b_mean\nbn4b9_branch2b_variance\nres4b9_branch2c_weights\nbn4b9_branch2c_scale\nbn4b9_branch2c_offset\nbn4b9_branch2c_mean\nbn4b9_branch2c_variance\nres4b10_branch2a_weights\nbn4b10_branch2a_scale\nbn4b10_branch2a_offset\nbn4b10_branch2a_mean\nbn4b10_branch2a_variance\nres4b10_branch2b_weights\nbn4b10_branch2b_scale\nbn4b10_branch2b_offset\nbn4b10_branch2b_mean\nbn4b10_branch2b_variance\nres4b10_branch2c_weights\nbn4b10_branch2c_scale\nbn4b10_branch2c_offset\nbn4b10_branch2c_mean\nbn4b10_branch2c_variance\nres4b11_branch2a_weights\nbn4b11_branch2a_scale\nbn4b11_branch2a_offset\nbn4b11_branch2a_mean\nbn4b11_branch2a_variance\nres4b11_branch2b_weights\nbn4b11_branch2b_scale\nbn4b11_branch2b_offset\nbn4b11_branch2b_mean\nbn4b11_branch2b_variance\nres4b11_branch2c_weights\nbn4b11_branch2c_scale\nbn4b11_branch2c_offset\nbn4b11_branch2c_mean\nbn4b11_branch2c_variance\nres4b12_branch2a_weights\nbn4b12_branch2a_scale\nbn4b12_branch2a_offset\nbn4b12_branch2a_mean\nbn4b12_branch2a_variance\nres4b12_branch2b_weights\nbn4b12_branch2b_scale\nbn4b12_branch2b_offset\nbn4b12_branch2b_mean\nbn4b12_branch2b_variance\nres4b12_branch2c_weights\nbn4b12_branch2c_scale\nbn4b12_branch2c_offset\nbn4b12_branch2c_mean\nbn4b12_branch2c_variance\nres4b13_branch2a_weights\nbn4b13_branch2a_scale\nbn4b13_branch2a_offset\nbn4b13_branch2a_mean\nbn4b13_branch2a_variance\nres4b13_branch2b_weights\nbn4b13_branch2b_scale\nbn4b13_branch2b_offset\nbn4b13_branch2b_mean\nbn4b13_branch2b_variance\nres4b13_branch2c_weights\nbn4b13_branch2c_scale\nbn4b13_branch2c_offset\nbn4b13_branch2c_mean\nbn4b13_branch2c_variance\nres4b14_branch2a_weights\nbn4b14_branch2a_scale\nbn4b14_branch2a_offset\nbn4b14_branch2a_mean\nbn4b14_branch2a_variance\nres4b14_branch2b_weights\nbn4b14_branch2b_scale\nbn4b14_branch2b_offset\nbn4b14_branch2b_mean\nbn4b14_branch2b_variance\nres4b14_branch2c_weights\nbn4b14_branch2c_scale\nbn4b14_branch2c_offset\nbn4b14_branch2c_mean\
nbn4b14_branch2c_variance\nres4b15_branch2a_weights\nbn4b15_branch2a_scale\nbn4b15_branch2a_offset\nbn4b15_branch2a_mean\nbn4b15_branch2a_variance\nres4b15_branch2b_weights\nbn4b15_branch2b_scale\nbn4b15_branch2b_offset\nbn4b15_branch2b_mean\nbn4b15_branch2b_variance\nres4b15_branch2c_weights\nbn4b15_branch2c_scale\nbn4b15_branch2c_offset\nbn4b15_branch2c_mean\nbn4b15_branch2c_variance\nres4b16_branch2a_weights\nbn4b16_branch2a_scale\nbn4b16_branch2a_offset\nbn4b16_branch2a_mean\nbn4b16_branch2a_variance\nres4b16_branch2b_weights\nbn4b16_branch2b_scale\nbn4b16_branch2b_offset\nbn4b16_branch2b_mean\nbn4b16_branch2b_variance\nres4b16_branch2c_weights\nbn4b16_branch2c_scale\nbn4b16_branch2c_offset\nbn4b16_branch2c_mean\nbn4b16_branch2c_variance\nres4b17_branch2a_weights\nbn4b17_branch2a_scale\nbn4b17_branch2a_offset\nbn4b17_branch2a_mean\nbn4b17_branch2a_variance\nres4b17_branch2b_weights\nbn4b17_branch2b_scale\nbn4b17_branch2b_offset\nbn4b17_branch2b_mean\nbn4b17_branch2b_variance\nres4b17_branch2c_weights\nbn4b17_branch2c_scale\nbn4b17_branch2c_offset\nbn4b17_branch2c_mean\nbn4b17_branch2c_variance\nres4b18_branch2a_weights\nbn4b18_branch2a_scale\nbn4b18_branch2a_offset\nbn4b18_branch2a_mean\nbn4b18_branch2a_variance\nres4b18_branch2b_weights\nbn4b18_branch2b_scale\nbn4b18_branch2b_offset\nbn4b18_branch2b_mean\nbn4b18_branch2b_variance\nres4b18_branch2c_weights\nbn4b18_branch2c_scale\nbn4b18_branch2c_offset\nbn4b18_branch2c_mean\nbn4b18_branch2c_variance\nres4b19_branch2a_weights\nbn4b19_branch2a_scale\nbn4b19_branch2a_offset\nbn4b19_branch2a_mean\nbn4b19_branch2a_variance\nres4b19_branch2b_weights\nbn4b19_branch2b_scale\nbn4b19_branch2b_offset\nbn4b19_branch2b_mean\nbn4b19_branch2b_variance\nres4b19_branch2c_weights\nbn4b19_branch2c_scale\nbn4b19_branch2c_offset\nbn4b19_branch2c_mean\nbn4b19_branch2c_variance\nres4b20_branch2a_weights\nbn4b20_branch2a_scale\nbn4b20_branch2a_offset\nbn4b20_branch2a_mean\nbn4b20_branch2a_variance\nres4b20_branch2b_weights\nbn4b20_bran
ch2b_scale\nbn4b20_branch2b_offset\nbn4b20_branch2b_mean\nbn4b20_branch2b_variance\nres4b20_branch2c_weights\nbn4b20_branch2c_scale\nbn4b20_branch2c_offset\nbn4b20_branch2c_mean\nbn4b20_branch2c_variance\nres4b21_branch2a_weights\nbn4b21_branch2a_scale\nbn4b21_branch2a_offset\nbn4b21_branch2a_mean\nbn4b21_branch2a_variance\nres4b21_branch2b_weights\nbn4b21_branch2b_scale\nbn4b21_branch2b_offset\nbn4b21_branch2b_mean\nbn4b21_branch2b_variance\nres4b21_branch2c_weights\nbn4b21_branch2c_scale\nbn4b21_branch2c_offset\nbn4b21_branch2c_mean\nbn4b21_branch2c_variance\nres4b22_branch2a_weights\nbn4b22_branch2a_scale\nbn4b22_branch2a_offset\nbn4b22_branch2a_mean\nbn4b22_branch2a_variance\nres4b22_branch2b_weights\nbn4b22_branch2b_scale\nbn4b22_branch2b_offset\nbn4b22_branch2b_mean\nbn4b22_branch2b_variance\nres4b22_branch2c_weights\nbn4b22_branch2c_scale\nbn4b22_branch2c_offset\nbn4b22_branch2c_mean\nbn4b22_branch2c_variance\nres5a_branch2a_weights\nbn5a_branch2a_scale\nbn5a_branch2a_offset\nbn5a_branch2a_mean\nbn5a_branch2a_variance\nres5a_branch2b_weights\nbn5a_branch2b_scale\nbn5a_branch2b_offset\nbn5a_branch2b_mean\nbn5a_branch2b_variance\nres5a_branch2c_weights\nbn5a_branch2c_scale\nbn5a_branch2c_offset\nbn5a_branch2c_mean\nbn5a_branch2c_variance\nres5a_branch1_weights\nbn5a_branch1_scale\nbn5a_branch1_offset\nbn5a_branch1_mean\nbn5a_branch1_variance\nres5b_branch2a_weights\nbn5b_branch2a_scale\nbn5b_branch2a_offset\nbn5b_branch2a_mean\nbn5b_branch2a_variance\nres5b_branch2b_weights\nbn5b_branch2b_scale\nbn5b_branch2b_offset\nbn5b_branch2b_mean\nbn5b_branch2b_variance\nres5b_branch2c_weights\nbn5b_branch2c_scale\nbn5b_branch2c_offset\nbn5b_branch2c_mean\nbn5b_branch2c_variance\nres5c_branch2a_weights\nbn5c_branch2a_scale\nbn5c_branch2a_offset\nbn5c_branch2a_mean\nbn5c_branch2a_variance\nres5c_branch2b_weights\nbn5c_branch2b_scale\nbn5c_branch2b_offset\nbn5c_branch2b_mean\nbn5c_branch2b_variance\nres5c_branch2c_weights\nbn5c_branch2c_scale\nbn5c_branch2c_offset\nbn5c_bra
nch2c_mean\nbn5c_branch2c_variance\n\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/name_map.json",
    "content": "{\"res4a_branch1_weights\": \"ConvNdBackward95_weights\", \"bn2a_branch2a_mean\": \"BatchNormBackward6_bn_mean\", \"bn4b17_branch2c_offset\": \"BatchNormBackward282_bn_offset\", \"bn4b12_branch2a_variance\": \"BatchNormBackward221_bn_variance\", \"res4b1_branch2a_weights\": \"ConvNdBackward99_weights\", \"bn4b2_branch2a_scale\": \"BatchNormBackward111_bn_scale\", \"bn4b18_branch2c_scale\": \"BatchNormBackward293_bn_scale\", \"res4b7_branch2c_weights\": \"ConvNdBackward171_weights\", \"bn3d_branch2a_scale\": \"BatchNormBackward76_bn_scale\", \"res4b13_branch2a_weights\": \"ConvNdBackward231_weights\", \"bn2c_branch2c_offset\": \"BatchNormBackward36_bn_offset\", \"bn4b18_branch2b_scale\": \"BatchNormBackward290_bn_scale\", \"bn4b20_branch2c_scale\": \"BatchNormBackward315_bn_scale\", \"bn4b7_branch2a_mean\": \"BatchNormBackward166_bn_mean\", \"bn3b_branch2b_mean\": \"BatchNormBackward57_bn_mean\", \"bn5a_branch1_variance\": \"BatchNormBackward351_bn_variance\", \"bn2b_branch2a_scale\": \"BatchNormBackward19_bn_scale\", \"bn4b16_branch2c_variance\": \"BatchNormBackward271_bn_variance\", \"bn4b12_branch2b_variance\": \"BatchNormBackward224_bn_variance\", \"bn4b19_branch2b_scale\": \"BatchNormBackward301_bn_scale\", \"bn4b20_branch2a_offset\": \"BatchNormBackward309_bn_offset\", \"bn4b6_branch2c_scale\": \"BatchNormBackward161_bn_scale\", \"bn4b9_branch2a_scale\": \"BatchNormBackward188_bn_scale\", \"bn3b_branch2b_offset\": \"BatchNormBackward57_bn_offset\", \"bn4b16_branch2c_scale\": \"BatchNormBackward271_bn_scale\", \"bn4b22_branch2a_offset\": \"BatchNormBackward331_bn_offset\", \"bn2c_branch2a_offset\": \"BatchNormBackward30_bn_offset\", \"bn4b2_branch2c_scale\": \"BatchNormBackward117_bn_scale\", \"bn4b8_branch2a_offset\": \"BatchNormBackward177_bn_offset\", \"res4b18_branch2b_weights\": \"ConvNdBackward289_weights\", \"res4b3_branch2c_weights\": \"ConvNdBackward127_weights\", \"bn3a_branch1_variance\": \"BatchNormBackward50_bn_variance\", 
\"bn4a_branch2a_variance\": \"BatchNormBackward87_bn_variance\", \"bn4b22_branch2a_mean\": \"BatchNormBackward331_bn_mean\", \"res4b19_branch2c_weights\": \"ConvNdBackward303_weights\", \"res5a_branch2a_weights\": \"ConvNdBackward341_weights\", \"bn4b14_branch2a_variance\": \"BatchNormBackward243_bn_variance\", \"bn4b19_branch2b_mean\": \"BatchNormBackward301_bn_mean\", \"bn3a_branch2a_scale\": \"BatchNormBackward41_bn_scale\", \"bn4b6_branch2c_offset\": \"BatchNormBackward161_bn_offset\", \"bn5c_branch2a_scale\": \"BatchNormBackward366_bn_scale\", \"bn4b5_branch2a_scale\": \"BatchNormBackward144_bn_scale\", \"bn4b17_branch2a_variance\": \"BatchNormBackward276_bn_variance\", \"bn4a_branch2b_variance\": \"BatchNormBackward90_bn_variance\", \"bn4b3_branch2b_offset\": \"BatchNormBackward125_bn_offset\", \"bn2a_branch2a_variance\": \"BatchNormBackward6_bn_variance\", \"res4b4_branch2a_weights\": \"ConvNdBackward132_weights\", \"res5c_branch2b_weights\": \"ConvNdBackward368_weights\", \"bn5b_branch2c_offset\": \"BatchNormBackward361_bn_offset\", \"bn2c_branch2b_mean\": \"BatchNormBackward33_bn_mean\", \"bn4b10_branch2a_variance\": \"BatchNormBackward199_bn_variance\", \"bn5b_branch2b_offset\": \"BatchNormBackward358_bn_offset\", \"bn4b17_branch2b_variance\": \"BatchNormBackward279_bn_variance\", \"bn4b2_branch2b_scale\": \"BatchNormBackward114_bn_scale\", \"bn4b19_branch2a_mean\": \"BatchNormBackward298_bn_mean\", \"res5a_branch2c_weights\": \"ConvNdBackward347_weights\", \"bn4b22_branch2b_scale\": \"BatchNormBackward334_bn_scale\", \"bn4b14_branch2b_scale\": \"BatchNormBackward246_bn_scale\", \"bn3c_branch2c_variance\": \"BatchNormBackward71_bn_variance\", \"bn4b5_branch2a_variance\": \"BatchNormBackward144_bn_variance\", \"bn4b17_branch2c_variance\": \"BatchNormBackward282_bn_variance\", \"bn2b_branch2b_mean\": \"BatchNormBackward22_bn_mean\", \"bn3a_branch2c_offset\": \"BatchNormBackward47_bn_offset\", \"bn4b19_branch2a_variance\": 
\"BatchNormBackward298_bn_variance\", \"bn3a_branch2a_variance\": \"BatchNormBackward41_bn_variance\", \"bn4b12_branch2c_offset\": \"BatchNormBackward227_bn_offset\", \"res4b20_branch2a_weights\": \"ConvNdBackward308_weights\", \"bn2c_branch2c_variance\": \"BatchNormBackward36_bn_variance\", \"bn3c_branch2c_scale\": \"BatchNormBackward71_bn_scale\", \"bn4b18_branch2b_variance\": \"BatchNormBackward290_bn_variance\", \"bn4b12_branch2a_offset\": \"BatchNormBackward221_bn_offset\", \"bn5a_branch2b_mean\": \"BatchNormBackward345_bn_mean\", \"bn4b20_branch2b_scale\": \"BatchNormBackward312_bn_scale\", \"bn5a_branch2c_mean\": \"BatchNormBackward348_bn_mean\", \"res2a_branch2a_weights\": \"ConvNdBackward5_weights\", \"res4b3_branch2b_weights\": \"ConvNdBackward124_weights\", \"bn2c_branch2b_scale\": \"BatchNormBackward33_bn_scale\", \"bn5c_branch2a_mean\": \"BatchNormBackward366_bn_mean\", \"res4b14_branch2c_weights\": \"ConvNdBackward248_weights\", \"bn2b_branch2a_variance\": \"BatchNormBackward19_bn_variance\", \"bn4b15_branch2c_mean\": \"BatchNormBackward260_bn_mean\", \"bn4b4_branch2b_scale\": \"BatchNormBackward136_bn_scale\", \"bn4b12_branch2c_scale\": \"BatchNormBackward227_bn_scale\", \"res2a_branch1_weights\": \"ConvNdBackward14_weights\", \"bn4b22_branch2c_scale\": \"BatchNormBackward337_bn_scale\", \"bn4b5_branch2b_scale\": \"BatchNormBackward147_bn_scale\", \"res4b2_branch2a_weights\": \"ConvNdBackward110_weights\", \"res4b10_branch2a_weights\": \"ConvNdBackward198_weights\", \"bn4b12_branch2b_mean\": \"BatchNormBackward224_bn_mean\", \"bn5a_branch1_mean\": \"BatchNormBackward351_bn_mean\", \"bn4b11_branch2c_offset\": \"BatchNormBackward216_bn_offset\", \"bn3c_branch2b_variance\": \"BatchNormBackward68_bn_variance\", \"bn4b20_branch2c_offset\": \"BatchNormBackward315_bn_offset\", \"bn4b9_branch2c_scale\": \"BatchNormBackward194_bn_scale\", \"bn4b17_branch2b_mean\": \"BatchNormBackward279_bn_mean\", \"bn4b16_branch2b_variance\": 
\"BatchNormBackward268_bn_variance\", \"bn4b16_branch2a_mean\": \"BatchNormBackward265_bn_mean\", \"bn4b14_branch2a_mean\": \"BatchNormBackward243_bn_mean\", \"bn4b9_branch2a_variance\": \"BatchNormBackward188_bn_variance\", \"res2c_branch2c_weights\": \"ConvNdBackward35_weights\", \"bn4b22_branch2c_offset\": \"BatchNormBackward337_bn_offset\", \"bn4b4_branch2a_scale\": \"BatchNormBackward133_bn_scale\", \"bn4b11_branch2a_offset\": \"BatchNormBackward210_bn_offset\", \"res4b20_branch2c_weights\": \"ConvNdBackward314_weights\", \"bn2c_branch2a_mean\": \"BatchNormBackward30_bn_mean\", \"bn4b10_branch2a_scale\": \"BatchNormBackward199_bn_scale\", \"bn4b16_branch2b_mean\": \"BatchNormBackward268_bn_mean\", \"bn3a_branch1_offset\": \"BatchNormBackward50_bn_offset\", \"bn4b15_branch2c_scale\": \"BatchNormBackward260_bn_scale\", \"res4b16_branch2b_weights\": \"ConvNdBackward267_weights\", \"bn4b19_branch2c_offset\": \"BatchNormBackward304_bn_offset\", \"bn2a_branch2b_scale\": \"BatchNormBackward9_bn_scale\", \"bn5b_branch2a_scale\": \"BatchNormBackward355_bn_scale\", \"bn4b11_branch2b_variance\": \"BatchNormBackward213_bn_variance\", \"bn4b14_branch2b_mean\": \"BatchNormBackward246_bn_mean\", \"bn2c_branch2a_variance\": \"BatchNormBackward30_bn_variance\", \"bn5c_branch2b_mean\": \"BatchNormBackward369_bn_mean\", \"res5c_branch2a_weights\": \"ConvNdBackward365_weights\", \"bn4b1_branch2c_scale\": \"BatchNormBackward106_bn_scale\", \"bn4b5_branch2c_offset\": \"BatchNormBackward150_bn_offset\", \"res4b21_branch2c_weights\": \"ConvNdBackward325_weights\", \"bn4b21_branch2a_offset\": \"BatchNormBackward320_bn_offset\", \"bn4b12_branch2a_mean\": \"BatchNormBackward221_bn_mean\", \"res4b19_branch2a_weights\": \"ConvNdBackward297_weights\", \"bn5a_branch2c_scale\": \"BatchNormBackward348_bn_scale\", \"res4b11_branch2c_weights\": \"ConvNdBackward215_weights\", \"bn3b_branch2c_variance\": \"BatchNormBackward60_bn_variance\", \"bn4b17_branch2a_mean\": 
\"BatchNormBackward276_bn_mean\", \"bn4b15_branch2c_offset\": \"BatchNormBackward260_bn_offset\", \"bn4b10_branch2a_offset\": \"BatchNormBackward199_bn_offset\", \"bn3d_branch2a_mean\": \"BatchNormBackward76_bn_mean\", \"bn4b20_branch2c_variance\": \"BatchNormBackward315_bn_variance\", \"res5a_branch2b_weights\": \"ConvNdBackward344_weights\", \"res4b2_branch2c_weights\": \"ConvNdBackward116_weights\", \"bn5c_branch2c_scale\": \"BatchNormBackward372_bn_scale\", \"bn4b6_branch2a_offset\": \"BatchNormBackward155_bn_offset\", \"bn4b10_branch2c_variance\": \"BatchNormBackward205_bn_variance\", \"bn4b1_branch2b_variance\": \"BatchNormBackward103_bn_variance\", \"bn5b_branch2a_offset\": \"BatchNormBackward355_bn_offset\", \"res4b7_branch2b_weights\": \"ConvNdBackward168_weights\", \"bn4b3_branch2c_scale\": \"BatchNormBackward128_bn_scale\", \"bn4b15_branch2a_variance\": \"BatchNormBackward254_bn_variance\", \"bn4b5_branch2c_mean\": \"BatchNormBackward150_bn_mean\", \"res2c_branch2b_weights\": \"ConvNdBackward32_weights\", \"bn4b19_branch2c_variance\": \"BatchNormBackward304_bn_variance\", \"bn4b5_branch2b_mean\": \"BatchNormBackward147_bn_mean\", \"bn3a_branch2a_offset\": \"BatchNormBackward41_bn_offset\", \"bn2a_branch2c_offset\": \"BatchNormBackward12_bn_offset\", \"bn4b10_branch2b_offset\": \"BatchNormBackward202_bn_offset\", \"res4b22_branch2c_weights\": \"ConvNdBackward336_weights\", \"bn4b7_branch2c_offset\": \"BatchNormBackward172_bn_offset\", \"bn4b14_branch2c_mean\": \"BatchNormBackward249_bn_mean\", \"bn4b5_branch2a_mean\": \"BatchNormBackward144_bn_mean\", \"bn4b3_branch2c_variance\": \"BatchNormBackward128_bn_variance\", \"bn3d_branch2a_variance\": \"BatchNormBackward76_bn_variance\", \"bn4b10_branch2b_variance\": \"BatchNormBackward202_bn_variance\", \"res2b_branch2a_weights\": \"ConvNdBackward18_weights\", \"res4b22_branch2b_weights\": \"ConvNdBackward333_weights\", \"bn2b_branch2c_mean\": \"BatchNormBackward25_bn_mean\", \"bn4b17_branch2c_mean\": 
\"BatchNormBackward282_bn_mean\", \"bn2a_branch2c_mean\": \"BatchNormBackward12_bn_mean\", \"bn4a_branch2b_scale\": \"BatchNormBackward90_bn_scale\", \"bn3a_branch2b_variance\": \"BatchNormBackward44_bn_variance\", \"bn4b3_branch2b_mean\": \"BatchNormBackward125_bn_mean\", \"bn2c_branch2b_variance\": \"BatchNormBackward33_bn_variance\", \"res4b19_branch2b_weights\": \"ConvNdBackward300_weights\", \"res4b16_branch2c_weights\": \"ConvNdBackward270_weights\", \"bn4b14_branch2b_offset\": \"BatchNormBackward246_bn_offset\", \"bn4b15_branch2b_scale\": \"BatchNormBackward257_bn_scale\", \"bn4b13_branch2a_mean\": \"BatchNormBackward232_bn_mean\", \"res3a_branch1_weights\": \"ConvNdBackward49_weights\", \"bn4b4_branch2c_variance\": \"BatchNormBackward139_bn_variance\", \"bn4b2_branch2c_mean\": \"BatchNormBackward117_bn_mean\", \"bn3d_branch2b_variance\": \"BatchNormBackward79_bn_variance\", \"res4b1_branch2b_weights\": \"ConvNdBackward102_weights\", \"bn4b21_branch2a_mean\": \"BatchNormBackward320_bn_mean\", \"res3c_branch2a_weights\": \"ConvNdBackward64_weights\", \"res4b12_branch2c_weights\": \"ConvNdBackward226_weights\", \"bn4b5_branch2a_offset\": \"BatchNormBackward144_bn_offset\", \"bn5a_branch2a_offset\": \"BatchNormBackward342_bn_offset\", \"bn3b_branch2c_offset\": \"BatchNormBackward60_bn_offset\", \"bn4b19_branch2a_offset\": \"BatchNormBackward298_bn_offset\", \"bn3c_branch2b_mean\": \"BatchNormBackward68_bn_mean\", \"bn3c_branch2c_offset\": \"BatchNormBackward71_bn_offset\", \"res4b21_branch2b_weights\": \"ConvNdBackward322_weights\", \"bn4b13_branch2a_scale\": \"BatchNormBackward232_bn_scale\", \"bn4b13_branch2c_scale\": \"BatchNormBackward238_bn_scale\", \"bn4b15_branch2b_variance\": \"BatchNormBackward257_bn_variance\", \"bn4b9_branch2c_mean\": \"BatchNormBackward194_bn_mean\", \"bn4b19_branch2a_scale\": \"BatchNormBackward298_bn_scale\", \"bn4b9_branch2a_mean\": \"BatchNormBackward188_bn_mean\", \"bn4a_branch1_variance\": \"BatchNormBackward96_bn_variance\", 
\"bn4b10_branch2a_mean\": \"BatchNormBackward199_bn_mean\", \"bn5b_branch2a_variance\": \"BatchNormBackward355_bn_variance\", \"bn4b21_branch2a_variance\": \"BatchNormBackward320_bn_variance\", \"bn4b9_branch2b_variance\": \"BatchNormBackward191_bn_variance\", \"bn5c_branch2c_offset\": \"BatchNormBackward372_bn_offset\", \"bn4b6_branch2c_mean\": \"BatchNormBackward161_bn_mean\", \"bn5a_branch2b_scale\": \"BatchNormBackward345_bn_scale\", \"bn4b11_branch2b_scale\": \"BatchNormBackward213_bn_scale\", \"bn4b21_branch2c_scale\": \"BatchNormBackward326_bn_scale\", \"bn5c_branch2a_offset\": \"BatchNormBackward366_bn_offset\", \"bn3b_branch2a_mean\": \"BatchNormBackward54_bn_mean\", \"res2b_branch2c_weights\": \"ConvNdBackward24_weights\", \"bn4b5_branch2b_offset\": \"BatchNormBackward147_bn_offset\", \"bn3d_branch2c_variance\": \"BatchNormBackward82_bn_variance\", \"bn3a_branch1_mean\": \"BatchNormBackward50_bn_mean\", \"bn2b_branch2c_offset\": \"BatchNormBackward25_bn_offset\", \"bn4b21_branch2c_mean\": \"BatchNormBackward326_bn_mean\", \"res4b15_branch2b_weights\": \"ConvNdBackward256_weights\", \"bn5b_branch2b_variance\": \"BatchNormBackward358_bn_variance\", \"res3d_branch2a_weights\": \"ConvNdBackward75_weights\", \"bn4b2_branch2a_offset\": \"BatchNormBackward111_bn_offset\", \"bn4b7_branch2a_scale\": \"BatchNormBackward166_bn_scale\", \"bn5c_branch2c_variance\": \"BatchNormBackward372_bn_variance\", \"bn5c_branch2b_scale\": \"BatchNormBackward369_bn_scale\", \"bn3b_branch2b_variance\": \"BatchNormBackward57_bn_variance\", \"bn2a_branch1_offset\": \"BatchNormBackward15_bn_offset\", \"bn4b8_branch2c_variance\": \"BatchNormBackward183_bn_variance\", \"bn4b21_branch2b_offset\": \"BatchNormBackward323_bn_offset\", \"bn4b15_branch2a_mean\": \"BatchNormBackward254_bn_mean\", \"res4b6_branch2a_weights\": \"ConvNdBackward154_weights\", \"bn5a_branch1_offset\": \"BatchNormBackward351_bn_offset\", \"bn4b5_branch2c_scale\": \"BatchNormBackward150_bn_scale\", 
\"bn3a_branch2c_variance\": \"BatchNormBackward47_bn_variance\", \"bn5b_branch2c_variance\": \"BatchNormBackward361_bn_variance\", \"bn3a_branch2a_mean\": \"BatchNormBackward41_bn_mean\", \"bn4b7_branch2b_scale\": \"BatchNormBackward169_bn_scale\", \"bn5a_branch2b_offset\": \"BatchNormBackward345_bn_offset\", \"bn4b19_branch2c_scale\": \"BatchNormBackward304_bn_scale\", \"res2a_branch2b_weights\": \"ConvNdBackward8_weights\", \"bn2c_branch2b_offset\": \"BatchNormBackward33_bn_offset\", \"bn3b_branch2c_mean\": \"BatchNormBackward60_bn_mean\", \"res4b16_branch2a_weights\": \"ConvNdBackward264_weights\", \"bn4b18_branch2b_offset\": \"BatchNormBackward290_bn_offset\", \"bn3a_branch2c_mean\": \"BatchNormBackward47_bn_mean\", \"bn4a_branch2b_offset\": \"BatchNormBackward90_bn_offset\", \"bn4b18_branch2b_mean\": \"BatchNormBackward290_bn_mean\", \"bn4b10_branch2c_mean\": \"BatchNormBackward205_bn_mean\", \"res3b_branch2c_weights\": \"ConvNdBackward59_weights\", \"bn3c_branch2a_scale\": \"BatchNormBackward65_bn_scale\", \"bn4b13_branch2b_variance\": \"BatchNormBackward235_bn_variance\", \"bn4b8_branch2c_offset\": \"BatchNormBackward183_bn_offset\", \"res4b14_branch2b_weights\": \"ConvNdBackward245_weights\", \"bn4b19_branch2b_offset\": \"BatchNormBackward301_bn_offset\", \"res4b18_branch2a_weights\": \"ConvNdBackward286_weights\", \"bn4b3_branch2c_mean\": \"BatchNormBackward128_bn_mean\", \"res4b11_branch2a_weights\": \"ConvNdBackward209_weights\", \"bn_conv1_variance\": \"BatchNormBackward2_bn_variance\", \"bn4b22_branch2b_variance\": \"BatchNormBackward334_bn_variance\", \"bn2b_branch2a_offset\": \"BatchNormBackward19_bn_offset\", \"bn_conv1_scale\": \"BatchNormBackward2_bn_scale\", \"bn5a_branch1_scale\": \"BatchNormBackward351_bn_scale\", \"bn4b7_branch2a_offset\": \"BatchNormBackward166_bn_offset\", \"bn4b9_branch2c_offset\": \"BatchNormBackward194_bn_offset\", \"res4b3_branch2a_weights\": \"ConvNdBackward121_weights\", \"bn2a_branch1_variance\": 
\"BatchNormBackward15_bn_variance\", \"bn4b3_branch2a_variance\": \"BatchNormBackward122_bn_variance\", \"res4b9_branch2a_weights\": \"ConvNdBackward187_weights\", \"bn4b9_branch2a_offset\": \"BatchNormBackward188_bn_offset\", \"bn3c_branch2a_mean\": \"BatchNormBackward65_bn_mean\", \"bn2b_branch2b_offset\": \"BatchNormBackward22_bn_offset\", \"res3a_branch2b_weights\": \"ConvNdBackward43_weights\", \"bn4b12_branch2b_offset\": \"BatchNormBackward224_bn_offset\", \"bn4a_branch2c_variance\": \"BatchNormBackward93_bn_variance\", \"bn4b18_branch2a_mean\": \"BatchNormBackward287_bn_mean\", \"bn4b16_branch2c_mean\": \"BatchNormBackward271_bn_mean\", \"bn4b20_branch2b_variance\": \"BatchNormBackward312_bn_variance\", \"bn4b8_branch2b_mean\": \"BatchNormBackward180_bn_mean\", \"bn3c_branch2c_mean\": \"BatchNormBackward71_bn_mean\", \"bn4b13_branch2a_variance\": \"BatchNormBackward232_bn_variance\", \"bn3d_branch2c_offset\": \"BatchNormBackward82_bn_offset\", \"bn4b1_branch2a_scale\": \"BatchNormBackward100_bn_scale\", \"bn4b2_branch2a_mean\": \"BatchNormBackward111_bn_mean\", \"bn4b8_branch2a_scale\": \"BatchNormBackward177_bn_scale\", \"res4b7_branch2a_weights\": \"ConvNdBackward165_weights\", \"bn4b20_branch2a_scale\": \"BatchNormBackward309_bn_scale\", \"bn4b12_branch2b_scale\": \"BatchNormBackward224_bn_scale\", \"bn3d_branch2b_offset\": \"BatchNormBackward79_bn_offset\", \"bn4b21_branch2b_mean\": \"BatchNormBackward323_bn_mean\", \"bn4b1_branch2b_scale\": \"BatchNormBackward103_bn_scale\", \"res3d_branch2c_weights\": \"ConvNdBackward81_weights\", \"bn4b20_branch2b_offset\": \"BatchNormBackward312_bn_offset\", \"bn4b5_branch2c_variance\": \"BatchNormBackward150_bn_variance\", \"res4b5_branch2c_weights\": \"ConvNdBackward149_weights\", \"bn5c_branch2b_variance\": \"BatchNormBackward369_bn_variance\", \"res4b20_branch2b_weights\": \"ConvNdBackward311_weights\", \"bn2a_branch2c_scale\": \"BatchNormBackward12_bn_scale\", \"res4b15_branch2c_weights\": 
\"ConvNdBackward259_weights\", \"bn3b_branch2a_variance\": \"BatchNormBackward54_bn_variance\", \"bn4b18_branch2c_mean\": \"BatchNormBackward293_bn_mean\", \"bn4b22_branch2a_scale\": \"BatchNormBackward331_bn_scale\", \"res3c_branch2b_weights\": \"ConvNdBackward67_weights\", \"bn4b8_branch2a_variance\": \"BatchNormBackward177_bn_variance\", \"res5c_branch2c_weights\": \"ConvNdBackward371_weights\", \"bn4b3_branch2a_offset\": \"BatchNormBackward122_bn_offset\", \"bn5c_branch2b_offset\": \"BatchNormBackward369_bn_offset\", \"bn2a_branch2b_mean\": \"BatchNormBackward9_bn_mean\", \"bn4b4_branch2b_variance\": \"BatchNormBackward136_bn_variance\", \"res3b_branch2b_weights\": \"ConvNdBackward56_weights\", \"bn4b10_branch2b_mean\": \"BatchNormBackward202_bn_mean\", \"bn4b15_branch2b_mean\": \"BatchNormBackward257_bn_mean\", \"bn4b21_branch2c_offset\": \"BatchNormBackward326_bn_offset\", \"res4b2_branch2b_weights\": \"ConvNdBackward113_weights\", \"bn4b8_branch2c_scale\": \"BatchNormBackward183_bn_scale\", \"bn2b_branch2c_variance\": \"BatchNormBackward25_bn_variance\", \"bn4b1_branch2c_mean\": \"BatchNormBackward106_bn_mean\", \"bn3b_branch2a_scale\": \"BatchNormBackward54_bn_scale\", \"bn4b6_branch2c_variance\": \"BatchNormBackward161_bn_variance\", \"res2c_branch2a_weights\": \"ConvNdBackward29_weights\", \"bn4b4_branch2a_variance\": \"BatchNormBackward133_bn_variance\", \"bn3d_branch2c_mean\": \"BatchNormBackward82_bn_mean\", \"bn4b4_branch2c_mean\": \"BatchNormBackward139_bn_mean\", \"bn4b21_branch2b_scale\": \"BatchNormBackward323_bn_scale\", \"bn4b7_branch2b_mean\": \"BatchNormBackward169_bn_mean\", \"res4b12_branch2a_weights\": \"ConvNdBackward220_weights\", \"bn4b1_branch2a_offset\": \"BatchNormBackward100_bn_offset\", \"bn3b_branch2b_scale\": \"BatchNormBackward57_bn_scale\", \"res4b9_branch2c_weights\": \"ConvNdBackward193_weights\", \"bn4b10_branch2c_scale\": \"BatchNormBackward205_bn_scale\", \"bn4b19_branch2c_mean\": \"BatchNormBackward304_bn_mean\", 
\"res4b1_branch2c_weights\": \"ConvNdBackward105_weights\", \"bn4b18_branch2a_scale\": \"BatchNormBackward287_bn_scale\", \"bn3a_branch2c_scale\": \"BatchNormBackward47_bn_scale\", \"bn2a_branch2b_offset\": \"BatchNormBackward9_bn_offset\", \"bn4b2_branch2b_mean\": \"BatchNormBackward114_bn_mean\", \"bn3d_branch2b_scale\": \"BatchNormBackward79_bn_scale\", \"res4a_branch2a_weights\": \"ConvNdBackward86_weights\", \"res4b6_branch2b_weights\": \"ConvNdBackward157_weights\", \"res4b9_branch2b_weights\": \"ConvNdBackward190_weights\", \"bn4b4_branch2a_mean\": \"BatchNormBackward133_bn_mean\", \"bn3b_branch2a_offset\": \"BatchNormBackward54_bn_offset\", \"res4b11_branch2b_weights\": \"ConvNdBackward212_weights\", \"bn4b4_branch2c_scale\": \"BatchNormBackward139_bn_scale\", \"bn4b8_branch2b_offset\": \"BatchNormBackward180_bn_offset\", \"bn4b2_branch2b_variance\": \"BatchNormBackward114_bn_variance\", \"res4b17_branch2b_weights\": \"ConvNdBackward278_weights\", \"res4b15_branch2a_weights\": \"ConvNdBackward253_weights\", \"bn3d_branch2c_scale\": \"BatchNormBackward82_bn_scale\", \"res3d_branch2b_weights\": \"ConvNdBackward78_weights\", \"bn4b13_branch2b_offset\": \"BatchNormBackward235_bn_offset\", \"bn2c_branch2c_mean\": \"BatchNormBackward36_bn_mean\", \"bn4b13_branch2b_mean\": \"BatchNormBackward235_bn_mean\", \"res4b17_branch2a_weights\": \"ConvNdBackward275_weights\", \"bn4b4_branch2b_offset\": \"BatchNormBackward136_bn_offset\", \"res4b18_branch2c_weights\": \"ConvNdBackward292_weights\", \"bn4b10_branch2c_offset\": \"BatchNormBackward205_bn_offset\", \"bn4b10_branch2b_scale\": \"BatchNormBackward202_bn_scale\", \"res4b21_branch2a_weights\": \"ConvNdBackward319_weights\", \"bn4b6_branch2b_scale\": \"BatchNormBackward158_bn_scale\", \"bn5a_branch2c_variance\": \"BatchNormBackward348_bn_variance\", \"conv1_weights\": \"ConvNdBackward1_weights\", \"bn5b_branch2c_mean\": \"BatchNormBackward361_bn_mean\", \"bn4a_branch2c_scale\": \"BatchNormBackward93_bn_scale\", 
\"bn3d_branch2a_offset\": \"BatchNormBackward76_bn_offset\", \"res5b_branch2a_weights\": \"ConvNdBackward354_weights\", \"bn5a_branch2b_variance\": \"BatchNormBackward345_bn_variance\", \"res3a_branch2a_weights\": \"ConvNdBackward40_weights\", \"bn4b14_branch2c_scale\": \"BatchNormBackward249_bn_scale\", \"bn2a_branch2b_variance\": \"BatchNormBackward9_bn_variance\", \"bn4b8_branch2a_mean\": \"BatchNormBackward177_bn_mean\", \"bn4b5_branch2b_variance\": \"BatchNormBackward147_bn_variance\", \"bn4b7_branch2c_scale\": \"BatchNormBackward172_bn_scale\", \"bn4b14_branch2a_scale\": \"BatchNormBackward243_bn_scale\", \"bn3c_branch2a_variance\": \"BatchNormBackward65_bn_variance\", \"res4b13_branch2c_weights\": \"ConvNdBackward237_weights\", \"bn4b3_branch2a_mean\": \"BatchNormBackward122_bn_mean\", \"bn3c_branch2a_offset\": \"BatchNormBackward65_bn_offset\", \"res4b12_branch2b_weights\": \"ConvNdBackward223_weights\", \"res4b10_branch2b_weights\": \"ConvNdBackward201_weights\", \"bn4b15_branch2c_variance\": \"BatchNormBackward260_bn_variance\", \"res4b10_branch2c_weights\": \"ConvNdBackward204_weights\", \"bn4b8_branch2b_scale\": \"BatchNormBackward180_bn_scale\", \"bn4a_branch2a_offset\": \"BatchNormBackward87_bn_offset\", \"bn4b21_branch2c_variance\": \"BatchNormBackward326_bn_variance\", \"res2b_branch2b_weights\": \"ConvNdBackward21_weights\", \"res4b8_branch2b_weights\": \"ConvNdBackward179_weights\", \"bn4b16_branch2a_offset\": \"BatchNormBackward265_bn_offset\", \"bn4b2_branch2a_variance\": \"BatchNormBackward111_bn_variance\", \"bn3b_branch2c_scale\": \"BatchNormBackward60_bn_scale\", \"bn4b7_branch2c_mean\": \"BatchNormBackward172_bn_mean\", \"res4b5_branch2a_weights\": \"ConvNdBackward143_weights\", \"bn5c_branch2c_mean\": \"BatchNormBackward372_bn_mean\", \"bn2a_branch2a_offset\": \"BatchNormBackward6_bn_offset\", \"bn5b_branch2a_mean\": \"BatchNormBackward355_bn_mean\", \"bn4b18_branch2a_offset\": \"BatchNormBackward287_bn_offset\", \"bn4b17_branch2c_scale\": 
\"BatchNormBackward282_bn_scale\", \"bn4b16_branch2c_offset\": \"BatchNormBackward271_bn_offset\", \"res2a_branch2c_weights\": \"ConvNdBackward11_weights\", \"res4b5_branch2b_weights\": \"ConvNdBackward146_weights\", \"bn4b14_branch2c_offset\": \"BatchNormBackward249_bn_offset\", \"res4b13_branch2b_weights\": \"ConvNdBackward234_weights\", \"bn3d_branch2b_mean\": \"BatchNormBackward79_bn_mean\", \"bn4a_branch2a_mean\": \"BatchNormBackward87_bn_mean\", \"bn4b1_branch2a_mean\": \"BatchNormBackward100_bn_mean\", \"bn4b8_branch2b_variance\": \"BatchNormBackward180_bn_variance\", \"bn4b22_branch2c_variance\": \"BatchNormBackward337_bn_variance\", \"bn4b3_branch2b_scale\": \"BatchNormBackward125_bn_scale\", \"bn4b22_branch2c_mean\": \"BatchNormBackward337_bn_mean\", \"bn4b9_branch2b_mean\": \"BatchNormBackward191_bn_mean\", \"bn5a_branch2a_scale\": \"BatchNormBackward342_bn_scale\", \"bn4a_branch2c_mean\": \"BatchNormBackward93_bn_mean\", \"bn3a_branch2b_mean\": \"BatchNormBackward44_bn_mean\", \"bn4b20_branch2b_mean\": \"BatchNormBackward312_bn_mean\", \"bn4b11_branch2c_variance\": \"BatchNormBackward216_bn_variance\", \"bn3c_branch2b_offset\": \"BatchNormBackward68_bn_offset\", \"bn5a_branch2a_mean\": \"BatchNormBackward342_bn_mean\", \"bn4a_branch1_offset\": \"BatchNormBackward96_bn_offset\", \"bn4b7_branch2b_offset\": \"BatchNormBackward169_bn_offset\", \"bn4b13_branch2c_offset\": \"BatchNormBackward238_bn_offset\", \"bn_conv1_offset\": \"BatchNormBackward2_bn_offset\", \"bn4b14_branch2a_offset\": \"BatchNormBackward243_bn_offset\", \"bn4b14_branch2b_variance\": \"BatchNormBackward246_bn_variance\", \"bn2c_branch2a_scale\": \"BatchNormBackward30_bn_scale\", \"bn2a_branch2a_scale\": \"BatchNormBackward6_bn_scale\", \"bn4b3_branch2c_offset\": \"BatchNormBackward128_bn_offset\", \"res3b_branch2a_weights\": \"ConvNdBackward53_weights\", \"bn4b6_branch2b_variance\": \"BatchNormBackward158_bn_variance\", \"bn4b6_branch2b_offset\": \"BatchNormBackward158_bn_offset\", 
\"bn3a_branch2b_scale\": \"BatchNormBackward44_bn_scale\", \"bn4b4_branch2b_mean\": \"BatchNormBackward136_bn_mean\", \"bn4b11_branch2a_variance\": \"BatchNormBackward210_bn_variance\", \"bn4b4_branch2a_offset\": \"BatchNormBackward133_bn_offset\", \"bn4b6_branch2a_variance\": \"BatchNormBackward155_bn_variance\", \"res4b22_branch2a_weights\": \"ConvNdBackward330_weights\", \"bn4b19_branch2b_variance\": \"BatchNormBackward301_bn_variance\", \"bn2b_branch2a_mean\": \"BatchNormBackward19_bn_mean\", \"bn4b11_branch2a_scale\": \"BatchNormBackward210_bn_scale\", \"bn4b3_branch2b_variance\": \"BatchNormBackward125_bn_variance\", \"res4a_branch2b_weights\": \"ConvNdBackward89_weights\", \"bn4a_branch2c_offset\": \"BatchNormBackward93_bn_offset\", \"bn4b13_branch2b_scale\": \"BatchNormBackward235_bn_scale\", \"bn4a_branch1_scale\": \"BatchNormBackward96_bn_scale\", \"bn5b_branch2b_scale\": \"BatchNormBackward358_bn_scale\", \"bn4b21_branch2b_variance\": \"BatchNormBackward323_bn_variance\", \"bn4b16_branch2a_variance\": \"BatchNormBackward265_bn_variance\", \"bn5b_branch2b_mean\": \"BatchNormBackward358_bn_mean\", \"bn4b22_branch2a_variance\": \"BatchNormBackward331_bn_variance\", \"bn4b20_branch2a_mean\": \"BatchNormBackward309_bn_mean\", \"bn4b12_branch2a_scale\": \"BatchNormBackward221_bn_scale\", \"bn4b15_branch2a_scale\": \"BatchNormBackward254_bn_scale\", \"bn4b16_branch2a_scale\": \"BatchNormBackward265_bn_scale\", \"res4b14_branch2a_weights\": \"ConvNdBackward242_weights\", \"bn4b15_branch2a_offset\": \"BatchNormBackward254_bn_offset\", \"bn4b13_branch2c_mean\": \"BatchNormBackward238_bn_mean\", \"bn5b_branch2c_scale\": \"BatchNormBackward361_bn_scale\", \"bn5a_branch2c_offset\": \"BatchNormBackward348_bn_offset\", \"bn4b9_branch2b_offset\": \"BatchNormBackward191_bn_offset\", \"bn4b14_branch2c_variance\": \"BatchNormBackward249_bn_variance\", \"bn4b17_branch2b_scale\": \"BatchNormBackward279_bn_scale\", \"res4b8_branch2a_weights\": \"ConvNdBackward176_weights\", 
\"bn4b15_branch2b_offset\": \"BatchNormBackward257_bn_offset\", \"bn4b11_branch2b_offset\": \"BatchNormBackward213_bn_offset\", \"res5b_branch2c_weights\": \"ConvNdBackward360_weights\", \"bn4b16_branch2b_offset\": \"BatchNormBackward268_bn_offset\", \"bn4a_branch1_mean\": \"BatchNormBackward96_bn_mean\", \"bn4b18_branch2a_variance\": \"BatchNormBackward287_bn_variance\", \"bn2b_branch2b_scale\": \"BatchNormBackward22_bn_scale\", \"bn4b16_branch2b_scale\": \"BatchNormBackward268_bn_scale\", \"bn5a_branch2a_variance\": \"BatchNormBackward342_bn_variance\", \"bn4b18_branch2c_offset\": \"BatchNormBackward293_bn_offset\", \"bn4b6_branch2a_scale\": \"BatchNormBackward155_bn_scale\", \"res5a_branch1_weights\": \"ConvNdBackward350_weights\", \"bn2a_branch1_scale\": \"BatchNormBackward15_bn_scale\", \"bn4b1_branch2b_mean\": \"BatchNormBackward103_bn_mean\", \"res4b4_branch2b_weights\": \"ConvNdBackward135_weights\", \"res4b4_branch2c_weights\": \"ConvNdBackward138_weights\", \"res3a_branch2c_weights\": \"ConvNdBackward46_weights\", \"bn3a_branch1_scale\": \"BatchNormBackward50_bn_scale\", \"bn4b1_branch2c_variance\": \"BatchNormBackward106_bn_variance\", \"bn4b18_branch2c_variance\": \"BatchNormBackward293_bn_variance\", \"bn4b17_branch2a_scale\": \"BatchNormBackward276_bn_scale\", \"bn4b1_branch2b_offset\": \"BatchNormBackward103_bn_offset\", \"bn4b22_branch2b_offset\": \"BatchNormBackward334_bn_offset\", \"bn4b9_branch2b_scale\": \"BatchNormBackward191_bn_scale\", \"bn4b2_branch2c_variance\": \"BatchNormBackward117_bn_variance\", \"res4b17_branch2c_weights\": \"ConvNdBackward281_weights\", \"bn5c_branch2a_variance\": \"BatchNormBackward366_bn_variance\", \"bn4b11_branch2c_mean\": \"BatchNormBackward216_bn_mean\", \"bn2b_branch2c_scale\": \"BatchNormBackward25_bn_scale\", \"bn4a_branch2b_mean\": \"BatchNormBackward90_bn_mean\", \"bn4b7_branch2b_variance\": \"BatchNormBackward169_bn_variance\", \"bn4a_branch2a_scale\": \"BatchNormBackward87_bn_scale\", 
\"bn4b1_branch2a_variance\": \"BatchNormBackward100_bn_variance\", \"bn2a_branch2c_variance\": \"BatchNormBackward12_bn_variance\", \"bn4b3_branch2a_scale\": \"BatchNormBackward122_bn_scale\", \"bn4b13_branch2a_offset\": \"BatchNormBackward232_bn_offset\", \"res4a_branch2c_weights\": \"ConvNdBackward92_weights\", \"res3c_branch2c_weights\": \"ConvNdBackward70_weights\", \"bn4b11_branch2b_mean\": \"BatchNormBackward213_bn_mean\", \"res4b8_branch2c_weights\": \"ConvNdBackward182_weights\", \"bn2c_branch2c_scale\": \"BatchNormBackward36_bn_scale\", \"bn4b6_branch2b_mean\": \"BatchNormBackward158_bn_mean\", \"bn4b9_branch2c_variance\": \"BatchNormBackward194_bn_variance\", \"bn_conv1_mean\": \"BatchNormBackward2_bn_mean\", \"bn4b7_branch2a_variance\": \"BatchNormBackward166_bn_variance\", \"bn4b4_branch2c_offset\": \"BatchNormBackward139_bn_offset\", \"bn3c_branch2b_scale\": \"BatchNormBackward68_bn_scale\", \"bn4b20_branch2a_variance\": \"BatchNormBackward309_bn_variance\", \"bn2b_branch2b_variance\": \"BatchNormBackward22_bn_variance\", \"bn4b17_branch2b_offset\": \"BatchNormBackward279_bn_offset\", \"bn4b11_branch2c_scale\": \"BatchNormBackward216_bn_scale\", \"res5b_branch2b_weights\": \"ConvNdBackward357_weights\", \"bn4b8_branch2c_mean\": \"BatchNormBackward183_bn_mean\", \"bn2a_branch1_mean\": \"BatchNormBackward15_bn_mean\", \"bn4b20_branch2c_mean\": \"BatchNormBackward315_bn_mean\", \"bn4b11_branch2a_mean\": \"BatchNormBackward210_bn_mean\", \"bn4b21_branch2a_scale\": \"BatchNormBackward320_bn_scale\", \"bn4b7_branch2c_variance\": \"BatchNormBackward172_bn_variance\", \"bn4b2_branch2c_offset\": \"BatchNormBackward117_bn_offset\", \"bn4b12_branch2c_mean\": \"BatchNormBackward227_bn_mean\", \"bn4b17_branch2a_offset\": \"BatchNormBackward276_bn_offset\", \"bn4b2_branch2b_offset\": \"BatchNormBackward114_bn_offset\", \"bn4b22_branch2b_mean\": \"BatchNormBackward334_bn_mean\", \"res4b6_branch2c_weights\": \"ConvNdBackward160_weights\", \"bn4b1_branch2c_offset\": 
\"BatchNormBackward106_bn_offset\", \"bn4b12_branch2c_variance\": \"BatchNormBackward227_bn_variance\", \"bn4b13_branch2c_variance\": \"BatchNormBackward238_bn_variance\", \"bn3a_branch2b_offset\": \"BatchNormBackward44_bn_offset\", \"bn4b6_branch2a_mean\": \"BatchNormBackward155_bn_mean\"}\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/tsn.py",
    "content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport numpy as np\n\nimport paddle.fluid as fluid\nfrom paddle.fluid import ParamAttr\n\nfrom ..model import ModelBase\nfrom .tsn_res_model import TSN_ResNet\n\nimport logging\nlogger = logging.getLogger(__name__)\n\n__all__ = [\"TSN\"]\n\n\nclass TSN(ModelBase):\n    def __init__(self, name, cfg, mode='train'):\n        super(TSN, self).__init__(name, cfg, mode=mode)\n        self.get_config()\n\n    def get_config(self):\n        self.num_classes = self.get_config_from_sec('model', 'num_classes')\n        self.seg_num = self.get_config_from_sec('model', 'seg_num')\n        self.seglen = self.get_config_from_sec('model', 'seglen')\n        self.image_mean = self.get_config_from_sec('model', 'image_mean')\n        self.image_std = self.get_config_from_sec('model', 'image_std')\n        self.num_layers = self.get_config_from_sec('model', 'num_layers')\n\n        self.num_epochs = self.get_config_from_sec('train', 'epoch')\n        self.total_videos = self.get_config_from_sec('train', 'total_videos')\n        self.base_learning_rate = self.get_config_from_sec('train', 'learning_rate')\n        self.learning_rate_decay = self.get_config_from_sec('train', 'learning_rate_decay')\n        self.l2_weight_decay = self.get_config_from_sec('train', 'l2_weight_decay')\n        self.momentum = self.get_config_from_sec('train', 'momentum')\n\n        
self.seg_num = self.get_config_from_sec(self.mode, 'seg_num', self.seg_num)\n        self.target_size = self.get_config_from_sec(self.mode, 'target_size')\n        self.batch_size = self.get_config_from_sec(self.mode, 'batch_size')\n\n    def build_input(self, use_dataloader=True):\n        image_shape = [3, self.target_size, self.target_size]\n        image_shape[0] = image_shape[0] * self.seglen\n        image_shape = [None, self.seg_num] + image_shape\n        self.use_dataloader = use_dataloader\n\n        image = fluid.data(name='image', shape=image_shape, dtype='float32')\n        if self.mode != 'infer':\n            label = fluid.data(name='label', shape=[None, 1], dtype='int64')\n        else:\n            label = None\n\n        if use_dataloader:\n            assert self.mode != 'infer', \\\n                        'dataloader is not recommended in infer mode, please set use_dataloader to False.'\n            self.dataloader = fluid.io.DataLoader.from_generator(feed_list=[image, label], capacity=4, iterable=True)\n\n        self.feature_input = [image]\n        self.label_input = label\n\n    def create_model_args(self):\n        cfg = {}\n        cfg['layers'] = self.num_layers\n        cfg['class_dim'] = self.num_classes\n        cfg['seg_num'] = self.seg_num\n        return cfg\n\n    def build_model(self):\n        cfg = self.create_model_args()\n        videomodel = TSN_ResNet(layers=cfg['layers'], seg_num=cfg['seg_num'], is_training=(self.mode == 'train'))\n        out = videomodel.net(input=self.feature_input[0], class_dim=cfg['class_dim'])\n        self.feature_output = out\n        self.network_outputs = [out]  # referenced by loss(), outputs() and fetches()\n\n    def optimizer(self):\n        assert self.mode == 'train', \"optimizer can only be built in train mode\"\n        epoch_points = [self.num_epochs / 3, self.num_epochs * 2 / 3]\n        total_videos = self.total_videos\n        step = int(total_videos / self.batch_size + 1)\n        bd = [e * step for e in epoch_points]\n        base_lr = self.base_learning_rate\n        lr_decay = self.learning_rate_decay\n        lr = [base_lr, base_lr * lr_decay, base_lr * lr_decay * lr_decay]\n        l2_weight_decay = self.l2_weight_decay\n        momentum = self.momentum\n        optimizer = fluid.optimizer.Momentum(\n            learning_rate=fluid.layers.piecewise_decay(boundaries=bd, values=lr),\n            momentum=momentum,\n            regularization=fluid.regularizer.L2Decay(l2_weight_decay))\n\n        return optimizer\n\n    def loss(self):\n        assert self.mode != 'infer', \"loss cannot be calculated in infer mode\"\n        cost = fluid.layers.cross_entropy(input=self.network_outputs[0], \\\n                           label=self.label_input, ignore_index=-1)\n        self.loss_ = fluid.layers.mean(x=cost)\n        return self.loss_\n\n    def outputs(self):\n        return self.network_outputs\n\n    def feeds(self):\n        return self.feature_input  #if self.mode == 'infer' else self.feature_input + [\n#            self.label_input\n#        ]\n\n    def fetches(self):\n        if self.mode == 'train' or self.mode == 'valid':\n            losses = self.loss()\n            fetch_list = [losses, self.network_outputs[0], self.label_input]\n        elif self.mode == 'test':\n            #losses = self.loss()\n            fetch_list = [self.feature_output, self.label_input]\n        elif self.mode == 'infer':\n            fetch_list = [self.feature_output]\n        else:\n            raise NotImplementedError('mode {} not implemented'.format(self.mode))\n\n        return fetch_list\n\n    def pretrain_info(self):\n        return ('ResNet50_pretrained',\n                'https://paddlemodels.bj.bcebos.com/video_classification/ResNet50_pretrained.tar.gz')\n\n    def weights_info(self):\n        return ('TSN.pdparams', 'https://paddlemodels.bj.bcebos.com/video_classification/TSN.pdparams')\n\n    def load_pretrain_params(self, exe, pretrain, prog, place):\n        def is_parameter(var):\n            return isinstance(var, fluid.framework.Parameter)\n\n        params_list = list(filter(is_parameter, prog.list_vars()))\n        for param in params_list:\n            logger.debug(param.name)\n\n        logger.info(\"Loading pretrained weights from {}, excluding the fc layer.\".format(pretrain))\n\n        state_dict = fluid.load_program_state(pretrain)\n        dict_keys = list(state_dict.keys())\n        for name in dict_keys:\n            if \"fc_0\" in name:\n                del state_dict[name]\n                logger.info('Deleted {} from the pretrained parameters; it will not be loaded.'.format(name))\n        fluid.set_program_state(prog, state_dict)\n\n\n#    def load_test_weights(self, exe, weights, prog):\n#        def is_parameter(var):\n#            return isinstance(var, fluid.framework.Parameter)\n#        params_list = list(filter(is_parameter, prog.list_vars()))\n\n#        state_dict = np.load(weights)\n#        for p in params_list:\n#            if p.name in state_dict.keys():\n#                print('########### load param {} from file'.format(p.name))\n#            else:\n#                print('----------- param {} not in file'.format(p.name))\n#        fluid.set_program_state(prog, state_dict)\n#        fluid.save(prog, './model_weight/tsn')\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/models/tsn/tsn_res_model.py",
"content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport paddle.fluid as fluid\n\n\nclass TSN_ResNet():\n    def __init__(self, layers=50, seg_num=7, is_training=True):\n        # NOTE: hard-coded to ResNet-101 for videotag; the `layers` argument is ignored\n        self.layers = 101\n        self.seg_num = seg_num\n        self.is_training = is_training\n\n    def conv_bn_layer(self, input, num_filters, filter_size, stride=1, groups=1, act=None, name=None):\n        conv = fluid.layers.conv2d(\n            input=input,\n            num_filters=num_filters,\n            filter_size=filter_size,\n            stride=stride,\n            padding=(filter_size - 1) // 2,\n            groups=groups,\n            act=None,\n            param_attr=fluid.param_attr.ParamAttr(name=name + \"_weights\"),\n            bias_attr=False)\n        if name == \"conv1\":\n            bn_name = \"bn_\" + name\n        else:\n            bn_name = \"bn\" + name[3:]\n\n        return fluid.layers.batch_norm(\n            input=conv,\n            act=act,\n            is_test=(not self.is_training),\n            param_attr=fluid.param_attr.ParamAttr(name=bn_name + \"_scale\"),\n            bias_attr=fluid.param_attr.ParamAttr(name=bn_name + '_offset'),\n            moving_mean_name=bn_name + \"_mean\",\n            moving_variance_name=bn_name + '_variance')\n\n    def shortcut(self, input, ch_out, stride, name):\n        ch_in = input.shape[1]\n        if ch_in != ch_out or stride != 1:\n            return self.conv_bn_layer(input, ch_out, 1, stride, name=name)\n        else:\n            return input\n\n    def bottleneck_block(self, input, num_filters, stride, name):\n        conv0 = self.conv_bn_layer(\n            input=input, num_filters=num_filters, filter_size=1, act='relu', name=name + \"_branch2a\")\n        conv1 = self.conv_bn_layer(\n            input=conv0, num_filters=num_filters, filter_size=3, stride=stride, act='relu', name=name + \"_branch2b\")\n        conv2 = self.conv_bn_layer(\n            input=conv1, num_filters=num_filters * 4, filter_size=1, act=None, name=name + \"_branch2c\")\n\n        short = self.shortcut(input, num_filters * 4, stride, name=name + \"_branch1\")\n\n        return fluid.layers.elementwise_add(x=short, y=conv2, act='relu')\n\n    def net(self, input, class_dim=101):\n        layers = self.layers\n        seg_num = self.seg_num\n        supported_layers = [50, 101, 152]\n        assert layers in supported_layers, \\\n            \"supported layers are {} but the input layer is {}\".format(supported_layers, layers)\n\n        # reshape input\n        channels = input.shape[2]\n        short_size = input.shape[3]\n        input = fluid.layers.reshape(x=input, shape=[-1, channels, short_size, short_size])\n\n        if layers == 50:\n            depth = [3, 4, 6, 3]\n        elif layers == 101:\n            depth = [3, 4, 23, 3]\n        elif layers == 152:\n            depth = [3, 8, 36, 3]\n        num_filters = [64, 128, 256, 512]\n\n        conv = self.conv_bn_layer(input=input, num_filters=64, filter_size=7, stride=2, act='relu', name='conv1')\n        conv = fluid.layers.pool2d(input=conv, pool_size=3, pool_stride=2, pool_padding=1, pool_type='max')\n\n        for block in range(len(depth)):\n            for i in range(depth[block]):\n                if layers in [101, 152] and block == 2:\n                    if i == 0:\n                        conv_name = \"res\" + str(block + 2) + \"a\"\n                    else:\n                        conv_name = \"res\" + str(block + 2) + \"b\" + str(i)\n                else:\n                    conv_name = \"res\" + str(block + 2) + chr(97 + i)\n\n                conv = self.bottleneck_block(\n                    input=conv,\n                    num_filters=num_filters[block],\n                    stride=2 if i == 0 and block != 0 else 1,\n                    name=conv_name)\n\n        pool = fluid.layers.pool2d(input=conv, pool_size=7, pool_type='avg', global_pooling=True)\n\n        feature = fluid.layers.reshape(x=pool, shape=[-1, seg_num, pool.shape[1]])\n        return feature\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/reader/__init__.py",
"content": "from .reader_utils import regist_reader, get_reader\nfrom .kinetics_reader import KineticsReader\n\n# register reader, sorted alphabetically\nregist_reader(\"TSN\", KineticsReader)\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/reader/kinetics_reader.py",
"content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\nimport sys\n\nimport random\nimport functools\nimport logging\ntry:\n    import cPickle as pickle\n    from cStringIO import StringIO\nexcept ImportError:\n    import pickle\n    from io import BytesIO\n\nimport paddle\nimport cv2\nimport numpy as np\nfrom PIL import Image\n\nfrom .reader_utils import DataReader\n\nlogger = logging.getLogger(__name__)\npython_ver = sys.version_info\n\n\nclass KineticsReader(DataReader):\n    \"\"\"\n    Data reader for the Kinetics dataset in two formats: mp4 and pkl.\n    1. mp4, the original format of kinetics400\n    2. pkl, the mp4 files were decoded beforehand and stored as pkl\n    In both cases, the reader loads the data and returns the frame data as numpy arrays and the label as an integer.\n     dataset cfg: format\n                  num_classes\n                  seg_num\n                  short_size\n                  target_size\n                  num_reader_threads\n                  buf_size\n                  image_mean\n                  image_std\n                  batch_size\n                  list\n    \"\"\"\n\n    def __init__(self, name, mode, cfg):\n        super(KineticsReader, self).__init__(name, mode, cfg)\n        self.format = cfg.MODEL.format\n        self.num_classes = self.get_config_from_sec('model', 'num_classes')\n        self.seg_num = self.get_config_from_sec('model', 'seg_num')\n        self.seglen = self.get_config_from_sec('model', 'seglen')\n\n        self.seg_num = self.get_config_from_sec(mode, 'seg_num', self.seg_num)\n        self.short_size = self.get_config_from_sec(mode, 'short_size')\n        self.target_size = self.get_config_from_sec(mode, 'target_size')\n        self.num_reader_threads = self.get_config_from_sec(mode, 'num_reader_threads')\n        self.buf_size = self.get_config_from_sec(mode, 'buf_size')\n\n        self.img_mean = np.array(cfg.MODEL.image_mean).reshape([3, 1, 1]).astype(np.float32)\n        self.img_std = np.array(cfg.MODEL.image_std).reshape([3, 1, 1]).astype(np.float32)\n        # set batch size and file list\n        self.batch_size = cfg[mode.upper()]['batch_size']\n        self.filelist = cfg[mode.upper()]['filelist']\n\n    def create_reader(self):\n        _reader = self._reader_creator(self.filelist, self.mode, seg_num=self.seg_num, seglen = self.seglen, \\\n                         short_size = self.short_size, target_size = self.target_size, \\\n                         img_mean = self.img_mean, img_std = self.img_std, \\\n                         shuffle = (self.mode == 'train'), \\\n                         num_threads 
= self.num_reader_threads, \\\n                         buf_size = self.buf_size, format = self.format)\n\n        def _batch_reader():\n            batch_out = []\n            for imgs, label in _reader():\n                if imgs is None:\n                    continue\n                batch_out.append((imgs, label))\n                if len(batch_out) == self.batch_size:\n                    yield batch_out\n                    batch_out = []\n\n        return _batch_reader\n\n    def _reader_creator(self,\n                        pickle_list,\n                        mode,\n                        seg_num,\n                        seglen,\n                        short_size,\n                        target_size,\n                        img_mean,\n                        img_std,\n                        shuffle=False,\n                        num_threads=1,\n                        buf_size=1024,\n                        format='pkl'):\n        def decode_mp4(sample, mode, seg_num, seglen, short_size, target_size, img_mean, img_std):\n            sample = sample[0].split(' ')\n            mp4_path = sample[0]\n            try:\n                imgs = mp4_loader(mp4_path, seg_num, seglen, mode)\n                if len(imgs) < 1:\n                    logger.error('{} frame length {} less than 1.'.format(mp4_path, len(imgs)))\n                    return None, None\n            except Exception:\n                logger.error('Error when loading {}'.format(mp4_path))\n                return None, None\n\n            return imgs_transform(imgs, mode, seg_num, seglen, \\\n                         short_size, target_size, img_mean, img_std, name = self.name), mp4_path\n\n        def reader():\n            lines = [line.strip() for line in pickle_list]\n            if shuffle:\n                random.shuffle(lines)\n            for line in lines:\n                pickle_path = line.strip()\n                yield [pickle_path]\n\n        mapper = functools.partial(\n            
decode_mp4,\n            mode=mode,\n            seg_num=seg_num,\n            seglen=seglen,\n            short_size=short_size,\n            target_size=target_size,\n            img_mean=img_mean,\n            img_std=img_std)\n\n        return paddle.reader.xmap_readers(mapper, reader, num_threads, buf_size)\n\n\ndef imgs_transform(imgs, mode, seg_num, seglen, short_size, target_size, img_mean, img_std, name=''):\n    imgs = group_scale(imgs, short_size)\n\n    np_imgs = np.array([np.array(img).astype('float32') for img in imgs])  #dhwc\n    np_imgs = group_center_crop(np_imgs, target_size)\n    np_imgs = np_imgs.transpose(0, 3, 1, 2) / 255  #dchw\n    np_imgs -= img_mean\n    np_imgs /= img_std\n\n    return np_imgs\n\n\ndef group_center_crop(np_imgs, target_size):\n    d, h, w, c = np_imgs.shape\n    th, tw = target_size, target_size\n    assert (w >= target_size) and (h >= target_size), \\\n         \"image width ({}) and height ({}) should be no smaller than the crop size ({})\".format(w, h, target_size)\n\n    h_off = int(round((h - th) / 2.))\n    w_off = int(round((w - tw) / 2.))\n\n    img_crop = np_imgs[:, h_off:h_off + target_size, w_off:w_off + target_size, :]\n    return img_crop\n\n\ndef group_scale(imgs, target_size):\n    resized_imgs = []\n    for i in range(len(imgs)):\n        img = imgs[i]\n        w, h = img.size\n        if (w <= h and w == target_size) or (h <= w and h == target_size):\n            resized_imgs.append(img)\n            continue\n\n        if w < h:\n            ow = target_size\n            oh = int(target_size * 4.0 / 3.0)\n            resized_imgs.append(img.resize((ow, oh), Image.BILINEAR))\n        else:\n            oh = target_size\n            ow = int(target_size * 4.0 / 3.0)\n            resized_imgs.append(img.resize((ow, oh), Image.BILINEAR))\n\n    return resized_imgs\n\n\ndef mp4_loader(filepath, nsample, seglen, mode):\n    cap = cv2.VideoCapture(filepath)\n    videolen = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))\n    sampledFrames = []\n    for i in range(videolen):\n        ret, frame = cap.read()\n        # some frames (often the first) may fail to decode\n        if not ret:\n            continue\n        img = frame[:, :, ::-1]\n        sampledFrames.append(img)\n    # release the capture handle once all frames have been read\n    cap.release()\n    if len(sampledFrames) < 1:\n        # no decodable frames; let the caller handle the empty result\n        return []\n    average_dur = len(sampledFrames) // nsample\n    imgs = []\n    for i in range(nsample):\n        idx = 0\n        if average_dur >= seglen:\n            idx = (average_dur - 1) // 2\n            idx += i * average_dur\n        elif average_dur >= 1:\n            idx += i * average_dur\n        else:\n            idx = i\n\n        for jj in range(idx, idx + seglen):\n            imgbuf = sampledFrames[int(jj % len(sampledFrames))]\n            img = Image.fromarray(imgbuf, mode='RGB')\n            imgs.append(img)\n    return imgs\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/reader/reader_utils.py",
"content": "#  Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\n\nclass ReaderNotFoundError(Exception):\n    \"Error: reader not found\"\n\n    def __init__(self, reader_name, avail_readers):\n        super(ReaderNotFoundError, self).__init__()\n        self.reader_name = reader_name\n        self.avail_readers = avail_readers\n\n    def __str__(self):\n        msg = \"Reader {} Not Found.\\nAvailable readers:\\n\".format(self.reader_name)\n        for reader in self.avail_readers:\n            msg += \"  {}\\n\".format(reader)\n        return msg\n\n\nclass DataReader(object):\n    \"\"\"data reader for video input\"\"\"\n\n    def __init__(self, model_name, mode, cfg):\n        self.name = model_name\n        self.mode = mode\n        self.cfg = cfg\n\n    def create_reader(self):\n        \"\"\"Not implemented; subclasses should override this method\"\"\"\n        pass\n\n    def get_config_from_sec(self, sec, item, default=None):\n        if sec.upper() not in self.cfg:\n            return default\n        return self.cfg[sec.upper()].get(item, default)\n\n\nclass ReaderZoo(object):\n    def __init__(self):\n        self.reader_zoo = {}\n\n    def regist(self, name, reader):\n        assert reader.__base__ == DataReader, \"Unknown reader type {}\".format(reader)\n        self.reader_zoo[name] = reader\n\n    def get(self, name, mode, cfg):\n        for k, v in self.reader_zoo.items():\n            if k == name:\n                return v(name, mode, cfg)\n        raise ReaderNotFoundError(name, self.reader_zoo.keys())\n\n\n# singleton reader_zoo\nreader_zoo = ReaderZoo()\n\n\ndef regist_reader(name, reader):\n    reader_zoo.regist(name, reader)\n\n\ndef get_reader(name, mode, cfg):\n    reader_model = reader_zoo.get(name, mode, cfg)\n    return reader_model.create_reader()\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/utils/__init__.py",
    "content": ""
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/utils/config_utils.py",
"content": "#  Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport logging\n\nfrom .utility import AttrDict\n\nlogger = logging.getLogger(__name__)\n\nCONFIG_SECS = [\n    'train',\n    'valid',\n    'test',\n    'infer',\n]\n\n\ndef parse_config(cfg_file):\n    \"\"\"Load a config file into AttrDict\"\"\"\n    import yaml\n    with open(cfg_file, 'r') as fopen:\n        yaml_config = AttrDict(yaml.load(fopen, Loader=yaml.Loader))\n    create_attr_dict(yaml_config)\n    return yaml_config\n\n\ndef create_attr_dict(yaml_config):\n    from ast import literal_eval\n    for key, value in yaml_config.items():\n        if type(value) is dict:\n            yaml_config[key] = value = AttrDict(value)\n        if isinstance(value, str):\n            try:\n                value = literal_eval(value)\n            except BaseException:\n                pass\n        if isinstance(value, AttrDict):\n            create_attr_dict(yaml_config[key])\n        else:\n            yaml_config[key] = value\n    return\n\n\ndef merge_configs(cfg, sec, args_dict):\n    assert sec in CONFIG_SECS, \"invalid config section {}\".format(sec)\n    sec_dict = getattr(cfg, sec.upper())\n    for k, v in args_dict.items():\n        if v is None:\n            continue\n        try:\n            if hasattr(sec_dict, k):\n                setattr(sec_dict, k, v)\n        except Exception:\n            pass\n    return cfg\n\n\ndef print_configs(cfg, mode):\n    logger.info(\"---------------- {:>5} Arguments ----------------\".format(mode))\n    for sec, sec_items in cfg.items():\n        logger.info(\"{}:\".format(sec))\n        for k, v in sec_items.items():\n            logger.info(\"    {}:{}\".format(k, v))\n    logger.info(\"-------------------------------------------------\")\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/utils/train_utils.py",
    "content": "#   Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport sys\nimport logging\n\nimport time\nimport numpy as np\nimport paddle.fluid as fluid\nfrom paddle.fluid import profiler\n\nlogger = logging.getLogger(__name__)\n\n\ndef log_lr_and_step():\n    try:\n        # In optimizers, if learning_rate is set as constant, lr_var\n        # name is 'learning_rate_0', and iteration counter is not\n        # recorded. 
If learning_rate is set as decayed values from\n        # learning_rate_scheduler, lr_var name is 'learning_rate',\n        # and iteration counter is recorded with name '@LR_DECAY_COUNTER@';\n        # a better implementation is required here\n        lr_var = fluid.global_scope().find_var(\"learning_rate\")\n        if not lr_var:\n            lr_var = fluid.global_scope().find_var(\"learning_rate_0\")\n        lr = np.array(lr_var.get_tensor())\n\n        lr_count = '[-]'\n        lr_count_var = fluid.global_scope().find_var(\"@LR_DECAY_COUNTER@\")\n        if lr_count_var:\n            lr_count = np.array(lr_count_var.get_tensor())\n        logger.info(\"------- learning rate {}, learning rate counter {} -----\".format(np.array(lr), np.array(lr_count)))\n    except Exception:\n        logger.warning(\"Unable to get learning_rate and LR_DECAY_COUNTER.\")\n\n\ndef test_with_dataloader(exe,\n                         compiled_test_prog,\n                         test_dataloader,\n                         test_fetch_list,\n                         test_metrics,\n                         log_interval=0,\n                         save_model_name=''):\n    if not test_dataloader:\n        logger.error(\"[TEST] get dataloader failed.\")\n    test_metrics.reset()\n    test_iter = 0\n\n    for data in test_dataloader():\n        test_outs = exe.run(compiled_test_prog, fetch_list=test_fetch_list, feed=data)\n        test_metrics.accumulate(test_outs)\n        if log_interval > 0 and test_iter % log_interval == 0:\n            test_metrics.calculate_and_log_out(test_outs, \\\n               info = '[TEST] test_iter {} '.format(test_iter))\n        test_iter += 1\n    test_metrics.finalize_and_log_out(\"[TEST] Finish\")\n\n\ndef train_with_dataloader(exe, train_prog, compiled_train_prog, train_dataloader, \\\n                        train_fetch_list, train_metrics, epochs = 10, \\\n                        log_interval = 0, valid_interval = 0, save_dir = './', \\\n                        
save_model_name = 'model', fix_random_seed = False, \\\n                        compiled_test_prog = None, test_dataloader = None, \\\n                        test_fetch_list = None, test_metrics = None, \\\n                        is_profiler = None, profiler_path = None):\n    if not train_dataloader:\n        logger.error(\"[TRAIN] get dataloader failed.\")\n    epoch_periods = []\n    train_loss = 0\n    for epoch in range(epochs):\n        log_lr_and_step()\n\n        train_iter = 0\n        epoch_periods = []\n\n        for data in train_dataloader():\n            cur_time = time.time()\n            train_outs = exe.run(compiled_train_prog, fetch_list=train_fetch_list, feed=data)\n            period = time.time() - cur_time\n            epoch_periods.append(period)\n            if log_interval > 0 and (train_iter % log_interval == 0):\n                train_metrics.calculate_and_log_out(train_outs, \\\n                        info = '[TRAIN] Epoch {}, iter {} '.format(epoch, train_iter))\n            train_iter += 1\n\n            # NOTE: profiler tools, used for benchmark\n            if is_profiler and epoch == 0 and train_iter == log_interval:\n                profiler.start_profiler(\"All\")\n            elif is_profiler and epoch == 0 and train_iter == log_interval + 5:\n                profiler.stop_profiler(\"total\", profiler_path)\n                return\n\n        if len(epoch_periods) < 1:\n            logger.error('No iteration was executed, please check the data reader')\n            sys.exit(1)\n\n        logger.info('[TRAIN] Epoch {} training finished, average time: {}'.format(epoch, np.mean(epoch_periods[1:])))\n        save_model(exe, train_prog, save_dir, save_model_name, \"_epoch{}\".format(epoch), save_type='.pdckpt')\n        save_model(exe, train_prog, save_dir, save_model_name, \"_epoch{}\".format(epoch), save_type='.pdparams')\n        if compiled_test_prog and valid_interval > 0 and (epoch + 1) % valid_interval == 0:\n            test_with_dataloader(exe, compiled_test_prog, test_dataloader, test_fetch_list, test_metrics, log_interval,\n                                 save_model_name)\n\n    save_model(exe, train_prog, save_dir, save_model_name, '_final', save_type='.pdckpt')\n    save_model(exe, train_prog, save_dir, save_model_name, '_final', save_type='.pdparams')\n    # when fixing the random seed for debugging\n    if fix_random_seed:\n        cards = os.environ.get('CUDA_VISIBLE_DEVICES')\n        gpu_num = len(cards.split(\",\"))\n        print(\"kpis\\ttrain_cost_card{}\\t{}\".format(gpu_num, train_loss))\n        print(\"kpis\\ttrain_speed_card{}\\t{}\".format(gpu_num, np.mean(epoch_periods)))\n\n\ndef save_model(exe, program, save_dir, model_name, postfix='', save_type='.pdckpt'):\n    \"\"\"\n    save_type: '.pdckpt' or '.pdparams', '.pdckpt' for all persistable variables,\n               '.pdparams' for parameters only\n    \"\"\"\n    if not os.path.isdir(save_dir):\n        os.makedirs(save_dir)\n    saved_model_name = model_name + postfix\n\n    fluid.save(program, os.path.join(save_dir, saved_model_name))\n\n    return\n"
  },
  {
    "path": "modules/video/classification/videotag_tsn_lstm/resource/utils/utility.py",
    "content": "#  Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserve.\n#\n#Licensed under the Apache License, Version 2.0 (the \"License\");\n#you may not use this file except in compliance with the License.\n#You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n#Unless required by applicable law or agreed to in writing, software\n#distributed under the License is distributed on an \"AS IS\" BASIS,\n#WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n#See the License for the specific language governing permissions and\n#limitations under the License.\n\nimport os\nimport sys\nimport signal\nimport logging\n\nimport paddle.fluid as fluid\n\n__all__ = ['AttrDict']\n\nlogger = logging.getLogger(__name__)\n\n\ndef _term(sig_num, addition):\n    print('current pid is %s, group id is %s' % (os.getpid(), os.getpgrp()))\n    os.killpg(os.getpgid(os.getpid()), signal.SIGKILL)\n\n\nsignal.signal(signal.SIGTERM, _term)\nsignal.signal(signal.SIGINT, _term)\n\n\nclass AttrDict(dict):\n    def __getattr__(self, key):\n        return self[key]\n\n    def __setattr__(self, key, value):\n        if key in self.__dict__:\n            self.__dict__[key] = value\n        else:\n            self[key] = value\n\ndef check_cuda(use_cuda, err = \\\n    \"\\nYou can not set use_gpu = True in the model because you are using paddlepaddle-cpu.\\n \\\n    Please: 1. Install paddlepaddle-gpu to run your models on GPU or 2. 
Set use_gpu = False to run models on CPU.\\n\"):\n    try:\n        if use_cuda and not fluid.is_compiled_with_cuda():\n            print(err)\n            sys.exit(1)\n    except Exception:\n        pass\n\n\ndef check_version():\n    \"\"\"\n    Log an error and exit when the installed version of paddlepaddle is\n    not satisfied.\n    \"\"\"\n    err = \"PaddlePaddle version 1.6 or higher is required, \" \\\n          \"or a suitable develop version. \\n\" \\\n          \"Please make sure your installed version matches this requirement.\"\n\n    try:\n        fluid.require_version('1.6.0')\n    except Exception:\n        logger.error(err)\n        sys.exit(1)\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/README.md",
    "content": "# fairmot_dla34\n\n|模型名称|fairmot_dla34|\n| :--- | :---: |\n|类别|视频 - 多目标追踪|\n|网络|CenterNet|\n|数据集|Caltech Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|\n|是否支持Fine-tuning|否|\n|模型大小|125MB|\n|最新更新日期|2021-08-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n  <p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/22424850/131989578-ec06e18f-e122-40b0-84d2-8772fd35391a.gif\"  hspace='10'/> <br />\n  </p>\n\n- ### 模型介绍\n\n  - FairMOT以Anchor Free的CenterNet检测器为基础，克服了Anchor-Based的检测框架中anchor和特征不对齐问题，深浅层特征融合使得检测和ReID任务各自获得所需要的特征，并且使用低维度ReID特征，提出了一种由两个同质分支组成的简单baseline来预测像素级目标得分和ReID特征，实现了两个任务之间的公平性，并获得了更高水平的实时多目标跟踪精度。\n\n  - 更多详情参考：[FairMOT: On the Fairness of Detection and Re-Identification in Multiple Object Tracking](https://arxiv.org/abs/2004.01888)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddledet >= 2.2.0\n\n  - opencv-python\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install fairmot_dla34\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n  - 在windows下安装，由于paddledet package会依赖cython-bbox以及pycocotools, 这两个包需要windows用户提前装好，可参考[cython-bbox安装](https://blog.csdn.net/qq_24739717/article/details/105588729)和[pycocotools安装](https://github.com/PaddlePaddle/PaddleX/blob/release/1.3/docs/install.md#pycocotools安装问题)\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a video file\n    $ hub run fairmot_dla34 --video_stream \"/PATH/TO/VIDEO\"\n    ```\n  - 通过命令行方式实现多目标追踪模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    tracker = hub.Module(name=\"fairmot_dla34\")\n    # Read from a video file\n    tracker.tracking('/PATH/TO/VIDEO', output_dir='mot_result', visualization=True,\n                        draw_threshold=0.5, use_gpu=False)\n 
   # or read from a image stream\n    # with tracker.stream_mode(output_dir='image_stream_output', visualization=True, draw_threshold=0.5, use_gpu=True):\n    #    tracker.predict([images])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def tracking(video_stream,\n                 output_dir='',\n                 visualization=True,\n                 draw_threshold=0.5,\n                 use_gpu=False)\n    ```\n    - 视频预测API，完成对视频内容的多目标追踪，并存储追踪结果。\n\n    - **参数**\n\n      - video_stream (str): 视频文件的路径; <br/>\n      - output_dir (str): 结果保存路径的根目录，默认为当前目录； <br/>\n      - visualization (bool): 是否保存追踪结果；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - draw\\_threshold (float): 预测置信度的阈值。\n\n  - ```python\n    def stream_mode(output_dir='',\n                    visualization=True,\n                    draw_threshold=0.5,\n                    use_gpu=False)\n    ```\n    - 进入图片流预测模式API，在该模式中完成对图片流的多目标追踪，并存储追踪结果。\n\n    - **参数**\n\n      - output_dir (str): 结果保存路径的根目录，默认为当前目录； <br/>\n      - visualization (bool): 是否保存追踪结果；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - draw\\_threshold (float): 预测置信度的阈值。\n\n  - ```python\n    def predict(images: list = [])\n    ```\n    - 对图片进行预测的API, 该接口必须在stream_mode API被调用后使用。\n\n    - **参数**\n\n      - images (list): 待预测的图片列表。\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除fluid api\n\n  - ```shell\n    $ hub install fairmot_dla34==1.1.0\n    ```\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/_base_/fairmot_dla34.yml",
    "content": "architecture: FairMOT\npretrain_weights: https://paddledet.bj.bcebos.com/models/pretrained/fairmot_dla34_crowdhuman_pretrained.pdparams\n\nFairMOT:\n  detector: CenterNet\n  reid: FairMOTEmbeddingHead\n  loss: FairMOTLoss\n  tracker: FrozenJDETracker\n\nCenterNet:\n  backbone: DLA\n  neck: CenterNetDLAFPN\n  head: CenterNetHead\n  post_process: CenterNetPostProcess\n  for_mot: True\n\nCenterNetPostProcess:\n  for_mot: True\n\nJDETracker:\n  conf_thres: 0.4\n  tracked_thresh: 0.4\n  metric_type: cosine\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/_base_/fairmot_reader_1088x608.yml",
    "content": "worker_num: 4\nTrainReader:\n  inputs_def:\n    image_shape: [3, 608, 1088]\n  sample_transforms:\n    - Decode: {}\n    - RGBReverse: {}\n    - AugmentHSV: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - MOTRandomAffine: {reject_outside: False}\n    - RandomFlip: {}\n    - BboxXYXY2XYWH: {}\n    - NormalizeBox: {}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1]}\n    - RGBReverse: {}\n    - Permute: {}\n  batch_transforms:\n    - Gt2FairMOTTarget: {}\n  batch_size: 6\n  shuffle: True\n  drop_last: True\n  use_shared_memory: True\n\nEvalMOTReader:\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1]}\n    - Permute: {}\n  batch_size: 1\n\n\nTestMOTReader:\n  inputs_def:\n    image_shape: [3, 608, 1088]\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1]}\n    - Permute: {}\n  batch_size: 1\n\nMOTVideoStreamReader:\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1], is_scale: True}\n    - Permute: {}\n  batch_size: 1\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/_base_/mot.yml",
    "content": "metric: MOT\nnum_classes: 1\n\n# for MOT training\nTrainDataset:\n  !MOTDataSet\n    dataset_dir: dataset/mot\n    image_lists: ['mot17.train', 'caltech.all', 'cuhksysu.train', 'prw.train', 'citypersons.train', 'eth.train']\n    data_fields: ['image', 'gt_bbox', 'gt_class', 'gt_ide']\n\n# for MOT evaluation\n# If you want to change the MOT evaluation dataset, please modify 'data_root'\nEvalMOTDataset:\n  !MOTImageFolder\n    dataset_dir: dataset/mot\n    data_root: MOT16/images/train\n    keep_ori_im: False # set True if save visualization images or video, or used in DeepSORT\n\n# for MOT video inference\nTestMOTDataset:\n  !MOTImageFolder\n    dataset_dir: dataset/mot\n    keep_ori_im: True # set True if save visualization images or video\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/_base_/optimizer_30e.yml",
    "content": "epoch: 30\n\nLearningRate:\n  base_lr: 0.0004\n  schedulers:\n  - !PiecewiseDecay\n    gamma: 0.1\n    milestones: [20,]\n    use_warmup: False\n\nOptimizerBuilder:\n  optimizer:\n    type: Adam\n  regularizer: NULL\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/_base_/runtime.yml",
    "content": "use_gpu: true\nlog_iter: 20\nsave_dir: output\nsnapshot_epoch: 1\nprint_flops: false\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/config/fairmot_dla34_30e_1088x608.yml",
    "content": "_BASE_: [\n  '_base_/mot.yml',\n  '_base_/runtime.yml',\n  '_base_/fairmot_dla34.yml',\n  '_base_/fairmot_reader_1088x608.yml',\n]\n\nmetric: MOT\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/dataset.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numbers\nimport os\nimport sys\nfrom collections import deque\nfrom collections.abc import Mapping\n\nimport six\ntry:\n    from collections.abc import Sequence, Mapping\nexcept:\n    from collections import Sequence, Mapping\n\nfrom ppdet.core.workspace import register, serializable\nfrom ppdet.utils.logger import setup_logger\nfrom ppdet.data.reader import BaseDataLoader, Compose\nimport cv2\nfrom imageio import imread, imwrite\nimport numpy as np\nimport paddle\nfrom paddle.framework import core\n\nlogger = setup_logger(__name__)\n\n\ndef default_collate_fn(batch):\n    \"\"\"\n    Default batch collating function for :code:`paddle.io.DataLoader`,\n    get input data as a list of sample datas, each element in list\n    if the data of a sample, and sample data should composed of list,\n    dictionary, string, number, numpy array and paddle.Tensor, this\n    function will parse input data recursively and stack number,\n    numpy array and paddle.Tensor datas as batch datas. e.g. 
for\n    following input data:\n    [{'image': np.array(shape=[3, 224, 224]), 'label': 1},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 3},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 4},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 5},]\n\n\n    This default collate function zips each number and numpy array\n    field together and stacks each field as a batch field as follows:\n    {'image': np.array(shape=[4, 3, 224, 224]), 'label': np.array([1, 3, 4, 5])}\n    Args:\n        batch(list of sample data): batch should be a list of sample data.\n\n    Returns:\n        Batched data: each number, numpy array and paddle.Tensor\n                      in the input data, batched.\n    \"\"\"\n    sample = batch[0]\n    if isinstance(sample, np.ndarray):\n        batch = np.stack(batch, axis=0)\n        return batch\n    elif isinstance(sample, paddle.Tensor):\n        return paddle.stack(batch, axis=0)\n    elif isinstance(sample, numbers.Number):\n        batch = np.array(batch)\n        return batch\n    elif isinstance(sample, (str, bytes)):\n        return batch\n    elif isinstance(sample, Mapping):\n        return {key: default_collate_fn([d[key] for d in batch]) for key in sample}\n    elif isinstance(sample, Sequence):\n        sample_fields_num = len(sample)\n        if not all(len(s) == sample_fields_num for s in batch):\n            raise RuntimeError(\"field number not the same among samples in a batch\")\n        return [default_collate_fn(fields) for fields in zip(*batch)]\n\n    raise TypeError(\"batch data can only contain: tensor, numpy.ndarray, \"\n                    \"dict, list, number, but got {}\".format(type(sample)))\n\n\n@register\n@serializable\nclass MOTVideoStream:\n    \"\"\"\n    Load MOT dataset with MOT format from video stream.\n    Args:\n        video_stream (str): path or url of the video file, default ''.\n        keep_ori_im (bool): whether to keep original image, default False.\n
            Set True when used during MOT model inference while saving\n            images or video, or used in DeepSORT.\n    \"\"\"\n\n    def __init__(self, video_stream=None, keep_ori_im=False, **kwargs):\n        self.video_stream = video_stream\n        self.keep_ori_im = keep_ori_im\n        self._curr_iter = 0\n        self.transform = None\n        try:\n            if video_stream is None:\n                print('No video stream is specified, please check the --video_stream option.')\n                raise FileNotFoundError(\"No video_stream is specified.\")\n            self.stream = cv2.VideoCapture(video_stream)\n            if not self.stream.isOpened():\n                raise Exception(\"Failed to open the video stream!\")\n        except Exception as e:\n            print('Failed to read {}.'.format(video_stream))\n            raise e\n\n        self.videoframeraw_dir = os.path.splitext(os.path.basename(self.video_stream))[0] + '_raw'\n        if not os.path.exists(self.videoframeraw_dir):\n            os.makedirs(self.videoframeraw_dir)\n\n    def set_kwargs(self, **kwargs):\n        self.mixup_epoch = kwargs.get('mixup_epoch', -1)\n        self.cutmix_epoch = kwargs.get('cutmix_epoch', -1)\n        self.mosaic_epoch = kwargs.get('mosaic_epoch', -1)\n\n    def set_transform(self, transform):\n        self.transform = transform\n\n    def set_epoch(self, epoch_id):\n        self._epoch = epoch_id\n\n    def parse_dataset(self):\n        pass\n\n    def __iter__(self):\n        ct = 0\n        while True:\n            ret, frame = self.stream.read()\n            if ret:\n                imgname = os.path.join(self.videoframeraw_dir, 'frame{}.png'.format(ct))\n                cv2.imwrite(imgname, frame)\n                image = imread(imgname)\n                rec = {'im_id': np.array([ct]), 'im_file': imgname}\n                if self.keep_ori_im:\n                    rec.update({'keep_ori_im': 1})\n                rec['curr_iter'] = self._curr_iter\n                self._curr_iter += 1\n                ct += 1\n                if self.transform:\n                    yield self.transform(rec)\n                else:\n                    yield rec\n            else:\n                return\n\n\n@register\n@serializable\nclass MOTImageStream:\n    \"\"\"\n    Load MOT dataset with MOT format from image stream.\n    Args:\n        keep_ori_im (bool): whether to keep original image, default False.\n            Set True when used during MOT model inference while saving\n            images or video, or used in DeepSORT.\n    \"\"\"\n\n    def __init__(self, sample_num=-1, keep_ori_im=False, **kwargs):\n        self.keep_ori_im = keep_ori_im\n        self._curr_iter = 0\n        self.transform = None\n        self.imagequeue = deque()\n\n        self.frameraw_dir = 'inputimages_raw'\n        if not os.path.exists(self.frameraw_dir):\n            os.makedirs(self.frameraw_dir)\n\n    def add_image(self, image):\n        self.imagequeue.append(image)\n\n    def set_kwargs(self, **kwargs):\n        self.mixup_epoch = kwargs.get('mixup_epoch', -1)\n        self.cutmix_epoch = kwargs.get('cutmix_epoch', -1)\n        self.mosaic_epoch = kwargs.get('mosaic_epoch', -1)\n\n    def set_transform(self, transform):\n        self.transform = transform\n\n    def set_epoch(self, epoch_id):\n        self._epoch = epoch_id\n\n    def parse_dataset(self):\n        pass\n\n    def __iter__(self):\n        ct = 0\n        while True:\n            if self.imagequeue:\n                frame = self.imagequeue.popleft()\n                imgname = os.path.join(self.frameraw_dir, 'frame{}.png'.format(ct))\n                cv2.imwrite(imgname, frame)\n                image = imread(imgname)\n                rec = {'im_id': np.array([ct]), 'im_file': imgname}\n                if self.keep_ori_im:\n                    rec.update({'keep_ori_im': 1})\n                rec['curr_iter'] = self._curr_iter\n                self._curr_iter += 1\n                ct += 1\n                if self.transform:\n                    yield self.transform(rec)\n                else:\n                    yield rec\n            else:\n                return\n\n\n@register\nclass MOTVideoStreamReader:\n    __shared__ = ['num_classes']\n\n    def __init__(self, sample_transforms=[], batch_size=1, drop_last=False, num_classes=1, **kwargs):\n        self._sample_transforms = Compose(sample_transforms, num_classes=num_classes)\n        self.batch_size = batch_size\n        self.drop_last = drop_last\n        self.num_classes = num_classes\n        self.kwargs = kwargs\n\n    def __call__(\n        self,\n        dataset,\n        worker_num,\n    ):\n        self.dataset = dataset\n        # get data\n        self.dataset.set_transform(self._sample_transforms)\n        # set kwargs\n        self.dataset.set_kwargs(**self.kwargs)\n\n        self.loader = iter(self.dataset)\n        return self\n\n    def __len__(self):\n        # the stream length is unknown in advance, so report the largest possible size\n        return sys.maxsize\n\n    def __iter__(self):\n        return self\n\n    def to_tensor(self, batch):\n        paddle.disable_static()\n        if isinstance(batch, np.ndarray):\n            batch = paddle.to_tensor(batch)\n        elif isinstance(batch, Mapping):\n            batch = {key: self.to_tensor(batch[key]) for key in batch}\n        return batch\n\n    def __next__(self):\n        batch = []\n        for i in range(self.batch_size):\n            batch.append(next(self.loader))\n        batch = default_collate_fn(batch)\n        return self.to_tensor(batch)\n\n    def next(self):\n        # python2 compatibility\n        return self.__next__()\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import matching\nfrom . import tracker\nfrom . import motion\nfrom . import visualization\nfrom . import utils\n\nfrom .matching import *\nfrom .tracker import *\nfrom .motion import *\nfrom .visualization import *\nfrom .utils import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/matching/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import jde_matching\nfrom . import deepsort_matching\n\nfrom .jde_matching import *\nfrom .deepsort_matching import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/matching/deepsort_matching.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrow from https://github.com/nwojke/deep_sort/tree/master/deep_sort\n\"\"\"\n\nimport numpy as np\nfrom scipy.optimize import linear_sum_assignment\nfrom ..motion import kalman_filter\n\nINFTY_COST = 1e+5\n\n__all__ = [\n    'iou_1toN',\n    'iou_cost',\n    '_nn_euclidean_distance',\n    '_nn_cosine_distance',\n    'NearestNeighborDistanceMetric',\n    'min_cost_matching',\n    'matching_cascade',\n    'gate_cost_matrix',\n]\n\n\ndef iou_1toN(bbox, candidates):\n    \"\"\"\n    Computer intersection over union (IoU) by one box to N candidates.\n\n    Args:\n        bbox (ndarray): A bounding box in format `(top left x, top left y, width, height)`.\n            candidates (ndarray): A matrix of candidate bounding boxes (one per row) in the\n            same format as `bbox`.\n\n    Returns:\n        ious (ndarray): The intersection over union in [0, 1] between the `bbox`\n            and each candidate. 
A higher score means a larger fraction of the\n            `bbox` is occluded by the candidate.\n    \"\"\"\n    bbox_tl = bbox[:2]\n    bbox_br = bbox[:2] + bbox[2:]\n    candidates_tl = candidates[:, :2]\n    candidates_br = candidates[:, :2] + candidates[:, 2:]\n\n    tl = np.c_[np.maximum(bbox_tl[0], candidates_tl[:, 0])[:, np.newaxis],\n               np.maximum(bbox_tl[1], candidates_tl[:, 1])[:, np.newaxis]]\n    br = np.c_[np.minimum(bbox_br[0], candidates_br[:, 0])[:, np.newaxis],\n               np.minimum(bbox_br[1], candidates_br[:, 1])[:, np.newaxis]]\n    wh = np.maximum(0., br - tl)\n\n    area_intersection = wh.prod(axis=1)\n    area_bbox = bbox[2:].prod()\n    area_candidates = candidates[:, 2:].prod(axis=1)\n    ious = area_intersection / (area_bbox + area_candidates - area_intersection)\n    return ious\n\n\ndef iou_cost(tracks, detections, track_indices=None, detection_indices=None):\n    \"\"\"\n    IoU distance metric.\n\n    Args:\n        tracks (list[Track]): A list of tracks.\n        detections (list[Detection]): A list of detections.\n        track_indices (Optional[list[int]]): A list of indices to tracks that\n            should be matched. Defaults to all `tracks`.\n        detection_indices (Optional[list[int]]): A list of indices to detections\n            that should be matched. 
Defaults to all `detections`.\n\n    Returns:\n        cost_matrix (ndarray): A cost matrix of shape len(track_indices),\n            len(detection_indices) where entry (i, j) is\n            `1 - iou(tracks[track_indices[i]], detections[detection_indices[j]])`.\n    \"\"\"\n    if track_indices is None:\n        track_indices = np.arange(len(tracks))\n    if detection_indices is None:\n        detection_indices = np.arange(len(detections))\n\n    cost_matrix = np.zeros((len(track_indices), len(detection_indices)))\n    for row, track_idx in enumerate(track_indices):\n        if tracks[track_idx].time_since_update > 1:\n            cost_matrix[row, :] = INFTY_COST\n            continue\n\n        bbox = tracks[track_idx].to_tlwh()\n        candidates = np.asarray([detections[i].tlwh for i in detection_indices])\n        cost_matrix[row, :] = 1. - iou_1toN(bbox, candidates)\n    return cost_matrix\n\n\ndef _nn_euclidean_distance(s, q):\n    \"\"\"\n    Compute pair-wise squared (Euclidean) distance between points in `s` and `q`.\n\n    Args:\n        s (ndarray): Sample points: an NxM matrix of N samples of dimensionality M.\n        q (ndarray): Query points: an LxM matrix of L samples of dimensionality M.\n\n    Returns:\n        distances (ndarray): A vector of length L that contains for each entry in `q` the\n            smallest squared Euclidean distance to a sample in `s`.\n    \"\"\"\n    s, q = np.asarray(s), np.asarray(q)\n    if len(s) == 0 or len(q) == 0:\n        return np.zeros((len(s), len(q)))\n    s2, q2 = np.square(s).sum(axis=1), np.square(q).sum(axis=1)\n    distances = -2.
* np.dot(s, q.T) + s2[:, None] + q2[None, :]\n    distances = np.clip(distances, 0., float(np.inf))\n\n    return np.maximum(0.0, distances.min(axis=0))\n\n\ndef _nn_cosine_distance(s, q):\n    \"\"\"\n    Compute pair-wise cosine distance between points in `s` and `q`.\n\n    Args:\n        s (ndarray): Sample points: an NxM matrix of N samples of dimensionality M.\n        q (ndarray): Query points: an LxM matrix of L samples of dimensionality M.\n\n    Returns:\n        distances (ndarray): A vector of length L that contains for each entry in `q` the\n            smallest cosine distance to a sample in `s`.\n    \"\"\"\n    s = np.asarray(s) / np.linalg.norm(s, axis=1, keepdims=True)\n    q = np.asarray(q) / np.linalg.norm(q, axis=1, keepdims=True)\n    distances = 1. - np.dot(s, q.T)\n\n    return distances.min(axis=0)\n\n\nclass NearestNeighborDistanceMetric(object):\n    \"\"\"\n    A nearest neighbor distance metric that, for each target, returns\n    the closest distance to any sample that has been observed so far.\n\n    Args:\n        metric (str): Either \"euclidean\" or \"cosine\".\n        matching_threshold (float): The matching threshold. Samples with larger\n            distance are considered an invalid match.\n        budget (Optional[int]): If not None, fix samples per class to at most\n            this number.
Removes the oldest samples when the budget is reached.\n\n    Attributes:\n        samples (Dict[int -> List[ndarray]]): A dictionary that maps from target\n            identities to the list of samples that have been observed so far.\n    \"\"\"\n\n    def __init__(self, metric, matching_threshold, budget=None):\n        if metric == \"euclidean\":\n            self._metric = _nn_euclidean_distance\n        elif metric == \"cosine\":\n            self._metric = _nn_cosine_distance\n        else:\n            raise ValueError(\"Invalid metric; must be either 'euclidean' or 'cosine'\")\n        self.matching_threshold = matching_threshold\n        self.budget = budget\n        self.samples = {}\n\n    def partial_fit(self, features, targets, active_targets):\n        \"\"\"\n        Update the distance metric with new data.\n\n        Args:\n            features (ndarray): An NxM matrix of N features of dimensionality M.\n            targets (ndarray): An integer array of associated target identities.\n            active_targets (List[int]): A list of targets that are currently\n                present in the scene.\n        \"\"\"\n        for feature, target in zip(features, targets):\n            self.samples.setdefault(target, []).append(feature)\n            if self.budget is not None:\n                self.samples[target] = self.samples[target][-self.budget:]\n        self.samples = {k: self.samples[k] for k in active_targets}\n\n    def distance(self, features, targets):\n        \"\"\"\n        Compute distance between features and targets.\n\n        Args:\n            features (ndarray): An NxM matrix of N features of dimensionality M.\n            targets (list[int]): A list of targets to match the given `features` against.\n\n        Returns:\n            cost_matrix (ndarray): a cost matrix of shape len(targets), len(features),\n                where element (i, j) contains the closest squared distance between\n                `targets[i]` and 
`features[j]`.\n        \"\"\"\n        cost_matrix = np.zeros((len(targets), len(features)))\n        for i, target in enumerate(targets):\n            cost_matrix[i, :] = self._metric(self.samples[target], features)\n        return cost_matrix\n\n\ndef min_cost_matching(distance_metric, max_distance, tracks, detections, track_indices=None, detection_indices=None):\n    \"\"\"\n    Solve linear assignment problem.\n\n    Args:\n        distance_metric :\n            Callable[[List[Track], List[Detection], List[int], List[int]], ndarray]\n            The distance metric is given a list of tracks and detections as\n            well as a list of N track indices and M detection indices. The\n            metric should return the NxM dimensional cost matrix, where element\n            (i, j) is the association cost between the i-th track in the given\n            track indices and the j-th detection in the given detection_indices.\n        max_distance (float): Gating threshold. Associations with cost larger\n            than this value are disregarded.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (list[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n\n    Returns:\n        A tuple (List[(int, int)], List[int], List[int]) with the following\n        three entries:\n            * A list of matched track and detection indices.\n            * A list of unmatched track indices.\n            * A list of unmatched detection indices.\n    \"\"\"\n    if track_indices is None:\n        track_indices = np.arange(len(tracks))\n    if detection_indices is None:\n        detection_indices =
np.arange(len(detections))\n\n    if len(detection_indices) == 0 or len(track_indices) == 0:\n        return [], track_indices, detection_indices  # Nothing to match.\n\n    cost_matrix = distance_metric(tracks, detections, track_indices, detection_indices)\n\n    cost_matrix[cost_matrix > max_distance] = max_distance + 1e-5\n    indices = linear_sum_assignment(cost_matrix)\n\n    matches, unmatched_tracks, unmatched_detections = [], [], []\n    for col, detection_idx in enumerate(detection_indices):\n        if col not in indices[1]:\n            unmatched_detections.append(detection_idx)\n    for row, track_idx in enumerate(track_indices):\n        if row not in indices[0]:\n            unmatched_tracks.append(track_idx)\n    for row, col in zip(indices[0], indices[1]):\n        track_idx = track_indices[row]\n        detection_idx = detection_indices[col]\n        if cost_matrix[row, col] > max_distance:\n            unmatched_tracks.append(track_idx)\n            unmatched_detections.append(detection_idx)\n        else:\n            matches.append((track_idx, detection_idx))\n    return matches, unmatched_tracks, unmatched_detections\n\n\ndef matching_cascade(distance_metric,\n                     max_distance,\n                     cascade_depth,\n                     tracks,\n                     detections,\n                     track_indices=None,\n                     detection_indices=None):\n    \"\"\"\n    Run matching cascade.\n\n    Args:\n        distance_metric :\n            Callable[[List[Track], List[Detection], List[int], List[int]], ndarray]\n            The distance metric is given a list of tracks and detections as\n            well as a list of N track indices and M detection indices.
The\n            metric should return the NxM dimensional cost matrix, where element\n            (i, j) is the association cost between the i-th track in the given\n            track indices and the j-th detection in the given detection_indices.\n        max_distance (float): Gating threshold. Associations with cost larger\n            than this value are disregarded.\n        cascade_depth (int): The cascade depth, should be set to the maximum\n            track age.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (list[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n\n    Returns:\n        A tuple (List[(int, int)], List[int], List[int]) with the following\n        three entries:\n            * A list of matched track and detection indices.\n            * A list of unmatched track indices.\n            * A list of unmatched detection indices.\n    \"\"\"\n    if track_indices is None:\n        track_indices = list(range(len(tracks)))\n    if detection_indices is None:\n        detection_indices = list(range(len(detections)))\n\n    unmatched_detections = detection_indices\n    matches = []\n    for level in range(cascade_depth):\n        if len(unmatched_detections) == 0:  # No detections left\n            break\n\n        track_indices_l = [k for k in track_indices if tracks[k].time_since_update == 1 + level]\n        if len(track_indices_l) == 0:  # Nothing to match at this level\n            continue\n\n        matches_l, _, unmatched_detections = \\\n            min_cost_matching(\n                distance_metric, max_distance, tracks, detections,\n                track_indices_l,
unmatched_detections)\n        matches += matches_l\n    unmatched_tracks = list(set(track_indices) - set(k for k, _ in matches))\n    return matches, unmatched_tracks, unmatched_detections\n\n\ndef gate_cost_matrix(kf,\n                     cost_matrix,\n                     tracks,\n                     detections,\n                     track_indices,\n                     detection_indices,\n                     gated_cost=INFTY_COST,\n                     only_position=False):\n    \"\"\"\n    Invalidate infeasible entries in cost matrix based on the state\n    distributions obtained by Kalman filtering.\n\n    Args:\n        kf (object): The Kalman filter.\n        cost_matrix (ndarray): The NxM dimensional cost matrix, where N is the\n            number of track indices and M is the number of detection indices,\n            such that entry (i, j) is the association cost between\n            `tracks[track_indices[i]]` and `detections[detection_indices[j]]`.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (List[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n        gated_cost (Optional[float]): Entries in the cost matrix corresponding\n            to infeasible associations are set to this value. Defaults to a very\n            large value.\n        only_position (Optional[bool]): If True, only the x, y position of the\n            state distribution is considered during gating. Default False.\n\n    Returns:\n        The cost matrix with infeasible entries set to `gated_cost`.\n    \"\"\"\n    gating_dim = 2 if only_position else 4\n    gating_threshold = kalman_filter.chi2inv95[gating_dim]\n    measurements = np.asarray([detections[i].to_xyah() for i in detection_indices])\n    for row, track_idx in enumerate(track_indices):\n        track = tracks[track_idx]\n        gating_distance = kf.gating_distance(track.mean, track.covariance, measurements, only_position)\n        cost_matrix[row, gating_distance > gating_threshold] = gated_cost\n    return cost_matrix\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/matching/jde_matching.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is based on https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/matching.py\r\n\"\"\"\r\n\r\ntry:\r\n    import lap\r\nexcept ImportError:\r\n    print(\r\n        'Warning: Unable to use JDE/FairMOT/ByteTrack, please install lap, for example: `pip install lap`, see https://github.com/gatagat/lap'\r\n    )\r\n\r\nimport scipy\r\nimport numpy as np\r\nfrom scipy.spatial.distance import cdist\r\nfrom ..motion import kalman_filter\r\nimport warnings\r\n\r\nwarnings.filterwarnings(\"ignore\")\r\n\r\n__all__ = [\r\n    'merge_matches',\r\n    'linear_assignment',\r\n    'bbox_ious',\r\n    'iou_distance',\r\n    'embedding_distance',\r\n    'fuse_motion',\r\n]\r\n\r\n\r\ndef merge_matches(m1, m2, shape):\r\n    O, P, Q = shape\r\n    m1 = np.asarray(m1)\r\n    m2 = np.asarray(m2)\r\n\r\n    M1 = scipy.sparse.coo_matrix((np.ones(len(m1)), (m1[:, 0], m1[:, 1])), shape=(O, P))\r\n    M2 = scipy.sparse.coo_matrix((np.ones(len(m2)), (m2[:, 0], m2[:, 1])), shape=(P, Q))\r\n\r\n    mask = M1 * M2\r\n    match = mask.nonzero()\r\n    match = list(zip(match[0], match[1]))\r\n    unmatched_O = tuple(set(range(O)) - set([i for i, j in match]))\r\n    unmatched_Q = tuple(set(range(Q)) - set([j for i, j in match]))\r\n\r\n    return match, unmatched_O, unmatched_Q\r\n\r\n\r\ndef linear_assignment(cost_matrix, thresh):\r\n    try:\r\n        import lap\r\n    except ImportError:\r\n        raise RuntimeError(\r\n            'Unable to use JDE/FairMOT/ByteTrack, please install lap, for example: `pip install lap`, see https://github.com/gatagat/lap'\r\n        )\r\n    if cost_matrix.size == 0:\r\n        return np.empty((0, 2), dtype=int), tuple(range(cost_matrix.shape[0])), tuple(range(cost_matrix.shape[1]))\r\n    matches, unmatched_a, unmatched_b = [], [], []\r\n    cost, x, y = lap.lapjv(cost_matrix, extend_cost=True, cost_limit=thresh)\r\n    for ix, mx in enumerate(x):\r\n        if mx >= 0:\r\n            matches.append([ix, mx])\r\n    unmatched_a = np.where(x < 0)[0]\r\n    unmatched_b = np.where(y < 0)[0]\r\n    matches = np.asarray(matches)\r\n    return matches, unmatched_a, unmatched_b\r\n\r\n\r\ndef bbox_ious(atlbrs, btlbrs):\r\n    boxes = np.ascontiguousarray(atlbrs, dtype=np.float64)\r\n    query_boxes = np.ascontiguousarray(btlbrs, dtype=np.float64)\r\n    N = boxes.shape[0]\r\n    K = query_boxes.shape[0]\r\n    ious = np.zeros((N, K), dtype=boxes.dtype)\r\n    if N * K == 0:\r\n        return ious\r\n\r\n    for k in range(K):\r\n        box_area = ((query_boxes[k, 2] - query_boxes[k, 0] + 1) * (query_boxes[k, 3] - query_boxes[k, 1] + 1))\r\n        for n in range(N):\r\n            iw = (min(boxes[n, 2], query_boxes[k, 2]) - max(boxes[n, 0], query_boxes[k, 0]) + 1)\r\n            if iw > 0:\r\n                ih = (min(boxes[n, 3], query_boxes[k, 3]) - max(boxes[n, 1], query_boxes[k, 1]) + 1)\r\n                if ih > 0:\r\n                    ua = float((boxes[n, 2] - boxes[n, 0] + 1) * (boxes[n, 3] - boxes[n, 1] + 1) + box_area - iw * ih)\r\n                    ious[n, k] = iw * ih / ua\r\n    return ious\r\n\r\n\r\ndef iou_distance(atracks, btracks):\r\n    \"\"\"\r\n    Compute cost based on IoU between two list[STrack].\r\n    \"\"\"\r\n    if (len(atracks) > 0 and isinstance(atracks[0], np.ndarray)) or (len(btracks) > 0\r\n                                                                     and isinstance(btracks[0], np.ndarray)):\r\n        atlbrs = atracks\r\n        btlbrs = btracks\r\n    else:\r\n        atlbrs = [track.tlbr for track in atracks]\r\n        btlbrs = [track.tlbr for track in btracks]\r\n    _ious = bbox_ious(atlbrs, btlbrs)\r\n    cost_matrix = 1 - _ious\r\n\r\n    return cost_matrix\r\n\r\n\r\ndef embedding_distance(tracks, detections, metric='euclidean'):\r\n    \"\"\"\r\n    Compute cost based on features between two list[STrack].\r\n    \"\"\"\r\n    cost_matrix = np.zeros((len(tracks), len(detections)), dtype=np.float64)\r\n    if cost_matrix.size == 0:\r\n        return cost_matrix\r\n    det_features = np.asarray([track.curr_feat for track in detections], dtype=np.float64)\r\n    track_features = np.asarray([track.smooth_feat for track in tracks], dtype=np.float64)\r\n    cost_matrix = np.maximum(0.0, cdist(track_features, det_features, metric))  # Normalized features\r\n    return cost_matrix\r\n\r\n\r\ndef fuse_motion(kf, cost_matrix, tracks, detections, only_position=False, lambda_=0.98):\r\n    if cost_matrix.size == 0:\r\n        return cost_matrix\r\n    gating_dim = 2 if only_position else 4\r\n    gating_threshold = kalman_filter.chi2inv95[gating_dim]\r\n    measurements = np.asarray([det.to_xyah() for det in detections])\r\n    for row, track in enumerate(tracks):\r\n        gating_distance = kf.gating_distance(track.mean, track.covariance, measurements, only_position, metric='maha')\r\n        cost_matrix[row, gating_distance > gating_threshold] = np.inf\r\n        cost_matrix[row] = lambda_ * cost_matrix[row] + (1 - lambda_) * gating_distance\r\n    return cost_matrix\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/motion/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import kalman_filter\n\nfrom .kalman_filter import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/motion/kalman_filter.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is borrowed from https://github.com/nwojke/deep_sort/blob/master/deep_sort/kalman_filter.py\r\n\"\"\"\r\n\r\nimport numpy as np\r\nimport scipy.linalg\r\n\r\n__all__ = ['KalmanFilter']\r\n\"\"\"\r\nTable for the 0.95 quantile of the chi-square distribution with N degrees of\r\nfreedom (contains values for N=1, ..., 9). Taken from MATLAB/Octave's chi2inv\r\nfunction and used as Mahalanobis gating threshold.\r\n\"\"\"\r\n\r\nchi2inv95 = {1: 3.8415, 2: 5.9915, 3: 7.8147, 4: 9.4877, 5: 11.070, 6: 12.592, 7: 14.067, 8: 15.507, 9: 16.919}\r\n\r\n\r\nclass KalmanFilter(object):\r\n    \"\"\"\r\n    A simple Kalman filter for tracking bounding boxes in image space.\r\n\r\n    The 8-dimensional state space\r\n\r\n        x, y, a, h, vx, vy, va, vh\r\n\r\n    contains the bounding box center position (x, y), aspect ratio a, height h,\r\n    and their respective velocities.\r\n\r\n    Object motion follows a constant velocity model. 
The bounding box location\r\n    (x, y, a, h) is taken as direct observation of the state space (linear\r\n    observation model).\r\n\r\n    \"\"\"\r\n\r\n    def __init__(self):\r\n        ndim, dt = 4, 1.\r\n\r\n        # Create Kalman filter model matrices.\r\n        self._motion_mat = np.eye(2 * ndim, 2 * ndim)\r\n        for i in range(ndim):\r\n            self._motion_mat[i, ndim + i] = dt\r\n        self._update_mat = np.eye(ndim, 2 * ndim)\r\n\r\n        # Motion and observation uncertainty are chosen relative to the current\r\n        # state estimate. These weights control the amount of uncertainty in\r\n        # the model. This is a bit hacky.\r\n        self._std_weight_position = 1. / 20\r\n        self._std_weight_velocity = 1. / 160\r\n\r\n    def initiate(self, measurement):\r\n        \"\"\"\r\n        Create track from unassociated measurement.\r\n\r\n        Args:\r\n            measurement (ndarray): Bounding box coordinates (x, y, a, h) with\r\n                center position (x, y), aspect ratio a, and height h.\r\n\r\n        Returns:\r\n            The mean vector (8 dimensional) and covariance matrix (8x8\r\n            dimensional) of the new track. 
Unobserved velocities are\r\n            initialized to 0 mean.\r\n        \"\"\"\r\n        mean_pos = measurement\r\n        mean_vel = np.zeros_like(mean_pos)\r\n        mean = np.r_[mean_pos, mean_vel]\r\n\r\n        std = [\r\n            2 * self._std_weight_position * measurement[3], 2 * self._std_weight_position * measurement[3], 1e-2,\r\n            2 * self._std_weight_position * measurement[3], 10 * self._std_weight_velocity * measurement[3],\r\n            10 * self._std_weight_velocity * measurement[3], 1e-5, 10 * self._std_weight_velocity * measurement[3]\r\n        ]\r\n        covariance = np.diag(np.square(std))\r\n        return mean, covariance\r\n\r\n    def predict(self, mean, covariance):\r\n        \"\"\"\r\n        Run Kalman filter prediction step.\r\n\r\n        Args:\r\n            mean (ndarray): The 8 dimensional mean vector of the object state\r\n                at the previous time step.\r\n            covariance (ndarray): The 8x8 dimensional covariance matrix of the\r\n                object state at the previous time step.\r\n\r\n        Returns:\r\n            The mean vector and covariance matrix of the predicted state.\r\n            Unobserved velocities are initialized to 0 mean.\r\n        \"\"\"\r\n        std_pos = [\r\n            self._std_weight_position * mean[3], self._std_weight_position * mean[3], 1e-2,\r\n            self._std_weight_position * mean[3]\r\n        ]\r\n        std_vel = [\r\n            self._std_weight_velocity * mean[3], self._std_weight_velocity * mean[3], 1e-5,\r\n            self._std_weight_velocity * mean[3]\r\n        ]\r\n        motion_cov = np.diag(np.square(np.r_[std_pos, std_vel]))\r\n\r\n        #mean = np.dot(self._motion_mat, mean)\r\n        mean = np.dot(mean, self._motion_mat.T)\r\n        covariance = np.linalg.multi_dot((self._motion_mat, covariance, self._motion_mat.T)) + motion_cov\r\n\r\n        return mean, covariance\r\n\r\n    def project(self, mean, covariance):\r\n        
\"\"\"\r\n        Project state distribution to measurement space.\r\n\r\n        Args:\r\n            mean (ndarray): The state's mean vector (8 dimensional array).\r\n            covariance (ndarray): The state's covariance matrix (8x8 dimensional).\r\n\r\n        Returns:\r\n            The projected mean and covariance matrix of the given state estimate.\r\n        \"\"\"\r\n        std = [\r\n            self._std_weight_position * mean[3], self._std_weight_position * mean[3], 1e-1,\r\n            self._std_weight_position * mean[3]\r\n        ]\r\n        innovation_cov = np.diag(np.square(std))\r\n\r\n        mean = np.dot(self._update_mat, mean)\r\n        covariance = np.linalg.multi_dot((self._update_mat, covariance, self._update_mat.T))\r\n        return mean, covariance + innovation_cov\r\n\r\n    def multi_predict(self, mean, covariance):\r\n        \"\"\"\r\n        Run Kalman filter prediction step (Vectorized version).\r\n\r\n        Args:\r\n            mean (ndarray): The Nx8 dimensional mean matrix of the object states\r\n                at the previous time step.\r\n            covariance (ndarray): The Nx8x8 dimensional covariance matrices of the\r\n                object states at the previous time step.\r\n\r\n        Returns:\r\n            The mean vector and covariance matrix of the predicted state.\r\n            Unobserved velocities are initialized to 0 mean.\r\n        \"\"\"\r\n        std_pos = [\r\n            self._std_weight_position * mean[:, 3], self._std_weight_position * mean[:, 3],\r\n            1e-2 * np.ones_like(mean[:, 3]), self._std_weight_position * mean[:, 3]\r\n        ]\r\n        std_vel = [\r\n            self._std_weight_velocity * mean[:, 3], self._std_weight_velocity * mean[:, 3],\r\n            1e-5 * np.ones_like(mean[:, 3]), self._std_weight_velocity * mean[:, 3]\r\n        ]\r\n        sqr = np.square(np.r_[std_pos, std_vel]).T\r\n\r\n        motion_cov = []\r\n        for i in range(len(mean)):\r\n            motion_cov.append(np.diag(sqr[i]))\r\n        motion_cov = np.asarray(motion_cov)\r\n\r\n        mean = np.dot(mean, self._motion_mat.T)\r\n        left = np.dot(self._motion_mat, covariance).transpose((1, 0, 2))\r\n        covariance = np.dot(left, self._motion_mat.T) + motion_cov\r\n\r\n        return mean, covariance\r\n\r\n    def update(self, mean, covariance, measurement):\r\n        \"\"\"\r\n        Run Kalman filter correction step.\r\n\r\n        Args:\r\n            mean (ndarray): The predicted state's mean vector (8 dimensional).\r\n            covariance (ndarray): The state's covariance matrix (8x8 dimensional).\r\n            measurement (ndarray): The 4 dimensional measurement vector\r\n                (x, y, a, h), where (x, y) is the center position, a the aspect\r\n                ratio, and h the height of the bounding box.\r\n\r\n        Returns:\r\n            The measurement-corrected state distribution.\r\n        \"\"\"\r\n        projected_mean, projected_cov = self.project(mean, covariance)\r\n\r\n        chol_factor, lower = scipy.linalg.cho_factor(projected_cov, lower=True, check_finite=False)\r\n        kalman_gain = scipy.linalg.cho_solve((chol_factor, lower),\r\n                                             np.dot(covariance, self._update_mat.T).T,\r\n                                             check_finite=False).T\r\n        innovation = measurement - projected_mean\r\n\r\n        new_mean = mean + np.dot(innovation, kalman_gain.T)\r\n        new_covariance = covariance - np.linalg.multi_dot((kalman_gain, projected_cov, kalman_gain.T))\r\n        return new_mean, new_covariance\r\n\r\n    def gating_distance(self, mean, covariance, measurements, only_position=False, metric='maha'):\r\n        \"\"\"\r\n        Compute gating distance between state distribution and measurements.\r\n        A suitable distance threshold can be obtained from `chi2inv95`. 
If\r\n        `only_position` is False, the chi-square distribution has 4 degrees of\r\n        freedom, otherwise 2.\r\n\r\n        Args:\r\n            mean (ndarray): Mean vector over the state distribution (8\r\n                dimensional).\r\n            covariance (ndarray): Covariance of the state distribution (8x8\r\n                dimensional).\r\n            measurements (ndarray): An Nx4 dimensional matrix of N measurements,\r\n                each in format (x, y, a, h) where (x, y) is the bounding box center\r\n                position, a the aspect ratio, and h the height.\r\n            only_position (Optional[bool]): If True, distance computation is\r\n                done with respect to the bounding box center position only.\r\n            metric (str): Metric type, 'gaussian' or 'maha'.\r\n\r\n        Returns:\r\n            An array of length N, where the i-th element contains the squared\r\n            Mahalanobis distance between (mean, covariance) and `measurements[i]`.\r\n        \"\"\"\r\n        mean, covariance = self.project(mean, covariance)\r\n        if only_position:\r\n            mean, covariance = mean[:2], covariance[:2, :2]\r\n            measurements = measurements[:, :2]\r\n\r\n        d = measurements - mean\r\n        if metric == 'gaussian':\r\n            return np.sum(d * d, axis=1)\r\n        elif metric == 'maha':\r\n            cholesky_factor = np.linalg.cholesky(covariance)\r\n            z = scipy.linalg.solve_triangular(cholesky_factor, d.T, lower=True, check_finite=False, overwrite_b=True)\r\n            squared_maha = np.sum(z * z, axis=0)\r\n            return squared_maha\r\n        else:\r\n            raise ValueError('invalid distance metric')\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/tracker/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import base_jde_tracker\nfrom . import base_sde_tracker\nfrom . import jde_tracker\n\nfrom .base_jde_tracker import *\nfrom .base_sde_tracker import *\nfrom .jde_tracker import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/tracker/base_jde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrowed from https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/multitracker.py\n\"\"\"\n\nimport numpy as np\nfrom collections import deque, OrderedDict\nfrom ..matching import jde_matching as matching\nfrom ppdet.core.workspace import register, serializable\n\n__all__ = [\n    'TrackState',\n    'BaseTrack',\n    'STrack',\n    'joint_stracks',\n    'sub_stracks',\n    'remove_duplicate_stracks',\n]\n\n\nclass TrackState(object):\n    New = 0\n    Tracked = 1\n    Lost = 2\n    Removed = 3\n\n\nclass BaseTrack(object):\n    _count = 0\n\n    track_id = 0\n    is_activated = False\n    state = TrackState.New\n\n    history = OrderedDict()\n    features = []\n    curr_feature = None\n    score = 0\n    start_frame = 0\n    frame_id = 0\n    time_since_update = 0\n\n    # multi-camera\n    location = (np.inf, np.inf)\n\n    @property\n    def end_frame(self):\n        return self.frame_id\n\n    @staticmethod\n    def next_id():\n        BaseTrack._count += 1\n        return BaseTrack._count\n\n    def activate(self, *args):\n        raise NotImplementedError\n\n    def predict(self):\n        raise NotImplementedError\n\n    def update(self, *args, **kwargs):\n        raise NotImplementedError\n\n    def mark_lost(self):\n        self.state = TrackState.Lost\n\n    def mark_removed(self):\n        self.state = TrackState.Removed\n\n\nclass STrack(BaseTrack):\n    def __init__(self, tlwh, score, temp_feat, buffer_size=30):\n        # wait activate\n        self._tlwh = np.asarray(tlwh, dtype=np.float64)\n        self.kalman_filter = None\n        self.mean, self.covariance = None, None\n        self.is_activated = False\n\n        self.score = score\n        self.tracklet_len = 0\n\n        self.smooth_feat = None\n        self.update_features(temp_feat)\n        self.features = deque([], maxlen=buffer_size)\n        self.alpha = 0.9\n\n    def update_features(self, feat):\n        feat /= np.linalg.norm(feat)\n        self.curr_feat = feat\n        if self.smooth_feat is None:\n            self.smooth_feat = feat\n        else:\n            self.smooth_feat = self.alpha * self.smooth_feat + (1 - self.alpha) * feat\n        self.features.append(feat)\n        self.smooth_feat /= np.linalg.norm(self.smooth_feat)\n\n    def predict(self):\n        mean_state = self.mean.copy()\n        if self.state != TrackState.Tracked:\n            mean_state[7] = 0\n        self.mean, self.covariance = self.kalman_filter.predict(mean_state, self.covariance)\n\n    @staticmethod\n    def multi_predict(stracks, kalman_filter):\n        if len(stracks) > 0:\n            multi_mean = np.asarray([st.mean.copy() for st in stracks])\n            multi_covariance = np.asarray([st.covariance for st in stracks])\n            for i, st in enumerate(stracks):\n                if st.state != TrackState.Tracked:\n                    multi_mean[i][7] = 0\n            multi_mean, multi_covariance = kalman_filter.multi_predict(multi_mean, multi_covariance)\n            for i, (mean, cov) in enumerate(zip(multi_mean, multi_covariance)):\n                stracks[i].mean = mean\n                stracks[i].covariance = cov\n\n    def activate(self, kalman_filter, frame_id):\n        \"\"\"Start a new tracklet\"\"\"\n        self.kalman_filter = kalman_filter\n        self.track_id = 
self.next_id()\n        self.mean, self.covariance = self.kalman_filter.initiate(self.tlwh_to_xyah(self._tlwh))\n\n        self.tracklet_len = 0\n        self.state = TrackState.Tracked\n        if frame_id == 1:\n            self.is_activated = True\n        self.frame_id = frame_id\n        self.start_frame = frame_id\n\n    def re_activate(self, new_track, frame_id, new_id=False):\n        self.mean, self.covariance = self.kalman_filter.update(self.mean, self.covariance,\n                                                               self.tlwh_to_xyah(new_track.tlwh))\n\n        self.update_features(new_track.curr_feat)\n        self.tracklet_len = 0\n        self.state = TrackState.Tracked\n        self.is_activated = True\n        self.frame_id = frame_id\n        if new_id:\n            self.track_id = self.next_id()\n\n    def update(self, new_track, frame_id, update_feature=True):\n        self.frame_id = frame_id\n        self.tracklet_len += 1\n\n        new_tlwh = new_track.tlwh\n        self.mean, self.covariance = self.kalman_filter.update(self.mean, self.covariance, self.tlwh_to_xyah(new_tlwh))\n        self.state = TrackState.Tracked\n        self.is_activated = True\n\n        self.score = new_track.score\n        if update_feature:\n            self.update_features(new_track.curr_feat)\n\n    @property\n    def tlwh(self):\n        \"\"\"\n        Get current position in bounding box format `(top left x, top left y,\n        width, height)`.\n        \"\"\"\n        if self.mean is None:\n            return self._tlwh.copy()\n        ret = self.mean[:4].copy()\n        ret[2] *= ret[3]\n        ret[:2] -= ret[2:] / 2\n        return ret\n\n    @property\n    def tlbr(self):\n        \"\"\"\n        Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,\n        `(top left, bottom right)`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    @staticmethod\n    def tlwh_to_xyah(tlwh):\n  
      \"\"\"\n        Convert bounding box to format `(center x, center y, aspect ratio,\n        height)`, where the aspect ratio is `width / height`.\n        \"\"\"\n        ret = np.asarray(tlwh).copy()\n        ret[:2] += ret[2:] / 2\n        ret[2] /= ret[3]\n        return ret\n\n    def to_xyah(self):\n        return self.tlwh_to_xyah(self.tlwh)\n\n    @staticmethod\n    def tlbr_to_tlwh(tlbr):\n        ret = np.asarray(tlbr).copy()\n        ret[2:] -= ret[:2]\n        return ret\n\n    @staticmethod\n    def tlwh_to_tlbr(tlwh):\n        ret = np.asarray(tlwh).copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    def __repr__(self):\n        return 'OT_{}_({}-{})'.format(self.track_id, self.start_frame, self.end_frame)\n\n\ndef joint_stracks(tlista, tlistb):\n    exists = {}\n    res = []\n    for t in tlista:\n        exists[t.track_id] = 1\n        res.append(t)\n    for t in tlistb:\n        tid = t.track_id\n        if not exists.get(tid, 0):\n            exists[tid] = 1\n            res.append(t)\n    return res\n\n\ndef sub_stracks(tlista, tlistb):\n    stracks = {}\n    for t in tlista:\n        stracks[t.track_id] = t\n    for t in tlistb:\n        tid = t.track_id\n        if stracks.get(tid, 0):\n            del stracks[tid]\n    return list(stracks.values())\n\n\ndef remove_duplicate_stracks(stracksa, stracksb):\n    pdist = matching.iou_distance(stracksa, stracksb)\n    pairs = np.where(pdist < 0.15)\n    dupa, dupb = list(), list()\n    for p, q in zip(*pairs):\n        timep = stracksa[p].frame_id - stracksa[p].start_frame\n        timeq = stracksb[q].frame_id - stracksb[q].start_frame\n        if timep > timeq:\n            dupb.append(q)\n        else:\n            dupa.append(p)\n    resa = [t for i, t in enumerate(stracksa) if not i in dupa]\n    resb = [t for i, t in enumerate(stracksb) if not i in dupb]\n    return resa, resb\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/tracker/base_sde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrowed from https://github.com/nwojke/deep_sort/blob/master/deep_sort/track.py\n\"\"\"\n\nfrom ppdet.core.workspace import register, serializable\n\n__all__ = ['TrackState', 'Track']\n\n\nclass TrackState(object):\n    \"\"\"\n    Enumeration type for the single target track state. Newly created tracks are\n    classified as `tentative` until enough evidence has been collected. Then,\n    the track state is changed to `confirmed`. 
Tracks that are no longer alive\n    are classified as `deleted` to mark them for removal from the set of active\n    tracks.\n    \"\"\"\n    Tentative = 1\n    Confirmed = 2\n    Deleted = 3\n\n\nclass Track(object):\n    \"\"\"\n    A single target track with state space `(x, y, a, h)` and associated\n    velocities, where `(x, y)` is the center of the bounding box, `a` is the\n    aspect ratio and `h` is the height.\n\n    Args:\n        mean (ndarray): Mean vector of the initial state distribution.\n        covariance (ndarray): Covariance matrix of the initial state distribution.\n        track_id (int): A unique track identifier.\n        n_init (int): Number of consecutive detections before the track is confirmed.\n            The track state is set to `Deleted` if a miss occurs within the first\n            `n_init` frames.\n        max_age (int): The maximum number of consecutive misses before the track\n            state is set to `Deleted`.\n        feature (Optional[ndarray]): Feature vector of the detection this track\n            originates from. If not None, this feature is added to the `features` cache.\n\n    Attributes:\n        hits (int): Total number of measurement updates.\n        age (int): Total number of frames since first occurrence.\n        time_since_update (int): Total number of frames since last measurement\n            update.\n        state (TrackState): The current track state.\n        features (List[ndarray]): A cache of features. 
On each measurement update,\n            the associated feature vector is added to this list.\n    \"\"\"\n\n    def __init__(self, mean, covariance, track_id, n_init, max_age, feature=None):\n        self.mean = mean\n        self.covariance = covariance\n        self.track_id = track_id\n        self.hits = 1\n        self.age = 1\n        self.time_since_update = 0\n\n        self.state = TrackState.Tentative\n        self.features = []\n        if feature is not None:\n            self.features.append(feature)\n\n        self._n_init = n_init\n        self._max_age = max_age\n\n    def to_tlwh(self):\n        \"\"\"Get position in format `(top left x, top left y, width, height)`.\"\"\"\n        ret = self.mean[:4].copy()\n        ret[2] *= ret[3]\n        ret[:2] -= ret[2:] / 2\n        return ret\n\n    def to_tlbr(self):\n        \"\"\"Get position in bounding box format `(min x, min y, max x, max y)`.\"\"\"\n        ret = self.to_tlwh()\n        ret[2:] = ret[:2] + ret[2:]\n        return ret\n\n    def predict(self, kalman_filter):\n        \"\"\"\n        Propagate the state distribution to the current time step using a Kalman\n        filter prediction step.\n        \"\"\"\n        self.mean, self.covariance = kalman_filter.predict(self.mean, self.covariance)\n        self.age += 1\n        self.time_since_update += 1\n\n    def update(self, kalman_filter, detection):\n        \"\"\"\n        Perform Kalman filter measurement update step and update the associated\n        detection feature cache.\n        \"\"\"\n        self.mean, self.covariance = kalman_filter.update(self.mean, self.covariance, detection.to_xyah())\n        self.features.append(detection.feature)\n\n        self.hits += 1\n        self.time_since_update = 0\n        if self.state == TrackState.Tentative and self.hits >= self._n_init:\n            self.state = TrackState.Confirmed\n\n    def mark_missed(self):\n        \"\"\"Mark this track as missed (no association at the current time 
step).\n        \"\"\"\n        if self.state == TrackState.Tentative:\n            self.state = TrackState.Deleted\n        elif self.time_since_update > self._max_age:\n            self.state = TrackState.Deleted\n\n    def is_tentative(self):\n        \"\"\"Returns True if this track is tentative (unconfirmed).\"\"\"\n        return self.state == TrackState.Tentative\n\n    def is_confirmed(self):\n        \"\"\"Returns True if this track is confirmed.\"\"\"\n        return self.state == TrackState.Confirmed\n\n    def is_deleted(self):\n        \"\"\"Returns True if this track is dead and should be deleted.\"\"\"\n        return self.state == TrackState.Deleted\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/tracker/jde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is borrow from https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/multitracker.py\r\n\"\"\"\r\n\r\nimport paddle\r\n\r\nfrom ..matching import jde_matching as matching\r\nfrom .base_jde_tracker import TrackState, BaseTrack, STrack\r\nfrom .base_jde_tracker import joint_stracks, sub_stracks, remove_duplicate_stracks\r\n\r\nfrom ppdet.core.workspace import register, serializable\r\nfrom ppdet.utils.logger import setup_logger\r\nlogger = setup_logger(__name__)\r\n\r\n__all__ = ['FrozenJDETracker']\r\n\r\n\r\n@register\r\n@serializable\r\nclass FrozenJDETracker(object):\r\n    __inject__ = ['motion']\r\n    \"\"\"\r\n    JDE tracker\r\n\r\n    Args:\r\n        det_thresh (float): threshold of detection score\r\n        track_buffer (int): buffer for tracker\r\n        min_box_area (int): min box area to filter out low quality boxes\r\n        vertical_ratio (float): w/h, the vertical ratio of the bbox to filter\r\n            bad results, set 1.6 default for pedestrian tracking. 
If set to -1,\r\n            bboxes are not filtered.\r\n        tracked_thresh (float): linear assignment threshold of tracked\r\n            stracks and detections\r\n        r_tracked_thresh (float): linear assignment threshold of\r\n            tracked stracks and unmatched detections\r\n        unconfirmed_thresh (float): linear assignment threshold of\r\n            unconfirmed stracks and unmatched detections\r\n        motion (object): KalmanFilter instance\r\n        conf_thres (float): confidence threshold for tracking\r\n        metric_type (str): either \"euclidean\" or \"cosine\", the distance metric\r\n            used for measurement to track association.\r\n    \"\"\"\r\n\r\n    def __init__(self,\r\n                 det_thresh=0.3,\r\n                 track_buffer=30,\r\n                 min_box_area=200,\r\n                 vertical_ratio=1.6,\r\n                 tracked_thresh=0.7,\r\n                 r_tracked_thresh=0.5,\r\n                 unconfirmed_thresh=0.7,\r\n                 motion='KalmanFilter',\r\n                 conf_thres=0,\r\n                 metric_type='euclidean'):\r\n        self.det_thresh = det_thresh\r\n        self.track_buffer = track_buffer\r\n        self.min_box_area = min_box_area\r\n        self.vertical_ratio = vertical_ratio\r\n\r\n        self.tracked_thresh = tracked_thresh\r\n        self.r_tracked_thresh = r_tracked_thresh\r\n        self.unconfirmed_thresh = unconfirmed_thresh\r\n        self.motion = motion\r\n        self.conf_thres = conf_thres\r\n        self.metric_type = metric_type\r\n\r\n        self.frame_id = 0\r\n        self.tracked_stracks = []\r\n        self.lost_stracks = []\r\n        self.removed_stracks = []\r\n\r\n        self.max_time_lost = 0\r\n        # max_time_lost will be calculated: int(frame_rate / 30.0 * track_buffer)\r\n\r\n    def update(self, pred_dets, pred_embs):\r\n        \"\"\"\r\n        Processes the image frame and finds bounding boxes (detections).\r\n        
Associates the detection with corresponding tracklets and also handles\r\n            lost, removed, refound and active tracklets.\r\n\r\n        Args:\r\n            pred_dets (Tensor): Detection results of the image, shape is [N, 5].\r\n            pred_embs (Tensor): Embedding results of the image, shape is [N, 512].\r\n\r\n        Return:\r\n            output_stracks (list): The list contains information regarding the\r\n                online_tracklets for the received image tensor.\r\n        \"\"\"\r\n        self.frame_id += 1\r\n        activated_starcks = []\r\n        # for storing active tracks, for the current frame\r\n        refind_stracks = []\r\n        # Lost Tracks whose detections are obtained in the current frame\r\n        lost_stracks = []\r\n        # The tracks which are not obtained in the current frame but are not\r\n        # removed. (Lost for less time than the threshold for removing)\r\n        removed_stracks = []\r\n\r\n        remain_inds = paddle.nonzero(pred_dets[:, 4] > self.conf_thres)\r\n        if remain_inds.shape[0] == 0:\r\n            pred_dets = paddle.zeros([0, 1])\r\n            pred_embs = paddle.zeros([0, 1])\r\n        else:\r\n            pred_dets = paddle.gather(pred_dets, remain_inds)\r\n            pred_embs = paddle.gather(pred_embs, remain_inds)\r\n\r\n        # Filter out the image with box_num = 0. 
pred_dets = [[0.0, 0.0, 0.0 ,0.0]]\r\n        empty_pred = True if len(pred_dets) == 1 and paddle.sum(pred_dets) == 0.0 else False\r\n        \"\"\" Step 1: Network forward, get detections & embeddings\"\"\"\r\n        if len(pred_dets) > 0 and not empty_pred:\r\n            pred_dets = pred_dets.numpy()\r\n            pred_embs = pred_embs.numpy()\r\n            detections = [\r\n                STrack(STrack.tlbr_to_tlwh(tlbrs[:4]), tlbrs[4], f, 30) for (tlbrs, f) in zip(pred_dets, pred_embs)\r\n            ]\r\n        else:\r\n            detections = []\r\n        ''' Add newly detected tracklets to tracked_stracks'''\r\n        unconfirmed = []\r\n        tracked_stracks = []  # type: list[STrack]\r\n        for track in self.tracked_stracks:\r\n            if not track.is_activated:\r\n                # previous tracks which are not active in the current frame are added in unconfirmed list\r\n                unconfirmed.append(track)\r\n            else:\r\n                # Active tracks are added to the local list 'tracked_stracks'\r\n                tracked_stracks.append(track)\r\n        \"\"\" Step 2: First association, with embedding\"\"\"\r\n        # Combining currently tracked_stracks and lost_stracks\r\n        strack_pool = joint_stracks(tracked_stracks, self.lost_stracks)\r\n        # Predict the current location with KF\r\n        STrack.multi_predict(strack_pool, self.motion)\r\n\r\n        dists = matching.embedding_distance(strack_pool, detections, metric=self.metric_type)\r\n        dists = matching.fuse_motion(self.motion, dists, strack_pool, detections)\r\n        # The dists is the list of distances of the detection with the tracks in strack_pool\r\n        matches, u_track, u_detection = matching.linear_assignment(dists, thresh=self.tracked_thresh)\r\n        # The matches is the array for corresponding matches of the detection with the corresponding strack_pool\r\n\r\n        for itracked, idet in matches:\r\n            # itracked is 
the id of the track and idet is the detection\r\n            track = strack_pool[itracked]\r\n            det = detections[idet]\r\n            if track.state == TrackState.Tracked:\r\n                # If the track is active, add the detection to the track\r\n                track.update(detections[idet], self.frame_id)\r\n                activated_starcks.append(track)\r\n            else:\r\n                # We have obtained a detection from a track which is not active,\r\n                # hence put the track in refind_stracks list\r\n                track.re_activate(det, self.frame_id, new_id=False)\r\n                refind_stracks.append(track)\r\n\r\n        # None of the steps below happen if there are no undetected tracks.\r\n        \"\"\" Step 3: Second association, with IOU\"\"\"\r\n        detections = [detections[i] for i in u_detection]\r\n        # detections is now a list of the unmatched detections\r\n        r_tracked_stracks = []\r\n        # This is container for stracks which were tracked till the previous\r\n        # frame but no detection was found for it in the current frame.\r\n\r\n        for i in u_track:\r\n            if strack_pool[i].state == TrackState.Tracked:\r\n                r_tracked_stracks.append(strack_pool[i])\r\n        dists = matching.iou_distance(r_tracked_stracks, detections)\r\n        matches, u_track, u_detection = matching.linear_assignment(dists, thresh=self.r_tracked_thresh)\r\n        # matches is the list of detections which matched with corresponding\r\n        # tracks by IOU distance method.\r\n\r\n        for itracked, idet in matches:\r\n            track = r_tracked_stracks[itracked]\r\n            det = detections[idet]\r\n            if track.state == TrackState.Tracked:\r\n                track.update(det, self.frame_id)\r\n                activated_starcks.append(track)\r\n            else:\r\n                track.re_activate(det, self.frame_id, new_id=False)\r\n                
refind_stracks.append(track)\r\n        # Same process done for some unmatched detections, but now considering IOU_distance as measure\r\n\r\n        for it in u_track:\r\n            track = r_tracked_stracks[it]\r\n            if not track.state == TrackState.Lost:\r\n                track.mark_lost()\r\n                lost_stracks.append(track)\r\n        # If no detections are obtained for tracks (u_track), the tracks are added to lost_tracks list and are marked lost\r\n        '''Deal with unconfirmed tracks, usually tracks with only one beginning frame'''\r\n        detections = [detections[i] for i in u_detection]\r\n        dists = matching.iou_distance(unconfirmed, detections)\r\n        matches, u_unconfirmed, u_detection = matching.linear_assignment(dists, thresh=self.unconfirmed_thresh)\r\n        for itracked, idet in matches:\r\n            unconfirmed[itracked].update(detections[idet], self.frame_id)\r\n            activated_starcks.append(unconfirmed[itracked])\r\n\r\n        # The tracks which are yet not matched\r\n        for it in u_unconfirmed:\r\n            track = unconfirmed[it]\r\n            track.mark_removed()\r\n            removed_stracks.append(track)\r\n\r\n        # after all these confirmation steps, if a new detection is found, it is initialized for a new track\r\n        \"\"\" Step 4: Init new stracks\"\"\"\r\n        for inew in u_detection:\r\n            track = detections[inew]\r\n            if track.score < self.det_thresh:\r\n                continue\r\n            track.activate(self.motion, self.frame_id)\r\n            activated_starcks.append(track)\r\n        \"\"\" Step 5: Update state\"\"\"\r\n        # If the tracks are lost for more frames than the threshold number, the tracks are removed.\r\n        for track in self.lost_stracks:\r\n            if self.frame_id - track.end_frame > self.max_time_lost:\r\n                track.mark_removed()\r\n                removed_stracks.append(track)\r\n\r\n        # 
Update the self.tracked_stracks and self.lost_stracks using the updates in this step.\r\n        self.tracked_stracks = [t for t in self.tracked_stracks if t.state == TrackState.Tracked]\r\n        self.tracked_stracks = joint_stracks(self.tracked_stracks, activated_starcks)\r\n        self.tracked_stracks = joint_stracks(self.tracked_stracks, refind_stracks)\r\n\r\n        self.lost_stracks = sub_stracks(self.lost_stracks, self.tracked_stracks)\r\n        self.lost_stracks.extend(lost_stracks)\r\n        self.lost_stracks = sub_stracks(self.lost_stracks, self.removed_stracks)\r\n        self.removed_stracks.extend(removed_stracks)\r\n        self.tracked_stracks, self.lost_stracks = remove_duplicate_stracks(self.tracked_stracks, self.lost_stracks)\r\n        # get scores of lost tracks\r\n        output_stracks = [track for track in self.tracked_stracks if track.is_activated]\r\n\r\n        logger.debug('===========Frame {}=========='.format(self.frame_id))\r\n        logger.debug('Activated: {}'.format([track.track_id for track in activated_starcks]))\r\n        logger.debug('Refind: {}'.format([track.track_id for track in refind_stracks]))\r\n        logger.debug('Lost: {}'.format([track.track_id for track in lost_stracks]))\r\n        logger.debug('Removed: {}'.format([track.track_id for track in removed_stracks]))\r\n\r\n        return output_stracks\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport time\nimport paddle\nimport numpy as np\n\n__all__ = [\n    'Timer',\n    'Detection',\n    'load_det_results',\n    'preprocess_reid',\n    'get_crops',\n    'clip_box',\n    'scale_coords',\n]\n\n\nclass Timer(object):\n    \"\"\"\n    This class used to compute and print the current FPS while evaling.\n    \"\"\"\n\n    def __init__(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n    def tic(self):\n        # using time.time instead of time.clock because time time.clock\n        # does not normalize for multithreading\n        self.start_time = time.time()\n\n    def toc(self, average=True):\n        self.diff = time.time() - self.start_time\n        self.total_time += self.diff\n        self.calls += 1\n        self.average_time = self.total_time / self.calls\n        if average:\n            self.duration = self.average_time\n        else:\n            self.duration = self.diff\n        return self.duration\n\n    def clear(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n\nclass Detection(object):\n    \"\"\"\n    This class represents a 
bounding box detection in a single image.\n\n    Args:\n        tlwh (ndarray): Bounding box in format `(top left x, top left y,\n            width, height)`.\n        confidence (ndarray): Detector confidence score.\n        feature (Tensor): A feature vector that describes the object\n            contained in this image.\n    \"\"\"\n\n    def __init__(self, tlwh, confidence, feature):\n        self.tlwh = np.asarray(tlwh, dtype=np.float32)\n        self.confidence = np.asarray(confidence, dtype=np.float32)\n        self.feature = feature\n\n    def to_tlbr(self):\n        \"\"\"\n        Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,\n        `(top left, bottom right)`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    def to_xyah(self):\n        \"\"\"\n        Convert bounding box to format `(center x, center y, aspect ratio,\n        height)`, where the aspect ratio is `width / height`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[:2] += ret[2:] / 2\n        ret[2] /= ret[3]\n        return ret\n\n\ndef load_det_results(det_file, num_frames):\n    assert os.path.exists(det_file) and os.path.isfile(det_file), \\\n        'Error: det_file: {} not exist or not a file.'.format(det_file)\n    labels = np.loadtxt(det_file, dtype='float32', delimiter=',')\n    results_list = []\n    for frame_i in range(0, num_frames):\n        results = {'bbox': [], 'score': []}\n        lables_with_frame = labels[labels[:, 0] == frame_i + 1]\n        for l in lables_with_frame:\n            results['bbox'].append(l[1:5])\n            results['score'].append(l[5])\n        results_list.append(results)\n    return results_list\n\n\ndef scale_coords(coords, input_shape, im_shape, scale_factor):\n    im_shape = im_shape.numpy()[0]\n    ratio = scale_factor[0][0]\n    pad_w = (input_shape[1] - int(im_shape[1])) / 2\n    pad_h = (input_shape[0] - int(im_shape[0])) / 2\n    coords = 
paddle.cast(coords, 'float32')\n    coords[:, 0::2] -= pad_w\n    coords[:, 1::2] -= pad_h\n    coords[:, 0:4] /= ratio\n    coords[:, :4] = paddle.clip(coords[:, :4], min=0, max=coords[:, :4].max())\n    return coords.round()\n\n\ndef clip_box(xyxy, input_shape, im_shape, scale_factor):\n    im_shape = im_shape.numpy()[0]\n    ratio = scale_factor.numpy()[0][0]\n    img0_shape = [int(im_shape[0] / ratio), int(im_shape[1] / ratio)]\n\n    xyxy[:, 0::2] = paddle.clip(xyxy[:, 0::2], min=0, max=img0_shape[1])\n    xyxy[:, 1::2] = paddle.clip(xyxy[:, 1::2], min=0, max=img0_shape[0])\n    return xyxy\n\n\ndef get_crops(xyxy, ori_img, pred_scores, w, h):\n    crops = []\n    keep_scores = []\n    xyxy = xyxy.numpy().astype(np.int64)\n    ori_img = ori_img.numpy()\n    ori_img = np.squeeze(ori_img, axis=0).transpose(1, 0, 2)\n    pred_scores = pred_scores.numpy()\n    for i, bbox in enumerate(xyxy):\n        if bbox[2] <= bbox[0] or bbox[3] <= bbox[1]:\n            continue\n        crop = ori_img[bbox[0]:bbox[2], bbox[1]:bbox[3], :]\n        crops.append(crop)\n        keep_scores.append(pred_scores[i])\n    if len(crops) == 0:\n        return [], []\n    crops = preprocess_reid(crops, w, h)\n    return crops, keep_scores\n\n\ndef preprocess_reid(imgs, w=64, h=192, mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]):\n    im_batch = []\n    for img in imgs:\n        img = cv2.resize(img, (w, h))\n        img = img[:, :, ::-1].astype('float32').transpose((2, 0, 1)) / 255\n        img_mean = np.array(mean).reshape((3, 1, 1))\n        img_std = np.array(std).reshape((3, 1, 1))\n        img -= img_mean\n        img /= img_std\n        img = np.expand_dims(img, axis=0)\n        im_batch.append(img)\n    im_batch = np.concatenate(im_batch, 0)\n    return im_batch\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/modeling/mot/visualization.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport cv2\nimport numpy as np\n\n\ndef tlwhs_to_tlbrs(tlwhs):\n    tlbrs = np.copy(tlwhs)\n    if len(tlbrs) == 0:\n        return tlbrs\n    tlbrs[:, 2] += tlwhs[:, 0]\n    tlbrs[:, 3] += tlwhs[:, 1]\n    return tlbrs\n\n\ndef get_color(idx):\n    idx = idx * 3\n    color = ((37 * idx) % 255, (17 * idx) % 255, (29 * idx) % 255)\n    return color\n\n\ndef resize_image(image, max_size=800):\n    if max(image.shape[:2]) > max_size:\n        scale = float(max_size) / max(image.shape[:2])\n        image = cv2.resize(image, None, fx=scale, fy=scale)\n    return image\n\n\ndef plot_tracking(image, tlwhs, obj_ids, scores=None, frame_id=0, fps=0., ids2=None):\n    im = np.ascontiguousarray(np.copy(image))\n    im_h, im_w = im.shape[:2]\n\n    top_view = np.zeros([im_w, im_w, 3], dtype=np.uint8) + 255\n\n    text_scale = max(1, image.shape[1] / 1600.)\n    text_thickness = 2\n    line_thickness = max(1, int(image.shape[1] / 500.))\n\n    radius = max(5, int(im_w / 140.))\n    cv2.putText(\n        im,\n        'frame: %d fps: %.2f num: %d' % (frame_id, fps, len(tlwhs)), (0, int(15 * text_scale)),\n        cv2.FONT_HERSHEY_PLAIN,\n        text_scale, (0, 0, 255),\n        thickness=2)\n\n    for i, tlwh in enumerate(tlwhs):\n        x1, y1, w, h = tlwh\n        intbox = tuple(map(int, (x1, y1, x1 + w, y1 + h)))\n        obj_id = 
int(obj_ids[i])\n        id_text = '{}'.format(int(obj_id))\n        if ids2 is not None:\n            id_text = id_text + ', {}'.format(int(ids2[i]))\n        _line_thickness = 1 if obj_id <= 0 else line_thickness\n        color = get_color(abs(obj_id))\n        cv2.rectangle(im, intbox[0:2], intbox[2:4], color=color, thickness=line_thickness)\n        cv2.putText(\n            im,\n            id_text, (intbox[0], intbox[1] + 10),\n            cv2.FONT_HERSHEY_PLAIN,\n            text_scale, (0, 0, 255),\n            thickness=text_thickness)\n\n        if scores is not None:\n            text = '{:.2f}'.format(float(scores[i]))\n            cv2.putText(\n                im,\n                text, (intbox[0], intbox[1] - 10),\n                cv2.FONT_HERSHEY_PLAIN,\n                text_scale, (0, 255, 255),\n                thickness=text_thickness)\n    return im\n\n\ndef plot_trajectory(image, tlwhs, track_ids):\n    image = image.copy()\n    for one_tlwhs, track_id in zip(tlwhs, track_ids):\n        color = get_color(int(track_id))\n        for tlwh in one_tlwhs:\n            x1, y1, w, h = tuple(map(int, tlwh))\n            cv2.circle(image, (int(x1 + 0.5 * w), int(y1 + h)), 2, color, thickness=2)\n    return image\n\n\ndef plot_detections(image, tlbrs, scores=None, color=(255, 0, 0), ids=None):\n    im = np.copy(image)\n    text_scale = max(1, image.shape[1] / 800.)\n    thickness = 2 if text_scale > 1.3 else 1\n    for i, det in enumerate(tlbrs):\n        # np.int was removed in NumPy 1.24; use the explicit np.int64 dtype\n        x1, y1, x2, y2 = np.asarray(det[:4], dtype=np.int64)\n        if len(det) >= 7:\n            label = 'det' if det[5] > 0 else 'trk'\n            if ids is not None:\n                text = '{}# {:.2f}: {:d}'.format(label, det[6], ids[i])\n                cv2.putText(\n                    im, text, (x1, y1 + 30), cv2.FONT_HERSHEY_PLAIN, text_scale, (0, 255, 255), thickness=thickness)\n            else:\n                text = '{}# {:.2f}'.format(label, det[6])\n\n        if scores is not None:\n            
text = '{:.2f}'.format(scores[i])\n            cv2.putText(im, text, (x1, y1 + 30), cv2.FONT_HERSHEY_PLAIN, text_scale, (0, 255, 255), thickness=thickness)\n\n        cv2.rectangle(im, (x1, y1), (x2, y2), color, 2)\n    return im\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport glob\nimport os\nimport signal\nimport sys\n\nimport cv2\nimport paddle\nfrom ppdet.core.workspace import load_config\nfrom ppdet.core.workspace import merge_config\nfrom ppdet.engine import Tracker\nfrom ppdet.utils.check import check_config\nfrom ppdet.utils.check import check_gpu\nfrom ppdet.utils.check import check_version\nfrom ppdet.utils.logger import setup_logger\n\nimport paddlehub as hub\nfrom .tracker import StreamTracker\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\n\nlogger = setup_logger('Predict')\n\n\n@moduleinfo(name=\"fairmot_dla34\",\n            type=\"CV/multiple_object_tracking\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"Fairmot is a model for multiple object tracking.\",\n            version=\"1.1.0\")\nclass FairmotTracker_1088x608:\n\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"fairmot_dla34_30e_1088x608\")\n\n    def tracking(self, video_stream, output_dir='mot_result', visualization=True, draw_threshold=0.5, use_gpu=False):\n        '''\n        Track a video, and save the prediction results into output_dir, if visualization is set as True.\n\n        video_stream: the video path\n        output_dir: specify the dir to save the results\n      
  visualization: if True, save the results as a video, otherwise not.\n        draw_threshold: the threshold for the prediction results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        '''\n        self.video_stream = video_stream\n        self.output_dir = output_dir\n        self.visualization = visualization\n        self.draw_threshold = draw_threshold\n        self.use_gpu = use_gpu\n\n        cfg = load_config(os.path.join(self.directory, 'config', 'fairmot_dla34_30e_1088x608.yml'))\n        check_config(cfg)\n\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n\n        paddle.disable_static()\n        tracker = StreamTracker(cfg, mode='test')\n\n        # load weights\n        tracker.load_weights_jde(self.pretrained_model)\n        signal.signal(signal.SIGINT, self.signalhandler)\n        # inference\n        tracker.videostream_predict(video_stream=video_stream,\n                                    output_dir=output_dir,\n                                    data_type='mot',\n                                    model_type='FairMOT',\n                                    visualization=visualization,\n                                    draw_threshold=draw_threshold)\n\n    def stream_mode(self, output_dir='mot_result', visualization=True, draw_threshold=0.5, use_gpu=False):\n        '''\n        Entering the stream mode enables image stream prediction. 
Users can predict the images like a stream and save the results to a video.\n\n        output_dir: specify the dir to save the results\n        visualization: if True, save the results as a video, otherwise not.\n        draw_threshold: the threshold for the prediction results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        '''\n        self.output_dir = output_dir\n        self.visualization = visualization\n        self.draw_threshold = draw_threshold\n        self.use_gpu = use_gpu\n\n        cfg = load_config(os.path.join(self.directory, 'config', 'fairmot_dla34_30e_1088x608.yml'))\n        check_config(cfg)\n\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n\n        paddle.disable_static()\n        self.tracker = StreamTracker(cfg, mode='test')\n\n        # load weights\n        self.tracker.load_weights_jde(self.pretrained_model)\n        signal.signal(signal.SIGINT, self.signalhandler)\n        return self\n\n    def __enter__(self):\n        self.tracker_generator = self.tracker.imagestream_predict(self.output_dir,\n                                                                  data_type='mot',\n                                                                  model_type='FairMOT',\n                                                                  visualization=self.visualization,\n                                                                  draw_threshold=self.draw_threshold)\n        next(self.tracker_generator)\n\n    def __exit__(self, exc_type, exc_value, traceback):\n        seq = 'inputimages'\n        save_dir = os.path.join(self.output_dir, 'mot_outputs', seq) if self.visualization else None\n        if self.visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n      
      #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(output_video_path,\n                                           apiPreference=0,\n                                           fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                                           fps=30,\n                                           frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n\n    def predict(self, images: list = []):\n        '''\n        Predict the images. 
This method should be called in stream_mode.\n\n        images: the image list used for prediction.\n\n        Example:\n        tracker = hub.Module('fairmot_dla34')\n        with tracker.stream_mode(output_dir='image_stream_output', visualization=True, draw_threshold=0.5, use_gpu=True):\n            tracker.predict([images])\n        '''\n        length = len(images)\n        if length == 0:\n            print('No images provided.')\n            return\n        for image in images:\n            self.tracker.dataset.add_image(image)\n            try:\n                results = next(self.tracker_generator)\n            except StopIteration as e:\n                return\n\n        return results[-length:]\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        self.tracking(\n            video_stream=self.args.video_stream,\n            output_dir=self.args.output_dir,\n            visualization=self.args.visualization,\n            draw_threshold=self.args.draw_threshold,\n            use_gpu=self.args.use_gpu,\n        )\n\n    def signalhandler(self, signum, frame):\n        seq = os.path.splitext(os.path.basename(self.video_stream))[0]\n        save_dir = os.path.join(self.output_dir, 'mot_outputs', seq) if self.visualization else None\n        if self.visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n            #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(output_video_path,\n                                           apiPreference=0,\n                                           fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                                           fps=30,\n                                           frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = 
os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n            print('Program Interrupted! Save video in {}'.format(output_video_path))\n            exit(0)\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='mot_result',\n                                           help='Directory name for output tracking results.')\n        self.arg_config_group.add_argument('--visualization',\n                                           action='store_true',\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument(\"--draw_threshold\",\n                                           type=float,\n                                           default=0.5,\n                                           help=\"Threshold to reserve the result for visualization.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--video_stream',\n                                          type=str,\n                                          help=\"path to video stream, can be a video file or stream device number.\")\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/requirements.txt",
    "content": "cython\npaddledet == 2.2.0\nopencv-python\nimageio\npillow==7.1.2\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport cv2\nimport glob\nimport paddle\nimport numpy as np\nimport collections\n\nfrom ppdet.core.workspace import create\nfrom ppdet.utils.checkpoint import load_weight, load_pretrain_weight\nfrom ppdet.metrics import Metric, MOTMetric, KITTIMOTMetric\nimport ppdet.utils.stats as stats\nfrom ppdet.engine.callbacks import Callback, ComposeCallback\nfrom ppdet.utils.logger import setup_logger\n\nfrom .dataset import MOTVideoStream, MOTImageStream\nfrom .utils import Timer\nfrom .modeling.mot.utils import Detection, get_crops, scale_coords, clip_box\nfrom .modeling.mot import visualization as mot_vis\n\nlogger = setup_logger(__name__)\n\n\nclass StreamTracker(object):\n    def __init__(self, cfg, mode='eval'):\n        self.cfg = cfg\n        assert mode.lower() in ['test', 'eval'], \\\n                \"mode should be 'test' or 'eval'\"\n        self.mode = mode.lower()\n        self.optimizer = None\n\n        # build model\n        with paddle.no_grad():\n            self.model = create(cfg.architecture)\n\n        self.status = {}\n        self.start_epoch = 0\n\n    def load_weights_jde(self, weights):\n        load_weight(self.model, weights, self.optimizer)\n\n    def _eval_seq_jde(self, dataloader, save_dir=None, show_image=False, frame_rate=30, draw_threshold=0):\n        if save_dir:\n            if not 
os.path.exists(save_dir): os.makedirs(save_dir)\n        tracker = self.model.tracker\n        tracker.max_time_lost = int(frame_rate / 30.0 * tracker.track_buffer)\n\n        timer = Timer()\n        results = []\n        frame_id = 0\n        self.status['mode'] = 'track'\n        self.model.eval()\n        for step_id, data in enumerate(dataloader):\n            #print('data', data)\n            self.status['step_id'] = step_id\n            if frame_id % 40 == 0:\n                logger.info('Processing frame {} ({:.2f} fps)'.format(frame_id, 1. / max(1e-5, timer.average_time)))\n\n            # forward\n            timer.tic()\n            pred_dets, pred_embs = self.model(data)\n            online_targets = self.model.tracker.update(pred_dets, pred_embs)\n            online_tlwhs, online_ids = [], []\n            online_scores = []\n            for t in online_targets:\n                tlwh = t.tlwh\n                tid = t.track_id\n                tscore = t.score\n                if tscore < draw_threshold: continue\n                vertical = tlwh[2] / tlwh[3] > 1.6\n                if tlwh[2] * tlwh[3] > tracker.min_box_area and not vertical:\n                    online_tlwhs.append(tlwh)\n                    online_ids.append(tid)\n                    online_scores.append(tscore)\n            timer.toc()\n\n            # save results\n            results.append((frame_id + 1, online_tlwhs, online_scores, online_ids))\n            self.save_results(data, frame_id, online_ids, online_tlwhs, online_scores, timer.average_time, show_image,\n                              save_dir)\n            frame_id += 1\n\n        return results, frame_id, timer.average_time, timer.calls\n\n    def _eval_seq_jde_single_image(self, iterator, save_dir=None, show_image=False, draw_threshold=0):\n        if save_dir:\n            if not os.path.exists(save_dir): os.makedirs(save_dir)\n        tracker = self.model.tracker\n        results = []\n        frame_id = 0\n        
self.status['mode'] = 'track'\n        self.model.eval()\n        timer = Timer()\n        while True:\n            try:\n                data = next(iterator)\n                timer.tic()\n                pred_dets, pred_embs = self.model(data)\n                online_targets = self.model.tracker.update(pred_dets, pred_embs)\n                online_tlwhs, online_ids = [], []\n                online_scores = []\n                for t in online_targets:\n                    tlwh = t.tlwh\n                    tid = t.track_id\n                    tscore = t.score\n                    if tscore < draw_threshold: continue\n                    vertical = tlwh[2] / tlwh[3] > 1.6\n                    if tlwh[2] * tlwh[3] > tracker.min_box_area and not vertical:\n                        online_tlwhs.append(tlwh)\n                        online_ids.append(tid)\n                        online_scores.append(tscore)\n                timer.toc()\n                # save results\n                results.append((frame_id + 1, online_tlwhs, online_scores, online_ids))\n                self.save_results(data, frame_id, online_ids, online_tlwhs, online_scores, timer.average_time,\n                                  show_image, save_dir)\n                frame_id += 1\n\n                yield results, frame_id\n\n            except StopIteration as e:\n                return\n\n    def imagestream_predict(self, output_dir, data_type='mot', model_type='JDE', visualization=True,\n                            draw_threshold=0.5):\n        if not os.path.exists(output_dir): os.makedirs(output_dir)\n        result_root = os.path.join(output_dir, 'mot_results')\n        if not os.path.exists(result_root): os.makedirs(result_root)\n        assert data_type in ['mot', 'kitti'], \\\n            \"data_type should be 'mot' or 'kitti'\"\n        assert model_type in ['JDE', 'FairMOT'], \\\n            \"model_type should be 'JDE', or 'FairMOT'\"\n        seq = 'inputimages'\n        self.dataset = 
MOTImageStream(keep_ori_im=True)\n\n        save_dir = os.path.join(output_dir, 'mot_outputs', seq) if visualization else None\n\n        self.dataloader = create('MOTVideoStreamReader')(self.dataset, 0)\n        self.dataloader_iter = iter(self.dataloader)\n        result_filename = os.path.join(result_root, '{}.txt'.format(seq))\n\n        if model_type in ['JDE', 'FairMOT']:\n            generator = self._eval_seq_jde_single_image(\n                self.dataloader_iter, save_dir=save_dir, draw_threshold=draw_threshold)\n        else:\n            raise ValueError(model_type)\n        yield\n        results = []\n        while True:\n            try:\n                results, nf = next(generator)\n                yield results\n            except StopIteration as e:\n                self.write_mot_results(result_filename, results, data_type)\n                return\n\n    def videostream_predict(self,\n                            video_stream,\n                            output_dir,\n                            data_type='mot',\n                            model_type='JDE',\n                            visualization=True,\n                            draw_threshold=0.5):\n        assert video_stream is not None, \\\n            \"--video_stream should be set.\"\n\n        if not os.path.exists(output_dir): os.makedirs(output_dir)\n        result_root = os.path.join(output_dir, 'mot_results')\n        if not os.path.exists(result_root): os.makedirs(result_root)\n        assert data_type in ['mot', 'kitti'], \\\n            \"data_type should be 'mot' or 'kitti'\"\n        assert model_type in ['JDE', 'FairMOT'], \\\n            \"model_type should be 'JDE', or 'FairMOT'\"\n        seq = os.path.splitext(os.path.basename(video_stream))[0]\n        self.dataset = MOTVideoStream(video_stream, keep_ori_im=True)\n\n        save_dir = os.path.join(output_dir, 'mot_outputs', seq) if visualization else None\n\n        dataloader = 
create('MOTVideoStreamReader')(self.dataset, 0)\n        result_filename = os.path.join(result_root, '{}.txt'.format(seq))\n\n        with paddle.no_grad():\n            if model_type in ['JDE', 'FairMOT']:\n                results, nf, ta, tc = self._eval_seq_jde(dataloader, save_dir=save_dir, draw_threshold=draw_threshold)\n            else:\n                raise ValueError(model_type)\n\n        self.write_mot_results(result_filename, results, data_type)\n\n        if visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n            #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(\n                output_video_path,\n                apiPreference=0,\n                fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                fps=30,\n                frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n\n    def write_mot_results(self, filename, results, data_type='mot'):\n        if data_type in ['mot', 'mcmot', 'lab']:\n            save_format = '{frame},{id},{x1},{y1},{w},{h},{score},-1,-1,-1\\n'\n        elif data_type == 
'kitti':\n            save_format = '{frame} {id} car 0 0 -10 {x1} {y1} {x2} {y2} -10 -10 -10 -1000 -1000 -1000 -10\\n'\n        else:\n            raise ValueError(data_type)\n\n        with open(filename, 'w') as f:\n            for frame_id, tlwhs, tscores, track_ids in results:\n                if data_type == 'kitti':\n                    frame_id -= 1\n                for tlwh, score, track_id in zip(tlwhs, tscores, track_ids):\n                    if track_id < 0:\n                        continue\n                    x1, y1, w, h = tlwh\n                    x2, y2 = x1 + w, y1 + h\n                    line = save_format.format(\n                        frame=frame_id, id=track_id, x1=x1, y1=y1, x2=x2, y2=y2, w=w, h=h, score=score)\n                    f.write(line)\n        logger.info('MOT results saved in {}'.format(filename))\n\n    def save_results(self, data, frame_id, online_ids, online_tlwhs, online_scores, average_time, show_image, save_dir):\n        if show_image or save_dir is not None:\n            assert 'ori_image' in data\n            img0 = data['ori_image'].numpy()[0]\n            online_im = mot_vis.plot_tracking(\n                img0, online_tlwhs, online_ids, online_scores, frame_id=frame_id, fps=1. / average_time)\n        if show_image:\n            cv2.imshow('online_im', online_im)\n        if save_dir is not None:\n            cv2.imwrite(os.path.join(save_dir, '{:05d}.jpg'.format(frame_id)), online_im)\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/fairmot_dla34/utils.py",
    "content": "import time\n\n\nclass Timer(object):\n    \"\"\"\n    This class used to compute and print the current FPS while evaling.\n    \"\"\"\n\n    def __init__(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n    def tic(self):\n        # using time.time instead of time.clock because time time.clock\n        # does not normalize for multithreading\n        self.start_time = time.time()\n\n    def toc(self, average=True):\n        self.diff = time.time() - self.start_time\n        self.total_time += self.diff\n        self.calls += 1\n        self.average_time = self.total_time / self.calls\n        if average:\n            self.duration = self.average_time\n        else:\n            self.duration = self.diff\n        return self.duration\n\n    def clear(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/README.md",
    "content": "# jde_darknet53\n\n|模型名称|jde_darknet53|\n| :--- | :---: |\n|类别|视频 - 多目标追踪|\n|网络|YOLOv3|\n|数据集|Caltech Pedestrian+CityPersons+CUHK-SYSU+PRW+ETHZ+MOT17|\n|是否支持Fine-tuning|否|\n|模型大小|420MB|\n|最新更新日期|2021-08-26|\n|数据指标|-|\n\n\n## 一、模型基本信息\n\n- ### 应用效果展示\n  - 样例结果示例：\n  <p align=\"center\">\n  <img src=\"https://user-images.githubusercontent.com/22424850/131989578-ec06e18f-e122-40b0-84d2-8772fd35391a.gif\"  hspace='10'/> <br />\n  </p>\n\n- ### 模型介绍\n\n  - JDE(Joint Detection and Embedding)是在一个单一的共享神经网络中同时学习目标检测任务和embedding任务，并同时输出检测结果和对应的外观embedding匹配的算法。JDE原论文是基于Anchor Base的YOLOv3检测器新增加一个ReID分支学习embedding，训练过程被构建为一个多任务联合学习问题，兼顾精度和速度。\n\n  - 更多详情参考：[Towards Real-Time Multi-Object Tracking](https://arxiv.org/abs/1909.12605)\n\n\n\n## 二、安装\n\n- ### 1、环境依赖  \n\n  - paddledet >= 2.2.0\n\n  - opencv-python\n\n- ### 2、安装\n\n  - ```shell\n    $ hub install jde_darknet53\n    ```\n  - 如您安装时遇到问题，可参考：[零基础windows安装](../../../../docs/docs_ch/get_start/windows_quickstart.md)\n | [零基础Linux安装](../../../../docs/docs_ch/get_start/linux_quickstart.md) | [零基础MacOS安装](../../../../docs/docs_ch/get_start/mac_quickstart.md)\n  - 在windows下安装，由于paddledet package会依赖cython-bbox以及pycocotools, 这两个包需要windows用户提前装好，可参考[cython-bbox安装](https://blog.csdn.net/qq_24739717/article/details/105588729)和[pycocotools安装](https://github.com/PaddlePaddle/PaddleX/blob/release/1.3/docs/install.md#pycocotools安装问题)\n\n\n## 三、模型API预测\n\n- ### 1、命令行预测\n\n  - ```shell\n    # Read from a video file\n    $ hub run jde_darknet53 --video_stream \"/PATH/TO/VIDEO\"\n    ```\n  - 通过命令行方式实现多目标追踪模型的调用，更多请见 [PaddleHub命令行指令](../../../../docs/docs_ch/tutorial/cmd_usage.rst)\n\n- ### 2、预测代码示例\n\n  - ```python\n    import paddlehub as hub\n\n    tracker = hub.Module(name=\"jde_darknet53\")\n    # Read from a video file\n    tracker.tracking('/PATH/TO/VIDEO', output_dir='mot_result', visualization=True,\n                        draw_threshold=0.5, use_gpu=False)\n    # or read from a image stream\n    # with 
tracker.stream_mode(output_dir='image_stream_output', visualization=True, draw_threshold=0.5, use_gpu=True):\n    #    tracker.predict([images])\n    ```\n\n- ### 3、API\n\n  - ```python\n    def tracking(video_stream,\n                 output_dir='',\n                 visualization=True,\n                 draw_threshold=0.5,\n                 use_gpu=False)\n    ```\n    - 视频预测API，完成对视频内容的多目标追踪，并存储追踪结果。\n\n    - **参数**\n\n      - video_stream (str): 视频文件的路径; <br/>\n      - output_dir (str): 结果保存路径的根目录，默认为当前目录； <br/>\n      - visualization (bool): 是否保存追踪结果；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - draw\\_threshold (float): 预测置信度的阈值。\n\n  - ```python\n    def stream_mode(output_dir='',\n                    visualization=True,\n                    draw_threshold=0.5,\n                    use_gpu=False)\n    ```\n    - 进入图片流预测模式API，在该模式中完成对图片流的多目标追踪，并存储追踪结果。\n\n    - **参数**\n\n      - output_dir (str): 结果保存路径的根目录，默认为当前目录； <br/>\n      - visualization (bool): 是否保存追踪结果；<br/>\n      - use\\_gpu (bool): 是否使用 GPU；<br/>\n      - draw\\_threshold (float): 预测置信度的阈值。\n\n  - ```python\n    def predict(images: list = [])\n    ```\n    - 对图片进行预测的API, 该接口必须在stream_mode API被调用后使用。\n\n    - **参数**\n\n      - images (list): 待预测的图片列表。\n\n\n\n## 四、更新历史\n\n* 1.0.0\n\n  初始发布\n\n* 1.1.0\n\n  移除fluid api\n\n  - ```shell\n    $ hub install jde_darknet53==1.1.0\n    ```\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/jde_darknet53.yml",
    "content": "architecture: JDE\npretrain_weights: https://paddledet.bj.bcebos.com/models/pretrained/DarkNet53_pretrained.pdparams\nfind_unused_parameters: True\n\nJDE:\n  detector: YOLOv3\n  reid: JDEEmbeddingHead\n  tracker: FrozenJDETracker\n\nYOLOv3:\n  backbone: DarkNet\n  neck: YOLOv3FPN\n  yolo_head: YOLOv3Head\n  post_process: JDEBBoxPostProcess\n  for_mot: True\n\nDarkNet:\n  depth: 53\n  return_idx: [2, 3, 4]\n  freeze_norm: True\n\nYOLOv3FPN:\n  freeze_norm: True\n\nYOLOv3Head:\n  anchors: [[128,384], [180,540], [256,640], [512,640],\n            [32,96], [45,135], [64,192], [90,271],\n            [8,24], [11,34], [16,48], [23,68]]\n  anchor_masks: [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]\n  loss: JDEDetectionLoss\n\nJDEBBoxPostProcess:\n  decode:\n    name: JDEBox\n    conf_thresh: 0.3\n    downsample_ratio: 32\n  nms:\n    name: MultiClassNMS\n    keep_top_k: 500\n    score_threshold: 0.01\n    nms_threshold: 0.5\n    nms_top_k: 2000\n    normalized: true\n\nJDEEmbeddingHead:\n  anchor_levels: 3\n  anchor_scales: 4\n  embedding_dim: 512\n  emb_loss: JDEEmbeddingLoss\n  jde_loss: JDELoss\n\nJDETracker:\n  det_thresh: 0.3\n  track_buffer: 30\n  min_box_area: 200\n  motion: KalmanFilter\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/jde_reader_1088x608.yml",
    "content": "worker_num: 2\nTrainReader:\n  sample_transforms:\n    - Decode: {}\n    - RGBReverse: {}\n    - AugmentHSV: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - MOTRandomAffine: {}\n    - RandomFlip: {}\n    - BboxXYXY2XYWH: {}\n    - NormalizeBox: {}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1], is_scale: True}\n    - RGBReverse: {}\n    - Permute: {}\n  batch_transforms:\n    - Gt2JDETargetThres:\n        anchor_masks: [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]\n        anchors: [[[128,384], [180,540], [256,640], [512,640]],\n                  [[32,96], [45,135], [64,192], [90,271]],\n                  [[8,24], [11,34], [16,48], [23,68]]]\n        downsample_ratios: [32, 16, 8]\n        ide_thresh: 0.5\n        fg_thresh: 0.5\n        bg_thresh: 0.4\n  batch_size: 4\n  shuffle: true\n  drop_last: true\n  use_shared_memory: true\n\n\nEvalMOTReader:\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1], is_scale: True}\n    - Permute: {}\n  batch_size: 1\n\n\nTestMOTReader:\n  inputs_def:\n    image_shape: [3, 608, 1088]\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1], is_scale: True}\n    - Permute: {}\n  batch_size: 1\n\n\nMOTVideoStreamReader:\n  sample_transforms:\n    - Decode: {}\n    - LetterBoxResize: {target_size: [608, 1088]}\n    - NormalizeImage: {mean: [0, 0, 0], std: [1, 1, 1], is_scale: True}\n    - Permute: {}\n  batch_size: 1\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/mot.yml",
    "content": "metric: MOT\nnum_classes: 1\n\n# for MOT training\nTrainDataset:\n  !MOTDataSet\n    dataset_dir: dataset/mot\n    image_lists: ['mot17.train', 'caltech.all', 'cuhksysu.train', 'prw.train', 'citypersons.train', 'eth.train']\n    data_fields: ['image', 'gt_bbox', 'gt_class', 'gt_ide']\n\n# for MOT evaluation\n# If you want to change the MOT evaluation dataset, please modify 'data_root'\nEvalMOTDataset:\n  !MOTImageFolder\n    dataset_dir: dataset/mot\n    data_root: MOT16/images/train\n    keep_ori_im: False # set True if save visualization images or video, or used in DeepSORT\n\n# for MOT video inference\nTestMOTDataset:\n  !MOTImageFolder\n    dataset_dir: dataset/mot\n    keep_ori_im: True # set True if save visualization images or video\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/optimizer_30e.yml",
    "content": "epoch: 30\n\nLearningRate:\n  base_lr: 0.01\n  schedulers:\n  - !PiecewiseDecay\n    gamma: 0.1\n    milestones: [15, 22]\n    use_warmup: True\n  - !BurninWarmup\n    steps: 1000\n\nOptimizerBuilder:\n  optimizer:\n    momentum: 0.9\n    type: Momentum\n  regularizer:\n    factor: 0.0001\n    type: L2\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/optimizer_60e.yml",
    "content": "epoch: 60\n\nLearningRate:\n  base_lr: 0.01\n  schedulers:\n  - !PiecewiseDecay\n    gamma: 0.1\n    milestones: [30, 44]\n    use_warmup: True\n  - !BurninWarmup\n    steps: 1000\n\nOptimizerBuilder:\n  optimizer:\n    momentum: 0.9\n    type: Momentum\n  regularizer:\n    factor: 0.0001\n    type: L2\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/_base_/runtime.yml",
    "content": "use_gpu: true\nlog_iter: 20\nsave_dir: output\nsnapshot_epoch: 1\nprint_flops: false\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/config/jde_darknet53_30e_1088x608.yml",
    "content": "_BASE_: [\n  '_base_/mot.yml',\n  '_base_/runtime.yml',\n  '_base_/jde_darknet53.yml',\n  '_base_/jde_reader_1088x608.yml',\n]\n\nJDE:\n  detector: YOLOv3\n  reid: JDEEmbeddingHead\n  tracker: FrozenJDETracker\n\nYOLOv3:\n  backbone: DarkNet\n  neck: YOLOv3FPN\n  yolo_head: YOLOv3Head\n  post_process: JDEBBoxPostProcess\n  for_mot: True\n\nYOLOv3Head:\n  anchors: [[128,384], [180,540], [256,640], [512,640],\n            [32,96], [45,135], [64,192], [90,271],\n            [8,24], [11,34], [16,48], [23,68]]\n  anchor_masks: [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]\n  loss: JDEDetectionLoss\n\nJDETracker:\n  det_thresh: 0.3\n  track_buffer: 30\n  min_box_area: 200\n  motion: KalmanFilter\n\nJDEBBoxPostProcess:\n  decode:\n    name: JDEBox\n    conf_thresh: 0.5\n    downsample_ratio: 32\n  nms:\n    name: MultiClassNMS\n    keep_top_k: 500\n    score_threshold: 0.01\n    nms_threshold: 0.4\n    nms_top_k: 2000\n    normalized: true\n    return_index: true\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/dataset.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport numbers\nimport os\nimport sys\nfrom collections import deque\nfrom collections.abc import Mapping\n\nimport six\ntry:\n    from collections.abc import Sequence, Mapping\nexcept:\n    from collections import Sequence, Mapping\n\nfrom ppdet.core.workspace import register, serializable\nfrom ppdet.utils.logger import setup_logger\nfrom ppdet.data.reader import BaseDataLoader, Compose\n\nimport cv2\nfrom imageio import imread, imwrite\nimport numpy as np\nimport paddle\nfrom paddle.framework import core\n\nlogger = setup_logger(__name__)\n\n\ndef default_collate_fn(batch):\n    \"\"\"\n    Default batch collating function for :code:`paddle.io.DataLoader`,\n    get input data as a list of sample datas, each element in list\n    if the data of a sample, and sample data should composed of list,\n    dictionary, string, number, numpy array and paddle.Tensor, this\n    function will parse input data recursively and stack number,\n    numpy array and paddle.Tensor datas as batch datas. e.g. 
for\n    the following input data:\n    [{'image': np.array(shape=[3, 224, 224]), 'label': 1},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 3},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 4},\n     {'image': np.array(shape=[3, 224, 224]), 'label': 5},]\n\n\n    This default collate function zips each number and numpy array\n    field together and stacks each field as the batch field as follows:\n    {'image': np.array(shape=[4, 3, 224, 224]), 'label': np.array([1, 3, 4, 5])}\n    Args:\n        batch(list of sample data): batch should be a list of sample data.\n\n    Returns:\n        Batched data: each number, numpy array and paddle.Tensor\n                      in the input data, batched together.\n    \"\"\"\n    sample = batch[0]\n    if isinstance(sample, np.ndarray):\n        batch = np.stack(batch, axis=0)\n        return batch\n    elif isinstance(sample, (paddle.Tensor)):\n        return paddle.stack(batch, axis=0)\n    elif isinstance(sample, numbers.Number):\n        batch = np.array(batch)\n        return batch\n    elif isinstance(sample, (str, bytes)):\n        return batch\n    elif isinstance(sample, Mapping):\n        return {key: default_collate_fn([d[key] for d in batch]) for key in sample}\n    elif isinstance(sample, Sequence):\n        sample_fields_num = len(sample)\n        if not all(len(sample) == sample_fields_num for sample in iter(batch)):\n            raise RuntimeError(\"fields number not the same among samples in a batch\")\n        return [default_collate_fn(fields) for fields in zip(*batch)]\n\n    raise TypeError(\"batch data can only contain: tensor, numpy.ndarray, \"\n                    \"dict, list, number, but got {}\".format(type(sample)))\n\n\n@register\n@serializable\nclass MOTVideoStream:\n    \"\"\"\n    Load MOT dataset with MOT format from video stream.\n    Args:\n        video_stream (str): path or url of the video file, default ''.\n        keep_ori_im (bool): whether to keep original image, default False.\n      
      Set True when used during MOT model inference while saving\n            images or video, or used in DeepSORT.\n    \"\"\"\n\n    def __init__(self, video_stream=None, keep_ori_im=False, **kwargs):\n        self.video_stream = video_stream\n        self.keep_ori_im = keep_ori_im\n        self._curr_iter = 0\n        self.transform = None\n        try:\n            if video_stream == None:\n                print('No video stream is specified, please check the --video_stream option.')\n                raise FileNotFoundError(\"No video_stream is specified.\")\n            self.stream = cv2.VideoCapture(video_stream)\n            if not self.stream.isOpened():\n                raise Exception(\"Open video stream Error!\")\n        except Exception as e:\n            print('Failed to read {}.'.format(video_stream))\n            raise e\n\n        self.videoframeraw_dir = os.path.splitext(os.path.basename(self.video_stream))[0] + '_raw'\n        if not os.path.exists(self.videoframeraw_dir):\n            os.makedirs(self.videoframeraw_dir)\n\n    def set_kwargs(self, **kwargs):\n        self.mixup_epoch = kwargs.get('mixup_epoch', -1)\n        self.cutmix_epoch = kwargs.get('cutmix_epoch', -1)\n        self.mosaic_epoch = kwargs.get('mosaic_epoch', -1)\n\n    def set_transform(self, transform):\n        self.transform = transform\n\n    def set_epoch(self, epoch_id):\n        self._epoch = epoch_id\n\n    def parse_dataset(self):\n        pass\n\n    def __iter__(self):\n        ct = 0\n        while True:\n            ret, frame = self.stream.read()\n            if ret:\n                imgname = os.path.join(self.videoframeraw_dir, 'frame{}.png'.format(ct))\n                cv2.imwrite(imgname, frame)\n                image = imread(imgname)\n                rec = {'im_id': np.array([ct]), 'im_file': imgname}\n                if self.keep_ori_im:\n                    rec.update({'keep_ori_im': 1})\n                rec['curr_iter'] = self._curr_iter\n              
  self._curr_iter += 1\n                ct += 1\n                if self.transform:\n                    yield self.transform(rec)\n                else:\n                    yield rec\n            else:\n                return\n\n\n@register\n@serializable\nclass MOTImageStream:\n    \"\"\"\n    Load MOT dataset with MOT format from image stream.\n    Args:\n        keep_ori_im (bool): whether to keep original image, default False.\n            Set True when used during MOT model inference while saving\n            images or video, or used in DeepSORT.\n    \"\"\"\n\n    def __init__(self, sample_num=-1, keep_ori_im=False, **kwargs):\n        self.keep_ori_im = keep_ori_im\n        self._curr_iter = 0\n        self.transform = None\n        self.imagequeue = deque()\n\n        self.frameraw_dir = 'inputimages_raw'\n        if not os.path.exists(self.frameraw_dir):\n            os.makedirs(self.frameraw_dir)\n\n    def add_image(self, image):\n        self.imagequeue.append(image)\n\n    def set_kwargs(self, **kwargs):\n        self.mixup_epoch = kwargs.get('mixup_epoch', -1)\n        self.cutmix_epoch = kwargs.get('cutmix_epoch', -1)\n        self.mosaic_epoch = kwargs.get('mosaic_epoch', -1)\n\n    def set_transform(self, transform):\n        self.transform = transform\n\n    def set_epoch(self, epoch_id):\n        self._epoch = epoch_id\n\n    def parse_dataset(self):\n        pass\n\n    def __iter__(self):\n        ct = 0\n        while True:\n            if self.imagequeue:\n                frame = self.imagequeue.popleft()\n                imgname = os.path.join(self.frameraw_dir, 'frame{}.png'.format(ct))\n                cv2.imwrite(imgname, frame)\n                image = imread(imgname)\n                rec = {'im_id': np.array([ct]), 'im_file': imgname}\n                if self.keep_ori_im:\n                    rec.update({'keep_ori_im': 1})\n                rec['curr_iter'] = self._curr_iter\n                self._curr_iter += 1\n                ct += 
1\n                if self.transform:\n                    yield self.transform(rec)\n                else:\n                    yield rec\n            else:\n                return\n\n\n@register\nclass MOTVideoStreamReader:\n    __shared__ = ['num_classes']\n\n    def __init__(self, sample_transforms=[], batch_size=1, drop_last=False, num_classes=1, **kwargs):\n        self._sample_transforms = Compose(sample_transforms, num_classes=num_classes)\n        self.batch_size = batch_size\n        self.drop_last = drop_last\n        self.num_classes = num_classes\n        self.kwargs = kwargs\n\n    def __call__(\n        self,\n        dataset,\n        worker_num,\n    ):\n        self.dataset = dataset\n        # get data\n        self.dataset.set_transform(self._sample_transforms)\n        # set kwargs\n        self.dataset.set_kwargs(**self.kwargs)\n\n        self.loader = iter(self.dataset)\n        return self\n\n    def __len__(self):\n        # the stream has no fixed length, so report the largest representable size\n        return sys.maxsize\n\n    def __iter__(self):\n        return self\n\n    def to_tensor(self, batch):\n        paddle.disable_static()\n        if isinstance(batch, np.ndarray):\n            batch = paddle.to_tensor(batch)\n        elif isinstance(batch, Mapping):\n            batch = {key: self.to_tensor(batch[key]) for key in batch}\n        return batch\n\n    def __next__(self):\n        try:\n            batch = []\n            for i in range(self.batch_size):\n                batch.append(next(self.loader))\n            batch = default_collate_fn(batch)\n            return self.to_tensor(batch)\n\n        except StopIteration as e:\n            raise e\n\n    def next(self):\n        # python2 compatibility\n        return self.__next__()\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import matching\nfrom . import tracker\nfrom . import motion\nfrom . import visualization\nfrom . import utils\n\nfrom .matching import *\nfrom .tracker import *\nfrom .motion import *\nfrom .visualization import *\nfrom .utils import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/matching/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import jde_matching\nfrom . import deepsort_matching\n\nfrom .jde_matching import *\nfrom .deepsort_matching import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/matching/deepsort_matching.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrow from https://github.com/nwojke/deep_sort/tree/master/deep_sort\n\"\"\"\n\nimport numpy as np\nfrom scipy.optimize import linear_sum_assignment\nfrom ..motion import kalman_filter\n\nINFTY_COST = 1e+5\n\n__all__ = [\n    'iou_1toN',\n    'iou_cost',\n    '_nn_euclidean_distance',\n    '_nn_cosine_distance',\n    'NearestNeighborDistanceMetric',\n    'min_cost_matching',\n    'matching_cascade',\n    'gate_cost_matrix',\n]\n\n\ndef iou_1toN(bbox, candidates):\n    \"\"\"\n    Computer intersection over union (IoU) by one box to N candidates.\n\n    Args:\n        bbox (ndarray): A bounding box in format `(top left x, top left y, width, height)`.\n            candidates (ndarray): A matrix of candidate bounding boxes (one per row) in the\n            same format as `bbox`.\n\n    Returns:\n        ious (ndarray): The intersection over union in [0, 1] between the `bbox`\n            and each candidate. 
A higher score means a larger fraction of the\n            `bbox` is occluded by the candidate.\n    \"\"\"\n    bbox_tl = bbox[:2]\n    bbox_br = bbox[:2] + bbox[2:]\n    candidates_tl = candidates[:, :2]\n    candidates_br = candidates[:, :2] + candidates[:, 2:]\n\n    tl = np.c_[np.maximum(bbox_tl[0], candidates_tl[:, 0])[:, np.newaxis],\n               np.maximum(bbox_tl[1], candidates_tl[:, 1])[:, np.newaxis]]\n    br = np.c_[np.minimum(bbox_br[0], candidates_br[:, 0])[:, np.newaxis],\n               np.minimum(bbox_br[1], candidates_br[:, 1])[:, np.newaxis]]\n    wh = np.maximum(0., br - tl)\n\n    area_intersection = wh.prod(axis=1)\n    area_bbox = bbox[2:].prod()\n    area_candidates = candidates[:, 2:].prod(axis=1)\n    ious = area_intersection / (area_bbox + area_candidates - area_intersection)\n    return ious\n\n\ndef iou_cost(tracks, detections, track_indices=None, detection_indices=None):\n    \"\"\"\n    IoU distance metric.\n\n    Args:\n        tracks (list[Track]): A list of tracks.\n        detections (list[Detection]): A list of detections.\n        track_indices (Optional[list[int]]): A list of indices to tracks that\n            should be matched. Defaults to all `tracks`.\n        detection_indices (Optional[list[int]]): A list of indices to detections\n            that should be matched. 
Defaults to all `detections`.\n\n    Returns:\n        cost_matrix (ndarray): A cost matrix of shape len(track_indices),\n            len(detection_indices) where entry (i, j) is\n            `1 - iou(tracks[track_indices[i]], detections[detection_indices[j]])`.\n    \"\"\"\n    if track_indices is None:\n        track_indices = np.arange(len(tracks))\n    if detection_indices is None:\n        detection_indices = np.arange(len(detections))\n\n    cost_matrix = np.zeros((len(track_indices), len(detection_indices)))\n    for row, track_idx in enumerate(track_indices):\n        if tracks[track_idx].time_since_update > 1:\n            cost_matrix[row, :] = 1e+5\n            continue\n\n        bbox = tracks[track_idx].to_tlwh()\n        candidates = np.asarray([detections[i].tlwh for i in detection_indices])\n        cost_matrix[row, :] = 1. - iou_1toN(bbox, candidates)\n    return cost_matrix\n\n\ndef _nn_euclidean_distance(s, q):\n    \"\"\"\n    Compute pair-wise squared (Euclidean) distance between points in `s` and `q`.\n\n    Args:\n        s (ndarray): Sample points: an NxM matrix of N samples of dimensionality M.\n        q (ndarray): Query points: an LxM matrix of L samples of dimensionality M.\n\n    Returns:\n        distances (ndarray): A vector of length L that contains for each entry in `q` the\n            smallest squared Euclidean distance to a sample in `s`.\n    \"\"\"\n    s, q = np.asarray(s), np.asarray(q)\n    if len(s) == 0 or len(q) == 0:\n        return np.zeros((len(s), len(q)))\n    s2, q2 = np.square(s).sum(axis=1), np.square(q).sum(axis=1)\n    distances = -2. 
* np.dot(s, q.T) + s2[:, None] + q2[None, :]\n    distances = np.clip(distances, 0., float(np.inf))\n\n    return np.maximum(0.0, distances.min(axis=0))\n\n\ndef _nn_cosine_distance(s, q):\n    \"\"\"\n    Compute pair-wise cosine distance between points in `s` and `q`.\n\n    Args:\n        s (ndarray): Sample points: an NxM matrix of N samples of dimensionality M.\n        q (ndarray): Query points: an LxM matrix of L samples of dimensionality M.\n\n    Returns:\n        distances (ndarray): A vector of length L that contains for each entry in `q` the\n            smallest cosine distance to a sample in `s`.\n    \"\"\"\n    s = np.asarray(s) / np.linalg.norm(s, axis=1, keepdims=True)\n    q = np.asarray(q) / np.linalg.norm(q, axis=1, keepdims=True)\n    distances = 1. - np.dot(s, q.T)\n\n    return distances.min(axis=0)\n\n\nclass NearestNeighborDistanceMetric(object):\n    \"\"\"\n    A nearest neighbor distance metric that, for each target, returns\n    the closest distance to any sample that has been observed so far.\n\n    Args:\n        metric (str): Either \"euclidean\" or \"cosine\".\n        matching_threshold (float): The matching threshold. Samples with larger\n            distance are considered an invalid match.\n        budget (Optional[int]): If not None, fix samples per class to at most\n            this number. 
Removes the oldest samples when the budget is reached.\n\n    Attributes:\n        samples (Dict[int -> List[ndarray]]): A dictionary that maps from target\n            identities to the list of samples that have been observed so far.\n    \"\"\"\n\n    def __init__(self, metric, matching_threshold, budget=None):\n        if metric == \"euclidean\":\n            self._metric = _nn_euclidean_distance\n        elif metric == \"cosine\":\n            self._metric = _nn_cosine_distance\n        else:\n            raise ValueError(\"Invalid metric; must be either 'euclidean' or 'cosine'\")\n        self.matching_threshold = matching_threshold\n        self.budget = budget\n        self.samples = {}\n\n    def partial_fit(self, features, targets, active_targets):\n        \"\"\"\n        Update the distance metric with new data.\n\n        Args:\n            features (ndarray): An NxM matrix of N features of dimensionality M.\n            targets (ndarray): An integer array of associated target identities.\n            active_targets (List[int]): A list of targets that are currently\n                present in the scene.\n        \"\"\"\n        for feature, target in zip(features, targets):\n            self.samples.setdefault(target, []).append(feature)\n            if self.budget is not None:\n                self.samples[target] = self.samples[target][-self.budget:]\n        self.samples = {k: self.samples[k] for k in active_targets}\n\n    def distance(self, features, targets):\n        \"\"\"\n        Compute distance between features and targets.\n\n        Args:\n            features (ndarray): An NxM matrix of N features of dimensionality M.\n            targets (list[int]): A list of targets to match the given `features` against.\n\n        Returns:\n            cost_matrix (ndarray): a cost matrix of shape len(targets), len(features),\n                where element (i, j) contains the closest squared distance between\n                `targets[i]` and 
`features[j]`.\n        \"\"\"\n        cost_matrix = np.zeros((len(targets), len(features)))\n        for i, target in enumerate(targets):\n            cost_matrix[i, :] = self._metric(self.samples[target], features)\n        return cost_matrix\n\n\ndef min_cost_matching(distance_metric, max_distance, tracks, detections, track_indices=None, detection_indices=None):\n    \"\"\"\n    Solve linear assignment problem.\n\n    Args:\n        distance_metric :\n            Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray\n            The distance metric is given a list of tracks and detections as\n            well as a list of N track indices and M detection indices. The\n            metric should return the NxM dimensional cost matrix, where element\n            (i, j) is the association cost between the i-th track in the given\n            track indices and the j-th detection in the given detection_indices.\n        max_distance (float): Gating threshold. Associations with cost larger\n            than this value are disregarded.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (list[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n\n    Returns:\n        A tuple (List[(int, int)], List[int], List[int]) with the following\n        three entries:\n            * A list of matched track and detection indices.\n            * A list of unmatched track indices.\n            * A list of unmatched detection indices.\n    \"\"\"\n    if track_indices is None:\n        track_indices = np.arange(len(tracks))\n    if detection_indices is None:\n        detection_indices = 
np.arange(len(detections))\n\n    if len(detection_indices) == 0 or len(track_indices) == 0:\n        return [], track_indices, detection_indices  # Nothing to match.\n\n    cost_matrix = distance_metric(tracks, detections, track_indices, detection_indices)\n\n    cost_matrix[cost_matrix > max_distance] = max_distance + 1e-5\n    indices = linear_sum_assignment(cost_matrix)\n\n    matches, unmatched_tracks, unmatched_detections = [], [], []\n    for col, detection_idx in enumerate(detection_indices):\n        if col not in indices[1]:\n            unmatched_detections.append(detection_idx)\n    for row, track_idx in enumerate(track_indices):\n        if row not in indices[0]:\n            unmatched_tracks.append(track_idx)\n    for row, col in zip(indices[0], indices[1]):\n        track_idx = track_indices[row]\n        detection_idx = detection_indices[col]\n        if cost_matrix[row, col] > max_distance:\n            unmatched_tracks.append(track_idx)\n            unmatched_detections.append(detection_idx)\n        else:\n            matches.append((track_idx, detection_idx))\n    return matches, unmatched_tracks, unmatched_detections\n\n\ndef matching_cascade(distance_metric,\n                     max_distance,\n                     cascade_depth,\n                     tracks,\n                     detections,\n                     track_indices=None,\n                     detection_indices=None):\n    \"\"\"\n    Run matching cascade.\n\n    Args:\n        distance_metric :\n            Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray\n            The distance metric is given a list of tracks and detections as\n            well as a list of N track indices and M detection indices. 
The\n            metric should return the NxM dimensional cost matrix, where element\n            (i, j) is the association cost between the i-th track in the given\n            track indices and the j-th detection in the given detection_indices.\n        max_distance (float): Gating threshold. Associations with cost larger\n            than this value are disregarded.\n        cascade_depth (int): The cascade depth, should be set to the maximum\n            track age.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (list[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n\n    Returns:\n        A tuple (List[(int, int)], List[int], List[int]) with the following\n        three entries:\n            * A list of matched track and detection indices.\n            * A list of unmatched track indices.\n            * A list of unmatched detection indices.\n    \"\"\"\n    if track_indices is None:\n        track_indices = list(range(len(tracks)))\n    if detection_indices is None:\n        detection_indices = list(range(len(detections)))\n\n    unmatched_detections = detection_indices\n    matches = []\n    for level in range(cascade_depth):\n        if len(unmatched_detections) == 0:  # No detections left\n            break\n\n        track_indices_l = [k for k in track_indices if tracks[k].time_since_update == 1 + level]\n        if len(track_indices_l) == 0:  # Nothing to match at this level\n            continue\n\n        matches_l, _, unmatched_detections = \\\n            min_cost_matching(\n                distance_metric, max_distance, tracks, detections,\n                track_indices_l, 
unmatched_detections)\n        matches += matches_l\n    unmatched_tracks = list(set(track_indices) - set(k for k, _ in matches))\n    return matches, unmatched_tracks, unmatched_detections\n\n\ndef gate_cost_matrix(kf,\n                     cost_matrix,\n                     tracks,\n                     detections,\n                     track_indices,\n                     detection_indices,\n                     gated_cost=INFTY_COST,\n                     only_position=False):\n    \"\"\"\n    Invalidate infeasible entries in cost matrix based on the state\n    distributions obtained by Kalman filtering.\n\n    Args:\n        kf (object): The Kalman filter.\n        cost_matrix (ndarray): The NxM dimensional cost matrix, where N is the\n            number of track indices and M is the number of detection indices,\n            such that entry (i, j) is the association cost between\n            `tracks[track_indices[i]]` and `detections[detection_indices[j]]`.\n        tracks (list[Track]): A list of predicted tracks at the current time\n            step.\n        detections (list[Detection]): A list of detections at the current time\n            step.\n        track_indices (List[int]): List of track indices that maps rows in\n            `cost_matrix` to tracks in `tracks`.\n        detection_indices (List[int]): List of detection indices that maps\n            columns in `cost_matrix` to detections in `detections`.\n        gated_cost (Optional[float]): Entries in the cost matrix corresponding\n            to infeasible associations are set this value. Defaults to a very\n            large value.\n        only_position (Optional[bool]): If True, only the x, y position of the\n            state distribution is considered during gating. 
Default False.\n    \"\"\"\n    gating_dim = 2 if only_position else 4\n    gating_threshold = kalman_filter.chi2inv95[gating_dim]\n    measurements = np.asarray([detections[i].to_xyah() for i in detection_indices])\n    for row, track_idx in enumerate(track_indices):\n        track = tracks[track_idx]\n        gating_distance = kf.gating_distance(track.mean, track.covariance, measurements, only_position)\n        cost_matrix[row, gating_distance > gating_threshold] = gated_cost\n    return cost_matrix\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/matching/jde_matching.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is based on https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/matching.py\r\n\"\"\"\r\n\r\ntry:\r\n    import lap\r\nexcept:\r\n    print(\r\n        'Warning: Unable to use JDE/FairMOT/ByteTrack, please install lap, for example: `pip install lap`, see https://github.com/gatagat/lap'\r\n    )\r\n    pass\r\n\r\nimport scipy\r\nimport numpy as np\r\nfrom scipy.spatial.distance import cdist\r\nfrom ..motion import kalman_filter\r\nimport warnings\r\n\r\nwarnings.filterwarnings(\"ignore\")\r\n\r\n__all__ = [\r\n    'merge_matches',\r\n    'linear_assignment',\r\n    'bbox_ious',\r\n    'iou_distance',\r\n    'embedding_distance',\r\n    'fuse_motion',\r\n]\r\n\r\n\r\ndef merge_matches(m1, m2, shape):\r\n    O, P, Q = shape\r\n    m1 = np.asarray(m1)\r\n    m2 = np.asarray(m2)\r\n\r\n    M1 = scipy.sparse.coo_matrix((np.ones(len(m1)), (m1[:, 0], m1[:, 1])), shape=(O, P))\r\n    M2 = scipy.sparse.coo_matrix((np.ones(len(m2)), (m2[:, 0], m2[:, 1])), shape=(P, Q))\r\n\r\n    mask = M1 * M2\r\n    match = mask.nonzero()\r\n    match = list(zip(match[0], match[1]))\r\n    unmatched_O = tuple(set(range(O)) - set([i for i, j in match]))\r\n    unmatched_Q = tuple(set(range(Q)) - set([j for i, j in match]))\r\n\r\n    return match, unmatched_O, unmatched_Q\r\n\r\n\r\ndef 
linear_assignment(cost_matrix, thresh):\r\n    try:\r\n        import lap\r\n    except Exception as e:\r\n        raise RuntimeError(\r\n            'Unable to use JDE/FairMOT/ByteTrack, please install lap, for example: `pip install lap`, see https://github.com/gatagat/lap'\r\n        )\r\n    if cost_matrix.size == 0:\r\n        return np.empty((0, 2), dtype=int), tuple(range(cost_matrix.shape[0])), tuple(range(cost_matrix.shape[1]))\r\n    matches, unmatched_a, unmatched_b = [], [], []\r\n    cost, x, y = lap.lapjv(cost_matrix, extend_cost=True, cost_limit=thresh)\r\n    for ix, mx in enumerate(x):\r\n        if mx >= 0:\r\n            matches.append([ix, mx])\r\n    unmatched_a = np.where(x < 0)[0]\r\n    unmatched_b = np.where(y < 0)[0]\r\n    matches = np.asarray(matches)\r\n    return matches, unmatched_a, unmatched_b\r\n\r\n\r\ndef bbox_ious(atlbrs, btlbrs):\r\n    # np.float was removed in NumPy 1.24; np.float64 keeps the original semantics\r\n    boxes = np.ascontiguousarray(atlbrs, dtype=np.float64)\r\n    query_boxes = np.ascontiguousarray(btlbrs, dtype=np.float64)\r\n    N = boxes.shape[0]\r\n    K = query_boxes.shape[0]\r\n    ious = np.zeros((N, K), dtype=boxes.dtype)\r\n    if N * K == 0:\r\n        return ious\r\n\r\n    for k in range(K):\r\n        box_area = ((query_boxes[k, 2] - query_boxes[k, 0] + 1) * (query_boxes[k, 3] - query_boxes[k, 1] + 1))\r\n        for n in range(N):\r\n            iw = (min(boxes[n, 2], query_boxes[k, 2]) - max(boxes[n, 0], query_boxes[k, 0]) + 1)\r\n            if iw > 0:\r\n                ih = (min(boxes[n, 3], query_boxes[k, 3]) - max(boxes[n, 1], query_boxes[k, 1]) + 1)\r\n                if ih > 0:\r\n                    ua = float((boxes[n, 2] - boxes[n, 0] + 1) * (boxes[n, 3] - boxes[n, 1] + 1) + box_area - iw * ih)\r\n                    ious[n, k] = iw * ih / ua\r\n    return ious\r\n\r\n\r\ndef iou_distance(atracks, btracks):\r\n    \"\"\"\r\n    Compute cost based on IoU between two list[STrack].\r\n    \"\"\"\r\n    if (len(atracks) > 0 and isinstance(atracks[0], np.ndarray)) or 
(len(btracks) > 0\r\n                                                                     and isinstance(btracks[0], np.ndarray)):\r\n        atlbrs = atracks\r\n        btlbrs = btracks\r\n    else:\r\n        atlbrs = [track.tlbr for track in atracks]\r\n        btlbrs = [track.tlbr for track in btracks]\r\n    _ious = bbox_ious(atlbrs, btlbrs)\r\n    cost_matrix = 1 - _ious\r\n\r\n    return cost_matrix\r\n\r\n\r\ndef embedding_distance(tracks, detections, metric='euclidean'):\r\n    \"\"\"\r\n    Compute cost based on features between two list[STrack].\r\n    \"\"\"\r\n    # np.float was removed in NumPy 1.24; np.float64 keeps the original semantics\r\n    cost_matrix = np.zeros((len(tracks), len(detections)), dtype=np.float64)\r\n    if cost_matrix.size == 0:\r\n        return cost_matrix\r\n    det_features = np.asarray([track.curr_feat for track in detections], dtype=np.float64)\r\n    track_features = np.asarray([track.smooth_feat for track in tracks], dtype=np.float64)\r\n    cost_matrix = np.maximum(0.0, cdist(track_features, det_features, metric))  # Normalized features\r\n    return cost_matrix\r\n\r\n\r\ndef fuse_motion(kf, cost_matrix, tracks, detections, only_position=False, lambda_=0.98):\r\n    if cost_matrix.size == 0:\r\n        return cost_matrix\r\n    gating_dim = 2 if only_position else 4\r\n    gating_threshold = kalman_filter.chi2inv95[gating_dim]\r\n    measurements = np.asarray([det.to_xyah() for det in detections])\r\n    for row, track in enumerate(tracks):\r\n        gating_distance = kf.gating_distance(track.mean, track.covariance, measurements, only_position, metric='maha')\r\n        cost_matrix[row, gating_distance > gating_threshold] = np.inf\r\n        cost_matrix[row] = lambda_ * cost_matrix[row] + (1 - lambda_) * gating_distance\r\n    return cost_matrix\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/motion/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import kalman_filter\n\nfrom .kalman_filter import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/motion/kalman_filter.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is borrow from https://github.com/nwojke/deep_sort/blob/master/deep_sort/kalman_filter.py\r\n\"\"\"\r\n\r\nimport numpy as np\r\nimport scipy.linalg\r\n\r\n__all__ = ['KalmanFilter']\r\n\"\"\"\r\nTable for the 0.95 quantile of the chi-square distribution with N degrees of\r\nfreedom (contains values for N=1, ..., 9). Taken from MATLAB/Octave's chi2inv\r\nfunction and used as Mahalanobis gating threshold.\r\n\"\"\"\r\n\r\nchi2inv95 = {1: 3.8415, 2: 5.9915, 3: 7.8147, 4: 9.4877, 5: 11.070, 6: 12.592, 7: 14.067, 8: 15.507, 9: 16.919}\r\n\r\n\r\nclass KalmanFilter(object):\r\n    \"\"\"\r\n    A simple Kalman filter for tracking bounding boxes in image space.\r\n\r\n    The 8-dimensional state space\r\n\r\n        x, y, a, h, vx, vy, va, vh\r\n\r\n    contains the bounding box center position (x, y), aspect ratio a, height h,\r\n    and their respective velocities.\r\n\r\n    Object motion follows a constant velocity model. 
The bounding box location\r\n    (x, y, a, h) is taken as direct observation of the state space (linear\r\n    observation model).\r\n\r\n    \"\"\"\r\n\r\n    def __init__(self):\r\n        ndim, dt = 4, 1.\r\n\r\n        # Create Kalman filter model matrices.\r\n        self._motion_mat = np.eye(2 * ndim, 2 * ndim)\r\n        for i in range(ndim):\r\n            self._motion_mat[i, ndim + i] = dt\r\n        self._update_mat = np.eye(ndim, 2 * ndim)\r\n\r\n        # Motion and observation uncertainty are chosen relative to the current\r\n        # state estimate. These weights control the amount of uncertainty in\r\n        # the model. This is a bit hacky.\r\n        self._std_weight_position = 1. / 20\r\n        self._std_weight_velocity = 1. / 160\r\n\r\n    def initiate(self, measurement):\r\n        \"\"\"\r\n        Create track from unassociated measurement.\r\n\r\n        Args:\r\n            measurement (ndarray): Bounding box coordinates (x, y, a, h) with\r\n                center position (x, y), aspect ratio a, and height h.\r\n\r\n        Returns:\r\n            The mean vector (8 dimensional) and covariance matrix (8x8\r\n            dimensional) of the new track. 
Unobserved velocities are\r\n            initialized to 0 mean.\r\n        \"\"\"\r\n        mean_pos = measurement\r\n        mean_vel = np.zeros_like(mean_pos)\r\n        mean = np.r_[mean_pos, mean_vel]\r\n\r\n        std = [\r\n            2 * self._std_weight_position * measurement[3], 2 * self._std_weight_position * measurement[3], 1e-2,\r\n            2 * self._std_weight_position * measurement[3], 10 * self._std_weight_velocity * measurement[3],\r\n            10 * self._std_weight_velocity * measurement[3], 1e-5, 10 * self._std_weight_velocity * measurement[3]\r\n        ]\r\n        covariance = np.diag(np.square(std))\r\n        return mean, covariance\r\n\r\n    def predict(self, mean, covariance):\r\n        \"\"\"\r\n        Run Kalman filter prediction step.\r\n\r\n        Args:\r\n            mean (ndarray): The 8 dimensional mean vector of the object state\r\n                at the previous time step.\r\n            covariance (ndarray): The 8x8 dimensional covariance matrix of the\r\n                object state at the previous time step.\r\n\r\n        Returns:\r\n            The mean vector and covariance matrix of the predicted state.\r\n            Unobserved velocities are initialized to 0 mean.\r\n        \"\"\"\r\n        std_pos = [\r\n            self._std_weight_position * mean[3], self._std_weight_position * mean[3], 1e-2,\r\n            self._std_weight_position * mean[3]\r\n        ]\r\n        std_vel = [\r\n            self._std_weight_velocity * mean[3], self._std_weight_velocity * mean[3], 1e-5,\r\n            self._std_weight_velocity * mean[3]\r\n        ]\r\n        motion_cov = np.diag(np.square(np.r_[std_pos, std_vel]))\r\n\r\n        #mean = np.dot(self._motion_mat, mean)\r\n        mean = np.dot(mean, self._motion_mat.T)\r\n        covariance = np.linalg.multi_dot((self._motion_mat, covariance, self._motion_mat.T)) + motion_cov\r\n\r\n        return mean, covariance\r\n\r\n    def project(self, mean, covariance):\r\n        
\"\"\"\r\n        Project state distribution to measurement space.\r\n\r\n        Args:\r\n            mean (ndarray): The state's mean vector (8 dimensional array).\r\n            covariance (ndarray): The state's covariance matrix (8x8 dimensional).\r\n\r\n        Returns:\r\n            The projected mean and covariance matrix of the given state estimate.\r\n        \"\"\"\r\n        std = [\r\n            self._std_weight_position * mean[3], self._std_weight_position * mean[3], 1e-1,\r\n            self._std_weight_position * mean[3]\r\n        ]\r\n        innovation_cov = np.diag(np.square(std))\r\n\r\n        mean = np.dot(self._update_mat, mean)\r\n        covariance = np.linalg.multi_dot((self._update_mat, covariance, self._update_mat.T))\r\n        return mean, covariance + innovation_cov\r\n\r\n    def multi_predict(self, mean, covariance):\r\n        \"\"\"\r\n        Run Kalman filter prediction step (Vectorized version).\r\n\r\n        Args:\r\n            mean (ndarray): The Nx8 dimensional mean matrix of the object states\r\n                at the previous time step.\r\n            covariance (ndarray): The Nx8x8 dimensional covariance matrices of the\r\n                object states at the previous time step.\r\n\r\n        Returns:\r\n            The mean vector and covariance matrix of the predicted state.\r\n            Unobserved velocities are initialized to 0 mean.\r\n        \"\"\"\r\n        std_pos = [\r\n            self._std_weight_position * mean[:, 3], self._std_weight_position * mean[:, 3],\r\n            1e-2 * np.ones_like(mean[:, 3]), self._std_weight_position * mean[:, 3]\r\n        ]\r\n        std_vel = [\r\n            self._std_weight_velocity * mean[:, 3], self._std_weight_velocity * mean[:, 3],\r\n            1e-5 * np.ones_like(mean[:, 3]), self._std_weight_velocity * mean[:, 3]\r\n        ]\r\n        sqr = np.square(np.r_[std_pos, std_vel]).T\r\n\r\n        motion_cov = []\r\n        for i in range(len(mean)):\r\n
            motion_cov.append(np.diag(sqr[i]))\r\n        motion_cov = np.asarray(motion_cov)\r\n\r\n        mean = np.dot(mean, self._motion_mat.T)\r\n        left = np.dot(self._motion_mat, covariance).transpose((1, 0, 2))\r\n        covariance = np.dot(left, self._motion_mat.T) + motion_cov\r\n\r\n        return mean, covariance\r\n\r\n    def update(self, mean, covariance, measurement):\r\n        \"\"\"\r\n        Run Kalman filter correction step.\r\n\r\n        Args:\r\n            mean (ndarray): The predicted state's mean vector (8 dimensional).\r\n            covariance (ndarray): The state's covariance matrix (8x8 dimensional).\r\n            measurement (ndarray): The 4 dimensional measurement vector\r\n                (x, y, a, h), where (x, y) is the center position, a the aspect\r\n                ratio, and h the height of the bounding box.\r\n\r\n        Returns:\r\n            The measurement-corrected state distribution.\r\n        \"\"\"\r\n        projected_mean, projected_cov = self.project(mean, covariance)\r\n\r\n        chol_factor, lower = scipy.linalg.cho_factor(projected_cov, lower=True, check_finite=False)\r\n        kalman_gain = scipy.linalg.cho_solve((chol_factor, lower),\r\n                                             np.dot(covariance, self._update_mat.T).T,\r\n                                             check_finite=False).T\r\n        innovation = measurement - projected_mean\r\n\r\n        new_mean = mean + np.dot(innovation, kalman_gain.T)\r\n        new_covariance = covariance - np.linalg.multi_dot((kalman_gain, projected_cov, kalman_gain.T))\r\n        return new_mean, new_covariance\r\n\r\n    def gating_distance(self, mean, covariance, measurements, only_position=False, metric='maha'):\r\n        \"\"\"\r\n        Compute gating distance between state distribution and measurements.\r\n        A suitable distance threshold can be obtained from `chi2inv95`. If\r\n
        `only_position` is False, the chi-square distribution has 4 degrees of\r\n        freedom, otherwise 2.\r\n\r\n        Args:\r\n            mean (ndarray): Mean vector over the state distribution (8\r\n                dimensional).\r\n            covariance (ndarray): Covariance of the state distribution (8x8\r\n                dimensional).\r\n            measurements (ndarray): An Nx4 dimensional matrix of N measurements,\r\n                each in format (x, y, a, h) where (x, y) is the bounding box center\r\n                position, a the aspect ratio, and h the height.\r\n            only_position (Optional[bool]): If True, distance computation is\r\n                done with respect to the bounding box center position only.\r\n            metric (str): Metric type, 'gaussian' or 'maha'.\r\n\r\n        Returns:\r\n            An array of length N, where the i-th element contains the squared\r\n            Mahalanobis distance between (mean, covariance) and `measurements[i]`.\r\n        \"\"\"\r\n        mean, covariance = self.project(mean, covariance)\r\n        if only_position:\r\n            mean, covariance = mean[:2], covariance[:2, :2]\r\n            measurements = measurements[:, :2]\r\n\r\n        d = measurements - mean\r\n        if metric == 'gaussian':\r\n            return np.sum(d * d, axis=1)\r\n        elif metric == 'maha':\r\n            cholesky_factor = np.linalg.cholesky(covariance)\r\n            z = scipy.linalg.solve_triangular(cholesky_factor, d.T, lower=True, check_finite=False, overwrite_b=True)\r\n            squared_maha = np.sum(z * z, axis=0)\r\n            return squared_maha\r\n        else:\r\n            raise ValueError('invalid distance metric')\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/tracker/__init__.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom . import base_jde_tracker\nfrom . import base_sde_tracker\nfrom . import jde_tracker\n\nfrom .base_jde_tracker import *\nfrom .base_sde_tracker import *\nfrom .jde_tracker import *\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/tracker/base_jde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrowed from https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/multitracker.py\n\"\"\"\n\nimport numpy as np\nfrom collections import deque, OrderedDict\nfrom ..matching import jde_matching as matching\nfrom ppdet.core.workspace import register, serializable\n\n__all__ = [\n    'TrackState',\n    'BaseTrack',\n    'STrack',\n    'joint_stracks',\n    'sub_stracks',\n    'remove_duplicate_stracks',\n]\n\n\nclass TrackState(object):\n    New = 0\n    Tracked = 1\n    Lost = 2\n    Removed = 3\n\n\nclass BaseTrack(object):\n    _count = 0\n\n    track_id = 0\n    is_activated = False\n    state = TrackState.New\n\n    history = OrderedDict()\n    features = []\n    curr_feature = None\n    score = 0\n    start_frame = 0\n    frame_id = 0\n    time_since_update = 0\n\n    # multi-camera\n    location = (np.inf, np.inf)\n\n    @property\n    def end_frame(self):\n        return self.frame_id\n\n    @staticmethod\n    def next_id():\n        BaseTrack._count += 1\n        return BaseTrack._count\n\n    def activate(self, *args):\n        raise NotImplementedError\n\n    def predict(self):\n        raise NotImplementedError\n\n    def update(self, *args, **kwargs):\n        raise NotImplementedError\n\n    def mark_lost(self):\n        self.state = TrackState.Lost\n\n    def mark_removed(self):\n
        self.state = TrackState.Removed\n\n\nclass STrack(BaseTrack):\n    def __init__(self, tlwh, score, temp_feat, buffer_size=30):\n        # wait activate\n        self._tlwh = np.asarray(tlwh, dtype=np.float32)\n        self.kalman_filter = None\n        self.mean, self.covariance = None, None\n        self.is_activated = False\n\n        self.score = score\n        self.tracklet_len = 0\n\n        self.smooth_feat = None\n        self.update_features(temp_feat)\n        self.features = deque([], maxlen=buffer_size)\n        self.alpha = 0.9\n\n    def update_features(self, feat):\n        feat /= np.linalg.norm(feat)\n        self.curr_feat = feat\n        if self.smooth_feat is None:\n            self.smooth_feat = feat\n        else:\n            self.smooth_feat = self.alpha * self.smooth_feat + (1 - self.alpha) * feat\n        self.features.append(feat)\n        self.smooth_feat /= np.linalg.norm(self.smooth_feat)\n\n    def predict(self):\n        mean_state = self.mean.copy()\n        if self.state != TrackState.Tracked:\n            mean_state[7] = 0\n        self.mean, self.covariance = self.kalman_filter.predict(mean_state, self.covariance)\n\n    @staticmethod\n    def multi_predict(stracks, kalman_filter):\n        if len(stracks) > 0:\n            multi_mean = np.asarray([st.mean.copy() for st in stracks])\n            multi_covariance = np.asarray([st.covariance for st in stracks])\n            for i, st in enumerate(stracks):\n                if st.state != TrackState.Tracked:\n                    multi_mean[i][7] = 0\n            multi_mean, multi_covariance = kalman_filter.multi_predict(multi_mean, multi_covariance)\n            for i, (mean, cov) in enumerate(zip(multi_mean, multi_covariance)):\n                stracks[i].mean = mean\n                stracks[i].covariance = cov\n\n    def activate(self, kalman_filter, frame_id):\n        \"\"\"Start a new tracklet\"\"\"\n        self.kalman_filter = kalman_filter\n        self.track_id = 
self.next_id()\n        self.mean, self.covariance = self.kalman_filter.initiate(self.tlwh_to_xyah(self._tlwh))\n\n        self.tracklet_len = 0\n        self.state = TrackState.Tracked\n        if frame_id == 1:\n            self.is_activated = True\n        self.frame_id = frame_id\n        self.start_frame = frame_id\n\n    def re_activate(self, new_track, frame_id, new_id=False):\n        self.mean, self.covariance = self.kalman_filter.update(self.mean, self.covariance,\n                                                               self.tlwh_to_xyah(new_track.tlwh))\n\n        self.update_features(new_track.curr_feat)\n        self.tracklet_len = 0\n        self.state = TrackState.Tracked\n        self.is_activated = True\n        self.frame_id = frame_id\n        if new_id:\n            self.track_id = self.next_id()\n\n    def update(self, new_track, frame_id, update_feature=True):\n        self.frame_id = frame_id\n        self.tracklet_len += 1\n\n        new_tlwh = new_track.tlwh\n        self.mean, self.covariance = self.kalman_filter.update(self.mean, self.covariance, self.tlwh_to_xyah(new_tlwh))\n        self.state = TrackState.Tracked\n        self.is_activated = True\n\n        self.score = new_track.score\n        if update_feature:\n            self.update_features(new_track.curr_feat)\n\n    @property\n    def tlwh(self):\n        \"\"\"\n        Get current position in bounding box format `(top left x, top left y,\n        width, height)`.\n        \"\"\"\n        if self.mean is None:\n            return self._tlwh.copy()\n        ret = self.mean[:4].copy()\n        ret[2] *= ret[3]\n        ret[:2] -= ret[2:] / 2\n        return ret\n\n    @property\n    def tlbr(self):\n        \"\"\"\n        Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,\n        `(top left, bottom right)`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    @staticmethod\n    def tlwh_to_xyah(tlwh):\n  
      \"\"\"\n        Convert bounding box to format `(center x, center y, aspect ratio,\n        height)`, where the aspect ratio is `width / height`.\n        \"\"\"\n        ret = np.asarray(tlwh).copy()\n        ret[:2] += ret[2:] / 2\n        ret[2] /= ret[3]\n        return ret\n\n    def to_xyah(self):\n        return self.tlwh_to_xyah(self.tlwh)\n\n    @staticmethod\n    def tlbr_to_tlwh(tlbr):\n        ret = np.asarray(tlbr).copy()\n        ret[2:] -= ret[:2]\n        return ret\n\n    @staticmethod\n    def tlwh_to_tlbr(tlwh):\n        ret = np.asarray(tlwh).copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    def __repr__(self):\n        return 'OT_{}_({}-{})'.format(self.track_id, self.start_frame, self.end_frame)\n\n\ndef joint_stracks(tlista, tlistb):\n    exists = {}\n    res = []\n    for t in tlista:\n        exists[t.track_id] = 1\n        res.append(t)\n    for t in tlistb:\n        tid = t.track_id\n        if not exists.get(tid, 0):\n            exists[tid] = 1\n            res.append(t)\n    return res\n\n\ndef sub_stracks(tlista, tlistb):\n    stracks = {}\n    for t in tlista:\n        stracks[t.track_id] = t\n    for t in tlistb:\n        tid = t.track_id\n        if stracks.get(tid, 0):\n            del stracks[tid]\n    return list(stracks.values())\n\n\ndef remove_duplicate_stracks(stracksa, stracksb):\n    pdist = matching.iou_distance(stracksa, stracksb)\n    pairs = np.where(pdist < 0.15)\n    dupa, dupb = list(), list()\n    for p, q in zip(*pairs):\n        timep = stracksa[p].frame_id - stracksa[p].start_frame\n        timeq = stracksb[q].frame_id - stracksb[q].start_frame\n        if timep > timeq:\n            dupb.append(q)\n        else:\n            dupa.append(p)\n    resa = [t for i, t in enumerate(stracksa) if i not in dupa]\n    resb = [t for i, t in enumerate(stracksb) if i not in dupb]\n    return resa, resb\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/tracker/base_sde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"\nThis code is borrowed from https://github.com/nwojke/deep_sort/blob/master/deep_sort/track.py\n\"\"\"\n\nfrom ppdet.core.workspace import register, serializable\n\n__all__ = ['TrackState', 'Track']\n\n\nclass TrackState(object):\n    \"\"\"\n    Enumeration type for the single target track state. Newly created tracks are\n    classified as `tentative` until enough evidence has been collected. Then,\n    the track state is changed to `confirmed`. 
Tracks that are no longer alive\n    are classified as `deleted` to mark them for removal from the set of active\n    tracks.\n    \"\"\"\n    Tentative = 1\n    Confirmed = 2\n    Deleted = 3\n\n\nclass Track(object):\n    \"\"\"\n    A single target track with state space `(x, y, a, h)` and associated\n    velocities, where `(x, y)` is the center of the bounding box, `a` is the\n    aspect ratio and `h` is the height.\n\n    Args:\n        mean (ndarray): Mean vector of the initial state distribution.\n        covariance (ndarray): Covariance matrix of the initial state distribution.\n        track_id (int): A unique track identifier.\n        n_init (int): Number of consecutive detections before the track is confirmed.\n            The track state is set to `Deleted` if a miss occurs within the first\n            `n_init` frames.\n        max_age (int): The maximum number of consecutive misses before the track\n            state is set to `Deleted`.\n        feature (Optional[ndarray]): Feature vector of the detection this track\n            originates from. If not None, this feature is added to the `features` cache.\n\n    Attributes:\n        hits (int): Total number of measurement updates.\n        age (int): Total number of frames since first occurrence.\n        time_since_update (int): Total number of frames since last measurement\n            update.\n        state (TrackState): The current track state.\n        features (List[ndarray]): A cache of features. 
On each measurement update,\n            the associated feature vector is added to this list.\n    \"\"\"\n\n    def __init__(self, mean, covariance, track_id, n_init, max_age, feature=None):\n        self.mean = mean\n        self.covariance = covariance\n        self.track_id = track_id\n        self.hits = 1\n        self.age = 1\n        self.time_since_update = 0\n\n        self.state = TrackState.Tentative\n        self.features = []\n        if feature is not None:\n            self.features.append(feature)\n\n        self._n_init = n_init\n        self._max_age = max_age\n\n    def to_tlwh(self):\n        \"\"\"Get position in format `(top left x, top left y, width, height)`.\"\"\"\n        ret = self.mean[:4].copy()\n        ret[2] *= ret[3]\n        ret[:2] -= ret[2:] / 2\n        return ret\n\n    def to_tlbr(self):\n        \"\"\"Get position in bounding box format `(min x, min y, max x, max y)`.\"\"\"\n        ret = self.to_tlwh()\n        ret[2:] = ret[:2] + ret[2:]\n        return ret\n\n    def predict(self, kalman_filter):\n        \"\"\"\n        Propagate the state distribution to the current time step using a Kalman\n        filter prediction step.\n        \"\"\"\n        self.mean, self.covariance = kalman_filter.predict(self.mean, self.covariance)\n        self.age += 1\n        self.time_since_update += 1\n\n    def update(self, kalman_filter, detection):\n        \"\"\"\n        Perform Kalman filter measurement update step and update the associated\n        detection feature cache.\n        \"\"\"\n        self.mean, self.covariance = kalman_filter.update(self.mean, self.covariance, detection.to_xyah())\n        self.features.append(detection.feature)\n\n        self.hits += 1\n        self.time_since_update = 0\n        if self.state == TrackState.Tentative and self.hits >= self._n_init:\n            self.state = TrackState.Confirmed\n\n    def mark_missed(self):\n        \"\"\"Mark this track as missed (no association at the current time 
step).\n        \"\"\"\n        if self.state == TrackState.Tentative:\n            self.state = TrackState.Deleted\n        elif self.time_since_update > self._max_age:\n            self.state = TrackState.Deleted\n\n    def is_tentative(self):\n        \"\"\"Returns True if this track is tentative (unconfirmed).\"\"\"\n        return self.state == TrackState.Tentative\n\n    def is_confirmed(self):\n        \"\"\"Returns True if this track is confirmed.\"\"\"\n        return self.state == TrackState.Confirmed\n\n    def is_deleted(self):\n        \"\"\"Returns True if this track is dead and should be deleted.\"\"\"\n        return self.state == TrackState.Deleted\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/tracker/jde_tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\r\n#\r\n# Licensed under the Apache License, Version 2.0 (the \"License\");\r\n# you may not use this file except in compliance with the License.\r\n# You may obtain a copy of the License at\r\n#\r\n#     http://www.apache.org/licenses/LICENSE-2.0\r\n#\r\n# Unless required by applicable law or agreed to in writing, software\r\n# distributed under the License is distributed on an \"AS IS\" BASIS,\r\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n# See the License for the specific language governing permissions and\r\n# limitations under the License.\r\n\"\"\"\r\nThis code is borrowed from https://github.com/Zhongdao/Towards-Realtime-MOT/blob/master/tracker/multitracker.py\r\n\"\"\"\r\n\r\nimport paddle\r\n\r\nfrom ..matching import jde_matching as matching\r\nfrom .base_jde_tracker import TrackState, BaseTrack, STrack\r\nfrom .base_jde_tracker import joint_stracks, sub_stracks, remove_duplicate_stracks\r\n\r\nfrom ppdet.core.workspace import register, serializable\r\nfrom ppdet.utils.logger import setup_logger\r\nlogger = setup_logger(__name__)\r\n\r\n__all__ = ['FrozenJDETracker']\r\n\r\n\r\n@register\r\n@serializable\r\nclass FrozenJDETracker(object):\r\n    __inject__ = ['motion']\r\n    \"\"\"\r\n    JDE tracker\r\n\r\n    Args:\r\n        det_thresh (float): threshold of detection score\r\n        track_buffer (int): buffer for tracker\r\n        min_box_area (int): min box area to filter out low quality boxes\r\n        vertical_ratio (float): w/h, the vertical ratio of the bbox to filter\r\n            bad results, 1.6 by default for pedestrian tracking. 
If set to -1,\r\n            no bboxes are filtered out.\r\n        tracked_thresh (float): linear assignment threshold of tracked\r\n            stracks and detections\r\n        r_tracked_thresh (float): linear assignment threshold of\r\n            tracked stracks and unmatched detections\r\n        unconfirmed_thresh (float): linear assignment threshold of\r\n            unconfirmed stracks and unmatched detections\r\n        motion (object): KalmanFilter instance\r\n        conf_thres (float): confidence threshold for tracking\r\n        metric_type (str): either \"euclidean\" or \"cosine\", the distance metric\r\n            used for measurement to track association.\r\n    \"\"\"\r\n\r\n    def __init__(self,\r\n                 det_thresh=0.3,\r\n                 track_buffer=30,\r\n                 min_box_area=200,\r\n                 vertical_ratio=1.6,\r\n                 tracked_thresh=0.7,\r\n                 r_tracked_thresh=0.5,\r\n                 unconfirmed_thresh=0.7,\r\n                 motion='KalmanFilter',\r\n                 conf_thres=0,\r\n                 metric_type='euclidean'):\r\n        self.det_thresh = det_thresh\r\n        self.track_buffer = track_buffer\r\n        self.min_box_area = min_box_area\r\n        self.vertical_ratio = vertical_ratio\r\n\r\n        self.tracked_thresh = tracked_thresh\r\n        self.r_tracked_thresh = r_tracked_thresh\r\n        self.unconfirmed_thresh = unconfirmed_thresh\r\n        self.motion = motion\r\n        self.conf_thres = conf_thres\r\n        self.metric_type = metric_type\r\n\r\n        self.frame_id = 0\r\n        self.tracked_stracks = []\r\n        self.lost_stracks = []\r\n        self.removed_stracks = []\r\n\r\n        self.max_time_lost = 0\r\n        # max_time_lost will be calculated: int(frame_rate / 30.0 * track_buffer)\r\n\r\n    def update(self, pred_dets, pred_embs):\r\n        \"\"\"\r\n        Processes the image frame and finds bounding boxes (detections).\r\n
        Associates the detections with corresponding tracklets and also handles\r\n            lost, removed, refound and active tracklets.\r\n\r\n        Args:\r\n            pred_dets (Tensor): Detection results of the image, shape is [N, 5].\r\n            pred_embs (Tensor): Embedding results of the image, shape is [N, 512].\r\n\r\n        Returns:\r\n            output_stracks (list): The list contains information regarding the\r\n                online_tracklets for the received image tensor.\r\n        \"\"\"\r\n        self.frame_id += 1\r\n        activated_starcks = []\r\n        # for storing active tracks, for the current frame\r\n        refind_stracks = []\r\n        # Lost Tracks whose detections are obtained in the current frame\r\n        lost_stracks = []\r\n        # The tracks which are not obtained in the current frame but are not\r\n        # removed. (Lost for less time than the removal threshold)\r\n        removed_stracks = []\r\n\r\n        remain_inds = paddle.nonzero(pred_dets[:, 4] > self.conf_thres)\r\n        if remain_inds.shape[0] == 0:\r\n            pred_dets = paddle.zeros([0, 1])\r\n            pred_embs = paddle.zeros([0, 1])\r\n        else:\r\n            pred_dets = paddle.gather(pred_dets, remain_inds)\r\n            pred_embs = paddle.gather(pred_embs, remain_inds)\r\n\r\n        # Filter out the image with box_num = 0. 
pred_dets = [[0.0, 0.0, 0.0 ,0.0]]\r\n        empty_pred = True if len(pred_dets) == 1 and paddle.sum(pred_dets) == 0.0 else False\r\n        \"\"\" Step 1: Network forward, get detections & embeddings\"\"\"\r\n        if len(pred_dets) > 0 and not empty_pred:\r\n            pred_dets = pred_dets.numpy()\r\n            pred_embs = pred_embs.numpy()\r\n            detections = [\r\n                STrack(STrack.tlbr_to_tlwh(tlbrs[:4]), tlbrs[4], f, 30) for (tlbrs, f) in zip(pred_dets, pred_embs)\r\n            ]\r\n        else:\r\n            detections = []\r\n        ''' Add newly detected tracklets to tracked_stracks'''\r\n        unconfirmed = []\r\n        tracked_stracks = []  # type: list[STrack]\r\n        for track in self.tracked_stracks:\r\n            if not track.is_activated:\r\n                # previous tracks which are not active in the current frame are added in unconfirmed list\r\n                unconfirmed.append(track)\r\n            else:\r\n                # Active tracks are added to the local list 'tracked_stracks'\r\n                tracked_stracks.append(track)\r\n        \"\"\" Step 2: First association, with embedding\"\"\"\r\n        # Combining currently tracked_stracks and lost_stracks\r\n        strack_pool = joint_stracks(tracked_stracks, self.lost_stracks)\r\n        # Predict the current location with KF\r\n        STrack.multi_predict(strack_pool, self.motion)\r\n\r\n        dists = matching.embedding_distance(strack_pool, detections, metric=self.metric_type)\r\n        dists = matching.fuse_motion(self.motion, dists, strack_pool, detections)\r\n        # The dists is the list of distances of the detection with the tracks in strack_pool\r\n        matches, u_track, u_detection = matching.linear_assignment(dists, thresh=self.tracked_thresh)\r\n        # The matches is the array for corresponding matches of the detection with the corresponding strack_pool\r\n\r\n        for itracked, idet in matches:\r\n            # itracked is 
the id of the track and idet is the detection\r\n            track = strack_pool[itracked]\r\n            det = detections[idet]\r\n            if track.state == TrackState.Tracked:\r\n                # If the track is active, add the detection to the track\r\n                track.update(detections[idet], self.frame_id)\r\n                activated_starcks.append(track)\r\n            else:\r\n                # We have obtained a detection from a track which is not active,\r\n                # hence put the track in refind_stracks list\r\n                track.re_activate(det, self.frame_id, new_id=False)\r\n                refind_stracks.append(track)\r\n\r\n        # None of the steps below happen if there are no undetected tracks.\r\n        \"\"\" Step 3: Second association, with IOU\"\"\"\r\n        detections = [detections[i] for i in u_detection]\r\n        # detections is now a list of the unmatched detections\r\n        r_tracked_stracks = []\r\n        # This is container for stracks which were tracked till the previous\r\n        # frame but no detection was found for it in the current frame.\r\n\r\n        for i in u_track:\r\n            if strack_pool[i].state == TrackState.Tracked:\r\n                r_tracked_stracks.append(strack_pool[i])\r\n        dists = matching.iou_distance(r_tracked_stracks, detections)\r\n        matches, u_track, u_detection = matching.linear_assignment(dists, thresh=self.r_tracked_thresh)\r\n        # matches is the list of detections which matched with corresponding\r\n        # tracks by IOU distance method.\r\n\r\n        for itracked, idet in matches:\r\n            track = r_tracked_stracks[itracked]\r\n            det = detections[idet]\r\n            if track.state == TrackState.Tracked:\r\n                track.update(det, self.frame_id)\r\n                activated_starcks.append(track)\r\n            else:\r\n                track.re_activate(det, self.frame_id, new_id=False)\r\n                
refind_stracks.append(track)\r\n        # Same process done for some unmatched detections, but now considering IOU_distance as measure\r\n\r\n        for it in u_track:\r\n            track = r_tracked_stracks[it]\r\n            if not track.state == TrackState.Lost:\r\n                track.mark_lost()\r\n                lost_stracks.append(track)\r\n        # If no detections are obtained for tracks (u_track), the tracks are added to lost_tracks list and are marked lost\r\n        '''Deal with unconfirmed tracks, usually tracks with only one beginning frame'''\r\n        detections = [detections[i] for i in u_detection]\r\n        dists = matching.iou_distance(unconfirmed, detections)\r\n        matches, u_unconfirmed, u_detection = matching.linear_assignment(dists, thresh=self.unconfirmed_thresh)\r\n        for itracked, idet in matches:\r\n            unconfirmed[itracked].update(detections[idet], self.frame_id)\r\n            activated_starcks.append(unconfirmed[itracked])\r\n\r\n        # The tracks which are yet not matched\r\n        for it in u_unconfirmed:\r\n            track = unconfirmed[it]\r\n            track.mark_removed()\r\n            removed_stracks.append(track)\r\n\r\n        # after all these confirmation steps, if a new detection is found, it is initialized for a new track\r\n        \"\"\" Step 4: Init new stracks\"\"\"\r\n        for inew in u_detection:\r\n            track = detections[inew]\r\n            if track.score < self.det_thresh:\r\n                continue\r\n            track.activate(self.motion, self.frame_id)\r\n            activated_starcks.append(track)\r\n        \"\"\" Step 5: Update state\"\"\"\r\n        # If the tracks are lost for more frames than the threshold number, the tracks are removed.\r\n        for track in self.lost_stracks:\r\n            if self.frame_id - track.end_frame > self.max_time_lost:\r\n                track.mark_removed()\r\n                removed_stracks.append(track)\r\n\r\n        # 
Update the self.tracked_stracks and self.lost_stracks using the updates in this step.\r\n        self.tracked_stracks = [t for t in self.tracked_stracks if t.state == TrackState.Tracked]\r\n        self.tracked_stracks = joint_stracks(self.tracked_stracks, activated_starcks)\r\n        self.tracked_stracks = joint_stracks(self.tracked_stracks, refind_stracks)\r\n\r\n        self.lost_stracks = sub_stracks(self.lost_stracks, self.tracked_stracks)\r\n        self.lost_stracks.extend(lost_stracks)\r\n        self.lost_stracks = sub_stracks(self.lost_stracks, self.removed_stracks)\r\n        self.removed_stracks.extend(removed_stracks)\r\n        self.tracked_stracks, self.lost_stracks = remove_duplicate_stracks(self.tracked_stracks, self.lost_stracks)\r\n        # get scores of lost tracks\r\n        output_stracks = [track for track in self.tracked_stracks if track.is_activated]\r\n\r\n        logger.debug('===========Frame {}=========='.format(self.frame_id))\r\n        logger.debug('Activated: {}'.format([track.track_id for track in activated_starcks]))\r\n        logger.debug('Refind: {}'.format([track.track_id for track in refind_stracks]))\r\n        logger.debug('Lost: {}'.format([track.track_id for track in lost_stracks]))\r\n        logger.debug('Removed: {}'.format([track.track_id for track in removed_stracks]))\r\n\r\n        return output_stracks\r\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport cv2\nimport time\nimport paddle\nimport numpy as np\n\n__all__ = [\n    'Timer',\n    'Detection',\n    'load_det_results',\n    'preprocess_reid',\n    'get_crops',\n    'clip_box',\n    'scale_coords',\n]\n\n\nclass Timer(object):\n    \"\"\"\n    This class is used to compute and print the current FPS during evaluation.\n    \"\"\"\n\n    def __init__(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n    def tic(self):\n        # using time.time instead of time.clock because time.clock\n        # does not normalize for multithreading\n        self.start_time = time.time()\n\n    def toc(self, average=True):\n        self.diff = time.time() - self.start_time\n        self.total_time += self.diff\n        self.calls += 1\n        self.average_time = self.total_time / self.calls\n        if average:\n            self.duration = self.average_time\n        else:\n            self.duration = self.diff\n        return self.duration\n\n    def clear(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n\nclass Detection(object):\n    \"\"\"\n
    This class represents a bounding box detection in a single image.\n\n    Args:\n        tlwh (ndarray): Bounding box in format `(top left x, top left y,\n            width, height)`.\n        confidence (ndarray): Detector confidence score.\n        feature (Tensor): A feature vector that describes the object\n            contained in this image.\n    \"\"\"\n\n    def __init__(self, tlwh, confidence, feature):\n        self.tlwh = np.asarray(tlwh, dtype=np.float32)\n        self.confidence = np.asarray(confidence, dtype=np.float32)\n        self.feature = feature\n\n    def to_tlbr(self):\n        \"\"\"\n        Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,\n        `(top left, bottom right)`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[2:] += ret[:2]\n        return ret\n\n    def to_xyah(self):\n        \"\"\"\n        Convert bounding box to format `(center x, center y, aspect ratio,\n        height)`, where the aspect ratio is `width / height`.\n        \"\"\"\n        ret = self.tlwh.copy()\n        ret[:2] += ret[2:] / 2\n        ret[2] /= ret[3]\n        return ret\n\n\ndef load_det_results(det_file, num_frames):\n    assert os.path.exists(det_file) and os.path.isfile(det_file), \\\n        'Error: det_file: {} does not exist or is not a file.'.format(det_file)\n    labels = np.loadtxt(det_file, dtype='float32', delimiter=',')\n    results_list = []\n    for frame_i in range(0, num_frames):\n        results = {'bbox': [], 'score': []}\n        labels_with_frame = labels[labels[:, 0] == frame_i + 1]\n        for l in labels_with_frame:\n            results['bbox'].append(l[1:5])\n            results['score'].append(l[5])\n        results_list.append(results)\n    return results_list\n\n\ndef scale_coords(coords, input_shape, im_shape, scale_factor):\n    im_shape = im_shape.numpy()[0]\n    ratio = scale_factor[0][0]\n    pad_w = (input_shape[1] - int(im_shape[1])) / 2\n    pad_h = (input_shape[0] - int(im_shape[0])) / 2\n    coords = 
paddle.cast(coords, 'float32')\n    coords[:, 0::2] -= pad_w\n    coords[:, 1::2] -= pad_h\n    coords[:, 0:4] /= ratio\n    coords[:, :4] = paddle.clip(coords[:, :4], min=0, max=coords[:, :4].max())\n    return coords.round()\n\n\ndef clip_box(xyxy, input_shape, im_shape, scale_factor):\n    im_shape = im_shape.numpy()[0]\n    ratio = scale_factor.numpy()[0][0]\n    img0_shape = [int(im_shape[0] / ratio), int(im_shape[1] / ratio)]\n\n    xyxy[:, 0::2] = paddle.clip(xyxy[:, 0::2], min=0, max=img0_shape[1])\n    xyxy[:, 1::2] = paddle.clip(xyxy[:, 1::2], min=0, max=img0_shape[0])\n    return xyxy\n\n\ndef get_crops(xyxy, ori_img, pred_scores, w, h):\n    crops = []\n    keep_scores = []\n    xyxy = xyxy.numpy().astype(np.int64)\n    ori_img = ori_img.numpy()\n    ori_img = np.squeeze(ori_img, axis=0).transpose(1, 0, 2)\n    pred_scores = pred_scores.numpy()\n    for i, bbox in enumerate(xyxy):\n        if bbox[2] <= bbox[0] or bbox[3] <= bbox[1]:\n            continue\n        crop = ori_img[bbox[0]:bbox[2], bbox[1]:bbox[3], :]\n        crops.append(crop)\n        keep_scores.append(pred_scores[i])\n    if len(crops) == 0:\n        return [], []\n    crops = preprocess_reid(crops, w, h)\n    return crops, keep_scores\n\n\ndef preprocess_reid(imgs, w=64, h=192, mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]):\n    im_batch = []\n    for img in imgs:\n        img = cv2.resize(img, (w, h))\n        img = img[:, :, ::-1].astype('float32').transpose((2, 0, 1)) / 255\n        img_mean = np.array(mean).reshape((3, 1, 1))\n        img_std = np.array(std).reshape((3, 1, 1))\n        img -= img_mean\n        img /= img_std\n        img = np.expand_dims(img, axis=0)\n        im_batch.append(img)\n    im_batch = np.concatenate(im_batch, 0)\n    return im_batch\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/modeling/mot/visualization.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport cv2\nimport numpy as np\n\n\ndef tlwhs_to_tlbrs(tlwhs):\n    tlbrs = np.copy(tlwhs)\n    if len(tlbrs) == 0:\n        return tlbrs\n    tlbrs[:, 2] += tlwhs[:, 0]\n    tlbrs[:, 3] += tlwhs[:, 1]\n    return tlbrs\n\n\ndef get_color(idx):\n    idx = idx * 3\n    color = ((37 * idx) % 255, (17 * idx) % 255, (29 * idx) % 255)\n    return color\n\n\ndef resize_image(image, max_size=800):\n    if max(image.shape[:2]) > max_size:\n        scale = float(max_size) / max(image.shape[:2])\n        image = cv2.resize(image, None, fx=scale, fy=scale)\n    return image\n\n\ndef plot_tracking(image, tlwhs, obj_ids, scores=None, frame_id=0, fps=0., ids2=None):\n    im = np.ascontiguousarray(np.copy(image))\n    im_h, im_w = im.shape[:2]\n\n    top_view = np.zeros([im_w, im_w, 3], dtype=np.uint8) + 255\n\n    text_scale = max(1, image.shape[1] / 1600.)\n    text_thickness = 2\n    line_thickness = max(1, int(image.shape[1] / 500.))\n\n    radius = max(5, int(im_w / 140.))\n    cv2.putText(\n        im,\n        'frame: %d fps: %.2f num: %d' % (frame_id, fps, len(tlwhs)), (0, int(15 * text_scale)),\n        cv2.FONT_HERSHEY_PLAIN,\n        text_scale, (0, 0, 255),\n        thickness=2)\n\n    for i, tlwh in enumerate(tlwhs):\n        x1, y1, w, h = tlwh\n        intbox = tuple(map(int, (x1, y1, x1 + w, y1 + h)))\n        obj_id = 
int(obj_ids[i])\n        id_text = '{}'.format(int(obj_id))\n        if ids2 is not None:\n            id_text = id_text + ', {}'.format(int(ids2[i]))\n        _line_thickness = 1 if obj_id <= 0 else line_thickness\n        color = get_color(abs(obj_id))\n        cv2.rectangle(im, intbox[0:2], intbox[2:4], color=color, thickness=_line_thickness)\n        cv2.putText(\n            im,\n            id_text, (intbox[0], intbox[1] + 10),\n            cv2.FONT_HERSHEY_PLAIN,\n            text_scale, (0, 0, 255),\n            thickness=text_thickness)\n\n        if scores is not None:\n            text = '{:.2f}'.format(float(scores[i]))\n            cv2.putText(\n                im,\n                text, (intbox[0], intbox[1] - 10),\n                cv2.FONT_HERSHEY_PLAIN,\n                text_scale, (0, 255, 255),\n                thickness=text_thickness)\n    return im\n\n\ndef plot_trajectory(image, tlwhs, track_ids):\n    image = image.copy()\n    for one_tlwhs, track_id in zip(tlwhs, track_ids):\n        color = get_color(int(track_id))\n        for tlwh in one_tlwhs:\n            x1, y1, w, h = tuple(map(int, tlwh))\n            cv2.circle(image, (int(x1 + 0.5 * w), int(y1 + h)), 2, color, thickness=2)\n    return image\n\n\ndef plot_detections(image, tlbrs, scores=None, color=(255, 0, 0), ids=None):\n    im = np.copy(image)\n    text_scale = max(1, image.shape[1] / 800.)\n    thickness = 2 if text_scale > 1.3 else 1\n    for i, det in enumerate(tlbrs):\n        # np.int was removed in NumPy 1.24; use the builtin int instead\n        x1, y1, x2, y2 = np.asarray(det[:4], dtype=int)\n        if len(det) >= 7:\n            label = 'det' if det[5] > 0 else 'trk'\n            if ids is not None:\n                text = '{}# {:.2f}: {:d}'.format(label, det[6], ids[i])\n            else:\n                text = '{}# {:.2f}'.format(label, det[6])\n            cv2.putText(\n                im, text, (x1, y1 + 30), cv2.FONT_HERSHEY_PLAIN, text_scale, (0, 255, 255), thickness=thickness)\n\n        if scores is not None:\n            text = '{:.2f}'.format(scores[i])\n            cv2.putText(im, text, (x1, y1 + 30), cv2.FONT_HERSHEY_PLAIN, text_scale, (0, 255, 255), thickness=thickness)\n\n        cv2.rectangle(im, (x1, y1), (x2, y2), color, 2)\n    return im\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport glob\nimport os\nimport signal\nimport sys\n\nimport cv2\nimport paddle\nfrom ppdet.core.workspace import load_config\nfrom ppdet.core.workspace import merge_config\nfrom ppdet.engine import Tracker\nfrom ppdet.utils.check import check_config\nfrom ppdet.utils.check import check_gpu\nfrom ppdet.utils.check import check_version\nfrom ppdet.utils.logger import setup_logger\n\nimport paddlehub as hub\nfrom .tracker import StreamTracker\nfrom paddlehub.module.module import moduleinfo\nfrom paddlehub.module.module import runnable\nfrom paddlehub.module.module import serving\n\nlogger = setup_logger('Predict')\n\n\n@moduleinfo(name=\"jde_darknet53\",\n            type=\"CV/multiple_object_tracking\",\n            author=\"paddlepaddle\",\n            author_email=\"\",\n            summary=\"JDE is a joint detection and appearance embedding model for multiple object tracking.\",\n            version=\"1.1.0\")\nclass JDETracker_1088x608:\n\n    def __init__(self):\n        self.pretrained_model = os.path.join(self.directory, \"jde_darknet53_30e_1088x608\")\n\n    def tracking(self, video_stream, output_dir='mot_result', visualization=True, draw_threshold=0.5, use_gpu=False):\n        '''\n        Track a video, and save the prediction results into output_dir, if visualization is set as True.\n\n        video_stream: 
the video path\n        output_dir: specify the dir to save the results\n        visualization: if True, save the results as a video, otherwise not.\n        draw_threshold: the threshold for the prediction results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        '''\n        self.video_stream = video_stream\n        self.output_dir = output_dir\n        self.visualization = visualization\n        self.draw_threshold = draw_threshold\n        self.use_gpu = use_gpu\n\n        cfg = load_config(os.path.join(self.directory, 'config', 'jde_darknet53_30e_1088x608.yml'))\n        check_config(cfg)\n\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n\n        paddle.disable_static()\n        tracker = StreamTracker(cfg, mode='test')\n\n        # load weights\n        tracker.load_weights_jde(self.pretrained_model)\n        signal.signal(signal.SIGINT, self.signalhandler)\n        # inference\n        tracker.videostream_predict(video_stream=video_stream,\n                                    output_dir=output_dir,\n                                    data_type='mot',\n                                    model_type='JDE',\n                                    visualization=visualization,\n                                    draw_threshold=draw_threshold)\n\n    def stream_mode(self, output_dir='mot_result', visualization=True, draw_threshold=0.5, use_gpu=False):\n        '''\n        Entering the stream mode enables image stream prediction. 
Users can predict the images like a stream and save the results to a video.\n\n        output_dir: specify the dir to save the results\n        visualization: if True, save the results as a video, otherwise not.\n        draw_threshold: the threshold for the prediction results\n        use_gpu: if True, use gpu to perform the computation, otherwise cpu.\n        '''\n        self.output_dir = output_dir\n        self.visualization = visualization\n        self.draw_threshold = draw_threshold\n        self.use_gpu = use_gpu\n\n        cfg = load_config(os.path.join(self.directory, 'config', 'jde_darknet53_30e_1088x608.yml'))\n        check_config(cfg)\n\n        place = 'gpu:0' if use_gpu else 'cpu'\n        place = paddle.set_device(place)\n\n        paddle.disable_static()\n        self.tracker = StreamTracker(cfg, mode='test')\n\n        # load weights\n        self.tracker.load_weights_jde(self.pretrained_model)\n        signal.signal(signal.SIGINT, self.signalhandler)\n        return self\n\n    def __enter__(self):\n        self.tracker_generator = self.tracker.imagestream_predict(self.output_dir,\n                                                                  data_type='mot',\n                                                                  model_type='JDE',\n                                                                  visualization=self.visualization,\n                                                                  draw_threshold=self.draw_threshold)\n        next(self.tracker_generator)\n\n    def __exit__(self, exc_type, exc_value, traceback):\n        seq = 'inputimages'\n        save_dir = os.path.join(self.output_dir, 'mot_outputs', seq) if self.visualization else None\n        if self.visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n          
  #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(output_video_path,\n                                           apiPreference=0,\n                                           fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                                           fps=30,\n                                           frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n\n    def predict(self, images: list = []):\n        '''\n        Predict the images. 
This method should be called in stream_mode.\n\n        images: the image list used for prediction.\n\n        Example:\n        tracker = hub.Module('jde_darknet53')\n        with tracker.stream_mode(output_dir='image_stream_output', visualization=True, draw_threshold=0.5, use_gpu=True):\n            tracker.predict([images])\n        '''\n        length = len(images)\n        if length == 0:\n            print('No images provided.')\n            return\n        for image in images:\n            self.tracker.dataset.add_image(image)\n            try:\n                results = next(self.tracker_generator)\n            except StopIteration:\n                return\n\n        return results[-length:]\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(description=\"Run the {} module.\".format(self.name),\n                                              prog='hub run {}'.format(self.name),\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        self.args = self.parser.parse_args(argvs)\n        self.tracking(\n            video_stream=self.args.video_stream,\n            output_dir=self.args.output_dir,\n            visualization=self.args.visualization,\n            draw_threshold=self.args.draw_threshold,\n            use_gpu=self.args.use_gpu,\n        )\n\n    def signalhandler(self, signum, frame):\n        seq = os.path.splitext(os.path.basename(self.video_stream))[0]\n        save_dir = os.path.join(self.output_dir, 'mot_outputs', seq) if self.visualization else None\n        if self.visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n            #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(output_video_path,\n                                           apiPreference=0,\n                                           fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                                           fps=30,\n                                           frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = 
os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n            print('Program Interrupted! Save video in {}'.format(output_video_path))\n            exit(0)\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument('--use_gpu', action='store_true', help=\"use GPU or not\")\n\n        self.arg_config_group.add_argument('--output_dir',\n                                           type=str,\n                                           default='mot_result',\n                                           help='Directory name for output tracking results.')\n        self.arg_config_group.add_argument('--visualization',\n                                           action='store_true',\n                                           help=\"whether to save output as images.\")\n        self.arg_config_group.add_argument(\"--draw_threshold\",\n                                           type=float,\n                                           default=0.5,\n                                           help=\"Threshold to reserve the result for visualization.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--video_stream',\n                                          type=str,\n                                          help=\"path to video stream, can be a video file or stream device number.\")\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/requirements.txt",
    "content": "cython\npaddledet == 2.2.0\nopencv-python\nimageio\npillow==7.1.2\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/tracker.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport cv2\nimport glob\nimport paddle\nimport numpy as np\nimport collections\n\nfrom ppdet.utils.checkpoint import load_weight, load_pretrain_weight\nfrom ppdet.metrics import Metric, MOTMetric, KITTIMOTMetric\nimport ppdet.utils.stats as stats\nfrom ppdet.engine.callbacks import Callback, ComposeCallback\nfrom ppdet.core.workspace import create\nfrom ppdet.utils.logger import setup_logger\n\nfrom .dataset import MOTVideoStream, MOTImageStream\nfrom .modeling.mot.utils import Detection, get_crops, scale_coords, clip_box\nfrom .modeling.mot import visualization as mot_vis\nfrom .utils import Timer\n\nlogger = setup_logger(__name__)\n\n\nclass StreamTracker(object):\n    def __init__(self, cfg, mode='eval'):\n        self.cfg = cfg\n        assert mode.lower() in ['test', 'eval'], \\\n                \"mode should be 'test' or 'eval'\"\n        self.mode = mode.lower()\n        self.optimizer = None\n\n        # build model\n        self.model = create(cfg.architecture)\n\n        self.status = {}\n        self.start_epoch = 0\n\n    def load_weights_jde(self, weights):\n        load_weight(self.model, weights, self.optimizer)\n\n    def _eval_seq_jde(self, dataloader, save_dir=None, show_image=False, frame_rate=30, draw_threshold=0):\n        if save_dir:\n            if not os.path.exists(save_dir): 
os.makedirs(save_dir)\n        tracker = self.model.tracker\n        tracker.max_time_lost = int(frame_rate / 30.0 * tracker.track_buffer)\n\n        timer = Timer()\n        results = []\n        frame_id = 0\n        self.status['mode'] = 'track'\n        self.model.eval()\n        for step_id, data in enumerate(dataloader):\n            #print('data', data)\n            self.status['step_id'] = step_id\n            if frame_id % 40 == 0:\n                logger.info('Processing frame {} ({:.2f} fps)'.format(frame_id, 1. / max(1e-5, timer.average_time)))\n\n            # forward\n            timer.tic()\n            pred_dets, pred_embs = self.model(data)\n            online_targets = self.model.tracker.update(pred_dets, pred_embs)\n            online_tlwhs, online_ids = [], []\n            online_scores = []\n            for t in online_targets:\n                tlwh = t.tlwh\n                tid = t.track_id\n                tscore = t.score\n                if tscore < draw_threshold: continue\n                vertical = tlwh[2] / tlwh[3] > 1.6\n                if tlwh[2] * tlwh[3] > tracker.min_box_area and not vertical:\n                    online_tlwhs.append(tlwh)\n                    online_ids.append(tid)\n                    online_scores.append(tscore)\n            timer.toc()\n\n            # save results\n            results.append((frame_id + 1, online_tlwhs, online_scores, online_ids))\n            self.save_results(data, frame_id, online_ids, online_tlwhs, online_scores, timer.average_time, show_image,\n                              save_dir)\n            frame_id += 1\n\n        return results, frame_id, timer.average_time, timer.calls\n\n    def _eval_seq_jde_single_image(self, iterator, save_dir=None, show_image=False, draw_threshold=0):\n        if save_dir:\n            if not os.path.exists(save_dir): os.makedirs(save_dir)\n        tracker = self.model.tracker\n        results = []\n        frame_id = 0\n        self.status['mode'] = 
'track'\n        self.model.eval()\n        timer = Timer()\n        while True:\n            try:\n                data = next(iterator)\n                timer.tic()\n                with paddle.no_grad():\n                    pred_dets, pred_embs = self.model(data)\n                online_targets = self.model.tracker.update(pred_dets, pred_embs)\n                online_tlwhs, online_ids = [], []\n                online_scores = []\n                for t in online_targets:\n                    tlwh = t.tlwh\n                    tid = t.track_id\n                    tscore = t.score\n                    if tscore < draw_threshold: continue\n                    vertical = tlwh[2] / tlwh[3] > 1.6\n                    if tlwh[2] * tlwh[3] > tracker.min_box_area and not vertical:\n                        online_tlwhs.append(tlwh)\n                        online_ids.append(tid)\n                        online_scores.append(tscore)\n                timer.toc()\n                # save results\n                results.append((frame_id + 1, online_tlwhs, online_scores, online_ids))\n                self.save_results(data, frame_id, online_ids, online_tlwhs, online_scores, timer.average_time,\n                                  show_image, save_dir)\n                frame_id += 1\n\n                yield results, frame_id\n\n            except StopIteration as e:\n                return\n\n    def imagestream_predict(self, output_dir, data_type='mot', model_type='JDE', visualization=True,\n                            draw_threshold=0.5):\n        if not os.path.exists(output_dir): os.makedirs(output_dir)\n        result_root = os.path.join(output_dir, 'mot_results')\n        if not os.path.exists(result_root): os.makedirs(result_root)\n        assert data_type in ['mot', 'kitti'], \\\n            \"data_type should be 'mot' or 'kitti'\"\n        assert model_type in ['JDE', 'FairMOT'], \\\n            \"model_type should be 'JDE', or 'FairMOT'\"\n        seq = 'inputimages'\n 
       self.dataset = MOTImageStream(keep_ori_im=True)\n\n        save_dir = os.path.join(output_dir, 'mot_outputs', seq) if visualization else None\n\n        self.dataloader = create('MOTVideoStreamReader')(self.dataset, 0)\n        self.dataloader_iter = iter(self.dataloader)\n        result_filename = os.path.join(result_root, '{}.txt'.format(seq))\n\n        if model_type in ['JDE', 'FairMOT']:\n            generator = self._eval_seq_jde_single_image(\n                self.dataloader_iter, save_dir=save_dir, draw_threshold=draw_threshold)\n        else:\n            raise ValueError(model_type)\n        yield\n        results = []\n        while True:\n            try:\n                results, nf = next(generator)\n                yield results\n            except StopIteration as e:\n                self.write_mot_results(result_filename, results, data_type)\n                return\n\n    def videostream_predict(self,\n                            video_stream,\n                            output_dir,\n                            data_type='mot',\n                            model_type='JDE',\n                            visualization=True,\n                            draw_threshold=0.5):\n        assert video_stream is not None, \\\n            \"--video_stream should be set.\"\n\n        if not os.path.exists(output_dir): os.makedirs(output_dir)\n        result_root = os.path.join(output_dir, 'mot_results')\n        if not os.path.exists(result_root): os.makedirs(result_root)\n        assert data_type in ['mot', 'kitti'], \\\n            \"data_type should be 'mot' or 'kitti'\"\n        assert model_type in ['JDE', 'FairMOT'], \\\n            \"model_type should be 'JDE', or 'FairMOT'\"\n        seq = os.path.splitext(os.path.basename(video_stream))[0]\n        self.dataset = MOTVideoStream(video_stream, keep_ori_im=True)\n\n        save_dir = os.path.join(output_dir, 'mot_outputs', seq) if visualization else None\n\n        dataloader = 
create('MOTVideoStreamReader')(self.dataset, 0)\n        result_filename = os.path.join(result_root, '{}.txt'.format(seq))\n\n        with paddle.no_grad():\n            if model_type in ['JDE', 'FairMOT']:\n                results, nf, ta, tc = self._eval_seq_jde(dataloader, save_dir=save_dir, draw_threshold=draw_threshold)\n            else:\n                raise ValueError(model_type)\n\n        self.write_mot_results(result_filename, results, data_type)\n\n        if visualization:\n            #### Save using ffmpeg\n            #output_video_path = os.path.join(save_dir, '..', '{}_vis.mp4'.format(seq))\n            #cmd_str = 'ffmpeg -f image2 -i {}/%05d.jpg -vf \"scale=trunc(iw/2)*2:trunc(ih/2)*2\" {}'.format(\n            #    save_dir, output_video_path)\n            #os.system(cmd_str)\n            #### Save using opencv\n            output_video_path = os.path.join(save_dir, '..', '{}_vis.avi'.format(seq))\n            imgnames = glob.glob(os.path.join(save_dir, '*.jpg'))\n            if len(imgnames) == 0:\n                logger.info('No output images to save for video')\n                return\n            img = cv2.imread(os.path.join(save_dir, '00000.jpg'))\n            video_writer = cv2.VideoWriter(\n                output_video_path,\n                apiPreference=0,\n                fourcc=cv2.VideoWriter_fourcc('M', 'J', 'P', 'G'),\n                fps=30,\n                frameSize=(img.shape[1], img.shape[0]))\n            for i in range(len(imgnames)):\n                imgpath = os.path.join(save_dir, '{:05d}.jpg'.format(i))\n                img = cv2.imread(imgpath)\n                video_writer.write(img)\n            video_writer.release()\n            logger.info('Save video in {}'.format(output_video_path))\n\n    def write_mot_results(self, filename, results, data_type='mot'):\n        if data_type in ['mot', 'mcmot', 'lab']:\n            save_format = '{frame},{id},{x1},{y1},{w},{h},{score},-1,-1,-1\\n'\n        elif data_type == 
'kitti':\n            save_format = '{frame} {id} car 0 0 -10 {x1} {y1} {x2} {y2} -10 -10 -10 -1000 -1000 -1000 -10\\n'\n        else:\n            raise ValueError(data_type)\n\n        with open(filename, 'w') as f:\n            for frame_id, tlwhs, tscores, track_ids in results:\n                if data_type == 'kitti':\n                    frame_id -= 1\n                for tlwh, score, track_id in zip(tlwhs, tscores, track_ids):\n                    if track_id < 0:\n                        continue\n                    x1, y1, w, h = tlwh\n                    x2, y2 = x1 + w, y1 + h\n                    line = save_format.format(\n                        frame=frame_id, id=track_id, x1=x1, y1=y1, x2=x2, y2=y2, w=w, h=h, score=score)\n                    f.write(line)\n        logger.info('MOT results saved to {}'.format(filename))\n\n    def save_results(self, data, frame_id, online_ids, online_tlwhs, online_scores, average_time, show_image, save_dir):\n        if show_image or save_dir is not None:\n            assert 'ori_image' in data\n            img0 = data['ori_image'].numpy()[0]\n            online_im = mot_vis.plot_tracking(\n                img0, online_tlwhs, online_ids, online_scores, frame_id=frame_id, fps=1. / average_time)\n        if show_image:\n            cv2.imshow('online_im', online_im)\n        if save_dir is not None:\n            cv2.imwrite(os.path.join(save_dir, '{:05d}.jpg'.format(frame_id)), online_im)\n"
  },
  {
    "path": "modules/video/multiple_object_tracking/jde_darknet53/utils.py",
    "content": "import time\n\n\nclass Timer(object):\n    \"\"\"\n    This class used to compute and print the current FPS while evaling.\n    \"\"\"\n\n    def __init__(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n\n    def tic(self):\n        # using time.time instead of time.clock because time time.clock\n        # does not normalize for multithreading\n        self.start_time = time.time()\n\n    def toc(self, average=True):\n        self.diff = time.time() - self.start_time\n        self.total_time += self.diff\n        self.calls += 1\n        self.average_time = self.total_time / self.calls\n        if average:\n            self.duration = self.average_time\n        else:\n            self.duration = self.diff\n        return self.duration\n\n    def clear(self):\n        self.total_time = 0.\n        self.calls = 0\n        self.start_time = 0.\n        self.diff = 0.\n        self.average_time = 0.\n        self.duration = 0.\n"
  },
  {
    "path": "paddlehub/__init__.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n__version__ = 'develop'\n\nimport paddle\nfrom packaging.version import Version\n_paddle_version = Version(paddle.__version__)\nif _paddle_version < Version('2.0.0') and _paddle_version != Version('0.0.0'):\n    raise RuntimeError(\n        'Version mismatch in PaddleHub and PaddlePaddle, you need to upgrade PaddlePaddle to version 2.0.0 or above.')\n\nimport sys\n\nfrom easydict import EasyDict\n\nfrom paddlehub import env\nfrom paddlehub.config import config\nfrom paddlehub import datasets\nfrom paddlehub.finetune import Trainer\nfrom paddlehub.utils import log, parser, utils\nfrom paddlehub.utils import download as _download\nfrom paddlehub.utils.paddlex import download, ResourceNotFoundError\nfrom paddlehub.utils.platform import is_windows\nfrom paddlehub.server import server_check\nfrom paddlehub.server.server_source import ServerConnectionError\nfrom paddlehub.module import Module\nfrom paddlehub.text.bert_tokenizer import BertTokenizer, ErnieTinyTokenizer\nfrom paddlehub.text.tokenizer import CustomTokenizer\nfrom paddlehub.text.utils import is_chinese_char\n\n# In order to maintain the compatibility of the old version, we put the relevant\n# compatible code in the paddlehub.compat package, and mapped some modules referenced\n# in the old version\nfrom paddlehub.compat import paddle_utils\nfrom 
paddlehub.compat.module.processor import BaseProcessor\nfrom paddlehub.compat.module.nlp_module import NLPPredictionModule, TransformerModule\nfrom paddlehub.compat.type import DataType\nfrom paddlehub.compat import task\nfrom paddlehub.compat.datasets import couplet\nfrom paddlehub.compat.task.config import RunConfig\nfrom paddlehub.compat.task.text_generation_task import TextGenerationTask\n\nsys.modules['paddlehub.io.parser'] = parser\nsys.modules['paddlehub.common.dir'] = env\nsys.modules['paddlehub.common.downloader'] = _download\nsys.modules['paddlehub.common.logger'] = log\nsys.modules['paddlehub.common.paddle_helper'] = paddle_utils\nsys.modules['paddlehub.common.utils'] = utils\nsys.modules['paddlehub.reader'] = task\nsys.modules['paddlehub.reader.batching'] = task.batch\n\nAdamWeightDecayStrategy = lambda: 0\nULMFiTStrategy = lambda params_layer=0: 0\ncommon = EasyDict(paddle_helper=paddle_utils)\ndataset = EasyDict(Couplet=couplet.Couplet)\nfinetune = EasyDict(strategy=EasyDict(ULMFiTStrategy=ULMFiTStrategy))\nlogger = EasyDict(logger=log.logger)\n\n\n# Alias for paddle.hub.*\ndef load(*args, **kwargs):\n    if _paddle_version < Version('2.1.0') and _paddle_version != Version('0.0.0'):\n        raise RuntimeError(\n            '`hub.load` is only available in PaddlePaddle 2.1 and above, please upgrade the PaddlePaddle version.')\n\n    from paddle.hub import load as phload\n    from paddlehub.server.server import CacheUpdater\n\n    CacheUpdater(\"paddle.hub.load\").start()\n    return phload(*args, **kwargs)\n\n\ndef list(*args, **kwargs):\n    if _paddle_version < Version('2.1.0') and _paddle_version != Version('0.0.0'):\n        raise RuntimeError(\n            '`hub.list` is only available in PaddlePaddle 2.1 and above, please upgrade the PaddlePaddle version.')\n\n    from paddle.hub import list as phlist\n    from paddlehub.server.server import CacheUpdater\n\n    CacheUpdater(\"paddle.hub.list\").start()\n    return phlist(*args, 
**kwargs)\n\n\ndef help(*args, **kwargs):\n    if _paddle_version < Version('2.1.0') and _paddle_version != Version('0.0.0'):\n        raise RuntimeError(\n            '`hub.help` is only available in PaddlePaddle 2.1 and above, please upgrade the PaddlePaddle version.')\n\n    from paddle.hub import help as phhelp\n    from paddlehub.server.server import CacheUpdater\n\n    CacheUpdater(\"paddle.hub.help\").start()\n    return phhelp(*args, **kwargs)\n\n\nif is_windows():\n    for char in env.HUB_HOME:\n        if is_chinese_char(char):\n            log.logger.warning(\n                'The home directory contains Chinese characters which may cause unknown exceptions in the execution \\\n                    of some modules. Please set another path through the set HUB_HOME command.')\n            break\n"
  },
  {
    "path": "paddlehub/commands/__init__.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlehub.commands.utils import register, get_command, execute, _commands\n\nimport paddlehub.commands.clear\nimport paddlehub.commands.config\nimport paddlehub.commands.convert\nimport paddlehub.commands.download\nimport paddlehub.commands.help\nimport paddlehub.commands.hub\nimport paddlehub.commands.install\nimport paddlehub.commands.list\nimport paddlehub.commands.run\nimport paddlehub.commands.search\nimport paddlehub.commands.serving\nimport paddlehub.commands.show\nimport paddlehub.commands.uninstall\nimport paddlehub.commands.version\n"
  },
  {
    "path": "paddlehub/commands/clear.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport shutil\nfrom typing import List\n\nimport paddlehub.env as hubenv\nfrom paddlehub.commands import register\n\n\ndef file_size_in_human_format(size: int) -> str:\n    size = float(size)\n    if size < 1024:\n        return \"%.1fB\" % size\n    elif size < 1024 * 1024:\n        return \"%.1fK\" % (size / 1024)\n    elif size < 1024 * 1024 * 1024:\n        return \"%.1fM\" % (size / (1024 * 1024))\n    else:\n        return \"%.1fG\" % (size / (1024 * 1024 * 1024))\n\n\n@register(name='hub.clear', description='Clear all cached data.')\nclass ClearCommand:\n    def execute(self, argv: List) -> bool:\n        total_file_size = 0.0\n        total_file_cnt = 0\n\n        for root, dirs, files in os.walk(hubenv.CACHE_HOME):\n            total_file_cnt += len(files)\n            total_file_cnt += len(dirs)\n            for file in files:\n                realpath = os.path.join(hubenv.CACHE_HOME, root, file)\n                total_file_size += os.path.getsize(realpath)\n\n            for dir in dirs:\n                realdir = os.path.join(hubenv.CACHE_HOME, root, dir)\n                total_file_size += os.path.getsize(realdir)\n\n        shutil.rmtree(hubenv.CACHE_HOME)\n\n        if total_file_cnt == 0:\n            print('No cache to release.')\n        else:\n            print('Clear {} cached 
files.'.format(total_file_cnt))\n            print('Free disk space {}.'.format(file_size_in_human_format(total_file_size)))\n\n        return True\n"
  },
  {
    "path": "paddlehub/commands/config.py",
    "content": "#coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport ast\n\nimport paddlehub.config as hubconf\nfrom paddlehub.env import CONF_HOME\nfrom paddlehub.commands import register\n\n\n@register(name='hub.config', description='Configure PaddleHub.')\nclass ConfigCommand:\n    @staticmethod\n    def show_config():\n        print(\"The current configuration is shown below.\")\n        print(hubconf)\n\n    @staticmethod\n    def show_help():\n        str = \"config <option>\\n\"\n        str += \"\\tShow PaddleHub config without any option.\\n\"\n        str += \"option:\\n\"\n        str += \"reset\\n\"\n        str += \"\\tReset config as default.\\n\\n\"\n        str += \"server==[URL]\\n\"\n        str += \"\\tSet PaddleHub Server url as [URL].\\n\\n\"\n        str += \"log.level==[LEVEL]\\n\"\n        str += \"\\tSet log level.\\n\\n\"\n        str += \"log.enable==True|False\\n\"\n        str += \"\\tEnable or disable logger in PaddleHub.\\n\"\n        print(str)\n\n    def execute(self, argv):\n        if not argv:\n            ConfigCommand.show_config()\n        for arg in argv:\n            if arg == \"reset\":\n                hubconf.reset()\n                print(hubconf)\n            elif arg.startswith(\"server==\"):\n                hubconf.server = arg.split(\"==\")[1]\n            elif arg.startswith(\"log.level==\"):\n                
hubconf.log_level = arg.split(\"==\")[1]\n            elif arg.startswith(\"log.enable==\"):\n                hubconf.log_enable = ast.literal_eval(arg.split(\"==\")[1])\n            else:\n                ConfigCommand.show_help()\n        return True\n"
  },
  {
    "path": "paddlehub/commands/convert.py",
    "content": "#coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport os\nimport time\nimport tarfile\nimport shutil\nfrom string import Template\n\nfrom paddlehub.env import TMP_HOME as tmp_dir\nfrom paddlehub.commands import register\nfrom paddlehub.utils.xarfile import XarFile\nfrom paddlehub.server.server import CacheUpdater\n\nINIT_FILE = '__init__.py'\nMODULE_FILE = 'module.py'\nSERVING_FILE = 'serving_client_demo.py'\nTMPL_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'tmpl')\n\n\n@register(name='hub.convert', description='Convert model to PaddleHub-Module.')\nclass ConvertCommand:\n    def __init__(self):\n        super(ConvertCommand, self).__init__()\n        self.parser = argparse.ArgumentParser()\n        self.parser.add_argument('command')\n        self.parser.add_argument('--module_name', '-n')\n        self.parser.add_argument('--module_version', '-v', nargs='?', default='1.0.0')\n        self.parser.add_argument('--model_dir', '-d')\n        self.parser.add_argument('--output_dir', '-o')\n\n    def create_module_tar(self):\n        if not os.path.exists(self.dest):\n            os.makedirs(self.dest)\n        tar_file = os.path.join(self.dest, '{}.tar.gz'.format(self.module))\n        tfp = XarFile(tar_file, 'w', 'tar.gz')\n        tfp.add(self.dest, self.module, False)\n        for root, dir, files in os.walk(self.src):\n           
 for file in files:\n                fullpath = os.path.join(root, file)\n                arcname = os.path.join(self.module, 'assets', file)\n                tfp.add(fullpath, arcname=arcname)\n\n        tfp.add(name=self.model_file, arcname=os.path.join(self.module, MODULE_FILE))\n        tfp.add(name=self.serving_file, arcname=os.path.join(self.module, SERVING_FILE))\n        tfp.add(name=self.init_file, arcname=os.path.join(self.module, INIT_FILE))\n\n    def create_module_py(self):\n        template_file = open(os.path.join(TMPL_DIR, 'x_model.tmpl'), 'r', encoding='utf-8')\n        tmpl = Template(template_file.read())\n        lines = []\n\n        lines.append(\n            tmpl.substitute(\n                NAME=\"'{}'\".format(self.module),\n                TYPE=\"'CV'\",\n                AUTHOR=\"'Baidu'\",\n                SUMMARY=\"''\",\n                VERSION=\"'{}'\".format(self.version),\n                EMAIL=\"''\"))\n        self.model_file = os.path.join(self._tmp_dir, MODULE_FILE)\n\n        with open(self.model_file, 'w', encoding='utf-8') as fp:\n            fp.writelines(lines)\n\n    def create_init_py(self):\n        self.init_file = os.path.join(self._tmp_dir, INIT_FILE)\n        if os.path.exists(self.init_file):\n            return\n        shutil.copyfile(os.path.join(TMPL_DIR, 'init_py.tmpl'), self.init_file)\n\n    def create_serving_demo_py(self):\n        template_file = open(os.path.join(TMPL_DIR, 'serving_demo.tmpl'), 'r', encoding='utf-8')\n        tmpl = Template(template_file.read())\n        lines = []\n\n        lines.append(tmpl.substitute(MODULE_NAME=self.module))\n        self.serving_file = os.path.join(self._tmp_dir, SERVING_FILE)\n        with open(self.serving_file, 'w', encoding='utf-8') as fp:\n            fp.writelines(lines)\n\n    @staticmethod\n    def show_help():\n        str = \"convert --module <module> [--version <version>] --dest dest_dir --src srd_dir\\n\"\n        str += \"\\tConvert model to 
PaddleHub-Module.\n\"\n        str += \"--model_dir\\n\"\n        str += \"\\tDir of model you want to export.\\n\"\n        str += \"--module_name\\n\"\n        str += \"\\tSet name of module.\\n\"\n        str += \"--module_version\\n\"\n        str += \"\\tSet version of module, default is `1.0.0`.\\n\"\n        str += \"--output_dir\\n\"\n        str += \"\\tDir to save PaddleHub-Module after exporting, default is `.`.\\n\"\n        print(str)\n        return\n\n    def execute(self, argv):\n        args = self.parser.parse_args()\n\n        if not args.module_name or not args.model_dir:\n            ConvertCommand.show_help()\n            return False\n        self.module = args.module_name\n        self.version = args.module_version if args.module_version is not None else '1.0.0'\n        CacheUpdater(\"hub_convert\", module=self.module, version=self.version).start()\n        self.src = args.model_dir\n        if not os.path.isdir(self.src):\n            print('`{}` does not exist or is not a directory'.format(self.src))\n            return False\n        self.dest = args.output_dir if args.output_dir is not None else '{}_{}'.format(self.module, str(time.time()))\n        os.makedirs(self.dest, exist_ok=True)\n\n        self._tmp_dir = tmp_dir\n        self.create_module_py()\n        self.create_init_py()\n        self.create_serving_demo_py()\n        self.create_module_tar()\n\n        print('The converted module is stored in `{}`.'.format(self.dest))\n\n        return True\n"
  },
  {
    "path": "paddlehub/commands/download.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\nimport paddlehub as hub\nfrom paddlehub.commands import register\nfrom paddlehub.server import module_server\nfrom paddlehub.utils import utils, log\nfrom paddlehub.server.server import CacheUpdater\n\n\n@register(name='hub.download', description='Download PaddlePaddle pretrained module files.')\nclass DownloadCommand:\n    def execute(self, argv: List) -> bool:\n        if not argv:\n            print(\"ERROR: You must give at least one module to download.\")\n            return False\n\n        for _arg in argv:\n            result = module_server.search_module(_arg)\n            CacheUpdater(\"hub_download\", _arg).start()\n            if result:\n                url = result[0]['url']\n                with log.ProgressBar('Download {}'.format(url)) as bar:\n                    for file, ds, ts in utils.download_with_progress(url):\n                        bar.update(float(ds) / ts)\n            else:\n                print('ERROR: Could not find a HubModule named {}'.format(_arg))\n        return True\n"
  },
  {
    "path": "paddlehub/commands/help.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\nimport paddlehub as hub\nfrom paddlehub.commands import register, _commands\n\n\n@register(name='hub.help', description='Show help for commands.')\nclass HelpCommand:\n    def execute(self, argv: List) -> bool:\n        msg = 'Usage:\\n'\n        msg += '    hub <command> <options>\\n\\n'\n        msg += 'Commands:\\n'\n        for command, detail in _commands['hub'].items():\n            if command.startswith('_'):\n                continue\n\n            if not '_description' in detail:\n                continue\n            msg += '    {:<15}        {}\\n'.format(command, detail['_description'])\n\n        print(msg)\n        return True\n"
  },
  {
    "path": "paddlehub/commands/hub.py",
    "content": "#coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport os\n\nfrom paddlehub.commands import register, get_command\n\n\n@register(name='hub')\nclass HubCommand:\n    def execute(self, argv):\n        help = get_command('hub.help')\n        return help().execute(argv)\n"
  },
  {
    "path": "paddlehub/commands/install.py",
    "content": "#coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport os\nfrom typing import List\n\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\nfrom paddlehub.utils import xarfile\nfrom paddlehub.server.server import CacheUpdater\n\n\n@register(name='hub.install', description='Install PaddleHub module.')\nclass InstallCommand:\n    def __init__(self):\n        self.parser = argparse.ArgumentParser(prog='hub install', add_help=True)\n        self.parser.add_argument(\n            '--ignore_env_mismatch',\n            action='store_true',\n            help='Whether to ignore the environment mismatch when installing the Module.')\n\n    def execute(self, argv: List) -> bool:\n        if not argv:\n            print(\"ERROR: You must give at least one module to install.\")\n            return False\n\n        options = [arg for arg in argv if arg.startswith('-')]\n        argv = [arg for arg in argv if not arg.startswith('-')]\n        args = self.parser.parse_args(options)\n\n        manager = LocalModuleManager()\n        for _arg in argv:\n            if os.path.exists(_arg) and os.path.isdir(_arg):\n                manager.install(directory=_arg)\n            elif os.path.exists(_arg) and xarfile.is_xarfile(_arg):\n                manager.install(archive=_arg)\n            else:\n                _arg = 
_arg.split('==')\n                name = _arg[0]\n                version = None if len(_arg) == 1 else _arg[1]\n                CacheUpdater(\"hub_install\", name, version).start()\n                manager.install(name=name, version=version, ignore_env_mismatch=args.ignore_env_mismatch)\n        return True\n"
  },
  {
    "path": "paddlehub/commands/list.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\nimport paddlehub as hub\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\nfrom paddlehub.utils import log, platform\n\n\n@register(name='hub.list', description='Show help for commands.')\nclass ListCommand:\n    def execute(self, argv: List) -> bool:\n        manager = LocalModuleManager()\n\n        widths = [20, 40] if platform.is_windows() else [25, 50]\n        aligns = ['^', '<']\n        table = log.Table(widths=widths, aligns=aligns)\n\n        table.append('ModuleName', 'Path', colors=['green', 'green'])\n\n        for module in manager.list():\n            table.append(module.name, module.directory)\n\n        print(table)\n        return True\n"
  },
  {
    "path": "paddlehub/commands/run.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport ast\nimport os\nfrom typing import Any, List\n\nfrom paddlehub.compat.module.module_v1 import ModuleV1\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\nfrom paddlehub.module.module import Module, InvalidHubModule\nfrom paddlehub.server.server import CacheUpdater\n\n\n@register(name='hub.run', description='Run the specific module.')\nclass RunCommand:\n    def execute(self, argv: List) -> bool:\n        if not argv:\n            print('ERROR: You must give one module to run.')\n            return False\n        module_name = argv[0]\n        CacheUpdater(\"hub_run\", module_name).start()\n\n        if os.path.exists(module_name) and os.path.isdir(module_name):\n            try:\n                module = Module.load(module_name)\n            except InvalidHubModule:\n                print('{} is not a valid HubModule'.format(module_name))\n                return False\n            except:\n                print('Some exception occurred while loading the {}'.format(module_name))\n                return False\n        else:\n            module = Module(name=module_name)\n\n        if not module.is_runnable:\n            print('ERROR! 
Module {} is not executable.'.format(module_name))\n            return False\n\n        if isinstance(module, ModuleV1):\n            result = self.run_module_v1(module, argv[1:])\n        else:\n            result = module._run_func(argv[1:])\n\n        print(result)\n        return True\n\n    def run_module_v1(self, module, argv: List) -> Any:\n        parser = argparse.ArgumentParser(prog='hub run {}'.format(module.name), add_help=False)\n\n        arg_input_group = parser.add_argument_group(title='Input options', description='Data fed into the module.')\n        arg_config_group = parser.add_argument_group(\n            title='Config options', description='Run configuration for controlling module behavior, optional.')\n\n        arg_config_group.add_argument(\n            '--use_gpu', type=ast.literal_eval, default=False, help='whether to use GPU for prediction')\n        arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size for prediction')\n\n        module_type = module.type.lower()\n        if module_type.startswith('cv'):\n            arg_input_group.add_argument(\n                '--input_path', type=str, default=None, help='path of image/video to predict', required=True)\n        else:\n            arg_input_group.add_argument('--input_text', type=str, default=None, help='text to predict', required=True)\n\n        args = parser.parse_args(argv)\n\n        expected_data_format = module.processor.data_format(module.default_signature)\n        key = list(expected_data_format.keys())[0]\n        input_data = {key: [args.input_path] if module_type.startswith('cv') else [args.input_text]}\n\n        return module(\n            sign_name=module.default_signature, data=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n"
  },
  {
    "path": "paddlehub/commands/search.py",
    "content": "#coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport os\nfrom typing import List\n\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\nfrom paddlehub.server.server import module_server\nfrom paddlehub.utils import log, platform\nfrom paddlehub.server.server import CacheUpdater\n\n\n@register(name='hub.search', description='Search PaddleHub pretrained model through model keywords.')\nclass SearchCommand:\n    def execute(self, argv: List) -> bool:\n        argv = '.*' if not argv else argv[0]\n\n        widths = [20, 8, 30] if platform.is_windows() else [30, 8, 40]\n        table = log.Table(widths=widths)\n        table.append(*['ModuleName', 'Version', 'Summary'], aligns=['^', '^', '^'], colors=[\"blue\", \"blue\", \"blue\"])\n        CacheUpdater(\"hub_search\", argv).start()\n        results = module_server.search_module(name=argv)\n\n        for result in results:\n            if 'Module' == result['type']:\n                table.append(result['name'], result['version'], result['summary'])\n\n        print(table)\n        return True\n"
  },
  {
    "path": "paddlehub/commands/serving.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport argparse\nimport os\nimport platform\nimport json\nimport multiprocessing\nimport time\nimport signal\n\nimport paddlehub as hub\nfrom paddlehub.commands import register\nfrom paddlehub.serving import app_compat as app\nfrom paddlehub.env import CONF_HOME\nfrom paddlehub.serving.http_server import run_all, StandaloneApplication\nfrom paddlehub.utils import log\nfrom paddlehub.utils.utils import is_port_occupied\nfrom paddlehub.server.server import CacheUpdater\n\n\ndef number_of_workers():\n    '''\n    Get suitable quantity of workers based on empirical formula.\n    '''\n    return (multiprocessing.cpu_count() * 2) + 1\n\n\ndef pid_is_exist(pid: int):\n    '''\n    Try to kill process by PID.\n\n    Args:\n        pid(int): PID of process to be killed.\n\n    Returns:\n         True if PID will be killed.\n\n    Examples:\n    .. 
code-block:: python\n\n        pid_is_exist(pid=8866)\n    '''\n    try:\n        os.kill(pid, 0)\n    except:\n        return False\n    else:\n        return True\n\n\n@register(name='hub.serving', description='Start Module Serving or Bert Service for online predicting.')\nclass ServingCommand:\n    name = \"serving\"\n    module_list = []\n\n    def dump_pid_file(self):\n        '''\n        Write PID info to file.\n        '''\n        pid = os.getpid()\n        filepath = os.path.join(CONF_HOME, \"serving_\" + str(self.args.port) + \".json\")\n        if os.path.exists(filepath):\n            os.remove(filepath)\n        with open(filepath, \"w\") as fp:\n            info = {\"pid\": pid, \"module\": self.args.modules, \"start_time\": time.time()}\n            json.dump(info, fp)\n\n    @staticmethod\n    def load_pid_file(filepath: str, port: int = None):\n        '''\n        Read PID info from file.\n        '''\n        if port is None:\n            port = os.path.basename(filepath).split(\".\")[0].split(\"_\")[1]\n        if not os.path.exists(filepath):\n            log.logger.error(\n                \"PaddleHub Serving config file is not exists, please confirm the port [%s] you specified is correct.\" %\n                port)\n            return False\n        with open(filepath, \"r\") as fp:\n            info = json.load(fp)\n            return info\n\n    def stop_serving(self, port: int):\n        '''\n        Stop PaddleHub-Serving by port.\n        '''\n        filepath = os.path.join(CONF_HOME, \"serving_\" + str(port) + \".json\")\n        info = self.load_pid_file(filepath, port)\n        if info is False:\n            return\n        pid = info[\"pid\"]\n        module = info[\"module\"]\n        start_time = info[\"start_time\"]\n        CacheUpdater(\"hub_serving_stop\", module=module, addition={\"period_time\": time.time() - start_time}).start()\n        if os.path.exists(filepath):\n            os.remove(filepath)\n\n        if not 
pid_is_exist(pid):\n            log.logger.info(\"PaddleHub Serving has been stopped.\")\n            return\n        log.logger.info(\"PaddleHub Serving will stop.\")\n        if platform.system() == \"Windows\":\n            os.kill(pid, signal.SIGTERM)\n        else:\n            try:\n                os.killpg(pid, signal.SIGTERM)\n            except ProcessLookupError:\n                os.kill(pid, signal.SIGTERM)\n\n    @staticmethod\n    def start_bert_serving(args):\n        '''\n        Start bert serving server.\n        '''\n        if platform.system() != \"Linux\":\n            log.logger.error(\"Error. Bert Service only supports Linux.\")\n            return False\n\n        if is_port_occupied(\"127.0.0.1\", args.port) is True:\n            log.logger.error(\"Port %s is occupied, please change it.\" % args.port)\n            return False\n\n        from paddle_gpu_serving.run import BertServer\n        bs = BertServer(with_gpu=args.use_gpu)\n        bs.with_model(model_name=args.modules[0])\n        CacheUpdater(\"hub_bert_service\", module=args.modules[0], version=\"0.0.0\").start()\n        bs.run(gpu_index=args.gpu, port=int(args.port))\n\n    def preinstall_modules(self):\n        '''\n        Install modules via PaddleHub and collect their serving info.\n        '''\n        for key, value in self.modules_info.items():\n            init_args = value[\"init_args\"]\n            CacheUpdater(\"hub_serving_start\", module=key, version=init_args.get(\"version\", \"0.0.0\")).start()\n            if \"directory\" not in init_args:\n                init_args.update({\"name\": key})\n            m = hub.Module(**init_args)\n            method_name = m.serving_func_name\n            if method_name is None:\n                raise RuntimeError(\"{} cannot be used for predicting\".format(key))\n            serving_method = getattr(m, method_name)\n            category = str(m.type).split(\"/\")[0].upper()\n            
self.modules_info[key].update({\n                \"method_name\": method_name,\n                \"version\": m.version,\n                \"category\": category,\n                \"module\": m,\n                \"name\": m.name,\n                \"serving_method\": serving_method\n            })\n\n    def start_app_with_args(self):\n        '''\n        Start one PaddleHub-Serving instance by arguments with gunicorn.\n        '''\n        module = self.modules_info\n        if module is not None:\n            port = self.args.port\n            if is_port_occupied(\"127.0.0.1\", port) is True:\n                log.logger.error(\"Port %s is occupied, please change it.\" % port)\n                return False\n            self.preinstall_modules()\n            options = {\"bind\": \"0.0.0.0:%s\" % port, \"workers\": self.args.workers}\n            self.dump_pid_file()\n            StandaloneApplication(app.create_app(init_flag=False, configs=self.modules_info), options).run()\n        else:\n            log.logger.error(\"Lack of necessary parameters!\")\n\n    def start_zmq_serving_with_args(self):\n        '''\n        Start one PaddleHub-Serving instance by arguments with zmq.\n        '''\n        if self.modules_info is not None:\n            for module, info in self.modules_info.items():\n                CacheUpdater(\"hub_serving_start\", module=module, version=info['init_args']['version']).start()\n            front_port = self.args.port\n            if is_port_occupied(\"127.0.0.1\", front_port) is True:\n                log.logger.error(\"Port %s is occupied, please change it.\" % front_port)\n                return False\n            back_port = int(front_port) + 1\n            for index in range(100):\n                if not is_port_occupied(\"127.0.0.1\", back_port):\n                    break\n                else:\n                    back_port = int(back_port) + 1\n            else:\n                raise RuntimeError(\n                    \"Port from 
%s to %s is occupied, please use another port\" % (int(front_port) + 1, back_port))\n            self.dump_pid_file()\n            run_all(self.modules_info, self.args.gpu, front_port, back_port)\n\n        else:\n            log.logger.error(\"Lack of necessary parameters!\")\n\n    def start_single_app_with_args(self):\n        '''\n        Start one PaddleHub-Serving instance by arguments with flask.\n        '''\n        module = self.modules_info\n        if module is not None:\n            port = self.args.port\n            if is_port_occupied(\"127.0.0.1\", port) is True:\n                log.logger.error(\"Port %s is occupied, please change it.\" % port)\n                return False\n            self.preinstall_modules()\n            self.dump_pid_file()\n            app.run(configs=self.modules_info, port=port)\n        else:\n            log.logger.error(\"Lack of necessary parameters!\")\n\n    def start_serving(self):\n        '''\n        Start PaddleHub-Serving with flask and gunicorn\n        '''\n        if self.args.use_gpu:\n            if self.args.use_multiprocess:\n                log.logger.warning('`use_multiprocess` will be ignored if specify `use_gpu`.')\n            self.start_zmq_serving_with_args()\n        else:\n            if self.args.use_multiprocess:\n                if platform.system() == \"Windows\":\n                    log.logger.warning(\n                        \"Warning: Windows cannot use multiprocess working mode, PaddleHub Serving will switch to single process mode\"\n                    )\n                    self.start_single_app_with_args()\n                else:\n                    self.start_app_with_args()\n            else:\n                self.start_single_app_with_args()\n\n    @staticmethod\n    def show_help():\n        str = \"serving <option>\\n\"\n        str += \"\\tManage PaddleHub Serving.\\n\"\n        str += \"sub command:\\n\"\n        str += \"1. 
start\\n\"\n        str += \"\\tStart PaddleHub Serving.\\n\"\n        str += \"2. stop\\n\"\n        str += \"\\tStop PaddleHub Serving.\\n\"\n        str += \"3. start bert_service\\n\"\n        str += \"\\tStart Bert Service.\\n\"\n        str += \"\\n\"\n        str += \"[start] option:\\n\"\n        str += \"--modules/-m [module1==version, module2==version...]\\n\"\n        str += \"\\tPre-install modules via the parameter list.\\n\"\n        str += \"--port/-p XXXX\\n\"\n        str += \"\\tUse port XXXX for serving.\\n\"\n        str += \"--use_multiprocess\\n\"\n        str += \"\\tChoose multiprocess mode, cannot be used on Windows.\\n\"\n        str += \"--modules_info\\n\"\n        str += \"\\tSet module config in PaddleHub Serving.\\n\"\n        str += \"--config/-c file_path\\n\"\n        str += \"\\tUse configs in file to start PaddleHub Serving. \"\n        str += \"Other parameters will be ignored if you specify the parameter.\\n\"\n        str += \"\\n\"\n        str += \"[stop] option:\\n\"\n        str += \"--port/-p XXXX\\n\"\n        str += \"\\tStop PaddleHub Serving on port XXXX safely.\\n\"\n        str += \"\\n\"\n        str += \"[start bert_service] option:\\n\"\n        str += \"--modules/-m\\n\"\n        str += \"\\tPre-install modules via the parameter.\\n\"\n        str += \"--port/-p XXXX\\n\"\n        str += \"\\tUse port XXXX for serving.\\n\"\n        str += \"--use_gpu\\n\"\n        str += \"\\tUse GPU for prediction if the parameter is specified.\\n\"\n        str += \"--gpu\\n\"\n        str += \"\\tSpecify the GPU devices to use.\\n\"\n        print(str)\n\n    def parse_args(self):\n        if self.args.config is not None:\n            if os.path.exists(self.args.config):\n                with open(self.args.config, \"r\") as fp:\n                    self.args_config = json.load(fp)\n                self.args.use_gpu = self.args_config.get('use_gpu', False)\n                
self.args.use_multiprocess = self.args_config.get('use_multiprocess', False)\n                self.modules_info = self.args_config[\"modules_info\"]\n                self.args.port = self.args_config.get('port', 8866)\n                if self.args.use_gpu:\n                    self.args.gpu = self.args_config.get('gpu', '0')\n                else:\n                    self.args.gpu = self.args_config.get('gpu', None)\n                if self.args.use_multiprocess:\n                    self.args.workers = self.args_config.get('workers', number_of_workers())\n                else:\n                    self.args.workers = self.args_config.get('workers', None)\n            else:\n                raise RuntimeError(\"{} does not exist.\".format(self.args.config))\n        else:\n            self.modules_info = {}\n            for item in self.args.modules:\n                version = None\n                if \"==\" in item:\n                    module = item.split(\"==\")[0]\n                    version = item.split(\"==\")[1]\n                else:\n                    module = item\n                self.modules_info.update({module: {\"init_args\": {\"version\": version}, \"predict_args\": {}}})\n        if self.args.gpu:\n            self.args.gpu = self.args.gpu.split(',')\n\n        return self.modules_info\n\n    def execute(self, argv):\n        self.show_in_help = True\n        self.description = \"Start Module Serving or Bert Service for online predicting.\"\n        self.parser = argparse.ArgumentParser(\n            description=self.__class__.__doc__, prog='hub serving', usage='%(prog)s', add_help=True)\n        self.parser.add_argument(\"command\")\n        self.parser.add_argument(\"sub_command\")\n        self.parser.add_argument(\"bert_service\", nargs=\"?\")\n        self.sub_parse = self.parser.add_mutually_exclusive_group(required=False)\n        
self.parser.add_argument(\"--use_gpu\", action=\"store_true\", default=False)\n        self.parser.add_argument(\"--use_multiprocess\", action=\"store_true\", default=False)\n        self.parser.add_argument(\"--modules\", \"-m\", nargs=\"+\")\n        self.parser.add_argument(\"--config\", \"-c\", nargs=\"?\")\n        self.parser.add_argument(\"--port\", \"-p\", nargs=\"?\", default=8866)\n        self.parser.add_argument(\"--gpu\", \"-i\", nargs=\"?\", default='0')\n        self.parser.add_argument(\"--use_singleprocess\", action=\"store_true\", default=False)\n        self.parser.add_argument(\"--modules_info\", \"-mi\", default={}, type=json.loads)\n        self.parser.add_argument(\"--workers\", \"-w\", nargs=\"?\", default=number_of_workers())\n        try:\n            self.args = self.parser.parse_args()\n        except SystemExit:\n            ServingCommand.show_help()\n            return False\n        if self.args.sub_command == \"start\":\n            if self.args.bert_service == \"bert_service\":\n                ServingCommand.start_bert_serving(self.args)\n            else:\n                self.parse_args()\n                self.start_serving()\n        elif self.args.sub_command == \"stop\":\n            if self.args.bert_service == \"bert_service\":\n                log.logger.warning(\"Please stop Bert Service by killing the process yourself.\")\n            elif self.args.bert_service is None:\n                self.stop_serving(port=self.args.port)\n        else:\n            ServingCommand.show_help()\n"
  },
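The serving lifecycle in `paddlehub/commands/serving.py` hinges on two small pieces: a per-port JSON PID file (`dump_pid_file`) and a signal-0 liveness probe (`pid_is_exist`). A minimal standalone sketch of that pattern, with the temp directory standing in for `CONF_HOME` and the module list chosen purely for illustration:

```python
import json
import os
import tempfile
import time


def pid_is_alive(pid):
    # Signal 0 delivers nothing but still performs the existence check,
    # so a dead PID raises OSError (the same trick pid_is_exist uses).
    try:
        os.kill(pid, 0)
    except OSError:
        return False
    return True


def dump_pid_file(conf_home, port, modules):
    # One JSON file per serving port, as ServingCommand.dump_pid_file writes.
    path = os.path.join(conf_home, 'serving_{}.json'.format(port))
    with open(path, 'w') as fp:
        json.dump({'pid': os.getpid(), 'module': modules, 'start_time': time.time()}, fp)
    return path


conf_home = tempfile.mkdtemp()  # stand-in for CONF_HOME
path = dump_pid_file(conf_home, 8866, ['lac'])
with open(path) as fp:
    info = json.load(fp)
print(pid_is_alive(info['pid']))  # the current process is alive -> True
```

`stop_serving` then reads this file back by port, sends `SIGTERM`, and deletes the file; the probe is what lets it report "has been stopped" instead of signalling a dead PID.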
  {
    "path": "paddlehub/commands/show.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import List\n\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\nfrom paddlehub.utils import log, platform\nfrom paddlehub.module.module import Module, InvalidHubModule\n\n\n@register(name='hub.show', description='Show the information of PaddleHub module.')\nclass ShowCommand:\n    def execute(self, argv: List) -> bool:\n        if not argv:\n            print(\"ERROR: You must give one module to show.\")\n            return False\n        argv = argv[0]\n\n        if os.path.exists(argv) and os.path.isdir(argv):\n            try:\n                module = Module.load(argv)\n            except InvalidHubModule:\n                print('{} is not a valid HubModule'.format(argv))\n                return False\n            except Exception:\n                print('An exception occurred while loading {}'.format(argv))\n                return False\n        else:\n            module = LocalModuleManager().search(argv)\n            if not module:\n                print('{} is not installed!'.format(argv))\n                return False\n\n        widths = [15, 40] if platform.is_windows else [15, 50]\n        aligns = ['^', '<']\n        colors = ['cyan', '']\n        table = log.Table(widths=widths, colors=colors, aligns=aligns)\n\n        
table.append('ModuleName', module.name)\n        table.append('Version', str(module.version))\n        table.append('Summary', module.summary)\n        table.append('Author', module.author)\n        table.append('Author-Email', module.author_email)\n        table.append('Location', module.directory)\n        print(table)\n        return True\n"
  },
  {
    "path": "paddlehub/commands/tmpl/init_py.tmpl",
    "content": ""
  },
  {
    "path": "paddlehub/commands/tmpl/serving_demo.tmpl",
    "content": "# coding: utf8\nimport requests\nimport json\nimport cv2\nimport base64\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\nif __name__ == '__main__':\n    # Get the base64-encoded form of the images\n    img1 = cv2_to_base64(cv2.imread(\"IMAGE_PATH1\"))\n    img2 = cv2_to_base64(cv2.imread(\"IMAGE_PATH2\"))\n    data = {'images': [img1, img2]}\n    # Specify the content type\n    headers = {\"Content-type\": \"application/json\"}\n    # Send the HTTP request\n    url = \"http://127.0.0.1:8866/predict/${MODULE_NAME}\"\n    r = requests.post(url=url, headers=headers, data=json.dumps(data))\n\n    # Print the prediction results\n    print(r.json()[\"results\"])\n"
  },
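`serving_demo.tmpl` ships images to the server as base64 strings inside a JSON body. The same encode/decode handshake can be sketched without OpenCV; the placeholder bytes below merely stand in for the JPEG buffer `cv2.imencode` would produce:

```python
import base64
import json


def bytes_to_base64(data):
    # Same role as cv2_to_base64: binary image data becomes a JSON-safe string.
    return base64.b64encode(data).decode('utf8')


def base64_to_bytes(b64str):
    return base64.b64decode(b64str.encode('utf8'))


# Placeholder bytes standing in for a JPEG buffer from cv2.imencode.
fake_jpeg = b'\xff\xd8\xff\xe0' + b'\x00' * 16
payload = json.dumps({'images': [bytes_to_base64(fake_jpeg)]})

# Server side: parse the JSON body and recover the original bytes.
decoded = base64_to_bytes(json.loads(payload)['images'][0])
print(decoded == fake_jpeg)  # True
```

Base64 inflates the payload by about a third, but keeps binary data valid inside `application/json`, which is why the serving endpoints use it in both directions.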
  {
    "path": "paddlehub/commands/tmpl/x_model.tmpl",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\n\nimport os\nimport cv2\nimport argparse\nimport base64\nimport paddlex as pdx\n\nimport numpy as np\nimport paddlehub as hub\nfrom paddlehub.module.module import moduleinfo, runnable, serving\n\n\ndef base64_to_cv2(b64str):\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\ndef base64_to_np(b64tuple):\n    data, shape = b64tuple\n    data = base64.b64decode(data.encode('utf8'))\n    data = np.frombuffer(data, np.float32).reshape(shape)\n    return data\n\n\ndef cv2_to_base64(image):\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef np_to_base64(array):\n    shape = array.shape\n    data = base64.b64encode(array).decode('utf8')\n    return data, shape\n\n\ndef read_images(paths):\n    images = []\n    for path in paths:\n        images.append(cv2.imread(path))\n    return images\n\n\n@moduleinfo(\n    name=${NAME},\n    type=${TYPE},\n    author=${AUTHOR},\n    author_email=${EMAIL},\n    summary=${SUMMARY},\n    version=${VERSION})\nclass MODULE(hub.Module):\n    def _initialize(self, **kwargs):\n        self.default_pretrained_model_path = os.path.join(\n            self.directory, 'assets')\n        self.model = pdx.deploy.Predictor(self.default_pretrained_model_path,\n                                          **kwargs)\n\n    def predict(self,\n                images=None,\n                paths=None,\n                data=None,\n                batch_size=1,\n                use_gpu=False,\n                **kwargs):\n\n        all_data = images if images is not None else read_images(paths)\n        total_num = len(all_data)\n        loop_num = int(np.ceil(total_num / batch_size))\n\n        res = []\n        for iter_id in range(loop_num):\n            
batch_data = list()\n            handle_id = iter_id * batch_size\n            for image_id in range(batch_size):\n                try:\n                    batch_data.append(all_data[handle_id + image_id])\n                except IndexError:\n                    break\n            out = self.model.batch_predict(batch_data, **kwargs)\n            res.extend(out)\n        return res\n\n    @serving\n    def serving_method(self, images, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images_decode, **kwargs)\n        res = []\n        for result in results:\n            if isinstance(result, dict):\n                for key, value in result.items():\n                    if key == 'score_map':\n                        result[key] = np_to_base64(value)\n                    elif isinstance(value, np.ndarray):\n                        result[key] = cv2_to_base64(value)\n                    elif isinstance(value, np.generic):\n                        result[key] = value.item()\n\n            elif isinstance(result, list):\n                for index in range(len(result)):\n                    for key, value in result[index].items():\n                        if key == 'score_map':\n                            result[index][key] = np_to_base64(value)\n                        elif isinstance(value, np.ndarray):\n                            result[index][key] = cv2_to_base64(value)\n                        elif isinstance(value, np.generic):\n                            result[index][key] = value.item()\n            else:\n                raise RuntimeError('The result cannot be used in serving.')\n            res.append(result)\n        return res\n\n    @runnable\n    def run_cmd(self, argvs):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n       
     description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(\n            title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\",\n            description=\n            \"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(\n            paths=[args.input_path],\n            use_gpu=args.use_gpu)\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n        self.arg_config_group.add_argument(\n            '--use_gpu',\n            type=bool,\n            default=False,\n            help=\"whether use GPU or not\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument(\n            '--input_path', type=str, help=\"path to image.\")\n\nif __name__ == '__main__':\n    module = MODULE(directory='./new_model')\n    images = [cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg'), cv2.imread('./cat.jpg')]\n    res = module.predict(images=images)\n"
  },
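`x_model.tmpl` moves float score maps over JSON by pairing a base64-encoded raw buffer with its shape (`np_to_base64` / `base64_to_np`). A dependency-free sketch of the same round trip, using `struct` in place of numpy's float32 buffers:

```python
import base64
import struct


def floats_to_base64(values, shape):
    # Counterpart of np_to_base64: raw little-endian float32 buffer plus
    # the shape needed to rebuild the array on the other side.
    data = struct.pack('<%df' % len(values), *values)
    return base64.b64encode(data).decode('utf8'), shape


def base64_to_floats(b64tuple):
    # Counterpart of base64_to_np: decode, reinterpret as float32s,
    # and hand the shape back alongside the values.
    data, shape = b64tuple
    raw = base64.b64decode(data.encode('utf8'))
    return list(struct.unpack('<%df' % (len(raw) // 4), raw)), shape


score_map = [0.25, 0.5, 0.75, 1.0]  # exactly representable in float32
encoded = floats_to_base64(score_map, (2, 2))
decoded_values, decoded_shape = base64_to_floats(encoded)
print(decoded_values == score_map, decoded_shape)  # True (2, 2)
```

Shipping the shape separately is what lets the receiving side call `reshape` on a flat buffer; without it, only the 1-D data would survive the trip.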
  {
    "path": "paddlehub/commands/uninstall.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\nimport paddlehub as hub\nfrom paddlehub.commands import register\nfrom paddlehub.module.manager import LocalModuleManager\n\n\n@register(name='hub.uninstall', description='Uninstall PaddleHub module.')\nclass UninstallCommand:\n    def execute(self, argv: List) -> bool:\n        if not argv:\n            print(\"ERROR: You must give at least one module to uninstall.\")\n            return False\n\n        manager = LocalModuleManager()\n        for _arg in argv:\n            manager.uninstall(_arg)\n\n        return True\n"
  },
  {
    "path": "paddlehub/commands/utils.py",
    "content": "#coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom collections import defaultdict\nfrom typing import Any\n\n\ndef _CommandDict():\n    return defaultdict(_CommandDict)\n\n\n_commands = _CommandDict()\n\n\ndef register(name: str, description: str = '') -> Any:\n    '''\n    Register a subcommand in the command list of PaddleHub\n\n    Args:\n        name(str) : The name of the command, separated by '.' (e.g, hub.serving)\n        description(str) : The description of the specified command shown in the help command. If no description is given, this command will not be shown in the help command. Default is None.\n    '''\n\n    def _wrapper(command):\n        items = name.split('.')\n\n        com = _commands\n        for item in items:\n            com = com[item]\n        com['_entry'] = command\n        if description:\n            com['_description'] = description\n        return command\n\n    return _wrapper\n\n\ndef get_command(name: str) -> Any:\n    items = name.split('.')\n    com = _commands\n    for item in items:\n        com = com[item]\n\n    return com['_entry']\n\n\ndef execute():\n    '''\n    Execute a PaddleHub command and return the status code\n\n    Returns:\n         status(int) : Result of the command execution. 
0 for a success and 1 for a failure.\n    '''\n    import sys\n    import multiprocessing\n    try:\n        multiprocessing.set_start_method(\n            'fork'\n        )  # In PaddleHub, features such as the gradio app need multiprocessing support, and 'fork' is the preferred start method.\n    except Exception:\n        pass  # If 'fork' cannot be set, keep the default start method\n    com = _commands\n    for idx, _argv in enumerate(['hub'] + sys.argv[1:]):\n        if _argv not in com:\n            break\n        com = com[_argv]\n    else:\n        idx += 1\n\n    # The method 'execute' of a command instance returns 'True' for a success\n    # while 'False' for a failure. Here we convert this result into an exit\n    # status in bash: 0 for a success and 1 for a failure.\n    status = 0 if com['_entry']().execute(sys.argv[idx:]) else 1\n    return status\n"
  },
  {
    "path": "paddlehub/commands/version.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\nimport paddlehub as hub\nfrom paddlehub.commands import register\n\n\n@register(name='hub.version', description='Show PaddleHub\\'s version.')\nclass VersionCommand:\n    def execute(self, argv: List) -> bool:\n        print(hub.__version__)\n        return True\n"
  },
  {
    "path": "paddlehub/compat/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/compat/datasets/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/compat/datasets/base_dataset.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nfrom paddlehub.utils.log import logger\n\n\nclass InputExample(object):\n    '''\n    Input data structure of BERT/ERNIE, supporting single-sequence tasks such\n    as text classification and sequence labeling, as well as sequence-pair\n    tasks such as dialog.\n\n    Args:\n        guid: Unique id for the example.\n        text_a: string. The untokenized text of the first sequence. For single\n            sequence tasks, only this sequence must be specified.\n        text_b: (Optional) string. The untokenized text of the second sequence.\n            Must be specified only for sequence pair tasks.\n        label: (Optional) string. The label of the example. 
This should be\n            specified for train and dev examples, but not for test examples.\n    '''\n\n    def __init__(self, guid, text_a, text_b=None, label=None):\n        self.guid = guid\n        self.text_a = text_a\n        self.text_b = text_b\n        self.label = label\n\n    def __str__(self):\n        if self.text_b is None:\n            return 'text={}\\tlabel={}'.format(self.text_a, self.label)\n        else:\n            return 'text_a={}\\ttext_b={},label={}'.format(self.text_a, self.text_b, self.label)\n\n\nclass BaseDataset(object):\n    def __init__(self,\n                 base_path,\n                 train_file=None,\n                 dev_file=None,\n                 test_file=None,\n                 predict_file=None,\n                 label_file=None,\n                 label_list=None,\n                 train_file_with_header=False,\n                 dev_file_with_header=False,\n                 test_file_with_header=False,\n                 predict_file_with_header=False):\n        if not (train_file or dev_file or test_file):\n            raise ValueError('At least one file should be assigned')\n        self.base_path = base_path\n        self.train_file = train_file\n        self.dev_file = dev_file\n        self.test_file = test_file\n        self.predict_file = predict_file\n        self.label_file = label_file\n        self.label_list = label_list\n\n        self.train_examples = []\n        self.dev_examples = []\n        self.test_examples = []\n        self.predict_examples = []\n\n        self.if_file_with_header = {\n            'train': train_file_with_header,\n            'dev': dev_file_with_header,\n            'test': test_file_with_header,\n            'predict': predict_file_with_header\n        }\n\n        if train_file:\n            self._load_train_examples()\n        if dev_file:\n            self._load_dev_examples()\n        if test_file:\n            self._load_test_examples()\n        if predict_file:\n            
self._load_predict_examples()\n        if self.label_file:\n            if not self.label_list:\n                self.label_list = self._load_label_data()\n            else:\n                logger.warning('As label_list has been assigned, label_file will be ignored')\n\n        if self.label_list:\n            self.label_index = dict(zip(self.label_list, range(len(self.label_list))))\n\n    def get_train_examples(self):\n        return self.train_examples\n\n    def get_dev_examples(self):\n        return self.dev_examples\n\n    def get_test_examples(self):\n        return self.test_examples\n\n    def get_val_examples(self):\n        return self.get_dev_examples()\n\n    def get_predict_examples(self):\n        return self.predict_examples\n\n    def get_examples(self, phase):\n        if phase == 'train':\n            return self.get_train_examples()\n        elif phase == 'dev':\n            return self.get_dev_examples()\n        elif phase == 'test':\n            return self.get_test_examples()\n        elif phase == 'val':\n            return self.get_val_examples()\n        elif phase == 'predict':\n            return self.get_predict_examples()\n        else:\n            raise ValueError('Invalid phase: %s' % phase)\n\n    def get_labels(self):\n        return self.label_list\n\n    @property\n    def num_labels(self):\n        return len(self.label_list)\n\n    # To be compatible with ImageClassificationDataset\n    def label_dict(self):\n        return {index: key for index, key in enumerate(self.label_list)}\n\n    def _load_train_examples(self):\n        self.train_path = os.path.join(self.base_path, self.train_file)\n        self.train_examples = self._read_file(self.train_path, phase='train')\n\n    def _load_dev_examples(self):\n        self.dev_path = os.path.join(self.base_path, self.dev_file)\n        self.dev_examples = self._read_file(self.dev_path, phase='dev')\n\n    def _load_test_examples(self):\n        self.test_path = 
os.path.join(self.base_path, self.test_file)\n        self.test_examples = self._read_file(self.test_path, phase='test')\n\n    def _load_predict_examples(self):\n        self.predict_path = os.path.join(self.base_path, self.predict_file)\n        self.predict_examples = self._read_file(self.predict_path, phase='predict')\n\n    def _read_file(self, path, phase=None):\n        raise NotImplementedError\n\n    def _load_label_data(self):\n        with open(os.path.join(self.base_path, self.label_file), 'r', encoding='utf8') as file:\n            return file.read().strip().split('\\n')\n\n    def __str__(self):\n        return 'Dataset: %s with %i train examples, %i dev examples and %i test examples' % (\n            self.__class__.__name__, len(self.train_examples), len(self.dev_examples), len(self.test_examples))\n"
  },
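`BaseDataset` derives two lookup tables from `label_list`: `label_index` (label to integer id, built with `dict(zip(...))` in `__init__`) and the `label_dict` method (id back to label). Their construction in isolation, with an illustrative three-label list:

```python
label_list = ['negative', 'positive', 'neutral']  # illustrative labels

# label -> integer id, as built in BaseDataset.__init__:
label_index = dict(zip(label_list, range(len(label_list))))

# integer id -> label, as returned by BaseDataset.label_dict():
label_dict = {index: key for index, key in enumerate(label_list)}

print(label_index['positive'], label_dict[2])  # 1 neutral
```

Ids follow list order, so the order of `label_list` (or of the lines in `label_file`) fixes the class indices the model trains against.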
  {
    "path": "paddlehub/compat/datasets/couplet.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport codecs\nimport csv\nimport os\n\nimport paddlehub.env as hubenv\nfrom paddlehub.compat.datasets.base_dataset import InputExample\nfrom paddlehub.compat.datasets.nlp_dataset import GenerationDataset\nfrom paddlehub.utils.download import download_data\n\n\n@download_data('https://bj.bcebos.com/paddlehub-dataset/couplet.tar.gz')\nclass Couplet(GenerationDataset):\n    '''An open source couplet dataset, see https://github.com/v-zich/couplet-clean-dataset for details.'''\n\n    def __init__(self, tokenizer=None, max_seq_len=None):\n        dataset_dir = os.path.join(hubenv.DATA_HOME, 'couplet')\n        with open(os.path.join(dataset_dir, 'vocab.txt'), encoding='utf8') as vocab_file:\n            label_list = [line.strip() for line in vocab_file.readlines()]\n        super(Couplet, self).__init__(\n            base_path=dataset_dir,\n            train_file='train.tsv',\n            dev_file='dev.tsv',\n            test_file='test.tsv',\n            label_list=label_list,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len)\n\n    def _read_file(self, input_file, phase=None):\n        '''Reads a tab separated value file.'''\n        with codecs.open(input_file, 'r', encoding='UTF-8') as f:\n            reader = csv.reader(f, delimiter='\\t', quotechar=None)\n            examples = []\n            
seq_id = 0\n            for line in reader:\n                example = InputExample(guid=seq_id, label=line[1], text_a=line[0])\n                seq_id += 1\n                examples.append(example)\n\n            return examples\n"
  },
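`Couplet._read_file` parses tab-separated lines with `quotechar=None`, so any quote characters in the couplet text are kept verbatim, and wraps each row in an `InputExample` (first column becomes `text_a`, second becomes `label`). A self-contained sketch of that reader with placeholder couplet text:

```python
import csv
import io


class InputExample:
    # Trimmed-down version of InputExample from base_dataset.py.
    def __init__(self, guid, text_a, text_b=None, label=None):
        self.guid = guid
        self.text_a = text_a
        self.text_b = text_b
        self.label = label


def read_tsv(fp):
    # quotechar=None disables quote handling, so quotes in the text survive.
    reader = csv.reader(fp, delimiter='\t', quotechar=None)
    return [InputExample(guid=i, text_a=row[0], label=row[1])
            for i, row in enumerate(reader)]


sample = io.StringIO('first line of couplet\tsecond line of couplet\n')
examples = read_tsv(sample)
print(examples[0].text_a, '->', examples[0].label)
```

For this generation task the "label" column is simply the target sequence (the second line of the couplet), which is why the vocabulary file doubles as the label list in `Couplet.__init__`.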
  {
    "path": "paddlehub/compat/datasets/nlp_dataset.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport csv\nimport collections\n\nimport numpy as np\nfrom tqdm import tqdm\n\nfrom paddlehub.compat.datasets.base_dataset import InputExample, BaseDataset\nfrom paddlehub.utils.log import logger\nfrom paddlehub.text.tokenizer import CustomTokenizer\nfrom paddlehub.text.bert_tokenizer import BertTokenizer\n\n\nclass BaseNLPDataset(BaseDataset):\n    def __init__(self,\n                 base_path,\n                 train_file=None,\n                 dev_file=None,\n                 test_file=None,\n                 predict_file=None,\n                 label_file=None,\n                 label_list=None,\n                 train_file_with_header=False,\n                 dev_file_with_header=False,\n                 test_file_with_header=False,\n                 predict_file_with_header=False,\n                 tokenizer=None,\n                 max_seq_len=128):\n        super(BaseNLPDataset, self).__init__(\n            base_path=base_path,\n            train_file=train_file,\n            dev_file=dev_file,\n            test_file=test_file,\n            predict_file=predict_file,\n            label_file=label_file,\n            label_list=label_list,\n            train_file_with_header=train_file_with_header,\n            dev_file_with_header=dev_file_with_header,\n            
test_file_with_header=test_file_with_header,\n            predict_file_with_header=predict_file_with_header)\n        self.tokenizer = tokenizer\n        self.max_seq_len = max_seq_len\n        self._train_records = None\n        self._dev_records = None\n        self._test_records = None\n        self._predict_records = None\n\n    @property\n    def train_records(self):\n        if not self._train_records:\n            examples = self.train_examples\n            if not self.tokenizer or not examples:\n                return []\n            logger.info('Processing the train set...')\n            self._train_records = self._convert_examples_to_records(examples, phase='train')\n        return self._train_records\n\n    @property\n    def dev_records(self):\n        if not self._dev_records:\n            examples = self.dev_examples\n            if not self.tokenizer or not examples:\n                return []\n            logger.info('Processing the dev set...')\n            self._dev_records = self._convert_examples_to_records(examples, phase='dev')\n        return self._dev_records\n\n    @property\n    def test_records(self):\n        if not self._test_records:\n            examples = self.test_examples\n            if not self.tokenizer or not examples:\n                return []\n            logger.info('Processing the test set...')\n            self._test_records = self._convert_examples_to_records(examples, phase='test')\n        return self._test_records\n\n    @property\n    def predict_records(self):\n        if not self._predict_records:\n            examples = self.predict_examples\n            if not self.tokenizer or not examples:\n                return []\n            logger.info('Processing the predict set...')\n            self._predict_records = self._convert_examples_to_records(examples, phase='predict')\n        return self._predict_records\n\n    def _read_file(self, input_file, phase=None):\n        '''Reads a tab separated value file.'''\n    
    has_warned = False\n        with io.open(input_file, 'r', encoding='UTF-8') as file:\n            reader = csv.reader(file, delimiter='\\t', quotechar=None)\n            examples = []\n            for (i, line) in enumerate(reader):\n                if i == 0:\n                    ncol = len(line)\n                    if self.if_file_with_header[phase]:\n                        continue\n                if phase != 'predict':\n                    if ncol == 1:\n                        raise Exception(\n                            'the %s file %s has only one column, but it is not a predict file' % (phase, input_file))\n                    elif ncol == 2:\n                        example = InputExample(guid=i, text_a=line[0], label=line[1])\n                    elif ncol == 3:\n                        example = InputExample(guid=i, text_a=line[0], text_b=line[1], label=line[2])\n                    else:\n                        raise Exception('the %s file %s has too many columns (should be <=3)' % (phase, input_file))\n                else:\n                    if ncol == 1:\n                        example = InputExample(guid=i, text_a=line[0])\n                    elif ncol == 2:\n                        if not has_warned:\n                            logger.warning(\n                                'the predict file %s has 2 columns; the second column will be regarded as text_b'\n                                % (input_file))\n                            has_warned = True\n                        example = InputExample(guid=i, text_a=line[0], text_b=line[1])\n                    else:\n                        raise Exception('the predict file %s has too many columns (should be <=2)' % (input_file))\n                examples.append(example)\n            return examples\n\n    def _convert_examples_to_records(self, examples, phase):\n        '''\n        Returns a list[dict] containing all the input information the model needs.\n        
Args:\n            examples (list): the data examples, returned by _read_file.\n            phase (str): the processing phase, can be 'train', 'dev', 'test' or 'predict'.\n        Returns:\n            a list of records for all the examples.\n        '''\n\n        records = []\n        with tqdm(total=len(examples)) as process_bar:\n            for example in examples:\n                record = self.tokenizer.encode(\n                    text=example.text_a, text_pair=example.text_b, max_seq_len=self.max_seq_len)\n                # CustomTokenizer first tokenizes the text and then looks up the words in the vocab.\n                # If no word is found in the vocab, the text is dropped.\n                if not record:\n                    logger.info('The text %s has been dropped as it has no words in the vocab after tokenization.' %\n                                example.text_a)\n                    continue\n                if example.label:\n                    record['label'] = self.label_list.index(example.label) if self.label_list else float(example.label)\n                records.append(record)\n                process_bar.update(1)\n        return records\n\n    def get_train_records(self, shuffle=False):\n        return self.get_records('train', shuffle=shuffle)\n\n    def get_dev_records(self, shuffle=False):\n        return self.get_records('dev', shuffle=shuffle)\n\n    def get_test_records(self, shuffle=False):\n        return self.get_records('test', shuffle=shuffle)\n\n    def get_val_records(self, shuffle=False):\n        return self.get_records('val', shuffle=shuffle)\n\n    def get_predict_records(self, shuffle=False):\n        return self.get_records('predict', shuffle=shuffle)\n\n    def get_records(self, phase, shuffle=False):\n        if phase == 'train':\n            records = self.train_records\n        elif phase == 'dev':\n            records = self.dev_records\n        elif phase == 'test':\n            records = 
self.test_records\n        elif phase == 'val':\n            records = self.dev_records\n        elif phase == 'predict':\n            records = self.predict_records\n        else:\n            raise ValueError('Invalid phase: %s' % phase)\n\n        if shuffle:\n            np.random.shuffle(records)\n        return records\n\n    def get_feed_list(self, phase):\n        records = self.get_records(phase)\n        if records:\n            feed_list = list(records[0].keys())\n        else:\n            feed_list = []\n        return feed_list\n\n    def batch_records_generator(self, phase, batch_size, shuffle=True, pad_to_batch_max_seq_len=False):\n        '''Generate batches of records, usually used in dynamic graph mode.\n        Args:\n            phase (str): the dataset phase, can be 'train', 'dev', 'val', 'test' or 'predict'.\n            batch_size (int): the data batch size.\n            shuffle (bool): if set to True, will shuffle the dataset.\n            pad_to_batch_max_seq_len (bool): if set to True, will dynamically pad to the max sequence length of the batch data.\n                                             Only recommended when the model uses an RNN.\n        '''\n        records = self.get_records(phase, shuffle=shuffle)\n\n        batch_records = []\n        batch_lens = []\n        for record in records:\n            batch_records.append(record)\n            if pad_to_batch_max_seq_len:\n                # This may reduce the processing speed\n                tokens_wo_pad = [\n                    token for token in self.tokenizer.decode(record, only_convert_to_tokens=True)\n                    if token != self.tokenizer.pad_token\n                ]\n                batch_lens.append(len(tokens_wo_pad))\n            if len(batch_records) == batch_size:\n                if pad_to_batch_max_seq_len:\n                    # This may reduce the processing speed.\n                    batch_max_seq_len = max(batch_lens)\n                 
   for record in batch_records:\n                        for key, value in record.items():\n                            if isinstance(value, list):\n                                # This may not be universal\n                                record[key] = value[:batch_max_seq_len]\n                rev_batch_records = {key: [record[key] for record in batch_records] for key in batch_records[0]}\n                yield rev_batch_records\n                batch_records = []\n                batch_lens = []\n\n        if batch_records:\n            if pad_to_batch_max_seq_len:\n                # This may reduce the processing speed.\n                batch_max_seq_len = max(batch_lens)\n                for record in batch_records:\n                    for key in record.keys():\n                        if isinstance(record[key], list):\n                            record[key] = record[key][:batch_max_seq_len]\n            rev_batch_records = {key: [record[key] for record in batch_records] for key in batch_records[0]}\n            yield rev_batch_records\n\n\nclass GenerationDataset(BaseNLPDataset):\n    def __init__(self,\n                 base_path,\n                 train_file=None,\n                 dev_file=None,\n                 test_file=None,\n                 predict_file=None,\n                 label_file=None,\n                 label_list=None,\n                 train_file_with_header=False,\n                 dev_file_with_header=False,\n                 test_file_with_header=False,\n                 predict_file_with_header=False,\n                 tokenizer=None,\n                 max_seq_len=128,\n                 split_char='\\002',\n                 start_token='<s>',\n                 end_token='</s>',\n                 unk_token='<unk>'):\n        self.split_char = split_char\n        self.start_token = start_token\n        self.end_token = end_token\n        self.unk_token = unk_token\n        super(GenerationDataset, self).__init__(\n            
base_path=base_path,\n            train_file=train_file,\n            dev_file=dev_file,\n            test_file=test_file,\n            predict_file=predict_file,\n            label_file=label_file,\n            label_list=label_list,\n            train_file_with_header=train_file_with_header,\n            dev_file_with_header=dev_file_with_header,\n            test_file_with_header=test_file_with_header,\n            predict_file_with_header=predict_file_with_header,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len)\n\n    def _convert_examples_to_records(self, examples, phase):\n        '''\n        Returns a list[dict] containing all the input information the model needs.\n        Args:\n            examples (list): the data examples, returned by _read_file.\n            phase (str): the processing phase, can be 'train', 'dev', 'test' or 'predict'.\n        Returns:\n            a list of records for all the examples.\n        '''\n        records = []\n        with tqdm(total=len(examples)) as process_bar:\n            for example in examples:\n                record = self.tokenizer.encode(\n                    text=example.text_a.split(self.split_char),\n                    text_pair=example.text_b.split(self.split_char) if example.text_b else None,\n                    max_seq_len=self.max_seq_len)\n                if example.label:\n                    expand_label = [self.start_token] + example.label.split(\n                        self.split_char)[:self.max_seq_len - 2] + [self.end_token]\n                    expand_label_id = [\n                        self.label_index.get(label, self.label_index[self.unk_token]) for label in expand_label\n                    ]\n                    record['label'] = expand_label_id[1:] + [self.label_index[self.end_token]\n                                                             ] * (self.max_seq_len - len(expand_label) + 1)\n                    record['dec_input'] = expand_label_id[:-1] + 
[self.label_index[self.end_token]\n                                                                  ] * (self.max_seq_len - len(expand_label) + 1)\n                records.append(record)\n                process_bar.update(1)\n        return records\n"
  },
  {
    "path": "paddlehub/compat/module/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/compat/module/module_desc.proto",
    "content": "// Copyright 2018 The Paddle Authors. All Rights Reserved.\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n//     http://www.apache.org/licenses/LICENSE-2.0\n//\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n// =============================================================================\n\nsyntax = \"proto3\";\noption optimize_for = LITE_RUNTIME;\n\npackage paddlehub.module.desc;\n\nenum DataType {\n  NONE = 0;\n  INT = 1;\n  FLOAT = 2;\n  STRING = 3;\n  BOOLEAN = 4;\n  LIST = 5;\n  MAP = 6;\n  SET = 7;\n  OBJECT = 8;\n}\n\nmessage KVData {\n  map<string, DataType> key_type = 1;\n  map<string, ModuleAttr> data = 2;\n}\n\nmessage ModuleAttr {\n  // Basic type\n  DataType type = 1;\n  int64 i = 2;\n  double f = 3;\n  bool b = 4;\n  string s = 5;\n  KVData map = 6;\n  KVData list = 7;\n  KVData set = 8;\n  KVData object = 9;\n  // \n  string name = 10;\n  string info = 11;\n\n}\n\n// Feed Variable Description\nmessage FeedDesc {\n  string var_name = 1;\n  string alias = 2;\n};\n\n// Fetch Variable Description\nmessage FetchDesc {\n  string var_name = 1;\n  string alias = 2;\n};\n\n// Module Variable\nmessage ModuleVar {\n  repeated FetchDesc fetch_desc = 1;\n  repeated FeedDesc feed_desc = 2;\n}\n\n// A Hub Module is stored in a directory with a file 'module_desc.pb'\n// containing a serialized protocol message of this type. 
The further contents\n// of the directory depend on the storage format described by the message.\nmessage ModuleDesc {\n  // signature to module variable\n  map<string, ModuleVar> sign2var = 2;\n\n  ModuleAttr attr = 3;\n};\n"
  },
  {
    "path": "paddlehub/compat/module/module_desc_pb2.py",
    "content": "#coding:utf-8\n# Generated by the protocol buffer compiler.  DO NOT EDIT!\n# source: module_desc.proto\n\nimport sys\n_b = sys.version_info[0] < 3 and (lambda x: x) or (lambda x: x.encode('latin1'))\nfrom google.protobuf.internal import enum_type_wrapper\nfrom google.protobuf import descriptor as _descriptor\nfrom google.protobuf import message as _message\nfrom google.protobuf import reflection as _reflection\nfrom google.protobuf import symbol_database as _symbol_database\nfrom google.protobuf import descriptor_pb2\n# @@protoc_insertion_point(imports)\n\n_sym_db = _symbol_database.Default()\n\nDESCRIPTOR = _descriptor.FileDescriptor(\n    name='module_desc.proto',\n    package='paddlehub.module.desc',\n    syntax='proto3',\n    serialized_pb=_b(\n        '\\n\\x11module_desc.proto\\x12\\x15paddlehub.module.desc\\\"\\x9e\\x02\\n\\x06KVData\\x12<\\n\\x08key_type\\x18\\x01 \\x03(\\x0b\\x32*.paddlehub.module.desc.KVData.KeyTypeEntry\\x12\\x35\\n\\x04\\x64\\x61ta\\x18\\x02 \\x03(\\x0b\\x32\\'.paddlehub.module.desc.KVData.DataEntry\\x1aO\\n\\x0cKeyTypeEntry\\x12\\x0b\\n\\x03key\\x18\\x01 \\x01(\\t\\x12.\\n\\x05value\\x18\\x02 \\x01(\\x0e\\x32\\x1f.paddlehub.module.desc.DataType:\\x02\\x38\\x01\\x1aN\\n\\tDataEntry\\x12\\x0b\\n\\x03key\\x18\\x01 \\x01(\\t\\x12\\x30\\n\\x05value\\x18\\x02 \\x01(\\x0b\\x32!.paddlehub.module.desc.ModuleAttr:\\x02\\x38\\x01\\\"\\xb7\\x02\\n\\nModuleAttr\\x12-\\n\\x04type\\x18\\x01 \\x01(\\x0e\\x32\\x1f.paddlehub.module.desc.DataType\\x12\\t\\n\\x01i\\x18\\x02 \\x01(\\x03\\x12\\t\\n\\x01\\x66\\x18\\x03 \\x01(\\x01\\x12\\t\\n\\x01\\x62\\x18\\x04 \\x01(\\x08\\x12\\t\\n\\x01s\\x18\\x05 \\x01(\\t\\x12*\\n\\x03map\\x18\\x06 \\x01(\\x0b\\x32\\x1d.paddlehub.module.desc.KVData\\x12+\\n\\x04list\\x18\\x07 \\x01(\\x0b\\x32\\x1d.paddlehub.module.desc.KVData\\x12*\\n\\x03set\\x18\\x08 \\x01(\\x0b\\x32\\x1d.paddlehub.module.desc.KVData\\x12-\\n\\x06object\\x18\\t 
\\x01(\\x0b\\x32\\x1d.paddlehub.module.desc.KVData\\x12\\x0c\\n\\x04name\\x18\\n \\x01(\\t\\x12\\x0c\\n\\x04info\\x18\\x0b \\x01(\\t\\\"+\\n\\x08\\x46\\x65\\x65\\x64\\x44\\x65sc\\x12\\x10\\n\\x08var_name\\x18\\x01 \\x01(\\t\\x12\\r\\n\\x05\\x61lias\\x18\\x02 \\x01(\\t\\\",\\n\\tFetchDesc\\x12\\x10\\n\\x08var_name\\x18\\x01 \\x01(\\t\\x12\\r\\n\\x05\\x61lias\\x18\\x02 \\x01(\\t\\\"u\\n\\tModuleVar\\x12\\x34\\n\\nfetch_desc\\x18\\x01 \\x03(\\x0b\\x32 .paddlehub.module.desc.FetchDesc\\x12\\x32\\n\\tfeed_desc\\x18\\x02 \\x03(\\x0b\\x32\\x1f.paddlehub.module.desc.FeedDesc\\\"\\xd3\\x01\\n\\nModuleDesc\\x12\\x41\\n\\x08sign2var\\x18\\x02 \\x03(\\x0b\\x32/.paddlehub.module.desc.ModuleDesc.Sign2varEntry\\x12/\\n\\x04\\x61ttr\\x18\\x03 \\x01(\\x0b\\x32!.paddlehub.module.desc.ModuleAttr\\x1aQ\\n\\rSign2varEntry\\x12\\x0b\\n\\x03key\\x18\\x01 \\x01(\\t\\x12/\\n\\x05value\\x18\\x02 \\x01(\\x0b\\x32 .paddlehub.module.desc.ModuleVar:\\x02\\x38\\x01*i\\n\\x08\\x44\\x61taType\\x12\\x08\\n\\x04NONE\\x10\\x00\\x12\\x07\\n\\x03INT\\x10\\x01\\x12\\t\\n\\x05\\x46LOAT\\x10\\x02\\x12\\n\\n\\x06STRING\\x10\\x03\\x12\\x0b\\n\\x07\\x42OOLEAN\\x10\\x04\\x12\\x08\\n\\x04LIST\\x10\\x05\\x12\\x07\\n\\x03MAP\\x10\\x06\\x12\\x07\\n\\x03SET\\x10\\x07\\x12\\n\\n\\x06OBJECT\\x10\\x08\\x42\\x02H\\x03\\x62\\x06proto3'\n    ))\n_sym_db.RegisterFileDescriptor(DESCRIPTOR)\n\n_DATATYPE = _descriptor.EnumDescriptor(\n    name='DataType',\n    full_name='paddlehub.module.desc.DataType',\n    filename=None,\n    file=DESCRIPTOR,\n    values=[\n        _descriptor.EnumValueDescriptor(name='NONE', index=0, number=0, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='INT', index=1, number=1, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='FLOAT', index=2, number=2, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='STRING', index=3, number=3, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='BOOLEAN', index=4, number=4, 
options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='LIST', index=5, number=5, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='MAP', index=6, number=6, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='SET', index=7, number=7, options=None, type=None),\n        _descriptor.EnumValueDescriptor(name='OBJECT', index=8, number=8, options=None, type=None),\n    ],\n    containing_type=None,\n    options=None,\n    serialized_start=1071,\n    serialized_end=1176,\n)\n_sym_db.RegisterEnumDescriptor(_DATATYPE)\n\nDataType = enum_type_wrapper.EnumTypeWrapper(_DATATYPE)\nNONE = 0\nINT = 1\nFLOAT = 2\nSTRING = 3\nBOOLEAN = 4\nLIST = 5\nMAP = 6\nSET = 7\nOBJECT = 8\n\n_KVDATA_KEYTYPEENTRY = _descriptor.Descriptor(\n    name='KeyTypeEntry',\n    full_name='paddlehub.module.desc.KVData.KeyTypeEntry',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='key',\n            full_name='paddlehub.module.desc.KVData.KeyTypeEntry.key',\n            index=0,\n            number=1,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='value',\n            full_name='paddlehub.module.desc.KVData.KeyTypeEntry.value',\n            index=1,\n            number=2,\n            type=14,\n            cpp_type=8,\n            label=1,\n            has_default_value=False,\n            default_value=0,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    
],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=_descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001')),\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=172,\n    serialized_end=251,\n)\n\n_KVDATA_DATAENTRY = _descriptor.Descriptor(\n    name='DataEntry',\n    full_name='paddlehub.module.desc.KVData.DataEntry',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='key',\n            full_name='paddlehub.module.desc.KVData.DataEntry.key',\n            index=0,\n            number=1,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='value',\n            full_name='paddlehub.module.desc.KVData.DataEntry.value',\n            index=1,\n            number=2,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=_descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001')),\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=253,\n    serialized_end=331,\n)\n\n_KVDATA = _descriptor.Descriptor(\n    name='KVData',\n    full_name='paddlehub.module.desc.KVData',\n    filename=None,\n    file=DESCRIPTOR,\n  
  containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='key_type',\n            full_name='paddlehub.module.desc.KVData.key_type',\n            index=0,\n            number=1,\n            type=11,\n            cpp_type=10,\n            label=3,\n            has_default_value=False,\n            default_value=[],\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='data',\n            full_name='paddlehub.module.desc.KVData.data',\n            index=1,\n            number=2,\n            type=11,\n            cpp_type=10,\n            label=3,\n            has_default_value=False,\n            default_value=[],\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[\n        _KVDATA_KEYTYPEENTRY,\n        _KVDATA_DATAENTRY,\n    ],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=45,\n    serialized_end=331,\n)\n\n_MODULEATTR = _descriptor.Descriptor(\n    name='ModuleAttr',\n    full_name='paddlehub.module.desc.ModuleAttr',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='type',\n            full_name='paddlehub.module.desc.ModuleAttr.type',\n            index=0,\n            number=1,\n            type=14,\n            cpp_type=8,\n            label=1,\n            has_default_value=False,\n            default_value=0,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            
extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='i',\n            full_name='paddlehub.module.desc.ModuleAttr.i',\n            index=1,\n            number=2,\n            type=3,\n            cpp_type=2,\n            label=1,\n            has_default_value=False,\n            default_value=0,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='f',\n            full_name='paddlehub.module.desc.ModuleAttr.f',\n            index=2,\n            number=3,\n            type=1,\n            cpp_type=5,\n            label=1,\n            has_default_value=False,\n            default_value=float(0),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='b',\n            full_name='paddlehub.module.desc.ModuleAttr.b',\n            index=3,\n            number=4,\n            type=8,\n            cpp_type=7,\n            label=1,\n            has_default_value=False,\n            default_value=False,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='s',\n            full_name='paddlehub.module.desc.ModuleAttr.s',\n            index=4,\n            number=5,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            
is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='map',\n            full_name='paddlehub.module.desc.ModuleAttr.map',\n            index=5,\n            number=6,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='list',\n            full_name='paddlehub.module.desc.ModuleAttr.list',\n            index=6,\n            number=7,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='set',\n            full_name='paddlehub.module.desc.ModuleAttr.set',\n            index=7,\n            number=8,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='object',\n            full_name='paddlehub.module.desc.ModuleAttr.object',\n            index=8,\n            number=9,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            
containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='name',\n            full_name='paddlehub.module.desc.ModuleAttr.name',\n            index=9,\n            number=10,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='info',\n            full_name='paddlehub.module.desc.ModuleAttr.info',\n            index=10,\n            number=11,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=334,\n    serialized_end=645,\n)\n\n_FEEDDESC = _descriptor.Descriptor(\n    name='FeedDesc',\n    full_name='paddlehub.module.desc.FeedDesc',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='var_name',\n            full_name='paddlehub.module.desc.FeedDesc.var_name',\n            index=0,\n            number=1,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n    
        containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='alias',\n            full_name='paddlehub.module.desc.FeedDesc.alias',\n            index=1,\n            number=2,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=647,\n    serialized_end=690,\n)\n\n_FETCHDESC = _descriptor.Descriptor(\n    name='FetchDesc',\n    full_name='paddlehub.module.desc.FetchDesc',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='var_name',\n            full_name='paddlehub.module.desc.FetchDesc.var_name',\n            index=0,\n            number=1,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='alias',\n            full_name='paddlehub.module.desc.FetchDesc.alias',\n            index=1,\n            number=2,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            
enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=692,\n    serialized_end=736,\n)\n\n_MODULEVAR = _descriptor.Descriptor(\n    name='ModuleVar',\n    full_name='paddlehub.module.desc.ModuleVar',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='fetch_desc',\n            full_name='paddlehub.module.desc.ModuleVar.fetch_desc',\n            index=0,\n            number=1,\n            type=11,\n            cpp_type=10,\n            label=3,\n            has_default_value=False,\n            default_value=[],\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='feed_desc',\n            full_name='paddlehub.module.desc.ModuleVar.feed_desc',\n            index=1,\n            number=2,\n            type=11,\n            cpp_type=10,\n            label=3,\n            has_default_value=False,\n            default_value=[],\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=738,\n    serialized_end=855,\n)\n\n_MODULEDESC_SIGN2VARENTRY = _descriptor.Descriptor(\n    name='Sign2varEntry',\n    full_name='paddlehub.module.desc.ModuleDesc.Sign2varEntry',\n    
filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='key',\n            full_name='paddlehub.module.desc.ModuleDesc.Sign2varEntry.key',\n            index=0,\n            number=1,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='value',\n            full_name='paddlehub.module.desc.ModuleDesc.Sign2varEntry.value',\n            index=1,\n            number=2,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=_descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001')),\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=988,\n    serialized_end=1069,\n)\n\n_MODULEDESC = _descriptor.Descriptor(\n    name='ModuleDesc',\n    full_name='paddlehub.module.desc.ModuleDesc',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='sign2var',\n            full_name='paddlehub.module.desc.ModuleDesc.sign2var',\n            index=0,\n            number=2,\n            type=11,\n            cpp_type=10,\n            label=3,\n            has_default_value=False,\n            default_value=[],\n            message_type=None,\n         
   enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='attr',\n            full_name='paddlehub.module.desc.ModuleDesc.attr',\n            index=1,\n            number=3,\n            type=11,\n            cpp_type=10,\n            label=1,\n            has_default_value=False,\n            default_value=None,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[\n        _MODULEDESC_SIGN2VARENTRY,\n    ],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=858,\n    serialized_end=1069,\n)\n\n_KVDATA_KEYTYPEENTRY.fields_by_name['value'].enum_type = _DATATYPE\n_KVDATA_KEYTYPEENTRY.containing_type = _KVDATA\n_KVDATA_DATAENTRY.fields_by_name['value'].message_type = _MODULEATTR\n_KVDATA_DATAENTRY.containing_type = _KVDATA\n_KVDATA.fields_by_name['key_type'].message_type = _KVDATA_KEYTYPEENTRY\n_KVDATA.fields_by_name['data'].message_type = _KVDATA_DATAENTRY\n_MODULEATTR.fields_by_name['type'].enum_type = _DATATYPE\n_MODULEATTR.fields_by_name['map'].message_type = _KVDATA\n_MODULEATTR.fields_by_name['list'].message_type = _KVDATA\n_MODULEATTR.fields_by_name['set'].message_type = _KVDATA\n_MODULEATTR.fields_by_name['object'].message_type = _KVDATA\n_MODULEVAR.fields_by_name['fetch_desc'].message_type = _FETCHDESC\n_MODULEVAR.fields_by_name['feed_desc'].message_type = _FEEDDESC\n_MODULEDESC_SIGN2VARENTRY.fields_by_name['value'].message_type = _MODULEVAR\n_MODULEDESC_SIGN2VARENTRY.containing_type = _MODULEDESC\n_MODULEDESC.fields_by_name['sign2var'].message_type = _MODULEDESC_SIGN2VARENTRY\n_MODULEDESC.fields_by_name['attr'].message_type = 
_MODULEATTR\nDESCRIPTOR.message_types_by_name['KVData'] = _KVDATA\nDESCRIPTOR.message_types_by_name['ModuleAttr'] = _MODULEATTR\nDESCRIPTOR.message_types_by_name['FeedDesc'] = _FEEDDESC\nDESCRIPTOR.message_types_by_name['FetchDesc'] = _FETCHDESC\nDESCRIPTOR.message_types_by_name['ModuleVar'] = _MODULEVAR\nDESCRIPTOR.message_types_by_name['ModuleDesc'] = _MODULEDESC\nDESCRIPTOR.enum_types_by_name['DataType'] = _DATATYPE\n\nKVData = _reflection.GeneratedProtocolMessageType(\n    'KVData',\n    (_message.Message, ),\n    dict(\n        KeyTypeEntry=_reflection.GeneratedProtocolMessageType(\n            'KeyTypeEntry',\n            (_message.Message, ),\n            dict(\n                DESCRIPTOR=_KVDATA_KEYTYPEENTRY,\n                __module__='module_desc_pb2'\n                # @@protoc_insertion_point(class_scope:paddlehub.module.desc.KVData.KeyTypeEntry)\n            )),\n        DataEntry=_reflection.GeneratedProtocolMessageType(\n            'DataEntry',\n            (_message.Message, ),\n            dict(\n                DESCRIPTOR=_KVDATA_DATAENTRY,\n                __module__='module_desc_pb2'\n                # @@protoc_insertion_point(class_scope:paddlehub.module.desc.KVData.DataEntry)\n            )),\n        DESCRIPTOR=_KVDATA,\n        __module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.KVData)\n    ))\n_sym_db.RegisterMessage(KVData)\n_sym_db.RegisterMessage(KVData.KeyTypeEntry)\n_sym_db.RegisterMessage(KVData.DataEntry)\n\nModuleAttr = _reflection.GeneratedProtocolMessageType(\n    'ModuleAttr',\n    (_message.Message, ),\n    dict(\n        DESCRIPTOR=_MODULEATTR,\n        __module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.ModuleAttr)\n    ))\n_sym_db.RegisterMessage(ModuleAttr)\n\nFeedDesc = _reflection.GeneratedProtocolMessageType(\n    'FeedDesc',\n    (_message.Message, ),\n    dict(\n        DESCRIPTOR=_FEEDDESC,\n        
__module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.FeedDesc)\n    ))\n_sym_db.RegisterMessage(FeedDesc)\n\nFetchDesc = _reflection.GeneratedProtocolMessageType(\n    'FetchDesc',\n    (_message.Message, ),\n    dict(\n        DESCRIPTOR=_FETCHDESC,\n        __module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.FetchDesc)\n    ))\n_sym_db.RegisterMessage(FetchDesc)\n\nModuleVar = _reflection.GeneratedProtocolMessageType(\n    'ModuleVar',\n    (_message.Message, ),\n    dict(\n        DESCRIPTOR=_MODULEVAR,\n        __module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.ModuleVar)\n    ))\n_sym_db.RegisterMessage(ModuleVar)\n\nModuleDesc = _reflection.GeneratedProtocolMessageType(\n    'ModuleDesc',\n    (_message.Message, ),\n    dict(\n        Sign2varEntry=_reflection.GeneratedProtocolMessageType(\n            'Sign2varEntry',\n            (_message.Message, ),\n            dict(\n                DESCRIPTOR=_MODULEDESC_SIGN2VARENTRY,\n                __module__='module_desc_pb2'\n                # @@protoc_insertion_point(class_scope:paddlehub.module.desc.ModuleDesc.Sign2varEntry)\n            )),\n        DESCRIPTOR=_MODULEDESC,\n        __module__='module_desc_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.module.desc.ModuleDesc)\n    ))\n_sym_db.RegisterMessage(ModuleDesc)\n_sym_db.RegisterMessage(ModuleDesc.Sign2varEntry)\n\nDESCRIPTOR.has_options = True\nDESCRIPTOR._options = _descriptor._ParseOptions(descriptor_pb2.FileOptions(), _b('H\\003'))\n_KVDATA_KEYTYPEENTRY.has_options = True\n_KVDATA_KEYTYPEENTRY._options = _descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001'))\n_KVDATA_DATAENTRY.has_options = True\n_KVDATA_DATAENTRY._options = _descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001'))\n_MODULEDESC_SIGN2VARENTRY.has_options = True\n_MODULEDESC_SIGN2VARENTRY._options 
= _descriptor._ParseOptions(descriptor_pb2.MessageOptions(), _b('8\\001'))\n# @@protoc_insertion_point(module_scope)\n"
  },
  {
    "path": "paddlehub/compat/module/module_v1.py",
"content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport functools\nimport os\nfrom typing import List\nfrom typing import Tuple\n\nimport paddle.io\nimport paddle2onnx\nfrom easydict import EasyDict\n\nfrom paddlehub.compat import paddle_utils\nfrom paddlehub.compat.module import module_v1_utils\nfrom paddlehub.utils import log\nfrom paddlehub.utils import utils\n\n\nclass ModuleV1(object):\n    '''\n    ModuleV1 is an old version of the PaddleHub Module format, which is no longer in use. In order to maintain\n    compatibility, users can still load the corresponding Module for prediction. 
Users should call `hub.Module`\n    to initialize the corresponding object, rather than `ModuleV1`.\n    '''\n\n    # All ModuleV1 modules in PaddleHub are static graph models\n    @paddle_utils.run_in_static_mode\n    def __init__(self, name: str = None, directory: str = None, version: str = None):\n        if not directory:\n            return\n\n        desc_file = os.path.join(directory, 'module_desc.pb')\n        self.desc = module_v1_utils.convert_module_desc(desc_file)\n        self.helper = self\n        self.signatures = self.desc.signatures\n        self.default_signature = self.desc.default_signature\n\n        self.directory = directory\n        self._load_model()\n        self._load_parameters()\n        self._load_processor()\n        self._load_assets()\n        self._load_extra_info()\n        self._generate_func()\n\n    def _load_processor(self):\n        # Some modules do not have a processor (e.g. ernie)\n        if not 'processor_info' in self.desc:\n            return\n\n        python_path = os.path.join(self.directory, 'python')\n        processor_name = self.desc.processor_info\n        self.processor = utils.load_py_module(python_path, processor_name)\n        self.processor = self.processor.Processor(module=self)\n\n    def _load_assets(self):\n        self.assets = []\n        for file in os.listdir(self.assets_path()):\n            filepath = os.path.join(self.assets_path(), file)\n            self.assets.append(filepath)\n\n    def _load_parameters(self):\n        global_block = self.program.global_block()\n\n        # record num parameters loaded by PaddleHub\n        num_param_loaded = 0\n\n        for param, attrs in self.desc.param_attrs.items():\n            name = self.desc.name_prefix + param\n            if not name in global_block.vars:\n                continue\n\n            num_param_loaded += 1\n            var = global_block.vars[name]\n\n            # Since the pre-trained model saved by the old version of Paddle cannot restore the 
corresponding\n            # parameters, we need to restore them manually.\n            global_block.create_parameter(name=name,\n                                          shape=var.shape,\n                                          dtype=var.dtype,\n                                          type=var.type,\n                                          lod_level=var.lod_level,\n                                          error_clip=var.error_clip,\n                                          stop_gradient=var.stop_gradient,\n                                          is_data=var.is_data,\n                                          **attrs)\n\n        log.logger.info('{} pretrained parameters loaded by PaddleHub'.format(num_param_loaded))\n\n    def _load_extra_info(self):\n        if not 'extra_info' in self.desc:\n            return\n\n        for key, value in self.desc.extra_info.items():\n            self.__dict__['get_{}'.format(key)] = value\n\n    def _generate_func(self):\n        for signature in self.desc.signatures:\n            self.__dict__[signature] = functools.partial(self.__call__, sign_name=signature)\n\n    def _load_model(self):\n        model_path = os.path.join(self.directory, 'model')\n        exe = paddle.static.Executor(paddle.CPUPlace())\n        self.program, _, _ = paddle.static.load_inference_model(model_path, executor=exe)\n\n        # Clear the callstack since it may leak the privacy of the creator.\n        for block in self.program.blocks:\n            for op in block.ops:\n                if not 'op_callstack' in op.all_attrs():\n                    continue\n                op._set_attr('op_callstack', [''])\n\n    @paddle_utils.run_in_static_mode\n    def context(self,\n                signature: str = None,\n                for_test: bool = False,\n                trainable: bool = True,\n                max_seq_len: int = 128) -> Tuple[dict, dict, paddle.static.Program]:\n        '''Get module context information, including graph 
structure and graph input and output variables.'''\n        program = self.program.clone(for_test=for_test)\n        paddle_utils.remove_feed_fetch_op(program)\n\n        # generate feed vars and fetch vars from signatures\n        feed_dict = {}\n        fetch_dict = {}\n        varinfos = [self.desc.signatures[signature]] if signature else self.desc.signatures.values()\n\n        for info in varinfos:\n            for feed_var in info.inputs:\n                paddle_var = program.global_block().vars[feed_var.name]\n                feed_dict[feed_var.alias] = paddle_var\n\n            for fetch_var in info.outputs:\n                paddle_var = program.global_block().vars[fetch_var.name]\n                fetch_dict[fetch_var.alias] = paddle_var\n\n        for param in program.all_parameters():\n            param.trainable = trainable\n\n        # The bert series model saved by ModuleV1 sets max_seq_len to 512 by default. We need to adjust max_seq_len\n        # according to the parameters in actual use.\n        if 'bert' in self.name or self.name.startswith('ernie'):\n            self._update_bert_max_seq_len(program, feed_dict, max_seq_len)\n\n        return feed_dict, fetch_dict, program\n\n    def _update_bert_max_seq_len(self, program: paddle.static.Program, feed_dict: dict, max_seq_len: int = 128):\n        MAX_SEQ_LENGTH = 512\n        if max_seq_len > MAX_SEQ_LENGTH or max_seq_len <= 0:\n            raise ValueError(\"max_seq_len({}) should be in the range of [1, {}]\".format(max_seq_len, MAX_SEQ_LENGTH))\n        log.logger.info(\"Set maximum sequence length of input tensor to {}\".format(max_seq_len))\n        if self.name.startswith(\"ernie_v2\"):\n            feed_list = [\"input_ids\", \"position_ids\", \"segment_ids\", \"input_mask\", \"task_ids\"]\n        else:\n            feed_list = [\"input_ids\", \"position_ids\", \"segment_ids\", \"input_mask\"]\n        for tensor_name in feed_list:\n            seq_tensor_shape = [-1, max_seq_len, 1]\n      
      log.logger.info(\"The shape of input tensor[{}] set to {}\".format(tensor_name, seq_tensor_shape))\n            program.global_block().var(feed_dict[tensor_name].name).desc.set_shape(seq_tensor_shape)\n\n    @paddle_utils.run_in_static_mode\n    def __call__(self, sign_name: str, data: dict, use_gpu: bool = False, batch_size: int = 1, **kwargs):\n        '''Call the specified signature function for prediction.'''\n\n        def _get_reader_and_feeder(data_format, data, place):\n\n            def _reader(process_data):\n                for item in zip(*process_data):\n                    yield item\n\n            nonlocal feed_dict\n            process_data = []\n            feed_name_list = []\n            feed_list = []\n            for key in data_format:\n                process_data.append([value['processed'] for value in data[key]])\n                feed_name_list.append(data_format[key]['feed_key'])\n                feed_list.append(feed_dict[key])\n            loader = paddle.io.DataLoader.from_generator(feed_list=feed_list, capacity=1)\n            return functools.partial(_reader, process_data=process_data), loader\n\n        feed_dict, fetch_dict, program = self.context(signature=sign_name, for_test=True)\n        fetch_list = list([value for key, value in fetch_dict.items()])\n        with paddle.static.program_guard(program):\n            result = []\n            index = 0\n            place = paddle.CUDAPlace(0) if use_gpu else paddle.CPUPlace()\n\n            exe = paddle.static.Executor(place=place)\n            data = self.processor.preprocess(sign_name=sign_name, data_dict=data)\n            data_format = self.processor.data_format(sign_name=sign_name)\n            reader, loader = _get_reader_and_feeder(data_format, data, place)\n            reader = paddle.batch(reader, batch_size=batch_size)\n            loader.set_sample_list_generator(reader, places=place)\n            for batch in loader():\n                data_out = 
exe.run(feed=batch, fetch_list=fetch_list, return_numpy=False)\n                sub_data = {key: value[index:index + len(batch)] for key, value in data.items()}\n                result += self.processor.postprocess(sign_name, data_out, sub_data, **kwargs)\n                index += len(batch)\n\n        return result\n\n    @classmethod\n    def get_py_requirements(cls) -> List[str]:\n        '''Get Module's python package dependency list.'''\n        return []\n\n    @classmethod\n    def load(cls, directory: str) -> EasyDict:\n        '''Load the Module object defined in the specified directory.'''\n        module_info = cls.load_module_info(directory)\n\n        # Generate a uuid based on the class information, and dynamically create a new type.\n        # If we do not do this, the information generated later will overwrite the information\n        # previously generated.\n        cls_uuid = utils.md5(module_info.name + module_info.author + module_info.author_email + module_info.type +\n                             module_info.summary + module_info.version + directory)\n        cls = type('ModuleV1_{}'.format(cls_uuid), (cls, ), {})\n\n        cls.name = module_info.name\n        cls.author = module_info.author\n        cls.author_email = module_info.author_email\n        cls.type = module_info.type\n        cls.summary = module_info.summary\n        cls.version = utils.Version(module_info.version)\n        cls.directory = directory\n        return cls\n\n    @classmethod\n    def load_module_info(cls, directory: str) -> EasyDict:\n        '''Load the information of the Module object defined in the specified directory.'''\n        desc_file = os.path.join(directory, 'module_desc.pb')\n        desc = module_v1_utils.convert_module_desc(desc_file)\n\n        # The naming of some old versions of Module is not standardized and may contain uppercase\n        # letters. 
This will cause the path of these modules to be incorrect after installation.\n        module_info = desc.module_info\n        module_info.name = module_info.name.lower()\n        return module_info\n\n    def assets_path(self):\n        return os.path.join(self.directory, 'assets')\n\n    def get_name_prefix(self):\n        return self.desc.name_prefix\n\n    @property\n    def is_runnable(self):\n        '''\n        Whether the Module is runnable, in other words, whether we can execute the Module through the\n        `hub run` command.\n        '''\n        return self.default_signature is not None\n\n    @paddle_utils.run_in_static_mode\n    def save_inference_model(self,\n                             dirname: str,\n                             model_filename: str = None,\n                             params_filename: str = None,\n                             combined: bool = False,\n                             **kwargs):\n        '''\n        Export the model to Paddle Inference format.\n\n        Args:\n            dirname(str): The directory to save the paddle inference model.\n            model_filename(str): The name of the saved model file. Default to `__model__`.\n            params_filename(str): The name of the saved parameters file, only takes effect when `combined` is True.\n                Default to `__params__`.\n            combined(bool): Whether to save all parameters in a combined file. 
Default to False.\n        '''\n        if hasattr(self, 'processor'):\n            if hasattr(self.processor, 'save_inference_model'):\n                return self.processor.save_inference_model(dirname, model_filename, params_filename, combined)\n\n        model_filename = '__model__' if not model_filename else model_filename\n        if combined:\n            params_filename = '__params__' if not params_filename else params_filename\n\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        feed_dict, fetch_dict, program = self.context(for_test=True, trainable=False)\n        paddle.static.save_inference_model(dirname=dirname,\n                                           main_program=program,\n                                           executor=exe,\n                                           feeded_var_names=[var.name for var in list(feed_dict.values())],\n                                           target_vars=list(fetch_dict.values()),\n                                           model_filename=model_filename,\n                                           params_filename=params_filename)\n\n        log.logger.info('Paddle Inference model saved in {}.'.format(dirname))\n\n    @paddle_utils.run_in_static_mode\n    def export_onnx_model(self, dirname: str, **kwargs):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            **kwargs(dict|optional): Other export configuration options for compatibility, some may be removed\n                in the future. Don't use them if not necessary. 
Refer to https://github.com/PaddlePaddle/paddle2onnx\n                for more information.\n        '''\n        feed_dict, fetch_dict, program = self.context(for_test=True, trainable=False)\n        inputs = set([var.name for var in feed_dict.values()])\n        if self.type == 'CV/classification':\n            outputs = [fetch_dict['class_probs']]\n        else:\n            outputs = set([var.name for var in fetch_dict.values()])\n            outputs = [program.global_block().vars[key] for key in outputs]\n\n        save_file = os.path.join(dirname, '{}.onnx'.format(self.name))\n        paddle2onnx.program2onnx(program=program,\n                                 scope=paddle.static.global_scope(),\n                                 feed_var_names=inputs,\n                                 target_vars=outputs,\n                                 save_file=save_file,\n                                 **kwargs)\n\n    def sub_modules(self, recursive: bool = True):\n        '''\n        Get all sub modules.\n\n        Args:\n            recursive(bool): Whether to get sub modules recursively. Default to True.\n        '''\n        return []\n"
  },
  {
    "path": "paddlehub/compat/module/module_v1_utils.py",
"content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom easydict import EasyDict\n\nfrom paddlehub.compat.module import module_desc_pb2\n\n\ndef convert_module_desc(desc_file):\n    desc = module_desc_pb2.ModuleDesc()\n    with open(desc_file, 'rb') as file:\n        desc.ParseFromString(file.read())\n\n    result = convert_attr(desc.attr)\n    result.signatures = convert_signatures(desc.sign2var)\n    return result\n\n\ndef convert_signatures(signmaps):\n    _dict = EasyDict()\n    for sign, var in signmaps.items():\n        _dict[sign] = EasyDict(inputs=[], outputs=[])\n        for fetch_var in var.fetch_desc:\n            _dict[sign].outputs.append(EasyDict(name=fetch_var.var_name, alias=fetch_var.alias))\n\n        for feed_var in var.feed_desc:\n            _dict[sign].inputs.append(EasyDict(name=feed_var.var_name, alias=feed_var.alias))\n\n    return _dict\n\n\ndef convert_attr(module_attr):\n    if module_attr.type == 1:\n        return module_attr.i\n    elif module_attr.type == 2:\n        return module_attr.f\n    elif module_attr.type == 3:\n        return module_attr.s\n    elif module_attr.type == 4:\n        return module_attr.b\n\n    _dict = EasyDict()\n    for key, val in module_attr.map.data.items():\n        _dict[key] = convert_attr(val)\n    return _dict\n"
  },
  {
    "path": "paddlehub/compat/module/nlp_module.py",
"content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport argparse\nimport ast\nimport os\nimport re\nfrom typing import Any\nfrom typing import List\nfrom typing import Text\nfrom typing import Tuple\n\nimport numpy as np\nimport paddle\nimport six\nfrom paddle.framework import core\n\nfrom paddlehub.compat import paddle_utils\nfrom paddlehub.compat.task.config import RunConfig\nfrom paddlehub.compat.task.reader import ClassifyReader\nfrom paddlehub.compat.task.transformer_emb_task import TransformerEmbeddingTask\nfrom paddlehub.module.module import RunModule\nfrom paddlehub.module.module import runnable\nfrom paddlehub.utils.parser import txt_parser\nfrom paddlehub.utils.utils import sys_stdin_encoding\n\n\nclass DataFormatError(Exception):\n\n    def __init__(self, *args):\n        self.args = args\n\n\nclass NLPBaseModule(RunModule):\n\n    def get_vocab_path(self):\n        '''\n        Get the path to the vocabulary which was used for pretraining\n        Returns:\n             self.vocab_path(str): the path to vocabulary\n        '''\n        return self.vocab_path\n\n\nclass NLPPredictionModule(NLPBaseModule):\n\n    def _set_config(self):\n        '''predictor config setting'''\n        cpu_config = core.AnalysisConfig(self.pretrained_model_path)\n        cpu_config.disable_glog_info()\n        cpu_config.disable_gpu()\n        self.cpu_predictor = 
core.create_paddle_predictor(cpu_config)\n\n        try:\n            _places = os.environ['CUDA_VISIBLE_DEVICES']\n            int(_places[0])\n            use_gpu = True\n        except:\n            use_gpu = False\n        if use_gpu:\n            gpu_config = core.AnalysisConfig(self.pretrained_model_path)\n            gpu_config.disable_glog_info()\n            gpu_config.enable_use_gpu(memory_pool_init_size_mb=500, device_id=0)\n            self.gpu_predictor = core.create_paddle_predictor(gpu_config)\n\n    def texts2tensor(self, texts: List[dict]) -> paddle.Tensor:\n        '''\n        Transform the texts(dict) to PaddleTensor\n        Args:\n             texts(list): each element is a dict that must have a key named 'processed' whose value is word_ids, such as\n                          texts = [{'processed': [23, 89, 43, 906]}]\n        Returns:\n             tensor(PaddleTensor): tensor with texts data\n        '''\n        lod = [0]\n        data = []\n        for i, text in enumerate(texts):\n            data += text['processed']\n            lod.append(len(text['processed']) + lod[i])\n        tensor = core.PaddleTensor(np.array(data).astype('int64'))\n        tensor.name = 'words'\n        tensor.lod = [lod]\n        tensor.shape = [lod[-1], 1]\n        return tensor\n\n    def to_unicode(self, texts: str) -> Text:\n        '''\n        Convert each element's type(str) of texts(list) to unicode in python2.7\n        Args:\n             texts(list): each element's type is str in python2.7\n        Returns:\n             texts(list): each element's type is unicode in python2.7\n        '''\n        if six.PY2:\n            unicode_texts = []\n            for text in texts:\n                if isinstance(text, six.string_types):\n                    unicode_texts.append(text.decode(sys_stdin_encoding()).decode('utf8'))\n                else:\n                    unicode_texts.append(text)\n            texts = unicode_texts\n        return texts\n\n    
@runnable\n    def run_cmd(self, argvs: List[Any]):\n        '''Run as a command'''\n        self.parser = argparse.ArgumentParser(description='Run the %s module.' % self.name,\n                                              prog='hub run %s' % self.name,\n                                              usage='%(prog)s',\n                                              add_help=True)\n\n        self.arg_input_group = self.parser.add_argument_group(title='Input options', description='Input data. Required')\n        self.arg_config_group = self.parser.add_argument_group(\n            title='Config options', description='Run configuration for controlling module behavior, not required.')\n\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n\n        args = self.parser.parse_args(argvs)\n\n        try:\n            input_data = self.check_input_data(args)\n        except (DataFormatError, RuntimeError):\n            self.parser.print_help()\n            return None\n\n        results = self.predict(texts=input_data, use_gpu=args.use_gpu, batch_size=args.batch_size)\n\n        return results\n\n    def add_module_config_arg(self):\n        '''Add the command config options'''\n        self.arg_config_group.add_argument('--use_gpu',\n                                           type=ast.literal_eval,\n                                           default=False,\n                                           help='whether to use GPU for prediction')\n\n        self.arg_config_group.add_argument('--batch_size', type=int, default=1, help='batch size for prediction')\n\n    def add_module_input_arg(self):\n        '''Add the command input options'''\n        self.arg_input_group.add_argument('--input_file', type=str, default=None, help='file containing the input data')\n        self.arg_input_group.add_argument('--input_text', type=str, default=None, help='text to predict')\n\n    def check_input_data(self, args):\n        input_data = []\n        if args.input_file:\n     
       if not os.path.exists(args.input_file):\n                raise FileNotFoundError('File %s does not exist.' % args.input_file)\n            else:\n                input_data = txt_parser.parse(args.input_file, use_strip=True)\n        elif args.input_text:\n            input_data = [args.input_text]\n\n        return input_data\n\n\nclass TransformerModule(NLPBaseModule):\n    '''\n    Transformer module base class, which can be used by BERT, ERNIE, RoBERTa and so on.\n    '''\n\n    def __init__(self,\n                 name: str = None,\n                 directory: str = None,\n                 module_dir: List = None,\n                 version: str = None,\n                 max_seq_len: int = 128,\n                 **kwargs):\n        if not directory:\n            return\n        super(TransformerModule, self).__init__(name=name,\n                                                directory=directory,\n                                                module_dir=module_dir,\n                                                version=version,\n                                                **kwargs)\n\n        self.max_seq_len = max_seq_len\n\n    def init_pretraining_params(self, exe: paddle.static.Executor, pretraining_params_path: str,\n                                main_program: paddle.static.Program):\n        assert os.path.exists(pretraining_params_path), '[{}] can\\'t be found.'.format(pretraining_params_path)\n\n        def existed_params(var):\n            if not isinstance(var, paddle.device.framework.Parameter):\n                return False\n            return os.path.exists(os.path.join(pretraining_params_path, var.name))\n\n        paddle.static.load(executor=exe,\n                           model_path=pretraining_params_path,\n                           program=main_program,\n                           var_list=main_program.all_parameters())\n\n    def param_prefix(self) -> str:\n        return '@HUB_%s@' % self.name\n\n    
@paddle_utils.run_in_static_mode\n    def context(\n        self,\n        max_seq_len: int = None,\n        trainable: bool = True,\n        num_slots: int = 1,\n    ) -> Tuple[dict, dict, paddle.static.Program]:\n        '''\n        Get inputs, outputs and program from the pre-trained module.\n        Args:\n            max_seq_len (int): It will limit the total sequence returned so that it has a maximum length.\n            trainable (bool): Whether to fine-tune the pre-trained module parameters or not.\n            num_slots(int): The number of data inputs fed to the model, selected from the following options:\n                - 1(default): Only one input is fed to the model, e.g. the module is used for a sentence classification task.\n                - 2: Two inputs are fed to the model, e.g. the module is used for a text matching task (point-wise).\n                - 3: Three inputs are fed to the model, e.g. the module is used for a text matching task (pair-wise).\n        Returns: inputs, outputs, program.\n                 inputs is a dict with keys named input_ids, position_ids, segment_ids, input_mask and task_ids.\n                 outputs is a dict with two keys named pooled_output and sequence_output.\n        '''\n        assert num_slots >= 1 and num_slots <= 3, 'num_slots must be 1, 2, or 3, but the input is %d' % num_slots\n        if not max_seq_len:\n            max_seq_len = self.max_seq_len\n\n        assert max_seq_len <= self.MAX_SEQ_LEN and max_seq_len >= 1, 'max_seq_len({}) should be in the range of [1, {}]'.format(\n            max_seq_len, self.MAX_SEQ_LEN)\n\n        module_program = paddle.static.Program()\n        startup_program = paddle.static.Program()\n        with paddle.static.program_guard(module_program, startup_program):\n            with paddle.utils.unique_name.guard():\n                input_ids = paddle.static.data(name='input_ids', shape=[-1, max_seq_len, 1], dtype='int64', lod_level=0)\n  
              position_ids = paddle.static.data(name='position_ids',\n                                                  shape=[-1, max_seq_len, 1],\n                                                  dtype='int64',\n                                                  lod_level=0)\n                segment_ids = paddle.static.data(name='segment_ids',\n                                                 shape=[-1, max_seq_len, 1],\n                                                 dtype='int64',\n                                                 lod_level=0)\n                input_mask = paddle.static.data(name='input_mask',\n                                                shape=[-1, max_seq_len, 1],\n                                                dtype='float32',\n                                                lod_level=0)\n                pooled_output, sequence_output = self.net(input_ids, position_ids, segment_ids, input_mask)\n\n                data_list = [(input_ids, position_ids, segment_ids, input_mask)]\n                output_name_list = [(pooled_output.name, sequence_output.name)]\n\n                if num_slots > 1:\n                    input_ids_2 = paddle.static.data(name='input_ids_2',\n                                                     shape=[-1, max_seq_len, 1],\n                                                     dtype='int64',\n                                                     lod_level=0)\n                    position_ids_2 = paddle.static.data(name='position_ids_2',\n                                                        shape=[-1, max_seq_len, 1],\n                                                        dtype='int64',\n                                                        lod_level=0)\n                    segment_ids_2 = paddle.static.data(name='segment_ids_2',\n                                                       shape=[-1, max_seq_len, 1],\n                                                       dtype='int64',\n                             
                          lod_level=0)\n                    input_mask_2 = paddle.static.data(name='input_mask_2',\n                                                      shape=[-1, max_seq_len, 1],\n                                                      dtype='float32',\n                                                      lod_level=0)\n                    pooled_output_2, sequence_output_2 = self.net(input_ids_2, position_ids_2, segment_ids_2,\n                                                                  input_mask_2)\n                    data_list.append((input_ids_2, position_ids_2, segment_ids_2, input_mask_2))\n                    output_name_list.append((pooled_output_2.name, sequence_output_2.name))\n\n                if num_slots > 2:\n                    input_ids_3 = paddle.static.data(name='input_ids_3',\n                                                     shape=[-1, max_seq_len, 1],\n                                                     dtype='int64',\n                                                     lod_level=0)\n                    position_ids_3 = paddle.static.data(name='position_ids_3',\n                                                        shape=[-1, max_seq_len, 1],\n                                                        dtype='int64',\n                                                        lod_level=0)\n                    segment_ids_3 = paddle.static.data(name='segment_ids_3',\n                                                       shape=[-1, max_seq_len, 1],\n                                                       dtype='int64',\n                                                       lod_level=0)\n                    input_mask_3 = paddle.static.data(name='input_mask_3',\n                                                      shape=[-1, max_seq_len, 1],\n                                                      dtype='float32',\n                                                      lod_level=0)\n                    pooled_output_3, 
sequence_output_3 = self.net(input_ids_3, position_ids_3, segment_ids_3,\n                                                                  input_mask_3)\n                    data_list.append((input_ids_3, position_ids_3, segment_ids_3, input_mask_3))\n                    output_name_list.append((pooled_output_3.name, sequence_output_3.name))\n\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        # To be compatible with the module v1\n        vars = filter(\n            lambda var: var not in [\n                'input_ids', 'position_ids', 'segment_ids', 'input_mask', 'input_ids_2', 'position_ids_2',\n                'segment_ids_2', 'input_mask_2', 'input_ids_3', 'position_ids_3', 'segment_ids_3', 'input_mask_3'\n            ], list(module_program.global_block().vars.keys()))\n        paddle_utils.add_vars_prefix(program=module_program, prefix=self.param_prefix(), vars=vars)\n        self.init_pretraining_params(exe, self.params_path, main_program=module_program)\n\n        self.params_layer = {}\n        for param in module_program.global_block().iter_parameters():\n            param.trainable = trainable\n            match = re.match(r'.*layer_(\\d+).*', param.name)\n            if match:\n                # layer num begins from 0\n                layer = match.group(1)\n                self.params_layer[param.name] = int(layer)\n\n        inputs = {}\n        outputs = {}\n        for index, data in enumerate(data_list):\n\n            if index == 0:\n                inputs['input_ids'] = data[0]\n                inputs['position_ids'] = data[1]\n                inputs['segment_ids'] = data[2]\n                inputs['input_mask'] = data[3]\n                outputs['pooled_output'] = module_program.global_block().vars[self.param_prefix() +\n                                                                              output_name_list[0][0]]\n                outputs['sequence_output'] = 
module_program.global_block().vars[self.param_prefix() +\n                                                                                output_name_list[0][1]]\n            else:\n                inputs['input_ids_%s' % (index + 1)] = data[0]\n                inputs['position_ids_%s' % (index + 1)] = data[1]\n                inputs['segment_ids_%s' % (index + 1)] = data[2]\n                inputs['input_mask_%s' % (index + 1)] = data[3]\n                outputs['pooled_output_%s' %\n                        (index + 1)] = module_program.global_block().vars[self.param_prefix() +\n                                                                          output_name_list[index][0]]\n                outputs['sequence_output_%s' %\n                        (index + 1)] = module_program.global_block().vars[self.param_prefix() +\n                                                                          output_name_list[index][1]]\n\n        return inputs, outputs, module_program\n\n    @paddle_utils.run_in_static_mode\n    def get_embedding(self, texts: List[str], max_seq_len: int = 512, use_gpu: bool = False, batch_size: int = 1):\n        '''\n        get pooled_output and sequence_output for input texts.\n        Warnings: this method depends on Paddle Inference Library, it may not work properly in PaddlePaddle <= 1.6.2.\n        Args:\n            texts (list): each element is a text sample, each sample include text_a and text_b where text_b can be omitted.\n                          for example: [[sample0_text_a, sample0_text_b], [sample1_text_a, sample1_text_b], ...]\n            max_seq_len (int): the max sequence length.\n            use_gpu (bool): use gpu or not, default False.\n            batch_size (int): the data batch size, default 1.\n        Returns:\n            pooled_outputs(list): its element is a numpy array, the first feature of each text sample.\n            sequence_outputs(list): its element is a numpy array, the whole features of each text 
sample.\n        '''\n        if not hasattr(self,\n                       'emb_job') or self.emb_job['batch_size'] != batch_size or self.emb_job['use_gpu'] != use_gpu:\n            inputs, outputs, program = self.context(trainable=True, max_seq_len=max_seq_len)\n\n            reader = ClassifyReader(\n                dataset=None,\n                vocab_path=self.get_vocab_path(),\n                max_seq_len=max_seq_len,\n                sp_model_path=self.get_spm_path() if hasattr(self, 'get_spm_path') else None,\n                word_dict_path=self.get_word_dict_path() if hasattr(self, 'word_dict_path') else None)\n\n            feed_list = [\n                inputs['input_ids'].name,\n                inputs['position_ids'].name,\n                inputs['segment_ids'].name,\n                inputs['input_mask'].name,\n            ]\n\n            pooled_feature, seq_feature = outputs['pooled_output'], outputs['sequence_output']\n\n            config = RunConfig(use_data_parallel=False, use_cuda=use_gpu, batch_size=batch_size)\n\n            self.emb_job = {}\n            self.emb_job['task'] = TransformerEmbeddingTask(\n                pooled_feature=pooled_feature,\n                seq_feature=seq_feature,\n                feed_list=feed_list,\n                data_reader=reader,\n                config=config,\n            )\n            self.emb_job['batch_size'] = batch_size\n            self.emb_job['use_gpu'] = use_gpu\n\n        return self.emb_job['task'].predict(data=texts, return_result=True, accelerate_mode=True)\n\n    def get_spm_path(self) -> str:\n        if hasattr(self, 'spm_path'):\n            return self.spm_path\n        return None\n\n    def get_word_dict_path(self) -> str:\n        if hasattr(self, 'word_dict_path'):\n            return self.word_dict_path\n        return None\n\n    def get_params_layer(self) -> dict:\n        if not hasattr(self, 'params_layer'):\n            raise AttributeError('The module context has not been 
initialized. '\n                                 'Please call context() before using get_params_layer')\n        return self.params_layer\n"
  },
  {
    "path": "paddlehub/compat/module/processor.py",
    "content": "#coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import List\n\n\nclass BaseProcessor(object):\n    '''Base class for module data processors.'''\n\n    def __init__(self, module):\n        ...\n\n    def configs(self) -> List:\n        return []\n\n    def preprocess(self, signature: str, data: dict):\n        '''Preprocess the input data of the given signature.'''\n        raise NotImplementedError('BaseProcessor\\'s preprocess should not be called!')\n\n    def postprocess(self, signature: str, data_out: dict, data_info: dict, **kwargs):\n        '''Postprocess the output data of the given signature.'''\n        raise NotImplementedError('BaseProcessor\\'s postprocess should not be called!')\n\n    def data_format(self, signature: str):\n        '''Return the data format description of the given signature.'''\n        raise NotImplementedError('BaseProcessor\\'s data_format should not be called!')\n"
  },
  {
    "path": "paddlehub/compat/paddle_utils.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport contextlib\nimport copy\nfrom typing import Callable\nfrom typing import List\n\nimport paddle\nfrom paddle.framework import core\n\nfrom paddlehub.utils.utils import Version\n\ndtype_map = {\n    core.VarDesc.VarType.FP32: \"float32\",\n    core.VarDesc.VarType.FP64: \"float64\",\n    core.VarDesc.VarType.FP16: \"float16\",\n    core.VarDesc.VarType.INT32: \"int32\",\n    core.VarDesc.VarType.INT16: \"int16\",\n    core.VarDesc.VarType.INT64: \"int64\",\n    core.VarDesc.VarType.BOOL: \"bool\",\n    core.VarDesc.VarType.UINT8: \"uint8\",\n    core.VarDesc.VarType.INT8: \"int8\",\n}\n\n\ndef convert_dtype_to_string(dtype: core.VarDesc.VarType) -> str:\n    if dtype in dtype_map:\n        return dtype_map[dtype]\n    raise TypeError(\"dtype should be one of %s\" % list(dtype_map.keys()))\n\n\ndef get_variable_info(var: paddle.static.Variable) -> dict:\n    if not isinstance(var, paddle.static.Variable):\n        raise TypeError(\"var should be an instance of paddle.static.Variable\")\n\n    var_info = {\n        'name': var.name,\n        'stop_gradient': var.stop_gradient,\n        'is_data': var.is_data,\n        'error_clip': var.error_clip,\n        'type': var.type\n    }\n\n    try:\n        var_info['dtype'] = convert_dtype_to_string(var.dtype)\n        
var_info['lod_level'] = var.lod_level\n        var_info['shape'] = var.shape\n    except:\n        pass\n\n    if isinstance(var, paddle.device.framework.Parameter):\n        var_info['trainable'] = var.trainable\n        var_info['optimize_attr'] = var.optimize_attr\n        var_info['regularizer'] = var.regularizer\n        if Version(paddle.__version__) < '1.8':\n            var_info['gradient_clip_attr'] = var.gradient_clip_attr\n        var_info['do_model_average'] = var.do_model_average\n    else:\n        var_info['persistable'] = var.persistable\n\n    return var_info\n\n\ndef remove_feed_fetch_op(program: paddle.static.Program):\n    '''Remove feed and fetch operator and variable for fine-tuning.'''\n    block = program.global_block()\n    need_to_remove_op_index = []\n\n    for i, op in enumerate(block.ops):\n        if op.type == 'feed' or op.type == \"fetch\":\n            need_to_remove_op_index.append(i)\n\n    for index in need_to_remove_op_index[::-1]:\n        block._remove_op(index)\n\n    need_to_remove_var = []\n    for var in block.vars:\n        if var.endswith(\"feed\"):\n            need_to_remove_var.append(var)\n        if var.endswith('fetch'):\n            need_to_remove_var.append(var)\n\n    for var in need_to_remove_var:\n        block._remove_var(var)\n\n    program.desc.flush()\n\n\ndef rename_var(block: paddle.device.framework.Block, old_name: str, new_name: str):\n    '''\n    '''\n    for op in block.ops:\n        for input_name in op.input_arg_names:\n            if input_name == old_name:\n                op._rename_input(old_name, new_name)\n\n        for output_name in op.output_arg_names:\n            if output_name == old_name:\n                op._rename_output(old_name, new_name)\n\n    block._rename_var(old_name, new_name)\n\n\ndef add_vars_prefix(program: paddle.static.Program,\n                    prefix: str,\n                    vars: List[paddle.static.Variable] = None,\n                    excludes: Callable = 
None):\n    '''\n    '''\n    block = program.global_block()\n    vars = list(vars) if vars else list(block.vars.keys())\n    vars = [var for var in vars if var not in excludes] if excludes else vars\n    for var in vars:\n        rename_var(block, var, prefix + var)\n\n\ndef remove_vars_prefix(program: paddle.static.Program,\n                       prefix: str,\n                       vars: List[paddle.static.Variable] = None,\n                       excludes: Callable = None):\n    '''\n    '''\n    block = program.global_block()\n    vars = [var for var in vars\n            if var.startswith(prefix)] if vars else [var for var in block.vars.keys() if var.startswith(prefix)]\n    vars = [var for var in vars if var not in excludes] if excludes else vars\n    for var in vars:\n        rename_var(block, var, var.replace(prefix, '', 1))\n\n\ndef clone_program(origin_program: paddle.static.Program, for_test: bool = False) -> paddle.static.Program:\n    dest_program = paddle.static.Program()\n\n    _copy_vars_and_ops_in_blocks(origin_program.global_block(), dest_program.global_block())\n\n    dest_program = dest_program.clone(for_test=for_test)\n    if not for_test:\n        for name, var in origin_program.global_block().vars.items():\n            dest_program.global_block().vars[name].stop_gradient = var.stop_gradient\n\n    return dest_program\n\n\ndef _copy_vars_and_ops_in_blocks(from_block: paddle.device.framework.Block, to_block: paddle.device.framework.Block):\n    for var in from_block.vars:\n        var = from_block.var(var)\n        var_info = copy.deepcopy(get_variable_info(var))\n        if isinstance(var, paddle.device.framework.Parameter):\n            to_block.create_parameter(**var_info)\n        else:\n            to_block.create_var(**var_info)\n\n    for op in from_block.ops:\n        all_attrs = op.all_attrs()\n        if 'sub_block' in all_attrs:\n            _sub_block = to_block.program._create_block()\n            
_copy_vars_and_ops_in_blocks(all_attrs['sub_block'], _sub_block)\n            to_block.program._rollback()\n            new_attrs = {'sub_block': _sub_block}\n            for key, value in all_attrs.items():\n                if key == 'sub_block':\n                    continue\n                new_attrs[key] = copy.deepcopy(value)\n        else:\n            new_attrs = copy.deepcopy(all_attrs)\n\n        op_info = {\n            'type': op.type,\n            'inputs':\n            {input: [to_block._find_var_recursive(var) for var in op.input(input)]\n             for input in op.input_names},\n            'outputs':\n            {output: [to_block._find_var_recursive(var) for var in op.output(output)]\n             for output in op.output_names},\n            'attrs': new_attrs\n        }\n        to_block.append_op(**op_info)\n\n\ndef set_op_attr(program: paddle.static.Program, is_test: bool = False):\n    for block in program.blocks:\n        for op in block.ops:\n            if not op.has_attr('is_test'):\n                continue\n\n            op._set_attr('is_test', is_test)\n\n\n@contextlib.contextmanager\ndef static_mode_guard():\n    '''enter static graph mode with `with` statement.'''\n    premode = 'static' if not paddle.in_dynamic_mode() else 'dynamic'\n\n    if premode == 'dynamic':\n        paddle.enable_static()\n\n    yield\n\n    if premode == 'dynamic':\n        paddle.disable_static()\n\n\ndef run_in_static_mode(func):\n    '''Decorate a function to run in static graph mode.'''\n\n    def runner(*args, **kwargs):\n        with static_mode_guard():\n            return func(*args, **kwargs)\n\n    return runner\n"
  },
  {
    "path": "paddlehub/compat/task/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/compat/task/base_task.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport contextlib\nimport inspect\nimport os\nfrom functools import partial\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Generator\nfrom typing import Generic\nfrom typing import Iterator\nfrom typing import List\nfrom typing import Union\n\nimport numpy as np\nimport paddle\nfrom paddle.framework import core\nfrom visualdl import LogWriter\n\nfrom paddlehub.compat import paddle_utils\nfrom paddlehub.compat.task.checkpoint import load_checkpoint\nfrom paddlehub.compat.task.config import RunConfig\nfrom paddlehub.compat.task.hook import TaskHooks\nfrom paddlehub.compat.task.task_utils import RunEnv\nfrom paddlehub.compat.task.task_utils import RunState\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import generate_tempdir\n\n\nclass BaseTask(object):\n    '''\n    BaseTask is the base class of all tasks. 
It builds up the whole running environment for the task.\n    Args:\n        main_program (object): the customized main_program, default None\n        startup_program (object): the customized startup_program, default None\n        config (object): the config for the task, default None\n        metrics_choices (list): metrics used by the task, default ['acc']\n    '''\n\n    def __init__(self,\n                 dataset: Iterator = None,\n                 feed_list: List = None,\n                 data_reader: Generic = None,\n                 main_program: paddle.static.Program = None,\n                 startup_program: paddle.static.Program = None,\n                 config: RunConfig = None,\n                 metrics_choices: List[str] = None):\n        # metrics item\n        self.best_score = -999\n        if not metrics_choices:\n            metrics_choices = ['acc']\n        elif metrics_choices is None:\n            metrics_choices = []\n        if isinstance(metrics_choices, list):\n            self.metrics_choices = metrics_choices\n        else:\n            self.metrics_choices = [metrics_choices]\n\n        if main_program is None:\n            self._base_main_program = paddle_utils.clone_program(paddle.static.default_main_program(), for_test=False)\n        else:\n            self._base_main_program = paddle_utils.clone_program(main_program, for_test=False)\n        if startup_program is None:\n            self._base_startup_program = paddle_utils.clone_program(paddle.static.default_startup_program(),\n                                                                    for_test=False)\n        else:\n            self._base_startup_program = paddle_utils.clone_program(startup_program, for_test=False)\n        self.is_checkpoint_loaded = False\n        self._base_compiled_program = None\n\n        # run config\n        self.config = config if config else RunConfig()\n        self.place = self.places[0]\n        self.device_count = len(self.places)\n\n  
      if self.config.use_data_parallel:\n            if not self.config.use_pyreader and self.config.batch_size < self.device_count:\n                logger.warning(\n                    'Batch size({}) is less than the count of devices({}), which is not allowed in current Paddle versions'\n                    .format(self.config.batch_size, self.device_count))\n                logger.warning('Batch size automatically adjusted to {}'.format(self.device_count))\n                self.config._batch_size = self.device_count\n\n        self.exe = paddle.static.Executor(place=self.place)\n        self.build_strategy = paddle.static.BuildStrategy()\n\n        # run environment\n        self._phases = []\n        self._envs = {}\n        self._predict_data = None\n        self._vdl_writer = None\n\n        # event hooks\n        self._hooks = TaskHooks()\n        for hook_type, event_hooks in self._hooks._registered_hooks.items():\n            self._hooks.add(hook_type, 'default', eval('self._default_{}'.format(hook_type)))\n            setattr(BaseTask, '_{}'.format(hook_type), self.create_event_function(hook_type))\n\n        # accelerate predict\n        self.is_best_model_loaded = False\n        self._predictor = None\n\n        # set default phase\n        self.enter_phase('train')\n\n        self.dataset = dataset\n        if dataset:\n            self._label_list = dataset.get_labels()\n        else:\n            self._label_list = None\n\n        self._base_data_reader = data_reader\n        self._base_feed_list = feed_list\n\n        self._compatible_mode = True if data_reader else False\n\n    @contextlib.contextmanager\n    def phase_guard(self, phase: str):\n        self.enter_phase(phase)\n        yield\n        self.exit_phase()\n\n    def enter_phase(self, phase: str):\n        if phase not in ['train', 'val', 'dev', 'test', 'predict', 'inference']:\n            raise RuntimeError('Unknown phase {}.'.format(phase))\n        if phase in ['val', 'dev']:\n      
      phase = 'dev'\n        elif phase in ['predict', 'inference']:\n            phase = 'predict'\n        self._phases.append(phase)\n\n    def exit_phase(self):\n        self._phases = self._phases[:-1]\n\n    def init_if_necessary(self):\n        if not self.is_checkpoint_loaded:\n            if not self.load_checkpoint():\n                self.exe.run(self._base_startup_program)\n            self.is_checkpoint_loaded = True\n            self.is_best_model_loaded = False\n\n    def init_if_load_best_model(self):\n        if not self.is_best_model_loaded:\n            best_model_path = os.path.join(self.config.checkpoint_dir, \"best_model\")\n            logger.info(\"Load the best model from %s\" % best_model_path)\n            if os.path.exists(best_model_path):\n                self.load_parameters(best_model_path)\n                self.is_checkpoint_loaded = False\n                self.is_best_model_loaded = True\n            else:\n                self.init_if_necessary()\n        else:\n            logger.info(\"The best model has been loaded\")\n\n    def _build_env(self):\n        '''Building the program and strategy for specific running phase.'''\n        if self.env.is_inititalized:\n            return\n\n        self._build_env_start_event()\n        self.env.is_inititalized = True\n        self.env.main_program = paddle_utils.clone_program(self._base_main_program, for_test=False)\n\n        self.env.startup_program = paddle.static.Program()\n        with paddle.static.program_guard(self.env.main_program, self._base_startup_program):\n            with paddle.utils.unique_name.guard(self.env.UNG):\n                self.env.outputs = self._build_net()\n                if self.is_train_phase or self.is_test_phase:\n                    self.env.labels = self._add_label()\n                    self.env.loss = self._add_loss()\n                    self.env.metrics = self._add_metrics()\n\n        if self.is_predict_phase or self.is_test_phase:\n            
self.env.main_program = paddle_utils.clone_program(self.env.main_program, for_test=True)\n            paddle_utils.set_op_attr(self.env.main_program, is_test=True)\n\n        if self.is_train_phase:\n            with paddle.static.program_guard(self.env.main_program, self._base_startup_program):\n                with paddle.utils.unique_name.guard(self.env.UNG):\n                    if self._compatible_mode:\n                        # This branch is compatible code for usage deprecated in paddlehub v1.8.\n                        self._base_data_reader.data_generator(batch_size=self.config.batch_size,\n                                                              phase='train',\n                                                              shuffle=True)\n                        num_train_examples = self._base_data_reader.num_examples['train']\n                        try:\n                            # nlp_reader\n                            _in_tokens = self._base_data_reader.in_tokens\n                            if _in_tokens:\n                                num_train_examples *= self._base_data_reader.max_seq_len\n                        except:\n                            # cv_reader without .in_tokens and .max_seq_len\n                            ...\n                    else:\n                        num_train_examples = len(self.dataset.get_train_records())\n\n                    self.max_train_steps = self.config.num_epoch * num_train_examples // self.config.batch_size // self.device_count\n                    self.scheduled_lr = self.config.strategy.execute(self.loss, self.max_train_steps)\n\n        if self.is_train_phase:\n            loss_name = self.env.loss.name\n        else:\n            loss_name = None\n\n        share_vars_from = self._base_compiled_program\n\n        if not self.config.use_data_parallel:\n            self.env.main_program_compiled = None\n        else:\n            self.env.main_program_compiled = 
paddle.static.CompiledProgram(self.env.main_program).with_data_parallel(\n                loss_name=loss_name,\n                share_vars_from=share_vars_from,\n                build_strategy=self.build_strategy,\n                places=self.places)\n\n        self.exe.run(self.env.startup_program)\n        self._build_env_end_event()\n\n    @property\n    def places(self) -> List[Union[paddle.CPUPlace, paddle.CUDAPlace]]:\n        if self.config.use_cuda:\n            _places = paddle.device.framework.cuda_places()\n        else:\n            _places = paddle.device.framework.cpu_places()\n\n        if not self.config.use_data_parallel:\n            return [_places[0]]\n        return _places\n\n    @property\n    def return_numpy(self) -> bool:\n        return True\n\n    @property\n    def is_train_phase(self) -> bool:\n        return self.phase in ['train']\n\n    @property\n    def is_test_phase(self) -> bool:\n        return self.phase in ['val', 'dev', 'test']\n\n    @property\n    def is_predict_phase(self) -> bool:\n        return self.phase in ['predict', 'inference']\n\n    @property\n    def phase(self) -> str:\n        return self._phases[-1]\n\n    @property\n    def env(self) -> RunEnv:\n        phase = self.phase\n        if phase in ['val', 'dev', 'test']:\n            phase = 'dev'\n        if not phase in self._envs:\n            self._envs[phase] = RunEnv()\n        return self._envs[phase]\n\n    @property\n    def current_step(self) -> int:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.current_step\n\n    @property\n    def current_epoch(self) -> int:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.current_epoch\n\n    @property\n    def main_program(self) -> paddle.static.Program:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.main_program\n\n    @property\n    def startup_program(self) -> 
paddle.static.Program:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.startup_program\n\n    @property\n    def main_program_compiled(self) -> paddle.static.CompiledProgram:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.main_program_compiled\n\n    @property\n    def main_program_to_be_run(self) -> Union[paddle.static.Program, paddle.static.CompiledProgram]:\n        if self.config.use_data_parallel:\n            if self._base_compiled_program is None:\n                self._base_compiled_program = self.env.main_program_compiled\n            return self.main_program_compiled\n        return self.main_program\n\n    @property\n    def generator(self) -> Generator:\n\n        def data_generator(records):\n\n            def wrapper():\n                for record in records:\n                    values = []\n                    for feed_name in self.feed_list:\n                        values.append(record[feed_name])\n                    yield values\n\n            return wrapper\n\n        if self._compatible_mode:\n            self.env.generator = self._base_data_reader.data_generator(batch_size=self.config.batch_size,\n                                                                       phase=self.phase,\n                                                                       data=self._predict_data,\n                                                                       return_list=True)\n        else:\n            if self.is_predict_phase:\n                records = self._predict_data\n            else:\n                if self.is_train_phase:\n                    shuffle = True\n                else:\n                    shuffle = False\n                records = self.dataset.get_records(phase=self.phase, shuffle=shuffle)\n            self.env.generator = data_generator(records)\n\n        return self.env.generator\n\n    @property\n    def loss(self) -> 
paddle.static.Variable:\n        if self.is_predict_phase:\n            raise RuntimeError('Loss cannot be obtained in the prediction phase.')\n\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.loss\n\n    @property\n    def labels(self) -> List[paddle.static.Variable]:\n        if self.is_predict_phase:\n            raise RuntimeError('Labels cannot be obtained in the prediction phase.')\n\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.labels\n\n    @property\n    def outputs(self) -> List[paddle.static.Variable]:\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.outputs\n\n    @property\n    def metrics(self) -> List[str]:\n        if self.is_predict_phase:\n            raise RuntimeError('Metrics cannot be obtained in the prediction phase.')\n\n        if not self.env.is_inititalized:\n            self._build_env()\n        return self.env.metrics\n\n    @property\n    def unique_name_generator(self):\n        return self.env.UNG\n\n    @property\n    def feed_list(self) -> List[str]:\n        if not self.env.is_inititalized:\n            self._build_env()\n\n        if self._predict_data:\n            feed_list = list(self._predict_data[0].keys())\n        else:\n            feed_list = self.dataset.get_feed_list(self.phase)\n\n        feed_list = [feed_name for feed_name in feed_list if feed_name in self.main_program.global_block().vars]\n        return feed_list\n\n    @property\n    def feed_var_list(self) -> List[paddle.static.Variable]:\n        if not self.env.is_inititalized:\n            self._build_env()\n\n        vars = self.main_program.global_block().vars\n        return [vars[varname] for varname in self.feed_list]\n\n    @property\n    def fetch_list(self) -> List[str]:\n        if self.is_train_phase or self.is_test_phase:\n            return [metric.name for metric in self.metrics] + 
[self.loss.name]\n        return [output.name for output in self.outputs]\n\n    @property\n    def fetch_var_list(self) -> List[paddle.static.Variable]:\n        vars = self.main_program.global_block().vars\n        return [vars[varname] for varname in self.fetch_list]\n\n    @property\n    def vdl_writer(self) -> LogWriter:\n        '''\n        get the vdl_writer for visualization.\n        '''\n        if not os.path.exists(self.config.checkpoint_dir):\n            os.mkdir(self.config.checkpoint_dir)\n        tb_log_dir = os.path.join(self.config.checkpoint_dir, 'visualization')\n        if not self._vdl_writer:\n            self._vdl_writer = LogWriter(tb_log_dir)\n        return self._vdl_writer\n\n    def create_event_function(self, hook_type: str) -> Callable:\n        '''\n        create a handler function for a specific event.\n        Args:\n            hook_type (str): specific event name\n        Returns:\n            func: executable function, the class method will receive a parameter named self.\n        '''\n\n        def hook_function(self, *args):\n            # every handler registered in self._hooks[hook_type] will be executed\n            for name, func in self._hooks[hook_type].items():\n                if inspect.ismethod(func):\n                    func(*args)\n                else:\n                    partial(func, self)(*args)\n\n        return hook_function\n\n    @property\n    def hooks(self) -> List[dict]:\n        return self._hooks\n\n    def hooks_info(self, show_default: bool = False) -> str:\n        '''\n        get the hooks information, including the source code.\n        Args:\n            show_default (bool): show the information of PaddleHub default hooks or not, default False\n        Returns:\n            str: the formatted string of the hooks information\n        '''\n        return self._hooks.info(show_default)\n\n    def add_hook(self, hook_type: str, name: str = None, func: Callable = None):\n        '''\n        add the 
handler function to a specific event.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name, default None\n            func (func): the handler function, default None\n        '''\n        if name is None:\n            name = 'hook_{}'.format(id(func))\n        self._hooks.add(hook_type, name=name, func=func)\n        logger.info('Add hook {}:{} successfully'.format(hook_type, name))\n\n    def delete_hook(self, hook_type: str, name: str):\n        '''\n        delete the handler function of a specific event.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name\n        '''\n        self._hooks.delete(hook_type, name)\n        logger.info('Delete hook {}:{} successfully'.format(hook_type, name))\n\n    def modify_hook(self, hook_type: str, name: str, func: Callable):\n        '''\n         modify the handler function of a specific event.\n         Args:\n             hook_type (str): the specific event name\n             name (str): the handler function name\n             func (func): the new handler function\n         '''\n        self._hooks.modify(hook_type, name, func)\n        logger.info('Modify hook {}:{} successfully'.format(hook_type, name))\n\n    def _default_build_env_start_event(self):\n        ...\n\n    def _default_build_env_end_event(self):\n        if not self.is_predict_phase:\n            self.env.score_scalar = {}\n\n    def _default_finetune_start_event(self):\n        logger.info('PaddleHub finetune start')\n\n    def _default_finetune_end_event(self, run_states: List[RunState]):\n        logger.info('PaddleHub finetune finished.')\n\n    def _default_predict_start_event(self):\n        logger.info('PaddleHub predict start')\n\n    def _default_predict_end_event(self, run_states: List[RunState]):\n        logger.info('PaddleHub predict finished.')\n\n    def _default_eval_start_event(self):\n        
logger.info('Evaluation on {} dataset start'.format(self.phase))\n\n    def _default_eval_end_event(self, run_states: List[RunState]):\n        '''\n        PaddleHub default handler for eval_end_event; it completes visualization and metrics calculation.\n        Args:\n            run_states (object): the results in eval phase\n        '''\n        eval_scores, eval_loss, run_speed = self._calculate_metrics(run_states)\n        if 'train' in self._envs:\n            self.vdl_writer.add_scalar(tag='Loss_{}'.format(self.phase),\n                                       value=eval_loss,\n                                       step=self._envs['train'].current_step)\n\n        log_scores = ''\n        for metric in eval_scores:\n            if 'train' in self._envs:\n                self.vdl_writer.add_scalar(tag='{}_{}'.format(metric, self.phase),\n                                           value=eval_scores[metric],\n                                           step=self._envs['train'].current_step)\n\n            log_scores += '{}={:.5f} '.format(metric, eval_scores[metric])\n        logger.eval('[{} dataset evaluation result] loss={:.5f} {}[step/sec: {:.2f}]'.format(\n            self.phase, eval_loss, log_scores, run_speed))\n\n        eval_scores_items = eval_scores.items()\n        if len(eval_scores_items):\n            # The first metric will be chosen as the criterion for the best model\n            main_metric, main_value = list(eval_scores_items)[0]\n        else:\n            logger.warning('No metrics have been implemented, loss will be used to evaluate.')\n            # The larger, the better\n            main_metric, main_value = 'negative loss', -eval_loss\n        if self.phase in ['dev', 'val'] and main_value > self.best_score:\n            self.best_score = main_value\n            model_saved_dir = os.path.join(self.config.checkpoint_dir, 'best_model')\n            logger.eval('best model saved to {} [best {}={:.5f}]'.format(model_saved_dir, main_metric, main_value))\n       
     self.save_inference_model(dirname=model_saved_dir)\n\n    def _default_log_interval_event(self, run_states: List[RunState]):\n        '''\n        PaddleHub default handler for log_interval_event; it completes visualization.\n        Args:\n            run_states (object): the results in train phase\n        '''\n        scores, avg_loss, run_speed = self._calculate_metrics(run_states)\n        self.vdl_writer.add_scalar(tag='Loss_{}'.format(self.phase),\n                                   value=avg_loss,\n                                   step=self._envs['train'].current_step)\n        log_scores = ''\n        for metric in scores:\n            self.vdl_writer.add_scalar(tag='{}_{}'.format(metric, self.phase),\n                                       value=scores[metric],\n                                       step=self._envs['train'].current_step)\n            log_scores += '{}={:.5f} '.format(metric, scores[metric])\n        logger.train('step {} / {}: loss={:.5f} {}[step/sec: {:.2f}]'.format(self.current_step, self.max_train_steps,\n                                                                             avg_loss, log_scores, run_speed))\n\n    def _default_save_ckpt_interval_event(self):\n        self.save_checkpoint()\n\n    def _default_eval_interval_event(self):\n        self.eval(phase='dev')\n\n    def _default_run_step_event(self, run_state: RunState):\n        ...\n\n    def _build_net(self):\n        raise NotImplementedError\n\n    def _add_loss(self):\n        raise NotImplementedError\n\n    def _add_label(self):\n        raise NotImplementedError\n\n    def _add_metrics(self):\n        # Some metrics like acc, auc\n        # The others can be calculated in the _calculate_metrics function\n        raise NotImplementedError\n\n    def _calculate_metrics(self, run_states: List[RunState]):\n        # NOTE: if you want to customize the metrics,\n        # you should make sure that the first parameter returned is a dict\n        # The 
first key will be used as the main metric to update the best model\n        raise NotImplementedError\n\n    def load_checkpoint(self):\n        is_load_successful, self.env.current_epoch, self.env.current_step, self.best_score = load_checkpoint(\n            self.config.checkpoint_dir, self.exe, main_program=self.main_program)\n\n        # Revise max_train_steps for incremental training\n        if is_load_successful:\n            self.max_train_steps = self.env.current_step + self.max_train_steps / self.config.num_epoch * (\n                self.config.num_epoch - self.env.current_epoch + 1)\n        return is_load_successful\n\n    def load_parameters(self, dirname):\n        paddle.static.load(executor=self.exe, model_path=dirname, program=self.main_program)\n\n    def save_inference_model(self, dirname: str, model_filename: str = None, params_filename: str = None):\n        with self.phase_guard('predict'):\n            paddle.static.save_inference_model(dirname=dirname,\n                                               executor=self.exe,\n                                               main_program=self.main_program,\n                                               feeded_var_names=self.feed_list,\n                                               target_vars=self.fetch_var_list,\n                                               model_filename=model_filename,\n                                               params_filename=params_filename)\n\n    def finetune_and_eval(self) -> List[RunState]:\n        return self.finetune(do_eval=True)\n\n    def finetune(self, do_eval: bool = False) -> List[RunState]:\n        '''\n        train and finetune the module parameters.\n        Args:\n            do_eval (bool): do eval during train phase or not\n        Returns:\n            RunState: the running result of train phase\n        '''\n\n        # Start 
to finetune\n        with self.phase_guard(phase='train'):\n            self.init_if_necessary()\n            self._finetune_start_event()\n            run_states = []\n            if self.current_epoch <= self.config.num_epoch:\n                while self.current_epoch <= self.config.num_epoch:\n                    self.config.strategy.step()\n                    run_states = self._run(do_eval=do_eval)\n                    self.env.current_epoch += 1\n\n                # Final evaluation\n                if self._compatible_mode:\n                    dev_examples = self._base_data_reader.get_dev_examples()\n                    test_examples = self._base_data_reader.get_test_examples()\n                else:\n                    dev_examples = self.dataset.get_dev_examples()\n                    test_examples = self.dataset.get_test_examples()\n                if dev_examples != []:\n                    # Warning: DO NOT use self.eval(phase='dev', load_best_model=True) during training.\n                    # It would leave the trainer unable to continue training from the checkpoint after eval.\n                    # More importantly, the model should be evaluated on its current performance during training.\n                    self.eval(phase='dev')\n                if test_examples != []:\n                    self.eval(phase='test', load_best_model=True)\n\n                # Save checkpoint after finetune\n                self.save_checkpoint()\n\n            self._finetune_end_event(run_states)\n            return run_states\n\n    def eval(self, phase: str = 'dev', load_best_model: bool = False) -> List[RunState]:\n        '''\n        evaluate the performance of the current module.\n        Args:\n            phase (str): current run phase\n            load_best_model (bool): load the best model or not\n        Returns:\n            RunState: the running result of eval phase\n        '''\n        # Warning: DO NOT use eval(load_best_model=True) in finetune_and_eval\n        # It will 
leave the trainer unable to continue training from the checkpoint after eval\n        # More importantly, the model should be evaluated on its current performance during training.\n        with self.phase_guard(phase=phase):\n            if load_best_model:\n                self.init_if_load_best_model()\n            else:\n                self.init_if_necessary()\n            self._eval_start_event()\n            run_states = self._run()\n            self._eval_end_event(run_states)\n            return run_states\n\n    def _create_predictor(self) -> core.PaddlePredictor:\n        '''\n        create a high-performance predictor for prediction.\n        Returns:\n            PaddlePredictor: the high-performance predictor\n        '''\n        with generate_tempdir() as _dir:\n            self.save_inference_model(dirname=_dir)\n            predictor_config = core.AnalysisConfig(_dir)\n            predictor_config.disable_glog_info()\n\n            if self.config.use_cuda:\n                predictor_config.enable_use_gpu(100, 0)\n                predictor_config.switch_ir_optim(True)\n            else:\n                predictor_config.disable_gpu()\n            predictor_config.enable_memory_optim()\n            return core.create_paddle_predictor(predictor_config)\n\n    def _run_with_predictor(self) -> List[RunState]:\n        '''\n        use the high-performance predictor to make predictions.\n        Returns:\n            RunState: the running result of predict phase\n        '''\n        global_run_states = []\n        period_run_states = []\n\n        feed_var_shape = []\n        feed_var_type = []\n        for var in self.feed_var_list:\n            feed_var_shape.append(var.shape)\n            feed_var_type.append(paddle_utils.dtype_map[var.dtype])\n\n        data_reader = self.generator\n        for batch in data_reader():\n\n            step_run_state = RunState(len(self.fetch_list))\n            step_run_state.run_step = 1\n            num_batch_examples = len(batch)\n\n            
# Preprocess the data into the shape and type expected by the model\n            processed_batch = [[] for i in range(len(self.feed_list))]\n\n            for sample in batch:\n                for i, data in enumerate(sample):\n                    processed_batch[i].append(data)\n            tensor_batch = [[] for i in range(len(self.feed_list))]\n            for i in range(len(processed_batch)):\n                processed_batch[i] = np.array(processed_batch[i]).reshape(feed_var_shape[i]).astype(feed_var_type[i])\n                tensor_batch[i] = core.PaddleTensor(processed_batch[i])\n\n            fetch_result = self._predictor.run(tensor_batch)\n            for index, result in enumerate(fetch_result):\n                step_run_state.run_results[index] = result.as_ndarray()\n            step_run_state.run_examples += num_batch_examples\n            step_run_state.update()\n            period_run_states += [step_run_state]\n            self._run_step_event(step_run_state)\n\n        global_run_states += period_run_states\n        return global_run_states\n\n    def predict(\n        self,\n        data: List[Any] = None,\n        label_list: List[Any] = None,\n        load_best_model: bool = True,\n        return_result: bool = True,\n        accelerate_mode: bool = True,\n    ) -> List[RunState]:\n        '''\n        make predictions for the input data.\n        Args:\n            data (list): the data to be predicted. Its element should be a record when the task is initialized without the data_reader param,\n                         or a plaintext string list when the task is initialized with the data_reader param (deprecated in paddlehub v1.8).\n            label_list (list): the label list, used to postprocess the output.\n            return_result (bool): return a readable result or just the raw run result. 
Always True when the task is initialized with the dataset parameter rather than data_reader.\n            accelerate_mode (bool): use the high-performance predictor or not\n        Returns:\n            RunState: the running result of predict phase\n        '''\n        self.accelerate_mode = accelerate_mode\n\n        with self.phase_guard(phase='predict'):\n            self._predict_data = data\n            if label_list:\n                self._label_list = label_list\n            self._predict_start_event()\n\n            if load_best_model:\n                self.init_if_load_best_model()\n\n            if not self.accelerate_mode:\n                run_states = self._run()\n            else:\n                if not self._predictor:\n                    self._predictor = self._create_predictor()\n                run_states = self._run_with_predictor()\n\n            self._predict_end_event(run_states)\n            self._predict_data = None\n            if return_result:\n                return self._postprocessing(run_states)\n        return run_states\n\n    def _postprocessing(self, run_states: List[RunState]) -> List:\n        '''\n        postprocess the run result to get a readable result.\n        Args:\n            run_states (RunState): the raw run result to be processed\n        Returns:\n            list: readable result\n        '''\n        results = []\n        for batch_state in run_states:\n            batch_result = batch_state.run_results[0]\n            results += [result[0] for result in batch_result]\n        return results\n\n    def _run(self, do_eval: bool = False) -> List[RunState]:\n        '''\n        load data and run the program.\n        Args:\n            do_eval (bool): do eval during train phase or not\n        Returns:\n            RunState: the running result of specific phase\n        '''\n        with paddle.static.program_guard(self.main_program, self.startup_program):\n            data_loader = 
paddle.io.DataLoader.from_generator(feed_list=self.feed_var_list,\n                                                              capacity=64,\n                                                              use_double_buffer=True,\n                                                              iterable=True)\n            if self.is_predict_phase:\n                data_reader = data_loader.set_sample_generator(self.generator,\n                                                               places=self.places,\n                                                               batch_size=self.config.batch_size,\n                                                               drop_last=False)\n            else:\n                data_reader = data_loader.set_sample_generator(self.generator,\n                                                               places=self.places,\n                                                               batch_size=self.config.batch_size,\n                                                               drop_last=True)\n\n            global_run_states = []\n            period_run_states = []\n            for batch in data_reader():\n                step_run_state = RunState(len(self.fetch_list))\n                step_run_state.run_step = 1\n\n                # get the batch_data_size\n                tmp_name = list(batch[0].keys())[0]\n                tmp = np.array(batch[0][tmp_name])\n                num_batch_examples = tmp.shape[0]\n\n                fetch_result = self.exe.run(self.main_program_to_be_run,\n                                            feed=batch,\n                                            fetch_list=self.fetch_list,\n                                            return_numpy=self.return_numpy)\n                if not self.return_numpy:\n                    fetch_result = [np.array(x) for x in fetch_result]\n\n                for index, result in enumerate(fetch_result):\n                    step_run_state.run_results[index] = 
result\n                step_run_state.run_examples += num_batch_examples\n                step_run_state.update()\n                period_run_states += [step_run_state]\n                self.env.current_step += 1\n                if self.is_train_phase:\n                    if self.current_step % self.config.log_interval == 0:\n                        self._log_interval_event(period_run_states)\n                        global_run_states += period_run_states\n                        period_run_states = []\n\n                    if self.config.save_ckpt_interval and self.current_step % self.config.save_ckpt_interval == 0:\n                        self._save_ckpt_interval_event()\n\n                    if do_eval and self.current_step % self.config.eval_interval == 0:\n                        self._eval_interval_event()\n\n                self._run_step_event(step_run_state)\n\n            global_run_states += period_run_states\n            return global_run_states\n\n    def __repr__(self) -> str:\n        return 'Task: {} with metrics_choices: {}, {}'.format(self.__class__.__name__, self.metrics_choices,\n                                                              self.config)\n"
  },
  {
    "path": "paddlehub/compat/task/batch.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n'''Mask, padding and batching.'''\n\nfrom typing import List, Union\n\nimport numpy as np\n\n\ndef pad_batch_data(insts: List,\n                   pad_idx: int = 0,\n                   max_seq_len: int = 128,\n                   return_pos: bool = False,\n                   return_input_mask: bool = False,\n                   return_max_len: bool = False,\n                   return_num_token: bool = False,\n                   return_seq_lens: bool = False) -> Union[List, np.ndarray]:\n    '''\n    Pad the instances to the max sequence length in the batch, and generate the\n    corresponding position data and input mask.\n    '''\n    return_list = []\n    #max_len = max(len(inst) for inst in insts)\n    max_len = max_seq_len\n    # Any token included in dict can be used to pad, since the paddings' loss\n    # will be masked out by weights and have no effect on parameter gradients.\n\n    inst_data = np.array([list(inst) + list([pad_idx] * (max_len - len(inst))) for inst in insts])\n    return_list += [inst_data.astype('int64').reshape([-1, max_len, 1])]\n\n    # position data\n    if return_pos:\n        inst_pos = np.array([list(range(0, len(inst))) + [pad_idx] * (max_len - len(inst)) for inst in insts])\n\n        return_list += [inst_pos.astype('int64').reshape([-1, max_len, 1])]\n\n    if return_input_mask:\n        # 
This is used to avoid attention on paddings.\n        input_mask_data = np.array([[1] * len(inst) + [0] * (max_len - len(inst)) for inst in insts])\n        input_mask_data = np.expand_dims(input_mask_data, axis=-1)\n        return_list += [input_mask_data.astype('float32')]\n\n    if return_max_len:\n        return_list += [max_len]\n\n    if return_num_token:\n        num_token = 0\n        for inst in insts:\n            num_token += len(inst)\n        return_list += [num_token]\n\n    if return_seq_lens:\n        seq_lens = np.array([len(inst) for inst in insts])\n        return_list += [seq_lens.astype('int64').reshape([-1, 1])]\n\n    return return_list if len(return_list) > 1 else return_list[0]\n"
  },
  {
    "path": "paddlehub/compat/task/checkpoint.proto",
    "content": "\n// Copyright 2019 The Paddle Authors. All Rights Reserved.\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n//     http://www.apache.org/licenses/LICENSE-2.0\n//\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n// =============================================================================\n\nsyntax = \"proto3\";\noption optimize_for = LITE_RUNTIME;\n\npackage paddlehub.task.checkpoint;\n\nmessage CheckPoint {\n  int64 current_epoch = 1;\n  int64 global_step = 2;\n  string latest_model_dir = 3;\n  double best_score = 4;\n}"
  },
  {
    "path": "paddlehub/compat/task/checkpoint.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Tuple\n\nimport paddle\n\nfrom paddlehub.compat.task import checkpoint_pb2\nfrom paddlehub.utils.log import logger\n\nCKPT_FILE_NAME = 'ckpt.meta'\n\n\ndef load_checkpoint(checkpoint_dir: str, exe: paddle.static.Executor,\n                    main_program: paddle.static.Program) -> Tuple[bool, int, int, float]:\n\n    ckpt_meta_path = os.path.join(checkpoint_dir, CKPT_FILE_NAME)\n    ckpt = checkpoint_pb2.CheckPoint()\n    logger.info('Try loading checkpoint from {}'.format(ckpt_meta_path))\n    if os.path.exists(ckpt_meta_path):\n        with open(ckpt_meta_path, 'rb') as f:\n            ckpt.ParseFromString(f.read())\n    current_epoch = 1\n    global_step = 0\n    best_score = -999\n\n    if ckpt.latest_model_dir:\n        paddle.static.load(executor=exe, model_path=ckpt.latest_model_dir, program=main_program)\n\n        # Compatible with older versions without best_score in checkpoint_pb2\n        try:\n            best_score = ckpt.best_score\n        except:\n            best_score = -999\n\n        logger.info('PaddleHub model checkpoint loaded. 
current_epoch={}, '\n                    'global_step={}, best_score={:.5f}'.format(ckpt.current_epoch, ckpt.global_step, best_score))\n\n        return True, ckpt.current_epoch, ckpt.global_step, best_score\n\n    logger.info('PaddleHub model checkpoint not found, start from scratch...')\n\n    return False, current_epoch, global_step, best_score\n"
  },
  {
    "path": "paddlehub/compat/task/checkpoint_pb2.py",
    "content": "# Generated by the protocol buffer compiler.  DO NOT EDIT!\n# source: checkpoint.proto\n\nimport sys\n_b = sys.version_info[0] < 3 and (lambda x: x) or (lambda x: x.encode('latin1'))\nfrom google.protobuf import descriptor as _descriptor\nfrom google.protobuf import message as _message\nfrom google.protobuf import reflection as _reflection\nfrom google.protobuf import symbol_database as _symbol_database\nfrom google.protobuf import descriptor_pb2\n# @@protoc_insertion_point(imports)\n\n_sym_db = _symbol_database.Default()\n\nDESCRIPTOR = _descriptor.FileDescriptor(\n    name='checkpoint.proto',\n    package='paddlehub.task.checkpoint',\n    syntax='proto3',\n    serialized_pb=_b(\n        '\\n\\x10\\x63heckpoint.proto\\x12\\x19paddlehub.task.checkpoint\\\"f\\n\\nCheckPoint\\x12\\x15\\n\\rcurrent_epoch\\x18\\x01 \\x01(\\x03\\x12\\x13\\n\\x0bglobal_step\\x18\\x02 \\x01(\\x03\\x12\\x18\\n\\x10latest_model_dir\\x18\\x03 \\x01(\\t\\x12\\x12\\n\\nbest_score\\x18\\x04 \\x01(\\x01\\x42\\x02H\\x03\\x62\\x06proto3'\n    ))\n_sym_db.RegisterFileDescriptor(DESCRIPTOR)\n\n_CHECKPOINT = _descriptor.Descriptor(\n    name='CheckPoint',\n    full_name='paddlehub.task.checkpoint.CheckPoint',\n    filename=None,\n    file=DESCRIPTOR,\n    containing_type=None,\n    fields=[\n        _descriptor.FieldDescriptor(\n            name='current_epoch',\n            full_name='paddlehub.task.checkpoint.CheckPoint.current_epoch',\n            index=0,\n            number=1,\n            type=3,\n            cpp_type=2,\n            label=1,\n            has_default_value=False,\n            default_value=0,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='global_step',\n            full_name='paddlehub.task.checkpoint.CheckPoint.global_step',\n            index=1,\n            
number=2,\n            type=3,\n            cpp_type=2,\n            label=1,\n            has_default_value=False,\n            default_value=0,\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='latest_model_dir',\n            full_name='paddlehub.task.checkpoint.CheckPoint.latest_model_dir',\n            index=2,\n            number=3,\n            type=9,\n            cpp_type=9,\n            label=1,\n            has_default_value=False,\n            default_value=_b(\"\").decode('utf-8'),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n        _descriptor.FieldDescriptor(\n            name='best_score',\n            full_name='paddlehub.task.checkpoint.CheckPoint.best_score',\n            index=3,\n            number=4,\n            type=1,\n            cpp_type=5,\n            label=1,\n            has_default_value=False,\n            default_value=float(0),\n            message_type=None,\n            enum_type=None,\n            containing_type=None,\n            is_extension=False,\n            extension_scope=None,\n            options=None),\n    ],\n    extensions=[],\n    nested_types=[],\n    enum_types=[],\n    options=None,\n    is_extendable=False,\n    syntax='proto3',\n    extension_ranges=[],\n    oneofs=[],\n    serialized_start=47,\n    serialized_end=149,\n)\n\nDESCRIPTOR.message_types_by_name['CheckPoint'] = _CHECKPOINT\n\nCheckPoint = _reflection.GeneratedProtocolMessageType(\n    'CheckPoint',\n    (_message.Message, ),\n    dict(\n        DESCRIPTOR=_CHECKPOINT,\n        __module__='checkpoint_pb2'\n        # @@protoc_insertion_point(class_scope:paddlehub.task.checkpoint.CheckPoint)\n    
))\n_sym_db.RegisterMessage(CheckPoint)\n\nDESCRIPTOR.has_options = True\nDESCRIPTOR._options = _descriptor._ParseOptions(descriptor_pb2.FileOptions(), _b('H\\003'))\n# @@protoc_insertion_point(module_scope)\n"
  },
  {
    "path": "paddlehub/compat/task/config.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\nfrom typing import Callable\n\n\nclass RunConfig(object):\n    ''' This class specifies the configurations for PaddleHub to finetune '''\n\n    def __init__(self,\n                 log_interval: int = 10,\n                 eval_interval: int = 100,\n                 use_data_parallel: bool = True,\n                 save_ckpt_interval: int = None,\n                 use_cuda: bool = True,\n                 checkpoint_dir: str = None,\n                 num_epoch: int = 1,\n                 batch_size: int = 32,\n                 strategy: Callable = None):\n        ''' Construct finetune Config '''\n        self.log_interval = log_interval\n        self.eval_interval = eval_interval\n        self.save_ckpt_interval = save_ckpt_interval\n        self.use_cuda = use_cuda\n        self.num_epoch = num_epoch\n        self.batch_size = batch_size\n        self.use_data_parallel = use_data_parallel\n\n        if checkpoint_dir is None:\n            now = int(time.time())\n            time_str = time.strftime('%Y%m%d%H%M%S', time.localtime(now))\n            self.checkpoint_dir = 'ckpt_' + time_str\n        else:\n            self.checkpoint_dir = checkpoint_dir\n\n    def __repr__(self):\n        return 'config with num_epoch={}, batch_size={}, use_cuda={}, checkpoint_dir={} '.format(\n            self.num_epoch, 
self.batch_size, self.use_cuda, self.checkpoint_dir)\n"
  },
  {
    "path": "paddlehub/compat/task/hook.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport inspect\nfrom collections import OrderedDict\nfrom typing import Callable\n\n\nclass TaskHooks(object):\n    '''TaskHooks can handle some tasks during the spectific event.'''\n\n    def __init__(self):\n        self._registered_hooks = {\n            'build_env_start_event': OrderedDict(),\n            'build_env_end_event': OrderedDict(),\n            'finetune_start_event': OrderedDict(),\n            'finetune_end_event': OrderedDict(),\n            'predict_start_event': OrderedDict(),\n            'predict_end_event': OrderedDict(),\n            'eval_start_event': OrderedDict(),\n            'eval_end_event': OrderedDict(),\n            'log_interval_event': OrderedDict(),\n            'save_ckpt_interval_event': OrderedDict(),\n            'eval_interval_event': OrderedDict(),\n            'run_step_event': OrderedDict(),\n        }\n        self._hook_params_num = {\n            'build_env_start_event': 1,\n            'build_env_end_event': 1,\n            'finetune_start_event': 1,\n            'finetune_end_event': 2,\n            'predict_start_event': 1,\n            'predict_end_event': 2,\n            'eval_start_event': 1,\n            'eval_end_event': 2,\n            'log_interval_event': 2,\n            'save_ckpt_interval_event': 1,\n            'eval_interval_event': 1,\n            
'run_step_event': 2,\n        }\n\n    def add(self, hook_type: str, name: str = None, func: Callable = None):\n        '''\n        Add a handler function for a specific event.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name, default None\n            func (func): the handler function, default None\n        '''\n        if not func or not callable(func):\n            raise TypeError('The hook function is empty or it is not a function')\n        if name is None:\n            name = 'hook_%s' % id(func)\n\n        # check validity\n        if not isinstance(name, str) or name.strip() == '':\n            raise TypeError('The hook name must be a non-empty string')\n        if hook_type not in self._registered_hooks:\n            raise ValueError('hook_type: %s does not exist' % (hook_type))\n        if name in self._registered_hooks[hook_type]:\n            raise ValueError('name: %s already exists in hook_type: %s, use the modify method to change it' % (name, hook_type))\n        else:\n            args_num = len(inspect.getfullargspec(func).args)\n            if args_num != self._hook_params_num[hook_type]:\n                raise ValueError('The hook function for hook_type: %s should take %i arguments' %\n                                 (hook_type, self._hook_params_num[hook_type]))\n            self._registered_hooks[hook_type][name] = func\n\n    def delete(self, hook_type: str, name: str):\n        '''\n        Delete the handler function of a specific event.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name\n        '''\n        if self.exist(hook_type, name):\n            del self._registered_hooks[hook_type][name]\n        else:\n            raise ValueError(\n                'No hook_type: %s exists or name: %s does not exist in hook_type: %s' % (hook_type, name, hook_type))\n\n    def modify(self, hook_type: str, name: 
str, func: Callable):\n        '''\n        Modify the handler function of a specific event.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name\n            func (func): the new handler function\n        '''\n        if not (isinstance(name, str) and callable(func)):\n            raise TypeError('The hook name must be a string, and the hook function must be a function')\n        if self.exist(hook_type, name):\n            self._registered_hooks[hook_type][name] = func\n        else:\n            raise ValueError(\n                'No hook_type: %s exists or name: %s does not exist in hook_type: %s' % (hook_type, name, hook_type))\n\n    def exist(self, hook_type: str, name: str) -> bool:\n        '''\n        Check whether the handler function of a specific event exists.\n        Args:\n            hook_type (str): the specific event name\n            name (str): the handler function name\n        Returns:\n            bool: True or False\n        '''\n        if hook_type not in self._registered_hooks \\\n                or name not in self._registered_hooks[hook_type]:\n            return False\n        else:\n            return True\n\n    def info(self, show_default: bool = False) -> str:\n        '''\n        Get the hooks information, including the source code.\n        Args:\n            show_default (bool): show the information of PaddleHub default hooks or not, default False\n        Returns:\n            str: the formatted string of the hooks information\n        '''\n        # format the source code for output\n        ret = ''\n        for hook_type, hooks in self._registered_hooks.items():\n            already_print_type = False\n            for name, func in hooks.items():\n                if name == 'default' and not show_default:\n                    continue\n                if not already_print_type:\n                    ret += 'hook_type: %s{\\n' % hook_type\n                
    already_print_type = True\n                source = inspect.getsource(func)\n                ret += ' name: %s{\\n' % name\n                for line in source.split('\\n'):\n                    ret += '  %s\\n' % line\n                ret += ' }\\n'\n            if already_print_type:\n                ret += '}\\n'\n        if not ret:\n            ret = 'No customized hooks have been defined, you can set show_default=True to see the default hooks information'\n        return ret\n\n    def __getitem__(self, hook_type: str) -> OrderedDict:\n        return self._registered_hooks[hook_type]\n\n    def __repr__(self) -> str:\n        return self.info(show_default=False)\n"
  },
  {
    "path": "paddlehub/compat/task/metrics.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport collections\nimport math\nfrom typing import List\n\n\ndef _get_ngrams(segment: str, max_order: int):\n    \"\"\"\n    Extracts all n-grams upto a given maximum order from an input segment.\n\n    Args:\n        segment: text segment from which n-grams will be extracted.\n        max_order: maximum length in tokens of the n-grams returned by this\n            methods.\n\n    Returns:\n        The Counter containing all n-grams upto max_order in segment\n        with a count of how many times each n-gram occurred.\n    \"\"\"\n    ngram_counts = collections.Counter()\n    for order in range(1, max_order + 1):\n        for i in range(0, len(segment) - order + 1):\n            ngram = tuple(segment[i:i + order])\n            ngram_counts[ngram] += 1\n    return ngram_counts\n\n\ndef compute_bleu(reference_corpus: List, translation_corpus: List, max_order: int = 4, smooth: bool = False):\n    '''\n    Computes BLEU score of translated segments against one or more references.\n    Args:\n        reference_corpus: list of lists of references for each translation. Each\n            reference should be tokenized into a list of tokens.\n        translation_corpus: list of translations to score. 
Each translation\n            should be tokenized into a list of tokens.\n        max_order: Maximum n-gram order to use when computing BLEU score.\n        smooth: Whether or not to apply Lin et al. 2004 smoothing.\n    Returns:\n        6-tuple of (BLEU score, n-gram precisions, brevity penalty, length\n        ratio, translation length, reference length).\n    '''\n    matches_by_order = [0] * max_order\n    possible_matches_by_order = [0] * max_order\n    reference_length = 0\n    translation_length = 0\n    for (reference, translation) in zip(reference_corpus, translation_corpus):\n        reference_length += len(reference)\n        translation_length += len(translation)\n\n        merged_ref_ngram_counts = collections.Counter()\n        merged_ref_ngram_counts |= _get_ngrams(reference, max_order)\n        translation_ngram_counts = _get_ngrams(translation, max_order)\n        overlap = translation_ngram_counts & merged_ref_ngram_counts\n        for ngram in overlap:\n            matches_by_order[len(ngram) - 1] += overlap[ngram]\n        for order in range(1, max_order + 1):\n            possible_matches = len(translation) - order + 1\n            if possible_matches > 0:\n                possible_matches_by_order[order - 1] += possible_matches\n\n    precisions = [0] * max_order\n    for i in range(0, max_order):\n        if smooth:\n            precisions[i] = ((matches_by_order[i] + 1.) / (possible_matches_by_order[i] + 1.))\n        else:\n            if possible_matches_by_order[i] > 0:\n                precisions[i] = (float(matches_by_order[i]) / possible_matches_by_order[i])\n            else:\n                precisions[i] = 0.0\n\n    if min(precisions) > 0:\n        p_log_sum = sum((1. / max_order) * math.log(p) for p in precisions)\n        geo_mean = math.exp(p_log_sum)\n    else:\n        geo_mean = 0\n\n    ratio = float(translation_length) / reference_length\n\n    if ratio > 1.0:\n        bp = 1.\n    elif ratio > 0.0:\n        bp = math.exp(1 - 1. 
/ ratio)\n    else:\n        bp = 0\n\n    bleu = geo_mean * bp\n\n    return (bleu, precisions, bp, ratio, translation_length, reference_length)\n"
  },
  {
    "path": "paddlehub/compat/task/reader.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom collections import namedtuple\nfrom typing import Callable, Generator, Generic, List\n\nimport numpy as np\n\nfrom paddlehub.utils.log import logger\nfrom paddlehub.compat.task import tokenization\nfrom paddlehub.compat.task.batch import pad_batch_data\n\n\nclass InputExample(object):\n    '''\n    Input data structure of BERT/ERNIE, can satisfy single sequence task like\n    text classification, sequence lableing; Sequence pair task like dialog\n    task.\n    '''\n\n    def __init__(self, guid: int, text_a: str, text_b: str = None, label: str = None):\n        '''Constructs a InputExample.\n    Args:\n      guid: Unique id for the example.\n      text_a: string. The untokenized text of the first sequence. For single\n        sequence tasks, only this sequence must be specified.\n      text_b: (Optional) string. The untokenized text of the second sequence.\n        Only must be specified for sequence pair tasks.\n      label: (Optional) string. The label of the example. 
This should be\n        specified for train and dev examples, but not for test examples.\n    '''\n        self.guid = guid\n        self.text_a = text_a\n        self.text_b = text_b\n        self.label = label\n\n    def __str__(self):\n        if self.text_b is None:\n            return 'text={}\\tlabel={}'.format(self.text_a, self.label)\n        else:\n            return 'text_a={}\\ttext_b={}\\tlabel={}'.format(self.text_a, self.text_b, self.label)\n\n\nclass BaseReader(object):\n    def __init__(self, dataset: Generic, random_seed: int = None):\n        self.dataset = dataset\n        self.num_examples = {'train': -1, 'dev': -1, 'test': -1}\n        np.random.seed(random_seed)\n\n        # generate label map\n        self.label_map = {}\n        try:\n            for index, label in enumerate(self.dataset.get_labels()):\n                self.label_map[label] = index\n            logger.info('Dataset label map = {}'.format(self.label_map))\n        except (AttributeError, TypeError):\n            # some datasets, such as SQuAD, have label_list=None\n            logger.info('Dataset is None or has no labels, label map = {}'.format(self.label_map))\n\n    def get_train_examples(self) -> List:\n        return self.dataset.get_train_examples()\n\n    def get_dev_examples(self) -> List:\n        return self.dataset.get_dev_examples()\n\n    def get_test_examples(self) -> List:\n        return self.dataset.get_test_examples()\n\n    def data_generator(self) -> Generic:\n        raise NotImplementedError\n\n\nclass BaseNLPReader(BaseReader):\n    def __init__(self,\n                 vocab_path: str,\n                 dataset: Generic = None,\n                 max_seq_len: int = 512,\n                 do_lower_case: bool = True,\n                 random_seed: int = None,\n                 sp_model_path: str = None,\n                 word_dict_path: str = None,\n                 in_tokens: bool = False):\n        super(BaseNLPReader, self).__init__(dataset, random_seed)\n        self.max_seq_len 
= max_seq_len\n        if sp_model_path and word_dict_path:\n            self.tokenizer = tokenization.WSSPTokenizer(vocab_path, sp_model_path, word_dict_path, ws=True, lower=True)\n        else:\n            self.tokenizer = tokenization.FullTokenizer(vocab_file=vocab_path, do_lower_case=do_lower_case)\n        self.vocab = self.tokenizer.vocab\n        self.pad_id = self.vocab['[PAD]']\n        self.cls_id = self.vocab['[CLS]']\n        self.sep_id = self.vocab['[SEP]']\n        self.mask_id = self.vocab['[MASK]']\n        self.in_tokens = in_tokens\n\n        self.Record_With_Label_Id = namedtuple('Record', ['token_ids', 'text_type_ids', 'position_ids', 'label_id'])\n        self.Record_Wo_Label_Id = namedtuple('Record', ['token_ids', 'text_type_ids', 'position_ids'])\n\n    def _truncate_seq_pair(self, tokens_a: List, tokens_b: List, max_length: int):\n        '''Truncates a sequence pair in place to the maximum length.'''\n\n        # This is a simple heuristic which will always truncate the longer sequence\n        # one token at a time. 
This makes more sense than truncating an equal percent\n        # of tokens from each, since if one sequence is very short then each token\n        # that's truncated likely contains more information than a longer sequence.\n        while True:\n            total_length = len(tokens_a) + len(tokens_b)\n            if total_length <= max_length:\n                break\n            if len(tokens_a) > len(tokens_b):\n                tokens_a.pop()\n            else:\n                tokens_b.pop()\n\n    def _convert_example_to_record(self,\n                                   example: InputExample,\n                                   max_seq_length: int,\n                                   tokenizer: Generic,\n                                   phase: str = None) -> namedtuple:\n        '''Converts a single `Example` into a single `Record`.'''\n\n        text_a = tokenization.convert_to_unicode(example.text_a)\n        tokens_a = tokenizer.tokenize(text_a)\n        tokens_b = None\n        if example.text_b is not None:\n            #if 'text_b' in example._fields:\n            text_b = tokenization.convert_to_unicode(example.text_b)\n            tokens_b = tokenizer.tokenize(text_b)\n\n        if tokens_b:\n            # Modifies `tokens_a` and `tokens_b` in place so that the total\n            # length is less than the specified length.\n            # Account for [CLS], [SEP], [SEP] with '- 3'\n            self._truncate_seq_pair(tokens_a, tokens_b, max_seq_length - 3)\n        else:\n            # Account for [CLS] and [SEP] with '- 2'\n            if len(tokens_a) > max_seq_length - 2:\n                tokens_a = tokens_a[0:(max_seq_length - 2)]\n\n        # The convention in BERT/ERNIE is:\n        # (a) For sequence pairs:\n        #  tokens:   [CLS] is this jack ##son ##ville ? [SEP] no it is not . 
[SEP]\n        #  type_ids: 0     0  0    0    0     0       0 0     1  1  1  1   1 1\n        # (b) For single sequences:\n        #  tokens:   [CLS] the dog is hairy . [SEP]\n        #  type_ids: 0     0   0   0  0     0 0\n        #\n        # Where 'type_ids' are used to indicate whether this is the first\n        # sequence or the second sequence. The embedding vectors for `type=0` and\n        # `type=1` were learned during pre-training and are added to the wordpiece\n        # embedding vector (and position vector). This is not *strictly* necessary\n        # since the [SEP] token unambiguously separates the sequences, but it makes\n        # it easier for the model to learn the concept of sequences.\n        #\n        # For classification tasks, the first vector (corresponding to [CLS]) is\n        # used as the 'sentence vector'. Note that this only makes sense because\n        # the entire model is fine-tuned.\n        tokens = []\n        text_type_ids = []\n        tokens.append('[CLS]')\n        text_type_ids.append(0)\n        for token in tokens_a:\n            tokens.append(token)\n            text_type_ids.append(0)\n        tokens.append('[SEP]')\n        text_type_ids.append(0)\n\n        if tokens_b:\n            for token in tokens_b:\n                tokens.append(token)\n                text_type_ids.append(1)\n            tokens.append('[SEP]')\n            text_type_ids.append(1)\n\n        token_ids = tokenizer.convert_tokens_to_ids(tokens)\n        position_ids = list(range(len(token_ids)))\n\n        if self.label_map:\n            if example.label not in self.label_map:\n                raise KeyError('example.label = {{{}}} not in label map'.format(example.label))\n            label_id = self.label_map[example.label]\n        else:\n            label_id = example.label\n\n        if phase != 'predict':\n            record = self.Record_With_Label_Id(\n                token_ids=token_ids, text_type_ids=text_type_ids, 
position_ids=position_ids, label_id=label_id)\n        else:\n            record = self.Record_Wo_Label_Id(\n                token_ids=token_ids, text_type_ids=text_type_ids, position_ids=position_ids)\n\n        return record\n\n    def _pad_batch_records(self, batch_records: List, phase: str):\n        raise NotImplementedError\n\n    def _prepare_batch_data(self, examples: List, batch_size: int, phase: str = None) -> Generator:\n        '''generate batch records'''\n        batch_records, max_len = [], 0\n        for index, example in enumerate(examples):\n            if phase == 'train':\n                self.current_example = index\n            record = self._convert_example_to_record(example, self.max_seq_len, self.tokenizer, phase)\n            max_len = max(max_len, len(record.token_ids))\n            if self.in_tokens:\n                to_append = (len(batch_records) + 1) * max_len <= batch_size\n            else:\n                to_append = len(batch_records) < batch_size\n            if to_append:\n                batch_records.append(record)\n            else:\n                yield self._pad_batch_records(batch_records, phase)\n                batch_records, max_len = [record], len(record.token_ids)\n\n        if batch_records:\n            yield self._pad_batch_records(batch_records, phase)\n\n    def data_generator(self,\n                       batch_size: int = 1,\n                       phase: str = 'train',\n                       shuffle: bool = True,\n                       data: List = None,\n                       return_list: bool = True) -> Callable:\n        if phase != 'predict' and not self.dataset:\n            raise ValueError('The dataset is None ! 
It isn\\'t allowed.')\n        if phase == 'train':\n            shuffle = True\n            examples = self.get_train_examples()\n            self.num_examples['train'] = len(examples)\n        elif phase == 'val' or phase == 'dev':\n            shuffle = False\n            examples = self.get_dev_examples()\n            self.num_examples['dev'] = len(examples)\n        elif phase == 'test':\n            shuffle = False\n            examples = self.get_test_examples()\n            self.num_examples['test'] = len(examples)\n        elif phase == 'predict':\n            shuffle = False\n            examples = []\n            seq_id = 0\n\n            for item in data:\n                # set a placeholder label so that the program can run\n                if self.dataset:\n                    label = list(self.label_map.keys())[0]\n                else:\n                    label = 0\n                if len(item) == 1:\n                    item_i = InputExample(guid=seq_id, text_a=item[0], label=label)\n                elif len(item) == 2:\n                    item_i = InputExample(guid=seq_id, text_a=item[0], text_b=item[1], label=label)\n                else:\n                    raise ValueError('The length of input_text must be 1 or 2!')\n                examples.append(item_i)\n                seq_id += 1\n        else:\n            raise ValueError('Unknown phase, which should be in [\\'train\\', \\'dev\\', \\'test\\', \\'predict\\'].')\n\n        def wrapper():\n            if shuffle:\n                np.random.shuffle(examples)\n\n            for batch_data in self._prepare_batch_data(examples, batch_size, phase=phase):\n                if return_list:\n                    # for DataFeeder\n                    yield [batch_data]\n                else:\n                    # for DataLoader\n                    yield batch_data\n\n        return wrapper\n\n\nclass ClassifyReader(BaseNLPReader):\n    def _pad_batch_records(self, batch_records: List, 
phase: str = None) -> List:\n        batch_token_ids = [record.token_ids for record in batch_records]\n        batch_text_type_ids = [record.text_type_ids for record in batch_records]\n        batch_position_ids = [record.position_ids for record in batch_records]\n\n        padded_token_ids, input_mask, batch_seq_lens = pad_batch_data(\n            batch_token_ids,\n            max_seq_len=self.max_seq_len,\n            pad_idx=self.pad_id,\n            return_input_mask=True,\n            return_seq_lens=True)\n        padded_text_type_ids = pad_batch_data(batch_text_type_ids, max_seq_len=self.max_seq_len, pad_idx=self.pad_id)\n        padded_position_ids = pad_batch_data(batch_position_ids, max_seq_len=self.max_seq_len, pad_idx=self.pad_id)\n\n        return_list = [padded_token_ids, padded_position_ids, padded_text_type_ids, input_mask, batch_seq_lens]\n        if phase != 'predict':\n            batch_labels = [record.label_id for record in batch_records]\n            batch_labels = np.array(batch_labels).astype('int64').reshape([-1, 1])\n            return_list += [batch_labels]\n\n        return return_list\n"
  },
  {
    "path": "paddlehub/compat/task/task_utils.py",
"content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport copy\nimport time\nfrom typing import Any\n\nimport paddle.utils.unique_name\n\n\nclass RunState(object):\n    '''\n    RunState is used to save the result of every running step\n    Args:\n        length (int): the number of fetch results\n    '''\n\n    def __init__(self, length: int):\n        self.run_time_begin = time.time()\n        self.run_step = 0\n        self.run_examples = 0\n        self.run_results = [0] * length\n        self.run_time_used = 0\n        self.run_speed = 0.0\n\n    def __add__(self, other):\n        self.run_step += other.run_step\n        self.run_examples += other.run_examples\n        for index in range(len(self.run_results)):\n            self.run_results[index] += other.run_results[index]\n        return self\n\n    def update(self):\n        self.run_time_used = time.time() - self.run_time_begin\n        self.run_speed = self.run_step / self.run_time_used\n        return self\n\n\nclass RunEnv(object):\n    '''RunEnv saves the running environment of the train/dev/predict phase, including program, reader, metrics and so on.'''\n\n    def __init__(self):\n        self.current_epoch = 0\n        self.current_step = 0\n        self.main_program = None\n        self.start_program = None\n        self.main_program_compiled = None\n        self.py_reader = None\n        self.generator 
= None\n        self.loss = None\n        self.labels = None\n        self.metrics = None\n        self.is_inititalized = False\n        self.UNG = paddle.utils.unique_name.generate\n\n    def __setattr__(self, key: str, value: Any):\n        self.__dict__[key] = value\n\n    def __getattr__(self, key: str) -> Any:\n        return self.__dict__[key]\n"
  },
  {
    "path": "paddlehub/compat/task/text_generation_task.py",
"content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport time\nfrom collections import OrderedDict\n\nimport numpy as np\nimport paddle\nimport paddle.nn as nn\nfrom paddle import ParamAttr\nfrom paddle.nn import BeamSearchDecoder\nfrom paddle.nn import dynamic_decode\nfrom paddle.nn import LSTMCell\nfrom paddle.nn import RNN\nfrom paddle.nn import RNNCellBase\n\nfrom paddlehub.compat.task.base_task import BaseTask\nfrom paddlehub.compat.task.metrics import compute_bleu\n\n\nclass AttentionDecoderCell(RNNCellBase):\n\n    def __init__(self, num_layers, input_size, hidden_size, dropout_prob=0., init_scale=0.1):\n        super(AttentionDecoderCell, self).__init__()\n        self.num_layers = num_layers\n        self.hidden_size = hidden_size\n        self.dropout_prob = dropout_prob\n        self.lstm_cells = []\n        self.init_scale = init_scale\n        for i in range(num_layers):\n            self.lstm_cells.append(\n                LSTMCell(input_size=input_size + hidden_size if i == 0 else hidden_size, hidden_size=hidden_size))\n\n    def attention(self, query, enc_output, mask=None):\n        query = paddle.unsqueeze(query, [1])\n        memory = paddle.static.nn.fc(enc_output,\n                                     self.hidden_size,\n                                     num_flatten_dims=2,\n                                     
weight_attr=ParamAttr(name='dec_memory_w',\n                                                           initializer=nn.initializer.Uniform(low=-self.init_scale,\n                                                                                              high=self.init_scale)))\n        attn = paddle.matmul(query, memory, transpose_y=True)\n\n        if mask:\n            attn = paddle.transpose(attn, [1, 0, 2])\n            attn = attn + (mask * 1000000000)\n            attn = paddle.transpose(attn, [1, 0, 2])\n        weight = nn.functional.softmax(attn)\n        weight_memory = paddle.matmul(weight, memory)\n\n        return weight_memory\n\n    def forward(self, step_input, states, enc_output, enc_padding_mask=None):\n        lstm_states, input_feed = states\n        new_lstm_states = []\n        step_input = paddle.concat([step_input, input_feed], 1)\n        for i in range(self.num_layers):\n            out, new_lstm_state = self.lstm_cells[i](step_input, lstm_states[i])\n            step_input = nn.functional.dropout(out, self.dropout_prob,\n                                               mode='upscale_in_train') if self.dropout_prob > 0 else out\n            new_lstm_states.append(new_lstm_state)\n        dec_att = self.attention(step_input, enc_output, enc_padding_mask)\n        dec_att = paddle.squeeze(dec_att, [1])\n        concat_att_out = paddle.concat([dec_att, step_input], 1)\n        out = paddle.static.nn.fc(concat_att_out,\n                                  self.hidden_size,\n                                  weight_attr=ParamAttr(name='dec_out_w',\n                                                        initializer=nn.initializer.Uniform(low=-self.init_scale,\n                                                                                           high=self.init_scale)))\n        return out, [new_lstm_states, out]\n\n\nclass TextGenerationTask(BaseTask):\n    '''\n    TextGenerationTask uses an rnn as the decoder and beam search when 
predicting.\n    Args:\n        feature(Variable): The sentence-level feature, shape as [-1, emb_size].\n        token_feature(Variable): The token-level feature, shape as [-1, seq_len, emb_size].\n        max_seq_len(int): the decoder max sequence length.\n        num_classes(int): total labels of the task.\n        dataset(GenerationDataset): the dataset containing training set, development set and so on. If you want to finetune the model, you should set it.\n                 Otherwise, if you just want to use the model to predict, you can omit it. Default None\n        num_layers(int): the number of decoder rnn layers. Default 1\n        hidden_size(int): the decoder rnn hidden size. Default 512\n        dropout(float): the decoder dropout rate. Default 0.\n        beam_size(int): the beam search size during predict phase. Default 10.\n        beam_max_step_num(int): the beam search max step number. Default 30.\n        start_token(str): the beam search start token. Default '<s>'\n        end_token(str): the beam search end token. 
Default '</s>'\n        startup_program(Program): the customized startup_program, default None\n        config(RunConfig): the config for the task, default None\n        metrics_choices(list): metrics used for the task, default ['bleu']\n    '''\n\n    def __init__(\n        self,\n        feature,\n        token_feature,\n        max_seq_len,\n        num_classes,\n        dataset=None,\n        num_layers=1,\n        hidden_size=512,\n        dropout=0.,\n        beam_size=10,\n        beam_max_step_num=30,\n        start_token='<s>',\n        end_token='</s>',\n        startup_program=None,\n        config=None,\n        metrics_choices='default',\n    ):\n        if metrics_choices == 'default':\n            metrics_choices = ['bleu']\n        main_program = feature.block.program\n        super(TextGenerationTask, self).__init__(dataset=dataset,\n                                                 main_program=main_program,\n                                                 startup_program=startup_program,\n                                                 config=config,\n                                                 metrics_choices=metrics_choices)\n\n        self.num_layers = num_layers\n        self.hidden_size = hidden_size\n        self.dropout = dropout\n        self.token_feature = token_feature\n        self.feature = feature\n        self.max_seq_len = max_seq_len\n        self.num_classes = num_classes\n        self.beam_size = beam_size\n        self.beam_max_step_num = beam_max_step_num\n        self.start_token = start_token\n        self.end_token = end_token\n\n    def _add_label(self):\n        label = paddle.static.data(name='label', shape=[self.max_seq_len, 1], dtype='int64')\n        return [label]\n\n    def _build_net(self):\n        self.seq_len = paddle.static.data(name='seq_len', shape=[1], dtype='int64', lod_level=0)\n        self.seq_len_used = paddle.squeeze(self.seq_len)\n        src_mask = nn.functional.sequence_mask(self.seq_len_used, 
maxlen=self.max_seq_len, dtype='float32')\n        enc_padding_mask = (src_mask - 1.0)\n\n        # Define decoder and initialize it.\n        dec_cell = AttentionDecoderCell(self.num_layers, self.feature.shape[-1], self.hidden_size, self.dropout)\n        dec_init_hidden = paddle.static.nn.fc(self.feature,\n                                              size=self.hidden_size,\n                                              num_flatten_dims=1,\n                                              weight_attr=ParamAttr(\n                                                  name='dec_init_hidden_w',\n                                                  initializer=nn.initializer.TruncatedNormal(std=0.02)),\n                                              bias_attr=ParamAttr(name='dec_init_hidden_b',\n                                                                  initializer=nn.initializer.Constant(0.)))\n        dec_initial_states = [\n            [[dec_init_hidden,\n              dec_cell.get_initial_states(batch_ref=self.feature, shape=[self.hidden_size])]] * self.num_layers,\n            dec_cell.get_initial_states(batch_ref=self.feature, shape=[self.hidden_size])\n        ]\n        tar_vocab_size = len(self._label_list)\n        tar_embeder = lambda x: paddle.static.nn.embedding(\n            input=x,\n            size=[tar_vocab_size, self.hidden_size],\n            dtype='float32',\n            is_sparse=False,\n            param_attr=ParamAttr(name='target_embedding', initializer=nn.initializer.Uniform(low=-0.1, high=0.1)))\n        start_token_id = self._label_list.index(self.start_token)\n        end_token_id = self._label_list.index(self.end_token)\n        if not self.is_predict_phase:\n            self.dec_input = paddle.static.data(name='dec_input', shape=[self.max_seq_len], dtype='int64')\n            tar_emb = tar_embeder(self.dec_input)\n            rnn = nn.RNN(dec_cell, is_reverse=False, time_major=False)\n            dec_output, _ = rnn(inputs=tar_emb,\n        
                        initial_states=dec_initial_states,\n                                enc_output=self.token_feature,\n                                enc_padding_mask=enc_padding_mask)\n            self.logits = paddle.static.nn.fc(dec_output,\n                                              size=tar_vocab_size,\n                                              num_flatten_dims=len(dec_output.shape) - 1,\n                                              weight_attr=ParamAttr(name='output_w',\n                                                                    initializer=nn.initializer.Uniform(low=-0.1,\n                                                                                                       high=0.1)))\n            self.ret_infers = paddle.reshape(x=paddle.argmax(self.logits, axis=2), shape=[-1, 1])\n            logits = self.logits\n            logits = nn.functional.softmax(logits)\n            return [logits]\n        else:\n            output_layer = lambda x: paddle.static.nn.fc(\n                x, size=tar_vocab_size, num_flatten_dims=len(x.shape) - 1, weight_attr=ParamAttr(name='output_w'))\n            beam_search_decoder = BeamSearchDecoder(dec_cell,\n                                                    start_token_id,\n                                                    end_token_id,\n                                                    self.beam_size,\n                                                    embedding_fn=tar_embeder,\n                                                    output_fn=output_layer)\n            enc_output = beam_search_decoder.tile_beam_merge_with_batch(self.token_feature, self.beam_size)\n            enc_padding_mask = beam_search_decoder.tile_beam_merge_with_batch(enc_padding_mask, self.beam_size)\n            self.ret_infers, _ = dynamic_decode(beam_search_decoder,\n                                                inits=dec_initial_states,\n                                                
max_step_num=self.beam_max_step_num,\n                                                enc_output=enc_output,\n                                                enc_padding_mask=enc_padding_mask)\n            return self.ret_infers\n\n    def _postprocessing(self, run_states):\n        results = []\n        for batch_states in run_states:\n            batch_results = batch_states.run_results\n            batch_infers = batch_results[0].astype(np.int32)\n            seq_lens = batch_results[1].reshape([-1]).astype(np.int32).tolist()\n            for i, sample_infers in enumerate(batch_infers):\n                beam_result = []\n                for beam_infer in sample_infers.T:\n                    seq_result = [self._label_list[infer] for infer in beam_infer.tolist()[:seq_lens[i] - 2]]\n                    beam_result.append(seq_result)\n                results.append(beam_result)\n        return results\n\n    def _add_metrics(self):\n        self.ret_labels = paddle.reshape(x=self.labels[0], shape=[-1, 1])\n        return [self.ret_labels, self.ret_infers, self.seq_len_used]\n\n    def _add_loss(self):\n        loss = nn.functional.cross_entropy(input=self.outputs[0], label=self.labels[0], soft_label=False)\n        loss = paddle.unsqueeze(loss, axis=[2])\n        max_tar_seq_len = paddle.shape(self.dec_input)[1]\n        tar_sequence_length = self.seq_len_used - paddle.ones_like(self.seq_len_used)\n        tar_mask = nn.functional.sequence_mask(tar_sequence_length, maxlen=max_tar_seq_len, dtype='float32')\n        loss = loss * tar_mask\n        loss = paddle.mean(loss, axis=[0])\n        loss = paddle.sum(loss)\n        return loss\n\n    @property\n    def fetch_list(self):\n        if self.is_train_phase or self.is_test_phase:\n            return [metric.name for metric in self.metrics] + [self.loss.name]\n        elif self.is_predict_phase:\n            return [self.ret_infers.name] + [self.seq_len_used.name]\n        return [output.name for output in 
self.outputs]\n\n    def _calculate_metrics(self, run_states):\n        loss_sum = 0\n        run_step = run_examples = 0\n        labels = []\n        results = []\n        for run_state in run_states:\n            loss_sum += np.mean(run_state.run_results[-1])\n            np_labels = run_state.run_results[0]\n            np_infers = run_state.run_results[1]\n            np_lens = run_state.run_results[2]\n            batch_size = len(np_lens)\n            max_len = len(np_labels) // batch_size\n            for i in range(batch_size):\n                label = [\n                    self.dataset.label_list[int(id)] for id in np_labels[i * max_len:i * max_len + np_lens[i] - 2]\n                ]  # -2 for CLS and SEP\n                result = [\n                    self.dataset.label_list[int(id)] for id in np_infers[i * max_len:i * max_len + np_lens[i] - 2]\n                ]\n                labels.append(label)\n                results.append(result)\n\n            run_examples += run_state.run_examples\n            run_step += run_state.run_step\n\n        run_time_used = time.time() - run_states[0].run_time_begin\n        run_speed = run_step / run_time_used\n        avg_loss = loss_sum / run_examples\n\n        # The first key will be used as the main metric to update the best model\n        scores = OrderedDict()\n        for metric in self.metrics_choices:\n            if metric == 'bleu':\n                scores['bleu'] = compute_bleu(labels, results, max_order=1)[0]\n            else:\n                raise ValueError('Unsupported metric: \\'%s\\'' % metric)\n        return scores, avg_loss, run_speed\n"
  },
  {
    "path": "paddlehub/compat/task/tokenization.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#         http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n'''Tokenization classes.'''\n\nimport collections\nimport io\nimport pickle\nimport unicodedata\nfrom typing import List, Union\n\n\ndef convert_to_unicode(text: Union[str, bytes]) -> str:\n    '''Converts `text` to Unicode (if it's not already), assuming utf-8 input.'''\n    if isinstance(text, str):\n        return text\n    elif isinstance(text, bytes):\n        return text.decode('utf-8', 'ignore')\n    else:\n        raise ValueError('Unsupported type: {}'.format(type(text)))\n\n\ndef load_vocab(vocab_file: str) -> List:\n    '''Loads a vocabulary file into a dictionary.'''\n    vocab = collections.OrderedDict()\n    with io.open(vocab_file, 'r', encoding='UTF-8') as file:\n\n        for num, line in enumerate(file):\n            items = convert_to_unicode(line.strip()).split('\\t')\n            if len(items) > 2:\n                break\n            token = items[0]\n            index = items[1] if len(items) == 2 else num\n            token = token.strip()\n            vocab[token] = int(index)\n\n        return vocab\n\n\ndef convert_by_vocab(vocab: collections.OrderedDict, items: List[str]) -> List:\n    '''Converts a sequence of [tokens|ids] using the vocab.'''\n    output = []\n    for item in items:\n        output.append(vocab[item])\n\n    return output\n\n\ndef convert_tokens_to_ids(vocab: 
collections.OrderedDict, tokens: List[str]) -> List:\n    return convert_by_vocab(vocab, tokens)\n\n\ndef convert_ids_to_tokens(inv_vocab, ids):\n    return convert_by_vocab(inv_vocab, ids)\n\n\ndef whitespace_tokenize(text: str) -> List:\n    '''Runs basic whitespace cleaning and splitting on a piece of text.'''\n    text = text.strip()\n    if not text:\n        return []\n\n    tokens = text.split()\n    return tokens\n\n\nclass FullTokenizer(object):\n    '''Runs end-to-end tokenization.'''\n\n    def __init__(self, vocab_file: str, do_lower_case: bool = True, use_sentence_piece_vocab: bool = False):\n        self.vocab = load_vocab(vocab_file)\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n        self.basic_tokenizer = BasicTokenizer(do_lower_case=do_lower_case)\n        self.use_sentence_piece_vocab = use_sentence_piece_vocab\n        self.wordpiece_tokenizer = WordpieceTokenizer(\n            vocab=self.vocab, use_sentence_piece_vocab=self.use_sentence_piece_vocab)\n\n    def tokenize(self, text: str) -> List:\n        split_tokens = []\n        for token in self.basic_tokenizer.tokenize(text):\n            for sub_token in self.wordpiece_tokenizer.tokenize(token):\n                split_tokens.append(sub_token)\n\n        return split_tokens\n\n    def convert_tokens_to_ids(self, tokens: List) -> List:\n        return convert_by_vocab(self.vocab, tokens)\n\n    def convert_ids_to_tokens(self, ids: List) -> List:\n        return convert_by_vocab(self.inv_vocab, ids)\n\n\nclass WSSPTokenizer(object):\n    def __init__(self, vocab_file: str, sp_model_dir: str, word_dict: str, ws: bool = True, lower: bool = True):\n        self.vocab = load_vocab(vocab_file)\n        self.inv_vocab = {v: k for k, v in self.vocab.items()}\n        self.ws = ws\n        self.lower = lower\n        self.dict = pickle.load(open(word_dict, 'rb'))\n\n        import sentencepiece as spm\n        self.sp_model = spm.SentencePieceProcessor()\n        self.window_size 
= 5\n        self.sp_model.Load(sp_model_dir)\n\n    def cut(self, chars: List) -> List:\n        words = []\n        idx = 0\n        while idx < len(chars):\n            matched = False\n            for i in range(self.window_size, 0, -1):\n                cand = chars[idx:idx + i]\n                if cand in self.dict:\n                    words.append(cand)\n                    matched = True\n                    break\n            if not matched:\n                i = 1\n                words.append(chars[idx])\n            idx += i\n        return words\n\n    def tokenize(self, text: Union[str, bytes], unk_token: str = '[UNK]') -> List:\n        text = convert_to_unicode(text)\n        if self.ws:\n            text = [s for s in self.cut(text) if s != ' ']\n        else:\n            text = text.split(' ')\n        if self.lower:\n            text = [s.lower() for s in text]\n        text = ' '.join(text)\n        tokens = self.sp_model.EncodeAsPieces(text)\n        in_vocab_tokens = []\n        for token in tokens:\n            if token in self.vocab:\n                in_vocab_tokens.append(token)\n            else:\n                in_vocab_tokens.append(unk_token)\n        return in_vocab_tokens\n\n    def convert_tokens_to_ids(self, tokens: List) -> List:\n        return convert_by_vocab(self.vocab, tokens)\n\n    def convert_ids_to_tokens(self, ids: List) -> List:\n        return convert_by_vocab(self.inv_vocab, ids)\n\n\nclass BasicTokenizer(object):\n    '''Runs basic tokenization (punctuation splitting, lower casing, etc.).'''\n\n    def __init__(self, do_lower_case: bool = True):\n        '''Constructs a BasicTokenizer.\n        Args:\n            do_lower_case: Whether to lower case the input.\n        '''\n        self.do_lower_case = do_lower_case\n\n    def tokenize(self, text: Union[str, bytes]) -> List:\n        '''Tokenizes a piece of text.'''\n        text = convert_to_unicode(text)\n        text = self._clean_text(text)\n\n        # This was 
added on November 1st, 2018 for the multilingual and Chinese\n        # models. This is also applied to the English models now, but it doesn't\n        # matter since the English models were not trained on any Chinese data\n        # and generally don't have any Chinese data in them (there are Chinese\n        # characters in the vocabulary because Wikipedia does have some Chinese\n        # words in the English Wikipedia.).\n        text = self._tokenize_chinese_chars(text)\n\n        orig_tokens = whitespace_tokenize(text)\n        split_tokens = []\n        for token in orig_tokens:\n            if self.do_lower_case:\n                token = token.lower()\n                token = self._run_strip_accents(token)\n            split_tokens.extend(self._run_split_on_punc(token))\n\n        output_tokens = whitespace_tokenize(' '.join(split_tokens))\n        return output_tokens\n\n    def _run_strip_accents(self, text: str) -> str:\n        '''Strips accents from a piece of text.'''\n        text = unicodedata.normalize('NFD', text)\n        output = []\n        for char in text:\n            cat = unicodedata.category(char)\n            if cat == 'Mn':\n                continue\n            output.append(char)\n        return ''.join(output)\n\n    def _run_split_on_punc(self, text: str) -> List:\n        '''Splits punctuation on a piece of text.'''\n        chars = list(text)\n        i = 0\n        start_new_word = True\n        output = []\n        while i < len(chars):\n            char = chars[i]\n            if _is_punctuation(char):\n                output.append([char])\n                start_new_word = True\n            else:\n                if start_new_word:\n                    output.append([])\n                start_new_word = False\n                output[-1].append(char)\n            i += 1\n\n        return [''.join(x) for x in output]\n\n    def _tokenize_chinese_chars(self, text: str) -> str:\n        '''Adds whitespace around any CJK 
character.'''\n        output = []\n        for char in text:\n            cp = ord(char)\n            if self._is_chinese_char(cp):\n                output.append(' ')\n                output.append(char)\n                output.append(' ')\n            else:\n                output.append(char)\n        return ''.join(output)\n\n    def _is_chinese_char(self, cp: int) -> bool:\n        '''Checks whether CP is the codepoint of a CJK character.'''\n        # This defines a 'chinese character' as anything in the CJK Unicode block:\n        #     https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block)\n        #\n        # Note that the CJK Unicode block is NOT all Japanese and Korean characters,\n        # despite its name. The modern Korean Hangul alphabet is a different block,\n        # as is Japanese Hiragana and Katakana. Those alphabets are used to write\n        # space-separated words, so they are not treated specially and handled\n        # like all of the other languages.\n        if ((cp >= 0x4E00 and cp <= 0x9FFF) or  #\n            (cp >= 0x3400 and cp <= 0x4DBF) or  #\n            (cp >= 0x20000 and cp <= 0x2A6DF) or  #\n            (cp >= 0x2A700 and cp <= 0x2B73F) or  #\n            (cp >= 0x2B740 and cp <= 0x2B81F) or  #\n            (cp >= 0x2B820 and cp <= 0x2CEAF) or (cp >= 0xF900 and cp <= 0xFAFF) or  #\n            (cp >= 0x2F800 and cp <= 0x2FA1F)):  #\n            return True\n\n        return False\n\n    def _clean_text(self, text: str) -> str:\n        '''Performs invalid character removal and whitespace cleanup on text.'''\n        output = []\n        for char in text:\n            cp = ord(char)\n            if cp == 0 or cp == 0xfffd or _is_control(char):\n                continue\n            if _is_whitespace(char):\n                output.append(' ')\n            else:\n                output.append(char)\n        return ''.join(output)\n\n\nclass WordpieceTokenizer(object):\n    '''Runs WordPiece 
tokenization.'''\n\n    def __init__(self,\n                 vocab: collections.OrderedDict,\n                 unk_token: str = '[UNK]',\n                 max_input_chars_per_word: int = 100,\n                 use_sentence_piece_vocab: bool = False):\n        self.vocab = vocab\n        self.unk_token = unk_token\n        self.max_input_chars_per_word = max_input_chars_per_word\n        self.use_sentence_piece_vocab = use_sentence_piece_vocab\n\n    def tokenize(self, text: Union[str, bytes]) -> List:\n        '''Tokenizes a piece of text into its word pieces.\n        This uses a greedy longest-match-first algorithm to perform tokenization\n        using the given vocabulary.\n        For example:\n            input = 'unaffable'\n            output = ['un', '##aff', '##able']\n        Args:\n            text: A single token or whitespace separated tokens. This should have\n                already been passed through `BasicTokenizer`.\n        Returns:\n            A list of wordpiece tokens.\n        '''\n\n        text = convert_to_unicode(text)\n\n        output_tokens = []\n        for token in whitespace_tokenize(text):\n            chars = list(token)\n            if len(chars) > self.max_input_chars_per_word:\n                output_tokens.append(self.unk_token)\n                continue\n\n            is_bad = False\n            start = 0\n            sub_tokens = []\n            while start < len(chars):\n                end = len(chars)\n                cur_substr = None\n                while start < end:\n                    substr = ''.join(chars[start:end])\n                    if start == 0 and self.use_sentence_piece_vocab:\n                        substr = u'\\u2581' + substr\n                    if start > 0 and not self.use_sentence_piece_vocab:\n                        substr = '##' + substr\n                    if substr in self.vocab:\n                        cur_substr = substr\n                        break\n                    end -= 1\n    
            if cur_substr is None:\n                    is_bad = True\n                    break\n                sub_tokens.append(cur_substr)\n                start = end\n\n            if is_bad:\n                output_tokens.append(self.unk_token)\n            else:\n                output_tokens.extend(sub_tokens)\n        return output_tokens\n\n\ndef _is_whitespace(char: str) -> bool:\n    '''Checks whether `chars` is a whitespace character.'''\n    # \\t, \\n, and \\r are technically control characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == ' ' or char == '\\t' or char == '\\n' or char == '\\r':\n        return True\n    cat = unicodedata.category(char)\n    if cat == 'Zs':\n        return True\n    return False\n\n\ndef _is_control(char: str) -> bool:\n    '''Checks whether `chars` is a control character.'''\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == '\\t' or char == '\\n' or char == '\\r':\n        return False\n    cat = unicodedata.category(char)\n    if cat.startswith('C'):\n        return True\n    return False\n\n\ndef _is_punctuation(char: str) -> bool:\n    '''Checks whether `chars` is a punctuation character.'''\n    cp = ord(char)\n    # We treat all non-letter/number ASCII as punctuation.\n    # Characters such as '^', '$', and '`' are not in the Unicode\n    # Punctuation class but we treat them as punctuation anyways, for\n    # consistency.\n    if ((cp >= 33 and cp <= 47) or (cp >= 58 and cp <= 64) or (cp >= 91 and cp <= 96) or (cp >= 123 and cp <= 126)):\n        return True\n    cat = unicodedata.category(char)\n    if cat.startswith('P'):\n        return True\n    return False\n"
  },
  {
    "path": "paddlehub/compat/task/transformer_emb_task.py",
"content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Any, List\n\nimport paddle\nimport numpy as np\n\nfrom paddlehub.compat.task.config import RunConfig\nfrom paddlehub.compat.task.base_task import BaseTask\nfrom paddlehub.compat.task.task_utils import RunState\n\n\nclass TransformerEmbeddingTask(BaseTask):\n    def __init__(self,\n                 pooled_feature: paddle.static.Variable,\n                 seq_feature: paddle.static.Variable,\n                 feed_list: List[str],\n                 data_reader: Any,\n                 config: RunConfig = None):\n        main_program = pooled_feature.block.program\n        super(TransformerEmbeddingTask, self).__init__(\n            main_program=main_program, config=config, feed_list=feed_list, data_reader=data_reader, metrics_choices=[])\n        self.pooled_feature = pooled_feature\n        self.seq_feature = seq_feature\n\n    def _build_net(self) -> List[paddle.static.Variable]:\n        # ClassifyReader will return the sequence length of an input text\n        self.seq_len = paddle.static.data(name='seq_len', shape=[1], dtype='int64', lod_level=0)\n        return [self.pooled_feature, self.seq_feature]\n\n    def _postprocessing(self, run_states: List[RunState]) -> List[List[np.ndarray]]:\n        results = []\n        for batch_state in run_states:\n            batch_result = 
batch_state.run_results\n            batch_pooled_features = batch_result[0]\n            batch_seq_features = batch_result[1]\n            for i in range(len(batch_pooled_features)):\n                results.append([batch_pooled_features[i], batch_seq_features[i]])\n        return results\n\n    @property\n    def feed_list(self) -> List[str]:\n        feed_list = [varname for varname in self._base_feed_list] + [self.seq_len.name]\n        return feed_list\n\n    @property\n    def fetch_list(self) -> List[str]:\n        fetch_list = [output.name for output in self.outputs] + [self.seq_len.name]\n        return fetch_list\n"
  },
  {
    "path": "paddlehub/compat/type.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\nclass DataType(object):\n    IMAGE = 0\n    TEXT = 1\n    AUDIO = 2\n    VIDEO = 3\n    INT = 4\n    FLOAT = 5\n"
  },
  {
    "path": "paddlehub/config.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport hashlib\nimport os\nimport time\nimport json\nimport uuid\nimport yaml\nfrom easydict import EasyDict\n\nimport paddlehub.env as hubenv\n\n\ndef md5(text: str):\n    '''Calculate the md5 value of the input text.'''\n    md5code = hashlib.md5(text.encode())\n    return md5code.hexdigest()\n\n\nclass HubConfig:\n    '''\n    PaddleHub configuration management class. Each time the PaddleHub package is loaded, PaddleHub will set the\n    corresponding functions according to the configuration obtained in HubConfig, such as the logging level,\n    server address and so on. 
When the configuration is modified, PaddleHub needs to be reloaded to take effect.\n    '''\n\n    def __init__(self):\n        self._initialize()\n        self.file = os.path.join(hubenv.CONF_HOME, 'config.yaml')\n\n        if not os.path.exists(self.file):\n            self.flush()\n            return\n\n        with open(self.file, 'r') as file:\n            try:\n                cfg = yaml.load(file, Loader=yaml.FullLoader)\n                self.data.update(cfg)\n            except:\n                ...\n\n    def _initialize(self):\n        # Set default configuration values.\n        self.data = EasyDict()\n        self.data.server = 'http://paddlepaddle.org.cn/paddlehub'\n        self.data.log = EasyDict()\n        self.data.log.enable = True\n        self.data.log.level = 'DEBUG'\n\n    def reset(self):\n        '''Reset configuration to default.'''\n        self._initialize()\n        self.flush()\n\n    @property\n    def log_level(self):\n        '''\n        The lowest output level of the PaddleHub logger. Logs below the specified level will not be displayed. The default\n        is 'DEBUG'.\n        '''\n        return self.data.log.level\n\n    @log_level.setter\n    def log_level(self, level: str):\n        from paddlehub.utils import log\n        if level not in log.log_config:\n            raise ValueError('Unknown log level {}. The valid values are {}'.format(level, list(log.log_config.keys())))\n\n        self.data.log.level = level\n        self.flush()\n\n    @property\n    def log_enable(self):\n        '''Whether the PaddleHub logger is enabled. 
The default is True.'''\n        return self.data.log.enable\n\n    @log_enable.setter\n    def log_enable(self, enable: bool):\n        self.data.log.enable = enable\n        self.flush()\n\n    @property\n    def server(self):\n        '''PaddleHub Module server URL.'''\n        return self.data.server\n\n    @server.setter\n    def server(self, url: str):\n        self.data.server = url\n        self.flush()\n\n    def flush(self):\n        '''Flush the current configuration into the configuration file.'''\n        with open(self.file, 'w') as file:\n            # convert EasyDict to dict\n            cfg = json.loads(json.dumps(self.data))\n            yaml.dump(cfg, file)\n\n    def __str__(self):\n        cfg = json.loads(json.dumps(self.data))\n        return yaml.dump(cfg)\n\n\nclass CacheConfig(object):\n    def __init__(self):\n        self._initialize()\n        self.file = os.path.join(hubenv.CONF_HOME, 'cache.yaml')\n        if not os.path.exists(self.file):\n            self.flush()\n            return\n\n        with open(self.file, 'r') as file:\n            try:\n                cfg = yaml.load(file, Loader=yaml.FullLoader)\n                self.data.update(cfg)\n            except:\n                ...\n\n    def _initialize(self):\n        # Set default configuration values.\n        self.data = EasyDict()\n        hub_name = md5(str(uuid.uuid1())[-12:]) + \"-\" + str(int(time.time()))\n        self.data.hub_name = hub_name\n\n    @property\n    def hub_name(self):\n        return self.data.hub_name\n\n    @hub_name.setter\n    def hub_name(self, name: str):\n        self.data.hub_name = name\n        self.flush()\n\n    def flush(self):\n        '''Flush the current configuration into the configuration file.'''\n        with open(self.file, 'w') as file:\n            # convert EasyDict to dict\n            cfg = json.loads(json.dumps(self.data))\n            yaml.dump(cfg, file)\n\n    def __str__(self):\n        cfg = 
json.loads(json.dumps(self.data))\n        return yaml.dump(cfg)\n\n\ndef _load_old_config(config: HubConfig):\n    # The old version of the configuration file is obsolete, read the configuration value and delete it.\n    old_cfg_file = os.path.join(hubenv.CONF_HOME, 'config.json')\n    if os.path.exists(old_cfg_file):\n        with open(old_cfg_file) as file:\n            try:\n                cfg = json.loads(file.read())\n                config.server = cfg['server_url'][0]\n                config.log_level = cfg['log_level']\n            except:\n                ...\n        os.remove(old_cfg_file)\n\n\nconfig = HubConfig()\n_load_old_config(config)\ncache_config = CacheConfig()\n"
  },
  {
    "path": "paddlehub/datasets/__init__.py",
    "content": "# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlehub.datasets.canvas import Canvas\nfrom paddlehub.datasets.chnsenticorp import ChnSentiCorp\nfrom paddlehub.datasets.esc50 import ESC50\nfrom paddlehub.datasets.flowers import Flowers\nfrom paddlehub.datasets.lcqmc import LCQMC\nfrom paddlehub.datasets.minicoco import MiniCOCO\nfrom paddlehub.datasets.msra_ner import MSRA_NER\nfrom paddlehub.datasets.base_seg_dataset import SegDataset\nfrom paddlehub.datasets.opticdiscseg import OpticDiscSeg\n"
  },
  {
    "path": "paddlehub/datasets/base_audio_dataset.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport csv\nimport io\nimport os\nfrom typing import Dict, List, Optional, Tuple, Union\n\nimport numpy as np\nimport paddle\n\nfrom paddlehub.utils.utils import extract_melspectrogram\n\n\nclass InputExample(object):\n    \"\"\"\n    Input example of one audio sample.\n    \"\"\"\n\n    def __init__(self, guid: int, source: Union[list, str], label: Optional[str] = None):\n        self.guid = guid\n        self.source = source\n        self.label = label\n\n\nclass BaseAudioDataset(object):\n    \"\"\"\n    Base class of speech dataset.\n    \"\"\"\n\n    def __init__(self, base_path: str, data_file: str, mode: Optional[str] = \"train\"):\n        self.data_file = os.path.join(base_path, data_file)\n        self.mode = mode\n\n    def _read_file(self, input_file: str):\n        raise NotImplementedError\n\n\nclass AudioClassificationDataset(BaseAudioDataset, paddle.io.Dataset):\n    \"\"\"\n    Base class of audio classification dataset.\n    \"\"\"\n    _supported_features = ['raw', 'mel']\n\n    def __init__(self,\n                 base_path: str,\n                 data_file: str,\n                 file_type: str = 'npz',\n                 mode: str = 'train',\n                 feat_type: str = 'mel',\n                 feat_cfg: dict = None):\n        super(AudioClassificationDataset, self).__init__(base_path=base_path, mode=mode, 
data_file=data_file)\n\n        self.file_type = file_type\n        self.feat_type = feat_type\n        self.feat_cfg = feat_cfg\n\n        self.examples = self._read_file(self.data_file)\n        self.records = self._convert_examples_to_records(self.examples)\n\n    def _read_file(self, input_file: str) -> List[InputExample]:\n        if not os.path.exists(input_file):\n            raise RuntimeError(\"Data file: {} not found.\".format(input_file))\n\n        examples = []\n        if self.file_type == 'npz':\n            dataset = np.load(self.data_file, allow_pickle=True)\n            audio_id = 0\n            for waveform, label in zip(dataset['waveforms'], dataset['labels']):\n                example = InputExample(guid=audio_id, source=waveform, label=label)\n                audio_id += 1\n                examples.append(example)\n        else:\n            raise NotImplementedError(f'Only supports the npz file type, but got {self.file_type}')\n\n        return examples\n\n    def _convert_examples_to_records(self, examples: List[InputExample]) -> List[dict]:\n        records = []\n\n        for example in examples:\n            record = {}\n            if self.feat_type == 'raw':\n                record['feat'] = example.source\n            elif self.feat_type == 'mel':\n                record['feat'] = extract_melspectrogram(\n                    example.source,\n                    sample_rate=self.feat_cfg['sample_rate'],\n                    window_size=self.feat_cfg['window_size'],\n                    hop_size=self.feat_cfg['hop_size'],\n                    mel_bins=self.feat_cfg['mel_bins'],\n                    fmin=self.feat_cfg['fmin'],\n                    fmax=self.feat_cfg['fmax'],\n                    window=self.feat_cfg['window'],\n                    center=True,\n                    pad_mode='reflect',\n                    ref=1.0,\n                    amin=1e-10,\n                    top_db=None)\n            else:\n                raise RuntimeError(\n                    f\"Unknown type of self.feat_type: {self.feat_type}, it must be one of {self._supported_features}\")\n\n            record['label'] = example.label\n            records.append(record)\n\n        return records\n\n    def __getitem__(self, idx):\n        \"\"\"\n        Override this method to do extra feature processing or data augmentation.\n        \"\"\"\n        record = self.records[idx]\n        return np.array(record['feat']), np.array(record['label'], dtype=np.int64)\n\n    def __len__(self):\n        return len(self.records)\n"
  },
  {
    "path": "paddlehub/datasets/base_nlp_dataset.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport csv\nimport io\nimport os\nfrom typing import Dict, List, Optional, Union, Tuple\n\nimport numpy as np\nimport paddle\nimport paddlenlp\nfrom packaging.version import Version\n\nfrom paddlehub.env import DATA_HOME\nfrom paddlenlp.transformers import PretrainedTokenizer\nfrom paddlenlp.data import JiebaTokenizer\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import download, reseg_token_label, pad_sequence, trunc_sequence\nfrom paddlehub.utils.xarfile import is_xarfile, unarchive\n\n\nclass InputExample(object):\n    \"\"\"\n    The input data structure of Transformer modules (BERT, ERNIE and so on).\n    \"\"\"\n\n    def __init__(self, guid: int, text_a: str, text_b: Optional[str] = None, label: Optional[str] = None):\n        \"\"\"\n        The input data structure.\n        Args:\n          guid (:obj:`int`):\n              Unique id for the input data.\n          text_a (:obj:`str`):\n              The first sequence. For single sequence tasks, only this sequence must be specified.\n          text_b (:obj:`str`, `optional`, defaults to :obj:`None`):\n              The second sequence for sentence-pair tasks.\n          label (:obj:`str`, `optional`, defaults to :obj:`None`):\n              The label of the example.\n        Examples:\n            .. 
code-block:: python\n                from paddlehub.datasets.base_nlp_dataset import InputExample\n                example = InputExample(guid=0,\n                                text_a='15.4寸笔记本的键盘确实爽，基本跟台式机差不多了',\n                                text_b='蛮喜欢数字小键盘，输数字特方便，样子也很美观，做工也相当不错',\n                                label='1')\n        \"\"\"\n        self.guid = guid\n        self.text_a = text_a\n        self.text_b = text_b\n        self.label = label\n\n    def __str__(self):\n        if self.text_b is None:\n            return \"text={}\\tlabel={}\".format(self.text_a, self.label)\n        else:\n            return \"text_a={}\\ttext_b={}\\tlabel={}\".format(self.text_a, self.text_b, self.label)\n\n\nclass BaseNLPDataset(object):\n    \"\"\"\n    The virtual base class for NLP datasets, such as TextClassificationDataset, SeqLabelingDataset, and so on.\n    Subclasses must re-implement the method _read_file.\n    \"\"\"\n\n    def __init__(self,\n                 base_path: str,\n                 tokenizer: Union[PretrainedTokenizer, JiebaTokenizer],\n                 max_seq_len: Optional[int] = 128,\n                 mode: Optional[str] = \"train\",\n                 data_file: Optional[str] = None,\n                 label_file: Optional[str] = None,\n                 label_list: Optional[List[str]] = None):\n        \"\"\"\n        Args:\n            base_path (:obj:`str`): The directory to the whole dataset.\n            tokenizer (:obj:`PretrainedTokenizer` or :obj:`JiebaTokenizer`):\n                It tokenizes the text and encodes the data as the model needs.\n            max_seq_len (:obj:`int`, `optional`, defaults to :128):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n            mode (:obj:`str`, `optional`, defaults to `train`):\n                It identifies the dataset mode (train, test or dev).\n            data_file(:obj:`str`, `optional`, defaults to 
:obj:`None`):\n                The data file name, which is relative to the base_path.\n            label_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n                The label file name, which is relative to the base_path.\n                It contains all labels of the dataset, one label per line.\n            label_list(:obj:`List[str]`, `optional`, defaults to :obj:`None`):\n                The list of all labels of the dataset\n        \"\"\"\n        self.data_file = os.path.join(base_path, data_file)\n        self.label_list = label_list\n\n        self.mode = mode\n        self.tokenizer = tokenizer\n        self.max_seq_len = max_seq_len\n\n        if label_file:\n            self.label_file = os.path.join(base_path, label_file)\n            if not self.label_list:\n                self.label_list = self._load_label_data()\n            else:\n                logger.warning(\"Since label_list has been assigned, label_file takes no effect\")\n        if self.label_list:\n            self.label_map = {item: index for index, item in enumerate(self.label_list)}\n\n    def _load_label_data(self):\n        \"\"\"\n        Loads labels from label file.\n        \"\"\"\n        if os.path.exists(self.label_file):\n            with open(self.label_file, \"r\", encoding=\"utf8\") as f:\n                return f.read().strip().split(\"\\n\")\n        else:\n            raise RuntimeError(\"The file {} is not found.\".format(self.label_file))\n\n    def _download_and_uncompress_dataset(self, destination: str, url: str):\n        \"\"\"\n        Downloads dataset and uncompresses it.\n        Args:\n           destination (:obj:`str`): The dataset cached directory.\n           url (:obj: str): The URL of the dataset to be downloaded.\n        \"\"\"\n        if not os.path.exists(destination):\n            dataset_package = download(url=url, path=DATA_HOME)\n            if is_xarfile(dataset_package):\n                unarchive(dataset_package, DATA_HOME)\n        else:\n            logger.info(\"Dataset {} already cached.\".format(destination))\n\n    def _read_file(self, input_file: str, is_file_with_header: bool = False):\n        \"\"\"\n        Reads the files.\n        Args:\n            input_file (:obj:str) : The file to be read.\n            is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n                Whether or not the file is with the header introduction.\n        \"\"\"\n        raise NotImplementedError\n\n    def get_labels(self):\n        \"\"\"\n        Gets all labels.\n        \"\"\"\n        return self.label_list\n\n\nclass TextClassificationDataset(BaseNLPDataset, paddle.io.Dataset):\n    \"\"\"\n    The dataset class suitable for all text classification datasets.\n    \"\"\"\n\n    def __init__(self,\n                 base_path: str,\n                 tokenizer: Union[PretrainedTokenizer, JiebaTokenizer],\n                 max_seq_len: int = 128,\n                 mode: str = \"train\",\n                 data_file: str = None,\n                 label_file: str = None,\n                 label_list: list = None,\n                 is_file_with_header: bool = False):\n        \"\"\"\n        Args:\n            base_path (:obj:`str`): The directory to the whole dataset.\n            tokenizer (:obj:`PretrainedTokenizer` or :obj:`JiebaTokenizer`):\n                It tokenizes the text and encodes the data as the model needs.\n            max_seq_len (:obj:`int`, `optional`, defaults to :128):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n            mode (:obj:`str`, `optional`, defaults to `train`):\n                It identifies the dataset mode (train, test or dev).\n            data_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n                The data file name, which is relative to the base_path.\n            label_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n                The label file name, 
which is relative to the base_path.\n                It is all labels of the dataset, one line one label.\n            label_list(:obj:`List[str]`, `optional`, defaults to :obj:`None`):\n                The list of all labels of the dataset\n            is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n                Whether or not the file is with the header introduction.\n        \"\"\"\n        super(TextClassificationDataset, self).__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_file=label_file,\n            label_list=label_list)\n        self.examples = self._read_file(self.data_file, is_file_with_header)\n\n        self.records = self._convert_examples_to_records(self.examples)\n\n    def _read_file(self, input_file, is_file_with_header: bool = False) -> List[InputExample]:\n        \"\"\"\n        Reads a tab separated value file.\n        Args:\n            input_file (:obj:str) : The file to be read.\n            is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n                Whether or not the file is with the header introduction.\n        Returns:\n            examples (:obj:`List[InputExample]`): All the input data.\n        \"\"\"\n        if not os.path.exists(input_file):\n            raise RuntimeError(\"The file {} is not found.\".format(input_file))\n        else:\n            with io.open(input_file, \"r\", encoding=\"UTF-8\") as f:\n                reader = csv.reader(f, delimiter=\"\\t\", quotechar=None)\n                examples = []\n                seq_id = 0\n                header = next(reader) if is_file_with_header else None\n                for line in reader:\n                    example = InputExample(guid=seq_id, label=line[0], text_a=line[1])\n                    seq_id += 1\n                    examples.append(example)\n                return 
examples\n\n    def _convert_examples_to_records(self, examples: List[InputExample]) -> List[dict]:\n        \"\"\"\n        Converts all examples to records which the model needs.\n        Args:\n            examples(obj:`List[InputExample]`): All data examples returned by _read_file.\n        Returns:\n            records(:obj:`List[dict]`): All records which the model needs.\n        \"\"\"\n        records = []\n        for example in examples:\n            if isinstance(self.tokenizer, PretrainedTokenizer):\n                if Version(paddlenlp.__version__) <= Version('2.0.0rc2'):\n                    record = self.tokenizer.encode(\n                        text=example.text_a, text_pair=example.text_b, max_seq_len=self.max_seq_len)\n                else:\n                    record = self.tokenizer(\n                        text=example.text_a,\n                        text_pair=example.text_b,\n                        max_seq_len=self.max_seq_len,\n                        pad_to_max_seq_len=True,\n                        return_length=True)\n            elif isinstance(self.tokenizer, JiebaTokenizer):\n                pad_token = self.tokenizer.vocab.pad_token\n\n                ids = self.tokenizer.encode(sentence=example.text_a)\n                seq_len = min(len(ids), self.max_seq_len)\n                if len(ids) > self.max_seq_len:\n                    ids = trunc_sequence(ids, self.max_seq_len)\n                else:\n                    pad_token_id = self.tokenizer.vocab.to_indices(pad_token)\n                    ids = pad_sequence(ids, self.max_seq_len, pad_token_id)\n                record = {'text': ids, 'seq_len': seq_len}\n            else:\n                raise RuntimeError(\n                    \"Unknown type of self.tokenizer: {}, it must be an instance of  PretrainedTokenizer or JiebaTokenizer\"\n                    .format(type(self.tokenizer)))\n\n            if not record:\n                logger.info(\n                    \"The text %s 
has been dropped as it has no words in the vocab after tokenization.\" % example.text_a)\n                continue\n            if example.label:\n                record['label'] = self.label_map[example.label]\n            records.append(record)\n        return records\n\n    def __getitem__(self, idx):\n        record = self.records[idx]\n        if isinstance(self.tokenizer, PretrainedTokenizer):\n            input_ids = np.array(record['input_ids'])\n            if Version(paddlenlp.__version__) >= Version('2.0.0rc5'):\n                token_type_ids = np.array(record['token_type_ids'])\n            else:\n                token_type_ids = np.array(record['segment_ids'])\n\n            if 'label' in record.keys():\n                return input_ids, token_type_ids, np.array(record['label'], dtype=np.int64)\n            else:\n                return input_ids, token_type_ids\n\n        elif isinstance(self.tokenizer, JiebaTokenizer):\n            if 'label' in record.keys():\n                return np.array(record['text']), np.array(record['label'], dtype=np.int64)\n            else:\n                return np.array(record['text'])\n        else:\n            raise RuntimeError(\n                \"Unknown type of self.tokenizer: {}, it must be an instance of PretrainedTokenizer or JiebaTokenizer\".\n                format(type(self.tokenizer)))\n\n    def __len__(self):\n        return len(self.records)\n\n\nclass SeqLabelingDataset(BaseNLPDataset, paddle.io.Dataset):\n    \"\"\"\n    Args:\n        base_path (:obj:`str`): The directory to the whole dataset.\n        tokenizer (:obj:`PretrainedTokenizer` or :obj:`JiebaTokenizer`):\n            It tokenizes the text and encodes the data as the model needs.\n        max_seq_len (:obj:`int`, `optional`, defaults to :128):\n            If set to a number, will limit the total sequence returned so that it has a maximum length.\n        mode (:obj:`str`, `optional`, defaults to `train`):\n            It identifies the 
dataset mode (train, test or dev).\n        data_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n            The data file name, which is relative to the base_path.\n        label_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n            The label file name, which is relative to the base_path.\n            It is all labels of the dataset, one line one label.\n        label_list(:obj:`List[str]`, `optional`, defaults to :obj:`None`):\n            The list of all labels of the dataset\n        split_char(:obj:`str`, `optional`, defaults to :obj:`\\002`):\n            The symbol used to split chars in text and labels\n        no_entity_label(:obj:`str`, `optional`, defaults to :obj:`O`):\n            The label used to mark no entities\n        ignore_label(:obj:`int`, `optional`, defaults to :-100):\n            If one token's label == ignore_label, it will be ignored when\n            calculating loss\n        is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n            Whether or not the file is with the header introduction.\n    \"\"\"\n\n    def __init__(self,\n                 base_path: str,\n                 tokenizer: Union[PretrainedTokenizer, JiebaTokenizer],\n                 max_seq_len: int = 128,\n                 mode: str = \"train\",\n                 data_file: str = None,\n                 label_file: str = None,\n                 label_list: list = None,\n                 split_char: str = \"\\002\",\n                 no_entity_label: str = \"O\",\n                 ignore_label: int = -100,\n                 is_file_with_header: bool = False):\n        super(SeqLabelingDataset, self).__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_file=label_file,\n            label_list=label_list)\n\n        self.no_entity_label = no_entity_label\n        self.split_char = 
split_char\n        self.ignore_label = ignore_label\n\n        self.examples = self._read_file(self.data_file, is_file_with_header)\n        self.records = self._convert_examples_to_records(self.examples)\n\n    def _read_file(self, input_file, is_file_with_header: bool = False) -> List[InputExample]:\n        \"\"\"Reads a tab separated value file.\"\"\"\n        if not os.path.exists(input_file):\n            raise RuntimeError(\"The file {} is not found.\".format(input_file))\n        else:\n            with io.open(input_file, \"r\", encoding=\"UTF-8\") as f:\n                reader = csv.reader(f, delimiter=\"\\t\", quotechar=None)\n                examples = []\n                seq_id = 0\n                header = next(reader) if is_file_with_header else None\n                for line in reader:\n                    example = InputExample(guid=seq_id, label=line[1], text_a=line[0])\n                    seq_id += 1\n                    examples.append(example)\n                return examples\n\n    def _convert_examples_to_records(self, examples: List[InputExample]) -> List[dict]:\n        \"\"\"\n        Returns a list[dict] including all the input information that the model needs.\n        Args:\n            examples (list): the data examples, returned by _read_file.\n        Returns:\n            a list with all the example records.\n        \"\"\"\n        records = []\n        for example in examples:\n            tokens = example.text_a.split(self.split_char)\n            labels = example.label.split(self.split_char)\n\n            # convert tokens into record\n            if isinstance(self.tokenizer, PretrainedTokenizer):\n                pad_token = self.tokenizer.pad_token\n\n                tokens, labels = reseg_token_label(tokenizer=self.tokenizer, tokens=tokens, labels=labels)\n                if Version(paddlenlp.__version__) <= Version('2.0.0rc2'):\n                    record = self.tokenizer.encode(text=tokens, max_seq_len=self.max_seq_len)\n 
               else:\n                    record = self.tokenizer(\n                        text=tokens,\n                        max_seq_len=self.max_seq_len,\n                        pad_to_max_seq_len=True,\n                        is_split_into_words=True,\n                        return_length=True)\n            elif isinstance(self.tokenizer, JiebaTokenizer):\n                pad_token = self.tokenizer.vocab.pad_token\n\n                ids = [self.tokenizer.vocab.to_indices(token) for token in tokens]\n                seq_len = min(len(ids), self.max_seq_len)\n                if len(ids) > self.max_seq_len:\n                    ids = trunc_sequence(ids, self.max_seq_len)\n                else:\n                    pad_token_id = self.tokenizer.vocab.to_indices(pad_token)\n                    ids = pad_sequence(ids, self.max_seq_len, pad_token_id)\n\n                record = {'text': ids, 'seq_len': seq_len}\n            else:\n                raise RuntimeError(\n                    \"Unknown type of self.tokenizer: {}, it must be an instance of  PretrainedTokenizer or JiebaTokenizer\"\n                    .format(type(self.tokenizer)))\n\n            if not record:\n                logger.info(\n                    \"The text %s has been dropped as it has no words in the vocab after tokenization.\" % example.text_a)\n                continue\n\n            # convert labels into record\n            if labels:\n                record[\"label\"] = []\n                if isinstance(self.tokenizer, PretrainedTokenizer):\n                    tokens_with_specical_token = self.tokenizer.convert_ids_to_tokens(record['input_ids'])\n                elif isinstance(self.tokenizer, JiebaTokenizer):\n                    tokens_with_specical_token = [self.tokenizer.vocab.to_tokens(id_) for id_ in record['text']]\n                else:\n                    raise RuntimeError(\n                        \"Unknown type of self.tokenizer: {}, it must be an instance of  
PretrainedTokenizer or JiebaTokenizer\"\n                        .format(type(self.tokenizer)))\n\n                tokens_index = 0\n                for token in tokens_with_specical_token:\n                    if tokens_index < len(tokens) and token == tokens[tokens_index]:\n                        record[\"label\"].append(self.label_list.index(labels[tokens_index]))\n                        tokens_index += 1\n                    elif token in [pad_token]:\n                        record[\"label\"].append(self.ignore_label)  # label of special token\n                    else:\n                        record[\"label\"].append(self.label_list.index(self.no_entity_label))\n            records.append(record)\n        return records\n\n    def __getitem__(self, idx):\n        record = self.records[idx]\n        if isinstance(self.tokenizer, PretrainedTokenizer):\n            input_ids = np.array(record['input_ids'])\n            seq_lens = np.array(record['seq_len'])\n            if Version(paddlenlp.__version__) >= Version('2.0.0rc5'):\n                token_type_ids = np.array(record['token_type_ids'])\n            else:\n                token_type_ids = np.array(record['segment_ids'])\n\n            if 'label' in record.keys():\n                return input_ids, token_type_ids, seq_lens, np.array(record['label'], dtype=np.int64)\n            else:\n                return input_ids, token_type_ids, seq_lens\n\n        elif isinstance(self.tokenizer, JiebaTokenizer):\n            if 'label' in record.keys():\n                return np.array(record['text']), np.array(record['seq_len']), np.array(record['label'], dtype=np.int64)\n            else:\n                return np.array(record['text']), np.array(record['seq_len'])\n        else:\n            raise RuntimeError(\n                \"Unknown type of self.tokenizer: {}, it must be an instance of  PretrainedTokenizer or JiebaTokenizer\".\n                format(type(self.tokenizer)))\n\n    def __len__(self):\n      
  return len(self.records)\n\n\nclass TextMatchingDataset(BaseNLPDataset, paddle.io.Dataset):\n    \"\"\"\n    The dataset class which is fit for all datasets of text matching.\n    \"\"\"\n\n    def __init__(self,\n                 base_path: str,\n                 tokenizer: PretrainedTokenizer,\n                 max_seq_len: int = 128,\n                 mode: str = \"train\",\n                 data_file: str = None,\n                 label_file: str = None,\n                 label_list: list = None,\n                 is_file_with_header: bool = False):\n        \"\"\"\n        Args:\n            base_path (:obj:`str`): The directory to the whole dataset.\n            tokenizer (:obj:`PretrainedTokenizer`):\n                It tokenizes the text and encodes the data as the model needs.\n            max_seq_len (:obj:`int`, `optional`, defaults to :128):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n            mode (:obj:`str`, `optional`, defaults to `train`):\n                It identifies the dataset mode (train, test or dev).\n            data_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n                The data file name, which is relative to the base_path.\n            label_file(:obj:`str`, `optional`, defaults to :obj:`None`):\n                The label file name, which is relative to the base_path.\n                It contains all labels of the dataset, one label per line.\n            label_list(:obj:`List[str]`, `optional`, defaults to :obj:`None`):\n                The list of all labels of the dataset.\n            is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n                Whether or not the file is with the header introduction.\n        \"\"\"\n        super(TextMatchingDataset, self).__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n    
        label_file=label_file,\n            label_list=label_list)\n        self.examples = self._read_file(self.data_file, is_file_with_header)\n\n        self.records = self._convert_examples_to_records(self.examples)\n\n    def _read_file(self, input_file, is_file_with_header: bool = False) -> List[InputExample]:\n        \"\"\"\n        Reads a tab separated value file.\n        Args:\n            input_file (:obj:str) : The file to be read.\n            is_file_with_header(:obj:bool, `optional`, default to :obj: False) :\n                Whether or not the file is with the header introduction.\n        Returns:\n            examples (:obj:`List[InputExample]`): All the input data.\n        \"\"\"\n        if not os.path.exists(input_file):\n            raise RuntimeError(\"The file {} is not found.\".format(input_file))\n        else:\n            with io.open(input_file, \"r\", encoding=\"UTF-8\") as f:\n                reader = csv.reader(f, delimiter=\"\\t\", quotechar=None)\n                examples = []\n                seq_id = 0\n                header = next(reader) if is_file_with_header else None\n                for line in reader:\n                    example = InputExample(guid=seq_id, text_a=line[0], text_b=line[1], label=line[2])\n                    seq_id += 1\n                    examples.append(example)\n                return examples\n\n    def _convert_examples_to_records(self, examples: List[InputExample]) -> List[dict]:\n        \"\"\"\n        Converts all examples to records which the model needs.\n        Args:\n            examples(obj:`List[InputExample]`): All data examples returned by _read_file.\n        Returns:\n            records(:obj:`List[dict]`): All records which the model needs.\n        \"\"\"\n        records = []\n        for example in examples:\n            if isinstance(self.tokenizer, PretrainedTokenizer):\n                record_a = self.tokenizer(text=example.text_a, max_seq_len=self.max_seq_len, \\\n           
         pad_to_max_seq_len=True, return_length=True)\n                record_b = self.tokenizer(text=example.text_b, max_seq_len=self.max_seq_len, \\\n                    pad_to_max_seq_len=True, return_length=True)\n                record = {'text_a': record_a, 'text_b': record_b}\n            else:\n                raise RuntimeError(\n                    \"Unknown type of self.tokenizer: {}, it must be an instance of PretrainedTokenizer\".format(\n                        type(self.tokenizer)))\n\n            if not record:\n                logger.info(\n                    \"The text %s has been dropped as it has no words in the vocab after tokenization.\" % example.text_a)\n                continue\n            if example.label:\n                record['label'] = self.label_map[example.label]\n            records.append(record)\n        return records\n\n    def __getitem__(self, idx):\n        record = self.records[idx]\n        if isinstance(self.tokenizer, PretrainedTokenizer):\n            query_input_ids = np.array(record['text_a']['input_ids'])\n            query_token_type_ids = np.array(record['text_a']['token_type_ids'])\n            title_input_ids = np.array(record['text_b']['input_ids'])\n            title_token_type_ids = np.array(record['text_b']['token_type_ids'])\n\n            if 'label' in record.keys():\n                return query_input_ids, query_token_type_ids, title_input_ids, title_token_type_ids, \\\n                    np.array(record['label'], dtype=np.int64)\n            else:\n                return query_input_ids, query_token_type_ids, title_input_ids, title_token_type_ids\n        else:\n            raise RuntimeError(\n                \"Unknown type of self.tokenizer: {}, it must be an instance of PretrainedTokenizer\".format(\n                    type(self.tokenizer)))\n\n    def __len__(self):\n        return len(self.records)\n"
  },
  {
    "path": "paddlehub/datasets/base_seg_dataset.py",
"content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Tuple, Callable\n\nimport paddle\nimport numpy as np\nfrom PIL import Image\n\n\nclass SegDataset(paddle.io.Dataset):\n    \"\"\"\n    Pass in a custom dataset that conforms to the format.\n\n    Args:\n        transforms (Callable): Transforms for image.\n        dataset_root (str): The dataset directory.\n        num_classes (int): Number of classes.\n        mode (str, optional): Which part of the dataset to use. It is one of ('train', 'val', 'test'). Default: 'train'.\n        train_path (str, optional): The train dataset file. When mode is 'train', train_path is necessary.\n            The contents of train_path file are as follows:\n            image1.jpg ground_truth1.png\n            image2.jpg ground_truth2.png\n        val_path (str, optional): The evaluation dataset file. When mode is 'val', val_path is necessary.\n            The contents are the same as train_path.\n        test_path (str, optional): The test dataset file. When mode is 'test', test_path is necessary.\n            The annotation file is not necessary in test_path file.\n        separator (str, optional): The separator of dataset list. Default: ' '.\n        ignore_index (int, optional): The label index to be ignored. Default: 255.\n        edge (bool, optional): Whether to compute edge while training. 
Default: False\n\n    \"\"\"\n\n    def __init__(self,\n                 transforms: Callable,\n                 dataset_root: str,\n                 num_classes: int,\n                 mode: str = 'train',\n                 train_path: str = None,\n                 val_path: str = None,\n                 test_path: str = None,\n                 separator: str = ' ',\n                 ignore_index: int = 255,\n                 edge: bool = False):\n        self.dataset_root = dataset_root\n        self.transforms = transforms\n        self.file_list = list()\n        mode = mode.lower()\n        self.mode = mode\n        self.num_classes = num_classes\n        self.ignore_index = ignore_index\n        self.edge = edge\n\n        if mode.lower() not in ['train', 'val', 'test']:\n            raise ValueError(\"mode should be 'train', 'val' or 'test', but got {}.\".format(mode))\n\n        if self.transforms is None:\n            raise ValueError(\"`transforms` is necessary, but it is None.\")\n\n        self.dataset_root = dataset_root\n        if not os.path.exists(self.dataset_root):\n            raise FileNotFoundError('there is not `dataset_root`: {}.'.format(self.dataset_root))\n\n        if mode == 'train':\n            if train_path is None:\n                raise ValueError('When `mode` is \"train\", `train_path` is necessary, but it is None.')\n            elif not os.path.exists(train_path):\n                raise FileNotFoundError('`train_path` is not found: {}'.format(train_path))\n            else:\n                file_path = train_path\n        elif mode == 'val':\n            if val_path is None:\n                raise ValueError('When `mode` is \"val\", `val_path` is necessary, but it is None.')\n            elif not os.path.exists(val_path):\n                raise FileNotFoundError('`val_path` is not found: {}'.format(val_path))\n            else:\n                file_path = val_path\n        else:\n            if test_path is None:\n               
 raise ValueError('When `mode` is \"test\", `test_path` is necessary, but it is None.')\n            elif not os.path.exists(test_path):\n                raise FileNotFoundError('`test_path` is not found: {}'.format(test_path))\n            else:\n                file_path = test_path\n\n        with open(file_path, 'r') as f:\n            for line in f:\n                items = line.strip().split(separator)\n                if len(items) != 2:\n                    if mode == 'train' or mode == 'val':\n                        raise ValueError(\"File list format incorrect! In training or evaluation task it should be\"\n                                         \" image_name{}label_name\\\\n\".format(separator))\n                    image_path = os.path.join(self.dataset_root, items[0])\n                    label_path = None\n                else:\n                    image_path = os.path.join(self.dataset_root, items[0])\n                    label_path = os.path.join(self.dataset_root, items[1])\n                self.file_list.append([image_path, label_path])\n\n    def __getitem__(self, idx: int) -> Tuple[np.ndarray]:\n        image_path, label_path = self.file_list[idx]\n        if self.mode == 'test':\n            im, _ = self.transforms(im=image_path)\n            im = im[np.newaxis, ...]\n            return im, image_path\n        elif self.mode == 'val':\n            im, _ = self.transforms(im=image_path)\n            label = np.asarray(Image.open(label_path))\n            label = label[np.newaxis, :, :]\n            return im, label\n        else:\n            im, label = self.transforms(im=image_path, label=label_path)\n            return im, label\n\n    def __len__(self) -> int:\n        return len(self.file_list)\n"
  },
  {
    "path": "paddlehub/datasets/canvas.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Callable\n\nimport paddle\nimport numpy as np\n\nimport paddlehub.env as hubenv\nfrom paddlehub.vision.utils import get_img_file\nfrom paddlehub.utils.download import download_data\n\n\n@download_data(url='https://paddlehub.bj.bcebos.com/dygraph/datasets/canvas.tar.gz')\nclass Canvas(paddle.io.Dataset):\n    \"\"\"\n    Dataset for colorization. 
It contains 1193 and 400 pictures for Monet and Van Gogh painting styles, respectively.\n    We collected data from https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/.\n\n    Args:\n       transform(Callable): The method to preprocess images.\n       mode(str): The mode for preparing the dataset.\n\n    Returns:\n        DataSet: An iterable object for data iterating\n    \"\"\"\n\n    def __init__(self, transform: Callable, mode: str = 'train'):\n        self.mode = mode\n        self.transform = transform\n\n        if self.mode == 'train':\n            self.file = 'train'\n        elif self.mode == 'test':\n            self.file = 'test'\n\n        self.file = os.path.join(hubenv.DATA_HOME, 'canvas', self.file)\n        self.data = get_img_file(self.file)\n\n    def __getitem__(self, idx: int) -> np.ndarray:\n        img_path = self.data[idx]\n        im = self.transform(img_path)\n        return im\n\n    def __len__(self):\n        return len(self.data)\n"
  },
  {
    "path": "paddlehub/datasets/chnsenticorp.py",
"content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nfrom typing import Dict, List, Optional, Union, Tuple\nimport os\n\nfrom paddlehub.env import DATA_HOME\nfrom paddlehub.utils.download import download_data\nfrom paddlehub.datasets.base_nlp_dataset import TextClassificationDataset\nfrom paddlehub.text.bert_tokenizer import BertTokenizer\nfrom paddlehub.text.tokenizer import CustomTokenizer\n\n\n@download_data(url=\"https://bj.bcebos.com/paddlehub-dataset/chnsenticorp.tar.gz\")\nclass ChnSentiCorp(TextClassificationDataset):\n    \"\"\"\n    ChnSentiCorp is a dataset for Chinese sentiment classification,\n    which was published by Tan Songbo at ICT of Chinese Academy of Sciences.\n    \"\"\"\n\n    # TODO(zhangxuefei): simplify dataset load, such as\n    # train_ds, dev_ds, test_ds = hub.datasets.ChnSentiCorp(tokenizer=xxx, max_seq_len=128, select='train', 'test', 'valid')\n    def __init__(self, tokenizer: Union[BertTokenizer, CustomTokenizer], max_seq_len: int = 128, mode: str = 'train'):\n        \"\"\"\n        Args:\n            tokenizer (:obj:`BertTokenizer` or `CustomTokenizer`):\n                It tokenizes the text and encodes the data as the model needs.\n            max_seq_len (:obj:`int`, `optional`, defaults to :128):\n                The maximum length (in number of tokens) for the inputs to the selected module,\n                such as ernie, bert and so on.\n            
mode (:obj:`str`, `optional`, defaults to `train`):\n                It identifies the dataset mode (train, test or dev).\n        Examples:\n            .. code-block:: python\n                import paddlehub as hub\n\n                tokenizer = hub.BertTokenizer(vocab_file='./vocab.txt')\n                train_dataset = hub.datasets.ChnSentiCorp(tokenizer=tokenizer, max_seq_len=120, mode='train')\n                dev_dataset = hub.datasets.ChnSentiCorp(tokenizer=tokenizer, max_seq_len=120, mode='dev')\n                test_dataset = hub.datasets.ChnSentiCorp(tokenizer=tokenizer, max_seq_len=120, mode='test')\n\n        \"\"\"\n        base_path = os.path.join(DATA_HOME, \"chnsenticorp\")\n        if mode == 'train':\n            data_file = 'train.tsv'\n        elif mode == 'test':\n            data_file = 'test.tsv'\n        else:\n            data_file = 'dev.tsv'\n        super().__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_list=[\"0\", \"1\"],\n            is_file_with_header=True)\n"
  },
  {
    "path": "paddlehub/datasets/esc50.py",
    "content": "#   Copyright (c) 2021  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nfrom paddlehub.datasets.base_audio_dataset import AudioClassificationDataset\nfrom paddlehub.env import DATA_HOME\nfrom paddlehub.utils.download import download_data\n\n\n@download_data(url=\"https://bj.bcebos.com/paddlehub-dataset/esc50.tar.gz\")\nclass ESC50(AudioClassificationDataset):\n    sample_rate = 44100\n    input_length = int(sample_rate * 5)  # 5s\n    num_class = 50  # class num\n    label_list = [\n        # Animals\n        'Dog',\n        'Rooster',\n        'Pig',\n        'Cow',\n        'Frog',\n        'Cat',\n        'Hen',\n        'Insects (flying)',\n        'Sheep',\n        'Crow',\n        # Natural soundscapes & water sounds\n        'Rain',\n        'Sea waves',\n        'Crackling fire',\n        'Crickets',\n        'Chirping birds',\n        'Water drops',\n        'Wind',\n        'Pouring water',\n        'Toilet flush',\n        'Thunderstorm',\n        # Human, non-speech sounds\n        'Crying baby',\n        'Sneezing',\n        'Clapping',\n        'Breathing',\n        'Coughing',\n        'Footsteps',\n        'Laughing',\n        'Brushing teeth',\n        'Snoring',\n        'Drinking, sipping',\n        # Interior/domestic sounds\n        'Door knock',\n        'Mouse click',\n        'Keyboard typing',\n        'Door, wood creaks',\n        'Can opening',\n        'Washing 
machine',\n        'Vacuum cleaner',\n        'Clock alarm',\n        'Clock tick',\n        'Glass breaking',\n        # Exterior/urban noises\n        'Helicopter',\n        'Chainsaw',\n        'Siren',\n        'Car horn',\n        'Engine',\n        'Train',\n        'Church bells',\n        'Airplane',\n        'Fireworks',\n        'Hand saw',\n    ]\n\n    def __init__(self, mode: str = 'train', feat_type: str = 'mel'):\n\n        base_path = os.path.join(DATA_HOME, \"esc50\")\n\n        if mode == 'train':\n            data_file = 'train.npz'\n        else:\n            data_file = 'dev.npz'\n\n        feat_cfg = dict(\n            sample_rate=self.sample_rate,\n            window_size=1024,\n            hop_size=320,\n            mel_bins=64,\n            fmin=50,\n            fmax=14000,\n            window='hann')\n\n        super().__init__(\n            base_path=base_path,\n            data_file=data_file,\n            file_type='npz',\n            mode=mode,\n            feat_type=feat_type,\n            feat_cfg=feat_cfg)\n\n\nif __name__ == \"__main__\":\n    train_dataset = ESC50(mode='train')\n    dev_dataset = ESC50(mode='dev')\n"
  },
  {
    "path": "paddlehub/datasets/flowers.py",
    "content": "# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Callable, Tuple\n\nimport paddle\nimport numpy as np\n\nimport paddlehub.env as hubenv\nfrom paddlehub.utils.download import download_data\n\n\n@download_data(url='https://bj.bcebos.com/paddlehub-dataset/flower_photos.tar.gz')\nclass Flowers(paddle.io.Dataset):\n    def __init__(self, transforms: Callable, mode: str = 'train'):\n        self.mode = mode\n        self.transforms = transforms\n        self.num_classes = 5\n\n        if self.mode == 'train':\n            self.file = 'train_list.txt'\n        elif self.mode == 'test':\n            self.file = 'test_list.txt'\n        else:\n            self.file = 'validate_list.txt'\n        self.file = os.path.join(hubenv.DATA_HOME, 'flower_photos', self.file)\n\n        with open(self.file, 'r') as file:\n            self.data = file.read().split('\\n')\n\n    def __getitem__(self, idx) -> Tuple[np.ndarray, int]:\n        img_path, grt = self.data[idx].split(' ')\n        img_path = os.path.join(hubenv.DATA_HOME, 'flower_photos', img_path)\n        im = self.transforms(img_path)\n        return im, int(grt)\n\n    def __len__(self):\n        return len(self.data)\n"
  },
  {
    "path": "paddlehub/datasets/lcqmc.py",
    "content": "#   Copyright (c) 2021  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union\nimport os\n\nfrom paddlehub.env import DATA_HOME\nfrom paddlehub.utils.download import download_data\nfrom paddlehub.datasets.base_nlp_dataset import TextMatchingDataset\nfrom paddlehub.text.bert_tokenizer import BertTokenizer\nfrom paddlehub.text.tokenizer import CustomTokenizer\n\n\n@download_data(url=\"https://bj.bcebos.com/paddlehub-dataset/lcqmc.tar.gz\")\nclass LCQMC(TextMatchingDataset):\n    label_list = ['0', '1']\n\n    def __init__(\n            self,\n            tokenizer: Union[BertTokenizer, CustomTokenizer],\n            max_seq_len: int = 128,\n            mode: str = 'train',\n    ):\n        base_path = os.path.join(DATA_HOME, \"lcqmc\")\n\n        if mode == 'train':\n            data_file = 'train.tsv'\n        elif mode == 'test':\n            data_file = 'test.tsv'\n        else:\n            data_file = 'dev.tsv'\n        super().__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_file=None,\n            label_list=self.label_list,\n            is_file_with_header=True,\n        )\n\n\nif __name__ == \"__main__\":\n    import paddlehub as hub\n    model = hub.Module(name='ernie_tiny')\n    tokenizer = model.get_tokenizer()\n\n    ds = 
LCQMC(tokenizer=tokenizer, max_seq_len=128, mode='dev')\n"
  },
  {
    "path": "paddlehub/datasets/minicoco.py",
"content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Callable\n\nimport paddle\nimport numpy as np\n\nimport paddlehub.env as hubenv\nfrom paddlehub.vision.utils import get_img_file\nfrom paddlehub.utils.download import download_data\n\n\n@download_data(url='https://paddlehub.bj.bcebos.com/dygraph/datasets/minicoco.tar.gz')\nclass MiniCOCO(paddle.io.Dataset):\n    \"\"\"\n    Dataset for style transfer. The dataset contains 2001 images for the training set and 200 images for the testing set.\n    They are derived from COCO2014. 
Meanwhile, it contains 21 different style pictures in the folder \"21styles\".\n\n    Args:\n       transform(Callable): The method to preprocess images.\n       mode(str): The mode for preparing the dataset.\n\n    Returns:\n        DataSet: An iterable object for data iterating\n    \"\"\"\n\n    def __init__(self, transform: Callable, mode: str = 'train'):\n        self.mode = mode\n        self.transform = transform\n\n        if self.mode == 'train':\n            self.file = 'train'\n        elif self.mode == 'test':\n            self.file = 'test'\n        self.file = os.path.join(hubenv.DATA_HOME, 'minicoco', self.file)\n        self.style_file = os.path.join(hubenv.DATA_HOME, 'minicoco', '21styles')\n        self.data = get_img_file(self.file)\n        self.style = get_img_file(self.style_file)\n\n    def __getitem__(self, idx: int) -> np.ndarray:\n\n        img_path = self.data[idx]\n        im = self.transform(img_path)\n        im = im.astype('float32')\n        style_idx = idx % len(self.style)\n        style_path = self.style[style_idx]\n        style = self.transform(style_path)\n        style = style.astype('float32')\n        return im, style\n\n    def __len__(self):\n        return len(self.data)\n"
  },
  {
    "path": "paddlehub/datasets/msra_ner.py",
    "content": "#coding:utf-8\n#   Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Union\nimport os\n\nfrom paddlehub.env import DATA_HOME\nfrom paddlehub.utils.download import download_data\nfrom paddlehub.datasets.base_nlp_dataset import SeqLabelingDataset\nfrom paddlehub.text.bert_tokenizer import BertTokenizer\nfrom paddlehub.text.tokenizer import CustomTokenizer\n\n\n@download_data(url=\"https://bj.bcebos.com/paddlehub-dataset/msra_ner.tar.gz\")\nclass MSRA_NER(SeqLabelingDataset):\n    \"\"\"\n    A set of manually annotated Chinese word-segmentation data and\n    specifications for training and testing a Chinese word-segmentation system\n    for research purposes.  
For more information please refer to\n    https://www.microsoft.com/en-us/download/details.aspx?id=52531\n    \"\"\"\n    label_list = [\"B-PER\", \"I-PER\", \"B-ORG\", \"I-ORG\", \"B-LOC\", \"I-LOC\", \"O\"]\n\n    def __init__(\n            self,\n            tokenizer: Union[BertTokenizer, CustomTokenizer],\n            max_seq_len: int = 128,\n            mode: str = 'train',\n    ):\n        base_path = os.path.join(DATA_HOME, \"msra_ner\")\n\n        if mode == 'train':\n            data_file = 'train.tsv'\n        elif mode == 'test':\n            data_file = 'test.tsv'\n        else:\n            data_file = 'dev.tsv'\n        super().__init__(\n            base_path=base_path,\n            tokenizer=tokenizer,\n            max_seq_len=max_seq_len,\n            mode=mode,\n            data_file=data_file,\n            label_file=None,\n            label_list=self.label_list,\n            is_file_with_header=True,\n        )\n"
  },
  {
    "path": "paddlehub/datasets/opticdiscseg.py",
"content": "# coding:utf-8\n# Copyright (c) 2021  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Callable\n\nimport paddle\nimport numpy as np\nfrom PIL import Image\n\nimport paddlehub.env as hubenv\nfrom paddlehub.utils.download import download_data\nfrom paddlehub.datasets.base_seg_dataset import SegDataset\n\n\n@download_data(url='https://paddleseg.bj.bcebos.com/dataset/optic_disc_seg.zip')\nclass OpticDiscSeg(SegDataset):\n    \"\"\"\n    OpticDiscSeg dataset is extracted from iChallenge-AMD\n    (https://ai.baidu.com/broad/subordinate?dataset=amd).\n\n    Args:\n        transforms (Callable): Transforms for image.\n        mode (str, optional): Which part of the dataset to use. It is one of ('train', 'val', 'test'). Default: 'train'.\n        edge (bool, optional): Whether to compute edge while training. 
Default: False\n    \"\"\"\n\n    def __init__(self, transforms: Callable = None, mode: str = 'train'):\n        self.transforms = transforms\n        mode = mode.lower()\n        self.mode = mode\n        self.file_list = list()\n        self.num_classes = 2\n        self.ignore_index = 255\n\n        if mode not in ['train', 'val', 'test']:\n            raise ValueError(\"`mode` should be 'train', 'val' or 'test', but got {}.\".format(mode))\n\n        if self.transforms is None:\n            raise ValueError(\"`transforms` is necessary, but it is None.\")\n\n        if mode == 'train':\n            file_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', 'train_list.txt')\n        elif mode == 'test':\n            file_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', 'test_list.txt')\n        else:\n            file_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', 'val_list.txt')\n\n        with open(file_path, 'r') as f:\n            for line in f:\n                items = line.strip().split()\n                if len(items) != 2:\n                    if mode == 'train' or mode == 'val':\n                        raise Exception(\"File list format incorrect! It should be\" \" image_name label_name\\\\n\")\n                    image_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', items[0])\n                    grt_path = None\n                else:\n                    image_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', items[0])\n                    grt_path = os.path.join(hubenv.DATA_HOME, 'optic_disc_seg', items[1])\n                self.file_list.append([image_path, grt_path])\n"
  },
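The `OpticDiscSeg` loader above reads a `train_list.txt`/`test_list.txt`/`val_list.txt` file in which each line is `image_path label_path` (the label is optional only in test mode). A minimal standalone sketch of that parsing logic, using a hypothetical `parse_file_list` helper not present in the repo:

```python
import os

def parse_file_list(lines, data_root, mode='train'):
    """Parse 'image_path [label_path]' lines the way OpticDiscSeg does (sketch)."""
    file_list = []
    for line in lines:
        items = line.strip().split()
        if not items:
            continue
        if len(items) != 2:
            # train/val lines must carry a ground-truth path; test lines may omit it.
            if mode in ('train', 'val'):
                raise ValueError('File list format incorrect! It should be: image_name label_name')
            image_path = os.path.join(data_root, items[0])
            grt_path = None
        else:
            image_path = os.path.join(data_root, items[0])
            grt_path = os.path.join(data_root, items[1])
        file_list.append([image_path, grt_path])
    return file_list
```

A label-less line therefore raises in `'train'`/`'val'` mode but yields `[image_path, None]` in `'test'` mode, matching the branch in `OpticDiscSeg.__init__`.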
  {
    "path": "paddlehub/datasets/pascalvoc.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport copy\nfrom typing import Callable\n\nimport paddle\nimport numpy as np\nfrom paddlehub.env import DATA_HOME\nfrom pycocotools.coco import COCO\n\n\nclass DetectCatagory:\n    \"\"\"Load label name, id and map from detection dataset.\n\n    Args:\n        attrbox(Callable): Method to get detection attributes of images.\n        data_dir(str): Image dataset path.\n\n    Returns:\n        label_names(List(str)): The dataset label names.\n        label_ids(List(int)): The dataset label ids.\n        category_to_id_map(dict): Mapping relations of category and id for images.\n    \"\"\"\n\n    def __init__(self, attrbox: Callable, data_dir: str):\n        self.attrbox = attrbox\n        self.img_dir = data_dir\n\n    def __call__(self):\n        self.categories = self.attrbox.loadCats(self.attrbox.getCatIds())\n        self.num_category = len(self.categories)\n        label_names = []\n        label_ids = []\n        for category in self.categories:\n            label_names.append(category['name'])\n            label_ids.append(int(category['id']))\n        category_to_id_map = {v: i for i, v in enumerate(label_ids)}\n        return label_names, label_ids, category_to_id_map\n\n\nclass ParseImages:\n    \"\"\"Prepare images for detection.\n\n    Args:\n        attrbox(Callable): Method to get detection 
attributes of images.\n        data_dir(str): Image dataset path.\n        category_to_id_map(dict): Mapping relations of category and id for images.\n\n    Returns:\n        imgs(dict): The input for detection model, it is a dict.\n    \"\"\"\n\n    def __init__(self, attrbox: Callable, data_dir: str, category_to_id_map: dict):\n        self.attrbox = attrbox\n        self.img_dir = data_dir\n        self.category_to_id_map = category_to_id_map\n        self.parse_gt_annotations = GTAnotations(self.attrbox, self.category_to_id_map)\n\n    def __call__(self):\n        image_ids = self.attrbox.getImgIds()\n        image_ids.sort()\n        imgs = copy.deepcopy(self.attrbox.loadImgs(image_ids))\n\n        for img in imgs:\n            img['image'] = os.path.join(self.img_dir, img['file_name'])\n            assert os.path.exists(img['image']), \"image {} not found.\".format(img['image'])\n            box_num = 50\n            img['gt_boxes'] = np.zeros((box_num, 4), dtype=np.float32)\n            img['gt_labels'] = np.zeros((box_num), dtype=np.int32)\n            img = self.parse_gt_annotations(img)\n        return imgs\n\n\nclass GTAnotations:\n    \"\"\"Set gt boxes and gt labels for train.\n\n    Args:\n        attrbox(Callable): Method for get detection attributes for images.\n        category_to_id_map(dict): Mapping relations of category and id for images.\n        img(dict): Input for detection model.\n\n    Returns:\n        img(dict): Set specific value on the attributes of 'gt boxes' and 'gt labels' for input.\n    \"\"\"\n\n    def __init__(self, attrbox: Callable, category_to_id_map: dict):\n        self.attrbox = attrbox\n        self.category_to_id_map = category_to_id_map\n\n    def box_to_center_relative(self, box: list, img_height: int, img_width: int) -> np.ndarray:\n        \"\"\"\n            Convert COCO annotations box with format [x1, y1, w, h] to\n            center mode [center_x, center_y, w, h] and divide image width\n            and height 
to get relative value in range[0, 1]\n        \"\"\"\n        assert len(box) == 4, \"box should be a len(4) list or tuple\"\n        x, y, w, h = box\n\n        x1 = max(x, 0)\n        x2 = min(x + w - 1, img_width - 1)\n        y1 = max(y, 0)\n        y2 = min(y + h - 1, img_height - 1)\n\n        x = (x1 + x2) / 2 / img_width\n        y = (y1 + y2) / 2 / img_height\n        w = (x2 - x1) / img_width\n        h = (y2 - y1) / img_height\n\n        return np.array([x, y, w, h])\n\n    def __call__(self, img: dict):\n        img_height = img['height']\n        img_width = img['width']\n        anno = self.attrbox.loadAnns(self.attrbox.getAnnIds(imgIds=img['id'], iscrowd=None))\n        gt_index = 0\n\n        for target in anno:\n            if target['area'] < -1:\n                continue\n            if 'ignore' in target and target['ignore']:\n                continue\n            box = self.box_to_center_relative(target['bbox'], img_height, img_width)\n\n            if box[2] <= 0 and box[3] <= 0:\n                continue\n            img['gt_boxes'][gt_index] = box\n            img['gt_labels'][gt_index] = \\\n                self.category_to_id_map[target['category_id']]\n            gt_index += 1\n            if gt_index >= 50:\n                break\n        return img\n\n\nclass DetectionData(paddle.io.Dataset):\n    \"\"\"\n    Dataset for image detection.\n\n    Args:\n       transform(callmethod) : The method of preprocess images.\n       mode(str): The mode for preparing dataset.\n\n    Returns:\n        DataSet: An iterable object for data iterating\n    \"\"\"\n\n    def __init__(self, transform: Callable, size: int = 416, mode: str = 'train'):\n        self.mode = mode\n        self.transform = transform\n        self.size = size\n\n        if self.mode == 'train':\n            train_file_list = 'annotations/instances_train2017.json'\n            train_data_dir = 'train2017'\n            self.train_file_list = os.path.join(DATA_HOME, 'voc', 
train_file_list)\n            self.train_data_dir = os.path.join(DATA_HOME, 'voc', train_data_dir)\n            self.COCO = COCO(self.train_file_list)\n            self.img_dir = self.train_data_dir\n\n        elif self.mode == 'test':\n            val_file_list = 'annotations/instances_val2017.json'\n            val_data_dir = 'val2017'\n            self.val_file_list = os.path.join(DATA_HOME, 'voc', val_file_list)\n            self.val_data_dir = os.path.join(DATA_HOME, 'voc', val_data_dir)\n            self.COCO = COCO(self.val_file_list)\n            self.img_dir = self.val_data_dir\n\n        parse_dataset_catagory = DetectCatagory(self.COCO, self.img_dir)\n        self.label_names, self.label_ids, self.category_to_id_map = parse_dataset_catagory()\n        parse_images = ParseImages(self.COCO, self.img_dir, self.category_to_id_map)\n        self.data = parse_images()\n\n    def __getitem__(self, idx: int):\n        img = self.data[idx]\n        im, data = self.transform(img)\n        out_img, gt_boxes, gt_labels, gt_scores = im, data['gt_boxes'], data['gt_labels'], data['gt_scores']\n        return out_img, gt_boxes, gt_labels, gt_scores\n\n    def __len__(self):\n        return len(self.data)\n"
  },
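`GTAnotations.box_to_center_relative` above clips a COCO `[x, y, w, h]` box to the image and normalizes it to center form. A plain-Python sketch of the same arithmetic (returning a tuple instead of the `np.ndarray` the original returns, so it runs without numpy):

```python
def box_to_center_relative(box, img_height, img_width):
    """Convert a COCO [x, y, w, h] box to relative [cx, cy, w, h] in [0, 1] (sketch)."""
    assert len(box) == 4, 'box should be a len(4) list or tuple'
    x, y, w, h = box

    # Clip the box to the image, using inclusive pixel coordinates.
    x1 = max(x, 0)
    x2 = min(x + w - 1, img_width - 1)
    y1 = max(y, 0)
    y2 = min(y + h - 1, img_height - 1)

    # Center and size, normalized by image dimensions.
    cx = (x1 + x2) / 2 / img_width
    cy = (y1 + y2) / 2 / img_height
    rw = (x2 - x1) / img_width
    rh = (y2 - y1) / img_height
    return (cx, cy, rw, rh)

# A 2x2 box at the origin of a 4x4 image spans pixels 0..1, so its center
# is at pixel 0.5 -> 0.125 relative, and its clipped extent is 1 px -> 0.25.
print(box_to_center_relative([0, 0, 2, 2], 4, 4))
```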
  {
    "path": "paddlehub/env.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n'''\nThis module is used to store environmental variables in PaddleHub.\n\n\nHUB_HOME              -->  the root directory for storing PaddleHub related data. Default to ~/.paddlehub. Users can change the\n├                          default value through the HUB_HOME environment variable.\n├── MODULE_HOME       -->  Store the installed PaddleHub Module.\n├── CACHE_HOME        -->  Store the cached data.\n├── DATA_HOME         -->  Store the automatically downloaded datasets.\n├── CONF_HOME         -->  Store the default configuration files.\n├── THIRD_PARTY_HOME  -->  Store third-party libraries.\n├── TMP_HOME          -->  Store temporary files generated during running, such as intermediate products of installing modules,\n├                          files in this directory will generally be automatically cleared.\n├── SOURCES_HOME      -->  Store the installed code sources.\n└── LOG_HOME          -->  Store log files generated during operation, including some non-fatal errors. 
The log will be stored\n                           daily.\n'''\n\nimport os\n\n\ndef _get_user_home():\n    return os.path.expanduser('~')\n\n\ndef _get_hub_home():\n    if 'HUB_HOME' in os.environ:\n        home_path = os.environ['HUB_HOME']\n        if os.path.exists(home_path):\n            if os.path.isdir(home_path):\n                return home_path\n            else:\n                raise RuntimeError('The environment variable HUB_HOME {} is not a directory.'.format(home_path))\n        else:\n            return home_path\n    return os.path.join(_get_user_home(), '.paddlehub')\n\n\ndef _get_sub_home(directory):\n    home = os.path.join(_get_hub_home(), directory)\n    os.makedirs(home, exist_ok=True)\n    return home\n\n\nUSER_HOME = _get_user_home()\nHUB_HOME = _get_hub_home()\nMODULE_HOME = _get_sub_home('modules')\nCACHE_HOME = _get_sub_home('cache')\nDATA_HOME = _get_sub_home('dataset')\nCONF_HOME = _get_sub_home('conf')\nTHIRD_PARTY_HOME = _get_sub_home('thirdparty')\nTMP_HOME = _get_sub_home('tmp')\nSOURCES_HOME = _get_sub_home('sources')\nLOG_HOME = _get_sub_home('log')\n"
  },
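`_get_hub_home` in `paddlehub/env.py` resolves the root directory with three outcomes: an existing directory set via `HUB_HOME` is used, an existing non-directory raises, and anything else falls back (or is accepted as a not-yet-created path). A standalone sketch of that resolution, with a hypothetical `resolve_hub_home` wrapper that takes the environment as a parameter for testability:

```python
import os

def resolve_hub_home(environ):
    """Resolve the PaddleHub root directory the way paddlehub.env._get_hub_home does (sketch)."""
    if 'HUB_HOME' in environ:
        home_path = environ['HUB_HOME']
        # A path that exists must be a directory; a non-existent path is accepted
        # as-is (it will be created lazily by _get_sub_home via os.makedirs).
        if os.path.exists(home_path) and not os.path.isdir(home_path):
            raise RuntimeError('The environment variable HUB_HOME {} is not a directory.'.format(home_path))
        return home_path
    return os.path.join(os.path.expanduser('~'), '.paddlehub')
```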
  {
    "path": "paddlehub/finetune/__init__.py",
    "content": "# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlehub.finetune.trainer import Trainer\n"
  },
  {
    "path": "paddlehub/finetune/trainer.py",
    "content": "# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\nimport pickle\nimport time\nfrom collections import defaultdict\nfrom typing import Any\nfrom typing import Callable\nfrom typing import Generic\nfrom typing import List\n\nimport numpy as np\nimport paddle\nfrom visualdl import LogWriter\n\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import Timer\n\n\nclass Trainer(object):\n    '''\n    Model trainer\n\n    Args:\n        model(paddle.nn.Layer) : Model to train or evaluate.\n        optimizer(paddle.optimizer.Optimizer) : Optimizer for loss.\n        use_gpu(bool) : Whether to use gpu to run.\n        use_vdl(bool) : Whether to use visualdl to record training data.\n        checkpoint_dir(str) : Directory where the checkpoint is saved, and the trainer will restore the\n            state and model parameters from the checkpoint.\n        compare_metrics(callable) : The method of comparing the model metrics. If not specified, the main\n            metric return by `validation_step` will be used for comparison by default, the larger the\n            value, the better the effect. This method will affect the saving of the best model. If the\n            default behavior does not meet your requirements, please pass in a custom method.\n\n            Example:\n                .. 
code-block:: python\n\n                    def compare_metrics(old_metric: dict, new_metric: dict):\n                        mainkey = list(new_metric.keys())[0]\n                        return old_metric[mainkey] < new_metric[mainkey]\n    '''\n\n    def __init__(self,\n                 model: paddle.nn.Layer,\n                 optimizer: paddle.optimizer.Optimizer,\n                 use_gpu: bool = False,\n                 use_vdl: bool = True,\n                 checkpoint_dir: str = None,\n                 compare_metrics: Callable = None,\n                 **kwargs):\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n        self.nranks = paddle.distributed.get_world_size()\n        self.local_rank = paddle.distributed.get_rank()\n        self.model = model\n        self.optimizer = optimizer\n        self.checkpoint_dir = checkpoint_dir if checkpoint_dir else 'ckpt_{}'.format(time.time())\n\n        if not isinstance(self.model, paddle.nn.Layer):\n            raise TypeError('The model {} is not a `paddle.nn.Layer` object.'.format(self.model.__name__))\n\n        if self.local_rank == 0 and not os.path.exists(self.checkpoint_dir):\n            os.makedirs(self.checkpoint_dir)\n\n        self.use_vdl = use_vdl\n        if self.local_rank == 0 and self.use_vdl:\n            vdl_dir = os.path.join(self.checkpoint_dir, 'visualization')\n            self.log_writer = LogWriter(vdl_dir)\n\n        self.current_epoch = 0\n        self.best_metrics = defaultdict(int)\n\n        if self.nranks > 1:\n            paddle.distributed.init_parallel_env()\n            self.model = paddle.DataParallel(self.model)\n\n        self.compare_metrics = self._compare_metrics if not compare_metrics else compare_metrics\n        self._load_checkpoint()\n\n    def _load_checkpoint(self):\n        '''Load checkpoint and state dict'''\n        max_epoch = -1\n\n        for file in os.listdir(self.checkpoint_dir):\n            if not file.startswith('epoch_'):\n   
             continue\n\n            _epoch = file.split('_')[-1]\n            if not _epoch.isdigit():\n                continue\n\n            max_epoch = max(max_epoch, int(_epoch))\n\n        if max_epoch == -1:\n            if self.local_rank == 0:\n                logger.warning('PaddleHub model checkpoint not found, start from scratch...')\n            return\n\n        # load best metrics\n        self._load_metrics()\n\n        self.current_epoch = max_epoch\n        metric_msg = ['{}={:.4f}'.format(metric, value) for metric, value in self.best_metrics.items()]\n        metric_msg = ' '.join(metric_msg)\n        if self.local_rank == 0:\n            logger.info('PaddleHub model checkpoint loaded. current_epoch={} [{}]'.format(\n                self.current_epoch, metric_msg))\n\n        model_path = os.path.join(self.checkpoint_dir, 'epoch_{}'.format(self.current_epoch))\n        self.load_model(model_path)\n\n    def load_model(self, load_dir: str):\n        \"\"\"load model\"\"\"\n        # load model checkpoint\n        model_params_path = os.path.join(load_dir, 'model.pdparams')\n        state_dict = paddle.load(model_params_path)\n        self.model.set_state_dict(state_dict)\n\n        # load optimizer checkpoint\n        optim_params_path = os.path.join(load_dir, 'model.pdopt')\n        state_dict = paddle.load(optim_params_path)\n        self.optimizer.set_state_dict(state_dict)\n\n    def _save_checkpoint(self):\n        '''Save model checkpoint and state dict'''\n        model_path = os.path.join(self.checkpoint_dir, 'epoch_{}'.format(self.current_epoch))\n        logger.info('Saving model checkpoint to {}'.format(model_path))\n        self.save_model(model_path)\n\n    def save_model(self, save_dir: str):\n        '''Save model'''\n        model_params_path = os.path.join(save_dir, 'model.pdparams')\n        optim_params_path = os.path.join(save_dir, 'model.pdopt')\n        paddle.save(self.model.state_dict(), model_params_path)\n        
paddle.save(self.optimizer.state_dict(), optim_params_path)\n\n    def _save_metrics(self):\n        with open(os.path.join(self.checkpoint_dir, 'metrics.pkl'), 'wb') as file:\n            pickle.dump(self.best_metrics, file)\n\n    def _load_metrics(self):\n        metrics_file = os.path.join(self.checkpoint_dir, 'metrics.pkl')\n        if not os.path.exists(metrics_file):\n            return\n\n        with open(metrics_file, 'rb') as file:\n            self.best_metrics = pickle.load(file)\n\n    def train(self,\n              train_dataset: paddle.io.Dataset,\n              epochs: int = 1,\n              batch_size: int = 1,\n              num_workers: int = 0,\n              eval_dataset: paddle.io.Dataset = None,\n              log_interval: int = 10,\n              save_interval: int = 10,\n              collate_fn: Callable = None):\n        '''\n        Train a model with specific config.\n\n        Args:\n            train_dataset(paddle.io.Dataset) : Dataset to train the model\n            epochs(int) : Number of training loops, default is 1.\n            batch_size(int) : Batch size per step, default is 1.\n            num_workers(int) : Number of subprocesses to load data, default is 0.\n            eval_dataset(paddle.io.Dataset) : The validation dataset, default is None. If set, the Trainer will\n                execute evaluate function every `save_interval` epochs.\n            log_interval(int) : Log the train information every `log_interval` steps.\n            save_interval(int) : Save the checkpoint every `save_interval` epochs.\n            collate_fn(callable): function to generate mini-batch data by merging the sample list.\n                None for only stack each field of the sample in axis 0 (same as :attr:`np.stack(..., axis=0)`). 
Default None\n        '''\n        if eval_dataset is not None:\n            if isinstance(self.model, paddle.DataParallel):\n                model = self.model._layers\n            else:\n                model = self.model\n\n            if not hasattr(model, 'validation_step'):\n                raise NotImplementedError('The specified finetuning model does not support evaluation.')\n\n        batch_sampler = paddle.io.DistributedBatchSampler(train_dataset,\n                                                          batch_size=batch_size,\n                                                          shuffle=True,\n                                                          drop_last=False)\n        loader = paddle.io.DataLoader(train_dataset,\n                                      batch_sampler=batch_sampler,\n                                      num_workers=num_workers,\n                                      return_list=True,\n                                      use_buffer_reader=True,\n                                      collate_fn=collate_fn)\n\n        steps_per_epoch = len(batch_sampler)\n        timer = Timer(steps_per_epoch * epochs)\n        timer.start()\n\n        for i in range(epochs):\n            self.current_epoch += 1\n            avg_loss = 0\n            avg_metrics = defaultdict(int)\n            self.model.train()\n\n            for batch_idx, batch in enumerate(loader):\n                loss, metrics = self.training_step(batch, batch_idx)\n                self.optimizer_step(self.current_epoch, batch_idx, self.optimizer, loss)\n                self.optimizer_zero_grad(self.current_epoch, batch_idx, self.optimizer)\n\n                # calculate metrics and loss\n                avg_loss += float(loss)\n                for metric, value in metrics.items():\n                    if isinstance(value, paddle.Tensor):\n                        value = value.numpy()\n                    avg_metrics[metric] += value\n\n                timer.count()\n\n  
              if (batch_idx + 1) % log_interval == 0 and self.local_rank == 0:\n                    lr = self.optimizer.get_lr()\n                    avg_loss /= log_interval\n                    if self.use_vdl:\n                        self.log_writer.add_scalar(tag='TRAIN/loss', step=timer.current_step, value=avg_loss)\n\n                    print_msg = 'Epoch={}/{}, Step={}/{}'.format(self.current_epoch, epochs, batch_idx + 1,\n                                                                 steps_per_epoch)\n                    print_msg += ' loss={:.4f}'.format(avg_loss)\n\n                    for metric, value in avg_metrics.items():\n                        value /= log_interval\n                        if self.use_vdl:\n                            self.log_writer.add_scalar(tag='TRAIN/{}'.format(metric),\n                                                       step=timer.current_step,\n                                                       value=value)\n                        if isinstance(value, np.ndarray):\n                            value = value.item()\n                        print_msg += ' {}={:.4f}'.format(metric, value)\n\n                    print_msg += ' lr={:.6f} step/sec={:.2f} | ETA {}'.format(lr, timer.timing, timer.eta)\n\n                    logger.train(print_msg)\n\n                    avg_loss = 0\n                    avg_metrics = defaultdict(int)\n\n                if self.current_epoch % save_interval == 0 and batch_idx + 1 == steps_per_epoch and self.local_rank == 0:\n                    if eval_dataset:\n                        result = self.evaluate(eval_dataset, batch_size, num_workers, collate_fn=collate_fn)\n                        eval_loss = result.get('loss', None)\n                        eval_metrics = result.get('metrics', {})\n                        if self.use_vdl:\n                            if eval_loss:\n                                self.log_writer.add_scalar(tag='EVAL/loss', step=timer.current_step, 
value=eval_loss)\n\n                            for metric, value in eval_metrics.items():\n                                self.log_writer.add_scalar(tag='EVAL/{}'.format(metric),\n                                                           step=timer.current_step,\n                                                           value=value)\n\n                        if not self.best_metrics or self.compare_metrics(self.best_metrics, eval_metrics):\n                            self.best_metrics = eval_metrics\n                            best_model_path = os.path.join(self.checkpoint_dir, 'best_model')\n                            self.save_model(best_model_path)\n                            self._save_metrics()\n\n                            metric_msg = [\n                                '{}={:.4f}'.format(metric, value) for metric, value in self.best_metrics.items()\n                            ]\n                            metric_msg = ' '.join(metric_msg)\n                            logger.eval('Saving best model to {} [best {}]'.format(best_model_path, metric_msg))\n\n                    self._save_checkpoint()\n\n    def evaluate(self,\n                 eval_dataset: paddle.io.Dataset,\n                 batch_size: int = 1,\n                 num_workers: int = 0,\n                 collate_fn: Callable = None):\n        '''\n        Run evaluation and returns metrics.\n\n        Args:\n            eval_dataset(paddle.io.Dataset) : The validation dataset\n            batch_size(int) : Batch size of per step, default is 1.\n            num_workers(int) : Number of subprocess to load data, default is 0.\n            collate_fn(callable): function to generate mini-batch data by merging the sample list.\n                None for only stack each fields of sample in axis 0(same as :attr::`np.stack(..., axis=0)`). 
Default None\n        '''\n        if self.local_rank == 0:\n            batch_sampler = paddle.io.BatchSampler(eval_dataset, batch_size=batch_size, shuffle=False, drop_last=False)\n\n            loader = paddle.io.DataLoader(eval_dataset,\n                                          batch_sampler=batch_sampler,\n                                          num_workers=num_workers,\n                                          return_list=True,\n                                          collate_fn=collate_fn)\n\n            self.model.eval()\n\n            avg_loss = num_samples = 0\n            sum_metrics = defaultdict(int)\n            avg_metrics = defaultdict(int)\n\n            with logger.processing('Evaluation on validation dataset'):\n                with paddle.no_grad():\n                    for batch_idx, batch in enumerate(loader):\n                        result = self.validation_step(batch, batch_idx)\n\n                        loss = result.get('loss', None)\n                        metrics = result.get('metrics', {})\n                        bs = batch[0].shape[0]\n                        num_samples += bs\n\n                        if loss:\n                            avg_loss += float(loss) * bs\n\n                        for metric, value in metrics.items():\n                            sum_metrics[metric] += value * bs\n\n            # print avg metrics and loss\n            print_msg = '[Evaluation result]'\n            if loss:\n                avg_loss /= num_samples\n                print_msg += ' avg_loss={:.4f}'.format(avg_loss)\n\n            for metric, value in sum_metrics.items():\n                avg_metrics[metric] = float(value) / num_samples\n                print_msg += ' avg_{}={:.4f}'.format(metric, avg_metrics[metric])\n\n            logger.eval(print_msg)\n\n            if loss:\n                return {'loss': avg_loss, 'metrics': avg_metrics}\n            return {'metrics': avg_metrics}\n\n    def training_step(self, batch: 
List[paddle.Tensor], batch_idx: int):\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]) : The one batch data\n            batch_idx(int) : The index of batch.\n        '''\n        if self.nranks > 1:\n            result = self.model._layers.training_step(batch, batch_idx)\n        else:\n            result = self.model.training_step(batch, batch_idx)\n\n        # process result\n        if not isinstance(result, dict):\n            raise RuntimeError('The return value of `training_step` in {} is not a dict'.format(self.model.__class__))\n\n        loss = result.get('loss', None)\n        if loss is None:\n            raise RuntimeError('Cannot find loss attribute in the return value of `training_step` of {}'.format(\n                self.model.__class__))\n\n        metrics = result.get('metrics', {})\n\n        # back prop\n        loss.backward()\n\n        return loss, metrics\n\n    def validation_step(self, batch: Any, batch_idx: int):\n        '''\n        One step for validation, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]) : The one batch data\n            batch_idx(int) : The index of batch.\n        '''\n        if self.nranks > 1:\n            result = self.model._layers.validation_step(batch, batch_idx)\n        else:\n            result = self.model.validation_step(batch, batch_idx)\n        return result\n\n    def optimizer_step(self, epoch_idx: int, batch_idx: int, optimizer: paddle.optimizer.Optimizer,\n                       loss: paddle.Tensor):\n        '''\n        One step for optimization.\n\n        Args:\n            epoch_idx(int) : The index of epoch.\n            batch_idx(int) : The index of batch.\n            optimizer(paddle.optimizer.Optimizer) : Optimizer used.\n            loss(paddle.Tensor) : Loss tensor.\n        '''\n        self.optimizer.step()\n        self.learning_rate_step(epoch_idx, batch_idx, self.optimizer._learning_rate, loss)\n\n    def learning_rate_step(self, epoch_idx: int, batch_idx: int, learning_rate: Generic, loss: paddle.Tensor):\n        if isinstance(learning_rate, paddle.optimizer.lr.LRScheduler):\n            learning_rate.step()\n\n    def optimizer_zero_grad(self, epoch_idx: int, batch_idx: int, optimizer: paddle.optimizer.Optimizer):\n        '''\n        One step to clear gradients.\n\n        Args:\n            epoch_idx(int) : The index of epoch.\n            batch_idx(int) : The index of batch.\n            optimizer(paddle.optimizer.Optimizer) : Optimizer used.\n        '''\n        self.model.clear_gradients()\n\n    def _compare_metrics(self, old_metric: dict, new_metric: dict):\n        '''Compare whether the new metric value is better than the old one'''\n        mainkey = list(new_metric.keys())[0]\n        return old_metric[mainkey] < new_metric[mainkey]\n"
  },
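The `Trainer` above keeps `best_metrics` as a `defaultdict(int)` and, after each evaluation, replaces it when `compare_metrics` favors the new result (the default compares the first metric key, larger is better). A standalone sketch of that bookkeeping, detached from paddle:

```python
from collections import defaultdict

def default_compare_metrics(old_metric, new_metric):
    """Default comparison used by Trainer (sketch): larger main metric wins."""
    mainkey = list(new_metric.keys())[0]
    return old_metric[mainkey] < new_metric[mainkey]

# best starts as an empty defaultdict(int), so the first evaluation always wins
# (the `not best` branch), mirroring the `if not self.best_metrics or ...` check.
best = defaultdict(int)
for eval_metrics in ({'acc': 0.81}, {'acc': 0.79}, {'acc': 0.90}):
    if not best or default_compare_metrics(best, eval_metrics):
        best = eval_metrics  # this is where Trainer would save best_model
```

Passing a custom `compare_metrics` (e.g. smaller-is-better for a loss-like metric) flips this logic without touching the training loop.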
  {
    "path": "paddlehub/module/__init__.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom .module import Module\n"
  },
  {
    "path": "paddlehub/module/audio_module.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom collections import OrderedDict\nfrom typing import List, Tuple\n\nimport numpy as np\nimport paddle\n\nfrom paddlehub.module.module import RunModule, runnable, serving\nfrom paddlehub.utils.utils import extract_melspectrogram\n\n\nclass AudioClassifierModule(RunModule):\n    \"\"\"\n    The base class for audio classifier models.\n    \"\"\"\n    _tasks_supported = [\n        'sound-cls',\n    ]\n\n    def _batchify(self, data: List[List[float]], sample_rate: int, feat_type: str, batch_size: int):\n        examples = []\n        for waveform in data:\n            if feat_type == 'mel':\n                feat = extract_melspectrogram(waveform, sample_rate=sample_rate)\n                examples.append(feat)\n            else:\n                examples.append(waveform)\n\n        # Seperates data into some batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield one_batch\n                one_batch = []\n        if one_batch:\n            # The last batch whose size is less than the config batch_size setting.\n            yield one_batch\n\n    def training_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        if self.task == 'sound-cls':\n            predictions, avg_loss, metric = self(feats=batch[0], 
labels=batch[1])\n        else:\n            raise NotImplementedError\n\n        self.metric.reset()\n        return {'loss': avg_loss, 'metrics': metric}\n\n    def validation_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        if self.task == 'sound-cls':\n            predictions, avg_loss, metric = self(feats=batch[0], labels=batch[1])\n        else:\n            raise NotImplementedError\n\n        return {'metrics': metric}\n\n    def predict(self,\n                data: List[List[float]],\n                sample_rate: int,\n                batch_size: int = 1,\n                feat_type: str = 'mel',\n                use_gpu: bool = False):\n        if self.task not in self._tasks_supported \\\n                and self.task is not None:      # None for getting audioset tags\n            raise RuntimeError(f'Unknown task {self.task}, current tasks supported:\\n'\n                               '1. sound-cls: sound classification;\\n'\n                               '2. None: audioset tags')\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, sample_rate, feat_type, batch_size)\n        results = []\n        self.eval()\n        for batch in batches:\n            feats = paddle.to_tensor(batch)\n            scores = self(feats)\n\n            for score in scores.numpy():\n                result = OrderedDict()\n                for i in (-score).argsort():\n                    result[self.label_map[i]] = score[i]\n                results.append(result)\n\n        return results\n"
  },
  {
    "path": "paddlehub/module/cv_module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\nimport os\nimport base64\nimport argparse\nfrom typing import List, Union, Tuple\nfrom collections import OrderedDict\n\nimport cv2\nimport paddle\nimport numpy as np\nimport paddle.nn as nn\nimport paddle.nn.functional as F\nfrom PIL import Image\n\nimport paddlehub.vision.transforms as T\nimport paddlehub.vision.functional as Func\nfrom paddlehub.vision import utils\nfrom paddlehub.module.module import serving, RunModule, runnable\nfrom paddlehub.utils.utils import base64_to_cv2, cv2_to_base64, Version\n\n\nclass ImageServing(object):\n    @serving\n    def serving_method(self, images: List[str], **kwargs) -> List[dict]:\n        \"\"\"Run as a service.\"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        results = self.predict(images=images_decode, **kwargs)\n        return results\n\n\nclass ImageClassifierModule(RunModule, ImageServing):\n    def training_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]) : The one batch data, which contains images and labels.\n            batch_idx(int) : The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as loss and metrics.\n        '''\n        
return self.validation_step(batch, batch_idx)\n\n    def validation_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for validation, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]) : The one batch data, which contains images and labels.\n            batch_idx(int) : The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as metrics.\n        '''\n        images = batch[0]\n        labels = paddle.unsqueeze(batch[1], axis=-1)\n        labels = labels.astype('int64')\n\n        preds, feature = self(images)\n\n        loss, _ = F.softmax_with_cross_entropy(preds, labels, return_softmax=True, axis=1)\n        loss = paddle.mean(loss)\n        acc = paddle.metric.accuracy(preds, labels)\n        return {'loss': loss, 'metrics': {'acc': acc}}\n\n    def predict(self, images: List[np.ndarray], batch_size: int = 1, top_k: int = 1) -> List[dict]:\n        '''\n        Predict images\n\n        Args:\n            images(list[numpy.ndarray]) : Images to be predicted, consist of np.ndarray in bgr format.\n            batch_size(int) : Batch size for prediciton.\n            top_k(int) : Output top k result of each image.\n\n        Returns:\n            results(list[dict]) : The prediction result of each input image\n        '''\n        self.eval()\n        with paddle.no_grad():\n            res = []\n            total_num = len(images)\n            loop_num = int(np.ceil(total_num / batch_size))\n\n            for iter_id in range(loop_num):\n                batch_data = []\n                handle_id = iter_id * batch_size\n                for image_id in range(batch_size):\n                    try:\n                        image = self.transforms(images[handle_id + image_id])\n                        batch_data.append(image)\n                    except:\n                        pass\n                batch_image = np.array(batch_data, 
dtype='float32')\n                preds, feature = self(paddle.to_tensor(batch_image))\n                preds = F.softmax(preds, axis=1).numpy()\n                pred_idxs = np.argsort(preds)[:, ::-1][:, :top_k]\n\n                for i, pred in enumerate(pred_idxs):\n                    res_dict = {}\n                    for k in pred:\n                        class_name = self.labels[int(k)]\n                        res_dict[class_name] = preds[i][k]\n\n                    res.append(res_dict)\n\n            return res\n\n    @serving\n    def serving_method(self, images: list, top_k: int, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        top_k = int(top_k)\n        images_decode = [base64_to_cv2(image) for image in images]\n        resdicts = self.predict(images=images_decode, top_k=top_k, **kwargs)\n        final = {}\n        for resdict in resdicts:\n            for key, value in resdict.items():\n                resdict[key] = float(value)\n        final['data'] = resdicts\n        return final\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(images=[args.input_path], top_k=args.top_k)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument('--top_k', type=int, default=1, help=\"top_k classification result.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nclass ImageColorizeModule(RunModule, ImageServing):\n    def training_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images and labels.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict): The model outputs, such as loss and metrics.\n        '''\n        return self.validation_step(batch, batch_idx)\n\n    def validation_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for validation, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images and labels.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as metrics.\n        '''\n        if Version(paddle.__version__) >= '2.1' or Version(paddle.__version__) == '0.0.0':\n            img = self.preprocess(batch)\n        else:\n            img = 
self.preprocess(batch[0])\n\n        out_class, out_reg = self(img['A'], img['hint_B'], img['mask_B'])\n\n        # loss\n        loss_ce = F.cross_entropy(out_class, img['real_B_enc'][:, :1, :, :], axis=1)\n        loss_ce = paddle.mean(loss_ce)\n        loss_G_L1_reg = paddle.sum(paddle.abs(img['B'] - out_reg), axis=1, keepdim=True)\n        loss_G_L1_reg = paddle.mean(loss_G_L1_reg)\n        loss = loss_ce + loss_G_L1_reg\n        return {'loss': loss}\n\n    def predict(self, images: list, visualization: bool = True, batch_size: int = 1, save_path: str = 'colorization'):\n        '''\n        Colorize images\n\n        Args:\n            images(list[str|np.ndarray]) : Image paths or BGR images to be colorized.\n            visualization(bool): Whether to save colorized images.\n            batch_size(int): Batch size for prediction.\n            save_path(str) : Path to save colorized images.\n\n        Returns:\n            res(list[dict]) : The prediction result of each input image.\n        '''\n        self.eval()\n        with paddle.no_grad():\n            lab2rgb = T.LAB2RGB()\n            res = []\n            total_num = len(images)\n            loop_num = int(np.ceil(total_num / batch_size))\n            for iter_id in range(loop_num):\n                batch_data = []\n                handle_id = iter_id * batch_size\n                for image_id in range(batch_size):\n                    try:\n                        image = self.transforms(images[handle_id + image_id])\n                        batch_data.append(image)\n                    except Exception:  # The last batch may be smaller than batch_size.\n                        pass\n                batch_data = np.array(batch_data)\n                im = self.preprocess(batch_data)\n                out_class, out_reg = self(im['A'], im['hint_B'], im['mask_B'])\n\n                visual_ret = OrderedDict()\n                for i in range(im['A'].shape[0]):\n                    gray = lab2rgb(np.concatenate((im['A'].numpy(), np.zeros(im['B'].shape)), 
axis=1))[i]\n                    gray = np.clip(np.transpose(gray, (1, 2, 0)), 0, 1) * 255\n                    visual_ret['gray'] = gray.astype(np.uint8)\n                    hint = lab2rgb(np.concatenate((im['A'].numpy(), im['hint_B'].numpy()), axis=1))[i]\n                    hint = np.clip(np.transpose(hint, (1, 2, 0)), 0, 1) * 255\n                    visual_ret['hint'] = hint.astype(np.uint8)\n                    real = lab2rgb(np.concatenate((im['A'].numpy(), im['B'].numpy()), axis=1))[i]\n                    real = np.clip(np.transpose(real, (1, 2, 0)), 0, 1) * 255\n                    visual_ret['real'] = real.astype(np.uint8)\n                    fake = lab2rgb(np.concatenate((im['A'].numpy(), out_reg.numpy()), axis=1))[i]\n                    fake = np.clip(np.transpose(fake, (1, 2, 0)), 0, 1) * 255\n                    visual_ret['fake_reg'] = fake.astype(np.uint8)\n\n                    if visualization:\n                        if isinstance(images[handle_id + i], str):\n                            org_img = cv2.imread(images[handle_id + i]).astype('float32')\n                        else:\n                            org_img = images[handle_id + i]\n                        h, w, c = org_img.shape\n                        fake_name = \"fake_\" + str(time.time()) + \".png\"\n                        if not os.path.exists(save_path):\n                            os.mkdir(save_path)\n                        fake_path = os.path.join(save_path, fake_name)\n                        visual_gray = Image.fromarray(visual_ret['fake_reg'])\n                        visual_gray = visual_gray.resize((w, h), Image.BILINEAR)\n                        visual_gray.save(fake_path)\n\n                    res.append(visual_ret)\n            return res\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        visual = 
self.predict(images=images_decode, **kwargs)\n        final = {}\n        for i, visual_ret in enumerate(visual):\n            h, w, c = images_decode[i].shape\n            for key, value in visual_ret.items():\n                value = cv2.resize(cv2.cvtColor(value, cv2.COLOR_RGB2BGR), (w, h), cv2.INTER_NEAREST)\n                visual_ret[key] = cv2_to_base64(value)\n        final['data'] = visual\n        return final\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(images=[args.input_path], visualization=args.visualization, save_path=args.output_dir)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='colorization', help=\"save visualization result.\")\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n\n\nclass Yolov3Module(RunModule, 
ImageServing):\n    def training_step(self, batch: List[paddle.Tensor], batch_idx: int) -> dict:\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images, ground truth boxes, labels and scores.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict): The model outputs, such as loss.\n        '''\n\n        return self.validation_step(batch, batch_idx)\n\n    def validation_step(self, batch: List[paddle.Tensor], batch_idx: int) -> dict:\n        '''\n        One step for validation, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images, ground truth boxes, labels and scores.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as metrics.\n        '''\n        img = batch[0].astype('float32')\n        gtbox = batch[1].astype('float32')\n        gtlabel = batch[2].astype('int32')\n        gtscore = batch[3].astype('float32')\n        losses = []\n        outputs = self(img)\n        self.downsample = 32\n\n        for i, out in enumerate(outputs):\n            anchor_mask = self.anchor_masks[i]\n            loss = F.yolov3_loss(\n                x=out,\n                gt_box=gtbox,\n                gt_label=gtlabel,\n                gt_score=gtscore,\n                anchors=self.anchors,\n                anchor_mask=anchor_mask,\n                class_num=self.class_num,\n                ignore_thresh=self.ignore_thresh,\n                downsample_ratio=self.downsample,  # Per-level downsample ratio: 32, 16, 8.\n                use_label_smooth=False)\n            losses.append(paddle.mean(loss))\n            self.downsample //= 2\n\n        return {'loss': sum(losses)}\n\n    def predict(self, imgpath: str, filelist: str, visualization: bool = True, save_path: str = 'result'):\n        '''\n        Detect images\n\n 
       Args:\n            imgpath(str): Image path .\n            filelist(str): Path to get label name.\n            visualization(bool): Whether to save result image.\n            save_path(str) : Path to save detected images.\n\n        Returns:\n            boxes(np.ndarray): Predict box information.\n            scores(np.ndarray): Predict score.\n            labels(np.ndarray): Predict labels.\n        '''\n        self.eval()\n        with paddle.no_grad():\n            boxes = []\n            scores = []\n            self.downsample = 32\n            im = self.transform(imgpath)\n            h, w, c = utils.img_shape(imgpath)\n            im_shape = paddle.to_tensor(np.array([[h, w]]).astype('int32'))\n            label_names = utils.get_label_infos(filelist)\n            img_data = paddle.to_tensor(np.array([im]).astype('float32'))\n\n            outputs = self(img_data)\n\n            for i, out in enumerate(outputs):\n                anchor_mask = self.anchor_masks[i]\n                mask_anchors = []\n                for m in anchor_mask:\n                    mask_anchors.append((self.anchors[2 * m]))\n                    mask_anchors.append(self.anchors[2 * m + 1])\n\n                box, score = F.yolo_box(\n                    x=out,\n                    img_size=im_shape,\n                    anchors=mask_anchors,\n                    class_num=self.class_num,\n                    conf_thresh=self.valid_thresh,\n                    downsample_ratio=self.downsample,\n                    name=\"yolo_box\" + str(i))\n\n                boxes.append(box)\n                scores.append(paddle.transpose(score, perm=[0, 2, 1]))\n                self.downsample //= 2\n\n            yolo_boxes = paddle.concat(boxes, axis=1)\n            yolo_scores = paddle.concat(scores, axis=2)\n\n            pred = F.multiclass_nms(\n                bboxes=yolo_boxes,\n                scores=yolo_scores,\n                score_threshold=self.valid_thresh,\n                
nms_top_k=self.nms_topk,\n                keep_top_k=self.nms_posk,\n                nms_threshold=self.nms_thresh,\n                background_label=-1)\n\n            bboxes = pred.numpy()\n            labels = bboxes[:, 0].astype('int32')\n            scores = bboxes[:, 1].astype('float32')\n            boxes = bboxes[:, 2:].astype('float32')\n\n            if visualization:\n                if not os.path.exists(save_path):\n                    os.mkdir(save_path)\n                utils.draw_boxes_on_image(imgpath, boxes, scores, labels, label_names, 0.5, save_path)\n\n            return boxes, scores, labels\n\n\nclass StyleTransferModule(RunModule, ImageServing):\n    def training_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images and labels.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as loss and metrics.\n        '''\n        return self.validation_step(batch, batch_idx)\n\n    def validation_step(self, batch: int, batch_idx: int) -> dict:\n        '''\n        One step for validation, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images and labels.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict) : The model outputs, such as metrics.\n        '''\n        mse_loss = nn.MSELoss()\n        N, C, H, W = batch[0].shape\n        batch[1] = batch[1][0].unsqueeze(0)\n        self.setTarget(batch[1])\n\n        y = self(batch[0])\n        xc = paddle.to_tensor(batch[0].numpy().copy())\n        y = utils.subtract_imagenet_mean_batch(y)\n        xc = utils.subtract_imagenet_mean_batch(xc)\n        features_y = self.getFeature(y)\n        features_xc = 
self.getFeature(xc)\n        f_xc_c = paddle.to_tensor(features_xc[1].numpy(), stop_gradient=True)\n        content_loss = mse_loss(features_y[1], f_xc_c)\n\n        batch[1] = utils.subtract_imagenet_mean_batch(batch[1])\n        features_style = self.getFeature(batch[1])\n        gram_style = [utils.gram_matrix(y) for y in features_style]\n        style_loss = 0.\n        for m in range(len(features_y)):\n            gram_y = utils.gram_matrix(features_y[m])\n            gram_s = paddle.to_tensor(np.tile(gram_style[m].numpy(), (N, 1, 1, 1)))\n            style_loss += mse_loss(gram_y, gram_s[:N, :, :])\n\n        loss = content_loss + style_loss\n\n        return {'loss': loss, 'metrics': {'content gap': content_loss, 'style gap': style_loss}}\n\n    def predict(self,\n                origin: list,\n                style: Union[str, np.ndarray],\n                batch_size: int = 1,\n                visualization: bool = True,\n                save_path: str = 'style_tranfer'):\n        '''\n        Transfer the given style onto the content images\n\n        Args:\n            origin(list[str|np.array]): Content image paths or BGR images.\n            style(str|np.array): Style image path or BGR image.\n            batch_size(int): Batch size for prediction.\n            visualization(bool): Whether to save the transformed images.\n            save_path(str) : Path to save the transformed images.\n\n        Returns:\n            output(list[np.ndarray]) : The style transformed images with bgr mode.\n        '''\n        self.eval()\n        with paddle.no_grad():\n            style = paddle.to_tensor(self.transform(style).astype('float32'))\n            style = style.unsqueeze(0)\n\n            res = []\n            total_num = len(origin)\n            loop_num = int(np.ceil(total_num / batch_size))\n            for iter_id in range(loop_num):\n                batch_data = []\n                handle_id = iter_id * batch_size\n                for image_id in range(batch_size):\n                    try:\n       
                  image = self.transform(origin[handle_id + image_id])\n                        batch_data.append(image.astype('float32'))\n                    except Exception:  # The last batch may be smaller than batch_size.\n                        pass\n\n                batch_image = np.array(batch_data)\n                content = paddle.to_tensor(batch_image)\n\n                self.setTarget(style)\n                output = self(content)\n                for num in range(output.shape[0]):  # The last batch may contain fewer than batch_size images.\n                    out = paddle.clip(output[num].transpose((1, 2, 0)), 0, 255).numpy().astype(np.uint8)\n                    res.append(out)\n                    if visualization:\n                        style_name = \"style_\" + str(time.time()) + \".png\"\n                        if not os.path.exists(save_path):\n                            os.mkdir(save_path)\n                        path = os.path.join(save_path, style_name)\n                        cv2.imwrite(path, out)\n            return res\n\n    @serving\n    def serving_method(self, images: list, **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images[0]]\n        style_decode = base64_to_cv2(images[1])\n        results = self.predict(origin=images_decode, style=style_decode, **kwargs)\n        final = {}\n        final['data'] = [cv2_to_base64(result) for result in results]\n        return final\n\n    @runnable\n    def run_cmd(self, argvs: list):\n        \"\"\"\n        Run as a command.\n        \"\"\"\n        self.parser = argparse.ArgumentParser(\n            description=\"Run the {} module.\".format(self.name),\n            prog='hub run {}'.format(self.name),\n            usage='%(prog)s',\n            add_help=True)\n        self.arg_input_group = self.parser.add_argument_group(title=\"Input options\", description=\"Input data. 
Required\")\n        self.arg_config_group = self.parser.add_argument_group(\n            title=\"Config options\", description=\"Run configuration for controlling module behavior, not required.\")\n        self.add_module_config_arg()\n        self.add_module_input_arg()\n        args = self.parser.parse_args(argvs)\n        results = self.predict(\n            origin=[args.input_path],\n            style=args.style_path,\n            save_path=args.output_dir,\n            visualization=args.visualization)\n\n        return results\n\n    def add_module_config_arg(self):\n        \"\"\"\n        Add the command config options.\n        \"\"\"\n\n        self.arg_config_group.add_argument(\n            '--output_dir', type=str, default='style_tranfer', help=\"The directory to save output images.\")\n\n        self.arg_config_group.add_argument(\n            '--visualization', type=bool, default=True, help=\"whether to save output as images.\")\n\n    def add_module_input_arg(self):\n        \"\"\"\n        Add the command input options.\n        \"\"\"\n        self.arg_input_group.add_argument('--input_path', type=str, help=\"path to image.\")\n        self.arg_input_group.add_argument('--style_path', type=str, help=\"path to style image.\")\n\n\nclass ImageSegmentationModule(ImageServing, RunModule):\n    def training_step(self, batch: List[paddle.Tensor], batch_idx: int) -> dict:\n        '''\n        One step for training, which should be called as forward computation.\n\n        Args:\n            batch(list[paddle.Tensor]): The one batch data, which contains images, ground truth boxes, labels and scores.\n            batch_idx(int): The index of batch.\n\n        Returns:\n            results(dict): The model outputs, such as loss.\n\n        '''\n\n        label = batch[1].astype('int64')\n        criterionCE = nn.loss.CrossEntropyLoss()\n        logits = self(batch[0])\n        loss = 0\n        for i in range(len(logits)):\n            logit = logits[i]\n 
           if logit.shape[-2:] != label.shape[-2:]:\n                logit = F.interpolate(logit, label.shape[-2:], mode='bilinear')\n\n            logit = logit.transpose([0, 2, 3, 1])\n            loss_ce = criterionCE(logit, label)\n            loss += loss_ce / len(logits)\n\n        return {\"loss\": loss}\n\n    def predict(self,\n                images: Union[str, np.ndarray],\n                batch_size: int = 1,\n                visualization: bool = True,\n                save_path: str = 'seg_result') -> List[np.ndarray]:\n        '''\n        Obtain segmentation results.\n\n        Args:\n            images(list[str|np.array]): Content image path or BGR image.\n            batch_size(int): Batch size for prediciton.\n            visualization(bool): Whether to save colorized images.\n            save_path(str) : Path to save colorized images.\n\n        Returns:\n            output(list[np.ndarray]) : The segmentation mask.\n        '''\n        self.eval()\n        with paddle.no_grad():\n            result = []\n\n            total_num = len(images)\n            loop_num = int(np.ceil(total_num / batch_size))\n            for iter_id in range(loop_num):\n                batch_data = []\n                handle_id = iter_id * batch_size\n                for image_id in range(batch_size):\n                    try:\n                        image, _ = self.transform(images[handle_id + image_id])\n                        batch_data.append(image)\n                    except:\n                        pass\n                batch_image = np.array(batch_data).astype('float32')\n                pred = self(paddle.to_tensor(batch_image))\n                pred = paddle.argmax(pred[0], axis=1, keepdim=True, dtype='int32')\n\n                for num in range(pred.shape[0]):\n                    if isinstance(images[handle_id + num], str):\n                        image = cv2.imread(images[handle_id + num])\n                    else:\n                        image = 
images[handle_id + num]\n                    h, w, c = image.shape\n                    pred_final = utils.reverse_transform(pred[num:num + 1], (h, w), self.transforms.transforms)\n                    pred_final = paddle.squeeze(pred_final)\n                    pred_final = pred_final.numpy().astype('uint8')\n\n                    if visualization:\n                        added_image = utils.visualize(images[handle_id + num], pred_final, weight=0.6)\n                        pred_mask = utils.get_pseudo_color_map(pred_final)\n                        pred_image_path = os.path.join(save_path, 'image', str(time.time()) + \".png\")\n                        pred_mask_path = os.path.join(save_path, 'mask', str(time.time()) + \".png\")\n                        if not os.path.exists(os.path.dirname(pred_image_path)):\n                            os.makedirs(os.path.dirname(pred_image_path))\n                        if not os.path.exists(os.path.dirname(pred_mask_path)):\n                            os.makedirs(os.path.dirname(pred_mask_path))\n                        cv2.imwrite(pred_image_path, added_image)\n                        pred_mask.save(pred_mask_path)\n\n                    result.append(pred_final)\n            return result\n\n    @serving\n    def serving_method(self, images: List[str], **kwargs):\n        \"\"\"\n        Run as a service.\n        \"\"\"\n        images_decode = [base64_to_cv2(image) for image in images]\n        visual = self.predict(images=images_decode, **kwargs)\n        final = []\n        for mask in visual:\n            final.append(cv2_to_base64(mask))\n        return final\n"
  },
  {
    "path": "paddlehub/module/manager.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport shutil\n\nimport sys\nfrom collections import OrderedDict\nfrom typing import List\n\nimport filelock\n\nfrom paddlehub.env import MODULE_HOME, TMP_HOME\nfrom paddlehub.module.module import Module as HubModule\nfrom paddlehub.server import module_server\nfrom paddlehub.utils import xarfile, log, utils, pypi\n\n\nclass HubModuleNotFoundError(Exception):\n    def __init__(self, name: str, info: dict = None, version: str = None, source: str = None):\n        self.name = name\n        self.version = version\n        self.info = info\n        self.source = source\n\n    def __str__(self):\n        msg = '{}'.format(self.name)\n        if self.version:\n            msg += '-{}'.format(self.version)\n\n        if self.source:\n            msg += ' from {}'.format(self.source)\n\n        tips = 'No HubModule named {} was found'.format(log.FormattedText(text=msg, color='red'))\n\n        if self.info:\n            sort_infos = sorted(self.info.items(), key=lambda x: utils.Version(x[0]))\n\n            table = log.Table()\n            table.append(\n                *['Name', 'Version', 'PaddlePaddle Version Required', 'PaddleHub Version Required'],\n                widths=[15, 10, 35, 35],\n                aligns=['^', '^', '^', '^'],\n                colors=['cyan', 'cyan', 'cyan', 'cyan'])\n\n            for 
_ver, info in sort_infos:\n                paddle_version = 'Any' if not info['paddle_version'] else ', '.join(info['paddle_version'])\n                hub_version = 'Any' if not info['hub_version'] else ', '.join(info['hub_version'])\n                table.append(self.name, _ver, paddle_version, hub_version, aligns=['^', '^', '^', '^'])\n\n            tips += ':\\n{}'.format(table)\n        return tips\n\n\nclass EnvironmentMismatchError(Exception):\n    def __init__(self, name: str, info: dict, version: str = None):\n        self.name = name\n        self.version = version\n        self.info = info\n\n    def __str__(self):\n        msg = '{}'.format(self.name)\n        if self.version:\n            msg += '-{}'.format(self.version)\n\n        tips = '{} cannot be installed because some conditions are not met'.format(\n            log.FormattedText(text=msg, color='red'))\n\n        if self.info:\n            sort_infos = sorted(self.info.items(), key=lambda x: utils.Version(x[0]))\n\n            table = log.Table()\n            table.append(\n                *['Name', 'Version', 'PaddlePaddle Version Required', 'PaddleHub Version Required'],\n                widths=[15, 10, 35, 35],\n                aligns=['^', '^', '^', '^'],\n                colors=['cyan', 'cyan', 'cyan', 'cyan'])\n\n            import paddle\n            import paddlehub\n\n            for _ver, info in sort_infos:\n                paddle_version = 'Any' if not info['paddle_version'] else ', '.join(info['paddle_version'])\n                for version in info['paddle_version']:\n                    if not utils.Version(paddle.__version__).match(version):\n                        paddle_version = '{}(Mismatch)'.format(paddle_version)\n                        break\n\n                hub_version = 'Any' if not info['hub_version'] else ', '.join(info['hub_version'])\n                for version in info['hub_version']:\n                    if not 
utils.Version(paddlehub.__version__).match(version):\n                        hub_version = '{}(Mismatch)'.format(hub_version)\n                        break\n\n                table.append(self.name, _ver, paddle_version, hub_version, aligns=['^', '^', '^', '^'])\n\n            tips += ':\\n{}'.format(table)\n        return tips\n\n\nclass LocalModuleManager(object):\n    '''\n    LocalModuleManager is used to manage PaddleHub's local Module, which supports the installation, uninstallation,\n    and search of HubModule. LocalModuleManager is a singleton object related to the path, in other words, when the\n    LocalModuleManager object of the same home directory is generated multiple times, the same object is returned.\n\n    Args:\n        home (str): The directory where PaddleHub modules are stored, the default is ~/.paddlehub/modules\n    '''\n    _instance_map = {}\n\n    def __new__(cls, home: str = MODULE_HOME):\n        home = MODULE_HOME if not home else home\n        if home in cls._instance_map:\n            return cls._instance_map[home]\n        cls._instance_map[home] = super(LocalModuleManager, cls).__new__(cls)\n        return cls._instance_map[home]\n\n    def __init__(self, home: str = None):\n        home = MODULE_HOME if not home else home\n        self.home = home\n        self._local_modules = OrderedDict()\n\n        # Most HubModule can be regarded as a python package, so we need to add the home\n        # directory to sys.path\n        if not home in sys.path:\n            sys.path.insert(0, home)\n\n    def _get_normalized_path(self, name: str) -> str:\n        return os.path.join(self.home, self._get_normalized_name(name))\n\n    def _get_normalized_name(self, name: str) -> str:\n        # Some HubModules contain '-'  in name (eg roberta_wwm_ext_chinese_L-3_H-1024_A-16).\n        # Replace '-' with '_' to comply with python naming conventions.\n        return name.replace('-', '_')\n\n    def install(self,\n                *,\n            
    name: str = None,\n                directory: str = None,\n                archive: str = None,\n                url: str = None,\n                version: str = None,\n                source: str = None,\n                update: bool = False,\n                branch: str = None,\n                ignore_env_mismatch: bool = False) -> HubModule:\n        '''\n        Install a HubModule from a name, directory, archive file or url. When installing with the name parameter, if a\n        module that meets the conditions (both name and version) is already installed, the installation step will be skipped.\n        When installing with any other parameter, the locally installed module will be uninstalled.\n\n        Args:\n            name                (str|optional): module name to install\n            directory           (str|optional): directory containing module code\n            archive             (str|optional): archive file containing module code\n            url                 (str|optional): url pointing to an archive file containing module code\n            version             (str|optional): module version, use with the name parameter\n            source              (str|optional): source containing module code, use with the name parameter\n            update              (bool|optional): whether to update the locally cached git repository, use with the source parameter\n            branch              (str|optional): branch of the specified git repository, use with the source parameter\n            ignore_env_mismatch (bool|optional): whether to ignore the environment mismatch when installing the Module.\n        '''\n        if name:\n\n            lock = filelock.FileLock(os.path.join(TMP_HOME, name))\n            with lock:\n                hub_module_cls = self.search(name, source, branch)\n                if hub_module_cls and hub_module_cls.version.match(version):\n                    directory = self._get_normalized_path(hub_module_cls.name)\n                    if version:\n                        msg = 'Module {}-{} already installed in {}'.format(hub_module_cls.name, hub_module_cls.version,\n                                                                            directory)\n                    
else:\n                        msg = 'Module {} already installed in {}'.format(hub_module_cls.name, directory)\n                    log.logger.info(msg)\n                    return hub_module_cls\n                if source:\n                    return self._install_from_source(name, version, source, update, branch)\n                return self._install_from_name(name, version, ignore_env_mismatch)\n        elif directory:\n            return self._install_from_directory(directory)\n        elif archive:\n            return self._install_from_archive(archive)\n        elif url:\n            return self._install_from_url(url)\n        else:\n            raise RuntimeError('Attempt to install a module, but no parameters were specified.')\n\n    def uninstall(self, name: str) -> bool:\n        '''Return True if the module is uninstalled successfully, otherwise False.'''\n        if not os.path.exists(self._get_normalized_path(name)):\n            log.logger.info('{} is not installed'.format(name))\n            return False\n\n        shutil.rmtree(self._get_normalized_path(name))\n        if name in self._local_modules:\n            log.logger.info('Successfully uninstalled {}-{}'.format(name, self._local_modules[name].version))\n            self._local_modules.pop(name)\n        else:\n            log.logger.info('Successfully uninstalled {}'.format(name))\n        return True\n\n    def search(self, name: str, source: str = None, branch: str = None) -> HubModule:\n        '''Return the HubModule if a HubModule with the specified name is found, otherwise None.'''\n        module = None\n\n        if name in self._local_modules:\n            module = self._local_modules[name]\n        else:\n            module_dir = self._get_normalized_path(name)\n            if os.path.exists(module_dir):\n                try:\n                    module = self._local_modules[name] = HubModule.load(module_dir)\n                except:\n                    utils.record_exception('An error was encountered while 
loading {}'.format(name))\n\n        if not module:\n            return None\n\n        if source and source != module.source:\n            return None\n\n        if branch and branch != module.branch:\n            return None\n\n        return module\n\n    def list(self) -> List[HubModule]:\n        '''List all installed HubModule.'''\n        for subdir in os.listdir(self.home):\n            fulldir = os.path.join(self.home, subdir)\n\n            try:\n                self._local_modules[subdir] = HubModule.load(fulldir)\n            except:\n                utils.record_exception('An error was encountered while loading {}'.format(subdir))\n\n        return [module for module in self._local_modules.values()]\n\n    def _install_from_url(self, url: str) -> HubModule:\n        '''Install HubModule from url'''\n        with utils.generate_tempdir() as _tdir:\n            with log.ProgressBar('Download {}'.format(url)) as bar:\n                for file, ds, ts in utils.download_with_progress(url, _tdir):\n                    bar.update(float(ds) / ts)\n\n            return self._install_from_archive(file)\n\n    def _install_from_name(self, name: str, version: str = None, ignore_env_mismatch: bool = False) -> HubModule:\n        '''Install HubModule by name search result'''\n        result = module_server.search_module(name=name, version=version)\n        for item in result:\n            if name.lower() == item['name'].lower() and utils.Version(item['version']).match(version):\n                return self._install_from_url(item['url'])\n\n        module_infos = module_server.get_module_compat_info(name=name)\n        # The HubModule with the specified name cannot be found\n        if not module_infos:\n            raise HubModuleNotFoundError(name=name, version=version)\n\n        valid_infos = {}\n        if version:\n            for _ver, _info in module_infos.items():\n                if utils.Version(_ver).match(version):\n                    valid_infos[_ver] 
= _info\n        else:\n            valid_infos = module_infos.copy()\n\n        # The server returned no installable result; a matching version may exist whose environment\n        # requirements are not met.\n        if valid_infos:\n            if not ignore_env_mismatch:\n                raise EnvironmentMismatchError(name=name, info=valid_infos, version=version)\n\n            # If `ignore_env_mismatch` is set, ignore the problem of environmental mismatch, such as PaddlePaddle or PaddleHub\n            # version incompatibility. This may cause some unexpected problems during installation or running, but it is useful\n            # in some cases, for example, the development version of PaddlePaddle(with version number `0.0.0`) is installed\n            # locally.\n            if version:\n                if version in valid_infos:\n                    url = valid_infos[version]['url']\n                else:\n                    raise HubModuleNotFoundError(name=name, info=module_infos, version=version)\n            else:\n                # Get the maximum version number.\n                version = sorted([utils.Version(_v) for _v in valid_infos.keys()])[-1]\n                url = valid_infos[str(version)]['url']\n            log.logger.warning('Ignoring environment mismatch of the Module {}-{}'.format(name, version))\n            return self._install_from_url(url)\n        raise HubModuleNotFoundError(name=name, info=module_infos, version=version)\n\n    def _install_from_source(self, name: str, version: str, source: str, update: bool = False,\n                             branch: str = None) -> HubModule:\n        '''Install a HubModule from a git repository'''\n        result = module_server.search_module(name=name, source=source, version=version, update=update, branch=branch)\n        for item in result:\n            if item['name'] == name and item['version'].match(version):\n\n                # uninstall local module\n                local_module = self.search(name)\n                if local_module and local_module.source == source and 
local_module.branch == branch:\n                    self._local_modules[name] = local_module\n                    return self._local_modules[name]\n\n                if os.path.exists(self._get_normalized_path(name)):\n                    self.uninstall(name)\n\n                installed_path = self._get_normalized_path(name)\n                shutil.copytree(item['path'], installed_path)\n\n                source_info_file = os.path.join(installed_path, '_source_info.yaml')\n                with open(source_info_file, 'w') as file:\n                    file.write('source: {}\\n'.format(source))\n                    file.write('branch: {}'.format(branch))\n\n                # Install python package requirements. This behavior needs to occur before Module.load,\n                # otherwise the model will fail to load due to missing dependencies.\n                self._install_module_requirements(installed_path)\n\n                self._local_modules[name] = HubModule.load(installed_path)\n\n                if version:\n                    log.logger.info('Successfully installed {}-{}'.format(name, version))\n                else:\n                    log.logger.info('Successfully installed {}'.format(name))\n                return self._local_modules[name]\n\n        raise HubModuleNotFoundError(name=name, version=version, source=source)\n\n    def _install_from_directory(self, directory: str) -> HubModule:\n        '''Install a HubModule from a directory containing module.py'''\n        module_info = HubModule.load_module_info(directory)\n\n        # A temporary directory is copied here for two purposes:\n        # 1. Avoid affecting user-specified directory (for example, a __pycache__\n        #    directory will be generated).\n        # 2. HubModule is essentially a python package. 
When internal package\n        #    references are made in it, the correct package name is required.\n        with utils.generate_tempdir() as _dir:\n            tempdir = os.path.join(_dir, module_info.name)\n            tempdir = self._get_normalized_name(tempdir)\n            shutil.copytree(directory, tempdir)\n\n            # Uninstall local module\n            if os.path.exists(self._get_normalized_path(module_info.name)):\n                self.uninstall(module_info.name)\n\n            shutil.copytree(directory, self._get_normalized_path(module_info.name))\n\n            # Install python package requirements. This behavior needs to occur before Module.load,\n            # otherwise the model will fail to load due to missing dependencies.\n            self._install_module_requirements(directory)\n\n            hub_module_cls = HubModule.load(self._get_normalized_path(module_info.name))\n            self._local_modules[module_info.name] = hub_module_cls\n\n            log.logger.info('Successfully installed {}-{}'.format(hub_module_cls.name, hub_module_cls.version))\n            return hub_module_cls\n\n    def _install_from_archive(self, archive: str) -> HubModule:\n        '''Install a HubModule from an archive file (e.g. xxx.tar.gz)'''\n        with utils.generate_tempdir() as _tdir:\n            with log.ProgressBar('Decompress {}'.format(archive)) as bar:\n                for path, ds, ts in xarfile.unarchive_with_progress(archive, _tdir):\n                    bar.update(float(ds) / ts)\n\n            # Sometimes the path contains '.'\n            path = os.path.normpath(path)\n            directory = os.path.join(_tdir, path.split(os.sep)[0])\n            return self._install_from_directory(directory)\n\n    def _install_module_requirements(self, directory: str):\n\n        rfile = os.path.join(directory, 'requirements.txt')\n        if not os.path.exists(rfile):\n            return\n\n        file = utils.get_record_file()\n        with open(file, 'a') as 
_stream:\n\n            with log.logger.processing('Installing dependent packages from {}'.format(rfile)):\n                result = pypi.install_from_file(rfile, ostream=_stream, estream=_stream)\n                if result:\n                    log.logger.info('Successfully installed dependent packages.')\n                else:\n                    log.logger.warning(\n                        'Some errors occurred while installing dependent packages. Detailed error information can be found in {}.'\n                        .format(file))\n"
  },
  {
    "path": "paddlehub/module/module.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport ast\nimport builtins\nimport codecs\nimport inspect\nimport os\nimport re\nimport sys\nfrom typing import Callable\nfrom typing import Generic\nfrom typing import List\nfrom typing import Optional\nfrom typing import Union\n\nimport paddle\nimport paddle2onnx\nfrom easydict import EasyDict\n\nfrom paddlehub.compat import paddle_utils\nfrom paddlehub.compat.module.module_v1 import ModuleV1\nfrom paddlehub.utils import log\nfrom paddlehub.utils import parser\nfrom paddlehub.utils import utils\n\n\nclass InvalidHubModule(Exception):\n    def __init__(self, directory: str):\n        self.directory = directory\n\n    def __str__(self):\n        return '{} is not a valid HubModule'.format(self.directory)\n\n\n_module_serving_func = {}\n_module_runnable_func = {}\n\n\ndef runnable(func: Callable) -> Callable:\n    '''Mark a Module method as runnable; when the command `hub run` is used, the method will be called.'''\n    mod = func.__module__ + '.' + inspect.stack()[1][3]\n    _module_runnable_func[mod] = func.__name__\n\n    def _wrapper(*args, **kwargs):\n        return func(*args, **kwargs)\n\n    return _wrapper\n\n\ndef serving(func: Callable) -> Callable:\n    '''Mark a Module method as a serving method; when the command `hub serving` is used, the method will be called.'''\n    mod = func.__module__ + '.' 
+ inspect.stack()[1][3]\n    _module_serving_func[mod] = func.__name__\n\n    def _wrapper(*args, **kwargs):\n        return func(*args, **kwargs)\n\n    return _wrapper\n\n\nclass RunModule(object):\n    '''The base class of PaddleHub Module, which users can inherit to implement a custom Module.'''\n\n    def __init__(self, *args, **kwargs):\n        super(RunModule, self).__init__()\n\n    def _get_func_name(self, current_cls: Generic, module_func_dict: dict) -> Optional[str]:\n        mod = current_cls.__module__ + '.' + current_cls.__name__\n        if mod in module_func_dict:\n            _func_name = module_func_dict[mod]\n            return _func_name\n        elif current_cls.__bases__:\n            for base_class in current_cls.__bases__:\n                base_run_func = self._get_func_name(base_class, module_func_dict)\n                if base_run_func:\n                    return base_run_func\n        else:\n            return None\n\n    # After the 2.0.0rc version, paddle uses the dynamic graph mode by default, which will cause the\n    # execution of the static graph model to fail, so compatibility protection is required.\n    def __getattribute__(self, attr):\n        _attr = object.__getattribute__(self, attr)\n\n        # If the acquired attribute is a built-in property of the object, skip it.\n        if re.match('__.*__', attr):\n            return _attr\n        # If the module is a dynamic graph model, skip it.\n        elif isinstance(self, paddle.nn.Layer):\n            return _attr\n        # If the acquired attribute is not a class method, skip it.\n        elif not inspect.ismethod(_attr):\n            return _attr\n\n        return paddle_utils.run_in_static_mode(_attr)\n\n    @classmethod\n    def get_py_requirements(cls) -> List[str]:\n        '''Get the Module's python package dependency list.'''\n        py_module = sys.modules[cls.__module__]\n        directory = os.path.dirname(py_module.__file__)\n        req_file = 
os.path.join(directory, 'requirements.txt')\n        if not os.path.exists(req_file):\n            return []\n        with codecs.open(req_file, 'r', encoding='utf8') as file:\n            return file.read().split('\\n')\n\n    @property\n    def _run_func_name(self):\n        return self._get_func_name(self.__class__, _module_runnable_func)\n\n    @property\n    def _run_func(self):\n        return getattr(self, self._run_func_name) if self._run_func_name else None\n\n    @property\n    def is_runnable(self) -> bool:\n        '''\n        Whether the Module is runnable, in other words, whether can we execute the Module through the\n        `hub run` command.\n        '''\n        return True if self._run_func else False\n\n    @property\n    def serving_func_name(self):\n        return self._get_func_name(self.__class__, _module_serving_func)\n\n    @property\n    def _pretrained_model_path(self):\n        _pretrained_model_attrs = [\n            'pretrained_model_path', 'rec_pretrained_model_path', 'default_pretrained_model_path', 'model_path'\n        ]\n\n        for _attr in _pretrained_model_attrs:\n            if hasattr(self, _attr):\n                path = getattr(self, _attr)\n                if os.path.exists(path) and os.path.isfile(path):\n                    path = os.path.dirname(path)\n                return path\n\n        return None\n\n    def sub_modules(self, recursive: bool = True):\n        '''\n        Get all sub modules.\n\n        Args:\n            recursive(bool): Whether to get sub modules recursively. 
Default to True.\n        '''\n        _sub_modules = {}\n        for key, item in self.__dict__.items():\n            if id(item) == id(self):\n                continue\n\n            if isinstance(item, (RunModule, ModuleV1)):\n                _sub_modules[key] = item\n                if not recursive:\n                    continue\n\n                for _k, _v in item.sub_modules(recursive):\n                    _sub_modules['{}/{}'.format(key, _k)] = _v\n\n        return _sub_modules\n\n    def save_inference_model(self,\n                             dirname: str,\n                             model_filename: str = None,\n                             params_filename: str = None,\n                             input_spec: List[paddle.static.InputSpec] = None,\n                             include_sub_modules: bool = True,\n                             combined: bool = True):\n        '''\n        Export the model to Paddle Inference format.\n\n        Args:\n            dirname(str): The directory to save the paddle inference model.\n            model_filename(str): The name of the saved model file. Default to `__model__`.\n            params_filename(str): The name of the saved parameters file, only takes effect when `combined` is True.\n                Default to `__params__`.\n            input_spec(list): Describes the input of the saved model's forward method, which can be described by\n                InputSpec or example Tensor. If None, all input variables of the original Layer's forward method\n                would be the inputs of the saved model. Default None.\n            include_sub_modules(bool): Whether to export sub modules. Default to True.\n            combined(bool): Whether to save all parameters in a combined file. 
Default to True.\n        '''\n        if include_sub_modules:\n            for key, _sub_module in self.sub_modules().items():\n                try:\n                    sub_dirname = os.path.normpath(os.path.join(dirname, key))\n                    _sub_module.save_inference_model(\n                        sub_dirname,\n                        include_sub_modules=include_sub_modules,\n                        model_filename=model_filename,\n                        params_filename=params_filename,\n                        combined=combined)\n                except:\n                    utils.record_exception('Failed to save sub module {}'.format(_sub_module.name))\n\n        if isinstance(self, paddle.nn.Layer):\n            save_file = os.path.join(dirname, '{}'.format(self.name))\n            if not input_spec:\n                if hasattr(self, 'input_spec'):\n                    input_spec = self.input_spec\n                else:\n                    _type = self.type.lower()\n                    if _type.startswith('cv/image'):\n                        input_spec = [paddle.static.InputSpec(shape=[None, 3, None, None], dtype='float32')]\n                    else:\n                        raise RuntimeError(\n                            'Module {} lacks `input_spec`, please specify it when calling `save_inference_model`.'.\n                            format(self.name))\n\n            net = paddle.jit.to_static(self, input_spec)\n            paddle.jit.save(net, save_file)\n\n            log.logger.info('Paddle Inference model saved in {}.'.format(dirname))\n            return\n\n        if not self._pretrained_model_path:\n            raise RuntimeError('Module {} does not support exporting models in Paddle Inference format.'.format(\n                self.name))\n        elif not os.path.exists(\n                self._pretrained_model_path) and not os.path.exists(self._pretrained_model_path + '.pdmodel'):\n            log.logger.warning('The model path of Module 
{} does not exist.'.format(self.name))\n            return\n\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        _model_filename = None\n        _params_filename = None\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, 'model')):\n            _model_filename = 'model'\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, 'params')):\n            _params_filename = 'params'\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, '__params__')):\n            _params_filename = '__params__'\n        if _model_filename is not None and _params_filename is not None:\n            program, feeded_var_names, target_vars = paddle.static.load_inference_model(\n                self._pretrained_model_path,\n                executor=exe,\n                model_filename=_model_filename,\n                params_filename=_params_filename,\n            )\n        else:\n            program, feeded_var_names, target_vars = paddle.static.load_inference_model(\n                self._pretrained_model_path, executor=exe)\n\n        global_block = program.global_block()\n        feed_vars = [global_block.var(item) for item in feeded_var_names]\n\n        path_prefix = dirname\n        if os.path.isdir(dirname):\n            path_prefix = os.path.join(dirname, 'model')\n        paddle.static.save_inference_model(\n            path_prefix, feed_vars=feed_vars, fetch_vars=target_vars, executor=exe, program=program)\n\n        log.logger.info('Paddle Inference model saved in {}.'.format(dirname))\n\n    def export_onnx_model(self,\n                          dirname: str,\n                          input_spec: List[paddle.static.InputSpec] = None,\n                          include_sub_modules: bool = True,\n                          **kwargs):\n        '''\n        Export the model to ONNX format.\n\n        Args:\n            dirname(str): The directory to save the onnx model.\n            
input_spec(list): Describes the input of the saved model's forward method, which can be described by\n                InputSpec or example Tensor. If None, all input variables of the original Layer's forward method\n                would be the inputs of the saved model. Default None.\n            include_sub_modules(bool): Whether to export sub modules. Default to True.\n            **kwargs(dict|optional): Other export configuration options for compatibility, some may be removed in\n                the future. Don't use them If not necessary. Refer to https://github.com/PaddlePaddle/paddle2onnx\n                for more information.\n        '''\n        if include_sub_modules:\n            for key, _sub_module in self.sub_modules().items():\n                try:\n                    sub_dirname = os.path.normpath(os.path.join(dirname, key))\n                    _sub_module.export_onnx_model(sub_dirname, include_sub_modules=include_sub_modules, **kwargs)\n                except:\n                    utils.record_exception('Failed to export sub module {}'.format(_sub_module.name))\n\n        if isinstance(self, paddle.nn.Layer):\n            save_file = os.path.join(dirname, '{}'.format(self.name))\n            if not input_spec:\n                if hasattr(self, 'input_spec'):\n                    input_spec = self.input_spec\n                else:\n                    _type = self.type.lower()\n                    if _type.startswith('cv/image'):\n                        input_spec = [paddle.static.InputSpec(shape=[None, 3, None, None], dtype='float32')]\n                    else:\n                        raise RuntimeError(\n                            'Module {} lacks `input_spec`, please specify it when calling `export_onnx_model`.'.format(\n                                self.name))\n\n            paddle.onnx.export(self, save_file, input_spec=input_spec, **kwargs)\n            return\n\n        if not self._pretrained_model_path:\n            raise 
RuntimeError('Module {} does not support exporting models in ONNX format.'.format(self.name))\n        elif not os.path.exists(self._pretrained_model_path):\n            log.logger.warning('The model path of Module {} does not exist.'.format(self.name))\n            return\n\n        place = paddle.CPUPlace()\n        exe = paddle.static.Executor(place)\n\n        model_filename = None\n        params_filename = None\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, 'model')):\n            model_filename = 'model'\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, 'params')):\n            params_filename = 'params'\n\n        if os.path.exists(os.path.join(self._pretrained_model_path, '__params__')):\n            params_filename = '__params__'\n\n        save_file = os.path.join(dirname, '{}.onnx'.format(self.name))\n\n        program, inputs, outputs = paddle.static.load_inference_model(\n            dirname=self._pretrained_model_path,\n            model_filename=model_filename,\n            params_filename=params_filename,\n            executor=exe)\n\n        paddle2onnx.program2onnx(\n            program=program,\n            scope=paddle.static.global_scope(),\n            feed_var_names=inputs,\n            target_vars=outputs,\n            save_file=save_file,\n            **kwargs)\n\n\nclass Module(object):\n    '''\n    In PaddleHub, a Module represents an executable module, which is usually a pre-trained model that can be used for end-to-end\n    prediction, such as a face detection model or a lexical analysis model, or a pre-trained model that requires finetuning,\n    such as BERT/ERNIE. 
When loading a Module with a specified name, if the Module does not exist locally, PaddleHub will\n    automatically request the server or the specified Git source to download the resource.\n\n    Args:\n        name(str): Module name.\n        directory(str|optional): Directory of the module to be loaded, only takes effect when the `name` is not specified.\n        version(str|optional): The version limit of the module, only takes effect when the `name` is specified. When the local\n            Module does not meet the specified version conditions, PaddleHub will re-request the server to download the\n            appropriate Module. Default to None, which means that the local Module will be used. If the Module does not exist,\n            PaddleHub will download the latest version available from the server according to the usage environment.\n        source(str|optional): URL of a git repository. If this parameter is specified, PaddleHub will no longer download the\n            specified Module from the default server, but will look for it in the specified repository. Default to None.\n        update(bool|optional): Whether to update the locally cached git repository, only takes effect when the `source` is\n            specified. Default to False.\n        branch(str|optional): The branch of the specified git repository. Default to None.\n        ignore_env_mismatch(bool|optional): Whether to ignore the environment mismatch when installing the Module. 
Defaults to\n            False.\n    '''\n\n    def __new__(cls,\n                *,\n                name: str = None,\n                directory: str = None,\n                version: str = None,\n                source: str = None,\n                update: bool = False,\n                branch: str = None,\n                ignore_env_mismatch: bool = False,\n                **kwargs):\n        if cls.__name__ == 'Module':\n            from paddlehub.server.server import CacheUpdater\n            # This branch comes from hub.Module(name='xxx') or hub.Module(directory='xxx')\n            if name:\n                module = cls.init_with_name(\n                    name=name,\n                    version=version,\n                    source=source,\n                    update=update,\n                    branch=branch,\n                    ignore_env_mismatch=ignore_env_mismatch,\n                    **kwargs)\n                CacheUpdater(\"update_cache\", module=name, version=version).start()\n            elif directory:\n                module = cls.init_with_directory(directory=directory, **kwargs)\n                CacheUpdater(\"update_cache\", module=directory, version=\"0.0.0\").start()\n        else:\n            module = object.__new__(cls)\n\n        return module\n\n    @classmethod\n    def load(cls, directory: str) -> Generic:\n        '''Load the Module object defined in the specified directory.'''\n        if directory.endswith(os.sep):\n            directory = directory[:-1]\n\n        # If the module description file exists, try to load as ModuleV1\n        desc_file = os.path.join(directory, 'module_desc.pb')\n        if os.path.exists(desc_file):\n            return ModuleV1.load(directory)\n\n        basename = os.path.split(directory)[-1]\n        dirname = os.path.join(*list(os.path.split(directory)[:-1]))\n        py_module = utils.load_py_module(dirname, '{}.module'.format(basename))\n\n        for _item, _cls in inspect.getmembers(py_module, 
inspect.isclass):\n            _item = py_module.__dict__[_item]\n            if hasattr(_item, '_hook_by_hub') and issubclass(_item, RunModule):\n                user_module_cls = _item\n                break\n        else:\n            raise InvalidHubModule(directory)\n\n        user_module_cls.directory = directory\n\n        source_info_file = os.path.join(directory, '_source_info.yaml')\n        if os.path.exists(source_info_file):\n            info = parser.yaml_parser.parse(source_info_file)\n            user_module_cls.source = info.get('source', '')\n            user_module_cls.branch = info.get('branch', '')\n        else:\n            user_module_cls.source = ''\n            user_module_cls.branch = ''\n\n        # In the case of multiple cards, the following code can set each process to use the correct place.\n        if issubclass(user_module_cls, paddle.nn.Layer):\n            place = paddle.get_device().split(':')[0]\n            paddle.set_device(place)\n\n        return user_module_cls\n\n    @classmethod\n    def load_module_info(cls, directory: str) -> EasyDict:\n        '''Load the information of the Module object defined in the specified directory.'''\n        # If is ModuleV1\n        desc_file = os.path.join(directory, 'module_desc.pb')\n        if os.path.exists(desc_file):\n            return ModuleV1.load_module_info(directory)\n\n        # If is ModuleV2\n        module_file = os.path.join(directory, 'module.py')\n        with codecs.open(module_file, 'r', encoding='utf8') as file:\n            pycode = file.read()\n            ast_module = ast.parse(pycode)\n\n            for _body in ast_module.body:\n                if not isinstance(_body, ast.ClassDef):\n                    continue\n\n                for _decorator in _body.decorator_list:\n                    if _decorator.func.id != 'moduleinfo':\n                        continue\n\n                    info = {key.arg: key.value.s for key in _decorator.keywords if key.arg != 'meta'}\n 
                   return EasyDict(info)\n            else:\n                raise InvalidHubModule(directory)\n\n    @classmethod\n    def init_with_name(cls,\n                       name: str,\n                       version: str = None,\n                       source: str = None,\n                       update: bool = False,\n                       branch: str = None,\n                       ignore_env_mismatch: bool = False,\n                       **kwargs) -> Union[RunModule, ModuleV1]:\n        '''Initialize Module according to the specified name.'''\n        from paddlehub.module.manager import LocalModuleManager\n        manager = LocalModuleManager()\n        user_module_cls = manager.search(name, source=source, branch=branch)\n        if not user_module_cls or not user_module_cls.version.match(version):\n            user_module_cls = manager.install(\n                name=name,\n                version=version,\n                source=source,\n                update=update,\n                branch=branch,\n                ignore_env_mismatch=ignore_env_mismatch)\n\n        directory = manager._get_normalized_path(user_module_cls.name)\n\n        # The HubModule in the old version will use the _initialize method to initialize,\n        # this function will be obsolete in a future version\n        if hasattr(user_module_cls, '_initialize'):\n            log.logger.warning(\n                'The _initialize method in HubModule will soon be deprecated, you can use the __init__() to handle the initialization of the object'\n            )\n            user_module = user_module_cls(directory=directory)\n            user_module._initialize(**kwargs)\n            return user_module\n\n        if issubclass(user_module_cls, ModuleV1):\n            return user_module_cls(directory=directory, **kwargs)\n\n        user_module_cls.directory = directory\n        return user_module_cls(**kwargs)\n\n    @classmethod\n    def init_with_directory(cls, directory: str, 
**kwargs) -> Union[RunModule, ModuleV1]:\n        '''Initialize Module according to the specified directory.'''\n        user_module_cls = cls.load(directory)\n\n        # The HubModule in the old version will use the _initialize method to initialize,\n        # this function will be obsolete in a future version\n        if hasattr(user_module_cls, '_initialize'):\n            log.logger.warning(\n                'The _initialize method in HubModule will soon be deprecated, you can use the __init__() to handle the initialization of the object'\n            )\n            user_module = user_module_cls(directory=directory)\n            user_module._initialize(**kwargs)\n            return user_module\n\n        if issubclass(user_module_cls, ModuleV1):\n            return user_module_cls(directory=directory, **kwargs)\n\n        user_module_cls.directory = directory\n        return user_module_cls(**kwargs)\n\n\ndef moduleinfo(name: str,\n               version: str,\n               author: str = None,\n               author_email: str = None,\n               summary: str = None,\n               type: str = None,\n               meta=None) -> Callable:\n    '''\n    Mark Module information for a python class, and the class will automatically be extended to inherit HubModule. 
In other words, python classes\n    marked with moduleinfo can be loaded through hub.Module.\n    '''\n\n    def _wrapper(cls: Generic) -> Generic:\n        wrap_cls = cls\n        _meta = RunModule if not meta else meta\n        if not issubclass(cls, _meta):\n            _bases = []\n            for _b in cls.__bases__:\n                if issubclass(_meta, _b):\n                    continue\n                _bases.append(_b)\n            _bases.append(_meta)\n            _bases = tuple(_bases)\n            attr_dict = dict(cls.__dict__)\n            attr_dict.pop('__dict__', None)\n            wrap_cls = builtins.type(cls.__name__, _bases, attr_dict)\n\n        wrap_cls.name = name\n        wrap_cls.version = utils.Version(version)\n        wrap_cls.author = author\n        wrap_cls.author_email = author_email\n        wrap_cls.summary = summary\n        wrap_cls.type = type\n        wrap_cls._hook_by_hub = True\n        return wrap_cls\n\n    return _wrapper\n"
  },
  {
    "path": "paddlehub/module/nlp_module.py",
    "content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport copy\nimport functools\nimport inspect\nimport io\nimport json\nimport os\nimport six\nfrom typing import List, Tuple, Union\n\nimport paddle\nimport paddle.nn as nn\nfrom packaging.version import Version\nfrom paddle.dataset.common import DATA_HOME\nfrom paddle.utils.download import get_path_from_url\nfrom paddlehub.module.module import serving, RunModule, runnable\n\nfrom paddlehub.utils.log import logger\nfrom paddlehub.utils.utils import reseg_token_label\n\nimport paddlenlp\nfrom paddlenlp.embeddings.token_embedding import EMBEDDING_HOME, EMBEDDING_URL_ROOT\nfrom paddlenlp.data import JiebaTokenizer\nfrom paddlehub.compat.module.nlp_module import DataFormatError\n\n__all__ = [\n    'PretrainedModel',\n    'register_base_model',\n    'TransformerModule',\n]\n\n\ndef fn_args_to_dict(func, *args, **kwargs):\n    \"\"\"\n    Inspect function `func` and its arguments for running, and extract a\n    dict mapping between argument names and keys.\n    \"\"\"\n    if hasattr(inspect, 'getfullargspec'):\n        (spec_args, spec_varargs, spec_varkw, spec_defaults, _, _, _) = inspect.getfullargspec(func)\n    else:\n        (spec_args, spec_varargs, spec_varkw, spec_defaults) = inspect.getargspec(func)\n    # add positional argument values\n    init_dict = dict(zip(spec_args, args))\n    # add default argument values\n    kwargs_dict 
= dict(zip(spec_args[-len(spec_defaults):], spec_defaults)) if spec_defaults else {}\n    kwargs_dict.update(kwargs)\n    init_dict.update(kwargs_dict)\n    return init_dict\n\n\nclass InitTrackerMeta(type(nn.Layer)):\n    \"\"\"\n    This metaclass wraps the `__init__` method of a class to add an `init_config`\n    attribute for instances of that class, and `init_config` uses a dict to track\n    the initial configuration. If the class has a `_wrap_init` method, it would be\n    hooked after `__init__` and called as `_wrap_init(self, init_fn, *init_args, **init_kwargs)`.\n    Since InitTrackerMeta would be used as the metaclass for pretrained model classes,\n    which are always Layer subclasses and `type(nn.Layer)` is not `type`, use `type(nn.Layer)`\n    rather than `type` as its base class to avoid metaclass inheritance\n    conflicts.\n    \"\"\"\n\n    def __init__(cls, name, bases, attrs):\n        init_func = cls.__init__\n        # If attrs has `__init__`, wrap it using the accessible `_wrap_init`.\n        # Otherwise, no need to wrap again since the super cls has been wrapped.\n        # TODO: remove reduplicated tracker if using super cls `__init__`\n        help_func = getattr(cls, '_wrap_init', None) if '__init__' in attrs else None\n        cls.__init__ = InitTrackerMeta.init_and_track_conf(init_func, help_func)\n        super(InitTrackerMeta, cls).__init__(name, bases, attrs)\n\n    @staticmethod\n    def init_and_track_conf(init_func, help_func=None):\n        \"\"\"\n        Wraps `init_func`, which is the `__init__` method of a class, to add an `init_config`\n        attribute for instances of that class.\n        Args:\n            init_func (callable): It should be the `__init__` method of a class.\n            help_func (callable, optional): If provided, it would be hooked after\n                `init_func` and called as `_wrap_init(self, init_func, *init_args, **init_kwargs)`.\n                Default None.\n        Returns:\n            function: the wrapped function\n        
\"\"\"\n\n        @functools.wraps(init_func)\n        def __impl__(self, *args, **kwargs):\n            # keep full configuration\n            init_func(self, *args, **kwargs)\n            # call the helper registered by `_wrap_init`\n            if help_func:\n                help_func(self, init_func, *args, **kwargs)\n            self.init_config = kwargs\n            if args:\n                kwargs['init_args'] = args\n            kwargs['init_class'] = self.__class__.__name__\n\n        return __impl__\n\n\ndef register_base_model(cls):\n    \"\"\"\n    Add a `base_model_class` attribute for the base class of the decorated class,\n    representing the base model class in derived classes of the same architecture.\n    Args:\n        cls (class): The base model class to be registered.\n    \"\"\"\n    base_cls = cls.__bases__[0]\n    assert issubclass(base_cls,\n                      PretrainedModel), \"`register_base_model` should be used on subclasses of PretrainedModel.\"\n    base_cls.base_model_class = cls\n    return cls\n\n\n@six.add_metaclass(InitTrackerMeta)\nclass PretrainedModel(nn.Layer):\n    \"\"\"\n    The base class for all pretrained models. 
It provides some attributes and\n    common methods for all pretrained models, including attributes `init_config`,\n    `config` for initialized arguments and methods for saving and loading.\n    It also includes some class attributes (which should be set by derived classes):\n    - `model_config_file` (str): represents the file name for saving and loading\n      model configuration, its value is `model_config.json`.\n    - `resource_files_names` (dict): use this to map resources to specific file\n      names for saving and loading.\n    - `pretrained_resource_files_map` (dict): The dict has the same keys as\n      `resource_files_names`, the values are also dicts mapping specific pretrained\n      model names to URLs linking to pretrained models.\n    - `pretrained_init_configuration` (dict): The dict has pretrained model names\n      as keys, and the values are also dicts preserving the corresponding configuration\n      for model initialization.\n    - `base_model_prefix` (str): represents the attribute associated with the\n      base model in derived classes of the same architecture adding layers on\n      top of the base model.\n    \"\"\"\n    model_config_file = \"model_config.json\"\n    pretrained_init_configuration = {}\n    # TODO: more flexible resource handle, namedtuple with fields as:\n    # resource_name, saved_file, handle_name_for_load(None for used as __init__\n    # arguments), handle_name_for_save\n    resource_files_names = {\"model_state\": \"model_state.pdparams\"}\n    pretrained_resource_files_map = {}\n    base_model_prefix = \"\"\n\n    def _wrap_init(self, original_init, *args, **kwargs):\n        \"\"\"\n        It would be hooked after `__init__` to add a dict including arguments of\n        `__init__` as an attribute named `config` of the pretrained model instance.\n        \"\"\"\n        init_dict = fn_args_to_dict(original_init, *args, **kwargs)\n        self.config = init_dict\n\n    @property\n    def base_model(self):\n        return 
getattr(self, self.base_model_prefix, self)\n\n    @property\n    def model_name_list(self):\n        return list(self.pretrained_init_configuration.keys())\n\n    def get_input_embeddings(self):\n        base_model = getattr(self, self.base_model_prefix, self)\n        if base_model is not self:\n            return base_model.get_input_embeddings()\n        else:\n            raise NotImplementedError\n\n    def get_output_embeddings(self):\n        return None  # Overwrite for models with output embeddings\n\n    @classmethod\n    def from_pretrained(cls, pretrained_model_name_or_path, *args, **kwargs):\n        \"\"\"\n        Instantiate an instance of `PretrainedModel` from a predefined\n        model specified by name or path.\n        Args:\n            pretrained_model_name_or_path (str): A name of or a file path to a\n                pretrained model.\n            *args (tuple): positional arguments for `__init__`. If provided, use\n                these as positional argument values for model initialization.\n            **kwargs (dict): keyword arguments for `__init__`. 
If provided, use\n                these to update pre-defined keyword argument values for model\n                initialization.\n        Returns:\n            PretrainedModel: An instance of PretrainedModel.\n        \"\"\"\n        pretrained_models = list(cls.pretrained_init_configuration.keys())\n        resource_files = {}\n        init_configuration = {}\n        if pretrained_model_name_or_path in pretrained_models:\n            for file_id, map_list in cls.pretrained_resource_files_map.items():\n                resource_files[file_id] = map_list[pretrained_model_name_or_path]\n            init_configuration = copy.deepcopy(cls.pretrained_init_configuration[pretrained_model_name_or_path])\n        else:\n            if os.path.isdir(pretrained_model_name_or_path):\n                for file_id, file_name in cls.resource_files_names.items():\n                    full_file_name = os.path.join(pretrained_model_name_or_path, file_name)\n                    resource_files[file_id] = full_file_name\n                resource_files[\"model_config_file\"] = os.path.join(pretrained_model_name_or_path, cls.model_config_file)\n            else:\n                raise ValueError(\"Call {}.from_pretrained() with a model identifier or the \"\n                                 \"path to a directory. 
The supported model \"\n                                 \"identifiers are as follows: {}\".format(cls.__name__,\n                                                                         cls.pretrained_init_configuration.keys()))\n        # FIXME(chenzeyu01): We should use another data path for storing model\n        default_root = os.path.join(DATA_HOME, pretrained_model_name_or_path)\n        resolved_resource_files = {}\n        for file_id, file_path in resource_files.items():\n            path = os.path.join(default_root, file_path.split('/')[-1])\n            if file_path is None or os.path.isfile(file_path):\n                resolved_resource_files[file_id] = file_path\n            elif os.path.exists(path):\n                logger.info(\"Already cached %s\" % path)\n                resolved_resource_files[file_id] = path\n            else:\n                logger.info(\"Downloading %s and saving to %s\" % (file_path, default_root))\n                resolved_resource_files[file_id] = get_path_from_url(file_path, default_root)\n\n        # Prepare model initialization kwargs\n        # Did we save some inputs and kwargs to reload?\n        model_config_file = resolved_resource_files.pop(\"model_config_file\", None)\n        if model_config_file is not None:\n            with io.open(model_config_file, encoding=\"utf-8\") as f:\n                init_kwargs = json.load(f)\n        else:\n            init_kwargs = init_configuration\n        # position args are stored in kwargs, maybe better not include\n        init_args = init_kwargs.pop(\"init_args\", ())\n        # class name corresponds to this configuration\n        init_class = init_kwargs.pop(\"init_class\", cls.base_model_class.__name__)\n\n        # Check if the loaded config matches the current model class's __init__\n        # arguments. 
If not match, the loaded config is for the base model class.\n        if init_class == cls.base_model_class.__name__:\n            base_args = init_args\n            base_kwargs = init_kwargs\n            derived_args = ()\n            derived_kwargs = {}\n            base_arg_index = None\n        else:  # extract config for base model\n            derived_args = list(init_args)\n            derived_kwargs = init_kwargs\n            for i, arg in enumerate(init_args):\n                if isinstance(arg, dict) and \"init_class\" in arg:\n                    assert arg.pop(\"init_class\") == cls.base_model_class.__name__, (\n                        \"pretrained base model should be {}\").format(cls.base_model_class.__name__)\n                    base_arg_index = i\n                    break\n            for arg_name, arg in init_kwargs.items():\n                if isinstance(arg, dict) and \"init_class\" in arg:\n                    assert arg.pop(\"init_class\") == cls.base_model_class.__name__, (\n                        \"pretrained base model should be {}\").format(cls.base_model_class.__name__)\n                    base_arg_index = arg_name\n                    break\n            base_args = arg.pop(\"init_args\", ())\n            base_kwargs = arg\n        if cls == cls.base_model_class:\n            # Update with newly provided args and kwargs for base model\n            base_args = base_args if not args else args\n            base_kwargs.update(kwargs)\n            model = cls(*base_args, **base_kwargs)\n        else:\n            # Update with newly provided args and kwargs for derived model\n            base_model = cls.base_model_class(*base_args, **base_kwargs)\n            if base_arg_index is not None:\n                derived_args[base_arg_index] = base_model\n            else:\n                derived_args = (base_model, )  # assume at the first position\n            derived_args = derived_args if not args else args\n            
derived_kwargs.update(kwargs)\n            model = cls(*derived_args, **derived_kwargs)\n\n        # Maybe need more ways to load resources.\n        weight_path = list(resolved_resource_files.values())[0]\n        assert weight_path.endswith(\".pdparams\"), \"suffix of weight must be .pdparams\"\n        state_dict = paddle.load(weight_path)\n\n        # Make sure we are able to load base models as well as derived models\n        # (with heads)\n        start_prefix = \"\"\n        model_to_load = model\n        state_to_load = state_dict\n        unexpected_keys = []\n        missing_keys = []\n        if not hasattr(model, cls.base_model_prefix) and any(\n                s.startswith(cls.base_model_prefix) for s in state_dict.keys()):\n            # base model\n            state_to_load = {}\n            start_prefix = cls.base_model_prefix + \".\"\n            for k, v in state_dict.items():\n                if k.startswith(cls.base_model_prefix):\n                    state_to_load[k[len(start_prefix):]] = v\n                else:\n                    unexpected_keys.append(k)\n        if hasattr(model,\n                   cls.base_model_prefix) and not any(s.startswith(cls.base_model_prefix) for s in state_dict.keys()):\n            # derived model (base model with heads)\n            model_to_load = getattr(model, cls.base_model_prefix)\n            for k in model.state_dict().keys():\n                if not k.startswith(cls.base_model_prefix):\n                    missing_keys.append(k)\n        if len(missing_keys) > 0:\n            logger.info(\"Weights of {} not initialized from pretrained model: {}\".format(\n                model.__class__.__name__, missing_keys))\n        if len(unexpected_keys) > 0:\n            logger.info(\"Weights from pretrained model not used in {}: {}\".format(model.__class__.__name__,\n                                                                                  unexpected_keys))\n        
model_to_load.set_state_dict(state_to_load)\n        if paddle.in_dynamic_mode():\n            return model\n        return model, state_to_load\n\n    def save_pretrained(self, save_directory):\n        \"\"\"\n        Save model configuration and related resources (model state) to files\n        under `save_directory`.\n        Args:\n            save_directory (str): Directory to save files into.\n        \"\"\"\n        assert os.path.isdir(save_directory), \"Saving directory ({}) should be a directory\".format(save_directory)\n        # save model config\n        model_config_file = os.path.join(save_directory, self.model_config_file)\n        model_config = self.init_config\n        # If init_config contains a Layer, use the layer's init_config to save\n        for key, value in model_config.items():\n            if key == \"init_args\":\n                args = []\n                for arg in value:\n                    args.append(arg.init_config if isinstance(arg, PretrainedModel) else arg)\n                model_config[key] = tuple(args)\n            elif isinstance(value, PretrainedModel):\n                model_config[key] = value.init_config\n        with io.open(model_config_file, \"w\", encoding=\"utf-8\") as f:\n            f.write(json.dumps(model_config, ensure_ascii=False))\n        # save model\n        file_name = os.path.join(save_directory, list(self.resource_files_names.values())[0])\n        paddle.save(self.state_dict(), file_name)\n\n\nclass TextServing(object):\n    \"\"\"\n    A base class for text model which supports serving.\n    \"\"\"\n\n    @serving\n    def predict_method(self, data: List[List[str]], max_seq_len: int = 128, batch_size: int = 1, use_gpu: bool = False):\n        \"\"\"\n        Run predict method as a service.\n        Serving as a task which is specified from serving config.\n        Tasks supported:\n        1. seq-cls: sequence classification;\n        2. token-cls: sequence labeling;\n        3. 
None: embedding.\n        Args:\n            data (obj:`List(List(str))`): The processed data, where each element is a list of a single text or a pair of texts.\n            max_seq_len (:obj:`int`, `optional`, defaults to 128):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n            batch_size(obj:`int`, defaults to 1): The batch size.\n            use_gpu(obj:`bool`, defaults to `False`): Whether to use gpu to run or not.\n        Returns:\n            results(obj:`list`): All the prediction labels.\n        \"\"\"\n        if self.task in self._tasks_supported:  # cls service\n            if self.label_map:\n                # compatible with json decoding label_map\n                self.label_map = {int(k): v for k, v in self.label_map.items()}\n            results = self.predict(data, max_seq_len, batch_size, use_gpu)\n\n            if self.task == 'token-cls':\n                # remove labels of [CLS] token and pad tokens\n                results = [token_labels[1:len(data[i][0]) + 1] for i, token_labels in enumerate(results)]\n            return results\n        elif self.task is None:  # embedding service\n            results = self.get_embedding(data, use_gpu)\n            return results\n        else:  # unknown service\n            logger.error(f'Unknown task {self.task}, current tasks supported:\\n'\n                         '1. seq-cls: sequence classification service;\\n'\n                         '2. token-cls: sequence labeling service;\\n'\n                         '3. 
None: embedding service')\n        return\n\n\nclass TransformerModule(RunModule, TextServing):\n    \"\"\"\n    The base class for Transformer models.\n    \"\"\"\n    _tasks_supported = [\n        'seq-cls',\n        'token-cls',\n        'text-matching',\n    ]\n\n    @property\n    def input_spec(self):\n        return [\n            paddle.static.InputSpec(shape=[None, None], dtype='int64'),\n            paddle.static.InputSpec(shape=[None, None], dtype='int64')\n        ]\n\n    def _convert_text_to_input(self, tokenizer, texts: List[str], max_seq_len: int, split_char: str):\n        pad_to_max_seq_len = False if self.task is None else True\n        if self.task == 'token-cls':  # Extra processing of token-cls task\n            tokens = texts[0].split(split_char)\n            texts[0], _ = reseg_token_label(tokenizer=tokenizer, tokens=tokens)\n            is_split_into_words = True\n        else:\n            is_split_into_words = False\n\n        encoded_inputs = []\n        if self.task == 'text-matching':\n            if len(texts) != 2:\n                raise RuntimeError(\n                    'The input texts must have two sequences, but got %d. Please check your inputs.' 
% len(texts))\n            encoded_inputs.append(tokenizer(text=texts[0], text_pair=None, max_seq_len=max_seq_len, \\\n                    pad_to_max_seq_len=True, is_split_into_words=is_split_into_words, return_length=True))\n            encoded_inputs.append(tokenizer(text=texts[1], text_pair=None, max_seq_len=max_seq_len, \\\n                    pad_to_max_seq_len=True, is_split_into_words=is_split_into_words, return_length=True))\n        else:\n            if len(texts) == 1:\n                if Version(paddlenlp.__version__) <= Version('2.0.0rc2'):\n                    encoded_inputs.append(tokenizer.encode(texts[0], text_pair=None, \\\n                        max_seq_len=max_seq_len, pad_to_max_seq_len=pad_to_max_seq_len))\n                else:\n                    encoded_inputs.append(tokenizer(text=texts[0], max_seq_len=max_seq_len, \\\n                        pad_to_max_seq_len=True, is_split_into_words=is_split_into_words, return_length=True))\n            elif len(texts) == 2:\n                if Version(paddlenlp.__version__) <= Version('2.0.0rc2'):\n                    encoded_inputs.append(tokenizer.encode(texts[0], text_pair=texts[1], \\\n                        max_seq_len=max_seq_len, pad_to_max_seq_len=pad_to_max_seq_len))\n                else:\n                    encoded_inputs.append(tokenizer(text=texts[0], text_pair=texts[1], max_seq_len=max_seq_len, \\\n                        pad_to_max_seq_len=True, is_split_into_words=is_split_into_words, return_length=True))\n            else:\n                raise RuntimeError(\n                    'The input text must have one or two sequences, but got %d. Please check your inputs.' 
% len(texts))\n        return encoded_inputs\n\n    def _batchify(self, data: List[List[str]], max_seq_len: int, batch_size: int, split_char: str):\n        def _parse_batch(batch):\n            if self.task != 'text-matching':\n                input_ids = [entry[0] for entry in batch]\n                segment_ids = [entry[1] for entry in batch]\n                return input_ids, segment_ids\n            else:\n                query_input_ids = [entry[0] for entry in batch]\n                query_segment_ids = [entry[1] for entry in batch]\n                title_input_ids = [entry[2] for entry in batch]\n                title_segment_ids = [entry[3] for entry in batch]\n                return query_input_ids, query_segment_ids, title_input_ids, title_segment_ids\n\n        if not hasattr(self, 'tokenizer'):\n            self.tokenizer = self.get_tokenizer()\n        examples = []\n\n        for texts in data:\n            encoded_inputs = self._convert_text_to_input(self.tokenizer, texts, max_seq_len, split_char)\n            example = []\n            for inp in encoded_inputs:\n                input_ids = inp['input_ids']\n                if Version(paddlenlp.__version__) >= Version('2.0.0rc5'):\n                    token_type_ids = inp['token_type_ids']\n                else:\n                    token_type_ids = inp['segment_ids']\n                example.extend((input_ids, token_type_ids))\n            examples.append(example)\n\n        # Separates data into batches.\n        one_batch = []\n        for example in examples:\n            one_batch.append(example)\n            if len(one_batch) == batch_size:\n                yield _parse_batch(one_batch)\n                one_batch = []\n        if one_batch:\n            # The last batch, whose size is less than the configured batch_size.\n            yield _parse_batch(one_batch)\n\n    def training_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        \"\"\"\n        One step for training, 
which should be called as forward computation.\n        Args:\n            batch(:obj:List[paddle.Tensor]): One batch of data containing what the model needs,\n                such as input_ids, sent_ids, pos_ids, input_mask and labels.\n            batch_idx(int): The index of the batch.\n        Returns:\n            results(:obj: Dict) : The model outputs, such as loss and metrics.\n        \"\"\"\n        if self.task == 'seq-cls':\n            predictions, avg_loss, metric = self(input_ids=batch[0], token_type_ids=batch[1], labels=batch[2])\n        elif self.task == 'token-cls':\n            predictions, avg_loss, metric = self(\n                input_ids=batch[0], token_type_ids=batch[1], seq_lengths=batch[2], labels=batch[3])\n        elif self.task == 'text-matching':\n            predictions, avg_loss, metric = self(query_input_ids=batch[0], query_token_type_ids=batch[1], \\\n                title_input_ids=batch[2], title_token_type_ids=batch[3], labels=batch[4])\n        self.metric.reset()\n        return {'loss': avg_loss, 'metrics': metric}\n\n    def validation_step(self, batch: List[paddle.Tensor], batch_idx: int):\n        \"\"\"\n        One step for validation, which should be called as forward computation.\n        Args:\n            batch(:obj:List[paddle.Tensor]): One batch of data containing what the model needs,\n                such as input_ids, sent_ids, pos_ids, input_mask and labels.\n            batch_idx(int): The index of the batch.\n        Returns:\n            results(:obj: Dict) : The model outputs, such as metrics.\n        \"\"\"\n        if self.task == 'seq-cls':\n            predictions, avg_loss, metric = self(input_ids=batch[0], token_type_ids=batch[1], labels=batch[2])\n        elif self.task == 'token-cls':\n            predictions, avg_loss, metric = self(\n                input_ids=batch[0], token_type_ids=batch[1], seq_lengths=batch[2], labels=batch[3])\n        elif self.task == 'text-matching':\n            
predictions, avg_loss, metric = self(query_input_ids=batch[0], query_token_type_ids=batch[1], \\\n                title_input_ids=batch[2], title_token_type_ids=batch[3], labels=batch[4])\n        return {'metrics': metric}\n\n    def get_embedding(self, data: List[List[str]], use_gpu=False):\n        \"\"\"\n        Get token level embeddings and sentence level embeddings from model.\n        Args:\n            data (obj:`List(List(str))`): The processed data, where each element is a list containing a single text or a pair of texts.\n            use_gpu(obj:`bool`, defaults to `False`): Whether to use gpu to run or not.\n        Returns:\n            results(obj:`list`): All the token and sentence embeddings.\n        \"\"\"\n        if self.task is not None:\n            raise RuntimeError(\"The get_embedding method is only valid when task is None, but got task %s\" % self.task)\n\n        return self.predict(data=data, use_gpu=use_gpu)\n\n    def predict(self,\n                data: List[List[str]],\n                max_seq_len: int = 128,\n                split_char: str = '\\002',\n                batch_size: int = 1,\n                use_gpu: bool = False,\n                return_prob: bool = False):\n        \"\"\"\n        Predicts the data labels.\n        Args:\n            data (obj:`List(List(str))`): The processed data, where each element is a list containing a single text or a pair of texts.\n            max_seq_len (:obj:`int`, `optional`, defaults to `128`):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n            split_char(obj:`str`, defaults to '\\002'): The char used to split input tokens in the token-cls task.\n            batch_size(obj:`int`, defaults to 1): The size of each batch.\n            use_gpu(obj:`bool`, defaults to `False`): Whether to use gpu to run or not.\n            return_prob(obj:`bool`, defaults to `False`): Whether to return label probabilities.\n        Returns:\n        
    results(obj:`list`): All the predictions labels.\n        \"\"\"\n        if self.task not in self._tasks_supported \\\n                and self.task is not None:      # None for getting embedding\n            raise RuntimeError(f'Unknown task {self.task}, current tasks supported:\\n'\n                               '1. seq-cls: sequence classification;\\n'\n                               '2. token-cls: sequence labeling;\\n'\n                               '3. text-matching: text matching;\\n'\n                               '4. None: embedding')\n\n        paddle.set_device('gpu') if use_gpu else paddle.set_device('cpu')\n\n        batches = self._batchify(data, max_seq_len, batch_size, split_char)\n        results = []\n        batch_probs = []\n\n        self.eval()\n        for batch in batches:\n            if self.task == 'text-matching':\n                query_input_ids, query_segment_ids, title_input_ids, title_segment_ids = batch\n                query_input_ids = paddle.to_tensor(query_input_ids)\n                query_segment_ids = paddle.to_tensor(query_segment_ids)\n                title_input_ids = paddle.to_tensor(title_input_ids)\n                title_segment_ids = paddle.to_tensor(title_segment_ids)\n                probs = self(query_input_ids=query_input_ids, query_token_type_ids=query_segment_ids, \\\n                    title_input_ids=title_input_ids, title_token_type_ids=title_segment_ids)\n\n                idx = paddle.argmax(probs, axis=1).numpy()\n                idx = idx.tolist()\n                labels = [self.label_map[i] for i in idx]\n            else:\n                input_ids, segment_ids = batch\n                input_ids = paddle.to_tensor(input_ids)\n                segment_ids = paddle.to_tensor(segment_ids)\n                if self.task == 'seq-cls':\n                    probs = self(input_ids, segment_ids)\n                    idx = paddle.argmax(probs, axis=1).numpy()\n                    idx = idx.tolist()\n         
           labels = [self.label_map[i] for i in idx]\n                elif self.task == 'token-cls':\n                    probs = self(input_ids, segment_ids)\n                    batch_ids = paddle.argmax(probs, axis=2).numpy()  # (batch_size, max_seq_len)\n                    batch_ids = batch_ids.tolist()\n                    # token labels\n                    labels = [[self.label_map[i] for i in token_ids] for token_ids in batch_ids]\n                elif self.task is None:\n                    output = self(input_ids, segment_ids)\n                    if len(output) == 1:\n                        results.append(output.squeeze(0).numpy().tolist())\n                    else:\n                        sequence_output, pooled_output = output\n                        results.append(\n                            [pooled_output.squeeze(0).numpy().tolist(),\n                             sequence_output.squeeze(0).numpy().tolist()])\n            if self.task:\n                # Save probs only when return_prob is set.\n                if return_prob:\n                    batch_probs.extend(probs.numpy().tolist())\n                results.extend(labels)\n\n        if self.task and return_prob:\n            return results, batch_probs\n        return results\n\n\nclass EmbeddingServing(object):\n    \"\"\"\n    A base class for embedding models that support serving.\n    \"\"\"\n\n    @serving\n    def calc_similarity(self, data: List[List[str]]):\n        \"\"\"\n        Calculate similarities of given word pairs.\n        \"\"\"\n        results = []\n        for word_pair in data:\n            if len(word_pair) != 2:\n                raise RuntimeError(\n                    f'The input must have two words, but got {len(word_pair)}. 
Please check your inputs.')\n            if not isinstance(word_pair[0], str) or not isinstance(word_pair[1], str):\n                raise RuntimeError(\n                    f'The types of text pair must be (str, str), but got'\n                    f' ({type(word_pair[0]).__name__}, {type(word_pair[1]).__name__}). Please check your inputs.')\n\n            for word in word_pair:\n                if self.get_idx_from_word(word) == \\\n                        self.get_idx_from_word(self.vocab.unk_token):\n                    raise RuntimeError(f'Word \"{word}\" is not in vocab. Please check your inputs.')\n            results.append(str(self.cosine_sim(*word_pair)))\n        return results\n\n\nclass EmbeddingModule(RunModule, EmbeddingServing):\n    \"\"\"\n    The base class for Embedding models.\n    \"\"\"\n    base_url = 'https://paddlenlp.bj.bcebos.com/models/embeddings/'\n\n    def _download_vocab(self):\n        \"\"\"\n        Download vocab from url\n        \"\"\"\n        url = EMBEDDING_URL_ROOT + '/' + f'vocab.{self.embedding_name}'\n        get_path_from_url(url, EMBEDDING_HOME)\n\n    def get_vocab_path(self):\n        \"\"\"\n        Get local vocab path\n        \"\"\"\n        vocab_path = os.path.join(EMBEDDING_HOME, f'vocab.{self.embedding_name}')\n        if not os.path.exists(vocab_path):\n            self._download_vocab()\n        return vocab_path\n\n    def get_tokenizer(self, *args, **kwargs):\n        \"\"\"\n        Get tokenizer of embedding module\n        \"\"\"\n        if self.embedding_name.endswith('.en'):  # English\n            raise NotImplementedError  # TODO: (chenxiaojie) add tokenizer of English embedding\n        else:  # Chinese\n            return JiebaTokenizer(self.vocab)\n"
  },
  {
    "path": "paddlehub/server/__init__.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom paddlehub.server.server_source import ServerSource\nfrom paddlehub.server.git_source import GitSource\nfrom paddlehub.server.server import module_server\nfrom paddlehub.utils import log\n\n\ndef server_check() -> bool:\n    '''Check whether localhost can access the PaddleHub default server normally.'''\n    if module_server.get_source_by_key('default_hub_server').is_connected():\n        log.logger.info('Requested Hub-Server successfully.')\n        return True\n\n    log.logger.info('Failed to request Hub-Server.')\n    return False\n"
  },
  {
    "path": "paddlehub/server/git_source.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport ast\nimport inspect\nimport importlib\nimport os\nimport sys\nfrom collections import OrderedDict\nfrom typing import List\n\nfrom paddlehub.module.module import RunModule\nfrom paddlehub.env import SOURCES_HOME\nfrom paddlehub.utils import log, utils\n\n\nclass GitSource(object):\n    '''\n    Git source for PaddleHub module\n\n    Args:\n        url(str) : Url of git repository\n        path(str) : Path to store the git repository\n    '''\n\n    def __init__(self, url: str, path: str = None):\n        # For some environments where git is not installed, we need to set this environment\n        # variable to avoid errors.\n        os.environ['GIT_PYTHON_REFRESH'] = 'quiet'\n        from git import Repo\n\n        self.url = url\n        self.path = os.path.join(SOURCES_HOME, utils.md5(url))\n\n        if self.path.endswith('.git'):\n            self.path = self.path[:-4]\n\n        if not os.path.exists(self.path):\n            log.logger.info('Git repository {} does not exist, download from remote.'.format(self.url))\n            self.repo = Repo.clone_from(self.url, self.path)\n        else:\n            log.logger.info('Git repository {} is located at {}.'.format(self.url, self.path))\n            self.repo = Repo(self.path)\n\n        self.hub_modules = OrderedDict()\n        self.load_hub_modules()\n\n    
def checkout(self, branch: str):\n        '''Checkout the current repo to the specified branch.'''\n        try:\n            self.repo.git.checkout(branch)\n            # reload modules\n            self.load_hub_modules()\n        except Exception:\n            utils.record_exception('An error occurred while checking out {}'.format(self.path))\n\n    def update(self):\n        '''Update the current repo.'''\n        try:\n            self.repo.remote().pull(self.repo.branches[0])\n            # reload modules\n            self.load_hub_modules()\n        except Exception:\n            self.hub_modules = OrderedDict()\n            utils.record_exception('An error occurred while updating {}'.format(self.path))\n\n    def load_hub_modules(self):\n        if 'hubconf' in sys.modules:\n            sys.modules.pop('hubconf')\n\n        sys.path.insert(0, self.path)\n        try:\n            with open(os.path.join(self.path, 'hubconf.py'), 'r') as file:\n                pycode = file.read()\n                ast_module = ast.parse(pycode)\n                for _body in ast_module.body:\n                    # `ast.Import` nodes have no `module` attribute, so only\n                    # inspect `from ... import ...` statements here.\n                    if not isinstance(_body, ast.ImportFrom):\n                        continue\n\n                    if not _body.module or not _body.module.endswith('module'):\n                        continue\n\n                    subpath = '.'.join(_body.module.split('.')[:-2])\n                    subpath = os.path.join(self.path, subpath)\n                    sys.path.insert(0, subpath)\n\n            py_module = importlib.import_module('hubconf')\n            for _item, _cls in inspect.getmembers(py_module, inspect.isclass):\n                _item = py_module.__dict__[_item]\n                if issubclass(_item, RunModule):\n                    self.hub_modules[_item.name] = _item\n        except Exception:\n            self.hub_modules = OrderedDict()\n            utils.record_exception('An error occurred while loading {}'.format(self.path))\n\n        sys.path.remove(self.path)\n\n    def search_module(self, name: str, 
version: str = None) -> List[dict]:\n        '''\n        Search PaddleHub module\n\n        Args:\n            name(str) : PaddleHub module name\n            version(str) : PaddleHub module version\n        '''\n        return self.search_resource(type='module', name=name, version=version)\n\n    def search_resource(self, type: str, name: str, version: str = None) -> List[dict]:\n        '''\n        Search PaddleHub Resource\n\n        Args:\n            type(str) : Resource type\n            name(str) : Resource name\n            version(str) : Resource version\n        '''\n        module = self.hub_modules.get(name, None)\n        if module and module.version.match(version):\n            path = sys.modules[module.__module__].__file__\n            path = os.path.dirname(path)\n            return [{\n                'version': module.version,\n                'name': module.name,\n                'path': path,\n                'class': module.__name__,\n                'source': self.url\n            }]\n        return None\n\n    def get_module_compat_info(self, name: str) -> dict:\n        '''Get the version compatibility information of the model.'''\n        return {}\n\n    @classmethod\n    def check(cls, url: str) -> bool:\n        '''\n        Check if the specified url is a valid git repository link\n\n        Args:\n            url(str) : Url to check\n        '''\n        import git\n\n        try:\n            git.cmd.Git().ls_remote(url)\n            return True\n        except:\n            return False\n"
  },
  {
    "path": "paddlehub/server/server.py",
    "content": "#coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport json\nimport os\nimport requests\nimport threading\nimport time\nimport yaml\n\nfrom collections import OrderedDict\nfrom typing import List\n\nimport paddle\nimport paddlehub\nimport paddlehub.config as hubconf\nfrom paddlehub.config import cache_config\nfrom paddlehub.server import ServerSource, GitSource\nfrom paddlehub.utils import utils\n\n\nclass HubServer(object):\n    '''PaddleHub server'''\n\n    def __init__(self):\n        self.sources = OrderedDict()\n        self.keysmap = OrderedDict()\n\n    def _generate_source(self, url: str, source_type: str = 'git'):\n        if source_type == 'server':\n            source = ServerSource(url)\n        elif source_type == 'git':\n            source = GitSource(url)\n        else:\n            raise ValueError('Unknown source type {}.'.format(source_type))\n        return source\n\n    def _get_source_key(self, url: str):\n        return 'source_{}'.format(utils.md5(url))\n\n    def add_source(self, url: str, source_type: str = 'git', key: str = ''):\n        '''Add a module source(GitSource or ServerSource)'''\n        key = self._get_source_key(url) if not key else key\n        self.keysmap[url] = key\n        self.sources[key] = self._generate_source(url, source_type)\n\n    def remove_source(self, url: str = None, key: str = None):\n        '''Remove a 
module source'''\n        if not key and url:\n            key = self.keysmap.get(url)\n        self.sources.pop(key)\n\n    def get_source(self, url: str):\n        '''Get a module source by url'''\n        key = self.keysmap.get(url)\n        if not key:\n            return None\n        return self.sources.get(key)\n\n    def get_source_by_key(self, key: str):\n        '''Get a module source by key'''\n        return self.sources.get(key)\n\n    def search_module(self,\n                      name: str,\n                      version: str = None,\n                      source: str = None,\n                      update: bool = False,\n                      branch: str = None) -> List[dict]:\n        '''\n        Search PaddleHub module\n\n        Args:\n            name(str) : PaddleHub module name\n            version(str) : PaddleHub module version\n        '''\n        return self.search_resource(\n            type='module', name=name, version=version, source=source, update=update, branch=branch)\n\n    def search_resource(self,\n                        type: str,\n                        name: str,\n                        version: str = None,\n                        source: str = None,\n                        update: bool = False,\n                        branch: str = None) -> List[dict]:\n        '''\n        Search PaddleHub Resource\n\n        Args:\n            type(str) : Resource type\n            name(str) : Resource name\n            version(str) : Resource version\n        '''\n        sources = self.sources.values() if not source else [self._generate_source(source)]\n        for source in sources:\n            if isinstance(source, GitSource) and update:\n                source.update()\n\n            if isinstance(source, GitSource) and branch:\n                source.checkout(branch)\n\n            result = source.search_resource(name=name, type=type, version=version)\n            if result:\n                return result\n        return []\n\n    def get_module_compat_info(self, name: str, source: str = None) -> 
dict:\n        '''Get the version compatibility information of the model.'''\n        sources = self.sources.values() if not source else [self._generate_source(source)]\n        for source in sources:\n            result = source.get_module_compat_info(name=name)\n            if result:\n                return result\n        return {}\n\n\ndef uri_path(server_url, api):\n    srv = server_url\n    if server_url.endswith('/'):\n        srv = server_url[:-1]\n    if api.startswith('/'):\n        srv += api\n    else:\n        api = '/' + api\n        srv += api\n    return srv\n\n\ndef hub_request(api, params, extra=None, timeout=8):\n    params['hub_version'] = paddlehub.__version__.split('-')[0]\n    params['paddle_version'] = paddle.__version__.split('-')[0]\n\n    params[\"extra\"] = json.dumps(extra)\n    r = requests.get(api, params, timeout=timeout)\n    return r.json()\n\n\nclass CacheUpdater(threading.Thread):\n    def __init__(self, command=\"update_cache\", module=None, version=None, addition=None):\n        threading.Thread.__init__(self)\n        self.command = command\n        self.module = module\n        self.version = version\n        self.addition = addition\n\n    def update_resource_list_file(self, command=\"update_cache\", module=None, version=None, addition=None):\n        payload = {'word': module}\n        if version:\n            payload['version'] = version\n        api_url = uri_path(hubconf.server, 'search')\n        cache_path = os.path.join(\"~\")\n        hub_name = cache_config.hub_name\n        if os.path.exists(cache_path):\n            extra = {\"command\": command, \"mtime\": os.stat(cache_path).st_mtime, \"hub_name\": hub_name}\n        else:\n            extra = {\n                \"command\": command,\n                \"mtime\": time.strftime(\"%Y-%m-%d %H:%M:%S\", time.localtime()),\n                \"hub_name\": hub_name\n            }\n        if addition is not None:\n            extra.update({\"addition\": addition})\n      
  try:\n            r = hub_request(api_url, payload, extra, timeout=1)\n            if r.get(\"update_cache\", 0) == 1:\n                with open(cache_path, 'w+') as fp:\n                    yaml.safe_dump({'resource_list': r['data']}, fp)\n        except Exception as err:\n            pass\n\n    def run(self):\n        self.update_resource_list_file(self.command, self.module, self.version, self.addition)\n\n\nmodule_server = HubServer()\nmodule_server.add_source(hubconf.server, source_type='server', key='default_hub_server')\n"
  },
  {
    "path": "paddlehub/server/server_source.py",
    "content": "#coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport json\nfrom typing import List\n\nimport requests\n\nimport paddlehub\nfrom paddlehub.utils import platform\nfrom paddlehub.utils.utils import convert_version\nfrom paddlehub.utils.utils import Version\n\n\nclass ServerConnectionError(Exception):\n\n    def __init__(self, url: str):\n        self.url = url\n\n    def __str__(self):\n        tips = 'Can\\'t connect to Hub Server: {}'.format(self.url)\n        return tips\n\n\nclass ServerSource(object):\n    '''\n    PaddleHub server source\n\n    Args:\n        url(str) : Url of the server\n        timeout(int) : Request timeout\n    '''\n\n    def __init__(self, url: str, timeout: int = 10):\n        self._url = url\n        self._timeout = timeout\n\n    def search_module(self, name: str, version: str = None) -> List[dict]:\n        '''\n        Search PaddleHub module\n\n        Args:\n            name(str) : PaddleHub module name\n            version(str) : PaddleHub module version\n        '''\n        return self.search_resource(type='module', name=name, version=version)\n\n    def search_resource(self, type: str, name: str, version: str = None) -> List[dict]:\n        '''\n        Search PaddleHub Resource\n\n        Args:\n            type(str) : Resource type\n            name(str) : Resource name\n            version(str) : Resource version\n        '''\n 
       params = {'environments': platform.get_platform_info()}\n\n        params['word'] = name\n        params['type'] = type\n        if version:\n            params['version'] = version\n\n        # Delay module loading to improve command line speed\n        import paddle\n\n        paddle_version = paddle.__version__.split('-')[0]\n        hub_version = paddlehub.__version__.split('-')[0]\n        if paddle_version == '0.0.0':  # develop version\n            paddle_version = '66.0.0'\n        if hub_version == 'develop':  # develop version\n            hub_version = '66.0.0'\n        params['hub_version'] = hub_version\n        params['paddle_version'] = paddle_version\n\n        result = self.request(path='search', params=params)\n\n        if result['status'] == 0 and len(result['data']) > 0:\n            results = []\n            for module_info in result['data']:\n                should_skip = False\n                if module_info['paddle_version']:\n                    paddle_version_intervals = convert_version(module_info['paddle_version'])\n                    for module_paddle_version in paddle_version_intervals:\n                        if not Version(paddle_version).match(module_paddle_version):\n                            should_skip = True\n                if module_info['hub_version']:\n                    hub_version_intervals = convert_version(module_info['hub_version'])\n                    for module_hub_version in hub_version_intervals:\n                        if not Version(hub_version).match(module_hub_version):\n                            should_skip = True\n                if should_skip:\n                    continue\n                results.append(module_info)\n            if results:\n                return results\n        return None\n\n    def get_module_compat_info(self, name: str) -> dict:\n        '''Get the version compatibility information of the model.'''\n        params = {'name': name}\n        result = 
self.request(path='info', params=params)\n        if result['status'] == 0 and len(result['data']) > 0:\n            infos = {}\n            for _info in result['data']['info']:\n                infos[_info['version']] = {\n                    'url': _info['url'],\n                    'paddle_version': convert_version(_info['paddle_version']),\n                    'hub_version': convert_version(_info['hub_version'])\n                }\n            return infos\n\n        return {}\n\n    def request(self, path: str, params: dict) -> dict:\n        '''Request server.'''\n        api = '{}/{}'.format(self._url, path)\n        try:\n            result = requests.get(api, params, timeout=self._timeout)\n            return result.json()\n        except requests.exceptions.ConnectionError as e:\n            raise ServerConnectionError(self._url)\n\n    def is_connected(self):\n        return self.check(self._url)\n\n    @classmethod\n    def check(cls, url: str) -> bool:\n        '''\n        Check if the specified url is a valid paddlehub server\n\n        Args:\n            url(str) : Url to check\n        '''\n        try:\n            r = requests.get(url + '/search')\n            return r.status_code == 200\n        except:\n            return False\n"
  },
  {
    "path": "paddlehub/serving/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/serving/app_compat.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport socket\nimport threading\nimport time\nimport traceback\nfrom multiprocessing import Process\nfrom threading import Lock\n\nimport requests\nfrom flask import Flask\nfrom flask import redirect\nfrom flask import request\nfrom flask import Response\n\nfrom paddlehub.serving.model_service.base_model_service import cv_module_info\nfrom paddlehub.serving.model_service.base_model_service import nlp_module_info\nfrom paddlehub.serving.model_service.base_model_service import v2_module_info\nfrom paddlehub.utils import log\nfrom paddlehub.utils import utils\n\nfilename = 'HubServing-%s.log' % time.strftime(\"%Y_%m_%d\", time.localtime())\n\n_gradio_apps = {}  # Used to store all launched gradio apps\n_lock = Lock()  # Used to prevent parallel requests to launch a server twice\n\n\ndef package_result(status: str, msg: str, data: dict):\n    '''\n    Package message of response.\n\n    Args:\n         status(str): Error code\n            ========   ==============================================================================================\n            Code       Meaning\n            --------   ----------------------------------------------------------------------------------------------\n            '000'      Return results normally\n            '101'      An error occurred in the predicting method\n            '111'   
   Module is not available\n            '112'      Use outdated and abandoned HTTP protocol format\n            ========   ===============================================================================================\n         msg(str): Detailed info for error\n         data(dict): Result of predict api.\n\n    Returns:\n        dict: Message of response\n\n    Examples:\n        .. code-block:: python\n\n            data = {'result': 0.002}\n            package_result(status='000', msg='', data=data)\n    '''\n    return {\"status\": status, \"msg\": msg, \"results\": data}\n\n\ndef create_gradio_app(module_info: dict):\n    '''\n    Create a gradio app and launch a server for users.\n    Args:\n        module_info(dict): Module info, including module name, method name and\n                            other info.\n    Return:\n        int: port number if the server launched successfully.\n\n    Exception:\n        Raise an exception if the server cannot be launched.\n    '''\n    module_name = module_info['module_name']\n    port = None\n    with _lock:\n        if module_name not in _gradio_apps:\n            try:\n                serving_method = getattr(module_info[\"module\"], 'create_gradio_app')\n            except Exception:\n                raise RuntimeError('Module {} is not supported for gradio app.'.format(module_name))\n\n            def get_free_tcp_port():\n                tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n                tcp.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)\n                tcp.bind(('localhost', 0))\n                addr, port = tcp.getsockname()\n                tcp.close()\n                return port\n\n            port = get_free_tcp_port()\n            app = serving_method()\n            process = Process(target=app.launch, kwargs={'server_port': port})\n            process.start()\n\n            def check_alive():\n                nonlocal port\n                while True:\n                    try:\n       
                 requests.get('http://localhost:{}/'.format(port))\n                        break\n                    except Exception:\n                        time.sleep(1)\n\n            check_alive()\n            _gradio_apps[module_name] = port\n    return port\n\n\ndef predict_v2(module_info: dict, input: dict):\n    '''\n\n    Predict with `serving` API of module.\n\n    Args:\n         module_info(dict): Module info include module name, method name and\n                            other info.\n         input(dict): Data to input to predict API.\n\n    Returns:\n        dict: Response after packaging by func `package_result`\n\n    Examples:\n        .. code-block:: python\n\n            module_info = {'module_name': 'lac'}}\n            data = {'text': ['今天天气很好']}\n            predict_v2(module_info=module_info, input=data)\n    '''\n    serving_method_name = module_info[\"method_name\"]\n    serving_method = getattr(module_info[\"module\"], serving_method_name)\n    predict_args = module_info[\"predict_args\"].copy()\n    predict_args.update(input)\n\n    for item in serving_method.__code__.co_varnames:\n        if item in module_info.keys():\n            predict_args.update({item: module_info[item]})\n    try:\n        output = serving_method(**predict_args)\n    except Exception as err:\n        log.logger.error(traceback.format_exc())\n        return package_result(\"101\", str(err), \"\")\n\n    return package_result(\"000\", \"\", output)\n\n\ndef create_app(init_flag: bool = False, configs: dict = None):\n    '''\n    Start one flask instance and ready for HTTP requests.\n\n    Args:\n         init_flag(bool): Whether the instance need to be initialized with\n                          `configs` or not\n         configs(dict): Module configs for initializing.\n\n    Returns:\n        One flask instance.\n\n    Examples:\n        .. 
code-block:: python\n\n            create_app(init_flag=False, configs=None)\n    '''\n    if init_flag is False:\n        if configs is None:\n            raise RuntimeError(\"Lack of necessary configs.\")\n        config_with_file(configs)\n\n    app_instance = Flask(__name__)\n    app_instance.config[\"JSON_AS_ASCII\"] = False\n    app_instance.logger = log.get_file_logger(filename)\n\n    @app_instance.route(\"/\", methods=[\"GET\", \"POST\"])\n    def index():\n        '''\n        Provide index page.\n        '''\n        return '暂不提供可视化界面，请直接使用脚本进行请求。<br/>No visual ' \\\n               'interface is provided for the time being, please use the' \\\n               ' python script to make a request directly.'\n\n    @app_instance.before_request\n    def before_request():\n        '''\n        Add id info to `request.data` before request.\n        '''\n        request.data = {\"id\": utils.md5(request.remote_addr + str(time.time()))}\n\n    @app_instance.route(\"/predict/<module_name>\", methods=[\"POST\"])\n    def predict_serving_v2(module_name: str):\n        '''\n        Http api for predicting.\n\n        Args:\n            module_name(str): Module name for predicting.\n\n        Returns:\n            Result of predicting after packaging.\n        '''\n        if module_name in v2_module_info.modules:\n            module_info = v2_module_info.get_module_info(module_name)\n        else:\n            msg = \"Module {} is not available.\".format(module_name)\n            return package_result(\"111\", msg, \"\")\n        inputs = request.json\n        if inputs is None:\n            results = \"This usage is out of date, please use 'application/json' as content-type to post to /predict/%s. 
See 'https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/serving.md' for more details.\" % (\n                module_name)\n            return package_result(\"112\", results, \"\")\n\n        results = predict_v2(module_info, inputs)\n        return results\n\n    @app_instance.route('/gradio/<module_name>', methods=[\"GET\", \"POST\"])\n    def gradio_app(module_name: str):\n        if module_name in v2_module_info.modules:\n            module_info = v2_module_info.get_module_info(module_name)\n            module_info['module_name'] = module_name\n        else:\n            msg = \"Module {} is not supported for gradio app.\".format(module_name)\n            return package_result(\"111\", msg, \"\")\n        create_gradio_app(module_info)\n        return redirect(\"/gradio/{}/app\".format(module_name), code=302)\n\n    @app_instance.route(\"/gradio/<module_name>/<path:path>\", methods=[\"GET\", \"POST\"])\n    def request_gradio_app(module_name: str, path: str):\n        '''\n        Gradio app server url interface. 
We route urls for the gradio app to the gradio server.\n\n        Args:\n            module_name(str): Module name for gradio app.\n            path(str): Resource path requested from the gradio server.\n\n        Returns:\n            Anything returned by the gradio server.\n        '''\n        port = _gradio_apps[module_name]\n        if path == 'app':\n            proxy_url = request.url.replace(request.host_url + 'gradio/{}/app'.format(module_name),\n                                            'http://localhost:{}/'.format(port))\n        else:\n            proxy_url = request.url.replace(request.host_url + 'gradio/{}/'.format(module_name),\n                                            'http://localhost:{}/'.format(port))\n        resp = requests.request(method=request.method,\n                                url=proxy_url,\n                                headers={key: value\n                                         for (key, value) in request.headers if key != 'Host'},\n                                data=request.get_data(),\n                                cookies=request.cookies,\n                                allow_redirects=False)\n        headers = [(name, value) for (name, value) in resp.raw.headers.items()]\n        response = Response(resp.content, resp.status_code, headers)\n        return response\n\n    return app_instance\n\n\ndef config_with_file(configs: dict):\n    '''\n    Configure `cv_module_info`, `nlp_module_info` and `v2_module_info` by configs.\n\n    Args:\n        configs(dict): Module info and configs\n\n    Examples:\n        .. 
code-block:: python\n\n            configs = {'lac': {'version': '1.0.0', 'category': 'NLP'}}\n            config_with_file(configs=configs)\n    '''\n    for key, value in configs.items():\n        if \"CV\" == value[\"category\"]:\n            cv_module_info.add_module(key, {key: value})\n        elif \"NLP\" == value[\"category\"]:\n            nlp_module_info.add_module(key, {key: value})\n        v2_module_info.add_module(key, {key: value})\n        logger = log.get_file_logger(filename)\n        logger.info(\"%s==%s\" % (key, value[\"version\"]))\n\n\ndef run(configs: dict = None, port: int = 8866):\n    '''\n    Run flask instance for PaddleHub-Serving\n\n    Args:\n         configs(dict): module info and configs\n         port(int): the port of the webserver\n\n    Examples:\n        .. code-block:: python\n\n            configs = {'lac': {'version': '1.0.0', 'category': 'NLP'}}\n            run(configs=configs, port=8866)\n    '''\n    logger = log.get_file_logger(filename)\n    if configs is not None:\n        config_with_file(configs)\n    else:\n        logger.error(\"Start failed because of missing configuration.\")\n        return\n    my_app = create_app(init_flag=True)\n    my_app.run(host=\"0.0.0.0\", port=port, debug=False, threaded=False)\n    log.logger.info(\"PaddleHub-Serving has been stopped.\")\n"
  },
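One portability note on the helpers in this file: `get_free_tcp_port` sets `SO_REUSEPORT`, which does not exist on Windows. A minimal portable sketch of the same trick, binding to port 0 so the OS picks a free ephemeral port (function name reused here for illustration):

```python
import socket


def get_free_tcp_port() -> int:
    # Bind to port 0 and let the OS assign any free ephemeral port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as tcp:
        # SO_REUSEADDR is portable; SO_REUSEPORT is missing on Windows.
        tcp.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        tcp.bind(('localhost', 0))
        _, port = tcp.getsockname()
    return port


port = get_free_tcp_port()
```

There is still a small race between closing the probe socket and the gradio subprocess binding the port, which the original code shares.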
  {
    "path": "paddlehub/serving/client.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport zmq\n\n\nclass InferenceClient(object):\n    def __init__(self, frontend_addr):\n        self.frontend_addr = frontend_addr\n        self.context = zmq.Context(1)\n        self.socket = self.context.socket(zmq.REQ)\n        self.socket.connect(frontend_addr)\n\n    def send_req(self, message):\n        self.socket.send_json(message)\n        result = self.socket.recv_json()\n\n        return result\n\n\nclass InferenceClientProxy(object):\n    clients = {}\n\n    @staticmethod\n    def get_client(pid, frontend_addr):\n        if pid not in InferenceClientProxy.clients.keys():\n            client = InferenceClient(frontend_addr)\n            InferenceClientProxy.clients.update({pid: client})\n        return InferenceClientProxy.clients[pid]\n"
  },
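`InferenceClientProxy` keeps one `InferenceClient` per process id, so each HTTP worker process lazily creates and then reuses a single zmq REQ connection. The caching pattern can be sketched without opening real sockets; `_StubClient` below is a stand-in for `InferenceClient`:

```python
class _StubClient(object):
    '''Stand-in for InferenceClient; records the address instead of connecting.'''

    def __init__(self, frontend_addr):
        self.frontend_addr = frontend_addr


class ClientProxy(object):
    clients = {}

    @staticmethod
    def get_client(pid, frontend_addr):
        # Create the client lazily, once per process id.
        if pid not in ClientProxy.clients:
            ClientProxy.clients[pid] = _StubClient(frontend_addr)
        return ClientProxy.clients[pid]


a = ClientProxy.get_client(100, 'tcp://localhost:5559')
b = ClientProxy.get_client(100, 'tcp://localhost:5559')
c = ClientProxy.get_client(200, 'tcp://localhost:5559')
```

The same pid always gets the same client object back; a different pid gets its own connection.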
  {
    "path": "paddlehub/serving/device.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport zmq\nimport time\nimport os\nimport json\nimport platform\nimport traceback\nimport subprocess\n\nfrom paddlehub.utils import log\nfrom paddlehub.utils.utils import is_port_occupied\n\n\nclass InferenceDevice(object):\n    '''\n    The InferenceDevice class provides zmq.device to connect with frontend and\n    backend.\n    '''\n\n    def __init__(self):\n        self.frontend = None\n        self.backend = None\n        filename = 'HubServing-%s.log' % time.strftime(\"%Y_%m_%d\", time.localtime())\n        self.logger = log.get_file_logger(filename)\n\n    def listen(self, frontend_addr: str, backend_addr: str):\n        '''\n        Start zmq.device to listen from frontend address to backend address.\n        '''\n        try:\n            context = zmq.Context(1)\n\n            self.frontend = context.socket(zmq.ROUTER)\n            self.frontend.bind(frontend_addr)\n\n            self.backend = context.socket(zmq.DEALER)\n            self.backend.bind(backend_addr)\n\n            zmq.device(zmq.QUEUE, self.frontend, self.backend)\n        except Exception as e:\n            self.logger.error(traceback.format_exc())\n        finally:\n            self.frontend.close()\n            self.backend.close()\n            context.term()\n\n\ndef start_workers(modules_info: dict, gpus: list, backend_addr: str):\n    
'''\n    Start worker subprocesses, one for each GPU device.\n\n    Args:\n        modules_info(dict): modules info, include module name, version\n        gpus(list): GPU devices index\n        backend_addr(str): the port of PaddleHub-Serving zmq backend address\n\n    Examples:\n    .. code-block:: python\n\n        modules_info = {'lac': {'init_args': {'version': '2.1.0'},\n                                'predict_args': {'batch_size': 1}}}\n        start_workers(modules_info, ['0', '1', '2'], 'ipc://backend.ipc')\n\n    '''\n    work_file = os.path.join(os.path.split(os.path.realpath(__file__))[0], 'worker.py')\n    modules_info = json.dumps(modules_info)\n    for index in range(len(gpus)):\n        subprocess.Popen(['python', work_file, modules_info, gpus[index], backend_addr])\n\n\nclass InferenceServer(object):\n    '''\n    InferenceServer class starts zmq.rep as backend.\n\n    Args:\n        modules_info(dict): modules info, include module name, version\n        gpus(list): GPU devices index\n    '''\n\n    def __init__(self, modules_info: dict, gpus: list):\n        self.modules_info = modules_info\n        self.gpus = gpus\n\n    def listen(self, port: int):\n        if platform.system() == \"Windows\":\n            back_port = int(port) + 1\n            for index in range(100):\n                if not is_port_occupied(\"127.0.0.1\", back_port):\n                    break\n                else:\n                    back_port = int(back_port) + 1\n            else:\n                raise RuntimeError(\n                    \"Port from %s to %s is occupied, please use another port\" % (int(port) + 1, back_port))\n            worker_backend = \"tcp://localhost:%s\" % back_port\n            backend = \"tcp://*:%s\" % back_port\n        else:\n            worker_backend = \"ipc://backend.ipc\"\n            backend = \"ipc://backend.ipc\"\n\n        start_workers(modules_info=self.modules_info, gpus=self.gpus, backend_addr=worker_backend)\n        d = InferenceDevice()\n        
d.listen('tcp://*:%s' % port, backend)\n"
  },
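On Windows, `InferenceServer.listen` scans up to 100 TCP ports above the serving port for a free backend port. That scan can be reproduced standalone; `is_port_occupied` is reimplemented here with `connect_ex` under the assumption that it mirrors `paddlehub.utils.utils.is_port_occupied`:

```python
import socket


def is_port_occupied(ip: str, port: int) -> bool:
    # connect_ex returns 0 only when something is listening on (ip, port).
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((ip, port)) == 0


def find_back_port(port: int, tries: int = 100) -> int:
    # Probe port+1, port+2, ... until a free port is found or tries run out.
    back_port = int(port) + 1
    for _ in range(tries):
        if not is_port_occupied('127.0.0.1', back_port):
            return back_port
        back_port += 1
    raise RuntimeError('Port from %s to %s is occupied, please use another port' % (int(port) + 1, back_port))


back_port = find_back_port(60000)
```

As in the original `for`/`else` loop, exhausting all candidates raises rather than silently reusing a busy port.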
  {
    "path": "paddlehub/serving/http_server.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport time\nimport os\nimport multiprocessing\nimport platform\n\nfrom flask import Flask, request\n\nfrom paddlehub.serving.device import InferenceServer\nfrom paddlehub.serving.client import InferenceClientProxy\nfrom paddlehub.utils import utils, log\n\nfilename = 'HubServing-%s.log' % time.strftime(\"%Y_%m_%d\", time.localtime())\n\nif platform.system() == \"Windows\":\n\n    class StandaloneApplication(object):\n        def __init__(self):\n            pass\n\n        def load_config(self):\n            pass\n\n        def load(self):\n            pass\nelse:\n    import gunicorn.app.base\n\n    class StandaloneApplication(gunicorn.app.base.BaseApplication):\n        '''\n        StandaloneApplication class provides instance of StandaloneApplication\n        as gunicorn backend.\n        '''\n\n        def __init__(self, app, options=None):\n            self.options = options or {}\n            self.application = app\n            super(StandaloneApplication, self).__init__()\n\n        def load_config(self):\n            config = {\n                key: value\n                for key, value in self.options.items() if key in self.cfg.settings and value is not None\n            }\n            for key, value in config.items():\n                self.cfg.set(key.lower(), value)\n\n        def load(self):\n            
return self.application\n\n\ndef package_result(status: str, msg: str, data: dict):\n    '''\n    Package message of response.\n\n    Args:\n         status(str): Error code\n            ========   ==============================================================================================\n            Code       Meaning\n            --------   ----------------------------------------------------------------------------------------------\n            '000'      Return results normally\n            '101'      An error occurred in the predicting method\n            '111'      Module is not available\n            '112'      Use outdated and abandoned HTTP protocol format\n            ========   ==============================================================================================\n         msg(str): Detailed info for error\n         data(dict): Result of the predict API.\n\n    Returns:\n        dict: Message of response\n\n    Examples:\n        .. code-block:: python\n\n            data = {'result': 0.002}\n            package_result(status='000', msg='', data=data)\n    '''\n    return {\"status\": status, \"msg\": msg, \"results\": data}\n\n\ndef create_app(client_port: int = 5559, modules_name: list = []):\n    '''\n    Start a flask instance, ready for HTTP requests.\n\n    Args:\n         client_port(int): the port of zmq backend address\n         modules_name(list): the name list of modules\n\n    Returns:\n        One flask instance.\n\n    Examples:\n        .. 
code-block:: python\n\n            create_app(client_port=5559)\n    '''\n    app_instance = Flask(__name__)\n    app_instance.config[\"JSON_AS_ASCII\"] = False\n    app_instance.logger = log.get_file_logger(filename)\n    pid = os.getpid()\n\n    @app_instance.route(\"/\", methods=[\"GET\", \"POST\"])\n    def index():\n        '''\n        Provide index page.\n        '''\n        return '暂不提供可视化界面，请直接使用脚本进行请求。<br/>No visual ' \\\n               'interface is provided for the time being, please use the' \\\n               ' python script to make a request directly.'\n\n    @app_instance.before_request\n    def before_request():\n        '''\n        Add id info to `request.data` before request.\n        '''\n        request.data = {\"id\": utils.md5(request.remote_addr + str(time.time()))}\n\n    @app_instance.route(\"/predict/<module_name>\", methods=[\"POST\"])\n    def predict_serving_v3(module_name: str):\n        '''\n        Http api for predicting.\n\n        Args:\n            module_name(str): Module name for predicting.\n\n        Returns:\n            Result of predicting after packaging.\n        '''\n\n        if module_name not in modules_name:\n            msg = \"Module {} is not available.\".format(module_name)\n            return package_result(\"111\", msg, \"\")\n        inputs = request.json\n        if inputs is None:\n            results = \"This usage is out of date, please use 'application/json' as content-type to post to /predict/%s. See 'https://github.com/PaddlePaddle/PaddleHub/blob/release/v1.6/docs/tutorial/serving.md' for more details.\" % (\n                module_name)\n            return package_result(\"112\", results, \"\")\n        inputs = {'module_name': module_name, 'inputs': inputs}\n        port_str = 'tcp://localhost:%s' % client_port\n\n        client = InferenceClientProxy.get_client(pid, port_str)\n\n        results = client.send_req(inputs)\n\n        return package_result(\"000\", \"\", results)\n\n    return app_instance\n\n\ndef run(port: int = 8866, client_port: int = 5559, names: list = [], workers: int = 1):\n    '''\n    Run flask instance for PaddleHub-Serving\n\n    Args:\n         port(int): the port of the webserver\n         client_port(int): the port of zmq backend address\n         names(list): the name list of modules\n         workers(int): workers for every client\n\n    Examples:\n        .. code-block:: python\n\n            run(port=8866, client_port=5559)\n    '''\n    if platform.system() == \"Windows\":\n        my_app = create_app(client_port, modules_name=names)\n        my_app.run(host=\"0.0.0.0\", port=port, debug=False, threaded=False)\n    else:\n        options = {\"bind\": \"0.0.0.0:%s\" % port, \"workers\": workers, \"worker_class\": \"sync\"}\n        StandaloneApplication(create_app(client_port, modules_name=names), options).run()\n\n    log.logger.info(\"PaddleHub-Serving has been stopped.\")\n\n\ndef run_http_server(port: int = 8866, client_port: int = 5559, names: list = [], workers: int = 1):\n    '''\n    Start subprocess to run function `run`\n\n    Args:\n        port(int): the port of the webserver\n        client_port(int): the port of zmq backend address\n        names(list): the name list of modules\n        workers(int): the workers for every client\n\n    Returns:\n        process id of subprocess\n\n    Examples:\n        .. 
code-block:: python\n\n            run_http_server(port=8866, client_port=5559, names=['lac'])\n    '''\n    names = list(names)\n    p = multiprocessing.Process(target=run, args=(port, client_port, names, workers))\n    p.start()\n    return p.pid\n\n\ndef run_all(modules_info: dict, gpus: list, frontend_port: int, backend_port: int):\n    '''\n    Run flask instance for frontend HTTP request and zmq device for backend zmq\n    request.\n\n    Args:\n        modules_info(dict): modules info, include module name, version\n        gpus(list): GPU devices index\n        frontend_port(int): the port of PaddleHub-Serving frontend address\n        backend_port(int): the port of PaddleHub-Serving zmq backend address\n\n    Examples:\n        .. code-block:: python\n\n            modules_info = {'lac': {'init_args': {'version': '2.1.0'},\n                                    'predict_args': {'batch_size': 1}}}\n            run_all(modules_info, ['0', '1', '2'], 8866, 8867)\n    '''\n    run_http_server(frontend_port, backend_port, modules_info.keys(), len(gpus))\n    MyIS = InferenceServer(modules_info, gpus)\n    MyIS.listen(backend_port)\n"
  },
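Every route in this server answers with the `(status, msg, results)` envelope built by `package_result`, so HTTP clients can branch on `status` alone. A minimal mirror of that contract, using the payload from the docstring example:

```python
def package_result(status, msg, data):
    # Envelope shared by every PaddleHub-Serving response:
    # status carries the error code, msg the detail, results the payload.
    return {"status": status, "msg": msg, "results": data}


ok = package_result("000", "", {"result": 0.002})
missing = package_result("111", "Module lac is not available.", "")
```

Note that clients must POST JSON with content-type `application/json`; otherwise `request.json` is `None` and the server answers with status `'112'`.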
  {
    "path": "paddlehub/serving/model_service/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/serving/model_service/base_model_service.py",
    "content": "# coding: utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport six\nimport abc\n\n\nclass BaseModuleInfo(object):\n    def __init__(self):\n        self._modules_info = {}\n        self._modules = []\n\n    def set_modules_info(self, modules_info):\n        # dict of modules info.\n        self._modules_info = modules_info\n        # list of modules name.\n        self._modules = list(self._modules_info.keys())\n\n    def get_module_info(self, module_name):\n        return self._modules_info[module_name]\n\n    def add_module(self, module_name, module_info):\n        self._modules_info.update(module_info)\n        self._modules.append(module_name)\n\n    def get_module(self, module_name):\n        return self.get_module_info(module_name).get(\"module\", None)\n\n    @property\n    def modules_info(self):\n        return self._modules_info\n\n\nclass CVModuleInfo(BaseModuleInfo):\n    def __init__(self):\n        self.cv_module_method = {\n            \"vgg19_imagenet\": \"predict_classification\",\n            \"vgg16_imagenet\": \"predict_classification\",\n            \"vgg13_imagenet\": \"predict_classification\",\n            \"vgg11_imagenet\": \"predict_classification\",\n            \"shufflenet_v2_imagenet\": \"predict_classification\",\n            \"se_resnext50_32x4d_imagenet\": \"predict_classification\",\n            \"se_resnext101_32x4d_imagenet\": 
\"predict_classification\",\n            \"resnet_v2_50_imagenet\": \"predict_classification\",\n            \"resnet_v2_34_imagenet\": \"predict_classification\",\n            \"resnet_v2_18_imagenet\": \"predict_classification\",\n            \"resnet_v2_152_imagenet\": \"predict_classification\",\n            \"resnet_v2_101_imagenet\": \"predict_classification\",\n            \"pnasnet_imagenet\": \"predict_classification\",\n            \"nasnet_imagenet\": \"predict_classification\",\n            \"mobilenet_v2_imagenet\": \"predict_classification\",\n            \"googlenet_imagenet\": \"predict_classification\",\n            \"alexnet_imagenet\": \"predict_classification\",\n            \"yolov3_coco2017\": \"predict_object_detection\",\n            \"ultra_light_fast_generic_face_detector_1mb_640\": \"predict_object_detection\",\n            \"ultra_light_fast_generic_face_detector_1mb_320\": \"predict_object_detection\",\n            \"ssd_mobilenet_v1_pascal\": \"predict_object_detection\",\n            \"pyramidbox_face_detection\": \"predict_object_detection\",\n            \"faster_rcnn_coco2017\": \"predict_object_detection\",\n            \"cyclegan_cityscapes\": \"predict_gan\",\n            \"deeplabv3p_xception65_humanseg\": \"predict_semantic_segmentation\",\n            \"ace2p\": \"predict_semantic_segmentation\",\n            \"pyramidbox_lite_server_mask\": \"predict_mask\",\n            \"pyramidbox_lite_mobile_mask\": \"predict_mask\"\n        }\n        super(CVModuleInfo, self).__init__()\n\n    @property\n    def cv_modules(self):\n        return self._modules\n\n    def add_module(self, module_name, module_info):\n        if \"CV\" == module_info[module_name].get(\"category\", \"\"):\n            self._modules_info.update(module_info)\n            self._modules.append(module_name)\n\n\nclass NLPModuleInfo(BaseModuleInfo):\n    def __init__(self):\n        super(NLPModuleInfo, self).__init__()\n\n    @property\n    def 
nlp_modules(self):\n        return self._modules\n\n    def add_module(self, module_name, module_info):\n        if \"NLP\" == module_info[module_name].get(\"category\", \"\"):\n            self._modules_info.update(module_info)\n            self._modules.append(module_name)\n\n\nclass V2ModuleInfo(BaseModuleInfo):\n    def __init__(self):\n        super(V2ModuleInfo, self).__init__()\n\n    @property\n    def modules(self):\n        return self._modules\n\n    def add_module(self, module_name, module_info):\n        self._modules_info.update(module_info)\n        self._modules.append(module_name)\n\n\nclass BaseModelService(object):\n    def _initialize(self):\n        pass\n\n    @abc.abstractmethod\n    def _pre_processing(self, data):\n        pass\n\n    @abc.abstractmethod\n    def _inference(self, data):\n        pass\n\n    @abc.abstractmethod\n    def _post_processing(self, data):\n        pass\n\n\ncv_module_info = CVModuleInfo()\nnlp_module_info = NLPModuleInfo()\nv2_module_info = V2ModuleInfo()\n"
  },
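A detail worth calling out in the registry classes above: `add_module` expects the nested shape `{name: {**config}}` keyed by the module name itself, which is why `config_with_file` passes `{key: value}`. A minimal mirror of that bookkeeping (the class name here is illustrative):

```python
class ModuleRegistry(object):
    '''Minimal mirror of BaseModuleInfo's add/get bookkeeping.'''

    def __init__(self):
        self._modules_info = {}
        self._modules = []

    def add_module(self, module_name, module_info):
        # module_info is a dict keyed by the module name itself.
        self._modules_info.update(module_info)
        self._modules.append(module_name)

    def get_module_info(self, module_name):
        return self._modules_info[module_name]


registry = ModuleRegistry()
registry.add_module('lac', {'lac': {'category': 'NLP', 'version': '2.1.0'}})
info = registry.get_module_info('lac')
```

Passing a flat config dict instead of the nested shape would register the config keys themselves as module names, which is the kind of mistake this shape makes easy.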
  {
    "path": "paddlehub/serving/worker.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport zmq\nimport time\nimport os\nimport json\nimport traceback\nimport sys\n\n\ndef run_worker(modules_info: dict, gpu_index: int, addr: str):\n    '''\n    Start zmq.REP as backend on specified GPU.\n\n    Args:\n        modules_info(dict): module name to serving method\n        gpu_index(int): GPU device index to use\n        addr(str): address of zmq.REP\n\n    Examples:\n        .. 
code-block:: python\n\n            modules_info = {'lac': {'predict_args': {},\n                                    'serving_method': lexical_analysis}}\n            run_worker(modules_info=modules_info,\n                       gpu_index='0',\n                       addr='ipc://backend.ipc')\n\n    '''\n    context = zmq.Context(1)\n    socket = context.socket(zmq.REP)\n    socket.connect(addr)\n\n    log.logger.info(\"Using GPU device index:%s\" % gpu_index)\n    while True:\n        try:\n            message = socket.recv_json()\n            inputs = message['inputs']\n            module_name = message['module_name']\n            inputs.update(modules_info[module_name]['predict_args'])\n            inputs.update({'use_gpu': True})\n            method = modules_info[module_name]['serving_method']\n            os.environ['CUDA_VISIBLE_DEVICES'] = gpu_index\n            output = method(**inputs)\n\n        except Exception as err:\n            log.logger.error(traceback.format_exc())\n            output = package_result(\"101\", str(err), \"\")\n        socket.send_json(output)\n\n\nif __name__ == '__main__':\n    argv = sys.argv\n    modules_info = json.loads(argv[1])\n    gpu_index = argv[2]\n    addr = argv[3]\n\n    os.environ['CUDA_VISIBLE_DEVICES'] = gpu_index\n    import paddlehub as hub\n    from paddlehub.serving.http_server import package_result\n    from paddlehub.utils import log\n\n    filename = 'HubServing-%s.log' % time.strftime(\"%Y_%m_%d\", time.localtime())\n    logger = log.get_file_logger(filename)\n    logger.logger.handlers = logger.logger.handlers[0:1]\n\n    modules_pred_info = {}\n    for module_name, module_info in modules_info.items():\n        init_args = module_info.get('init_args', {})\n        init_args.update({'name': module_name})\n        module = hub.Module(**init_args)\n        method_name = module.serving_func_name\n        serving_method = getattr(module, method_name)\n        predict_args = module_info.get('predict_args', {})\n        modules_pred_info.update({module_name: {'predict_args': predict_args, 
'serving_method': serving_method}})\n\n    run_worker(modules_pred_info, gpu_index, addr)\n"
  },
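Stripped of the GPU bookkeeping and zmq plumbing, each iteration of the worker loop above merges the module's `predict_args` into the request inputs and dispatches to the serving method. A self-contained sketch with a stand-in serving method (the real worker additionally injects `use_gpu=True` and sets `CUDA_VISIBLE_DEVICES`):

```python
def handle_message(message, modules_info):
    # Merge per-module predict_args into the request inputs, then dispatch.
    inputs = dict(message['inputs'])
    module_name = message['module_name']
    inputs.update(modules_info[module_name]['predict_args'])
    method = modules_info[module_name]['serving_method']
    return method(**inputs)


# Stand-in serving method that just echoes its keyword arguments.
modules_info = {'echo': {'predict_args': {'batch_size': 1},
                         'serving_method': lambda **kwargs: kwargs}}
output = handle_message({'module_name': 'echo', 'inputs': {'text': ['hi']}}, modules_info)
```

Because `predict_args` is merged after the request inputs, server-side defaults override whatever the client sent for the same keys.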
  {
    "path": "paddlehub/text/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/text/bert_tokenizer.py",
    "content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n'''This file is modified from https://github.com/huggingface/transformers'''\n\nimport collections\nimport os\nimport pickle\nimport unicodedata\nfrom typing import Dict, List, Optional, Union, Tuple\n\nfrom paddle.utils import try_import\n\nfrom paddlehub.text.utils import load_vocab, is_whitespace, is_control, is_punctuation, whitespace_tokenize, is_chinese_char\n\n\nclass BasicTokenizer(object):\n    '''Runs basic tokenization (punctuation splitting, lower casing, etc.).'''\n\n    def __init__(self, do_lower_case: bool = True, never_split: List[str] = None, tokenize_chinese_chars: bool = True):\n        ''' Constructs a BasicTokenizer.\n        Args:\n            do_lower_case: Whether to lower case the input.\n            never_split: (`optional`) list of str\n                List of token not to split.\n            tokenize_chinese_chars: (`optional`) boolean (default True)\n                Whether to tokenize Chinese characters.\n                This should likely be deactivated for Japanese:\n                see: https://github.com/huggingface/pytorch-pretrained-BERT/issues/328\n        '''\n        if never_split is None:\n            never_split = []\n        self.do_lower_case = do_lower_case\n        self.never_split = never_split\n        self.tokenize_chinese_chars = 
tokenize_chinese_chars\n\n    def tokenize(self, text: str, never_split: List[str] = None):\n        ''' Basic Tokenization of a piece of text.\n            Split on 'white spaces' only; for sub-word tokenization, see WordpieceTokenizer.\n        Args:\n            **never_split**: (`optional`) list of str\n                List of tokens not to split.\n        '''\n        never_split = self.never_split + (never_split if never_split is not None else [])\n        text = self._clean_text(text)\n        # This was added on November 1st, 2018 for the multilingual and Chinese\n        # models. This is also applied to the English models now, but it doesn't\n        # matter since the English models were not trained on any Chinese data\n        # and generally don't have any Chinese data in them (there are Chinese\n        # characters in the vocabulary because Wikipedia does have some Chinese\n        # words in the English Wikipedia.).\n        if self.tokenize_chinese_chars:\n            text = self._tokenize_chinese_chars(text)\n        orig_tokens = whitespace_tokenize(text)\n        split_tokens = []\n        for token in orig_tokens:\n            if self.do_lower_case and token not in never_split:\n                token = token.lower()\n                token = self._run_strip_accents(token)\n            split_tokens.extend(self._run_split_on_punc(token, never_split))\n\n        output_tokens = whitespace_tokenize(' '.join(split_tokens))\n        return output_tokens\n\n    def _run_strip_accents(self, text: str):\n        '''Strips accents from a piece of text.'''\n        text = unicodedata.normalize('NFD', text)\n        output = []\n        for char in text:\n            cat = unicodedata.category(char)\n            if cat == 'Mn':\n                continue\n            output.append(char)\n        return ''.join(output)\n\n    def _run_split_on_punc(self, text: str, never_split: List[str] = None):\n        '''Splits punctuation on a piece of text.'''\n        
if never_split is not None and text in never_split:\n            return [text]\n        chars = list(text)\n        i = 0\n        start_new_word = True\n        output = []\n        while i < len(chars):\n            char = chars[i]\n            if is_punctuation(char):\n                output.append([char])\n                start_new_word = True\n            else:\n                if start_new_word:\n                    output.append([])\n                start_new_word = False\n                output[-1].append(char)\n            i += 1\n\n        return [''.join(x) for x in output]\n\n    def _tokenize_chinese_chars(self, text: str):\n        '''Adds whitespace around any CJK character.'''\n        output = []\n        for char in text:\n            if is_chinese_char(char):\n                output.append(' ')\n                output.append(char)\n                output.append(' ')\n            else:\n                output.append(char)\n        return ''.join(output)\n\n    def _clean_text(self, text: str):\n        '''Performs invalid character removal and whitespace cleanup on text.'''\n        output = []\n        for char in text:\n            cp = ord(char)\n            if cp == 0 or cp == 0xFFFD or is_control(char):\n                continue\n            if is_whitespace(char):\n                output.append(' ')\n            else:\n                output.append(char)\n        return ''.join(output)\n\n    def encode(self):\n        raise NotImplementedError('This tokenizer can only do tokenize(...), '\n                                  'the ability to convert tokens to ids has not been implemented')\n\n    def decode(self):\n        raise NotImplementedError('This tokenizer can only do tokenize(...), '\n                                  'the ability to convert ids to tokens has not been implemented')\n\n\nclass WordpieceTokenizer(object):\n    '''Runs WordPiece tokenization.'''\n\n    def __init__(self, vocab: List[str], unk_token: str, 
max_input_chars_per_word: int = 100):\n        self.vocab = vocab\n        self.unk_token = unk_token\n        self.max_input_chars_per_word = max_input_chars_per_word\n\n    def tokenize(self, text):\n        '''Tokenizes a piece of text into its word pieces.\n        This uses a greedy longest-match-first algorithm to perform tokenization\n        using the given vocabulary.\n        For example:\n          input = 'unaffable'\n          output = ['un', '##aff', '##able']\n        Args:\n          text: A single token or whitespace separated tokens. This should have\n            already been passed through `BasicTokenizer`.\n        Returns:\n          A list of wordpiece tokens.\n        '''\n\n        output_tokens = []\n        for token in whitespace_tokenize(text):\n            chars = list(token)\n            if len(chars) > self.max_input_chars_per_word:\n                output_tokens.append(self.unk_token)\n                continue\n\n            is_bad = False\n            start = 0\n            sub_tokens = []\n            while start < len(chars):\n                end = len(chars)\n                cur_substr = None\n                while start < end:\n                    substr = ''.join(chars[start:end])\n                    if start > 0:\n                        substr = '##' + substr\n                    if substr in self.vocab:\n                        cur_substr = substr\n                        break\n                    end -= 1\n                if cur_substr is None:\n                    is_bad = True\n                    break\n                sub_tokens.append(cur_substr)\n                start = end\n\n            if is_bad:\n                output_tokens.append(self.unk_token)\n            else:\n                output_tokens.extend(sub_tokens)\n        return output_tokens\n\n    def encode(self):\n        raise NotImplementedError('This tokenizer can only do tokenize(...), '\n                                  'the ability to convert 
 tokens to ids has not been implemented')\n\n    def decode(self):\n        raise NotImplementedError('This tokenizer can only do tokenize(...), '\n                                  'the ability to convert ids to tokens has not been implemented')\n\n\nclass BertTokenizer(object):\n    '''\n    Constructs a BERT tokenizer. Based on WordPiece.\n    Args:\n        vocab_file (:obj:`string`):\n            File containing the vocabulary.\n        do_lower_case (:obj:`bool`, `optional`, defaults to :obj:`True`):\n            Whether to lowercase the input when tokenizing.\n        do_basic_tokenize (:obj:`bool`, `optional`, defaults to :obj:`True`):\n            Whether to do basic tokenization before WordPiece.\n        never_split (:obj:`List[str]`, `optional`, defaults to :obj:`None`):\n            List of tokens which will never be split during tokenization. Only has an effect when\n            :obj:`do_basic_tokenize=True`\n        unk_token (:obj:`string`, `optional`, defaults to '[UNK]'):\n            The unknown token. A token that is not in the vocabulary cannot be converted to an ID and is set to be this\n            token instead.\n        sep_token (:obj:`string`, `optional`, defaults to '[SEP]'):\n            The separator token, which is used when building a sequence from multiple sequences, e.g. two sequences\n            for sequence classification or for a text and a question for question answering.\n            It is also used as the last token of a sequence built with special tokens.\n        pad_token (:obj:`string`, `optional`, defaults to '[PAD]'):\n            The token used for padding, for example when batching sequences of different lengths.\n        cls_token (:obj:`string`, `optional`, defaults to '[CLS]'):\n            The classifier token which is used when doing sequence classification (classification of the whole\n            sequence instead of per-token classification). 
It is the first token of the sequence when built with\n            special tokens.\n        mask_token (:obj:`string`, `optional`, defaults to '[MASK]'):\n            The token used for masking values. This is the token used when training this model with masked language\n            modeling. This is the token which the model will try to predict.\n        tokenize_chinese_chars (:obj:`bool`, `optional`, defaults to :obj:`True`):\n            Whether to tokenize Chinese characters.\n            This should likely be deactivated for Japanese:\n            see: https://github.com/huggingface/transformers/issues/328\n    '''\n\n    def __init__(\n            self,\n            vocab_file: str,\n            do_lower_case: bool = True,\n            do_basic_tokenize: bool = True,\n            never_split: List[str] = None,\n            unk_token: str = '[UNK]',\n            sep_token: str = '[SEP]',\n            pad_token: str = '[PAD]',\n            cls_token: str = '[CLS]',\n            mask_token: str = '[MASK]',\n            tokenize_chinese_chars: bool = True,\n    ):\n        self.unk_token = unk_token\n        self.sep_token = sep_token\n        self.pad_token = pad_token\n        self.cls_token = cls_token\n        self.mask_token = mask_token\n        self.do_lower_case = do_lower_case\n        self.all_special_tokens = [unk_token, sep_token, pad_token, cls_token, mask_token]\n\n        if not os.path.isfile(vocab_file):\n            raise ValueError('Can\\'t find a vocabulary file at path \\'{}\\'.'.format(vocab_file))\n        self.vocab = load_vocab(vocab_file)\n        self.ids_to_tokens = collections.OrderedDict([(ids, tok) for tok, ids in self.vocab.items()])\n        self.do_basic_tokenize = do_basic_tokenize\n        if do_basic_tokenize:\n            self.basic_tokenizer = BasicTokenizer(\n                do_lower_case=do_lower_case, never_split=never_split, tokenize_chinese_chars=tokenize_chinese_chars)\n        self.wordpiece_tokenizer = 
WordpieceTokenizer(vocab=self.vocab, unk_token=self.unk_token)\n\n        self.unk_token_id = self.convert_tokens_to_ids(self.unk_token)\n        self.sep_token_id = self.convert_tokens_to_ids(self.sep_token)\n        self.pad_token_id = self.convert_tokens_to_ids(self.pad_token)\n        self.pad_token_type_id = 0\n        self.cls_token_id = self.convert_tokens_to_ids(self.cls_token)\n        self.mask_token_id = self.convert_tokens_to_ids(self.mask_token)\n        self.all_special_ids = self.convert_tokens_to_ids(self.all_special_tokens)\n\n    @property\n    def vocab_size(self):\n        return len(self.vocab)\n\n    def get_vocab(self):\n        return dict(self.vocab)\n\n    def _convert_token_to_id(self, token):\n        ''' Converts a token (str) to an id using the vocab. '''\n        return self.vocab.get(token, self.vocab.get(self.unk_token))\n\n    def _convert_id_to_token(self, index):\n        '''Converts an index (integer) to a token (str) using the vocab.'''\n        return self.ids_to_tokens.get(index, self.unk_token)\n\n    def convert_tokens_to_string(self, tokens):\n        ''' Converts a sequence of tokens (string) into a single string. '''\n        out_string = ' '.join(tokens).replace(' ##', '').strip()\n        return out_string\n\n    def convert_tokens_to_ids(self, tokens):\n        ''' Converts a token string (or a sequence of tokens) to a single integer id\n            (or a sequence of ids), using the vocabulary.\n        '''\n        if tokens is None:\n            return None\n\n        if isinstance(tokens, str):\n            return self._convert_token_to_id(tokens)\n\n        ids = []\n        for token in tokens:\n            ids.append(self._convert_token_to_id(token))\n        return ids\n\n    def convert_ids_to_tokens(self, ids: Union[int, List[int]],\n                              skip_special_tokens: bool = False) -> Union[int, List[int]]:\n        ''' Converts a single index or a sequence of indices (integers) to a token\n            (resp. a sequence of tokens) (str), using the vocabulary and added tokens.\n            Args:\n                skip_special_tokens: Don't decode special tokens (self.all_special_tokens). Default: False\n        '''\n        if isinstance(ids, int):\n            return self._convert_id_to_token(ids)\n        tokens = []\n        for index in ids:\n            index = int(index)\n            if skip_special_tokens and index in self.all_special_ids:\n                continue\n            tokens.append(self._convert_id_to_token(index))\n        return tokens\n\n    def tokenize(self, text: str):\n        ''' Converts a string into a sequence of tokens (string), using the tokenizer.\n            Split in words for word-based vocabulary or sub-words for sub-word-based\n            vocabularies (BPE/SentencePieces/WordPieces).\n            Take care of added tokens.\n            Args:\n                text (:obj:`string`): The sequence to be encoded.\n        '''\n        split_tokens = []\n        if self.do_basic_tokenize:\n            for token in self.basic_tokenizer.tokenize(text, never_split=self.all_special_tokens):\n                for sub_token in self.wordpiece_tokenizer.tokenize(token):\n                    split_tokens.append(sub_token)\n        else:\n            split_tokens = self.wordpiece_tokenizer.tokenize(text)\n        return split_tokens\n\n    def build_inputs_with_special_tokens(self, token_ids_0: List[int],\n                                         token_ids_1: Optional[List[int]] = None) -> List[int]:\n        '''\n        Build model inputs from a sequence or a pair of sequences for sequence classification tasks\n        by concatenating and adding special tokens.\n        A BERT sequence has the following format:\n        - single sequence: ``[CLS] X [SEP]``\n        - pair of sequences: ``[CLS] A [SEP] B [SEP]``\n        Args:\n            token_ids_0 (:obj:`List[int]`):\n                List of IDs to which the special tokens will be added\n            token_ids_1 (:obj:`List[int]`, `optional`, defaults to :obj:`None`):\n                Optional second list of IDs for sequence pairs.\n        Returns:\n            
:obj:`List[int]`: list of `input IDs` with the appropriate special tokens.\n        '''\n        if token_ids_1 is None:\n            return [self.cls_token_id] + token_ids_0 + [self.sep_token_id]\n        cls = [self.cls_token_id]\n        sep = [self.sep_token_id]\n        return cls + token_ids_0 + sep + token_ids_1 + sep\n\n    def num_special_tokens_to_add(self, pair=False):\n        '''\n        Returns the number of added tokens when encoding a sequence with special tokens.\n        Note:\n            This encodes inputs and checks the number of added tokens, and is therefore not efficient. Do not put this\n            inside your training loop.\n        Args:\n            pair: Returns the number of added tokens in the case of a sequence pair if set to True, returns the\n                number of added tokens in the case of a single sequence if set to False.\n        Returns:\n            Number of tokens added to sequences\n        '''\n        token_ids_0 = []\n        token_ids_1 = []\n        return len(self.build_inputs_with_special_tokens(token_ids_0, token_ids_1 if pair else None))\n\n    def get_special_tokens_mask(self,\n                                token_ids_0: List[int],\n                                token_ids_1: Optional[List[int]] = None,\n                                already_has_special_tokens: bool = False) -> List[int]:\n        '''\n        Retrieves sequence ids from a token list that has no special tokens added. 
This method is called when adding\n        special tokens using the tokenizer ``prepare_for_model`` or ``encode_plus`` methods.\n        Args:\n            token_ids_0 (:obj:`List[int]`):\n                List of ids.\n            token_ids_1 (:obj:`List[int]`, `optional`, defaults to :obj:`None`):\n                Optional second list of IDs for sequence pairs.\n            already_has_special_tokens (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Set to True if the token list is already formatted with special tokens for the model\n        Returns:\n            :obj:`List[int]`: A list of integers in the range [0, 1]: 1 for a special token, 0 for a sequence token.\n        '''\n\n        if already_has_special_tokens:\n            if token_ids_1 is not None:\n                raise ValueError('You should not supply a second sequence if the provided sequence of '\n                                 'ids is already formatted with special tokens for the model.')\n            return list(map(lambda x: 1 if x in [self.sep_token_id, self.cls_token_id] else 0, token_ids_0))\n\n        if token_ids_1 is not None:\n            return [1] + ([0] * len(token_ids_0)) + [1] + ([0] * len(token_ids_1)) + [1]\n        return [1] + ([0] * len(token_ids_0)) + [1]\n\n    def create_segment_ids_from_sequences(self, token_ids_0: List[int],\n                                          token_ids_1: Optional[List[int]] = None) -> List[int]:\n        '''\n        Creates a mask from the two sequences passed to be used in a sequence-pair classification task.\n        A BERT sequence pair mask has the following format:\n        ::\n            0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1\n            | first sequence    | second sequence |\n        if token_ids_1 is None, only returns the first portion of the mask (0's).\n        Args:\n            token_ids_0 (:obj:`List[int]`):\n                List of ids.\n            token_ids_1 (:obj:`List[int]`, `optional`, defaults to :obj:`None`):\n                Optional second list of IDs for sequence pairs.\n        Returns:\n            :obj:`List[int]`: List of `token type IDs` according to the given sequence(s).\n        '''\n        sep = [self.sep_token_id]\n        cls = [self.cls_token_id]\n        if token_ids_1 is None:\n            return len(cls + token_ids_0 + sep) * [0]\n        return len(cls + token_ids_0 + sep) * [0] + len(token_ids_1 + sep) * [1]\n\n    def clean_up_tokenization(self, out_string: str) -> str:\n        ''' Clean up a list of simple English tokenization artifacts like spaces before punctuation and abbreviated forms.\n        '''\n        out_string = (out_string.replace(' .', '.').replace(' ?', '?').replace(' !', '!').replace(' ,', ',').replace(\n            ' \\' ', '\\'').replace(' n\\'t', 'n\\'t').replace(' \\'m', '\\'m').replace(' \\'s', '\\'s').replace(\n                ' \\'ve', '\\'ve').replace(' \\'re', '\\'re'))\n        return out_string\n\n    def truncate_sequences(\n            self,\n            ids: List[int],\n            pair_ids: Optional[List[int]] = None,\n            num_tokens_to_remove: int = 0,\n            truncation_strategy: str = 'longest_first',\n            stride: int = 0,\n    ) -> Tuple[List[int], List[int], List[int]]:\n        ''' Truncates a sequence pair in place to the maximum length.\n        Args:\n            ids: list of tokenized input ids. Can be obtained from a string by chaining the\n                `tokenize` and `convert_tokens_to_ids` methods.\n            pair_ids: Optional second list of input ids. Can be obtained from a string by chaining the\n                `tokenize` and `convert_tokens_to_ids` methods.\n            num_tokens_to_remove (:obj:`int`, `optional`, defaults to ``0``):\n                number of tokens to remove using the truncation strategy\n            truncation_strategy: string selected in the following options:\n                - 'longest_first' (default): Iteratively reduce the input sequences until the total length is under max_seq_len,\n                    removing one token at a time from the longest sequence (when there is a pair of input sequences).\n                    Overflowing tokens only contain overflow from the first sequence.\n                - 'only_first': Only truncate the first sequence. Raise an error if the first sequence is shorter than or equal to num_tokens_to_remove.\n                - 'only_second': Only truncate the second sequence\n                - 'do_not_truncate': Does not truncate (raise an error if the input sequence is longer than max_seq_len)\n            stride (:obj:`int`, `optional`, defaults to ``0``):\n                If set to a number along with max_seq_len, the overflowing tokens returned will contain some tokens\n                from the main sequence returned. The value of this argument defines the number of additional tokens.\n        '''\n        if num_tokens_to_remove <= 0:\n            return ids, pair_ids, []\n\n        if truncation_strategy == 'longest_first':\n            overflowing_tokens = []\n            for _ in range(num_tokens_to_remove):\n                if pair_ids is None or len(ids) > len(pair_ids):\n                    overflowing_tokens = [ids[-1]] + overflowing_tokens\n                    ids = ids[:-1]\n                else:\n                    pair_ids = pair_ids[:-1]\n            window_len = min(len(ids), stride)\n            if window_len > 0:\n                overflowing_tokens = ids[-window_len:] + overflowing_tokens\n        elif truncation_strategy == 'only_first':\n            assert len(ids) > num_tokens_to_remove\n            window_len = min(len(ids), stride + num_tokens_to_remove)\n            overflowing_tokens = ids[-window_len:]\n            ids = ids[:-num_tokens_to_remove]\n        elif truncation_strategy == 'only_second':\n            assert pair_ids is not None and len(pair_ids) > num_tokens_to_remove\n            window_len = min(len(pair_ids), stride + num_tokens_to_remove)\n            overflowing_tokens = pair_ids[-window_len:]\n            pair_ids = pair_ids[:-num_tokens_to_remove]\n        elif truncation_strategy == 'do_not_truncate':\n            raise ValueError('Input sequences are too long for max_seq_len. 
Please select a truncation strategy.')\n        else:\n            raise ValueError(\n                'Truncation_strategy should be selected in [\\'longest_first\\', \\'only_first\\', \\'only_second\\', \\'do_not_truncate\\']'\n            )\n        return (ids, pair_ids, overflowing_tokens)\n\n    def encode(self,\n               text: Union[str, List[str], List[int]],\n               text_pair: Optional[Union[str, List[str], List[int]]] = None,\n               max_seq_len: Optional[int] = None,\n               pad_to_max_seq_len: bool = True,\n               truncation_strategy: str = 'longest_first',\n               return_position_ids: bool = False,\n               return_segment_ids: bool = True,\n               return_input_mask: bool = False,\n               return_length: bool = True,\n               return_overflowing_tokens: bool = False,\n               return_special_tokens_mask: bool = False):\n        '''\n        Returns a dictionary containing the encoded sequence or sequence pair and additional information:\n        the mask for sequence classification and the overflowing elements if a ``max_seq_len`` is specified.\n        Args:\n            text (:obj:`str`, :obj:`List[str]` or :obj:`List[int]`):\n                The first sequence to be encoded. This can be a string, a list of strings (tokenized string using\n                the `tokenize` method) or a list of integers (tokenized string ids using the `convert_tokens_to_ids`\n                method)\n            text_pair (:obj:`str`, :obj:`List[str]` or :obj:`List[int]`, `optional`, defaults to :obj:`None`):\n                Optional second sequence to be encoded. 
This can be a string, a list of strings (tokenized\n                string using the `tokenize` method) or a list of integers (tokenized string ids using the\n                `convert_tokens_to_ids` method)\n            max_seq_len (:obj:`int`, `optional`, defaults to :obj:`None`):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n                If there are overflowing tokens, those will be added to the returned dictionary\n            pad_to_max_seq_len (:obj:`bool`, `optional`, defaults to :obj:`True`):\n                If set to True, the returned sequences will be padded according to the model's padding side and\n                padding index, up to their max length. If no max length is specified, the padding is done up to the\n                model's max length.\n            truncation_strategy (:obj:`str`, `optional`, defaults to `longest_first`):\n                String selected in the following options:\n                - 'longest_first' (default): Iteratively reduce the input sequences until the total length is under max_seq_len,\n                  removing one token at a time from the longest sequence (when there is a pair of input sequences)\n                - 'only_first': Only truncate the first sequence\n                - 'only_second': Only truncate the second sequence\n                - 'do_not_truncate': Does not truncate (raise an error if the input sequence is longer than max_seq_len)\n            return_position_ids (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Set to True to return token position ids (default False).\n            return_segment_ids (:obj:`bool`, `optional`, defaults to :obj:`True`):\n                Whether to return token type IDs.\n            return_input_mask (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Whether to return the attention mask.\n            return_length (:obj:`bool`, defaults to :obj:`True`):\n                If set, the resulting dictionary will include the length of each encoded input\n            return_overflowing_tokens (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Set to True to return overflowing token information (default False).\n            return_special_tokens_mask (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Set to True to return special tokens mask information (default False).\n        Return:\n            A Dictionary of shape::\n                {\n                    input_ids: list[int],\n                    position_ids: list[int] if return_position_ids is True\n                    segment_ids: list[int] if return_segment_ids is True (default)\n                    input_mask: list[int] if return_input_mask is True\n                    seq_len: int if return_length is True (default)\n                    overflowing_tokens: list[int] if a ``max_seq_len`` is specified and return_overflowing_tokens is True\n                    num_truncated_tokens: int if a ``max_seq_len`` is specified and return_overflowing_tokens is True\n                    special_tokens_mask: list[int] if return_special_tokens_mask is True\n                }\n            With the fields:\n            - ``input_ids``: list of token ids to be fed to a model\n            - ``position_ids``: list of token position ids to be fed to a model\n            - ``segment_ids``: list of token type ids to be fed to a model\n            - ``input_mask``: list of indices specifying which tokens should be attended to by the model\n            - ``length``: the input_ids length\n            - ``overflowing_tokens``: list of overflowing tokens if a max length is specified.\n            - ``num_truncated_tokens``: number of overflowing tokens when a ``max_seq_len`` is specified\n            - ``special_tokens_mask``: if adding special tokens, this is a list of [0, 1], with 1 specifying special added\n              tokens and 0 specifying sequence 
 tokens.\n        '''\n\n        def get_input_ids(text):\n            if isinstance(text, str):\n                tokens = self.tokenize(text)\n                return self.convert_tokens_to_ids(tokens)\n            elif isinstance(text, (list, tuple)) and len(text) > 0 and isinstance(text[0], str):\n                return self.convert_tokens_to_ids(text)\n            elif isinstance(text, (list, tuple)) and len(text) > 0 and isinstance(text[0], int):\n                return text\n            else:\n                raise ValueError(\n                    'Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.')\n\n        ids = get_input_ids(text)\n        pair_ids = get_input_ids(text_pair) if text_pair is not None else None\n\n        pair = bool(pair_ids is not None)\n        len_ids = len(ids)\n        len_pair_ids = len(pair_ids) if pair else 0\n\n        encoded_inputs = {}\n\n        # Truncation: Handle max sequence length\n        total_len = len_ids + len_pair_ids + (self.num_special_tokens_to_add(pair=pair))\n        if max_seq_len and total_len > max_seq_len:\n            ids, pair_ids, overflowing_tokens = self.truncate_sequences(\n                ids,\n                pair_ids=pair_ids,\n                num_tokens_to_remove=total_len - max_seq_len,\n                truncation_strategy=truncation_strategy,\n            )\n            if return_overflowing_tokens:\n                encoded_inputs['overflowing_tokens'] = overflowing_tokens\n                encoded_inputs['num_truncated_tokens'] = total_len - max_seq_len\n\n        # Add special tokens\n        sequence = self.build_inputs_with_special_tokens(ids, pair_ids)\n        segment_ids = self.create_segment_ids_from_sequences(ids, pair_ids)\n\n        # Build output dictionary\n        encoded_inputs['input_ids'] = sequence\n        if return_segment_ids:\n            encoded_inputs['segment_ids'] = segment_ids\n        if return_special_tokens_mask:\n            encoded_inputs['special_tokens_mask'] = self.get_special_tokens_mask(ids, pair_ids)\n        if return_length:\n            encoded_inputs['seq_len'] = len(encoded_inputs['input_ids'])\n\n        # Check lengths\n        assert max_seq_len is None or len(encoded_inputs['input_ids']) <= max_seq_len\n\n        # Padding\n        needs_to_be_padded = pad_to_max_seq_len and \\\n                             max_seq_len and len(encoded_inputs['input_ids']) < max_seq_len\n\n        if needs_to_be_padded:\n            difference = max_seq_len - len(encoded_inputs['input_ids'])\n            if return_input_mask:\n                encoded_inputs['input_mask'] = [1] * len(encoded_inputs['input_ids']) + [0] * difference\n            if return_segment_ids:\n                encoded_inputs['segment_ids'] = (encoded_inputs['segment_ids'] + [self.pad_token_type_id] * difference)\n            if return_special_tokens_mask:\n                encoded_inputs['special_tokens_mask'] = encoded_inputs['special_tokens_mask'] + [1] * difference\n            encoded_inputs['input_ids'] = encoded_inputs['input_ids'] + [self.pad_token_id] * difference\n        else:\n            if return_input_mask:\n                encoded_inputs['input_mask'] = [1] * len(encoded_inputs['input_ids'])\n\n        if return_position_ids:\n            encoded_inputs['position_ids'] = list(range(len(encoded_inputs['input_ids'])))\n\n        return encoded_inputs\n\n    def decode(self,\n               token_ids: Union[List[int], Dict],\n               only_convert_to_tokens: bool = False,\n               skip_pad_token: bool = False,\n               skip_special_tokens: bool = False,\n               clean_up_tokenization_spaces: bool = True):\n        '''\n        Converts a sequence of ids (integer) to a string if only_convert_to_tokens is False, or to a list of tokens (str)\n        when only_convert_to_tokens is True.\n        Args:\n            token_ids: list of tokenized input ids or dict containing a 
key called 'input_ids', can be obtained using the `encode` method.\n            only_convert_to_tokens:  if set to True, will only return a list of tokens (str). `paddlehub.dataset.base_nlp_dataset` will use this optional argument.\n            skip_pad_token: if set to True, will skip pad tokens.\n            skip_special_tokens: if set to True, will skip special tokens.\n            clean_up_tokenization_spaces: if set to True, will clean up the tokenization spaces.\n        '''\n        if isinstance(token_ids, dict):\n            token_ids = token_ids['input_ids']\n\n        filtered_tokens = self.convert_ids_to_tokens(token_ids, skip_special_tokens=skip_special_tokens)\n\n        tokens = []\n        for token in filtered_tokens:\n            if skip_pad_token and token == self.pad_token:\n                continue\n            tokens.append(token)\n        if only_convert_to_tokens:\n            return tokens\n\n        if tokens:\n            text = self.convert_tokens_to_string(tokens)\n        else:\n            text = ''\n\n        if clean_up_tokenization_spaces:\n            clean_text = self.clean_up_tokenization(text)\n            return clean_text\n        else:\n            return text\n\n\nclass ErnieTinyTokenizer(BertTokenizer):\n    def __init__(\n            self,\n            vocab_file: str,\n            spm_path: str,\n            word_dict_path: str,\n            do_lower_case: bool = True,\n            unk_token: str = '[UNK]',\n            sep_token: str = '[SEP]',\n            pad_token: str = '[PAD]',\n            cls_token: str = '[CLS]',\n            mask_token: str = '[MASK]',\n    ):\n        mod = try_import('sentencepiece')\n        self.unk_token = unk_token\n        self.sep_token = sep_token\n        self.pad_token = pad_token\n        self.cls_token = cls_token\n        self.mask_token = mask_token\n        self.do_lower_case = do_lower_case\n        self.all_special_tokens = [unk_token, sep_token, pad_token, 
cls_token, mask_token]\n\n        if not os.path.isfile(vocab_file):\n            raise ValueError('Can\\'t find a vocabulary file at path \\'{}\\'.'.format(vocab_file))\n        self.vocab = load_vocab(vocab_file)\n        self.ids_to_tokens = collections.OrderedDict([(ids, tok) for tok, ids in self.vocab.items()])\n\n        # Here is the difference with BertTokenizer.\n        self.dict = pickle.load(open(word_dict_path, 'rb'))\n        self.sp_model = mod.SentencePieceProcessor()\n        self.window_size = 5\n        self.sp_model.Load(spm_path)\n\n        self.unk_token_id = self.convert_tokens_to_ids(self.unk_token)\n        self.sep_token_id = self.convert_tokens_to_ids(self.sep_token)\n        self.pad_token_id = self.convert_tokens_to_ids(self.pad_token)\n        self.pad_token_type_id = 0\n        self.cls_token_id = self.convert_tokens_to_ids(self.cls_token)\n        self.mask_token_id = self.convert_tokens_to_ids(self.mask_token)\n        self.all_special_ids = self.convert_tokens_to_ids(self.all_special_tokens)\n\n    def cut(self, chars: List[str]):\n        words = []\n        idx = 0\n        while idx < len(chars):\n            matched = False\n            for i in range(self.window_size, 0, -1):\n                cand = chars[idx:idx + i]\n                if cand in self.dict:\n                    words.append(cand)\n                    matched = True\n                    break\n            if not matched:\n                i = 1\n                words.append(chars[idx])\n            idx += i\n        return words\n\n    def tokenize(self, text: str):\n        text = [s for s in self.cut(text) if s != ' ']\n        if self.do_lower_case:\n            text = [s.lower() for s in text]\n        text = ' '.join(text)\n        tokens = self.sp_model.EncodeAsPieces(text)\n        in_vocab_tokens = []\n        for token in tokens:\n            if token in self.vocab:\n                in_vocab_tokens.append(token)\n            else:\n                
in_vocab_tokens.append(self.unk_token)\n        return in_vocab_tokens\n"
  },
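The greedy forward maximum-matching segmentation in `ErnieTinyTokenizer.cut` above can be sketched in isolation. `greedy_cut` and `word_dict` are hypothetical names standing in for the method and its pickled `self.dict`; the window of 5 mirrors `self.window_size`:

```python
def greedy_cut(chars, word_dict, window_size=5):
    # Greedy forward maximum matching: at each position, try the longest
    # candidate first (window_size chars), shrinking until a dictionary
    # hit; fall back to a single character if nothing matches.
    words = []
    idx = 0
    while idx < len(chars):
        matched = False
        for i in range(window_size, 0, -1):
            cand = chars[idx:idx + i]
            if cand in word_dict:
                words.append(cand)
                matched = True
                break
        if not matched:
            i = 1
            words.append(chars[idx])
        idx += i
    return words
```

Note that, like the original, the fallback branch resets the loop variable `i` to 1 so the cursor advances by exactly one character when no dictionary entry matches.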
  {
    "path": "paddlehub/text/tokenizer.py",
"content": "# -*- coding:utf-8 -*-\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport collections\nimport inspect\nimport os\nfrom typing import Callable, List, Union\n\nimport paddlehub as hub\nfrom paddlehub.utils.log import logger\nfrom paddlehub.text.bert_tokenizer import BasicTokenizer\nfrom paddlehub.text.utils import load_vocab, whitespace_tokenize\n\n\nclass CustomTokenizer(object):\n    '''\n    CustomTokenizer tokenizes the input text into words or phrases and converts each word (str) to an index (int) using the vocab.\n    If you would like subword tokens instead, please use `hub.BertTokenizer`.\n    '''\n\n    def __init__(self,\n                 vocab_file: str,\n                 do_lower_case: bool = True,\n                 pad_token: str = '[PAD]',\n                 tokenize_chinese_chars: bool = True,\n                 cut_function: Callable = None):\n        ''' Constructs a CustomTokenizer.\n        Args:\n            vocab_file (:obj:`string`): File containing the vocabulary.\n            do_lower_case (:obj:`bool`, `optional`, defaults to :obj:`True`): Whether to lower case the input if the input is in English\n            pad_token (:obj:`string`, `optional`, defaults to '[PAD]'): The token used for padding, for example when batching sequences of different lengths.\n            tokenize_chinese_chars (:obj:`bool`, `optional`, defaults to :obj:`True`): Whether to tokenize 
Chinese characters.\n            cut_function(:obj:`function`): A function that segments a Chinese text and returns the word segmentation result (list).\n        '''\n\n        if not os.path.isfile(vocab_file):\n            raise ValueError('Can\\'t find a vocabulary file at path \\'{}\\'.'.format(vocab_file))\n        self.vocab = load_vocab(vocab_file)\n        self.ids_to_tokens = collections.OrderedDict([(ids, tok) for tok, ids in self.vocab.items()])\n        self.pad_token = pad_token\n        self.pad_token_id = self.convert_tokens_to_ids(self.pad_token)\n\n        self.tokenize_chinese_chars = tokenize_chinese_chars\n        self.basic_tokenizer = BasicTokenizer(\n            do_lower_case=do_lower_case, tokenize_chinese_chars=tokenize_chinese_chars)\n\n        self.cut_function = cut_function\n        if not self.cut_function:\n            lac = hub.Module(name='lac')\n            self.cut_function = lac.cut\n        elif not inspect.isfunction(self.cut_function):\n            raise RuntimeError('The cut_function ({}) is not a valid function.'.format(self.cut_function))\n\n    @property\n    def vocab_size(self):\n        return len(self.vocab)\n\n    def get_vocab(self):\n        return dict(self.vocab)\n\n    def _convert_token_to_id(self, token: str):\n        ''' Converts a token (str) to an id using the vocab. '''\n        return self.vocab.get(token, None)\n\n    def _convert_id_to_token(self, index: int):\n        '''Converts an index (integer) to a token (str) using the vocab.'''\n        return self.ids_to_tokens.get(index, None)\n\n    def convert_tokens_to_string(self, tokens: List[str]):\n        ''' Converts a sequence of tokens (string) to a single string. 
'''\n        if self.tokenize_chinese_chars:\n            out_string = ''.join(tokens).strip()\n        else:\n            out_string = ' '.join(tokens).strip()\n        return out_string\n\n    def convert_ids_to_tokens(self, ids: Union[int, List[int]], skip_pad_token: bool):\n        ''' Converts a single index or a sequence of indices (integers) to a token\n            (resp. a sequence of tokens (str)), using the vocabulary and added tokens.\n            Args:\n                ids(:obj:`int` or :obj:`List[int]`): list of tokenized input ids.\n                skip_pad_token: Don't decode pad tokens (self.pad_token). Default: False\n        '''\n        if isinstance(ids, int):\n            return self._convert_id_to_token(ids)\n        tokens = []\n        for index in ids:\n            index = int(index)\n            if skip_pad_token and index == self.pad_token_id:\n                continue\n            tokens.append(self._convert_id_to_token(index))\n        return tokens\n\n    def convert_tokens_to_ids(self, tokens: List[str]):\n        ''' Converts a token string (or a sequence of tokens) to a single integer id\n            (or a sequence of ids), using the vocabulary.\n        '''\n        if tokens is None:\n            return None\n\n        if isinstance(tokens, str):\n            return self._convert_token_to_id(tokens)\n\n        ids = []\n        for token in tokens:\n            wid = self._convert_token_to_id(token)\n            if wid is not None:\n                ids.append(wid)\n        return ids\n\n    def tokenize(self, text: str):\n        '''\n        Converts a string to a sequence of tokens (string), using the tokenizer.\n        Chinese text will be split into words using the word segmenter (Baidu LAC) by default.\n        If cut_function is set, the text will be split into words using cut_function.\n        Args:\n            text (`string`): The sequence to be encoded.\n        Returns:\n            split_tokens 
(`list`): the resulting word tokens\n        '''\n        if self.tokenize_chinese_chars:\n            split_tokens = self.cut_function(text=text)\n        else:\n            split_tokens = self.basic_tokenizer.tokenize(text=text)\n        return split_tokens\n\n    def encode(self,\n               text: str,\n               text_pair: Union[str, List[str], List[int]] = None,\n               max_seq_len: int = None,\n               pad_to_max_seq_len: bool = True,\n               truncation_strategy: str = 'longest_first',\n               return_length: bool = True,\n               return_overflowing_tokens: bool = False):\n        '''\n        Returns a dictionary containing the encoded sequence or sequence pair and additional information:\n        the mask for sequence classification and the overflowing elements if a ``max_seq_len`` is specified.\n        Args:\n            text (:obj:`str`, :obj:`List[str]` or :obj:`List[int]`):\n                The first sequence to be encoded. This can be a string, a list of strings (tokenized string using\n                the `tokenize` method) or a list of integers (tokenized string ids using the `convert_tokens_to_ids`\n                method)\n            text_pair (:obj:`str`, :obj:`List[str]` or :obj:`List[int]`, `optional`, defaults to :obj:`None`):\n                Ignored; kept only for interface compatibility.\n            max_seq_len (:obj:`int`, `optional`, defaults to :obj:`None`):\n                If set to a number, will limit the total sequence returned so that it has a maximum length.\n                If there are overflowing tokens, those will be added to the returned dictionary\n            pad_to_max_seq_len (:obj:`bool`, `optional`, defaults to :obj:`True`):\n                If set to True, the returned sequences will be padded according to the model's padding side and\n                padding index, up to their max length. 
If no max length is specified, the padding is done up to the\n                model's max length.\n            truncation_strategy (:obj:`str`, `optional`, defaults to `longest_first`):\n                String selected in the following options:\n                - 'longest_first' (default) Iteratively reduce the inputs sequence until the input is under max_seq_len\n                  starting from the longest one at each token (when there is a pair of input sequences)\n                - 'only_first': Only truncate the first sequence\n                - 'only_second': Only truncate the second sequence\n                - 'do_not_truncate': Does not truncate (raise an error if the input sequence is longer than max_seq_len)\n            return_lengths (:obj:`bool`, `optional`, defaults to :obj:`True`):\n                If set the resulting dictionary will include the length of each encoded inputs\n            return_overflowing_tokens (:obj:`bool`, `optional`, defaults to :obj:`False`):\n                Set to True to return overflowing token information (default False).\n        Return:\n            A Dictionary of shape::\n                {\n                    text: list[int],\n                    seq_len: int if return_length is True (default)\n                    overflowing_tokens: list[int] if a ``max_seq_len`` is specified and return_overflowing_tokens is True\n                }\n            With the fields:\n            - ``text``: list of token ids to be fed to a model\n            - ``length``: the input_ids length\n            - ``overflowing_tokens``: list of overflowing tokens if a max length is specified.\n        '''\n\n        def get_input_ids(text: str):\n            if isinstance(text, str):\n                tokens = self.tokenize(text)\n                ids = self.convert_tokens_to_ids(tokens)\n                return ids\n            elif isinstance(text, (list, tuple)) and len(text) > 0 and isinstance(text[0], str):\n                return 
self.convert_tokens_to_ids(text)\n            elif isinstance(text, (list, tuple)) and len(text) > 0 and isinstance(text[0], int):\n                return text\n            else:\n                raise ValueError(\n                    'Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.')\n\n        ids = get_input_ids(text)\n        len_ids = len(ids)\n\n        encoded_inputs = {}\n        # When all words are not found in the vocab, it will return {}.\n        if not len_ids:\n            return encoded_inputs\n\n        # Truncation: Handle max sequence length\n        if max_seq_len and len_ids > max_seq_len:\n            ids, pair_ids, overflowing_tokens = self.truncate_sequences(\n                ids, num_tokens_to_remove=len_ids - max_seq_len, truncation_strategy=truncation_strategy)\n            if return_overflowing_tokens:\n                encoded_inputs['overflowing_tokens'] = overflowing_tokens\n                encoded_inputs['num_truncated_tokens'] = len_ids - max_seq_len\n\n        # Check length and pad. Guard on max_seq_len so padding is skipped\n        # (instead of raising a TypeError) when max_seq_len is None.\n        if max_seq_len and pad_to_max_seq_len and len(ids) < max_seq_len:\n            encoded_inputs['text'] = ids + [self.pad_token_id] * (max_seq_len - len(ids))\n        else:\n            encoded_inputs['text'] = ids\n\n        if return_length:\n            encoded_inputs['seq_len'] = len(ids)\n\n        return encoded_inputs\n\n    def truncate_sequences(self,\n                           ids: List[int],\n                           pair_ids: List[int] = None,\n                           num_tokens_to_remove: int = 0,\n                           truncation_strategy: str = 'longest_first',\n                           stride: int = 0):\n        ''' Truncates a sequence pair in place to the maximum length.\n        Args:\n            ids: list of tokenized input ids. 
Can be obtained from a string by chaining the\n                `tokenize` and `convert_tokens_to_ids` methods.\n            pair_ids: Optional second list of input ids. Can be obtained from a string by chaining the\n                `tokenize` and `convert_tokens_to_ids` methods.\n            num_tokens_to_remove (:obj:`int`, `optional`, defaults to ``0``):\n                number of tokens to remove using the truncation strategy\n            truncation_strategy: string selected in the following options:\n                - 'longest_first' (default) Iteratively reduce the inputs sequence until the input is under max_seq_len\n                    starting from the longest one at each token (when there is a pair of input sequences).\n                    Overflowing tokens only contains overflow from the first sequence.\n                - 'only_first': Only truncate the first sequence. Raise an error if the first sequence is shorter than or equal to num_tokens_to_remove.\n                - 'only_second': Only truncate the second sequence\n                - 'do_not_truncate': Does not truncate (raise an error if the input sequence is longer than max_seq_len)\n            stride (:obj:`int`, `optional`, defaults to ``0``):\n                If set to a number along with max_seq_len, the overflowing tokens returned will contain some tokens\n                from the main sequence returned. 
The value of this argument defines the number of additional tokens.\n        '''\n        if num_tokens_to_remove <= 0:\n            return ids, pair_ids, []\n\n        if truncation_strategy == 'longest_first':\n            overflowing_tokens = []\n            for _ in range(num_tokens_to_remove):\n                if pair_ids is None or len(ids) > len(pair_ids):\n                    overflowing_tokens = [ids[-1]] + overflowing_tokens\n                    ids = ids[:-1]\n                else:\n                    pair_ids = pair_ids[:-1]\n            window_len = min(len(ids), stride)\n            if window_len > 0:\n                overflowing_tokens = ids[-window_len:] + overflowing_tokens\n        elif truncation_strategy == 'only_first':\n            assert len(ids) > num_tokens_to_remove\n            window_len = min(len(ids), stride + num_tokens_to_remove)\n            overflowing_tokens = ids[-window_len:]\n            ids = ids[:-num_tokens_to_remove]\n        elif truncation_strategy == 'only_second':\n            assert pair_ids is not None and len(pair_ids) > num_tokens_to_remove\n            window_len = min(len(pair_ids), stride + num_tokens_to_remove)\n            overflowing_tokens = pair_ids[-window_len:]\n            pair_ids = pair_ids[:-num_tokens_to_remove]\n        elif truncation_strategy == 'do_not_truncate':\n            raise ValueError('Input sequence are too long for max_seq_len. 
Please select a truncation strategy.')\n        else:\n            raise ValueError(\n                'Truncation_strategy should be selected in [\\'longest_first\\', \\'only_first\\', \\'only_second\\', \\'do_not_truncate\\']'\n            )\n        return (ids, pair_ids, overflowing_tokens)\n\n    def decode(self,\n               token_ids: List[int],\n               only_convert_to_tokens: bool = True,\n               skip_pad_token: bool = False,\n               clean_up_tokenization_spaces: bool = True):\n        '''\n        Converts a sequence of ids (integer) to a string if only_convert_to_tokens is False, or to a list of tokens (str)\n        when only_convert_to_tokens is True.\n        Args:\n            token_ids: list of tokenized input ids or a dict with a key called 'text', can be obtained by using the `encode` method.\n            only_convert_to_tokens:  if set to True, will only return a list of tokens (str). `paddlehub.dataset.base_nlp_dataset` will use this optional argument.\n            skip_pad_token: if set to True, will skip pad tokens.\n            clean_up_tokenization_spaces: if set to True, will clean up the tokenization spaces.\n        '''\n        if isinstance(token_ids, dict):\n            token_ids = token_ids['text']\n\n        tokens = self.convert_ids_to_tokens(token_ids, skip_pad_token=skip_pad_token)\n\n        if only_convert_to_tokens:\n            return tokens\n\n        # convert_tokens_to_string already joins with '' or ' ' depending on\n        # tokenize_chinese_chars; re-joining its str result would insert\n        # spurious separators between individual characters.\n        text = self.convert_tokens_to_string(tokens)\n\n        if not self.tokenize_chinese_chars and clean_up_tokenization_spaces:\n            clean_text = self.clean_up_tokenization(text)\n            return clean_text\n        else:\n            return text\n\n    def clean_up_tokenization(self, out_string: str) -> 
str:\n        '''\n        Clean up a list of simple English tokenization artifacts like spaces before punctuation and abbreviated forms.\n        '''\n        out_string = (out_string.replace(' .', '.').replace(' ?', '?').replace(' !', '!').replace(' ,', ',').replace(\n            ' \\' ', '\\'').replace(' n\\'t', 'n\\'t').replace(' \\'m', '\\'m').replace(' \\'s', '\\'s').replace(\n                ' \\'ve', '\\'ve').replace(' \\'re', '\\'re'))\n        return out_string\n"
  },
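The `'longest_first'` branch of `truncate_sequences` above can be illustrated standalone. `truncate_longest_first` is a hypothetical helper mirroring that loop (the `stride` overflow window is omitted for brevity):

```python
def truncate_longest_first(ids, pair_ids=None, num_tokens_to_remove=0):
    # Repeatedly drop the last token of whichever sequence is currently
    # longer; overflow is collected from the first sequence only, matching
    # the docstring of truncate_sequences.
    overflowing_tokens = []
    for _ in range(num_tokens_to_remove):
        if pair_ids is None or len(ids) > len(pair_ids):
            overflowing_tokens = [ids[-1]] + overflowing_tokens
            ids = ids[:-1]
        else:
            pair_ids = pair_ids[:-1]
    return ids, pair_ids, overflowing_tokens
```

With a single sequence the overflow preserves original order; with a pair, only the longer side shrinks at each step, so a heavily unbalanced pair is trimmed toward equal lengths.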
  {
    "path": "paddlehub/text/utils.py",
"content": "# coding=utf-8\n# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport unicodedata\nfrom collections import OrderedDict\n\n\ndef load_vocab(vocab_file: str):\n    '''Loads a vocabulary file into a dictionary.'''\n    vocab = {}\n    with open(vocab_file, 'r', encoding='utf-8') as reader:\n        tokens = reader.readlines()\n    for index, token in enumerate(tokens):\n        token = token.rstrip('\\n').split('\\t')[0]\n        vocab[token] = index\n    return vocab\n\n\ndef whitespace_tokenize(text: str):\n    '''Runs basic whitespace cleaning and splitting on a piece of text.'''\n    text = text.strip()\n    if not text:\n        return []\n    tokens = text.split()\n    return tokens\n\n\ndef is_whitespace(char: str):\n    '''Checks whether `char` is a whitespace character.'''\n    # \\t, \\n, and \\r are technically control characters but we treat them\n    # as whitespace since they are generally considered as such.\n    if char == ' ' or char == '\\t' or char == '\\n' or char == '\\r':\n        return True\n    cat = unicodedata.category(char)\n    if cat == 'Zs':\n        return True\n    return False\n\n\ndef is_control(char: str):\n    '''Checks whether `char` is a control character.'''\n    # These are technically control characters but we count them as whitespace\n    # characters.\n    if char == '\\t' or char == '\\n' or char == '\\r':\n        
return False\n    cat = unicodedata.category(char)\n    if cat.startswith('C'):\n        return True\n    return False\n\n\ndef is_punctuation(char: str):\n    '''Checks whether `chars` is a punctuation character.'''\n    cp = ord(char)\n    # We treat all non-letter/number ASCII as punctuation.\n    # Characters such as '^', '$', and '`' are not in the Unicode\n    # Punctuation class but we treat them as punctuation anyways, for\n    # consistency.\n    if (cp >= 33 and cp <= 47) or (cp >= 58 and cp <= 64) or (cp >= 91 and cp <= 96) or (cp >= 123 and cp <= 126):\n        return True\n    cat = unicodedata.category(char)\n    if cat.startswith('P'):\n        return True\n    return False\n\n\ndef is_chinese_char(char: str):\n    '''Checks whether CP is the codepoint of a CJK character.'''\n    # This defines a 'chinese character' as anything in the CJK Unicode block:\n    #   https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_(Unicode_block)\n    #\n    # Note that the CJK Unicode block is NOT all Japanese and Korean characters,\n    # despite its name. The modern Korean Hangul alphabet is a different block,\n    # as is Japanese Hiragana and Katakana. Those alphabets are used to write\n    # space-separated words, so they are not treated specially and handled\n    # like the all of the other languages.\n    cp = ord(char)\n    if ((cp >= 0x4E00 and cp <= 0x9FFF) or (cp >= 0x3400 and cp <= 0x4DBF)  #\n            or (cp >= 0x20000 and cp <= 0x2A6DF)  #\n            or (cp >= 0x2A700 and cp <= 0x2B73F)  #\n            or (cp >= 0x2B740 and cp <= 0x2B81F)  #\n            or (cp >= 0x2B820 and cp <= 0x2CEAF)  #\n            or (cp >= 0xF900 and cp <= 0xFAFF) or (cp >= 0x2F800 and cp <= 0x2FA1F)  #\n        ):  #\n        return True\n\n    return False\n"
  },
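`load_vocab` above maps each line's first tab-separated column to its line index. A minimal sketch of the same parsing logic, working on in-memory lines instead of a file path (`load_vocab_lines` is a hypothetical name for illustration):

```python
def load_vocab_lines(lines):
    # Mirrors paddlehub.text.utils.load_vocab: one token per line, keep
    # only the first tab-separated column, index = line number.
    vocab = {}
    for index, token in enumerate(lines):
        token = token.rstrip('\n').split('\t')[0]
        vocab[token] = index
    return vocab
```

This is why a vocab file may carry extra tab-separated columns (such as frequency counts) without affecting the resulting token-to-id mapping.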
  {
    "path": "paddlehub/utils/__init__.py",
"content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom .log import Table, ProgressBar\n"
  },
  {
    "path": "paddlehub/utils/download.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nimport filelock\n\nimport paddlehub.env as hubenv\nfrom paddle.utils.download import get_path_from_url\nfrom paddlehub.utils import log, utils, xarfile\n\n\ndef download_data(url):\n    def _wrapper(Dataset):\n        def _check_download():\n            save_name = os.path.basename(url).split('.')[0]\n            output_path = os.path.join(hubenv.DATA_HOME, save_name)\n            lock = filelock.FileLock(os.path.join(hubenv.TMP_HOME, save_name))\n            with lock:\n                if not os.path.exists(output_path):\n                    default_downloader.download_file_and_uncompress(url, hubenv.DATA_HOME, True)\n\n        class WrapperDataset(Dataset):\n            def __new__(cls, *args, **kwargs):\n                _check_download()\n                return super(WrapperDataset, cls).__new__(cls)\n\n        return WrapperDataset\n\n    return _wrapper\n\n\nclass Downloader:\n    def download_file_and_uncompress(self, url: str, save_path: str, print_progress: bool):\n        with utils.generate_tempdir() as _dir:\n            if print_progress:\n                with log.ProgressBar('Download {}'.format(url)) as bar:\n                    for path, ds, ts in utils.download_with_progress(url=url, path=_dir):\n                        bar.update(float(ds) / ts)\n            else:\n                path = 
utils.download(url=url, path=_dir)\n\n            if print_progress:\n                with log.ProgressBar('Decompress {}'.format(path)) as bar:\n                    for path, ds, ts in xarfile.unarchive_with_progress(name=path, path=save_path):\n                        bar.update(float(ds) / ts)\n            else:\n                path = xarfile.unarchive(name=path, path=save_path)\n\n\ndefault_downloader = Downloader()\n"
  },
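`download_data` above wraps a Dataset class so a download check runs before every instantiation, via `__new__` on a subclass. The pattern, stripped of the file lock and PaddleHub specifics (`ensure_ready` and `check_fn` are hypothetical names for this sketch):

```python
def ensure_ready(check_fn):
    # Class decorator: run check_fn (assumed idempotent, like the
    # lock-guarded _check_download above) before each instantiation.
    def _wrapper(cls):
        class Wrapped(cls):
            def __new__(subcls, *args, **kwargs):
                check_fn()
                return super().__new__(subcls)
        return Wrapped
    return _wrapper


calls = []

@ensure_ready(lambda: calls.append('checked'))
class FakeDataset:
    pass

FakeDataset()
FakeDataset()
# 'calls' now records one check per instantiation
```

Hooking `__new__` rather than `__init__` means the check fires even if a subclass overrides `__init__` without calling `super().__init__()`, which is presumably why the original chose this hook.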
  {
    "path": "paddlehub/utils/io.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nimport sys\nfrom typing import IO\n\nfrom paddlehub.utils.utils import generate_tempfile\n\n\n@contextlib.contextmanager\ndef redirect_istream(stream: IO):\n    '''Redirect the standard input stream to the specified stream'''\n    _t = sys.stdin\n    sys.stdin = stream\n    yield\n    sys.stdin = _t\n\n\n@contextlib.contextmanager\ndef redirect_ostream(stream: IO):\n    '''Redirect the standard output stream to the specified stream'''\n    _t = sys.stdout\n    sys.stdout = stream\n    yield\n    sys.stdout = _t\n\n\n@contextlib.contextmanager\ndef redirect_estream(stream: IO):\n    '''Redirect the standard error stream to the specified stream'''\n    _t = sys.stderr\n    sys.stderr = stream\n    yield\n    sys.stderr = _t\n\n\n@contextlib.contextmanager\ndef discard_oe():\n    '''\n    Redirect output and error stream to temporary file. 
In a sense,\n    it is equivalent to discarding the output and error messages.\n    '''\n    with generate_tempfile(mode='w') as _stream:\n        with redirect_ostream(_stream), redirect_estream(_stream):\n            yield\n\n\n@contextlib.contextmanager\ndef typein(chars: str = 'y'):\n    # type chars into the input stream\n    with generate_tempfile(mode='w+') as _stream:\n        with redirect_istream(_stream):\n            _stream.write('{}\\n'.format(chars))\n            _stream.seek(0)\n            yield\n"
  },
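The stream-redirection context managers above all follow one swap-and-restore pattern. A minimal usage sketch with an in-memory buffer; note the originals restore the stream without try/finally, so this sketch adds it (an exception inside the block would otherwise leak the redirection):

```python
import contextlib
import io
import sys


@contextlib.contextmanager
def redirect_ostream(stream):
    # Same swap-and-restore pattern as paddlehub.utils.io.redirect_ostream,
    # with try/finally added so the original stdout is always restored.
    _t = sys.stdout
    sys.stdout = stream
    try:
        yield
    finally:
        sys.stdout = _t


buf = io.StringIO()
with redirect_ostream(buf):
    print('captured')
```

The stdlib's `contextlib.redirect_stdout` provides the same behavior for the stdout case; the hand-rolled versions here also cover stdin and stderr.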
  {
    "path": "paddlehub/utils/log.py",
"content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nimport copy\nimport functools\nimport logging\nimport os\nimport sys\nimport time\nimport threading\nfrom typing import List\n\nimport colorlog\nfrom colorama import Fore\n\nimport paddlehub.config as hubconf\nfrom paddlehub.env import LOG_HOME\n\nloggers = {}\n\nlog_config = {\n    'DEBUG': {\n        'level': 10,\n        'color': 'purple'\n    },\n    'INFO': {\n        'level': 20,\n        'color': 'green'\n    },\n    'TRAIN': {\n        'level': 21,\n        'color': 'cyan'\n    },\n    'EVAL': {\n        'level': 22,\n        'color': 'blue'\n    },\n    'WARNING': {\n        'level': 30,\n        'color': 'yellow'\n    },\n    'ERROR': {\n        'level': 40,\n        'color': 'red'\n    },\n    'CRITICAL': {\n        'level': 50,\n        'color': 'bold_red'\n    }\n}\n\n\nclass Logger(object):\n    '''\n    Default logger in PaddleHub\n\n    Args:\n        name(str) : Logger name, default is 'PaddleHub'\n    '''\n\n    def __init__(self, name: str = None):\n        name = 'PaddleHub' if not name else name\n        self.logger = logging.getLogger(name)\n\n        for key, conf in log_config.items():\n            logging.addLevelName(conf['level'], key)\n            self.__dict__[key] = functools.partial(self.__call__, conf['level'])\n            self.__dict__[key.lower()] = 
functools.partial(self.__call__, conf['level'])\n\n        self.format = colorlog.ColoredFormatter(\n            '%(log_color)s[%(asctime)-15s] [%(levelname)8s]%(reset)s - %(message)s',\n            log_colors={key: conf['color']\n                        for key, conf in log_config.items()})\n\n        self.handler = logging.StreamHandler()\n        self.handler.setFormatter(self.format)\n\n        self.logger.addHandler(self.handler)\n        self.logLevel = hubconf.log_level\n        self.logger.setLevel(logging.DEBUG)\n        self.logger.propagate = False\n        self._is_enable = hubconf.log_enable\n\n    def disable(self):\n        self._is_enable = False\n\n    def enable(self):\n        self._is_enable = True\n\n    @property\n    def is_enable(self) -> bool:\n        return self._is_enable\n\n    def __call__(self, log_level: str, msg: str):\n        if not self.is_enable:\n            return\n\n        self.logger.log(log_level, msg)\n\n    @contextlib.contextmanager\n    def use_terminator(self, terminator: str):\n        old_terminator = self.handler.terminator\n        self.handler.terminator = terminator\n        yield\n        self.handler.terminator = old_terminator\n\n    @contextlib.contextmanager\n    def processing(self, msg: str, interval: float = 0.1):\n        '''\n        Continuously print a progress bar with rotating special effects.\n\n        Args:\n            msg(str): Message to be printed.\n            interval(float): Rotation interval. 
Default to 0.1.\n        '''\n        end = False\n\n        def _printer():\n            index = 0\n            flags = ['\\\\', '|', '/', '-']\n            while not end:\n                flag = flags[index % len(flags)]\n                with self.use_terminator('\\r'):\n                    self.info('{}: {}'.format(msg, flag))\n                time.sleep(interval)\n                index += 1\n\n        t = threading.Thread(target=_printer)\n        t.daemon = True\n        t.start()\n        yield\n        end = True\n\n\nclass ProgressBar(object):\n    '''\n    Progress bar printer\n\n    Args:\n        title(str) : Title text\n        flush_interval(float): Flush rate of progress bar, default is 0.1.\n\n    Examples:\n        .. code-block:: python\n\n            with ProgressBar('Download module') as bar:\n                for i in range(100):\n                    bar.update(i / 100)\n\n            # with continuous bar.update, the progress bar in the terminal\n            # will continue to update until 100%\n            #\n            # Download module\n            # [##################################################] 100.00%\n    '''\n\n    def __init__(self, title: str, flush_interval: float = 0.1):\n        self.last_flush_time = time.time()\n        self.flush_interval = flush_interval\n        self._end = False\n        self.title = title\n\n    def __enter__(self):\n        sys.stdout.write('{}\\n'.format(self.title))\n        return self\n\n    def __exit__(self, exit_exception, exit_value, exit_traceback):\n        if not exit_value:\n            self._end = True\n            self.update(1)\n        else:\n            sys.stdout.write('\\n')\n\n    def update(self, progress: float):\n        '''\n        Update progress bar\n\n        Args:\n            progress: Processing progress, from 0.0 to 1.0\n        '''\n        msg = '[{:<50}] {:.2f}%'.format('#' * int(progress * 50), progress * 100)\n        need_flush = (time.time() - 
self.last_flush_time) >= self.flush_interval\n\n        if need_flush or self._end:\n            sys.stdout.write('\\r{}'.format(msg))\n            self.last_flush_time = time.time()\n            sys.stdout.flush()\n\n        if self._end:\n            sys.stdout.write('\\n')\n\n\nclass FormattedText(object):\n    '''\n    Cross-platform formatted string\n\n    Args:\n        text(str) : Text content\n        width(int) : Text length, if the text is less than the specified length, it will be filled with spaces\n        align(str) : Text alignment, it must be:\n            ========   ====================================\n            Charater   Meaning\n            --------   ------------------------------------\n            '<'        The text will remain left aligned\n            '^'        The text will remain middle aligned\n            '>'        The text will remain right aligned\n            ========   ====================================\n        color(str) : Text color, default is None(depends on terminal configuration)\n    '''\n    _MAP = {'red': Fore.RED, 'yellow': Fore.YELLOW, 'green': Fore.GREEN, 'blue': Fore.BLUE, 'cyan': Fore.CYAN}\n\n    def __init__(self, text: str, width: int = None, align: str = '<', color: str = None):\n        self.text = text\n        self.align = align\n        self.color = FormattedText._MAP[color] if color else color\n        self.width = width if width else len(self.text)\n\n    def __repr__(self) -> str:\n        form = '{{:{}{}}}'.format(self.align, self.width)\n        text = form.format(self.text)\n        if not self.color:\n            return text\n        return self.color + text + Fore.RESET\n\n\nclass TableCell(object):\n    '''The basic components of a table'''\n\n    def __init__(self, content: str = '', width: int = 0, align: str = '<', color: str = ''):\n        self._width = width if width else len(content)\n        self._width = 1 if self._width < 1 else self._width\n        self._contents = []\n        for i 
in range(0, len(content), self._width):\n            text = FormattedText(content[i:i + self._width], width, align, color)\n            self._contents.append(text)\n        self.align = align\n        self.color = color\n\n    @property\n    def width(self) -> int:\n        return self._width\n\n    @width.setter\n    def width(self, value: int):\n        self._width = value\n        for content in self._contents:\n            content.width = value\n\n    @property\n    def height(self) -> int:\n        return len(self._contents)\n\n    @height.setter\n    def height(self, value: int):\n        if value < self.height:\n            raise RuntimeError(self.height, value)\n        self._contents += [FormattedText('', width=self.width, align=self.align, color=self.color)\n                           ] * (value - self.height)\n\n    def __len__(self) -> int:\n        return len(self._contents)\n\n    def __getitem__(self, idx: int) -> str:\n        return self._contents[idx]\n\n    def __repr__(self) -> str:\n        return '\\n'.join([str(item) for item in self._contents])\n\n\nclass TableRow(object):\n    '''Table row composed of TableCell'''\n\n    def __init__(self):\n        self.cells = []\n\n    def append(self, cell: TableCell):\n        self.cells.append(cell)\n\n    @property\n    def width(self) -> int:\n        _width = 0\n        for cell in self.cells:\n            _width += cell.width\n        return _width\n\n    @property\n    def height(self) -> int:\n        _height = -1\n        for cell in self.cells:\n            _height = max(_height, cell.height)\n        return _height\n\n    def __len__(self) -> int:\n        return len(self.cells)\n\n    def __repr__(self) -> str:\n        content = ''\n        for i in range(self.height):\n            content += '|'\n            for cell in self.cells:\n                if i >= cell.height:\n                    content = content + '|'\n                else:\n                    content = content + str(cell[i]) 
+ '|'\n            content += '\\n'\n        return content\n\n    def __getitem__(self, idx: int) -> TableCell:\n        return self.cells[idx]\n\n\nclass TableColumn(object):\n    '''Table column composed of TableCell'''\n\n    def __init__(self):\n        self.cells = []\n\n    def append(self, cell: TableCell):\n        self.cells.append(cell)\n\n    @property\n    def width(self) -> int:\n        _width = -1\n        for cell in self.cells:\n            _width = max(_width, cell.width)\n        return _width\n\n    @property\n    def height(self) -> int:\n        _height = 0\n        for cell in self.cells:\n            _height += cell.height\n        return _height\n\n    def __len__(self) -> int:\n        return len(self.cells)\n\n    def __getitem__(self, idx: int) -> TableCell:\n        return self.cells[idx]\n\n\nclass Table(object):\n    '''\n    Table with adaptive width and height\n\n    Args:\n        colors(list[str]) : Text colors\n        aligns(list[str]) : Text alignments\n        widths(list[int]) : Text widths\n\n    Examples:\n        .. 
code-block:: python\n\n            table = Table(widths=[12, 20])\n            table.append('name', 'PaddleHub')\n            table.append('version', '2.0.0')\n            table.append(\n                'description',\n                'PaddleHub is a pretrained model application tool under the PaddlePaddle')\n            table.append('author')\n\n            print(table)\n\n            # the result is\n            # +------------+--------------------+\n            # |name        |PaddleHub           |\n            # +------------+--------------------+\n            # |version     |2.0.0               |\n            # +------------+--------------------+\n            # |description |PaddleHub is a pretr|\n            # |            |ained model applicat|\n            # |            |ion tool under the P|\n            # |            |addlePaddle         |\n            # +------------+--------------------+\n            # |author      |                    |\n            # +------------+--------------------+\n    '''\n\n    def __init__(self, colors: List[str] = [], aligns: List[str] = [], widths: List[int] = []):\n        self.rows = []\n        self.columns = []\n        self.colors = colors\n        self.aligns = aligns\n        self.widths = widths\n\n    def append(self, *contents, colors: List[str] = [], aligns: List[str] = [], widths: List[int] = []):\n        '''\n        Add a row to the table\n\n        Args:\n            *contents(*list): Contents of the row, each content will be placed in a separate cell\n            colors(list[str]) : Text colors\n            aligns(list[str]) : Text alignments\n            widths(list[int]) : Text widths\n        '''\n        newrow = TableRow()\n\n        widths = copy.deepcopy(self.widths) if not widths else widths\n        colors = copy.deepcopy(self.colors) if not colors else colors\n        aligns = copy.deepcopy(self.aligns) if not aligns else aligns\n\n        for idx, content in enumerate(contents):\n            
width = widths[idx] if idx < len(widths) else len(content)\n            color = colors[idx] if idx < len(colors) else ''\n            align = aligns[idx] if idx < len(aligns) else ''\n\n            newcell = TableCell(content, width=width, color=color, align=align)\n            newrow.append(newcell)\n            if idx >= len(self.columns):\n                newcolumn = TableColumn()\n\n                for row in self.rows:\n                    cell = TableCell(width=width, color=color, align=align)\n                    row.append(cell)\n                    newcolumn.append(cell)\n                newcolumn.append(newcell)\n                self.columns.append(newcolumn)\n            else:\n                self.columns[idx].append(newcell)\n\n        for idx in range(len(newrow), len(self.columns)):\n            width = widths[idx] if idx < len(widths) else self.columns[idx].width\n            color = colors[idx] if idx < len(colors) else ''\n            align = aligns[idx] if idx < len(aligns) else ''\n            cell = TableCell(width=width, color=color, align=align)\n            newrow.append(cell)\n\n        self.rows.append(newrow)\n        self._adjust()\n\n    def _adjust(self):\n        '''Adjust the width and height of the cells in each row and column.'''\n        for column in self.columns:\n            _width = -1\n            for cell in column:\n                _width = max(_width, cell.width)\n            for cell in column:\n                cell.width = _width\n\n        for row in self.rows:\n            _height = -1\n            for cell in row:\n                _height = max(_height, cell.height)\n            for cell in row:\n                cell.height = _height\n\n    @property\n    def width(self) -> int:\n        _width = -1\n        for row in self.rows:\n            _width = max(_width, row.width)\n        return _width\n\n    @property\n    def height(self) -> int:\n        _height = -1\n        for column in self.columns:\n            
_height = max(_height, column.height)\n        return _height\n\n    def __repr__(self) -> str:\n        seprow = '+{}+\\n'.format('+'.join(['-' * column.width for column in self.columns]))\n        content = ''\n        for row in self.rows:\n            content = content + str(row)\n            content += seprow\n        return seprow + content\n\n\ndef get_file_logger(filename):\n    '''\n    Set logger.handler to FileHandler.\n\n    Args:\n        filename(str): file name for logging\n\n    Examples:\n    .. code-block:: python\n\n        logger = get_file_logger('test.log')\n        logger.logger.info('test_1')\n    '''\n    log_name = os.path.join(LOG_HOME, filename)\n    if log_name in loggers:\n        return loggers[log_name]\n\n    logger = Logger()\n    logger.logger.handlers = []\n    formatter = logging.Formatter('[%(asctime)-15s] [%(levelname)8s] - %(message)s')\n    sh = logging.FileHandler(filename=log_name, mode='a')\n    sh.setFormatter(formatter)\n    logger.logger.addHandler(sh)\n    logger.logger.setLevel(logging.INFO)\n\n    loggers.update({log_name: logger})\n\n    return logger\n\n\nlogger = Logger()\n"
  },
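The `FormattedText`/`TableCell` machinery in `log.py` above is built on Python's format-spec alignment characters (`'<'`, `'^'`, `'>'`). A minimal standalone sketch of the same wrap-then-align idea (the `format_cell` name is hypothetical, not part of paddlehub):

```python
def format_cell(text: str, width: int, align: str = '<') -> list:
    """Wrap text into fixed-width chunks, then pad/align each chunk.

    `align` uses the same characters as str.format: '<' left, '^' center, '>' right.
    """
    # Slice the text into width-sized pieces; an empty text still yields one cell.
    chunks = [text[i:i + width] for i in range(0, len(text), width)] or ['']
    # Build a format spec such as '{:<12}' dynamically, as FormattedText does.
    form = '{{:{}{}}}'.format(align, width)
    return [form.format(chunk) for chunk in chunks]


print(format_cell('PaddleHub', 4))  # ['Padd', 'leHu', 'b   ']
```

Each returned string has exactly `width` characters, which is what lets the table renderer stack cells into aligned columns.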
  {
    "path": "paddlehub/utils/paddlex.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport shutil\n\nfrom paddlehub.server.server import module_server, CacheUpdater\nfrom paddlehub.utils import log, utils, xarfile\n\n\nclass ResourceNotFoundError(Exception):\n    def __init__(self, name: str, version: str = None):\n        self.name = name\n        self.version = version\n\n    def __str__(self):\n        if not self.version:\n            tips = 'No resource named {} was found'.format(self.name)\n        else:\n            tips = 'No resource named {}-{} was found'.format(self.name, self.version)\n        return tips\n\n\ndef download(name: str, save_path: str, version: str = None):\n    '''The download interface provided to PaddleX for downloading the specified model and resource files.'''\n\n    CacheUpdater(\"x_download\", name, version).start()\n\n    file = os.path.join(save_path, name)\n    file = os.path.realpath(file)\n    if os.path.exists(file):\n        return\n\n    resources = module_server.search_resource(name=name, version=version, type='Model')\n    if not resources:\n        raise ResourceNotFoundError(name, version)\n\n    for item in resources:\n        if item['name'] == name and utils.Version(item['version']).match(version):\n            url = item['url']\n            break\n    else:\n        raise ResourceNotFoundError(name, version)\n\n    with 
utils.generate_tempdir() as _dir:\n        if not os.path.exists(save_path):\n            os.makedirs(save_path)\n\n        with log.ProgressBar('Download {}'.format(url)) as _bar:\n            for savefile, dsize, tsize in utils.download_with_progress(url, _dir):\n                _bar.update(float(dsize / tsize))\n\n        if xarfile.is_xarfile(savefile):\n            with log.ProgressBar('Decompress {}'.format(savefile)) as _bar:\n                for savefile, usize, tsize in xarfile.unarchive_with_progress(savefile, _dir):\n                    _bar.update(float(usize / tsize))\n\n                savefile = os.path.join(_dir, savefile.split(os.sep)[0])\n\n        shutil.move(savefile, file)\n"
  },
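The resource lookup in `paddlex.py` relies on Python's `for ... else` so the "not found" error is raised only when no item matched. A small self-contained illustration of that control flow (the `pick_resource` helper and sample data are hypothetical):

```python
class ResourceNotFound(Exception):
    pass


def pick_resource(resources, name):
    # The `else` clause of a for-loop runs only when the loop finishes
    # without hitting `break` -- exactly the "nothing matched" case.
    for item in resources:
        if item['name'] == name:
            url = item['url']
            break
    else:
        raise ResourceNotFound(name)
    return url


resources = [{'name': 'm1', 'url': 'http://a'}, {'name': 'm2', 'url': 'http://b'}]
print(pick_resource(resources, 'm2'))  # http://b
```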
  {
    "path": "paddlehub/utils/parser.py",
    "content": "# coding:utf-8\n# Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport codecs\nfrom typing import List\n\nimport yaml\n\nfrom paddlehub.utils.utils import sys_stdin_encoding\n\n\nclass CSVFileParser(object):\n    def parse(self, csv_file: str) -> dict:\n        with codecs.open(csv_file, 'r', sys_stdin_encoding()) as file:\n            content = file.read()\n        content = content.split('\\n')\n        self.title = content[0].split(',')\n        self.content = {}\n        for key in self.title:\n            self.content[key] = []\n\n        for text in content[1:]:\n            if (text == ''):\n                continue\n\n            for index, item in enumerate(text.split(',')):\n                title = self.title[index]\n                self.content[title].append(item)\n\n        return self.content\n\n\nclass YAMLFileParser(object):\n    def parse(self, yaml_file: str) -> dict:\n        with codecs.open(yaml_file, 'r', sys_stdin_encoding()) as file:\n            content = file.read()\n        return yaml.load(content, Loader=yaml.BaseLoader)\n\n\nclass TextFileParser(object):\n    def parse(self, txt_file: str, use_strip: bool = True) -> List:\n        contents = []\n        try:\n            with codecs.open(txt_file, 'r', encoding='utf8') as file:\n                for line in file:\n                    if use_strip:\n                        line = line.strip()\n           
         if line:\n                        contents.append(line)\n        except:\n            with codecs.open(txt_file, 'r', encoding='gbk') as file:\n                for line in file:\n                    if use_strip:\n                        line = line.strip()\n                    if line:\n                        contents.append(line)\n        return contents\n\n\ncsv_parser = CSVFileParser()\nyaml_parser = YAMLFileParser()\ntxt_parser = TextFileParser()\n"
  },
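`CSVFileParser` above builds a column-oriented dict (title -> list of values) from header-then-rows text. A standalone sketch of that layout, independent of file encodings (`parse_csv_columns` is a hypothetical name; like the original, it does not handle quoted fields, so prefer the `csv` module for real-world data):

```python
def parse_csv_columns(text: str) -> dict:
    """Split comma-separated text into {column_title: [values]}."""
    lines = [line for line in text.split('\n') if line != '']
    titles = lines[0].split(',')
    content = {title: [] for title in titles}
    for row in lines[1:]:
        # zip stops at the shorter sequence, so ragged rows are truncated.
        for title, item in zip(titles, row.split(',')):
            content[title].append(item)
    return content


data = 'name,version\nPaddleHub,2.0.0\nPaddlePaddle,1.8.0\n'
print(parse_csv_columns(data))
```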
  {
    "path": "paddlehub/utils/platform.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport sys\nimport platform\n\n\ndef get_platform() -> str:\n    return platform.platform()\n\n\ndef is_windows() -> bool:\n    return get_platform().lower().startswith(\"windows\")\n\n\ndef get_platform_info() -> dict:\n    return {\n        'python_version': '.'.join(map(str, sys.version_info[0:3])),\n        'platform_version': platform.version(),\n        'platform_system': platform.system(),\n        'platform_architecture': platform.architecture(),\n        'platform_type': platform.platform()\n    }\n"
  },
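The checks in `platform.py` reduce to a few stdlib calls; a trimmed runnable sketch (`platform_info` is a hypothetical stand-in for `get_platform_info`, with a subset of its fields):

```python
import platform
import sys


def platform_info() -> dict:
    # Values naturally vary by machine; the structure mirrors get_platform_info.
    return {
        'python_version': '.'.join(map(str, sys.version_info[:3])),
        'platform_system': platform.system(),   # e.g. 'Linux', 'Windows', 'Darwin'
        'platform_type': platform.platform(),
    }


def is_windows() -> bool:
    # platform.platform() starts with the system name, e.g. 'Windows-10-...'.
    return platform.platform().lower().startswith('windows')
```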
  {
    "path": "paddlehub/utils/pypi.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport subprocess\nimport sys\nfrom typing import IO\n\nfrom paddlehub.utils.utils import Version\nfrom paddlehub.utils.io import discard_oe\n\n\ndef get_installed_packages() -> dict:\n    '''Get all packages installed in current python environment'''\n    from pip._internal.utils.misc import get_installed_distributions\n    return {item.key: Version(item.version) for item in get_installed_distributions()}\n\n\ndef check(package: str, version: str = '') -> bool:\n    '''\n    Check whether the locally installed python package meets the conditions. 
If the package is not installed\n    locally or the version number does not meet the conditions, return False, otherwise return True.\n    '''\n    pdict = get_installed_packages()\n    if not package in pdict:\n        return False\n    if not version:\n        return True\n    return pdict[package].match(version)\n\n\ndef install(package: str, version: str = '', upgrade: bool = False, ostream: IO = sys.stdout,\n            estream: IO = sys.stderr) -> bool:\n    '''Install the python package.'''\n    package = package.replace(' ', '')\n    if version:\n        package = '{}=={}'.format(package, version)\n\n    cmd = '{} -m pip install \"{}\"'.format(sys.executable, package)\n\n    if upgrade:\n        cmd += ' --upgrade'\n\n    result, content = subprocess.getstatusoutput(cmd)\n    if result:\n        estream.write(content)\n    else:\n        ostream.write(content)\n    return result == 0\n\n\ndef install_from_file(file: str, ostream: IO = sys.stdout, estream: IO = sys.stderr) -> bool:\n    '''Install the python package.'''\n    cmd = '{} -m pip install -r {}'.format(sys.executable, file)\n\n    result, content = subprocess.getstatusoutput(cmd)\n    if result:\n        estream.write(content)\n    else:\n        ostream.write(content)\n    return result == 0\n\n\ndef uninstall(package: str, ostream: IO = sys.stdout, estream: IO = sys.stderr) -> bool:\n    '''Uninstall the python package.'''\n    # type in 'y' to confirm the uninstall operation\n    cmd = '{} -m pip uninstall {} -y'.format(sys.executable, package)\n    result, content = subprocess.getstatusoutput(cmd)\n    if result:\n        estream.write(content)\n    else:\n        ostream.write(content)\n    return result == 0\n"
  },
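`pypi.py` shells out to pip via `subprocess.getstatusoutput`, which returns `(exit_status, combined_output)`. A sketch that reconstructs the command string without actually installing anything (`build_install_cmd` is a hypothetical helper mirroring the string-building in `install`):

```python
import subprocess
import sys


def build_install_cmd(package: str, version: str = '', upgrade: bool = False) -> str:
    """Rebuild the pip command string that install() above executes."""
    package = package.replace(' ', '')
    if version:
        package = '{}=={}'.format(package, version)
    cmd = '{} -m pip install "{}"'.format(sys.executable, package)
    if upgrade:
        cmd += ' --upgrade'
    return cmd


# getstatusoutput runs through the shell; status 0 means success and the
# trailing newline is stripped from the captured output.
status, out = subprocess.getstatusoutput('echo ok')
```

Using `sys.executable -m pip` (rather than a bare `pip`) guarantees the package lands in the same interpreter that is running PaddleHub.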
  {
    "path": "paddlehub/utils/utils.py",
    "content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\"\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport base64\nimport contextlib\nimport hashlib\nimport importlib\nimport math\nimport os\nimport socket\nimport sys\nimport tempfile\nimport time\nimport traceback\nimport types\nfrom typing import Generator\nfrom typing import List\nfrom urllib.parse import urlparse\n\nimport cv2\nimport numpy as np\nimport packaging.version\nimport requests\n\nimport paddlehub.env as hubenv\nimport paddlehub.utils as utils\nfrom paddlehub.utils.log import logger\n\n\nclass Version(packaging.version.Version):\n    '''Extended implementation of packaging.version.Version'''\n\n    def match(self, condition: str) -> bool:\n        '''\n        Determine whether the given condition are met\n        Args:\n            condition(str) : conditions for judgment\n        Returns:\n            bool: True if the given version condition are met, else False\n        Examples:\n            .. 
code-block:: python\n                Version('1.2.0').match('>=1.2.0a')\n        '''\n        if not condition:\n            return True\n        if condition.startswith('>='):\n            version = condition[2:]\n            _comp = self.__ge__\n        elif condition.startswith('>'):\n            version = condition[1:]\n            _comp = self.__gt__\n        elif condition.startswith('<='):\n            version = condition[2:]\n            _comp = self.__le__\n        elif condition.startswith('<'):\n            version = condition[1:]\n            _comp = self.__lt__\n        elif condition.startswith('=='):\n            version = condition[2:]\n            _comp = self.__eq__\n        elif condition.startswith('='):\n            version = condition[1:]\n            _comp = self.__eq__\n        else:\n            version = condition\n            _comp = self.__eq__\n\n        return _comp(Version(version))\n\n    def __lt__(self, other):\n        if isinstance(other, str):\n            other = Version(other)\n        return super().__lt__(other)\n\n    def __le__(self, other):\n        if isinstance(other, str):\n            other = Version(other)\n        return super().__le__(other)\n\n    def __gt__(self, other):\n        if isinstance(other, str):\n            other = Version(other)\n        return super().__gt__(other)\n\n    def __ge__(self, other):\n        if isinstance(other, str):\n            other = Version(other)\n        return super().__ge__(other)\n\n    def __eq__(self, other):\n        if isinstance(other, str):\n            other = Version(other)\n        return super().__eq__(other)\n\n\nclass Timer(object):\n    '''Calculate running speed and estimated time of arrival (ETA)'''\n\n    def __init__(self, total_step: int):\n        self.total_step = total_step\n        self.last_start_step = 0\n        self.current_step = 0\n        self._is_running = True\n\n    def start(self):\n        self.last_time = time.time()\n        self.start_time 
= time.time()\n\n    def stop(self):\n        self._is_running = False\n        self.end_time = time.time()\n\n    def count(self) -> int:\n        if not self.current_step >= self.total_step:\n            self.current_step += 1\n        return self.current_step\n\n    @property\n    def timing(self) -> float:\n        run_steps = self.current_step - self.last_start_step\n        self.last_start_step = self.current_step\n        time_used = time.time() - self.last_time\n        self.last_time = time.time()\n        return run_steps / time_used\n\n    @property\n    def is_running(self) -> bool:\n        return self._is_running\n\n    @property\n    def eta(self) -> str:\n        if not self.is_running:\n            return '00:00:00'\n        scale = self.total_step / self.current_step\n        remaining_time = (time.time() - self.start_time) * scale\n        return seconds_to_hms(remaining_time)\n\n\ndef seconds_to_hms(seconds: int) -> str:\n    '''Convert the number of seconds to hh:mm:ss'''\n    h = math.floor(seconds / 3600)\n    m = math.floor((seconds - h * 3600) / 60)\n    s = int(seconds - h * 3600 - m * 60)\n    hms_str = '{:0>2}:{:0>2}:{:0>2}'.format(h, m, s)\n    return hms_str\n\n\ndef cv2_to_base64(image: np.ndarray) -> str:\n    '''Convert cv2 image data to a string in base64 format'''\n    data = cv2.imencode('.jpg', image)[1]\n    return base64.b64encode(data.tobytes()).decode('utf8')\n\n\ndef base64_to_cv2(b64str: str) -> np.ndarray:\n    '''Convert a string in base64 format to cv2 data'''\n    data = base64.b64decode(b64str.encode('utf8'))\n    data = np.frombuffer(data, np.uint8)\n    data = cv2.imdecode(data, cv2.IMREAD_COLOR)\n    return data\n\n\n@contextlib.contextmanager\ndef generate_tempfile(directory: str = None, **kwargs):\n    '''Generate a temporary file'''\n    directory = hubenv.TMP_HOME if not directory else directory\n    with tempfile.NamedTemporaryFile(dir=directory, **kwargs) as file:\n        yield file\n\n\n@contextlib.contextmanager\ndef generate_tempdir(directory: str = None, 
**kwargs):\n    '''Generate a temporary directory'''\n    directory = hubenv.TMP_HOME if not directory else directory\n    with tempfile.TemporaryDirectory(dir=directory, **kwargs) as _dir:\n        yield _dir\n\n\ndef download(url: str, path: str = None) -> str:\n    '''\n    Download a file\n    Args:\n        url (str) : url to be downloaded\n        path (str, optional) : path to store downloaded products, default is current work directory\n    Examples:\n        .. code-block:: python\n            url = 'https://xxxxx.xx/xx.tar.gz'\n            download(url, path='./output')\n    '''\n    for savename, _, _ in download_with_progress(url, path):\n        ...\n    return savename\n\n\ndef download_with_progress(url: str, path: str = None) -> Generator[str, int, int]:\n    '''\n    Download a file and return the downloading progress -> Generator[filename, download_size, total_size]\n    Args:\n        url (str) : url to be downloaded\n        path (str, optional) : path to store downloaded products, default is current work directory\n    Examples:\n        .. 
code-block:: python\n            url = 'https://xxxxx.xx/xx.tar.gz'\n            for filename, download_size, total_size in download_with_progress(url, path='./output'):\n                print(filename, download_size, total_size)\n    '''\n    path = os.getcwd() if not path else path\n    if not os.path.exists(path):\n        os.makedirs(path)\n\n    parse_result = urlparse(url)\n    savename = parse_result.path.split('/')[-1]\n    savename = os.path.join(path, savename)\n\n    res = requests.get(url, stream=True)\n    download_size = 0\n    total_size = int(res.headers.get('content-length'))\n    with open(savename, 'wb') as _file:\n        for data in res.iter_content(chunk_size=4096):\n            _file.write(data)\n            download_size += len(data)\n            yield savename, download_size, total_size\n\n\ndef load_py_module(python_path: str, py_module_name: str) -> types.ModuleType:\n    '''\n    Load the specified python module.\n\n    Args:\n        python_path(str) : The directory where the python module is located\n        py_module_name(str) : Module name to be loaded\n    '''\n    sys.path.insert(0, python_path)\n\n    # Delete the cache module to avoid hazards. 
For example, when the user reinstalls a HubModule,\n    # if the cache is not cleared, then what the user gets at this time is actually the HubModule\n    # before uninstallation, this can cause some strange problems, e.g., fail to load model parameters.\n    if py_module_name in sys.modules:\n        sys.modules.pop(py_module_name)\n\n    py_module = importlib.import_module(py_module_name)\n    sys.path.pop(0)\n\n    return py_module\n\n\ndef get_platform_default_encoding() -> str:\n    '''Get the default encoding of the current platform.'''\n    if utils.platform.is_windows():\n        return 'gbk'\n    return 'utf8'\n\n\ndef sys_stdin_encoding() -> str:\n    '''Get the standard input stream default encoding.'''\n    encoding = sys.stdin.encoding\n    if encoding is None:\n        encoding = sys.getdefaultencoding()\n\n    if encoding is None:\n        encoding = get_platform_default_encoding()\n    return encoding\n\n\ndef sys_stdout_encoding() -> str:\n    '''Get the standard output stream default encoding.'''\n    encoding = sys.stdout.encoding\n    if encoding is None:\n        encoding = sys.getdefaultencoding()\n\n    if encoding is None:\n        encoding = get_platform_default_encoding()\n    return encoding\n\n\ndef md5(text: str):\n    '''Calculate the md5 value of the input text.'''\n    md5code = hashlib.md5(text.encode())\n    return md5code.hexdigest()\n\n\ndef record(msg: str) -> str:\n    '''Record the specified text into the PaddleHub log file which will be automatically stored according to date.'''\n    logfile = get_record_file()\n    with open(logfile, 'a') as file:\n        file.write('=' * 50 + '\\n')\n        file.write('Record at ' + time.strftime('%Y-%m-%d %H:%M:%S') + '\\n')\n        file.write('=' * 50 + '\\n')\n        file.write(str(msg) + '\\n' * 3)\n\n    return logfile\n\n\ndef record_exception(msg: str) -> str:\n    '''Record the current exception information into the PaddleHub log file which will be automatically stored according 
to date.'''\n    tb = traceback.format_exc()\n    file = record(tb)\n    logger.warning('{}. Detailed error information can be found in the {}.'.format(msg, file))\n\n\ndef get_record_file() -> str:\n    return os.path.join(hubenv.LOG_HOME, time.strftime('%Y%m%d.log'))\n\n\ndef is_port_occupied(ip: str, port: int) -> bool:\n    '''\n    Check if the port is occupied.\n    '''\n    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n    try:\n        s.connect((ip, int(port)))\n        s.shutdown(2)\n        return True\n    except:\n        return False\n\n\ndef mkdir(path: str):\n    \"\"\"The same as the shell command `mkdir -p`.\"\"\"\n    if not os.path.exists(path):\n        os.makedirs(path)\n\n\ndef reseg_token_label(tokenizer, tokens: List[str], labels: List[str] = None):\n    '''\n    Convert segments and labels of sequence labeling samples into tokens\n    based on the vocab of tokenizer.\n    '''\n    if labels:\n        if len(tokens) != len(labels):\n            raise ValueError(\"The length of tokens must be same with labels\")\n        ret_tokens = []\n        ret_labels = []\n        for token, label in zip(tokens, labels):\n            sub_token = tokenizer._tokenize(token)\n            if len(sub_token) == 0:\n                continue\n            ret_tokens.extend(sub_token)\n            ret_labels.append(label)\n            if len(sub_token) < 2:\n                continue\n            sub_label = label\n            if label.startswith(\"B-\"):\n                sub_label = \"I-\" + label[2:]\n            ret_labels.extend([sub_label] * (len(sub_token) - 1))\n\n        if len(ret_tokens) != len(ret_labels):\n            raise ValueError(\"The length of ret_tokens can't match with labels\")\n        return ret_tokens, ret_labels\n    else:\n        ret_tokens = []\n        for token in tokens:\n            sub_token = tokenizer._tokenize(token)\n            if len(sub_token) == 0:\n                continue\n            ret_tokens.extend(sub_token)\n 
           if len(sub_token) < 2:\n                continue\n        return ret_tokens, None\n\n\ndef pad_sequence(ids: List[int], max_seq_len: int, pad_token_id: int):\n    '''\n    Pads a sequence to max_seq_len\n    '''\n    assert len(ids) <= max_seq_len, \\\n        f'The input length {len(ids)} is greater than max_seq_len {max_seq_len}. '\\\n        'Please check the input list and max_seq_len if you really want to pad a sequence.'\n    return ids[:] + [pad_token_id] * (max_seq_len - len(ids))\n\n\ndef trunc_sequence(ids: List[int], max_seq_len: int):\n    '''\n    Truncates a sequence to max_seq_len\n    '''\n    assert len(ids) >= max_seq_len, \\\n        f'The input length {len(ids)} is less than max_seq_len {max_seq_len}. ' \\\n        'Please check the input list and max_seq_len if you really want to truncate a sequence.'\n    return ids[:max_seq_len]\n\n\ndef extract_melspectrogram(y,\n                           sample_rate: int = 32000,\n                           window_size: int = 1024,\n                           hop_size: int = 320,\n                           mel_bins: int = 64,\n                           fmin: int = 50,\n                           fmax: int = 14000,\n                           window: str = 'hann',\n                           center: bool = True,\n                           pad_mode: str = 'reflect',\n                           ref: float = 1.0,\n                           amin: float = 1e-10,\n                           top_db: float = None):\n    '''\n    Extract Mel Spectrogram from a waveform.\n    '''\n    try:\n        import librosa\n    except Exception:\n        logger.error('Failed to import librosa. 
Please check that librosa and numba are correctly installed.')\n        raise\n\n    s = librosa.stft(y,\n                     n_fft=window_size,\n                     hop_length=hop_size,\n                     win_length=window_size,\n                     window=window,\n                     center=center,\n                     pad_mode=pad_mode)\n\n    power = np.abs(s)**2\n    melW = librosa.filters.mel(sr=sample_rate, n_fft=window_size, n_mels=mel_bins, fmin=fmin, fmax=fmax)\n    mel = np.matmul(melW, power)\n    db = librosa.power_to_db(mel, ref=ref, amin=amin, top_db=top_db)\n    db = db.transpose()\n    return db\n\n\ndef convert_version(version: str) -> List:\n    '''\n    Convert version string in modules dataset such as [1.5.4, 2.0.0] to >=1.5.4 and <=2.0.0\n    '''\n    result = []\n    # from [1.5.4, 2.0.0] -> 1.5.4,2.0.0\n    version = version.replace(' ', '')[1:-1]\n    version = version.split(',')\n    # -1.0.0 means there is no lower version limit, but we should also\n    # handle any other negative number that users may write\n    if not version[0].startswith('-'):\n        result.append('>={}'.format(version[0]))\n\n    if len(version) > 1:\n        if version[1] != '99.0.0':\n            result.append('<={}'.format(version[1]))\n\n    return result\n"
  },
  {
    "path": "paddlehub/utils/xarfile.py",
"content": "# coding:utf-8\n# Copyright (c) 2020  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport tarfile\nimport zipfile\nfrom typing import List, Generator, Callable\n\nimport rarfile\n\n\nclass XarInfo(object):\n    '''Informational class which holds the details about an archive member given by a XarFile.'''\n\n    def __init__(self, _xarinfo, arctype='tar'):\n        self._info = _xarinfo\n        self.arctype = arctype\n\n    @property\n    def name(self) -> str:\n        if self.arctype == 'tar':\n            return self._info.name\n        return self._info.filename\n\n    @property\n    def size(self) -> int:\n        if self.arctype == 'tar':\n            return self._info.size\n        return self._info.file_size\n\n\nclass XarFile(object):\n    '''\n    The XarFile class provides an interface to tar/rar/zip archives.\n\n    Args:\n        name(str) : path of the archive file\n        mode(str) : specifies the mode in which the file is opened, it must be:\n            ========   ==============================================================================================\n            Character  Meaning\n            --------   ----------------------------------------------------------------------------------------------\n            'r'        open for reading\n            'w'        open for writing, truncating the file first, file will be saved according to the arctype field\n            'a'        open for writing, appending to the end of the file if it exists\n            ========   ===============================================================================================\n        arctype(str) : archive type, supports ['tar', 'rar', 'zip', 'tar.gz', 'tar.bz2', 'tar.xz', 'tgz', 'txz']. If\n                       the mode is 'w' or 'a', the default is 'tar'; if the mode is 'r', it is inferred from the\n                       actual archive type of the file\n    '''\n\n    def __init__(self, name: str, mode: str, arctype: str = 'tar', **kwargs):\n        # if mode is 'w', adjust mode according to arctype field\n        if mode == 'w':\n            if arctype in ['tar.gz', 'tgz']:\n                mode = 'w:gz'\n                self.arctype = 'tar'\n            elif arctype == 'tar.bz2':\n                mode = 'w:bz2'\n                self.arctype = 'tar'\n            elif arctype in ['tar.xz', 'txz']:\n                mode = 'w:xz'\n                self.arctype = 'tar'\n            else:\n                self.arctype = arctype\n        # if mode is 'r', infer arctype from the actual archive type of the file\n        elif mode == 'r':\n            if tarfile.is_tarfile(name):\n                self.arctype = 'tar'\n                mode = 'r:*'\n            elif zipfile.is_zipfile(name):\n                self.arctype = 'zip'\n            elif rarfile.is_rarfile(name):\n                self.arctype = 'rar'\n            else:\n                raise RuntimeError('Unsupported archive file {}'.format(name))\n        elif mode == 'a':\n            self.arctype = arctype\n        else:\n            raise RuntimeError('Unsupported mode {}'.format(mode))\n\n        if self.arctype in ['tar.gz', 'tar.bz2', 'tar.xz', 'tar', 'tgz', 'txz']:\n            self._archive_fp = tarfile.open(name, mode, **kwargs)\n        elif self.arctype == 'zip':\n            self._archive_fp = zipfile.ZipFile(name, mode, **kwargs)\n        elif self.arctype == 'rar':\n            self._archive_fp = rarfile.RarFile(name, mode, **kwargs)\n        else:\n            raise RuntimeError('Unsupported archive type {}'.format(self.arctype))\n\n    def __del__(self):\n        self._archive_fp.close()\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, exit_exception, exit_value, exit_traceback):\n        self._archive_fp.close()\n        # Returning False lets any exception raised inside the with-block propagate\n        return False\n\n    def add(self, name: str, arcname: str = None, recursive: bool = True, exclude: Callable = None):\n        '''\n        Add the file `name' to the archive. `name' may be any type of file (directory, fifo, symbolic link, etc.).\n        If given, `arcname' specifies an alternative name for the file in the archive. Directories are added\n        recursively by default. This can be avoided by setting `recursive' to False. `exclude' is a function that\n        should return True for each filename to be excluded.\n        '''\n        if self.arctype == 'tar':\n            # tarfile expects `filter' to return the TarInfo to add, or None to exclude it\n            tar_filter = None\n            if exclude:\n                tar_filter = lambda tarinfo: None if exclude(tarinfo.name) else tarinfo\n            self._archive_fp.add(name, arcname, recursive, filter=tar_filter)\n        else:\n            self._archive_fp.write(name)\n            if not recursive or not os.path.isdir(name):\n                return\n            items = []\n            for _d, _sub_ds, _files in os.walk(name):\n                items += [os.path.join(_d, _file) for _file in _files]\n                items += [os.path.join(_d, _sub_d) for _sub_d in _sub_ds]\n\n            for item in items:\n                if exclude and exclude(item):\n                    continue\n                self._archive_fp.write(item)\n\n    def extract(self, name: str, path: str):\n        '''Extract a file from the archive to the specified path.'''\n        return self._archive_fp.extract(name, path)\n\n    def extractall(self, path: str):\n        '''Extract all files from the archive to the specified path.'''\n        return self._archive_fp.extractall(path)\n\n    def getnames(self) -> List[str]:\n        '''Return a list of file names in the archive.'''\n        if self.arctype == 'tar':\n            return self._archive_fp.getnames()\n        return self._archive_fp.namelist()\n\n    def getxarinfo(self, name: str) -> XarInfo:\n        '''Return the XarInfo instance for the member `name'.'''\n        if self.arctype == 'tar':\n            return XarInfo(self._archive_fp.getmember(name), self.arctype)\n        return XarInfo(self._archive_fp.getinfo(name), self.arctype)\n\n\ndef open(name: str, mode: str = 'w', **kwargs) -> XarFile:\n    '''\n    Open a xar archive for reading, writing or appending. Return\n    an appropriate XarFile class.\n    '''\n    return XarFile(name, mode, **kwargs)\n\n\ndef archive(filename: str, recursive: bool = True, exclude: Callable = None, arctype: str = 'tar') -> str:\n    '''\n    Archive a file or directory\n\n    Args:\n        filename(str) : file or directory path to be archived\n        recursive(bool) : whether to recursively archive directories\n        exclude(Callable) : function that should return True for each filename to be excluded\n        arctype(str) : archive type, supports ['tar', 'rar', 'zip', 'tar.gz', 'tar.bz2', 'tar.xz', 'tgz', 'txz']\n\n    Returns:\n        str: archived file path\n\n    Examples:\n        .. code-block:: python\n\n            archive_path = '/PATH/TO/FILE'\n            archive(archive_path, arctype='tar.gz')\n    '''\n    basename = os.path.splitext(os.path.basename(filename))[0]\n    savename = '{}.{}'.format(basename, arctype)\n    with open(savename, mode='w', arctype=arctype) as file:\n        file.add(filename, recursive=recursive, exclude=exclude)\n\n    return savename\n\n\ndef unarchive(name: str, path: str):\n    '''\n    Unarchive a file\n\n    Args:\n        name(str) : path of the archive file to be unarchived\n        path(str) : directory to extract the archive into\n\n    Examples:\n        .. code-block:: python\n\n            unarchive_path = '/PATH/TO/FILE'\n            unarchive(unarchive_path, path='./output')\n    '''\n    with open(name, mode='r') as file:\n        file.extractall(path)\n\n\ndef unarchive_with_progress(name: str, path: str) -> Generator[str, int, int]:\n    '''\n    Unarchive a file and yield the unarchiving progress -> Generator[filename, extract_size, total_size]\n\n    Args:\n        name(str) : path of the archive file to be unarchived\n        path(str) : directory to extract the archive into\n\n    Examples:\n        .. code-block:: python\n\n            unarchive_path = 'test.tar.gz'\n            for filename, extract_size, total_size in unarchive_with_progress(unarchive_path, path='./output'):\n                print(filename, extract_size, total_size)\n    '''\n    with open(name, mode='r') as file:\n        total_size = extract_size = 0\n        for filename in file.getnames():\n            total_size += file.getxarinfo(filename).size\n\n        for filename in file.getnames():\n            file.extract(filename, path)\n            extract_size += file.getxarinfo(filename).size\n            yield filename, extract_size, total_size\n\n\ndef is_xarfile(file: str) -> bool:\n    '''Return True if the specified file is an archive type supported by xarfile, otherwise False'''\n    _x_func = [zipfile.is_zipfile, tarfile.is_tarfile, rarfile.is_rarfile]\n    for _f in _x_func:\n        if _f(file):\n            return True\n    return False\n"
  },
  {
    "path": "paddlehub/vision/__init__.py",
    "content": ""
  },
  {
    "path": "paddlehub/vision/detect_transforms.py",
"content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport random\nfrom typing import Callable\n\nimport cv2\nimport PIL.Image\nimport PIL.ImageEnhance\nimport numpy as np\n\nimport paddlehub.vision.functional as F\nfrom paddlehub.vision.utils import box_crop, box_iou_xywh\n\n\nclass RandomDistort:\n    \"\"\"\n    Distort the input image randomly.\n\n    Args:\n        lower(float): The lower bound value for enhancement, default is 0.5.\n        upper(float): The upper bound value for enhancement, default is 1.5.\n\n    Returns:\n        img(np.ndarray): Distorted image.\n        data(dict): Image info and label info.\n\n    \"\"\"\n\n    def __init__(self, lower: float = 0.5, upper: float = 1.5):\n        self.lower = lower\n        self.upper = upper\n\n    def random_brightness(self, img: PIL.Image.Image):\n        e = np.random.uniform(self.lower, self.upper)\n        return PIL.ImageEnhance.Brightness(img).enhance(e)\n\n    def random_contrast(self, img: PIL.Image.Image):\n        e = np.random.uniform(self.lower, self.upper)\n        return PIL.ImageEnhance.Contrast(img).enhance(e)\n\n    def random_color(self, img: PIL.Image.Image):\n        e = np.random.uniform(self.lower, self.upper)\n        return PIL.ImageEnhance.Color(img).enhance(e)\n\n    def __call__(self, img: np.ndarray, data: dict):\n        ops = [self.random_brightness, self.random_contrast, self.random_color]\n        
np.random.shuffle(ops)\n        img = PIL.Image.fromarray(img)\n        img = ops[0](img)\n        img = ops[1](img)\n        img = ops[2](img)\n        img = np.asarray(img)\n\n        return img, data\n\n\nclass RandomExpand:\n    \"\"\"\n    Randomly expand images and gt boxes by random ratio. It is a data enhancement operation for model training.\n\n    Args:\n        max_ratio(float): Max value for expansion ratio, default is 4.\n        fill(list): Initialize the pixel value of the image with the input fill value, default is None.\n        keep_ratio(bool): Whether image keeps ratio.\n        thresh(float): If random ratio does not exceed the thresh, return original images and gt boxes, default is 0.5.\n\n    Return:\n        img(np.ndarray): Distorted image.\n        data(dict): Image info and label info.\n\n    \"\"\"\n\n    def __init__(self, max_ratio: float = 4., fill: list = None, keep_ratio: bool = True, thresh: float = 0.5):\n\n        self.max_ratio = max_ratio\n        self.fill = fill\n        self.keep_ratio = keep_ratio\n        self.thresh = thresh\n\n    def __call__(self, img: np.ndarray, data: dict):\n        gtboxes = data['gt_boxes']\n\n        if random.random() > self.thresh:\n            return img, data\n        if self.max_ratio < 1.0:\n            return img, data\n        h, w, c = img.shape\n\n        ratio_x = random.uniform(1, self.max_ratio)\n        if self.keep_ratio:\n            ratio_y = ratio_x\n        else:\n            ratio_y = random.uniform(1, self.max_ratio)\n\n        oh = int(h * ratio_y)\n        ow = int(w * ratio_x)\n        off_x = random.randint(0, ow - w)\n        off_y = random.randint(0, oh - h)\n\n        out_img = np.zeros((oh, ow, c))\n        if self.fill and len(self.fill) == c:\n            for i in range(c):\n                out_img[:, :, i] = self.fill[i] * 255.0\n\n        out_img[off_y:off_y + h, off_x:off_x + w, :] = img\n        gtboxes[:, 0] = ((gtboxes[:, 0] * w) + off_x) / float(ow)\n        
gtboxes[:, 1] = ((gtboxes[:, 1] * h) + off_y) / float(oh)\n        gtboxes[:, 2] = gtboxes[:, 2] / ratio_x\n        gtboxes[:, 3] = gtboxes[:, 3] / ratio_y\n        data['gt_boxes'] = gtboxes\n        img = out_img.astype('uint8')\n\n        return img, data\n\n\nclass RandomCrop:\n    \"\"\"\n    Random crop the input image according to constraints.\n\n    Args:\n        scales(list): The value of the cutting area relative to the original area, expressed in the form of \\\n                      [min, max]. The default value is [.3, 1.].\n        max_ratio(float): Max ratio of the original area relative to the cutting area, default is 2.0.\n        constraints(list): The value of min and max iou values, default is None.\n        max_trial(int): The max trial for finding a valid crop area. The default value is 50.\n\n    Returns:\n        img(np.ndarray): Distorted image.\n        data(dict): Image info and label info.\n\n    \"\"\"\n\n    def __init__(self, scales: list = [0.3, 1.0], max_ratio: float = 2.0, constraints: list = None,\n                 max_trial: int = 50):\n        self.scales = scales\n        self.max_ratio = max_ratio\n        self.constraints = constraints\n        self.max_trial = max_trial\n\n    def __call__(self, img: np.ndarray, data: dict):\n        boxes = data['gt_boxes']\n        labels = data['gt_labels']\n        scores = data['gt_scores']\n\n        if len(boxes) == 0:\n            return img, data\n        if not self.constraints:\n            self.constraints = [(0.1, 1.0), (0.3, 1.0), (0.5, 1.0), (0.7, 1.0), (0.9, 1.0), (0.0, 1.0)]\n\n        img = PIL.Image.fromarray(img)\n        w, h = img.size\n        crops = [(0, 0, w, h)]\n        for min_iou, max_iou in self.constraints:\n            for _ in range(self.max_trial):\n                scale = random.uniform(self.scales[0], self.scales[1])\n                aspect_ratio = random.uniform(max(1 / self.max_ratio, scale * scale), \\\n                                              
min(self.max_ratio, 1 / scale / scale))\n                crop_h = int(h * scale / np.sqrt(aspect_ratio))\n                crop_w = int(w * scale * np.sqrt(aspect_ratio))\n                crop_x = random.randrange(w - crop_w)\n                crop_y = random.randrange(h - crop_h)\n                crop_box = np.array([[(crop_x + crop_w / 2.0) / w, (crop_y + crop_h / 2.0) / h, crop_w / float(w),\n                                      crop_h / float(h)]])\n                iou = box_iou_xywh(crop_box, boxes)\n                if min_iou <= iou.min() and max_iou >= iou.max():\n                    crops.append((crop_x, crop_y, crop_w, crop_h))\n                    break\n\n        while crops:\n            crop = crops.pop(np.random.randint(0, len(crops)))\n            crop_boxes, crop_labels, crop_scores, box_num = box_crop(boxes, labels, scores, crop, (w, h))\n\n            if box_num < 1:\n                continue\n            img = img.crop((crop[0], crop[1], crop[0] + crop[2], crop[1] + crop[3])).resize(img.size, PIL.Image.LANCZOS)\n            img = np.asarray(img)\n            data['gt_boxes'] = crop_boxes\n            data['gt_labels'] = crop_labels\n            data['gt_scores'] = crop_scores\n            return img, data\n\n        img = np.asarray(img)\n        data['gt_boxes'] = boxes\n        data['gt_labels'] = labels\n        data['gt_scores'] = scores\n        return img, data\n\n\nclass RandomFlip:\n    \"\"\"Flip the images and gt boxes randomly.\n\n    Args:\n        thresh: Probability for random flip.\n\n    Returns:\n        img(np.ndarray): Distorted image.\n        data(dict): Image info and label info.\n    \"\"\"\n\n    def __init__(self, thresh: float = 0.5):\n        self.thresh = thresh\n\n    def __call__(self, img, data):\n        gtboxes = data['gt_boxes']\n        if random.random() > self.thresh:\n            img = img[:, ::-1, :]\n            gtboxes[:, 0] = 1.0 - gtboxes[:, 0]\n        data['gt_boxes'] = gtboxes\n        return img, 
data\n\n\nclass Compose:\n    \"\"\"\n    Preprocess the input data according to the operators.\n\n    Args:\n        transforms(list): Preprocessing operators.\n\n    Returns:\n        img(np.ndarray): Preprocessed image.\n        data(dict): Image info and label info, default is None.\n    \"\"\"\n\n    def __init__(self, transforms: list):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        if len(transforms) < 1:\n            raise ValueError('The length of transforms ' + \\\n                             'must be equal or larger than 1!')\n        self.transforms = transforms\n\n    def __call__(self, data: dict):\n\n        if isinstance(data, dict):\n            if isinstance(data['image'], str):\n                img = cv2.imread(data['image'])\n                img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n            gt_labels = data['gt_labels'].copy()\n            data['gt_scores'] = np.ones_like(gt_labels)\n            for op in self.transforms:\n                img, data = op(img, data)\n            img = img.transpose((2, 0, 1))\n            return img, data\n\n        if isinstance(data, str):\n            img = cv2.imread(data)\n            img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n            for op in self.transforms:\n                img, data = op(img, data)\n            img = img.transpose((2, 0, 1))\n            return img\n\n\nclass Resize:\n    \"\"\"\n    Resize the input images.\n\n    Args:\n        target_size(int|list): Target input size; an int resizes both sides to this value.\n        interp(str): Interpolation method.\n\n    Returns:\n        img(np.ndarray): Preprocessed image.\n        data(dict): Image info and label info, default is None.\n    \"\"\"\n\n    def __init__(self, target_size: int = 512, interp: str = 'RANDOM'):\n        self.interp_dict = {\n            'NEAREST': cv2.INTER_NEAREST,\n            'LINEAR': cv2.INTER_LINEAR,\n            'CUBIC': cv2.INTER_CUBIC,\n            'AREA': cv2.INTER_AREA,\n            'LANCZOS4': cv2.INTER_LANCZOS4\n        }\n        self.interp = interp\n        if not (interp == \"RANDOM\" or interp in self.interp_dict):\n            raise ValueError(\"interp should be one of {}\".format(list(self.interp_dict.keys())))\n        if isinstance(target_size, list) or isinstance(target_size, tuple):\n            if len(target_size) != 2:\n                raise TypeError(\n                    'when target_size is a list or tuple, it should include 2 elements, but it is {}'.format(target_size))\n        elif not isinstance(target_size, int):\n            raise TypeError(\"Type of target_size is invalid. Must be Integer or List or tuple, now is {}\".format(\n                type(target_size)))\n\n        self.target_size = target_size\n\n    def __call__(self, img, data=None):\n\n        if self.interp == \"RANDOM\":\n            interp = random.choice(list(self.interp_dict.keys()))\n        else:\n            interp = self.interp\n        img = F.resize(img, self.target_size, self.interp_dict[interp])\n        if data is not None:\n            return img, data\n        else:\n            return img\n\n\nclass Normalize:\n    \"\"\"\n    Normalize the input images.\n\n    Args:\n        mean(list): Mean values for normalization, default is [0.5, 0.5, 0.5].\n        std(list): Standard deviation for normalization, default is [0.5, 0.5, 0.5].\n\n    Returns:\n        img(np.ndarray): Preprocessed image.\n        data(dict): Image info and label info, default is None.\n    \"\"\"\n\n    def __init__(self, mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, list) and isinstance(self.std, list)):\n            raise ValueError(\"{}: input type is invalid.\".format(self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, im, data=None):\n\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        im = F.normalize(im, mean, std)\n\n        if data is not None:\n            return im, data\n        else:\n            return im\n\n\nclass ShuffleBox:\n    \"\"\"Shuffle detection information for corresponding input image.\"\"\"\n\n    def __call__(self, img, data):\n        gt = np.concatenate([data['gt_boxes'], data['gt_labels'][:, np.newaxis], data['gt_scores'][:, np.newaxis]],\n                            axis=1)\n        idx = np.arange(gt.shape[0])\n        np.random.shuffle(idx)\n        gt = gt[idx, :]\n        data['gt_boxes'], data['gt_labels'], data['gt_scores'] = gt[:, :4], gt[:, 4], gt[:, 5]\n        return img, data\n"
  },
  {
    "path": "paddlehub/vision/functional.py",
"content": "# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import List, Union\n\nimport cv2\nimport PIL.Image\nimport PIL.ImageEnhance\nimport numpy as np\n\n\ndef normalize(im: np.ndarray, mean: float, std: float) -> np.ndarray:\n    '''\n    Normalize the input image.\n\n    Args:\n        im(np.ndarray): Input image.\n        mean(float): The mean value of normalization.\n        std(float): The standard deviation value of normalization.\n    '''\n    im = im.astype(np.float32, copy=False) / 255.0\n    im -= mean\n    im /= std\n    return im\n\n\ndef permute(im: np.ndarray) -> np.ndarray:\n    '''\n    Permute the input image from [H, W, C] to [C, H, W].\n\n    Args:\n        im(np.ndarray): Input image.\n    '''\n    im = np.transpose(im, (2, 0, 1))\n    return im\n\n\ndef resize(im: np.ndarray, target_size: Union[List[int], int], interpolation: int = cv2.INTER_LINEAR) -> np.ndarray:\n    '''\n    Resize the input image.\n\n    Args:\n        im(np.ndarray): Input image.\n        target_size(int|list[int]): The target size, if the input type is int, the target width and height will be set\n                                    to this value, if the input type is list, the first element in the list represents\n                                    the target width, and the second value represents the target height.\n        interpolation(int): Interpolation method. Default to cv2.INTER_LINEAR.\n    '''\n    if isinstance(target_size, list) or isinstance(target_size, tuple):\n        w = target_size[0]\n        h = target_size[1]\n    else:\n        w = target_size\n        h = target_size\n    im = cv2.resize(im, (w, h), interpolation=interpolation)\n    return im\n\n\ndef resize_long(im: np.ndarray, long_size: int, interpolation: int = cv2.INTER_LINEAR) -> np.ndarray:\n    '''\n    Resize the long side of the input image to the target size, keeping the aspect ratio.\n\n    Args:\n        im(np.ndarray): Input image.\n        long_size(int): The target size of the long side.\n        interpolation(int): Interpolation method. Default to cv2.INTER_LINEAR.\n    '''\n    value = max(im.shape[0], im.shape[1])\n    scale = float(long_size) / float(value)\n    resized_width = int(round(im.shape[1] * scale))\n    resized_height = int(round(im.shape[0] * scale))\n\n    im = cv2.resize(im, (resized_width, resized_height), interpolation=interpolation)\n    return im\n\n\ndef horizontal_flip(im: np.ndarray) -> np.ndarray:\n    '''\n    Flip the picture horizontally.\n\n    Args:\n        im(np.ndarray): Input image.\n    '''\n    if len(im.shape) == 3:\n        im = im[:, ::-1, :]\n    elif len(im.shape) == 2:\n        im = im[:, ::-1]\n    return im\n\n\ndef vertical_flip(im: np.ndarray) -> np.ndarray:\n    '''\n    Flip the picture vertically.\n\n    Args:\n        im(np.ndarray): Input image.\n    '''\n    if len(im.shape) == 3:\n        im = im[::-1, :, :]\n    elif len(im.shape) == 2:\n        im = im[::-1, :]\n    return im\n\n\ndef brightness(im: PIL.Image.Image, brightness_lower: float, brightness_upper: float) -> PIL.Image.Image:\n    '''\n    Randomly disturb the brightness of the picture, user can use np.random.seed to fix the random behavior.\n\n    Args:\n        im(PIL.Image.Image): Input image.\n        brightness_lower(float): Lower bound of brightness.\n        brightness_upper(float): Upper bound of brightness.\n    '''\n    brightness_delta = np.random.uniform(brightness_lower, brightness_upper)\n    im = PIL.ImageEnhance.Brightness(im).enhance(brightness_delta)\n    return im\n\n\ndef contrast(im: PIL.Image.Image, contrast_lower: float, contrast_upper: float) -> PIL.Image.Image:\n    '''\n    Randomly disturb the contrast of the picture, user can use np.random.seed to fix the random behavior.\n\n    Args:\n        im(PIL.Image.Image): Input image.\n        contrast_lower(float): Lower bound of contrast.\n        contrast_upper(float): Upper bound of contrast.\n    '''\n    contrast_delta = np.random.uniform(contrast_lower, contrast_upper)\n    im = PIL.ImageEnhance.Contrast(im).enhance(contrast_delta)\n    return im\n\n\ndef saturation(im: PIL.Image.Image, saturation_lower: float, saturation_upper: float) -> PIL.Image.Image:\n    '''\n    Randomly disturb the saturation of the picture, user can use np.random.seed to fix the random behavior.\n\n    Args:\n        im(PIL.Image.Image): Input image.\n        saturation_lower(float): Lower bound of saturation.\n        saturation_upper(float): Upper bound of saturation.\n    '''\n    saturation_delta = np.random.uniform(saturation_lower, saturation_upper)\n    im = PIL.ImageEnhance.Color(im).enhance(saturation_delta)\n    return im\n\n\ndef hue(im: PIL.Image.Image, hue_lower: float, hue_upper: float) -> PIL.Image.Image:\n    '''\n    Randomly disturb the hue of the picture, user can use np.random.seed to fix the random behavior.\n\n    Args:\n        im(PIL.Image.Image): Input image.\n        hue_lower(float): Lower bound of hue.\n        hue_upper(float): Upper bound of hue.\n    '''\n    hue_delta = np.random.uniform(hue_lower, hue_upper)\n    im = np.array(im.convert('HSV'))\n    im[:, :, 0] = im[:, :, 0] + hue_delta\n    im = PIL.Image.fromarray(im, mode='HSV').convert('RGB')\n    return im\n\n\ndef rotate(im: PIL.Image.Image, rotate_lower: float, rotate_upper: float) -> PIL.Image.Image:\n    '''\n    Rotate the input image by a random angle, user can use np.random.seed to fix the random behavior.\n\n    Args:\n        im(PIL.Image.Image): Input image.\n        rotate_lower(float): Lower bound of rotation angle.\n        rotate_upper(float): Upper bound of rotation angle.\n    '''\n    rotate_delta = np.random.uniform(rotate_lower, rotate_upper)\n    im = im.rotate(int(rotate_delta))\n    return im\n"
  },
  {
    "path": "paddlehub/vision/segmentation_transforms.py",
    "content": "# coding: utf8\n# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport random\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport numpy as np\nfrom PIL import Image\nimport paddlehub.vision.functional as F\n\n\nclass Compose:\n    \"\"\"\n    Do transformation on input data with corresponding pre-processing and augmentation operations.\n    The shape of input data to all operations is [height, width, channels].\n\n    Args:\n        transforms (list): A list contains data pre-processing or augmentation.\n        to_rgb (bool, optional): If converting image to RGB color space. 
Default: True.\n\n    Raises:\n        TypeError: When 'transforms' is not a list.\n        ValueError: when the length of 'transforms' is less than 1.\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        if len(transforms) < 1:\n            raise ValueError('The length of transforms ' + \\\n                             'must be equal or larger than 1!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n\n    def __call__(self, im: Union[np.ndarray, str], label: Union[np.ndarray, str] = None) -> Tuple:\n        \"\"\"\n        Args:\n            im (str|np.ndarray): It is either image path or image object.\n            label (str|np.ndarray): It is either label path or label ndarray.\n\n        Returns:\n            (tuple). A tuple including image, image info, and label after transformation.\n        \"\"\"\n        if isinstance(im, str):\n            im = cv2.imread(im).astype('float32')\n        if isinstance(label, str):\n            label = np.asarray(Image.open(label))\n        if im is None:\n            raise ValueError('Can\\'t read The image file {}!'.format(im))\n        if self.to_rgb:\n            im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        for op in self.transforms:\n            outputs = op(im, label)\n            im = outputs[0]\n            if len(outputs) == 2:\n                label = outputs[1]\n        im = np.transpose(im, (2, 0, 1))\n        return (im, label)\n\n\nclass ColorMap:\n    \"Calculate color map for mapping segmentation result.\"\n\n    def __init__(self, num_classes: int = 256):\n        self.num_classes = num_classes + 1\n\n    def __call__(self) -> np.ndarray:\n        color_map = self.num_classes * [0, 0, 0]\n        for i in range(0, self.num_classes):\n            j = 0\n            lab = i\n            while lab:\n                color_map[i * 3] |= 
(((lab >> 0) & 1) << (7 - j))\n                color_map[i * 3 + 1] |= (((lab >> 1) & 1) << (7 - j))\n                color_map[i * 3 + 2] |= (((lab >> 2) & 1) << (7 - j))\n                j += 1\n                lab >>= 3\n        color_map = [color_map[i:i + 3] for i in range(0, len(color_map), 3)]\n        color_map = color_map[1:]\n        return color_map\n\n\nclass SegmentVisual:\n    \"\"\"Visualization the segmentation result.\n    Args:\n        weight(float): weight of original image in combining image, default is 0.6.\n    \"\"\"\n\n    def __init__(self, weight: float = 0.6):\n        self.weight = weight\n        self.get_color_map_list = ColorMap(256)\n\n    def __call__(self, image: str, result: np.ndarray, save_dir: str) -> np.ndarray:\n        color_map = self.get_color_map_list()\n        color_map = np.array(color_map).astype(\"uint8\")\n        # Use OpenCV LUT for color mapping\n        c1 = cv2.LUT(result, color_map[:, 0])\n        c2 = cv2.LUT(result, color_map[:, 1])\n        c3 = cv2.LUT(result, color_map[:, 2])\n        pseudo_img = np.dstack((c1, c2, c3))\n        im = cv2.imread(image)\n        vis_result = cv2.addWeighted(im, self.weight, pseudo_img, 1 - self.weight, 0)\n\n        if save_dir is not None:\n            if not os.path.exists(save_dir):\n                os.makedirs(save_dir)\n            image_name = os.path.split(image)[-1]\n            out_path = os.path.join(save_dir, image_name)\n            cv2.imwrite(out_path, vis_result)\n\n        return vis_result\n\n\nclass Padding:\n    \"\"\"\n    Add bottom-right padding to a raw image or annotation image.\n    Args:\n        target_size (list|tuple): The target size after padding.\n        im_padding_value (list, optional): The padding value of raw image.\n            Default: [127.5, 127.5, 127.5].\n        label_padding_value (int, optional): The padding value of annotation image. 
Default: 255.\n    Raises:\n        TypeError: When target_size is neither list, tuple nor int.\n        ValueError: When the length of target_size is not 2.\n    \"\"\"\n\n    def __init__(self,\n                 target_size: Union[List[int], Tuple[int], int],\n                 im_padding_value: Union[List[int], Tuple[int], int] = (128, 128, 128),\n                 label_padding_value: int = 255):\n        if isinstance(target_size, list) or isinstance(target_size, tuple):\n            if len(target_size) != 2:\n                raise ValueError('`target_size` should include 2 elements, but it is {}'.format(target_size))\n        elif not isinstance(target_size, int):\n            # int is also accepted since __call__ handles a square target size.\n            raise TypeError(\"Type of target_size is invalid. It should be list, tuple or int, but it is {}\".format(\n                type(target_size)))\n        self.target_size = target_size\n        self.im_padding_value = im_padding_value\n        self.label_padding_value = label_padding_value\n\n    def __call__(self, im: np.ndarray, label: np.ndarray = None) -> Tuple:\n        \"\"\"\n        Args:\n            im (np.ndarray): The image data.\n            label (np.ndarray, optional): The label data. Default: None.\n        Returns:\n            (tuple). 
When label is None, it returns (im, ), otherwise it returns (im, label).\n        \"\"\"\n\n        im_height, im_width = im.shape[0], im.shape[1]\n        if isinstance(self.target_size, int):\n            target_height = self.target_size\n            target_width = self.target_size\n        else:\n            target_height = self.target_size[1]\n            target_width = self.target_size[0]\n        pad_height = target_height - im_height\n        pad_width = target_width - im_width\n        if pad_height < 0 or pad_width < 0:\n            raise ValueError(\n                'The size of the image should not be larger than `target_size`, but the image size ({}, {}) exceeds `target_size` ({}, {})'\n                .format(im_width, im_height, target_width, target_height))\n        else:\n            im = cv2.copyMakeBorder(im, 0, pad_height, 0, pad_width, cv2.BORDER_CONSTANT, value=self.im_padding_value)\n            if label is not None:\n                label = cv2.copyMakeBorder(\n                    label, 0, pad_height, 0, pad_width, cv2.BORDER_CONSTANT, value=self.label_padding_value)\n        if label is None:\n            return (im, )\n        else:\n            return (im, label)\n\n\nclass Normalize:\n    \"\"\"\n    Normalize an image.\n    Args:\n        mean (list|tuple): The mean value of a data set. Default: (0.5, 0.5, 0.5).\n        std (list|tuple): The standard deviation of a data set. Default: (0.5, 0.5, 0.5).\n    Raises:\n        ValueError: When mean/std is neither list nor tuple, or any value in std is 0.\n    \"\"\"\n\n    def __init__(self,\n                 mean: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5),\n                 std: Union[List[float], Tuple[float]] = (0.5, 0.5, 0.5)):\n        self.mean = mean\n        self.std = std\n        if not (isinstance(self.mean, (list, tuple)) and isinstance(self.std, (list, tuple))):\n            raise ValueError(\"{}: input type is invalid. 
It should be list or tuple\".format(self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, im: np.ndarray, label: np.ndarray = None) -> Tuple:\n        \"\"\"\n        Args:\n            im (np.ndarray): The Image data.\n            label (np.ndarray, optional): The label data. Default: None.\n        Returns:\n            (tuple). When label is None, it returns (im, ), otherwise it returns (im, label).\n        \"\"\"\n\n        mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n        std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        im = F.normalize(im, mean, std)\n\n        if label is None:\n            return (im, )\n        else:\n            return (im, label)\n\n\nclass Resize:\n    \"\"\"\n    Resize an image.\n\n    Args:\n        target_size (list|tuple, optional): The target size of image. Default: (512, 512).\n        interp (str, optional): The interpolation mode of resize is consistent with opencv.\n            ['NEAREST', 'LINEAR', 'CUBIC', 'AREA', 'LANCZOS4', 'RANDOM']. Note that when it is\n            'RANDOM', a random interpolation mode would be specified. 
Default: \"LINEAR\".\n\n    Raises:\n        TypeError: When 'target_size' type is neither list nor tuple.\n        ValueError: When \"interp\" is out of pre-defined methods ('NEAREST', 'LINEAR', 'CUBIC',\n        'AREA', 'LANCZOS4', 'RANDOM').\n    \"\"\"\n\n    # The interpolation mode\n    interp_dict = {\n        'NEAREST': cv2.INTER_NEAREST,\n        'LINEAR': cv2.INTER_LINEAR,\n        'CUBIC': cv2.INTER_CUBIC,\n        'AREA': cv2.INTER_AREA,\n        'LANCZOS4': cv2.INTER_LANCZOS4\n    }\n\n    def __init__(self, target_size: Union[List[int], Tuple[int]] = (512, 512), interp: str = 'LINEAR'):\n        self.interp = interp\n        if not (interp == \"RANDOM\" or interp in self.interp_dict):\n            raise ValueError(\"`interp` should be one of {}\".format(self.interp_dict.keys()))\n        if isinstance(target_size, list) or isinstance(target_size, tuple):\n            if len(target_size) != 2:\n                raise ValueError('`target_size` should include 2 elements, but it is {}'.format(target_size))\n        else:\n            raise TypeError(\"Type of `target_size` is invalid. It should be list or tuple, but it is {}\".format(\n                type(target_size)))\n\n        self.target_size = target_size\n\n    def __call__(self, im: np.ndarray, label: np.ndarray = None) -> Tuple:\n        \"\"\"\n        Args:\n            im (np.ndarray): The Image data.\n            label (np.ndarray, optional): The label data. Default: None.\n\n        Returns:\n            (tuple). 
When label is None, it returns (im, ), otherwise it returns (im, label).\n\n        Raises:\n            TypeError: When the 'im' type is not numpy.\n            ValueError: When the length of \"im\" shape is not 3.\n        \"\"\"\n\n        if not isinstance(im, np.ndarray):\n            raise TypeError(\"Resize: image type is not numpy.\")\n        if len(im.shape) != 3:\n            raise ValueError('Resize: image is not 3-dimensional.')\n        if self.interp == \"RANDOM\":\n            interp = random.choice(list(self.interp_dict.keys()))\n        else:\n            interp = self.interp\n        im = F.resize(im, self.target_size, self.interp_dict[interp])\n        if label is not None:\n            label = F.resize(label, self.target_size, cv2.INTER_NEAREST)\n\n        if label is None:\n            return (im, )\n        else:\n            return (im, label)\n"
  },
  {
    "path": "paddlehub/vision/transforms.py",
    "content": "# coding: utf8\n# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport random\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport PIL\nimport numpy as np\nimport paddlehub.vision.functional as F\n\n\nclass Compose:\n    \"\"\"\n    Compose preprocessing operators for obtaining prepocessed data. The shape of input image for all operations is [H, W, C], where H is the image height, W is the image width, and C is the number of image channels.\n\n    Args:\n        transforms(callmethod) : The method of preprocess images.\n        to_rgb(bool): Whether to transform the input from BGR mode to RGB mode, default is False.\n        channel_first(bool): whether to permute image from channel laste to channel first\n    \"\"\"\n\n    def __init__(self, transforms: Callable, to_rgb: bool = False, channel_first: bool = True):\n        if not isinstance(transforms, list):\n            raise TypeError('The transforms must be a list!')\n        if len(transforms) < 1:\n            raise ValueError('The length of transforms ' + \\\n                             'must be equal or larger than 1!')\n        self.transforms = transforms\n        self.to_rgb = to_rgb\n        self.channel_first = channel_first\n\n    def __call__(self, im: Union[np.ndarray, str]):\n        if isinstance(im, str):\n            im = cv2.imread(im).astype('float32')\n\n        if im is None:\n            
raise ValueError('Can\\'t read the image file {}!'.format(im))\n        if self.to_rgb:\n            im = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)\n\n        for op in self.transforms:\n            im = op(im)\n\n        if self.channel_first:\n            im = F.permute(im)\n        return im\n\n\nclass Permute:\n    \"\"\"\n    Permute the input image from [H, W, C] to [C, H, W].\n    \"\"\"\n\n    def __init__(self):\n        pass\n\n    def __call__(self, im):\n        im = F.permute(im)\n        return im\n\n\nclass RandomHorizontalFlip:\n    \"\"\"\n    Randomly flip the image horizontally according to the given probability.\n\n    Args:\n        prob(float): The probability of flipping the image horizontally, default is 0.5.\n    \"\"\"\n\n    def __init__(self, prob: float = 0.5):\n        self.prob = prob\n\n    def __call__(self, im: np.ndarray):\n        if random.random() < self.prob:\n            im = F.horizontal_flip(im)\n        return im\n\n\nclass RandomVerticalFlip:\n    \"\"\"\n    Randomly flip the image vertically according to the given probability.\n\n    Args:\n        prob(float): The probability of flipping the image vertically, default is 0.5.\n    \"\"\"\n\n    def __init__(self, prob: float = 0.5):\n        self.prob = prob\n\n    def __call__(self, im: np.ndarray):\n        if random.random() < self.prob:\n            im = F.vertical_flip(im)\n        return im\n\n\nclass Resize:\n    \"\"\"\n    Resize the input image to the target size.\n\n    Args:\n        target_size(List[int]|int): Target image size.\n        interpolation(str): Interpolation mode, default is 'LINEAR'. 
It supports 6 modes: 'NEAREST', 'LINEAR', 'CUBIC', 'AREA', 'LANCZOS4' and 'RANDOM'.\n    \"\"\"\n    # The interpolation mode\n    interpolation_dict = {\n        'NEAREST': cv2.INTER_NEAREST,\n        'LINEAR': cv2.INTER_LINEAR,\n        'CUBIC': cv2.INTER_CUBIC,\n        'AREA': cv2.INTER_AREA,\n        'LANCZOS4': cv2.INTER_LANCZOS4\n    }\n\n    def __init__(self, target_size: Union[List[int], int], interpolation: str = 'LINEAR'):\n        self.interpolation = interpolation\n        if not (interpolation == \"RANDOM\" or interpolation in self.interpolation_dict):\n            raise ValueError(\"interpolation should be one of {}\".format(list(self.interpolation_dict.keys())))\n        if isinstance(target_size, list) or isinstance(target_size, tuple):\n            if len(target_size) != 2:\n                raise ValueError(\n                    'When target_size is a list or tuple, it should include 2 elements, but it is {}'.format(target_size))\n        elif not isinstance(target_size, int):\n            raise TypeError(\"Type of target_size is invalid. 
It should be int, list or tuple, but it is {}\".format(\n                type(target_size)))\n\n        self.target_size = target_size\n\n    def __call__(self, im: np.ndarray):\n        if self.interpolation == \"RANDOM\":\n            interpolation = random.choice(list(self.interpolation_dict.keys()))\n        else:\n            interpolation = self.interpolation\n        im = F.resize(im, self.target_size, self.interpolation_dict[interpolation])\n        return im\n\n\nclass ResizeByLong:\n    \"\"\"\n    Resize the long side of the input image to the target size.\n\n    Args:\n        long_size(int|list[int]): The target size of the long side.\n    \"\"\"\n\n    def __init__(self, long_size: Union[List[int], int]):\n        self.long_size = long_size\n\n    def __call__(self, im):\n        im = F.resize_long(im, self.long_size)\n        return im\n\n\nclass ResizeRangeScaling:\n    \"\"\"\n    Randomly select a targeted size to resize the image according to the given range.\n\n    Args:\n        min_value(int): The minimum value for the targeted size.\n        max_value(int): The maximum value for the targeted size.\n    \"\"\"\n\n    def __init__(self, min_value: int = 400, max_value: int = 600):\n        if min_value > max_value:\n            raise ValueError('min_value must not be greater than max_value, '\n                             'but they are {} and {}.'.format(min_value, max_value))\n        self.min_value = min_value\n        self.max_value = max_value\n\n    def __call__(self, im: np.ndarray):\n        if self.min_value == self.max_value:\n            random_size = self.max_value\n        else:\n            random_size = int(np.random.uniform(self.min_value, self.max_value) + 0.5)\n        im = F.resize_long(im, random_size, cv2.INTER_LINEAR)\n        return im\n\n\nclass ResizeStepScaling:\n    \"\"\"\n    Randomly select a scale factor to resize the image according to the given range.\n\n    Args:\n        min_scale_factor(float): The minimum scale factor for targeted 
scale.\n        max_scale_factor(float): The maximum scale factor for targeted scale.\n        scale_step_size(float): Scale interval.\n\n    \"\"\"\n\n    def __init__(self, min_scale_factor: float = 0.75, max_scale_factor: float = 1.25, scale_step_size: float = 0.25):\n        if min_scale_factor > max_scale_factor:\n            raise ValueError('min_scale_factor must not be greater than max_scale_factor, '\n                             'but they are {} and {}.'.format(min_scale_factor, max_scale_factor))\n        self.min_scale_factor = min_scale_factor\n        self.max_scale_factor = max_scale_factor\n        self.scale_step_size = scale_step_size\n\n    def __call__(self, im: np.ndarray):\n        if self.min_scale_factor == self.max_scale_factor:\n            scale_factor = self.min_scale_factor\n\n        elif self.scale_step_size == 0:\n            scale_factor = np.random.uniform(self.min_scale_factor, self.max_scale_factor)\n\n        else:\n            num_steps = int((self.max_scale_factor - self.min_scale_factor) / self.scale_step_size + 1)\n            scale_factors = np.linspace(self.min_scale_factor, self.max_scale_factor, num_steps).tolist()\n            np.random.shuffle(scale_factors)\n            scale_factor = scale_factors[0]\n        w = int(round(scale_factor * im.shape[1]))\n        h = int(round(scale_factor * im.shape[0]))\n\n        im = F.resize(im, (w, h), cv2.INTER_LINEAR)\n        return im\n\n\nclass Normalize:\n    \"\"\"\n    Normalize the input image.\n\n    Args:\n        mean(list): Mean value for normalization.\n        std(list): Standard deviation for normalization.\n        channel_first(bool): Whether the input image is channel-first ([C, H, W]) or channel-last ([H, W, C]), default is False (channel-last).\n    \"\"\"\n\n    def __init__(self, mean: list = [0.5, 0.5, 0.5], std: list = [0.5, 0.5, 0.5], channel_first: bool = False):\n        self.mean = mean\n        self.std = std\n        self.channel_first = channel_first\n        if not (isinstance(self.mean, list) and isinstance(self.std, list)):\n            
raise ValueError(\"{}: input type is invalid.\".format(self))\n        from functools import reduce\n        if reduce(lambda x, y: x * y, self.std) == 0:\n            raise ValueError('{}: std is invalid!'.format(self))\n\n    def __call__(self, im):\n        if not self.channel_first:\n            mean = np.array(self.mean)[np.newaxis, np.newaxis, :]\n            std = np.array(self.std)[np.newaxis, np.newaxis, :]\n        else:\n            mean = np.array(self.mean)[:, np.newaxis, np.newaxis]\n            std = np.array(self.std)[:, np.newaxis, np.newaxis]\n        im = F.normalize(im, mean, std)\n        return im\n\n\nclass Padding:\n    \"\"\"\n    Padding input into targeted size according to specific padding value.\n\n    Args:\n        target_size(Union[List[int], Tuple[int], int]): Targeted image size.\n        im_padding_value(list): Border value for 3 channels, default is [127.5, 127.5, 127.5].\n    \"\"\"\n\n    def __init__(self, target_size: Union[List[int], Tuple[int], int], im_padding_value: list = [127.5, 127.5, 127.5]):\n        if isinstance(target_size, list) or isinstance(target_size, tuple):\n            if len(target_size) != 2:\n                raise ValueError(\n                    'when target is list or tuple, it should include 2 elements, but it is {}'.format(target_size))\n        elif not isinstance(target_size, int):\n            raise TypeError(\"Type of target_size is invalid. 
It should be int, list or tuple, but it is {}\".format(\n                type(target_size)))\n        self.target_size = target_size\n        self.im_padding_value = im_padding_value\n\n    def __call__(self, im: np.ndarray):\n        im_height, im_width = im.shape[0], im.shape[1]\n        if isinstance(self.target_size, int):\n            target_height = self.target_size\n            target_width = self.target_size\n        else:\n            target_height = self.target_size[1]\n            target_width = self.target_size[0]\n        pad_height = target_height - im_height\n        pad_width = target_width - im_width\n        if pad_height < 0 or pad_width < 0:\n            raise ValueError(\n                'The size of the image should not be larger than target_size, but the image size ({}, {}) exceeds target_size ({}, {})'\n                .format(im_width, im_height, target_width, target_height))\n        else:\n            im = cv2.copyMakeBorder(im, 0, pad_height, 0, pad_width, cv2.BORDER_CONSTANT, value=self.im_padding_value)\n\n        return im\n\n\nclass RandomPaddingCrop:\n    \"\"\"\n    Pad the input image if the crop size is greater than the image size; otherwise, crop the input image to the given size.\n\n    Args:\n        crop_size(Union[List[int], Tuple[int], int]): Targeted image size.\n        im_padding_value(list): Border value for 3 channels, default is [127.5, 127.5, 127.5].\n    \"\"\"\n\n    def __init__(self, crop_size, im_padding_value=[127.5, 127.5, 127.5]):\n        if isinstance(crop_size, list) or isinstance(crop_size, tuple):\n            if len(crop_size) != 2:\n                raise ValueError(\n                    'When crop_size is a list or tuple, it should include 2 elements, but it is {}'.format(crop_size))\n        elif not isinstance(crop_size, int):\n            raise TypeError(\"Type of crop_size is invalid. 
It should be int, list or tuple, but it is {}\".format(\n                type(crop_size)))\n        self.crop_size = crop_size\n        self.im_padding_value = im_padding_value\n\n    def __call__(self, im):\n        if isinstance(self.crop_size, int):\n            crop_width = self.crop_size\n            crop_height = self.crop_size\n        else:\n            crop_width = self.crop_size[0]\n            crop_height = self.crop_size[1]\n\n        img_height = im.shape[0]\n        img_width = im.shape[1]\n\n        if img_height == crop_height and img_width == crop_width:\n            return im\n        else:\n            pad_height = max(crop_height - img_height, 0)\n            pad_width = max(crop_width - img_width, 0)\n            if (pad_height > 0 or pad_width > 0):\n                im = cv2.copyMakeBorder(\n                    im, 0, pad_height, 0, pad_width, cv2.BORDER_CONSTANT, value=self.im_padding_value)\n                # Refresh the image size after padding, otherwise the crop offsets\n                # below could become negative and make np.random.randint raise.\n                img_height = im.shape[0]\n                img_width = im.shape[1]\n\n            if crop_height > 0 and crop_width > 0:\n                h_off = np.random.randint(img_height - crop_height + 1)\n                w_off = np.random.randint(img_width - crop_width + 1)\n\n                im = im[h_off:(crop_height + h_off), w_off:(w_off + crop_width), :]\n\n            return im\n\n\nclass RandomBlur:\n    \"\"\"\n    Randomly blur the input image with a Gaussian filter according to the given probability.\n\n    Args:\n        prob(float): The probability to blur the image, default is 0.1.\n    \"\"\"\n\n    def __init__(self, prob: float = 0.1):\n        self.prob = prob\n\n    def __call__(self, im: np.ndarray):\n        if self.prob <= 0:\n            n = 0\n        elif self.prob >= 1:\n            n = 1\n        else:\n            n = int(1.0 / self.prob)\n        if n > 0:\n            if np.random.randint(0, n) == 0:\n                radius = np.random.randint(3, 10)\n                if radius % 2 != 1:\n                    radius = radius + 1\n                if radius > 9:\n                    radius = 9\n                im = 
cv2.GaussianBlur(im, (radius, radius), 0, 0)\n\n        return im\n\n\nclass RandomRotation:\n    \"\"\"\n    Rotate the input image at a random angle; the angle will not exceed max_rotation.\n\n    Args:\n\n        max_rotation(float): Upper bound of the rotation angle.\n        im_padding_value(list): Border value for 3 channels, default is [127.5, 127.5, 127.5].\n    \"\"\"\n\n    def __init__(self, max_rotation: float = 15, im_padding_value: list = [127.5, 127.5, 127.5]):\n        self.max_rotation = max_rotation\n        self.im_padding_value = im_padding_value\n\n    def __call__(self, im):\n        if self.max_rotation > 0:\n            (h, w) = im.shape[:2]\n            do_rotation = np.random.uniform(-self.max_rotation, self.max_rotation)\n            pc = (w // 2, h // 2)\n            r = cv2.getRotationMatrix2D(pc, do_rotation, 1.0)\n            cos = np.abs(r[0, 0])\n            sin = np.abs(r[0, 1])\n\n            nw = int((h * sin) + (w * cos))\n            nh = int((h * cos) + (w * sin))\n\n            (cx, cy) = pc\n            r[0, 2] += (nw / 2) - cx\n            r[1, 2] += (nh / 2) - cy\n            dsize = (nw, nh)\n            im = cv2.warpAffine(\n                im,\n                r,\n                dsize=dsize,\n                flags=cv2.INTER_LINEAR,\n                borderMode=cv2.BORDER_CONSTANT,\n                borderValue=self.im_padding_value)\n\n        return im\n\n\nclass RandomDistort:\n    \"\"\"\n    Randomly adjust brightness, contrast, saturation and hue according to the given random range and probability, respectively.\n\n    Args:\n\n        brightness_range(float): Boundary of brightness.\n        brightness_prob(float): Probability of disturbing the brightness of the image.\n        contrast_range(float): Boundary of contrast.\n        contrast_prob(float): Probability of disturbing the contrast of the image.\n        saturation_range(float): Boundary of saturation.\n        saturation_prob(float): Probability of disturbing the saturation 
of image.\n        hue_range(float): Boundary of hue.\n        hue_prob(float): Probability for disturb the hue of image.\n    \"\"\"\n\n    def __init__(self,\n                 brightness_range: float = 0.5,\n                 brightness_prob: float = 0.5,\n                 contrast_range: float = 0.5,\n                 contrast_prob: float = 0.5,\n                 saturation_range: float = 0.5,\n                 saturation_prob: float = 0.5,\n                 hue_range: float = 18,\n                 hue_prob: float = 0.5):\n        self.brightness_range = brightness_range\n        self.brightness_prob = brightness_prob\n        self.contrast_range = contrast_range\n        self.contrast_prob = contrast_prob\n        self.saturation_range = saturation_range\n        self.saturation_prob = saturation_prob\n        self.hue_range = hue_range\n        self.hue_prob = hue_prob\n\n    def __call__(self, im: np.ndarray):\n        brightness_lower = 1 - self.brightness_range\n        brightness_upper = 1 + self.brightness_range\n        contrast_lower = 1 - self.contrast_range\n        contrast_upper = 1 + self.contrast_range\n        saturation_lower = 1 - self.saturation_range\n        saturation_upper = 1 + self.saturation_range\n        hue_lower = -self.hue_range\n        hue_upper = self.hue_range\n        ops = [F.brightness, F.contrast, F.saturation, F.hue]\n        random.shuffle(ops)\n        params_dict = {\n            'brightness': {\n                'brightness_lower': brightness_lower,\n                'brightness_upper': brightness_upper\n            },\n            'contrast': {\n                'contrast_lower': contrast_lower,\n                'contrast_upper': contrast_upper\n            },\n            'saturation': {\n                'saturation_lower': saturation_lower,\n                'saturation_upper': saturation_upper\n            },\n            'hue': {\n                'hue_lower': hue_lower,\n                'hue_upper': hue_upper\n         
   }\n        }\n        prob_dict = {\n            'brightness': self.brightness_prob,\n            'contrast': self.contrast_prob,\n            'saturation': self.saturation_prob,\n            'hue': self.hue_prob\n        }\n        im = im.astype('uint8')\n        im = PIL.Image.fromarray(im)\n        for id in range(4):\n            params = params_dict[ops[id].__name__]\n            prob = prob_dict[ops[id].__name__]\n            params['im'] = im\n            if np.random.uniform(0, 1) < prob:\n                im = ops[id](**params)\n        im = np.asarray(im).astype('float32')\n\n        return im\n\n\nclass RGB2LAB:\n    \"\"\"\n    Convert color space from RGB to LAB.\n    \"\"\"\n\n    def rgb2xyz(self, rgb: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from RGB to XYZ.\n\n        Args:\n           img(np.ndarray): Original RGB image.\n\n        Return:\n            img(np.ndarray): Converted XYZ image.\n        \"\"\"\n        mask = (rgb > 0.04045)\n        np.seterr(invalid='ignore')\n        rgb = (((rgb + .055) / 1.055)**2.4) * mask + rgb / 12.92 * (1 - mask)\n        rgb = np.nan_to_num(rgb)\n        x = .412453 * rgb[0, :, :] + .357580 * rgb[1, :, :] + .180423 * rgb[2, :, :]\n        y = .212671 * rgb[0, :, :] + .715160 * rgb[1, :, :] + .072169 * rgb[2, :, :]\n        z = .019334 * rgb[0, :, :] + .119193 * rgb[1, :, :] + .950227 * rgb[2, :, :]\n        out = np.concatenate((x[None, :, :], y[None, :, :], z[None, :, :]), axis=0)\n        return out\n\n    def xyz2lab(self, xyz: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from XYZ to LAB.\n\n        Args:\n           img(np.ndarray): Original XYZ image.\n\n        Return:\n            img(np.ndarray): Converted LAB image.\n        \"\"\"\n        sc = np.array((0.95047, 1., 1.08883))[:, None, None]\n        xyz_scale = xyz / sc\n        mask = (xyz_scale > .008856).astype(np.float32)\n        xyz_int = np.cbrt(xyz_scale) * mask + (7.787 * 
xyz_scale + 16. / 116.) * (1 - mask)\n        L = 116. * xyz_int[1, :, :] - 16.\n        a = 500. * (xyz_int[0, :, :] - xyz_int[1, :, :])\n        b = 200. * (xyz_int[1, :, :] - xyz_int[2, :, :])\n        out = np.concatenate((L[None, :, :], a[None, :, :], b[None, :, :]), axis=0)\n        return out\n\n    def rgb2lab(self, rgb: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from RGB to LAB.\n\n        Args:\n           img(np.ndarray): Original RGB image.\n\n        Return:\n            img(np.ndarray): Converted LAB image.\n        \"\"\"\n        lab = self.xyz2lab(self.rgb2xyz(rgb))\n        l_rs = (lab[[0], :, :] - 50) / 100\n        ab_rs = lab[1:, :, :] / 110\n        out = np.concatenate((l_rs, ab_rs), axis=0)\n        return out\n\n    def __call__(self, img: np.ndarray) -> np.ndarray:\n        img = img / 255\n        img = np.array(img).transpose(2, 0, 1)\n        img = self.rgb2lab(img)\n        return np.array(img).transpose(1, 2, 0)\n\n\nclass LAB2RGB:\n    \"\"\"\n    Convert color space from LAB to RGB.\n    \"\"\"\n\n    def __init__(self, mode: str = 'RGB2LAB'):\n        self.mode = mode\n\n    def xyz2rgb(self, xyz: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from XYZ to RGB.\n\n        Args:\n           img(np.ndarray): Original XYZ image.\n\n        Return:\n            img(np.ndarray): Converted RGB image.\n        \"\"\"\n        r = 3.24048134 * xyz[:, 0, :, :] - 1.53715152 * xyz[:, 1, :, :] - 0.49853633 * xyz[:, 2, :, :]\n        g = -0.96925495 * xyz[:, 0, :, :] + 1.87599 * xyz[:, 1, :, :] + .04155593 * xyz[:, 2, :, :]\n        b = .05564664 * xyz[:, 0, :, :] - .20404134 * xyz[:, 1, :, :] + 1.05731107 * xyz[:, 2, :, :]\n        rgb = np.concatenate((r[:, None, :, :], g[:, None, :, :], b[:, None, :, :]), axis=1)\n        rgb = np.maximum(rgb, 0)  # sometimes reaches a small negative number, which causes NaNs\n        mask = (rgb > .0031308).astype(np.float32)\n        
np.seterr(invalid='ignore')\n        out = (1.055 * (rgb**(1. / 2.4)) - 0.055) * mask + 12.92 * rgb * (1 - mask)\n        out = np.nan_to_num(out)\n        return out\n\n    def lab2xyz(self, lab: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from LAB to XYZ.\n\n        Args:\n           img(np.ndarray): Original LAB image.\n\n        Return:\n            img(np.ndarray): Converted XYZ image.\n        \"\"\"\n        y_int = (lab[:, 0, :, :] + 16.) / 116.\n        x_int = (lab[:, 1, :, :] / 500.) + y_int\n        z_int = y_int - (lab[:, 2, :, :] / 200.)\n        z_int = np.maximum(z_int, 0)\n        out = np.concatenate((x_int[:, None, :, :], y_int[:, None, :, :], z_int[:, None, :, :]), axis=1)\n        mask = (out > .2068966).astype(np.float32)\n        np.seterr(invalid='ignore')\n        out = (out**3.) * mask + (out - 16. / 116.) / 7.787 * (1 - mask)\n        out = np.nan_to_num(out)\n        sc = np.array((0.95047, 1., 1.08883))[None, :, None, None]\n        out = out * sc\n        return out\n\n    def lab2rgb(self, lab_rs: np.ndarray) -> np.ndarray:\n        \"\"\"\n        Convert color space from LAB to RGB.\n\n        Args:\n           img(np.ndarray): Original LAB image.\n\n        Return:\n            img(np.ndarray): Converted RGB image.\n        \"\"\"\n        l = lab_rs[:, [0], :, :] * 100 + 50\n        ab = lab_rs[:, 1:, :, :] * 110\n        lab = np.concatenate((l, ab), axis=1)\n        out = self.xyz2rgb(self.lab2xyz(lab))\n        return out\n\n    def __call__(self, img: np.ndarray) -> np.ndarray:\n        return self.lab2rgb(img)\n\n\nclass CenterCrop:\n    \"\"\"\n        Crop the middle part of the image to the specified size.\n\n        Args:\n           crop_size(int): Crop size.\n\n        Return:\n            img(np.ndarray): Croped image.\n    \"\"\"\n\n    def __init__(self, crop_size: int):\n        self.crop_size = crop_size\n\n    def __call__(self, img: np.ndarray):\n        img_width, img_height, _ = 
img.shape\n        # NOTE: img.shape is (H, W, C), so `img_width` above actually holds the height\n        # and `img_height` the width; the slices below index the matching axes.\n        crop_top = int((img_height - self.crop_size) / 2.)\n        crop_left = int((img_width - self.crop_size) / 2.)\n        return img[crop_left:crop_left + self.crop_size, crop_top:crop_top + self.crop_size, :]\n"
  },
  {
    "path": "paddlehub/vision/utils.py",
    "content": "# Copyright (c) 2021 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#    http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nfrom typing import Callable, Union, List, Tuple\n\nimport cv2\nimport paddle\nimport PIL\nimport numpy as np\nimport paddle.nn.functional as F\nfrom scipy.sparse import csr_matrix\n\n\ndef is_image_file(filename: str) -> bool:\n    '''Determine whether the input file name is a valid image file name.'''\n    ext = os.path.splitext(filename)[-1].lower()\n    return ext in ['.bmp', '.dib', '.png', '.jpg', '.jpeg', '.pbm', '.pgm', '.ppm', '.tif', '.tiff']\n\n\ndef get_img_file(dir_name: str) -> List[str]:\n    '''Get all image file paths in several directories which have the same parent directory.'''\n    images = []\n    for parent, _, filenames in os.walk(dir_name):\n        for filename in filenames:\n            if not is_image_file(filename):\n                continue\n            img_path = os.path.join(parent, filename)\n            images.append(img_path)\n\n    return images\n\n\ndef box_crop(boxes: np.ndarray, labels: np.ndarray, scores: np.ndarray, crop: List[int], img_shape: List[int]) -> Tuple:\n    \"\"\"Crop the boxes ,labels, scores according to the given shape\"\"\"\n\n    x, y, w, h = map(float, crop)\n    im_w, im_h = map(float, img_shape)\n\n    boxes = boxes.copy()\n    boxes[:, 0], boxes[:, 2] = (boxes[:, 0] - boxes[:, 2] / 2) * im_w, (boxes[:, 0] + boxes[:, 2] / 2) * im_w\n    boxes[:, 1], boxes[:, 3] = 
(boxes[:, 1] - boxes[:, 3] / 2) * im_h, (boxes[:, 1] + boxes[:, 3] / 2) * im_h\n\n    crop_box = np.array([x, y, x + w, y + h])\n    centers = (boxes[:, :2] + boxes[:, 2:]) / 2.0\n    mask = np.logical_and(crop_box[:2] <= centers, centers <= crop_box[2:]).all(axis=1)\n\n    boxes[:, :2] = np.maximum(boxes[:, :2], crop_box[:2])\n    boxes[:, 2:] = np.minimum(boxes[:, 2:], crop_box[2:])\n    boxes[:, :2] -= crop_box[:2]\n    boxes[:, 2:] -= crop_box[:2]\n\n    mask = np.logical_and(mask, (boxes[:, :2] < boxes[:, 2:]).all(axis=1))\n    boxes = boxes * np.expand_dims(mask.astype('float32'), axis=1)\n    labels = labels * mask.astype('float32')\n    scores = scores * mask.astype('float32')\n    boxes[:, 0], boxes[:, 2] = (boxes[:, 0] + boxes[:, 2]) / 2 / w, (boxes[:, 2] - boxes[:, 0]) / w\n    boxes[:, 1], boxes[:, 3] = (boxes[:, 1] + boxes[:, 3]) / 2 / h, (boxes[:, 3] - boxes[:, 1]) / h\n\n    return boxes, labels, scores, mask.sum()\n\n\ndef box_iou_xywh(box1: np.ndarray, box2: np.ndarray) -> np.ndarray:\n    \"\"\"Calculate the IoU of two sets of boxes in xywh (center x, center y, width, height) format.\"\"\"\n\n    assert box1.shape[-1] == 4, \"Box1 shape[-1] should be 4.\"\n    assert box2.shape[-1] == 4, \"Box2 shape[-1] should be 4.\"\n\n    b1_x1, b1_x2 = box1[:, 0] - box1[:, 2] / 2, box1[:, 0] + box1[:, 2] / 2\n    b1_y1, b1_y2 = box1[:, 1] - box1[:, 3] / 2, box1[:, 1] + box1[:, 3] / 2\n    b2_x1, b2_x2 = box2[:, 0] - box2[:, 2] / 2, box2[:, 0] + box2[:, 2] / 2\n    b2_y1, b2_y2 = box2[:, 1] - box2[:, 3] / 2, box2[:, 1] + box2[:, 3] / 2\n\n    inter_x1 = np.maximum(b1_x1, b2_x1)\n    inter_x2 = np.minimum(b1_x2, b2_x2)\n    inter_y1 = np.maximum(b1_y1, b2_y1)\n    inter_y2 = np.minimum(b1_y2, b2_y2)\n    inter_w = inter_x2 - inter_x1\n    inter_h = inter_y2 - inter_y1\n    inter_w[inter_w < 0] = 0\n    inter_h[inter_h < 0] = 0\n\n    inter_area = inter_w * inter_h\n    b1_area = (b1_x2 - b1_x1) * (b1_y2 - b1_y1)\n    b2_area = (b2_x2 - b2_x1) * (b2_y2 - b2_y1)\n\n    return inter_area / (b1_area + b2_area - inter_area)\n\n\ndef draw_boxes_on_image(image_path: str,\n                        boxes: np.ndarray,\n                        scores: np.ndarray,\n                        labels: np.ndarray,\n                        label_names: List[str],\n                        score_thresh: float = 0.5,\n                        save_path: str = 'result'):\n    \"\"\"Draw boxes on images.\"\"\"\n    # On Windows, importing paddlenlp and matplotlib at the same time will cause Python\n    # to fail to catch C++ exceptions. Delay the matplotlib import to avoid this problem.\n    import matplotlib.pyplot as plt\n\n    image = np.array(PIL.Image.open(image_path))\n    plt.figure()\n    _, ax = plt.subplots(1)\n    ax.imshow(image)\n\n    image_name = image_path.split('/')[-1]\n    print(\"Image {} detect: \".format(image_name))\n    colors = {}\n    for box, score, label in zip(boxes, scores, labels):\n        if score < score_thresh:\n            continue\n        if box[2] <= box[0] or box[3] <= box[1]:\n            continue\n        label = int(label)\n        if label not in colors:\n            colors[label] = plt.get_cmap('hsv')(label / len(label_names))\n        x1, y1, x2, y2 = box[0], box[1], box[2], box[3]\n        rect = plt.Rectangle((x1, y1), x2 - x1, y2 - y1, fill=False, linewidth=2.0, edgecolor=colors[label])\n        ax.add_patch(rect)\n        ax.text(\n            x1,\n            y1,\n            '{} {:.4f}'.format(label_names[label], score),\n            verticalalignment='bottom',\n            horizontalalignment='left',\n            bbox={\n                'facecolor': colors[label],\n                'alpha': 0.5,\n                'pad': 0\n            },\n            fontsize=8,\n            color='white')\n        print(\"\\t {:15s} at {:25} score: {:.5f}\".format(label_names[int(label)], str(list(map(int, list(box)))), score))\n    image_name = image_name.replace('jpg', 'png')\n    plt.axis('off')\n    plt.gca().xaxis.set_major_locator(plt.NullLocator())\n    plt.gca().yaxis.set_major_locator(plt.NullLocator())\n    plt.savefig(\"{}/{}\".format(save_path, image_name), bbox_inches='tight', pad_inches=0.0)\n    plt.cla()\n    plt.close('all')\n\n\ndef get_label_infos(file_list: str) -> List[str]:\n    \"\"\"Get label names by corresponding category ids.\"\"\"\n    from pycocotools.coco import COCO\n    map_label = COCO(file_list)\n    label_names = []\n    categories = map_label.loadCats(map_label.getCatIds())\n    for category in categories:\n        label_names.append(category['name'])\n    return label_names\n\n\ndef subtract_imagenet_mean_batch(batch: paddle.Tensor) -> paddle.Tensor:\n    \"\"\"Subtract ImageNet mean pixel-wise from a BGR image.\"\"\"\n    mean = np.zeros(shape=batch.shape, dtype='float32')\n    mean[:, 0, :, :] = 103.939\n    mean[:, 1, :, :] = 116.779\n    mean[:, 2, :, :] = 123.680\n    mean = paddle.to_tensor(mean)\n    return batch - mean\n\n\ndef gram_matrix(data: paddle.Tensor) -> paddle.Tensor:\n    \"\"\"Compute the Gram matrix.\"\"\"\n    b, ch, h, w = data.shape\n    features = data.reshape((b, ch, w * h))\n    features_t = features.transpose((0, 2, 1))\n    gram = features.bmm(features_t) / (ch * h * w)\n    return gram\n\n\ndef npmax(array: np.ndarray) -> Tuple[int, int]:\n    \"\"\"Get the row and column indices of the maximum value.\"\"\"\n    arrayindex = array.argmax(1)\n    arrayvalue = array.max(1)\n    i = arrayvalue.argmax()\n    j = arrayindex[i]\n    return i, j\n\n\ndef visualize(image: Union[np.ndarray, str], result: np.ndarray, weight: float = 0.6) -> np.ndarray:\n    \"\"\"\n    Convert the segmentation result to a color image and blend it with the input image.\n\n    Args:\n        image (str|np.ndarray): The path of the origin image or a BGR image.\n        result (np.ndarray): The predict result of the image.\n        weight (float): The image weight of visual image, and the result weight is (1 - weight). 
Default: 0.6\n\n    Returns:\n        vis_result (np.ndarray): The visualized result.\n    \"\"\"\n\n    color_map = get_color_map_list(256)\n    color_map = [color_map[i:i + 3] for i in range(0, len(color_map), 3)]\n    color_map = np.array(color_map).astype(\"uint8\")\n    # Use OpenCV LUT for color mapping\n    c1 = cv2.LUT(result, color_map[:, 0])\n    c2 = cv2.LUT(result, color_map[:, 1])\n    c3 = cv2.LUT(result, color_map[:, 2])\n    pseudo_img = np.dstack((c1, c2, c3))\n    if isinstance(image, str):\n        im = cv2.imread(image)\n    else:\n        im = image\n    vis_result = cv2.addWeighted(im, weight, pseudo_img, 1 - weight, 0)\n\n    return vis_result\n\n\ndef get_pseudo_color_map(pred: np.ndarray) -> PIL.Image.Image:\n    '''Visualize the segmentation mask.'''\n    pred_mask = PIL.Image.fromarray(pred.astype(np.uint8), mode='P')\n    color_map = get_color_map_list(256)\n    pred_mask.putpalette(color_map)\n    return pred_mask\n\n\ndef get_color_map_list(num_classes: int) -> List[int]:\n    \"\"\"\n    Returns the color map for visualizing the segmentation mask,\n    which supports an arbitrary number of classes.\n\n    Args:\n        num_classes (int): Number of classes.\n\n    Returns:\n        (list). 
The color map.\n    \"\"\"\n\n    num_classes += 1\n    color_map = num_classes * [0, 0, 0]\n    for i in range(0, num_classes):\n        j = 0\n        lab = i\n        while lab:\n            color_map[i * 3] |= (((lab >> 0) & 1) << (7 - j))\n            color_map[i * 3 + 1] |= (((lab >> 1) & 1) << (7 - j))\n            color_map[i * 3 + 2] |= (((lab >> 2) & 1) << (7 - j))\n            j += 1\n            lab >>= 3\n    color_map = color_map[3:]\n    return color_map\n\n\ndef get_reverse_list(ori_shape: List[int], transforms: List[Callable]) -> List[tuple]:\n    \"\"\"\n    Get the reverse list of transforms.\n\n    Args:\n        ori_shape (list): The original shape of the image.\n        transforms (list): The list of transforms.\n\n    Returns:\n        list: List of tuples in two formats:\n            ('resize', (h, w)) The image shape before resize,\n            ('padding', (h, w)) The image shape before padding.\n    \"\"\"\n    reverse_list = []\n    h, w = ori_shape[0], ori_shape[1]\n    for op in transforms:\n        if op.__class__.__name__ in ['Resize', 'ResizeByLong']:\n            reverse_list.append(('resize', (h, w)))\n            h, w = op.target_size[0], op.target_size[1]\n        if op.__class__.__name__ in ['Padding']:\n            reverse_list.append(('padding', (h, w)))\n            w, h = op.target_size[0], op.target_size[1]\n    return reverse_list\n\n\ndef reverse_transform(pred: paddle.Tensor, ori_shape: List[int], transforms: List[Callable]) -> paddle.Tensor:\n    \"\"\"Recover the prediction to its original shape.\"\"\"\n    reverse_list = get_reverse_list(ori_shape, transforms)\n    for item in reverse_list[::-1]:\n        if item[0] == 'resize':\n            h, w = item[1][0], item[1][1]\n            pred = F.interpolate(pred, (h, w), mode='nearest')\n        elif item[0] == 'padding':\n            h, w = item[1][0], item[1][1]\n            pred = pred[:, :, 0:h, 0:w]\n        else:\n            raise Exception(\"Unexpected info '{}' in im_info\".format(item[0]))\n    return pred\n\n\nclass ConfusionMatrix(object):\n    \"\"\"\n    Confusion Matrix for segmentation evaluation.\n\n    Args:\n        num_classes (int): Number of categories of the confusion matrix.\n        streaming (bool): Whether to use streaming mode. If set to True, the data will be\n                          accumulated every time the `calculate` interface is called. Defaults to False.\n    \"\"\"\n\n    def __init__(self, num_classes: int, streaming: bool = False):\n        self.confusion_matrix = np.zeros([num_classes, num_classes], dtype='int64')\n        self.num_classes = num_classes\n        self.streaming = streaming\n\n    def calculate(self, pred, label, ignore=None):\n        # If not in streaming mode, clear the matrix every time `calculate` is called\n        if not self.streaming:\n            self.zero_matrix()\n\n        mask = np.array(ignore) == 1\n\n        label = np.asarray(label)[mask]\n        pred = np.asarray(pred)[mask]\n        one = np.ones_like(pred)\n        # Accumulate ([row=label, col=pred], 1) into sparse matrix\n        spm = csr_matrix((one, (label, pred)), shape=(self.num_classes, self.num_classes))\n        spm = spm.todense()\n        self.confusion_matrix += spm\n\n    def zero_matrix(self):\n        \"\"\" Clear confusion matrix \"\"\"\n        self.confusion_matrix = np.zeros([self.num_classes, self.num_classes], dtype='int64')\n\n    def mean_iou(self) -> Tuple[np.ndarray, float]:\n        iou_list = []\n        avg_iou = 0\n        # TODO: use the numpy sum axis API to simplify\n        vji = np.zeros(self.num_classes, dtype=int)\n        vij = np.zeros(self.num_classes, dtype=int)\n        for j in range(self.num_classes):\n            v_j = 0\n            for i in range(self.num_classes):\n                v_j += self.confusion_matrix[j][i]\n            vji[j] = v_j\n\n        for i in range(self.num_classes):\n            v_i = 0\n            for j in range(self.num_classes):\n                v_i += self.confusion_matrix[j][i]\n            vij[i] = v_i\n\n        for c in range(self.num_classes):\n            total = vji[c] + vij[c] - self.confusion_matrix[c][c]\n            if total == 0:\n                iou = 0\n            else:\n                iou = float(self.confusion_matrix[c][c]) / total\n            avg_iou += iou\n            iou_list.append(iou)\n        avg_iou = float(avg_iou) / float(self.num_classes)\n        return np.array(iou_list), avg_iou\n\n    def accuracy(self) -> Tuple[np.ndarray, float]:\n        total = self.confusion_matrix.sum()\n        total_right = 0\n        for c in range(self.num_classes):\n            total_right += self.confusion_matrix[c][c]\n        if total == 0:\n            avg_acc = 0\n        else:\n            avg_acc = float(total_right) / total\n\n        vij = np.zeros(self.num_classes, dtype=int)\n        for i in range(self.num_classes):\n            v_i = 0\n            for j in range(self.num_classes):\n                v_i += self.confusion_matrix[j][i]\n            vij[i] = v_i\n\n        acc_list = []\n        for c in range(self.num_classes):\n            if vij[c] == 0:\n                acc = 0\n            else:\n                acc = self.confusion_matrix[c][c] / float(vij[c])\n            acc_list.append(acc)\n        return np.array(acc_list), avg_acc\n\n    def kappa(self) -> float:\n        vji = np.zeros(self.num_classes)\n        vij = np.zeros(self.num_classes)\n        for j in range(self.num_classes):\n            v_j = 0\n            for i in range(self.num_classes):\n                v_j += self.confusion_matrix[j][i]\n            vji[j] = v_j\n\n        for i in range(self.num_classes):\n            v_i = 0\n            for j in range(self.num_classes):\n                v_i += self.confusion_matrix[j][i]\n            vij[i] = v_i\n\n        total = self.confusion_matrix.sum()\n\n        # avoid overflow\n        # TODO: is it reasonable to hard code 10000.0?\n        total = float(total) / 10000.0\n        vji = vji / 10000.0\n        vij = 
vij / 10000.0\n\n        tp = 0\n        tc = 0\n        for c in range(self.num_classes):\n            tp += vji[c] * vij[c]\n            tc += self.confusion_matrix[c][c]\n\n        tc = tc / 10000.0\n        pe = tp / (total * total)\n        po = tc / total\n\n        kappa = (po - pe) / (1 - pe)\n        return kappa\n"
  },
  {
    "path": "requirements.txt",
    "content": "colorama\ncolorlog\neasydict\nfilelock\nflask >= 1.1.0\nnumpy\nmatplotlib\nopencv-python\npackaging\npaddle2onnx >= 0.5.1\npaddlenlp >= 2.0.0\nPillow\npyyaml\npyzmq\nrarfile\ntqdm\nvisualdl >= 2.0.0\n# gunicorn not support windows\ngunicorn >= 19.10.0; sys_platform != \"win32\"\ngradio\n"
  },
  {
    "path": "scripts/check_code_style.sh",
    "content": "#!/bin/bash\nfunction abort(){\n    echo \"Your change doesn't follow PaddleHub's code style.\" 1>&2\n    echo \"Please use pre-commit to check what is wrong.\" 1>&2\n    exit 1\n}\n\ntrap 'abort' 0\nset -e\n\ncd $TRAVIS_BUILD_DIR\nexport PATH=/usr/bin:$PATH\npre-commit install\n\nif ! pre-commit run -a ; then\n    git diff\n    exit 1\nfi\n\ntrap : 0\n"
  },
  {
    "path": "scripts/gen_contributors_info.py",
    "content": "import json\nimport requests\n\nheaders = {'Accept': 'application/vnd.github.v3+json'}\norg = 'PaddlePaddle'\nrepo = 'PaddleHub'\nurl = f'https://api.github.com/repos/{org}/{repo}/contributors'\n\npage_id = 1\ncontributors = []\nwhile True:\n    params = {'per_page': 100, 'page': page_id}\n    response = requests.get(url, headers=headers, params=params).text\n\n    _result = json.loads(response)\n    if not _result:\n        break\n\n    contributors += _result\n    page_id += 1\n\nprint('<p align=\"center\">')\nfor _c in contributors:\n    avatar = _c['avatar_url']\n    homepage = _c['html_url']\n    username = _c['login']\n    print(f'    <a href=\"{homepage}\"><img src=\"{avatar}\" width=75 height=75></a>')\n\nprint('</p>')\n"
  },
  {
    "path": "scripts/gen_proto.sh",
    "content": "#!/bin/bash\nprotoc -I=../paddlehub/module --python_out=../paddlehub/module ../paddlehub/module/module_desc.proto\nprotoc -I=../paddlehub/module --python_out=../paddlehub/module ../paddlehub/module/check_info.proto\nprotoc -I=../paddlehub/finetune --python_out=../paddlehub/finetune ../paddlehub/finetune/checkpoint.proto\n"
  },
  {
    "path": "scripts/test_cml.sh",
    "content": "#!/bin/bash\nset -o errexit\nbase_path=$(cd `dirname $0`; pwd)\ncd $base_path\n\nshopt -s expand_aliases\nalias hub=\"python $base_path/../paddlehub/commands/hub.py\"\n\n# test install command\nhub install lac\n\n# test show command\nhub show lac\n\n# test list command\nhub list\n\n# test version command\nhub version\n\n# test help command\nhub help\n\n# test uninstall command\nhub uninstall lac\n\n# test clear command\nhub clear\n\n# test search command\nhub search lac\n\n# test run command\nhub run lac --input_text \"今天天气不错\"\n\n# test download command\nhub download lac\n"
  },
  {
    "path": "setup.py",
    "content": "#coding:utf-8\n#   Copyright (c) 2019  PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Setup for pip package.\"\"\"\n\nfrom setuptools import find_packages\nfrom setuptools import setup\n\nimport paddlehub as hub\n\nwith open(\"requirements.txt\") as fin:\n    REQUIRED_PACKAGES = fin.read()\n\nsetup(\n    name='paddlehub',\n    version=hub.__version__.replace('-', ''),\n    description=\n    ('A toolkit for managing pretrained models of PaddlePaddle and helping users get started with transfer learning more efficiently.'\n     ),\n    long_description='',\n    url='https://github.com/PaddlePaddle/PaddleHub',\n    author='PaddlePaddle Author',\n    author_email='',\n    install_requires=REQUIRED_PACKAGES,\n    packages=find_packages(),\n    package_data={\n        'paddlehub/commands/tmpl': [\n            'paddlehub/commands/tmpl/init_py.tmpl', 'paddlehub/commands/tmpl/serving_demo.tmpl',\n            'paddlehub/commands/tmpl/x_model.tmpl'\n        ]\n    },\n    include_package_data=True,\n    data_files=[('paddlehub/commands/tmpl', [\n        'paddlehub/commands/tmpl/init_py.tmpl', 'paddlehub/commands/tmpl/serving_demo.tmpl',\n        'paddlehub/commands/tmpl/x_model.tmpl'\n    ])],\n    # PyPI package information.\n    classifiers=[\n        'Development Status :: 4 - Beta',\n        'Intended Audience :: Developers',\n        'Intended Audience :: Education',\n        'Intended Audience :: Science/Research',\n        'License :: OSI Approved :: Apache Software License',\n        'Programming Language :: Python :: 3.6',\n        'Programming Language :: Python :: 3.7',\n        'Programming Language :: Python :: 3.8',\n        'Topic :: Scientific/Engineering',\n        'Topic :: Scientific/Engineering :: Mathematics',\n        'Topic :: Scientific/Engineering :: Artificial Intelligence',\n        'Topic :: Software Development',\n        'Topic :: Software Development :: Libraries',\n        'Topic :: Software Development :: Libraries :: Python Modules',\n    ],\n    license='Apache 2.0',\n    keywords=('paddlehub paddlepaddle fine-tune transfer-learning'),\n    entry_points={'console_scripts': ['hub=paddlehub.commands.utils:execute']})\n"
  },
  {
    "path": "tests/test_module.py",
    "content": "# coding=utf-8\nimport os\nimport unittest\nimport paddlehub as hub\n\n\nclass TestHubModule(unittest.TestCase):\n    def test_lac(self):\n        lac = hub.Module(name=\"lac\")\n        test_text = [\"今天是个好日子\", \"天气预报说今天要下雨\", \"下一班地铁马上就要到了\"]\n        inputs = {\"text\": test_text}\n        results = lac.lexical_analysis(data=inputs)\n        self.assertEqual(results[0]['word'], ['今天', '是', '个', '好日子'])\n        self.assertEqual(results[0]['tag'], ['TIME', 'v', 'q', 'n'])\n        self.assertEqual(results[1]['word'], ['天气预报', '说', '今天', '要', '下雨'])\n        self.assertEqual(results[1]['tag'], ['n', 'v', 'TIME', 'v', 'v'])\n        self.assertEqual(results[2]['word'], ['下', '一班', '地铁', '马上', '就要', '到', '了'])\n        self.assertEqual(results[2]['tag'], ['f', 'm', 'n', 'd', 'v', 'v', 'xc'])\n\n    def test_senta(self):\n        senta = hub.Module(name=\"senta_bilstm\")\n        test_text = [\"这家餐厅很好吃\", \"这部电影真的很差劲\"]\n        input_dict = {\"text\": test_text}\n        results = senta.sentiment_classify(data=input_dict)\n        self.assertEqual(results[0]['sentiment_label'], 1)\n        self.assertEqual(results[0]['sentiment_key'], 'positive')\n        self.assertEqual(results[1]['sentiment_label'], 0)\n        self.assertEqual(results[1]['sentiment_key'], 'negative')\n        for result in results:\n            print(result['text'])\n            print(result['positive_probs'])\n            print(result['negative_probs'])\n\n    def test_simnet(self):\n        simnet_bow = hub.Module(name=\"simnet_bow\")\n        test_text_1 = [\"这道题太难了\", \"这道题太难了\", \"这道题太难了\"]\n        test_text_2 = [\"这道题是上一年的考题\", \"这道题不简单\", \"这道题很有意思\"]\n        inputs = {\"text_1\": test_text_1, \"text_2\": test_text_2}\n        results = simnet_bow.similarity(data=inputs)\n        max_score = -1\n        result_text = \"\"\n        for result in results:\n            if result['similarity'] > max_score:\n                max_score = result['similarity']\n                
result_text = result['text_2']\n        print(\"The most matching with the %s is %s\" % (test_text_1[0], result_text))\n\n    def test_ssd(self):\n        ssd = hub.Module(name=\"ssd_mobilenet_v1_pascal\")\n        test_img_path = os.path.join(os.path.dirname(__file__), \"resources\", \"test_img_cat.jpg\")\n        input_dict = {\"image\": [test_img_path]}\n        results = ssd.object_detection(data=input_dict)\n        for result in results:\n            print(result['path'])\n            print(result['data'])\n\n\nif __name__ == \"__main__\":\n    unittest.main()\n"
  }
]